Sample records for target detection probability

  1. Entanglement-enhanced Neyman-Pearson target detection using quantum illumination

    NASA Astrophysics Data System (ADS)

    Zhuang, Quntao; Zhang, Zheshen; Shapiro, Jeffrey H.

    2017-08-01

    Quantum illumination (QI) provides entanglement-based target detection---in an entanglement-breaking environment---whose performance is significantly better than that of optimum classical-illumination target detection. QI's performance advantage was established in a Bayesian setting with the target presumed equally likely to be absent or present and error probability employed as the performance metric. Radar theory, however, eschews that Bayesian approach, preferring the Neyman-Pearson performance criterion to avoid the difficulties of accurately assigning prior probabilities to target absence and presence and appropriate costs to false-alarm and miss errors. We have recently reported an architecture---based on sum-frequency generation (SFG) and feedforward (FF) processing---for minimum error-probability QI target detection with arbitrary prior probabilities for target absence and presence. In this paper, we use our results for FF-SFG reception to determine the receiver operating characteristic---detection probability versus false-alarm probability---for optimum QI target detection under the Neyman-Pearson criterion.
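
    As an illustration of the Neyman-Pearson recipe the abstract describes (fix the false-alarm probability, then read off the detection probability), here is a minimal sketch for the textbook case of a known signal in Gaussian noise with detectability index d. It is not the FF-SFG quantum receiver, just the classical ROC machinery, with a bisection-based normal inverse CDF standing in for a library routine:

```python
import math

def _phi(x):
    # standard normal CDF
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def _phi_inv(p, lo=-10.0, hi=10.0):
    # inverse normal CDF by bisection (adequate for a sketch)
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if _phi(mid) < p:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

def roc_point(p_fa, d):
    """Detection probability at false-alarm probability p_fa for a known
    signal in Gaussian noise with detectability index d, under the
    Neyman-Pearson criterion: the threshold is fixed by the false-alarm
    constraint, then P_D follows."""
    threshold = _phi_inv(1.0 - p_fa)
    return 1.0 - _phi(threshold - d)
```

    Sweeping `p_fa` from 0 to 1 at fixed `d` traces out one ROC curve; at d = 0 the curve degenerates to the chance line P_D = P_FA.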

  2. Knowing where is different from knowing what: Distinct response time profiles and accuracy effects for target location, orientation, and color probability.

    PubMed

    Jabar, Syaheed B; Filipowicz, Alex; Anderson, Britt

    2017-11-01

    When a location is cued, targets appearing at that location are detected more quickly. When a target feature is cued, targets bearing that feature are detected more quickly. These attentional cueing effects are only superficially similar. More detailed analyses find distinct temporal and accuracy profiles for the two different types of cues. This pattern parallels work with probability manipulations, where both feature and spatial probability are known to affect detection accuracy and reaction times. However, little has been done by way of comparing these effects. Are probability manipulations on space and features distinct? In a series of five experiments, we systematically varied spatial probability and feature probability along two dimensions (orientation or color). In addition, we decomposed response times into initiation and movement components. Targets appearing at the probable location were reported more quickly and more accurately regardless of whether the report was based on orientation or color. On the other hand, when either color probability or orientation probability was manipulated, response time and accuracy improvements were specific to that probable feature dimension. Decomposition of the response time benefits demonstrated that spatial probability only affected initiation times, whereas manipulations of feature probability affected both initiation and movement times. As detection was made more difficult, the two effects further diverged, with spatial probability disproportionately affecting initiation times and feature probability disproportionately affecting accuracy. In conclusion, all manipulations of probability, whether spatial or featural, affect detection. However, only feature probability affects perceptual precision, and precision effects are specific to the probable attribute.

  3. Rare targets are less susceptible to attention capture once detection has begun.

    PubMed

    Hon, Nicholas; Ng, Gavin; Chan, Gerald

    2016-04-01

    Rare or low probability targets are detected more slowly and/or less accurately than higher probability counterparts. Various proposals have implicated perceptual and response-based processes in this deficit. Recent evidence, however, suggests that it is attentional in nature, with low probability targets requiring more attentional resources than high probability ones to detect. This difference in attentional requirements, in turn, suggests the possibility that low and high probability targets may have different susceptibilities to attention capture, which is also known to be resource-dependent. Supporting this hypothesis, we found that, once attentional resources have begun to be engaged by detection processes, low, but not high, probability targets have a reduced susceptibility to capture. Our findings speak to several issues. First, they indicate that the likelihood of attention capture occurring when a given task-relevant stimulus is being processed is dependent, to some extent, on how said stimulus is represented within mental task sets. Second, they provide added support for the idea that the behavioural deficit associated with low probability targets is attention-based. Finally, the current data point to reduced top-down biasing of target templates as a likely mechanism underlying the attentional locus of the deficit in question.

  4. Spatial Probability Dynamically Modulates Visual Target Detection in Chickens

    PubMed Central

    Sridharan, Devarajan; Ramamurthy, Deepa L.; Knudsen, Eric I.

    2013-01-01

    The natural world contains a rich and ever-changing landscape of sensory information. To survive, an organism must be able to flexibly and rapidly locate the most relevant sources of information at any time. Humans and non-human primates exploit regularities in the spatial distribution of relevant stimuli (targets) to improve detection at locations of high target probability. Is the ability to flexibly modify behavior based on visual experience unique to primates? Chickens (Gallus domesticus) were trained on a multiple alternative Go/NoGo task to detect a small, briefly-flashed dot (target) in each of the quadrants of the visual field. When targets were presented with equal probability (25%) in each quadrant, chickens exhibited a distinct advantage for detecting targets at lower, relative to upper, hemifield locations. Increasing the probability of presentation in the upper hemifield locations (to 80%) dramatically improved detection performance at these locations to be on par with lower hemifield performance. Finally, detection performance in the upper hemifield changed on a rapid timescale, improving with successive target detections, and declining with successive detections at the diagonally opposite location in the lower hemifield. These data indicate the action of a process that in chickens, as in primates, flexibly and dynamically modulates detection performance based on the spatial probabilities of sensory stimuli as well as on recent performance history. PMID:23734188

  5. An Experiment Quantifying The Effect Of Clutter On Target Detection

    NASA Astrophysics Data System (ADS)

    Weathersby, Marshall R.; Schmieder, David E.

    1985-01-01

    Experiments were conducted to determine the influence of background clutter on target detection criteria. The experiment consisted of placing observers in front of displayed images on a TV monitor. Observer ability to detect military targets embedded in simulated natural and manmade background clutter was measured when there was unlimited viewing time. Results were described in terms of detection probability versus target resolution for various signal to clutter ratios (SCR). The experiments were preceded by a search for a meaningful clutter definition. The selected definition was a statistical measure computed by averaging the standard deviation of contiguous scene cells over the whole scene. The cell size was comparable to the target size. Observer test results confirmed the expectation that the resolution required for a given detection probability was a continuum function of the clutter level. At the lower SCRs the resolution required for a high probability of detection was near 6 line pairs per target (LP/TGT), while at the higher SCRs it was found that a resolution of less than 0.25 LP/TGT would yield a high probability of detection. These results are expected to aid in target acquisition performance modeling and to lead to improved specifications for imaging automatic target screeners.
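
    The clutter definition described above (averaging the standard deviation of target-sized scene cells over the whole scene) can be sketched directly. The cell-tiling details and the SCR helper below are plausible assumptions for illustration, not the authors' exact procedure:

```python
import math
import statistics

def clutter_rms(image, cell):
    """Schmieder-Weathersby-style clutter metric: RMS of the standard
    deviations of contiguous cells tiling the scene, with cell size
    chosen comparable to the target size. `image` is a 2-D list of
    pixel intensities."""
    rows, cols = len(image), len(image[0])
    cell_variances = []
    for r in range(0, rows - cell + 1, cell):
        for c in range(0, cols - cell + 1, cell):
            block = [image[i][j]
                     for i in range(r, r + cell)
                     for j in range(c, c + cell)]
            cell_variances.append(statistics.pvariance(block))
    # RMS of cell standard deviations = sqrt of the mean cell variance
    return math.sqrt(sum(cell_variances) / len(cell_variances))

def signal_to_clutter(target_contrast, clutter):
    """SCR as target contrast divided by the scene clutter level."""
    return target_contrast / clutter
```

    On a 2x2 image with columns of 0 and 2 and a single 2x2 cell, the cell variance is 1, so the clutter level is 1.0.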

  6. Detection performance in clutter with variable resolution

    NASA Astrophysics Data System (ADS)

    Schmieder, D. E.; Weathersby, M. R.

    1983-07-01

    Experiments were conducted to determine the influence of background clutter on target detection criteria. The experiment consisted of placing observers in front of displayed images on a TV monitor. Observer ability to detect military targets embedded in simulated natural and manmade background clutter was measured when there was unlimited viewing time. Results were described in terms of detection probability versus target resolution for various signal to clutter ratios (SCR). The experiments were preceded by a search for a meaningful clutter definition. The selected definition was a statistical measure computed by averaging the standard deviation of contiguous scene cells over the whole scene. The cell size was comparable to the target size. Observer test results confirmed the expectation that the resolution required for a given detection probability was a continuum function of the clutter level. At the lower SCRs the resolution required for a high probability of detection was near 6 line pairs per target (LP/TGT), while at the higher SCRs it was found that a resolution of less than 0.25 LP/TGT would yield a high probability of detection. These results are expected to aid in target acquisition performance modeling and to lead to improved specifications for imaging automatic target screeners.

  7. Multi-target Detection, Tracking, and Data Association on Road Networks Using Unmanned Aerial Vehicles

    NASA Astrophysics Data System (ADS)

    Barkley, Brett E.

    A cooperative detection and tracking algorithm for multiple targets constrained to a road network is presented for fixed-wing Unmanned Air Vehicles (UAVs) with a finite field of view. Road networks of interest are formed into graphs with nodes that indicate the target likelihood ratio (before detection) and position probability (after detection). A Bayesian likelihood ratio tracker recursively assimilates target observations until the cumulative observations at a particular location pass a detection criterion. At this point, a target is considered detected and a position probability is generated for the target on the graph. Data association is subsequently used to route future measurements either to update the likelihood ratio tracker (for an undetected target) or to update a position probability (for a previously detected target). Three strategies for motion planning of UAVs are proposed to balance searching for new targets against tracking known targets in a variety of scenarios. Performance was tested in Monte Carlo simulations for a variety of mission parameters, including tracking on road networks of varying complexity and using UAVs at various altitudes.
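
    A minimal sketch of the recursive Bayesian likelihood-ratio update described above, assuming a simple Bernoulli sensor (the sensor fires with probability p_d when a target is present at the node and p_fa otherwise); the paper's measurement model is surely richer than this:

```python
def lr_update(lr, observation, p_d, p_fa):
    """One recursive likelihood-ratio update for a single road-network
    node under a Bernoulli sensor model (an illustrative assumption)."""
    if observation:
        return lr * (p_d / p_fa)
    return lr * ((1.0 - p_d) / (1.0 - p_fa))

def detect(observations, p_d=0.9, p_fa=0.1, threshold=100.0):
    """Assimilate observations until the cumulative likelihood ratio
    passes the detection criterion; returns (detected, final LR)."""
    lr = 1.0
    for z in observations:
        lr = lr_update(lr, z, p_d, p_fa)
        if lr >= threshold:
            return True, lr
    return False, lr
```

    With p_d = 0.9 and p_fa = 0.1, each positive observation multiplies the likelihood ratio by 9, so three consecutive hits (LR = 729) cross a threshold of 100 and the node is declared a detection.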

  8. The relationship study between image features and detection probability based on psychology experiments

    NASA Astrophysics Data System (ADS)

    Lin, Wei; Chen, Yu-hua; Wang, Ji-yuan; Gao, Hong-sheng; Wang, Ji-jun; Su, Rong-hua; Mao, Wei

    2011-04-01

    Detection probability is an important index for representing and estimating target visibility, and it provides a basis for target recognition and decision-making. Obtaining detection probability empirically, however, requires a great deal of time and manpower, and differences in interpreters' knowledge and experience often produce large variations in the resulting data. By studying the relationship between image features and perception quantity through psychology experiments, a probability model has been established as follows. First, four image features that directly affect detection were extracted and quantified, and four feature similarity degrees between target and background were defined. Second, the relationship between each single image feature similarity degree and perception quantity was set up based on psychological principles, and target-interpretation experiments were designed involving about five hundred interpreters and two hundred images. To reduce correlation among image features, many synthetic images were made, each containing a single feature difference: brightness, chromaticity, texture, or shape. The model parameters were then determined by analyzing and fitting the large body of experimental data. Finally, by applying statistical decision theory to the experimental results, the relationship between perception quantity and target detection probability was found. Verified against a great deal of practical target interpretation, the model yields target detection probability quickly and objectively.

  9. Improved detection probability of low level light and infrared image fusion system

    NASA Astrophysics Data System (ADS)

    Luo, Yuxiang; Fu, Rongguo; Zhang, Junju; Wang, Wencong; Chang, Benkang

    2018-02-01

    Low level light (LLL) images contain rich detail about the environment but are easily degraded by weather; in smoke, rain, cloud, or fog, much target information is lost. Infrared images, formed from the radiation emitted by objects themselves, can "actively" capture target information in the scene. However, their contrast and resolution are poor, they capture little target detail, and the imaging mode does not match human visual habits. Fusing LLL and infrared images compensates for the deficiencies of each sensor while preserving the advantages of both. We first present the hardware design of the fusion circuit. Then, by calculating recognition probabilities for a target (one person) and background imagery (trees), we find that the detection probability for trees is higher in the LLL image than in the infrared image, while the detection probability for the person is clearly higher in the infrared image than in the LLL image. The detection probability of the fusion image for both the person and the trees is higher than that of either single detector. Image fusion can therefore significantly increase recognition probability and improve detection efficiency.

  10. An empirical probability model of detecting species at low densities.

    PubMed

    Delaney, David G; Leung, Brian

    2010-06-01

    False negatives, not detecting things that are actually present, are an important but understudied problem. False negatives are the result of our inability to perfectly detect species, especially those at low density such as endangered species or newly arriving introduced species. They reduce our ability to interpret presence-absence survey data and make sound management decisions (e.g., rapid response). To reduce the probability of false negatives, we need to compare the efficacy and sensitivity of different sampling approaches and quantify an unbiased estimate of the probability of detection. We conducted field experiments in the intertidal zone of New England and New York to test the sensitivity of two sampling approaches (quadrat vs. total area search, TAS), given different target characteristics (mobile vs. sessile). Using logistic regression we built detection curves for each sampling approach that related the sampling intensity and the density of targets to the probability of detection. The TAS approach reduced the probability of false negatives and detected targets faster than the quadrat approach. Mobility of targets increased the time to detection but did not affect detection success. Finally, we interpreted two years of presence-absence data on the distribution of the Asian shore crab (Hemigrapsus sanguineus) in New England and New York, using our probability model for false negatives. The type of experimental approach in this paper can help to reduce false negatives and increase our ability to detect species at low densities by refining sampling approaches, which can guide conservation strategies and management decisions in various areas of ecology such as conservation biology and invasion ecology.
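
    The logistic detection curves described above relate sampling effort and target density to the probability of detection; the general shape can be sketched as follows, with hypothetical placeholder coefficients rather than the study's fitted values:

```python
import math

def detection_curve(effort, density, b0=-3.0, b1=0.04, b2=25.0):
    """Logistic detection curve: probability of detecting the target as
    a function of sampling effort (e.g., minutes searched) and target
    density. Coefficients are illustrative placeholders."""
    return 1.0 / (1.0 + math.exp(-(b0 + b1 * effort + b2 * density)))

def false_negative_rate(effort, density, **kw):
    """The complementary probability: failing to detect a target that
    is actually present."""
    return 1.0 - detection_curve(effort, density, **kw)
```

    The curve rises with both effort and density, which captures the paper's central point: species at low density need disproportionately more sampling effort to keep the false-negative rate acceptable.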

  11. Joint detection and tracking of size-varying infrared targets based on block-wise sparse decomposition

    NASA Astrophysics Data System (ADS)

    Li, Miao; Lin, Zaiping; Long, Yunli; An, Wei; Zhou, Yiyu

    2016-05-01

    The high variability of target size makes small target detection in Infrared Search and Track (IRST) a challenging task. A joint detection and tracking method based on block-wise sparse decomposition is proposed to address this problem. For detection, the infrared image is divided into overlapped blocks, and each block is weighted by the local image complexity and the target existence probabilities. Target-background decomposition is solved by block-wise inexact augmented Lagrange multipliers. For tracking, a labeled multi-Bernoulli (LMB) tracker tracks multiple targets, taking the result of single-frame detection as input, and provides the corresponding target existence probabilities back to the detection stage. Unlike fixed-size methods, the proposed method can accommodate size-varying targets, since it makes no special assumption about the size and shape of small targets. Because the decomposition is exact, classical target measurements can be extended, and additional direction information is provided to improve tracking performance. The experimental results show that the proposed method can effectively suppress background clutter and detect and track size-varying targets in infrared images.

  12. Dim target detection method based on salient graph fusion

    NASA Astrophysics Data System (ADS)

    Hu, Ruo-lan; Shen, Yi-yan; Jiang, Jun

    2018-02-01

    Dim target detection is a key problem in digital image processing. With the development of multi-spectral imaging sensors, fusing information from different spectral images has become a common way to improve dim target detection performance. This paper proposes a dim target detection method based on salient graph fusion. In the method, Gabor filters with multiple directions and contrast filters with multiple scales are combined to construct a salient graph from each image. A maximum-salience fusion strategy is then designed to fuse the salient graphs from the different spectral images, and a top-hat filter is used to detect dim targets in the fused salient graph. Experimental results show that the proposed method improves the probability of target detection and reduces the probability of false alarm on images with cluttered backgrounds.
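
    The maximum-salience fusion strategy is the one step simple enough to sketch directly: the fused map keeps, at every pixel, the largest salience value across the spectral bands' saliency maps.

```python
def fuse_max_salience(saliency_maps):
    """Maximum-salience fusion: pixelwise max across the saliency maps
    computed from each spectral band. Each map is a 2-D list of floats
    of identical shape."""
    rows, cols = len(saliency_maps[0]), len(saliency_maps[0][0])
    return [[max(m[r][c] for m in saliency_maps) for c in range(cols)]
            for r in range(rows)]
```

    A dim target that is salient in only one band therefore survives fusion, at the cost of also keeping the strongest clutter response from any band, which is why a top-hat filtering stage still follows.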

  13. Dissociation in decision bias mechanism between probabilistic information and previous decision

    PubMed Central

    Kaneko, Yoshiyuki; Sakai, Katsuyuki

    2015-01-01

    Target detection performance is known to be influenced by events in the previous trials. It has not been clear, however, whether this bias effect is due to the previous sensory stimulus, motor response, or decision. Also it remains open whether or not the previous trial effect emerges via the same mechanism as the effect of knowledge about the target probability. In the present study, we asked normal human subjects to make a decision about the presence or absence of a visual target. We presented a pre-cue indicating the target probability before the stimulus, and also a decision-response mapping cue after the stimulus so as to tease apart the effect of decision from that of motor response. We found that the target detection performance was significantly affected by the probability cue in the current trial and also by the decision in the previous trial. While the information about the target probability modulated the decision criteria, the previous decision modulated the sensitivity to target-relevant sensory signals (d-prime). Using functional magnetic resonance imaging (fMRI), we also found that activation in the left intraparietal sulcus (IPS) was decreased when the probability cue indicated a high probability of the target. By contrast, activation in the right inferior frontal gyrus (IFG) was increased when the subjects made a target-present decision in the previous trial, but this change was observed specifically when the target was present in the current trial. Activation in these regions was associated with individual differences in the decision computation parameters. We argue that the previous decision biases the target detection performance by modulating the processing of target-selective information, and this mechanism is distinct from modulation of decision criteria due to expectation of a target. PMID:25999844

  14. Probability cueing of distractor locations: both intertrial facilitation and statistical learning mediate interference reduction.

    PubMed

    Goschy, Harriet; Bakos, Sarolta; Müller, Hermann J; Zehetleitner, Michael

    2014-01-01

    Targets in a visual search task are detected faster if they appear in a probable target region as compared to a less probable target region, an effect which has been termed "probability cueing." The present study investigated whether probability cueing can not only speed up target detection, but also minimize distraction by distractors in probable distractor regions as compared to distractors in less probable distractor regions. To this end, three visual search experiments with a salient, but task-irrelevant, distractor ("additional singleton") were conducted. Experiment 1 demonstrated that observers can utilize uneven spatial distractor distributions to selectively reduce interference by distractors in frequent distractor regions as compared to distractors in rare distractor regions. Experiments 2 and 3 showed that intertrial facilitation, i.e., distractor position repetitions, and statistical learning (independent of distractor position repetitions) both contribute to the probability cueing effect for distractor locations. Taken together, the present results demonstrate that probability cueing of distractor locations has the potential to serve as a strong attentional cue for the shielding of likely distractor locations.

  15. Texture metric that predicts target detection performance

    NASA Astrophysics Data System (ADS)

    Culpepper, Joanne B.

    2015-12-01

    Two texture metrics based on gray level co-occurrence error (GLCE) are used to predict probability of detection and mean search time. The two texture metrics are local clutter metrics and are based on the statistics of GLCE probability distributions. The degree of correlation between various clutter metrics and the target detection performance of the nine military vehicles in complex natural scenes found in the Search_2 dataset is presented. Comparison is also made with four other common clutter metrics found in the literature: root sum of squares, Doyle, statistical variance, and target structure similarity. The experimental results show that the GLCE energy metric is a better predictor of target detection performance when searching for targets in natural scenes than the other clutter metrics studied.
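
    GLCE is the authors' own statistic, but it is built on gray-level co-occurrence distributions. As a point of reference, the standard GLCM energy (angular second moment) can be sketched as follows; this is an assumption for illustration, not the paper's exact metric:

```python
from collections import Counter

def glcm_energy(image, dx=1, dy=0):
    """Energy (angular second moment) of the gray-level co-occurrence
    matrix for pixel offset (dx, dy): sum of squared joint probabilities
    of gray-level pairs. High energy indicates orderly texture; low
    energy indicates cluttered texture."""
    rows, cols = len(image), len(image[0])
    pairs = Counter()
    for r in range(rows):
        for c in range(cols):
            r2, c2 = r + dy, c + dx
            if 0 <= r2 < rows and 0 <= c2 < cols:
                pairs[(image[r][c], image[r2][c2])] += 1
    total = sum(pairs.values())
    return sum((n / total) ** 2 for n in pairs.values())
```

    A perfectly uniform patch has energy 1.0 (a single co-occurrence pair carries all the probability mass), while a checkerboard splits the mass across pairs and scores lower.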

  16. Effectiveness of scat detection dogs for detecting forest carnivores

    USGS Publications Warehouse

    Long, Robert A.; Donovan, T.M.; MacKay, Paula; Zielinski, William J.; Buzas, Jeffrey S.

    2007-01-01

    We assessed the detection and accuracy rates of detection dogs trained to locate scats from free-ranging black bears (Ursus americanus), fishers (Martes pennanti), and bobcats (Lynx rufus). During the summers of 2003-2004, 5 detection teams located 1,565 scats (747 putative black bear, 665 putative fisher, and 153 putative bobcat) at 168 survey sites throughout Vermont, USA. Of 347 scats genetically analyzed for species identification, 179 (51.6%) yielded a positive identification, 131 (37.8%) failed to yield DNA information, and 37 (10.7%) yielded DNA but provided no species confirmation. For 70 survey sites where confirmation of a putative target species' scat was not possible, we assessed the probability that ≥1 of the scats collected at the site was deposited by the target species (probability of correct identification; PID). Based on species confirmations or PID values, we detected bears at 57.1% (96) of sites, fishers at 61.3% (103) of sites, and bobcats at 12.5% (21) of sites. We estimated that the mean probability of detecting the target species (when present) during a single visit to a site was 0.86 for black bears, 0.95 for fishers, and 0.40 for bobcats. The probability of detecting black bears was largely unaffected by site- or visit-specific covariates, but the probability of detecting fishers varied by detection team. We found little or no effect of topographic ruggedness, vegetation density, or local weather (e.g., temp, humidity) on detection probability for fishers or black bears (data were insufficient for bobcat analyses). Detection dogs were highly effective at locating scats from forest carnivores and provided an efficient and accurate method for collecting detection-nondetection data on multiple species.
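
    The single-visit detection probabilities reported above (0.86 for black bears, 0.95 for fishers, 0.40 for bobcats) translate directly into survey-design arithmetic, assuming independent visits: the probability of at least one detection in n visits is 1 - (1 - p)^n.

```python
import math

def cumulative_detection(p_single, visits):
    """Probability of at least one detection across `visits`
    independent site visits, given single-visit probability p_single."""
    return 1.0 - (1.0 - p_single) ** visits

def visits_needed(p_single, target=0.95):
    """Smallest number of visits giving cumulative detection >= target."""
    return math.ceil(math.log(1.0 - target) / math.log(1.0 - p_single))
```

    Under this independence assumption, fishers need a single visit and bears two visits for 95% cumulative detection, while bobcats (p = 0.40) need about six.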

  17. A new method for detecting small and dim targets in starry background

    NASA Astrophysics Data System (ADS)

    Yao, Rui; Zhang, Yanning; Jiang, Lei

    2011-08-01

    Detection of small visible optical space targets is one of the key issues in long-range early warning and space debris surveillance. The SNR (signal-to-noise ratio) of the target is very low because of noise inherent in the imaging device, and random noise and background movement further increase the difficulty of detection. To detect small visible optical space targets effectively and rapidly, we propose a novel detection method based on statistical theory. First, we establish a reasonable statistical model of the visible optical space image. Second, we extract SIFT (Scale-Invariant Feature Transform) features from the image frames, calculate the transform relationship between frames, and use it to compensate for the movement of the whole visual field. Third, the influence of stars is removed using an interframe difference method, and a segmentation threshold separating candidate targets from noise is found using Otsu's method. Finally, for every pixel position in the image we calculate a statistic that decides whether a target is present. Theoretical analysis gives the relationship between false-alarm probability and detection probability at different SNRs. Experimental results show that the method detects targets efficiently, even when the target passes through stars.
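
    The thresholding step uses Otsu's method, which is standard and can be sketched; the statistical model and SIFT registration steps are omitted here:

```python
def otsu_threshold(values, levels=256):
    """Otsu's method: choose the threshold that maximizes between-class
    variance of the gray-level histogram. `values` are integer gray
    levels in [0, levels). Pixels >= the returned threshold form the
    candidate-target class."""
    hist = [0] * levels
    for v in values:
        hist[v] += 1
    total = len(values)
    grand_sum = sum(i * h for i, h in enumerate(hist))
    best_t, best_var = 0, -1.0
    cum_n, cum_sum = 0, 0.0
    for t in range(1, levels):
        cum_n += hist[t - 1]
        cum_sum += (t - 1) * hist[t - 1]
        n0, n1 = cum_n, total - cum_n
        if n0 == 0 or n1 == 0:
            continue
        mu0 = cum_sum / n0                     # mean of class below t
        mu1 = (grand_sum - cum_sum) / n1       # mean of class at/above t
        var_between = n0 * n1 * (mu0 - mu1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t
```

    On a strongly bimodal frame-difference image (dark background, bright candidate pixels) the threshold lands between the two modes, separating candidate targets from noise.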

  18. Applying the log-normal distribution to target detection

    NASA Astrophysics Data System (ADS)

    Holst, Gerald C.

    1992-09-01

    Holst and Pickard experimentally determined that MRT responses tend to follow a log-normal distribution. The log-normal distribution appeared reasonable because nearly all visual psychophysical data are plotted on a logarithmic scale. It has the additional advantage that it is bounded to positive values; an important consideration since probability of detection is often plotted in linear coordinates. Review of published data suggests that the log-normal distribution may have universal applicability. Specifically, the log-normal distribution obtained from MRT tests appears to fit the target transfer function and the probability of detection of rectangular targets.
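
    The log-normal hypothesis amounts to modeling probability of detection as a log-normal CDF of resolution. A sketch with placeholder parameters (R50 is the resolution giving 50% detection and sigma the spread; both would be fit to observer data):

```python
import math

def lognormal_pdet(resolution, r50, sigma):
    """Probability of detection as a log-normal CDF of resolution
    (e.g., line pairs per target): P = Phi(ln(R / R50) / sigma)."""
    z = math.log(resolution / r50) / sigma
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
```

    By construction the curve passes through 0.5 at R = R50, is bounded to positive resolutions, and is symmetric on a logarithmic resolution axis, which is exactly the behavior the abstract argues for.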

  19. Optimal search strategies of space-time coupled random walkers with finite lifetimes

    NASA Astrophysics Data System (ADS)

    Campos, D.; Abad, E.; Méndez, V.; Yuste, S. B.; Lindenberg, K.

    2015-05-01

    We present a simple paradigm for detection of an immobile target by a space-time coupled random walker with a finite lifetime. The motion of the walker is characterized by linear displacements at a fixed speed and exponentially distributed duration, interrupted by random changes in the direction of motion and resumption of motion in the new direction with the same speed. We call these walkers "mortal creepers." A mortal creeper may die at any time during its motion according to an exponential decay law characterized by a finite mean death rate ωm. While still alive, the creeper has a finite mean frequency ω of change of the direction of motion. In particular, we consider the efficiency of the target search process, characterized by the probability that the creeper will eventually detect the target. Analytic results confirmed by numerical results show that there is an ωm-dependent optimal frequency ω = ωopt that maximizes the probability of eventual target detection. We work primarily in one-dimensional (d = 1) domains and examine the role of initial conditions and of finite domain sizes. Numerical results in d = 2 domains confirm the existence of an optimal frequency of change of direction, thereby suggesting that the observed effects are robust to changes in dimensionality. In the d = 1 case, explicit expressions for the probability of target detection in the long time limit are given. In the case of an infinite domain, we compute the detection probability for arbitrary times and study its early- and late-time behavior. We further consider the survival probability of the target in the presence of many independent creepers beginning their motion at the same location and at the same time. We also consider a version of the standard "target problem" in which many creepers start at random locations at the same time.
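
    The mortal-creeper model is simple enough to simulate directly. This Monte Carlo sketch of the d = 1 case (unit speed, exponential flight durations at rate ω, exponential death at rate ωm, detection when a flight crosses the target at the origin) reproduces the qualitative behavior, though the paper's results are analytic:

```python
import random

def creeper_detects(omega, omega_m, x0=2.0, speed=1.0, rng=None):
    """Simulate one 1-D mortal creeper starting at x0 with the target at
    the origin; returns True if any flight segment crosses the origin
    before the creeper dies."""
    rng = rng or random
    lifetime = rng.expovariate(omega_m)
    x, t = x0, 0.0
    while t < lifetime:
        direction = rng.choice((-1.0, 1.0))          # new direction each flight
        dt = min(rng.expovariate(omega), lifetime - t)
        new_x = x + direction * speed * dt
        if min(x, new_x) <= 0.0 <= max(x, new_x):
            return True
        x, t = new_x, t + dt
    return False

def detection_probability(omega, omega_m, trials=2000, seed=1):
    """Monte Carlo estimate of the eventual detection probability."""
    rng = random.Random(seed)
    hits = sum(creeper_detects(omega, omega_m, rng=rng) for _ in range(trials))
    return hits / trials
```

    Sweeping `omega` at fixed `omega_m` with enough trials exhibits the ωm-dependent optimum reported in the paper; the quick sanity check below only confirms that shorter-lived creepers detect the target less often.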

  20. Resolution Enhanced Magnetic Sensing System for Wide Coverage Real Time UXO Detection

    NASA Astrophysics Data System (ADS)

    Zalevsky, Zeev; Bregman, Yuri; Salomonski, Nizan; Zafrir, Hovav

    2012-09-01

    In this paper we present a new high-resolution automatic detection algorithm based on a wavelet transform and validate it in marine experiments. The proposed approach achieves automatic detection at very low signal-to-noise ratios. The amount of computation is reduced, the magnetic trend is suppressed, and the detection probability/false-alarm rate can easily be controlled. Moreover, the algorithm can distinguish between closely spaced targets. We use the physical form of the magnetic field of a dipole to define a wavelet mother function that can then detect, at improved resolution, magnetic targets modeled as dipoles and embedded in noisy surroundings. The algorithm was first exercised on synthesized targets and then validated in field experiments involving a marine surface-floating system for wide-coverage real-time unexploded ordnance (UXO) detection and mapping. The detection probability achieved in the marine experiment was above 90%. The horizontal radial error of most of the detected targets was only 16 m, and two baseline targets immersed about 20 m from one another could easily be distinguished.
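
    The core idea, correlating the measured field with a dipole-shaped mother function and thresholding the response, can be sketched in 1-D. The derivative-of-Gaussian kernel below is a stand-in for the authors' dipole-derived wavelet, and the threshold is the knob that trades detection probability against false-alarm rate:

```python
import math

def dipole_kernel(half_width, scale=1.0):
    """Odd, dipole-like kernel (derivative of a Gaussian) used here as a
    stand-in for a mother function derived from the dipole field."""
    return [-x * math.exp(-(x * x) / (2.0 * scale * scale))
            for x in range(-half_width, half_width + 1)]

def correlate(signal, kernel):
    """Cross-correlation of a 1-D magnetic profile with the kernel
    (valid region only; edges are left at zero)."""
    k = len(kernel) // 2
    out = [0.0] * len(signal)
    for i in range(k, len(signal) - k):
        out[i] = sum(signal[i + j - k] * kernel[j] for j in range(len(kernel)))
    return out

def detect_dipoles(signal, kernel, threshold):
    """Indices where the correlation magnitude exceeds the threshold."""
    return [i for i, v in enumerate(correlate(signal, kernel))
            if abs(v) > threshold]
```

    Embedding a dipole signature in a quiet profile puts the correlation peak at the signature's center, and two signatures a couple of kernel widths apart produce two separate peaks, which is the "close targets" behavior the abstract claims.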

  1. A Max-Flow Based Algorithm for Connected Target Coverage with Probabilistic Sensors

    PubMed Central

    Shan, Anxing; Xu, Xianghua; Cheng, Zongmao; Wang, Wensheng

    2017-01-01

    Coverage is a fundamental issue in the research field of wireless sensor networks (WSNs). Connected target coverage concerns sensor placement that guarantees the needs of both coverage and connectivity. Existing works largely rely on the Boolean disk model, which is only a coarse approximation of the practical sensing model. In this paper, we focus on the connected target coverage issue based on the probabilistic sensing model, which characterizes the quality of coverage more accurately. In the probabilistic sensing model, sensors are only able to detect a target with a certain probability. We study the collaborative detection probability of a target under multiple sensors. Armed with the analysis of collaborative detection probability, we further formulate the minimum ϵ-connected target coverage problem, aiming to minimize the number of sensors satisfying the requirements of both coverage and connectivity. We map it into a flow graph and present an approximation algorithm called the minimum vertices maximum flow algorithm (MVMFA) with provable time complexity and approximation ratio. To evaluate our design, we analyze the performance of MVMFA theoretically and also conduct extensive simulation studies to demonstrate the effectiveness of our proposed algorithm. PMID:28587084
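
    The collaborative detection probability has a standard closed form once a per-sensor sensing model is chosen: a target is missed only if every sensor misses it. This sketch assumes a common exponential-decay sensing model, which may differ from the paper's exact model:

```python
import math

def sensing_probability(distance, decay=0.3):
    """Probabilistic sensing model: per-sensor detection probability
    decaying exponentially with sensor-target distance (a common
    choice, assumed here for illustration)."""
    return math.exp(-decay * distance)

def collaborative_detection(distances, decay=0.3):
    """Collaborative detection probability of one target observed by
    sensors at the given distances: 1 - prod(1 - p_i)."""
    miss = 1.0
    for d in distances:
        miss *= 1.0 - sensing_probability(d, decay)
    return 1.0 - miss
```

    Adding a sensor can only increase the collaborative probability, which is what lets the coverage requirement be phrased as "collaborative detection probability at least 1 - ϵ" at every target.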

  3. Is the detection of aquatic environmental DNA influenced by substrate type?

    PubMed

    Buxton, Andrew S; Groombridge, Jim J; Griffiths, Richard A

    2017-01-01

    The use of environmental DNA (eDNA) to assess the presence-absence of rare, cryptic or invasive species is hindered by a poor understanding of the factors that can remove DNA from the system. In aquatic systems, eDNA can be transported out either horizontally in water flows or vertically by incorporation into the sediment. Equally, eDNA may be broken down by various biotic and abiotic processes if the target organism leaves the system. We use occupancy modelling and a replicated mesocosm experiment to examine how the detection probability of eDNA changes once the target species is no longer present. We hypothesise that detection probability falls faster with a sediment that has a large number of DNA binding sites, such as topsoil or clay, than with substrates of lower DNA-binding capacity, such as sand. Water removed from ponds containing the target species (the great crested newt) initially showed high detection probabilities, but these fell to between 40% and 60% over the first 10 days and to between 10% and 22% by day 15; eDNA remained detectable at very low levels until day 22. Very little difference in detection was observed between the control group (no substrate) and the sand substrate. A small reduction in detection probability was observed between the control and clay substrates, but this was not significant. However, a highly significant reduction in detection probability was observed with a topsoil substrate. This result is likely to have stemmed from increased levels of PCR inhibition, suggesting that incorporation of DNA into the sediment is of only limited importance. Surveys of aquatic species using eDNA clearly need to take account of substrate type, as well as other environmental factors, when collecting samples, analysing data and interpreting the results.

  4. Improving detection of low SNR targets using moment-based detection

    NASA Astrophysics Data System (ADS)

    Young, Shannon R.; Steward, Bryan J.; Hawks, Michael; Gross, Kevin C.

    2016-05-01

    Increases in the number of cameras deployed, frame rates, and detector array sizes have led to a dramatic increase in the volume of motion imagery data collected. Without a corresponding increase in analytical manpower, much of the data is not analyzed to its full potential. This creates a need for fast, automated, and robust methods for detecting signals of interest. Current approaches fall into two categories: detect-before-track (DBT) methods, which are fast but often poor at detecting dim targets, and track-before-detect (TBD) methods, which can offer better performance but are typically much slower. This research seeks to contribute to the near-real-time detection of low-SNR, unresolved moving targets through an extension of earlier work on higher-order moments anomaly detection, a method that exploits both spatial and temporal information but is still computationally efficient and massively parallelizable. It was found that intelligent selection of parameters can improve the probability of detection by as much as 25% compared to earlier work with higher-order moments. The present method can reduce detection thresholds by 40% compared to the Reed-Xiaoli anomaly detector for low-SNR targets (for a given probability of detection and false alarm).
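The higher-order-moments idea can be sketched as follows: standardize each pixel's time series, form a temporal fourth-moment (kurtosis) image, and threshold it robustly; a dim transient inflates a pixel's kurtosis even when no single frame exceeds a spatial threshold. This is a simplified, hypothetical rendition, not the authors' algorithm or parameter choices.

```python
import numpy as np

def moment_detect(cube, order=4, n_sigma=5.0):
    """Flag pixels whose temporal higher-order moment is anomalous.
    cube: (frames, rows, cols) image stack."""
    z = (cube - cube.mean(axis=0)) / (cube.std(axis=0) + 1e-12)
    m = (z ** order).mean(axis=0)               # temporal moment image
    med = np.median(m)                           # robust background statistics
    mad = 1.4826 * np.median(np.abs(m - med)) + 1e-12
    return (m - med) / mad > n_sigma             # boolean detection mask

# Synthetic example: a pure-noise stack with a dim mover crossing row 16,
# touching each pixel on its path in a single frame only
rng = np.random.default_rng(1)
cube = rng.standard_normal((64, 32, 32))
for t in range(32):
    cube[2 * t, 16, t] += 6.0
mask = moment_detect(cube)
```

Most pixels on the mover's path should be flagged, while a single-frame spatial threshold at the same false-alarm level would need a far brighter target.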

  5. A new approach to increase the two-dimensional detection probability of CSI algorithm for WAS-GMTI mode

    NASA Astrophysics Data System (ADS)

    Yan, H.; Zheng, M. J.; Zhu, D. Y.; Wang, H. T.; Chang, W. S.

    2015-07-01

    When using the clutter suppression interferometry (CSI) algorithm to perform signal processing in a three-channel wide-area surveillance radar system, the primary concern is to effectively suppress the ground clutter. However, a portion of the moving target's energy is also lost in the process of channel cancellation, which is often neglected in conventional applications. In this paper, we first investigate the two-dimensional (radial velocity dimension and squint angle dimension) residual amplitude of moving targets after channel cancellation with the CSI algorithm. Then, a new approach is proposed to increase the two-dimensional detection probability of moving targets by retaining the maximum value of the three channel-cancellation results in a non-uniformly spaced channel system. In addition, a theoretical expression for the false-alarm probability of the proposed approach is derived. Simulation results validate the effectiveness of the proposed approach compared with conventional approaches in a uniformly spaced channel system. To our knowledge, this is the first time the two-dimensional detection probability of the CSI algorithm has been studied.

  6. Application of multivariate Gaussian detection theory to known non-Gaussian probability density functions

    NASA Astrophysics Data System (ADS)

    Schwartz, Craig R.; Thelen, Brian J.; Kenton, Arthur C.

    1995-06-01

    A statistical parametric multispectral sensor performance model was developed by ERIM to support mine field detection studies, multispectral sensor design/performance trade-off studies, and target detection algorithm development. The model assumes target detection algorithms, and associated performance models, that are based on data assumed to obey multivariate Gaussian probability distribution functions (PDFs). These algorithms and performance models can be generalized to data having non-Gaussian PDFs through the use of transforms that convert non-Gaussian data to Gaussian (or near-Gaussian) data. An example of one such transform is the Box-Cox power-law transform. In practice, such a transform can be applied to non-Gaussian data prior to the introduction of a detection algorithm that is formally based on the assumption of multivariate Gaussian data. This paper presents an extension of these techniques to the case where the joint multivariate probability density function of the non-Gaussian input data is known, and where the joint estimate of the multivariate Gaussian statistics under the Box-Cox transform is desired. The jointly estimated multivariate Gaussian statistics can then be used to predict the performance of a target detection algorithm that has an associated Gaussian performance model.
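A minimal example of the Box-Cox step described above, using SciPy (illustrative only; the paper's joint multivariate estimation procedure is more involved): fit the power-law exponent by maximum likelihood on skewed clutter-like data and check that the transformed sample is far closer to Gaussian.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
x = rng.lognormal(mean=0.0, sigma=0.75, size=5000)   # skewed, non-Gaussian "clutter"

# Fit the Box-Cox exponent lambda by maximum likelihood and transform the data;
# for lognormal data the fitted lambda should be near 0 (i.e., a log transform)
y, lam = stats.boxcox(x)

skew_before = stats.skew(x)   # strongly right-skewed
skew_after = stats.skew(y)    # near zero after Gaussianization
```

A Gaussian-based detector (and its performance model) applied to `y` is then operating on data that much better satisfies its distributional assumption.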

  7. Search times and probability of detection in time-limited search

    NASA Astrophysics Data System (ADS)

    Wilson, David; Devitt, Nicole; Maurer, Tana

    2005-05-01

    When modeling the search and target acquisition process, the probability of detection as a function of time is important to war games and physical entity simulations. Recent US Army RDECOM CERDEC Night Vision and Electronic Sensors Directorate modeling of search and detection has focused on time-limited search. Developing the relationship between detection probability and search time as a differential equation is explored. One of the parameters in the current formula for probability of detection in time-limited search corresponds to the mean time to detect in time-unlimited search. However, the mean time to detect in time-limited search is shorter than the mean time to detect in time-unlimited search, and a simple mathematical relationship connects these two mean times. This relationship is derived.
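The relationship between the two mean times can be illustrated under the classic exponential search model P(t) = P_inf * (1 - exp(-t/tau)) (an assumed form; the paper's exact formula may differ). Conditioning on detection within the time limit T truncates the detection-time density, so the limited-search mean is shorter, with closed form tau - T*exp(-T/tau)/(1 - exp(-T/tau)).

```python
import math

def p_detect(t, p_inf=0.9, tau=10.0):
    """Assumed classic form: P(t) = p_inf * (1 - exp(-t/tau)); tau is the
    mean time to detect in time-UNlimited search (for eventual detectors)."""
    return p_inf * (1.0 - math.exp(-t / tau))

def mean_time_limited(T, tau=10.0, n=100_000):
    """Mean time to detect, conditioned on detection within the limit T:
    midpoint-rule average of t over the truncated exponential density."""
    dt = T / n
    num = sum((i + 0.5) * dt * math.exp(-((i + 0.5) * dt) / tau) * dt / tau
              for i in range(n))
    return num / (1.0 - math.exp(-T / tau))

T, tau = 15.0, 10.0
m_limited = mean_time_limited(T, tau)
# Closed-form relationship between the two mean times under this model:
m_closed = tau - T * math.exp(-T / tau) / (1.0 - math.exp(-T / tau))
```

Here the numerical and closed-form values agree, and both are strictly smaller than the unlimited-search mean tau, as the abstract states.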

  8. A Brownian Bridge Movement Model to Track Mobile Targets

    DTIC Science & Technology

    2016-09-01

    breakout of Chinese forces in the South China Sea. Probability heat maps, depicting the probability of a target location at discrete times, are...achieve a higher probability of detection, it is more effective to have sensors cover a wider area at fewer discrete points in time than to have a...greater number of discrete looks using sensors covering smaller areas. 14. SUBJECT TERMS Brownian bridge movement models, unmanned sensors

  9. Environmentally Adaptive UXO Detection and Classification Systems

    DTIC Science & Technology

    2016-04-01

    probability of false alarm ( Pfa ), as well as Receiver Op- erating Characteristic (ROC) curve and confusion matrix characteristics. The results of these...techniques at a false alarm probability of Pfa = 1× 10−3. X̃ = g(X). In this case, the problem remains invariant to the group of transformations G = { g : g(X...and observed target responses as well as the probability of detection versus SNR for both detection techniques at Pfa = 1× 10−3. with N = 128 and M = 50

  10. Space moving target detection using time domain feature

    NASA Astrophysics Data System (ADS)

    Wang, Min; Chen, Jin-yong; Gao, Feng; Zhao, Jin-yu

    2018-01-01

    Traditional space target detection methods mainly use the spatial characteristics of the star map to detect targets, and thus cannot make full use of time-domain information. This paper presents a new space moving target detection method based on time-domain features. We first construct the time-spectral data of the star map, then analyze the time-domain features of the main objects (target, stars, and background) in star maps, and finally detect the moving targets using the single-pulse feature of the time-domain signal. Experimental results on real star maps show that the proposed method can effectively detect the trajectory of moving targets in the star map sequence, and the detection probability reaches 99% at a false alarm rate of about 8×10^-5, which outperforms the compared algorithms.
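The single-pulse idea can be sketched as follows (a hypothetical simplification, not the authors' method): in a pixel's time series, a star is a persistent bright level that per-pixel median subtraction removes, while a moving target lights the pixel in only one or a few frames, leaving a short transient pulse.

```python
import numpy as np

def pulse_pixels(stack, pulse_sigma=5.0, persist_frac=0.2):
    """Flag pixels whose time series contains a short transient pulse
    (candidate moving target) rather than a persistent level (star).
    stack: (frames, rows, cols)."""
    frames = stack.shape[0]
    med = np.median(stack, axis=0)
    mad = 1.4826 * np.median(np.abs(stack - med), axis=0) + 1e-9
    bright = (stack - med) / mad > pulse_sigma     # per-frame exceedances
    counts = bright.sum(axis=0)
    # a target lights a pixel in only a few frames; a star in most of them
    return (counts > 0) & (counts <= persist_frac * frames)

# Synthetic stack: noise, one persistent star, and a target moving one pixel
# per frame along the diagonal
rng = np.random.default_rng(3)
stack = rng.standard_normal((30, 20, 20))
stack[:, 5, 12] += 50.0                 # star: bright in every frame
for t in range(20):
    stack[t, t, t] += 10.0              # target: single pulse per pixel
mask = pulse_pixels(stack)
```

The diagonal pixels are flagged while the star pixel is not, since its own median absorbs the persistent brightness.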

  11. Comparing scat detection dogs, cameras, and hair snares for surveying carnivores

    USGS Publications Warehouse

    Long, Robert A.; Donovan, T.M.; MacKay, Paula; Zielinski, William J.; Buzas, Jeffrey S.

    2007-01-01

    Carnivores typically require large areas of habitat, exist at low natural densities, and exhibit elusive behavior - characteristics that render them difficult to study. Noninvasive survey methods increasingly provide means to collect extensive data on carnivore occupancy, distribution, and abundance. During the summers of 2003-2004, we compared the abilities of scat detection dogs, remote cameras, and hair snares to detect black bears (Ursus americanus), fishers (Martes pennanti), and bobcats (Lynx rufus) at 168 sites throughout Vermont. All 3 methods detected black bears; neither fishers nor bobcats were detected by hair snares. Scat detection dogs yielded the highest raw detection rate and probability of detection (given presence) for each of the target species, as well as the greatest number of unique detections (i.e., occasions when only one method detected the target species). We estimated that the mean probability of detecting the target species during a single visit to a site with a detection dog was 0.87 for black bears, 0.84 for fishers, and 0.27 for bobcats. Although the cost of surveying with detection dogs was higher than that of remote cameras or hair snares, the efficiency of this method rendered it the most cost-effective survey method.
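Single-visit detection probabilities like those reported above translate directly into survey effort: with probability p per independent visit (given presence), the cumulative detection probability over n visits is 1 - (1 - p)^n. A small sketch using the reported dog values (0.87, 0.84, 0.27); the 95% design target is an illustrative assumption.

```python
import math

def cumulative_detection(p_single, n_visits):
    """P(detected at least once in n independent visits), given presence."""
    return 1.0 - (1.0 - p_single) ** n_visits

def visits_needed(p_single, target=0.95):
    """Smallest number of visits achieving the target cumulative probability."""
    return math.ceil(math.log(1.0 - target) / math.log(1.0 - p_single))

# Single-visit probabilities reported above for scat detection dogs
plan = {species: visits_needed(p)
        for species, p in [("black bear", 0.87), ("fisher", 0.84), ("bobcat", 0.27)]}
```

The low single-visit probability for bobcats dominates the design: about ten visits are needed to reach the same confidence that two visits give for bears or fishers, which is where the cost-effectiveness comparison between methods matters most.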

  12. A Waveform Detector that Targets Template-Decorrelated Signals and Achieves its Predicted Performance: Demonstration with IMS Data

    NASA Astrophysics Data System (ADS)

    Carmichael, J.

    2016-12-01

    Waveform correlation detectors used in seismic monitoring scan multichannel data to test two competing hypotheses: that the data contain (1) a noisy, amplitude-scaled version of a template waveform, or (2) only noise. In reality, seismic wavefields include signals triggered by non-target sources (background seismicity) and target signals that are only partially correlated with the waveform template. We reformulate the waveform correlation detector hypothesis test to accommodate deterministic uncertainty in template/target waveform similarity and thereby derive a new detector from convex set projections (the "cone detector") for use in explosion monitoring. Our analyses give probability density functions that quantify the detectors' degraded performance with decreasing waveform similarity. We then apply our results to three announced North Korean nuclear tests and use International Monitoring System (IMS) arrays to determine the probability that low-magnitude, off-site explosions can be reliably detected with a given waveform template. We demonstrate that cone detectors provide (1) an improved predictive capability over correlation detectors to identify such spatially separated explosive sources, (2) competitive detection rates, and (3) reduced false alarms on background seismicity. Figure Caption: Observed and predicted receiver operating characteristic curves for correlation statistic r(x) (left) and cone statistic s(x) (right) versus semi-empirical explosion magnitude. a: Shaded region shows the range of ROC curves for r(x) that give the predicted detection performance in noise conditions recorded over 24 hr on 8 October 2006. The superimposed stair plot shows the empirical detection performance (recorded detections/total events) averaged over 24 hr of data. Error bars indicate the demeaned range in observed detection probability over the day; means are removed to avoid the risk of misinterpreting the range as indicating probabilities can exceed one.
    b: Shaded region shows the range of ROC curves for s(x) that give the predicted detection performance for the cone detector. The superimposed stair plot shows the observed detection performance averaged over 24 hr of data, analogous to that shown in a.

  13. Target Coverage in Wireless Sensor Networks with Probabilistic Sensors

    PubMed Central

    Shan, Anxing; Xu, Xianghua; Cheng, Zongmao

    2016-01-01

    Sensing coverage is a fundamental problem in wireless sensor networks (WSNs) that has attracted considerable attention. Conventional research on this topic focuses on the 0/1 coverage model, which is only a coarse approximation of the practical sensing model. In this paper, we study the target coverage problem, where the objective is to find the minimum number of sensor nodes in randomly deployed WSNs based on the probabilistic sensing model. We analyze the joint detection probability of a target with multiple sensors. Based on the theoretical analysis of the detection probability, we formulate the minimum ϵ-detection coverage problem. We prove that the minimum ϵ-detection coverage problem is NP-hard and present an approximation algorithm called the Probabilistic Sensor Coverage Algorithm (PSCA) with provable approximation ratios. To evaluate our design, we analyze the performance of PSCA theoretically and also perform extensive simulations to demonstrate the effectiveness of our proposed algorithm. PMID:27618902

  14. The Spitzer search for the transits of HARPS low-mass planets. II. Null results for 19 planets

    NASA Astrophysics Data System (ADS)

    Gillon, M.; Demory, B.-O.; Lovis, C.; Deming, D.; Ehrenreich, D.; Lo Curto, G.; Mayor, M.; Pepe, F.; Queloz, D.; Seager, S.; Ségransan, D.; Udry, S.

    2017-05-01

    Short-period super-Earths and Neptunes are now known to be very frequent around solar-type stars. Improving our understanding of these mysterious planets requires the detection of a significant sample of objects suitable for detailed characterization. Searching for the transits of the low-mass planets detected by Doppler surveys is a straightforward way to achieve this goal. Indeed, Doppler surveys target the most nearby main-sequence stars, they regularly detect close-in low-mass planets with significant transit probability, and their radial velocity data strongly constrain the ephemeris of possible transits. In this context, we initiated in 2010 an ambitious Spitzer multi-Cycle transit search project that targeted 25 low-mass planets detected by radial velocity, focusing mainly on the shortest-period planets detected by the HARPS spectrograph. We report here null results for 19 targets of the project. For 16 planets out of 19, a transiting configuration is strongly disfavored or firmly rejected by our data for most planetary compositions. We derive a posterior probability of 83% that none of the probed 19 planets transits (for a prior probability of 22%), which still leaves a significant probability of 17% that at least one of them does transit. Globally, our Spitzer project revealed or confirmed transits for three of its 25 targeted planets, and discarded or disfavored the transiting nature of 20 of them. Our light curves demonstrate excellent photometric precision for Warm Spitzer: for 14 targets out of 19, we were able to reach standard deviations better than 50 ppm per 30-min interval. Combined with its Earth-trailing orbit, which makes it capable of pointing at any star in the sky and monitoring it continuously for days, this work confirms Spitzer as an optimal instrument for detecting sub-mmag-deep transits on the bright nearby stars targeted by Doppler surveys.
The photometric and radial velocity time series used in this work are only available at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (http://130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/601/A117

  15. Development and Testing of a Multiple Frequency Continuous Wave Radar for Target Detection and Classification

    DTIC Science & Technology

    2007-03-01

    where I0(·) is the modified Bessel function of order zero, σ² is the conditional variance, and p(v|H1) is the conditional probability density of the envelope, p(v|H1) = (v/σ²) exp(−(v² + A²)/(2σ²)) I0(vA/σ²) (34). The probability of detection is the area under the signal-plus-noise curve above the detection threshold V_T: Pd = ∫ from V_T to ∞ of p(v|H1) dv.
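This fragment describes the standard Rician-envelope detection integral, which can be evaluated numerically (an illustrative sketch; the threshold here is set for an assumed Pfa of 10^-3 and the amplitude A = 5 is an assumption):

```python
import numpy as np

def rician_pd(A, v_t, sigma=1.0, n=200_000):
    """Pd = integral over [v_t, inf) of (v/sigma^2) exp(-(v^2+A^2)/(2 sigma^2))
    * I0(v*A/sigma^2) dv, evaluated by the midpoint rule; np.i0 is the
    modified Bessel function of order zero."""
    v_max = v_t + 12.0 * sigma + A            # tail beyond this is negligible
    dv = (v_max - v_t) / n
    v = v_t + (np.arange(n) + 0.5) * dv
    pdf = (v / sigma**2) * np.exp(-(v**2 + A**2) / (2 * sigma**2)) \
          * np.i0(v * A / sigma**2)
    return float(pdf.sum() * dv)

def rayleigh_pfa(v_t, sigma=1.0):
    """With no signal (A = 0) the envelope is Rayleigh, so Pfa is closed form."""
    return float(np.exp(-v_t**2 / (2.0 * sigma**2)))

v_t = float(np.sqrt(2.0 * np.log(1e3)))   # threshold giving Pfa = 1e-3
pd = rician_pd(5.0, v_t)                  # Pd for signal amplitude A = 5
```

Setting A = 0 recovers the Rayleigh false-alarm probability exactly, a useful self-check on the integration.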

  16. Organic Over-the-Horizon Targeting for the 2025 Surface Fleet

    DTIC Science & Technology

    2015-06-01

    Detection Phit Probability of Hit Pk Probability of Kill PLAN People’s Liberation Army Navy PMEL Pacific Marine Environmental Laboratory...probability of hit ( Phit ). 2. Top-Level Functional Flow Block Diagram With the high-level functions of the project’s systems of systems properly

  17. Minimum time search in uncertain dynamic domains with complex sensorial platforms.

    PubMed

    Lanillos, Pablo; Besada-Portas, Eva; Lopez-Orozco, Jose Antonio; de la Cruz, Jesus Manuel

    2014-08-04

    The minimum time search in uncertain domains is a search task that appears in real-world problems such as natural disasters and sea rescue operations, where a target has to be found, as soon as possible, by a set of sensor-equipped searchers. The automation of this task, where the time to detect the target is critical, can be achieved by new probabilistic techniques that directly minimize the Expected Time (ET) to detect a dynamic target using the observation probability models and the actual observations collected by the sensors on board the searchers. The selected technique, described in algorithmic form in this paper for completeness, has previously been only partially tested with an ideal binary detection model, in spite of being designed to deal with complex non-linear/non-differential sensorial models. This paper covers that gap, testing its performance and applicability over different search tasks with searchers equipped with different complex sensors. The sensor models under test vary from stepped detection probabilities to continuous/discontinuous, differentiable/non-differentiable detection probabilities dependent on distance, orientation, and structured maps. The analysis of the simulated results of several static and dynamic scenarios validates the applicability of the technique with different types of sensor models.

  18. Statistical methods for identifying and bounding a UXO target area or minefield

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McKinstry, Craig A.; Pulsipher, Brent A.; Gilbert, Richard O.

    2003-09-18

    The sampling unit for minefield or UXO area characterization is typically represented by a geographical block or transect swath that lends itself to characterization by geophysical instrumentation such as mobile sensor arrays. New spatially based statistical survey methods and tools, more appropriate for these unique sampling units, have been developed and implemented at PNNL (Visual Sample Plan software, ver. 2.0) with support from the US Department of Defense. Though originally developed to support UXO detection and removal efforts, these tools may also be used in current form or adapted to support demining efforts and aid in the development of new sensors and detection technologies by explicitly incorporating both sampling and detection error in performance assessments. These tools may be used to (1) determine transect designs for detecting and bounding target areas of critical size, shape, and density of detectable items of interest with a specified confidence probability, (2) evaluate the probability that target areas of a specified size, shape and density have not been missed by a systematic or meandering transect survey, and (3) support post-removal verification by calculating the number of transects required to achieve a specified confidence probability that no UXO or mines have been missed.

  20. Pattern recognition for passive polarimetric data using nonparametric classifiers

    NASA Astrophysics Data System (ADS)

    Thilak, Vimal; Saini, Jatinder; Voelz, David G.; Creusere, Charles D.

    2005-08-01

    Passive polarization-based imaging is a useful tool in computer vision and pattern recognition. A passive polarization imaging system forms a polarimetric image from the reflection of ambient light that contains useful information for computer vision tasks such as object detection (classification) and recognition. Applications of polarization-based pattern recognition include material classification and automatic shape recognition. In this paper, we present two target detection algorithms for images captured by a passive polarimetric imaging system. The proposed detection algorithms are based on Bayesian decision theory. In these approaches, an object can belong to one of a given number of classes, and classification involves making decisions that minimize the average probability of making incorrect decisions. This minimum is achieved by assigning an object to the class that maximizes the a posteriori probability. Computing a posteriori probabilities requires estimates of class-conditional probability density functions (likelihoods) and prior probabilities. A probabilistic neural network (PNN), a nonparametric method that can compute Bayes-optimal boundaries, and a k-nearest neighbor (KNN) classifier are used for density estimation and classification. The proposed algorithms are applied to polarimetric image data gathered in the laboratory with a liquid crystal-based system. The experimental results validate the effectiveness of the above algorithms for target detection from polarimetric data.

  1. Attention to baseline: does orienting visuospatial attention really facilitate target detection?

    PubMed

    Albares, Marion; Criaud, Marion; Wardak, Claire; Nguyen, Song Chi Trung; Ben Hamed, Suliann; Boulinguez, Philippe

    2011-08-01

    Standard protocols testing the orienting of visuospatial attention usually present spatial cues before targets and compare valid-cue trials with invalid-cue trials. The valid/invalid contrast yields a relative behavioral or physiological difference that is generally interpreted as a benefit of attention orienting. However, growing evidence suggests that inhibitory control of responses is closely involved in this kind of protocol, which requires subjects to withhold automatic responses to cues, probably biasing behavioral and physiological baselines. Here, we used two experiments to disentangle the inhibitory control of automatic responses from the orienting of visuospatial attention in a saccadic reaction time task in humans: a variant of the classical cue-target detection task and a sustained visuospatial attention task. Surprisingly, when referring to a simple target detection task in which there is no need to refrain from reacting to avoid inappropriate responses, we found no consistent evidence of facilitation of target detection at the attended location. Instead, we observed a cost at the unattended location. Departing from the classical view, our results suggest that reaction time measures of visuospatial attention probably rely on the attenuation of elementary processes involved in visual target detection and saccade initiation away from the attended location, rather than on facilitation at the attended location. This highlights the need for proper control conditions in experimental designs to disambiguate relative from absolute cueing benefits on target detection reaction times, in both psychophysical and neurophysiological studies.

  2. Target intersection probabilities for parallel-line and continuous-grid types of search

    USGS Publications Warehouse

    McCammon, R.B.

    1977-01-01

    The expressions for calculating the probability of intersection of hidden targets of different sizes and shapes for parallel-line and continuous-grid types of search can be formulated by using the concept of conditional probability. When the prior probability of the orientation of a hidden target is represented by a uniform distribution, the calculated posterior probabilities are identical with the results obtained by the classic methods of probability. For hidden targets of different sizes and shapes, the following generalizations about the probability of intersection can be made: (1) to a first approximation, the probability of intersection of a hidden target is proportional to the ratio of the greatest dimension of the target (viewed in plane projection) to the minimum line spacing of the search pattern; (2) the shape of the hidden target does not greatly affect the probability of intersection when the largest dimension of the target is small relative to the minimum spacing of the search pattern; (3) the probability of intersecting a target twice for a particular type of search can be used as a lower bound if there is an element of uncertainty of detection for a particular type of tool; (4) the geometry of the search pattern becomes more critical when the largest dimension of the target equals or exceeds the minimum spacing of the search pattern; (5) for elongate targets, the probability of intersection is greater for parallel-line search than for an equivalent continuous square-grid search when the largest dimension of the target is less than the minimum spacing of the search pattern, whereas the opposite is true when the largest dimension exceeds the minimum spacing; (6) the probability of intersection for nonorthogonal continuous-grid search patterns is not greatly different from that for the equivalent orthogonal continuous-grid pattern when the orientation of the target is unknown. 
The probability of intersection for an elliptically shaped target can be approximated by treating the ellipse as intermediate between a circle and a line. A search conducted along a continuous rectangular grid can be represented as intermediate between a search along parallel lines and along a continuous square grid. On this basis, an upper and lower bound for the probability of intersection of an elliptically shaped target for a continuous rectangular grid can be calculated. Charts have been constructed that permit the values for these probabilities to be obtained graphically. The use of conditional probability allows the explorationist greater flexibility in considering alternate search strategies for locating hidden targets. ?? 1977 Plenum Publishing Corp.
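Generalizations (1) and (5) above are easy to check by Monte Carlo under the stated uniform-orientation assumption (an illustrative sketch): for parallel lines with spacing s, a randomly placed and oriented ellipse is intersected with probability E[min(1, w/s)], where w is its extent perpendicular to the lines; for a circle of radius r this is exactly min(1, 2r/s).

```python
import math
import random

def intersect_prob(a, b, spacing, trials=200_000, seed=4):
    """Monte Carlo probability that an ellipse with semi-axes a, b (uniform
    random position and orientation) is crossed by at least one line of a
    parallel-line search with the given line spacing."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        theta = rng.uniform(0.0, math.pi)
        # extent of the rotated ellipse perpendicular to the search lines
        w = 2.0 * math.sqrt((a * math.sin(theta))**2 + (b * math.cos(theta))**2)
        offset = rng.uniform(0.0, spacing)   # center position between two lines
        if w >= spacing or offset < w / 2 or spacing - offset < w / 2:
            hits += 1
    return hits / trials

p_circle = intersect_prob(1.0, 1.0, 4.0)   # closed form: 2r/s = 0.5
p_line = intersect_prob(1.0, 0.0, 4.0)     # degenerate ellipse (a line segment)
p_ellipse = intersect_prob(1.0, 0.5, 4.0)  # intermediate between the two
```

The ellipse result lands between the circle and line values, matching the abstract's treatment of the ellipse as intermediate between a circle and a line; the line-segment value tends to (2a/s)(2/π), illustrating generalization (1).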

  3. Assessing Performance Tradeoffs in Undersea Distributed Sensor Networks

    DTIC Science & Technology

    2006-09-01

    time. We refer to this process as track-before-detect (see [5] for a description), since the final determination of a target presence is not made until...expressions for probability of successful search and probability of false search for modeling the track-before-detect process. We then describe a numerical...random manner (randomly sampled from a uniform distribution). II. SENSOR NETWORK PERFORMANCE MODELS We model the process of track-before-detect by

  4. Variation in the standard deviation of the lure rating distribution: Implications for estimates of recollection probability.

    PubMed

    Dopkins, Stephen; Varner, Kaitlin; Hoyer, Darin

    2017-10-01

    In word recognition, semantic priming of test words increased the false-alarm rate and the mean of confidence ratings to lures. Such priming also increased the standard deviation of confidence ratings to lures and the slope of the z-ROC function, suggesting that the priming increased the standard deviation of the lure evidence distribution. The Unequal Variance Signal Detection (UVSD) model interpreted the priming as increasing the standard deviation of the lure evidence distribution. Without additional parameters, the Dual Process Signal Detection (DPSD) model could only accommodate the results by fitting the data for related and unrelated primes separately, interpreting the priming, implausibly, as decreasing the probability of target recollection. With an additional parameter for the probability of false (lure) recollection, the model could fit the data for related and unrelated primes together, interpreting the priming as increasing the probability of false recollection. These results suggest that DPSD estimates of target recollection probability will decrease with increases in the lure confidence/evidence standard deviation unless a parameter is included for false recollection. Unfortunately, the size of a given lure confidence/evidence standard deviation relative to other possible lure confidence/evidence standard deviations is often unspecified by context. Hence the model often has no way of estimating false recollection probability and thereby correcting its estimates of target recollection probability.

  5. Laser based in-situ and standoff detection of chemical warfare agents and explosives

    NASA Astrophysics Data System (ADS)

    Patel, C. Kumar N.

    2009-09-01

    Laser-based detection of gaseous, liquid, and solid residues and trace amounts has been under development ever since lasers were invented. However, the lack of reasonably high-power tunable lasers in the spectral regions where the relevant targets can be interrogated, as well as of appropriate techniques for high-sensitivity, high-selectivity detection, has hampered the practical exploitation of these techniques for detecting targets important for homeland security and defense applications. Furthermore, emphasis has been on selectivity without particular attention being paid to the impact of interfering species on the quality of detection. High sensitivity is necessary but not sufficient: it assures a high probability of detection of the target species. However, only recently has the sensor community come to recognize that any measure of probability of detection must be associated with a probability of false alarm if it is to have any value as a measure of performance. This is especially true when one attempts to compare the performance characteristics of different sensors based on different physical principles. In this paper, I provide a methodology for characterizing the performance of sensors utilizing optical absorption measurement techniques; the underlying principles, however, are equally applicable to all other sensors. While most of the current progress in high-sensitivity, high-selectivity detection of CWAs, TICs, and explosives involves identifying and quantifying the target species in situ, there is an urgent need for standoff detection of explosives from safe distances. I describe our results on CO2- and quantum cascade laser (QCL)-based photoacoustic sensors for the detection of CWAs, TICs, and explosives, as well as very new results on standoff detection of explosives at distances up to 150 meters. The latter results are critically important for assuring the safety of military personnel in battlefield environments, especially from improvised explosive devices (IEDs), and of civilian personnel from terrorist attacks in metropolitan areas.

  6. Quantum illumination for enhanced detection of Rayleigh-fading targets

    NASA Astrophysics Data System (ADS)

    Zhuang, Quntao; Zhang, Zheshen; Shapiro, Jeffrey H.

    2017-08-01

    Quantum illumination (QI) is an entanglement-enhanced sensing system whose performance advantage over a comparable classical system survives its usage in an entanglement-breaking scenario plagued by loss and noise. In particular, QI's error-probability exponent for discriminating between equally likely hypotheses of target absence or presence is 6 dB higher than that of the optimum classical system using the same transmitted power. This performance advantage, however, presumes that the target return, when present, has known amplitude and phase, a situation that seldom occurs in light detection and ranging (lidar) applications. At lidar wavelengths, most target surfaces are sufficiently rough that their returns are speckled, i.e., they have Rayleigh-distributed amplitudes and uniformly distributed phases. QI's optical parametric amplifier receiver—which affords a 3 dB better-than-classical error-probability exponent for a return with known amplitude and phase—fails to offer any performance gain for Rayleigh-fading targets. We show that the sum-frequency generation receiver [Zhuang et al., Phys. Rev. Lett. 118, 040801 (2017), 10.1103/PhysRevLett.118.040801]—whose error-probability exponent for a nonfading target achieves QI's full 6 dB advantage over optimum classical operation—outperforms the classical system for Rayleigh-fading targets. In this case, QI's advantage is subexponential: its error probability is lower than the classical system's by a factor of 1/ln(M κ̄ N_S/N_B) when M κ̄ N_S/N_B ≫ 1, with M ≫ 1 being the QI transmitter's time-bandwidth product, N_S ≪ 1 its brightness, κ̄ the target return's average intensity, and N_B the background light's brightness.

  7. A Track Initiation Method for the Underwater Target Tracking Environment

    NASA Astrophysics Data System (ADS)

    Li, Dong-dong; Lin, Yang; Zhang, Yao

    2018-04-01

    A novel, efficient track initiation method is proposed for the harsh underwater target tracking environment (heavy clutter and large measurement errors): the track splitting, evaluating, pruning and merging method (TSEPM). Track initiation demands that the method determine the existence and initial state of a target quickly and correctly. Heavy clutter and large measurement errors pose additional difficulties and challenges, which deteriorate and complicate track initiation in the harsh underwater target tracking environment. Current track initiation methods have three primary shortcomings: (a) they cannot effectively eliminate the disturbances of clutter; (b) they may yield a high false alarm probability and a low track detection probability; and (c) they cannot correctly estimate the initial state for a new confirmed track. Based on the multiple hypotheses tracking principle and a modified logic-based track initiation method, track splitting creates a large number of tracks, including the true track originating from the target, in order to increase the detection probability of a track. To decrease the false alarm probability, track pruning and track merging, based on the evaluation mechanism, are proposed to reduce the false tracks. The TSEPM method can deal with track initiation problems arising from heavy clutter and large measurement errors, determining the target's existence and estimating its initial state with the least squares method. What's more, our method is fully automatic and does not require any kind of manual input for initializing or tuning any parameter. Simulation results indicate that our new method significantly improves track initiation performance in the harsh underwater target tracking environment.

  8. Small target pre-detection with an attention mechanism

    NASA Astrophysics Data System (ADS)

    Wang, Yuehuan; Zhang, Tianxu; Wang, Guoyou

    2002-04-01

    We introduce the concept of predetection based on an attention mechanism to improve the efficiency of small-target detection by limiting the image region over which detection is performed. Given the characteristics of small-target detection, local contrast is taken as the only feature in predetection, and a nonlinear sampling model is adopted to make the predetection adaptive to small targets of different sizes. To simplify the predetection itself and decrease the false alarm probability, neighboring nodes in the sampling grid are used to generate a saliency map, and a short-term memory is adopted to accelerate the `pop-out' of targets. We show that the proposed approach has low computational complexity. In addition, even against a cluttered background, attention can be directed to targets in satisfyingly few iterations, which ensures that detection efficiency is not degraded by false alarms. Experimental results are presented to demonstrate the applicability of the approach.

  9. Detection probability of EBPSK-MODEM system

    NASA Astrophysics Data System (ADS)

    Yao, Yu; Wu, Lenan

    2016-07-01

    Since the impacting filter-based receiver is able to transform phase modulation into an amplitude peak, a simple threshold decision can detect the Extended Binary Phase Shift Keying (EBPSK) modulated ranging signal in a noisy environment. In this paper, an analysis of the EBPSK-MODEM system output gives the probability density function for EBPSK-modulated signals plus noise. The equation for detection probability (Pd) for fluctuating and non-fluctuating targets is derived. A comparison of Pd for the EBPSK-MODEM system and a pulse radar receiver is made, and some results are plotted. Moreover, the probability curves of such a system with several modulation parameters are analysed. When the modulation parameter is no smaller than 6, the detection performance of the EBPSK-MODEM system exceeds that of a traditional radar system. In addition to theoretical considerations, computer simulations are provided to illustrate the performance.
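
    As a rough illustration of the threshold-decision idea (not the paper's actual impacting-filter receiver), the following sketch estimates detection probability by Monte Carlo for a simple amplitude-threshold detector, comparing a non-fluctuating target with a Rayleigh-fluctuating one. The SNR, threshold, and fluctuation model are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def detection_probability(snr_db, threshold, n_trials=100_000, fluctuating=False):
    """Monte Carlo Pd for a simple amplitude-threshold detector.

    A unit-variance Gaussian noise sample is added to the target
    amplitude; detection is declared when the sum exceeds `threshold`.
    For a fluctuating (Swerling-like) target the amplitude is
    Rayleigh-distributed with the same mean power.
    """
    snr = 10 ** (snr_db / 10)
    amplitude = np.sqrt(snr)
    if fluctuating:
        # Rayleigh amplitude with mean-square value equal to snr
        amplitude = rng.rayleigh(scale=np.sqrt(snr / 2), size=n_trials)
    noise = rng.normal(0.0, 1.0, size=n_trials)
    return np.mean(amplitude + noise > threshold)

pd_steady = detection_probability(13.0, threshold=3.0)
pd_fluct = detection_probability(13.0, threshold=3.0, fluctuating=True)
print(f"Pd (non-fluctuating): {pd_steady:.3f}")
print(f"Pd (fluctuating):     {pd_fluct:.3f}")
```

    At the same average SNR, amplitude fluctuation lowers the detection probability, which is the qualitative contrast the abstract draws between the two target classes.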

  10. Cumulative detection probabilities and range accuracy of a pulsed Geiger-mode avalanche photodiode laser ranging system

    NASA Astrophysics Data System (ADS)

    Luo, Hanjun; Ouyang, Zhengbiao; Liu, Qiang; Chen, Zhiliang; Lu, Hualan

    2017-10-01

    Cumulative pulse detection with an appropriate number of accumulated pulses and an appropriate threshold can improve the detection performance of a pulsed laser ranging system with a GM-APD. In this paper, based on Poisson statistics and the multi-pulse accumulation process, the cumulative detection probabilities and the factors that influence them are investigated. With the normalized probability distribution of each time bin, a theoretical model of range accuracy and precision is established, and the factors limiting range accuracy and precision are discussed. The results show that cumulative pulse detection can produce a higher target detection probability and a lower false alarm probability. However, for a heavy noise level and an extremely weak echo intensity, the false alarm suppression performance of cumulative pulse detection deteriorates quickly. Range accuracy and precision are another important measure of detection performance; echo intensity and pulse width are their main influencing factors, with higher accuracy and precision obtained for stronger echo intensity and narrower echo pulse width. For a 5-ns echo pulse width, when the echo intensity is larger than 10, range accuracy and precision better than 7.5 cm can be achieved.
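
    The Poisson/binomial structure of cumulative pulse detection can be sketched as follows. The count rates and thresholds below are invented for illustration, not taken from the paper: a pulse "fires" when its Poisson photocount reaches a count threshold, and a cumulative detection is declared when at least k of n pulses fire.

```python
from scipy import stats

def per_pulse_prob(lam, m):
    # P(Poisson(lam) >= m): probability that one pulse crosses count threshold m
    return stats.poisson.sf(m - 1, lam)

def cumulative_prob(p_single, n_pulses, k):
    # P(at least k of n_pulses independent pulses cross the threshold)
    return stats.binom.sf(k - 1, n_pulses, p_single)

lam_noise, lam_signal = 0.1, 2.0   # assumed mean primary-electron counts per gate
m, n_pulses, k = 2, 10, 3          # count threshold, pulses accumulated, pulse threshold

pd_single = per_pulse_prob(lam_noise + lam_signal, m)
pfa_single = per_pulse_prob(lam_noise, m)
cum_pd = cumulative_prob(pd_single, n_pulses, k)
cum_pfa = cumulative_prob(pfa_single, n_pulses, k)
print("cumulative Pd :", cum_pd)
print("cumulative Pfa:", cum_pfa)
```

    With these illustrative numbers, accumulation drives the detection probability close to one while suppressing the false-alarm probability by several orders of magnitude, matching the qualitative claim in the abstract.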

  11. Wavelength band selection method for multispectral target detection.

    PubMed

    Karlholm, Jörgen; Renhorn, Ingmar

    2002-11-10

    A framework is proposed for the selection of wavelength bands for multispectral sensors using hyperspectral reference data. Using results from detection theory, we derive a cost function that is minimized by a set of spectral bands that is optimal in terms of detection performance for discrimination between a class of small rare targets and clutter with a known spectral distribution. The method may be used, e.g., in the design of multispectral infrared search-and-track and electro-optical missile warning sensors, where a low false-alarm rate and a high detection probability for small targets against a clutter background are of critical importance, but the required high frame rate prevents the use of hyperspectral sensors.

  12. Simulation of target interpretation based on infrared image features and psychology principle

    NASA Astrophysics Data System (ADS)

    Lin, Wei; Chen, Yu-hua; Gao, Hong-sheng; Wang, Zhan-feng; Wang, Ji-jun; Su, Rong-hua; Huang, Yan-ping

    2009-07-01

    Target feature extraction and identification are important and complicated processes in target interpretation; they directly affect the interpreter's psychosensory response to the target infrared image and ultimately determine target detectability. Using statistical decision theory and psychological principles, and designing four psychophysical experiments, an interpretation model for infrared targets is established. The model obtains the target detection probability by calculating the similarity of four features between the target region and the background region, which are delineated on the infrared image. Verified against a large number of practical target interpretations, the model can effectively simulate the target interpretation and detection process and yields objective interpretation results, which can provide technical support for target extraction, identification, and decision-making.

  13. Simulation Model of Mobile Detection Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Edmunds, T; Faissol, D; Yao, Y

    2009-01-27

    In this paper, we consider a mobile source that we attempt to detect with man-portable, vehicle-mounted, or boat-mounted radiation detectors. The source is assumed to transit an area populated with these mobile detectors, and the objective is to detect the source before it reaches a perimeter. We describe a simulation model developed to estimate the probability that one of the mobile detectors will come into close proximity of the moving source and detect it. We illustrate with a maritime simulation example. Our simulation takes place in a 10 km by 5 km rectangular bay patrolled by boats equipped with 2-inch x 4-inch x 16-inch NaI detectors. Boats to be inspected enter the bay and randomly proceed to one of seven harbors on the shore. A source-bearing boat enters the mouth of the bay and proceeds to a pier on the opposite side. We wish to determine the probability that the source is detected and its range from the target when detected. Patrol boats select the nearest inbound boat for inspection and initiate an intercept course. Once within the operational range of the detection system, a detection algorithm is started. If the patrol boat confirms the source is not present, it selects the next nearest boat for inspection. Each run of the simulation ends either when a patrol successfully detects a source or when the source reaches its target. Several statistical detection algorithms have been implemented in the simulation model. First, a simple k-sigma algorithm, which alarms when the counts in a time window exceed the mean background plus k times the standard deviation of the background, is available to the user. The time window used is optimized with respect to the signal-to-background ratio for that range and relative speed. Second, a sequential probability ratio test [Wald 1947] is available, configured in this simulation with a target false positive probability of 0.001 and false negative probability of 0.1.
    This test is utilized when the mobile detector maintains a constant range to the vessel being inspected. Finally, a variation of the sequential probability ratio test that is more appropriate when source and detector are not at constant range is available [Nelson 2005]. Each patrol boat in the fleet can be assigned a particular zone of the bay, or all boats can be assigned to monitor the entire bay. Boats assigned to a zone will only intercept and inspect other boats when they enter their zone. In our example simulation, each of two patrol boats operates in a 5 km by 5 km zone. Other parameters for this example include: (1) Detection range - 15 m range maintained between patrol boat and inspected boat; (2) Inbound boat arrival rate - Poisson process with mean arrival rate of 30 boats per hour; (3) Speed of boats to be inspected - random between 4.5 and 9 knots; (4) Patrol boat speed - 10 knots; (5) Number of detectors per patrol boat - four 2-inch x 4-inch x 16-inch NaI detectors; (6) Background radiation - 40 counts/sec per detector; and (7) Detector response due to radiation source at 1 meter - 1,589 counts/sec per detector. Simulation results indicate that two patrol boats are able to detect the source 81% of the time without zones and 90% of the time with zones. The average distance between the source and target at the end of the simulation is 5,866 m and 5,712 m for non-zoned and zoned patrols, respectively. For those runs in which the source did not reach the target, the average distance to the target is 7,305 m and 6,441 m, respectively. Note that a design trade-off exists: while zoned patrols provide a higher probability of detection, the non-zoned patrols tend to detect the source farther from its target. Figure 1 displays the location of the source at the end of 1,000 simulations for the 5 x 10 km bay simulation.
    The simulation model and analysis described here can be used to determine the number of mobile detectors one would need to deploy in order to have a reasonable chance of detecting a source in transit. By fixing the source speed to zero, the same model could be used to estimate how long it would take to detect a stationary source. For example, the model could predict how long it would take plant staff carrying dosimeters while performing assigned duties to discover a contaminated spot in the facility.
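
    The two alarm rules mentioned above, k-sigma and Wald's sequential probability ratio test, can be sketched for Poisson counts as follows. The decision bounds follow Wald's standard approximations for α = 0.001 and β = 0.1; the count rates and streams are illustrative, not the paper's data.

```python
import math

def k_sigma_alarm(counts, background_mean, k=3.0):
    """Alarm when counts in a window exceed mean background plus k sigma.
    For Poisson background, sigma = sqrt(mean)."""
    return counts > background_mean + k * math.sqrt(background_mean)

def sprt_poisson(counts_stream, b, s, alpha=0.001, beta=0.1):
    """Wald sequential probability ratio test for Poisson counts.

    b: background rate per interval, s: source-plus-background rate.
    Returns 'source', 'background', or 'undecided'."""
    upper = math.log((1 - beta) / alpha)   # accept 'source present'
    lower = math.log(beta / (1 - alpha))   # accept 'background only'
    llr = 0.0
    for c in counts_stream:
        # log-likelihood ratio increment for one Poisson observation
        llr += c * math.log(s / b) - (s - b)
        if llr >= upper:
            return "source"
        if llr <= lower:
            return "background"
    return "undecided"

print(k_sigma_alarm(counts=70, background_mean=40))        # well above 3 sigma
print(sprt_poisson([80, 85, 90, 88], b=40, s=80))
```

    Unlike the fixed-window k-sigma rule, the SPRT accumulates evidence and stops as soon as either error bound is crossed, which is why it suits an inspection at constant range.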

  14. A Bayesian Method for Managing Uncertainties Relating to Distributed Multistatic Sensor Search

    DTIC Science & Technology

    2006-07-01

    before-detect process. There will also be an increased probability of high signal-to-noise ratio (SNR) detections associated with specular and near...and high target strength and high Doppler opportunities give rise to the expectation of an increased number of detections that could feed a track

  15. Multi-Agent Cooperative Target Search

    PubMed Central

    Hu, Jinwen; Xie, Lihua; Xu, Jun; Xu, Zhao

    2014-01-01

    This paper addresses vision-based cooperative search for multiple mobile ground targets by a group of unmanned aerial vehicles (UAVs) with limited sensing and communication capabilities. The airborne camera on each UAV has a limited field of view, and its target discriminability varies as a function of altitude. First, by dividing the whole surveillance region into cells, a probability map can be formed for each UAV indicating the probability of target existence within each cell. Then, we propose a distributed probability-map updating model which includes the fusion of measurement information, information sharing among neighboring agents, and information decay and transmission due to environmental changes such as target movement. Furthermore, we formulate the target search problem as a multi-agent cooperative coverage control problem by optimizing the collective coverage area and the detection performance. The proposed map updating model and cooperative control scheme are distributed, i.e., each agent communicates only with neighbors within its communication range. Finally, the effectiveness of the proposed algorithms is illustrated by simulation. PMID:24865884
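
    A minimal sketch of the kind of per-cell Bayesian update such a probability map might use. The sensor's Pd/Pfa values and the grid are assumptions, and the paper's full model also includes inter-agent fusion and information decay, which are omitted here.

```python
import numpy as np

def update_probability_map(prior, observed_cells, detections, pd=0.9, pfa=0.05):
    """Bayesian update of a target-existence probability map.

    prior: 2D array of per-cell target probabilities.
    observed_cells: boolean mask of cells inside the camera footprint.
    detections: boolean mask of cells where the sensor reported a target.
    pd / pfa: assumed sensor detection and false-alarm probabilities.
    """
    post = prior.copy()
    # detection observed: P(target | detect) via Bayes' rule
    hit = observed_cells & detections
    post[hit] = pd * prior[hit] / (pd * prior[hit] + pfa * (1 - prior[hit]))
    # no detection observed: P(target | no detect)
    miss = observed_cells & ~detections
    post[miss] = (1 - pd) * prior[miss] / (
        (1 - pd) * prior[miss] + (1 - pfa) * (1 - prior[miss]))
    return post

prior = np.full((4, 4), 0.5)
fov = np.zeros((4, 4), dtype=bool); fov[:2, :2] = True   # observed cells
det = np.zeros((4, 4), dtype=bool); det[0, 0] = True     # one detection
post = update_probability_map(prior, fov, det)
print(post[0, 0], post[0, 1], post[2, 2])  # raised, lowered, unchanged
```

    Cells with a detection move toward 1, observed cells without a detection move toward 0, and unobserved cells keep their prior, which is the basic behavior a search map needs before fusion and decay are layered on.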

  16. A Search Model for Imperfectly Detected Targets

    NASA Technical Reports Server (NTRS)

    Ahumada, Albert

    2012-01-01

    Under the assumptions that 1) the search region can be divided into N non-overlapping sub-regions that are searched sequentially, 2) the probability of detection is unity if a sub-region is selected, and 3) no information is available to guide the search, there are two extreme-case models. The search can be done perfectly, leading to a uniform distribution over the number of searches required, or the search can be done with no memory, leading to a geometric distribution for the number of searches required with success probability 1/N. If the probability of detection P is less than unity, but the search is otherwise done perfectly, the searcher will have to search the N regions repeatedly until detection occurs. The number of searches is thus the sum of two random variables: one is N times the number of full searches (a geometric distribution with success probability P) and the other is uniform over the integers 1 to N. The first three moments of this distribution were computed, giving the mean, standard deviation, and kurtosis of the distribution as a function of the two parameters. The model was fit to the data presented last year (Ahumada, Billington, & Kaiwi) on the number of searches required to find a single-pixel target on a simulated horizon. The model gave a good fit to the three moments for all three observers.
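
    A quick Monte Carlo check of the model's mean, under the stated assumptions (detection probability P per visit to the target's region, target position uniform over 1..N); the parameter values are illustrative.

```python
import random

def simulate_searches(n_regions, p_detect, trials=200_000, seed=1):
    """Monte Carlo for the imperfect-detection search model:
    total searches = n_regions * (number of failed full passes)
                     + uniform position of the target region."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        # failed full passes before the detecting pass (geometric, support 0, 1, 2, ...)
        failures = 0
        while rng.random() > p_detect:
            failures += 1
        position = rng.randint(1, n_regions)  # uniform over 1..N
        total += n_regions * failures + position
    return total / trials

N, P = 10, 0.5
mean_est = simulate_searches(N, P)
mean_theory = N * (1 - P) / P + (N + 1) / 2
print(mean_est, mean_theory)  # both near 15.5
```

    The simulated mean matches the analytic mean N(1-P)/P + (N+1)/2, the first of the three moments the paper computes.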

  17. Lidar detection algorithm for time and range anomalies.

    PubMed

    Ben-David, Avishai; Davidson, Charles E; Vanderbeek, Richard G

    2007-10-10

    A new detection algorithm for lidar applications has been developed. The detection is based on hyperspectral anomaly detection that is implemented for time anomaly where the question "is a target (aerosol cloud) present at range R within time t(1) to t(2)" is addressed, and for range anomaly where the question "is a target present at time t within ranges R(1) and R(2)" is addressed. A detection score significantly different in magnitude from the detection scores for background measurements suggests that an anomaly (interpreted as the presence of a target signal in space/time) exists. The algorithm employs an option for a preprocessing stage where undesired oscillations and artifacts are filtered out with a low-rank orthogonal projection technique. The filtering technique adaptively removes the one over range-squared dependence of the background contribution of the lidar signal and also aids visualization of features in the data when the signal-to-noise ratio is low. A Gaussian-mixture probability model for two hypotheses (anomaly present or absent) is computed with an expectation-maximization algorithm to produce a detection threshold and probabilities of detection and false alarm. Results of the algorithm for CO(2) lidar measurements of bioaerosol clouds Bacillus atrophaeus (formerly known as Bacillus subtilis niger, BG) and Pantoea agglomerans, Pa (formerly known as Erwinia herbicola, Eh) are shown and discussed.
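
    The final step, turning a two-hypothesis Gaussian model into a threshold with associated detection and false-alarm probabilities, can be sketched like this. The component means, variances, and weights are invented for illustration; the paper estimates them from lidar data with an expectation-maximization algorithm.

```python
from scipy import stats
from scipy.optimize import brentq

# assumed Gaussian detection-score models for the two hypotheses
bg = stats.norm(0.0, 1.0)   # anomaly (target signal) absent
tg = stats.norm(3.0, 1.5)   # anomaly present
w_bg, w_tg = 0.95, 0.05     # assumed mixture weights (priors)

# decision threshold where the weighted densities are equal
thr = brentq(lambda x: w_bg * bg.pdf(x) - w_tg * tg.pdf(x), 0.0, 3.0)

p_fa = bg.sf(thr)  # probability of false alarm
p_d = tg.sf(thr)   # probability of detection
print(f"threshold={thr:.2f}  Pfa={p_fa:.3f}  Pd={p_d:.3f}")
```

    Scores above the threshold are declared anomalies; shifting the weights or variances moves the threshold and trades detection probability against false alarms.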

  18. Visual performance on detection tasks with double-targets of the same and different difficulty.

    PubMed

    Chan, Alan H S; Courtney, Alan J; Ma, C W

    2002-10-20

    This paper reports a study measuring horizontal visual sensitivity limits for 16 subjects in single-target and double-target detection tasks. Two phases of tests were conducted in the double-target task: targets of the same difficulty were tested in phase one, while targets of different difficulty were tested in phase two. The range of sensitivity for the double-target test was found to be smaller than that for the single-target test in both the same- and different-difficulty cases. The presence of another target markedly affected performance. The interference effect of the difficult target on detection of the easy one was greater than that of the easy one on detection of the difficult one. Performance decrement was noted when the percentage of correct detections was plotted against target eccentricity in both the single-target and double-target tests. Nevertheless, the non-significant correlation found between performance on the two tasks demonstrated that it is impossible to predict quantitatively the ability to detect double targets from data for single targets. This indicates probable problems in generalizing data for single-target visual lobes to multiple targets. Also, lobe area values obtained from measurements using a single-target task cannot be applied in a mathematical model for situations with multiple occurrences of targets.

  19. Bayesian performance metrics of binary sensors in homeland security applications

    NASA Astrophysics Data System (ADS)

    Jannson, Tomasz P.; Forrester, Thomas C.

    2008-04-01

    Bayesian performance metrics based on parameters such as prior probability, probability of detection (or accuracy), false alarm rate, and positive predictive value characterize the performance of binary sensors, i.e., sensors that have only a binary response: true target/false target. Such binary sensors, very common in Homeland Security, produce an alarm that can be true or false. Applications include X-ray airport inspection, IED inspections, product quality control, cancer medical diagnosis, parts of ATR, and many others. In this paper, we analyze direct and inverse conditional probabilities in the context of Bayesian inference and binary sensors, using X-ray luggage inspection statistical results as a guideline.
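
    The inverse conditional probability at issue, P(true target | alarm), follows from Bayes' rule. A small sketch with illustrative numbers shows why a low prior probability drives the positive predictive value down even for an accurate sensor:

```python
def positive_predictive_value(prior, p_detect, p_false_alarm):
    """P(true target | alarm) via Bayes' rule for a binary sensor."""
    p_alarm = p_detect * prior + p_false_alarm * (1 - prior)
    return p_detect * prior / p_alarm

# rare-threat example: even an accurate sensor yields mostly false alarms
ppv = positive_predictive_value(prior=0.001, p_detect=0.95, p_false_alarm=0.01)
print(f"PPV = {ppv:.3f}")  # ≈ 0.087
```

    With a prior of 0.1%, fewer than one alarm in ten corresponds to a real target, which is why prior probability belongs alongside detection probability and false-alarm rate in any performance metric.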

  20. Dealing with incomplete and variable detectability in multi-year, multi-site monitoring of ecological populations

    USGS Publications Warehouse

    Converse, Sarah J.; Royle, J. Andrew; Gitzen, Robert A.; Millspaugh, Joshua J.; Cooper, Andrew B.; Licht, Daniel S.

    2012-01-01

    An ecological monitoring program should be viewed as a component of a larger framework designed to advance science and/or management, rather than as a stand-alone activity. Monitoring targets (the ecological variables of interest; e.g. abundance or occurrence of a species) should be set based on the needs of that framework (Nichols and Williams 2006; e.g. Chapters 2–4). Once such monitoring targets are set, the subsequent step in monitoring design involves consideration of the field and analytical methods that will be used to measure monitoring targets with adequate accuracy and precision. Long-term monitoring programs will involve replication of measurements over time, and possibly over space; that is, one location or each of multiple locations will be monitored multiple times, producing a collection of site visits (replicates). Clearly this replication is important for addressing spatial and temporal variability in the ecological resources of interest (Chapters 7–10), but it is worth considering how this replication can further be exploited to increase the effectiveness of monitoring. In particular, defensible monitoring of the majority of animal, and to a lesser degree plant, populations and communities will generally require investigators to account for imperfect detection (Chapters 4, 18). Raw indices of population state variables, such as abundance or occupancy (sensu MacKenzie et al. 2002), are rarely defensible when detection probabilities are < 1, because in those cases detection may vary over time and space in unpredictable ways. Myriad authors have discussed the risks inherent in making inference from monitoring data while failing to correct for differences in detection, resulting in indices that have an unknown relationship to the parameters of interest (e.g. Nichols 1992, Anderson 2001, MacKenzie et al. 2002, Williams et al. 2002, Anderson 2003, White 2005, Kéry and Schmidt 2008). 
    While others have argued that indices may be preferable in some cases due to the challenges associated with estimating detection probabilities (e.g. McKelvey and Pearson 2001, Johnson 2008), we do not attempt to resolve this debate here. Rather, we are more apt to agree with MacKenzie and Kendall (2002) that the burden of proof ought to be on the assertion that detection probabilities are constant. Furthermore, given the wide variety of field methods available for estimating detection probabilities and the inability of an investigator to know, a priori, whether detection probabilities will be constant over time and space, we believe that development of monitoring programs ought to include field and analytical methods to account for the imperfect detection of organisms.

  1. Optical detection of chemical warfare agents and toxic industrial chemicals

    NASA Astrophysics Data System (ADS)

    Webber, Michael E.; Pushkarsky, Michael B.; Patel, C. Kumar N.

    2004-12-01

    We present an analytical model for evaluating the suitability of optical-absorption-based spectroscopic techniques for detection of chemical warfare agents (CWAs) and toxic industrial chemicals (TICs) in ambient air. Sensor performance is modeled by simulating absorption spectra of a sample containing both the target and a multitude of interfering species, as well as appropriate stochastic noise, and determining the target concentrations from the simulated spectra via a least-squares fit (LSF) algorithm. The distribution of the LSF target concentrations determines the sensor sensitivity, the probability of false positives (PFP), and the probability of false negatives (PFN). The model was applied to a CO2-laser-based photoacoustic (L-PAS) CWA sensor and predicted single-digit-ppb sensitivity with very low PFP rates in the presence of significant amounts of interferents. This approach will be useful for assessing sensor performance by developers and users alike; it also provides a methodology for inter-comparison of different sensing technologies.
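
    The simulate-and-fit approach can be sketched in a few lines. The Gaussian "spectra" below are synthetic stand-ins for real absorption cross-sections, and the concentrations and noise level are arbitrary; the spread of the fitted concentrations across noise realizations is what sets the sensitivity and the PFP/PFN trade-off.

```python
import numpy as np

rng = np.random.default_rng(42)
wl = np.linspace(0, 1, 200)                  # normalized wavelength axis

# assumed (synthetic) absorption spectra
target = np.exp(-((wl - 0.3) / 0.02) ** 2)   # narrow target species feature
interf = np.exp(-((wl - 0.35) / 0.10) ** 2)  # broad overlapping interferent

def retrieve(conc_true, conc_interf, noise, trials=2000):
    """Fit simulated noisy spectra by linear least squares; return the
    fitted target concentration across noise realizations."""
    basis = np.column_stack([target, interf])
    clean = conc_true * target + conc_interf * interf
    fits = []
    for _ in range(trials):
        spectrum = clean + rng.normal(0.0, noise, wl.size)
        coeffs, *_ = np.linalg.lstsq(basis, spectrum, rcond=None)
        fits.append(coeffs[0])
    return np.array(fits)

fits = retrieve(conc_true=5.0, conc_interf=50.0, noise=0.5)
print("mean fit:", fits.mean(), " std:", fits.std())
# a decision threshold on the fitted concentration then fixes PFP and PFN
```

    Repeating the retrieval with conc_true = 0 gives the null distribution, and the overlap of the two fitted-concentration distributions around the chosen threshold yields the false-positive and false-negative probabilities.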

  2. Research on quantitative relationship between NIIRS and the probabilities of discrimination

    NASA Astrophysics Data System (ADS)

    Bai, Honggang

    2011-08-01

    A large number of electro-optical (EO) and infrared (IR) sensors are used on military platforms, including ground vehicles, low-altitude air vehicles, high-altitude air vehicles, and satellite systems. Ground vehicle and low-altitude air vehicle (rotary and fixed-wing aircraft) sensors typically use the probabilities of discrimination (detection, recognition, and identification) as design requirements and system performance indicators. High-altitude air vehicles and satellite sensors have traditionally used the National Imagery Interpretability Rating Scale (NIIRS) for guidance in design and as a measure of system performance. Recently, there has been a large effort to make strategic sensor information available to tactical forces, and to make target acquisition information usable by strategic systems. In this paper, the two techniques, the probabilities of discrimination and NIIRS, are presented separately. For typical infrared remote sensor design parameters, the probability of recognition and the NIIRS scale are given as functions of range R for two targets of different size, the Standard NATO Target and the M1 Abrams, based on algorithms for predicting field performance and NIIRS. For four targets of different size, the Standard NATO Target, the M1 Abrams, the F-15, and the B-52, the conversion from NIIRS to the probabilities of discrimination is derived and calculated, and the similarities and differences between NIIRS and the probabilities of discrimination are analyzed based on the results of the calculation. Comparisons with preliminary calculation results show that conversion between NIIRS and the probabilities of discrimination is feasible, although more validation experiments are needed.

  3. Small target detection based on difference accumulation and Gaussian curvature under complex conditions

    NASA Astrophysics Data System (ADS)

    Zhang, He; Niu, Yanxiong; Zhang, Hao

    2017-12-01

    Small target detection is a significant subject in infrared search and track and other photoelectric imaging systems. The small target is imaged under complex conditions, with backgrounds containing clouds, the horizon, and bright regions. In this paper, a novel small target detection method is proposed based on difference accumulation, clustering, and Gaussian curvature. Difference accumulation varies across regions; therefore, after the difference accumulations are obtained, clustering is applied to determine whether a pixel belongs to a heterogeneous region, and heterogeneous regions are eliminated. Gaussian curvature is then used to separate the target from the homogeneous region. Experiments are conducted for verification, along with comparisons to several other methods. The experimental results demonstrate that our method has an advantage of 1-2 orders of magnitude in SCRG and BSF over the others. At a false alarm rate of 1, the detection probability of the proposed method reaches approximately 0.9.

  4. Crowding with detection and coarse discrimination of simple visual features.

    PubMed

    Põder, Endel

    2008-04-24

    Some recent studies have suggested that there are actually no crowding effects with detection and coarse discrimination of simple visual features. The present study tests the generality of this idea. A target Gabor patch, surrounded by either 2 or 6 flanker Gabors, was presented briefly at 4 deg eccentricity of the visual field. Each Gabor patch was oriented either vertically or horizontally (selected randomly). Observers' task was either to detect the presence of the target (presented with probability 0.5) or to identify the orientation of the target. The target-flanker distance was varied. Results were similar for the two tasks but different for 2 and 6 flankers. The idea that feature detection and coarse discrimination are immune to crowding may be valid for the two-flanker condition only. With six flankers, a normal crowding effect was observed. It is suggested that the complexity of the full pattern (target plus flankers) could explain the difference.

  5. Multiwaveband simulation-based signature analysis of camouflaged human dismounts in cluttered environments with TAIThermIR and MuSES

    NASA Astrophysics Data System (ADS)

    Packard, Corey D.; Klein, Mark D.; Viola, Timothy S.; Hepokoski, Mark A.

    2016-10-01

    The ability to predict electro-optical (EO) signatures of diverse targets against cluttered backgrounds is paramount for signature evaluation and/or management. Knowledge of target and background signatures is essential for a variety of defense-related applications. While there is no substitute for measured target and background signatures to determine contrast and detection probability, the capability to simulate any mission scenario with desired environmental conditions is a tremendous asset for defense agencies. In this paper, a systematic process for the thermal and visible-through-infrared simulation of camouflaged human dismounts in cluttered outdoor environments is presented. This process, utilizing the thermal and EO/IR radiance simulation tool TAIThermIR (and MuSES), provides a repeatable and accurate approach for analyzing contrast, signature and detectability of humans in multiple wavebands. The engineering workflow required to combine natural weather boundary conditions and the human thermoregulatory module developed by ThermoAnalytics is summarized. The procedure includes human geometry creation, human segmental physiology description and transient physical temperature prediction using environmental boundary conditions and active thermoregulation. Radiance renderings, which use Sandford-Robertson BRDF optical surface property descriptions and are coupled with MODTRAN for the calculation of atmospheric effects, are demonstrated. Sensor effects such as optical blurring and photon noise can be optionally included, increasing the accuracy of detection probability outputs that accompany each rendering. This virtual evaluation procedure has been extensively validated and provides a flexible evaluation process that minimizes the difficulties inherent in human-subject field testing. Defense applications such as detection probability assessment, camouflage pattern evaluation, conspicuity tests and automatic target recognition are discussed.

  6. Wavefront-Guided Versus Wavefront-Optimized Photorefractive Keratectomy: Visual and Military Task Performance.

    PubMed

    Ryan, Denise S; Sia, Rose K; Stutzman, Richard D; Pasternak, Joseph F; Howard, Robin S; Howell, Christopher L; Maurer, Tana; Torres, Mark F; Bower, Kraig S

    2017-01-01

    To compare visual performance, marksmanship performance, and threshold target identification following wavefront-guided (WFG) versus wavefront-optimized (WFO) photorefractive keratectomy (PRK). In this prospective, randomized clinical trial, active duty U.S. military Soldiers, age 21 or over, electing to undergo PRK were randomized to undergo WFG (n = 27) or WFO (n = 27) PRK for myopia or myopic astigmatism. Binocular visual performance was assessed preoperatively and 1, 3, and 6 months postoperatively: Super Vision Test high contrast, Super Vision Test contrast sensitivity (CS), and 25% contrast acuity with night vision goggle filter. CS function was generated testing at five spatial frequencies. Marksmanship performance in low light conditions was evaluated in a firing tunnel. Target detection and identification performance was tested for probability of identification of varying target sets and probability of detection of humans in cluttered environments. Visual performance, CS function, marksmanship, and threshold target identification demonstrated no statistically significant differences over time between the two treatments. Exploratory regression analysis of firing range tasks at 6 months showed no significant differences or correlations between procedures. Regression analysis of vehicle and handheld probability of identification showed a significant association with pretreatment performance. Both WFG and WFO PRK results translate to excellent and comparable visual and military performance. Reprint & Copyright © 2017 Association of Military Surgeons of the U.S.

  7. Mirage: a visible signature evaluation tool

    NASA Astrophysics Data System (ADS)

    Culpepper, Joanne B.; Meehan, Alaster J.; Shao, Q. T.; Richards, Noel

    2017-10-01

    This paper presents the Mirage visible signature evaluation tool, designed to provide a visible signature evaluation capability that will appropriately reflect the effect of scene content on the detectability of targets, providing a capability to assess visible signatures in the context of the environment. Mirage is based on a parametric evaluation of input images, assessing the value of a range of image metrics and combining them using the boosted decision tree machine learning method to produce target detectability estimates. It has been developed using experimental data from photosimulation experiments, where human observers search for vehicle targets in a variety of digital images. The images used for tool development are synthetic (computer generated) images, showing vehicles in many different scenes and exhibiting a wide variation in scene content. A preliminary validation has been performed using k-fold cross validation, where 90% of the image data set was used for training and 10% of the image data set was used for testing. The results of the k-fold validation from 200 independent tests show a prediction accuracy between Mirage predictions of detection probability and observed probability of detection of r(262) = 0.63, p < 0.0001 (Pearson correlation) and MAE = 0.21 (mean absolute error).
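The k-fold validation protocol described above can be sketched as follows. The one-feature least-squares model and the synthetic (metric, detectability) pairs are hypothetical stand-ins for Mirage's boosted decision trees and photosimulation data; only the evaluation protocol (fold splitting, pooled predictions, Pearson r, MAE) mirrors the abstract.

```python
import random

def pearson_r(a, b):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a) ** 0.5
    vb = sum((y - mb) ** 2 for y in b) ** 0.5
    return cov / (va * vb)

def kfold_eval(xs, ys, k=10):
    """k-fold CV: fit y = a*x + b on the training folds by least squares,
    predict on the held-out fold, and pool the held-out predictions."""
    idx = list(range(len(xs)))
    random.Random(0).shuffle(idx)
    folds = [idx[i::k] for i in range(k)]
    preds, truth = [], []
    for held_out in folds:
        train = [i for i in idx if i not in held_out]
        mx = sum(xs[i] for i in train) / len(train)
        my = sum(ys[i] for i in train) / len(train)
        a = (sum((xs[i] - mx) * (ys[i] - my) for i in train)
             / sum((xs[i] - mx) ** 2 for i in train))
        b = my - a * mx
        preds += [a * xs[i] + b for i in held_out]
        truth += [ys[i] for i in held_out]
    mae = sum(abs(p - t) for p, t in zip(preds, truth)) / len(preds)
    return pearson_r(preds, truth), mae

# Synthetic stand-in for (image metric, observed detection probability) pairs.
rng = random.Random(1)
metric = [rng.uniform(0, 1) for _ in range(200)]
p_obs = [min(1.0, max(0.0, 0.2 + 0.6 * m + rng.gauss(0, 0.1))) for m in metric]
r, mae = kfold_eval(metric, p_obs)
```

Pooling held-out predictions across all folds, as done here, is what allows a single correlation and MAE figure to summarize the whole validation.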

  8. Integrating drivers influencing the detection of plant pests carried in the international cut flower trade.

    PubMed

    Areal, F J; Touza, J; MacLeod, A; Dehnen-Schmutz, K; Perrings, C; Palmieri, M G; Spence, N J

    2008-12-01

    This paper analyses the cut flower market as an example of an invasion pathway along which species of non-indigenous plant pests can travel to reach new areas. The paper examines the probability of pest detection by assessing information on pest detection and detection effort associated with the import of cut flowers. We test the link between the probability of plant pest arrivals, as a precursor to potential invasion, and volume of traded flowers using count data regression models. The analysis is applied to the UK import of specific genera of cut flowers from Kenya between 1996 and 2004. There is a link between pest detection and the Genus of cut flower imported. Hence, pest detection efforts should focus on identifying and targeting those imported plants with a high risk of carrying pest species. For most of the plants studied, efforts allocated to inspection have a significant influence on the probability of pest detection. However, by better targeting inspection efforts, it is shown that plant inspection effort could be reduced without increasing the risk of pest entry. Similarly, for most of the plants analysed, an increase in volume traded will not necessarily lead to an increase in the number of pests entering the UK. For some species, such as Carthamus and Veronica, the volume of flowers traded has a significant and positive impact on the likelihood of pest detection. We conclude that analysis at the rank of plant Genus is important both to understand the effectiveness of plant pest detection efforts and consequently to manage the risk of introduction of non-indigenous species.
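A minimal illustration of how detection probability can depend on trade volume and inspection effort under a count-data (Poisson) model of the kind used in the paper. The rate constant and volumes below are hypothetical, not taken from the study.

```python
import math

def detection_probability(theta, volume, effort):
    """P(at least one pest detected) under a Poisson count model with
    mean detections mu = theta * volume * effort (all units hypothetical)."""
    mu = theta * volume * effort
    return 1.0 - math.exp(-mu)

# When the expected count is already large, halving inspection effort barely
# changes the detection probability -- the statistical basis for the claim
# that better-targeted inspection can maintain the same level of protection.
p_full = detection_probability(theta=0.01, volume=1000, effort=1.0)
p_half = detection_probability(theta=0.01, volume=1000, effort=0.5)
```

The saturating form also shows why increased trade volume need not proportionally increase pest entries once inspection keeps the expected count in the saturated regime.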

  9. History and Evolution of the Johnson Criteria.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sjaardema, Tracy A.; Smith, Collin S.; Birch, Gabriel Carisle

    The Johnson Criteria metric calculates the probability of detection of an object imaged by an optical system, and was created in 1958 by John Johnson. As understanding of target detection has improved, detection models have evolved to account for additional factors such as weather, scene content, and object placement. The initial Johnson Criteria, while sufficient for the technology and understanding of the time, do not accurately reflect current research into target acquisition and technology. Even though current research shows a dependence on human factors, there appears to be a lack of testing and modeling of human variability.
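For reference, one widely used form of the Johnson-style target transfer probability function maps the number of resolvable cycles N across a target's critical dimension to a probability of accomplishing the task. The coefficients below are the commonly quoted empirical values from the night-vision modeling literature, not values specific to this report.

```python
def johnson_probability(n_cycles, n50):
    """Empirical target transfer probability function often used with the
    Johnson criteria: P = (N/N50)^E / (1 + (N/N50)^E),
    with E = 2.7 + 0.7 * (N/N50)."""
    ratio = n_cycles / n50
    e = 2.7 + 0.7 * ratio
    return ratio ** e / (1.0 + ratio ** e)

# N50 is the cycle criterion for 50% task probability (classically about
# 1 cycle for detection); at N = N50 the probability is 0.5 by construction.
p_at_n50 = johnson_probability(1.0, 1.0)
p_double = johnson_probability(2.0, 1.0)   # doubling resolution: well above 0.5
```

The steepness of this curve around N50 is one reason human variability matters: small observer-to-observer shifts in the effective N50 move the predicted probability substantially.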

  10. Simulation of sea surface wave influence on small target detection with airborne laser depth sounding.

    PubMed

    Tulldahl, H Michael; Steinvall, K Ove

    2004-04-20

    A theoretical model for simulation of airborne depth-sounding lidar is presented with the purpose of analyzing the influence of water surface waves on the ability to detect 1-m³ targets placed on the sea bottom. Although water clarity is the main limitation, sea surface waves can significantly affect detectability. The detection probability for a target at a 9-m depth can be above 90% at 1-m/s wind and below 80% at 6-m/s wind for the same water clarity. The simulation model contains both numerical and analytical components. Simulated data are compared with measured data and give realistic results for bottom depths between 3 and 10 m.

  11. Designing efficient surveys: spatial arrangement of sample points for detection of invasive species

    Treesearch

    Ludek Berec; John M. Kean; Rebecca Epanchin-Niell; Andrew M. Liebhold; Robert G. Haight

    2015-01-01

    Effective surveillance is critical to managing biological invasions via early detection and eradication. The efficiency of surveillance systems may be affected by the spatial arrangement of sample locations. We investigate how the spatial arrangement of sample points, ranging from random to fixed grid arrangements, affects the probability of detecting a target...

  12. Estimating site occupancy and detection probability parameters for meso- and large mammals in a coastal ecosystem

    USGS Publications Warehouse

    O'Connell, Allan F.; Talancy, Neil W.; Bailey, Larissa L.; Sauer, John R.; Cook, Robert; Gilbert, Andrew T.

    2006-01-01

    Large-scale, multispecies monitoring programs are widely used to assess changes in wildlife populations but they often assume constant detectability when documenting species occurrence. This assumption is rarely met in practice because animal populations vary across time and space. As a result, detectability of a species can be influenced by a number of physical, biological, or anthropogenic factors (e.g., weather, seasonality, topography, biological rhythms, sampling methods). To evaluate some of these influences, we estimated site occupancy rates using species-specific detection probabilities for meso- and large terrestrial mammal species on Cape Cod, Massachusetts, USA. We used model selection to assess the influence of different sampling methods and major environmental factors on our ability to detect individual species. Remote cameras detected the most species (9), followed by cubby boxes (7) and hair traps (4) over a 13-month period. Estimated site occupancy rates were similar among sampling methods for most species when detection probabilities exceeded 0.15, but we question estimates obtained from methods with detection probabilities between 0.05 and 0.15, and we consider methods with lower probabilities unacceptable for occupancy estimation and inference. Estimated detection probabilities can be used to accommodate variation in sampling methods, which allows for comparison of monitoring programs using different protocols. Vegetation and seasonality produced species-specific differences in detectability and occupancy, but differences were not consistent within or among species, which suggests that our results should be considered in the context of local habitat features and life history traits for the target species. We believe that site occupancy is a useful state variable and suggest that monitoring programs for mammals using occupancy data consider detectability prior to making inferences about species distributions or population change.
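The detectability thresholds discussed above can be illustrated with the cumulative detection probability over repeated surveys. Assuming independent surveys of an occupied site (a simplification of the occupancy models actually fitted), the per-survey values the authors flag as questionable translate into very different chances of ever recording the species.

```python
def p_detect_at_least_once(p_survey, n_surveys):
    """Probability that an occupied site yields >= 1 detection in n
    independent surveys, each with per-survey detection probability p."""
    return 1.0 - (1.0 - p_survey) ** n_surveys

# Over a 13-month program, a per-survey p of 0.15 still detects most occupied
# sites, while p = 0.05 misses occupancy at roughly half of them.
p_hi = p_detect_at_least_once(0.15, 13)   # ~0.88
p_lo = p_detect_at_least_once(0.05, 13)   # ~0.49
```

This is why methods with per-survey detection probabilities below about 0.15 yield unstable occupancy estimates: a large fraction of truly occupied sites produce all-zero detection histories.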

  13. A Deterministic Approach to Active Debris Removal Target Selection

    NASA Astrophysics Data System (ADS)

    Lidtke, A.; Lewis, H.; Armellin, R.

    2014-09-01

    As concerns about the sustainability of spaceflight increase, many decisions with widespread economic, political, and legal consequences are being considered based on space debris simulations showing that Active Debris Removal (ADR) may be necessary. The debris environment predictions are based on low-accuracy ephemerides and propagators. This raises doubts not only about the accuracy of the prognoses themselves but also about the potential ADR target lists that are produced. Target selection is considered highly important, as the removal of many objects will increase the overall mission cost. Selecting the most likely candidates as soon as possible would be desirable, as it would enable accurate mission design and allow thorough evaluation of in-orbit validations, which are likely to occur in the near future, before any large investments are made and implementations realized. One of the primary factors that should be used in ADR target selection is the accumulated collision probability of every object. A conjunction detection algorithm, based on the smart sieve method, has been developed. Another algorithm is then applied to the found conjunctions to compute the maximum and true probabilities of the collisions taking place. The entire framework has been verified against the Conjunction Analysis Tools in AGI's Systems Toolkit, and a relative probability error smaller than 1.5% has been achieved in the final maximum collision probability. Two target lists are produced based on the ranking of the objects according to the probability that they will take part in any collision over the simulated time window. These probabilities are computed using the maximum probability approach, which is time-invariant, and estimates of the true collision probability that were computed with covariance information. The top-priority targets are compared, and the impacts of the data accuracy and its decay are highlighted. General conclusions regarding the importance of Space Surveillance and Tracking for the purpose of ADR are also drawn, and a deterministic method for ADR target selection, which could reduce the number of ADR missions to be performed, is proposed.

  14. Experimental evaluation of penetration capabilities of a Geiger-mode APD array laser radar system

    NASA Astrophysics Data System (ADS)

    Jonsson, Per; Tulldahl, Michael; Hedborg, Julia; Henriksson, Markus; Sjöqvist, Lars

    2017-10-01

    Laser radar 3D imaging has the potential to improve target recognition in many scenarios. One case that is challenging for most optical sensors is recognizing targets hidden in vegetation or behind camouflage. The range resolution of time-of-flight 3D sensors allows segmentation of obscuration and target if the surfaces are separated far enough to be resolved as two distances. Systems based on time-correlated single-photon counting (TCSPC) have the potential to resolve surfaces closer to each other than laser radar systems based on proportional-mode detection technologies, and are therefore especially interesting. Photon counting detection is commonly performed with Geiger-mode Avalanche Photodiodes (GmAPD), which have the disadvantage that they can detect only one photon per laser pulse per pixel. A strong return from an obscuring object may saturate the detector and thus limit the possibility of detecting the hidden target even if photons from the target reach the detector. The operational range where good foliage penetration is observed is therefore relatively narrow for GmAPD systems. In this paper we investigate the penetration capability through semi-transparent surfaces for a laser radar with a 128×32 pixel GmAPD array and a 1542 nm wavelength laser operating at a pulse repetition frequency of 90 kHz. In the evaluation, a screen was placed behind canvases of varying transmission, and the detected signals from the surfaces were measured at different laser intensities. The maximum return from the second surface occurs when the total detection probability is around 0.65-0.75 per pulse. At higher laser excitation power the signal from the second surface decreases. To optimize the foliage penetration capability it is thus necessary to adaptively control the laser power to keep the returned signal within this region. In addition to the experimental results, simulations studying the influence of the pulse energy on penetration through foliage, in a scene with targets behind vegetation, are presented. The optimum detection of targets here occurs at a slightly higher total detection probability, because a number of pixels have no obscuration in front of the target in their field of view.
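A simplified Poisson photon-statistics model (an illustrative assumption, not the authors' simulation) reproduces the reported optimum. If the obscurant and target produce mean photoelectron counts n1 and n2 per pulse, a GmAPD pixel sees the target only when the obscurant produced no detection, and with equal mean returns the second-surface signal peaks when the total detection probability is 0.75 per pulse, at the upper end of the 0.65-0.75 range quoted above.

```python
import math

def surface_returns(e_pulse, r1, r2):
    """Mean photoelectrons n1 = r1*E (obscurant) and n2 = r2*E (target).
    A GmAPD pixel fires at most once per pulse, so the target surface is
    seen only when the obscurant produced no detection first."""
    n1, n2 = r1 * e_pulse, r2 * e_pulse
    p_first = 1.0 - math.exp(-n1)
    p_second = math.exp(-n1) * (1.0 - math.exp(-n2))
    p_total = p_first + p_second          # = 1 - exp(-(n1 + n2))
    return p_second, p_total

# Scan pulse energy; the second-surface return rises, peaks, then declines
# as the obscurant saturates the pixel (equal return rates assumed, r1 = r2).
energies = [0.05 * k for k in range(1, 200)]
curve = [surface_returns(e, 1.0, 1.0) for e in energies]
best = max(range(len(curve)), key=lambda i: curve[i][0])
p_total_at_peak = curve[best][1]          # ~0.75 for equal returns
```

Analytically, the peak sits at E* = ln(1 + r2/r1) / r2; unequal returns shift the optimum total detection probability below 0.75, consistent with the measured 0.65-0.75 band.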

  15. Labeled RFS-Based Track-Before-Detect for Multiple Maneuvering Targets in the Infrared Focal Plane Array.

    PubMed

    Li, Miao; Li, Jun; Zhou, Yiyu

    2015-12-08

    The problem of jointly detecting and tracking multiple targets from the raw observations of an infrared focal plane array is a challenging task, especially for the case of uncertain target dynamics. In this paper a multi-model labeled multi-Bernoulli (MM-LMB) track-before-detect method is proposed within the labeled random finite sets (RFS) framework. The proposed track-before-detect method consists of two parts: the MM-LMB filter and the MM-LMB smoother. For the MM-LMB filter, the original LMB filter is applied to track-before-detect based on target and measurement models, and is integrated with the interacting multiple models (IMM) approach to accommodate the uncertainty of target dynamics. For the MM-LMB smoother, taking advantage of the track labels and the posterior model transition probability, the single-model single-target smoother is extended to a multi-model multi-target smoother. A Sequential Monte Carlo approach is also presented to implement the proposed method. Simulation results show that the proposed method can effectively achieve tracking continuity for multiple maneuvering targets. In addition, compared with forward filtering alone, our method is more robust due to its combination of forward filtering and backward smoothing.

  17. Clutter attenuation using the Doppler effect in standoff electromagnetic quantum sensing

    NASA Astrophysics Data System (ADS)

    Lanzagorta, Marco; Jitrik, Oliverio; Uhlmann, Jeffrey; Venegas, Salvador

    2016-05-01

    In the context of traditional radar systems, the Doppler effect is crucial for detecting and tracking moving targets in the presence of clutter. In the quantum radar context, however, most theoretical performance analyses to date have assumed static targets. In this paper we consider the Doppler effect at the single-photon level. In particular, we describe how the Doppler effect produced by clutter and moving targets modifies the quantum distinguishability and the quantum radar detection error probability equations. Furthermore, we show that Doppler-based delay-line cancelers can reduce the effects of clutter in the context of quantum radar, but only in the low-brightness regime. Thus, quantum radar may prove to be an important technology if the electronic battlefield requires stealthy tracking and detection of moving targets in the presence of clutter.
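For context, the classical single delay-line canceler referenced above subtracts successive pulse returns, y[n] = x[n] - x[n-1], nulling zero-Doppler (stationary clutter) returns. A minimal sketch of its frequency response, independent of the quantum radar analysis in the paper:

```python
import math

def canceler_gain(f_doppler, prf):
    """Magnitude response of a single delay-line canceler y[n] = x[n] - x[n-1]
    sampled at the pulse repetition frequency: |H(f)| = 2 |sin(pi f / prf)|."""
    return 2.0 * abs(math.sin(math.pi * f_doppler / prf))

# Stationary clutter (zero Doppler) is nulled, as are "blind speeds" at
# integer multiples of the PRF; the response peaks midway between nulls.
g_clutter = canceler_gain(0.0, prf=1000.0)      # exactly 0
g_blind   = canceler_gain(1000.0, prf=1000.0)   # ~0 (blind speed)
g_peak    = canceler_gain(500.0, prf=1000.0)    # maximum gain of 2
```

The same subtract-the-delayed-return structure is what the paper adapts to single-photon statistics in the low-brightness regime.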

  18. Maximizing the Detection Probability of Kilonovae Associated with Gravitational Wave Observations

    NASA Astrophysics Data System (ADS)

    Chan, Man Leong; Hu, Yi-Ming; Messenger, Chris; Hendry, Martin; Heng, Ik Siong

    2017-01-01

    Estimates of the source sky location for gravitational wave signals are likely to span areas of up to hundreds of square degrees or more, making it very challenging for most telescopes to search for counterpart signals in the electromagnetic spectrum. To boost the chance of successfully observing such counterparts, we have developed an algorithm that optimizes the number of observing fields and their corresponding time allocations by maximizing the detection probability. As a proof-of-concept demonstration, we optimize follow-up observations targeting kilonovae using telescopes including the CTIO-Dark Energy Camera, Subaru-HyperSuprimeCam, Pan-STARRS, and the Palomar Transient Factory. We consider three simulated gravitational wave events with 90% credible error regions spanning areas from ~30 deg² to ~300 deg². Assuming a source at 200 Mpc, we demonstrate that to obtain a maximum detection probability, there is an optimized number of fields for any particular event that a telescope should observe. To inform future telescope design studies, we present the maximum detection probability and corresponding number of observing fields for a combination of limiting magnitudes and fields of view over a range of parameters. We show that for large gravitational wave error regions, telescope sensitivity rather than field of view is the dominating factor in maximizing the detection probability.
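The trade-off described above can be caricatured as choosing the number of fields that balances sky coverage against per-field exposure depth. The saturating efficiency curve and tile probabilities below are hypothetical assumptions, not the paper's models; only the structure (more fields cover more probability, but each gets a shallower exposure) follows the abstract.

```python
import math

def best_field_count(tile_probs, t_total, eff):
    """Pick the number of fields n that maximizes
    eff(t_total / n) * (sum of the n most probable tiles),
    where eff(t) is the per-field detection efficiency for exposure time t."""
    tiles = sorted(tile_probs, reverse=True)
    best_n, best_val = 1, -1.0
    for n in range(1, len(tiles) + 1):
        val = eff(t_total / n) * sum(tiles[:n])
        if val > best_val:
            best_n, best_val = n, val
    return best_n, best_val

# Hypothetical saturating efficiency: longer exposure -> deeper limit.
eff = lambda t: 1.0 - math.exp(-t / 0.5)
# A peaked credible region: a few tiles carry most of the probability.
tile_probs = [0.30, 0.20, 0.12, 0.08, 0.06] + [0.24 / 20] * 20
n_opt, p_max = best_field_count(tile_probs, t_total=2.0, eff=eff)
```

With these numbers the optimum is an interior one (a handful of fields), illustrating why each event and telescope has its own optimized field count rather than "observe everything" or "stare at the peak".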

  19. The small low SNR target tracking using sparse representation information

    NASA Astrophysics Data System (ADS)

    Yin, Lifan; Zhang, Yiqun; Wang, Shuo; Sun, Chenggang

    2017-11-01

    Tracking small targets, such as missile warheads, from a remote distance is a difficult task, since the targets are "points" similar to the sensor's noise points. As a result, traditional tracking algorithms use only the information contained in a point measurement, such as position and intensity, as characteristics to identify targets among noise points. In fact, because of photon diffusion, a small target is not a point in the focal plane array; it occupies an area larger than one sensor cell. Thus, if we take this geometric characteristic into account as a new dimension of information, it will be helpful in distinguishing targets from noise points. In this paper, we use sparse representation (SR) to describe the geometric information of the target intensity and define it as the SR information of the target. Modeling the intensity spread and solving for its SR coefficients, the SR information is represented by establishing its likelihood function. Further, the SR information likelihood is incorporated into the conventional Probability Hypothesis Density (PHD) filter algorithm with point measurements. To illustrate the performance of the algorithm with and without the SR information, the detection capability and estimation error have been compared through simulation. Results demonstrate that the proposed method has higher estimation accuracy and a higher probability of detecting the target than the conventional algorithm without the SR information.

  20. Robust human detection, tracking, and recognition in crowded urban areas

    NASA Astrophysics Data System (ADS)

    Chen, Hai-Wen; McGurr, Mike

    2014-06-01

    In this paper, we present algorithms we recently developed to support an automated security surveillance system for very crowded urban areas. In our approach to human detection, the color features are obtained by taking the difference of the R, G, B spectra and converting R, G, B to HSV (Hue, Saturation, Value) space. Morphological patch filtering and regional minimum and maximum segmentation on the extracted features are applied for target detection. Our human tracking approach includes: 1) Track candidate selection by color and intensity feature matching; 2) Three separate parallel trackers for color, bright (above mean intensity), and dim (below mean intensity) detections, respectively; 3) Adaptive track gate size selection to reduce the false tracking probability; and 4) Forward position prediction based on previous moving speed and direction, which continues tracking even when detections are missed from frame to frame. Human target recognition is improved with a Super-Resolution Image Enhancement (SRIE) process. This process can improve target resolution by 3-5 times and can simultaneously process many tracked targets. Our approach can project tracks from one camera to another camera with a different perspective viewing angle to obtain additional biometric features from different perspective angles, and to continue tracking the same person from the second camera with 'Tracking Relay' even though the person has moved out of the Field of View (FOV) of the first camera. Finally, the multiple cameras at different view poses have been geo-rectified to the nadir view plane and geo-registered with Google Earth (or another GIS) to obtain accurate positions (latitude, longitude, and altitude) of the tracked humans, for pin-point targeting and for a top view of total human motion activity over a large area. Preliminary tests of our algorithms indicate that a high probability of detection can be achieved for both moving and stationary humans. Our algorithms can simultaneously track more than 100 human targets with an average tracking period (time length) longer than that of the current state of the art.
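The forward position prediction step (item 4 above) is, in its simplest form, constant-velocity coasting: when a detection is missed, the track gate is centred on the extrapolated position. A minimal sketch with hypothetical pixel coordinates (the paper's actual predictor may be more elaborate):

```python
def predict_position(track, dt):
    """Constant-velocity forward prediction used to coast a track through
    missed detections: next position = last position + velocity * dt."""
    (x, y), (vx, vy) = track["pos"], track["vel"]
    return (x + vx * dt, y + vy * dt)

# A walker last seen at (100, 50) px moving (+2, -1) px/frame; if detection
# is missed for 3 frames, the search gate is centred on the prediction.
track = {"pos": (100.0, 50.0), "vel": (2.0, -1.0)}
predicted = predict_position(track, dt=3.0)   # (106.0, 47.0)
```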

  1. Investigation to synthesize more isotopes of superheavy nuclei Z = 118

    NASA Astrophysics Data System (ADS)

    Manjunatha, H. C.; Sridhar, K. N.

    2018-07-01

    We have studied the α-decay properties of superheavy nuclei with Z = 118 in the range 275 ≤ A ≤ 325. Most of the predicted, unknown nuclei in the range 291 ≤ A ≤ 301 were found to have α-decay chains. Of these, the nuclei 293-301118 were found to have long half-lives, which could be sufficient for detecting them if they are synthesized in a laboratory. Fusion barriers for different projectile-target combinations for synthesizing superheavy nuclei with Z = 118 are studied and represented by simple relations. We have also studied the evaporation residue cross section, compound nucleus formation probability (PCN), and survival probability (PSurv) of different projectile-target combinations for synthesizing the superheavy element Z = 118. The selected most probable projectile-target combinations are Ca+Cf, Ti+Cm, Sc+Bk, V+Am, Cr+Pu, Fe+U, Mn+Np, Ni+Th and Kr+Pb. We have formulated simple relations for the maximum evaporation residue cross sections and their corresponding energies. This helps to identify the projectile-target combinations quickly. Hence, we have identified the most probable projectile-target combinations for synthesizing these superheavy nuclei. We hope that our predictions may serve as a guide for future experiments on the synthesis of more isotopes of superheavy nuclei with Z = 118.

  2. The simulation study on optical target laser active detection performance

    NASA Astrophysics Data System (ADS)

    Li, Ying-chun; Hou, Zhao-fei; Fan, Youchen

    2014-12-01

    According to the working principle of a laser active detection system, this paper establishes an optical-target laser active detection simulation system and carries out a simulation study of the detection process and detection performance of the system. The simulation includes performance models for laser emission, laser propagation in the atmosphere, reflection from the optical target, the receiver detection system, and signal processing and recognition. We focus on analyzing and modeling the relationship between the laser emission angle, the defocus amount, and the "cat-eye" effect echo in the reflection from the optical target. Performance indices of the system, such as operating range, SNR, and detection probability, have also been simulated. Parameters including the laser emission parameters, the reflection of the optical target, and the laser propagation in the atmosphere have a great influence on the performance of the optical target laser active detection system. Finally, using object-oriented software design methods, the laser active detection simulation system, with an open architecture, complete functionality, and an operating platform, realizes the process simulation in which the detection system detects and recognizes the optical target, completes the performance simulation of each subsystem, and generates data reports and graphs. The visible simulation process makes the performance models of the laser active detection system more intuitive. The simulation data obtained from the system provide a reference for adjusting the structural parameters of the system, and provide theoretical and technical support for the top-level design of the optical target laser active detection system and the optimization of its performance indices.
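A toy version of the operating-range calculation is sketched below. The "cat-eye" echo model, the 1/r² retroreflection scaling, and all constants and units are illustrative assumptions, not the paper's models; a real system analysis would use calibrated SNR and atmospheric models.

```python
import math

def echo_power(p_t, rho_eff, a_r, gamma, r):
    """Simplified 'cat-eye' echo power: transmitted power p_t, effective
    retroreflection factor rho_eff, receiver aperture area a_r, atmospheric
    extinction gamma (1/km), range r (km). Retroreflected power is taken to
    fall as 1/r^2 with two-way attenuation exp(-2*gamma*r)."""
    return p_t * rho_eff * a_r * math.exp(-2.0 * gamma * r) / r ** 2

def operating_range(p_min, p_t, rho_eff, a_r, gamma, r_max=100.0, dr=0.01):
    """Largest range at which the echo still exceeds the detection
    threshold p_min (simple downward scan over range)."""
    r = r_max
    while r > dr and echo_power(p_t, rho_eff, a_r, gamma, r) < p_min:
        r -= dr
    return r

# Hypothetical numbers: threshold 1 nW, 1 W transmitter, clear-air extinction.
r_op = operating_range(p_min=1e-9, p_t=1.0, rho_eff=1e-4, a_r=1e-3, gamma=0.1)
```

This is the kind of relationship the simulation system evaluates: operating range responds to the laser emission parameters, the target's retroreflection, and the atmospheric extinction together.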

  3. Genomic Disruption of VEGF-A Expression in Human Retinal Pigment Epithelial Cells Using CRISPR-Cas9 Endonuclease.

    PubMed

    Yiu, Glenn; Tieu, Eric; Nguyen, Anthony T; Wong, Brittany; Smit-McBride, Zeljka

    2016-10-01

    To employ type II clustered regularly interspaced short palindromic repeats (CRISPR)-Cas9 endonuclease to suppress ocular angiogenesis by genomic disruption of VEGF-A in human RPE cells. CRISPR sequences targeting exon 1 of human VEGF-A were computationally identified based on predicted Cas9 on- and off-target probabilities. Single guide RNA (gRNA) cassettes with these target sequences were cloned into lentiviral vectors encoding the Streptococcus pyogenes Cas9 endonuclease (SpCas9) gene. The lentiviral vectors were used to infect ARPE-19 cells, a human RPE cell line. Frequency of insertion or deletion (indel) mutations was assessed by T7 endonuclease 1 mismatch detection assay; mRNA levels were assessed with quantitative real-time PCR; and VEGF-A protein levels were determined by ELISA. In vitro angiogenesis was measured using an endothelial cell tube formation assay. Five gRNAs targeting VEGF-A were selected based on the highest predicted on-target probabilities, lowest off-target probabilities, or combined average of both scores. Lentiviral delivery of the top-scoring gRNAs with SpCas9 resulted in indel formation in the VEGF-A gene at frequencies up to 37.0% ± 4.0%, with corresponding decreases in secreted VEGF-A protein up to 41.2% ± 7.4% (P < 0.001), and reduction of endothelial tube formation up to 39.4% ± 9.8% (P = 0.02). No significant indel formation was detected in the top three putative off-target sites tested. The CRISPR-Cas9 endonuclease system may reduce VEGF-A secretion from human RPE cells and suppress angiogenesis, supporting the possibility of employing gene editing for antiangiogenesis therapy in ocular diseases.

  4. Detection and recognition of targets by using signal polarization properties

    NASA Astrophysics Data System (ADS)

    Ponomaryov, Volodymyr I.; Peralta-Fabi, Ricardo; Popov, Anatoly V.; Babakov, Mikhail F.

    1999-08-01

    The quality of radar target recognition can be enhanced by exploiting its polarization signatures. A specialized X-band polarimetric radar was used for target recognition in experimental investigations. The following polarization characteristics, connected to the objects' geometrical properties, were investigated: the amplitudes of the polarization matrix elements; an anisotropy coefficient; a depolarization coefficient; an asymmetry coefficient; the energy of the backscattered signal; and an object shape factor. A large quantity of polarimetric radar data was measured and processed to form a database of different objects under different weather conditions. The histograms of polarization signatures were approximated by a Nakagami distribution, then used for real-time target recognition. The Neyman-Pearson criterion was used for target detection, and the maximum a posteriori probability criterion was used for the recognition problem. Some results of experimental verification of pattern recognition and detection of objects with different electrophysical and geometrical characteristics in urban clutter are presented in this paper.
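
The recognition step - Nakagami-distributed polarization features combined with a maximum a posteriori decision - can be sketched with SciPy's `nakagami` distribution. The shape/scale parameters, class names, and priors below are hypothetical stand-ins for values that would be fitted from the measured histogram database.

```python
from scipy.stats import nakagami

# Hypothetical Nakagami parameters (shape nu, scale) for one polarization
# feature under two object classes, plus class priors.
classes = {
    "vehicle": {"nu": 2.0, "scale": 1.5, "prior": 0.3},
    "clutter": {"nu": 0.8, "scale": 1.0, "prior": 0.7},
}

def map_classify(x):
    """Maximum a posteriori decision: pick the class with the largest
    prior * likelihood for the observed feature value x."""
    post = {name: c["prior"] * nakagami.pdf(x, c["nu"], scale=c["scale"])
            for name, c in classes.items()}
    return max(post, key=post.get)

print(map_classify(1.8))
print(map_classify(0.5))
```

A full system would multiply likelihoods over several independent polarization features before taking the maximum.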

  5. Target recognition in multiple-camera closed-circuit television using color constancy

    NASA Astrophysics Data System (ADS)

    Soori, Umair; Yuen, Peter; Han, Ji Wen; Ibrahim, Izzati; Chen, Wentao; Hong, Kan; Merfort, Christian; James, David; Richardson, Mark

    2013-04-01

    People tracking in crowded scenes from closed-circuit television (CCTV) footage has been a popular and challenging task in computer vision. Due to the limited spatial resolution in the CCTV footage, the color of people's dress may offer an alternative feature for their recognition and tracking. However, there are many factors, such as variable illumination conditions, viewing angles, and camera calibration, that may induce illusive modification of intrinsic color signatures of the target. Our objective is to recognize and track targets in multiple camera views using color as the detection feature, and to understand if a color constancy (CC) approach may help to reduce these color illusions due to illumination and camera artifacts and thereby improve target recognition performance. We have tested a number of CC algorithms using various color descriptors to assess the efficiency of target recognition from a real multicamera Imagery Library for Intelligent Detection Systems (i-LIDS) data set. Various classifiers have been used for target detection, and the figure of merit to assess the efficiency of target recognition is achieved through the area under the receiver operating characteristics (AUROC). We have proposed two modifications of luminance-based CC algorithms: one with a color transfer mechanism and the other using a pixel-wise sigmoid function for an adaptive dynamic range compression, a method termed enhanced luminance reflectance CC (ELRCC). We found that both algorithms substantially improve the efficiency of target recognition relative to the raw data without CC treatment, and in some cases the ELRCC improves target tracking by over 100% within the AUROC assessment metric.
The performance of the ELRCC has been assessed over 10 selected targets from three different camera views of the i-LIDS footage, and the averaged target recognition efficiency over all these targets is found to be improved by about 54% in AUROC after the data are processed by the proposed ELRCC algorithm. This amount of improvement represents a reduction of probability of false alarm by about a factor of 5 at the probability of detection of 0.5. Our study concerns mainly the detection of colored targets; and issues for the recognition of white or gray targets will be addressed in a forthcoming study.
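
The pixel-wise sigmoid used for adaptive dynamic-range compression can be sketched as below. This is an illustrative stand-in for the ELRCC step, not the published algorithm: the `midpoint` and `slope` parameters are hypothetical, and ELRCC additionally involves luminance-reflectance processing not shown here.

```python
import numpy as np

def sigmoid_compress(image, midpoint=0.5, slope=10.0):
    """Apply a pixel-wise sigmoid to compress dynamic range, then
    rescale the result back to the full [0, 1] interval."""
    img = np.asarray(image, dtype=float)
    out = 1.0 / (1.0 + np.exp(-slope * (img - midpoint)))
    return (out - out.min()) / (out.max() - out.min())

frame = np.array([[0.1, 0.4],
                  [0.6, 0.9]])
compressed = sigmoid_compress(frame)
print(compressed)
```

Because the sigmoid is monotonic, pixel ordering is preserved while mid-range contrast is expanded and the extremes are compressed.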

  6. Multi-scale occupancy estimation and modelling using multiple detection methods

    USGS Publications Warehouse

    Nichols, James D.; Bailey, Larissa L.; O'Connell, Allan F.; Talancy, Neil W.; Grant, Evan H. Campbell; Gilbert, Andrew T.; Annand, Elizabeth M.; Husband, Thomas P.; Hines, James E.

    2008-01-01

    Occupancy estimation and modelling based on detection–nondetection data provide an effective way of exploring change in a species’ distribution across time and space in cases where the species is not always detected with certainty. Today, many monitoring programmes target multiple species, or life stages within a species, requiring the use of multiple detection methods. When multiple methods or devices are used at the same sample sites, animals can be detected by more than one method. We develop occupancy models for multiple detection methods that permit simultaneous use of data from all methods for inference about method-specific detection probabilities. Moreover, the approach permits estimation of occupancy at two spatial scales: the larger scale corresponds to species’ use of a sample unit, whereas the smaller scale corresponds to presence of the species at the local sample station or site. We apply the models to data collected on two different vertebrate species: striped skunks Mephitis mephitis and red salamanders Pseudotriton ruber. For striped skunks, large-scale occupancy estimates were consistent between two sampling seasons. Small-scale occupancy probabilities were slightly lower in the late winter/spring when skunks tend to conserve energy, and movements are limited to males in search of females for breeding. There was strong evidence of method-specific detection probabilities for skunks. As anticipated, large- and small-scale occupancy areas completely overlapped for red salamanders. The analyses provided weak evidence of method-specific detection probabilities for this species. Synthesis and applications. Increasingly, many studies are utilizing multiple detection methods at sampling locations. The modelling approach presented here makes efficient use of detections from multiple methods to estimate occupancy probabilities at two spatial scales and to compare detection probabilities associated with different detection methods.
The models can be viewed as another variation of Pollock's robust design and may be applicable to a wide variety of scenarios where species occur in an area but are not always near the sampled locations. The estimation approach is likely to be especially useful in multispecies conservation programmes by providing efficient estimates using multiple detection devices and by providing device-specific detection probability estimates for use in survey design.
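
The two-scale structure described above can be illustrated with a toy calculation: with large-scale occupancy ψ (use of the sample unit), small-scale probability θ (presence at the local station), and method-specific detection probabilities p_m, the probability of detecting the species at a station by at least one method is ψ·θ·(1 − Π(1 − p_m)). All parameter values and method names below are hypothetical.

```python
# Hypothetical two-scale occupancy parameters.
psi, theta = 0.8, 0.6                      # unit-scale and station-scale occupancy
p = {"camera": 0.4, "track_plate": 0.25}   # method-specific detection probabilities

def prob_detected_at_least_once(psi, theta, p):
    """P(species present at station AND detected by >= 1 method)."""
    miss_all = 1.0
    for pm in p.values():
        miss_all *= (1.0 - pm)
    return psi * theta * (1.0 - miss_all)

print(round(prob_detected_at_least_once(psi, theta, p), 3))
```

This is the simplest independence case; the paper's likelihood additionally handles repeat surveys and missing data.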

  7. Modern Approaches to the Computation of the Probability of Target Detection in Cluttered Environments

    NASA Astrophysics Data System (ADS)

    Meitzler, Thomas J.

    The field of computer vision interacts with fields such as psychology, vision research, machine vision, psychophysics, mathematics, physics, and computer science. The focus of this thesis is new algorithms and methods for the computation of the probability of detection (Pd) of a target in a cluttered scene. The scene can be either a natural visual scene such as one sees with the naked eye (visual), or a scene displayed on a monitor with the help of infrared sensors. The relative clutter and the temperature difference between the target and background (ΔT) are defined and then used to calculate a relative signal-to-clutter ratio (SCR) from which the Pd is calculated for a target in a cluttered scene. It is shown how this definition can include many previous definitions of clutter and ΔT. Next, fuzzy and neural-fuzzy techniques are used to calculate the Pd and it is shown how these methods can give results that have a good correlation with experiment. The experimental design for actually measuring the Pd of a target by observers is described. Finally, wavelets are applied to the calculation of clutter and it is shown how this new definition of clutter based on wavelets can be used to compute the Pd of a target.
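
As a purely illustrative stand-in for the thesis's SCR-to-Pd computation (which uses fuzzy and wavelet methods), the classical Gaussian detection relation Pd = Q(Q⁻¹(Pfa) − √SCR) shows the basic effect at work: a higher signal-to-clutter ratio raises the probability of detection at a fixed false-alarm rate.

```python
from scipy.stats import norm

def pd_from_scr(scr_db, pfa=1e-3):
    """Pd under a simple Gaussian detection model, given SCR in dB and
    a fixed false-alarm probability. Q(.) is the Gaussian tail, so
    Q^-1(pfa) = norm.isf(pfa) and Q(x) = norm.sf(x)."""
    scr = 10 ** (scr_db / 10.0)
    return norm.sf(norm.isf(pfa) - scr ** 0.5)

print(round(pd_from_scr(13.0), 3))
```

The monotone dependence of Pd on SCR is the property any clutter metric must capture, whatever definition of clutter is used to compute the SCR.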

  8. Target-type probability combining algorithms for multisensor tracking

    NASA Astrophysics Data System (ADS)

    Wigren, Torbjorn

    2001-08-01

    Algorithms for the handling of target type information in an operational multi-sensor tracking system are presented. The paper discusses recursive target type estimation, computation of crosses from passive data (strobe track triangulation), as well as the computation of the quality of the crosses for deghosting purposes. The focus is on Bayesian algorithms that operate in the discrete target type probability space, and on the approximations introduced for computational complexity reduction. The centralized algorithms are able to fuse discrete data from a variety of sensors and information sources, including IFF equipment, ESMs, and IRSTs, as well as flight envelopes estimated from track data. All algorithms are asynchronous and can be tuned to handle clutter, erroneous associations as well as missed and erroneous detections. A key to obtaining this ability is the inclusion of data forgetting by a procedure for propagation of target type probability states between measurement time instances. Other important properties of the algorithms are their abilities to handle ambiguous data and scenarios. The above aspects are illustrated in a simulation study. The simulation setup includes 46 air targets of 6 different types that are tracked by 5 airborne sensor platforms using ESMs and IRSTs as data sources.
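
The propagation-with-forgetting step can be sketched as a blend of the current type posterior toward a uniform prior between measurements, followed by a standard discrete Bayes update. The forgetting factor and the likelihood vectors below are hypothetical, and a real system would derive the likelihoods from sensor confusion models.

```python
import numpy as np

def forget(p, lam=0.9):
    """Propagate type probabilities between measurements: partially
    forget old evidence by mixing with a uniform distribution."""
    uniform = np.full_like(p, 1.0 / p.size)
    return lam * p + (1.0 - lam) * uniform

def update(p, likelihood):
    """Bayes update with a per-type measurement likelihood vector."""
    post = forget(p) * likelihood
    return post / post.sum()

p = np.full(3, 1.0 / 3.0)                    # three target types, flat prior
p = update(p, np.array([0.7, 0.2, 0.1]))     # first sensor report
p = update(p, np.array([0.6, 0.3, 0.1]))     # second sensor report
print(p.argmax(), np.round(p, 3))
```

The forgetting step keeps every type probability bounded away from zero, so later contradictory reports (clutter, misassociations) can still change the decision.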

  9. Object tracking algorithm based on the color histogram probability distribution

    NASA Astrophysics Data System (ADS)

    Li, Ning; Lu, Tongwei; Zhang, Yanduo

    2018-04-01

    To address tracking failures caused by occlusion of the target and by follower jamming from background objects similar to the target, and to reduce the influence of light intensity, this paper uses the HSV and YCbCr color channels to correct the update center of the target and continuously updates an adaptive image threshold to improve target detection. Clustering the initial obstacles gives a rough range, shortening the threshold range so that the target is detected to the maximum extent. To improve the accuracy of the detector, a Kalman filter is added to estimate the target state area. A direction predictor based on a Markov model is added to realize target state estimation under background color interference and to enhance the detector's ability to distinguish similar objects. The experimental results show that the improved algorithm is more accurate and faster.
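
The Kalman-filter component that stabilises such a colour-based detector can be sketched as a minimal constant-velocity filter on the 2-D target centre. All matrices and noise levels below are illustrative assumptions, not the paper's values.

```python
import numpy as np

dt = 1.0
# State [px, py, vx, vy]: constant-velocity motion, position measurements.
F = np.array([[1, 0, dt, 0], [0, 1, 0, dt],
              [0, 0, 1, 0], [0, 0, 0, 1]], float)
H = np.array([[1, 0, 0, 0], [0, 1, 0, 0]], float)
Q = 0.01 * np.eye(4)   # process noise (hypothetical)
R = 1.0 * np.eye(2)    # measurement noise (hypothetical)

x, P = np.zeros(4), np.eye(4)

def step(x, P, z):
    # Predict.
    x = F @ x
    P = F @ P @ F.T + Q
    # Update with the detected target centre z.
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(4) - K @ H) @ P
    return x, P

for z in [np.array([1.0, 1.0]), np.array([2.0, 2.1]), np.array([3.1, 2.9])]:
    x, P = step(x, P, z)
print(np.round(x[:2], 2))
```

When the colour detector loses the target (occlusion), the predict step alone can carry the estimated search area forward for a few frames.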

  10. The influence of working memory on the anger superiority effect.

    PubMed

    Moriya, Jun; Koster, Ernst H W; De Raedt, Rudi

    2014-01-01

    The anger superiority effect shows that an angry face is detected more efficiently than a happy face. However, it is still controversial whether attentional allocation to angry faces is a bottom-up process or not. We investigated whether the anger superiority effect is influenced by top-down control, especially working memory (WM). Participants remembered a colour and then searched for differently coloured facial expressions. Just holding the colour information in WM did not modulate the anger superiority effect. However, when increasing the probabilities of trials in which the colour of a target face matched the colour held in WM, participants were inclined to direct attention to the target face regardless of the facial expression. Moreover, the knowledge of high probability of valid trials eliminated the anger superiority effect. These results suggest that the anger superiority effect is modulated by top-down effects of WM, the probability of events and expectancy about these probabilities.

  11. Various Effects of Embedded Intrapulse Communications on Pulsed Radar

    DTIC Science & Technology

    2017-06-01

    specific type of interference that may be encountered by radar; however, this introductory information should suffice to illustrate to the reader why...chapter we seek to not merely understand the overall statistical performance of the radar with embedded intrapulse communications but rather to evaluate...Theory Probability of detection, discussed in Chapter 4, assesses the statistical probability of a radar accurately identifying a target given a

  12. Effects of age and eccentricity on visual target detection.

    PubMed

    Gruber, Nicole; Müri, René M; Mosimann, Urs P; Bieri, Rahel; Aeschimann, Andrea; Zito, Giuseppe A; Urwyler, Prabitha; Nyffeler, Thomas; Nef, Tobias

    2013-01-01

    The aim of this study was to examine the effects of aging and target eccentricity on a visual search task comprising 30 images of everyday life projected into a hemisphere, realizing a ±90° visual field. The task performed binocularly allowed participants to freely move their eyes to scan images for an appearing target or distractor stimulus (presented at 10°; 30°, and 50° eccentricity). The distractor stimulus required no response, while the target stimulus required acknowledgment by pressing the response button. One hundred and seventeen healthy subjects (mean age = 49.63 years, SD = 17.40 years, age range 20-78 years) were studied. The results show that target detection performance decreases with age as well as with increasing eccentricity, especially for older subjects. Reaction time also increases with age and eccentricity, but in contrast to target detection, there is no interaction between age and eccentricity. Eye movement analysis showed that younger subjects exhibited a passive search strategy while older subjects exhibited an active search strategy probably as a compensation for their reduced peripheral detection performance.

  13. Search by photo methodology for signature properties assessment by human observers

    NASA Astrophysics Data System (ADS)

    Selj, Gorm K.; Heinrich, Daniela H.

    2015-05-01

    Reliable, low-cost and simple methods for assessment of signature properties for military purposes are very important. In this paper we present such an approach that uses human observers in a search-by-photo assessment of signature properties of generic test targets. The method was carried out by logging a large number of detection times of targets recorded in relevant terrain backgrounds. The detection times were harvested by using human observers searching for targets in scene images shown on a high-definition PC screen. All targets were identically located in each "search image", allowing relative comparisons (and not just rank by order) of targets. To avoid biased detections, each observer only searched for one target per scene. Statistical analyses were carried out for the detection-time data. Analysis of variance was chosen if the detection-time distributions associated with all targets satisfied normality, and non-parametric tests, such as Wilcoxon's rank test, otherwise. The new methodology allows assessment of signature properties in a reproducible, rapid and reliable setting. Such assessments are very complex as they must sort out what is of relevance in a signature test, but not lose information of value. We believe that choosing detection times as the primary variable for a comparison of signature properties allows a careful and necessary inspection of observer data, as the variable is continuous rather than discrete. Our method thus stands in opposition to approaches based on detections by subsequent, stepwise reductions in distance to target, or based on probability of detection.
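
The analysis rule described (parametric ANOVA when detection times pass a normality check, a non-parametric test otherwise) can be sketched as follows. Shapiro-Wilk and Kruskal-Wallis stand in here as concrete choices of normality check and non-parametric test, and the detection-time samples are synthetic.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Hypothetical detection times (seconds) for three camouflage targets.
times = [rng.lognormal(1.0, 0.4, 40),
         rng.lognormal(1.2, 0.4, 40),
         rng.lognormal(1.1, 0.4, 40)]

# ANOVA only if every group passes a Shapiro-Wilk normality check;
# otherwise fall back to the non-parametric Kruskal-Wallis test.
if all(stats.shapiro(t).pvalue > 0.05 for t in times):
    stat, pvalue = stats.f_oneway(*times)
else:
    stat, pvalue = stats.kruskal(*times)
print(round(pvalue, 4))
```

In practice detection times are often right-skewed (as the lognormal samples here are), which is exactly why the non-parametric branch is needed.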

  14. Target Detection of Quantum Illumination Receiver Based on Photon-subtracted Entanglement State

    NASA Astrophysics Data System (ADS)

    Chi, Jiao; Liu, HongJun; Huang, Nan; Wang, ZhaoLu

    2017-12-01

    We theoretically propose a quantum illumination receiver based on the ideal photon-subtracted two-mode squeezed state (PSTMSS) to efficiently detect a noise-hidden target. This receiver is generated by applying an optical parametric amplifier (OPA) to the cross-correlation detection. Analyzing the output performance, it is found that the OPA, as a front-end stage of the receiver, allows the PSTMSS to significantly reduce the error probability compared with the general two-mode squeezed state (TMSS). Compared with the TMSS, the signal-to-noise ratio of quantum illumination based on the ideal PSTMSS and OPA is improved by more than 4 dB under an optimal OPA gain. This work may provide a potential improvement in the application of accurate target detection when the two kinds of resource have identical real squeezing parameters.

  15. Optimizing occupancy surveys by maximizing detection probability: application to amphibian monitoring in the Mediterranean region.

    PubMed

    Petitot, Maud; Manceau, Nicolas; Geniez, Philippe; Besnard, Aurélien

    2014-09-01

    Setting up effective conservation strategies requires the precise determination of the targeted species' distribution area and, if possible, its local abundance. However, detection issues make these objectives complex for most vertebrates. The detection probability is usually <1 and is highly dependent on species phenology and other environmental variables. The aim of this study was to define an optimized survey protocol for the Mediterranean amphibian community, that is, to determine the most favorable periods and the most effective sampling techniques for detecting all species present on a site in a minimum number of field sessions and a minimum amount of prospecting effort. We visited 49 ponds located in the Languedoc region of southern France on four occasions between February and June 2011. Amphibians were detected using three methods: nighttime call count, nighttime visual encounter, and daytime netting. The detection/nondetection data obtained were then modeled using site-occupancy models. The detection probability of amphibians differed sharply between species, survey methods, and survey dates, and these three covariates also interacted. Thus, a minimum of three visits spread over the breeding season, using a combination of all three survey methods, is needed to reach a 95% detection level for all species in the Mediterranean region. Synthesis and applications: detection/nondetection surveys combined with a site-occupancy modeling approach are a powerful method for estimating the detection probability and determining the prospecting effort necessary to assert that a species is absent from a site.
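
The "minimum number of visits for a 95% detection level" follows the standard cumulative-detection logic 1 − (1 − p)ⁿ ≥ 0.95 for independent visits with per-visit detection probability p. A sketch with a hypothetical per-visit probability (the study's actual probabilities are species-, method-, and date-specific):

```python
import math

def visits_needed(p_visit, target=0.95):
    """Smallest n with 1 - (1 - p_visit)**n >= target."""
    return math.ceil(math.log(1.0 - target) / math.log(1.0 - p_visit))

# E.g. an effective per-visit detection probability of 0.65 (hypothetical)
print(visits_needed(0.65))
```

With p = 0.65 per combined-method visit, three visits suffice, matching the order of magnitude of the protocol recommended in the abstract.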

  16. Comparison of three different detectors applied to synthetic aperture radar data

    NASA Astrophysics Data System (ADS)

    Ranney, Kenneth I.; Khatri, Hiralal; Nguyen, Lam H.

    2002-08-01

    The U.S. Army Research Laboratory has investigated the relative performance of three different target detection paradigms applied to foliage penetration (FOPEN) synthetic aperture radar (SAR) data. The three detectors - a quadratic polynomial discriminator (QPD), Bayesian neural network (BNN) and a support vector machine (SVM) - utilize a common collection of statistics (feature values) calculated from the fully polarimetric FOPEN data. We describe the parametric variations required as part of the algorithm optimizations, and we present the relative performance of the detectors in terms of probability of false alarm (Pfa) and probability of detection (Pd).
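
Performance reported as (Pfa, Pd) pairs can be estimated empirically from detector output scores on target-present and target-absent samples. The Gaussian score distributions and the threshold below are synthetic illustrations, not the FOPEN detector statistics.

```python
import numpy as np

def roc_point(scores_h1, scores_h0, threshold):
    """Empirical (Pfa, Pd) of a detector at one threshold:
    fraction of H1 scores and H0 scores exceeding it."""
    pd = np.mean(np.asarray(scores_h1) >= threshold)
    pfa = np.mean(np.asarray(scores_h0) >= threshold)
    return pfa, pd

rng = np.random.default_rng(1)
target_scores = rng.normal(2.0, 1.0, 1000)   # detector output, target present
clutter_scores = rng.normal(0.0, 1.0, 1000)  # detector output, target absent
pfa, pd = roc_point(target_scores, clutter_scores, threshold=1.0)
print(round(pfa, 2), round(pd, 2))
```

Sweeping the threshold over all observed scores traces out the full ROC curve used to compare the three detectors.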

  17. A robust algorithm for automated target recognition using precomputed radar cross sections

    NASA Astrophysics Data System (ADS)

    Ehrman, Lisa M.; Lanterman, Aaron D.

    2004-09-01

    Passive radar is an emerging technology that offers a number of unique benefits, including covert operation. Many such systems are already capable of detecting and tracking aircraft. The goal of this work is to develop a robust algorithm for adding automated target recognition (ATR) capabilities to existing passive radar systems. In previous papers, we proposed conducting ATR by comparing the precomputed RCS of known targets to that of detected targets. To make the precomputed RCS as accurate as possible, a coordinated flight model is used to estimate aircraft orientation. Once the aircraft's position and orientation are known, it is possible to determine the incident and observed angles on the aircraft, relative to the transmitter and receiver. This makes it possible to extract the appropriate radar cross section (RCS) from our simulated database. This RCS is then scaled to account for propagation losses and the receiver's antenna gain. A Rician likelihood model compares these expected signals from different targets to the received target profile. We have previously employed Monte Carlo runs to gauge the probability of error in the ATR algorithm; however, generation of a statistically significant set of Monte Carlo runs is computationally intensive. As an alternative to Monte Carlo runs, we derive the relative entropy (also known as Kullback-Leibler distance) between two Rician distributions. Since the probability of Type II error in our hypothesis testing problem can be expressed as a function of the relative entropy via Stein's Lemma, this provides us with a computationally efficient method for determining an upper bound on our algorithm's performance. It also provides great insight into the types of classification errors we can expect from our algorithm. This paper compares the numerically approximated probability of Type II error with the results obtained from a set of Monte Carlo runs.
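
The relative entropy between two Rician distributions, the quantity used above to bound the Type II error via Stein's Lemma, can be evaluated numerically when a closed form is not at hand. The sketch below uses SciPy's `rice` distribution with unit scale; the shape parameters are hypothetical stand-ins for the two targets' expected signal levels.

```python
import numpy as np
from scipy.stats import rice
from scipy.integrate import quad

def kl_rice(b1, b2):
    """Numerical Kullback-Leibler divergence D(p || q) between two
    Rician densities with shape parameters b1, b2 (unit scale)."""
    def integrand(x):
        p, q = rice.pdf(x, b1), rice.pdf(x, b2)
        return p * np.log(p / q) if p > 0 else 0.0
    val, _ = quad(integrand, 0, np.inf)
    return val

print(round(kl_rice(1.0, 2.0), 4))
```

A larger divergence between two targets' Rician profiles translates, through Stein's Lemma, into a faster exponential decay of the misclassification probability.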

  18. Scan statistics with local vote for target detection in distributed system

    NASA Astrophysics Data System (ADS)

    Luo, Junhai; Wu, Qi

    2017-12-01

    Target detection occupies a pivotal position in distributed systems. Scan statistics, as one of the most efficient detection methods, has been applied to a variety of anomaly detection problems and significantly improves the probability of detection. However, scan statistics cannot achieve the expected performance when the noise intensity is strong or the signal emitted by the target is weak. The local vote algorithm can also achieve a high target detection rate. After the local vote, the counting rule is usually adopted for decision fusion. The counting rule does not use the information about the contiguity of sensors but takes all sensors' data into consideration, which makes the result undesirable. In this paper, we propose a scan statistics with local vote (SSLV) method. This method combines scan statistics with the local vote decision. Before the scan statistics, each sensor executes a local vote decision according to its neighbors' data and its own. By combining the advantages of both, our method can obtain a higher detection rate in low signal-to-noise-ratio environments than scan statistics alone. After the local vote decision, the distribution of sensors that have detected the target becomes more concentrated. To make full use of the local vote decision, we introduce a variable step parameter for the SSLV, which significantly shortens the scan period, especially when the target is absent. Analysis and simulations are presented to demonstrate the performance of our method.
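
The two building blocks - a local majority vote over neighbouring sensors followed by a sliding-window scan statistic - can be sketched on a 1-D line of binary sensor decisions. The sensor layout, window sizes, and decision pattern below are hypothetical.

```python
import numpy as np

def local_vote(decisions, k=1):
    """Each sensor re-decides by majority over itself and its k
    neighbours on each side (1-D sensor line, truncated at the ends)."""
    n = len(decisions)
    out = np.zeros(n, dtype=int)
    for i in range(n):
        window = decisions[max(0, i - k):i + k + 1]
        out[i] = int(window.sum() * 2 > len(window))
    return out

def scan_statistic(decisions, window=3):
    """Maximum number of 1-decisions in any sliding window."""
    return max(decisions[i:i + window].sum()
               for i in range(len(decisions) - window + 1))

# Hypothetical decisions from 12 sensors; the target sits near sensors
# 5-7, while noise has flipped isolated bits elsewhere.
raw = np.array([0, 1, 0, 0, 0, 1, 1, 1, 0, 0, 1, 0])
voted = local_vote(raw)
print(voted, scan_statistic(voted))
```

The vote removes the isolated false alarms while preserving the contiguous cluster, so the scan statistic's peak now stands out more sharply against the cleaned background.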

  19. Antenna Allocation in MIMO Radar with Widely Separated Antennas for Multi-Target Detection

    PubMed Central

    Gao, Hao; Wang, Jian; Jiang, Chunxiao; Zhang, Xudong

    2014-01-01

    In this paper, we explore a new resource called multi-target diversity to optimize the performance of multiple input multiple output (MIMO) radar with widely separated antennas for detecting multiple targets. In particular, we allocate antennas of the MIMO radar to probe different targets simultaneously in a flexible manner based on the performance metric of relative entropy. Two antenna allocation schemes are proposed. In the first scheme, each antenna is allocated to illuminate a proper target over the entire illumination time, so that the detection performance of each target is guaranteed. The problem is formulated as a minimum makespan scheduling problem in the combinatorial optimization framework. Antenna allocation is implemented through a branch-and-bound algorithm and an enhanced factor 2 algorithm. In the second scheme, called antenna-time allocation, each antenna is allocated to illuminate different targets with different illumination time. Both antenna allocation and time allocation are optimized based on illumination probabilities. Over a large range of transmitted power, target fluctuations and target numbers, both of the proposed antenna allocation schemes outperform the scheme without antenna allocation. Moreover, the antenna-time allocation scheme achieves a more robust detection performance than branch-and-bound algorithm and the enhanced factor 2 algorithm when the target number changes. PMID:25350505
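
The first allocation scheme above is formulated as a minimum-makespan scheduling problem, solved in the paper by branch-and-bound and an enhanced factor-2 algorithm. As a simplified stand-in, the classic longest-processing-time (LPT) greedy heuristic sketches the idea of balancing per-target illumination loads across antennas; the target loads and antenna count below are hypothetical.

```python
import heapq

def lpt_allocate(loads, n_antennas):
    """Greedy LPT: assign each target (largest load first) to the
    currently least-loaded antenna; returns assignment and makespan."""
    heap = [(0.0, a) for a in range(n_antennas)]
    heapq.heapify(heap)
    assignment = {}
    for target, load in sorted(loads.items(), key=lambda kv: -kv[1]):
        total, antenna = heapq.heappop(heap)
        assignment[target] = antenna
        heapq.heappush(heap, (total + load, antenna))
    makespan = max(t for t, _ in heap)
    return assignment, makespan

loads = {"T1": 5.0, "T2": 4.0, "T3": 3.0, "T4": 3.0, "T5": 2.0}
assignment, makespan = lpt_allocate(loads, n_antennas=2)
print(assignment, makespan)
```

LPT carries the well-known 4/3 worst-case approximation guarantee for makespan; the paper's enhanced factor-2 algorithm plays the analogous role with an explicit performance factor.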

  1. Doppler characteristics of sea clutter.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Raynal, Ann Marie; Doerry, Armin Walter

    2010-06-01

    Doppler radars can distinguish targets from clutter if the target's velocity along the radar line of sight is beyond that of the clutter. Some targets of interest may have a Doppler shift similar to that of clutter. The nature of sea clutter is different in the endo-clutter and exo-clutter regions. This behavior requires special consideration regarding where a radar can expect to find sea-clutter returns in Doppler space and what detection algorithms are most appropriate to help mitigate false alarms and increase probability of detection of a target. This paper studies the existing state-of-the-art in the understanding of Doppler characteristics of sea clutter and scattering from the ocean to better understand the design and performance choices of a radar in differentiating targets from clutter under prevailing sea conditions.
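
The separation of targets from sea clutter in Doppler space rests on the basic monostatic relation f_d = 2v/λ. A tiny worked example (the velocity and wavelength values are illustrative) shows why slow-moving targets can land inside the Doppler spread of wave-induced scatterer motion:

```python
def doppler_shift(radial_velocity_mps, wavelength_m):
    """Two-way Doppler shift f_d = 2 v / lambda for a monostatic radar."""
    return 2.0 * radial_velocity_mps / wavelength_m

# A 3 m/s radial scatterer velocity at X-band (3 cm wavelength):
print(doppler_shift(3.0, 0.03))
```

A small boat moving at a similar radial speed produces a comparable shift, which is exactly the regime where Doppler alone cannot separate target from clutter.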

  2. The correct estimate of the probability of false detection of the matched filter in weak-signal detection problems . II. Further results with application to a set of ALMA and ATCA data

    NASA Astrophysics Data System (ADS)

    Vio, R.; Vergès, C.; Andreani, P.

    2017-08-01

    The matched filter (MF) is one of the most popular and reliable techniques to detect signals of known structure whose amplitude is smaller than the level of the contaminating noise. Under the assumption of stationary Gaussian noise, the MF maximizes the probability of detection subject to a constant probability of false detection or false alarm (PFA). This property relies upon a priori knowledge of the position of the searched signals, which is usually not available. Recently, it has been shown that when applied in its standard form, the MF may severely underestimate the PFA. As a consequence, the statistical significance of features that belong to noise is overestimated and the resulting detections are actually spurious. For this reason, an alternative method of computing the PFA has been proposed that is based on the probability density function (PDF) of the peaks of an isotropic Gaussian random field. In this paper we further develop this method. In particular, we discuss the statistical meaning of the PFA and show that, although useful as a preliminary step in a detection procedure, it is not able to quantify the actual reliability of a specific detection. For this reason, a new quantity is introduced, called the specific probability of false alarm (SPFA), which is able to carry out this computation. We show how this method works in targeted simulations and apply it to a few interferometric maps taken with the Atacama Large Millimeter/submillimeter Array (ALMA) and the Australia Telescope Compact Array (ATCA). We select a few potential new point sources and assign an accurate detection reliability to these sources.
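
The core point - that the standard MF PFA is badly underestimated when the signal position is searched for rather than known a priori - can be reproduced in a small simulation. The template length, threshold, and trial counts below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)

# Unit-norm template, so each MF output sample is N(0,1) under noise only.
template = np.ones(8) / np.sqrt(8)
threshold = 3.0               # in sigma units
n_trials, n_samples = 2000, 512

false_known, false_peak = 0, 0
for _ in range(n_trials):
    noise = rng.normal(size=n_samples)
    mf = np.convolve(noise, template[::-1], mode="valid")
    false_known += mf[0] > threshold       # position known a priori
    false_peak += mf.max() > threshold     # position searched (peak statistic)
print(false_known / n_trials, false_peak / n_trials)
```

The single-position false-alarm rate stays near the nominal Gaussian tail probability, while the peak-search rate is orders of magnitude larger, which is the effect the peak-PDF correction is designed to account for.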

  3. Sensor planning for moving targets

    NASA Astrophysics Data System (ADS)

    Musman, Scott A.; Lehner, Paul; Elsaesser, Chris

    1994-10-01

    Planning a search for moving ground targets is difficult for humans and computationally intractable. This paper describes a technique to solve such problems. The main idea is to combine probability-of-detection assessments with computational search heuristics to generate sensor plans that approximately maximize either the probability of detection or a user-specified knowledge function (e.g., determining the target's probable destination; locating the enemy tanks). In contrast to supercomputer-based moving target search planning, our technique has been implemented using workstation technology. The data structures generated by sensor planning can be used to evaluate sensor reports during plan execution. Our system revises its objective function with each sensor report, allowing the user to assess both the current situation as well as the expected value of future information. This capability is particularly useful in situations involving a high rate of sensor reporting, helping the user focus attention on the sensor reports most pertinent to current needs. Our planning approach is implemented in a three-layer architecture. The layers are: mobility analysis, followed by sensor coverage analysis, and concluding with sensor plan analysis. Using these layers, it is possible to describe the physical, spatial, and temporal characteristics of a scenario in the first two layers and customize the final analysis to specific intelligence objectives. The architecture also allows a user to customize operational parameters in each of the three major components of the system. As examples of these performance options, we briefly describe the mobility analysis and discuss issues affecting sensor plan analysis.

  4. Beyond the Sparsity-Based Target Detector: A Hybrid Sparsity and Statistics Based Detector for Hyperspectral Images.

    PubMed

    Du, Bo; Zhang, Yuxiang; Zhang, Liangpei; Tao, Dacheng

    2016-08-18

    Hyperspectral images provide great potential for target detection; however, they also introduce new challenges, so hyperspectral target detection should be treated as a new problem and modeled differently. Many classical detectors are based on the linear mixing model or the sparsity model. However, the former type of model cannot deal well with spectral variability given limited endmembers, and the latter type usually treats target detection as a simple classification problem and pays little attention to the low target probability. In this case, can we find an efficient way to utilize both the high-dimensional features behind hyperspectral images and the limited target information to extract small targets? This paper proposes a novel sparsity-based detector named the hybrid sparsity and statistics detector (HSSD) for target detection in hyperspectral imagery, which can effectively deal with the above two problems. The proposed algorithm designs a hypothesis-specific dictionary based on the prior hypotheses for the test pixel, which avoids an imbalanced number of training samples for a class-specific dictionary. Then, a purification process is employed for the background training samples in order to construct an effective competition between the two hypotheses. Next, a sparse-representation-based binary hypothesis model merged with additive Gaussian noise is proposed to represent the image. Finally, a generalized likelihood ratio test is performed to obtain a more robust detection decision than reconstruction-residual-based detection methods. Extensive experimental results with three hyperspectral datasets confirm that the proposed HSSD algorithm clearly outperforms the state-of-the-art target detectors.
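
The binary-hypothesis idea behind such detectors can be illustrated with a toy residual comparison: reconstruct the test pixel with a background-only dictionary (H0) and with a background-plus-target dictionary (H1), and declare a target when the residual ratio is large. This is a simple stand-in in the spirit of sparsity-based detection, not the HSSD itself (which adds dictionary purification and a GLRT); all dictionaries and spectra below are synthetic.

```python
import numpy as np

rng = np.random.default_rng(3)

bands = 20
background = rng.normal(size=(bands, 5))   # synthetic background atoms
target_sig = rng.normal(size=(bands, 1))   # synthetic target signature

def residual(y, D):
    """Norm of the least-squares reconstruction residual of y over D."""
    coef, *_ = np.linalg.lstsq(D, y, rcond=None)
    return np.linalg.norm(y - D @ coef)

def detect(y, eta=1.5):
    r0 = residual(y, background)                             # H0 fit
    r1 = residual(y, np.hstack([background, target_sig]))    # H1 fit
    return (r0 / max(r1, 1e-12)) > eta   # big ratio => target explains y

pixel_bg = background @ rng.normal(size=5) + 0.05 * rng.normal(size=bands)
pixel_tg = 2.0 * target_sig[:, 0] + 0.05 * rng.normal(size=bands)
print(bool(detect(pixel_bg)), bool(detect(pixel_tg)))
```

For a background pixel, adding the target atom barely improves the fit, so the ratio stays near 1; for a target pixel, the H1 dictionary collapses the residual and the ratio spikes.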

  5. Effects of Target Probability and Memory Demands on the Vigilance of Adults with and without Mental Retardation.

    ERIC Educational Resources Information Center

    Tomporowski, Phillip D.; Tinsley, Veronica

    1994-01-01

The vigilance of young adults with and without mild mental retardation (MR) was compared as subjects performed two memory-demanding, cognitively based tests. The vigilance of the adults with MR declined more rapidly than that of the non-MR adults, due to an interaction between target detectability and response bias, and poor target…

  6. A comparison of error bounds for a nonlinear tracking system with detection probability Pd < 1.

    PubMed

    Tong, Huisi; Zhang, Hao; Meng, Huadong; Wang, Xiqin

    2012-12-14

Error bounds for nonlinear filtering are very important for performance evaluation and sensor management. This paper presents a comparative study of three error bounds for tracking filtering when the detection probability is less than unity. One of these bounds is the random finite set (RFS) bound, which is deduced within the framework of finite set statistics. The others, the information reduction factor (IRF) posterior Cramer-Rao lower bound (PCRLB) and the enumeration method (ENUM) PCRLB, are introduced within the framework of finite vector statistics. In this paper, we deduce two propositions and prove that the RFS bound is equal to the ENUM PCRLB, and tighter than the IRF PCRLB, when the target exists from the beginning to the end. When the disappearance of existing targets and the appearance of new targets are considered, the RFS bound becomes tighter over time than both the IRF PCRLB and the ENUM PCRLB, by incorporating the uncertainty of target existence. The theory is illustrated by two nonlinear tracking applications: ballistic object tracking and bearings-only tracking. The simulation studies confirm the theory and reveal the relationship among the three bounds.

  7. A Comparison of Error Bounds for a Nonlinear Tracking System with Detection Probability Pd < 1

    PubMed Central

    Tong, Huisi; Zhang, Hao; Meng, Huadong; Wang, Xiqin

    2012-01-01

Error bounds for nonlinear filtering are very important for performance evaluation and sensor management. This paper presents a comparative study of three error bounds for tracking filtering when the detection probability is less than unity. One of these bounds is the random finite set (RFS) bound, which is deduced within the framework of finite set statistics. The others, the information reduction factor (IRF) posterior Cramer-Rao lower bound (PCRLB) and the enumeration method (ENUM) PCRLB, are introduced within the framework of finite vector statistics. In this paper, we deduce two propositions and prove that the RFS bound is equal to the ENUM PCRLB, and tighter than the IRF PCRLB, when the target exists from the beginning to the end. When the disappearance of existing targets and the appearance of new targets are considered, the RFS bound becomes tighter over time than both the IRF PCRLB and the ENUM PCRLB, by incorporating the uncertainty of target existence. The theory is illustrated by two nonlinear tracking applications: ballistic object tracking and bearings-only tracking. The simulation studies confirm the theory and reveal the relationship among the three bounds. PMID:23242274

  8. Radar waveform requirements for reliable detection of an aircraft-launched missile

    NASA Astrophysics Data System (ADS)

    Blair, W. Dale; Brandt-Pearce, Maite

    1996-06-01

When tracking a manned aircraft with a phased array radar, detecting a missile launch (i.e., a target split) is particularly important because the missile can have a very small radar cross section (RCS) and drop below the horizon of the radar shortly after launch. Reliable detection of the launch is made difficult because the RCS of the missile is very small compared to that of the manned aircraft and the radar typically revisits a manned aircraft only every few seconds. Furthermore, any measurements of the aircraft and missile taken shortly after the launch will be merged until the two targets are resolved in range, frequency, or space. In this paper, detection of the launched missile is addressed through the detection of the presence of target multiplicity with the in-phase and quadrature monopulse measurements. The probability of detecting the launch using monopulse processing is studied with regard to the tracking signal-to-noise ratio and the number of pulses in the radar waveform.

  9. Electrophysiological evidence that top-down knowledge controls working memory processing for subsequent visual search.

    PubMed

    Kawashima, Tomoya; Matsumoto, Eriko

    2016-03-23

Items in working memory guide visual attention toward memory-matching objects. Recent studies have shown that, when searching for an object, this attentional guidance can be modulated by knowing the probability that the target will match an item in working memory. Here, we recorded the P3 and contralateral delay activity to investigate how top-down knowledge controls the processing of working memory items. Participants performed a memory task (recognition only) and a memory-or-search task (recognition or visual search), in both of which they maintained two colored oriented bars in working memory. For visual search, we manipulated the probability that the target had the same color as the memorized items (0, 50, or 100%), and participants knew the probabilities before the task. Target detection in the 100% match condition was faster than in the 50% match condition, indicating that participants used their knowledge of the probabilities. We found that the P3 amplitude in the 100% condition was larger than in the other conditions and that the contralateral delay activity amplitude did not vary across conditions. These results suggest that more attention was allocated to the memory items when observers knew in advance that their color would likely match a target. This led to better search performance despite qualitatively equal working memory representations.

  10. Feature Transformation Detection Method with Best Spectral Band Selection Process for Hyper-spectral Imaging

    NASA Astrophysics Data System (ADS)

    Chen, Hai-Wen; McGurr, Mike; Brickhouse, Mark

    2015-11-01

We present a newly developed feature transformation (FT) detection method for hyper-spectral imagery (HSI) sensors. In essence, the FT method, by transforming the original features (spectral bands) to a different feature domain, may considerably increase the statistical separation between the target and background probability density functions, and thus may significantly improve the target detection and identification performance, as evidenced by the test results in this paper. We show that by differentiating the original spectra, one can completely separate targets from the background using a single spectral band, leading to perfect detection results. In addition, we have proposed an automated best spectral band selection process with a double-threshold scheme that can rank the available spectral bands from best to worst for target detection. Finally, we have also proposed an automated cross-spectrum fusion process to further improve the detection performance in the lower spectral range (<1000 nm) by selecting the best spectral band pair with multivariate analysis. Promising detection performance has been achieved using a small background material signature library as proof of concept, and has then been further evaluated and verified using a real background HSI scene collected by a HYDICE sensor.
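The effect of a derivative-style feature transformation can be sketched with synthetic spectra: if backgrounds differ only in brightness while targets carry an extra spectral slope, the classes overlap in every raw band but separate perfectly after a first difference. The data below are fabricated for illustration; the paper's FT method and band-selection scheme operate on real hyperspectral signatures.

```python
import numpy as np

rng = np.random.default_rng(1)
bands = np.arange(20)

# Backgrounds: flat spectra at random brightness; targets: add a spectral slope.
background = rng.uniform(0.5, 1.5, size=(100, 1)) * np.ones((100, 20))
target = background[:50] + 0.02 * bands   # hypothetical slope feature

# In any single raw band the two classes overlap heavily...
raw_overlap = target[:, 10].min() < background[:, 10].max()

# ...but the first difference (spectral derivative) separates them in one band:
d_bg = np.diff(background, axis=1)[:, 10]   # exactly 0 for flat spectra
d_tg = np.diff(target, axis=1)[:, 10]       # equals the slope, 0.02

print(raw_overlap, d_bg.max(), d_tg.min())
```

A single threshold on the differentiated band (e.g. 0.01) then classifies every sample correctly, which is the "perfect detection from a single band" behavior the abstract describes.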

  11. Can you hear me now? Range-testing a submerged passive acoustic receiver array in a Caribbean coral reef habitat

    USGS Publications Warehouse

    Selby, Thomas H.; Hart, Kristen M.; Fujisaki, Ikuko; Smith, Brian J.; Pollock, Clayton J; Hillis-Star, Zandy M; Lundgren, Ian; Oli, Madan K.

    2016-01-01

Submerged passive acoustic technology allows researchers to investigate spatial and temporal movement patterns of many marine and freshwater species. The technology uses receivers to detect and record acoustic transmissions emitted from tags attached to an individual. Acoustic signal strength naturally attenuates over distance, but numerous environmental variables also affect the probability a tag is detected. Knowledge of receiver range is crucial for designing acoustic arrays and analyzing telemetry data. Here, we present a method for testing a relatively large-scale receiver array in a dynamic Caribbean coastal environment intended for long-term monitoring of multiple species. The U.S. Geological Survey and several academic institutions, in collaboration with resource management at Buck Island Reef National Monument (BIRNM), off the coast of St. Croix, recently deployed an array of 52 passive acoustic receivers. We targeted 19 array-representative receivers for range-testing by submersing fixed-delay-interval range-testing tags at various distance intervals in each cardinal direction from a receiver for a minimum of an hour. Using a generalized linear mixed model (GLMM), we estimated the probability of detection across the array and assessed the effect of water depth, habitat, wind, temperature, and time of day on the probability of detection. The predicted probability of detection across the entire array at 100 m distance from a receiver was 58.2% (95% CI: 44.0–73.0%) and dropped to 26.0% (95% CI: 11.4–39.3%) at 200 m from a receiver, indicating a somewhat constrained effective detection range. Detection probability varied across habitat classes, with the greatest effective detection range occurring in homogeneous sand substrate and the smallest in high-rugosity reef. Predicted probability of detection across BIRNM highlights potential gaps in coverage using the current array as well as limitations of passive acoustic technology within a complex coral reef environment.
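The reported detection probabilities imply a roughly logistic decay with distance. As a sketch, the two point estimates (58.2% at 100 m, 26.0% at 200 m) pin down the intercept and slope of a simple logistic curve; the actual GLMM also included depth, habitat, wind, temperature, and time-of-day terms, so these coefficients are illustrative only.

```python
import math

# Solve logit(p) = b0 + b1 * d from the two reported points:
# p = 0.582 at 100 m and p = 0.260 at 200 m.
logit = lambda p: math.log(p / (1 - p))
b1 = (logit(0.260) - logit(0.582)) / (200 - 100)
b0 = logit(0.582) - 100 * b1

def p_detect(d):
    """Detection probability at distance d (meters) under the 2-point fit."""
    return 1 / (1 + math.exp(-(b0 + b1 * d)))

print(round(p_detect(100), 3), round(p_detect(150), 3), round(p_detect(200), 3))
```

Interpolating gives about a 41% detection probability at 150 m under this simplified distance-only model.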

  12. Driver landmark and traffic sign identification in early Alzheimer's disease.

    PubMed

    Uc, E Y; Rizzo, M; Anderson, S W; Shi, Q; Dawson, J D

    2005-06-01

To assess visual search and recognition of roadside targets, and safety errors, during a landmark and traffic sign identification task in drivers with Alzheimer's disease, 33 drivers with probable Alzheimer's disease of mild severity and 137 neurologically normal older adults underwent a battery of visual and cognitive tests and were asked to report detection of specific landmarks and traffic signs along a segment of an experimental drive. The drivers with mild Alzheimer's disease identified significantly fewer landmarks and traffic signs and made more at-fault safety errors during the task than control subjects. Roadside target identification performance and safety errors were predicted by scores on standardised tests of visual and cognitive function. Drivers with Alzheimer's disease are impaired in a task of visual search and recognition of roadside targets; the demands of these targets on visual perception, attention, executive functions, and memory probably increase the cognitive load, worsening driving safety.

  13. Laser radar range and detection performance for MEMS corner cube retroreflector arrays

    NASA Astrophysics Data System (ADS)

    Grasso, Robert J.; Odhner, Jefferson E.; Stewart, Hamilton; McDaniel, Robert V.

    2004-12-01

    BAE SYSTEMS reports on a program to characterize the performance of MEMS corner cube retroreflector arrays under laser illumination. These arrays have significant military and commercial application in the areas of: 1) target identification; 2) target tracking; 3) target location; 4) identification friend-or-foe (IFF); 5) parcel tracking, and; 6) search and rescue assistance. BAE SYSTEMS has theoretically determined the feasibility of these devices to learn if sufficient signal-to-noise performance exists to permit a cooperative laser radar sensor to be considered for device location and interrogation. Results indicate that modest power-apertures are required to achieve SNR performance consistent with high probability of detection and low false alarm rates.

  14. Laser radar range and detection performance for MEMS corner cube retroreflector arrays

    NASA Astrophysics Data System (ADS)

    Grasso, Robert J.; Jost, Steven R.; Smith, M. J.; McDaniel, Robert V.

    2004-01-01

    BAE SYSTEMS reports on a program to characterize the performance of MEMS corner cube retroreflector arrays under laser illumination. These arrays have significant military and commercial application in the areas of: (1) target identification; (2) target tracking; (3) target location; (4) identification friend-or-foe (IFF); (5) parcel tracking, and; (6) search and rescue assistance. BAE SYSTEMS has theoretically determined the feasibility of these devices to learn if sufficient signal-to-noise performance exists to permit a cooperative laser radar sensor to be considered for device location and interrogation. Results indicate that modest power-apertures are required to achieve SNR performance consistent with high probability of detection and low false alarm rates.

  15. Investigating the probability of detection of typical cavity shapes through modelling and comparison of geophysical techniques

    NASA Astrophysics Data System (ADS)

    James, P.

    2011-12-01

With a growing need for housing in the U.K., the government has proposed increased development of brownfield sites. However, old mine workings and natural cavities represent a potential hazard before, during and after construction on such sites, and add further complication to subsurface parameters. Cavities are hence a limitation to certain redevelopment, and their detection is an ever more important consideration. The current standard technique for cavity detection is a borehole grid, which is intrusive, non-continuous, slow and expensive. A new, robust investigation standard for the detection of cavities is sought, and geophysical techniques offer an attractive alternative. Geophysical techniques have previously been utilised successfully in the detection of cavities in various geologies, but still have an uncertain reputation in the engineering industry. Engineers are unsure of the techniques and are more inclined to rely on well-known methods than to utilise new technologies. Bad experiences with geophysics are commonly due to the indiscriminate choice of particular techniques. It is imperative that a geophysical survey is designed with the specific site and target in mind at all times, with the ability and judgement to rule out some, or all, techniques. To this author's knowledge no comparative software exists to aid technique choice. Also, previous modelling software limits the shapes of bodies, and hence typical cavity shapes are not represented. Here, we introduce 3D modelling software (Matlab) which computes and compares the response to various cavity targets from a range of techniques (gravity, gravity gradient, magnetic, magnetic gradient and GPR). Typical near-surface cavity shapes are modelled, including shafts, bellpits, various lining and capping materials, and migrating voids. The probability of cavity detection is assessed in typical subsurface and noise conditions across a range of survey parameters. 
Techniques can be compared and the limits of detection distance assessed. The density of survey points required to achieve a required probability of detection can be calculated. The software aids the discriminate choice of technique, improves survey design, and increases the likelihood of survey success; all factors sought in the engineering industry. As a simple example, the response from the magnetometry, gravimetry, and gravity gradient techniques above an example 3 m deep, 1 m cube air cavity in limestone across a 15 m grid was calculated. The maximum responses above the cavity are small (amplitudes of 0.018 nT, 0.0013 mGal, and 8.3 eotvos, respectively), but at typical site noise levels the detection reliability is over 50% for the gravity gradient method on a single survey line. Increasing the number of survey points across the site increases the reliability of detection of the anomaly by combining probabilities. We can calculate the probability of detection at different profile spacings to assess the best possible survey design. At 1 m spacing the overall probability of detection by the gravity gradient method is over 90%, and over 60% for magnetometry (at 3 m spacing the probability drops to 32%). The use of modelling in near-surface surveys is a useful tool to assess the feasibility of a range of techniques to detect subtle signals. Future work will integrate this work with borehole-measured parameters.
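The gain from denser survey coverage quoted above follows from combining probabilities across lines. Assuming independent survey lines, each with the same single-line detection probability (an idealization of the paper's calculation), the combined probability is 1 − (1 − p)^n:

```python
def combined_detection(p_single, n_lines):
    """Probability that at least one of n independent survey lines detects
    the anomaly, given per-line detection probability p_single."""
    return 1 - (1 - p_single) ** n_lines

# e.g., roughly 50% per line for the gravity gradient method:
for n in (1, 2, 4):
    print(n, round(combined_detection(0.5, n), 4))
```

Four independent lines at 50% each already give about 94%, consistent with the qualitative trend reported for the denser 1 m profile spacing.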

  16. Location detection and tracking of moving targets by a 2D IR-UWB radar system.

    PubMed

    Nguyen, Van-Han; Pyun, Jae-Young

    2015-03-19

In indoor environments, the Global Positioning System (GPS) and long-range tracking radar systems are not optimal because of signal propagation limitations. In recent years, ultra-wideband (UWB) technology has become a possible solution for object detection, localization and tracking in indoor environments because of its high range resolution, compact size and low cost. This paper presents improved target detection and tracking techniques for moving objects with impulse-radio UWB (IR-UWB) radar in a short-range indoor area. This is achieved through signal-processing steps such as clutter reduction, target detection, target localization and tracking. In this paper, we introduce a new combination of our proposed signal-processing procedures. In the clutter-reduction step, a filtering method that uses a Kalman filter (KF) is proposed. Then, in the target detection step, a modification of the conventional CLEAN algorithm, which is used to estimate the impulse response from the observation region, is applied for the advanced elimination of false alarms. The output is then fed into the target localization and tracking step, in which the target location and trajectory are determined and tracked by using an unscented KF in two-dimensional coordinates. In each step, the proposed methods are compared to conventional methods to demonstrate the differences in performance. The experiments are carried out using actual IR-UWB radar under different scenarios. The results verify that the proposed methods can improve the probability and efficiency of target detection and tracking.
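The clutter-reduction idea, on one plausible reading, treats each range bin's background as a slowly varying state tracked by a scalar Kalman filter; subtracting the tracked background leaves the moving-target response. The noise variances, scan data, and target amplitude below are invented for illustration and are not the paper's tuned filter.

```python
import numpy as np

def kf_background(scans, q=1e-4, r=0.1):
    """Track the slowly varying background of each range bin with a scalar
    Kalman filter (random-walk model); return the final background estimate
    and the clutter-suppressed residual of each scan after the first."""
    est = scans[0].astype(float).copy()
    p = np.ones_like(est)
    residuals = []
    for z in scans[1:]:
        p = p + q                      # predict
        k = p / (p + r)                # Kalman gain
        est = est + k * (z - est)      # update background estimate
        p = (1 - k) * p
        residuals.append(z - est)      # clutter-suppressed scan
    return est, residuals

rng = np.random.default_rng(2)
clutter = rng.random(64)                             # static background profile
scans = [clutter + 0.05 * rng.standard_normal(64) for _ in range(30)]
scans[-1][20] += 1.0                                 # target appears in bin 20

est, residuals = kf_background(np.array(scans))
print(int(np.argmax(np.abs(residuals[-1]))))         # strongest residual bin
```

Because the filter gain has converged to a small value by the final scan, the static clutter is almost fully subtracted while the sudden target return survives in the residual.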

  17. The MYStIX Infrared-Excess Source Catalog

    NASA Astrophysics Data System (ADS)

    Povich, Matthew S.; Kuhn, Michael A.; Getman, Konstantin V.; Busk, Heather A.; Feigelson, Eric D.; Broos, Patrick S.; Townsley, Leisa K.; King, Robert R.; Naylor, Tim

    2013-12-01

    The Massive Young Star-Forming Complex Study in Infrared and X-rays (MYStIX) project provides a comparative study of 20 Galactic massive star-forming complexes (d = 0.4-3.6 kpc). Probable stellar members in each target complex are identified using X-ray and/or infrared data via two pathways: (1) X-ray detections of young/massive stars with coronal activity/strong winds or (2) infrared excess (IRE) selection of young stellar objects (YSOs) with circumstellar disks and/or protostellar envelopes. We present the methodology for the second pathway using Spitzer/IRAC, 2MASS, and UKIRT imaging and photometry. Although IRE selection of YSOs is well-trodden territory, MYStIX presents unique challenges. The target complexes range from relatively nearby clouds in uncrowded fields located toward the outer Galaxy (e.g., NGC 2264, the Flame Nebula) to more distant, massive complexes situated along complicated, inner Galaxy sightlines (e.g., NGC 6357, M17). We combine IR spectral energy distribution (SED) fitting with IR color cuts and spatial clustering analysis to identify IRE sources and isolate probable YSO members in each MYStIX target field from the myriad types of contaminating sources that can resemble YSOs: extragalactic sources, evolved stars, nebular knots, and even unassociated foreground/background YSOs. Applying our methodology consistently across 18 of the target complexes, we produce the MYStIX IRE Source (MIRES) Catalog comprising 20,719 sources, including 8686 probable stellar members of the MYStIX target complexes. We also classify the SEDs of 9365 IR counterparts to MYStIX X-ray sources to assist the first pathway, the identification of X-ray-detected stellar members. The MIRES Catalog provides a foundation for follow-up studies of diverse phenomena related to massive star cluster formation, including protostellar outflows, circumstellar disks, and sequential star formation triggered by massive star feedback processes.

  18. Optimal Power Allocation Strategy in a Joint Bistatic Radar and Communication System Based on Low Probability of Intercept

    PubMed Central

    Wang, Fei; Salous, Sana; Zhou, Jianjiang

    2017-01-01

In this paper, we investigate a low probability of intercept (LPI)-based optimal power allocation strategy for a joint bistatic radar and communication system, which is composed of a dedicated transmitter, a radar receiver, and a communication receiver. The joint system is capable of fulfilling the requirements of both radar and communications simultaneously. First, assuming that the signal-to-noise ratio (SNR) corresponding to the target surveillance path is much weaker than that corresponding to the line-of-sight path at the radar receiver, the analytically closed-form expression for the probability of false alarm is calculated, whereas the closed-form expression for the probability of detection is not analytically tractable and is approximated, due to the fact that the received signals are not zero-mean Gaussian under the target-presence hypothesis. Then, an LPI-based optimal power allocation strategy is presented to minimize the total transmission power for the information signal and radar waveform, constrained by a specified information rate for the communication receiver and the desired probabilities of detection and false alarm for the radar receiver. The well-known bisection search method is employed to solve the resulting constrained optimization problem. Finally, numerical simulations are provided to reveal the effects of several system parameters on the power allocation results. It is also demonstrated that the LPI performance of the joint bistatic radar and communication system can be markedly improved by utilizing the proposed scheme. PMID:29186850

  19. Optimal Power Allocation Strategy in a Joint Bistatic Radar and Communication System Based on Low Probability of Intercept.

    PubMed

    Shi, Chenguang; Wang, Fei; Salous, Sana; Zhou, Jianjiang

    2017-11-25

In this paper, we investigate a low probability of intercept (LPI)-based optimal power allocation strategy for a joint bistatic radar and communication system, which is composed of a dedicated transmitter, a radar receiver, and a communication receiver. The joint system is capable of fulfilling the requirements of both radar and communications simultaneously. First, assuming that the signal-to-noise ratio (SNR) corresponding to the target surveillance path is much weaker than that corresponding to the line-of-sight path at the radar receiver, the analytically closed-form expression for the probability of false alarm is calculated, whereas the closed-form expression for the probability of detection is not analytically tractable and is approximated, due to the fact that the received signals are not zero-mean Gaussian under the target-presence hypothesis. Then, an LPI-based optimal power allocation strategy is presented to minimize the total transmission power for the information signal and radar waveform, constrained by a specified information rate for the communication receiver and the desired probabilities of detection and false alarm for the radar receiver. The well-known bisection search method is employed to solve the resulting constrained optimization problem. Finally, numerical simulations are provided to reveal the effects of several system parameters on the power allocation results. It is also demonstrated that the LPI performance of the joint bistatic radar and communication system can be markedly improved by utilizing the proposed scheme.
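The bisection step is straightforward to sketch: since detection probability grows monotonically with transmit power, the minimum power meeting a Pd constraint can be found by halving an interval. The Pd model below is a hypothetical placeholder (the paper uses an approximated closed-form expression involving the radar geometry and SNR), so only the search logic carries over.

```python
import math

def pd_at_power(p_tx):
    """Hypothetical monotone detection-probability model: Pd grows with SNR,
    which is taken proportional to transmit power (illustrative only)."""
    snr = 0.5 * p_tx
    return 1 - math.exp(-snr)

def min_power(pd_required, lo=0.0, hi=100.0, tol=1e-6):
    """Bisection search for the smallest power satisfying the Pd constraint."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if pd_at_power(mid) >= pd_required:
            hi = mid        # constraint met: try lower power
        else:
            lo = mid        # constraint violated: need more power
    return hi

p = min_power(0.9)
print(round(p, 3), pd_at_power(p) >= 0.9)
```

In the paper the same search runs over the feasible region defined jointly by the information-rate and false-alarm constraints; this sketch keeps only the single Pd constraint.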

  20. Discriminating quantum-optical beam-splitter channels with number-diagonal signal states: Applications to quantum reading and target detection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nair, Ranjith

    2011-09-15

We consider the problem of distinguishing, with minimum probability of error, two optical beam-splitter channels with unequal complex-valued reflectivities using general quantum probe states entangled over M signal and M' idler mode pairs, of which the signal modes are bounced off the beam splitter while the idler modes are retained losslessly. We obtain a lower bound on the output state fidelity valid for any pure input state. We define number-diagonal signal (NDS) states to be input states whose density operator in the signal modes is diagonal in the multimode number basis. For such input states, we derive series formulas for the optimal error probability, the output state fidelity, and the Chernoff-type upper bounds on the error probability. For the special cases of quantum reading of a classical digital memory and target detection (for which the reflectivities are real valued), we show that for a given input signal photon probability distribution, the fidelity is minimized by the NDS states with that distribution, and that for a given average total signal energy N_s, the fidelity is minimized by any multimode Fock state with N_s total signal photons. For reading of an ideal memory, it is shown that Fock state inputs minimize the Chernoff bound. For target detection under high-loss conditions, a no-go result showing the lack of appreciable quantum advantage over coherent-state transmitters is derived. A comparison of the error probability performance for quantum reading with number state and two-mode squeezed vacuum state (or EPR state) transmitters relative to coherent-state transmitters is presented for various values of the reflectances. While the nonclassical states in general perform better than the coherent state, the quantitative performance gains differ depending on the values of the reflectances. The experimental outlook for realizing nonclassical gains from number state transmitters with current technology at moderate to high values of the reflectances is argued to be good.

  1. Detection and classification of underwater targets by echolocating dolphins

    NASA Astrophysics Data System (ADS)

    Au, Whitlow

    2003-10-01

Many experiments have been performed with echolocating dolphins to determine their target detection and discrimination capabilities. Target detection experiments have been performed in a naturally noisy environment, with masking noise, with both phantom echoes and masking noise, and in reverberation. The echo energy to rms noise spectral density ratio for the Atlantic bottlenose dolphin (Tursiops truncatus) at the 75% correct response threshold is approximately 7.5 dB, whereas for the beluga whale (Delphinapterus leucas) the threshold is approximately 1 dB. The dolphin's detection threshold in reverberation is approximately 2.5 dB vs 2 dB for the beluga. The difference in performance between species can probably be ascribed to differences in how the two species perceived the task: the bottlenose dolphin may have been performing a combined detection/discrimination task, whereas the beluga may have been performing a simple detection task. Echolocating dolphins also have the capability to make fine discriminations of target properties, such as wall-thickness differences of water-filled cylinders and material differences in metallic plates. The high-resolution property of the animals' echolocation signals and the high dynamic range of their auditory system are important factors in their outstanding discrimination capabilities.

  2. Detection, recognition, identification, and tracking of military vehicles using biomimetic intelligence

    NASA Astrophysics Data System (ADS)

    Pace, Paul W.; Sutherland, John

    2001-10-01

This project is aimed at analyzing EO/IR images to provide automatic target detection/recognition/identification (ATR/D/I) of militarily relevant land targets. An increase in performance was accomplished using a biomimetic intelligence system functioning on low-cost, commercially available processing chips. Biomimetic intelligence has demonstrated advanced capabilities in hand-printed character recognition, real-time detection/identification of multiple faces in full 3D perspectives in cluttered environments, classification of ground-based military vehicles from SAR, and real-time ATR/D/I of ground-based military vehicles from EO/IR/HRR data in cluttered environments. The investigation applied these tools to real data sets and examined parameters such as the minimum resolution for target recognition and the effects of target size, rotation, line-of-sight changes, contrast, partial obscuration, and background clutter. The results demonstrated a real-time ATR/D/I capability against a subset of militarily relevant land targets operating in a realistic scenario. Typical results on the initial EO/IR data indicate probabilities of correct classification of resolved targets greater than 95 percent.

  3. Multi-Target State Extraction for the SMC-PHD Filter

    PubMed Central

    Si, Weijian; Wang, Liwei; Qu, Zhiyu

    2016-01-01

The sequential Monte Carlo probability hypothesis density (SMC-PHD) filter has been demonstrated to be a favorable method for multi-target tracking. However, the time-varying target states need to be extracted from the particle approximation of the posterior PHD, which is difficult to implement due to the unknown relations between the large number of particles and the PHD peaks representing potential target locations. To address this problem, a novel multi-target state extraction algorithm is proposed in this paper. By exploiting the information of measurements and particle likelihoods in the filtering stage, we propose a validation mechanism that selects effective measurements and particles corresponding to detected targets. Subsequently, the states of the detected and undetected targets are estimated separately: the former are obtained from the particle clusters directed by effective measurements, while the latter are obtained from the particles corresponding to undetected targets via a clustering method. Simulation results demonstrate that the proposed method yields better estimation accuracy and reliability compared to existing methods. PMID:27322274

  4. Approach range and velocity determination using laser sensors and retroreflector targets

    NASA Technical Reports Server (NTRS)

    Donovan, William J.

    1991-01-01

A laser docking sensor study is currently in the third year of development. The design concept is considered to be validated. The concept is based on using standard radar techniques to provide range, velocity, and bearing information. Multiple targets are utilized to provide relative attitude data. The design requirements were to utilize existing space-qualifiable technology and require low system power, weight, and size, yet operate from 0.3 to 150 meters with a range accuracy better than 3 millimeters and a range rate accuracy better than 3 mm per second. The field of regard for the system is +/- 20 deg. The transmitter and receiver design features a diode laser, microlens beam steering, and power control as a function of range. The target design consists of five target sets, each having seven 3-inch retroreflectors, arranged around the docking port. The target map is stored in the sensor memory. Phase detection is used for ranging, with the frequency range-optimized. Coarse bearing measurement is provided by the scanning system (one set of binary optics) angle. Fine bearing measurement is provided by a quad detector. A MIL-STD-1750 A/B computer is used for processing. Initial test results indicate a probability of detection greater than 99 percent and a probability of false alarm less than 0.0001. The functional system is currently at the MIT/Lincoln Lab for demonstration.

  5. Target Detection and Identification Using Canonical Correlations Analysis and Subspace Partitioning

    DTIC Science & Technology

    2008-04-01

    Fig. 2. ROCs for DCC, DCC-P, NNLS, and NNLSP (present chemical = t1, background = t56, SNR = 5 dB) ... alarm, or 1 − specificity, and PD is the probability of ... discrimination values are given in each ROC plot. In Fig. 2, we use t56 as the background and t1 as the target chemical. The SNR is 5 dB. For each ...

  6. Multi-Target Tracking Using an Improved Gaussian Mixture CPHD Filter.

    PubMed

    Si, Weijian; Wang, Liwei; Qu, Zhiyu

    2016-11-23

    The cardinalized probability hypothesis density (CPHD) filter is an alternative approximation to the full multi-target Bayesian filter for tracking multiple targets. However, although the joint propagation of the posterior intensity and cardinality distribution in its recursion allows more reliable estimates of the target number than the PHD filter, the CPHD filter suffers from the "spooky effect," in which PHD mass shifts arbitrarily between components in the presence of missed detections. To address this issue in the Gaussian mixture (GM) implementation of the CPHD filter, this paper presents an improved GM-CPHD filter, which incorporates a weight redistribution scheme into the filtering process to modify the updated weights of the Gaussian components when missed detections occur. In addition, an efficient gating strategy that can adaptively adjust the gate sizes according to the number of missed detections of each Gaussian component is also presented to further improve the computational efficiency of the proposed filter. Simulation results demonstrate that the proposed method offers favorable performance in terms of both estimation accuracy and robustness to clutter and detection uncertainty over the existing methods.
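    The adaptive gating idea can be illustrated with a minimal sketch: a component's association gate grows with each consecutive missed detection, so a briefly occluded target is not discarded, and the growth is capped to bound cost. The growth factor and cap here are hypothetical parameters, not values from the paper.

```python
def adaptive_gate(base_gate, consecutive_misses, growth=1.5, max_gate=None):
    """Grow a Gaussian component's gate geometrically with each consecutive
    missed detection, optionally capping it at max_gate."""
    g = base_gate * growth ** consecutive_misses
    return min(g, max_gate) if max_gate is not None else g
```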

  7. Probabilistic track coverage in cooperative sensor networks.

    PubMed

    Ferrari, Silvia; Zhang, Guoxian; Wettergren, Thomas A

    2010-12-01

    The quality of service of a network performing cooperative track detection is represented by the probability of obtaining multiple elementary detections over time along a target track. Recently, two different lines of research, namely, distributed-search theory and geometric transversals, have been used in the literature for deriving the probability of track detection as a function of random and deterministic sensors' positions, respectively. In this paper, we prove that these two approaches are equivalent under the same problem formulation. Also, we present a new performance function that is derived by extending the geometric-transversal approach to the case of random sensors' positions using Poisson flats. As a result, a unified approach for addressing track detection in both deterministic and probabilistic sensor networks is obtained. The new performance function is validated through numerical simulations and is shown to bring about considerable computational savings for both deterministic and probabilistic sensor networks.
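    For the random-placement case, the flavor of the track-detection probability can be shown with a toy Poisson model: if the number of elementary detections accumulated along a track is Poisson-distributed, the quality-of-service metric is the probability of obtaining at least k of them. This is a simplified stand-in for the paper's Poisson-flat formulation, not its actual performance function.

```python
from math import exp, factorial

def prob_track_detected(mean_detections, k):
    """P(at least k elementary detections along the track) when the number
    of detections is Poisson with the given mean."""
    return 1.0 - sum(exp(-mean_detections) * mean_detections ** i / factorial(i)
                     for i in range(k))
```

    Requiring more elementary detections (larger k) for the same sensor density lowers the track-coverage probability, which is the trade-off the performance function quantifies.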

  8. Detection of the earth with the SETI microwave observing system assumed to be operating out in the Galaxy

    NASA Technical Reports Server (NTRS)

    Billingham, John; Tarter, Jill

    1989-01-01

    The maximum range is calculated at which radar signals from the earth could be detected by a search system similar to the NASA SETI Microwave Observing Project (SETI MOP) assumed to be operating out in the Galaxy. Figures are calculated for the Targeted Search and for the Sky Survey parts of the MOP, both planned to be operating in the 1990s. The probability of detection is calculated for the two most powerful transmitters, the planetary radar at Arecibo (Puerto Rico) and the ballistic missile early warning systems (BMEWSs), assuming that the terrestrial radars are only in the eavesdropping mode. It was found that, for the case of a single transmitter within the maximum range, the highest probability is for the sky survey detecting BMEWSs; this is directly proportional to BMEWS sky coverage and is therefore 0.25.

  9. Small-target leak detection for a closed vessel via infrared image sequences

    NASA Astrophysics Data System (ADS)

    Zhao, Ling; Yang, Hongjiu

    2017-03-01

    This paper focuses on a leak diagnosis and localization method based on infrared image sequences. The problems of a high probability of false warnings and the negative effect of marginal information are addressed for leak detection. An experimental model is established for leak diagnosis and localization on infrared image sequences. Differential background prediction based on a kernel regression method is presented to eliminate the negative effect of marginal information on the test vessel. A pipeline filter based on layered voting is designed to reduce the probability of false warnings at the leak point. A synthesized leak diagnosis and localization algorithm is proposed based on infrared image sequences. The effectiveness and potential of the developed techniques are shown through experimental results.

  10. System and method for automated object detection in an image

    DOEpatents

    Kenyon, Garrett T.; Brumby, Steven P.; George, John S.; Paiton, Dylan M.; Schultz, Peter F.

    2015-10-06

    A contour/shape detection model may use relatively simple and efficient kernels to detect target edges in an object within an image or video. A co-occurrence probability may be calculated for two or more edge features in an image or video using an object definition. Edge features may be differentiated between in response to measured contextual support, and prominent edge features may be extracted based on the measured contextual support. The object may then be identified based on the extracted prominent edge features.

  11. Microwave quantum illumination.

    PubMed

    Barzanjeh, Shabir; Guha, Saikat; Weedbrook, Christian; Vitali, David; Shapiro, Jeffrey H; Pirandola, Stefano

    2015-02-27

    Quantum illumination is a quantum-optical sensing technique in which an entangled source is exploited to improve the detection of a low-reflectivity object that is immersed in a bright thermal background. Here, we describe and analyze a system for applying this technique at microwave frequencies, a more appropriate spectral region for target detection than the optical, due to the naturally occurring bright thermal background in the microwave regime. We use an electro-optomechanical converter to entangle microwave signal and optical idler fields, with the former being sent to probe the target region and the latter being retained at the source. The microwave radiation collected from the target region is then phase conjugated and upconverted into an optical field that is combined with the retained idler in a joint-detection quantum measurement. The error probability of this microwave quantum-illumination system, or quantum radar, is shown to be superior to that of any classical microwave radar of equal transmitted energy.
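    The claimed advantage can be made concrete with the Chernoff-bound error probabilities reported in the Gaussian-state quantum illumination literature for the low-brightness, bright-background regime (N_S << 1 << N_B): the classical error exponent scales as κ·N_S/(4·N_B) while the QI exponent scales as κ·N_S/N_B, a factor-of-4 (6 dB) gain. The numerical values below are illustrative, not taken from this paper.

```python
from math import exp

def qi_error_bounds(M, kappa, n_s, n_b):
    """Chernoff-bound error probabilities after M probe modes for classical
    (coherent-state) illumination and quantum illumination, in the regime
    N_S << 1 << N_B. Returns (p_err_classical, p_err_qi)."""
    e_classical = M * kappa * n_s / (4.0 * n_b)   # classical error exponent
    e_qi = M * kappa * n_s / n_b                  # QI exponent: 4x larger
    return 0.5 * exp(-e_classical), 0.5 * exp(-e_qi)
```

    For equal transmitted energy the QI bound decays four times faster in the exponent, which is the sense in which the quantum radar is "superior to any classical microwave radar" here.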

  12. Using a constrained formulation based on probability summation to fit receiver operating characteristic (ROC) curves

    NASA Astrophysics Data System (ADS)

    Swensson, Richard G.; King, Jill L.; Good, Walter F.; Gur, David

    2000-04-01

    A constrained ROC formulation from probability summation is proposed for measuring observer performance in detecting abnormal findings on medical images. This assumes the observer's detection or rating decision on each image is determined by a latent variable that characterizes the specific finding (type and location) considered most likely to be a target abnormality. For positive cases, this 'maximum-suspicion' variable is assumed to be either the value for the actual target or for the most suspicious non-target finding, whichever is the greater (more suspicious). Unlike the usual ROC formulation, this constrained formulation guarantees a 'well-behaved' ROC curve that always equals or exceeds chance-level decisions and cannot exhibit an upward 'hook.' Its estimated parameters specify the accuracy for separating positive from negative cases, and they also predict accuracy in locating or identifying the actual abnormal findings. The present maximum-likelihood procedure (which runs on a PC under Windows 95 or NT) fits this constrained formulation to rating-ROC data using normal distributions with two free parameters. Fits of the conventional and constrained ROC formulations are compared for continuous and discrete-scale ratings of chest films in a variety of detection problems, both for localized lesions (nodules, rib fractures) and for diffuse abnormalities (interstitial disease, infiltrates, or pneumothorax). The two fitted ROC curves are nearly identical unless the conventional ROC has an ill-behaved 'hook' that dips below the constrained ROC.
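    The maximum-suspicion construction can be checked by Monte Carlo: on a positive case the decision variable is the larger of the actual-target score and the most suspicious non-target score, while on a negative case it is the non-target score alone. Because the positive-case variable then stochastically dominates the negative-case one, the empirical ROC cannot dip below the chance line or hook. The normal distributions and the `target_mean` separation are illustrative assumptions, not fitted values.

```python
import numpy as np

def max_suspicion_roc(n=100_000, target_mean=1.5, seed=0):
    """Simulate the 'maximum-suspicion' latent-variable model and return
    (false-positive fractions, true-positive fractions) over a threshold sweep."""
    rng = np.random.default_rng(seed)
    nontarget_pos = rng.standard_normal(n)          # most suspicious non-target, positive case
    target = target_mean + rng.standard_normal(n)   # the actual target finding
    positives = np.maximum(target, nontarget_pos)   # max-suspicion variable
    negatives = rng.standard_normal(n)              # negative cases
    thresholds = np.linspace(-4.0, 6.0, 101)
    tpf = np.array([(positives > t).mean() for t in thresholds])
    fpf = np.array([(negatives > t).mean() for t in thresholds])
    return fpf, tpf
```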

  13. Studies on Radar Sensor Networks

    DTIC Science & Technology

    2007-08-08

    scheme in which a 2-D image was created by adding voltages with the appropriate time offset. Simulation results show that our DCT-based scheme works ... using RSNs in terms of the probability of miss detection PMD and the root mean square error (RMSE). Simulation results showed that multi-target detection ... Simulation results are presented to evaluate the feasibility and effectiveness of the proposed JMIC algorithm in a query surveillance region. 5 SVD-QR and ...

  14. Cueing spatial attention through timing and probability.

    PubMed

    Girardi, Giovanna; Antonucci, Gabriella; Nico, Daniele

    2013-01-01

    Even when focused on an effortful task we retain the ability to detect salient environmental information, and even irrelevant visual stimuli can be automatically detected. However, to what extent unattended information affects attentional control is not fully understood. Here we provide evidence of how the brain spontaneously organizes its cognitive resources by shifting attention between a selective-attending and a stimulus-driven modality within a single task. Using a spatial cueing paradigm we investigated the effect of cue-target asynchronies as a function of their probabilities of occurrence (i.e., relative frequency). Results show that this accessory information modulates attentional shifts. A valid spatial cue improved participants' performance as compared to an invalid one only in trials in which target onset was highly predictable because of its more frequent occurrence. Conversely, cueing proved ineffective when spatial cue and target were associated according to a less frequent asynchrony. These patterns of response depended on the asynchronies' probability and not on their duration. Our findings clearly demonstrate that through fine decision-making, performed trial-by-trial, the brain uses implicit information to decide whether or not to voluntarily shift spatial attention. As if following a cost-planning strategy, the cognitive effort of shifting attention in response to the cue is made only when the expected advantages are higher. In a trade-off competition for cognitive resources, voluntary/automatic attending may thus be a more complex process than expected. Copyright © 2011 Elsevier Ltd. All rights reserved.

  15. Active interrogation using low-energy nuclear reactions

    NASA Astrophysics Data System (ADS)

    Antolak, Arlyn; Doyle, Barney; Leung, Ka-Ngo; Morse, Daniel; Provencio, Paula

    2005-09-01

    High-energy photons and neutrons can be used to interrogate for heavily shielded fissile materials inside sealed cargo containers by detecting their prompt and/or delayed fission signatures. The FIND (Fissmat Inspection for Nuclear Detection) active interrogation system is based on a dual neutron+gamma source that uses low-energy (< 500 keV) proton- or deuteron-induced nuclear reactions to produce high intensities of mono-energetic gamma rays and/or neutrons. The source can be operated in either pulsed (e.g., to detect delayed photofission neutrons and gammas) or continuous (e.g., detecting prompt fission signatures) modes. For the gamma-rays, the source target can be segmented to incorporate different (p,γ) isotopes for producing gamma-rays at selective energies, thereby improving the probability of detection. The design parameters for the FIND system are discussed and preliminary accelerator-based measurements of gamma and neutron yields, background levels, and fission signals for several target materials under consideration are presented.

  16. Small battery operated unattended radar sensor for security systems

    NASA Astrophysics Data System (ADS)

    Plummer, Thomas J.; Brady, Stephen; Raines, Robert

    2013-06-01

    McQ has developed, tested, and is supplying to Unattended Ground Sensor (UGS) customers a new radar sensor. This radar sensor is designed for short range target detection and classification. The design emphasis was to have low power consumption, totally automated operation, a very high probability of detection coupled with a very low false alarm rate, be able to locate and track targets, and have a price compatible with the UGS market. The radar sensor complements traditional UGS sensors by providing solutions for scenarios that are difficult for UGS. The design of this radar sensor and the testing are presented in this paper.

  17. Airborne net-centric multi-INT sensor control, display, fusion, and exploitation systems

    NASA Astrophysics Data System (ADS)

    Linne von Berg, Dale C.; Lee, John N.; Kruer, Melvin R.; Duncan, Michael D.; Olchowski, Fred M.; Allman, Eric; Howard, Grant

    2004-08-01

    The NRL Optical Sciences Division has initiated a multi-year effort to develop and demonstrate an airborne net-centric suite of multi-intelligence (multi-INT) sensors and exploitation systems for real-time target detection and targeting product dissemination. The goal of this Net-centric Multi-Intelligence Fusion Targeting Initiative (NCMIFTI) is to develop an airborne real-time intelligence gathering and targeting system that can be used to detect concealed, camouflaged, and mobile targets. The multi-INT sensor suite will include high-resolution visible/infrared (EO/IR) dual-band cameras, hyperspectral imaging (HSI) sensors in the visible-to-near infrared, short-wave and long-wave infrared (VNIR/SWIR/LWIR) bands, Synthetic Aperture Radar (SAR), electronics intelligence sensors (ELINT), and off-board networked sensors. Other sensors are also being considered for inclusion in the suite to address unique target detection needs. Integrating a suite of multi-INT sensors on a single platform should optimize real-time fusion of the on-board sensor streams, thereby improving the detection probability and reducing the false alarms that occur in reconnaissance systems that use single-sensor types on separate platforms, or that use independent target detection algorithms on multiple sensors. In addition to the integration and fusion of the multi-INT sensors, the effort is establishing an open-systems net-centric architecture that will provide a modular "plug and play" capability for additional sensors and system components and provide distributed connectivity to multiple sites for remote system control and exploitation.

  18. Confidence level estimation in multi-target classification problems

    NASA Astrophysics Data System (ADS)

    Chang, Shi; Isaacs, Jason; Fu, Bo; Shin, Jaejeong; Zhu, Pingping; Ferrari, Silvia

    2018-04-01

    This paper presents an approach for estimating the confidence level in automatic multi-target classification performed by an imaging sensor on an unmanned vehicle. An automatic target recognition algorithm comprised of a deep convolutional neural network in series with a support vector machine classifier detects and classifies targets based on the image matrix. The joint posterior probability mass function of target class, features, and classification estimates is learned from labeled data, and recursively updated as additional images become available. Based on the learned joint probability mass function, the approach presented in this paper predicts the expected confidence level of future target classifications, prior to obtaining new images. The proposed approach is tested with a set of simulated sonar image data. The numerical results show that the estimated confidence level provides a close approximation to the actual confidence level value determined a posteriori, i.e. after the new image is obtained by the on-board sensor. Therefore, the expected confidence level function presented in this paper can be used to adaptively plan the path of the unmanned vehicle so as to optimize the expected confidence levels and ensure that all targets are classified with satisfactory confidence after the path is executed.

  19. Sampling stored product insect pests: a comparison of four statistical sampling models for probability of pest detection

    USDA-ARS?s Scientific Manuscript database

    Statistically robust sampling strategies form an integral component of grain storage and handling activities throughout the world. Developing sampling strategies to target biological pests such as insects in stored grain is inherently difficult due to species biology and behavioral characteristics. ...

  20. Detecting Land-based Signals in the Near-shore Zone of Lake Erie During Summer 2009

    EPA Science Inventory

    We conducted two styles of nearshore surveys in Lake Erie during August to mid-September 2009. The first used a spatially-balanced probability survey (SBS) design to establish discrete stations within a GIS-defined target population, the nearshore zone extending approximately 5 km...

  1. Unmanned Aircraft Systems (UAS) Sensor and Targeting

    DTIC Science & Technology

    2010-07-27

    4.7.1 Objective. The objective of this subtest is to determine the detection performance of the Synthetic Aperture Radar (SAR) with the radar ... Detection; SAR – Synthetic Aperture Radar. 4.7.3 Data Required. Section 5.1 outlines general test data required. The following additional data may ... m – meter; No. – Number; PC – Probability of Classification; SAR – Synthetic Aperture Radar. 4.8.3 Data Required. Section 5.1 outlines ...

  2. Using hyperentanglement to enhance resolution, signal-to-noise ratio, and measurement time

    NASA Astrophysics Data System (ADS)

    Smith, James F.

    2017-03-01

    A hyperentanglement-based atmospheric imaging/detection system involving only a signal and an ancilla photon will be considered for optical and infrared frequencies. Only the signal photon will propagate in the atmosphere and its loss will be classical. The ancilla photon will remain within the sensor experiencing low loss. Closed form expressions for the wave function, normalization, density operator, reduced density operator, symmetrized logarithmic derivative, quantum Fisher information, quantum Cramer-Rao lower bound, coincidence probabilities, probability of detection, probability of false alarm, probability of error after M measurements, signal-to-noise ratio, quantum Chernoff bound, time-on-target expressions related to probability of error, and resolution will be provided. The effect of noise in every mode will be included as well as loss. The system will provide the basic design for an imaging/detection system functioning at optical or infrared frequencies that offers better than classical angular and range resolution. Optimization for enhanced resolution will be included. The signal-to-noise ratio will be increased by a factor equal to the number of modes employed during the hyperentanglement process. Likewise, the measurement time can be reduced by the same factor. The hyperentanglement generator will typically make use of entanglement in polarization, energy-time, orbital angular momentum and so on. Mathematical results will be provided describing the system's performance as a function of loss mechanisms and noise.

  3. Wald Sequential Probability Ratio Test for Space Object Conjunction Assessment

    NASA Technical Reports Server (NTRS)

    Carpenter, James R.; Markley, F Landis

    2014-01-01

    This paper shows how satellite owner/operators may use sequential estimates of collision probability, along with a prior assessment of the base risk of collision, in a compound hypothesis ratio test to inform decisions concerning collision risk mitigation maneuvers. The compound hypothesis test reduces to a simple probability ratio test, which appears to be a novel result. The test satisfies tolerances related to targeted false alarm and missed detection rates. This result is independent of the method one uses to compute the probability density that one integrates to compute collision probability. A well-established test case from the literature shows that this test yields acceptable results within the constraints of a typical operational conjunction assessment decision timeline. Another example illustrates the use of the test in a practical conjunction assessment scenario based on operations of the International Space Station.
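    The sequential test's decision structure can be sketched with Wald's classic thresholds for targeted false-alarm rate α and missed-detection rate β; the paper's compound hypothesis test reduces to a probability ratio compared against bounds of this form. This is a generic textbook sketch, not the paper's exact test statistic.

```python
from math import log

def sprt_thresholds(alpha, beta):
    """Wald's lower and upper log-likelihood-ratio bounds for targeted
    false-alarm rate alpha and missed-detection rate beta."""
    return log(beta / (1.0 - alpha)), log((1.0 - beta) / alpha)

def sprt_decide(log_likelihood_ratio, lo, hi):
    # keep sampling until the accumulated log likelihood ratio leaves (lo, hi)
    if log_likelihood_ratio <= lo:
        return "accept H0"   # risk acceptable: no maneuver needed
    if log_likelihood_ratio >= hi:
        return "accept H1"   # collision risk: plan a mitigation maneuver
    return "continue"
```

    Each new collision-probability estimate updates the accumulated ratio, and the operator acts only when a bound is crossed, which is what keeps the test within an operational decision timeline.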

  4. Incorporation of operator knowledge for improved HMDS GPR classification

    NASA Astrophysics Data System (ADS)

    Kennedy, Levi; McClelland, Jessee R.; Walters, Joshua R.

    2012-06-01

    The Husky Mine Detection System (HMDS) detects and alerts operators to potential threats observed in ground-penetrating radar (GPR) data. In the current system architecture, the classifiers have been trained using available data from multiple training sites. Changes in target types, clutter types, and operational conditions may result in statistical differences between the training data and the testing data for the underlying features used by the classifier, potentially resulting in an increased false alarm rate or a lower probability of detection for the system. In the current mode of operation, the automated detection system alerts the human operator when a target-like object is detected. The operator then uses data visualization software, contextual information, and human intuition to decide whether the alarm presented is an actual target or a false alarm. When the statistics of the training data and the testing data are mismatched, the automated detection system can overwhelm the analyst with an excessive number of false alarms. This is evident in the performance of, and the data collected from, deployed systems. This work demonstrates that analyst feedback can be successfully used to re-train a classifier to account for variable testing data statistics not originally captured in the initial training data.

  5. Metacognitive monitoring and control in visual change detection: Implications for situation awareness and cognitive control

    PubMed Central

    McAnally, Ken I.; Morris, Adam P.; Best, Christopher

    2017-01-01

    Metacognitive monitoring and control of situation awareness (SA) are important for a range of safety-critical roles (e.g., air traffic control, military command and control). We examined the factors affecting these processes using a visual change detection task that included representative tactical displays. SA was assessed by asking novice observers to detect changes to a tactical display. Metacognitive monitoring was assessed by asking observers to estimate the probability that they would correctly detect a change, either after study of the display and before the change (judgement of learning; JOL) or after the change and detection response (judgement of performance; JOP). In Experiment 1, observers failed to detect some changes to the display, indicating imperfect SA, but JOPs were reasonably well calibrated to objective performance. Experiment 2 examined JOLs and JOPs in two task contexts: with study-time limits imposed by the task or with self-pacing to meet specified performance targets. JOPs were well calibrated in both conditions as were JOLs for high performance targets. In summary, observers had limited SA, but good insight about their performance and learning for high performance targets and allocated study time appropriately. PMID:28915244

  6. Tracking Object Existence From an Autonomous Patrol Vehicle

    NASA Technical Reports Server (NTRS)

    Wolf, Michael; Scharenbroich, Lucas

    2011-01-01

    An autonomous vehicle patrols a large region, during which an algorithm receives measurements of detected potential objects within its sensor range. The goal of the algorithm is to track all objects in the region over time. This problem differs from traditional multi-target tracking scenarios because the region of interest is much larger than the sensor range and relies on the movement of the sensor through this region for coverage. The goal is to know whether anything has changed between visits to the same location. In particular, two kinds of alert conditions must be detected: (1) a previously detected object has disappeared and (2) a new object has appeared in a location already checked. For the time an object is within sensor range, the object can be assumed to remain stationary, changing position only between visits. The problem is difficult because the upstream object detection processing is likely to make many errors, resulting in heavy clutter (false positives) and missed detections (false negatives), and because only noisy, bearings-only measurements are available. This work has three main goals: (1) Associate incoming measurements with known objects or mark them as new objects or false positives, as appropriate. For this, a multiple hypothesis tracker was adapted to this scenario. (2) Localize the objects using multiple bearings-only measurements to provide estimates of global position (e.g., latitude and longitude). A nonlinear Kalman filter extension provides these 2D position estimates using the 1D measurements. (3) Calculate the probability that a suspected object truly exists (in the estimated position), and determine whether alert conditions have been triggered (for new objects or disappeared objects). The concept of a probability of existence was created, and a new Bayesian method for updating this probability at each time step was developed. 
A probabilistic multiple hypothesis approach is chosen because of its superiority in handling the uncertainty arising from errors in sensors and upstream processes. However, traditional target tracking methods typically assume a stationary detection volume of interest, whereas in this case, one must make adjustments for being able to see only a small portion of the region of interest and understand when an alert situation has occurred. To track object existence inside and outside the vehicle's sensor range, a probability of existence was defined for each hypothesized object, and this value was updated at every time step in a Bayesian manner based on expected characteristics of the sensor and object and whether that object has been detected in the most recent time step. This value then feeds into a sequential probability ratio test (SPRT) to determine the status of the object (suspected, confirmed, or deleted). Alerts are sent upon selected status transitions. Additionally, in order to track objects that move in and out of sensor range and to update the probability of existence appropriately, a variable probability of detection has been defined and the hypothesis probability equations have been re-derived to accommodate this change. Unsupervised object tracking is a pervasive issue in automated perception systems. This work could apply to any mobile platform (ground vehicle, sea vessel, air vehicle, or orbiter) that intermittently revisits regions of interest and needs to determine whether anything interesting has changed.
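    The Bayesian existence update can be sketched as a single-step posterior given whether the object was detected on the current pass. The detection probability `p_d` (possibly range-dependent, per the variable-detection extension) and false-alarm rate `p_fa` are illustrative values, not the system's; the sketch simply shows the update direction in each case.

```python
def update_existence(p_exist, detected, p_d=0.7, p_fa=0.05):
    """One Bayesian update of the probability that a hypothesized object
    exists. A detection raises the probability; a miss lowers it. Outside
    sensor range, pass p_d = 0 and p_fa = 0 so a miss leaves it unchanged."""
    if detected:
        num = p_d * p_exist
        den = p_d * p_exist + p_fa * (1.0 - p_exist)
    else:
        num = (1.0 - p_d) * p_exist
        den = (1.0 - p_d) * p_exist + (1.0 - p_fa) * (1.0 - p_exist)
    return num / den
```

    The updated value would then feed a sequential probability ratio test of the kind described above to move the object between suspected, confirmed, and deleted states.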

  7. Optimal Attack Strategies Subject to Detection Constraints Against Cyber-Physical Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Yuan; Kar, Soummya; Moura, Jose M. F.

    This paper studies an attacker against a cyberphysical system (CPS) whose goal is to move the state of a CPS to a target state while ensuring that his or her probability of being detected does not exceed a given bound. The attacker’s probability of being detected is related to the nonnegative bias induced by his or her attack on the CPS’s detection statistic. We formulate a linear quadratic cost function that captures the attacker’s control goal and establish constraints on the induced bias that reflect the attacker’s detection-avoidance objectives. When the attacker is constrained to be detected at the false-alarm rate of the detector, we show that the optimal attack strategy reduces to a linear feedback of the attacker’s state estimate. In the case that the attacker’s bias is upper bounded by a positive constant, we provide two algorithms – an optimal algorithm and a sub-optimal, less computationally intensive algorithm – to find suitable attack sequences. Lastly, we illustrate our attack strategies in numerical examples based on a remotely-controlled helicopter under attack.

  8. Optimal Attack Strategies Subject to Detection Constraints Against Cyber-Physical Systems

    DOE PAGES

    Chen, Yuan; Kar, Soummya; Moura, Jose M. F.

    2017-03-31

    This paper studies an attacker against a cyberphysical system (CPS) whose goal is to move the state of a CPS to a target state while ensuring that his or her probability of being detected does not exceed a given bound. The attacker’s probability of being detected is related to the nonnegative bias induced by his or her attack on the CPS’s detection statistic. We formulate a linear quadratic cost function that captures the attacker’s control goal and establish constraints on the induced bias that reflect the attacker’s detection-avoidance objectives. When the attacker is constrained to be detected at the false-alarm rate of the detector, we show that the optimal attack strategy reduces to a linear feedback of the attacker’s state estimate. In the case that the attacker’s bias is upper bounded by a positive constant, we provide two algorithms – an optimal algorithm and a sub-optimal, less computationally intensive algorithm – to find suitable attack sequences. Lastly, we illustrate our attack strategies in numerical examples based on a remotely-controlled helicopter under attack.

  9. Performance Evaluation of Target Detection with a Near-Space Vehicle-Borne Radar in Blackout Condition.

    PubMed

    Li, Yanpeng; Li, Xiang; Wang, Hongqiang; Deng, Bin; Qin, Yuliang

    2016-01-06

    Radar is a very important sensor in surveillance applications. Near-space vehicle-borne radar (NSVBR) is a novel installation of a radar system, which offers many benefits, like being highly suited to the remote sensing of extremely large areas, having a rapidly deployable capability and having low vulnerability to electronic countermeasures. Unfortunately, a target detection challenge arises because of complicated scenarios, such as nuclear blackout, rain attenuation, etc. In these cases, extra care is needed to evaluate the detection performance in blackout situations, since this is a classical problem in the application of an NSVBR. However, the existing evaluation measures are the probability of detection and the receiver operating characteristic (ROC) curve, which cannot offer detailed information in such a complicated application. This work focuses on such requirements. We first investigate the effect of blackout on an electromagnetic wave. Performance evaluation indexes are then built: three evaluation indexes on the detection capability and two evaluation indexes on the robustness of the detection process. Simulation results show that the proposed measures offer information on the detailed performance of detection. These measures are therefore very useful in detecting the target of interest in a remote sensing system and are helpful for both NSVBR designers and users.

  10. Performance Evaluation of Target Detection with a Near-Space Vehicle-Borne Radar in Blackout Condition

    PubMed Central

    Li, Yanpeng; Li, Xiang; Wang, Hongqiang; Deng, Bin; Qin, Yuliang

    2016-01-01

    Radar is a very important sensor in surveillance applications. Near-space vehicle-borne radar (NSVBR) is a novel installation of a radar system that offers many benefits: it is highly suited to the remote sensing of extremely large areas, rapidly deployable, and has low vulnerability to electronic countermeasures. Unfortunately, a target detection challenge arises in complicated scenarios, such as nuclear blackout, rain attenuation, etc. In these cases, extra care is needed to evaluate detection performance in blackout situations, since this is a classical problem in the application of an NSVBR. However, the existing evaluation measures are the probability of detection and the receiver operating characteristic (ROC) curve, which cannot offer detailed information in such a complicated application. This work focuses on such requirements. We first investigate the effect of blackout on an electromagnetic wave. Performance evaluation indexes are then built: three evaluation indexes on the detection capability and two on the robustness of the detection process. Simulation results show that the proposed measures offer detailed information on detection performance. These measures are therefore very useful in detecting the target of interest in a remote sensing system and are helpful for both NSVBR designers and users. PMID:26751445

  11. The Efficacy of Multiparametric Magnetic Resonance Imaging and Magnetic Resonance Imaging Targeted Biopsy in Risk Classification for Patients with Prostate Cancer on Active Surveillance.

    PubMed

    Recabal, Pedro; Assel, Melissa; Sjoberg, Daniel D; Lee, Daniel; Laudone, Vincent P; Touijer, Karim; Eastham, James A; Vargas, Hebert A; Coleman, Jonathan; Ehdaie, Behfar

    2016-08-01

    We determined whether multiparametric magnetic resonance imaging targeted biopsies may replace systematic biopsies to detect higher grade prostate cancer (Gleason score 7 or greater) and whether biopsy may be avoided based on multiparametric magnetic resonance imaging among men with Gleason 3+3 prostate cancer on active surveillance. We identified men with previously diagnosed Gleason score 3+3 prostate cancer on active surveillance who underwent multiparametric magnetic resonance imaging and a followup prostate biopsy. Suspicion for higher grade cancer was scored on a standardized 5-point scale. All patients underwent a systematic biopsy. Patients with multiparametric magnetic resonance imaging regions of interest also underwent magnetic resonance imaging targeted biopsy. The detection rate of higher grade cancer was estimated for different multiparametric magnetic resonance imaging scores with the 3 biopsy strategies of systematic, magnetic resonance imaging targeted and combined. Of 206 consecutive men on active surveillance 135 (66%) had a multiparametric magnetic resonance imaging region of interest. Overall, higher grade cancer was detected in 72 (35%) men. A higher multiparametric magnetic resonance imaging score was associated with an increased probability of detecting higher grade cancer (Wilcoxon-type trend test p <0.0001). Magnetic resonance imaging targeted biopsy detected higher grade cancer in 23% of men. Magnetic resonance imaging targeted biopsy alone missed higher grade cancers in 17%, 12% and 10% of patients with multiparametric magnetic resonance imaging scores of 3, 4 and 5, respectively. Magnetic resonance imaging targeted biopsies increased the detection of higher grade cancer among men on active surveillance compared to systematic biopsy alone. However, a clinically relevant proportion of higher grade cancer was detected using only systematic biopsy. 
Despite the improved detection of disease progression using magnetic resonance imaging targeted biopsy, systematic biopsy cannot be excluded as part of surveillance for men with low risk prostate cancer. Copyright © 2016 American Urological Association Education and Research, Inc. Published by Elsevier Inc. All rights reserved.

  12. The Moving Group Targets of the SEEDS High-Contrast Imaging Survey of Exoplanets and Disks: Results and Observations from the First Three Years

    NASA Technical Reports Server (NTRS)

    Brandt, Timothy D.; Kuzuhara, Masayuki; McElwain, Michael W.; Schlieder, Joshua E.; Wisniewski, John P.; Turner, Edwin L.; Carson, J.; Matsuo, T.; Biller, B.; Bonnefoy, M.; et al.

    2014-01-01

    We present results from the first three years of observations of moving group (MG) targets in the Strategic Exploration of Exoplanets and Disks with Subaru (SEEDS) high-contrast imaging survey of exoplanets and disks using the Subaru telescope. We achieve typical contrasts of ∼10^5 at 1" and ∼10^6 beyond 2" around 63 proposed members of nearby kinematic MGs. We review each of the kinematic associations to which our targets belong, concluding that five, β Pictoris (∼20 Myr), AB Doradus (∼100 Myr), Columba (∼30 Myr), Tucana-Horologium (∼30 Myr), and TW Hydrae (∼10 Myr), are sufficiently well-defined to constrain the ages of individual targets. Somewhat less than half of our targets are high-probability members of one of these MGs. For all of our targets, we combine proposed MG membership with other age indicators where available, including Ca II HK emission, X-ray activity, and rotation period, to produce a posterior probability distribution of age. SEEDS observations discovered a substellar companion to one of our targets, κ And, a late B star. We do not detect any other substellar companions, but do find seven new close binary systems, of which one still needs to be confirmed. A detailed analysis of the statistics of this sample, and of the companion mass constraints given our age probability distributions and exoplanet cooling models, will be presented in a forthcoming paper.

  13. The moving group targets of the SEEDS high-contrast imaging survey of exoplanets and disks: Results and observations from the first three years

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brandt, Timothy D.; Turner, Edwin L.; Janson, M.

    2014-05-01

    We present results from the first three years of observations of moving group (MG) targets in the Strategic Exploration of Exoplanets and Disks with Subaru (SEEDS) high-contrast imaging survey of exoplanets and disks using the Subaru telescope. We achieve typical contrasts of ∼10^5 at 1'' and ∼10^6 beyond 2'' around 63 proposed members of nearby kinematic MGs. We review each of the kinematic associations to which our targets belong, concluding that five, β Pictoris (∼20 Myr), AB Doradus (∼100 Myr), Columba (∼30 Myr), Tucana-Horologium (∼30 Myr), and TW Hydrae (∼10 Myr), are sufficiently well-defined to constrain the ages of individual targets. Somewhat less than half of our targets are high-probability members of one of these MGs. For all of our targets, we combine proposed MG membership with other age indicators where available, including Ca II HK emission, X-ray activity, and rotation period, to produce a posterior probability distribution of age. SEEDS observations discovered a substellar companion to one of our targets, κ And, a late B star. We do not detect any other substellar companions, but do find seven new close binary systems, of which one still needs to be confirmed. A detailed analysis of the statistics of this sample, and of the companion mass constraints given our age probability distributions and exoplanet cooling models, will be presented in a forthcoming paper.

  14. Ultra-cool dwarfs viewed equator-on: surveying the best host stars for biosignature detection in transiting exoplanets

    NASA Astrophysics Data System (ADS)

    Miles-Paez, Paulo; Metchev, Stanimir; Burgasser, Adam; Apai, Daniel; Palle, Enric; Zapatero Osorio, Maria Rosa; Artigau, Etienne; Mace, Greg; Tannock, Megan; Triaud, Amaury

    2018-05-01

    There are about 150 known planets around M dwarfs, but only one system around an ultra-cool (>M7) dwarf: TRAPPIST-1. Ultra-cool dwarfs are arguably the most promising hosts for atmospheric and biosignature detection in transiting planets because of the enhanced feature contrast in transit and eclipse spectroscopy. We propose a Spitzer survey to continuously monitor 15 of the brightest ultra-cool dwarfs over 3 days. To maximize the probability of detecting transiting planets, we have selected only targets seen close to equator-on. Spin-orbit alignment expectations dictate that the planetary systems around these ultra-cool dwarfs should also be oriented nearly edge-on. Any planet detections from this survey will immediately become top priority targets for JWST transit spectroscopy. No other telescope, present or within the foreseeable future, will be able to conduct a similarly sensitive and dedicated survey for characterizable Earth analogs.

  15. Estimating site occupancy, colonization, and local extinction when a species is detected imperfectly

    USGS Publications Warehouse

    MacKenzie, D.I.; Nichols, J.D.; Hines, J.E.; Knutson, M.G.; Franklin, A.B.

    2003-01-01

    Few species are likely to be so evident that they will always be detected when present. Failing to allow for the possibility that a target species was present, but undetected, at a site will lead to biased estimates of site occupancy, colonization, and local extinction probabilities. These population vital rates are often of interest in long-term monitoring programs and metapopulation studies. We present a model that enables direct estimation of these parameters when the probability of detecting the species is less than 1. The model does not require any assumptions of process stationarity, as do some previous methods, but does require detection/nondetection data to be collected in a manner similar to Pollock's robust design as used in mark-recapture studies. Via simulation, we show that the model provides good estimates of parameters for most scenarios considered. We illustrate the method with data from monitoring programs of Northern Spotted Owls (Strix occidentalis caurina) in northern California and tiger salamanders (Ambystoma tigrinum) in Minnesota, USA.
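
A minimal sketch of the kind of likelihood such occupancy models maximize, assuming a single-season design with constant occupancy ψ and detection probability p (the grid-search fit and toy data below are illustrative, not the authors' estimator):

```python
import math
from itertools import product

def occupancy_loglik(psi, p, histories):
    """Single-season occupancy log-likelihood with imperfect detection.

    histories: list of (n_detections, n_surveys) per site. Sites with at
    least one detection are certainly occupied; all-zero sites may be
    occupied-but-undetected or truly unoccupied."""
    ll = 0.0
    for d, K in histories:
        if d > 0:
            ll += math.log(psi) + d * math.log(p) + (K - d) * math.log(1 - p)
        else:
            ll += math.log(psi * (1 - p) ** K + (1 - psi))
    return ll

def fit_grid(histories, steps=99):
    """Crude grid-search MLE over (psi, p) on (0, 1) x (0, 1)."""
    grid = [(i + 1) / (steps + 1) for i in range(steps)]
    return max(product(grid, grid),
               key=lambda t: occupancy_loglik(t[0], t[1], histories))

# Toy data: 60 sites detected on 2 of 3 surveys, 40 sites never detected.
# The naive occupancy estimate is 0.60; the MLE exceeds it because some
# all-zero sites were probably occupied but missed.
histories = [(2, 3)] * 60 + [(0, 3)] * 40
psi_hat, p_hat = fit_grid(histories)
```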

  16. Single- and multiple-pulse noncoherent detection statistics associated with partially developed speckle.

    PubMed

    Osche, G R

    2000-08-20

    Single- and multiple-pulse detection statistics are presented for aperture-averaged direct detection optical receivers operating against partially developed speckle fields. A partially developed speckle field arises when the probability density function of the received intensity does not follow negative exponential statistics. The case of interest here is the target surface that exhibits diffuse as well as specular components in the scattered radiation. An approximate expression is derived for the integrated intensity at the aperture, which leads to single- and multiple-pulse discrete probability density functions for the case of a Poisson signal in Poisson noise with an additive coherent component. In the absence of noise, the single-pulse discrete density function is shown to reduce to a generalized negative binomial distribution. The radar concept of integration loss is discussed in the context of direct detection optical systems where it is shown that, given an appropriate set of system parameters, multiple-pulse processing can be more efficient than single-pulse processing over a finite range of the integration parameter n.
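
For orientation, in the limiting case of fully developed speckle (no specular component), aperture averaging over M coherence cells yields the classical negative binomial photocount distribution; the generalized negative binomial distribution derived in the paper extends this to a partially developed field. A sketch of the classical case, with illustrative parameters:

```python
import math

def negbin_pmf(k, mean, M):
    """Photocount pmf for fully developed speckle averaged over M
    coherence cells (Mandel's formula, M a positive integer here):
    P(k) = C(k+M-1, k) * (1 + M/mean)^(-k) * (1 + mean/M)^(-M)."""
    return (math.comb(k + M - 1, k)
            * (1 + M / mean) ** (-k)
            * (1 + mean / M) ** (-M))

# M = 1 reduces to Bose-Einstein counts (negative exponential intensity);
# large M approaches Poisson, i.e., the speckle is averaged out.
pmf = [negbin_pmf(k, mean=3.0, M=4) for k in range(200)]
```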

  17. Target Detection Routine (TADER). User’s Guide.

    DTIC Science & Technology

    1987-09-01

    o System range capability subset (one record - omitted for standoff SLAR and penetrating system)
    o System inherent detection probability subset (IELT records, i.e., one per element type)
    o System capability modifier subset/A=1, E=1 (IELT records)
    o System capability modifier subset/A=1, E=2 (IELT records)
    o System capability modifier subset/A=2, E=1 (IELT records)
    o System capability modifier subset/A=2, E=2 (IELT records)
    Unit Data Set (one set

  18. Kβ to Kα X-ray intensity ratios and K to L shell vacancy transfer probabilities of Co, Ni, Cu, and Zn

    NASA Astrophysics Data System (ADS)

    Anand, L. F. M.; Gudennavar, S. B.; Bubbly, S. G.; Kerur, B. R.

    2015-12-01

    The K to L shell total vacancy transfer probabilities of low Z elements Co, Ni, Cu, and Zn are estimated by measuring the Kβ to Kα intensity ratio adopting the 2π-geometry. The target elements were excited by 32.86 keV barium K-shell X-rays from a weak 137Cs γ-ray source. The emitted K-shell X-rays were detected using a low energy HPGe X-ray detector coupled to a 16 k MCA. The measured intensity ratios and the total vacancy transfer probabilities are compared with theoretical results and others' work, establishing a good agreement.

  19. Striatal activity is modulated by target probability.

    PubMed

    Hon, Nicholas

    2017-06-14

    Target probability has well-known neural effects. In the brain, target probability is known to affect frontal activity, with lower probability targets producing more prefrontal activation than those that occur with higher probability. Although the effect of target probability on cortical activity is well specified, its effect on subcortical structures such as the striatum is less well understood. Here, I examined this issue and found that the striatum was highly responsive to target probability. This is consistent with its hypothesized role in the gating of salient information into higher-order task representations. The current data are interpreted in light of the fact that different components of the striatum are sensitive to different types of task-relevant information.

  20. Common IED exploitation target set ontology

    NASA Astrophysics Data System (ADS)

    Russomanno, David J.; Qualls, Joseph; Wowczuk, Zenovy; Franken, Paul; Robinson, William

    2010-04-01

    The Common IED Exploitation Target Set (CIEDETS) ontology provides a comprehensive semantic data model for capturing knowledge about sensors, platforms, missions, environments, and other aspects of systems under test. The ontology also includes representative IEDs; modeled as explosives, camouflage, concealment objects, and other background objects, which comprise an overall threat scene. The ontology is represented using the Web Ontology Language and the SPARQL Protocol and RDF Query Language, which ensures portability of the acquired knowledge base across applications. The resulting knowledge base is a component of the CIEDETS application, which is intended to support the end user sensor test and evaluation community. CIEDETS associates a system under test to a subset of cataloged threats based on the probability that the system will detect the threat. The associations between systems under test, threats, and the detection probabilities are established based on a hybrid reasoning strategy, which applies a combination of heuristics and simplified modeling techniques. Besides supporting the CIEDETS application, which is focused on efficient and consistent system testing, the ontology can be leveraged in a myriad of other applications, including serving as a knowledge source for mission planning tools.

  1. Ensemble learning and model averaging for material identification in hyperspectral imagery

    NASA Astrophysics Data System (ADS)

    Basener, William F.

    2017-05-01

    In this paper we present a method for identifying the material contained in a pixel or region of pixels in a hyperspectral image. An identification process can be performed on a spectrum from pixels that have been pre-determined to be of interest, generally by comparing the spectrum from the image to spectra in an identification library. The metric for comparison used in this paper is a Bayesian probability for each material. This probability can be computed either from Bayes' theorem applied to normal distributions for each library spectrum or using model averaging. Using probabilities has the advantage that they can be summed over the spectra of any material class to obtain a class probability. For example, the probability that the spectrum of interest is a fabric is equal to the sum of all probabilities for fabric spectra in the library. We can do the same to determine the probability for a specific type of fabric, or any level of specificity contained in our library. Probabilities not only tell us which material is most likely, they tell us how confident we can be in the material's presence; a probability close to 1 indicates near certainty of the presence of a material in the given class, and a probability close to 0.5 indicates that we cannot know whether the material is present at the given level of specificity. This is much more informative than a detection score from a target detection algorithm or a label from a classification algorithm. In this paper we present results in the form of a hierarchical tree with probabilities for each node. We use Forest Radiance imagery with 159 bands.
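
A minimal sketch of the summation idea, assuming a flat prior and an isotropic Gaussian likelihood per library spectrum (the two-band spectra, class names, and sigma are invented for illustration; the paper's model-averaging variant is not shown):

```python
import math

def posterior_over_library(spectrum, library, sigma=1.0):
    """Bayes' theorem with an isotropic-Gaussian likelihood per library
    spectrum and a flat prior: p(m|x) ∝ exp(-||x - mu_m||^2 / (2 sigma^2))."""
    logs = {}
    for name, mu in library.items():
        d2 = sum((a - b) ** 2 for a, b in zip(spectrum, mu))
        logs[name] = -d2 / (2 * sigma ** 2)
    m = max(logs.values())
    w = {k: math.exp(v - m) for k, v in logs.items()}  # numerically stable
    z = sum(w.values())
    return {k: v / z for k, v in w.items()}

def class_probability(post, class_members):
    # Class probability = sum of the member spectra's probabilities.
    return sum(post[m] for m in class_members)

# Hypothetical two-band library with two fabrics and one paint.
library = {"fabric_a": [0.2, 0.8], "fabric_b": [0.3, 0.7], "paint_a": [0.9, 0.1]}
post = posterior_over_library([0.25, 0.75], library)
p_fabric = class_probability(post, ["fabric_a", "fabric_b"])
```

The same summation applies at any node of a hierarchical material tree, which is what makes class-level confidences drop out of the per-spectrum posteriors for free.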

  2. Integrating occupancy modeling and interview data for corridor identification: A case study for jaguars in Nicaragua

    USGS Publications Warehouse

    Zeller, K.A.; Nijhawan, S.; Salom-Perez, R.; Potosme, S.H.; Hines, J.E.

    2011-01-01

    Corridors are critical elements in the long-term conservation of wide-ranging species like the jaguar (Panthera onca). Jaguar corridors across the range of the species were initially identified using a GIS-based least-cost corridor model. However, due to inherent errors in remotely sensed data and model uncertainties, these corridors warrant field verification before conservation efforts can begin. We developed a novel corridor assessment protocol based on interview data and site occupancy modeling. We divided our pilot study area, in southeastern Nicaragua, into 71 6 × 6 km sampling units and conducted 160 structured interviews with local residents. Interviews were designed to collect data on jaguar and seven prey species so that detection/non-detection matrices could be constructed for each sampling unit. Jaguars were reportedly detected in 57% of the sampling units and had a detection probability of 28%. With the exception of white-lipped peccary, prey species were reportedly detected in 82-100% of the sampling units. Though the use of interview data may violate some assumptions of the occupancy modeling approach for determining 'proportion of area occupied', we countered these shortcomings through study design and interpreting the occupancy parameter, psi, as 'probability of habitat used'. Probability of habitat use was modeled for each target species using single state or multistate models. A combination of the estimated probabilities of habitat use for jaguar and prey was selected to identify the final jaguar corridor. This protocol provides an efficient field methodology for identifying corridors for easily-identifiable species, across large study areas comprised of unprotected, private lands. © 2010 Elsevier Ltd.

  3. Power allocation for target detection in radar networks based on low probability of intercept: A cooperative game theoretical strategy

    NASA Astrophysics Data System (ADS)

    Shi, Chenguang; Salous, Sana; Wang, Fei; Zhou, Jianjiang

    2017-08-01

    Distributed radar network systems have been shown to have many unique features. Owing to their signal and spatial diversity, radar networks are attractive for target detection. In practice, the netted radars in radar networks are expected to maximize their transmit power to achieve better detection performance, which may be in contradiction with low probability of intercept (LPI). Therefore, this paper investigates the problem of adaptive power allocation for radar networks in a cooperative game-theoretic framework such that the LPI performance can be improved. Taking into consideration both the transmit power constraints and the minimum signal to interference plus noise ratio (SINR) requirement of each radar, a cooperative Nash bargaining power allocation game based on LPI is formulated, whose objective is to minimize the total transmit power by optimizing the power allocation in radar networks. First, a novel SINR-based network utility function is defined and utilized as a metric to evaluate power allocation. Then, with the well-designed network utility function, the existence and uniqueness of the Nash bargaining solution are proved analytically. Finally, an iterative Nash bargaining algorithm is developed that converges quickly to a Pareto optimal equilibrium for the cooperative game. Numerical simulations and theoretical analysis are provided to evaluate the effectiveness of the proposed algorithm.
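
The paper's Nash bargaining algorithm is not reproduced here, but the flavor of minimum-power allocation under per-radar SINR constraints can be sketched with the classical Foschini-Miljanic power-control iteration, which also converges to the smallest powers meeting the SINR targets (the gain matrix, SINR targets, and noise levels below are illustrative, and the bargaining/utility machinery is omitted):

```python
def power_control(gains, gamma, noise, iters=200):
    """Foschini-Miljanic-style iteration toward the minimum powers
    meeting each node's SINR target: p_i <- gamma_i * I_i / g_ii,
    where I_i is noise plus interference seen at receiver i."""
    n = len(gamma)
    p = [1.0] * n
    for _ in range(iters):
        new = []
        for i in range(n):
            interference = noise[i] + sum(gains[i][j] * p[j]
                                          for j in range(n) if j != i)
            new.append(gamma[i] * interference / gains[i][i])
        p = new
    return p

# Two-node example: gains[i][j] is the gain from transmitter j to receiver i.
gains = [[1.0, 0.1], [0.2, 1.0]]
p = power_control(gains, gamma=[2.0, 2.0], noise=[0.1, 0.1])
```

At convergence each node meets its SINR target with equality, which is exactly the minimum-total-power operating point for a feasible system.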

  4. A new FOD recognition algorithm based on multi-source information fusion and experiment analysis

    NASA Astrophysics Data System (ADS)

    Li, Yu; Xiao, Gang

    2011-08-01

    Foreign Object Debris (FOD) is any substance, debris, or article alien to an aircraft or system that could potentially cause severe damage when it appears on an airport runway. Given the airport's complex circumstances, quick and precise detection of FOD targets on the runway is an important protection for aircraft safety. A multi-sensor system including millimeter-wave radar and infrared (IR) image sensors is introduced, and a newly developed FOD detection and recognition algorithm based on inherent features of FOD is proposed in this paper. Firstly, the FOD's location and coordinates are accurately obtained by the millimeter-wave radar, and then, according to the coordinates, the IR camera takes target images and background images. Secondly, the runway's edges, which are straight lines, are extracted from the IR image using the Hough transform. The potential target region, that is, the runway region, can then be segmented from the whole image. Thirdly, background subtraction is utilized to localize the FOD target in the runway region. Finally, in the detailed small images of the FOD target, a new characteristic is discussed and used in target classification. The experimental results show that this algorithm can effectively reduce computational complexity, satisfy the real-time requirement, and achieve high detection and recognition probability.

  5. Rapid Target Detection in High Resolution Remote Sensing Images Using Yolo Model

    NASA Astrophysics Data System (ADS)

    Wu, Z.; Chen, X.; Gao, Y.; Li, Y.

    2018-04-01

    Object detection in high resolution remote sensing images is a fundamental and challenging problem in the field of remote sensing imagery analysis for civil and military applications, due to the complex neighboring environments, which can cause recognition algorithms to mistake irrelevant ground objects for target objects. The deep convolutional neural network (DCNN) is the hotspot in object detection for its powerful ability of feature extraction, and has achieved state-of-the-art results in computer vision. The common pipeline of object detection based on a DCNN consists of region proposal, CNN feature extraction, region classification, and post-processing. The YOLO model instead frames object detection as a regression problem: a single CNN predicts bounding boxes and class probabilities in an end-to-end way, making prediction faster. In this paper, a YOLO-based model is used for object detection in high resolution remote sensing images. Experiments on the NWPU VHR-10 dataset and our airport/airplane dataset obtained from Google Earth show that, compared with the common pipeline, the proposed model speeds up the detection process while maintaining good accuracy.

  6. A generic nuclei detection method for histopathological breast images

    NASA Astrophysics Data System (ADS)

    Kost, Henning; Homeyer, André; Bult, Peter; Balkenhol, Maschenka C. A.; van der Laak, Jeroen A. W. M.; Hahn, Horst K.

    2016-03-01

    The detection of cell nuclei plays a key role in various histopathological image analysis problems. Considering the high variability of its applications, we propose a novel, generic, and trainable detection approach. Adaptation to specific nuclei detection tasks is done by providing training samples. A trainable deconvolution and classification algorithm is used to generate a probability map indicating the presence of a nucleus. The map is processed by an extended watershed segmentation step to identify the nuclei positions. We have tested our method on data sets with different stains and target nuclear types. We obtained F1-measures between 0.83 and 0.93.

  7. Target Tracking Using SePDAF under Ambiguous Angles for Distributed Array Radar.

    PubMed

    Long, Teng; Zhang, Honggang; Zeng, Tao; Chen, Xinliang; Liu, Quanhua; Zheng, Le

    2016-09-09

    Distributed array radar can improve radar detection capability and measurement accuracy. However, it suffers cyclic ambiguity in its angle estimates because, by the spatial Nyquist sampling theorem, the large sparse array is undersampled. Consequently, the state estimation accuracy and track validity probability degrade when the ambiguous angles are used directly for target tracking. This paper proposes a second probability data association filter (SePDAF)-based tracking method for distributed array radar. Firstly, the target motion model and radar measurement model are built. Secondly, the fusion of each radar's estimates is fed to an extended Kalman filter (EKF) to complete the first filtering stage. Thirdly, taking this result as prior knowledge and associating it with the array-processed ambiguous angles, the SePDAF is applied to accomplish the second filtering stage, achieving a highly accurate and stable trajectory with relatively low computational complexity. Moreover, the azimuth filtering accuracy is promoted dramatically and the position filtering accuracy also improves. Finally, simulations illustrate the effectiveness of the proposed method.

  8. Intensity information extraction in Geiger mode detector array based three-dimensional imaging applications

    NASA Astrophysics Data System (ADS)

    Wang, Fei

    2013-09-01

    Geiger-mode detectors have single-photon sensitivity and picosecond timing resolution, which makes them good candidates for low-light-level ranging applications, especially flash three-dimensional imaging applications where the received laser power is extremely limited. Another advantage of Geiger-mode APDs is their capability of large output current, which can drive CMOS timing circuits directly, meaning that large-format focal plane arrays (FPAs) can be easily fabricated using mature CMOS technology. However, Geiger-mode detector based FPAs can only measure the range information of a scene, not its reflectivity. Reflectivity is a major characteristic that can help target classification and identification. Because of the Poisson statistics of photodetection, detection probability is tightly connected to the number of incident photons. Employing this relation, a signal intensity estimation method based on probability inversion is proposed. Instead of measuring intensity directly, several detections are conducted; the detection probability is obtained and the intensity is estimated from it. The relation between the estimator's accuracy, the measuring range, and the number of detections is discussed based on statistical theory. Finally, a Monte Carlo simulation is conducted to verify the correctness of this theory. Using 100 detections, signal intensities equal to 4.6 photons per detection can be measured with this method. With slight modification of the measuring strategy, intensity information can be obtained using current Geiger-mode detector based FPAs, which can enrich the information acquired and broaden the application field of the current technology.
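
The probability-inversion relation is simple to state under the idealized assumptions of Poisson photon statistics and unit detection efficiency (both assumptions mine, for illustration): a Geiger-mode pixel fires whenever at least one primary photoelectron arrives, so the firing frequency over repeated gates can be inverted for the mean photon number.

```python
import math

def detection_probability(mean_photons):
    """Poisson statistics: the pixel fires iff >= 1 photoelectron arrives,
    so P_det = 1 - exp(-mean_photons) (unit efficiency assumed)."""
    return 1.0 - math.exp(-mean_photons)

def estimate_intensity(fires, trials):
    """Probability inversion: n_hat = -ln(1 - k/N) from k fires in N gates."""
    return -math.log(1.0 - fires / trials)

# At 4.6 photons per gate, P_det is about 0.99, so roughly 99 fires are
# expected in 100 gates; inverting that frequency recovers the intensity.
p = detection_probability(4.6)
n_hat = estimate_intensity(99, 100)
```

The steep saturation of P_det near 1 is why the abstract's figure of about 4.6 photons per detection marks the practical upper end for a 100-detection measurement.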

  9. Neyman Pearson detection of K-distributed random variables

    NASA Astrophysics Data System (ADS)

    Tucker, J. Derek; Azimi-Sadjadi, Mahmood R.

    2010-04-01

    In this paper a new detection method for sonar imagery is developed in K-distributed background clutter. The equation for the log-likelihood is derived and compared to the corresponding counterparts derived for the Gaussian and Rayleigh assumptions. Test results of the proposed method on a data set of synthetic underwater sonar images is also presented. This database contains images with targets of different shapes inserted into backgrounds generated using a correlated K-distributed model. Results illustrating the effectiveness of the K-distributed detector are presented in terms of probability of detection, false alarm, and correct classification rates for various bottom clutter scenarios.
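
A hedged Monte Carlo sketch of detection in K-distributed clutter, using the compound (gamma-texture times exponential-speckle) representation of K-distributed intensity and a plain threshold detector rather than the paper's log-likelihood detector; all parameters are illustrative:

```python
import random

random.seed(7)  # reproducible run

def k_clutter_sample(shape, scale=1.0):
    """Compound representation of K-distributed intensity: a gamma-
    distributed texture (mean = scale) modulating exponential speckle."""
    texture = random.gammavariate(shape, scale / shape)
    return texture * random.expovariate(1.0)

def roc_point(threshold, target_level, shape, n=200_000):
    """Estimate one (P_fa, P_d) point for a fixed-threshold detector,
    with the target modeled crudely as an additive constant intensity."""
    fa = sum(k_clutter_sample(shape) > threshold for _ in range(n)) / n
    det = sum(k_clutter_sample(shape) + target_level > threshold
              for _ in range(n)) / n
    return fa, det

# Spiky clutter (small shape parameter) with unit mean intensity.
pfa, pd = roc_point(threshold=6.0, target_level=5.0, shape=0.5)
```

Sweeping the threshold traces out the full ROC; the heavy K-distributed tail is what inflates the false-alarm rate relative to a Gaussian or Rayleigh assumption at the same threshold, which motivates the matched K-distributed detector of the paper.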

  10. Detection of Chemical Precursors of Explosives

    NASA Technical Reports Server (NTRS)

    Li, Jing

    2012-01-01

    Certain selected chemicals associated with terrorist activities are too unstable to be prepared in final form. These chemicals are often prepared as precursor components, to be combined at a time immediately preceding the detonation. One example is a liquid explosive, which usually requires an oxidizer, an energy source, and a chemical or physical mechanism to combine the other components. Detection of the oxidizer (e.g., H2O2) or the energy source (e.g., nitromethane) is often possible, but must be performed in a short time interval (e.g., 5-15 seconds) and in an environment with a very small concentration (e.g., 1-100 ppm), because the target chemical(s) is carried in a sealed container. These needs are met by this invention, which provides a system and associated method for detecting one or more chemical precursors (components) of a multi-component explosive compound. Different carbon nanotubes (CNTs) are loaded (by doping, impregnation, coating, or other functionalization process) for detecting of different chemical substances that are the chemical precursors, respectively, if these precursors are present in a gas to which the CNTs are exposed. After exposure to the gas, a measured electrical parameter (e.g., voltage or current correlating to impedance, conductivity, capacitance, inductance, etc.) changes with time and concentration in a predictable manner if a selected chemical precursor is present, and will approach an asymptotic value promptly after exposure to the precursor. The measured voltage or current are compared with one or more sequences of their reference values for one or more known target precursor molecules, and a most probable concentration value is estimated for each one, two, or more target molecules. An error value is computed, based on differences of voltage or current for the measured and reference values, using the most probable concentration values. 
Where the error value is less than a threshold, the system concludes that the target molecule is likely. Presence of one, two, or more target molecules in the gas can be sensed from a single set of measurements.
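
The compare-and-threshold step described above can be sketched as a least-squares match of a measured response curve against a reference library (the reference curves, concentrations, response values, and error threshold below are hypothetical, invented for illustration):

```python
def best_match(measured, reference_curves):
    """Pick the most probable concentration as the reference response
    curve with the smallest sum-of-squared-differences error."""
    best_conc, best_err = None, float("inf")
    for conc, curve in reference_curves.items():
        err = sum((m - r) ** 2 for m, r in zip(measured, curve))
        if err < best_err:
            best_conc, best_err = conc, err
    return best_conc, best_err

# Hypothetical reference library: sensor response vs. time, approaching an
# asymptote, at three precursor concentrations (ppm; arbitrary units).
refs = {1: [0.1, 0.3, 0.4, 0.45],
        10: [0.3, 0.7, 0.9, 0.95],
        100: [0.6, 1.2, 1.5, 1.6]}
conc, err = best_match([0.28, 0.72, 0.88, 0.96], refs)
detected = err < 0.05  # error below threshold => target molecule likely
```

With multiple differently functionalized CNT channels, the same matching runs per channel, which is how several precursors can be sensed from a single set of measurements.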

  11. Compact and cost effective instrument for detecting drug precursors in different environments based on fluorescence polarization

    NASA Astrophysics Data System (ADS)

    Antolín-Urbaneja, J. C.; Eguizabal, I.; Briz, N.; Dominguez, A.; Estensoro, P.; Secchi, A.; Varriale, A.; Di Giovanni, S.; D'Auria, S.

    2013-05-01

    Several techniques for detecting chemical drug precursors have been developed in the last decade. Most of them are able to identify molecules at very low concentration under lab conditions. Other commercial devices are able to detect a fixed number and type of target substances based on a single detection technique, providing an absence of flexibility with respect to target compounds. The construction of compact and easy to use detection systems providing screening for a large number of compounds, able to discriminate them with a low false alarm rate and a high probability of detection, is still an open concern. Under the CUSTOM project, funded by the European Commission within FP7, a stand-alone portable sensing device based on multiple techniques is being developed. One of these techniques is based on LED-induced fluorescence polarization to detect Ephedrine and Benzyl Methyl Ketone (BMK) as a first approach. This technique is highly selective with respect to the target compounds due to the generation of properly engineered fluorescent proteins which are able to bind the target analytes, as happens in an "immune-type reaction". This paper deals with the advances in the design, construction and validation of the LED-induced fluorescence sensor to detect BMK analytes. This sensor includes an analysis module based on a high performance LED and PMT detector, a fluidic system to dose suitable quantities of reagents, and some printed circuit boards, all of them fixed in a small structure (167 mm × 193 mm × 228 mm) with the capability of working as a stand-alone application.

  12. K{sub β} to K{sub α} X-ray intensity ratios and K to L shell vacancy transfer probabilities of Co, Ni, Cu, and Zn

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anand, L. F. M.; Gudennavar, S. B., E-mail: shivappa.b.gudennavar@christuniversity.in; Bubbly, S. G.

    The K to L shell total vacancy transfer probabilities of low Z elements Co, Ni, Cu, and Zn are estimated by measuring the K{sub β} to K{sub α} intensity ratio adopting the 2π-geometry. The target elements were excited by 32.86 keV barium K-shell X-rays from a weak {sup 137}Cs γ-ray source. The emitted K-shell X-rays were detected using a low energy HPGe X-ray detector coupled to a 16 k MCA. The measured intensity ratios and the total vacancy transfer probabilities are compared with theoretical results and others’ work, establishing a good agreement.

  13. Optimum Sensors Integration for Multi-Sensor Multi-Target Environment for Ballistic Missile Defense Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Imam, Neena; Barhen, Jacob; Glover, Charles Wayne

    2012-01-01

    Multi-sensor networks may face resource limitations in a dynamically evolving multiple-target tracking scenario. It is necessary to task the sensors efficiently so that overall system performance is maximized within the system constraints. The central sensor resource manager may control the sensors to meet objective functions formulated to serve system goals such as minimization of track loss, maximization of the probability of target detection, and minimization of track error. This paper discusses a variety of techniques that may be utilized to optimize sensor performance for either near-term gain or future reward over a longer time horizon.

  14. A Coarse-to-Fine Model for Airplane Detection from Large Remote Sensing Images Using a Saliency Model and Deep Learning

    NASA Astrophysics Data System (ADS)

    Song, Z. N.; Sui, H. G.

    2018-04-01

    High-resolution remote sensing images carry important strategic information, in particular for quickly finding time-sensitive targets such as airplanes, ships, and cars. Often the first problem faced is how to rapidly judge whether a particular target is present anywhere in a large remote sensing image, rather than detecting it in a given image. Finding time-sensitive targets in a huge image is a great challenge: 1) complex backgrounds lead to high miss and false-alarm rates in tiny-object detection in large-scale images; 2) unlike traditional image retrieval, the task is not just to compare the similarity of image blocks, but to quickly find specific targets in a huge image. This paper, taking airplanes as an example, presents an effective method for searching for aircraft targets in large-scale optical remote sensing images. First, an improved visual attention model that combines saliency detection with a line segment detector is used to quickly locate suspected regions in a large and complicated remote sensing image. Then, for each region, without a region-proposal step, a single neural network that predicts bounding boxes and class probabilities directly from the full image in one evaluation is adopted to search for small airplane objects. Unlike sliding-window and region-proposal-based techniques, the network sees the entire image (region) during training and test time, so it implicitly encodes contextual information about classes as well as their appearance. Experimental results show that the proposed method quickly identifies airplanes in large-scale images.

  15. The costs of evaluating species densities and composition of snakes to assess development impacts in amazonia.

    PubMed

    Fraga, Rafael de; Stow, Adam J; Magnusson, William E; Lima, Albertina P

    2014-01-01

    Studies leading to decision-making for environmental licensing often fail to provide accurate estimates of diversity. Measures of snake diversity are regularly obtained to assess development impacts in the rainforests of the Amazon Basin, but this taxonomic group may be subject to poor detection probabilities. Recently, the Brazilian government tried to standardize sampling designs through the implementation of a system (RAPELD) to quantify biological diversity using spatially standardized sampling units. Consistency in sampling design allows detection probabilities to be compared among taxa, and sampling effort and associated cost to be evaluated. The cost effectiveness of detecting snakes has received no attention in Amazonia. Here we tested the effects of reducing sampling effort on estimates of species densities and assemblage composition. We identified snakes in seven plot systems, each standardised with 14 plots. The 250 m long centre line of each plot followed an altitudinal contour. Surveys were repeated four times in each plot, and detection probabilities were estimated for the 41 species encountered. Reducing the number of observations, or the size of the sampling modules, caused significant loss of information on species densities and local patterns of variation in assemblage composition. We estimated the cost to find a snake as US$120, but general linear models indicated the possibility of identifying differences in assemblage composition for half the overall survey costs. Decisions to reduce sampling effort depend on the importance of the lost information to target issues, and may not be the preferred option if there is potential for identifying individual snake species requiring specific conservation actions. However, in most studies of human disturbance on species assemblages, it is likely to be more cost-effective to focus on other groups of organisms with higher detection probabilities.

  17. Infrared maritime target detection using a probabilistic single Gaussian model of sea clutter in Fourier domain

    NASA Astrophysics Data System (ADS)

    Zhou, Anran; Xie, Weixin; Pei, Jihong; Chen, Yapei

    2018-02-01

    For ship-target detection in cluttered infrared image sequences, a robust detection method based on a probabilistic single Gaussian model of the sea background in the Fourier domain is put forward. The amplitude-spectrum sequence at each frequency point of the pure-seawater images, being more stable than the gray-value sequence of each background pixel in the spatial domain, is modeled as a Gaussian. Next, a probability weighting matrix is built from the stability of the pure seawater's total energy spectrum in the row direction, to make the Gaussian model more accurate. The foreground frequency points are then separated from the background frequency points by the model. Finally, false-alarm points are removed using the ships' shape features. The performance of the proposed method is tested through visual and quantitative comparisons with other methods.
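
    The per-frequency Gaussian background model at the heart of this method can be sketched as follows. This is a minimal illustration on synthetic data: the frame size, the 3-sigma decision rule, and the bright-patch "ship" are assumptions, and the paper's probability weighting matrix and shape-based false-alarm removal are omitted.

```python
import numpy as np

def fit_background_model(frames):
    """Fit a Gaussian (mean, std) to the amplitude-spectrum sequence
    at each frequency point of pure-seawater frames."""
    spectra = np.stack([np.abs(np.fft.fft2(f)) for f in frames])
    return spectra.mean(axis=0), spectra.std(axis=0) + 1e-9

def detect_foreground(frame, mu, sigma, k=3.0):
    """Flag frequency points whose amplitude deviates more than
    k standard deviations from the background model."""
    amp = np.abs(np.fft.fft2(frame))
    return np.abs(amp - mu) > k * sigma

rng = np.random.default_rng(0)
background = [rng.normal(size=(64, 64)) for _ in range(20)]  # pure-seawater stand-ins
test = rng.normal(size=(64, 64))
test[30:34, 30:34] += 20.0                                   # bright ship-like patch
mu, sigma = fit_background_model(background)
mask = detect_foreground(test, mu, sigma)
print(mask.sum(), "foreground frequency points flagged")
```

    The bright patch concentrates energy in low-frequency points whose amplitudes then fall far outside the per-frequency background Gaussians.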

  18. The impact of Relative Prevalence on dual-target search for threat items from airport X-ray screening.

    PubMed

    Godwin, Hayward J; Menneer, Tamaryn; Cave, Kyle R; Helman, Shaun; Way, Rachael L; Donnelly, Nick

    2010-05-01

    The probability of target presentation in visual search tasks influences target detection performance: this is known as the prevalence effect (Wolfe et al., 2005). Additionally, searching for several targets simultaneously reduces search performance: this is known as the dual-target cost (DTC: Menneer et al., 2007). The interaction between the DTC and prevalence effect was investigated in a single study by presenting one target in dual-target search at a higher level of prevalence than the other target (Target A: 45% Prevalence; Target B: 5% Prevalence). An overall DTC was found for both RTs and response accuracy. Furthermore, there was an effect of target prevalence in dual-target search, suggesting that, when one target is presented at a higher level of prevalence than the other, both the dual-target cost and the prevalence effect contribute to decrements in performance. The implications for airport X-ray screening are discussed. Copyright 2010 Elsevier B.V. All rights reserved.

  19. Effect of inherent location uncertainty on detection of stationary targets in noisy image sequences.

    PubMed

    Manjeshwar, R M; Wilson, D L

    2001-01-01

    The effect of inherent location uncertainty on the detection of stationary targets was determined in noisy image sequences. Targets were thick and thin projected cylinders mimicking arteries, catheters, and guide wires in medical x-ray fluoroscopy. With the use of an adaptive forced-choice method, detection contrast sensitivity (the inverse of contrast) was measured both with and without marker cues that directed the attention of observers to the target location. With the probability correct clamped at 80%, contrast sensitivity increased an average of 77% when the marker was added to the thin-cylinder target. There was an insignificant effect on the thick cylinder. The large enhancement with the thin cylinder was obtained even though the target was located exactly in the center of a small panel, giving observers the impression that it was well localized. Psychometric functions consisting of d' plotted as a function of the square root of the signal-energy-to-noise ratio gave a positive x intercept for the case of the thin cylinder without a marker. This x intercept, characteristic of uncertainty in other types of detection experiments, disappeared when the marker was added or when the thick cylinder was used. Inherent location uncertainty was further characterized by using four different markers with varying proximity to the target. Visual detection by human observers increased monotonically as the markers better localized the target. Human performance was modeled as a matched-filter detector with an uncertainty in the placement of the template. The removal of a location cue was modeled by introducing a location uncertainty of approximately 0.4 mm on the display device, or only 7 μm on the retina, a size on the order of a single photoreceptor field. We conclude that detection is affected by target location uncertainty on the order of cellular dimensions, an observation with important implications for detection mechanisms in humans.
In medical imaging, the results argue strongly for inclusion of high-contrast visualization markers on catheters and other interventional devices.
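
    The observer model in this record, a matched filter whose template placement is uncertain, can be sketched in one dimension. Every parameter below (the Gaussian target profile, jitter sizes, noise level) is an illustrative assumption, not a value from the study:

```python
import numpy as np

rng = np.random.default_rng(5)

def dprime(jitter_px, trials=2000, n=64, sigma_t=3.0, noise=1.0):
    """Detectability d' of a Gaussian bump for a matched filter whose
    template position has Gaussian uncertainty of `jitter_px` pixels."""
    xs = np.arange(n)
    target = np.exp(-0.5 * ((xs - n / 2) / sigma_t) ** 2)
    s_pres, s_abs = [], []
    for _ in range(trials):
        shift = int(round(rng.normal(0, jitter_px))) if jitter_px else 0
        template = np.roll(target, shift)          # misplaced template
        w = rng.normal(0, noise, n)
        s_abs.append(template @ w)                 # target absent
        s_pres.append(template @ (w + target))     # target present
    s_pres, s_abs = np.array(s_pres), np.array(s_abs)
    pooled = np.sqrt((s_pres.var() + s_abs.var()) / 2)
    return (s_pres.mean() - s_abs.mean()) / pooled

d_cued = dprime(0)      # marker cue: no placement uncertainty
d_uncued = dprime(4)    # no cue: ~4-pixel placement uncertainty
print(f"d' cued = {d_cued:.2f}, d' uncued = {d_uncued:.2f}")
```

    Consistent with the abstract, removing the location cue (adding placement uncertainty) lowers d' at fixed contrast, i.e. raises the contrast needed for a fixed percent correct.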

  20. Scale invariant SURF detector and automatic clustering segmentation for infrared small targets detection

    NASA Astrophysics Data System (ADS)

    Zhang, Haiying; Bai, Jiaojiao; Li, Zhengjie; Liu, Yan; Liu, Kunhong

    2017-06-01

    The detection and discrimination of small, dim infrared targets is a challenge in automatic target recognition (ATR), because there is no salient information of size, shape, or texture. Many researchers focus on mining more discriminative temporal-spatial information about targets. However, such information may not be available as imaging environments change, and target size and intensity vary with imaging distance. In this paper we therefore propose a novel detection scheme using density-based clustering and a backtracking strategy. In this scheme, the speeded-up robust features (SURF) detector is first applied to capture candidate targets in each single frame. These points are then mapped into one frame, so that target traces form a local aggregation pattern. To isolate the targets from noise, a recently proposed density-based clustering algorithm, fast search and find of density peaks (FSFDP for short), is employed to cluster targets by their spatially intensive distribution. Two important factors of the algorithm, percent and γ, are fully exploited to determine the clustering scale automatically, so as to extract the trace with the highest clutter-suppression ratio. In the final step, a backtracking algorithm is designed to detect and discriminate target traces as well as to eliminate clutter. The consistency and continuity of the short-time target trajectory in the temporal-spatial domain is incorporated into the bounding function to speed up the pruning. Compared with several state-of-the-art methods, our algorithm is more effective for dim targets with a lower signal-to-clutter ratio (SCR). Furthermore, it avoids constructing a candidate-target trajectory search space, so its time complexity is limited to a polynomial level. Extensive experimental results show that it has superior performance in probability of detection (Pd) and false-alarm suppression rate over a variety of complex backgrounds.

  1. Proposal and Implementation of a Robust Sensing Method for DVB-T Signal

    NASA Astrophysics Data System (ADS)

    Song, Chunyi; Rahman, Mohammad Azizur; Harada, Hiroshi

    This paper proposes a sensing method for TV signals of the DVB-T standard to realize effective TV White Space (TVWS) communication. In the TVWS technology trial organized by the Infocomm Development Authority (iDA) of Singapore, with regard to sensing level and sensing time, detecting a DVB-T signal at the level of -120dBm over an 8MHz channel with a sensing time below 1 second is required. To fulfill such a strict sensing requirement, we propose a smart sensing method that combines feature detection and energy detection (CFED) and is further characterized by dynamic threshold selection (DTS) based on a threshold table, to improve sensing robustness to noise uncertainty. The DTS-based CFED (DTS-CFED) is evaluated by computer simulations and is also implemented in a hardware sensing prototype. The results show that the DTS-CFED achieves a detection probability above 0.9 for a target false-alarm probability of 0.1 for DVB-T signals at the level of -120dBm over an 8MHz channel with a sensing time of 0.1 second.
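
    The energy-detection half of such a scheme, with the threshold set from a target false-alarm probability, can be sketched as below. This is a hedged illustration only: it uses a Gaussian approximation of the noise-only energy statistic and a simulated tone at moderate SNR, and does not reproduce the paper's feature detector or its DTS threshold table:

```python
import numpy as np
from statistics import NormalDist

def energy_threshold(n, noise_var, pfa):
    """Detection threshold for a target Pfa, using the Gaussian
    approximation T ~ N(n*var, n*var^2) for the summed energy of
    n complex-Gaussian noise samples."""
    return n * noise_var + noise_var * np.sqrt(n) * NormalDist().inv_cdf(1 - pfa)

rng = np.random.default_rng(1)
n, pfa = 4096, 0.1
thr = energy_threshold(n, 1.0, pfa)

def sense(snr_db, trials=500):
    """Fraction of sensing intervals whose energy exceeds the threshold."""
    amp = np.sqrt(10 ** (snr_db / 10))
    hits = 0
    for _ in range(trials):
        w = (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2)
        s = amp * np.exp(2j * np.pi * 0.1 * np.arange(n))   # stand-in signal tone
        hits += (np.abs(w + s) ** 2).sum() > thr
    return hits / trials

pfa_hat = sense(-100)   # signal essentially absent: empirical false-alarm rate
pd_hat = sense(-10)     # signal present at -10 dB SNR
print(f"Pfa = {pfa_hat:.3f} (target {pfa}), Pd = {pd_hat:.3f}")
```

    Pure energy detection degrades under noise uncertainty, which is exactly what motivates combining it with feature detection and a dynamically selected threshold in the paper.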

  2. Vehicle Detection for RCTA/ANS (Autonomous Navigation System)

    NASA Technical Reports Server (NTRS)

    Brennan, Shane; Bajracharya, Max; Matthies, Larry H.; Howard, Andrew B.

    2012-01-01

    Using a stereo camera pair, imagery is acquired and processed through the JPLV stereo processing pipeline. From this stereo data, large 3D blobs are found. These blobs are then described and classified by their shape to determine which are vehicles and which are not. Prior vehicle detection algorithms are either targeted to specific domains, such as following lead cars, or are intensity-based methods that involve learning typical vehicle appearances from a large corpus of training data. In order to detect vehicles, the JPL Vehicle Detection (JVD) algorithm goes through the following steps: 1. Take as input a left disparity image and left rectified image from JPLV stereo. 2. Project the disparity data onto a two-dimensional Cartesian map. 3. Perform post-processing of the map built in the previous step in order to clean it up. 4. Take the processed map and find peaks. For each peak, grow it out into a map blob. These map blobs represent large, roughly vehicle-sized objects in the scene. 5. Take these map blobs and reject those that do not meet certain criteria. Build descriptors for the ones that remain, and pass these descriptors to a classifier, which determines whether the blob is a vehicle or not. The probability of detection is the probability that if a vehicle is present in the image, visible, and unoccluded, then it will be detected by the JVD algorithm. In order to estimate this probability, eight sequences were ground-truthed from the RCTA (Robotics Collaborative Technology Alliances) program, totaling over 4,000 frames with 15 unique vehicles. Since these vehicles were observed at varying ranges, one is able to find the probability of detection as a function of range. At the time of this reporting, the JVD algorithm was tuned to perform best on cars seen from the front, rear, or either side, and performed poorly on vehicles seen from oblique angles.
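
    Step 2 of the pipeline, projecting disparity data onto a two-dimensional Cartesian map, can be sketched as follows. The focal length, baseline, and grid geometry below are invented stand-ins, and the peak found would still need the blob-growing and classification of steps 4 and 5:

```python
import numpy as np

def disparity_to_map(disp, f=500.0, baseline=0.2, cell=0.25, extent=20.0):
    """Project disparity pixels to a top-down Cartesian grid
    (counts of 3-D points per cell)."""
    v, u = np.nonzero(disp > 0)
    z = f * baseline / disp[v, u]          # depth (m) from disparity (px)
    x = (u - disp.shape[1] / 2) * z / f    # lateral offset (m)
    n = int(2 * extent / cell)
    grid = np.zeros((n, n))
    ix = ((x + extent) / cell).astype(int)
    iz = (z / cell).astype(int)
    ok = (ix >= 0) & (ix < n) & (iz >= 0) & (iz < n)
    np.add.at(grid, (iz[ok], ix[ok]), 1)   # unbuffered accumulation per cell
    return grid

# Synthetic left-disparity image with one vehicle-sized region at ~10 m depth.
disp = np.zeros((240, 320))
disp[100:140, 150:190] = 10.0
grid = disparity_to_map(disp)
peak = np.unravel_index(grid.argmax(), grid.shape)
print("peak cell:", peak, "count:", int(grid.max()))
```

    The dense count peak at a single depth row is what step 4 would grow into a map blob.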

  3. Estimating site occupancy, colonization, and local extinction when a species is detected imperfectly

    USGS Publications Warehouse

    MacKenzie, D.I.; Nichols, J.D.; Hines, J.E.; Knutson, M.G.; Franklin, A.B.

    2003-01-01

    Few species are likely to be so evident that they will always be detected when present. Failing to allow for the possibility that a target species was present but undetected at a site will lead to biased estimates of site occupancy, colonization, and local extinction probabilities. These population vital rates are often of interest in long-term monitoring programs and metapopulation studies. We present a model that enables direct estimation of these parameters when the probability of detecting the species is less than 1. The model does not require any assumptions of process stationarity, as do some previous methods, but does require detection/nondetection data to be collected in a manner similar to Pollock's robust design as used in mark-recapture studies. Via simulation, we show that the model provides good estimates of parameters for most scenarios considered. We illustrate the method with data from monitoring programs of Northern Spotted Owls (Strix occidentalis caurina) in northern California and tiger salamanders (Ambystoma tigrinum) in Minnesota, USA.
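
    The idea of separating detection probability p from occupancy ψ can be written down compactly for the single-season case (the paper's model is multi-season, adding colonization and extinction). The simulation parameters and crude grid-search MLE below are illustrative assumptions:

```python
import numpy as np

def occupancy_loglik(psi, p, d_counts, k):
    """Log-likelihood of single-season occupancy data, where d_counts[d]
    is the number of sites with d detections in k surveys. An all-zero
    history mixes 'occupied but never detected' with 'unoccupied'."""
    d = np.arange(k + 1)
    ll = np.log(psi) + d * np.log(p) + (k - d) * np.log(1 - p)
    ll[0] = np.log(psi * (1 - p) ** k + 1 - psi)
    return float(d_counts @ ll)

# Simulate 200 sites surveyed 4 times, true occupancy 0.6, detection 0.4.
rng = np.random.default_rng(2)
k, n_sites, psi_true, p_true = 4, 200, 0.6, 0.4
occupied = rng.random(n_sites) < psi_true
detections = (rng.random((n_sites, k)) < p_true) & occupied[:, None]
d_counts = np.bincount(detections.sum(axis=1), minlength=k + 1)

# Grid-search maximum likelihood over (psi, p).
grid = np.linspace(0.01, 0.99, 99)
_, psi_hat, p_hat = max(
    (occupancy_loglik(a, b, d_counts, k), a, b) for a in grid for b in grid)
print(f"psi_hat = {psi_hat:.2f}, p_hat = {p_hat:.2f}")
```

    The naive occupancy estimate (fraction of sites with at least one detection) is biased low here; the likelihood recovers ψ near its true value because repeat surveys identify p.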

  4. Multisensor fusion for the detection of mines and minelike targets

    NASA Astrophysics Data System (ADS)

    Hanshaw, Terilee

    1995-06-01

    The US Army's Communications and Electronics Command, through the auspices of its Night Vision and Electronics Sensors Directorate (CECOM-NVESD), is actively applying multisensor techniques to the detection of mine targets. This multisensor research results from the 'detection activity' with its broad range of operational conditions and targets. Multisensor operation justifies significant attention by yielding high target-detection and low false-alarm statistics. Furthermore, recent advances in sensor and computing technologies make its practical application realistic and affordable. The mine detection field has, since its baptism in WWI, investigated the known spectra for applicable mine-observation phenomena. Countless sensors, algorithms, processors, networks, and other techniques have been investigated to determine candidacy for mine detection. CECOM-NVESD efforts have addressed a wide range of sensors spanning the spectrum from gravity-field perturbations, magnetic-field disturbances, seismic sounding, electromagnetic fields, and earth-penetrating radar imagery to infrared/visible/ultraviolet surface imaging technologies. Supplementary analysis has considered sensor-candidate applicability by testing under field conditions (versus laboratory) to determine fieldability. As these field conditions directly affect the probability of detection and false alarms, sensor employment and design must be considered. Consequently, as a given sensor's performance is influenced directly by the operational conditions, tradeoffs are necessary. At present, mass-produced and fielded mine detection techniques are limited to those incorporating a single sensor/processor methodology, such as pulse induction and magnetometry, as found in hand-held detectors. The most sensitive fielded systems can detect minute metal components in small mine targets but incur very high false-alarm rates, reducing velocity in operational environments.
Furthermore, the actual speed of advance for the entire mission (convoy, movement to engagement, etc.) is determined by the level of difficulty presented in the clearance or avoidance activities required in response to the potential 'targets' marked throughout a detection activity. The application of fielded hand-held systems to convoy operations is therefore clearly impractical. CECOM-NVESD efforts are presently seeking to overcome these operational limitations by substantially increasing speed of detection while reducing the false-alarm rate through the application of multisensor techniques. The CECOM-NVESD application of multisensor techniques through integration/fusion methods will be defined in this paper.

  5. Image Discrimination Models Predict Object Detection in Natural Backgrounds

    NASA Technical Reports Server (NTRS)

    Ahumada, Albert J., Jr.; Rohaly, A. M.; Watson, Andrew B.; Null, Cynthia H. (Technical Monitor)

    1994-01-01

    Object detection involves looking for one of a large set of object sub-images in a large set of background images. Image discrimination models only predict the probability that an observer will detect a difference between two images. In a recent study based on only six different images, we found that discrimination models can predict the relative detectability of objects in those images, suggesting that these simpler models may be useful in some object detection applications. Here we replicate this result using a new, larger set of images. Fifteen images of a vehicle in an otherwise natural setting were altered to remove the vehicle and mixed with the original image in a proportion chosen to make the target neither perfectly recognizable nor unrecognizable. The target was also rotated about a vertical axis through its center and mixed with the background. Sixteen observers rated these 30 target images and the 15 background-only images for the presence of a vehicle. The likelihoods of the observer responses were computed from a Thurstone scaling model with the assumption that the detectabilities are proportional to the predictions of an image discrimination model. Three image discrimination models were used: a cortex transform model, a single channel model with a contrast sensitivity function filter, and the Root-Mean-Square (RMS) difference of the digital target and background-only images. As in the previous study, the cortex transform model performed best; the RMS difference predictor was second best; and last, but still a reasonable predictor, was the single channel model. Image discrimination models can predict the relative detectabilities of objects in natural backgrounds.

  6. Where Should We Look for Life?

    NASA Astrophysics Data System (ADS)

    Kohler, Susanna

    2018-04-01

    The first challenge in the hunt for life elsewhere in our universe is to decide where to look. In a new study, two scientists examine whether Sun-like stars or low-mass M dwarfs are the best bet for hosting exoplanets with detectable life. Ambiguity of Habitability: The habitable zones of cool M-dwarf stars lie much closer in than those of Sun-like stars, placing habitable-zone planets around M dwarfs at greater risk of being affected by space weather. Most exoplanet scientists will freely admit frustration with the term "habitability": it's a word that has many different meanings and is easily misinterpreted when it appears in news articles. Just because a planet lies in a star's habitable zone, for instance, doesn't mean it's necessarily capable of supporting life. This ambiguity, argue authors Manasvi Lingam and Abraham Loeb (Harvard University and Harvard-Smithsonian Center for Astrophysics), requires us to take a strategic approach when pursuing the search for primitive life outside of our solar system. In particular, we risk losing the enthusiasm and support of the public (and funding sources!) when we focus on the general search for planets in stellar habitable zones, rather than specifically searching for the planets most likely to have detectable signatures of life. [Illustration of the difference between a Sun-like star and a lower-mass, cooler M-dwarf star. NASA's Goddard Space Flight Center/S. Wiessinger] Weighing Two Targets: So how do we determine where best to look for planets with detectable biosignatures? To figure out which stars make the optimal targets, Lingam and Loeb suggest an approach based on the standard cost-benefit analyses common in economics.
Here, what's being balanced is the cost of an exoplanet survey mission against the benefit of different types of stellar targets. In particular, Lingam and Loeb weigh the benefit of targeting solar-type stars against that of targeting stars of any other mass (such as low-mass M dwarfs, popular targets of many current exoplanet surveys). The advantage of one type of target over the other depends on two chief factors: the probability that the targeted star hosts planets with life, and the probability that biosignatures arising from this life are detectable, given our available technology. Promise of Sun-Like Stars: [Relative benefit of searching for signatures of life around stars of varying masses, assuming a transmission spectroscopy survey mission; results are similar for a direct-imaging mission. The green curve assumes a flat prior; the red and blue curves assume priors in which habitability is suppressed around low-mass stars. Lingam & Loeb 2018] Taking observational constraints into account, Lingam and Loeb's results depend on what is known in statistics as a prior, an assumption that goes into the calculation. The two possible outcomes are: if we assume a flat prior, i.e., that the probability of life is the same for any choice of star, then searching for life around M dwarfs proves the most advantageous, because the detection of biosignatures becomes much easier; if we assume a prior in which habitability is suppressed around low-mass stars, then it is more advantageous to search for life around solar-type stars. So which of these priors is correct?
There is mounting evidence, particularly based on considerations of space weather, that the habitability of Earth-like planets around M dwarfs might be much lower than that of their counterparts around solar-like stars. If this turns out to be true, then, Lingam and Loeb argue, exoplanet survey missions should target Sun-like stars throughout our galaxy for the best chances of efficiently detecting life beyond our solar system. Citation: Manasvi Lingam and Abraham Loeb 2018 ApJL 857 L17. doi:10.3847/2041-8213/aabd86

  7. Tracking the Sensory Environment: An ERP Study of Probability and Context Updating in ASD

    PubMed Central

    Westerfield, Marissa A.; Zinni, Marla; Vo, Khang; Townsend, Jeanne

    2014-01-01

    We recorded visual event-related brain potentials (ERPs) from 32 adult male participants (16 high-functioning participants diagnosed with Autism Spectrum Disorder (ASD) and 16 control participants, ranging in age from 18–53 yrs) during a three-stimulus oddball paradigm. Target and non-target stimulus probability was varied across three probability conditions, whereas the probability of a third non-target stimulus was held constant in all conditions. P3 amplitude to target stimuli was more sensitive to probability in ASD than in typically developing (TD) participants, whereas P3 amplitude to non-target stimuli was less responsive to probability in ASD participants. This suggests that neural responses to changes in event probability are attention-dependent in high-functioning ASD. The implications of these findings for higher-level behaviors such as prediction and planning are discussed. PMID:24488156

  8. Assessing the performance of a covert automatic target recognition algorithm

    NASA Astrophysics Data System (ADS)

    Ehrman, Lisa M.; Lanterman, Aaron D.

    2005-05-01

    Passive radar systems exploit illuminators of opportunity, such as TV and FM radio, to illuminate potential targets. Doing so allows them to operate covertly and inexpensively. Our research seeks to enhance passive radar systems by adding automatic target recognition (ATR) capabilities. In previous papers we proposed conducting ATR by comparing the radar cross section (RCS) of aircraft detected by a passive radar system to the precomputed RCS of aircraft in the target class. To effectively model the low-frequency setting, the comparison is made via a Rician likelihood model. Monte Carlo simulations indicate that the approach is viable. This paper builds on that work by developing a method for quickly assessing the potential performance of the ATR algorithm without exhaustive Monte Carlo trials. The method exploits the relation between the probability of error in a binary hypothesis test under the Bayesian framework and the Chernoff information. Since the data are well modeled as Rician, we begin by deriving a closed-form approximation for the Chernoff information between two Rician densities. This leads to an approximation for the probability of error in the classification algorithm as a function of the number of available measurements. We conclude with an application that would be particularly cumbersome to accomplish via Monte Carlo trials but that can be quickly addressed using the Chernoff information approach. This application evaluates the length of time that an aircraft must be tracked before the probability of error in the ATR algorithm drops below a desired threshold.
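
    The Chernoff-information approach can be illustrated numerically. The paper derives a closed-form approximation for Rician densities; the sketch below instead evaluates the defining integral by brute force on a grid, with arbitrary ν values standing in for the two aircraft RCS hypotheses:

```python
import numpy as np

def rician_pdf(x, nu, sigma=1.0):
    """Rician density; np.i0 is the modified Bessel function I_0."""
    s2 = sigma ** 2
    return (x / s2) * np.exp(-(x ** 2 + nu ** 2) / (2 * s2)) * np.i0(x * nu / s2)

x = np.linspace(1e-6, 20.0, 20001)
dx = x[1] - x[0]
p0 = rician_pdf(x, nu=1.0)   # stand-in RCS density under hypothesis 0
p1 = rician_pdf(x, nu=3.0)   # stand-in RCS density under hypothesis 1

def chernoff_information(p0, p1, dx):
    """C = -min_s log integral of p0(x)^s * p1(x)^(1-s) dx, via grid search over s."""
    s_grid = np.linspace(0.01, 0.99, 99)
    logs = [np.log((p0 ** s * p1 ** (1 - s)).sum() * dx) for s in s_grid]
    return -min(logs)

C = chernoff_information(p0, p1, dx)
for n in (1, 5, 10):   # number of independent measurements
    print(f"n = {n:2d}: Bayes error bound 0.5*exp(-n*C) = {0.5 * np.exp(-n * C):.3e}")
```

    For equal priors, the Bayes error after n independent measurements is bounded by P_e ≤ (1/2)e^(-nC), which is how tracking time maps to a target error probability.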

  9. Statistical Algorithms for Designing Geophysical Surveys to Detect UXO Target Areas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O'Brien, Robert F.; Carlson, Deborah K.; Gilbert, Richard O.

    2005-07-29

    The U.S. Department of Defense is in the process of assessing and remediating closed, transferred, and transferring military training ranges across the United States. Many of these sites have areas that are known to contain unexploded ordnance (UXO). Other sites or portions of sites are not expected to contain UXO, but some verification of this expectation using geophysical surveys is needed. Many sites are so large that it is often impractical and/or cost prohibitive to perform surveys over 100% of the site. In that case, it is particularly important to be explicit about the performance required of the survey. This article presents the statistical algorithms developed to support the design of geophysical surveys along transects (swaths) to find target areas (TAs) of anomalous geophysical readings that may indicate the presence of UXO. The algorithms described here determine 1) the spacing between transects that should be used for the surveys to achieve a specified probability of traversing the TA, 2) the probability of both traversing and detecting a TA of anomalous geophysical readings when the spatial density of anomalies within the TA is either uniform (unchanging over space) or has a bivariate normal distribution, and 3) the probability that a TA exists when it was not found by surveying along transects. These algorithms have been implemented in the Visual Sample Plan (VSP) software to develop cost-effective transect survey designs that meet performance objectives.
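
    For a circular target area and parallel transects, the first quantity, the probability of traversing the TA at a given transect spacing, has a simple closed form, min(1, 2r/S). A Monte Carlo check under those assumed geometry choices (the circular shape, spacing, and radius are illustrative, not from the article):

```python
import numpy as np

rng = np.random.default_rng(3)

def p_traverse(spacing, radius, trials=100_000):
    """Monte Carlo probability that at least one of a set of parallel
    transect lines, spaced `spacing` apart, crosses a circular target
    area of the given radius, with the TA center placed uniformly at
    random relative to the transect grid."""
    offset = rng.uniform(0.0, spacing, trials)          # center-to-left-transect distance
    d_nearest = np.minimum(offset, spacing - offset)    # distance to nearest transect
    return float(np.mean(d_nearest <= radius))

spacing, radius = 100.0, 20.0
mc = p_traverse(spacing, radius)
analytic = min(1.0, 2 * radius / spacing)
print(f"Monte Carlo = {mc:.3f}, closed form 2r/S = {analytic:.3f}")
```

    Inverting this relation gives the transect spacing that achieves a specified traversal probability, which is the first design quantity the article's algorithms compute.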

  10. Statistical Algorithms for Designing Geophysical Surveys to Detect UXO Target Areas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O'Brien, Robert F.; Carlson, Deborah K.; Gilbert, Richard O.

    2005-07-28

    The U.S. Department of Defense is in the process of assessing and remediating closed, transferred, and transferring military training ranges across the United States. Many of these sites have areas that are known to contain unexploded ordnance (UXO). Other sites or portions of sites are not expected to contain UXO, but some verification of this expectation using geophysical surveys is needed. Many sites are so large that it is often impractical and/or cost prohibitive to perform surveys over 100% of the site. In such cases, it is particularly important to be explicit about the performance required of the surveys. Thismore » article presents the statistical algorithms developed to support the design of geophysical surveys along transects (swaths) to find target areas (TAs) of anomalous geophysical readings that may indicate the presence of UXO. The algorithms described here determine (1) the spacing between transects that should be used for the surveys to achieve a specified probability of traversing the TA, (2) the probability of both traversing and detecting a TA of anomalous geophysical readings when the spatial density of anomalies within the TA is either uniform (unchanging over space) or has a bivariate normal distribution, and (3) the probability that a TA exists when it was not found by surveying along transects. These algorithms have been implemented in the Visual Sample Plan (VSP) software to develop cost-effective transect survey designs that meet performance objectives.« less

  11. The performance analysis of three-dimensional track-before-detect algorithm based on Fisher-Tippett-Gnedenko theorem

    NASA Astrophysics Data System (ADS)

    Cho, Hoonkyung; Chun, Joohwan; Song, Sungchan

    2016-09-01

    Tracking dim moving targets in infrared image sequences in the presence of heavy clutter and noise has recently been under intensive investigation. The track-before-detect (TBD) algorithm, which processes the image sequence over a number of frames before deciding on target track and existence, is known to be especially attractive in very low SNR environments (⩽ 3 dB). In this paper, we briefly present a three-dimensional (3-D) TBD with dynamic programming (TBD-DP) algorithm using multiple IR image sensors. Since the traditional two-dimensional TBD algorithm cannot track and detect targets along the viewing direction, we use 3-D TBD with multiple sensors and rigorously analyze the detection performance (false alarm and detection probabilities) based on the Fisher-Tippett-Gnedenko theorem. The 3-D TBD-DP algorithm, which does not require a separate image registration step, uses the pixel intensity values jointly read off from multiple image frames to compute the merit function required in the DP process. We therefore also establish the relationship between the pixel coordinates of each image frame and the reference coordinates.
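
    The merit-function recursion at the heart of TBD-DP can be illustrated in two dimensions. This toy sketch (not the paper's 3-D multi-sensor algorithm) accumulates pixel intensities along all bounded-velocity paths via dynamic programming:

```python
import numpy as np

def tbd_dp(frames, max_shift=1):
    """Dynamic-programming track-before-detect: for each pixel, accumulate
    intensity along the best admissible (bounded-velocity) path through the
    frame sequence. Thresholding the final merit map declares detections."""
    merit = frames[0].astype(float)
    for frame in frames[1:]:
        best_prev = np.full_like(merit, -np.inf)
        for dy in range(-max_shift, max_shift + 1):
            for dx in range(-max_shift, max_shift + 1):
                shifted = np.roll(np.roll(merit, dy, axis=0), dx, axis=1)
                best_prev = np.maximum(best_prev, shifted)
        merit = frame + best_prev
    return merit

# Toy demo: a unit-intensity target drifting one pixel per frame in a dark scene
frames = [np.zeros((8, 8)) for _ in range(5)]
for k in range(5):
    frames[k][1 + k, 1 + k] = 1.0
merit = tbd_dp(frames)
print(merit.max(), np.unravel_index(np.argmax(merit), merit.shape))  # 5.0 at (5, 5)
```

    Even though no single frame rises above the noise floor in a realistic setting, the path-integrated merit does, which is why TBD tolerates very low SNR.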

  12. Measurement of K to L shell vacancy transfer probabilities for the elements 46≤ Z≤55 by photoionization

    NASA Astrophysics Data System (ADS)

    Şimşek, Ö.; Karagöz, D.; Ertugrul, M.

    2003-10-01

    The K to L shell vacancy transfer probabilities for nine elements in the atomic region 46≤ Z≤55 were determined by measuring the L X-ray yields from targets excited by 5.96 and 59.5 keV photons and using the theoretical K and L shell photoionization cross-sections. The L X-rays from different targets were detected with an Ultra-LEGe detector with very thin polymer window. Present experimental results were compared with the semi empirical values tabulated by Rao et al. [Atomic vacancy distributions product by inner shellionization, Phys. Rev. A 5 (1972) 997-1002] and theoretically calculated values using radiative and radiationless transitions. The radiative transitions of these elements were observed from the relativistic Hartree-Slater model, which was proposed by Scofield [Relativistic Hartree-Slater values for K and L shell X-ray emission rates, At. Data Nucl. Data Tables 14 (1974) 121-137]. The radiationless transitions were observed from the Dirac-Hartree-Slater model, which was proposed by Chen et al. [Relativistic radiationless transition probabilities for atomic K- and L-shells, At. Data Nucl. Data Tables 24 (1979) 13-37]. To the best of our knowledge, these vacancy transfer probabilities are reported for the first time.

  13. Anti-dynamic-crosstalk method for single photon LIDAR detection

    NASA Astrophysics Data System (ADS)

    Zhang, Fan; Liu, Qiang; Gong, Mali; Fu, Xing

    2017-11-01

    With an increasing number of vehicles equipped with light detection and ranging (LIDAR), crosstalk has been identified as a critical and urgent issue for range detection in active collision avoidance. Chaotic pulse position modulation (CPPM) applied to the transmitted pulse train has been shown to prevent crosstalk as well as range ambiguity. However, a static, uniform strategy for the discrimination threshold and the number of accumulated pulses is not valid against crosstalk with a varying number of sources and varying intensity per source. This paper presents an adaptive algorithm to distinguish the target echo from crosstalk of dynamic and unknown intensity in the context of intelligent vehicles. A new strategy is derived from receiver operating characteristic (ROC) curves that accounts for the required probabilities of detection and false alarm in scenarios with varying crosstalk. In the adaptive algorithm, the detected results are compared under the new strategy while both the number of accumulated pulses and the threshold are raised step by step, so that the target echo can be exactly identified from crosstalk of dynamic and unknown intensity. The validity of the algorithm has been verified through experiments with a single photon detector and the time correlated single photon counting (TCSPC) technique, demonstrating a marked drop in the number of shots required to identify the target compared with the static, uniform strategy.
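
    The ROC-based choice of accumulation length and count threshold can be sketched with a simple Bernoulli-trial model of per-pulse hits. The probabilities below are illustrative, not measured values from the paper:

```python
from scipy.stats import binom

def detection_operating_point(p_echo, p_noise, pd_req=0.95, pfa_req=1e-3, n_max=200):
    """Smallest (N accumulated pulses, count threshold k) such that
    P_D >= pd_req and P_FA <= pfa_req, treating per-pulse hits as Bernoulli
    trials with hit probability p_echo (target) or p_noise (crosstalk only)."""
    for n in range(1, n_max + 1):
        for k in range(1, n + 1):
            p_fa = binom.sf(k - 1, n, p_noise)  # P(>= k hits | noise/crosstalk)
            p_d = binom.sf(k - 1, n, p_echo)    # P(>= k hits | target echo)
            if p_fa <= pfa_req and p_d >= pd_req:
                return n, k
    return None

print(detection_operating_point(p_echo=0.6, p_noise=0.05))
```

    The adaptive scheme in the paper effectively walks (N, k) upward until the detected result stabilizes, rather than fixing them in advance.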

  14. Probability theory for 3-layer remote sensing radiative transfer model: univariate case.

    PubMed

    Ben-David, Avishai; Davidson, Charles E

    2012-04-23

    A probability model for a 3-layer radiative transfer model (foreground layer, cloud layer, background layer, and an external source at the end of line of sight) has been developed. The 3-layer model is fundamentally important as the primary physical model in passive infrared remote sensing. The probability model is described by the Johnson family of distributions that are used as a fit for theoretically computed moments of the radiative transfer model. From the Johnson family we use the SU distribution that can address a wide range of skewness and kurtosis values (in addition to addressing the first two moments, mean and variance). In the limit, SU can also describe lognormal and normal distributions. With the probability model one can evaluate the potential for detecting a target (vapor cloud layer), the probability of observing thermal contrast, and evaluate performance (receiver operating characteristics curves) in clutter-noise limited scenarios. This is (to our knowledge) the first probability model for the 3-layer remote sensing geometry that treats all parameters as random variables and includes higher-order statistics. © 2012 Optical Society of America
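
    A small illustration of the distributional machinery: fitting a Johnson SU distribution to a skewed surrogate "radiance" sample (lognormal draws standing in for the 3-layer model output; the authors fit SU to theoretically computed moments rather than to samples):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
# Surrogate skewed "radiance" sample standing in for the 3-layer model output
radiance = rng.lognormal(mean=0.0, sigma=0.5, size=20_000)

# Johnson SU has two shape parameters (a, b) plus location and scale
a, b, loc, scale = stats.johnsonsu.fit(radiance)
fitted = stats.johnsonsu(a, b, loc, scale)

# SU can track both the overall spread and the heavy right tail
print(np.mean(radiance), fitted.mean())
print(stats.skew(radiance), float(fitted.stats(moments="s")))
```

    Because SU spans a wide range of skewness/kurtosis pairs and approaches the lognormal and normal in limits, one family covers the whole range of layer configurations.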

  15. Laser radar system for obstacle avoidance

    NASA Astrophysics Data System (ADS)

    Bers, Karlheinz; Schulz, Karl R.; Armbruster, Walter

    2005-09-01

    The threat of hostile surveillance and weapon systems requires military aircraft to fly under extreme conditions such as low altitude, high speed, poor visibility, and incomplete terrain information. The probability of collision with natural and man-made obstacles during such contour missions is high if detection capability is restricted to conventional vision aids. Forward-looking scanning laser radars, which are built by EADS and are presently being flight tested and evaluated at German proving grounds, provide a possible solution: they have a large field of view, high angular and range resolution, a high pulse repetition rate, and sufficient pulse energy to register returns from objects at distances of military relevance with a high hit-and-detect probability. The development of advanced 3-D scene analysis algorithms has increased the recognition probability and reduced the false alarm rate by using readily recognizable objects such as terrain, poles, pylons, and trees to generate a parametric description of the terrain surface as well as the class, position, orientation, size, and shape of all objects in the scene. The sensor system and the implemented algorithms can also be used for other applications such as terrain following, autonomous obstacle avoidance, and automatic target recognition. This paper describes different 3-D imaging ladar sensors with a common system architecture but components matched to different military applications. Emphasis is placed on an obstacle warning system with a high probability of detecting thin wires, real-time processing of the measured range image data, and obstacle classification and visualization.

  16. QPCR detection of Mucorales DNA in bronchoalveolar lavage fluid to diagnose pulmonary mucormycosis.

    PubMed

    Scherer, Emeline E; Iriart, Xavier; Bellanger, Anne Pauline; Dupont, Damien; Guitard, Juliette; Gabriel, Frederic; Cassaing, Sophie; Charpentier, Eléna; Guenounou, Sarah; Cornet, Murielle; Botterel, Françoise; Rocchi, Steffi; Berceanu, Ana; Millon, Laurence

    2018-06-06

    Early diagnosis and treatment are essential to improving the outcome of mucormycosis. The aim of this retrospective study was to assess the contribution of quantitative PCR detection of Mucorales DNA in bronchoalveolar lavage fluids for early diagnosis of pulmonary mucormycosis. Bronchoalveolar lavage fluids (n=450) from 374 patients with pneumonia and immunosuppressive conditions were analyzed using a combination of 3 quantitative PCR assays targeting the main genera involved in mucormycosis in France (Rhizomucor, Mucor/Rhizopus, Lichtheimia). Among these 374 patients, 24 had at least one bronchoalveolar lavage with a positive PCR; 23/24 patients had radiological criteria for invasive fungal infections according to consensus criteria: 10 patients with probable or proven mucormycosis, and 13 additional patients with other invasive fungal infections (4 probable aspergillosis, 1 proven fusariosis, and 8 possible invasive fungal infections). Only 2/24 patients with a positive PCR on bronchoalveolar lavage had a positive Mucorales culture. PCR was also positive on serum in 17/24 patients; in most cases (15/17), PCR first became positive on serum. However, a positive PCR on bronchoalveolar lavage was the earliest and/or the only biological test revealing mucormycosis in 4 patients with a final diagnosis of probable or proven mucormycosis, 3 patients with probable aspergillosis, and one patient with a possible invasive fungal infection. Mucorales PCR performed on bronchoalveolar lavage could provide additional arguments for earlier administration of Mucorales-directed antifungal therapy, thus improving the outcome of lung mucormycosis. Copyright © 2018 American Society for Microbiology.

  17. Target Tracking Using SePDAF under Ambiguous Angles for Distributed Array Radar

    PubMed Central

    Long, Teng; Zhang, Honggang; Zeng, Tao; Chen, Xinliang; Liu, Quanhua; Zheng, Le

    2016-01-01

    Distributed array radar can improve radar detection capability and measurement accuracy. However, it suffers cyclic ambiguity in its angle estimates because, by the spatial Nyquist sampling theorem, the large sparse array is undersampled. Consequently, the state estimation accuracy and track validity probability degrade when the ambiguous angles are used directly for target tracking. This paper proposes a second probability data association filter (SePDAF)-based tracking method for distributed array radar. First, the target motion model and radar measurement model are built. Second, the fused result of each radar's estimation is fed to the extended Kalman filter (EKF) for the first filtering stage. Third, taking this result as prior knowledge and associating it with the array-processed ambiguous angles, the SePDAF is applied to accomplish the second filtering stage, achieving a highly accurate and stable trajectory with relatively low computational complexity. Moreover, the azimuth filtering accuracy improves dramatically and the position filtering accuracy also improves. Finally, simulations illustrate the effectiveness of the proposed method. PMID:27618058
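
    The first filtering stage uses a standard EKF measurement update, which can be sketched generically as follows (textbook form; the paper's SePDAF association logic is layered on top of this):

```python
import numpy as np

def ekf_update(x, P, z, h, H, R):
    """One EKF measurement update: state x, covariance P, measurement z with
    model h(x), Jacobian H, and measurement-noise covariance R."""
    y = z - h(x)                        # innovation
    S = H @ P @ H.T + R                 # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
    x_new = x + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P
    return x_new, P_new

# Toy: position/velocity state, noisy position measurement
x, P = np.array([0.0, 1.0]), np.eye(2)
z, H, R = np.array([0.5]), np.array([[1.0, 0.0]]), np.array([[0.1]])
x_new, P_new = ekf_update(x, P, z, lambda s: s[:1], H, R)
print(x_new, P_new[0, 0])  # position pulled toward 0.5, variance reduced
```

    In the paper's second stage, the association weights over ambiguous angle hypotheses replace the single innovation used here.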

  18. Making sense of the noise: The effect of hydrology on silver carp eDNA detection in the Chicago area waterway system.

    PubMed

    Song, Jeffery W; Small, Mitchell J; Casman, Elizabeth A

    2017-12-15

    Environmental DNA (eDNA) sampling is an emerging tool for monitoring the spread of aquatic invasive species. One confounding factor when interpreting eDNA sampling evidence is that eDNA can be present in the water in the absence of living target organisms, originating from excreta, dead tissue, boats, sewage effluent, etc. In the Chicago Area Waterway System (CAWS), electric fish dispersal barriers were built to prevent non-native Asian carp species from invading Lake Michigan, and yet Asian carp eDNA has been detected above the barriers sporadically since 2009. In this paper the influence of stream flow characteristics in the CAWS on the probability of invasive Asian carp eDNA detection from 2009 to 2012 was examined. In the CAWS, the direction of stream flow is mostly away from Lake Michigan, though there are infrequent reversals in flow direction towards Lake Michigan during dry spells. We find that the flow reversal volume into the lake has a statistically significant positive relationship with eDNA detection probability, while other covariates, such as gage height, precipitation, season, water temperature, dissolved oxygen concentration, pH, and chlorophyll concentration, do not. This suggests that stream flow direction strongly influences eDNA detection in the CAWS and should be considered when interpreting eDNA evidence. We also find that a beta-binomial regression model provides a stronger fit to eDNA detection probability than a binomial regression model. This paper provides a statistical modeling framework for interpreting eDNA sampling evidence and for evaluating covariates that influence eDNA detection. Copyright © 2017 Elsevier B.V. All rights reserved.
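
    The model comparison in the last point can be sketched as follows, with hypothetical sampling events (n samples, x positive detections) exhibiting the kind of overdispersion that favors the beta-binomial:

```python
import numpy as np
from scipy.stats import binom, betabinom
from scipy.optimize import minimize

# Hypothetical events: (eDNA samples taken, samples testing positive)
events = [(60, 2), (60, 0), (60, 17), (60, 1), (60, 9), (60, 0)]
n = np.array([e[0] for e in events])
x = np.array([e[1] for e in events])

# Binomial model: one shared detection probability across all events
p_hat = x.sum() / n.sum()
ll_binom = binom.logpmf(x, n, p_hat).sum()

# Beta-binomial model: detection probability varies between events
def nll(theta):
    a, b = np.exp(theta)  # log-parameterization keeps a, b positive
    return -betabinom.logpmf(x, n, a, b).sum()

res = minimize(nll, x0=[0.0, 1.0], method="Nelder-Mead")
ll_bb = -res.fun

print(ll_binom, ll_bb)  # the overdispersed counts favor the beta-binomial
```

    Because the binomial is a limiting case of the beta-binomial, the extra dispersion parameter can only help the fit; formal comparison would use AIC or a likelihood-ratio-style criterion as in the paper's regression setting.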

  19. Prioritizing molecular markers to test for in the initial workup of advanced non-small cell lung cancer: wants versus needs.

    PubMed

    West, Howard

    2017-09-01

    The current standard of care for molecular marker testing in patients with advanced non-small cell lung cancer (NSCLC) has been evolving over several years and is a product of the quality of the evidence supporting a targeted therapy for a specific molecular marker, the pre-test probability of that marker in the population, and the magnitude of benefit seen with that treatment. Among the markers that have one or more matched targeted therapies, only a few are most clearly worthy of prioritizing in the first-line setting, so that they supplant other first-line alternatives, and only in a subset of patients, as defined currently by NSCLC histology. Specifically, this currently includes testing for an activating epidermal growth factor receptor (EGFR) mutation or an anaplastic lymphoma kinase (ALK) or ROS1 rearrangement. This article reviews the history and data supporting the prioritization of these markers in patients with non-squamous NSCLC, a histologically selected population in whom the probability of these markers, combined with the anticipated efficacy of targeted therapies against them, is high enough to favor these treatments in the first-line setting. Reviewing the evidence supporting this very limited core subset of the most valuable molecular markers also clarifies the criteria that other actionable markers must meet to be widely recognized as valuable enough to warrant prioritization in the initial workup of advanced NSCLC.

  20. Effects of shifts in the rate of repetitive stimulation on sustained attention

    NASA Technical Reports Server (NTRS)

    Krulewitz, J. E.; Warm, J. S.; Wohl, T. H.

    1975-01-01

    The effects of shifts in the rate of presentation of repetitive neutral events (background event rate) were studied in a visual vigilance task. Four groups of subjects experienced either a high (21 events/min) or a low (6 events/min) event rate for 20 min and then experienced either the same or the alternate event rate for an additional 40 min. The temporal occurrence of critical target signals was identical for all groups, irrespective of event rate. The density of critical signals was 12 signals/20 min. By the end of the session, shifts in event rate were associated with changes in performance which resembled contrast effects found in other experimental situations in which shift paradigms were used. Relative to constant event rate control conditions, a shift from a low to a high event rate depressed the probability of signal detections, while a shift in the opposite direction enhanced the probability of signal detections.

  1. Estimating the Probability of a Diffusing Target Encountering a Stationary Sensor.

    DTIC Science & Technology

    1985-07-01

    [Scanned report front matter; only fragments are legible.] NPS55-85-013, Naval Postgraduate School, Monterey, California: technical report "Estimating the Probability of a Diffusing Target Encountering a Stationary Sensor."

  2. Detection of multiple airborne targets from multisensor data

    NASA Astrophysics Data System (ADS)

    Foltz, Mark A.; Srivastava, Anuj; Miller, Michael I.; Grenander, Ulf

    1995-08-01

    Previously we presented a jump-diffusion based random sampling algorithm for generating conditional mean estimates of scene representations for the tracking and recognition of maneuvering airborne targets. These representations include target positions and orientations along their trajectories and the target type associated with each trajectory. Taking a Bayesian approach, a posterior measure is defined on the parameter space by combining sensor models with a sophisticated prior based on nonlinear airplane dynamics. The jump-diffusion algorithm constructs a Markov process which visits the elements of the parameter space with frequencies proportional to the posterior probability. It comprises both infinitesimal, local search via a sample-path-continuous diffusion transform and larger, global steps through discrete jump moves. The jump moves involve the addition and deletion of elements from the scene configuration or changes in the target type associated with each target trajectory. One such move results in target detection by the addition of a track seed to the inference set. This provides initial track data for the tracking/recognition algorithm to estimate linear graph structures representing tracks using the other jump moves and the diffusion process, as described in our earlier work. Target detection ideally involves a continuous search over a continuum of the observation space. In this work we conclude that for practical implementations the search space must be discretized with lattice granularity comparable to sensor resolution, and discuss how fast Fourier transforms are utilized for efficient calculation of sufficient statistics given our array models. Some results are also presented from our implementation on a networked system including a massively parallel machine architecture and a Silicon Graphics Onyx workstation.

  3. Simple summation rule for optimal fixation selection in visual search.

    PubMed

    Najemnik, Jiri; Geisler, Wilson S

    2009-06-01

    When searching for a known target in a natural texture, practiced humans achieve near-optimal performance compared to a Bayesian ideal searcher constrained with the human map of target detectability across the visual field [Najemnik, J., & Geisler, W. S. (2005). Optimal eye movement strategies in visual search. Nature, 434, 387-391]. To do so, humans must be good at choosing where to fixate during the search [Najemnik, J., & Geisler, W.S. (2008). Eye movement statistics in humans are consistent with an optimal strategy. Journal of Vision, 8(3), 1-14]; however, it seems unlikely that a biological nervous system would implement the computations for Bayesian ideal fixation selection because of their complexity. Here we derive and test a simple heuristic for optimal fixation selection that appears to be a much better candidate for implementation within a biological nervous system. Specifically, we show that the near-optimal fixation location is the maximum of the current posterior probability distribution for target location after the distribution is filtered by (convolved with) the square of the retinotopic target detectability map. We term the model that uses this strategy the entropy limit minimization (ELM) searcher. We show that when constrained with a human-like retinotopic map of target detectability and human search error rates, the ELM searcher performs as well as the Bayesian ideal searcher and produces fixation statistics similar to those of humans.
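
    The ELM rule itself is nearly a one-liner. A minimal sketch with a synthetic posterior and a foveated detectability map (toy inputs, not the human maps used in the paper):

```python
import numpy as np
from scipy.signal import fftconvolve

def elm_next_fixation(posterior, detectability):
    """ELM rule: fixate the maximum of the posterior over target location
    filtered by the square of the retinotopic detectability (d') map."""
    gain = fftconvolve(posterior, detectability ** 2, mode="same")
    return np.unravel_index(np.argmax(gain), gain.shape)

grid = np.linspace(-1, 1, 65)
yy, xx = np.meshgrid(grid, grid, indexing="ij")
detectability = np.exp(-(xx**2 + yy**2) / 0.1)               # d' falls with eccentricity
posterior = np.exp(-((xx - 0.5)**2 + (yy + 0.5)**2) / 0.02)  # belief about target location
posterior /= posterior.sum()

print(elm_next_fixation(posterior, detectability))  # lands near the posterior peak
```

    With a broad, foveally peaked d' map, the filtered maximum typically sits at or near the posterior mode, which is what makes the heuristic cheap yet near-optimal.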

  4. Cognitive versus Software-Assisted Registration: Development of a New Nomogram Predicting Prostate Cancer at MRI-Targeted Biopsies.

    PubMed

    Kaufmann, Sascha; Russo, Giorgio I; Thaiss, Wolfgang; Notohamiprodjo, Mike; Bamberg, Fabian; Bedke, Jens; Morgia, Giuseppe; Nikolaou, Konstantin; Stenzl, Arnulf; Kruck, Stephan

    2018-04-03

    Multiparametric magnetic resonance imaging (mpMRI) is gaining acceptance to guide targeted biopsy (TB) in prostate cancer (PC) diagnosis. We aimed to compare the detection rate of software-assisted fusion TB (SA-TB) versus cognitive fusion TB (COG-TB) for PC and to evaluate potential clinical features for detecting PC and clinically significant PC (csPC) at TB. This was a retrospective cohort study of patients with rising and/or persistently elevated prostate-specific antigen (PSA) undergoing mpMRI followed by either transperineal SA-TB or transrectal COG-TB. A matched-pair analysis showed no differences in clinical or radiological characteristics between the SA-TB and COG-TB groups. Differences in the detection of PC/csPC between groups were analyzed. A multivariable logistic regression model predicting PC at TB was fitted. The model was evaluated using the receiver operating characteristic-derived area under the curve, a goodness-of-fit test, and decision-curve analyses. One hundred ninety-one and 87 patients underwent SA-TB or COG-TB, respectively. The multivariate logistic analysis showed that SA-TB was associated with overall PC (odds ratio [OR], 5.70; P < .01) and PC at TB (OR, 3.00; P < .01) but not with overall csPC (P = .40) or csPC at TB (P = .40). A nomogram predicting PC at TB was constructed using the Prostate Imaging Reporting and Data System version 2.0, age, PSA density, and biopsy technique, showing improved clinical risk prediction against a threshold probability of 10% with a c-index of 0.83. In patients with suspected PC, software-assisted biopsy detects most cancers and outperforms the cognitive approach in targeting magnetic resonance imaging-visible lesions. Furthermore, we introduce a pre-biopsy nomogram for the probability of PC at TB. Copyright © 2018 Elsevier Inc. All rights reserved.

  5. Heterogeneous Defensive Naval Weapon Assignment To Swarming Threats In Real Time

    DTIC Science & Technology

    2016-03-01

    [Scanned thesis fragment; only parameter definitions and part of the priority expressions are legible.] threat_t: damage potential of target t if it hits the ship (integer from 0 to 3); phit_target_t: probability that target t hits the ship; sec_t: number of secondary weapon systems on target t (integer); phit_sec_t: probability that secondary weapon systems launched from target t hit the ship. Equation (3.1) forms a target's priority as the product 10^3 × threat_t × phit_target_t, with an analogous expression for the secondary-weapon priority.

  6. Markov Chain Monte Carlo estimation of species distributions: a case study of the swift fox in western Kansas

    USGS Publications Warehouse

    Sargeant, Glen A.; Sovada, Marsha A.; Slivinski, Christiane C.; Johnson, Douglas H.

    2005-01-01

    Accurate maps of species distributions are essential tools for wildlife research and conservation. Unfortunately, biologists often are forced to rely on maps derived from observed occurrences recorded opportunistically during observation periods of variable length. Spurious inferences are likely to result because such maps are profoundly affected by the duration and intensity of observation and by methods used to delineate distributions, especially when detection is uncertain. We conducted a systematic survey of swift fox (Vulpes velox) distribution in western Kansas, USA, and used Markov chain Monte Carlo (MCMC) image restoration to rectify these problems. During 1997–1999, we searched 355 townships (ca. 93 km²) 1–3 times each for an average cost of $7,315 per year and achieved a detection rate (probability of detecting swift foxes, if present, during a single search) of 0.69 (95% Bayesian confidence interval [BCI] = [0.60, 0.77]). Our analysis produced an estimate of the underlying distribution, rather than a map of observed occurrences, that reflected the uncertainty associated with estimates of model parameters. To evaluate our results, we analyzed simulated data with similar properties. Results of our simulations suggest negligible bias and good precision when probabilities of detection on ≥1 survey occasions (cumulative probabilities of detection) exceed 0.65. Although the use of MCMC image restoration has been limited by theoretical and computational complexities, alternatives do not possess the same advantages. Image models accommodate uncertain detection, do not require spatially independent data or a census of map units, and can be used to estimate species distributions directly from observations without relying on habitat covariates or parameters that must be estimated subjectively. 
These features facilitate economical surveys of large regions, the detection of temporal trends in distribution, and assessments of landscape-level relations between species and habitats. Requirements for the use of MCMC image restoration include study areas that can be partitioned into regular grids of mapping units, spatially contagious species distributions, reliable methods for identifying target species, and cumulative probabilities of detection ≥0.65.
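
    The cumulative detection probability quoted above follows from independence of repeat searches: detecting on at least one of n visits has probability 1 - (1 - p)^n. A quick check with the study's single-search rate:

```python
def cumulative_detection(p_single, n_searches):
    """Probability of at least one detection in n independent searches,
    each with single-search detection probability p_single."""
    return 1.0 - (1.0 - p_single) ** n_searches

# With the study's single-search rate of 0.69, one visit already clears the
# 0.65 benchmark, and three visits make detection nearly certain where present.
for n in (1, 2, 3):
    print(n, round(cumulative_detection(0.69, n), 3))
```

    This is why the 1–3 visits per township sufficed to meet the ≥0.65 requirement for the MCMC image-restoration approach.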

  7. Markov chain Monte Carlo estimation of species distributions: A case study of the swift fox in western Kansas

    USGS Publications Warehouse

    Sargeant, G.A.; Sovada, M.A.; Slivinski, C.C.; Johnson, D.H.

    2005-01-01

    Accurate maps of species distributions are essential tools for wildlife research and conservation. Unfortunately, biologists often are forced to rely on maps derived from observed occurrences recorded opportunistically during observation periods of variable length. Spurious inferences are likely to result because such maps are profoundly affected by the duration and intensity of observation and by methods used to delineate distributions, especially when detection is uncertain. We conducted a systematic survey of swift fox (Vulpes velox) distribution in western Kansas, USA, and used Markov chain Monte Carlo (MCMC) image restoration to rectify these problems. During 1997-1999, we searched 355 townships (ca. 93 km²) 1-3 times each for an average cost of $7,315 per year and achieved a detection rate (probability of detecting swift foxes, if present, during a single search) of 0.69 (95% Bayesian confidence interval [BCI] = [0.60, 0.77]). Our analysis produced an estimate of the underlying distribution, rather than a map of observed occurrences, that reflected the uncertainty associated with estimates of model parameters. To evaluate our results, we analyzed simulated data with similar properties. Results of our simulations suggest negligible bias and good precision when probabilities of detection on ≥1 survey occasions (cumulative probabilities of detection) exceed 0.65. Although the use of MCMC image restoration has been limited by theoretical and computational complexities, alternatives do not possess the same advantages. Image models accommodate uncertain detection, do not require spatially independent data or a census of map units, and can be used to estimate species distributions directly from observations without relying on habitat covariates or parameters that must be estimated subjectively. 
These features facilitate economical surveys of large regions, the detection of temporal trends in distribution, and assessments of landscape-level relations between species and habitats. Requirements for the use of MCMC image restoration include study areas that can be partitioned into regular grids of mapping units, spatially contagious species distributions, reliable methods for identifying target species, and cumulative probabilities of detection ≥0.65.

  8. M Dwarf Variability and Periodicities in Praesepe

    NASA Astrophysics Data System (ADS)

    Hamper, R.; Honeycutt, R. K.

    2018-02-01

    212 M dwarfs in the Praesepe cluster have been monitored photometrically for three observing seasons. It is found that Praesepe M dwarfs earlier than ∼M4 often have significant photometric variations, while variability is not detected for types later than M4. Time series analysis was performed on 147 of the targets having likely variability in order to study possible periodicities. For 83% of these targets, we detected no periodicities; these null results included targets with published photometric periods from earlier work. Our detected periods ranged from 20 to 45 days, and we are not able to confirm any of the 1–5 day periods in Praesepe reported by Schultz et al., which we attribute to the very different observing cadences of the two studies. We conjecture that our more widely spaced data cannot adequately sample the Schultz et al. periodicities before the growth and decay of spots have a chance to ruin the coherence. The new periods we find in the range 20–45 days (in targets that do not overlap with those from Schultz having shorter periods) have very small false alarm probabilities. We argue that rotation is unlikely to be responsible for these 20–45 day periods. Perhaps short activity cycles in the Praesepe M dwarfs play a role in generating such periodicities.
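
    For context on the false-alarm-probability claim, the classical formula for the tallest normalized periodogram peak z with M independent frequencies is 1 - (1 - e^{-z})^M (Scargle 1982). This sketch assumes pure Gaussian noise and hypothetical numbers, not the authors' computation:

```python
import numpy as np

def periodogram_fap(z_peak, n_independent):
    """Classical false-alarm probability for the tallest normalized
    periodogram peak z, given M statistically independent frequencies.
    Assumes Gaussian noise; real spot-modulated light curves need care."""
    return 1.0 - (1.0 - np.exp(-z_peak)) ** n_independent

print(periodogram_fap(z_peak=15.0, n_independent=500))  # ~1.5e-4
```

    For small e^{-z}, the FAP is approximately M·e^{-z}, so even modest peak heights become significant once z exceeds the log of the number of trial frequencies.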

  9. Precision Landing and Hazard Avoidance Domain

    NASA Technical Reports Server (NTRS)

    Robertson, Edward A.; Carson, John M., III

    2016-01-01

    The Precision Landing and Hazard Avoidance (PL&HA) domain addresses the development, integration, testing, and spaceflight infusion of sensing, processing, and GN&C functions critical to the success and safety of future human and robotic exploration missions. PL&HA sensors also have applications to other mission events, such as rendezvous and docking. Autonomous PL&HA builds upon the core GN&C capabilities developed to enable soft, controlled landings on the Moon, Mars, and other solar system bodies. Through the addition of a Terrain Relative Navigation (TRN) function, precision landing within tens of meters of a map-based target is possible. The addition of a 3-D terrain mapping lidar sensor improves the probability of a safe landing via autonomous, real-time Hazard Detection and Avoidance (HDA). PL&HA significantly improves the probability of mission success and enhances access to sites of scientific interest located in challenging terrain. PL&HA can also utilize external navigation aids, such as navigation satellites and surface beacons. Key elements include: advanced lidar sensors for high-precision ranging, velocimetry, and 3-D terrain mapping; TRN, which compares onboard reconnaissance data with real-time terrain imaging data to update the spacecraft position estimate; HDA, which generates a high-resolution 3-D terrain map in real time during the approach trajectory to identify safe landing targets; and inertial navigation during terminal descent, in which high-precision surface-relative sensors enable accurate navigation and a tightly controlled touchdown within meters of the selected safe landing target.

  10. Attack Detection in Sensor Network Target Localization Systems With Quantized Data

    NASA Astrophysics Data System (ADS)

    Zhang, Jiangfan; Wang, Xiaodong; Blum, Rick S.; Kaplan, Lance M.

    2018-04-01

    We consider a sensor network focused on target localization, where sensors measure the signal strength emitted from the target. Each measurement is quantized to one bit and sent to the fusion center. A general attack is considered at some sensors that attempts to cause the fusion center to produce an inaccurate estimate of the target location with a large mean-square error. The attack is a combination of man-in-the-middle, hacking, and spoofing attacks that can effectively change both the signals going into and coming out of the sensor nodes in a realistic manner. We show that the essential effect of attacks is to alter the estimated distance between the target and each attacked sensor to a different extent, giving rise to a geometric inconsistency among the attacked and unattacked sensors. Hence, with the help of two secure sensors, a class of detectors is proposed to detect the attacked sensors by scrutinizing the existence of the geometric inconsistency. We show that the false alarm and miss probabilities of the proposed detectors decrease exponentially as the number of measurement samples increases, which implies that for a sufficiently large number of samples, the proposed detectors can identify the attacked and unattacked sensors with any required accuracy.

  11. Deep brain stimulation abolishes slowing of reactions to unlikely stimuli.

    PubMed

    Antoniades, Chrystalina A; Bogacz, Rafal; Kennard, Christopher; FitzGerald, James J; Aziz, Tipu; Green, Alexander L

    2014-08-13

    The cortico-basal-ganglia circuit plays a critical role in decision making on the basis of probabilistic information. Computational models have suggested how this circuit could compute the probabilities of actions being appropriate according to Bayes' theorem. These models predict that the subthalamic nucleus (STN) provides feedback that normalizes the neural representation of probabilities, such that if the probability of one action increases, the probabilities of all other available actions decrease. Here we report the results of an experiment testing a prediction of this theory that disrupting information processing in the STN with deep brain stimulation should abolish the normalization of the neural representation of probabilities. In our experiment, we asked patients with Parkinson's disease to saccade to a target that could appear in one of two locations, and the probability of the target appearing in each location was periodically changed. When the stimulator was switched off, the target probability affected the reaction times (RT) of patients in a similar way to healthy participants. Specifically, the RTs were shorter for more probable targets and, importantly, they were longer for the unlikely targets. When the stimulator was switched on, the patients were still faster for more probable targets, but critically they did not increase RTs as the target was becoming less likely. This pattern of results is consistent with the prediction of the model that the patients on DBS no longer normalized their neural representation of prior probabilities. We discuss alternative explanations for the data in the context of other published results. Copyright © 2014 the authors.

  12. Unmanned aerial vehicles for surveying marine fauna: assessing detection probability.

    PubMed

    Hodgson, Amanda; Peel, David; Kelly, Natalie

    2017-06-01

    Aerial surveys are conducted for various fauna to assess abundance, distribution, and habitat use over large spatial scales. They are traditionally conducted using light aircraft with observers recording sightings in real time. Unmanned Aerial Vehicles (UAVs) offer an alternative with many potential advantages, including eliminating human risk. To be effective, this emerging platform needs to provide detection rates of animals comparable to traditional methods. UAVs can also acquire new types of information, and these new data require a reevaluation of the traditional analyses used in aerial surveys, including estimating the probability of detecting animals. We conducted 17 replicate UAV surveys of humpback whales (Megaptera novaeangliae) while simultaneously obtaining a 'census' of the population from land-based observations, to assess UAV detection probability. The ScanEagle UAV, carrying a digital SLR camera, continuously captured images (with 75% overlap) along transects covering the visual range of land-based observers. We also used ScanEagle to conduct focal follows of whale pods (n = 12, mean duration = 40 min), to assess a new method of estimating availability. A comparison of the whale detections from the UAV to the land-based census provided an estimated UAV detection probability of 0.33 (CV = 0.25; incorporating both availability and perception biases), which was not affected by environmental covariates (Beaufort sea state, glare, and cloud cover). According to our focal follows, the mean availability was 0.63 (CV = 0.37), with pods including mother/calf pairs having a higher availability (0.86, CV = 0.20) than those without (0.59, CV = 0.38). The follows also revealed (and provided a potential correction for) a downward bias in group size estimates from the UAV surveys, which resulted from asynchronous diving within whale pods and a relatively short observation window of 9 s.
We have shown that UAVs are an effective alternative to traditional methods, providing a detection probability that is within the range of previous studies for our target species. We also describe a method of assessing availability bias that represents spatial and temporal characteristics of a survey, from the same perspective as the survey platform, is benign, and provides additional data on animal behavior. © 2017 by the Ecological Society of America.
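
The arithmetic behind these estimates is a pair of simple ratios; a minimal sketch with counts chosen to reproduce the reported 0.33 and 0.63 (the raw numbers here are illustrative, not the study's data):

```python
# Hypothetical counts illustrating how the land-based census comparison
# yields the UAV detection probability, and how the focal-follow
# availability estimate partitions it into availability and perception.
census = 100          # pods known present from the land-based census
uav_detections = 33   # of those pods found in the UAV imagery
p_detect = uav_detections / census      # availability x perception biases

availability = 0.63   # fraction of time pods were at the surface
p_perception = p_detect / availability  # chance of seeing an available pod
print(round(p_detect, 2), round(p_perception, 2))
```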

  13. Chemical Selectivity and Sensitivity of a 16-Channel Electronic Nose for Trace Vapour Detection

    PubMed Central

    Strle, Drago; Trifkovič, Mario; Van Miden, Marion; Kvasić, Ivan; Zupanič, Erik; Muševič, Igor

    2017-01-01

    Good chemical selectivity of sensors for detecting vapour traces of targeted molecules is vital to reliable detection systems for explosives and other harmful materials. We present the design, construction and measurements of the electronic response of a 16-channel electronic nose based on 16 differential microcapacitors, which were surface-functionalized by different silanes. The e-nose detects less than 1 molecule of TNT out of 10^12 N2 molecules in a carrier gas in 1 s. Differently silanized sensors give different responses to different molecules. Electronic responses are presented for TNT, RDX, DNT, H2S, HCN, FeS, NH3, propane, methanol, acetone, ethanol, methane, toluene and water. We consider the number density of these molecules and find that silane surfaces show extreme affinity for attracting molecules of TNT, DNT and RDX. The probability to bind these molecules and form a surface adsorbate is typically 10^7 times larger than the probability to bind water molecules, for example. We present a matrix of responses of differently functionalized microcapacitors and we propose that the chemical selectivity of a multichannel e-nose could be enhanced by using artificial intelligence deep learning methods. PMID:29292764

  14. Real-time, wide-area hyperspectral imaging sensors for standoff detection of explosives and chemical warfare agents

    NASA Astrophysics Data System (ADS)

    Gomer, Nathaniel R.; Tazik, Shawna; Gardner, Charles W.; Nelson, Matthew P.

    2017-05-01

    Hyperspectral imaging (HSI) is a valuable tool for the detection and analysis of targets located within complex backgrounds. HSI can detect threat materials on environmental surfaces, where the concentration of the target of interest is often very low and is typically found within complex scenery. Unfortunately, current generation HSI systems have size, weight, and power limitations that prohibit their use for field-portable and/or real-time applications. Current generation systems commonly provide an inefficient area search rate, require close proximity to the target for screening, and/or are not capable of making real-time measurements. ChemImage Sensor Systems (CISS) is developing a variety of real-time, wide-field hyperspectral imaging systems that utilize shortwave infrared (SWIR) absorption and Raman spectroscopy. SWIR HSI sensors provide wide-area imagery at or near real-time detection speeds. Raman HSI sensors are being developed to overcome two obstacles present in standard Raman detection systems: slow area search rate (due to small laser spot sizes) and lack of eye-safety. SWIR HSI sensors have been integrated into mobile, robot-based platforms and handheld variants for the detection of explosives and chemical warfare agents (CWAs). In addition, the fusion of these two technologies into a single system has shown the feasibility of using both techniques concurrently to provide higher probability of detection and lower false alarm rates. This paper will provide background on Raman and SWIR HSI, discuss the applications for these techniques, and provide an overview of novel CISS HSI sensors focusing on sensor design and detection results.

  15. Methods for threshold determination in multiplexed assays

    DOEpatents

    Tammero, Lance F. Bentley; Dzenitis, John M; Hindson, Benjamin J

    2014-06-24

    Methods for determining threshold values for the signatures included in an assay are described. Each signature enables detection of a target. The methods determine a probability density function of negative samples and a corresponding false positive rate curve. A false positive criterion is established, and the threshold for each signature is determined as the point at which the false positive rate curve intersects that criterion. A method for quantitative analysis and interpretation of assay results, together with a method for determining a desired limit of detection of a signature in an assay, are also described.
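
A minimal sketch of the threshold-selection idea, assuming Gaussian negative-control signals and an empirical treatment of the false positive rate curve (the helper name and numbers are illustrative, not from the patent):

```python
import numpy as np

def threshold_for_fpr(negative_samples, fpr_criterion):
    """Return the signature threshold at which the empirical false positive
    rate curve intersects the chosen criterion (hypothetical helper)."""
    x = np.asarray(negative_samples, dtype=float)
    # The empirical FPR at threshold t is the fraction of negative samples
    # >= t, so the (1 - criterion) quantile is the intersection point.
    return float(np.quantile(x, 1.0 - fpr_criterion))

rng = np.random.default_rng(0)
neg = rng.normal(0.0, 1.0, 100_000)            # negative-control signals
t = threshold_for_fpr(neg, fpr_criterion=0.01)
achieved = float(np.mean(neg >= t))            # should sit near 0.01
print(round(t, 2), round(achieved, 3))
```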

  16. Non-targeted analysis of unexpected food contaminants using LC-HRMS.

    PubMed

    Kunzelmann, Marco; Winter, Martin; Åberg, Magnus; Hellenäs, Karl-Erik; Rosén, Johan

    2018-03-29

    A non-target analysis method for unexpected contaminants in food is described. Many current methods referred to as "non-target" are capable of detecting hundreds or even thousands of contaminants. However, they will typically still miss all other possible contaminants. Instead, a metabolomics approach might be used to obtain "true non-target" analysis. In the present work, such a method was optimized for improved detection capability at low concentrations. The method was evaluated using 19 chemically diverse model compounds spiked into milk samples to mimic unknown contamination. Other milk samples were used as reference samples. All samples were analyzed with UHPLC-TOF-MS (ultra-high-performance liquid chromatography time-of-flight mass spectrometry), using reversed-phase chromatography and electrospray ionization in positive mode. Data evaluation was performed by the software TracMass 2. No target lists of specific compounds were used to search for the contaminants. Instead, the software was used to sort out all features only occurring in the spiked sample data, i.e., the workflow resembled a metabolomics approach. Procedures for chemical identification of peaks were outside the scope of the study. Method, study design, and settings in the software were optimized to minimize manual evaluation and faulty or irrelevant hits and to maximize hit rate of the spiked compounds. A practical detection limit was established at 25 μg/kg. At this concentration, most compounds (17 out of 19) were detected as intact precursor ions, as fragments or as adducts. Only 2 irrelevant hits, probably natural compounds, were obtained. Limitations and possible practical use of the approach are discussed.

  17. Detection of presence of chemical precursors

    NASA Technical Reports Server (NTRS)

    Li, Jing (Inventor); Meyyappan, Meyya (Inventor); Lu, Yijiang (Inventor)

    2009-01-01

    Methods and systems for determining if one or more target molecules are present in a gas, by exposing a functionalized carbon nanostructure (CNS) to the gas and measuring an electrical parameter value EPV(n) associated with each of N CNS sub-arrays. In a first embodiment, a most-probable concentration value C(opt) is estimated, and an error value, depending upon differences between the measured values EPV(n) and corresponding values EPV(n;C(opt)), is computed. If the error value is less than a first error threshold value, the system interprets this as indicating that the target molecule is present in a concentration C ≈ C(opt). A second embodiment uses extensive statistical and vector space analysis to estimate target molecule concentration.

  18. Environmental DNA Marker Development with Sparse Biological Information: A Case Study on Opossum Shrimp (Mysis diluviana).

    PubMed

    Carim, Kellie J; Christianson, Kyle R; McKelvey, Kevin M; Pate, William M; Silver, Douglas B; Johnson, Brett M; Galloway, Benjamin T; Young, Michael K; Schwartz, Michael K

    2016-01-01

    The spread of Mysis diluviana, a small glacial relict crustacean, outside its native range has led to unintended shifts in the composition of native fish communities throughout western North America. As a result, biologists seek accurate methods of determining the presence of M. diluviana, especially at low densities or during the initial stages of an invasion. Environmental DNA (eDNA) provides one solution for detecting M. diluviana, but building eDNA markers that are both sensitive and species-specific is challenging when the distribution and taxonomy of closely related non-target taxa are poorly understood, published genetic data are sparse, and tissue samples are difficult to obtain. To address these issues, we developed a pair of independent eDNA markers to increase the likelihood of a positive detection of M. diluviana when present and reduce the probability of false positive detections from closely related non-target species. Because tissue samples of closely-related and possibly sympatric, non-target taxa could not be obtained, we used synthetic DNA sequences of closely related non-target species to test the specificity of eDNA markers. Both eDNA markers yielded positive detections from five waterbodies where M. diluviana was known to be present, and no detections in five others where this species was thought to be absent. Daytime samples from varying depths in one waterbody occupied by M. diluviana demonstrated that samples near the lake bottom produced 5 to more than 300 times as many eDNA copies as samples taken at other depths, but all samples tested positive regardless of depth.

  19. Pharmacodynamic analysis of ceftriaxone, gatifloxacin, and levofloxacin against Streptococcus pneumoniae with the use of Monte Carlo simulation.

    PubMed

    Frei, Christopher R; Burgess, David S

    2005-09-01

    To evaluate the pharmacodynamics of four intravenous antimicrobial regimens (ceftriaxone 1 g, gatifloxacin 400 mg, levofloxacin 500 mg, and levofloxacin 750 mg, each every 24 hours) against recent Streptococcus pneumoniae isolates. Pharmacodynamic analysis using Monte Carlo simulation. The Surveillance Network (TSN) 2002 database. Streptococcus pneumoniae isolates (7866 isolates) were stratified according to penicillin susceptibilities as follows: susceptible (4593), intermediate (1986), and resistant (1287). Risk analysis software was used to simulate 10,000 patients by integrating published pharmacokinetic parameters, their variability, and minimum inhibitory concentration (MIC) distributions from the TSN database. Probability of target attainment was determined for percentage of time above the MIC (%T > MIC) from 0-100% for ceftriaxone and area under the concentration-time curve (AUC):MIC ratio from 0-150 for the fluoroquinolones. For ceftriaxone, probability of target attainment remained 90% or greater against the three isolate groups until a %T > MIC of 70% or greater, and it remained 90% or greater against susceptible and intermediate isolates over the entire interval (%T > MIC 0-100%). For levofloxacin 500 mg, probability of target attainment was 90% at an AUC:MIC ratio ≤ 30, but the curve declined sharply with further increases in the pharmacodynamic target. Levofloxacin 750 mg achieved a probability of target attainment of 99% at an AUC:MIC ratio ≤ 30; the probability remained approximately 90% until a target of 70 or greater, when it declined steeply. Gatifloxacin demonstrated a high probability (99%) of target attainment at an AUC:MIC ratio ≤ 30, and it remained above 90% until a target of 70. Ceftriaxone maintained high probability of target attainment over a broad range of pharmacodynamic targets regardless of penicillin susceptibility (%T > MIC 0-60%).
Levofloxacin 500 mg maintained a high probability of target attainment for AUC:MIC ratios 0-30, whereas levofloxacin 750 mg and gatifloxacin maintained high probability of target attainment for AUC:MIC ratios 0-60. The rate of decline in the pharmacodynamic curve was most pronounced for the two levofloxacin regimens and more gradual for gatifloxacin and ceftriaxone.
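
The simulation loop itself is compact; the sketch below uses placeholder AUC and MIC distributions (not the study's fitted pharmacokinetic parameters or TSN frequencies) to show how a probability-of-target-attainment curve is computed:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10_000  # simulated patients, as in the study

# Hypothetical pharmacokinetic input: daily AUC (mg*h/L) lognormally
# distributed across patients to represent between-patient variability.
auc = rng.lognormal(mean=np.log(40.0), sigma=0.3, size=n)

# Hypothetical MIC distribution (mg/L) with surveillance-style frequencies.
mics = rng.choice([0.25, 0.5, 1.0, 2.0], size=n, p=[0.4, 0.3, 0.2, 0.1])

def pta(target_ratio):
    """Probability of target attainment: fraction of simulated patients
    whose AUC:MIC ratio meets or exceeds the pharmacodynamic target."""
    return float(np.mean(auc / mics >= target_ratio))

for target in (30, 70, 150):
    print(target, round(pta(target), 3))
```

Sweeping the target from 0 to 150 traces out the declining pharmacodynamic curve the abstract describes.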

  20. Influence of gravel mining and other factors on detection probabilities of Coastal Plain fishes in the Mobile River Basin, Alabama

    USGS Publications Warehouse

    Hayer, C.-A.; Irwin, E.R.

    2008-01-01

    We used an information-theoretic approach to examine the variation in detection probabilities for 87 Piedmont and Coastal Plain fishes in relation to instream gravel mining in four Alabama streams of the Mobile River drainage. Biotic and abiotic variables were also included in candidate models. Detection probabilities were heterogeneous across species and varied with habitat type, stream, season, and water quality. Instream gravel mining influenced the variation in detection probabilities for 38% of the species collected, probably because it led to habitat loss and increased sedimentation. Higher detection probabilities were apparent at unmined sites than at mined sites for 78% of the species for which gravel mining was shown to influence detection probabilities, indicating potential negative impacts to these species. Physical and chemical attributes also explained the variation in detection probabilities for many species. These results indicate that anthropogenic impacts can affect detection probabilities for fishes, and such variation should be considered when developing monitoring programs or routine sampling protocols. © 2008 by the American Fisheries Society.

  1. Modeling Disease Vector Occurrence when Detection Is Imperfect: Infestation of Amazonian Palm Trees by Triatomine Bugs at Three Spatial Scales

    PubMed Central

    Abad-Franch, Fernando; Ferraz, Gonçalo; Campos, Ciro; Palomeque, Francisco S.; Grijalva, Mario J.; Aguilar, H. Marcelo; Miles, Michael A.

    2010-01-01

    Background: Failure to detect a disease agent or vector where it actually occurs constitutes a serious drawback in epidemiology. In the pervasive situation where no sampling technique is perfect, the explicit analytical treatment of detection failure becomes a key step in the estimation of epidemiological parameters. We illustrate this approach with a study of Attalea palm tree infestation by Rhodnius spp. (Triatominae), the most important vectors of Chagas disease (CD) in northern South America. Methodology/Principal Findings: The probability of detecting triatomines in infested palms is estimated by repeatedly sampling each palm. This knowledge is used to derive an unbiased estimate of the biologically relevant probability of palm infestation. We combine maximum-likelihood analysis and information-theoretic model selection to test the relationships between environmental covariates and infestation of 298 Amazonian palm trees over three spatial scales: region within Amazonia, landscape, and individual palm. Palm infestation estimates are high (40–60%) across regions, and well above the observed infestation rate (24%). Detection probability is higher (∼0.55 on average) in the richest-soil region than elsewhere (∼0.08). Infestation estimates are similar in forest and rural areas, but lower in urban landscapes. Finally, individual palm covariates (accumulated organic matter and stem height) explain most of the variation in infestation rate. Conclusions/Significance: Individual palm attributes appear to be key drivers of infestation, suggesting that CD surveillance must incorporate local-scale knowledge and that peridomestic palm tree management might help lower transmission risk. Vector populations are probably denser in rich-soil sub-regions, where CD prevalence tends to be higher; this suggests a target for research on broad-scale risk mapping. 
Landscape-scale effects indicate that palm triatomine populations can endure deforestation in rural areas, but become rarer in heavily disturbed urban settings. Our methodological approach has wide application in infectious disease research; by improving eco-epidemiological parameter estimation, it can also significantly strengthen vector surveillance-control strategies. PMID:20209149

  2. Why Waveform Correlation Sometimes Fails

    NASA Astrophysics Data System (ADS)

    Carmichael, J.

    2015-12-01

    Waveform correlation detectors used in explosion monitoring scan noisy geophysical data to test two competing hypotheses: either (1) an amplitude-scaled version of a template waveform is present, or (2) no signal is present at all. In reality, geophysical wavefields that are monitored for explosion signatures include waveforms produced by non-target sources that are partially correlated with the waveform template. Such signals can falsely trigger correlation detectors, particularly at the low thresholds required to monitor for smaller target explosions. This challenge is particularly formidable when monitoring known test sites for seismic disturbances, since uncatalogued natural seismicity is (generally) more prevalent at lower magnitudes and could be mistaken for small explosions. To address these challenges, we identify real examples in which correlation detectors targeting explosions falsely trigger on both site-proximal earthquakes (Figure 1, below) and microseismic "noise". Motivated by these examples, we quantify performance loss when applying these detectors, and re-evaluate the correlation detector's hypothesis test. We thereby derive new detectors from more general hypotheses that admit unknown background seismicity, and apply these to real data. From our treatment, we derive "rules of thumb" for proper template and threshold selection in heavily cluttered signal environments. Last, we answer the question "what is the probability of falsely detecting an earthquake collocated at a test site?", using correlation detectors that include explosion-triggered templates. Figure 1. Top: An eight-channel data stream (black) recorded from an earthquake near a mine. Red markers indicate a detection. Middle: The correlation statistic computed by scanning the template against the data stream at top.
The red line indicates the threshold for event declaration, determined by a false-alarm-on-noise probability constraint, as computed from the signal-absent distribution using the Neyman-Pearson criterion. Bottom: The histogram of the correlation statistic time series (gray) superimposed on the theoretical null distribution (black curve). The line shows the threshold, consistent with a right-tail probability, computed from the black curve.
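
A toy version of the thresholding shown in the figure, with a synthetic template and a Neyman-Pearson threshold fixed from a signal-absent run (template length, amplitudes, and false-alarm level are all hypothetical):

```python
import numpy as np

rng = np.random.default_rng(2)
template = rng.normal(size=64)            # hypothetical waveform template
template /= np.linalg.norm(template)

def corr_stat(data, tpl):
    """Normalized correlation of the template against each data window."""
    n = len(tpl)
    out = np.empty(len(data) - n + 1)
    for i in range(len(out)):
        w = data[i:i + n]
        out[i] = np.dot(w, tpl) / np.linalg.norm(w)
    return out

# A signal-absent (noise-only) run fixes the Neyman-Pearson threshold at a
# chosen false-alarm probability on noise, as in the figure caption.
noise = rng.normal(size=20_000)
threshold = float(np.quantile(corr_stat(noise, template), 1 - 1e-3))

# A scaled copy of the template buried in fresh noise should cross it.
data = rng.normal(size=2_000)
data[1000:1064] += 8.0 * template
detections = np.flatnonzero(corr_stat(data, template) > threshold)
print(len(detections) > 0)
```

Partially correlated non-target waveforms can cross the same threshold, which is exactly the false-trigger mechanism the abstract describes.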

  3. Knowledge-based tracking algorithm

    NASA Astrophysics Data System (ADS)

    Corbeil, Allan F.; Hawkins, Linda J.; Gilgallon, Paul F.

    1990-10-01

    This paper describes the Knowledge-Based Tracking (KBT) algorithm, for which a real-time flight test demonstration was recently conducted at Rome Air Development Center (RADC). In KBT processing, the radar signal in each resolution cell is thresholded at a lower than normal setting to detect low-RCS targets. This lower threshold produces a larger than normal false alarm rate. Therefore, additional signal processing, including spectral filtering, CFAR, and knowledge-based acceptance testing, is performed to eliminate some of the false alarms. TSC's knowledge-based Track-Before-Detect (TBD) algorithm is then applied to the data from each azimuth sector to detect target tracks. In this algorithm, tentative track templates are formed for each threshold crossing and knowledge-based association rules are applied to the range, Doppler, and azimuth measurements from successive scans. Lastly, an M-associations-out-of-N-scans rule is used to declare a detection. This scan-to-scan integration enhances the probability of target detection while maintaining an acceptably low output false alarm rate. For a real-time demonstration of the KBT algorithm, the L-band radar in the Surveillance Laboratory (SL) at RADC was used to illuminate a small Cessna 310 test aircraft. The received radar signal was digitized and processed by an ST-100 Array Processor and VAX computer network in the lab. The ST-100 performed all of the radar signal processing functions, including Moving Target Indicator (MTI) pulse cancelling, FFT Doppler filtering, and CFAR detection. The VAX computers performed the remaining range-Doppler clustering, beamsplitting, and TBD processing functions. The KBT algorithm provided a 9.5 dB improvement relative to single-scan performance, with a nominal real-time delay of less than one second between illumination and display.
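
Assuming independent scans, the benefit of an M-out-of-N scan integration rule can be sketched with a binomial computation (the per-scan probabilities below are illustrative, not flight-test values):

```python
from math import comb

def m_of_n(p, m, n):
    """Probability of at least m threshold crossings in n scans, with each
    crossing an independent event of per-scan probability p."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(m, n + 1))

# Hypothetical per-scan numbers: a weak target crosses the low threshold
# on a single scan with probability 0.6, while a noise cell crosses it
# with probability 0.05.
pd_track = m_of_n(0.6, 3, 5)    # target: a 3-of-5 rule boosts detection
pfa_track = m_of_n(0.05, 3, 5)  # noise: the same rule suppresses false alarms
print(round(pd_track, 3), round(pfa_track, 6))
```

With these placeholder numbers, the per-cell false alarm probability falls from 0.05 to roughly 0.001 while the detection probability rises, which is the trade the KBT scheme exploits.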

  4. Detection and Localization of Subsurface Two-Dimensional Metallic Objects

    NASA Astrophysics Data System (ADS)

    Meschino, S.; Pajewski, L.; Schettini, G.

    2009-04-01

    "Roma Tre" University, Applied Electronics Dept., v. Vasca Navale 84, 00146 Rome, Italy. Non-invasive identification of buried objects in the near-field of a receiver array is a subject of great interest, due to its applications to remote sensing of the earth's subsurface, detection of landmines, pipes, and conduits, archaeological site characterization, and more. In this work, we present a Sub-Array Processing (SAP) approach for the detection and localization of subsurface perfectly-conducting circular cylinders. We consider a plane wave illuminating the region of interest, which is assumed to be a homogeneous, lossless medium of unknown permittivity containing one or more targets. In a first step, we partition the receiver array so that the field scattered from the targets is locally plane at each sub-array. Then, we apply a Direction of Arrival (DOA) technique to obtain a set of angles for each locally plane wave, and triangulate these directions to obtain a collection of crossings crowding in the expected object locations [1]. We compare several DOA algorithms, such as traditional Bartlett and Capon beamforming, the Pisarenko Harmonic Decomposition (PHD), the Minimum-Norm method, Multiple Signal Classification (MUSIC), and Estimation of Signal Parameters via Rotational Invariance Techniques (ESPRIT) [2]. In a second stage, we develop a statistical Poisson-based model to manage the crossing pattern in order to extract the probable position of the target's centre. In particular, if the crossings are Poisson distributed, it is possible to fit two different distribution parameters [3]. These two parameters define two density rates for the crossings, so that we can first divide the crossing pattern into a number of equal-size windows, identify the windows with low rate parameters (which are probably background windows), and remove them.
In this way we can consider only the high-rate-parameter windows (which most probably locate the target) and extract the centre position of the object. We also consider some related aspects of localization, for example how to obtain a reliable estimate of the soil permittivity and of the cylinder radius. Finally, when multiple objects are present, we refine our localization procedure by performing a cluster analysis of the crossing pattern. In particular, we apply the K-means algorithm to extract the coordinates of the object centroids and the cluster extents. References [1] Şahin A., Miller L., "Object Detection Using High Resolution Near-Field Array Processing", IEEE Trans. on Geoscience and Remote Sensing, vol. 39, no. 1, Jan. 2001, pp. 136-141. [2] Gross F.B., "Smart Antennas for Wireless Communications", McGraw-Hill, 2005. [3] Hoaglin D.C., "A Poissonness Plot", The American Statistician, vol. 34, no. 3, August 1980, pp. 146-149.

  5. A removal model for estimating detection probabilities from point-count surveys

    USGS Publications Warehouse

    Farnsworth, G.L.; Pollock, K.H.; Nichols, J.D.; Simons, T.R.; Hines, J.E.; Sauer, J.R.

    2000-01-01

    We adapted a removal model to estimate detection probability during point count surveys. The model assumes that one factor influencing detection during point counts is the singing frequency of birds. This may be true for surveys recording forest songbirds, where most detections are by sound. The model requires counts to be divided into several time intervals. We used time intervals of 2, 5, and 10 min to develop a maximum-likelihood estimator for the detectability of birds during such surveys. We applied this technique to data from bird surveys conducted in Great Smoky Mountains National Park. We used model selection criteria to identify whether detection probabilities varied among species, throughout the morning, throughout the season, and among different observers. The overall detection probability for all birds was 75%. We found differences in detection probability among species. Species that sing frequently, such as Winter Wren and Acadian Flycatcher, had high detection probabilities (about 90%), and species that call infrequently, such as Pileated Woodpecker, had low detection probability (36%). We also found detection probabilities varied with the time of day for some species (e.g., thrushes) and between observers for other species. This method of estimating detectability during point count surveys offers a promising new approach to using count data to address questions of bird abundance, density, and population trends.
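
A minimal sketch of the removal-model likelihood, assuming a constant per-minute detection rate and hypothetical first-detection counts for the 0-2, 2-5, and 5-10 min intervals (not the Park survey data):

```python
import math

# Counts of birds first detected in the 0-2, 2-5, and 5-10 min intervals
# (hypothetical data for illustration).
counts = [120, 40, 20]
bounds = [(0, 2), (2, 5), (5, 10)]

def log_lik(lam):
    """Conditional multinomial log-likelihood of the first-detection
    interval, given a constant per-minute detection rate lam."""
    p_det = 1 - math.exp(-lam * 10)          # detected at all within 10 min
    ll = 0.0
    for n, (t0, t1) in zip(counts, bounds):
        cell = (math.exp(-lam * t0) - math.exp(-lam * t1)) / p_det
        ll += n * math.log(cell)
    return ll

# Grid-search MLE for the rate, then report overall detectability.
lams = [i / 1000 for i in range(1, 2000)]
lam_hat = max(lams, key=log_lik)
p_hat = 1 - math.exp(-lam_hat * 10)          # overall detection probability
print(round(lam_hat, 3), round(p_hat, 3))
```

Frequent singers imply a large rate and a detectability near one; infrequent callers imply a small rate and a much lower detectability, matching the species contrast in the abstract.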

  6. Isothermal amplification detection of nucleic acids by a double-nicked beacon.

    PubMed

    Shi, Chao; Zhou, Meiling; Pan, Mei; Zhong, Guilin; Ma, Cuiping

    2016-03-01

    Isothermal and rapid amplification detection of nucleic acids is an important technology in environmental monitoring, foodborne pathogen detection, and point-of-care clinical diagnostics. Here we have developed a novel method of isothermal signal amplification for single-stranded DNA (ssDNA) detection. The ssDNA target could be used as an initiator, coupled with a double-nicked molecular beacon, to originate amplification cycles, achieving cascade signal amplification. In addition, the method showed good specificity and strong anti-jamming capability. Overall, it is a one-pot and isothermal strand displacement amplification method without the requirement of a stepwise procedure, which greatly simplifies the experimental procedure and decreases the probability of contamination of samples. With its advantages, the method would be very useful to detect nucleic acids in point-of-care or field use. Copyright © 2015 Elsevier Inc. All rights reserved.

  7. Design of an Acoustic Target Intrusion Detection System Based on Small-Aperture Microphone Array.

    PubMed

    Zu, Xingshui; Guo, Feng; Huang, Jingchang; Zhao, Qin; Liu, Huawei; Li, Baoqing; Yuan, Xiaobing

    2017-03-04

    Automated surveillance of remote locations in a wireless sensor network is dominated by the detection algorithm because actual intrusions in such locations are rare events. Therefore, a detection method with low power consumption is crucial for persistent surveillance to ensure longevity of the sensor networks. A simple and effective two-stage algorithm composed of an energy detector (ED) and a delay detector (DD), with all operations in the time domain, using a small-aperture microphone array (SAMA), is proposed. The algorithm exploits the markedly different propagation velocities of wind noise and sound waves to improve the detection capability of the ED in the surveillance area. Experiments in four different fields with three types of vehicles show that the algorithm is robust to wind noise, with probabilities of detection and false alarm of 96.67% and 2.857%, respectively.
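    The first stage of such a two-stage detector can be sketched as a plain short-time energy test. This is an illustrative sketch only: the threshold and framing are invented, and the delay-detector stage (which exploits the velocity difference to reject wind noise) is omitted.

```python
def energy_detect(frames, threshold):
    """Flag frames whose short-time energy exceeds a threshold.

    frames: iterable of sample sequences (one frame per element).
    A True flag would hand the frame to the second-stage delay
    detector in the full two-stage algorithm.
    """
    return [sum(s * s for s in frame) > threshold for frame in frames]

# Quiet frame vs. loud frame (synthetic samples, illustrative threshold)
flags = energy_detect([[0.01] * 100, [0.5] * 100], threshold=1.0)  # [False, True]
```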

  8. Detection of the Earth with the SETI microwave observing system assumed to be operating out in the galaxy

    NASA Technical Reports Server (NTRS)

    Billingham, J.; Tarter, J.

    1992-01-01

    This paper estimates the maximum range at which radar signals from the Earth could be detected by a search system similar to the NASA Search for Extraterrestrial Intelligence Microwave Observing Project (SETI MOP) assumed to be operating out in the galaxy. Figures are calculated for the Targeted Search, and for the Sky Survey parts of the MOP, both operating, as currently planned, in the second half of the decade of the 1990s. Only the most powerful terrestrial transmitters are considered, namely, the planetary radar at Arecibo in Puerto Rico, and the ballistic missile early warning systems (BMEWS). In each case the probabilities of detection over the life of the MOP are also calculated. The calculation assumes that we are only in the eavesdropping mode. Transmissions intended to be detected by SETI systems are likely to be much stronger and would of course be found with higher probability to a greater range. Also, it is assumed that the transmitting civilization is at the same level of technological evolution as ours on Earth. This is very improbable. If we were to detect another technological civilization, it would, on statistical grounds, be much older than we are and might well have much more powerful transmitters. Both factors would make detection by the NASA MOP a much more likely outcome.

  10. Continuous time wavelet entropy of auditory evoked potentials.

    PubMed

    Cek, M Emre; Ozgoren, Murat; Savaci, F Acar

    2010-01-01

    In this paper, the continuous time wavelet entropy (CTWE) of auditory evoked potentials (AEP) has been characterized by evaluating the relative wavelet energies (RWE) in specified EEG frequency bands. Thus, rapid variations of CTWE due to auditory stimulation could be detected in the post-stimulus time interval. This approach reduces the risk of missing information hidden in short time intervals. Discrete-time and continuous-time wavelet-based wavelet entropy variations were compared on non-target and target AEP data. It was observed that CTWE can be an alternative method for analyzing entropy as a function of time.
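    The entropy computation described here can be illustrated by normalizing band energies into relative wavelet energies and taking their Shannon entropy. The band energies below are placeholders; the actual CTWE evaluates this over sliding post-stimulus windows of continuous wavelet transform coefficients.

```python
import math

def wavelet_entropy(band_energies):
    """Shannon entropy of relative wavelet energies:
    p_j = E_j / sum(E), WE = -sum p_j * ln(p_j)."""
    total = sum(band_energies)
    probs = [e / total for e in band_energies if e > 0]
    return -sum(p * math.log(p) for p in probs)

# Energy spread evenly over 4 bands gives maximal entropy ln(4);
# energy concentrated in a single band gives entropy 0.
we_flat = wavelet_entropy([1.0, 1.0, 1.0, 1.0])
we_peak = wavelet_entropy([1.0, 0.0, 0.0, 0.0])
```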

  11. Method- and species-specific detection probabilities of fish occupancy in Arctic lakes: Implications for design and management

    USGS Publications Warehouse

    Haynes, Trevor B.; Rosenberger, Amanda E.; Lindberg, Mark S.; Whitman, Matthew; Schmutz, Joel A.

    2013-01-01

    Studies examining species occurrence often fail to account for false absences in field sampling. We investigate detection probabilities of five gear types for six fish species in a sample of lakes on the North Slope, Alaska. We used an occupancy modeling approach to provide estimates of detection probabilities for each method. Variation in gear- and species-specific detection probability was considerable. For example, detection probabilities for the fyke net ranged from 0.82 (SE = 0.05) for least cisco (Coregonus sardinella) to 0.04 (SE = 0.01) for slimy sculpin (Cottus cognatus). Detection probabilities were also affected by site-specific variables such as depth of the lake, year, day of sampling, and lake connection to a stream. With the exception of the dip net and shore minnow traps, each gear type provided the highest detection probability of at least one species. Results suggest that a multimethod approach may be most effective when attempting to sample the entire fish community of Arctic lakes. Detection probability estimates will be useful for designing optimal fish sampling and monitoring protocols in Arctic lakes.
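    The case for a multimethod design follows from simple probability: if gear types detect a present species independently (an idealization), the species is missed only when every method misses it. The gear probabilities in the usage line are invented, not the study's estimates.

```python
def combined_detection(p_by_gear):
    """P(detected by at least one method), assuming independent methods."""
    p_miss = 1.0
    for p in p_by_gear:
        p_miss *= 1.0 - p
    return 1.0 - p_miss

# One strong gear plus two weaker ones beats any single gear alone
p_multi = combined_detection([0.82, 0.30, 0.15])  # ≈ 0.893
```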

  12. Pharmaceuticals and consumer products in four wastewater treatment plants in urban and suburb areas of Shanghai.

    PubMed

    Sui, Qian; Wang, Dan; Zhao, Wentao; Huang, Jun; Yu, Gang; Cao, Xuqi; Qiu, Zhaofu; Lu, Shuguang

    2015-04-01

    Ten pharmaceuticals and two consumer products were investigated in four wastewater treatment plants (WWTPs) in Shanghai, China. The concentrations of target compounds in the wastewater influents ranged from below the limit of quantification (LOQ) to 9340 ng/L, with detection frequencies of 31-100%, and removal efficiencies of -82 to 100% were observed in the four WWTPs. Concentrations of most target compounds (i.e., diclofenac, caffeine, metoprolol, sulpiride) in the wastewater influents were around three to eight times higher in urban WWTPs than in suburban ones, probably due to differences in the populations served and their lifestyles. Mean concentrations of target compounds in the wastewater influent generally decreased by 5-76% after rainfall due to dilution of raw sewage by rainwater that infiltrated into the sewer system. In the WWTPs located in the suburban area, the increased influent flow led to a shortened hydraulic retention time (HRT) and decreased removal efficiencies for some compounds. In contrast, rainfall had no significant influence on the removal efficiencies of the investigated compounds in urban WWTPs, probably due to the almost unchanged influent flow, good removal performance, or the bypass system employed.

  13. A Mobile Decision Aid for Determining Detection Probabilities for Acoustic Targets

    DTIC Science & Technology

    2002-08-01

  14. European Scientific Notes. Volume 36, Number 6,

    DTIC Science & Technology

    1982-06-30

  15. Optimal Sensor-Based Motion Planning for Autonomous Vehicle Teams

    DTIC Science & Technology

    2017-03-01

    For a uniform sensor and a uniform target probability density function, the probability of non-detection equals the fraction of unsearched area. A further goal is maximizing sensor performance in the presence of uncertainty; optimal control provides a useful framework for solving these problems.

  16. Production of isotopes and isomers with irradiation of Z = 47–50 targets by 23-MeV bremsstrahlung

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Karamian, S. A., E-mail: karamian@nrmail.jinr.ru; Carroll, J. J.; Aksenov, N. V.

    2015-09-15

    The irradiations of Ag to Sn targets by bremsstrahlung generated with 23-MeV electron beams were performed at the MT-25 microtron. Gamma spectra of the induced activities have been measured, and the yields of all detected radionuclides and isomers were carefully analyzed. A regular dependence of yields on reaction threshold is confirmed. Many isomers are detected, and suppression of the production probability is observed with growing product spin. Peculiarities of the isomer-to-ground-state ratios were deduced for the 106mAg, 108mAg, 113mIn, 115mIn, and 123mSn isomers. The production of nuclides such as 108mAg, 115mIn, 117gIn, and 113mCd is of interest for applications, especially when economical production methods are available.

  17. Effects of antenna orientation on 3-D ground penetrating radar surveys: an archaeological perspective

    NASA Astrophysics Data System (ADS)

    Lualdi, Maurizio; Lombardi, Federico

    2014-02-01

    This paper investigates the impact that the GPR antenna orientation, or survey direction, has on the migrated image resulting from 3-D georadar acquisitions carried out over a heterogeneous and anisotropic subsurface. This feature is related to the directional dependency of wave propagation effects, such as dispersion, absorption, depolarization, and scattering phenomena. We provide proof of this with two field examples, demonstrating that a 3-D survey performed along a single direction can yield weak results in terms of target detection and reconstruction. To overcome this risk, we show the improvements obtained by combining GPR 3-D data acquired along different directions over the same area: an enhancement of target detection probability and the practical advantage for the end user of looking through a single image. Furthermore, we develop a stacking scheme that employs a threshold associated with amplitude comparison to adaptively handle the combination of georadar data volumes.

  18. Characterizing the vibration behavior in crack vicinity in sonic infrared imaging NDE

    NASA Astrophysics Data System (ADS)

    Yu, Qiuye; Obeidat, Omar; Han, Xiaoyan

    2018-04-01

    Sonic Infrared Imaging uses ultrasound excitation and infrared imaging to detect defects in different materials, including metals, metal alloys, and composites. In this NDE technology, the ultrasound excitation applied is typically a short pulse, usually a fraction of a second. The ultrasound causes the opposing surfaces of a crack or defect to rub against each other, resulting in a temperature change with a noticeable increase in infrared radiation. This thermal signal can be captured by an IR camera and used to locate the defect within the target. The probability of detecting defects can be significantly improved when chaotic sound is introduced into the material. This nonlinear coupling between the ultrasound transducer and the target material is an important phenomenon, and understanding it is critical to improving the repeatability and reliability of this technology. In this paper, we present our study on this topic with emphasis on characterizing vibration in the crack vicinity.

  19. Physics-based, Bayesian sequential detection method and system for radioactive contraband

    DOEpatents

    Candy, James V; Axelrod, Michael C; Breitfeller, Eric F; Chambers, David H; Guidry, Brian L; Manatt, Douglas R; Meyer, Alan W; Sale, Kenneth E

    2014-03-18

    A distributed sequential method and system for detecting and identifying radioactive contraband from highly uncertain (noisy) low-count radionuclide measurements, i.e., an event mode sequence (EMS), using a statistical approach based on Bayesian inference and physics-model-based signal processing, in which a radionuclide is represented as a decomposition of monoenergetic sources. For a given photon event of the EMS, the appropriate monoenergy processing channel is determined using a confidence-interval-based discriminator for the energy amplitude and interarrival time, and parameter estimates are used to update a measured probability density function estimate for a target radionuclide. A sequential likelihood ratio test is then used to determine one of two threshold conditions signifying that the EMS is either identified as the target radionuclide or not; if not, the process is repeated for the next sequential photon event of the EMS until one of the two threshold conditions is satisfied.
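    The two-threshold sequential likelihood ratio test at the core of this kind of approach can be sketched as Wald's SPRT. Everything here is a simplified sketch, not the patented pipeline: the thresholds follow from a desired false-alarm rate alpha and miss rate beta, and the per-event log-likelihood ratios would come from the monoenergetic channel processing.

```python
import math

def sprt(llr_stream, alpha=0.01, beta=0.01):
    """Wald's sequential probability ratio test.

    llr_stream yields per-event log-likelihood ratios
    log[p(event | target) / p(event | not target)].
    Returns ('H1', n) when the target hypothesis is accepted after n
    events, ('H0', n) when it is rejected, or ('undecided', n) if the
    stream ends first.
    """
    upper = math.log((1 - beta) / alpha)   # accept H1 (target radionuclide)
    lower = math.log(beta / (1 - alpha))   # accept H0 (not the target)
    s, n = 0.0, 0
    for llr in llr_stream:
        n += 1
        s += llr
        if s >= upper:
            return "H1", n
        if s <= lower:
            return "H0", n
    return "undecided", n
```

    With alpha = beta = 0.01 the thresholds are roughly ±log(99) ≈ ±4.6, so a stream of unit log-likelihood ratios decides after five events.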

  20. Probability effects on stimulus evaluation and response processes

    NASA Technical Reports Server (NTRS)

    Gehring, W. J.; Gratton, G.; Coles, M. G.; Donchin, E.

    1992-01-01

    This study investigated the effects of probability information on response preparation and stimulus evaluation. Eight subjects responded with one hand to the target letter H and with the other to the target letter S. The target letter was surrounded by noise letters that were either the same as or different from the target letter. In 2 conditions, the targets were preceded by a warning stimulus unrelated to the target letter. In 2 other conditions, a warning letter predicted that the same letter or the opposite letter would appear as the imperative stimulus with .80 probability. Correct reaction times were faster and error rates were lower when imperative stimuli confirmed the predictions of the warning stimulus. Probability information affected (a) the preparation of motor responses during the foreperiod, (b) the development of expectancies for a particular target letter, and (c) a process sensitive to the identities of letter stimuli but not to their locations.

  1. UXO detection and identification based on intrinsic target polarizabilities: A case history

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gasperikova, E.; Smith, J.T.; Morrison, H.F.

    2008-07-15

    Electromagnetic induction data parameterized in time-dependent object-intrinsic polarizabilities allow discrimination of unexploded ordnance (UXO) from false targets (scrap metal). Data from a cart-mounted system designed for discrimination of UXO with 20 mm to 155 mm diameters are used. Discrimination of UXO from irregular scrap metal is based on the principal dipole polarizabilities of a target. A near-intact UXO displays a single major polarizability coincident with the long axis of the object and two equal, smaller transverse polarizabilities, whereas metal scraps have distinct polarizability signatures that rarely mimic those of elongated symmetric bodies. Based on a training data set of known targets, object identification was made by estimating the probability that an object is a single UXO. Our test survey took place on a military base where both 4.2-inch mortar shells and scrap metal were present. The results show that we detected and discriminated correctly all 4.2-inch mortars, and in that process added 7% and 17%, respectively, of dry holes (digging scrap) to the total number of excavations in two different survey modes. We also demonstrated a mode of operation that might be more cost-effective than the current practice.

  2. A removal model for estimating detection probabilities from point-count surveys

    USGS Publications Warehouse

    Farnsworth, G.L.; Pollock, K.H.; Nichols, J.D.; Simons, T.R.; Hines, J.E.; Sauer, J.R.

    2002-01-01

    Use of point-count surveys is a popular method for collecting data on abundance and distribution of birds. However, analyses of such data often ignore potential differences in detection probability. We adapted a removal model to directly estimate detection probability during point-count surveys. The model assumes that singing frequency is a major factor influencing probability of detection when birds are surveyed using point counts. This may be appropriate for surveys in which most detections are by sound. The model requires counts to be divided into several time intervals. Point counts are often conducted for 10 min, where the number of birds recorded is divided into those first observed in the first 3 min, the subsequent 2 min, and the last 5 min. We developed a maximum-likelihood estimator for the detectability of birds recorded during counts divided into those intervals. This technique can easily be adapted to point counts divided into intervals of any length. We applied this method to unlimited-radius counts conducted in Great Smoky Mountains National Park. We used model selection criteria to identify whether detection probabilities varied among species, throughout the morning, throughout the season, and among different observers. We found differences in detection probability among species. Species that sing frequently such as Winter Wren (Troglodytes troglodytes) and Acadian Flycatcher (Empidonax virescens) had high detection probabilities (∼90%) and species that call infrequently such as Pileated Woodpecker (Dryocopus pileatus) had low detection probability (36%). We also found detection probabilities varied with the time of day for some species (e.g. thrushes) and between observers for other species. We used the same approach to estimate detection probability and density for a subset of the observations with limited-radius point counts.
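    A minimal sketch of a removal-model likelihood of this kind, assuming a constant per-minute detection rate c, so that a bird is first detected in the interval starting at time T with length t with probability exp(-cT)(1 - exp(-ct)). The counts in the usage line are invented, and the crude ternary search stands in for a proper likelihood maximizer.

```python
import math

def removal_mle(counts, intervals):
    """MLE of the per-minute detection rate c and overall detectability
    for counts of birds first detected in consecutive time intervals,
    conditioned on detection at some point during the count."""
    starts = [sum(intervals[:i]) for i in range(len(intervals))]
    total = float(sum(intervals))

    def loglik(c):
        p_det = 1.0 - math.exp(-c * total)  # P(detected at all)
        return sum(
            n * math.log(math.exp(-c * T) * (1.0 - math.exp(-c * t)) / p_det)
            for n, T, t in zip(counts, starts, intervals))

    lo, hi = 1e-4, 5.0                      # plausible range for c (1/min)
    for _ in range(200):                    # ternary search (unimodal loglik)
        m1, m2 = lo + (hi - lo) / 3, hi - (hi - lo) / 3
        if loglik(m1) < loglik(m2):
            lo = m1
        else:
            hi = m2
    c = (lo + hi) / 2
    return c, 1.0 - math.exp(-c * total)

# Invented counts over the 3-, 2-, and 5-min intervals of a 10-min count
c_hat, detectability = removal_mle([45, 18, 23], [3, 2, 5])
```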

  3. Digital video steganalysis exploiting collusion sensitivity

    NASA Astrophysics Data System (ADS)

    Budhia, Udit; Kundur, Deepa

    2004-09-01

    In this paper we present an effective steganalysis technique for digital video sequences based on the collusion attack. Steganalysis is the process of detecting, with high probability and low complexity, the presence of covert data in multimedia. Existing algorithms for steganalysis target the detection of covert information in still images. When applied directly to video sequences, these approaches are suboptimal. In this paper, we present a method that overcomes this limitation by using redundant information present in the temporal domain to detect covert messages in the form of Gaussian watermarks. Our gains are achieved by exploiting the collusion attack, which has recently been studied in the field of digital video watermarking, and more sophisticated pattern recognition tools. Applications of our scheme include cybersecurity and cyberforensics.

  4. PMHT Approach for Multi-Target Multi-Sensor Sonar Tracking in Clutter.

    PubMed

    Li, Xiaohua; Li, Yaan; Yu, Jing; Chen, Xiao; Dai, Miao

    2015-11-06

    Multi-sensor sonar tracking has many advantages, such as the potential to reduce the overall measurement uncertainty and the possibility to hide the receiver. However, multi-target multi-sensor sonar tracking is challenging because of the complexity of the underwater environment, especially the low target detection probability and the extremely large number of false alarms caused by reverberation. In this work, to solve the problem of multi-target multi-sensor sonar tracking in the presence of clutter, a novel probabilistic multi-hypothesis tracker (PMHT) approach based on the extended Kalman filter (EKF) and unscented Kalman filter (UKF) is proposed. The PMHT can efficiently handle the unknown measurements-to-targets and measurements-to-transmitters data association ambiguity. The EKF and UKF are used to deal with the high degree of nonlinearity in the measurement model. The simulation results show that the proposed algorithm can greatly improve target tracking performance in a cluttered environment, and its computational load is low.

  5. Cost and detection rate of glaucoma screening with imaging devices in a primary care center

    PubMed Central

    Anton, Alfonso; Fallon, Monica; Cots, Francesc; Sebastian, María A; Morilla-Grasa, Antonio; Mojal, Sergi; Castells, Xavier

    2017-01-01

    Purpose To analyze the cost and detection rate of a screening program for detecting glaucoma with imaging devices. Materials and methods In this cross-sectional study, a glaucoma screening program was applied in a population-based sample randomly selected from a population of 23,527. Screening targeted the population at risk of glaucoma. Examinations included optic disk tomography (Heidelberg retina tomograph [HRT]), nerve fiber analysis, and tonometry. Subjects who met at least 2 of 3 endpoints (HRT outside normal limits, nerve fiber index ≥30, or tonometry ≥21 mmHg) were referred for glaucoma consultation. The currently established (“conventional”) detection method was evaluated by recording data from primary care and ophthalmic consultations in the same population. The direct costs of screening and conventional detection were calculated by adding the unit costs generated during the diagnostic process. The detection rate of new glaucoma cases was assessed. Results The screening program evaluated 414 subjects; 32 cases were referred for glaucoma consultation, 7 had glaucoma, and 10 had probable glaucoma. The conventional detection method assessed 677 glaucoma suspects in the population, of whom 29 were diagnosed with glaucoma or probable glaucoma. Glaucoma screening and the conventional detection method had detection rates of 4.1% and 3.1%, respectively, and the cost per case detected was €1,410 and €1,435, respectively. The cost of screening 1 million inhabitants would be 5.1 million euros and would allow the detection of 4,715 new cases. Conclusion The proposed screening method directed at the population at risk allows a detection rate of 4.1% at a cost of €1,410 per case detected. PMID:28243057

  6. Performance of fusion algorithms for computer-aided detection and classification of mines in very shallow water obtained from testing in navy Fleet Battle Exercise-Hotel 2000

    NASA Astrophysics Data System (ADS)

    Ciany, Charles M.; Zurawski, William; Kerfoot, Ian

    2001-10-01

    The performance of Computer Aided Detection/Computer Aided Classification (CAD/CAC) fusion algorithms on side-scan sonar images was evaluated using data taken at the Navy's Fleet Battle Exercise-Hotel held in Panama City, Florida, in August 2000. A 2-of-3 binary fusion algorithm is shown to provide robust performance. The algorithm accepts the classification decisions and associated contact locations from three different CAD/CAC algorithms, clusters the contacts based on Euclidean distance, and then declares a valid target when a clustered contact is declared by at least 2 of the 3 individual algorithms. This simple binary fusion provided a 96 percent probability of correct classification at a false alarm rate of 0.14 false alarms per image per side. This performance represented a 3.8:1 reduction in false alarms over the best performing single CAD/CAC algorithm, with no loss in probability of correct classification.
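    The 2-of-3 fusion rule as described can be sketched directly. The greedy centroid clustering and the cluster radius below are assumptions for illustration, not details reported from the exercise.

```python
import math

def fuse_2_of_3(contacts, radius=10.0):
    """contacts: (algorithm_id, x, y) triples from three CAD/CAC
    algorithms. Contacts within `radius` of a cluster centroid are
    merged; a cluster is declared a valid target when at least 2
    distinct algorithms contributed a contact to it."""
    clusters = []
    for alg, x, y in contacts:
        for cl in clusters:
            cx = sum(p[1] for p in cl) / len(cl)
            cy = sum(p[2] for p in cl) / len(cl)
            if math.hypot(x - cx, y - cy) <= radius:
                cl.append((alg, x, y))
                break
        else:
            clusters.append([(alg, x, y)])
    return [(sum(p[1] for p in cl) / len(cl), sum(p[2] for p in cl) / len(cl))
            for cl in clusters if len({p[0] for p in cl}) >= 2]

# Two algorithms agree near the origin; a lone far-away contact is rejected
targets = fuse_2_of_3([(0, 0.0, 0.0), (1, 1.0, 1.0), (2, 100.0, 100.0)])
```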

  7. Validating An Analytic Completeness Model for Kepler Target Stars Based on Flux-level Transit Injection Experiments

    NASA Astrophysics Data System (ADS)

    Catanzarite, Joseph; Burke, Christopher J.; Li, Jie; Seader, Shawn; Haas, Michael R.; Batalha, Natalie; Henze, Christopher; Christiansen, Jessie; Kepler Project, NASA Advanced Supercomputing Division

    2016-06-01

    The Kepler Mission is developing an Analytic Completeness Model (ACM) to estimate detection completeness contours as a function of exoplanet radius and period for each target star. Accurate completeness contours are necessary for robust estimation of exoplanet occurrence rates. The main components of the ACM for a target star are: detection efficiency as a function of SNR, the window function (WF), and the one-sigma depth function (OSDF) (Ref. Burke et al. 2015). The WF captures the falloff in transit detection probability at long periods that is determined by the observation window (the duration over which the target star has been observed). The OSDF is the transit depth (in parts per million) that yields an SNR of unity for the full transit train. It is a function of period, and accounts for the time-varying properties of the noise and for missing or deweighted data. We are performing flux-level transit injection (FLTI) experiments on selected Kepler target stars with the goal of refining and validating the ACM. “Flux-level” injection machinery inserts exoplanet transit signatures directly into the flux time series, as opposed to “pixel-level” injection, which inserts transit signatures into the individual pixels using the pixel response function. See Jie Li's poster, ID #2493668, “Flux-level transit injection experiments with the NASA Pleiades Supercomputer,” for details, including performance statistics. Since FLTI is affordable for only a small subset of the Kepler targets, the ACM is designed to apply to most Kepler target stars. We validate this model using “deep” FLTI experiments, with ~500,000 injection realizations on each of a small number of targets, and “shallow” FLTI experiments with ~2,000 injection realizations on each of many targets. From the results of these experiments, we identify anomalous targets, model their behavior, and refine the ACM accordingly. In this presentation, we discuss progress in validating and refining the ACM, and we compare our detection efficiency curves with those derived from the associated pixel-level transit injection experiments. Kepler was selected as the 10th mission of the Discovery Program. Funding for this mission is provided by NASA, Science Mission Directorate.

  8. Thermal bioaerosol cloud tracking with Bayesian classification

    NASA Astrophysics Data System (ADS)

    Smith, Christian W.; Dupuis, Julia R.; Schundler, Elizabeth C.; Marinelli, William J.

    2017-05-01

    The development of a wide area, bioaerosol early warning capability employing existing uncooled thermal imaging systems used for persistent perimeter surveillance is discussed. The capability exploits thermal imagers with other available data streams including meteorological data and employs a recursive Bayesian classifier to detect, track, and classify observed thermal objects with attributes consistent with a bioaerosol plume. Target detection is achieved based on similarity to a phenomenological model which predicts the scene-dependent thermal signature of bioaerosol plumes. Change detection in thermal sensor data is combined with local meteorological data to locate targets with the appropriate thermal characteristics. Target motion is tracked utilizing a Kalman filter and nearly constant velocity motion model for cloud state estimation. Track management is performed using a logic-based upkeep system, and data association is accomplished using a combinatorial optimization technique. Bioaerosol threat classification is determined using a recursive Bayesian classifier to quantify the threat probability of each tracked object. The classifier can accept additional inputs from visible imagers, acoustic sensors, and point biological sensors to improve classification confidence. This capability was successfully demonstrated for bioaerosol simulant releases during field testing at Dugway Proving Grounds. Standoff detection at a range of 700m was achieved for as little as 500g of anthrax simulant. Developmental test results will be reviewed for a range of simulant releases, and future development and transition plans for the bioaerosol early warning platform will be discussed.
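    The Kalman filter with a nearly-constant-velocity motion model mentioned above can be sketched as one predict/update cycle. The process- and measurement-noise levels q and r are invented tuning values, not the platform's.

```python
import numpy as np

def cv_kalman_step(x, P, z, dt=1.0, q=0.1, r=1.0):
    """One predict/update cycle of a nearly-constant-velocity Kalman
    filter for a 2-D track state x = [px, py, vx, vy], given a
    measured cloud centroid z = [px, py]."""
    F = np.array([[1., 0., dt, 0.],
                  [0., 1., 0., dt],
                  [0., 0., 1., 0.],
                  [0., 0., 0., 1.]])
    H = np.array([[1., 0., 0., 0.],
                  [0., 1., 0., 0.]])
    Q = q * np.eye(4)                      # process noise (assumed isotropic)
    R = r * np.eye(2)                      # measurement noise (assumed)
    x = F @ x                              # predict state
    P = F @ P @ F.T + Q                    # predict covariance
    y = z - H @ x                          # innovation
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)  # Kalman gain
    x = x + K @ y                          # update state
    P = (np.eye(4) - K @ H) @ P            # update covariance
    return x, P

# Track a cloud drifting at 1 unit/step along x; the velocity estimate
# x[2] converges toward 1.0 after a few noiseless updates
x, P = np.zeros(4), 10.0 * np.eye(4)
for t in range(1, 21):
    x, P = cv_kalman_step(x, P, np.array([float(t), 0.0]))
```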

  9. MO-FG-CAMPUS-TeP2-04: Optimizing for a Specified Target Coverage Probability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fredriksson, A

    2016-06-15

    Purpose: The purpose of this work is to develop a method for inverse planning of radiation therapy margins. When using this method, the user specifies a desired target coverage probability and the system optimizes to meet the demand without any explicit specification of margins to handle setup uncertainty. Methods: The method determines which voxels to include in an optimization function promoting target coverage in order to achieve a specified target coverage probability. Voxels are selected in a way that retains the correlation between them: the target is displaced according to the setup errors and the voxels to include are selected as the union of the displaced target regions under the x% best scenarios according to some quality measure. The quality measure could depend on the dose to the considered structure alone or could depend on the dose to multiple structures in order to take into account correlation between structures. Results: A target coverage function was applied to the CTV of a prostate case with prescription 78 Gy and compared to conventional planning using a DVH function on the PTV. Planning was performed to achieve 90% probability of CTV coverage. The plan optimized using the coverage probability function had P(D98 > 77.95 Gy) = 0.97 for the CTV. The PTV plan using a constraint on minimum DVH 78 Gy at 90% had P(D98 > 77.95 Gy) = 0.44 for the CTV. To match the coverage probability optimization, the DVH volume parameter had to be increased to 97%, which resulted in 0.5 Gy higher average dose to the rectum. Conclusion: Optimizing a target coverage probability is an easily used method to find a margin that achieves the desired coverage probability. It can lead to reduced OAR doses at the same coverage probability compared to planning with margins and DVH functions.
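    The voxel-selection step can be sketched as follows under assumed discretizations: scenarios are rigid integer-voxel target shifts, and the quality measure and scenario fraction in the usage line are placeholders, not the method's actual choices.

```python
def coverage_voxels(target_voxels, scenario_shifts, quality, fraction=0.9):
    """Union of displaced target regions over the best `fraction` of
    setup-error scenarios, ranked by a scenario quality measure."""
    ranked = sorted(scenario_shifts, key=quality, reverse=True)
    best = ranked[:max(1, int(fraction * len(ranked)))]
    selected = set()
    for dx, dy, dz in best:
        selected |= {(x + dx, y + dy, z + dz) for x, y, z in target_voxels}
    return selected

# Hypothetical 1-voxel target and three setup-error shifts, ranked by
# a toy quality measure (smaller displacement = better scenario)
sel = coverage_voxels([(0, 0, 0)], [(0, 0, 0), (1, 0, 0), (5, 5, 5)],
                      quality=lambda s: -(abs(s[0]) + abs(s[1]) + abs(s[2])),
                      fraction=0.67)
```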

  10. Development and test of photon counting lidar

    NASA Astrophysics Data System (ADS)

    Wang, Chun-hui; Wang, Ao-you; Tao, Yu-liang; Li, Xu; Peng, Huan; Meng, Pei-bei

    2018-02-01

    In order to satisfy the application requirements of spaceborne three-dimensional imaging lidar, a prototype of a non-scanning multi-channel lidar based on receiver field-of-view segmentation was designed and developed. A high-repetition-frequency micro-pulse laser, an optical fiber array, and Geiger-mode APDs, in combination with time-correlated single-photon counting technology, were adopted to achieve multi-channel detection. Ranging experiments were carried out outdoors. Under low-echo-photon conditions, target photon counts were time-correlated while noise photon counts were random. Detection probability and range precision versus threshold were characterized, and the range precision improved from 0.44 to 0.11 as the threshold increased from 4 to 8.
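    The detection versus false-alarm trade-off against a photon-count threshold follows from Poisson statistics. The mean counts below are invented for illustration, not the prototype's measurements.

```python
import math

def poisson_tail(mean, k):
    """P(N >= k) for N ~ Poisson(mean): probability the photon count
    reaches the detection threshold k."""
    return 1.0 - sum(math.exp(-mean) * mean**i / math.factorial(i)
                     for i in range(k))

# Raising the threshold cuts false alarms (noise-only counts) sharply
# while costing relatively little detection probability when the
# target return is strong
pd4, pfa4 = poisson_tail(10.0, 4), poisson_tail(1.0, 4)
pd8, pfa8 = poisson_tail(10.0, 8), poisson_tail(1.0, 8)
```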

  11. The relationship between species detection probability and local extinction probability

    USGS Publications Warehouse

    Alpizar-Jara, R.; Nichols, J.D.; Hines, J.E.; Sauer, J.R.; Pollock, K.H.; Rosenberry, C.S.

    2004-01-01

    In community-level ecological studies, generally not all species present in sampled areas are detected. Many authors have proposed the use of estimation methods that allow detection probabilities that are < 1 and that are heterogeneous among species. These methods can also be used to estimate community-dynamic parameters such as species local extinction probability and turnover rates (Nichols et al. Ecol Appl 8:1213-1225; Conserv Biol 12:1390-1398). Here, we present an ad hoc approach to estimating community-level vital rates in the presence of joint heterogeneity of detection probabilities and vital rates. The method consists of partitioning the number of species into two groups using the detection frequencies and then estimating vital rates (e.g., local extinction probabilities) for each group. Estimators from each group are combined in a weighted estimator of vital rates that accounts for the effect of heterogeneity. Using data from the North American Breeding Bird Survey, we computed such estimates and tested the hypothesis that detection probabilities and local extinction probabilities were negatively related. Our analyses support the hypothesis that species detection probability covaries negatively with local probability of extinction and turnover rates. A simulation study was conducted to assess the performance of vital parameter estimators as well as other estimators relevant to questions about heterogeneity, such as coefficient of variation of detection probabilities and proportion of species in each group. Both the weighted estimator suggested in this paper and the original unweighted estimator for local extinction probability performed fairly well and provided no basis for preferring one to the other.
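A minimal sketch of the two-group idea: partition species by detection frequency, estimate a vital rate per group, then combine in a weighted estimator. The detection frequencies, extinction probabilities and species-share weighting below are invented for illustration; the paper's estimator details differ:

```python
import numpy as np

# Hypothetical per-species data: detection frequency and estimated local
# extinction probability (values are illustrative, not from the paper).
det_freq = np.array([0.9, 0.8, 0.75, 0.3, 0.2, 0.15])
ext_prob = np.array([0.05, 0.08, 0.10, 0.30, 0.35, 0.40])

# Partition species into high- and low-detectability groups.
high = det_freq >= 0.5
n_high, n_low = high.sum(), (~high).sum()

# Weighted estimator: group means weighted by the share of species per group.
eps_hat = (n_high * ext_prob[high].mean()
           + n_low * ext_prob[~high].mean()) / len(det_freq)
print(round(eps_hat, 4))
```

With these invented numbers the poorly detected group dominates the estimate less than its raw extinction probabilities would suggest, which is the sense in which weighting accounts for heterogeneity.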

  12. Population size influences amphibian detection probability: implications for biodiversity monitoring programs.

    PubMed

    Tanadini, Lorenzo G; Schmidt, Benedikt R

    2011-01-01

    Monitoring is an integral part of species conservation. Monitoring programs must take imperfect detection of species into account in order to be reliable. Theory suggests that detection probability may be determined by population size, but this relationship has not yet been assessed empirically. Population size is particularly important because it may induce heterogeneity in detection probability and thereby cause bias in estimates of biodiversity. We used a site occupancy model to analyse data from a volunteer-based amphibian monitoring program to assess how well different variables explain variation in detection probability. An index of population size best explained detection probabilities for four out of six species (to avoid circular reasoning, we used the count of individuals at a previous site visit as an index of current population size). The relationship between the population index and detection probability was positive. Commonly used weather variables best explained detection probabilities for two out of six species. Estimates of site occupancy probabilities differed depending on whether or not the population index was used to model detection probability. The relationship between the population index and detectability has implications for the design of monitoring and species conservation. Most importantly, because small populations are particularly likely to be missed, monitoring programs should be designed so that they are not overlooked. The results also imply that methods cannot be standardized in such a way that detection probabilities are constant. As we have shown here, one can easily account for variation in population size in the analysis of data from long-term monitoring programs by using counts of individuals from surveys at the same site in previous years. Accounting for variation in population size is important because it can affect the results of long-term monitoring programs and ultimately the conservation of imperiled species.
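The suggested use of a prior-visit count as a detectability covariate can be sketched with a logistic detection model; the coefficients and counts below are purely illustrative assumptions, not fitted values from the study:

```python
import numpy as np

def logistic(x):
    return 1.0 / (1.0 + np.exp(-x))

# Hypothetical coefficients: detection probability rises with the count of
# individuals recorded at the previous visit (the population index).
a, b = -2.0, 0.8
prev_counts = np.array([0, 2, 5, 20, 100])
p_detect = logistic(a + b * np.log1p(prev_counts))

for c, p in zip(prev_counts, p_detect):
    print(c, round(float(p), 3))
```

In an occupancy analysis this covariate would enter the detection sub-model, so that small populations are assigned low per-visit detection probability rather than being treated as absent.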

  13. Distinguishing bias from sensitivity effects in multialternative detection tasks.

    PubMed

    Sridharan, Devarajan; Steinmetz, Nicholas A; Moore, Tirin; Knudsen, Eric I

    2014-08-21

    Studies investigating the neural bases of cognitive phenomena increasingly employ multialternative detection tasks that seek to measure the ability to detect a target stimulus or changes in some target feature (e.g., orientation or direction of motion) that could occur at one of many locations. In such tasks, it is essential to distinguish the behavioral and neural correlates of enhanced perceptual sensitivity from those of increased bias for a particular location or choice (choice bias). However, making such a distinction is not possible with established approaches. We present a new signal detection model that decouples the behavioral effects of choice bias from those of perceptual sensitivity in multialternative (change) detection tasks. By formulating the perceptual decision in a multidimensional decision space, our model quantifies the respective contributions of bias and sensitivity to multialternative behavioral choices. With a combination of analytical and numerical approaches, we demonstrate an optimal, one-to-one mapping between model parameters and choice probabilities even for tasks involving arbitrarily large numbers of alternatives. We validated the model with published data from two ternary choice experiments: a target-detection experiment and a length-discrimination experiment. The results of this validation provided novel insights into perceptual processes (sensory noise and competitive interactions) that can accurately and parsimoniously account for observers' behavior in each task. The model will find important application in identifying and interpreting the effects of behavioral manipulations (e.g., cueing attention) or neural perturbations (e.g., stimulation or inactivation) in a variety of multialternative tasks of perception, attention, and decision-making. © 2014 ARVO.

  14. Distinguishing bias from sensitivity effects in multialternative detection tasks

    PubMed Central

    Sridharan, Devarajan; Steinmetz, Nicholas A.; Moore, Tirin; Knudsen, Eric I.

    2014-01-01

    Studies investigating the neural bases of cognitive phenomena increasingly employ multialternative detection tasks that seek to measure the ability to detect a target stimulus or changes in some target feature (e.g., orientation or direction of motion) that could occur at one of many locations. In such tasks, it is essential to distinguish the behavioral and neural correlates of enhanced perceptual sensitivity from those of increased bias for a particular location or choice (choice bias). However, making such a distinction is not possible with established approaches. We present a new signal detection model that decouples the behavioral effects of choice bias from those of perceptual sensitivity in multialternative (change) detection tasks. By formulating the perceptual decision in a multidimensional decision space, our model quantifies the respective contributions of bias and sensitivity to multialternative behavioral choices. With a combination of analytical and numerical approaches, we demonstrate an optimal, one-to-one mapping between model parameters and choice probabilities even for tasks involving arbitrarily large numbers of alternatives. We validated the model with published data from two ternary choice experiments: a target-detection experiment and a length-discrimination experiment. The results of this validation provided novel insights into perceptual processes (sensory noise and competitive interactions) that can accurately and parsimoniously account for observers' behavior in each task. The model will find important application in identifying and interpreting the effects of behavioral manipulations (e.g., cueing attention) or neural perturbations (e.g., stimulation or inactivation) in a variety of multialternative tasks of perception, attention, and decision-making. PMID:25146574
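The flavour of a multidimensional decision rule for multialternative (change) detection can be simulated directly. This max-rule sketch with invented d' and criterion values is a simplification for intuition, not the authors' model: it shows how a single criterion separates hits at the signal location from false alarms on no-change trials:

```python
import numpy as np

rng = np.random.default_rng(2)
m, trials = 3, 20000
d_prime, criterion = 1.5, 1.0

# Change trials: signal of strength d' at location 0, unit-variance noise at
# every location. Respond the max location if it exceeds the criterion.
evidence = rng.standard_normal((trials, m))
evidence[:, 0] += d_prime
resp = np.where(evidence.max(axis=1) > criterion, evidence.argmax(axis=1), -1)
hit_rate = float(np.mean(resp == 0))

# No-change trials: pure noise; any response above criterion is a false alarm.
noise = rng.standard_normal((trials, m))
fa_rate = float(np.mean(noise.max(axis=1) > criterion))

print(round(hit_rate, 3), round(fa_rate, 3))
```

Shifting the criterion (bias) and shifting d' (sensitivity) move these two rates in distinguishable ways, which is the decoupling the model formalizes.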

  15. Improved relocatable over-the-horizon radar detection and tracking using the maximum likelihood adaptive neural system algorithm

    NASA Astrophysics Data System (ADS)

    Perlovsky, Leonid I.; Webb, Virgil H.; Bradley, Scott R.; Hansen, Christopher A.

    1998-07-01

    An advanced detection and tracking system is being developed for the U.S. Navy's Relocatable Over-the-Horizon Radar (ROTHR) to provide improved tracking performance against small aircraft typically used in drug-smuggling activities. The development is based on the Maximum Likelihood Adaptive Neural System (MLANS), a model-based neural network that combines advantages of neural network and model-based algorithmic approaches. The objective of the MLANS tracker development effort is to address user requirements for increased detection and tracking capability in clutter and improved track position, heading, and speed accuracy. The MLANS tracker is expected to outperform other approaches to detection and tracking for the following reasons. It incorporates adaptive internal models of target return signals, target tracks and maneuvers, and clutter signals, which leads to concurrent clutter suppression, detection, and tracking (track-before-detect). It is not combinatorial and thus does not require any thresholding or peak picking and can track in low signal-to-noise conditions. It incorporates superresolution spectrum estimation techniques exceeding the performance of conventional maximum likelihood and maximum entropy methods. The unique spectrum estimation method is based on the Einsteinian interpretation of the ROTHR received energy spectrum as a probability density of signal frequency. The MLANS neural architecture and learning mechanism are founded on spectrum models and maximization of the "Einsteinian" likelihood, allowing knowledge of the physical behavior of both targets and clutter to be injected into the tracker algorithms. The paper describes the addressed requirements and expected improvements, theoretical foundations, engineering methodology, and results of the development effort to date.

  16. A double-observer approach for estimating detection probability and abundance from point counts

    USGS Publications Warehouse

    Nichols, J.D.; Hines, J.E.; Sauer, J.R.; Fallon, F.W.; Fallon, J.E.; Heglund, P.J.

    2000-01-01

    Although point counts are frequently used in ornithological studies, basic assumptions about detection probabilities often are untested. We apply a double-observer approach developed to estimate detection probabilities for aerial surveys (Cook and Jacobson 1979) to avian point counts. At each point count, a designated 'primary' observer indicates to another ('secondary') observer all birds detected. The secondary observer records all detections of the primary observer as well as any birds not detected by the primary observer. Observers alternate primary and secondary roles during the course of the survey. The approach permits estimation of observer-specific detection probabilities and bird abundance. We developed a set of models that incorporate different assumptions about sources of variation (e.g. observer, bird species) in detection probability. Seventeen field trials were conducted, and models were fit to the resulting data using program SURVIV. Single-observer point counts generally miss varying proportions of the birds actually present, and observer and bird species were found to be relevant sources of variation in detection probabilities. Overall detection probabilities (probability of being detected by at least one of the two observers) estimated using the double-observer approach were very high (>0.95), yielding precise estimates of avian abundance. We consider problems with the approach and recommend possible solutions, including restriction of the approach to fixed-radius counts to reduce the effect of variation in the effective radius of detection among various observers and to provide a basis for using spatial sampling to estimate bird abundance on large areas of interest. We believe that most questions meriting the effort required to carry out point counts also merit serious attempts to estimate detection probabilities associated with the counts. The double-observer approach is a method that can be used for this purpose.
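For intuition, the simpler independent-double-observer (Lincoln-Petersen) estimator shows how the overlap between two observers' detections yields observer-specific detection probabilities and an abundance estimate. The counts are invented, and this is not the SURVIV model set the paper fits:

```python
# Counts from one hypothetical point count:
n1, n2, both = 40, 35, 28  # seen by observer A, by observer B, by both

p_a = both / n2          # P(A detects a bird | B detected it)
p_b = both / n1          # P(B detects a bird | A detected it)
p_any = 1 - (1 - p_a) * (1 - p_b)   # detected by at least one observer
n_hat = n1 * n2 / both   # Lincoln-Petersen abundance estimate

print(round(p_a, 3), round(p_b, 3), round(p_any, 3), round(n_hat, 1))
```

With individually modest detection probabilities (0.7-0.8 here), the combined probability of detection by at least one observer is already high, matching the paper's finding of overall detection probabilities above 0.95.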

  17. Sampling design for long-term regional trends in marine rocky intertidal communities

    USGS Publications Warehouse

    Irvine, Gail V.; Shelley, Alice

    2013-01-01

    Probability-based designs reduce bias and allow inference of results to the pool of sites from which they were chosen. We developed and tested probability-based designs for monitoring marine rocky intertidal assemblages at Glacier Bay National Park and Preserve (GLBA), Alaska. A multilevel design was used that varied in scale and inference. The levels included aerial surveys, extensive sampling of 25 sites, and more intensive sampling of 6 sites. Aerial surveys of a subset of intertidal habitat indicated that the original target habitat of bedrock-dominated sites with slope ≤30° was rare. This unexpected finding illustrated one value of probability-based surveys and led to a shift in the target habitat type to include steeper, more mixed rocky habitat. Subsequently, we evaluated the statistical power of different sampling methods and sampling strategies to detect changes in the abundances of the predominant sessile intertidal taxa: barnacles Balanomorpha, the mussel Mytilus trossulus, and the rockweed Fucus distichus subsp. evanescens. There was greatest power to detect trends in Mytilus and lesser power for barnacles and Fucus. Because of its greater power, the extensive, coarse-grained sampling scheme was adopted in subsequent years over the intensive, fine-grained scheme. The sampling attributes that had the largest effects on power included sampling of “vertical” line transects (vs. horizontal line transects or quadrats) and increasing the number of sites. We also evaluated the power of several management-set parameters. Given equal sampling effort, sampling more sites fewer times had greater power. The information gained through intertidal monitoring is likely to be useful in assessing changes due to climate, including ocean acidification; invasive species; trampling effects; and oil spills.

  18. Modelling detection probabilities to evaluate management and control tools for an invasive species

    USGS Publications Warehouse

    Christy, M.T.; Yackel Adams, A.A.; Rodda, G.H.; Savidge, J.A.; Tyrrell, C.L.

    2010-01-01

    For most ecologists, detection probability (p) is a nuisance variable that must be modelled to estimate the state variable of interest (i.e. survival, abundance, or occupancy). However, in the realm of invasive species control, the rate of detection and removal is the rate-limiting step for management of this pervasive environmental problem. For strategic planning of an eradication (removal of every individual), one must identify the least likely individual to be removed, and determine the probability of removing it. To evaluate visual searching as a control tool for populations of the invasive brown treesnake Boiga irregularis, we designed a mark-recapture study to evaluate detection probability as a function of time, gender, size, body condition, recent detection history, residency status, searcher team and environmental covariates. We evaluated these factors using 654 captures resulting from visual detections of 117 snakes residing in a 5-ha semi-forested enclosure on Guam, fenced to prevent immigration and emigration of snakes but not their prey. Visual detection probability was low overall (0.07 per occasion) but reached 0.18 under optimal circumstances. Our results supported sex-specific differences in detectability that were a quadratic function of size, with both small and large females having lower detection probabilities than males of those sizes. There was strong evidence for individual periodic changes in detectability of a few days' duration, roughly doubling detection probability (comparing peak to non-elevated detections). Snakes in poor body condition had estimated mean detection probabilities greater than snakes with high body condition. Search teams with high average detection rates exhibited detection probabilities about twice those of search teams with low average detection rates. Surveys conducted with bright moonlight and strong wind gusts exhibited moderately decreased probabilities of detecting snakes. Synthesis and applications. By emphasizing and modelling detection probabilities, we now know: (i) that eradication of this species by searching is possible, (ii) how much searching effort would be required, (iii) under what environmental conditions searching would be most efficient, and (iv) several factors that are likely to modulate this quantification when searching is applied to new areas. The same approach can be used for evaluation of any control technology or population monitoring programme. © 2009 The Authors. Journal compilation © 2009 British Ecological Society.
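The eradication-planning logic follows directly from the per-survey detection probability: if the hardest-to-detect individual is found with probability p per search, the number of searches needed to drive the miss probability below a chosen residual risk is a one-liner. The 0.07 and 0.18 detectabilities come from the abstract; the 5% residual-risk level is an assumption for illustration:

```python
import math

# With per-survey detection probability p for the hardest-to-detect individual,
# the chance it is still missed after k surveys is (1 - p)**k.
def surveys_needed(p, residual_risk):
    """Smallest k with (1 - p)**k <= residual_risk."""
    return math.ceil(math.log(residual_risk) / math.log(1.0 - p))

print(surveys_needed(0.07, 0.05))  # baseline detectability  -> 42
print(surveys_needed(0.18, 0.05))  # optimal conditions      -> 16
```

This is why modelling the covariates matters: searching under conditions that raise p from 0.07 to 0.18 cuts the required effort by well over half.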

  19. A fast ellipse extended target PHD filter using box-particle implementation

    NASA Astrophysics Data System (ADS)

    Zhang, Yongquan; Ji, Hongbing; Hu, Qi

    2018-01-01

    This paper presents a box-particle implementation of the ellipse extended target probability hypothesis density (ET-PHD) filter, called the ellipse extended target box-particle PHD (EET-BP-PHD) filter, where the extended targets are described by a Poisson model developed by Gilholm et al. and the term "box" is equivalent to the term "interval" used in interval analysis. The proposed EET-BP-PHD filter is capable of dynamically tracking multiple ellipse extended targets and estimating the target states and the number of targets in the presence of clutter measurements, false alarms and missed detections. To derive the PHD recursion of the EET-BP-PHD filter, a suitable measurement likelihood is defined for a given partitioning cell, and the main implementation steps are presented along with the necessary box approximations and manipulations. The limitations and capabilities of the proposed EET-BP-PHD filter are illustrated by simulation examples. The simulation results show that the box-particle implementation of the ET-PHD filter avoids a high number of particles and reduces the computational burden compared with a conventional particle implementation for extended target tracking.

  20. Optimal directed searches for continuous gravitational waves

    NASA Astrophysics Data System (ADS)

    Ming, Jing; Krishnan, Badri; Papa, Maria Alessandra; Aulbert, Carsten; Fehrmann, Henning

    2016-03-01

    Wide parameter space searches for long-lived continuous gravitational wave signals are computationally limited. It is therefore critically important that the available computational resources are used rationally. In this paper we consider directed searches, i.e., targets for which the sky position is known accurately but the frequency and spin-down parameters are completely unknown. Given a list of such potential astrophysical targets, we therefore need to prioritize. On which target(s) should we spend scarce computing resources? What parameter space region in frequency and spin-down should we search through? Finally, what is the optimal search setup that we should use? In this paper we present a general framework that allows us to solve all three of these problems. This framework is based on maximizing the probability of making a detection subject to a constraint on the maximum available computational cost. We illustrate the method for a simplified problem.
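The constrained-maximization idea (spend a fixed computing budget where it buys the most detection probability) can be sketched as a greedy allocation across candidate targets. The saturating detection-probability curves and per-target parameters below are invented stand-ins, not the paper's framework:

```python
import math

# Toy model: spending c units of computing on target i buys detection
# probability p_i(c) = p_max_i * (1 - exp(-c / s_i)); parameters are invented.
targets = {"A": (0.6, 4.0), "B": (0.9, 10.0), "C": (0.3, 2.0)}

def p_detect(p_max, scale, c):
    return p_max * (1.0 - math.exp(-c / scale))

budget, step = 20, 1
spent = {name: 0 for name in targets}
for _ in range(budget // step):
    # Greedily give the next unit to the target with the largest marginal gain.
    best = max(
        targets,
        key=lambda n: p_detect(*targets[n], spent[n] + step)
                      - p_detect(*targets[n], spent[n]),
    )
    spent[best] += step

print(spent, sum(spent.values()))
```

Because the curves saturate, the optimum generally splits the budget across targets rather than spending it all on the single most promising one.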

  1. Tracking, aiming, and hitting the UAV with ordinary assault rifle

    NASA Astrophysics Data System (ADS)

    Racek, František; Baláž, Teodor; Krejčí, Jaroslav; Procházka, Stanislav; Macko, Martin

    2017-10-01

    The use of small unmanned aerial vehicles (UAVs) is increasing significantly nowadays. They are used as carriers of military spy and reconnaissance devices (taking photos, live video streaming and so on), or as carriers of potentially dangerous cargo (intended for destruction and killing). Both ways of utilizing the UAV create the need to disable it. From the military point of view, disabling the UAV means bringing it down with the weapon of an ordinary soldier, the assault rifle. This task can be challenging for the soldier because he needs to visually detect and identify the target, track it visually, and aim at it. The final success of the soldier's mission depends not only on these visual tasks, but also on the properties of the weapon and ammunition. The paper deals with possible methods of predicting the probability of hitting UAV targets.
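One standard way to predict hit probability against a compact target is the Rayleigh CDF for circular normal shot dispersion; the target radius, dispersion and shot count below are illustrative assumptions, not the paper's ballistic model:

```python
import math

def hit_probability(radius_m, sigma_m, shots=1):
    """Single-shot hit probability from the Rayleigh CDF, assuming a circular
    target and circular normal dispersion centred on the aim point; at least
    one hit in n independent shots follows directly."""
    p1 = 1.0 - math.exp(-radius_m**2 / (2.0 * sigma_m**2))
    return 1.0 - (1.0 - p1) ** shots

# Illustrative numbers only: a 0.3 m effective target, 0.5 m dispersion.
print(round(hit_probability(0.3, 0.5), 3))
print(round(hit_probability(0.3, 0.5, shots=10), 3))
```

The single-shot probability is low, but repeated aimed shots raise the cumulative probability quickly, which is why tracking and sustained aiming dominate the outcome.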

  2. Detecting truly clonal alterations from multi-region profiling of tumours

    PubMed Central

    Werner, Benjamin; Traulsen, Arne; Sottoriva, Andrea; Dingli, David

    2017-01-01

    Modern cancer therapies aim at targeting tumour-specific alterations, such as mutations or neo-antigens, and maximal treatment efficacy requires that targeted alterations are present in all tumour cells. Currently, treatment decisions are based on one or a few samples per tumour, creating uncertainty on whether alterations found in those samples are actually present in all tumour cells. The probability of classifying clonal versus sub-clonal alterations from multi-region profiling of tumours depends on the earliest phylogenetic branching event during tumour growth. By analysing 181 samples from 10 renal carcinoma and 11 colorectal cancers we demonstrate that the information gain from additional sampling falls onto a simple universal curve. We found that in colorectal cancers, 30% of alterations identified as clonal with one biopsy proved sub-clonal when 8 samples were considered. The probability to overestimate clonal alterations fell below 1% in 7/11 patients with 8 samples per tumour. In renal cell carcinoma, 8 samples reduced the list of clonal alterations by 40% with respect to a single biopsy. The probability to overestimate clonal alterations remained as high as 92% in 7/10 renal cancer patients. Furthermore, treatment was associated with more unbalanced tumour phylogenetic trees, suggesting the need of denser sampling of tumours at relapse. PMID:28344344

  3. Detecting truly clonal alterations from multi-region profiling of tumours

    NASA Astrophysics Data System (ADS)

    Werner, Benjamin; Traulsen, Arne; Sottoriva, Andrea; Dingli, David

    2017-03-01

    Modern cancer therapies aim at targeting tumour-specific alterations, such as mutations or neo-antigens, and maximal treatment efficacy requires that targeted alterations are present in all tumour cells. Currently, treatment decisions are based on one or a few samples per tumour, creating uncertainty on whether alterations found in those samples are actually present in all tumour cells. The probability of classifying clonal versus sub-clonal alterations from multi-region profiling of tumours depends on the earliest phylogenetic branching event during tumour growth. By analysing 181 samples from 10 renal carcinoma and 11 colorectal cancers we demonstrate that the information gain from additional sampling falls onto a simple universal curve. We found that in colorectal cancers, 30% of alterations identified as clonal with one biopsy proved sub-clonal when 8 samples were considered. The probability to overestimate clonal alterations fell below 1% in 7/11 patients with 8 samples per tumour. In renal cell carcinoma, 8 samples reduced the list of clonal alterations by 40% with respect to a single biopsy. The probability to overestimate clonal alterations remained as high as 92% in 7/10 renal cancer patients. Furthermore, treatment was associated with more unbalanced tumour phylogenetic trees, suggesting the need of denser sampling of tumours at relapse.
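The effect of additional samples on misclassifying sub-clonal alterations as clonal can be caricatured with a simple independence model: an alteration present in a fraction f of tumour regions is wrongly called clonal when every one of s independently sampled regions happens to carry it. Real tumours violate independence because of spatial phylogenetic structure, which is precisely what the paper models; the f value is invented:

```python
# Toy model: misclassification probability under independent uniform sampling.
def p_false_clonal(f, s):
    """P(all s sampled regions carry an alteration of regional frequency f)."""
    return f ** s

for s in (1, 2, 4, 8):
    print(s, round(p_false_clonal(0.6, s), 4))
```

Even this crude model reproduces the qualitative message: going from one biopsy to several samples collapses the probability of overestimating clonality.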

  4. Distributed Denial of Service Attack Source Detection Using Efficient Traceback Technique (ETT) in Cloud-Assisted Healthcare Environment.

    PubMed

    Latif, Rabia; Abbas, Haider; Latif, Seemab; Masood, Ashraf

    2016-07-01

    Security and privacy are the first and foremost concerns that should be given special attention when dealing with Wireless Body Area Networks (WBANs). As WBAN sensors operate in an unattended environment and carry critical patient health information, the Distributed Denial of Service (DDoS) attack is one of the major attacks in the WBAN environment: it not only exhausts the available resources but also influences the reliability of the information being transmitted. This research work is an extension of our previous work, in which a machine-learning-based attack detection algorithm was proposed to detect DDoS attacks in the WBAN environment. However, in order to avoid complexity, no consideration was given to the traceback mechanism. During traceback, the challenge lies in reconstructing the attack path to identify the attack source. Among existing traceback techniques, the Probabilistic Packet Marking (PPM) approach is the most commonly used in conventional IP-based networks. However, since the marking probability assignment has a significant effect on both the convergence time and the performance of a scheme, PPM is not directly applicable in the WBAN environment due to high convergence time and overhead on intermediate nodes. Therefore, in this paper we propose a new scheme called the Efficient Traceback Technique (ETT), based on the Dynamic Probability Packet Marking (DPPM) approach, which uses the MAC header in place of the IP header. Instead of a fixed marking probability, the proposed scheme uses a variable marking probability based on the number of hops travelled by a packet to reach the target node. Finally, path reconstruction algorithms are proposed to trace back an attacker. Evaluation and simulation results indicate that the proposed solution outperforms fixed PPM in terms of convergence time and computational overhead on nodes.
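The hop-dependent marking idea behind dynamic PPM can be sketched as reservoir-style marking: if the i-th router on the path overwrites the mark with probability 1/i, every router is equally likely to own the surviving mark, so no hop is systematically under-represented. This toy simulation ignores the MAC-header encoding and path reconstruction of the proposed ETT scheme:

```python
import random

random.seed(3)

def forward(path_len):
    """Each router (1-indexed hop count) overwrites the mark with
    probability 1/hop, so the surviving mark is uniform over the path."""
    mark = None
    for hop in range(1, path_len + 1):
        if random.random() < 1.0 / hop:
            mark = hop
    return mark

counts = {h: 0 for h in range(1, 6)}
for _ in range(50000):
    counts[forward(5)] += 1

# Each of the 5 hops should own the mark in roughly 20% of packets.
print(all(abs(c / 50000 - 0.2) < 0.02 for c in counts.values()))
```

Uniform mark ownership is what lets the victim reconstruct the path from fewer marked packets than fixed-probability marking requires, which is the convergence-time advantage the abstract reports.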

  5. Comparing synthetic imagery with real imagery for visible signature analysis: human observer results

    NASA Astrophysics Data System (ADS)

    Culpepper, Joanne B.; Richards, Noel; Madden, Christopher S.; Winter, Neal; Wheaton, Vivienne C.

    2017-10-01

    Synthetic imagery could potentially enhance visible signature analysis by providing a wider range of target images in differing environmental conditions than would be feasible to collect in field trials. Achieving this requires a method for generating synthetic imagery that is both verified to be realistic and produces the same visible signature analysis results as real images. Is target detectability as measured by image metrics the same for real images and synthetic images of the same scene? Is target detectability as measured by human observer trials the same for real images and synthetic images of the same scene, and how realistic do the synthetic images need to be? In this paper we present the results of a small scale exploratory study on the second question: a photosimulation experiment conducted using digital photographs and synthetic images generated of the same scene. Two sets of synthetic images were created: a high fidelity set created using an image generation tool, E-on Vue, and a low fidelity set created using a gaming engine, Unity 3D. The target detection results obtained using digital photographs were compared with those obtained using the two sets of synthetic images. There was a moderate correlation between the high fidelity synthetic image set and the real images in both the probability of correct detection (Pd: PCC = 0.58, SCC = 0.57) and mean search time (MST: PCC = 0.63, SCC = 0.61). There was no correlation between the low fidelity synthetic image set and the real images for the Pd, but a moderate correlation for MST (PCC = 0.67, SCC = 0.55).
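The reported agreement measures are plain Pearson (PCC) and Spearman (SCC) correlations between per-image detection results for the two image sets. With invented probability-of-detection values they can be computed as follows (Spearman is Pearson applied to ranks):

```python
import numpy as np

# Hypothetical per-image detection probabilities from observer trials on the
# same scenes rendered two ways (values invented for illustration).
pd_real = np.array([0.92, 0.55, 0.31, 0.78, 0.12, 0.66])
pd_synth = np.array([0.85, 0.60, 0.40, 0.70, 0.20, 0.58])

def rank(x):
    # ranks 1..n (this toy data has no ties)
    r = np.empty(len(x))
    r[np.argsort(x)] = np.arange(1, len(x) + 1)
    return r

pcc = np.corrcoef(pd_real, pd_synth)[0, 1]
scc = np.corrcoef(rank(pd_real), rank(pd_synth))[0, 1]
print(round(pcc, 2), round(scc, 2))
```

SCC only asks whether the synthetic imagery orders the targets by detectability the same way the real imagery does, which is often the more forgiving (and more relevant) criterion for signature analysis.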

  6. Robust Detection of Rare Species Using Environmental DNA: The Importance of Primer Specificity

    PubMed Central

    Wilcox, Taylor M.; McKelvey, Kevin S.; Young, Michael K.; Jane, Stephen F.; Lowe, Winsor H.; Whiteley, Andrew R.; Schwartz, Michael K.

    2013-01-01

    Environmental DNA (eDNA) is being rapidly adopted as a tool to detect rare animals. Quantitative PCR (qPCR) using probe-based chemistries may represent a particularly powerful tool because of the method's sensitivity, specificity, and potential to quantify target DNA. However, there has been little work understanding the performance of these assays in the presence of closely related, sympatric taxa. If related species cause any cross-amplification or interference, false positives and negatives may be generated. These errors can be disastrous if false positives lead to overestimation of the abundance of an endangered species or if false negatives prevent detection of an invasive species. In this study we test factors that influence the specificity and sensitivity of TaqMan MGB assays using co-occurring, closely related brook trout (Salvelinus fontinalis) and bull trout (S. confluentus) as a case study. We found qPCR to be substantially more sensitive than traditional PCR, with a high probability of detection at concentrations as low as 0.5 target copies/µl. We also found that the number and placement of base pair mismatches between the TaqMan MGB assay and non-target templates was important to target specificity, and that specificity was most influenced by base pair mismatches in the primers, rather than in the probe. We found that insufficient specificity can result in both false positive and false negative results, particularly in the presence of abundant related species. Our results highlight the utility of qPCR as a highly sensitive eDNA tool, and underscore the importance of careful assay design. PMID:23555689

  7. Robust detection of rare species using environmental DNA: the importance of primer specificity.

    PubMed

    Wilcox, Taylor M; McKelvey, Kevin S; Young, Michael K; Jane, Stephen F; Lowe, Winsor H; Whiteley, Andrew R; Schwartz, Michael K

    2013-01-01

    Environmental DNA (eDNA) is being rapidly adopted as a tool to detect rare animals. Quantitative PCR (qPCR) using probe-based chemistries may represent a particularly powerful tool because of the method's sensitivity, specificity, and potential to quantify target DNA. However, there has been little work understanding the performance of these assays in the presence of closely related, sympatric taxa. If related species cause any cross-amplification or interference, false positives and negatives may be generated. These errors can be disastrous if false positives lead to overestimation of the abundance of an endangered species or if false negatives prevent detection of an invasive species. In this study we test factors that influence the specificity and sensitivity of TaqMan MGB assays using co-occurring, closely related brook trout (Salvelinus fontinalis) and bull trout (S. confluentus) as a case study. We found qPCR to be substantially more sensitive than traditional PCR, with a high probability of detection at concentrations as low as 0.5 target copies/µl. We also found that the number and placement of base pair mismatches between the TaqMan MGB assay and non-target templates was important to target specificity, and that specificity was most influenced by base pair mismatches in the primers, rather than in the probe. We found that insufficient specificity can result in both false positive and false negative results, particularly in the presence of abundant related species. Our results highlight the utility of qPCR as a highly sensitive eDNA tool, and underscore the importance of careful assay design.
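A detection floor near 0.5 copies/µl is consistent with simple Poisson partitioning of template into reaction aliquots. This sketch assumes perfect amplification whenever at least one copy is present (sampling limitation only); the 4 µl input volume and replicate count are illustrative assumptions:

```python
import math

def p_detect(copies_per_ul, reaction_volume_ul, replicates=1):
    """Probability that at least one replicate reaction receives >= 1 template
    copy, under Poisson partitioning of template into aliquots."""
    p_one = 1.0 - math.exp(-copies_per_ul * reaction_volume_ul)
    return 1.0 - (1.0 - p_one) ** replicates

# At 0.5 copies/ul with a hypothetical 4 ul template input per reaction:
print(round(p_detect(0.5, 4.0), 3))                 # single reaction
print(round(p_detect(0.5, 4.0, replicates=3), 3))   # three replicates
```

This is one reason eDNA protocols run replicate reactions: at low template concentrations, detection probability is limited by which aliquots happen to contain a copy at all.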

  8. Optimization of a chemical identification algorithm

    NASA Astrophysics Data System (ADS)

    Chyba, Thomas H.; Fisk, Brian; Gunning, Christin; Farley, Kevin; Polizzi, Amber; Baughman, David; Simpson, Steven; Slamani, Mohamed-Adel; Almassy, Robert; Da Re, Ryan; Li, Eunice; MacDonald, Steve; Slamani, Ahmed; Mitchell, Scott A.; Pendell-Jones, Jay; Reed, Timothy L.; Emge, Darren

    2010-04-01

    A procedure to evaluate and optimize the performance of a chemical identification algorithm is presented. The Joint Contaminated Surface Detector (JCSD) employs Raman spectroscopy to detect and identify surface chemical contamination. JCSD measurements of chemical warfare agents, simulants, toxic industrial chemicals, interferents and bare surface backgrounds were made in the laboratory and under realistic field conditions. A test data suite, developed from these measurements, is used to benchmark algorithm performance throughout the improvement process. In any one measurement, one of many possible targets can be present along with interferents and surfaces. The detection results are expressed as a 2-category classification problem so that Receiver Operating Characteristic (ROC) techniques can be applied. The limitations of applying this framework to chemical detection problems are discussed along with means to mitigate them. Algorithmic performance is optimized globally using robust Design of Experiments and Taguchi techniques. These methods require figures of merit to trade off between false alarms and detection probability. Several figures of merit, including the Matthews Correlation Coefficient and the Taguchi Signal-to-Noise Ratio, are compared. Following the optimization of global parameters which govern the algorithm behavior across all target chemicals, ROC techniques are employed to optimize chemical-specific parameters to further improve performance.
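    Once detections are cast as a 2-category classification problem, confusion-matrix figures of merit apply directly. As an illustrative sketch (not the JCSD code; the counts below are invented), the Matthews Correlation Coefficient used in the trade-off can be computed as:

```python
import math

def matthews_corrcoef(tp: int, tn: int, fp: int, fn: int) -> float:
    """MCC = (TP*TN - FP*FN) / sqrt((TP+FP)(TP+FN)(TN+FP)(TN+FN))."""
    num = tp * tn - fp * fn
    den = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return num / den if den else 0.0

# Hypothetical benchmark counts: 90 detections, 5 missed targets,
# 8 false alarms against 89 correctly rejected backgrounds.
print(round(matthews_corrcoef(tp=90, tn=89, fp=8, fn=5), 3))
```

    Unlike raw accuracy, MCC stays informative when target-present and target-absent measurements are imbalanced, which is why it is a common single-number summary for detection benchmarks.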

  9. Using areas of known occupancy to identify sources of variation in detection probability of raptors: taking time lowers replication effort for surveys.

    PubMed

    Murn, Campbell; Holloway, Graham J

    2016-10-01

    Species occurring at low density can be difficult to detect, and if not properly accounted for, imperfect detection will lead to inaccurate estimates of occupancy. Understanding sources of variation in detection probability, and how they can be managed, is a key part of monitoring. We used sightings data of a low-density and elusive raptor (white-headed vulture, Trigonoceps occipitalis) in areas of known occupancy (breeding territories) in a likelihood-based modelling approach to calculate detection probability and the factors affecting it. Because occupancy was known a priori to be 100%, we fixed the model occupancy parameter to 1.0 and focused on identifying sources of variation in detection probability. Using detection histories from 359 territory visits, we assessed nine covariates in 29 candidate models. The model with the highest support indicated that observer speed during a survey, combined with temporal covariates such as time of year and length of time within a territory, had the greatest influence on detection probability. Averaged detection probability was 0.207 (s.e. 0.033); based on this, the mean number of visits required to determine with 95% confidence that white-headed vultures are absent from a breeding area is 13 (95% CI: 9-20). Topographical and habitat covariates contributed little to the best models and had little effect on detection probability. We highlight that the low detection probabilities of some species mean that emphasizing habitat covariates could lead to spurious results in occupancy models that do not also incorporate temporal components. While variation in detection probability is complex and influenced by effects at both temporal and spatial scales, temporal covariates can and should be controlled as part of robust survey methods. Our results emphasize the importance of accounting for detection probability in occupancy studies, particularly during presence/absence studies for species, such as raptors, that are widespread and occur at low densities.
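    The figure of 13 visits follows from the standard repeated-survey calculation: with per-visit detection probability p, the chance of missing a present species on all n visits is (1 - p)^n, and n is chosen to push that below 0.05. A minimal sketch of that arithmetic (not the authors' likelihood code):

```python
import math

def visits_for_absence(p: float, alpha: float = 0.05) -> int:
    """Smallest n with (1 - p)**n <= alpha, i.e. 95% confidence of at
    least one detection if the species is present."""
    return math.ceil(math.log(alpha) / math.log(1.0 - p))

# Mean per-visit detection probability reported for white-headed vultures.
print(visits_for_absence(0.207))  # -> 13, matching the abstract
```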

  10. Probability of detection for bolt hole eddy current in extracted-from-service aircraft wing structures

    NASA Astrophysics Data System (ADS)

    Underhill, P. R.; Uemura, C.; Krause, T. W.

    2018-04-01

    Fatigue cracks are prone to develop around fasteners found in multi-layer aluminum structures on aging aircraft. Bolt hole eddy current (BHEC) is used for detection of cracks from within bolt holes after fastener removal. In support of qualification towards a target a90/95 (detect 90% of cracks of depth a, 95% of the time) of 0.76 mm (0.030"), a preliminary probability of detection (POD) study was performed to identify those parameters whose variation may keep a bolt hole inspection from attaining its goal. Parameters that were examined included variability in lift-off due to probe type, out-of-round holes, holes with diameters too large to permit surface contact of the probe, and mechanical damage to the holes, including burrs. The study examined the POD for BHEC of corner cracks in unfinished fastener holes extracted from service material. Sixty-eight EDM notches were introduced into two specimens of a horizontal stabilizer from a CC-130 Hercules aircraft. The fastener holes were inspected in the unfinished state, simulating potential inspection conditions, by 7 certified inspectors using a manual BHEC setup with an impedance plane display; one additional inspection was conducted using an automated BHEC C-scan apparatus. While the standard detection limit of 1.27 mm (0.050") was achieved, the measured a90/95 of 0.97 mm (0.039") fell short of the 0.76 mm (0.030") target. The work highlighted a number of areas where there was insufficient information to complete the qualification, and consequently a number of recommendations were made. These included: development of a specification for minimum probe requirements; criteria for the condition of the hole to be inspected, including out-of-roundness and the presence of corrosion pits; a statement of the range of hole sizes; and inspection frequency and data display for analysis.
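    For context, a90/95 analyses typically fit a parametric POD-versus-flaw-size curve to hit/miss data and read off the size a90 at which POD reaches 90% (the "/95" adds a 95% lower confidence bound on that size). A hedged sketch with an invented log-odds curve, not the study's fitted model or parameters:

```python
import math

def pod(a: float, b0: float = -4.0, b1: float = 6.0) -> float:
    """Illustrative log-odds POD model: POD(a) = 1/(1+exp(-(b0 + b1*ln a)))."""
    return 1.0 / (1.0 + math.exp(-(b0 + b1 * math.log(a))))

def a90(b0: float = -4.0, b1: float = 6.0) -> float:
    """Invert the curve at POD = 0.9: ln a90 = (logit(0.9) - b0) / b1."""
    return math.exp((math.log(0.9 / 0.1) - b0) / b1)

size = a90()
assert abs(pod(size) - 0.9) < 1e-9  # the inversion is exact
print(round(size, 3))
```

    In a real qualification the coefficients come from a maximum-likelihood fit to the inspectors' hit/miss records, and the confidence bound is taken on the fitted curve rather than the point estimate.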

  11. Milagro Observations of Potential TeV Emitters

    NASA Technical Reports Server (NTRS)

    Abdo, A. A.; Abeysekara, A. U.; Allen, B. T.; Aune, T.; Barber, A. S.; Berley, D.; Braun, J.; Chen, C.; Christopher, G. E.; DeYoung, T.; hide

    2014-01-01

    This paper reports the results from three targeted searches of Milagro TeV sky maps: two extragalactic point source lists and one pulsar source list. The first extragalactic candidate list consists of 709 candidates selected from the Fermi-LAT 2FGL catalog. The second extragalactic candidate list contains 31 candidates selected from the TeVCat source catalog that have been detected by imaging atmospheric Cherenkov telescopes (IACTs). In both extragalactic candidate lists Mkn 421 was the only source detected by Milagro. This paper presents the Milagro TeV flux for Mkn 421 and flux limits for the brighter Fermi-LAT extragalactic sources and for all TeVCat candidates. The pulsar list extends a previously published Milagro targeted search for Galactic sources. With the 32 new gamma-ray pulsars identified in 2FGL, the number of pulsars that are studied by both Fermi-LAT and Milagro is increased to 52. In this sample, we find that the probability of Milagro detecting TeV emission coincident with a pulsar increases with the GeV flux observed by the Fermi-LAT in the energy range from 0.1 GeV to 100 GeV.

  12. Realized detection and capture probabilities for giant gartersnakes (Thamnophis gigas) using modified floating aquatic funnel traps

    USGS Publications Warehouse

    Halstead, Brian J.; Skalos, Shannon M.; Casazza, Michael L.; Wylie, Glenn D.

    2015-01-01

    Detection and capture probabilities for giant gartersnakes (Thamnophis gigas) are very low, and successfully evaluating the effects of variables or experimental treatments on giant gartersnake populations will require greater detection and capture probabilities than those that had been achieved with standard trap designs. Previous research identified important trap modifications that can increase the probability of snakes entering traps and help prevent the escape of captured snakes. The purpose of this study was to quantify detection and capture probabilities obtained using the most successful modification to commercially available traps to date (2015), and examine the ability of realized detection and capture probabilities to achieve benchmark levels of precision in occupancy and capture-mark-recapture studies.

  13. Implications of directed energy for SETI

    NASA Astrophysics Data System (ADS)

    Lubin, Philip

    2016-09-01

    We compute the detectability of directed-energy (DE) sources from distant civilizations that may exist. Recent advances in our own DE technology suggest that our eventual capabilities will radically enhance our capacity to broadcast our presence and hence allow us to ponder the reverse case of detection. We show that DE systems are detectable at vast distances, possibly across the entire horizon, which profoundly alters conceivable search strategies for extra-terrestrial, technologically-advanced civilizations. Even modest searches are extremely effective at detecting or constraining many civilization classes. A single civilization anywhere in our galaxy of comparable technological advancement to our own can be detected with near-unity probability with a cluster of 0.1 m telescopes on Earth. A 1 m class telescope can detect a single civilization anywhere in the Andromeda galaxy. A search strategy is proposed using small Earth-based telescopes to observe 10^12-10^20 stellar and planetary systems. Such observations could address whether there exist other civilizations which are broadcasting with similar or more advanced DE capability. We show that such searches have near-unity probability of detecting comparably advanced civilizations anywhere in our galaxy within a few years, assuming the civilization: (1) adopts a simple "intelligent targeting" beacon strategy; (2) is beaconing at a wavelength we can detect; (3) broadcasts the beacon long enough for the light to reach Earth now. In this blind-beacon, blind-search strategy, the civilization need not know where we are nor do we need to know where they are. The same basic strategy can be extended to extragalactic distances.

  14. A Cooperative Search and Coverage Algorithm with Controllable Revisit and Connectivity Maintenance for Multiple Unmanned Aerial Vehicles.

    PubMed

    Liu, Zhong; Gao, Xiaoguang; Fu, Xiaowei

    2018-05-08

    In this paper, we study a cooperative search and coverage algorithm for a given bounded rectangular region, which contains several unknown stationary targets, by a team of unmanned aerial vehicles (UAVs) with non-ideal sensors and limited communication ranges. Our goal is to minimize the search time while gathering more information about the environment and finding more targets. For this purpose, a novel cooperative search and coverage algorithm with a controllable revisit mechanism is presented. First, cognitive maps representing the environment are constructed, comprising the target probability map (TPM), the uncertainty map (UM), and the digital pheromone map (DPM). We also design a distributed update and fusion scheme for the cognitive maps, which guarantees that every UAV's maps converge to the same map, reflecting the targets' true presence or absence in each cell of the search region. Second, we develop a controllable revisit mechanism based on the DPM. This mechanism can concentrate the UAVs to revisit sub-areas that have a large target probability or high uncertainty. Third, within a distributed receding-horizon optimization framework, a path planning algorithm for multi-UAV cooperative search and coverage is designed. In the path planning algorithm, the movement of the UAVs is restricted by potential fields to meet the requirements of collision avoidance and connectivity maintenance. Moreover, using a minimum spanning tree (MST) topology optimization strategy, we obtain a tradeoff between search coverage enhancement and connectivity maintenance. The feasibility of the proposed algorithm is demonstrated through comparative simulations analyzing the effects of the controllable revisit mechanism and the connectivity maintenance scheme, and the Monte Carlo method is employed to validate the influence of the number of UAVs, the sensing radius, the detection and false alarm probabilities, and the communication range on the proposed algorithm.
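    With a non-ideal sensor, TPM updates of this kind typically reduce to a per-cell Bayes rule driven by the sensor's detection probability p_d and false-alarm probability p_f. A minimal sketch under that standard model (not the paper's exact update or fusion scheme):

```python
def update_tpm_cell(prior: float, detection: bool, pd: float, pf: float) -> float:
    """Posterior probability that a cell contains a target after one look
    by a sensor with detection prob pd and false-alarm prob pf."""
    if detection:
        num = pd * prior
        den = pd * prior + pf * (1.0 - prior)
    else:
        num = (1.0 - pd) * prior
        den = (1.0 - pd) * prior + (1.0 - pf) * (1.0 - prior)
    return num / den

p = 0.5
for obs in (True, True, False):      # two detections, then a miss
    p = update_tpm_cell(p, obs, pd=0.8, pf=0.1)
print(round(p, 3))
```

    Repeated looks drive each cell's probability toward 0 or 1, which is the convergence property the distributed fusion scheme relies on.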

  15. Systemic and Mucosal Differences in HIV Burden, Immune and Therapeutic Responses

    PubMed Central

    Wahl, Sharon M.; Redford, Maryann; Christensen, Shawna; Mack, Wendy; Cohn, Jon; Janoff, Edward N.; Mestecky, Jiri; Jenson, Hal B.; Navazesh, Mahvash; Cohen, Mardge; Reichelderfer, Patricia; Kovacs, Andrea

    2011-01-01

    Background Mucosal tissues represent major targets for HIV transmission, but differ in susceptibility and reservoir function by unknown mechanisms. Methods In a cross-sectional study, HIV RNA and infectious virus were compared between oral and genital compartments and blood in HIV-infected women, in association with clinical parameters, co-pathogens and putative innate and adaptive HIV inhibitors. Results HIV RNA was detectable in 24.5% of women from all 3 compartments, whereas 45% had RNA in only one or two sites. By comparison, infectious HIV, present in blood of the majority, was rare in mucosal sites. Innate mediators, SLPI and TSP, were highest in mucosae. Highly active antiretroviral therapy (HAART) was associated with an 80% decreased probability of shedding. Multivariate logistic regression models revealed that mucosal HIV RNA was associated with higher plasma RNA, infectious virus, and total mucosal IgA, but not IgG. There was a 37-fold increased probability of detecting RNA in both genital and oral specimens (P=0.008; P=0.02, respectively) among women in the highest vs. lowest IgA tertiles. Conclusions Mucosal sites exhibit distinct characteristics of infectious HIV, viral shedding and responses to therapy, dependent upon both systemic and local factors. Of the putative innate and adaptive mucosal defense factors examined, only IgA was associated with HIV RNA shedding. However, rather than being protective, there was a striking increase in probability of detectable HIV RNA shedding in women with the highest total IgA. PMID:21239996

  16. Minimum resolvable power contrast model

    NASA Astrophysics Data System (ADS)

    Qian, Shuai; Wang, Xia; Zhou, Jingjing

    2018-01-01

    Signal-to-noise ratio and MTF are important indices for evaluating the performance of optical systems. However, whether used alone or jointly, they cannot intuitively describe the overall performance of the system. Therefore, an index is proposed to reflect comprehensive system performance: the Minimum Resolvable Radiation Performance Contrast (MRP) model. MRP is an evaluation model that does not involve a human observer. It starts from the radiance of the target and the background, transforms the target and background into equivalent strips, and considers attenuation by the atmosphere, the optical imaging system, and the detector. Combining the signal-to-noise ratio with the MTF yields the Minimum Resolvable Radiation Performance Contrast. Finally, the detection probability model of MRP is given.

  17. Measuring attention using the Posner cuing paradigm: the role of across and within trial target probabilities

    PubMed Central

    Hayward, Dana A.; Ristic, Jelena

    2013-01-01

    Numerous studies conducted within the recent decades have utilized the Posner cuing paradigm for eliciting, measuring, and theoretically characterizing attentional orienting. However, the data from recent studies suggest that the Posner cuing task might not provide an unambiguous measure of attention, as reflexive spatial orienting has been found to interact with extraneous processes engaged by the task's typical structure, i.e., the probability of target presence across trials, which affects tonic alertness, and the probability of target presence within trials, which affects voluntary temporal preparation. To understand the contribution of each of these two processes to the measurement of attentional orienting we assessed their individual and combined effects on reflexive attention elicited by a spatially nonpredictive peripheral cue. Our results revealed that the magnitude of spatial orienting was modulated by joint changes in the global probability of target presence across trials and the local probability of target presence within trials, while the time course of spatial orienting was susceptible to changes in the probability of target presence across trials. These data thus raise important questions about the choice of task parameters within the Posner cuing paradigm and their role in both the measurement and theoretical attributions of the observed attentional effects. PMID:23730280

  18. Effectiveness of scat-detection dogs in determining species presence in a tropical savanna landscape.

    PubMed

    Vynne, Carly; Skalski, John R; Machado, Ricardo B; Groom, Martha J; Jácomo, Anah T A; Marinho-Filho, Jader; Ramos Neto, Mario B; Pomilla, Cristina; Silveira, Leandro; Smith, Heath; Wasser, Samuel K

    2011-02-01

    Most protected areas are too small to sustain populations of wide-ranging mammals; thus, identification and conservation of high-quality habitat for those animals outside parks is often a high priority, particularly for regions where extensive land conversion is occurring. This is the case in the vicinity of Emas National Park, a small protected area in the Brazilian Cerrado. Over the last 40 years the native vegetation surrounding the park has been converted to agriculture, but the region still supports virtually all of the animals native to the area. We determined the effectiveness of scat-detection dogs in detecting the presence of five species of mammals threatened with extinction by habitat loss: maned wolf (Chrysocyon brachyurus), puma (Puma concolor), jaguar (Panthera onca), giant anteater (Myrmecophaga tridactyla), and giant armadillo (Priodontes maximus). The probability of scat detection varied among the five species and among survey quadrats of different size, but was consistent across team, season, and year. The probability of occurrence, determined from the presence of scat, in a randomly selected site within the study area ranged from 0.14 for jaguars, which occur primarily in the forested areas of the park, to 0.91 for maned wolves, the most widely distributed species in our study area. Most occurrences of giant armadillos in the park were in open grasslands, but in the agricultural matrix they tended to occur in riparian woodlands. At least one target species occurred in every survey quadrat, and giant armadillos, jaguars, and maned wolves were more likely to be present in quadrats located inside than outside the park. The effort required for detection of scats was highest for the two felids. We were able to detect the presence of each of the five wide-ranging species inside and outside the park and to assign occurrence probabilities to specific survey sites. Thus, scat dogs provide an effective survey tool for rare species even when accurate detection likelihoods are required. We believe the way we used scat-detection dogs to determine the presence of species can be applied to the detection of other mammalian species in other ecosystems. ©2010 Society for Conservation Biology.

  19. Addressing the selective role of distinct prefrontal areas in response suppression: A study with brain tumor patients.

    PubMed

    Arbula, Sandra; Pacella, Valentina; De Pellegrin, Serena; Rossetto, Marta; Denaro, Luca; D'Avella, Domenico; Della Puppa, Alessandro; Vallesi, Antonino

    2017-06-01

    The diverging evidence for functional localization of response inhibition within the prefrontal cortex might be explained by the still unclear involvement of other intrinsically related cognitive processes, such as response selection and sustained attention. In this study, the main aim was to understand whether inhibitory impairments, previously found in patients with both left and right frontal lesions, could be better accounted for by assessing these potentially related cognitive processes. We tested 37 brain tumor patients with left prefrontal, right prefrontal and non-prefrontal lesions and a healthy control group on Go/No-Go and Foreperiod tasks. In both types of tasks inhibitory impairments are likely to cause false alarms, although additionally the former task requires response selection and the latter target detection abilities. Irrespective of the task context, patients with right prefrontal damage showed frequent Go and target omissions, probably due to sustained attention lapses. Left prefrontal patients, on the other hand, showed both Go and target omissions and high false alarm rates to No-Go and warning stimuli, suggesting a decisional rather than an inhibitory impairment. An exploratory whole-brain voxel-based lesion-symptom mapping analysis confirmed the association of left ventrolateral and dorsolateral prefrontal lesions with target discrimination failure, and right ventrolateral and medial prefrontal lesions with target detection failure. Results from this study show how left and right prefrontal areas, which previous research has linked to response inhibition, underlie broader cognitive control processes, particularly those involved in response selection and target detection. Based on these findings, we suggest that successful inhibitory control relies on more than one functionally distinct process which, if assessed appropriately, might help us to better understand inhibitory impairments across different pathologies. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  20. Evaluating detection probabilities for American marten in the Black Hills, South Dakota

    USGS Publications Warehouse

    Smith, Joshua B.; Jenks, Jonathan A.; Klaver, Robert W.

    2007-01-01

    Assessing the effectiveness of monitoring techniques designed to determine presence of forest carnivores, such as American marten (Martes americana), is crucial for validation of survey results. Although comparisons between techniques have been made, little attention has been paid to the issue of detection probabilities (p). Thus, the underlying assumption has been that detection probabilities equal 1.0. We used presence-absence data obtained from a track-plate survey in conjunction with results from a saturation-trapping study to derive detection probabilities when marten occurred at high (>2 marten/10.2 km²) and low (≤1 marten/10.2 km²) densities within eight 10.2-km² quadrats. Estimated probability of detecting marten in high-density quadrats was p = 0.952 (SE = 0.047), whereas the detection probability for low-density quadrats was considerably lower (p = 0.333, SE = 0.136). Our results indicated that failure to account for imperfect detection could lead to an underestimation of marten presence in 15-52% of low-density quadrats in the Black Hills, South Dakota, USA. We recommend that repeated site-survey data be analyzed to assess detection probabilities when documenting carnivore survey results.
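    The underestimation risk can be made concrete: at the low-density detection probability p = 0.333, a single survey misses a present population with probability 1 - p ≈ 0.67, and the false-absence rate only drops below 5% after several repeat surveys. A sketch of that arithmetic (detection probabilities from the abstract; the calculation itself is generic, not the authors' analysis):

```python
def false_absence(p: float, n_visits: int) -> float:
    """Probability that n independent surveys all miss a species that is
    actually present, given per-survey detection probability p."""
    return (1.0 - p) ** n_visits

p_low, p_high = 0.333, 0.952      # low- and high-density quadrats

print(round(false_absence(p_low, 1), 3))   # single-survey miss rate, low density
print(round(false_absence(p_high, 1), 3))  # single-survey miss rate, high density

n = 1
while false_absence(p_low, n) > 0.05:      # repeat surveys until <5% false absence
    n += 1
print(n)
```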

  1. Exoplanet Biosignatures: Future Directions.

    PubMed

    Walker, Sara I; Bains, William; Cronin, Leroy; DasSarma, Shiladitya; Danielache, Sebastian; Domagal-Goldman, Shawn; Kacar, Betul; Kiang, Nancy Y; Lenardic, Adrian; Reinhard, Christopher T; Moore, William; Schwieterman, Edward W; Shkolnik, Evgenya L; Smith, Harrison B

    2018-06-01

    We introduce a Bayesian method for guiding future directions for detection of life on exoplanets. We describe empirical and theoretical work necessary to place constraints on the relevant likelihoods, including those emerging from better understanding stellar environment, planetary climate and geophysics, geochemical cycling, the universalities of physics and chemistry, the contingencies of evolutionary history, the properties of life as an emergent complex system, and the mechanisms driving the emergence of life. We provide examples of how the Bayesian formalism could guide future search strategies, including determining which observations to prioritize and deciding between targeted searches and larger, lower-resolution surveys that generate ensemble statistics, and we address how a Bayesian methodology could constrain the prior probability of life with or without a positive detection. Key Words: Exoplanets-Biosignatures-Life detection-Bayesian analysis. Astrobiology 18, 779-824.
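    In its simplest form, the Bayesian formalism advocated here combines a prior probability of life with the likelihoods of observing a candidate biosignature under living and abiotic hypotheses. A toy sketch of that structure (all numbers invented, purely illustrative of the inference, not the authors' framework):

```python
def posterior_life(prior: float, p_signal_life: float, p_signal_abiotic: float) -> float:
    """P(life | biosignature) via Bayes' rule; p_signal_abiotic is the
    probability of a false-positive (abiotic) origin for the same signal."""
    num = p_signal_life * prior
    return num / (num + p_signal_abiotic * (1.0 - prior))

# Invented numbers: agnostic prior, signal 5x more likely under life.
print(round(posterior_life(prior=0.5, p_signal_life=0.5, p_signal_abiotic=0.1), 3))
```

    The point of the paper is precisely that the hard work lies in constraining these likelihoods and the prior, not in the arithmetic itself.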

  2. Optimizing Probability of Detection Point Estimate Demonstration

    NASA Technical Reports Server (NTRS)

    Koshti, Ajay M.

    2017-01-01

    Probability of detection (POD) analysis is used in assessing reliably detectable flaw size in nondestructive evaluation (NDE). MIL-HDBK-1823 and the associated mh1823 POD software give the most common methods of POD analysis. These NDE methods are intended to detect real flaws such as cracks and crack-like flaws. A reliably detectable crack size is required for safe-life analysis of fracture critical parts. The paper provides discussion on optimizing probability of detection (POD) demonstration experiments using the Point Estimate Method, which is used by NASA for qualifying special NDE procedures. The point estimate method uses the binomial distribution for probability density. Normally, a set of 29 flaws of the same size, within some tolerance, is used in the demonstration. The optimization is performed to provide an acceptable value for the probability of passing the demonstration (PPD) and an acceptable value for the probability of false calls (POF) while keeping the flaw sizes in the set as small as possible.
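    The 29-flaw convention follows directly from the binomial model: if the true POD were exactly 0.90, the chance of detecting all 29 of 29 flaws is 0.9^29 ≈ 0.047 < 0.05, so a perfect score demonstrates POD ≥ 90% at 95% confidence. A sketch of that point-estimate arithmetic and of PPD at other true POD values (illustrative only, not the NASA procedure itself):

```python
def ppd(pod: float, n: int = 29) -> float:
    """Probability of passing an n-flaw, zero-miss demonstration
    when the true probability of detection is `pod`."""
    return pod ** n

assert ppd(0.90) < 0.05          # 0.9**29 ~ 0.047: the 29-of-29 criterion
print(round(ppd(0.90), 4))       # chance of passing at exactly 90% POD
print(round(ppd(0.98), 4))       # a substantially better process passes far more often
```

    This also shows the optimization tension the abstract describes: shrinking the flaw size lowers the true POD, which collapses PPD rapidly because of the 29th-power dependence.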

  3. Monitoring multiple species: Estimating state variables and exploring the efficacy of a monitoring program

    USGS Publications Warehouse

    Mattfeldt, S.D.; Bailey, L.L.; Grant, E.H.C.

    2009-01-01

    Monitoring programs have the potential to identify population declines and differentiate among the possible cause(s) of these declines. Recent criticisms regarding the design of monitoring programs have highlighted a failure to clearly state objectives and to address detectability and spatial sampling issues. Here, we incorporate these criticisms to design an efficient monitoring program whose goals are to determine environmental factors which influence the current distribution and measure change in distributions over time for a suite of amphibians. In designing the study we (1) specified a priori factors that may relate to occupancy, extinction, and colonization probabilities and (2) used the data collected (incorporating detectability) to address our scientific questions and adjust our sampling protocols. Our results highlight the role of wetland hydroperiod and other local covariates in the probability of amphibian occupancy. There was a change in overall occupancy probabilities for most species over the first three years of monitoring. Most colonization and extinction estimates were constant over time (years) and space (among wetlands), with one notable exception: local extinction probabilities for Rana clamitans were lower for wetlands with longer hydroperiods. We used information from the target system to generate scenarios of population change and gauge the ability of the current sampling to meet monitoring goals. Our results highlight the limitations of the current sampling design, emphasizing the need for long-term efforts, with periodic re-evaluation of the program in a framework that can inform management decisions.
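    Dynamic occupancy models of this kind link occupancy across years through colonization and extinction probabilities, with occupancy approaching the equilibrium col/(col + ext) over time. A minimal sketch of that recursion (parameter values invented for illustration, not estimates from the study):

```python
def project_occupancy(psi0: float, col: float, ext: float, years: int) -> float:
    """Iterate psi[t+1] = psi[t]*(1 - ext) + (1 - psi[t])*col."""
    psi = psi0
    for _ in range(years):
        psi = psi * (1.0 - ext) + (1.0 - psi) * col
    return psi

# Invented rates: 20% colonization, 10% local extinction per year.
print(round(0.2 / (0.2 + 0.1), 3))                     # equilibrium occupancy
print(round(project_occupancy(0.1, 0.2, 0.1, 50), 3))  # converges to the same value
```

    Generating scenarios of population change, as the authors do, amounts to varying these rates (e.g., making extinction hydroperiod-dependent) and asking whether the sampling design could detect the resulting occupancy trends.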

  4. Internal Medicine residents use heuristics to estimate disease probability.

    PubMed

    Phang, Sen Han; Ravani, Pietro; Schaefer, Jeffrey; Wright, Bruce; McLaughlin, Kevin

    2015-01-01

    Training in Bayesian reasoning may have limited impact on accuracy of probability estimates. In this study, our goal was to explore whether residents previously exposed to Bayesian reasoning use heuristics rather than Bayesian reasoning to estimate disease probabilities. We predicted that if residents use heuristics then post-test probability estimates would be increased by non-discriminating clinical features or a high anchor for a target condition. We randomized 55 Internal Medicine residents to different versions of four clinical vignettes and asked them to estimate probabilities of target conditions. We manipulated the clinical data for each vignette to be consistent with either 1) use of the representative heuristic, by adding non-discriminating prototypical clinical features of the target condition, or 2) use of the anchoring with adjustment heuristic, by providing a high or low anchor for the target condition. When presented with additional non-discriminating data the odds of diagnosing the target condition were increased (odds ratio (OR) 2.83, 95% confidence interval [1.30, 6.15], p = 0.009). Similarly, the odds of diagnosing the target condition were increased when a high anchor preceded the vignette (OR 2.04, [1.09, 3.81], p = 0.025). Our findings suggest that despite previous exposure to Bayesian reasoning, residents use heuristics, such as the representative heuristic and anchoring with adjustment, to estimate probabilities. Potential reasons for attribute substitution include the relative cognitive ease of heuristics compared with Bayesian reasoning, or the possibility that residents in clinical practice use gist traces rather than precise probability estimates when diagnosing.
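    The normative benchmark for these vignettes is the odds form of Bayes' theorem: post-test odds = pre-test odds × likelihood ratio. Under it, a non-discriminating finding (LR = 1) should leave the estimate unchanged, and an anchor should have no effect at all. A sketch of that calculation (numbers invented):

```python
def post_test_probability(pre_test: float, likelihood_ratio: float) -> float:
    """Bayes in odds form: posterior odds = prior odds * LR."""
    odds = pre_test / (1.0 - pre_test) * likelihood_ratio
    return odds / (1.0 + odds)

# A discriminating finding (LR = 4) raises a 20% prior substantially...
print(round(post_test_probability(0.20, 4.0), 3))
# ...whereas a non-discriminating prototypical feature (LR = 1) should not.
print(round(post_test_probability(0.20, 1.0), 3))
```

    The study's finding is that residents behave as if LR > 1 for prototypical but non-discriminating features, which is exactly what this benchmark rules out.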

  5. Retrodiction for Bayesian multiple-hypothesis/multiple-target tracking in densely cluttered environment

    NASA Astrophysics Data System (ADS)

    Koch, Wolfgang

    1996-05-01

    Sensor data processing in a dense target/dense clutter environment is inevitably confronted with data association conflicts, which correspond with the multiple hypothesis character of many modern approaches (MHT: multiple hypothesis tracking). In this paper we analyze the efficiency of retrodictive techniques that generalize standard fixed-interval smoothing to MHT applications. 'Delayed estimation' based on retrodiction provides uniquely interpretable and accurate trajectories from ambiguous MHT output if a certain time delay is tolerated. In a Bayesian framework, the theoretical background of retrodiction and its intimate relation to Bayesian MHT are sketched. Using a simulated example with two closely-spaced targets, relatively low detection probabilities, and rather high false return densities, we demonstrate the benefits of retrodiction and quantitatively discuss the achievable track accuracies and the time delays involved for typical radar parameters.

  6. Incorporating detection probability into northern Great Plains pronghorn population estimates

    USGS Publications Warehouse

    Jacques, Christopher N.; Jenks, Jonathan A.; Grovenburg, Troy W.; Klaver, Robert W.; DePerno, Christopher S.

    2014-01-01

    Pronghorn (Antilocapra americana) abundances commonly are estimated using fixed-wing surveys, but these estimates are likely to be negatively biased because of violations of key assumptions underpinning line-transect methodology. Reducing bias and improving precision of abundance estimates through use of detection probability and mark-resight models may allow for more responsive pronghorn management actions. Given their potential application in population estimation, we evaluated detection probability and mark-resight models for use in estimating pronghorn population abundance. We used logistic regression to quantify how the probability of detecting pronghorn is influenced by group size, animal activity, percent vegetation, cover type, and topography. We estimated pronghorn population size by study area and year using mixed logit-normal mark-resight (MLNM) models. Pronghorn detection probability increased with group size, animal activity, and percent vegetation; overall detection probability was 0.639 (95% CI = 0.612–0.667) with 396 of 620 pronghorn groups detected. Despite model selection uncertainty, abundance estimates from the best detection probability models were 44% (range = 8–79%) and 180% (range = 139–217%) greater than traditional pronghorn population estimates. Similarly, estimates from the best MLNM models were 28% (range = 3–58%) and 147% (range = 124–180%) greater than traditional population estimates. Detection probability of pronghorn was not constant but depended on both intrinsic and extrinsic factors. When pronghorn detection probability is a function of animal group size, animal activity, landscape complexity, and percent vegetation, traditional aerial survey techniques will result in biased pronghorn abundance estimates. Standardizing survey conditions, increasing resighting occasions, or accounting for variation in individual heterogeneity in mark-resight models will increase the accuracy and precision of pronghorn population estimates.

  7. Diagnostic potential of multi-targeted LAMP (loop-mediated isothermal amplification) for osteoarticular tuberculosis.

    PubMed

    Sharma, Kusum; Sharma, Megha; Batra, Nitya; Sharma, Aman; Dhillon, Mandeep Singh

    2017-02-01

    Delay in diagnosing osteoarticular tuberculosis (OATB) contributes significantly to morbidity by causing disfiguration and neurological sequelae. The delay caused by conventional culture, and the expertise and expense involved in other nucleic acid based tests, make the LAMP (loop-mediated isothermal amplification) assay a favorable middle path. We evaluated a LAMP assay using IS6110 and MPB64 for rapid diagnosis of OATB by comparison with IS6110 PCR and culture. The LAMP assay was performed on 140 synovial fluid and pus samples (10 culture-positive proven cases, 80 culture-negative probable cases, and 50 negative controls) using three sets of primers each for IS6110 and MPB64. The LAMP assay, using the two-target approach, had an overall sensitivity and specificity of 90% and 100% in detecting OATB. Sensitivity of IS6110 PCR, IS6110 LAMP, and MPB64 LAMP was 80%, 100%, and 100%, respectively, for confirmed cases and 72.5%, 81.75%, and 86.25%, respectively, for probable cases. Six additional cases were picked up by the two-target approach. A LAMP assay utilizing IS6110 and MPB64 is a cost-effective technique for early and reliable diagnosis of OATB. © 2016 Orthopaedic Research Society. Published by Wiley Periodicals, Inc. J Orthop Res 35:361-365, 2017.

  8. Improvement of the quantitation method for the tdh+ Vibrio parahaemolyticus in molluscan shellfish based on most-probable-number, immunomagnetic separation, and loop-mediated isothermal amplification

    PubMed Central

    Escalante-Maldonado, Oscar; Kayali, Ahmad Y.; Yamazaki, Wataru; Vuddhakul, Varaporn; Nakaguchi, Yoshitsugu; Nishibuchi, Mitsuaki

    2015-01-01

    Vibrio parahaemolyticus is a marine microorganism that can cause seafood-borne gastroenteritis in humans. The infection has spread and become a pandemic through the international trade of contaminated seafood. Strains carrying the tdh gene encoding the thermostable direct hemolysin (TDH) and/or the trh gene encoding the TDH-related hemolysin (TRH) are considered pathogenic, with the former gene being the most frequently found in clinical strains. However, their distribution frequency in environmental isolates is below 1%. Thus, very sensitive methods are required for detection and quantitation of tdh+ strains in seafood. We previously reported a method to detect and quantify tdh+ V. parahaemolyticus in seafood. This method consists of three components: the most-probable-number (MPN) technique, immunomagnetic separation (IMS) targeting all established K antigens, and loop-mediated isothermal amplification (LAMP) targeting the tdh gene. However, this method faces regional issues in tropical zones of the world: technicians have difficulty securing dependable reagents in high-temperature climates, and we found MPN underestimation in samples in which tdh+ strains as well as other microorganisms were present at high concentrations. In the present study, we solved the underestimation problem associated with the salt polymyxin broth enrichment for the MPN component and with the immunomagnetic bead-target association for the IMS component. We also improved the supply and maintenance of dependable reagents by introducing a dried reagent system to the LAMP component. The modified method is specific, sensitive, quick, and easy, and is applicable regardless of the concentration of tdh+ V. parahaemolyticus. Therefore, we conclude this modified method is useful in the world's tropical, sub-tropical, and temperate zones. PMID:25914681
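    For readers unfamiliar with the MPN component, the estimate comes from the likelihood of the observed pattern of positive tubes across a dilution series. A minimal sketch, assuming a Poisson dilution model and a simple grid search; standard MPN tables tabulate solutions of the same likelihood, and the tube counts and volumes below are illustrative, not the paper's protocol:

```python
import math

def mpn(positives, tubes, volumes_ml):
    """Most-probable-number estimate (organisms per mL) by grid search over
    the Poisson dilution likelihood. Inputs per dilution: number of positive
    tubes, total tubes, and inoculum volume."""
    def log_lik(lam):
        ll = 0.0
        for k, n, v in zip(positives, tubes, volumes_ml):
            p = 1.0 - math.exp(-lam * v)   # P(one tube turns positive)
            if p >= 1.0:                   # saturating dose
                if k < n:
                    return float("-inf")   # a negative tube would be impossible
                continue
            if p <= 0.0:
                if k > 0:
                    return float("-inf")
                continue
            ll += k * math.log(p) + (n - k) * (-lam * v)
        return ll
    grid = [10 ** (i / 200) for i in range(-600, 601)]   # 1e-3 .. 1e3 per mL
    return max(grid, key=log_lik)

# Classic 3-tube series: 3, 2, 1 positive tubes at 10, 1, and 0.1 mL inocula
estimate = mpn([3, 2, 1], [3, 3, 3], [10.0, 1.0, 0.1])
```

    For this 3-2-1 pattern the maximum-likelihood value lands near the tabulated MPN of roughly 1.5 organisms per mL.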

  9. Developing an Operational and Tactical Methodology for Incorporating Existing Technologies to Produce the Highest Probability of Detecting an Individual Wearing an IED

    DTIC Science & Technology

    2010-06-01

    The public response to the tactical use of suicide bombing depends on how the tactic is used by the insurgent organizations, against whom, and...people from unnecessarily getting hurt. If the goal is to pull aside potential suspects and conduct further searches, then a less certain or...a significant question to security forces. How do you stop a suicide bomber on his way to the target? Individuals who carry improvised explosives

  10. Calculation of Cumulative Distributions and Detection Probabilities in Communications and Optics.

    DTIC Science & Technology

    1986-03-31

    result, Figure 3.1 shows the additional SNR required (often called the CFAR loss) for the MLD, CMLD, and OSD in a multiple target environment to...Notice that although the CFAR loss increases with INR for the MLD, the CMLD and OSD have a bounded loss as the INR → ∞. These results have been more...false-alarm rate (CFAR) when the background noise level is unknown. In Section 2 we described the application of saddlepoint integration techniques to

  11. The Marine Corps Needs a Targeting, Sensors, and Surveillance Systems Operational Integration and Support Team

    DTIC Science & Technology

    2010-03-02

    triggerman is probably still close; lately all IEDs in the area have been initiated via command-wire. The squad leader sets a cordon, ensures an IED 9...Operational Surveillance System (G-BOSS) with a Class IIIb laser pointer. This class of laser requires users to receive a laser safety class...2) The Keyhole kit of surveillance equipment. Designed to provide “snipers with an increased capability to visually detect the enemy emplacing IEDs

  12. Method for enhancing single-trial P300 detection by introducing the complexity degree of image information in rapid serial visual presentation tasks

    PubMed Central

    Lin, Zhimin; Zeng, Ying; Tong, Li; Zhang, Hangming; Zhang, Chi

    2017-01-01

    The use of electroencephalogram (EEG) signals generated while humans view images is a new thrust in image retrieval technology. A P300 component is induced in the EEG when subjects see a point of interest in a target image under the rapid serial visual presentation (RSVP) experimental paradigm. We detected the single-trial P300 component to determine whether a subject was interested in an image. In practice, the latency and amplitude of the P300 component may vary with experimental parameters such as target probability and stimulus semantics. We therefore proposed a novel method, the Target Recognition using Image Complexity Priori (TRICP) algorithm, which introduces image information into the calculation of the interest score in the RSVP paradigm. The method combines information from the image and the EEG to enhance the accuracy of single-trial P300 detection over traditional single-trial P300 detection algorithms. We defined an image complexity parameter based on features from different layers of a convolutional neural network (CNN), used the TRICP algorithm to compute image complexity to quantify the effect of images of different complexity on the P300 component, and trained specialized classifiers according to image complexity. We compared TRICP with the HDCA algorithm. Results show that TRICP achieves significantly higher accuracy than HDCA (Wilcoxon sign rank test, p<0.05). Thus, the proposed method can be used in other visual task-related single-trial event-related potential detection tasks. PMID:29283998

  13. The Target Selective Neural Response — Similarity, Ambiguity, and Learning Effects

    PubMed Central

    Hampshire, Adam; Thompson, Russell; Duncan, John; Owen, Adrian M.

    2008-01-01

    A network of frontal and parietal brain regions is commonly recruited during tasks that require the deliberate ‘top-down’ control of thought and action. Previously, using simple target detection, we have demonstrated that within this frontoparietal network, the right ventrolateral prefrontal cortex (VLPFC) in particular is sensitive to the presentation of target objects. Here, we use a range of target/non-target morphs to plot the target selective response within distinct frontoparietal sub-regions in greater detail. The increased resolution allows us to examine the extent to which different cognitive factors can predict the blood oxygenation level dependent (BOLD) response to targets. Our results reveal that both probability of positive identification (similarity to target) and proximity to the 50% decision boundary (ambiguity) are significant predictors of BOLD signal change, particularly in the right VLPFC. Furthermore, the profile of target related signal change is not static, with the degree of selectivity increasing as the task becomes familiar. These findings demonstrate that frontoparietal sub-regions are recruited under increased cognitive demand and that when recruited, they adapt, using both fast and slow mechanisms, to selectively respond to those items that are of the most relevance to current intentions. PMID:18575585

  14. Robust chemical and chemical-resistant material detection using hyper-spectral imager and a new band interpolation and local scaling HSI sharpening method

    NASA Astrophysics Data System (ADS)

    Chen, Hai-Wen; McGurr, Michael; Brickhouse, Mark

    2015-05-01

    We present new results from our ongoing research activity for chemical threat detection using hyper-spectral imager (HSI) detection techniques, detecting nontraditional threat spectral signatures of agent usage, such as protective equipment, coatings, paints, spills, and stains worn by humans or present on trucks or other objects. We have applied several current state-of-the-art HSI target detection methods, such as Matched Filter (MF), Adaptive Coherence Estimator (ACE), Constrained Energy Minimization (CEM), and Spectral Angle Mapper (SAM). We are interested in detecting several chemical-related materials: (a) Tyvek clothing is chemical resistant, and Tyvek coveralls are one-piece garments that protect the human body from harmful chemicals, and (b) ammonium salts in the background could be representative of spills from scrubbers or related to other chemical activities. The HSI dataset that we used for detection covers a chemical test field with more than 50 different kinds of chemicals, protective materials, coatings, and paints. Among them are four different kinds of Tyvek material, three types of ammonium salts, and yellow jugs. The imagery cube data were collected by an HSI sensor with a spectral range of 400-2,500 nm. Preliminary testing results are promising: very high probability of detection (Pd) and low probability of false detection are achieved with the use of the full spectral range (400-2,500 nm). In the second part of this paper, we present our newly developed HSI sharpening technique. A new Band Interpolation and Local Scaling (BILS) method has been developed to improve HSI spatial resolution by 4-16 times with a low-cost high-resolution panchromatic camera and an RGB camera. Preliminary results indicate that this new technique is promising.
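    Of the detection methods named above, the Spectral Angle Mapper is the simplest to state: it scores each pixel by the angle between its spectrum and a reference target spectrum, with small angles indicating a likely match. A minimal sketch with toy 4-band spectra (real HSI pixels for this sensor would have hundreds of bands across 400-2,500 nm):

```python
import math

def spectral_angle(pixel, target):
    """Spectral Angle Mapper (SAM) score: the angle (radians) between a
    pixel spectrum and a target reference spectrum."""
    dot = sum(a * b for a, b in zip(pixel, target))
    norm = math.sqrt(sum(a * a for a in pixel)) * \
           math.sqrt(sum(b * b for b in target))
    # Clamp to guard against floating-point drift outside [-1, 1]:
    return math.acos(max(-1.0, min(1.0, dot / norm)))

# Toy spectra: a pixel closely matching the target reference
angle = spectral_angle([0.20, 0.40, 0.50, 0.30], [0.21, 0.41, 0.48, 0.33])
```

    A detection rule then thresholds the angle; because SAM uses only direction, not magnitude, it is relatively insensitive to illumination scaling.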

  15. Persistence rates and detection probabilities of oiled king eider carcasses on St Paul Island, Alaska

    USGS Publications Warehouse

    Fowler, A.C.; Flint, Paul L.

    1997-01-01

    Following an oil spill off St Paul Island, Alaska in February 1996, persistence rates and detection probabilities of oiled king eider (Somateria spectabilis) carcasses were estimated using the Cormack-Jolly-Seber model. Carcass persistence rates varied by day, beach type and sex, while detection probabilities varied by day and beach type. Scavenging, wave action and weather influenced carcass persistence. The patterns of persistence differed on rock and sand beaches and female carcasses had a different persistence function than males. Weather, primarily snow storms, and degree of carcass scavenging, diminished carcass detectability. Detection probabilities on rock beaches were lower and more variable than on sand beaches. The combination of persistence rates and detection probabilities can be used to improve techniques of estimating total mortality.
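    The way persistence rates and detection probabilities combine into a total-mortality estimate can be illustrated with a deliberately simplified single-search correction. The paper's Cormack-Jolly-Seber model makes these probabilities vary by day, beach type, and sex; the constant values below are invented for illustration.

```python
def estimated_mortality(carcasses_found, persistence, detection):
    """Correct a raw carcass count for removal and imperfect detection.

    Simplified single-search estimator: `persistence` is the probability a
    carcass is still present when the beach is searched, and `detection` is
    the probability a present carcass is found. Illustrative only.
    """
    return carcasses_found / (persistence * detection)

# 45 oiled carcasses found; 60% persisted to the search, 75% detectability:
n_hat = estimated_mortality(45, persistence=0.60, detection=0.75)  # -> 100.0
```

    Low persistence or detectability inflates the correction sharply, which is why estimating both quantities, as this study does, matters for total-mortality estimates.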

  16. A novel mechanism for a survival advantage of vigilant individuals in groups.

    PubMed

    van der Post, Daniel J; de Weerd, Harmen; Verbrugge, Rineke; Hemelrijk, Charlotte K

    2013-11-01

    In many animal species, vigilance is crucial for avoiding predation. In groups, however, nonvigilant individuals could benefit from the vigilance of others without any of the associated costs. In an evolutionary sense, such exploitation may be compensated if vigilant individuals have a survival advantage. The novelty of our model is that the probability of detecting a predator is "distance dependent." We show that even if nonvigilant individuals benefit fully from information produced by vigilant individuals, vigilant individuals nevertheless enjoy a survival advantage. This happens because detection of predators is more likely when vigilant individuals happen to be targets of predation. We expect this distance-dependent mechanism to be compatible with previously reported mechanisms.

  17. Development of clinical paroxysmal nocturnal haemoglobinuria in children with aplastic anaemia.

    PubMed

    Narita, Atsushi; Muramatsu, Hideki; Okuno, Yusuke; Sekiya, Yuko; Suzuki, Kyogo; Hamada, Motoharu; Kataoka, Shinsuke; Ichikawa, Daisuke; Taniguchi, Rieko; Murakami, Norihiro; Kojima, Daiei; Nishikawa, Eri; Kawashima, Nozomu; Nishio, Nobuhiro; Hama, Asahito; Takahashi, Yoshiyuki; Kojima, Seiji

    2017-09-01

    The clinical significance of paroxysmal nocturnal haemoglobinuria (PNH) in children with aplastic anaemia (AA) remains unclear. We retrospectively studied 57 children with AA between 1992 and 2010. During the follow-up, five patients developed clinical PNH, in whom somatic PIGA mutations were detected by targeted sequencing. The 10-year probability of clinical PNH development was 10·2% (95% confidence interval, 3·6-20·7%). Furthermore, the detection of minor PNH clones by flow cytometry at AA diagnosis was a risk factor for the subsequent development of clinical PNH. These patients with PNH clones at AA diagnosis should undergo periodic monitoring for potential clinical PNH development. © 2017 John Wiley & Sons Ltd.

  18. [Subnational analysis of probability of premature mortality caused by four main non-communicable diseases in China during 1990-2015 and " Health China 2030" reduction target].

    PubMed

    Zeng, X Y; Li, Y C; Liu, S W; Wang, L J; Liu, Y N; Liu, J M; Zhou, M G

    2017-03-06

    Objective: To investigate the current status and temporal trends of the probability of premature mortality caused by four main non-communicable diseases (NCDs), including cardiovascular and cerebrovascular diseases, tumours, diabetes, and chronic respiratory disease, in China at the national and provincial levels from 1990 to 2015, and to assess progress toward the "Health China 2030" reduction target. Methods: Using the results of the Global Burden of Disease study 2015 (GBD 2015) and the method for calculating the probability of premature mortality recommended by WHO, we calculated, analyzed, and compared current status and temporal trends by gender from 1990 to 2015. Referring to the "Health China 2030" target of a 30% reduction in the probability of premature mortality caused by major NCDs, we evaluated the difficulty of achieving the target among provinces (not including Taiwan). Results: From 1990 to 2015, the probabilities of premature mortality from cardiovascular and cerebrovascular diseases, tumours, and chronic respiratory disease declined consistently for both men and women in China; the total for the four main NCDs decreased from 30.69% to 18.54%, with a larger decrease in women (from 25.97% to 12.40%) than in men (from 34.94% to 24.19%). In 2015, the five provinces with the highest probability of premature mortality from the four main NCDs were Qinghai (28.81%), Tibet (25.88%), Guizhou (24.67%), Guangxi (23.56%), and Xinjiang (23.21%), while the five with the lowest were Shanghai (8.40%), Beijing (9.39%), Hong Kong (10.10%), Macao (10.31%), and Zhejiang (11.70%). To achieve the "Health China 2030" target, the probabilities of premature mortality in Qinghai and Tibet, the provinces with the highest probabilities, would need to decline to about 20.17% and 18.12%, respectively, by 2030, compared with 5.88% and 6.57% in Shanghai and Beijing. From 1990 to 2015, the probability of premature mortality from the four main NCDs declined by 2.00% per year on average; the five provinces with the fastest declines were Beijing (3.48%), Shanghai (3.24%), Zhejiang (2.81%), Fujian (2.75%), and Guangdong (2.67%). Eleven provinces, including these five, could achieve the "Health China 2030" target at their usual rate of decline, while the other 22 provinces could not and would need greater rates of decline. Conclusion: From 1990 to 2015, the probabilities of premature mortality from the four main NCDs declined consistently in China at both the national and provincial levels. Men had higher probabilities than women and declined more slowly, and there were significant differences in the probabilities of premature mortality and their rates of change among provinces. Based on the results from 1990 to 2015, about two thirds of the provinces face a daunting task in achieving the Health China 2030 target.
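    The WHO method referred to above computes the unconditional probability of dying between ages 30 and 70 from age-specific mortality rates. A sketch of that calculation (the rates below are invented for illustration, not GBD 2015 values):

```python
def premature_mortality_probability(age_specific_rates):
    """WHO unconditional probability of dying between ages 30 and 70.

    `age_specific_rates` holds mortality rates (deaths per person-year)
    for the eight 5-year age bands 30-34 ... 65-69. Each rate is converted
    to a 5-year death probability, then the survival probabilities are
    multiplied across bands.
    """
    survival = 1.0
    for m in age_specific_rates:
        five_q = 5 * m / (1 + 2.5 * m)   # rate -> 5-year death probability
        survival *= (1 - five_q)
    return 1 - survival

# Illustrative rates rising with age (not real GBD 2015 data):
rates = [0.001, 0.0015, 0.002, 0.003, 0.005, 0.008, 0.013, 0.020]
prob = premature_mortality_probability(rates)
```

    The result is "unconditional" in that it ignores competing causes outside the selected disease group, which is what makes it comparable across provinces and over time.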

  19. Colloidal core-seeded semiconductor nanorods as fluorescent labels for in-vitro diagnostics (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Chan, YinThai

    2016-03-01

    Colloidal semiconductor nanocrystals are ideal fluorophores for clinical diagnostics, therapeutics, and highly sensitive biochip applications due to their high photostability, size-tunable color of emission, and flexible surface chemistry. The relatively recent development of core-seeded semiconductor nanorods showed that the presence of a rod-like shell can confer even more advantageous physicochemical properties than their spherical counterparts, such as large multi-photon absorption cross-sections and facet-specific chemistry that can be exploited to deposit secondary nanoparticles. It may be envisaged that these highly fluorescent nanorods can be integrated with large-scale integrated (LSI) microfluidic systems that allow miniaturization and integration of multiple biochemical processes in a single device at the nanoliter scale, resulting in a highly sensitive and automated detection platform. In this talk, I will describe an LSI microfluidic device that integrates RNA extraction, reverse transcription to cDNA, amplification, and target pull-down to detect the histidine decarboxylase (HDC) gene directly from human white blood cell samples. When anisotropic colloidal semiconductor nanorods (NRs) were used as the fluorescent readout, the detection limit was found to be 0.4 ng of total RNA, much lower than that obtained using spherical quantum dots (QDs) or organic dyes. This was attributed to the large action cross-section of NRs and their high probability of target capture in a pull-down detection scheme. The combination of large-scale integrated microfluidics with highly fluorescent semiconductor NRs may find widespread utility in point-of-care devices and multi-target diagnostics.

  20. Handheld and mobile hyperspectral imaging sensors for wide-area standoff detection of explosives and chemical warfare agents

    NASA Astrophysics Data System (ADS)

    Gomer, Nathaniel R.; Gardner, Charles W.; Nelson, Matthew P.

    2016-05-01

    Hyperspectral imaging (HSI) is a valuable tool for the investigation and analysis of targets in complex backgrounds with a high degree of autonomy. HSI is beneficial for the detection of threat materials on environmental surfaces, where the concentration of the target of interest is often very low and is typically found within complex scenery. Two HSI techniques that have proven valuable are Raman and shortwave infrared (SWIR) HSI. Unfortunately, current generation HSI systems have numerous size, weight, and power (SWaP) limitations that make their potential integration onto a handheld or field-portable platform difficult. The systems that are field-portable achieve this by sacrificing system performance, typically by providing an inefficient area search rate, requiring close proximity to the target for screening, and/or eliminating the potential to conduct real-time measurements. To address these shortcomings, ChemImage Sensor Systems (CISS) is developing a variety of wide-field hyperspectral imaging systems. Raman HSI sensors are being developed to overcome two obstacles present in standard Raman detection systems: slow area search rate (due to small laser spot sizes) and lack of eye-safety. SWIR HSI sensors have been integrated into mobile, robot-based platforms and handheld variants for the detection of explosives and chemical warfare agents (CWAs). In addition, the fusion of these two technologies into a single system has shown the feasibility of using both techniques concurrently to provide a higher probability of detection and lower false alarm rates. This paper will provide background on Raman and SWIR HSI, discuss the applications for these techniques, and provide an overview of novel CISS HSI sensors, focusing on sensor design and detection results.

  1. Site occupancy models with heterogeneous detection probabilities

    USGS Publications Warehouse

    Royle, J. Andrew

    2006-01-01

    Models for estimating the probability of occurrence of a species in the presence of imperfect detection are important in many ecological disciplines. In these 'site occupancy' models, the possibility of heterogeneity in detection probabilities among sites must be considered because variation in abundance (and other factors) among sampled sites induces variation in detection probability (p). In this article, I develop occurrence probability models that allow for heterogeneous detection probabilities by considering several common classes of mixture distributions for p. For any mixing distribution, the likelihood has the general form of a zero-inflated binomial mixture for which inference based upon integrated likelihood is straightforward. A recent paper by Link (2003, Biometrics 59, 1123-1130) demonstrates that in closed population models used for estimating population size, different classes of mixture distributions are indistinguishable from data, yet can produce very different inferences about population size. I demonstrate that this problem can also arise in models for estimating site occupancy in the presence of heterogeneous detection probabilities. The implications of this are discussed in the context of an application to avian survey data and the development of animal monitoring programs.
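    The zero-inflated binomial form of the likelihood is easy to state for a single site. A minimal sketch with a constant detection probability p; the article's point is precisely that p may instead follow a mixture distribution, which would be integrated over in the binomial term.

```python
from math import comb

def site_likelihood(y, J, psi, p):
    """Zero-inflated binomial likelihood for one site: y detections in J
    visits, occupancy probability psi, constant detection probability p.
    A site with y == 0 may be unoccupied, or occupied but never detected.
    """
    binom = comb(J, y) * p**y * (1 - p) ** (J - y)
    return psi * binom + (1 - psi) * (1 if y == 0 else 0)

# A site never detected in 5 visits: unoccupied, or occupied but missed.
L0 = site_likelihood(y=0, J=5, psi=0.6, p=0.3)
```

    Summing the log of this likelihood over sites and maximizing over (psi, p), or over the parameters of a mixing distribution for p, gives the integrated-likelihood inference described above.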

  2. Probability of identification: a statistical model for the validation of qualitative botanical identification methods.

    PubMed

    LaBudde, Robert A; Harnly, James M

    2012-01-01

    A qualitative botanical identification method (BIM) is an analytical procedure that returns a binary result (1 = Identified, 0 = Not Identified). A BIM may be used by a buyer, manufacturer, or regulator to determine whether a botanical material being tested is the same as the target (desired) material, or whether it contains excessive nontarget (undesirable) material. This report describes the development and validation of a BIM based on the proportion of replicates identified, or probability of identification (POI), as the basic observed statistic. The statistical procedures proposed for data analysis follow closely those of the probability of detection, and harmonize the statistical concepts and parameters between quantitative and qualitative method validation. Use of POI statistics also harmonizes statistical concepts for botanical, microbiological, toxin, and other analyte identification methods that produce binary results. The POI statistical model provides a tool for graphical representation of response curves for qualitative methods, reporting of descriptive statistics, and application of performance requirements. Single-collaborator and multicollaborative study examples are given.
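    Since POI is simply a binomial proportion, descriptive statistics for a validation study reduce to the observed proportion plus a confidence interval. A sketch using the Wilson score interval, one common choice for binary data; the paper may prescribe a different interval method, and the replicate counts below are invented.

```python
import math

def poi_with_wilson_ci(k, n, z=1.96):
    """Observed probability of identification (POI = k/n) with a 95%
    Wilson score interval, where k of n replicates were identified."""
    p = k / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return p, (centre - half, centre + half)

# 27 of 30 replicates of the target material identified:
p, (lo, hi) = poi_with_wilson_ci(27, 30)
```

    Plotting POI (with its interval) against concentration or adulteration level gives the response curves the model describes for qualitative methods.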

  3. Bottom-up guidance in visual search for conjunctions.

    PubMed

    Proulx, Michael J

    2007-02-01

    Understanding the relative role of top-down and bottom-up guidance is crucial for models of visual search. Previous studies have addressed the role of top-down and bottom-up processes in search for a conjunction of features but with inconsistent results. Here, the author used an attentional capture method to address the role of top-down and bottom-up processes in conjunction search. The role of bottom-up processing was assayed by inclusion of an irrelevant-size singleton in a search for a conjunction of color and orientation. One object was uniquely larger on each trial, with chance probability of coinciding with the target; thus, the irrelevant feature of size was not predictive of the target's location. Participants searched more efficiently for the target when it was also the size singleton, and they searched less efficiently for the target when a nontarget was the size singleton. Although a conjunction target cannot be detected on the basis of bottom-up processing alone, participants used search strategies that relied significantly on bottom-up guidance in finding the target, resulting in interference from the irrelevant-size singleton.

  4. Quantifying seining detection probability for fishes of Great Plains sand‐bed rivers

    USGS Publications Warehouse

    Mollenhauer, Robert; Logue, Daniel R.; Brewer, Shannon K.

    2018-01-01

    Species detection error (i.e., imperfect and variable detection probability) is an essential consideration when investigators map distributions and interpret habitat associations. Addressing fish detection error caused by highly variable instream environments is a particular challenge in sand‐bed streams of the Great Plains. We quantified seining detection probability for diminutive Great Plains fishes across a range of sampling conditions in two sand‐bed rivers in Oklahoma. Imperfect detection resulted in underestimates of species occurrence using naïve estimates, particularly for less common fishes. Seining detection probability also varied among fishes and across sampling conditions. We observed a quadratic relationship between water depth and detection probability, in which the exact nature of the relationship was species‐specific and dependent on water clarity. Similarly, the direction of the relationship between water clarity and detection probability was species‐specific and dependent on differences in water depth. The relationship between water temperature and detection probability was also species dependent, with both the magnitude and direction of the relationship varying among fishes. We showed how ignoring detection error confounded an underlying relationship between species occurrence and water depth. Despite imperfect and heterogeneous detection, our results indicate that species absence can be confidently determined with two to six spatially replicated seine hauls per 200‐m reach under average sampling conditions; however, required effort would be higher under certain conditions. Detection probability was low for the Arkansas River Shiner Notropis girardi, which is federally listed as threatened, and more than 10 seine hauls per 200‐m reach would be required to assess presence across sampling conditions.
Our model allows scientists to estimate sampling effort to confidently assess species occurrence, which maximizes the use of available resources. Increased implementation of approaches that consider detection error promote ecological advancements and conservation and management decisions that are better informed.
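    The sampling-effort guidance above follows directly from treating replicated seine hauls as independent detection trials: if each haul detects a present species with probability p, the chance of missing it in n hauls is (1 - p)^n. A sketch, with illustrative detection probabilities rather than the study's estimates:

```python
import math

def hauls_needed(p_detect, confidence=0.95):
    """Smallest number of independent seine hauls so that the chance of
    missing a species that is actually present falls below 1 - confidence,
    assuming each haul detects it with probability p_detect."""
    miss = 1 - confidence
    return math.ceil(math.log(miss) / math.log(1 - p_detect))

n_common = hauls_needed(0.40)   # a readily detected species -> 6
n_shiner = hauls_needed(0.25)   # a low-detectability species -> 11
```

    These illustrative numbers line up with the two-to-six-haul guidance for average conditions and the more-than-ten hauls reported for the low-detectability Arkansas River Shiner.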

  5. Ferrocene-oligonucleotide conjugates for electrochemical probing of DNA.

    PubMed Central

    Ihara, T; Maruo, Y; Takenaka, S; Takagi, M

    1996-01-01

    Toward the development of a universal, sensitive and convenient method of DNA (or RNA) detection, electrochemically active oligonucleotides were prepared by covalent linkage of a ferrocenyl group to 5'-aminohexyl-terminated synthetic oligonucleotides. Using these electrochemically active probes, we have been able to demonstrate the detection of DNA and RNA at femtomole levels by HPLC equipped with an ordinary electrochemical detector (ECD) [Takenaka,S., Uto,Y., Kondo,H., Ihara,T. and Takagi,M. (1994) Anal. Biochem., 218, 436-443]. Thermodynamic and electrochemical studies of the interaction between the probes and the targets are presented here. The thermodynamics revealed that the conjugation stabilizes the triple-helix complexes by 2-3 kcal mol-1 (a 1-2 order of magnitude increase in binding constant) at 298 K, which corresponds to the effect of elongating the probe by several additional base triplets. The main cause of this thermodynamic stabilization is likely to be an overall conformational change of the whole structure of the conjugate rather than additional local interactions. The redox potential of the probe was independent of the target structure, whether single- or double-stranded. However, the potential is slightly dependent (with a 10-30 mV negative shift on complexation) on the extra sequence in the target, probably because each sequence contacts or interacts with the ferrocenyl group in a slightly different way. This small potential shift, however, does not cause any inconvenience in practical applications detecting the probes with an ECD. These results lead to the conclusion that the redox-active probes are very useful for the microanalysis of nucleic acids due to the stability of the complexes, high detection sensitivity, and wide applicability to target structures (DNA and RNA; single and double strands) and sequences. PMID:8932383

  6. Prevalence effects in newly trained airport checkpoint screeners: trained observers miss rare targets, too.

    PubMed

    Wolfe, Jeremy M; Brunelli, David N; Rubinstein, Joshua; Horowitz, Todd S

    2013-12-02

    Many socially important search tasks are characterized by low target prevalence, meaning that targets are rarely encountered. For example, transportation security officers (TSOs) at airport checkpoints encounter very few actual threats in carry-on bags. In laboratory-based visual search experiments, low prevalence reduces the probability of detecting targets (Wolfe, Horowitz, & Kenner, 2005). In the lab, this "prevalence effect" is caused by changes in decision and response criteria (Wolfe & Van Wert, 2010) and can be mitigated by presenting a burst of high-prevalence search with feedback (Wolfe et al., 2007). The goal of this study was to see if these effects could be replicated in the field with TSOs. A total of 125 newly trained TSOs participated in one of two experiments as part of their final evaluation following training. They searched for threats in simulated bags across five blocks. The first three blocks were low prevalence (target prevalence ≤ .05) with no feedback; the fourth block was high prevalence (.50) with full feedback; and the final block was, again, low prevalence. We found that newly trained TSOs were better at detecting targets at high compared to low prevalence, replicating the prevalence effect. Furthermore, performance was better (and response criterion was more "liberal") in the low-prevalence block that took place after the high-prevalence block than in the initial three low-prevalence blocks, suggesting that a burst of high-prevalence trials may help alleviate the prevalence effect in the field.
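    The criterion shifts described above are usually quantified with signal detection theory, which separates sensitivity (d') from response criterion (c). A sketch with invented hit and false-alarm rates, not the study's data; under this convention a more conservative criterion at low prevalence shows up as a larger c.

```python
from statistics import NormalDist

def dprime_criterion(hit_rate, fa_rate):
    """Signal-detection sensitivity (d') and criterion (c) from hit and
    false-alarm rates: d' = z(H) - z(FA), c = -(z(H) + z(FA)) / 2."""
    z = NormalDist().inv_cdf
    d = z(hit_rate) - z(fa_rate)
    c = -0.5 * (z(hit_rate) + z(fa_rate))
    return d, c

# Low prevalence: conservative criterion (many misses, few false alarms)
d_low, c_low = dprime_criterion(hit_rate=0.60, fa_rate=0.05)
# After a high-prevalence burst with feedback: a more liberal criterion
d_high, c_high = dprime_criterion(hit_rate=0.85, fa_rate=0.15)
```

    Comparing c across prevalence blocks, rather than raw hit rates, is what lets studies like this attribute the prevalence effect to criterion shifts rather than changes in sensitivity.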

  8. First targeted search for gravitational-wave bursts from core-collapse supernovae in data of first-generation laser interferometer detectors

    NASA Astrophysics Data System (ADS)

    Abbott, B. P.; Abbott, R.; Abbott, T. D.; Abernathy, M. R.; Acernese, F.; Ackley, K.; Adams, C.; Adams, T.; Addesso, P.; Adhikari, R. X.; Adya, V. B.; Affeldt, C.; Agathos, M.; Agatsuma, K.; Aggarwal, N.; Aguiar, O. D.; Aiello, L.; Ain, A.; Ajith, P.; Allen, B.; Allocca, A.; Altin, P. A.; Anderson, S. B.; Anderson, W. G.; Arai, K.; Araya, M. C.; Arceneaux, C. C.; Areeda, J. S.; Arnaud, N.; Arun, K. G.; Ascenzi, S.; Ashton, G.; Ast, M.; Aston, S. M.; Astone, P.; Aufmuth, P.; Aulbert, C.; Babak, S.; Bacon, P.; Bader, M. K. M.; Baker, P. T.; Baldaccini, F.; Ballardin, G.; Ballmer, S. W.; Barayoga, J. C.; Barclay, S. E.; Barish, B. C.; Barker, D.; Barone, F.; Barr, B.; Barsotti, L.; Barsuglia, M.; Barta, D.; Bartlett, J.; Bartos, I.; Bassiri, R.; Basti, A.; Batch, J. C.; Baune, C.; Bavigadda, V.; Bazzan, M.; Behnke, B.; Bejger, M.; Bell, A. S.; Bell, C. J.; Berger, B. K.; Bergman, J.; Bergmann, G.; Berry, C. P. L.; Bersanetti, D.; Bertolini, A.; Betzwieser, J.; Bhagwat, S.; Bhandare, R.; Bilenko, I. A.; Billingsley, G.; Birch, J.; Birney, R.; Biscans, S.; Bisht, A.; Bitossi, M.; Biwer, C.; Bizouard, M. A.; Blackburn, J. K.; Blair, C. D.; Blair, D. G.; Blair, R. M.; Bloemen, S.; Bock, O.; Bodiya, T. P.; Boer, M.; Bogaert, G.; Bogan, C.; Bohe, A.; Bojtos, P.; Bond, C.; Bondu, F.; Bonnand, R.; Boom, B. A.; Bork, R.; Boschi, V.; Bose, S.; Bouffanais, Y.; Bozzi, A.; Bradaschia, C.; Brady, P. R.; Braginsky, V. B.; Branchesi, M.; Brau, J. E.; Briant, T.; Brillet, A.; Brinkmann, M.; Brisson, V.; Brockill, P.; Brooks, A. F.; Brown, D. A.; Brown, D. D.; Brown, N. M.; Buchanan, C. C.; Buikema, A.; Bulik, T.; Bulten, H. J.; Buonanno, A.; Buskulic, D.; Buy, C.; Byer, R. L.; Cadonati, L.; Cagnoli, G.; Cahillane, C.; Calderón Bustillo, J.; Callister, T.; Calloni, E.; Camp, J. B.; Cannon, K. C.; Cao, J.; Capano, C. D.; Capocasa, E.; Carbognani, F.; Caride, S.; Casanueva Diaz, J.; Casentini, C.; Caudill, S.; Cavaglià, M.; Cavalier, F.; Cavalieri, R.; Cella, G.; Cepeda, C. 
B.; Cerboni Baiardi, L.; Cerretani, G.; Cesarini, E.; Chakraborty, R.; Chalermsongsak, T.; Chamberlin, S. J.; Chan, M.; Chao, S.; Charlton, P.; Chassande-Mottin, E.; Chen, H. Y.; Chen, Y.; Cheng, C.; Chincarini, A.; Chiummo, A.; Cho, H. S.; Cho, M.; Chow, J. H.; Christensen, N.; Chu, Q.; Chua, S.; Chung, S.; Ciani, G.; Clara, F.; Clark, J. A.; Cleva, F.; Coccia, E.; Cohadon, P.-F.; Colla, A.; Collette, C. G.; Cominsky, L.; Constancio, M.; Conte, A.; Conti, L.; Cook, D.; Corbitt, T. R.; Cornish, N.; Corpuz, A.; Corsi, A.; Cortese, S.; Costa, C. A.; Coughlin, M. W.; Coughlin, S. B.; Coulon, J.-P.; Countryman, S. T.; Couvares, P.; Coward, D. M.; Cowart, M. J.; Coyne, D. C.; Coyne, R.; Craig, K.; Creighton, J. D. E.; Cripe, J.; Crowder, S. G.; Cumming, A.; Cunningham, L.; Cuoco, E.; Dal Canton, T.; Danilishin, S. L.; D'Antonio, S.; Danzmann, K.; Darman, N. S.; Dattilo, V.; Dave, I.; Daveloza, H. P.; Davier, M.; Davies, G. S.; Daw, E. J.; Day, R.; DeBra, D.; Debreczeni, G.; Degallaix, J.; De Laurentis, M.; Deléglise, S.; Del Pozzo, W.; Denker, T.; Dent, T.; Dergachev, V.; De Rosa, R.; DeRosa, R. T.; DeSalvo, R.; Dhurandhar, S.; Díaz, M. C.; Di Fiore, L.; Di Giovanni, M.; Di Girolamo, T.; Di Lieto, A.; Di Pace, S.; Di Palma, I.; Di Virgilio, A.; Dojcinoski, G.; Dolique, V.; Donovan, F.; Dooley, K. L.; Doravari, S.; Douglas, R.; Downes, T. P.; Drago, M.; Drever, R. W. P.; Driggers, J. C.; Du, Z.; Ducrot, M.; Dwyer, S. E.; Edo, T. B.; Edwards, M. C.; Effler, A.; Eggenstein, H.-B.; Ehrens, P.; Eichholz, J.; Eikenberry, S. S.; Engels, W.; Essick, R. C.; Etzel, T.; Evans, M.; Evans, T. M.; Everett, R.; Factourovich, M.; Fafone, V.; Fair, H.; Fairhurst, S.; Fan, X.; Fang, Q.; Farinon, S.; Farr, B.; Farr, W. M.; Favata, M.; Fays, M.; Fehrmann, H.; Fejer, M. M.; Ferrante, I.; Ferreira, E. C.; Ferrini, F.; Fidecaro, F.; Fiori, I.; Fiorucci, D.; Fisher, R. 
P.; Flaminio, R.; Fletcher, M.; Fournier, J.-D.; Frasca, S.; Frasconi, F.; Frei, Z.; Freise, A.; Frey, R.; Frey, V.; Fricke, T. T.; Fritschel, P.; Frolov, V. V.; Fulda, P.; Fyffe, M.; Gabbard, H. A. G.; Gair, J. R.; Gammaitoni, L.; Gaonkar, S. G.; Garufi, F.; Gaur, G.; Gehrels, N.; Gemme, G.; Genin, E.; Gennai, A.; George, J.; Gergely, L.; Germain, V.; Ghosh, Archisman; Ghosh, S.; Giaime, J. A.; Giardina, K. D.; Giazotto, A.; Gill, K.; Glaefke, A.; Goetz, E.; Goetz, R.; Gondan, L.; González, G.; Gonzalez Castro, J. M.; Gopakumar, A.; Gordon, N. A.; Gorodetsky, M. L.; Gossan, S. E.; Gosselin, M.; Gouaty, R.; Grado, A.; Graef, C.; Graff, P. B.; Granata, M.; Grant, A.; Gras, S.; Gray, C.; Greco, G.; Green, A. C.; Groot, P.; Grote, H.; Grunewald, S.; Guidi, G. M.; Guo, X.; Gupta, A.; Gupta, M. K.; Gushwa, K. E.; Gustafson, E. K.; Gustafson, R.; Hacker, J. J.; Hall, B. R.; Hall, E. D.; Hammond, G.; Haney, M.; Hanke, M. M.; Hanks, J.; Hanna, C.; Hannam, M. D.; Hanson, J.; Hardwick, T.; Harms, J.; Harry, G. M.; Harry, I. W.; Hart, M. J.; Hartman, M. T.; Haster, C.-J.; Haughian, K.; Heidmann, A.; Heintze, M. C.; Heitmann, H.; Hello, P.; Hemming, G.; Hendry, M.; Heng, I. S.; Hennig, J.; Heptonstall, A. W.; Heurs, M.; Hild, S.; Hoak, D.; Hodge, K. A.; Hofman, D.; Hollitt, S. E.; Holt, K.; Holz, D. E.; Hopkins, P.; Hosken, D. J.; Hough, J.; Houston, E. A.; Howell, E. J.; Hu, Y. M.; Huang, S.; Huerta, E. A.; Huet, D.; Hughey, B.; Husa, S.; Huttner, S. H.; Huynh-Dinh, T.; Idrisy, A.; Indik, N.; Ingram, D. R.; Inta, R.; Isa, H. N.; Isac, J.-M.; Isi, M.; Islas, G.; Isogai, T.; Iyer, B. R.; Izumi, K.; Jacqmin, T.; Jang, H.; Jani, K.; Jaranowski, P.; Jawahar, S.; Jiménez-Forteza, F.; Johnson, W. W.; Jones, D. I.; Jones, R.; Jonker, R. J. G.; Ju, L.; Haris, K.; Kalaghatgi, C. V.; Kalmus, P.; Kalogera, V.; Kamaretsos, I.; Kandhasamy, S.; Kang, G.; Kanner, J. 
B.; Karki, S.; Kasprzack, M.; Katsavounidis, E.; Katzman, W.; Kaufer, S.; Kaur, T.; Kawabe, K.; Kawazoe, F.; Kéfélian, F.; Kehl, M. S.; Keitel, D.; Kelley, D. B.; Kells, W.; Kennedy, R.; Key, J. S.; Khalaidovski, A.; Khalili, F. Y.; Khan, I.; Khan, S.; Khan, Z.; Khazanov, E. A.; Kijbunchoo, N.; Kim, Chunglee; Kim, J.; Kim, K.; Kim, Nam-Gyu; Kim, Namjun; Kim, Y.-M.; King, E. J.; King, P. J.; Kinzel, D. L.; Kissel, J. S.; Kleybolte, L.; Klimenko, S.; Koehlenbeck, S. M.; Kokeyama, K.; Koley, S.; Kondrashov, V.; Kontos, A.; Korobko, M.; Korth, W. Z.; Kowalska, I.; Kozak, D. B.; Kringel, V.; Krishnan, B.; Królak, A.; Krueger, C.; Kuehn, G.; Kumar, P.; Kuo, L.; Kutynia, A.; Lackey, B. D.; Landry, M.; Lange, J.; Lantz, B.; Lasky, P. D.; Lazzarini, A.; Lazzaro, C.; Leaci, P.; Leavey, S.; Lebigot, E. O.; Lee, C. H.; Lee, H. K.; Lee, H. M.; Lee, K.; Lenon, A.; Leonardi, M.; Leong, J. R.; Leroy, N.; Letendre, N.; Levin, Y.; Levine, B. M.; Li, T. G. F.; Libson, A.; Littenberg, T. B.; Lockerbie, N. A.; Loew, K.; Logue, J.; Lombardi, A. L.; Lord, J. E.; Lorenzini, M.; Loriette, V.; Lormand, M.; Losurdo, G.; Lough, J. D.; Lück, H.; Lundgren, A. P.; Luo, J.; Lynch, R.; Ma, Y.; MacDonald, T.; Machenschalk, B.; MacInnis, M.; Macleod, D. M.; Magaña-Sandoval, F.; Magee, R. M.; Mageswaran, M.; Majorana, E.; Maksimovic, I.; Malvezzi, V.; Man, N.; Mandel, I.; Mandic, V.; Mangano, V.; Mansell, G. L.; Manske, M.; Mantovani, M.; Marchesoni, F.; Marion, F.; Márka, S.; Márka, Z.; Markosyan, A. S.; Maros, E.; Martelli, F.; Martellini, L.; Martin, I. W.; Martin, R. M.; Martynov, D. V.; Marx, J. N.; Mason, K.; Masserot, A.; Massinger, T. J.; Masso-Reid, M.; Mastrogiovanni, S.; Matichard, F.; Matone, L.; Mavalvala, N.; Mazumder, N.; Mazzolo, G.; McCarthy, R.; McClelland, D. E.; McCormick, S.; McGuire, S. C.; McIntyre, G.; McIver, J.; McManus, D. J.; McWilliams, S. T.; Meacher, D.; Meadors, G. D.; Meidam, J.; Melatos, A.; Mendell, G.; Mendoza-Gandara, D.; Mercer, R. A.; Merilh, E. 
L.; Merzougui, M.; Meshkov, S.; Messenger, C.; Messick, C.; Metzdorff, R.; Meyers, P. M.; Mezzani, F.; Miao, H.; Michel, C.; Middleton, H.; Mikhailov, E. E.; Milano, L.; Miller, A. L.; Miller, J.; Millhouse, M.; Minenkov, Y.; Ming, J.; Mirshekari, S.; Mishra, C.; Mitra, S.; Mitrofanov, V. P.; Mitselmakher, G.; Mittleman, R.; Moggi, A.; Mohan, M.; Mohapatra, S. R. P.; Montani, M.; Moore, B. C.; Moore, C. J.; Moraru, D.; Moreno, G.; Morriss, S. R.; Mossavi, K.; Mours, B.; Mow-Lowry, C. M.; Mueller, C. L.; Mueller, G.; Muir, A. W.; Mukherjee, Arunava; Mukherjee, D.; Mukherjee, S.; Mukund, K. N.; Mullavey, A.; Munch, J.; Murphy, D. J.; Murray, P. G.; Mytidis, A.; Nardecchia, I.; Naticchioni, L.; Nayak, R. K.; Necula, V.; Nedkova, K.; Nelemans, G.; Neri, M.; Neunzert, A.; Newton, G.; Nguyen, T. T.; Nielsen, A. B.; Nissanke, S.; Nitz, A.; Nocera, F.; Nolting, D.; Normandin, M. E. N.; Nuttall, L. K.; Oberling, J.; Ochsner, E.; O'Dell, J.; Oelker, E.; Ogin, G. H.; Oh, J. J.; Oh, S. H.; Ohme, F.; Oliver, M.; Oppermann, P.; Oram, Richard J.; O'Reilly, B.; O'Shaughnessy, R.; Ott, C. D.; Ottaway, D. J.; Ottens, R. S.; Overmier, H.; Owen, B. J.; Pai, A.; Pai, S. A.; Palamos, J. R.; Palashov, O.; Palomba, C.; Pal-Singh, A.; Pan, H.; Pankow, C.; Pannarale, F.; Pant, B. C.; Paoletti, F.; Paoli, A.; Papa, M. A.; Paris, H. R.; Parker, W.; Pascucci, D.; Pasqualetti, A.; Passaquieti, R.; Passuello, D.; Patricelli, B.; Patrick, Z.; Pearlstone, B. L.; Pedraza, M.; Pedurand, R.; Pekowsky, L.; Pele, A.; Penn, S.; Pereira, R.; Perreca, A.; Phelps, M.; Piccinni, O. J.; Pichot, M.; Piergiovanni, F.; Pierro, V.; Pillant, G.; Pinard, L.; Pinto, I. M.; Pitkin, M.; Poggiani, R.; Popolizio, P.; Post, A.; Powell, J.; Prasad, J.; Predoi, V.; Premachandra, S. S.; Prestegard, T.; Price, L. R.; Prijatelj, M.; Principe, M.; Privitera, S.; Prix, R.; Prodi, G. A.; Prokhorov, L.; Puncken, O.; Punturo, M.; Puppo, P.; Pürrer, M.; Qi, H.; Qin, J.; Quetschke, V.; Quintero, E. A.; Quitzow-James, R.; Raab, F. 
J.; Rabeling, D. S.; Radkins, H.; Raffai, P.; Raja, S.; Rakhmanov, M.; Rapagnani, P.; Raymond, V.; Razzano, M.; Re, V.; Read, J.; Reed, C. M.; Regimbau, T.; Rei, L.; Reid, S.; Reitze, D. H.; Rew, H.; Ricci, F.; Riles, K.; Robertson, N. A.; Robie, R.; Robinet, F.; Rocchi, A.; Rolland, L.; Rollins, J. G.; Roma, V. J.; Romano, J. D.; Romano, R.; Romanov, G.; Romie, J. H.; Rosińska, D.; Rowan, S.; Rüdiger, A.; Ruggi, P.; Ryan, K.; Sachdev, S.; Sadecki, T.; Sadeghian, L.; Salconi, L.; Saleem, M.; Salemi, F.; Samajdar, A.; Sammut, L.; Sanchez, E. J.; Sandberg, V.; Sandeen, B.; Sanders, J. R.; Santamaria, L.; Sassolas, B.; Sathyaprakash, B. S.; Saulson, P. R.; Sauter, O. E. S.; Savage, R. L.; Sawadsky, A.; Schale, P.; Schilling, R.; Schmidt, J.; Schmidt, P.; Schnabel, R.; Schofield, R. M. S.; Schönbeck, A.; Schreiber, E.; Schuette, D.; Schutz, B. F.; Scott, J.; Scott, S. M.; Sellers, D.; Sentenac, D.; Sequino, V.; Sergeev, A.; Serna, G.; Setyawati, Y.; Sevigny, A.; Shaddock, D. A.; Shahriar, M. S.; Shaltev, M.; Shao, Z.; Shapiro, B.; Shawhan, P.; Sheperd, A.; Shoemaker, D. H.; Shoemaker, D. M.; Siellez, K.; Siemens, X.; Sieniawska, M.; Sigg, D.; Silva, A. D.; Simakov, D.; Singer, A.; Singer, L. P.; Singh, A.; Singh, R.; Singhal, A.; Sintes, A. M.; Slagmolen, B. J. J.; Smith, J. R.; Smith, N. D.; Smith, R. J. E.; Son, E. J.; Sorazu, B.; Sorrentino, F.; Souradeep, T.; Srivastava, A. K.; Staley, A.; Steinke, M.; Steinlechner, J.; Steinlechner, S.; Steinmeyer, D.; Stephens, B. C.; Stone, R.; Strain, K. A.; Straniero, N.; Stratta, G.; Strauss, N. A.; Strigin, S.; Sturani, R.; Stuver, A. L.; Summerscales, T. Z.; Sun, L.; Sutton, P. J.; Swinkels, B. L.; Szczepańczyk, M. J.; Tacca, M.; Talukder, D.; Tanner, D. B.; Tápai, M.; Tarabrin, S. P.; Taracchini, A.; Taylor, R.; Theeg, T.; Thirugnanasambandam, M. P.; Thomas, E. G.; Thomas, M.; Thomas, P.; Thorne, K. A.; Thorne, K. S.; Thrane, E.; Tiwari, S.; Tiwari, V.; Tokmakov, K. V.; Tomlinson, C.; Tonelli, M.; Torres, C. V.; Torrie, C. 
I.; Töyrä, D.; Travasso, F.; Traylor, G.; Trifirò, D.; Tringali, M. C.; Trozzo, L.; Tse, M.; Turconi, M.; Tuyenbayev, D.; Ugolini, D.; Unnikrishnan, C. S.; Urban, A. L.; Usman, S. A.; Vahlbruch, H.; Vajente, G.; Valdes, G.; van Bakel, N.; van Beuzekom, M.; van den Brand, J. F. J.; Van Den Broeck, C.; Vander-Hyde, D. C.; van der Schaaf, L.; van Heijningen, J. V.; van Veggel, A. A.; Vardaro, M.; Vass, S.; Vasúth, M.; Vaulin, R.; Vecchio, A.; Vedovato, G.; Veitch, J.; Veitch, P. J.; Venkateswara, K.; Verkindt, D.; Vetrano, F.; Viceré, A.; Vinciguerra, S.; Vine, D. J.; Vinet, J.-Y.; Vitale, S.; Vo, T.; Vocca, H.; Vorvick, C.; Voss, D. V.; Vousden, W. D.; Vyatchanin, S. P.; Wade, A. R.; Wade, L. E.; Wade, M.; Walker, M.; Wallace, L.; Walsh, S.; Wang, G.; Wang, H.; Wang, M.; Wang, X.; Wang, Y.; Ward, R. L.; Warner, J.; Was, M.; Weaver, B.; Wei, L.-W.; Weinert, M.; Weinstein, A. J.; Weiss, R.; Welborn, T.; Wen, L.; Weßels, P.; Westphal, T.; Wette, K.; Whelan, J. T.; Whitcomb, S. E.; White, D. J.; Whiting, B. F.; Williams, R. D.; Williamson, A. R.; Willis, J. L.; Willke, B.; Wimmer, M. H.; Winkler, W.; Wipf, C. C.; Wittel, H.; Woan, G.; Worden, J.; Wright, J. L.; Wu, G.; Yablon, J.; Yam, W.; Yamamoto, H.; Yancey, C. C.; Yap, M. J.; Yu, H.; Yvert, M.; ZadroŻny, A.; Zangrando, L.; Zanolin, M.; Zendri, J.-P.; Zevin, M.; Zhang, F.; Zhang, L.; Zhang, M.; Zhang, Y.; Zhao, C.; Zhou, M.; Zhou, Z.; Zhu, X. J.; Zucker, M. E.; Zuraw, S. E.; Zweizig, J.; LIGO Scientific Collaboration; Virgo Collaboration

    2016-11-01

    We present results from a search for gravitational-wave bursts coincident with two core-collapse supernovae observed optically in 2007 and 2011. We employ data from the Laser Interferometer Gravitational-wave Observatory (LIGO), the Virgo gravitational-wave observatory, and the GEO 600 gravitational-wave observatory. The targeted core-collapse supernovae were selected on the basis of (1) proximity (within approximately 15 Mpc), (2) tightness of observational constraints on the time of core collapse that defines the gravitational-wave search window, and (3) coincident operation of at least two interferometers at the time of core collapse. We find no plausible gravitational-wave candidates. We present the probability of detecting signals from both astrophysically well-motivated and more speculative gravitational-wave emission mechanisms as a function of distance from Earth, and discuss the implications for the detection of gravitational waves from core-collapse supernovae by the upgraded Advanced LIGO and Virgo detectors.

  9. Optimal random search for a single hidden target.

    PubMed

    Snider, Joseph

    2011-01-01

    A single target is hidden at a location chosen from a predetermined probability distribution. Then, a searcher must find a second probability distribution from which random search points are sampled such that the target is found in the minimum number of trials. Here it will be shown that if the searcher must get very close to the target to find it, then the best search distribution is proportional to the square root of the target distribution regardless of dimension. For a Gaussian target distribution, the optimum search distribution is approximately a Gaussian with a standard deviation that varies inversely with how close the searcher must be to the target to find it. For a network where the searcher randomly samples nodes and looks for the fixed target along edges, the optimum is either to sample a node with probability proportional to the square root of the out-degree plus 1 or not to do so at all.
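
The square-root law has a short discrete-space justification. If the target occupies bin i with probability p_i and each independent search draw lands in bin i with probability q_i, the expected number of draws until the target's bin is hit is the sum of p_i/q_i, which Cauchy-Schwarz shows is minimized when q_i is proportional to the square root of p_i. A small numeric check (the example distribution is illustrative):

```python
from math import sqrt

def expected_trials(p, q):
    """E[# of search draws] when the target is in bin i w.p. p[i]
    and each draw independently lands in bin i w.p. q[i]."""
    return sum(pi / qi for pi, qi in zip(p, q))

def normalize(w):
    s = sum(w)
    return [x / s for x in w]

p = [0.7, 0.2, 0.1]                       # target distribution (illustrative)
q_match = p[:]                            # search where the target is likely
q_uniform = [1 / 3] * 3                   # ignore the prior entirely
q_sqrt = normalize([sqrt(x) for x in p])  # square-root rule

for name, q in [("match", q_match), ("uniform", q_uniform), ("sqrt", q_sqrt)]:
    print(name, round(expected_trials(p, q), 3))
# The sqrt strategy wins: (sum_i sqrt(p_i))^2 ~ 2.56 draws vs 3.0 for both others.
```

Note that sampling directly from the target distribution is no better than uniform sampling here; the optimum deliberately flattens the search relative to the prior.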

  10. Internal Medicine residents use heuristics to estimate disease probability

    PubMed Central

    Phang, Sen Han; Ravani, Pietro; Schaefer, Jeffrey; Wright, Bruce; McLaughlin, Kevin

    2015-01-01

    Background Training in Bayesian reasoning may have limited impact on accuracy of probability estimates. In this study, our goal was to explore whether residents previously exposed to Bayesian reasoning use heuristics rather than Bayesian reasoning to estimate disease probabilities. We predicted that if residents use heuristics, then post-test probability estimates would be increased by non-discriminating clinical features or by a high anchor for a target condition. Method We randomized 55 Internal Medicine residents to different versions of four clinical vignettes and asked them to estimate probabilities of target conditions. We manipulated the clinical data for each vignette to be consistent with either 1) use of the representativeness heuristic, by adding non-discriminating prototypical clinical features of the target condition, or 2) use of the anchoring-with-adjustment heuristic, by providing a high or low anchor for the target condition. Results When presented with additional non-discriminating data, the odds of diagnosing the target condition were increased (odds ratio (OR) 2.83, 95% confidence interval [1.30, 6.15], p = 0.009). Similarly, the odds of diagnosing the target condition were increased when a high anchor preceded the vignette (OR 2.04, [1.09, 3.81], p = 0.025). Conclusions Our findings suggest that despite previous exposure to Bayesian reasoning, residents use heuristics, such as the representativeness heuristic and anchoring with adjustment, to estimate probabilities. Potential reasons for attribute substitution include the relative cognitive ease of heuristics vs. Bayesian reasoning, or perhaps residents in clinical practice use gist traces rather than precise probability estimates when diagnosing. PMID:27004080
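
For contrast, the normative Bayesian update the residents were trained on is mechanical: convert the pre-test probability to odds, multiply by the likelihood ratio (LR) of the finding, and convert back. A minimal sketch (the numbers are illustrative, not from the vignettes); the key point is that a non-discriminating feature has LR = 1 and should leave the estimate unchanged:

```python
def post_test_probability(pre_test_prob: float, likelihood_ratio: float) -> float:
    """Bayes' theorem in odds form: post-odds = pre-odds * LR."""
    pre_odds = pre_test_prob / (1.0 - pre_test_prob)
    post_odds = pre_odds * likelihood_ratio
    return post_odds / (1.0 + post_odds)

# A discriminating finding (LR = 10) raises a 20% pre-test probability:
print(round(post_test_probability(0.20, 10.0), 3))  # ~0.714

# A non-discriminating prototypical feature (LR = 1) changes nothing,
# yet the study found residents raised their estimates anyway:
print(round(post_test_probability(0.20, 1.0), 3))   # ~0.2
```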

  11. Tools for in silico target fishing.

    PubMed

    Cereto-Massagué, Adrià; Ojeda, María José; Valls, Cristina; Mulero, Miquel; Pujadas, Gerard; Garcia-Vallve, Santiago

    2015-01-01

    Computational target fishing methods are designed to identify the most probable target of a query molecule. This process may allow the prediction of the bioactivity of a compound, the identification of the mode of action of known drugs, the detection of drug polypharmacology, drug repositioning or the prediction of the adverse effects of a compound. The large amount of information regarding the bioactivity of thousands of small molecules now allows the development of these types of methods. In recent years, we have witnessed the emergence of many methods for in silico target fishing. Most of these methods are based on the similarity principle, i.e., that similar molecules might bind to the same targets and have similar bioactivities. However, the difficult validation of target fishing methods hinders comparisons of the performance of each method. In this review, we describe the different methods developed for target prediction, the bioactivity databases most frequently used by these methods, and the publicly available programs and servers that enable non-specialist users to obtain these types of predictions. It is expected that target prediction will have a large impact on drug development and on the functional food industry.
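
The similarity principle underlying most of these methods can be made concrete with fingerprint-based ranking: represent each molecule as a set of structural features, score the query against the known ligands of each target with the Tanimoto coefficient, and predict the target whose ligands score highest. A minimal sketch with toy feature sets (real tools use hashed fingerprints such as ECFP; the target names and data here are illustrative):

```python
def tanimoto(a: set, b: set) -> float:
    """Tanimoto (Jaccard) coefficient between two feature sets."""
    if not a and not b:
        return 0.0
    return len(a & b) / len(a | b)

def fish_target(query: set, ligands_by_target: dict) -> str:
    """Predict the target whose known ligands are most similar to the query."""
    return max(ligands_by_target,
               key=lambda t: max(tanimoto(query, lig)
                                 for lig in ligands_by_target[t]))

# Toy data: small integer sets stand in for molecular fingerprints.
ligands = {
    "target_A": [{1, 2, 3, 4}, {2, 3, 5}],
    "target_B": [{7, 8, 9}, {8, 9, 10}],
}
print(fish_target({2, 3, 4, 5}, ligands))  # target_A
```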

  12. Heterogeneous detection probabilities for imperiled Missouri River fishes: implications for large-river monitoring programs

    USGS Publications Warehouse

    Schloesser, J.T.; Paukert, Craig P.; Doyle, W.J.; Hill, Tracy D.; Steffensen, K.D.; Travnichek, Vincent H.

    2012-01-01

    Occupancy modeling was used to determine (1) if detection probabilities (p) for 7 regionally imperiled Missouri River fishes (Scaphirhynchus albus, Scaphirhynchus platorynchus, Cycleptus elongatus, Sander canadensis, Macrhybopsis aestivalis, Macrhybopsis gelida, and Macrhybopsis meeki) differed among gear types (i.e. stationary gill nets, drifted trammel nets, and otter trawls), and (2) how detection probabilities were affected by habitat (i.e. pool, bar, and open water), longitudinal position (five 189 to 367 rkm long segments), sampling year (2003 to 2006), and season (July 1 to October 30 and October 31 to June 30). Adult, large-bodied fishes were best detected with gill nets (p: 0.02–0.74), but most juvenile large-bodied and all small-bodied species were best detected with otter trawls (p: 0.02–0.58). Trammel nets may be a redundant sampling gear for imperiled fishes in the lower Missouri River because most species had greater detection probabilities with gill nets or otter trawls. Detection probabilities varied with river segment for S. platorynchus, C. elongatus, and all small-bodied fishes, suggesting that changes in habitat influenced gear efficiency or abundance changes among river segments. Detection probabilities varied by habitat for adult S. albus and S. canadensis, year for juvenile S. albus, C. elongatus, and S. canadensis, and season for adult S. albus. Concentrating sampling effort on gears with the greatest detection probabilities may increase species detections to better monitor a population's response to environmental change and the effects of management actions on large-river fishes.

  13. Anomaly-based intrusion detection for SCADA systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, D.; Usynin, A.; Hines, J. W.

    2006-07-01

    Most critical infrastructure, such as chemical processing plants, electrical generation and distribution networks, and gas distribution, is monitored and controlled by Supervisory Control and Data Acquisition (SCADA) systems. These systems have been the focus of increased security concern, and there are fears that they could be the target of international terrorists. With the constantly growing number of internet-related computer attacks, there is evidence that our critical infrastructure may also be vulnerable. Researchers estimate that malicious online actions may have caused $75 billion in damages as of 2007. One interesting countermeasure for enhancing information system security is intrusion detection. This paper briefly discusses the history of research in intrusion detection techniques and introduces the two basic detection approaches: signature detection and anomaly detection. Finally, it presents the application of techniques developed for monitoring critical process systems, such as nuclear power plants, to anomaly-based intrusion detection. The method uses an auto-associative kernel regression (AAKR) model coupled with the sequential probability ratio test (SPRT), applied to a simulated SCADA system. The results show that these methods can generally be used to detect a variety of common attacks. (authors)
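
The SPRT step can be sketched in isolation: accumulate the log-likelihood ratio of each model residual under an "anomalous" vs. a "normal" hypothesis and stop when it crosses Wald's thresholds. A minimal sketch assuming Gaussian residuals with a mean shift under attack (the parameter values are illustrative, and the AAKR model that produces the residuals is not implemented here):

```python
from math import log

def sprt(residuals, mu0=0.0, mu1=5.0, sigma=1.0, alpha=0.01, beta=0.01):
    """Wald's sequential probability ratio test on a residual stream.

    H0: residuals ~ N(mu0, sigma)  (normal operation)
    H1: residuals ~ N(mu1, sigma)  (anomaly / attack)
    Returns (decision, number_of_samples_used).
    """
    upper = log((1.0 - beta) / alpha)   # accept H1 at or above this
    lower = log(beta / (1.0 - alpha))   # accept H0 at or below this
    llr = 0.0
    for n, x in enumerate(residuals, start=1):
        # Gaussian log-likelihood ratio contribution of one sample.
        llr += ((x - mu0) ** 2 - (x - mu1) ** 2) / (2.0 * sigma ** 2)
        if llr >= upper:
            return "anomaly", n
        if llr <= lower:
            return "normal", n
    return "undecided", len(residuals)

print(sprt([4.8, 5.1, 5.0]))   # ('anomaly', 1)
print(sprt([0.1, -0.2, 0.0]))  # ('normal', 1)
```

The sequential form is what makes this attractive for online SCADA monitoring: it commits to a decision as soon as the evidence clears a threshold chosen from the desired false-alarm rate (alpha) and miss rate (beta), rather than waiting for a fixed-size window.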

  14. Automatic target recognition and detection in infrared imagery under cluttered background

    NASA Astrophysics Data System (ADS)

    Gundogdu, Erhan; Koç, Aykut; Alatan, A. Aydın.

    2017-10-01

    Visual object classification has long been studied in the visible spectrum using conventional cameras. Since the number of labeled images has recently increased, it is possible to train deep Convolutional Neural Networks (CNNs) with significant numbers of parameters. As infrared (IR) sensor technology has improved over the last two decades, labeled images from IR sensors have begun to be used for object detection and recognition tasks. We address the problem of infrared object recognition and detection using 15K real-field images acquired with long-wave and mid-wave IR sensors. For feature learning, a stacked denoising autoencoder is trained on this IR dataset. To recognize the objects, the trained stacked denoising autoencoder is fine-tuned with the binary classification loss for the target object. Once training is complete, test samples are propagated through the network, and the probability of a test sample belonging to a class is computed. Moreover, the trained classifier is used in a detect-by-classification method, where classification is performed on a set of candidate object boxes and the maximum confidence score at a particular location is accepted as the score of the detected object. To decrease computational complexity, the detection step is not run at every frame; instead, an efficient correlation-filter-based tracker is run, and detection is performed only when the tracker confidence falls below a pre-defined threshold. Experiments conducted on the real-field images demonstrate that the proposed detection and tracking framework gives satisfactory results for detecting tanks against cluttered backgrounds.
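
The detect-by-classification step reduces to scoring every candidate box with the trained classifier and accepting the highest-confidence location. A minimal sketch with a stand-in scoring function (the CNN/autoencoder classifier is assumed, not implemented, and the box format and threshold are illustrative):

```python
from typing import Callable, List, Optional, Tuple

Box = Tuple[int, int, int, int]  # (x, y, width, height)

def detect_by_classification(boxes: List[Box],
                             score: Callable[[Box], float],
                             threshold: float = 0.5) -> Optional[Box]:
    """Score each candidate box with the classifier; accept the box with
    the maximum confidence, or return None if no box clears the threshold."""
    if not boxes:
        return None
    best = max(boxes, key=score)
    return best if score(best) >= threshold else None

# Toy usage: a synthetic score function peaked at one candidate box.
boxes = [(0, 0, 10, 10), (5, 5, 10, 10), (20, 20, 10, 10)]
score = lambda b: 1.0 if b == (5, 5, 10, 10) else 0.1
print(detect_by_classification(boxes, score))  # (5, 5, 10, 10)
```

In the full pipeline this function would run only on frames where the correlation-filter tracker's confidence drops below its threshold, which is what keeps the per-frame cost low.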

  15. Red-shouldered hawk occupancy surveys in central Minnesota, USA

    USGS Publications Warehouse

    Henneman, C.; McLeod, M.A.; Andersen, D.E.

    2007-01-01

    Forest-dwelling raptors are often difficult to detect because many species occur at low density or are secretive. Broadcasting conspecific vocalizations can increase the probability of detecting forest-dwelling raptors and has been shown to be an effective method for locating raptors and assessing their relative abundance. Recent advances in statistical techniques based on presence-absence data use probabilistic arguments to derive probability of detection when it is <1 and to provide a model and likelihood-based method for estimating proportion of sites occupied. We used these maximum-likelihood models with data from red-shouldered hawk (Buteo lineatus) call-broadcast surveys conducted in central Minnesota, USA, in 1994-1995 and 2004-2005. Our objectives were to obtain estimates of occupancy and detection probability 1) over multiple sampling seasons (yr), 2) incorporating within-season time-specific detection probabilities, 3) with call type and breeding stage included as covariates in models of probability of detection, and 4) with different sampling strategies. We visited individual survey locations 2-9 times per year, and estimates of both probability of detection (range = 0.28-0.54) and site occupancy (range = 0.81-0.97) varied among years. Detection probability was affected by inclusion of a within-season time-specific covariate, call type, and breeding stage. In 2004 and 2005 we used survey results to assess the effect that number of sample locations, double sampling, and discontinued sampling had on parameter estimates. We found that estimates of probability of detection and proportion of sites occupied were similar across different sampling strategies, and we suggest ways to reduce sampling effort in a monitoring program.
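
The likelihood these occupancy models maximize can be sketched directly: a site with at least one detection contributes psi times the product of p or (1-p) over its visits, while an all-zero history is ambiguous (occupied-but-missed, or truly unoccupied) and contributes psi*(1-p)^K + (1-psi). A minimal grid-search MLE on fabricated counts (the data are illustrative, constructed so that psi = 0.8 and p = 0.5 are their exact optimum):

```python
from itertools import product
from math import log

def site_likelihood(psi, p, history):
    """Single-site likelihood for a detection history (tuple of 0/1)."""
    if any(history):
        lik = psi
        for y in history:
            lik *= p if y else (1.0 - p)
        return lik
    # All-zero history: occupied but never detected, or truly unoccupied.
    return psi * (1.0 - p) ** len(history) + (1.0 - psi)

def neg_log_lik(psi, p, counts):
    """counts maps each detection history to its number of sites."""
    return -sum(n * log(site_likelihood(psi, p, h)) for h, n in counts.items())

# Fabricated data, K = 3 visits: 800 occupied sites with p = 0.5 give
# 100 sites per possible history; 200 unoccupied sites are all zeros.
counts = {h: 100 for h in product([0, 1], repeat=3)}
counts[(0, 0, 0)] += 200

grid = [i / 100 for i in range(5, 100, 5)]
psi_hat, p_hat = min(((a, b) for a in grid for b in grid),
                     key=lambda ab: neg_log_lik(ab[0], ab[1], counts))
print(psi_hat, p_hat)  # 0.8 0.5
```

Note how the model separates the naive occupancy rate (70% of sites had a detection) from the estimated true occupancy (80%), which is exactly the correction for detection probability < 1 that the survey design above is meant to support.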

  16. Study on polarization image methods in turbid medium

    NASA Astrophysics Data System (ADS)

    Fu, Qiang; Mo, Chunhe; Liu, Boyu; Duan, Jin; Zhang, Su; Zhu, Yong

    2014-11-01

    Polarization imaging acquires multi-dimensional polarization information in addition to conventional intensity imagery, which improves the probability of target detection and recognition. Research on fusing polarization images of targets in turbid media helps to obtain high-quality images. Using laser illumination at visible wavelengths, linear polarization intensity images were obtained by rotating a polarizer through a sequence of angles, and the polarization parameters of targets were measured in turbid media with concentrations ranging from 5% to 10%. Image fusion techniques were then applied: different polarization-image fusion methods were used to process the acquired polarization images, the fusion methods with superior performance in turbid media are discussed, and the processing results are given along with tables of the corresponding data. Pixel-level, feature-level and decision-level fusion algorithms were applied to the DOLP (degree of linear polarization) images. The results show that as the polarization angle increases, the polarization images become increasingly blurred and their quality degrades, whereas the fused images show clearly improved contrast over any single image; finally, the reasons for the improvement in image contrast under polarized light are analyzed.

  17. Polarimetric phenomenology in the reflective regime: a case study using polarized hyperspectral data

    NASA Astrophysics Data System (ADS)

    Gibney, Mark

    2016-05-01

    Understanding the phenomenology of polarimetric data is necessary if we want to obtain the maximum benefit when we exploit that data. To first order, polarimetric phenomenology is driven by two things: the target material type (specular or diffuse) and the illuminating source (point (sun) or extended (body emission)). Polarimetric phenomenology can then be broken into three basic categories ([specular material/sun source], [diffuse/sun], [specular/body]), where we have assigned body emission to the IR passband, in which materials are generally specular. The task of interest determines the category of interest, since the task determines the dominant target material and the illuminating source (e.g., detecting diffuse targets under trees in the VNIR = [diffuse/sun] category). In this paper, a specific case study for the important [diffuse/sun] category is presented. For the reflective regime (0.3-3.0 μm), the largest polarimetric signal is obtained when the sun illuminates a significant portion of the material BRDF lobe. This naturally points us to problems whose primary target materials are diffuse, since the BRDF lobe for specular materials is tiny (low probability of acquiring on the BRDF lobe) and glinty (high probability of saturating the sensor when on lobe). In this case study, we investigated signatures of solar-illuminated diffuse paints acquired by a polarimetric hyperspectral sensor. We discuss the acquisition, reduction and exploitation of that data, and use it to illustrate the primary characteristics of reflective polarimetric phenomenology.

  18. Optimising Camera Traps for Monitoring Small Mammals

    PubMed Central

    Glen, Alistair S.; Cockburn, Stuart; Nichols, Margaret; Ekanayake, Jagath; Warburton, Bruce

    2013-01-01

    Practical techniques are required to monitor invasive animals, which are often cryptic and occur at low density. Camera traps have potential for this purpose, but may have problems detecting and identifying small species. A further challenge is how to standardise the size of each camera’s field of view so capture rates are comparable between different places and times. We investigated the optimal specifications for a low-cost camera trap for small mammals. The factors tested were 1) trigger speed, 2) passive infrared vs. microwave sensor, 3) white vs. infrared flash, and 4) still photographs vs. video. We also tested a new approach to standardise each camera’s field of view. We compared the success rates of four camera trap designs in detecting and taking recognisable photographs of captive stoats (Mustela erminea), feral cats (Felis catus) and hedgehogs (Erinaceus europaeus). Trigger speeds of 0.2–2.1 s captured photographs of all three target species unless the animal was running at high speed. The camera with a microwave sensor was prone to false triggers, and often failed to trigger when an animal moved in front of it. A white flash produced photographs that were more readily identified to species than those obtained under infrared light. However, a white flash may be more likely to frighten target animals, potentially affecting detection probabilities. Video footage achieved similar success rates to still cameras but required more processing time and computer memory. Placing two camera traps side by side achieved a higher success rate than using a single camera. Camera traps show considerable promise for monitoring invasive mammal control operations. Further research should address how best to standardise the size of each camera’s field of view, maximise the probability that an animal encountering a camera trap will be detected, and eliminate visible or audible cues emitted by camera traps. PMID:23840790

  19. Nucleic acid amplification tests (NAATs) for gonorrhoea diagnosis in women: experience of a tertiary care hospital in north India.

    PubMed

    Sood, Seema; Verma, Rachna; Mir, Shazia Shaheen; Agarwal, Madhav; Singh, Neeta; Kar, Hemanta Kumar; Sharma, Vinod Kumar

    2014-11-01

    Gonorrhoea is among the most frequent bacterial sexually transmitted infections (STIs) and has significant health implications in women. The use of nucleic acid amplification tests (NAATs) has been shown to provide enhanced diagnosis of gonorrhoea in female patients. However, it is recommended that an on-going assessment of the test assays be performed to check for any probable sequence variation occurring in the targeted region. In this study, an in-house PCR targeting the opa gene of Neisseria gonorrhoeae was used in conjunction with a 16S ribosomal PCR to determine the presence of gonorrhoea in female patients attending a tertiary care hospital. Endocervical samples collected from 250 female patients with complaints of vaginal or cervical discharge or pain in the lower abdomen were tested using the opa and 16S ribosomal assays. The samples were also processed by conventional methods. Of the 250 female patients included in the study, only one was positive by conventional methods (microscopy and culture), whereas 17 patients were found to be positive based on PCR results. The clinical sensitivity of conventional methods for the detection of N. gonorrhoeae in female patients was low. Gonococcal detection rates increased when the molecular method was used, yielding 16 additional positives. Further studies should identify other gene targets that may be used in screening assays to detect the presence of gonorrhoea.

  20. Discriminating between camouflaged targets by their time of detection by a human-based observer assessment method

    NASA Astrophysics Data System (ADS)

    Selj, G. K.; Søderblom, M.

    2015-10-01

    Detection of a camouflaged object in natural scenes requires the target to be distinguishable from its local background. The development of any new camouflage pattern therefore has to rely on a well-founded test methodology, correlated with the final purpose of the pattern, as well as an evaluation procedure containing suitable criteria for i) discriminating between the targets and ii) producing a final ranking of the targets. In this study we present results from a recent camouflage assessment trial in which human observers were used in a search-by-photo methodology to assess generic test camouflage patterns. We conducted a study to investigate possible improvements in camouflage patterns for battle dress uniforms. The aim was a comparative study of potential, generic patterns intended for use in arid areas (sparsely vegetated, semi-desert). We developed a test methodology intended to be simple, reliable and realistic with respect to the operational benefit of camouflage. We therefore chose to conduct a human-based observer trial founded on imagery of realistic targets in natural backgrounds. Inspired by a recent and similar trial in the UK, we developed new, purpose-built software to conduct the observer trial. Our preferred assessment methodology, the observer trial, was based on target recordings in 12 different but operationally relevant scenes, collected in a dry and sparsely vegetated area (Rhodes). The scenes were chosen with the intention of spanning as broadly as possible. The targets were human-shaped mannequins and were situated identically in each of the scenes to allow a relative comparison of camouflage effectiveness in each scene. Tests of significance among the targets' performances were carried out with non-parametric tests, as the corresponding time-of-detection distributions were overall difficult to parameterize.
    From the trial, containing 12 different scenes from sparsely vegetated areas, we collected detection-time distributions for 6 generic targets through visual search by 148 observers. We found that the different targets performed differently, as given by their corresponding time-of-detection distributions, within a single scene. Furthermore, we obtained an overall ranking across all 12 scenes by performing a weighted sum over the scenes, intended to retain as much of the vital information on the targets' signature effectiveness as possible. Our results show that it was possible to measure the targets' performance relative to one another even when summing over all scenes. We also compared the ranking based on our preferred criterion (detection time) with a secondary one (probability of detection) to assess the sensitivity of a final ranking to the test set-up and evaluation criterion. We found our observer-based approach to be well suited with regard to its ability to discriminate between similar targets and to assign numeric values to the observed differences in performance. We believe our approach will be well suited as a tool whenever different aspects of camouflage are to be evaluated and understood further.

  1. First Detected Arrival of a Quantum Walker on an Infinite Line

    NASA Astrophysics Data System (ADS)

    Thiel, Felix; Barkai, Eli; Kessler, David A.

    2018-01-01

    The first detection of a quantum particle on a graph is shown to depend sensitively on the distance ξ between the detector and initial location of the particle, and on the sampling time τ. Here, we use the recently introduced quantum renewal equation to investigate the statistics of first detection on an infinite line, using a tight-binding lattice Hamiltonian with nearest-neighbor hops. Universal features of the first detection probability are uncovered and simple limiting cases are analyzed. These include the large-ξ limit, the small-τ limit, and the power-law decay of the detection probability with attempt number, on which quantum oscillations are superimposed. For large ξ the first detection probability assumes a scaling form, and when the sampling time is equal to the inverse of the energy bandwidth, nonanalytical behaviors arise, accompanied by a transition in the statistics. The maximum total detection probability is found to occur for τ close to this transition point. When the initial location of the particle is far from the detection node we find that the total detection probability attains a finite value that is distance independent.
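
    The stroboscopic first-detection protocol analyzed above can be sketched numerically. The following is a minimal illustration, not the authors' code: a finite line stands in for the infinite lattice (valid at short times), and the lattice size, ξ, τ, and number of attempts are arbitrary choices.

```python
import numpy as np

def first_detection_probs(L=201, xi=5, tau=1.0, attempts=50):
    """First-detection probabilities F_n for a tight-binding walker on a
    line of L sites, measured projectively at site (center + xi) every tau."""
    # Nearest-neighbor tight-binding Hamiltonian, open boundaries
    H = np.zeros((L, L))
    for i in range(L - 1):
        H[i, i + 1] = H[i + 1, i] = -1.0
    evals, evecs = np.linalg.eigh(H)
    U = evecs @ np.diag(np.exp(-1j * evals * tau)) @ evecs.conj().T
    psi = np.zeros(L, dtype=complex)
    psi[L // 2] = 1.0                # walker starts at the center
    d = L // 2 + xi                  # detector site at distance xi
    F = []
    for _ in range(attempts):
        psi = U @ psi                # free evolution for one interval tau
        F.append(abs(psi[d]) ** 2)   # probability the first detection is now
        psi[d] = 0.0                 # null measurement: project out site d
    return F

F = first_detection_probs()
total = sum(F)   # total detection probability after 50 attempts (<= 1)
```

    Because the state is deliberately left unnormalized after each null measurement, the F_n values sum directly to the total detection probability, which never exceeds 1.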

  2. Sampling techniques for burbot in a western non-wadeable river

    USGS Publications Warehouse

    Klein, Z. B.; Quist, Michael C.; Rhea, D.T.; Senecal, A. C.

    2015-01-01

    Burbot, Lota lota (L.), populations are declining throughout much of their native distribution. Although numerous aspects of burbot ecology are well understood, less is known about effective sampling techniques for burbot in lotic systems. Occupancy models were used to estimate the probability of detection (p) for three gears (6.4- and 19-mm bar mesh hoop nets, night electric fishing), within the context of various habitat characteristics. During the summer, night electric fishing had the highest estimated detection probability for both juvenile (p, 95% C.I.: 0.35, 0.26–0.46) and adult (0.30, 0.20–0.41) burbot. However, small-mesh hoop nets (6.4-mm bar mesh) had detection probabilities similar to night electric fishing for both juvenile (0.26, 0.17–0.36) and adult (0.27, 0.18–0.39) burbot during the summer. In autumn, a similar overlap between detection probabilities was observed for juvenile and adult burbot. Small-mesh hoop nets had the highest estimated probability of detection for both juvenile and adult burbot (0.46, 0.33–0.59), whereas night electric fishing had a detection probability of 0.39 (0.28–0.52) for juvenile and adult burbot. By using detection probabilities to compare gears, the most effective sampling technique can be identified, leading to increased species detections and more effective management of burbot.

  3. Probability of detection of clinical seizures using heart rate changes.

    PubMed

    Osorio, Ivan; Manly, B F J

    2015-08-01

    Heart rate-based seizure detection is a viable complement or alternative to ECoG/EEG. This study investigates the role of various biological factors on the probability of clinical seizure detection using heart rate. Regression models were applied to 266 clinical seizures recorded from 72 subjects to investigate whether factors such as age, gender, years with epilepsy, etiology, seizure site of origin, seizure class, and data collection centers, among others, shape the probability of EKG-based seizure detection. Clinical seizure detection probability based on heart rate changes is significantly (p<0.001) shaped by patients' age and gender, seizure class, and years with epilepsy. The probability of detecting clinical seizures (>0.8 in the majority of subjects) using heart rate is highest for complex partial seizures, increases with a patient's years with epilepsy, is lower for females than for males, and is unrelated to the hemisphere of origin. Clinical seizure detection probability using heart rate is multi-factorially dependent and sufficiently high (>0.8) in most cases to be clinically useful. Knowledge of the role these factors play in shaping said probability will enhance its applicability and usefulness. Heart rate is a reliable and practical signal for extra-cerebral detection of clinical seizures originating from or spreading to central autonomic network structures. Copyright © 2015 British Epilepsy Association. Published by Elsevier Ltd. All rights reserved.
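
    The kind of regression analysis described can be sketched with a plain logistic model. The data below are simulated; the covariates, effect sizes, and optimizer settings are illustrative assumptions, not the study's values.

```python
import numpy as np

def fit_logistic(X, y, lr=0.1, iters=3000):
    """Logistic regression by batch gradient ascent:
    P(detection) = sigmoid(b0 + X @ b)."""
    Xb = np.column_stack([np.ones(len(X)), X])  # prepend intercept column
    b = np.zeros(Xb.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-Xb @ b))
        b += lr * Xb.T @ (y - p) / len(y)       # average log-likelihood gradient
    return b

rng = np.random.default_rng(0)
n = 2000
years = rng.normal(0.0, 1.0, n)            # standardized years with epilepsy
female = rng.integers(0, 2, n).astype(float)
logit = 0.5 + 1.0 * years - 0.8 * female   # assumed (illustrative) effects
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(float)
b = fit_logistic(np.column_stack([years, female]), y)
```

    With the assumed effects, the fitted coefficients recover the simulated pattern: detection probability rises with years with epilepsy and is lower for females.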

  4. Turbulence effects in a horizontal propagation path close to ground: implications for optics detection

    NASA Astrophysics Data System (ADS)

    Sjöqvist, Lars; Allard, Lars; Gustafsson, Ove; Henriksson, Markus; Pettersson, Magnus

    2011-11-01

    Atmospheric turbulence close to the ground may severely affect the performance of laser-based systems. Variations in the refractive index along the propagation path cause effects such as beam wander, intensity fluctuations (scintillations) and beam broadening. Typical geometries of interest for optics detection include nearly horizontal propagation paths close to the ground with up to a kilometre distance to the target. Scintillations and beam wander affect performance in terms of detection probability and false alarm rate. Our interest is in the influence of turbulence on optics detection applications. In a field trial, atmospheric turbulence effects along a 1 km horizontal propagation path were studied using a diode laser with a rectangular beam profile operating at a wavelength of 0.8 micrometres. Single-path beam characteristics were registered and analysed using photodetectors arranged in horizontal and vertical directions. The turbulence strength along the path was determined using a scintillometer and single-point ultrasonic anemometers. Strong scintillation effects were observed as a function of the turbulence strength, and amplitude characteristics were fitted to model distributions. In addition to the single-path analysis, double-path measurements were carried out on different targets. Experimental results are compared with existing theoretical models of laser beam propagation through turbulence. The results show that the influence of scintillations needs to be considered when predicting performance in optics detection applications.
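
    The way scintillation degrades a thresholded detector can be illustrated with a small Monte Carlo sketch. Log-normal irradiance statistics are assumed here (a common weak-turbulence model); the SNR, threshold, and σ values are arbitrary and are not taken from the trial.

```python
import numpy as np

def detection_and_false_alarm(sigma_ln=0.5, snr=3.0, threshold=1.5,
                              n=100_000, seed=0):
    """Monte Carlo sketch of a thresholded optical receiver under
    log-normal scintillation (unit-mean irradiance fluctuations)."""
    rng = np.random.default_rng(seed)
    irradiance = rng.lognormal(mean=-sigma_ln**2 / 2, sigma=sigma_ln, size=n)
    noise = rng.normal(0.0, 1.0, size=n)
    pd = float(np.mean(snr * irradiance + noise > threshold))  # target present
    pfa = float(np.mean(noise > threshold))                    # target absent
    return pd, pfa

pd, pfa = detection_and_false_alarm()
```

    Increasing sigma_ln spreads the received irradiance, so detection probability at a fixed threshold drops even though the mean signal level is unchanged.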

  5. Sleep Deprivation Attack Detection in Wireless Sensor Network

    NASA Astrophysics Data System (ADS)

    Bhattasali, Tapalina; Chaki, Rituparna; Sanyal, Sugata

    2012-02-01

    Deployment of a sensor network in a hostile environment makes it particularly vulnerable to battery drainage attacks, because it is impossible to recharge or replace the battery power of sensor nodes. Among the various security threats, low-power sensor nodes are most affected by attacks that cause random drainage of sensor energy, leading to the death of nodes. The most dangerous attack in this category is sleep deprivation, where the intruder's aim is to maximize the power consumption of sensor nodes so that their lifetime is minimized. Most existing work on sleep deprivation attack detection involves considerable overhead, leading to poor throughput. What is needed is a model for detecting intrusions accurately in an energy-efficient manner. This paper proposes a hierarchical framework based on a distributed collaborative mechanism for efficiently detecting sleep deprivation torture in a wireless sensor network. The proposed model uses an anomaly detection technique in two steps to reduce the probability of falsely detected intrusions.

  6. Variables associated with detection probability, detection latency, and behavioral responses of Golden-winged Warblers (Vermivora chrysoptera)

    USGS Publications Warehouse

    Aldinger, Kyle R.; Wood, Petra B.

    2015-01-01

    Detection probability during point counts and its associated variables are important considerations for bird population monitoring and have implications for conservation planning by influencing population estimates. During 2008–2009, we evaluated variables hypothesized to be associated with detection probability, detection latency, and behavioral responses of male Golden-winged Warblers in pastures in the Monongahela National Forest, West Virginia, USA. This is the first study of male Golden-winged Warbler detection probability, detection latency, or behavioral response based on point-count sampling with known territory locations and identities for all males. During 3-min passive point counts, detection probability decreased as distance to a male's territory and time since sunrise increased. During 3-min point counts with playback, detection probability decreased as distance to a male's territory increased, but remained constant as time since sunrise increased. Detection probability was greater when point counts included type 2 compared with type 1 song playback, particularly during the first 2 min of type 2 song playback. Golden-winged Warblers primarily use type 1 songs (often zee bee bee bee with a higher-pitched first note) in intersexual contexts and type 2 songs (strident, rapid stutter ending with a lower-pitched buzzy note) in intrasexual contexts. Distance to a male's territory, ordinal date, and song playback type were associated with the type of behavioral response to song playback. Overall, ~2 min of type 2 song playback may increase the efficacy of point counts for monitoring populations of Golden-winged Warblers by increasing the conspicuousness of males for visual identification and offsetting the consequences of surveying later in the morning. Because playback may interfere with the ability to detect distant males, it is important to follow playback with a period of passive listening. 
Our results indicate that even in relatively open pasture vegetation, detection probability of male Golden-winged Warblers is imperfect and highly variable.

  7. Contrast model for three-dimensional vehicles in natural lighting and search performance analysis

    NASA Astrophysics Data System (ADS)

    Witus, Gary; Gerhart, Grant R.; Ellis, R. Darin

    2001-09-01

    Ground vehicles in natural lighting tend to have significant and systematic variation in luminance through the presented area. This arises, in large part, from the vehicle surfaces having different orientations and shadowing relative to the source of illumination and the position of the observer. These systematic differences create the appearance of a structured 3D object. The 3D appearance is an important factor in search, figure-ground segregation, and object recognition. We present a contrast metric to predict search and detection performance that accounts for the 3D structure. The approach first computes the contrast of the front (or rear), side, and top surfaces. The vehicle contrast metric is the area-weighted sum of the absolute values of the contrasts of the component surfaces. The 3D structure contrast metric, together with target height, account for more than 80% of the variance in probability of detection and 75% of the variance in search time. When false alarm effects are discounted, they account for 89% of the variance in probability of detection and 95% of the variance in search time. The predictive power of the signature metric, when calibrated to half the data and evaluated against the other half, is 90% of the explanatory power.
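
    The metric computation itself is compact. A sketch follows; the abstract does not specify the exact area normalization, so area fractions are assumed here, and the surface areas and contrasts are invented numbers.

```python
def vehicle_contrast(surfaces):
    """3-D structure contrast metric: area-weighted sum of the absolute
    contrasts of the component surfaces (front/rear, side, top).
    surfaces: list of (area, contrast) pairs."""
    total = sum(area for area, _ in surfaces)
    return sum((area / total) * abs(c) for area, c in surfaces)

# e.g. front, side, and top surfaces with differing sign of contrast
metric = vehicle_contrast([(4.0, 0.30), (6.0, -0.15), (2.0, 0.45)])
```

    Taking absolute values before weighting means a vehicle whose surfaces are alternately brighter and darker than the background still registers as conspicuous, which is the point of modeling the 3D structure rather than the mean contrast.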

  8. Object Detection in Natural Backgrounds Predicted by Discrimination Performance and Models

    NASA Technical Reports Server (NTRS)

    Ahumada, A. J., Jr.; Watson, A. B.; Rohaly, A. M.; Null, Cynthia H. (Technical Monitor)

    1995-01-01

    In object detection, an observer looks for an object class member in a set of backgrounds. In discrimination, an observer tries to distinguish two images. Discrimination models predict the probability that an observer detects a difference between two images. We compare object detection and image discrimination with the same stimuli by: (1) making stimulus pairs of the same background with and without the target object and (2) either giving many consecutive trials with the same background (discrimination) or intermixing the stimuli (object detection). Six images of a vehicle in a natural setting were altered to remove the vehicle and mixed with the original image in various proportions. Detection observers rated the images for vehicle presence. Discrimination observers rated the images for any difference from the background image. Estimated detectabilities of the vehicles were found by maximizing the likelihood of a Thurstone category scaling model. The pattern of estimated detectabilities is similar for discrimination and object detection, and is accurately predicted by a Cortex Transform discrimination model. Predictions of a Contrast-Sensitivity-Function filter model and a Root-Mean-Square difference metric based on the digital image values are less accurate. The discrimination detectabilities averaged about twice those of object detection.

  9. Estimating site occupancy rates for aquatic plants using spatial sub-sampling designs when detection probabilities are less than one

    USGS Publications Warehouse

    Nielson, Ryan M.; Gray, Brian R.; McDonald, Lyman L.; Heglund, Patricia J.

    2011-01-01

    Estimation of site occupancy rates when detection probabilities are <1 is well established in wildlife science. Data from multiple visits to a sample of sites are used to estimate detection probabilities and the proportion of sites occupied by focal species. In this article we describe how site occupancy methods can be applied to estimate occupancy rates of plants and other sessile organisms. We illustrate this approach and the pitfalls of ignoring incomplete detection using spatial data for 2 aquatic vascular plants collected under the Upper Mississippi River's Long Term Resource Monitoring Program (LTRMP). Site occupancy models considered include: a naïve model that ignores incomplete detection, a simple site occupancy model assuming a constant occupancy rate and a constant probability of detection across sites, several models that allow site occupancy rates and probabilities of detection to vary with habitat characteristics, and mixture models that allow for unexplained variation in detection probabilities. We used information theoretic methods to rank competing models and bootstrapping to evaluate the goodness-of-fit of the final models. Results of our analysis confirm that ignoring incomplete detection can result in biased estimates of occupancy rates. Estimates of site occupancy rates for 2 aquatic plant species were 19–36% higher compared to naive estimates that ignored probabilities of detection <1. Simulations indicate that final models have little bias when 50 or more sites are sampled, and little gains in precision could be expected for sample sizes >300. We recommend applying site occupancy methods for monitoring presence of aquatic species.
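
    The simplest model in the set above (constant occupancy rate ψ, constant detection probability p) can be fit with a small grid-search MLE. This is a sketch on simulated detection histories, not the LTRMP analysis; the grid resolution and simulation parameters are arbitrary.

```python
import numpy as np
from math import comb

def occupancy_mle(y, K):
    """Grid-search MLE for the simple occupancy model: a site is occupied
    with probability psi, and an occupied site is detected on each of K
    visits independently with probability p. y[i] = detections at site i."""
    y = np.asarray(y)
    binom = np.array([comb(K, k) for k in range(K + 1)])[y]
    best_ll, psi_hat, p_hat = -np.inf, None, None
    for psi in np.linspace(0.02, 0.98, 49):
        for p in np.linspace(0.02, 0.98, 49):
            lik = psi * binom * p ** y * (1 - p) ** (K - y)
            lik = lik + (y == 0) * (1 - psi)   # undetected sites may be empty
            ll = np.log(lik).sum()
            if ll > best_ll:
                best_ll, psi_hat, p_hat = ll, psi, p
    return psi_hat, p_hat

rng = np.random.default_rng(42)
occupied = rng.random(200) < 0.6        # true occupancy rate 0.6
y = rng.binomial(5, 0.4 * occupied)     # detection probability 0.4, K = 5 visits
psi_hat, p_hat = occupancy_mle(y, 5)
naive = (y > 0).mean()                  # estimate that ignores imperfect detection
```

    As the abstract's comparison with naive estimates suggests, the naive fraction of sites with at least one detection sits below the model-based estimate, because some occupied sites are simply never detected in K visits.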

  10. Formation of S-type planets in close binaries: scattering induced tidal capture of circumbinary planets

    NASA Astrophysics Data System (ADS)

    Gong, Yan-Xiang; Ji, Jianghui

    2018-05-01

    Although several S-type and P-type planets in binary systems have been discovered in past years, S-type planets have not yet been found in close binaries with an orbital separation of no more than 5 au. Recent studies suggest that S-type planets in close binaries may be detected through high-accuracy observations. However, current planet formation theories imply that it is difficult for S-type planets in close binary systems to form in situ. In this work, we extensively perform numerical simulations to explore scenarios of planet-planet scattering among circumbinary planets and subsequent tidal capture in various binary configurations, to examine whether this mechanism can play a part in producing such planets. Our results show that this mechanism is robust. The maximum capture probability is ˜10%, which is comparable to the tidal capture probability of hot Jupiters in single star systems. The capture probability is related to the binary configuration: a smaller eccentricity or a lower mass ratio of the binary leads to a larger capture probability, and vice versa. Furthermore, we find that S-type planets with retrograde orbits can be naturally produced via the capture process. These planets on retrograde orbits can help us distinguish in situ formation from post-capture origin for S-type planets in close binary systems. Forthcoming missions such as PLATO will provide the opportunity and feasibility to detect such planets. Our work provides several suggestions for selecting target binaries in the search for S-type planets in the near future.

  11. Distributed micro-radar system for detection and tracking of low-profile, low-altitude targets

    NASA Astrophysics Data System (ADS)

    Gorwara, Ashok; Molchanov, Pavlo

    2016-05-01

    The proposed airborne surveillance radar system can detect, locate, track, and classify low-profile, low-altitude targets: from traditional fixed- and rotary-wing aircraft to non-traditional targets like unmanned aircraft systems (drones) and even small projectiles. The distributed micro-radar system is the next step in the development of the passive monopulse direction finder proposed by Stephen E. Lipsky in the 1980s. To extend the high-frequency limit and provide high sensitivity over a broad band of frequencies, multiple angularly spaced directional antennas are coupled with front-end circuits and separately connected to a direction-finder processor by a digital interface. Integrating the antennas with front-end circuits makes it possible to eliminate waveguide lines, which limit system bandwidth and create frequency-dependent phase errors. Digitizing the received signals close to the antennas allows loose distribution of the antennas and dramatically decreases the phase errors associated with waveguides. The direction-finding accuracy of the proposed micro-radar is then determined by the timing accuracy of the digital processor and the sampling frequency. Multi-band, multi-functional antennas can be distributed around the perimeter of an unmanned aircraft system (UAS) and connected to the processor by a digital interface, or can be distributed among a swarm/formation of mini/micro UAS and connected wirelessly. Expendable micro-radars can be distributed around the perimeter of a defended object to create a multi-static radar network. Low-profile, low-altitude, high-speed targets, like small projectiles, create a Doppler shift in a narrow frequency band. This signal can be effectively filtered and detected with high probability. The proposed micro-radar can operate in passive, monostatic or bistatic regimes.

  12. Sandpile-based model for capturing magnitude distributions and spatiotemporal clustering and separation in regional earthquakes

    NASA Astrophysics Data System (ADS)

    Batac, Rene C.; Paguirigan, Antonino A., Jr.; Tarun, Anjali B.; Longjas, Anthony G.

    2017-04-01

    We propose a cellular automata model for earthquake occurrences patterned after the sandpile model of self-organized criticality (SOC). By incorporating a single parameter describing the probability to target the most susceptible site, the model successfully reproduces the statistical signatures of seismicity. The energy distributions closely follow power-law probability density functions (PDFs) with a scaling exponent of around −1.6, consistent with the expectations of the Gutenberg-Richter (GR) law, for a wide range of targeted triggering probability values. Additionally, for targeted triggering probabilities within the range 0.004-0.007, we observe spatiotemporal distributions that show bimodal behavior, which was not observed for the original sandpile. For this critical range of probability values, the model statistics show remarkable agreement with long-period empirical data from earthquakes in different seismogenic regions. The proposed model has key advantages, the foremost of which is that it simultaneously captures the energy, space, and time statistics of earthquakes by introducing just a single parameter, while adding minimal complexity to the simple rules of the sandpile. We believe that the critical targeting probability parameterizes the memory that is inherently present in earthquake-generating regions.
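
    A minimal version of such a targeted-triggering sandpile can be sketched as follows. This is a simplification of the model in the abstract, not the authors' implementation: the grid size and drive length are arbitrary, "most susceptible" is read as the site with the highest load, and q = 0.005 is picked from the 0.004–0.007 window mentioned above.

```python
import numpy as np

def sandpile_avalanches(n=20, q=0.005, steps=5000, seed=1):
    """Abelian-sandpile-style automaton: with probability q the next grain
    targets the most loaded site, otherwise a uniformly random site.
    Returns the avalanche size (number of topplings) for each grain."""
    rng = np.random.default_rng(seed)
    z = rng.integers(0, 4, size=(n, n))       # initial loads 0..3
    sizes = []
    for _ in range(steps):
        if rng.random() < q:
            i, j = np.unravel_index(np.argmax(z), z.shape)  # targeted drive
        else:
            i, j = rng.integers(0, n, 2)                    # random drive
        z[i, j] += 1
        size, unstable = 0, [(i, j)]
        while unstable:                       # relax until all sites < 4
            a, b = unstable.pop()
            if z[a, b] >= 4:
                z[a, b] -= 4                  # topple: shed 4 grains
                size += 1
                for da, db in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    na, nb = a + da, b + db
                    if 0 <= na < n and 0 <= nb < n:
                        z[na, nb] += 1        # grains off-grid are dissipated
                        unstable.append((na, nb))
        sizes.append(size)
    return sizes

sizes = sandpile_avalanches()
```

    A histogram of `sizes` on log-log axes is the usual way to inspect the power-law avalanche statistics that stand in for the GR energy distribution.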

  13. Probability of detection of nests and implications for survey design

    USGS Publications Warehouse

    Smith, P.A.; Bart, J.; Lanctot, Richard B.; McCaffery, B.J.; Brown, S.

    2009-01-01

    Surveys based on double sampling include a correction for the probability of detection by assuming complete enumeration of birds in an intensively surveyed subsample of plots. To evaluate this assumption, we calculated the probability of detecting active shorebird nests by using information from observers who searched the same plots independently. Our results demonstrate that this probability varies substantially by species and stage of the nesting cycle but less by site or density of nests. Among the species we studied, the estimated single-visit probability of nest detection during the incubation period varied from 0.21 for the White-rumped Sandpiper (Calidris fuscicollis), the most difficult species to detect, to 0.64 for the Western Sandpiper (Calidris mauri), the most easily detected species, with a mean across species of 0.46. We used these detection probabilities to predict the fraction of persistent nests found over repeated nest searches. For a species with the mean value for detectability, the detection rate exceeded 0.85 after four visits. This level of nest detection was exceeded in only three visits for the Western Sandpiper, but six to nine visits were required for the White-rumped Sandpiper, depending on the type of survey employed. Our results suggest that the double-sampling method's requirement of nearly complete counts of birds in the intensively surveyed plots is likely to be met for birds with nests that survive over several visits of nest searching. Individuals with nests that fail quickly or individuals that do not breed can be detected with high probability only if territorial behavior is used to identify likely nesting pairs. © The Cooper Ornithological Society, 2009.
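
    If each visit is treated as an independent Bernoulli trial with a constant per-visit detection probability p, the cumulative probability after n visits is 1 − (1 − p)^n, and the number of visits needed to reach a target level follows directly. This is a simplification of the survey design above: the reported visit counts also depend on survey type and nest stage, so the formula need not reproduce every published figure exactly.

```python
import math

def visits_needed(p, target=0.85):
    """Smallest n with cumulative detection 1 - (1 - p)**n >= target,
    assuming independent visits with constant per-visit probability p."""
    return math.ceil(math.log(1.0 - target) / math.log(1.0 - p))

# Per-visit probabilities reported above: 0.21 (White-rumped Sandpiper),
# 0.46 (species mean), 0.64 (Western Sandpiper)
n_mean = visits_needed(0.46)   # -> 4, matching the four visits reported
n_wrsa = visits_needed(0.21)   # -> 9, at the top of the reported 6-9 range
```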

  14. Unmodeled observation error induces bias when inferring patterns and dynamics of species occurrence via aural detections

    USGS Publications Warehouse

    McClintock, Brett T.; Bailey, Larissa L.; Pollock, Kenneth H.; Simons, Theodore R.

    2010-01-01

    The recent surge in the development and application of species occurrence models has been associated with an acknowledgment among ecologists that species are detected imperfectly due to observation error. Standard models now allow unbiased estimation of occupancy probability when false negative detections occur, but this is conditional on no false positive detections and sufficient incorporation of explanatory variables for the false negative detection process. These assumptions are likely reasonable in many circumstances, but there is mounting evidence that false positive errors and detection probability heterogeneity may be much more prevalent in studies relying on auditory cues for species detection (e.g., songbird or calling amphibian surveys). We used field survey data from a simulated calling anuran system of known occupancy state to investigate the biases induced by these errors in dynamic models of species occurrence. Despite the participation of expert observers in simplified field conditions, both false positive errors and site detection probability heterogeneity were extensive for most species in the survey. We found that even low levels of false positive errors, constituting as little as 1% of all detections, can cause severe overestimation of site occupancy, colonization, and local extinction probabilities. Further, unmodeled detection probability heterogeneity induced substantial underestimation of occupancy and overestimation of colonization and local extinction probabilities. Completely spurious relationships between species occurrence and explanatory variables were also found. Such misleading inferences would likely have deleterious implications for conservation and management programs. 
We contend that all forms of observation error, including false positive errors and heterogeneous detection probabilities, must be incorporated into the estimation framework to facilitate reliable inferences about occupancy and its associated vital rate parameters.
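
    The inflation caused by even rare false positives can be seen in a few lines of simulation. This is a sketch of a naive (uncorrected) estimator only; the parameter values are illustrative and are not taken from the anuran study.

```python
import numpy as np

def naive_occupancy(psi=0.30, p=0.50, p_fp=0.01, K=10, sites=10_000, seed=7):
    """Fraction of sites with at least one detection in K visits, when
    unoccupied sites yield a false positive with probability p_fp per visit.
    (For simplicity, occupied sites use only the true detection rate p.)"""
    rng = np.random.default_rng(seed)
    occupied = rng.random(sites) < psi
    per_visit = np.where(occupied, p, p_fp)
    detections = rng.binomial(K, per_visit)
    return (detections > 0).mean()

with_fp = naive_occupancy()        # false positives inflate apparent occupancy
no_fp = naive_occupancy(p_fp=0.0)  # close to the true psi when p is high
```

    With a 1% per-visit false-positive rate and 10 visits, roughly one in ten unoccupied sites accumulates a spurious detection, so apparent occupancy is pushed well above the true rate; model-based estimators that assume no false positives inherit the same bias.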

  15. Experimental estimation of snare detectability for robust threat monitoring.

    PubMed

    O'Kelly, Hannah J; Rowcliffe, J Marcus; Durant, Sarah; Milner-Gulland, E J

    2018-02-01

    Hunting with wire snares is rife within many tropical forest systems and constitutes one of the severest threats to a wide range of vertebrate taxa. As for all threats, reliable monitoring of snaring levels is critical for assessing the relative effectiveness of management interventions. However, snares pose a particular challenge for tracking spatial or temporal trends in their prevalence because they are extremely difficult to detect and are typically spread across large, inaccessible areas. As with cryptic animal targets, any approach used to monitor snaring levels must address the issue of imperfect detection, but no standard method exists to do so. We carried out a field experiment in Keo Seima Wildlife Reserve in eastern Cambodia with the following objectives: (1) to estimate the detection probability of wire snares within a tropical forest context, and to investigate how detectability might be affected by habitat type, snare type, or observer; (2) to trial two sets of sampling protocols feasible to implement in a range of challenging field conditions; and (3) to conduct a preliminary assessment of two potential analytical approaches to dealing with the resulting snare encounter data. We found that although different observers had no discernible effect on detection probability, detectability did vary between habitat types and snare types. We contend that simple repeated counts carried out at multiple sites and analyzed using binomial mixture models could represent a practical yet robust solution to the problem of monitoring snaring levels both inside and outside of protected areas. This experiment represents an important first step in developing improved methods of threat monitoring, and such methods are greatly needed in Southeast Asia as well as in many other regions.
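
    The binomial mixture (N-mixture) model proposed above for repeated counts has a simple likelihood: site-level abundance N_i is Poisson(λ), and each survey counts each snare independently with detection probability p. The following is a sketch of that likelihood with a coarse grid fit on simulated counts; all parameter values are illustrative, not estimates from the experiment.

```python
import math
import numpy as np

def nmix_loglik(lam, p, counts, n_max=25):
    """Log-likelihood of a binomial mixture (N-mixture) model:
    snares per site N_i ~ Poisson(lam); each repeated count
    y_it | N_i ~ Binomial(N_i, p). The latent N_i are summed out
    up to a truncation point n_max."""
    ll = 0.0
    for site in counts:
        lik = 0.0
        for n in range(max(site), n_max + 1):
            term = math.exp(-lam) * lam ** n / math.factorial(n)  # Poisson pmf
            for y in site:
                term *= math.comb(n, y) * p ** y * (1.0 - p) ** (n - y)
            lik += term
        ll += math.log(lik)
    return ll

rng = np.random.default_rng(3)
true_n = rng.poisson(4.0, 60)                              # lam = 4 snares/site
counts = [list(rng.binomial(n, 0.5, 4)) for n in true_n]   # p = 0.5, 4 surveys
grid = [(lam, p) for lam in np.arange(1.0, 8.5, 0.5)
        for p in np.arange(0.10, 0.95, 0.05)]
lam_hat, p_hat = max(grid, key=lambda g: nmix_loglik(g[0], g[1], counts))
```

    Repeated counts identify λ and p separately because count-to-count variation at a site reflects p while between-site variation reflects λ; the product λp, the expected raw count, is the most tightly pinned-down quantity.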

  16. Sampling little fish in big rivers: Larval fish detection probabilities in two Lake Erie tributaries and implications for sampling effort and abundance indices

    USGS Publications Warehouse

    Pritt, Jeremy J.; DuFour, Mark R.; Mayer, Christine M.; Roseman, Edward F.; DeBruyne, Robin L.

    2014-01-01

    Larval fish are frequently sampled in coastal tributaries to determine factors affecting recruitment, evaluate spawning success, and estimate production from spawning habitats. Imperfect detection of larvae is common, because larval fish are small and unevenly distributed in space and time, and coastal tributaries are often large and heterogeneous. We estimated detection probabilities of larval fish from several taxa in the Maumee and Detroit rivers, the two largest tributaries of Lake Erie. We then demonstrated how accounting for imperfect detection influenced (1) the probability of observing taxa as present relative to sampling effort and (2) abundance indices for larval fish of two Detroit River species. We found that detection probabilities ranged from 0.09 to 0.91 but were always less than 1.0, indicating that imperfect detection is common among taxa and between systems. In general, taxa with high fecundities, small larval length at hatching, and no nesting behaviors had the highest detection probabilities. Also, detection probabilities were higher in the Maumee River than in the Detroit River. Accounting for imperfect detection produced up to fourfold increases in abundance indices for Lake Whitefish Coregonus clupeaformis and Gizzard Shad Dorosoma cepedianum. The effect of accounting for imperfect detection in abundance indices was greatest during periods of low abundance for both species. Detection information can be used to determine the appropriate level of sampling effort for larval fishes and may improve management and conservation decisions based on larval fish data.
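    Two of the quantities discussed above follow directly from a constant per-sample detection probability p: the chance of recording a taxon as present at least once in n samples, and a detection-corrected abundance index (the raw count divided by p, so p = 0.25 corresponds to the fourfold increase noted for the least detectable cases). A minimal sketch, with hypothetical numbers:

    ```python
    import math

    def prob_detected_at_least_once(p, n):
        """Chance a taxon that is truly present is recorded in at least one of
        n independent samples, given per-sample detection probability p."""
        return 1.0 - (1.0 - p) ** n

    def corrected_index(raw_count, p):
        """Detection-corrected abundance index: the raw catch scaled up by
        the estimated detection probability (p = 0.25 gives a fourfold rise)."""
        return raw_count / p

    # Samples needed so a hard-to-detect taxon (p = 0.09, the low end reported
    # above) is observed with 95% probability:
    n_needed = math.ceil(math.log(0.05) / math.log(1 - 0.09))
    ```

    With p = 0.09 this gives 32 samples, while at p = 0.91 two samples already exceed 95% — the kind of calculation that links detection estimates to the appropriate level of sampling effort.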

  17. Probability of acoustic transmitter detections by receiver lines in Lake Huron: results of multi-year field tests and simulations

    USGS Publications Warehouse

    Hayden, Todd A.; Holbrook, Christopher M.; Binder, Thomas; Dettmers, John M.; Cooke, Steven J.; Vandergoot, Christopher S.; Krueger, Charles C.

    2016-01-01

    Background: Advances in acoustic telemetry technology have led to an improved understanding of the spatial ecology of many freshwater and marine fish species. Understanding the performance of acoustic receivers is necessary to distinguish between tagged fish that may have been present but not detected and those fish that were absent from the area. In this study, two stationary acoustic transmitters were deployed 250 m apart within each of four acoustic receiver lines each containing at least 10 receivers (i.e., eight acoustic transmitters in total) located in Saginaw Bay and central Lake Huron for nearly 2 years to determine whether the probability of detecting an acoustic transmission varied as a function of time (i.e., season), location, and distance between acoustic transmitter and receiver. Distances between acoustic transmitters and receivers ranged from 200 m to >10 km in each line. The daily observed probability of detecting an acoustic transmission was used in simulation models to estimate the probability of detecting a moving acoustic transmitter on a line of receivers. Results: The probability of detecting an acoustic transmitter on a receiver 1000 m away differed by month for different receiver lines in Lake Huron and Saginaw Bay but was similar for paired acoustic transmitters deployed 250 m apart within the same line. Mean probability of detecting an acoustic transmitter at 1000 m calculated over the study period varied among acoustic transmitters 250 m apart within a line and differed among receiver lines in Lake Huron and Saginaw Bay. The simulated probability of detecting a moving acoustic transmitter on a receiver line was characterized by short periods of time with decreased detection.
Although increased receiver spacing and higher fish movement rates decreased simulated detection probability, the location of the simulated receiver line in Lake Huron had the strongest effect on simulated detection probability. Conclusions: Performance of receiver lines in Lake Huron varied across a range of spatiotemporal scales and was inconsistent among receiver lines. Our simulations indicated that if 69 kHz acoustic transmitters operating at 158 dB in 10–30 m of freshwater were being used, then receivers should be placed 1000 m apart to ensure that all fish moving at 1 m s−1 or less will be detected on 90% of days over a 2-year period. Although these results can be used as general guidelines for designing new studies, the irregular variation in acoustic transmitter detection probabilities we observed among receiver line locations in Lake Huron makes designing receiver lines in similar systems challenging and emphasizes the need to conduct post hoc analyses of acoustic transmitter detection probabilities.
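    The simulation described above can be caricatured in a few lines: a transmitter crosses a line of receivers at a fixed speed, pinging at regular intervals, with per-ping detection probability falling off with distance. Everything below (the logistic range model, its parameters, the ping interval) is a hypothetical stand-in for the study's empirically derived detection curves, intended only to show the mechanics of why slower movement raises line-crossing detection rates:

    ```python
    import math
    import random

    def p_detect_ping(distance, p0=0.95, d50=500.0, k=0.01):
        # Hypothetical logistic range model: probability that one transmission
        # is detected at a given transmitter-receiver distance (m).
        return p0 / (1.0 + math.exp(k * (distance - d50)))

    def crossing_detected(speed, spacing, ping_interval=60.0, width=10000.0):
        """One simulated transmitter crossing a straight receiver line at
        `speed` (m/s), pinging every `ping_interval` s; True if any ping is
        heard by any receiver (detections treated as independent)."""
        receivers = range(0, int(width) + 1, int(spacing))
        x0 = random.uniform(0.0, width)   # where the path crosses the line
        y = -3000.0                        # start 3 km before the line
        while y <= 3000.0:
            for rx in receivers:
                if random.random() < p_detect_ping(math.hypot(x0 - rx, y)):
                    return True
            y += speed * ping_interval
        return False

    def detection_rate(speed, spacing, trials=200):
        return sum(crossing_detected(speed, spacing)
                   for _ in range(trials)) / trials
    ```

    Slower fish emit more pings while near the line and are therefore detected on more crossings, mirroring the simulated effect of fish movement rate reported above; widening the receiver spacing has the opposite effect.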

  18. Systematic study of probable projectile-target combinations for the synthesis of the superheavy nucleus 302120

    NASA Astrophysics Data System (ADS)

    Santhosh, K. P.; Safoora, V.

    2016-08-01

    Probable projectile-target combinations for the synthesis of the superheavy element 302120 have been studied taking the Coulomb and proximity potential as the interaction barrier. The probabilities of compound nucleus formation PCN for the projectile-target combinations found in the cold reaction valley of 302120 are estimated. At energies near and above the Coulomb barrier, we have calculated the capture, fusion, and evaporation residue cross sections for the reactions of all probable projectile-target combinations so as to predict the most promising projectile-target combinations for the synthesis of the superheavy element 302120 in heavy-ion fusion reactions. The calculated fusion and evaporation cross sections for the more asymmetric ("hotter") projectile-target combinations are found to be higher than those for the less asymmetric ("colder") combinations. On the basis of the quasifission barrier height, mass asymmetry, probability of compound nucleus formation, survival probability, and excitation energy, the systems 44Ar+258No, 46Ar+256No, 48Ca+254Fm, 50Ca+252Fm, 54Ti+248Cf, and 58Cr+244Cm in deep region I of the cold reaction valley and the systems 62Fe+240Pu, 64Fe+238Pu, 68Ni+234U, 70Ni+232U, 72Ni+230U, and 74Zn+228Th in the other cold valleys are identified as the better projectile-target combinations for the synthesis of 302120. Our predictions on the synthesis of 302120 superheavy nuclei using the combinations 54Cr+248Cm, 58Fe+244Pu, 64Ni+238U, and 50Ti+249Cf are compared with available experimental data and other theoretical predictions.

  19. Assessment of environmental DNA for detecting presence of imperiled aquatic amphibian species in isolated wetlands

    USGS Publications Warehouse

    Mckee, Anna; Calhoun, Daniel L.; Barichivich, William J.; Spear, Stephen F.; Goldberg, Caren S.; Glenn, Travis C

    2015-01-01

    Environmental DNA (eDNA) is an emerging tool that allows low-impact sampling for aquatic species by isolating DNA from water samples and screening for DNA sequences specific to species of interest. However, researchers have not tested this method in naturally acidic wetlands that provide breeding habitat for a number of imperiled species, including the frosted flatwoods salamander (Ambystoma cingulatum), reticulated flatwoods salamander (Ambystoma bishopi), striped newt (Notophthalmus perstriatus), and gopher frog (Lithobates capito). Our objectives for this study were to develop and optimize eDNA survey protocols and assays to complement and enhance capture-based survey methods for these amphibian species. We collected three or more water samples, dipnetted or trapped larval and adult amphibians, and conducted visual encounter surveys for egg masses for target species at 40 sites on 12 different longleaf pine (Pinus palustris) tracts. We used quantitative PCRs to screen eDNA from each site for target species presence. We detected flatwoods salamanders at three sites with eDNA but did not detect them during physical surveys. Based on the sample locations, we assumed these eDNA detections indicated the presence of frosted flatwoods salamanders. We did not detect reticulated flatwoods salamanders. We detected striped newts with physical and eDNA surveys at two wetlands. We detected gopher frogs at 12 sites total, three with eDNA alone, two with physical surveys alone, and seven with both physical and eDNA surveys. We detected our target species with eDNA at 9 of 11 sites where they were present as indicated by traditional surveys, and at six sites where they were not detected with traditional surveys. It was, however, critical to use at least three water samples per site for eDNA. Our results demonstrate that eDNA surveys can be a useful complement to traditional survey methods for detecting imperiled pond-breeding amphibians.
Environmental DNA may be particularly useful in situations where detection probability using traditional survey methods is low or access by trained personnel is limited.

  20. Video Shot Boundary Detection Using QR-Decomposition and Gaussian Transition Detection

    NASA Astrophysics Data System (ADS)

    Amiri, Ali; Fathy, Mahmood

    2010-12-01

    This article explores the problem of video shot boundary detection and examines a novel shot boundary detection algorithm based on QR-decomposition and the modeling of gradual transitions by Gaussian functions. Specifically, the authors attend to the challenges of detecting gradual shots and extracting appropriate spatiotemporal features that affect the ability of algorithms to efficiently detect shot boundaries. The algorithm utilizes the properties of QR-decomposition and extracts a block-wise probability function that gives the probability that video frames lie in shot transitions. The probability function exhibits abrupt changes at hard cut transitions and semi-Gaussian behavior in gradual transitions. The algorithm detects these transitions by analyzing the probability function. Finally, we report the results of experiments using the large-scale test sets provided by TRECVID 2006, which include assessments for hard cut and gradual shot boundary detection. These results confirm the high performance of the proposed algorithm.

  1. PEST reduces bias in forced choice psychophysics.

    PubMed

    Taylor, M M; Forbes, S M; Creelman, C D

    1983-11-01

    Observers performed several different detection tasks using both the PEST adaptive psychophysical procedure and a fixed-level (method of constant stimuli) psychophysical procedure. In two experiments, PEST runs targeted at P(C) = 0.80 were immediately followed by fixed-level detection runs presented at the difficulty level resulting from the PEST run. The fixed-level runs yielded P(C) about 0.75. During the fixed-level runs, the probability of a correct response was greater when the preceding response was correct than when it was wrong. Observers, even highly trained ones, perform in a nonstationary manner. The sequential dependency data can be used to determine a lower bound for the observer's "true" capability when performing optimally; this lower bound is close to the PEST target, and well above the forced choice P(C). The observer's "true" capability is the measure used by most theories of detection performance. A further experiment compared psychometric functions obtained from a set of PEST runs using different targets with those obtained from blocks of fixed-level trials at different levels. PEST results were more stable across observers, performance at all but the highest signal levels was better with PEST, and the PEST psychometric functions had shallower slopes. We hypothesize that PEST permits the observer to keep track of what he is trying to detect, whereas in the fixed-level method performance is disrupted by memory failure. Some recently suggested "more virulent" versions of PEST may be subject to biases similar to those of the fixed-level procedures. (ABSTRACT TRUNCATED AT 250 WORDS)

  2. Commercially Available Low Probability of Intercept Radars and Non-Cooperative ELINT Receiver Capabilities

    DTIC Science & Technology

    2014-09-01

    The Pilot radar has a low average power output, 2.4 m range cell resolution, and resistance to electronic support system detection and/or anti… installation on walls, towers, or buildings, or it can be used as a man-portable radar [35]. It features a scan rate of 30°/s, which allows for a … Target Velocity: 0.1–50 m/s; Operating Range: 5–1400 m; False Alarm Rate: <1 per 24 hours; Coverage Area: 6.16 km2; Power Consumption: 45 W

  3. Developing Human-Machine Interfaces to Support Appropriate Trust and Reliance on Automated Combat Identification Systems (Developpement d’Interfaces Homme-Machine Pour Appuyer la Confiance dans les Systemes Automatises d’Identification au Combat)

    DTIC Science & Technology

    2008-03-31

    on automation; the ‘response bias’ approach. This new approach is based on Signal Detection Theory (SDT) (Macmillan & Creelman, 1991; Wickens…SDT), response bias will vary with the expectation of the target probability, whereas their sensitivity will stay constant (Macmillan & Creelman…measures, C has the simplest statistical properties (Macmillan & Creelman, 1991, p. 273), and it was also the measure used in Dzindolet et al.’s study

  4. Handling target obscuration through Markov chain observations

    NASA Astrophysics Data System (ADS)

    Kouritzin, Michael A.; Wu, Biao

    2008-04-01

    Target Obscuration, including foliage or building obscuration of ground targets and landscape or horizon obscuration of airborne targets, plagues many real world filtering problems. In particular, ground moving target identification Doppler radar, mounted on a surveillance aircraft or unattended airborne vehicle, is used to detect motion consistent with targets of interest. However, these targets try to obscure themselves (at least partially) by, for example, traveling along the edge of a forest or around buildings. This has the effect of creating random blockages in the Doppler radar image that move dynamically and somewhat randomly through this image. Herein, we address tracking problems with target obscuration by building memory into the observations, eschewing the usual corrupted, distorted partial measurement assumptions of filtering in favor of dynamic Markov chain assumptions. In particular, we assume the observations are a Markov chain whose transition probabilities depend upon the signal. The state of the observation Markov chain attempts to depict the current obscuration and the Markov chain dynamics are used to handle the evolution of the partially obscured radar image. Modifications of the classical filtering equations that allow observation memory (in the form of a Markov chain) are given. We use particle filters to estimate the position of the moving targets. Moreover, positive proof-of-concept simulations are included.

  5. Detection of nuclear resonance signals: modification of the receiver operating characteristics using feedback.

    PubMed

    Blauch, A J; Schiano, J L; Ginsberg, M D

    2000-06-01

    The performance of a nuclear resonance detection system can be quantified using binary detection theory. Within this framework, signal averaging increases the probability of a correct detection and decreases the probability of a false alarm by reducing the variance of the noise in the average signal. In conjunction with signal averaging, we propose another method based on feedback control concepts that further improves detection performance. By maximizing the nuclear resonance signal amplitude, feedback raises the probability of correct detection. Furthermore, information generated by the feedback algorithm can be used to reduce the probability of false alarm. We discuss the advantages afforded by feedback that cannot be obtained using signal averaging. As an example, we show how this method is applicable to the detection of explosives using nuclear quadrupole resonance. Copyright 2000 Academic Press.
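    The role of signal averaging in this framework is easy to make concrete: averaging n acquisitions shrinks the noise standard deviation by a factor of sqrt(n), so at a fixed false-alarm probability the detection threshold stays put while the effective signal-to-noise ratio grows. A sketch under standard Gaussian binary-detection assumptions (not the authors' NQR-specific model; the function name is ours):

    ```python
    from statistics import NormalDist

    def pd_after_averaging(snr, pfa, n):
        """Detection probability for a known signal in white Gaussian noise at
        a fixed false-alarm probability pfa, after averaging n acquisitions.
        Averaging reduces the noise standard deviation by sqrt(n), which is
        equivalent to boosting the single-shot SNR by the same factor."""
        z = NormalDist()
        threshold = z.inv_cdf(1.0 - pfa)       # threshold in units of noise sd
        return 1.0 - z.cdf(threshold - snr * n ** 0.5)
    ```

    At a per-shot SNR of 1 and pfa = 1%, a single acquisition detects only about 9% of the time, while an average of 16 acquisitions detects about 95%; feedback that raises the resonance signal amplitude shifts the same curve further still, which is the complementary gain the abstract describes.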

  6. Landscape- and local-scale habitat influences on occupancy and detection probability of stream-dwelling crayfish: Implications for conservation

    USGS Publications Warehouse

    Magoulick, Daniel D.; DiStefano, Robert J.; Imhoff, Emily M.; Nolen, Matthew S.; Wagner, Brian K.

    2017-01-01

    Crayfish are ecologically important in freshwater systems worldwide and are imperiled in North America and globally. We sought to examine landscape- to local-scale environmental variables related to occupancy and detection probability of a suite of stream-dwelling crayfish species. We used a quantitative kickseine method to sample crayfish presence at 102 perennial stream sites with eight surveys per site. We modeled occupancy (psi) and detection probability (P) as functions of local- and landscape-scale environmental covariates. We developed a set of a priori candidate models for each species and ranked models using (Q)AICc. Detection probabilities and occupancy estimates differed among crayfish species, with Orconectes eupunctus, O. marchandi, and Cambarus hubbsi being relatively rare (psi < 0.20) with moderate (0.46–0.60) to high (0.81) detection probability, and O. punctimanus and O. ozarkae being relatively common (psi > 0.60) with high detection probability (0.81). Detection probability was often related to the local habitat variables current velocity, depth, and substrate size. Important environmental variables for crayfish occupancy were species dependent but were mainly landscape variables such as stream order, geology, slope, topography, and land use. Landscape variables strongly influenced crayfish occupancy and should be considered in future studies and conservation plans.

  7. Paediatric autoimmune encephalopathies: clinical features, laboratory investigations and outcomes in patients with or without antibodies to known central nervous system autoantigens

    PubMed Central

    Hacohen, Yael; Wright, Sukhvir; Waters, Patrick; Agrawal, Shakti; Carr, Lucinda; Cross, Helen; De Sousa, Carlos; DeVile, Catherine; Fallon, Penny; Gupta, Rajat; Hedderly, Tammy; Hughes, Elaine; Kerr, Tim; Lascelles, Karine; Lin, Jean-Pierre; Philip, Sunny; Pohl, Keith; Prabahkar, Prab; Smith, Martin; Williams, Ruth; Clarke, Antonia; Hemingway, Cheryl; Wassmer, Evangeline; Vincent, Angela; Lim, Ming J

    2013-01-01

    Objective To report the clinical and investigative features of children with a clinical diagnosis of probable autoimmune encephalopathy, both with and without antibodies to central nervous system antigens. Method Patients with encephalopathy plus one or more of neuropsychiatric symptoms, seizures, movement disorder or cognitive dysfunction were identified from 111 paediatric serum samples referred from five tertiary paediatric neurology centres to Oxford for antibody testing in 2007–2010. A blinded clinical review panel identified 48 patients with a diagnosis of probable autoimmune encephalitis whose features are described. All samples were tested/retested for antibodies to N-methyl-D-aspartate receptor (NMDAR), VGKC-complex, LGI1, CASPR2 and contactin-2, GlyR, D1R, D2R, AMPAR, GABA(B)R and glutamic acid decarboxylase. Results Seizures (83%), behavioural change (63%), confusion (50%), movement disorder (38%) and hallucinations (25%) were common. 52% required intensive care support for seizure control or profound encephalopathy. An acute infective organism (15%) or abnormalities of cerebrospinal fluid (32%), EEG (70%) or MRI (37%) were found. One 14-year-old girl had an ovarian teratoma. Serum antibodies were detected in 21/48 (44%) patients: NMDAR 13/48 (27%), VGKC-complex 7/48 (15%) and GlyR 1/48 (2%). Antibody-negative patients shared similar clinical features with those who had specific antibodies detected. 18/34 patients (52%) who received immunotherapy made a complete recovery compared to 4/14 (28%) who were not treated; reductions in modified Rankin Scale for children scores were more common following immunotherapies. Antibody status did not appear to influence the treatment effect. Conclusions Our study outlines the common clinical and paraclinical features of children and adolescents with probable autoimmune encephalopathies.
These patients, irrespective of positivity for the known antibody targets, appeared to benefit from immunotherapies and further antibody targets may be defined in the future. PMID:23175854

  8. Cross-modal cueing effects of visuospatial attention on conscious somatosensory perception.

    PubMed

    Doruk, Deniz; Chanes, Lorena; Malavera, Alejandra; Merabet, Lotfi B; Valero-Cabré, Antoni; Fregni, Felipe

    2018-04-01

    The impact of visuospatial attention on perception with supraliminal stimuli and stimuli at the threshold of conscious perception has been previously investigated. In this study, we assess the cross-modal effects of visuospatial attention on conscious perception for near-threshold somatosensory stimuli applied to the face. Fifteen healthy participants completed two sessions of a near-threshold cross-modality cue-target discrimination/conscious detection paradigm. Each trial began with an endogenous visuospatial cue that predicted the location of a weak near-threshold electrical pulse delivered to the right or left cheek with high probability (∼75%). Participants then completed two tasks: first, a forced-choice somatosensory discrimination task (felt once or twice?) and then, a somatosensory conscious detection task (did you feel the stimulus and, if yes, where (left/right)?). Somatosensory discrimination was evaluated with the response reaction times of correctly detected targets, whereas the somatosensory conscious detection was quantified using perceptual sensitivity (d') and response bias (beta). A 2 × 2 repeated measures ANOVA was used for statistical analysis. In the somatosensory discrimination task (1st task), participants were significantly faster in responding to correctly detected targets (p < 0.001). In the somatosensory conscious detection task (2nd task), a significant effect of visuospatial attention on response bias (p = 0.008) was observed, suggesting that participants had a less strict criterion for stimuli preceded by spatially valid than invalid visuospatial cues. We showed that spatial attention has the potential to modulate the discrimination and the conscious detection of near-threshold somatosensory stimuli as measured, respectively, by a reduction of reaction times and a shift in response bias toward less conservative responses when the cue predicted stimulus location.
A shift in response bias indicates possible effects of spatial attention on internal decision processes. The lack of significant results in perceptual sensitivity (d') could be due to weaker effects of endogenous attention on perception.
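    The perceptual sensitivity (d') and response bias measures used above come from equal-variance signal detection theory: d' is the difference of the z-transformed hit and false-alarm rates, the criterion c is their negated mean, and beta = exp(d'·c). A minimal sketch with illustrative rates (not data from this study):

    ```python
    import math
    from statistics import NormalDist

    def sdt_measures(hit_rate, fa_rate):
        """Equal-variance signal detection theory: perceptual sensitivity d',
        criterion c, and likelihood-ratio bias beta from hit and false-alarm
        rates (rates must lie strictly between 0 and 1)."""
        z = NormalDist().inv_cdf
        zh, zf = z(hit_rate), z(fa_rate)
        d_prime = zh - zf
        c = -0.5 * (zh + zf)          # c > 0 means a conservative criterion
        beta = math.exp(d_prime * c)
        return d_prime, c, beta
    ```

    A validly cued condition that shifts responses toward "yes" shows up as a lower c (and beta < 1) with unchanged d', which is exactly the pattern of a criterion shift without a sensitivity change reported above.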

  9. Compensating for geographic variation in detection probability with water depth improves abundance estimates of coastal marine megafauna.

    PubMed

    Hagihara, Rie; Jones, Rhondda E; Sobtzick, Susan; Cleguer, Christophe; Garrigue, Claire; Marsh, Helene

    2018-01-01

    The probability of an aquatic animal being available for detection is typically <1. Accounting for covariates that reduce the probability of detection is important for obtaining robust estimates of the population abundance and determining its status and trends. The dugong (Dugong dugon) is a bottom-feeding marine mammal and a seagrass community specialist. We hypothesized that the probability of a dugong being available for detection is dependent on water depth and that dugongs spend more time underwater in deep-water seagrass habitats than in shallow-water seagrass habitats. We tested this hypothesis by quantifying the depth use of 28 wild dugongs fitted with GPS satellite transmitters and time-depth recorders (TDRs) at three sites with distinct seagrass depth distributions: 1) open waters supporting extensive seagrass meadows to 40 m deep (Torres Strait, 6 dugongs, 2015); 2) a protected bay (average water depth 6.8 m) with extensive shallow seagrass beds (Moreton Bay, 13 dugongs, 2011 and 2012); and 3) a mixture of lagoon, coral and seagrass habitats to 60 m deep (New Caledonia, 9 dugongs, 2013). The fitted instruments were used to measure the times the dugongs spent in the experimentally determined detection zones under various environmental conditions. The estimated probability of detection was applied to aerial survey data previously collected at each location. In general, dugongs were least available for detection in Torres Strait, and the population estimates increased six- to sevenfold using depth-specific availability correction factors compared with earlier estimates that assumed homogeneous detection probability across water depth and location. Detection probabilities were higher in Moreton Bay and New Caledonia than in Torres Strait because the water transparency in these two locations was much greater, and the effect of correcting for depth-specific detection probability was correspondingly smaller.
The methodology has application to visual surveys of coastal megafauna, including surveys using Unmanned Aerial Vehicles.

  10. Directed Design of Experiments (DOE) for Determining Probability of Detection (POD) Capability of NDE Systems (DOEPOD)

    NASA Technical Reports Server (NTRS)

    Generazio, Ed

    2007-01-01

    This viewgraph presentation reviews some of the issues that specialists in nondestructive evaluation (NDE) face in determining the statistics of the probability of detection. There is discussion of the use of the binomial distribution and the probability of hit. The presentation then reviews the concepts of Directed Design of Experiments for Validating Probability of Detection of Inspection Systems (DOEPOD). Several cases are reviewed and discussed. The concept of false calls is also reviewed.
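    The binomial logic behind POD demonstrations can be made concrete: to claim a probability of detection of at least 90% with 95% confidence from n hits in n trials, one needs 0.9**n <= 0.05, which yields the classic 29-of-29 criterion. A small sketch (the function name is ours, not DOEPOD's):

    ```python
    def demo_sample_size(pod=0.90, confidence=0.95):
        """Smallest n such that n detections in n trials demonstrates a
        probability of detection of at least `pod` at the given confidence.
        If the true POD were only `pod`, an all-hit run has probability
        pod**n, so we require pod**n <= 1 - confidence."""
        n = 1
        while pod ** n > 1.0 - confidence:
            n += 1
        return n
    ```

    demo_sample_size() returns 29, the familiar 29-of-29 criterion for a 90/95 demonstration, while a 95/95 demonstration needs 59 straight hits; any miss forces a larger sample computed from the full binomial tail, which is part of what directed designs of experiments manage.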

  11. Optimal background matching camouflage.

    PubMed

    Michalis, Constantine; Scott-Samuel, Nicholas E; Gibson, David P; Cuthill, Innes C

    2017-07-12

    Background matching is the most familiar and widespread camouflage strategy: avoiding detection by having a similar colour and pattern to the background. Optimizing background matching is straightforward in a homogeneous environment, or when the habitat has very distinct sub-types and there is divergent selection leading to polymorphism. However, most backgrounds have continuous variation in colour and texture, so what is the best solution? Not all samples of the background are likely to be equally inconspicuous, and laboratory experiments on birds and humans support this view. Theory suggests that the most probable background sample (in the statistical sense), at the size of the prey, would, on average, be the most cryptic. We present an analysis, based on realistic assumptions about low-level vision, that estimates the distribution of background colours and visual textures, and predicts the best camouflage. We present data from a field experiment that tests and supports our predictions, using artificial moth-like targets under bird predation. Additionally, we present analogous data for humans, under tightly controlled viewing conditions, searching for targets on a computer screen. These data show that, in the absence of predator learning, the best single camouflage pattern for heterogeneous backgrounds is the most probable sample. © 2017 The Authors.

  12. Application of infrared uncooled cameras in surveillance systems

    NASA Astrophysics Data System (ADS)

    Dulski, R.; Bareła, J.; Trzaskawka, P.; Piątkowski, T.

    2013-10-01

    The recent need to protect military bases, convoys, and patrols has given serious impetus to the development of multisensor security systems for perimeter protection. Among the most important devices used in such systems are IR cameras. The paper discusses the technical possibilities and limitations of using an uncooled IR camera in a multi-sensor surveillance system for perimeter protection. Effective detection ranges depend on the class of the sensor used and the observed scene itself. Application of an IR camera increases the probability of intruder detection regardless of the time of day or weather conditions, and it simultaneously decreases the false alarm rate produced by the surveillance system. The role of IR cameras in the system is discussed, as well as the technical possibilities for detecting a human being. Commercially available IR cameras capable of achieving the desired ranges are compared, and the spatial resolution required for detection, recognition, and identification is calculated. The simulation of detection ranges was done using a new model for predicting target acquisition performance that uses the Targeting Task Performance (TTP) metric. Like its predecessor, the Johnson criteria, the new model bounds range performance with image quality. The scope of the presented analysis is limited to the estimation of detection, recognition, and identification ranges for typical thermal cameras with uncooled microbolometer focal plane arrays. This type of camera is most widely used in security systems because of its competitive price-to-performance ratio. Detection, recognition, and identification range calculations were made, and the results for devices with selected technical specifications were compared and discussed.

  13. Detection of dim targets in multiple environments

    NASA Astrophysics Data System (ADS)

    Mirsky, Grace M.; Woods, Matthew; Grasso, Robert J.

    2013-10-01

    The proliferation of a wide variety of weapons including Anti-Aircraft Artillery (AAA), rockets, and small arms presents a substantial threat to both military and civilian aircraft. To address this ever-present threat, Northrop Grumman has assessed unguided threat phenomenology to understand the underlying physical principles for detection. These principles, based upon threat transit through the atmosphere, exploit a simple phenomenon universal to all objects moving through an atmosphere comprised of gaseous media to detect and track the threat in the presence of background and clutter. Threat detection has rapidly become a crucial component of aircraft survivability systems that provide situational awareness to the crew. It is particularly important to platforms which may spend a majority of their time at low altitudes and within the effective range of a large variety of weapons. Detection of these threats presents a unique challenge as this class of threat typically has a dim signature coupled with a short duration. Correct identification of each of the threat components (muzzle flash and projectile) is important to determine trajectory and intent while minimizing false alarms and maintaining a high detection probability in all environments.

  14. Fabrication of few-layer graphene film based field effect transistor and its application for trace-detection of herbicide atrazine

    NASA Astrophysics Data System (ADS)

    Thanh Cao, Thi; Chuc Nguyen, Van; Binh Nguyen, Hai; Thang Bui, Hung; Thu Vu, Thi; Phan, Ngoc Hong; Thang Phan, Bach; Hoang, Le; Bayle, Maxime; Paillet, Matthieu; Sauvajol, Jean Louis; Phan, Ngoc Minh; Tran, Dai Lam

    2016-09-01

    We describe the fabrication of a highly sensitive graphene-based field effect transistor (FET) enzymatic biosensor for trace detection of atrazine. The few-layer graphene films were prepared on polycrystalline copper foils by the atmospheric pressure chemical vapor deposition method using an argon/hydrogen/methane mixture. The characteristics of the graphene films were investigated by scanning electron microscopy, transmission electron microscopy and Raman spectroscopy. The results indicated low uniformity of the graphene layers, which is probably induced by the heterogeneous distribution of graphene nucleation sites on the Cu surface. The pesticide detection is accomplished through measurement of the drain-source current variations of the FET sensor upon the urea enzymatic hydrolysis reaction. The obtained biosensor is able to detect atrazine with a sensitivity of 56 μA/logCATZ in the range between 2 × 10⁻⁴ and 20 ppb and has a limit of detection as low as 0.05 ppt. The elaboration of such highly sensitive biosensors will provide better biosensing performance for the detection of biochemical targets.

  15. Climate change and the detection of trends in annual runoff

    USGS Publications Warehouse

    McCabe, G.J.; Wolock, D.M.

    1997-01-01

    This study examines the statistical likelihood of detecting a trend in annual runoff given an assumed change in mean annual runoff, the underlying year-to-year variability in runoff, and serial correlation of annual runoff. Means, standard deviations, and lag-1 serial correlations of annual runoff were computed for 585 stream gages in the conterminous United States, and these statistics were used to compute the probability of detecting a prescribed trend in annual runoff. Assuming a linear 20% change in mean annual runoff over a 100 yr period and a significance level of 95%, the average probability of detecting a significant trend was 28% among the 585 stream gages. The largest probability of detecting a trend was in the northwestern U.S., the Great Lakes region, the northeastern U.S., the Appalachian Mountains, and parts of the northern Rocky Mountains. The smallest probability of trend detection was in the central and southwestern U.S., and in Florida. Low probabilities of trend detection were associated with low ratios of mean annual runoff to the standard deviation of annual runoff and with high lag-1 serial correlation in the data.
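
    The power calculation described above can be approximated with a short Monte Carlo sketch. This is an illustrative stand-in, not the paper's method: the function name, the AR(1) noise model, the hardcoded t critical value, and all parameter values (mean runoff, coefficient of variation, lag-1 correlation) are assumptions chosen only to mirror the scenario of a linear 20% change over 100 years.

    ```python
    import math
    import random

    def trend_detection_power(mean=500.0, cv=0.3, rho=0.2, change=0.20,
                              years=100, n_sims=500, seed=1):
        """Monte Carlo probability that an OLS slope t-test (5% level) flags a
        linear `change` in mean annual runoff with AR(1) year-to-year noise."""
        rng = random.Random(seed)
        sd = cv * mean                               # year-to-year standard deviation
        innov_sd = sd * math.sqrt(1.0 - rho * rho)   # keeps marginal variance at sd^2
        xs = range(years)
        xbar = (years - 1) / 2.0
        sxx = sum((x - xbar) ** 2 for x in xs)
        t_crit = 1.98                                # two-sided 5% critical value, df ~ 98
        hits = 0
        for _ in range(n_sims):
            eps = rng.gauss(0.0, sd)                 # stationary AR(1) start
            ys = []
            for x in xs:
                if x > 0:
                    eps = rho * eps + rng.gauss(0.0, innov_sd)
                ys.append(mean + mean * change * x / (years - 1) + eps)
            ybar = sum(ys) / years
            slope = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / sxx
            rss = sum((y - ybar - slope * (x - xbar)) ** 2 for x, y in zip(xs, ys))
            se = math.sqrt(rss / (years - 2) / sxx)
            if abs(slope / se) > t_crit:
                hits += 1
        return hits / n_sims
    ```

    Consistent with the study's finding, power falls as the ratio of mean to standard deviation drops or as serial correlation rises, which can be explored by varying `cv` and `rho`.
    
    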

  16. Spatial patch occupancy patterns of the Lower Keys marsh rabbit

    USGS Publications Warehouse

    Eaton, Mitchell J.; Hughes, Phillip T.; Nichols, James D.; Morkill, Anne; Anderson, Chad

    2011-01-01

    Reliable estimates of presence or absence of a species can provide substantial information on management questions related to distribution and habitat use but should incorporate the probability of detection to reduce bias. We surveyed for the endangered Lower Keys marsh rabbit (Sylvilagus palustris hefneri) in habitat patches on 5 Florida Key islands, USA, to estimate occupancy and detection probabilities. We derived detection probabilities using spatial replication of plots and evaluated hypotheses that patch location (coastal or interior) and patch size influence occupancy and detection. Results demonstrate that detection probability, given rabbits were present, was <0.5 and suggest that naïve estimates (i.e., estimates without consideration of imperfect detection) of patch occupancy are negatively biased. We found that patch size and location influenced probability of occupancy but not detection. Our findings will be used by Refuge managers to evaluate population trends of Lower Keys marsh rabbits from historical data and to guide management decisions for species recovery. The sampling and analytical methods we used may be useful for researchers and managers of other endangered lagomorphs and cryptic or fossorial animals occupying diverse habitats.
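
    The negative bias from ignoring imperfect detection can be made concrete with a minimal sketch. The function names and the ψ and p values below are hypothetical, not the study's estimates; the study only reports that per-survey detection was below 0.5.

    ```python
    def cumulative_detection(p, k):
        """Probability of at least one detection in k replicate surveys,
        given the species is actually present."""
        return 1.0 - (1.0 - p) ** k

    def expected_naive_occupancy(psi, p, k):
        """Expected 'naive' occupancy when detection is ignored: a patch is
        scored occupied only if it is occupied AND detected at least once."""
        return psi * cumulative_detection(p, k)
    ```

    For example, with true occupancy ψ = 0.8 and per-survey detection p = 0.4, a single survey yields an expected naive occupancy of only 0.32; three replicates raise the cumulative detection probability to about 0.78, shrinking but not eliminating the bias.
    
    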

  17. Detection probabilities of electrofishing, hoop nets, and benthic trawls for fishes in two western North American rivers

    USGS Publications Warehouse

    Smith, Christopher D.; Quist, Michael C.; Hardy, Ryan S.

    2015-01-01

    Research comparing different sampling techniques helps improve the efficiency and efficacy of sampling efforts. We compared the effectiveness of three sampling techniques (small-mesh hoop nets, benthic trawls, boat-mounted electrofishing) for 30 species in the Green (WY, USA) and Kootenai (ID, USA) rivers by estimating conditional detection probabilities (probability of detecting a species given its presence at a site). Electrofishing had the highest detection probabilities (generally greater than 0.60) for most species (88%), but hoop nets also had high detectability for several taxa (e.g., adult burbot Lota lota, juvenile northern pikeminnow Ptychocheilus oregonensis). Benthic trawls had low detection probabilities (<0.05) for most taxa (84%). Gear-specific effects were present for most species indicating large differences in gear effectiveness among techniques. In addition to gear effects, habitat characteristics also influenced detectability of fishes. Most species-specific habitat relationships were idiosyncratic and reflected the ecology of the species. Overall findings of our study indicate that boat-mounted electrofishing and hoop nets are the most effective techniques for sampling fish assemblages in large, coldwater rivers.

  18. Molecular level detection and localization of mechanical damage in collagen enabled by collagen hybridizing peptides.

    PubMed

    Zitnay, Jared L; Li, Yang; Qin, Zhao; San, Boi Hoa; Depalle, Baptiste; Reese, Shawn P; Buehler, Markus J; Yu, S Michael; Weiss, Jeffrey A

    2017-03-22

    Mechanical injury to connective tissue causes changes in collagen structure and material behaviour, but the role and mechanisms of molecular damage have not been established. In the case of mechanical subfailure damage, no apparent macroscale damage can be detected, yet this damage initiates and potentiates pathological processes. Here, we utilize collagen hybridizing peptide (CHP), which binds unfolded collagen by triple-helix formation, to detect molecular-level subfailure damage to collagen in mechanically stretched rat tail tendon fascicles. Our results directly reveal that collagen triple-helix unfolding occurs during tensile loading of collagenous tissues and thus is an important damage mechanism. Steered molecular dynamics simulations suggest that a likely mechanism for triple-helix unfolding is intermolecular shearing of collagen α-chains. Our results elucidate a probable molecular failure mechanism associated with subfailure injuries, and demonstrate the potential of CHP targeting for diagnosis, treatment, and monitoring of tissue disease and injury.

  19. Measuring multielectron beam imaging fidelity with a signal-to-noise ratio analysis

    NASA Astrophysics Data System (ADS)

    Mukhtar, Maseeh; Bunday, Benjamin D.; Quoi, Kathy; Malloy, Matt; Thiel, Brad

    2016-07-01

    Java Monte Carlo Simulator for Secondary Electrons (JMONSEL) simulations are used to generate expected imaging responses for chosen test cases of patterns and defects, with the ability to vary parameters for beam energy, spot size, pixel size, and/or defect material and form factor. The patterns are representative of the design rules for an aggressively scaled FinFET-type design. From these simulated images and the resulting shot noise, a signal-to-noise framework is developed that relates to defect-detection probabilities. Additionally, this infrastructure can model the effect of detection-chain noise and frequency-dependent system response, allowing the best recipe parameters to be targeted for multielectron-beam inspection validation experiments. Ultimately, these results should lead to insights into how such parameters will impact tool design, including the doses necessary for defect detection and estimates of scanning speeds for achieving high throughput in high-volume manufacturing.

  20. A plastic scintillator-based muon tomography system with an integrated muon spectrometer

    NASA Astrophysics Data System (ADS)

    Anghel, V.; Armitage, J.; Baig, F.; Boniface, K.; Boudjemline, K.; Bueno, J.; Charles, E.; Drouin, P.-L.; Erlandson, A.; Gallant, G.; Gazit, R.; Godin, D.; Golovko, V. V.; Howard, C.; Hydomako, R.; Jewett, C.; Jonkmans, G.; Liu, Z.; Robichaud, A.; Stocki, T. J.; Thompson, M.; Waller, D.

    2015-10-01

    A muon scattering tomography system which uses extruded plastic scintillator bars for muon tracking and a dedicated muon spectrometer that measures scattering through steel slabs has been constructed and successfully tested. The atmospheric muon detection efficiency is measured to be 97% per plane on average and the average intrinsic hit resolution is 2.5 mm. In addition to creating a variety of three-dimensional images of objects of interest, a quantitative study has been carried out to investigate the impact of including muon momentum measurements when attempting to detect high-density, high-Z material. As expected, the addition of momentum information improves the performance of the system. For a fixed data-taking time of 60 s and a fixed false positive fraction, the probability to detect a target increases when momentum information is used. This is the first demonstration of the use of muon momentum information from dedicated spectrometer measurements in muon scattering tomography.

  1. A simple two-stage model predicts response time distributions.

    PubMed

    Carpenter, R H S; Reddi, B A J; Anderson, A J

    2009-08-15

    The neural mechanisms underlying reaction times have previously been modelled in two distinct ways. When stimuli are hard to detect, response time tends to follow a random-walk model that integrates noisy sensory signals. But studies investigating the influence of higher-level factors such as prior probability and response urgency typically use highly detectable targets, and response times then usually correspond to a linear rise-to-threshold mechanism. Here we show that a model incorporating both types of element in series - a detector integrating noisy afferent signals, followed by a linear rise-to-threshold stage that performs the decision - successfully predicts not only mean response times but, much more stringently, the observed distribution of these times and the rate of decision errors over a wide range of stimulus detectability. By reconciling what previously may have seemed to be conflicting theories, we are now closer to a complete description of reaction time and the decision processes that underlie it.

  2. Locality-constrained anomaly detection for hyperspectral imagery

    NASA Astrophysics Data System (ADS)

    Liu, Jiabin; Li, Wei; Du, Qian; Liu, Kui

    2015-12-01

    Detecting a target with a low occurrence probability against an unknown background in a hyperspectral image, namely anomaly detection, is of practical significance. The Reed-Xiaoli (RX) algorithm is considered a classic anomaly detector; it calculates the Mahalanobis distance between the local background and the pixel under test. Local RX, an adaptive RX detector, employs a dual-window strategy in which pixels within the frame between the inner and outer windows form the local background. However, the detector degrades when this local region contains anomalous pixels (i.e., outliers). In this paper, a locality-constrained anomaly detector is proposed that removes outliers from the local background region before applying the RX algorithm. Specifically, a local linear representation is designed to exploit the internal relationship between linearly correlated pixels in the local background region and the pixel under test and its neighbors. Experimental results demonstrate that the proposed detector improves upon the original local RX algorithm.
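
    The core RX score can be sketched for a two-band toy case. Real hyperspectral cubes have hundreds of bands and would use matrix libraries; the function name, the two-band restriction, and the hand-written 2×2 covariance inverse below are illustrative assumptions, not the paper's implementation.

    ```python
    def rx_score(pixel, background):
        """Mahalanobis distance of a two-band pixel from local-background
        statistics: d^T C^{-1} d with the 2x2 inverse written out explicitly."""
        n = len(background)
        m0 = sum(p[0] for p in background) / n
        m1 = sum(p[1] for p in background) / n
        # 2x2 sample covariance of the local background pixels
        c00 = sum((p[0] - m0) ** 2 for p in background) / (n - 1)
        c11 = sum((p[1] - m1) ** 2 for p in background) / (n - 1)
        c01 = sum((p[0] - m0) * (p[1] - m1) for p in background) / (n - 1)
        det = c00 * c11 - c01 * c01
        d0, d1 = pixel[0] - m0, pixel[1] - m1
        return (c11 * d0 * d0 - 2.0 * c01 * d0 * d1 + c00 * d1 * d1) / det
    ```

    A pixel far from the background cloud gets a large score; thresholding that score declares the anomaly. The locality constraint in the paper amounts to screening `background` for outliers before these statistics are computed.
    
    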

  3. Radar Detectability Studies of Slow and Small Zodiacal Dust Cloud Particles. III. The Role of Sodium and the Head Echo Size on the Probability of Detection

    NASA Astrophysics Data System (ADS)

    Janches, D.; Swarnalingam, N.; Carrillo-Sanchez, J. D.; Gomez-Martin, J. C.; Marshall, R.; Nesvorný, D.; Plane, J. M. C.; Feng, W.; Pokorný, P.

    2017-07-01

    We present a path forward on a long-standing issue concerning the flux of small and slow meteoroids, which are believed to be the dominant portion of the incoming meteoric mass flux into the Earth’s atmosphere. Such a flux, which is predicted by dynamical dust models of the Zodiacal Cloud, is not evident in ground-based radar observations. For decades this was attributed to the fact that the radars used for meteor observations lack the sensitivity to detect this population, due to the small amount of ionization produced by slow-velocity meteors. Such a hypothesis has been challenged by the introduction of meteor head echo (HE) observations with High Power and Large Aperture radars, in particular the Arecibo 430 MHz radar. Janches et al. developed a probabilistic approach to estimate the detectability of meteors by these radars and initially showed that, with the current knowledge of ablation and ionization, such particles should dominate the detected rates by one to two orders of magnitude compared to the actual observations. In this paper, we include results in our model from recently published laboratory measurements, which showed that (1) the ablation of Na is less intense, covering a wider altitude range; and (2) the ionization probability, β_ip, for Na atoms in air is up to two orders of magnitude smaller at low speeds than originally believed. By applying these results and using a somewhat smaller size for the HE radar target, we offer a solution that reconciles these observations with model predictions.

  4. Radar Detectability Studies of Slow and Small Zodiacal Dust Cloud Particles. III. The Role of Sodium and the Head Echo Size on the Probability of Detection

    NASA Technical Reports Server (NTRS)

    Janches, D.; Swarnalingam, N.; Carrillo-Sanchez, J. D.; Gomez-Martin, J. C.; Marshall, R.; Nesvorny, D.; Plane, J. M. C.; Feng, W.; Pokorny, P.

    2017-01-01

    We present a path forward on a long-standing issue concerning the flux of small and slow meteoroids, which are believed to be the dominant portion of the incoming meteoric mass flux into the Earth's atmosphere. Such a flux, which is predicted by dynamical dust models of the Zodiacal Cloud, is not evident in ground-based radar observations. For decades this was attributed to the fact that the radars used for meteor observations lack the sensitivity to detect this population, due to the small amount of ionization produced by slow-velocity meteors. Such a hypothesis has been challenged by the introduction of meteor head echo (HE) observations with High Power and Large Aperture radars, in particular the Arecibo 430 MHz radar. Janches et al. developed a probabilistic approach to estimate the detectability of meteors by these radars and initially showed that, with the current knowledge of ablation and ionization, such particles should dominate the detected rates by one to two orders of magnitude compared to the actual observations. In this paper, we include results in our model from recently published laboratory measurements, which showed that (1) the ablation of Na is less intense, covering a wider altitude range; and (2) the ionization probability, β_ip, for Na atoms in air is up to two orders of magnitude smaller at low speeds than originally believed. By applying these results and using a somewhat smaller size for the HE radar target, we offer a solution that reconciles these observations with model predictions.

  5. Development of recombinase polymerase amplification assays for the rapid detection of peste des petits ruminants virus.

    PubMed

    Zhang, Yongning; Wang, Jianchang; Zhang, Zhou; Mei, Lin; Wang, Jinfeng; Wu, Shaoqiang; Lin, Xiangmei

    2018-04-01

    Peste des petits ruminants (PPR) is a severe infectious disease of small ruminants caused by PPR virus (PPRV). Rapid and sensitive detection of PPRV is critical for controlling PPR. This report describes the development and evaluation of a conventional reverse-transcription recombinase polymerase amplification (RT-RPA) assay and a real-time RT-RPA assay, both targeting the PPRV N gene. Sensitivity analysis revealed that the conventional RT-RPA assay could detect 852 copies of standard PPRV RNA per reaction at 95% probability within 20 min at 41 °C, and the real-time RT-RPA assay could detect 103 copies of RNA per reaction at 95% probability. Specificity analysis showed that neither assay cross-reacts with nucleic acid templates prepared from other selected viruses or common pathogens. Clinical evaluation using 162 ovine and hircine serum and nasal-swab samples showed that the performance of both the real-time and the conventional RT-RPA assays was comparable to that of real-time RT-PCR. The overall agreement between real-time RT-PCR and real-time RT-RPA was 99.4% (161/162), and between real-time RT-PCR and conventional RT-RPA 98.8% (160/162). The R² value between real-time RT-RPA and real-time RT-PCR was 0.900 by linear regression analysis. Our results suggest that both RT-RPA assays have potential application in the rapid, sensitive, and specific detection of PPRV. Copyright © 2018 Elsevier B.V. All rights reserved.

  6. Count data, detection probabilities, and the demography, dynamics, distribution, and decline of amphibians.

    PubMed

    Schmidt, Benedikt R

    2003-08-01

    The evidence for amphibian population declines is based on count data that were not adjusted for detection probabilities. Such data are not reliable even when collected using standard methods. The formula C = Np (where C is a count, N the true parameter value, and p is a detection probability) relates count data to demography, population size, or distributions. With unadjusted count data, one assumes a linear relationship between C and N and that p is constant. These assumptions are unlikely to be met in studies of amphibian populations. Amphibian population data should be based on methods that account for detection probabilities.
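
    The C = Np relationship can be illustrated directly. The counts and detection probabilities below are hypothetical, chosen only to show how unadjusted counts mimic a decline when only p has changed:

    ```python
    def estimate_N(count, p):
        """Invert C = N * p: recover the true parameter from a raw count."""
        return count / p

    # Two hypothetical survey years with identical true abundance N = 100:
    # detection probability halves, so the raw count halves as well.
    surveys = {"year 1": (60, 0.6), "year 2": (30, 0.3)}
    for year, (count, p) in surveys.items():
        print(year, "raw count:", count, "adjusted estimate:", round(estimate_N(count, p)))
    ```

    The raw counts suggest a 50% population crash; the detection-adjusted estimates show a stable population. This is precisely why count data unadjusted for p are unreliable evidence of amphibian declines.
    
    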

  7. A method of determining where to target surveillance efforts in heterogeneous epidemiological systems

    PubMed Central

    van den Bosch, Frank; Gottwald, Timothy R.; Alonso Chavez, Vasthi

    2017-01-01

    The spread of pathogens into new environments poses a considerable threat to human, animal, and plant health, and by extension, human and animal wellbeing, ecosystem function, and agricultural productivity, worldwide. Early detection through effective surveillance is a key strategy to reduce the risk of their establishment. Whilst it is well established that statistical and economic considerations are of vital importance when planning surveillance efforts, it is also important to consider epidemiological characteristics of the pathogen in question—including heterogeneities within the epidemiological system itself. One of the most pronounced realisations of this heterogeneity is seen in the case of vector-borne pathogens, which spread between ‘hosts’ and ‘vectors’—with each group possessing distinct epidemiological characteristics. As a result, an important question when planning surveillance for emerging vector-borne pathogens is where to place sampling resources in order to detect the pathogen as early as possible. We answer this question by developing a statistical function which describes the probability distributions of the prevalences of infection at first detection in both hosts and vectors. We also show how this method can be adapted in order to maximise the probability of early detection of an emerging pathogen within imposed sample size and/or cost constraints, and demonstrate its application using two simple models of vector-borne citrus pathogens. Under the assumption of a linear cost function, we find that sampling costs are generally minimised when either hosts or vectors, but not both, are sampled. PMID:28846676

  8. Multicentre evaluation of targeted and systematic biopsies using magnetic resonance and ultrasound image-fusion guided transperineal prostate biopsy in patients with a previous negative biopsy.

    PubMed

    Hansen, Nienke L; Kesch, Claudia; Barrett, Tristan; Koo, Brendan; Radtke, Jan P; Bonekamp, David; Schlemmer, Heinz-Peter; Warren, Anne Y; Wieczorek, Kathrin; Hohenfellner, Markus; Kastner, Christof; Hadaschik, Boris

    2017-11-01

    To evaluate the detection rates of targeted and systematic biopsies in magnetic resonance imaging (MRI) and ultrasound (US) image-fusion transperineal prostate biopsy for patients with previous benign transrectal biopsies in two high-volume centres. A two-centre prospective outcome study of 487 patients with previous benign biopsies who underwent transperineal MRI/US fusion-guided targeted and systematic saturation biopsy from 2012 to 2015. Multiparametric MRI (mpMRI) was reported according to Prostate Imaging Reporting and Data System (PI-RADS) Version 1. Detection of Gleason score 7-10 prostate cancer on biopsy was the primary outcome. Positive (PPV) and negative (NPV) predictive values, including 95% confidence intervals (95% CIs), were calculated. Detection rates of targeted and systematic biopsies were compared using McNemar's test. The median (interquartile range) PSA level was 9.0 (6.7-13.4) ng/mL. PI-RADS 3-5 mpMRI lesions were reported in 343 (70%) patients and Gleason score 7-10 prostate cancer was detected in 149 (31%). The PPV (95% CI) for detecting Gleason score 7-10 prostate cancer was 0.20 (±0.07) for PI-RADS 3, 0.32 (±0.09) for PI-RADS 4, and 0.70 (±0.08) for PI-RADS 5. The NPV (95% CI) of PI-RADS 1-2 was 0.92 (±0.04) for Gleason score 7-10 and 0.99 (±0.02) for Gleason score ≥4 + 3 cancer. Systematic biopsies alone found 125/138 (91%) Gleason score 7-10 cancers. In patients with suspicious lesions (PI-RADS 4-5) on mpMRI, systematic biopsies would not have detected 12/113 significant prostate cancers (11%), while targeted biopsies alone would have failed to diagnose 10/113 (9%). In equivocal lesions (PI-RADS 3), targeted biopsy alone would not have diagnosed 14/25 (56%) of Gleason score 7-10 cancers, whereas systematic biopsies alone would have missed 1/25 (4%). Combination with PSA density improved the area under the curve of PI-RADS from 0.822 to 0.846. 
In patients with high probability mpMRI lesions, the highest detection rates of Gleason score 7-10 cancer still required combined targeted and systematic MRI/US image-fusion; however, systematic biopsy alone may be sufficient in patients with equivocal lesions. Repeated prostate biopsies may not be needed at all for patients with a low PSA density and a negative mpMRI read by experienced radiologists. © 2016 The Authors BJU International © 2016 BJU International Published by John Wiley & Sons Ltd.
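
    Predictive values like those reported above come from a 2×2 table of biopsy outcomes. The sketch below uses a normal-approximation confidence interval and wholly hypothetical counts (the paper does not publish raw cell counts per PI-RADS score, and its exact CI method may differ):

    ```python
    import math

    def predictive_values(tp, fp, tn, fn, z=1.96):
        """PPV and NPV from a 2x2 table, each paired with a
        normal-approximation 95% confidence half-width."""
        def proportion(k, n):
            p = k / n
            return p, z * math.sqrt(p * (1.0 - p) / n)
        return {"PPV": proportion(tp, tp + fp),   # TP / (TP + FP)
                "NPV": proportion(tn, tn + fn)}   # TN / (TN + FN)

    # Hypothetical counts: 70 true positives, 30 false positives,
    # 90 true negatives, 10 false negatives.
    print(predictive_values(70, 30, 90, 10))
    ```

    With these made-up counts the PPV is 0.70 and the NPV 0.90, each with a half-width of roughly 0.06-0.09, mirroring the ±CI format used in the abstract.
    
    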

  9. Detection probability of an in-stream passive integrated transponder (PIT) tag detection system for juvenile salmonids in the Klamath River, northern California, 2011

    USGS Publications Warehouse

    Beeman, John W.; Hayes, Brian; Wright, Katrina

    2012-01-01

    A series of in-stream passive integrated transponder (PIT) detection antennas installed across the Klamath River in August 2010 were tested using tagged fish in the summer of 2011. Six pass-by antennas were constructed and anchored to the bottom of the Klamath River at a site between the Shasta and Scott Rivers. Two of the six antennas malfunctioned during the spring of 2011 and two pass-through antennas were installed near the opposite shoreline prior to system testing. The detection probability of the PIT tag detection system was evaluated using yearling coho salmon implanted with a PIT tag and a radio transmitter and then released into the Klamath River slightly downstream of Iron Gate Dam. Cormack-Jolly-Seber capture-recapture methods were used to estimate the detection probability of the PIT tag detection system based on detections of PIT tags there and detections of radio transmitters at radio-telemetry detection systems downstream. One of the 43 PIT- and radio-tagged fish released was detected by the PIT tag detection system and 23 were detected by the radio-telemetry detection systems. The estimated detection probability of the PIT tag detection system was 0.043 (standard error 0.042). Eight PIT-tagged fish from other studies also were detected. Detections at the PIT tag detection system were at the two pass-through antennas and the pass-by antenna adjacent to them. Above average river discharge likely was a factor in the low detection probability of the PIT tag detection system. High discharges dislodged two power cables leaving 12 meters of the river width unsampled for PIT detections and resulted in water depths greater than the read distance of the antennas, which allowed fish to pass over much of the system with little chance of being detected. Improvements in detection probability may be expected under river discharge conditions where water depth over the antennas is within maximum read distance of the antennas. 
Improvements also may be expected if additional arrays of antennas are used.

  10. Environmental DNA (eDNA) Detection Probability Is Influenced by Seasonal Activity of Organisms.

    PubMed

    de Souza, Lesley S; Godwin, James C; Renshaw, Mark A; Larson, Eric

    2016-01-01

    Environmental DNA (eDNA) holds great promise for conservation applications such as monitoring invasive or imperiled species, yet this emerging technique requires ongoing testing to determine the contexts in which it is effective. For example, little research to date has evaluated how the seasonality of organism behavior or activity may influence the detection probability of eDNA. We applied eDNA to survey for two highly imperiled species endemic to the upper Black Warrior River basin in Alabama, US: the Black Warrior Waterdog (Necturus alabamensis) and the Flattened Musk Turtle (Sternotherus depressus). Importantly, these species have contrasting patterns of seasonal activity, with N. alabamensis more active in the cool season (October-April) and S. depressus more active in the warm season (May-September). We surveyed sites historically occupied by these species across cool and warm seasons over two years with replicated eDNA water samples, which were analyzed in the laboratory using species-specific quantitative PCR (qPCR) assays. We then used occupancy estimation with detection probability modeling to evaluate both the effects of landscape attributes on organism presence and the effect of sampling season on the detection probability of eDNA. Importantly, we found that season strongly affected eDNA detection probability for both species, with N. alabamensis having higher eDNA detection probabilities during the cool season and S. depressus having higher eDNA detection probabilities during the warm season. These results illustrate the influence of organismal behavior or activity on eDNA detection in the environment and identify an important role for basic natural history in designing eDNA monitoring programs.

  11. Environmental DNA (eDNA) Detection Probability Is Influenced by Seasonal Activity of Organisms

    PubMed Central

    de Souza, Lesley S.; Godwin, James C.; Renshaw, Mark A.; Larson, Eric

    2016-01-01

    Environmental DNA (eDNA) holds great promise for conservation applications such as monitoring invasive or imperiled species, yet this emerging technique requires ongoing testing to determine the contexts in which it is effective. For example, little research to date has evaluated how the seasonality of organism behavior or activity may influence the detection probability of eDNA. We applied eDNA to survey for two highly imperiled species endemic to the upper Black Warrior River basin in Alabama, US: the Black Warrior Waterdog (Necturus alabamensis) and the Flattened Musk Turtle (Sternotherus depressus). Importantly, these species have contrasting patterns of seasonal activity, with N. alabamensis more active in the cool season (October-April) and S. depressus more active in the warm season (May-September). We surveyed sites historically occupied by these species across cool and warm seasons over two years with replicated eDNA water samples, which were analyzed in the laboratory using species-specific quantitative PCR (qPCR) assays. We then used occupancy estimation with detection probability modeling to evaluate both the effects of landscape attributes on organism presence and the effect of sampling season on the detection probability of eDNA. Importantly, we found that season strongly affected eDNA detection probability for both species, with N. alabamensis having higher eDNA detection probabilities during the cool season and S. depressus having higher eDNA detection probabilities during the warm season. These results illustrate the influence of organismal behavior or activity on eDNA detection in the environment and identify an important role for basic natural history in designing eDNA monitoring programs. PMID:27776150

  12. Factors affecting detectability of river otters during sign surveys

    USGS Publications Warehouse

    Jeffress, Mackenzie R.; Paukert, Craig P.; Sandercock, Brett K.; Gipson, Philip S.

    2011-01-01

    Sign surveys are commonly used to study and monitor wildlife species but may be flawed when surveys are conducted only once and cover short distances, which can leave false absences unaccounted for. Multiple observers surveyed for river otter (Lontra canadensis) scat and tracks along stream and reservoir shorelines at 110 randomly selected sites in eastern Kansas from January to April of 2008 and 2009 to determine whether detection probability differed among substrates, sign types, observers, survey lengths, and proximity to access points. We estimated detection probabilities (p) of river otters using occupancy models in Program PRESENCE. Mean detection probability for a 400-m survey was highest in mud substrates (p = 0.60) and lowest in snow (p = 0.18) and leaf litter substrates (p = 0.27). Scat had a higher detection probability (p = 0.53) than tracks (p = 0.18), and experienced observers had higher detection probabilities (p < 0.71) than novice observers (p < 0.55). Detection probabilities increased almost 3-fold as survey length increased from 200 m to 1,000 m, and otter sign was not concentrated near access points. After accounting for imperfect detection, our estimates of otter site occupancy based on a 400-m survey increased >3-fold, providing further evidence of the potential negative bias that can occur in estimates from sign surveys when imperfect detection is not addressed. Our study identifies areas for improvement in sign survey methodologies, and our results are applicable to sign surveys commonly used for many species across a range of habitats.

  13. Direct sampling of chemical weapons in water by photoionization mass spectrometry.

    PubMed

    Syage, Jack A; Cai, Sheng-Suan; Li, Jianwei; Evans, Matthew D

    2006-05-01

    The vulnerability of water supplies to toxic contamination calls for fast and effective means of screening water samples for multiple threats. We describe the use of photoionization (PI) mass spectrometry (MS) for high-speed, high-throughput screening and molecular identification of chemical weapons (CW) threats and other hazardous compounds. The screening technology can detect a wide range of compounds at subacute concentrations with no sample preparation and a sampling cycle time of approximately 45 s. The technology was tested with CW agents VX, GA, GB, GD, GF, HD, HN1, and HN3, in addition to riot agents and precursors. All are sensitively detected and give simple PI mass spectra dominated by the parent ion. The target application of the PI MS method is as a routine, real-time early-warning system for CW agents and other hazardous compounds in air and in water. In this work, we also present comprehensive measurements for water analysis and report the system's detection limits, linearity, quantitation accuracy, and false positive (FP) and false negative rates at subacute concentration levels. The latter data are presented as receiver operating characteristic curves of detection probability P(D) versus FP probability P(FP). These measurements were made using the CW surrogate compounds DMMP, DEMP, DEEP, and DIMP. Method detection limits (3σ) obtained using a capillary injection method yielded 1, 6, 3, and 2 ng/mL, respectively. These results were obtained using 1-μL injections of water samples without any preparation, corresponding to mass detection limits of 1, 6, 3, and 2 pg, respectively. The linear range was about 3-4 decades and the dynamic range about 4-5 decades. The relative standard deviations were generally <10% at CW subacute concentration levels.
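
    A receiver operating characteristic of the form P(D) versus P(FP) is traced by sweeping a decision threshold over detector scores. The sketch below is generic (the function name and the score lists are hypothetical, not the paper's data):

    ```python
    def roc_points(signal_scores, noise_scores):
        """Empirical ROC: sweep a threshold over every observed score and
        return (P_FP, P_D) pairs - false-positive probability on the x-axis,
        detection probability on the y-axis."""
        thresholds = sorted(set(signal_scores) | set(noise_scores), reverse=True)
        points = [(0.0, 0.0)]                       # infinitely strict threshold
        for t in thresholds:
            p_d = sum(s >= t for s in signal_scores) / len(signal_scores)
            p_fp = sum(s >= t for s in noise_scores) / len(noise_scores)
            points.append((p_fp, p_d))
        return points
    ```

    For well-separated score distributions the curve passes through (0, 1), i.e. full detection with no false positives at some threshold; overlapping distributions pull the curve toward the diagonal.
    
    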

  14. Fusion of multiple quadratic penalty function support vector machines (QPFSVM) for automated sea mine detection and classification

    NASA Astrophysics Data System (ADS)

    Dobeck, Gerald J.; Cobb, J. Tory

    2002-08-01

    The high-resolution sonar is one of the principal sensors used by the Navy to detect and classify sea mines in minehunting operations. For such sonar systems, substantial effort has been devoted to the development of automated detection and classification (D/C) algorithms. These have been spurred by several factors, including (1) aids for operators to reduce work overload, (2) better use of all available data, and (3) the introduction of unmanned minehunting systems. The environments where sea mines are typically laid (harbor areas, shipping lanes, and the littorals) give rise to many false alarms caused by natural, biologic, and man-made clutter. The objective of the automated D/C algorithms is to eliminate most of these false alarms while still maintaining a very high probability of mine detection and classification (PdPc). In recent years, the benefits of fusing the outputs of multiple D/C algorithms have been studied; we refer to this as Algorithm Fusion. The results have been remarkable, including reliable robustness to new environments. This paper introduces the Quadratic Penalty Function Support Vector Machine (QPFSVM) algorithm to aid in the automated detection and classification of sea mines. The QPFSVM algorithm is easy to train, simple to implement, and robust to feature space dimension. Outputs of successive SVM algorithms are cascaded in stages (fused) to improve the Probability of Classification (Pc) and reduce the number of false alarms. Even though our experience has been gained in the area of sea mine detection and classification, the principles described herein are general and can be applied to any D/C fusion problem (e.g., automated medical diagnosis or automatic target recognition for ballistic missile defense).
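    The cascading idea described above can be illustrated independently of the SVM details: each stage re-examines only the detections passed by the previous stage, so false alarms fall stage by stage while true detections are largely retained. A toy sketch with made-up score functions standing in for trained QPFSVM stages (the features and thresholds below are illustrative assumptions, not the paper's):

```python
def cascade(detections, stages):
    """Pass candidate detections through successive stage classifiers.

    `detections` is a list of feature dicts; `stages` is a list of
    (score_fn, threshold) pairs. A candidate survives only if every
    stage scores it above that stage's threshold.
    """
    survivors = detections
    for score_fn, threshold in stages:
        survivors = [d for d in survivors if score_fn(d) > threshold]
    return survivors

# Hypothetical candidates: mines tend to score high on both cues.
candidates = [
    {"shape": 0.9, "shadow": 0.8, "is_mine": True},
    {"shape": 0.7, "shadow": 0.9, "is_mine": True},
    {"shape": 0.8, "shadow": 0.2, "is_mine": False},  # clutter passed by stage 1
    {"shape": 0.3, "shadow": 0.7, "is_mine": False},  # clutter rejected by stage 1
]
stages = [(lambda d: d["shape"], 0.5), (lambda d: d["shadow"], 0.5)]
kept = cascade(candidates, stages)
```

    Here both clutter objects are rejected by one stage or the other while both mines survive, which is the behavior the fusion architecture is designed to achieve.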

  15. Detection of bulk explosives using the GPR only portion of the HSTAMIDS system

    NASA Astrophysics Data System (ADS)

    Tabony, Joshua; Carlson, Douglas O.; Duvoisin, Herbert A., III; Torres-Rosario, Juan

    2010-04-01

    The legacy AN/PSS-14 (Army-Navy Portable Special Search-14) Handheld Mine Detecting Set (also called HSTAMIDS for Handheld Standoff Mine Detection System) has proven itself over the last 7 years as the state-of-the-art in land mine detection, both for the US Army and for Humanitarian Demining groups. Its dual GPR (Ground Penetrating Radar) and MD (Metal Detection) sensor has provided receiver operating characteristic curves (probability of detection or Pd versus false alarm rate or FAR) that routinely set the mark for such devices. Since its inception and type-classification in 2003 as the US (United States) Army standard, the desire for use of the AN/PSS-14 against alternate threats - such as bulk explosives - has recently become paramount. To this end, L-3 CyTerra has developed and tested bulk explosive detection and discrimination algorithms using only the Stepped Frequency Continuous Wave (SFCW) Ground Penetrating Radar (GPR) portion of the system, versus the fused version that is used to optimally detect land mines. Performance of the new bulk explosive algorithm against representative zero-metal bulk explosive target and clutter emplacements is depicted, with the utility to the operator also described.

  16. Observation of > 5 wt % zinc at the Kimberley outcrop, Gale crater, Mars

    DOE PAGES

    Lasue, J.; Clegg, Samuel M.; Forni, O.; ...

    2016-03-12

    Zinc-enriched targets have been detected at the Kimberley formation, Gale crater, Mars, using the Chemistry Camera (ChemCam) instrument. The Zn content is analyzed with a univariate calibration based on the 481.2 nm emission line. The limit of quantification for ZnO is 3 wt % (at 95% confidence level) and 1 wt % (at 68% confidence level). The limit of detection is shown to be around 0.5 wt %. As of sol 950, 12 targets on Mars present high ZnO content ranging from 1.0 wt % to 8.4 wt % (Yarrada, sol 628). Those Zn-enriched targets are almost entirely located at the Dillinger member of the Kimberley formation, where high Mn and alkali contents were also detected, probably in different phases. Zn enrichment does not depend on the textures of the rocks (coarse-grained sandstones, pebbly conglomerates, and resistant fins). The lack of sulfur enhancement suggests that Zn is not present in the sphalerite phase. Zn appears somewhat correlated with Na2O and the ChemCam hydration index, suggesting that it could be in an amorphous clay phase (such as sauconite). On Earth, such an enrichment would be consistent with a supergene alteration of a sphalerite gossan cap in a primary siliciclastic bedrock or a possible hypogene nonsulfide zinc deposition where Zn, Fe, Mn would have been transported in a reduced sulfur-poor fluid and precipitated rapidly in the form of oxides.

  17. Observation of > 5 wt % zinc at the Kimberley outcrop, Gale crater, Mars

    NASA Astrophysics Data System (ADS)

    Lasue, J.; Clegg, S. M.; Forni, O.; Cousin, A.; Wiens, R. C.; Lanza, N.; Mangold, N.; Le Deit, L.; Gasnault, O.; Maurice, S.; Berger, J. A.; Stack, K.; Blaney, D.; Fabre, C.; Goetz, W.; Johnson, J.; Le Mouélic, S.; Nachon, M.; Payré, V.; Rapin, W.; Sumner, D. Y.

    2016-03-01

    Zinc-enriched targets have been detected at the Kimberley formation, Gale crater, Mars, using the Chemistry Camera (ChemCam) instrument. The Zn content is analyzed with a univariate calibration based on the 481.2 nm emission line. The limit of quantification for ZnO is 3 wt % (at 95% confidence level) and 1 wt % (at 68% confidence level). The limit of detection is shown to be around 0.5 wt %. As of sol 950, 12 targets on Mars present high ZnO content ranging from 1.0 wt % to 8.4 wt % (Yarrada, sol 628). Those Zn-enriched targets are almost entirely located at the Dillinger member of the Kimberley formation, where high Mn and alkali contents were also detected, probably in different phases. Zn enrichment does not depend on the textures of the rocks (coarse-grained sandstones, pebbly conglomerates, and resistant fins). The lack of sulfur enhancement suggests that Zn is not present in the sphalerite phase. Zn appears somewhat correlated with Na2O and the ChemCam hydration index, suggesting that it could be in an amorphous clay phase (such as sauconite). On Earth, such an enrichment would be consistent with a supergene alteration of a sphalerite gossan cap in a primary siliciclastic bedrock or a possible hypogene nonsulfide zinc deposition where Zn, Fe, Mn would have been transported in a reduced sulfur-poor fluid and precipitated rapidly in the form of oxides.

  18. Observation of > 5 wt % zinc at the Kimberley outcrop, Gale crater, Mars

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lasue, J.; Clegg, Samuel M.; Forni, O.

    Zinc-enriched targets have been detected at the Kimberley formation, Gale crater, Mars, using the Chemistry Camera (ChemCam) instrument. The Zn content is analyzed with a univariate calibration based on the 481.2 nm emission line. The limit of quantification for ZnO is 3 wt % (at 95% confidence level) and 1 wt % (at 68% confidence level). The limit of detection is shown to be around 0.5 wt %. As of sol 950, 12 targets on Mars present high ZnO content ranging from 1.0 wt % to 8.4 wt % (Yarrada, sol 628). Those Zn-enriched targets are almost entirely located at the Dillinger member of the Kimberley formation, where high Mn and alkali contents were also detected, probably in different phases. Zn enrichment does not depend on the textures of the rocks (coarse-grained sandstones, pebbly conglomerates, and resistant fins). The lack of sulfur enhancement suggests that Zn is not present in the sphalerite phase. Zn appears somewhat correlated with Na2O and the ChemCam hydration index, suggesting that it could be in an amorphous clay phase (such as sauconite). On Earth, such an enrichment would be consistent with a supergene alteration of a sphalerite gossan cap in a primary siliciclastic bedrock or a possible hypogene nonsulfide zinc deposition where Zn, Fe, Mn would have been transported in a reduced sulfur-poor fluid and precipitated rapidly in the form of oxides.

  19. Presence-nonpresence surveys of golden-cheeked warblers: detection, occupancy and survey effort

    USGS Publications Warehouse

    Watson, C.A.; Weckerly, F.W.; Hatfield, J.S.; Farquhar, C.C.; Williamson, P.S.

    2008-01-01

    Surveys to detect the presence or absence of endangered species may not consistently cover an area, account for imperfect detection or consider that detection and species presence at sample units may change within a survey season. We evaluated a detection-nondetection survey method for the federally endangered golden-cheeked warbler (GCWA) Dendroica chrysoparia. Three study areas were selected across the breeding range of GCWA in central Texas. Within each area, 28-36 detection stations were placed 200 m apart. Each detection station was surveyed nine times during the breeding season in 2 consecutive years. Surveyors remained up to 8 min at each detection station recording GCWA detected by sight or sound. To assess the potential influence of environmental covariates (e.g. slope, aspect, canopy cover, study area) on detection and occupancy and possible changes in occupancy and detection probabilities within breeding seasons, 30 models were analyzed. Using information-theoretic model selection procedures, we found that detection probabilities and occupancy varied among study areas and within breeding seasons. Detection probabilities ranged from 0.20 to 0.80 and occupancy ranged from 0.56 to 0.95. Because study areas with high detection probabilities had high occupancy, a conservative survey effort (erred towards too much surveying) was estimated using the lowest detection probability. We determined that nine surveys of 35 stations were needed to have estimates of occupancy with coefficients of variation <20%. Our survey evaluation evidently captured the key environmental variable that influenced bird detection (GCWA density) and accommodated the changes in GCWA distribution throughout the breeding season.
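    The survey-effort logic above (sizing the number of visits to the lowest per-survey detection probability) can be sketched with the standard occupancy relation: if the species is present and each survey detects it independently with probability p, the chance of at least one detection in n surveys is 1 - (1 - p)^n. A small sketch; the 0.95 confidence target below is an illustrative assumption, not a figure from the study:

```python
import math

def cumulative_detection(p, n):
    """P(at least one detection in n surveys | present), per-survey prob p."""
    return 1.0 - (1.0 - p) ** n

def surveys_needed(p, confidence=0.95):
    """Smallest n whose cumulative detection probability reaches `confidence`."""
    return math.ceil(math.log(1.0 - confidence) / math.log(1.0 - p))

# Using the lowest per-survey detection probability reported (0.20):
n = surveys_needed(0.20, confidence=0.95)
```

    Under this independence assumption, p = 0.20 requires 14 surveys to reach 95% cumulative detection, which shows why the conservative design keys on the worst-case detection probability.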

  20. Detection of cat-eye effect echo based on unit APD

    NASA Astrophysics Data System (ADS)

    Wu, Dong-Sheng; Zhang, Peng; Hu, Wen-Gang; Ying, Jia-Ju; Liu, Jie

    2016-10-01

    The cat-eye effect echo of an optical system can be detected with a CCD, but the detection range is then limited to several kilometers. To achieve long-range or even ultra-long-range detection, an APD should be selected as the detector because of its high sensitivity. In this paper, a detection system for the cat-eye effect echo based on a unit APD is designed, and its implementation scheme and key technologies are presented. The detection performance of the system, including detection range, detection probability, and false-alarm probability, is modeled. Based on the model, the performance of the system is analyzed using typical parameters. Numerical calculation shows that within a 20 km detection range the echo signal-to-noise ratio is greater than six, the detection probability is greater than 99.9%, and the false-alarm probability is less than 0.1%. To verify the detection effect, we built an experimental platform according to the design scheme and carried out field experiments. The experimental results agree well with the numerical calculations, proving that a detection system based on a unit APD is feasible for remote detection of the cat-eye effect echo.
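    The reported relation between echo signal-to-noise ratio, detection probability, and false-alarm probability can be illustrated with the textbook Gaussian threshold-detector model (an assumption here; the paper's own detection model may differ): for unit-variance noise, a threshold τ gives Pfa = Q(τ), and a signal of amplitude equal to the SNR gives Pd = Q(τ - SNR), where Q is the Gaussian tail function.

```python
import math

def q(x):
    """Gaussian tail probability Q(x) = P(N(0,1) > x)."""
    return 0.5 * math.erfc(x / math.sqrt(2.0))

def q_inv(p, lo=-10.0, hi=10.0):
    """Invert Q by bisection (Q is strictly decreasing)."""
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if q(mid) > p:
            lo = mid   # tail too heavy: threshold must move right
        else:
            hi = mid
    return 0.5 * (lo + hi)

def detection_probability(snr, pfa):
    """Pd of a Gaussian threshold detector operated at the given Pfa."""
    tau = q_inv(pfa)
    return q(tau - snr)

pd = detection_probability(snr=6.0, pfa=1e-3)  # SNR of six at Pfa = 0.1%
```

    Under these assumptions an SNR of 6 yields Pd ≈ 0.998 at Pfa = 10⁻³, the same regime as the figures quoted in the abstract; the exact numbers depend on the detector model used.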

  1. Understanding environmental DNA detection probabilities: A case study using a stream-dwelling char Salvelinus fontinalis

    USGS Publications Warehouse

    Wilcox, Taylor M; Mckelvey, Kevin S.; Young, Michael K.; Sepulveda, Adam; Shepard, Bradley B.; Jane, Stephen F; Whiteley, Andrew R.; Lowe, Winsor H.; Schwartz, Michael K.

    2016-01-01

    Environmental DNA sampling (eDNA) has emerged as a powerful tool for detecting aquatic animals. Previous research suggests that eDNA methods are substantially more sensitive than traditional sampling. However, the factors influencing eDNA detection and the resulting sampling costs are still not well understood. Here we use multiple experiments to derive independent estimates of eDNA production rates and downstream persistence from brook trout (Salvelinus fontinalis) in streams. We use these estimates to parameterize models comparing the false negative detection rates of eDNA sampling and traditional backpack electrofishing. We find that, using the protocols in this study, eDNA had reasonable detection probabilities at extremely low animal densities (e.g., probability of detection 0.18 at densities of one fish per stream kilometer) and very high detection probabilities at population-level densities (e.g., probability of detection > 0.99 at densities of ≥ 3 fish per 100 m). This is substantially more sensitive than traditional electrofishing for determining the presence of brook trout and may translate into important cost savings when animals are rare. Our findings are consistent with a growing body of literature showing that eDNA sampling is a powerful tool for the detection of aquatic species, particularly those that are rare and difficult to sample using traditional methods.

  2. Contaminants in fish tissue from US lakes and reservoirs: A ...

    EPA Pesticide Factsheets

    An unequal probability design was used to develop national estimates for 268 persistent, bioaccumulative, and toxic chemicals in fish tissue from lakes and reservoirs of the conterminous United States (excluding the Laurentian Great Lakes and Great Salt Lake). Predator (fillet) and bottom-dweller (whole-body) composites were collected from 500 lakes selected randomly from the target population of 147,343 lakes in the lower 48 states. Each of these composite types comprised nationally representative samples whose results were extrapolated to the sampled population of an estimated 76,559 lakes for predators and 46,190 lakes for bottom dwellers. Mercury and PCBs were detected in all fish samples. Dioxins and furans were detected in 81% and 99% of predator and bottom-dweller samples, respectively. Cumulative frequency distributions showed that mercury concentrations exceeded the EPA 300 ppb mercury fish tissue criterion at nearly half of the lakes in the sampled population. Total PCB concentrations exceeded a 12 ppb human health risk-based consumption limit at nearly 17% of lakes, and dioxins and furans exceeded a 0.15 ppt (toxic equivalent or TEQ) risk-based threshold at nearly 8% of lakes in the sampled population. In contrast, 43 target chemicals were not detected in any samples. No detections were reported for nine organophosphate pesticides, one PCB congener, 16 polycyclic aromatic hydrocarbons, or 17 other semivolatile organic chemicals.

  3. People Like Logical Truth: Testing the Intuitive Detection of Logical Value in Basic Propositions.

    PubMed

    Nakamura, Hiroko; Kawaguchi, Jun

    2016-01-01

    Recent studies on logical reasoning have suggested that people are intuitively aware of the logical validity of syllogisms or that they intuitively detect conflict between heuristic responses and logical norms via slight changes in their feelings. According to logical intuition studies, logically valid or heuristic logic no-conflict reasoning is fluently processed and induces positive feelings without conscious awareness. One criticism states that such effects of logicality disappear when confounding factors such as the content of syllogisms are controlled. The present study used abstract propositions and tested whether people intuitively detect logical value. Experiment 1 presented four logical propositions (conjunctive, biconditional, conditional, and material implications) regarding a target case and asked the participants to rate the extent to which they liked the statement. Experiment 2 tested the effects of matching bias, as well as intuitive logic, on the reasoners' feelings by manipulating whether the antecedent or consequent (or both) of the conditional was affirmed or negated. The results showed that both logicality and matching bias affected the reasoners' feelings, and people preferred logically true targets over logically false ones for all forms of propositions. These results suggest that people intuitively detect what is true from what is false during abstract reasoning. Additionally, a Bayesian mixed model meta-analysis of conditionals indicated that people's intuitive interpretation of the conditional "if p then q" fits better with the conditional probability, q given p.

  4. Contaminants in fish tissue from US lakes and reservoirs: a national probabilistic study.

    PubMed

    Stahl, Leanne L; Snyder, Blaine D; Olsen, Anthony R; Pitt, Jennifer L

    2009-03-01

    An unequal probability design was used to develop national estimates for 268 persistent, bioaccumulative, and toxic chemicals in fish tissue from lakes and reservoirs of the conterminous United States (excluding the Laurentian Great Lakes and Great Salt Lake). Predator (fillet) and bottom-dweller (whole body) composites were collected from 500 lakes selected randomly from the target population of 147,343 lakes in the lower 48 states. Each of these composite types comprised nationally representative samples whose results were extrapolated to the sampled population of an estimated 76,559 lakes for predators and 46,190 lakes for bottom dwellers. Mercury and PCBs were detected in all fish samples. Dioxins and furans were detected in 81% and 99% of predator and bottom-dweller samples, respectively. Cumulative frequency distributions showed that mercury concentrations exceeded the EPA 300 ppb mercury fish tissue criterion at nearly half of the lakes in the sampled population. Total PCB concentrations exceeded a 12 ppb human health risk-based consumption limit at nearly 17% of lakes, and dioxins and furans exceeded a 0.15 ppt (toxic equivalent or TEQ) risk-based threshold at nearly 8% of lakes in the sampled population. In contrast, 43 target chemicals were not detected in any samples. No detections were reported for nine organophosphate pesticides, one PCB congener, 16 polycyclic aromatic hydrocarbons, or 17 other semivolatile organic chemicals.
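    Extrapolating from the 500 sampled lakes to the full sampled population relies on design weights from the unequal probability sample: each sampled lake stands in for 1/π lakes, where π is its inclusion probability. A minimal Horvitz-Thompson style sketch with hypothetical weights and mercury values (the survey's actual weights are not given in this record):

```python
def estimated_exceeding(weights, values, threshold):
    """Design-weighted estimate of how many population units exceed
    `threshold` (Horvitz-Thompson style: weight_i = 1 / inclusion_prob_i)."""
    return sum(w for w, v in zip(weights, values) if v > threshold)

# Hypothetical: 4 sampled lakes, each representing w lakes in the population.
weights = [100.0, 250.0, 50.0, 400.0]
hg_ppb = [350.0, 120.0, 410.0, 290.0]  # fillet mercury concentrations (ppb)
n_exceeding = estimated_exceeding(weights, hg_ppb, threshold=300.0)  # EPA criterion
```

    Dividing the weighted count of exceedances by the weighted population total gives the kind of population-level fraction reported in the abstract (e.g., "nearly half of the lakes").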

  5. Altering spatial priority maps via statistical learning of target selection and distractor filtering.

    PubMed

    Ferrante, Oscar; Patacca, Alessia; Di Caro, Valeria; Della Libera, Chiara; Santandrea, Elisa; Chelazzi, Leonardo

    2018-05-01

    The cognitive system has the capacity to learn and make use of environmental regularities - known as statistical learning (SL), including for the implicit guidance of attention. For instance, it is known that attentional selection is biased according to the spatial probability of targets; similarly, changes in distractor filtering can be triggered by the unequal spatial distribution of distractors. Open questions remain regarding the cognitive/neuronal mechanisms underlying SL of target selection and distractor filtering. Crucially, it is unclear whether the two processes rely on shared neuronal machinery, with unavoidable cross-talk, or they are fully independent, an issue that we directly addressed here. In a series of visual search experiments, participants had to discriminate a target stimulus, while ignoring a task-irrelevant salient distractor (when present). We systematically manipulated spatial probabilities of either one or the other stimulus, or both. We then measured performance to evaluate the direct effects of the applied contingent probability distribution (e.g., effects on target selection of the spatial imbalance in target occurrence across locations) as well as its indirect or "transfer" effects (e.g., effects of the same spatial imbalance on distractor filtering across locations). By this approach, we confirmed that SL of both target and distractor location implicitly bias attention. Most importantly, we described substantial indirect effects, with the unequal spatial probability of the target affecting filtering efficiency and, vice versa, the unequal spatial probability of the distractor affecting target selection efficiency across locations. The observed cross-talk demonstrates that SL of target selection and distractor filtering are instantiated via (at least partly) shared neuronal machinery, as further corroborated by strong correlations between direct and indirect effects at the level of individual participants. 
Our findings are compatible with the notion that both kinds of SL adjust the priority of specific locations within attentional priority maps of space. Copyright © 2017 Elsevier Ltd. All rights reserved.

  6. Detection of small surface vessels in near, medium, and far infrared spectral bands

    NASA Astrophysics Data System (ADS)

    Dulski, R.; Milewski, S.; Kastek, M.; Trzaskawka, P.; Szustakowski, M.; Ciurapinski, W.; Zyczkowski, M.

    2011-11-01

    Protection of naval bases and harbors requires close co-operation between security and access control systems covering land areas and those monitoring sea approach routes. The typical location of naval bases and harbors - usually next to a large city - makes it difficult to detect and identify a threat in the dense regular traffic of various sea vessels (i.e. merchant ships, fishing boats, tourist ships). Due to the properties of vessel control systems, such as AIS (Automatic Identification System), and the effectiveness of radar and optoelectronic systems against different targets, it seems that fast motor boats called RIBs (Rigid Inflatable Boats) could be the most serious threat to ships and harbor infrastructure. In this paper, the process and conditions for the detection and identification of high-speed boats in the areas of ports and naval bases in the near, medium and far infrared are presented. Based on the results of measurements and recorded thermal images, the actual temperature contrast ΔT (RIB/sea) will be determined, which will further allow us to specify the theoretical ranges of detection and identification of RIB-type targets for an operating security system. The data will also help to determine the possible advantages of image fusion where the component images are taken in different spectral ranges. This will increase the probability of identifying the object by the multi-sensor security system equipped additionally with the appropriate algorithms for detecting, tracking and performing the fusion of images from the visible and infrared cameras.

  7. Cluster State Quantum Computing

    DTIC Science & Technology

    2012-12-01

    probability that the desired target gate A_Tar has been faithfully implemented on the computational modes given a successful measurement of the ancilla ... modes: [equation (3) illegible in source], since Tr(A_Tar† A_Tar) = 2^Mc for a properly normalized target gate. As we are interested ... optimization method we have developed maximizes the success probability S for a given target transformation A_Tar, for given ancilla resources, and for a

  8. Statistics provide guidance for indigenous organic carbon detection on Mars missions.

    PubMed

    Sephton, Mark A; Carter, Jonathan N

    2014-08-01

    Data from the Viking and Mars Science Laboratory missions indicate the presence of organic compounds that are not definitively martian in origin. Both contamination and confounding mineralogies have been suggested as alternatives to indigenous organic carbon. Intuitive thought suggests that we are repeatedly obtaining data that confirms the same level of uncertainty. Bayesian statistics may suggest otherwise. If an organic detection method has a true positive to false positive ratio greater than one, then repeated organic matter detection progressively increases the probability of indigeneity. Bayesian statistics also reveal that methods with higher ratios of true positives to false positives give higher overall probabilities and that detection of organic matter in a sample with a higher prior probability of indigenous organic carbon produces greater confidence. Bayesian statistics, therefore, provide guidance for the planning and operation of organic carbon detection activities on Mars. Suggestions for future organic carbon detection missions and instruments are as follows: (i) On Earth, instruments should be tested with analog samples of known organic content to determine their true positive to false positive ratios. (ii) On the mission, for an instrument with a true positive to false positive ratio above one, it should be recognized that each positive detection of organic carbon will result in a progressive increase in the probability of indigenous organic carbon being present; repeated measurements, therefore, can overcome some of the deficiencies of a less-than-definitive test. (iii) For a fixed number of analyses, the highest true positive to false positive ratio method or instrument will provide the greatest probability that indigenous organic carbon is present. 
(iv) On Mars, analyses should concentrate on samples with highest prior probability of indigenous organic carbon; intuitive desires to contrast samples of high prior probability and low prior probability of indigenous organic carbon should be resisted.
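    The progressive gain in confidence from repeated detections described above follows directly from odds-form Bayes updating: each positive result multiplies the prior odds by the true-positive to false-positive ratio. A sketch with illustrative rates (the 0.8/0.2 values are assumptions for the example, not figures from the paper):

```python
def posterior_after_detections(prior, tp_rate, fp_rate, n_positive):
    """Posterior P(indigenous organic carbon | n positive detections),
    assuming conditionally independent tests of fixed TP/FP rates."""
    odds = prior / (1.0 - prior)
    odds *= (tp_rate / fp_rate) ** n_positive  # one likelihood-ratio factor per detection
    return odds / (1.0 + odds)

# TP/FP ratio of 4 (0.8 vs 0.2), starting from an even prior:
p3 = posterior_after_detections(prior=0.5, tp_rate=0.8, fp_rate=0.2, n_positive=3)
```

    Three repeated detections lift the posterior from 0.5 to 64/65 ≈ 0.985, which is the paper's point (ii) in miniature; a method with ratio 1 would leave the posterior unchanged no matter how many detections accumulated, illustrating point (iii).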

  9. Multiple functional units in the preattentive segmentation of speech in Japanese: evidence from word illusions.

    PubMed

    Nakamura, Miyoko; Kolinsky, Régine

    2014-12-01

    We explored the functional units of speech segmentation in Japanese using dichotic presentation and a detection task requiring no intentional sublexical analysis. Indeed, illusory perception of a target word might result from preattentive migration of phonemes, morae, or syllables from one ear to the other. In Experiment 1, Japanese listeners detected targets presented in hiragana and/or kanji. Phoneme migrations did occur, suggesting that orthography-independent sublexical constituents play some role in segmentation. However, syllable and especially mora migrations were more numerous. This pattern of results was not observed in French speakers (Experiment 2), suggesting that it reflects native segmentation in Japanese. To control for the intervention of kanji representations (many words are written in kanji, and one kanji often corresponds to one syllable), in Experiment 3, Japanese listeners were presented with target loanwords that can be written only in katakana. Again, phoneme migrations occurred, while the first mora and syllable led to similar rates of illusory percepts. No migration occurred for the second, "special" mora (/J/ or /N/), probably because this constitutes the latter part of a heavy syllable. Overall, these findings suggest that multiple units, such as morae, syllables, and even phonemes, function independently of orthographic knowledge in Japanese preattentive speech segmentation.

  10. Targeting the probability versus cost of feared outcomes in public speaking anxiety.

    PubMed

    Nelson, Elizabeth A; Deacon, Brett J; Lickel, James J; Sy, Jennifer T

    2010-04-01

    Cognitive-behavioral theory suggests that social phobia is maintained, in part, by overestimates of the probability and cost of negative social events. Indeed, empirically supported cognitive-behavioral treatments directly target these cognitive biases through the use of in vivo exposure or behavioral experiments. While cognitive-behavioral theories and treatment protocols emphasize the importance of targeting probability and cost biases in the reduction of social anxiety, few studies have examined specific techniques for reducing probability and cost bias, and thus the relative efficacy of exposure to the probability versus cost of negative social events is unknown. In the present study, 37 undergraduates with high public speaking anxiety were randomly assigned to a single-session intervention designed to reduce either the perceived probability or the perceived cost of negative outcomes associated with public speaking. Compared to participants in the probability treatment condition, those in the cost treatment condition demonstrated significantly greater improvement on measures of public speaking anxiety and cost estimates for negative social events. The superior efficacy of the cost treatment condition was mediated by greater treatment-related changes in social cost estimates. The clinical implications of these findings are discussed. Published by Elsevier Ltd.

  11. The preference of probability over negative values in action selection.

    PubMed

    Neyedli, Heather F; Welsh, Timothy N

    2015-01-01

    It has previously been found that when participants are presented with a pair of motor prospects, they can select the prospect with the largest maximum expected gain (MEG). Many of those decisions, however, were trivial because of large differences in MEG between the prospects. The purpose of the present study was to explore participants' preferences when making non-trivial decisions between two motor prospects. Participants were presented with pairs of prospects that: 1) differed in MEG with either only the values or only the probabilities differing between the prospects; and 2) had similar MEG with one prospect having a larger probability of hitting the target and a higher penalty value and the other prospect a smaller probability of hitting the target but a lower penalty value. In different experiments, participants either had 400 ms or 2000 ms to decide between the prospects. It was found that participants chose the configuration with the larger MEG more often when the probability varied between prospects than when the value varied. In pairs with similar MEGs, participants preferred a larger probability of hitting the target over a smaller penalty value. These results indicate that participants prefer probability information over negative value information in a motor selection task.
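    The maximum-expected-gain quantity used in this abstract combines hit probability, target value, and penalty value. A sketch comparing two prospects by MEG, with hypothetical numbers chosen so the prospects are nearly matched, as in the study's non-trivial pairs:

```python
def expected_gain(p_hit, gain, p_penalty, penalty):
    """Expected gain of a motor prospect: reward earned with probability
    p_hit minus a penalty (penalty >= 0) incurred with probability p_penalty."""
    return p_hit * gain - p_penalty * penalty

# Two nearly matched prospects: higher hit probability with a larger penalty
# vs. lower hit probability with a smaller penalty.
a = expected_gain(p_hit=0.8, gain=100, p_penalty=0.2, penalty=110)
b = expected_gain(p_hit=0.6, gain=100, p_penalty=0.2, penalty=20)
preferred = "A" if a > b else "B"
```

    When the MEG difference is this small, the study's finding is that participants lean on the hit probability (here, prospect A) rather than the penalty magnitude, even though a pure MEG maximizer would treat the two almost interchangeably.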

  12. Search For Debris Disks Around A Few Radio Pulsars

    NASA Astrophysics Data System (ADS)

    Wang, Zhongxiang; Kaplan, David; Kaspi, Victoria

    2007-05-01

    We propose to observe 7 radio pulsars with Spitzer/IRAC at 4.5 and 8.0 microns, in an effort to probe the general existence of debris disks around isolated neutron stars. Such disks, probably formed from fallback or pushback material left over from supernova explosions, have been suggested to be associated with various phenomena seen in radio pulsars. Recently, new evidence for such a disk around an isolated young neutron star was found in Spitzer observations of an X-ray pulsar. If they exist, the disks could be illuminated by the energy output of the central pulsars and thus be generally detectable in the infrared by IRAC. We have selected 40 relatively young, energetic pulsars from the most recent pulsar catalogue as the preliminary targets for our ground-based near-IR imaging survey. Based on the results of those survey observations, 7 pulsars were further selected because of their relatively sparse fields and estimated low extinction. Combined with our near-IR images, Spitzer/IRAC observations will allow us to unambiguously identify disks if they are detected at the source positions. The Spitzer observing program we propose here likely represents the best available test of the general existence of disks around radio pulsars.

  13. Early detection of probable idiopathic Parkinson's disease: I. development of a diagnostic test battery.

    PubMed

    Montgomery, Erwin B; Koller, William C; LaMantia, Theodora J K; Newman, Mary C; Swanson-Hyland, Elizabeth; Kaszniak, Alfred W; Lyons, Kelly

    2000-05-01

    We developed a test battery as an inexpensive and objective aid for the early diagnosis of idiopathic Parkinson's disease (iPD) and its differential diagnoses. The test battery incorporates tests of motor function, olfaction, and mood. In the motor task, a wrist flexion-and-extension task to different targets, movement velocities were recorded. Olfaction was tested with the University of Pennsylvania Smell Identification Test. Mood was assessed with the Beck Depression Inventory. An initial regression model was developed from the results of 19 normal control subjects and 18 patients with early, mild, probable iPD. Prospective application to an independent validation set of 122 normal control subjects and 103 patients resulted in an 88% specificity rate and a 69% sensitivity rate, with an area under the Receiver Operating Characteristic curve of 0.87. Copyright © 2000 Movement Disorder Society.
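    The area under the ROC curve reported for the battery can be computed from classifier scores without tracing the curve explicitly, using the rank-sum (Mann-Whitney) identity: AUC equals the probability that a randomly chosen patient scores higher than a randomly chosen control. A self-contained sketch with hypothetical scores (not data from the study):

```python
def roc_auc(pos_scores, neg_scores):
    """AUC via the Mann-Whitney identity: P(pos > neg) + 0.5 * P(tie)."""
    wins = ties = 0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1
            elif p == n:
                ties += 1
    return (wins + 0.5 * ties) / (len(pos_scores) * len(neg_scores))

# Hypothetical battery outputs: patients (iPD) vs. normal controls.
patients = [0.9, 0.8, 0.75, 0.4]
controls = [0.7, 0.5, 0.3, 0.2]
auc = roc_auc(patients, controls)
```

    With these toy scores the AUC is 0.875; sweeping a threshold over the same scores yields operating points such as the 88% specificity / 69% sensitivity pair quoted in the abstract, each corresponding to one point on the ROC curve.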

  14. Sensing systems efficiency evaluation and comparison for homeland security and homeland defense

    NASA Astrophysics Data System (ADS)

    Pakhomov, Alexander A.

    2010-04-01

    Designers and consumers of various security, intelligence, surveillance and reconnaissance (ISR) systems, as well as various unattended ground sensors, pay most attention to commonly used performance characteristics such as the probability of target detection and the probability of false alarm. These characteristics are used for system comparison and evaluation. However, they are not sufficient for end-users of these systems, or for assessing total/final effectiveness. This article presents and discusses a system approach to estimating the efficiency of security and ISR systems. The presented approach focuses on the final result of the system's function and use. It allows setting reasonable technical and structural requirements for security and ISR systems, making trustworthy comparisons, and planning the practical application of such systems. It also helps identify forward-looking directions for system development. The presented results can serve as guidance for both designers and consumers.

  15. Characterizing the distribution of an endangered salmonid using environmental DNA analysis

    USGS Publications Warehouse

    Laramie, Matthew B.; Pilliod, David S.; Goldberg, Caren S.

    2015-01-01

    Determining species distributions accurately is crucial to developing conservation and management strategies for imperiled species, but a challenging task for small populations. We evaluated the efficacy of environmental DNA (eDNA) analysis for improving detection and thus potentially refining the known distribution of Chinook salmon (Oncorhynchus tshawytscha) in the Methow and Okanogan Subbasins of the Upper Columbia River, which span the border between Washington, USA and British Columbia, Canada. We developed an assay to target a 90 base pair sequence of Chinook DNA and used quantitative polymerase chain reaction (qPCR) to quantify the amount of Chinook eDNA in triplicate 1-L water samples collected at 48 stream locations in June and again in August 2012. The overall probability of detecting Chinook with our eDNA method in areas within the known distribution was 0.77 (±0.05 SE). Detection probability was lower in June (0.62, ±0.08 SE) during high flows and at the beginning of spring Chinook migration than during base flows in August (0.93, ±0.04 SE). In the Methow subbasin, mean eDNA concentration was higher in August compared to June, especially in smaller tributaries, probably resulting from the arrival of spring Chinook adults, reduced discharge, or both. Chinook eDNA concentrations did not appear to change in the Okanogan subbasin from June to August. Contrary to our expectations about downstream eDNA accumulation, Chinook eDNA did not decrease in concentration in upstream reaches (0–120 km). Further examination of factors influencing spatial distribution of eDNA in lotic systems may allow for greater inference of local population densities along stream networks or watersheds. These results demonstrate the potential effectiveness of eDNA detection methods for determining landscape-level distribution of anadromous salmonids in large river systems.
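
    The per-survey detection probabilities above determine how quickly replicate samples drive cumulative detection toward 1. A minimal sketch, assuming replicates are independent with a common per-sample probability (the abstract's June and August figures are used purely as illustrative inputs):

```python
def cumulative_detection(p_per_sample, n_samples):
    """Probability of at least one detection in n independent samples."""
    return 1 - (1 - p_per_sample) ** n_samples

# Seasonal per-survey detection probabilities reported in the abstract
june, august = 0.62, 0.93

# Cumulative detection for triplicate samples, assuming independence
june_triplicate = cumulative_detection(june, 3)     # ~0.945
august_triplicate = cumulative_detection(august, 3) # ~0.9997
```

    Under this independence assumption, even the weaker June per-survey probability approaches reliable detection with three replicates per site.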

  16. A new proof-of-principle contraband detection system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sredniawski, J.J.; Debiak, T.; Kamykowski, E.

    1995-12-01

    A new concept for a contraband detection system (CDS) has been developed under a Phase I ARPA-funded program; it uses gamma resonance absorption (GRA) to detect certain illegal drugs that may be transported in man-portable containers. A high detection probability for heroin and cocaine is possible with a device that is also searching for explosives. Elemental detection of both N and Cl is utilized, and with tomography, a 3D density image of the elements is generated. A total-density image is also developed. Together, these two may be used with considerable confidence in determining whether heroin or cocaine is present in the interrogated containers, even in small quantities (1 kg). The CDS employs a high-current (≥10 mA) DC accelerator that produces a beam of 1.75 or 1.89 MeV protons. These protons impact a target with coatings of ¹³C and ³⁴S. Depending on the coating, the resultant resonant gamma rays are preferentially absorbed in either ¹⁴N or ³⁵Cl. The resonant gammas come off the target in a conical fan at 80.7° for N and 82° for Cl; a common array of segmented BGO detectors is used over an arc of 53° to provide input to an imaging subsystem. The tomography makes use of rotation and vertical translation of a baggage carousel holding typically 18 average-sized bags for batch processing of the contents. A single proton accelerator and target can supply multiple detection stations with the appropriate gammas, a feature that may lead to very high throughput, potentially approaching 2000 bags/hr. Each detection station can operate somewhat independently of the others. This paper presents the overall requirements, design, operating principles, and characteristics of the CDS proof-of-principle device developed in the Phase I program.

  17. Active Brown Fat During 18F-FDG PET/CT Imaging Defines a Patient Group with Characteristic Traits and an Increased Probability of Brown Fat Redetection.

    PubMed

    Gerngroß, Carlos; Schretter, Johanna; Klingenspor, Martin; Schwaiger, Markus; Fromme, Tobias

    2017-07-01

    Brown adipose tissue (BAT) provides a means of nonshivering thermogenesis. In humans, active BAT can be visualized by ¹⁸F-FDG uptake as detected by PET combined with CT. The retrospective analysis of clinical scans is a valuable source to identify anthropometric parameters that influence BAT mass and activity and thus the potential efficacy of envisioned drugs targeting this tissue to treat metabolic disease. Methods: We analyzed 2,854 ¹⁸F-FDG PET/CT scans from 1,644 patients and identified 98 scans from 81 patients with active BAT. We quantified the volume of active BAT depots (mean values in mL ± SD: total BAT, 162 ± 183 [n = 98]; cervical, 40 ± 37 [n = 53]; supraclavicular, 66 ± 68 [n = 71]; paravertebral, 51 ± 53 [n = 69]; mediastinal, 43 ± 40 [n = 51]; subphrenic, 21 ± 21 [n = 29]). Because only active BAT is detectable by ¹⁸F-FDG uptake, these numbers underestimate the total amount of BAT. Considering only the 32 scans of highest activity as categorized by a visual scoring strategy, we determined a mean total BAT volume of 308 ± 208 mL. In 30 BAT-positive patients with 3 or more repeated scans, we calculated a much higher mean probability of redetecting active BAT (52% ± 25%) as compared with the overall prevalence of 4.9%. We calculated a BAT activity index (BFI) based on volume and intensity of individual BAT depots. Results: We detected higher total BFI in younger patients (P = 0.009), whereas sex, body mass index, height, mass, outdoor temperature, and blood parameters did not affect total or depot-specific BAT activity. Surprisingly, renal creatinine clearance as estimated from mass, age, and plasma creatinine was a significant predictor of BFI on the total (P = 0.005) as well as on the level of several individual depots. In summary, we detected a high amount of more than 300 mL of BAT tissue. Conclusion: BAT-positive patients represent a group with a higher-than-usual probability of activating BAT during a scan. Estimated renal creatinine clearance correlated with the extent of activated BAT in a given scan. These data imply an efficacy of drugs targeting BAT to treat metabolic disease that is at the same time higher and subject to larger individual variation than previously assumed. © 2017 by the Society of Nuclear Medicine and Molecular Imaging.

  18. Sampling designs matching species biology produce accurate and affordable abundance indices

    PubMed Central

    Farley, Sean; Russell, Gareth J.; Butler, Matthew J.; Selinger, Jeff

    2013-01-01

    Wildlife biologists often use grid-based designs to sample animals and generate abundance estimates. Although sampling in grids is theoretically sound, in application the method can be logistically difficult and expensive when sampling elusive species inhabiting extensive areas. These factors make it challenging to sample animals and meet the statistical assumption that all individuals have an equal probability of capture. Violating this assumption biases results. Does an alternative exist? Perhaps sampling only where resources attract animals (i.e., targeted sampling) would provide accurate abundance estimates more efficiently and affordably. However, biases from this approach would also arise if individuals have an unequal probability of capture, especially if some failed to visit the sampling area. Since most biological programs are resource limited, and acquiring abundance data drives many conservation and management applications, it becomes imperative to identify economical and informative sampling designs. Therefore, we evaluated abundance estimates generated from grid and targeted sampling designs using simulations based on global positioning system (GPS) data from 42 Alaskan brown bears (Ursus arctos). Migratory salmon drew brown bears from the wider landscape, concentrating them at anadromous streams. This provided a scenario for testing the targeted approach. Grid and targeted sampling varied by the number of traps, trap location (placed randomly, systematically, or by expert opinion), and whether traps were stationary or moved between capture sessions. We began by identifying when to sample, and whether bears had equal probability of capture. We compared abundance estimates against seven criteria: bias, precision, accuracy, effort, plus encounter rates, and probabilities of capture and recapture. One grid (49 km2 cells) and one targeted configuration provided the most accurate results. 
Both placed traps by expert opinion and moved traps between capture sessions, which raised capture probabilities. The grid design was least biased (−10.5%), but imprecise (CV 21.2%), and used the most effort (16,100 trap-nights). The targeted configuration was more biased (−17.3%), but most precise (CV 12.3%), with the least effort (7,000 trap-nights). Targeted sampling generated encounter rates four times higher, and capture and recapture probabilities 11% and 60% higher, than grid sampling, in a sampling frame 88% smaller. Bears had unequal probability of capture with both sampling designs, partly because some bears never had traps available to sample them. Hence, grid and targeted sampling generated abundance indices, not estimates. Overall, targeted sampling provided the most accurate and affordable design to index abundance. Targeted sampling may offer an alternative method to index the abundance of other species inhabiting expansive and inaccessible landscapes elsewhere, provided they are attracted to resource concentrations. PMID:24392290
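
    The index-versus-estimate distinction can be illustrated with a toy capture-recapture simulation. This sketch uses the simple two-session Chapman estimator rather than the study's methods, and the bear count and capture probabilities are invented; heterogeneous capture probability is what turns the result into an index rather than an estimate:

```python
import random

def chapman_estimate(n1, n2, m2):
    """Chapman's bias-corrected two-session Lincoln-Petersen estimator."""
    return (n1 + 1) * (n2 + 1) / (m2 + 1) - 1

random.seed(1)
N = 42                          # true number of bears (invented)
# Unequal capture probabilities: some bears rarely visit the traps.
p = [0.6] * 30 + [0.05] * 12

def capture_session():
    return {i for i in range(N) if random.random() < p[i]}

s1, s2 = capture_session(), capture_session()
est = chapman_estimate(len(s1), len(s2), len(s1 & s2))
# Bears with p = 0.05 are mostly invisible to the design, so `est`
# tends to fall below N: an index, not an unbiased estimate.
```

    Re-running with equal capture probabilities recovers estimates near N, which is exactly the equal-catchability assumption the abstract says both designs violated.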

  19. Environmental DNA sampling is more sensitive than a traditional survey technique for detecting an aquatic invader.

    PubMed

    Smart, Adam S; Tingley, Reid; Weeks, Andrew R; van Rooyen, Anthony R; McCarthy, Michael A

    2015-10-01

    Effective management of alien species requires detecting populations in the early stages of invasion. Environmental DNA (eDNA) sampling can detect aquatic species at relatively low densities, but few studies have directly compared detection probabilities of eDNA sampling with those of traditional sampling methods. We compare the ability of a traditional sampling technique (bottle trapping) and eDNA to detect a recently established invader, the smooth newt Lissotriton vulgaris vulgaris, at seven field sites in Melbourne, Australia. Over a four-month period, per-trap detection probabilities ranged from 0.01 to 0.26 among sites where L. v. vulgaris was detected, whereas per-sample eDNA estimates were much higher (0.29-1.0). Detection probabilities of both methods varied temporally (across days and months), but temporal variation appeared to be uncorrelated between methods. Only estimates of spatial variation were strongly correlated across the two sampling techniques. Environmental variables (water depth, rainfall, ambient temperature) were not clearly correlated with detection probabilities estimated via trapping, whereas eDNA detection probabilities were negatively correlated with water depth, possibly reflecting higher eDNA concentrations at lower water levels. Our findings demonstrate that eDNA sampling can be an order of magnitude more sensitive than traditional methods, and illustrate that traditional- and eDNA-based surveys can provide independent information on species distributions when occupancy surveys are conducted over short timescales.
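
    The reported sensitivity gap translates directly into survey effort. A minimal sketch, assuming independent surveys with a constant per-survey detection probability, comparing how many trap-nights versus eDNA samples would be needed to reach 95% cumulative detection at the least detectable site (the 0.01 and 0.29 lower bounds from the abstract):

```python
import math

def surveys_needed(p_per_survey, target=0.95):
    """Smallest number of independent surveys whose cumulative
    detection probability reaches the target."""
    return math.ceil(math.log(1 - target) / math.log(1 - p_per_survey))

trap_effort = surveys_needed(0.01)   # per-trap p at the worst site
edna_effort = surveys_needed(0.29)   # per-sample eDNA p at the worst site
# trap_effort is roughly 30x edna_effort here, consistent with eDNA
# being an order of magnitude (or more) more sensitive per unit effort.
```

    The independence assumption is optimistic for traps (the abstract notes temporal correlation), so the real-world effort gap could be even larger.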

  20. Exoplanet Biosignatures: Future Directions

    PubMed Central

    Bains, William; Cronin, Leroy; DasSarma, Shiladitya; Danielache, Sebastian; Domagal-Goldman, Shawn; Kacar, Betul; Kiang, Nancy Y.; Lenardic, Adrian; Reinhard, Christopher T.; Moore, William; Schwieterman, Edward W.; Shkolnik, Evgenya L.; Smith, Harrison B.

    2018-01-01

    We introduce a Bayesian method for guiding future directions for the detection of life on exoplanets. We describe empirical and theoretical work necessary to place constraints on the relevant likelihoods, including those emerging from better understanding stellar environment, planetary climate and geophysics, geochemical cycling, the universalities of physics and chemistry, the contingencies of evolutionary history, the properties of life as an emergent complex system, and the mechanisms driving the emergence of life. We provide examples of how the Bayesian formalism could guide future search strategies, including determining which observations to prioritize and deciding between targeted searches and larger, lower-resolution surveys that generate ensemble statistics, and we address how a Bayesian methodology could constrain the prior probability of life with or without a positive detection. Key Words: Exoplanets—Biosignatures—Life detection—Bayesian analysis. Astrobiology 18, 779–824. PMID:29938538
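
    For a single observation, the Bayesian formalism the authors describe reduces to an odds update. A minimal sketch with invented likelihoods (a biosignature nine times more likely under "life" than under abiotic chemistry) showing how a low prior tempers a positive detection:

```python
def posterior_life(prior, p_obs_given_life, p_obs_given_no_life):
    """Bayes' rule for P(life | observed biosignature)."""
    num = p_obs_given_life * prior
    return num / (num + p_obs_given_no_life * (1 - prior))

# Invented values: prior P(life) = 1%, detection likelihood 0.9 if life
# is present, abiotic false-positive likelihood 0.1.
post = posterior_life(0.01, 0.9, 0.1)   # = 0.009 / 0.108 = 1/12
```

    Even a strong biosignature leaves the posterior near 8% when the prior is 1%, which is why the paper emphasizes constraining priors and abiotic false-positive likelihoods.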

  1. Search strategies

    NASA Astrophysics Data System (ADS)

    Oliver, B. M.

    Attention is given to the approaches that would provide the greatest chance of success in the search for advanced extraterrestrial cultures in the Galaxy, taking into account the principle of least energy expenditure. The energetics of interstellar contact are explored, with attention to the use of manned spacecraft, automatic probes, and beacons. The least expensive approach to a search for other civilizations is a listening program that attempts to detect signals emitted by such civilizations. The optimum part of the spectrum for the considered search is found to be in the range from 1 to 2 GHz. Antenna and transmission formulas are discussed, along with the employment of matched gates and filters, the probable characteristics of the signals to be detected, the filter-signal mismatch loss, surveys of the radio sky, and the conduct of targeted searches.

  2. Homoharringtonine targets Smad3 and TGF-β pathway to inhibit the proliferation of acute myeloid leukemia cells.

    PubMed

    Chen, Jian; Mu, Qitian; Li, Xia; Yin, Xiufeng; Yu, Mengxia; Jin, Jing; Li, Chenying; Zhou, Yile; Zhou, Jiani; Suo, Shanshan; Lu, Demin; Jin, Jie

    2017-06-20

    Homoharringtonine (HHT) has long and widely been used in China for the treatment of acute myeloid leukemia (AML); the clinical therapeutic effect is significant, but the working mechanism is poorly understood. The purpose of this study was to screen for possible targets of HHT with virtual screening and to verify the findings in cell experiments. Software including Autodock, Python, and MGL tools was used, with HHT as the ligand and proteins from the PI3K-Akt, Jak-STAT, TGF-β, and NF-κB pathways as the receptors. Human AML cell lines including U937, KG-1, and THP-1 were cultured and used as the experimental cell lines. The MTT assay was used to measure proliferation, flow cytometry was used to detect apoptosis and cell cycle arrest upon HHT treatment, western blotting was used to detect changes in protein levels, viral shRNA transfection was used to suppress the expression of the candidate target protein, and viral mRNA transfection was used for over-expression. Virtual screening revealed that Smad3 from the TGF-β pathway might be the candidate for HHT binding. In the AML cell lines U937 and KG-1, HHT can induce Ser423/425 phosphorylation of Smad3, and this phosphorylation can subsequently activate the TGF-β pathway, causing cell cycle arrest at G1 phase in U937 cells and apoptosis in KG-1 cells; knockdown of Smad3 can impair the sensitivity of U937 cells to HHT, and over-expression of Smad3 can re-establish the sensitivity in both cell lines. We conclude that Smad3 is the probable target protein of HHT and plays an important role in the functioning mechanism of HHT.

  3. Homoharringtonine targets Smad3 and TGF-β pathway to inhibit the proliferation of acute myeloid leukemia cells

    PubMed Central

    Yin, Xiufeng; Yu, Mengxia; Jin, Jing; Li, Chenying; Zhou, Yile; Zhou, Jiani; Suo, Shanshan; Lu, Demin; Jin, Jie

    2017-01-01

    Homoharringtonine (HHT) has long and widely been used in China for the treatment of acute myeloid leukemia (AML); the clinical therapeutic effect is significant, but the working mechanism is poorly understood. The purpose of this study was to screen for possible targets of HHT with virtual screening and to verify the findings in cell experiments. Software including Autodock, Python, and MGL tools was used, with HHT as the ligand and proteins from the PI3K-Akt, Jak-STAT, TGF-β, and NF-κB pathways as the receptors. Human AML cell lines including U937, KG-1, and THP-1 were cultured and used as the experimental cell lines. The MTT assay was used to measure proliferation, flow cytometry was used to detect apoptosis and cell cycle arrest upon HHT treatment, western blotting was used to detect changes in protein levels, viral shRNA transfection was used to suppress the expression of the candidate target protein, and viral mRNA transfection was used for over-expression. Virtual screening revealed that Smad3 from the TGF-β pathway might be the candidate for HHT binding. In the AML cell lines U937 and KG-1, HHT can induce Ser423/425 phosphorylation of Smad3, and this phosphorylation can subsequently activate the TGF-β pathway, causing cell cycle arrest at G1 phase in U937 cells and apoptosis in KG-1 cells; knockdown of Smad3 can impair the sensitivity of U937 cells to HHT, and over-expression of Smad3 can re-establish the sensitivity in both cell lines. We conclude that Smad3 is the probable target protein of HHT and plays an important role in the functioning mechanism of HHT. PMID:28454099

  4. Highlight on Supernova Early Warning at Daya Bay

    NASA Astrophysics Data System (ADS)

    Wei, Hanyu

    Providing an early warning of supernova burst neutrinos is of importance in studying both supernova dynamics and neutrino physics. The Daya Bay Reactor Neutrino Experiment, with the unique feature of multiple liquid scintillator detectors, is sensitive to the full energy spectrum of supernova burst electron-antineutrinos. By utilizing 8 Antineutrino Detectors (ADs) in three different experimental halls, which are about 1 km apart from each other, we obtain a more powerful and prompt rejection of muon spallation background than single-detector experiments with the same target volume. A dedicated trigger system embedded in the data acquisition system has been installed to allow the detection of a coincidence of neutrino signals across all ADs via inverse beta decay (IBD) within a 10-second window, thus providing a robust early warning of a supernova occurrence within the Milky Way. An 8-AD supernova trigger table has been established theoretically to tabulate the coincident 8-AD event counts vs. the trigger rate. As a result, a golden trigger threshold, i.e. one with a false alarm rate < 1/(3 months), can be set as low as 6 candidates among the 8 detectors, leading to a 100% detection probability for all 1987A-type supernova bursts at the distance of the Milky Way center and a 96% detection probability for those at the edge of the Milky Way.
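
    The false-alarm logic behind a 6-of-8 threshold can be sketched as a coincidence calculation. The per-detector background rate below is an invented placeholder, not a Daya Bay number; the structure (a Poisson per-window probability, then a binomial tail over 8 ADs) is the point:

```python
import math

def p_at_least_k_of_n(p, n, k):
    """Binomial tail: probability that at least k of n independent
    detectors each see a candidate in the same window."""
    return sum(math.comb(n, j) * p**j * (1 - p)**(n - j)
               for j in range(k, n + 1))

rate_per_sec = 1e-3                           # invented per-AD background rate
p_window = 1 - math.exp(-rate_per_sec * 10)   # Poisson: P(>=1 event in 10 s)

p_false = p_at_least_k_of_n(p_window, 8, 6)   # accidental 6-of-8 coincidence
windows_per_3_months = 3 * 30 * 24 * 360      # non-overlapping 10-s windows
expected_false_alarms = p_false * windows_per_3_months
# With these inputs the expected accidental-trigger count over 3 months
# is far below 1, illustrating why a 6-of-8 threshold can be "golden".
```

    Raising the threshold suppresses accidentals combinatorially, which is why a modest coincidence requirement across independent halls buys such a low false-alarm rate.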

  5. Advances in radioguided surgery in oncology.

    PubMed

    Valdés Olmos, Renato A; Vidal-Sicart, Sergi; Manca, Gianpiero; Mariani, Giuliano; León-Ramírez, Luisa F; Rubello, Domenico; Giammarile, Francesco

    2017-09-01

    The sentinel lymph node (SLN) biopsy is probably the most well-known radioguided technique in surgical oncology. Today SLN biopsy reduces the morbidity associated with lymphadenectomy and increases the identification rate of occult lymphatic metastases by offering the pathologist the lymph nodes with the highest probability of containing metastatic cells. These advantages may result in a change in clinical management in both melanoma and breast cancer patients. The SLN evaluation by pathology currently implies tumor burden stratification for further prognostic information. The concept of SLN biopsy includes pre-surgical lymphoscintigraphy as a "roadmap" to guide the surgeon toward the SLNs and to localize unpredictable lymphatic drainage patterns. In addition to planar images, SPECT/CT improves SLN detection, especially at sites closer to the injection site, providing anatomic landmarks which are helpful in localizing SLNs in difficult-to-interpret studies. The use of intraoperative imaging devices allows a better surgical approach and SLN localization. Several studies report the value of such devices for excision of additional sentinel nodes and for monitoring the whole procedure. The combination of preoperative imaging and radioguided localization constitutes the basis for a whole spectrum of basic and advanced nuclear medicine procedures, which recently have been encompassed under the term "guided intraoperative scintigraphic tumor targeting" (GOSTT). In addition to SLN biopsy, GOSTT includes procedures based on the detection of target lesions with visible uptake of tumor-seeking radiotracers on SPECT/CT or PET/CT, enabling their subsequent radioguided excisional biopsy for diagnostic or therapeutic purposes. The incorporation of new PET tracers into nuclear medicine has reinforced this field, delineating new strategies for radioguided excision. 
In cases with insufficient lesion uptake after systemic radiotracer administration, intralesional injection of a tracer without migration may enable subsequent excision of the targeted tissue. This approach has been helpful in non-palpable breast cancer and in solitary pulmonary nodules. The introduction of allied technologies like fluorescence constitutes a recent advance aimed to refine the search for SLNs and tracer-avid lesions in the operation theatre in combination with radioguidance.

  6. Influences of Availability on Parameter Estimates from Site Occupancy Models with Application to Submersed Aquatic Vegetation

    USGS Publications Warehouse

    Gray, Brian R.; Holland, Mark D.; Yi, Feng; Starcevich, Leigh Ann Harrod

    2013-01-01

    Site occupancy models are commonly used by ecologists to estimate the probabilities of species site occupancy and of species detection. This study addresses the influence on site occupancy and detection estimates of variation in species availability among surveys within sites. Such variation in availability may result from temporary emigration, nonavailability of the species for detection, and sampling sites spatially when species presence is not uniform within sites. We demonstrate, using Monte Carlo simulations and aquatic vegetation data, that variation in availability and heterogeneity in the probability of availability may yield biases in the expected values of the site occupancy and detection estimates that have traditionally been associated with low-detection probabilities and heterogeneity in those probabilities. These findings confirm that the effects of availability may be important for ecologists and managers, and that where such effects are expected, modification of sampling designs and/or analytical methods should be considered. Failure to limit the effects of availability may preclude reliable estimation of the probability of site occupancy.
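
    The availability effect described above can be reproduced with a tiny Monte Carlo sketch (all parameter values invented): a site is occupied with probability psi, the species is available during a given survey with probability theta, and, if available, is detected with probability p. The naive occupancy estimate (the fraction of sites with at least one detection) drops when availability is imperfect:

```python
import random

random.seed(0)

def naive_occupancy(psi, theta, p, n_sites=5000, n_surveys=3):
    """Fraction of simulated sites with at least one detection."""
    hits = 0
    for _ in range(n_sites):
        occupied = random.random() < psi
        detected = occupied and any(
            random.random() < theta and random.random() < p
            for _ in range(n_surveys)
        )
        hits += detected
    return hits / n_sites

full = naive_occupancy(0.6, 1.0, 0.8)      # availability certain
partial = naive_occupancy(0.6, 0.5, 0.8)   # species often unavailable
# `partial` falls well below the true psi = 0.6, mimicking the biases
# the study associates with variation in availability.
```

    Because availability and detection enter the likelihood as a product, a model that ignores availability attributes the missed sites to non-occupancy, which is the bias the abstract warns about.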

  7. True detection limits in an experimental linearly heteroscedastic system. Part 1

    NASA Astrophysics Data System (ADS)

    Voigtman, Edward; Abraham, Kevin T.

    2011-11-01

    Using a lab-constructed laser-excited filter fluorimeter deliberately designed to exhibit linearly heteroscedastic, additive Gaussian noise, it has been shown that accurate estimates may be made of the true theoretical Currie decision levels (YC and XC) and true Currie detection limits (YD and XD) for the detection of rhodamine 6G tetrafluoroborate in ethanol. The obtained experimental values, for 5% probability of false positives and 5% probability of false negatives, were YC = 56.1 mV, YD = 125 mV, XC = 0.132 μg/mL and XD = 0.294 μg/mL. For 5% probability of false positives and 1% probability of false negatives, the obtained detection limits were YD = 158 mV and XD = 0.372 μg/mL. These decision levels and corresponding detection limits were shown to pass the ultimate test: they resulted in observed probabilities of false positives and false negatives that were statistically equivalent to the a priori specified values.
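
    For orientation, the textbook Currie construction in the simple homoscedastic, known-parameter case looks as follows (the paper's linearly heteroscedastic system requires more care; the blank standard deviation and calibration slope below are invented numbers):

```python
from statistics import NormalDist

def currie_levels(sigma_blank, slope, alpha=0.05, beta=0.05):
    """Currie decision level YC and detection limit YD in the response
    domain for the homoscedastic, known-parameter case, plus their
    content-domain counterparts XC and XD via the calibration slope."""
    z_alpha = NormalDist().inv_cdf(1 - alpha)   # false-positive control
    z_beta = NormalDist().inv_cdf(1 - beta)     # false-negative control
    yc = z_alpha * sigma_blank
    yd = yc + z_beta * sigma_blank
    return yc, yd, yc / slope, yd / slope

yc, yd, xc, xd = currie_levels(sigma_blank=17.0, slope=425.0)
```

    With alpha = beta, YD is exactly twice YC in this simple case; heteroscedastic noise breaks that symmetry, which is why the study's YD/YC ratio differs from 2.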

  8. Modeling co-occurrence of northern spotted and barred owls: accounting for detection probability differences

    USGS Publications Warehouse

    Bailey, Larissa L.; Reid, Janice A.; Forsman, Eric D.; Nichols, James D.

    2009-01-01

    Barred owls (Strix varia) have recently expanded their range and now encompass the entire range of the northern spotted owl (Strix occidentalis caurina). This expansion has led to two important issues of concern for management of northern spotted owls: (1) possible competitive interactions between the two species that could contribute to population declines of northern spotted owls, and (2) possible changes in vocalization behavior and detection probabilities of northern spotted owls induced by presence of barred owls. We used a two-species occupancy model to investigate whether there was evidence of competitive exclusion between the two species at study locations in Oregon, USA. We simultaneously estimated detection probabilities for both species and determined if the presence of one species influenced the detection of the other species. Model selection results and associated parameter estimates provided no evidence that barred owls excluded spotted owls from territories. We found strong evidence that detection probabilities differed for the two species, with higher probabilities for northern spotted owls, which are the object of current surveys. Non-detection of barred owls is very common in surveys for northern spotted owls, and detection of both owl species was negatively influenced by the presence of the congeneric species. Our results suggest that analyses directed at hypotheses of barred owl effects on demographic or occupancy vital rates of northern spotted owls need to deal adequately with imperfect and variable detection probabilities for both species.

  9. Factors influencing territorial occupancy and reproductive success in a Eurasian Eagle-owl (Bubo bubo) population.

    PubMed

    León-Ortega, Mario; Jiménez-Franco, María V; Martínez, José E; Calvo, José F

    2017-01-01

    Modelling territorial occupancy and reproductive success is a key issue for better understanding the population dynamics of territorial species. This study aimed to investigate these ecological processes in a Eurasian Eagle-owl (Bubo bubo) population in south-eastern Spain during a seven-year period. A multi-season, multi-state modelling approach was followed to estimate the probabilities of occupancy and reproductive success in relation to previous state, time and habitat covariates, and accounting for imperfect detection. The best estimated models showed past breeding success in the territories to be the most important factor determining a high probability of reoccupation and reproductive success in the following year. In addition, alternative occupancy models suggested the positive influence of crops on the probability of territory occupation. By contrast, the best reproductive model revealed strong interannual variations in the rates of breeding success, which may be related to changes in the abundance of the European Rabbit, the main prey of the Eurasian Eagle-owl. Our models also estimated the probabilities of detecting the presence of owls in a given territory and the probability of detecting evidence of successful reproduction. Estimated detection probabilities were high throughout the breeding season, decreasing in time for unsuccessful breeders but increasing for successful breeders. The probability of detecting reproductive success increased with time, being close to one in the last survey. These results suggest that reproduction failure in the early stages of the breeding season is a determinant factor in the probability of detecting occupancy and reproductive success.

  10. Factors influencing territorial occupancy and reproductive success in a Eurasian Eagle-owl (Bubo bubo) population

    PubMed Central

    León-Ortega, Mario; Jiménez-Franco, María V.; Martínez, José E.

    2017-01-01

    Modelling territorial occupancy and reproductive success is a key issue for better understanding the population dynamics of territorial species. This study aimed to investigate these ecological processes in a Eurasian Eagle-owl (Bubo bubo) population in south-eastern Spain during a seven-year period. A multi-season, multi-state modelling approach was followed to estimate the probabilities of occupancy and reproductive success in relation to previous state, time and habitat covariates, and accounting for imperfect detection. The best estimated models showed past breeding success in the territories to be the most important factor determining a high probability of reoccupation and reproductive success in the following year. In addition, alternative occupancy models suggested the positive influence of crops on the probability of territory occupation. By contrast, the best reproductive model revealed strong interannual variations in the rates of breeding success, which may be related to changes in the abundance of the European Rabbit, the main prey of the Eurasian Eagle-owl. Our models also estimated the probabilities of detecting the presence of owls in a given territory and the probability of detecting evidence of successful reproduction. Estimated detection probabilities were high throughout the breeding season, decreasing in time for unsuccessful breeders but increasing for successful breeders. The probability of detecting reproductive success increased with time, being close to one in the last survey. These results suggest that reproduction failure in the early stages of the breeding season is a determinant factor in the probability of detecting occupancy and reproductive success. PMID:28399175

  11. Statistical approaches to the analysis of point count data: A little extra information can go a long way

    USGS Publications Warehouse

    Farnsworth, G.L.; Nichols, J.D.; Sauer, J.R.; Fancy, S.G.; Pollock, K.H.; Shriner, S.A.; Simons, T.R.; Ralph, C. John; Rich, Terrell D.

    2005-01-01

    Point counts are a standard sampling procedure for many bird species, but lingering concerns still exist about the quality of information produced by the method. It is well known that variation in observer ability and environmental conditions can influence the detection probability of birds in point counts, but many biologists have been reluctant to abandon point counts in favor of more intensive approaches to counting. However, over the past few years a variety of statistical and methodological developments have begun to provide practical ways of overcoming some of the problems with point counts. We describe some of these approaches and show how they can be integrated into standard point count protocols to greatly enhance the quality of the information. Several tools now exist for estimating the detection probability of birds during counts, including distance sampling, double-observer methods, time-depletion (removal) methods, and hybrid methods that combine these approaches. Many counts are conducted in habitats that make auditory detection of birds much more likely than visual detection. As a framework for understanding detection probability during such counts, we propose decomposing the probability that a bird is detected during a count into (1) the probability that the bird vocalizes during the count and (2) the probability that this vocalization is detected by an observer. In addition, we propose that some measure of the area sampled during a count is necessary for valid inferences about bird populations. This can be done by employing fixed-radius counts or more sophisticated distance-sampling models. We recommend that any studies employing point counts be designed to estimate detection probability and to include a measure of the area sampled.
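
    The proposed decomposition, and the role of a known sampled area, can be written down directly. A minimal sketch with invented numbers (vocalization probability 0.7, hearing probability 0.8, a 100 m fixed radius, 6 birds counted):

```python
import math

def detection_probability(p_vocalize, p_heard_given_vocal):
    """Decomposition for auditory counts: the bird must vocalize during
    the count AND the observer must detect that vocalization."""
    return p_vocalize * p_heard_given_vocal

def density_per_ha(birds_counted, p_detect, radius_m):
    """Adjust a fixed-radius count for detection and sampled area."""
    area_ha = math.pi * radius_m ** 2 / 10_000
    return birds_counted / (p_detect * area_ha)

p = detection_probability(0.7, 0.8)          # 0.56
density = density_per_ha(6, p, radius_m=100)  # birds per hectare
```

    Without the detection adjustment and the area term, the same raw count of 6 birds cannot be compared across observers, habitats, or count radii.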

  12. Location Prediction Based on Transition Probability Matrices Constructing from Sequential Rules for Spatial-Temporal K-Anonymity Dataset

    PubMed Central

    Liu, Zhao; Zhu, Yunhong; Wu, Chenxue

    2016-01-01

    Spatial-temporal k-anonymity has become a mainstream approach among techniques for protecting users' privacy in location-based services (LBS) applications, and has been applied to several variants such as LBS snapshot queries and continuous queries. Analyzing large-scale spatial-temporal anonymity sets may benefit several LBS applications. In this paper, we propose two location prediction methods based on transition probability matrices constructed from sequential rules for spatial-temporal k-anonymity datasets. First, we define single-step sequential rules mined from sequential spatial-temporal k-anonymity datasets generated from continuous LBS queries for multiple users. We then construct transition probability matrices from the mined single-step sequential rules and normalize the transition probabilities in the transition matrices. Next, we regard a mobility model for an LBS requester as a stationary stochastic process and compute the n-step transition probability matrices by raising the normalized transition probability matrices to the power n. Furthermore, we propose two location prediction methods: rough prediction and accurate prediction. The former obtains the probabilities of arriving at target locations along simple paths that include only current locations, target locations, and transition steps. By iteratively combining the probabilities for simple paths with n steps and the probabilities for detailed paths with n-1 steps, the latter method calculates transition probabilities for detailed paths with n steps from current locations to target locations. Finally, we conduct extensive experiments that verify the correctness and flexibility of our proposed algorithms. PMID:27508502
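
    The core computation this record describes, row-normalizing a single-step transition matrix and raising it to the power n, can be sketched directly; the 3-location count matrix below is invented for illustration.

```python
import numpy as np

# Toy single-step transition counts between 3 anonymized locations.
counts = np.array([[0., 8., 2.],
                   [5., 0., 5.],
                   [1., 9., 0.]])
P = counts / counts.sum(axis=1, keepdims=True)   # row-stochastic matrix
P3 = np.linalg.matrix_power(P, 3)                # 3-step transition probs

assert np.allclose(P3.sum(axis=1), 1.0)          # rows still sum to 1
print(P3[0])  # probability of reaching each location in 3 steps from loc 0
```

    Treating the requester's mobility as a stationary Markov chain is what licenses the matrix-power step: the n-step probabilities are exactly P raised to the n-th power.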

  13. Performance analysis of EM-based blind detection for ON-OFF keying modulation over atmospheric optical channels

    NASA Astrophysics Data System (ADS)

    Dabiri, Mohammad Taghi; Sadough, Seyed Mohammad Sajad

    2018-04-01

    In free-space optical (FSO) links, atmospheric turbulence leads to scintillation in the received signal. Due to its ease of implementation, intensity modulation with direct detection (IM/DD) based on ON-OFF keying (OOK) is a popular signaling scheme in these systems. To detect OOK symbols over a turbulence channel in a blind way, i.e., without sending pilot symbols, an expectation-maximization (EM)-based detection method was recently proposed in the FSO literature. However, the performance of EM-based detection methods depends severely on the length of the observation interval (Ls). To choose the optimum value of Ls at the target bit error rates (BERs) of FSO communications, which are commonly lower than 10^-9, Monte Carlo simulations would be very cumbersome and require a very long processing time. To facilitate performance evaluation, in this letter we derive analytic expressions for the BER and outage probability. Numerical results validate the accuracy of our derived analytic expressions. Our results may serve to evaluate the optimum value of Ls without resorting to time-consuming Monte Carlo simulations.
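
    A toy version of the blind EM idea in this record: fit a two-component Gaussian mixture to an observation window of received intensities, then threshold at the midpoint of the learned levels. The channel gain, noise level, and window length are invented, and real FSO channels add turbulence fading this sketch omits.

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy IM/DD OOK window: equiprobable on/off symbols in additive noise.
bits = rng.integers(0, 2, 2000)
y = bits * 1.0 + rng.normal(0.0, 0.25, bits.size)   # gain 1, sigma 0.25

# EM for a 2-component Gaussian mixture (blind: no pilot symbols).
mu = np.array([y.min(), y.max()])
var = np.array([0.1, 0.1])
w = np.array([0.5, 0.5])
for _ in range(50):
    # E-step: responsibilities of each component for each sample
    pdf = np.exp(-(y[:, None] - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)
    r = w * pdf
    r /= r.sum(axis=1, keepdims=True)
    # M-step: reestimate weights, means, variances
    nk = r.sum(axis=0)
    w = nk / y.size
    mu = (r * y[:, None]).sum(axis=0) / nk
    var = (r * (y[:, None] - mu) ** 2).sum(axis=0) / nk

threshold = mu.mean()                   # midpoint of learned on/off levels
bits_hat = (y > threshold).astype(int)
ber = np.mean(bits_hat != bits)
print(ber)   # roughly 2% at this toy SNR; real targets are far lower
```

    The paper's point is precisely that simulating BERs below 10^-9 this way is impractical, which motivates the analytic expressions it derives.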

  14. A simplified model for the assessment of the impact probability of fragments.

    PubMed

    Gubinelli, Gianfilippo; Zanelli, Severino; Cozzani, Valerio

    2004-12-31

    A model was developed for the assessment of fragment impact probability on a target vessel, following the collapse and fragmentation of a primary vessel due to internal pressure. The model provides the probability of impact of a fragment with defined shape, mass and initial velocity on a target of a known shape and at a given position with respect to the source point. The model is based on the ballistic analysis of the fragment trajectory and on the determination of impact probabilities by the analysis of initial direction of fragment flight. The model was validated using available literature data.
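
    The flavor of such an impact-probability calculation can be conveyed with a Monte Carlo sketch (not the paper's analytic, ballistic-trajectory model): sample launch directions, fly a drag-free arc, and count landings inside a circular target footprint. All numbers here are invented.

```python
import numpy as np

rng = np.random.default_rng(1)
g, v0 = 9.81, 80.0                       # gravity (m/s^2), launch speed (assumed)
target_xy, target_r = np.array([300.0, 0.0]), 15.0   # target centre and radius (m)

n = 200_000
elev = rng.uniform(0, np.pi / 2, n)      # random elevation angle
azim = rng.uniform(-np.pi, np.pi, n)     # random azimuth
t_flight = 2 * v0 * np.sin(elev) / g     # flat-ground flight time
r_horiz = v0 * np.cos(elev) * t_flight   # horizontal range (drag-free)
x = r_horiz * np.cos(azim)
y = r_horiz * np.sin(azim)
hit = (x - target_xy[0]) ** 2 + (y - target_xy[1]) ** 2 <= target_r ** 2
print(hit.mean())                        # estimated impact probability
```

    The published model instead integrates analytically over the initial direction of fragment flight, which avoids the sampling noise of this sketch.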

  15. Assessing Aircraft Supply Air to Recommend Compounds for Timely Warning of Contamination

    NASA Astrophysics Data System (ADS)

    Fox, Richard B.

    Taking aircraft out of service for even one day to correct fume-in-cabin events can cost the industry roughly $630 million per year in lost revenue. This quantitative correlation study investigated relationships between measured concentrations of contaminants in bleed air and the probability of odor detectability. Data were collected from 94 aircraft engine and auxiliary power unit (APU) bleed air tests in an archival data set spanning 1997 to 2011. Pearson correlation found no relationships and was followed by regression analysis for individual contaminants, which revealed significant relationships between concentrations of compounds in bleed air and the probability of odor detectability (p<0.05), as well as between compound concentration and the probability of sensory irritancy detectability. Study results may be useful for establishing early warning levels. Predictive trend monitoring, a method of identifying potential pending failure modes within a mechanical system, may allow scheduled down-time for maintenance as a planned event rather than repair after a mechanical failure, thereby reducing the operational costs associated with odor-in-cabin events. Twenty compounds (independent variables) were found statistically significant as related to the probability of odor detectability (dependent variable 1). Seventeen compounds (independent variables) were found statistically significant as related to the probability of sensory irritancy detectability (dependent variable 2). Additional research was recommended to further investigate relationships between concentrations of contaminants and the probability of odor or sensory irritancy detectability for all turbine oil brands. Further research on predictive trend monitoring may also be warranted to demonstrate how the monitoring process might be applied in flight.

  16. Multiple model cardinalized probability hypothesis density filter

    NASA Astrophysics Data System (ADS)

    Georgescu, Ramona; Willett, Peter

    2011-09-01

    The Probability Hypothesis Density (PHD) filter propagates the first-moment approximation to the multi-target Bayesian posterior distribution while the Cardinalized PHD (CPHD) filter propagates both the posterior likelihood of (an unlabeled) target state and the posterior probability mass function of the number of targets. Extensions of the PHD filter to the multiple model (MM) framework have been published and were implemented either with a Sequential Monte Carlo or a Gaussian Mixture approach. In this work, we introduce the multiple model version of the more elaborate CPHD filter. We present the derivation of the prediction and update steps of the MMCPHD particularized for the case of two target motion models and proceed to show that in the case of a single model, the new MMCPHD equations reduce to the original CPHD equations.

  17. SPIDERS: selection of spectroscopic targets using AGN candidates detected in all-sky X-ray surveys

    NASA Astrophysics Data System (ADS)

    Dwelly, T.; Salvato, M.; Merloni, A.; Brusa, M.; Buchner, J.; Anderson, S. F.; Boller, Th.; Brandt, W. N.; Budavári, T.; Clerc, N.; Coffey, D.; Del Moro, A.; Georgakakis, A.; Green, P. J.; Jin, C.; Menzel, M.-L.; Myers, A. D.; Nandra, K.; Nichol, R. C.; Ridl, J.; Schwope, A. D.; Simm, T.

    2017-07-01

    SPIDERS (SPectroscopic IDentification of eROSITA Sources) is a Sloan Digital Sky Survey IV (SDSS-IV) survey running in parallel to the Extended Baryon Oscillation Spectroscopic Survey (eBOSS) cosmology project. SPIDERS will obtain optical spectroscopy for large numbers of X-ray-selected active galactic nuclei (AGN) and galaxy cluster members detected in wide-area eROSITA, XMM-Newton and ROSAT surveys. We describe the methods used to choose spectroscopic targets for two sub-programmes of SPIDERS: X-ray-selected AGN candidates detected in the ROSAT All-Sky and the XMM-Newton Slew surveys. We have exploited a Bayesian cross-matching algorithm, guided by priors based on mid-IR colour-magnitude information from the Wide-field Infrared Survey Explorer survey, to select the most probable optical counterpart to each X-ray detection. We empirically demonstrate the high fidelity of our counterpart selection method using a reference sample of bright well-localized X-ray sources collated from XMM-Newton, Chandra and Swift-XRT serendipitous catalogues, and also by examining blank-sky locations. We describe the down-selection steps which resulted in the final set of SPIDERS-AGN targets put forward for spectroscopy within the eBOSS/TDSS/SPIDERS survey, and present catalogues of these targets. We also present catalogues of ˜12 000 ROSAT and ˜1500 XMM-Newton Slew survey sources that have existing optical spectroscopy from SDSS-DR12, including the results of our visual inspections. On completion of the SPIDERS programme, we expect to have collected homogeneous spectroscopic redshift information over a footprint of ˜7500 deg2 for >85 per cent of the ROSAT and XMM-Newton Slew survey sources having optical counterparts in the magnitude range 17 < r < 22.5, producing a large and highly complete sample of bright X-ray-selected AGN suitable for statistical studies of AGN evolution and clustering.

  18. Design of a multi-purpose fragment screening library using molecular complexity and orthogonal diversity metrics

    NASA Astrophysics Data System (ADS)

    Lau, Wan F.; Withka, Jane M.; Hepworth, David; Magee, Thomas V.; Du, Yuhua J.; Bakken, Gregory A.; Miller, Michael D.; Hendsch, Zachary S.; Thanabal, Venkataraman; Kolodziej, Steve A.; Xing, Li; Hu, Qiyue; Narasimhan, Lakshmi S.; Love, Robert; Charlton, Maura E.; Hughes, Samantha; van Hoorn, Willem P.; Mills, James E.

    2011-07-01

    Fragment Based Drug Discovery (FBDD) continues to advance as an efficient and alternative screening paradigm for the identification and optimization of novel chemical matter. To enable FBDD across a wide range of pharmaceutical targets, a fragment screening library is required to be chemically diverse and synthetically expandable to enable critical decision making for chemical follow-up and assessing new target druggability. In this manuscript, the Pfizer fragment library design strategy which utilized multiple and orthogonal metrics to incorporate structure, pharmacophore and pharmacological space diversity is described. Appropriate measures of molecular complexity were also employed to maximize the probability of detection of fragment hits using a variety of biophysical and biochemical screening methods. In addition, structural integrity, purity, solubility, fragment and analog availability as well as cost were important considerations in the selection process. Preliminary analysis of primary screening results for 13 targets using NMR Saturation Transfer Difference (STD) indicates the identification of μM-mM hits and the uniqueness of hits at weak binding affinities for these targets.

  19. Design of a multi-purpose fragment screening library using molecular complexity and orthogonal diversity metrics.

    PubMed

    Lau, Wan F; Withka, Jane M; Hepworth, David; Magee, Thomas V; Du, Yuhua J; Bakken, Gregory A; Miller, Michael D; Hendsch, Zachary S; Thanabal, Venkataraman; Kolodziej, Steve A; Xing, Li; Hu, Qiyue; Narasimhan, Lakshmi S; Love, Robert; Charlton, Maura E; Hughes, Samantha; van Hoorn, Willem P; Mills, James E

    2011-07-01

    Fragment Based Drug Discovery (FBDD) continues to advance as an efficient and alternative screening paradigm for the identification and optimization of novel chemical matter. To enable FBDD across a wide range of pharmaceutical targets, a fragment screening library is required to be chemically diverse and synthetically expandable to enable critical decision making for chemical follow-up and assessing new target druggability. In this manuscript, the Pfizer fragment library design strategy which utilized multiple and orthogonal metrics to incorporate structure, pharmacophore and pharmacological space diversity is described. Appropriate measures of molecular complexity were also employed to maximize the probability of detection of fragment hits using a variety of biophysical and biochemical screening methods. In addition, structural integrity, purity, solubility, fragment and analog availability as well as cost were important considerations in the selection process. Preliminary analysis of primary screening results for 13 targets using NMR Saturation Transfer Difference (STD) indicates the identification of μM-mM hits and the uniqueness of hits at weak binding affinities for these targets.

  20. The striking similarities between standard, distractor-free, and target-free recognition

    PubMed Central

    Dobbins, Ian G.

    2012-01-01

    It is often assumed that observers seek to maximize correct responding during recognition testing by actively adjusting a decision criterion. However, early research by Wallace (Journal of Experimental Psychology: Human Learning and Memory 4:441–452, 1978) suggested that recognition rates for studied items remained similar, regardless of whether or not the tests contained distractor items. We extended these findings across three experiments, addressing whether detection rates or observer confidence changed when participants were presented standard tests (targets and distractors) versus “pure-list” tests (lists composed entirely of targets or distractors). Even when observers were made aware of the composition of the pure-list test, the endorsement rates and confidence patterns remained largely similar to those observed during standard testing, suggesting that observers are typically not striving to maximize the likelihood of success across the test. We discuss the implications for decision models that assume a likelihood ratio versus a strength decision axis, as well as the implications for prior findings demonstrating large criterion shifts using target probability manipulations. PMID:21476108

  1. Application of Probability of Crack Detection to Aircraft Systems Reliability.

    DOT National Transportation Integrated Search

    1993-08-31

    This report describes three tasks related to probability of crack detection (POD) and aircraft systems reliability. All three consider previous work in which crack growth simulations and crack detection data in the Service Difficulty Report (SDR) data...

  2. Automated target recognition using passive radar and coordinated flight models

    NASA Astrophysics Data System (ADS)

    Ehrman, Lisa M.; Lanterman, Aaron D.

    2003-09-01

    Rather than emitting pulses, passive radar systems rely on illuminators of opportunity, such as TV and FM radio, to illuminate potential targets. These systems are particularly attractive since they allow receivers to operate without emitting energy, rendering them covert. Many existing passive radar systems estimate the locations and velocities of targets. This paper focuses on adding an automatic target recognition (ATR) component to such systems. Our approach to ATR compares the Radar Cross Section (RCS) of targets detected by a passive radar system to the simulated RCS of known targets. To make the comparison as accurate as possible, the received signal model accounts for aircraft position and orientation, propagation losses, and antenna gain patterns. The estimated positions become inputs for an algorithm that uses a coordinated flight model to compute probable aircraft orientation angles. The Fast Illinois Solver Code (FISC) simulates the RCS of several potential target classes as they execute the estimated maneuvers. The RCS is then scaled by the Advanced Refractive Effects Prediction System (AREPS) code to account for propagation losses that occur as functions of altitude and range. The Numerical Electromagnetic Code (NEC2) computes the antenna gain pattern, so that the RCS can be further scaled. The Rician model compares the RCS of the illuminated aircraft with those of the potential targets. This comparison results in target identification.

  3. Interference Information Based Power Control for Cognitive Radio with Multi-Hop Cooperative Sensing

    NASA Astrophysics Data System (ADS)

    Yu, Youngjin; Murata, Hidekazu; Yamamoto, Koji; Yoshida, Susumu

    Reliable detection of other radio systems is crucial for systems that share the same frequency band. In wireless communication channels, there is uncertainty in the received signal level due to multipath fading and shadowing. Cooperative sensing techniques in which radio stations share their sensing information can improve the detection probability of other systems. In this paper, a new cooperative sensing scheme that reduces the false detection probability while maintaining the outage probability of other systems is investigated. In the proposed system, sensing information is collected using multi-hop transmission from all sensing stations that detect other systems, and transmission decisions are based on the received sensing information. The proposed system also controls the transmit power based on the received CINRs from the sensing stations. Simulation results reveal that the proposed system can reduce the outage probability of other systems, or improve its link success probability.
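
    A minimal sketch of why sharing sensing information helps, assuming the simplest OR-rule fusion (the record's scheme is more elaborate, adding multi-hop collection and power control): the system misses the other radio only if every station misses it, so independent per-station detection probabilities combine multiplicatively.

```python
# OR-rule fusion of independent per-station detection probabilities.
def or_fusion_pd(p_list):
    miss = 1.0
    for p in p_list:
        miss *= (1.0 - p)   # all stations must miss simultaneously
    return 1.0 - miss

# Three stations with fade-limited individual detection probabilities:
print(or_fusion_pd([0.6, 0.5, 0.4]))   # ≈ 0.88, well above any single station
```

    Fading and shadowing are roughly independent across spatially separated stations, which is what makes the independence assumption behind this combination plausible.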

  4. Probability of Detection of Genotyping Errors and Mutations as Inheritance Inconsistencies in Nuclear-Family Data

    PubMed Central

    Douglas, Julie A.; Skol, Andrew D.; Boehnke, Michael

    2002-01-01

    Gene-mapping studies routinely rely on checking for Mendelian transmission of marker alleles in a pedigree, as a means of screening for genotyping errors and mutations, with the implicit assumption that, if a pedigree is consistent with Mendel’s laws of inheritance, then there are no genotyping errors. However, the occurrence of inheritance inconsistencies alone is an inadequate measure of the number of genotyping errors, since the rate of occurrence depends on the number and relationships of genotyped pedigree members, the type of errors, and the distribution of marker-allele frequencies. In this article, we calculate the expected probability of detection of a genotyping error or mutation as an inheritance inconsistency in nuclear-family data, as a function of both the number of genotyped parents and offspring and the marker-allele frequency distribution. Through computer simulation, we explore the sensitivity of our analytic calculations to the underlying error model. Under a random-allele–error model, we find that detection rates are 51%–77% for multiallelic markers and 13%–75% for biallelic markers; detection rates are generally lower when the error occurs in a parent than in an offspring, unless a large number of offspring are genotyped. Errors are especially difficult to detect for biallelic markers with equally frequent alleles, even when both parents are genotyped; in this case, the maximum detection rate is 34% for four-person nuclear families. Error detection in families in which parents are not genotyped is limited, even with multiallelic markers. Given these results, we recommend that additional error checking (e.g., on the basis of multipoint analysis) be performed, beyond routine checking for Mendelian consistency. 
Furthermore, our results permit assessment of the plausibility of an observed number of inheritance inconsistencies for a family, allowing the detection of likely pedigree—rather than genotyping—errors in the early stages of a genome scan. Such early assessments are valuable in either the targeting of families for resampling or discontinued genotyping. PMID:11791214
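
    The central quantity in this record, the chance that a single genotyping error surfaces as a Mendelian inconsistency, can be illustrated by simulation. The sketch below uses a biallelic marker with equally frequent alleles in a parent-parent-child trio and a simple "flip one allele" error model; this differs from the paper's random-allele-error model and family sizes, so the numbers are illustrative only.

```python
import random

random.seed(2)

def consistent(father, mother, child):
    """Trio is Mendelian-consistent if the child's two alleles can be
    split so that one comes from each parent."""
    for c1, c2 in (child, child[::-1]):
        if c1 in father and c2 in mother:
            return True
    return False

def draw_genotype(q):
    """Hardy-Weinberg genotype for allele frequency q of allele 'A'."""
    return tuple('A' if random.random() < q else 'a' for _ in range(2))

q, trials, detected = 0.5, 100_000, 0
for _ in range(trials):
    fa, mo = draw_genotype(q), draw_genotype(q)
    child = [random.choice(fa), random.choice(mo)]
    i = random.randrange(2)                     # introduce one error:
    child[i] = 'A' if child[i] == 'a' else 'a'  # flip a random child allele
    if not consistent(fa, mo, tuple(child)):
        detected += 1
print(detected / trials)   # ≈ 0.375: most error configurations stay hidden
```

    Even with both parents typed, a majority of single-allele errors remain Mendelian-consistent here, echoing the paper's recommendation to add checks beyond simple consistency screening.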

  5. MOUNT NAOMI ROADLESS AREA, UTAH AND IDAHO.

    USGS Publications Warehouse

    Dover, James H.; Bigsby, Philip R.

    1984-01-01

    Geologic, geophysical, and geochemical surveys, and an examination of mines and prospects were made in the Mount Naomi Roadless Area, Utah and Idaho. No significant precious-metal, base-metal, other trace-metal, or uranium anomalies are apparent in the geochemical data from the Mount Naomi Roadless Area, and no exploration targets were detected. However, a belt of probable resource potential for stratabound copper, lead, and zinc occurrences exists on the west side of the area in limestone and shale. The possibility that oil and gas concentrations lie deeply buried beneath the roadless area cannot be evaluated from available data.

  6. Identification of (antioxidative) plants in herbal pharmaceutical preparations and dietary supplements.

    PubMed

    Deconinck, Eric; Custers, Deborah; De Beer, Jacques Omer

    2015-01-01

    The standard procedures for the identification, authentication, and quality control of medicinal plants and herbs are nowadays limited to pure herbal products. No guidelines or procedures, describing the detection or identification of a targeted plant or herb in pharmaceutical preparations or dietary supplements, can be found. In these products the targeted plant is often present together with other components of herbal or synthetic origin. This chapter describes a strategy for the fast development of a chromatographic fingerprint approach that allows the identification of a targeted plant in herbal preparations and dietary supplements. The strategy consists of a standard chromatographic gradient that is tested for the targeted plant with different extraction solvents and different mobile phases. From the results obtained, the optimal fingerprint is selected. Subsequently the samples are analyzed according to the selected methodological parameters, and the obtained fingerprints can be compared with the one obtained for the pure herbal product or a standard preparation. Calculation of the dissimilarity between these fingerprints will result in a probability of presence of the targeted plant. Optionally mass spectrometry can be used to improve specificity, to confirm identification, or to identify molecules with a potential medicinal or antioxidant activity.

  7. Performance metrics for state-of-the-art airborne magnetic and electromagnetic systems for mapping and detection of unexploded ordnance

    NASA Astrophysics Data System (ADS)

    Doll, William E.; Bell, David T.; Gamey, T. Jeffrey; Beard, Les P.; Sheehan, Jacob R.; Norton, Jeannemarie

    2010-04-01

    Over the past decade, notable progress has been made in the performance of airborne geophysical systems for mapping and detection of unexploded ordnance in terrestrial and shallow marine environments. For magnetometer systems, the most significant improvements include development of denser magnetometer arrays and vertical gradiometer configurations. In prototype analyses and recent Environmental Security Technology Certification Program (ESTCP) assessments using new production systems the greatest sensitivity has been achieved with a vertical gradiometer configuration, despite model-based survey design results which suggest that dense total-field arrays would be superior. As effective as magnetometer systems have proven to be at many sites, they are inadequate at sites where basalts and other ferrous geologic formations or soils produce anomalies that approach or exceed those of target ordnance items. Additionally, magnetometer systems are ineffective where detection of non-ferrous ordnance items is of primary concern. Recent completion of the Battelle TEM-8 airborne time-domain electromagnetic system represents the culmination of nearly nine years of assessment and development of airborne electromagnetic systems for UXO mapping and detection. A recent ESTCP demonstration of this system in New Mexico showed that it was able to detect 99% of blind-seeded ordnance items, 81 mm and larger, and that it could be used to map in detail a bombing target on a basalt flow where previous airborne magnetometer surveys had failed. The probability of detection for the TEM-8 in the blind-seeded study area was better than that reported for a dense-array total-field magnetometer demonstration of the same blind-seeded site, and the TEM-8 system successfully detected these items with less than half as many anomaly picks as the dense-array total-field magnetometer system.

  8. Assessing bat detectability and occupancy with multiple automated echolocation detectors

    Treesearch

    Marcos P. Gorresen; Adam C. Miles; Christopher M. Todd; Frank J. Bonaccorso; Theodore J. Weller

    2008-01-01

    Occupancy analysis and its ability to account for differential detection probabilities is important for studies in which detecting echolocation calls is used as a measure of bat occurrence and activity. We examined the feasibility of remotely acquiring bat encounter histories to estimate detection probability and occupancy. We used echolocation detectors coupled o...
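
    The occupancy-model machinery behind encounter-history analyses of this kind can be sketched with the standard single-season likelihood for one site, where ψ is the occupancy probability, p the per-visit detection probability, and a history such as "0101" records detections over repeated detector-nights. The parameter values below are invented.

```python
# Single-site, single-season occupancy likelihood (MacKenzie-style model).
def history_likelihood(history, psi, p):
    k = len(history)
    d = history.count('1')
    if d > 0:   # site must be occupied; detected on d of k visits
        return psi * (p ** d) * ((1 - p) ** (k - d))
    # all-zero history: occupied-but-never-detected, or truly unoccupied
    return psi * (1 - p) ** k + (1 - psi)

print(round(history_likelihood("0101", psi=0.8, p=0.5), 4))   # 0.05
print(round(history_likelihood("0000", psi=0.8, p=0.5), 4))   # 0.25
```

    Fitting maximizes the product of such terms over sites, which is how the model separates true absence from non-detection, the distinction this record emphasizes.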

  9. Spectral methods to detect surface mines

    NASA Astrophysics Data System (ADS)

    Winter, Edwin M.; Schatten Silvious, Miranda

    2008-04-01

    Over the past five years, advances have been made in the spectral detection of surface mines under minefield detection programs at the U. S. Army RDECOM CERDEC Night Vision and Electronic Sensors Directorate (NVESD). The problem of detecting surface land mines ranges from the relatively simple, the detection of large anti-vehicle mines on bare soil, to the very difficult, the detection of anti-personnel mines in thick vegetation. While spatial and spectral approaches can be applied to the detection of surface mines, spatial-only detection requires many pixels-on-target such that the mine is actually imaged and shape-based features can be exploited. This method is unreliable in vegetated areas because only part of the mine may be exposed, while spectral detection is possible without the mine being resolved. At NVESD, hyperspectral and multi-spectral sensors throughout the reflection and thermal spectral regimes have been applied to the mine detection problem. Data has been collected on mines in forest and desert regions and algorithms have been developed both to detect the mines as anomalies and to detect the mines based on their spectral signature. In addition to the detection of individual mines, algorithms have been developed to exploit the similarities of mines in a minefield to improve their detection probability. In this paper, the types of spectral data collected over the past five years will be summarized along with the advances in algorithm development.

  10. Point count length and detection of forest neotropical migrant birds

    USGS Publications Warehouse

    Dawson, D.K.; Smith, D.R.; Robbins, C.S.; Ralph, C. John; Sauer, John R.; Droege, Sam

    1995-01-01

    Comparisons of bird abundances among years or among habitats assume that the rates at which birds are detected and counted are constant within species. We use point count data collected in forests of the Mid-Atlantic states to estimate detection probabilities for Neotropical migrant bird species as a function of count length. For some species, significant differences existed among years or observers in both the probability of detecting the species and in the rate at which individuals are counted. We demonstrate the consequence that variability in species' detection probabilities can have on estimates of population change, and discuss ways for reducing this source of bias in point count studies.
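
    One simple way to see how count length interacts with detection probability: if detections of a species arrive as a Poisson process with a constant per-minute rate r (a modeling assumption for illustration, not the paper's estimator), the probability of at least one detection in a t-minute count is 1 - exp(-rt).

```python
import math

# Probability of detecting a species at least once in a t-minute count,
# assuming Poisson detections at rate r per minute (illustrative values).
def p_detect(r_per_min, minutes):
    return 1.0 - math.exp(-r_per_min * minutes)

for t in (3, 5, 10):
    print(t, round(p_detect(0.2, t), 3))
```

    Because the curve saturates, species with high detection rates gain little from longer counts while quiet species gain a lot, which is why variability in species-specific rates biases comparisons across habitats or years.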

  11. BP network identification technology of infrared polarization based on fuzzy c-means clustering

    NASA Astrophysics Data System (ADS)

    Zeng, Haifang; Gu, Guohua; He, Weiji; Chen, Qian; Yang, Wei

    2011-08-01

    Infrared detection systems are frequently employed in surveillance and reconnaissance missions to detect particular targets of interest in both civilian and military settings. By incorporating the polarization of light as supplementary information, target discrimination performance can be enhanced. This paper therefore proposes an infrared target identification method based on fuzzy theory and a neural network, using the polarization properties of targets. It uses degree of polarization and light intensity in an improved unsupervised KFCM (kernel fuzzy c-means) clustering method, and builds a database of polarization properties for different materials. Given any input degree of polarization, such as 10°, 15°, 20°, 25°, or 30°, the trained network outputs a probability distribution over the corresponding material types. KFCM, which is more robust and accurate than FCM, introduces the kernel idea and assigns noise points and invalid values different but intuitively reasonable weights. Because materials are characterized by different properties, classification results can conflict, so Dempster-Shafer (D-S) evidence theory was used to combine the polarization and intensity information. Results show that the clustering precision and speed of KFCM are higher than those of FCM. The artificial neural network method realizes material identification, reasonably handling the complexity of environmental information in infrared polarization and the incompleteness of background knowledge and inference rules. This polarization identification method is fast, self-adaptive, and high in resolution.
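
    A minimal (non-kernel) fuzzy c-means sketch of the clustering step, on synthetic (intensity, degree-of-polarization) features; the kernel extension, the neural network, and the D-S fusion are omitted, and all numbers are invented.

```python
import numpy as np

rng = np.random.default_rng(4)
# Two synthetic material classes in (intensity, degree-of-polarization) space.
X = np.vstack([rng.normal([0.2, 0.1], 0.05, (50, 2)),
               rng.normal([0.8, 0.6], 0.05, (50, 2))])

c, m = 2, 2.0                                    # clusters, fuzzifier
U = rng.random((X.shape[0], c))
U /= U.sum(axis=1, keepdims=True)                # fuzzy memberships
for _ in range(100):
    W = U ** m
    centers = (W.T @ X) / W.sum(axis=0)[:, None] # weighted cluster centres
    d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
    inv = 1.0 / d ** 2                           # FCM membership update, m = 2
    U = inv / inv.sum(axis=1, keepdims=True)

print(np.round(centers[np.argsort(centers[:, 0])], 2))  # near the class means
```

    KFCM replaces the Euclidean distance `d` with a kernel-induced distance, which is what gives noise points and outliers the down-weighted memberships the abstract describes.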

  12. Battlefield decision aid for acoustical ground sensors with interface to meteorological data sources

    NASA Astrophysics Data System (ADS)

    Wilson, D. Keith; Noble, John M.; VanAartsen, Bruce H.; Szeto, Gregory L.

    2001-08-01

    The performance of acoustical ground sensors depends heavily on the local atmospheric and terrain conditions. This paper describes a prototype physics-based decision aid, called the Acoustic Battlefield Aid (ABFA), for predicting these environmental effects. ABFA integrates advanced models for acoustic propagation, atmospheric structure, and array signal processing into a convenient graphical user interface. The propagation calculations are performed in the frequency domain on user-definable target spectra. The solution method involves a parabolic approximation to the wave equation combined with a terrain diffraction model. Sensor performance is characterized with Cramer-Rao lower bounds (CRLBs). The CRLB calculations include randomization of signal energy and wavefront orientation resulting from atmospheric turbulence. Available performance characterizations include signal-to-noise ratio, probability of detection, direction-finding accuracy for isolated receiving arrays, and location-finding accuracy for networked receiving arrays. A suite of integrated tools allows users to create new target descriptions from standard digitized audio files and to design new sensor array layouts. These tools optionally interface with the ARL Database/Automatic Target Recognition (ATR) Laboratory, providing access to an extensive library of target signatures. ABFA also includes a Java-based capability for network access of near real-time data from surface weather stations or forecasts from the Army's Integrated Meteorological System. As an example, the detection footprint of an acoustical sensor, as it evolves over a 13-hour period, is calculated.

  13. Treatment planning for prostate focal laser ablation in the face of needle placement uncertainty

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cepek, Jeremy, E-mail: jcepek@robarts.ca; Fenster, Aaron; Lindner, Uri

    2014-01-15

    Purpose: To study the effect of needle placement uncertainty on the expected probability of achieving complete focal target destruction in focal laser ablation (FLA) of prostate cancer. Methods: Using a simplified model of prostate cancer focal target and focal laser ablation region shapes, Monte Carlo simulations of needle placement error were performed to estimate the probability of completely ablating a region of target tissue. Results: Graphs of the probability of complete focal target ablation are presented over clinically relevant ranges of focal target sizes and shapes, ablation region sizes, and levels of needle placement uncertainty. In addition, a table is provided for estimating the maximum target size that is treatable. The results predict that targets whose length is at least 5 mm smaller than the diameter of each ablation region can be confidently ablated using, at most, four laser fibers if the standard deviation in each component of needle placement error is less than 3 mm. However, targets larger than this (i.e., near to or exceeding the diameter of each ablation region) require more careful planning. This process is facilitated by using the table provided. Conclusions: The probability of completely ablating a focal target using FLA is sensitive to the level of needle placement uncertainty, especially as the target length approaches and becomes greater than the diameter of ablated tissue that each individual laser fiber can achieve. The results of this work can be used to help determine individual patient eligibility for prostate FLA, to guide the planning of prostate FLA, and to quantify the clinical benefit of using advanced systems for accurate needle delivery for this treatment modality.
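
    A one-dimensional toy version of the Monte Carlo analysis this record describes: a single fibre aims at the centre of a target of length L and ablates a region of diameter D, with Gaussian placement error of standard deviation sigma. The geometry is drastically simplified and all values are illustrative, but the sensitivity to sigma mirrors the paper's conclusion.

```python
import numpy as np

rng = np.random.default_rng(3)
L, D, sigma = 10.0, 15.0, 3.0           # target length, ablation diameter, s.d. (mm)
err = rng.normal(0.0, sigma, 1_000_000) # axial needle placement error
# Complete ablation iff the ablation region still covers the whole target:
complete = np.abs(err) <= (D - L) / 2
print(complete.mean())                  # ≈ 0.595, i.e. P(|N(0,3)| <= 2.5)
```

    With D - L = 5 mm of margin, even sigma = 3 mm leaves a substantial failure probability for a single fibre, which is why the paper's planning uses multiple fibres and tabulated maximum treatable target sizes.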

  14. Evolution and Optimality of Similar Neural Mechanisms for Perception and Action during Search

    PubMed Central

    Zhang, Sheng; Eckstein, Miguel P.

    2010-01-01

A prevailing theory proposes that the brain's two visual pathways, the ventral and dorsal, lead to visual processing and world representations for conscious perception that differ from those for action. Others have claimed that perception and action share much of their visual processing. But which of these two neural architectures is favored by evolution? Successful visual search is life-critical, and here we investigate the evolution and optimality of neural mechanisms mediating perception and eye movement actions for visual search in natural images. We implement an approximation to the ideal Bayesian searcher with two separate processing streams, one controlling the eye movements and the other determining the perceptual search decisions. We virtually evolved the neural mechanisms of the searchers' two separate pathways, built from linear combinations of primary visual cortex (V1) receptive fields, by making the simulated individuals' probability of survival depend on their perceptual accuracy in finding targets in cluttered backgrounds. We find that for a variety of targets, backgrounds, and dependence of target detectability on retinal eccentricity, the mechanisms of the searchers' two processing streams converge to similar representations, showing that mismatches in the mechanisms for perception and eye movements lead to suboptimal search. Three exceptions that resulted in partial or no convergence were an organism for which the targets are equally detectable across the retina, an organism with sufficient time to foveate all possible target locations, and a strict two-pathway model with no interconnections and differential pre-filtering based on parvocellular and magnocellular lateral geniculate cell properties.
Thus, similar neural mechanisms for perception and eye movement actions during search are optimal and should be expected from the effects of natural selection on an organism with limited time to search for food that is not equi-detectable across its retina and interconnected perception and action neural pathways. PMID:20838589

  15. Using fingerprint image quality to improve the identification performance of the U.S. Visitor and Immigrant Status Indicator Technology Program

    PubMed Central

    Wein, Lawrence M.; Baveja, Manas

    2005-01-01

Motivated by the difficulty biometric systems have in correctly matching fingerprints with poor image quality, we formulate and solve a game-theoretic model of the identification problem in two settings: U.S. visa applicants are checked against a list of visa holders to detect visa fraud, and visitors entering the U.S. are checked against a watchlist of criminals and suspected terrorists. For three types of biometric strategies, we solve the game in which the U.S. Government chooses the strategy's optimal parameter values to maximize the detection probability subject to a constraint on the mean biometric processing time per legal visitor, and then the terrorist chooses the image quality to minimize the detection probability. At current inspector staffing levels at ports of entry, our model predicts that a quality-dependent two-finger strategy achieves a detection probability of 0.733, compared to 0.526 under the quality-independent two-finger strategy that is currently implemented at the U.S. border. Increasing the staffing level of inspectors offers only minor increases in the detection probability for these two strategies. Using more than two fingers to match visitors with poor image quality allows a detection probability of 0.949 under current staffing levels, but may require major changes to the current U.S. biometric program. The detection probabilities during visa application are ≈11–22% smaller than at ports of entry for all three strategies, but the same qualitative conclusions hold. PMID:15894628

  16. Detection probability of cliff-nesting raptors during helicopter and fixed-wing aircraft surveys in western Alaska

    USGS Publications Warehouse

    Booms, T.L.; Schempf, P.F.; McCaffery, B.J.; Lindberg, M.S.; Fuller, M.R.

    2010-01-01

We conducted repeated aerial surveys for breeding cliff-nesting raptors on the Yukon Delta National Wildlife Refuge (YDNWR) in western Alaska to estimate detection probabilities of Gyrfalcons (Falco rusticolus), Golden Eagles (Aquila chrysaetos), Rough-legged Hawks (Buteo lagopus), and Common Ravens (Corvus corax). Using the program PRESENCE, we modeled detection histories of each species based on single-species occupancy modeling. We used different observers during four helicopter replicate surveys in the Kilbuck Mountains and five fixed-wing replicate surveys in the Ingakslugwat Hills near Bethel, AK. During helicopter surveys, Gyrfalcons had the highest detection probability estimate (p̂ = 0.79; SE = 0.05), followed by Golden Eagles (p̂ = 0.68; SE = 0.05), Common Ravens (p̂ = 0.45; SE = 0.17), and Rough-legged Hawks (p̂ = 0.10; SE = 0.11). Detection probabilities from fixed-wing aircraft in the Ingakslugwat Hills were similar to those from the helicopter in the Kilbuck Mountains for Gyrfalcons and Golden Eagles, but were higher for Common Ravens (p̂ = 0.85; SE = 0.06) and Rough-legged Hawks (p̂ = 0.42; SE = 0.07). Fixed-wing aircraft provided detection probability estimates and SEs in the Ingakslugwat Hills similar to or better than those from helicopter surveys in the Kilbucks and should be considered for future cliff-nesting raptor surveys where safe, low-altitude flight is possible. Overall, detection probability varied by observer experience and, in some cases, by study area/aircraft type.
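The occupancy-modelling step above (replicate detection histories → detection probability p̂) can be illustrated with a brute-force maximum-likelihood fit of the standard single-season occupancy model. The histories, grid resolution, and function name below are made-up illustrations, not output of the program PRESENCE or data from this study.

```python
import math
from itertools import product

def occupancy_mle(histories):
    """Brute-force MLE for the single-season occupancy model: `histories` is
    a list of per-site detection histories (tuples of 0/1 over replicate
    surveys). Returns (psi_hat, p_hat): estimated occupancy probability and
    per-survey detection probability."""
    K = len(histories[0])
    best, best_ll = None, -math.inf
    grid = [i / 100 for i in range(1, 100)]       # 0.01 .. 0.99
    for psi, p in product(grid, grid):
        ll = 0.0
        for h in histories:
            d = sum(h)
            # Site is either occupied (detected d times out of K) or, if
            # never detected, possibly unoccupied.
            occ = psi * p**d * (1 - p)**(K - d)
            ll += math.log(occ + (1 - psi) * (d == 0))
        if ll > best_ll:
            best_ll, best = ll, (psi, p)
    return best

# Toy data: 6 sites, 4 replicate surveys each; two sites never detected.
hist = [(1,0,1,1), (0,1,1,0), (1,1,0,1), (0,0,1,1), (0,0,0,0), (0,0,0,0)]
psi_hat, p_hat = occupancy_mle(hist)
print(psi_hat, p_hat)
```

A dedicated optimizer (as used inside PRESENCE-style software) replaces the grid search in practice; the grid keeps the likelihood structure visible.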

  17. Using fingerprint image quality to improve the identification performance of the U.S. Visitor and Immigrant Status Indicator Technology Program.

    PubMed

    Wein, Lawrence M; Baveja, Manas

    2005-05-24

Motivated by the difficulty biometric systems have in correctly matching fingerprints with poor image quality, we formulate and solve a game-theoretic model of the identification problem in two settings: U.S. visa applicants are checked against a list of visa holders to detect visa fraud, and visitors entering the U.S. are checked against a watchlist of criminals and suspected terrorists. For three types of biometric strategies, we solve the game in which the U.S. Government chooses the strategy's optimal parameter values to maximize the detection probability subject to a constraint on the mean biometric processing time per legal visitor, and then the terrorist chooses the image quality to minimize the detection probability. At current inspector staffing levels at ports of entry, our model predicts that a quality-dependent two-finger strategy achieves a detection probability of 0.733, compared to 0.526 under the quality-independent two-finger strategy that is currently implemented at the U.S. border. Increasing the staffing level of inspectors offers only minor increases in the detection probability for these two strategies. Using more than two fingers to match visitors with poor image quality allows a detection probability of 0.949 under current staffing levels, but may require major changes to the current U.S. biometric program. The detection probabilities during visa application are approximately 11-22% smaller than at ports of entry for all three strategies, but the same qualitative conclusions hold.
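The leader–follower structure of the game (the government commits to strategy parameters under a processing-time constraint; the adversary then best-responds with image quality) can be sketched with a toy maximin search. Everything below — the functional forms, the time budget, and the numbers — is invented purely for illustration and bears no relation to the paper's calibrated model.

```python
def detection_prob(threshold, quality):
    """Toy model (assumption): a lower match threshold catches more, but poor
    image quality degrades the matcher. Numbers are illustrative only."""
    return max(0.0, min(1.0, quality * (1.2 - threshold)))

def processing_time(threshold):
    """Toy model: stricter (lower) thresholds trigger more secondary checks."""
    return 2.0 + (1.0 - threshold) * 3.0   # seconds per legal visitor

thresholds = [t / 20 for t in range(1, 20)]    # government's choice set
qualities = [q / 10 for q in range(1, 11)]     # adversary's choice set
TIME_BUDGET = 4.0                              # mean-processing-time constraint

# Government maximizes the worst case over adversary qualities, subject to
# the processing-time constraint; the adversary minimizes detection.
worst_case_pd, t_star = max(
    (min(detection_prob(t, q) for q in qualities), t)
    for t in thresholds if processing_time(t) <= TIME_BUDGET)
print(t_star, worst_case_pd)
```

The nested min/max mirrors the abstract's order of play: the inner `min` is the terrorist choosing image quality; the outer `max` is the government's constrained parameter choice.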

  18. A Comparative Study of Automated Infrasound Detectors - PMCC and AFD with Analyst Review.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Park, Junghyun; Hayward, Chris; Zeiler, Cleat

Automated detections calculated by the progressive multi-channel correlation (PMCC) method (Cansi, 1995) and the adaptive F detector (AFD) (Arrowsmith et al., 2009) are compared to the signals identified by five independent analysts. Each detector was applied to a four-hour time sequence recorded by the Korean infrasound array CHNAR. This array was used because it is composed of both small (<100 m) and large (~1000 m) aperture element spacings. The four-hour time sequence contained a number of easily identified signals under noise conditions whose average RMS amplitudes varied from 1.2 to 4.5 mPa (1 to 5 Hz), estimated with a running five-minute window. The effectiveness of the detectors was estimated for the small aperture, the large aperture, the small aperture combined with the large aperture, and the full array. The full and combined arrays performed the best for AFD under all noise conditions, while the large aperture array had the poorest performance for both detectors. PMCC produced results similar to AFD under the lower noise conditions, but did not produce as dramatic an increase in detections using the full and combined arrays. Both automated detectors and the analysts produced fewer detections under the higher noise conditions. Comparing the detection probabilities with Estimated Receiver Operating Characteristic (EROC) curves, we found that the smaller value of consistency for PMCC and the larger p-value for AFD had the highest detection probability. These parameters produced greater changes in detection probability than estimates of the false alarm rate. Detection probability was impacted most by noise level, with low noise (average RMS amplitude of 1.7 mPa) having an average detection probability of ~40% and high noise (average RMS amplitude of 2.9 mPa) an average detection probability of ~23%.
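ROC-style comparisons like the EROC curves above reduce to one generic computation: sweep a decision threshold over detector scores and tabulate (false-alarm rate, detection probability) pairs. A minimal sketch with made-up scores (not this study's data):

```python
def roc_points(noise_scores, signal_scores):
    """Empirical ROC curve: sweep a threshold over all observed detector
    scores and return (false_alarm_rate, detection_prob) pairs, ordered from
    the strictest threshold to the loosest."""
    thresholds = sorted(set(noise_scores) | set(signal_scores), reverse=True)
    pts = []
    for t in thresholds:
        pfa = sum(s >= t for s in noise_scores) / len(noise_scores)
        pd = sum(s >= t for s in signal_scores) / len(signal_scores)
        pts.append((pfa, pd))
    return pts

# Hypothetical detector scores on noise-only and signal-bearing windows.
noise = [0.1, 0.4, 0.35, 0.8]
signal = [0.9, 0.8, 0.6, 0.55]
for pfa, pd in roc_points(noise, signal):
    print(pfa, pd)
```

Plotting one such curve per detector and noise condition gives the kind of side-by-side comparison the abstract describes.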

  19. Modelling detectability of kiore (Rattus exulans) on Aguiguan, Mariana Islands, to inform possible eradication and monitoring efforts

    USGS Publications Warehouse

    Adams, A.A.Y.; Stanford, J.W.; Wiewel, A.S.; Rodda, G.H.

    2011-01-01

Estimating the detection probability of introduced organisms during the pre-monitoring phase of an eradication effort can be extremely helpful in informing eradication and post-eradication monitoring efforts, but this step is rarely taken. We used data collected during 11 nights of mark-recapture sampling on Aguiguan, Mariana Islands, to estimate introduced kiore (Rattus exulans Peale) density and detection probability, and evaluated factors affecting detectability to help inform possible eradication efforts. Modelling of 62 captures of 48 individuals resulted in a model-averaged density estimate of 55 kiore/ha. Kiore detection probability was best explained by a model allowing neophobia to diminish linearly (i.e. capture probability increased linearly) until occasion 7, with additive effects of sex and cumulative rainfall over the prior 48 hours. Detection probability increased with increasing rainfall, and females were up to three times more likely than males to be trapped. In this paper, we illustrate the type of information that can be obtained by modelling mark-recapture data collected during pre-eradication monitoring and discuss the potential of using these data to inform eradication and post-eradication monitoring efforts. © New Zealand Ecological Society.
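The best-supported detection model above (a linear neophobia effect that levels off at occasion 7, plus additive sex and rainfall effects) has the shape of a standard logistic capture-probability model. A sketch under assumed, made-up coefficients — these are not the paper's estimates:

```python
import math

def capture_prob(occasion, is_female, rainfall_mm,
                 b0=-2.0, b_occ=0.25, b_sex=1.0, b_rain=0.02):
    """Illustrative logistic model of per-night capture probability.
    Coefficients are invented for illustration: neophobia wanes linearly
    until occasion 7 (then levels off), females are more trappable, and
    recent rainfall increases capture probability."""
    occ = min(occasion, 7)            # linear rise levels off at occasion 7
    eta = b0 + b_occ * occ + b_sex * is_female + b_rain * rainfall_mm
    return 1.0 / (1.0 + math.exp(-eta))

for night in (1, 4, 7, 10):
    print(night,
          round(capture_prob(night, True, 20.0), 3),    # female
          round(capture_prob(night, False, 20.0), 3))   # male
```

Fitted to real capture histories (e.g. by logistic regression), such a model yields exactly the kind of covariate effects the abstract reports.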

  20. Maximizing detection probability of Wetland-dependent birds during point-count surveys in northwestern Florida

    USGS Publications Warehouse

    Nadeau, C.P.; Conway, C.J.; Smith, B.S.; Lewis, T.E.

    2008-01-01

    We conducted 262 call-broadcast point-count surveys (1-6 replicate surveys on each of 62 points) using standardized North American Marsh Bird Monitoring Protocols between 31 May and 7 July 2006 on St. Vincent National Wildlife Refuge, an island off the northwest coast of Florida. We conducted double-blind multiple-observer surveys, paired morning and evening surveys, and paired morning and night surveys to examine the influence of call-broadcast and time of day on detection probability. Observer detection probability for all species pooled was 75% and was similar between passive (69%) and call-broadcast (65%) periods. Detection probability was higher on morning than evening (t = 3.0, P = 0.030) or night (t = 3.4, P = 0.042) surveys when we pooled all species. Detection probability was higher (but not significant for all species) on morning compared to evening or night surveys for all five focal species detected on surveys: Least Bittern (Ixobrychus exilis), Clapper Rail (Rallus longirostris), Purple Gallinule (Porphyrula martinica), Common Moorhen (Gallinula chloropus), and American Coot (Fulica americana). We detected more Least Bitterns (t = 2.4, P = 0.064) and Common Moorhens (t = 2.8, P = 0.026) on morning than evening surveys, and more Clapper Rails (t = 5.1, P = 0.014) on morning than night surveys.

  1. Systems Approach to Defeating Maritime Improvised Explosive Devices in U.S. Ports

    DTIC Science & Technology

    2008-12-01

…DETECTION; Pfi — PROBABILITY OF FALSE IDENTIFICATION; PH/PK — PROBABILITY OF HIT/PROBABILITY OF KILL; PMA — POST MISSION ANALYSIS; PNNL — PACIFIC… Naval Warfare Publication 27-2 (Rev. B), Section 1.8.4.1 (unclassified)… detection analysis is conducted… NAVAL POSTGRADUATE SCHOOL, MONTEREY, CALIFORNIA. Approved for public release; distribution is unlimited. Prepared…

  2. Detection of sea otters in boat-based surveys of Prince William Sound, Alaska

    USGS Publications Warehouse

    Udevitz, Mark S.; Bodkin, James L.; Costa, Daniel P.

    1995-01-01

    Boat-based surveys have been commonly used to monitor sea otter populations, but there has been little quantitative work to evaluate detection biases that may affect these surveys. We used ground-based observers to investigate sea otter detection probabilities in a boat-based survey of Prince William Sound, Alaska. We estimated that 30% of the otters present on surveyed transects were not detected by boat crews. Approximately half (53%) of the undetected otters were missed because the otters left the transects, apparently in response to the approaching boat. Unbiased estimates of detection probabilities will be required for obtaining unbiased population estimates from boat-based surveys of sea otters. Therefore, boat-based surveys should include methods to estimate sea otter detection probabilities under the conditions specific to each survey. Unbiased estimation of detection probabilities with ground-based observers requires either that the ground crews detect all of the otters in observed subunits, or that there are no errors in determining which crews saw each detected otter. Ground-based observer methods may be appropriate in areas where nearly all of the sea otter habitat is potentially visible from ground-based vantage points.
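The bias correction implied above is the canonical detectability adjustment: if only a fraction p of the animals present is detected, divide the raw count by p̂. A minimal sketch; the count used in the usage example is hypothetical, not the study's data, and this is the generic textbook correction rather than the paper's exact estimator:

```python
def corrected_estimate(count, detection_prob):
    """Correct a raw survey count for imperfect detection: with detection
    probability p, an approximately unbiased abundance estimate is count / p."""
    if not 0 < detection_prob <= 1:
        raise ValueError("detection probability must be in (0, 1]")
    return count / detection_prob

# The study estimated that ~30% of otters on surveyed transects were missed,
# i.e. a detection probability of about 0.70. With a hypothetical raw count:
print(corrected_estimate(350, 0.70))
```

The paper's point is precisely that p̂ must be estimated under each survey's own conditions (here via ground-based observers) before this correction is trustworthy.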

  3. Classification of change detection and change blindness from near-infrared spectroscopy signals

    NASA Astrophysics Data System (ADS)

    Tanaka, Hirokazu; Katura, Takusige

    2011-08-01

Using a machine-learning classification algorithm applied to near-infrared spectroscopy (NIRS) signals, we classify a success (change detection) or a failure (change blindness) in detecting visual changes for a change-detection task. Five subjects perform a change-detection task, and their brain activities are continuously monitored. A support-vector-machine algorithm is applied to classify the change-detection and change-blindness trials, and a correct classification probability of 70-90% is obtained for four subjects. Two types of temporal shapes in classification probabilities are found: one exhibiting a maximum value after the task is completed (postdictive type), and another exhibiting a maximum value during the task (predictive type). As for the postdictive type, the classification probability begins to increase immediately after the task completion and reaches its maximum on about the time scale of the neuronal hemodynamic response, reflecting a subjective report of change detection. As for the predictive type, the classification probability shows an increase at the task initiation and is maximal while subjects are performing the task, predicting the task performance in detecting a change. We conclude that decoding change detection and change blindness from NIRS signals is possible and discuss possible future applications in brain-machine interfaces.

  4. Investigation of an Optimum Detection Scheme for a Star-Field Mapping System

    NASA Technical Reports Server (NTRS)

    Aldridge, M. D.; Credeur, L.

    1970-01-01

    An investigation was made to determine the optimum detection scheme for a star-field mapping system that uses coded detection resulting from starlight shining through specially arranged multiple slits of a reticle. The computer solution of equations derived from a theoretical model showed that the greatest probability of detection for a given star and background intensity occurred with the use of a single transparent slit. However, use of multiple slits improved the system's ability to reject the detection of undesirable lower intensity stars, but only by decreasing the probability of detection for lower intensity stars to be mapped. Also, it was found that the coding arrangement affected the root-mean-square star-position error and that detection is possible with error in the system's detected spin rate, though at a reduced probability.

  5. Investigating prior probabilities in a multiple hypothesis test for use in space domain awareness

    NASA Astrophysics Data System (ADS)

    Hardy, Tyler J.; Cain, Stephen C.

    2016-05-01

The goal of this research effort is to improve Space Domain Awareness (SDA) capabilities of current telescope systems through improved detection algorithms. Ground-based optical SDA telescopes are often spatially under-sampled, or aliased. This fact negatively impacts the detection performance of traditionally proposed binary and correlation-based detection algorithms. A Multiple Hypothesis Test (MHT) algorithm has been previously developed to mitigate the effects of spatial aliasing. This is done by testing potential Resident Space Objects (RSOs) against several sub-pixel shifted Point Spread Functions (PSFs). An MHT has been shown to increase detection performance for the same false alarm rate. In this paper, the a priori probabilities assumed in an MHT algorithm are investigated. First, an analysis of the pixel decision space is completed to determine alternative hypothesis prior probabilities. These probabilities are then implemented in an MHT algorithm, and the algorithm is tested against previous MHT algorithms using simulated RSO data. Results are reported with Receiver Operating Characteristic (ROC) curves and probability of detection, Pd, analysis.
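The core MHT idea — weighing a noise-only hypothesis against several sub-pixel-shifted PSF hypotheses, each carrying a prior — can be sketched in one dimension. This is an illustrative toy, not the paper's algorithm: the PSF model, priors, amplitudes, and function names are all assumptions.

```python
import math

def gaussian_psf(shift, size=5, sigma=1.0):
    """1-D stand-in for a sub-pixel-shifted point spread function (PSF)."""
    c = (size - 1) / 2 + shift
    vals = [math.exp(-((i - c) ** 2) / (2 * sigma ** 2)) for i in range(size)]
    total = sum(vals)
    return [v / total for v in vals]

def mht_detect(pixels, shifts=(-0.5, -0.25, 0.0, 0.25, 0.5),
               amplitude=6.0, noise_sigma=1.0, prior_h1=0.1):
    """Toy multiple-hypothesis test: H0 = noise only; H1_k = object whose PSF
    is shifted by shifts[k], each getting an equal share of the alternative
    prior. Returns (posterior probability that an object is present, most
    likely sub-pixel shift)."""
    def loglike(template):
        # Gaussian-noise log-likelihood of the pixels, up to a constant.
        return sum(-((x - amplitude * t) ** 2) / (2 * noise_sigma ** 2)
                   for x, t in zip(pixels, template))
    l0 = (1 - prior_h1) * math.exp(loglike([0.0] * len(pixels)))
    alts = [(prior_h1 / len(shifts)) * math.exp(loglike(gaussian_psf(s)))
            for s in shifts]
    post_h1 = sum(alts) / (l0 + sum(alts))
    best_shift = shifts[max(range(len(shifts)), key=lambda k: alts[k])]
    return post_h1, best_shift

# A bright source shifted by a quarter pixel, and a noise-only frame.
bright = [6.0 * v for v in gaussian_psf(0.25)]
print(mht_detect(bright))
print(mht_detect([0.1, -0.2, 0.05, 0.0, 0.1]))
```

Changing how `prior_h1` is split across the shifted hypotheses is exactly the knob the abstract investigates; sweeping a posterior threshold then traces the ROC curve.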

  6. Detection of the nipple in automated 3D breast ultrasound using coronal slab-average-projection and cumulative probability map

    NASA Astrophysics Data System (ADS)

    Kim, Hannah; Hong, Helen

    2014-03-01

We propose an automatic method for nipple detection on 3D automated breast ultrasound (3D ABUS) images using coronal slab-average-projection and a cumulative probability map. First, to identify coronal images that show a clear distinction between the nipple-areola region and the skin, the skewness of each coronal image is measured and the negatively skewed images are selected. Then, a coronal slab-average-projection image is reformatted from the selected images. Second, to localize the nipple-areola region, an elliptical ROI covering the nipple-areola region is detected using the Hough ellipse transform in the coronal slab-average-projection image. Finally, to separate the nipple from the areola region, 3D Otsu thresholding is applied to the elliptical ROI, and a cumulative probability map in the elliptical ROI is generated by assigning high probability to low-intensity regions. Falsely detected small components are eliminated using morphological opening, and the center point of the detected nipple region is calculated. Experimental results show that our method provides a 94.4% nipple detection rate.

  7. Contour-Based Corner Detection and Classification by Using Mean Projection Transform

    PubMed Central

    Kahaki, Seyed Mostafa Mousavi; Nordin, Md Jan; Ashtari, Amir Hossein

    2014-01-01

    Image corner detection is a fundamental task in computer vision. Many applications require reliable detectors to accurately detect corner points, commonly achieved by using image contour information. The curvature definition is sensitive to local variation and edge aliasing, and available smoothing methods are not sufficient to address these problems properly. Hence, we propose Mean Projection Transform (MPT) as a corner classifier and parabolic fit approximation to form a robust detector. The first step is to extract corner candidates using MPT based on the integral properties of the local contours in both the horizontal and vertical directions. Then, an approximation of the parabolic fit is calculated to localize the candidate corner points. The proposed method presents fewer false-positive (FP) and false-negative (FN) points compared with recent standard corner detection techniques, especially in comparison with curvature scale space (CSS) methods. Moreover, a new evaluation metric, called accuracy of repeatability (AR), is introduced. AR combines repeatability and the localization error (Le) for finding the probability of correct detection in the target image. The output results exhibit better repeatability, localization, and AR for the detected points compared with the criteria in original and transformed images. PMID:24590354

  8. Contour-based corner detection and classification by using mean projection transform.

    PubMed

    Kahaki, Seyed Mostafa Mousavi; Nordin, Md Jan; Ashtari, Amir Hossein

    2014-02-28

    Image corner detection is a fundamental task in computer vision. Many applications require reliable detectors to accurately detect corner points, commonly achieved by using image contour information. The curvature definition is sensitive to local variation and edge aliasing, and available smoothing methods are not sufficient to address these problems properly. Hence, we propose Mean Projection Transform (MPT) as a corner classifier and parabolic fit approximation to form a robust detector. The first step is to extract corner candidates using MPT based on the integral properties of the local contours in both the horizontal and vertical directions. Then, an approximation of the parabolic fit is calculated to localize the candidate corner points. The proposed method presents fewer false-positive (FP) and false-negative (FN) points compared with recent standard corner detection techniques, especially in comparison with curvature scale space (CSS) methods. Moreover, a new evaluation metric, called accuracy of repeatability (AR), is introduced. AR combines repeatability and the localization error (Le) for finding the probability of correct detection in the target image. The output results exhibit better repeatability, localization, and AR for the detected points compared with the criteria in original and transformed images.

  9. Performance Evaluation Modeling of Network Sensors

    NASA Technical Reports Server (NTRS)

    Clare, Loren P.; Jennings, Esther H.; Gao, Jay L.

    2003-01-01

Substantial benefits are promised by operating many spatially separated sensors collectively. Such systems are envisioned to consist of sensor nodes that are connected by a communications network. A simulation tool is being developed to evaluate the performance of networked sensor systems, incorporating such metrics as target detection probabilities, false alarm rates, and classification confusion probabilities. The tool will be used to determine configuration impacts associated with such aspects as spatial laydown, mixture of different types of sensors (acoustic, seismic, imaging, magnetic, RF, etc.), and fusion architecture. The QualNet discrete-event simulation environment serves as the underlying basis for model development and execution. This platform is recognized for its capabilities in efficiently simulating networking among mobile entities that communicate via wireless media. We are extending QualNet's communications modeling constructs to capture the sensing aspects of multi-target sensing (analogous to multiple access communications), unimodal multi-sensing (broadcast), and multi-modal sensing (multiple channels and correlated transmissions). Methods are also being developed for modeling the sensor signal sources (transmitters), signal propagation through the media, and sensors (receivers) that are consistent with the discrete-event paradigm needed for performance determination of sensor network systems. This work is supported under the Microsensors Technical Area of the Army Research Laboratory (ARL) Advanced Sensors Collaborative Technology Alliance.

  10. A multisensor system for detection and characterization of UXO(MM-0437) - Demonstration Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gasperikova, Erika; Smith, J.T.; Morrison, H.F.

    2006-06-01

The Berkeley UXO discriminator (BUD) (Figure 1) is a portable Active Electromagnetic (AEM) system for UXO detection and characterization that quickly determines the location, size, and symmetry properties of a suspected UXO. The BUD comprises three orthogonal transmitters that 'illuminate' a target with fields in three independent directions in order to stimulate the three polarization modes that, in general, characterize the target EM response. In addition, the BUD uses eight pairs of differenced receivers for response recording. Eight receiver coils are placed horizontally along the two diagonals of the upper and lower planes of the two horizontal transmitter loops. These receiver coil pairs are located on symmetry lines through the center of the system, and each pair sees identical fields during the on-time of the pulse in all of the transmitter coils. They are wired in opposition to produce zero output during the on-time of the pulses in the three orthogonal transmitters. Moreover, this configuration dramatically reduces noise in the measurements by canceling the background electromagnetic fields (these fields are uniform over the scale of the receiver array and are consequently nulled by the differencing operation) and by canceling the noise contributed by the tilt of the receivers in the Earth's magnetic field, and it greatly enhances receiver sensitivity to the gradients of the target response. The BUD performs target characterization from a single position of the sensor platform above a target. BUD was designed to detect and characterize UXO in the 20 mm to 155 mm size range for depths between 0 and 1 m. The relationship between the object size and the depth at which it can be detected is illustrated in Figure 2. This curve was calculated for BUD assuming that the receiver plane is 20 cm above the ground.
Figure 2 shows that, for example, BUD can detect and characterize an object with a 10 cm diameter down to a depth of 90 cm with a depth uncertainty of 10%. Any objects buried at depths greater than 1 m have a low probability of detection. With the existing algorithms in the system computer it is not possible to recover the principal polarizabilities of large objects close to the system. Detection of large shallow objects is assured, but at present real-time discrimination for shallow objects is not. Post-processing of the field data is required for shape discrimination of large shallow targets. The next generation of BUD software will not have this limitation. Successful application of the inversion algorithm that solves for the target parameters is contingent upon resolution of this limitation. At the moment, interpretation software is developed for a single object only. In the case of multiple objects, the software indicates the presence of a cluster of objects but is unable to provide characteristics of each individual object.

  11. BODY SENSING SYSTEM

    NASA Technical Reports Server (NTRS)

    Mah, Robert W. (Inventor)

    2005-01-01

    System and method for performing one or more relevant measurements at a target site in an animal body, using a probe. One or more of a group of selected internal measurements is performed at the target site, is optionally combined with one or more selected external measurements, and is optionally combined with one or more selected heuristic information items, in order to reduce to a relatively small number the probable medical conditions associated with the target site. One or more of the internal measurements is optionally used to navigate the probe to the target site. Neural net information processing is performed to provide a reduced set of probable medical conditions associated with the target site.

  12. Exploiting vibrational resonance in weak-signal detection

    NASA Astrophysics Data System (ADS)

    Ren, Yuhao; Pan, Yan; Duan, Fabing; Chapeau-Blondeau, François; Abbott, Derek

    2017-08-01

    In this paper, we investigate the first exploitation of the vibrational resonance (VR) effect to detect weak signals in the presence of strong background noise. By injecting a series of sinusoidal interference signals of the same amplitude but with different frequencies into a generalized correlation detector, we show that the detection probability can be maximized at an appropriate interference amplitude. Based on a dual-Dirac probability density model, we compare the VR method with the stochastic resonance approach via adding dichotomous noise. The compared results indicate that the VR method can achieve a higher detection probability for a wider variety of noise distributions.
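The VR setup described above — a weak periodic signal, strong non-Gaussian noise, a nonlinear correlation detector, and an injected sinusoidal interference whose amplitude is swept — can be mimicked with a Monte Carlo toy. This is a simplified stand-in, not the paper's detector or noise model: the sign nonlinearity, Laplacian noise, frequencies, and all parameter values are assumptions.

```python
import math
import random

def detection_prob(interf_amp, trials=400, n=200, snr_amp=0.2,
                   noise_scale=1.0):
    """Monte Carlo estimate of the detection probability of a
    sign-nonlinearity correlation detector for a weak sinusoid in
    heavy-tailed (Laplacian) noise, with an added fast sinusoidal
    interference of amplitude `interf_amp`."""
    random.seed(0)                       # reproducible estimate
    f_sig, f_int = 0.01, 0.23           # weak signal and fast interference
    template = [math.sin(2 * math.pi * f_sig * k) for k in range(n)]

    def statistic(signal_present):
        s = 0.0
        for k in range(n):
            x = snr_amp * template[k] if signal_present else 0.0
            x += interf_amp * math.sin(2 * math.pi * f_int * k)
            # Laplacian noise sample via inverse-CDF:
            u = random.random() - 0.5
            x += -noise_scale * math.copysign(math.log(1 - 2 * abs(u)), u)
            s += math.copysign(1.0, x) * template[k]   # sign nonlinearity
        return s

    # Threshold from the empirical null (H0) distribution's 90th percentile,
    # i.e. a fixed ~0.1 false-alarm rate.
    null = sorted(statistic(False) for _ in range(trials))
    thr = null[int(0.9 * trials)]
    return sum(statistic(True) > thr for _ in range(trials)) / trials

for a in (0.0, 0.5, 1.0, 2.0, 4.0):
    print(a, detection_prob(a))
```

Sweeping `interf_amp` and looking for an interior maximum of the estimated Pd is the VR effect the abstract describes; whether a peak appears in this toy depends on the assumed noise and nonlinearity.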

  13. Exploiting vibrational resonance in weak-signal detection.

    PubMed

    Ren, Yuhao; Pan, Yan; Duan, Fabing; Chapeau-Blondeau, François; Abbott, Derek

    2017-08-01

    In this paper, we investigate the first exploitation of the vibrational resonance (VR) effect to detect weak signals in the presence of strong background noise. By injecting a series of sinusoidal interference signals of the same amplitude but with different frequencies into a generalized correlation detector, we show that the detection probability can be maximized at an appropriate interference amplitude. Based on a dual-Dirac probability density model, we compare the VR method with the stochastic resonance approach via adding dichotomous noise. The compared results indicate that the VR method can achieve a higher detection probability for a wider variety of noise distributions.

  14. The effects of anterior arcuate and dorsomedial frontal cortex lesions on visually guided eye movements: 2. Paired and multiple targets.

    PubMed

    Schiller, P H; Chou, I

    2000-01-01

This study examined the effects of anterior arcuate and dorsomedial frontal cortex lesions on the execution of saccadic eye movements made to paired and multiple targets in rhesus monkeys. Identical paired targets were presented with various temporal asynchronies to determine the temporal offset required to yield equal-probability choices to either target. In the intact animal, equal-probability choices were typically obtained when the targets appeared simultaneously. After unilateral anterior arcuate lesions, a major shift arose in the temporal offset required to obtain equal-probability choices for paired targets: the target in the hemifield contralateral to the lesion had to be presented more than 100 ms prior to the target in the ipsilateral hemifield. This deficit was still pronounced 1 year after the lesion. Dorsomedial frontal cortex lesions produced much smaller but significant shifts in target selection that recovered more rapidly. Paired lesions produced deficits similar to those observed with anterior arcuate lesions alone. Major deficits were also observed on a multiple-target temporal discrimination task after anterior arcuate but not after dorsomedial frontal cortex lesions. These results suggest that the frontal eye fields, which reside in the anterior bank of the arcuate sulcus, play an important role in temporal processing and in target selection. Dorsomedial frontal cortex, which contains the medial eye fields, plays a much less important role in the execution of these tasks.

  15. Design and implementation of estimation-based monitoring programs for flora and fauna: A case study on the Cherokee National Forest

    USGS Publications Warehouse

    Klimstra, J.D.; O'Connell, A.F.; Pistrang, M.J.; Lewis, L.M.; Herrig, J.A.; Sauer, J.R.

    2007-01-01

    Science-based monitoring of biological resources is important for a greater understanding of ecological systems and for assessment of the target population using theory-based management approaches. When selecting variables to monitor, managers first need to carefully consider their objectives, the geographic and temporal scale at which they will operate, and the effort needed to implement the program. Generally, monitoring can be divided into two categories: index and inferential. Although index monitoring is usually easier to implement, analysis of index data requires strong assumptions about consistency in detection rates over time and space, and parameters are often biased because detectability and spatial variation are not accounted for. In most cases, individuals are not always available for detection during sampling periods, and the entire area of interest cannot be sampled. Conversely, inferential monitoring is more rigorous because it is based on nearly unbiased estimators of spatial distribution. Thus, we recommend that detectability and spatial variation be considered for all monitoring programs that intend to make inferences about the target population or the area of interest. Application of these techniques is especially important for the monitoring of Threatened and Endangered (T&E) species because it is critical to determine if population size is increasing or decreasing with some level of certainty. Use of estimation-based methods and probability sampling will reduce many of the biases inherently associated with index data and provide meaningful information with respect to changes that occur in target populations. We incorporated inferential monitoring into protocols for T&E species spanning a wide range of taxa on the Cherokee National Forest in the Southern Appalachian Mountains.
We review the various approaches employed for different taxa and discuss design issues, sampling strategies, data analysis, and the details of estimating detectability using site occupancy. These techniques provide a science-based approach for monitoring and can be of value to all resource managers responsible for management of T&E species.
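
    The gap between index and inferential monitoring can be illustrated with a minimal site-occupancy sketch. The model below is a generic assumption (occupancy probability psi, constant per-visit detection probability p, simulated data, grid-search maximum likelihood), not the protocols actually used on the Cherokee National Forest:

```python
import numpy as np
from math import comb

rng = np.random.default_rng(1)

# Simulate repeat-visit survey data: S sites, K visits each; a site is
# occupied with probability psi, and an occupied site yields a detection
# on each visit with probability p.
S, K, psi_true, p_true = 200, 4, 0.6, 0.5
occupied = rng.random(S) < psi_true
detections = (rng.binomial(K, p_true, size=S) * occupied).astype(int)
counts = np.bincount(detections, minlength=K + 1)   # sites per detection count

def log_lik(psi, p):
    """Zero-inflated binomial likelihood: a site with zero detections is
    either unoccupied or occupied but missed on all K visits."""
    ll = 0.0
    for d in range(K + 1):
        term = psi * comb(K, d) * p**d * (1 - p) ** (K - d)
        if d == 0:
            term += 1 - psi
        ll += counts[d] * np.log(term)
    return ll

grid = np.linspace(0.01, 0.99, 99)
psi_hat, p_hat = max(((a, b) for a in grid for b in grid),
                     key=lambda ab: log_lik(*ab))
naive = float(np.mean(detections > 0))   # index estimate: ignores detectability
print(psi_hat, p_hat, naive)             # naive tends to underestimate occupancy
```

    The naive index (fraction of sites with any detection) is biased low exactly as the abstract warns, while the occupancy MLE separates detectability from true occupancy.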

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kipping, D. M.; Forgan, D.; Hartman, J.

    Kepler-22b is the first transiting planet to have been detected in the habitable zone of its host star. At 2.4 R⊕, Kepler-22b is too large to be considered an Earth analog, but should the planet host a moon large enough to maintain an atmosphere, then the Kepler-22 system may yet possess a telluric world. Aside from being within the habitable zone, the target is attractive due to the availability of previously measured precise radial velocities and low intrinsic photometric noise, which has also enabled asteroseismology studies of the star. For these reasons, Kepler-22b was selected as a target-of-opportunity by the 'Hunt for Exomoons with Kepler' (HEK) project. In this work, we conduct a photodynamical search for an exomoon around Kepler-22b leveraging the transits, radial velocities, and asteroseismology plus several new tools developed by the HEK project to improve exomoon searches. We find no evidence for an exomoon around the planet and exclude moons of mass M_S > 0.5 M⊕ to 95% confidence. By signal injection and blind retrieval, we demonstrate that an Earth-like moon is easily detected for this planet even when the time-correlated noise of the data set is taken into account. We provide updated parameters for the planet Kepler-22b, including a revised mass of M_P < 53 M⊕ to 95% confidence and an eccentricity of 0.13 (+0.36, -0.13) by exploiting Single-body Asterodensity Profiling. Finally, we show that Kepler-22b has a >95% probability of being within the empirical habitable zone but a <5% probability of being within the conservative habitable zone.

  17. Designs for a quantum electron microscope.

    PubMed

    Kruit, P; Hobbs, R G; Kim, C-S; Yang, Y; Manfrinato, V R; Hammer, J; Thomas, S; Weber, P; Klopfer, B; Kohstall, C; Juffmann, T; Kasevich, M A; Hommelhoff, P; Berggren, K K

    2016-05-01

    One of the astounding consequences of quantum mechanics is that it allows the detection of a target using an incident probe, with only a low probability of interaction of the probe and the target. This 'quantum weirdness' could be applied in the field of electron microscopy to generate images of beam-sensitive specimens with substantially reduced damage to the specimen. A reduction of beam-induced damage to specimens would be of particular importance if it enabled imaging of biological specimens with atomic resolution. Following a recent suggestion that interaction-free measurements are possible with electrons, we now analyze the difficulties of actually building an atomic resolution interaction-free electron microscope, or "quantum electron microscope". A quantum electron microscope would require a number of unique components not found in conventional transmission electron microscopes. These components include a coherent electron beam-splitter or two-state-coupler, and a resonator structure to allow each electron to interrogate the specimen multiple times, thus supporting high success probabilities for interaction-free detection of the specimen. Different system designs are presented here, which are based on four different choices of two-state-couplers: a thin crystal, a grating mirror, a standing light wave and an electro-dynamical pseudopotential. Challenges for the detailed electron optical design are identified as future directions for development. While it is concluded that it should be possible to build an atomic resolution quantum electron microscope, we have also identified a number of hurdles to the development of such a microscope and further theoretical investigations that will be required to enable a complete interpretation of the images produced by such a microscope. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.
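
    The role of the resonator can be seen in the textbook quantum-Zeno scaling for interaction-free measurement, where N interrogations give success probability cos^(2N)(pi/(2N)), approaching 1 for large N. This is the generic scaling argument, not a model of any of the four proposed designs:

```python
import math

def interaction_free_success(n_passes: int) -> float:
    """Success probability of detecting an opaque specimen with no
    specimen interaction in the quantum-Zeno scheme: each of N passes
    rotates the probe between two states by pi/(2N), and an opaque
    specimen repeatedly projects the state back, giving cos^(2N)(pi/(2N))."""
    theta = math.pi / (2 * n_passes)
    return math.cos(theta) ** (2 * n_passes)

for n in (1, 5, 50, 500):
    print(n, interaction_free_success(n))   # approaches 1 as N grows
```

    This is why each electron must be able to interrogate the specimen many times: a single pass gives essentially zero interaction-free success, while hundreds of passes push it above 99%.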

  18. Statistical Regularities Attract Attention when Task-Relevant.

    PubMed

    Alamia, Andrea; Zénon, Alexandre

    2016-01-01

    Visual attention seems essential for learning the statistical regularities in our environment, a process known as statistical learning. However, how attention is allocated when exploring a novel visual scene whose statistical structure is unknown remains unclear. In order to address this question, we investigated visual attention allocation during a task in which we manipulated the conditional probability of occurrence of colored stimuli, unbeknown to the subjects. Participants were instructed to detect a target colored dot among two dots moving along separate circular paths. We evaluated implicit statistical learning, i.e., the effect of color predictability on reaction times (RTs), and recorded eye position concurrently. Attention allocation was indexed by comparing the Mahalanobis distance between the position, velocity and acceleration of the eyes and the two colored dots. We found that learning the conditional probabilities occurred very early during the course of the experiment as shown by the fact that, starting already from the first block, predictable stimuli were detected with shorter RT than unpredictable ones. In terms of attentional allocation, we found that the predictive stimulus attracted gaze only when it was informative about the occurrence of the target but not when it predicted the occurrence of a task-irrelevant stimulus. This suggests that attention allocation was influenced by regularities only when they were instrumental in performing the task. Moreover, we found that the attentional bias towards task-relevant predictive stimuli occurred at a very early stage of learning, concomitantly with the first effects of learning on RT. In conclusion, these results show that statistical regularities capture visual attention only after a few occurrences, provided these regularities are instrumental to perform the task.
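
    The Mahalanobis-distance comparison between gaze kinematics and the stimuli can be computed directly. The toy data below are hypothetical stand-ins for position/velocity samples; only the distance computation itself mirrors the method described:

```python
import numpy as np

def mahalanobis(x, mean, cov):
    """Distance of observation x from a distribution with the given mean
    and covariance, in units of that distribution's spread."""
    d = np.asarray(x, float) - np.asarray(mean, float)
    return float(np.sqrt(d @ np.linalg.solve(cov, d)))

# Hypothetical 2-D gaze samples (e.g. horizontal position and velocity);
# compare two stimulus states against the gaze distribution.
rng = np.random.default_rng(0)
gaze = rng.normal(size=(500, 2))
mean, cov = gaze.mean(axis=0), np.cov(gaze.T)
near_stimulus = [0.1, 0.0]   # kinematics close to the gaze distribution
far_stimulus = [4.0, 4.0]    # kinematics far from it
print(mahalanobis(near_stimulus, mean, cov),
      mahalanobis(far_stimulus, mean, cov))
```

    A smaller distance between the eyes and one of the two dots indexes attention allocated to that dot.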

  19. Warship Combat System Selection Methodology Based on Discrete Event Simulation

    DTIC Science & Technology

    2010-09-01

    Platform (from Spanish); PD = Damage Probability; PHit = Hit Probability; PKill = Kill Probability; RSM = Response Surface Model; SAM = Surface-to-Air Missile...such a large target allows an assumption that the probability of a hit (PHit) is one. This structure can be considered as a bridge; therefore, the

  20. Development of monitoring protocols to detect change in rocky intertidal communities of Glacier Bay National Park and Preserve

    USGS Publications Warehouse

    Irvine, Gail V.

    2010-01-01

    Glacier Bay National Park and Preserve in southeastern Alaska includes extensive coastlines representing a major proportion of all coastlines held by the National Park Service. The marine plants and invertebrates that occupy intertidal shores form highly productive communities that are ecologically important to a number of vertebrate and invertebrate consumers and that are vulnerable to human disturbances. To better understand these communities and their sensitivity, it is important to obtain information on species abundances over space and time. During field studies from 1997 to 2001, I investigated probability-based rocky intertidal monitoring designs that allow inference of results to similar habitat within the bay and that reduce bias. Aerial surveys of a subset of intertidal habitat indicated that the original target habitat of bedrock-dominated sites with slope less than or equal to 30 degrees was rare. This finding illustrated the value of probability-based surveys and led to a shift in the target habitat type to more mixed rocky habitat with steeper slopes. Subsequently, I investigated different sampling methods and strategies for their relative power to detect changes in the abundances of the predominant sessile intertidal taxa: barnacles (Balanomorpha), the mussel Mytilus trossulus, and the rockweed Fucus distichus subsp. evanescens. I found that lower-intensity sampling of 25 randomly selected sites (= coarse-grained sampling) provided a greater ability to detect changes in the abundances of these taxa than did more intensive sampling of 6 sites (= fine-grained sampling). Because of its greater power, the coarse-grained sampling scheme was adopted in subsequent years. This report provides detailed analyses of the 4 years of data and evaluates the relative effect of different sampling attributes and management-set parameters on the ability of the sampling to detect changes in the abundances of these taxa. 
The intent was to provide managers with information to guide design choices for intertidal monitoring. I found that the coarse-grained surveys, as conducted from 1998 to 2001, had power ranging from 0.68 to 1.0 to detect 10 percent annual changes in the abundances of these predominant sessile species. The information gained through intertidal monitoring would be useful in assessing changes due to climate (including ocean acidification), invasive species, trampling effects, and oil spills.

  1. A Review of Methods for Detection of Hepatozoon Infection in Carnivores and Arthropod Vectors.

    PubMed

    Modrý, David; Beck, Relja; Hrazdilová, Kristýna; Baneth, Gad

    2017-01-01

    Vector-borne protists of the genus Hepatozoon belong to the apicomplexan suborder Adeleorina. The taxonomy of Hepatozoon is unsettled and different phylogenetic clades probably represent evolutionary units deserving the status of separate genera. Throughout our review, we focus on the monophyletic assemblage of Hepatozoon spp. from carnivores, classified as Hepatozoon sensu stricto that includes important pathogens of domestic and free-ranging canine and feline hosts. We provide an overview of diagnostic methods and approaches from classical detection in biological materials, through serological tests to nucleic acid amplification tests (NAATs). Critical review of used primers for the 18S rDNA is provided, together with information on individual primer pairs. Extension of used NAATs target to cover also mitochondrial genes is suggested as a key step in understanding the diversity and molecular epidemiology of Hepatozoon infections in mammals.

  2. Suboptimal Decision Criteria Are Predicted by Subjectively Weighted Probabilities and Rewards

    PubMed Central

    Ackermann, John F.; Landy, Michael S.

    2014-01-01

    Subjects performed a visual detection task in which the probability of target occurrence at each of the two possible locations, and the rewards for correct responses for each, were varied across conditions. To maximize monetary gain, observers should bias their responses, choosing one location more often than the other in line with the varied probabilities and rewards. Typically, and in our task, observers do not bias their responses to the extent they should, and instead distribute their responses more evenly across locations, a phenomenon referred to as ‘conservatism.’ We investigated several hypotheses regarding the source of the conservatism. We measured utility and probability weighting functions under Prospect Theory for each subject in an independent economic choice task and used the weighting-function parameters to calculate each subject’s subjective utility (SU(c)) as a function of the criterion c, and the corresponding weighted optimal criteria (wcopt). Subjects’ criteria were not close to optimal relative to wcopt. The slope of SU(c) and of expected gain EG(c) at the neutral criterion corresponding to β = 1 were both predictive of the subjects’ criteria. The slope of SU(c) was a better predictor of observers’ decision criteria overall. Thus, rather than behaving optimally, subjects move their criterion away from the neutral criterion by estimating how much they stand to gain by such a change based on the slope of subjective gain as a function of criterion, using inherently distorted probabilities and values. PMID:25366822

  3. Suboptimal decision criteria are predicted by subjectively weighted probabilities and rewards.

    PubMed

    Ackermann, John F; Landy, Michael S

    2015-02-01

    Subjects performed a visual detection task in which the probability of target occurrence at each of the two possible locations, and the rewards for correct responses for each, were varied across conditions. To maximize monetary gain, observers should bias their responses, choosing one location more often than the other in line with the varied probabilities and rewards. Typically, and in our task, observers do not bias their responses to the extent they should, and instead distribute their responses more evenly across locations, a phenomenon referred to as 'conservatism.' We investigated several hypotheses regarding the source of the conservatism. We measured utility and probability weighting functions under Prospect Theory for each subject in an independent economic choice task and used the weighting-function parameters to calculate each subject's subjective utility (SU(c)) as a function of the criterion c, and the corresponding weighted optimal criteria (wcopt). Subjects' criteria were not close to optimal relative to wcopt. The slope of SU(c) and of expected gain EG(c) at the neutral criterion corresponding to β = 1 were both predictive of the subjects' criteria. The slope of SU(c) was a better predictor of observers' decision criteria overall. Thus, rather than behaving optimally, subjects move their criterion away from the neutral criterion by estimating how much they stand to gain by such a change based on the slope of subjective gain as a function of criterion, using inherently distorted probabilities and values.
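
    The neutral criterion, the prior-weighted optimal criterion, and the expected gain EG(c) it maximizes can be sketched for a standard equal-variance Gaussian detection model. The sensitivity d', prior, and payoffs below are hypothetical (errors are assumed to pay nothing); this is the textbook signal-detection setup, not the paper's fitted Prospect Theory model:

```python
import numpy as np
from math import erf, sqrt, log

def Phi(z):
    """Standard normal CDF."""
    return 0.5 * (1 + erf(z / sqrt(2)))

def expected_gain(c, d=1.5, p=0.75, v_hit=1.0, v_cr=1.0):
    """EG(c) for yes/no detection: signal present with prior p, evidence
    x ~ N(d, 1) when present and N(0, 1) when absent, respond 'present'
    when x > c; hits pay v_hit, correct rejections v_cr, errors nothing."""
    return p * v_hit * (1 - Phi(c - d)) + (1 - p) * v_cr * Phi(c)

d, p = 1.5, 0.75
beta = ((1 - p) * 1.0) / (p * 1.0)   # likelihood-ratio cutoff from priors/payoffs
c_opt = log(beta) / d + d / 2        # analytic optimal criterion
c_neutral = d / 2                    # beta = 1: ignores the unequal prior
cs = np.linspace(-2.0, 3.0, 501)
c_best = cs[np.argmax([expected_gain(c) for c in cs])]
print(c_opt, c_best)                                    # grid agrees with c_opt
print(expected_gain(c_opt) - expected_gain(c_neutral))  # gain left on the table
```

    Conservatism corresponds to empirical criteria falling between c_neutral and c_opt; the last line quantifies how much expected gain a fully neutral observer forgoes.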

  4. People Like Logical Truth: Testing the Intuitive Detection of Logical Value in Basic Propositions

    PubMed Central

    2016-01-01

    Recent studies on logical reasoning have suggested that people are intuitively aware of the logical validity of syllogisms or that they intuitively detect conflict between heuristic responses and logical norms via slight changes in their feelings. According to logical intuition studies, logically valid or heuristic logic no-conflict reasoning is fluently processed and induces positive feelings without conscious awareness. One criticism states that such effects of logicality disappear when confounding factors such as the content of syllogisms are controlled. The present study used abstract propositions and tested whether people intuitively detect logical value. Experiment 1 presented four logical propositions (conjunctive, biconditional, conditional, and material implications) regarding a target case and asked the participants to rate the extent to which they liked the statement. Experiment 2 tested the effects of matching bias, as well as intuitive logic, on the reasoners’ feelings by manipulating whether the antecedent or consequent (or both) of the conditional was affirmed or negated. The results showed that both logicality and matching bias affected the reasoners’ feelings, and people preferred logically true targets over logically false ones for all forms of propositions. These results suggest that people intuitively detect what is true from what is false during abstract reasoning. Additionally, a Bayesian mixed model meta-analysis of conditionals indicated that people’s intuitive interpretation of the conditional “if p then q” fits better with the conditional probability, q given p. PMID:28036402

  5. An empirical investigation into the role of subjective prior probability in searching for potentially missing items

    PubMed Central

    Fanshawe, T. R.

    2015-01-01

    There are many examples from the scientific literature of visual search tasks in which the length, scope and success rate of the search have been shown to vary according to the searcher's expectations of whether the search target is likely to be present. This phenomenon has major practical implications, for instance in cancer screening, when the prevalence of the condition is low and the consequences of a missed disease diagnosis are severe. We consider this problem from an empirical Bayesian perspective to explain how the effect of a low prior probability, subjectively assessed by the searcher, might impact on the extent of the search. We show how the searcher's posterior probability that the target is present depends on the prior probability and the proportion of possible target locations already searched, and also consider the implications of imperfect search, when the probability of false-positive and false-negative decisions is non-zero. The theoretical results are applied to two studies of radiologists' visual assessment of pulmonary lesions on chest radiographs. Further application areas in diagnostic medicine and airport security are also discussed. PMID:26587267
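
    Under the simplest assumptions (target equally likely in each location, each searched location examined once), the posterior described above has a closed form; a `miss_rate` parameter covers the imperfect-search case. This is a sketch of the general Bayesian argument, not the paper's exact model:

```python
def posterior_present(prior, frac_searched, miss_rate=0.0):
    """P(target present | nothing found), after searching a fraction
    `frac_searched` of equally likely locations; a searched location that
    does contain the target is overlooked with probability `miss_rate`."""
    p_no_find_if_present = 1 - frac_searched * (1 - miss_rate)
    num = prior * p_no_find_if_present
    return num / (num + (1 - prior))

print(posterior_present(0.2, 0.5))        # perfect search: 0.2 drops to 1/9
print(posterior_present(0.2, 0.5, 0.3))   # imperfect search: belief decays slower
print(posterior_present(0.2, 1.0))        # exhaustive perfect search: 0
```

    With a low prior, the posterior falls quickly as the search proceeds, which is one rationale for the early termination of searches for rare targets.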

  6. Total coliform and E. coli in public water systems using undisinfected ground water in the United States.

    PubMed

    Messner, Michael J; Berger, Philip; Javier, Julie

    2017-06-01

    Public water systems (PWSs) in the United States generate total coliform (TC) and Escherichia coli (EC) monitoring data, as required by the Total Coliform Rule (TCR). We analyzed data generated in 2011 by approximately 38,000 small (serving fewer than 4101 individuals) undisinfected public water systems (PWSs). We used statistical modeling to characterize a distribution of TC detection probabilities for each of nine groupings of PWSs based on system type (community, non-transient non-community, and transient non-community) and population served (less than 101, 101-1000 and 1001-4100 people). We found that among PWS types sampled in 2011, on average, undisinfected transient PWSs test positive for TC 4.3% of the time as compared with 3% for undisinfected non-transient PWSs and 2.5% for undisinfected community PWSs. Within each type of PWS, the smaller systems have higher median TC detection than the larger systems. All TC-positive samples were assayed for EC. Among TC-positive samples from small undisinfected PWSs, EC is detected in about 5% of samples, regardless of PWS type or size. We evaluated the upper tail of the TC detection probability distributions and found that significant percentages of some system types have high TC detection probabilities. For example, assuming the systems providing data are nationally representative, then 5.0% of the ∼50,000 small undisinfected transient PWSs in the U.S. have TC detection probabilities of 20% or more. Communities with such high TC detection probabilities may have elevated risk of acute gastrointestinal (AGI) illness - perhaps as great as or greater than the risk attributable to drinking water (6-22%) calculated for 14 Wisconsin community PWSs with much lower TC detection probabilities (about 2.3%, Borchardt et al., 2012). Published by Elsevier GmbH.

  7. EARLY SCIENCE WITH THE KOREAN VLBI NETWORK: THE QCAL-1 43 GHz CALIBRATOR SURVEY

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Petrov, Leonid; Lee, Sang-Sung; Kim, Jongsoo

    2012-11-01

    This paper presents the catalog of correlated flux densities in three ranges of baseline projection lengths of 637 sources from a 43 GHz (Q band) survey observed with the Korean VLBI Network. Of them, 14 objects used as calibrators were previously observed, but 623 sources have not been observed before in the Q band with very long baseline interferometry (VLBI). The goal of this work in the early science phase of the new VLBI array is twofold: to evaluate the performance of the new instrument that operates in a frequency range of 22-129 GHz and to build a list of objects that can be used as targets and as calibrators. We have observed the list of 799 target sources with declinations down to -40°. Among them, 724 were observed before with VLBI at 22 GHz and had correlated flux densities greater than 200 mJy. The overall detection rate is 78%. The detection limit, defined as the minimum flux density for a source to be detected with 90% probability in a single observation, was in the range of 115-180 mJy depending on declination. However, some sources as weak as 70 mJy have been detected. Of 623 detected sources, 33 objects are detected for the first time in VLBI mode. We determined their coordinates with a median formal uncertainty of 20 mas. The results of this work set the basis for future efforts to build the complete flux-limited sample of extragalactic sources at frequencies of 22 GHz and higher at 3/4 of the celestial sphere.

  8. Primer-BLAST: A tool to design target-specific primers for polymerase chain reaction

    PubMed Central

    2012-01-01

    Background Choosing appropriate primers is probably the single most important factor affecting the polymerase chain reaction (PCR). Specific amplification of the intended target requires that primers do not have matches to other targets in certain orientations and within certain distances that allow undesired amplification. The process of designing specific primers typically involves two stages. First, the primers flanking regions of interest are generated either manually or using software tools; then they are searched against an appropriate nucleotide sequence database using tools such as BLAST to examine the potential targets. However, the latter is not an easy process as one needs to examine many details between primers and targets, such as the number and the positions of matched bases, the primer orientations and distance between forward and reverse primers. The complexity of such analysis usually makes this a time-consuming and very difficult task for users, especially when the primers have a large number of hits. Furthermore, although the BLAST program has been widely used for primer target detection, it is in fact not an ideal tool for this purpose as BLAST is a local alignment algorithm and does not necessarily return complete match information over the entire primer range. Results We present a new software tool called Primer-BLAST to alleviate the difficulty in designing target-specific primers. This tool combines BLAST with a global alignment algorithm to ensure a full primer-target alignment and is sensitive enough to detect targets that have a significant number of mismatches to primers. Primer-BLAST allows users to design new target-specific primers in one step as well as to check the specificity of pre-existing primers. Primer-BLAST also supports placing primers based on exon/intron locations and excluding single nucleotide polymorphism (SNP) sites in primers. 
Conclusions We describe a robust and fully implemented general purpose primer design tool that designs target-specific PCR primers. Primer-BLAST offers flexible options to adjust the specificity threshold and other primer properties. This tool is publicly available at http://www.ncbi.nlm.nih.gov/tools/primer-blast. PMID:22708584
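
    The difference between local and global alignment for primer checking can be illustrated with a minimal Needleman-Wunsch sketch using unit mismatch and indel costs. This is an illustration of the full-length-alignment idea only, not Primer-BLAST's actual scoring scheme:

```python
def global_mismatch_cost(primer: str, site: str) -> int:
    """Needleman-Wunsch global alignment cost (unit mismatch and indel
    penalties) between a primer and a candidate binding site. Because the
    alignment must span the full primer, mismatches at the primer ends
    still count, unlike a purely local alignment."""
    m, n = len(primer), len(site)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        dp[i][0] = i
    for j in range(n + 1):
        dp[0][j] = j
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            dp[i][j] = min(dp[i - 1][j - 1] + (primer[i - 1] != site[j - 1]),
                           dp[i - 1][j] + 1,      # gap in the site
                           dp[i][j - 1] + 1)      # gap in the primer
    return dp[m][n]

print(global_mismatch_cost("ACGTACGT", "ACGTACGT"))  # perfect match: 0
print(global_mismatch_cost("ACGTACGT", "ACGTACGA"))  # terminal mismatch: 1
```

    A local aligner could score both pairs identically by trimming the mismatched end; the global cost exposes the difference, which matters because 3' mismatches strongly affect amplification.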

  9. Detection methods for stochastic gravitational-wave backgrounds: a unified treatment

    NASA Astrophysics Data System (ADS)

    Romano, Joseph D.; Cornish, Neil J.

    2017-04-01

    We review detection methods that are currently in use or have been proposed to search for a stochastic background of gravitational radiation. We consider both Bayesian and frequentist searches using ground-based and space-based laser interferometers, spacecraft Doppler tracking, and pulsar timing arrays; and we allow for anisotropy, non-Gaussianity, and non-standard polarization states. Our focus is on relevant data analysis issues, and not on the particular astrophysical or early Universe sources that might give rise to such backgrounds. We provide a unified treatment of these searches at the level of detector response functions, detection sensitivity curves, and, more generally, at the level of the likelihood function, since the choice of signal and noise models and prior probability distributions are actually what define the search. Pedagogical examples are given whenever possible to compare and contrast different approaches. We have tried to make the article as self-contained and comprehensive as possible, targeting graduate students and new researchers looking to enter this field.

  10. The Probability of Hitting a Polygonal Target

    DTIC Science & Technology

    1981-04-01

    required for the use of this method for computing the probability of hitting a polygonal target. These functions are 1. PHIT (called by user's main program)... 2. FIJ (called by PHIT) 3. FUN (called by FIJ). The user must include all three of these in his main program, but needs only to call PHIT. The
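
    The quantity a routine like PHIT computes can be estimated with a short Monte Carlo sketch, assuming independent Gaussian aim errors about the aim point (the report's actual numerical method is not reproduced here):

```python
import numpy as np

def point_in_polygon(x, y, poly):
    """Ray-casting point-in-polygon test; poly is a list of (x, y) vertices."""
    inside = False
    for i in range(len(poly)):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % len(poly)]
        if (y1 > y) != (y2 > y):                     # edge crosses the ray's level
            if x < (x2 - x1) * (y - y1) / (y2 - y1) + x1:
                inside = not inside
    return inside

def phit(poly, aim=(0.0, 0.0), sigma=(1.0, 1.0), n=100_000, seed=0):
    """Monte Carlo hit probability: impact points scatter about the aim
    point as independent normals with the given standard deviations."""
    rng = np.random.default_rng(seed)
    xs = rng.normal(aim[0], sigma[0], n)
    ys = rng.normal(aim[1], sigma[1], n)
    return float(np.mean([point_in_polygon(x, y, poly) for x, y in zip(xs, ys)]))

square = [(-1, -1), (1, -1), (1, 1), (-1, 1)]
print(phit(square))   # analytic value for this square: (2*Phi(1) - 1)**2 ~ 0.466
```

    For an axis-aligned rectangle the answer factors into normal CDFs, which gives a convenient check; the Monte Carlo version works for any polygon.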

  11. Tracking the Sensory Environment: An ERP Study of Probability and Context Updating in ASD

    ERIC Educational Resources Information Center

    Westerfield, Marissa A.; Zinni, Marla; Vo, Khang; Townsend, Jeanne

    2015-01-01

    We recorded visual event-related brain potentials from 32 adult male participants (16 high-functioning participants diagnosed with autism spectrum disorder (ASD) and 16 control participants, ranging in age from 18 to 53 years) during a three-stimulus oddball paradigm. Target and non-target stimulus probability was varied across three probability…

  12. Accident hazard evaluation and control decisions on forested recreation sites

    Treesearch

    Lee A. Paine

    1971-01-01

    Accident hazard associated with trees on recreation sites is inherently concerned with probabilities. The major factors include the probabilities of mechanical failure and of target impact if failure occurs, the damage potential of the failure, and the target value. Hazard may be evaluated as the product of these factors; i.e., expected loss during the current...
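
    The product-of-factors evaluation is direct arithmetic; all numbers below are hypothetical, chosen only to make the units concrete:

```python
def expected_loss(p_failure, p_impact_given_failure, damage_fraction, target_value):
    """Hazard as the product of the factors above: probability of
    mechanical failure, probability the failure strikes a target,
    average fraction of the target's value destroyed, and target value."""
    return p_failure * p_impact_given_failure * damage_fraction * target_value

# Hypothetical campsite tree: 5% seasonal failure probability, 10% chance
# a failure strikes the occupied site, half the value lost on average,
# site valued at $20,000 for planning purposes.
print(expected_loss(0.05, 0.10, 0.5, 20_000))   # expected loss ~ $50 per season
```

    Ranking trees by this expected loss is what lets a manager direct removal effort to the highest-hazard stems first.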

  13. Deployment Design of Wireless Sensor Network for Simple Multi-Point Surveillance of a Moving Target

    PubMed Central

    Tsukamoto, Kazuya; Ueda, Hirofumi; Tamura, Hitomi; Kawahara, Kenji; Oie, Yuji

    2009-01-01

    In this paper, we focus on the problem of tracking a moving target in a wireless sensor network (WSN), in which the capability of each sensor is relatively limited, to construct large-scale WSNs at a reasonable cost. We first propose two simple multi-point surveillance schemes for a moving target in a WSN and demonstrate that one of the schemes can achieve high tracking probability with low power consumption. In addition, we examine the relationship between tracking probability and sensor density through simulations, and then derive an approximate expression representing the relationship. As a result, we present guidelines for sensor density, tracking probability, and the number of monitoring sensors that satisfy a variety of application demands. PMID:22412326
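
    The paper derives its own approximate density-tracking relationship; as a generic stand-in, the standard Poisson (Boolean) coverage model captures the same qualitative guideline, with the number of sensors covering the target Poisson-distributed with mean density × pi × radius². The parameters below are illustrative:

```python
from math import exp, factorial, pi

def p_at_least_k_covering(density, radius, k):
    """Poisson (Boolean) coverage model: sensors form a spatial Poisson
    process, so the number whose sensing disc of the given radius covers
    the target is Poisson with mean density * pi * radius**2."""
    mean = density * pi * radius ** 2
    return 1 - sum(exp(-mean) * mean**i / factorial(i) for i in range(k))

def density_for(p_track, radius, k, lo=0.0, hi=100.0):
    """Smallest sensor density achieving tracking probability p_track with
    k monitoring sensors, by bisection on the monotone coverage function."""
    for _ in range(60):
        mid = (lo + hi) / 2
        if p_at_least_k_covering(mid, radius, k) < p_track:
            lo = mid
        else:
            hi = mid
    return hi

print(p_at_least_k_covering(1.0, 1.0, 1))   # single-sensor coverage probability
print(density_for(0.95, 1.0, 3))            # density for 3-sensor surveillance
```

    Inverting the coverage function, as `density_for` does, is exactly the kind of guideline the abstract describes: pick a tracking probability and sensor count, read off the required density.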

  14. Allometry indicates giant eyes of giant squid are not exceptional.

    PubMed

    Schmitz, Lars; Motani, Ryosuke; Oufiero, Christopher E; Martin, Christopher H; McGee, Matthew D; Gamarra, Ashlee R; Lee, Johanna J; Wainwright, Peter C

    2013-02-18

    The eyes of giant and colossal squid are among the largest eyes in the history of life. It was recently proposed that sperm whale predation is the main driver of eye size evolution in giant squid, on the basis of an optical model that suggested optimal performance in detecting large luminous visual targets such as whales in the deep sea. However, it is poorly understood how the eye size of giant and colossal squid compares to that of other aquatic organisms when scaling effects are considered. We performed a large-scale comparative study that included 87 squid species and 237 species of acanthomorph fish. While squid have larger eyes than most acanthomorphs, a comparison of relative eye size among squid suggests that giant and colossal squid do not have unusually large eyes. After revising constants used in a previous model we found that large eyes perform equally well in detecting point targets and large luminous targets in the deep sea. The eyes of giant and colossal squid do not appear exceptionally large when allometric effects are considered. It is probable that the giant eyes of giant squid result from a phylogenetically conserved developmental pattern manifested in very large animals. Whatever the cause of large eyes, they appear to have several advantages for vision in the reduced light of the deep mesopelagic zone.

  15. Pattern drilling exploration: Optimum pattern types and hole spacings when searching for elliptical shaped targets

    USGS Publications Warehouse

    Drew, L.J.

    1979-01-01

    In this study the selection of the optimum type of drilling pattern to be used when exploring for elliptical shaped targets is examined. The rhombic pattern is optimal when the targets are known to have a preferred orientation. Situations can also be found where a rectangular pattern is as efficient as the rhombic pattern. A triangular or square drilling pattern should be used when the orientations of the targets are unknown. The way in which the optimum hole spacing varies as a function of (1) the cost of drilling, (2) the value of the targets, (3) the shape of the targets, (4) the target occurrence probabilities was determined for several examples. Bayes' rule was used to show how target occurrence probabilities can be revised within a multistage pattern drilling scheme. ?? 1979 Plenum Publishing Corporation.
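
    The Bayes update between drilling stages can be sketched as follows, under the simplifying assumption that each hole independently intersects a present target with a fixed probability (the hypothetical numbers below are not from the paper):

```python
def revised_probability(prior, dry_holes, p_hit_given_present):
    """Bayes' rule after a drilling stage with no discovery: each hole is
    assumed, independently, to intersect a target (if one is present)
    with probability p_hit_given_present."""
    p_all_miss = (1 - p_hit_given_present) ** dry_holes
    return prior * p_all_miss / (prior * p_all_miss + (1 - prior))

p = 0.3                              # initial target occurrence probability
for stage in (1, 2, 3):              # three stages of five dry holes each
    p = revised_probability(p, 5, 0.2)
    print(stage, round(p, 4))        # belief in a target decays stage by stage
```

    Re-running the hole-spacing optimization with the revised probability after each stage is what makes the multistage scheme adaptive: as the occurrence probability falls, wider spacings (or abandonment) become optimal.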

  16. The effects of flow on schooling Devario aequipinnatus: school structure, startle response and information transmission

    PubMed Central

    Chicoli, A.; Butail, S.; Lun, Y.; Bak-Coleman, J.; Coombs, S.; Paley, D.A.

    2014-01-01

    To assess how flow affects school structure and threat detection, startle response rates of solitary and small groups of giant danio Devario aequipinnatus were compared to visual looming stimuli in flow and no-flow conditions. The instantaneous position and heading of each D. aequipinnatus were extracted from high-speed videos. Behavioural results indicate that (1) school structure is altered in flow such that D. aequipinnatus orient upstream while spanning out in a crosswise direction, (2) the probability of at least one D. aequipinnatus detecting the visual looming stimulus is higher in flow than no flow for both solitary D. aequipinnatus and groups of eight D. aequipinnatus, however, (3) the probability of three or more individuals responding is higher in no flow than flow. Taken together, these results indicate a higher probability of stimulus detection in flow but a higher probability of internal transmission of information in no flow. Finally, results were well predicted by a computational model of collective fright response that included the probability of direct detection (based on signal detection theory) and indirect detection (i.e. via interactions between group members) of threatening stimuli. This model provides a new theoretical framework for analysing the collective transfer of information among groups of fishes and other organisms. PMID:24773538
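
    The direct-detection component of such a model is a binomial count over group members. The per-fish probabilities below are hypothetical, and the social-transmission component (which is what favors multi-fish responses in still water) is deliberately not modeled:

```python
from math import comb

def p_at_least_k_detect(n, p, k):
    """Binomial tail: P(at least k of n group members independently make
    a direct detection, each with per-individual probability p)."""
    return sum(comb(n, d) * p**d * (1 - p) ** (n - d) for d in range(k, n + 1))

# Hypothetical per-fish direct-detection probabilities in the two conditions.
p_flow, p_still = 0.5, 0.3
print(p_at_least_k_detect(8, p_flow, 1), p_at_least_k_detect(8, p_still, 1))
print(p_at_least_k_detect(8, p_flow, 3), p_at_least_k_detect(8, p_still, 3))
```

    With independent detection alone, a higher per-fish probability raises both the any-fish and the three-or-more tails; reproducing the observed reversal for three or more responders requires adding the indirect, between-fish transmission term.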

  17. The Accuracy of Tank Main Armaments.

    DTIC Science & Technology

    1987-04-07

    width (m) 1.4, 3.2; hull height, width (m) 0.5, 1.0043, 1.1233, 0.357, 0.0 ... The program produces the following hit probabilities: a) Phit = 0.52 for ... hull defilade, b) Phit = 0.74 for fully exposed, c) Phit = 0.94 for the standard NATO target. The calculation of subsequent round hit probabilities is a more ... hit probabilities: a) Phit = 0.66 for hull defilade, b) Phit = 0.86 for fully exposed, c) Phit = 0.98 for the standard NATO target. Moving Firer Versus

  18. A discrimination method for the detection of pneumonia using chest radiograph.

    PubMed

    Noor, Norliza Mohd; Rijal, Omar Mohd; Yunus, Ashari; Abu-Bakar, S A R

    2010-03-01

    This paper presents a statistical method for the detection of lobar pneumonia using digitized chest X-ray films. Each region of interest was represented by a vector of wavelet texture measures, which was then multiplied by the orthogonal matrix Q(2). The first two elements of the transformed vectors were shown to have a bivariate normal distribution. Misclassification probabilities were estimated using probability ellipsoids and discriminant functions. The results of this study recommend detecting pneumonia by constructing probability ellipsoids or discriminant functions using maximum energy and maximum column sum energy texture measures, for which misclassification probabilities were less than 0.15. © 2009 Elsevier Ltd. All rights reserved.

  19. Phase-Conjugate Receiver for Gaussian-State Quantum Illumination

    NASA Technical Reports Server (NTRS)

    Erkmen, Baris I.; Guha, Saikat

    2010-01-01

    An active optical sensor probes a region of free space that is engulfed in bright thermal noise to determine the presence (or absence) of a weakly reflecting target. The returned light (which is just thermal noise if no target is present, and thermal noise plus a weak reflection of the probe beam if a target is present) is measured and processed by a receiver, and a decision is made on whether a target is present. It has been shown that generating an entangled pair of photons (a highly nonclassical state of light), using one photon as the probe beam and storing the other photon for comparison to the returned light, yields superior performance to traditional classical-light (coherent-state) target detection sensors. An entangled-photon transmitter and optimal receiver combination can yield up to a factor of 4 (i.e., 6 dB) gain in the error-probability exponent over a coherent-state transmitter and optimal receiver combination, in a highly lossy and noisy scenario (when both sensors have the same number of transmitted photons). However, the receiver that achieves this full advantage is not known. One structured receiver can close half of the 6-dB gap (i.e., a 3-dB improvement). It is based on phase-conjugating the returned light, then performing dual-balanced difference detection with the stored half of the entangled-photon pair. Active optical sensors are of tremendous value to NASA's missions. Although this work focuses on target detection, it can be extended to imaging (2D, 3D, hyperspectral, etc.) scenarios as well, where the image quality can be better than that offered by traditional active sensors. This is an optical receiver design whose components are, in principle, all implementable; however, the work is currently entirely theoretical, so NASA's future missions could benefit significantly from developing and demonstrating this capability. It is necessary to: (1) demonstrate a bench-top proof of the theoretical principle, (2) create an operational prototype off the bench, and (3) build a practical sensor that can fly in a mission.
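    The 6-dB and 3-dB exponent gains quoted above can be illustrated numerically with Chernoff-type error bounds of the form 0.5·exp(−c·MκN_S/N_B). The exponent coefficients assumed here (1/4 for coherent-state, 1/2 for the phase-conjugate receiver, 1 for the optimum quantum-illumination receiver) correspond to the lossy, noisy, low-brightness regime the abstract describes; the parameter values are arbitrary.

```python
import math

def error_prob_bound(coeff, kappa, n_s, n_b, m):
    """Chernoff-type bound 0.5 * exp(-coeff * M * kappa * N_S / N_B)
    on the target-detection error probability, where kappa is the
    roundtrip transmissivity, N_S the signal brightness, N_B the
    background brightness, and M the number of transmitted modes."""
    return 0.5 * math.exp(-coeff * m * kappa * n_s / n_b)

params = dict(kappa=0.01, n_s=0.01, n_b=20.0, m=1e6)
p_coherent = error_prob_bound(0.25, **params)    # classical benchmark
p_phase_conj = error_prob_bound(0.50, **params)  # 3 dB of the gap closed
p_qi_optimum = error_prob_bound(1.00, **params)  # full 6 dB advantage
```

    The factor-of-4 exponent gain shows up as the optimum-QI bound decaying four times faster than the coherent-state bound.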

  20. An Enhanced Method for Scheduling Observations of Large Sky Error Regions for Finding Optical Counterparts to Transients

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rana, Javed; Singhal, Akshat; Gadre, Bhooshan

    2017-04-01

    The discovery and subsequent study of optical counterparts to transient sources is crucial for their complete astrophysical understanding. Various gamma-ray burst (GRB) detectors, and more notably the ground-based gravitational wave detectors, typically have large uncertainties in the sky positions of detected sources. Searching these large sky regions spanning hundreds of square degrees is a formidable challenge for most ground-based optical telescopes, which can usually image less than tens of square degrees of the sky in a single night. We present algorithms for better scheduling of such follow-up observations in order to maximize the probability of imaging the optical counterpart, based on the all-sky probability distribution of the source position. We incorporate realistic observing constraints such as the diurnal cycle, telescope pointing limitations, available observing time, and the rising/setting of the target at the observatory's location. We use simulations to demonstrate that our proposed algorithms outperform the default greedy observing schedule used by many observatories. Our algorithms are applicable for follow-up of other transient sources with large positional uncertainties, such as Fermi-detected GRBs, and can easily be adapted for scheduling radio or space-based X-ray follow-up.
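    As a point of reference, the default greedy strategy that the paper's algorithms improve upon can be sketched as below; the tile probabilities, exposure time, and time budget are made-up values, and real schedulers must also fold in the diurnal cycle, pointing limits, and target rise/set times.

```python
def greedy_schedule(tile_probs, time_budget, exposure_time):
    """Observe sky tiles in descending order of enclosed source
    probability until the available observing time runs out
    (the greedy baseline many observatories use)."""
    schedule, covered = [], 0.0
    for tile, prob in sorted(tile_probs.items(), key=lambda kv: -kv[1]):
        if (len(schedule) + 1) * exposure_time > time_budget:
            break
        schedule.append(tile)
        covered += prob
    return schedule, covered

# Hypothetical localization: four tiles with their enclosed probabilities.
tiles = {"T1": 0.30, "T2": 0.25, "T3": 0.10, "T4": 0.05}
schedule, covered = greedy_schedule(tiles, time_budget=600.0, exposure_time=300.0)
```

    With a two-exposure budget the greedy scheduler takes the two richest tiles, covering 55% of the localization probability; the paper's point is that observability constraints can make this ordering suboptimal.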

  1. Three-dimensional obstacle classification in laser range data

    NASA Astrophysics Data System (ADS)

    Armbruster, Walter; Bers, Karl-Heinz

    1998-10-01

    The threat of hostile surveillance and weapon systems requires military aircraft to fly under extreme conditions such as low altitude, high speed, poor visibility and incomplete terrain information. The probability of collision with natural and man-made obstacles during such contour missions is high if detection capability is restricted to conventional vision aids. Forward-looking scanning laser rangefinders, which are presently being flight tested and evaluated at German proving grounds, provide a possible solution, having a large field of view, high angular and range resolution, a high pulse repetition rate, and sufficient pulse energy to register returns from wires at over 500 m range (depending on the system) with a high hit-and-detect probability. Despite the efficiency of the sensor, acceptance of current obstacle warning systems by test pilots is not very high, mainly due to the systems' inadequacies in obstacle recognition and visualization. This has motivated the development and testing of more advanced 3D scene analysis algorithms at FGAN-FIM to replace the obstacle recognition component of current warning systems. The basic ideas are to increase the recognition probability and to reduce the false alarm rate for hard-to-extract obstacles such as wires by using more readily recognizable objects such as terrain, poles, pylons and trees, and by implementing a hierarchical classification procedure to generate a parametric description of the terrain surface as well as the class, position, orientation, size and shape of all objects in the scene. The algorithms can be used for other applications such as terrain following, autonomous obstacle avoidance, and automatic target recognition.

  2. Simulating Operation of a Complex Sensor Network

    NASA Technical Reports Server (NTRS)

    Jennings, Esther; Clare, Loren; Woo, Simon

    2008-01-01

    Simulation Tool for ASCTA Microsensor Network Architecture (STAMiNA) ["ASCTA" denotes the Advanced Sensors Collaborative Technology Alliance.] is a computer program for evaluating conceptual sensor networks deployed over terrain to provide military situational awareness. This or a similar program is needed because of the complexity of interactions among such diverse phenomena as sensing and communication portions of a network, deployment of sensor nodes, effects of terrain, data-fusion algorithms, and threat characteristics. STAMiNA is built upon a commercial network-simulator engine, with extensions to include both sensing and communication models in a discrete-event simulation environment. Users can define (1) a mission environment, including terrain features; (2) objects to be sensed; (3) placements and modalities of sensors, abilities of sensors to sense objects of various types, and sensor false alarm rates; (4) trajectories of threatening objects; (5) means of dissemination and fusion of data; and (6) various network configurations. By use of STAMiNA, one can simulate detection of targets through sensing, dissemination of information by various wireless communication subsystems under various scenarios, and fusion of information, incorporating such metrics as target-detection probabilities, false-alarm rates, and communication loads, and capturing effects of terrain and threat.

  3. Targeted mutation screening panels expose systematic population bias in detection of cystic fibrosis risk.

    PubMed

    Lim, Regine M; Silver, Ari J; Silver, Maxwell J; Borroto, Carlos; Spurrier, Brett; Petrossian, Tanya C; Larson, Jessica L; Silver, Lee M

    2016-02-01

    Carrier screening for mutations contributing to cystic fibrosis (CF) is typically accomplished with panels composed of variants that are clinically validated primarily in patients of European descent. This approach has created a static genetic and phenotypic profile for CF. An opportunity now exists to reevaluate the disease profile of CFTR at a global population level. CFTR allele and genotype frequencies were obtained from a nonpatient cohort with more than 60,000 unrelated personal genomes collected by the Exome Aggregation Consortium. Likely disease-contributing mutations were identified with the use of public database annotations and computational tools. We identified 131 previously described and likely pathogenic variants and another 210 untested variants with a high probability of causing protein damage. None of the current genetic screening panels or existing CFTR mutation databases covered a majority of deleterious variants in any geographical population outside of Europe. Both clinical annotation and mutation coverage by commercially available targeted screening panels for CF are strongly biased toward detection of reproductive risk in persons of European descent. South and East Asian populations are severely underrepresented, in part because of a definition of disease that privileges the phenotype associated with European-typical CFTR alleles.

  4. The correct estimate of the probability of false detection of the matched filter in weak-signal detection problems

    NASA Astrophysics Data System (ADS)

    Vio, R.; Andreani, P.

    2016-05-01

    The reliable detection of weak signals is a critical issue in many astronomical contexts and may have severe consequences for determining number counts and luminosity functions, but also for optimizing the use of telescope time in follow-up observations. Because of its optimal properties, one of the most popular and widely used detection techniques is the matched filter (MF). This is a linear filter designed to maximise the detectability of a signal of known structure that is buried in additive Gaussian random noise. In this work we show that in the very common situation where the number and position of the searched signals within a data sequence (e.g. an emission line in a spectrum) or an image (e.g. a point-source in an interferometric map) are unknown, this technique, when applied in its standard form, may severely underestimate the probability of false detection. This is because the correct use of the MF relies upon a priori knowledge of the position of the signal of interest. In the absence of this information, the statistical significance of features that are actually noise is overestimated, and detections are claimed that are actually spurious. For this reason, we present an alternative method of computing the probability of false detection that is based on the probability density function (PDF) of the peaks of a random field. It is able to provide a correct estimate of the probability of false detection in the one-, two- and three-dimensional cases. We apply this technique to a real two-dimensional interferometric map obtained with ALMA.
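    The inflation of the false-detection probability when the signal position is unknown can be made concrete with the usual independence approximation over search positions; the per-position significance and map size below are illustrative.

```python
def global_false_detection_prob(p_single, n_positions):
    """Probability that pure noise exceeds the matched-filter threshold
    at *some* position, when the signal position is unknown and
    n_positions (approximately) independent positions are examined."""
    return 1.0 - (1.0 - p_single) ** n_positions

# A per-position significance of 1e-4 looks safe, yet over a 100x100 map
# a spurious "detection" somewhere is more likely than not:
p_global = global_false_detection_prob(1e-4, 100 * 100)
```

    This is exactly the sense in which the standard single-position MF significance overstates the reliability of a blind detection; the peak-PDF approach in the paper replaces the crude independence approximation with the statistics of random-field peaks.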

  5. Maximum likelihood estimation for the double-count method with independent observers

    USGS Publications Warehouse

    Manly, Bryan F.J.; McDonald, Lyman L.; Garner, Gerald W.

    1996-01-01

    Data collected under a double-count protocol during line transect surveys were analyzed using new maximum likelihood methods combined with Akaike's information criterion to provide estimates of the abundance of polar bear (Ursus maritimus Phipps) in a pilot study off the coast of Alaska. Visibility biases were corrected by modeling the detection probabilities using logistic regression functions. Independent variables that influenced the detection probabilities included perpendicular distance of bear groups from the flight line and the number of individuals in the groups. A series of models was considered, varying from (1) the simplest, where the probability of detection was the same for both observers and was not affected by either distance from the flight line or group size, to (2) models where the probability of detection was different for the two observers and depended on both distance from the transect and group size. Estimation procedures are developed for the case when additional variables may affect detection probabilities. The methods are illustrated using data from the pilot polar bear survey, and some recommendations are given for design of a survey over the larger Chukchi Sea between Russia and the United States.
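    A minimal sketch of the visibility-bias correction described above: each observer's detection probability is modeled by logistic regression on perpendicular distance and group size, and the two independent observers are combined. The coefficients below are hypothetical, not the fitted Alaska values.

```python
from math import exp

def detection_prob(distance_km, group_size, b0, b1, b2):
    """Logistic model for a single observer's detection probability."""
    eta = b0 + b1 * distance_km + b2 * group_size
    return 1.0 / (1.0 + exp(-eta))

def seen_by_either(p1, p2):
    """P(detected by at least one of two independent observers)."""
    return 1.0 - (1.0 - p1) * (1.0 - p2)

coefs = dict(b0=1.0, b1=-2.0, b2=0.3)  # hypothetical coefficients
p_near = detection_prob(0.1, 2, **coefs)   # group near the flight line
p_far = detection_prob(1.5, 2, **coefs)    # group far from the flight line
p_double = seen_by_either(p_near, p_near)  # both observers, near group
```

    The double-count design works because groups missed by one observer can be caught by the other, so the combined probability always exceeds either single-observer probability.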

  6. Probability of detection evaluation results for railroad tank car nondestructive testing : final report.

    DOT National Transportation Integrated Search

    2016-08-01

    The Federal Railroad Administration (FRA), Transportation Technology Center, Inc. (TTCI), and rail industry participants have performed probability of detection (POD) assessments to evaluate nondestructive testing (NDT) technologies, which are pr...

  7. Detecting background changes in environments with dynamic foreground by separating probability distribution function mixtures using Pearson's method of moments

    NASA Astrophysics Data System (ADS)

    Jenkins, Colleen; Jordan, Jay; Carlson, Jeff

    2007-02-01

    This paper presents parameter estimation techniques useful for detecting background changes in a video sequence with extreme foreground activity. A specific application of interest is automated detection of the covert placement of threats (e.g., a briefcase bomb) inside crowded public facilities. We propose that a histogram of pixel intensity acquired from a fixed mounted camera over time for a series of images will be a mixture of two Gaussian functions: the foreground probability distribution function and background probability distribution function. We will use Pearson's Method of Moments to separate the two probability distribution functions. The background function can then be "remembered" and changes in the background can be detected. Subsequent comparisons of background estimates are used to detect changes. Changes are flagged to alert security forces to the presence and location of potential threats. Results are presented that indicate the significant potential for robust parameter estimation techniques as applied to video surveillance.
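    Pearson's method of moments works by equating sample moments of the intensity histogram to the theoretical moments of the two-Gaussian mixture and solving for the component parameters. The forward direction of those moment equations is easy to sketch for the first two moments; the parameter values below are illustrative gray levels, not values from the paper.

```python
def mixture_mean_var(w, m1, s1, m2, s2):
    """Mean and variance of the mixture w*N(m1, s1^2) + (1-w)*N(m2, s2^2).
    Pearson's method inverts these (and higher-order) moment equations
    to recover w, m1, s1, m2, s2 from observed histogram moments."""
    mean = w * m1 + (1 - w) * m2
    second_raw = w * (s1**2 + m1**2) + (1 - w) * (s2**2 + m2**2)
    return mean, second_raw - mean**2

# Illustrative background (dark) and foreground (bright) components:
mean, var = mixture_mean_var(0.7, 50.0, 5.0, 120.0, 10.0)
```

    Note that the mixture variance exceeds either component's variance because the separation of the two means contributes a between-component term; that extra spread is the signature the separation exploits.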

  8. Nano-biosensing approaches on tuberculosis: Defy of aptamers.

    PubMed

    Golichenari, Behrouz; Nosrati, Rahim; Farokhi-Fard, Aref; Abnous, Khalil; Vaziri, Farzam; Behravan, Javad

    2018-06-11

    Tuberculosis is a major global health problem caused by the bacterium Mycobacterium tuberculosis (Mtb) complex. According to WHO reports, 53 million TB patients died from 2000 to 2016. Therefore, early diagnosis of the disease is of great importance for global health care programs. The restrictions of traditional methods have encouraged the development of innovative methods for rapid, reliable, and cost-effective diagnosis of tuberculosis. In recent years, aptamer-based biosensors or aptasensors have drawn great attention to sensitive and accessible detection of tuberculosis. Aptamers are small short single-stranded molecules of DNA or RNA that fold to a unique form and bind to targets. Once combined with nanomaterials, nano-scale aptasensors provide powerful analytical platforms for diagnosing of tuberculosis. Various groups designed and studied aptamers specific for the whole cells of M. tuberculosis, mycobacterial proteins and IFN-γ for early diagnosis of TB. Advantages such as high specificity and strong affinity, potential for binding to a larger variety of targets, increased stability, lower costs of synthesis and storage requirements, and lower probability of contamination make aptasensors pivotal alternatives for future TB diagnostics. In recent years, the concept of SOMAmer has opened new horizons in high precision detection of tuberculosis biomarkers. This review article provides a description of the research progresses of aptamer-based and SOMAmer-based biosensors and nanobiosensors for the detection of tuberculosis. Copyright © 2018 Elsevier B.V. All rights reserved.

  9. A search for J-band variability from late-L and T brown dwarfs

    NASA Astrophysics Data System (ADS)

    Clarke, F. J.; Hodgkin, S. T.; Oppenheimer, B. R.; Robertson, J.; Haubois, X.

    2008-06-01

    We present J-band photometric observations of eight late-L and T type brown dwarfs designed to search for variability. We detect small amplitude periodic variability from three of the objects on time-scales of several hours, probably indicating the rotation period of the objects. The other targets do not show any variability down to the level of 0.5-5 per cent. This work is based on observations obtained at the European Southern Observatory, La Silla, Chile (ESO Programme 72.C-0006). E-mail: fclarke@astro.ox.ac.uk (FJC); sth@ast.cam.ac.uk (STH); bro@amnh.org (BRO); xavier.haubois@obspm.fr (XH)

  10. Hubble Space Telescope Discovery of a Probable Caustic-Crossing Event in the MACS1149 Galaxy Cluster Field

    NASA Astrophysics Data System (ADS)

    Kelly, Patrick L.; Rodney, Steven; Diego, Jose Maria; Zitrin, Adi; Broadhurst, Tom; Selsing, Jonatan; Balestra, Italo; Benito, Alberto Molino; Bradac, Marusa; Bradley, Larry; Brammer, Gabriel; Cenko, Brad; Christensen, Lise; Coe, Dan; Filippenko, Alexei V.; Foley, Ryan; Frye, Brenda; Graham, Melissa; Graur, Or; Grillo, Claudio; Hjorth, Jens; Howell, Andy; Jauzac, Mathilde; Jha, Saurabh; Kaiser, Nick; Kawamata, Ryota; Kneib, Jean-Paul; Lotz, Jennifer; Matheson, Thomas; McCully, Curtis; Merten, Julian; Nonino, Mario; Oguri, Masamune; Richard, Johan; Riess, Adam; Rosati, Piero; Schmidt, Kasper Borello; Sharon, Keren; Smith, Nathan; Strolger, Lou; Treu, Tommaso; Wang, Xin; Weiner, Ben; Williams, Liliya; Zheng, Weikang

    2016-05-01

    While monitoring the MACS1149 (z = 0.54) galaxy cluster as part of the RefsdalRedux program (PID 14199; PI Kelly) with the Hubble Space Telescope (HST) WFC3 IR camera, we have detected a rising transient that appears to be coincident (…). Target-of-opportunity optical follow-up imaging in several ACS and WFC3 bands with the FrontierSN program (PID 14208; PI Rodney) has revealed that its rest-frame ultraviolet through optical spectrum may be reasonably well fit with that of a B star at z = 1.49 exhibiting a strong Balmer break.

  11. Partitioning Detectability Components in Populations Subject to Within-Season Temporary Emigration Using Binomial Mixture Models

    PubMed Central

    O’Donnell, Katherine M.; Thompson, Frank R.; Semlitsch, Raymond D.

    2015-01-01

    Detectability of individual animals is highly variable and nearly always < 1; imperfect detection must be accounted for to reliably estimate population sizes and trends. Hierarchical models can simultaneously estimate abundance and effective detection probability, but there are several different mechanisms that cause variation in detectability. Neglecting temporary emigration can lead to biased population estimates because availability and conditional detection probability are confounded. In this study, we extend previous hierarchical binomial mixture models to account for multiple sources of variation in detectability. The state process of the hierarchical model describes ecological mechanisms that generate spatial and temporal patterns in abundance, while the observation model accounts for the imperfect nature of counting individuals due to temporary emigration and false absences. We illustrate our model’s potential advantages, including the allowance of temporary emigration between sampling periods, with a case study of southern red-backed salamanders Plethodon serratus. We fit our model and a standard binomial mixture model to counts of terrestrial salamanders surveyed at 40 sites during 3–5 surveys each spring and fall 2010–2012. Our models generated similar parameter estimates to standard binomial mixture models. Aspect was the best predictor of salamander abundance in our case study; abundance increased as aspect became more northeasterly. Increased time-since-rainfall strongly decreased salamander surface activity (i.e. availability for sampling), while higher amounts of woody cover objects and rocks increased conditional detection probability (i.e. probability of capture, given an animal is exposed to sampling). By explicitly accounting for both components of detectability, we increased congruence between our statistical modeling and our ecological understanding of the system. 
We stress the importance of choosing survey locations and protocols that maximize species availability and conditional detection probability to increase population parameter estimate reliability. PMID:25775182
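    The confounding of availability and conditional detection can be seen in a quick simulation of the observation model: counts at a site are Binomial(N, theta*p), so availability theta and conditional detection p enter only through their product unless the model separates them, as the paper's extension does. All parameter values here are invented.

```python
import math
import random

def draw_poisson(rng, lam):
    """Knuth's inversion method; adequate for small lam."""
    threshold, k, prod = math.exp(-lam), 0, 1.0
    while True:
        prod *= rng.random()
        if prod <= threshold:
            return k
        k += 1

def simulate_counts(n_sites, lam, theta, p, n_surveys, seed=1):
    """Counts y[i][j] ~ Binomial(N_i, theta * p) with N_i ~ Poisson(lam):
    theta = availability (not temporarily emigrated), p = conditional
    detection probability given availability."""
    rng = random.Random(seed)
    counts = []
    for _ in range(n_sites):
        n_i = draw_poisson(rng, lam)  # latent abundance at site i
        counts.append([sum(rng.random() < theta * p for _ in range(n_i))
                       for _ in range(n_surveys)])
    return counts

counts = simulate_counts(n_sites=40, lam=10.0, theta=0.6, p=0.5, n_surveys=4)
```

    The simulated design (40 sites, repeated surveys) mirrors the salamander case study's structure; any (theta, p) pair with the same product yields the same count distribution, which is why within-season temporary emigration must be modeled explicitly to separate the two.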

  12. Optimizing probability of detection point estimate demonstration

    NASA Astrophysics Data System (ADS)

    Koshti, Ajay M.

    2017-04-01

    This paper discusses optimizing probability of detection (POD) demonstration experiments using the point estimate method. The optimization is performed to provide an acceptable value for the probability of passing the demonstration (PPD) and an acceptable value for the probability of false (POF) calls, while keeping the flaw sizes in the set as small as possible. The POD point estimate method is used by NASA for qualifying special NDE procedures, and it uses the binomial distribution for the probability density. Normally, a set of 29 flaws of the same size, within some tolerance, is used in the demonstration. Traditionally, the largest flaw size in the set is considered to be a conservative estimate of the flaw size with minimum 90% probability and 95% confidence, denoted as α90/95PE. The paper investigates the relationship between the range of flaw sizes and α90, i.e. the 90% probability flaw size, needed to provide a desired PPD. The range of flaw sizes is expressed as a proportion of the standard deviation of the probability density distribution, as is the difference between the median or average of the 29 flaws and α90. In general, it is concluded that, if the probability of detection increases with flaw size, the average of the 29 flaw sizes will always be larger than or equal to α90 and is an acceptable measure of α90/95PE. If the NDE technique has sufficient sensitivity and signal-to-noise ratio, the 29-flaw set can be optimized to meet requirements on minimum PPD, maximum allowable POF, flaw size tolerance about the mean flaw size, and flaw size detectability. The paper provides a procedure for optimizing flaw sizes in the point estimate demonstration flaw set.
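    The 29-flaw convention can be checked directly from the binomial model: the procedure must find all 29 flaws, and 29 is the smallest zero-miss sample size for which a technique with a true POD of only 0.90 would pass less than 5% of the time (hence 90% POD demonstrated at 95% confidence).

```python
def prob_zero_misses(n_flaws, true_pod):
    """Probability that a technique with the given true POD detects
    all n_flaws flaws (binomial model with zero misses allowed)."""
    return true_pod ** n_flaws

p29 = prob_zero_misses(29, 0.90)  # just under the 5% pass probability
p28 = prob_zero_misses(28, 0.90)  # 28 flaws would not be enough
```

    Since 0.9^29 ≈ 0.047 < 0.05 ≤ 0.9^28, a clean 29-for-29 result rules out POD ≤ 0.90 at the 95% confidence level, which is the statistical basis of the α90/95PE point estimate.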

  13. Fast adaptation of the internal model of gravity for manual interceptions: evidence for event-dependent learning.

    PubMed

    Zago, Myrka; Bosco, Gianfranco; Maffei, Vincenzo; Iosa, Marco; Ivanenko, Yuri P; Lacquaniti, Francesco

    2005-02-01

    We studied how subjects learn to deal with two conflicting sensory environments as a function of the probability of each environment and the temporal distance between repeated events. Subjects were asked to intercept a visual target moving downward on a screen with randomized laws of motion. We compared five protocols that differed in the probability of constant speed (0g) targets and accelerated (1g) targets. Probability ranged from 9 to 100%, and the time interval between consecutive repetitions of the same target ranged from about 1 to 20 min. We found that subjects systematically timed their responses consistent with the assumption of gravity effects, for both 1 and 0g trials. With training, subjects rapidly adapted to 0g targets by shifting the time of motor activation. Surprisingly, the adaptation rate was independent of both the probability of 0g targets and their temporal distance. Very few 0g trials sporadically interspersed as catch trials during immersive practice with 1g trials were sufficient for learning and consolidation in long-term memory, as verified by retesting after 24 h. We argue that the memory store for adapted states of the internal gravity model is triggered by individual events and can be sustained for prolonged periods of time separating sporadic repetitions. This form of event-related learning could depend on multiple-stage memory, with exponential rise and decay in the initial stages followed by a sample-and-hold module.

  14. Site specific probability of passive acoustic detection of humpback whale calls from single fixed hydrophones.

    PubMed

    Helble, Tyler A; D'Spain, Gerald L; Hildebrand, John A; Campbell, Gregory S; Campbell, Richard L; Heaney, Kevin D

    2013-09-01

    Passive acoustic monitoring of marine mammal calls is an increasingly important method for assessing population numbers, distribution, and behavior. A common mistake in the analysis of marine mammal acoustic data is formulating conclusions about these animals without first understanding how environmental properties such as bathymetry, sediment properties, water column sound speed, and ocean acoustic noise influence the detection and character of vocalizations in the acoustic data. The approach in this paper is to use Monte Carlo simulations with a full wave field acoustic propagation model to characterize the site specific probability of detection of six types of humpback whale calls at three passive acoustic monitoring locations off the California coast. Results show that the probability of detection can vary by factors greater than ten when comparing detections across locations, or comparing detections at the same location over time, due to environmental effects. Effects of uncertainties in the inputs to the propagation model are also quantified, and the model accuracy is assessed by comparing calling statistics amassed from 24,690 humpback units recorded in the month of October 2008. Under certain conditions, the probability of detection can be estimated with uncertainties sufficiently small to allow for accurate density estimates.
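    A heavily simplified stand-in for the paper's Monte Carlo procedure: draw transmission loss and noise level from site-specific distributions and count how often the received SNR clears the detector threshold. The real analysis uses a full wave-field propagation model rather than the Gaussian draws assumed here, and all dB values below are invented.

```python
import random

def monte_carlo_pod(n_trials, source_level_db, tl_mean, tl_sd,
                    noise_mean, noise_sd, snr_threshold_db, seed=7):
    """Fraction of simulated calls whose received SNR (dB) exceeds the
    detection threshold, with transmission loss and noise level drawn
    from Gaussian site distributions (illustrative only)."""
    rng = random.Random(seed)
    hits = sum(
        source_level_db - rng.gauss(tl_mean, tl_sd)
        - rng.gauss(noise_mean, noise_sd) >= snr_threshold_db
        for _ in range(n_trials)
    )
    return hits / n_trials

pod = monte_carlo_pod(20000, source_level_db=170.0, tl_mean=80.0, tl_sd=6.0,
                      noise_mean=75.0, noise_sd=5.0, snr_threshold_db=10.0)
```

    Site-to-site differences enter through the transmission-loss and noise distributions: shifting the mean transmission loss by a few dB can change the estimated probability of detection substantially, which is the order-of-magnitude variability the paper reports across monitoring locations.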

  15. Spaced Retrieval Enhances Memory for a Name-Face-Occupation Association in Older Adults with Probable Alzheimer's Disease

    ERIC Educational Resources Information Center

    Cherry, Katie E.; Walvoord, Ashley A. G.; Hawley, Karri S.

    2010-01-01

    The authors trained 4 older adults with probable Alzheimer's disease to recall a name-face-occupation association using the spaced retrieval technique. Six training sessions were administered over a 2-week period. On each trial, participants selected a target photograph and stated the target name and occupation at increasingly longer retention…

  16. Integrating geological uncertainty in long-term open pit mine production planning by ant colony optimization

    NASA Astrophysics Data System (ADS)

    Gilani, Seyed-Omid; Sattarvand, Javad

    2016-02-01

    Meeting production targets in terms of ore quantity and quality is critical for a successful mining operation. In-situ grade uncertainty causes both deviations from production targets and general financial deficits. A new stochastic optimization algorithm based on the ant colony optimization (ACO) approach is developed herein to integrate geological uncertainty described through a series of simulated ore bodies. Two different strategies were developed, based on a single predefined probability value (Prob) and on multiple probability values (Prob_nt), respectively, in order to improve the initial solutions created by the deterministic ACO procedure. Application at the Sungun copper mine in the northwest of Iran demonstrates the ability of the stochastic approach to create a single schedule, control the risk of deviating from production targets over time, and increase the project value. A comparison between the two strategies and the traditional approach illustrates that the multiple-probability strategy is able to produce better schedules, while the single predefined probability is more practical in projects requiring a high degree of flexibility.

  17. A Bayesian predictive two-stage design for phase II clinical trials.

    PubMed

    Sambucini, Valeria

    2008-04-15

    In this paper, we propose a Bayesian two-stage design for phase II clinical trials, which represents a predictive version of the single threshold design (STD) recently introduced by Tan and Machin. The STD two-stage sample sizes are determined specifying a minimum threshold for the posterior probability that the true response rate exceeds a pre-specified target value and assuming that the observed response rate is slightly higher than the target. Unlike the STD, we do not refer to a fixed experimental outcome, but take into account the uncertainty about future data. In both stages, the design aims to control the probability of getting a large posterior probability that the true response rate exceeds the target value. Such a probability is expressed in terms of prior predictive distributions of the data. The performance of the design is based on the distinction between analysis and design priors, recently introduced in the literature. The properties of the method are studied when all the design parameters vary.
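    The key posterior quantity in such designs, P(true response rate > target), can be sketched with a conjugate Beta model; the prior, interim data, and target below are illustrative, and a production implementation would use an incomplete-beta routine rather than the midpoint integration here.

```python
import math

def beta_pdf(x, a, b):
    """Density of Beta(a, b), normalized via log-gamma for stability."""
    log_c = math.lgamma(a + b) - math.lgamma(a) - math.lgamma(b)
    return math.exp(log_c + (a - 1.0) * math.log(x)
                    + (b - 1.0) * math.log(1.0 - x))

def prob_rate_exceeds(target, successes, n, a0=1.0, b0=1.0, steps=20000):
    """P(p > target) under a Beta(a0, b0) prior after observing
    `successes` responses in n patients (midpoint-rule integration
    of the Beta(a0+successes, b0+n-successes) posterior)."""
    a, b = a0 + successes, b0 + n - successes
    h = (1.0 - target) / steps
    return sum(beta_pdf(target + (i + 0.5) * h, a, b)
               for i in range(steps)) * h

# Illustrative interim look: 9 responses in 20 patients, target rate 0.30.
prob = prob_rate_exceeds(0.30, 9, 20)
```

    A design of this type proceeds to stage two only when this posterior probability clears a prespecified threshold; the predictive version averages such checks over the prior predictive distribution of future data rather than conditioning on one fixed outcome.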

  18. Impact of high-risk conjunctions on Active Debris Removal target selection

    NASA Astrophysics Data System (ADS)

    Lidtke, Aleksander A.; Lewis, Hugh G.; Armellin, Roberto

    2015-10-01

    Space debris simulations show that if current space launches continue unchanged, spacecraft operations might become difficult in the congested space environment. It has been suggested that Active Debris Removal (ADR) might be necessary in order to prevent such a situation. Selection of objects to be targeted by ADR is considered important because removal of non-relevant objects will unnecessarily increase the cost of ADR. One of the factors to be used in this ADR target selection is the collision probability accumulated by every object. This paper shows the impact of high-probability conjunctions on the collision probability accumulated by individual objects as well as the probability of any collision occurring in orbit. Such conjunctions cannot be predicted far in advance and, consequently, not all the objects that will be involved in such dangerous conjunctions can be removed through ADR. Therefore, a debris remediation method that would address such events at short notice, and thus help prevent likely collisions, is suggested.
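    Accumulating collision probability over a series of conjunctions is a simple complement-product under an independence assumption; the per-conjunction probabilities below are invented, but they illustrate how a single high-risk conjunction can dominate an object's accumulated total and upset a ranking built from routine encounters alone.

```python
def accumulated_collision_prob(conjunction_probs):
    """P(at least one collision) from per-conjunction collision
    probabilities, assumed independent."""
    p_none = 1.0
    for p in conjunction_probs:
        p_none *= 1.0 - p
    return 1.0 - p_none

routine_only = accumulated_collision_prob([1e-5] * 100)
with_high_risk = accumulated_collision_prob([1e-5] * 100 + [1e-2])
```

    Because such high-probability events cannot be predicted far in advance, a target list fixed years ahead of time can miss the objects that end up accumulating most of the risk, which motivates the short-notice remediation the paper suggests.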

  19. Review of Literature for Model Assisted Probability of Detection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meyer, Ryan M.; Crawford, Susan L.; Lareau, John P.

    This is a draft technical letter report for the NRC client documenting a literature review of model-assisted probability of detection (MAPOD) for potential application to nuclear power plant components, for improvement of field NDE performance estimation.

  20. Probability of detection evaluation results for railroad tank cars : final report.

    DOT National Transportation Integrated Search

    2016-08-01

    The Transportation Technology Center, Inc. (TTCI) used the approach developed for the National Aeronautics and Space Administration (NASA) to determine the probability of detection (POD) for various nondestructive test (NDT) methods used during inspection ...
