Sample records for modelling detection probabilities

  1. A removal model for estimating detection probabilities from point-count surveys

    USGS Publications Warehouse

    Farnsworth, G.L.; Pollock, K.H.; Nichols, J.D.; Simons, T.R.; Hines, J.E.; Sauer, J.R.

    2000-01-01

    We adapted a removal model to estimate detection probability during point count surveys. The model assumes one factor influencing detection during point counts is the singing frequency of birds. This may be true for surveys recording forest songbirds when most detections are by sound. The model requires counts to be divided into several time intervals. We used time intervals of 2, 5, and 10 min to develop a maximum-likelihood estimator for the detectability of birds during such surveys. We applied this technique to data from bird surveys conducted in Great Smoky Mountains National Park. We used model selection criteria to identify whether detection probabilities varied among species, throughout the morning, throughout the season, and among different observers. The overall detection probability for all birds was 75%. We found differences in detection probability among species. Species that sing frequently such as Winter Wren and Acadian Flycatcher had high detection probabilities (about 90%) and species that call infrequently such as Pileated Woodpecker had low detection probability (36%). We also found detection probabilities varied with the time of day for some species (e.g. thrushes) and between observers for other species. This method of estimating detectability during point count surveys offers a promising new approach to using count data to address questions of bird abundance, density, and population trends.
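The removal estimator described above can be sketched numerically. In this hedged illustration the counts, interval lengths, and function name are hypothetical, a constant per-minute detection (singing) rate is assumed, and `scipy` supplies the optimizer.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def removal_mle(counts, lengths):
    """Estimate a per-minute detection rate c and the overall detection
    probability from removal counts, where counts[i] is the number of
    birds first detected in interval i and lengths[i] is that
    interval's length in minutes."""
    counts = np.asarray(counts, dtype=float)
    ends = np.cumsum(lengths).astype(float)
    starts = ends - np.asarray(lengths, dtype=float)

    def nll(c):
        # probability a bird is first detected in each interval,
        # conditional on being detected at all during the count
        cell = np.exp(-c * starts) - np.exp(-c * ends)
        p_total = 1.0 - np.exp(-c * ends[-1])
        return -np.sum(counts * np.log(cell / p_total))

    c_hat = minimize_scalar(nll, bounds=(1e-6, 5.0), method="bounded").x
    return c_hat, 1.0 - np.exp(-c_hat * ends[-1])

# hypothetical counts for a count split into 2-, 3-, and 5-minute intervals
c_hat, p_detect = removal_mle([40, 12, 10], [2, 3, 5])
```

Conditioning on detection lets the pattern of first detections across intervals identify the rate; the overall detection probability then follows from the fitted rate and the total count duration.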

  2. Site occupancy models with heterogeneous detection probabilities

    USGS Publications Warehouse

    Royle, J. Andrew

    2006-01-01

    Models for estimating the probability of occurrence of a species in the presence of imperfect detection are important in many ecological disciplines. In these 'site occupancy' models, the possibility of heterogeneity in detection probabilities among sites must be considered because variation in abundance (and other factors) among sampled sites induces variation in detection probability (p). In this article, I develop occurrence probability models that allow for heterogeneous detection probabilities by considering several common classes of mixture distributions for p. For any mixing distribution, the likelihood has the general form of a zero-inflated binomial mixture for which inference based upon integrated likelihood is straightforward. A recent paper by Link (2003, Biometrics 59, 1123–1130) demonstrates that in closed population models used for estimating population size, different classes of mixture distributions are indistinguishable from data, yet can produce very different inferences about population size. I demonstrate that this problem can also arise in models for estimating site occupancy in the presence of heterogeneous detection probabilities. The implications of this are discussed in the context of an application to avian survey data and the development of animal monitoring programs.
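For concreteness, the simplest member of this model family (constant psi and p, i.e. a degenerate mixing distribution) can be fit by maximizing the zero-inflated binomial likelihood directly; the detection counts below are hypothetical and the function names are ours.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import binom

def zib_nll(params, y, K):
    """Negative log-likelihood of a zero-inflated binomial occupancy
    model: a site is occupied with probability psi, and an occupied
    site yields Binomial(K, p) detections over K visits."""
    psi = 1.0 / (1.0 + np.exp(-params[0]))  # parameters on the logit scale
    p = 1.0 / (1.0 + np.exp(-params[1]))
    lik = psi * binom.pmf(y, K, p) + (1.0 - psi) * (y == 0)
    return -np.sum(np.log(lik))

# hypothetical detection counts from K = 5 visits at eight sites
y = np.array([0, 2, 0, 1, 3, 0, 0, 4])
fit = minimize(zib_nll, x0=[0.0, 0.0], args=(y, 5), method="Nelder-Mead")
psi_hat = 1.0 / (1.0 + np.exp(-fit.x[0]))
```

Richer mixing distributions for p (finite mixtures, beta mixing) replace the single binomial term with an integral or weighted sum, but the zero-inflated structure of the likelihood is unchanged.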

  3. A removal model for estimating detection probabilities from point-count surveys

    USGS Publications Warehouse

    Farnsworth, G.L.; Pollock, K.H.; Nichols, J.D.; Simons, T.R.; Hines, J.E.; Sauer, J.R.

    2002-01-01

    Use of point-count surveys is a popular method for collecting data on abundance and distribution of birds. However, analyses of such data often ignore potential differences in detection probability. We adapted a removal model to directly estimate detection probability during point-count surveys. The model assumes that singing frequency is a major factor influencing probability of detection when birds are surveyed using point counts. This may be appropriate for surveys in which most detections are by sound. The model requires counts to be divided into several time intervals. Point counts are often conducted for 10 min, where the number of birds recorded is divided into those first observed in the first 3 min, the subsequent 2 min, and the last 5 min. We developed a maximum-likelihood estimator for the detectability of birds recorded during counts divided into those intervals. This technique can easily be adapted to point counts divided into intervals of any length. We applied this method to unlimited-radius counts conducted in Great Smoky Mountains National Park. We used model selection criteria to identify whether detection probabilities varied among species, throughout the morning, throughout the season, and among different observers. We found differences in detection probability among species. Species that sing frequently such as Winter Wren (Troglodytes troglodytes) and Acadian Flycatcher (Empidonax virescens) had high detection probabilities (∼90%) and species that call infrequently such as Pileated Woodpecker (Dryocopus pileatus) had low detection probability (36%). We also found detection probabilities varied with the time of day for some species (e.g. thrushes) and between observers for other species. We used the same approach to estimate detection probability and density for a subset of the observations with limited-radius point counts.

  4. Incorporating detection probability into northern Great Plains pronghorn population estimates

    USGS Publications Warehouse

    Jacques, Christopher N.; Jenks, Jonathan A.; Grovenburg, Troy W.; Klaver, Robert W.; DePerno, Christopher S.

    2014-01-01

    Pronghorn (Antilocapra americana) abundances commonly are estimated using fixed-wing surveys, but these estimates are likely to be negatively biased because of violations of key assumptions underpinning line-transect methodology. Reducing bias and improving precision of abundance estimates through use of detection probability and mark-resight models may allow for more responsive pronghorn management actions. Given their potential application in population estimation, we evaluated detection probability and mark-resight models for use in estimating pronghorn population abundance. We used logistic regression to quantify probabilities that detecting pronghorn might be influenced by group size, animal activity, percent vegetation, cover type, and topography. We estimated pronghorn population size by study area and year using mixed logit-normal mark-resight (MLNM) models. Pronghorn detection probability increased with group size, animal activity, and percent vegetation; overall detection probability was 0.639 (95% CI = 0.612–0.667) with 396 of 620 pronghorn groups detected. Despite model selection uncertainty, the best detection probability models were 44% (range = 8–79%) and 180% (range = 139–217%) greater than traditional pronghorn population estimates. Similarly, the best MLNM models were 28% (range = 3–58%) and 147% (range = 124–180%) greater than traditional population estimates. Detection probability of pronghorn was not constant but depended on both intrinsic and extrinsic factors. When pronghorn detection probability is a function of animal group size, animal activity, landscape complexity, and percent vegetation, traditional aerial survey techniques will result in biased pronghorn abundance estimates. Standardizing survey conditions, increasing resighting occasions, or accounting for variation in individual heterogeneity in mark-resight models will increase the accuracy and precision of pronghorn population estimates.
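The logistic-regression step described above can be sketched as follows; the group-size data, variable names, and single-covariate setup are hypothetical, illustrating only one of the study's covariates.

```python
import numpy as np
from scipy.optimize import minimize

# hypothetical survey data: size of each pronghorn group and whether
# the group was detected (1) or missed (0) during the flight
size = np.array([2.0, 5.0, 1.0, 8.0, 3.0, 12.0, 4.0, 6.0, 1.0, 9.0])
seen = np.array([0.0, 1.0, 0.0, 1.0, 1.0, 1.0, 0.0, 1.0, 0.0, 1.0])

def nll(beta):
    """Negative log-likelihood of a logistic regression of detection
    on group size."""
    eta = beta[0] + beta[1] * size
    p = np.clip(1.0 / (1.0 + np.exp(-eta)), 1e-9, 1.0 - 1e-9)
    return -np.sum(seen * np.log(p) + (1.0 - seen) * np.log(1.0 - p))

fit = minimize(nll, x0=np.zeros(2))
b0, b1 = fit.x  # a positive b1 means larger groups are easier to detect
```

The fitted probabilities can then weight observed groups (each group contributes 1/p) to correct raw counts for imperfect detection.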

  5. Using areas of known occupancy to identify sources of variation in detection probability of raptors: taking time lowers replication effort for surveys.

    PubMed

    Murn, Campbell; Holloway, Graham J

    2016-10-01

    Species occurring at low density can be difficult to detect and if not properly accounted for, imperfect detection will lead to inaccurate estimates of occupancy. Understanding sources of variation in detection probability and how they can be managed is a key part of monitoring. We used sightings data of a low-density and elusive raptor (white-headed vulture Trigonoceps occipitalis) in areas of known occupancy (breeding territories) in a likelihood-based modelling approach to calculate detection probability and the factors affecting it. Because occupancy was known a priori to be 100%, we fixed the model occupancy parameter to 1.0 and focused on identifying sources of variation in detection probability. Using detection histories from 359 territory visits, we assessed nine covariates in 29 candidate models. The model with the highest support indicated that observer speed during a survey, combined with temporal covariates such as time of year and length of time within a territory, had the highest influence on the detection probability. Averaged detection probability was 0.207 (s.e. 0.033) and based on this the mean number of visits required to determine within 95% confidence that white-headed vultures are absent from a breeding area is 13 (95% CI: 9-20). Topographical and habitat covariates contributed little to the best models and had little effect on detection probability. We highlight that the low detection probabilities of some species mean that emphasizing habitat covariates could lead to spurious results in occupancy models that do not also incorporate temporal components. While variation in detection probability is complex and influenced by effects at both temporal and spatial scales, temporal covariates can and should be controlled as part of robust survey methods. Our results emphasize the importance of accounting for detection probability in occupancy studies, particularly during presence/absence studies for species such as raptors that are widespread and occur at low densities.
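The 13-visit figure above follows from a standard absence-confirmation calculation; a minimal sketch, assuming independent visits with a constant detection probability:

```python
import math

def visits_for_absence(p, confidence=0.95):
    """Smallest number of surveys for which the probability of missing
    the species on every visit drops below 1 - confidence."""
    return math.ceil(math.log(1.0 - confidence) / math.log(1.0 - p))

n_visits = visits_for_absence(0.207)  # -> 13, matching the estimate above
```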

  6. Estimating site occupancy rates for aquatic plants using spatial sub-sampling designs when detection probabilities are less than one

    USGS Publications Warehouse

    Nielson, Ryan M.; Gray, Brian R.; McDonald, Lyman L.; Heglund, Patricia J.

    2011-01-01

    Estimation of site occupancy rates when detection probabilities are <1 is well established in wildlife science. Data from multiple visits to a sample of sites are used to estimate detection probabilities and the proportion of sites occupied by focal species. In this article we describe how site occupancy methods can be applied to estimate occupancy rates of plants and other sessile organisms. We illustrate this approach and the pitfalls of ignoring incomplete detection using spatial data for 2 aquatic vascular plants collected under the Upper Mississippi River's Long Term Resource Monitoring Program (LTRMP). Site occupancy models considered include: a naïve model that ignores incomplete detection, a simple site occupancy model assuming a constant occupancy rate and a constant probability of detection across sites, several models that allow site occupancy rates and probabilities of detection to vary with habitat characteristics, and mixture models that allow for unexplained variation in detection probabilities. We used information theoretic methods to rank competing models and bootstrapping to evaluate the goodness-of-fit of the final models. Results of our analysis confirm that ignoring incomplete detection can result in biased estimates of occupancy rates. Estimates of site occupancy rates for 2 aquatic plant species were 19–36% higher compared to naïve estimates that ignored probabilities of detection <1. Simulations indicate that final models have little bias when 50 or more sites are sampled, and little gain in precision could be expected for sample sizes >300. We recommend applying site occupancy methods for monitoring presence of aquatic species.
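A quick simulation illustrates the bias the authors describe; the parameter values here are hypothetical, not taken from the LTRMP data.

```python
import numpy as np

rng = np.random.default_rng(1)
psi, p, K, S = 0.6, 0.3, 3, 10_000  # true occupancy, detection, visits, sites

occupied = rng.random(S) < psi
detections = rng.binomial(K, p, S) * occupied  # zero at unoccupied sites

naive = np.mean(detections > 0)    # "occupied" = detected at least once
p_star = 1 - (1 - p) ** K          # chance an occupied site is ever detected
corrected = naive / p_star         # simple moment-style correction
```

`naive` lands well below the true psi = 0.6, while dividing by the probability that an occupied site is detected at least once recovers it, which is the essence of the upward revision reported for the two plant species.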

  7. Factors influencing territorial occupancy and reproductive success in a Eurasian Eagle-owl (Bubo bubo) population.

    PubMed

    León-Ortega, Mario; Jiménez-Franco, María V; Martínez, José E; Calvo, José F

    2017-01-01

    Modelling territorial occupancy and reproductive success is a key issue for better understanding the population dynamics of territorial species. This study aimed to investigate these ecological processes in a Eurasian Eagle-owl (Bubo bubo) population in south-eastern Spain during a seven-year period. A multi-season, multi-state modelling approach was followed to estimate the probabilities of occupancy and reproductive success in relation to previous state, time and habitat covariates, and accounting for imperfect detection. The best estimated models showed past breeding success in the territories to be the most important factor determining a high probability of reoccupation and reproductive success in the following year. In addition, alternative occupancy models suggested the positive influence of crops on the probability of territory occupation. By contrast, the best reproductive model revealed strong interannual variations in the rates of breeding success, which may be related to changes in the abundance of the European Rabbit, the main prey of the Eurasian Eagle-owl. Our models also estimated the probabilities of detecting the presence of owls in a given territory and the probability of detecting evidence of successful reproduction. Estimated detection probabilities were high throughout the breeding season, decreasing in time for unsuccessful breeders but increasing for successful breeders. The probability of detecting reproductive success increased with time, being close to one in the last survey. These results suggest that reproduction failure in the early stages of the breeding season is a determinant factor in the probability of detecting occupancy and reproductive success.

  8. Factors influencing territorial occupancy and reproductive success in a Eurasian Eagle-owl (Bubo bubo) population

    PubMed Central

    León-Ortega, Mario; Jiménez-Franco, María V.; Martínez, José E.

    2017-01-01

    Modelling territorial occupancy and reproductive success is a key issue for better understanding the population dynamics of territorial species. This study aimed to investigate these ecological processes in a Eurasian Eagle-owl (Bubo bubo) population in south-eastern Spain during a seven-year period. A multi-season, multi-state modelling approach was followed to estimate the probabilities of occupancy and reproductive success in relation to previous state, time and habitat covariates, and accounting for imperfect detection. The best estimated models showed past breeding success in the territories to be the most important factor determining a high probability of reoccupation and reproductive success in the following year. In addition, alternative occupancy models suggested the positive influence of crops on the probability of territory occupation. By contrast, the best reproductive model revealed strong interannual variations in the rates of breeding success, which may be related to changes in the abundance of the European Rabbit, the main prey of the Eurasian Eagle-owl. Our models also estimated the probabilities of detecting the presence of owls in a given territory and the probability of detecting evidence of successful reproduction. Estimated detection probabilities were high throughout the breeding season, decreasing in time for unsuccessful breeders but increasing for successful breeders. The probability of detecting reproductive success increased with time, being close to one in the last survey. These results suggest that reproduction failure in the early stages of the breeding season is a determinant factor in the probability of detecting occupancy and reproductive success. PMID:28399175

  9. Partitioning Detectability Components in Populations Subject to Within-Season Temporary Emigration Using Binomial Mixture Models

    PubMed Central

    O’Donnell, Katherine M.; Thompson, Frank R.; Semlitsch, Raymond D.

    2015-01-01

    Detectability of individual animals is highly variable and nearly always < 1; imperfect detection must be accounted for to reliably estimate population sizes and trends. Hierarchical models can simultaneously estimate abundance and effective detection probability, but there are several different mechanisms that cause variation in detectability. Neglecting temporary emigration can lead to biased population estimates because availability and conditional detection probability are confounded. In this study, we extend previous hierarchical binomial mixture models to account for multiple sources of variation in detectability. The state process of the hierarchical model describes ecological mechanisms that generate spatial and temporal patterns in abundance, while the observation model accounts for the imperfect nature of counting individuals due to temporary emigration and false absences. We illustrate our model’s potential advantages, including the allowance of temporary emigration between sampling periods, with a case study of southern red-backed salamanders Plethodon serratus. We fit our model and a standard binomial mixture model to counts of terrestrial salamanders surveyed at 40 sites during 3–5 surveys each spring and fall 2010–2012. Our models generated similar parameter estimates to standard binomial mixture models. Aspect was the best predictor of salamander abundance in our case study; abundance increased as aspect became more northeasterly. Increased time-since-rainfall strongly decreased salamander surface activity (i.e. availability for sampling), while higher amounts of woody cover objects and rocks increased conditional detection probability (i.e. probability of capture, given an animal is exposed to sampling). By explicitly accounting for both components of detectability, we increased congruence between our statistical modeling and our ecological understanding of the system. We stress the importance of choosing survey locations and protocols that maximize species availability and conditional detection probability to increase population parameter estimate reliability. PMID:25775182
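The two detectability components the model separates can be seen in a small simulation; the abundances and probabilities below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
n_sites = 40
N = rng.poisson(30, n_sites)  # latent salamander abundance per site
theta = 0.4                   # availability (surface activity)
p_cond = 0.5                  # conditional detection, given availability

available = rng.binomial(N, theta)        # animals exposed to sampling
counts = rng.binomial(available, p_cond)  # animals actually counted

effective_p = theta * p_cond  # counts average about N * effective_p
```

A standard binomial mixture model estimates only the product `theta * p_cond`; modelling `available` as its own latent layer is what lets the extended model attribute variation to surface activity versus capture separately.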

  10. Modeling stream fish distributions using interval-censored detection times.

    PubMed

    Ferreira, Mário; Filipe, Ana Filipa; Bardos, David C; Magalhães, Maria Filomena; Beja, Pedro

    2016-08-01

    Controlling for imperfect detection is important for developing species distribution models (SDMs). Occupancy-detection models based on the time needed to detect a species can be used to address this problem, but this is hindered when times to detection are not known precisely. Here, we extend the time-to-detection model to deal with detections recorded in time intervals and illustrate the method using a case study on stream fish distribution modeling. We collected electrofishing samples of six fish species across a Mediterranean watershed in Northeast Portugal. Based on a Bayesian hierarchical framework, we modeled the probability of water presence in stream channels, and the probability of species occupancy conditional on water presence, in relation to environmental and spatial variables. We also modeled time-to-first detection conditional on occupancy in relation to local factors, using modified interval-censored exponential survival models. Posterior distributions of occupancy probabilities derived from the models were used to produce species distribution maps. Simulations indicated that the modified time-to-detection model provided unbiased parameter estimates despite interval-censoring. There was a tendency for spatial variation in detection rates to be primarily influenced by depth and, to a lesser extent, stream width. Species occupancies were consistently affected by stream order, elevation, and annual precipitation. Bayesian P-values and AUCs indicated that all models had adequate fit and high discrimination ability, respectively. Mapping of predicted occupancy probabilities showed widespread distribution by most species, but uncertainty was generally higher in tributaries and upper reaches. The interval-censored time-to-detection model provides a practical solution to model occupancy-detection when detections are recorded in time intervals. This modeling framework is useful for developing SDMs while controlling for variation in detection rates, as it uses simple data that can be readily collected by field ecologists.
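The key likelihood ingredient of the interval-censored extension is the probability mass of an exponential first-detection time over a recorded interval; a sketch, with `lam` a hypothetical detection rate:

```python
import math

def interval_loglik(lam, a, b):
    """Log-likelihood contribution of a first detection known only to
    fall in the interval (a, b], under an exponential
    time-to-detection model with rate lam."""
    return math.log(math.exp(-lam * a) - math.exp(-lam * b))

def nondetection_loglik(lam, total_time):
    """Contribution of a survey with no detection by total_time,
    conditional on the species being present."""
    return -lam * total_time

# e.g. a fish first detected between minutes 0 and 2 of electrofishing
ll = interval_loglik(0.5, 0.0, 2.0)
```

With exact times, the interval collapses to a density term; interval-censoring simply replaces that density with the difference of survival functions at the interval endpoints.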

  11. Modelling detectability of kiore (Rattus exulans) on Aguiguan, Mariana Islands, to inform possible eradication and monitoring efforts

    USGS Publications Warehouse

    Adams, A.A.Y.; Stanford, J.W.; Wiewel, A.S.; Rodda, G.H.

    2011-01-01

    Estimating the detection probability of introduced organisms during the pre-monitoring phase of an eradication effort can be extremely helpful in informing eradication and post-eradication monitoring efforts, but this step is rarely taken. We used data collected during 11 nights of mark-recapture sampling on Aguiguan, Mariana Islands, to estimate introduced kiore (Rattus exulans Peale) density and detection probability, and evaluated factors affecting detectability to help inform possible eradication efforts. Modelling of 62 captures of 48 individuals resulted in a model-averaged density estimate of 55 kiore/ha. Kiore detection probability was best explained by a model allowing neophobia to diminish linearly (i.e. capture probability increased linearly) until occasion 7, with additive effects of sex and cumulative rainfall over the prior 48 hours. Detection probability increased with increasing rainfall and females were up to three times more likely than males to be trapped. In this paper, we illustrate the type of information that can be obtained by modelling mark-recapture data collected during pre-eradication monitoring and discuss the potential of using these data to inform eradication and post-eradication monitoring efforts. © New Zealand Ecological Society.

  12. Multi-scale occupancy estimation and modelling using multiple detection methods

    USGS Publications Warehouse

    Nichols, James D.; Bailey, Larissa L.; O'Connell, Allan F.; Talancy, Neil W.; Grant, Evan H. Campbell; Gilbert, Andrew T.; Annand, Elizabeth M.; Husband, Thomas P.; Hines, James E.

    2008-01-01

    Occupancy estimation and modelling based on detection–nondetection data provide an effective way of exploring change in a species’ distribution across time and space in cases where the species is not always detected with certainty. Today, many monitoring programmes target multiple species, or life stages within a species, requiring the use of multiple detection methods. When multiple methods or devices are used at the same sample sites, animals can be detected by more than one method. We develop occupancy models for multiple detection methods that permit simultaneous use of data from all methods for inference about method-specific detection probabilities. Moreover, the approach permits estimation of occupancy at two spatial scales: the larger scale corresponds to species’ use of a sample unit, whereas the smaller scale corresponds to presence of the species at the local sample station or site. We apply the models to data collected on two different vertebrate species: striped skunks Mephitis mephitis and red salamanders Pseudotriton ruber. For striped skunks, large-scale occupancy estimates were consistent between two sampling seasons. Small-scale occupancy probabilities were slightly lower in the late winter/spring when skunks tend to conserve energy, and movements are limited to males in search of females for breeding. There was strong evidence of method-specific detection probabilities for skunks. As anticipated, large- and small-scale occupancy areas completely overlapped for red salamanders. The analyses provided weak evidence of method-specific detection probabilities for this species. Synthesis and applications. Increasingly, many studies are utilizing multiple detection methods at sampling locations. The modelling approach presented here makes efficient use of detections from multiple methods to estimate occupancy probabilities at two spatial scales and to compare detection probabilities associated with different detection methods. The models can be viewed as another variation of Pollock's robust design and may be applicable to a wide variety of scenarios where species occur in an area but are not always near the sampled locations. The estimation approach is likely to be especially useful in multispecies conservation programmes by providing efficient estimates using multiple detection devices and by providing device-specific detection probability estimates for use in survey design.

  13. Search times and probability of detection in time-limited search

    NASA Astrophysics Data System (ADS)

    Wilson, David; Devitt, Nicole; Maurer, Tana

    2005-05-01

    When modeling the search and target acquisition process, probability of detection as a function of time is important to war games and physical entity simulations. Recent US Army RDECOM CERDEC Night Vision and Electronic Sensors Directorate modeling of search and detection has focused on time-limited search. Developing the relationship between detection probability and time of search as a differential equation is explored. One of the parameters in the current formula for probability of detection in time-limited search corresponds to the mean time to detect in time-unlimited search. However, the mean time to detect in time-limited search is shorter than the mean time to detect in time-unlimited search; a simple mathematical relationship connects these two mean times, and it is derived here.
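As a worked example of such a relationship (under our own assumption of exponentially distributed detection times, not a formula quoted from the paper), the truncated-exponential mean gives the time-limited mean in terms of the time-unlimited mean tau and the search limit T:

```python
import math

def mean_time_limited(tau, T):
    """Mean time to detect, conditional on detection occurring within
    a search limited to duration T, assuming exponential detection
    times with time-unlimited mean tau."""
    q = math.exp(-T / tau)
    return tau - T * q / (1.0 - q)

short = mean_time_limited(10.0, 5.0)   # well below both tau and T
long_ = mean_time_limited(1.0, 100.0)  # limit irrelevant: approaches tau
```

Conditioning on detection within T discards the slow tail of the distribution, which is why the time-limited mean is always shorter than tau.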

  14. Modelling detection probabilities to evaluate management and control tools for an invasive species

    USGS Publications Warehouse

    Christy, M.T.; Yackel Adams, A.A.; Rodda, G.H.; Savidge, J.A.; Tyrrell, C.L.

    2010-01-01

    For most ecologists, detection probability (p) is a nuisance variable that must be modelled to estimate the state variable of interest (i.e. survival, abundance, or occupancy). However, in the realm of invasive species control, the rate of detection and removal is the rate-limiting step for management of this pervasive environmental problem. For strategic planning of an eradication (removal of every individual), one must identify the least likely individual to be removed, and determine the probability of removing it. To evaluate visual searching as a control tool for populations of the invasive brown treesnake Boiga irregularis, we designed a mark-recapture study to evaluate detection probability as a function of time, gender, size, body condition, recent detection history, residency status, searcher team and environmental covariates. We evaluated these factors using 654 captures resulting from visual detections of 117 snakes residing in a 5-ha semi-forested enclosure on Guam, fenced to prevent immigration and emigration of snakes but not their prey. Visual detection probability was low overall (= 0.07 per occasion) but reached 0.18 under optimal circumstances. Our results supported sex-specific differences in detectability that were a quadratic function of size, with both small and large females having lower detection probabilities than males of those sizes. There was strong evidence for individual periodic changes in detectability of a few days duration, roughly doubling detection probability (comparing peak to non-elevated detections). Snakes in poor body condition had estimated mean detection probabilities greater than snakes with high body condition. Search teams with high average detection rates exhibited detection probabilities about twice that of search teams with low average detection rates. Surveys conducted with bright moonlight and strong wind gusts exhibited moderately decreased probabilities of detecting snakes. Synthesis and applications. By emphasizing and modelling detection probabilities, we now know: (i) that eradication of this species by searching is possible, (ii) how much searching effort would be required, (iii) under what environmental conditions searching would be most efficient, and (iv) several factors that are likely to modulate this quantification when searching is applied to new areas. The same approach can be used for evaluation of any control technology or population monitoring programme. © 2009 The Authors. Journal compilation © 2009 British Ecological Society.

  15. Red-shouldered hawk occupancy surveys in central Minnesota, USA

    USGS Publications Warehouse

    Henneman, C.; McLeod, M.A.; Andersen, D.E.

    2007-01-01

    Forest-dwelling raptors are often difficult to detect because many species occur at low density or are secretive. Broadcasting conspecific vocalizations can increase the probability of detecting forest-dwelling raptors and has been shown to be an effective method for locating raptors and assessing their relative abundance. Recent advances in statistical techniques based on presence-absence data use probabilistic arguments to derive probability of detection when it is <1 and to provide a model and likelihood-based method for estimating proportion of sites occupied. We used these maximum-likelihood models with data from red-shouldered hawk (Buteo lineatus) call-broadcast surveys conducted in central Minnesota, USA, in 1994-1995 and 2004-2005. Our objectives were to obtain estimates of occupancy and detection probability 1) over multiple sampling seasons (yr), 2) incorporating within-season time-specific detection probabilities, 3) with call type and breeding stage included as covariates in models of probability of detection, and 4) with different sampling strategies. We visited individual survey locations 2-9 times per year, and estimates of both probability of detection (range = 0.28-0.54) and site occupancy (range = 0.81-0.97) varied among years. Detection probability was affected by inclusion of a within-season time-specific covariate, call type, and breeding stage. In 2004 and 2005 we used survey results to assess the effect that number of sample locations, double sampling, and discontinued sampling had on parameter estimates. We found that estimates of probability of detection and proportion of sites occupied were similar across different sampling strategies, and we suggest ways to reduce sampling effort in a monitoring program.

  16. Landscape- and local-scale habitat influences on occupancy and detection probability of stream-dwelling crayfish: Implications for conservation

    USGS Publications Warehouse

    Magoulick, Daniel D.; DiStefano, Robert J.; Imhoff, Emily M.; Nolen, Matthew S.; Wagner, Brian K.

    2017-01-01

    Crayfish are ecologically important in freshwater systems worldwide and are imperiled in North America and globally. We sought to examine landscape- to local-scale environmental variables related to occupancy and detection probability of a suite of stream-dwelling crayfish species. We used a quantitative kick-seine method to sample crayfish presence at 102 perennial stream sites with eight surveys per site. We modeled occupancy (psi) and detection probability (P) in relation to local- and landscape-scale environmental covariates. We developed a set of a priori candidate models for each species and ranked models using (Q)AICc. Detection probabilities and occupancy estimates differed among crayfish species with Orconectes eupunctus, O. marchandi, and Cambarus hubbsi being relatively rare (psi < 0.20) with moderate (0.46–0.60) to high (0.81) detection probability and O. punctimanus and O. ozarkae being relatively common (psi > 0.60) with high detection probability (0.81). Detection probability was often related to local habitat variables current velocity, depth, or substrate size. Important environmental variables for crayfish occupancy were species dependent but were mainly landscape variables such as stream order, geology, slope, topography, and land use. Landscape variables strongly influenced crayfish occupancy and should be considered in future studies and conservation plans.

  17. Population size influences amphibian detection probability: implications for biodiversity monitoring programs.

    PubMed

    Tanadini, Lorenzo G; Schmidt, Benedikt R

    2011-01-01

    Monitoring is an integral part of species conservation. Monitoring programs must take imperfect detection of species into account in order to be reliable. Theory suggests that detection probability may be determined by population size, but this relationship has not yet been assessed empirically. Population size is particularly important because it may induce heterogeneity in detection probability and thereby cause bias in estimates of biodiversity. We used a site occupancy model to analyse data from a volunteer-based amphibian monitoring program to assess how well different variables explain variation in detection probability. An index of population size best explained detection probabilities for four out of six species (to avoid circular reasoning, we used the count of individuals at a previous site visit as an index of current population size). The relationship between the population index and detection probability was positive. Commonly used weather variables best explained detection probabilities for two out of six species. Estimates of site occupancy probabilities differed depending on whether the population index was or was not used to model detection probability. The relationship between the population index and detectability has implications for the design of monitoring and species conservation. Most importantly, monitoring programs should be designed so that small populations, which are the most likely to be missed, are not overlooked. The results also imply that methods cannot be standardized in such a way that detection probabilities are constant. As we have shown here, one can easily account for variation in population size in the analysis of data from long-term monitoring programs by using counts of individuals from surveys at the same site in previous years. Accounting for variation in population size is important because it can affect the results of long-term monitoring programs and ultimately the conservation of imperiled species.

  18. Review of Literature for Model Assisted Probability of Detection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meyer, Ryan M.; Crawford, Susan L.; Lareau, John P.

    This is a draft technical letter report for the NRC client documenting a literature review of model-assisted probability of detection (MAPOD) for potential application to nuclear power plant components, with the aim of improving field NDE performance estimates.

  19. A double-observer approach for estimating detection probability and abundance from point counts

    USGS Publications Warehouse

    Nichols, J.D.; Hines, J.E.; Sauer, J.R.; Fallon, F.W.; Fallon, J.E.; Heglund, P.J.

    2000-01-01

    Although point counts are frequently used in ornithological studies, basic assumptions about detection probabilities often are untested. We apply a double-observer approach developed to estimate detection probabilities for aerial surveys (Cook and Jacobson 1979) to avian point counts. At each point count, a designated 'primary' observer indicates to another ('secondary') observer all birds detected. The secondary observer records all detections of the primary observer as well as any birds not detected by the primary observer. Observers alternate primary and secondary roles during the course of the survey. The approach permits estimation of observer-specific detection probabilities and bird abundance. We developed a set of models that incorporate different assumptions about sources of variation (e.g. observer, bird species) in detection probability. Seventeen field trials were conducted, and models were fit to the resulting data using program SURVIV. Single-observer point counts generally miss varying proportions of the birds actually present, and observer and bird species were found to be relevant sources of variation in detection probabilities. Overall detection probabilities (probability of being detected by at least one of the two observers) estimated using the double-observer approach were very high (>0.95), yielding precise estimates of avian abundance. We consider problems with the approach and recommend possible solutions, including restriction of the approach to fixed-radius counts to reduce the effect of variation in the effective radius of detection among various observers and to provide a basis for using spatial sampling to estimate bird abundance on large areas of interest. We believe that most questions meriting the effort required to carry out point counts also merit serious attempts to estimate detection probabilities associated with the counts. The double-observer approach is a method that can be used for this purpose.
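    The observer-specific detection probabilities in this dependent double-observer design can be recovered in closed form by equating observed counts to their expectations. The sketch below rederives the moment estimators from that structure rather than quoting the paper's formulas, and the counts are hypothetical: a1 birds detected by observer 1 as primary, a2 added by observer 2 as secondary, and b2/b1 the corresponding counts with roles reversed.

```python
def double_observer_p(a1, a2, b1, b2):
    """Moment estimators for observer detection probabilities from
    dependent double-observer counts. Expected counts: a1 = p1*N_A,
    a2 = (1-p1)*p2*N_A, b2 = p2*N_B, b1 = (1-p2)*p1*N_B, which solve
    in closed form for p1 and p2."""
    r = a2 / a1          # = p2*(1-p1)/p1
    s = b1 / b2          # = p1*(1-p2)/p2
    p1 = (1 - r * s) / (1 + r)
    p2 = (1 - r * s) / (1 + s)
    return p1, p2

# Hypothetical counts generated with p1 = 0.8, p2 = 0.6 (N_A = N_B = 100):
p1, p2 = double_observer_p(a1=80, a2=12, b1=32, b2=60)
```

With exact expected counts the estimators recover the underlying probabilities; with field data they inherit sampling noise, which is why the paper fits models in program SURVIV rather than relying on raw moments.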

  20. Unmodeled observation error induces bias when inferring patterns and dynamics of species occurrence via aural detections

    USGS Publications Warehouse

    McClintock, Brett T.; Bailey, Larissa L.; Pollock, Kenneth H.; Simons, Theodore R.

    2010-01-01

    The recent surge in the development and application of species occurrence models has been associated with an acknowledgment among ecologists that species are detected imperfectly due to observation error. Standard models now allow unbiased estimation of occupancy probability when false negative detections occur, but this is conditional on no false positive detections and sufficient incorporation of explanatory variables for the false negative detection process. These assumptions are likely reasonable in many circumstances, but there is mounting evidence that false positive errors and detection probability heterogeneity may be much more prevalent in studies relying on auditory cues for species detection (e.g., songbird or calling amphibian surveys). We used field survey data from a simulated calling anuran system of known occupancy state to investigate the biases induced by these errors in dynamic models of species occurrence. Despite the participation of expert observers in simplified field conditions, both false positive errors and site detection probability heterogeneity were extensive for most species in the survey. We found that even low levels of false positive errors, constituting as little as 1% of all detections, can cause severe overestimation of site occupancy, colonization, and local extinction probabilities. Further, unmodeled detection probability heterogeneity induced substantial underestimation of occupancy and overestimation of colonization and local extinction probabilities. Completely spurious relationships between species occurrence and explanatory variables were also found. Such misleading inferences would likely have deleterious implications for conservation and management programs. 
We contend that all forms of observation error, including false positive errors and heterogeneous detection probabilities, must be incorporated into the estimation framework to facilitate reliable inferences about occupancy and its associated vital rate parameters.

  1. Individual heterogeneity and identifiability in capture-recapture models

    USGS Publications Warehouse

    Link, W.A.

    2004-01-01

    Individual heterogeneity in detection probabilities is a far more serious problem for capture-recapture modeling than has previously been recognized. In this note, I illustrate that population size is not an identifiable parameter under the general closed population mark-recapture model Mh. The problem of identifiability is obvious if the population includes individuals with pi = 0, but persists even when it is assumed that individual detection probabilities are bounded away from zero. Identifiability may be attained within parametric families of distributions for pi, but not among parametric families of distributions. Consequently, in the presence of individual heterogeneity in detection probability, capture-recapture analysis is strongly model dependent.

  2. Influence of gravel mining and other factors on detection probabilities of Coastal Plain fishes in the Mobile River Basin, Alabama

    USGS Publications Warehouse

    Hayer, C.-A.; Irwin, E.R.

    2008-01-01

    We used an information-theoretic approach to examine the variation in detection probabilities for 87 Piedmont and Coastal Plain fishes in relation to instream gravel mining in four Alabama streams of the Mobile River drainage. Biotic and abiotic variables were also included in candidate models. Detection probabilities were heterogeneous across species and varied with habitat type, stream, season, and water quality. Instream gravel mining influenced the variation in detection probabilities for 38% of the species collected, probably because it led to habitat loss and increased sedimentation. Higher detection probabilities were apparent at unmined sites than at mined sites for 78% of the species for which gravel mining was shown to influence detection probabilities, indicating potential negative impacts to these species. Physical and chemical attributes also explained the variation in detection probabilities for many species. These results indicate that anthropogenic impacts can affect detection probabilities for fishes, and such variation should be considered when developing monitoring programs or routine sampling protocols. © Copyright by the American Fisheries Society 2008.

  3. Modeling co-occurrence of northern spotted and barred owls: accounting for detection probability differences

    USGS Publications Warehouse

    Bailey, Larissa L.; Reid, Janice A.; Forsman, Eric D.; Nichols, James D.

    2009-01-01

    Barred owls (Strix varia) have recently expanded their range and now encompass the entire range of the northern spotted owl (Strix occidentalis caurina). This expansion has led to two important issues of concern for management of northern spotted owls: (1) possible competitive interactions between the two species that could contribute to population declines of northern spotted owls, and (2) possible changes in vocalization behavior and detection probabilities of northern spotted owls induced by presence of barred owls. We used a two-species occupancy model to investigate whether there was evidence of competitive exclusion between the two species at study locations in Oregon, USA. We simultaneously estimated detection probabilities for both species and determined if the presence of one species influenced the detection of the other species. Model selection results and associated parameter estimates provided no evidence that barred owls excluded spotted owls from territories. We found strong evidence that detection probabilities differed for the two species, with higher probabilities for northern spotted owls that are the object of current surveys. Non-detection of barred owls is very common in surveys for northern spotted owls, and detection of both owl species was negatively influenced by the presence of the congeneric species. Our results suggest that analyses directed at hypotheses of barred owl effects on demographic or occupancy vital rates of northern spotted owls need to deal adequately with imperfect and variable detection probabilities for both species.

  4. Assessing Performance Tradeoffs in Undersea Distributed Sensor Networks

    DTIC Science & Technology

    2006-09-01

    time. We refer to this process as track-before-detect (see [5] for a description), since the final determination of a target presence is not made until...expressions for probability of successful search and probability of false search for modeling the track-before-detect process. We then describe a numerical...random manner (randomly sampled from a uniform distribution). II. SENSOR NETWORK PERFORMANCE MODELS We model the process of track-before-detect by

  5. Method- and species-specific detection probabilities of fish occupancy in Arctic lakes: Implications for design and management

    USGS Publications Warehouse

    Haynes, Trevor B.; Rosenberger, Amanda E.; Lindberg, Mark S.; Whitman, Matthew; Schmutz, Joel A.

    2013-01-01

    Studies examining species occurrence often fail to account for false absences in field sampling. We investigate detection probabilities of five gear types for six fish species in a sample of lakes on the North Slope, Alaska. We used an occupancy modeling approach to provide estimates of detection probabilities for each method. Variation in gear- and species-specific detection probability was considerable. For example, detection probabilities for the fyke net ranged from 0.82 (SE = 0.05) for least cisco (Coregonus sardinella) to 0.04 (SE = 0.01) for slimy sculpin (Cottus cognatus). Detection probabilities were also affected by site-specific variables such as depth of the lake, year, day of sampling, and lake connection to a stream. With the exception of the dip net and shore minnow traps, each gear type provided the highest detection probability of at least one species. Results suggest that a multimethod approach may be most effective when attempting to sample the entire fish community of Arctic lakes. Detection probability estimates will be useful for designing optimal fish sampling and monitoring protocols in Arctic lakes.

  6. Influences of Availability on Parameter Estimates from Site Occupancy Models with Application to Submersed Aquatic Vegetation

    USGS Publications Warehouse

    Gray, Brian R.; Holland, Mark D.; Yi, Feng; Starcevich, Leigh Ann Harrod

    2013-01-01

    Site occupancy models are commonly used by ecologists to estimate the probabilities of species site occupancy and of species detection. This study addresses the influence on site occupancy and detection estimates of variation in species availability among surveys within sites. Such variation in availability may result from temporary emigration, nonavailability of the species for detection, and sampling sites spatially when species presence is not uniform within sites. We demonstrate, using Monte Carlo simulations and aquatic vegetation data, that variation in availability and heterogeneity in the probability of availability may yield biases in the expected values of the site occupancy and detection estimates that have traditionally been associated with low-detection probabilities and heterogeneity in those probabilities. These findings confirm that the effects of availability may be important for ecologists and managers, and that where such effects are expected, modification of sampling designs and/or analytical methods should be considered. Failure to limit the effects of availability may preclude reliable estimation of the probability of site occupancy.
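    The availability effect described above is easy to reproduce by simulation. The sketch below is a minimal stand-in for the paper's Monte Carlo study, with made-up parameter values: per-visit availability varies among sites, and a naive occupancy model that assumes a constant detection probability is fit by grid search. The fitted occupancy falls well below the true value.

```python
import numpy as np

rng = np.random.default_rng(7)
n_sites, K = 2000, 4
psi_true, p_true = 0.7, 0.9

z = rng.random(n_sites) < psi_true        # true occupancy state
theta = rng.beta(0.5, 0.5, n_sites)       # heterogeneous per-visit availability
p_eff = p_true * theta                    # effective detection, given occupancy
y = rng.binomial(K, p_eff * z)            # detections per site over K visits

def nll(psi, p):
    """Naive constant-p occupancy likelihood, ignoring availability."""
    lik = psi * p**y * (1 - p)**(K - y) + (1 - psi) * (y == 0)
    return -np.sum(np.log(lik))

grid = np.linspace(0.01, 0.99, 99)
nlls = np.array([[nll(a, b) for b in grid] for a in grid])
i, j = np.unravel_index(nlls.argmin(), nlls.shape)
psi_hat = grid[i]   # substantially below psi_true = 0.7
```

Sites with chronically low availability accumulate all-zero histories that the constant-p model reads as absences, biasing occupancy downward exactly as the paper warns.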

  7. Assessment of imperfect detection of blister rust in whitebark pine within the Greater Yellowstone Ecosystem

    USGS Publications Warehouse

    Wright, Wilson J.; Irvine, Kathryn M.

    2017-01-01

    We examined data on white pine blister rust (blister rust) collected during the monitoring of whitebark pine trees in the Greater Yellowstone Ecosystem (from 2004-2015). Summaries of repeat observations performed by multiple independent observers are reviewed and discussed. These summaries show variability among observers and the potential for errors being made in blister rust status. Based on this assessment, we utilized occupancy models to analyze blister rust prevalence while explicitly accounting for imperfect detection. Available covariates were used to model both the probability of a tree being infected with blister rust and the probability of an observer detecting the infection. The fitted model provided strong evidence that the probability of blister rust infection increases as tree diameter increases and decreases as site elevation increases. Most importantly, we found evidence of heterogeneity in detection probabilities related to tree size and average slope of a transect. These results suggested that detecting the presence of blister rust was more difficult in larger trees. Also, there was evidence that blister rust was easier to detect on transects located on steeper slopes. Our model accounted for potential impacts of observer experience on blister rust detection probabilities and also showed moderate variability among the different observers in their ability to detect blister rust. Based on these model results, we suggest that multiple observer sampling continue in future field seasons in order to allow blister rust prevalence estimates to be corrected for imperfect detection. We suggest that the multiple observer effort be spread out across many transects (instead of concentrated at a few each field season) while retaining the overall proportion of trees with multiple observers around 5-20%. 
Estimates of prevalence are confounded with detection unless detection is explicitly accounted for in an analysis, and we demonstrate how an occupancy model can be used to account for this source of observation error.

  8. Maximum likelihood estimation for the double-count method with independent observers

    USGS Publications Warehouse

    Manly, Bryan F.J.; McDonald, Lyman L.; Garner, Gerald W.

    1996-01-01

    Data collected under a double-count protocol during line transect surveys were analyzed using new maximum likelihood methods combined with Akaike's information criterion to provide estimates of the abundance of polar bear (Ursus maritimus Phipps) in a pilot study off the coast of Alaska. Visibility biases were corrected by modeling the detection probabilities using logistic regression functions. Independent variables that influenced the detection probabilities included perpendicular distance of bear groups from the flight line and the number of individuals in the groups. A series of models was considered, ranging from (1) the simplest, in which the probability of detection was the same for both observers and was not affected by either distance from the flight line or group size, to (2) models in which the probability of detection differed between the two observers and depended on both distance from the transect and group size. Estimation procedures are developed for the case when additional variables may affect detection probabilities. The methods are illustrated using data from the pilot polar bear survey and some recommendations are given for design of a survey over the larger Chukchi Sea between Russia and the United States.
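    The logistic detection functions described above have the following generic form; the coefficients here are illustrative, not estimates from the paper's polar bear data.

```python
from math import exp

def detection_prob(distance, group_size, b0=1.5, b1=-0.004, b2=0.3):
    """Logistic detection function: probability a bear group is seen,
    as a function of perpendicular distance (m) from the flight line
    and group size. Coefficients are hypothetical."""
    eta = b0 + b1 * distance + b2 * group_size
    return 1 / (1 + exp(-eta))

p_near = detection_prob(distance=50, group_size=2)   # close to the line
p_far = detection_prob(distance=800, group_size=2)   # far from the line
```

With a negative distance coefficient and a positive group-size coefficient, detection declines with distance and rises with group size, which is the qualitative pattern the paper reports.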

  9. Probability of detecting atrazine/desethyl-atrazine and elevated concentrations of nitrite plus nitrate as nitrogen in ground water in the Idaho part of the western Snake River Plain

    USGS Publications Warehouse

    Donato, Mary M.

    2000-01-01

    As ground water continues to provide an ever-growing proportion of Idaho's drinking water, concerns about the quality of that resource are increasing. Pesticides (most commonly, atrazine/desethyl-atrazine, hereafter referred to as atrazine) and nitrite plus nitrate as nitrogen (hereafter referred to as nitrate) have been detected in many aquifers in the State. To provide a sound hydrogeologic basis for atrazine and nitrate management in southern Idaho—the largest region of land and water use in the State—the U.S. Geological Survey produced maps showing the probability of detecting these contaminants in ground water in the upper Snake River Basin (published in a 1998 report) and the western Snake River Plain (published in this report). The atrazine probability map for the western Snake River Plain was constructed by overlaying ground-water quality data with hydrogeologic and anthropogenic data in a geographic information system (GIS). A data set was produced in which each well had corresponding information on land use, geology, precipitation, soil characteristics, regional depth to ground water, well depth, water level, and atrazine use. These data were analyzed by logistic regression using a statistical software package. Several preliminary multivariate models were developed and those that best predicted the detection of atrazine were selected. The multivariate models then were entered into a GIS and the probability maps were produced. Land use, precipitation, soil hydrologic group, and well depth were significantly correlated with atrazine detections in the western Snake River Plain. These variables also were important in the 1998 probability study of the upper Snake River Basin. The effectiveness of the probability models for atrazine might be improved if more detailed data were available for atrazine application. A preliminary atrazine probability map for the entire Snake River Plain in Idaho, based on a data set representing that region, also was produced. 
In areas where this map overlaps the 1998 map of the upper Snake River Basin, the two maps show broadly similar probabilities of detecting atrazine. Logistic regression also was used to develop a preliminary statistical model that predicts the probability of detecting elevated nitrate in the western Snake River Plain. A nitrate probability map was produced from this model. Results showed that elevated nitrate concentrations were correlated with land use, soil organic content, well depth, and water level. Detailed information on nitrate input, specifically fertilizer application, might have improved the effectiveness of this model.

  10. Site specific probability of passive acoustic detection of humpback whale calls from single fixed hydrophones.

    PubMed

    Helble, Tyler A; D'Spain, Gerald L; Hildebrand, John A; Campbell, Gregory S; Campbell, Richard L; Heaney, Kevin D

    2013-09-01

    Passive acoustic monitoring of marine mammal calls is an increasingly important method for assessing population numbers, distribution, and behavior. A common mistake in the analysis of marine mammal acoustic data is formulating conclusions about these animals without first understanding how environmental properties such as bathymetry, sediment properties, water column sound speed, and ocean acoustic noise influence the detection and character of vocalizations in the acoustic data. The approach in this paper is to use Monte Carlo simulations with a full wave field acoustic propagation model to characterize the site specific probability of detection of six types of humpback whale calls at three passive acoustic monitoring locations off the California coast. Results show that the probability of detection can vary by factors greater than ten when comparing detections across locations, or comparing detections at the same location over time, due to environmental effects. Effects of uncertainties in the inputs to the propagation model are also quantified, and the model accuracy is assessed by comparing calling statistics amassed from 24,690 humpback units recorded in the month of October 2008. Under certain conditions, the probability of detection can be estimated with uncertainties sufficiently small to allow for accurate density estimates.

  11. Persistence rates and detection probabilities of oiled king eider carcasses on St Paul Island, Alaska

    USGS Publications Warehouse

    Fowler, A.C.; Flint, Paul L.

    1997-01-01

    Following an oil spill off St Paul Island, Alaska in February 1996, persistence rates and detection probabilities of oiled king eider (Somateria spectabilis) carcasses were estimated using the Cormack-Jolly-Seber model. Carcass persistence rates varied by day, beach type and sex, while detection probabilities varied by day and beach type. Scavenging, wave action and weather influenced carcass persistence. The patterns of persistence differed on rock and sand beaches and female carcasses had a different persistence function than males. Weather, primarily snow storms, and degree of carcass scavenging, diminished carcass detectability. Detection probabilities on rock beaches were lower and more variable than on sand beaches. The combination of persistence rates and detection probabilities can be used to improve techniques of estimating total mortality.

  12. Detection of cat-eye effect echo based on unit APD

    NASA Astrophysics Data System (ADS)

    Wu, Dong-Sheng; Zhang, Peng; Hu, Wen-Gang; Ying, Jia-Ju; Liu, Jie

    2016-10-01

    The cat-eye effect echo of an optical system can be detected with a CCD, but the detection range is limited to several kilometers. To achieve long-range or even ultra-long-range detection, an APD should be selected as the detector because of its high sensitivity. A detection system for the cat-eye effect echo based on a unit APD is designed in this paper, and the implementation scheme and key technologies of the system are presented. The detection performances of the system, including detection range, detection probability, and false alarm probability, are modeled. Based on the model, the performance of the detection system is analyzed using typical parameters. The results of numerical calculation show that, within a 20 km detection range, the echo signal-to-noise ratio is greater than six, the detection probability is greater than 99.9%, and the false alarm probability is less than 0.1%. To verify the detection effect, we built an experimental platform according to the design scheme and carried out field experiments. The experimental results agree well with the numerical calculations, demonstrating that a detection system based on a unit APD is feasible for remote detection of the cat-eye effect echo.
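    The reported detection and false alarm figures can be sanity-checked with the textbook relation for a known signal in additive Gaussian noise (an assumption on our part; the paper's detection model may differ): choose the threshold from the desired false alarm probability, then the detection probability is the Gaussian tail beyond threshold minus SNR (in amplitude/sigma units).

```python
from math import erfc, sqrt

def Q(x):
    """Gaussian tail probability P(X > x) for X ~ N(0, 1)."""
    return 0.5 * erfc(x / sqrt(2))

def detection_probability(snr, pfa):
    """Pd for a known signal in Gaussian noise: find threshold T with
    Q(T) = pfa by bisection, then Pd = Q(T - snr)."""
    lo, hi = -10.0, 10.0
    for _ in range(200):
        mid = (lo + hi) / 2
        if Q(mid) > pfa:     # Q is decreasing: raise the threshold
            lo = mid
        else:
            hi = mid
    return Q(lo - snr)

pd = detection_probability(snr=6.0, pfa=1e-3)
```

For SNR = 6 and a 0.1% false alarm rate this gives a detection probability of roughly 0.998, the same order as the paper's figures.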

  13. Computer-aided mathematical analysis of probability of intercept for ground-based communication intercept system

    NASA Astrophysics Data System (ADS)

    Park, Sang Chul

    1989-09-01

    We develop a mathematical analysis model to calculate the probability of intercept (POI) for the ground-based communication intercept (COMINT) system. The POI is a measure of the effectiveness of the intercept system. We define the POI as the product of the probability of detection and the probability of coincidence. The probability of detection is a measure of the receiver's capability to detect a signal in the presence of noise. The probability of coincidence is the probability that an intercept system is available, actively listening in the proper frequency band, in the right direction and at the same time that the signal is received. We investigate the behavior of the POI with respect to the observation time, the separation distance, antenna elevations, the frequency of the signal, and the receiver bandwidths. We observe that the coincidence characteristic between the receiver scanning parameters and the signal parameters is the key factor determining the time needed to obtain a given POI. This model can be used to find the optimal parameter combination to maximize the POI in a given scenario. We expand this model to multiple systems. This analysis is conducted on a personal computer to provide portability. The model is also flexible and can be easily implemented under different situations.
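    The core definition above, POI as the product of detection and coincidence probabilities, can be sketched directly; the accumulation over repeated independent looks is an assumption of this sketch, and the numbers are hypothetical.

```python
def poi(p_detection, p_coincidence, n_looks=1):
    """Probability of intercept: product of the probability of detection
    and the probability of coincidence for a single look, accumulated
    over n_looks independent looks."""
    single = p_detection * p_coincidence
    return 1 - (1 - single) ** n_looks

# A receiver with 0.9 detection probability but only a 5% chance of
# scanning the right band/direction at the right time still approaches
# interception after enough looks:
p = poi(0.9, 0.05, n_looks=60)
```

This reflects the paper's observation that the coincidence between scanning parameters and signal parameters, not raw receiver sensitivity, dominates the time needed to reach a given POI.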

  14. Accounting for Incomplete Species Detection in Fish Community Monitoring

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McManamay, Ryan A; Orth, Dr. Donald J; Jager, Yetta

    2013-01-01

    Riverine fish assemblages are heterogeneous and very difficult to characterize with a one-size-fits-all approach to sampling. Furthermore, detecting changes in fish assemblages over time requires accounting for variation in sampling designs. We present a modeling approach that permits heterogeneous sampling by accounting for site and sampling covariates (including method) in a model-based framework for estimation (versus a sampling-based framework). We snorkeled during three surveys and electrofished during a single survey in a suite of delineated habitats stratified by reach types. We developed single-species occupancy models to determine covariates influencing patch occupancy and species detection probabilities, whereas community occupancy models estimated species richness in light of incomplete detections. For most species, information-theoretic criteria showed higher support for models that included patch size and reach as covariates of occupancy. In addition, models including patch size and sampling method as covariates of detection probabilities also had higher support. Detection probability estimates for snorkeling surveys were higher for larger non-benthic species, whereas electrofishing was more effective at detecting smaller benthic species. The number of sites and sampling occasions required to accurately estimate occupancy varied among fish species. For rare benthic species, our results suggested that a higher number of occasions, and especially the addition of electrofishing, may be required to improve detection probabilities and obtain accurate occupancy estimates. Community models suggested that richness was 41% higher than the number of species actually observed and that the addition of an electrofishing survey increased estimated richness by 13%. These results can be useful to future fish assemblage monitoring efforts by informing sampling designs, such as site selection (e.g. stratifying based on patch size) and determining effort required (e.g. number of sites versus occasions).

  15. An evaluation of the Bayesian approach to fitting the N-mixture model for use with pseudo-replicated count data

    USGS Publications Warehouse

    Toribo, S.G.; Gray, B.R.; Liang, S.

    2011-01-01

    The N-mixture model proposed by Royle in 2004 may be used to approximate the abundance and detection probability of animal species in a given region. In 2006, Royle and Dorazio discussed the advantages of using a Bayesian approach in modelling animal abundance and occurrence using a hierarchical N-mixture model. N-mixture models assume replication on sampling sites, an assumption that may be violated when the site is not closed to changes in abundance during the survey period or when nominal replicates are defined spatially. In this paper, we studied the robustness of a Bayesian approach to fitting the N-mixture model for pseudo-replicated count data. Our simulation results showed that the Bayesian estimates for abundance and detection probability are slightly biased when the actual detection probability is small and are sensitive to the presence of extra variability within local sites.
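    The N-mixture marginal likelihood evaluated in such analyses can be sketched directly: site abundance is Poisson, repeated counts are binomial given abundance, and the unknown abundance is summed out up to a truncation point. The counts and parameter values below are hypothetical, and the truncation level n_max is a practical approximation.

```python
from math import lgamma, log, exp

def log_binom_pmf(y, n, p):
    """log P(Y = y) for Y ~ Binomial(n, p)."""
    if y > n:
        return float("-inf")
    return (lgamma(n + 1) - lgamma(y + 1) - lgamma(n - y + 1)
            + y * log(p) + (n - y) * log(1 - p))

def nmixture_nll(lam, p, counts, n_max=60):
    """Negative log-likelihood of the N-mixture model: counts is a list
    of per-site visit-count lists; N_i ~ Poisson(lam), y_ik | N_i ~
    Binomial(N_i, p), with the latent N_i summed out (truncated at n_max)."""
    total = 0.0
    for site in counts:
        site_lik = 0.0
        for n in range(max(site), n_max + 1):   # N must be >= max count
            log_pois = n * log(lam) - lam - lgamma(n + 1)
            log_obs = sum(log_binom_pmf(y, n, p) for y in site)
            site_lik += exp(log_pois + log_obs)
        total -= log(site_lik)
    return total

# Hypothetical repeated counts at 3 sites, 2 visits each
counts = [[2, 3], [0, 1], [5, 4]]
nll = nmixture_nll(lam=4.0, p=0.6, counts=counts)
```

Minimizing this function over (lam, p), or placing priors on them as in the Bayesian hierarchical version the paper studies, yields the abundance and detection estimates.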

  16. Using open robust design models to estimate temporary emigration from capture-recapture data.

    PubMed

    Kendall, W L; Bjorkland, R

    2001-12-01

    Capture-recapture studies are crucial in many circumstances for estimating demographic parameters for wildlife and fish populations. Pollock's robust design, involving multiple sampling occasions per period of interest, provides several advantages over classical approaches. This includes the ability to estimate the probability of being present and available for detection, which in some situations is equivalent to breeding probability. We present a model for estimating availability for detection that relaxes two assumptions required in previous approaches. The first is that the sampled population is closed to additions and deletions across samples within a period of interest. The second is that each member of the population has the same probability of being available for detection in a given period. We apply our model to estimate survival and breeding probability in a study of hawksbill sea turtles (Eretmochelys imbricata), where previous approaches are not appropriate.

  18. Modeling anuran detection and site occupancy on North American Amphibian Monitoring Program (NAAMP) routes in Maryland

    USGS Publications Warehouse

    Weir, L.A.; Royle, J. Andrew; Nanjappa, P.; Jung, R.E.

    2005-01-01

    One of the most fundamental problems in monitoring animal populations is that of imperfect detection. Although imperfect detection can be modeled, studies examining patterns in occurrence often ignore detection and thus fail to properly partition variation in detection from that of occurrence. In this study, we used anuran calling survey data collected on North American Amphibian Monitoring Program routes in eastern Maryland to investigate factors that influence detection probability and site occupancy for 10 anuran species. In 2002, 17 calling survey routes in eastern Maryland were surveyed to collect environmental and species data nine or more times. To analyze these data, we developed models incorporating detection probability and site occupancy. The results suggest that, for more than half of the 10 species, detection probabilities vary most with season (i.e., day-of-year), air temperature, time, and moon illumination, whereas site occupancy may vary by the amount of palustrine forested wetland habitat. Our results suggest anuran calling surveys should document air temperature, time of night, moon illumination, observer skill, and habitat change over time, as these factors can be important to model-adjusted estimates of site occupancy. Our study represents the first formal modeling effort aimed at developing an analytic assessment framework for NAAMP calling survey data.
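    The single-season occupancy model referenced in this abstract can be sketched as a zero-inflated binomial likelihood. This is a minimal illustration of the general approach (not the authors' code or their covariate structure), with simulated data and assumed parameter values:

    ```python
    # Minimal sketch of a single-season occupancy likelihood: each site's
    # detection count y out of K visits contributes
    #   psi * Bin(y | K, p) + (1 - psi) * 1[y == 0],
    # i.e., never-detected sites may be unoccupied or occupied-but-missed.
    import numpy as np
    from math import comb
    from scipy.optimize import minimize

    def neg_log_lik(theta, y, K):
        """theta = (logit psi, logit p); y = detections per site out of K visits."""
        psi = 1 / (1 + np.exp(-theta[0]))
        p = 1 / (1 + np.exp(-theta[1]))
        ll = 0.0
        for yi in y:
            binom = comb(K, int(yi)) * p**yi * (1 - p)**(K - yi)
            ll += np.log(psi * binom + (1 - psi) * (yi == 0))
        return -ll

    # Simulated example: 100 sites, 9 visits, true psi = 0.6, p = 0.3.
    rng = np.random.default_rng(1)
    z = rng.random(100) < 0.6                  # latent occupancy states
    y = rng.binomial(9, 0.3 * z)               # detections (0 if unoccupied)
    fit = minimize(neg_log_lik, x0=np.zeros(2), args=(y, 9))
    psi_hat = 1 / (1 + np.exp(-fit.x[0]))
    p_hat = 1 / (1 + np.exp(-fit.x[1]))
    ```

    The same zero-inflated form underlies the NAAMP analysis, with psi and p further modeled on covariates such as temperature and day-of-year.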

  19. Presence-nonpresence surveys of golden-cheeked warblers: detection, occupancy and survey effort

    USGS Publications Warehouse

    Watson, C.A.; Weckerly, F.W.; Hatfield, J.S.; Farquhar, C.C.; Williamson, P.S.

    2008-01-01

    Surveys to detect the presence or absence of endangered species may not consistently cover an area, account for imperfect detection, or consider that detection and species presence at sample units may change within a survey season. We evaluated a detection-nondetection survey method for the federally endangered golden-cheeked warbler (GCWA) Dendroica chrysoparia. Three study areas were selected across the breeding range of the GCWA in central Texas. Within each area, 28-36 detection stations were placed 200 m apart. Each detection station was surveyed nine times during the breeding season in two consecutive years. Surveyors remained up to 8 min at each detection station, recording GCWA detected by sight or sound. To assess the potential influence of environmental covariates (e.g., slope, aspect, canopy cover, study area) on detection and occupancy, and possible changes in occupancy and detection probabilities within breeding seasons, 30 models were analyzed. Using information-theoretic model selection procedures, we found that detection probabilities and occupancy varied among study areas and within breeding seasons. Detection probabilities ranged from 0.20 to 0.80, and occupancy ranged from 0.56 to 0.95. Because study areas with high detection probabilities also had high occupancy, a conservative survey effort (erring toward too much surveying) was estimated using the lowest detection probability. We determined that nine surveys of 35 stations were needed to obtain occupancy estimates with coefficients of variation <20%. Our survey evaluation evidently captured the key environmental variable influencing bird detection (GCWA density) and accommodated changes in GCWA distribution throughout the breeding season.
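    A standard back-of-envelope calculation related to this design problem (not the authors' CV-based criterion) asks how many visits n are needed so that an occupied site is detected at least once with probability P*: since the miss probability per visit is 1 - p, we need 1 - (1 - p)^n >= P*, i.e. n >= log(1 - P*) / log(1 - p).

    ```python
    # Cumulative-detection rule of thumb for sizing occupancy survey effort.
    import math

    def visits_needed(p, p_star=0.95):
        """Minimum visits so an occupied site is detected at least once
        with probability >= p_star, given per-visit detection p."""
        return math.ceil(math.log(1 - p_star) / math.log(1 - p))

    # Using the per-visit detection range reported in the abstract (0.20-0.80):
    low = visits_needed(0.20)    # the worst case drives the design -> 14 visits
    high = visits_needed(0.80)   # easy case -> 2 visits
    ```

    Note the worst-case answer here (14) exceeds the paper's nine surveys because the authors targeted precision of the occupancy estimate rather than a 95% cumulative detection level.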

  20. An empirical probability model of detecting species at low densities.

    PubMed

    Delaney, David G; Leung, Brian

    2010-06-01

    False negatives, not detecting things that are actually present, are an important but understudied problem. False negatives are the result of our inability to perfectly detect species, especially those at low density such as endangered species or newly arriving introduced species. They reduce our ability to interpret presence-absence survey data and make sound management decisions (e.g., rapid response). To reduce the probability of false negatives, we need to compare the efficacy and sensitivity of different sampling approaches and quantify an unbiased estimate of the probability of detection. We conducted field experiments in the intertidal zone of New England and New York to test the sensitivity of two sampling approaches (quadrat vs. total area search, TAS), given different target characteristics (mobile vs. sessile). Using logistic regression we built detection curves for each sampling approach that related the sampling intensity and the density of targets to the probability of detection. The TAS approach reduced the probability of false negatives and detected targets faster than the quadrat approach. Mobility of targets increased the time to detection but did not affect detection success. Finally, we interpreted two years of presence-absence data on the distribution of the Asian shore crab (Hemigrapsus sanguineus) in New England and New York, using our probability model for false negatives. The type of experimental approach in this paper can help to reduce false negatives and increase our ability to detect species at low densities by refining sampling approaches, which can guide conservation strategies and management decisions in various areas of ecology such as conservation biology and invasion ecology.
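    The detection curves described above are ordinary logistic regressions of a binary detected/not-detected outcome on density and sampling effort. This is a generic sketch of that idea with simulated data and assumed coefficients, not the authors' fitted model:

    ```python
    # Fit logit(P(detect)) = b0 + b1*density + b2*effort by maximum likelihood.
    import numpy as np
    from scipy.optimize import minimize

    def fit_detection_curve(density, effort, detected):
        """Return (b0, b1, b2) for a Bernoulli logistic detection model."""
        X = np.column_stack([np.ones_like(density), density, effort])

        def nll(beta):
            eta = X @ beta
            # np.logaddexp(0, eta) = log(1 + exp(eta)), numerically stable
            return np.sum(np.logaddexp(0, eta) - detected * eta)

        return minimize(nll, x0=np.zeros(3)).x

    # Simulated surveys: detection improves with target density and effort.
    rng = np.random.default_rng(0)
    density = rng.uniform(0, 5, 300)
    effort = rng.uniform(0, 2, 300)
    p_true = 1 / (1 + np.exp(-(-2.0 + 0.8 * density + 1.0 * effort)))
    detected = rng.binomial(1, p_true)
    b0, b1, b2 = fit_detection_curve(density, effort, detected)
    ```

    Inverting the fitted curve gives the sampling intensity required to hold the false-negative rate below a target level at a given density, which is how such curves guide rapid-response surveillance.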

  1. Application of multivariate Gaussian detection theory to known non-Gaussian probability density functions

    NASA Astrophysics Data System (ADS)

    Schwartz, Craig R.; Thelen, Brian J.; Kenton, Arthur C.

    1995-06-01

    A statistical parametric multispectral sensor performance model was developed by ERIM to support mine field detection studies, multispectral sensor design/performance trade-off studies, and target detection algorithm development. The model assumes target detection algorithms and their performance models which are based on data assumed to obey multivariate Gaussian probability distribution functions (PDFs). The applicability of these algorithms and performance models can be generalized to data having non-Gaussian PDFs through the use of transforms which convert non-Gaussian data to Gaussian (or near-Gaussian) data. An example of one such transform is the Box-Cox power law transform. In practice, such a transform can be applied to non-Gaussian data prior to the introduction of a detection algorithm that is formally based on the assumption of multivariate Gaussian data. This paper presents an extension of these techniques to the case where the joint multivariate probability density function of the non-Gaussian input data is known, and where the joint estimate of the multivariate Gaussian statistics, under the Box-Cox transform, is desired. The jointly estimated multivariate Gaussian statistics can then be used to predict the performance of a target detection algorithm which has an associated Gaussian performance model.
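    The Box-Cox step described above can be illustrated with SciPy's implementation (this is a generic demonstration, not ERIM's model): a power-law transform with maximum-likelihood-chosen exponent maps skewed data toward normality before a Gaussian-based detector is applied.

    ```python
    # Box-Cox transform of right-skewed data toward (near-)Gaussianity.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(42)
    x = rng.lognormal(mean=0.0, sigma=0.8, size=2000)   # strongly right-skewed

    x_bc, lam = stats.boxcox(x)       # lam chosen by maximum likelihood
    skew_before = stats.skew(x)       # large positive skew
    skew_after = stats.skew(x_bc)     # near zero after the transform
    ```

    For lognormal data the estimated exponent lands near zero (the log transform), exactly the case where the transformed data obey a Gaussian PDF and Gaussian detection statistics apply directly.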

  2. The relationship study between image features and detection probability based on psychology experiments

    NASA Astrophysics Data System (ADS)

    Lin, Wei; Chen, Yu-hua; Wang, Ji-yuan; Gao, Hong-sheng; Wang, Ji-jun; Su, Rong-hua; Mao, Wei

    2011-04-01

    Detection probability is an important index for representing and estimating target visibility, providing a basis for target recognition and decision-making. Obtaining detection probability empirically, however, consumes considerable time and manpower, and because interpreters differ in practical knowledge and experience, the resulting data often vary widely. By studying the relationship between image features and perceived quantity through psychology experiments, we established a probability model as follows. First, four image features that directly affect detection were extracted and quantified, and four feature-similarity degrees between target and background were defined. Second, the relationship between each single feature-similarity degree and perceived quantity was established on psychological principles, and target-interpretation experiments were designed involving about five hundred interpreters and two hundred images. To reduce correlation among image features, many synthetic images were generated, each differing from its background in only a single feature: brightness, chromaticity, texture, or shape. By analyzing and fitting this large body of experimental data, the model parameters were determined. Finally, applying statistical decision theory to the experimental results, the relationship between perceived quantity and target detection probability was established. Verified against a large number of target interpretations in practice, the model yields target detection probability quickly and objectively.

  3. A multimodal detection model of dolphins to estimate abundance validated by field experiments.

    PubMed

    Akamatsu, Tomonari; Ura, Tamaki; Sugimatsu, Harumi; Bahl, Rajendar; Behera, Sandeep; Panda, Sudarsan; Khan, Muntaz; Kar, S K; Kar, C S; Kimura, Satoko; Sasaki-Yamamoto, Yukiko

    2013-09-01

    Abundance estimation of marine mammals requires matching detections of an animal, or a group of animals, by two independent means. We propose a multimodal detection model using visual and acoustic cues (surfacing and phonation) that enables abundance estimation of dolphins. The method does not require a specific time window for matching cues between the two means when applying the mark-recapture method. The proposed model was evaluated using data from field observations of Ganges River dolphins and Irrawaddy dolphins, as examples of dispersed and condensed distributions of animals, respectively. The acoustic detection probability was approximately 80%, 20% higher than that of visual detection for both species, regardless of the distribution of the animals at the present study sites. The abundance estimates for Ganges River dolphins and Irrawaddy dolphins agreed fairly well with the numbers reported in previous monitoring studies. The detection probability for single animals was smaller than that for larger cluster sizes, as predicted by the model and confirmed by the field data. However, dense groups of Irrawaddy dolphins showed differences between the cluster sizes observed by visual and acoustic methods. The lower detection probability of single clusters of this species appears to be caused by its clumped distribution.
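    The two-mode logic can be sketched in a simplified form (my construction, not the authors' model): with independent acoustic and visual detection probabilities, a group is missed only if both modes miss it, and treating the two modes as the two samples of a Lincoln-Petersen mark-recapture yields an abundance estimate. The counts below are hypothetical.

    ```python
    # Two independent detection modes, combined detection and abundance.

    def combined_detection(p_a, p_v):
        """P(detected by at least one of two independent modes)."""
        return 1 - (1 - p_a) * (1 - p_v)

    def chapman_estimate(n_acoustic, n_visual, n_both):
        """Chapman's bias-corrected Lincoln-Petersen abundance estimator."""
        return (n_acoustic + 1) * (n_visual + 1) / (n_both + 1) - 1

    # With the abstract's rough figures (acoustic ~0.8, visual ~0.6):
    p_joint = combined_detection(0.8, 0.6)      # 1 - 0.2*0.4 = 0.92
    # Hypothetical counts consistent with N = 100 animals:
    n_hat = chapman_estimate(80, 60, 48)        # close to 100
    ```

    The independence assumption is exactly what breaks down for clumped Irrawaddy dolphin groups, which is why the cluster-size mismatch between modes matters.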

  4. Lane detection based on color probability model and fuzzy clustering

    NASA Astrophysics Data System (ADS)

    Yu, Yang; Jo, Kang-Hyun

    2018-04-01

    In vehicle driver assistance systems, the accuracy and speed of lane line detection are paramount. This paper combines a color probability model with the Fuzzy Local Information C-Means (FLICM) clustering algorithm; the Hough transform and the structural constraints of the road are then used to detect lane lines accurately. A global map of the lane lines is drawn from the lane curve-fitting equation. Experimental results show that the algorithm has good robustness.

  5. Optimizing occupancy surveys by maximizing detection probability: application to amphibian monitoring in the Mediterranean region.

    PubMed

    Petitot, Maud; Manceau, Nicolas; Geniez, Philippe; Besnard, Aurélien

    2014-09-01

    Setting up effective conservation strategies requires precisely determining the targeted species' distribution area and, if possible, its local abundance. However, detection issues make these objectives complex for most vertebrates. The detection probability is usually <1 and is highly dependent on species phenology and other environmental variables. The aim of this study was to define an optimized survey protocol for the Mediterranean amphibian community, that is, to determine the most favorable periods and the most effective sampling techniques for detecting all species present at a site in a minimum number of field sessions and a minimum amount of prospecting effort. We visited 49 ponds located in the Languedoc region of southern France on four occasions between February and June 2011. Amphibians were detected using three methods: nighttime call counts, nighttime visual encounters, and daytime netting. The detection-nondetection data obtained were then modeled using site-occupancy models. The detection probability of amphibians differed sharply among species, survey methods, and survey dates, and these three covariates also interacted. Thus, a minimum of three visits spread over the breeding season, using a combination of all three survey methods, is needed to reach a 95% detection level for all species in the Mediterranean region. Synthesis and applications: detection-nondetection surveys combined with a site-occupancy modeling approach are a powerful method for estimating the detection probability and determining the prospecting effort necessary to assert that a species is absent from a site.

  6. Spatially explicit dynamic N-mixture models

    USGS Publications Warehouse

    Zhao, Qing; Royle, Andy; Boomer, G. Scott

    2017-01-01

    Knowledge of demographic parameters such as survival, reproduction, emigration, and immigration is essential to understand metapopulation dynamics. Traditionally the estimation of these demographic parameters requires intensive data from marked animals. The development of dynamic N-mixture models makes it possible to estimate demographic parameters from count data of unmarked animals, but the original dynamic N-mixture model does not distinguish emigration and immigration from survival and reproduction, limiting its ability to explain important metapopulation processes such as movement among local populations. In this study we developed a spatially explicit dynamic N-mixture model that estimates survival, reproduction, emigration, local population size, and detection probability from count data under the assumption that movement only occurs among adjacent habitat patches. Simulation studies showed that the inference of our model depends on detection probability, local population size, and the implementation of robust sampling design. Our model provides reliable estimates of survival, reproduction, and emigration when detection probability is high, regardless of local population size or the type of sampling design. When detection probability is low, however, our model only provides reliable estimates of survival, reproduction, and emigration when local population size is moderate to high and robust sampling design is used. A sensitivity analysis showed that our model is robust against the violation of the assumption that movement only occurs among adjacent habitat patches, suggesting wide applications of this model. Our model can be used to improve our understanding of metapopulation dynamics based on count data that are relatively easy to collect in many systems.
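    The data-generating process behind a dynamic N-mixture model can be sketched as follows (a stripped-down version without the spatial emigration component the authors add; all parameter values are assumed): latent local abundance evolves by binomial survival plus Poisson recruitment, and observed counts are binomial thins of the latent state.

    ```python
    # Simulate the latent dynamics and imperfect counts of a dynamic
    # N-mixture model (no movement between patches in this sketch).
    import numpy as np

    rng = np.random.default_rng(7)
    n_sites, n_years = 50, 10
    phi, gamma, p = 0.7, 0.4, 0.5   # survival, per-capita recruitment, detection

    N = np.zeros((n_sites, n_years), dtype=int)
    N[:, 0] = rng.poisson(8, n_sites)          # initial local population sizes
    for t in range(1, n_years):
        survivors = rng.binomial(N[:, t - 1], phi)
        recruits = rng.poisson(gamma * N[:, t - 1])
        N[:, t] = survivors + recruits

    counts = rng.binomial(N, p)                # observed counts, p < 1
    ```

    Fitting reverses this simulation: the likelihood integrates over the unobserved N given the counts, which is why low detection probability and small local populations degrade the estimates, as the simulation studies in the abstract report.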

  7. Detection probability of cliff-nesting raptors during helicopter and fixed-wing aircraft surveys in western Alaska

    USGS Publications Warehouse

    Booms, T.L.; Schempf, P.F.; McCaffery, B.J.; Lindberg, M.S.; Fuller, M.R.

    2010-01-01

    We conducted repeated aerial surveys for breeding cliff-nesting raptors on the Yukon Delta National Wildlife Refuge (YDNWR) in western Alaska to estimate detection probabilities of Gyrfalcons (Falco rusticolus), Golden Eagles (Aquila chrysaetos), Rough-legged Hawks (Buteo lagopus), and Common Ravens (Corvus corax). Using the program PRESENCE, we modeled detection histories of each species based on single-species occupancy modeling. We used different observers during four helicopter replicate surveys in the Kilbuck Mountains and five fixed-wing replicate surveys in the Ingakslugwat Hills near Bethel, AK. During helicopter surveys, Gyrfalcons had the highest detection probability estimate (p^ = 0.79; SE = 0.05), followed by Golden Eagles (p^ = 0.68; SE = 0.05), Common Ravens (p^ = 0.45; SE = 0.17), and Rough-legged Hawks (p^ = 0.10; SE = 0.11). Detection probabilities from fixed-wing aircraft in the Ingakslugwat Hills were similar to those from the helicopter in the Kilbuck Mountains for Gyrfalcons and Golden Eagles, but were higher for Common Ravens (p^ = 0.85; SE = 0.06) and Rough-legged Hawks (p^ = 0.42; SE = 0.07). Fixed-wing aircraft provided detection probability estimates and SEs in the Ingakslugwat Hills similar to or better than those from helicopter surveys in the Kilbucks, and should be considered for future cliff-nesting raptor surveys where safe, low-altitude flight is possible. Overall, detection probability varied with observer experience and, in some cases, with study area/aircraft type.

  8. Video Shot Boundary Detection Using QR-Decomposition and Gaussian Transition Detection

    NASA Astrophysics Data System (ADS)

    Amiri, Ali; Fathy, Mahmood

    2010-12-01

    This article explores the problem of video shot boundary detection and examines a novel shot boundary detection algorithm by using QR-decomposition and modeling of gradual transitions by Gaussian functions. Specifically, the authors attend to the challenges of detecting gradual shots and extracting appropriate spatiotemporal features that affect the ability of algorithms to efficiently detect shot boundaries. The algorithm utilizes the properties of QR-decomposition and extracts a block-wise probability function that illustrates the probability of video frames to be in shot transitions. The probability function has abrupt changes in hard cut transitions, and semi-Gaussian behavior in gradual transitions. The algorithm detects these transitions by analyzing the probability function. Finally, we will report the results of the experiments using large-scale test sets provided by the TRECVID 2006, which has assessments for hard cut and gradual shot boundary detection. These results confirm the high performance of the proposed algorithm.

  9. Nonidentifiability of population size from capture-recapture data with heterogeneous detection probabilities

    USGS Publications Warehouse

    Link, W.A.

    2003-01-01

    Heterogeneity in detection probabilities has long been recognized as problematic in mark-recapture studies, and numerous models have been developed to accommodate its effects. Individual heterogeneity is especially problematic, in that reasonable alternative models may predict essentially identical observations from populations of substantially different sizes. Thus, even with very large samples, the analyst will not be able to distinguish among reasonable models of heterogeneity, even though these yield quite distinct inferences about population size. The problem is illustrated with models for closed and open populations.
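    The core of the problem can be shown numerically (my construction, not Link's examples): two heterogeneity models with the same mean per-occasion detection probability imply very different probabilities of never being captured, hence very different population size estimates from the same observed data.

    ```python
    # Two detection models with mean p = 0.5 over T = 5 occasions, and the
    # population sizes each implies for the same number of observed animals.
    T = 5                      # capture occasions
    n_observed = 200           # distinct animals ever captured

    # Model A: homogeneous p = 0.5 for every animal.
    p0_homog = (1 - 0.5) ** T
    # Model B: half the animals have p = 0.2, half p = 0.8 (same mean 0.5).
    p0_mixed = 0.5 * (1 - 0.2) ** T + 0.5 * (1 - 0.8) ** T

    # N-hat = n_observed / P(captured at least once) under each model.
    N_homog = n_observed / (1 - p0_homog)   # ~206
    N_mixed = n_observed / (1 - p0_mixed)   # ~239
    ```

    Jensen's inequality guarantees the mixed model always assigns more probability to "never captured," so heterogeneity inflates the implied population size even when the marginal capture rate is identical.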

  10. Modeling of site occupancy dynamics for northern spotted owls, with emphasis on the effects of barred owls

    USGS Publications Warehouse

    Olson, Gail S.; Anthony, Robert G.; Forsman, Eric D.; Ackers, Steven H.; Loschl, Peter J.; Reid, Janice A.; Dugger, Katie M.; Glenn, Elizabeth M.; Ripple, William J.

    2005-01-01

    Northern spotted owls (Strix occidentalis caurina) have been studied intensively since their listing as a threatened species by the U.S. Fish and Wildlife Service in 1990. Studies of spotted owl site occupancy have used various binary response measures, but most of these studies have made the assumption that detectability is perfect, or at least high and not variable. Further, previous studies did not consider temporal variation in site occupancy. We used relatively new methods for open population modeling of site occupancy that incorporated imperfect and variable detectability of spotted owls and allowed modeling of temporal variation in site occupancy, extinction, and colonization probabilities. We also examined the effects of barred owl (S. varia) presence on these parameters. We used spotted owl survey data from 1990 to 2002 for 3 study areas in Oregon, USA, and we used program MARK to develop and analyze site occupancy models. We found per visit detection probabilities averaged <0.70 and were highly variable among study years and study areas. Site occupancy probabilities for owl pairs declined greatly on 1 study area and slightly on the other 2 areas. For all owls, including singles and pairs, site occupancy was mostly stable through time. Barred owl presence had a negative effect on spotted owl detection probabilities, and it had either a positive effect on local-extinction probabilities or a negative effect on colonization probabilities. We conclude that further analyses of spotted owls must account for imperfect and variable detectability and barred owl presence to properly interpret results. Further, because barred owl presence is increasing within the range of northern spotted owls, we expect to see further declines in the proportion of sites occupied by spotted owls.

  11. MRI Brain Tumor Segmentation and Necrosis Detection Using Adaptive Sobolev Snakes.

    PubMed

    Nakhmani, Arie; Kikinis, Ron; Tannenbaum, Allen

    2014-03-21

    Brain tumor segmentation in brain MRI volumes is used in neurosurgical planning and illness staging. It is important to explore the tumor shape and necrosis regions at different points of time to evaluate the disease progression. We propose an algorithm for semi-automatic tumor segmentation and necrosis detection. Our algorithm consists of three parts: conversion of MRI volume to a probability space based on the on-line learned model, tumor probability density estimation, and adaptive segmentation in the probability space. We use manually selected acceptance and rejection classes on a single MRI slice to learn the background and foreground statistical models. Then, we propagate this model to all MRI slices to compute the most probable regions of the tumor. Anisotropic 3D diffusion is used to estimate the probability density. Finally, the estimated density is segmented by the Sobolev active contour (snake) algorithm to select smoothed regions of the maximum tumor probability. The segmentation approach is robust to noise and not very sensitive to the manual initialization in the volumes tested. Also, it is appropriate for low contrast imagery. The irregular necrosis regions are detected by using the outliers of the probability distribution inside the segmented region. The necrosis regions of small width are removed due to a high probability of noisy measurements. The MRI volume segmentation results obtained by our algorithm are very similar to expert manual segmentation.

  12. MRI brain tumor segmentation and necrosis detection using adaptive Sobolev snakes

    NASA Astrophysics Data System (ADS)

    Nakhmani, Arie; Kikinis, Ron; Tannenbaum, Allen

    2014-03-01

    Brain tumor segmentation in brain MRI volumes is used in neurosurgical planning and illness staging. It is important to explore the tumor shape and necrosis regions at different points of time to evaluate the disease progression. We propose an algorithm for semi-automatic tumor segmentation and necrosis detection. Our algorithm consists of three parts: conversion of MRI volume to a probability space based on the on-line learned model, tumor probability density estimation, and adaptive segmentation in the probability space. We use manually selected acceptance and rejection classes on a single MRI slice to learn the background and foreground statistical models. Then, we propagate this model to all MRI slices to compute the most probable regions of the tumor. Anisotropic 3D diffusion is used to estimate the probability density. Finally, the estimated density is segmented by the Sobolev active contour (snake) algorithm to select smoothed regions of the maximum tumor probability. The segmentation approach is robust to noise and not very sensitive to the manual initialization in the volumes tested. Also, it is appropriate for low contrast imagery. The irregular necrosis regions are detected by using the outliers of the probability distribution inside the segmented region. The necrosis regions of small width are removed due to a high probability of noisy measurements. The MRI volume segmentation results obtained by our algorithm are very similar to expert manual segmentation.

  13. The effects of flow on schooling Devario aequipinnatus: school structure, startle response and information transmission

    PubMed Central

    Chicoli, A.; Butail, S.; Lun, Y.; Bak-Coleman, J.; Coombs, S.; Paley, D.A.

    2014-01-01

    To assess how flow affects school structure and threat detection, startle response rates of solitary and small groups of giant danio Devario aequipinnatus were compared to visual looming stimuli in flow and no-flow conditions. The instantaneous position and heading of each D. aequipinnatus were extracted from high-speed videos. Behavioural results indicate that (1) school structure is altered in flow such that D. aequipinnatus orient upstream while spanning out in a crosswise direction, (2) the probability of at least one D. aequipinnatus detecting the visual looming stimulus is higher in flow than no flow for both solitary D. aequipinnatus and groups of eight D. aequipinnatus, however, (3) the probability of three or more individuals responding is higher in no flow than flow. Taken together, these results indicate a higher probability of stimulus detection in flow but a higher probability of internal transmission of information in no flow. Finally, results were well predicted by a computational model of collective fright response that included the probability of direct detection (based on signal detection theory) and indirect detection (i.e. via interactions between group members) of threatening stimuli. This model provides a new theoretical framework for analysing the collective transfer of information among groups of fishes and other organisms. PMID:24773538
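    The "at least one detector" arithmetic behind result (2) above follows from the binomial tail, sketched here with an assumed per-fish detection probability (not a value from the study): larger groups rarely all miss a stimulus, while requiring three or more responders is a much stricter event.

    ```python
    # Binomial-tail probabilities for group detection of a looming stimulus.
    from math import comb

    def p_at_least_k(n, p, k):
        """P(at least k of n independent fish detect the stimulus)."""
        return sum(comb(n, j) * p**j * (1 - p)**(n - j) for j in range(k, n + 1))

    p = 0.3                                  # assumed per-individual detection
    solo = p_at_least_k(1, p, 1)             # 0.30 for a solitary fish
    group_any = p_at_least_k(8, p, 1)        # ~0.94: a group of 8 rarely all miss
    group_three = p_at_least_k(8, p, 3)      # stricter "3+ respond" criterion
    ```

    Under independence the 3+ criterion is still fairly likely for a group of eight; the paper's finding that flow raises first detection but lowers multi-fish response points to the social-transmission term in their model, which this independence sketch omits.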

  14. Factors influencing detection of the federally endangered Diamond Darter Crystallaria cincotta: Implications for long-term monitoring strategies

    USGS Publications Warehouse

    Rizzo, Austin A.; Brown, Donald J.; Welsh, Stuart A.; Thompson, Patricia A.

    2017-01-01

    Population monitoring is an essential component of endangered species recovery programs. The federally endangered Diamond Darter Crystallaria cincotta is in need of an effective monitoring design to improve our understanding of its distribution and track population trends. Because of their small size, cryptic coloration, and nocturnal behavior, along with limitations associated with current sampling methods, individuals are difficult to detect at known occupied sites. Therefore, research is needed to determine if survey efforts can be improved by increasing the probability of individual detection. The primary objective of this study was to determine if there are seasonal and diel patterns in Diamond Darter detectability during population surveys. In addition to temporal factors, we also assessed five habitat variables that might influence individual detection. We used N-mixture models to estimate site abundances and relationships between covariates and individual detectability, and ranked models using Akaike's information criterion. During 2015, three known occupied sites were each sampled 15 times between May and October. The best-supported model included water temperature as a quadratic function influencing individual detectability, with temperatures around 22°C resulting in the highest detection probability. Detection probability when surveying at the optimal temperature was approximately 6% and 7.5% greater than when surveying at 16°C and 29°C, respectively. Time of night and day of year were not strong predictors of Diamond Darter detectability. The results of this study will allow researchers and agencies to maximize detection probability when surveying populations, resulting in greater monitoring efficiency and likely more precise abundance estimates.
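    The quadratic temperature effect reported here has a simple closed-form optimum. This sketch uses assumed coefficients (the paper reports the shape and the optimal temperature, not these numbers): detection is modeled on the logit scale as a parabola in water temperature, and the peak sits at the vertex of that parabola.

    ```python
    # Quadratic-on-the-logit-scale detection model with an interior optimum.
    import math

    b0, b1, b2 = -3.0, 0.22, -0.005    # assumed: logit(p) = b0 + b1*T + b2*T^2

    def detection_prob(temp_c):
        eta = b0 + b1 * temp_c + b2 * temp_c**2
        return 1 / (1 + math.exp(-eta))

    t_opt = -b1 / (2 * b2)             # vertex of the parabola: 22.0 degrees C
    p_opt = detection_prob(t_opt)      # highest detection probability
    p_cool = detection_prob(16)        # lower detection at 16 degrees C
    ```

    In a fitted model the surveyor would read b1 and b2 off the N-mixture detection submodel and schedule sampling near -b1 / (2 * b2), which is exactly the monitoring recommendation the abstract draws.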

  15. A model of human event detection in multiple process monitoring situations

    NASA Technical Reports Server (NTRS)

    Greenstein, J. S.; Rouse, W. B.

    1978-01-01

    It is proposed that human decision making in many multi-task situations might be modeled in terms of the manner in which the human detects events related to his tasks and the manner in which he allocates his attention among his tasks once he feels events have occurred. A model of human event detection performance in such a situation is presented. An assumption of the model is that, in attempting to detect events, the human generates the probability that events have occurred. Discriminant analysis is used to model the human's generation of these probabilities. An experimental study of human event detection performance in a multiple process monitoring situation is described, and the application of the event detection model to this situation is addressed. The experimental study employed a situation in which subjects simultaneously monitored several dynamic processes for the occurrence of events and made yes/no decisions on the presence of events in each process. Feeding the information displayed to the experimental subjects into the event detection model allows comparison of the model's performance with that of the subjects.
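    A toy version of the probability-generation step can be sketched with two Gaussian class-conditional densities (assumed distributions, not the paper's fitted discriminant model): the monitor's posterior probability of an event given the displayed value follows from Bayes' rule.

    ```python
    # Posterior event probability from two Gaussian class densities.
    import math

    def normal_pdf(x, mu, sigma):
        return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

    def p_event(x, prior=0.2, mu0=0.0, mu1=2.0, sigma=1.0):
        """Posterior P(event | x), with assumed event prior and class means."""
        f1 = prior * normal_pdf(x, mu1, sigma)          # event density
        f0 = (1 - prior) * normal_pdf(x, mu0, sigma)    # no-event density
        return f1 / (f0 + f1)

    hi = p_event(2.0)   # observation at the event mean: high posterior
    lo = p_event(0.0)   # observation at the baseline mean: low posterior
    ```

    Thresholding such posteriors yields the yes/no decisions recorded in the experiment, which is what lets the model's hit and false-alarm rates be compared directly with the subjects'.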

  16. Integrating occupancy modeling and interview data for corridor identification: A case study for jaguars in Nicaragua

    USGS Publications Warehouse

    Zeller, K.A.; Nijhawan, S.; Salom-Perez, R.; Potosme, S.H.; Hines, J.E.

    2011-01-01

    Corridors are critical elements in the long-term conservation of wide-ranging species like the jaguar (Panthera onca). Jaguar corridors across the range of the species were initially identified using a GIS-based least-cost corridor model. However, due to inherent errors in remotely sensed data and model uncertainties, these corridors warrant field verification before conservation efforts can begin. We developed a novel corridor assessment protocol based on interview data and site occupancy modeling. We divided our pilot study area, in southeastern Nicaragua, into 71 6 × 6 km sampling units and conducted 160 structured interviews with local residents. Interviews were designed to collect data on jaguar and seven prey species so that detection/non-detection matrices could be constructed for each sampling unit. Jaguars were reportedly detected in 57% of the sampling units and had a detection probability of 28%. With the exception of white-lipped peccary, prey species were reportedly detected in 82-100% of the sampling units. Though the use of interview data may violate some assumptions of the occupancy modeling approach for determining 'proportion of area occupied', we countered these shortcomings through study design and by interpreting the occupancy parameter, psi, as 'probability of habitat use'. Probability of habitat use was modeled for each target species using single-state or multistate models. A combination of the estimated probabilities of habitat use for jaguar and prey was selected to identify the final jaguar corridor. This protocol provides an efficient field methodology for identifying corridors for easily identifiable species across large study areas comprised of unprotected, private lands. © 2010 Elsevier Ltd.

  17. Exploiting vibrational resonance in weak-signal detection

    NASA Astrophysics Data System (ADS)

    Ren, Yuhao; Pan, Yan; Duan, Fabing; Chapeau-Blondeau, François; Abbott, Derek

    2017-08-01

    In this paper, we investigate the first exploitation of the vibrational resonance (VR) effect to detect weak signals in the presence of strong background noise. By injecting a series of sinusoidal interference signals of the same amplitude but with different frequencies into a generalized correlation detector, we show that the detection probability can be maximized at an appropriate interference amplitude. Based on a dual-Dirac probability density model, we compare the VR method with the stochastic resonance approach via adding dichotomous noise. The compared results indicate that the VR method can achieve a higher detection probability for a wider variety of noise distributions.

  18. Exploiting vibrational resonance in weak-signal detection.

    PubMed

    Ren, Yuhao; Pan, Yan; Duan, Fabing; Chapeau-Blondeau, François; Abbott, Derek

    2017-08-01

    In this paper, we investigate the first exploitation of the vibrational resonance (VR) effect to detect weak signals in the presence of strong background noise. By injecting a series of sinusoidal interference signals of the same amplitude but with different frequencies into a generalized correlation detector, we show that the detection probability can be maximized at an appropriate interference amplitude. Based on a dual-Dirac probability density model, we compare the VR method with the stochastic resonance approach via adding dichotomous noise. The compared results indicate that the VR method can achieve a higher detection probability for a wider variety of noise distributions.

  19. Analysis of capture-recapture models with individual covariates using data augmentation

    USGS Publications Warehouse

    Royle, J. Andrew

    2009-01-01

    I consider the analysis of capture-recapture models with individual covariates that influence detection probability. Bayesian analysis of the joint likelihood is carried out using a flexible data augmentation scheme that facilitates analysis by Markov chain Monte Carlo methods, and a simple and straightforward implementation in freely available software. This approach is applied to a study of meadow voles (Microtus pennsylvanicus) in which auxiliary data on a continuous covariate (body mass) are recorded, and it is thought that detection probability is related to body mass. In a second example, the model is applied to an aerial waterfowl survey in which a double-observer protocol is used. The fundamental unit of observation is the cluster of individual birds, and the size of the cluster (a discrete covariate) is used as a covariate on detection probability.

  20. A Probabilistic Model of Illegal Drug Trafficking Operations in the Eastern Pacific and Caribbean Sea

    DTIC Science & Technology

    2013-09-01

    ... partner agencies and nations, detects, tracks, and interdicts illegal drug-trafficking in this region. In this thesis, we develop a probability model based on intelligence inputs to generate a spatial-temporal heat map specifying the ... complement and vet such complicated simulation by developing more analytically tractable models. We develop probability models to generate a heat map ...

  1. Predicting species distributions from checklist data using site-occupancy models

    USGS Publications Warehouse

    Kery, M.; Gardner, B.; Monnerat, C.

    2010-01-01

    Aim: (1) To increase awareness of the challenges induced by imperfect detection, which is a fundamental issue in species distribution modelling; (2) to emphasize the value of replicate observations for species distribution modelling; and (3) to show how 'cheap' checklist data in faunal/floral databases may be used for the rigorous modelling of distributions by site-occupancy models. Location: Switzerland. Methods: We used checklist data collected by volunteers during 1999 and 2000 to analyse the distribution of the blue hawker, Aeshna cyanea (Odonata, Aeshnidae), a common dragonfly in Switzerland. We used data from repeated visits to 1-ha pixels to derive 'detection histories' and apply site-occupancy models to estimate the 'true' species distribution, i.e. corrected for imperfect detection. We modelled blue hawker distribution as a function of elevation and year, and its detection probability as a function of elevation, year and season. Results: The best model contained cubic polynomial elevation effects for distribution and quadratic effects of elevation and season for detectability. We compared the site-occupancy model with a conventional distribution model based on a generalized linear model, which assumes perfect detectability (p = 1). The conventional distribution map looked very different from the distribution map obtained using site-occupancy models that accounted for the imperfect detection. The conventional model underestimated the species distribution by 60%, and the slope parameters of the occurrence-elevation relationship were also underestimated when assuming p = 1. Elevation was not only an important predictor of blue hawker occurrence, but also of the detection probability, with a bell-shaped relationship. Furthermore, detectability increased over the season. The average detection probability was estimated at only 0.19 per survey.
Main conclusions: Conventional species distribution models do not model species distributions per se but rather the apparent distribution, i.e. an unknown proportion of species distributions. That unknown proportion is equivalent to detectability. Imperfect detection in conventional species distribution models yields underestimates of the extent of distributions and covariate effects that are biased towards zero. In addition, patterns in detectability will erroneously be ascribed to species distributions. In contrast, site-occupancy models applied to replicated detection/non-detection data offer a powerful framework for making inferences about species distributions corrected for imperfect detection. The use of 'cheap' checklist data greatly enhances the scope of applications of this useful class of models. © 2010 Blackwell Publishing Ltd.
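
The correction described above can be sketched with a minimal site-occupancy likelihood, constant psi and constant p, maximized on simulated detection histories. All values here are illustrative, not the study's data; p = 0.19 only echoes the per-survey estimate above.

```python
# Minimal sketch: naive occupancy vs. a zero-inflated binomial (site-occupancy) MLE.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import binom

rng = np.random.default_rng(1)
S, K = 2000, 5                      # sites and replicate visits (hypothetical)
psi_true, p_true = 0.6, 0.19        # p echoes the ~0.19 per-survey estimate
z = rng.random(S) < psi_true        # latent occupancy state
y = rng.binomial(K, p_true * z)     # detections per site

def nll(theta):
    psi, p = 1.0 / (1.0 + np.exp(-np.asarray(theta)))   # logit scale -> (0, 1)
    lik = psi * binom.pmf(y, K, p) + (1.0 - psi) * (y == 0)
    return -np.log(lik).sum()

fit = minimize(nll, x0=[0.0, 0.0])
psi_hat, p_hat = 1.0 / (1.0 + np.exp(-fit.x))
naive = np.mean(y > 0)              # the "apparent" distribution
print(round(naive, 3), round(psi_hat, 3), round(p_hat, 3))
```

The naive fraction of sites with detections lands well below the recovered psi, mirroring the underestimation the abstract reports for the p = 1 model.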

  2. Estimating occupancy and abundance using aerial images with imperfect detection

    USGS Publications Warehouse

    Williams, Perry J.; Hooten, Mevin B.; Womble, Jamie N.; Bower, Michael R.

    2017-01-01

    Species distribution and abundance are critical population characteristics for efficient management, conservation, and ecological insight. Point process models are a powerful tool for modelling distribution and abundance, and can incorporate many data types, including count data, presence-absence data, and presence-only data. Aerial photographic images are a natural tool for collecting data to fit point process models, but aerial images do not always capture all animals that are present at a site. Methods for estimating detection probability for aerial surveys usually include collecting auxiliary data to estimate the proportion of time animals are available to be detected. We developed an approach for fitting point process models using an N-mixture model framework to estimate detection probability for aerial occupancy and abundance surveys. Our method uses multiple aerial images taken of animals at the same spatial location to provide temporal replication of sample sites. The intersection of the images provides multiple counts of individuals at different times. We examined this approach using both simulated and real data of sea otters (Enhydra lutris kenyoni) in Glacier Bay National Park, southeastern Alaska. Using our proposed methods, we estimated detection probability of sea otters to be 0.76, the same as that of the visual aerial surveys used in the past. Further, simulations demonstrated that our approach is a promising tool for estimating occupancy, abundance, and detection probability from aerial photographic surveys. Our methods can be readily extended to data collected using unmanned aerial vehicles, as technology and regulations permit. The generality of our methods for other aerial surveys depends on how well surveys can be designed to meet the assumptions of N-mixture models.
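
The N-mixture idea the abstract relies on, marginalizing a latent abundance N out of repeated counts, can be sketched as follows. The counts, Poisson mean, and N_max truncation are hypothetical; p = 0.76 only echoes the estimate above.

```python
# Sketch: marginal likelihood of repeated counts at one site under a simple
# N-mixture model (Poisson abundance, binomial detection).
import numpy as np
from scipy.stats import poisson, binom

def site_likelihood(counts, lam, p, N_max=200):
    """P(counts | lam, p): marginalize the latent abundance N at one site."""
    N = np.arange(max(counts), N_max + 1)        # N must cover every count
    prior = poisson.pmf(N, lam)                  # abundance model
    cond = np.prod([binom.pmf(y, N, p) for y in counts], axis=0)
    return float(np.sum(prior * cond))

counts = [3, 4, 2]           # e.g. counts from three images of the same site
print(site_likelihood(counts, lam=5.0, p=0.76))
```

Maximizing this product over sites with respect to lam and p is the estimation step; the temporal replication from overlapping images is what makes p identifiable.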

  3. A hierarchical model combining distance sampling and time removal to estimate detection probability during avian point counts

    USGS Publications Warehouse

    Amundson, Courtney L.; Royle, J. Andrew; Handel, Colleen M.

    2014-01-01

    Imperfect detection during animal surveys biases estimates of abundance and can lead to improper conclusions regarding distribution and population trends. Farnsworth et al. (2005) developed a combined distance-sampling and time-removal model for point-transect surveys that addresses both availability (the probability that an animal is available for detection; e.g., that a bird sings) and perceptibility (the probability that an observer detects an animal, given that it is available for detection). We developed a hierarchical extension of the combined model that provides an integrated analysis framework for a collection of survey points at which both distance from the observer and time of initial detection are recorded. Implemented in a Bayesian framework, this extension facilitates evaluating covariates on abundance and detection probability, incorporating excess zero counts (i.e. zero-inflation), accounting for spatial autocorrelation, and estimating population density. Species-specific characteristics, such as behavioral displays and territorial dispersion, may lead to different patterns of availability and perceptibility, which may, in turn, influence the performance of such hierarchical models. Therefore, we first test our proposed model using simulated data under different scenarios of availability and perceptibility. We then illustrate its performance with empirical point-transect data for a songbird that consistently produces loud, frequent, primarily auditory signals, the Golden-crowned Sparrow (Zonotrichia atricapilla); and for 2 ptarmigan species (Lagopus spp.) that produce more intermittent, subtle, and primarily visual cues. 
Data were collected by multiple observers along point transects across a broad landscape in southwest Alaska, so we evaluated point-level covariates on perceptibility (observer and habitat), availability (date within season and time of day), and abundance (habitat, elevation, and slope), and included a nested point-within-transect and park-level effect. Our results suggest that this model can provide insight into the detection process during avian surveys and reduce bias in estimates of relative abundance but is best applied to surveys of species with greater availability (e.g., breeding songbirds).

  4. Heterogeneous detection probabilities for imperiled Missouri River fishes: implications for large-river monitoring programs

    USGS Publications Warehouse

    Schloesser, J.T.; Paukert, Craig P.; Doyle, W.J.; Hill, Tracy D.; Steffensen, K.D.; Travnichek, Vincent H.

    2012-01-01

    Occupancy modeling was used to determine (1) if detection probabilities (p) for 7 regionally imperiled Missouri River fishes (Scaphirhynchus albus, Scaphirhynchus platorynchus, Cycleptus elongatus, Sander canadensis, Macrhybopsis aestivalis, Macrhybopsis gelida, and Macrhybopsis meeki) differed among gear types (i.e. stationary gill nets, drifted trammel nets, and otter trawls), and (2) how detection probabilities were affected by habitat (i.e. pool, bar, and open water), longitudinal position (five 189 to 367 rkm long segments), sampling year (2003 to 2006), and season (July 1 to October 30 and October 31 to June 30). Adult, large-bodied fishes were best detected with gill nets (p: 0.02–0.74), but most juvenile large-bodied and all small-bodied species were best detected with otter trawls (p: 0.02–0.58). Trammel nets may be a redundant sampling gear for imperiled fishes in the lower Missouri River because most species had greater detection probabilities with gill nets or otter trawls. Detection probabilities varied with river segment for S. platorynchus, C. elongatus, and all small-bodied fishes, suggesting that changes in habitat influenced gear efficiency or abundance changes among river segments. Detection probabilities varied by habitat for adult S. albus and S. canadensis, year for juvenile S. albus, C. elongatus, and S. canadensis, and season for adult S. albus. Concentrating sampling effort on gears with the greatest detection probabilities may increase species detections to better monitor a population's response to environmental change and the effects of management actions on large-river fishes.
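
A quick way to see why concentrating effort on high-detection gears pays off, as the abstract recommends, is the cumulative detection probability over n independent deployments. The deployment counts are hypothetical; the p values echo the ends of the gill-net range reported above.

```python
def cumulative_detection(p, n):
    """Probability of at least one detection in n independent deployments."""
    return 1.0 - (1.0 - p) ** n

# Gear-level p values echo the ends of the gill-net range reported above
for p in (0.74, 0.02):
    print(p, [round(cumulative_detection(p, n), 3) for n in (1, 3, 5)])
```

A species with p = 0.02 per deployment remains likely undetected even after five deployments, whereas p = 0.74 is near-certain detection by the third.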

  5. Sampling techniques for burbot in a western non-wadeable river

    USGS Publications Warehouse

    Klein, Z. B.; Quist, Michael C.; Rhea, D.T.; Senecal, A. C.

    2015-01-01

    Burbot, Lota lota (L.), populations are declining throughout much of their native distribution. Although numerous aspects of burbot ecology are well understood, less is known about effective sampling techniques for burbot in lotic systems. Occupancy models were used to estimate the probability of detection (p) for three gears (6.4- and 19-mm bar mesh hoop nets, night electric fishing), within the context of various habitat characteristics. During the summer, night electric fishing had the highest estimated detection probability for both juvenile (p = 0.35, 95% C.I. 0.26–0.46) and adult (0.30, 0.20–0.41) burbot. However, small-mesh hoop nets (6.4-mm bar mesh) had similar detection probabilities to night electric fishing for both juvenile (0.26, 0.17–0.36) and adult (0.27, 0.18–0.39) burbot during the summer. In autumn, a similar overlap between detection probabilities was observed for juvenile and adult burbot. Small-mesh hoop nets had the highest estimated probability of detection for both juvenile and adult burbot (0.46, 0.33–0.59), whereas night electric fishing had a detection probability of 0.39 (0.28–0.52) for juvenile and adult burbot. By using detection probabilities to compare gears, the most effective sampling technique can be identified, leading to increased species detections and more effective management of burbot.

  6. Probability of detection of clinical seizures using heart rate changes.

    PubMed

    Osorio, Ivan; Manly, B F J

    2015-08-01

    Heart rate-based seizure detection is a viable complement or alternative to ECoG/EEG. This study investigates the role of various biological factors on the probability of clinical seizure detection using heart rate. Regression models were applied to 266 clinical seizures recorded from 72 subjects to investigate if factors such as age, gender, years with epilepsy, etiology, seizure site origin, seizure class, and data collection centers, among others, shape the probability of EKG-based seizure detection. Clinical seizure detection probability based on heart rate changes is significantly (p<0.001) shaped by patients' age and gender, seizure class, and years with epilepsy. The probability of detecting clinical seizures (>0.8 in the majority of subjects) using heart rate is highest for complex partial seizures, increases with a patient's years with epilepsy, is lower for females than for males and is unrelated to the side of hemisphere origin. Clinical seizure detection probability using heart rate is multi-factorially dependent and sufficiently high (>0.8) in most cases to be clinically useful. Knowledge of the role that these factors play in shaping said probability will enhance its applicability and usefulness. Heart rate is a reliable and practical signal for extra-cerebral detection of clinical seizures originating from or spreading to central autonomic network structures. Copyright © 2015 British Epilepsy Association. Published by Elsevier Ltd. All rights reserved.

  7. A scan statistic for binary outcome based on hypergeometric probability model, with an application to detecting spatial clusters of Japanese encephalitis.

    PubMed

    Zhao, Xing; Zhou, Xiao-Hua; Feng, Zijian; Guo, Pengfei; He, Hongyan; Zhang, Tao; Duan, Lei; Li, Xiaosong

    2013-01-01

    As a useful tool for geographical cluster detection of events, the spatial scan statistic is widely applied in many fields and plays an increasingly important role. The classic version of the spatial scan statistic for binary outcomes was developed by Kulldorff, based on the Bernoulli or the Poisson probability model. In this paper, we apply the hypergeometric probability model to construct the likelihood function under the null hypothesis. Compared with existing methods, this likelihood function offers an alternative, indirect way to identify the potential cluster, with the test statistic being the extreme value of the likelihood function. As in Kulldorff's methods, we adopt a Monte Carlo test for the test of significance. Both methods are applied to detecting spatial clusters of Japanese encephalitis in Sichuan province, China, in 2009, and the detected clusters are identical. A simulation using independent benchmark data indicates that the test statistic based on the hypergeometric model outperforms Kulldorff's statistics for clusters of high population density or large size; otherwise Kulldorff's statistics are superior.
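
The core calculation, a hypergeometric likelihood for the cases falling inside one scanning window plus a Monte Carlo test of significance, might be sketched as follows. All window and population sizes here are invented for illustration.

```python
import numpy as np
from scipy.stats import hypergeom

N, K = 10_000, 120        # population size and total cases (illustrative)
n, k = 500, 18            # subjects and cases inside the candidate window

# Hypergeometric likelihood of k cases in the window under random labeling
obs = hypergeom.pmf(k, N, K, n)

# Monte Carlo test of significance, as in the scan-statistic literature
rng = np.random.default_rng(0)
sims = rng.hypergeometric(K, N - K, n, size=9999)
p_value = (np.sum(sims >= k) + 1) / (9999 + 1)
print(obs, p_value)
```

A full scan statistic would repeat this over all candidate windows and take the extreme value, but the single-window likelihood above is the building block.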

  8. Radiation detection method and system using the sequential probability ratio test

    DOEpatents

    Nelson, Karl E [Livermore, CA; Valentine, John D [Redwood City, CA; Beauchamp, Brock R [San Ramon, CA

    2007-07-17

    A method and system using the Sequential Probability Ratio Test (SPRT) to enhance the detection of an elevated level of radiation, by determining whether a set of observations is consistent with a specified model within given bounds of statistical significance. In particular, the SPRT is used in the present invention to maximize the range of detection, by providing processing mechanisms for estimating the dynamic background radiation, adjusting the models to reflect the amount of background knowledge at the current point in time, analyzing the current sample using the models to determine statistical significance, and determining when the sample has returned to the expected background conditions.
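
A minimal sketch of Wald's SPRT on Poisson count data, in the spirit of the background-versus-elevated comparison described above; the rates, error levels, and count streams are hypothetical and this is not the patented processing chain.

```python
import math
import numpy as np

def sprt(counts, b, s, alpha=0.01, beta=0.01):
    """Scan Poisson counts; decide between background rate b and signal rate s."""
    upper = math.log((1 - beta) / alpha)   # cross above: accept "elevated"
    lower = math.log(beta / (1 - alpha))   # cross below: accept "background"
    llr = 0.0
    for n in counts:
        llr += n * math.log(s / b) - (s - b)   # Poisson log-likelihood ratio
        if llr >= upper:
            return "signal"
        if llr <= lower:
            return "background"
    return "continue"

rng = np.random.default_rng(3)
res_bg = sprt(rng.poisson(5.0, 50), b=5.0, s=8.0)    # background-only stream
res_sig = sprt(rng.poisson(8.0, 50), b=5.0, s=8.0)   # elevated stream
print(res_bg, res_sig)
```

The sequential form is what gives the range advantage: the test stops as soon as the accumulated evidence crosses either bound, rather than waiting for a fixed sample size.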

  9. Probability of Detection (POD) as a statistical model for the validation of qualitative methods.

    PubMed

    Wehling, Paul; LaBudde, Robert A; Brunelle, Sharon L; Nelson, Maria T

    2011-01-01

    A statistical model is presented for use in validation of qualitative methods. This model, termed Probability of Detection (POD), harmonizes the statistical concepts and parameters between quantitative and qualitative method validation. POD characterizes method response with respect to concentration as a continuous variable. The POD model provides a tool for graphical representation of response curves for qualitative methods. In addition, the model allows comparisons between candidate and reference methods, and provides calculations of repeatability, reproducibility, and laboratory effects from collaborative study data. Single laboratory study and collaborative study examples are given.
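
POD characterizes response as a continuous function of concentration; one common functional form for such a response curve is a log-logistic function, sketched below with invented parameters (this is an illustration, not the harmonized model itself).

```python
def pod(conc, c50, slope):
    """Log-logistic POD curve: detection probability at a given concentration."""
    return 1.0 / (1.0 + (c50 / conc) ** slope)

c50, slope = 0.1, 2.0     # hypothetical 50%-detection level and steepness
for c in (0.01, 0.1, 1.0):
    print(c, round(pod(c, c50, slope), 3))
```

Fitting such a curve to qualitative results at several concentrations is what allows the graphical comparison of candidate and reference methods the abstract describes.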

  10. Minimum time search in uncertain dynamic domains with complex sensorial platforms.

    PubMed

    Lanillos, Pablo; Besada-Portas, Eva; Lopez-Orozco, Jose Antonio; de la Cruz, Jesus Manuel

    2014-08-04

    The minimum time search in uncertain domains is a searching task, which appears in real world problems such as natural disasters and sea rescue operations, where a target has to be found, as soon as possible, by a set of sensor-equipped searchers. The automation of this task, where the time to detect the target is critical, can be achieved by new probabilistic techniques that directly minimize the Expected Time (ET) to detect a dynamic target using the observation probability models and actual observations collected by the sensors on board the searchers. The selected technique, described in algorithmic form in this paper for completeness, has only been previously partially tested with an ideal binary detection model, in spite of being designed to deal with complex non-linear/non-differential sensorial models. This paper covers the gap, testing its performance and applicability over different searching tasks with searchers equipped with different complex sensors. The sensorial models under test vary from stepped detection probabilities to continuous/discontinuous differentiable/non-differentiable detection probabilities dependent on distance, orientation, and structured maps. The analysis of the simulated results of several static and dynamic scenarios performed in this paper validates the applicability of the technique with different types of sensor models.
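
The Expected Time criterion being minimized can be illustrated for a static target and a fixed search sequence. The grid prior, the per-look detection probability, and the horizon handling below are all assumptions of this sketch, not the paper's algorithm.

```python
import numpy as np

def expected_time(prior, sequence, p):
    """E[time to detect a static target]; undetected mass pays horizon T+1."""
    mass = np.array(prior, dtype=float)   # residual target probability per cell
    et, detected = 0.0, 0.0
    for t, cell in enumerate(sequence, start=1):
        d = p * mass[cell]                # prob. the first detection is at step t
        et += t * d
        detected += d
        mass[cell] *= (1.0 - p)           # unnormalized Bayesian update
    return et + (len(sequence) + 1) * (1.0 - detected)

prior = [0.7, 0.2, 0.1]                   # hypothetical 3-cell prior map
et_good = expected_time(prior, [0, 0, 1, 2], p=0.9)
et_bad = expected_time(prior, [2, 1, 0, 0], p=0.9)
print(et_good, et_bad)
```

Visiting high-probability cells first lowers the expected detection time; replacing the constant p with a distance- or orientation-dependent sensor model gives the kinds of sensorial platforms the paper tests.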

  11. Minimum Time Search in Uncertain Dynamic Domains with Complex Sensorial Platforms

    PubMed Central

    Lanillos, Pablo; Besada-Portas, Eva; Lopez-Orozco, Jose Antonio; de la Cruz, Jesus Manuel

    2014-01-01

    The minimum time search in uncertain domains is a searching task, which appears in real world problems such as natural disasters and sea rescue operations, where a target has to be found, as soon as possible, by a set of sensor-equipped searchers. The automation of this task, where the time to detect the target is critical, can be achieved by new probabilistic techniques that directly minimize the Expected Time (ET) to detect a dynamic target using the observation probability models and actual observations collected by the sensors on board the searchers. The selected technique, described in algorithmic form in this paper for completeness, has only been previously partially tested with an ideal binary detection model, in spite of being designed to deal with complex non-linear/non-differential sensorial models. This paper covers the gap, testing its performance and applicability over different searching tasks with searchers equipped with different complex sensors. The sensorial models under test vary from stepped detection probabilities to continuous/discontinuous differentiable/non-differentiable detection probabilities dependent on distance, orientation, and structured maps. The analysis of the simulated results of several static and dynamic scenarios performed in this paper validates the applicability of the technique with different types of sensor models. PMID:25093345

  12. Changes in tropical precipitation cluster size distributions under global warming

    NASA Astrophysics Data System (ADS)

    Neelin, J. D.; Quinn, K. M.

    2016-12-01

    The total amount of precipitation integrated across a tropical storm or other precipitation feature (contiguous clusters of precipitation exceeding a minimum rain rate) is a useful measure of the aggregate size of the disturbance. To establish baseline behavior in current climate, the probability distribution of cluster sizes from multiple satellite retrievals and National Center for Environmental Prediction (NCEP) reanalysis is compared to those from Coupled Model Intercomparison Project (CMIP5) models and the Geophysical Fluid Dynamics Laboratory high-resolution atmospheric model (HIRAM-360 and -180). With the caveat that a minimum rain rate threshold is important in the models (which tend to overproduce low rain rates), the models agree well with observations in leading properties. In particular, scale-free power law ranges in which the probability drops slowly with increasing cluster size are well modeled, followed by a rapid drop in probability of the largest clusters above a cutoff scale. Under the RCP 8.5 global warming scenario, the models indicate substantial increases in probability (up to an order of magnitude) of the largest clusters by the end of century. For models with continuous time series of high resolution output, there is substantial spread on when these probability increases for the largest precipitation clusters should be detectable, ranging from detectable within the observational period to statistically significant trends emerging only in the second half of the century. Examination of NCEP reanalysis and SSMI/SSMIS series of satellite retrievals from 1979 to present does not yield reliable evidence of trends at this time. The results suggest improvements in inter-satellite calibration of the SSMI/SSMIS retrievals could aid future detection.
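
The cluster definition used above, contiguous grid cells exceeding a minimum rain rate, can be sketched with connected-component labeling; the rain field and threshold below are invented for illustration.

```python
import numpy as np
from scipy import ndimage

# A toy rain-rate field (mm/h); clusters = contiguous cells above a threshold
rain = np.array([[0.1, 2.0, 2.5, 0.0],
                 [0.0, 3.1, 0.2, 0.0],
                 [0.0, 0.0, 1.8, 2.2],
                 [4.0, 0.0, 0.0, 1.9]])
wet = rain > 1.0                          # minimum rain rate threshold
labels, n_clusters = ndimage.label(wet)   # default 4-connectivity
sizes = np.sort(ndimage.sum(wet, labels, index=range(1, n_clusters + 1)))
print(n_clusters, sizes)
```

Accumulating such cluster sizes over many fields yields the size distribution whose power-law range and large-cluster cutoff the models and observations are compared on; summing `rain` instead of `wet` within each label would give the integrated precipitation per cluster.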

  13. BEESCOUT: A model of bee scouting behaviour and a software tool for characterizing nectar/pollen landscapes for BEEHAVE.

    PubMed

    Becher, M A; Grimm, V; Knapp, J; Horn, J; Twiston-Davies, G; Osborne, J L

    2016-11-24

    Social bees are central place foragers collecting floral resources from the surrounding landscape, but little is known about the probability of a scouting bee finding a particular flower patch. We therefore developed a software tool, BEESCOUT, to theoretically examine how bees might explore a landscape and distribute their scouting activities over time and space. An image file can be imported, which is interpreted by the model as a "forage map" with certain colours representing certain crops or habitat types as specified by the user. BEESCOUT calculates the size and location of these potential food sources in that landscape relative to a bee colony. An individual-based model then determines the detection probabilities of the food patches by bees, based on parameter values gathered from the flight patterns of radar-tracked honeybees and bumblebees. Various "search modes" describe hypothetical search strategies for the long-range exploration of scouting bees. The resulting detection probabilities of forage patches can be used as input for the recently developed honeybee model BEEHAVE, to explore realistic scenarios of colony growth and death in response to different stressors. In example simulations, we find that detection probabilities for food sources close to the colony fit empirical data reasonably well. However, for food sources further away no empirical data are available to validate model output. The simulated detection probabilities depend largely on the bees' search mode, and whether they exchange information about food source locations. Nevertheless, we show that landscape structure and connectivity of food sources can have a strong impact on the results. We believe that BEESCOUT is a valuable tool to better understand how landscape configurations and searching behaviour of bees affect detection probabilities of food sources. 
It can also guide the collection of relevant data and the design of experiments to close knowledge gaps, and provides a useful extension to the BEEHAVE honeybee model, enabling future users to explore how landscape structure and food availability affect the foraging decisions and patch visitation rates of the bees and, in consequence, to predict colony development and survival.

  14. Estimating abundance while accounting for rarity, correlated behavior, and other sources of variation in counts

    USGS Publications Warehouse

    Dorazio, Robert M.; Martin, Julien; Edwards, Holly H.

    2013-01-01

    The class of N-mixture models allows abundance to be estimated from repeated, point count surveys while adjusting for imperfect detection of individuals. We developed an extension of N-mixture models to account for two commonly observed phenomena in point count surveys: rarity and lack of independence induced by unmeasurable sources of variation in the detectability of individuals. Rarity increases the number of locations with zero detections in excess of those expected under simple models of abundance (e.g., Poisson or negative binomial). Correlated behavior of individuals and other phenomena, though difficult to measure, increases the variation in detection probabilities among surveys. Our extension of N-mixture models includes a hurdle model of abundance and a beta-binomial model of detectability that accounts for additional (extra-binomial) sources of variation in detections among surveys. As an illustration, we fit this model to repeated point counts of the West Indian manatee, which was observed in a pilot study using aerial surveys. Our extension of N-mixture models provides increased flexibility. The effects of different sets of covariates may be estimated for the probability of occurrence of a species, for its mean abundance at occupied locations, and for its detectability.
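
The beta-binomial detection component can be illustrated by comparing its variance against the pure binomial baseline; the number of surveys, the mean detectability, and the overdispersion values below are hypothetical.

```python
from scipy.stats import betabinom, binom

K = 10                # surveys at an occupied site (hypothetical)
mean_p = 0.3          # mean per-survey detectability (hypothetical)
print(binom.var(K, mean_p))                 # pure binomial baseline
for rho in (0.01, 0.3):                     # intra-class correlation
    a = mean_p * (1 - rho) / rho            # beta shape parameters matching
    b = (1 - mean_p) * (1 - rho) / rho      # the chosen mean and correlation
    print(rho, betabinom.var(K, a, b))      # extra-binomial variance
```

At the same mean detectability, correlated behavior (larger rho) inflates the variance of detections among surveys well beyond the binomial value, which is exactly the extra-binomial variation the model absorbs.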

  15. Detection method of financial crisis in Indonesia using MSGARCH models based on banking condition indicators

    NASA Astrophysics Data System (ADS)

    Sugiyanto; Zukhronah, E.; Sari, S. P.

    2018-05-01

    Financial crises have hit Indonesia several times, creating the need for an early detection system to minimize their impact. One of many methods that can be used to detect a crisis is to model crisis indicators using a combination of volatility and Markov switching models [5]. Several indicators can be used to detect financial crisis; three of them are the difference between the interest rates on deposits and lending, the real interest rate on deposits, and the difference between the real BI rate and the real Fed rate, which can be referred to as banking condition indicators. A volatility model is used to handle conditional variance that changes over time, and the combination of volatility and Markov switching models is used to detect regime changes in the data. The smoothed probability from the combined models can be used to detect the crisis. This research found that the best combined volatility and Markov switching models for the three indicators are MS-GARCH(3,1,1) models with a three-state assumption. The crises from mid-1997 until 1998 were successfully detected within a certain range of smoothed probability values for the three indicators.

  16. Estimating abundance while accounting for rarity, correlated behavior, and other sources of variation in counts.

    PubMed

    Dorazio, Robert M; Martin, Julien; Edwards, Holly H

    2013-07-01

    The class of N-mixture models allows abundance to be estimated from repeated, point count surveys while adjusting for imperfect detection of individuals. We developed an extension of N-mixture models to account for two commonly observed phenomena in point count surveys: rarity and lack of independence induced by unmeasurable sources of variation in the detectability of individuals. Rarity increases the number of locations with zero detections in excess of those expected under simple models of abundance (e.g., Poisson or negative binomial). Correlated behavior of individuals and other phenomena, though difficult to measure, increases the variation in detection probabilities among surveys. Our extension of N-mixture models includes a hurdle model of abundance and a beta-binomial model of detectability that accounts for additional (extra-binomial) sources of variation in detections among surveys. As an illustration, we fit this model to repeated point counts of the West Indian manatee, which was observed in a pilot study using aerial surveys. Our extension of N-mixture models provides increased flexibility. The effects of different sets of covariates may be estimated for the probability of occurrence of a species, for its mean abundance at occupied locations, and for its detectability.

  17. Using multilevel spatial models to understand salamander site occupancy patterns after wildfire

    USGS Publications Warehouse

    Chelgren, Nathan; Adams, Michael J.; Bailey, Larissa L.; Bury, R. Bruce

    2011-01-01

    Studies of the distribution of elusive forest wildlife have suffered from the confounding of true presence with the uncertainty of detection. Occupancy modeling, which incorporates probabilities of species detection conditional on presence, is an emerging approach for reducing observation bias. However, the current likelihood modeling framework is restrictive for handling unexplained sources of variation in the response that may occur when there are dependence structures such as smaller sampling units that are nested within larger sampling units. We used multilevel Bayesian occupancy modeling to handle dependence structures and to partition sources of variation in occupancy of sites by terrestrial salamanders (family Plethodontidae) within and surrounding an earlier wildfire in western Oregon, USA. Comparison of model fit favored a spatial N-mixture model that accounted for variation in salamander abundance over models that were based on binary detection/non-detection data. Though catch per unit effort was higher in burned areas than unburned, there was strong support that this pattern was due to a higher probability of capture for individuals in burned plots. Within the burn, the odds of capturing an individual given it was present were 2.06 times the odds outside the burn, reflecting reduced complexity of ground cover in the burn. There was weak support that true occupancy was lower within the burned area. While the odds of occupancy in the burn were 0.49 times the odds outside the burn among the five species, the magnitude of variation attributed to the burn was small in comparison to variation attributed to other landscape variables and to unexplained, spatially autocorrelated random variation. 
While ordinary occupancy models may separate the biological pattern of interest from variation in detection probability when all sources of variation are known, adding random-effects structures for unexplained sources of variation in occupancy and detection probability may often represent levels of uncertainty more appropriately. © 2011 by the Ecological Society of America.

  18. Threshold detection in an on-off binary communications channel with atmospheric scintillation

    NASA Technical Reports Server (NTRS)

    Webb, W. E.; Marino, J. T., Jr.

    1974-01-01

    The optimum detection threshold in an on-off binary optical communications system operating in the presence of atmospheric turbulence was investigated, assuming a Poisson detection process and log-normal scintillation. The dependence of the probability of bit error on log-amplitude variance and received signal strength was analyzed, and semi-empirical relationships to predict the optimum detection threshold were derived. On the basis of this analysis, a piecewise-linear model for an adaptive threshold detection system is presented. Bit error probabilities for non-optimum threshold detection systems were also investigated.

  19. Threshold detection in an on-off binary communications channel with atmospheric scintillation

    NASA Technical Reports Server (NTRS)

    Webb, W. E.

    1975-01-01

    The optimum detection threshold in an on-off binary optical communications system operating in the presence of atmospheric turbulence was investigated, assuming a Poisson detection process and log-normal scintillation. The dependence of the probability of bit error on log-amplitude variance and received signal strength was analyzed, and semi-empirical relationships to predict the optimum detection threshold were derived. On the basis of this analysis, a piecewise-linear model for an adaptive threshold detection system is presented. The bit error probabilities for non-optimum threshold detection systems were also investigated.
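    For the idealized case without scintillation, the optimum threshold can be found by scanning the Poisson bit-error expression directly. The sketch below assumes equiprobable bits and mean photocounts `mu_off` and `mu_on` (illustrative values of our choosing), not the paper's semi-empirical relationships.

```python
import math

def poisson_cdf(k, mu):
    # P(X <= k) for X ~ Poisson(mu), via a stable running term
    term = math.exp(-mu)
    total = term
    for i in range(1, k + 1):
        term *= mu / i
        total += term
    return total

def bit_error_prob(threshold, mu_off, mu_on):
    # Decide "on" when the photocount >= threshold; bits assumed equiprobable
    p_false_alarm = 1.0 - poisson_cdf(threshold - 1, mu_off)
    p_miss = poisson_cdf(threshold - 1, mu_on)
    return 0.5 * (p_false_alarm + p_miss)

def optimum_threshold(mu_off, mu_on, t_max=100):
    # Scan integer thresholds for the minimum bit error probability
    return min(range(1, t_max + 1), key=lambda t: bit_error_prob(t, mu_off, mu_on))
```

    For mean counts of 2 ("off") and 20 ("on"), the scan agrees with the likelihood-ratio threshold (mu_on - mu_off) / ln(mu_on / mu_off) rounded up to the next integer.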

  20. Cumulative detection probabilities and range accuracy of a pulsed Geiger-mode avalanche photodiode laser ranging system

    NASA Astrophysics Data System (ADS)

    Luo, Hanjun; Ouyang, Zhengbiao; Liu, Qiang; Chen, Zhiliang; Lu, Hualan

    2017-10-01

    Cumulative pulse detection with an appropriate cumulative pulse number and threshold can improve the detection performance of a pulsed laser ranging system with a GM-APD. In this paper, based on Poisson statistics and the multi-pulse cumulative process, the cumulative detection probabilities and the factors that influence them are investigated. With the normalized probability distribution of each time bin, a theoretical model of range accuracy and precision is established, and the factors limiting range accuracy and precision are discussed. The results show that cumulative pulse detection can produce a higher target detection probability and a lower false alarm probability. However, for a heavy noise level and extremely weak echo intensity, the false-alarm suppression performance of cumulative pulse detection deteriorates quickly. Range accuracy and precision is another important parameter for evaluating detection performance; echo intensity and pulse width are its main influencing factors, with higher accuracy and precision obtained at stronger echo intensity and narrower echo pulse width. For a 5-ns echo pulse width, when the echo intensity is larger than 10, range accuracy and precision better than 7.5 cm can be achieved.
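    The cumulative detection logic follows from Poisson statistics: a Geiger-mode APD fires on a pulse with probability 1 - exp(-n̄) for mean primary-electron number n̄, and a detection is declared when at least k of m pulses fire in the same time bin. A minimal sketch under these standard assumptions (the signal and noise levels are illustrative, not taken from the paper):

```python
import math

def per_pulse_fire_prob(mean_pe):
    # A Geiger-mode APD fires when at least one primary electron is generated
    return 1.0 - math.exp(-mean_pe)

def cumulative_prob(p_single, m_pulses, k_threshold):
    # P(at least k_threshold of m_pulses independent pulses fire)
    return sum(
        math.comb(m_pulses, j) * p_single ** j * (1.0 - p_single) ** (m_pulses - j)
        for j in range(k_threshold, m_pulses + 1)
    )

# Illustrative numbers: mean signal 2.0 and mean noise 0.1 primary electrons
p_detect = cumulative_prob(per_pulse_fire_prob(2.0 + 0.1), 20, 5)
p_false = cumulative_prob(per_pulse_fire_prob(0.1), 20, 5)
```

    With these numbers the target detection probability is near 1 while the false alarm probability stays small, matching the qualitative behavior described in the abstract.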

  1. Investigation of an Optimum Detection Scheme for a Star-Field Mapping System

    NASA Technical Reports Server (NTRS)

    Aldridge, M. D.; Credeur, L.

    1970-01-01

    An investigation was made to determine the optimum detection scheme for a star-field mapping system that uses coded detection resulting from starlight shining through specially arranged multiple slits of a reticle. The computer solution of equations derived from a theoretical model showed that the greatest probability of detection for a given star and background intensity occurred with the use of a single transparent slit. However, use of multiple slits improved the system's ability to reject the detection of undesirable lower intensity stars, but only by decreasing the probability of detection for lower intensity stars to be mapped. Also, it was found that the coding arrangement affected the root-mean-square star-position error and that detection is possible with error in the system's detected spin rate, though at a reduced probability.

  2. Reliability analysis and fault-tolerant system development for a redundant strapdown inertial measurement unit. [inertial platforms

    NASA Technical Reports Server (NTRS)

    Motyka, P.

    1983-01-01

    A methodology is developed and applied for quantitatively analyzing the reliability of a dual, fail-operational redundant strapdown inertial measurement unit (RSDIMU). A Markov evaluation model is defined in terms of the operational states of the RSDIMU to predict system reliability. A 27-state model is defined based upon a candidate redundancy-management system which can detect and isolate a spectrum of failure magnitudes. The results of parametric studies are presented which show the effect on reliability of the gyro failure rate; the gyro and accelerometer failure rates together; false alarms; the probabilities of failure detection, failure isolation, and damage effects; and mission time. A technique is developed and evaluated for generating dynamic thresholds for detecting and isolating failures of the dual, separated IMU. Special emphasis is given to the detection of multiple, nonconcurrent failures. Digital simulation time histories are presented which show the thresholds obtained and their effectiveness in detecting and isolating sensor failures.
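    The Markov evaluation idea can be illustrated on a toy three-state chain (operational, degraded, failed) rather than the paper's 27-state model; the transition probabilities below are purely illustrative assumptions.

```python
def step(probs, transition):
    # One Markov step: row vector of state probabilities times transition matrix
    n = len(transition)
    return [sum(probs[i] * transition[i][j] for i in range(n)) for j in range(n)]

def reliability(transition, horizon, failed_states):
    # P(system is not in a failed state after `horizon` steps), starting in state 0
    probs = [1.0] + [0.0] * (len(transition) - 1)
    for _ in range(horizon):
        probs = step(probs, transition)
    return 1.0 - sum(probs[s] for s in failed_states)

# Toy chain: 0 = fully operational, 1 = one failure detected and isolated,
# 2 = system failure (absorbing). Per-step probabilities are illustrative.
T = [[0.990, 0.009, 0.001],
     [0.000, 0.950, 0.050],
     [0.000, 0.000, 1.000]]
```

    Reliability as a function of mission time is then just `reliability(T, t, [2])`, which decreases monotonically as the absorbing failure state accumulates probability.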

  3. Stochastic performance modeling and evaluation of obstacle detectability with imaging range sensors

    NASA Technical Reports Server (NTRS)

    Matthies, Larry; Grandjean, Pierrick

    1993-01-01

    Statistical modeling and evaluation of the performance of obstacle detection systems for Unmanned Ground Vehicles (UGVs) is essential for the design, evaluation, and comparison of sensor systems. In this report, we address this issue for imaging range sensors by dividing the evaluation problem into two levels: quality of the range data itself and quality of the obstacle detection algorithms applied to the range data. We review existing models of the quality of range data from stereo vision and AM-CW LADAR, then use these to derive a new model for the quality of a simple obstacle detection algorithm. This model predicts the probability of detecting obstacles and the probability of false alarms, as a function of the size and distance of the obstacle, the resolution of the sensor, and the level of noise in the range data. We evaluate these models experimentally using range data from stereo image pairs of a gravel road with known obstacles at several distances. The results show that the approach is a promising tool for predicting and evaluating the performance of obstacle detection with imaging range sensors.

  4. Quantifying seining detection probability for fishes of Great Plains sand‐bed rivers

    USGS Publications Warehouse

    Mollenhauer, Robert; Logue, Daniel R.; Brewer, Shannon K.

    2018-01-01

    Species detection error (i.e., imperfect and variable detection probability) is an essential consideration when investigators map distributions and interpret habitat associations. When fish detection error that is due to highly variable instream environments needs to be addressed, sand‐bed streams of the Great Plains represent a unique challenge. We quantified seining detection probability for diminutive Great Plains fishes across a range of sampling conditions in two sand‐bed rivers in Oklahoma. Imperfect detection resulted in underestimates of species occurrence using naïve estimates, particularly for less common fishes. Seining detection probability also varied among fishes and across sampling conditions. We observed a quadratic relationship between water depth and detection probability, in which the exact nature of the relationship was species‐specific and dependent on water clarity. Similarly, the direction of the relationship between water clarity and detection probability was species‐specific and dependent on differences in water depth. The relationship between water temperature and detection probability was also species dependent, where both the magnitude and direction of the relationship varied among fishes. We showed how ignoring detection error confounded an underlying relationship between species occurrence and water depth. Despite imperfect and heterogeneous detection, our results support that determining species absence can be accomplished with two to six spatially replicated seine hauls per 200‐m reach under average sampling conditions; however, required effort would be higher under certain conditions. Detection probability was low for the Arkansas River Shiner Notropis girardi, which is federally listed as threatened, and more than 10 seine hauls per 200‐m reach would be required to assess presence across sampling conditions. 
Our model allows scientists to estimate the sampling effort needed to confidently assess species occurrence, which maximizes the use of available resources. Increased implementation of approaches that consider detection error promotes ecological advancement and better-informed conservation and management decisions.

  5. Integrating count and detection-nondetection data to model population dynamics.

    PubMed

    Zipkin, Elise F; Rossman, Sam; Yackulic, Charles B; Wiens, J David; Thorson, James T; Davis, Raymond J; Grant, Evan H Campbell

    2017-06-01

    There is increasing need for methods that integrate multiple data types into a single analytical framework as the spatial and temporal scale of ecological research expands. Current work on this topic primarily focuses on combining capture-recapture data from marked individuals with other data types into integrated population models. Yet, studies of species distributions and trends often rely on data from unmarked individuals across broad scales where local abundance and environmental variables may vary. We present a modeling framework for integrating detection-nondetection and count data into a single analysis to estimate population dynamics, abundance, and individual detection probabilities during sampling. Our dynamic population model assumes that site-specific abundance can change over time according to survival of individuals and gains through reproduction and immigration. The observation process for each data type is modeled by assuming that every individual present at a site has an equal probability of being detected during sampling processes. We examine our modeling approach through a series of simulations illustrating the relative value of count vs. detection-nondetection data under a variety of parameter values and survey configurations. We also provide an empirical example of the model by combining long-term detection-nondetection data (1995-2014) with newly collected count data (2015-2016) from a growing population of Barred Owl (Strix varia) in the Pacific Northwest to examine the factors influencing population abundance over time. Our model provides a foundation for incorporating unmarked data within a single framework, even in cases where sampling processes yield different detection probabilities. This approach will be useful for survey design and to researchers interested in incorporating historical or citizen science data into analyses focused on understanding how demographic rates drive population abundance. 
© 2017 by the Ecological Society of America.
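    The two observation models described above can be sketched for a single site with known local abundance N, assuming (as in the abstract) that every individual present is detected independently with the same probability p; the function and parameter names here are our own.

```python
import math

def count_likelihood(y, n_abund, p):
    # Count data: y ~ Binomial(N, p) given local abundance N
    if y < 0 or y > n_abund:
        return 0.0
    return math.comb(n_abund, y) * p ** y * (1.0 - p) ** (n_abund - y)

def detection_likelihood(w, n_abund, p):
    # Detection-nondetection data: w = 1 if at least one of N individuals is seen
    p_star = 1.0 - (1.0 - p) ** n_abund
    return p_star if w == 1 else 1.0 - p_star
```

    The two data types are linked through the shared latent abundance: the detection-nondetection likelihood is exactly one minus the probability of a zero count, which is what lets both be integrated in a single framework.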

  6. A Search Model for Imperfectly Detected Targets

    NASA Technical Reports Server (NTRS)

    Ahumada, Albert

    2012-01-01

    Under the assumptions that (1) the search region can be divided into N non-overlapping sub-regions that are searched sequentially, (2) the probability of detection is unity if a sub-region is selected, and (3) no information is available to guide the search, there are two extreme-case models. The search can be done perfectly, leading to a uniform distribution over the number of searches required, or the search can be done with no memory, leading to a geometric distribution for the number of searches required with a success probability of 1/N. If the probability of detection P is less than unity, but the search is done otherwise perfectly, the searcher will have to search the N regions repeatedly until detection occurs. The number of searches is thus the sum of two random variables: one is N times the number of full searches (a geometric distribution with success probability P) and the other is the uniform distribution over the integers 1 to N. The first three moments of this distribution were computed, giving the mean, standard deviation, and kurtosis of the distribution as a function of the two parameters. The model was fit to the data presented last year (Ahumada, Billington, & Kaiwi) on the number of searches required to find a single-pixel target on a simulated horizon. The model gave a good fit to the three moments for all three observers.
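    Under one consistent reading of the model above (an assumption on our part: the geometric variable counts failed full passes, starting at zero), the pmf and mean of the number of searches S = N·f + u can be written directly:

```python
def mean_searches(n_regions, p_detect):
    # E[S] = N * E[failed full passes] + E[target position within the final pass]
    return n_regions * (1.0 - p_detect) / p_detect + (n_regions + 1) / 2.0

def searches_pmf(s, n_regions, p_detect):
    # S = N*f + u: f >= 0 failed full passes (geometric, success prob p_detect),
    # u in 1..N the target's slot (uniform); the two are independent
    f, u = divmod(s - 1, n_regions)
    return (1.0 - p_detect) ** f * p_detect / n_regions
```

    When P = 1 the distribution collapses to the uniform extreme case, with mean (N + 1) / 2.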

  7. Assessing environmental DNA detection in controlled lentic systems.

    PubMed

    Moyer, Gregory R; Díaz-Ferguson, Edgardo; Hill, Jeffrey E; Shea, Colin

    2014-01-01

    Little consideration has been given to environmental DNA (eDNA) sampling strategies for rare species. The certainty of species detection relies on understanding false positive and false negative error rates. We used artificial ponds together with logistic regression models to assess the detection of African jewelfish eDNA at varying fish densities (0, 0.32, 1.75, and 5.25 fish/m3). Our objectives were to determine the most effective water stratum for eDNA detection, estimate true and false positive eDNA detection rates, and assess the number of water samples necessary to minimize the risk of false negatives. There were 28 eDNA detections in 324 one-liter water samples collected from four experimental ponds. The best-approximating model indicated that the per-liter probability of eDNA detection was 4.86 times more likely for every 2.53 fish/m3 (1 SD) increase in fish density and 1.67 times less likely for every 1.02 °C (1 SD) increase in water temperature. The best section of the water column for detecting eDNA was the surface and, to a lesser extent, the bottom. Although no false positives were detected, the estimated likely number of false positives in samples from ponds that contained fish averaged 3.62. At high densities of African jewelfish, 3-5 L of water provided a >95% probability for the presence/absence of its eDNA. Conversely, at moderate and low densities, the number of water samples necessary to achieve a >95% probability of eDNA detection approximated 42-73 and >100 L, respectively. Potential biases associated with incomplete detection of eDNA could be alleviated via formal estimation of eDNA detection probabilities under an occupancy modeling framework; alternatively, the filtration of hundreds of liters of water may be required to achieve a high (e.g., 95%) level of certainty that African jewelfish eDNA will be detected at low densities (i.e., <0.32 fish/m3 or 1.75 g/m3).
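    The sample-size logic behind the 3-5 L versus >100 L figures can be made explicit: if each one-liter sample independently detects eDNA with probability p, the cumulative detection probability after n samples is 1 - (1 - p)^n. The sketch below is the standard calculation, not the authors' fitted model:

```python
import math

def samples_for_confidence(p_per_sample, confidence=0.95):
    # Smallest n with cumulative detection 1 - (1 - p)^n >= confidence
    if not 0.0 < p_per_sample <= 1.0:
        raise ValueError("per-sample detection probability must be in (0, 1]")
    if p_per_sample == 1.0:
        return 1
    return math.ceil(math.log(1.0 - confidence) / math.log(1.0 - p_per_sample))
```

    A per-sample probability of 0.5 requires 5 samples for 95% confidence, while a probability of 0.05 requires 59, illustrating how quickly the required effort grows at low densities.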

  8. Analysis of the impact of error detection on computer performance

    NASA Technical Reports Server (NTRS)

    Shin, K. C.; Lee, Y. H.

    1983-01-01

    Conventionally, reliability analyses either assume that a fault/error is detected immediately following its occurrence, or neglect damages caused by latent errors. Though unrealistic, this assumption was imposed in order to avoid the difficulty of determining the respective probabilities that a fault induces an error and the error is then detected in a random amount of time after its occurrence. As a remedy for this problem a model is proposed to analyze the impact of error detection on computer performance under moderate assumptions. Error latency, the time interval between occurrence and the moment of detection, is used to measure the effectiveness of a detection mechanism. This model is used to: (1) predict the probability of producing an unreliable result, and (2) estimate the loss of computation due to fault and/or error.

  9. Understanding environmental DNA detection probabilities: A case study using a stream-dwelling char Salvelinus fontinalis

    USGS Publications Warehouse

    Wilcox, Taylor M; Mckelvey, Kevin S.; Young, Michael K.; Sepulveda, Adam; Shepard, Bradley B.; Jane, Stephen F; Whiteley, Andrew R.; Lowe, Winsor H.; Schwartz, Michael K.

    2016-01-01

    Environmental DNA (eDNA) sampling has emerged as a powerful tool for detecting aquatic animals. Previous research suggests that eDNA methods are substantially more sensitive than traditional sampling. However, the factors influencing eDNA detection and the resulting sampling costs are still not well understood. Here we use multiple experiments to derive independent estimates of eDNA production rates and downstream persistence from brook trout (Salvelinus fontinalis) in streams. We use these estimates to parameterize models comparing the false negative detection rates of eDNA sampling and traditional backpack electrofishing. We find that, using the protocols in this study, eDNA had reasonable detection probabilities at extremely low animal densities (e.g., probability of detection 0.18 at densities of one fish per stream kilometer) and very high detection probabilities at population-level densities (e.g., probability of detection > 0.99 at densities of ≥ 3 fish per 100 m). This is substantially more sensitive than traditional electrofishing for determining the presence of brook trout and may translate into important cost savings when animals are rare. Our findings are consistent with a growing body of literature showing that eDNA sampling is a powerful tool for the detection of aquatic species, particularly those that are rare and difficult to sample using traditional methods.

  10. A Physical Model of Human Skin and Its Application for Search and Rescue

    DTIC Science & Technology

    2009-12-01

    This physical model allows for the development of skin detection algorithms with a high probability of detection (PD) and a low probability of false alarm (PFA). Measurements of various skin colors were collected. The skin detection algorithm developed in this work had a PD as high as 0.95 with a PFA of 0.006. (An NDSI threshold of γ = 0.314 yielded PD = 0.95 with a corresponding PFA = 0.0156.)

  11. A model of human decision making in multiple process monitoring situations

    NASA Technical Reports Server (NTRS)

    Greenstein, J. S.; Rouse, W. B.

    1982-01-01

    Human decision making in multiple-process monitoring situations is considered. It is proposed that human decision making in many such situations can be modeled in terms of the human's detection of process-related events and his allocation of attention among processes once he believes events have occurred. A mathematical model of human event-detection and attention-allocation performance in multiple-process monitoring situations is developed. An assumption made in developing the model is that, in attempting to detect events, the human generates estimates of the probabilities that events have occurred. An elementary pattern recognition technique, discriminant analysis, is used to model the human's generation of these probability estimates. The performance of the model is compared to that of four subjects in a multiple-process monitoring situation requiring allocation of attention among processes.

  12. Investigating the probability of detection of typical cavity shapes through modelling and comparison of geophysical techniques

    NASA Astrophysics Data System (ADS)

    James, P.

    2011-12-01

    With a growing need for housing in the U.K., the government has proposed increased development of brownfield sites. However, old mine workings and natural cavities represent a potential hazard before, during and after construction on such sites, and add further complication to subsurface parameters. Cavities are hence a limitation to certain redevelopment, and their detection is an ever more important consideration. The current standard technique for cavity detection is a borehole grid, which is intrusive, non-continuous, slow and expensive. A new, robust investigation standard for the detection of cavities is sought, and geophysical techniques offer an attractive alternative. Geophysical techniques have previously been used successfully to detect cavities in various geologies, but they still have an uncertain reputation in the engineering industry. Engineers are unsure of the techniques and are inclined to rely on well-known methods rather than utilise new technologies. Bad experiences with geophysics are commonly due to the indiscriminate choice of particular techniques. It is imperative that a geophysical survey is designed with the specific site and target in mind at all times, with the ability and judgement to rule out some, or all, techniques. To this author's knowledge no comparative software exists to aid technique choice. Also, previous modelling software limits the shapes of bodies, and hence typical cavity shapes are not represented. Here, we introduce 3D modelling software (Matlab) which computes and compares the response to various cavity targets from a range of techniques (gravity, gravity gradient, magnetic, magnetic gradient and GPR). Typical near-surface cavity shapes are modelled, including shafts, bellpits, various lining and capping materials, and migrating voids. The probability of cavity detection is assessed in typical subsurface and noise conditions across a range of survey parameters. 
Techniques can be compared and the limits of detection distance assessed. The density of survey points required to achieve a required probability of detection can be calculated. The software aids the discriminating choice of technique, improves survey design, and increases the likelihood of survey success: all factors sought in the engineering industry. As a simple example, the response from magnetometry, gravimetry and gravity gradient techniques above a 3-m-deep, 1-m cube air cavity in limestone was calculated across a 15-m grid. The maximum responses above the cavity are small (amplitudes of 0.018 nT, 0.0013 mGal and 8.3 eotvos respectively), but at typical site noise levels the detection reliability is over 50% for the gravity gradient method on a single survey line. Increasing the number of survey points across the site increases the reliability of detection of the anomaly by the addition of probabilities. We can calculate the probability of detection at different profile spacings to assess the best possible survey design. At 1-m spacing the overall probability of detection by the gravity gradient method is over 90%, and over 60% for magnetometry (at 3-m spacing the probability drops to 32%). The use of modelling in near-surface surveys is a useful tool for assessing the feasibility of a range of techniques to detect subtle signals. Future work will integrate this work with borehole-measured parameters.
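    The "addition of probabilities" across survey points is, in effect, the complement of a joint miss probability. A minimal sketch, assuming independent survey lines (which the abstract's calculation implies):

```python
def combined_detection(per_line_probs):
    # Overall P(detect on at least one line) = 1 - product of per-line miss probs
    miss = 1.0
    for p in per_line_probs:
        miss *= 1.0 - p
    return 1.0 - miss
```

    For example, three independent lines each with a per-line detection probability of 0.32 give a combined probability of about 0.69, showing how denser profile spacing raises the overall probability of detection.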

  13. Evidential reasoning research on intrusion detection

    NASA Astrophysics Data System (ADS)

    Wang, Xianpei; Xu, Hua; Zheng, Sheng; Cheng, Anyu

    2003-09-01

    This paper addresses two fields: the Dempster-Shafer (D-S) theory of evidence and network intrusion detection. It discusses how to apply this form of probabilistic reasoning, as an AI technique, to an Intrusion Detection System (IDS). The paper establishes the application model, describes the new mechanism of reasoning and decision-making, and analyses how to implement the model based on the detection of synscan activities on the network. The results suggest that if rational probability values are assigned at the beginning, the engine can, according to the rules of evidence combination and hierarchical reasoning, compute belief values and finally inform administrators of the nature of the traced activities: intrusions, normal activities or abnormal activities.
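    The evidence-combination step at the heart of such a system is Dempster's rule. The sketch below is an illustration of the standard rule, not the paper's engine: it combines two basic probability assignments whose focal elements are frozensets of hypotheses, renormalizing away conflicting mass.

```python
def dempster_combine(m1, m2):
    # Dempster's rule of combination for two basic probability assignments;
    # keys are frozensets of hypotheses, values are masses summing to 1
    combined = {}
    conflict = 0.0
    for a, wa in m1.items():
        for b, wb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + wa * wb
            else:
                conflict += wa * wb
    if conflict >= 1.0:
        raise ValueError("total conflict: the two sources cannot be combined")
    return {focal: mass / (1.0 - conflict) for focal, mass in combined.items()}
```

    Combining two sources that each partially support "intrusion" concentrates belief on it, which is the hierarchical-reasoning behavior the abstract describes.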

  14. Determining carnivore habitat use in a rubber/forest landscape in Brazil using multispecies occupancy models

    PubMed Central

    Flesher, Kevin M.; Lindell, Catherine; Vega de Oliveira, Téo; Maurer, Brian A.

    2018-01-01

    Understanding the factors that influence the presence and distribution of carnivores in human-dominated agricultural landscapes is one of the main challenges for biodiversity conservation, especially in landscapes where setting aside large protected areas is not feasible. Habitat use models of carnivore communities in rubber plantations are lacking despite the critical roles carnivores play in structuring ecosystems and the increasing expansion of rubber plantations. We investigated the habitat use of a mammalian carnivore community within a 4,200-ha rubber plantation/forest landscape in Bahia, Brazil. We placed two different brands of camera traps in a 90-site grid. We used a multispecies occupancy model to determine the probabilities of habitat use by each species and the effect of different brands of camera traps on their detection probabilities. Species showed significant differences in habitat use with domestic dogs (Canis familiaris) and crab-eating foxes (Cerdocyon thous) having higher probabilities of using rubber groves and coatis (Nasua nasua) having a higher probability of using forest. The moderate level of captures and low detection probabilities (≤ 0.1) of tayras (Eira barbara) and wildcats (Leopardus spp.) precluded a precise estimation of habitat use probabilities using the multispecies occupancy model. The different brands of camera traps had a significant effect on the detection probability of all species. Given that the carnivore community has persisted in this 70-year-old landscape, the results show the potential of rubber/forest landscapes to provide for the long-term conservation of carnivore communities in the Atlantic forest, especially in mosaics with 30–40% forest cover and guard patrolling systems. The results also provide insights for mitigating the impact of rubber production on biodiversity. PMID:29659594

  15. Multivariate Bayesian modeling of known and unknown causes of events--an application to biosurveillance.

    PubMed

    Shen, Yanna; Cooper, Gregory F

    2012-09-01

    This paper investigates Bayesian modeling of known and unknown causes of events in the context of disease-outbreak detection. We introduce a multivariate Bayesian approach that models multiple evidential features of every person in the population. This approach models and detects (1) known diseases (e.g., influenza and anthrax) by using informative prior probabilities and (2) unknown diseases (e.g., a new, highly contagious respiratory virus that has never been seen before) by using relatively non-informative prior probabilities. We report the results of simulation experiments which support that this modeling method can improve the detection of new disease outbreaks in a population. A contribution of this paper is that it introduces a multivariate Bayesian approach for jointly modeling both known and unknown causes of events. Such modeling has general applicability in domains where the space of known causes is incomplete. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  16. Probability, not linear summation, mediates the detection of concentric orientation-defined textures.

    PubMed

    Schmidtmann, Gunnar; Jennings, Ben J; Bell, Jason; Kingdom, Frederick A A

    2015-01-01

    Previous studies investigating signal integration in circular Glass patterns have concluded that the information in these patterns is linearly summed across the entire display for detection. Here we test whether an alternative form of summation, probability summation (PS), modeled under the assumptions of Signal Detection Theory (SDT), can be rejected as a model of Glass pattern detection. PS under SDT alone predicts that the exponent β of the Quick- (or Weibull-) fitted psychometric function should decrease with increasing signal area. We measured spatial integration in circular, radial, spiral, and parallel Glass patterns, as well as comparable patterns composed of Gabors instead of dot pairs. We measured the signal-to-noise ratio required for detection as a function of the size of the area containing signal, with the remaining area containing dot-pair or Gabor-orientation noise. Contrary to some previous studies, we found that the strength of summation never reached values close to linear summation for any stimuli. More importantly, the exponent β systematically decreased with signal area, as predicted by PS under SDT. We applied a model for PS under SDT and found that it gave a good account of the data. We conclude that probability summation is the most likely basis for the detection of circular, radial, spiral, and parallel orientation-defined textures.

  17. Large-scale monitoring of shorebird populations using count data and N-mixture models: Black Oystercatcher (Haematopus bachmani) surveys by land and sea

    USGS Publications Warehouse

    Lyons, James E.; Royle, J. Andrew; Thomas, Susan M.; Elliott-Smith, Elise; Evenson, Joseph R.; Kelly, Elizabeth G.; Milner, Ruth L.; Nysewander, David R.; Andres, Brad A.

    2012-01-01

    Large-scale monitoring of bird populations is often based on count data collected across spatial scales that may include multiple physiographic regions and habitat types. Monitoring at large spatial scales may require multiple survey platforms (e.g., from boats and land when monitoring coastal species) and multiple survey methods. It becomes especially important to explicitly account for detection probability when analyzing count data that have been collected using multiple survey platforms or methods. We evaluated a new analytical framework, N-mixture models, to estimate actual abundance while accounting for multiple detection biases. During May 2006, we made repeated counts of Black Oystercatchers (Haematopus bachmani) from boats in the Puget Sound area of Washington (n = 55 sites) and from land along the coast of Oregon (n = 56 sites). We used a Bayesian analysis of N-mixture models to (1) assess detection probability as a function of environmental and survey covariates and (2) estimate total Black Oystercatcher abundance during the breeding season in the two regions. Probability of detecting individuals during boat-based surveys was 0.75 (95% credible interval: 0.42–0.91) and was not influenced by tidal stage. Detection probability from surveys conducted on foot was 0.68 (0.39–0.90); the latter was not influenced by fog, wind, or number of observers but was ~35% lower during rain. The estimated population size was 321 birds (262–511) in Washington and 311 (276–382) in Oregon. N-mixture models provide a flexible framework for modeling count data and covariates in large-scale bird monitoring programs designed to understand population change.

  18. Factors affecting detectability of river otters during sign surveys

    USGS Publications Warehouse

    Jeffress, Mackenzie R.; Paukert, Craig P.; Sandercock, Brett K.; Gipson, Philip S.

    2011-01-01

    Sign surveys are commonly used to study and monitor wildlife species but may be flawed when surveys are conducted only once and cover short distances, which can lead to a lack of accountability for false absences. Multiple observers surveyed for river otter (Lontra canadensis) scat and tracks along stream and reservoir shorelines at 110 randomly selected sites in eastern Kansas from January to April 2008 and 2009 to determine if detection probability differed among substrates, sign types, observers, survey lengths, and near access points. We estimated detection probabilities (p) of river otters using occupancy models in Program PRESENCE. Mean detection probability for a 400-m survey was highest in mud substrates (p = 0.60) and lowest in snow (p = 0.18) and leaf litter substrates (p = 0.27). Scat had a higher detection probability (p = 0.53) than tracks (p = 0.18), and experienced observers had higher detection probabilities (p < 0.71) than novice observers (p < 0.55). Detection probabilities increased almost 3-fold as survey length increased from 200 m to 1,000 m, and otter sign was not concentrated near access points. After accounting for imperfect detection, our estimates of otter site occupancy based on a 400-m survey increased >3-fold, providing further evidence of the potential negative bias that can occur in estimates from sign surveys when imperfect detection is not addressed. Our study identifies areas for improvement in sign survey methodologies and results are applicable for sign surveys commonly used for many species across a range of habitats.

  19. A Brownian Bridge Movement Model to Track Mobile Targets

    DTIC Science & Technology

    2016-09-01

    breakout of Chinese forces in the South China Sea. Probability heat maps, depicting the probability of a target location at discrete times, are...achieve a higher probability of detection, it is more effective to have sensors cover a wider area at fewer discrete points in time than to have a...greater number of discrete looks using sensors covering smaller areas. 14. SUBJECT TERMS Brownian bridge movement models, unmanned sensors

  20. Variation in the standard deviation of the lure rating distribution: Implications for estimates of recollection probability.

    PubMed

    Dopkins, Stephen; Varner, Kaitlin; Hoyer, Darin

    2017-10-01

    In word recognition, semantic priming of test words increased the false-alarm rate and the mean of confidence ratings to lures. Such priming also increased the standard deviation of confidence ratings to lures and the slope of the z-ROC function, suggesting that the priming increased the standard deviation of the lure evidence distribution. The Unequal Variance Signal Detection (UVSD) model interpreted the priming as increasing the standard deviation of the lure evidence distribution. Without additional parameters, the Dual Process Signal Detection (DPSD) model could accommodate the results only by fitting the data for related and unrelated primes separately, interpreting the priming, implausibly, as decreasing the probability of target recollection. With an additional parameter for the probability of false (lure) recollection, the model could fit the data for related and unrelated primes together, interpreting the priming as increasing the probability of false recollection. These results suggest that DPSD estimates of target recollection probability will decrease with increases in the lure confidence/evidence standard deviation unless a parameter is included for false recollection. Unfortunately, the size of a given lure confidence/evidence standard deviation relative to other possible lure confidence/evidence standard deviations is often unspecified by context. Hence the model often has no way of estimating false recollection probability and thereby correcting its estimates of target recollection probability.
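
    The link this record turns on, that a larger lure-evidence standard deviation steepens the z-ROC, can be checked directly: under Gaussian evidence distributions the z-ROC slope equals sigma_lure / sigma_target. The parameter values below are arbitrary, chosen only to demonstrate the identity.

```python
import numpy as np
from scipy.stats import norm

def zroc_slope(mu_target=1.0, sd_target=1.0, sd_lure=1.0, criteria=None):
    """Slope of the z-transformed ROC implied by a signal-detection model:
    targets ~ N(mu_target, sd_target), lures ~ N(0, sd_lure)."""
    if criteria is None:
        criteria = np.linspace(-1.0, 2.0, 7)  # a sweep of decision criteria
    z_hit = norm.ppf(norm.sf(criteria, loc=mu_target, scale=sd_target))
    z_fa = norm.ppf(norm.sf(criteria, loc=0.0, scale=sd_lure))
    return np.polyfit(z_fa, z_hit, 1)[0]      # z-ROC is exactly linear here

equal = zroc_slope(sd_lure=1.0)    # equal-variance case: slope 1
primed = zroc_slope(sd_lure=1.25)  # inflated lure SD: slope 1.25
```

    The fit is exact because zH = (mu_target + sd_lure * zF) / sd_target is linear in zF, so the slope recovers sd_lure / sd_target to machine precision.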

  1. Estimating Lion Abundance using N-mixture Models for Social Species

    PubMed Central

    Belant, Jerrold L.; Bled, Florent; Wilton, Clay M.; Fyumagwa, Robert; Mwampeta, Stanslaus B.; Beyer, Dean E.

    2016-01-01

    Declining populations of large carnivores worldwide, and the complexities of managing human-carnivore conflicts, require accurate population estimates of large carnivores to promote their long-term persistence through well-informed management. We used N-mixture models to estimate lion (Panthera leo) abundance from call-in and track surveys in southeastern Serengeti National Park, Tanzania. Because of potential habituation to broadcast calls and social behavior, we developed a hierarchical observation process within the N-mixture model conditioning lion detectability on their group response to call-ins and individual detection probabilities. We estimated 270 lions (95% credible interval = 170–551) using call-ins but were unable to estimate lion abundance from track data. We found a weak negative relationship between predicted track density and predicted lion abundance from the call-in surveys. Luminosity was negatively correlated with individual detection probability during call-in surveys. Lion abundance and track density were influenced by landcover, but the directions of the corresponding effects were undetermined. N-mixture models allowed us to incorporate multiple parameters (e.g., landcover, luminosity, observer effect) influencing lion abundance and probability of detection directly into abundance estimates. We suggest that N-mixture models employing a hierarchical observation process can be used to estimate abundance of other social, herding, and grouping species. PMID:27786283

  2. Estimating Lion Abundance using N-mixture Models for Social Species.

    PubMed

    Belant, Jerrold L; Bled, Florent; Wilton, Clay M; Fyumagwa, Robert; Mwampeta, Stanslaus B; Beyer, Dean E

    2016-10-27

    Declining populations of large carnivores worldwide, and the complexities of managing human-carnivore conflicts, require accurate population estimates of large carnivores to promote their long-term persistence through well-informed management. We used N-mixture models to estimate lion (Panthera leo) abundance from call-in and track surveys in southeastern Serengeti National Park, Tanzania. Because of potential habituation to broadcast calls and social behavior, we developed a hierarchical observation process within the N-mixture model conditioning lion detectability on their group response to call-ins and individual detection probabilities. We estimated 270 lions (95% credible interval = 170-551) using call-ins but were unable to estimate lion abundance from track data. We found a weak negative relationship between predicted track density and predicted lion abundance from the call-in surveys. Luminosity was negatively correlated with individual detection probability during call-in surveys. Lion abundance and track density were influenced by landcover, but the directions of the corresponding effects were undetermined. N-mixture models allowed us to incorporate multiple parameters (e.g., landcover, luminosity, observer effect) influencing lion abundance and probability of detection directly into abundance estimates. We suggest that N-mixture models employing a hierarchical observation process can be used to estimate abundance of other social, herding, and grouping species.

  3. Modeling Systematic Change in Stopover Duration Does Not Improve Bias in Trends Estimated from Migration Counts.

    PubMed

    Crewe, Tara L; Taylor, Philip D; Lepage, Denis

    2015-01-01

    The use of counts of unmarked migrating animals to monitor long term population trends assumes independence of daily counts and a constant rate of detection. However, migratory stopovers often last days or weeks, violating the assumption of count independence. Further, a systematic change in stopover duration will result in a change in the probability of detecting individuals once, but also in the probability of detecting individuals on more than one sampling occasion. We tested how variation in stopover duration influenced accuracy and precision of population trends by simulating migration count data with known constant rate of population change and by allowing daily probability of survival (an index of stopover duration) to remain constant, or to vary randomly, cyclically, or increase linearly over time by various levels. Using simulated datasets with a systematic increase in stopover duration, we also tested whether any resulting bias in population trend could be reduced by modeling the underlying source of variation in detection, or by subsampling data to every three or five days to reduce the incidence of recounting. Mean bias in population trend did not differ significantly from zero when stopover duration remained constant or varied randomly over time, but bias and the detection of false trends increased significantly with a systematic increase in stopover duration. Importantly, an increase in stopover duration over time resulted in a compounding effect on counts due to the increased probability of detection and of recounting on subsequent sampling occasions. Under this scenario, bias in population trend could not be modeled using a covariate for stopover duration alone. Rather, to improve inference drawn about long term population change using counts of unmarked migrants, analyses must include a covariate for stopover duration, as well as incorporate sampling modifications (e.g., subsampling) to reduce the probability that individuals will be detected on more than one occasion.
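
    A small simulation in the spirit of the one described, with every rate invented for illustration, shows the mechanism: when daily "survival" phi (an index of stopover duration) rises over time, birds are recounted on more days and the estimated trend is biased upward.

```python
import numpy as np

def simulate_totals(phis, birds0=1000, trend=-0.02, p_detect=0.3, seed=0):
    """Yearly migration-count totals. Each bird's stopover length is
    geometric with daily survival phi; it is then counted on roughly
    Binomial(stay, p_detect) days, so recounting is possible."""
    rng = np.random.default_rng(seed)
    totals = []
    for t, phi in enumerate(phis):
        n_birds = rng.poisson(birds0 * np.exp(trend * t))   # true decline
        stay = rng.geometric(1.0 - phi, size=n_birds)       # stopover (days)
        totals.append(rng.binomial(stay, p_detect).sum())
    return np.array(totals)

years = np.arange(10)
constant = simulate_totals([0.7] * 10)           # stopover duration constant
rising = simulate_totals(0.6 + 0.03 * years)     # stopover duration increasing
slope_constant = np.polyfit(years, np.log(constant), 1)[0]  # near true -0.02
slope_rising = np.polyfit(years, np.log(rising), 1)[0]      # biased upward
```

    With constant phi the log-linear slope recovers the true trend; with rising phi the same declining population yields a spuriously positive slope, which is the compounding effect the abstract describes.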

  4. Environmental DNA (eDNA) Detection Probability Is Influenced by Seasonal Activity of Organisms.

    PubMed

    de Souza, Lesley S; Godwin, James C; Renshaw, Mark A; Larson, Eric

    2016-01-01

    Environmental DNA (eDNA) holds great promise for conservation applications like the monitoring of invasive or imperiled species, yet this emerging technique requires ongoing testing in order to determine the contexts over which it is effective. For example, little research to date has evaluated how seasonality of organism behavior or activity may influence detection probability of eDNA. We applied eDNA to survey for two highly imperiled species endemic to the upper Black Warrior River basin in Alabama, US: the Black Warrior Waterdog (Necturus alabamensis) and the Flattened Musk Turtle (Sternotherus depressus). Importantly, these species have contrasting patterns of seasonal activity, with N. alabamensis more active in the cool season (October-April) and S. depressus more active in the warm season (May-September). We surveyed sites historically occupied by these species across cool and warm seasons over two years with replicated eDNA water samples, which were analyzed in the laboratory using species-specific quantitative PCR (qPCR) assays. We then used occupancy estimation with detection probability modeling to evaluate both the effects of landscape attributes on organism presence and season of sampling on detection probability of eDNA. Importantly, we found that season strongly affected eDNA detection probability for both species, with N. alabamensis having higher eDNA detection probabilities during the cool season and S. depressus having higher eDNA detection probabilities during the warm season. These results illustrate the influence of organismal behavior or activity on eDNA detection in the environment and identify an important role for basic natural history in designing eDNA monitoring programs.

  5. Environmental DNA (eDNA) Detection Probability Is Influenced by Seasonal Activity of Organisms

    PubMed Central

    de Souza, Lesley S.; Godwin, James C.; Renshaw, Mark A.; Larson, Eric

    2016-01-01

    Environmental DNA (eDNA) holds great promise for conservation applications like the monitoring of invasive or imperiled species, yet this emerging technique requires ongoing testing in order to determine the contexts over which it is effective. For example, little research to date has evaluated how seasonality of organism behavior or activity may influence detection probability of eDNA. We applied eDNA to survey for two highly imperiled species endemic to the upper Black Warrior River basin in Alabama, US: the Black Warrior Waterdog (Necturus alabamensis) and the Flattened Musk Turtle (Sternotherus depressus). Importantly, these species have contrasting patterns of seasonal activity, with N. alabamensis more active in the cool season (October-April) and S. depressus more active in the warm season (May-September). We surveyed sites historically occupied by these species across cool and warm seasons over two years with replicated eDNA water samples, which were analyzed in the laboratory using species-specific quantitative PCR (qPCR) assays. We then used occupancy estimation with detection probability modeling to evaluate both the effects of landscape attributes on organism presence and season of sampling on detection probability of eDNA. Importantly, we found that season strongly affected eDNA detection probability for both species, with N. alabamensis having higher eDNA detection probabilities during the cool season and S. depressus having higher eDNA detection probabilities during the warm season. These results illustrate the influence of organismal behavior or activity on eDNA detection in the environment and identify an important role for basic natural history in designing eDNA monitoring programs. PMID:27776150

  6. A spatial model of bird abundance as adjusted for detection probability

    USGS Publications Warehouse

    Gorresen, P.M.; Mcmillan, G.P.; Camp, R.J.; Pratt, T.K.

    2009-01-01

    Modeling the spatial distribution of animals can be complicated by spatial and temporal effects (i.e. spatial autocorrelation and trends in abundance over time) and other factors such as imperfect detection probabilities and observation-related nuisance variables. Recent advances in modeling have demonstrated various approaches that handle most of these factors but which require a degree of sampling effort (e.g. replication) not available to many field studies. We present a two-step approach that addresses these challenges to spatially model species abundance. Habitat, spatial and temporal variables were handled with a Bayesian approach which facilitated modeling hierarchically structured data. Predicted abundance was subsequently adjusted to account for imperfect detection and the area effectively sampled for each species. We provide examples of our modeling approach for two endemic Hawaiian nectarivorous honeycreepers: 'i'iwi Vestiaria coccinea and 'apapane Himatione sanguinea. © 2009 Ecography.

  7. Bayesian network models for error detection in radiotherapy plans

    NASA Astrophysics Data System (ADS)

    Kalet, Alan M.; Gennari, John H.; Ford, Eric C.; Phillips, Mark H.

    2015-04-01

    The purpose of this study is to design and develop a probabilistic network for detecting errors in radiotherapy plans for use at the time of initial plan verification. Our group has initiated a multi-pronged approach to reduce these errors. We report on our development of Bayesian models of radiotherapy plans. Bayesian networks consist of joint probability distributions that define the probability of one event, given some set of other known information. Using the networks, we find the probability of obtaining certain radiotherapy parameters, given a set of initial clinical information. A low probability in a propagated network then corresponds to potential errors to be flagged for investigation. To build our networks we first interviewed medical physicists and other domain experts to identify the relevant radiotherapy concepts and their associated interdependencies and to construct a network topology. Next, to populate the network’s conditional probability tables, we used the Hugin Expert software to learn parameter distributions from a subset of de-identified data derived from a radiation oncology based clinical information database system. These data represent 4990 unique prescription cases over a 5 year period. Under test case scenarios with approximately 1.5% introduced error rates, network performance produced areas under the ROC curve of 0.88, 0.98, and 0.89 for the lung, brain and female breast cancer error detection networks, respectively. Comparison of the brain network to human experts performance (AUC of 0.90 ± 0.01) shows the Bayes network model performs better than domain experts under the same test conditions. Our results demonstrate the feasibility and effectiveness of comprehensive probabilistic models as part of decision support systems for improved detection of errors in initial radiotherapy plan verification procedures.
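
    A toy version of the flagging logic can make the approach concrete: look up the probability of a plan parameter given the clinical context, and flag it when that probability is low. The sites, dose schemes, probabilities, and threshold below are entirely invented (the study learned its conditional probability tables from ~5,000 prescriptions using Hugin Expert software), so treat this as a sketch of the decision rule only.

```python
# Hypothetical conditional probability table: P(prescribed dose | treatment site).
# Values are illustrative, not clinical.
cpt_dose = {
    "lung":  {"60Gy/30fx": 0.55, "45Gy/15fx": 0.40, "20Gy/5fx": 0.05},
    "brain": {"60Gy/30fx": 0.70, "30Gy/10fx": 0.25, "20Gy/5fx": 0.05},
}

def flag_plan(site, dose, threshold=0.10):
    """Flag a plan parameter whose probability, given the clinical context,
    falls below the threshold -- a candidate entry error to investigate."""
    p = cpt_dose.get(site, {}).get(dose, 0.0)
    return p < threshold, p

suspicious, p = flag_plan("lung", "20Gy/5fx")  # rare combination -> flagged
```

    In a full network the conditioning set would include many interdependent parameters rather than a single site variable, but the propagate-then-threshold pattern is the same.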

  8. Surveying Europe's Only Cave-Dwelling Chordate Species (Proteus anguinus) Using Environmental DNA.

    PubMed

    Vörös, Judit; Márton, Orsolya; Schmidt, Benedikt R; Gál, Júlia Tünde; Jelić, Dušan

    2017-01-01

    In surveillance of subterranean fauna, especially in the case of rare or elusive aquatic species, traditional techniques used for epigean species are often not feasible. We developed a non-invasive survey method based on environmental DNA (eDNA) to detect the presence of the red-listed cave-dwelling amphibian, Proteus anguinus, in the caves of the Dinaric Karst. We tested the method in fifteen caves in Croatia, from which the species was previously recorded or expected to occur. We successfully confirmed the presence of P. anguinus from ten caves and detected the species for the first time in five others. Using a hierarchical occupancy model we compared the availability and detection probability of eDNA of two water sampling methods, filtration and precipitation. The statistical analysis showed that both availability and detection probability depended on the method and estimates for both probabilities were higher using filter samples than for precipitation samples. Combining reliable field and laboratory methods with robust statistical modeling will give the best estimates of species occurrence.

  9. A H-infinity Fault Detection and Diagnosis Scheme for Discrete Nonlinear System Using Output Probability Density Estimation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang Yumin; Lum, Kai-Yew; Wang Qingguo

    In this paper, an H-infinity fault detection and diagnosis (FDD) scheme for a class of discrete nonlinear system faults using output probability density estimation is presented. Unlike classical FDD problems, the measured output of the system is viewed as a stochastic process and its square-root probability density function (PDF) is modeled with B-spline functions, which leads to a deterministic space-time dynamic model including nonlinearities and uncertainties. A weighted mean value is given as an integral function of the square-root PDF along the space direction, which yields a function of time only that can be used to construct a residual signal. Thus, the classical nonlinear filter approach can be used to detect and diagnose faults in the system. A feasible detection criterion is obtained first, and a new H-infinity adaptive fault diagnosis algorithm is further investigated to estimate the fault. A simulation example is given to demonstrate the effectiveness of the proposed approaches.

  10. A H-infinity Fault Detection and Diagnosis Scheme for Discrete Nonlinear System Using Output Probability Density Estimation

    NASA Astrophysics Data System (ADS)

    Zhang, Yumin; Wang, Qing-Guo; Lum, Kai-Yew

    2009-03-01

    In this paper, an H-infinity fault detection and diagnosis (FDD) scheme for a class of discrete nonlinear system faults using output probability density estimation is presented. Unlike classical FDD problems, the measured output of the system is viewed as a stochastic process and its square-root probability density function (PDF) is modeled with B-spline functions, which leads to a deterministic space-time dynamic model including nonlinearities and uncertainties. A weighted mean value is given as an integral function of the square-root PDF along the space direction, which yields a function of time only that can be used to construct a residual signal. Thus, the classical nonlinear filter approach can be used to detect and diagnose faults in the system. A feasible detection criterion is obtained first, and a new H-infinity adaptive fault diagnosis algorithm is further investigated to estimate the fault. A simulation example is given to demonstrate the effectiveness of the proposed approaches.

  11. Impact of coverage on the reliability of a fault tolerant computer

    NASA Technical Reports Server (NTRS)

    Bavuso, S. J.

    1975-01-01

    A mathematical reliability model is established for a reconfigurable fault tolerant avionic computer system utilizing state-of-the-art computers. System reliability is studied in light of the coverage probabilities associated with the first and second independent hardware failures. Coverage models are presented as a function of detection, isolation, and recovery probabilities. Upper and lower bounds are established for the coverage probabilities and the method for computing values for the coverage probabilities is investigated. Further, an architectural variation is proposed which is shown to enhance coverage.
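
    The role of coverage can be made concrete with a textbook duplex model: coverage is the product of the detection, isolation, and recovery probabilities, and the system survives its first failure only when that failure is covered. The failure rate, probabilities, and the specific closed form below are standard illustrations, not the report's model.

```python
import math

def coverage(p_detect, p_isolate, p_recover):
    """Coverage: probability a failure is detected, isolated, AND recovered."""
    return p_detect * p_isolate * p_recover

def duplex_reliability(lam, c, t):
    """Two active units with per-unit exponential failure rate lam.
    The system survives the first failure only with coverage probability c:
    R(t) = e^(-2*lam*t) + 2*c*(e^(-lam*t) - e^(-2*lam*t)).
    c = 1 recovers the perfect-coverage parallel system; c = 0 degrades
    the pair to a series system."""
    return math.exp(-2 * lam * t) + 2 * c * (math.exp(-lam * t) - math.exp(-2 * lam * t))

c = coverage(0.99, 0.98, 0.97)
r = duplex_reliability(lam=1e-4, c=c, t=1000.0)
```

    Even a few points of lost coverage dominate the unreliability of a redundant system, which is why the report bounds the coverage probabilities rather than assuming them perfect.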

  12. N-mix for fish: estimating riverine salmonid habitat selection via N-mixture models

    USGS Publications Warehouse

    Som, Nicholas A.; Perry, Russell W.; Jones, Edward C.; De Juilio, Kyle; Petros, Paul; Pinnix, William D.; Rupert, Derek L.

    2018-01-01

    Models that formulate mathematical linkages between fish use and habitat characteristics are applied for many purposes. For riverine fish, these linkages are often cast as resource selection functions with variables including depth and velocity of water and distance to nearest cover. Ecologists are now recognizing the role that detection plays in observing organisms, and failure to account for imperfect detection can lead to spurious inference. Herein, we present a flexible N-mixture model to associate habitat characteristics with the abundance of riverine salmonids that simultaneously estimates detection probability. Our formulation has the added benefits of accounting for demographic variation and can generate probabilistic statements regarding intensity of habitat use. In addition to the conceptual benefits, model application to data from the Trinity River, California, yields interesting results. Detection was estimated to vary among surveyors, but there was little spatial or temporal variation. Additionally, a weaker effect of water depth on resource selection is estimated than that reported by previous studies not accounting for detection probability. N-mixture models show great promise for applications to riverine resource selection.

  13. Estimating site occupancy and detection probability parameters for meso- and large mammals in a coastal ecosystem

    USGS Publications Warehouse

    O'Connell, Allan F.; Talancy, Neil W.; Bailey, Larissa L.; Sauer, John R.; Cook, Robert; Gilbert, Andrew T.

    2006-01-01

    Large-scale, multispecies monitoring programs are widely used to assess changes in wildlife populations but they often assume constant detectability when documenting species occurrence. This assumption is rarely met in practice because animal populations vary across time and space. As a result, detectability of a species can be influenced by a number of physical, biological, or anthropogenic factors (e.g., weather, seasonality, topography, biological rhythms, sampling methods). To evaluate some of these influences, we estimated site occupancy rates using species-specific detection probabilities for meso- and large terrestrial mammal species on Cape Cod, Massachusetts, USA. We used model selection to assess the influence of different sampling methods and major environmental factors on our ability to detect individual species. Remote cameras detected the most species (9), followed by cubby boxes (7) and hair traps (4) over a 13-month period. Estimated site occupancy rates were similar among sampling methods for most species when detection probabilities exceeded 0.15, but we question estimates obtained from methods with detection probabilities between 0.05 and 0.15, and we consider methods with lower probabilities unacceptable for occupancy estimation and inference. Estimated detection probabilities can be used to accommodate variation in sampling methods, which allows for comparison of monitoring programs using different protocols. Vegetation and seasonality produced species-specific differences in detectability and occupancy, but differences were not consistent within or among species, which suggests that our results should be considered in the context of local habitat features and life history traits for the target species. We believe that site occupancy is a useful state variable and suggest that monitoring programs for mammals using occupancy data consider detectability prior to making inferences about species distributions or population change.

  14. Using occupancy models to understand the distribution of an amphibian pathogen, Batrachochytrium dendrobatidis

    USGS Publications Warehouse

    Adams, Michael J.; Chelgren, Nathan; Reinitz, David M.; Cole, Rebecca A.; Rachowicz, L.J.; Galvan, Stephanie; Mccreary, Brome; Pearl, Christopher A.; Bailey, Larissa L.; Bettaso, Jamie B.; Bull, Evelyn L.; Leu, Matthias

    2010-01-01

    Batrachochytrium dendrobatidis is a fungal pathogen that is receiving attention around the world for its role in amphibian declines. Study of its occurrence patterns is hampered by false negatives: the failure to detect the pathogen when it is present. Occupancy models are a useful but currently underutilized tool for analyzing detection data when the probability of detecting a species is <1. We use occupancy models to evaluate hypotheses concerning the occurrence and prevalence of B. dendrobatidis and discuss how this application differs from a conventional occupancy approach. We found that the probability of detecting the pathogen, conditional on presence of the pathogen in the anuran population, was related to amphibian development stage, day of the year, elevation, and human activities. Batrachochytrium dendrobatidis was found throughout our study area but was only estimated to occur in 53.4% of 78 populations of native amphibians and 66.4% of 40 populations of nonnative Rana catesbeiana tested. We found little evidence to support any spatial hypotheses concerning the probability that the pathogen occurs in a population, but did find evidence of some taxonomic variation. We discuss the interpretation of occupancy model parameters, when, unlike a conventional occupancy application, the number of potential samples or observations is finite.

  15. Testing the consistency of wildlife data types before combining them: the case of camera traps and telemetry.

    PubMed

    Popescu, Viorel D; Valpine, Perry; Sweitzer, Rick A

    2014-04-01

    Wildlife data gathered by different monitoring techniques are often combined to estimate animal density. However, methods to check whether different types of data provide consistent information (i.e., can information from one data type be used to predict responses in the other?) before combining them are lacking. We used generalized linear models and generalized linear mixed-effects models to relate camera trap probabilities for marked animals to independent space use from telemetry relocations using 2 years of data for fishers (Pekania pennanti) as a case study. We evaluated (1) camera trap efficacy by estimating how camera detection probabilities are related to nearby telemetry relocations and (2) whether home range utilization density estimated from telemetry data adequately predicts camera detection probabilities, which would indicate consistency of the two data types. The number of telemetry relocations within 250 and 500 m from camera traps predicted detection probability well. For the same number of relocations, females were more likely to be detected during the first year. During the second year, all fishers were more likely to be detected during the fall/winter season. Models predicting camera detection probability and photo counts solely from telemetry utilization density had the best or nearly best Akaike Information Criterion (AIC), suggesting that telemetry and camera traps provide consistent information on space use. Given the same utilization density, males were more likely to be photo-captured due to larger home ranges and higher movement rates. Although methods that combine data types (spatially explicit capture-recapture) make simple assumptions about home range shapes, it is reasonable to conclude that in our case, camera trap data do reflect space use in a manner consistent with telemetry data. However, differences between the 2 years of data suggest that camera efficacy is not fully consistent across ecological conditions and make the case for integrating other sources of space-use data.

  16. Comparison of methods for estimating density of forest songbirds from point counts

    Treesearch

    Jennifer L. Reidy; Frank R. Thompson; J. Wesley. Bailey

    2011-01-01

    New analytical methods have been promoted for estimating the probability of detection and density of birds from count data but few studies have compared these methods using real data. We compared estimates of detection probability and density from distance and time-removal models and survey protocols based on 5- or 10-min counts and outer radii of 50 or 100 m. We...

  17. Probability theory for 3-layer remote sensing radiative transfer model: univariate case.

    PubMed

    Ben-David, Avishai; Davidson, Charles E

    2012-04-23

    A probability model for a 3-layer radiative transfer model (foreground layer, cloud layer, background layer, and an external source at the end of line of sight) has been developed. The 3-layer model is fundamentally important as the primary physical model in passive infrared remote sensing. The probability model is described by the Johnson family of distributions that are used as a fit for theoretically computed moments of the radiative transfer model. From the Johnson family we use the SU distribution that can address a wide range of skewness and kurtosis values (in addition to addressing the first two moments, mean and variance). In the limit, SU can also describe lognormal and normal distributions. With the probability model one can evaluate the potential for detecting a target (vapor cloud layer), the probability of observing thermal contrast, and evaluate performance (receiver operating characteristics curves) in clutter-noise limited scenarios. This is (to our knowledge) the first probability model for the 3-layer remote sensing geometry that treats all parameters as random variables and includes higher-order statistics. © 2012 Optical Society of America
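
    SciPy exposes the same Johnson SU family, so the two properties this record relies on can be checked in a few lines: the shape parameters reach nonzero skewness, and in the appropriate limit SU collapses to a normal distribution. All parameter values below are arbitrary.

```python
import numpy as np
from scipy.stats import johnsonsu, norm

# Johnson SU shape parameters: 'a' drives skewness, 'b' controls tail weight.
skew = float(johnsonsu(a=-1.0, b=1.5).stats(moments="s"))  # nonzero skew

# With a = 0 and b large (scale matched to b), SU approaches the standard
# normal: X = scale * sinh(Z / b) ~ Z when scale = b -> infinity.
x = np.linspace(-3.0, 3.0, 13)
gap = float(np.max(np.abs(johnsonsu(a=0.0, b=30.0, scale=30.0).cdf(x) - norm.cdf(x))))
```

    The four SU parameters (two shape, location, scale) are what let the authors match the first four theoretical moments of the radiative transfer model rather than just mean and variance.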

  18. Statistical approaches to the analysis of point count data: A little extra information can go a long way

    USGS Publications Warehouse

    Farnsworth, G.L.; Nichols, J.D.; Sauer, J.R.; Fancy, S.G.; Pollock, K.H.; Shriner, S.A.; Simons, T.R.; Ralph, C. John; Rich, Terrell D.

    2005-01-01

    Point counts are a standard sampling procedure for many bird species, but lingering concerns still exist about the quality of information produced from the method. It is well known that variation in observer ability and environmental conditions can influence the detection probability of birds in point counts, but many biologists have been reluctant to abandon point counts in favor of more intensive approaches to counting. However, over the past few years a variety of statistical and methodological developments have begun to provide practical ways of overcoming some of the problems with point counts. We describe some of these approaches, and show how they can be integrated into standard point count protocols to greatly enhance the quality of the information. Several tools now exist for estimation of detection probability of birds during counts, including distance sampling, double observer methods, time-depletion (removal) methods, and hybrid methods that combine these approaches. Many counts are conducted in habitats that make auditory detection of birds much more likely than visual detection. As a framework for understanding detection probability during such counts, we propose separating two components of the probability a bird is detected during a count into (1) the probability a bird vocalizes during the count and (2) the probability this vocalization is detected by an observer. In addition, we propose that some measure of the area sampled during a count is necessary for valid inferences about bird populations. This can be done by employing fixed-radius counts or more sophisticated distance-sampling models. We recommend any studies employing point counts be designed to estimate detection probability and to include a measure of the area sampled.
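
    The proposed two-component decomposition multiplies an availability term (did the bird vocalize during the count?) by a perceptibility term (did an observer detect that vocalization?). Treating songs as a Poisson process, as a simple assumed model with invented rates, reproduces the frequent-singer versus infrequent-caller contrast:

```python
import math

def p_available(song_rate, minutes):
    """Pr(the bird vocalizes at least once during the count),
    with songs as a Poisson process at song_rate per minute."""
    return 1.0 - math.exp(-song_rate * minutes)

def p_detected(song_rate, minutes, p_hear):
    """Detection = availability (bird sings) x perceptibility (song heard)."""
    return p_available(song_rate, minutes) * p_hear

# Invented rates: a frequent singer vs. an infrequent caller, 10-min count
p_frequent = p_detected(2.0, 10.0, 0.95)
p_infrequent = p_detected(0.05, 10.0, 0.95)
```

    Even with identical perceptibility, the infrequent caller's detection probability stays well below one half, which is why count duration and vocalization rate both belong in the detection model.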

  19. Target Coverage in Wireless Sensor Networks with Probabilistic Sensors

    PubMed Central

    Shan, Anxing; Xu, Xianghua; Cheng, Zongmao

    2016-01-01

    Sensing coverage is a fundamental problem in wireless sensor networks (WSNs), which has attracted considerable attention. Conventional research on this topic focuses on the 0/1 coverage model, which is only a coarse approximation to the practical sensing model. In this paper, we study the target coverage problem, where the objective is to find the least number of sensor nodes in randomly-deployed WSNs based on the probabilistic sensing model. We analyze the joint detection probability of target with multiple sensors. Based on the theoretical analysis of the detection probability, we formulate the minimum ϵ-detection coverage problem. We prove that the minimum ϵ-detection coverage problem is NP-hard and present an approximation algorithm called the Probabilistic Sensor Coverage Algorithm (PSCA) with provable approximation ratios. To evaluate our design, we analyze the performance of PSCA theoretically and also perform extensive simulations to demonstrate the effectiveness of our proposed algorithm. PMID:27618902
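
    The joint detection probability analyzed here follows from sensor independence: a target is missed only if every sensor misses it. The exponential-decay sensing model and all parameter values below are illustrative assumptions, not the paper's exact formulation.

```python
import math

def sense_prob(distance, alpha=0.5, r_max=10.0):
    """Probabilistic sensing model (assumed form): detection probability
    decays exponentially with distance, zero beyond the range r_max."""
    return math.exp(-alpha * distance) if distance <= r_max else 0.0

def joint_detection(distances, alpha=0.5):
    """Pr(target detected by at least one of several independent sensors):
    1 - prod_i (1 - p_i)."""
    miss = 1.0
    for d in distances:
        miss *= 1.0 - sense_prob(d, alpha)
    return 1.0 - miss

p = joint_detection([1.0, 2.0, 4.0])  # three sensors at increasing range
```

    The epsilon-detection coverage problem then asks for the fewest sensors whose joint detection probability meets a threshold epsilon at every target, which is where the NP-hardness and approximation algorithm come in.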

  20. Using fingerprint image quality to improve the identification performance of the U.S. Visitor and Immigrant Status Indicator Technology Program

    PubMed Central

    Wein, Lawrence M.; Baveja, Manas

    2005-01-01

    Motivated by the difficulty biometric systems have in correctly matching fingerprints of poor image quality, we develop and solve a game-theoretic formulation of the identification problem in two settings: U.S. visa applicants are checked against a list of visa holders to detect visa fraud, and visitors entering the U.S. are checked against a watchlist of criminals and suspected terrorists. For three types of biometric strategies, we solve the game in which the U.S. Government chooses the strategy's optimal parameter values to maximize the detection probability subject to a constraint on the mean biometric processing time per legal visitor, and then the terrorist chooses the image quality to minimize the detection probability. At current inspector staffing levels at ports of entry, our model predicts that a quality-dependent two-finger strategy achieves a detection probability of 0.733, compared to 0.526 under the quality-independent two-finger strategy that is currently implemented at the U.S. border. Increasing the staffing level of inspectors offers only minor increases in the detection probability for these two strategies. Using more than two fingers to match visitors with poor image quality allows a detection probability of 0.949 under current staffing levels, but may require major changes to the current U.S. biometric program. The detection probabilities during visa application are ≈11–22% smaller than at ports of entry for all three strategies, but the same qualitative conclusions hold. PMID:15894628

  1. Using fingerprint image quality to improve the identification performance of the U.S. Visitor and Immigrant Status Indicator Technology Program.

    PubMed

    Wein, Lawrence M; Baveja, Manas

    2005-05-24

    Motivated by the difficulty biometric systems have in correctly matching fingerprints of poor image quality, we develop and solve a game-theoretic formulation of the identification problem in two settings: U.S. visa applicants are checked against a list of visa holders to detect visa fraud, and visitors entering the U.S. are checked against a watchlist of criminals and suspected terrorists. For three types of biometric strategies, we solve the game in which the U.S. Government chooses the strategy's optimal parameter values to maximize the detection probability subject to a constraint on the mean biometric processing time per legal visitor, and then the terrorist chooses the image quality to minimize the detection probability. At current inspector staffing levels at ports of entry, our model predicts that a quality-dependent two-finger strategy achieves a detection probability of 0.733, compared to 0.526 under the quality-independent two-finger strategy that is currently implemented at the U.S. border. Increasing the staffing level of inspectors offers only minor increases in the detection probability for these two strategies. Using more than two fingers to match visitors with poor image quality allows a detection probability of 0.949 under current staffing levels, but may require major changes to the current U.S. biometric program. The detection probabilities during visa application are approximately 11-22% smaller than at ports of entry for all three strategies, but the same qualitative conclusions hold.

  2. Environmental DNA (eDNA) Sampling Improves Occurrence and Detection Estimates of Invasive Burmese Pythons

    PubMed Central

    Hunter, Margaret E.; Oyler-McCance, Sara J.; Dorazio, Robert M.; Fike, Jennifer A.; Smith, Brian J.; Hunter, Charles T.; Reed, Robert N.; Hart, Kristen M.

    2015-01-01

    Environmental DNA (eDNA) methods are used to detect DNA that is shed into the aquatic environment by cryptic or low density species. Applied in eDNA studies, occupancy models can be used to estimate occurrence and detection probabilities and thereby account for imperfect detection. However, occupancy terminology has been applied inconsistently in eDNA studies, and many have calculated occurrence probabilities while not considering the effects of imperfect detection. Low detection of invasive giant constrictors using visual surveys and traps has hampered the estimation of occupancy and detection estimates needed for population management in southern Florida, USA. Giant constrictor snakes pose a threat to native species and the ecological restoration of the Florida Everglades. To assist with detection, we developed species-specific eDNA assays using quantitative PCR (qPCR) for the Burmese python (Python molurus bivittatus), Northern African python (P. sebae), boa constrictor (Boa constrictor), and the green (Eunectes murinus) and yellow anaconda (E. notaeus). Burmese pythons, Northern African pythons, and boa constrictors are established and reproducing, while the green and yellow anaconda have the potential to become established. We validated the python and boa constrictor assays using laboratory trials and tested all species in 21 field locations distributed in eight southern Florida regions. Burmese python eDNA was detected in 37 of 63 field sampling events; however, the other species were not detected. Although eDNA was heterogeneously distributed in the environment, occupancy models were able to provide the first estimates of detection probabilities, which were greater than 91%. Burmese python eDNA was detected along the leading northern edge of the known population boundary. 
The development of informative detection tools and eDNA occupancy models can improve conservation efforts in southern Florida and support more extensive studies of invasive constrictors. Generic sampling design and terminology are proposed to standardize and clarify interpretations of eDNA-based occupancy models. PMID:25874630

  3. Environmental DNA (eDNA) sampling improves occurrence and detection estimates of invasive Burmese pythons

    USGS Publications Warehouse

    Hunter, Margaret E.; Oyler-McCance, Sara J.; Dorazio, Robert M.; Fike, Jennifer A.; Smith, Brian J.; Hunter, Charles T.; Reed, Robert N.; Hart, Kristen M.

    2015-01-01

    Environmental DNA (eDNA) methods are used to detect DNA that is shed into the aquatic environment by cryptic or low density species. Applied in eDNA studies, occupancy models can be used to estimate occurrence and detection probabilities and thereby account for imperfect detection. However, occupancy terminology has been applied inconsistently in eDNA studies, and many have calculated occurrence probabilities while not considering the effects of imperfect detection. Low detection of invasive giant constrictors using visual surveys and traps has hampered the estimation of occupancy and detection estimates needed for population management in southern Florida, USA. Giant constrictor snakes pose a threat to native species and the ecological restoration of the Florida Everglades. To assist with detection, we developed species-specific eDNA assays using quantitative PCR (qPCR) for the Burmese python (Python molurus bivittatus), Northern African python (P. sebae), boa constrictor (Boa constrictor), and the green (Eunectes murinus) and yellow anaconda (E. notaeus). Burmese pythons, Northern African pythons, and boa constrictors are established and reproducing, while the green and yellow anaconda have the potential to become established. We validated the python and boa constrictor assays using laboratory trials and tested all species in 21 field locations distributed in eight southern Florida regions. Burmese python eDNA was detected in 37 of 63 field sampling events; however, the other species were not detected. Although eDNA was heterogeneously distributed in the environment, occupancy models were able to provide the first estimates of detection probabilities, which were greater than 91%. Burmese python eDNA was detected along the leading northern edge of the known population boundary. 
The development of informative detection tools and eDNA occupancy models can improve conservation efforts in southern Florida and support more extensive studies of invasive constrictors. Generic sampling design and terminology are proposed to standardize and clarify interpretations of eDNA-based occupancy models.

  4. Environmental DNA (eDNA) sampling improves occurrence and detection estimates of invasive burmese pythons.

    PubMed

    Hunter, Margaret E; Oyler-McCance, Sara J; Dorazio, Robert M; Fike, Jennifer A; Smith, Brian J; Hunter, Charles T; Reed, Robert N; Hart, Kristen M

    2015-01-01

    Environmental DNA (eDNA) methods are used to detect DNA that is shed into the aquatic environment by cryptic or low density species. Applied in eDNA studies, occupancy models can be used to estimate occurrence and detection probabilities and thereby account for imperfect detection. However, occupancy terminology has been applied inconsistently in eDNA studies, and many have calculated occurrence probabilities while not considering the effects of imperfect detection. Low detection of invasive giant constrictors using visual surveys and traps has hampered the estimation of occupancy and detection estimates needed for population management in southern Florida, USA. Giant constrictor snakes pose a threat to native species and the ecological restoration of the Florida Everglades. To assist with detection, we developed species-specific eDNA assays using quantitative PCR (qPCR) for the Burmese python (Python molurus bivittatus), Northern African python (P. sebae), boa constrictor (Boa constrictor), and the green (Eunectes murinus) and yellow anaconda (E. notaeus). Burmese pythons, Northern African pythons, and boa constrictors are established and reproducing, while the green and yellow anaconda have the potential to become established. We validated the python and boa constrictor assays using laboratory trials and tested all species in 21 field locations distributed in eight southern Florida regions. Burmese python eDNA was detected in 37 of 63 field sampling events; however, the other species were not detected. Although eDNA was heterogeneously distributed in the environment, occupancy models were able to provide the first estimates of detection probabilities, which were greater than 91%. Burmese python eDNA was detected along the leading northern edge of the known population boundary. 
The development of informative detection tools and eDNA occupancy models can improve conservation efforts in southern Florida and support more extensive studies of invasive constrictors. Generic sampling design and terminology are proposed to standardize and clarify interpretations of eDNA-based occupancy models.

  5. Estimating site occupancy rates when detection probabilities are less than one

    USGS Publications Warehouse

    MacKenzie, D.I.; Nichols, J.D.; Lachman, G.B.; Droege, S.; Royle, J. Andrew; Langtimm, C.A.

    2002-01-01

    Nondetection of a species at a site does not imply that the species is absent unless the probability of detection is 1. We propose a model and likelihood-based method for estimating site occupancy rates when detection probabilities are less than 1. Simulation results suggest the method provides reasonable estimates unless detection probabilities are low (less than 0.3). We estimated site occupancy rates for two anuran species at 32 wetland sites in Maryland, USA, from data collected during 2000 as part of an amphibian monitoring program, Frogwatch USA. Site occupancy rates were estimated as 0.49 for American toads (Bufo americanus), a 44% increase over the proportion of sites at which they were actually observed, and as 0.85 for spring peepers (Pseudacris crucifer), slightly above the observed proportion of 0.83.
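Site-occupancy models of this kind have a zero-inflated binomial likelihood: a site is occupied with probability ψ and, if occupied, yields a detection on each of K independent visits with probability p. A sketch of that standard form (notation assumed; the abstract does not spell it out):

```python
from math import comb

def site_likelihood(y: int, K: int, psi: float, p: float) -> float:
    """Zero-inflated binomial likelihood for one site with y detections
    in K visits: either the site is occupied (probability psi) and the
    detections are binomial, or it is unoccupied (1 - psi) and y must be 0."""
    binom = comb(K, y) * p**y * (1.0 - p) ** (K - y)
    return psi * binom + (1.0 - psi) * (1.0 if y == 0 else 0.0)
```

Maximizing the product of these terms over sites yields the likelihood-based estimates of ψ and p described above.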

  6. Estimating rates of local extinction and colonization in colonial species and an extension to the metapopulation and community levels

    USGS Publications Warehouse

    Barbraud, C.; Nichols, J.D.; Hines, J.E.; Hafner, H.

    2003-01-01

    Coloniality has mainly been studied from an evolutionary perspective, but relatively few studies have developed methods for modelling colony dynamics. Changes in number of colonies over time provide a useful tool for predicting and evaluating the responses of colonial species to management and to environmental disturbance. Probabilistic Markov process models have been recently used to estimate colony site dynamics using presence-absence data when all colonies are detected in sampling efforts. Here, we define and develop two general approaches for the modelling and analysis of colony dynamics for sampling situations in which all colonies are, and are not, detected. For both approaches, we develop a general probabilistic model for the data and then constrain model parameters based on various hypotheses about colony dynamics. We use Akaike's Information Criterion (AIC) to assess the adequacy of the constrained models. The models are parameterised with conditional probabilities of local colony site extinction and colonization. Presence-absence data arising from Pollock's robust capture-recapture design provide the basis for obtaining unbiased estimates of extinction, colonization, and detection probabilities when not all colonies are detected. This second approach should be particularly useful in situations where detection probabilities are heterogeneous among colony sites. The general methodology is illustrated using presence-absence data on two species of herons (Purple Heron, Ardea purpurea and Grey Heron, Ardea cinerea). Estimates of the extinction and colonization rates showed interspecific differences and strong temporal and spatial variations. We were also able to test specific predictions about colony dynamics based on ideas about habitat change and metapopulation dynamics. We recommend estimators based on probabilistic modelling for future work on colony dynamics. 
We also believe that this methodological framework has wide application to problems in animal ecology concerning metapopulation and community dynamics.
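The extinction/colonization parameterization above implies a first-order Markov recursion for the probability a colony site is occupied. A sketch of that standard recursion (ε for local extinction, γ for colonization; the recursion is assumed, not quoted from the paper):

```python
def project_occupancy(psi0: float, eps: float, gamma: float, steps: int) -> float:
    """Iterate psi_{t+1} = psi_t * (1 - eps) + (1 - psi_t) * gamma:
    an occupied site persists with probability 1 - eps, and an empty
    site is colonized with probability gamma."""
    psi = psi0
    for _ in range(steps):
        psi = psi * (1.0 - eps) + (1.0 - psi) * gamma
    return psi
```

The recursion converges to the equilibrium occupancy γ / (γ + ε) regardless of the starting value.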

  7. Model-Free CUSUM Methods for Person Fit

    ERIC Educational Resources Information Center

    Armstrong, Ronald D.; Shi, Min

    2009-01-01

    This article demonstrates the use of a new class of model-free cumulative sum (CUSUM) statistics to detect person fit given the responses to a linear test. The fundamental statistic being accumulated is the likelihood ratio of two probabilities. The detection performance of this CUSUM scheme is compared to other model-free person-fit statistics…

  8. Detection performance in clutter with variable resolution

    NASA Astrophysics Data System (ADS)

    Schmieder, D. E.; Weathersby, M. R.

    1983-07-01

    Experiments were conducted to determine the influence of background clutter on target detection criteria. The experiment consisted of placing observers in front of displayed images on a TV monitor. Observer ability to detect military targets embedded in simulated natural and manmade background clutter was measured when there was unlimited viewing time. Results were described in terms of detection probability versus target resolution for various signal to clutter ratios (SCR). The experiments were preceded by a search for a meaningful clutter definition. The selected definition was a statistical measure computed by averaging the standard deviation of contiguous scene cells over the whole scene. The cell size was comparable to the target size. Observer test results confirmed the expectation that the resolution required for a given detection probability was a continuum function of the clutter level. At the lower SCRs the resolution required for a high probability of detection was near 6 line pairs per target (LP/TGT), while at the higher SCRs it was found that a resolution of less than 0.25 LP/TGT would yield a high probability of detection. These results are expected to aid in target acquisition performance modeling and to lead to improved specifications for imaging automatic target screeners.
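The clutter definition described above can be sketched directly: tile the scene into contiguous cells sized comparably to the target, take each cell's standard deviation, and average over the whole scene. The non-overlapping tiling and use of the population standard deviation are implementation assumptions for this sketch:

```python
import statistics

def clutter_metric(scene: list[list[float]], cell: int) -> float:
    """Average the standard deviation of contiguous cell x cell blocks
    over the whole scene, with cell sized comparably to the target."""
    rows, cols = len(scene), len(scene[0])
    sds = []
    for r in range(0, rows - cell + 1, cell):
        for c in range(0, cols - cell + 1, cell):
            block = [scene[r + i][c + j]
                     for i in range(cell) for j in range(cell)]
            sds.append(statistics.pstdev(block))
    return sum(sds) / len(sds)
```

A uniform background scores zero; busier backgrounds score higher, which lowers the signal-to-clutter ratio for a fixed target contrast.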

  9. An Experiment Quantifying The Effect Of Clutter On Target Detection

    NASA Astrophysics Data System (ADS)

    Weathersby, Marshall R.; Schmieder, David E.

    1985-01-01

    Experiments were conducted to determine the influence of background clutter on target detection criteria. The experiment consisted of placing observers in front of displayed images on a TV monitor. Observer ability to detect military targets embedded in simulated natural and manmade background clutter was measured when there was unlimited viewing time. Results were described in terms of detection probability versus target resolution for various signal to clutter ratios (SCR). The experiments were preceded by a search for a meaningful clutter definition. The selected definition was a statistical measure computed by averaging the standard deviation of contiguous scene cells over the whole scene. The cell size was comparable to the target size. Observer test results confirmed the expectation that the resolution required for a given detection probability was a continuum function of the clutter level. At the lower SCRs the resolution required for a high probability of detection was near 6 line pairs per target (LP/TGT), while at the higher SCRs it was found that a resolution of less than 0.25 LP/TGT would yield a high probability of detection. These results are expected to aid in target acquisition performance modeling and to lead to improved specifications for imaging automatic target screeners.

  10. Rejecting probability summation for radial frequency patterns, not so Quick!

    PubMed

    Baldwin, Alex S; Schmidtmann, Gunnar; Kingdom, Frederick A A; Hess, Robert F

    2016-05-01

    Radial frequency (RF) patterns are used to assess how the visual system processes shape. They are thought to be detected globally. This is supported by studies finding summation for RF patterns greater than would be possible if the parts were detected independently, with performance improving with the number of cycles only through probability summation between them. However, the model of probability summation employed in these previous studies was based on High Threshold Theory (HTT), rather than Signal Detection Theory (SDT). We conducted rating scale experiments to investigate the receiver operating characteristics. We find these are of the curved form predicted by SDT, rather than the straight lines predicted by HTT. This means that to test probability summation we must use a model based on SDT. We conducted a set of summation experiments finding that thresholds decrease as the number of modulated cycles increases at approximately the same rate as previously found. As this could be consistent with either additive or probability summation, we performed maximum-likelihood fitting of a set of summation models (Matlab code provided in our Supplementary material) and assessed the fits using cross validation. We find we are not able to distinguish whether the responses to the parts of an RF pattern are combined by additive or probability summation, because the predictions are too similar. We present similar results for summation between separate RF patterns, suggesting that the summation process there may be the same as that within a single RF. Copyright © 2016 Elsevier Ltd. All rights reserved.
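The HTT-style probability summation that earlier RF studies assumed, and that these results call into question, predicts that a stimulus with n independently detected parts is missed only if every part is missed. A sketch of that baseline prediction (equal per-part probability is a simplifying assumption):

```python
def probability_summation(p_single: float, n: int) -> float:
    """HTT-style probability summation baseline: detection probability
    for n independent parts, each detected with probability p_single."""
    return 1.0 - (1.0 - p_single) ** n
```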

  11. Assessment of different surveillance systems for avian influenza in commercial poultry in Catalonia (North-Eastern Spain).

    PubMed

    Alba, A; Casal, J; Napp, S; Martin, P A J

    2010-11-01

    Compulsory surveillance programmes for avian influenza (AI) have been implemented in domestic poultry and wild birds in all the European Member States since 2005. The implementation of these programmes is complex and requires a close evaluation. A good indicator to assess their efficacy is the sensitivity (Se) of the surveillance system. In this study, the sensitivities for different sampling designs proposed by the Spanish authorities for the commercial poultry population of Catalonia were assessed, using the scenario tree model methodology. These samplings were stratified throughout the territory of Spain and took into account the species, the types of production and their specific risks. The probabilities of detecting infection at different prevalences at both individual and holding level were estimated. Furthermore, those subpopulations that contributed more to the Se of the system were identified. The model estimated that all the designs met the requirements of the European Commission. The probability of detecting AI circulating in Catalonian poultry did not change significantly when the within-holding design prevalence varied from 30% to 10%. In contrast, when the among-holding design prevalence decreased from 5% to 1%, the probability of detecting AI was drastically reduced. The sampling of duck and goose holdings, and to a lesser extent the sampling of turkey and game bird holdings, increased the Se substantially. The Se of passive surveillance in chickens for highly pathogenic avian influenza (HPAI) and low pathogenicity avian influenza (LPAI) was also assessed. The probability of the infected birds manifesting apparent clinical signs and the awareness of veterinarians and farmers had great influence on the probability of detecting AI. In order to increase the probability of an early detection of HPAI in chickens, the probability of performing AI-specific tests when AI is suspected would need to be increased. Copyright © 2010 Elsevier B.V. All rights reserved.
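At the holding level, scenario-tree sensitivity calculations reduce to the probability that at least one sampled bird is both infected and test-positive. A simplified sketch (a flat random sample and a single test sensitivity are simplifying assumptions; real scenario trees weight subpopulations by risk):

```python
def holding_sensitivity(n: int, design_prevalence: float, test_se: float) -> float:
    """Probability of detecting infection in a holding when n birds are
    sampled: each sampled bird is infected with probability
    design_prevalence and, if infected, tests positive with probability
    test_se. Detection requires at least one positive."""
    return 1.0 - (1.0 - test_se * design_prevalence) ** n
```

For moderate sample sizes this sensitivity stays near 1 as the within-holding design prevalence drops from 30% to 10%, consistent with the pattern reported above.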

  12. Signal coverage approach to the detection probability of hypothetical extraterrestrial emitters in the Milky Way.

    PubMed

    Grimaldi, Claudio

    2017-04-12

    The lack of evidence for the existence of extraterrestrial life, even the simplest forms of animal life, makes it difficult to decide whether the search for extraterrestrial intelligence (SETI) is more a high-risk, high-payoff endeavor than a futile attempt. Here we insist that even if extraterrestrial civilizations do exist and communicate, the likelihood of detecting their signals crucially depends on whether the Earth lies within a region of the galaxy covered by such signals. By considering possible populations of independent emitters in the galaxy, we build a statistical model of the domain covered by hypothetical extraterrestrial signals to derive the detection probability that the Earth is within such a domain. We show that for general distributions of the signal longevity and directionality, the mean number of detectable emitters is less than one even for detection probabilities as large as 50%, regardless of the number of emitters in the galaxy.

  13. Signal coverage approach to the detection probability of hypothetical extraterrestrial emitters in the Milky Way

    NASA Astrophysics Data System (ADS)

    Grimaldi, Claudio

    2017-04-01

    The lack of evidence for the existence of extraterrestrial life, even the simplest forms of animal life, makes it difficult to decide whether the search for extraterrestrial intelligence (SETI) is more a high-risk, high-payoff endeavor than a futile attempt. Here we insist that even if extraterrestrial civilizations do exist and communicate, the likelihood of detecting their signals crucially depends on whether the Earth lies within a region of the galaxy covered by such signals. By considering possible populations of independent emitters in the galaxy, we build a statistical model of the domain covered by hypothetical extraterrestrial signals to derive the detection probability that the Earth is within such a domain. We show that for general distributions of the signal longevity and directionality, the mean number of detectable emitters is less than one even for detection probabilities as large as 50%, regardless of the number of emitters in the galaxy.

  14. Estimating the influence of population density and dispersal behavior on the ability to detect and monitor Agrilus planipennis (Coleoptera: Buprestidae) populations.

    PubMed

    Mercader, R J; Siegert, N W; McCullough, D G

    2012-02-01

    Emerald ash borer, Agrilus planipennis Fairmaire (Coleoptera: Buprestidae), a phloem-feeding pest of ash (Fraxinus spp.) trees native to Asia, was first discovered in North America in 2002. Since then, A. planipennis has been found in 15 states and two Canadian provinces and has killed tens of millions of ash trees. Understanding the probability of detecting and accurately delineating low density populations of A. planipennis is a key component of effective management strategies. Here we approach this issue by 1) quantifying the efficiency of sampling nongirdled ash trees to detect new infestations of A. planipennis under varying population densities and 2) evaluating the likelihood of accurately determining the localized spread of discrete A. planipennis infestations. To estimate the probability a sampled tree would be detected as infested across a gradient of A. planipennis densities, we used A. planipennis larval density estimates collected during intensive surveys conducted in three recently infested sites with known origins. Results indicated the probability of detecting low density populations by sampling nongirdled trees was very low, even when detection tools were assumed to have three-fold higher detection probabilities than nongirdled trees. Using these results and an A. planipennis spread model, we explored the expected accuracy with which the spatial extent of an A. planipennis population could be determined. Model simulations indicated a poor ability to delineate the extent of the distribution of localized A. planipennis populations, particularly when a small proportion of the population was assumed to have a higher propensity for dispersal.

  15. A Max-Flow Based Algorithm for Connected Target Coverage with Probabilistic Sensors

    PubMed Central

    Shan, Anxing; Xu, Xianghua; Cheng, Zongmao; Wang, Wensheng

    2017-01-01

    Coverage is a fundamental issue in the research field of wireless sensor networks (WSNs). Connected target coverage discusses the sensor placement to guarantee the needs of both coverage and connectivity. Existing works largely rely on the Boolean disk model, which is only a coarse approximation to the practical sensing model. In this paper, we focus on the connected target coverage issue based on the probabilistic sensing model, which can characterize the quality of coverage more accurately. In the probabilistic sensing model, sensors are only able to detect a target with certain probability. We study the collaborative detection probability of a target under multiple sensors. Armed with the analysis of collaborative detection probability, we further formulate the minimum ϵ-connected target coverage problem, aiming to minimize the number of sensors satisfying the requirements of both coverage and connectivity. We map it into a flow graph and present an approximation algorithm called the minimum vertices maximum flow algorithm (MVMFA) with provable time complexity and approximation ratios. To evaluate our design, we analyze the performance of MVMFA theoretically and also conduct extensive simulation studies to demonstrate the effectiveness of our proposed algorithm. PMID:28587084

  16. A Max-Flow Based Algorithm for Connected Target Coverage with Probabilistic Sensors.

    PubMed

    Shan, Anxing; Xu, Xianghua; Cheng, Zongmao; Wang, Wensheng

    2017-05-25

    Coverage is a fundamental issue in the research field of wireless sensor networks (WSNs). Connected target coverage discusses the sensor placement to guarantee the needs of both coverage and connectivity. Existing works largely rely on the Boolean disk model, which is only a coarse approximation to the practical sensing model. In this paper, we focus on the connected target coverage issue based on the probabilistic sensing model, which can characterize the quality of coverage more accurately. In the probabilistic sensing model, sensors are only able to detect a target with certain probability. We study the collaborative detection probability of a target under multiple sensors. Armed with the analysis of collaborative detection probability, we further formulate the minimum ϵ-connected target coverage problem, aiming to minimize the number of sensors satisfying the requirements of both coverage and connectivity. We map it into a flow graph and present an approximation algorithm called the minimum vertices maximum flow algorithm (MVMFA) with provable time complexity and approximation ratios. To evaluate our design, we analyze the performance of MVMFA theoretically and also conduct extensive simulation studies to demonstrate the effectiveness of our proposed algorithm.

  17. Probability of acoustic transmitter detections by receiver lines in Lake Huron: results of multi-year field tests and simulations

    USGS Publications Warehouse

    Hayden, Todd A.; Holbrook, Christopher M.; Binder, Thomas; Dettmers, John M.; Cooke, Steven J.; Vandergoot, Christopher S.; Krueger, Charles C.

    2016-01-01

    Background: Advances in acoustic telemetry technology have led to an improved understanding of the spatial ecology of many freshwater and marine fish species. Understanding the performance of acoustic receivers is necessary to distinguish between tagged fish that may have been present but not detected and those fish that were absent from the area. In this study, two stationary acoustic transmitters were deployed 250 m apart within each of four acoustic receiver lines each containing at least 10 receivers (i.e., eight acoustic transmitters) located in Saginaw Bay and central Lake Huron for nearly 2 years to determine whether the probability of detecting an acoustic transmission varied as a function of time (i.e., season), location, and distance between acoustic transmitter and receiver. Distances between acoustic transmitters and receivers ranged from 200 m to >10 km in each line. The daily observed probability of detecting an acoustic transmission was used in simulation models to estimate the probability of detecting a moving acoustic transmitter on a line of receivers.
    Results: The probability of detecting an acoustic transmitter on a receiver 1000 m away differed by month for different receiver lines in Lake Huron and Saginaw Bay but was similar for paired acoustic transmitters deployed 250 m apart within the same line. Mean probability of detecting an acoustic transmitter at 1000 m calculated over the study period varied among acoustic transmitters 250 m apart within a line and differed among receiver lines in Lake Huron and Saginaw Bay. The simulated probability of detecting a moving acoustic transmitter on a receiver line was characterized by short periods of time with decreased detection. Although increased receiver spacing and higher fish movement rates decreased simulated detection probability, the location of the simulated receiver line in Lake Huron had the strongest effect on simulated detection probability.
    Conclusions: Performance of receiver lines in Lake Huron varied across a range of spatiotemporal scales and was inconsistent among receiver lines. Our simulations indicated that if 69 kHz acoustic transmitters operating at 158 dB in 10–30 m of freshwater were being used, then receivers should be placed 1000 m apart to ensure that all fish moving at 1 m s⁻¹ or less will be detected on 90% of days over a 2-year period. Whereas these results can be used as general guidelines for designing new studies, the irregular variation in acoustic transmitter detection probabilities we observed among receiver line locations in Lake Huron makes designing receiver lines in similar systems challenging and emphasizes the need to conduct post hoc analyses of acoustic transmitter detection probabilities.

  18. A hybrid double-observer sightability model for aerial surveys

    USGS Publications Warehouse

    Griffin, Paul C.; Lubow, Bruce C.; Jenkins, Kurt J.; Vales, David J.; Moeller, Barbara J.; Reid, Mason; Happe, Patricia J.; McCorquodale, Scott M.; Tirhi, Michelle J.; Schaberi, Jim P.; Beirne, Katherine

    2013-01-01

    Raw counts from aerial surveys make no correction for undetected animals and provide no estimate of precision with which to judge the utility of the counts. Sightability modeling and double-observer (DO) modeling are 2 commonly used approaches to account for detection bias and to estimate precision in aerial surveys. We developed a hybrid DO sightability model (model MH) that uses the strength of each approach to overcome the weakness in the other, for aerial surveys of elk (Cervus elaphus). The hybrid approach uses detection patterns of 2 independent observer pairs in a helicopter and telemetry-based detections of collared elk groups. Candidate MH models reflected hypotheses about effects of recorded covariates and unmodeled heterogeneity on the separate front-seat observer pair and back-seat observer pair detection probabilities. Group size and concealing vegetation cover strongly influenced detection probabilities. The pilot's previous experience participating in aerial surveys influenced detection by the front pair of observers if the elk group was on the pilot's side of the helicopter flight path. In 9 surveys in Mount Rainier National Park, the raw number of elk counted was approximately 80–93% of the abundance estimated by model MH. Uncorrected ratios of bulls per 100 cows generally were low compared to estimates adjusted for detection bias, but ratios of calves per 100 cows were comparable whether based on raw survey counts or adjusted estimates. The hybrid method was an improvement over commonly used alternatives, with improved precision compared to sightability modeling and reduced bias compared to DO modeling.

  19. Quality metrics for sensor images

    NASA Technical Reports Server (NTRS)

    Ahumada, Albert J., Jr.

    1993-01-01

    Methods are needed for evaluating the quality of augmented visual displays (AVID). Computational quality metrics will help summarize, interpolate, and extrapolate the results of human performance tests with displays. The FLM Vision group at NASA Ames has been developing computational models of visual processing and using them to develop computational metrics for similar problems. For example, display modeling systems use metrics for comparing proposed displays, halftone optimization methods use metrics to evaluate the difference between the halftone and the original, and image compression methods minimize the predicted visibility of compression artifacts. The visual discrimination models take as input two arbitrary images A and B and compute an estimate of the probability that a human observer will report that A is different from B. If A is an image that one desires to display and B is the actual displayed image, such an estimate can be regarded as an image quality metric reflecting how well B approximates A. There are additional complexities associated with the problem of evaluating the quality of radar and IR enhanced displays for AVID tasks. One important problem is the question of whether intruding obstacles are detectable in such displays. Although the discrimination model can handle detection situations by making B the original image A plus the intrusion, this detection model makes the inappropriate assumption that the observer knows where the intrusion will be. Effects of signal uncertainty need to be added to our models. A pilot needs to make decisions rapidly. The models need to predict not just the probability of a correct decision, but the probability of a correct decision by the time the decision needs to be made. That is, the models need to predict latency as well as accuracy. Luce and Green have generated models for auditory detection latencies. Similar models are needed for visual detection.
Most image quality models are designed for static imagery. Watson has been developing a general spatial-temporal vision model to optimize video compression techniques. These models need to be adapted and calibrated for AVID applications.

  20. Surveying Europe’s Only Cave-Dwelling Chordate Species (Proteus anguinus) Using Environmental DNA

    PubMed Central

    Márton, Orsolya; Schmidt, Benedikt R.; Gál, Júlia Tünde; Jelić, Dušan

    2017-01-01

    In surveillance of subterranean fauna, especially in the case of rare or elusive aquatic species, traditional techniques used for epigean species are often not feasible. We developed a non-invasive survey method based on environmental DNA (eDNA) to detect the presence of the red-listed cave-dwelling amphibian, Proteus anguinus, in the caves of the Dinaric Karst. We tested the method in fifteen caves in Croatia in which the species was previously recorded or expected to occur. We successfully confirmed the presence of P. anguinus in ten caves and detected the species for the first time in five others. Using a hierarchical occupancy model we compared the availability and detection probability of eDNA between two water sampling methods, filtration and precipitation. The statistical analysis showed that both availability and detection probability depended on the method, and estimates of both probabilities were higher for filter samples than for precipitation samples. Combining reliable field and laboratory methods with robust statistical modeling will give the best estimates of species occurrence. PMID:28129383

  1. Probability of detecting atrazine/desethyl-atrazine and elevated concentrations of nitrate (NO2+NO3-N) in ground water in the Idaho part of the upper Snake River basin

    USGS Publications Warehouse

    Rupert, Michael G.

    1998-01-01

    Draft Federal regulations may require that each State develop a State Pesticide Management Plan for the herbicides atrazine, alachlor, cyanazine, metolachlor, and simazine. This study developed maps that the Idaho State Department of Agriculture might use to predict the probability of detecting atrazine and desethyl-atrazine (a breakdown product of atrazine) in ground water in the Idaho part of the upper Snake River Basin. These maps can be incorporated in the State Pesticide Management Plan and help provide a sound hydrogeologic basis for atrazine management in the study area. Maps showing the probability of detecting atrazine/desethyl-atrazine in ground water were developed as follows: (1) Ground-water monitoring data were overlaid with hydrogeologic and anthropogenic data using a geographic information system to produce a data set in which each well had corresponding data on atrazine use, depth to ground water, geology, land use, precipitation, soils, and well depth. These data then were downloaded to a statistical software package for analysis by logistic regression. (2) Individual (univariate) relations between atrazine/desethyl-atrazine in ground water and atrazine use, depth to ground water, geology, land use, precipitation, soils, and well depth data were evaluated to identify those independent variables significantly related to atrazine/desethyl-atrazine detections. (3) Several preliminary multivariate models with various combinations of independent variables were constructed. (4) The multivariate models that best predicted the presence of atrazine/desethyl-atrazine in ground water were selected. (5) The multivariate models were entered into the geographic information system and the probability maps were constructed. Two models that best predicted the presence of atrazine/desethyl-atrazine in ground water were selected; one with and one without atrazine use.
Correlations of the predicted probabilities of atrazine/desethyl-atrazine in ground water with the percent of actual detections were good; r-squared values were 0.91 and 0.96, respectively. Models were verified using a second set of ground-water quality data. Verification showed that wells with water containing atrazine/desethyl-atrazine had significantly higher probability ratings than wells with water containing no atrazine/desethyl-atrazine (p <0.002). Logistic regression also was used to develop a preliminary model to predict the probability of nitrite plus nitrate as nitrogen concentrations greater than background levels of 2 milligrams per liter. A direct comparison between the atrazine/desethyl-atrazine and nitrite plus nitrate as nitrogen probability maps was possible because the same ground-water monitoring, hydrogeologic, and anthropogenic data were used to develop both maps. Land use, precipitation, soil hydrologic group, and well depth were significantly related with atrazine/desethyl-atrazine detections. Depth to water, land use, and soil drainage were significantly related with elevated nitrite plus nitrate as nitrogen concentrations. The differences between atrazine/desethyl-atrazine and nitrite plus nitrate as nitrogen relations were attributed to differences in chemical behavior of these compounds in the environment and possibly to differences in the extent of use and rates of their application.
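    The logistic-regression step these USGS studies describe (relating a binary detection outcome to hydrogeologic covariates) can be illustrated with a minimal Newton-Raphson fit. The data below are synthetic and the depth-to-water effect is an assumption chosen for illustration; this is not the study's fitted model.

```python
import numpy as np

def fit_logistic(X, y, n_iter=25):
    """Fit P(detect) = 1/(1+exp(-(b0 + X.b))) by Newton-Raphson
    (iteratively reweighted least squares)."""
    Xd = np.column_stack([np.ones(len(X)), X])   # add intercept column
    beta = np.zeros(Xd.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-Xd @ beta))
        W = p * (1.0 - p)                        # Bernoulli variance weights
        H = Xd.T @ (Xd * W[:, None])             # observed information
        g = Xd.T @ (y - p)                       # score vector
        beta += np.linalg.solve(H, g)
    return beta

def predict_prob(beta, X):
    Xd = np.column_stack([np.ones(len(X)), X])
    return 1.0 / (1.0 + np.exp(-Xd @ beta))

# Synthetic illustration: shallower depth to ground water -> higher
# detection odds (hypothetical coefficients, not the USGS estimates).
rng = np.random.default_rng(0)
depth = rng.uniform(1, 100, 500)                 # depth to water, m
true_p = 1.0 / (1.0 + np.exp(-(2.0 - 0.08 * depth)))
detected = (rng.random(500) < true_p).astype(float)
beta = fit_logistic(depth[:, None], detected)
```

    Mapping the fitted `predict_prob` over gridded covariates in a GIS is what produces the probability maps described in steps (1)-(5).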

  2. A Cloud Boundary Detection Scheme Combined with ASLIC and CNN Using ZY-3, GF-1/2 Satellite Imagery

    NASA Astrophysics Data System (ADS)

    Guo, Z.; Li, C.; Wang, Z.; Kwok, E.; Wei, X.

    2018-04-01

    Cloud detection in optical remote sensing images is one of the most important problems in remote sensing data processing. To address the information loss caused by cloud cover, a cloud detection method based on a convolutional neural network (CNN) is presented in this paper. Firstly, a deep CNN is used to learn a multi-level feature generation model of clouds from the training samples. Secondly, the adaptive simple linear iterative clustering (ASLIC) method is used to divide the detected images into superpixels. Finally, the probability that each superpixel belongs to the cloud region is predicted by the trained network model, thereby generating a cloud probability map. Typical regions of GF-1/2 and ZY-3 imagery were selected to carry out the cloud detection test, and the results were compared with the traditional SLIC method. The experimental results show that the average accuracy of cloud detection increased by more than 5%, and that the method detects both thin and thick clouds and delineates whole cloud boundaries well on different imaging platforms.

  3. Effects of surrounding land use and water depth on seagrass dynamics relative to a catastrophic algal bloom.

    PubMed

    Breininger, David R; Breininger, Robert D; Hall, Carlton R

    2017-02-01

    Seagrasses are the foundation of many coastal ecosystems and are in global decline because of anthropogenic impacts. For the Indian River Lagoon (Florida, U.S.A.), we developed competing multistate statistical models to quantify how environmental factors (surrounding land use, water depth, and time [year]) influenced the variability of seagrass state dynamics from 2003 to 2014 while accounting for time-specific detection probabilities that quantified our ability to determine seagrass state at particular locations and times. We classified seagrass states (presence or absence) at 764 points with geographic information system maps for years when seagrass maps were available and with aerial photographs when seagrass maps were not available. We used 4 categories (all conservation, mostly conservation, mostly urban, urban) to describe surrounding land use within sections of lagoonal waters, usually demarcated by land features that constricted these waters. The best models predicted that surrounding land use, depth, and year would affect transition and detection probabilities. Sections of the lagoon bordered by urban areas had the least stable seagrass beds and lowest detection probabilities, especially after a catastrophic seagrass die-off linked to an algal bloom. Sections of the lagoon bordered by conservation lands had the most stable seagrass beds, which supports watershed conservation efforts. Our results show that a multistate approach can empirically estimate state-transition probabilities as functions of environmental factors while accounting for state-dependent differences in seagrass detection probabilities as part of the overall statistical inference procedure. © 2016 Society for Conservation Biology.

  4. An integrated logit model for contamination event detection in water distribution systems.

    PubMed

    Housh, Mashor; Ostfeld, Avi

    2015-05-15

    The problem of contamination event detection in water distribution systems has become one of the most challenging research topics in water distribution systems analysis. Current attempts at event detection utilize a variety of approaches, including statistical, heuristic, machine learning, and optimization methods. Several existing event detection systems share a common feature in which alarms are obtained separately for each of the water quality indicators. Unifying those single alarms from different indicators is usually performed by means of simple heuristics. A salient feature of the approach developed here is the use of a statistically oriented model for discrete choice prediction, estimated by the maximum likelihood method, to integrate the single alarms. The discrete choice model is jointly calibrated with the other components of the event detection framework on a training data set using genetic algorithms. The fusion of the individual indicators' probabilities, which is left out of focus in many existing event detection models, is confirmed to be a crucial part of the system whose performance can be improved by modelling it with a discrete choice model. The developed methodology is tested on real water quality data, showing improved performance in decreasing the number of false positive alarms and in its ability to detect events with higher probabilities, compared to previous studies. Copyright © 2015 Elsevier Ltd. All rights reserved.
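    The fusion step the abstract emphasizes can be sketched as a binary logit over the single-indicator probabilities. The weights, bias, and indicator names below are hypothetical stand-ins; in the paper these parameters are calibrated jointly with the rest of the system (via a genetic algorithm), not hand-set.

```python
import math

def logit(p):
    """Log-odds transform of a probability."""
    return math.log(p / (1.0 - p))

def fuse_alarms(indicator_probs, weights, bias=0.0):
    """Fuse single-indicator event probabilities into one event probability
    with a binary logit model: sigmoid(bias + sum_i w_i * logit(p_i)).
    Weights and bias are hypothetical; a real system estimates them
    by maximum likelihood on training data."""
    z = bias + sum(w * logit(p) for w, p in zip(weights, indicator_probs))
    return 1.0 / (1.0 + math.exp(-z))

# e.g. chlorine, turbidity, and pH indicators each emit an alarm probability:
p_event = fuse_alarms([0.9, 0.7, 0.2], weights=[1.0, 0.8, 0.5])
```

    Replacing ad hoc alarm-combination heuristics with a model of this form is what lets the fusion stage be calibrated against data rather than tuned by hand.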

  5. Investigation of reliability indicators of information analysis systems based on Markov’s absorbing chain model

    NASA Astrophysics Data System (ADS)

    Gilmanshin, I. R.; Kirpichnikov, A. P.

    2017-09-01

    A study of the functioning algorithm of the early-detection module for excessive losses proves that it can be modelled using absorbing Markov chains. Of particular interest are the probabilistic characteristics of this algorithm, which relate the reliability indicators of individual elements, or the probabilities of occurrence of certain events, to the likelihood of transmitting reliable information. The relations identified in the analysis allow thresholds to be set for the reliability characteristics of the system components.
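    The standard machinery for an absorbing Markov chain is the fundamental matrix N = (I - Q)^-1 and the absorption-probability matrix B = N R. The transition probabilities below are illustrative placeholders, not values from the paper.

```python
import numpy as np

# Canonical form: transient states first, then absorbing states
# (e.g. "reliable report delivered" vs. "loss event missed").
# All probabilities here are illustrative, not taken from the paper.
Q = np.array([[0.0, 0.6],      # transient -> transient
              [0.1, 0.0]])
R = np.array([[0.3, 0.1],      # transient -> absorbing
              [0.7, 0.2]])

N = np.linalg.inv(np.eye(2) - Q)   # fundamental matrix: expected visits
B = N @ R                          # absorption probabilities per start state
steps = N.sum(axis=1)              # expected steps before absorption
```

    Row i of B gives, for each starting transient state, the probability of ending in each absorbing outcome, which is exactly the kind of reliability indicator the abstract describes deriving.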

  6. Dynamic N -occupancy models: estimating demographic rates and local abundance from detection-nondetection data

    Treesearch

    Sam Rossman; Charles B. Yackulic; Sarah P. Saunders; Janice Reid; Ray Davis; Elise F. Zipkin

    2016-01-01

    Occupancy modeling is a widely used analytical technique for assessing species distributions and range dynamics. However, occupancy analyses frequently ignore variation in abundance of occupied sites, even though site abundances affect many of the parameters being estimated (e.g., extinction, colonization, detection probability). We introduce a new model (“dynamic

  7. Estimating the Effects of Detection Heterogeneity and Overdispersion on Trends Estimated from Avian Point Counts

    EPA Science Inventory

    Point counts are a common method for sampling avian distribution and abundance. Though methods for estimating detection probabilities are available, many analyses use raw counts and do not correct for detectability. We use a removal model of detection within an N-mixture approa...

  8. Lamb wave-based damage quantification and probability of detection modeling for fatigue life assessment of riveted lap joint

    NASA Astrophysics Data System (ADS)

    He, Jingjing; Wang, Dengjiang; Zhang, Weifang

    2015-03-01

    This study presents an experimental and modeling study of damage detection and quantification in riveted lap joints. Embedded lead zirconate titanate piezoelectric (PZT) ceramic wafer-type sensors are employed to perform in-situ non-destructive testing during fatigue cyclical loading. A multi-feature integration method is developed to quantify the crack size using signal features of correlation coefficient, amplitude change, and phase change. In addition, a probability of detection (POD) model is constructed to quantify the reliability of the developed sizing method. Using the developed crack size quantification method and the resulting POD curve, probabilistic fatigue life prediction can be performed to provide comprehensive information for decision-making. The effectiveness of the overall methodology is demonstrated and validated using several aircraft lap joint specimens from different manufacturers and under different loading conditions.
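    A common POD-curve form in non-destructive evaluation is the lognormal model POD(a) = Phi((ln a - mu)/sigma), from which quantities like a90 (the crack size detected with 90% probability) follow. The mu and sigma below are hypothetical, not the paper's fitted values, and the lognormal form is the generic NDE convention rather than necessarily the model used in the study.

```python
import math

def pod(a, mu=math.log(2.0), sigma=0.5):
    """Lognormal POD model common in NDE: POD(a) = Phi((ln a - mu)/sigma),
    with crack size a in mm. mu and sigma are illustrative values."""
    z = (math.log(a) - mu) / (sigma * math.sqrt(2.0))
    return 0.5 * (1.0 + math.erf(z))

def a_for_pod(target, mu=math.log(2.0), sigma=0.5, lo=1e-3, hi=100.0):
    """Invert the POD curve by bisection: smallest crack size detected
    with probability `target` (e.g. a90 with target=0.90)."""
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if pod(mid, mu, sigma) < target:
            lo = mid
        else:
            hi = mid
    return hi

a90 = a_for_pod(0.90)
```

    Feeding a curve like this into a crack-growth model is what turns a sizing method's reliability into the probabilistic fatigue life prediction the abstract describes.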

  9. Relaxing the closure assumption in single-season occupancy models: staggered arrival and departure times

    USGS Publications Warehouse

    Kendall, William L.; Hines, James E.; Nichols, James D.; Grant, Evan H. Campbell

    2013-01-01

    Occupancy statistical models that account for imperfect detection have proved very useful in several areas of ecology, including species distribution and spatial dynamics, disease ecology, and ecological responses to climate change. These models are based on the collection of multiple samples at each of a number of sites within a given season, during which it is assumed the species is either absent or present and available for detection while each sample is taken. However, for some species, individuals are only present or available for detection seasonally. We present a statistical model that relaxes the closure assumption within a season by permitting staggered entry and exit times for the species of interest at each site. Based on simulation, our open model eliminates bias in occupancy estimators and in some cases increases precision. The power to detect the violation of closure is high if detection probability is reasonably high. In addition to providing more robust estimation of occupancy, this model permits comparison of phenology across sites, species, or years, by modeling variation in arrival or departure probabilities. In a comparison of four species of amphibians in Maryland we found that two toad species arrived at breeding sites later in the season than a salamander and frog species, and departed from sites earlier.

  10. Probability of detecting atrazine/desethyl-atrazine and elevated concentrations of nitrate in ground water in Colorado

    USGS Publications Warehouse

    Rupert, Michael G.

    2003-01-01

    Draft Federal regulations may require that each State develop a State Pesticide Management Plan for the herbicides atrazine, alachlor, metolachlor, and simazine. Maps were developed that the State of Colorado could use to predict the probability of detecting atrazine and desethyl-atrazine (a breakdown product of atrazine) in ground water in Colorado. These maps can be incorporated into the State Pesticide Management Plan and can help provide a sound hydrogeologic basis for atrazine management in Colorado. Maps showing the probability of detecting elevated nitrite plus nitrate as nitrogen (nitrate) concentrations in ground water in Colorado also were developed because nitrate is a contaminant of concern in many areas of Colorado. Maps showing the probability of detecting atrazine and(or) desethyl-atrazine (atrazine/DEA) at or greater than concentrations of 0.1 microgram per liter and nitrate concentrations in ground water greater than 5 milligrams per liter were developed as follows: (1) Ground-water quality data were overlaid with anthropogenic and hydrogeologic data using a geographic information system to produce a data set in which each well had corresponding data on atrazine use, fertilizer use, geology, hydrogeomorphic regions, land cover, precipitation, soils, and well construction. These data then were downloaded to a statistical software package for analysis by logistic regression. (2) Relations were observed between ground-water quality and the percentage of land-cover categories within circular regions (buffers) around wells. Several buffer sizes were evaluated; the buffer size that provided the strongest relation was selected for use in the logistic regression models. 
(3) Relations between concentrations of atrazine/DEA and nitrate in ground water and atrazine use, fertilizer use, geology, hydrogeomorphic regions, land cover, precipitation, soils, and well-construction data were evaluated, and several preliminary multivariate models with various combinations of independent variables were constructed. (4) The multivariate models that best predicted the presence of atrazine/DEA and elevated concentrations of nitrate in ground water were selected. (5) The accuracy of the multivariate models was confirmed by validating the models with an independent set of ground-water quality data. (6) The multivariate models were entered into a geographic information system and the probability maps were constructed.

  11. Total coliform and E. coli in public water systems using undisinfected ground water in the United States.

    PubMed

    Messner, Michael J; Berger, Philip; Javier, Julie

    2017-06-01

    Public water systems (PWSs) in the United States generate total coliform (TC) and Escherichia coli (EC) monitoring data, as required by the Total Coliform Rule (TCR). We analyzed data generated in 2011 by approximately 38,000 small (serving fewer than 4101 individuals) undisinfected public water systems (PWSs). We used statistical modeling to characterize a distribution of TC detection probabilities for each of nine groupings of PWSs based on system type (community, non-transient non-community, and transient non-community) and population served (less than 101, 101-1000, and 1001-4100 people). We found that among PWS types sampled in 2011, on average, undisinfected transient PWSs test positive for TC 4.3% of the time as compared with 3% for undisinfected non-transient PWSs and 2.5% for undisinfected community PWSs. Within each type of PWS, the smaller systems have higher median TC detection than the larger systems. All TC-positive samples were assayed for EC. Among TC-positive samples from small undisinfected PWSs, EC is detected in about 5% of samples, regardless of PWS type or size. We evaluated the upper tail of the TC detection probability distributions and found that significant percentages of some system types have high TC detection probabilities. For example, assuming the systems providing data are nationally representative, then 5.0% of the ∼50,000 small undisinfected transient PWSs in the U.S. have TC detection probabilities of 20% or more. Communities with such high TC detection probabilities may have elevated risk of acute gastrointestinal (AGI) illness - perhaps as great as or greater than the risk attributable to drinking water (6-22%) calculated for 14 Wisconsin community PWSs with much lower TC detection probabilities (about 2.3%, Borchardt et al., 2012). Published by Elsevier GmbH.

  12. Estimation of the POD function and the LOD of a qualitative microbiological measurement method.

    PubMed

    Wilrich, Cordula; Wilrich, Peter-Theodor

    2009-01-01

    Qualitative microbiological measurement methods in which the measurement results are either 0 (microorganism not detected) or 1 (microorganism detected) are discussed. The performance of such a measurement method is described by its probability of detection as a function of the contamination (CFU/g or CFU/mL) of the test material, or by the LOD(p), i.e., the contamination that is detected (measurement result 1) with a specified probability p. A complementary log-log model was used to statistically estimate these performance characteristics. An intralaboratory experiment for the detection of Listeria monocytogenes in various food matrixes illustrates the method. The estimate of LOD50% is compared with the Spearman-Kaerber method.
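    The complementary log-log POD model named in the abstract has a convenient closed-form inverse for LOD(p). The sketch below shows the model and that inverse; the parameter values are illustrative, and in practice a and b are estimated from spiked test portions, not assumed.

```python
import math

def pod_cloglog(c, a, b):
    """Complementary log-log POD model for a qualitative method:
    P(detect | contamination c, in CFU/g) = 1 - exp(-exp(a + b*ln c))."""
    return 1.0 - math.exp(-math.exp(a + b * math.log(c)))

def lod(p, a, b):
    """Contamination level detected with probability p; closed-form
    inverse of the model: ln c = (ln(-ln(1 - p)) - a) / b."""
    return math.exp((math.log(-math.log(1.0 - p)) - a) / b)

# With b = 1 and a = 0 the model reduces to the single-hit form
# 1 - exp(-c), whose LOD50% is ln 2 (illustrative check, not real data).
lod50 = lod(0.5, a=0.0, b=1.0)
```

    The special case b = 1 is the single-hit (Poisson) model, which is why the estimate of LOD50% can be compared against the Spearman-Kaerber method as the abstract does.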

  13. Improving inferences from fisheries capture-recapture studies through remote detection of PIT tags

    USGS Publications Warehouse

    Hewitt, David A.; Janney, Eric C.; Hayes, Brian S.; Shively, Rip S.

    2010-01-01

    Models for capture-recapture data are commonly used in analyses of the dynamics of fish and wildlife populations, especially for estimating vital parameters such as survival. Capture-recapture methods provide more reliable inferences than other methods commonly used in fisheries studies. However, for rare or elusive fish species, parameter estimation is often hampered by small probabilities of re-encountering tagged fish when encounters are obtained through traditional sampling methods. We present a case study that demonstrates how remote antennas for passive integrated transponder (PIT) tags can increase encounter probabilities and the precision of survival estimates from capture-recapture models. Between 1999 and 2007, trammel nets were used to capture and tag over 8,400 endangered adult Lost River suckers (Deltistes luxatus) during the spawning season in Upper Klamath Lake, Oregon. Despite intensive sampling at relatively discrete spawning areas, encounter probabilities from Cormack-Jolly-Seber models were consistently low (< 0.2) and the precision of apparent annual survival estimates was poor. Beginning in 2005, remote PIT tag antennas were deployed at known spawning locations to increase the probability of re-encountering tagged fish. We compare results based only on physical recaptures with results based on both physical recaptures and remote detections to demonstrate the substantial improvement in estimates of encounter probabilities (approaching 100%) and apparent annual survival provided by the remote detections. The richer encounter histories provided robust inferences about the dynamics of annual survival and have made it possible to explore more realistic models and hypotheses about factors affecting the conservation and recovery of this endangered species. Recent advances in technology related to PIT tags have paved the way for creative implementation of large-scale tagging studies in systems where they were previously considered impracticable.

  14. Modeling the frequency-dependent detective quantum efficiency of photon-counting x-ray detectors.

    PubMed

    Stierstorfer, Karl

    2018-01-01

    To find a simple model for the frequency-dependent detective quantum efficiency (DQE) of photon-counting detectors in the low flux limit. Formulas for the spatial cross-talk, the noise power spectrum, and the DQE of a photon-counting detector operating at a given threshold are derived. The parameters are probabilities for types of events such as single counts in the central pixel, double counts in the central pixel and a neighboring pixel, or a single count in a neighboring pixel only. These probabilities can be derived in a simple model by extensive use of Monte Carlo techniques: the Monte Carlo x-ray propagation program MOCASSIM is used to simulate the energy deposition from the x-rays in the detector material, and a simple charge cloud model using Gaussian clouds of fixed width is used for the propagation of the electric charge generated by the primary interactions. Both stages are combined in a Monte Carlo simulation randomizing the location of impact, which finally produces the required probabilities. The parameters of the charge cloud model are fitted to the spectral response to a polychromatic spectrum measured with our prototype detector. Based on the Monte Carlo model, the DQE of photon-counting detectors as a function of spatial frequency is calculated for various pixel sizes, photon energies, and thresholds. The frequency-dependent DQE of a photon-counting detector in the low flux limit can be described with an equation containing only a small set of probabilities as input. Estimates for the probabilities can be derived from a simple model of the detector physics. © 2017 American Association of Physicists in Medicine.
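    The charge-cloud Monte Carlo step can be sketched with a fixed-width Gaussian cloud and a counting threshold. Everything below is a toy stand-in: the pixel pitch, cloud width, and threshold are invented, the x-ray energy-deposition stage (MOCASSIM in the paper) is skipped, and only impacts over the central pixel are simulated.

```python
import math
import random

def charge_fraction(x0, sigma, pitch):
    """Fraction of a 1-D Gaussian charge cloud centred at x0 that lands
    in the pixel [0, pitch]. The 2-D cloud is separable, so x and y
    fractions multiply."""
    phi = lambda t: 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))
    return phi((pitch - x0) / sigma) - phi((0.0 - x0) / sigma)

def event_type_probs(pitch=0.3, sigma=0.03, threshold=0.25,
                     n=20000, seed=1):
    """Monte Carlo estimate of single- and double-count probabilities
    for impacts uniform over the central pixel (units: mm; threshold is
    a fraction of the deposited charge). Parameters are illustrative."""
    rng = random.Random(seed)
    single = double = 0
    for _ in range(n):
        x, y = rng.uniform(0, pitch), rng.uniform(0, pitch)
        fx0 = charge_fraction(x, sigma, pitch)
        fy0 = charge_fraction(y, sigma, pitch)
        # charge fractions in the central pixel and its 4 edge neighbours
        fracs = [fx0 * fy0,
                 charge_fraction(x + pitch, sigma, pitch) * fy0,  # left
                 charge_fraction(x - pitch, sigma, pitch) * fy0,  # right
                 fx0 * charge_fraction(y + pitch, sigma, pitch),  # below
                 fx0 * charge_fraction(y - pitch, sigma, pitch)]  # above
        hits = sum(f >= threshold for f in fracs)
        if hits == 1:
            single += 1
        elif hits == 2:
            double += 1
    return single / n, double / n
```

    Doubles occur when the impact lands near a pixel border so that two pixels both cross the threshold; shrinking the cloud width or raising the threshold suppresses them, which is the cross-talk dependence the derived DQE formulas take as input.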

  15. Kepler Planet Reliability Metrics: Astrophysical Positional Probabilities for Data Release 25

    NASA Technical Reports Server (NTRS)

    Bryson, Stephen T.; Morton, Timothy D.

    2017-01-01

    This document is very similar to KSCI-19092-003, Planet Reliability Metrics: Astrophysical Positional Probabilities, which describes the previous release of the astrophysical positional probabilities for Data Release 24. The important changes for Data Release 25 are: 1. The computation of the astrophysical positional probabilities uses the Data Release 25 processed pixel data for all Kepler Objects of Interest. 2. Computed probabilities now have associated uncertainties, whose computation is described in §4.1.3. 3. The scene modeling described in §4.1.2 uses background stars detected via ground-based high-resolution imaging, described in §5.1, that are not in the Kepler Input Catalog or UKIRT catalog. These newly detected stars are presented in Appendix B. Otherwise the text describing the algorithms and examples is largely unchanged from KSCI-19092-003.

  16. Estimating site occupancy, colonization, and local extinction when a species is detected imperfectly

    USGS Publications Warehouse

    MacKenzie, D.I.; Nichols, J.D.; Hines, J.E.; Knutson, M.G.; Franklin, A.B.

    2003-01-01

    Few species are likely to be so evident that they will always be detected when present. Failing to allow for the possibility that a target species was present, but undetected, at a site will lead to biased estimates of site occupancy, colonization, and local extinction probabilities. These population vital rates are often of interest in long-term monitoring programs and metapopulation studies. We present a model that enables direct estimation of these parameters when the probability of detecting the species is less than 1. The model does not require any assumptions of process stationarity, as do some previous methods, but does require detection/nondetection data to be collected in a manner similar to Pollock's robust design as used in mark-recapture studies. Via simulation, we show that the model provides good estimates of parameters for most scenarios considered. We illustrate the method with data from monitoring programs of Northern Spotted Owls (Strix occidentalis caurina) in northern California and tiger salamanders (Ambystoma tigrinum) in Minnesota, USA.
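    The single-season building block of this model family (MacKenzie et al.) has a simple likelihood: a site with d > 0 detections in K visits contributes psi * p^d * (1-p)^(K-d), while an all-zero history contributes psi*(1-p)^K + (1-psi). The sketch below fits it by a crude grid search on synthetic detection counts; real software uses numerical optimisation, but the likelihood surface is the same.

```python
import math
from itertools import product

def neg_log_lik(psi, p, histories, K):
    """Negative log-likelihood of single-season occupancy data, where
    `histories` holds the number of detections (0..K) at each site."""
    ll = 0.0
    for d in histories:
        if d > 0:
            ll += math.log(psi) + d * math.log(p) + (K - d) * math.log(1 - p)
        else:
            # all-zero history: occupied-but-missed, or truly unoccupied
            ll += math.log(psi * (1 - p) ** K + (1 - psi))
    return -ll

def fit_occupancy(histories, K, grid=100):
    """Grid-search MLE over (psi, p) on a grid of 1/grid steps."""
    best = (None, None, float("inf"))
    for i, j in product(range(1, grid), repeat=2):
        psi, p = i / grid, j / grid
        nll = neg_log_lik(psi, p, histories, K)
        if nll < best[2]:
            best = (psi, p, nll)
    return best[:2]

# 40 sites, 4 visits each: synthetic detection counts per site
histories = [0] * 18 + [1] * 8 + [2] * 9 + [3] * 4 + [4] * 1
psi_hat, p_hat = fit_occupancy(histories, K=4)
```

    Because some all-zero sites are occupied-but-missed, the estimated psi exceeds the naive fraction of sites with at least one detection, which is the bias correction the abstract is about.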

  17. Decreasing Plagiarism: What Works and What Doesn't

    ERIC Educational Resources Information Center

    Houtman, Anne M.; Walker, Sean

    2010-01-01

    The authors tested the predictions of a game theory model of plagiarism, using a test population of student papers submitted to an online plagiarism detection program, over five semesters in a non-majors biology course with multiple sections and high enrollment. Consistent with the model, as the probability of detection and the penalty if caught…

  18. Testing Models for Perceptual Discrimination Using Repeatable Noise

    NASA Technical Reports Server (NTRS)

    Ahumada, Albert J., Jr.; Null, Cynthia H. (Technical Monitor)

    1998-01-01

    Adding noise to stimuli to be discriminated allows estimation of observer classification functions based on the correlation between observer responses and relevant features of the noisy stimuli. Examples will be presented of stimulus features that are found in auditory tone detection and visual Vernier acuity. Using the standard signal detection model (Thurstone scaling), we derive formulas to estimate the proportion of the observer's decision variable variance that is controlled by the added noise. One is based on the probability of agreement of the observer with him/herself on trials with the same noise sample. Another is based on the relative performance of the observer and the model. When these do not agree, the model can be rejected. A second derivation gives the probability of agreement of observer and model when the observer follows the model except for internal noise. Agreement significantly less than this amount allows rejection of the model.

  19. Using new edges for anomaly detection in computer networks

    DOEpatents

    Neil, Joshua Charles

    2017-07-04

    Creation of new edges in a network may be used as an indication of a potential attack on the network. Historical data of a frequency with which nodes in a network create and receive new edges may be analyzed. Baseline models of behavior among the edges in the network may be established based on the analysis of the historical data. A new edge that deviates from a respective baseline model by more than a predetermined threshold during a time window may be detected. The new edge may be flagged as potentially anomalous when the deviation from the respective baseline model is detected. Probabilities for both new and existing edges may be obtained for all edges in a path or other subgraph. The probabilities may then be combined to obtain a score for the path or other subgraph. A threshold may be obtained by calculating an empirical distribution of the scores under historical conditions.
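
    One plausible reading of the scoring step can be sketched in a few lines: combine per-edge probabilities into a path score (here as a sum of negative log-probabilities, so rare or new edges dominate), then set the alarm threshold from an empirical distribution of scores under historical conditions. The combination rule and the 99th-percentile choice are illustrative assumptions; the patent abstract does not fix them:

```python
import math

def path_score(edge_probs):
    """Combine per-edge probabilities into a path score; rarer edges
    (e.g. a newly created edge with tiny probability) contribute more."""
    return sum(-math.log(p) for p in edge_probs)

def threshold_from_history(scores, quantile=0.99):
    """Empirical threshold: flag scores above the chosen quantile of
    scores observed under historical (assumed benign) conditions."""
    s = sorted(scores)
    return s[min(int(quantile * len(s)), len(s) - 1)]

# Illustrative history: mostly paths over well-established edges.
historical = [path_score([0.9, 0.8, 0.7]) for _ in range(99)]
historical.append(path_score([0.5, 0.5, 0.5]))
thresh = threshold_from_history(historical)

# A path containing two near-new (very low probability) edges.
suspicious = path_score([0.9, 0.001, 0.002])
```

    A path traversing two near-new edges scores far above the historical threshold, while paths over established edges do not.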

  20. Using new edges for anomaly detection in computer networks

    DOEpatents

    Neil, Joshua Charles

    2015-05-19

    Creation of new edges in a network may be used as an indication of a potential attack on the network. Historical data of a frequency with which nodes in a network create and receive new edges may be analyzed. Baseline models of behavior among the edges in the network may be established based on the analysis of the historical data. A new edge that deviates from a respective baseline model by more than a predetermined threshold during a time window may be detected. The new edge may be flagged as potentially anomalous when the deviation from the respective baseline model is detected. Probabilities for both new and existing edges may be obtained for all edges in a path or other subgraph. The probabilities may then be combined to obtain a score for the path or other subgraph. A threshold may be obtained by calculating an empirical distribution of the scores under historical conditions.

  1. A Gibbs sampler for Bayesian analysis of site-occupancy data

    USGS Publications Warehouse

    Dorazio, Robert M.; Rodriguez, Daniel Taylor

    2012-01-01

    1. A Bayesian analysis of site-occupancy data containing covariates of species occurrence and species detection probabilities is usually completed using Markov chain Monte Carlo methods in conjunction with software programs that can implement those methods for any statistical model, not just site-occupancy models. Although these software programs are quite flexible, considerable experience is often required to specify a model and to initialize the Markov chain so that summaries of the posterior distribution can be estimated efficiently and accurately. 2. As an alternative to these programs, we develop a Gibbs sampler for Bayesian analysis of site-occupancy data that include covariates of species occurrence and species detection probabilities. This Gibbs sampler is based on a class of site-occupancy models in which probabilities of species occurrence and detection are specified as probit-regression functions of site- and survey-specific covariate measurements. 3. To illustrate the Gibbs sampler, we analyse site-occupancy data of the blue hawker, Aeshna cyanea (Odonata, Aeshnidae), a common dragonfly species in Switzerland. Our analysis includes a comparison of results based on Bayesian and classical (non-Bayesian) methods of inference. We also provide code (based on the R software program) for conducting Bayesian and classical analyses of site-occupancy data.

  2. Development of a Random Field Model for Gas Plume Detection in Multiple LWIR Images.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heasler, Patrick G.

    This report develops a random field model that describes gas plumes in LWIR remote sensing images. The random field model serves as a prior distribution that can be combined with LWIR data to produce a posterior that determines the probability that a gas plume exists in the scene and also maps the most probable location of any plume. The random field model is intended to work with a single pixel regression estimator--a regression model that estimates gas concentration on an individual pixel basis.

  3. Signal coverage approach to the detection probability of hypothetical extraterrestrial emitters in the Milky Way

    PubMed Central

    Grimaldi, Claudio

    2017-01-01

    The lack of evidence for the existence of extraterrestrial life, even the simplest forms of animal life, makes it difficult to decide whether the search for extraterrestrial intelligence (SETI) is more a high-risk, high-payoff endeavor than a futile attempt. Here we insist that even if extraterrestrial civilizations do exist and communicate, the likelihood of detecting their signals crucially depends on whether the Earth lies within a region of the galaxy covered by such signals. By considering possible populations of independent emitters in the galaxy, we build a statistical model of the domain covered by hypothetical extraterrestrial signals to derive the detection probability that the Earth is within such a domain. We show that for general distributions of the signal longevity and directionality, the mean number of detectable emitters is less than one even for detection probabilities as large as 50%, regardless of the number of emitters in the galaxy. PMID:28401943

  4. On the detection of very high redshift gamma-ray bursts with Swift

    NASA Astrophysics Data System (ADS)

    Salvaterra, R.; Campana, S.; Chincarini, G.; Tagliaferri, G.; Covino, S.

    2007-09-01

    We compute the probability of detecting long gamma-ray bursts (GRBs) at z >= 5 with Swift, assuming that GRBs form preferentially in low-metallicity environments. The model fits both the observed Burst and Transient Source Experiment (BATSE) and Swift GRB differential peak flux distributions well and is consistent with the number of z >= 2.5 detections in the 2-yr Swift data. We find that the probability of observing a burst at z >= 5 becomes larger than 10 per cent for photon fluxes P < 1 ph s-1 cm-2, consistent with the number of confirmed detections. The corresponding fraction of z >= 5 bursts in the Swift catalogue is ~10-30 per cent depending on the adopted metallicity threshold for GRB formation. We propose to use the computed probability as a tool to identify high-redshift GRBs. By jointly considering promptly available information provided by Swift and model results, we can select reliable z >= 5 candidates in a few hours from the BAT detection. We test the procedure against the last year of Swift data: only three bursts match all our requirements, two being confirmed at z >= 5. Another three possible candidates are picked up by slightly relaxing the adopted criteria. No low-z interloper is found among the six candidates.

  5. Spatial patterns of breeding success of grizzly bears derived from hierarchical multistate models.

    PubMed

    Fisher, Jason T; Wheatley, Matthew; Mackenzie, Darryl

    2014-10-01

    Conservation programs often manage populations indirectly through the landscapes in which they live. Empirically, linking reproductive success with landscape structure and anthropogenic change is a first step in understanding and managing the spatial mechanisms that affect reproduction, but this link is not sufficiently informed by data. Hierarchical multistate occupancy models can forge these links by estimating spatial patterns of reproductive success across landscapes. To illustrate, we surveyed the occurrence of grizzly bears (Ursus arctos) in the Canadian Rocky Mountains, Alberta, Canada. We deployed camera traps for 6 weeks at 54 survey sites in different types of land cover. We used hierarchical multistate occupancy models to estimate probability of detection, grizzly bear occupancy, and probability of reproductive success at each site. Grizzly bear occupancy varied among cover types and was greater in herbaceous alpine ecotones than in low-elevation wetlands or mid-elevation conifer forests. The conditional probability of reproductive success given grizzly bear occupancy was 30% (SE = 0.14). Grizzly bears with cubs had a higher probability of detection than grizzly bears without cubs, but sites were correctly classified as being occupied by breeding females 49% of the time based on raw data and thus would have been underestimated by half. Repeated surveys and multistate modeling reduced the probability of misclassifying sites occupied by breeders as unoccupied to <2%. The probability of breeding grizzly bear occupancy varied across the landscape. Those patches with the highest probabilities of breeding occupancy (herbaceous alpine ecotones) were small and highly dispersed and are projected to shrink as treelines advance due to climate warming. Understanding spatial correlates in breeding distribution is a key requirement for species conservation in the face of climate change and can help identify priorities for landscape management and protection.
© 2014 Society for Conservation Biology.

  6. Hidden Markov models and neural networks for fault detection in dynamic systems

    NASA Technical Reports Server (NTRS)

    Smyth, Padhraic

    1994-01-01

    Neural networks plus hidden Markov models (HMM) can provide excellent detection and false alarm rate performance in fault detection applications, as shown in this viewgraph presentation. Modified models allow for novelty detection. Key contributions of neural network models are: (1) excellent nonparametric discrimination capability; (2) a good estimator of posterior state probabilities, even in high dimensions, and thus can be embedded within overall probabilistic model (HMM); and (3) simple to implement compared to other nonparametric models. Neural network/HMM monitoring model is currently being integrated with the new Deep Space Network (DSN) antenna controller software and will be on-line monitoring a new DSN 34-m antenna (DSS-24) by July, 1994.

  7. A sampling design and model for estimating abundance of Nile crocodiles while accounting for heterogeneity of detectability of multiple observers

    USGS Publications Warehouse

    Shirley, Matthew H.; Dorazio, Robert M.; Abassery, Ekramy; Elhady, Amr A.; Mekki, Mohammed S.; Asran, Hosni H.

    2012-01-01

    As part of the development of a management program for Nile crocodiles in Lake Nasser, Egypt, we used a dependent double-observer sampling protocol with multiple observers to compute estimates of population size. To analyze the data, we developed a hierarchical model that allowed us to assess variation in detection probabilities among observers and survey dates, as well as account for variation in crocodile abundance among sites and habitats. We conducted surveys from July 2008-June 2009 in 15 areas of Lake Nasser that were representative of 3 main habitat categories. During these surveys, we sampled 1,086 km of lake shore wherein we detected 386 crocodiles. Analysis of the data revealed significant variability in both inter- and intra-observer detection probabilities. Our raw encounter rate was 0.355 crocodiles/km. When we accounted for observer effects and habitat, we estimated a surface population abundance of 2,581 (2,239-2,987, 95% credible intervals) crocodiles in Lake Nasser. Our results underscore the importance of well-trained, experienced monitoring personnel in order to decrease heterogeneity in intra-observer detection probability and to better detect changes in the population based on survey indices. This study will assist the Egyptian government in establishing a monitoring program as an integral part of future crocodile harvest activities in Lake Nasser.

  8. Using occupancy models of forest breeding birds to prioritize conservation planning

    USGS Publications Warehouse

    De Wan, A. A.; Sullivan, P.J.; Lembo, A.J.; Smith, C.R.; Maerz, J.C.; Lassoie, J.P.; Richmond, M.E.

    2009-01-01

    As urban development continues to encroach on the natural and rural landscape, land-use planners struggle to identify high priority conservation areas for protection. Although knowing where urban-sensitive species may be occurring on the landscape would facilitate conservation planning, research efforts are often not sufficiently designed to make quality predictions at unknown locations. Recent advances in occupancy modeling allow for more precise estimates of occupancy by accounting for differences in detectability. We applied these techniques to produce robust estimates of habitat occupancy for a subset of forest breeding birds, a group that has been shown to be sensitive to urbanization, in a rapidly urbanizing yet biologically diverse region of New York State. We found that detection probability ranged widely across species, from 0.05 to 0.8. Our models suggest that detection probability declined with increasing forest fragmentation. We also found that the probability of occupancy of forest breeding birds is negatively influenced by increasing perimeter-area ratio of forest fragments and urbanization in the surrounding habitat matrix. We capitalized on our random sampling design to produce spatially explicit models that predict high priority conservation areas across the entire region, where interior species were most likely to occur. Finally, we use our predictive maps to demonstrate how a strict sampling design coupled with occupancy modeling can be a valuable tool for prioritizing biodiversity conservation in land-use planning. © 2009 Elsevier Ltd.

  9. Standard plane localization in ultrasound by radial component model and selective search.

    PubMed

    Ni, Dong; Yang, Xin; Chen, Xin; Chin, Chien-Ting; Chen, Siping; Heng, Pheng Ann; Li, Shengli; Qin, Jing; Wang, Tianfu

    2014-11-01

    Acquisition of the standard plane is crucial for medical ultrasound diagnosis. However, this process requires substantial experience and a thorough knowledge of human anatomy. Therefore it is very challenging for novices and even time consuming for experienced examiners. We proposed a hierarchical, supervised learning framework for automatically detecting the standard plane from consecutive 2-D ultrasound images. We tested this technique by developing a system that localizes the fetal abdominal standard plane from ultrasound video by detecting three key anatomical structures: the stomach bubble, umbilical vein and spine. We first proposed a novel radial component-based model to describe the geometric constraints of these key anatomical structures. We then introduced a novel selective search method which exploits the vessel probability algorithm to produce probable locations for the spine and umbilical vein. Next, using component classifiers trained by random forests, we detected the key anatomical structures at their probable locations within the regions constrained by the radial component-based model. Finally, a second-level classifier combined the results from the component detection to identify an ultrasound image as either a "fetal abdominal standard plane" or a "non-fetal abdominal standard plane." Experimental results on 223 fetal abdomen videos showed that the detection accuracy of our method was as high as 85.6% and significantly outperformed both the full abdomen and the separate anatomy detection methods without geometric constraints. The experimental results demonstrated that our system shows great promise for application to clinical practice. Copyright © 2014 World Federation for Ultrasound in Medicine & Biology. Published by Elsevier Inc. All rights reserved.

  10. Can you hear me now? Range-testing a submerged passive acoustic receiver array in a Caribbean coral reef habitat

    USGS Publications Warehouse

    Selby, Thomas H.; Hart, Kristen M.; Fujisaki, Ikuko; Smith, Brian J.; Pollock, Clayton J; Hillis-Star, Zandy M; Lundgren, Ian; Oli, Madan K.

    2016-01-01

    Submerged passive acoustic technology allows researchers to investigate spatial and temporal movement patterns of many marine and freshwater species. The technology uses receivers to detect and record acoustic transmissions emitted from tags attached to an individual. Acoustic signal strength naturally attenuates over distance, but numerous environmental variables also affect the probability a tag is detected. Knowledge of receiver range is crucial for designing acoustic arrays and analyzing telemetry data. Here, we present a method for testing a relatively large-scale receiver array in a dynamic Caribbean coastal environment intended for long-term monitoring of multiple species. The U.S. Geological Survey and several academic institutions in collaboration with resource management at Buck Island Reef National Monument (BIRNM), off the coast of St. Croix, recently deployed a 52-receiver passive acoustic array. We targeted 19 array-representative receivers for range-testing by submersing fixed delay interval range-testing tags at various distance intervals in each cardinal direction from a receiver for a minimum of an hour. Using a generalized linear mixed model (GLMM), we estimated the probability of detection across the array and assessed the effect of water depth, habitat, wind, temperature, and time of day on the probability of detection. The predicted probability of detection across the entire array at 100 m distance from a receiver was 58.2% (95% CI: 44.0–73.0%) and dropped to 26.0% (95% CI: 11.4–39.3%) 200 m from a receiver, indicating a somewhat constrained effective detection range. Detection probability varied across habitat classes, with the greatest effective detection range occurring in homogenous sand substrate and the smallest in high rugosity reef. Predicted probability of detection across BIRNM highlights potential gaps in coverage using the current array as well as limitations of passive acoustic technology within a complex coral reef environment.
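
    The shape of such a detection-range curve can be sketched with a plain logistic model of detection versus distance. The study itself fit a GLMM with habitat and environmental covariates; the coefficients below are hypothetical values chosen only so the curve roughly reproduces the reported 58% detection at 100 m and 26% at 200 m:

```python
import math

def detection_prob(distance_m, b0=1.71, b1=-0.0138):
    """Logistic detection curve versus tag-receiver distance. b0 and b1
    are hypothetical coefficients, not the study's fitted GLMM values."""
    return 1.0 / (1.0 + math.exp(-(b0 + b1 * distance_m)))

def range_at(target_prob, b0=1.71, b1=-0.0138):
    """Distance at which the modelled detection probability falls to
    target_prob (inverting the logistic)."""
    return (math.log(target_prob / (1 - target_prob)) - b0) / b1

p100 = detection_prob(100)   # roughly 0.58
p200 = detection_prob(200)   # roughly 0.26
r50 = range_at(0.5)          # distance where detection drops to 50%
```

    Inverting the fitted curve this way is one practical route to an "effective detection range" when spacing receivers in an array.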

  11. Behavior Knowledge Space-Based Fusion for Copy-Move Forgery Detection.

    PubMed

    Ferreira, Anselmo; Felipussi, Siovani C; Alfaro, Carlos; Fonseca, Pablo; Vargas-Munoz, John E; Dos Santos, Jefersson A; Rocha, Anderson

    2016-07-20

    The detection of copy-move image tampering is of paramount importance nowadays, mainly due to its potential use for misleading the opinion forming process of the general public. In this paper, we go beyond traditional forgery detectors and aim at combining different properties of copy-move detection approaches by modeling the problem on a multiscale behavior knowledge space, which encodes the output combinations of different techniques as a priori probabilities considering multiple scales of the training data. Afterwards, the missing entries of the conditional probabilities are estimated through generative models applied to the existing training data. Finally, we propose different techniques that exploit the multi-directionality of the data to generate the final outcome detection map in a machine learning decision-making fashion. Experimental results on complex datasets, comparing the proposed techniques with a gamut of copy-move detection approaches and other fusion methodologies in the literature, show the effectiveness of the proposed method and its suitability for real-world applications.

  12. Double-observer line transect surveys with Markov-modulated Poisson process models for animal availability.

    PubMed

    Borchers, D L; Langrock, R

    2015-12-01

    We develop maximum likelihood methods for line transect surveys in which animals go undetected at distance zero, either because they are stochastically unavailable while within view or because they are missed when they are available. These incorporate a Markov-modulated Poisson process model for animal availability, allowing more clustered availability events than is possible with Poisson availability models. They include a mark-recapture component arising from the independent-observer survey, leading to more accurate estimation of detection probability given availability. We develop models for situations in which (a) multiple detections of the same individual are possible and (b) some or all of the availability process parameters are estimated from the line transect survey itself, rather than from independent data. We investigate estimator performance by simulation, and compare the multiple-detection estimators with estimators that use only initial detections of individuals, and with a single-observer estimator. Simultaneous estimation of detection function parameters and availability model parameters is shown to be feasible from the line transect survey alone with multiple detections and double-observer data but not with single-observer data. Recording multiple detections of individuals improves estimator precision substantially when estimating the availability model parameters from survey data, and we recommend that these data be gathered. We apply the methods to estimate detection probability from a double-observer survey of North Atlantic minke whales, and find that double-observer data greatly improve estimator precision here too. © 2015 The Authors Biometrics published by Wiley Periodicals, Inc. on behalf of International Biometric Society.

  13. Multi-Level Anomaly Detection on Time-Varying Graph Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bridges, Robert A; Collins, John P; Ferragut, Erik M

    This work presents a novel modeling and analysis framework for graph sequences which addresses the challenge of detecting and contextualizing anomalies in labelled, streaming graph data. We introduce a generalization of the BTER model of Seshadhri et al. by adding flexibility to community structure, and use this model to perform multi-scale graph anomaly detection. Specifically, probability models describing coarse subgraphs are built by aggregating probabilities at finer levels, and these closely related hierarchical models simultaneously detect deviations from expectation. This technique provides insight into a graph's structure and internal context that may shed light on a detected event. Additionally, this multi-scale analysis facilitates intuitive visualizations by allowing users to narrow focus from an anomalous graph to particular subgraphs or nodes causing the anomaly. For evaluation, two hierarchical anomaly detectors are tested against a baseline Gaussian method on a series of sampled graphs. We demonstrate that our graph statistics-based approach outperforms both a distribution-based detector and the baseline in a labeled setting with community structure, and it accurately detects anomalies in synthetic and real-world datasets at the node, subgraph, and graph levels. To illustrate the accessibility of information made possible via this technique, the anomaly detector and an associated interactive visualization tool are tested on NCAA football data, where teams and conferences that moved within the league are identified with perfect recall, and precision greater than 0.786.

  14. Ensemble learning and model averaging for material identification in hyperspectral imagery

    NASA Astrophysics Data System (ADS)

    Basener, William F.

    2017-05-01

    In this paper we present a method for identifying the material contained in a pixel or region of pixels in a hyperspectral image. An identification process can be performed on a spectrum from image pixels that have been pre-determined to be of interest, generally comparing the spectrum from the image to spectra in an identification library. The metric for comparison used in this paper is a Bayesian probability for each material. This probability can be computed either from Bayes' theorem applied to normal distributions for each library spectrum or using model averaging. Using probabilities has the advantage that the probabilities can be summed over spectra for any material class to obtain a class probability. For example, the probability that the spectrum of interest is a fabric is equal to the sum of all probabilities for fabric spectra in the library. We can do the same to determine the probability for a specific type of fabric, or any level of specificity contained in our library. Probabilities not only tell us which material is most likely, they also tell us how confident we can be in the material's presence; a probability close to 1 indicates near certainty of the presence of a material in the given class, and a probability close to 0.5 indicates that we cannot know if the material is present at the given level of specificity. This is much more informative than a detection score from a target detection algorithm or a label from a classification algorithm. In this paper we present results in the form of a hierarchical tree with probabilities for each node. We use Forest Radiance imagery with 159 bands.
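
    The class-probability idea is simple to sketch: compute a posterior over library spectra via Bayes' theorem with a uniform prior, then sum member probabilities within any class of the hierarchy. The independent-Gaussian likelihood, 3-band spectra, and library entries below are toy assumptions, not the paper's library or model-averaging procedure:

```python
import math

def material_probabilities(observed, library, sigma=1.0):
    """Posterior probability for each library spectrum under an
    independent-Gaussian likelihood and a uniform prior (Bayes' theorem).
    Log-likelihoods are shifted by their maximum before exponentiating
    for numerical stability."""
    def loglik(spec):
        return -sum((o - s) ** 2 for o, s in zip(observed, spec)) / (2 * sigma ** 2)
    logs = {name: loglik(spec) for name, spec in library.items()}
    m = max(logs.values())
    weights = {name: math.exp(v - m) for name, v in logs.items()}
    total = sum(weights.values())
    return {name: w / total for name, w in weights.items()}

def class_probability(probs, class_members):
    """Probability of a material class = sum over its member spectra."""
    return sum(probs[name] for name in class_members)

# Hypothetical 3-band library: two fabric spectra and one paint spectrum.
library = {
    "fabric_cotton": [0.20, 0.50, 0.70],
    "fabric_nylon":  [0.25, 0.45, 0.75],
    "paint_green":   [0.80, 0.10, 0.30],
}
probs = material_probabilities([0.22, 0.48, 0.72], library, sigma=0.1)
p_fabric = class_probability(probs, ["fabric_cotton", "fabric_nylon"])
```

    Here the observed spectrum is ambiguous between the two fabrics individually, yet the class probability for "fabric" is near 1: the hierarchy gives confident answers at the level of specificity the data support.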

  15. Study on optimization method of test conditions for fatigue crack detection using lock-in vibrothermography

    NASA Astrophysics Data System (ADS)

    Min, Qing-xu; Zhu, Jun-zhen; Feng, Fu-zhou; Xu, Chao; Sun, Ji-wei

    2017-06-01

    In this paper, the lock-in vibrothermography (LVT) is utilized for defect detection. Specifically, for a metal plate with an artificial fatigue crack, the temperature rise of the defective area is used for analyzing the influence of different test conditions, i.e. engagement force, excitation intensity, and modulated frequency. The multivariate nonlinear and logistic regression models are employed to estimate the POD (probability of detection) and POA (probability of alarm) of fatigue crack, respectively. The resulting optimal selection of test conditions is presented. The study aims to provide an optimized selection method of the test conditions in the vibrothermography system with the enhanced detection ability.

  16. Using spatially explicit surveillance models to provide confidence in the eradication of an invasive ant

    PubMed Central

    Ward, Darren F.; Anderson, Dean P.; Barron, Mandy C.

    2016-01-01

    Effective detection plays an important role in the surveillance and management of invasive species. Invasive ants are very difficult to eradicate and are prone to imperfect detection because of their small size and cryptic nature. Here we demonstrate the use of spatially explicit surveillance models to estimate the probability that Argentine ants (Linepithema humile) have been eradicated from an offshore island site, given their absence across four surveys and three surveillance methods, conducted since ant control was applied. The probability of eradication increased sharply as each survey was conducted. Using all surveys and surveillance methods combined, the overall median probability of eradication of Argentine ants was 0.96. There was a high level of confidence in this result, with a high Credible Interval Value of 0.87. Our results demonstrate the value of spatially explicit surveillance models for the likelihood of eradication of Argentine ants. We argue that such models are vital to give confidence in eradication programs, especially from highly valued conservation areas such as offshore islands. PMID:27721491
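
    The survey-by-survey rise in confidence can be sketched as a Bayes update on repeated negative surveys. The actual models are spatially explicit; collapsing each survey to a single detection probability, with illustrative prior and per-survey values, gives the gist:

```python
def eradication_posterior(prior, detect_probs):
    """Update P(eradicated) after a sequence of surveys with no
    detections. detect_probs[i] is the probability that survey i would
    have detected the ants if a population were still present."""
    p = prior
    for d in detect_probs:
        # No detection: certain if eradicated, probability (1 - d) if present.
        p = p / (p + (1 - p) * (1 - d))
    return p

# Illustrative: a 50% prior belief in eradication, then four negative
# surveys each assumed to have a 55% chance of detecting survivors.
posterior = eradication_posterior(0.5, [0.55] * 4)
```

    Each negative survey multiplies the relative weight on "still present" by (1 − d), so the posterior climbs sharply at first and saturates, mirroring the pattern reported in the abstract.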

  17. A Comparison of Grizzly Bear Demographic Parameters Estimated from Non-Spatial and Spatial Open Population Capture-Recapture Models.

    PubMed

    Whittington, Jesse; Sawaya, Michael A

    2015-01-01

    Capture-recapture studies are frequently used to monitor the status and trends of wildlife populations. Detection histories from individual animals are used to estimate probability of detection and abundance or density. The accuracy of abundance and density estimates depends on the ability to model factors affecting detection probability. Non-spatial capture-recapture models have recently evolved into spatial capture-recapture models that directly include the effect of distances between an animal's home range centre and trap locations on detection probability. Most studies comparing non-spatial and spatial capture-recapture biases focussed on single year models and no studies have compared the accuracy of demographic parameter estimates from open population models. We applied open population non-spatial and spatial capture-recapture models to three years of grizzly bear DNA-based data from Banff National Park and simulated data sets. The two models produced similar estimates of grizzly bear apparent survival, per capita recruitment, and population growth rates but the spatial capture-recapture models had better fit. Simulations showed that spatial capture-recapture models produced more accurate parameter estimates with better credible interval coverage than non-spatial capture-recapture models. Non-spatial capture-recapture models produced negatively biased estimates of apparent survival and positively biased estimates of per capita recruitment. The spatial capture-recapture grizzly bear population growth rates and 95% highest posterior density averaged across the three years were 0.925 (0.786-1.071) for females, 0.844 (0.703-0.975) for males, and 0.882 (0.779-0.981) for females and males combined. The non-spatial capture-recapture population growth rates were 0.894 (0.758-1.024) for females, 0.825 (0.700-0.948) for males, and 0.863 (0.771-0.957) for both sexes. 
The combination of low densities, low reproductive rates, and predominantly negative population growth rates suggest that Banff National Park's population of grizzly bears requires continued conservation-oriented management actions.

  18. Making sense of the noise: The effect of hydrology on silver carp eDNA detection in the Chicago area waterway system.

    PubMed

    Song, Jeffery W; Small, Mitchell J; Casman, Elizabeth A

    2017-12-15

    Environmental DNA (eDNA) sampling is an emerging tool for monitoring the spread of aquatic invasive species. One confounding factor when interpreting eDNA sampling evidence is that eDNA can be present in the water in the absence of living target organisms, originating from excreta, dead tissue, boats, or sewage effluent, etc. In the Chicago Area Waterway System (CAWS), electric fish dispersal barriers were built to prevent non-native Asian carp species from invading Lake Michigan, and yet Asian carp eDNA has been detected above the barriers sporadically since 2009. In this paper the influence of stream flow characteristics in the CAWS on the probability of invasive Asian carp eDNA detection in the CAWS from 2009 to 2012 was examined. In the CAWS, the direction of stream flow is mostly away from Lake Michigan, though there are infrequent reversals in flow direction towards Lake Michigan during dry spells. We find that the flow reversal volume into the Lake has a statistically significant positive relationship with eDNA detection probability, while other covariates, like gage height, precipitation, season, water temperature, dissolved oxygen concentration, pH and chlorophyll concentration do not. This suggests that stream flow direction is highly influential on eDNA detection in the CAWS and should be considered when interpreting eDNA evidence. We also find that the beta-binomial regression model provides a stronger fit for eDNA detection probability compared to a binomial regression model. This paper provides a statistical modeling framework for interpreting eDNA sampling evidence and for evaluating covariates influencing eDNA detection. Copyright © 2017 Elsevier B.V. All rights reserved.
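
    Why a beta-binomial regression can outfit a binomial one on eDNA detection counts is easy to show: when detection probability varies between sampling events, counts are overdispersed, and a single binomial p cannot capture it. A sketch comparing maximized log-likelihoods on toy overdispersed data (the data and the crude grid search over the beta parameters are illustrative, not the study's):

```python
import math

def log_binom_coeff(n, k):
    return math.lgamma(n + 1) - math.lgamma(k + 1) - math.lgamma(n - k + 1)

def binom_loglik(data, p):
    """data: list of (detections k, samples n) pairs; single shared p."""
    return sum(log_binom_coeff(n, k) + k * math.log(p) + (n - k) * math.log(1 - p)
               for k, n in data)

def betabinom_loglik(data, a, b):
    """Beta-binomial: detection probability varies between sampling
    events as Beta(a, b), capturing extra-binomial variation."""
    def logbeta(x, y):
        return math.lgamma(x) + math.lgamma(y) - math.lgamma(x + y)
    return sum(log_binom_coeff(n, k) + logbeta(k + a, n - k + b) - logbeta(a, b)
               for k, n in data)

# Toy overdispersed data: events are nearly all-positive or nearly
# all-negative, which no single binomial p can reproduce.
data = [(18, 20), (19, 20), (1, 20), (0, 20), (17, 20), (2, 20)]
p_hat = sum(k for k, n in data) / sum(n for k, n in data)
ll_binom = binom_loglik(data, p_hat)
ll_bb = max(betabinom_loglik(data, a / 10, b / 10)
            for a in range(1, 51) for b in range(1, 51))
```

    On data like these the beta-binomial fit dominates by a wide log-likelihood margin, which is the same qualitative pattern the authors report for eDNA detection probability in the CAWS.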

  19. Coffee Agroforests Remain Beneficial for Neotropical Bird Community Conservation across Seasons

    PubMed Central

    Peters, Valerie E.; Cooper, Robert J.; Carroll, C. Ron

    2013-01-01

    Coffee agroforestry systems and secondary forests have been shown to support similar bird communities, but comparing these habitat types is challenged by potential biases due to differences in detectability between habitats. Furthermore, seasonal dynamics may influence bird communities differently in different habitat types, and therefore seasonal effects should be considered in comparisons. To address these issues, we incorporated seasonal effects and factors potentially affecting bird detectability into models to compare avian community composition and dynamics between coffee agroforests and secondary forest fragments. In particular, we modeled community composition and community dynamics of bird functional groups based on habitat type (coffee agroforest vs. secondary forest) and season while accounting for variation in capture probability (i.e. detectability). The models we used estimated capture probability to be similar between habitat types for each dietary guild, but omnivores had a lower capture probability than frugivores and insectivores. Although apparent species richness was higher in coffee agroforest than secondary forest, model results indicated that omnivores and insectivores were more common in secondary forest when accounting for heterogeneity in capture probability. Our results largely support the notion that shade-coffee can serve as a surrogate habitat for secondary forest with respect to avian communities. Small coffee agroforests embedded within the typical tropical countryside matrix of secondary forest patches and small-scale agriculture, therefore, may host avian communities that resemble those of surrounding secondary forest, and may serve as viable corridors linking patches of forest within these landscapes. This information is an important step toward effective landscape-scale conservation in Neotropical agricultural landscapes. PMID:24058437

  20. Evidence for skipped spawning in a potamodromous cyprinid, humpback chub (Gila cypha), with implications for demographic parameter estimates

    USGS Publications Warehouse

    Pearson, Kristen Nicole; Kendall, William L.; Winkelman, Dana L.; Persons, William R.

    2015-01-01

    Our findings reveal evidence for skipped spawning in a potamodromous cyprinid, humpback chub (HBC; Gila cypha). Using closed robust design mark-recapture models, we found, on average, spawning HBC transition to the skipped spawning state with a probability of 0.45 (95% CRI (i.e. credible interval): 0.10, 0.80) and skipped spawners remain in the skipped spawning state with a probability of 0.60 (95% CRI: 0.26, 0.83), yielding an average spawning cycle of every 2.12 years, conditional on survival. As a result, migratory skipped spawners are unavailable for detection during annual sampling events. If availability is unaccounted for, survival and detection probability estimates will be biased. Therefore, we estimated annual adult survival probability (S), while accounting for skipped spawning, and found S remained reasonably stable throughout the study period, with an average of 0.75 (95% CRI: 0.66, 0.82; process variance σ² = 0.005), while skipped spawning probability was highly dynamic (σ² = 0.306). By improving understanding of HBC spawning strategies, conservation decisions can be based on less biased estimates of survival and a more informed population model structure.
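
    The reported transition probabilities imply the 2.12-year average spawning cycle directly: treating spawning and skipped spawning as a two-state Markov chain, the average cycle is the mean recurrence time of the spawning state, i.e. the reciprocal of its stationary probability. A minimal sketch (variable names are ours, not the paper's):

```python
# Two-state chain: 0 = spawning, 1 = skipped spawning.
p_spawn_to_skip = 0.45   # reported P(spawner -> skipper)
p_skip_to_skip = 0.60    # reported P(skipper remains a skipper)
p_skip_to_spawn = 1 - p_skip_to_skip

# Stationary probability of the spawning state (balance for a 2-state chain).
pi_spawn = p_skip_to_spawn / (p_spawn_to_skip + p_skip_to_spawn)

# Mean recurrence time of the spawning state = average spawning cycle.
cycle_years = 1 / pi_spawn
print(cycle_years)   # ~2.12 years, matching the abstract
```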

  1. Autonomous detection of crowd anomalies in multiple-camera surveillance feeds

    NASA Astrophysics Data System (ADS)

    Nordlöf, Jonas; Andersson, Maria

    2016-10-01

    A novel approach for autonomous detection of anomalies in crowded environments is presented in this paper. The proposed model uses a Gaussian mixture probability hypothesis density (GM-PHD) filter as a feature extractor in conjunction with different Gaussian mixture hidden Markov models (GM-HMMs). Results, based on both simulated and recorded data, indicate that this method can track and detect anomalies on-line in individual crowds through multiple camera feeds in a crowded environment.

  2. Log-Linear Models for Gene Association

    PubMed Central

    Hu, Jianhua; Joshi, Adarsh; Johnson, Valen E.

    2009-01-01

    We describe a class of log-linear models for the detection of interactions in high-dimensional genomic data. This class of models leads to a Bayesian model selection algorithm that can be applied to data that have been reduced to contingency tables using ranks of observations within subjects, and discretization of these ranks within gene/network components. Many normalization issues associated with the analysis of genomic data are thereby avoided. A prior density based on Ewens’ sampling distribution is used to restrict the number of interacting components assigned high posterior probability, and the calculation of posterior model probabilities is expedited by approximations based on the likelihood ratio statistic. Simulation studies are used to evaluate the efficiency of the resulting algorithm for known interaction structures. Finally, the algorithm is validated in a microarray study for which it was possible to obtain biological confirmation of detected interactions. PMID:19655032

  3. Is the detection of aquatic environmental DNA influenced by substrate type?

    PubMed

    Buxton, Andrew S; Groombridge, Jim J; Griffiths, Richard A

    2017-01-01

    The use of environmental DNA (eDNA) to assess the presence-absence of rare, cryptic or invasive species is hindered by a poor understanding of the factors that can remove DNA from the system. In aquatic systems, eDNA can be transported out either horizontally in water flows or vertically by incorporation into the sediment. Equally, eDNA may be broken down by various biotic and abiotic processes if the target organism leaves the system. We use occupancy modelling and a replicated mesocosm experiment to examine how detection probability of eDNA changes once the target species is no longer present. We hypothesise that detection probability falls faster with a sediment which has a large number of DNA binding sites such as topsoil or clay, over lower DNA binding capacity substrates such as sand. Water removed from ponds containing the target species (the great crested newt) initially showed high detection probabilities, but these fell to between 40% and 60% over the first 10 days and to between 10% and 22% by day 15: eDNA remained detectable at very low levels until day 22. Very little difference in detection was observed between the control group (no substrate) and the sand substrate. A small reduction in detection probability was observed between the control and clay substrates, but this was not significant. However, a highly significant reduction in detection probability was observed with a topsoil substrate. This result is likely to have stemmed from increased levels of PCR inhibition, suggesting that incorporation of DNA into the sentiment is of only limited importance. Surveys of aquatic species using eDNA clearly need to take account of substrate type as well as other environmental factors when collecting samples, analysing data and interpreting the results.

  4. The Effect of Scattering and Absorption on Noise from a Cavitating Noise Source in the Subsurface Ocean Layer.

    DTIC Science & Technology

    1981-06-01

    … for a detection probability of P_D and associated false alarm probability P_FA (in dB). … V. REFERENCE MODEL. A. INTRODUCTION. In order to … the decision region for which to choose H1: P_FA = ∫ p(w|H0) dw = Q(·) (26). Similarly, the miss probability = 1 − detection probability is obtained by integrating … The input signal-to-noise ratio: S/N(input) = a² … (32). The probability of false alarm: P_FA = Q[·] (33). …
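
    The relations in this fragment are the standard Gaussian detection results: the false-alarm probability integrates the noise-only density above the decision threshold, giving a Q-function, and the miss probability is one minus the detection probability. A hedged sketch with illustrative threshold, signal, and noise values (none recovered from the report):

```python
import math

def Q(x):
    """Gaussian tail probability Q(x) = P(N(0,1) > x)."""
    return 0.5 * math.erfc(x / math.sqrt(2))

sigma = 1.0   # noise standard deviation (assumed)
mu_s = 3.0    # mean under signal-present hypothesis H1 (assumed)
tau = 2.0     # decision threshold (assumed)

P_FA = Q(tau / sigma)             # integrate noise-only density above tau
P_D = Q((tau - mu_s) / sigma)     # detection probability under H1
P_miss = 1 - P_D                  # miss probability = 1 - detection probability

print(P_FA, P_D, P_miss)
```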

  5. Masking by Gratings Predicted by an Image Sequence Discriminating Model: Testing Models for Perceptual Discrimination Using Repeatable Noise

    NASA Technical Reports Server (NTRS)

    Ahumada, Albert J., Jr.; Null, Cynthia H. (Technical Monitor)

    1998-01-01

    Adding noise to stimuli to be discriminated allows estimation of observer classification functions based on the correlation between observer responses and relevant features of the noisy stimuli. Examples will be presented of stimulus features that are found in auditory tone detection and visual vernier acuity. Using the standard signal detection model (Thurstone scaling), we derive formulas to estimate the proportion of the observer's decision variable variance that is controlled by the added noise. One is based on the probability of agreement of the observer with him/herself on trials with the same noise sample. Another is based on the relative performance of the observer and the model. When these do not agree, the model can be rejected. A second derivation gives the probability of agreement of observer and model when the observer follows the model except for internal noise. Agreement significantly less than this amount allows rejection of the model.

  6. Simulation Model of Mobile Detection Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Edmunds, T; Faissol, D; Yao, Y

    2009-01-27

    In this paper, we consider a mobile source that we attempt to detect with man-portable, vehicle-mounted or boat-mounted radiation detectors. The source is assumed to transit an area populated with these mobile detectors, and the objective is to detect the source before it reaches a perimeter. We describe a simulation model developed to estimate the probability that one of the mobile detectors will come into close proximity of the moving source and detect it. We illustrate with a maritime simulation example. Our simulation takes place in a 10 km by 5 km rectangular bay patrolled by boats equipped with 2-inch x 4-inch x 16-inch NaI detectors. Boats to be inspected enter the bay and randomly proceed to one of seven harbors on the shore. A source-bearing boat enters the mouth of the bay and proceeds to a pier on the opposite side. We wish to determine the probability that the source is detected and its range from target when detected. Patrol boats select the nearest in-bound boat for inspection and initiate an intercept course. Once within an operational range for the detection system, a detection algorithm is started. If the patrol boat confirms the source is not present, it selects the next nearest boat for inspection. Each run of the simulation ends either when a patrol successfully detects a source or when the source reaches its target. Several statistical detection algorithms have been implemented in the simulation model. First, a simple k-sigma algorithm, which alarms when the counts in a time window exceed the mean background plus k times the standard deviation of background, is available to the user. The time window used is optimized with respect to the signal-to-background ratio for that range and relative speed. Second, a sequential probability ratio test [Wald 1947] is available, and is configured in this simulation with a target false positive probability of 0.001 and false negative probability of 0.1.
This test is utilized when the mobile detector maintains a constant range to the vessel being inspected. Finally, a variation of the sequential probability ratio test that is more appropriate when source and detector are not at constant range is available [Nelson 2005]. Each patrol boat in the fleet can be assigned a particular zone of the bay, or all boats can be assigned to monitor the entire bay. Boats assigned to a zone will only intercept and inspect other boats when they enter their zone. In our example simulation, each of two patrol boats operates in a 5 km by 5 km zone. Other parameters for this example include: (1) Detection range - 15 m range maintained between patrol boat and inspected boat; (2) Inbound boat arrival rate - Poisson process with mean arrival rate of 30 boats per hour; (3) Speed of boats to be inspected - Random between 4.5 and 9 knots; (4) Patrol boat speed - 10 knots; (5) Number of detectors per patrol boat - 4 2-inch x 4-inch x 16-inch NaI detectors; (6) Background radiation - 40 counts/sec per detector; and (7) Detector response due to radiation source at 1 meter - 1,589 counts/sec per detector. Simulation results indicate that two patrol boats are able to detect the source 81% of the time without zones and 90% of the time with zones. The average distances between the source and target at the end of the simulation are 5,866 m and 5,712 m for non-zoned and zoned patrols, respectively. Of those that did not reach the target, the average distances to the target are 7,305 m and 6,441 m, respectively. Note that a design trade-off exists. While zoned patrols provide a higher probability of detection, the non-zoned patrols tend to detect the source farther from its target. Figure 1 displays the location of the source at the end of 1,000 simulations for the 5 x 10 km bay simulation.
The simulation model and analysis described here can be used to determine the number of mobile detectors one would need to deploy in order to have a reasonable chance of detecting a source in transit. By fixing the source speed to zero, the same model could be used to estimate how long it would take to detect a stationary source. For example, the model could predict how long it would take plant staff performing assigned duties carrying dosimeters to discover a contaminated spot in the facility.
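
    The k-sigma alarm rule described above is simple to state in code. The background rate and detector complement below are taken from the text; the integration window and the value of k are assumed for illustration:

```python
import math

bkg_rate = 40.0     # counts/sec per detector (from the text)
n_detectors = 4     # detectors per patrol boat (from the text)
window_s = 2.0      # integration window, assumed
k = 3.0             # alarm threshold in background sigmas, assumed

mean_bkg = bkg_rate * n_detectors * window_s        # expected background counts
threshold = mean_bkg + k * math.sqrt(mean_bkg)      # Poisson sigma = sqrt(mean)

def k_sigma_alarm(observed_counts):
    """Alarm when windowed counts exceed mean background + k sigmas."""
    return observed_counts > threshold

print(k_sigma_alarm(320))   # background-only level: no alarm
print(k_sigma_alarm(400))   # elevated counts: alarm
```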

  7. An integrated framework for detecting suspicious behaviors in video surveillance

    NASA Astrophysics Data System (ADS)

    Zin, Thi Thi; Tin, Pyke; Hama, Hiromitsu; Toriu, Takashi

    2014-03-01

    In this paper, we propose an integrated framework for detecting suspicious behaviors in video surveillance systems established in public places such as railway stations, airports, and shopping malls. In particular, people loitering suspiciously, unattended objects left behind, and suspicious objects exchanged between persons are common security concerns in airports and other transit scenarios. These involve understanding the scene/event, analyzing human movements, recognizing controllable objects, and observing the effect of the human movement on those objects. In the proposed framework, a multiple-background modeling technique, a high-level motion feature extraction method, and embedded Markov chain models are integrated for detecting suspicious behaviors in real-time video surveillance systems. Specifically, the proposed framework employs a probability-based multiple-background modeling technique to detect moving objects. Then the velocity and distance measures are computed as the high-level motion features of interest. By using an integration of the computed features and the first passage time probabilities of the embedded Markov chain, the suspicious behaviors in video surveillance are analyzed for detecting loitering persons, objects left behind and human interactions such as fighting. The proposed framework has been tested by using standard public datasets and our own video surveillance scenarios.
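
    First-passage-time probabilities of an embedded Markov chain, as used above to flag loitering, can be computed by tracking the probability mass that has not yet reached the target state. A toy two-state sketch (the chain and its states are illustrative, not from the paper):

```python
# States: 0 = moving, 1 = loitering (illustrative 2-state chain).
P = [[0.9, 0.1],
     [0.3, 0.7]]

def first_passage_prob(P, start, target, n_steps):
    """P(first visit to `target` occurs within n_steps, starting from `start`)."""
    dist = {start: 1.0}   # mass over states not yet absorbed at `target`
    absorbed = 0.0
    for _ in range(n_steps):
        new = {}
        for s, m in dist.items():
            for t, p in enumerate(P[s]):
                if t == target:
                    absorbed += m * p   # mass arriving at target for the first time
                else:
                    new[t] = new.get(t, 0.0) + m * p
        dist = new
    return absorbed

print(first_passage_prob(P, 0, 1, 1))    # one-step passage = P[0][1]
print(first_passage_prob(P, 0, 1, 500))  # approaches 1 for a recurrent target
```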

  8. Using occupancy models to investigate the prevalence of ectoparasitic vectors on hosts: an example with fleas on prairie dogs

    USGS Publications Warehouse

    Eads, David A.; Biggins, Dean E.; Doherty, Paul F.; Gage, Kenneth L.; Huyvaert, Kathryn P.; Long, Dustin H.; Antolin, Michael F.

    2013-01-01

    Ectoparasites are often difficult to detect in the field. We developed a method that can be used with occupancy models to estimate the prevalence of ectoparasites on hosts, and to investigate factors that influence rates of ectoparasite occupancy while accounting for imperfect detection. We describe the approach using a study of fleas (Siphonaptera) on black-tailed prairie dogs (Cynomys ludovicianus). During each primary occasion (monthly trapping events), we combed a prairie dog three consecutive times to detect fleas (15 s/combing). We used robust design occupancy modeling to evaluate hypotheses for factors that might correlate with the occurrence of fleas on prairie dogs, and factors that might influence the rate at which prairie dogs are colonized by fleas. Our combing method was highly effective; dislodged fleas fell into a tub of water and could not escape, and there was an estimated 99.3% probability of detecting a flea on an occupied host when using three combings. While overall detection was high, the probability of detection was always <1.00 during each primary combing occasion, highlighting the importance of considering imperfect detection. The combing method (removal of fleas) caused a decline in detection during primary occasions, and we accounted for that decline to avoid inflated estimates of occupancy. Regarding prairie dogs, flea occupancy was heightened in old/natural colonies of prairie dogs, and on hosts that were in poor condition. Occupancy was initially low in plots with high densities of prairie dogs, but, as the study progressed, the rate of flea colonization increased in plots with high densities of prairie dogs in particular. Our methodology can be used to improve studies of ectoparasites, especially when the probability of detection is low. Moreover, the method can be modified to investigate the co-occurrence of ectoparasite species, and community level factors such as species richness and interspecific interactions.
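
    The 99.3% figure illustrates how replicate combings compound detection: with per-combing detection probability p, the probability of at least one detection across three independent combings is 1 − (1 − p)³. A quick check with an assumed p of 0.81 (roughly the value implied by the abstract, not a number reported in it):

```python
p_per_combing = 0.81   # assumed per-combing detection probability

# Probability of detecting a flea on an occupied host in three combings.
p_three = 1 - (1 - p_per_combing) ** 3
print(round(p_three, 3))  # → 0.993
```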

  9. Habitat relationships of landbirds in the Northern Region, USDA Forest Service

    Treesearch

    Richard L. Hutto; Jock S. Young

    1999-01-01

    A series of first-generation habitat-relationship models was developed for the 83 bird species detected during a 3-year study of point counts conducted in association with the USDA Forest Service's Northern Region Landbird Monitoring Program. The models depict probabilities of detection for each of the bird species on 100-m-radius, 10-minute point counts conducted across a series...

  10. "A violation of the conditional independence assumption in the two-high-threshold Model of recognition memory": Correction to Chen, Starns, and Rotello (2015).

    PubMed

    2016-01-01

    Reports an error in "A violation of the conditional independence assumption in the two-high-threshold model of recognition memory" by Tina Chen, Jeffrey J. Starns and Caren M. Rotello (Journal of Experimental Psychology: Learning, Memory, and Cognition, 2015[Jul], Vol 41[4], 1215-1222). In the article, Chen et al. compared three models: a continuous signal detection model (SDT), a standard two-high-threshold discrete-state model in which detect states always led to correct responses (2HT), and a full-mapping version of the 2HT model in which detect states could lead to either correct or incorrect responses. After publication, Rani Moran (personal communication, April 21, 2015) identified two errors that impact the reported fit statistics for the Bayesian information criterion (BIC) metric of all models as well as the Akaike information criterion (AIC) results for the full-mapping model. The errors are described in the erratum. (The following abstract of the original article appeared in record 2014-56216-001.) The 2-high-threshold (2HT) model of recognition memory assumes that test items result in distinct internal states: they are either detected or not, and the probability of responding at a particular confidence level that an item is "old" or "new" depends on the state-response mapping parameters. The mapping parameters are independent of the probability that an item yields a particular state (e.g., both strong and weak items that are detected as old have the same probability of producing a highest-confidence "old" response). We tested this conditional independence assumption by presenting nouns 1, 2, or 4 times. To maximize the strength of some items, "superstrong" items were repeated 4 times and encoded in conjunction with pleasantness, imageability, anagram, and survival processing tasks. The 2HT model failed to simultaneously capture the response rate data for all item classes, demonstrating that the data violated the conditional independence assumption. 
In contrast, a Gaussian signal detection model, which posits that the level of confidence that an item is "old" or "new" is a function of its continuous strength value, provided a good account of the data. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  11. Anomaly Detection in Dynamic Networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Turcotte, Melissa

    2014-10-14

    Anomaly detection in dynamic communication networks has many important security applications. These networks can be extremely large and so detecting any changes in their structure can be computationally challenging; hence, computationally fast, parallelisable methods for monitoring the network are paramount. For this reason the methods presented here use independent node and edge based models to detect locally anomalous substructures within communication networks. As a first stage, the aim is to detect changes in the data streams arising from node or edge communications. Throughout the thesis, simple conjugate Bayesian models for counting processes are used to model these data streams. A second stage of analysis can then be performed on a much reduced subset of the network comprising nodes and edges which have been identified as potentially anomalous in the first stage. The first method assumes communications in a network arise from an inhomogeneous Poisson process with piecewise constant intensity. Anomaly detection is then treated as a changepoint problem on the intensities. The changepoint model is extended to incorporate seasonal behavior inherent in communication networks. This seasonal behavior is also viewed as a changepoint problem acting on a piecewise constant Poisson process. In a static time frame, inference is made on this extended model via a Gibbs sampling strategy. In a sequential time frame, where the data arrive as a stream, a novel, fast Sequential Monte Carlo (SMC) algorithm is introduced to sample from the sequence of posterior distributions of the change points over time. A second method is considered for monitoring communications in a large scale computer network. The usage patterns in these types of networks are very bursty in nature and do not fit a Poisson process model.
For tractable inference, discrete time models are considered, where the data are aggregated into discrete time periods and probability models are fitted to the communication counts. In a sequential analysis, anomalous behavior is then identified from outlying behavior with respect to the fitted predictive probability models. Seasonality is again incorporated into the model and is treated as a changepoint model on the transition probabilities of a discrete time Markov process. Second stage analytics are then developed which combine anomalous edges to identify anomalous substructures in the network.
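
    The conjugate Bayesian counting models described above can be sketched with a Gamma prior on a Poisson rate, which yields a negative-binomial posterior predictive for the next count; an outlying predictive tail probability then flags an edge as anomalous. Prior values, counts, and the flagging threshold below are all illustrative, not from the thesis:

```python
import math

def neg_binom_pmf(k, r, q):
    """Posterior predictive pmf under a Gamma(r, b) posterior: NB with q = b/(b+1)."""
    return math.comb(k + r - 1, k) * (q ** r) * ((1 - q) ** k)

a, b = 2, 1                  # Gamma(shape, rate) prior on the edge rate (assumed)
history = [3, 2, 4, 3, 2]    # past communication counts on an edge (illustrative)
a_post = a + sum(history)    # conjugate update: shape += total count
b_post = b + len(history)    # conjugate update: rate += number of periods

def tail_prob(k_obs):
    """Predictive P(count >= k_obs): small values indicate anomalous bursts."""
    q = b_post / (b_post + 1)
    return 1 - sum(neg_binom_pmf(k, a_post, q) for k in range(k_obs))

print(tail_prob(3))    # typical count: large tail probability
print(tail_prob(15))   # burst: tiny tail probability, flag as anomalous
```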

  12. Recent Advances in Model-Assisted Probability of Detection

    NASA Technical Reports Server (NTRS)

    Thompson, R. Bruce; Brasche, Lisa J.; Lindgren, Eric; Swindell, Paul; Winfree, William P.

    2009-01-01

    The increased role played by probability of detection (POD) in structural integrity programs, combined with the significant time and cost associated with the purely empirical determination of POD, provides motivation for alternate means to estimate this important metric of NDE techniques. One approach to make the process of POD estimation more efficient is to complement limited empirical experiments with information from physics-based models of the inspection process or controlled laboratory experiments. The Model-Assisted Probability of Detection (MAPOD) Working Group was formed by the Air Force Research Laboratory, the FAA Technical Center, and NASA to explore these possibilities. Since the 2004 inception of the MAPOD Working Group, 11 meetings have been held in conjunction with major NDE conferences. This paper will review the accomplishments of this group, which includes over 90 members from around the world. Included will be a discussion of strategies developed to combine physics-based and empirical understanding, draft protocols that have been developed to guide application of the strategies, and demonstrations that have been or are being carried out in a number of countries. The talk will conclude with a discussion of future directions, which will include documentation of benefits via case studies, development of formal protocols for engineering practice, as well as a number of specific technical issues.

  13. History and Evolution of the Johnson Criteria.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sjaardema, Tracy A.; Smith, Collin S.; Birch, Gabriel Carisle

    The Johnson Criteria metric calculates probability of detection of an object imaged by an optical system, and was created in 1958 by John Johnson. As understanding of target detection has improved, detection models have evolved to better account for additional factors such as weather, scene content, and object placement. The initial Johnson Criteria, while sufficient for the technology and understanding of the time, do not accurately reflect current research into target acquisition and technology. Even though current research shows a dependence on human factors, there appears to be a lack of testing and modeling of human variability.
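
    The resolvable-cycle criteria at the core of this history are usually applied through an empirical target transfer probability function (TTPF). The sketch below uses the commonly cited coefficients (2.7, 0.7) from the later acquisition-modeling literature; they are conventions from that literature, not values taken from this report:

```python
def ttpf(n_cycles, n50):
    """Empirical TTPF: detection probability from resolvable cycles on target.

    n50 is the cycle count giving 50% detection probability for the task.
    """
    ratio = n_cycles / n50
    e = 2.7 + 0.7 * ratio
    return ratio ** e / (1 + ratio ** e)

print(ttpf(1.0, 1.0))  # → 0.5 at the 50% criterion N50
print(ttpf(2.0, 1.0))  # well above 0.9 at twice the criterion
```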

  14. Directed Design of Experiments for Validating Probability of Detection Capability of NDE Systems (DOEPOD)

    NASA Technical Reports Server (NTRS)

    Generazio, Edward R.

    2015-01-01

    Directed Design of Experiments for Validating Probability of Detection Capability of NDE Systems (DOEPOD) Manual v.1.2. The capability of an inspection system is established by applications of various methodologies to determine the probability of detection (POD). One accepted metric of an adequate inspection system is that there is 95% confidence that the POD is greater than 90% (90/95 POD). Design of experiments for validating probability of detection capability of nondestructive evaluation (NDE) systems (DOEPOD) is a methodology that is implemented via software to serve as a diagnostic tool providing detailed analysis of POD test data, guidance on establishing data distribution requirements, and resolving test issues. DOEPOD relies on direct observation of detection occurrences. The DOEPOD capability has been developed to provide an efficient and accurate methodology that yields observed POD and confidence bounds for both hit-miss and signal amplitude testing. DOEPOD does not assume prescribed POD logarithmic or similar functions with assumed adequacy over a wide range of flaw sizes and inspection system technologies, so multi-parameter curve fitting or model optimization approaches to generate a POD curve are not required. DOEPOD applications for supporting inspector qualifications are included.
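
    The 90/95 criterion has a well-known consequence for hit/miss demonstrations: if every one of n trials is a hit, the lower 95% confidence bound on POD exceeds 0.90 once 0.90^n < 0.05. A minimal sketch of that arithmetic (a generic binomial calculation, not DOEPOD itself):

```python
def min_trials_all_hits(pod=0.90, alpha=0.05):
    """Smallest n such that n hits in n trials rejects POD <= pod at 1 - alpha."""
    n = 1
    while pod ** n >= alpha:
        n += 1
    return n

print(min_trials_all_hits())  # → 29, the familiar 29-of-29 demonstration
```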

  15. Using counts to simultaneously estimate abundance and detection probabilities in a salamander community

    USGS Publications Warehouse

    Dodd, C.K.; Dorazio, R.M.

    2004-01-01

    A critical variable in both ecological and conservation field studies is determining how many individuals of a species are present within a defined sampling area. Labor intensive techniques such as capture-mark-recapture and removal sampling may provide estimates of abundance, but there are many logistical constraints to their widespread application. Many studies on terrestrial and aquatic salamanders use counts as an index of abundance, assuming that detection remains constant while sampling. If this constancy is violated, determination of detection probabilities is critical to the accurate estimation of abundance. Recently, a model was developed that provides a statistical approach that allows abundance and detection to be estimated simultaneously from spatially and temporally replicated counts. We adapted this model to estimate these parameters for salamanders sampled over a six-year period in area-constrained plots in Great Smoky Mountains National Park. Estimates of salamander abundance varied among years, but annual changes in abundance did not vary uniformly among species. Except for one species, abundance estimates were not correlated with site covariates (elevation/soil and water pH, conductivity, air and water temperature). The uncertainty in the estimates was so large as to make correlations ineffectual in predicting which covariates might influence abundance. Detection probabilities also varied among species and sometimes among years for the six species examined. We found such a high degree of variation in our counts and in estimates of detection among species, sites, and years as to cast doubt upon the appropriateness of using count data to monitor population trends using a small number of area-constrained survey plots. Still, the model provided reasonable estimates of abundance that could make it useful in estimating population size from count surveys.
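
    The model referred to above is the binomial/Poisson mixture ("N-mixture") approach, in which replicated counts at a site are binomial(N, p) with the unobserved site abundance N treated as Poisson. A simplified single-site likelihood sketch (parameter values illustrative; the published model handles many sites and covariates):

```python
import math

def poisson_pmf(n, lam):
    return math.exp(-lam) * lam ** n / math.factorial(n)

def binom_pmf(k, n, p):
    return math.comb(n, k) * p ** k * (1 - p) ** (n - k) if k <= n else 0.0

def site_likelihood(counts, lam, p, n_max=60):
    """Likelihood of replicated counts at one site, marginalizing abundance N."""
    total = 0.0
    for n in range(max(counts), n_max + 1):
        term = poisson_pmf(n, lam)        # prior weight on abundance N = n
        for y in counts:
            term *= binom_pmf(y, n, p)    # each visit detects binomial(n, p)
        total += term
    return total

# Counts from three visits to one plot; lam and p are illustrative.
print(site_likelihood([3, 2, 4], lam=5.0, p=0.6))
```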

  16. Estimating detection and density of the Andean cat in the high Andes

    USGS Publications Warehouse

    Reppucci, J.; Gardner, B.; Lucherini, M.

    2011-01-01

    The Andean cat (Leopardus jacobita) is one of the most endangered, yet least known, felids. Although the Andean cat is considered at risk of extinction, rigorous quantitative population studies are lacking. Because physical observations of the Andean cat are difficult to make in the wild, we used a camera-trapping array to photo-capture individuals. The survey was conducted in northwestern Argentina at an elevation of approximately 4,200 m during October-December 2006 and April-June 2007. In each year we deployed 22 pairs of camera traps, which were strategically placed. To estimate detection probability and density we applied models for spatial capture-recapture using a Bayesian framework. Estimated densities were 0.07 and 0.12 individual/km2 for 2006 and 2007, respectively. Mean baseline detection probability was estimated at 0.07. By comparison, densities of the Pampas cat (Leopardus colocolo), another poorly known felid that shares its habitat with the Andean cat, were estimated at 0.74-0.79 individual/km2 in the same study area for 2006 and 2007, and its detection probability was estimated at 0.02. Despite having greater detectability, the Andean cat is rarer in the study region than the Pampas cat. Properly accounting for the detection probability is important in making reliable estimates of density, a key parameter in conservation and management decisions for any species. © 2011 American Society of Mammalogists.

  17. Estimating detection and density of the Andean cat in the high Andes

    USGS Publications Warehouse

    Reppucci, Juan; Gardner, Beth; Lucherini, Mauro

    2011-01-01

    The Andean cat (Leopardus jacobita) is one of the most endangered, yet least known, felids. Although the Andean cat is considered at risk of extinction, rigorous quantitative population studies are lacking. Because physical observations of the Andean cat are difficult to make in the wild, we used a camera-trapping array to photo-capture individuals. The survey was conducted in northwestern Argentina at an elevation of approximately 4,200 m during October–December 2006 and April–June 2007. In each year we deployed 22 pairs of camera traps, which were strategically placed. To estimate detection probability and density we applied models for spatial capture–recapture using a Bayesian framework. Estimated densities were 0.07 and 0.12 individual/km2 for 2006 and 2007, respectively. Mean baseline detection probability was estimated at 0.07. By comparison, densities of the Pampas cat (Leopardus colocolo), another poorly known felid that shares its habitat with the Andean cat, were estimated at 0.74–0.79 individual/km2 in the same study area for 2006 and 2007, and its detection probability was estimated at 0.02. Despite having greater detectability, the Andean cat is rarer in the study region than the Pampas cat. Properly accounting for the detection probability is important in making reliable estimates of density, a key parameter in conservation and management decisions for any species.

  18. A method of real-time fault diagnosis for power transformers based on vibration analysis

    NASA Astrophysics Data System (ADS)

    Hong, Kaixing; Huang, Hai; Zhou, Jianping; Shen, Yimin; Li, Yujie

    2015-11-01

    In this paper, a novel probability-based classification model is proposed for real-time fault detection of power transformers. First, the transformer vibration principle is introduced, and two effective feature extraction techniques are presented. Next, the details of the classification model based on support vector machine (SVM) are shown. The model also includes a binary decision tree (BDT) which divides transformers into different classes according to health state. The trained model produces posterior probabilities of membership to each predefined class for a tested vibration sample. During the experiments, the vibrations of transformers under different conditions are acquired, and the corresponding feature vectors are used to train the SVM classifiers. The effectiveness of this model is illustrated experimentally on typical in-service transformers. The consistency between the results of the proposed model and the actual condition of the test transformers indicates that the model can be used as a reliable method for transformer fault detection.

  19. Estimating abundance of mountain lions from unstructured spatial sampling

    USGS Publications Warehouse

    Russell, Robin E.; Royle, J. Andrew; Desimone, Richard; Schwartz, Michael K.; Edwards, Victoria L.; Pilgrim, Kristy P.; Mckelvey, Kevin S.

    2012-01-01

    Mountain lions (Puma concolor) are often difficult to monitor because of their low capture probabilities, extensive movements, and large territories. Methods for estimating the abundance of this species are needed to assess population status, determine harvest levels, evaluate the impacts of management actions on populations, and derive conservation and management strategies. Traditional mark–recapture methods do not explicitly account for differences in individual capture probabilities due to the spatial distribution of individuals in relation to survey effort (or trap locations). However, recent advances in the analysis of capture–recapture data have produced methods estimating abundance and density of animals from spatially explicit capture–recapture data that account for heterogeneity in capture probabilities due to the spatial organization of individuals and traps. We adapt recently developed spatial capture–recapture models to estimate density and abundance of mountain lions in western Montana. Volunteers and state agency personnel collected mountain lion DNA samples in portions of the Blackfoot drainage (7,908 km2) in west-central Montana using 2 methods: snow back-tracking mountain lion tracks to collect hair samples and biopsy darting treed mountain lions to obtain tissue samples. Overall, we recorded 72 individual capture events, including captures both with and without tissue sample collection and hair samples resulting in the identification of 50 individual mountain lions (30 females, 19 males, and 1 unknown sex individual). We estimated lion densities from 8 models containing effects of distance, sex, and survey effort on detection probability. 
Our population density estimates ranged from a minimum of 3.7 mountain lions/100 km2 (95% CI 2.3–5.7) under the distance-only model (including only an effect of distance on detection probability) to 6.7 (95% CI 3.1–11.0) under the full model (including effects of distance, sex, survey effort, and distance × sex on detection probability). These numbers translate to a total estimate of 293 mountain lions (95% CI 182–451) to 529 (95% CI 245–870) within the Blackfoot drainage. Results from the distance model are similar to previous estimates of 3.6 mountain lions/100 km2 for the study area; however, results from all other models indicated greater numbers of mountain lions. Our results indicate that unstructured spatial sampling combined with spatial capture–recapture analysis can be an effective method for estimating large carnivore densities.
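
    Spatial capture–recapture models let detection probability decline with the distance between an animal's activity center and a trap or sample location. A common choice (one possibility; the abstract does not state the exact form used) is a half-normal detection function, sketched below with illustrative parameter values rather than estimates from this study.

```python
import math

def scr_detection(d_km, p0=0.2, sigma_km=3.0):
    """Half-normal detection function: encounter probability for a sample
    location at distance d_km from an individual's activity center.
    p0 (baseline) and sigma_km (spatial scale) are illustrative values,
    not estimates from the Blackfoot drainage study."""
    return p0 * math.exp(-d_km ** 2 / (2.0 * sigma_km ** 2))
```

    At the activity center the function returns the baseline p0 and decays smoothly toward zero with distance; the "distance-only" versus "full" models in the abstract correspond to whether p0 is constant or depends on covariates such as sex and survey effort.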

  20. Estimating site occupancy, colonization, and local extinction when a species is detected imperfectly

    USGS Publications Warehouse

    MacKenzie, D.I.; Nichols, J.D.; Hines, J.E.; Knutson, M.G.; Franklin, A.B.

    2003-01-01

Few species are likely to be so evident that they will always be detected when present: failing to allow for the possibility that a target species was present, but undetected, at a site will lead to biased estimates of site occupancy, colonization, and local extinction probabilities. These population vital rates are often of interest in long-term monitoring programs and metapopulation studies. We present a model that enables direct estimation of these parameters when the probability of detecting the species is less than 1. The model does not require any assumptions of process stationarity, as do some previous methods, but does require detection/nondetection data to be collected in a manner similar to Pollock's robust design as used in mark–recapture studies. Via simulation, we show that the model provides good estimates of parameters for most scenarios considered. We illustrate the method with data from monitoring programs of Northern Spotted Owls (Strix occidentalis caurina) in northern California and tiger salamanders (Ambystoma tigrinum) in Minnesota, USA.
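
    The per-site likelihood such a model evaluates can be sketched with a forward recursion over the latent occupied/unoccupied state, using the robust design described above (repeat surveys within seasons, occupancy transitions between seasons). This is a minimal sketch assuming one constant detection probability p; the published model allows far richer parameterizations.

```python
def season_prob(history, p):
    """P(detection history | occupied) and P(history | unoccupied)
    for one season of repeat surveys; 1 = detected, 0 = not detected."""
    p_occ = 1.0
    for y in history:
        p_occ *= p if y else (1.0 - p)
    p_unocc = 0.0 if any(history) else 1.0
    return p_occ, p_unocc

def site_likelihood(histories, psi1, gamma, eps, p):
    """Forward recursion over the latent occupancy state, with
    colonization (gamma) and local extinction (eps) between seasons."""
    occ, unocc = psi1, 1.0 - psi1          # initial state probabilities
    for t, h in enumerate(histories):
        po, pu = season_prob(h, p)
        occ, unocc = occ * po, unocc * pu   # absorb this season's data
        if t < len(histories) - 1:          # transition to next season
            occ, unocc = (occ * (1.0 - eps) + unocc * gamma,
                          occ * eps + unocc * (1.0 - gamma))
    return occ + unocc
```

    With perfect detection (p = 1), a site detected in season one and missed in season two contributes exactly psi1 × eps, the probability of initial occupancy followed by local extinction.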

  1. Disentangling sampling and ecological explanations underlying species-area relationships

    USGS Publications Warehouse

    Cam, E.; Nichols, J.D.; Hines, J.E.; Sauer, J.R.; Alpizar-Jara, R.; Flather, C.H.

    2002-01-01

    We used a probabilistic approach to address the influence of sampling artifacts on the form of species-area relationships (SARs). We developed a model in which the increase in observed species richness is a function of sampling effort exclusively. We assumed that effort depends on area sampled, and we generated species-area curves under that model. These curves can be realistic looking. We then generated SARs from avian data, comparing SARs based on counts with those based on richness estimates. We used an approach to estimation of species richness that accounts for species detection probability and, hence, for variation in sampling effort. The slopes of SARs based on counts are steeper than those of curves based on estimates of richness, indicating that the former partly reflect failure to account for species detection probability. SARs based on estimates reflect ecological processes exclusively, not sampling processes. This approach permits investigation of ecologically relevant hypotheses. The slope of SARs is not influenced by the slope of the relationship between habitat diversity and area. In situations in which not all of the species are detected during sampling sessions, approaches to estimation of species richness integrating species detection probability should be used to investigate the rate of increase in species richness with area.
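
    The sampling-only model in the abstract is easy to illustrate: if each of S species is detected independently with some small probability per unit of effort, and effort grows with area, observed richness rises with area even though true richness is constant. Parameter values below are made up for illustration.

```python
def expected_observed_richness(total_species, per_unit_detect_p, effort_units):
    """Expected species count when each species is detected independently
    with probability per_unit_detect_p in each unit of sampling effort."""
    return total_species * (1.0 - (1.0 - per_unit_detect_p) ** effort_units)

# Effort assumed proportional to area: a sampling-only "species-area curve".
curve = [(a, expected_observed_richness(100, 0.05, a)) for a in (1, 2, 4, 8, 16, 32)]
```

    The resulting curve increases steeply at first and saturates toward the true richness, which is why count-based SARs can look realistic while reflecting only sampling effort.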

  2. Occupancy models for monitoring marine fish: a bayesian hierarchical approach to model imperfect detection with a novel gear combination.

    PubMed

    Coggins, Lewis G; Bacheler, Nathan M; Gwinn, Daniel C

    2014-01-01

Occupancy models using incidence data collected repeatedly at sites across the range of a population are increasingly employed to infer patterns and processes influencing population distribution and dynamics. While such work is common in terrestrial systems, fewer examples exist in marine applications. This disparity likely exists because the replicate samples required by these models to account for imperfect detection are often impractical to obtain when surveying aquatic organisms, particularly fishes. We employ simultaneous sampling using fish traps and novel underwater camera observations to generate the requisite replicate samples for occupancy models of red snapper, a reef fish species. Since the replicate samples are collected simultaneously by multiple sampling devices, many typical problems encountered when obtaining replicate observations are avoided. Our results suggest that augmenting traditional fish trap sampling with camera observations not only doubled the probability of detecting red snapper in reef habitats off the Southeast coast of the United States, but supplied the necessary observations to infer factors influencing population distribution and abundance while accounting for imperfect detection. We found that detection probabilities tended to be higher for camera traps than for traditional fish traps. Furthermore, camera trap detections were influenced by the current direction and turbidity of the water, indicating that collecting data on these variables is important for future monitoring. These models indicate that the distribution and abundance of this species are more heavily influenced by latitude and depth than by micro-scale reef characteristics, lending credence to previous characterizations of red snapper as a reef habitat generalist. 
This study demonstrates the utility of simultaneous sampling devices, including camera traps, in aquatic environments to inform occupancy models and account for imperfect detection when describing factors influencing fish population distribution and dynamics.
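
    The roughly doubled detection rate from pairing cameras with traps is what independent gears predict when each gear's probability is small. A sketch under the assumption that the two gears detect independently given presence:

```python
def combined_detection(p_trap, p_camera):
    """Probability that at least one gear detects the species,
    assuming the two gears operate independently given presence."""
    return 1.0 - (1.0 - p_trap) * (1.0 - p_camera)
```

    For small probabilities the combined value is close to p_trap + p_camera, so adding a second gear of similar effectiveness roughly doubles detectability; the overlap term shrinks the gain as either probability grows.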

  3. Occupancy Models for Monitoring Marine Fish: A Bayesian Hierarchical Approach to Model Imperfect Detection with a Novel Gear Combination

    PubMed Central

    Coggins, Lewis G.; Bacheler, Nathan M.; Gwinn, Daniel C.

    2014-01-01

Occupancy models using incidence data collected repeatedly at sites across the range of a population are increasingly employed to infer patterns and processes influencing population distribution and dynamics. While such work is common in terrestrial systems, fewer examples exist in marine applications. This disparity likely exists because the replicate samples required by these models to account for imperfect detection are often impractical to obtain when surveying aquatic organisms, particularly fishes. We employ simultaneous sampling using fish traps and novel underwater camera observations to generate the requisite replicate samples for occupancy models of red snapper, a reef fish species. Since the replicate samples are collected simultaneously by multiple sampling devices, many typical problems encountered when obtaining replicate observations are avoided. Our results suggest that augmenting traditional fish trap sampling with camera observations not only doubled the probability of detecting red snapper in reef habitats off the Southeast coast of the United States, but supplied the necessary observations to infer factors influencing population distribution and abundance while accounting for imperfect detection. We found that detection probabilities tended to be higher for camera traps than for traditional fish traps. Furthermore, camera trap detections were influenced by the current direction and turbidity of the water, indicating that collecting data on these variables is important for future monitoring. These models indicate that the distribution and abundance of this species are more heavily influenced by latitude and depth than by micro-scale reef characteristics, lending credence to previous characterizations of red snapper as a reef habitat generalist. 
This study demonstrates the utility of simultaneous sampling devices, including camera traps, in aquatic environments to inform occupancy models and account for imperfect detection when describing factors influencing fish population distribution and dynamics. PMID:25255325

  4. Landscape characteristics influence pond occupancy by frogs after accounting for detectability

    USGS Publications Warehouse

    Mazerolle, M.J.; Desrochers, A.; Rochefort, L.

    2005-01-01

Many investigators have hypothesized that landscape attributes such as the amount and proximity of habitat are important for amphibian spatial patterns. This has produced a number of studies focusing on the effects of landscape characteristics on amphibian patterns of occurrence in patches or ponds, most of which conclude that the landscape is important. We identified two concerns associated with these studies: one deals with their applicability to other landscape types, as most have been conducted in agricultural landscapes; the other highlights the need to account for the probability of detection. We tested the hypothesis that landscape characteristics influence spatial patterns of amphibian occurrence at ponds after accounting for the probability of detection in little-studied peatland landscapes undergoing peat mining. We also illustrated the costs of not accounting for the probability of detection by comparing our results to conventional logistic regression analyses. Results indicate that frog occurrence increased with the percent cover of ponds within 100, 250, and 1000 m, as well as the amount of forest cover within 1000 m. However, forest cover at 250 m had a negative influence on frog presence at ponds. Not accounting for the probability of detection resulted in underestimating the influence of most variables on frog occurrence, whereas a few were overestimated. Regardless, we show that conventional logistic regression can lead to different conclusions than analyses accounting for detectability. Our study is consistent with the hypothesis that landscape characteristics are important in determining the spatial patterns of frog occurrence at ponds. We strongly recommend estimating the probability of detection in field surveys, as this will increase the quality and conservation potential of models derived from such data. © 2005 by the Ecological Society of America.

  5. A multi-level anomaly detection algorithm for time-varying graph data with interactive visualization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bridges, Robert A.; Collins, John P.; Ferragut, Erik M.

This work presents a novel modeling and analysis framework for graph sequences which addresses the challenge of detecting and contextualizing anomalies in labelled, streaming graph data. We introduce a generalization of the BTER model of Seshadhri et al. by adding flexibility to community structure, and use this model to perform multi-scale graph anomaly detection. Specifically, probability models describing coarse subgraphs are built by aggregating node probabilities, and these related hierarchical models simultaneously detect deviations from expectation. This technique provides insight into a graph's structure and internal context that may shed light on a detected event. Additionally, this multi-scale analysis facilitates intuitive visualizations by allowing users to narrow focus from an anomalous graph to particular subgraphs or nodes causing the anomaly. For evaluation, two hierarchical anomaly detectors are tested against a baseline Gaussian method on a series of sampled graphs. We demonstrate that our graph statistics-based approach outperforms both a distribution-based detector and the baseline in a labeled setting with community structure, and it accurately detects anomalies in synthetic and real-world datasets at the node, subgraph, and graph levels. Furthermore, to illustrate the accessibility of information made possible via this technique, the anomaly detector and an associated interactive visualization tool are tested on NCAA football data, where teams and conferences that moved within the league are identified with perfect recall, and precision greater than 0.786.

  6. A multi-level anomaly detection algorithm for time-varying graph data with interactive visualization

    DOE PAGES

    Bridges, Robert A.; Collins, John P.; Ferragut, Erik M.; ...

    2016-01-01

This work presents a novel modeling and analysis framework for graph sequences which addresses the challenge of detecting and contextualizing anomalies in labelled, streaming graph data. We introduce a generalization of the BTER model of Seshadhri et al. by adding flexibility to community structure, and use this model to perform multi-scale graph anomaly detection. Specifically, probability models describing coarse subgraphs are built by aggregating node probabilities, and these related hierarchical models simultaneously detect deviations from expectation. This technique provides insight into a graph's structure and internal context that may shed light on a detected event. Additionally, this multi-scale analysis facilitates intuitive visualizations by allowing users to narrow focus from an anomalous graph to particular subgraphs or nodes causing the anomaly. For evaluation, two hierarchical anomaly detectors are tested against a baseline Gaussian method on a series of sampled graphs. We demonstrate that our graph statistics-based approach outperforms both a distribution-based detector and the baseline in a labeled setting with community structure, and it accurately detects anomalies in synthetic and real-world datasets at the node, subgraph, and graph levels. Furthermore, to illustrate the accessibility of information made possible via this technique, the anomaly detector and an associated interactive visualization tool are tested on NCAA football data, where teams and conferences that moved within the league are identified with perfect recall, and precision greater than 0.786.

  7. Evaluating species richness: biased ecological inference results from spatial heterogeneity in species detection probabilities

    USGS Publications Warehouse

    McNew, Lance B.; Handel, Colleen M.

    2015-01-01

    Accurate estimates of species richness are necessary to test predictions of ecological theory and evaluate biodiversity for conservation purposes. However, species richness is difficult to measure in the field because some species will almost always be overlooked due to their cryptic nature or the observer's failure to perceive their cues. Common measures of species richness that assume consistent observability across species are inviting because they may require only single counts of species at survey sites. Single-visit estimation methods ignore spatial and temporal variation in species detection probabilities related to survey or site conditions that may confound estimates of species richness. We used simulated and empirical data to evaluate the bias and precision of raw species counts, the limiting forms of jackknife and Chao estimators, and multi-species occupancy models when estimating species richness to evaluate whether the choice of estimator can affect inferences about the relationships between environmental conditions and community size under variable detection processes. Four simulated scenarios with realistic and variable detection processes were considered. Results of simulations indicated that (1) raw species counts were always biased low, (2) single-visit jackknife and Chao estimators were significantly biased regardless of detection process, (3) multispecies occupancy models were more precise and generally less biased than the jackknife and Chao estimators, and (4) spatial heterogeneity resulting from the effects of a site covariate on species detection probabilities had significant impacts on the inferred relationships between species richness and a spatially explicit environmental condition. For a real dataset of bird observations in northwestern Alaska, the four estimation methods produced different estimates of local species richness, which severely affected inferences about the effects of shrubs on local avian richness. 
Overall, our results indicate that neglecting the effects of site covariates on species detection probabilities may lead to significant bias in estimation of species richness, as well as the inferred relationships between community size and environmental covariates.
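
    For reference, the Chao estimator evaluated above is simple to compute from abundance data: it corrects the raw species count using the numbers of singleton and doubleton species. It is a lower bound, and, as the abstract shows, still biased when detection probabilities vary across sites.

```python
def chao1(abundances):
    """Chao1 lower-bound richness estimator from species abundance counts."""
    s_obs = sum(1 for a in abundances if a > 0)   # observed richness
    f1 = sum(1 for a in abundances if a == 1)     # singletons
    f2 = sum(1 for a in abundances if a == 2)     # doubletons
    if f2 == 0:
        return s_obs + f1 * (f1 - 1) / 2.0        # bias-corrected form
    return s_obs + f1 * f1 / (2.0 * f2)
```

    Raw counts (s_obs alone) are always biased low when detection is imperfect, which is the first simulation result reported in the abstract.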

  8. Spectral dimension controlling the decay of the quantum first-detection probability

    NASA Astrophysics Data System (ADS)

    Thiel, Felix; Kessler, David A.; Barkai, Eli

    2018-06-01

We consider a quantum system that is initially localized at x_in and that is repeatedly projectively probed with a fixed period τ at position x_d. We ask for the probability F_n that the system is detected at x_d for the very first time, where n is the number of detection attempts. We relate the asymptotic decay and oscillations of F_n with the system's energy spectrum, which is assumed to be absolutely continuous. In particular, F_n is determined by the Hamiltonian's measurement spectral density of states (MSDOS) f(E), which is closely related to the density of energy states (DOS). We find that F_n decays like a power law whose exponent is determined by the power-law exponent d_S of f(E) around its singularities E*. Our findings are analogous to the classical first-passage theory of random walks. In contrast to the classical case, the decay of F_n is accompanied by oscillations with frequencies that are determined by the singularities E*. This gives rise to critical detection periods τ_c at which the oscillations disappear. In the ordinary case d_S can be identified with the spectral dimension associated with the DOS. Furthermore, the singularities E* are the van Hove singularities of the DOS in this case. We find that the asymptotic statistics of F_n depend crucially on the initial and detection state and can be wildly different for out-of-the-ordinary states, which is in sharp contrast to the classical theory. The properties of the first-detection probabilities can alternatively be derived from the transition amplitudes. All our results are confirmed by numerical simulations of the tight-binding model, and of a free particle in continuous space both with a normal and with an anomalous dispersion relation. We provide explicit asymptotic formulas for the first-detection probability in these models.
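
    The first-detection probabilities can be computed numerically from the quantum renewal equation, phi_n = <d|U^n|psi_in> − Σ_{m=1}^{n−1} <d|U^{n−m}|d> phi_m with F_n = |phi_n|² and U = exp(−iHτ). The sketch below works for any finite Hamiltonian matrix; a finite system only approximates the absolutely continuous spectrum assumed in the paper.

```python
import numpy as np

def first_detection_probs(H, psi_in, d_index, tau, n_max):
    """F_n for stroboscopic projective detection attempts at site d_index,
    one attempt every tau, computed via the renewal equation."""
    evals, evecs = np.linalg.eigh(H)
    U = evecs @ np.diag(np.exp(-1j * evals * tau)) @ evecs.conj().T
    d = np.zeros(len(psi_in), dtype=complex)
    d[d_index] = 1.0
    # amplitudes <d|U^n|psi_in> and <d|U^n|d> for n = 1..n_max
    v_in, v_d = psi_in.astype(complex), d.copy()
    a_in, a_dd = [], []
    for _ in range(n_max):
        v_in = U @ v_in
        v_d = U @ v_d
        a_in.append(v_in[d_index])
        a_dd.append(v_d[d_index])
    phi = []
    for n in range(1, n_max + 1):
        amp = a_in[n - 1] - sum(a_dd[n - m - 1] * phi[m - 1] for m in range(1, n))
        phi.append(amp)
    return np.array([abs(x) ** 2 for x in phi])
```

    The cumulative sum of F_n never exceeds 1; its complement is the survival (never-detected) probability.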

  9. Combining inferences from models of capture efficiency, detectability, and suitable habitat to classify landscapes for conservation of threatened bull trout

    USGS Publications Warehouse

    Peterson, J.; Dunham, J.B.

    2003-01-01

    Effective conservation efforts for at-risk species require knowledge of the locations of existing populations. Species presence can be estimated directly by conducting field-sampling surveys or alternatively by developing predictive models. Direct surveys can be expensive and inefficient, particularly for rare and difficult-to-sample species, and models of species presence may produce biased predictions. We present a Bayesian approach that combines sampling and model-based inferences for estimating species presence. The accuracy and cost-effectiveness of this approach were compared to those of sampling surveys and predictive models for estimating the presence of the threatened bull trout ( Salvelinus confluentus ) via simulation with existing models and empirical sampling data. Simulations indicated that a sampling-only approach would be the most effective and would result in the lowest presence and absence misclassification error rates for three thresholds of detection probability. When sampling effort was considered, however, the combined approach resulted in the lowest error rates per unit of sampling effort. Hence, lower probability-of-detection thresholds can be specified with the combined approach, resulting in lower misclassification error rates and improved cost-effectiveness.
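
    The combined sampling-plus-model approach rests on a simple Bayes update: a model-based prior probability of presence is revised downward after surveys fail to detect the species. A sketch, assuming a known per-survey detection probability:

```python
def posterior_presence(prior_psi, detect_p, n_failed_surveys):
    """Posterior probability the species is present after n_failed_surveys
    surveys with no detections, by Bayes' rule."""
    miss = (1.0 - detect_p) ** n_failed_surveys
    return prior_psi * miss / (prior_psi * miss + 1.0 - prior_psi)
```

    A site can be classified as "absent" once this posterior falls below a chosen threshold, which is how the detection-probability thresholds in the abstract translate into required sampling effort.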

  10. A new understanding of multiple-pulsed laser-induced retinal injury thresholds.

    PubMed

    Lund, David J; Sliney, David H

    2014-04-01

    Laser safety standards committees have struggled for years to formulate adequately a sound method for treating repetitive-pulse laser exposures. Safety standards for lamps and LEDs have ignored this issue because averaged irradiance appeared to treat the issue adequately for large retinal image sizes and skin exposures. Several authors have recently questioned the current approach of three test conditions (i.e., limiting single-pulse exposure, average irradiance, and a single-pulse-reduction factor) as still insufficient to treat pulses of unequal energies or certain pulse groupings. Schulmeister et al. employed thermal modeling to show that a total-on-time pulse (TOTP) rule was conservative. Lund further developed the approach of probability summation proposed by Menendez et al. to explain pulse-additivity, whereby additivity is the result of an increasing probability of detecting injury with multiple pulse exposures. This latter argument relates the increase in detection probability to the slope of the probit curve for the threshold studies. Since the uncertainty in the threshold for producing an ophthalmoscopically detectable minimal visible lesion (MVL) is large for retinal exposure to a collimated laser beam, safety committees traditionally applied large risk reduction factors ("safety factors") of one order of magnitude when deriving intrabeam, "point-source" exposure limits. This reduction factor took into account the probability of visually detecting the low-contrast lesion among other factors. The reduction factor is smaller for large spot sizes where these difficulties are quite reduced. Thus the N⁻⁰·²⁵ reduction factor may result from the difficulties in detecting the lesion. Recent studies on repetitive pulse exposures in both animal and in vitro (retinal explant) models support this interpretation of the available data.
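
    Probability summation treats each pulse as an independent chance of producing a detectable lesion, so the n-pulse probability is 1 − (1 − P1)^n; combined with a probit (log-dose normal) single-pulse curve, this lowers the apparent multiple-pulse threshold. The parameters below are illustrative, not published injury thresholds.

```python
from math import erf, log10, sqrt

def single_pulse_p(dose, ed50=1.0, probit_slope=4.0):
    """Probit dose-response: P(detectable lesion | one pulse).
    ed50 and probit_slope are illustrative, not measured values."""
    z = probit_slope * (log10(dose) - log10(ed50))
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def n_pulse_p(dose, n, ed50=1.0, probit_slope=4.0):
    """Probability summation over n independent pulses."""
    return 1.0 - (1.0 - single_pulse_p(dose, ed50, probit_slope)) ** n
```

    The shallower the probit slope (i.e., the more uncertain lesion detection is), the faster the n-pulse threshold drops with n, which is the mechanism the abstract offers for the N⁻⁰·²⁵ reduction factor.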

  11. Occupancy modeling of autonomously recorded vocalizations to predict distribution of rallids in tidal wetlands

    USGS Publications Warehouse

    Stiffler, Lydia L.; Anderson, James T.; Katzner, Todd

    2018-01-01

    Conservation and management for a species requires reliable information on its status, distribution, and habitat use. We identified occupancy and distributions of king (Rallus elegans) and clapper (R. crepitans) rail populations in marsh complexes along the Pamunkey and Mattaponi Rivers in Virginia, USA by modeling data on vocalizations recorded from autonomous recording units (ARUs). Occupancy probability for both species combined was 0.64 (95% CI: 0.53, 0.75) in marshes along the Pamunkey and 0.59 (0.45, 0.72) in marshes along the Mattaponi. Occupancy probability along the Pamunkey was strongly influenced by salinity, increasing logistically by a factor of 1.62 (0.6, 2.65) per parts per thousand of salinity. In contrast, there was not a strong salinity gradient on the Mattaponi and therefore vegetative community structure determined occupancy probability on that river. Estimated detection probability across both marshes was 0.63 (0.62, 0.65), but detection rates decreased as the season progressed. Monitoring wildlife within wetlands presents unique challenges for conservation managers. Our findings provide insight not only into how rails responded to environmental variation but also into the general utility of ARUs for occupancy modeling of the distribution and habitat associations of rails within tidal marsh systems.

  12. DECHADE: DEtecting slight Changes with HArd DEcisions in Wireless Sensor Networks

    NASA Astrophysics Data System (ADS)

    Ciuonzo, D.; Salvo Rossi, P.

    2018-07-01

    This paper focuses on the problem of change detection through a Wireless Sensor Network (WSN) whose nodes report only binary decisions (on the presence/absence of a certain event to be monitored), due to bandwidth/energy constraints. The resulting problem can be modelled as testing the equality of samples drawn from independent Bernoulli probability mass functions, when the bit probabilities under both hypotheses are not known. Both One-Sided (OS) and Two-Sided (TS) tests are considered, with reference to: (i) identical bit probability (a homogeneous scenario), (ii) different per-sensor bit probabilities (a non-homogeneous scenario) and (iii) regions with identical bit probability (a block-homogeneous scenario) for the observed samples. The goal is to provide a systematic framework collecting a plethora of viable detectors (designed via theoretically founded criteria) which can be used for each instance of the problem. Finally, verification of the derived detectors in two relevant WSN-related problems is provided to show the appeal of the proposed framework.
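
    For the homogeneous scenario, testing whether two blocks of binary decisions share the same bit probability can be done with a generalized likelihood ratio test. The sketch below is a generic two-sample version for illustration, not one of the specific detectors derived in the paper.

```python
from math import log

def bernoulli_ll(k, n, p):
    """Log-likelihood of k successes in n Bernoulli trials."""
    if p <= 0.0:
        return 0.0 if k == 0 else float("-inf")
    if p >= 1.0:
        return 0.0 if k == n else float("-inf")
    return k * log(p) + (n - k) * log(1.0 - p)

def glrt_two_sample(k1, n1, k2, n2):
    """GLRT statistic for H0: p1 == p2 (both unknown) vs. p1 != p2;
    asymptotically chi-square with 1 degree of freedom under H0."""
    p_pool = (k1 + k2) / (n1 + n2)
    ll0 = bernoulli_ll(k1, n1, p_pool) + bernoulli_ll(k2, n2, p_pool)
    ll1 = bernoulli_ll(k1, n1, k1 / n1) + bernoulli_ll(k2, n2, k2 / n2)
    return 2.0 * (ll1 - ll0)
```

    One-sided variants restrict the alternative's maximization to p1 > p2, and the block-homogeneous scenario applies the same idea per region.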

  13. Universal phase transition in community detectability under a stochastic block model.

    PubMed

    Chen, Pin-Yu; Hero, Alfred O

    2015-03-01

We prove the existence of an asymptotic phase-transition threshold on community detectability for the spectral modularity method [M. E. J. Newman, Phys. Rev. E 74, 036104 (2006) and Proc. Natl. Acad. Sci. (USA) 103, 8577 (2006)] under a stochastic block model. The phase transition on community detectability occurs as the intercommunity edge connection probability p grows. This phase transition separates a subcritical regime of small p, where modularity-based community detection successfully identifies the communities, from a supercritical regime of large p where successful community detection is impossible. We show that, as the community sizes become large, the asymptotic phase-transition threshold p* is equal to √(p1 p2), where pi (i = 1, 2) is the within-community edge connection probability. Thus the phase-transition threshold is universal in the sense that it does not depend on the ratio of community sizes. The universal phase-transition phenomenon is validated by simulations for moderately sized communities. Using the derived expression for the phase-transition threshold, we propose an empirical method for estimating this threshold from real-world data.
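
    The derived threshold is straightforward to apply: communities are asymptotically detectable by spectral modularity when the between-community edge probability stays below the geometric mean of the two within-community probabilities.

```python
from math import sqrt

def detectability_threshold(p1, p2):
    """Asymptotic phase-transition threshold p* = sqrt(p1 * p2) on the
    between-community edge probability for spectral modularity detection."""
    return sqrt(p1 * p2)

def regime(p_between, p1, p2):
    """Classify whether community detection is asymptotically feasible."""
    if p_between < detectability_threshold(p1, p2):
        return "subcritical (detectable)"
    return "supercritical (undetectable)"
```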

  14. Integrating count and detection–nondetection data to model population dynamics

    USGS Publications Warehouse

    Zipkin, Elise F.; Rossman, Sam; Yackulic, Charles B.; Wiens, David; Thorson, James T.; Davis, Raymond J.; Grant, Evan H. Campbell

    2017-01-01

    There is increasing need for methods that integrate multiple data types into a single analytical framework as the spatial and temporal scale of ecological research expands. Current work on this topic primarily focuses on combining capture–recapture data from marked individuals with other data types into integrated population models. Yet, studies of species distributions and trends often rely on data from unmarked individuals across broad scales where local abundance and environmental variables may vary. We present a modeling framework for integrating detection–nondetection and count data into a single analysis to estimate population dynamics, abundance, and individual detection probabilities during sampling. Our dynamic population model assumes that site-specific abundance can change over time according to survival of individuals and gains through reproduction and immigration. The observation process for each data type is modeled by assuming that every individual present at a site has an equal probability of being detected during sampling processes. We examine our modeling approach through a series of simulations illustrating the relative value of count vs. detection–nondetection data under a variety of parameter values and survey configurations. We also provide an empirical example of the model by combining long-term detection–nondetection data (1995–2014) with newly collected count data (2015–2016) from a growing population of Barred Owl (Strix varia) in the Pacific Northwest to examine the factors influencing population abundance over time. Our model provides a foundation for incorporating unmarked data within a single framework, even in cases where sampling processes yield different detection probabilities. This approach will be useful for survey design and to researchers interested in incorporating historical or citizen science data into analyses focused on understanding how demographic rates drive population abundance.
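
    The key to integrating the two data types is that both observation models share the same individual detection probability p: a count is binomial in the local abundance N, while a detection/nondetection record is a Bernoulli outcome whose success probability is that of detecting at least one of the N individuals. A sketch of the two likelihood contributions:

```python
from math import comb

def count_likelihood(c, N, p):
    """P(count = c | N individuals, each detected with probability p)."""
    return comb(N, c) * p ** c * (1.0 - p) ** (N - c)

def detection_likelihood(y, N, p):
    """P(detection indicator y | N, p): the site registers a detection
    if at least one of the N individuals present is detected."""
    p_star = 1.0 - (1.0 - p) ** N
    return p_star if y == 1 else 1.0 - p_star
```

    Because both expressions involve the same N and p, detection–nondetection data and counts can inform a single dynamic abundance model, as in the Barred Owl example.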

  15. Camera-Model Identification Using Markovian Transition Probability Matrix

    NASA Astrophysics Data System (ADS)

    Xu, Guanshuo; Gao, Shang; Shi, Yun Qing; Hu, Ruimin; Su, Wei

Detecting the brands and models of digital cameras from given digital images has become a popular research topic in the field of digital forensics. As most images are JPEG-compressed before they are output from cameras, we propose to use an effective image statistical model to characterize the difference JPEG 2-D arrays of Y and Cb components from the JPEG images taken by various camera models. Specifically, the transition probability matrices derived from four different directional Markov processes applied to the image difference JPEG 2-D arrays are used to identify statistical differences caused by image formation pipelines inside different camera models. All elements of the transition probability matrices, after a thresholding technique, are directly used as features for classification purposes. Multi-class support vector machines (SVM) are used as the classification tool. The effectiveness of our proposed statistical model is demonstrated by large-scale experimental results.
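
    A sketch of the feature construction: threshold (clip) a difference 2-D array, then estimate a one-step Markov transition probability matrix whose flattened entries become the SVM feature vector. This shows the horizontal direction only; the paper uses four directional Markov processes on Y and Cb difference arrays, and the threshold value here is illustrative.

```python
import numpy as np

def transition_matrix_features(arr2d, T=4):
    """Clip a difference 2-D array to [-T, T], then estimate the one-step
    horizontal Markov transition probability matrix; the flattened matrix
    is a (2T+1)^2-dimensional feature vector for classification."""
    a = np.clip(arr2d, -T, T) + T             # shift values into 0..2T
    K = 2 * T + 1
    M = np.zeros((K, K))
    for row in a:
        for u, v in zip(row[:-1], row[1:]):   # horizontal neighbor pairs
            M[int(u), int(v)] += 1.0
    row_sums = M.sum(axis=1, keepdims=True)   # normalize rows to probabilities
    M = np.divide(M, row_sums, out=np.zeros_like(M), where=row_sums > 0)
    return M.ravel()
```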

  16. Calibrating passive acoustic monitoring: correcting humpback whale call detections for site-specific and time-dependent environmental characteristics.

    PubMed

    Helble, Tyler A; D'Spain, Gerald L; Campbell, Greg S; Hildebrand, John A

    2013-11-01

    This paper demonstrates the importance of accounting for environmental effects on passive underwater acoustic monitoring results. The situation considered is the reduction in shipping off the California coast between 2008-2010 due to the recession and environmental legislation. The resulting variations in ocean noise change the probability of detecting marine mammal vocalizations. An acoustic model was used to calculate the time-varying probability of detecting humpback whale vocalizations under best-guess environmental conditions and varying noise. The uncorrected call counts suggest a diel pattern and an increase in calling over a two-year period; the corrected call counts show minimal evidence of these features.

  17. Health Monitoring of a Satellite System

    NASA Technical Reports Server (NTRS)

    Chen, Robert H.; Ng, Hok K.; Speyer, Jason L.; Guntur, Lokeshkumar S.; Carpenter, Russell

    2004-01-01

    A health monitoring system based on analytical redundancy is developed for satellites on elliptical orbits. First, the dynamics of the satellite, including orbital mechanics and attitude dynamics, is modelled as a periodic system. Then, periodic fault detection filters are designed to detect and identify the satellite's actuator and sensor faults. In addition, parity equations are constructed using the algebraic redundant relationship among the actuators and sensors. Furthermore, a residual processor is designed to generate the probability of each of the actuator and sensor faults by using a sequential probability test. Finally, the health monitoring system, consisting of periodic fault detection filters, parity equations, and the residual processor, is evaluated in simulation in the presence of disturbances and uncertainty.
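    The residual processor's sequential decision can be sketched with Wald's classical sequential probability ratio test; the error rates and the per-sample log-likelihood-ratio stream here are generic assumptions, not the paper's specific residual model:

    ```python
    import math

    def sprt(llr_stream, alpha=0.01, beta=0.01):
        """Wald sequential probability ratio test: accumulate per-sample
        log-likelihood-ratio increments (fault vs. no-fault hypotheses) and
        stop when the running sum crosses either error-rate-derived bound."""
        upper = math.log((1 - beta) / alpha)   # declare fault
        lower = math.log(beta / (1 - alpha))   # declare no fault
        s = 0.0
        for k, x in enumerate(llr_stream, 1):
            s += x
            if s >= upper:
                return "fault", k
            if s <= lower:
                return "no fault", k
        return "undecided", len(llr_stream)
    ```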

  18. Probability of cancer in pulmonary nodules detected on first screening CT.

    PubMed

    McWilliams, Annette; Tammemagi, Martin C; Mayo, John R; Roberts, Heidi; Liu, Geoffrey; Soghrati, Kam; Yasufuku, Kazuhiro; Martel, Simon; Laberge, Francis; Gingras, Michel; Atkar-Khattra, Sukhinder; Berg, Christine D; Evans, Ken; Finley, Richard; Yee, John; English, John; Nasute, Paola; Goffin, John; Puksa, Serge; Stewart, Lori; Tsai, Scott; Johnston, Michael R; Manos, Daria; Nicholas, Garth; Goss, Glenwood D; Seely, Jean M; Amjadi, Kayvan; Tremblay, Alain; Burrowes, Paul; MacEachern, Paul; Bhatia, Rick; Tsao, Ming-Sound; Lam, Stephen

    2013-09-05

    Major issues in the implementation of screening for lung cancer by means of low-dose computed tomography (CT) are the definition of a positive result and the management of lung nodules detected on the scans. We conducted a population-based prospective study to determine factors predicting the probability that lung nodules detected on the first screening low-dose CT scans are malignant or will be found to be malignant on follow-up. We analyzed data from two cohorts of participants undergoing low-dose CT screening. The development data set included participants in the Pan-Canadian Early Detection of Lung Cancer Study (PanCan). The validation data set included participants involved in chemoprevention trials at the British Columbia Cancer Agency (BCCA), sponsored by the U.S. National Cancer Institute. The final outcomes of all nodules of any size that were detected on baseline low-dose CT scans were tracked. Parsimonious and fuller multivariable logistic-regression models were prepared to estimate the probability of lung cancer. In the PanCan data set, 1871 persons had 7008 nodules, of which 102 were malignant, and in the BCCA data set, 1090 persons had 5021 nodules, of which 42 were malignant. Among persons with nodules, the rates of cancer in the two data sets were 5.5% and 3.7%, respectively. Predictors of cancer in the model included older age, female sex, family history of lung cancer, emphysema, larger nodule size, location of the nodule in the upper lobe, part-solid nodule type, lower nodule count, and spiculation. Our final parsimonious and full models showed excellent discrimination and calibration, with areas under the receiver-operating-characteristic curve of more than 0.90, even for nodules that were 10 mm or smaller in the validation set. Predictive tools based on patient and nodule characteristics can be used to accurately estimate the probability that lung nodules detected on baseline screening low-dose CT scans are malignant. 
(Funded by the Terry Fox Research Institute and others; ClinicalTrials.gov number, NCT00751660.).
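    The risk model's form can be sketched as a plain logistic regression over the predictors listed in the abstract; the coefficients are left to the caller because the published PanCan values are not reproduced here:

    ```python
    import math

    def malignancy_probability(intercept, coefs, predictors):
        """Logistic-regression form of the risk model: a linear predictor over
        the patient and nodule covariates (age, sex, family history, emphysema,
        nodule size, location, type, count, spiculation) mapped to a probability
        by the logistic function. All numbers passed in are the caller's."""
        eta = intercept + sum(b * x for b, x in zip(coefs, predictors))
        return 1.0 / (1.0 + math.exp(-eta))
    ```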

  19. Environmental DNA (eDNA) detects the invasive rusty crayfish Orconectes rusticus at low abundances.

    PubMed

    Dougherty, Matthew M; Larson, Eric R; Renshaw, Mark A; Gantz, Crysta A; Egan, Scott P; Erickson, Daniel M; Lodge, David M

    2016-06-01

    Early detection is invaluable for the cost-effective control and eradication of invasive species, yet many traditional sampling techniques are ineffective at the low population abundances found at the onset of the invasion process. Environmental DNA (eDNA) is a promising and sensitive tool for early detection of some invasive species, but its efficacy has not yet been evaluated for many taxonomic groups and habitat types. We evaluated the ability of eDNA to detect the invasive rusty crayfish Orconectes rusticus, and to reflect patterns of its relative abundance, in upper Midwest, USA, inland lakes. We paired conventional baited trapping as a measure of crayfish relative abundance with water samples for eDNA, which were analysed in the laboratory with a qPCR assay. We modelled detection probability for O. rusticus eDNA using relative abundance and site characteristics as covariates and also tested the relationship between eDNA copy number and O. rusticus relative abundance. We detected O. rusticus eDNA in all lakes where this species was collected by trapping, down to low relative abundances, as well as in two lakes where trap catch was zero. Detection probability of O. rusticus eDNA was well predicted by relative abundance of this species and lake water clarity. However, there was poor correspondence between eDNA copy number and O. rusticus relative abundance estimated by trap catches. Synthesis and applications. Our study demonstrates a field and laboratory protocol for eDNA monitoring of crayfish invasions, with results of statistical models that provide guidance on sampling effort and detection probabilities for researchers in other regions and systems. We propose that eDNA be included as a tool in surveillance for invasive or imperilled crayfishes and other benthic arthropods.

  20. Cetacean population density estimation from single fixed sensors using passive acoustics.

    PubMed

    Küsel, Elizabeth T; Mellinger, David K; Thomas, Len; Marques, Tiago A; Moretti, David; Ward, Jessica

    2011-06-01

    Passive acoustic methods are increasingly being used to estimate animal population density. Most density estimation methods are based on estimates of the probability of detecting calls as functions of distance. Typically these are obtained using receivers capable of localizing calls or from studies of tagged animals. However, both approaches are expensive to implement. The approach described here uses a Monte Carlo model to estimate the probability of detecting calls from single sensors. The passive sonar equation is used to predict signal-to-noise ratios (SNRs) of received clicks, which are then combined with a detector characterization that predicts probability of detection as a function of SNR. Input distributions for source level, beam pattern, and whale depth are obtained from the literature. Acoustic propagation modeling is used to estimate transmission loss. Other inputs for density estimation are call rate, obtained from the literature, and false positive rate, obtained from manual analysis of a data sample. The method is applied to estimate density of Blainville's beaked whales over a 6-day period around a single hydrophone located in the Tongue of the Ocean, Bahamas. Results are consistent with those from previous analyses, which use additional tag data. © 2011 Acoustical Society of America
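    A sketch of the single-sensor Monte Carlo scheme, with caller-supplied samplers standing in for the literature-derived input distributions and detector characterization:

    ```python
    import random

    def mc_detection_probability(n_draws, draw_sl, draw_tl, draw_nl,
                                 p_detect_given_snr, seed=0):
        """Monte Carlo estimate of single-sensor click detection probability:
        draw source level (SL), transmission loss (TL) and noise level (NL) in
        dB, apply the passive sonar equation SNR = SL - TL - NL, and average
        the detector characterization curve over the draws."""
        rng = random.Random(seed)
        total = 0.0
        for _ in range(n_draws):
            snr = draw_sl(rng) - draw_tl(rng) - draw_nl(rng)
            total += p_detect_given_snr(snr)
        return total / n_draws
    ```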

  1. Analysis of the impact of safeguards criteria

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mullen, M.F.; Reardon, P.T.

    As part of the US Program of Technical Assistance to IAEA Safeguards, the Pacific Northwest Laboratory (PNL) was asked to assist in developing and demonstrating a model for assessing the impact of setting criteria for the application of IAEA safeguards. This report presents the results of PNL's work on the task. The report is in three parts. The first explains the technical approach and methodology. The second contains an example application of the methodology. The third presents the conclusions of the study. PNL used the model and computer programs developed as part of Task C.5 (Estimation of Inspection Efforts) of the Program of Technical Assistance. The example application of the methodology involves low-enriched uranium conversion and fuel fabrication facilities. The effects of variations in seven parameters are considered: false alarm probability, goal probability of detection, detection goal quantity, the plant operator's measurement capability, the inspector's variables measurement capability, the inspector's attributes measurement capability, and annual plant throughput. Among the key results and conclusions of the analysis are the following: the variables with the greatest impact on the probability of detection are the inspector's measurement capability, the goal quantity, and the throughput; the variables with the greatest impact on inspection costs are the throughput, the goal quantity, and the goal probability of detection; there are important interactions between variables, that is, the effects of a given variable often depend on the level or value of some other variable, and with the methodology used in this study these interactions can be quantitatively analyzed; and reasonably good approximate prediction equations can be developed using the methodology described here.

  2. A new method of peak detection for analysis of comprehensive two-dimensional gas chromatography mass spectrometry data.

    PubMed

    Kim, Seongho; Ouyang, Ming; Jeong, Jaesik; Shen, Changyu; Zhang, Xiang

    2014-06-01

    We develop a novel peak detection algorithm for the analysis of comprehensive two-dimensional gas chromatography time-of-flight mass spectrometry (GC×GC-TOF MS) data using normal-exponential-Bernoulli (NEB) and mixture probability models. The algorithm first performs baseline correction and denoising simultaneously using the NEB model, which also defines peak regions. Peaks are then picked using a mixture of probability distributions to deal with co-eluting peaks. Peak merging is further carried out based on the mass spectral similarities among the peaks within the same peak group. The algorithm is evaluated using experimental data to study the effect of different cut-offs of the conditional Bayes factors and the effect of different mixture models, including Poisson, truncated Gaussian, Gaussian, Gamma, and exponentially modified Gaussian (EMG) distributions, and the optimal version is selected using a trial-and-error approach. We then compare the new algorithm with two existing algorithms in terms of compound identification. Data analysis shows that the developed algorithm can detect peaks with lower false discovery rates than the existing algorithms, and that a less complicated peak picking model is a promising alternative to the more complicated and widely used EMG mixture models.

  3. People Detection by a Mobile Robot Using Stereo Vision in Dynamic Indoor Environments

    NASA Astrophysics Data System (ADS)

    Méndez-Polanco, José Alberto; Muñoz-Meléndez, Angélica; Morales, Eduardo F.

    People detection and tracking is a key issue for social robot design and effective human-robot interaction. This paper addresses the problem of detecting people with a mobile robot using a stereo camera. People detection using mobile robots is a difficult task because in real-world scenarios it is common to find unpredictable motion of people, dynamic environments, and different degrees of human body occlusion. Additionally, we cannot expect people to cooperate with the robot to perform its task. In our people detection method, first, an object segmentation method that uses the distance information provided by a stereo camera is used to separate people from the background. The segmentation method proposed in this work takes into account human body proportions to segment people and provides a first estimation of people's locations. After segmentation, an adaptive contour model based on people's distance to the robot is used to calculate a probability of detecting people. Finally, people are detected by merging the probabilities from the contour model with evidence accumulated over time in a Bayesian scheme. We present experiments on detection of standing and sitting people, as well as people in frontal and side view, with a mobile robot in real-world scenarios.
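    The evidence-over-time step can be sketched as a standard Bayesian update; the likelihood values here are illustrative stand-ins for the contour-model probabilities:

    ```python
    def update_person_belief(prior, like_person, like_background):
        """One Bayesian fusion step: combine the running belief that a
        segmented region is a person with the current frame's evidence
        (a generic sketch of evaluating evidence over time)."""
        num = like_person * prior
        return num / (num + like_background * (1.0 - prior))
    ```

    Repeated frames of supporting evidence drive the belief toward 1, which is what makes the scheme robust to an occasional poor contour match.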

  4. A Comparison of Grizzly Bear Demographic Parameters Estimated from Non-Spatial and Spatial Open Population Capture-Recapture Models

    PubMed Central

    Whittington, Jesse; Sawaya, Michael A.

    2015-01-01

    Capture-recapture studies are frequently used to monitor the status and trends of wildlife populations. Detection histories from individual animals are used to estimate probability of detection and abundance or density. The accuracy of abundance and density estimates depends on the ability to model factors affecting detection probability. Non-spatial capture-recapture models have recently evolved into spatial capture-recapture models that directly include the effect of distances between an animal’s home range centre and trap locations on detection probability. Most studies comparing non-spatial and spatial capture-recapture biases focussed on single year models and no studies have compared the accuracy of demographic parameter estimates from open population models. We applied open population non-spatial and spatial capture-recapture models to three years of grizzly bear DNA-based data from Banff National Park and simulated data sets. The two models produced similar estimates of grizzly bear apparent survival, per capita recruitment, and population growth rates but the spatial capture-recapture models had better fit. Simulations showed that spatial capture-recapture models produced more accurate parameter estimates with better credible interval coverage than non-spatial capture-recapture models. Non-spatial capture-recapture models produced negatively biased estimates of apparent survival and positively biased estimates of per capita recruitment. The spatial capture-recapture grizzly bear population growth rates and 95% highest posterior density averaged across the three years were 0.925 (0.786–1.071) for females, 0.844 (0.703–0.975) for males, and 0.882 (0.779–0.981) for females and males combined. The non-spatial capture-recapture population growth rates were 0.894 (0.758–1.024) for females, 0.825 (0.700–0.948) for males, and 0.863 (0.771–0.957) for both sexes. 
The combination of low densities, low reproductive rates, and predominantly negative population growth rates suggest that Banff National Park’s population of grizzly bears requires continued conservation-oriented management actions. PMID:26230262
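    Spatial capture-recapture models typically encode the distance effect described above with a half-normal detection function; a generic sketch with illustrative parameters, not the fitted Banff model:

    ```python
    import math

    def scr_detection(d, g0, sigma):
        """Half-normal detection function common in spatial capture-recapture:
        per-occasion detection probability declines with the distance d between
        a trap and the animal's home-range centre; g0 is detection at d = 0 and
        sigma is the spatial scale parameter."""
        return g0 * math.exp(-(d * d) / (2.0 * sigma * sigma))
    ```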

  5. Comparison of electrofishing techniques to detect larval lampreys in wadeable streams in the Pacific Northwest

    USGS Publications Warehouse

    Dunham, Jason B.; Chelgren, Nathan D.; Heck, Michael P.; Clark, Steven M.

    2013-01-01

    We evaluated the probability of detecting larval lampreys using different methods of backpack electrofishing in wadeable streams in the U.S. Pacific Northwest. Our primary objective was to compare capture of lampreys using electrofishing with standard settings for salmon and trout to settings specifically adapted for capture of lampreys. Field work consisted of removal sampling by means of backpack electrofishing in 19 sites in streams representing a broad range of conditions in the region. Captures of lampreys at these sites were analyzed with a modified removal-sampling model and Bayesian estimation to measure the relative odds of capture using the lamprey-specific settings compared with the standard salmonid settings. We found that the odds of capture were 2.66 (95% credible interval, 0.87–78.18) times greater for the lamprey-specific settings relative to standard salmonid settings. When estimates of capture probability were applied to estimating the probabilities of detection, we found high (>0.80) detectability when the actual number of lampreys in a site was greater than 10 individuals and effort was at least two passes of electrofishing, regardless of the settings used. Further work is needed to evaluate key assumptions in our approach, including the evaluation of individual-specific capture probabilities and population closure. For now, our results suggest that comparable detection of lampreys is possible using backpack electrofishing with either salmonid- or lamprey-specific settings.
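    Under the simplifying assumption of equal, independent per-pass capture probabilities, the site-level detection probability can be sketched as:

    ```python
    def site_detection_probability(p_capture, n_individuals, n_passes):
        """Probability of detecting the species at a site: it is missed only
        if every one of n individuals evades capture on every pass. A
        simplification of the removal-sampling model used in the study."""
        return 1.0 - (1.0 - p_capture) ** (n_individuals * n_passes)
    ```

    For example, a per-pass capture probability of 0.1 with 10 individuals and two passes gives 1 - 0.9**20 ≈ 0.88, in line with the >0.80 detectability the authors report for that effort level.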

  6. Estimation of occupancy, breeding success, and predicted abundance of golden eagles (Aquila chrysaetos) in the Diablo Range, California, 2014

    USGS Publications Warehouse

    Wiens, J. David; Kolar, Patrick S.; Fuller, Mark R.; Hunt, W. Grainger; Hunt, Teresa

    2015-01-01

    We used a multistate occupancy sampling design to estimate occupancy, breeding success, and abundance of territorial pairs of golden eagles (Aquila chrysaetos) in the Diablo Range, California, in 2014. This method uses the spatial pattern of detections and non-detections over repeated visits to survey sites to estimate probabilities of occupancy and successful reproduction while accounting for imperfect detection of golden eagles and their young during surveys. The estimated probability of detecting territorial pairs of golden eagles and their young was less than 1 and varied with time of the breeding season, as did the probability of correctly classifying a pair’s breeding status. Imperfect detection and breeding classification led to a sizeable difference between the uncorrected, naïve estimate of the proportion of occupied sites where successful reproduction was observed (0.20) and the model-based estimate (0.30). The analysis further indicated a relatively high overall probability of landscape occupancy by pairs of golden eagles (0.67, standard error = 0.06), but that areas with the greatest occupancy and reproductive potential were patchily distributed. We documented a total of 138 territorial pairs of golden eagles during surveys completed in the 2014 breeding season, which represented about one-half of the 280 pairs we estimated to occur in the broader 5,169-square kilometer region sampled. The study results emphasize the importance of accounting for imperfect detection and spatial heterogeneity in studies of site occupancy, breeding success, and abundance of golden eagles.

  7. Evaluation of aerial survey methods for Dall's sheep

    USGS Publications Warehouse

    Udevitz, Mark S.; Shults, Brad S.; Adams, Layne G.; Kleckner, Christopher

    2006-01-01

    Most Dall's sheep (Ovis dalli dalli) population-monitoring efforts use intensive aerial surveys with no attempt to estimate variance or adjust for potential sightability bias. We used radiocollared sheep to assess factors that could affect sightability of Dall's sheep in standard fixed-wing and helicopter surveys and to evaluate feasibility of methods that might account for sightability bias. Work was conducted in conjunction with annual aerial surveys of Dall's sheep in the western Baird Mountains, Alaska, USA, in 2000–2003. Overall sightability was relatively high compared with other aerial wildlife surveys, with 88% of the available, marked sheep detected in our fixed-wing surveys. Total counts from helicopter surveys were not consistently larger than counts from fixed-wing surveys of the same units, and detection probabilities did not differ for the 2 aircraft types. Our results suggest that total counts from helicopter surveys cannot be used to obtain reliable estimates of detection probabilities for fixed-wing surveys. Groups containing radiocollared sheep often changed in size and composition before they could be observed by a second crew in units that were double-surveyed. Double-observer methods that require determination of which groups were detected by each observer will be infeasible unless survey procedures can be modified so that groups remain more stable between observations. Mean group sizes increased during our study period, and our logistic regression sightability model indicated that detection probabilities increased with group size. Mark–resight estimates of annual population sizes were similar to sightability-model estimates, and confidence intervals overlapped broadly. We recommend the sightability-model approach as the most effective and feasible of the alternatives we considered for monitoring Dall's sheep populations.
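    The logistic sightability relationship with group size can be sketched as follows; the coefficients are hypothetical placeholders, not the fitted Baird Mountains model:

    ```python
    import math

    def sightability(group_size, b0=-1.0, b1=0.6):
        """Logistic sightability model with group size as the covariate,
        mirroring the finding that detection probability increases with group
        size. b0 and b1 are illustrative, not published estimates."""
        return 1.0 / (1.0 + math.exp(-(b0 + b1 * group_size)))
    ```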

  8. Estimating occurrence and detection probabilities for stream-breeding salamanders in the Gulf Coastal Plain

    USGS Publications Warehouse

    Lamb, Jennifer Y.; Waddle, J. Hardin; Qualls, Carl P.

    2017-01-01

    Large gaps exist in our knowledge of the ecology of stream-breeding plethodontid salamanders in the Gulf Coastal Plain. Data describing where these salamanders are likely to occur along environmental gradients, as well as their likelihood of detection, are important for the prevention and management of amphibian declines. We used presence/absence data from leaf litter bag surveys and a hierarchical Bayesian multispecies single-season occupancy model to estimate the occurrence of five species of plethodontids across reaches in headwater streams in the Gulf Coastal Plain. Average detection probabilities were high (range = 0.432–0.942) and unaffected by sampling covariates specific to the use of litter bags (i.e., bag submergence, sampling season, in-stream cover). Estimates of occurrence probabilities differed substantially between species (range = 0.092–0.703) and were influenced by the size of the upstream drainage area and by the maximum proportion of the reach that dried. The effects of these two factors were not equivalent across species. Our results demonstrate that hierarchical multispecies models successfully estimate occurrence parameters for both rare and common stream-breeding plethodontids. The resulting models clarify how species are distributed within stream networks, and they provide baseline values that will be useful in evaluating the conservation statuses of plethodontid species within lotic systems in the Gulf Coastal Plain.

  9. Evaluation of radio-tracking and strip transect methods for determining foraging ranges of Black-Legged Kittiwakes

    USGS Publications Warehouse

    Ostrand, William D.; Drew, G.S.; Suryan, R.M.; McDonald, L.L.

    1998-01-01

    We compared strip transect and radio-tracking methods of determining the foraging range of Black-legged Kittiwakes (Rissa tridactyla). The mean distance birds were observed from their colony determined by radio-tracking was significantly greater than the mean value calculated from strip transects. We determined that this difference was due to two sources of bias: (1) as distance from the colony increased, the area of available habitat also increased, resulting in decreasing bird densities (bird spreading); consequently, the probability of detecting birds during transect surveys also would decrease as distance from the colony increased; and (2) the maximum distance birds were observed from the colony during radio-tracking exceeded the extent of the strip transect survey. We compared the observed number of birds seen on the strip transect survey to the predictions of a model of the decreasing probability of detection due to bird spreading. Strip transect data were significantly different from modeled data; however, the field data were consistently equal to or below the model predictions, indicating a general conformity to the concept of declining detection at increasing distance. We conclude that radio-tracking data gave a more representative indication of foraging distances than did strip transect sampling. Previous studies of seabirds that have used strip transect sampling without accounting for bird spreading or the effects of study-area limitations probably underestimated foraging range.
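    The bird-spreading argument amounts to a 1/d density decline, which can be sketched as an idealised model (uniform dispersal in all directions is the assumption):

    ```python
    import math

    def density_at_distance(n_birds, d_km, ring_width_km):
        """'Bird spreading': if a colony's birds disperse uniformly in all
        directions, the birds at distance d occupy an annulus whose area grows
        linearly with d, so density -- and with it the chance of detecting
        birds on a strip transect -- falls roughly as 1/d."""
        return n_birds / (2.0 * math.pi * d_km * ring_width_km)
    ```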

  10. Optimize the Coverage Probability of Prediction Interval for Anomaly Detection of Sensor-Based Monitoring Series

    PubMed Central

    Liu, Datong; Peng, Yu; Peng, Xiyuan

    2018-01-01

    Effective anomaly detection of sensing data is essential for identifying potential system failures. Because they require no prior knowledge or accumulated labels, and provide uncertainty presentation, probability prediction methods (e.g., Gaussian process regression (GPR) and relevance vector machine (RVM)) are especially adaptable to perform anomaly detection for sensing series. Generally, one key parameter of prediction models is the coverage probability (CP), which controls the judging threshold for the testing sample and is generally set to a default value (e.g., 90% or 95%). There are few criteria for determining the optimal CP for anomaly detection. Therefore, this paper designs a graphic indicator, the receiver operating characteristic curve of prediction interval (ROC-PI), based on the definition of the ROC curve, which can depict the trade-off between PI width and PI coverage probability across a series of cut-off points. Furthermore, the Youden index is modified to assess the performance of different CPs; minimizing it with the simulated annealing (SA) algorithm yields the optimal CP. Experiments conducted on two simulation datasets demonstrate the validity of the proposed method. In particular, an actual case study on sensing series from an on-orbit satellite illustrates its significant performance in practical application. PMID:29587372
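    The cut-off selection can be illustrated with the classical (unmodified) Youden index over candidate CPs; the paper's modified index and simulated-annealing search are not reproduced here:

    ```python
    def best_coverage_probability(candidates):
        """Choose the cut-off maximising the classical Youden index
        J = sensitivity + specificity - 1 from (CP, sensitivity, specificity)
        triples. A simplified grid search standing in for the SA-optimised,
        modified index of the paper."""
        return max(candidates, key=lambda t: t[1] + t[2] - 1.0)[0]
    ```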

  11. Markov Chain Monte Carlo estimation of species distributions: a case study of the swift fox in western Kansas

    USGS Publications Warehouse

    Sargeant, Glen A.; Sovada, Marsha A.; Slivinski, Christiane C.; Johnson, Douglas H.

    2005-01-01

    Accurate maps of species distributions are essential tools for wildlife research and conservation. Unfortunately, biologists often are forced to rely on maps derived from observed occurrences recorded opportunistically during observation periods of variable length. Spurious inferences are likely to result because such maps are profoundly affected by the duration and intensity of observation and by methods used to delineate distributions, especially when detection is uncertain. We conducted a systematic survey of swift fox (Vulpes velox) distribution in western Kansas, USA, and used Markov chain Monte Carlo (MCMC) image restoration to rectify these problems. During 1997–1999, we searched 355 townships (ca. 93 km²) 1–3 times each for an average cost of $7,315 per year and achieved a detection rate (probability of detecting swift foxes, if present, during a single search) of 0.69 (95% Bayesian confidence interval [BCI] = [0.60, 0.77]). Our analysis produced an estimate of the underlying distribution, rather than a map of observed occurrences, that reflected the uncertainty associated with estimates of model parameters. To evaluate our results, we analyzed simulated data with similar properties. Results of our simulations suggest negligible bias and good precision when probabilities of detection on ≥1 survey occasions (cumulative probabilities of detection) exceed 0.65. Although the use of MCMC image restoration has been limited by theoretical and computational complexities, alternatives do not possess the same advantages. Image models accommodate uncertain detection, do not require spatially independent data or a census of map units, and can be used to estimate species distributions directly from observations without relying on habitat covariates or parameters that must be estimated subjectively. 
These features facilitate economical surveys of large regions, the detection of temporal trends in distribution, and assessments of landscape-level relations between species and habitats. Requirements for the use of MCMC image restoration include study areas that can be partitioned into regular grids of mapping units, spatially contagious species distributions, reliable methods for identifying target species, and cumulative probabilities of detection ≥0.65.

  13. Threshold Monitoring Maps for Under-Water Explosions

    NASA Astrophysics Data System (ADS)

    Arora, N. S.

    2014-12-01

    Hydro-acoustic energy in the 1-100 Hz range from under-water explosions can easily spread for thousands of miles due to the unique properties of the deep sound channel. This channel, also known as the SOFAR channel, exists almost everywhere in the earth's oceans where the water is at least 1500 m deep. Once the energy is trapped in this channel it spreads out cylindrically, and hence experiences very little loss, as long as there is an unblocked path from source to receiver. Other losses such as absorption due to chemicals in the ocean (mainly boric acid and magnesium sulphate) are also quite minimal at these low frequencies. It is not surprising then that the International Monitoring System (IMS) maintains a global network of hydrophone stations listening on this particular frequency range. The overall objective of our work is to build a probabilistic model to detect and locate under-water explosions using the IMS network. A number of critical pieces for this model, such as travel time predictions, are already well known. We are extending the existing knowledge base by building the remaining pieces, most crucially the models for transmission losses and detection probabilities. With a complete model for detecting under-water explosions, we are able to combine it with our existing model for seismic events, NET-VISA. In the conference we will present threshold monitoring maps for explosions in the earth's oceans. Our premise is that explosive sources release an unknown fraction of their total energy into the SOFAR channel, and this trapped energy determines their detection probability at each of the IMS hydrophone stations. Our threshold monitoring maps compute the minimum amount of energy at each location that must be released into the deep sound channel such that there is a ninety percent probability that at least two of the IMS stations detect the event. We will also present results of our effort to detect and locate hydro-acoustic events. 
In particular, we will show results from a recent under-water volcanic eruption at the Ahyi Seamount (April-May 2014), and compare our work with the current processing, both automated and human, at the IDC.

  14. Time series sightability modeling of animal populations.

    PubMed

    ArchMiller, Althea A; Dorazio, Robert M; St Clair, Katherine; Fieberg, John R

    2018-01-01

Logistic regression models-or "sightability models"-fit to detection/non-detection data from marked individuals are often used to adjust for visibility bias in later detection-only surveys, with population abundance estimated using a modified Horvitz-Thompson (mHT) estimator. More recently, a model-based alternative for analyzing combined detection/non-detection and detection-only data was developed. This approach seemed promising, since it produced estimates similar to those from the mHT when applied to data from moose (Alces alces) surveys in Minnesota. More importantly, it provided a framework for developing flexible models for analyzing multiyear detection-only survey data in combination with detection/non-detection data. During initial attempts to extend the model-based approach to multiple years of detection-only data, we found that estimates of detection probabilities and population abundance were sensitive to the amount of detection-only data included in the combined (detection/non-detection and detection-only) analysis. Subsequently, we developed a robust hierarchical modeling approach where sightability model parameters are informed only by the detection/non-detection data, and we used this approach to fit a fixed-effects model (FE model) with year-specific parameters and a temporally-smoothed model (TS model) that shares information across years via random effects and a temporal spline. The abundance estimates from the TS model were more precise, with decreased interannual variability relative to the FE model and mHT abundance estimates, illustrating the potential benefits from model-based approaches that allow information to be shared across years.

  15. Unsupervised, low latency anomaly detection of algorithmically generated domain names by generative probabilistic modeling.

    PubMed

    Raghuram, Jayaram; Miller, David J; Kesidis, George

    2014-07-01

We propose a method for detecting anomalous domain names, with a focus on algorithmically generated domain names, which are frequently associated with malicious activities such as fast flux service networks, particularly for bot networks (or botnets), malware, and phishing. Our method is based on learning a (null hypothesis) probability model from a large set of domain names that have been white listed by some reliable authority. Since these names are mostly assigned by humans, they are pronounceable, tend to have a distribution of characters, words, word lengths, and number of words that is typical of some language (mostly English), and often consist of words drawn from a known lexicon. By contrast, algorithmically generated domain names typically have distributions quite different from those of human-created domain names. We propose a fully generative model for the probability distribution of benign (white listed) domain names which can be used in an anomaly detection setting to identify putative algorithmically generated domain names. Unlike other methods, our approach can make detections without considering any additional (latency-producing) information sources often used to detect fast flux activity. Experiments on a publicly available, large data set of domain names associated with fast flux service networks show encouraging results relative to several baseline methods, with higher detection rates and low false positive rates.
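The null-hypothesis character model described above can be sketched with a much simpler stand-in: a Laplace-smoothed bigram model trained on whitelisted names, scoring candidates by average per-bigram log-likelihood. The tiny whitelist and the bigram choice below are illustrative assumptions, not the paper's full generative model (which also covers words, word lengths, and number of words).

```python
import math
from collections import Counter

ALPHABET = "abcdefghijklmnopqrstuvwxyz"

def train_bigram_model(whitelist):
    """Laplace-smoothed bigram probabilities learned from benign names."""
    counts, context = Counter(), Counter()
    for name in whitelist:
        s = "".join(c for c in name.lower() if c in ALPHABET)
        for a, b in zip(s, s[1:]):
            counts[(a, b)] += 1
            context[a] += 1
    def prob(a, b):
        return (counts[(a, b)] + 1) / (context[a] + len(ALPHABET))
    return prob

def score(name, prob):
    """Average per-bigram log-likelihood; low scores suggest a
    machine-generated (non-pronounceable) name."""
    s = "".join(c for c in name.lower() if c in ALPHABET)
    pairs = list(zip(s, s[1:]))
    if not pairs:
        return float("-inf")
    return sum(math.log(prob(a, b)) for a, b in pairs) / len(pairs)

# Hypothetical mini-whitelist standing in for a real whitelisted corpus.
whitelist = ["google", "facebook", "amazon", "wikipedia", "youtube",
             "twitter", "netflix", "instagram", "linkedin", "reddit"]
prob = train_bigram_model(whitelist)
```

In an anomaly-detection setting, names scoring below a threshold calibrated on the whitelist would be flagged as putatively algorithmically generated.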

  16. Unsupervised, low latency anomaly detection of algorithmically generated domain names by generative probabilistic modeling

    PubMed Central

    Raghuram, Jayaram; Miller, David J.; Kesidis, George

    2014-01-01

We propose a method for detecting anomalous domain names, with a focus on algorithmically generated domain names, which are frequently associated with malicious activities such as fast flux service networks, particularly for bot networks (or botnets), malware, and phishing. Our method is based on learning a (null hypothesis) probability model from a large set of domain names that have been white listed by some reliable authority. Since these names are mostly assigned by humans, they are pronounceable, tend to have a distribution of characters, words, word lengths, and number of words that is typical of some language (mostly English), and often consist of words drawn from a known lexicon. By contrast, algorithmically generated domain names typically have distributions quite different from those of human-created domain names. We propose a fully generative model for the probability distribution of benign (white listed) domain names which can be used in an anomaly detection setting to identify putative algorithmically generated domain names. Unlike other methods, our approach can make detections without considering any additional (latency-producing) information sources often used to detect fast flux activity. Experiments on a publicly available, large data set of domain names associated with fast flux service networks show encouraging results relative to several baseline methods, with higher detection rates and low false positive rates. PMID:25685511

  17. Optical detection of chemical warfare agents and toxic industrial chemicals

    NASA Astrophysics Data System (ADS)

    Webber, Michael E.; Pushkarsky, Michael B.; Patel, C. Kumar N.

    2004-12-01

We present an analytical model evaluating the suitability of optical absorption based spectroscopic techniques for detection of chemical warfare agents (CWAs) and toxic industrial chemicals (TICs) in ambient air. The sensor performance is modeled by simulating absorption spectra of a sample containing both the target and a multitude of interfering species, as well as appropriate stochastic noise, and determining the target concentrations from the simulated spectra via a least-squares fit (LSF) algorithm. The distribution of the LSF target concentrations determines the sensor sensitivity, probability of false positives (PFP), and probability of false negatives (PFN). The model was applied to a CO2 laser based photoacoustic (L-PAS) CWA sensor and predicted single-digit ppb sensitivity with very low PFP rates in the presence of significant amounts of interferents. This approach will be useful for assessing sensor performance by developers and users alike; it also provides a methodology for inter-comparison of different sensing technologies.

  18. Trackline and point detection probabilities for acoustic surveys of Cuvier's and Blainville's beaked whales.

    PubMed

    Barlow, Jay; Tyack, Peter L; Johnson, Mark P; Baird, Robin W; Schorr, Gregory S; Andrews, Russel D; Aguilar de Soto, Natacha

    2013-09-01

    Acoustic survey methods can be used to estimate density and abundance using sounds produced by cetaceans and detected using hydrophones if the probability of detection can be estimated. For passive acoustic surveys, probability of detection at zero horizontal distance from a sensor, commonly called g(0), depends on the temporal patterns of vocalizations. Methods to estimate g(0) are developed based on the assumption that a beaked whale will be detected if it is producing regular echolocation clicks directly under or above a hydrophone. Data from acoustic recording tags placed on two species of beaked whales (Cuvier's beaked whale-Ziphius cavirostris and Blainville's beaked whale-Mesoplodon densirostris) are used to directly estimate the percentage of time they produce echolocation clicks. A model of vocal behavior for these species as a function of their diving behavior is applied to other types of dive data (from time-depth recorders and time-depth-transmitting satellite tags) to indirectly determine g(0) in other locations for low ambient noise conditions. Estimates of g(0) for a single instant in time are 0.28 [standard deviation (s.d.) = 0.05] for Cuvier's beaked whale and 0.19 (s.d. = 0.01) for Blainville's beaked whale.
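At its core, the instantaneous g(0) described above is the fraction of time an animal spends producing regular echolocation clicks, estimated from tag records. A minimal sketch, assuming the tag data have already been reduced to (start, end) clicking-bout intervals; the bout times below are made up for illustration:

```python
def instantaneous_g0(click_intervals, deployment_hours):
    """Fraction of the tag deployment during which the whale was clicking,
    i.e. the probability it is acoustically available at a random instant."""
    clicking = sum(end - start for start, end in click_intervals)
    return clicking / deployment_hours

# Hypothetical tag record: three clicking bouts over a 10-hour deployment.
bouts = [(0.5, 1.1), (3.0, 3.8), (7.2, 8.0)]
g0 = instantaneous_g0(bouts, 10.0)
```

Averaging such per-deployment fractions across tagged animals (and propagating dive-behavior models to untagged populations) is the route by which the study arrives at its species-level g(0) estimates.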

  19. A Stochastic Model for the Landing Dispersion of Hazard Detection and Avoidance Capable Flight Systems

    NASA Astrophysics Data System (ADS)

    Witte, L.

    2014-06-01

To support landing site assessments for HDA-capable flight systems, and to facilitate trade studies between potential HDA architectures and the resulting probability of safe landing, a stochastic landing dispersion model has been developed.

  20. Description of Aspergillus flavus growth under the influence of different factors (water activity, incubation temperature, protein and fat concentration, pH, and cinnamon essential oil concentration) by kinetic, probability of growth, and time-to-detection models.

    PubMed

    Kosegarten, Carlos E; Ramírez-Corona, Nelly; Mani-López, Emma; Palou, Enrique; López-Malo, Aurelio

    2017-01-02

A Box-Behnken design was used to determine the effect of protein concentration (0, 5, or 10 g of casein/100 g), fat (0, 3, or 6 g of corn oil/100 g), water activity (aw; 0.900, 0.945, or 0.990), pH (3.5, 5.0, or 6.5), concentration of cinnamon essential oil (CEO; 0, 200, or 400 μL/kg), and incubation temperature (15, 25, or 35 °C) on the growth of Aspergillus flavus during 50 days of incubation. Mold response under the evaluated conditions was modeled by the modified Gompertz equation, logistic regression, and a time-to-detection model. The obtained polynomial regression models allow the significant coefficients (p < 0.05) for linear, quadratic, and interaction effects on the Gompertz equation's parameters to be identified, and adequately described (R² > 0.967) the studied mold responses. After 50 days of incubation, every tested model system was classified according to the observed response as 1 (growth) or 0 (no growth); a binary logistic regression was then used to model the A. flavus growth interface, allowing prediction of the probability of mold growth under selected combinations of the tested factors. The time-to-detection model was used to estimate the time at which visible A. flavus growth begins. Water activity, temperature, and CEO concentration were the most important factors affecting fungal growth. There is a range of possible combinations that may induce growth, such that the incubation conditions and the amount of essential oil necessary for fungal growth inhibition strongly depend on protein and fat concentrations as well as on the pH of the studied model systems. The probabilistic and time-to-detection models constitute another option for determining appropriate storage/processing conditions and accurately predicting the probability of, and/or the time at which, A. flavus growth occurs.
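The modified Gompertz curve used above can be sketched in the common Zwietering parameterization (asymptote A, maximum growth rate μm, lag time λ); the parameter values below are invented for illustration, not the study's fitted coefficients:

```python
import math

def modified_gompertz(t, A, mu_m, lam):
    """Zwietering's modified Gompertz curve for growth y(t):
    y = A * exp(-exp(mu_m * e / A * (lam - t) + 1))."""
    return A * math.exp(-math.exp(mu_m * math.e / A * (lam - t) + 1.0))

# Hypothetical parameters: asymptote 6 log units, mu_m = 0.5/day, 5-day lag.
y_late = modified_gompertz(50.0, A=6.0, mu_m=0.5, lam=5.0)
```

The curve starts near zero during the lag phase, rises fastest around the inflection, and approaches the asymptote A at long incubation times, which is what makes it a natural fit for the 50-day growth responses modeled here.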

  1. Pairing field methods to improve inference in wildlife surveys while accommodating detection covariance

    USGS Publications Warehouse

    Clare, John; McKinney, Shawn T.; DePue, John E.; Loftin, Cynthia S.

    2017-01-01

    It is common to use multiple field sampling methods when implementing wildlife surveys to compare method efficacy or cost efficiency, integrate distinct pieces of information provided by separate methods, or evaluate method-specific biases and misclassification error. Existing models that combine information from multiple field methods or sampling devices permit rigorous comparison of method-specific detection parameters, enable estimation of additional parameters such as false-positive detection probability, and improve occurrence or abundance estimates, but with the assumption that the separate sampling methods produce detections independently of one another. This assumption is tenuous if methods are paired or deployed in close proximity simultaneously, a common practice that reduces the additional effort required to implement multiple methods and reduces the risk that differences between method-specific detection parameters are confounded by other environmental factors. We develop occupancy and spatial capture–recapture models that permit covariance between the detections produced by different methods, use simulation to compare estimator performance of the new models to models assuming independence, and provide an empirical application based on American marten (Martes americana) surveys using paired remote cameras, hair catches, and snow tracking. Simulation results indicate existing models that assume that methods independently detect organisms produce biased parameter estimates and substantially understate estimate uncertainty when this assumption is violated, while our reformulated models are robust to either methodological independence or covariance. Empirical results suggested that remote cameras and snow tracking had comparable probability of detecting present martens, but that snow tracking also produced false-positive marten detections that could potentially substantially bias distribution estimates if not corrected for. 
Remote cameras detected marten individuals more readily than passive hair catches. Inability to photographically distinguish individual sex did not appear to induce negative bias in camera density estimates; instead, hair catches appeared to produce detection competition between individuals that may have been a source of negative bias. Our model reformulations broaden the range of circumstances in which analyses incorporating multiple sources of information can be robustly used, and our empirical results demonstrate that using multiple field-methods can enhance inferences regarding ecological parameters of interest and improve understanding of how reliably survey methods sample these parameters.

  2. Estimating parameters of hidden Markov models based on marked individuals: use of robust design data

    USGS Publications Warehouse

    Kendall, William L.; White, Gary C.; Hines, James E.; Langtimm, Catherine A.; Yoshizaki, Jun

    2012-01-01

    Development and use of multistate mark-recapture models, which provide estimates of parameters of Markov processes in the face of imperfect detection, have become common over the last twenty years. Recently, estimating parameters of hidden Markov models, where the state of an individual can be uncertain even when it is detected, has received attention. Previous work has shown that ignoring state uncertainty biases estimates of survival and state transition probabilities, thereby reducing the power to detect effects. Efforts to adjust for state uncertainty have included special cases and a general framework for a single sample per period of interest. We provide a flexible framework for adjusting for state uncertainty in multistate models, while utilizing multiple sampling occasions per period of interest to increase precision and remove parameter redundancy. These models also produce direct estimates of state structure for each primary period, even for the case where there is just one sampling occasion. We apply our model to expected value data, and to data from a study of Florida manatees, to provide examples of the improvement in precision due to secondary capture occasions. We also provide user-friendly software to implement these models. This general framework could also be used by practitioners to consider constrained models of particular interest, or model the relationship between within-primary period parameters (e.g., state structure) and between-primary period parameters (e.g., state transition probabilities).

  3. Solar Radiation Determines Site Occupancy of Coexisting Tropical and Temperate Deer Species Introduced to New Zealand Forests

    PubMed Central

    Allen, Robert B.; Forsyth, David M.; Allen, Roy K. J.; Affeld, Kathrin; MacKenzie, Darryl I.

    2015-01-01

    Assemblages of introduced taxa provide an opportunity to understand how abiotic and biotic factors shape habitat use by coexisting species. We tested hypotheses about habitat selection by two deer species recently introduced to New Zealand’s temperate rainforests. We hypothesised that, due to different thermoregulatory abilities, rusa deer (Cervus timorensis; a tropical species) would prefer warmer locations in winter than red deer (Cervus elaphus scoticus; a temperate species). Since adult male rusa deer are aggressive in winter (the rut), we also hypothesised that rusa deer and red deer would not use the same winter locations. Finally, we hypothesised that in summer both species would prefer locations with fertile soils that supported more plant species preferred as food. We used a 250 × 250 m grid of 25 remote cameras to collect images in a 100-ha montane study area over two winters and summers. Plant composition, solar radiation, and soil fertility were also determined for each camera location. Multiseason occupancy models revealed that direct solar radiation was the best predictor of occupancy and detection probabilities for rusa deer in winter. Multistate, multiseason occupancy models provided strong evidence that the detection probability of adult male rusa deer was greater in winter and when other rusa deer were present at a location. Red deer mostly vacated the study area in winter. For the one season that had sufficient camera images of both species (summer 2011) to allow two-species occupancy models to be fitted, the detection probability of rusa deer also increased with solar radiation. Detection probability also varied with plant composition for both deer species. We conclude that habitat use by coexisting tropical and temperate deer species in New Zealand likely depends on the interplay between the thermoregulatory and behavioural traits of the deer and the abiotic and biotic features of the habitat. PMID:26061426

  4. Repeated count surveys help standardize multi-agency estimates of American Oystercatcher (Haematopus palliatus) abundance

    USGS Publications Warehouse

    Hostetter, Nathan J.; Gardner, Beth; Schweitzer, Sara H.; Boettcher, Ruth; Wilke, Alexandra L.; Addison, Lindsay; Swilling, William R.; Pollock, Kenneth H.; Simons, Theodore R.

    2015-01-01

The extensive breeding range of many shorebird species can make integration of survey data problematic at regional spatial scales. We evaluated the effectiveness of standardized repeated count surveys coordinated across 8 agencies to estimate the abundance of American Oystercatcher (Haematopus palliatus) breeding pairs in the southeastern United States. Breeding season surveys were conducted across coastal North Carolina (90 plots) and the Eastern Shore of Virginia (3 plots). Plots were visited on 1–5 occasions during April–June 2013. N-mixture models were used to estimate abundance and detection probability in relation to survey date, tide stage, plot size, and plot location (coastal bay vs. barrier island). The estimated abundance of oystercatchers in the surveyed area was 1,048 individuals (95% credible interval: 851–1,408) and 470 pairs (384–637), substantially higher than estimates that did not account for detection probability (maximum counts of 674 individuals and 316 pairs). Detection probability was influenced by a quadratic function of survey date, and increased from mid-April (~0.60) to mid-May (~0.80), then remained relatively constant through June. Detection probability was also higher during high tide than during low, rising, or falling tides. Abundance estimates from N-mixture models were validated at 13 plots by exhaustive productivity studies (2–5 surveys per week). Intensive productivity studies identified 78 breeding pairs across 13 productivity plots while the N-mixture model abundance estimate was 74 pairs (62–119) using only 1–5 replicated surveys per season. Our results indicate that standardized replicated count surveys coordinated across multiple agencies and conducted during a relatively short time window (closure assumption) provide tremendous potential to meet both agency-level (e.g., state) and regional-level (e.g., flyway) objectives in large-scale shorebird monitoring programs.
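The N-mixture model used above combines a Poisson model for latent abundance with binomial detection on each visit; the marginal likelihood of the repeated counts at one plot sums over the unobserved abundance N. A minimal single-plot sketch (the counts, lambda, and p values are illustrative, not the study's estimates; the real analysis also models covariates and combines plots):

```python
import math

def site_likelihood(counts, lam, p, n_max=120):
    """Marginal likelihood of repeated counts at one plot:
    sum over latent abundance N of [prod_j Binomial(count_j | N, p)] * Poisson(N | lam)."""
    total = 0.0
    pois = math.exp(-lam)              # Poisson pmf at N = 0
    for n in range(n_max + 1):
        if n > 0:
            pois *= lam / n            # recurrence: pmf(n) = pmf(n-1) * lam / n
        if n >= max(counts):           # N must be at least the largest count
            binom = 1.0
            for c in counts:
                binom *= math.comb(n, c) * p ** c * (1 - p) ** (n - c)
            total += pois * binom
    return total

lik = site_likelihood([3, 2, 4], lam=5.0, p=0.7)
```

Maximizing the product of such per-plot likelihoods over lambda and p (or placing priors on them, as in a Bayesian fit with credible intervals like those reported) yields abundance and detection estimates simultaneously.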

  5. Airborne radar technology for windshear detection

    NASA Technical Reports Server (NTRS)

    Hibey, Joseph L.; Khalaf, Camille S.

    1988-01-01

    The objectives and accomplishments of the two-and-a-half year effort to describe how returns from on-board Doppler radar are to be used to detect the presence of a wind shear are reported. The problem is modeled as one of first passage in terms of state variables, the state estimates are generated by a bank of extended Kalman filters working in parallel, and the decision strategy involves the use of a voting algorithm for a series of likelihood ratio tests. The performance issue for filtering is addressed in terms of error-covariance reduction and filter divergence, and the performance issue for detection is addressed in terms of using a probability measure transformation to derive theoretical expressions for the error probabilities of a false alarm and a miss.

  6. Time series sightability modeling of animal populations

    USGS Publications Warehouse

    ArchMiller, Althea A.; Dorazio, Robert; St. Clair, Katherine; Fieberg, John R.

    2018-01-01

Logistic regression models—or “sightability models”—fit to detection/non-detection data from marked individuals are often used to adjust for visibility bias in later detection-only surveys, with population abundance estimated using a modified Horvitz-Thompson (mHT) estimator. More recently, a model-based alternative for analyzing combined detection/non-detection and detection-only data was developed. This approach seemed promising, since it produced estimates similar to those from the mHT when applied to data from moose (Alces alces) surveys in Minnesota. More importantly, it provided a framework for developing flexible models for analyzing multiyear detection-only survey data in combination with detection/non-detection data. During initial attempts to extend the model-based approach to multiple years of detection-only data, we found that estimates of detection probabilities and population abundance were sensitive to the amount of detection-only data included in the combined (detection/non-detection and detection-only) analysis. Subsequently, we developed a robust hierarchical modeling approach where sightability model parameters are informed only by the detection/non-detection data, and we used this approach to fit a fixed-effects model (FE model) with year-specific parameters and a temporally-smoothed model (TS model) that shares information across years via random effects and a temporal spline. The abundance estimates from the TS model were more precise, with decreased interannual variability relative to the FE model and mHT abundance estimates, illustrating the potential benefits from model-based approaches that allow information to be shared across years.

  7. Probability of detecting perchlorate under natural conditions in deep groundwater in California and the Southwestern United States

    USGS Publications Warehouse

    Fram, Miranda S.; Belitz, Kenneth

    2011-01-01

    We use data from 1626 groundwater samples collected in California, primarily from public drinking water supply wells, to investigate the distribution of perchlorate in deep groundwater under natural conditions. The wells were sampled for the California Groundwater Ambient Monitoring and Assessment Priority Basin Project. We develop a logistic regression model for predicting probabilities of detecting perchlorate at concentrations greater than multiple threshold concentrations as a function of climate (represented by an aridity index) and potential anthropogenic contributions of perchlorate (quantified as an anthropogenic score, AS). AS is a composite categorical variable including terms for nitrate, pesticides, and volatile organic compounds. Incorporating water-quality parameters in AS permits identification of perturbation of natural occurrence patterns by flushing of natural perchlorate salts from unsaturated zones by irrigation recharge as well as addition of perchlorate from industrial and agricultural sources. The data and model results indicate low concentrations (0.1-0.5 μg/L) of perchlorate occur under natural conditions in groundwater across a wide range of climates, beyond the arid to semiarid climates in which they mostly have been previously reported. The probability of detecting perchlorate at concentrations greater than 0.1 μg/L under natural conditions ranges from 50-70% in semiarid to arid regions of California and the Southwestern United States to 5-15% in the wettest regions sampled (the Northern California coast). The probability of concentrations above 1 μg/L under natural conditions is low (generally <3%).
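The logistic-regression structure described above, with probability of detection as a function of climate (aridity index) and an anthropogenic score, can be sketched as follows. The coefficients are invented for illustration only, not the fitted values from the study; they are merely signed to reproduce the qualitative pattern reported (drier climates and higher anthropogenic scores raise the detection probability).

```python
import math

def p_detect(aridity_index, anthropogenic_score,
             b0=-1.0, b_arid=-2.5, b_as=0.8):
    """Hypothetical logistic model for P(perchlorate > threshold).
    Lower aridity index = more arid climate, so b_arid is negative."""
    eta = b0 + b_arid * aridity_index + b_as * anthropogenic_score
    return 1.0 / (1.0 + math.exp(-eta))

p_arid_natural = p_detect(0.1, 0)   # arid site, no anthropogenic signal
p_wet_natural = p_detect(1.5, 0)    # wet site, no anthropogenic signal
```

Fitting separate models at multiple concentration thresholds, as the study does, would simply repeat this structure with threshold-specific coefficients.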

  8. The relationship between species detection probability and local extinction probability

    USGS Publications Warehouse

    Alpizar-Jara, R.; Nichols, J.D.; Hines, J.E.; Sauer, J.R.; Pollock, K.H.; Rosenberry, C.S.

    2004-01-01

    In community-level ecological studies, generally not all species present in sampled areas are detected. Many authors have proposed the use of estimation methods that allow detection probabilities that are < 1 and that are heterogeneous among species. These methods can also be used to estimate community-dynamic parameters such as species local extinction probability and turnover rates (Nichols et al. Ecol Appl 8:1213-1225; Conserv Biol 12:1390-1398). Here, we present an ad hoc approach to estimating community-level vital rates in the presence of joint heterogeneity of detection probabilities and vital rates. The method consists of partitioning the number of species into two groups using the detection frequencies and then estimating vital rates (e.g., local extinction probabilities) for each group. Estimators from each group are combined in a weighted estimator of vital rates that accounts for the effect of heterogeneity. Using data from the North American Breeding Bird Survey, we computed such estimates and tested the hypothesis that detection probabilities and local extinction probabilities were negatively related. Our analyses support the hypothesis that species detection probability covaries negatively with local probability of extinction and turnover rates. A simulation study was conducted to assess the performance of vital parameter estimators as well as other estimators relevant to questions about heterogeneity, such as coefficient of variation of detection probabilities and proportion of species in each group. Both the weighted estimator suggested in this paper and the original unweighted estimator for local extinction probability performed fairly well and provided no basis for preferring one to the other.
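The ad hoc weighted estimator described above has a simple shape: partition the species into two groups by detection frequency, estimate the vital rate within each group, then combine the group estimates weighted by the proportion of species each group contains. A sketch with made-up group values:

```python
def weighted_vital_rate(group_estimates, group_sizes):
    """Combine per-group vital-rate estimates, weighting each group by the
    proportion of species it contains, to account for heterogeneity."""
    total = sum(group_sizes)
    return sum(e * n / total for e, n in zip(group_estimates, group_sizes))

# Hypothetical: 40 low-detection species with local extinction prob. 0.30,
# 60 high-detection species with local extinction prob. 0.10.
phi = weighted_vital_rate([0.30, 0.10], [40, 60])
```

An unweighted pooled estimate would miss the negative covariation between detection probability and extinction probability that the weighting is designed to accommodate.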

  9. Detection probability in aerial surveys of feral horses

    USGS Publications Warehouse

    Ransom, Jason I.

    2011-01-01

    Observation bias pervades data collected during aerial surveys of large animals, and although some sources can be mitigated with informed planning, others must be addressed using valid sampling techniques that carefully model detection probability. Nonetheless, aerial surveys are frequently employed to count large mammals without applying such methods to account for heterogeneity in visibility of animal groups on the landscape. This often leaves managers and interest groups at odds over decisions that are not adequately informed. I analyzed detection of feral horse (Equus caballus) groups by dual independent observers from 24 fixed-wing and 16 helicopter flights using mixed-effect logistic regression models to investigate potential sources of observation bias. I accounted for observer skill, population location, and aircraft type in the model structure and analyzed the effects of group size, sun effect (position related to observer), vegetation type, topography, cloud cover, percent snow cover, and observer fatigue on detection of horse groups. The most important model-averaged effects for both fixed-wing and helicopter surveys included group size (fixed-wing: odds ratio = 0.891, 95% CI = 0.850–0.935; helicopter: odds ratio = 0.640, 95% CI = 0.587–0.698) and sun effect (fixed-wing: odds ratio = 0.632, 95% CI = 0.350–1.141; helicopter: odds ratio = 0.194, 95% CI = 0.080–0.470). Observer fatigue was also an important effect in the best model for helicopter surveys, with detection probability declining after 3 hr of survey time (odds ratio = 0.278, 95% CI = 0.144–0.537). Biases arising from sun effect and observer fatigue can be mitigated by pre-flight survey design. 
Other sources of bias, such as those arising from group size, topography, and vegetation can only be addressed by employing valid sampling techniques such as double sampling, mark–resight (batch-marked animals), mark–recapture (uniquely marked and identifiable animals), sightability bias correction models, and line transect distance sampling; however, some of these techniques may still only partially correct for negative observation biases.
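The model-averaged odds ratios reported above act multiplicatively on the odds of detection per unit of the covariate. A small sketch of applying such an odds ratio to a baseline detection probability, using the fixed-wing group-size odds ratio of 0.891 from the abstract; the baseline probability is an assumption for illustration:

```python
def adjusted_probability(p_base, odds_ratio, delta):
    """Apply a per-unit odds ratio `delta` times to a baseline probability:
    odds scale multiplicatively, then convert back to a probability."""
    odds = p_base / (1.0 - p_base) * odds_ratio ** delta
    return odds / (1.0 + odds)

# Assumed baseline detection of 0.80 for a reference group; five additional
# individuals at the reported fixed-wing odds ratio of 0.891 per individual.
p_plus5 = adjusted_probability(0.80, 0.891, 5)
```

Because the odds ratio is below 1, each covariate increment shrinks the odds, so the adjusted probability falls below the baseline.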

  10. Model-assisted probability of detection of flaws in aluminum blocks using polynomial chaos expansions

    NASA Astrophysics Data System (ADS)

    Du, Xiaosong; Leifsson, Leifur; Grandin, Robert; Meeker, William; Roberts, Ronald; Song, Jiming

    2018-04-01

    Probability of detection (POD) is widely used for measuring reliability of nondestructive testing (NDT) systems. Typically, POD is determined experimentally, while it can be enhanced by utilizing physics-based computational models in combination with model-assisted POD (MAPOD) methods. With the development of advanced physics-based methods, such as ultrasonic NDT testing, the empirical information, needed for POD methods, can be reduced. However, performing accurate numerical simulations can be prohibitively time-consuming, especially as part of stochastic analysis. In this work, stochastic surrogate models for computational physics-based measurement simulations are developed for cost savings of MAPOD methods while simultaneously ensuring sufficient accuracy. The stochastic surrogate is used to propagate the random input variables through the physics-based simulation model to obtain the joint probability distribution of the output. The POD curves are then generated based on those results. Here, the stochastic surrogates are constructed using non-intrusive polynomial chaos (NIPC) expansions. In particular, the NIPC methods used are the quadrature, ordinary least-squares (OLS), and least-angle regression sparse (LARS) techniques. The proposed approach is demonstrated on the ultrasonic testing simulation of a flat bottom hole flaw in an aluminum block. The results show that the stochastic surrogates have at least two orders of magnitude faster convergence on the statistics than direct Monte Carlo sampling (MCS). Moreover, the evaluation of the stochastic surrogate models is over three orders of magnitude faster than the underlying simulation model for this case, which is the UTSim2 model.
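The ordinary-least-squares flavor of non-intrusive polynomial chaos mentioned above can be sketched on a toy problem: evaluate a model at sampled inputs, fit Legendre-basis coefficients by least squares, then read the output mean and variance directly from the coefficients. The quadratic toy model below stands in for the expensive ultrasonic simulation; this is a sketch of the technique, not the paper's UTSim2 setup.

```python
import numpy as np

def nipc_ols(model, degree, n_samples):
    """Legendre polynomial-chaos expansion for a uniform input on [-1, 1],
    fit by ordinary least squares; returns (mean, variance) of the output."""
    x = np.linspace(-1.0, 1.0, n_samples)            # deterministic design
    y = model(x)
    V = np.polynomial.legendre.legvander(x, degree)  # basis matrix
    coef, *_ = np.linalg.lstsq(V, y, rcond=None)
    # Orthogonality of Legendre polynomials: E[P_k(X)^2] = 1/(2k+1).
    mean = coef[0]
    k = np.arange(1, degree + 1)
    variance = np.sum(coef[1:] ** 2 / (2 * k + 1))
    return mean, variance

mean, var = nipc_ols(lambda x: x ** 2, degree=3, n_samples=50)
# For f(x) = x^2 with X ~ U(-1, 1), the exact mean is 1/3 and variance 4/45.
```

Once fitted, the surrogate's statistics come from the coefficients alone, which is why the paper reports orders-of-magnitude speedups over direct Monte Carlo sampling of the simulation.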

  11. Probabilistic Model for Untargeted Peak Detection in LC-MS Using Bayesian Statistics.

    PubMed

    Woldegebriel, Michael; Vivó-Truyols, Gabriel

    2015-07-21

We introduce a novel Bayesian probabilistic peak detection algorithm for liquid chromatography-mass spectrometry (LC-MS). The final probabilistic result allows the user to make a final decision about which points in a chromatogram are affected by a chromatographic peak and which ones are only affected by noise. The use of probabilities contrasts with the traditional approach, in which a binary answer is given by relying on a threshold. By contrast, with the Bayesian peak detection presented here, the values of probability can be further propagated into other preprocessing steps, which will increase (or decrease) the importance of chromatographic regions in the final results. The present work is based on the use of the statistical overlap theory of Davis and Giddings (Davis, J. M.; Giddings, J. Anal. Chem. 1983, 55, 418-424) as the prior probability in the Bayesian formulation. The algorithm was tested on LC-MS Orbitrap data and was able to successfully distinguish chemical noise from actual peaks without any data preprocessing.
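The point-wise output of such a Bayesian detector is a posterior probability rather than a binary call. A minimal two-hypothesis sketch with Gaussian likelihoods; the prior, noise level, and peak-intensity distribution are assumed for illustration, and this omits the Davis-Giddings overlap prior the paper actually uses:

```python
import math

def posterior_peak_probability(intensity, noise_mean, noise_sd,
                               peak_mean, peak_sd, prior_peak=0.1):
    """P(peak | intensity) via Bayes' rule with two Gaussian hypotheses."""
    def gauss(x, mu, sd):
        return math.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))
    like_peak = gauss(intensity, peak_mean, peak_sd)
    like_noise = gauss(intensity, noise_mean, noise_sd)
    num = prior_peak * like_peak
    return num / (num + (1 - prior_peak) * like_noise)

# Hypothetical intensities: one clearly peak-like, one near the noise floor.
p_hi = posterior_peak_probability(900.0, 100.0, 50.0, 1000.0, 200.0)
p_lo = posterior_peak_probability(120.0, 100.0, 50.0, 1000.0, 200.0)
```

The advantage over a hard threshold is exactly what the abstract describes: these probabilities can be carried into downstream preprocessing steps instead of discarding borderline points.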

  12. Sampling design trade-offs in occupancy studies with imperfect detection: examples and software

    USGS Publications Warehouse

    Bailey, L.L.; Hines, J.E.; Nichols, J.D.

    2007-01-01

    Researchers have used occupancy, or probability of occupancy, as a response or state variable in a variety of studies (e.g., habitat modeling), and occupancy is increasingly favored by numerous state, federal, and international agencies engaged in monitoring programs. Recent advances in estimation methods have emphasized that reliable inferences can be made from these types of studies if detection and occupancy probabilities are simultaneously estimated. The need for temporal replication at sampled sites to estimate detection probability creates a trade-off between spatial replication (number of sample sites distributed within the area of interest/inference) and temporal replication (number of repeated surveys at each site). Here, we discuss a suite of questions commonly encountered during the design phase of occupancy studies, and we describe software (program GENPRES) developed to allow investigators to easily explore design trade-offs focused on particularities of their study system and sampling limitations. We illustrate the utility of program GENPRES using an amphibian example from Greater Yellowstone National Park, USA.
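
    The trade-off described above has a simple arithmetic backbone: with per-survey detection probability p, the chance of detecting an occupied site at least once in K visits is 1 - (1 - p)^K, so extra visits buy detection at the cost of sites surveyed. A minimal sketch (illustrative numbers, not output from program GENPRES):

```python
# Probability of detecting the species at least once during K surveys of
# an occupied site, given per-survey detection probability p.
def cumulative_detection(p, K):
    return 1.0 - (1.0 - p) ** K

# Under a fixed total effort of S * K surveys, more repeat visits per
# site (K) means fewer sites (S): the spatial vs. temporal trade-off.
p = 0.3                # per-survey detection probability (illustrative)
total_effort = 120     # total number of surveys the budget allows
for K in (2, 3, 5):
    S = total_effort // K
    print(f"K={K} visits, S={S} sites, p*={cumulative_detection(p, K):.3f}")
```

    Each added visit raises the cumulative detection probability but shrinks the number of sites, which is why design software must weigh the two against the study's goals.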

  13. Notes on Search, Detection and Localization Modeling. Revision 4

    DTIC Science & Technology

    1990-10-01

    2, ..., k, where the subregions are relabeled so that the following order relation holds: p_1/δ_1 > p_2/δ_2 > ... > p_k/δ_k, and where k is chosen so... "Interfaces," Operations Research, Vol. 19, No. 3, pp. 559-586, 1971. 15. Coggins, P.B., "Detection Probability Computations for Random Search of an

  14. Using exceedance probabilities to detect anomalies in routinely recorded animal health data, with particular reference to foot-and-mouth disease in Viet Nam.

    PubMed

    Richards, K K; Hazelton, M L; Stevenson, M A; Lockhart, C Y; Pinto, J; Nguyen, L

    2014-10-01

    The widespread availability of computer hardware and software for recording and storing disease event information means that, in theory, we have the necessary information to carry out detailed analyses of factors influencing the spatial distribution of disease in animal populations. However, the reliability of such analyses depends on data quality, with anomalous records having the potential to introduce significant bias and lead to inappropriate decision making. In this paper we promote the use of exceedance probabilities as a tool for detecting anomalies when applying hierarchical spatio-temporal models to animal health data. We illustrate this methodology through a case study of data on outbreaks of foot-and-mouth disease (FMD) in Viet Nam for the period 2006-2008. A flexible binomial logistic regression was employed to model the number of FMD-infected communes within each province of the country. Standard analyses of the residuals from this model failed to identify problems, but exceedance probabilities identified provinces in which the number of reported FMD outbreaks was unexpectedly low. This finding is interesting given that these provinces are on major cattle movement pathways through Viet Nam. Copyright © 2014 Elsevier Ltd. All rights reserved.
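
    The exceedance-probability check itself is straightforward once posterior draws are available: for each province, count the fraction of draws in which the model-based rate exceeds what was reported. The draws and observed values below are mock numbers for illustration; real draws would come from an MCMC fit of the spatio-temporal model.

```python
import random

random.seed(1)

# Mock posterior draws of the model-based infection probability for
# three provinces (assumption: all are expected to sit near 0.2).
posterior_p = {k: [random.betavariate(20, 80) for _ in range(4000)]
               for k in ("A", "B", "C")}
observed = {"A": 0.19, "B": 0.22, "C": 0.02}  # C reports very few outbreaks

# Exceedance probability: posterior probability that the model-based
# rate exceeds the reported rate. Values near 1 flag anomalously low
# reporting; values near 0 flag anomalously high reporting.
exceedance = {k: sum(d > observed[k] for d in draws) / len(draws)
              for k, draws in posterior_p.items()}
print(exceedance)  # province C stands out as an anomaly
```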

  15. Efficiency of prompt quarantine measures on a susceptible-infected-removed model in networks.

    PubMed

    Hasegawa, Takehisa; Nemoto, Koji

    2017-08-01

    This study focuses on investigating the manner in which a prompt quarantine measure suppresses epidemics in networks. A simple and ideal quarantine measure is considered in which an individual is detected with a probability immediately after it becomes infected and the detected one and its neighbors are promptly isolated. The efficiency of this quarantine in suppressing a susceptible-infected-removed (SIR) model is tested in random graphs and uncorrelated scale-free networks. Monte Carlo simulations are used to show that the prompt quarantine measure outperforms random and acquaintance preventive vaccination schemes in terms of reducing the number of infected individuals. The epidemic threshold for the SIR model is analytically derived under the quarantine measure, and the theoretical findings indicate that prompt executions of quarantines are highly effective in containing epidemics. Even if infected individuals are detected with a very low probability, the SIR model under a prompt quarantine measure has finite epidemic thresholds in fat-tailed scale-free networks in which an infected individual can always cause an outbreak of a finite relative size without any measure. The numerical simulations also demonstrate that the present quarantine measure is effective in suppressing epidemics in real networks.

  16. Efficiency of prompt quarantine measures on a susceptible-infected-removed model in networks

    NASA Astrophysics Data System (ADS)

    Hasegawa, Takehisa; Nemoto, Koji

    2017-08-01

    This study focuses on investigating the manner in which a prompt quarantine measure suppresses epidemics in networks. A simple and ideal quarantine measure is considered in which an individual is detected with a probability immediately after it becomes infected and the detected one and its neighbors are promptly isolated. The efficiency of this quarantine in suppressing a susceptible-infected-removed (SIR) model is tested in random graphs and uncorrelated scale-free networks. Monte Carlo simulations are used to show that the prompt quarantine measure outperforms random and acquaintance preventive vaccination schemes in terms of reducing the number of infected individuals. The epidemic threshold for the SIR model is analytically derived under the quarantine measure, and the theoretical findings indicate that prompt executions of quarantines are highly effective in containing epidemics. Even if infected individuals are detected with a very low probability, the SIR model under a prompt quarantine measure has finite epidemic thresholds in fat-tailed scale-free networks in which an infected individual can always cause an outbreak of a finite relative size without any measure. The numerical simulations also demonstrate that the present quarantine measure is effective in suppressing epidemics in real networks.
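
    The quarantine mechanism described in these two records is easy to prototype as a Monte Carlo simulation on a random graph: each new infection is detected with probability f, and a detected node is removed together with its neighbors before it can transmit. The network, parameters, and final-size measure below (which counts isolated individuals as removed) are illustrative assumptions, not the paper's setup.

```python
import random

random.seed(2)

# Toy Erdos-Renyi-style random graph (small illustrative network).
N, k_mean = 400, 6
neighbors = {i: set() for i in range(N)}
for i in range(N):
    for j in range(i + 1, N):
        if random.random() < k_mean / (N - 1):
            neighbors[i].add(j)
            neighbors[j].add(i)

def final_size(f, beta=0.4):
    """Discrete-time SIR from one seed; each new infection is detected
    with probability f, and the detected node plus its neighbors are
    promptly removed. Returns the removed fraction (infected + isolated)."""
    state = {i: "S" for i in range(N)}
    state[0] = "I"
    infectious = {0}
    while infectious:
        newly = set()
        for i in infectious:
            for j in neighbors[i]:
                if state[j] == "S" and random.random() < beta:
                    state[j] = "I"
                    newly.add(j)
            state[i] = "R"            # infectious for one time step
        infectious = set()
        for j in newly:
            if random.random() < f:   # detected: isolate j and neighbors
                state[j] = "R"
                for m in neighbors[j]:
                    state[m] = "R"
                    infectious.discard(m)
            elif state[j] == "I":     # not already isolated as a neighbor
                infectious.add(j)
    return sum(s == "R" for s in state.values()) / N

# Average outbreak size without quarantine vs. with 80% detection.
no_q = sum(final_size(0.0) for _ in range(20)) / 20
with_q = sum(final_size(0.8) for _ in range(20)) / 20
print(round(no_q, 2), round(with_q, 2))
```

    Because detected cases are removed before their infectious step, the effective reproduction number is cut roughly by the factor (1 - f), which is why even imperfect detection suppresses the outbreak sharply.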

  17. Determination of a Limited Scope Network's Lightning Detection Efficiency

    NASA Technical Reports Server (NTRS)

    Rompala, John T.; Blakeslee, R.

    2008-01-01

    This paper outlines a modeling technique to map lightning detection efficiency variations over a region surveyed by a sparse array of ground based detectors. A reliable flash peak current distribution (PCD) for the region serves as the technique's base. This distribution is recast as an event probability distribution function. The technique then uses the PCD together with information on site signal detection thresholds, the type of solution algorithm used, and range attenuation to formulate the probability that a flash at a specified location will yield a solution. Applying this technique to the full region produces detection efficiency contour maps specific to the parameters employed. These contours facilitate a comparative analysis of each parameter's effect on the network's detection efficiency. In an alternate application, this modeling technique gives an estimate of the number, strength, and distribution of events going undetected. This approach leads to a variety of event density contour maps. This application is also illustrated. The technique's base PCD can be empirical or analytical. A process for formulating an empirical PCD specific to the region and network being studied is presented. A new method for producing an analytical representation of the empirical PCD is also introduced.
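
    The core computation can be sketched as follows: weight each possible peak current by its PCD probability, decide which sites would see the signal above threshold at their ranges, and accumulate the probability mass of currents that yield enough detections for a solution. All numbers below (currents, probabilities, the 1/r attenuation, and the threshold) are illustrative assumptions.

```python
# Illustrative discretized peak-current distribution (PCD); the paper
# builds an empirical PCD for the region, these values are made up.
currents_kA = [5, 10, 20, 40, 80]
pcd = [0.10, 0.30, 0.35, 0.20, 0.05]   # event probabilities, sum to 1

def site_detects(I_kA, range_km, threshold=0.5):
    """Signal falls off roughly as 1/r (simplified range attenuation)."""
    return I_kA / range_km > threshold

def detection_efficiency(ranges_km, min_sites=3):
    """Probability that a flash is solvable, i.e. detected by at least
    min_sites sensors, integrating over the peak-current distribution."""
    eff = 0.0
    for I, p in zip(currents_kA, pcd):
        n = sum(site_detects(I, r) for r in ranges_km)
        if n >= min_sites:
            eff += p
    return eff

# A flash location 20-60 km from the four sensors of a sparse network.
print(detection_efficiency([20, 35, 50, 60]))
```

    Evaluating this over a grid of candidate flash locations yields exactly the kind of detection efficiency contour map the abstract describes.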

  18. A comparator-hypothesis account of biased contingency detection.

    PubMed

    Vadillo, Miguel A; Barberia, Itxaso

    2018-02-12

    Our ability to detect statistical dependencies between different events in the environment is strongly biased by the number of coincidences between them. Even when there is no true covariation between a cue and an outcome, if the marginal probability of either of them is high, people tend to perceive some degree of statistical contingency between both events. The present paper explores the ability of the Comparator Hypothesis to explain the general pattern of results observed in this literature. Our simulations show that this model can account for the biasing effects of the marginal probabilities of cues and outcomes. Furthermore, the overall fit of the Comparator Hypothesis to a sample of experimental conditions from previous studies is comparable to that of the popular Rescorla-Wagner model. These results should encourage researchers to further explore and put to the test the predictions of the Comparator Hypothesis in the domain of biased contingency detection. Copyright © 2018 Elsevier B.V. All rights reserved.

  19. Effectiveness of early detection on breast cancer mortality reduction in Catalonia (Spain)

    PubMed Central

    2009-01-01

    Background At present, it is complicated to use screening trials to determine the optimal age intervals and periodicities of breast cancer early detection. Mathematical models are an alternative that has been widely used. The aim of this study was to estimate the effect of different breast cancer early detection strategies in Catalonia (Spain), in terms of breast cancer mortality reduction (MR) and years of life gained (YLG), using the stochastic models developed by Lee and Zelen (LZ). Methods We used the LZ model to estimate the cumulative probability of death for a cohort exposed to different screening strategies after T years of follow-up. We also obtained the cumulative probability of death for a cohort with no screening. These probabilities were used to estimate the possible breast cancer MR and YLG by age, period and cohort of birth. The inputs of the model were: incidence of, mortality from and survival after breast cancer, mortality from other causes, distribution of breast cancer stages at diagnosis and sensitivity of mammography. The outputs were relative breast cancer MR and YLG. Results Relative breast cancer MR varied from 20% for biennial exams in the 50 to 69 age interval to 30% for annual exams in the 40 to 74 age interval. When strategies differ in periodicity but not in the age interval of exams, biennial screening achieved almost 80% of the annual screening MR. In contrast to MR, the effect on YLG of extending screening from 69 to 74 years of age was smaller than the effect of extending the screening from 50 to 45 or 40 years. Conclusion In this study we have obtained a measure of the effect of breast cancer screening in terms of mortality and years of life gained. The Lee and Zelen mathematical models have been very useful for assessing the impact of different modalities of early detection on MR and YLG in Catalonia (Spain). PMID:19754959

  20. Potential for spatial displacement of Cook Inlet beluga whales by anthropogenic noise in critical habitat

    USGS Publications Warehouse

    Small, Robert J.; Brost, Brian M.; Hooten, Mevin B.; Castellote, Manuel; Mondragon, Jeffrey

    2017-01-01

    The population of beluga whales in Cook Inlet, Alaska, USA, declined by nearly half in the mid-1990s, primarily from an unsustainable harvest, and was listed as endangered in 2008. In 2014, abundance was ~340 whales, and the population trend during 1999-2014 was -1.3% yr-1. Cook Inlet beluga whales are particularly vulnerable to anthropogenic impacts, and noise that has the potential to reduce communication and echolocation range considerably has been documented in critical habitat; thus, noise was ranked as a high potential threat in the Cook Inlet beluga Recovery Plan. The current recovery strategy includes research on effects of threats potentially limiting recovery, and thus we examined the potential impact of anthropogenic noise in critical habitat, specifically, spatial displacement. Using a subset of data on anthropogenic noise and beluga detections from a 5 yr acoustic study, we evaluated the influence of noise events on beluga occupancy probability. We used occupancy models, which account for factors that affect detection probability when estimating occupancy, the first application of these models to examine the potential impacts of anthropogenic noise on marine mammal behavior. Results were inconclusive, primarily because beluga detections were relatively infrequent. Even though noise metrics (sound pressure level and noise duration) appeared in high-ranking models as covariates for occupancy probability, the data were insufficient to indicate better predictive ability beyond those models that only included environmental covariates. Future studies that implement protocols designed specifically for beluga occupancy will be most effective for accurately estimating the effect of noise on beluga displacement.

  1. A NEW METHOD OF PEAK DETECTION FOR ANALYSIS OF COMPREHENSIVE TWO-DIMENSIONAL GAS CHROMATOGRAPHY MASS SPECTROMETRY DATA*

    PubMed Central

    Kim, Seongho; Ouyang, Ming; Jeong, Jaesik; Shen, Changyu; Zhang, Xiang

    2014-01-01

    We develop a novel peak detection algorithm for the analysis of comprehensive two-dimensional gas chromatography time-of-flight mass spectrometry (GC×GC-TOF MS) data using normal-exponential-Bernoulli (NEB) and mixture probability models. The algorithm first performs baseline correction and denoising simultaneously using the NEB model, which also defines peak regions. Peaks are then picked using a mixture of probability distributions to deal with co-eluting peaks. Peak merging is further carried out based on the mass spectral similarities among the peaks within the same peak group. The algorithm is evaluated using experimental data to study the effect of different cut-offs of the conditional Bayes factors and the effect of different mixture models including Poisson, truncated Gaussian, Gaussian, Gamma, and exponentially modified Gaussian (EMG) distributions, and the optimal version is introduced using a trial-and-error approach. We then compare the new algorithm with two existing algorithms in terms of compound identification. Data analysis shows that the developed algorithm can detect the peaks with lower false discovery rates than the existing algorithms, and a less complicated peak picking model is a promising alternative to the more complicated and widely used EMG mixture models. PMID:25264474

  2. Realized detection and capture probabilities for giant gartersnakes (Thamnophis gigas) using modified floating aquatic funnel traps

    USGS Publications Warehouse

    Halstead, Brian J.; Skalos, Shannon M.; Casazza, Michael L.; Wylie, Glenn D.

    2015-01-01

    Detection and capture probabilities for giant gartersnakes (Thamnophis gigas) are very low, and successfully evaluating the effects of variables or experimental treatments on giant gartersnake populations will require greater detection and capture probabilities than those that had been achieved with standard trap designs. Previous research identified important trap modifications that can increase the probability of snakes entering traps and help prevent the escape of captured snakes. The purpose of this study was to quantify detection and capture probabilities obtained using the most successful modification to commercially available traps to date (2015), and examine the ability of realized detection and capture probabilities to achieve benchmark levels of precision in occupancy and capture-mark-recapture studies.

  3. Mathematical modeling and assessment of microbial migration during the sprouting of alfalfa in trays in a nonuniformly contaminated seed batch using Enterobacter aerogenes as a surrogate for Salmonella Stanley.

    PubMed

    Liu, Bin; Schaffner, Donald W

    2007-11-01

    Raw seed sprouts have been implicated in several food poisoning outbreaks in the past 10 years. The U.S. Food and Drug Administration recommends that sprout growers use interventions (such as testing of spent irrigation water) to control the presence of pathogens in the finished product. During the sprouting process, initially low concentrations of pathogen may increase, and contamination may spread within a batch of sprouting seeds. A model of pathogen growth as a function of time and distance from the contamination spot during the sprouting of alfalfa in trays has been developed with Enterobacter aerogenes. The probability of detecting contamination was assessed by logistic regression at various time points and distances by sampling from sprouts or irrigation water. Our results demonstrate that microbial populations and the probability of detection were greatly reduced at distances of ≥ 20 cm from the point of contamination in a seed batch during tray sprouting; however, the probability of detecting microbial contamination at distances less than 10 cm from the point of inoculation was almost 100% at the end of the sprouting process. Our results also show that sampling irrigation water, especially large volumes of water, is highly effective at detecting contamination: by collecting 100 ml of irrigation water for membrane filtration, the probability of detection was increased by three to four times during the first 6 h of seed germination. Our findings have quantified the degree to which a small level of contamination will spread throughout a tray of sprouting alfalfa seeds and subsequently be detected by either sprout or irrigation water sampling.
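
    Logistic regression of a detection outcome on distance, as used in this study, can be sketched with a plain gradient-ascent fit. The data below are simulated with made-up coefficients, not the paper's measurements; only the modeling pattern (binary detection vs. distance) follows the abstract.

```python
import math
import random

random.seed(3)

# Mock data: detection of contamination (1/0) versus distance (cm) from
# the inoculation point, simulated from assumed coefficients.
true_b0, true_b1 = 4.0, -0.3
data = []
for _ in range(400):
    d = random.uniform(0, 30)
    p = 1 / (1 + math.exp(-(true_b0 + true_b1 * d)))
    data.append((d, 1 if random.random() < p else 0))

# Fit logistic regression by gradient ascent on the log-likelihood.
b0, b1 = 0.0, 0.0
for _ in range(2000):
    g0 = g1 = 0.0
    for d, y in data:
        p = 1 / (1 + math.exp(-(b0 + b1 * d)))
        g0 += y - p
        g1 += (y - p) * d
    b0 += 0.01 * g0 / len(data)
    b1 += 0.01 * g1 / len(data)

# Predicted detection probability near (5 cm) vs. far (25 cm).
near = 1 / (1 + math.exp(-(b0 + b1 * 5)))
far = 1 / (1 + math.exp(-(b0 + b1 * 25)))
print(round(near, 2), round(far, 2))
```

    The fitted negative distance coefficient reproduces the paper's qualitative finding: detection probability is high close to the inoculation point and falls away with distance.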

  4. Using occupancy modelling to compare environmental DNA to traditional field methods for regional-scale monitoring of an endangered aquatic species.

    PubMed

    Schmelzle, Molly C; Kinziger, Andrew P

    2016-07-01

    Environmental DNA (eDNA) monitoring approaches promise to greatly improve detection of rare, endangered and invasive species in comparison with traditional field approaches. Herein, eDNA approaches and traditional seining methods were applied at 29 research locations to compare method-specific estimates of detection and occupancy probabilities for endangered tidewater goby (Eucyclogobius newberryi). At each location, multiple paired seine hauls and water samples for eDNA analysis were taken, ranging from two to 23 samples per site, depending upon habitat size. Analysis using a multimethod occupancy modelling framework indicated that the probability of detection using eDNA was nearly double (0.74) the rate of detection for seining (0.39). The higher detection rates afforded by eDNA allowed determination of tidewater goby occupancy at two locations where they have not been previously detected and at one location considered to be locally extirpated. Additionally, eDNA concentration was positively related to tidewater goby catch per unit effort, suggesting eDNA could potentially be used as a proxy for local tidewater goby abundance. Compared to traditional field sampling, eDNA provided improved occupancy parameter estimates and can be applied to increase management efficiency across a broad spatial range and within a diversity of habitats. © 2015 John Wiley & Sons Ltd.

  5. Evaluation and comparison of statistical methods for early temporal detection of outbreaks: A simulation-based study

    PubMed Central

    Le Strat, Yann

    2017-01-01

    The objective of this paper is to evaluate a panel of statistical algorithms for temporal outbreak detection. Based on a large dataset of simulated weekly surveillance time series, we performed a systematic assessment of 21 statistical algorithms, 19 implemented in the R package surveillance and two other methods. We estimated false positive rate (FPR), probability of detection (POD), probability of detection during the first week, sensitivity, specificity, negative and positive predictive values and F1-measure for each detection method. Then, to identify the factors associated with these performance measures, we ran multivariate Poisson regression models adjusted for the characteristics of the simulated time series (trend, seasonality, dispersion, outbreak sizes, etc.). The FPR ranged from 0.7% to 59.9% and the POD from 43.3% to 88.7%. Some methods had a very high specificity, up to 99.4%, but a low sensitivity. Methods with a high sensitivity (up to 79.5%) had a low specificity. All methods had a high negative predictive value, over 94%, while positive predictive values ranged from 6.5% to 68.4%. Multivariate Poisson regression models showed that performance measures were strongly influenced by the characteristics of time series. Past or current outbreak size and duration strongly influenced detection performances. PMID:28715489
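
    The performance measures listed above come from cross-tabulating weekly alarms against the simulated truth; POD is then computed per outbreak rather than per week. A minimal sketch with made-up alarm and truth series (not the paper's simulations):

```python
# Weekly truth (1 = outbreak week) and the alarms raised by a detector.
truth  = [0, 0, 0, 1, 1, 1, 0, 0, 0, 0, 1, 1, 0, 0]
alarms = [0, 0, 0, 0, 1, 1, 0, 1, 0, 0, 1, 1, 0, 0]

tp = sum(a and t for a, t in zip(alarms, truth))
fp = sum(a and not t for a, t in zip(alarms, truth))
fn = sum(t and not a for a, t in zip(alarms, truth))
tn = sum(not a and not t for a, t in zip(alarms, truth))

fpr = fp / (fp + tn)            # false positive rate
sensitivity = tp / (tp + fn)    # per-week sensitivity
ppv = tp / (tp + fp)            # positive predictive value
f1 = 2 * ppv * sensitivity / (ppv + sensitivity)

# POD is per outbreak: an outbreak counts as detected if at least one of
# its weeks raises an alarm.
outbreaks = [(3, 5), (10, 11)]  # (start, end) weeks, inclusive
pod = sum(any(alarms[s:e + 1]) for s, e in outbreaks) / len(outbreaks)
print(fpr, sensitivity, pod)
```

    Note how an algorithm can have perfect POD while still missing individual outbreak weeks, which is why the paper reports both week-level and outbreak-level measures.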

  6. Advances in statistics

    Treesearch

    Howard Stauffer; Nadav Nur

    2005-01-01

    The papers included in the Advances in Statistics section of the Partners in Flight (PIF) 2002 Proceedings represent a small sample of statistical topics of current importance to Partners In Flight research scientists: hierarchical modeling, estimation of detection probabilities, and Bayesian applications. Sauer et al. (this volume) examines a hierarchical model...

  7. NetMOD v. 1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Merchant, Bion J

    2015-12-22

    NetMOD is a tool to model the performance of global ground-based explosion monitoring systems. Version 2.0 of the software supports the simulation of seismic, hydroacoustic, and infrasonic detection capability. The tool provides a user interface to execute simulations based upon a hypothetical definition of the monitoring system configuration, geophysical properties of the Earth, and detection analysis criteria. NetMOD will be distributed with a project file defining the basic performance characteristics of the International Monitoring System (IMS), a network of sensors operated by the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO). Network modeling is needed to be able to assess and explain the potential effect of changes to the IMS, to prioritize station deployment and repair, and to assess the overall CTBTO monitoring capability currently and in the future. Currently the CTBTO uses version 1.0 of NetMOD, provided to them in early 2014. NetMOD will provide a modern tool that will cover all the simulations currently available and allow for the development of additional simulation capabilities of the IMS in the future. NetMOD simulates the performance of monitoring networks by estimating the relative amplitudes of the signal and noise measured at each of the stations within the network based upon known geophysical principles. From these signal and noise estimates, a probability of detection may be determined for each of the stations. The detection probabilities at each of the stations may then be combined to produce an estimate of the detection probability for the entire monitoring network.
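
    The final step described above, combining per-station detection probabilities into a network-level estimate, can be sketched directly. The station values are illustrative, and independence between stations is an assumption of this sketch:

```python
from itertools import combinations
from math import prod

# Per-station detection probabilities for one hypothetical event, as
# would be derived from signal and noise estimates (illustrative values).
p_station = [0.9, 0.7, 0.6, 0.4, 0.2]

def network_pod(p, min_stations=3):
    """Probability that at least min_stations detect the event, assuming
    independent stations: sum over all qualifying detection patterns."""
    n = len(p)
    total = 0.0
    for k in range(min_stations, n + 1):
        for detected in combinations(range(n), k):
            total += prod(p[i] if i in detected else 1 - p[i]
                          for i in range(n))
    return total

# A monitoring network typically needs several stations to form an
# event solution, hence the min_stations threshold.
print(round(network_pod(p_station, 3), 4))
```

    With min_stations = 1 this reduces to the familiar 1 - Π(1 - p_i); higher thresholds, as required for locating an event, lower the network-level probability of detection.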

  8. Modeling Disease Vector Occurrence when Detection Is Imperfect: Infestation of Amazonian Palm Trees by Triatomine Bugs at Three Spatial Scales

    PubMed Central

    Abad-Franch, Fernando; Ferraz, Gonçalo; Campos, Ciro; Palomeque, Francisco S.; Grijalva, Mario J.; Aguilar, H. Marcelo; Miles, Michael A.

    2010-01-01

    Background Failure to detect a disease agent or vector where it actually occurs constitutes a serious drawback in epidemiology. In the pervasive situation where no sampling technique is perfect, the explicit analytical treatment of detection failure becomes a key step in the estimation of epidemiological parameters. We illustrate this approach with a study of Attalea palm tree infestation by Rhodnius spp. (Triatominae), the most important vectors of Chagas disease (CD) in northern South America. Methodology/Principal Findings The probability of detecting triatomines in infested palms is estimated by repeatedly sampling each palm. This knowledge is used to derive an unbiased estimate of the biologically relevant probability of palm infestation. We combine maximum-likelihood analysis and information-theoretic model selection to test the relationships between environmental covariates and infestation of 298 Amazonian palm trees over three spatial scales: region within Amazonia, landscape, and individual palm. Palm infestation estimates are high (40–60%) across regions, and well above the observed infestation rate (24%). Detection probability is higher (∼0.55 on average) in the richest-soil region than elsewhere (∼0.08). Infestation estimates are similar in forest and rural areas, but lower in urban landscapes. Finally, individual palm covariates (accumulated organic matter and stem height) explain most of infestation rate variation. Conclusions/Significance Individual palm attributes appear as key drivers of infestation, suggesting that CD surveillance must incorporate local-scale knowledge and that peridomestic palm tree management might help lower transmission risk. Vector populations are probably denser in rich-soil sub-regions, where CD prevalence tends to be higher; this suggests a target for research on broad-scale risk mapping. 
Landscape-scale effects indicate that palm triatomine populations can endure deforestation in rural areas, but become rarer in heavily disturbed urban settings. Our methodological approach has wide application in infectious disease research; by improving eco-epidemiological parameter estimation, it can also significantly strengthen vector surveillance-control strategies. PMID:20209149

  9. Markovian Search Games in Heterogeneous Spaces

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Griffin, Christopher H

    2009-01-01

    We consider how to search for a mobile evader in a large heterogeneous region when sensors are used for detection. Sensors are modeled using probability of detection. Due to environmental effects, this probability will not be constant over the entire region. We map this problem to a graph search problem and, even though deterministic graph search is NP-complete, we derive a tractable, optimal, probabilistic search strategy. We do this by defining the problem as a differential game played on a Markov chain. We prove that this strategy is optimal in the sense of Nash. Simulations of an example problem illustrate our approach and verify our claims.

  10. Mechanistic understanding of human-wildlife conflict through a novel application of dynamic occupancy models.

    PubMed

    Goswami, Varun R; Medhi, Kamal; Nichols, James D; Oli, Madan K

    2015-08-01

    Crop and livestock depredation by wildlife is a primary driver of human-wildlife conflict, a problem that threatens the coexistence of people and wildlife globally. Understanding mechanisms that underlie depredation patterns holds the key to mitigating conflicts across time and space. However, most studies do not consider imperfect detection and reporting of conflicts, which may lead to incorrect inference regarding its spatiotemporal drivers. We applied dynamic occupancy models to elephant crop depredation data from India between 2005 and 2011 to estimate crop depredation occurrence and model its underlying dynamics as a function of spatiotemporal covariates while accounting for imperfect detection of conflicts. The probability of detecting conflicts was consistently <1.0 and was negatively influenced by distance to roads and elevation gradient, averaging 0.08-0.56 across primary periods (distinct agricultural seasons within each year). The probability of crop depredation occurrence ranged from 0.29 (SE 0.09) to 0.96 (SE 0.04). The probability that sites raided by elephants in primary period t would not be raided in primary period t + 1 varied with elevation gradient in different seasons and was influenced negatively by mean rainfall and village density and positively by distance to forests. Negative effects of rainfall variation and distance to forests best explained variation in the probability that sites not raided by elephants in primary period t would be raided in primary period t + 1. With our novel application of occupancy models, we teased apart the spatiotemporal drivers of conflicts from factors that influence how they are observed, thereby allowing more reliable inference on mechanisms underlying observed conflict patterns. We found that factors associated with increased crop accessibility and availability (e.g., distance to forests and rainfall patterns) were key drivers of elephant crop depredation dynamics. 
Such an understanding is essential for rigorous prediction of future conflicts, a critical requirement for effective conflict management in the context of increasing human-wildlife interactions. © 2015 Society for Conservation Biology.

  11. Interpreting null results from measurements with uncertain correlations: an info-gap approach.

    PubMed

    Ben-Haim, Yakov

    2011-01-01

    Null events—not detecting a pernicious agent—are the basis for declaring the agent is absent. Repeated nulls strengthen confidence in the declaration. However, correlations between observations are difficult to assess in many situations and introduce uncertainty in interpreting repeated nulls. We quantify uncertain correlations using an info-gap model, which is an unbounded family of nested sets of possible probabilities. An info-gap model is nonprobabilistic and entails no assumption about a worst case. We then evaluate the robustness, to uncertain correlations, of estimates of the probability of a null event. This is then the basis for evaluating a nonprobabilistic robustness-based confidence interval for the probability of a null. © 2010 Society for Risk Analysis.
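
    The effect of correlation on repeated nulls can be illustrated with elementary probability. Under independence, n nulls when the agent is actually present have probability q^n; with positive serial correlation, modeled here as a two-state Markov chain (an assumption for illustration, not the paper's info-gap model), nulls cluster and n repeats are weaker evidence of absence.

```python
# q: probability of a null observation when the agent is present.
def p_all_nulls_independent(q, n):
    return q ** n

# r: probability of a null given the previous observation was a null.
# With r > q (positive correlation), a run of nulls is much more likely.
def p_all_nulls_markov(q, r, n):
    return q * r ** (n - 1)

q, n = 0.3, 6
print(p_all_nulls_independent(q, n), p_all_nulls_markov(q, 0.8, n))
```

    Since the correlation (here r) is itself uncertain in practice, the declaration of absence is only as robust as the range of r it can tolerate, which is the question the info-gap analysis addresses.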

  12. Double-observer approach to estimating egg mass abundance of vernal pool breeding amphibians

    USGS Publications Warehouse

    Grant, E.H.C.; Jung, R.E.; Nichols, J.D.; Hines, J.E.

    2005-01-01

    Interest in seasonally flooded pools, and the status of associated amphibian populations, has initiated programs in the northeastern United States to document and monitor these habitats. Counting egg masses is an effective way to determine the population size of pool-breeding amphibians, such as wood frogs (Rana sylvatica) and spotted salamanders (Ambystoma maculatum). However, bias is associated with counts if egg masses are missed. Counts unadjusted for the proportion missed (i.e., without adjustment for detection probability) could lead to false assessments of population trends. We used a dependent double-observer method in 2002-2003 to estimate numbers of wood frog and spotted salamander egg masses at seasonal forest pools in 13 National Wildlife Refuges, 1 National Park, 1 National Seashore, and 1 State Park in the northeastern United States. We calculated detection probabilities for egg masses and examined whether detection probabilities varied by species, observers, pools, and in relation to pool characteristics (pool area, pool maximum depth, within-pool vegetation). For the 2 years, model selection indicated that no consistent set of variables explained the variation in data sets from individual Refuges and Parks. Because our results indicated that egg mass detection probabilities vary spatially and temporally, we conclude that it is essential to use estimation procedures, such as double-observer methods with egg mass surveys, to determine population sizes and trends of these species.

  13. Detecting a trend change in cross-border epidemic transmission

    NASA Astrophysics Data System (ADS)

    Maeno, Yoshiharu

    2016-09-01

    A method for a system of Langevin equations is developed for detecting a trend change in cross-border epidemic transmission. The equations represent a standard epidemiological SIR compartment model and a meta-population network model. The method analyzes a time series of the number of new cases reported in multiple geographical regions. The method is applicable to investigating the efficacy of the implemented public health intervention in managing infectious travelers across borders. It is found that the change point of the probability of travel movements was one week after the WHO worldwide alert on the SARS outbreak in 2003. The alert was effective in managing infectious travelers. On the other hand, it is found that the probability of travel movements did not change at all for the flu pandemic in 2009. The pandemic did not affect potential travelers despite the WHO alert.

  14. Bayesian microsaccade detection

    PubMed Central

    Mihali, Andra; van Opheusden, Bas; Ma, Wei Ji

    2017-01-01

    Microsaccades are high-velocity fixational eye movements, with special roles in perception and cognition. The default microsaccade detection method is to determine when the smoothed eye velocity exceeds a threshold. We have developed a new method, Bayesian microsaccade detection (BMD), which performs inference based on a simple statistical model of eye positions. In this model, a hidden state variable changes between drift and microsaccade states at random times. The eye position is a biased random walk with different velocity distributions for each state. BMD generates samples from the posterior probability distribution over the eye state time series given the eye position time series. Applied to simulated data, BMD recovers the “true” microsaccades with fewer errors than alternative algorithms, especially at high noise. Applied to EyeLink eye tracker data, BMD detects almost all the microsaccades detected by the default method, but also apparent microsaccades embedded in high noise—although these can also be interpreted as false positives. Next we apply the algorithms to data collected with a Dual Purkinje Image eye tracker, whose higher precision justifies defining the inferred microsaccades as ground truth. When we add artificial measurement noise, the inferences of all algorithms degrade; however, at noise levels comparable to EyeLink data, BMD recovers the “true” microsaccades with 54% fewer errors than the default algorithm. Though unsuitable for online detection, BMD has other advantages: It returns probabilities rather than binary judgments, and it can be straightforwardly adapted as the generative model is refined. We make our algorithm available as a software package. PMID:28114483
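
    The "default method" the abstract benchmarks against can be sketched in a few lines. This is a deliberately simplified one-dimensional version with a fixed threshold and invented numbers; real detectors use adaptive two-dimensional velocity thresholds, and BMD itself replaces thresholding with posterior inference.

```python
# Sketch of the default velocity-threshold microsaccade detector that the
# abstract contrasts with BMD: smooth the eye velocity, flag threshold
# crossings. A fixed 1-D threshold is used here purely for illustration.

def moving_average(x, w=3):
    return [sum(x[max(0, k - w + 1): k + 1]) / len(x[max(0, k - w + 1): k + 1])
            for k in range(len(x))]

def detect_microsaccades(positions, dt=0.001, threshold=10.0):
    velocity = [(b - a) / dt for a, b in zip(positions, positions[1:])]
    smoothed = moving_average(velocity)
    return [k for k, v in enumerate(smoothed) if abs(v) > threshold]

# Slow drift followed by a microsaccade-like jump (positions in degrees).
pos = [0.001 * k for k in range(10)] + [0.009 + 0.05 * k for k in range(5)]
events = detect_microsaccades(pos)
print(events)
```

    Because the decision is a hard threshold on noisy velocity, added measurement noise directly produces misses and false positives, which is the failure mode BMD's generative-model inference is designed to mitigate.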

  15. Protocol for Reliability Assessment of Structural Health Monitoring Systems Incorporating Model-assisted Probability of Detection (MAPOD) Approach

    DTIC Science & Technology

    2011-09-01

    a quality evaluation with limited data, a model-based assessment must be... that affect system performance, a multistage approach to system validation, a modeling and experimental methodology for efficiently addressing a wide range

  16. Costas loop lock detection in the advanced receiver

    NASA Technical Reports Server (NTRS)

    Mileant, A.; Hinedi, S.

    1989-01-01

    The advanced receiver currently being developed uses a Costas digital loop to demodulate the subcarrier. Previous analyses of lock detector algorithms for Costas loops have ignored the effects of the inherent correlation between the samples of the phase-error process. Accounting for this correlation is necessary to achieve the desired lock-detection probability for a given false-alarm rate. Both analysis and simulations are used to quantify the effects of phase correlation on lock detection for the square-law and the absolute-value type detectors. Results are obtained which depict the lock-detection probability as a function of loop signal-to-noise ratio for a given false-alarm rate. The mathematical model and computer simulation show that the square-law detector experiences less degradation due to phase jitter than the absolute-value detector and that the degradation in detector signal-to-noise ratio is more pronounced for square-wave than for sine-wave signals.

  17. Evaluating detection probabilities for American marten in the Black Hills, South Dakota

    USGS Publications Warehouse

    Smith, Joshua B.; Jenks, Jonathan A.; Klaver, Robert W.

    2007-01-01

    Assessing the effectiveness of monitoring techniques designed to determine presence of forest carnivores, such as American marten (Martes americana), is crucial for validation of survey results. Although comparisons between techniques have been made, little attention has been paid to the issue of detection probabilities (p); thus, the underlying assumption has been that detection probabilities equal 1.0. We used presence-absence data obtained from a track-plate survey in conjunction with results from a saturation-trapping study to derive detection probabilities when marten occurred at high (>2 marten/10.2 km2) and low (≤1 marten/10.2 km2) densities within eight 10.2-km2 quadrats. Estimated probability of detecting marten in high-density quadrats was p = 0.952 (SE = 0.047), whereas the detection probability for low-density quadrats was considerably lower (p = 0.333, SE = 0.136). Our results indicated that failure to account for imperfect detection could lead to an underestimation of marten presence in 15-52% of low-density quadrats in the Black Hills, South Dakota, USA. We recommend that repeated site-survey data be analyzed to assess detection probabilities when documenting carnivore survey results.
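
    The value of the repeated surveys the authors recommend follows from a one-line identity: the chance of at least one detection in k independent visits is 1 - (1 - p)^k. The sketch below plugs in the abstract's low-density point estimate; the choice of k is arbitrary.

```python
# Cumulative detection probability over repeated independent surveys,
# using the abstract's per-survey estimate for low-density marten quadrats.

def cumulative_detection(p, k):
    """P(at least one detection in k independent surveys)."""
    return 1 - (1 - p) ** k

p_low, p_high = 0.333, 0.952
for k in (1, 2, 3):
    print(k, round(cumulative_detection(p_low, k), 3),
          round(cumulative_detection(p_high, k), 3))
```

    A single visit to a low-density quadrat misses marten two times out of three; three visits raise cumulative detection to about 0.70, which is why single-survey presence-absence data understate occupancy.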

  18. Ubiquitous Log Odds: A Common Representation of Probability and Frequency Distortion in Perception, Action, and Cognition

    PubMed Central

    Zhang, Hang; Maloney, Laurence T.

    2012-01-01

    In decision from experience, the source of probability information affects how probability is distorted in the decision task. Understanding how and why probability is distorted is a key issue in understanding the peculiar character of experience-based decision. We consider how probability information is used not just in decision-making but also in a wide variety of cognitive, perceptual, and motor tasks. Very similar patterns of distortion of probability/frequency information have been found in visual frequency estimation, in frequency estimation based on memory, in signal detection theory, and in the use of probability information in decision-making under risk and uncertainty. We show that distortion of probability in all cases is well captured as linear transformations of the log odds of frequency and/or probability, a model with a slope parameter and an intercept parameter. We then consider how task and experience influence these two parameters and the resulting distortion of probability. We review how the probability distortions change in systematic ways with task and report three experiments on frequency distortion where the distortions change systematically in the same task. We found that the slope of frequency distortions decreases with the sample size, which is echoed by findings in decision from experience. We review previous models of the representation of uncertainty and find that none can account for the empirical findings. PMID:22294978
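
    The linear-in-log-odds family the abstract describes is compact enough to state directly. The parameter values below (slope 0.6, crossover 0.37) are illustrative only, not fitted values from the paper.

```python
import math

# Linear-in-log-odds distortion: lo(p_perceived) = slope * lo(p) + intercept.
# Writing the intercept via a crossover probability makes the fixed point
# explicit. Parameter values here are illustrative, not the paper's fits.

def log_odds(p):
    return math.log(p / (1 - p))

def distort(p, slope, crossover=0.37):
    """Slope < 1 compresses: overweight small p, underweight large p."""
    lo = slope * log_odds(p) + (1 - slope) * log_odds(crossover)
    return 1 / (1 + math.exp(-lo))

print(round(distort(0.05, 0.6), 3), round(distort(0.95, 0.6), 3))
```

    With slope below 1 the function crosses the identity at the crossover point, overweighting rarer events and underweighting common ones, which is the signature pattern the abstract reports across tasks.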

  19. A probabilistic method for the estimation of residual risk in donated blood.

    PubMed

    Bish, Ebru K; Ragavan, Prasanna K; Bish, Douglas R; Slonim, Anthony D; Stramer, Susan L

    2014-10-01

    The residual risk (RR) of transfusion-transmitted infections, including the human immunodeficiency virus and hepatitis B and C viruses, is typically estimated by the incidence/window-period model, which relies on the following restrictive assumptions: each screening test, with probability 1, (1) detects an infected unit outside of the test's window period; (2) fails to detect an infected unit within the window period; and (3) correctly identifies an infection-free unit. These assumptions need not hold in practice due to random or systemic errors and individual variations in the window period. We develop a probability model that accurately estimates the RR by relaxing these assumptions, and quantify their impact using a published cost-effectiveness study and also within an optimization model. These assumptions lead to inaccurate estimates in cost-effectiveness studies and to sub-optimal solutions in the optimization model. The testing solution generated by the optimization model translates into fewer expected infections without an increase in the testing cost.
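
    The classical model that the paper relaxes reduces to a one-line product: residual risk is the incidence rate times the fraction of a year spent in the test's window period. The numbers below are hypothetical round figures chosen only to show the arithmetic.

```python
# The classical incidence/window-period model the paper relaxes:
# RR = incidence rate x (window period as a fraction of a year).
# Incidence and window-period values below are hypothetical.

def residual_risk(incidence_per_100k_py, window_days):
    incidence_per_person_year = incidence_per_100k_py / 100_000
    return incidence_per_person_year * (window_days / 365)

rr = residual_risk(incidence_per_100k_py=2.0, window_days=9)
print(f"{rr:.2e} per donation")   # on the order of 1 in 2 million
```

    The paper's point is that this product silently assumes perfect test behavior outside the window; once sensitivity and specificity are imperfect, the simple product misestimates RR.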

  20. Design of a DNA chip for detection of unknown genetically modified organisms (GMOs).

    PubMed

    Nesvold, Håvard; Kristoffersen, Anja Bråthen; Holst-Jensen, Arne; Berdal, Knut G

    2005-05-01

    Unknown genetically modified organisms (GMOs) have not undergone a risk evaluation, and hence might pose a danger to health and environment. There are, today, no methods for detecting unknown GMOs. In this paper we propose a novel method intended as a first step in an approach for detecting unknown genetically modified (GM) material in a single plant. A model is designed where biological and combinatorial reduction rules are applied to a set of DNA chip probes containing all possible sequences of uniform length n, creating probes capable of detecting unknown GMOs. The model is theoretically tested for Arabidopsis thaliana Columbia, and the probabilities for detecting inserts and receiving false positives are assessed for various parameters for this organism. From a theoretical standpoint, the model looks very promising but should be tested further in the laboratory. The model and algorithms will be available upon request to the corresponding author.

  1. Optimizing Probability of Detection Point Estimate Demonstration

    NASA Technical Reports Server (NTRS)

    Koshti, Ajay M.

    2017-01-01

    Probability of detection (POD) analysis is used in assessing the reliably detectable flaw size in nondestructive evaluation (NDE). MIL-HDBK-1823 and the associated mh1823 POD software give the most common methods of POD analysis. These NDE methods are used to detect real flaws such as cracks and crack-like flaws, and a reliably detectable crack size is required for safe-life analysis of fracture-critical parts. The paper discusses optimizing probability of detection (POD) demonstration experiments using the point estimate method, which NASA uses for qualifying special NDE procedures. The point estimate method uses the binomial distribution for the probability density. Normally, a set of 29 flaws of the same size, within some tolerance, is used in the demonstration. The optimization is performed to provide an acceptable value for the probability of passing the demonstration (PPD) and an acceptable value for the probability of false calls (POF) while keeping the flaw sizes in the set as small as possible.
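
    The binomial logic behind the 29-flaw set can be shown in a few lines: finding all n flaws demonstrates POD of at least 0.90 at 95% confidence exactly when 0.90^n drops below 0.05, and n = 29 is the smallest such set. This sketch covers only that textbook calculation, not the paper's full optimization of PPD and POF.

```python
# Why 29 flaws: if the true POD were only 0.90, the chance of nonetheless
# finding all n flaws is 0.90**n. Requiring this to fall below 0.05 yields
# the familiar 29-of-29 demonstration (90% POD at 95% confidence).

def prob_all_found(pod, n):
    return pod ** n

n = 1
while prob_all_found(0.90, n) > 0.05:
    n += 1
print(n)  # smallest flaw-set size for the 90/95 demonstration
```

    The same loop with a different POD target or confidence level reproduces the other standard set sizes, which is the starting point for the optimization the paper describes.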

  2. Entanglement-enhanced Neyman-Pearson target detection using quantum illumination

    NASA Astrophysics Data System (ADS)

    Zhuang, Quntao; Zhang, Zheshen; Shapiro, Jeffrey H.

    2017-08-01

    Quantum illumination (QI) provides entanglement-based target detection---in an entanglement-breaking environment---whose performance is significantly better than that of optimum classical-illumination target detection. QI's performance advantage was established in a Bayesian setting with the target presumed equally likely to be absent or present and error probability employed as the performance metric. Radar theory, however, eschews that Bayesian approach, preferring the Neyman-Pearson performance criterion to avoid the difficulties of accurately assigning prior probabilities to target absence and presence and appropriate costs to false-alarm and miss errors. We have recently reported an architecture---based on sum-frequency generation (SFG) and feedforward (FF) processing---for minimum error-probability QI target detection with arbitrary prior probabilities for target absence and presence. In this paper, we use our results for FF-SFG reception to determine the receiver operating characteristic---detection probability versus false-alarm probability---for optimum QI target detection under the Neyman-Pearson criterion.

  3. Object Detection in Natural Backgrounds Predicted by Discrimination Performance and Models

    NASA Technical Reports Server (NTRS)

    Ahumada, A. J., Jr.; Watson, A. B.; Rohaly, A. M.; Null, Cynthia H. (Technical Monitor)

    1995-01-01

    In object detection, an observer looks for an object class member in a set of backgrounds. In discrimination, an observer tries to distinguish two images. Discrimination models predict the probability that an observer detects a difference between two images. We compare object detection and image discrimination with the same stimuli by: (1) making stimulus pairs of the same background with and without the target object and (2) either giving many consecutive trials with the same background (discrimination) or intermixing the stimuli (object detection). Six images of a vehicle in a natural setting were altered to remove the vehicle and mixed with the original image in various proportions. Detection observers rated the images for vehicle presence. Discrimination observers rated the images for any difference from the background image. Estimated detectabilities of the vehicles were found by maximizing the likelihood of a Thurstone category scaling model. The pattern of estimated detectabilities is similar for discrimination and object detection, and is accurately predicted by a Cortex Transform discrimination model. Predictions of a Contrast-Sensitivity-Function filter model and a Root-Mean-Square difference metric based on the digital image values are less accurate. The discrimination detectabilities averaged about twice those of object detection.

  4. Can camera traps monitor Komodo dragons, a large ectothermic predator?

    PubMed

    Ariefiandy, Achmad; Purwandana, Deni; Seno, Aganto; Ciofi, Claudio; Jessop, Tim S

    2013-01-01

    Camera trapping has greatly enhanced population monitoring of often cryptic and low abundance apex carnivores. Effectiveness of passive infrared camera trapping, and ultimately population monitoring, relies on temperature-mediated differences between the animal and its ambient environment to ensure good camera detection. In ectothermic predators such as large varanid lizards, this criterion is presumed less certain. Here we evaluated the effectiveness of camera trapping to potentially monitor the population status of the Komodo dragon (Varanus komodoensis), an apex predator, using site occupancy approaches. We compared site-specific estimates of site occupancy and detection derived using camera traps and cage traps at 181 trapping locations established across six sites on four islands within Komodo National Park, Eastern Indonesia. Detection and site occupancy at each site were estimated using eight competing models that considered site-specific variation in occupancy (ψ) and varied detection probabilities (p) according to detection method, site and survey number using a single season site occupancy modelling approach. The most parsimonious model [ψ (site), p (site*survey); ω = 0.74] suggested that site occupancy estimates differed among sites. Detection probability varied as an interaction between site and survey number. Our results indicate that overall camera traps produced similar estimates of detection and site occupancy to cage traps, irrespective of being paired, or unpaired, with cage traps. Whilst one site showed some evidence that detection was affected by trapping method, detection was too low to produce an accurate occupancy estimate. Overall, as camera trapping is logistically more feasible it may provide, with further validation, an alternative method for evaluating long-term site occupancy patterns in Komodo dragons, and potentially other large reptiles, aiding conservation of this species.
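
    The single-season occupancy likelihood behind models such as ψ(site), p(site*survey) is a zero-inflated binomial. The sketch below writes down that likelihood for a constant-ψ, constant-p case with invented detection histories; it is not the authors' eight-model comparison, and a real analysis would use covariates and a proper optimizer.

```python
from math import comb, log

# Zero-inflated binomial likelihood behind single-season occupancy models:
# a site with d detections in K surveys contributes
#   psi * C(K, d) * p**d * (1-p)**(K-d), plus (1 - psi) when d == 0
# (a never-detected site may be unoccupied). Detection histories below are
# hypothetical, not the study's data.

def neg_log_lik(psi, p, detections, K):
    nll = 0.0
    for d in detections:
        lik = psi * comb(K, d) * p ** d * (1 - p) ** (K - d)
        if d == 0:
            lik += 1 - psi
        nll -= log(lik)
    return nll

detections, K = [2, 0], 3
print(round(neg_log_lik(0.6, 0.5, detections, K), 3))   # 2.236

# Crude maximum-likelihood fit by grid search (use a real optimizer, e.g.
# scipy.optimize.minimize, in practice):
grid = [(a / 100, b / 100) for a in range(1, 100) for b in range(1, 100)]
psi_hat, p_hat = min(grid, key=lambda t: neg_log_lik(*t, detections, K))
```

    The (1 - psi) term for all-zero histories is what separates "occupied but undetected" from "unoccupied", which is exactly the distinction that low detection at one of the study's sites made impossible to resolve.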

  5. Can Camera Traps Monitor Komodo Dragons, a Large Ectothermic Predator?

    PubMed Central

    Ariefiandy, Achmad; Purwandana, Deni; Seno, Aganto; Ciofi, Claudio; Jessop, Tim S.

    2013-01-01

    Camera trapping has greatly enhanced population monitoring of often cryptic and low abundance apex carnivores. Effectiveness of passive infrared camera trapping, and ultimately population monitoring, relies on temperature-mediated differences between the animal and its ambient environment to ensure good camera detection. In ectothermic predators such as large varanid lizards, this criterion is presumed less certain. Here we evaluated the effectiveness of camera trapping to potentially monitor the population status of the Komodo dragon (Varanus komodoensis), an apex predator, using site occupancy approaches. We compared site-specific estimates of site occupancy and detection derived using camera traps and cage traps at 181 trapping locations established across six sites on four islands within Komodo National Park, Eastern Indonesia. Detection and site occupancy at each site were estimated using eight competing models that considered site-specific variation in occupancy (ψ) and varied detection probabilities (p) according to detection method, site and survey number using a single season site occupancy modelling approach. The most parsimonious model [ψ (site), p (site*survey); ω = 0.74] suggested that site occupancy estimates differed among sites. Detection probability varied as an interaction between site and survey number. Our results indicate that overall camera traps produced similar estimates of detection and site occupancy to cage traps, irrespective of being paired, or unpaired, with cage traps. Whilst one site showed some evidence that detection was affected by trapping method, detection was too low to produce an accurate occupancy estimate. Overall, as camera trapping is logistically more feasible it may provide, with further validation, an alternative method for evaluating long-term site occupancy patterns in Komodo dragons, and potentially other large reptiles, aiding conservation of this species. PMID:23527027

  6. Evaluating detection and estimation capabilities of magnetometer-based vehicle sensors

    NASA Astrophysics Data System (ADS)

    Slater, David M.; Jacyna, Garry M.

    2013-05-01

    In an effort to secure the northern and southern United States borders, MITRE has been tasked with developing Modeling and Simulation (M&S) tools that accurately capture the mapping between algorithm-level Measures of Performance (MOP) and system-level Measures of Effectiveness (MOE) for current/future surveillance systems deployed by the Customs and Border Protection Office of Technology Innovations and Acquisitions (OTIA). This analysis is part of a larger M&S undertaking. The focus is on two MOPs for magnetometer-based Unattended Ground Sensors (UGS). UGS are placed near roads to detect passing vehicles and estimate properties of the vehicle's trajectory such as bearing and speed. The first MOP considered is the probability of detection. We derive probabilities of detection for a network of sensors over an arbitrary number of observation periods and explore how the probability of detection changes when multiple sensors are employed. The performance of UGS is also evaluated based on the level of variance in the estimation of trajectory parameters. We derive the Cramer-Rao bounds for the variances of the estimated parameters in two cases: when no a priori information is known and when the parameters are assumed to be Gaussian with known variances. Sample results show that UGS perform significantly better in the latter case.

  7. PSE-HMM: genome-wide CNV detection from NGS data using an HMM with Position-Specific Emission probabilities.

    PubMed

    Malekpour, Seyed Amir; Pezeshk, Hamid; Sadeghi, Mehdi

    2016-11-03

    Copy Number Variation (CNV) is envisaged to be a major source of large structural variations in the human genome. In recent years, many studies have applied Next Generation Sequencing (NGS) data to CNV detection; however, more accurate computational tools are still needed. In this study, mate-pair NGS data are used for CNV detection in a Hidden Markov Model (HMM). The proposed HMM has position-specific emission probabilities, i.e. a Gaussian mixture distribution. Each component in the Gaussian mixture distribution captures a different type of aberration that is observed in the mate pairs after they are mapped to the reference genome. These aberrations may include an increase (or decrease) in the insertion size or a change in the direction of mate pairs that are mapped to the reference genome. This HMM with Position-Specific Emission probabilities (PSE-HMM) is utilized for the genome-wide detection of deletions and tandem duplications. The performance of PSE-HMM is evaluated on a simulated dataset and also on real data from a Yoruban HapMap individual, NA18507. PSE-HMM is effective in taking observation dependencies into account and reaches a high accuracy in detecting genome-wide CNVs. MATLAB programs are available at http://bs.ipm.ir/softwares/PSE-HMM/.
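
    The HMM machinery can be sketched with a forward pass over insert sizes. This is a heavily simplified stand-in: PSE-HMM uses position-specific Gaussian mixtures and MATLAB, whereas the toy below uses a single Gaussian per state, two states, and invented numbers, purely to show how stretched insert sizes shift probability mass toward a deletion state.

```python
from math import exp, pi, sqrt

# Toy HMM forward pass over mate-pair insert sizes: state 0 = normal,
# state 1 = deletion (inserts look stretched). PSE-HMM uses position-specific
# Gaussian mixtures; one Gaussian per state is kept here for brevity, and
# all parameter values and observations are invented.

def gauss(x, mu, sigma):
    return exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * sqrt(2 * pi))

def forward(observations, trans, mus, sigmas, init=(0.99, 0.01)):
    alphas = [init[s] * gauss(observations[0], mus[s], sigmas[s]) for s in (0, 1)]
    for x in observations[1:]:
        alphas = [gauss(x, mus[s], sigmas[s]) *
                  sum(alphas[q] * trans[q][s] for q in (0, 1)) for s in (0, 1)]
    return alphas  # joint likelihood of the data and each final state

trans = [[0.95, 0.05], [0.10, 0.90]]
mus, sigmas = (300.0, 800.0), (30.0, 30.0)   # insert-size mean per state
inserts = [310, 295, 805, 790, 810]          # last three look like a deletion
alphas = forward(inserts, trans, mus, sigmas)
print(alphas[1] > alphas[0])   # deletion state dominates at the end
```

    In practice the forward pass is run in log space to avoid underflow, and emissions depend on position; the sketch keeps only the state-switching logic.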

  8. Estimating trends in alligator populations from nightlight survey data

    USGS Publications Warehouse

    Fujisaki, Ikuko; Mazzotti, Frank J.; Dorazio, Robert M.; Rice, Kenneth G.; Cherkiss, Michael; Jeffery, Brian

    2011-01-01

    Nightlight surveys are commonly used to evaluate status and trends of crocodilian populations, but imperfect detection caused by survey- and location-specific factors makes it difficult to draw population inferences accurately from uncorrected data. We used a two-stage hierarchical model comprising population abundance and detection probability to examine recent abundance trends of American alligators (Alligator mississippiensis) in subareas of Everglades wetlands in Florida using nightlight survey data. During 2001–2008, there were declining trends in abundance of small and/or medium sized animals in a majority of subareas, whereas abundance of large sized animals showed either an increasing or an unclear trend. For small and large size classes, estimated detection probability declined as water depth increased. Detection probability of small animals was much lower than for larger size classes. The declining trend of smaller alligators may reflect a natural population response to the fluctuating environment of Everglades wetlands under modified hydrology. It may have negative implications for the future of alligator populations in this region, particularly if habitat conditions do not favor recruitment of offspring in the near term. Our study provides a foundation to improve inferences made from nightlight surveys of other crocodilian populations.

  9. Estimating trends in alligator populations from nightlight survey data

    USGS Publications Warehouse

    Fujisaki, Ikuko; Mazzotti, F.J.; Dorazio, R.M.; Rice, K.G.; Cherkiss, M.; Jeffery, B.

    2011-01-01

    Nightlight surveys are commonly used to evaluate status and trends of crocodilian populations, but imperfect detection caused by survey- and location-specific factors makes it difficult to draw population inferences accurately from uncorrected data. We used a two-stage hierarchical model comprising population abundance and detection probability to examine recent abundance trends of American alligators (Alligator mississippiensis) in subareas of Everglades wetlands in Florida using nightlight survey data. During 2001-2008, there were declining trends in abundance of small and/or medium sized animals in a majority of subareas, whereas abundance of large sized animals showed either an increasing or an unclear trend. For small and large size classes, estimated detection probability declined as water depth increased. Detection probability of small animals was much lower than for larger size classes. The declining trend of smaller alligators may reflect a natural population response to the fluctuating environment of Everglades wetlands under modified hydrology. It may have negative implications for the future of alligator populations in this region, particularly if habitat conditions do not favor recruitment of offspring in the near term. Our study provides a foundation to improve inferences made from nightlight surveys of other crocodilian populations. © 2011 US Government.

  10. TORNADO-WARNING PERFORMANCE IN THE PAST AND FUTURE: A Perspective from Signal Detection Theory.

    NASA Astrophysics Data System (ADS)

    Brooks, Harold E.

    2004-06-01

    Changes over the years in tornado-warning performance in the United States can be modeled from the perspective of signal detection theory. From this view, it can be seen that there have been distinct periods of change in performance, most likely associated with deployment of radars, and changes in scientific understanding and training. The model also makes it clear that improvements in the false alarm ratio can only occur at the cost of large decreases in the probability of detection, or with large improvements in the overall quality of the warning system.
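
    The constraint the abstract describes falls out of equal-variance Gaussian signal detection theory: with sensitivity d' fixed, moving the warning criterion trades false alarms against detections. The sketch below uses the false-alarm rate of SDT as a stand-in for the operational false alarm ratio, and the d' and criterion values are illustrative only.

```python
from math import erf, sqrt

# Equal-variance Gaussian signal detection model of warning decisions:
# with sensitivity d' fixed, shifting the criterion c trades false alarms
# against detections. (SDT's false-alarm *rate* stands in here for the
# operational false alarm *ratio*; d' and c values are illustrative.)

def phi(x):                       # standard normal CDF
    return 0.5 * (1 + erf(x / sqrt(2)))

def pod(d_prime, c):              # probability of detection (hit rate)
    return 1 - phi(c - d_prime)

def far(c):                       # false-alarm rate
    return 1 - phi(c)

d_prime = 1.5
for c in (0.0, 1.0):              # lenient vs strict warning criterion
    print(c, round(pod(d_prime, c), 2), round(far(c), 2))
```

    Raising the criterion cuts false alarms but also detections; only an increase in d' (better radar, science, and training, in the abstract's terms) improves both at once.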

  11. Knowing where is different from knowing what: Distinct response time profiles and accuracy effects for target location, orientation, and color probability.

    PubMed

    Jabar, Syaheed B; Filipowicz, Alex; Anderson, Britt

    2017-11-01

    When a location is cued, targets appearing at that location are detected more quickly. When a target feature is cued, targets bearing that feature are detected more quickly. These attentional cueing effects are only superficially similar. More detailed analyses find distinct temporal and accuracy profiles for the two different types of cues. This pattern parallels work with probability manipulations, where both feature and spatial probability are known to affect detection accuracy and reaction times. However, little has been done by way of comparing these effects. Are probability manipulations on space and features distinct? In a series of five experiments, we systematically varied spatial probability and feature probability along two dimensions (orientation or color). In addition, we decomposed response times into initiation and movement components. Targets appearing at the probable location were reported more quickly and more accurately regardless of whether the report was based on orientation or color. On the other hand, when either color probability or orientation probability was manipulated, response time and accuracy improvements were specific for that probable feature dimension. Decomposition of the response time benefits demonstrated that spatial probability only affected initiation times, whereas manipulations of feature probability affected both initiation and movement times. As detection was made more difficult, the two effects further diverged, with spatial probability disproportionally affecting initiation times and feature probability disproportionately affecting accuracy. In conclusion, all manipulations of probability, whether spatial or featural, affect detection. However, only feature probability affects perceptual precision, and precision effects are specific to the probable attribute.

  12. Bayesian methods for outliers detection in GNSS time series

    NASA Astrophysics Data System (ADS)

    Qianqian, Zhang; Qingming, Gui

    2013-07-01

    This article is concerned with the problem of detecting outliers in GNSS time series based on Bayesian statistical theory. Firstly, a new model is proposed to simultaneously detect different types of outliers by introducing classification variables corresponding to each outlier type; the problem of outlier detection is converted into the computation of the corresponding posterior probabilities, and an algorithm for computing the posterior probabilities based on a standard Gibbs sampler is designed. Secondly, we analyze in depth the causes of masking and swamping when detecting patches of additive outliers, and propose an unmasking Bayesian method for detecting additive outlier patches based on an adaptive Gibbs sampler. Thirdly, the correctness of the theories and methods proposed above is illustrated with simulated data and then by analyzing real GNSS observations, such as cycle-slip detection in carrier phase data. Examples illustrate that the Bayesian methods for outlier detection in GNSS time series proposed in this paper are capable of detecting not only isolated outliers but also additive outlier patches. Furthermore, they can be successfully used to process cycle slips in phase data, which solves the problem of small cycle slips.

  13. Prediction Metrics for Chemical Detection in Long-Wave Infrared Hyperspectral Imagery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chilton, Marie C.; Walsh, Stephen J.; Daly, Don S.

    2009-01-29

    A natural or anthropogenic process often generates a signature gas plume whose chemical constituents may be identified using hyperspectral imagery. A hyperspectral image is a pixel-indexed set of spectra where each spectrum reflects the chemical constituents of the plume, the atmosphere, the bounding background surface, and instrument noise. This study explored the relationship between gas absorbance and background emissivity across the long-wave infrared (LWIR) spectrum and how they affect relative gas detection sensitivity. The physics-based model for the observed radiance shows that high gas absorbance coupled with low background emissivity at a single wavenumber results in a stronger recorded radiance. Two sensitivity measures were developed to predict relative probability of detection using chemical absorbance and background emissivity: one focused on a single wavenumber while another accounted for the entire spectrum. The predictive abilities of these measures were compared to synthetic image analysis. This study simulated images with 499 distinct gases at each of 6 concentrations over 6 different background surfaces with the atmosphere and level of instrument noise held constant. The Whitened Matched Filter was used to define gas detection from an image spectrum. The estimate of a chemical’s probability of detection at a given concentration over a specific background was the proportion of detections in 500 trials. Of the 499 chemicals used in the images, 276 had estimated probabilities of detection below 0.2 across all backgrounds and concentrations; these chemicals were removed from the study. For 92.8 percent of the remaining chemicals, the single channel measure correctly predicted the background over which the chemical had the largest relative probability of detection. Further, the measure which accounted for information across all wavenumbers predicted the background over which the chemical had the largest relative probability of detection for 93.3 percent of the chemicals. These results suggest that the wavenumber with largest gas absorbance has the most influence over gas detection for this data. By furthering the in-silico experimentation with higher concentrations of gases not detectable in this experiment or by standardizing the gas absorbance spectra to unit vectors, these conclusions may be confirmed and generalized to more gases. This will help simplify image acquisition planning and the identification of unknowns in field collected images.
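
    The Whitened Matched Filter the study uses to score detections can be sketched generically: whiten by the background covariance, then correlate with the target signature. The spectra, covariance, and plume strength below are synthetic stand-ins, not LWIR data or the study's filter implementation.

```python
import numpy as np

# Sketch of a whitened matched filter: whiten each pixel spectrum by the
# background covariance, then correlate with the gas signature. All spectra
# here are synthetic stand-ins for LWIR data.

rng = np.random.default_rng(0)
bands = 50
signature = np.sin(np.linspace(0, np.pi, bands))      # toy gas signature
cov = np.diag(rng.uniform(0.5, 2.0, bands))           # background covariance
cov_inv = np.linalg.inv(cov)

def wmf_statistic(x, s, cov_inv):
    return (s @ cov_inv @ x) / np.sqrt(s @ cov_inv @ s)

background = rng.multivariate_normal(np.zeros(bands), cov, 200)
with_plume = background + 0.8 * signature             # plume-present pixels

t_bg = np.array([wmf_statistic(x, signature, cov_inv) for x in background])
t_pl = np.array([wmf_statistic(x, signature, cov_inv) for x in with_plume])
print(t_pl.mean() > t_bg.mean())   # plume pixels score higher on average
```

    Thresholding the statistic and counting exceedances over repeated noise draws is how a Monte Carlo probability-of-detection estimate, like the study's 500-trial proportions, is formed.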

  14. Heterozygosity-fitness correlations in a wild mammal population: accounting for parental and environmental effects.

    PubMed

    Annavi, Geetha; Newman, Christopher; Buesching, Christina D; Macdonald, David W; Burke, Terry; Dugdale, Hannah L

    2014-06-01

    HFCs (heterozygosity-fitness correlations) measure the direct relationship between an individual's genetic diversity and fitness. The effects of parental heterozygosity and the environment on HFCs are currently under-researched. We investigated these in a high-density U.K. population of European badgers (Meles meles), using a multimodel capture-mark-recapture framework and 35 microsatellite loci. We detected interannual variation in first-year, but not adult, survival probability. Adult females had higher annual survival probabilities than adult males. Cubs with more heterozygous fathers had higher first-year survival, but only in wetter summers; there was no relationship with individual or maternal heterozygosity. Moist soil conditions enhance badger food supply (earthworms), improving survival. In dryer years, higher indiscriminate mortality rates appear to mask differential heterozygosity-related survival effects. This paternal interaction was significant in the most supported model; however, the model-averaged estimate had a relative importance of 0.50 and overlapped zero slightly. First-year survival probabilities were not correlated with the inbreeding coefficient (f); however, small sample sizes limited the power to detect inbreeding depression. Correlations between individual heterozygosity and inbreeding were weak, in line with published meta-analyses showing that HFCs tend to be weak. We found support for general rather than local heterozygosity effects on first-year survival probability, and g2 indicated that our markers had power to detect inbreeding. We emphasize the importance of assessing how environmental stressors can influence the magnitude and direction of HFCs and of considering how parental genetic diversity can affect fitness-related traits, which could play an important role in the evolution of mate choice.

  15. Early detection of emerging forest disease using dispersal estimation and ecological niche modeling.

    PubMed

    Meentemeyer, Ross K; Anacker, Brian L; Mark, Walter; Rizzo, David M

    2008-03-01

    Distinguishing the manner in which dispersal limitation and niche requirements control the spread of invasive pathogens is important for prediction and early detection of disease outbreaks. Here, we use niche modeling augmented by dispersal estimation to examine the degree to which local habitat conditions vs. force of infection predict invasion of Phytophthora ramorum, the causal agent of the emerging infectious tree disease sudden oak death. We sampled 890 field plots for the presence of P. ramorum over a three-year period (2003-2005) across a range of host and abiotic conditions with variable proximities to known infections in California, USA. We developed and validated generalized linear models of invasion probability to analyze the relative predictive power of 12 niche variables and a negative exponential dispersal kernel estimated by likelihood profiling. Models were developed incrementally each year (2003, 2003-2004, 2003-2005) to examine annual variability in model parameters and to create realistic scenarios for using models to predict future infections and to guide early-detection sampling. Overall, 78 new infections were observed up to 33.5 km from the nearest known site of infection, with slightly increasing rates of prevalence across time windows (2003, 6.5%; 2003-2004, 7.1%; 2003-2005, 9.6%). The pathogen was not detected in many field plots that contained susceptible host vegetation. The generalized linear modeling indicated that the probability of invasion is limited by both dispersal and niche constraints. Probability of invasion was positively related to precipitation and temperature in the wet season and the presence of the inoculum-producing foliar host Umbellularia californica and decreased exponentially with distance to inoculum sources. 
Models that incorporated niche and dispersal parameters best predicted the locations of new infections, with accuracies ranging from 0.86 to 0.90, suggesting that the modeling approach can be used to forecast locations of disease spread. Application of the combined niche plus dispersal models in a geographic information system predicted the presence of P. ramorum across approximately 8228 km2 of California's 84785 km2 (9.7%) of land area with susceptible host species. This research illustrates how probabilistic modeling can be used to analyze the relative roles of niche and dispersal limitation in controlling the distribution of invasive pathogens.
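
    The combined niche-plus-dispersal logic of record 15 can be sketched as a logistic model whose linear predictor includes a negative exponential dispersal kernel. Every coefficient, variable name, and the kernel range below is an illustrative placeholder, not a fitted value from the study:

```python
import math

def invasion_probability(dist_km, precip, host_present,
                         b0=-3.0, b_precip=0.8, b_host=1.5,
                         b_disp=2.5, alpha=5.0):
    """Logistic invasion model whose linear predictor combines niche
    terms with a negative exponential dispersal kernel exp(-d/alpha).
    All coefficients and the kernel range alpha are illustrative."""
    kernel = math.exp(-dist_km / alpha)            # force-of-infection term
    logit = b0 + b_precip * precip + b_host * host_present + b_disp * kernel
    return 1.0 / (1.0 + math.exp(-logit))

# Invasion probability decays with distance from known inoculum sources:
near = invasion_probability(dist_km=1.0, precip=1.0, host_present=1)
far = invasion_probability(dist_km=30.0, precip=1.0, host_present=1)
assert near > far
```

    Fitting such a model would mean estimating alpha by likelihood profiling alongside the GLM coefficients, as the abstract describes.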

  16. Autonomous acoustic recorders reveal complex patterns in avian detection probability

    USGS Publications Warehouse

    Thompson, Sarah J.; Handel, Colleen M.; McNew, Lance B.

    2017-01-01

    Avian point‐count surveys are typically designed to occur during periods when birds are consistently active and singing, but seasonal and diurnal patterns of detection probability are often not well understood and may vary regionally or between years. We deployed autonomous acoustic recorders to assess how avian availability for detection (i.e., the probability that a bird signals its presence during a recording) varied during the breeding season with time of day, date, and weather‐related variables at multiple subarctic tundra sites in Alaska, USA, 2013–2014. A single observer processed 2,692 10‐minute recordings across 11 site‐years. We used time‐removal methods to assess availability and used generalized additive models to examine patterns of detectability (joint probability of presence, availability, and detection) for 16 common species. Despite lack of distinct dawn or dusk, most species displayed circadian vocalization patterns, with detection rates generally peaking between 0800 hours and 1200 hours but remaining high as late as 2000 hours for some species. Between 2200 hours and 0500 hours, most species’ detection rates dropped to near 0, signaling a distinctive rest period. Detectability dropped sharply for most species in early July. For all species considered, time‐removal analysis indicated nearly 100% likelihood of detection during a 10‐minute recording conducted in June, between 0500 hours and 2000 hours. This indicates that non‐detections during appropriate survey times and dates were attributable to the species’ absence or that silent birds were unlikely to initiate singing during a 10‐minute interval, whereas vocally active birds were singing very frequently. Systematic recordings revealed a gradient of species’ presence at each site, from ubiquitous to incidental. Although the total number of species detected at a site ranged from 16 to 27, we detected only 4 to 15 species on ≥5% of the site's recordings. 
Recordings provided an unusually detailed and consistent dataset that allowed us to identify, among other things, appropriate survey dates and times for species breeding at northern latitudes. Our results also indicated that more recordings of shorter duration (1–4 min) may be most efficient for detecting passerines.
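
    The availability component that time-removal methods estimate is commonly modeled by assuming vocal cues follow a Poisson process, so the probability of at least one cue in a recording of length T at cue rate r is 1 - e^(-rT). A minimal sketch with illustrative rates:

```python
import math

def availability(rate_per_min, duration_min):
    """Probability a bird gives at least one detectable cue during a
    recording, assuming cues follow a Poisson process at a constant
    rate -- a standard assumption behind time-removal availability
    models. The rates used below are illustrative."""
    return 1.0 - math.exp(-rate_per_min * duration_min)

# A frequent singer is near-certain to be available within 10 minutes,
# while an infrequent one is often silent for the whole recording:
assert availability(1.0, 10) > 0.99     # ~1 cue/min
assert availability(0.05, 10) < 0.5     # ~1 cue every 20 min
```

    Under this model, availability saturates quickly for vocally active birds, which is consistent with the authors' finding that shorter recordings can be more efficient per unit of listening effort.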

  17. Efficient estimation of abundance for patchily distributed populations via two-phase, adaptive sampling.

    USGS Publications Warehouse

    Conroy, M.J.; Runge, J.P.; Barker, R.J.; Schofield, M.R.; Fonnesbeck, C.J.

    2008-01-01

    Many organisms are patchily distributed, with some patches occupied at high density, others at lower densities, and others not occupied. Estimation of overall abundance can be difficult and is inefficient via intensive approaches such as capture-mark-recapture (CMR) or distance sampling. We propose a two-phase sampling scheme and model in a Bayesian framework to estimate abundance for patchily distributed populations. In the first phase, occupancy is estimated by binomial detection samples taken on all selected sites, where selection may be of all sites available, or a random sample of sites. Detection can be by visual surveys, detection of sign, physical captures, or other approach. At the second phase, if a detection threshold is achieved, CMR or other intensive sampling is conducted via standard procedures (grids or webs) to estimate abundance. Detection and CMR data are then used in a joint likelihood to model probability of detection in the occupancy sample via an abundance-detection model. CMR modeling is used to estimate abundance for the abundance-detection relationship, which in turn is used to predict abundance at the remaining sites, where only detection data are collected. We present a full Bayesian modeling treatment of this problem, in which posterior inference on abundance and other parameters (detection, capture probability) is obtained under a variety of assumptions about spatial and individual sources of heterogeneity. We apply the approach to abundance estimation for two species of voles (Microtus spp.) in Montana, USA. We also use a simulation study to evaluate the frequentist properties of our procedure given known patterns in abundance and detection among sites as well as design criteria. For most population characteristics and designs considered, bias and mean-square error (MSE) were low, and coverage of true parameter values by Bayesian credibility intervals was near nominal. 
Our two-phase, adaptive approach allows efficient estimation of abundance of rare and patchily distributed species and is particularly appropriate when sampling in all patches is impossible, but a global estimate of abundance is required.
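
    The two-phase scheme of record 17 can be caricatured in a few lines: binomial detection sampling everywhere, with intensive sampling triggered only where a detection threshold is met. The per-individual detection probability, sample counts, and threshold below are invented for illustration, and phase two is reduced to simply recording the site:

```python
import random

random.seed(1)

def two_phase_survey(sites, p_ind=0.15, n_occ_samples=5, threshold=2):
    """Toy sketch of the two-phase scheme: binomial detection sampling
    at every site, then intensive (phase-two) sampling only where the
    detection threshold is reached. p_ind, n_occ_samples, and
    threshold are illustrative choices, not values from the paper."""
    phase_two_sites = []
    for n_animals in sites:
        # Site-level detection grows with abundance: 1 - (1 - p_ind)^N.
        p_site = 1.0 - (1.0 - p_ind) ** n_animals
        hits = sum(random.random() < p_site for _ in range(n_occ_samples))
        if hits >= threshold:
            phase_two_sites.append(n_animals)
    return phase_two_sites

# Unoccupied patches can never trigger phase two, and denser patches
# are much more likely to:
patches = [0, 0, 1, 2, 8, 12]
surveyed = two_phase_survey(patches)
assert all(n > 0 for n in surveyed)
```

    In the actual model, the detection and CMR data enter a joint likelihood so that abundance at detection-only sites is predicted through the abundance-detection relationship rather than observed directly.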

  18. On the choice of statistical models for estimating occurrence and extinction from animal surveys

    USGS Publications Warehouse

    Dorazio, R.M.

    2007-01-01

    In surveys of natural animal populations the number of animals that are present and available to be detected at a sample location is often low, resulting in few or no detections. Low detection frequencies are especially common in surveys of imperiled species; however, the choice of sampling method and protocol also may influence the size of the population that is vulnerable to detection. In these circumstances, probabilities of animal occurrence and extinction will generally be estimated more accurately if the models used in data analysis account for differences in abundance among sample locations and for the dependence between site-specific abundance and detection. Simulation experiments are used to illustrate conditions wherein these types of models can be expected to outperform alternative estimators of population site occupancy and extinction. © 2007 by the Ecological Society of America.
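
    The dependence between site-specific abundance and detection that record 18 emphasizes is commonly captured by the link p_site = 1 - (1 - r)^N, as in Royle-Nichols-type models; a minimal sketch with illustrative values:

```python
def p_site(n, r):
    """Abundance-detection link used in models of this kind (e.g.,
    Royle-Nichols): with N animals present, each detected independently
    with probability r, the species is detected with probability
    1 - (1 - r)^N. The values below are illustrative."""
    return 1.0 - (1.0 - r) ** n

# Occupied-but-sparse sites are easy to miss, which is why ignoring
# abundance variation biases occupancy and extinction estimators:
assert p_site(1, 0.2) < 0.25
assert p_site(10, 0.2) > 0.85
```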

  19. Pairing field methods to improve inference in wildlife surveys while accommodating detection covariance.

    PubMed

    Clare, John; McKinney, Shawn T; DePue, John E; Loftin, Cynthia S

    2017-10-01

    It is common to use multiple field sampling methods when implementing wildlife surveys to compare method efficacy or cost efficiency, integrate distinct pieces of information provided by separate methods, or evaluate method-specific biases and misclassification error. Existing models that combine information from multiple field methods or sampling devices permit rigorous comparison of method-specific detection parameters, enable estimation of additional parameters such as false-positive detection probability, and improve occurrence or abundance estimates, but with the assumption that the separate sampling methods produce detections independently of one another. This assumption is tenuous if methods are paired or deployed in close proximity simultaneously, a common practice that reduces the additional effort required to implement multiple methods and reduces the risk that differences between method-specific detection parameters are confounded by other environmental factors. We develop occupancy and spatial capture-recapture models that permit covariance between the detections produced by different methods, use simulation to compare estimator performance of the new models to models assuming independence, and provide an empirical application based on American marten (Martes americana) surveys using paired remote cameras, hair catches, and snow tracking. Simulation results indicate existing models that assume that methods independently detect organisms produce biased parameter estimates and substantially understate estimate uncertainty when this assumption is violated, while our reformulated models are robust to either methodological independence or covariance. Empirical results suggested that remote cameras and snow tracking had comparable probability of detecting present martens, but that snow tracking also produced false-positive marten detections that could potentially substantially bias distribution estimates if not corrected for. 
Remote cameras detected marten individuals more readily than passive hair catches. Inability to photographically distinguish individual sex did not appear to induce negative bias in camera density estimates; instead, hair catches appeared to produce detection competition between individuals that may have been a source of negative bias. Our model reformulations broaden the range of circumstances in which analyses incorporating multiple sources of information can be robustly used, and our empirical results demonstrate that using multiple field-methods can enhance inferences regarding ecological parameters of interest and improve understanding of how reliably survey methods sample these parameters. © 2017 by the Ecological Society of America.

  20. Piéron’s Law and Optimal Behavior in Perceptual Decision-Making

    PubMed Central

    van Maanen, Leendert; Grasman, Raoul P. P. P.; Forstmann, Birte U.; Wagenmakers, Eric-Jan

    2012-01-01

    Piéron’s Law is a psychophysical regularity in signal detection tasks that states that mean response times decrease as a power function of stimulus intensity. In this article, we extend Piéron’s Law to perceptual two-choice decision-making tasks, and demonstrate that the law holds as the discriminability between two competing choices is manipulated, even though the stimulus intensity remains constant. This result is consistent with predictions from a Bayesian ideal observer model. The model assumes that in order to respond optimally in a two-choice decision-making task, participants continually update the posterior probability of each response alternative, until the probability of one alternative crosses a criterion value. In addition to predictions for two-choice decision-making tasks, we extend the ideal observer model to predict Piéron’s Law in signal detection tasks. We conclude that Piéron’s Law is a general phenomenon that may be caused by optimality constraints. PMID:22232572
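
    Piéron's Law as stated in record 20 is a power function of intensity plus an asymptotic residual time, RT = alpha * I^(-beta) + gamma. The parameter values below are illustrative, not fits from the article:

```python
def pieron_rt(intensity, alpha=300.0, beta=0.5, gamma=200.0):
    """Pieron's Law: mean response time falls as a power function of
    stimulus intensity, RT = alpha * I**(-beta) + gamma, where gamma is
    an asymptotic residual time. Parameter values are illustrative."""
    return alpha * intensity ** (-beta) + gamma

# Stronger (or, per the article, more discriminable) stimuli give
# faster mean responses, approaching the residual time gamma:
assert pieron_rt(4.0) < pieron_rt(1.0)
assert pieron_rt(1e6) < 201.0
```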

  1. Replication of genetic associations as pseudoreplication due to shared genealogy.

    PubMed

    Rosenberg, Noah A; Vanliere, Jenna M

    2009-09-01

    The genotypes of individuals in replicate genetic association studies have some level of correlation due to shared descent in the complete pedigree of all living humans. As a result of this genealogical sharing, replicate studies that search for genotype-phenotype associations using linkage disequilibrium between marker loci and disease-susceptibility loci can be considered as "pseudoreplicates" rather than true replicates. We examine the size of the pseudoreplication effect in association studies simulated from evolutionary models of the history of a population, evaluating the excess probability that both of a pair of studies detect a disease association compared to the probability expected under the assumption that the two studies are independent. Each of nine combinations of a demographic model and a penetrance model leads to a detectable pseudoreplication effect, suggesting that the degree of support that can be attributed to a replicated genetic association result is less than that which can be attributed to a replicated result in a context of true independence.

  2. Replication of genetic associations as pseudoreplication due to shared genealogy

    PubMed Central

    Rosenberg, Noah A.; VanLiere, Jenna M.

    2009-01-01

    The genotypes of individuals in replicate genetic association studies have some level of correlation due to shared descent in the complete pedigree of all living humans. As a result of this genealogical sharing, replicate studies that search for genotype-phenotype associations using linkage disequilibrium between marker loci and disease-susceptibility loci can be considered “pseudoreplicates” rather than true replicates. We examine the size of the pseudoreplication effect in association studies simulated from evolutionary models of the history of a population, evaluating the excess probability that both of a pair of studies detect a disease association compared to the probability expected under the assumption that the two studies are independent. Each of nine combinations of a demographic model and a penetrance model leads to a detectable pseudoreplication effect, suggesting that the degree of support that can be attributed to a replicated genetic association result is less than that which can be attributed to a replicated result in a context of true independence. PMID:19191270

  3. Mixture models for detecting differentially expressed genes in microarrays.

    PubMed

    Jones, Liat Ben-Tovim; Bean, Richard; McLachlan, Geoffrey J; Zhu, Justin Xi

    2006-10-01

    An important and common problem in microarray experiments is the detection of genes that are differentially expressed in a given number of classes. As this problem concerns the selection of significant genes from a large pool of candidate genes, it needs to be carried out within the framework of multiple hypothesis testing. In this paper, we focus on the use of mixture models to handle the multiplicity issue. With this approach, a measure of the local FDR (false discovery rate) is provided for each gene. An attractive feature of the mixture model approach is that it provides a framework for the estimation of the prior probability that a gene is not differentially expressed, and this probability can subsequently be used in forming a decision rule. The rule can also be formed to take the false negative rate into account. We apply this approach to a well-known publicly available data set on breast cancer, and discuss our findings with reference to other approaches.
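
    The local-FDR idea in record 3 can be sketched with a two-component normal mixture: the marginal density of a gene's test statistic is f(z) = pi0*f0(z) + (1 - pi0)*f1(z), and the local FDR at z is the posterior null probability pi0*f0(z)/f(z). The mixing proportion and alternative mean below are made-up values, not estimates from any dataset:

```python
import math

def norm_pdf(z, mu=0.0, sd=1.0):
    """Normal density, used for both mixture components."""
    return math.exp(-0.5 * ((z - mu) / sd) ** 2) / (sd * math.sqrt(2.0 * math.pi))

def local_fdr(z, pi0=0.9, mu1=2.5):
    """Two-component mixture sketch: f0 is the N(0,1) null density and
    f1 an illustrative N(mu1, 1) alternative. pi0 plays the role of the
    prior probability that a gene is not differentially expressed."""
    f0 = norm_pdf(z)
    f1 = norm_pdf(z, mu=mu1)
    return pi0 * f0 / (pi0 * f0 + (1.0 - pi0) * f1)

# Extreme statistics carry a low local FDR; central ones a high one:
assert local_fdr(4.0) < 0.2
assert local_fdr(0.0) > 0.9
```

    In practice pi0 and the alternative component are estimated from the data, which is the attractive feature of the mixture approach the abstract highlights.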

  4. Use of spatial capture-recapture modeling and DNA data to estimate densities of elusive animals

    USGS Publications Warehouse

    Kery, Marc; Gardner, Beth; Stoeckle, Tabea; Weber, Darius; Royle, J. Andrew

    2011-01-01

    Assessment of abundance, survival, recruitment rates, and density (i.e., population assessment) is especially challenging for elusive species most in need of protection (e.g., rare carnivores). Individual identification methods, such as DNA sampling, provide ways of studying such species efficiently and noninvasively. Additionally, statistical methods that correct for undetected animals and account for locations where animals are captured are available to efficiently estimate density and other demographic parameters. We collected hair samples of European wildcat (Felis silvestris) from cheek-rub lure sticks, extracted DNA from the samples, and identified each animal's genotype. To estimate the density of wildcats, we used Bayesian inference in a spatial capture-recapture model. We used WinBUGS to fit a model that accounted for differences in detection probability among individuals and seasons and between two lure arrays. We detected 21 individual wildcats (including possible hybrids) 47 times. Wildcat density was estimated at 0.29/km2 (SE 0.06), and 95% of the activity of wildcats was estimated to occur within 1.83 km of their home-range center. Lures located systematically were associated with a greater number of detections than lures placed in a cell on the basis of expert opinion. Detection probability of individual cats was greatest in late March. Our model is a generalized linear mixed model; hence, it can be easily extended, for instance, to incorporate trap- and individual-level covariates. We believe that the combined use of noninvasive sampling techniques and spatial capture-recapture models will improve population assessments, especially for rare and elusive animals.

  5. Sequential Probability Ratio Test for Collision Avoidance Maneuver Decisions Based on a Bank of Norm-Inequality-Constrained Epoch-State Filters

    NASA Technical Reports Server (NTRS)

    Carpenter, J. R.; Markley, F. L.; Alfriend, K. T.; Wright, C.; Arcido, J.

    2011-01-01

    Sequential probability ratio tests explicitly allow decision makers to incorporate false alarm and missed detection risks, and are potentially less sensitive to modeling errors than a procedure that relies solely on a probability of collision threshold. Recent work on constrained Kalman filtering has suggested an approach to formulating such a test for collision avoidance maneuver decisions: a filter bank with two norm-inequality-constrained epoch-state extended Kalman filters. One filter models the null hypothesis that the miss distance is inside the combined hard body radius at the predicted time of closest approach, and one filter models the alternative hypothesis. The epoch-state filter developed for this method explicitly accounts for any process noise present in the system. The method appears to work well using a realistic example based on an upcoming highly-elliptical orbit formation flying mission.
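
    The decision logic in records 5 and 6 is Wald's sequential probability ratio test. The skeleton below is a generic SPRT, not the paper's filter-bank implementation: it accumulates log likelihood ratios of the collision hypothesis (miss distance inside the combined hard-body radius) against the safe-miss hypothesis until a boundary set by the chosen false-alarm and missed-detection rates is crossed. The increment values are illustrative numbers, not filter output:

```python
import math

def sprt(log_lr_increments, alpha=0.01, beta=0.01):
    """Generic Wald SPRT: alpha is the tolerated false-alarm rate and
    beta the tolerated missed-detection rate. Returns a decision as
    soon as the running log likelihood ratio crosses a boundary."""
    upper = math.log((1.0 - beta) / alpha)   # accept collision hypothesis
    lower = math.log(beta / (1.0 - alpha))   # accept safe-miss hypothesis
    total = 0.0
    for step in log_lr_increments:
        total += step
        if total >= upper:
            return "maneuver"
        if total <= lower:
            return "no maneuver"
    return "continue sampling"

# Consistent evidence eventually crosses a boundary; weak evidence
# keeps the test running:
assert sprt([1.0] * 10) == "maneuver"
assert sprt([-1.0] * 10) == "no maneuver"
assert sprt([0.2] * 3) == "continue sampling"
```

    The explicit alpha and beta inputs are what let decision makers trade false alarms against missed detections, as both abstracts emphasize.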

  6. Sequential Probability Ratio Test for Spacecraft Collision Avoidance Maneuver Decisions

    NASA Technical Reports Server (NTRS)

    Carpenter, J. Russell; Markley, F. Landis

    2013-01-01

    A document discusses sequential probability ratio tests that explicitly allow decision-makers to incorporate false alarm and missed detection risks, and are potentially less sensitive to modeling errors than a procedure that relies solely on a probability of collision threshold. Recent work on constrained Kalman filtering has suggested an approach to formulating such a test for collision avoidance maneuver decisions: a filter bank with two norm-inequality-constrained epoch-state extended Kalman filters. One filter models the null hypothesis that the miss distance is inside the combined hard body radius at the predicted time of closest approach, and one filter models the alternative hypothesis. The epoch-state filter developed for this method explicitly accounts for any process noise present in the system. The method appears to work well using a realistic example based on an upcoming, highly elliptical orbit formation flying mission.

  7. Predicting potentially toxigenic Pseudo-nitzschia blooms in the Chesapeake Bay

    USGS Publications Warehouse

    Anderson, C.R.; Sapiano, M.R.P.; Prasad, M.B.K.; Long, W.; Tango, P.J.; Brown, C.W.; Murtugudde, R.

    2010-01-01

    Harmful algal blooms are now recognized as a significant threat to the Chesapeake Bay as they can severely compromise the economic viability of important recreational and commercial fisheries in the largest estuary of the United States. This study describes the development of empirical models for the potentially domoic acid-producing Pseudo-nitzschia species complex present in the Bay, developed from a 22-year time series of cell abundance and concurrent measurements of hydrographic and chemical properties. Using a logistic Generalized Linear Model (GLM) approach, model parameters and performance were compared over a range of Pseudo-nitzschia bloom thresholds relevant to toxin production by different species. Small-threshold blooms (≥10 cells mL-1) are explained by time of year, location, and variability in surface values of phosphate, temperature, nitrate plus nitrite, and freshwater discharge. Medium- (100 cells mL-1) to large-threshold (1000 cells mL-1) blooms are further explained by salinity, silicic acid, dissolved organic carbon, and light attenuation (Secchi) depth. These predictors are similar to other models for Pseudo-nitzschia blooms on the west coast, suggesting commonalities across ecosystems. Hindcasts of bloom probabilities at a 19% bloom prediction point yield a Heidke Skill Score of ~53%, a Probability of Detection of ~75%, a False Alarm Ratio of ~52%, and a Probability of False Detection of ~9%. The implication of possible future changes in Baywide nutrient stoichiometry on Pseudo-nitzschia blooms is discussed. © 2010 Elsevier B.V.
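
    The hindcast verification scores in record 7 are standard 2x2 contingency-table statistics. The counts below are invented purely to illustrate the formulas (they only roughly echo the reported ~75% POD, ~52% FAR, ~9% POFD, and ~53% HSS):

```python
def skill_scores(hits, misses, false_alarms, correct_negs):
    """Standard forecast-verification scores from a 2x2 contingency
    table of bloom forecasts vs. observations."""
    n = hits + misses + false_alarms + correct_negs
    pod = hits / (hits + misses)                         # Probability of Detection
    far = false_alarms / (hits + false_alarms)           # False Alarm Ratio
    pofd = false_alarms / (false_alarms + correct_negs)  # Prob. of False Detection
    # Heidke Skill Score: accuracy relative to chance agreement.
    chance = ((hits + misses) * (hits + false_alarms)
              + (misses + correct_negs) * (false_alarms + correct_negs)) / n
    hss = (hits + correct_negs - chance) / (n - chance)
    return pod, far, pofd, hss

pod, far, pofd, hss = skill_scores(hits=30, misses=10,
                                   false_alarms=32, correct_negs=328)
assert pod == 0.75
assert 0.5 < far < 0.55 and pofd < 0.1 and 0.5 < hss < 0.55
```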

  8. Integrating drivers influencing the detection of plant pests carried in the international cut flower trade.

    PubMed

    Areal, F J; Touza, J; MacLeod, A; Dehnen-Schmutz, K; Perrings, C; Palmieri, M G; Spence, N J

    2008-12-01

    This paper analyses the cut flower market as an example of an invasion pathway along which species of non-indigenous plant pests can travel to reach new areas. The paper examines the probability of pest detection by assessing information on pest detection and detection effort associated with the import of cut flowers. We test the link between the probability of plant pest arrivals, as a precursor to potential invasion, and volume of traded flowers using count data regression models. The analysis is applied to the UK import of specific genera of cut flowers from Kenya between 1996 and 2004. There is a link between pest detection and the genus of cut flower imported. Hence, pest detection efforts should focus on identifying and targeting those imported plants with a high risk of carrying pest species. For most of the plants studied, efforts allocated to inspection have a significant influence on the probability of pest detection. However, by better targeting inspection efforts, it is shown that plant inspection effort could be reduced without increasing the risk of pest entry. Similarly, for most of the plants analysed, an increase in volume traded will not necessarily lead to an increase in the number of pests entering the UK. For some species, such as Carthamus and Veronica, the volume of flowers traded has a significant and positive impact on the likelihood of pest detection. We conclude that analysis at the rank of plant genus is important both to understand the effectiveness of plant pest detection efforts and consequently to manage the risk of introduction of non-indigenous species.

  9. Seasonal variation in size-dependent survival of juvenile Atlantic salmon (Salmo salar): Performance of multistate capture-mark-recapture models

    USGS Publications Warehouse

    Letcher, B.H.; Horton, G.E.

    2008-01-01

    We estimated the magnitude and shape of size-dependent survival (SDS) across multiple sampling intervals for two cohorts of stream-dwelling Atlantic salmon (Salmo salar) juveniles using multistate capture-mark-recapture (CMR) models. Simulations designed to test the effectiveness of multistate models for detecting SDS in our system indicated that error in SDS estimates was low and that both time-invariant and time-varying SDS could be detected with sample sizes of >250, average survival of >0.6, and average probability of capture of >0.6, except for cases of very strong SDS. In the field (N ≈ 750, survival 0.6-0.8 among sampling intervals, probability of capture 0.6-0.8 among sampling occasions), about one-third of the sampling intervals showed evidence of SDS, with poorer survival of larger fish during the age-2+ autumn and quadratic survival (opposite direction between cohorts) during age-1+ spring. The varying magnitude and shape of SDS among sampling intervals suggest a potential mechanism for the maintenance of the very wide observed size distributions. Estimating SDS using multistate CMR models appears complementary to established approaches, can provide estimates with low error, and can be used to detect intermittent SDS. © 2008 NRC Canada.

  10. Detecting temporal trends in species assemblages with bootstrapping procedures and hierarchical models

    USGS Publications Warehouse

    Gotelli, Nicholas J.; Dorazio, Robert M.; Ellison, Aaron M.; Grossman, Gary D.

    2010-01-01

    Quantifying patterns of temporal trends in species assemblages is an important analytical challenge in community ecology. We describe methods of analysis that can be applied to a matrix of counts of individuals that is organized by species (rows) and time-ordered sampling periods (columns). We first developed a bootstrapping procedure to test the null hypothesis of random sampling from a stationary species abundance distribution with temporally varying sampling probabilities. This procedure can be modified to account for undetected species. We next developed a hierarchical model to estimate species-specific trends in abundance while accounting for species-specific probabilities of detection. We analysed two long-term datasets on stream fishes and grassland insects to demonstrate these methods. For both assemblages, the bootstrap test indicated that temporal trends in abundance were more heterogeneous than expected under the null model. We used the hierarchical model to estimate trends in abundance and identified sets of species in each assemblage that were steadily increasing, decreasing or remaining constant in abundance over more than a decade of standardized annual surveys. Our methods of analysis are broadly applicable to other ecological datasets, and they represent an advance over most existing procedures, which do not incorporate effects of incomplete sampling and imperfect detection.
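
    The bootstrap null in record 10 holds each period's total count fixed and resamples individuals from a pooled, stationary species-abundance distribution. The toy version below uses a deliberately simple heterogeneity statistic (sum over species of the range in per-period proportions) and made-up data; the authors' actual procedure differs in its statistic and its handling of undetected species:

```python
import random

random.seed(7)

def bootstrap_trend_test(counts, n_boot=300):
    """Approximate p-value for temporal heterogeneity: how often does a
    resample under the stationary null match or exceed the observed
    heterogeneity statistic? counts is a species-by-period matrix."""
    n_sp = len(counts)
    n_per = len(counts[0])
    col_totals = [sum(row[j] for row in counts) for j in range(n_per)]
    sp_totals = [sum(row) for row in counts]
    grand = sum(sp_totals)
    weights = [t / grand for t in sp_totals]   # pooled abundance distribution

    def heterogeneity(matrix):
        # Sum over species of the range in per-period proportions.
        total = 0.0
        for row in matrix:
            props = [row[j] / col_totals[j] for j in range(n_per)]
            total += max(props) - min(props)
        return total

    observed = heterogeneity(counts)
    exceed = 0
    for _ in range(n_boot):
        boot = [[0] * n_per for _ in range(n_sp)]
        for j, tot in enumerate(col_totals):
            for sp in random.choices(range(n_sp), weights=weights, k=tot):
                boot[sp][j] += 1
        if heterogeneity(boot) >= observed:
            exceed += 1
    return exceed / n_boot

# Opposite trends in two species are wildly unlikely under the
# stationary null, so the approximate p-value is small:
trending = [[20, 10, 2], [2, 10, 20]]
assert bootstrap_trend_test(trending) < 0.05
```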

  11. Detection of Sea Ice and Open Water from RADARSAT-2 Images for Data Assimilation

    NASA Astrophysics Data System (ADS)

    Komarov, A.; Buehner, M.

    2016-12-01

    Automated detection of sea ice and open water from SAR data is very important for further assimilation into coupled ocean-sea ice-atmosphere numerical models, such as the Regional Ice-Ocean Prediction System being implemented at Environment and Climate Change Canada. Conventional classification approaches based on various learning techniques are found to be limited by the fact that they typically do not indicate the level of confidence for ice and water retrievals. Meanwhile, only ice/water retrievals with a very high level of confidence are allowed to be assimilated into the sea ice model to avoid propagating and magnifying errors into the numerical prediction system. In this study we developed a new technique for ice and water detection from dual-polarization RADARSAT-2 HH-HV images which provides the probability of ice/water at a given location. We collected many hundreds of thousands of SAR signatures over various sea ice types (i.e. new, grey, first-year, and multi-year ice) and open water from all available RADARSAT-2 images and the corresponding Canadian Ice Service Image Analysis products over the period from November 2010 to May 2016. Our analysis of the dataset revealed that ice/water separation can be effectively performed in the space of SAR-based variables independent of the incidence angle and noise floor (such as texture measures) and auxiliary Global Environmental Multiscale Model parameters (such as surface wind speed). Choice of the parameters will be specifically discussed in the presentation. An ice probability empirical model as a function of the selected predictors was built in the form of a logistic regression, based on the training dataset from 2012 to 2016. The developed ice probability model showed very good performance on the independent testing subset (year 2011). 
With the ice/water probability threshold of 0.95 reflecting a very high level of confidence, 79% of the testing ice and water samples were classified with the accuracy of 99%. These results are particularly important in light of the upcoming RADARSAT Constellation mission which will drastically increase the amount of SAR data over the Arctic region.

  12. Development of a score and probability estimate for detecting angle closure based on anterior segment optical coherence tomography.

    PubMed

    Nongpiur, Monisha E; Haaland, Benjamin A; Perera, Shamira A; Friedman, David S; He, Mingguang; Sakata, Lisandro M; Baskaran, Mani; Aung, Tin

    2014-01-01

    To develop a score along with an estimated probability of disease for detecting angle closure based on anterior segment optical coherence tomography (AS OCT) imaging. Cross-sectional study. A total of 2047 subjects 50 years of age and older were recruited from a community polyclinic in Singapore. All subjects underwent standardized ocular examination including gonioscopy and imaging by AS OCT (Carl Zeiss Meditec). Customized software (Zhongshan Angle Assessment Program) was used to measure AS OCT parameters. Complete data were available for 1368 subjects. Data from the right eyes were used for analysis. A stepwise logistic regression model with Akaike information criterion was used to generate a score that then was converted to an estimated probability of the presence of gonioscopic angle closure, defined as the inability to visualize the posterior trabecular meshwork for at least 180 degrees on nonindentation gonioscopy. Of the 1368 subjects, 295 (21.6%) had gonioscopic angle closure. The angle closure score was calculated from the shifted linear combination of the AS OCT parameters. The score can be converted to an estimated probability of having angle closure using the relationship: estimated probability = e^score/(1 + e^score), where e is the base of the natural logarithm. The score performed well in a second independent sample of 178 angle-closure subjects and 301 normal controls, with an area under the receiver operating characteristic curve of 0.94. A score derived from a single AS OCT image, coupled with an estimated probability, provides an objective platform for detection of angle closure. Copyright © 2014 Elsevier Inc. All rights reserved.
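
    The score-to-probability conversion in record 12 is the inverse-logit transform. A minimal sketch (the fitted AS OCT coefficients that produce the score are not reproduced here):

```python
import math

def estimated_probability(score):
    """Inverse-logit conversion of a logistic-regression score into an
    estimated probability of angle closure:
    p = e**score / (1 + e**score)."""
    return math.exp(score) / (1.0 + math.exp(score))

# A score of 0 maps to 50%; strongly positive scores approach 1:
assert estimated_probability(0.0) == 0.5
assert estimated_probability(4.0) > 0.98
```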

  13. Seasonal trends in eDNA detection and occupancy of bigheaded carps

    USGS Publications Warehouse

    Erickson, Richard A.; Merkes, Christopher; Jackson, Craig; Goforth, Reuben; Amberg, Jon J.

    2017-01-01

Bigheaded carps, which include silver and bighead carp, are threatening to invade the Great Lakes. These species vary seasonally in distribution and abundance due to environmental conditions such as precipitation and temperature. Monitoring this seasonal movement is important for management to control the population size and spread of the species. We examined whether environmental DNA (eDNA) approaches could detect seasonal changes in these species. To do this, we developed a novel genetic marker that was able to both detect and differentiate bighead and silver carp DNA. We used the marker, combined with a novel occupancy model, to study the occurrence of bigheaded carps at 3 sites on the Wabash River over the course of a year. We studied the Wabash River because of concerns that carps may be able to use the system to invade the Great Lakes via a now closed (ca. 2017) connection at Eagle Marsh between the Wabash River's watershed and the Great Lakes' watershed. We found seasonal trends in the probability of detection and occupancy that varied across sites. These findings demonstrate that eDNA methods can detect seasonal changes in bigheaded carp densities and suggest that the amount of eDNA present changes seasonally. The site that was farthest upstream and had the lowest carp densities exhibited the strongest seasonal trends for both detection probabilities and sample occupancy probabilities. Furthermore, other observations suggest that carps seasonally leave this site, and we were able to detect this with our eDNA approach.

  14. Spatial Distribution and Conservation of Speckled Hind and Warsaw Grouper in the Atlantic Ocean off the Southeastern U.S.

    PubMed Central

    Farmer, Nicholas A.; Karnauskas, Mandy

    2013-01-01

There is broad interest in the development of efficient marine protected areas (MPAs) to reduce bycatch and end overfishing of speckled hind (Epinephelus drummondhayi) and warsaw grouper (Hyporthodus nigritus) in the Atlantic Ocean off the southeastern U.S. We assimilated decades of data from many fishery-dependent, fishery-independent, and anecdotal sources to describe the spatial distribution of these data-limited stocks. A spatial classification model was developed to categorize depth-grids based on the distribution of speckled hind and warsaw grouper point observations and identified benthic habitats. Logistic regression analysis was used to develop a quantitative model to predict the spatial distribution of speckled hind and warsaw grouper as a function of depth, latitude, and habitat. Models, controlling for sampling gear effects, were selected based on AIC and 10-fold cross validation. The best-fitting model for warsaw grouper included latitude and depth to explain 10.8% of the variability in probability of detection, with a false prediction rate of 28–33%. The best-fitting model for speckled hind, per cross-validation, included latitude and depth to explain 36.8% of the variability in probability of detection, with a false prediction rate of 25–27%. The best-fitting speckled hind model, per AIC, also included habitat, but had false prediction rates up to 36%. Speckled hind and warsaw grouper habitats followed a shelf-edge hardbottom ridge from North Carolina to southeast Florida, with speckled hind more common to the north and warsaw grouper more common to the south. The proportion of habitat classifications and model-estimated stock contained within established and proposed MPAs was computed. Existing MPAs covered 10% of probable shelf-edge habitats for speckled hind and warsaw grouper, protecting 3–8% of speckled hind and 8% of warsaw grouper stocks. 
Proposed MPAs could add 24% more probable shelf-edge habitat, and protect an additional 14–29% of speckled hind and 20% of warsaw grouper stocks. PMID:24260126

  15. Microbial Performance of Food Safety Control and Assurance Activities in a Fresh Produce Processing Sector Measured Using a Microbial Assessment Scheme and Statistical Modeling.

    PubMed

    Njage, Patrick Murigu Kamau; Sawe, Chemutai Tonui; Onyango, Cecilia Moraa; Habib, I; Njagi, Edmund Njeru; Aerts, Marc; Molenberghs, Geert

    2017-01-01

Current approaches such as inspections, audits, and end product testing cannot detect the distribution and dynamics of microbial contamination. Despite the implementation of current food safety management systems, foodborne outbreaks linked to fresh produce continue to be reported. A microbial assessment scheme and statistical modeling were used to systematically assess the microbial performance of core control and assurance activities in five Kenyan fresh produce processing and export companies. Generalized linear mixed models and correlated random-effects joint models for multivariate clustered data followed by empirical Bayes estimates enabled the analysis of the probability of contamination across critical sampling locations (CSLs) and factories as a random effect. Salmonella spp. and Listeria monocytogenes were not detected in the final products. However, none of the processors attained the maximum safety level for environmental samples. Escherichia coli was detected in five of the six CSLs, including the final product. Among the processing-environment samples, the hand or glove swabs of personnel revealed a higher level of predicted contamination with E. coli, and 80% of the factories were E. coli positive at this CSL. End products showed higher predicted probabilities of having the lowest level of food safety compared with raw materials. The final products were E. coli positive despite the raw materials being E. coli negative for 60% of the processors. There was a higher probability of contamination with coliforms in water at the inlet than in the final rinse water. Four (80%) of the five assessed processors had poor to unacceptable counts of Enterobacteriaceae on processing surfaces. Personnel-, equipment-, and product-related hygiene measures to improve the performance of preventive and intervention measures are recommended.

  16. Real-Time Inspection Of Currency

    NASA Astrophysics Data System (ADS)

    Blazek, Henry

    1986-12-01

    An automatic inspection machine, designed and manufactured by the Perkin-Elmer Corporation for the U.S. Bureau of Engraving and Printing, is capable of real-time inspection of currency at rates compatible with the output of modern high-speed printing presses. Inspection is accomplished by comparing test notes (in 32-per-sheet format) with reference notes stored in the memory of a digital computer. This paper describes the development of algorithms for detecting defective notes, one of the key problems solved during the development of the inspection system. Results achieved on an analytical model, used for predicting probability of false alarms and probability of detecting typically defective notes, are compared to those obtained by system simulation.

  17. Small-target leak detection for a closed vessel via infrared image sequences

    NASA Astrophysics Data System (ADS)

    Zhao, Ling; Yang, Hongjiu

    2017-03-01

This paper focuses on a leak diagnosis and localization method based on infrared image sequences. It addresses two problems in leak detection: a high probability of false warnings and the negative effect of marginal information. An experimental model is established for leak diagnosis and localization from infrared image sequences. A differential background prediction, based on a kernel regression method, is presented to eliminate the negative effect of marginal information from the test vessel. A pipeline filter based on layered voting is designed to reduce the probability of false warnings at leak points. A synthesized leak diagnosis and localization algorithm based on infrared image sequences is then proposed. Experimental results demonstrate the effectiveness and potential of the developed techniques.

  18. Variables that influence BRAF mutation probability: A next-generation sequencing, non-interventional investigation of BRAFV600 mutation status in melanoma.

    PubMed

    Gaiser, Maria Rita; Skorokhod, Alexander; Gransheier, Diana; Weide, Benjamin; Koch, Winfried; Schif, Birgit; Enk, Alexander; Garbe, Claus; Bauer, Jürgen

    2017-01-01

    The incidence of melanoma, particularly in older patients, has steadily increased over the past few decades. Activating mutations of BRAF, the majority occurring in BRAFV600, are frequently detected in melanoma; however, the prognostic significance remains unclear. This study aimed to define the probability and distribution of BRAFV600 mutations, and the clinico-pathological factors that may affect BRAF mutation status, in patients with advanced melanoma using next-generation sequencing. This was a non-interventional, retrospective study of BRAF mutation testing at two German centers, in Heidelberg and Tübingen. Archival tumor samples from patients with histologically confirmed melanoma (stage IIIB, IIIC, IV) were analyzed using PCR amplification and deep sequencing. Clinical, histological, and mutation data were collected. The statistical influence of patient- and tumor-related characteristics on BRAFV600 mutation status was assessed using multiple logistic regression (MLR) and a prediction profiler. BRAFV600 mutation status was assessed in 453 samples. Mutations were detected in 57.6% of patients (n = 261), with 48.1% (n = 102) at the Heidelberg site and 66.0% (n = 159) at the Tübingen site. The decreasing influence of increasing age on mutation probability was quantified. A main effects MLR model identified age (p = 0.0001), center (p = 0.0004), and melanoma subtype (p = 0.014) as significantly influencing BRAFV600 mutation probability; ultraviolet (UV) exposure showed a statistical trend (p = 0.1419). An interaction model of age versus other variables showed that center (p<0.0001) and melanoma subtype (p = 0.0038) significantly influenced BRAF mutation probability; age had a statistically significant effect only as part of an interaction with both UV exposure (p = 0.0110) and melanoma subtype (p = 0.0134). 
This exploratory study highlights that testing center, melanoma subtype, and age in combination with UV exposure and melanoma subtype significantly influence BRAFV600 mutation probability in patients with melanoma. Further validation of this model, in terms of reproducibility and broader relevance, is required.

  19. Grizzly Bear Noninvasive Genetic Tagging Surveys: Estimating the Magnitude of Missed Detections.

    PubMed

    Fisher, Jason T; Heim, Nicole; Code, Sandra; Paczkowski, John

    2016-01-01

Sound wildlife conservation decisions require sound information, and scientists increasingly rely on remotely collected data over large spatial scales, such as noninvasive genetic tagging (NGT). Grizzly bears (Ursus arctos), for example, are difficult to study at population scales except with noninvasive data, and NGT via hair trapping informs management over much of grizzly bears' range. Considerable statistical effort has gone into estimating sources of heterogeneity, but detection error (arising when a visiting bear fails to leave a hair sample) has not been independently estimated. We used camera traps to survey grizzly bear occurrence at fixed hair traps and multi-method hierarchical occupancy models to estimate the probability that a visiting bear actually leaves a hair sample with viable DNA. We surveyed grizzly bears via hair trapping and camera trapping for 8 monthly surveys at 50 (2012) and 76 (2013) sites in the Rocky Mountains of Alberta, Canada. We used multi-method occupancy models to estimate site occupancy, probability of detection, and conditional occupancy at a hair trap. We tested the prediction that detection error in NGT studies could be induced by temporal variability within season, leading to underestimation of occupancy. NGT via hair trapping consistently underestimated grizzly bear occupancy at a site when compared to camera trapping. At best, occupancy was underestimated by 50%; at worst, by 95%. Probability of false absence was reduced through successive surveys, but this mainly accounts for error imparted by movement among repeated surveys, not necessarily missed detections by extant bears. The implications of missed detections and biased occupancy estimates for density estimation (which forms the crux of management plans) require consideration. 
We suggest hair-trap NGT studies should estimate and correct detection error using independent survey methods such as cameras, to ensure the reliability of the data upon which species management and conservation actions are based.

  20. Grizzly Bear Noninvasive Genetic Tagging Surveys: Estimating the Magnitude of Missed Detections

    PubMed Central

    Fisher, Jason T.; Heim, Nicole; Code, Sandra; Paczkowski, John

    2016-01-01

Sound wildlife conservation decisions require sound information, and scientists increasingly rely on remotely collected data over large spatial scales, such as noninvasive genetic tagging (NGT). Grizzly bears (Ursus arctos), for example, are difficult to study at population scales except with noninvasive data, and NGT via hair trapping informs management over much of grizzly bears' range. Considerable statistical effort has gone into estimating sources of heterogeneity, but detection error (arising when a visiting bear fails to leave a hair sample) has not been independently estimated. We used camera traps to survey grizzly bear occurrence at fixed hair traps and multi-method hierarchical occupancy models to estimate the probability that a visiting bear actually leaves a hair sample with viable DNA. We surveyed grizzly bears via hair trapping and camera trapping for 8 monthly surveys at 50 (2012) and 76 (2013) sites in the Rocky Mountains of Alberta, Canada. We used multi-method occupancy models to estimate site occupancy, probability of detection, and conditional occupancy at a hair trap. We tested the prediction that detection error in NGT studies could be induced by temporal variability within season, leading to underestimation of occupancy. NGT via hair trapping consistently underestimated grizzly bear occupancy at a site when compared to camera trapping. At best, occupancy was underestimated by 50%; at worst, by 95%. Probability of false absence was reduced through successive surveys, but this mainly accounts for error imparted by movement among repeated surveys, not necessarily missed detections by extant bears. The implications of missed detections and biased occupancy estimates for density estimation (which forms the crux of management plans) require consideration. 
We suggest hair-trap NGT studies should estimate and correct detection error using independent survey methods such as cameras, to ensure the reliability of the data upon which species management and conservation actions are based. PMID:27603134

  1. Post-Control Surveillance of Triatoma infestans and Triatoma sordida with Chemically-Baited Sticky Traps

    PubMed Central

    Acosta, Nidia; López, Elsa; González, Nilsa; Zerba, Eduardo; Tarelli, Guillermo; Masuh, Héctor

    2012-01-01

Background Chagas disease prevention critically depends on keeping houses free of triatomine vectors. Insecticide spraying is very effective, but re-infestation of treated dwellings is commonplace. Early detection-elimination of re-infestation foci is key to long-term control; however, all available vector-detection methods have low sensitivity. Chemically-baited traps are widely used in vector and pest control-surveillance systems; here, we test this approach for Triatoma spp. detection under field conditions in the Gran Chaco. Methodology/Principal Findings Using a repeated-sampling approach and logistic models that explicitly take detection failures into account, we simultaneously estimate vector occurrence and detection probabilities. We then model detection probabilities (conditioned on vector occurrence) as a function of trapping system to measure the effect of chemical baits. We find a positive effect of baits after three (odds ratio [OR] 5.10; 95% confidence interval [CI95] 2.59–10.04) and six months (OR 2.20, CI95 1.04–4.65). Detection probabilities are estimated at p≈0.40–0.50 for baited and at just p≈0.15 for control traps. Bait effect is very strong on T. infestans (three-month assessment: OR 12.30, CI95 4.44–34.10; p≈0.64), whereas T. sordida is captured with similar frequency in baited and unbaited traps. Conclusions/Significance Chemically-baited traps hold promise for T. infestans surveillance; the sensitivity of the system at detecting small re-infestation foci rises from 12.5% to 63.6% when traps are baited with semiochemicals. Accounting for imperfect detection, infestation is estimated at 26% (CI95 16–40) after three and 20% (CI95 11–34) after six months. In the same assessments, traps detected infestation in 14% and 8.5% of dwellings, whereas timed manual searches (the standard approach) did so in just 1.4% of dwellings, and only in the first survey. 
Since infestation rates are the main indicator used for decision-making in control programs, the approach we present may help improve T. infestans surveillance and control program management. PMID:23029583
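The sensitivity gains reported in this abstract follow the basic logic of repeated sampling: if each survey occasion detects an existing focus with probability p, more occasions (or a higher p from baiting) sharply raise the chance of at least one detection. A minimal sketch under a simplifying independence assumption; the study's hierarchical logistic models are richer than this, and the three-occasion comparison below is illustrative only:

```python
def cumulative_detection(p: float, k: int) -> float:
    """Probability of at least one detection across k independent
    survey occasions, given per-occasion detection probability p."""
    return 1.0 - (1.0 - p) ** k

# Per-occasion detection estimates from the abstract:
# roughly 0.45 for baited traps, 0.15 for unbaited controls.
for label, p in [("baited", 0.45), ("control", 0.15)]:
    print(label, cumulative_detection(p, 3))
```

With three occasions, a per-occasion p of 0.45 already pushes cumulative sensitivity above 0.8, while p of 0.15 stays below 0.4, which is the practical force of the bait effect.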

  2. A Generalized Process Model of Human Action Selection and Error and its Application to Error Prediction

    DTIC Science & Technology

    2014-07-01

Macmillan & Creelman, 2005). This is quite a high degree of discriminability and it means that when the decision model predicts a probability of...ROC analysis. Pattern Recognition Letters, 27(8), 861-874. Retrieved from Google Scholar. Macmillan, N. A., & Creelman, C. D. (2005). Detection

  3. Computational Aspects of N-Mixture Models

    PubMed Central

    Dennis, Emily B; Morgan, Byron JT; Ridout, Martin S

    2015-01-01

    The N-mixture model is widely used to estimate the abundance of a population in the presence of unknown detection probability from only a set of counts subject to spatial and temporal replication (Royle, 2004, Biometrics 60, 105–115). We explain and exploit the equivalence of N-mixture and multivariate Poisson and negative-binomial models, which provides powerful new approaches for fitting these models. We show that particularly when detection probability and the number of sampling occasions are small, infinite estimates of abundance can arise. We propose a sample covariance as a diagnostic for this event, and demonstrate its good performance in the Poisson case. Infinite estimates may be missed in practice, due to numerical optimization procedures terminating at arbitrarily large values. It is shown that the use of a bound, K, for an infinite summation in the N-mixture likelihood can result in underestimation of abundance, so that default values of K in computer packages should be avoided. Instead we propose a simple automatic way to choose K. The methods are illustrated by analysis of data on Hermann's tortoise Testudo hermanni. PMID:25314629
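The likelihood at the heart of this abstract, and the role of the truncation bound K, can be sketched for a single site. This is the standard Royle (2004) N-mixture term, not the authors' multivariate Poisson/negative-binomial reformulation, and the parameter values are illustrative:

```python
from math import comb, exp, factorial

def site_likelihood(counts, lam, p, K):
    """Single-site N-mixture likelihood: sum over latent abundance N
    (truncated at K) of Poisson(N; lam) times the product over survey
    occasions of Binomial(y_t; N, p)."""
    total = 0.0
    for N in range(max(counts), K + 1):
        poisson_term = exp(-lam) * lam ** N / factorial(N)
        binom_term = 1.0
        for y in counts:
            binom_term *= comb(N, y) * p ** y * (1.0 - p) ** (N - y)
        total += poisson_term * binom_term
    return total

# With small detection probability, likelihood mass extends to large N,
# so truncating the infinite sum at a too-small K understates the
# likelihood -- the K-sensitivity the abstract warns about.
low_K = site_likelihood([2, 1, 3], lam=8.0, p=0.1, K=10)
high_K = site_likelihood([2, 1, 3], lam=8.0, p=0.1, K=100)
```

Comparing `low_K` and `high_K` shows the truncated sum strictly growing with K, which is why default package values of K should be checked rather than trusted.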

  4. Assessment of NDE Reliability Data

    NASA Technical Reports Server (NTRS)

    Yee, B. G. W.; Chang, F. H.; Couchman, J. C.; Lemon, G. H.; Packman, P. F.

    1976-01-01

    Twenty sets of relevant Nondestructive Evaluation (NDE) reliability data have been identified, collected, compiled, and categorized. A criterion for the selection of data for statistical analysis considerations has been formulated. A model to grade the quality and validity of the data sets has been developed. Data input formats, which record the pertinent parameters of the defect/specimen and inspection procedures, have been formulated for each NDE method. A comprehensive computer program has been written to calculate the probability of flaw detection at several confidence levels by the binomial distribution. This program also selects the desired data sets for pooling and tests the statistical pooling criteria before calculating the composite detection reliability. Probability of detection curves at 95 and 50 percent confidence levels have been plotted for individual sets of relevant data as well as for several sets of merged data with common sets of NDE parameters.
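The binomial confidence-bound calculation described above can be sketched as a one-sided Clopper-Pearson lower bound found by bisection. This is a generic illustration of the statistic, not the report's actual program:

```python
from math import comb

def binom_tail(n, s, p):
    """P(X >= s) for X ~ Binomial(n, p)."""
    return sum(comb(n, k) * p ** k * (1.0 - p) ** (n - k)
               for k in range(s, n + 1))

def pod_lower_bound(n, s, conf=0.95, tol=1e-9):
    """One-sided lower confidence bound on the probability of flaw
    detection, given s detections in n trials (Clopper-Pearson bound
    located by bisection on the binomial upper tail)."""
    lo, hi = 0.0, 1.0
    while hi - lo > tol:
        mid = (lo + hi) / 2.0
        if binom_tail(n, s, mid) < 1.0 - conf:
            lo = mid  # upper tail too small: true bound is higher
        else:
            hi = mid
    return lo
```

For example, 29 detections in 29 trials yields a 95%-confidence lower bound of about 0.90, the classic "29 of 29" rule of thumb in NDE reliability work.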

  5. Network anomaly detection system with optimized DS evidence theory.

    PubMed

    Liu, Yuan; Wang, Xiaofeng; Liu, Kaiyu

    2014-01-01

Network anomaly detection has attracted increasing attention with the rapid development of computer networks. Some researchers have applied fusion methods and Dempster-Shafer (DS) evidence theory to network anomaly detection, but with low performance, and without accounting for the complicated and varied features of networks. To achieve a high detection rate, we present a novel network anomaly detection system with optimized Dempster-Shafer evidence theory (ODS) and a regression basic probability assignment (RBPA) function. In this model, we add a weight for each sensor to optimize DS evidence theory according to its previous prediction accuracy, and RBPA employs each sensor's regression ability to address complex networks. Through four kinds of experiments, we find that our novel network anomaly detection model has a better detection rate, and that the RBPA and ODS optimization methods can improve system performance significantly.
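Dempster's rule of combination is the core fusion step such a system builds on. A minimal sketch of the unweighted rule; the paper's ODS per-sensor weighting and regression-based BPA are omitted, and the sensor masses below are invented for illustration:

```python
def dempster_combine(m1, m2):
    """Combine two basic probability assignments (dicts mapping
    frozenset hypotheses to mass) with Dempster's rule, renormalizing
    away the conflict mass assigned to the empty intersection."""
    combined = {}
    conflict = 0.0
    for a, wa in m1.items():
        for b, wb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + wa * wb
            else:
                conflict += wa * wb
    k = 1.0 - conflict  # normalization constant
    return {h: v / k for h, v in combined.items()}

normal, attack = frozenset({"normal"}), frozenset({"attack"})
both = normal | attack  # ignorance: mass on the whole frame
m1 = {attack: 0.6, both: 0.4}                 # sensor 1's evidence
m2 = {attack: 0.5, normal: 0.2, both: 0.3}    # sensor 2's evidence
print(dempster_combine(m1, m2))
```

Two sensors that each lean toward "attack" reinforce one another after combination, while the conflicting mass (one saying attack, the other normal) is discarded and renormalized away.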

  6. Network Anomaly Detection System with Optimized DS Evidence Theory

    PubMed Central

    Liu, Yuan; Wang, Xiaofeng; Liu, Kaiyu

    2014-01-01

Network anomaly detection has attracted increasing attention with the rapid development of computer networks. Some researchers have applied fusion methods and Dempster-Shafer (DS) evidence theory to network anomaly detection, but with low performance, and without accounting for the complicated and varied features of networks. To achieve a high detection rate, we present a novel network anomaly detection system with optimized Dempster-Shafer evidence theory (ODS) and a regression basic probability assignment (RBPA) function. In this model, we add a weight for each sensor to optimize DS evidence theory according to its previous prediction accuracy, and RBPA employs each sensor's regression ability to address complex networks. Through four kinds of experiments, we find that our novel network anomaly detection model has a better detection rate, and that the RBPA and ODS optimization methods can improve system performance significantly. PMID:25254258

  7. Occupancy Estimation and Modeling : Inferring Patterns and Dynamics of Species Occurrence

    USGS Publications Warehouse

    MacKenzie, D.I.; Nichols, J.D.; Royle, J. Andrew; Pollock, K.H.; Bailey, L.L.; Hines, J.E.

    2006-01-01

This is the first book to examine the latest methods for analyzing presence/absence survey data. Using four classes of models (single-species, single-season; single-species, multiple-season; multiple-species, single-season; and multiple-species, multiple-season), the authors discuss the practical sampling situation, present a likelihood-based model enabling direct estimation of the occupancy-related parameters while allowing for imperfect detectability, and make recommendations for designing studies using these models. It provides authoritative insights into the latest estimation modeling; discusses multiple models that lay the groundwork for future study designs; addresses the critical issue of imperfect detectability and its effects on estimation; and explores in detail the role of probability in estimation.

  8. A novel frequent probability pattern mining algorithm based on circuit simulation method in uncertain biological networks.

    PubMed

    He, Jieyue; Wang, Chunyan; Qiu, Kunpu; Zhong, Wei

    2014-01-01

    Motif mining has long been a hot research topic in bioinformatics. Most current research on biological networks focuses on exact motif mining. However, due to inevitable experimental error and noisy data, biological network data represented with a probability model can better reflect authenticity and biological significance; it is therefore more biologically meaningful to discover probability motifs in uncertain biological networks. One of the key steps in probability motif mining is frequent pattern discovery, which is usually based on the possible world model and has a relatively high computational complexity. In this paper, we present a novel method for detecting frequent probability patterns based on circuit simulation in uncertain biological networks. First, a partition-based efficient search is applied to non-tree-like subgraph mining, where the probability of occurrence in random networks is small. Then, an algorithm for probability isomorphism based on circuit simulation is proposed. The probability isomorphism test combines analysis of circuit topology with related physical properties of voltage in order to evaluate the probability isomorphism between probability subgraphs. This circuit-simulation-based probability isomorphism avoids the traditional possible world model. Finally, based on the probability subgraph isomorphism algorithm, a two-step hierarchical clustering method is used to cluster subgraphs and discover frequent probability patterns from the clusters. Experimental results on data sets of Protein-Protein Interaction (PPI) networks and the transcriptional regulatory networks of E. coli and S. cerevisiae show that the proposed method can efficiently discover the frequent probability subgraphs. The discovered subgraphs in our study contain all probability motifs reported in experiments published in other related papers. 
The circuit-simulation-based probability graph isomorphism evaluation excludes most subgraphs that are not probability isomorphic, reducing the search space of probability isomorphic subgraphs by using the mismatch values in the node voltage set. It is an innovative way to find frequent probability patterns and can be efficiently applied to probability motif discovery problems in further studies.

  9. A novel frequent probability pattern mining algorithm based on circuit simulation method in uncertain biological networks

    PubMed Central

    2014-01-01

    Background Motif mining has long been a hot research topic in bioinformatics. Most current research on biological networks focuses on exact motif mining. However, due to inevitable experimental error and noisy data, biological network data represented with a probability model can better reflect authenticity and biological significance; it is therefore more biologically meaningful to discover probability motifs in uncertain biological networks. One of the key steps in probability motif mining is frequent pattern discovery, which is usually based on the possible world model and has a relatively high computational complexity. Methods In this paper, we present a novel method for detecting frequent probability patterns based on circuit simulation in uncertain biological networks. First, a partition-based efficient search is applied to non-tree-like subgraph mining, where the probability of occurrence in random networks is small. Then, an algorithm for probability isomorphism based on circuit simulation is proposed. The probability isomorphism test combines analysis of circuit topology with related physical properties of voltage in order to evaluate the probability isomorphism between probability subgraphs. This circuit-simulation-based probability isomorphism avoids the traditional possible world model. Finally, based on the probability subgraph isomorphism algorithm, a two-step hierarchical clustering method is used to cluster subgraphs and discover frequent probability patterns from the clusters. Results Experimental results on data sets of Protein-Protein Interaction (PPI) networks and the transcriptional regulatory networks of E. coli and S. cerevisiae show that the proposed method can efficiently discover the frequent probability subgraphs. The discovered subgraphs in our study contain all probability motifs reported in experiments published in other related papers. 
Conclusions The circuit-simulation-based probability graph isomorphism evaluation excludes most subgraphs that are not probability isomorphic, reducing the search space of probability isomorphic subgraphs by using the mismatch values in the node voltage set. It is an innovative way to find frequent probability patterns and can be efficiently applied to probability motif discovery problems in further studies. PMID:25350277

  10. Info-gap theory and robust design of surveillance for invasive species: the case study of Barrow Island.

    PubMed

    Davidovitch, Lior; Stoklosa, Richard; Majer, Jonathan; Nietrzeba, Alex; Whittle, Peter; Mengersen, Kerrie; Ben-Haim, Yakov

    2009-06-01

    Surveillance for invasive non-indigenous species (NIS) is an integral part of a quarantine system. Estimating the efficiency of a surveillance strategy relies on many uncertain parameters estimated by experts, such as the efficiency of its components in the face of the specific NIS, the ability of the NIS to inhabit different environments, and so on. Due to the importance of detecting an invasive NIS within a critical period of time, it is crucial that these uncertainties be accounted for in the design of the surveillance system. We formulate a detection model that takes into account, in addition to structured sampling for incursive NIS, incidental detection by untrained workers. We use info-gap theory for satisficing (not maximizing) the probability of detection, while at the same time maximizing the robustness to uncertainty. We demonstrate the trade-off between robustness to uncertainty, and an increase in the required probability of detection. An empirical example based on the detection of Pheidole megacephala on Barrow Island demonstrates the use of info-gap analysis to select a surveillance strategy.

  11. Assessing the sensitivity of bovine tuberculosis surveillance in Canada's cattle population, 2009-2013.

    PubMed

    El Allaki, Farouk; Harrington, Noel; Howden, Krista

    2016-11-01

    The objectives of this study were (1) to estimate the annual sensitivity of Canada's bTB surveillance system and its three system components (slaughter surveillance, export testing and disease investigation) using a scenario tree modelling approach, and (2) to identify key model parameters that influence the estimates of the surveillance system sensitivity (SSSe). To achieve these objectives, we designed stochastic scenario tree models for three surveillance system components included in the analysis. Demographic data, slaughter data, export testing data, and disease investigation data from 2009 to 2013 were extracted for input into the scenario trees. Sensitivity analysis was conducted to identify key influential parameters on SSSe estimates. The median annual SSSe estimates generated from the study were very high, ranging from 0.95 (95% probability interval [PI]: 0.88-0.98) to 0.97 (95% PI: 0.93-0.99). Median annual sensitivity estimates for the slaughter surveillance component ranged from 0.95 (95% PI: 0.88-0.98) to 0.97 (95% PI: 0.93-0.99). This shows slaughter surveillance to be the major contributor to overall surveillance system sensitivity, with a high probability of detecting M. bovis infection if present at a prevalence of 0.00028% or greater during the study period. The export testing and disease investigation components had extremely low component sensitivity estimates: the maximum median sensitivity estimates were 0.02 (95% PI: 0.014-0.023) and 0.0061 (95% PI: 0.0056-0.0066), respectively. The three most influential input parameters on the model's output (SSSe) were the probability of a granuloma being detected at slaughter inspection, the probability of a granuloma being present in older animals (≥12 months of age), and the probability of a granuloma sample being submitted to the laboratory. 
Additional studies are required to reduce the levels of uncertainty and variability associated with these three parameters influencing the surveillance system sensitivity. Crown Copyright © 2016. Published by Elsevier B.V. All rights reserved.
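
    The component-combination step of such a scenario tree analysis can be sketched in a few lines. This is a simplified illustration, not the paper's model: the beta parameters below are hypothetical stand-ins for the fitted component-sensitivity distributions, and the three components are treated as independent.

```python
import numpy as np

rng = np.random.default_rng(42)
n_draws = 100_000

# Illustrative component-sensitivity distributions (beta parameters are
# hypothetical, not the paper's fitted values).
components = {
    "slaughter": (60.0, 3.0),        # high sensitivity
    "export_testing": (2.0, 98.0),   # very low sensitivity
    "disease_invest": (1.0, 160.0),  # extremely low sensitivity
}

draws = {name: rng.beta(a, b, n_draws) for name, (a, b) in components.items()}

# Assuming (approximate) independence, the surveillance system sensitivity is
# the probability that at least one component detects infection:
sse = 1.0 - np.prod([1.0 - d for d in draws.values()], axis=0)

median = np.median(sse)
lo, hi = np.percentile(sse, [2.5, 97.5])
print(f"median SSSe = {median:.3f}, 95% PI = ({lo:.3f}, {hi:.3f})")
```

    Because the slaughter component dominates, the system-level median sits just above the slaughter component's own sensitivity, mirroring the pattern reported in the abstract.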

  12. Resolution Enhanced Magnetic Sensing System for Wide Coverage Real Time UXO Detection

    NASA Astrophysics Data System (ADS)

    Zalevsky, Zeev; Bregman, Yuri; Salomonski, Nizan; Zafrir, Hovav

    2012-09-01

    In this paper we present a new high-resolution automatic detection algorithm based on a wavelet transform and validate it in marine experiments. The proposed approach allows automatic detection at very low signal-to-noise ratios. The amount of calculation is reduced, the magnetic trend is suppressed, and the probability of detection / false alarm rate can easily be controlled. Moreover, the algorithm makes it possible to distinguish between close targets. In the algorithm we use the physical dependence of the magnetic field of a magnetic dipole to define a wavelet mother function that can later detect, at improved resolution, magnetic targets modeled as dipoles and embedded in noisy surroundings. The proposed algorithm was first applied to synthesized targets and then validated in field experiments involving a marine surface-floating system for wide coverage real time unexploded ordnance (UXO) detection and mapping. The detection probability achieved in the marine experiment was above 90%. The horizontal radial error of most of the detected targets was only 16 m, and two baseline targets immersed about 20 m from one another could easily be distinguished.
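
    The core idea of shaping the mother function after a dipole's field can be illustrated with a plain matched-filter correlation, a close cousin of the wavelet approach. The geometry, depths, and noise level below are invented for the sketch:

```python
import numpy as np

def dipole_anomaly(x, depth=5.0):
    # Along-track shape of the vertical-field anomaly of a buried vertical
    # magnetic dipole (unit moment), with the sensor passing at height `depth`.
    r2 = x**2 + depth**2
    return (2 * depth**2 - x**2) / r2**2.5

# Template ("mother function") derived from the dipole's physical shape.
x = np.linspace(-30, 30, 301)
template = dipole_anomaly(x)
template /= np.linalg.norm(template)

# Synthetic survey line: two targets 20 m apart plus sensor noise.
track = np.linspace(-100, 100, 1001)
clean = dipole_anomaly(track - 10) + dipole_anomaly(track + 10)
rng = np.random.default_rng(0)
noisy = clean + rng.normal(0, 0.002, track.size)

# Correlating with the template concentrates target energy into sharp peaks,
# so nearby targets stay separable and one threshold controls Pd vs. Pfa.
score = np.correlate(noisy, template, mode="same")
detections = track[score > 0.5 * score.max()]
```

    Both targets produce distinct above-threshold peaks near their true positions even though the raw anomalies are buried in noise.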

  13. Food provisioning and parental status in songbirds: can occupancy models be used to estimate nesting performance?

    PubMed

    Corbani, Aude Catherine; Hachey, Marie-Hélène; Desrochers, André

    2014-01-01

    Indirect methods to estimate parental status, such as the observation of parental provisioning, have been problematic due to potential biases associated with imperfect detection. We developed a method to evaluate parental status based on a novel combination of parental provisioning observations and hierarchical modeling. In the summers of 2009 to 2011, we surveyed 393 sites, each on three to four consecutive days at Forêt Montmorency, Québec, Canada. We assessed parental status of 2331 adult songbirds based on parental food provisioning. To account for imperfect detection of parental status, we applied MacKenzie et al.'s (2002) two-state hierarchical model to obtain unbiased estimates of the proportion of sites with successfully nesting birds, and the proportion of adults with offspring. To obtain an independent evaluation of detection probability, we monitored 16 active nests in 2010 and conducted parental provisioning observations away from them. The probability of detecting food provisioning was 0.31 when using nest monitoring, a value within the 0.11 to 0.38 range that was estimated by two-state models. The proportion of adults or sites with broods approached 0.90 and varied depending on date during the sampling season and year, exemplifying the role of eastern boreal forests as highly productive nesting grounds for songbirds. This study offers a simple and effective sampling design for studying avian reproductive performance that could be implemented in national surveys such as breeding bird atlases.
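
    The two-state model's likelihood is simple enough to sketch directly. The simulation below is a hypothetical stand-in for the field data (393 sites, up to four visits), and the crude grid search substitutes for the optimization that software such as the R package unmarked or PRESENCE would perform:

```python
import numpy as np

def neg_log_lik(psi, p, detections, n_visits):
    # Two-state occupancy likelihood (MacKenzie et al. 2002): a site is
    # occupied with probability psi; each of n_visits surveys of an occupied
    # site detects the species with probability p.
    y = detections
    occupied = psi * p**y * (1 - p) ** (n_visits - y)
    never_there = (1 - psi) * (y == 0)  # unoccupied sites always yield y = 0
    return -np.sum(np.log(occupied + never_there))

# Hypothetical data: 393 sites, 4 visits, true psi = 0.9, true p = 0.3.
rng = np.random.default_rng(1)
n_sites, n_visits = 393, 4
z = rng.random(n_sites) < 0.9                        # true occupancy states
detections = rng.binomial(n_visits, 0.3, n_sites) * z

# Crude grid-search MLE over (psi, p).
grid = np.linspace(0.01, 0.99, 99)
nll = np.array([[neg_log_lik(psi, p, detections, n_visits) for p in grid]
                for psi in grid])
i, j = np.unravel_index(nll.argmin(), nll.shape)
psi_hat, p_hat = grid[i], grid[j]

naive = (detections > 0).mean()  # ignores imperfect detection
```

    The naive proportion of sites with at least one detection underestimates occupancy whenever p < 1; the integrated likelihood corrects for this.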

  14. Food Provisioning and Parental Status in Songbirds: Can Occupancy Models Be Used to Estimate Nesting Performance?

    PubMed Central

    Corbani, Aude Catherine; Hachey, Marie-Hélène; Desrochers, André

    2014-01-01

    Indirect methods to estimate parental status, such as the observation of parental provisioning, have been problematic due to potential biases associated with imperfect detection. We developed a method to evaluate parental status based on a novel combination of parental provisioning observations and hierarchical modeling. In the summers of 2009 to 2011, we surveyed 393 sites, each on three to four consecutive days at Forêt Montmorency, Québec, Canada. We assessed parental status of 2331 adult songbirds based on parental food provisioning. To account for imperfect detection of parental status, we applied MacKenzie et al.'s (2002) two-state hierarchical model to obtain unbiased estimates of the proportion of sites with successfully nesting birds, and the proportion of adults with offspring. To obtain an independent evaluation of detection probability, we monitored 16 active nests in 2010 and conducted parental provisioning observations away from them. The probability of detecting food provisioning was 0.31 when using nest monitoring, a value within the 0.11 to 0.38 range that was estimated by two-state models. The proportion of adults or sites with broods approached 0.90 and varied depending on date during the sampling season and year, exemplifying the role of eastern boreal forests as highly productive nesting grounds for songbirds. This study offers a simple and effective sampling design for studying avian reproductive performance that could be implemented in national surveys such as breeding bird atlases. PMID:24999969

  15. Impact of signal scattering and parametric uncertainties on receiver operating characteristics

    NASA Astrophysics Data System (ADS)

    Wilson, D. Keith; Breton, Daniel J.; Hart, Carl R.; Pettit, Chris L.

    2017-05-01

    The receiver operating characteristic (ROC) curve, which is a plot of the probability of detection as a function of the probability of false alarm, plays a key role in the classical analysis of detector performance. However, meaningful characterization of the ROC curve is challenging when practically important complications such as variations in source emissions, environmental impacts on the signal propagation, uncertainties in the sensor response, and multiple sources of interference are considered. In this paper, a relatively simple but realistic model for scattered signals is employed to explore how parametric uncertainties impact the ROC curve. In particular, we show that parametric uncertainties in the mean signal and noise power substantially raise the tails of the distributions; since receiver operation with a very low probability of false alarm and a high probability of detection is normally desired, these tails lead to severely degraded performance. Because full a priori knowledge of such parametric uncertainties is rarely available in practice, analyses must typically be based on a finite sample of environmental states, which only partially characterizes the range of parameter variations. We show how this effect can lead to misleading assessments of system performance. For the cases considered, approximately 64 or more statistically independent samples of the uncertain parameters are needed to accurately predict the probabilities of detection and false alarm. A connection is also described between the selection of suitable distributions for the uncertain parameters and Bayesian adaptive methods for inferring the parameters.
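
    The tail-raising effect of parametric uncertainty is easy to reproduce in a toy Monte Carlo experiment. This is a schematic stand-in for the paper's scattering model; the amplitudes and the lognormal spread are invented:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Detection statistic: envelope of a Gaussian channel output.
noise_only = np.abs(rng.normal(0.0, 1.0, n))

# Case 1: mean signal amplitude known exactly.
fixed = np.abs(rng.normal(5.0, 1.0, n))

# Case 2: mean signal amplitude uncertain (lognormal spread). The lower tail
# of the signal distribution is raised, which hurts most at low Pfa.
amp = 5.0 * rng.lognormal(0.0, 0.5, n)
uncertain = np.abs(rng.normal(amp, 1.0))

def pd_at_pfa(sig, noi, pfa):
    # One point on the ROC curve: set the threshold from the noise-only
    # distribution, then measure the detection probability.
    thresh = np.quantile(noi, 1.0 - pfa)
    return float(np.mean(sig > thresh))

pd_fixed = pd_at_pfa(fixed, noise_only, 1e-3)
pd_uncertain = pd_at_pfa(uncertain, noise_only, 1e-3)
```

    At a false alarm probability of 1e-3 the uncertain-amplitude case loses noticeable detection probability relative to the fixed-amplitude case, even though both share the same median signal strength.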

  16. Population variability complicates the accurate detection of climate change responses.

    PubMed

    McCain, Christy; Szewczyk, Tim; Bracy Knight, Kevin

    2016-06-01

    The rush to assess species' responses to anthropogenic climate change (CC) has underestimated the importance of interannual population variability (PV). Researchers assume sampling rigor alone will lead to an accurate detection of response regardless of the underlying population fluctuations of the species under consideration. Using population simulations across a realistic, empirically based gradient in PV, we show that moderate to high PV can lead to opposite and biased conclusions about CC responses. Between pre- and post-CC sampling bouts of modeled populations as in resurvey studies, there is: (i) A 50% probability of erroneously detecting the opposite trend in population abundance change and nearly zero probability of detecting no change. (ii) Across multiple years of sampling, it is nearly impossible to accurately detect any directional shift in population sizes with even moderate PV. (iii) There is up to 50% probability of detecting a population extirpation when the species is present, but in very low natural abundances. (iv) Under scenarios of moderate to high PV across a species' range or at the range edges, there is a bias toward erroneous detection of range shifts or contractions. Essentially, the frequency and magnitude of population peaks and troughs greatly impact the accuracy of our CC response measurements. Species with moderate to high PV (many small vertebrates, invertebrates, and annual plants) may be inaccurate 'canaries in the coal mine' for CC without pertinent demographic analyses and additional repeat sampling. Variation in PV may explain some idiosyncrasies in CC responses detected so far and urgently needs more careful consideration in design and analysis of CC responses. © 2016 John Wiley & Sons Ltd.
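
    The core of the argument, that interannual variability alone generates apparent trends between two sampling bouts, can be reproduced with a minimal simulation. The 60% coefficient of variation and the ±10% "stable" band are illustrative choices, not the paper's empirical gradient:

```python
import numpy as np

rng = np.random.default_rng(7)
n_sims = 100_000

# A population with NO true trend, fluctuating lognormally between years.
mean_n, cv = 100.0, 0.6            # moderate-to-high population variability
sigma = np.sqrt(np.log(1.0 + cv**2))
mu = np.log(mean_n) - sigma**2 / 2.0

pre = rng.lognormal(mu, sigma, n_sims)    # historical survey bout
post = rng.lognormal(mu, sigma, n_sims)   # resurvey bout

change = (post - pre) / pre
p_decline = float(np.mean(change < -0.10))   # apparent decline of >= 10%
p_increase = float(np.mean(change > 0.10))   # apparent increase of >= 10%
p_stable = 1.0 - p_decline - p_increase      # correctly inferred "no change"
```

    Even though nothing changed, a single pre/post comparison almost always reports a directional shift in one direction or the other, and the probability of correctly inferring "no change" is small.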

  17. Two-IMU FDI performance of the sequential probability ratio test during shuttle entry

    NASA Technical Reports Server (NTRS)

    Rich, T. M.

    1976-01-01

    Performance data for the sequential probability ratio test (SPRT) during shuttle entry are presented. Current modeling constants and failure thresholds are included for the full mission 3B from entry through landing trajectory. Minimum 100 percent detection/isolation failure levels and a discussion of the effects of failure direction are presented. Finally, a limited comparison of failures introduced at trajectory initiation shows that the SPRT algorithm performs slightly worse than the data tracking test.
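
    For readers unfamiliar with the SPRT, the decision rule has a compact form. Below is a minimal sketch for Gaussian measurement residuals; the thresholds and noise model are generic, not the shuttle IMU FDI parameterization:

```python
import math
import random

def sprt(samples, mu0, mu1, sigma, alpha=0.01, beta=0.01):
    # Wald's sequential probability ratio test: H0 residual mean mu0 (healthy)
    # vs H1 residual mean mu1 (failed), for Gaussian noise of std sigma.
    lower = math.log(beta / (1.0 - alpha))   # accept H0 at or below this
    upper = math.log((1.0 - beta) / alpha)   # accept H1 at or above this
    llr = 0.0
    for k, x in enumerate(samples, 1):
        # Log-likelihood-ratio increment for one observation.
        llr += ((x - mu0) ** 2 - (x - mu1) ** 2) / (2.0 * sigma**2)
        if llr >= upper:
            return "failed", k
        if llr <= lower:
            return "healthy", k
    return "undecided", len(samples)

# A sensor whose residuals drift to a 1-sigma bias is flagged quickly:
random.seed(3)
biased = [random.gauss(1.0, 1.0) for _ in range(200)]
decision, n_used = sprt(biased, mu0=0.0, mu1=1.0, sigma=1.0)
```

    The test stops as soon as the accumulated evidence crosses either threshold, which is why it typically needs far fewer samples than a fixed-size test at the same error rates.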

  18. Research of the orbital evolution of asteroid 2012 DA14 (in Russian)

    NASA Astrophysics Data System (ADS)

    Zausaev, A. F.; Denisov, S. S.; Derevyanka, A. E.

    The orbital evolution of asteroid 2012 DA14 is studied on the time interval from 1800 to 2206; close approaches of the object to Earth and the Moon are detected, and the probability of impact with Earth is calculated. The mathematical model used is consistent with DE405, the integration was performed using a modified Everhart method of 27th order, and the probability of collision is calculated using the Monte Carlo method.
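
    The Monte Carlo step can be illustrated in one dimension: perturb the nominal encounter by its uncertainty, propagate each clone, and count the fraction that intersect Earth. The sketch below skips the orbit propagation entirely and samples a hypothetical miss-distance distribution; the numbers are invented, not the asteroid's actual covariance:

```python
import numpy as np

rng = np.random.default_rng(2012)
n_clones = 200_000

# Hypothetical close-approach geometry, in units of Earth radii: the nominal
# miss distance and its 1-sigma uncertainty stand in for the result of
# propagating each orbital clone with the Everhart integrator.
nominal_miss, sigma_miss = 5.4, 1.5

miss = rng.normal(nominal_miss, sigma_miss, n_clones)
p_impact = float(np.mean(miss < 1.0))   # clone passes inside one Earth radius
```

    The estimated impact probability converges as the number of clones grows, at the usual 1/sqrt(N) Monte Carlo rate.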

  19. Detection of influenza-like illness aberrations by directly monitoring Pearson residuals of fitted negative binomial regression models.

    PubMed

    Chan, Ta-Chien; Teng, Yung-Chu; Hwang, Jing-Shiang

    2015-02-21

    Emerging novel influenza outbreaks have increasingly been a threat to the public and a major concern of public health departments. Real-time data in seamless surveillance systems such as health insurance claims data for influenza-like illnesses (ILI) are ready for analysis, making it highly desirable to develop practical techniques to analyze such readymade data for outbreak detection so that the public can receive timely influenza epidemic warnings. This study proposes a simple and effective approach to analyze area-based health insurance claims data including outpatient and emergency department (ED) visits for early detection of any aberrations of ILI. The health insurance claims data during 2004-2009 from a national health insurance research database were used for developing early detection methods. The proposed approach fitted the daily new ILI visits and monitored the Pearson residuals directly for aberration detection. First, negative binomial regression was used for both outpatient and ED visits to adjust for potentially influential factors such as holidays, weekends, seasons, temporal dependence and temperature. Second, if the Pearson residuals exceeded 1.96, aberration signals were issued. The empirical validation of the model was done in 2008 and 2009. In addition, we designed a simulation study to compare the time of outbreak detection, non-detection probability and false alarm rate between the proposed method and modified CUSUM. The model successfully detected the aberrations of 2009 pandemic (H1N1) influenza virus in northern, central and southern Taiwan. The proposed approach was more sensitive in identifying aberrations in ED visits than those in outpatient visits. Simulation studies demonstrated that the proposed approach could detect the aberrations earlier, and with lower non-detection probability and mean false alarm rate in detecting aberrations compared to modified CUSUM methods. 
The proposed simple approach was able to filter out temporal trends, adjust for temperature, and issue warning signals for the first wave of the influenza epidemic in a timely and accurate manner.
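
    The monitoring rule itself reduces to a few lines once the regression is fitted. The sketch below skips the fitting step: the flat expected count, the dispersion, and the injected outbreak are invented placeholders for the fitted daily ILI expectations:

```python
import numpy as np

def nb_pearson_residuals(y, mu, alpha):
    # Pearson residuals under an NB2 negative binomial model,
    # where Var(Y) = mu + alpha * mu**2.
    return (y - mu) / np.sqrt(mu + alpha * mu**2)

# Placeholder fitted expectations for 30 days of ILI visits.
mu_hat = np.full(30, 120.0)
alpha_hat = 0.02

# Baseline counts drawn from the fitted model, plus an injected outbreak.
rng = np.random.default_rng(5)
r = 1.0 / alpha_hat                               # NB "size" parameter
counts = rng.negative_binomial(r, r / (r + mu_hat))
counts[-3:] += 100                                # outbreak on the last 3 days

resid = nb_pearson_residuals(counts, mu_hat, alpha_hat)
alarms = np.where(resid > 1.96)[0]   # issue aberration signals
```

    Monitoring the residuals directly, rather than the raw counts, is what lets the rule absorb holiday, weekend, seasonal, and temperature effects through the fitted means.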

  20. A Probabilistic Approach to Network Event Formation from Pre-Processed Waveform Data

    NASA Astrophysics Data System (ADS)

    Kohl, B. C.; Given, J.

    2017-12-01

    The current state of the art for seismic event detection still largely depends on signal detection at individual sensor stations, including picking accurate arrivals times and correctly identifying phases, and relying on fusion algorithms to associate individual signal detections to form event hypotheses. But increasing computational capability has enabled progress toward the objective of fully utilizing body-wave recordings in an integrated manner to detect events without the necessity of previously recorded ground truth events. In 2011-2012 Leidos (then SAIC) operated a seismic network to monitor activity associated with geothermal field operations in western Nevada. We developed a new association approach for detecting and quantifying events by probabilistically combining pre-processed waveform data to deal with noisy data and clutter at local distance ranges. The ProbDet algorithm maps continuous waveform data into continuous conditional probability traces using a source model (e.g. Brune earthquake or Mueller-Murphy explosion) to map frequency content and an attenuation model to map amplitudes. Event detection and classification is accomplished by combining the conditional probabilities from the entire network using a Bayesian formulation. This approach was successful in producing a high-Pd, low-Pfa automated bulletin for a local network and preliminary tests with regional and teleseismic data show that it has promise for global seismic and nuclear monitoring applications. 
The approach highlights several features that we believe are essential to achieving low-threshold automated event detection: (1) it minimizes the use of individual seismic phase detections (in traditional techniques, errors in signal detection, timing, feature measurement, and initial phase identification compound and propagate into errors in event formation); (2) it has a formalized framework that utilizes information from non-detecting stations; (3) it has a formalized framework that utilizes source information, in particular the spectral characteristics of events of interest; (4) it is entirely model-based, i.e., it does not rely on a priori information, which is particularly important for nuclear monitoring; and (5) it does not rely on individualized signal detection thresholds: it is the network solution that matters.
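
    The Bayesian combination step can be sketched as naive-Bayes fusion of per-station probability traces. This is a generic illustration of the formulation, not the ProbDet implementation; the station probabilities and the prior are invented:

```python
import numpy as np

def fuse_station_probs(p_stations, prior=1e-3):
    # Fuse per-station conditional probabilities P(event | station data) into
    # a network posterior, assuming stations are conditionally independent
    # given the event state. Each station posterior is first converted back to
    # a likelihood ratio against the shared prior, then combined in log space
    # for numerical stability.
    p = np.asarray(p_stations, dtype=float)

    def logit(q):
        return np.log(q / (1.0 - q))

    log_odds = logit(prior) + np.sum(logit(p) - logit(prior), axis=0)
    return 1.0 / (1.0 + np.exp(-log_odds))

# Three stations each see a weak elevation (5% each): jointly decisive.
fused = fuse_station_probs([[0.05], [0.05], [0.05]])
# A non-detecting station (probability below the prior) pulls the posterior
# back down, which is how silent stations contribute negative evidence.
damped = fuse_station_probs([[0.05], [0.05], [0.05], [1e-4]])
```

    This is the sense in which a network solution can detect events that no single station would declare on its own.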

  1. First Detected Arrival of a Quantum Walker on an Infinite Line

    NASA Astrophysics Data System (ADS)

    Thiel, Felix; Barkai, Eli; Kessler, David A.

    2018-01-01

    The first detection of a quantum particle on a graph is shown to depend sensitively on the distance ξ between the detector and the initial location of the particle, and on the sampling time τ. Here, we use the recently introduced quantum renewal equation to investigate the statistics of first detection on an infinite line, using a tight-binding lattice Hamiltonian with nearest-neighbor hops. Universal features of the first detection probability are uncovered and simple limiting cases are analyzed. These include the large-ξ limit, the small-τ limit, and the power-law decay of the detection probability with the attempt number, on which quantum oscillations are superimposed. For large ξ the first detection probability assumes a scaling form, and when the sampling time equals the inverse of the energy bandwidth, nonanalytical behaviors arise, accompanied by a transition in the statistics. The maximum total detection probability is found to occur for τ close to this transition point. When the initial location of the particle is far from the detection node we find that the total detection probability attains a finite value that is distance independent.

  2. VizieR Online Data Catalog: Bayesian method for detecting stellar flares (Pitkin+, 2014)

    NASA Astrophysics Data System (ADS)

    Pitkin, M.; Williams, D.; Fletcher, L.; Grant, S. D. T.

    2015-05-01

    We present a Bayesian-odds-ratio-based algorithm for detecting stellar flares in light-curve data. We assume flares are described by a model in which there is a rapid rise with a half-Gaussian profile, followed by an exponential decay. Our signal model also contains a polynomial background model required to fit underlying light-curve variations in the data, which could otherwise partially mimic a flare. We characterize the false alarm probability and efficiency of this method under the assumption that any unmodelled noise in the data is Gaussian, and compare it with a simpler thresholding method based on that used in Walkowicz et al. We find our method has a significant increase in detection efficiency for low signal-to-noise ratio (S/N) flares. For a conservative false alarm probability our method can detect 95 per cent of flares with S/N less than 20, as compared to S/N of 25 for the simpler method. We also test how well the assumption of Gaussian noise holds by applying the method to a selection of 'quiet' Kepler stars. As an example we have applied our method to a selection of stars in Kepler Quarter 1 data. The method finds 687 flaring stars with a total of 1873 flares after vetoes have been applied. For these flares we have made preliminary characterizations of their durations and S/N. (1 data file).

  3. A Bayesian method for detecting stellar flares

    NASA Astrophysics Data System (ADS)

    Pitkin, M.; Williams, D.; Fletcher, L.; Grant, S. D. T.

    2014-12-01

    We present a Bayesian-odds-ratio-based algorithm for detecting stellar flares in light-curve data. We assume flares are described by a model in which there is a rapid rise with a half-Gaussian profile, followed by an exponential decay. Our signal model also contains a polynomial background model required to fit underlying light-curve variations in the data, which could otherwise partially mimic a flare. We characterize the false alarm probability and efficiency of this method under the assumption that any unmodelled noise in the data is Gaussian, and compare it with a simpler thresholding method based on that used in Walkowicz et al. We find our method has a significant increase in detection efficiency for low signal-to-noise ratio (S/N) flares. For a conservative false alarm probability our method can detect 95 per cent of flares with S/N less than 20, as compared to S/N of 25 for the simpler method. We also test how well the assumption of Gaussian noise holds by applying the method to a selection of `quiet' Kepler stars. As an example we have applied our method to a selection of stars in Kepler Quarter 1 data. The method finds 687 flaring stars with a total of 1873 flares after vetoes have been applied. For these flares we have made preliminary characterizations of their durations and S/N.
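
    The assumed flare shape is easy to write down explicitly. The sketch below builds that profile on top of a polynomial background of the kind the signal model fits; all coefficients here are made up for illustration:

```python
import numpy as np

def flare_model(t, t0, amp, tau_rise, tau_decay):
    # Flare profile assumed by the signal model: a rapid half-Gaussian rise
    # to a peak at t0, followed by an exponential decay.
    t = np.asarray(t, dtype=float)
    rise = amp * np.exp(-0.5 * ((t - t0) / tau_rise) ** 2)
    decay = amp * np.exp(-(t - t0) / tau_decay)
    return np.where(t <= t0, rise, decay)

# Toy light curve: second-order polynomial background (absorbing underlying
# stellar variability) plus one injected flare.
t = np.linspace(0.0, 10.0, 1001)
background = 1.0 + 0.02 * t - 0.001 * t**2
flux = background + flare_model(t, t0=4.0, amp=0.3, tau_rise=0.05, tau_decay=0.8)
```

    Both branches agree at t0 (where each equals amp), so the profile is continuous, and choosing tau_rise much smaller than tau_decay gives the characteristic fast-rise, slow-decay asymmetry.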

  4. Variables associated with detection probability, detection latency, and behavioral responses of Golden-winged Warblers (Vermivora chrysoptera)

    USGS Publications Warehouse

    Aldinger, Kyle R.; Wood, Petra B.

    2015-01-01

    Detection probability during point counts and its associated variables are important considerations for bird population monitoring and have implications for conservation planning by influencing population estimates. During 2008–2009, we evaluated variables hypothesized to be associated with detection probability, detection latency, and behavioral responses of male Golden-winged Warblers in pastures in the Monongahela National Forest, West Virginia, USA. This is the first study of male Golden-winged Warbler detection probability, detection latency, or behavioral response based on point-count sampling with known territory locations and identities for all males. During 3-min passive point counts, detection probability decreased as distance to a male's territory and time since sunrise increased. During 3-min point counts with playback, detection probability decreased as distance to a male's territory increased, but remained constant as time since sunrise increased. Detection probability was greater when point counts included type 2 compared with type 1 song playback, particularly during the first 2 min of type 2 song playback. Golden-winged Warblers primarily use type 1 songs (often zee bee bee bee with a higher-pitched first note) in intersexual contexts and type 2 songs (strident, rapid stutter ending with a lower-pitched buzzy note) in intrasexual contexts. Distance to a male's territory, ordinal date, and song playback type were associated with the type of behavioral response to song playback. Overall, ~2 min of type 2 song playback may increase the efficacy of point counts for monitoring populations of Golden-winged Warblers by increasing the conspicuousness of males for visual identification and offsetting the consequences of surveying later in the morning. Because playback may interfere with the ability to detect distant males, it is important to follow playback with a period of passive listening. 
Our results indicate that even in relatively open pasture vegetation, detection probability of male Golden-winged Warblers is imperfect and highly variable.

  5. Optimal Deployment of Unmanned Aerial Vehicles for Border Surveillance

    DTIC Science & Technology

    2014-06-01

    Border surveillance is an important concern for most nations wanting to detect and intercept intruders that are trying to trespass a border. These intruders can include terrorists, drug traffickers, smugglers, and illegal immigrants. The models developed here determine unmanned aerial vehicle routes, altitudes, and speeds in order to maximize the probability of detecting intruders trying to trespass a given border.

  6. A model describing vestibular detection of body sway motion.

    NASA Technical Reports Server (NTRS)

    Nashner, L. M.

    1971-01-01

    An experimental technique was developed which facilitated the formulation of a quantitative model describing vestibular detection of body sway motion in a postural response mode. All cues, except vestibular ones, which gave a subject an indication that he was beginning to sway, were eliminated using a specially designed two-degree-of-freedom platform; body sway was then induced and resulting compensatory responses at the ankle joints measured. Hybrid simulation compared the experimental results with models of the semicircular canals and utricular otolith receptors. Dynamic characteristics of the resulting canal model compared closely with characteristics of models which describe eye movement and subjective responses to body rotational motions. The average threshold level, in the postural response mode, however, was considerably lower. Analysis indicated that the otoliths probably play no role in the initial detection of body sway motion.

  7. Rapid Target Detection in High Resolution Remote Sensing Images Using Yolo Model

    NASA Astrophysics Data System (ADS)

    Wu, Z.; Chen, X.; Gao, Y.; Li, Y.

    2018-04-01

    Object detection in high resolution remote sensing images is a fundamental and challenging problem in the field of remote sensing imagery analysis for civil and military applications, because complex neighboring environments can cause recognition algorithms to mistake irrelevant ground objects for target objects. Deep convolutional neural networks (DCNNs) are central to current object detection research owing to their powerful feature extraction and have achieved state-of-the-art results in computer vision. The common pipeline of DCNN-based object detection consists of region proposal, CNN feature extraction, region classification, and post-processing. The YOLO model instead frames object detection as a regression problem, using a single CNN to predict bounding boxes and class probabilities in an end-to-end way, which makes prediction faster. In this paper, a YOLO-based model is used for object detection in high resolution remote sensing images. Experiments on the NWPU VHR-10 dataset and our airport/airplane dataset obtained from Google Earth show that, compared with the common pipeline, the proposed model speeds up the detection process while achieving good accuracy.

  8. Lake Sturgeon, Lake Whitefish, and Walleye egg deposition patterns with response to fish spawning substrate restoration in the St. Clair–Detroit River system

    USGS Publications Warehouse

    Fischer, Jason L.; Pritt, Jeremy J.; Roseman, Edward; Prichard, Carson G.; Craig, Jaquelyn M.; Kennedy, Gregory W.; Manny, Bruce A.

    2018-01-01

    Egg deposition and use of restored spawning substrates by lithophilic fishes (e.g., Lake Sturgeon Acipenser fulvescens, Lake Whitefish Coregonus clupeaformis, and Walleye Sander vitreus) were assessed throughout the St. Clair–Detroit River system from 2005 to 2016. Bayesian models were used to quantify egg abundance and presence/absence relative to site-specific variables (e.g., depth, velocity, and artificial spawning reef presence) and temperature to evaluate fish use of restored artificial spawning reefs and assess patterns in egg deposition. Lake Whitefish and Walleye egg abundance, probability of detection, and probability of occupancy were assessed with detection-adjusted methods; Lake Sturgeon egg abundance and probability of occurrence were assessed using delta-lognormal methods. The models indicated that the probability of Walleye eggs occupying a site increased with water velocity and that the rate of increase decreased with depth, whereas Lake Whitefish egg occupancy was not correlated with any of the attributes considered. Egg deposition by Lake Whitefish and Walleyes was greater at sites with high water velocities and was lower over artificial spawning reefs. Lake Sturgeon eggs were collected least frequently but were more likely to be collected over artificial spawning reefs and in greater abundances than elsewhere. Detection-adjusted egg abundances were not greater over artificial spawning reefs, indicating that these projects may not directly benefit spawning Walleyes and Lake Whitefish. However, 98% of the Lake Sturgeon eggs observed were collected over artificial spawning reefs, supporting the hypothesis that the reefs provided spawning sites for Lake Sturgeon and could mitigate historic losses of Lake Sturgeon spawning habitat.

  9. Use of portable antennas to estimate abundance of PIT-tagged fish in small streams: Factors affecting detection probability

    USGS Publications Warehouse

    O'Donnell, Matthew J.; Horton, Gregg E.; Letcher, Benjamin H.

    2010-01-01

    Portable passive integrated transponder (PIT) tag antenna systems can be valuable in providing reliable estimates of the abundance of tagged Atlantic salmon Salmo salar in small streams under a wide range of conditions. We developed and employed PIT tag antenna wand techniques in two controlled experiments and an additional case study to examine the factors that influenced our ability to estimate population size. We used Pollock's robust-design capture–mark–recapture model to obtain estimates of the probability of first detection (p), the probability of redetection (c), and abundance (N) in the two controlled experiments. First, we conducted an experiment in which tags were hidden in fixed locations. Although p and c varied among the three observers and among the three passes that each observer conducted, the estimates of N were identical to the true values and did not vary among observers. In the second experiment using free-swimming tagged fish, p and c varied among passes and time of day. Additionally, estimates of N varied between day and night and among age-classes but were within 10% of the true population size. In the case study, we used the Cormack–Jolly–Seber model to examine the variation in p, and we compared counts of tagged fish found with the antenna wand with counts collected via electrofishing. In that study, we found that although p varied for age-classes, sample dates, and time of day, antenna and electrofishing estimates of N were similar, indicating that population size can be reliably estimated via PIT tag antenna wands. However, factors such as the observer, time of day, age of fish, and stream discharge can influence the initial and subsequent detection probabilities.

  10. System and method for automated object detection in an image

    DOEpatents

    Kenyon, Garrett T.; Brumby, Steven P.; George, John S.; Paiton, Dylan M.; Schultz, Peter F.

    2015-10-06

    A contour/shape detection model may use relatively simple and efficient kernels to detect target edges in an object within an image or video. A co-occurrence probability may be calculated for two or more edge features in an image or video using an object definition. Edge features may be differentiated between in response to measured contextual support, and prominent edge features may be extracted based on the measured contextual support. The object may then be identified based on the extracted prominent edge features.

  11. Quantum walks: The first detected passage time problem

    NASA Astrophysics Data System (ADS)

    Friedman, H.; Kessler, D. A.; Barkai, E.

    2017-03-01

    Even after decades of research, the problem of first passage time statistics for quantum dynamics remains a challenging topic of fundamental and practical importance. Using a projective measurement approach, with a sampling time τ, we obtain the statistics of first detection events for quantum dynamics on a lattice, with the detector located at the origin. A quantum renewal equation for a first detection wave function, in terms of which the first detection probability can be calculated, is derived. This formula gives the relation between first detection statistics and the solution of the corresponding Schrödinger equation in the absence of measurement. We illustrate our results with tight-binding quantum walk models. We examine a closed system, i.e., a ring, and reveal the intricate influence of the sampling time τ on the statistics of detection, discussing the quantum Zeno effect, half dark states, revivals, and optimal detection. The initial condition modifies the statistics of a quantum walk on a finite ring in surprising ways. In some cases, the average detection time is independent of the sampling time while in others the average exhibits multiple divergences as the sampling time is modified. For an unbounded one-dimensional quantum walk, the probability of first detection decays like (time)^(-3) with superimposed oscillations, with exceptional behavior when the sampling period τ times the tunneling rate γ is a multiple of π/2. The amplitude of the power-law decay is suppressed as τ → 0 due to the Zeno effect. Our work, an extended version of our previously published paper, predicts rich physical behaviors compared with classical Brownian motion, for which the first passage probability density decays monotonically like (time)^(-3/2), as elucidated by Schrödinger in 1915.

  12. Probability of detection of nests and implications for survey design

    USGS Publications Warehouse

    Smith, P.A.; Bart, J.; Lanctot, Richard B.; McCaffery, B.J.; Brown, S.

    2009-01-01

    Surveys based on double sampling include a correction for the probability of detection by assuming complete enumeration of birds in an intensively surveyed subsample of plots. To evaluate this assumption, we calculated the probability of detecting active shorebird nests by using information from observers who searched the same plots independently. Our results demonstrate that this probability varies substantially by species and stage of the nesting cycle but less by site or density of nests. Among the species we studied, the estimated single-visit probability of nest detection during the incubation period varied from 0.21 for the White-rumped Sandpiper (Calidris fuscicollis), the most difficult species to detect, to 0.64 for the Western Sandpiper (Calidris mauri), the most easily detected species, with a mean across species of 0.46. We used these detection probabilities to predict the fraction of persistent nests found over repeated nest searches. For a species with the mean value for detectability, the detection rate exceeded 0.85 after four visits. This level of nest detection was exceeded in only three visits for the Western Sandpiper, but six to nine visits were required for the White-rumped Sandpiper, depending on the type of survey employed. Our results suggest that the double-sampling method's requirement of nearly complete counts of birds in the intensively surveyed plots is likely to be met for birds with nests that survive over several visits of nest searching. Individuals with nests that fail quickly or individuals that do not breed can be detected with high probability only if territorial behavior is used to identify likely nesting pairs. © The Cooper Ornithological Society, 2009.
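
    The visit-number recommendations follow directly from the geometric accumulation of detection chances. Reproducing them with the single-visit probabilities reported in the abstract:

```python
def cumulative_detection(p, n_visits):
    # Probability that a persistent nest is found at least once over
    # n_visits independent searches, each with detection probability p.
    return 1.0 - (1.0 - p) ** n_visits

# Single-visit detection probabilities reported in the study:
mean_species = cumulative_detection(0.46, 4)       # ~0.915, exceeds 0.85
western_sandpiper = cumulative_detection(0.64, 3)  # ~0.953, 3 visits suffice
white_rumped = cumulative_detection(0.21, 9)       # ~0.880, needs up to 9 visits
```

    For the hardest-to-detect species, five visits still fall short of the 0.85 target, which is why the recommended effort stretches to as many as nine searches.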

  13. A new method for detecting small and dim targets in starry background

    NASA Astrophysics Data System (ADS)

    Yao, Rui; Zhang, Yanning; Jiang, Lei

    2011-08-01

    Small visible optical space target detection is one of the key issues in research on long-range early warning and space debris surveillance. The SNR (signal-to-noise ratio) of the target is very low because of noise inherent to the imaging device; random noise and background movement further increase the difficulty of detection. In order to detect small visible optical space targets effectively and rapidly, we propose a novel detection method based on statistical theory. Firstly, we construct a reasonable statistical model of the visible optical space image. Secondly, we extract SIFT (Scale-Invariant Feature Transform) features from the image frames, calculate the transform relationship between frames, and use that relationship to compensate for the motion of the whole visual field. Thirdly, the influence of stars is removed using an interframe difference method, and a segmentation threshold separating candidate targets from noise is found using the OTSU method. Finally, we calculate a statistical quantity for every pixel position in the image to judge whether the target is present there. Theoretical analysis shows the relationship between false alarm probability and detection probability at different SNRs. Experimental results show that this method can detect targets efficiently, even targets passing in front of stars.
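
The OTSU step named above can be sketched in a few lines: the method exhaustively scans grey levels and keeps the threshold that maximises between-class variance. This is a simplified stand-alone version (the paper applies it to difference images, which are not reproduced here):

```python
# Exhaustive OTSU thresholding over an integer grey-level histogram.
def otsu_threshold(pixels, levels=256):
    # Choose the grey level that maximises the between-class variance
    # of the two groups (background vs. candidate targets) it separates.
    hist = [0] * levels
    for v in pixels:
        hist[v] += 1
    total = len(pixels)
    sum_all = sum(i * h for i, h in enumerate(hist))
    best_t, best_var = 0, -1.0
    w0 = 0      # background pixel count so far
    sum0 = 0.0  # background intensity sum so far
    for t in range(levels):
        w0 += hist[t]
        if w0 == 0:
            continue
        w1 = total - w0
        if w1 == 0:
            break
        sum0 += t * hist[t]
        mu0 = sum0 / w0
        mu1 = (sum_all - sum0) / w1
        var_between = w0 * w1 * (mu0 - mu1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t
```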

  14. The Bees among Us: Modelling Occupancy of Solitary Bees

    PubMed Central

    MacIvor, J. Scott; Packer, Laurence

    2016-01-01

    Occupancy modelling has received increasing attention as a tool for differentiating between true absence and non-detection in biodiversity data. This is thought to be particularly useful when a species of interest is spread out over a large area and sampling is constrained. We used occupancy modelling to estimate the probability of three phylogenetically independent pairs of native–introduced species [Megachile campanulae (Robertson)–Megachile rotundata (Fab.), Megachile pugnata Say–Megachile centuncularis (L.), Osmia pumila Cresson–Osmia caerulescens (L.)] (Apoidea: Megachilidae) being present when repeated sampling did not always find them. Our study occurred along a gradient of urbanization and used nest boxes (bee hotels) set up over three consecutive years. Occupancy modelling revealed patterns different from those obtained with species detection and abundance-based data alone. For example, it predicted that the species ranked 4th in terms of detection actually had the greatest occupancy among all six species. The native M. pugnata had decreased occupancy with increasing building footprint, and a similar but not significant pattern was found for the native O. pumila. Two introduced bees (M. rotundata and M. centuncularis), and one native (M. campanulae), had modelled occupancy values that increased with increasing urbanization. Occupancy probability differed among urban green space types for three of six bee species, with values for two native species (M. campanulae and O. pumila) being highest in home gardens and that for the exotic O. caerulescens being highest in community gardens. The combination of occupancy modelling with analysis of habitat variables as an augmentation to detection and abundance-based sampling is suggested to be the best way to ensure that urban habitat management results in the desired outcomes. PMID:27911954
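
The occupancy models used in studies like this treat each site's detection history as a zero-inflated binomial: a site with no detections may be unoccupied, or occupied but missed on every visit. A crude grid-search sketch of the maximum-likelihood fit, assuming constant occupancy ψ and detection probability p (real analyses, including this one, add covariates such as building footprint):

```python
# Single-season occupancy MLE via brute-force grid search (constant psi, p).
import math

def occupancy_loglik(psi, p, detections, visits):
    # Zero-inflated binomial likelihood: a site with d detections out of
    # `visits` surveys contributes
    #   psi * C(visits, d) * p^d * (1-p)^(visits-d)   if d > 0
    #   psi * (1-p)^visits + (1 - psi)                if d == 0
    ll = 0.0
    for d in detections:
        if d > 0:
            ll += math.log(psi * math.comb(visits, d) * p ** d * (1 - p) ** (visits - d))
        else:
            ll += math.log(psi * (1 - p) ** visits + (1 - psi))
    return ll

def fit_occupancy(detections, visits):
    # Grid-search MLE over (psi, p) on a 0.02 grid; real analyses use
    # numerical optimisers, but the likelihood surface is identical.
    grid = [i / 50 for i in range(1, 50)]
    return max(((occupancy_loglik(psi, p, detections, visits), psi, p)
                for psi in grid for p in grid))[1:]
```

With half the sites showing 3 detections in 5 visits and half showing none, the fitted ψ lands just above 0.5: almost all zero-history sites are judged unoccupied because an occupied site would rarely be missed five times at the fitted p.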

  15. Simulation of sea surface wave influence on small target detection with airborne laser depth sounding.

    PubMed

    Tulldahl, H Michael; Steinvall, K Ove

    2004-04-20

    A theoretical model for simulation of airborne depth-sounding lidar is presented with the purpose of analyzing the influence from water surface waves on the ability to detect 1-m³ targets placed on the sea bottom. Although water clarity is the main limitation, sea surface waves can significantly affect the detectability. The detection probability for a target at a 9-m depth can be above 90% at 1-m/s wind and below 80% at 6-m/s wind for the same water clarity. The simulation model contains both numerical and analytical components. Simulated data are compared with measured data and give realistic results for bottom depths between 3 and 10 m.

  16. Modeling of a bubble-memory organization with self-checking translators to achieve high reliability.

    NASA Technical Reports Server (NTRS)

    Bouricius, W. G.; Carter, W. C.; Hsieh, E. P.; Wadia, A. B.; Jessep, D. C., Jr.

    1973-01-01

    Study of the design and modeling of a highly reliable bubble-memory system that has the capabilities of: (1) correcting a single 16-adjacent bit-group error resulting from failures in a single basic storage module (BSM), and (2) detecting with a probability greater than 0.99 any double errors resulting from failures in BSM's. The results of the study justify the design philosophy adopted of employing memory data encoding and a translator to correct single group errors and detect double group errors to enhance the overall system reliability.

  17. Sampling little fish in big rivers: Larval fish detection probabilities in two Lake Erie tributaries and implications for sampling effort and abundance indices

    USGS Publications Warehouse

    Pritt, Jeremy J.; DuFour, Mark R.; Mayer, Christine M.; Roseman, Edward F.; DeBruyne, Robin L.

    2014-01-01

    Larval fish are frequently sampled in coastal tributaries to determine factors affecting recruitment, evaluate spawning success, and estimate production from spawning habitats. Imperfect detection of larvae is common, because larval fish are small and unevenly distributed in space and time, and coastal tributaries are often large and heterogeneous. We estimated detection probabilities of larval fish from several taxa in the Maumee and Detroit rivers, the two largest tributaries of Lake Erie. We then demonstrated how accounting for imperfect detection influenced (1) the probability of observing taxa as present relative to sampling effort and (2) abundance indices for larval fish of two Detroit River species. We found that detection probabilities ranged from 0.09 to 0.91 but were always less than 1.0, indicating that imperfect detection is common among taxa and between systems. In general, taxa with high fecundities, small larval length at hatching, and no nesting behaviors had the highest detection probabilities. Also, detection probabilities were higher in the Maumee River than in the Detroit River. Accounting for imperfect detection produced up to fourfold increases in abundance indices for Lake Whitefish Coregonus clupeaformis and Gizzard Shad Dorosoma cepedianum. The effect of accounting for imperfect detection in abundance indices was greatest during periods of low abundance for both species. Detection information can be used to determine the appropriate level of sampling effort for larval fishes and may improve management and conservation decisions based on larval fish data.
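
The "up to fourfold" adjustment reported here implies a detection probability near 0.25 (an inference from the abstract, not a stated value). Correcting an abundance index for imperfect detection is simply division by the estimated detection probability, as the sketch below shows:

```python
# Detection-corrected abundance index: raw count divided by estimated p.
def corrected_index(count, p):
    # Assumes an estimated detection probability 0 < p <= 1; a count of C
    # with detection probability p implies roughly C / p individuals present.
    if not 0.0 < p <= 1.0:
        raise ValueError("detection probability must be in (0, 1]")
    return count / p
```

For example, a raw index of 40 larvae with p = 0.25 corrects to 160, a fourfold increase; with perfect detection (p = 1) the index is unchanged.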

  18. Detection, prevalence, and transmission of avian hematozoa in waterfowl at the Arctic/sub-Arctic interface: co-infections, viral interactions, and sources of variation.

    USGS Publications Warehouse

    Meixell, Brandt W.; Arnold, Todd W.; Lindberg, Mark S.; Smith, Matthew M.; Runstadler, Jonathan A.; Ramey, Andy M.

    2016-01-01

    Methods: We used molecular methods to screen blood samples and cloacal/oropharyngeal swabs collected from 1347 ducks of five species during May-August 2010, in interior Alaska, for the presence of hematozoa, Influenza A Virus (IAV), and IAV antibodies. Using models to account for imperfect detection of parasites, we estimated seasonal variation in prevalence of three parasite genera (Haemoproteus, Plasmodium, Leucocytozoon) and investigated how co-infection with parasites and viruses was related to the probability of infection. Results: We detected parasites from each hematozoan genus in adult and juvenile ducks of all species sampled. Seasonal patterns in detection and prevalence varied by parasite genus and species, age, and sex of duck hosts. The probabilities of infection for Haemoproteus and Leucocytozoon parasites were strongly positively correlated, but hematozoa infection was not correlated with IAV infection or serostatus. The probability of Haemoproteus infection was negatively related to body condition in juvenile ducks; relationships between Leucocytozoon infection and body condition varied among host species. Conclusions: We present prevalence estimates for Haemoproteus, Leucocytozoon, and Plasmodium infections in waterfowl at the interface of the sub-Arctic and Arctic and provide evidence for local transmission of all three parasite genera. Variation in prevalence and molecular detection of hematozoa parasites in wild ducks is influenced by seasonal timing and a number of host traits. A positive correlation in co-infection of Leucocytozoon and Haemoproteus suggests that infection probability by parasites in one or both genera is enhanced by infection with the other, or that encounter rates of hosts and genus-specific vectors are correlated. Using size-adjusted mass as an index of host condition, we did not find evidence for strong deleterious consequences of hematozoa infection in wild ducks.

  19. The costs of evaluating species densities and composition of snakes to assess development impacts in amazonia.

    PubMed

    Fraga, Rafael de; Stow, Adam J; Magnusson, William E; Lima, Albertina P

    2014-01-01

    Studies leading to decision-making for environmental licensing often fail to provide accurate estimates of diversity. Measures of snake diversity are regularly obtained to assess development impacts in the rainforests of the Amazon Basin, but this taxonomic group may be subject to poor detection probabilities. Recently, the Brazilian government tried to standardize sampling designs by the implementation of a system (RAPELD) to quantify biological diversity using spatially-standardized sampling units. Consistency in sampling design allows the detection probabilities to be compared among taxa, and sampling effort and associated cost to be evaluated. The cost effectiveness of detecting snakes has received no attention in Amazonia. Here we tested the effects of reducing sampling effort on estimates of species densities and assemblage composition. We identified snakes in seven plot systems, each standardised with 14 plots. The 250 m long centre line of each plot followed an altitudinal contour. Surveys were repeated four times in each plot and detection probabilities were estimated for the 41 species encountered. Reducing the number of observations, or the size of the sampling modules, caused significant loss of information on species densities and local patterns of variation in assemblage composition. We estimated the cost to find a snake as $ 120 U.S., but general linear models indicated the possibility of identifying differences in assemblage composition for half the overall survey costs. Decisions to reduce sampling effort depend on the importance of lost information to target-issues, and may not be the preferred option if there is the potential for identifying individual snake species requiring specific conservation actions. However, in most studies of human disturbance on species assemblages, it is likely to be more cost-effective to focus on other groups of organisms with higher detection probabilities.

  1. A hierarchical model for estimating density in camera-trap studies

    USGS Publications Warehouse

    Royle, J. Andrew; Nichols, James D.; Karanth, K.Ullas; Gopalaswamy, Arjun M.

    2009-01-01

    Estimating animal density using capture–recapture data from arrays of detection devices such as camera traps has been problematic due to the movement of individuals and heterogeneity in capture probability among them induced by differential exposure to trapping. We develop a spatial capture–recapture model for estimating density from camera-trapping data which contains explicit models for the spatial point process governing the distribution of individuals and their exposure to and detection by traps. We adopt a Bayesian approach to analysis of the hierarchical model using the technique of data augmentation. The model is applied to photographic capture–recapture data on tigers Panthera tigris in Nagarahole reserve, India. Using this model, we estimate the density of tigers to be 14.3 animals per 100 km² during 2004. Synthesis and applications. Our modelling framework largely overcomes several weaknesses in conventional approaches to the estimation of animal density from trap arrays. It effectively deals with key problems such as individual heterogeneity in capture probabilities, movement of traps, presence of potential ‘holes’ in the array and ad hoc estimation of sample area. The formulation, thus, greatly enhances flexibility in the conduct of field surveys as well as in the analysis of data, from studies that may involve physical, photographic or DNA-based ‘captures’ of individual animals.

  2. Sampling considerations for disease surveillance in wildlife populations

    USGS Publications Warehouse

    Nusser, S.M.; Clark, W.R.; Otis, D.L.; Huang, L.

    2008-01-01

    Disease surveillance in wildlife populations involves detecting the presence of a disease, characterizing its prevalence and spread, and subsequent monitoring. A probability sample of animals selected from the population and corresponding estimators of disease prevalence and detection provide estimates with quantifiable statistical properties, but this approach is rarely used. Although wildlife scientists often assume probability sampling and random disease distributions to calculate sample sizes, convenience samples (i.e., samples of readily available animals) are typically used, and disease distributions are rarely random. We demonstrate how landscape-based simulation can be used to explore properties of estimators from convenience samples in relation to probability samples. We used simulation methods to model what is known about the habitat preferences of the wildlife population, the disease distribution, and the potential biases of the convenience-sample approach. Using chronic wasting disease in free-ranging deer (Odocoileus virginianus) as a simple illustration, we show that using probability sample designs with appropriate estimators provides unbiased surveillance parameter estimates but that the selection bias and coverage errors associated with convenience samples can lead to biased and misleading results. We also suggest practical alternatives to convenience samples that mix probability and convenience sampling. For example, a sample of land areas can be selected using a probability design that oversamples areas with larger animal populations, followed by harvesting of individual animals within sampled areas using a convenience sampling method.
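
The selection-bias mechanism the authors describe can be reproduced in a toy simulation (all numbers below are hypothetical): disease is clustered in one part of the landscape, and convenience sampling draws only from that part, so its prevalence estimate is biased while a simple random sample is not:

```python
# Toy comparison of probability vs. convenience sampling for prevalence.
import random

def simulate_prevalence_estimates(n_animals=10000, n_sample=500, seed=1):
    # Hypothetical landscape: 30% of animals live near roads, where the
    # disease is clustered (20% prevalence vs. 5% elsewhere). Convenience
    # sampling draws only road-adjacent animals; probability sampling draws
    # a simple random sample of the whole population.
    random.seed(seed)
    animals = []
    for _ in range(n_animals):
        near_road = random.random() < 0.3
        p_infected = 0.20 if near_road else 0.05
        animals.append((near_road, random.random() < p_infected))
    true_prev = sum(inf for _, inf in animals) / n_animals
    prob = random.sample(animals, n_sample)                 # probability sample
    prob_est = sum(inf for _, inf in prob) / n_sample
    roadside = [a for a in animals if a[0]]
    conv = random.sample(roadside, n_sample)                # convenience sample
    conv_est = sum(inf for _, inf in conv) / n_sample
    return true_prev, prob_est, conv_est
```

The random sample recovers the true prevalence (about 0.095 here) while the roadside-only sample converges to the roadside prevalence (about 0.20), overstating disease burden roughly twofold in this toy setting.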

  3. A Gibbs sampler for motif detection in phylogenetically close sequences

    NASA Astrophysics Data System (ADS)

    Siddharthan, Rahul; van Nimwegen, Erik; Siggia, Eric

    2004-03-01

    Genes are regulated by transcription factors that bind to DNA upstream of genes and recognize short conserved ``motifs'' in a random intergenic ``background''. Motif-finders such as the Gibbs sampler compare the probability of these short sequences being represented by ``weight matrices'' to the probability of their arising from the background ``null model'', and explore this space (analogous to a free-energy landscape). But closely related species may show conservation not because of functional sites but simply because they have not had sufficient time to diverge, so conventional methods will fail. We introduce a new Gibbs sampler algorithm that accounts for common ancestry when searching for motifs, while requiring minimal ``prior'' assumptions on the number and types of motifs, assessing the significance of detected motifs by ``tracking'' clusters that stay together. We apply this scheme to motif detection in sporulation-cycle genes in the yeast S. cerevisiae, using recent sequences of other closely-related Saccharomyces species.

  4. Multivariate η-μ fading distribution with arbitrary correlation model

    NASA Astrophysics Data System (ADS)

    Ghareeb, Ibrahim; Atiani, Amani

    2018-03-01

    An extensive analysis of the multivariate η-μ distribution with arbitrary correlation is presented, where novel analytical expressions for the multivariate probability density function, cumulative distribution function and moment generating function (MGF) of arbitrarily correlated and not necessarily identically distributed η-μ power random variables are derived. Also, this paper provides an exact-form expression for the MGF of the instantaneous signal-to-noise ratio at the combiner output in a diversity reception system with maximal-ratio combining and post-detection equal-gain combining operating in slow, frequency-nonselective, arbitrarily correlated and not necessarily identically distributed η-μ fading channels. The average bit error probability of differentially detected quadrature phase shift keying signals with a post-detection diversity reception system over arbitrarily correlated η-μ fading channels with not necessarily identical fading parameters is determined by using the MGF-based approach. The effect of fading correlation between diversity branches, fading severity parameters and diversity level is studied.

  5. Detectability of Wellbore CO2 Leakage using the Magnetotelluric Method

    NASA Astrophysics Data System (ADS)

    Yang, X.; Buscheck, T. A.; Mansoor, K.; Carroll, S.

    2016-12-01

    We assessed the effectiveness of the magnetotelluric (MT) method in detecting CO2 and brine leakage through a wellbore, which penetrates a CO2 storage reservoir, into overlying aquifers, 0 to 1720 m in depth, in support of the USDOE National Risk Assessment Partnership (NRAP) monitoring program. Synthetic datasets based on the Kimberlina site in the southern San Joaquin Basin, California were created using CO2 storage reservoir models, wellbore leakage models, and groundwater/geochemical models of the overlying aquifers. The species concentrations simulated with the groundwater/geochemical models were converted into bulk electrical conductivity (EC) distributions as the MT model input. Brine and CO2 leakage into the overlying aquifers increases ion concentrations, and thus results in an EC increase, which may be detected by the MT method. Our objective was to estimate and maximize the probability of leakage detection using the MT method. The MT method is an electromagnetic geophysical technique that images the subsurface EC distribution by measuring natural electric and magnetic fields in the frequency range from 0.01 Hz to 1 kHz with sensors on the ground surface. The ModEM software was used to predict electromagnetic responses from brine and CO2 leakage and to invert synthetic MT data for recovery of subsurface conductivity distribution. We are in the process of building 1000 simulations for ranges of permeability, leakage flux, and hydraulic gradient to study leakage detectability and to develop an optimization method to answer when, where and how an MT monitoring system should be deployed to maximize the probability of leakage detection. This work was sponsored by the USDOE Fossil Energy, National Energy Technology Laboratory, managed by Traci Rodosta and Andrea McNemar. This work was performed under the auspices of the USDOE by LLNL under contract DE-AC52-07NA27344. LLNL IM release number is LLNL-ABS-699276.

  6. Beyond the swab: ecosystem sampling to understand the persistence of an amphibian pathogen.

    PubMed

    Mosher, Brittany A; Huyvaert, Kathryn P; Bailey, Larissa L

    2018-06-02

    Understanding the ecosystem-level persistence of pathogens is essential for predicting and measuring host-pathogen dynamics. However, this process is often masked, in part due to a reliance on host-based pathogen detection methods. The amphibian pathogens Batrachochytrium dendrobatidis (Bd) and B. salamandrivorans (Bsal) are pathogens of global conservation concern. Despite having free-living life stages, little is known about the distribution and persistence of these pathogens outside of their amphibian hosts. We combine historic amphibian monitoring data with contemporary host- and environment-based pathogen detection data to obtain estimates of Bd occurrence independent of amphibian host distributions. We also evaluate differences in filter- and swab-based detection probability and assess inferential differences arising from using different decision criteria used to classify samples as positive or negative. Water filtration-based detection probabilities were lower than those from swabs but were > 10%, and swab-based detection probabilities varied seasonally, declining in the early fall. The decision criterion used to classify samples as positive or negative was important; using a more liberal criterion yielded higher estimates of Bd occurrence than when a conservative criterion was used. Different covariates were important when using the liberal or conservative criterion in modeling Bd detection. We found evidence of long-term Bd persistence for several years after an amphibian host species of conservation concern, the boreal toad (Anaxyrus boreas boreas), was last detected. Our work provides evidence of long-term Bd persistence in the ecosystem, and underscores the importance of environmental samples for understanding and mitigating disease-related threats to amphibian biodiversity.

  7. Detection of mastitis in dairy cattle by use of mixture models for repeated somatic cell scores: a Bayesian approach via Gibbs sampling.

    PubMed

    Odegård, J; Jensen, J; Madsen, P; Gianola, D; Klemetsdal, G; Heringstad, B

    2003-11-01

    The distribution of somatic cell scores could be regarded as a mixture of at least two components depending on a cow's udder health status. A heteroscedastic two-component Bayesian normal mixture model with random effects was developed and implemented via Gibbs sampling. The model was evaluated using datasets consisting of simulated somatic cell score records. Somatic cell score was simulated as a mixture representing two alternative udder health statuses ("healthy" or "diseased"). Animals were assigned randomly to the two components according to the probability of group membership (Pm). Random effects (additive genetic and permanent environment), when included, had identical distributions across mixture components. Posterior probabilities of putative mastitis were estimated for all observations, and model adequacy was evaluated using measures of sensitivity, specificity, and posterior probability of misclassification. Fitting different residual variances in the two mixture components caused some bias in estimation of parameters. When the components were difficult to disentangle, so were their residual variances, causing bias in estimation of Pm and of location parameters of the two underlying distributions. When all variance components were identical across mixture components, the mixture model analyses returned parameter estimates essentially without bias and with a high degree of precision. Including random effects in the model increased the probability of correct classification substantially. No sizable differences in probability of correct classification were found between models in which a single cow effect (ignoring relationships) was fitted and models where this effect was split into genetic and permanent environmental components, utilizing relationship information. When genetic and permanent environmental effects were fitted, the between-replicate variance of estimates of posterior means was smaller because the model accounted for random genetic drift.
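
A stripped-down version of such a Gibbs sampler, assuming two normal components with known unit variance, a flat prior on the means, and a Beta(1,1) prior on the mixing weight (far simpler than the paper's heteroscedastic model with genetic random effects, but using the same alternating-conditional logic):

```python
# Minimal Gibbs sampler for a two-component normal mixture (toy version).
import math
import random

def gibbs_two_normal(y, iters=1500, burn=300, seed=7):
    random.seed(seed)
    mu = [min(y), max(y)]            # crude, well-separated initialisation
    pi1 = 0.5                        # weight of component 0 ("healthy")
    keep_mu, keep_pi = [], []
    for it in range(iters):
        # 1) sample latent component labels z_i | mu, pi1
        z = []
        for yi in y:
            l0 = pi1 * math.exp(-0.5 * (yi - mu[0]) ** 2)
            l1 = (1.0 - pi1) * math.exp(-0.5 * (yi - mu[1]) ** 2)
            z.append(0 if random.random() < l0 / (l0 + l1) else 1)
        # 2) sample component means | labels: Normal(group mean, 1/n_k)
        for k in (0, 1):
            grp = [yi for yi, zi in zip(y, z) if zi == k]
            if grp:
                mu[k] = random.gauss(sum(grp) / len(grp), 1.0 / math.sqrt(len(grp)))
        # 3) sample mixing weight | labels: Beta(n0 + 1, n1 + 1) via two gammas
        n0 = z.count(0)
        a = random.gammavariate(n0 + 1, 1.0)
        b = random.gammavariate(len(y) - n0 + 1, 1.0)
        pi1 = a / (a + b)
        if it >= burn:
            keep_mu.append(tuple(mu))
            keep_pi.append(pi1)
    return keep_mu, keep_pi
```

With well-separated simulated components the posterior draws concentrate near the true means and mixing weight; the posterior label probabilities computed in step 1 are the "posterior probabilities of putative mastitis" the abstract refers to.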

  8. Modelling the spatial distribution of Fasciola hepatica in dairy cattle in Europe.

    PubMed

    Ducheyne, Els; Charlier, Johannes; Vercruysse, Jozef; Rinaldi, Laura; Biggeri, Annibale; Demeler, Janina; Brandt, Christina; De Waal, Theo; Selemetas, Nikolaos; Höglund, Johan; Kaba, Jaroslaw; Kowalczyk, Slawomir J; Hendrickx, Guy

    2015-03-26

    A harmonized sampling approach in combination with spatial modelling is required to update current knowledge of fasciolosis in dairy cattle in Europe. Within the scope of the EU project GLOWORM, samples from 3,359 randomly selected farms in 849 municipalities in Belgium, Germany, Ireland, Poland and Sweden were collected and their infection status assessed using an indirect bulk tank milk (BTM) enzyme-linked immunosorbent assay (ELISA). Dairy farms were considered exposed when the optical density ratio (ODR) exceeded the 0.3 cut-off. Two ensemble-modelling techniques, Random Forests (RF) and Boosted Regression Trees (BRT), were used to obtain the spatial distribution of the probability of exposure to Fasciola hepatica using remotely sensed environmental variables (1-km spatial resolution) and interpolated values from meteorological stations as predictors. The median ODRs amounted to 0.31, 0.12, 0.54, 0.25 and 0.44 for Belgium, Germany, Ireland, Poland and southern Sweden, respectively. Using the 0.3 threshold, 571 municipalities were categorized as positive and 429 as negative. RF was capable of predicting the spatial distribution of exposure with an area under the receiver operating characteristic (ROC) curve (AUC) of 0.83 (0.96 for BRT). Both models identified rainfall and temperature as the most important factors for probability of exposure. Areas of high and low exposure were identified by both models, with BRT better at discriminating between low-probability and high-probability exposure; this model may therefore be more useful in practice. Given a harmonized sampling strategy, it should be possible to generate robust spatial models for fasciolosis in dairy cattle in Europe to be used as input for temporal models and for the detection of deviations in baseline probability. Further research is required for model output in areas outside the eco-climatic range investigated.
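
AUC, the headline metric here, has a direct probabilistic reading that can be computed from any model's predicted exposure probabilities: the probability that a randomly chosen exposed farm scores higher than a randomly chosen unexposed one. A sketch of that computation (the paper itself fits Random Forests and Boosted Regression Trees; any scores can be plugged in):

```python
# ROC AUC via the Mann-Whitney U statistic on predicted scores.
def auc(scores_pos, scores_neg):
    # Fraction of (positive, negative) pairs where the positive case
    # scores higher; ties count as half a win.
    wins = 0.0
    for sp in scores_pos:
        for sn in scores_neg:
            if sp > sn:
                wins += 1.0
            elif sp == sn:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))
```

An AUC of 0.83 (RF) thus means an exposed farm outranks an unexposed one 83% of the time; 0.5 would be no better than chance.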

  9. A Bayesian state-space formulation of dynamic occupancy models

    USGS Publications Warehouse

    Royle, J. Andrew; Kery, M.

    2007-01-01

    Species occurrence and its dynamic components, extinction and colonization probabilities, are focal quantities in biogeography and metapopulation biology, and for species conservation assessments. It has been increasingly appreciated that these parameters must be estimated separately from detection probability to avoid the biases induced by nondetection error. Hence, there is now considerable theoretical and practical interest in dynamic occupancy models that contain explicit representations of metapopulation dynamics such as extinction, colonization, and turnover as well as growth rates. We describe a hierarchical parameterization of these models that is analogous to the state-space formulation of models in time series, where the model is represented by two components, one for the partially observable occupancy process and another for the observations conditional on that process. This parameterization naturally allows estimation of all parameters of the conventional approach to occupancy models, but in addition, yields great flexibility and extensibility, e.g., to modeling heterogeneity or latent structure in model parameters. We also highlight the important distinction between population and finite sample inference; the latter yields much more precise estimates for the particular sample at hand. Finite sample estimates can easily be obtained using the state-space representation of the model but are difficult to obtain under the conventional approach of likelihood-based estimation. We use R and WinBUGS to apply the model to two examples. In a standard analysis for the European Crossbill in a large Swiss monitoring program, we fit a model with year-specific parameters. Estimates of the dynamic parameters varied greatly among years, highlighting the irruptive population dynamics of that species.
In the second example, we analyze route occupancy of Cerulean Warblers in the North American Breeding Bird Survey (BBS) using a model allowing for site-specific heterogeneity in model parameters. The results indicate relatively low turnover and a stable distribution of Cerulean Warblers which is in contrast to analyses of counts of individuals from the same survey that indicate important declines. This discrepancy illustrates the inertia in occupancy relative to actual abundance. Furthermore, the model reveals a declining patch survival probability, and increasing turnover, toward the edge of the range of the species, which is consistent with metapopulation perspectives on the genesis of range edges. Given detection/non-detection data, dynamic occupancy models as described here have considerable potential for the study of distributions and range dynamics.
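
The two-level state-space structure described above separates cleanly in simulation: one process generates latent occupancy through colonisation and extinction, and a second generates detections conditional on occupancy. A sketch with hypothetical parameter values:

```python
# State-space simulation of a dynamic occupancy model.
import random

def simulate_dynamic_occupancy(n_sites=200, n_years=10, n_visits=3,
                               psi1=0.6, gamma=0.1, eps=0.2, p=0.5, seed=42):
    # State process: initial occupancy psi1, then colonisation (gamma) of
    # empty sites and extinction (eps) of occupied ones each year.
    # Observation process: each visit to an occupied site detects the
    # species with probability p; empty sites never yield detections.
    random.seed(seed)
    z = [[random.random() < psi1 for _ in range(n_sites)]]
    for t in range(1, n_years):
        z.append([(random.random() > eps) if z[t - 1][i] else (random.random() < gamma)
                  for i in range(n_sites)])
    y = [[sum(random.random() < p for _ in range(n_visits)) if z[t][i] else 0
          for i in range(n_sites)] for t in range(n_years)]
    return z, y
```

Comparing the latent states z with the detection counts y shows why detection must be modelled: the fraction of sites with at least one detection can never exceed, and usually understates, true occupancy.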

  10. Predicting potentially toxigenic Pseudo-nitzschia blooms in the Chesapeake Bay

    NASA Astrophysics Data System (ADS)

    Anderson, Clarissa R.; Sapiano, Mathew R. P.; Prasad, M. Bala Krishna; Long, Wen; Tango, Peter J.; Brown, Christopher W.; Murtugudde, Raghu

    2010-11-01

    Harmful algal blooms are now recognized as a significant threat to the Chesapeake Bay as they can severely compromise the economic viability of important recreational and commercial fisheries in the largest estuary of the United States. This study describes the development of empirical models for the potentially domoic acid-producing Pseudo-nitzschia species complex present in the Bay, developed from a 22-year time series of cell abundance and concurrent measurements of hydrographic and chemical properties. Using a logistic Generalized Linear Model (GLM) approach, model parameters and performance were compared over a range of Pseudo-nitzschia bloom thresholds relevant to toxin production by different species. Small-threshold blooms (≥10 cells mL⁻¹) are explained by time of year, location, and variability in surface values of phosphate, temperature, nitrate plus nitrite, and freshwater discharge. Medium- (100 cells mL⁻¹) to large-threshold (1000 cells mL⁻¹) blooms are further explained by salinity, silicic acid, dissolved organic carbon, and light attenuation (Secchi) depth. These predictors are similar to other models for Pseudo-nitzschia blooms on the west coast, suggesting commonalities across ecosystems. Hindcasts of bloom probabilities at a 19% bloom prediction point yield a Heidke Skill Score of ~53%, a Probability of Detection of ~75%, a False Alarm Ratio of ~52%, and a Probability of False Detection of ~9%. The implication of possible future changes in Baywide nutrient stoichiometry on Pseudo-nitzschia blooms is discussed.
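
All four verification scores quoted here come from a 2×2 forecast/observation contingency table. A sketch of the standard formulas, with illustrative counts chosen to land near the reported values (the counts themselves are hypothetical, not taken from the study):

```python
# Standard 2x2 verification scores for bloom / no-bloom hindcasts.
def forecast_scores(hits, misses, false_alarms, correct_negatives):
    pod = hits / (hits + misses)                   # Probability of Detection
    far = false_alarms / (hits + false_alarms)     # False Alarm Ratio
    pofd = false_alarms / (false_alarms + correct_negatives)  # Prob. of False Detection
    n = hits + misses + false_alarms + correct_negatives
    # Heidke Skill Score: accuracy relative to the number of correct
    # forecasts expected by chance given the marginal totals.
    expected = ((hits + misses) * (hits + false_alarms)
                + (correct_negatives + misses) * (correct_negatives + false_alarms)) / n
    return pod, far, pofd, (hits + correct_negatives - expected) / (n - expected)
```

For example, 30 hits, 10 misses, 32 false alarms, and 324 correct negatives give POD = 0.75, FAR ≈ 0.52, POFD ≈ 0.09, and HSS ≈ 0.53, close to the hindcast skill reported above.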

  11. Early environment and recruitment of black brant (Branta bernicla nigricans) into the breeding population

    USGS Publications Warehouse

    Sedinger, James S.; Herzog, Mark P.; Ward, David H.

    2004-01-01

    In geese, growth regulates survival in the first year. We examined whether early growth, which is primarily governed by environmental conditions, also affects the probability that individuals surviving their first year enter the breeding population. We used logistic regression on a sample of Black Brant (Branta bernicla nigricans) that were weighed at a known age in their first summer and observed during winter (indicating that they had survived the principal mortality period of their first year) to study whether early growth influenced the probability that those individuals would be recruited into the breeding population. We also examined the effects of cohort (1986-1996), sex, age when measured, and the area where individuals were reared. The model with the lowest Akaike's Information Criterion score contained body mass, age (days) at measurement, cohort, sex, and brood-rearing area. Models that included mass as a variable accounted for 85% of the cumulative model weight among the models we considered, indicating that gosling mass had a substantial effect on the probability of entering the breeding population. Females were more likely to be detected breeding than males, which is consistent with the differential fidelity of the sexes. Of individuals that survived the first year, larger goslings were more likely to become breeders. More recent cohorts were less likely to have been detected as breeders. Our findings indicate that the environment during the growth period affects the ability of individuals to enter the breeding population, even after accounting for the effects of growth on survival.

  12. Precipitation Cluster Distributions: Current Climate Storm Statistics and Projected Changes Under Global Warming

    NASA Astrophysics Data System (ADS)

    Quinn, Kevin Martin

    The total amount of precipitation integrated across a precipitation cluster (contiguous precipitating grid cells exceeding a minimum rain rate) is a useful measure of the aggregate size of the disturbance, expressed as the rate of water mass lost or latent heat released, i.e. the power of the disturbance. Probability distributions of cluster power are examined during boreal summer (May-September) and winter (January-March) using satellite-retrieved rain rates from the Tropical Rainfall Measuring Mission (TRMM) 3B42 and Special Sensor Microwave Imager and Sounder (SSM/I and SSMIS) programs, model output from the High Resolution Atmospheric Model (HIRAM, roughly 0.25-0.5° resolution), seven 1-2° resolution members of the Coupled Model Intercomparison Project Phase 5 (CMIP5) experiment, and the National Center for Atmospheric Research Large Ensemble (NCAR LENS). Spatial distributions of precipitation-weighted centroids are also investigated in observations (TRMM-3B42) and climate models during winter as a metric for changes in mid-latitude storm tracks. Observed probability distributions for both seasons are scale-free from the smallest clusters up to a cutoff scale at high cluster power, after which the probability density drops rapidly. When low rain rates are excluded by choosing a minimum rain rate threshold in defining clusters, the models accurately reproduce observed cluster power statistics and winter storm tracks. Changes in behavior in the tail of the distribution, above the cutoff, are important for impacts since these quantify the frequency of the most powerful storms. End-of-century cluster power distributions and storm track locations are investigated in these models under a "business as usual" global warming scenario. The probability of high cluster power events increases by end-of-century across all models, by up to an order of magnitude for the highest-power events for which statistics can be computed. 
For the three models in the suite with continuous time series of high-resolution output, there is substantial variability in when these probability increases for the most powerful precipitation clusters become detectable, ranging from detectable within the observational period to statistically significant trends emerging only after 2050. A similar analysis of National Centers for Environmental Prediction (NCEP) Reanalysis 2 and SSM/I-SSMIS rain rate retrievals in the recent observational record does not yield reliable evidence of trends in high-power cluster probabilities at this time. Large impacts on mid-latitude storm tracks are projected over the West Coast and eastern North America, with no fewer than 8 of the 9 models examined showing large increases by end-of-century in the probability density of the most powerful storms, ranging up to a factor of 6.5 in the highest range bin for which historical statistics are computed. However, within these regional domains, there is considerable variation among models in pinpointing exactly where the largest increases will occur.
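
The cluster-power measure defined above — the summed rain rate over contiguous grid cells exceeding a minimum threshold — can be sketched with a simple connected-component pass over a rain-rate grid. The grid values and threshold below are invented; the study uses satellite retrievals and model output, not toy arrays.

```python
# Cluster power: sum of rain rates over each 4-connected cluster of grid
# cells whose rain rate meets a minimum threshold. Grid values are invented.

from collections import deque

def cluster_powers(grid, min_rate=1.0):
    """Return the summed rain rate of each 4-connected cluster, largest first."""
    rows, cols = len(grid), len(grid[0])
    seen = [[False] * cols for _ in range(rows)]
    powers = []
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] >= min_rate and not seen[r][c]:
                # breadth-first search over contiguous precipitating cells
                total, queue = 0.0, deque([(r, c)])
                seen[r][c] = True
                while queue:
                    i, j = queue.popleft()
                    total += grid[i][j]
                    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ni, nj = i + di, j + dj
                        if (0 <= ni < rows and 0 <= nj < cols
                                and grid[ni][nj] >= min_rate
                                and not seen[ni][nj]):
                            seen[ni][nj] = True
                            queue.append((ni, nj))
                powers.append(total)
    return sorted(powers, reverse=True)

rain = [[0.0, 2.0, 2.5, 0.0],
        [0.0, 3.0, 0.0, 0.0],
        [4.0, 0.0, 0.0, 1.5]]
print(cluster_powers(rain))  # -> [7.5, 4.0, 1.5]
```

Raising `min_rate` is exactly the thresholding step the abstract describes: it excludes low rain rates from cluster definitions before the power distribution is tallied.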

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Masoumi, Ali; Vilenkin, Alexander; Yamada, Masaki, E-mail: ali@cosmos.phy.tufts.edu, E-mail: vilenkin@cosmos.phy.tufts.edu, E-mail: Masaki.Yamada@tufts.edu

    In the landscape perspective, our Universe begins with a quantum tunneling from an eternally-inflating parent vacuum, followed by a period of slow-roll inflation. We investigate the tunneling process and calculate the probability distribution for the initial conditions and for the number of e-folds of slow-roll inflation, modeling the landscape by a small-field one-dimensional random Gaussian potential. We find that such a landscape is fully consistent with observations, but the probability for future detection of spatial curvature is rather low, P ∼ 10⁻³.

  14. Detection of nuclear resonance signals: modification of the receiver operating characteristics using feedback.

    PubMed

    Blauch, A J; Schiano, J L; Ginsberg, M D

    2000-06-01

    The performance of a nuclear resonance detection system can be quantified using binary detection theory. Within this framework, signal averaging increases the probability of a correct detection and decreases the probability of a false alarm by reducing the variance of the noise in the average signal. In conjunction with signal averaging, we propose another method based on feedback control concepts that further improves detection performance. By maximizing the nuclear resonance signal amplitude, feedback raises the probability of correct detection. Furthermore, information generated by the feedback algorithm can be used to reduce the probability of false alarm. We discuss the advantages afforded by feedback that cannot be obtained using signal averaging. As an example, we show how this method is applicable to the detection of explosives using nuclear quadrupole resonance. Copyright 2000 Academic Press.
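
The signal-averaging argument above can be illustrated numerically under an assumed Gaussian noise model: averaging N scans shrinks the noise standard deviation by √N, raising the probability of correct detection and lowering the false-alarm probability for a fixed threshold. This is a generic sketch of that statistical effect, not the paper's feedback method; the amplitude, noise level, threshold, and averaging counts are all invented.

```python
# Binary detection with Gaussian noise: effect of averaging N scans.
# All parameter values are illustrative.

import math

def normal_cdf(x):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def detection_probs(amplitude, sigma, threshold, n_averages):
    """Return (P_detect, P_false_alarm) for a threshold detector."""
    sigma_avg = sigma / math.sqrt(n_averages)   # noise variance shrinks with N
    p_detect = 1.0 - normal_cdf((threshold - amplitude) / sigma_avg)
    p_false = 1.0 - normal_cdf(threshold / sigma_avg)
    return p_detect, p_false

pd1, pf1 = detection_probs(amplitude=1.0, sigma=1.0, threshold=0.8, n_averages=1)
pd64, pf64 = detection_probs(amplitude=1.0, sigma=1.0, threshold=0.8, n_averages=64)
# pd rises from ~0.58 to ~0.95; pf falls from ~0.21 toward zero
```

Feedback, as proposed in the paper, adds a second lever: by maximizing the signal amplitude it moves the two distributions apart, improving the operating characteristic beyond what averaging alone achieves.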

  15. Stochastic satisficing account of confidence in uncertain value-based decisions

    PubMed Central

    Bahrami, Bahador; Keramati, Mehdi

    2018-01-01

    Every day we make choices under uncertainty: choosing which route to take to work or which queue in a supermarket to join, for example. It is unclear how outcome variance, e.g. uncertainty about waiting time in a queue, affects decisions and confidence when the outcome is stochastic and continuous. How does one evaluate and choose between an option with an unreliable but high expected reward, and an option with a more certain but lower expected reward? Here we used an experimental design in which two choices' payoffs took continuous values, to examine the effect of outcome variance on decision and confidence. We found that our participants' probability of choosing the good (high expected reward) option decreased when the good or the bad option's payoffs were more variable. Their confidence ratings were affected by outcome variability, but only when choosing the good option. Unlike in perceptual detection tasks, confidence ratings correlated only weakly with decision time, but correlated with the consistency of trial-by-trial choices. Inspired by the satisficing heuristic, we propose a "stochastic satisficing" (SSAT) model for evaluating options with continuous uncertain outcomes. In this model, options are evaluated by their probability of exceeding an acceptability threshold, and confidence reports scale with the chosen option's thus-defined satisficing probability. Participants' decisions were best explained by an expected reward model, while the SSAT model provided the best prediction of decision confidence. We further tested and verified the predictions of this model in a second experiment. Our model and experimental results generalize models of metacognition from perceptual detection tasks to continuous value-based decisions. Finally, we discuss how the stochastic satisficing account of decision confidence serves psychological and social purposes associated with the evaluation, communication and justification of decision-making. PMID:29621325
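
The core SSAT evaluation described above — scoring an option by the probability that its continuous payoff exceeds an acceptability threshold — reduces to a normal tail probability if payoffs are assumed Gaussian. The means, spreads, and threshold below are invented for illustration and are not the study's parameters.

```python
# Stochastic-satisficing score: P(payoff > acceptability threshold)
# under an assumed Gaussian payoff distribution. Values are invented.

import math

def satisficing_probability(mean, sd, threshold):
    """P(payoff > threshold) for a Gaussian payoff distribution."""
    z = (threshold - mean) / sd
    return 1.0 - 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# A risky high-mean option vs. a safer lower-mean option, threshold = 50:
risky = satisficing_probability(mean=60.0, sd=30.0, threshold=50.0)  # ~0.63
safe = satisficing_probability(mean=55.0, sd=5.0, threshold=50.0)    # ~0.84
```

Note the reversal: the safe option has the higher satisficing probability despite its lower expected reward, which is exactly how outcome variance can depress confidence in the "good" option even when expected reward drives the choice itself.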

  16. On splice site prediction using weight array models: a comparison of smoothing techniques

    NASA Astrophysics Data System (ADS)

    Taher, Leila; Meinicke, Peter; Morgenstern, Burkhard

    2007-11-01

    In most eukaryotic genes, protein-coding exons are separated by non-coding introns which are removed from the primary transcript by a process called "splicing". The positions where introns are cut and exons are spliced together are called "splice sites". Thus, computational prediction of splice sites is crucial for gene finding in eukaryotes. Weight array models are a powerful probabilistic approach to splice site detection. Parameters for these models are usually derived from m-tuple frequencies in trusted training data and subsequently smoothed to avoid zero probabilities. In this study we compare three different ways of parameter estimation for m-tuple frequencies, namely (a) non-smoothed probability estimation, (b) standard pseudo counts and (c) a Gaussian smoothing procedure that we recently developed.
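
Of the three estimation strategies compared above, the pseudocount approach (b) is the simplest to sketch: adding a constant to every m-tuple count prevents unseen tuples from receiving probability zero. The tiny count table below is invented and much smaller than a real splice-site training set.

```python
# Additive (pseudocount) smoothing of m-tuple frequencies at one
# splice-site position. The count table is invented for illustration.

from itertools import product

def smoothed_probs(counts, alphabet="ACGT", m=2, alpha=1.0):
    """Probability estimates with pseudocount alpha added to every m-tuple."""
    tuples = ["".join(t) for t in product(alphabet, repeat=m)]
    total = sum(counts.get(t, 0) for t in tuples) + alpha * len(tuples)
    return {t: (counts.get(t, 0) + alpha) / total for t in tuples}

counts = {"GT": 8, "GC": 2}      # dinucleotides observed in training data
probs = smoothed_probs(counts)   # every dinucleotide now has probability > 0
```

With `alpha=0` this degenerates to the non-smoothed estimator (a), where the 14 unobserved dinucleotides would get probability zero and veto any candidate site containing them.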

  17. Designing occupancy studies when false-positive detections occur

    USGS Publications Warehouse

    Clement, Matthew

    2016-01-01

    1. Recently, estimators have been developed to estimate occupancy probabilities when false-positive detections occur during presence-absence surveys. Some of these estimators combine different types of survey data to improve estimates of occupancy. With these estimators, there is a trade-off between the number of sample units surveyed and the number and type of surveys at each sample unit. Guidance on the efficient design of studies when false positives occur is unavailable. 2. For a range of scenarios, I identified survey designs that minimized the mean square error of the estimate of occupancy. I considered an approach that uses one survey method and two observation states and an approach that uses two survey methods. For each approach, I used numerical methods to identify optimal survey designs when model assumptions were met and parameter values were correctly anticipated, when parameter values were not correctly anticipated, and when the assumption of no unmodelled detection heterogeneity was violated. 3. Under the approach with two observation states, false-positive detections increased the number of recommended surveys, relative to standard occupancy models. If parameter values could not be anticipated, pessimism about detection probabilities avoided poor designs. Detection heterogeneity could require more or fewer repeat surveys, depending on parameter values. If model assumptions were met, the approach with two survey methods was inefficient. However, with poor anticipation of parameter values, with detection heterogeneity, or with removal sampling schemes, combining two survey methods could improve estimates of occupancy. 4. Ignoring false positives can yield biased parameter estimates, yet false positives greatly complicate the design of occupancy studies. Specific guidance for major types of false-positive occupancy models, and for two assumption violations common in field data, can conserve survey resources. This guidance can be used to design efficient monitoring programs and studies of species occurrence, species distribution, or habitat selection when false positives occur during surveys.

  18. Declining occurrence and low colonization probability in freshwater mussel assemblages: A dynamic occurrence modeling approach

    USGS Publications Warehouse

    Pandolfo, Tamara J.; Kwak, Thomas J.; Cope, W. Gregory; Heise, Ryan J.; Nichols, Robert B.; Pacifici, Krishna

    2017-01-01

    Mussel monitoring data are abundant, but methods for analyzing long-term trends in these data are often uninformative or have low power to detect changes. We used a dynamic occurrence model, which accounted for imperfect species detection in surveys, to assess changes in species occurrence in a long-term data set (1986–2011) for the Tar River basin of North Carolina, USA. Occurrence of all species decreased steadily over the time period studied. Occurrence in 1986 ranged from 0.19 for Utterbackia imbecillis to 0.60 for Fusconaia masoni. Occurrence in 2010–2011 ranged from 0.10 for Lampsilis radiata to 0.40 for F. masoni. The maximum difference between occurrence in 1986 and 2011 was a decline of 0.30 for Alasmidonta undulata. Mean persistence for all species was high (0.97, 95% CI = 0.95–0.99); however, mean colonization probability was very low (<0.01, 95% CI = <0.01–0.01). These results indicate that mussels persisted at sites already occupied but that they have not colonized sites where they had not occurred previously. Our findings highlight the importance of modeling approaches that incorporate imperfect detection in estimating species occurrence and revealing temporal trends to inform conservation planning.

  19. Improving detection probabilities for pests in stored grain.

    PubMed

    Elmouttie, David; Kiermeier, Andreas; Hamilton, Grant

    2010-12-01

    The presence of insects in stored grain is a significant problem for grain farmers, bulk grain handlers and distributors worldwide. Inspection of bulk grain commodities is essential to detect pests and thereby to reduce the risk of their presence in exported goods. It has been well documented that insect pests cluster in response to factors such as microclimatic conditions within bulk grain. Statistical sampling methodologies for grain, however, have typically considered pests and pathogens to be homogeneously distributed throughout grain commodities. In this paper, a sampling methodology is demonstrated that accounts for the heterogeneous distribution of insects in bulk grain. It is shown that failure to account for the heterogeneous distribution of pests may lead to overestimates of the capacity for a sampling programme to detect insects in bulk grain. The results indicate the importance of the proportion of grain that is infested in addition to the density of pests within the infested grain. It is also demonstrated that the probability of detecting pests in bulk grain increases as the number of subsamples increases, even when the total volume or mass of grain sampled remains constant. This study underlines the importance of considering an appropriate biological model when developing sampling methodologies for insect pests. Accounting for a heterogeneous distribution of pests leads to a considerable improvement in the detection of pests over traditional sampling models. Copyright © 2010 Society of Chemical Industry.
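
The abstract's two central points — that assuming a homogeneous pest distribution overestimates detection capacity, and that more subsamples beat one large sample of the same total mass — can be sketched with a simple clustered-infestation model. This is a generic illustration, not the authors' methodology; the infested proportion `q`, within-infestation density `lam`, and sample masses are invented, and Poisson counts within samples are assumed.

```python
# Detection probability for insects in bulk grain: clustered vs. homogeneous
# pest distributions. All parameter values are invented for illustration.

import math

def p_detect_heterogeneous(n, w, q, lam):
    """P(>= 1 insect) with n subsamples of mass w, each hitting infested
    grain (density lam insects per unit mass) with probability q."""
    p_sample = q * (1.0 - math.exp(-lam * w))
    return 1.0 - (1.0 - p_sample) ** n

def p_detect_homogeneous(n, w, q, lam):
    """Same total insect density assumed spread evenly through the grain."""
    return 1.0 - math.exp(-q * lam * w * n)

# Many small subsamples vs. one large sample of identical total mass (10 kg):
many = p_detect_heterogeneous(n=20, w=0.5, q=0.1, lam=2.0)   # ~0.73
one = p_detect_heterogeneous(n=1, w=10.0, q=0.1, lam=2.0)    # ~0.10
```

The single large sample can only detect insects if it happens to hit the infested fraction at all (probability ≈ q), while twenty subsamples get twenty chances; and the homogeneous formula, applied to the same parameters, returns a larger detection probability than the clustered model actually delivers.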

  20. Time-course of germination, initiation of mycelium proliferation and probability of visible growth and detectable AFB1 production of an isolate of Aspergillus flavus on pistachio extract agar.

    PubMed

    Aldars-García, Laila; Sanchis, Vicente; Ramos, Antonio J; Marín, Sonia

    2017-06-01

    The aim of this work was to assess the temporal relationship among quantified germination, mycelial growth and aflatoxin B1 (AFB1) production from colonies arising from single spores, in order to find the best way to predict as accurately as possible the presence of AFB1 at the early stages of contamination. Germination, mycelial growth, probability of growth and probability of AFB1 production of an isolate of Aspergillus flavus were determined at 25 °C and two water activities (0.85 and 0.87) on 3% Pistachio Extract Agar (PEA). The percentage of germinated spores versus time was fitted to the modified Gompertz equation for the estimation of the germination parameters (geometrical germination time and germination rate). The radial growth curve for each colony was fitted to a linear model for the estimation of the apparent lag time for growth and the growth rate, and the time to visible growth was also estimated. Binary data obtained from the growth and AFB1 studies were modeled using logistic regression analysis. Both water activities led to similar fungal growth and AFB1 production. In this study, given the suboptimal conditions tested, germination was observed to be a stage far removed from the AFB1 production process. Once the probability of growth started to increase, it took 6 days to produce AFB1, and when the probability of growth was 100%, only a 40-57% probability of detection of AFB1 production was predicted. Moreover, colony sizes with a radius of 1-2 mm could be a helpful indicator of possible AFB1 contamination in the commodity. Although growth models may overestimate the presence of AFB1, their use would be a helpful tool for producers and manufacturers; from our data, 5% probability of AFB1 production (initiation of production) would occur when a minimum of 60% probability of growth is observed. 
Legal restrictions are quite severe for these toxins, thus their control from the early stages of contamination throughout the food chain is of paramount importance. Copyright © 2016 Elsevier Ltd. All rights reserved.
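
The modified Gompertz curve mentioned above, in its common Zwietering reparameterization, expresses percent germination as G(t) = A·exp(−exp(μe/A·(λ−t)+1)), with asymptote A (%), maximum germination rate μ (%/day), and lag-like parameter λ (days, the "geometrical germination time"). The parameter values below are invented, not the paper's fitted estimates.

```python
# Modified Gompertz (Zwietering form) for percent spores germinated at
# time t (days). Parameter values are illustrative, not fitted.

import math

def gompertz(t, A=100.0, mu=25.0, lam=2.0):
    """Percent germinated at time t: sigmoidal rise from ~0 toward A."""
    return A * math.exp(-math.exp(mu * math.e / A * (lam - t) + 1.0))
```

The curve is near zero before λ, climbs at maximum slope μ around the inflection, and saturates at A, which is how a single fitted curve yields both germination parameters the study reports.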

  1. On the importance of incorporating sampling weights in ...

    EPA Pesticide Factsheets

    Occupancy models are used extensively to assess wildlife-habitat associations and to predict species distributions across large geographic regions. Occupancy models were developed as a tool to properly account for imperfect detection of a species. Current guidelines on survey design requirements for occupancy models focus on the number of sample units and the pattern of revisits to a sample unit within a season. We focus on the sampling design, or how the sample units are selected in geographic space (e.g., stratified, simple random, unequal probability, etc.). In a probability design, each sample unit has a sample weight which quantifies the number of sample units it represents in the finite (oftentimes areal) sampling frame. We demonstrate the importance of including sampling weights in occupancy model estimation when the design is not a simple random sample or equal probability design. We assume a finite areal sampling frame as proposed for a national bat monitoring program. We compare several unequal and equal probability designs and varying sampling intensity within a simulation study. We found the traditional single-season occupancy model produced biased estimates of occupancy and lower confidence interval coverage rates compared to occupancy models that accounted for the sampling design. We also discuss how our findings inform the analyses proposed for the nascent North American Bat Monitoring Program and other collaborative synthesis efforts that propose h
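
The role of sampling weights described above can be sketched with a design-based toy example (ignoring imperfect detection for simplicity): under an unequal-probability design, an unweighted site mean over-represents heavily sampled strata, while a Horvitz-Thompson-style weighted mean recovers the frame-wide occupancy. The strata, weights, and occupancy states below are invented.

```python
# Unweighted vs. weighted occupancy proportions under an unequal-probability
# design. Site data and sample weights are invented for illustration.

def occupancy_estimates(occupied, weights):
    """Return (unweighted mean, Horvitz-Thompson-style weighted mean)."""
    unweighted = sum(occupied) / len(occupied)
    weighted = sum(z * w for z, w in zip(occupied, weights)) / sum(weights)
    return unweighted, weighted

# Stratum A (high occupancy) oversampled: 6 sites, each representing 10
# frame units; stratum B (low occupancy) undersampled: 2 sites, each
# representing 70 frame units.
occupied = [1, 1, 1, 1, 0, 1, 0, 0]
weights = [10] * 6 + [70] * 2
unw, wtd = occupancy_estimates(occupied, weights)  # 0.625 vs. 0.25
```

The unweighted estimate (0.625) more than doubles the frame-wide occupancy (0.25) because the oversampled high-occupancy stratum dominates it; this is the bias the simulation study quantifies for occupancy models that ignore the design.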

  2. When are goshawks not there? Is a single visit enough to infer absence at occupied nest areas? Journal of Raptor Research

    Treesearch

    Douglas A. Boyce; Patricia L. Kennedy; Paul Beier; Michael F. Ingraldi; Susie R. MacVean; Melissa S. Siders; John R. Squires; Brian Woodbridge

    2005-01-01

    We tested the efficacy of three methods (historical nest search, broadcast search, and tree transect search) for detecting presence of the Northern Goshawk (Accipiter gentilis) at occupied nest areas during the 1994 breeding season using only a single visit to a previously known nest area. We used detection rates in a probability model to determine how many...

  3. Compensating for geographic variation in detection probability with water depth improves abundance estimates of coastal marine megafauna.

    PubMed

    Hagihara, Rie; Jones, Rhondda E; Sobtzick, Susan; Cleguer, Christophe; Garrigue, Claire; Marsh, Helene

    2018-01-01

    The probability of an aquatic animal being available for detection is typically <1. Accounting for covariates that reduce the probability of detection is important for obtaining robust estimates of population abundance and determining its status and trends. The dugong (Dugong dugon) is a bottom-feeding marine mammal and a seagrass community specialist. We hypothesized that the probability of a dugong being available for detection is dependent on water depth and that dugongs spend more time underwater in deep-water seagrass habitats than in shallow-water seagrass habitats. We tested this hypothesis by quantifying the depth use of 28 wild dugongs fitted with GPS satellite transmitters and time-depth recorders (TDRs) at three sites with distinct seagrass depth distributions: 1) open waters supporting extensive seagrass meadows to 40 m deep (Torres Strait, 6 dugongs, 2015); 2) a protected bay (average water depth 6.8 m) with extensive shallow seagrass beds (Moreton Bay, 13 dugongs, 2011 and 2012); and 3) a mixture of lagoon, coral and seagrass habitats to 60 m deep (New Caledonia, 9 dugongs, 2013). The fitted instruments were used to measure the times the dugongs spent in the experimentally determined detection zones under various environmental conditions. The estimated probability of detection was applied to aerial survey data previously collected at each location. In general, dugongs were least available for detection in Torres Strait, and the population estimates increased 6-7 fold using depth-specific availability correction factors compared with earlier estimates that assumed homogeneous detection probability across water depth and location. Detection probabilities were higher in Moreton Bay and New Caledonia than in Torres Strait because the water transparency at those two locations was much greater, so the effect of correcting for depth-specific detection probability was much smaller. 
The methodology has application to visual survey of coastal megafauna including surveys using Unmanned Aerial Vehicles.
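
An availability correction of the kind described above amounts to dividing the raw count by the estimated probability of detection, so sites where animals spend more time below the visible depth zone receive a larger correction. The counts and availability values below are invented, chosen only to echo the reported 6-7 fold contrast; the study's actual correction also conditions on survey-specific environmental covariates.

```python
# Horvitz-Thompson-style availability correction of an aerial count.
# All numbers are invented for illustration.

def corrected_abundance(count, availability, perception=1.0):
    """N_hat = count / P(detected), with P(detected) split into the
    probability of being available (in the detection zone) and the
    probability observers perceive an available animal."""
    return count / (availability * perception)

# Same raw count at a shallow clear-water site and a deep-water site:
shallow = corrected_abundance(count=120, availability=0.55)  # ~218 animals
deep = corrected_abundance(count=120, availability=0.08)     # 1500 animals
```

Assuming a single homogeneous availability across both sites would make the two populations look identical; the depth-specific correction is what separates them.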

  4. Directed Design of Experiments (DOE) for Determining Probability of Detection (POD) Capability of NDE Systems (DOEPOD)

    NASA Technical Reports Server (NTRS)

    Generazio, Ed

    2007-01-01

    This viewgraph presentation reviews some of the issues that specialists in nondestructive evaluation (NDE) face in determining the statistics of the probability of detection. There is discussion of the use of the binomial distribution and the probability of hit. The presentation then reviews the concepts of Directed Design of Experiments for Validating Probability of Detection of Inspection Systems (DOEPOD). Several cases are reviewed and discussed. The concept of false calls is also reviewed.
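
The binomial reasoning behind POD demonstration in NDE can be sketched for the zero-miss case: if an inspection hits all n of n identical flaws, the one-sided lower confidence bound p_low on the hit probability satisfies p_low^n = 1 − confidence, which reproduces the familiar "29 of 29" rule for demonstrating 90% POD at 95% confidence. This is standard POD background, not a reproduction of the DOEPOD method itself.

```python
# Lower confidence bound on probability of hit for an all-hits inspection
# demonstration (Clopper-Pearson bound, zero-miss closed form).

def pod_lower_bound(hits, trials, confidence=0.95):
    """Lower bound on POD when every flaw was detected (hits == trials)."""
    if hits != trials:
        raise ValueError("closed form applies only to the zero-miss case")
    return (1.0 - confidence) ** (1.0 / trials)

print(pod_lower_bound(29, 29))  # ~0.902: 29/29 hits -> 90% POD at 95% confidence
```

More trials tighten the bound, which is why directed designs concentrate inspections where they most efficiently raise the demonstrated POD.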

  5. Common IED exploitation target set ontology

    NASA Astrophysics Data System (ADS)

    Russomanno, David J.; Qualls, Joseph; Wowczuk, Zenovy; Franken, Paul; Robinson, William

    2010-04-01

    The Common IED Exploitation Target Set (CIEDETS) ontology provides a comprehensive semantic data model for capturing knowledge about sensors, platforms, missions, environments, and other aspects of systems under test. The ontology also includes representative IEDs, modeled as explosives, camouflage, concealment objects, and other background objects, which together comprise an overall threat scene. The ontology is represented using the Web Ontology Language and the SPARQL Protocol and RDF Query Language, which ensures portability of the acquired knowledge base across applications. The resulting knowledge base is a component of the CIEDETS application, which is intended to support the end-user sensor test and evaluation community. CIEDETS associates a system under test with a subset of cataloged threats based on the probability that the system will detect the threat. The associations between systems under test, threats, and the detection probabilities are established based on a hybrid reasoning strategy, which applies a combination of heuristics and simplified modeling techniques. Besides supporting the CIEDETS application, which is focused on efficient and consistent system testing, the ontology can be leveraged in a myriad of other applications, including serving as a knowledge source for mission planning tools.

  6. Hard choices in assessing survival past dams — a comparison of single- and paired-release strategies

    USGS Publications Warehouse

    Zydlewski, Joseph D.; Stich, Daniel S.; Sigourney, Douglas B.

    2017-01-01

    Mark–recapture models are widely used to estimate survival of salmon smolts migrating past dams. Paired releases have been used to improve estimate accuracy by removing components of mortality not attributable to the dam. This method is accompanied by reduced precision because (i) sample size is reduced relative to a single, large release; and (ii) variance calculations inflate error. We modeled an idealized system with a single dam to assess trade-offs between accuracy and precision and compared methods using root mean squared error (RMSE). Simulations were run under predefined conditions (dam mortality, background mortality, detection probability, and sample size) to determine scenarios when the paired release was preferable to a single release. We demonstrate that a paired-release design provides a theoretical advantage over a single-release design only at large sample sizes and high probabilities of detection. At release numbers typical of many survival studies, paired release can result in overestimation of dam survival. Failures to meet model assumptions of a paired release may result in further overestimation of dam-related survival. Under most conditions, a single-release strategy was preferable.
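
The accuracy-versus-precision trade-off described above can be sketched with a rough Monte Carlo comparison: a single release confounds dam mortality with background mortality (bias), while a paired release cancels background mortality in a ratio but inflates variance. This is a generic sketch, not the authors' simulation; the survival rates, detection probability, and release sizes are invented, and the single-release estimator is simplified by assuming detection probability is known.

```python
# Monte Carlo comparison of single- vs. paired-release estimators of dam
# survival. All rates and sample sizes are invented for illustration.

import random

def simulate(release, s_dam=0.90, s_background=0.95, p_detect=0.8,
             reps=2000, seed=1):
    """Return (RMSE_single, RMSE_paired) for estimating s_dam."""
    rng = random.Random(seed)
    single, paired = [], []
    for _ in range(reps):
        # Treatment fish (released above the dam): dam + background mortality.
        t_alive = sum(rng.random() < s_dam * s_background for _ in range(release))
        t_det = sum(rng.random() < p_detect for _ in range(t_alive))
        # Control fish (released below the dam): background mortality only.
        c_alive = sum(rng.random() < s_background for _ in range(release))
        c_det = sum(rng.random() < p_detect for _ in range(c_alive))
        if c_det == 0:
            continue  # paired estimate undefined for this replicate
        single.append(t_det / (release * p_detect))  # confounds dam + background
        paired.append(t_det / c_det)                 # background cancels in ratio
    rmse = lambda ests: (sum((e - s_dam) ** 2 for e in ests) / len(ests)) ** 0.5
    return rmse(single), rmse(paired)

r_single, r_paired = simulate(release=800, reps=500)
```

With large releases the single-release RMSE is dominated by its bias (it estimates dam-plus-background survival), so the paired design wins; shrinking `release` inflates the ratio estimator's variance until the ordering can reverse, which is the crossover the paper maps out.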

  7. Testing hypotheses on distribution shifts and changes in phenology of imperfectly detectable species

    USGS Publications Warehouse

    Chambert, Thierry A.; Kendall, William L.; Hines, James E.; Nichols, James D.; Pedrini, Paolo; Waddle, J. Hardin; Tavecchia, Giacomo; Walls, Susan C.; Tenan, Simone

    2015-01-01

    With ongoing climate change, many species are expected to shift their spatial and temporal distributions. To document changes in species distribution and phenology, detection/non-detection data have proven very useful. Occupancy models provide a robust way to analyse such data, but inference is usually focused on species spatial distribution, not phenology. We present a multi-season extension of the staggered-entry occupancy model of Kendall et al. (2013, Ecology, 94, 610), which permits inference about the within-season patterns of species arrival and departure at sampling sites. The new model presented here allows investigation of species phenology and spatial distribution across years, as well as site extinction/colonization dynamics. We illustrate the model with two data sets on European migratory passerines and one data set on North American treefrogs. We show how to derive several additional phenological parameters, such as annual mean arrival and departure dates, from estimated arrival and departure probabilities. Given the extent of detection/non-detection data that are available, we believe that this modelling approach will prove very useful to further understand and predict species responses to climate change.

  8. Where and When: Optimal Scheduling of the Electromagnetic Follow-up of Gravitational-wave Events Based on Counterpart Light-curve Models

    NASA Astrophysics Data System (ADS)

    Salafia, Om Sharan; Colpi, Monica; Branchesi, Marica; Chassande-Mottin, Eric; Ghirlanda, Giancarlo; Ghisellini, Gabriele; Vergani, Susanna D.

    2017-09-01

    The electromagnetic (EM) follow-up of a gravitational-wave (GW) event requires scanning a wide sky region, defined by the so-called "skymap," to detect and identify a transient counterpart. We propose a novel method that exploits the information encoded in the GW signal to construct a "detectability map," which represents the time-dependent ("when") probability of detecting the transient at each position of the skymap ("where"). Focusing on the case of a neutron star binary inspiral, we model the associated short gamma-ray burst afterglow and macronova emission using the probability distributions of binary parameters (sky position, distance, orbit inclination, mass ratio) extracted from the GW signal as inputs. The resulting family of possible light curves is the basis for constructing the detectability map. As a practical example, we apply the method to a simulated GW signal produced by a neutron star merger at 75 Mpc whose localization uncertainty is very large (~1500 deg²). We construct observing strategies for optical, infrared, and radio facilities based on the detectability maps, taking VST, VISTA, and MeerKAT as prototypes. Assuming limiting fluxes of r ~ 24.5, J ~ 22.4 (AB magnitudes), and 500 μJy (1.4 GHz) for ~1000 s of exposure each, the afterglow and macronova emissions are successfully detected with minimum observing times of 7, 15, and 5 hr, respectively.

  9. Day-time identification of summer hailstorm cells from MSG data

    NASA Astrophysics Data System (ADS)

    Merino, A.; López, L.; Sánchez, J. L.; García-Ortega, E.; Cattani, E.; Levizzani, V.

    2013-10-01

    Identifying deep convection is of paramount importance, as it may be associated with extreme weather that has significant impact on the environment, property and the population. A new method, the Hail Detection Tool (HDT), is described for identifying hail-bearing storms using multi-spectral Meteosat Second Generation (MSG) data. HDT was conceived as a two-phase method, in which the first step is the Convective Mask (CM) algorithm devised for detection of deep convection, and the second a Hail Detection algorithm (HD) for the identification of hail-bearing clouds among the cumulonimbus systems detected by CM. Both CM and HD are based on logistic regression models trained with multi-spectral MSG data sets comprising summer convective events in the middle Ebro Valley between 2006 and 2010, detected by the RGB visualization technique (CM) or the C-band weather radar system of the University of León. By means of the logistic regression approach, the probabilities of identifying a cumulonimbus event with CM or a hail event with HD are computed by exploiting a proper selection of MSG wavelengths or their combinations. A number of cloud physical properties (liquid water path, optical thickness and effective cloud drop radius) were used to physically interpret the results of the statistical models from a meteorological perspective, using a method based on these "ingredients." Finally, the HDT was applied to a new validation sample consisting of events during summer 2011. The overall Probability of Detection (POD) was 76.9% and the False Alarm Ratio 16.7%.

  10. A generalized Benford's law for JPEG coefficients and its applications in image forensics

    NASA Astrophysics Data System (ADS)

    Fu, Dongdong; Shi, Yun Q.; Su, Wei

    2007-02-01

    In this paper, a novel statistical model based on Benford's law for the probability distributions of the first digits of the block-DCT and quantized JPEG coefficients is presented. A parametric logarithmic law, i.e., the generalized Benford's law, is formulated. Furthermore, some potential applications of this model in image forensics are discussed, including the detection of JPEG compression for images in bitmap format, the estimation of the JPEG compression Q-factor for a JPEG-compressed bitmap image, and the detection of double-compressed JPEG images. The results of our extensive experiments demonstrate the effectiveness of the proposed statistical model.
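
The parametric logarithmic law referenced above is commonly written p(d) = N·log10(1 + 1/(s + d^q)) for first digit d in 1..9, where N = 1, q = 1, s = 0 recovers classical Benford's law. The sketch below evaluates that form; the fitted N, q, s values for JPEG coefficients are in the paper and are not reproduced here.

```python
# Generalized Benford first-digit law; defaults recover the classical law.
# Fitted parameter values for JPEG coefficients are not reproduced here.

import math

def generalized_benford(d, N=1.0, q=1.0, s=0.0):
    """Probability of leading digit d (1..9) under the generalized law."""
    return N * math.log10(1.0 + 1.0 / (s + d ** q))

classical = [generalized_benford(d) for d in range(1, 10)]
# classical[0] ~ 0.301 for digit 1, decreasing monotonically to digit 9
```

Forensic use then reduces to a goodness-of-fit test: first-digit histograms of fresh block-DCT coefficients follow the fitted law, while recompression perturbs them, flagging double-compressed images.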

  11. A risk-based multi-objective model for optimal placement of sensors in water distribution system

    NASA Astrophysics Data System (ADS)

    Naserizade, Sareh S.; Nikoo, Mohammad Reza; Montaseri, Hossein

    2018-02-01

    In this study, a new stochastic model based on Conditional Value at Risk (CVaR) and multi-objective optimization methods is developed for optimal placement of sensors in a water distribution system (WDS). The model minimizes the risk caused by simultaneous multi-point contamination injection in the WDS using the CVaR approach, which treats the uncertainties of contamination injection as a probability distribution function and quantifies the low-probability extreme events in the tail of the loss distribution. A four-objective optimization model based on the NSGA-II algorithm is developed to minimize the losses of contamination injection (through the CVaR of affected population and detection time) together with the two other main criteria of optimal sensor placement: the probability of undetected events and cost. Finally, to determine the best solution, the Preference Ranking Organization METHod for Enrichment Evaluation (PROMETHEE), a Multi Criteria Decision Making (MCDM) approach, is utilized to rank the alternatives on the trade-off curve among the objective functions. A sensitivity analysis is also performed to investigate the importance of each criterion on the PROMETHEE results under three relative weighting scenarios. The effectiveness of the proposed methodology is examined by applying it to the Lamerd WDS in the southwestern part of Iran. PROMETHEE suggests 6 sensors with a distribution that covers approximately all regions of the WDS. For the best solution, the optimal CVaR of affected population is 17,055 persons, the CVaR of detection time is 31 minutes, and the probability of undetected events is 0.045%. These results for the Lamerd WDS show the applicability of the CVaR-based multi-objective simulation-optimization model for incorporating the main uncertainties of contamination injection when evaluating extreme losses in a WDS.
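
    CVaR at level α is simply the expected loss in the worst (1 − α) fraction of scenarios. A minimal discrete-scenario sketch with hypothetical affected-population losses (not the paper's data):

```python
def cvar(losses, alpha=0.95):
    """Conditional Value at Risk: mean loss in the worst (1 - alpha) tail."""
    ordered = sorted(losses)
    tail_start = int(alpha * len(ordered))
    tail = ordered[tail_start:]
    return sum(tail) / len(tail)

# Hypothetical affected-population losses for 20 contamination scenarios;
# two rare scenarios dominate the tail.
losses = [100, 120, 90, 300, 150, 110, 95, 400, 130, 105,
          115, 125, 2000, 140, 98, 102, 160, 170, 3000, 135]
print(cvar(losses, alpha=0.9))  # -> 2500.0 (mean of the two worst scenarios)
```

Minimizing CVaR rather than the mean loss steers the sensor placement toward protecting against exactly these rare, extreme injection events.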

  12. Climate change and the detection of trends in annual runoff

    USGS Publications Warehouse

    McCabe, G.J.; Wolock, D.M.

    1997-01-01

    This study examines the statistical likelihood of detecting a trend in annual runoff given an assumed change in mean annual runoff, the underlying year-to-year variability in runoff, and serial correlation of annual runoff. Means, standard deviations, and lag-1 serial correlations of annual runoff were computed for 585 stream gages in the conterminous United States, and these statistics were used to compute the probability of detecting a prescribed trend in annual runoff. Assuming a linear 20% change in mean annual runoff over a 100 yr period and a significance level of 95%, the average probability of detecting a significant trend was 28% among the 585 stream gages. The largest probability of detecting a trend was in the northwestern U.S., the Great Lakes region, the northeastern U.S., the Appalachian Mountains, and parts of the northern Rocky Mountains. The smallest probability of trend detection was in the central and southwestern U.S., and in Florida. Low probabilities of trend detection were associated with low ratios of mean annual runoff to the standard deviation of annual runoff and with high lag-1 serial correlation in the data.
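
    The joint influence of year-to-year variability and lag-1 serial correlation on trend detectability can be explored with a small Monte-Carlo sketch (an illustration of the idea, not the study's actual procedure): generate AR(1) annual series with a prescribed linear change, fit an OLS slope, and count how often its t-statistic exceeds the 5% critical value.

```python
import random
import statistics

def detect_trend_power(trend_total, sd, rho, n_years=100, n_sims=500, seed=1):
    """Monte-Carlo probability that a prescribed linear trend is detected
    (two-sided t-test on the OLS slope, ~5% level) in AR(1) noise."""
    random.seed(seed)
    detections = 0
    xs = list(range(n_years))
    xbar = statistics.mean(xs)
    sxx = sum((x - xbar) ** 2 for x in xs)
    for _ in range(n_sims):
        # AR(1) noise with marginal standard deviation `sd`
        e = random.gauss(0, sd)
        ys = []
        for t in xs:
            ys.append(trend_total * t / (n_years - 1) + e)
            e = rho * e + random.gauss(0, sd * (1 - rho ** 2) ** 0.5)
        ybar = statistics.mean(ys)
        slope = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / sxx
        resid = [y - ybar - slope * (x - xbar) for x, y in zip(xs, ys)]
        se = (sum(r * r for r in resid) / (n_years - 2) / sxx) ** 0.5
        if abs(slope / se) > 1.98:  # approximate 5% critical value, df ~ 98
            detections += 1
    return detections / n_sims

# Hypothetical gauge: 20% change over 100 years, noise comparable to the mean
print(detect_trend_power(trend_total=0.2, sd=0.35, rho=0.2, n_sims=300))
```

The numbers above are illustrative; the study derived detection probabilities analytically from each gage's mean, standard deviation and lag-1 correlation.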

  13. Spatial patch occupancy patterns of the Lower Keys marsh rabbit

    USGS Publications Warehouse

    Eaton, Mitchell J.; Hughes, Phillip T.; Nichols, James D.; Morkill, Anne; Anderson, Chad

    2011-01-01

    Reliable estimates of presence or absence of a species can provide substantial information on management questions related to distribution and habitat use but should incorporate the probability of detection to reduce bias. We surveyed for the endangered Lower Keys marsh rabbit (Sylvilagus palustris hefneri) in habitat patches on 5 Florida Key islands, USA, to estimate occupancy and detection probabilities. We derived detection probabilities using spatial replication of plots and evaluated hypotheses that patch location (coastal or interior) and patch size influence occupancy and detection. Results demonstrate that detection probability, given rabbits were present, was <0.5 and suggest that naïve estimates (i.e., estimates without consideration of imperfect detection) of patch occupancy are negatively biased. We found that patch size and location influenced probability of occupancy but not detection. Our findings will be used by Refuge managers to evaluate population trends of Lower Keys marsh rabbits from historical data and to guide management decisions for species recovery. The sampling and analytical methods we used may be useful for researchers and managers of other endangered lagomorphs and cryptic or fossorial animals occupying diverse habitats.
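
    The negative bias of naïve occupancy estimates can be seen directly: with true occupancy ψ and per-plot detection probability p, the expected naïve estimate over K spatial replicates is ψ(1 − (1 − p)^K), which falls below ψ whenever p < 1. A small sketch with hypothetical values:

```python
def naive_occupancy(psi, p, n_plots):
    """Expected fraction of patches scored occupied under imperfect detection:
    a patch is scored occupied only if the species is present AND detected in
    at least one of n_plots spatial replicates."""
    return psi * (1 - (1 - p) ** n_plots)

true_psi = 0.6  # hypothetical true patch occupancy
for p in (0.3, 0.5, 1.0):
    print(p, round(naive_occupancy(true_psi, p, n_plots=3), 3))
```

With p < 0.5, as reported for the marsh rabbit, even three replicates leave the naïve estimate well below the true occupancy, which is why the model-based correction matters.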

  14. Detection probabilities of electrofishing, hoop nets, and benthic trawls for fishes in two western North American rivers

    USGS Publications Warehouse

    Smith, Christopher D.; Quist, Michael C.; Hardy, Ryan S.

    2015-01-01

    Research comparing different sampling techniques helps improve the efficiency and efficacy of sampling efforts. We compared the effectiveness of three sampling techniques (small-mesh hoop nets, benthic trawls, boat-mounted electrofishing) for 30 species in the Green (WY, USA) and Kootenai (ID, USA) rivers by estimating conditional detection probabilities (probability of detecting a species given its presence at a site). Electrofishing had the highest detection probabilities (generally greater than 0.60) for most species (88%), but hoop nets also had high detectability for several taxa (e.g., adult burbot Lota lota, juvenile northern pikeminnow Ptychocheilus oregonensis). Benthic trawls had low detection probabilities (<0.05) for most taxa (84%). Gear-specific effects were present for most species indicating large differences in gear effectiveness among techniques. In addition to gear effects, habitat characteristics also influenced detectability of fishes. Most species-specific habitat relationships were idiosyncratic and reflected the ecology of the species. Overall findings of our study indicate that boat-mounted electrofishing and hoop nets are the most effective techniques for sampling fish assemblages in large, coldwater rivers.

  15. Radar Detectability Studies of Slow and Small Zodiacal Dust Cloud Particles. III. The Role of Sodium and the Head Echo Size on the Probability of Detection

    NASA Astrophysics Data System (ADS)

    Janches, D.; Swarnalingam, N.; Carrillo-Sanchez, J. D.; Gomez-Martin, J. C.; Marshall, R.; Nesvorný, D.; Plane, J. M. C.; Feng, W.; Pokorný, P.

    2017-07-01

    We present a path forward on a long-standing issue concerning the flux of small and slow meteoroids, which are believed to be the dominant portion of the incoming meteoric mass flux into the Earth’s atmosphere. Such a flux, which is predicted by dynamical dust models of the Zodiacal Cloud, is not evident in ground-based radar observations. For decades this was attributed to the fact that the radars used for meteor observations lack the sensitivity to detect this population, due to the small amount of ionization produced by slow-velocity meteors. Such a hypothesis has been challenged by the introduction of meteor head echo (HE) observations with High Power and Large Aperture radars, in particular the Arecibo 430 MHz radar. Janches et al. developed a probabilistic approach to estimate the detectability of meteors by these radars and initially showed that, with the current knowledge of ablation and ionization, such particles should dominate the detected rates by one to two orders of magnitude compared to the actual observations. In this paper, we include results in our model from recently published laboratory measurements, which showed that (1) the ablation of Na is less intense covering a wider altitude range; and (2) the ionization probability, β_ip, for Na atoms in the air is up to two orders of magnitude smaller for low speeds than originally believed. By applying these results and using a somewhat smaller size of the HE radar target we offer a solution that reconciles these observations with model predictions.

  17. Infrared maritime target detection using a probabilistic single Gaussian model of sea clutter in Fourier domain

    NASA Astrophysics Data System (ADS)

    Zhou, Anran; Xie, Weixin; Pei, Jihong; Chen, Yapei

    2018-02-01

    For ship target detection in cluttered infrared image sequences, a robust detection method is put forward, based on a probabilistic single Gaussian model of the sea background in the Fourier domain. The amplitude spectrum sequence at each frequency point of the pure-seawater images in the Fourier domain, being more stable than the gray-value sequence of each background pixel in the spatial domain, is modeled as a Gaussian. Next, a probability weighting matrix is built from the stability of the pure seawater's total energy spectrum in the row direction, to make the Gaussian model more accurate. Then, the foreground frequency points are separated from the background frequency points by the model. Finally, false-alarm points are removed using ships' shape features. The performance of the proposed method is tested by visual and quantitative comparisons with other methods.
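
    The per-frequency background model amounts to a z-score test on the amplitude spectrum: estimate a mean and standard deviation at each frequency point from pure-seawater frames, then flag points that deviate by more than k·σ. A simplified sketch (omitting the probability-weighting matrix and the shape-feature post-processing), with toy spectra:

```python
import math

def foreground_mask(amplitude_frames, test_frame, k=3.0):
    """Flag frequency points whose amplitude deviates from the sea-background
    Gaussian model by more than k standard deviations.

    `amplitude_frames`: past amplitude spectra (equal-length lists of floats).
    """
    n = len(amplitude_frames)
    mask = []
    for i, value in enumerate(test_frame):
        samples = [frame[i] for frame in amplitude_frames]
        mean = sum(samples) / n
        var = sum((s - mean) ** 2 for s in samples) / n
        sigma = math.sqrt(var) or 1e-9  # guard against zero variance
        mask.append(abs(value - mean) > k * sigma)
    return mask

seawater = [[10.0, 5.0], [10.2, 5.1], [9.8, 4.9]]  # toy background spectra
print(foreground_mask(seawater, [10.1, 50.0]))  # -> [False, True]
```

The second frequency point is flagged as foreground because its amplitude is far outside the background model; the first stays within 3σ of the seawater statistics.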

  18. Effectiveness of scat detection dogs for detecting forest carnivores

    USGS Publications Warehouse

    Long, Robert A.; Donovan, T.M.; MacKay, Paula; Zielinski, William J.; Buzas, Jeffrey S.

    2007-01-01

    We assessed the detection and accuracy rates of detection dogs trained to locate scats from free-ranging black bears (Ursus americanus), fishers (Martes pennanti), and bobcats (Lynx rufus). During the summers of 2003-2004, 5 detection teams located 1,565 scats (747 putative black bear, 665 putative fisher, and 153 putative bobcat) at 168 survey sites throughout Vermont, USA. Of 347 scats genetically analyzed for species identification, 179 (51.6%) yielded a positive identification, 131 (37.8%) failed to yield DNA information, and 37 (10.7%) yielded DNA but provided no species confirmation. For 70 survey sites where confirmation of a putative target species' scat was not possible, we assessed the probability that ≥1 of the scats collected at the site was deposited by the target species (probability of correct identification; P_ID). Based on species confirmations or P_ID values, we detected bears at 57.1% (96) of sites, fishers at 61.3% (103) of sites, and bobcats at 12.5% (21) of sites. We estimated that the mean probability of detecting the target species (when present) during a single visit to a site was 0.86 for black bears, 0.95 for fishers, and 0.40 for bobcats. The probability of detecting black bears was largely unaffected by site- or visit-specific covariates, but the probability of detecting fishers varied by detection team. We found little or no effect of topographic ruggedness, vegetation density, or local weather (e.g., temp, humidity) on detection probability for fishers or black bears (data were insufficient for bobcat analyses). Detection dogs were highly effective at locating scats from forest carnivores and provided an efficient and accurate method for collecting detection-nondetection data on multiple species.

  19. A Probability Co-Kriging Model to Account for Reporting Bias and Recognize Areas at High Risk for Zebra Mussels and Eurasian Watermilfoil Invasions in Minnesota

    PubMed Central

    Kanankege, Kaushi S. T.; Alkhamis, Moh A.; Phelps, Nicholas B. D.; Perez, Andres M.

    2018-01-01

    Zebra mussels (ZMs) (Dreissena polymorpha) and Eurasian watermilfoil (EWM) (Myriophyllum spicatum) are aggressive aquatic invasive species posing a conservation burden on Minnesota. Recognizing areas at high risk for invasion is a prerequisite for the implementation of risk-based prevention and mitigation management strategies. The early detection of invasion has been challenging, due in part to the imperfect observation process of invasions including the absence of a surveillance program, reliance on public reporting, and limited resource availability, which results in reporting bias. To predict the areas at high risk for invasions, while accounting for underreporting, we combined network analysis and probability co-kriging to estimate the risk of ZM and EWM invasions. We used network analysis to generate a waterbody-specific variable representing boater traffic, a known high risk activity for human-mediated transportation of invasive species. In addition, co-kriging was used to estimate the probability of species introduction, using waterbody-specific variables. A co-kriging model containing distance to the nearest ZM infested location, boater traffic, and road access was used to recognize the areas at high risk for ZM invasions (AUC = 0.78). The EWM co-kriging model included distance to the nearest EWM infested location, boater traffic, and connectivity to infested waterbodies (AUC = 0.76). Results suggested that, by 2015, nearly 20% of the waterbodies in Minnesota were at high risk of ZM (12.45%) or EWM (12.43%) invasions, whereas only 125/18,411 (0.67%) and 304/18,411 (1.65%) are currently infested, respectively. Prediction methods presented here can support decisions related to solving the problems of imperfect detection, which subsequently improve the early detection of biological invasions. PMID:29354638

  20. Count data, detection probabilities, and the demography, dynamics, distribution, and decline of amphibians.

    PubMed

    Schmidt, Benedikt R

    2003-08-01

    The evidence for amphibian population declines is based on count data that were not adjusted for detection probabilities. Such data are not reliable even when collected using standard methods. The formula C = Np (where C is a count, N the true parameter value, and p is a detection probability) relates count data to demography, population size, or distributions. With unadjusted count data, one assumes a linear relationship between C and N and that p is constant. These assumptions are unlikely to be met in studies of amphibian populations. Amphibian population data should be based on methods that account for detection probabilities.
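
    The correction implied by C = Np is a one-liner, but it is exactly what unadjusted counts omit: the same count can correspond to very different true abundances when detection probability varies. A minimal sketch with hypothetical numbers:

```python
def estimate_abundance(count, detection_prob):
    """Adjust a raw count C for detection probability p: N = C / p."""
    if not 0 < detection_prob <= 1:
        raise ValueError("detection probability must be in (0, 1]")
    return count / detection_prob

# Two surveys with identical counts but different detection probabilities
print(estimate_abundance(40, 0.5))   # -> 80.0
print(estimate_abundance(40, 0.25))  # -> 160.0
```

Comparing the raw counts alone would suggest no difference between the two populations, while the adjusted abundances differ by a factor of two; this is the core of the argument against unadjusted count data.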

  1. Tube structural integrity evaluation of Palo Verde Unit 1 steam generators for axial upper-bundle cracking

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Woodman, B.W.; Begley, J.A.; Brown, S.D.

    1995-12-01

    The analysis of the issue of upper-bundle axial ODSCC as it applies to steam generator tube structural integrity in Unit 1 at the Palo Verde Nuclear Generating Station is presented in this study. Based on past inspection results for Units 2 and 3 at Palo Verde, the detection of secondary-side stress corrosion cracks in the upper bundle region of Unit 1 may occur at some future date. The following discussion provides a description and analysis of the probability of axial ODSCC in Unit 1 leading to the exceedance of Regulatory Guide 1.121 structural limits. The probabilities of structural limit exceedance are estimated as a function of run time using a conservative approach. The chosen approach models the historical development of cracks, crack growth, detection of cracks and subsequent removal from service, and the initiation and growth of new cracks during a given cycle of operation. Past performance of all Palo Verde Units as well as the historical performance of other steam generators was considered in the development of cracking statistics for application to Unit 1. Data in the literature and Unit 2 pulled-tube examination results were used to construct probability-of-detection curves for the detection of axial IGSCC/IGA using an MRPC (multi-frequency rotating pancake coil) eddy current probe. Crack growth rates were estimated from Unit 2 eddy current inspection data combined with pulled-tube examination results and data in the literature. A Monte-Carlo probabilistic model is developed to provide an overall assessment of the risk of Regulatory Guide exceedance during plant operation.

  2. Clinical model to estimate the pretest probability of malignancy in patients with pulmonary focal Ground-glass Opacity.

    PubMed

    Jiang, Long; Situ, Dongrong; Lin, Yongbin; Su, Xiaodong; Zheng, Yan; Zhang, Yigong; Long, Hao

    2013-11-01

    Effective strategies for managing patients with pulmonary focal ground-glass opacity (fGGO) depend on the pretest probability of malignancy. Estimating a clinical probability of malignancy in patients with fGGOs can facilitate the selection and interpretation of subsequent diagnostic tests. Data from patients with pulmonary fGGO lesions diagnosed at the Sun Yat-sen University Cancer Center were retrospectively collected. Multiple logistic regression analysis was used to identify independent clinical predictors of malignancy and to develop a clinical predictive model estimating the pretest probability of malignancy in patients with fGGOs. One hundred and sixty-five pulmonary fGGO nodules were detected in 128 patients. Independent predictors of malignant fGGOs included a history of other cancers (odds ratio [OR], 0.264; 95% confidence interval [CI], 0.072 to 0.970), pleural indentation (OR, 8.766; 95% CI, 3.033 to 25.390), vessel-convergence sign (OR, 23.626; 95% CI, 6.200 to 90.027) and air bronchogram (OR, 7.41; 95% CI, 2.037 to 26.961). Model accuracy was satisfactory (area under the receiver operating characteristic curve, 0.934; 95% CI, 0.894 to 0.975), and there was excellent agreement between the predicted probability and the observed frequency of malignant fGGOs. The resulting predictive model can be used to generate pretest probabilities of malignant fGGOs, and its equation can be incorporated into a formal decision analysis. © 2013 Tianjin Lung Cancer Institute and Wiley Publishing Asia Pty Ltd.
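
    The reported odds ratios correspond to log-odds weights in a logistic model. A sketch of how such a model generates a pretest probability, using the published ORs but a hypothetical intercept (the fitted intercept is not given in the abstract, so the absolute probabilities below are illustrative only):

```python
import math

# Log-odds weights from the reported odds ratios; the intercept is a
# hypothetical placeholder, not the fitted value.
WEIGHTS = {
    "history_other_cancer": math.log(0.264),   # protective (OR < 1)
    "pleural_indentation": math.log(8.766),
    "vessel_convergence": math.log(23.626),
    "air_bronchogram": math.log(7.41),
}
INTERCEPT = -2.0  # hypothetical baseline log-odds

def pretest_probability(findings):
    """Pretest probability of malignancy given a list of present findings."""
    z = INTERCEPT + sum(WEIGHTS[f] for f in findings)
    return 1.0 / (1.0 + math.exp(-z))

p_two_signs = pretest_probability(["pleural_indentation", "air_bronchogram"])
p_baseline = pretest_probability([])
print(round(p_two_signs, 3), round(p_baseline, 3))
```

Each positive finding multiplies the odds by its OR, so pleural indentation plus air bronchogram raises the baseline odds by a factor of roughly 8.8 × 7.4 under this model.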

  3. A Violation of the Conditional Independence Assumption in the Two-High-Threshold Model of Recognition Memory

    ERIC Educational Resources Information Center

    Chen, Tina; Starns, Jeffrey J.; Rotello, Caren M.

    2015-01-01

    The 2-high-threshold (2HT) model of recognition memory assumes that test items result in distinct internal states: they are either detected or not, and the probability of responding at a particular confidence level that an item is "old" or "new" depends on the state-response mapping parameters. The mapping parameters are…

  4. Multi-Detection Events, Probability Density Functions, and Reduced Location Area

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eslinger, Paul W.; Schrom, Brian T.

    2016-03-01

    Several efforts have been made in the Comprehensive Nuclear-Test-Ban Treaty (CTBT) community to assess the benefits of combining detections of radionuclides to improve the location estimates available from atmospheric transport modeling (ATM) backtrack calculations. We present a Bayesian estimation approach, rather than a simple dilution field-of-regard approach, to allow xenon detections and non-detections to be combined mathematically. This system represents one possible probabilistic approach to radionuclide event formation. Application of this method to a recent interesting radionuclide event shows a substantial reduction in the location uncertainty of that event.
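
    Mathematically, combining detections and non-detections across stations is a Bayes update over candidate source locations: multiply the per-station likelihoods into a prior and renormalize. A toy sketch with two hypothetical source cells (the ATM-derived likelihood values here are invented for illustration):

```python
def posterior_location(prior, likelihoods):
    """Combine per-station detection/non-detection likelihoods for each
    candidate source cell via Bayes' rule.

    `likelihoods` maps each cell to a list of P(observation | source in cell)
    values, one per station (detections and non-detections alike).
    """
    post = {}
    for cell, prior_p in prior.items():
        like = 1.0
        for l in likelihoods[cell]:
            like *= l
        post[cell] = prior_p * like
    total = sum(post.values())
    return {cell: p / total for cell, p in post.items()}

prior = {"A": 0.5, "B": 0.5}  # flat prior over two candidate cells
likelihoods = {"A": [0.9, 0.2], "B": [0.3, 0.8]}  # one detection, one non-detection
post = posterior_location(prior, likelihoods)
print(post)
```

Non-detections enter on the same footing as detections: a station that should have seen xenon from cell A but did not (the 0.2 above) pulls posterior mass toward cell B.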

  5. Detection probability of an in-stream passive integrated transponder (PIT) tag detection system for juvenile salmonids in the Klamath River, northern California, 2011

    USGS Publications Warehouse

    Beeman, John W.; Hayes, Brian; Wright, Katrina

    2012-01-01

    A series of in-stream passive integrated transponder (PIT) detection antennas installed across the Klamath River in August 2010 were tested using tagged fish in the summer of 2011. Six pass-by antennas were constructed and anchored to the bottom of the Klamath River at a site between the Shasta and Scott Rivers. Two of the six antennas malfunctioned during the spring of 2011 and two pass-through antennas were installed near the opposite shoreline prior to system testing. The detection probability of the PIT tag detection system was evaluated using yearling coho salmon implanted with a PIT tag and a radio transmitter and then released into the Klamath River slightly downstream of Iron Gate Dam. Cormack-Jolly-Seber capture-recapture methods were used to estimate the detection probability of the PIT tag detection system based on detections of PIT tags there and detections of radio transmitters at radio-telemetry detection systems downstream. One of the 43 PIT- and radio-tagged fish released was detected by the PIT tag detection system and 23 were detected by the radio-telemetry detection systems. The estimated detection probability of the PIT tag detection system was 0.043 (standard error 0.042). Eight PIT-tagged fish from other studies also were detected. Detections at the PIT tag detection system were at the two pass-through antennas and the pass-by antenna adjacent to them. Above average river discharge likely was a factor in the low detection probability of the PIT tag detection system. High discharges dislodged two power cables leaving 12 meters of the river width unsampled for PIT detections and resulted in water depths greater than the read distance of the antennas, which allowed fish to pass over much of the system with little chance of being detected. Improvements in detection probability may be expected under river discharge conditions where water depth over the antennas is within maximum read distance of the antennas. Improvements also may be expected if additional arrays of antennas are used.
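
    As a sanity check on the reported estimate, treating the 23 fish known from radio telemetry to have passed the array as binomial trials gives nearly the same point estimate (the study itself used Cormack-Jolly-Seber models, so the standard error differs slightly):

```python
import math

def detection_estimate(detected, known_present):
    """Binomial detection probability with its standard error."""
    p = detected / known_present
    se = math.sqrt(p * (1 - p) / known_present)
    return p, se

# 1 of the released fish was detected at the PIT array; 23 were confirmed
# downstream by radio telemetry and serve as the trials here.
p_hat, se_hat = detection_estimate(1, 23)
print(round(p_hat, 3), round(se_hat, 3))  # -> 0.043 0.043
```

The binomial point estimate, 1/23 ≈ 0.043, matches the reported detection probability; the capture-recapture model additionally accounts for imperfect detection at the downstream radio-telemetry sites.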

  6. Associations among habitat characteristics and meningeal worm prevalence in eastern South Dakota, USA

    USGS Publications Warehouse

    Jacques, Christopher N.; Jenks, Jonathan A.; Klaver, Robert W.; Dubay, Shelli A.

    2017-01-01

    Few studies have evaluated how wetland and forest characteristics influence the prevalence of meningeal worm (Parelaphostrongylus tenuis) infection of deer throughout the grassland biome of central North America. We used previously collected, county-level prevalence data to evaluate associations between habitat characteristics and probability of meningeal worm infection in white-tailed deer (Odocoileus virginianus) across eastern South Dakota, US. The highest-ranked binomial regression model for the probability of meningeal worm infection was spring temperature + summer precipitation + percent wetland; the weight of evidence (wi = 0.71) favored this model over alternative models, though predictive capability was low (receiver operating characteristic = 0.62). Probability of meningeal worm infection increased by 1.3- and 1.6-fold for each 1 cm and 1 °C increase in summer precipitation and spring temperature, respectively. Similarly, probability of infection increased 1.2-fold for each 1% increase in wetland habitat. Our findings highlight the importance of wetland habitat in predicting meningeal worm infection across eastern South Dakota. Future research is warranted to evaluate the relationships between climatic conditions (e.g., drought, wet cycles) and deer habitat selection in maintaining P. tenuis along the western boundary of the parasite.

  7. Demographic estimation methods for plants with dormancy

    USGS Publications Warehouse

    Kery, M.; Gregg, K.B.

    2004-01-01

    Demographic studies in plants appear simple because, unlike animals, plants do not run away. Plant individuals can be marked with, e.g., plastic tags, but often the coordinates of an individual may be sufficient to identify it. Vascular plants in temperate latitudes have a pronounced seasonal life cycle, so most plant demographers survey their study plots once a year, often during or shortly after flowering. Life-states are pervasive in plants, hence the results of a demographic study for an individual can be summarized in a familiar encounter history, such as 0VFVVF000. A zero means that an individual was not seen in a year and a letter denotes its state for years when it was seen aboveground. V and F here stand for vegetative and flowering states, respectively. Probabilities of survival and state transitions can then be obtained by mere counting.

    Problems arise when there is an unobservable dormant state, i.e., when plants may stay belowground for one or more growing seasons. Encounter histories such as 0VF00F000 may then occur, where the meaning of zeroes becomes ambiguous: a zero can mean either a dead or a dormant plant. Various ad hoc methods in wide use among plant ecologists have made strong assumptions about when a zero should be equated to a dormant individual. These methods have never been compared with each other. In our talk and in Kéry et al. (submitted), we show that these ad hoc estimators provide spurious estimates of survival and should not be used. In contrast, if detection probabilities for aboveground plants are known or can be estimated, capture-recapture (CR) models can be used to estimate probabilities of survival and state transitions and the fraction of the population that is dormant. We have used this approach in two studies of terrestrial orchids, Cleistes bifaria (Kéry et al., submitted) and Cypripedium reginae (Kéry & Gregg, submitted) in West Virginia, U.S.A. For Cleistes, our data comprised one population with a total of 620 marked ramets over 10 years, and for Cypripedium, two populations with 98 and 258 marked ramets over 11 years. We chose the ramet (= single stem or shoot) as the demographic unit of our study since there was no way of distinguishing among genets (genet = genetic individual, i.e., the "individual" that animal ecologists are mostly concerned with). This will introduce some non-independence into the data, which can nevertheless be dealt with easily by correcting variances for overdispersion. Using ramets instead of genets has the further advantage that individuals can be assigned to a state such as flowering or vegetative in an unambiguous manner. This is not possible when genets are the demographic units. In all three populations, auxiliary data were available to show that detection probability of aboveground plants was ≈0.995.

    We fitted multistate models in program MARK by specifying three states (D, V, F), even though the dormant state D does not occur in the encounter histories. Detection probability is fixed at 1 for the vegetative (V) and the flowering state (F) and at zero for the dormant state (D). Rates of survival and of state transitions as well as slopes of covariate relationships can be estimated, and LRT or the AIC machinery can be used to select among models. To estimate the fraction of the population in the unobservable dormant state, the encounter histories are collapsed to 0 (plant not observed aboveground) and 1 (plant observed aboveground). The Cormack-Jolly-Seber model without constraints on detection probability is used to estimate detection probability, the complement of which is the estimated fraction of the population in the dormant state. Parameter identifiability is an important issue in multistate models. We used the Catchpole-Morgan-Freeman approach to determine which parameters are estimable in principle in our multistate models. Most of the 15 tested models were indeed estimable, with the notable exception of the most general model, which has fully interactive state- and time-dependent survival and state transition rates. This model would become identifiable if at least some plants were excavated in years when they do not show up aboveground.

    Our analyses of the three populations of Cleistes and Cypripedium yielded annual ramet survival rates ranging from 0.86 to 0.96. Estimates of the average fraction dormant ranged from 0.02 to 0.30, but with up to half a population in the dormant state in some years. Ultrastructural modeling enables interesting hypotheses to be tested about the relationships of demographic rates with climatic covariates, for instance. Such covariate modeling makes the CR approach particularly interesting for evolutionary-ecological questions about, e.g., the adaptive significance of the dormant state.

  8. Probabilistic model for quick detection of dissimilar binary images

    NASA Astrophysics Data System (ADS)

    Mustafa, Adnan A. Y.

    2015-09-01

    We present a quick method to detect dissimilar binary images. The method is based on a "probabilistic matching model" for image matching. The matching model is used to predict the probability of occurrence of distinct-dissimilar image pairs (completely different images) when matching one image to another. Based on this model, distinct-dissimilar images can be detected with high confidence by matching only a few points between two images, namely 11 points for a 99.9% successful detection rate. For image pairs that are dissimilar but not distinct-dissimilar, more points need to be mapped. The number of points required to attain a certain successful detection rate or confidence depends on the amount of similarity between the compared images: as this similarity increases, more points are required. For example, images that differ by 1% can be detected by mapping fewer than 70 points on average. More importantly, the model is image-size invariant, so images of any size will produce high confidence levels with a limited number of matched points. As a result, this method does not suffer from the image-size handicap that impedes current methods. We report on extensive tests conducted on real images of different sizes.
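
    The scaling of required points with similarity follows from a simple independence argument: if a fraction d of pixels differ, the chance that k independently sampled points all agree is (1 − d)^k, so k ≈ log(1 − confidence)/log(1 − d) points suffice to expose a mismatch. This is a simplified sketch of the idea, not the paper's exact matching model:

```python
import math

def points_needed(mismatch_rate, confidence):
    """Number of randomly sampled points to compare so that, if the images
    disagree at `mismatch_rate` of their pixels, at least one sampled point
    disagrees with probability `confidence` (independence assumption)."""
    return math.ceil(math.log(1 - confidence) / math.log(1 - mismatch_rate))

# Completely different binary images disagree at ~half their pixels
print(points_needed(0.5, 0.999))   # -> 10
# Images differing at only 1% of pixels need far more samples
print(points_needed(0.01, 0.999))  # -> 688
```

Under this independence assumption, roughly 10 points already give 99.9% confidence for distinct-dissimilar binary images, consistent in magnitude with the 11 points reported; note that the required count does not depend on the image size, only on the mismatch rate.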

  9. Is eradication of the pinewood nematode (Bursaphelenchus xylophilus) likely? An evaluation of current contingency plans.

    PubMed

    Okland, Bjørn; Skarpaas, Olav; Schroeder, Martin; Magnusson, Christer; Lindelöw, Ake; Thunes, Karl

    2010-09-01

    The pinewood nematode (PWN) is one of the worst tree-killing exotic pests in East-Asian countries. The first European record of establishment in Portugal in 1999 triggered extensive surveys and contingency plans for eradication in European countries, including immediate removal of large areas of conifer host trees. Using Norway as an example, we applied a simulation model to evaluate the chance of successful eradication of a hypothetical introduction by the current contingency plan in a northern area where wilting symptoms are not expected to occur. Despite a highly variable spread of nematode infestations in space and time, the probability of successful eradication in 20 years was consistently low (mean 0.035, SE 0.02). This low success rate did not change significantly when the biological parameters were varied in sensitivity analyses (SA), probably owing to the late detection of infestations by the survey (mean 14.3 years). SA revealed a strong influence of management parameters. However, a high probability of eradication required unrealistic measures: achieving an eradication probability of 0.99 in 20 years required 10,000 survey samples per year and a host tree removal radius of 8,000 m around each detection point. © 2010 Society for Risk Analysis.

  10. Community Detection Algorithm Combining Stochastic Block Model and Attribute Data Clustering

    NASA Astrophysics Data System (ADS)

    Kataoka, Shun; Kobayashi, Takuto; Yasuda, Muneki; Tanaka, Kazuyuki

    2016-11-01

    We propose a new algorithm to detect the community structure in a network that utilizes both the network structure and vertex attribute data. Suppose we have the network structure together with the vertex attribute data, that is, information assigned to each vertex associated with the community to which it belongs. The problem addressed in this paper is the detection of the community structure from the information in both the network structure and the vertex attribute data. Our method is based on a Bayesian approach that models the posterior probability distribution of the community labels. The community structure is detected using belief propagation and an EM algorithm. We numerically verified the performance of our method using computer-generated networks and real-world networks.

  11. Comprehensive Assessment of Models and Events based on Library tools (CAMEL)

    NASA Astrophysics Data System (ADS)

    Rastaetter, L.; Boblitt, J. M.; DeZeeuw, D.; Mays, M. L.; Kuznetsova, M. M.; Wiegand, C.

    2017-12-01

    At the Community Coordinated Modeling Center (CCMC), the assessment of modeling skill using a library of model-data comparison metrics is taken to the next level by fully integrating the ability to request a series of runs with the same model parameters for a list of events. The CAMEL framework initiates and runs a series of selected, pre-defined simulation settings for participating models (e.g., WSA-ENLIL, SWMF-SC+IH for the heliosphere, SWMF-GM, OpenGGCM, LFM, GUMICS for the magnetosphere) and performs post-processing using existing tools for a host of different output parameters. The framework compares the resulting time series data with respective observational data and computes a suite of metrics such as Prediction Efficiency, Root Mean Square Error, Probability of Detection, Probability of False Detection, Heidke Skill Score for each model-data pair. The system then plots scores by event and aggregated over all events for all participating models and run settings. We are building on past experiences with model-data comparisons of magnetosphere and ionosphere model outputs in GEM2008, GEM-CEDAR CETI2010 and Operational Space Weather Model challenges (2010-2013). We can apply the framework also to solar-heliosphere as well as radiation belt models. The CAMEL framework takes advantage of model simulations described with Space Physics Archive Search and Extract (SPASE) metadata and a database backend design developed for a next-generation Run-on-Request system at the CCMC.
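
    The categorical metrics named above all derive from a 2x2 contingency table of forecast versus observed events; a minimal sketch (the function and variable names are mine, not CAMEL's):

```python
def skill_scores(hits, misses, false_alarms, correct_negatives):
    """Probability of detection, probability of false detection, and
    Heidke skill score from a 2x2 forecast/observation contingency table."""
    h, m, f, c = hits, misses, false_alarms, correct_negatives
    n = h + m + f + c
    pod = h / (h + m)    # fraction of observed events that were forecast
    pofd = f / (f + c)   # fraction of non-events that were falsely forecast
    # Heidke skill score: accuracy relative to chance agreement
    chance = ((h + m) * (h + f) + (c + m) * (c + f)) / n
    hss = (h + c - chance) / (n - chance)
    return pod, pofd, hss

pod, pofd, hss = skill_scores(hits=50, misses=10, false_alarms=20,
                              correct_negatives=120)
```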

  12. Hierarchical heuristic search using a Gaussian mixture model for UAV coverage planning.

    PubMed

    Lin, Lanny; Goodrich, Michael A

    2014-12-01

    During unmanned aerial vehicle (UAV) search missions, efficient use of UAV flight time requires flight paths that maximize the probability of finding the desired subject. The probability of detecting the desired subject based on UAV sensor information can vary in different search areas due to environmental factors such as vegetation density or lighting conditions, making it likely that the UAV can only partially detect the subject. This adds another dimension of complexity to the already difficult (NP-hard) problem of finding an optimal search path. We present a new class of algorithms that account for partial detection in the form of a task difficulty map and produce paths that approximate the payoff of optimal solutions. The algorithms use the mode goodness ratio heuristic, which uses a Gaussian mixture model to prioritize search subregions, and search for effective paths through the parameter space at different levels of resolution. We compare the performance of the new algorithms against two published algorithms (Bourgault's algorithm and the LHC-GW-CONV algorithm) in simulated searches with three real search and rescue scenarios, and show that the new algorithms significantly outperform the existing algorithms and can produce efficient paths with payoffs near the optimum.

  13. Age-0 Lost River sucker and shortnose sucker nearshore habitat use in Upper Klamath Lake, Oregon: A patch occupancy approach

    USGS Publications Warehouse

    Burdick, S.M.; Hendrixson, H.A.; VanderKooi, S.P.

    2008-01-01

    We examined habitat use by age-0 Lost River suckers Deltistes luxatus and shortnose suckers Chasmistes brevirostris over six substrate classes and in vegetated and nonvegetated areas of Upper Klamath Lake, Oregon. We used a patch occupancy approach to model the effect of physical habitat and water quality conditions on habitat use. Our models accounted for potential inconsistencies in detection probability among sites and sampling occasions as a result of differences in fishing gear types and techniques, habitat characteristics, and age-0 fish size and abundance. Detection probability was greatest during mid- to late summer, when water temperatures were highest and age-0 suckers were the largest. The proportion of sites used by age-0 suckers was inversely related to depth (range = 0.4-3.0 m), particularly during late summer. Age-0 suckers were more likely to use habitats containing small substrate (<64 mm) and habitats with vegetation than those without vegetation. Relatively narrow ranges in dissolved oxygen, temperature, and pH prevented us from detecting effects of these water quality features on age-0 sucker nearshore habitat use.

  14. Real Time Data Management for Estimating Probabilities of Incidents and Near Misses

    NASA Astrophysics Data System (ADS)

    Stanitsas, P. D.; Stephanedes, Y. J.

    2011-08-01

    Advances in real-time data collection, data storage and computational systems have led to development of algorithms for transport administrators and engineers that improve traffic safety and reduce cost of road operations. Despite these advances, problems in effectively integrating real-time data acquisition, processing, modelling and road-use strategies at complex intersections and motorways remain. These are related to increasing system performance in identification, analysis, detection and prediction of traffic state in real time. This research develops dynamic models to estimate the probability of road incidents, such as crashes and conflicts, and incident-prone conditions based on real-time data. The models support integration of anticipatory information and fee-based road use strategies in traveller information and management. Development includes macroscopic/microscopic probabilistic models, neural networks, and vector autoregressions tested via machine vision at EU and US sites.

  15. The perception of probability.

    PubMed

    Gallistel, C R; Krishan, Monika; Liu, Ye; Miller, Reilly; Latham, Peter E

    2014-01-01

    We present a computational model to explain the results from experiments in which subjects estimate the hidden probability parameter of a stepwise nonstationary Bernoulli process outcome by outcome. The model captures the following results qualitatively and quantitatively, with only 2 free parameters: (a) Subjects do not update their estimate after each outcome; they step from one estimate to another at irregular intervals. (b) The joint distribution of step widths and heights cannot be explained on the assumption that a threshold amount of change must be exceeded in order for them to indicate a change in their perception. (c) The mapping of observed probability to the median perceived probability is the identity function over the full range of probabilities. (d) Precision (how close estimates are to the best possible estimate) is good and constant over the full range. (e) Subjects quickly detect substantial changes in the hidden probability parameter. (f) The perceived probability sometimes changes dramatically from one observation to the next. (g) Subjects sometimes have second thoughts about a previous change perception, after observing further outcomes. (h) The frequency with which they perceive changes moves in the direction of the true frequency over sessions. (Explaining this finding requires 2 additional parametric assumptions.) The model treats the perception of the current probability as a by-product of the construction of a compact encoding of the experienced sequence in terms of its change points. It illustrates the why and the how of intermittent Bayesian belief updating and retrospective revision in simple perception. It suggests a reinterpretation of findings in the recent literature on the neurobiology of decision making. (PsycINFO Database Record (c) 2014 APA, all rights reserved).

  16. Study of biological communities subject to imperfect detection: Bias and precision of community N-mixture abundance models in small-sample situations

    USGS Publications Warehouse

    Yamaura, Yuichi; Kery, Marc; Royle, Andy

    2016-01-01

    Community N-mixture abundance models for replicated counts provide a powerful and novel framework for drawing inferences related to species abundance within communities subject to imperfect detection. To assess the performance of these models, and to compare them to related community occupancy models in situations with marginal information, we used simulation to examine the effects of mean abundance (λ¯: 0.1, 0.5, 1, 5), detection probability (p¯: 0.1, 0.2, 0.5), and number of sampling sites (n_site: 10, 20, 40) and visits (n_visit: 2, 3, 4) on the bias and precision of species-level parameters (mean abundance and covariate effect) and a community-level parameter (species richness). Bias and imprecision of estimates decreased when any of the four variables (λ¯, p¯, n_site, n_visit) increased. Detection probability p¯ was most important for the estimates of mean abundance, while λ¯ was most influential for covariate effect and species richness estimates. For all parameters, increasing n_site was more beneficial than increasing n_visit. Minimal conditions for obtaining adequate performance of community abundance models were n_site ≥ 20, p¯ ≥ 0.2, and λ¯ ≥ 0.5. At lower abundance, the performance of community abundance and community occupancy models as species richness estimators were comparable. We then used additive partitioning analysis to reveal that raw species counts can overestimate β diversity both of species richness and the Shannon index, while community abundance models yielded better estimates. Community N-mixture abundance models thus have great potential for use with community ecology or conservation applications provided that replicated counts are available.
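
    An N-mixture model marginalizes the latent abundance N out of the repeated binomial counts at each site; a single-species, single-site sketch of that likelihood (the truncation bound n_max is an implementation convenience, not part of the model):

```python
from math import comb, exp, factorial

def nmix_site_likelihood(counts, lam, p, n_max=100):
    """Likelihood of repeated counts at one site under an N-mixture model:
    N ~ Poisson(lam) once per site, each count y ~ Binomial(N, p),
    with the unobserved N summed out up to a truncation bound."""
    total = 0.0
    for n in range(max(counts), n_max + 1):
        poisson = exp(-lam) * lam ** n / factorial(n)
        binom = 1.0
        for y in counts:
            binom *= comb(n, y) * p ** y * (1 - p) ** (n - y)
        total += poisson * binom
    return total

# Low detection probability makes small counts consistent with larger lam
like_low_p = nmix_site_likelihood([1, 0, 2], lam=5.0, p=0.2)
```

    With p = 1 the mixture collapses to the Poisson likelihood of the count itself, which is a convenient sanity check on the implementation.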

  17. Import risk assessment incorporating a dose-response model: introduction of highly pathogenic porcine reproductive and respiratory syndrome into Australia via illegally imported raw pork.

    PubMed

    Brookes, V J; Hernández-Jover, M; Holyoake, P; Ward, M P

    2014-03-01

    Highly pathogenic porcine reproductive and respiratory syndrome (PRRS) has spread through parts of south-east Asia, posing a risk to Australia. The objective of this study was to assess the probability of infection of a feral or domestic pig in Australia with highly pathogenic PRRS following ingestion of illegally imported raw pork. A conservative scenario was considered in which 500 g of raw pork was imported from the Philippines into Australia without being detected by border security, then discarded from a household and potentially accessed by a pig. Monte Carlo simulation of a two-dimensional, stochastic model was used to estimate the probability of entry and exposure, and the probability of infection was assessed by incorporating a virus-decay and mechanistic dose-response model. Results indicated that the probability of infection of a feral pig after ingestion of raw meat was higher than that of a domestic pig. Sensitivity analysis was used to assess the influence of input parameters on model output probability estimates, and extension of the virus-decay and dose-response model was used to explore how the temperature and time from slaughter to ingestion, the weight of meat, and the level of viraemia at slaughter affect the infectivity of the meat. The parameters with the highest influence on the model output were the level of viraemia of a pig prior to slaughter and the probability of access by a feral pig to food-waste discarded on property surrounding a household. Extension of the decay and dose-response model showed that small pieces of meat (10 g) from a highly pathogenic PRRS viraemic pig could contain enough virus to give a high probability of infection, and that routes to Australia by sea or air from all highly pathogenic PRRS virus endemic countries were of interest, depending on the temperature of the raw meat during transport.
This study highlighted the importance of mitigation strategies such as disposal of food-waste from international traffic as quarantine waste, and the need for further research into the probability of access to food-waste on properties by feral pigs. Copyright © 2014 Elsevier B.V. All rights reserved.
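
    As an illustration only, the coupling of virus decay and dose-response can be sketched with first-order decay and an exponential dose-response model; the functional forms and every number below are assumptions for the sketch, not values fitted in the study:

```python
import math

def surviving_dose(initial_dose, decay_rate_per_day, days):
    """First-order decay of infectious virus in meat between
    slaughter and ingestion."""
    return initial_dose * math.exp(-decay_rate_per_day * days)

def infection_probability(dose, r):
    """Exponential dose-response: each infectious unit independently
    initiates infection with probability r."""
    return 1.0 - math.exp(-r * dose)

# Hypothetical numbers: a week in transit sharply reduces the dose
dose = surviving_dose(initial_dose=1e5, decay_rate_per_day=0.5, days=7)
p_inf = infection_probability(dose, r=1e-3)
```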

  18. Zipf 's law and the effect of ranking on probability distributions

    NASA Astrophysics Data System (ADS)

    Günther, R.; Levitin, L.; Schapiro, B.; Wagner, P.

    1996-02-01

    Ranking procedures are widely used in the description of many different types of complex systems. Zipf's law is one of the most remarkable frequency-rank relationships and has been observed independently in physics, linguistics, biology, demography, etc. We show that ranking plays a crucial role in making it possible to detect empirical relationships in systems that exist in one realization only, even when the statistical ensemble to which the systems belong has a very broad probability distribution. Analytical results and numerical simulations are presented which clarify the relations between the probability distributions and the behavior of expected values for unranked and ranked random variables. This analysis is performed, in particular, for the evolutionary model presented in our previous papers which leads to Zipf's law and reveals the underlying mechanism of this phenomenon in terms of a system with interdependent and interacting components as opposed to the “ideal gas” models suggested by previous researchers. The ranking procedure applied to this model leads to a new, unexpected phenomenon: a characteristic “staircase” behavior of the mean values of the ranked variables (ranked occupation numbers). This result is due to the broadness of the probability distributions for the occupation numbers and does not follow from the “ideal gas” model. Thus, it provides an opportunity, by comparison with empirical data, to obtain evidence as to which model relates to reality.
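
    The core effect of ranking is easy to reproduce by simulation: i.i.d. draws from a broad distribution all share the same expected value, yet their rank-ordered means show a systematic rank-value relationship (the Pareto distribution used here is illustrative, not the paper's evolutionary model):

```python
import random

def mean_values(n_vars=20, n_trials=5000, seed=1):
    """Average i.i.d. heavy-tailed draws with and without ranking."""
    rng = random.Random(seed)
    unranked = [0.0] * n_vars
    ranked = [0.0] * n_vars
    for _ in range(n_trials):
        draws = [rng.paretovariate(1.5) for _ in range(n_vars)]
        for i, x in enumerate(draws):
            unranked[i] += x / n_trials          # mean of variable i
        for i, x in enumerate(sorted(draws, reverse=True)):
            ranked[i] += x / n_trials            # mean of rank-i value
    return unranked, ranked

unranked, ranked = mean_values()
# unranked means are all near the common expectation;
# ranked means fall steeply from rank 1 downward
```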

  19. Statistics provide guidance for indigenous organic carbon detection on Mars missions.

    PubMed

    Sephton, Mark A; Carter, Jonathan N

    2014-08-01

    Data from the Viking and Mars Science Laboratory missions indicate the presence of organic compounds that are not definitively martian in origin. Both contamination and confounding mineralogies have been suggested as alternatives to indigenous organic carbon. Intuitive thought suggests that we are repeatedly obtaining data that confirms the same level of uncertainty. Bayesian statistics may suggest otherwise. If an organic detection method has a true positive to false positive ratio greater than one, then repeated organic matter detection progressively increases the probability of indigeneity. Bayesian statistics also reveal that methods with higher ratios of true positives to false positives give higher overall probabilities and that detection of organic matter in a sample with a higher prior probability of indigenous organic carbon produces greater confidence. Bayesian statistics, therefore, provide guidance for the planning and operation of organic carbon detection activities on Mars. Suggestions for future organic carbon detection missions and instruments are as follows: (i) On Earth, instruments should be tested with analog samples of known organic content to determine their true positive to false positive ratios. (ii) On the mission, for an instrument with a true positive to false positive ratio above one, it should be recognized that each positive detection of organic carbon will result in a progressive increase in the probability of indigenous organic carbon being present; repeated measurements, therefore, can overcome some of the deficiencies of a less-than-definitive test. (iii) For a fixed number of analyses, the highest true positive to false positive ratio method or instrument will provide the greatest probability that indigenous organic carbon is present. 
(iv) On Mars, analyses should concentrate on samples with highest prior probability of indigenous organic carbon; intuitive desires to contrast samples of high prior probability and low prior probability of indigenous organic carbon should be resisted.
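
    The repeated-detection argument is Bayes' rule in odds form: each positive result multiplies the prior odds by the true-positive to false-positive ratio. A small sketch with hypothetical numbers:

```python
def posterior_indigenous(prior, tpr, fpr, n_positives):
    """Posterior probability of indigenous organic carbon after
    n_positives independent positive detections, via Bayes' rule
    in odds form: posterior odds = prior odds * (TPR/FPR)^n."""
    odds = prior / (1.0 - prior) * (tpr / fpr) ** n_positives
    return odds / (1.0 + odds)

# Even a mediocre instrument (TPR/FPR = 2) accumulates confidence
posteriors = [posterior_indigenous(0.1, 0.6, 0.3, n) for n in range(6)]
```

    Repeated positives drive the posterior steadily upward whenever the TPR/FPR ratio exceeds one, which is exactly the point the abstract makes about less-than-definitive tests.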

  20. Atmospheric control on ground and space based early warning system for hazard linked to ash injection into the atmosphere

    NASA Astrophysics Data System (ADS)

    Caudron, Corentin; Taisne, Benoit; Whelley, Patrick; Garces, Milton; Le Pichon, Alexis

    2014-05-01

    Violent volcanic eruptions are common in Southeast Asia, which is bordered by active subduction zones with hundreds of active volcanoes. The physical conditions at the eruptive vent are difficult to estimate, especially when there are only a few sensors distributed around the volcano. New methods are therefore required to tackle this problem. Among them, satellite imagery and infrasound may rapidly provide information on strong eruptions at volcanoes that are not closely monitored by on-site instruments. The deployment of an infrasonic array in Singapore will increase the detection capability of the existing IMS network. In addition, the location of Singapore with respect to those volcanoes makes it the perfect site to identify erupting blasts based on the wavefront characteristics of the recorded signal. There are ~750 active or potentially active volcanoes within 4000 kilometers of Singapore. They have been combined into 23 volcanic zones that have clear azimuths with respect to Singapore. Each of those zones has been assessed for probabilities of eruptive styles, from moderate (Volcanic Explosivity Index of 3) to cataclysmic (VEI 8), based on remote morphologic analysis. Ash dispersal models have been run using wind velocity profiles from 2010 to 2012 and hypothetical eruption scenarios for a range of eruption explosivities. Results can be used to estimate the likelihood of volcanic ash at any location in SE Asia. Seasonal changes in atmospheric conditions will strongly affect the potential to detect small volcanic eruptions with infrasound, and clouds can hide eruption plumes from satellites. We use the average cloud cover for each zone to estimate the probability of eruption detection from space, and atmospheric models to estimate the probability of eruption detection with infrasound.
Using remote sensing in conjunction with infrasound improves detection capabilities, as each method can detect eruptions when the other is 'blind' or 'deafened' by adverse atmospheric conditions. Depending on its location, each volcanic zone will be associated with a threshold value (the minimum detectable VEI) that depends on the seasonality of the wind velocity profile in the region and on the cloud cover.

  1. Occupancy Modeling Species-Environment Relationships with Non-ignorable Survey Designs.

    PubMed

    Irvine, Kathryn M; Rodhouse, Thomas J; Wright, Wilson J; Olsen, Anthony R

    2018-05-26

    Statistical models supporting inferences about species occurrence patterns in relation to environmental gradients are fundamental to ecology and conservation biology. A common implicit assumption is that the sampling design is ignorable and does not need to be formally accounted for in analyses. The analyst assumes data are representative of the desired population and statistical modeling proceeds. However, if datasets from probability and non-probability surveys are combined or unequal selection probabilities are used, the design may be non-ignorable. We outline the use of pseudo-maximum likelihood estimation for site-occupancy models to account for such non-ignorable survey designs. This estimation method accounts for the survey design by properly weighting the pseudo-likelihood equation. In our empirical example, legacy and newer randomly selected locations were surveyed for bats to bridge a historic statewide effort with an ongoing nationwide program. We provide a worked example using bat acoustic detection/non-detection data and show how analysts can diagnose whether their design is ignorable. Using simulations, we assessed whether our approach is viable for modeling datasets composed of sites contributed outside of a probability design. Pseudo-maximum likelihood estimates differed from the usual maximum likelihood occupancy estimates for some bat species. Using simulations we show that the maximum likelihood estimator of species-environment relationships with non-ignorable sampling designs was biased, whereas the pseudo-likelihood estimator was design-unbiased. However, in our simulation study the designs composed of a large proportion of legacy or non-probability sites resulted in estimation issues for standard errors. These issues were likely a result of highly variable weights confounded by small sample sizes (5% or 10% sampling intensity and 4 revisits).
Aggregating datasets from multiple sources logically supports larger sample sizes and potentially increases the spatial extent of statistical inferences. Our results suggest that ignoring the mechanism by which locations were selected for data collection (e.g., the sampling design) can lead to erroneous model-based conclusions. Therefore, to ensure robust and defensible recommendations for evidence-based conservation decision-making, the survey design information must be available to analysts in addition to the data themselves. Details for constructing the weights used in estimation and code for implementation are provided. This article is protected by copyright. All rights reserved.
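
    As a stripped-down illustration of the weighting idea (not the authors' occupancy model: detection is taken as perfect and occupancy as Bernoulli), the pseudo-likelihood weights each site's contribution by the inverse of its inclusion probability, and the resulting estimator reduces to a weighted mean:

```python
def weighted_psi_hat(y, inclusion_prob):
    """Pseudo-maximum-likelihood estimate of an occupancy proportion
    when sites enter the sample with unequal inclusion probabilities:
    each site's log-likelihood term is weighted by 1/pi_i, and the
    maximizer is an inverse-probability-weighted mean."""
    weights = [1.0 / pi for pi in inclusion_prob]
    return sum(w * yi for w, yi in zip(weights, y)) / sum(weights)

# Occupied sites were sampled preferentially (pi = 0.8 vs 0.2), so the
# naive proportion overstates occupancy; the weights correct for it.
y = [1, 1, 1, 0, 1, 0, 0, 1]
pi = [0.8, 0.8, 0.8, 0.2, 0.8, 0.2, 0.2, 0.8]
naive = sum(y) / len(y)
weighted = weighted_psi_hat(y, pi)
```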

  2. Estimating disperser abundance using open population models that incorporate data from continuous detection PIT arrays

    USGS Publications Warehouse

    Dzul, Maria C.; Yackulic, Charles B.; Korman, Josh

    2017-01-01

    Autonomous passive integrated transponder (PIT) tag antenna systems continuously detect individually marked organisms at one or more fixed points over long time periods. Estimating abundance using data from autonomous antennae can be challenging, because these systems do not detect unmarked individuals. Here we pair PIT antennae data from a tributary with mark-recapture sampling data in a mainstem river to estimate the number of fish moving from the mainstem to the tributary. We then use our model to estimate abundance of non-native rainbow trout Oncorhynchus mykiss that move from the Colorado River to the Little Colorado River (LCR), the latter of which is important spawning and rearing habitat for federally-endangered humpback chub Gila cypha. We estimate 226 rainbow trout (95% CI: 127-370) entered the LCR from October 2013-April 2014. We discuss the challenges of incorporating detections from autonomous PIT antenna systems into mark-recapture population models, particularly in regards to using information about spatial location to estimate movement and detection probabilities.

  3. Reaction times to weak test lights. [psychophysics biological model

    NASA Technical Reports Server (NTRS)

    Wandell, B. A.; Ahumada, P.; Welsh, D.

    1984-01-01

    Maloney and Wandell (1984) describe a model of the response of a single visual channel to weak test lights. The initial channel response is a linearly filtered version of the stimulus. The filter output is randomly sampled over time. Each time a sample occurs, there is some probability - increasing with the magnitude of the sampled response - that a discrete detection event is generated. Maloney and Wandell derive the statistics of the detection events. In this paper we test the hypothesis that reaction-time responses to the presence of a weak test light are initiated at the first detection event. This makes it possible to extend the application of the model to lights that are slightly above threshold but still within the linear operating range of the visual system. A parameter-free prediction of the model proposed by Maloney and Wandell for lights detected by this statistic is tested. The data are in agreement with the prediction.
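
    The first-detection-event hypothesis is straightforward to simulate: sample the filtered response at Poisson-distributed times, let each sample trigger a detection event with probability increasing in the sampled magnitude, and take the first event time plus a motor delay as the reaction time. All parameter values here are illustrative, not those of Maloney and Wandell:

```python
import random

def reaction_time(response, sample_rate=100.0, gain=0.5,
                  motor_delay=0.2, t_max=5.0, seed=0):
    """First-detection-event reaction time: the channel response is
    sampled at Poisson times; each sample generates a detection event
    with probability proportional to the sampled magnitude (capped at 1)."""
    rng = random.Random(seed)
    t = 0.0
    while t < t_max:
        t += rng.expovariate(sample_rate)   # next Poisson sample time
        if rng.random() < min(1.0, gain * response(t)):
            return t + motor_delay
    return None  # light never detected within the trial

# A step of light turned on at t = 0.1 s; stronger lights -> faster RTs
def step(amplitude):
    return lambda t: amplitude if t > 0.1 else 0.0

rt_strong = [reaction_time(step(0.8), seed=s) for s in range(300)]
rt_weak = [reaction_time(step(0.1), seed=s) for s in range(300)]
```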

  4. Quick probabilistic binary image matching: changing the rules of the game

    NASA Astrophysics Data System (ADS)

    Mustafa, Adnan A. Y.

    2016-09-01

    A Probabilistic Matching Model for Binary Images (PMMBI) is presented that predicts the probability of matching binary images with any level of similarity. The model relates the number of mappings, the amount of similarity between the images and the detection confidence. We show the advantage of using a probabilistic approach to matching in similarity space as opposed to a linear search in size space. With PMMBI a complete model is available to predict the quick detection of dissimilar binary images. Furthermore, the similarity between the images can be measured to a good degree if the images are highly similar. PMMBI shows that only a few pixels need to be compared to detect dissimilarity between images, as low as two pixels in some cases. PMMBI is image size invariant; images of any size can be matched at the same quick speed. Near-duplicate images can also be detected without much difficulty. We present tests on real images that show the prediction accuracy of the model.

  5. Mixture models for estimating the size of a closed population when capture rates vary among individuals

    USGS Publications Warehouse

    Dorazio, R.M.; Royle, J. Andrew

    2003-01-01

    We develop a parameterization of the beta-binomial mixture that provides sensible inferences about the size of a closed population when probabilities of capture or detection vary among individuals. Three classes of mixture models (beta-binomial, logistic-normal, and latent-class) are fitted to recaptures of snowshoe hares for estimating abundance and to counts of bird species for estimating species richness. In both sets of data, rates of detection appear to vary more among individuals (animals or species) than among sampling occasions or locations. The estimates of population size and species richness are sensitive to model-specific assumptions about the latent distribution of individual rates of detection. We demonstrate using simulation experiments that conventional diagnostics for assessing model adequacy, such as deviance, cannot be relied on for selecting classes of mixture models that produce valid inferences about population size. Prior knowledge about sources of individual heterogeneity in detection rates, if available, should be used to help select among classes of mixture models that are to be used for inference.
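
    The beta-binomial member of this class has a closed-form mixture pmf; a short sketch showing why heterogeneity in detection matters for the never-detected class (parameter values are illustrative):

```python
from math import comb, exp, lgamma

def beta_binomial_pmf(y, k, a, b):
    """P(y detections in k occasions) when each individual's detection
    probability is drawn from a Beta(a, b) distribution."""
    log_beta = lambda x, z: lgamma(x) + lgamma(z) - lgamma(x + z)
    return comb(k, y) * exp(log_beta(y + a, k - y + b) - log_beta(a, b))

# Same mean p = 0.5, different heterogeneity: the U-shaped Beta(0.5, 0.5)
# puts far more mass on y = 0, the individuals a survey never sees.
homogeneous = [beta_binomial_pmf(y, 5, 20.0, 20.0) for y in range(6)]
heterogeneous = [beta_binomial_pmf(y, 5, 0.5, 0.5) for y in range(6)]
```

    The extra mass at y = 0 under heterogeneity is precisely what makes the estimated population size so sensitive to the assumed latent distribution of p.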

  6. Labeled RFS-Based Track-Before-Detect for Multiple Maneuvering Targets in the Infrared Focal Plane Array.

    PubMed

    Li, Miao; Li, Jun; Zhou, Yiyu

    2015-12-08

    The problem of jointly detecting and tracking multiple targets from the raw observations of an infrared focal plane array is a challenging task, especially for the case with uncertain target dynamics. In this paper a multi-model labeled multi-Bernoulli (MM-LMB) track-before-detect method is proposed within the labeled random finite sets (RFS) framework. The proposed track-before-detect method consists of two parts-MM-LMB filter and MM-LMB smoother. For the MM-LMB filter, original LMB filter is applied to track-before-detect based on target and measurement models, and is integrated with the interacting multiple models (IMM) approach to accommodate the uncertainty of target dynamics. For the MM-LMB smoother, taking advantage of the track labels and posterior model transition probability, the single-model single-target smoother is extended to a multi-model multi-target smoother. A Sequential Monte Carlo approach is also presented to implement the proposed method. Simulation results show the proposed method can effectively achieve tracking continuity for multiple maneuvering targets. In addition, compared with the forward filtering alone, our method is more robust due to its combination of forward filtering and backward smoothing.

  8. Monthly and seasonally verification of precipitation in Poland

    NASA Astrophysics Data System (ADS)

    Starosta, K.; Linkowska, J.

    2009-04-01

    The national meteorological service of Poland, the Institute of Meteorology and Water Management (IMWM), joined COSMO (the Consortium for Small-Scale Modelling) in July 2004. In Poland, version 3.5 of the COSMO_PL model ran until June 2007; since July 2007, model version 4.0 has been running. The model runs in an operational mode at 14-km grid spacing, twice a day (00 UTC, 12 UTC). For scientific research, a model with 7-km grid spacing is also run. Monthly and seasonal verification of the 24-hour (06 UTC - 06 UTC) accumulated precipitation is presented in this paper. The precipitation field of COSMO_LM was verified against a rain-gauge network (308 points). The verification was made for every month and all seasons from December 2007 to December 2008, for three forecast days and for selected thresholds: 0.5, 1, 2.5, 5, 10, 20, 25 and 30 mm. The following indices from the contingency table were calculated: FBI (frequency bias index), POD (probability of detection), PON (probability of detection of non-events), FAR (false alarm ratio), TSS (true skill statistic), HSS (Heidke skill score) and ETS (equitable threat score). Percentile ranks and the ROC (relative operating characteristic) are also presented. The ROC is a graph of the hit rate (Y-axis) against the false alarm rate (X-axis) for different decision thresholds.
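
    The listed indices are all functions of the 2x2 contingency table for a given precipitation threshold; a compact sketch of four of them (the counts below are invented for illustration):

```python
def verification_indices(hits, misses, false_alarms, correct_negatives):
    """FBI, FAR, TSS and ETS for one precipitation threshold, from the
    2x2 contingency table of forecast versus observed exceedances."""
    h, m, f, c = hits, misses, false_alarms, correct_negatives
    n = h + m + f + c
    fbi = (h + f) / (h + m)            # frequency bias (1 = unbiased)
    far = f / (h + f)                  # false alarm ratio
    tss = h / (h + m) - f / (f + c)    # true skill statistic (POD - POFD)
    h_chance = (h + m) * (h + f) / n   # hits expected by chance
    ets = (h - h_chance) / (h + m + f - h_chance)  # equitable threat score
    return fbi, far, tss, ets

# Illustrative counts for, say, the 5 mm threshold on day-1 forecasts
fbi, far, tss, ets = verification_indices(40, 10, 20, 130)
```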

  10. A Deep Stochastic Model for Detecting Community in Complex Networks

    NASA Astrophysics Data System (ADS)

    Fu, Jingcheng; Wu, Jianliang

    2017-01-01

    Discovering community structures is an important step toward understanding the structure and dynamics of real-world networks in social science, biology and technology. In this paper, we develop a deep stochastic model based on non-negative matrix factorization to identify communities; the model has two sets of parameters. One is the community membership matrix, in which the elements of a row give the probabilities that the corresponding node belongs to each of the given number of communities; the other is the community-community connection matrix, in which the element in the i-th row and j-th column is the probability of an edge between a randomly chosen node from the i-th community and a randomly chosen node from the j-th community. The parameters can be estimated by an efficient updating rule whose convergence is guaranteed. The community-community connection matrix in our model is more precise than its counterpart in traditional non-negative matrix factorization methods, and the method called symmetric non-negative matrix factorization is a special case of our model. Finally, experiments on both synthetic and real-world network data demonstrate that our algorithm is highly effective in detecting communities.

  11. Environmental DNA sampling is more sensitive than a traditional survey technique for detecting an aquatic invader.

    PubMed

    Smart, Adam S; Tingley, Reid; Weeks, Andrew R; van Rooyen, Anthony R; McCarthy, Michael A

    2015-10-01

    Effective management of alien species requires detecting populations in the early stages of invasion. Environmental DNA (eDNA) sampling can detect aquatic species at relatively low densities, but few studies have directly compared detection probabilities of eDNA sampling with those of traditional sampling methods. We compare the ability of a traditional sampling technique (bottle trapping) and eDNA to detect a recently established invader, the smooth newt Lissotriton vulgaris vulgaris, at seven field sites in Melbourne, Australia. Over a four-month period, per-trap detection probabilities ranged from 0.01 to 0.26 among sites where L. v. vulgaris was detected, whereas per-sample eDNA estimates were much higher (0.29-1.0). Detection probabilities of both methods varied temporally (across days and months), but temporal variation appeared to be uncorrelated between methods. Only estimates of spatial variation were strongly correlated across the two sampling techniques. Environmental variables (water depth, rainfall, ambient temperature) were not clearly correlated with detection probabilities estimated via trapping, whereas eDNA detection probabilities were negatively correlated with water depth, possibly reflecting higher eDNA concentrations at lower water levels. Our findings demonstrate that eDNA sampling can be an order of magnitude more sensitive than traditional methods, and illustrate that traditional- and eDNA-based surveys can provide independent information on species distributions when occupancy surveys are conducted over short timescales.
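
The sensitivity gap reported above compounds over repeated sampling: with independent samples, the probability of at least one detection is 1 - (1 - p)^n. A small sketch; the per-sample probabilities are illustrative values chosen from within the reported ranges:

```python
# Probability of at least one detection in n independent samples,
# given a per-sample detection probability p.
def cumulative_detection(p, n):
    return 1.0 - (1.0 - p) ** n

# Illustrative per-sample probabilities within the reported ranges.
p_trap, p_edna = 0.10, 0.50

# Smallest sampling effort needed for 95% cumulative detection.
traps_needed = next(n for n in range(1, 1000)
                    if cumulative_detection(p_trap, n) >= 0.95)
edna_needed = next(n for n in range(1, 1000)
                   if cumulative_detection(p_edna, n) >= 0.95)
# With these values: 29 trap-nights versus 5 eDNA samples.
```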

  12. Coactivation of Gustatory and Olfactory Signals in Flavor Perception

    PubMed Central

    Veldhuizen, Maria G.; Shepard, Timothy G.; Wang, Miao-Fen

    2010-01-01

    It is easier to detect mixtures of gustatory and olfactory flavorants than to detect either component alone. But does the detection of mixtures exceed the level predicted by probability summation, assuming independent detection of each component? To answer this question, we measured simple response times (RTs) to detect brief pulses of one of 3 flavorants (sucrose [gustatory], citral [olfactory], sucrose–citral mixture) or water, presented into the mouth by a computer-operated, automated flow system. Subjects were instructed to press a button as soon as they detected any of the 3 nonwater stimuli. Responses to the mixtures were faster (RTs smaller) than predicted by a model of probability summation of independently detected signals, suggesting positive coactivation (integration) of gustation and retronasal olfaction in flavor perception. Evidence for integration appeared mainly in the fastest 60% of the responses, indicating that integration arises relatively early in flavor processing. Results were similar when the 3 possible flavorants, and water, were interleaved within the same session (experimental condition), and when each flavorant was interleaved with water only (control conditions). This outcome suggests that subjects did not attend selectively to one flavor component or the other in the experimental condition and further supports the conclusion that (late) decisional or attentional strategies do not exert a large influence on the gustatory–olfactory flavor integration. PMID:20032112
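
The probability-summation benchmark used above assumes each component is detected independently, so a mixture goes undetected only if every component is missed. A minimal sketch with hypothetical component detection probabilities:

```python
# Predicted mixture detection probability under probability summation,
# assuming independent detection of each component.
def probability_summation(component_probs):
    p_miss = 1.0
    for p in component_probs:
        p_miss *= 1.0 - p
    return 1.0 - p_miss

# Hypothetical detection probabilities for the unmixed flavorants.
p_sucrose, p_citral = 0.40, 0.35
p_benchmark = probability_summation([p_sucrose, p_citral])
# Observed mixture detection above this benchmark would suggest
# coactivation rather than independent processing.
```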

  13. Fitting N-mixture models to count data with unmodeled heterogeneity: Bias, diagnostics, and alternative approaches

    USGS Publications Warehouse

    Duarte, Adam; Adams, Michael J.; Peterson, James T.

    2018-01-01

    Monitoring animal populations is central to wildlife and fisheries management, and the use of N-mixture models toward these efforts has markedly increased in recent years. Nevertheless, relatively little work has evaluated estimator performance when basic assumptions are violated, and diagnostics to identify when bias in parameter estimates from N-mixture models is likely are largely unexplored. We simulated count data sets using 837 combinations of detection probability, number of sample units, number of survey occasions, and type and extent of heterogeneity in abundance or detectability. We fit Poisson N-mixture models to these data, quantified the bias associated with each combination, and evaluated whether the parametric bootstrap goodness-of-fit (GOF) test can be used to indicate bias in parameter estimates. We also explored whether assumption violations can be diagnosed prior to fitting N-mixture models. In doing so, we propose a new model diagnostic, which we term the quasi-coefficient of variation (QCV). N-mixture models performed well when assumptions were met and detection probabilities were moderate (i.e., ≥0.3), and the performance of the estimator improved with increasing survey occasions and sample units. However, the magnitude of bias in estimated mean abundance with even slight amounts of unmodeled heterogeneity was substantial. The parametric bootstrap GOF test did not perform well as a diagnostic for bias in parameter estimates when detectability and sample sizes were low. The results indicate the QCV is useful for diagnosing potential bias, and that potential bias associated with unidirectional trends in abundance or detectability can be diagnosed using Poisson regression. This study represents the most thorough assessment to date of assumption violations and diagnostics when fitting N-mixture models using the most commonly implemented error distribution. 
Unbiased estimates of population state variables are needed to properly inform management decision making. Therefore, we also discuss alternative approaches to yield unbiased estimates of population state variables using similar data types, and we stress that there is no substitute for an effective sample design that is grounded upon well-defined management objectives.
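
The Poisson N-mixture likelihood evaluated in the simulations above treats repeated counts at a site as binomial draws from a latent Poisson abundance, summing over the unobserved abundance. A minimal single-site sketch; the counts and parameter values are illustrative:

```python
import math

# Likelihood of repeated counts y_t at one site under a Poisson
# N-mixture model: N ~ Poisson(lam), y_t | N ~ Binomial(N, p).
# The infinite sum over N is truncated at n_max.
def site_likelihood(counts, lam, p, n_max=100):
    total = 0.0
    for n in range(max(counts), n_max + 1):
        pois = math.exp(-lam) * lam ** n / math.factorial(n)
        binom_prod = 1.0
        for y in counts:
            binom_prod *= math.comb(n, y) * p ** y * (1 - p) ** (n - y)
        total += pois * binom_prod
    return total

# Counts from three survey occasions, with moderate detectability.
lik = site_likelihood(counts=[3, 2, 4], lam=10.0, p=0.3)
```

Maximizing this quantity over `lam` and `p` across sites is what the fitted models above do; the bias results concern what happens when the binomial/Poisson assumptions are violated.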

  14. Test for age-specificity in survival of the common tern

    USGS Publications Warehouse

    Nisbet, I.C.T.; Cam, E.

    2002-01-01

    Much effort in life-history theory has been addressed to the dependence of life-history traits on age, especially the phenomenon of senescence and its evolution. Although senescent declines in survival are well documented in humans and in domestic and laboratory animals, evidence for their occurrence and importance in wild animal species remains limited and equivocal. Several recent papers have suggested that methodological issues may contribute to this problem, and have encouraged investigators to improve sampling designs and to analyse their data using recently developed approaches to modelling of capture-mark-recapture data. Here we report on a three-year, two-site, mark-recapture study of known-aged common terns (Sterna hirundo) in the north-eastern USA. The study was nested within a long-term ecological study in which large numbers of chicks had been banded in each year for > 25 years. We used a range of models to test the hypothesis of an influence of age on survival probability. We also tested for a possible influence of sex on survival. The cross-sectional design of the study (one year's parameter estimates) avoided the possible confounding of effects of age and time. The study was conducted at a time when one of the study sites was being colonized and numbers were increasing rapidly. We detected two-way movements between the sites and estimated movement probabilities in the year for which they could be modelled. We also obtained limited data on emigration from our study area to more distant sites. We found no evidence that survival depended on either sex or age, except that survival was lower among the youngest birds (ages 2-3 years). Despite the large number of birds included in the study (1599 known-aged birds, 2367 total), confidence limits on estimates of survival probability were wide, especially for the oldest age-classes, so that a slight decline in survival late in life could not have been detected. 
In addition, the cross-sectional design of this study meant that a decline in survival probability within individuals (actuarial senescence) could have been masked by heterogeneity in survival probability among individuals (mortality selection). This emphasizes the need for the development of modelling tools permitting separation of these two phenomena, valid under field conditions in which the recapture probabilities are less than one.

  15. University Students’ Conceptual Knowledge of Randomness and Probability in the Contexts of Evolution and Mathematics

    PubMed Central

    Fiedler, Daniela; Tröbst, Steffen; Harms, Ute

    2017-01-01

    Students of all ages face severe conceptual difficulties regarding key aspects of evolution—the central, unifying, and overarching theme in biology. Aspects strongly related to abstract “threshold” concepts like randomness and probability appear to pose particular difficulties. A further problem is the lack of an appropriate instrument for assessing students’ conceptual knowledge of randomness and probability in the context of evolution. To address this problem, we have developed two instruments, Randomness and Probability Test in the Context of Evolution (RaProEvo) and Randomness and Probability Test in the Context of Mathematics (RaProMath), that include both multiple-choice and free-response items. The instruments were administered to 140 university students in Germany, then the Rasch partial-credit model was applied to assess them. The results indicate that the instruments generate reliable and valid inferences about students’ conceptual knowledge of randomness and probability in the two contexts (which are separable competencies). Furthermore, RaProEvo detected significant differences in knowledge of randomness and probability, as well as evolutionary theory, between biology majors and preservice biology teachers. PMID:28572180

  16. True detection limits in an experimental linearly heteroscedastic system. Part 1

    NASA Astrophysics Data System (ADS)

    Voigtman, Edward; Abraham, Kevin T.

    2011-11-01

    Using a lab-constructed laser-excited filter fluorimeter deliberately designed to exhibit linearly heteroscedastic, additive Gaussian noise, it has been shown that accurate estimates may be made of the true theoretical Currie decision levels (YC and XC) and true Currie detection limits (YD and XD) for the detection of rhodamine 6G tetrafluoroborate in ethanol. The obtained experimental values, for 5% probability of false positives and 5% probability of false negatives, were YC = 56.1 mV, YD = 125 mV, XC = 0.132 μg/mL and XD = 0.294 μg/mL. For 5% probability of false positives and 1% probability of false negatives, the obtained detection limits were YD = 158 mV and XD = 0.372 μg/mL. These decision levels and corresponding detection limits were shown to pass the ultimate test: they resulted in observed probabilities of false positives and false negatives that were statistically equivalent to the a priori specified values.
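
Currie's scheme fixes the decision level from the allowed false-positive rate and the detection limit from the allowed false-negative rate. A simplified sketch of the homoscedastic special case (the study above treats the harder linearly heteroscedastic system; the instrument parameters here are hypothetical):

```python
from statistics import NormalDist

# Currie decision level and detection limit for the simple case of
# additive Gaussian noise with constant standard deviation sigma_blank
# and a linear calibration of slope m (net response = m * x).
def currie_limits(sigma_blank, m, alpha=0.05, beta=0.05):
    z_alpha = NormalDist().inv_cdf(1 - alpha)  # controls false positives
    z_beta = NormalDist().inv_cdf(1 - beta)    # controls false negatives
    y_c = z_alpha * sigma_blank                # decision level (response)
    y_d = y_c + z_beta * sigma_blank           # detection limit (response)
    return y_c, y_d, y_c / m, y_d / m          # also in content units

# Hypothetical blank noise (mV) and calibration slope (mV per ug/mL).
y_c, y_d, x_c, x_d = currie_limits(sigma_blank=10.0, m=400.0)
```

With alpha = beta, this simplified case gives a detection limit exactly twice the decision level; the heteroscedastic noise in the study above is why its reported YD/YC ratio exceeds 2.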

  17. A Bayesian sequential processor approach to spectroscopic portal system decisions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sale, K; Candy, J; Breitfeller, E

    The development of faster, more reliable techniques to detect radioactive contraband in a portal-type scenario is an extremely important problem, especially in this era of constant terrorist threats. Toward this goal, the development of a model-based, Bayesian sequential data processor for the detection problem is discussed. In the sequential processor each datum (detector energy deposit and pulse arrival time) is used to update the posterior probability distribution over the space of model parameters. The nature of the sequential processor approach is that a detection is produced as soon as it is statistically justified by the data, rather than waiting for a fixed counting interval before any analysis is performed. In this paper the Bayesian model-based approach, the physics and signal processing models, and the decision functions are discussed along with the first results of our research.

  18. A simple two-stage model predicts response time distributions.

    PubMed

    Carpenter, R H S; Reddi, B A J; Anderson, A J

    2009-08-15

    The neural mechanisms underlying reaction times have previously been modelled in two distinct ways. When stimuli are hard to detect, response time tends to follow a random-walk model that integrates noisy sensory signals. But studies investigating the influence of higher-level factors such as prior probability and response urgency typically use highly detectable targets, and response times then usually correspond to a linear rise-to-threshold mechanism. Here we show that a model incorporating both types of element in series - a detector integrating noisy afferent signals, followed by a linear rise-to-threshold performing decision - successfully predicts not only mean response times but, much more stringently, the observed distribution of these times and the rate of decision errors over a wide range of stimulus detectability. By reconciling what previously may have seemed to be conflicting theories, we are now closer to having a complete description of reaction time and the decision processes that underlie it.
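
The rise-to-threshold stage described above can be sketched as a LATER-style simulation: on each trial the decision signal climbs at a rate drawn from a normal distribution, and the response fires when it crosses threshold, giving RT = threshold / rate. All parameter values here are illustrative, not fitted:

```python
import random

# LATER-style rise-to-threshold stage: rate r ~ Normal(mu, sd),
# response time = threshold / r. The right-skewed RT distribution
# emerges naturally from the reciprocal of a normal rate.
def simulate_rts(mu_rate, sd_rate, threshold, n, seed=1):
    rng = random.Random(seed)
    rts = []
    while len(rts) < n:
        r = rng.gauss(mu_rate, sd_rate)
        if r > 0:  # a non-positive rate never reaches threshold
            rts.append(threshold / r)
    return rts

rts = simulate_rts(mu_rate=5.0, sd_rate=1.0, threshold=1.0, n=10000)
median_rt = sorted(rts)[len(rts) // 2]  # close to threshold / mu_rate
```

The full two-stage model would prepend the noisy-integration detector for weak stimuli; this sketch covers only the second, rise-to-threshold stage.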

  19. Modeling haul-out behavior of walruses in Bering Sea ice

    USGS Publications Warehouse

    Udevitz, M.S.; Jay, C.V.; Fischbach, Anthony S.; Garlich-Miller, J. L.

    2009-01-01

    Understanding haul-out behavior of ice-associated pinnipeds is essential for designing and interpreting population surveys and for assessing effects of potential changes in their ice environments. We used satellite-linked transmitters to obtain sequential information about location and haul-out state for Pacific walruses, Odobenus rosmarus divergens (Illiger, 1815), in the Bering Sea during April of 2004, 2005, and 2006. We used these data in a generalized mixed model of haul-out bout durations and a hierarchical Bayesian model of haul-out probabilities to assess factors related to walrus haul-out behavior, and provide the first predictive model of walrus haul-out behavior in sea ice habitat. Average haul-out bout duration was 9 h, but durations of haul-out bouts tended to increase with durations of preceding in-water bouts. On average, tagged walruses spent only about 17% of their time hauled out on sea ice. Probability of being hauled out decreased with wind speed, increased with temperature, and followed a diurnal cycle with the highest values in the evening. Our haul-out probability model can be used to estimate the proportion of the population that is unavailable for detection in spring surveys of Pacific walruses on sea ice.

  20. Accounting for imperfect detection of groups and individuals when estimating abundance.

    PubMed

    Clement, Matthew J; Converse, Sarah J; Royle, J Andrew

    2017-09-01

    If animals are independently detected during surveys, many methods exist for estimating animal abundance despite detection probabilities <1. Common estimators include double-observer models, distance sampling models and combined double-observer and distance sampling models (known as mark-recapture-distance-sampling models; MRDS). When animals reside in groups, however, the assumption of independent detection is violated. In this case, the standard approach is to account for imperfect detection of groups, while assuming that individuals within groups are detected perfectly. However, this assumption is often unsupported. We introduce an abundance estimator for grouped animals when detection of groups is imperfect and group size may be under-counted, but not over-counted. The estimator combines an MRDS model with an N-mixture model to account for imperfect detection of individuals. The new MRDS-Nmix model requires the same data as an MRDS model (independent detection histories, an estimate of distance to transect, and an estimate of group size), plus a second estimate of group size provided by the second observer. We extend the model to situations in which detection of individuals within groups declines with distance. We simulated 12 data sets and used Bayesian methods to compare the performance of the new MRDS-Nmix model to an MRDS model. Abundance estimates generated by the MRDS-Nmix model exhibited minimal bias and nominal coverage levels. In contrast, MRDS abundance estimates were biased low and exhibited poor coverage. Many species of conservation interest reside in groups and could benefit from an estimator that better accounts for imperfect detection. Furthermore, the ability to relax the assumption of perfect detection of individuals within detected groups may allow surveyors to re-allocate resources toward detection of new groups instead of extensive surveys of known groups. 
We believe the proposed estimator is feasible because the only additional field data required are a second estimate of group size.

  2. Developing and testing a landscape habitat suitability model for fisher (Martes pennanti) in forests of interior northern California

    Treesearch

    W.J. Zielinski; J. R. Dunk; J. S. Yaeger; D. W. LaPlante

    2010-01-01

    The fisher is warranted for protection under the Endangered Species Act in the western United States and, as such, it is especially important that conservation and management actions are based on sound scientific information. We developed a landscape-scale suitability model for interior northern California to predict the probability of detecting fishers and to identify...

  3. Assessing Aircraft Supply Air to Recommend Compounds for Timely Warning of Contamination

    NASA Astrophysics Data System (ADS)

    Fox, Richard B.

    Taking aircraft out of service for even one day to correct fume-in-cabin events can cost the industry roughly $630 million per year in lost revenue. This quantitative correlation study investigated relationships between measured concentrations of contaminants in bleed air and the probability of odor detectability. Data were collected from 94 aircraft engine and auxiliary power unit (APU) bleed air tests in an archival data set spanning 1997 to 2011. Pearson correlation found no relationships, but follow-up regression analysis of individual contaminants identified significant relationships between compound concentrations in bleed air and the probability of odor detectability (p<0.05), as well as between compound concentration and the probability of sensory irritancy detectability. The results may be useful for establishing early warning levels. Predictive trend monitoring, a method for identifying potential pending failure modes within a mechanical system, could make maintenance down-time a scheduled, planned event rather than a repair after mechanical failure, thereby reducing the operational costs associated with odor-in-cabin events. Twenty compounds (independent variables) were statistically significant predictors of the probability of odor detectability (dependent variable 1), and seventeen compounds were statistically significant predictors of the probability of sensory irritancy detectability (dependent variable 2). Additional research was recommended to further investigate these relationships for all turbine oil brands, and further work on predictive trend monitoring may be warranted to demonstrate how the monitoring process might be applied in flight.

  4. Hidden Semi-Markov Models and Their Application

    NASA Astrophysics Data System (ADS)

    Beyreuther, M.; Wassermann, J.

    2008-12-01

    In the framework of detection and classification of seismic signals there are several different approaches. Our choice for a more robust detection and classification algorithm is to adopt Hidden Markov Models (HMM), a technique showing major success in speech recognition. HMM provide a powerful tool to describe highly variable time series based on a doubly stochastic model and therefore allow for a broader class description than, e.g., template-based pattern matching techniques. Being a fully probabilistic model, HMM directly provide a confidence measure for an estimated classification. Furthermore, and in contrast to classic artificial neural networks or support vector machines, HMM incorporate the time dependence explicitly in the models, thus providing an adequate representation of the seismic signal. Like the majority of detection algorithms, HMM are not based on the time- and amplitude-dependent seismogram itself but on features estimated from the seismogram which characterize the different classes. Features, or in other words characteristic functions, are e.g. the sonogram bands, instantaneous frequency, instantaneous bandwidth or centroid time. In this study we apply continuous Hidden Semi-Markov Models (HSMM), an extension of continuous HMM. The duration probability of an HMM is an exponentially decaying function of time, which is not a realistic representation of the duration of an earthquake. In contrast, HSMM use Gaussians as duration probabilities, which results in a more adequate model. The HSMM detection and classification system is running online as an EARTHWORM module at the Bavarian Earthquake Service. Here the signals that are to be classified simply differ in epicentral distance. This makes it possible to easily decide whether a classification is correct or wrong and thus allows us to better evaluate the advantages and disadvantages of the proposed algorithm. 
The evaluation is based on several months of continuous data, and the results are additionally compared to the previously published discrete HMM, continuous HMM and a classic STA/LTA. The intermediate evaluation results are very promising.

  5. Application of Probability of Crack Detection to Aircraft Systems Reliability.

    DOT National Transportation Integrated Search

    1993-08-31

    This report describes three tasks related to probability of crack detection (POD) and aircraft systems reliability. All three consider previous work in which crack growth simulations and crack detection data in the Service Difficulty Report (SDR) data...

  6. Odor detection of mixtures of homologous carboxylic acids and coffee aroma compounds by humans.

    PubMed

    Miyazawa, Toshio; Gallagher, Michele; Preti, George; Wise, Paul M

    2009-11-11

    Mixture summation among homologous carboxylic acids, that is, the relationship between detection probabilities for mixtures and detection probabilities for their unmixed components, varies with similarity in carbon-chain length. The current study examined detection of acetic, butyric, hexanoic, and octanoic acids mixed with three other model odorants that differ greatly from the acids in both structure and odor character, namely, 2-hydroxy-3-methylcyclopent-2-en-1-one, furan-2-ylmethanethiol, and (3-methyl-3-sulfanylbutyl) acetate. Psychometric functions were measured for both single compounds and binary mixtures (2 of 5, forced-choice method). An air dilution olfactometer delivered stimuli, with vapor-phase calibration using gas chromatography-mass spectrometry. Across the three odorants that differed from the acids, acetic and butyric acid showed approximately additive (or perhaps even supra-additive) summation at low perithreshold concentrations, but subadditive interactions at high perithreshold concentrations. In contrast, the medium-chain acids showed subadditive interactions across a wide range of concentrations. Thus, carbon-chain length appears to influence not only summation with other carboxylic acids but also summation with at least some unrelated compounds.

  7. Daytime identification of summer hailstorm cells from MSG data

    NASA Astrophysics Data System (ADS)

    Merino, A.; López, L.; Sánchez, J. L.; García-Ortega, E.; Cattani, E.; Levizzani, V.

    2014-04-01

    Identifying deep convection is of paramount importance, as it may be associated with extreme weather phenomena that have significant impact on the environment, property and populations. A new method, the hail detection tool (HDT), is described for identifying hail-bearing storms using multispectral Meteosat Second Generation (MSG) data. HDT was conceived as a two-phase method, in which the first step is the convective mask (CM) algorithm devised for detection of deep convection, and the second a hail mask algorithm (HM) for the identification of hail-bearing clouds among cumulonimbus systems detected by CM. Both CM and HM are based on logistic regression models trained with multispectral MSG data sets comprised of summer convective events in the middle Ebro Valley (Spain) between 2006 and 2010, and detected by the RGB (red-green-blue) visualization technique (CM) or C-band weather radar system of the University of León. By means of the logistic regression approach, the probability of identifying a cumulonimbus event with CM or a hail event with HM are computed by exploiting a proper selection of MSG wavelengths or their combination. A number of cloud physical properties (liquid water path, optical thickness and effective cloud drop radius) were used to physically interpret results of statistical models from a meteorological perspective, using a method based on these "ingredients". Finally, the HDT was applied to a new validation sample consisting of events during summer 2011. The overall probability of detection was 76.9 % and the false alarm ratio 16.7 %.

  8. Estimating black bear density using DNA data from hair snares

    USGS Publications Warehouse

    Gardner, B.; Royle, J. Andrew; Wegan, M.T.; Rainbolt, R.E.; Curtis, P.D.

    2010-01-01

    DNA-based mark-recapture has become a methodological cornerstone of research focused on bear species. The objective of such studies is often to estimate population size; however, doing so is frequently complicated by movement of individual bears. Movement affects the probability of detection and the assumption of closure of the population required in most models. To mitigate the bias caused by movement of individuals, population size and density estimates are often adjusted using ad hoc methods, including buffering the minimum polygon of the trapping array. We used a hierarchical, spatial capture-recapture model that contains explicit components for the spatial-point process that governs the distribution of individuals and their exposure to (via movement), and detection by, traps. We modeled detection probability as a function of each individual's distance to the trap and an indicator variable for previous capture to account for possible behavioral responses. We applied our model to a 2006 hair-snare study of a black bear (Ursus americanus) population in northern New York, USA. Based on the microsatellite marker analysis of collected hair samples, 47 individuals were identified. We estimated mean density at 0.20 bears/km2. A positive estimate of the indicator variable suggests that bears are attracted to baited sites; therefore, including a trap-dependence covariate is important when using bait to attract individuals. Bayesian analysis of the model was implemented in WinBUGS, and we provide the model specification. The model can be applied to any spatially organized trapping array (hair snares, camera traps, mist nets, etc.) to estimate density and can also account for heterogeneity and covariate information at the trap or individual level. © The Wildlife Society.
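
The distance- and behavior-dependent detection described above is commonly modeled with a half-normal curve in the distance d from an individual's activity centre to the trap, scaled up after first capture. A minimal sketch; the parameter values are illustrative, not the study's estimates:

```python
import math

# Half-normal trap-detection curve from spatial capture-recapture:
# p(d) = p0 * exp(-d^2 / (2 * sigma^2)), where d is the distance from
# an individual's activity centre to the trap. A multiplier > 1 after
# first capture encodes a trap-happy response to baited sites.
def detection_prob(d, p0=0.3, sigma=1.5, prev_capture=False, behav=1.5):
    p = p0 * math.exp(-d ** 2 / (2 * sigma ** 2))
    return min(1.0, p * behav) if prev_capture else p

p_near = detection_prob(0.5)                       # close to the trap
p_far = detection_prob(3.0)                        # two sigma away
p_return = detection_prob(0.5, prev_capture=True)  # after first capture
```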

  9. Comparison of hoop-net trapping and visual surveys to monitor abundance of the Rio Grande cooter (Pseudemys gorzugi).

    PubMed

    Mali, Ivana; Duarte, Adam; Forstner, Michael R J

    2018-01-01

    Abundance estimates play an important part in the regulatory and conservation decision-making process. It is important to correct monitoring data for imperfect detection when using these data to track spatial and temporal variation in abundance, especially in the case of rare and elusive species. This paper presents the first attempt to estimate abundance of the Rio Grande cooter (Pseudemys gorzugi) while explicitly considering the detection process. Specifically, in 2016 we monitored this rare species at two sites along the Black River, New Mexico via traditional baited hoop-net traps and less invasive visual surveys to evaluate the efficacy of these two sampling designs. We fitted the Huggins closed-capture estimator to estimate capture probabilities using the trap data and distance sampling models to estimate detection probabilities using the visual survey data. We found that only the visual survey with the highest number of observed turtles resulted in similar abundance estimates to those estimated using the trap data. However, the estimates of abundance from the remaining visual survey data were highly variable and often underestimated abundance relative to the estimates from the trap data. We suspect this pattern is related to changes in the basking behavior of the species and, thus, the availability of turtles to be detected even though all visual surveys were conducted when environmental conditions were similar. Regardless, we found that riverine habitat conditions limited our ability to properly conduct visual surveys at one site. Collectively, this suggests visual surveys may not be an effective sample design for this species in this river system. When analyzing the trap data, we found capture probabilities to be highly variable across sites and between age classes and that recapture probabilities were much lower than initial capture probabilities, highlighting the importance of accounting for detectability when monitoring this species. Although baited hoop-net traps seem to be an effective sampling design, it is important to note that this method required a relatively high trap effort to reliably estimate abundance. This information will be useful when developing a larger-scale, long-term monitoring program for this species of concern.
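    The visual-survey analysis rests on a distance-sampling detection function. A minimal sketch of the conventional half-normal form and its average detection probability out to a truncation distance, approximated by the trapezoid rule; the values of sigma and w are illustrative, not fitted estimates:

```python
import math

def halfnormal_g(x, sigma):
    """Detection function g(x): probability of detecting an animal at
    perpendicular distance x (conventional half-normal form)."""
    return math.exp(-(x ** 2) / (2 * sigma ** 2))

def average_detection_prob(sigma, w, n=1000):
    """Mean detection probability within truncation distance w,
    i.e. (1/w) * integral of g(x) from 0 to w, via the trapezoid rule."""
    h = w / n
    total = 0.5 * (halfnormal_g(0.0, sigma) + halfnormal_g(w, sigma))
    total += sum(halfnormal_g(i * h, sigma) for i in range(1, n))
    return total * h / w

p_bar = average_detection_prob(sigma=10.0, w=30.0)
```

    A smaller sigma (e.g. turtles less available because of changed basking behavior) drives the average detection probability down, which is consistent with the variable, often low visual-survey estimates reported above.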

  10. The 1996 Leonid shower as studied with a potassium lidar: Observations and inferred meteoroid sizes

    NASA Astrophysics Data System (ADS)

    Höffner, Josef; von Zahn, Ulf; McNeil, William J.; Murad, Edmond

    1999-02-01

    We report on the observation and analysis of meteor trails that are detected by a ground-based lidar tuned to the D1 fine structure line of K. The lidar is located at Kühlungsborn, Germany. The echo profiles are analyzed with a temporal resolution of about 1 s and an altitude resolution of 200 m. Identification of meteor trails in the large archive of raw data is performed with the help of an automated computer search code. During the peak of the Leonid meteor shower on the morning of November 17, 1996, we observed seven meteor trails between 0245 and 0445 UT. Their mean altitude was 89.0 km. The duration of observation of individual trails ranges from 3 s to ~30 min. We model the probability of observing a meteor trail by ground-based lidar as a function of both altitude distribution and duration of the trails. These distributions depend on the mass distribution, entry velocity, and entry angle of the meteoroids, on the altitude-dependent chemical and dynamical lifetimes of the released K atoms, and on the absolute detection sensitivity of our lidar experiment. From the modeling, we derive the statistical likelihood of detection of trails from meteoroids of a particular size. These bracket quite well the observed trails. The model also gives estimates of the probable size of the meteoroids based on characteristics of individual trails.

  11. Interference Information Based Power Control for Cognitive Radio with Multi-Hop Cooperative Sensing

    NASA Astrophysics Data System (ADS)

    Yu, Youngjin; Murata, Hidekazu; Yamamoto, Koji; Yoshida, Susumu

    Reliable detection of other radio systems is crucial for systems that share the same frequency band. In wireless communication channels, there is uncertainty in the received signal level due to multipath fading and shadowing. Cooperative sensing techniques in which radio stations share their sensing information can improve the detection probability of other systems. In this paper, a new cooperative sensing scheme that reduces the false detection probability while maintaining the outage probability of other systems is investigated. In the proposed system, sensing information is collected using multi-hop transmission from all sensing stations that detect other systems, and transmission decisions are based on the received sensing information. The proposed system also controls the transmit power based on the received CINRs from the sensing stations. Simulation results reveal that the proposed system can reduce the outage probability of other systems, or improve its link success probability.
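    The gain from sharing sensing information can be sketched with a simple k-out-of-n fusion rule, assuming independent stations with equal per-station detection probability; this is an idealisation for illustration, not the paper's multi-hop protocol or power-control scheme:

```python
from math import comb

def fusion_detection_prob(p, n, k):
    """Probability that at least k of n sensing stations detect the
    other system, for independent stations with equal per-station
    detection probability p (binomial tail sum)."""
    return sum(comb(n, i) * p ** i * (1 - p) ** (n - i) for i in range(k, n + 1))

# OR rule (k = 1) versus a stricter 2-out-of-4 rule, with hypothetical p = 0.6:
p_or = fusion_detection_prob(0.6, n=4, k=1)
p_two = fusion_detection_prob(0.6, n=4, k=2)
```

    The OR rule maximises detection probability but is most prone to false detections; requiring more confirming stations trades detection probability against the false detection probability the proposed scheme aims to reduce.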

  12. Improved detection probability of low level light and infrared image fusion system

    NASA Astrophysics Data System (ADS)

    Luo, Yuxiang; Fu, Rongguo; Zhang, Junju; Wang, Wencong; Chang, Benkang

    2018-02-01

    Low level light (LLL) images contain rich information on environmental details but are easily affected by the weather: in smoke, rain, cloud or fog, much target information is lost. Infrared imaging, which relies on radiation emitted by the object itself, can "actively" obtain target information in the scene. However, its contrast and resolution are poor, it captures few target details, and the imaging mode does not conform to human visual habits. Fusing LLL and infrared images can compensate for the deficiencies of each sensor while retaining the advantages of both. We first present the hardware design of the fusion circuit. Then, by computing recognition probabilities for a target (one person) and the background (trees), we find that the detection probability for trees is higher in the LLL image than in the infrared image, while the detection probability for the person is markedly higher in the infrared image than in the LLL image. The fusion image yields higher detection probabilities for both the person and the trees than either single detector. Image fusion can therefore significantly increase recognition probability and improve detection efficiency.

  13. Species traits and catchment-scale habitat factors influence the occurrence of freshwater mussel populations and assemblages

    USGS Publications Warehouse

    Pandolfo, Tamara J.; Kwak, Thomas J.; Cope, W. Gregory; Heise, Ryan J.; Nichols, Robert B.; Pacifici, Krishna

    2016-01-01

    Conservation of freshwater unionid mussels presents unique challenges due to their distinctive life cycle, cryptic occurrence and imperilled status. Relevant ecological information is urgently needed to guide their management and conservation. We adopted a modelling approach, which is a novel application to freshwater mussels to enhance inference on rare species, by borrowing data among species in a hierarchical framework to conduct the most comprehensive occurrence analysis for freshwater mussels to date. We incorporated imperfect detection to more accurately examine effects of biotic and abiotic factors at multiple scales on the occurrence of 14 mussel species and the entire assemblage of the Tar River Basin of North Carolina, U.S.A. The single assemblage estimate of detection probability for all species was 0.42 (95% CI, 0.36–0.47) with no species- or site-specific detection effects identified. We empirically observed 15 mussel species in the basin but estimated total species richness at 21 (95% CI, 16–24) when accounting for imperfect detection. Mean occurrence probability among species ranged from 0.04 (95% CI, 0.01–0.16) for Alasmidonta undulata, an undescribed Lampsilis sp., and Strophitus undulatus to 0.67 (95% CI, 0.42–0.86) for Elliptio icterina. Median occurrence probability among sites was <0.30 for all species with the exception of E. icterina. Site occurrence probability generally related to mussel conservation status, with reduced occurrence for endangered and threatened species. Catchment-scale abiotic variables (stream power, agricultural land use) and species traits (brood time, host specificity, tribe) influenced the occurrence of mussel assemblages more than reach- or microhabitat-scale features. Our findings reflect the complexity of mussel ecology and indicate that habitat restoration alone may not be adequate for mussel conservation. Catchment-scale management can benefit an entire assemblage, but species-specific strategies may be necessary for successful conservation. The hierarchical multispecies modelling approach revealed findings that could not be elucidated by other means, and the approach may be applied more broadly to other river basins and regions. Accurate measures of assemblage dynamics, such as occurrence and species richness, are required to create management plans for effective conservation.
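    The imperfect-detection kernel underlying this kind of occurrence analysis is a zero-inflated binomial likelihood. A generic single-season sketch (the study's hierarchical multispecies model adds species- and site-level structure on top of this kernel; psi, p, and the visit counts below are illustrative):

```python
from math import comb

def occupancy_likelihood(psi, p, y, K):
    """Likelihood of detecting a species y times in K visits to one site
    under the standard single-season occupancy model: occupied (prob psi)
    with binomial detections, or unoccupied and never detected."""
    binom = comb(K, y) * p ** y * (1 - p) ** (K - y)
    if y == 0:
        # Never detected: either occupied but missed, or truly absent.
        return psi * binom + (1 - psi)
    return psi * binom

lik_missed = occupancy_likelihood(psi=0.5, p=0.42, y=0, K=4)
lik_seen = occupancy_likelihood(psi=0.5, p=0.42, y=2, K=4)
```

    The zero-detection term is what lets the model estimate species richness above the empirically observed count, as in the 15-observed versus 21-estimated result above.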

  14. Spatial Probability Dynamically Modulates Visual Target Detection in Chickens

    PubMed Central

    Sridharan, Devarajan; Ramamurthy, Deepa L.; Knudsen, Eric I.

    2013-01-01

    The natural world contains a rich and ever-changing landscape of sensory information. To survive, an organism must be able to flexibly and rapidly locate the most relevant sources of information at any time. Humans and non-human primates exploit regularities in the spatial distribution of relevant stimuli (targets) to improve detection at locations of high target probability. Is the ability to flexibly modify behavior based on visual experience unique to primates? Chickens (Gallus domesticus) were trained on a multiple alternative Go/NoGo task to detect a small, briefly-flashed dot (target) in each of the quadrants of the visual field. When targets were presented with equal probability (25%) in each quadrant, chickens exhibited a distinct advantage for detecting targets at lower, relative to upper, hemifield locations. Increasing the probability of presentation in the upper hemifield locations (to 80%) dramatically improved detection performance at these locations to be on par with lower hemifield performance. Finally, detection performance in the upper hemifield changed on a rapid timescale, improving with successive target detections, and declining with successive detections at the diagonally opposite location in the lower hemifield. These data indicate the action of a process that in chickens, as in primates, flexibly and dynamically modulates detection performance based on the spatial probabilities of sensory stimuli as well as on recent performance history. PMID:23734188

  15. Sampling stored product insect pests: a comparison of four statistical sampling models for probability of pest detection

    USDA-ARS?s Scientific Manuscript database

    Statistically robust sampling strategies form an integral component of grain storage and handling activities throughout the world. Developing sampling strategies to target biological pests such as insects in stored grain is inherently difficult due to species biology and behavioral characteristics. ...

  16. A Managed Care Model for the Military Departments

    DTIC Science & Technology

    1990-05-15

    Seattle reports that a widely used technique for electronic monitoring of a fetal heart during delivery is "no more effective than a stethoscope in detecting fetal distress." The study director went on to say that obstetricians will probably continue to use the technique to avoid possible

  17. Trap configuration and spacing influences parameter estimates in spatial capture-recapture models

    USGS Publications Warehouse

    Sun, Catherine C.; Fuller, Angela K.; Royle, J. Andrew

    2014-01-01

    An increasing number of studies employ spatial capture-recapture models to estimate population size, but there has been limited research on how different spatial sampling designs and trap configurations influence parameter estimators. Spatial capture-recapture models provide an advantage over non-spatial models by explicitly accounting for heterogeneous detection probabilities among individuals that arise due to the spatial organization of individuals relative to sampling devices. We simulated black bear (Ursus americanus) populations and spatial capture-recapture data to evaluate the influence of trap configuration and trap spacing on estimates of population size and a spatial scale parameter, sigma, that relates to home range size. We varied detection probability and home range size, and considered three trap configurations common to large-mammal mark-recapture studies: regular spacing, clustered, and a temporal sequence of different cluster configurations (i.e., trap relocation). We explored trap spacing and number of traps per cluster by varying the number of traps. The clustered arrangement performed well when detection rates were low, and provides for easier field implementation than the sequential trap arrangement. However, performance differences between trap configurations diminished as home range size increased. Our simulations suggest it is important to consider trap spacing relative to home range sizes, with traps ideally spaced no more than twice the spatial scale parameter. While spatial capture-recapture models can accommodate different sampling designs and still estimate parameters with accuracy and precision, our simulations demonstrate that aspects of sampling design, namely trap configuration and spacing, must consider study area size, ranges of individual movement, and home range sizes in the study population.
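    The simulation design can be sketched by drawing activity centres on a landscape and generating detections through a half-normal distance function over a regular trap grid; all settings below are illustrative, not the paper's simulation parameters:

```python
import math
import random

def simulate_scr_counts(n_bears, traps, p0, sigma, n_occasions, seed=1):
    """Simulate spatial capture-recapture detections: each bear has an
    activity centre in a 10 x 10 landscape, and its per-occasion detection
    probability at a trap falls off as a half-normal in distance."""
    rng = random.Random(seed)
    centres = [(rng.uniform(0, 10), rng.uniform(0, 10)) for _ in range(n_bears)]
    detections = 0
    for cx, cy in centres:
        for tx, ty in traps:
            d2 = (cx - tx) ** 2 + (cy - ty) ** 2
            p = p0 * math.exp(-d2 / (2 * sigma ** 2))
            for _ in range(n_occasions):
                if rng.random() < p:
                    detections += 1
    return detections

# Regularly spaced 4x4 trap grid with 2-unit spacing:
grid = [(2.0 * i + 1.5, 2.0 * j + 1.5) for i in range(4) for j in range(4)]
n_det = simulate_scr_counts(n_bears=30, traps=grid, p0=0.2, sigma=1.0, n_occasions=5)
```

    Rerunning such simulations while varying the grid spacing relative to sigma is how the guideline above (traps spaced no more than about twice the spatial scale parameter) can be evaluated.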

  18. Detection and plant monitoring programs: lessons from an intensive survey of Asclepias meadii with five observers.

    PubMed

    Alexander, Helen M; Reed, Aaron W; Kettle, W Dean; Slade, Norman A; Bodbyl Roels, Sarah A; Collins, Cathy D; Salisbury, Vaughn

    2012-01-01

    Monitoring programs, where numbers of individuals are followed through time, are central to conservation. Although incomplete detection is expected with wildlife surveys, this topic is rarely considered with plants. However, if plants are missed in surveys, raw count data can lead to biased estimates of population abundance and vital rates. To illustrate, we had five independent observers survey patches of the rare plant Asclepias meadii at two prairie sites. We analyzed data with two mark-recapture approaches. Using the program CAPTURE, the estimated number of patches equaled the detected number for a burned site, but exceeded detected numbers by 28% for an unburned site. Analyses of detected patches using Huggins models revealed important effects of observer, patch state (flowering/nonflowering), and patch size (number of stems) on probabilities of detection. Although some results were expected (e.g. greater detection of flowering than nonflowering patches), the importance of our approach is the ability to quantify the magnitude of detection problems. We also evaluated the degree to which increased observer numbers improved detection: smaller groups (3-4 observers) generally found 90-99% of the patches found by all five people, but pairs of observers or single observers had high error and detection depended on which individuals were involved. We conclude that an intensive study at the start of a long-term monitoring study provides essential information about probabilities of detection and what factors cause plants to be missed. This information can guide development of monitoring programs.
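    Under the simplifying assumption of independent observers with a common detection probability, the benefit of additional observers follows directly; the per-observer probability below is hypothetical, and the study shows detection also varied by observer identity, patch state, and patch size:

```python
def prob_found_by_any(p_single, n_observers):
    """Probability a plant patch is found by at least one of n independent
    observers, each with single-observer detection probability p_single."""
    return 1.0 - (1.0 - p_single) ** n_observers

# With a hypothetical per-observer detection probability of 0.6:
coverage = {n: prob_found_by_any(0.6, n) for n in (1, 2, 3, 5)}
```

    With p = 0.6, three observers already find about 94% of detectable patches while a single observer finds 60%, mirroring the pattern that groups of 3-4 observers approached the five-observer total but singles and pairs did not.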

  19. Application of artificial neural network to search for gravitational-wave signals associated with short gamma-ray bursts

    NASA Astrophysics Data System (ADS)

    Kim, Kyungmin; Harry, Ian W.; Hodge, Kari A.; Kim, Young-Min; Lee, Chang-Hwan; Lee, Hyun Kyu; Oh, John J.; Oh, Sang Hoon; Son, Edwin J.

    2015-12-01

    We apply a machine learning algorithm, the artificial neural network, to the search for gravitational-wave signals associated with short gamma-ray bursts (GRBs). The multi-dimensional samples consisting of data corresponding to the statistical and physical quantities from the coherent search pipeline are fed into the artificial neural network to distinguish simulated gravitational-wave signals from background noise artifacts. Our result shows that the data classification efficiency at a fixed false alarm probability (FAP) is improved by the artificial neural network in comparison to the conventional detection statistic. Specifically, the distance at 50% detection probability at a fixed false positive rate is increased about 8%-14% for the considered waveform models. We also evaluate a few seconds of the gravitational-wave data segment using the trained networks and obtain the FAP. We suggest that the artificial neural network can be a complementary method to the conventional detection statistic for identifying gravitational-wave signals related to the short GRBs.

  20. Assessing bat detectability and occupancy with multiple automated echolocation detectors

    Treesearch

    Marcos P. Gorresen; Adam C. Miles; Christopher M. Todd; Frank J. Bonaccorso; Theodore J. Weller

    2008-01-01

    Occupancy analysis and its ability to account for differential detection probabilities is important for studies in which detecting echolocation calls is used as a measure of bat occurrence and activity. We examined the feasibility of remotely acquiring bat encounter histories to estimate detection probability and occupancy. We used echolocation detectors coupled o...

  1. Image Discrimination Models Predict Object Detection in Natural Backgrounds

    NASA Technical Reports Server (NTRS)

    Ahumada, Albert J., Jr.; Rohaly, A. M.; Watson, Andrew B.; Null, Cynthia H. (Technical Monitor)

    1994-01-01

    Object detection involves looking for one of a large set of object sub-images in a large set of background images. Image discrimination models only predict the probability that an observer will detect a difference between two images. In a recent study based on only six different images, we found that discrimination models can predict the relative detectability of objects in those images, suggesting that these simpler models may be useful in some object detection applications. Here we replicate this result using a new, larger set of images. Fifteen images of a vehicle in an otherwise natural setting were altered to remove the vehicle and mixed with the original image in a proportion chosen to make the target neither perfectly recognizable nor unrecognizable. The target was also rotated about a vertical axis through its center and mixed with the background. Sixteen observers rated these 30 target images and the 15 background-only images for the presence of a vehicle. The likelihoods of the observer responses were computed from a Thurstone scaling model with the assumption that the detectabilities are proportional to the predictions of an image discrimination model. Three image discrimination models were used: a cortex transform model, a single channel model with a contrast sensitivity function filter, and the Root-Mean-Square (RMS) difference of the digital target and background-only images. As in the previous study, the cortex transform model performed best; the RMS difference predictor was second best; and last, but still a reasonable predictor, was the single channel model. Image discrimination models can predict the relative detectabilities of objects in natural backgrounds.
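    Of the three predictors compared, the RMS difference is the simplest: the root-mean-square pixel difference between the target-present and background-only images. A sketch with made-up pixel values:

```python
import math

def rms_difference(target_img, background_img):
    """Root-mean-square difference between two digital images, given as
    flat pixel lists of equal length (values here are illustrative)."""
    n = len(target_img)
    return math.sqrt(sum((t - b) ** 2 for t, b in zip(target_img, background_img)) / n)

# A target that changes one pixel strongly against the background:
d = rms_difference([10, 12, 14, 200], [10, 12, 14, 20])
```

    Under the Thurstone scaling assumption described above, a predictor like this is taken as proportional to detectability, so only relative values across images matter.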

  2. Statistically qualified neuro-analytic failure detection method and system

    DOEpatents

    Vilim, Richard B.; Garcia, Humberto E.; Chen, Frederick W.

    2002-03-02

    An apparatus and method for monitoring a process involve development and application of a statistically qualified neuro-analytic (SQNA) model to accurately and reliably identify process change. The development of the SQNA model is accomplished in two stages: deterministic model adaptation and stochastic model modification of the deterministic model adaptation. Deterministic model adaptation involves formulating an analytic model of the process representing known process characteristics, augmenting the analytic model with a neural network that captures unknown process characteristics, and training the resulting neuro-analytic model by adjusting the neural network weights according to a unique scaled equation error minimization technique. Stochastic model modification involves qualifying any remaining uncertainty in the trained neuro-analytic model by formulating a likelihood function, given an error propagation equation, for computing the probability that the neuro-analytic model generates measured process output. Preferably, the developed SQNA model is validated using known sequential probability ratio tests and applied to the process as an on-line monitoring system. Illustrative of the method and apparatus, the method is applied to a peristaltic pump system.

  3. Point count length and detection of forest neotropical migrant birds

    USGS Publications Warehouse

    Dawson, D.K.; Smith, D.R.; Robbins, C.S.; Ralph, C. John; Sauer, John R.; Droege, Sam

    1995-01-01

    Comparisons of bird abundances among years or among habitats assume that the rates at which birds are detected and counted are constant within species. We use point count data collected in forests of the Mid-Atlantic states to estimate detection probabilities for Neotropical migrant bird species as a function of count length. For some species, significant differences existed among years or observers in both the probability of detecting the species and in the rate at which individuals are counted. We demonstrate the consequence that variability in species' detection probabilities can have on estimates of population change, and discuss ways for reducing this source of bias in point count studies.
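    One common way to model how detection probability grows with count length is to treat detections (e.g. songs) as a Poisson process with a constant per-minute rate, as in removal-type models; the rates below are hypothetical, not estimates from the Mid-Atlantic data:

```python
import math

def detection_prob(count_minutes, rate_per_min):
    """Probability a bird present at the point is detected at least once
    during a count of the given length, assuming detections occur as a
    Poisson process with a constant per-minute rate."""
    return 1.0 - math.exp(-rate_per_min * count_minutes)

p_3min = detection_prob(3, rate_per_min=0.3)
p_10min = detection_prob(10, rate_per_min=0.3)
```

    Because the curve saturates, differences among species, years, or observers in the per-minute rate translate into count-length-dependent biases, which is the source of bias the paper quantifies.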

  4. Risk assessment of turbine rotor failure using probabilistic ultrasonic non-destructive evaluations

    NASA Astrophysics Data System (ADS)

    Guan, Xuefei; Zhang, Jingdan; Zhou, S. Kevin; Rasselkorde, El Mahjoub; Abbasi, Waheed A.

    2014-02-01

    The study presents a method and application of risk assessment methodology for turbine rotor fatigue failure using probabilistic ultrasonic nondestructive evaluations. A rigorous probabilistic modeling for ultrasonic flaw sizing is developed by incorporating the model-assisted probability of detection, and the probability density function (PDF) of the actual flaw size is derived. Two general scenarios, namely the ultrasonic inspection with an identified flaw indication and the ultrasonic inspection without flaw indication, are considered in the derivation. To perform estimations for fatigue reliability and remaining useful life, uncertainties from ultrasonic flaw sizing and fatigue model parameters are systematically included and quantified. The model parameter PDF is estimated using Bayesian parameter estimation and actual fatigue testing data. The overall method is demonstrated using a realistic application of steam turbine rotor, and the risk analysis under given safety criteria is provided to support maintenance planning.
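    Probability-of-detection curves in NDE are often parameterised as a log-logistic function of flaw size. A sketch with illustrative parameters (the paper's model-assisted POD and flaw-size PDF derivation are more elaborate):

```python
import math

def pod(a, mu, sigma):
    """Log-logistic probability-of-detection curve common in NDE:
    POD(a) = 1 / (1 + exp(-(ln a - mu) / sigma)) for flaw size a > 0.
    mu and sigma here are placeholders, not the paper's values."""
    return 1.0 / (1.0 + math.exp(-(math.log(a) - mu) / sigma))

# The flaw size detected with 50% probability (a50) is exp(mu).
p_small = pod(0.5, mu=0.0, sigma=0.4)
p_a50 = pod(1.0, mu=0.0, sigma=0.4)
p_large = pod(3.0, mu=0.0, sigma=0.4)
```

    Combining such a POD curve with the measurement model is what yields the posterior PDF of actual flaw size in the two inspection scenarios (indication found versus no indication).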

  5. Integrated Data Analysis for Fusion: A Bayesian Tutorial for Fusion Diagnosticians

    NASA Astrophysics Data System (ADS)

    Dinklage, Andreas; Dreier, Heiko; Fischer, Rainer; Gori, Silvio; Preuss, Roland; Toussaint, Udo von

    2008-03-01

    Integrated Data Analysis (IDA) offers a unified way of combining information relevant to fusion experiments, and thereby addresses typical issues arising in fusion data analysis. In IDA, all information is consistently formulated as probability density functions quantifying uncertainties in the analysis within Bayesian probability theory. For a single diagnostic, IDA allows the identification of faulty measurements and improvements in the setup. For a set of diagnostics, IDA gives joint error distributions, allowing the comparison and integration of results from different diagnostics. Validation of physics models can be performed by model comparison techniques. Typical data analysis applications benefit from IDA's capabilities for nonlinear error propagation, the inclusion of systematic effects and the comparison of different physics models. Applications range from outlier detection and background discrimination to model assessment and the design of diagnostics. In order to cope with next-step fusion device requirements, appropriate techniques are explored for fast analysis applications.

  6. A goodness-of-fit test for capture-recapture model M(t) under closure

    USGS Publications Warehouse

    Stanley, T.R.; Burnham, K.P.

    1999-01-01

    A new, fully efficient goodness-of-fit test for the time-specific closed-population capture-recapture model M(t) is presented. This test is based on the residual distribution of the capture history data given the maximum likelihood parameter estimates under model M(t), is partitioned into informative components, and is based on chi-square statistics. Comparison of this test with Leslie's test (Leslie, 1958, Journal of Animal Ecology 27, 84-86) for model M(t), using Monte Carlo simulations, shows the new test generally outperforms Leslie's test. The new test is frequently computable when Leslie's test is not, has Type I error rates that are closer to nominal error rates than Leslie's test, and is sensitive to behavioral variation and heterogeneity in capture probabilities. Leslie's test is not sensitive to behavioral variation in capture probabilities but, when computable, has greater power to detect heterogeneity than the new test.
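    The chi-square building block of such goodness-of-fit components can be sketched generically; this is the plain Pearson statistic only, not the paper's exact partitioning, and the counts below are made up:

```python
def pearson_chisq(observed, expected):
    """Pearson chi-square statistic: sum of (O - E)^2 / E over cells."""
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

# Hypothetical captures per occasion versus expected counts under M(t)
# (expected = N_hat * p_t for each occasion t):
observed = [18, 25, 30, 27]
expected = [20.0, 24.0, 28.0, 28.0]
x2 = pearson_chisq(observed, expected)
```

    In the test described above, such components are computed from the residual distribution of the capture histories given the M(t) maximum likelihood estimates and then compared against chi-square reference distributions.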

  7. Method for detecting and avoiding flight hazards

    NASA Astrophysics Data System (ADS)

    von Viebahn, Harro; Schiefele, Jens

    1997-06-01

    Today's aircraft equipment comprises several independent warning and hazard avoidance systems such as GPWS, TCAS or weather radar. It is the pilot's task to monitor all these systems and take the appropriate action in case of an emerging hazardous situation. The developed method for detecting and avoiding flight hazards combines all potential external threats to an aircraft into a single system. It is based on a model of the airspace surrounding the aircraft, consisting of discrete volume elements. For each volume element, the threat probability is derived or computed from sensor output, databases, or information provided via datalink. The position of the own aircraft is predicted by utilizing a probability distribution. This approach ensures that all potential positions of the aircraft within the near future are considered while weighting the most likely flight path. A conflict detection algorithm initiates an alarm in case the threat probability exceeds a threshold. An escape manoeuvre is generated taking into account all potential hazards in the vicinity, not only the one which caused the alarm. The pilot receives visual information about the type, location, and severity of the threat. The algorithm was implemented and tested in a flight simulator environment. The current version comprises traffic, terrain and obstacle hazard avoidance functions. Its general formulation allows an easy integration of e.g. weather information or airspace restrictions.

  8. Lidar detection algorithm for time and range anomalies.

    PubMed

    Ben-David, Avishai; Davidson, Charles E; Vanderbeek, Richard G

    2007-10-10

    A new detection algorithm for lidar applications has been developed. The detection is based on hyperspectral anomaly detection that is implemented for time anomaly where the question "is a target (aerosol cloud) present at range R within time t(1) to t(2)" is addressed, and for range anomaly where the question "is a target present at time t within ranges R(1) and R(2)" is addressed. A detection score significantly different in magnitude from the detection scores for background measurements suggests that an anomaly (interpreted as the presence of a target signal in space/time) exists. The algorithm employs an option for a preprocessing stage where undesired oscillations and artifacts are filtered out with a low-rank orthogonal projection technique. The filtering technique adaptively removes the one over range-squared dependence of the background contribution of the lidar signal and also aids visualization of features in the data when the signal-to-noise ratio is low. A Gaussian-mixture probability model for two hypotheses (anomaly present or absent) is computed with an expectation-maximization algorithm to produce a detection threshold and probabilities of detection and false alarm. Results of the algorithm for CO(2) lidar measurements of bioaerosol clouds Bacillus atrophaeus (formerly known as Bacillus subtilis niger, BG) and Pantoea agglomerans, Pa (formerly known as Erwinia herbicola, Eh) are shown and discussed.
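    A single-Gaussian simplification of the two-hypothesis detection problem illustrates how a score threshold maps to probabilities of detection and false alarm (the paper fits a Gaussian-mixture model via expectation-maximization; the means and variances below are illustrative):

```python
import math

def normal_cdf(x, mu, sigma):
    """Gaussian CDF via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def pd_pfa(threshold, mu0, sigma0, mu1, sigma1):
    """Detection and false-alarm probabilities for a scalar detection
    score with Gaussian class-conditional densities: H0 = anomaly absent
    (mean mu0), H1 = anomaly present (mean mu1 > mu0)."""
    p_fa = 1.0 - normal_cdf(threshold, mu0, sigma0)
    p_d = 1.0 - normal_cdf(threshold, mu1, sigma1)
    return p_d, p_fa

p_d, p_fa = pd_pfa(threshold=2.0, mu0=0.0, sigma0=1.0, mu1=4.0, sigma1=1.0)
```

    Sweeping the threshold traces out the trade-off between detection and false alarm; a detection score "significantly different in magnitude" from the background scores corresponds to a high-threshold operating point on this curve.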

  9. Multiparametric Detection of Antibodies against Different EBV Antigens to Predict Risk for Nasopharyngeal Carcinoma in a High-Risk Population of China.

    PubMed

    Chen, Hao; Chen, Shulin; Lu, Jie; Wang, Xueping; Li, Jianpei; Li, Linfang; Fu, Jihuan; Scheper, Thomas; Meyer, Wolfgang; Peng, Yu-Hui; Liu, Wanli

    2017-09-01

    In this study, we aimed to use the combined detection of multiple antibodies against Epstein-Barr virus (EBV) antigens to develop a model for screening and diagnosis of nasopharyngeal carcinoma (NPC). Samples of 300 nasopharyngeal carcinoma patients and 494 controls, including 294 healthy subjects (HC), 99 patients with cancers other than nasopharyngeal carcinoma (NNPC), and 101 patients with benign nasopharyngeal lesions (BNL), were incubated with the EUROLINE Anti-EBV Profile 2, and band intensities were used to establish a risk prediction model. The nasopharyngeal carcinoma risk probability analysis based on the panel of VCAgp125 IgA, EBNA-1 IgA, EA-D IgA, EBNA-1 IgG, EA-D IgG, and VCAp19 IgG displayed the best performance. When using 26.1% as the cutoff point in ROC analysis, the AUC value and sensitivity/specificity were 0.951 and 90.7%/86.2%, respectively, in nasopharyngeal carcinoma and all controls. In nasopharyngeal carcinoma and controls excluding the NNPC and BNL groups, the AUC value and sensitivity/specificity were 0.957 and 90.7%/88.1%, respectively. The diagnostic specificity and sensitivity of the EUROLINE Anti-EBV Profile 2 assay for both nasopharyngeal carcinoma and early-stage nasopharyngeal carcinoma were higher than those of mono-antibody detection by immuno-enzymatic assay and real-time PCR (EBV DNA). In the VCA-IgA-negative group, 82.6% of nasopharyngeal carcinoma patients showed high probability for nasopharyngeal carcinoma, and the negative predictive value was 97.1%. In the VCA-IgA-positive group, 73.3% of healthy subjects showed low probability. The positive predictive value reached 98.2% in this group. The nasopharyngeal carcinoma risk probability value determined by the EUROLINE Anti-EBV Profile 2 might be a suitable tool for nasopharyngeal carcinoma screening. Cancer Prev Res; 10(9); 542-50. ©2017 AACR.
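    The sensitivity/specificity reported at the 26.1% cutoff can be illustrated with a small helper; the risk-probability scores below are made up for illustration, not EUROLINE band intensities or study data:

```python
def sensitivity_specificity(scores_pos, scores_neg, cutoff):
    """Sensitivity and specificity of a risk-probability score at a given
    cutoff; scores at or above the cutoff are called positive."""
    tp = sum(s >= cutoff for s in scores_pos)   # true positives
    tn = sum(s < cutoff for s in scores_neg)    # true negatives
    return tp / len(scores_pos), tn / len(scores_neg)

npc_scores = [0.9, 0.8, 0.2, 0.7, 0.95]       # hypothetical NPC patients
control_scores = [0.1, 0.2, 0.3, 0.05, 0.4]   # hypothetical controls
sens, spec = sensitivity_specificity(npc_scores, control_scores, cutoff=0.261)
```

    An ROC analysis like the one above repeats this calculation over all cutoffs and picks the operating point (here 26.1%) that best balances sensitivity against specificity.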

  10. Rare targets are less susceptible to attention capture once detection has begun.

    PubMed

    Hon, Nicholas; Ng, Gavin; Chan, Gerald

    2016-04-01

    Rare or low probability targets are detected more slowly and/or less accurately than higher probability counterparts. Various proposals have implicated perceptual and response-based processes in this deficit. Recent evidence, however, suggests that it is attentional in nature, with low probability targets requiring more attentional resources than high probability ones to detect. This difference in attentional requirements, in turn, suggests the possibility that low and high probability targets may have different susceptibilities to attention capture, which is also known to be resource-dependent. Supporting this hypothesis, we found that, once attentional resources have begun to be engaged by detection processes, low, but not high, probability targets have a reduced susceptibility to capture. Our findings speak to several issues. First, they indicate that the likelihood of attention capture occurring when a given task-relevant stimulus is being processed is dependent, to some extent, on how said stimulus is represented within mental task sets. Second, they provide added support for the idea that the behavioural deficit associated with low probability targets is attention-based. Finally, the current data point to reduced top-down biasing of target templates as a likely mechanism underlying the attentional locus of the deficit in question.

  11. Generative adversarial network based telecom fraud detection at the receiving bank.

    PubMed

    Zheng, Yu-Jun; Zhou, Xiao-Han; Sheng, Wei-Guo; Xue, Yu; Chen, Sheng-Yong

    2018-06-01

    Recently, telecom fraud has become a serious problem, especially in developing countries such as China. At present, it can be very difficult to coordinate different agencies to prevent fraud completely. In this paper, we study how to detect, at the receiving bank, large transfers sent by victims deceived by fraudsters. We propose a new generative adversarial network (GAN) based model to calculate for each large transfer a probability that it is fraudulent, such that the bank can take appropriate measures to prevent potential fraudsters from taking the money if the probability exceeds a threshold. The inference model uses a deep denoising autoencoder to effectively learn the complex probabilistic relationship among the input features, and employs adversarial training that establishes a minimax game between a discriminator and a generator to accurately discriminate between positive and negative samples in the data distribution. We show that the model outperforms a set of well-known classification methods in experiments, and its application in two commercial banks has reduced losses by about 10 million RMB in twelve weeks and significantly improved their business reputation. Copyright © 2018 Elsevier Ltd. All rights reserved.

  12. Patch-occupancy models indicate human activity as major determinant of forest elephant Loxodonta cyclotis seasonal distribution in an industrial corridor in Gabon

    USGS Publications Warehouse

    Buij, R.; McShea, W.J.; Campbell, P.; Lee, M.E.; Dallmeier, F.; Guimondou, S.; Mackaga, L.; Guisseougou, N.; Mboumba, S.; Hines, J.E.; Nichols, J.D.; Alonso, A.

    2007-01-01

    The importance of human activity and ecological features in influencing African forest elephant ranging behaviour was investigated in the Rabi-Ndogo corridor of the Gamba Complex of Protected Areas in southwest Gabon. Locations in a wide geographical area with a range of environmental variables were selected for patch-occupancy surveys using elephant dung to assess seasonal presence and absence of elephants. Patch-occupancy procedures allowed for covariate modelling evaluating hypotheses for both occupancy in relation to human activity and ecological features, and detection probability in relation to vegetation density. The best-fitting models for old and fresh dung data sets indicate that (1) detection probability for elephant dung is negatively related to the relative density of the vegetation, and (2) human activities, such as human presence and infrastructure, are more closely associated with elephant distribution patterns than are ecological features, such as the presence of wetlands and preferred fresh fruit. Our findings emphasize the sensitivity of elephants to human disturbance, in this case infrastructure development associated with gas and oil production. Patch-occupancy methodology offers a viable alternative to current transect protocols for monitoring programs with multiple covariates.
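
    The detection-corrected occupancy models used in records like this one reduce, in their simplest covariate-free form, to a zero-inflated binomial likelihood (as in the site-occupancy literature). A minimal sketch with simulated data, constant occupancy ψ and detection p, and arbitrarily chosen true values:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import comb

def occupancy_nll(params, detections, n_surveys):
    """Negative log-likelihood of the basic site-occupancy model:
    psi = occupancy probability, p = per-survey detection probability."""
    psi, p = params
    d = np.asarray(detections)
    binom = comb(n_surveys, d) * p**d * (1 - p)**(n_surveys - d)
    lik = psi * binom + (d == 0) * (1 - psi)   # zero-inflated binomial
    return -np.sum(np.log(lik))

# simulated detection histories: 40 sites, 5 surveys each
rng = np.random.default_rng(1)
true_psi, true_p, K = 0.6, 0.4, 5
occupied = rng.random(40) < true_psi
dets = np.where(occupied, rng.binomial(K, true_p, 40), 0)

fit = minimize(occupancy_nll, x0=[0.5, 0.5], args=(dets, K),
               bounds=[(1e-6, 1 - 1e-6)] * 2)
psi_hat, p_hat = fit.x
```

    Covariates enter by replacing ψ and p with inverse-logit linear predictors, which is how effects such as vegetation density on detection are modelled in studies like the one above.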

  13. A Comparative Study of Automated Infrasound Detectors - PMCC and AFD with Analyst Review.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Park, Junghyun; Hayward, Chris; Zeiler, Cleat

    Automated detections calculated by the progressive multi-channel correlation (PMCC) method (Cansi, 1995) and the adaptive F detector (AFD) (Arrowsmith et al., 2009) are compared to the signals identified by five independent analysts. Each detector was applied to a four-hour time sequence recorded by the Korean infrasound array CHNAR. This array was used because it is composed of both small (<100 m) and large (~1000 m) aperture element spacing. The four-hour time sequence contained a number of easily identified signals under noise conditions whose average RMS amplitudes varied from 1.2 to 4.5 mPa (1 to 5 Hz), estimated with a running five-minute window. The effectiveness of the detectors was estimated for the small aperture, large aperture, small aperture combined with the large aperture, and full array. The full and combined arrays performed best for AFD under all noise conditions, while the large aperture array had the poorest performance for both detectors. PMCC produced results similar to AFD under the lower noise conditions, but did not produce as dramatic an increase in detections using the full and combined arrays. Both automated detectors and the analysts produced fewer detections under the higher noise conditions. Comparing the detection probabilities with Estimated Receiver Operating Characteristic (EROC) curves, we found that the smaller value of the consistency parameter for PMCC and the larger p-value for AFD gave the highest detection probability. These parameters produced greater changes in detection probability than estimates of the false alarm rate. The detection probability was impacted most by noise level, with low noise (average RMS amplitude of 1.7 mPa) having an average detection probability of ~40% and high noise (average RMS amplitude of 2.9 mPa) an average detection probability of ~23%.

  14. Estimating factors influencing the detection probability of semiaquatic freshwater snails using quadrat survey methods

    USGS Publications Warehouse

    Roesler, Elizabeth L.; Grabowski, Timothy B.

    2018-01-01

    Developing effective monitoring methods for elusive, rare, or patchily distributed species requires extra considerations, such as imperfect detection. Although detection is frequently modeled, the opportunity to assess it empirically is rare, particularly for imperiled species. We used Pecos assiminea (Assiminea pecos), an endangered semiaquatic snail, as a case study to test detection and accuracy issues surrounding quadrat searches. Quadrats (9 × 20 cm; n = 12) were placed in suitable Pecos assiminea habitat and randomly assigned a treatment, defined as the number of empty snail shells (0, 3, 6, or 9). Ten observers rotated through each quadrat, conducting 5-min visual searches for shells. The probability of detecting a shell when present was 67.4 ± 3.0%, but it decreased with increasing litter depth and with fewer shells present. The mean (± SE) observer accuracy was 25.5 ± 4.3%. Accuracy was positively correlated with the number of shells in the quadrat and negatively correlated with the number of times a quadrat was searched. The results indicate that quadrat surveys likely underrepresent true abundance but accurately determine presence or absence. Understanding detection and accuracy for elusive, rare, or imperiled species improves density estimates and aids monitoring and conservation efforts.
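
    The per-search detection estimate reported above (67.4 ± 3.0%) is at heart a binomial proportion. A minimal sketch of the naive point estimate and standard error, with hypothetical counts (the study's own SE will differ, since repeated searches by the same observers are not independent trials):

```python
import math

def detection_estimate(detections, searches):
    """Binomial point estimate and standard error for the probability of
    detecting a shell on a single search when shells are present."""
    p = detections / searches
    se = math.sqrt(p * (1 - p) / searches)
    return p, se

# hypothetical: 81 successful detections in 120 searches of occupied quadrats
p_hat, se_hat = detection_estimate(81, 120)
```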

  15. Cortical Local Field Potential Power Is Associated with Behavioral Detection of Near-threshold Stimuli in the Rat Whisker System: Dissociation between Orbitofrontal and Somatosensory Cortices.

    PubMed

    Rickard, Rachel E; Young, Andrew M J; Gerdjikov, Todor V

    2018-01-01

    There is growing evidence that ongoing brain oscillations may represent a key regulator of attentional processes and as such may contribute to behavioral performance in psychophysical tasks. OFC appears to be involved in the top-down modulation of sensory processing; however, the specific contribution of ongoing OFC oscillations to perception has not been characterized. Here we used the rat whiskers as a model system to further characterize the relationship between cortical state and tactile detection. Head-fixed rats were trained to report the presence of a vibrotactile stimulus (frequency = 60 Hz, duration = 2 sec, deflection amplitude = 0.01-0.5 mm) applied to a single vibrissa. We calculated power spectra of local field potentials preceding the onset of near-threshold stimuli from microelectrodes chronically implanted in OFC and somatosensory cortex. We found a dissociation between slow oscillation power in the two regions in relation to detection probability: Higher OFC but not somatosensory delta power was associated with increased detection probability. Furthermore, coherence between OFC and barrel cortex was reduced preceding successful detection. Consistent with the role of OFC in attention, our results identify a cortical network whose activity is differentially modulated before successful tactile detection.

  16. Maximizing detection probability of Wetland-dependent birds during point-count surveys in northwestern Florida

    USGS Publications Warehouse

    Nadeau, C.P.; Conway, C.J.; Smith, B.S.; Lewis, T.E.

    2008-01-01

    We conducted 262 call-broadcast point-count surveys (1-6 replicate surveys on each of 62 points) using standardized North American Marsh Bird Monitoring Protocols between 31 May and 7 July 2006 on St. Vincent National Wildlife Refuge, an island off the northwest coast of Florida. We conducted double-blind multiple-observer surveys, paired morning and evening surveys, and paired morning and night surveys to examine the influence of call-broadcast and time of day on detection probability. Observer detection probability for all species pooled was 75% and was similar between passive (69%) and call-broadcast (65%) periods. Detection probability was higher on morning than evening (t = 3.0, P = 0.030) or night (t = 3.4, P = 0.042) surveys when we pooled all species. Detection probability was higher (but not significant for all species) on morning compared to evening or night surveys for all five focal species detected on surveys: Least Bittern (Ixobrychus exilis), Clapper Rail (Rallus longirostris), Purple Gallinule (Porphyrula martinica), Common Moorhen (Gallinula chloropus), and American Coot (Fulica americana). We detected more Least Bitterns (t = 2.4, P = 0.064) and Common Moorhens (t = 2.8, P = 0.026) on morning than evening surveys, and more Clapper Rails (t = 5.1, P = 0.014) on morning than night surveys.
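
    The time-of-day comparisons above are paired t-tests, since each point was surveyed in both periods. A minimal sketch with invented counts, not the refuge data:

```python
import numpy as np
from scipy import stats

# hypothetical paired detection counts at the same survey points,
# each point surveyed once in the morning and once in the evening
morning = np.array([4, 6, 3, 5, 7, 4, 5, 6])
evening = np.array([2, 5, 2, 3, 5, 3, 4, 4])

# paired-sample t-test on the per-point differences
t, pval = stats.ttest_rel(morning, evening)
```

    A positive t with a small p-value, as here, corresponds to the "more detections on morning surveys" results reported above.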

  17. Responses of pond-breeding amphibians to wildfire: Short-term patterns in occupancy and colonization

    USGS Publications Warehouse

    Hossack, B.R.; Corn, P.S.

    2007-01-01

    Wildland fires are expected to become more frequent and severe in many ecosystems, potentially posing a threat to many sensitive species. We evaluated the effects of a large, stand-replacement wildfire on three species of pond-breeding amphibians by estimating changes in occupancy of breeding sites during the three years before and after the fire burned 42 of 83 previously surveyed wetlands. Annual occupancy and colonization for each species was estimated using recently developed models that incorporate detection probabilities to provide unbiased parameter estimates. We did not find negative effects of the fire on the occupancy or colonization rates of the long-toed salamander (Ambystoma macrodactylum). Instead, its occupancy was higher across the study area after the fire, possibly in response to a large snowpack that may have facilitated colonization of unoccupied wetlands. Naïve data (uncorrected for detection probability) for the Columbia spotted frog (Rana luteiventris) initially led to the conclusion of increased occupancy and colonization in wetlands that burned. After accounting for temporal and spatial variation in detection probabilities, however, it was evident that these parameters were relatively stable in both areas before and after the fire. We found a similar discrepancy between naïve and estimated occupancy of A. macrodactylum that resulted from different detection probabilities in burned and control wetlands. The boreal toad (Bufo boreas) was not found breeding in the area prior to the fire but colonized several wetlands the year after they burned. Occupancy by B. boreas then declined during years 2 and 3 following the fire. Our study suggests that the amphibian populations we studied are resistant to wildfire and that B. boreas may experience short-term benefits from wildfire. Our data also illustrate how naïve presence–non-detection data can provide misleading results.

  18. Bayesian anomaly detection in monitoring data applying relevance vector machine

    NASA Astrophysics Data System (ADS)

    Saito, Tomoo

    2011-04-01

    A method is developed for automatically classifying monitoring data into two categories, normal and anomalous, in order to remove anomalous data from the enormous amount of monitoring data collected. The relevance vector machine (RVM) is applied to a probabilistic discriminative model with basis functions and weight parameters whose posterior PDF (probability density function), conditional on the learning data set, is given by Bayes' theorem. The proposed framework is applied to actual monitoring data sets containing some anomalous data, collected at two buildings in Tokyo, Japan. The trained models discriminate anomalous data from normal data very clearly, giving high probabilities of being normal to normal data and low probabilities of being normal to anomalous data.

  19. Mixture Modeling for Background and Sources Separation in x-ray Astronomical Images

    NASA Astrophysics Data System (ADS)

    Guglielmetti, Fabrizia; Fischer, Rainer; Dose, Volker

    2004-11-01

    A probabilistic technique for the joint estimation of background and sources in high-energy astrophysics is described. Bayesian probability theory is applied to gain insight into the coexistence of background and sources through a probabilistic two-component mixture model, which provides consistent uncertainties of background and sources. The present analysis is applied to ROSAT PSPC data (0.1-2.4 keV) in Survey Mode. A background map is modelled using a Thin-Plate spline. Source probability maps are obtained for each pixel (45 arcsec) independently and for larger correlation lengths, revealing faint and extended sources. We will demonstrate that the described probabilistic method allows for detection improvement of faint extended celestial sources compared to the Standard Analysis Software System (SASS) used for the production of the ROSAT All-Sky Survey (RASS) catalogues.
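
    A minimal sketch of the two-component mixture idea: Bayes' theorem gives each pixel a posterior source probability from background-only and background-plus-source likelihoods. Poisson counts and the rates below are illustrative assumptions, not the ROSAT analysis (where the background rate varies spatially via the Thin-Plate spline).

```python
import numpy as np
from scipy.stats import poisson

def source_probability(counts, bkg_rate, src_rate, prior_src=0.01):
    """Posterior probability that a pixel contains a source, from a
    two-component Poisson mixture (Bayes' theorem applied per pixel)."""
    l_bkg = poisson.pmf(counts, bkg_rate)             # background only
    l_src = poisson.pmf(counts, bkg_rate + src_rate)  # background + source
    post = prior_src * l_src / (prior_src * l_src + (1 - prior_src) * l_bkg)
    return post

# pixels with 2 and 15 counts over a background rate of 3 counts/pixel
post = source_probability(np.array([2, 15]), bkg_rate=3.0, src_rate=10.0)
```

    The resulting per-pixel posteriors play the role of the source probability maps described in the abstract.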

  20. Hidden Markov models for fault detection in dynamic systems

    NASA Technical Reports Server (NTRS)

    Smyth, Padhraic J. (Inventor)

    1995-01-01

    The invention is a system failure monitoring method and apparatus which learns the symptom-fault mapping directly from training data. The invention first estimates the state of the system at discrete intervals in time. A feature vector x of dimension k is estimated from sets of successive windows of sensor data. A pattern recognition component then models the instantaneous estimate of the posterior class probability given the features, p(w_i | x), 1 ≤ i ≤ m. Finally, a hidden Markov model is used to take advantage of temporal context and estimate class probabilities conditioned on recent past history. In this hierarchical pattern of information flow, the time series data is transformed and mapped into a categorical representation (the fault classes) and integrated over time to enable robust decision-making.
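
    The hierarchical flow the patent describes — instantaneous class probabilities smoothed by a hidden Markov model — can be sketched as a forward (filtering) recursion. The two-class setup, transition matrix, and evidence values below are illustrative assumptions only:

```python
import numpy as np

def hmm_filter(instantaneous_probs, transition, prior):
    """Forward (filtering) recursion: combine instantaneous class
    probabilities with temporal context from a Markov transition matrix."""
    belief = np.asarray(prior, float)
    history = []
    for p_t in instantaneous_probs:        # p_t ~ p(class | features at t)
        predicted = transition.T @ belief  # propagate belief one time step
        belief = predicted * p_t           # fuse with current evidence
        belief /= belief.sum()
        history.append(belief.copy())
    return np.array(history)

transition = np.array([[0.95, 0.05],
                       [0.05, 0.95]])     # sticky normal/fault dynamics
evidence = [np.array([0.3, 0.7])] * 5     # repeated weak evidence for "fault"
beliefs = hmm_filter(evidence, transition, prior=[0.9, 0.1])
```

    Individually weak per-window evidence accumulates over time into a confident fault decision, which is the robustness the temporal integration buys.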

  1. Hidden Markov models for fault detection in dynamic systems

    NASA Technical Reports Server (NTRS)

    Smyth, Padhraic J. (Inventor)

    1993-01-01

    The invention is a system failure monitoring method and apparatus which learns the symptom-fault mapping directly from training data. The invention first estimates the state of the system at discrete intervals in time. A feature vector x of dimension k is estimated from sets of successive windows of sensor data. A pattern recognition component then models the instantaneous estimate of the posterior class probability given the features, p(w_i | x), 1 ≤ i ≤ m. Finally, a hidden Markov model is used to take advantage of temporal context and estimate class probabilities conditioned on recent past history. In this hierarchical pattern of information flow, the time series data is transformed and mapped into a categorical representation (the fault classes) and integrated over time to enable robust decision-making.

  2. Systems Approach to Defeating Maritime Improvised Explosive Devices in U.S. Ports

    DTIC Science & Technology

    2008-12-01

    Acronyms from the report: …probability of detection; Pfi, probability of false identification; PH/PK, probability of hit/probability of kill; PMA, post mission analysis; PNNL, Pacific… Naval Warfare Publication 27-2 (Rev. B), Section 1.8.4.1 (unclassified). Naval Postgraduate School, Monterey, California. Approved for public release; distribution is unlimited.

  3. Detection of sea otters in boat-based surveys of Prince William Sound, Alaska

    USGS Publications Warehouse

    Udevitz, Mark S.; Bodkin, James L.; Costa, Daniel P.

    1995-01-01

    Boat-based surveys have been commonly used to monitor sea otter populations, but there has been little quantitative work to evaluate detection biases that may affect these surveys. We used ground-based observers to investigate sea otter detection probabilities in a boat-based survey of Prince William Sound, Alaska. We estimated that 30% of the otters present on surveyed transects were not detected by boat crews. Approximately half (53%) of the undetected otters were missed because the otters left the transects, apparently in response to the approaching boat. Unbiased estimates of detection probabilities will be required for obtaining unbiased population estimates from boat-based surveys of sea otters. Therefore, boat-based surveys should include methods to estimate sea otter detection probabilities under the conditions specific to each survey. Unbiased estimation of detection probabilities with ground-based observers requires either that the ground crews detect all of the otters in observed subunits, or that there are no errors in determining which crews saw each detected otter. Ground-based observer methods may be appropriate in areas where nearly all of the sea otter habitat is potentially visible from ground-based vantage points.
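
    The 30% of otters missed by boat crews illustrates why raw counts must be scaled by an estimated detection probability before being treated as population estimates. A minimal sketch of the standard correction (the numbers are illustrative, not the Prince William Sound data):

```python
def corrected_abundance(count, detection_prob):
    """Horvitz-Thompson style correction: scale the raw count by the
    estimated probability that an animal present is detected."""
    if not 0 < detection_prob <= 1:
        raise ValueError("detection probability must lie in (0, 1]")
    return count / detection_prob

# e.g. 140 otters counted with an estimated 70% detection probability
# implies roughly 200 otters actually present on the transects
n_hat = corrected_abundance(140, 0.7)
```

    Uncertainty in the detection probability itself propagates into the abundance estimate, which is why the abstract stresses estimating detection under the conditions of each specific survey.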

  4. Classification of change detection and change blindness from near-infrared spectroscopy signals

    NASA Astrophysics Data System (ADS)

    Tanaka, Hirokazu; Katura, Takusige

    2011-08-01

    Using a machine-learning classification algorithm applied to near-infrared spectroscopy (NIRS) signals, we classify success (change detection) or failure (change blindness) in detecting visual changes during a change-detection task. Five subjects perform a change-detection task while their brain activity is continuously monitored. A support-vector-machine algorithm is applied to classify the change-detection and change-blindness trials, and a correct classification probability of 70-90% is obtained for four subjects. Two types of temporal shapes in classification probabilities are found: one exhibiting a maximum value after the task is completed (postdictive type), and another exhibiting a maximum value during the task (predictive type). For the postdictive type, the classification probability begins to increase immediately after task completion and reaches its maximum on roughly the time scale of the neuronal hemodynamic response, reflecting a subjective report of change detection. For the predictive type, the classification probability increases at task initiation and is maximal while subjects are performing the task, predicting task performance in detecting a change. We conclude that decoding change detection and change blindness from NIRS signals is possible, and we discuss possible future applications in brain-machine interfaces.

  5. Support Vector Machine Based Monitoring of Cardio-Cerebrovascular Reserve during Simulated Hemorrhage.

    PubMed

    van der Ster, Björn J P; Bennis, Frank C; Delhaas, Tammo; Westerhof, Berend E; Stok, Wim J; van Lieshout, Johannes J

    2017-01-01

    Introduction: In the initial phase of hypovolemic shock, mean blood pressure (BP) is maintained by sympathetically mediated vasoconstriction, rendering BP monitoring insensitive for detecting blood loss early. Late detection can result in reduced tissue oxygenation and eventually cellular death. We hypothesized that a machine learning algorithm interpreting currently used and new hemodynamic parameters could facilitate the detection of impending hypovolemic shock. Method: In 42 young, healthy subjects (27 female; mean (SD) age: 24 (4) years), central blood volume (CBV) was progressively reduced by application of -50 mmHg lower body negative pressure until the onset of pre-syncope. A support vector machine was trained to classify samples into normovolemia (class 0), initial phase of CBV reduction (class 1), or advanced CBV reduction (class 2). Nine models making use of different features were computed to compare the sensitivity and specificity of different non-invasive hemodynamically derived signals. Model features included: volumetric hemodynamic parameters (stroke volume and cardiac output), BP curve dynamics, near-infrared spectroscopy determined cortical brain oxygenation, end-tidal carbon dioxide pressure, thoracic bio-impedance, and middle cerebral artery transcranial Doppler (TCD) blood flow velocity. Model performance was tested by quantifying the predictions with three methods: sensitivity and specificity, absolute error, and the log odds ratio of class 2 vs. class 0 probability estimates. Results: The combination with maximal sensitivity and specificity for classes 1 and 2 was found for the model comprising volumetric features (class 1: 0.73-0.98 and class 2: 0.56-0.96). The overall lowest model error was found for the models comprising TCD curve hemodynamics. Using probability estimates, the best combination of sensitivity for class 1 (0.67) and specificity (0.87) was found for the model that contained the TCD cerebral blood flow velocity derived pulse height. The highest combination for class 2 was found for the model with the volumetric features (0.72 and 0.91). Conclusion: The most sensitive models for the detection of advanced CBV reduction comprised features from volumetric parameters and from cerebral blood flow velocity hemodynamics. In a validated model of hemorrhage in humans, these parameters provide the best indication of the progression of central hypovolemia.

  6. Investigating prior probabilities in a multiple hypothesis test for use in space domain awareness

    NASA Astrophysics Data System (ADS)

    Hardy, Tyler J.; Cain, Stephen C.

    2016-05-01

    The goal of this research effort is to improve Space Domain Awareness (SDA) capabilities of current telescope systems through improved detection algorithms. Ground-based optical SDA telescopes are often spatially under-sampled, or aliased. This fact negatively impacts the detection performance of traditionally proposed binary and correlation-based detection algorithms. A Multiple Hypothesis Test (MHT) algorithm has been previously developed to mitigate the effects of spatial aliasing. This is done by testing potential Resident Space Objects (RSOs) against several sub-pixel shifted Point Spread Functions (PSFs). An MHT has been shown to increase detection performance for the same false alarm rate. In this paper, the assumption of a priori probability used in an MHT algorithm is investigated. First, an analysis of the pixel decision space is completed to determine alternate hypothesis prior probabilities. These probabilities are then implemented into an MHT algorithm, and the algorithm is tested against previous MHT algorithms using simulated RSO data. Results are reported with Receiver Operating Characteristic (ROC) curves and probability of detection, Pd, analysis.
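
    A minimal sketch of a multiple hypothesis test with explicit priors, in the spirit described above: posterior probabilities are computed over a noise-only hypothesis and several candidate templates. The Gaussian pixel-noise model, toy two-element "templates", and prior values here are illustrative assumptions, not the authors' PSF models.

```python
import numpy as np

def map_hypothesis(pixel, templates, noise_sigma, priors):
    """MAP multiple-hypothesis test: score a measured pixel patch against
    several shifted templates plus a noise-only hypothesis H0."""
    hyps = [np.zeros_like(pixel)] + list(templates)   # index 0 = H0
    log_post = []
    for h, prior in zip(hyps, priors):
        resid = pixel - h
        loglik = -np.sum(resid**2) / (2 * noise_sigma**2)  # Gaussian noise
        log_post.append(loglik + np.log(prior))
    log_post = np.array(log_post)
    post = np.exp(log_post - log_post.max())          # stable normalization
    return post / post.sum()   # posterior probability of each hypothesis

templates = [np.array([1.0, 2.0]), np.array([2.0, 1.0])]  # toy shifted "PSFs"
pixel = np.array([1.0, 2.0])                              # matches template 0
post = map_hypothesis(pixel, templates, noise_sigma=0.5,
                      priors=[0.5, 0.25, 0.25])           # H0 gets prior 0.5
```

    Changing the prior vector shifts the decision boundaries, which is exactly the sensitivity the paper investigates.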

  7. Detection of the nipple in automated 3D breast ultrasound using coronal slab-average-projection and cumulative probability map

    NASA Astrophysics Data System (ADS)

    Kim, Hannah; Hong, Helen

    2014-03-01

    We propose an automatic method for nipple detection on 3D automated breast ultrasound (3D ABUS) images using coronal slab-average-projection and a cumulative probability map. First, to identify coronal images that show a clear distinction between the nipple-areola region and the skin, the skewness of each coronal image is measured and the negatively skewed images are selected. Then, a coronal slab-average-projection image is reformatted from the selected images. Second, to localize the nipple-areola region, an elliptical ROI covering the nipple-areola region is detected using the Hough ellipse transform in the coronal slab-average-projection image. Finally, to separate the nipple from the areola region, 3D Otsu's thresholding is applied to the elliptical ROI, and a cumulative probability map in the elliptical ROI is generated by assigning high probability to low-intensity regions. Falsely detected small components are eliminated using morphological opening, and the center point of the detected nipple region is calculated. Experimental results show that our method provides a 94.4% nipple detection rate.

  8. Causality in time-neutral cosmologies

    NASA Astrophysics Data System (ADS)

    Kent, Adrian

    1999-02-01

    Gell-Mann and Hartle (GMH) have recently considered time-neutral cosmological models in which the initial and final conditions are independently specified, and several authors have investigated experimental tests of such models. We point out here that GMH time-neutral models can allow superluminal signaling, in the sense that it can be possible for observers in those cosmologies, by detecting and exploiting regularities in the final state, to construct devices which send and receive signals between space-like separated points. In suitable cosmologies, any single superluminal message can be transmitted with probability arbitrarily close to one by the use of redundant signals. However, the outcome probabilities of quantum measurements generally depend on precisely which past and future measurements take place. As the transmission of any signal relies on quantum measurements, its transmission probability is similarly context dependent. As a result, the standard superluminal signaling paradoxes do not apply. Despite their unusual features, the models are internally consistent. These results illustrate an interesting conceptual point. The standard view of Minkowski causality is not an absolutely indispensable part of the mathematical formalism of relativistic quantum theory. It is contingent on the empirical observation that naturally occurring ensembles can be naturally pre-selected but not post-selected.

  9. Testing metapopulation concepts: effects of patch characteristics and neighborhood occupancy on the dynamics of an endangered lagomorph

    USGS Publications Warehouse

    Eaton, Mitchell J.; Hughes, Phillip T.; Hines, James E.; Nichols, James D.

    2014-01-01

    Metapopulation ecology is a field that is richer in theory than in empirical results. Many existing empirical studies use an incidence function approach based on spatial patterns and key assumptions about extinction and colonization rates. Here we recast these assumptions as hypotheses to be tested using 18 years of historic detection survey data combined with four years of data from a new monitoring program for the Lower Keys marsh rabbit. We developed a new model to estimate probabilities of local extinction and colonization in the presence of nondetection, while accounting for estimated occupancy levels of neighboring patches. We used model selection to identify important drivers of population turnover and estimate the effective neighborhood size for this system. Several key relationships related to patch size and isolation that are often assumed in metapopulation models were supported: patch size was negatively related to the probability of extinction and positively related to colonization, and estimated occupancy of neighboring patches was positively related to colonization and negatively related to extinction probabilities. This latter relationship suggested the existence of rescue effects. In our study system, we inferred that coastal patches experienced higher probabilities of extinction and colonization than interior patches. Interior patches exhibited higher occupancy probabilities and may serve as refugia, permitting colonization of coastal patches following disturbances such as hurricanes and storm surges. Our modeling approach should be useful for incorporating neighbor occupancy into future metapopulation analyses and in dealing with other historic occupancy surveys that may not include the recommended levels of sampling replication.

  10. Neyman Pearson detection of K-distributed random variables

    NASA Astrophysics Data System (ADS)

    Tucker, J. Derek; Azimi-Sadjadi, Mahmood R.

    2010-04-01

    In this paper, a new detection method for sonar imagery is developed for K-distributed background clutter. The equation for the log-likelihood is derived and compared to the corresponding counterparts derived under Gaussian and Rayleigh assumptions. Test results of the proposed method on a data set of synthetic underwater sonar images are also presented. This database contains images with targets of different shapes inserted into backgrounds generated using a correlated K-distributed model. Results illustrating the effectiveness of the K-distributed detector are presented in terms of probability of detection, false alarm, and correct classification rates for various bottom clutter scenarios.
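
    The log-likelihood comparison the abstract mentions can be sketched for K-distributed versus Rayleigh amplitude models. The K-distribution parameterization assumed here is f(x) = (2b/Γ(ν))(bx/2)^ν K_{ν-1}(bx) with shape ν and scale b; the paper's exact form and parameter estimates may differ.

```python
import numpy as np
from scipy.special import kv, gammaln

def k_dist_loglik(x, nu, b):
    """Log-likelihood of amplitude data under the K-distribution
    f(x) = (2b / Gamma(nu)) * (b x / 2)**nu * K_{nu-1}(b x)."""
    x = np.asarray(x, float)
    return np.sum(np.log(2 * b) - gammaln(nu)
                  + nu * np.log(b * x / 2)
                  + np.log(kv(nu - 1, b * x)))   # modified Bessel function

def rayleigh_loglik(x, sigma):
    """Log-likelihood under a Rayleigh clutter model, for comparison."""
    x = np.asarray(x, float)
    return np.sum(np.log(x / sigma**2) - x**2 / (2 * sigma**2))
```

    Comparing the two log-likelihoods on the same amplitude sample indicates which clutter model fits better, the basis for preferring the K-distributed detector on heavy-tailed sonar backgrounds.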

  11. MRI-alone radiation therapy planning for prostate cancer: Automatic fiducial marker detection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ghose, Soumya, E-mail: soumya.ghose@case.edu; Mitra, Jhimli; Rivest-Hénault, David

    Purpose: The feasibility of radiation therapy treatment planning using substitute computed tomography (sCT) generated from magnetic resonance images (MRIs) has been demonstrated by a number of research groups. One challenge with an MRI-alone workflow is the accurate identification of intraprostatic gold fiducial markers, which are frequently used for prostate localization prior to each dose delivery fraction. This paper investigates a template-matching approach for the detection of these seeds in MRI. Methods: Two different gradient echo T1 and T2* weighted MRI sequences were acquired from fifteen prostate cancer patients and evaluated for seed detection. For training, seed templates from manual contours were selected in a spectral clustering manifold learning framework. This aids in clustering “similar” gold fiducial markers together. The marker with the minimum distance to a cluster centroid was selected as the representative template of that cluster during training. During testing, Gaussian mixture modeling followed by a Markovian model was used in automatic detection of the probable candidates. The probable candidates were rigidly registered to the templates identified from spectral clustering, and a similarity metric was computed for ranking and detection. Results: A fiducial detection accuracy of 95% was obtained compared to manual observations. Expert radiation therapist observers were able to correctly identify all three implanted seeds on 11 of the 15 scans (the proposed method correctly identified all seeds on 10 of the 15). Conclusions: A novel automatic framework for gold fiducial marker detection in MRI is proposed and evaluated, with detection accuracies comparable to manual detection. When radiation therapists are unable to determine the seed location in MRI, they refer back to the planning CT (only available in the existing clinical framework); similarly, an automatic quality control is built into the software to ensure that all gold seeds are either correctly detected or a warning is raised for further manual intervention.
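
    Template matching of the kind used for candidate ranking can be sketched as naive normalized cross-correlation on a 2-D image. This is illustrative only; the paper's pipeline adds spectral-clustering-derived templates, rigid registration, and a Markovian model on top of the similarity scoring.

```python
import numpy as np

def normalized_cross_correlation(image, template):
    """Slide a template over a 2-D image and return the normalized
    cross-correlation score at each valid offset (naive implementation)."""
    th, tw = template.shape
    t = template - template.mean()
    t_norm = np.sqrt(np.sum(t**2))
    out = np.zeros((image.shape[0] - th + 1, image.shape[1] - tw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            patch = image[i:i+th, j:j+tw]
            p = patch - patch.mean()
            denom = np.sqrt(np.sum(p**2)) * t_norm
            out[i, j] = np.sum(p * t) / denom if denom > 0 else 0.0
    return out

# toy example: embed the template in a blank image and recover its location
img = np.zeros((10, 10))
tmpl = np.array([[1.0, 2.0], [3.0, 4.0]])
img[4:6, 5:7] = tmpl
scores = normalized_cross_correlation(img, tmpl)
loc = np.unravel_index(np.argmax(scores), scores.shape)
```

    The score peaks at 1.0 where the patch exactly matches the template, which is the property candidate-ranking schemes exploit.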

  12. MRI-alone radiation therapy planning for prostate cancer: Automatic fiducial marker detection.

    PubMed

    Ghose, Soumya; Mitra, Jhimli; Rivest-Hénault, David; Fazlollahi, Amir; Stanwell, Peter; Pichler, Peter; Sun, Jidi; Fripp, Jurgen; Greer, Peter B; Dowling, Jason A

    2016-05-01

    The feasibility of radiation therapy treatment planning using substitute computed tomography (sCT) generated from magnetic resonance images (MRIs) has been demonstrated by a number of research groups. One challenge with an MRI-alone workflow is the accurate identification of intraprostatic gold fiducial markers, which are frequently used for prostate localization prior to each dose delivery fraction. This paper investigates a template-matching approach for the detection of these seeds in MRI. Two different gradient echo T1 and T2* weighted MRI sequences were acquired from fifteen prostate cancer patients and evaluated for seed detection. For training, seed templates from manual contours were selected in a spectral clustering manifold learning framework. This aids in clustering "similar" gold fiducial markers together. The marker with the minimum distance to a cluster centroid was selected as the representative template of that cluster during training. During testing, Gaussian mixture modeling followed by a Markovian model was used in automatic detection of the probable candidates. The probable candidates were rigidly registered to the templates identified from spectral clustering, and a similarity metric was computed for ranking and detection. A fiducial detection accuracy of 95% was obtained compared to manual observations. Expert radiation therapist observers were able to correctly identify all three implanted seeds on 11 of the 15 scans (the proposed method correctly identified all seeds on 10 of the 15). A novel automatic framework for gold fiducial marker detection in MRI is proposed and evaluated with detection accuracies comparable to manual detection. 
When radiation therapists are unable to determine the seed location in MRI, they refer back to the planning CT (only available in the existing clinical framework); similarly, an automatic quality control is built into the automatic software to ensure that all gold seeds are either correctly detected or a warning is raised for further manual intervention.
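    The abstract does not specify which similarity metric is used for ranking the registered candidates; a common choice in template matching is normalized cross-correlation (NCC), which is invariant to affine intensity changes. A minimal sketch in Python, using hypothetical 1D intensity patches as stand-ins for real MRI template volumes:

```python
import math

def ncc(a, b):
    """Normalized cross-correlation between two equal-length patches.

    Returns a value in [-1, 1]; 1 means the patches are identical up to
    an affine change of intensity scale.
    """
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    da = math.sqrt(sum((x - ma) ** 2 for x in a))
    db = math.sqrt(sum((y - mb) ** 2 for y in b))
    return num / (da * db)

# Hypothetical 1D intensity profiles: a marker template and two candidates.
template   = [10, 40, 90, 40, 10]   # bright peak at the seed location
candidate1 = [12, 41, 88, 39, 11]   # close match to the template
candidate2 = [50, 52, 49, 51, 50]   # flat background patch

# Rank candidates by similarity to the template (higher score = better match).
candidates = [candidate1, candidate2]
scores = sorted(((ncc(template, c), i) for i, c in enumerate(candidates)),
                reverse=True)
```

    Ranking by NCC rather than raw intensity difference makes the detection step robust to the global intensity variation typical of gradient echo MRI sequences.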

  13. Predicting future changes in Muskegon River Watershed game fish distributions under future land cover alteration and climate change scenarios

    USGS Publications Warehouse

    Steen, Paul J.; Wiley, Michael J.; Schaeffer, Jeffrey S.

    2010-01-01

    Future alterations in land cover and climate are likely to cause substantial changes in the ranges of fish species. Predictive distribution models are an important tool for assessing the probability that these changes will cause species ranges to expand or contract, or species to be extirpated. Classification tree models that predict the probability of game fish presence were applied to the streams of the Muskegon River watershed, Michigan. The models were used to study three potential future scenarios: (1) land cover change only, (2) land cover change and a 3°C increase in air temperature by 2100, and (3) land cover change and a 5°C increase in air temperature by 2100. The analysis indicated that the expected change in air temperature and subsequent change in water temperatures would result in the decline of coldwater fish in the Muskegon watershed by the end of the 21st century while cool- and warmwater species would significantly increase their ranges. The greatest decline detected was a 90% reduction in the probability that brook trout Salvelinus fontinalis would occur in Bigelow Creek. The greatest increase was a 276% increase in the probability that northern pike Esox lucius would occur in the Middle Branch River. Changes in land cover are expected to cause large changes in a few fish species, such as walleye Sander vitreus and Chinook salmon Oncorhynchus tshawytscha, but not to drive major changes in species composition. Managers can alter stream environmental conditions to maximize the probability that species will reside in particular stream reaches through application of the classification tree models. Such models represent a good way to predict future changes, as they give quantitative estimates of the n-dimensional niches for particular species.
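    A classification tree turns covariate thresholds into leaf-level presence probabilities. This can be illustrated with a single-split "stump" on water temperature; the species, data, and threshold below are hypothetical, chosen only to mirror the coldwater pattern described above:

```python
def fit_stump(temps, present, threshold):
    """One-split 'classification tree': empirical presence probabilities
    for sites at/below vs. above a water-temperature threshold (degrees C)."""
    cold = [p for t, p in zip(temps, present) if t <= threshold]
    warm = [p for t, p in zip(temps, present) if t > threshold]
    return sum(cold) / len(cold), sum(warm) / len(warm)

def predict(temp, threshold, p_cold, p_warm):
    """Predicted presence probability for a reach at a given temperature."""
    return p_cold if temp <= threshold else p_warm

# Hypothetical training data: presence (1) or absence (0) of a coldwater
# species vs. mean summer water temperature at eight stream reaches.
temps   = [12, 13, 14, 15, 17, 19, 20, 22]
present = [ 1,  1,  1,  0,  1,  0,  0,  0]

p_cold, p_warm = fit_stump(temps, present, threshold=16)
# A warming scenario moves a reach from the cold leaf to the warm leaf,
# lowering its predicted presence probability.
```

    Real classification trees choose the split threshold (and further nested splits) to maximize class purity, but the prediction step, mapping a reach's covariates to a leaf probability, works exactly as in this stump.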

  14. Prospective testing of Coulomb short-term earthquake forecasts

    NASA Astrophysics Data System (ADS)

    Jackson, D. D.; Kagan, Y. Y.; Schorlemmer, D.; Zechar, J. D.; Wang, Q.; Wong, K.

    2009-12-01

    Earthquake induced Coulomb stresses, whether static or dynamic, suddenly change the probability of future earthquakes. Models to estimate stress and the resulting seismicity changes could help to illuminate earthquake physics and guide appropriate precautionary response. But do these models have improved forecasting power compared to empirical statistical models? The best answer lies in prospective testing in which a fully specified model, with no subsequent parameter adjustments, is evaluated against future earthquakes. The Collaboratory for the Study of Earthquake Predictability (CSEP) facilitates such prospective testing of earthquake forecasts, including several short-term forecasts. Formulating Coulomb stress models for formal testing involves several practical problems, mostly shared with other short-term models. First, earthquake probabilities must be calculated after each “perpetrator” earthquake but before the triggered earthquakes, or “victims”. The time interval between a perpetrator and its victims may be very short, as characterized by the Omori law for aftershocks. CSEP evaluates short-term models daily, and allows daily updates of the models. However, lots can happen in a day. An alternative is to test and update models on the occurrence of each earthquake over a certain magnitude. To make such updates rapidly enough and to qualify as prospective, earthquake focal mechanisms, slip distributions, stress patterns, and earthquake probabilities would have to be made by computer without human intervention. This scheme would be more appropriate for evaluating scientific ideas, but it may be less useful for practical applications than daily updates. Second, triggered earthquakes are imperfectly recorded following larger events because their seismic waves are buried in the coda of the earlier event. 
To solve this problem, testing methods need to allow for “censoring” of early aftershock data, and a quantitative model for detection threshold as a function of distance, time, and magnitude is needed. Third, earthquake catalogs contain errors in location and magnitude that may be corrected in later editions. One solution is to test models in “pseudo-prospective” mode (after catalog revision but without model adjustment). Again, this is appropriate for science but less so for operational response. Hopefully, demonstrations of modeling success will stimulate improvements in earthquake detection.
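    The point that "lots can happen in a day" follows from the Omori law cited above: aftershock rates decay as n(t) = K/(c + t)^p, so a large fraction of the first day's "victims" arrive within the first hour after the perpetrator. A sketch using the closed-form integral of the rate, with parameter values that are illustrative assumptions rather than fitted estimates:

```python
def omori_count(t, K=100.0, c=0.01, p=1.2):
    """Expected number of aftershocks in [0, t] days under the modified
    Omori law n(t) = K / (c + t)**p, integrated in closed form (valid
    for p != 1): N(t) = K * (c**(1-p) - (c+t)**(1-p)) / (p - 1)."""
    return K * (c ** (1 - p) - (c + t) ** (1 - p)) / (p - 1)

first_hour = omori_count(1 / 24)  # expected aftershocks in the first hour
first_day  = omori_count(1.0)     # expected aftershocks in the first day
# With these assumed parameters, well over a third of day-one aftershocks
# occur before a daily forecast update could even be issued.
```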

  15. Breeding birds in managed forests on public conservation lands in the Mississippi Alluvial Valley

    USGS Publications Warehouse

    Twedt, Daniel J.; Wilson, R. Randy

    2017-01-01

    Managers of public conservation lands in the Mississippi Alluvial Valley have implemented forest management strategies to improve bottomland hardwood habitat for target wildlife species. Through implementation of various silvicultural practices, forest managers have sought to attain forest structural conditions (e.g., canopy cover, basal area, etc.) within values postulated to benefit wildlife. We evaluated data from point count surveys of breeding birds on 180 silviculturally treated stands (1049 counts) that ranged from 1 to 20 years post-treatment and 134 control stands (676 counts) that had not been harvested for >20 years. Birds detected during 10-min counts were recorded within four distance classes and three time intervals. Avian diversity was greater on treated stands than on unharvested stands. Of 42 commonly detected species, six species including Prothonotary Warbler (Prothonotaria citrea) and Acadian Flycatcher (Empidonax virescens) were indicative of control stands. Similarly, six species including Indigo Bunting (Passerina cyanea) and Yellow-breasted Chat (Icteria virens) were indicative of treated stands. Using a removal model to assess probability of detection, we evaluated occupancy of bottomland forests at two spatial scales (stands and points within occupied stands). Wildlife-forestry treatment improved predictive models of species occupancy for 18 species. We found years post treatment (range = 1–20), total basal area, and overstory canopy were important species-specific predictors of occupancy, whereas variability in basal area was not. In addition, we used a removal model to estimate species-specific probability of availability for detection, and a distance model to estimate effective detection radius. We used these two estimated parameters to derive species densities and 95% confidence intervals for treated and unharvested stands. 
Avian densities differed between treated and control stands for 16 species, but only Common Yellowthroat (Geothlypis trichas) and Yellow-breasted Chat had greater densities on treated stands.
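    The removal model used above treats the interval of first detection as a multinomial outcome, assuming a constant per-minute availability (singing) rate. A minimal sketch of the maximum-likelihood fit by grid search; the interval bounds follow a 10-min count split into 2-, 3-, and 5-min intervals, and the counts and estimator are illustrative assumptions, not the authors' implementation:

```python
import math

def removal_loglik(rate, bounds, counts):
    """Multinomial log-likelihood of first-detection counts per interval,
    given a constant per-minute detection rate, conditional on the bird
    being detected at least once during the full count of length bounds[-1]."""
    total_p = 1 - math.exp(-rate * bounds[-1])
    ll = 0.0
    for (a, b), n in zip(zip(bounds, bounds[1:]), counts):
        p = (math.exp(-rate * a) - math.exp(-rate * b)) / total_p
        ll += n * math.log(p)
    return ll

bounds = [0, 2, 5, 10]   # minutes: intervals [0,2], [2,5], [5,10]
counts = [60, 20, 20]    # hypothetical first detections per interval

# Grid-search MLE of the per-minute rate, then the overall detection
# probability for the full 10-min count.
rates = [i / 1000 for i in range(1, 1000)]
r_hat = max(rates, key=lambda r: removal_loglik(r, bounds, counts))
p_detect = 1 - math.exp(-r_hat * bounds[-1])
```

    The derived p_detect is what converts a raw count into an abundance estimate: expected count = abundance x p_detect.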

  16. Phase transitions in community detection: A solvable toy model

    NASA Astrophysics Data System (ADS)

    Ver Steeg, Greg; Moore, Cristopher; Galstyan, Aram; Allahverdyan, Armen

    2014-05-01

    Recently, it was shown that there is a phase transition in the community detection problem. This transition was first computed using the cavity method, and has been proved rigorously in the case of q = 2 groups. However, analytic calculations using the cavity method are challenging since they require us to understand probability distributions of messages. We study analogous transitions in the so-called “zero-temperature inference” model, where this distribution is supported only on the most likely messages. Furthermore, whenever several messages are equally likely, we break the tie by choosing among them with equal probability, corresponding to an infinitesimal random external field. While the resulting analysis overestimates the thresholds, it reproduces some of the qualitative features of the system. It predicts a first-order detectability transition whenever q > 2 (as opposed to q > 4 according to the finite-temperature cavity method). It also has a regime analogous to the “hard but detectable” phase, where the community structure can be recovered, but only when the initial messages are sufficiently accurate. Finally, we study a semisupervised setting where we are given the correct labels for a fraction ρ of the nodes. For q > 2, we find a regime where the accuracy jumps discontinuously at a critical value of ρ.
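    The detectability transition discussed above is usually stated, in the cavity-method literature, as the Kesten-Stigum condition for the planted-partition (stochastic block) model with q equal groups: structure is detectable by belief propagation when |c_in − c_out| > q√c̄, with c̄ the average degree. A small checker; the example parameters in the test are assumptions for illustration:

```python
import math

def ks_detectable(q, c_in, c_out):
    """Kesten-Stigum detectability condition for the planted-partition
    model with q equal-sized groups, where c_in and c_out are the expected
    within- and between-group degrees. Returns True when the community
    structure is (efficiently) detectable: |c_in - c_out| > q * sqrt(c),
    with c the average degree."""
    c = (c_in + (q - 1) * c_out) / q
    return abs(c_in - c_out) > q * math.sqrt(c)
```

    Near this threshold is where the "hard but detectable" phenomenology mentioned above appears for larger q: the structure is information-theoretically present, but message passing recovers it only from sufficiently accurate initial messages.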

  17. Environmentally Adaptive UXO Detection and Classification Systems

    DTIC Science & Technology

    2016-04-01

    probability of false alarm (Pfa), as well as Receiver Operating Characteristic (ROC) curve and confusion matrix characteristics. The results of these... techniques at a false alarm probability of Pfa = 1 × 10−3. X̃ = g(X). In this case, the problem remains invariant to the group of transformations G = {g : g(X... and observed target responses, as well as the probability of detection versus SNR for both detection techniques at Pfa = 1 × 10−3, with N = 128 and M = 50
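    The Pd-versus-SNR curves at fixed Pfa = 1 × 10−3 evaluated in the excerpt can be illustrated, for the simplest textbook case of a Gaussian mean-shift (matched-filter) detector, by Pd = 1 − Φ(Φ⁻¹(1 − Pfa) − d), where d is the deflection coefficient. This is a generic relation, not the report's specific UXO detection statistics:

```python
from statistics import NormalDist

def prob_detection(d, pfa=1e-3):
    """Detection probability of a Gaussian mean-shift detector at a fixed
    false-alarm probability: the threshold is set so that noise alone
    exceeds it with probability pfa, and under the target hypothesis the
    statistic is shifted by the deflection coefficient d."""
    nd = NormalDist()
    threshold = nd.inv_cdf(1 - pfa)        # Phi^-1(1 - Pfa)
    return 1 - nd.cdf(threshold - d)        # Pd = Q(threshold - d)
```

    At d = 0 (no target) the curve passes through Pd = Pfa, and Pd climbs toward 1 as SNR, and hence d, grows; sweeping d traces one operating point per SNR of the ROC family described in the excerpt.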

  18. Linear fitting of multi-threshold counting data with a pixel-array detector for spectral X-ray imaging

    PubMed Central

    Muir, Ryan D.; Pogranichney, Nicholas R.; Muir, J. Lewis; Sullivan, Shane Z.; Battaile, Kevin P.; Mulichak, Anne M.; Toth, Scott J.; Keefe, Lisa J.; Simpson, Garth J.

    2014-01-01

    Experiments and modeling are described to perform spectral fitting of multi-threshold counting measurements on a pixel-array detector. An analytical model was developed for describing the probability density function of detected voltage in X-ray photon-counting arrays, utilizing fractional photon counting to account for edge/corner effects from voltage plumes that spread across multiple pixels. Each pixel was mathematically calibrated by fitting the detected voltage distributions to the model at both 13.5 keV and 15.0 keV X-ray energies. The model and established pixel responses were then exploited to statistically recover images of X-ray intensity as a function of X-ray energy in a simulated multi-wavelength and multi-counting threshold experiment. PMID:25178010

  19. Linear fitting of multi-threshold counting data with a pixel-array detector for spectral X-ray imaging.

    PubMed

    Muir, Ryan D; Pogranichney, Nicholas R; Muir, J Lewis; Sullivan, Shane Z; Battaile, Kevin P; Mulichak, Anne M; Toth, Scott J; Keefe, Lisa J; Simpson, Garth J

    2014-09-01

    Experiments and modeling are described to perform spectral fitting of multi-threshold counting measurements on a pixel-array detector. An analytical model was developed for describing the probability density function of detected voltage in X-ray photon-counting arrays, utilizing fractional photon counting to account for edge/corner effects from voltage plumes that spread across multiple pixels. Each pixel was mathematically calibrated by fitting the detected voltage distributions to the model at both 13.5 keV and 15.0 keV X-ray energies. The model and established pixel responses were then exploited to statistically recover images of X-ray intensity as a function of X-ray energy in a simulated multi-wavelength and multi-counting threshold experiment.
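    The "linear fitting" in the title can be sketched as solving a linear system: the counts registered above each voltage threshold are a known linear combination of the incident intensities at each X-ray energy, with coefficients given by the per-pixel calibration. The response matrix below is a hypothetical stand-in for a real calibrated pixel response:

```python
def solve2x2(A, b):
    """Solve the 2x2 linear system A x = b by Cramer's rule."""
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    x0 = (b[0] * A[1][1] - A[0][1] * b[1]) / det
    x1 = (A[0][0] * b[1] - b[0] * A[1][0]) / det
    return [x0, x1]

# Hypothetical response matrix: fraction of 13.5 keV and 15.0 keV photons
# registered above each of two counting thresholds.
A = [[0.90, 0.95],   # low threshold catches most photons of both energies
     [0.10, 0.60]]   # high threshold mostly catches 15.0 keV photons

true_intensity = [1000.0, 500.0]   # photons at 13.5 and 15.0 keV
counts = [A[0][0] * true_intensity[0] + A[0][1] * true_intensity[1],
          A[1][0] * true_intensity[0] + A[1][1] * true_intensity[1]]

recovered = solve2x2(A, counts)    # per-energy intensities from counts
```

    In the paper this inversion is done statistically per pixel, with fractional photon counting entering through the calibrated response coefficients; the algebraic structure, however, is exactly this linear unmixing.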

  20. Predicting the geographic distribution of a species from presence-only data subject to detection errors

    USGS Publications Warehouse

    Dorazio, Robert M.

    2012-01-01

    Several models have been developed to predict the geographic distribution of a species by combining measurements of covariates of occurrence at locations where the species is known to be present with measurements of the same covariates at other locations where species occurrence status (presence or absence) is unknown. In the absence of species detection errors, spatial point-process models and binary-regression models for case-augmented surveys provide consistent estimators of a species’ geographic distribution without prior knowledge of species prevalence. In addition, these regression models can be modified to produce estimators of species abundance that are asymptotically equivalent to those of the spatial point-process models. However, if species presence locations are subject to detection errors, neither class of models provides a consistent estimator of covariate effects unless the covariates of species abundance are distinct and independently distributed from the covariates of species detection probability. These analytical results are illustrated using simulation studies of data sets that contain a wide range of presence-only sample sizes. Analyses of presence-only data of three avian species observed in a survey of landbirds in western Montana and northern Idaho are compared with site-occupancy analyses of detections and nondetections of these species.
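    The spatial point-process models referenced above fit a log-linear intensity lambda(s) = exp(b0 + b1 * x(s)) by maximizing the inhomogeneous Poisson log-likelihood, sum_i log lambda(s_i) minus the integral of lambda over the region. A minimal 1D sketch with the intercept profiled out analytically; the covariate surface, presence records, and grid-search estimator are all hypothetical:

```python
import math

def profile_loglik(b1, x_points, x_grid, cell_area):
    """Profile log-likelihood of the covariate effect b1 for an
    inhomogeneous Poisson process with lambda(s) = exp(b0 + b1*x(s)).
    The intercept b0 is maximized out analytically
    (exp(b0) = n / integral of exp(b1*x)); additive constants dropped.
    The integral is approximated by a Riemann sum over grid cells."""
    n = len(x_points)
    integral = sum(math.exp(b1 * x) for x in x_grid) * cell_area
    return b1 * sum(x_points) - n * math.log(integral)

# Hypothetical 1D study region of length 10 with covariate x(s) = s/10,
# discretized into 100 cells; presence-only records cluster at high x.
x_grid = [i / 100 for i in range(100)]
cell_area = 0.1
x_points = [0.7, 0.8, 0.8, 0.9, 0.9, 1.0]  # covariate at presence locations

b1_grid = [i / 10 for i in range(0, 101)]
b1_hat = max(b1_grid,
             key=lambda b: profile_loglik(b, x_points, x_grid, cell_area))
```

    Because the presences sit where the covariate is high, the fitted effect is strongly positive; the article's point is that unmodeled, covariate-dependent detection errors bias exactly this estimate unless detection covariates are distinct from abundance covariates.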
