Sample records for estimating temporal probability

  1. Survival estimates for Florida manatees from the photo-identification of individuals

    USGS Publications Warehouse

    Langtimm, C.A.; Beck, C.A.; Edwards, H.H.; Fick-Child, K. J.; Ackerman, B.B.; Barton, S.L.; Hartley, W.C.

    2004-01-01

    We estimated adult survival probabilities for the endangered Florida manatee (Trichechus manatus latirostris) in four regional populations using photo-identification data and open-population capture-recapture statistical models. The mean annual adult survival probability over the most recent 10-yr period of available estimates was as follows: Northwest - 0.956 (SE 0.007), Upper St. Johns River - 0.960 (0.011), Atlantic Coast - 0.937 (0.008), and Southwest - 0.908 (0.019). Estimates of temporal variance independent of sampling error, calculated from the survival estimates, indicated constant survival in the Upper St. Johns River, true temporal variability in the Northwest and Atlantic Coast, and large sampling variability obscuring estimates for the Southwest. Calf and subadult survival probabilities were estimated for the Upper St. Johns River from the only available data for known-aged individuals: 0.810 (95% CI 0.727-0.873) for 1st-year calves, 0.915 (0.827-0.960) for 2nd-year calves, and 0.969 (0.946-0.982) for manatees 3 yr or older. These estimates of survival probabilities and temporal variance, in conjunction with estimates of reproduction probabilities from photo-identification data, can be used to model manatee population dynamics, estimate population growth rates, and provide an integrated measure of regional status.
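
    The variance decomposition this abstract relies on (temporal process variance net of sampling error) can be illustrated with a minimal moment-based sketch; the annual estimates, standard errors, and the simple subtraction formula below are illustrative stand-ins, not the paper's actual method or data.

    ```python
    import numpy as np

    # Hedged sketch: separate true temporal (process) variance from sampling
    # variance among annual survival estimates. All numbers are invented.
    s_hat = np.array([0.95, 0.97, 0.93, 0.96, 0.94, 0.96, 0.95, 0.98, 0.92, 0.96])
    se = np.array([0.015, 0.02, 0.018, 0.012, 0.02, 0.016, 0.014, 0.02, 0.022, 0.015])

    total_var = np.var(s_hat, ddof=1)                 # variation among annual estimates
    sampling_var = np.mean(se ** 2)                   # average sampling variance
    process_var = max(0.0, total_var - sampling_var)  # temporal variance net of sampling error

    print(f"total {total_var:.5f}, sampling {sampling_var:.5f}, process {process_var:.5f}")
    ```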

  2. Probability based models for estimation of wildfire risk

    Treesearch

    Haiganoush Preisler; D. R. Brillinger; R. E. Burgan; John Benoit

    2004-01-01

    We present a probability-based model for estimating fire risk. Risk is defined using three probabilities: the probability of fire occurrence; the conditional probability of a large fire given ignition; and the unconditional probability of a large fire. The model is based on grouped data at the 1 km²-day cell level. We fit a spatially and temporally explicit non-...
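
    The three-probability risk definition in this record reduces to a simple product rule; here is a hedged, self-contained illustration with hypothetical numbers (the paper's actual model is a spatially and temporally explicit fit to gridded data).

    ```python
    # The unconditional probability of a large fire in a cell-day is the product
    # of the ignition probability and the conditional probability of a large
    # fire given ignition. Both inputs below are made up for illustration.
    p_ignition = 1e-4          # P(fire occurrence) in a 1 km^2-day cell
    p_large_given_ign = 0.02   # P(large fire | ignition)
    p_large = p_ignition * p_large_given_ign
    print(f"P(large fire) = {p_large:.1e} per cell-day")
    ```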

  3. Evaluating Perceived Probability of Threat-Relevant Outcomes and Temporal Orientation in Flying Phobia.

    PubMed

    Mavromoustakos, Elena; Clark, Gavin I; Rock, Adam J

    2016-01-01

    Probability bias regarding threat-relevant outcomes has been demonstrated across anxiety disorders but has not been investigated in flying phobia. Individual temporal orientation (time perspective) may be hypothesised to influence estimates of negative outcomes occurring. The present study investigated whether probability bias could be demonstrated in flying phobia and whether probability estimates of negative flying events were predicted by time perspective. Sixty flying-phobic and fifty-five non-flying-phobic adults were recruited to complete an online questionnaire. Participants completed the Flight Anxiety Scale, the Probability Scale (measuring perceived probability of flying-negative, general-negative and general-positive events) and the Past-Negative, Future and Present-Hedonistic subscales of the Zimbardo Time Perspective Inventory (variables argued to predict mental travel forward and backward in time). The flying-phobic group estimated the probability of flying-negative and general-negative events occurring as significantly higher than non-flying-phobic participants did. Past-Negative scores (positively) and Present-Hedonistic scores (negatively) predicted probability estimates of flying-negative events. The Future Orientation subscale did not significantly predict probability estimates. This study is the first to demonstrate probability bias for threat-relevant outcomes in flying phobia. Results suggest that time perspective may influence perceived probability of threat-relevant outcomes, but the nature of this relationship remains to be determined.

  4. Evaluating Perceived Probability of Threat-Relevant Outcomes and Temporal Orientation in Flying Phobia

    PubMed Central

    Mavromoustakos, Elena; Clark, Gavin I.; Rock, Adam J.

    2016-01-01

    Probability bias regarding threat-relevant outcomes has been demonstrated across anxiety disorders but has not been investigated in flying phobia. Individual temporal orientation (time perspective) may be hypothesised to influence estimates of negative outcomes occurring. The present study investigated whether probability bias could be demonstrated in flying phobia and whether probability estimates of negative flying events were predicted by time perspective. Sixty flying-phobic and fifty-five non-flying-phobic adults were recruited to complete an online questionnaire. Participants completed the Flight Anxiety Scale, the Probability Scale (measuring perceived probability of flying-negative, general-negative and general-positive events) and the Past-Negative, Future and Present-Hedonistic subscales of the Zimbardo Time Perspective Inventory (variables argued to predict mental travel forward and backward in time). The flying-phobic group estimated the probability of flying-negative and general-negative events occurring as significantly higher than non-flying-phobic participants did. Past-Negative scores (positively) and Present-Hedonistic scores (negatively) predicted probability estimates of flying-negative events. The Future Orientation subscale did not significantly predict probability estimates. This study is the first to demonstrate probability bias for threat-relevant outcomes in flying phobia. Results suggest that time perspective may influence perceived probability of threat-relevant outcomes, but the nature of this relationship remains to be determined. PMID:27557054

  5. Temporal scaling in information propagation.

    PubMed

    Huang, Junming; Li, Chao; Wang, Wen-Qiang; Shen, Hua-Wei; Li, Guojie; Cheng, Xue-Qi

    2014-06-18

    For the study of information propagation, one fundamental problem is uncovering the universal laws governing its dynamics. From the microscopic perspective, this problem is formulated as estimating the propagation probability that a piece of information propagates from one individual to another. Such a propagation probability generally depends on two major classes of factors: the intrinsic attractiveness of the information and the interactions between individuals. Although the temporal effect of attractiveness is widely studied, the temporal laws underlying individual interactions remain unclear, causing inaccurate prediction of information propagation on evolving social networks. In this report, we empirically study the dynamics of information propagation using a dataset from a population-scale social media website. We discover a temporal scaling in information propagation: the probability that a message propagates between two individuals decays with the time elapsed since their latest interaction, following a power law. Leveraging this scaling law, we propose a temporal model to estimate future propagation probabilities between individuals, reducing the error rate of information propagation prediction from 6.7% to 2.6% and improving viral marketing with 9.7% incremental customers.

  6. Temporal scaling in information propagation

    NASA Astrophysics Data System (ADS)

    Huang, Junming; Li, Chao; Wang, Wen-Qiang; Shen, Hua-Wei; Li, Guojie; Cheng, Xue-Qi

    2014-06-01

    For the study of information propagation, one fundamental problem is uncovering the universal laws governing its dynamics. From the microscopic perspective, this problem is formulated as estimating the propagation probability that a piece of information propagates from one individual to another. Such a propagation probability generally depends on two major classes of factors: the intrinsic attractiveness of the information and the interactions between individuals. Although the temporal effect of attractiveness is widely studied, the temporal laws underlying individual interactions remain unclear, causing inaccurate prediction of information propagation on evolving social networks. In this report, we empirically study the dynamics of information propagation using a dataset from a population-scale social media website. We discover a temporal scaling in information propagation: the probability that a message propagates between two individuals decays with the time elapsed since their latest interaction, following a power law. Leveraging this scaling law, we propose a temporal model to estimate future propagation probabilities between individuals, reducing the error rate of information propagation prediction from 6.7% to 2.6% and improving viral marketing with 9.7% incremental customers.
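
    The scaling law described in the two records above can be illustrated by recovering a power-law exponent from simulated decay data; the generating values below are arbitrary and unrelated to the paper's dataset.

    ```python
    import numpy as np

    # Sketch: fit p(t) ~ c * t**(-alpha) for propagation probability versus
    # time since the last interaction, via least squares in log-log space.
    rng = np.random.default_rng(0)
    t = np.arange(1, 200)                                        # time latency (arbitrary units)
    p = 0.1 * t ** -0.8 * np.exp(rng.normal(0, 0.05, t.size))    # noisy power law

    slope, intercept = np.polyfit(np.log(t), np.log(p), 1)
    print(f"estimated alpha = {-slope:.2f}, c = {np.exp(intercept):.3f}")
    ```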

  7. Modeling summer month hydrological drought probabilities in the United States using antecedent flow conditions

    USGS Publications Warehouse

    Austin, Samuel H.; Nelms, David L.

    2017-01-01

    Climate change raises concern that risks of hydrological drought may be increasing. We estimate hydrological drought probabilities for rivers and streams in the United States (U.S.) using maximum likelihood logistic regression (MLLR). Streamflow data from winter months are used to estimate the chance of hydrological drought during summer months. Daily streamflow data collected from 9,144 stream gages from January 1, 1884 through January 9, 2014 provide hydrological drought streamflow probabilities for July, August, and September as functions of streamflows during October, November, December, January, and February, estimating outcomes 5-11 months ahead of their occurrence. Few drought prediction methods exploit temporal links among streamflows. We find that MLLR modeling of drought streamflow probabilities exploits the explanatory power of temporally linked water flows. MLLR models with strong correct classification rates were produced for streams throughout the U.S. An ad hoc test of predictions of September 2013 hydrological droughts exceeded 90% correct classification. Some of the best-performing models coincide with areas of high concern including the West, the Midwest, Texas, the Southeast, and the Mid-Atlantic. Using hydrological drought MLLR probability estimates in a water management context can inform understanding of drought streamflow conditions, provide warning of future drought conditions, and aid water management decision making.
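
    As a sketch of the MLLR idea (a maximum-likelihood logistic fit linking an antecedent predictor to a binary drought outcome), the following uses simulated flows and a single covariate; the real models use thousands of gages and several antecedent months.

    ```python
    import numpy as np
    from scipy.optimize import minimize
    from scipy.special import expit

    # Minimal maximum-likelihood logistic regression: predict a binary summer
    # drought indicator from standardized winter streamflow. Data simulated.
    rng = np.random.default_rng(1)
    winter_flow = rng.lognormal(3.0, 0.5, 500)
    x = (winter_flow - winter_flow.mean()) / winter_flow.std()
    drought = (rng.random(500) < expit(-0.5 - 1.5 * x)).astype(float)  # low flow -> drought

    def nll(beta):
        eta = beta[0] + beta[1] * x
        return -np.sum(drought * eta - np.log1p(np.exp(eta)))  # Bernoulli log-likelihood

    fit = minimize(nll, x0=np.zeros(2))
    print("intercept, slope:", fit.x)  # slope should recover roughly -1.5
    ```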

  8. Semiparametric temporal process regression of survival-out-of-hospital.

    PubMed

    Zhan, Tianyu; Schaubel, Douglas E

    2018-05-23

    The recurrent/terminal event data structure has undergone considerable methodological development in the last 10-15 years. An example of the data structure that has arisen with increasing frequency involves the recurrent event being hospitalization and the terminal event being death. We consider the response Survival-Out-of-Hospital, defined as a temporal process (indicator function) taking the value 1 when the subject is currently alive and not hospitalized, and 0 otherwise. Survival-Out-of-Hospital is a useful alternative strategy for the analysis of hospitalization/survival in the chronic disease setting, with the response variate representing a refinement to survival time through the incorporation of an objective quality-of-life component. The semiparametric model we consider assumes multiplicative covariate effects and leaves unspecified the baseline probability of being alive-and-out-of-hospital. Using zero-mean estimating equations, the proposed regression parameter estimator can be computed without estimating the unspecified baseline probability process, although baseline probabilities can subsequently be estimated for any time point within the support of the censoring distribution. We demonstrate that the regression parameter estimator is asymptotically normal, and that the baseline probability function estimator converges to a Gaussian process. Simulation studies are performed to show that our estimating procedures have satisfactory finite-sample performance. The proposed methods are applied to the Dialysis Outcomes and Practice Patterns Study (DOPPS), an international end-stage renal disease study.
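
    A crude way to see the Survival-Out-of-Hospital response is to simulate subjects and average the indicator across them at each time point; this unadjusted empirical curve (below, with invented event times) is only a stand-in for the paper's semiparametric estimating-equation approach.

    ```python
    import numpy as np

    # Hedged sketch: indicator that is 1 when alive and out of hospital, and
    # its cross-sectional mean as a crude baseline probability curve.
    rng = np.random.default_rng(10)
    n, grid = 300, np.linspace(0, 24, 49)          # follow-up in months
    death = rng.exponential(30, n)                 # hypothetical death times
    hosp_start = rng.uniform(0, 24, (n, 2))        # two hospitalizations per subject
    hosp_len = rng.exponential(0.5, (n, 2))

    def out_of_hospital(t):
        in_hosp = ((hosp_start <= t) & (t < hosp_start + hosp_len)).any(axis=1)
        return (death > t) & ~in_hosp

    curve = np.array([out_of_hospital(t).mean() for t in grid])
    print(curve.round(2)[::8])                     # probability at a few time points
    ```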

  9. Spatio-Temporal Pattern Recognition Using Hidden Markov Models

    DTIC Science & Technology

    1994-06-01

    [The indexed text for this report is fragmentary, consisting of reference-list and table-of-contents excerpts; the recoverable content concerns Baum-Welch re-estimation, the state-transition probability matrix A, and the observation probability matrix B of a hidden Markov model.]

  10. Sampling design trade-offs in occupancy studies with imperfect detection: examples and software

    USGS Publications Warehouse

    Bailey, L.L.; Hines, J.E.; Nichols, J.D.

    2007-01-01

    Researchers have used occupancy, or probability of occupancy, as a response or state variable in a variety of studies (e.g., habitat modeling), and occupancy is increasingly favored by numerous state, federal, and international agencies engaged in monitoring programs. Recent advances in estimation methods have emphasized that reliable inferences can be made from these types of studies if detection and occupancy probabilities are simultaneously estimated. The need for temporal replication at sampled sites to estimate detection probability creates a trade-off between spatial replication (number of sample sites distributed within the area of interest/inference) and temporal replication (number of repeated surveys at each site). Here, we discuss a suite of questions commonly encountered during the design phase of occupancy studies, and we describe software (program GENPRES) developed to allow investigators to easily explore design trade-offs focused on particularities of their study system and sampling limitations. We illustrate the utility of program GENPRES using an amphibian example from Greater Yellowstone National Park, USA.
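
    The spatial-versus-temporal replication trade-off comes from the joint likelihood for occupancy (psi) and detection (p); a minimal single-season sketch, with simulated detection histories rather than GENPRES, is:

    ```python
    import numpy as np
    from scipy.optimize import minimize
    from scipy.special import expit

    # S sites, K repeat surveys; estimate occupancy psi and per-survey
    # detection probability p jointly by maximum likelihood. Data simulated.
    rng = np.random.default_rng(2)
    S, K, psi_true, p_true = 200, 4, 0.6, 0.3
    z = rng.random(S) < psi_true                       # true occupancy states
    detections = rng.random((S, K)) < (p_true * z[:, None])
    d = detections.sum(axis=1)                         # detections per site

    def nll(theta):
        psi, p = expit(theta)                          # keep parameters in (0, 1)
        lik_occ = psi * p ** d * (1 - p) ** (K - d)    # occupied-site contribution
        lik = np.where(d > 0, lik_occ, lik_occ + (1 - psi))  # never-detected sites
        return -np.sum(np.log(lik))

    fit = minimize(nll, x0=np.zeros(2))
    print("psi, p =", expit(fit.x))
    ```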

  11. Probability of atrial fibrillation after ablation: Using a parametric nonlinear temporal decomposition mixed effects model.

    PubMed

    Rajeswaran, Jeevanantham; Blackstone, Eugene H; Ehrlinger, John; Li, Liang; Ishwaran, Hemant; Parides, Michael K

    2018-01-01

    Atrial fibrillation is an arrhythmic disorder where the electrical signals of the heart become irregular. The probability of atrial fibrillation (binary response) is often time varying in a structured fashion, as is the influence of associated risk factors. A generalized nonlinear mixed effects model is presented to estimate the time-related probability of atrial fibrillation using a temporal decomposition approach to reveal the pattern of the probability of atrial fibrillation and their determinants. This methodology generalizes to patient-specific analysis of longitudinal binary data with possibly time-varying effects of covariates and with different patient-specific random effects influencing different temporal phases. The motivation and application of this model is illustrated using longitudinally measured atrial fibrillation data obtained through weekly trans-telephonic monitoring from an NIH sponsored clinical trial being conducted by the Cardiothoracic Surgery Clinical Trials Network.

  12. [WebSurvCa: web-based estimation of death and survival probabilities in a cohort].

    PubMed

    Clèries, Ramon; Ameijide, Alberto; Buxó, Maria; Vilardell, Mireia; Martínez, José Miguel; Alarcón, Francisco; Cordero, David; Díez-Villanueva, Ana; Yasui, Yutaka; Marcos-Gragera, Rafael; Vilardell, Maria Loreto; Carulla, Marià; Galceran, Jaume; Izquierdo, Ángel; Moreno, Víctor; Borràs, Josep M

    2018-01-19

    Relative survival has been used as a measure of the temporal evolution of the excess risk of death of a cohort of patients diagnosed with cancer, taking into account the mortality of a reference population. Once the excess risk of death has been estimated, three probabilities can be computed at time T: 1) the crude probability of death associated with the cause of initial diagnosis (disease under study), 2) the crude probability of death associated with other causes, and 3) the probability of absolute survival in the cohort at time T. This paper presents the WebSurvCa application (https://shiny.snpstats.net/WebSurvCa/), whereby hospital-based and population-based cancer registries and registries of other diseases can estimate such probabilities in their cohorts by selecting the mortality of the relevant region (reference population).

  13. Environmental DNA sampling is more sensitive than a traditional survey technique for detecting an aquatic invader.

    PubMed

    Smart, Adam S; Tingley, Reid; Weeks, Andrew R; van Rooyen, Anthony R; McCarthy, Michael A

    2015-10-01

    Effective management of alien species requires detecting populations in the early stages of invasion. Environmental DNA (eDNA) sampling can detect aquatic species at relatively low densities, but few studies have directly compared detection probabilities of eDNA sampling with those of traditional sampling methods. We compare the ability of a traditional sampling technique (bottle trapping) and eDNA to detect a recently established invader, the smooth newt Lissotriton vulgaris vulgaris, at seven field sites in Melbourne, Australia. Over a four-month period, per-trap detection probabilities ranged from 0.01 to 0.26 among sites where L. v. vulgaris was detected, whereas per-sample eDNA estimates were much higher (0.29-1.0). Detection probabilities of both methods varied temporally (across days and months), but temporal variation appeared to be uncorrelated between methods. Only estimates of spatial variation were strongly correlated across the two sampling techniques. Environmental variables (water depth, rainfall, ambient temperature) were not clearly correlated with detection probabilities estimated via trapping, whereas eDNA detection probabilities were negatively correlated with water depth, possibly reflecting higher eDNA concentrations at lower water levels. Our findings demonstrate that eDNA sampling can be an order of magnitude more sensitive than traditional methods, and illustrate that traditional- and eDNA-based surveys can provide independent information on species distributions when occupancy surveys are conducted over short timescales.

  14. Small-Scale Spatio-Temporal Distribution of Bactrocera minax (Enderlein) (Diptera: Tephritidae) Using Probability Kriging.

    PubMed

    Wang, S Q; Zhang, H Y; Li, Z L

    2016-10-01

    Understanding the spatio-temporal distribution of a pest in orchards can provide important information for designing monitoring schemes and establishing better means of pest control. In this study, the spatial and temporal distribution of Bactrocera minax (Enderlein) (Diptera: Tephritidae) was assessed, and activity trends were evaluated, by using probability kriging. Adults of B. minax were captured in two successive occurrences in a small-scale citrus orchard by using food bait traps, which were placed both inside and outside the orchard. The weekly spatial distribution of B. minax within the orchard and adjacent woods was examined using semivariogram parameters. Concentration at the edge was evident during most weeks of the adult occurrence, and the adult population aggregated with high probability within a band less than 100 m wide on either side of the boundary between the orchard and the woods. The sequential probability kriged maps showed that the adults were estimated in the marginal zone with higher probability, especially in the early and peak stages. The feeding, ovipositing, and mating behaviors of B. minax are possible explanations for these spatio-temporal patterns. Therefore, the spatial arrangement of traps or spraying spots, and their distance to the forest edge, should be considered to enhance control of B. minax in small-scale orchards.
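
    Probability kriging builds on the semivariogram; a minimal sketch of the empirical semivariogram computation (with simulated trap coordinates and catches standing in for the B. minax data) is:

    ```python
    import numpy as np

    # Empirical semivariogram: gamma(h) = 0.5 * E[(Z(s) - Z(s+h))^2], binned
    # by pair distance. Locations and counts below are hypothetical.
    rng = np.random.default_rng(3)
    xy = rng.uniform(0, 500, (80, 2))           # trap locations (m)
    z = rng.poisson(5, 80).astype(float)        # trap catches

    diff = xy[:, None, :] - xy[None, :, :]
    dist = np.hypot(diff[..., 0], diff[..., 1])
    sq = 0.5 * (z[:, None] - z[None, :]) ** 2
    upper = np.triu(np.ones_like(dist, bool), 1)  # each pair counted once

    bins = np.arange(0, 300, 50)
    for lo, hi in zip(bins[:-1], bins[1:]):
        mask = (dist > lo) & (dist <= hi) & upper
        print(f"h in ({lo:3d},{hi:3d}] m: gamma = {sq[mask].mean():.2f} ({mask.sum()} pairs)")
    ```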

  15. Modeling utilization distributions in space and time

    USGS Publications Warehouse

    Keating, K.A.; Cherry, S.

    2009-01-01

    W. Van Winkle defined the utilization distribution (UD) as a probability density that gives an animal's relative frequency of occurrence in a two-dimensional (x, y) plane. We extend Van Winkle's work by redefining the UD as the relative frequency distribution of an animal's occurrence in all four dimensions of space and time. We then describe a product kernel model estimation method, devising a novel kernel from the wrapped Cauchy distribution to handle circularly distributed temporal covariates, such as day of year. Using Monte Carlo simulations of animal movements in space and time, we assess estimator performance. Although not unbiased, the product kernel method yields models highly correlated (Pearson's r = 0.975) with true probabilities of occurrence and successfully captures temporal variations in density of occurrence. In an empirical example, we estimate the expected UD in three dimensions (x, y, and t) for animals belonging to each of two distinct bighorn sheep (Ovis canadensis) social groups in Glacier National Park, Montana, USA. Results show the method can yield ecologically informative models that successfully depict temporal variations in density of occurrence for a seasonally migratory species. Some implications of this new approach to UD modeling are discussed.
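
    The wrapped Cauchy kernel mentioned here has a closed form, so a small sketch can show how a circular covariate such as day of year is handled; the observations and concentration parameter below are invented.

    ```python
    import numpy as np

    # Wrapped Cauchy kernel density for a circular covariate. rho in (0, 1)
    # acts like an inverse bandwidth: larger rho concentrates the kernel.
    def wrapped_cauchy(theta, mu, rho):
        """Density on the circle at angle theta for kernel center mu."""
        return (1 - rho**2) / (2 * np.pi * (1 + rho**2 - 2 * rho * np.cos(theta - mu)))

    days = np.array([5, 20, 350, 355, 180])       # observed days of year (note wrap at 365)
    angles = 2 * np.pi * days / 365.0             # map to the circle

    grid = 2 * np.pi * np.arange(365) / 365.0
    density = wrapped_cauchy(grid[:, None], angles[None, :], rho=0.9).mean(axis=1)
    print("peak day:", density.argmax(), "integral ~", density.sum() * 2 * np.pi / 365)
    ```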

  16. High monetary reward rates and caloric rewards decrease temporal persistence

    PubMed Central

    Fung, Bowen J.; Bode, Stefan; Murawski, Carsten

    2017-01-01

    Temporal persistence refers to an individual's capacity to wait for future rewards, while forgoing possible alternatives. This requires a trade-off between the potential value of delayed rewards and opportunity costs, and is relevant to many real-world decisions, such as dieting. Theoretical models have previously suggested that high monetary reward rates, or positive energy balance, may result in decreased temporal persistence. In our study, 50 fasted participants engaged in a temporal persistence task, incentivised with monetary rewards. In alternating blocks of this task, rewards were delivered at delays drawn randomly from distributions with either a lower or higher maximum reward rate. During some blocks participants received either a caloric drink or water. We used survival analysis to estimate participants' probability of quitting conditional on the delay distribution and the consumed liquid. Participants had a higher probability of quitting in blocks with the higher reward rate. Furthermore, participants who consumed the caloric drink had a higher probability of quitting than those who consumed water. Our results support the predictions from the theoretical models, and importantly, suggest that both higher monetary reward rates and physiologically relevant rewards can decrease temporal persistence, which is a crucial determinant for survival in many species. PMID:28228517

  17. High monetary reward rates and caloric rewards decrease temporal persistence.

    PubMed

    Fung, Bowen J; Bode, Stefan; Murawski, Carsten

    2017-02-22

    Temporal persistence refers to an individual's capacity to wait for future rewards, while forgoing possible alternatives. This requires a trade-off between the potential value of delayed rewards and opportunity costs, and is relevant to many real-world decisions, such as dieting. Theoretical models have previously suggested that high monetary reward rates, or positive energy balance, may result in decreased temporal persistence. In our study, 50 fasted participants engaged in a temporal persistence task, incentivised with monetary rewards. In alternating blocks of this task, rewards were delivered at delays drawn randomly from distributions with either a lower or higher maximum reward rate. During some blocks participants received either a caloric drink or water. We used survival analysis to estimate participants' probability of quitting conditional on the delay distribution and the consumed liquid. Participants had a higher probability of quitting in blocks with the higher reward rate. Furthermore, participants who consumed the caloric drink had a higher probability of quitting than those who consumed water. Our results support the predictions from the theoretical models, and importantly, suggest that both higher monetary reward rates and physiologically relevant rewards can decrease temporal persistence, which is a crucial determinant for survival in many species.
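
    The survival-analysis step in the two records above (probability of still waiting as a function of delay) can be sketched with a bare-bones Kaplan-Meier estimator on simulated quitting and reward times:

    ```python
    import numpy as np

    # Kaplan-Meier sketch for "probability of still waiting" versus elapsed
    # delay. Quitting times and reward (censoring) times are simulated.
    rng = np.random.default_rng(4)
    quit_time = rng.exponential(8.0, 100)          # seconds waited before quitting
    reward_time = rng.uniform(2.0, 12.0, 100)      # reward arrival censors the wait
    time = np.minimum(quit_time, reward_time)
    event = quit_time <= reward_time               # True if the participant quit

    order = np.argsort(time)
    time, event = time[order], event[order]
    at_risk = np.arange(len(time), 0, -1)
    surv = np.cumprod(1.0 - event / at_risk)       # product-limit estimator

    for t in (2, 5, 10):
        print(f"P(still waiting > {t}s) ~ "
              f"{surv[time <= t][-1] if (time <= t).any() else 1.0:.2f}")
    ```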

  18. Statistics of single unit responses in the human medial temporal lobe: A sparse and overdispersed code

    NASA Astrophysics Data System (ADS)

    Magyar, Andrew

    The recent discovery of cells in the human medial temporal lobe (MTL) that respond to purely conceptual features of the environment (particular people, landmarks, objects, etc.) has raised many questions about the nature of the neural code in humans. The goal of this dissertation is to develop a novel statistical method based upon maximum likelihood regression, which is then applied to these experiments to produce a quantitative description of the coding properties of the human MTL. In general, the method is applicable to any experiment in which a sequence of stimuli is presented to an organism while the binary responses of a large number of cells are recorded in parallel. The central concept underlying the approach is the total probability that a neuron responds to a random stimulus, called the neuronal sparsity. The model then estimates the distribution of response probabilities across the population of cells. Applying the method to single-unit recordings from the human medial temporal lobe, estimates of the sparsity distributions are acquired in four regions: the hippocampus, the entorhinal cortex, the amygdala, and the parahippocampal cortex. The resulting distributions are found to be sparse (a large fraction of cells with a low response probability) and highly non-uniform, with a large proportion of ultra-sparse neurons that possess a very low response probability and a smaller population of cells that respond much more frequently. Ramifications of the results are discussed in relation to the sparse coding hypothesis, and comparisons are made between the statistics of the human medial temporal lobe cells and place cells observed in the rodent hippocampus.

  19. Time-dependent landslide probability mapping

    USGS Publications Warehouse

    Campbell, Russell H.; Bernknopf, Richard L.; ,

    1993-01-01

    Case studies where the time of failure is known for rainfall-triggered debris flows can be used to estimate the parameters of a hazard model in which the probability of failure is a function of time. As an example, a time-dependent function for the conditional probability of a soil slip is estimated from independent variables representing hillside morphology, approximations of material properties, and the duration and rate of rainfall. If probabilities are calculated in a GIS (geographic information system) environment, the spatial distribution of the result for any given hour can be displayed on a map. Although the probability levels in this example are uncalibrated, the method offers a potential for evaluating different physical models and different earth-science variables by comparing the map distribution of predicted probabilities with inventory maps for different areas and different storms. If linked with spatial and temporal socio-economic variables, this method could be used for short-term risk assessment.

  20. Sources of variation in extinction rates, turnover, and diversity of marine invertebrate families during the Paleozoic

    USGS Publications Warehouse

    Nichols, J.D.; Morris, R.W.; Brownie, C.; Pollock, K.H.

    1986-01-01

    The authors present a new method that can be used to estimate taxonomic turnover in conjunction with stratigraphic range data for families in five phyla of Paleozoic marine invertebrates. Encounter probabilities varied among taxa and showed evidence of a decrease over time for the geologic series examined. The number of families varied substantially among the five phyla and showed some evidence of an increase over the series examined. There was no evidence of variation in extinction probabilities among the phyla. Although there was evidence of temporal variation in extinction probabilities within phyla, there was no evidence of a linear decrease in extinction probabilities over time, as has been reported by others. The authors did find evidence of high extinction probabilities for the two intervals that had been identified by others as periods of mass extinction. They found no evidence of variation in turnover among the five phyla. There was evidence of temporal variation in turnover, with greater turnover occurring in the older series.

  1. Using areas of known occupancy to identify sources of variation in detection probability of raptors: taking time lowers replication effort for surveys.

    PubMed

    Murn, Campbell; Holloway, Graham J

    2016-10-01

    Species occurring at low density can be difficult to detect and, if not properly accounted for, imperfect detection will lead to inaccurate estimates of occupancy. Understanding sources of variation in detection probability and how they can be managed is a key part of monitoring. We used sightings data of a low-density and elusive raptor (the white-headed vulture, Trigonoceps occipitalis) in areas of known occupancy (breeding territories) in a likelihood-based modelling approach to calculate detection probability and the factors affecting it. Because occupancy was known a priori to be 100%, we fixed the model occupancy parameter to 1.0 and focused on identifying sources of variation in detection probability. Using detection histories from 359 territory visits, we assessed nine covariates in 29 candidate models. The model with the highest support indicated that observer speed during a survey, combined with temporal covariates such as time of year and length of time within a territory, had the greatest influence on detection probability. Averaged detection probability was 0.207 (s.e. 0.033); based on this, the mean number of visits required to determine within 95% confidence that white-headed vultures are absent from a breeding area is 13 (95% CI: 9-20). Topographical and habitat covariates contributed little to the best models and had little effect on detection probability. We highlight that the low detection probabilities of some species mean that emphasizing habitat covariates could lead to spurious results in occupancy models that do not also incorporate temporal components. While variation in detection probability is complex and influenced by effects at both temporal and spatial scales, temporal covariates can and should be controlled as part of robust survey methods. Our results emphasize the importance of accounting for detection probability in occupancy studies, particularly during presence/absence studies for species such as raptors that are widespread and occur at low densities.
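
    The abstract's survey-effort figure can be reproduced directly: if each visit independently detects a present bird with probability p, then n visits miss it with probability (1 - p)^n, so solving (1 - p)^n = 0.05 gives the effort needed for 95% confidence.

    ```python
    import math

    # Worked check of the abstract's calculation: n = ln(0.05) / ln(1 - p).
    p = 0.207
    n = math.log(0.05) / math.log(1.0 - p)
    print(f"visits needed: {n:.1f} -> {math.ceil(n)}")  # ~12.9 -> 13, matching the abstract
    ```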

  2. Spatiotemporal variation in reproductive parameters of yellow-bellied marmots.

    PubMed

    Ozgul, Arpat; Oli, Madan K; Olson, Lucretia E; Blumstein, Daniel T; Armitage, Kenneth B

    2007-11-01

    Spatiotemporal variation in reproductive rates is a common phenomenon in many wildlife populations, but the population dynamic consequences of spatial and temporal variability in different components of reproduction remain poorly understood. We used 43 years (1962-2004) of data from 17 locations and a capture-mark-recapture (CMR) modeling framework to investigate the spatiotemporal variation in reproductive parameters of yellow-bellied marmots (Marmota flaviventris), and its influence on the realized population growth rate. Specifically, we estimated and modeled breeding probabilities of two-year-old females (earliest age of first reproduction), >2-year-old females that have not reproduced before (subadults), and >2-year-old females that have reproduced before (adults), as well as the litter sizes of two-year-old and >2-year-old females. Most reproductive parameters exhibited spatial and/or temporal variation. However, reproductive parameters differed with respect to their relative influence on the realized population growth rate (lambda). Litter size had a stronger influence than did breeding probabilities on both spatial and temporal variations in lambda. Our analysis indicated that lambda was proportionately more sensitive to survival than to recruitment. However, the annual fluctuation in litter size, abetted by the breeding probabilities, accounted for most of the temporal variation in lambda.

  3. Detecting temporal trends in species assemblages with bootstrapping procedures and hierarchical models

    USGS Publications Warehouse

    Gotelli, Nicholas J.; Dorazio, Robert M.; Ellison, Aaron M.; Grossman, Gary D.

    2010-01-01

    Quantifying patterns of temporal trends in species assemblages is an important analytical challenge in community ecology. We describe methods of analysis that can be applied to a matrix of counts of individuals that is organized by species (rows) and time-ordered sampling periods (columns). We first developed a bootstrapping procedure to test the null hypothesis of random sampling from a stationary species abundance distribution with temporally varying sampling probabilities. This procedure can be modified to account for undetected species. We next developed a hierarchical model to estimate species-specific trends in abundance while accounting for species-specific probabilities of detection. We analysed two long-term datasets on stream fishes and grassland insects to demonstrate these methods. For both assemblages, the bootstrap test indicated that temporal trends in abundance were more heterogeneous than expected under the null model. We used the hierarchical model to estimate trends in abundance and identified sets of species in each assemblage that were steadily increasing, decreasing or remaining constant in abundance over more than a decade of standardized annual surveys. Our methods of analysis are broadly applicable to other ecological datasets, and they represent an advance over most existing procedures, which do not incorporate effects of incomplete sampling and imperfect detection.
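
    A stripped-down version of the bootstrap null test described above: hold column (sampling-period) totals and overall species proportions fixed, resample counts, and compare an observed trend-heterogeneity statistic with its null distribution. The data and the statistic's exact form here are illustrative choices, not the paper's.

    ```python
    import numpy as np

    # Null model: stationary species-abundance distribution with period-specific
    # sampling intensities; counts resampled multinomially per period.
    rng = np.random.default_rng(5)
    counts = rng.poisson(rng.uniform(1, 20, (12, 1)) * rng.uniform(0.5, 1.5, (1, 8)))
    # rows: 12 species, columns: 8 time-ordered sampling periods

    def trend_heterogeneity(m):
        """Variance across species of the slope of log(count + 1) on time."""
        t = np.arange(m.shape[1])
        slopes = [np.polyfit(t, np.log(row + 1.0), 1)[0] for row in m]
        return np.var(slopes)

    obs = trend_heterogeneity(counts)
    col_totals = counts.sum(axis=0)
    props = counts.sum(axis=1) / counts.sum()
    null = [trend_heterogeneity(np.column_stack(
                [rng.multinomial(n, props) for n in col_totals]))
            for _ in range(999)]
    p_value = (1 + sum(s >= obs for s in null)) / 1000.0
    print(f"observed heterogeneity {obs:.4f}, bootstrap P = {p_value:.3f}")
    ```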

  4. Bias adjustment of infrared-based rainfall estimation using Passive Microwave satellite rainfall data

    NASA Astrophysics Data System (ADS)

    Karbalaee, Negar; Hsu, Kuolin; Sorooshian, Soroosh; Braithwaite, Dan

    2017-04-01

    This study explores using Passive Microwave (PMW) rainfall estimation for spatial and temporal adjustment of Precipitation Estimation from Remotely Sensed Information using Artificial Neural Networks-Cloud Classification System (PERSIANN-CCS). The PERSIANN-CCS algorithm collects information from infrared images to estimate rainfall. PERSIANN-CCS is one of the algorithms used in the Integrated Multisatellite Retrievals for GPM (Global Precipitation Measurement) estimation for time periods when PMW rainfall estimates are limited or unavailable. Continued improvement of PERSIANN-CCS will support Integrated Multisatellite Retrievals for GPM for current as well as retrospective estimates of global precipitation. This study takes advantage of the high spatial and temporal resolution of GEO-based PERSIANN-CCS estimation and the more effective, but lower sampling frequency, PMW estimation. The Probability Matching Method (PMM) was used to adjust the rainfall distribution of GEO-based PERSIANN-CCS toward that of the PMW rainfall estimation. The results show that a significant improvement in global PERSIANN-CCS rainfall estimation is obtained.
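
    Probability matching amounts to quantile mapping between two rain-rate distributions; a hedged sketch with simulated IR and PMW samples:

    ```python
    import numpy as np

    # Quantile mapping: remap each IR-based rain rate to the PMW rain rate with
    # the same cumulative probability, so the adjusted IR distribution matches
    # the PMW (reference) distribution. Inputs are simulated.
    rng = np.random.default_rng(6)
    ir = rng.gamma(0.8, 2.0, 5000)      # IR-based (PERSIANN-CCS-like) rain rates, mm/h
    pmw = rng.gamma(1.2, 2.5, 3000)     # PMW rain rates, treated as reference

    ir_sorted = np.sort(ir)
    pmw_quant = np.quantile(pmw, np.linspace(0, 1, ir_sorted.size))

    def match(values):
        """Map IR values onto the PMW distribution via the empirical CDFs."""
        return np.interp(values, ir_sorted, pmw_quant)

    print("IR mean before/after:", ir.mean().round(2), match(ir).mean().round(2),
          "PMW mean:", pmw.mean().round(2))
    ```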

  5. Intra-annual patterns in adult band-tailed pigeon survival estimates

    USGS Publications Warehouse

    Casazza, Michael L.; Coates, Peter S.; Overton, Cory T.; Howe, Kristy H.

    2015-01-01

    Implications: We present the first inter-seasonal analysis of survival probability of the Pacific coast race of band-tailed pigeons and illustrate important temporal patterns that may influence future species management including harvest strategies and disease monitoring.

  6. Discriminability limits in spatio-temporal stereo block matching.

    PubMed

    Jain, Ankit K; Nguyen, Truong Q

    2014-05-01

    Disparity estimation is a fundamental task in stereo imaging and is a well-studied problem. Recently, methods have been adapted to the video domain where motion is used as a matching criterion to help disambiguate spatially similar candidates. In this paper, we analyze the validity of the underlying assumptions of spatio-temporal disparity estimation, and determine the extent to which motion aids the matching process. By analyzing the error signal for spatio-temporal block matching under the sum of squared differences criterion and treating motion as a stochastic process, we determine the probability of a false match as a function of image features, motion distribution, image noise, and number of frames in the spatio-temporal patch. This performance quantification provides insight into when spatio-temporal matching is most beneficial in terms of the scene and motion, and can be used as a guide to select parameters for stereo matching algorithms. We validate our results through simulation and experiments on stereo video.

  7. Effect of radar rainfall time resolution on the predictive capability of a distributed hydrologic model

    NASA Astrophysics Data System (ADS)

    Atencia, A.; Llasat, M. C.; Garrote, L.; Mediero, L.

    2010-10-01

    The performance of distributed hydrological models depends on the resolution, both spatial and temporal, of the input rainfall surface data. The estimation of quantitative precipitation from meteorological radar or satellite can improve hydrological model results, thanks to indirect estimation at higher spatial and temporal resolution. In this work, composite radar data from a network of three C-band radars, with 6-min temporal and 2 × 2 km² spatial resolution, provided by the Catalan Meteorological Service, are used to feed the RIBS distributed hydrological model. A Window Probability Matching Method (a gage-adjustment method) is applied to four cases of heavy rainfall to correct the underestimation of observed rainfall in both the convective and stratiform Z/R relations used over Catalonia. Once the rainfall field has been adequately obtained, an advection correction, based on cross-correlation between two consecutive images, was introduced to derive several time resolutions from 1 min to 30 min. Each resolution is treated as an independent event, resulting in a probable range of input rainfall data. This ensemble of rainfall data is used, together with other sources of uncertainty, such as the initial basin state or the accuracy of discharge measurements, to calibrate the RIBS model using a probabilistic methodology. A sensitivity analysis of time resolutions was implemented by comparing the various results with observed values from stream-flow measurement stations.
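
    The advection-correction step rests on locating the peak of the cross-correlation between consecutive radar fields; a toy sketch (synthetic fields, displacement known in advance) is:

    ```python
    import numpy as np
    from scipy.signal import fftconvolve

    # Estimate the displacement between two consecutive fields from the peak of
    # their cross-correlation; real inputs would be 2 x 2 km reflectivity grids.
    rng = np.random.default_rng(7)
    field0 = rng.random((64, 64))
    field1 = np.roll(field0, shift=(3, -2), axis=(0, 1))   # "advected" by (3, -2) pixels

    a = field0 - field0.mean()
    b = field1 - field1.mean()
    corr = fftconvolve(b, a[::-1, ::-1], mode="same")      # cross-correlation via FFT
    iy, ix = np.unravel_index(corr.argmax(), corr.shape)
    dy, dx = iy - corr.shape[0] // 2, ix - corr.shape[1] // 2
    print("estimated displacement (rows, cols):", dy, dx)  # expect (3, -2)
    ```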

  8. Spatio-temporal variation in click production rates of beaked whales: Implications for passive acoustic density estimation.

    PubMed

    Warren, Victoria E; Marques, Tiago A; Harris, Danielle; Thomas, Len; Tyack, Peter L; Aguilar de Soto, Natacha; Hickmott, Leigh S; Johnson, Mark P

    2017-03-01

    Passive acoustic monitoring has become an increasingly prevalent tool for estimating density of marine mammals, such as beaked whales, which vocalize often but are difficult to survey visually. Counts of acoustic cues (e.g., vocalizations), when corrected for detection probability, can be translated into animal density estimates by applying an individual cue production rate multiplier. It is essential to understand variation in these rates to avoid biased estimates. The most direct way to measure cue production rate is with animal-mounted acoustic recorders. This study utilized data from sound recording tags deployed on Blainville's (Mesoplodon densirostris, 19 deployments) and Cuvier's (Ziphius cavirostris, 16 deployments) beaked whales, in two locations per species, to explore spatial and temporal variation in click production rates. No spatial or temporal variation was detected within the average click production rate of Blainville's beaked whales when calculated over dive cycles (including silent periods between dives); however, spatial variation was detected when averaged only over vocal periods. Cuvier's beaked whales exhibited significant spatial and temporal variation in click production rates within vocal periods and when silent periods were included. This evidence of variation emphasizes the need to utilize appropriate cue production rates when estimating density from passive acoustic data.
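
    For context, cue-count density estimation combines the quantities this study measures; one common form is D = n(1 - c) / (a T p r), where r is the cue (click) production rate whose variability is at issue here. All numbers below are hypothetical, and the exact formula varies with survey design.

    ```python
    # Hedged sketch of cue-count density estimation: detected cue count n,
    # false-positive proportion c, monitored area a, monitoring time T,
    # detection probability p, and per-animal cue rate r.
    n, c = 20000, 0.05          # detected clicks and false-positive proportion
    a, T = 1000.0, 24.0         # km^2 monitored and hours of monitoring
    p = 0.3                     # probability of detecting a produced click
    r = 2500.0                  # clicks per animal per hour (cue production rate)
    density = n * (1 - c) / (a * T * p * r)
    print(f"estimated density: {density:.5f} animals per km^2")
    ```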

  9. Hidden Markov models for fault detection in dynamic systems

    NASA Technical Reports Server (NTRS)

    Smyth, Padhraic J. (Inventor)

    1995-01-01

    The invention is a system failure monitoring method and apparatus which learns the symptom-fault mapping directly from training data. The invention first estimates the state of the system at discrete intervals in time. A feature vector x of dimension k is estimated from sets of successive windows of sensor data. A pattern recognition component then models the instantaneous estimate of the posterior class probability given the features, p(w_i | x), 1 ≤ i ≤ m. Finally, a hidden Markov model is used to take advantage of temporal context and estimate class probabilities conditioned on recent past history. In this hierarchical pattern of information flow, the time series data is transformed and mapped into a categorical representation (the fault classes) and integrated over time to enable robust decision-making.

  10. Hidden Markov models for fault detection in dynamic systems

    NASA Technical Reports Server (NTRS)

    Smyth, Padhraic J. (Inventor)

    1993-01-01

    The invention is a system failure monitoring method and apparatus which learns the symptom-fault mapping directly from training data. The invention first estimates the state of the system at discrete intervals in time. A feature vector x of dimension k is estimated from sets of successive windows of sensor data. A pattern recognition component then models the instantaneous estimate of the posterior class probability given the features, p(w_i | x), 1 ≤ i ≤ m. Finally, a hidden Markov model is used to take advantage of temporal context and estimate class probabilities conditioned on recent past history. In this hierarchical pattern of information flow, the time series data is transformed and mapped into a categorical representation (the fault classes) and integrated over time to enable robust decision-making.
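
    The temporal-context stage described in these two patent records is a forward (filtering) recursion; the sketch below uses the recognizer's instantaneous class posteriors as evidence weights, a common hybrid approximation, with a made-up transition matrix and prior.

    ```python
    import numpy as np

    # HMM forward filter: turn instantaneous class probabilities p(w_i | x_t)
    # into probabilities conditioned on the observation history. A and the
    # prior are hypothetical; a strict forward algorithm would use
    # class-conditional likelihoods rather than posteriors.
    A = np.array([[0.98, 0.02],        # transition matrix (healthy, faulty)
                  [0.01, 0.99]])
    prior = np.array([0.99, 0.01])

    def forward_filter(instantaneous):
        """instantaneous: (T, 2) rows of p(w_i | x_t) from the recognizer."""
        belief, out = prior.copy(), []
        for p_inst in instantaneous:
            belief = (A.T @ belief) * p_inst   # predict with A, weight by evidence
            belief /= belief.sum()             # renormalize
            out.append(belief.copy())
        return np.array(out)

    obs = np.array([[0.9, 0.1]] * 5 + [[0.4, 0.6]] * 5)   # noisy hints of a fault
    print(forward_filter(obs).round(3))
    ```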

  11. Evaluation of some random effects methodology applicable to bird ringing data

    USGS Publications Warehouse

    Burnham, K.P.; White, Gary C.

    2002-01-01

    Existing models for ring recovery and recapture data analysis treat temporal variations in annual survival probability (S) as fixed effects. Often there is no explainable structure to the temporal variation in S_1, ..., S_k; random effects can then be a useful model: S_i = E(S) + ε_i. Here, the temporal variation in survival probability is treated as random with variance E(ε²) = σ². This random effects model can now be fit in program MARK. Resultant inferences include point and interval estimation for the process variation, σ², and estimation of E(S) and var(Ê(S)), where the latter includes a component for σ² as well as the traditional component for var(Ŝ | S). Furthermore, the random effects model leads to shrinkage estimates, S̃_i, as improved (in mean square error) estimators of S_i compared to the MLE, Ŝ_i, from the unrestricted time-effects model. Appropriate confidence intervals based on the S̃_i are also provided. In addition, AIC has been generalized to random effects models. This paper presents results of a Monte Carlo evaluation of inference performance under the simple random effects model. Examined by simulation, under the simple one-group Cormack-Jolly-Seber (CJS) model, are issues such as bias of σ̂², confidence interval coverage on σ², coverage and mean square error comparisons for inference about S_i based on shrinkage versus maximum likelihood estimators, and performance of AIC model selection over three models: S_i ≡ S (no effects), S_i = E(S) + ε_i (random effects), and S_1, ..., S_k (fixed effects). For the cases simulated, the random effects methods performed well and were uniformly better than fixed-effects MLE for the S_i.
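
    A moment-based caricature of the random-effects machinery (MARK uses a more refined likelihood-based approach): estimate σ² by subtracting the mean sampling variance from the variance of the annual MLEs, then shrink each estimate toward the mean. All numbers are hypothetical.

    ```python
    import numpy as np

    # Method-of-moments process variance and shrinkage estimates.
    s_hat = np.array([0.62, 0.55, 0.70, 0.58, 0.65, 0.60])   # annual survival MLEs
    se = np.array([0.04, 0.05, 0.06, 0.04, 0.05, 0.04])      # their standard errors

    mean_s = np.average(s_hat, weights=1 / se**2)            # precision-weighted mean
    sigma2 = max(0.0, np.var(s_hat, ddof=1) - np.mean(se**2))  # process variance
    weight = sigma2 / (sigma2 + se**2)                       # shrinkage factors
    s_tilde = mean_s + weight * (s_hat - mean_s)             # shrunken estimates
    print("sigma^2 =", round(sigma2, 5))
    print("shrunken estimates:", s_tilde.round(3))
    ```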

  12. Statistics of some atmospheric turbulence records relevant to aircraft response calculations

    NASA Technical Reports Server (NTRS)

    Mark, W. D.; Fischer, R. W.

    1981-01-01

    Methods for characterizing atmospheric turbulence are described. The methods illustrated include maximum likelihood estimation of the integral scale and intensity of records obeying the von Karman transverse power spectral form, constrained least-squares estimation of the parameters of a parametric representation of autocorrelation functions, estimation of the power spectral density of the instantaneous variance of a record with temporally fluctuating variance, and estimation of the probability density functions of various turbulence components. Descriptions of the computer programs used in the computations are given, and a full listing of these programs is included.

  13. Selective Attention in Pigeon Temporal Discrimination.

    PubMed

    Subramaniam, Shrinidhi; Kyonka, Elizabeth

    2017-07-27

    Cues can vary in how informative they are about when specific outcomes, such as food availability, will occur. This study was an experimental investigation of the functional relation between cue informativeness and temporal discrimination in a peak-interval (PI) procedure. Each session consisted of fixed-interval (FI) 2-s and 4-s schedules of food and occasional 12-s PI trials during which pecks had no programmed consequences. Across conditions, the phi (ϕ) correlation between key light color and FI schedule value was manipulated. Red and green key lights signaled the onset of either or both FI schedules. Different colors were either predictive (ϕ = 1), moderately predictive (ϕ = 0.2-0.8), or not predictive (ϕ = 0) of a specific FI schedule. This study tested the hypothesis that temporal discrimination is a function of the momentary conditional probability of food; that is, pigeons peck the most at either 2 s or 4 s when ϕ = 1 and peck at both intervals when ϕ < 1. Response distributions were bimodal Gaussian curves; distributions from red- and green-key PI trials converged when ϕ ≤ 0.6. Peak times estimated by summed Gaussian functions, averaged across conditions and pigeons, were 1.85 s and 3.87 s; however, pigeons did not always maximize the momentary probability of food. When key light color was highly correlated with FI schedules (ϕ ≥ 0.6), estimates of peak times indicated that temporal discrimination accuracy was reduced at the unlikely interval, but not the likely interval. The mechanism of this reduced temporal discrimination accuracy could be interpreted as an attentional process.

  14. Anticipating abrupt shifts in temporal evolution of probability of eruption

    NASA Astrophysics Data System (ADS)

    Rohmer, J.; Loschetter, A.

    2016-04-01

    Estimating the probability of eruption by jointly accounting for different sources of monitoring parameters over time is a key component of volcano risk management. In the present study, we are interested in the transition from a state of low-to-moderate probability to a state of high probability. Using data from the MESIMEX exercise at the Vesuvius volcano, we investigated the potential of time-varying indicators related to the correlation structure or to the variability of the probability time series for detecting this critical transition in advance. We found that changes in the power spectra and in the standard deviation estimated over a rolling time window both show an abrupt increase that marks the approaching shift. Our numerical experiments revealed that the transition from an eruption probability of 10-15% to >70% could be identified up to 1-3 h in advance. This additional lead time could be useful for placing key services (e.g., emergency services for vulnerable groups, commandeering additional transportation means, etc.) on a higher level of alert before the actual call for evacuation.
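
    The rolling-window indicators the study tracks are easy to compute; the sketch below applies a rolling standard deviation and lag-1 autocorrelation to a synthetic probability series with a built-in abrupt shift.

    ```python
    import numpy as np

    # Early-warning indicators on a synthetic eruption-probability series:
    # both tend to rise as the abrupt shift (here near t = 400) approaches.
    rng = np.random.default_rng(8)
    t = np.arange(500)
    prob = 0.12 + 0.6 / (1 + np.exp(-(t - 400) / 10.0)) + rng.normal(0, 0.01, t.size)

    win = 50
    for end in range(win, t.size, 100):
        w = prob[end - win:end]
        ac1 = np.corrcoef(w[:-1], w[1:])[0, 1]
        print(f"t={end:3d}: rolling sd={w.std():.4f}, lag-1 autocorr={ac1:.2f}")
    ```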

  15. Spatial and Temporal Analysis of Eruption Locations, Compositions, and Styles in Northern Harrat Rahat, Kingdom of Saudi Arabia

    NASA Astrophysics Data System (ADS)

    Dietterich, H. R.; Stelten, M. E.; Downs, D. T.; Champion, D. E.

    2017-12-01

    Harrat Rahat is a predominantly mafic, 20,000 km² volcanic field in western Saudi Arabia with an elongate volcanic axis extending 310 km north-south. Prior mapping suggests that the youngest eruptions were concentrated in northernmost Harrat Rahat, where our new geologic mapping and geochronology reveal >300 eruptive vents with ages ranging from 1.2 Ma to a historic eruption in 1256 CE. Eruption compositions and styles vary spatially and temporally within the volcanic field, where extensive alkali basaltic lavas dominate, but more evolved compositions erupted episodically as clusters of trachytic domes and small-volume pyroclastic flows. Analysis of vent locations, compositions, and eruption styles shows the evolution of the volcanic field and allows assessment of the spatio-temporal probabilities of vent opening and eruption styles. We link individual vents and fissures to eruptions and their deposits using field relations, petrography, geochemistry, paleomagnetism, and ⁴⁰Ar/³⁹Ar and ³⁶Cl geochronology. Eruption volumes and deposit extents are derived from geologic mapping and topographic analysis. Spatial density analysis with kernel density estimation captures vent densities of up to 0.2%/km² along the north-south-running volcanic axis, decaying quickly away to the east but reaching a second, lower high along a secondary axis to the west. Temporal trends show slight younging of mafic eruption ages to the north in the past 300 ka, as well as clustered eruptions of trachytes over the past 150 ka. Vent locations, timing, and composition are integrated through spatial probability weighted by eruption age for each compositional range to produce spatio-temporal models of vent opening probability. These show that the next mafic eruption is most probable within the north end of the main (eastern) volcanic axis, whereas more evolved compositions are most likely to erupt within the trachytic centers further to the south. These vent opening probabilities, combined with corresponding eruption properties, can be used as the basis for lava flow and tephra fall hazard maps.
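
    The vent-density step can be sketched with an off-the-shelf 2-D kernel density estimate; the coordinates below are random stand-ins, and the study's age- and composition-weighted analysis is more elaborate.

    ```python
    import numpy as np
    from scipy.stats import gaussian_kde

    # 2-D KDE over mapped vent coordinates, evaluated on a grid.
    rng = np.random.default_rng(9)
    vents = np.vstack([rng.normal(0, 5, 300),      # x (km): tight axis
                       rng.normal(0, 30, 300)])    # y (km): N-S elongation

    kde = gaussian_kde(vents)
    xg, yg = np.meshgrid(np.linspace(-20, 20, 41), np.linspace(-80, 80, 41))
    density = kde(np.vstack([xg.ravel(), yg.ravel()])).reshape(xg.shape)
    print("max density (per km^2):", density.max().round(5))
    ```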

  16. Quantification of EEG reactivity in comatose patients

    PubMed Central

    Hermans, Mathilde C.; Westover, M. Brandon; van Putten, Michel J.A.M.; Hirsch, Lawrence J.; Gaspard, Nicolas

    2016-01-01

    Objective: EEG reactivity is an important predictor of outcome in comatose patients. However, visual analysis of reactivity is prone to subjectivity and may benefit from quantitative approaches. Methods: In EEG segments recorded during reactivity testing in 59 comatose patients, 13 quantitative EEG parameters were used to compare the spectral characteristics of 1-minute segments before and after the onset of stimulation (spectral temporal symmetry). Reactivity was quantified with probability values estimated using combinations of these parameters. The accuracy of probability values as a reactivity classifier was evaluated against the consensus assessment of three expert clinical electroencephalographers using visual analysis. Results: The binary classifier assessing spectral temporal symmetry in four frequency bands (delta, theta, alpha and beta) showed the best accuracy (median AUC: 0.95) and was accompanied by substantial agreement with the individual opinion of experts (Gwet's AC1: 65–70%), at least as good as inter-expert agreement (AC1: 55%). Probability values also reflected the degree of reactivity, as measured by the inter-experts' agreement regarding reactivity for each individual case. Conclusion: Automated quantitative EEG approaches based on a probabilistic description of spectral temporal symmetry reliably quantify EEG reactivity. Significance: Quantitative EEG may be useful for evaluating reactivity in comatose patients, offering increased objectivity. PMID:26183757

  17. Application of a multistate model to estimate culvert effects on movement of small fishes

    USGS Publications Warehouse

    Norman, J.R.; Hagler, M.M.; Freeman, Mary C.; Freeman, B.J.

    2009-01-01

    While it is widely acknowledged that culverted road-stream crossings may impede fish passage, the effects of culverts on movement of nongame and small-bodied fishes have not been extensively studied, and studies generally have not accounted for spatial variation in capture probabilities. We estimated probabilities for upstream and downstream movement of small (30-120 mm standard length) benthic and water-column fishes across stream reaches with and without culverts at four road-stream crossings over a 4-6-week period. Movement and reach-specific capture probabilities were estimated using multistate capture-recapture models. Although none of the culverts were complete barriers to passage, only a bottomless-box culvert appeared to permit unrestricted upstream and downstream movements by benthic fishes, based on model estimates of movement probabilities. At two box culverts that were perched above the water surface at base flow, observed movements were limited to water-column fishes and to intervals when runoff from storm events raised water levels above the perched level. Only a single fish was observed to move through a partially embedded pipe culvert. Estimates of probabilities of movement over distances equal to at least the length of one culvert were low (e.g., generally ≤0.03, estimated for 1-2-week intervals) and had wide 95% confidence intervals as a consequence of few observed movements to nonadjacent reaches. Estimates of capture probabilities varied among reaches by a factor of 2 to over 10, illustrating the importance of accounting for spatially variable capture rates when estimating movement probabilities with capture-recapture data. Longer-term studies are needed to evaluate temporal variability in stream fish passage at culverts (e.g., in relation to streamflow variability) and to thereby better quantify the degree of population fragmentation caused by road-stream crossings with culverts.

  18. Effects of environmental covariates and density on the catchability of fish populations and interpretation of catch per unit effort trends

    USGS Publications Warehouse

    Korman, Josh; Yard, Mike

    2017-01-01

    Quantifying temporal and spatial trends in abundance or relative abundance is required to evaluate effects of harvest and changes in habitat for exploited and endangered fish populations. In many cases, the proportion of the population or stock that is captured (catchability or capture probability) is unknown but is often assumed to be constant over space and time. We used data from a large-scale mark-recapture study to evaluate the extent of spatial and temporal variation, and the effects of fish density, fish size, and environmental covariates, on the capture probability of rainbow trout (Oncorhynchus mykiss) in the Colorado River, AZ. Estimates of capture probability for boat electrofishing varied 5-fold across five reaches, 2.8-fold across the range of fish densities that were encountered, 2.1-fold over 19 trips, and 1.6-fold over five fish size classes. Shoreline angle and turbidity were the best covariates explaining variation in capture probability across reaches and trips. Patterns in capture probability were driven by changes in gear efficiency and spatial aggregation, but the latter was more important. Failure to account for effects of fish density on capture probability when translating a historical catch per unit effort time series into a time series of abundance led to 2.5-fold underestimation of the maximum extent of variation in abundance over the period of record, and resulted in unreliable estimates of relative change in critical years. Catch per unit effort surveys have utility for monitoring long-term trends in relative abundance, but are too imprecise and potentially biased to evaluate population response to habitat changes or to modest changes in fishing effort.

  19. Spatial and temporal patterns of chronic wasting disease: Fine-scale mapping of a wildlife epidemic in Wisconsin

    USGS Publications Warehouse

    Osnas, E.E.; Heisey, D.M.; Rolley, R.E.; Samuel, M.D.

    2009-01-01

    Emerging infectious diseases threaten wildlife populations and human health. Understanding the spatial distributions of these new diseases is important for disease management and policy makers; however, the data are complicated by heterogeneities across host classes, sampling variance, sampling biases, and the space-time epidemic process. Ignoring these issues can lead to false conclusions or obscure important patterns in the data, such as spatial variation in disease prevalence. Here, we applied hierarchical Bayesian disease mapping methods to account for risk factors and to estimate spatial and temporal patterns of infection by chronic wasting disease (CWD) in white-tailed deer (Odocoileus virginianus) of Wisconsin, USA. We found significant heterogeneities for infection due to age, sex, and spatial location. Infection probability increased with age for all young deer, increased with age faster for young males, and then declined for some older animals, as expected from disease-associated mortality and age-related changes in infection risk. We found that disease prevalence was clustered in a central location, as expected under a simple spatial epidemic process where disease prevalence should increase with time and expand spatially. However, we could not detect any consistent temporal or spatiotemporal trends in CWD prevalence. Estimates of the temporal trend indicated that prevalence may have decreased or increased with nearly equal posterior probability, and the model without temporal or spatiotemporal effects was nearly equivalent to models with these effects based on deviance information criteria. For maximum interpretability of the role of location as a disease risk factor, we used the technique of direct standardization for prevalence mapping, which we develop and describe. These mapping results allow disease management actions to be employed with reference to the estimated spatial distribution of the disease and to those host classes most at risk. Future wildlife epidemiology studies should employ hierarchical Bayesian methods to smooth estimated quantities across space and time, account for heterogeneities, and then report disease rates based on an appropriate standardization. © 2009 by the Ecological Society of America.
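
    The direct standardization the authors describe can be sketched in a few lines: class-specific prevalence estimates are averaged with fixed standard-population weights, so mapped differences reflect location rather than sample composition. The class structure and all numbers here are hypothetical.

    ```python
    import numpy as np

    classes = ["young F", "young M", "adult F", "adult M"]
    std_weights = np.array([0.30, 0.30, 0.20, 0.20])   # assumed standard population mix

    # Estimated class-specific infection probabilities at two locations
    prev_loc_A = np.array([0.01, 0.02, 0.04, 0.08])
    prev_loc_B = np.array([0.02, 0.03, 0.05, 0.10])

    for name, prev in [("A", prev_loc_A), ("B", prev_loc_B)]:
        standardized = np.sum(std_weights * prev)       # weighted average over classes
        print(f"location {name}: standardized prevalence = {standardized:.3f}")
    ```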

  20. Trackline and point detection probabilities for acoustic surveys of Cuvier's and Blainville's beaked whales.

    PubMed

    Barlow, Jay; Tyack, Peter L; Johnson, Mark P; Baird, Robin W; Schorr, Gregory S; Andrews, Russel D; Aguilar de Soto, Natacha

    2013-09-01

    Acoustic survey methods can be used to estimate density and abundance using sounds produced by cetaceans and detected using hydrophones if the probability of detection can be estimated. For passive acoustic surveys, probability of detection at zero horizontal distance from a sensor, commonly called g(0), depends on the temporal patterns of vocalizations. Methods to estimate g(0) are developed based on the assumption that a beaked whale will be detected if it is producing regular echolocation clicks directly under or above a hydrophone. Data from acoustic recording tags placed on two species of beaked whales (Cuvier's beaked whale-Ziphius cavirostris and Blainville's beaked whale-Mesoplodon densirostris) are used to directly estimate the percentage of time they produce echolocation clicks. A model of vocal behavior for these species as a function of their diving behavior is applied to other types of dive data (from time-depth recorders and time-depth-transmitting satellite tags) to indirectly determine g(0) in other locations for low ambient noise conditions. Estimates of g(0) for a single instant in time are 0.28 [standard deviation (s.d.) = 0.05] for Cuvier's beaked whale and 0.19 (s.d. = 0.01) for Blainville's beaked whale.
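
    Given tag records of clicking bouts, the instantaneous g(0) described above is simply the fraction of recorded time spent producing clicks. A toy computation with invented bout intervals:

    ```python
    # Estimate instantaneous g(0) as the fraction of time a tagged whale produces
    # regular echolocation clicks (illustrative intervals, in minutes).
    click_bouts = [(12.0, 35.5), (95.0, 121.0), (180.5, 204.0)]  # (start, end) of clicking
    record_length = 600.0                                        # total tag record

    clicking_time = sum(end - start for start, end in click_bouts)
    g0_instant = clicking_time / record_length
    print(f"estimated instantaneous g(0) = {g0_instant:.2f}")
    ```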

  1. COMDYN: Software to study the dynamics of animal communities using a capture-recapture approach

    USGS Publications Warehouse

    Hines, J.E.; Boulinier, T.; Nichols, J.D.; Sauer, J.R.; Pollock, K.H.

    1999-01-01

    COMDYN is a set of programs developed for estimation of parameters associated with community dynamics using count data from two locations or time periods. It is Internet-based, allowing remote users either to input their own data, or to use data from the North American Breeding Bird Survey for analysis. COMDYN allows probability of detection to vary among species and among locations and time periods. The basic estimator for species richness underlying all estimators is the jackknife estimator proposed by Burnham and Overton. Estimators are presented for quantities associated with temporal change in species richness, including rate of change in species richness over time, local extinction probability, local species turnover and number of local colonizing species. Estimators are also presented for quantities associated with spatial variation in species richness, including relative richness at two locations and proportion of species present in one location that are also present at a second location. Application of the estimators to species richness estimation has been previously described and justified. The potential applications of these programs are discussed.
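
    A minimal sketch of the first-order jackknife richness estimator of Burnham and Overton that COMDYN builds on: Ŝ = S_obs + f1(k-1)/k, where f1 is the number of species detected on exactly one of k occasions. The detection matrix below is illustrative.

    ```python
    import numpy as np

    # Rows = species, columns = sampling occasions; 1 = detected, 0 = not detected.
    detections = np.array([
        [1, 1, 0, 1],
        [0, 1, 0, 0],   # detected on exactly one occasion
        [1, 0, 1, 1],
        [0, 0, 1, 0],   # detected on exactly one occasion
        [1, 1, 1, 1],
    ])
    k = detections.shape[1]                        # number of occasions
    s_obs = int((detections.sum(axis=1) > 0).sum())
    f1 = int((detections.sum(axis=1) == 1).sum())  # species seen on a single occasion

    s_jack1 = s_obs + f1 * (k - 1) / k             # first-order jackknife estimate
    print(f"observed richness: {s_obs}, jackknife estimate: {s_jack1:.2f}")
    ```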

  2. Spectrally-Temporally Adapted Spectrally Modulated Spectrally Encoded (SMSE) Waveform Design for Coexistent CR-Based SDR Applications

    DTIC Science & Technology

    2010-03-01

    uses all available resources in some optimized manner. By further exploiting the design flexibility and computational efficiency of Orthogonal Frequency...in the following sections. 3.2.1 Estimation of PU Signal Statistics. The Estimate PU Signal Statistics function of Fig 3.4 is used to compute the...consecutive PU transmissions, and 4) the probability of transitioning from one transmission state to another. These statistics are then used to compute the

  3. Accounting for randomness in measurement and sampling in studying cancer cell population dynamics.

    PubMed

    Ghavami, Siavash; Wolkenhauer, Olaf; Lahouti, Farshad; Ullah, Mukhtar; Linnebacher, Michael

    2014-10-01

    Knowing the expected temporal evolution of the proportion of different cell types in sample tissues gives an indication of the progression of the disease and its possible response to drugs. Such systems have been modelled using Markov processes. We here consider an experimentally realistic scenario in which transition probabilities are estimated from noisy cell population size measurements. Using aggregated data from FACS measurements, we develop MMSE and ML estimators and formulate two problems to find the minimum number of samples and measurements required to guarantee the accuracy of predicted population sizes. Our numerical results show that estimated transition probabilities and steady states differ widely from the real values if one uses the standard deterministic approach for noisy measurements. This supports our argument that, for the analysis of FACS data, the observed state should be treated as a random variable. The second problem we address concerns the consequences of estimating the probability of a cell being in a particular state from measurements of a small population of cells. We show how the uncertainty arising from small sample sizes can be captured by a distribution for the state probability.
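
    For reference, the noise-free maximum-likelihood baseline the paper departs from is just row-normalized transition counts; the authors' point is that this breaks down when the observed states are themselves noisy. A sketch with invented cell-state sequences:

    ```python
    import numpy as np

    # ML estimate of Markov transition probabilities from exactly observed state
    # sequences. With noisy population measurements (the paper's setting), treating
    # observations as exact biases estimates like this one.
    states = 3
    sequences = [[0, 0, 1, 2, 2, 1, 0], [1, 2, 2, 2, 0, 1]]  # illustrative cell-state paths

    counts = np.zeros((states, states))
    for seq in sequences:
        for a, b in zip(seq[:-1], seq[1:]):
            counts[a, b] += 1

    row_sums = counts.sum(axis=1, keepdims=True)
    P_hat = np.divide(counts, row_sums, out=np.zeros_like(counts), where=row_sums > 0)
    print(np.round(P_hat, 2))
    ```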

  4. Real-time Mainshock Forecast by Statistical Discrimination of Foreshock Clusters

    NASA Astrophysics Data System (ADS)

    Nomura, S.; Ogata, Y.

    2016-12-01

    Foreshock discrimination is one of the most effective ways to forecast large mainshocks over short time scales. Though many large earthquakes are preceded by foreshocks, discriminating them from the enormous number of small earthquakes is difficult, and only probabilistic evaluation based on their spatio-temporal features and magnitude evolution may be available. Logistic regression is the statistical learning method best suited to such binary pattern-recognition problems, where estimates of the a-posteriori probability of class membership are required. Statistical learning methods can keep learning discriminating features from an updating catalog and give probabilistic recognition forecasts in real time. We estimated a non-linear function of foreshock proportion using smooth spline bases and evaluated the possibility of foreshocks via the logit function. In this study, we classified foreshocks from the earthquake catalog of the Japan Meteorological Agency using single-link clustering methods and learned spatial and temporal features of foreshocks by probability density ratio estimation. We use epicentral locations, time spans, and differences in magnitude for learning and forecasting. Magnitudes of mainshocks are also predicted by incorporating b-values into our method. We discuss the spatial pattern of foreshocks revealed by the classifier composed from our model. We also implement a back test to validate the predictive performance of the model on this catalog.
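
    A minimal sketch of the classification idea, using scikit-learn on synthetic features rather than the JMA catalog and the authors' spline/density-ratio machinery; the feature definitions and coefficients are invented for illustration.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n = 500
    X = np.column_stack([
        rng.exponential(10.0, n),    # inter-event time span (days), hypothetical
        rng.exponential(5.0, n),     # epicentral spread (km), hypothetical
        rng.normal(0.0, 0.5, n),     # magnitude difference within cluster, hypothetical
    ])
    # Synthetic labels: tighter, shorter-span clusters are more often foreshocks here
    logit = -1.0 - 0.05 * X[:, 0] - 0.1 * X[:, 1] + 0.8 * X[:, 2]
    y = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

    clf = LogisticRegression().fit(X, y)
    p_foreshock = clf.predict_proba(X[:5])[:, 1]   # a-posteriori class probabilities
    print(np.round(p_foreshock, 3))
    ```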

  5. Estimation of Flattened Musk Turtle (Sternotherus depressus) survival, recapture, and recovery rate during and after a disease outbreak

    USGS Publications Warehouse

    Fonnesbeck, C.J.; Dodd, C.K.

    2003-01-01

    We estimated survivorship, recapture probabilities and recovery rates in a threatened population of Flattened Musk Turtles (Sternotherus depressus) through a disease outbreak in Alabama in 1985. We evaluated a set of models for the demographic effects of disease by analyzing recaptures and recoveries simultaneously. Multiple-model inference suggested survival was temporally dynamic, whereas recapture probability was sex- and age-specific. Biweekly survivorship declined from 98-99% before the outbreak to 82-88% during it. Live recapture was twice as likely for male turtles relative to juveniles or females, whereas dead recoveries varied only slightly by sex and age. Our results suggest a modest reduction in survival over a relatively short time period may severely affect population status.

  6. Quantification of EEG reactivity in comatose patients.

    PubMed

    Hermans, Mathilde C; Westover, M Brandon; van Putten, Michel J A M; Hirsch, Lawrence J; Gaspard, Nicolas

    2016-01-01

    EEG reactivity is an important predictor of outcome in comatose patients. However, visual analysis of reactivity is prone to subjectivity and may benefit from quantitative approaches. In EEG segments recorded during reactivity testing in 59 comatose patients, 13 quantitative EEG parameters were used to compare the spectral characteristics of 1-minute segments before and after the onset of stimulation (spectral temporal symmetry). Reactivity was quantified with probability values estimated using combinations of these parameters. The accuracy of probability values as a reactivity classifier was evaluated against the consensus assessment of three expert clinical electroencephalographers using visual analysis. The binary classifier assessing spectral temporal symmetry in four frequency bands (delta, theta, alpha and beta) showed best accuracy (Median AUC: 0.95) and was accompanied by substantial agreement with the individual opinion of experts (Gwet's AC1: 65-70%), at least as good as inter-expert agreement (AC1: 55%). Probability values also reflected the degree of reactivity, as measured by the inter-experts' agreement regarding reactivity for each individual case. Automated quantitative EEG approaches based on probabilistic description of spectral temporal symmetry reliably quantify EEG reactivity. Quantitative EEG may be useful for evaluating reactivity in comatose patients, offering increased objectivity. Copyright © 2015 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
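
    The pre/post spectral comparison can be sketched as band-power ratios between the two 1-minute segments; a large departure from unity in some band suggests reactivity. This toy version uses a synthetic signal and assumes a 256 Hz sampling rate, not the paper's exact parameterization.

    ```python
    import numpy as np
    from scipy.signal import welch

    fs = 256                                    # sampling rate (Hz), assumed
    rng = np.random.default_rng(2)
    t = np.arange(60 * fs) / fs
    pre = rng.normal(size=60 * fs)              # 60 s before stimulation onset
    post = rng.normal(size=60 * fs) + 2.0 * np.sin(2 * np.pi * 6 * t)  # added theta power

    f, p_pre = welch(pre, fs=fs, nperseg=4 * fs)
    _, p_post = welch(post, fs=fs, nperseg=4 * fs)

    bands = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}
    for name, (lo, hi) in bands.items():
        sel = (f >= lo) & (f < hi)
        ratio = p_post[sel].mean() / p_pre[sel].mean()   # post/pre band-power ratio
        print(f"{name}: post/pre band-power ratio = {ratio:.2f}")
    ```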

  7. The multi temporal/multi-model approach to predictive uncertainty assessment in real-time flood forecasting

    NASA Astrophysics Data System (ADS)

    Barbetta, Silvia; Coccia, Gabriele; Moramarco, Tommaso; Brocca, Luca; Todini, Ezio

    2017-08-01

    This work extends the multi-temporal approach of the Model Conditional Processor (MCP-MT) to the multi-model case and to the four Truncated Normal Distributions (TNDs) approach, demonstrating its improvement over the single-temporal approach. The study is framed in the context of probabilistic Bayesian decision-making, which is appropriate for taking rational decisions on uncertain future outcomes. As opposed to the direct use of deterministic forecasts, a probabilistic forecast identifies a predictive probability density function that represents fundamental knowledge of future occurrences. The added value of MCP-MT is the identification of the probability that a critical situation will happen within the forecast lead time and when, most likely, it will occur. MCP-MT is thoroughly tested for both single-model and multi-model configurations at a gauged site on the Tiber River, central Italy. The stages forecasted by two operational deterministic models, STAFOM-RCM and MISDc, are considered for the study. The dataset used for the analysis consists of hourly data from 34 flood events selected from a six-year time series. MCP-MT improves over the original models' forecasts: the peak overestimation and the delayed rising-limb forecast, characterizing MISDc and STAFOM-RCM respectively, are significantly mitigated, with the mean error on peak stage reduced from 45 to 5 cm and the coefficient of persistence increased from 0.53 up to 0.75. The results show that MCP-MT outperforms the single-temporal approach and is potentially useful for supporting decision-making, because the exceedance probability of hydrometric thresholds within a forecast horizon and the most probable flooding time can be estimated.
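
    The final step, turning a predictive distribution into an exceedance probability for a hydrometric threshold, can be sketched independently of the MCP-MT machinery. Assuming, purely for illustration, a Gaussian predictive distribution of stage at a given lead time:

    ```python
    from scipy.stats import norm

    # Illustrative predictive mean and st.dev. of river stage (m) at some lead time
    mu, sigma = 4.2, 0.35
    threshold = 4.8              # hypothetical flood-warning threshold (m)

    p_exceed = 1.0 - norm.cdf(threshold, loc=mu, scale=sigma)
    print(f"P(stage > {threshold} m) = {p_exceed:.3f}")
    ```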

  8. Long-term effective population size dynamics of an intensively monitored vertebrate population

    PubMed Central

    Mueller, A-K; Chakarov, N; Krüger, O; Hoffman, J I

    2016-01-01

    Long-term genetic data from intensively monitored natural populations are important for understanding how effective population sizes (Ne) can vary over time. We therefore genotyped 1622 common buzzard (Buteo buteo) chicks sampled over 12 consecutive years (2002–2013 inclusive) at 15 microsatellite loci. This data set allowed us both to compare single-sample with temporal approaches and to explore temporal patterns in the effective number of parents that produced each cohort in relation to the observed population dynamics. We found reasonable consistency between linkage disequilibrium-based single-sample and temporal estimators, particularly during the latter half of the study, but no clear relationship between annual Ne estimates and census sizes. We also documented a 14-fold increase in estimated Ne between 2008 and 2011, a period during which the census size doubled, probably reflecting a combination of higher adult survival and immigration from further afield. Our study thus reveals appreciable temporal heterogeneity in the effective population size of a natural vertebrate population, confirms the need for long-term studies and cautions against drawing conclusions from a single sample. PMID:27553455

  9. Plethodontid salamander population ecology in managed forest headwaters in the Oregon coast range

    Treesearch

    Matthew R. Kluber; Deanna H. Olson

    2013-01-01

    We examined temporal and spatial patterns of terrestrial amphibian species abundances and individual movements in western Oregon managed headwater forest stands using artificial cover object (ACO) arrays. Using mark-recapture methods, we estimated the effects of species and seasonality on apparent survival rates and recapture probabilities. We captured, marked, and...

  10. Absolute probability estimates of lethal vessel strikes to North Atlantic right whales in Roseway Basin, Scotian Shelf.

    PubMed

    van der Hoop, Julie M; Vanderlaan, Angelia S M; Taggart, Christopher T

    2012-10-01

    Vessel strikes are the primary source of known mortality for the endangered North Atlantic right whale (Eubalaena glacialis). Multi-institutional efforts to reduce mortality associated with vessel strikes include vessel-routing amendments such as the International Maritime Organization voluntary "area to be avoided" (ATBA) in the Roseway Basin right whale feeding habitat on the southwestern Scotian Shelf. Though relative probabilities of lethal vessel strikes have been estimated and published, absolute probabilities remain unknown. We used a modeling approach to determine the regional effect of the ATBA, by estimating reductions in the expected number of lethal vessel strikes. This analysis differs from others in that it explicitly includes a spatiotemporal analysis of real-time transits of vessels through a population of simulated, swimming right whales. Combining automatic identification system (AIS) vessel navigation data and an observationally based whale movement model allowed us to determine the spatial and temporal intersection of vessels and whales, from which various probability estimates of lethal vessel strikes are derived. We estimate one lethal vessel strike every 0.775-2.07 years prior to ATBA implementation, consistent with and more constrained than previous estimates of every 2-16 years. Following implementation, a lethal vessel strike is expected every 41 years. When whale abundance is held constant across years, we estimate that voluntary vessel compliance with the ATBA results in an 82% reduction in the per capita rate of lethal strikes; very similar to a previously published estimate of 82% reduction in the relative risk of a lethal vessel strike. The models we developed can inform decision-making and policy design, based on their ability to provide absolute, population-corrected, time-varying estimates of lethal vessel strikes, and they are easily transported to other regions and situations.

  11. Investigations of potential bias in the estimation of lambda using Pradel's (1996) model for capture-recapture data

    USGS Publications Warehouse

    Hines, James E.; Nichols, James D.

    2002-01-01

    Pradel's (1996) temporal symmetry model permitting direct estimation and modelling of population growth rate, λi, provides a potentially useful tool for the study of population dynamics using marked animals. Because of its recent publication date, the approach has not seen much use, and there have been virtually no investigations directed at robustness of the resulting estimators. Here we consider several potential sources of bias, all motivated by specific uses of this estimation approach. We consider sampling situations in which the study area expands with time and present an analytic expression for the bias in λ̂i. We next consider trap response in capture probabilities and heterogeneous capture probabilities and compute large-sample and simulation-based approximations of resulting bias in λ̂i. These approximations indicate that trap response is an especially important assumption violation that can produce substantial bias. Finally, we consider losses on capture and emphasize the importance of selecting the estimator for λi that is appropriate to the question being addressed. For studies based on only sighting and resighting data, Pradel's (1996) λ̂i′ is the appropriate estimator.

  12. Evaluating the capacity of GF-4 satellite data for estimating fractional vegetation cover

    NASA Astrophysics Data System (ADS)

    Zhang, C.; Qin, Q.; Ren, H.; Zhang, T.; Sun, Y.

    2016-12-01

    Fractional vegetation cover (FVC) is a crucial parameter for many agricultural, environmental, meteorological and ecological applications, and is of great importance for studies on ecosystem structure and function. The Chinese GaoFen-4 (GF-4) geostationary satellite, designed for environmental and ecological observation, was launched on December 29, 2015, and entered official use on June 13, 2016. Multi-spectral images with a spatial resolution of 50 m and high temporal resolution can be acquired by the sensor aboard the GF-4 satellite from its 36,000-km-altitude orbit. To take full advantage of the performance of the GF-4 satellite, this study evaluated the capacity of GF-4 satellite data for monitoring FVC. To the best of our knowledge, this is the first research on estimating FVC from GF-4 satellite images. First, we developed a procedure for preprocessing GF-4 satellite data, including radiometric calibration and atmospheric correction, to acquire surface reflectance. Then single and multi-temporal images were used to extract the endmembers of vegetation and soil, respectively. After that, the dimidiate pixel model and a square model based on vegetation indices were used to estimate FVC. Finally, the estimation results were compared with FVC estimated from other existing sensors. The experimental results showed that satisfactory accuracy of FVC estimation could be achieved from GF-4 satellite images using the dimidiate pixel model and the square model based on vegetation indices. Moreover, the multi-temporal images increased the probability of finding pure vegetation and soil endmembers; thus the high temporal resolution of GF-4 satellite images improved the accuracy of FVC estimation. This study demonstrated the capacity of GF-4 satellite data for monitoring FVC. The conclusions reached by this study are significant for improving the accuracy and spatial-temporal resolution of existing FVC products, providing a basis for studies on ecosystem structure and function using remote sensing data acquired by the GF-4 satellite.
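
    The dimidiate pixel model mentioned above is a linear unmixing of NDVI between soil and vegetation endmembers. A sketch with assumed endmember values (the study extracts these from single and multi-temporal GF-4 imagery):

    ```python
    import numpy as np

    ndvi = np.array([0.12, 0.35, 0.58, 0.80])   # example pixel NDVI values
    ndvi_soil, ndvi_veg = 0.08, 0.85            # assumed pure-soil / pure-vegetation endmembers

    # FVC scales linearly between the two endmembers, clipped to the physical range
    fvc = (ndvi - ndvi_soil) / (ndvi_veg - ndvi_soil)
    fvc = np.clip(fvc, 0.0, 1.0)
    print(np.round(fvc, 2))
    ```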

  13. Inferences about population dynamics from count data using multi-state models: A comparison to capture-recapture approaches

    USGS Publications Warehouse

    Grant, Evan H. Campbell; Zipkin, Elise; Sillett, T. Scott; Chandler, Richard; Royle, J. Andrew

    2014-01-01

    Wildlife populations consist of individuals that contribute disproportionately to growth and viability. Understanding a population's spatial and temporal dynamics requires estimates of abundance and demographic rates that account for this heterogeneity. Estimating these quantities can be difficult, requiring years of intensive data collection. Often, this is accomplished through the capture and recapture of individual animals, which is generally only feasible at a limited number of locations. In contrast, N-mixture models allow for the estimation of abundance, and spatial variation in abundance, from count data alone. We extend recently developed multistate, open population N-mixture models, which can additionally estimate demographic rates based on an organism's life history characteristics. In our extension, we develop an approach to account for the case where not all individuals can be assigned to a state during sampling. Using only state-specific count data, we show how our model can be used to estimate local population abundance, as well as density-dependent recruitment rates and state-specific survival. We apply our model to a population of black-throated blue warblers (Setophaga caerulescens) that have been surveyed for 25 years on their breeding grounds at the Hubbard Brook Experimental Forest in New Hampshire, USA. The intensive data collection efforts allow us to compare our estimates to estimates derived from capture–recapture data. Our model performed well in estimating population abundance and density-dependent rates of annual recruitment/immigration. Estimates of local carrying capacity and per capita recruitment of yearlings were consistent with those published in other studies. However, our model moderately underestimated annual survival probability of yearling and adult females and severely underestimated survival probabilities for both of these male stages. The most accurate and precise estimates will necessarily require some amount of intensive data collection efforts (such as capture–recapture). Integrated population models that combine data from both intensive and extensive sources are likely to be the most efficient approach for estimating demographic rates at large spatial and temporal scales.
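
    For orientation, a minimal sketch of the basic binomial N-mixture likelihood (Royle 2004) that the multistate open-population extension builds on: site abundance is Poisson(λ), repeated counts are Binomial(N, p), and N is marginalized by summation. Counts and the truncation bound are illustrative.

    ```python
    import numpy as np
    from scipy.optimize import minimize
    from scipy.stats import binom, poisson

    y = np.array([[3, 2, 4], [0, 1, 0], [5, 6, 4], [2, 2, 3]])  # counts: sites x visits
    N_MAX = 50                                                   # truncation for the N sum

    def neg_log_lik(params):
        lam, p = np.exp(params[0]), 1 / (1 + np.exp(-params[1]))
        ll = 0.0
        for counts in y:
            n_vals = np.arange(counts.max(), N_MAX + 1)
            # P(y_i | N) * P(N | lam), marginalized over the latent abundance N
            site_lik = sum(
                poisson.pmf(n, lam) * np.prod(binom.pmf(counts, n, p)) for n in n_vals
            )
            ll += np.log(site_lik)
        return -ll

    fit = minimize(neg_log_lik, x0=[np.log(5.0), 0.0], method="Nelder-Mead")
    lam_hat, p_hat = np.exp(fit.x[0]), 1 / (1 + np.exp(-fit.x[1]))
    print(f"lambda-hat = {lam_hat:.2f}, p-hat = {p_hat:.2f}")
    ```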

  14. Spatio-temporal hierarchical modeling of rates and variability of Holocene sea-level changes in the western North Atlantic and the Caribbean

    NASA Astrophysics Data System (ADS)

    Ashe, E.; Kopp, R. E.; Khan, N.; Horton, B.; Engelhart, S. E.

    2016-12-01

    Sea level varies over both space and time. Prior to the instrumental period, the sea-level record depends upon geological reconstructions that contain vertical and temporal uncertainty. Spatio-temporal statistical models enable the interpretation of RSL and rates of change as well as the reconstruction of the entire sea-level field from such noisy data. Hierarchical models explicitly distinguish between a process level, which characterizes the spatio-temporal field, and a data level, at which sparse proxy data and their noise are recorded. A hyperparameter level depicts prior expectations about the structure of variability in the spatio-temporal field. Spatio-temporal hierarchical models are amenable to several analysis approaches, with tradeoffs regarding computational efficiency and comprehensiveness of uncertainty characterization. A fully-Bayesian hierarchical model (BHM), which places prior probability distributions upon the hyperparameters, is more computationally intensive than an empirical hierarchical model (EHM), which uses point estimates of hyperparameters derived from the data [1]. Here, we assess the sensitivity of posterior estimates of relative sea level (RSL) and rates to different statistical approaches by varying prior assumptions about the spatial and temporal structure of sea-level variability and applying multiple analytical approaches to Holocene sea-level proxies along the Atlantic coast of North America and the Caribbean [2]. References: 1. Cressie N, Wikle CK (2011) Statistics for spatio-temporal data (John Wiley & Sons). 2. Khan N et al. (2016). Quaternary Science Reviews (in revision).

  15. Estimating state-transition probabilities for unobservable states using capture-recapture/resighting data

    USGS Publications Warehouse

    Kendall, W.L.; Nichols, J.D.

    2002-01-01

    Temporary emigration was identified some time ago as causing potential problems in capture-recapture studies, and in the last five years approaches have been developed for dealing with special cases of this general problem. Temporary emigration can be viewed more generally as involving transitions to and from an unobservable state, and frequently the state itself is one of biological interest (e.g., 'nonbreeder'). Development of models that permit estimation of relevant parameters in the presence of an unobservable state requires either extra information (e.g., as supplied by Pollock's robust design) or the following classes of model constraints: reducing the order of Markovian transition probabilities, imposing a degree of determinism on transition probabilities, removing state specificity of survival probabilities, and imposing temporal constancy of parameters. The objective of the work described in this paper is to investigate estimability of model parameters under a variety of models that include an unobservable state. Beginning with a very general model and no extra information, we used numerical methods to systematically investigate the use of ancillary information and constraints to yield models that are useful for estimation. The result is a catalog of models for which estimation is possible. An example analysis of sea turtle capture-recapture data under two different models showed similar point estimates but increased precision for the model that incorporated ancillary data (the robust design) when compared to the model with deterministic transitions only. This comparison and the results of our numerical investigation of model structures lead to design suggestions for capture-recapture studies in the presence of an unobservable state.

  16. A robust design mark-resight abundance estimator allowing heterogeneity in resighting probabilities

    USGS Publications Warehouse

    McClintock, B.T.; White, Gary C.; Burnham, K.P.

    2006-01-01

    This article introduces the beta-binomial estimator (BBE), a closed-population abundance mark-resight model combining the favorable qualities of maximum likelihood theory and the allowance of individual heterogeneity in sighting probability (p). The model may be parameterized for a robust sampling design consisting of multiple primary sampling occasions where closure need not be met between primary occasions. We applied the model to brown bear data from three study areas in Alaska and compared its performance to the joint hypergeometric estimator (JHE) and Bowden's estimator (BOWE). BBE estimates suggest heterogeneity levels were non-negligible and discourage the use of JHE for these data. Compared to JHE and BOWE, confidence intervals were considerably shorter for the AICc model-averaged BBE. To evaluate the properties of BBE relative to JHE and BOWE when sample sizes are small, simulations were performed with data from three primary occasions generated under both individual heterogeneity and temporal variation in p. All models remained consistent regardless of levels of variation in p. In terms of precision, the AICc model-averaged BBE showed advantages over JHE and BOWE when heterogeneity was present and mean sighting probabilities were similar between primary occasions. Based on the conditions examined, BBE is a reliable alternative to JHE or BOWE and provides a framework for further advances in mark-resight abundance estimation. © 2006 American Statistical Association and the International Biometric Society.
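
    A minimal sketch of the beta-binomial idea behind the BBE (not the full mark-resight estimator): per-individual resighting counts over k occasions follow a beta-binomial, so sighting probability can vary among individuals, and alpha and beta are fit by maximum likelihood. The counts are invented.

    ```python
    import numpy as np
    from scipy.optimize import minimize
    from scipy.stats import betabinom

    k = 10                                                  # resighting occasions
    sightings = np.array([2, 5, 7, 3, 9, 4, 6, 1, 8, 5])    # per marked animal

    def neg_log_lik(log_params):
        a, b = np.exp(log_params)                           # keep alpha, beta positive
        return -betabinom.logpmf(sightings, k, a, b).sum()

    fit = minimize(neg_log_lik, x0=[0.0, 0.0], method="Nelder-Mead")
    a_hat, b_hat = np.exp(fit.x)
    print(f"mean sighting probability = {a_hat / (a_hat + b_hat):.2f}")
    ```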

  17. Partitioning Detectability Components in Populations Subject to Within-Season Temporary Emigration Using Binomial Mixture Models

    PubMed Central

    O’Donnell, Katherine M.; Thompson, Frank R.; Semlitsch, Raymond D.

    2015-01-01

    Detectability of individual animals is highly variable and nearly always < 1; imperfect detection must be accounted for to reliably estimate population sizes and trends. Hierarchical models can simultaneously estimate abundance and effective detection probability, but there are several different mechanisms that cause variation in detectability. Neglecting temporary emigration can lead to biased population estimates because availability and conditional detection probability are confounded. In this study, we extend previous hierarchical binomial mixture models to account for multiple sources of variation in detectability. The state process of the hierarchical model describes ecological mechanisms that generate spatial and temporal patterns in abundance, while the observation model accounts for the imperfect nature of counting individuals due to temporary emigration and false absences. We illustrate our model’s potential advantages, including the allowance of temporary emigration between sampling periods, with a case study of southern red-backed salamanders Plethodon serratus. We fit our model and a standard binomial mixture model to counts of terrestrial salamanders surveyed at 40 sites during 3–5 surveys each spring and fall 2010–2012. Our models generated similar parameter estimates to standard binomial mixture models. Aspect was the best predictor of salamander abundance in our case study; abundance increased as aspect became more northeasterly. Increased time-since-rainfall strongly decreased salamander surface activity (i.e. availability for sampling), while higher amounts of woody cover objects and rocks increased conditional detection probability (i.e. probability of capture, given an animal is exposed to sampling). By explicitly accounting for both components of detectability, we increased congruence between our statistical modeling and our ecological understanding of the system. We stress the importance of choosing survey locations and protocols that maximize species availability and conditional detection probability to increase population parameter estimate reliability. PMID:25775182

  18. Evaluating species richness: biased ecological inference results from spatial heterogeneity in species detection probabilities

    USGS Publications Warehouse

    McNew, Lance B.; Handel, Colleen M.

    2015-01-01

    Accurate estimates of species richness are necessary to test predictions of ecological theory and evaluate biodiversity for conservation purposes. However, species richness is difficult to measure in the field because some species will almost always be overlooked due to their cryptic nature or the observer's failure to perceive their cues. Common measures of species richness that assume consistent observability across species are inviting because they may require only single counts of species at survey sites. Single-visit estimation methods ignore spatial and temporal variation in species detection probabilities related to survey or site conditions that may confound estimates of species richness. We used simulated and empirical data to evaluate the bias and precision of raw species counts, the limiting forms of jackknife and Chao estimators, and multi-species occupancy models when estimating species richness to evaluate whether the choice of estimator can affect inferences about the relationships between environmental conditions and community size under variable detection processes. Four simulated scenarios with realistic and variable detection processes were considered. Results of simulations indicated that (1) raw species counts were always biased low, (2) single-visit jackknife and Chao estimators were significantly biased regardless of detection process, (3) multispecies occupancy models were more precise and generally less biased than the jackknife and Chao estimators, and (4) spatial heterogeneity resulting from the effects of a site covariate on species detection probabilities had significant impacts on the inferred relationships between species richness and a spatially explicit environmental condition. For a real dataset of bird observations in northwestern Alaska, the four estimation methods produced different estimates of local species richness, which severely affected inferences about the effects of shrubs on local avian richness. Overall, our results indicate that neglecting the effects of site covariates on species detection probabilities may lead to significant bias in estimation of species richness, as well as the inferred relationships between community size and environmental covariates.
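
    For reference, one of the limiting-form estimators discussed above is the incidence-based Chao2: Ŝ = S_obs + f1²/(2·f2), with f1 and f2 the numbers of species detected at exactly one and exactly two sites. The detection matrix is illustrative.

    ```python
    import numpy as np

    detections = np.array([      # rows = species, columns = sites; 1 = detected
        [1, 0, 0, 0],
        [1, 1, 0, 0],
        [0, 1, 0, 0],
        [1, 1, 1, 0],
        [0, 0, 0, 1],
    ])
    site_counts = detections.sum(axis=1)
    s_obs = int((site_counts > 0).sum())
    f1 = int((site_counts == 1).sum())   # species found at exactly one site
    f2 = int((site_counts == 2).sum())   # species found at exactly two sites

    # Bias-corrected form handles f2 = 0 (equals the classic form when f2 > 0 is large)
    s_chao2 = s_obs + f1**2 / (2 * f2) if f2 > 0 else s_obs + f1 * (f1 - 1) / 2
    print(f"S_obs = {s_obs}, Chao2 = {s_chao2:.2f}")
    ```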

  19. Temporal resolution requirements of satellite constellations for 30 m global burned area mapping

    NASA Astrophysics Data System (ADS)

    Melchiorre, A.; Boschetti, L.

    2017-12-01

    Global burned area maps have been generated systematically with daily, coarse resolution satellite data (Giglio et al. 2013). The production of moderate resolution (10-30 m) global burned area products would meet the needs of several user communities, including improved carbon emission estimation over heterogeneous landscapes and local-scale air quality and fire management applications (Mouillot et al. 2014; van der Werf et al. 2010). While increased spatial resolution reduces the influence of mixed burnt/unburnt pixels and would increase the spectral separation of burned areas, moderate resolution satellites have reduced temporal resolution (10-16 days). Fire causes a land-cover change that is spectrally visible for a period ranging from a few weeks in savannas to over a year in forested ecosystems (Roy et al. 2010); because clouds, smoke, and other optically thick aerosols limit the number of available observations (Roy et al. 2008; Smith and Wooster 2005), burned areas might disappear before they are observed by moderate resolution sensors. Data fusion from a constellation of different sensors has been proposed to overcome these limits (Boschetti et al. 2015; Roy 2015). In this study, we estimated the probability that moderate resolution satellites and virtual constellations (including Landsat-8/9 and Sentinel-2A/B) provide sufficient observations for burned area mapping globally and by ecosystem. First, we estimated the persistence of the signal associated with burned areas by combining the MODIS Global Burned Area product with the Nadir BRDF-Adjusted Reflectance product, characterizing post-fire trends in reflectance to determine the length of the period in which the burn class is spectrally distinct from the unburned class and, therefore, detectable. The MODIS-Terra daily cloud data were then used to estimate the probability of cloud cover. The cloud probability was used at each location to estimate the minimum revisit time needed to obtain at least one cloud-free observation within the persistence period of the burned area signal. As complementary results, the expected omission error due to insufficient observations was estimated for each satellite combination considered, making use of the acquisition calendar and geometry of each sensor included in the virtual constellation.
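
    The revisit-time logic reduces to a simple probability statement if cloud cover is assumed independent between overpasses: with per-overpass cloud probability p_cloud and n looks within the window in which a burn remains detectable, P(at least one cloud-free look) = 1 − p_cloud^n. A sketch with assumed values:

    ```python
    p_cloud = 0.6        # assumed per-overpass cloud probability at a location
    window_days = 30.0   # assumed persistence of the burned-area signal

    for revisit_days in (16, 8, 5, 3):
        n_obs = int(window_days // revisit_days)       # looks within the window
        p_detect = 1.0 - p_cloud**n_obs                # at least one cloud-free look
        print(f"revisit {revisit_days:2d} d -> {n_obs} looks, "
              f"P(>=1 cloud-free) = {p_detect:.2f}")
    ```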

  20. Spatial and temporal Brook Trout density dynamics: Implications for conservation, management, and monitoring

    USGS Publications Warehouse

    Wagner, Tyler; Deweber, Jefferson T.; Detar, Jason; Kristine, David; Sweka, John A.

    2014-01-01

    Many potential stressors to aquatic environments operate over large spatial scales, prompting the need to assess and monitor both site-specific and regional dynamics of fish populations. We used hierarchical Bayesian models to evaluate the spatial and temporal variability in density and capture probability of age-1 and older Brook Trout Salvelinus fontinalis from three-pass removal data collected at 291 sites over a 37-year time period (1975–2011) in Pennsylvania streams. There was high between-year variability in density, with annual posterior means ranging from 2.1 to 10.2 fish/100 m2; however, there was no significant long-term linear trend. Brook Trout density was positively correlated with elevation and negatively correlated with percent developed land use in the network catchment. Probability of capture did not vary substantially across sites or years but was negatively correlated with mean stream width. Because of the low spatiotemporal variation in capture probability and a strong correlation between first-pass CPUE (catch/min) and three-pass removal density estimates, the use of an abundance index based on first-pass CPUE could represent a cost-effective alternative to conducting multiple-pass removal sampling for some Brook Trout monitoring and assessment objectives. Single-pass indices may be particularly relevant for monitoring objectives that do not require precise site-specific estimates, such as regional monitoring programs that are designed to detect long-term linear trends in density.
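
    A minimal sketch of a Zippin-style three-pass removal estimator of the kind underlying these density estimates: fit the capture probability p from the conditional (multinomial) likelihood of the catch split, then divide total catch by the probability of ever being caught. Catches are illustrative, and this ignores the hierarchical structure the authors actually used.

    ```python
    import numpy as np
    from scipy.optimize import minimize_scalar

    # Each pass removes a binomial fraction p of remaining fish, so expected
    # catches are N*p, N*(1-p)*p, N*(1-p)^2*p.
    catches = np.array([38, 16, 7])
    T = catches.sum()

    def neg_cond_lik(p):
        probs = np.array([p, (1 - p) * p, (1 - p) ** 2 * p])
        probs /= probs.sum()                 # condition on being caught at some pass
        return -(catches * np.log(probs)).sum()

    p_hat = minimize_scalar(neg_cond_lik, bounds=(1e-3, 1 - 1e-3), method="bounded").x
    N_hat = T / (1 - (1 - p_hat) ** 3)       # divide by P(caught in any of 3 passes)
    print(f"p-hat = {p_hat:.2f}, N-hat = {N_hat:.1f}")
    ```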

  1. Inferences about landbird abundance from count data: recent advances and future directions

    USGS Publications Warehouse

    Nichols, J.D.; Thomas, L.; Conn, P.B.; Thomson, David L.; Cooch, Evan G.; Conroy, Michael J.

    2009-01-01

    We summarize results of a November 2006 workshop dealing with recent research on the estimation of landbird abundance from count data. Our conceptual framework includes a decomposition of the probability of detecting a bird potentially exposed to sampling efforts into four separate probabilities. Primary inference methods are described and include distance sampling, multiple observers, time of detection, and repeated counts. The detection parameters estimated by these different approaches differ, leading to different interpretations of resulting estimates of density and abundance. Simultaneous use of combinations of these different inference approaches can not only lead to increased precision but also provide the ability to decompose components of the detection process. Recent efforts to test the efficacy of these different approaches using natural systems and a new bird radio test system provide sobering conclusions about the ability of observers to detect and localize birds in auditory surveys. Recent research is reported on efforts to deal with such potential sources of error as bird misclassification, measurement error, and density gradients. Methods for inference about spatial and temporal variation in avian abundance are outlined. Discussion topics include opinions about the need to estimate detection probability when drawing inference about avian abundance, methodological recommendations based on the current state of knowledge, and suggestions for future research.

  2. Comparison of hoop-net trapping and visual surveys to monitor abundance of the Rio Grande cooter (Pseudemys gorzugi).

    PubMed

    Mali, Ivana; Duarte, Adam; Forstner, Michael R J

    2018-01-01

    Abundance estimates play an important part in the regulatory and conservation decision-making process. It is important to correct monitoring data for imperfect detection when using these data to track spatial and temporal variation in abundance, especially in the case of rare and elusive species. This paper presents the first attempt to estimate abundance of the Rio Grande cooter (Pseudemys gorzugi) while explicitly considering the detection process. Specifically, in 2016 we monitored this rare species at two sites along the Black River, New Mexico via traditional baited hoop-net traps and less invasive visual surveys to evaluate the efficacy of these two sampling designs. We fitted the Huggins closed-capture estimator to estimate capture probabilities using the trap data and distance sampling models to estimate detection probabilities using the visual survey data. We found that only the visual survey with the highest number of observed turtles resulted in similar abundance estimates to those estimated using the trap data. However, the estimates of abundance from the remaining visual survey data were highly variable and often underestimated abundance relative to the estimates from the trap data. We suspect this pattern is related to changes in the basking behavior of the species and, thus, the availability of turtles to be detected even though all visual surveys were conducted when environmental conditions were similar. Regardless, we found that riverine habitat conditions limited our ability to properly conduct visual surveys at one site. Collectively, this suggests visual surveys may not be an effective sample design for this species in this river system. When analyzing the trap data, we found capture probabilities to be highly variable across sites and between age classes and that recapture probabilities were much lower than initial capture probabilities, highlighting the importance of accounting for detectability when monitoring this species. Although baited hoop-net traps seem to be an effective sampling design, it is important to note that this method required a relatively high trap effort to reliably estimate abundance. This information will be useful when developing a larger-scale, long-term monitoring program for this species of concern.

  3. Investigations of potential bias in the estimation of lambda using Pradel's (1996) model for capture-recapture data

    USGS Publications Warehouse

    Hines, J.E.; Nichols, J.D.

    2002-01-01

    Pradel's (1996) temporal symmetry model permitting direct estimation and modelling of population growth rate, λi, provides a potentially useful tool for the study of population dynamics using marked animals. Because of its recent publication date, the approach has not seen much use, and there have been virtually no investigations directed at robustness of the resulting estimators. Here we consider several potential sources of bias, all motivated by specific uses of this estimation approach. We consider sampling situations in which the study area expands with time and present an analytic expression for the bias in λ̂i. We next consider trap response in capture probabilities and heterogeneous capture probabilities and compute large-sample and simulation-based approximations of resulting bias in λ̂i. These approximations indicate that trap response is an especially important assumption violation that can produce substantial bias. Finally, we consider losses on capture and emphasize the importance of selecting the estimator for λi that is appropriate to the question being addressed. For studies based on only sighting and resighting data, Pradel's (1996) λ̂i′ is the appropriate estimator.

  4. Temporal patterns of apparent leg band retention in North American geese

    USGS Publications Warehouse

    Zimmerman, Guthrie S.; Kendall, William L.; Moser, Timothy J.; White, Gary C.; Doherty, Paul F.

    2009-01-01

    An important assumption of mark–recapture studies is that individuals retain their marks, which has not been assessed for goose reward bands. We estimated aluminum leg band retention probabilities and modeled how band retention varied with band type (standard vs. reward band), band age (1-40 months), and goose characteristics (species and size class) for Canada (Branta canadensis), cackling (Branta hutchinsii), snow (Chen caerulescens), and Ross's (Chen rossii) geese that field coordinators double-leg banded during a North American goose reward band study (N = 40,999 individuals from 15 populations). We conditioned all models in this analysis on geese that were encountered with >1 leg band still attached (n = 5,747 dead recoveries and live recaptures). Retention probabilities for standard aluminum leg bands were high (estimate of 0.9995, SE = 0.001) and constant over 1-40 months. In contrast, apparent retention probabilities for reward bands demonstrated an interactive relationship between 5 size and species classes (small cackling, medium Canada, large Canada, snow, and Ross's geese). In addition, apparent retention probabilities for each of the 5 classes varied quadratically with time, being lower immediately after banding and at older age classes. The differential retention probabilities among band type (reward vs. standard) that we observed suggest that 1) models estimating reporting probability should incorporate differential band loss if it is nontrivial, 2) goose managers should consider the costs and benefits of double-banding geese on an operational basis, and 3) the United States Geological Survey Bird Banding Lab should modify protocols for receiving recovery data.

  5. Estimating occupancy and abundance using aerial images with imperfect detection

    USGS Publications Warehouse

    Williams, Perry J.; Hooten, Mevin B.; Womble, Jamie N.; Bower, Michael R.

    2017-01-01

    Species distribution and abundance are critical population characteristics for efficient management, conservation, and ecological insight. Point process models are a powerful tool for modelling distribution and abundance, and can incorporate many data types, including count data, presence-absence data, and presence-only data. Aerial photographic images are a natural tool for collecting data to fit point process models, but aerial images do not always capture all animals that are present at a site. Methods for estimating detection probability for aerial surveys usually include collecting auxiliary data to estimate the proportion of time animals are available to be detected. We developed an approach for fitting point process models using an N-mixture model framework to estimate detection probability for aerial occupancy and abundance surveys. Our method uses multiple aerial images taken of animals at the same spatial location to provide temporal replication of sample sites. The intersection of the images provides multiple counts of individuals at different times. We examined this approach using both simulated and real data of sea otters (Enhydra lutris kenyoni) in Glacier Bay National Park, southeastern Alaska. Using our proposed methods, we estimated detection probability of sea otters to be 0.76, the same as visual aerial surveys that have been used in the past. Further, simulations demonstrated that our approach is a promising tool for estimating occupancy, abundance, and detection probability from aerial photographic surveys. Our methods can be readily extended to data collected using unmanned aerial vehicles, as technology and regulations permit. The generality of our methods for other aerial surveys depends on how well surveys can be designed to meet the assumptions of N-mixture models.

  6. A Statistical Physics Characterization of the Complex Systems Dynamics: Quantifying Complexity from Spatio-Temporal Interactions

    PubMed Central

    Koorehdavoudi, Hana; Bogdan, Paul

    2016-01-01

    Biological systems are frequently categorized as complex systems due to their capabilities of generating spatio-temporal structures from apparent random decisions. In spite of research on analyzing biological systems, we lack a quantifiable framework for measuring their complexity. To fill this gap, in this paper, we develop a new paradigm to study a collective group of N agents moving and interacting in a three-dimensional space. Our paradigm helps to identify the spatio-temporal states of the motion of the group and their associated transition probabilities. This framework enables the estimation of the free energy landscape corresponding to the identified states. Based on the energy landscape, we quantify missing information, emergence, self-organization and complexity for a collective motion. We show that the collective motion of the group of agents evolves to reach the most probable state with relatively lowest energy level and lowest missing information compared to other possible states. Our analysis demonstrates that natural groups of animals exhibit a higher degree of emergence, self-organization and complexity over time. Consequently, this algorithm can be integrated into new frameworks to engineer collective motions to achieve certain degrees of emergence, self-organization and complexity. PMID:27297496

  7. A Statistical Physics Characterization of the Complex Systems Dynamics: Quantifying Complexity from Spatio-Temporal Interactions

    NASA Astrophysics Data System (ADS)

    Koorehdavoudi, Hana; Bogdan, Paul

    2016-06-01

    Biological systems are frequently categorized as complex systems due to their capabilities of generating spatio-temporal structures from apparent random decisions. In spite of research on analyzing biological systems, we lack a quantifiable framework for measuring their complexity. To fill this gap, in this paper, we develop a new paradigm to study a collective group of N agents moving and interacting in a three-dimensional space. Our paradigm helps to identify the spatio-temporal states of the motion of the group and their associated transition probabilities. This framework enables the estimation of the free energy landscape corresponding to the identified states. Based on the energy landscape, we quantify missing information, emergence, self-organization and complexity for a collective motion. We show that the collective motion of the group of agents evolves to reach the most probable state with relatively lowest energy level and lowest missing information compared to other possible states. Our analysis demonstrates that natural groups of animals exhibit a higher degree of emergence, self-organization and complexity over time. Consequently, this algorithm can be integrated into new frameworks to engineer collective motions to achieve certain degrees of emergence, self-organization and complexity.
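
    The first steps of this paradigm, estimating state-transition probabilities from a discretized motion sequence and quantifying missing information as the entropy of each state's outgoing transitions, can be sketched compactly. The state sequence is invented, and the free-energy and emergence calculations are omitted.

    ```python
    import numpy as np

    state_seq = [0, 0, 1, 1, 2, 1, 0, 2, 2, 1, 1, 0]   # discretized group-motion states
    n_states = 3

    # Row-normalized transition counts give the transition probability matrix
    counts = np.zeros((n_states, n_states))
    for a, b in zip(state_seq[:-1], state_seq[1:]):
        counts[a, b] += 1
    P = counts / counts.sum(axis=1, keepdims=True)

    # Missing information: Shannon entropy of each state's outgoing transitions
    logP = np.log2(P, out=np.zeros_like(P), where=P > 0)
    H = -(P * logP).sum(axis=1)
    print("transition matrix:\n", np.round(P, 2))
    print("per-state missing information (bits):", np.round(H, 2))
    ```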

  8. Using counts to simultaneously estimate abundance and detection probabilities in a salamander community

    USGS Publications Warehouse

    Dodd, C.K.; Dorazio, R.M.

    2004-01-01

    A critical variable in both ecological and conservation field studies is determining how many individuals of a species are present within a defined sampling area. Labor-intensive techniques such as capture-mark-recapture and removal sampling may provide estimates of abundance, but there are many logistical constraints to their widespread application. Many studies on terrestrial and aquatic salamanders use counts as an index of abundance, assuming that detection remains constant while sampling. If this constancy is violated, determination of detection probabilities is critical to the accurate estimation of abundance. Recently, a model was developed that provides a statistical approach that allows abundance and detection to be estimated simultaneously from spatially and temporally replicated counts. We adapted this model to estimate these parameters for salamanders sampled over a six-year period in area-constrained plots in Great Smoky Mountains National Park. Estimates of salamander abundance varied among years, but annual changes in abundance did not vary uniformly among species. Except for one species, abundance estimates were not correlated with site covariates (elevation/soil and water pH, conductivity, air and water temperature). The uncertainty in the estimates was so large as to make correlations ineffectual in predicting which covariates might influence abundance. Detection probabilities also varied among species and sometimes among years for the six species examined. We found such a high degree of variation in our counts and in estimates of detection among species, sites, and years as to cast doubt upon the appropriateness of using count data to monitor population trends using a small number of area-constrained survey plots. Still, the model provided reasonable estimates of abundance that could make it useful in estimating population size from count surveys.

  9. Location Prediction Based on Transition Probability Matrices Constructing from Sequential Rules for Spatial-Temporal K-Anonymity Dataset

    PubMed Central

    Liu, Zhao; Zhu, Yunhong; Wu, Chenxue

    2016-01-01

    Spatial-temporal k-anonymity has become a mainstream approach among techniques for protection of users' privacy in location-based services (LBS) applications, and has been applied to several variants such as LBS snapshot queries and continuous queries. Analyzing large-scale spatial-temporal anonymity sets may benefit several LBS applications. In this paper, we propose two location prediction methods based on transition probability matrices constructed from sequential rules for spatial-temporal k-anonymity datasets. First, we define single-step sequential rules mined from sequential spatial-temporal k-anonymity datasets generated from continuous LBS queries for multiple users. We then construct transition probability matrices from mined single-step sequential rules, and normalize the transition probabilities in the transition matrices. Next, we regard a mobility model for an LBS requester as a stationary stochastic process and compute the n-step transition probability matrices by raising the normalized transition probability matrices to the power n. Furthermore, we propose two location prediction methods: rough prediction and accurate prediction. The former achieves the probabilities of arriving at target locations along simple paths that include only current locations, target locations and transition steps. By iteratively combining the probabilities for simple paths with n steps and the probabilities for detailed paths with n-1 steps, the latter method calculates transition probabilities for detailed paths with n steps from current locations to target locations. Finally, we conduct extensive experiments, and the correctness and flexibility of our proposed algorithm have been verified. PMID:27508502
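
    The n-step prediction step is concrete enough to sketch directly: row-normalize the rule-derived transition counts and raise the matrix to the power n, per the stationary-process assumption. The counts below are illustrative.

    ```python
    import numpy as np

    counts = np.array([[20.0, 5.0, 0.0],     # illustrative rule-derived transition counts
                       [4.0, 10.0, 6.0],     # between three anonymized locations
                       [1.0, 3.0, 16.0]])
    P = counts / counts.sum(axis=1, keepdims=True)   # normalized transition matrix

    n = 3
    P_n = np.linalg.matrix_power(P, n)       # P^n: n-step transition probabilities
    print(np.round(P_n, 3))
    ```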

  10. Time Dependence of Collision Probabilities During Satellite Conjunctions

    NASA Technical Reports Server (NTRS)

    Hall, Doyle T.; Hejduk, Matthew D.; Johnson, Lauren C.

    2017-01-01

    The NASA Conjunction Assessment Risk Analysis (CARA) team has recently implemented updated software to calculate the probability of collision (Pc) for Earth-orbiting satellites. The algorithm can employ complex dynamical models for orbital motion, and account for the effects of non-linear trajectories as well as both position and velocity uncertainties. This “3D Pc” method entails computing a 3-dimensional numerical integral for each estimated probability. Our analysis indicates that the 3D method provides several new insights over the traditional “2D Pc” method, even when approximating the orbital motion using the relatively simple Keplerian two-body dynamical model. First, the formulation provides the means to estimate variations in the time derivative of the collision probability, or the probability rate, Rc. For close-proximity satellites, such as those orbiting in formations or clusters, Rc variations can show multiple peaks that repeat or blend with one another, providing insight into the ongoing temporal distribution of risk. For single, isolated conjunctions, Rc analysis provides the means to identify and bound the times of peak collision risk. Additionally, analysis of multiple actual archived conjunctions demonstrates that the commonly used “2D Pc” approximation can occasionally provide inaccurate estimates. These include cases in which the 2D method yields negligibly small probabilities (e.g., Pc < 10^-10), but the 3D estimates are sufficiently large to prompt increased monitoring or collision mitigation (e.g., Pc ≥ 10^-5). Finally, the archive analysis indicates that a relatively efficient calculation can be used to identify which conjunctions will have negligibly small probabilities. This small-Pc screening test can significantly speed the overall risk analysis computation for large numbers of conjunctions.
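
    A hedged sketch of the 2D Pc concept (not CARA's implementation): with relative position uncertainty projected into the encounter plane, Pc is the Gaussian probability mass inside the combined hard-body radius, approximated here by Monte Carlo with invented numbers.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    mean_miss = np.array([120.0, 80.0])                 # projected miss vector (m), assumed
    cov = np.array([[200.0**2, 0.0], [0.0, 90.0**2]])   # position covariance (m^2), assumed
    hbr = 20.0                                          # combined hard-body radius (m), assumed

    # Sample relative positions in the encounter plane; count hits inside the radius
    samples = rng.multivariate_normal(mean_miss, cov, size=2_000_000)
    pc = np.mean(np.hypot(samples[:, 0], samples[:, 1]) < hbr)
    print(f"Monte Carlo 2D Pc ~ {pc:.2e}")
    ```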

  11. N-mix for fish: estimating riverine salmonid habitat selection via N-mixture models

    USGS Publications Warehouse

    Som, Nicholas A.; Perry, Russell W.; Jones, Edward C.; De Juilio, Kyle; Petros, Paul; Pinnix, William D.; Rupert, Derek L.

    2018-01-01

    Models that formulate mathematical linkages between fish use and habitat characteristics are applied for many purposes. For riverine fish, these linkages are often cast as resource selection functions with variables including depth and velocity of water and distance to nearest cover. Ecologists are now recognizing the role that detection plays in observing organisms, and failure to account for imperfect detection can lead to spurious inference. Herein, we present a flexible N-mixture model to associate habitat characteristics with the abundance of riverine salmonids while simultaneously estimating detection probability. Our formulation has the added benefits of accounting for demographic variation and generating probabilistic statements regarding intensity of habitat use. In addition to the conceptual benefits, model application to data from the Trinity River, California, yields interesting results. Detection was estimated to vary among surveyors, but there was little spatial or temporal variation. Additionally, the estimated effect of water depth on resource selection is weaker than that reported by previous studies that did not account for detection probability. N-mixture models show great promise for applications to riverine resource selection.

  12. Traffic Video Image Segmentation Model Based on Bayesian and Spatio-Temporal Markov Random Field

    NASA Astrophysics Data System (ADS)

    Zhou, Jun; Bao, Xu; Li, Dawei; Yin, Yongwen

    2017-10-01

    Traffic video is a dynamic image sequence whose background and foreground change over time, which results in occlusion; in this case, general methods have difficulty producing accurate image segmentation. A segmentation algorithm based on Bayesian inference and a spatio-temporal Markov random field (ST-MRF) is put forward. It builds energy-function models of the observation field and the label field for motion sequence images with the Markov property; then, according to Bayes' rule, it uses the interaction of the label field and the observation field, that is, the relationship between the label field's prior probability and the observation field's likelihood, to obtain the maximum a posteriori estimate of the label field's parameters. The ICM (iterated conditional modes) algorithm is then used to extract the moving objects, completing the segmentation process. Finally, the ST-MRF method and the Bayesian method combined with ST-MRF were compared. Experimental results show that the segmentation time of the Bayesian ST-MRF algorithm is shorter than that of ST-MRF alone and its computational load is smaller; in heavy-traffic dynamic scenes the method also achieves a better segmentation result.
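
    As a rough illustration of the ICM step described above (not the authors' implementation), the following sketch alternates pixel-wise label updates that trade a Gaussian observation likelihood against a Potts smoothness prior:

```python
import numpy as np

def icm_segment(img, mu0, mu1, sigma=0.1, beta=1.5, n_iter=5):
    """Binary ICM: iteratively pick the label minimizing posterior energy."""
    labels = (np.abs(img - mu1) < np.abs(img - mu0)).astype(int)
    for _ in range(n_iter):
        for i in range(1, img.shape[0] - 1):
            for j in range(1, img.shape[1] - 1):
                nb = [labels[i-1, j], labels[i+1, j],
                      labels[i, j-1], labels[i, j+1]]
                energy = {}
                for lab, mu in ((0, mu0), (1, mu1)):
                    # Energy = negative log-likelihood + Potts prior penalty.
                    energy[lab] = ((img[i, j] - mu) ** 2 / (2 * sigma ** 2)
                                   + beta * sum(n != lab for n in nb))
                labels[i, j] = 0 if energy[0] < energy[1] else 1
    return labels

# Toy frame: bright moving object (mean 0.8) on a darker background (0.2).
rng = np.random.default_rng(1)
frame = 0.2 + rng.normal(0, 0.1, (32, 32))
frame[10:20, 10:20] = 0.8 + rng.normal(0, 0.1, (10, 10))
print(icm_segment(frame, mu0=0.2, mu1=0.8).sum(), "foreground pixels")
```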

  13. Double-observer approach to estimating egg mass abundance of vernal pool breeding amphibians

    USGS Publications Warehouse

    Grant, E.H.C.; Jung, R.E.; Nichols, J.D.; Hines, J.E.

    2005-01-01

    Interest in seasonally flooded pools, and the status of associated amphibian populations, has initiated programs in the northeastern United States to document and monitor these habitats. Counting egg masses is an effective way to determine the population size of pool-breeding amphibians, such as wood frogs (Rana sylvatica) and spotted salamanders (Ambystoma maculatum). However, bias is associated with counts if egg masses are missed. Counts unadjusted for the proportion missed (i.e., without adjustment for detection probability) could lead to false assessments of population trends. We used a dependent double-observer method in 2002-2003 to estimate numbers of wood frog and spotted salamander egg masses at seasonal forest pools in 13 National Wildlife Refuges, 1 National Park, 1 National Seashore, and 1 State Park in the northeastern United States. We calculated detection probabilities for egg masses and examined whether detection probabilities varied by species, observers, pools, and in relation to pool characteristics (pool area, pool maximum depth, within-pool vegetation). For the 2 years, model selection indicated that no consistent set of variables explained the variation in data sets from individual Refuges and Parks. Because our results indicated that egg mass detection probabilities vary spatially and temporally, we conclude that it is essential to use estimation procedures, such as double-observer methods with egg mass surveys, to determine population sizes and trends of these species.
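
    A minimal moment-based sketch of the dependent double-observer idea is shown below, assuming both observers share a common detection probability (the study itself fits observer- and covariate-specific models); the counts are illustrative:

```python
# Dependent double-observer estimator under an equal-detection assumption.
x1 = 74   # egg masses counted by the primary observer (illustrative)
x2 = 18   # additional masses seen only by the secondary observer

# E[x1] = N*p and E[x2] = N*(1-p)*p, so x2/x1 estimates (1 - p).
p_hat = 1 - x2 / x1

# Probability that at least one of the two observers detects a mass.
p_total = 1 - (1 - p_hat) ** 2

# Detection-adjusted estimate of the number of egg masses present.
N_hat = (x1 + x2) / p_total
print(f"p = {p_hat:.3f}, adjusted count = {N_hat:.1f}")
```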

  14. Temporal models for the episodic volcanism of Campi Flegrei caldera (Italy) with uncertainty quantification

    NASA Astrophysics Data System (ADS)

    Bevilacqua, Andrea; Flandoli, Franco; Neri, Augusto; Isaia, Roberto; Vitale, Stefano

    2016-11-01

    After the large-scale event of the Neapolitan Yellow Tuff (~15 ka B.P.), intense and mostly explosive volcanism has occurred within and along the boundaries of the Campi Flegrei caldera (Italy). Eruptions occurred closely spaced in time, over periods from a few centuries to a few millennia, and alternated with periods of quiescence lasting up to several millennia. Events often also occurred close together in space, generating clusters of events. This study had two main objectives: (1) to describe the uncertainty in the geologic record by using a quantitative model and (2) to develop, based on the uncertainty assessment, a long-term subdomain-specific temporal probability model that describes the temporal and spatial eruptive behavior of the caldera. In particular, the study adopts a space-time doubly stochastic nonhomogeneous Poisson-type model with a local self-excitation feature able to generate clustering of events consistent with the reconstructed record of Campi Flegrei. Results allow the evaluation of similarities and differences between the three epochs of activity, as well as the derivation of the eruptive base rate of the caldera and its capacity to generate clusters of events. The temporal probability model is also used to investigate the effect of the most recent eruption of Monte Nuovo (A.D. 1538) on a possible reactivation of the caldera and to estimate the time to the next eruption under different volcanological and modeling assumptions.
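
    The self-excitation feature is akin to a Hawkes-type point process; the sketch below simulates such a process by Ogata thinning, with all rate parameters illustrative rather than fitted to the Campi Flegrei record:

```python
import numpy as np

def simulate_self_exciting(mu, alpha, beta, t_max, seed=0):
    """Ogata thinning for lambda(t) = mu + sum_i alpha*exp(-beta*(t - t_i))."""
    rng = np.random.default_rng(seed)
    events, t = [], 0.0
    while t < t_max:
        # Intensity decays between events, so the current value bounds
        # lambda(s) for s > t until the next accepted event.
        lam_bar = mu + sum(alpha * np.exp(-beta * (t - ti)) for ti in events)
        t += rng.exponential(1 / lam_bar)
        if t >= t_max:
            break
        lam_t = mu + sum(alpha * np.exp(-beta * (t - ti)) for ti in events)
        if rng.uniform() <= lam_t / lam_bar:   # accept with prob lambda/lam_bar
            events.append(t)
    return events

# Illustrative rates (time in centuries over a 10-millennium window):
# baseline mu = 0.05 events/century, subcritical branching alpha/beta = 0.3.
eruptions = simulate_self_exciting(mu=0.05, alpha=0.3, beta=1.0, t_max=100)
print(len(eruptions), "simulated eruptions; first times:", eruptions[:5])
```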

  15. Thrombus segmentation by texture dynamics from microscopic image sequences

    NASA Astrophysics Data System (ADS)

    Brieu, Nicolas; Serbanovic-Canic, Jovana; Cvejic, Ana; Stemple, Derek; Ouwehand, Willem; Navab, Nassir; Groher, Martin

    2010-03-01

    The genetic factors of thrombosis are commonly explored by microscopically imaging the coagulation of blood cells induced by injuring a vessel of mice or of zebrafish mutants. The latter species is particularly interesting since skin transparency permits non-invasive acquisition of microscopic images of the scene with a CCD camera and estimation of the parameters characterizing thrombus development. These parameters are currently determined by manual outlining, which is both error prone and extremely time consuming. Even though a technique for automatic thrombus extraction would be highly valuable for gene analysts, little work can be found, mainly because of very low image contrast and spurious structures. In this work, we propose to semi-automatically segment the thrombus over time from microscopic image sequences of wild-type zebrafish larvae. To compensate for the lack of valuable spatial information, our main idea is to exploit the temporal information by modeling the variations of the pixel intensities over successive temporal windows with a linear Markov-based dynamic texture formalization. We then derive an image from the estimated model parameters, which represents the probability of a pixel belonging to the thrombus. We employ this probability image to accurately estimate the thrombus position via an active contour segmentation that also incorporates prior and spatial information from the underlying intensity images. The performance of our approach is tested on three microscopic image sequences. We show that the thrombus is accurately tracked over time in each sequence if the respective parameters controlling prior influence and contour stiffness are correctly chosen.

  16. Estimation of typhoon rainfall in GaoPing River: A Multivariate Maximum Entropy Method

    NASA Astrophysics Data System (ADS)

    Pei-Jui, Wu; Hwa-Lung, Yu

    2016-04-01

    The heavy rainfall from typhoons is the main cause of natural disasters in Taiwan, resulting in significant losses of human life and property. On average, 3.5 typhoons invade Taiwan every year, and the most serious typhoon in recorded history, Morakot in 2009, severely impacted Taiwan. Because the duration, path, and intensity of a typhoon also affect the temporal and spatial rainfall pattern in a specific region, characterizing typhoon rainfall types is advantageous when estimating rainfall quantities. This study developed a rainfall prediction model in three parts. First, we use the EEOF (extended empirical orthogonal function) method to classify typhoon events, decomposing the standard rainfall pattern of all stations for each typhoon event into EOFs and PCs (principal components); typhoon events that vary similarly in time and space are thereby classified as similar typhoon types. Next, based on this classification, we construct the PDF (probability density function) at different locations and times by applying the multivariate maximum entropy method to the first through fourth statistical moments, which yields a probability for each station at each time. Finally, we use BME (the Bayesian Maximum Entropy method) to construct the typhoon rainfall prediction model and to estimate rainfall for the case of the GaoPing River, located in southern Taiwan. This study could be useful for future typhoon rainfall predictions and suitable for government use in typhoon disaster prevention.
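
    The EOF/PC decomposition step can be written compactly as an SVD; a minimal sketch on synthetic station-by-time rainfall data follows (the EEOF extension additionally stacks time-lagged copies of the field):

```python
import numpy as np

rng = np.random.default_rng(0)
n_times, n_stations = 48, 25          # hourly steps x rain gauges (synthetic)
rain = rng.gamma(2.0, 3.0, (n_times, n_stations))

# Remove the station means, then decompose: anomalies = U S Vt,
# where rows of Vt are the EOF spatial patterns and U*S are the PCs.
anom = rain - rain.mean(axis=0)
U, S, Vt = np.linalg.svd(anom, full_matrices=False)
pcs = U * S                            # principal-component time series
explained = S**2 / np.sum(S**2)        # fraction of variance per EOF

print("variance explained by first 3 EOFs:", explained[:3].round(3))
```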

  17. Validation of prediction models: examining temporal and geographic stability of baseline risk and estimated covariate effects

    PubMed Central

    Austin, Peter C.; van Klaveren, David; Vergouwe, Yvonne; Nieboer, Daan; Lee, Douglas S.; Steyerberg, Ewout W.

    2018-01-01

    Background Stability in baseline risk and estimated predictor effects both geographically and temporally is a desirable property of clinical prediction models. However, this issue has received little attention in the methodological literature. Our objective was to examine methods for assessing temporal and geographic heterogeneity in baseline risk and predictor effects in prediction models. Methods We studied 14,857 patients hospitalized with heart failure at 90 hospitals in Ontario, Canada, in two time periods. We focussed on geographic and temporal variation in baseline risk (intercept) and predictor effects (regression coefficients) of the EFFECT-HF mortality model for predicting 1-year mortality in patients hospitalized for heart failure. We used random effects logistic regression models for the 14,857 patients. Results The baseline risk of mortality displayed moderate geographic variation, with the hospital-specific probability of 1-year mortality for a reference patient lying between 0.168 and 0.290 for 95% of hospitals. Furthermore, the odds of death were 11% lower in the second period than in the first period. However, we found minimal geographic or temporal variation in predictor effects. Among 11 tests of differences in time for predictor variables, only one had a modestly significant P value (0.03). Conclusions This study illustrates how temporal and geographic heterogeneity of prediction models can be assessed in settings with a large sample of patients from a large number of centers at different time periods. PMID:29350215
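
    The reported between-hospital range can be related to a random-intercept logistic model: if a reference patient's log-odds is mu and the hospital intercepts have standard deviation tau, 95% of hospital-specific probabilities lie within expit(mu ± 1.96 tau). The values below are illustrative, chosen to roughly reproduce the reported 0.168-0.290 interval, and are not the fitted EFFECT-HF coefficients:

```python
import numpy as np

def expit(x):
    return 1 / (1 + np.exp(-x))

# Illustrative values (not the fitted EFFECT-HF coefficients):
mu = -1.25    # reference patient's log-odds of 1-year mortality
tau = 0.18    # SD of hospital-specific random intercepts

lo, hi = expit(mu - 1.96 * tau), expit(mu + 1.96 * tau)
print(f"95% of hospitals: {lo:.3f} to {hi:.3f}")  # roughly 0.17 to 0.29
```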

  18. Development of spatial-temporal ventilation heterogeneity and probability analysis tools for hyperpolarized 3He magnetic resonance imaging

    NASA Astrophysics Data System (ADS)

    Choy, S.; Ahmed, H.; Wheatley, A.; McCormack, D. G.; Parraga, G.

    2010-03-01

    We developed image analysis tools to evaluate spatial and temporal 3He magnetic resonance imaging (MRI) ventilation in asthma and cystic fibrosis. We also developed temporal ventilation probability maps to describe and quantify ventilation heterogeneity over time, to test predictions regarding respiratory exacerbations or treatment, and to provide a discrete probability measurement of 3He ventilation defect persistence.
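
    A temporal ventilation probability map of this kind can be obtained by averaging binary ventilation masks across time points; a minimal sketch, with synthetic masks standing in for segmented and registered 3He images:

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic stack: 5 time points of a 64x64 binary ventilation mask,
# standing in for segmented, registered 3He MRI slices (illustrative).
masks = rng.random((5, 64, 64)) > 0.3

# Voxel-wise probability of being ventilated across the time series;
# persistently low values flag ventilation defect persistence.
prob_map = masks.mean(axis=0)
persistent_defect = prob_map < 0.2
print(f"{persistent_defect.mean():.1%} of voxels are persistent defects")
```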

  19. Age-specific survival of male golden-cheeked warblers on the Fort Hood Military Reservation, Texas

    USGS Publications Warehouse

    Duarte, Adam; Hines, James E.; Nichols, James D.; Hatfield, Jeffrey S.; Weckerly, Floyd W.

    2014-01-01

    Population models are essential components of large-scale conservation and management plans for the federally endangered Golden-cheeked Warbler (Setophaga chrysoparia; hereafter GCWA). However, existing models are based on vital rate estimates calculated using relatively small data sets that are now more than a decade old. We estimated more current, precise adult and juvenile apparent survival (Φ) probabilities and their associated variances for male GCWAs. In addition to providing estimates for use in population modeling, we tested hypotheses about spatial and temporal variation in Φ. We assessed whether a linear trend in Φ or a change in the overall mean Φ corresponded to an observed increase in GCWA abundance during 1992-2000 and if Φ varied among study plots. To accomplish these objectives, we analyzed long-term GCWA capture-resight data from 1992 through 2011, collected across seven study plots on the Fort Hood Military Reservation using a Cormack-Jolly-Seber model structure within program MARK. We also estimated Φ process and sampling variances using a variance-components approach. Our results did not provide evidence of site-specific variation in adult Φ on the installation. Because of a lack of data, we could not assess whether juvenile Φ varied spatially. We did not detect a strong temporal association between GCWA abundance and Φ. Mean estimates of Φ for adult and juvenile male GCWAs for all years analyzed were 0.47 with a process variance of 0.0120 and a sampling variance of 0.0113 and 0.28 with a process variance of 0.0076 and a sampling variance of 0.0149, respectively. Although juvenile Φ did not differ greatly from previous estimates, our adult Φ estimate suggests previous GCWA population models were overly optimistic with respect to adult survival. These updated Φ probabilities and their associated variances will be incorporated into new population models to assist with GCWA conservation decision making.

  20. Lateralization of temporal lobe epilepsy by multimodal multinomial hippocampal response-driven models.

    PubMed

    Nazem-Zadeh, Mohammad-Reza; Elisevich, Kost V; Schwalb, Jason M; Bagher-Ebadian, Hassan; Mahmoudi, Fariborz; Soltanian-Zadeh, Hamid

    2014-12-15

    Multiple modalities are used in determining laterality in mesial temporal lobe epilepsy (mTLE). It is unclear how much different imaging modalities should be weighted in decision-making. The purpose of this study is to develop response-driven multimodal multinomial models for lateralization of epileptogenicity in mTLE patients based upon imaging features in order to maximize the accuracy of noninvasive studies. The volumes, means and standard deviations of FLAIR intensity and means of normalized ictal-interictal SPECT intensity of the left and right hippocampi were extracted from preoperative images of a retrospective cohort of 45 mTLE patients with Engel class I surgical outcomes, as well as images of a cohort of 20 control, nonepileptic subjects. Using multinomial logistic function regression, the parameters of various univariate and multivariate models were estimated. Based on the Bayesian model averaging (BMA) theorem, response models were developed as compositions of independent univariate models. A BMA model composed of posterior probabilities of univariate response models of hippocampal volumes, means and standard deviations of FLAIR intensity, and means of SPECT intensity with the estimated weighting coefficients of 0.28, 0.32, 0.09, and 0.31, respectively, as well as a multivariate response model incorporating all mentioned attributes, demonstrated complete reliability by achieving a probability of detection of one with no false alarms to establish proper laterality in all mTLE patients. The proposed multinomial multivariate response-driven model provides a reliable lateralization of mesial temporal epileptogenicity including those patients who require phase II assessment. Copyright © 2014 Elsevier B.V. All rights reserved.
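
    The BMA composition reduces to a weighted average of the univariate models' posterior probabilities using the weights reported above; a sketch with hypothetical per-model outputs for a single patient:

```python
import numpy as np

# Reported BMA weights for the four univariate response models:
# hippocampal volume, FLAIR mean, FLAIR SD, SPECT mean.
weights = np.array([0.28, 0.32, 0.09, 0.31])

# Hypothetical posterior probabilities of LEFT laterality from each
# univariate model for a single patient (illustrative values).
p_left = np.array([0.85, 0.72, 0.55, 0.91])

# BMA posterior: weighted average of the model posteriors.
p_bma = np.dot(weights, p_left)
print(f"BMA P(left mTLE) = {p_bma:.3f}")
```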

  1. Inferences about population dynamics from count data using multistate models: a comparison to capture–recapture approaches

    PubMed Central

    Zipkin, Elise F; Sillett, T Scott; Grant, Evan H Campbell; Chandler, Richard B; Royle, J Andrew

    2014-01-01

    Wildlife populations consist of individuals that contribute disproportionately to growth and viability. Understanding a population's spatial and temporal dynamics requires estimates of abundance and demographic rates that account for this heterogeneity. Estimating these quantities can be difficult, requiring years of intensive data collection. Often, this is accomplished through the capture and recapture of individual animals, which is generally only feasible at a limited number of locations. In contrast, N-mixture models allow for the estimation of abundance, and spatial variation in abundance, from count data alone. We extend recently developed multistate, open population N-mixture models, which can additionally estimate demographic rates based on an organism's life history characteristics. In our extension, we develop an approach to account for the case where not all individuals can be assigned to a state during sampling. Using only state-specific count data, we show how our model can be used to estimate local population abundance, as well as density-dependent recruitment rates and state-specific survival. We apply our model to a population of black-throated blue warblers (Setophaga caerulescens) that have been surveyed for 25 years on their breeding grounds at the Hubbard Brook Experimental Forest in New Hampshire, USA. The intensive data collection efforts allow us to compare our estimates to estimates derived from capture–recapture data. Our model performed well in estimating population abundance and density-dependent rates of annual recruitment/immigration. Estimates of local carrying capacity and per capita recruitment of yearlings were consistent with those published in other studies. However, our model moderately underestimated annual survival probability of yearling and adult females and severely underestimated survival probabilities for both of these male stages. The most accurate and precise estimates will necessarily require some amount of intensive data collection efforts (such as capture–recapture). Integrated population models that combine data from both intensive and extensive sources are likely to be the most efficient approach for estimating demographic rates at large spatial and temporal scales. PMID:24634726

  2. Spawning areas of Engraulis anchoita in the Southeastern Brazilian Bight during late-spring and early summer

    NASA Astrophysics Data System (ADS)

    del Favero, Jana M.; Katsuragawa, Mario; Zani-Teixeira, Maria de Lourdes; Turner, Jefferson T.

    2017-04-01

    Analysis of fish egg density and distribution is indispensable for understanding adult stock variability, and is a powerful tool for fisheries management. Thus, the objective of the present study was to characterize the spatial-temporal spawning patterns of Engraulis anchoita in the Southeastern Brazilian Bight, in terms of geographic location and abiotic factors. We analyzed data on eggs sampled during ten years, from 1974 to 1993, to create maps of the mean and the standard deviation (sd) of the estimated probability of egg presence, through indicator kriging. Preferred, tolerated and avoided temperature, salinity, local depth and distance for spawning of E. anchoita were defined by estimating bootstrapped confidence intervals of the quotient values (Q). Although no recurrent spawning sites were identified, a few occasional and unfavorable spawning sites were found, showing that the spawning habits of E. anchoita varied not only spatially but also temporally. The largest occasional spawning site, which also had the highest probability of egg presence (0.6-0.7), was located around 27°S, close to Florianópolis (Santa Catarina state). On the other hand, a well-marked unfavorable spawning site was located off São Sebastião Island (São Paulo state), with the probability of egg presence between 0-0.1. Abiotic and biotic factors that could be related to the changes in the spawning areas of E. anchoita are discussed, with shelf width, mesoscale hydrodynamic features and biological interactions apparently playing important roles in defining spawning sites.

  3. A first hazard analysis of the Harrat Ash Shamah volcanic field, Syria-Jordan Borderline

    NASA Astrophysics Data System (ADS)

    Cagnan, Zehra; Akkar, Sinan; Moghimi, Saed

    2017-04-01

    The northernmost part of the Saudi Cenozoic Volcanic Fields, the 100,000 km² Harrat Ash Shamah, has hosted some of the most recent volcanic eruptions along the Syria-Jordan borderline. With the rapid growth of the cities in this region, exposure to any potential renewed volcanism has increased considerably. We present here a first-order probabilistic hazard analysis related to new vent formation and subsequent lava flow from Harrat Ash Shamah. The 733 visible eruption vent sites were utilized to develop a probability density function for new eruption sites using Gaussian kernel smoothing. This revealed a NNW-striking zone of high spatial hazard surrounding the cities of Amman and Irbid in Jordan. The temporal eruption recurrence rate is estimated to be approximately one vent per 3500 years, but the temporal record of the field is so poorly constrained that the bounds on the recurrence interval range from 70 yr to 17,700 yr. A Poisson temporal model is employed within the scope of this study. In order to treat the uncertainties associated with the spatio-temporal models as well as the size of the area affected by the lava flow, the logic tree approach is adopted. For the Syria-Jordan borderline, we compute the spatial variation of volcanic hazard as well as the uncertainty associated with these estimates.
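
    Both ingredients of this hazard model have compact expressions: Gaussian kernel smoothing of vent locations gives the spatial density, and a Poisson model converts a recurrence interval into an eruption probability over a time window. A sketch with synthetic vent coordinates and the intervals quoted above:

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)
# Synthetic vent coordinates (km), standing in for the 733 mapped vents.
vents = rng.normal([100, 200], [15, 40], (733, 2)).T

# Gaussian kernel smoothing -> spatial density of new vent formation.
kde = gaussian_kde(vents)
print("relative vent density at (105, 190):", kde([[105], [190]])[0])

# Poisson temporal model: P(at least one eruption in the next T years)
# for the best-estimate recurrence interval and its loose bounds.
T = 100.0
for interval in (3500.0, 17700.0, 70.0):
    p = 1 - np.exp(-T / interval)
    print(f"recurrence {interval:>7.0f} yr -> P(eruption in {T:.0f} yr) = {p:.3f}")
```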

  4. Overwinter survival of neotropical migratory birds in early successional and mature tropical forests

    USGS Publications Warehouse

    Conway, C.J.; Powell, G.V.N.; Nichols, J.D.

    1995-01-01

    Many Neotropical migratory species inhabit both mature and early successional forest on their wintering grounds, yet comparisons of survival rates between habitats are lacking. Consequently, the factors affecting habitat suitability for Neotropical migrants and the potential effects of tropical deforestation on migrants are not well understood. We estimated over-winter survival and capture probabilities of Wood Thrush (Hylocichla mustelina), Ovenbird (Seiurus aurocapillus), Hooded Warbler (Wilsonia citrina), and Kentucky Warbler (Oporomis formosus) inhabiting two common tropical habitat types, mature and early-successional forest. Our results suggest that large differences (for example, ratio of survival rates (gamma) < 0.85) in overwinter survival between these habitats do not exist for any of these species. Age ratios did not differ between habitats, but males were more common in forest habitats and females more common in successional habitats for Hooded Warblers and Kentucky Warblers. Future research on overwinter survival should address the need for age- and sex-specific survival estimates before we can draw strong conclusions regarding winter habitat suitability. Our estimates of over-winter survival extrapolated to annual survival rates that were generally lower than previous estimates of annual survival of migratory birds. Capture probability differed between habitats for Kentucky Warblers, but our results provide strong evidence against large differences in capture probability between habitats for Wood Thrush, Hooded Warblers, and Ovenbirds. We found no temporal or among site differences in survival or capture probability for any of the four species. Additional research is needed to examine the effects of winter habitat use on survival during migration and between-winter survival.

  5. Sampling the stream landscape: Improving the applicability of an ecoregion-level capture probability model for stream fishes

    USGS Publications Warehouse

    Mollenhauer, Robert; Mouser, Joshua B.; Brewer, Shannon K.

    2018-01-01

    Temporal and spatial variability in streams result in heterogeneous gear capture probability (i.e., the proportion of available individuals identified) that confounds interpretation of data used to monitor fish abundance. We modeled tow-barge electrofishing capture probability at multiple spatial scales for nine Ozark Highland stream fishes. In addition to fish size, we identified seven reach-scale environmental characteristics associated with variable capture probability: stream discharge, water depth, conductivity, water clarity, emergent vegetation, wetted width–depth ratio, and proportion of riffle habitat. The magnitude of the relationship between capture probability and both discharge and depth varied among stream fishes. We also identified lithological characteristics among stream segments as a coarse-scale source of variable capture probability. The resulting capture probability model can be used to adjust catch data and derive reach-scale absolute abundance estimates across a wide range of sampling conditions with similar effort as used in more traditional fisheries surveys (i.e., catch per unit effort). Adjusting catch data based on variable capture probability improves the comparability of data sets, thus promoting both well-informed conservation and management decisions and advances in stream-fish ecology.

  6. Statistical analysis of Geopotential Height (GH) timeseries based on Tsallis non-extensive statistical mechanics

    NASA Astrophysics Data System (ADS)

    Karakatsanis, L. P.; Iliopoulos, A. C.; Pavlos, E. G.; Pavlos, G. P.

    2018-02-01

    In this paper, we perform statistical analysis of time series deriving from Earth's climate. The time series concern Geopotential Height (GH) and correspond to temporal and spatial components of the global distribution of monthly average values during the period 1948-2012. The analysis is based on Tsallis non-extensive statistical mechanics, and in particular on the estimation of Tsallis' q-triplet, namely {qstat, qsens, qrel}, the reconstructed phase space, and the estimation of the correlation dimension and the Hurst exponent from rescaled range analysis (R/S). The deviation of the Tsallis q-triplet from unity indicates a non-Gaussian (Tsallis q-Gaussian) non-extensive character with heavy-tailed probability density functions (PDFs), multifractal behavior, and long-range dependence for all time series considered. Noticeable differences in the q-triplet estimates were also found among the time series at distinct spatial or temporal regions. Moreover, the reconstructed phase space revealed a lower-dimensional fractal set in the GH dynamical phase space (strong self-organization), and the estimated Hurst exponent indicated multifractality, non-Gaussianity, and persistence. The analysis provides significant information for identifying and characterizing the dynamical characteristics of Earth's climate.

  7. Time-dependence of graph theory metrics in functional connectivity analysis

    PubMed Central

    Chiang, Sharon; Cassese, Alberto; Guindani, Michele; Vannucci, Marina; Yeh, Hsiang J.; Haneef, Zulfi; Stern, John M.

    2016-01-01

    Brain graphs provide a useful way to computationally model the network structure of the connectome, and this has led to increasing interest in the use of graph theory to quantitate and investigate the topological characteristics of the healthy brain and brain disorders on the network level. The majority of graph theory investigations of functional connectivity have relied on the assumption of temporal stationarity. However, recent evidence increasingly suggests that functional connectivity fluctuates over the length of the scan. In this study, we investigate the stationarity of brain network topology using a Bayesian hidden Markov model (HMM) approach that estimates the dynamic structure of graph theoretical measures of whole-brain functional connectivity. In addition to extracting the stationary distribution and transition probabilities of commonly employed graph theory measures, we propose two estimators of temporal stationarity: the S-index and N-index. These indexes can be used to quantify different aspects of the temporal stationarity of graph theory measures. We apply the method and proposed estimators to resting-state functional MRI data from healthy controls and patients with temporal lobe epilepsy. Our analysis shows that several graph theory measures, including small-world index, global integration measures, and betweenness centrality, may exhibit greater stationarity over time and therefore be more robust. Additionally, we demonstrate that accounting for subject-level differences in the level of temporal stationarity of network topology may increase discriminatory power in discriminating between disease states. Our results confirm and extend findings from other studies regarding the dynamic nature of functional connectivity, and suggest that using statistical models which explicitly account for the dynamic nature of functional connectivity in graph theory analyses may improve the sensitivity of investigations and consistency across investigations. PMID:26518632

  8. Time-dependence of graph theory metrics in functional connectivity analysis.

    PubMed

    Chiang, Sharon; Cassese, Alberto; Guindani, Michele; Vannucci, Marina; Yeh, Hsiang J; Haneef, Zulfi; Stern, John M

    2016-01-15

    Brain graphs provide a useful way to computationally model the network structure of the connectome, and this has led to increasing interest in the use of graph theory to quantitate and investigate the topological characteristics of the healthy brain and brain disorders on the network level. The majority of graph theory investigations of functional connectivity have relied on the assumption of temporal stationarity. However, recent evidence increasingly suggests that functional connectivity fluctuates over the length of the scan. In this study, we investigate the stationarity of brain network topology using a Bayesian hidden Markov model (HMM) approach that estimates the dynamic structure of graph theoretical measures of whole-brain functional connectivity. In addition to extracting the stationary distribution and transition probabilities of commonly employed graph theory measures, we propose two estimators of temporal stationarity: the S-index and N-index. These indexes can be used to quantify different aspects of the temporal stationarity of graph theory measures. We apply the method and proposed estimators to resting-state functional MRI data from healthy controls and patients with temporal lobe epilepsy. Our analysis shows that several graph theory measures, including small-world index, global integration measures, and betweenness centrality, may exhibit greater stationarity over time and therefore be more robust. Additionally, we demonstrate that accounting for subject-level differences in the level of temporal stationarity of network topology may increase discriminatory power in discriminating between disease states. Our results confirm and extend findings from other studies regarding the dynamic nature of functional connectivity, and suggest that using statistical models which explicitly account for the dynamic nature of functional connectivity in graph theory analyses may improve the sensitivity of investigations and consistency across investigations. Copyright © 2015 Elsevier Inc. All rights reserved.

  9. Entropy-Bayesian Inversion of Time-Lapse Tomographic GPR data for Monitoring Dielectric Permittivity and Soil Moisture Variations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hou, Z; Terry, N; Hubbard, S S

    2013-02-12

    In this study, we evaluate the possibility of monitoring soil moisture variation using tomographic ground penetrating radar travel time data through Bayesian inversion, which is integrated with entropy memory function and pilot point concepts, as well as efficient sampling approaches. It is critical to accurately estimate soil moisture content and variations in vadose zone studies. Many studies have illustrated the promise and value of GPR tomographic data for estimating soil moisture and associated changes, however, challenges still exist in the inversion of GPR tomographic data in a manner that quantifies input and predictive uncertainty, incorporates multiple data types, handles non-uniqueness and nonlinearity, and honors time-lapse tomograms collected in a series. To address these challenges, we develop a minimum relative entropy (MRE)-Bayesian based inverse modeling framework that non-subjectively defines prior probabilities, incorporates information from multiple sources, and quantifies uncertainty. The framework enables us to estimate dielectric permittivity at pilot point locations distributed within the tomogram, as well as the spatial correlation range. In the inversion framework, MRE is first used to derive prior probability distribution functions (pdfs) of dielectric permittivity based on prior information obtained from a straight-ray GPR inversion. The probability distributions are then sampled using a Quasi-Monte Carlo (QMC) approach, and the sample sets provide inputs to a sequential Gaussian simulation (SGSim) algorithm that constructs a highly resolved permittivity/velocity field for evaluation with a curved-ray GPR forward model. The likelihood functions are computed as a function of misfits, and posterior pdfs are constructed using a Gaussian kernel. Inversion of subsequent time-lapse datasets combines the Bayesian estimates from the previous inversion (as a memory function) with new data. The memory function and pilot point design takes advantage of the spatial-temporal correlation of the state variables. We first apply the inversion framework to a static synthetic example and then to a time-lapse GPR tomographic dataset collected during a dynamic experiment conducted at the Hanford Site in Richland, WA. We demonstrate that the MRE-Bayesian inversion enables us to merge various data types, quantify uncertainty, evaluate nonlinear models, and produce more detailed and better resolved estimates than straight-ray based inversion; therefore, it has the potential to improve estimates of inter-wellbore dielectric permittivity and soil moisture content and to monitor their temporal dynamics more accurately.

  10. Entropy-Bayesian Inversion of Time-Lapse Tomographic GPR data for Monitoring Dielectric Permittivity and Soil Moisture Variations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hou, Zhangshuan; Terry, Neil C.; Hubbard, Susan S.

    2013-02-22

    In this study, we evaluate the possibility of monitoring soil moisture variation using tomographic ground penetrating radar travel time data through Bayesian inversion, which is integrated with entropy memory function and pilot point concepts, as well as efficient sampling approaches. It is critical to accurately estimate soil moisture content and variations in vadose zone studies. Many studies have illustrated the promise and value of GPR tomographic data for estimating soil moisture and associated changes, however, challenges still exist in the inversion of GPR tomographic data in a manner that quantifies input and predictive uncertainty, incorporates multiple data types, handles non-uniqueness and nonlinearity, and honors time-lapse tomograms collected in a series. To address these challenges, we develop a minimum relative entropy (MRE)-Bayesian based inverse modeling framework that non-subjectively defines prior probabilities, incorporates information from multiple sources, and quantifies uncertainty. The framework enables us to estimate dielectric permittivity at pilot point locations distributed within the tomogram, as well as the spatial correlation range. In the inversion framework, MRE is first used to derive prior probability density functions (pdfs) of dielectric permittivity based on prior information obtained from a straight-ray GPR inversion. The probability distributions are then sampled using a Quasi-Monte Carlo (QMC) approach, and the sample sets provide inputs to a sequential Gaussian simulation (SGSIM) algorithm that constructs a highly resolved permittivity/velocity field for evaluation with a curved-ray GPR forward model. The likelihood functions are computed as a function of misfits, and posterior pdfs are constructed using a Gaussian kernel. Inversion of subsequent time-lapse datasets combines the Bayesian estimates from the previous inversion (as a memory function) with new data. The memory function and pilot point design takes advantage of the spatial-temporal correlation of the state variables. We first apply the inversion framework to a static synthetic example and then to a time-lapse GPR tomographic dataset collected during a dynamic experiment conducted at the Hanford Site in Richland, WA. We demonstrate that the MRE-Bayesian inversion enables us to merge various data types, quantify uncertainty, evaluate nonlinear models, and produce more detailed and better resolved estimates than straight-ray based inversion; therefore, it has the potential to improve estimates of inter-wellbore dielectric permittivity and soil moisture content and to monitor their temporal dynamics more accurately.

  11. Comparing different methods for determining forest evapotranspiration and its components at multiple temporal scales.

    PubMed

    Tie, Qiang; Hu, Hongchang; Tian, Fuqiang; Holbrook, N Michele

    2018-08-15

    Accurately estimating forest evapotranspiration and its components is of great importance for hydrology, ecology, and meteorology. In this study, a comparison of methods for determining forest evapotranspiration and its components at annual, monthly, daily, and diurnal scales was conducted based on in situ measurements in a subhumid mountainous forest of North China. The goal of the study was to evaluate the accuracies and reliabilities of the different methods. The results indicate the following: (1) The sap flow upscaling procedure, taking into account diversity in forest types and tree species, produced a component-based forest evapotranspiration estimate that agreed with the eddy covariance-based estimate at the temporal scales of year, month, and day, while the soil water budget-based forest evapotranspiration estimate was also qualitatively consistent with the eddy covariance-based estimate at the daily scale; (2) At the annual scale, the catchment water balance-based forest evapotranspiration estimate was significantly higher than the eddy covariance-based estimate, probably resulting from non-negligible subsurface runoff caused by the widely distributed regolith and fractured bedrock under the ground; (3) At the sub-daily scale, the diurnal course of the sap flow-based canopy transpiration estimate lagged significantly behind the eddy covariance-based forest evapotranspiration estimate, which might physiologically be due to stem water storage and stem hydraulic conductivity. The results from this region may serve as a useful reference for forest evapotranspiration estimation and method evaluation in regions with similar environmental conditions. Copyright © 2018 Elsevier B.V. All rights reserved.

  12. Effect of drift on the temporal asymptotic form of the particle survival probability in media with absorbing traps

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arkhincheev, V. E., E-mail: varkhin@mail.ru

    A new asymptotic form of the particle survival probability in media with absorbing traps has been established. It is shown that the drift mechanism determines this new temporal behavior of the survival probability over long time intervals.

  13. Active Longitude and Solar Flare Occurrences

    NASA Astrophysics Data System (ADS)

    Gyenge, N.; Ludmány, A.; Baranyi, T.

    2016-02-01

    The aim of the present work is to specify the spatio-temporal characteristics of flare activity observed by the Reuven Ramaty High Energy Solar Spectroscopic Imager (RHESSI) and the Geostationary Operational Environmental Satellite (GOES) in connection with the behavior of the longitudinal domain of enhanced sunspot activity known as the active longitude (AL). Using our method developed for this purpose, we identified the AL in every Carrington Rotation provided by the Debrecen Photoheliographic Data. The spatial probability of flare occurrence has been estimated as a function of the longitudinal distance from the AL in the northern and southern hemispheres separately. We have found that more than 60% of the RHESSI and GOES flares are located within ±36° of the AL. Hence, the most flare-productive active regions tend to be located in or close to the active longitudinal belt. This observed feature may allow for the prediction of the geo-effective position of the domain of enhanced flaring probability. Furthermore, we studied the temporal properties of flare occurrence near the AL, and several significant fluctuations were found, with periods of 0.8, 1.3, and 1.8 years. These temporal and spatial properties of solar flare occurrence within the active longitudinal belts could provide an enhanced solar flare forecasting opportunity.

  14. A direct approach to estimating the number of potential fatalities from an eruption: Application to the Central Volcanic Complex of Tenerife Island

    NASA Astrophysics Data System (ADS)

    Marrero, J. M.; García, A.; Llinares, A.; Rodriguez-Losada, J. A.; Ortiz, R.

    2012-03-01

    One of the critical issues in managing volcanic crises is making the decision to evacuate a densely-populated region. In order to take a decision of such importance it is essential to estimate the cost in lives for each of the expected eruptive scenarios. One of the tools that assist in estimating the number of potential fatalities for such decision-making is the calculation of the FN-curves. In this case the FN-curve is a graphical representation that relates the frequency of the different hazards to be expected for a particular volcano or volcanic area, and the number of potential fatalities expected for each event if the zone of impact is not evacuated. In this study we propose a method for assessing the impact that a possible eruption from the Tenerife Central Volcanic Complex (CVC) would have on the population at risk. Factors taken into account include the spatial probability of the eruptive scenarios (susceptibility) and the temporal probability of the magnitudes of the eruptive scenarios. For each point or cell of the susceptibility map with greater probability, a series of probability-scaled hazard maps is constructed for the whole range of magnitudes expected. The number of potential fatalities is obtained from the intersection of the hazard maps with the spatial map of population distribution. The results show that the Emergency Plan for Tenerife must provide for the evacuation of more than 100,000 persons.

  15. Incorporating availability for detection in estimates of bird abundance

    USGS Publications Warehouse

    Diefenbach, D.R.; Marshall, M.R.; Mattice, J.A.; Brauning, D.W.

    2007-01-01

    Several bird-survey methods have been proposed that provide an estimated detection probability so that bird-count statistics can be used to estimate bird abundance. However, some of these estimators adjust counts of birds observed by the probability that a bird is detected and assume that all birds are available to be detected at the time of the survey. We marked male Henslow's Sparrows (Ammodramus henslowii) and Grasshopper Sparrows (A. savannarum) and monitored their behavior during May-July 2002 and 2003 to estimate the proportion of time they were available for detection. We found that the availability of Henslow's Sparrows declined in late June to <10% for 5- or 10-min point counts when a male had to sing and be visible to the observer; but during 20 May-19 June, males were available for detection 39.1% (SD = 27.3) of the time for 5-min point counts and 43.9% (SD = 28.9) of the time for 10-min point counts (n = 54). We detected no temporal changes in availability for Grasshopper Sparrows, but estimated availability to be much lower for 5-min point counts (10.3%, SD = 12.2) than for 10-min point counts (19.2%, SD = 22.3) when males had to be visible and sing during the sampling period (n = 80). For distance sampling, we estimated the availability of Henslow's Sparrows to be 44.2% (SD = 29.0) and the availability of Grasshopper Sparrows to be 20.6% (SD = 23.5). We show how our estimates of availability can be incorporated in the abundance and variance estimators for distance sampling and modify the abundance and variance estimators for the double-observer method. Methods that directly estimate availability from bird counts but also incorporate detection probabilities need further development and will be important for obtaining unbiased estimates of abundance for these species.
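
    Incorporating availability amounts to dividing the count by the product of availability and conditional detection probability. A sketch using the reported 5-min availability for Henslow's Sparrow follows; the other numbers are illustrative assumptions:

```python
# Adjusting a point count for both availability and detection,
# as N = C / (p_avail * p_detect). Values below are illustrative
# except p_avail, which is the reported 5-min estimate (0.391).
count = 12          # singing males detected on a 5-min point count
p_avail = 0.391     # probability a male is available (sings, visible)
p_detect = 0.75     # probability an available male is detected (assumed)

N_hat = count / (p_avail * p_detect)
print(f"estimated males present: {N_hat:.1f}")
```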

  16. Computational Aspects of N-Mixture Models

    PubMed Central

    Dennis, Emily B; Morgan, Byron JT; Ridout, Martin S

    2015-01-01

    The N-mixture model is widely used to estimate the abundance of a population in the presence of unknown detection probability from only a set of counts subject to spatial and temporal replication (Royle, 2004, Biometrics 60, 105–115). We explain and exploit the equivalence of N-mixture and multivariate Poisson and negative-binomial models, which provides powerful new approaches for fitting these models. We show that particularly when detection probability and the number of sampling occasions are small, infinite estimates of abundance can arise. We propose a sample covariance as a diagnostic for this event, and demonstrate its good performance in the Poisson case. Infinite estimates may be missed in practice, due to numerical optimization procedures terminating at arbitrarily large values. It is shown that the use of a bound, K, for an infinite summation in the N-mixture likelihood can result in underestimation of abundance, so that default values of K in computer packages should be avoided. Instead we propose a simple automatic way to choose K. The methods are illustrated by analysis of data on Hermann's tortoise Testudo hermanni. PMID:25314629
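
    The role of the bound K is visible in the N-mixture likelihood itself, a finite sum over latent abundances; a minimal single-site sketch with illustrative parameter values:

```python
import numpy as np
from scipy.stats import binom, poisson

def site_likelihood(counts, lam, p, K):
    """N-mixture likelihood for one site: sum over latent abundance N <= K."""
    Ns = np.arange(max(counts), K + 1)
    prior = poisson.pmf(Ns, lam)                      # Poisson abundance prior
    obs = np.prod([binom.pmf(c, Ns, p) for c in counts], axis=0)
    return np.sum(prior * obs)

counts = [3, 5, 2]          # repeat counts at one site (illustrative)
for K in (10, 50, 200):     # a too-small K truncates the sum and biases results
    print(K, site_likelihood(counts, lam=6.0, p=0.5, K=K))
```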

  17. Understanding the Influence of Turbulence in Imaging Fourier-Transform Spectrometry of Smokestack Plumes

    DTIC Science & Technology

    2011-03-01

    capability of FTS to estimate plume effluent concentrations by comparing intrusive measurements of aircraft engine exhaust with those from an FTS. A... turbojet engine. Temporal averaging was used to reduce SCAs in the spectra, and spatial maps of temperature and concentration were generated. The time...density function (PDF) is defined as the derivative of the CDF, and describes the probability of obtaining a given value of X. For a normally

  18. Satellite-Based Assessment of Rainfall-Triggered Landslide Hazard for Situational Awareness

    NASA Astrophysics Data System (ADS)

    Kirschbaum, Dalia; Stanley, Thomas

    2018-03-01

    Determining the time, location, and severity of natural disaster impacts is fundamental to formulating mitigation strategies, appropriate and timely responses, and robust recovery plans. A Landslide Hazard Assessment for Situational Awareness (LHASA) model was developed to indicate potential landslide activity in near real-time. LHASA combines satellite-based precipitation estimates with a landslide susceptibility map derived from information on slope, geology, road networks, fault zones, and forest loss. Precipitation data from the Global Precipitation Measurement (GPM) mission are used to identify rainfall conditions from the past 7 days. When rainfall is considered to be extreme and susceptibility values are moderate to very high, a "nowcast" is issued to indicate the times and places where landslides are more probable. When LHASA nowcasts were evaluated with a Global Landslide Catalog, the probability of detection (POD) ranged from 8% to 60%, depending on the evaluation period, precipitation product used, and the size of the spatial and temporal window considered around each landslide point. Applications of the LHASA system are also discussed, including how LHASA is used to estimate long-term trends in potential landslide activity at a nearly global scale and how it can be used as a tool to support disaster risk assessment. LHASA is intended to provide situational awareness of landslide hazards in near real-time, providing a flexible, open-source framework that can be adapted to other spatial and temporal scales based on data availability.
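
    The nowcast decision itself is a conjunction of a rainfall trigger and a susceptibility threshold; a schematic sketch follows, with thresholds that are illustrative rather than the operational LHASA values:

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic grids: a 7-day antecedent rainfall index and a landslide
# susceptibility surface in [0, 1] (both illustrative stand-ins).
rain_index = rng.gamma(2.0, 10.0, (100, 100))
susceptibility = rng.random((100, 100))

# Rainfall is "extreme" above its 95th percentile (illustrative threshold);
# a nowcast is issued where susceptibility is also moderate to very high.
extreme_rain = rain_index > np.percentile(rain_index, 95)
nowcast = extreme_rain & (susceptibility >= 0.5)
print(f"nowcast issued for {nowcast.sum()} of {nowcast.size} cells")
```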

  19. Multistate modeling of habitat dynamics: Factors affecting Florida scrub transition probabilities

    USGS Publications Warehouse

    Breininger, D.R.; Nichols, J.D.; Duncan, B.W.; Stolen, Eric D.; Carter, G.M.; Hunt, D.K.; Drese, J.H.

    2010-01-01

    Many ecosystems are influenced by disturbances that create specific successional states and habitat structures that species need to persist. Estimating transition probabilities between habitat states and modeling the factors that influence such transitions have many applications for investigating and managing disturbance-prone ecosystems. We identify the correspondence between multistate capture-recapture models and Markov models of habitat dynamics. We exploit this correspondence by fitting and comparing competing models of different ecological covariates affecting habitat transition probabilities in Florida scrub and flatwoods, a habitat important to many unique plants and animals. We subdivided a large scrub and flatwoods ecosystem along central Florida's Atlantic coast into 10-ha grid cells, which approximated average territory size of the threatened Florida Scrub-Jay (Aphelocoma coerulescens), a management indicator species. We used 1.0-m resolution aerial imagery for 1994, 1999, and 2004 to classify grid cells into four habitat quality states that were directly related to Florida Scrub-Jay source-sink dynamics and management decision making. Results showed that static site features related to fire propagation (vegetation type, edges) and temporally varying disturbances (fires, mechanical cutting) best explained transition probabilities. Results indicated that much of the scrub and flatwoods ecosystem was resistant to moving from a degraded state to a desired state without mechanical cutting, an expensive restoration tool. We used habitat models parameterized with the estimated transition probabilities to investigate the consequences of alternative management scenarios on future habitat dynamics. We recommend this multistate modeling approach as being broadly applicable for studying ecosystem, land cover, or habitat dynamics. The approach provides maximum-likelihood estimates of transition parameters, including precision measures, and can be used to assess evidence among competing ecological models that describe system dynamics. © 2010 by the Ecological Society of America.

  20. Using prediction markets to estimate the reproducibility of scientific research.

    PubMed

    Dreber, Anna; Pfeiffer, Thomas; Almenberg, Johan; Isaksson, Siri; Wilson, Brad; Chen, Yiling; Nosek, Brian A; Johannesson, Magnus

    2015-12-15

    Concerns about a lack of reproducibility of statistically significant results have recently been raised in many fields, and it has been argued that this lack comes at substantial economic costs. We here report the results from prediction markets set up to quantify the reproducibility of 44 studies published in prominent psychology journals and replicated in the Reproducibility Project: Psychology. The prediction markets predict the outcomes of the replications well and outperform a survey of market participants' individual forecasts. This shows that prediction markets are a promising tool for assessing the reproducibility of published scientific results. The prediction markets also allow us to estimate probabilities for the hypotheses being true at different testing stages, which provides valuable information regarding the temporal dynamics of scientific discovery. We find that the hypotheses being tested in psychology typically have low prior probabilities of being true (median, 9%) and that a "statistically significant" finding needs to be confirmed in a well-powered replication to have a high probability of being true. We argue that prediction markets could be used to obtain speedy information about reproducibility at low cost and could potentially even be used to determine which studies to replicate to optimally allocate limited resources into replications.

  1. Using prediction markets to estimate the reproducibility of scientific research

    PubMed Central

    Dreber, Anna; Pfeiffer, Thomas; Almenberg, Johan; Isaksson, Siri; Wilson, Brad; Chen, Yiling; Nosek, Brian A.; Johannesson, Magnus

    2015-01-01

    Concerns about a lack of reproducibility of statistically significant results have recently been raised in many fields, and it has been argued that this lack comes at substantial economic costs. We here report the results from prediction markets set up to quantify the reproducibility of 44 studies published in prominent psychology journals and replicated in the Reproducibility Project: Psychology. The prediction markets predict the outcomes of the replications well and outperform a survey of market participants’ individual forecasts. This shows that prediction markets are a promising tool for assessing the reproducibility of published scientific results. The prediction markets also allow us to estimate probabilities for the hypotheses being true at different testing stages, which provides valuable information regarding the temporal dynamics of scientific discovery. We find that the hypotheses being tested in psychology typically have low prior probabilities of being true (median, 9%) and that a “statistically significant” finding needs to be confirmed in a well-powered replication to have a high probability of being true. We argue that prediction markets could be used to obtain speedy information about reproducibility at low cost and could potentially even be used to determine which studies to replicate to optimally allocate limited resources into replications. PMID:26553988
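
    The claim that a low-prior hypothesis needs a well-powered confirmation follows directly from Bayes' rule; a worked sketch using the reported 9% median prior together with conventional alpha and power values (assumed here, not taken from the paper):

```python
# Posterior probability that a hypothesis is true after one
# "statistically significant" result, via Bayes' rule.
prior = 0.09     # reported median prior probability of a true hypothesis
alpha = 0.05     # false-positive rate (assumed conventional value)
power = 0.80     # probability of significance if true (assumed)

post = power * prior / (power * prior + alpha * (1 - prior))
print(f"after one significant finding:   P(true) = {post:.2f}")   # ~0.61

# A second significant, well-powered replication updates further.
post2 = power * post / (power * post + alpha * (1 - post))
print(f"after a significant replication: P(true) = {post2:.2f}")  # ~0.96
```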

  2. Probability mass first flush evaluation for combined sewer discharges.

    PubMed

    Park, Inhyeok; Kim, Hongmyeong; Chae, Soo-Kwon; Ha, Sungryong

    2010-01-01

    The Korean government has put a lot of effort into constructing sanitation facilities for controlling non-point source pollution, of which the first flush phenomenon is a prime example. However, to date, several serious problems have arisen in the operation and treatment effectiveness of these facilities due to unsuitable design flow volumes and pollution loads. It is difficult to assess the optimal flow volume and pollution mass when considering both monetary and temporal limitations. The objective of this article was to characterize the discharge of storm runoff pollution from urban catchments in Korea and to estimate the probability of mass first flush (MFFn) using the storm water management model and probability density functions. A review of the storms gauged during the last two years, using probability density functions of rainfall volume to assess representativeness, found all gauged storms to be valid representative precipitation events. Both the observed MFFn and the probability MFFn in BE-1 indicated similarly large magnitudes of first flush, with roughly 40% of the total pollution mass contained in the first 20% of the runoff. In the case of BE-2, however, there was a significant difference between the observed MFFn and the probability MFFn.
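
    The mass first flush ratio compares cumulative pollutant mass with cumulative runoff volume; the sketch below computes the mass fraction in the first 20% of runoff from synthetic hydrograph and pollutograph data:

```python
import numpy as np

# Synthetic storm event: runoff rate (m^3/s) and pollutant concentration
# (mg/L) sampled at regular intervals (illustrative values).
flow = np.array([0.2, 0.8, 1.5, 1.2, 0.7, 0.4, 0.2, 0.1])
conc = np.array([220, 180, 120, 60, 40, 30, 25, 20])

vol = np.cumsum(flow) / flow.sum()                    # cumulative volume fraction
mass = np.cumsum(flow * conc) / (flow * conc).sum()   # cumulative mass fraction

# MFF at n = 20: fraction of mass carried by the first 20% of the volume,
# divided by 0.20; values > 1 indicate a first-flush effect.
m20 = np.interp(0.20, vol, mass)
print(f"mass in first 20% of runoff: {m20:.2f}  (MFF20 = {m20 / 0.20:.2f})")
```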

  3. Regionalization of precipitation characteristics in Montana using L-moments

    USGS Publications Warehouse

    Parrett, C.

    1998-01-01

    Dimensionless precipitation-frequency curves for estimating precipitation depths having small exceedance probabilities were developed for 2-, 6-, and 24-hour storm durations for three homogeneous regions in Montana. L-moment statistics were used to help define the homogeneous regions. The generalized extreme value distribution was used to construct the frequency curves for each duration within each region. The effective record length for each duration in each region was estimated using a graphical method and was found to range from 500 years for 6-hour duration data in Region 2 to 5,100 years for 24-hour duration data in Region 3. The temporal characteristics of storms were analyzed, and methods for estimating synthetic storm hyetographs were developed. Dimensionless depth-duration data were grouped by independent duration (2, 6, and 24 hours) and by region, and the beta distribution was fit to dimensionless depth data for various incremental time intervals. Ordinary least-squares regression was used to develop relations between dimensionless depths for a key, short duration - termed the kernel duration - and dimensionless depths for other durations. The regression relations were used, together with the probabilistic dimensionless depth data for the kernel duration, to calculate dimensionless depth-duration curves for exceedance probabilities from 0.1 to 0.9. Dimensionless storm hyetographs for each independent duration in each region were constructed for median value conditions based on an exceedance probability of 0.5.
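
    The beta-fitting step lends itself to a short sketch. Below is a minimal Python example of fitting a beta distribution to dimensionless depth fractions on a fixed [0, 1] support; the data are synthetic stand-ins, not the Montana observations.

```python
# Sketch: fitting a beta distribution to dimensionless depth fractions,
# as described for the incremental-time-interval data. The data are
# synthetic placeholders; the support is fixed to [0, 1] so only the
# two shape parameters are estimated.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
depth_fractions = rng.beta(2.0, 5.0, size=200)  # stand-in for observed data

a, b, loc, scale = stats.beta.fit(depth_fractions, floc=0, fscale=1)
median_depth = stats.beta.ppf(0.5, a, b)        # exceedance probability 0.5
print(f"shape parameters: a={a:.2f}, b={b:.2f}, median={median_depth:.2f}")
```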

  4. A hydroclimatological approach to predicting regional landslide probability using Landlab

    NASA Astrophysics Data System (ADS)

    Strauch, Ronda; Istanbulluoglu, Erkan; Nudurupati, Sai Siddhartha; Bandaragoda, Christina; Gasparini, Nicole M.; Tucker, Gregory E.

    2018-02-01

    We develop a hydroclimatological approach to the modeling of regional shallow landslide initiation that integrates spatial and temporal dimensions of parameter uncertainty to estimate an annual probability of landslide initiation based on Monte Carlo simulations. The physically based model couples the infinite-slope stability model with a steady-state subsurface flow representation and operates in a digital elevation model. Spatially distributed gridded data for soil properties and vegetation classification are used for parameter estimation of probability distributions that characterize model input uncertainty. Hydrologic forcing to the model is through annual maximum daily recharge to subsurface flow obtained from a macroscale hydrologic model. We demonstrate the model in a steep mountainous region in northern Washington, USA, over 2700 km2. The influence of soil depth on the probability of landslide initiation is investigated through comparisons among model output produced using three different soil depth scenarios reflecting the uncertainty of soil depth and its potential long-term variability. We found elevation-dependent patterns in probability of landslide initiation that showed the stabilizing effects of forests at low elevations, an increased landslide probability with forest decline at mid-elevations (1400 to 2400 m), and soil limitation and steep topographic controls at high alpine elevations and in post-glacial landscapes. These dominant controls manifest themselves in a bimodal distribution of spatial annual landslide probability. Model testing with limited observations revealed similarly moderate model confidence for the three hazard maps, suggesting suitable use as relative hazard products. The model is available as a component in Landlab, an open-source, Python-based landscape earth systems modeling environment, and is designed to be easily reproduced utilizing HydroShare cyberinfrastructure.

  5. Analysis of Spatiotemporal Characteristics of Pandemic SARS Spread in Mainland China.

    PubMed

    Cao, Chunxiang; Chen, Wei; Zheng, Sheng; Zhao, Jian; Wang, Jinfeng; Cao, Wuchun

    2016-01-01

    Severe acute respiratory syndrome (SARS) is one of the most severe emerging infectious diseases of the 21st century so far. SARS caused a pandemic that spread throughout mainland China for 7 months, infecting 5318 persons in 194 administrative regions. Using detailed mainland China epidemiological data, we study spatiotemporal aspects of this person-to-person contagious disease and simulate its spatiotemporal transmission dynamics via the Bayesian Maximum Entropy (BME) method. The BME reveals that SARS outbreaks show autocorrelation within certain spatial and temporal distances. We use BME to fit a theoretical covariance model that has a sine hole spatial component and exponential temporal component and obtain the weights of geographical and temporal autocorrelation factors. Using the covariance model, SARS dynamics were estimated and simulated under the most probable conditions. Our study suggests that SARS transmission varies in its epidemiological characteristics and SARS outbreak distributions exhibit palpable clusters on both spatial and temporal scales. In addition, the BME modelling demonstrates that SARS transmission features are affected by spatial heterogeneity, so we analyze potential causes. This may benefit epidemiological control of pandemic infectious diseases.
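
    The covariance model family named above (separable, with a hole-effect spatial term and an exponential temporal term) can be sketched compactly. The exact parameterization and the range parameters below are illustrative assumptions, not the values fitted in the study.

```python
# Sketch of a separable space-time covariance with a damped-sine
# ("sine hole") spatial component and an exponential temporal component.
# Parameter values are illustrative, not the study's fitted values.
import numpy as np

def covariance(r_km, tau_days, sill=1.0, a_s=50.0, a_t=14.0):
    """r_km: spatial lag (km); tau_days: temporal lag (days)."""
    x = np.asarray(r_km, dtype=float) / a_s
    spatial = np.sinc(x / np.pi)  # sin(x)/x hole-effect term, safe at x = 0
    temporal = np.exp(-np.asarray(tau_days, dtype=float) / a_t)
    return sill * spatial * temporal

print(covariance(100.0, 7.0))  # covariance at 100 km and 7 days lag
```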

  6. Analysis of Spatiotemporal Characteristics of Pandemic SARS Spread in Mainland China

    PubMed Central

    Cao, Chunxiang; Zheng, Sheng; Zhao, Jian; Wang, Jinfeng; Cao, Wuchun

    2016-01-01

    Severe acute respiratory syndrome (SARS) is one of the most severe emerging infectious diseases of the 21st century so far. SARS caused a pandemic that spread throughout mainland China for 7 months, infecting 5318 persons in 194 administrative regions. Using detailed mainland China epidemiological data, we study spatiotemporal aspects of this person-to-person contagious disease and simulate its spatiotemporal transmission dynamics via the Bayesian Maximum Entropy (BME) method. The BME reveals that SARS outbreaks show autocorrelation within certain spatial and temporal distances. We use BME to fit a theoretical covariance model that has a sine hole spatial component and exponential temporal component and obtain the weights of geographical and temporal autocorrelation factors. Using the covariance model, SARS dynamics were estimated and simulated under the most probable conditions. Our study suggests that SARS transmission varies in its epidemiological characteristics and SARS outbreak distributions exhibit palpable clusters on both spatial and temporal scales. In addition, the BME modelling demonstrates that SARS transmission features are affected by spatial heterogeneity, so we analyze potential causes. This may benefit epidemiological control of pandemic infectious diseases. PMID:27597972

  7. Assessing tiger population dynamics using photographic capture-recapture sampling

    USGS Publications Warehouse

    Karanth, K.U.; Nichols, J.D.; Kumar, N.S.; Hines, J.E.

    2006-01-01

    Although wide-ranging, elusive, large carnivore species, such as the tiger, are of scientific and conservation interest, rigorous inferences about their population dynamics are scarce because of methodological problems of sampling populations at the required spatial and temporal scales. We report the application of a rigorous, noninvasive method for assessing tiger population dynamics to test model-based predictions about population viability. We obtained photographic capture histories for 74 individual tigers during a nine-year study involving 5725 trap-nights of effort. These data were modeled under a likelihood-based, "robust design" capture-recapture analytic framework. We explicitly modeled and estimated ecological parameters such as time-specific abundance, density, survival, recruitment, temporary emigration, and transience, using models that incorporated effects of factors such as individual heterogeneity, trap-response, and time on probabilities of photo-capturing tigers. The model estimated a random temporary emigration parameter of γ″ = γ′ = 0.10 ± 0.069 (values are estimated mean ± SE). When scaled to an annual basis, tiger survival rates were estimated at S = 0.77 ± 0.051, and the estimated probability that a newly caught animal was a transient was τ = 0.18 ± 0.11. During the period when the sampled area was of constant size, the estimated population size Nt varied from 17 ± 1.7 to 31 ± 2.1 tigers, with a geometric mean rate of annual population change estimated as λ = 1.03 ± 0.020, representing a 3% annual increase. The estimated recruitment of new animals, Bt, varied from 0 ± 3.0 to 14 ± 2.9 tigers. Population density estimates, D, ranged from 7.33 ± 0.8 tigers/100 km2 to 21.73 ± 1.7 tigers/100 km2 during the study. Thus, despite substantial annual losses and temporal variation in recruitment, the tiger density remained at relatively high levels in Nagarahole. Our results are consistent with the hypothesis that protected wild tiger populations can remain healthy despite heavy mortalities because of their inherently high reproductive potential. The ability to model the entire photographic capture history data set and incorporate reduced-parameter models led to estimates of mean annual population change that were sufficiently precise to be useful. This efficient, noninvasive sampling approach can be used to rigorously investigate the population dynamics of tigers and other elusive, rare, wide-ranging animal species in which individuals can be identified from photographs or other means.

  8. Assessing tiger population dynamics using photographic capture-recapture sampling.

    PubMed

    Karanth, K Ullas; Nichols, James D; Kumar, N Samba; Hines, James E

    2006-11-01

    Although wide-ranging, elusive, large carnivore species, such as the tiger, are of scientific and conservation interest, rigorous inferences about their population dynamics are scarce because of methodological problems of sampling populations at the required spatial and temporal scales. We report the application of a rigorous, noninvasive method for assessing tiger population dynamics to test model-based predictions about population viability. We obtained photographic capture histories for 74 individual tigers during a nine-year study involving 5725 trap-nights of effort. These data were modeled under a likelihood-based, "robust design" capture-recapture analytic framework. We explicitly modeled and estimated ecological parameters such as time-specific abundance, density, survival, recruitment, temporary emigration, and transience, using models that incorporated effects of factors such as individual heterogeneity, trap-response, and time on probabilities of photo-capturing tigers. The model estimated a random temporary emigration parameter of γ″ = γ′ = 0.10 ± 0.069 (values are estimated mean ± SE). When scaled to an annual basis, tiger survival rates were estimated at S = 0.77 ± 0.051, and the estimated probability that a newly caught animal was a transient was τ = 0.18 ± 0.11. During the period when the sampled area was of constant size, the estimated population size N(t) varied from 17 ± 1.7 to 31 ± 2.1 tigers, with a geometric mean rate of annual population change estimated as λ = 1.03 ± 0.020, representing a 3% annual increase. The estimated recruitment of new animals, B(t), varied from 0 ± 3.0 to 14 ± 2.9 tigers. Population density estimates, D, ranged from 7.33 ± 0.8 tigers/100 km2 to 21.73 ± 1.7 tigers/100 km2 during the study. Thus, despite substantial annual losses and temporal variation in recruitment, the tiger density remained at relatively high levels in Nagarahole. Our results are consistent with the hypothesis that protected wild tiger populations can remain healthy despite heavy mortalities because of their inherently high reproductive potential. The ability to model the entire photographic capture history data set and incorporate reduced-parameter models led to estimates of mean annual population change that were sufficiently precise to be useful. This efficient, noninvasive sampling approach can be used to rigorously investigate the population dynamics of tigers and other elusive, rare, wide-ranging animal species in which individuals can be identified from photographs or other means.
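
    The geometric mean rate of annual population change reported in this record is straightforward to compute from a series of abundance estimates. The sketch below uses synthetic annual estimates, not the Nagarahole data.

```python
# Sketch: geometric mean annual rate of population change from a series
# of abundance estimates, the quantity reported as lambda above.
# The series is synthetic, for illustration only.
import numpy as np

N = np.array([17, 19, 22, 21, 25, 28, 31])   # annual abundance estimates
lam = (N[-1] / N[0]) ** (1 / (len(N) - 1))   # geometric mean rate of change
print(f"lambda = {lam:.3f}")                  # ~1.105 for this toy series
```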

  9. Examining the relationship between motor assessments and handwriting consistency in children with and without probable developmental coordination disorder.

    PubMed

    Bo, Jin; Colbert, Alison; Lee, Chi-Mei; Schaffert, Jeffrey; Oswald, Kaitlin; Neill, Rebecca

    2014-09-01

    Children with Developmental Coordination Disorder (DCD) often experience difficulties in handwriting. The current study examined the relationships between three motor assessments and the spatial and temporal consistency of handwriting. Twelve children with probable DCD and 29 typically developing children aged 7 to 12 years repetitively wrote the lowercase letters "e" and "l" in cursive and printed forms on a digitizing tablet. Three behavioral assessments, including the Beery-Buktenica Developmental Test of Visual-Motor Integration (VMI), the Minnesota Handwriting Assessment (MHA) and the Movement Assessment Battery for Children (MABC), were administered. Children with probable DCD had low scores on the VMI, MABC and MHA and showed high temporal, but not spatial, variability in the letter-writing task. Their MABC scores related to temporal consistency in all handwriting conditions, and the Legibility scores in their MHA correlated with temporal consistency in cursive "e" and printed "l". It appears that children with probable DCD have prominent difficulties with the temporal aspect of handwriting. While the MHA is a good product-oriented assessment for measuring handwriting deficits, the MABC shows promise as a good assessment for capturing the temporal process of handwriting in children with DCD.

  10. Geographic analysis of species richness and community attributes of forest birds from survey data in the mid-Atlantic integrated assessment region

    USGS Publications Warehouse

    Cam, E.; Sauer, J.R.; Nichols, J.D.; Hines, J.E.; Flather, C.H.

    2000-01-01

    Species richness of local communities is a state variable commonly used in community ecology and conservation biology. Investigation of spatial and temporal variations in richness and identification of factors associated with these variations form a basis for specifying management plans, evaluating these plans, and for testing hypotheses of theoretical interest. However, estimation of species richness is not trivial: species can be missed by investigators during sampling sessions. Sampling artifacts can lead to erroneous conclusions on spatial and temporal variation in species richness. Here we use data from the North American Breeding Bird Survey to estimate parameters describing the state of bird communities in the Mid-Atlantic Assessment (MAIA) region: species richness, extinction probability, turnover and relative species richness. We use a recently developed approach to estimation of species richness and related parameters that does not require the assumption that all the species are detected during sampling efforts. The information presented here is intended to visualize the state of bird communities in the MAIA region. We provide information for 1975 and 1990 and quantify the changes between these years. We summarized and mapped the community attributes at a scale of management interest (watershed units).

  11. Modelling postfledging survival and age- specific breeding probabilities in species with delayed maturity: A case study of Roseate Terns at Falkner Island, Connecticut

    USGS Publications Warehouse

    Spendelow, J.A.; Nichols, J.D.; Hines, J.E.; Lebreton, J.D.; Pradel, R.

    2002-01-01

    We modelled postfledging survival and age-specific breeding probabilities in endangered Roseate Terns (Sterna dougallii) at Falkner Island, Connecticut, USA using capture-recapture data from 1988-1998 of birds ringed as chicks and as adults. While no individuals bred as 2-year-olds during this period, about three-quarters of the young that survived and returned as 3-year-olds nested, and virtually all surviving birds had begun breeding by the time they reached 5 years of age. We found no evidence of temporal variation in age of first breeding of birds from different cohorts. There was significant temporal variation in the annual survival of adults and the survival over the typical 3-year maturation period of prebreeding birds, with extremely low values for both groups from the 1991 breeding season. The estimated overwinter survival rate (0.62) for adults from 1991-1992 was about three-quarters the usual rate of about 0.83, but the low survival of fledglings from 1991 resulted in less than 25% of the otherwise expected number of young from that cohort returning as breeding birds; this suggests that fledglings suffered a greater proportional decrease in survival than did adults. The survival estimates of young from 1989 and 1990 show that these cohorts were not negatively influenced by the events that decimated the young from 1991, and the young from 1992 and 1993 had above-average survival estimates. The apparent decrease since 1996 in development of fidelity of new recruits to this site is suspected to be due mainly to nocturnal disturbance and predation of chicks causing low productivity.

  12. Modelling postfledging survival and age-specific breeding probabilities in species with delayed maturity: a case study of Roseate Terns at Falkner Island, Connecticut

    USGS Publications Warehouse

    Spendelow, J.A.; Nichols, J.D.; Hines, J.E.; Lebreton, J.D.; Pradel, R.

    2002-01-01

    We modeled postfledging survival and age-specific breeding probabilities in endangered Roseate Terns (Sterna dougallii) at Falkner Island, Connecticut, USA using capture-recapture data from 1988-1998 of birds ringed as chicks and as adults. While no individuals bred as 2-yr-olds during this period, about three-quarters of the young that survived and returned as 3-yr-olds nested, and virtually all surviving birds had begun breeding by the time they reached 5 years of age. We found no evidence of temporal variation in age of first breeding of birds from different cohorts. There was significant temporal variation in the annual survival of adults and the survival over the typical 3-yr maturation period of prebreeding birds, with extremely low values for both groups from the 1991 breeding season. The estimated overwinter survival rate (0.62) for adults from 1991-1992 was about three-quarters the usual rate of about 0.83, but the low survival of fledglings from 1991 resulted in less than 25% of the otherwise expected number of young from that cohort returning as breeding birds; this suggests that fledglings suffered a greater proportional decrease in survival than did adults. The survival estimates of young from 1989 and 1990 show that these cohorts were not negatively influenced by the events that decimated the young from 1991, and the young from 1992 and 1993 had above-average survival estimates. The apparent decrease since 1996 in development of fidelity of new recruits to this site is suspected to be due mainly to nocturnal disturbance and predation of chicks causing low productivity.

  13. Impact of temporal probability in 4D dose calculation for lung tumors.

    PubMed

    Rouabhi, Ouided; Ma, Mingyu; Bayouth, John; Xia, Junyi

    2015-11-08

    The purpose of this study was to evaluate the dosimetric uncertainty in 4D dose calculation using three temporal probability distributions: uniform distribution, sinusoidal distribution, and patient-specific distribution derived from the patient respiratory trace. Temporal probability, defined as the fraction of time a patient spends in each respiratory amplitude, was evaluated in nine lung cancer patients. Four-dimensional computed tomography (4D CT), along with deformable image registration, was used to compute 4D dose incorporating the patient's respiratory motion. First, the dose of each of 10 phase CTs was computed using the same planning parameters as those used in 3D treatment planning based on the breath-hold CT. Next, deformable image registration was used to deform the dose of each phase CT to the breath-hold CT using the deformation map between the phase CT and the breath-hold CT. Finally, the 4D dose was computed by summing the deformed phase doses using their corresponding temporal probabilities. In this study, 4D dose calculated from the patient-specific temporal probability distribution was used as the ground truth. The dosimetric evaluation metrics included: 1) 3D gamma analysis, 2) mean tumor dose (MTD), 3) mean lung dose (MLD), and 4) lung V20, V10, and V5. For seven out of nine patients, both uniform and sinusoidal temporal probability dose distributions were found to have an average gamma passing rate > 95% for both the lung and PTV regions. Compared with 4D dose calculated using the patient respiratory trace, doses using uniform and sinusoidal distribution showed a percentage difference on average of -0.1% ± 0.6% and -0.2% ± 0.4% in MTD, -0.2% ± 1.9% and -0.2% ± 1.3% in MLD, 0.09% ± 2.8% and -0.07% ± 1.8% in lung V20, -0.1% ± 2.0% and 0.08% ± 1.34% in lung V10, 0.47% ± 1.8% and 0.19% ± 1.3% in lung V5, respectively. We concluded that four-dimensional dose computed using either a uniform or sinusoidal temporal probability distribution can approximate four-dimensional dose computed using the patient-specific respiratory trace.
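
    The final accumulation step described above is a temporal-probability-weighted sum of the deformed phase doses. The sketch below illustrates it with a uniform distribution and placeholder dose grids; the array shapes are illustrative assumptions.

```python
# Sketch of the 4D dose accumulation step: a temporal-probability-
# weighted sum of phase doses after deformation to the reference CT.
# Grid shapes and dose values are illustrative placeholders.
import numpy as np

n_phases = 10
deformed_dose = np.random.rand(n_phases, 64, 64, 32)  # phase doses on reference grid (Gy)

# Temporal probability: fraction of the breathing cycle spent in each phase.
# A patient-specific vector would come from the respiratory trace; the
# uniform vector below is the simplest of the three distributions studied.
p = np.full(n_phases, 1.0 / n_phases)

dose_4d = np.tensordot(p, deformed_dose, axes=1)  # weighted sum over phases
print(dose_4d.shape)                               # (64, 64, 32)
```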

  14. Demographics and 2008 Run Timing of Adult Lost River (Deltistes luxatus) and Shortnose (Chasmistes brevirostris) Suckers in Upper Klamath Lake

    USGS Publications Warehouse

    Janney, Eric C.; Hayes, Brian S.; Hewitt, David A.; Barry, Patrick M.; Scott, Alta; Koller, Justin; Johnson, Mark; Blackwood, Greta

    2009-01-01

    We used capture-recapture data to assess population dynamics of endangered Lost River suckers (Deltistes luxatus) and shortnose suckers (Chasmistes brevirostris) in Upper Klamath Lake, Oregon. The Cormack-Jolly-Seber method was used to estimate apparent survival probabilities, and a temporal symmetry model was used to estimate annual seniority probabilities. Information theoretic modeling was used to assess variation in parameter estimates due to time, gender, and species. In addition, length data were used to detect multiple year-class failures and events of high recruitment into adult spawning populations. Survival of adult Lost River and shortnose suckers varied substantially across years. Relatively high annual mortality was observed for the lakeshore-spawning Lost River sucker subpopulation in 2002 and for the river spawning subpopulation in 2001. Shortnose suckers experienced high mortality in 2001 and 2004. This indicates that high mortality events are not only species specific, but also are specific to subpopulations for Lost River suckers. Seniority probability estimates and length composition data indicate that recruitment of new individuals into adult sucker populations has been sparse. The overall fitness of Upper Klamath Lake sucker populations are of concern given the low observed survival in some years and the paucity of recent recruitment. During most years, estimates of survival probabilities were lower than seniority probabilities, indicating net losses in adult sucker population abundances. The evidence for decline was more marked for shortnose suckers than for Lost River suckers. Our data indicated that sucker survival for both species, but especially shortnose suckers, was sometimes low in years without any observed fish kills. This indicates that high mortality can occur over a protracted period, resulting in poor annual survival, but will not necessarily be observed in association with a fish kill. A better understanding of the factors influencing adult survival and recruitment into spawning populations is needed. Monitoring these vital parameters will provide a quantitative means to evaluate population status and assess the effectiveness of conservation and recovery efforts.

  15. Satellite-based high-resolution mapping of rainfall over southern Africa

    NASA Astrophysics Data System (ADS)

    Meyer, Hanna; Drönner, Johannes; Nauss, Thomas

    2017-06-01

    A spatially explicit mapping of rainfall over southern Africa is necessary for eco-climatological studies and nowcasting, but accurate estimates remain a challenging task. This study presents a method to estimate hourly rainfall based on data from the Meteosat Second Generation (MSG) Spinning Enhanced Visible and Infrared Imager (SEVIRI). Rainfall measurements from about 350 weather stations from 2010-2014 served as ground truth for calibration and validation. SEVIRI and weather station data were used to train neural networks that allowed the estimation of rainfall area and rainfall quantities over all times of the day. The results revealed that 60% of recorded rainfall events were correctly classified by the model (probability of detection, POD). However, the false alarm ratio (FAR) was high (0.80), leading to a Heidke skill score (HSS) of 0.18. Hourly rainfall quantities were estimated with an average hourly correlation of ρ = 0.33 and a root mean square error (RMSE) of 0.72. The correlation increased with temporal aggregation to 0.52 (daily), 0.67 (weekly) and 0.71 (monthly). The main weakness was the overestimation of rainfall events. The model results were compared to the Integrated Multi-satellitE Retrievals for GPM (IMERG) of the Global Precipitation Measurement (GPM) mission. Despite being a comparably simple approach, the presented MSG-based rainfall retrieval outperformed GPM IMERG in terms of rainfall area detection: GPM IMERG had a considerably lower POD. The HSS was not significantly different from that of the MSG-based retrieval due to the lower FAR of GPM IMERG. There were no further significant differences between the MSG-based retrieval and GPM IMERG in terms of correlation with the observed rainfall quantities. The MSG-based retrieval, however, provides rainfall at a higher spatial resolution. Though estimating rainfall from satellite data remains challenging, especially at high temporal resolutions, this study shows promising results towards improved spatio-temporal estimates of rainfall over southern Africa.
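
    The verification scores quoted above come from a standard 2 x 2 contingency table. In the sketch below the counts are chosen so that the resulting scores reproduce the abstract's POD, FAR, and HSS; they are not the study's actual counts.

```python
# Sketch: categorical verification scores (POD, FAR, HSS) from a 2x2
# contingency table of rain/no-rain forecasts vs. station observations.
# Counts are illustrative, chosen to reproduce the abstract's scores.
hits, false_alarms, misses, correct_negatives = 600, 2400, 400, 6600

pod = hits / (hits + misses)                 # probability of detection
far = false_alarms / (hits + false_alarms)   # false alarm ratio
num = 2 * (hits * correct_negatives - false_alarms * misses)
den = ((hits + misses) * (misses + correct_negatives)
       + (hits + false_alarms) * (false_alarms + correct_negatives))
hss = num / den                              # Heidke skill score
print(f"POD={pod:.2f} FAR={far:.2f} HSS={hss:.2f}")  # 0.60, 0.80, 0.18
```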

  16. PET of serotonin 1A receptors and cerebral glucose metabolism for temporal lobectomy.

    PubMed

    Theodore, William H; Martinez, Ashley R; Khan, Omar I; Liew, Clarissa J; Auh, Sungyoung; Dustin, Irene M; Heiss, John; Sato, Susumu

    2012-09-01

    The objective of this study was to compare 5-hydroxytryptamine receptor 1A (5-HT(1A)) PET with cerebral metabolic rate of glucose (CMRglc) PET for temporal lobectomy planning. We estimated 5-HT(1A) receptor binding preoperatively with (18)F-trans-4-fluoro-N-2-[4-(2-methoxyphenyl) piperazin-1-yl]ethyl-N-(2-pyridyl) cyclohexane carboxamide ((18)F-FCWAY) PET and CMRglc measurement with (18)F-FDG in regions drawn on coregistered MRI after partial-volume correction in 41 patients who had anterior temporal lobectomy with at least a 1-y follow-up. Surgery was tailored to individual preresection evaluations and intraoperative electrocorticography. Mean regional asymmetry values and the number of regions with asymmetry exceeding 2 SDs in 16 healthy volunteers were compared between seizure-free and non-seizure-free patients. (18)F-FCWAY but not (18)F-FDG and MRI data were masked for surgical decisions and outcome assessment. Twenty-six of 41 (63%) patients seizure-free since surgery had significantly different mesial temporal asymmetries, compared with 15 non-seizure-free patients for both (18)F-FCWAY (F(1,39) = 5.87; P = 0.02) and (18)F-FDG PET (F(1,38) = 5.79; P = 0.021). The probability of being seizure-free was explained by both (18)F-FDG and (18)F-FCWAY PET, but not MRI, with a significant additional (18)F-FCWAY effect (chi(2)(2) = 9.8796; P = 0.0072) after the probability of being seizure-free was explained by (18)F-FDG. Although MRI alone was not predictive, any combination of 2 lateralizing imaging studies was highly predictive of seizure freedom. Our study provides class III evidence that both 5-HT(1A) receptor PET and CMRglc PET can contribute to temporal lobectomy planning. Additional studies should explore the potential for temporal lobectomy based on interictal electroencephalography and minimally invasive imaging studies.

  17. Long-term volcanic hazard assessment on El Hierro (Canary Islands)

    NASA Astrophysics Data System (ADS)

    Becerril, L.; Bartolini, S.; Sobradelo, R.; Martí, J.; Morales, J. M.; Galindo, I.

    2014-07-01

    Long-term hazard assessment, one of the bastions of risk-mitigation programs, is required for land-use planning and for developing emergency plans. To ensure quality and representative results, long-term volcanic hazard assessment requires several sequential steps to be completed, which include the compilation of geological and volcanological information, the characterisation of past eruptions, spatial and temporal probabilistic studies, and the simulation of different eruptive scenarios. Despite being a densely populated active volcanic region that receives millions of visitors per year, no systematic hazard assessment has ever been conducted on the Canary Islands. In this paper we focus our attention on El Hierro, the youngest of the Canary Islands and the most recently affected by an eruption. We analyse the past eruptive activity to determine the spatial and temporal probability, and likely style of a future eruption on the island, i.e. the where, when and how. By studying the past eruptive behaviour of the island and assuming that future eruptive patterns will be similar, we aim to identify the most likely volcanic scenarios and corresponding hazards, which include lava flows, pyroclastic fallout and pyroclastic density currents (PDCs). Finally, we estimate their probability of occurrence. The end result, through the combination of the most probable scenarios (lava flows, pyroclastic density currents and ashfall), is the first qualitative integrated volcanic hazard map of the island.

  18. Regional and seasonal estimates of fractional storm coverage based on station precipitation observations

    NASA Technical Reports Server (NTRS)

    Gong, Gavin; Entekhabi, Dara; Salvucci, Guido D.

    1994-01-01

    Simulated climates using numerical atmospheric general circulation models (GCMs) have been shown to be highly sensitive to the fraction of GCM grid area assumed to be wetted during rain events. The model hydrologic cycle and land-surface water and energy balance are influenced by the parameter κ̄, the dimensionless fractional wetted area for GCM grids. Hourly precipitation records for over 1700 precipitation stations within the contiguous United States are used to obtain observation-based estimates of fractional wetting that exhibit regional and seasonal variations. The spatial parameter κ̄ is estimated from the temporal raingauge data using conditional probability relations. Monthly κ̄ values are estimated for rectangular grid areas over the contiguous United States as defined by the Goddard Institute for Space Studies 4° × 5° GCM. A bias in the estimates is evident due to the unavoidably sparse raingauge network density, which causes some storms to go undetected by the network. This bias is corrected by deriving the probability of a storm escaping detection by the network. A Monte Carlo simulation study is also conducted that consists of synthetically generated storm arrivals over an artificial grid area. It is used to confirm the κ̄ estimation procedure and to test the nature of the bias and its correction. These monthly fractional wetting estimates, based on the analysis of station precipitation data, provide an observational basis for assigning the influential parameter κ̄ in GCM land-surface hydrology parameterizations.
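
    A stripped-down version of the conditional-probability estimator can be sketched in a few lines: given an hours-by-stations wetness matrix, estimate κ̄ as the mean fraction of stations that are wet during hours when rain occurs somewhere in the grid cell. This toy version ignores the network-density bias correction described above, and the data are synthetic.

```python
# Toy sketch of estimating fractional wetted area from station records:
# the mean fraction of stations wet, conditional on rain occurring
# somewhere in the grid cell. Synthetic data; no bias correction.
import numpy as np

rng = np.random.default_rng(1)
wet = rng.random((720, 25)) < 0.05   # hours x stations, True = station wet

rainy_hours = wet.any(axis=1)        # hours with rain somewhere in the cell
kappa_bar = wet[rainy_hours].mean()  # fraction of stations wet, given rain
print(f"estimated fractional wetting: {kappa_bar:.2f}")
```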

  19. Grizzly Bear Noninvasive Genetic Tagging Surveys: Estimating the Magnitude of Missed Detections.

    PubMed

    Fisher, Jason T; Heim, Nicole; Code, Sandra; Paczkowski, John

    2016-01-01

    Sound wildlife conservation decisions require sound information, and scientists increasingly rely on remotely collected data over large spatial scales, such as noninvasive genetic tagging (NGT). Grizzly bears (Ursus arctos), for example, are difficult to study at population scales except with noninvasive data, and NGT via hair trapping informs management over much of grizzly bears' range. Considerable statistical effort has gone into estimating sources of heterogeneity, but detection error, arising when a visiting bear fails to leave a hair sample, has not been independently estimated. We used camera traps to survey grizzly bear occurrence at fixed hair traps and multi-method hierarchical occupancy models to estimate the probability that a visiting bear actually leaves a hair sample with viable DNA. We surveyed grizzly bears via hair trapping and camera trapping for 8 monthly surveys at 50 (2012) and 76 (2013) sites in the Rocky Mountains of Alberta, Canada. We used multi-method occupancy models to estimate site occupancy, probability of detection, and conditional occupancy at a hair trap. We tested the prediction that detection error in NGT studies could be induced by temporal variability within season, leading to underestimation of occupancy. NGT via hair trapping consistently underestimated grizzly bear occupancy at a site when compared to camera trapping. At best occupancy was underestimated by 50%; at worst, by 95%. Probability of false absence was reduced through successive surveys, but this mainly accounts for error imparted by movement among repeated surveys, not necessarily missed detections by extant bears. The implications of missed detections and biased occupancy estimates for density estimation, which forms the crux of management plans, require consideration. We suggest hair-trap NGT studies should estimate and correct detection error using independent survey methods such as cameras, to ensure the reliability of the data upon which species management and conservation actions are based.

  20. Online Reinforcement Learning Using a Probability Density Estimation.

    PubMed

    Agostini, Alejandro; Celaya, Enric

    2017-01-01

    Function approximation in online, incremental, reinforcement learning needs to deal with two fundamental problems: biased sampling and nonstationarity. In this kind of task, biased sampling occurs because samples are obtained from specific trajectories dictated by the dynamics of the environment and are usually concentrated in particular convergence regions, which in the long term tend to dominate the approximation in the less sampled regions. The nonstationarity comes from the recursive nature of the estimations typical of temporal difference methods. This nonstationarity has a local profile, varying not only along the learning process but also along different regions of the state space. We propose to deal with these problems using an estimation of the probability density of samples represented with a Gaussian mixture model. To deal with the nonstationarity problem, we use the common approach of introducing a forgetting factor in the updating formula. However, instead of using the same forgetting factor for the whole domain, we make it dependent on the local density of samples, which we use to estimate the nonstationarity of the function at any given input point. To address the biased sampling problem, the forgetting factor applied to each mixture component is modulated according to the new information provided in the updating, rather than forgetting depending only on time, thus avoiding undesired distortions of the approximation in less sampled regions.
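
    The density-modulated forgetting idea can be illustrated with a toy update. Everything below (two 1-D components, the specific modulation rule) is an illustrative assumption rather than the paper's exact equations; the point is that forgetting is applied only in proportion to each component's responsibility for the new sample, so rarely visited regions are not eroded by updates elsewhere.

```python
# Toy sketch of responsibility-modulated forgetting for a 1-D, two-
# component Gaussian mixture. The modulation rule is an illustrative
# assumption, not the paper's update equations.
import numpy as np

means = np.array([0.0, 5.0])
stds = np.array([1.0, 1.0])
weights = np.array([0.5, 0.5])
counts = np.array([10.0, 10.0])  # per-component sufficient statistics
base_forget = 0.99

def update(x):
    global weights, counts
    resp = weights * np.exp(-0.5 * ((x - means) / stds) ** 2) / stds
    resp /= resp.sum()                         # responsibilities for x
    forget = 1.0 - (1.0 - base_forget) * resp  # forget only where we learn
    counts = counts * forget + resp
    weights = counts / counts.sum()

update(0.2)          # a sample near the first component
print(weights, counts)
```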

  1. Effects of tag loss on direct estimates of population growth rate

    USGS Publications Warehouse

    Rotella, J.J.; Hines, J.E.

    2005-01-01

    The temporal symmetry approach of R. Pradel can be used with capture-recapture data to produce retrospective estimates of a population's growth rate, lambda(i), and the relative contributions to lambda(i) from different components of the population. Direct estimation of lambda(i) provides an alternative to using population projection matrices to estimate asymptotic lambda and is seeing increased use. However, the robustness of direct estimates of lambda(i) to violations of several key assumptions has not yet been investigated. Here, we consider tag loss as a possible source of bias for scenarios in which the rate of tag loss is (1) the same for all marked animals in the population and (2) a function of tag age. We computed analytic approximations of the expected values for each of the parameter estimators involved in direct estimation and used those values to calculate bias and precision for each parameter estimator. Estimates of lambda(i) were robust to homogeneous rates of tag loss. When tag loss rates varied by tag age, bias occurred for some of the sampling situations evaluated, especially those with low capture probability, a high rate of tag loss, or both. For situations with low rates of tag loss and high capture probability, bias was low and often negligible. Estimates of contributions of demographic components to lambda(i) were not robust to tag loss. Tag loss reduced the precision of all estimates because tag loss results in fewer marked animals remaining available for estimation. Clearly tag loss should be prevented if possible, and should be considered in analyses of lambda(i), but tag loss does not necessarily preclude unbiased estimation of lambda(i).

  2. Estimating rates of local extinction and colonization in colonial species and an extension to the metapopulation and community levels

    USGS Publications Warehouse

    Barbraud, C.; Nichols, J.D.; Hines, J.E.; Hafner, H.

    2003-01-01

    Coloniality has mainly been studied from an evolutionary perspective, but relatively few studies have developed methods for modelling colony dynamics. Changes in number of colonies over time provide a useful tool for predicting and evaluating the responses of colonial species to management and to environmental disturbance. Probabilistic Markov process models have been recently used to estimate colony site dynamics using presence-absence data when all colonies are detected in sampling efforts. Here, we define and develop two general approaches for the modelling and analysis of colony dynamics for sampling situations in which all colonies are, and are not, detected. For both approaches, we develop a general probabilistic model for the data and then constrain model parameters based on various hypotheses about colony dynamics. We use Akaike's Information Criterion (AIC) to assess the adequacy of the constrained models. The models are parameterised with conditional probabilities of local colony site extinction and colonization. Presence-absence data arising from Pollock's robust capture-recapture design provide the basis for obtaining unbiased estimates of extinction, colonization, and detection probabilities when not all colonies are detected. This second approach should be particularly useful in situations where detection probabilities are heterogeneous among colony sites. The general methodology is illustrated using presence-absence data on two species of herons (Purple Heron, Ardea purpurea and Grey Heron, Ardea cinerea). Estimates of the extinction and colonization rates showed interspecific differences and strong temporal and spatial variations. We were also able to test specific predictions about colony dynamics based on ideas about habitat change and metapopulation dynamics. We recommend estimators based on probabilistic modelling for future work on colony dynamics. We also believe that this methodological framework has wide application to problems in animal ecology concerning metapopulation and community dynamics.
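
    The underlying dynamics are a two-state Markov chain per colony site, which makes a quick simulation sketch possible. The parameter values below are illustrative; the equilibrium occupancy γ/(γ+ε) falls out of the stationary distribution of the chain.

```python
# Sketch of the two-state Markov process underlying colony-dynamics
# models: each site flips between occupied and empty with extinction
# probability eps and colonization probability gam. Values illustrative.
import numpy as np

eps, gam = 0.2, 0.1                 # local extinction / colonization
rng = np.random.default_rng(2)
occupied = np.ones(100, dtype=bool) # 100 colony sites, all occupied

for _ in range(50):
    u = rng.random(100)
    # occupied sites persist with prob 1-eps; empty sites colonize with prob gam
    occupied = np.where(occupied, u > eps, u < gam)

print(occupied.mean(), gam / (gam + eps))  # simulated vs. equilibrium (~0.33)
```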

  3. Time series sightability modeling of animal populations.

    PubMed

    ArchMiller, Althea A; Dorazio, Robert M; St Clair, Katherine; Fieberg, John R

    2018-01-01

    Logistic regression models-or "sightability models"-fit to detection/non-detection data from marked individuals are often used to adjust for visibility bias in later detection-only surveys, with population abundance estimated using a modified Horvitz-Thompson (mHT) estimator. More recently, a model-based alternative for analyzing combined detection/non-detection and detection-only data was developed. This approach seemed promising, since it resulted in similar estimates as the mHT when applied to data from moose (Alces alces) surveys in Minnesota. More importantly, it provided a framework for developing flexible models for analyzing multiyear detection-only survey data in combination with detection/non-detection data. During initial attempts to extend the model-based approach to multiple years of detection-only data, we found that estimates of detection probabilities and population abundance were sensitive to the amount of detection-only data included in the combined (detection/non-detection and detection-only) analysis. Subsequently, we developed a robust hierarchical modeling approach where sightability model parameters are informed only by the detection/non-detection data, and we used this approach to fit a fixed-effects model (FE model) with year-specific parameters and a temporally-smoothed model (TS model) that shares information across years via random effects and a temporal spline. The abundance estimates from the TS model were more precise, with decreased interannual variability relative to the FE model and mHT abundance estimates, illustrating the potential benefits from model-based approaches that allow information to be shared across years.
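
    The modified Horvitz-Thompson step referenced above is compact enough to sketch: detection probabilities predicted by the fitted logistic sightability model are used to inflate each detected group. The covariate and coefficients below are illustrative assumptions, not the Minnesota moose model.

```python
# Sketch of a modified Horvitz-Thompson abundance estimate using
# detection probabilities from a logistic sightability model.
# Covariate values and coefficients are illustrative assumptions.
import numpy as np

group_size = np.array([1, 3, 2, 5, 1])        # animal groups detected
canopy = np.array([0.8, 0.2, 0.5, 0.1, 0.9])  # visual-obstruction covariate
b0, b1 = 2.0, -3.0                            # fitted sightability coefficients

p = 1.0 / (1.0 + np.exp(-(b0 + b1 * canopy))) # detection probability per group
N_hat = np.sum(group_size / p)                # inflate each group by 1/p
print(f"estimated abundance: {N_hat:.1f}")
```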

  4. Time series sightability modeling of animal populations

    USGS Publications Warehouse

    ArchMiller, Althea A.; Dorazio, Robert; St. Clair, Katherine; Fieberg, John R.

    2018-01-01

    Logistic regression models—or “sightability models”—fit to detection/non-detection data from marked individuals are often used to adjust for visibility bias in later detection-only surveys, with population abundance estimated using a modified Horvitz-Thompson (mHT) estimator. More recently, a model-based alternative for analyzing combined detection/non-detection and detection-only data was developed. This approach seemed promising, since it resulted in similar estimates as the mHT when applied to data from moose (Alces alces) surveys in Minnesota. More importantly, it provided a framework for developing flexible models for analyzing multiyear detection-only survey data in combination with detection/non-detection data. During initial attempts to extend the model-based approach to multiple years of detection-only data, we found that estimates of detection probabilities and population abundance were sensitive to the amount of detection-only data included in the combined (detection/non-detection and detection-only) analysis. Subsequently, we developed a robust hierarchical modeling approach where sightability model parameters are informed only by the detection/non-detection data, and we used this approach to fit a fixed-effects model (FE model) with year-specific parameters and a temporally-smoothed model (TS model) that shares information across years via random effects and a temporal spline. The abundance estimates from the TS model were more precise, with decreased interannual variability relative to the FE model and mHT abundance estimates, illustrating the potential benefits from model-based approaches that allow information to be shared across years.

  5. Spatio-temporal optimization of sampling for bluetongue vectors (Culicoides) near grazing livestock

    PubMed Central

    2013-01-01

    Background: Estimating the abundance of Culicoides using light traps is influenced by a large variation in abundance in time and place. This study investigates the optimal trapping strategy to estimate the abundance or presence/absence of Culicoides on a field with grazing animals. We used 45 light traps to sample specimens from the Culicoides obsoletus species complex on a 14-hectare field during 16 nights in 2009. Findings: The large number of traps and catch nights enabled us to simulate a series of samples consisting of different numbers of traps (1-15) on each night. We also varied the number of catch nights when simulating the sampling, and sampled with increasing minimum distances between traps. We used resampling to generate a distribution of different mean and median abundance in each sample. Finally, we used the hypergeometric distribution to estimate the probability of falsely detecting absence of vectors on the field. The variation in the estimated abundance decreased steeply when using up to six traps, and was less pronounced when using more traps, although no clear cutoff was found. Conclusions: Despite spatial clustering in vector abundance, we found no effect of increasing the distance between traps. We found that 18 traps were generally required to reach 90% probability of a true positive catch when sampling just one night, but when sampling over two nights the same probability level was obtained with just three traps per night. The results are useful for the design of vector monitoring programmes on fields with grazing animals. PMID:23705770
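
    The false-absence calculation maps directly onto the hypergeometric distribution available in scipy: if some number of the available trap-nights would have caught at least one midge, the chance that a small sample of trap-nights catches none is a hypergeometric probability. The counts below are illustrative, not the study's.

```python
# Sketch: probability of falsely declaring absence when sampling a few
# trap-nights out of many. Counts are illustrative placeholders.
from scipy.stats import hypergeom

total, positive, sampled = 45, 18, 3  # trap-nights: available / positive / drawn
p_false_absence = hypergeom.pmf(0, total, positive, sampled)
print(f"P(falsely declaring absence) = {p_false_absence:.3f}")  # ~0.206
```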

  6. Temporally and spatially partitioned behaviours of spinner dolphins: implications for resilience to human disturbance

    PubMed Central

    Johnston, David W.; Christiansen, Fredrik

    2017-01-01

    Selective forces shape the evolution of wildlife behavioural strategies and influence the spatial and temporal partitioning of behavioural activities to maximize individual fitness. Globally, wildlife is increasingly exposed to human activities which may affect their behavioural activities. The ability of wildlife to compensate for the effects of human activities may have implications for their resilience to disturbance. Resilience theory suggests that behavioural systems which are constrained in their repertoires are less resilient to disturbance than flexible systems. Using behavioural time-series data, we show that spinner dolphins (Stenella longirostris) spatially and temporally partition their behavioural activities on a daily basis. Specifically, spinner dolphins were never observed foraging during daytime, where resting was the predominant activity. Travelling and socializing probabilities were higher in early mornings and late afternoons when dolphins were returning from or preparing for nocturnal feeding trips, respectively. The constrained nature of spinner dolphin behaviours suggests they are less resilient to human disturbance than other cetaceans. These dolphins experience the highest exposure rates to human activities ever reported for any cetaceans. Over the last 30 years human activities have increased significantly in Hawaii, but the spinner dolphins still inhabit these bays. Recent abundance estimates (2011 and 2012) however, are lower than all previous estimates (1979–1981, 1989–1992 and 2003), indicating a possible long-term impact. Quantification of the spatial and temporal partitioning of wildlife behavioural schedules provides critical insight for conservation measures that aim to mitigate the effects of human disturbance. PMID:28280561

  7. Temporally and spatially partitioned behaviours of spinner dolphins: implications for resilience to human disturbance.

    PubMed

    Tyne, Julian A; Johnston, David W; Christiansen, Fredrik; Bejder, Lars

    2017-01-01

    Selective forces shape the evolution of wildlife behavioural strategies and influence the spatial and temporal partitioning of behavioural activities to maximize individual fitness. Globally, wildlife is increasingly exposed to human activities which may affect their behavioural activities. The ability of wildlife to compensate for the effects of human activities may have implications for their resilience to disturbance. Resilience theory suggests that behavioural systems which are constrained in their repertoires are less resilient to disturbance than flexible systems. Using behavioural time-series data, we show that spinner dolphins (Stenella longirostris) spatially and temporally partition their behavioural activities on a daily basis. Specifically, spinner dolphins were never observed foraging during daytime, where resting was the predominant activity. Travelling and socializing probabilities were higher in early mornings and late afternoons when dolphins were returning from or preparing for nocturnal feeding trips, respectively. The constrained nature of spinner dolphin behaviours suggests they are less resilient to human disturbance than other cetaceans. These dolphins experience the highest exposure rates to human activities ever reported for any cetaceans. Over the last 30 years human activities have increased significantly in Hawaii, but the spinner dolphins still inhabit these bays. Recent abundance estimates (2011 and 2012) however, are lower than all previous estimates (1979-1981, 1989-1992 and 2003), indicating a possible long-term impact. Quantification of the spatial and temporal partitioning of wildlife behavioural schedules provides critical insight for conservation measures that aim to mitigate the effects of human disturbance.

  8. Holocene paleoseismicity, temporal clustering, and probabilities of future large (M > 7) earthquakes on the Wasatch fault zone, Utah

    USGS Publications Warehouse

    McCalpin, J.P.; Nishenko, S.P.

    1996-01-01

    The chronology of M>7 paleoearthquakes on the central five segments of the Wasatch fault zone (WFZ) is one of the best dated in the world and contains 16 earthquakes in the past 5600 years with an average repeat time of 350 years. Repeat times for individual segments vary by a factor of 2, and range from about 1200 to 2600 years. Four of the central five segments ruptured between ~620 ± 30 and 1230 ± 60 calendar years B.P. The remaining segment (Brigham City segment) has not ruptured in the past 2120 ± 100 years. Comparison of the WFZ space-time diagram of paleoearthquakes with synthetic paleoseismic histories indicates that the observed temporal clusters and gaps have about an equal probability (depending on model assumptions) of reflecting random coincidence as opposed to intersegment contagion. Regional seismicity suggests that for exposure times of 50 and 100 years, the probability for an earthquake of M>7 anywhere within the Wasatch Front region, based on a Poisson model, is 0.16 and 0.30, respectively. A fault-specific WFZ model predicts 50- and 100-year probabilities for a M>7 earthquake on the WFZ itself, based on a Poisson model, as 0.13 and 0.25, respectively. In contrast, segment-specific earthquake probabilities that assume quasi-periodic recurrence behavior on the Weber, Provo, and Nephi segments are less (0.01-0.07 in 100 years) than the regional or fault-specific estimates (0.25-0.30 in 100 years), due to the short elapsed times compared to average recurrence intervals on those segments. The Brigham City and Salt Lake City segments, however, have time-dependent probabilities that approach or exceed the regional and fault-specific probabilities. For the Salt Lake City segment, these elevated probabilities are due to the elapsed time being approximately equal to the average late Holocene recurrence time. For the Brigham City segment, the elapsed time is significantly longer than the segment-specific late Holocene recurrence time.
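
    The fault-specific Poisson probabilities quoted above follow directly from the 350-year mean repeat time, since under a Poisson model the probability of at least one event in an exposure window of t years is 1 - exp(-t/350). A two-line check:

```python
# Worked check of the Poisson probabilities quoted in the abstract:
# mean repeat time 350 yr implies P(>=1 event in t yr) = 1 - exp(-t/350).
import math

for t in (50, 100):
    p = 1.0 - math.exp(-t / 350.0)
    print(f"P(M>7 on WFZ within {t} yr) = {p:.2f}")  # 0.13 and 0.25
```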

  9. The long-term outcomes of epilepsy surgery

    PubMed Central

    Keller, Simon; Nicolson, Andrew; Biswas, Shubhabrata; Smith, David; Osman Farah, Jibril; Eldridge, Paul; Wieshmann, Udo

    2018-01-01

    Objective: Despite modern anti-epileptic drug treatment, approximately 30% of epilepsies remain medically refractory and for these patients, epilepsy surgery may be a treatment option. There have been numerous studies demonstrating good outcomes of epilepsy surgery in the short to medium term; however, there are a limited number of studies looking at the long-term outcomes. The aim of this study was to ascertain the long-term outcome of resective epilepsy surgery in a large neurosurgery hospital in the U.K. Methods: This is a retrospective analysis of prospectively collected data. We used the 2001 International League Against Epilepsy (ILAE) classification system to classify seizure freedom and Kaplan-Meier survival analysis to estimate the probability of seizure freedom. Results: We included 284 patients who underwent epilepsy surgery (178 anterior temporal lobe resections, 37 selective amygdalohippocampectomies, 33 temporal lesionectomies, 36 extratemporal lesionectomies), and had a prospective median follow-up of 5 years (range 1–27). Kaplan-Meier estimates showed that 47% (95% CI 40–58) remained seizure free (apart from simple partial seizures) at 5 years and 38% (95% CI 31–45) at 10 years after surgery. 74% (95% CI 69–80) had a greater than 50% seizure reduction at 5 years and 70% (95% CI 64–77) at 10 years. Patients who had an amygdalohippocampectomy were more likely to have seizure recurrence than patients who had an anterior temporal lobe resection (p = 0.006) or temporal lesionectomy (p = 0.029). There was no significant difference between extratemporal and temporal lesionectomies. Hippocampal sclerosis was associated with a good outcome but declined in relative frequency over the years. Conclusion: The vast majority of patients who were not seizure free experienced at least a substantial and long-lasting reduction in seizure frequency. A positive long-term outcome after epilepsy surgery is possible for many patients, especially those with hippocampal sclerosis or those who had anterior temporal lobe resections. PMID:29768433

  10. The long-term outcomes of epilepsy surgery.

    PubMed

    Mohan, Midhun; Keller, Simon; Nicolson, Andrew; Biswas, Shubhabrata; Smith, David; Osman Farah, Jibril; Eldridge, Paul; Wieshmann, Udo

    2018-01-01

    Despite modern anti-epileptic drug treatment, approximately 30% of epilepsies remain medically refractory and for these patients, epilepsy surgery may be a treatment option. There have been numerous studies demonstrating good outcomes of epilepsy surgery in the short to medium term; however, there are a limited number of studies looking at the long-term outcomes. The aim of this study was to ascertain the long-term outcome of resective epilepsy surgery in a large neurosurgery hospital in the U.K. This is a retrospective analysis of prospectively collected data. We used the 2001 International League Against Epilepsy (ILAE) classification system to classify seizure freedom and Kaplan-Meier survival analysis to estimate the probability of seizure freedom. We included 284 patients who underwent epilepsy surgery (178 anterior temporal lobe resections, 37 selective amygdalohippocampectomies, 33 temporal lesionectomies, 36 extratemporal lesionectomies), and had a prospective median follow-up of 5 years (range 1-27). Kaplan-Meier estimates showed that 47% (95% CI 40-58) remained seizure free (apart from simple partial seizures) at 5 years and 38% (95% CI 31-45) at 10 years after surgery. 74% (95% CI 69-80) had a greater than 50% seizure reduction at 5 years and 70% (95% CI 64-77) at 10 years. Patients who had an amygdalohippocampectomy were more likely to have seizure recurrence than patients who had an anterior temporal lobe resection (p = 0.006) or temporal lesionectomy (p = 0.029). There was no significant difference between extratemporal and temporal lesionectomies. Hippocampal sclerosis was associated with a good outcome but declined in relative frequency over the years. The vast majority of patients who were not seizure free experienced at least a substantial and long-lasting reduction in seizure frequency. A positive long-term outcome after epilepsy surgery is possible for many patients, especially those with hippocampal sclerosis or those who had anterior temporal lobe resections.
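
    The Kaplan-Meier estimate used in both versions of this study reduces to a product over event times. The sketch below implements it on synthetic recurrence times with right-censoring; the data are not the study's.

```python
# Minimal Kaplan-Meier sketch: S(t) is the product over event times of
# (1 - deaths / at-risk). Recurrence times and censoring are synthetic.
import numpy as np

time = np.array([1, 2, 2, 3, 5, 5, 7, 10])  # years to recurrence/censoring
event = np.array([1, 1, 0, 1, 1, 0, 1, 0])  # 1 = seizure recurrence observed

surv = 1.0
for t in np.unique(time[event == 1]):       # distinct event times, ascending
    at_risk = np.sum(time >= t)             # still under observation at t
    deaths = np.sum((time == t) & (event == 1))
    surv *= 1.0 - deaths / at_risk
    print(f"S({t}) = {surv:.2f}")
```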

  11. Estimating occupancy dynamics in an anuran assemblage from Louisiana, USA

    USGS Publications Warehouse

    Walls, Susan C.; Waddle, J. Hardin; Dorazio, Robert M.

    2011-01-01

    Effective monitoring programs are designed to track changes in the distribution, occurrence, and abundance of species. We developed an extension of Royle and Kéry's (2007) single species model to estimate simultaneously temporal changes in probabilities of detection, occupancy, colonization, extinction, and species turnover using data on calling anuran amphibians, collected from 2002 to 2006 in the Lower Mississippi Alluvial Valley of Louisiana, USA. During our 5-year study, estimates of occurrence probabilities declined for all 12 species detected. These declines occurred primarily in conjunction with variation in estimates of local extinction probabilities (cajun chorus frog [Pseudacris fouquettei], spring peeper [P. crucifer], northern cricket frog [Acris crepitans], Cope's gray treefrog [Hyla chrysoscelis], green treefrog [H. cinerea], squirrel treefrog [H. squirella], southern leopard frog [Lithobates sphenocephalus], bronze frog [L. clamitans], American bullfrog [L. catesbeianus], and Fowler's toad [Anaxyrus fowleri]). For 2 species (eastern narrowmouthed toad [Gastrophryne carolinensis] and Gulf Coast toad [Incilius nebulifer]), declines in occupancy appeared to be a consequence of both increased local extinction and decreased colonization events. The eastern narrow-mouthed toad experienced a 2.5-fold increase in estimates of occupancy in 2004, possibly because of the high amount of rainfall received during that year, along with a decrease in extinction and increase in colonization of new sites between 2003 and 2004. Our model can be incorporated into monitoring programs to estimate simultaneously the occupancy dynamics for multiple species that show similar responses to ecological conditions. It will likely be an important asset for those monitoring programs that employ the same methods to sample assemblages of ecologically similar species, including those that are rare. By combining information from multiple species to decrease the variance on estimates of individual species, our results are advantageous compared to single-species models. This feature enables managers and researchers to use an entire community, rather than just one species, as an ecological indicator in monitoring programs.

  12. Grizzly Bear Noninvasive Genetic Tagging Surveys: Estimating the Magnitude of Missed Detections

    PubMed Central

    Fisher, Jason T.; Heim, Nicole; Code, Sandra; Paczkowski, John

    2016-01-01

    Sound wildlife conservation decisions require sound information, and scientists increasingly rely on remotely collected data over large spatial scales, such as noninvasive genetic tagging (NGT). Grizzly bears (Ursus arctos), for example, are difficult to study at population scales except with noninvasive data, and NGT via hair trapping informs management over much of grizzly bears' range. Considerable statistical effort has gone into estimating sources of heterogeneity, but detection error, arising when a visiting bear fails to leave a hair sample, has not been independently estimated. We used camera traps to survey grizzly bear occurrence at fixed hair traps and multi-method hierarchical occupancy models to estimate the probability that a visiting bear actually leaves a hair sample with viable DNA. We surveyed grizzly bears via hair trapping and camera trapping for 8 monthly surveys at 50 (2012) and 76 (2013) sites in the Rocky Mountains of Alberta, Canada. We used multi-method occupancy models to estimate site occupancy, probability of detection, and conditional occupancy at a hair trap. We tested the prediction that detection error in NGT studies could be induced by temporal variability within season, leading to underestimation of occupancy. NGT via hair trapping consistently underestimated grizzly bear occupancy at a site when compared to camera trapping. At best, occupancy was underestimated by 50%; at worst, by 95%. The probability of false absence was reduced through successive surveys, but this mainly accounts for error imparted by movement among repeated surveys, not necessarily missed detections by extant bears. The implications of missed detections and biased occupancy estimates for density estimation, which forms the crux of management plans, require consideration. We suggest hair-trap NGT studies should estimate and correct detection error using independent survey methods such as cameras, to ensure the reliability of the data upon which species management and conservation actions are based. PMID:27603134

  13. A novel approach to estimate the eruptive potential and probability in open conduit volcanoes

    PubMed Central

    De Gregorio, Sofia; Camarda, Marco

    2016-01-01

    In open conduit volcanoes, volatile-rich magma continuously enters the feeding system; nevertheless, the eruptive activity occurs intermittently. From a practical perspective, the continuous steady input of magma into the feeding system is not able to produce eruptive events alone; rather, surpluses of magma input are required to trigger the eruptive activity. The greater the amount of surplus magma within the feeding system, the higher the eruptive probability. Despite this observation, eruptive potential evaluations are commonly based on the regular magma supply, and in eruptive probability evaluations any magma input is generally given the same weight. Herein, by contrast, we present a novel approach based on the quantification of the surplus of magma progressively intruded into the feeding system. To quantify this surplus, we suggest processing time series of measurable parameters linked to the magma supply. We successfully performed a practical application on Mt Etna using the soil CO2 flux recorded over ten years. PMID:27456812

  14. A novel approach to estimate the eruptive potential and probability in open conduit volcanoes.

    PubMed

    De Gregorio, Sofia; Camarda, Marco

    2016-07-26

    In open conduit volcanoes, volatile-rich magma continuously enters the feeding system; nevertheless, the eruptive activity occurs intermittently. From a practical perspective, the continuous steady input of magma into the feeding system is not able to produce eruptive events alone; rather, surpluses of magma input are required to trigger the eruptive activity. The greater the amount of surplus magma within the feeding system, the higher the eruptive probability. Despite this observation, eruptive potential evaluations are commonly based on the regular magma supply, and in eruptive probability evaluations any magma input is generally given the same weight. Herein, by contrast, we present a novel approach based on the quantification of the surplus of magma progressively intruded into the feeding system. To quantify this surplus, we suggest processing time series of measurable parameters linked to the magma supply. We successfully performed a practical application on Mt Etna using the soil CO2 flux recorded over ten years.
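
    A toy illustration of the central idea, under the assumption that the surplus can be read off a supply proxy as the portion exceeding a steady background input. The series, the median baseline, and the simple cumulative sum below are illustrative choices, not the authors' actual processing of the Etna CO2 record:

```python
import numpy as np

rng = np.random.default_rng(1)
flux = rng.gamma(shape=2.0, scale=1.0, size=3650)   # synthetic daily proxy, ~10 years
baseline = np.median(flux)                          # stand-in for the steady supply

surplus_input = np.clip(flux - baseline, 0, None)   # excess over the steady input
cumulative_surplus = np.cumsum(surplus_input)       # surplus progressively intruded

# In this scheme, a larger accumulated surplus since the last eruption maps to
# a higher eruptive probability.
print(cumulative_surplus[-1])
```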

  15. Spatial Interpolation of Rain-field Dynamic Time-Space Evolution in Hong Kong

    NASA Astrophysics Data System (ADS)

    Liu, P.; Tung, Y. K.

    2017-12-01

    Accurate and reliable measurement and prediction of the spatial and temporal distribution of rain fields over a wide range of scales are important topics in hydrologic investigations. In this study, a geostatistical treatment of the precipitation field is adopted. To estimate the rainfall intensity over a study domain from the sample values and the spatial structure derived from the radar data, the cumulative distribution functions (CDFs) at all unsampled locations were estimated. Indicator Kriging (IK) was used to estimate the exceedance probabilities for different pre-selected cutoff levels, and a procedure was implemented for interpolating CDF values between the thresholds derived from the IK. Different interpolation schemes for the CDF were proposed and their influence on performance was also investigated. The performance measures and visual comparison between the observed rain field and the IK-based estimation suggested that the proposed method provides good estimates of the indicator variables and is capable of producing realistic images.
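
    A sketch of the indicator-kriging step under stated assumptions: observations are transformed into indicator variables at pre-selected cutoffs, each indicator is kriged to give a non-exceedance probability map, and a CDF is assembled across cutoffs. It uses the pykrige package; the point data are synthetic stand-ins for radar samples, and the variogram choice is illustrative:

```python
import numpy as np
from pykrige.ok import OrdinaryKriging

rng = np.random.default_rng(2)
x, y = rng.uniform(0, 100, 200), rng.uniform(0, 100, 200)
rain = rng.gamma(2.0, 5.0, 200)                     # sampled rainfall intensities

cutoffs = np.percentile(rain, [25, 50, 75, 90])
gridx = gridy = np.linspace(0, 100, 50)

cdf_layers = []
for z_k in cutoffs:
    indicator = (rain <= z_k).astype(float)         # I(z <= z_k) at sampled points
    ok = OrdinaryKriging(x, y, indicator, variogram_model="spherical")
    prob, _ = ok.execute("grid", gridx, gridy)      # kriged P(Z <= z_k) on the grid
    cdf_layers.append(np.clip(prob, 0.0, 1.0))

# Enforce monotonicity across cutoffs before interpolating between thresholds
cdf = np.maximum.accumulate(np.stack(cdf_layers), axis=0)
```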

  16. Environmental and social determinants of population vulnerability to Zika virus emergence at the local scale.

    PubMed

    Rees, Erin E; Petukhova, Tatiana; Mascarenhas, Mariola; Pelcat, Yann; Ogden, Nicholas H

    2018-05-08

    Zika virus (ZIKV) spread rapidly in the Americas in 2015. Targeting effective public health interventions for inhabitants of, and travellers to and from, affected countries depends on understanding the risk of ZIKV emergence (and re-emergence) at the local scale. We explore the extent to which environmental, social and neighbourhood disease intensity variables influenced emergence dynamics. Our objective was to characterise population vulnerability given the potential for sustained autochthonous ZIKV transmission and the timing of emergence. Logistic regression models estimated the probability of reporting at least one case of ZIKV in a given municipality over the course of the study period as an indicator of sustained transmission, while accelerated failure time (AFT) survival models estimated the time to a first reported case of ZIKV in week t for a given municipality as an indicator of the timing of emergence. Sustained autochthonous ZIKV transmission was best described at the temporal scale of the study period (almost one year), such that high levels of study-period precipitation and low mean study-period temperature reduced the probability. Timing of ZIKV emergence was best described at the weekly scale for precipitation, in that high precipitation in the current week delayed reporting. Both modelling approaches detected an effect of high poverty in reducing or slowing case detection, especially when inter-municipal road connectivity was low. We also found that proximity to municipalities reporting ZIKV reduced the time to emergence when those municipalities were located, on average, less than 100 km away. The different modelling approaches help distinguish between large temporal scale factors driving vector habitat suitability and short temporal scale factors affecting the speed of spread. We find evidence for inter-municipal movements of infected people as a local-scale driver of spatial spread. The negative association with poverty suggests reduced case reporting in poorer areas. Overall, relatively simplistic models may be able to predict the vulnerability of populations to autochthonous ZIKV transmission at the local scale.
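
    A hedged sketch of the two modelling approaches on invented municipality-level data: a logistic model for whether ZIKV was ever reported (statsmodels) and a Weibull AFT model for weeks to the first reported case (lifelines). All variable names and coefficients are illustrative, not the paper's covariates:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from lifelines import WeibullAFTFitter

rng = np.random.default_rng(3)
n = 400
df = pd.DataFrame({
    "precip": rng.normal(0, 1, n),    # standardized study-period precipitation
    "temp": rng.normal(0, 1, n),      # standardized mean temperature
    "poverty": rng.normal(0, 1, n),
})
risk = -0.8 * df["precip"] + 0.6 * df["temp"] - 0.5 * df["poverty"]
df["reported"] = (rng.random(n) < 1 / (1 + np.exp(-risk))).astype(int)
df["week"] = np.where(df["reported"] == 1, rng.integers(1, 52, n), 52)  # censored

# Sustained transmission: probability of reporting at least one case
logit = sm.Logit(df["reported"], sm.add_constant(df[["precip", "temp", "poverty"]]))
print(logit.fit(disp=0).params)

# Timing of emergence: AFT model for time to first reported case
aft = WeibullAFTFitter()
aft.fit(df[["precip", "temp", "poverty", "week", "reported"]],
        duration_col="week", event_col="reported")
print(aft.summary.head())
```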

  17. The two-parametric scaling and new temporal asymptotic of survival probability of diffusing particle in the medium with traps.

    PubMed

    Arkhincheev, V E

    2017-03-01

    The new asymptotic behavior of the survival probability of particles in a medium with absorbing traps in an electric field has been established in two ways: by using the scaling approach and by directly solving the diffusion equation in the field. It has been shown that at long times this drift mechanism leads to a new temporal behavior of the survival probability of particles in a medium with absorbing traps.

  18. The two-parametric scaling and new temporal asymptotic of survival probability of diffusing particle in the medium with traps

    NASA Astrophysics Data System (ADS)

    Arkhincheev, V. E.

    2017-03-01

    The new asymptotic behavior of the survival probability of particles in a medium with absorbing traps in an electric field has been established in two ways: by using the scaling approach and by directly solving the diffusion equation in the field. It has been shown that at long times this drift mechanism leads to a new temporal behavior of the survival probability of particles in a medium with absorbing traps.

  19. Quantum temporal probabilities in tunneling systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anastopoulos, Charis, E-mail: anastop@physics.upatras.gr; Savvidou, Ntina, E-mail: ksavvidou@physics.upatras.gr

    We study the temporal aspects of quantum tunneling as manifested in time-of-arrival experiments in which the detected particle tunnels through a potential barrier. In particular, we present a general method for constructing temporal probabilities in tunneling systems that (i) defines ‘classical’ time observables for quantum systems and (ii) applies to relativistic particles interacting through quantum fields. We show that the relevant probabilities are defined in terms of specific correlation functions of the quantum field associated with tunneling particles. We construct a probability distribution with respect to the time of particle detection that contains all information about the temporal aspects of the tunneling process. In specific cases, this probability distribution leads to the definition of a delay time that, for parity-symmetric potentials, reduces to the phase time of Bohm and Wigner. We apply our results to piecewise constant potentials, by deriving the appropriate junction conditions on the points of discontinuity. For the double square potential, in particular, we demonstrate the existence of (at least) two physically relevant time parameters, the delay time and a decay rate that describes the escape of particles trapped in the inter-barrier region. Finally, we propose a resolution to the paradox of apparent superluminal velocities for tunneling particles. We demonstrate that the idea of faster-than-light speeds in tunneling follows from an inadmissible use of classical reasoning in the description of quantum systems. -- Highlights: •Present a general methodology for deriving temporal probabilities in tunneling systems. •Treatment applies to relativistic particles interacting through quantum fields. •Derive a new expression for tunneling time. •Identify new time parameters relevant to tunneling. •Propose a resolution of the superluminality paradox in tunneling.

  20. Responses of pond-breeding amphibians to wildfire: Short-term patterns in occupancy and colonization

    USGS Publications Warehouse

    Hossack, B.R.; Corn, P.S.

    2007-01-01

    Wildland fires are expected to become more frequent and severe in many ecosystems, potentially posing a threat to many sensitive species. We evaluated the effects of a large, stand-replacement wildfire on three species of pond-breeding amphibians by estimating changes in occupancy of breeding sites during the three years before and after the fire burned 42 of 83 previously surveyed wetlands. Annual occupancy and colonization for each species were estimated using recently developed models that incorporate detection probabilities to provide unbiased parameter estimates. We did not find negative effects of the fire on the occupancy or colonization rates of the long-toed salamander (Ambystoma macrodactylum). Instead, its occupancy was higher across the study area after the fire, possibly in response to a large snowpack that may have facilitated colonization of unoccupied wetlands. Naïve data (uncorrected for detection probability) for the Columbia spotted frog (Rana luteiventris) initially led to the conclusion of increased occupancy and colonization in wetlands that burned. After accounting for temporal and spatial variation in detection probabilities, however, it was evident that these parameters were relatively stable in both areas before and after the fire. We found a similar discrepancy between naïve and estimated occupancy of A. macrodactylum that resulted from different detection probabilities in burned and control wetlands. The boreal toad (Bufo boreas) was not found breeding in the area prior to the fire but colonized several wetlands the year after they burned. Occupancy by B. boreas then declined during years 2 and 3 following the fire. Our study suggests that the amphibian populations we studied are resistant to wildfire and that B. boreas may experience short-term benefits from wildfire. Our data also illustrate how naïve presence–non-detection data can provide misleading results.

  1. A capture-recapture model of amphidromous fish dispersal

    USGS Publications Warehouse

    Smith, W.; Kwak, Thomas J.

    2014-01-01

    The scale of adult movement was quantified for two tropical Caribbean diadromous fishes, bigmouth sleeper Gobiomorus dormitor and mountain mullet Agonostomus monticola, using passive integrated transponders (PITs) and radio-telemetry. Large numbers of fishes were tagged in Rio Mameyes, Puerto Rico, U.S.A., with PITs and monitored at three fixed locations over a 2-5 year period to estimate transition probabilities between upper and lower elevations and survival probabilities with a multistate Cormack-Jolly-Seber model. A subset of fishes was tagged with radio-transmitters and tracked at weekly intervals to estimate fine-scale dispersal. Changes in spatial and temporal distributions of tagged fishes indicated that neither G. dormitor nor A. monticola moved into the lowest, estuarine reaches of Rio Mameyes during two consecutive reproductive periods, thus demonstrating that both species follow an amphidromous, rather than catadromous, migratory strategy. Further, both species were relatively sedentary, with restricted linear ranges. While substantial dispersal of these species occurs at the larval stage during recruitment to fresh water, the results indicate minimal dispersal in spawning adults. Successful conservation of diadromous fauna on tropical islands requires management at both broad basin and localized spatial scales.

  2. Spatial and temporal variability in rates of landsliding in seismically active mountain ranges

    NASA Astrophysics Data System (ADS)

    Parker, R.; Petley, D.; Rosser, N.; Densmore, A.; Gunasekera, R.; Brain, M.

    2012-04-01

    Where earthquake and precipitation driven disasters occur in steep, mountainous regions, landslides often account for a large proportion of the associated damage and losses. This research addresses spatial and temporal variability in rates of landslide occurrence in seismically active mountain ranges as a step towards developing better regional scale prediction of losses in such events. In the first part of this paper we attempt to explain reductively the variability in spatial rates of landslide occurrence, using data from five major earthquakes. This is achieved by fitting a regression-based conditional probability model to spatial probabilities of landslide occurrence, using as predictor variables proxies for spatial patterns of seismic ground motion and modelled hillslope stability. A combined model for all earthquakes performs well in hindcasting spatial probabilities of landslide occurrence as a function of readily-attainable spatial variables. We present validation of the model and demonstrate the extent to which it may be applied globally to derive landslide probabilities for future earthquakes. In part two we examine the temporal behaviour of rates of landslide occurrence. This is achieved through numerical modelling to simulate the behaviour of a hypothetical landscape. The model landscape is composed of hillslopes that continually weaken, fail and reset in response to temporally-discrete forcing events that represent earthquakes. Hillslopes with different geometries require different amounts of weakening to fail, such that they fail and reset at different temporal rates. Our results suggest that probabilities of landslide occurrence are not temporally constant, but rather vary with time, irrespective of changes in forcing event magnitudes or environmental conditions. Various parameters influencing the magnitude and temporal patterns of this variability are identified, highlighting areas where future research is needed. This model has important implications for landslide hazard and risk analysis in mountain areas as existing techniques usually assume that susceptibility to failure does not change with time.

  3. ERMiT: Estimating Post-Fire Erosion in Probabilistic Terms

    NASA Astrophysics Data System (ADS)

    Pierson, F. B.; Robichaud, P. R.; Elliot, W. J.; Hall, D. E.; Moffet, C. A.

    2006-12-01

    Mitigating the impacts of post-wildfire runoff and erosion on life, property, and natural resources has cost the United States government tens of millions of dollars over the past decade. The decision of where, when, and how to apply the most effective mitigation treatments requires land managers to assess the risk of damaging runoff and erosion events occurring after a fire. The Erosion Risk Management Tool (ERMiT) is a web-based application that estimates erosion in probabilistic terms on burned and recovering forest, range, and chaparral lands. Unlike most erosion prediction models, ERMiT does not provide 'average annual erosion rates'; rather, it provides a distribution of erosion rates with the likelihood of their occurrence. ERMiT combines rain event variability with spatial and temporal variability of hillslope burn severity, soil properties, and ground cover to estimate Water Erosion Prediction Project (WEPP) model input parameter values. Based on 20 to 40 individual WEPP runs, ERMiT produces a distribution of rain event erosion rates with a probability of occurrence for each of five post-fire years. Over the 5 years of modeled recovery, the occurrence probability of the less erodible soil parameters is increased and the occurrence probability of the more erodible soil parameters is decreased. In addition, the occurrence probabilities and the four spatial arrangements of burn severity (arrangements of overland flow elements, OFEs) are shifted toward lower burn severity with each year of recovery. These yearly adjustments are based on field measurements made through post-fire recovery periods. ERMiT also provides rain event erosion rate distributions for hillslopes that have been treated with seeding, straw mulch, straw wattles and contour-felled log erosion barriers. Such output can help managers make erosion mitigation treatment decisions based on the probability of high sediment yields occurring, the value of resources at risk of damage, cost, and other management considerations.
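
    The probabilistic output format described above, a set of event erosion rates each carrying an occurrence probability, converts directly into an exceedance curve. A minimal sketch with invented rates and probabilities (not ERMiT output):

```python
import numpy as np

rates = np.array([0.1, 0.5, 2.0, 8.0, 20.0])       # hypothetical event erosion rates (Mg/ha)
probs = np.array([0.50, 0.25, 0.15, 0.07, 0.03])   # occurrence probabilities (sum to 1)

order = np.argsort(rates)[::-1]                    # largest rates first
exceedance = np.cumsum(probs[order])               # P(erosion >= rate)
for r, p in zip(rates[order], exceedance):
    print(f"P(erosion >= {r:5.1f} Mg/ha) = {p:.2f}")
```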

  4. Merging Satellite Precipitation Products for Improved Streamflow Simulations

    NASA Astrophysics Data System (ADS)

    Maggioni, V.; Massari, C.; Barbetta, S.; Camici, S.; Brocca, L.

    2017-12-01

    Accurate quantitative precipitation estimation is of great importance for water resources management, agricultural planning, and the forecasting and monitoring of natural hazards such as flash floods and landslides. In situ observations are limited around the Earth, especially in remote areas (e.g., complex terrain, dense vegetation), but currently available satellite precipitation products are able to provide global precipitation estimates with an accuracy that depends upon many factors (e.g., type of storms, temporal sampling, season, etc.). The recent SM2RAIN approach proposes to estimate rainfall by using satellite soil moisture observations. As opposed to traditional satellite precipitation methods, which sense cloud properties to retrieve instantaneous estimates, this bottom-up approach uses two consecutive soil moisture measurements to estimate the precipitation fallen within the interval between two satellite overpasses. As a result, the nature of the measurement is different from and complementary to that of classical precipitation products and could provide a valid perspective to substitute or improve current rainfall estimates. Therefore, we propose to merge SM2RAIN and the widely used TMPA 3B42RT product across Italy for a 6-year period (2010-2015) at a daily/0.25° temporal/spatial scale. Two conceptually different merging techniques are compared to each other and evaluated in terms of different statistical metrics, including hit bias, threat score, false alarm rates, and missed rainfall volumes. The first is based on the maximization of the temporal correlation with a reference dataset, while the second is based on a Bayesian approach, which provides a probabilistic satellite precipitation estimate derived from the joint probability distribution of observations and satellite estimates. The merged precipitation products perform better than the parent satellite-based products in terms of categorical statistics, as well as bias reduction and correlation coefficient, with the Bayesian approach being superior to the other methods. A case study in the Tiber river basin is also presented to discuss the performance of forcing a hydrological model with the merged satellite precipitation product to simulate streamflow time series.
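
    A minimal Gaussian sketch of the Bayesian merging idea, not the paper's actual joint-distribution method: if two independent rainfall estimates come with known error variances, the posterior is a precision-weighted combination. All numbers are illustrative:

```python
def merge_gaussian(x_sm2rain, var_sm2rain, x_tmpa, var_tmpa):
    """Posterior mean and variance of the true rain rate under Gaussian errors."""
    w1, w2 = 1.0 / var_sm2rain, 1.0 / var_tmpa
    mean = (w1 * x_sm2rain + w2 * x_tmpa) / (w1 + w2)
    return mean, 1.0 / (w1 + w2)

# e.g., SM2RAIN-type estimate 6 mm/day (error var 4), 3B42RT-type 10 mm/day (var 9)
print(merge_gaussian(6.0, 4.0, 10.0, 9.0))  # -> (~7.23, ~2.77)
```

    The more confident (lower-variance) estimate dominates the merged value, which is why a soil-moisture-based product can usefully correct a cloud-based one where the latter is noisy.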

  5. Evidence for a global seismic-moment release sequence

    USGS Publications Warehouse

    Bufe, C.G.; Perkins, D.M.

    2005-01-01

    Temporal clustering of the larger earthquakes (foreshock-mainshock-aftershock) followed by relative quiescence (stress shadow) are characteristic of seismic cycles along plate boundaries. A global seismic-moment release history, based on a little more than 100 years of instrumental earthquake data in an extended version of the catalog of Pacheco and Sykes (1992), illustrates similar behavior for Earth as a whole. Although the largest earthquakes have occurred in the circum-Pacific region, an analysis of moment release in the hemisphere antipodal to the Pacific plate shows a very similar pattern. Monte Carlo simulations confirm that the global temporal clustering of great shallow earthquakes during 1952-1964 at M ≥ 9.0 is highly significant (4% random probability), as is the clustering of the events of M ≥ 8.6 (0.2% random probability) during 1950-1965. We have extended the Pacheco and Sykes (1992) catalog from 1989 through 2001 using Harvard moment centroid data. Immediately after the 1950-1965 cluster, significant quiescence at and above M 8.4 begins and continues until 2001 (0.5% random probability). In alternative catalogs derived by correcting for possible random errors in magnitude estimates in the extended Pacheco-Sykes catalog, the clustering of M ≥ 9 persists at a significant level. These observations indicate that, for great earthquakes, Earth behaves as a coherent seismotectonic system. A very-large-scale mechanism for global earthquake triggering and/or stress transfer is implied. There are several candidates, but so far only viscoelastic relaxation has been modeled on a global scale.
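
    A sketch of the Monte Carlo significance test described above: scatter the same number of great earthquakes uniformly over the catalog span and count how often any window of the observed width captures as many events as were observed. The event counts below are illustrative placeholders, not the Pacheco-Sykes catalog:

```python
import numpy as np

rng = np.random.default_rng(4)
span_years = 102          # roughly 1900-2001
n_events = 7              # hypothetical number of M >= 8.6 events in the catalog
observed_in_window = 5    # hypothetical count in a 1950-1965-style window
window = 16               # window width in years

def max_window_count(times, window):
    """Largest number of events falling inside any sliding window of given width."""
    times = np.sort(times)
    j = np.searchsorted(times, times + window, side="right")
    return (j - np.arange(times.size)).max()

trials = 100_000
hits = sum(
    max_window_count(rng.uniform(0, span_years, n_events), window) >= observed_in_window
    for _ in range(trials)
)
print(f"random probability ~ {hits / trials:.3%}")
```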

  6. Probability Distribution Extraction from TEC Estimates based on Kernel Density Estimation

    NASA Astrophysics Data System (ADS)

    Demir, Uygar; Toker, Cenk; Çenet, Duygu

    2016-07-01

    Statistical analysis of the ionosphere, specifically the Total Electron Content (TEC), may reveal important information about its temporal and spatial characteristics. One of the core metrics that express the statistical properties of a stochastic process is its Probability Density Function (pdf). Furthermore, statistical parameters such as mean, variance and kurtosis, which can be derived from the pdf, may provide information about the spatial uniformity or clustering of the electron content. For example, the variance differentiates between a quiet ionosphere and a disturbed one, whereas kurtosis differentiates between a geomagnetic storm and an earthquake. Therefore, valuable information about the state of the ionosphere (and the natural phenomena that cause the disturbance) can be obtained by looking at the statistical parameters. In the literature, there are publications which try to fit the histogram of TEC estimates to some well-known pdfs such as Gaussian, Exponential, etc. However, constraining a histogram to fit a function with a fixed shape will increase estimation error, and all the information extracted from such a pdf will continue to contain this error. In such techniques, it is highly likely to observe some artificial characteristics in the estimated pdf which are not present in the original data. In the present study, we use the Kernel Density Estimation (KDE) technique to estimate the pdf of the TEC. KDE is a non-parametric approach which does not impose a specific form on the TEC. As a result, better pdf estimates that almost perfectly fit the observed TEC values can be obtained as compared to the techniques mentioned above. KDE is particularly good at representing the tail probabilities and outliers. We also calculate the mean, variance and kurtosis of the measured TEC values. The technique is applied to the ionosphere over Turkey, where the TEC values are estimated from the GNSS measurements from the TNPGN-Active (Turkish National Permanent GNSS Network) network. This study is supported by TUBITAK 115E915 and joint TUBITAK 114E092 and AS CR14/001 projects.
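
    A minimal sketch of the KDE step with scipy, on synthetic stand-ins for GNSS-derived TEC values (the gamma-distributed samples and bandwidth rule are illustrative assumptions):

```python
import numpy as np
from scipy.stats import gaussian_kde, kurtosis

rng = np.random.default_rng(5)
tec = rng.gamma(shape=9.0, scale=2.5, size=5000)  # synthetic TECU values

kde = gaussian_kde(tec)                 # bandwidth chosen by Scott's rule by default
grid = np.linspace(tec.min(), tec.max(), 200)
pdf = kde(grid)                         # pdf estimate with no fixed functional form

# Summary statistics of the kind discussed above
print(tec.mean(), tec.var(), kurtosis(tec))
```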

  7. A screening-level modeling approach to estimate nitrogen ...

    EPA Pesticide Factsheets

    This paper presents a screening-level modeling approach that can be used to rapidly estimate nutrient loading and assess the risk of numerical nutrient standard exceedance in surface waters, leading to potential classification as impaired for designated use. It can also be used to explore best management practice (BMP) implementation to reduce loading. The modeling framework uses a hybrid statistical and process-based approach to estimate sources of pollutants and their transport and decay in the terrestrial and aquatic parts of watersheds. The framework is developed in the ArcGIS environment and is based on the total maximum daily load (TMDL) balance model. Nitrogen (N) is currently addressed in the framework, referred to as WQM-TMDL-N. Loading for each catchment includes non-point sources (NPS) and point sources (PS). NPS loading is estimated using export coefficient or event mean concentration methods, depending on the temporal scale, i.e., annual or daily. Loading from atmospheric deposition is also included. The probability of a nutrient load exceeding a target load is evaluated using probabilistic risk assessment, by including the uncertainty associated with export coefficients of various land uses. The computed risk data can be visualized as spatial maps which show the load exceedance probability for all stream segments. In an application of this modeling approach to the Tippecanoe River watershed in Indiana, USA, total nitrogen (TN) loading and risk of standard exce

  8. Monte Carlo role in radiobiological modelling of radiotherapy outcomes

    NASA Astrophysics Data System (ADS)

    El Naqa, Issam; Pater, Piotr; Seuntjens, Jan

    2012-06-01

    Radiobiological models are essential components of modern radiotherapy. They are increasingly applied to optimize and evaluate the quality of different treatment planning modalities. They are frequently used in designing new radiotherapy clinical trials by estimating the expected therapeutic ratio of new protocols. In radiobiology, the therapeutic ratio is estimated as the expected gain in tumour control probability (TCP) relative to the risk of normal tissue complication probability (NTCP). However, estimates of TCP/NTCP are currently based on the deterministic and simplistic linear-quadratic formalism, with limited predictive power when applied prospectively. Given the complex and stochastic nature of the physical, chemical and biological interactions associated with spatial and temporal radiation-induced effects in living tissues, it is conjectured that methods based on Monte Carlo (MC) analysis may provide better estimates of TCP/NTCP for radiotherapy treatment planning and trial design. Indeed, over the past few decades, methods based on MC have demonstrated superior performance for accurate simulation of radiation transport, tumour growth and particle track structures; however, successful application of modelling radiobiological response and outcomes in radiotherapy is still hampered by several challenges. In this review, we provide an overview of some of the main techniques used in radiobiological modelling for radiotherapy, with a focus on the role of MC as a promising computational vehicle. We highlight the current challenges, issues and future potential of the MC approach towards a comprehensive systems-based framework in radiobiological modelling for radiotherapy.
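
    For concreteness, a minimal sketch of the deterministic Poisson/linear-quadratic TCP calculation that the review contrasts with MC approaches; all parameter values are illustrative, not recommended clinical numbers:

```python
import numpy as np

def tcp_poisson(n_fractions, dose_per_fraction, alpha, beta, clonogens):
    """Poisson TCP = exp(-N * SF), with LQ cell survival SF = exp(-n(alpha*d + beta*d^2))."""
    sf = np.exp(-n_fractions * (alpha * dose_per_fraction
                                + beta * dose_per_fraction**2))
    return np.exp(-clonogens * sf)

# 30 fractions of 2 Gy, alpha = 0.3 /Gy, alpha/beta = 10 Gy, 1e7 clonogens (hypothetical)
print(tcp_poisson(30, 2.0, alpha=0.3, beta=0.03, clonogens=1e7))
```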

  9. On land-use modeling: A treatise of satellite imagery data and misclassification error

    NASA Astrophysics Data System (ADS)

    Sandler, Austin M.

    Recent availability of satellite-based land-use data sets, including data sets with contiguous spatial coverage over large areas, relatively long temporal coverage, and fine-scale land cover classifications, is providing new opportunities for land-use research. However, care must be used when working with these datasets due to misclassification error, which causes inconsistent parameter estimates in the discrete choice models typically used to model land-use. I therefore adapt the empirical correction methods developed for other contexts (e.g., epidemiology) so that they can be applied to land-use modeling. I then use a Monte Carlo simulation, and an empirical application using actual satellite imagery data from the Northern Great Plains, to compare the results of a traditional model ignoring misclassification to those from models accounting for misclassification. Results from both the simulation and application indicate that ignoring misclassification will lead to biased results. Even seemingly insignificant levels of misclassification error (e.g., 1%) result in biased parameter estimates, which alter marginal effects enough to affect policy inference. At the levels of misclassification typical in current satellite imagery datasets (e.g., as high as 35%), ignoring misclassification can lead to systematically erroneous land-use probabilities and substantially biased marginal effects. The correction methods I propose, however, generate consistent parameter estimates and therefore consistent estimates of marginal effects and predicted land-use probabilities.
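
    A small sketch of the correction idea in its simplest form: if a confusion matrix M, with M[i, j] = P(observed class j | true class i), is known from a validation sample, observed land-cover proportions can be corrected by solving p_obs = Mᵀ p_true. The matrix and proportions below are invented; the author's econometric correction for discrete choice models is more involved:

```python
import numpy as np

M = np.array([[0.90, 0.08, 0.02],      # true cropland -> observed classes
              [0.10, 0.85, 0.05],      # true grassland
              [0.03, 0.07, 0.90]])     # true forest

p_obs = np.array([0.40, 0.35, 0.25])   # class proportions in the satellite map
p_true = np.linalg.solve(M.T, p_obs)   # misclassification-corrected proportions
print(p_true, p_true.sum())            # still sums to 1, since rows of M sum to 1
```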

  10. Reinforcement Probability Modulates Temporal Memory Selection and Integration Processes

    PubMed Central

    Matell, Matthew S.; Kurti, Allison N.

    2013-01-01

    We have previously shown that rats trained in a mixed-interval peak procedure (tone = 4s, light = 12s) respond in a scalar manner at a time in between the trained peak times when presented with the stimulus compound (Swanton & Matell, 2011). In our previous work, the two component cues were reinforced with different probabilities (short = 20%, long = 80%) to equate response rates, and we found that the compound peak time was biased toward the cue with the higher reinforcement probability. Here, we examined the influence that different reinforcement probabilities have on the temporal location and shape of the compound response function. We found that the time of peak responding shifted as a function of the relative reinforcement probability of the component cues, becoming earlier as the relative likelihood of reinforcement associated with the short cue increased. However, as the relative probabilities of the component cues grew dissimilar, the compound peak became non-scalar, suggesting that the temporal control of behavior shifted from a process of integration to one of selection. As our previous work has utilized durations and reinforcement probabilities more discrepant than those used here, these data suggest that the processes underlying the integration/selection decision for time are based on cue value. PMID:23896560

  11. HIV-1 disease progression during highly active antiretroviral therapy: an application using population-level data in British Columbia: 1996-2011.

    PubMed

    Nosyk, Bohdan; Min, Jeong; Lima, Viviane D; Yip, Benita; Hogg, Robert S; Montaner, Julio S G

    2013-08-15

    Accurately estimating rates of disease progression is of central importance in developing mathematical models used to project outcomes and guide resource allocation decisions. Our objective was to specify a multivariate regression model to estimate changes in disease progression among individuals on highly active antiretroviral treatment in British Columbia, Canada, 1996-2011. We used population-level data on disease progression and antiretroviral treatment utilization from the BC HIV Drug Treatment Program. Disease progression was captured using longitudinal CD4 and plasma viral load testing data, linked with data on antiretroviral treatment. The study outcome was categorized into five states (CD4 count ≥500, 350-500, 200-350, <200 cells/mm³, and mortality). A 5-state continuous-time Markov model was used to estimate covariate-specific probabilities of CD4 progression, focusing on temporal changes during the study period. A total of 210,083 CD4 measurements among 7421 individuals with HIV/AIDS were included in the study. Results of the multivariate model suggested that current highly active antiretroviral treatment at baseline, lower baseline CD4 (<200 cells/mm³), and extended durations of elevated plasma viral load were each associated with accelerated progression. Immunological improvement accelerated significantly from 2004 onward, with 23% and 46% increases in the probability of CD4 improvement from the fourth CD4 stratum (CD4 < 200) in 2004-2008 and 2008-2011, respectively. Our results demonstrate the impact of innovations in antiretroviral treatment and treatment delivery at the population level. These results can be used to estimate a transition probability matrix flexible to changes in the observed mix of clients in different clinical stages and treatment regimens over time.
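
    In a continuous-time Markov model of this kind, interval transition probabilities follow from the generator matrix Q via the matrix exponential, P(t) = expm(Qt). A sketch with an invented generator (rows sum to zero; rates are not the study's estimates):

```python
import numpy as np
from scipy.linalg import expm

# States: CD4 >=500, 350-500, 200-350, <200, death (absorbing)
Q = np.array([
    [-0.30,  0.30,  0.00,  0.00,  0.00],
    [ 0.20, -0.45,  0.25,  0.00,  0.00],
    [ 0.00,  0.25, -0.50,  0.20,  0.05],
    [ 0.00,  0.00,  0.30, -0.45,  0.15],
    [ 0.00,  0.00,  0.00,  0.00,  0.00],
])

P_1yr = expm(Q * 1.0)   # one-year transition probability matrix
print(P_1yr.round(3))   # e.g., P_1yr[3, 2]: probability of CD4 improvement from <200
```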

  12. Semi-quantitative assessment of the physical vulnerability of buildings for the landslide risk analysis. A case study in the Loures municipality, Lisbon district, Portugal

    NASA Astrophysics Data System (ADS)

    Guillard-Gonçalves, Clémence; Zêzere, José Luis; Pereira, Susana; Garcia, Ricardo

    2016-04-01

    The physical vulnerability of the buildings of Loures (a Portuguese municipality) to landslides was assessed, and the landslide risk was computed as the product of the landslide hazard, the vulnerability, and the market economic value of the buildings. First, the hazard was assessed by combining the spatio-temporal probability and the frequency-magnitude relationship of the landslides, which was established by plotting the probability of landslide area. The susceptibility of deep-seated and shallow landslides was assessed by a bi-variate statistical method and was mapped. The annual and multiannual spatio-temporal probabilities were estimated, providing a landslide hazard model. Then, an assessment of building vulnerability to landslides, based on an inquiry of a pool of European landslide experts, was developed and applied to the study area. The inquiry was based on nine magnitude scenarios and four structural building types. A sub-pool of the landslide experts who know the study area was extracted from the pool, and the variability of the answers coming from the pool and the sub-pool was assessed with the standard deviation. Moreover, the average vulnerability of the basic geographic entities was compared by changing the map unit and applying the vulnerability to all the buildings of a test site (included in the study area), the inventory of which was compiled in the field. Next, the market economic value of the buildings was calculated using an adaptation of the Portuguese Tax Services approach. Finally, the annual and multiannual landslide risk was computed for the nine landslide magnitude scenarios and different spatio-temporal probabilities by multiplying the potential loss (Vulnerability × Economic Value) by the hazard probability. As a rule, the vulnerability values given by the sub-pool of experts who know the study area are higher than those given by the European experts, namely for the high-magnitude landslides. The obtained vulnerabilities vary from 0.2 to 1 as a function of the structural building type and the landslide magnitude, and are maximal for landslide depths of 10 and 20 meters. However, the highest annual risk was found for the 3 m deep landslides, with a maximum value of 25.68 € per 5 m pixel, which is explained by the combination of a relatively high frequency in the Loures municipality with a substantial potential damage.

  13. Time-course of germination, initiation of mycelium proliferation and probability of visible growth and detectable AFB1 production of an isolate of Aspergillus flavus on pistachio extract agar.

    PubMed

    Aldars-García, Laila; Sanchis, Vicente; Ramos, Antonio J; Marín, Sonia

    2017-06-01

    The aim of this work was to assess the temporal relationship among quantified germination, mycelial growth and aflatoxin B1 (AFB1) production in colonies arising from single spores, in order to find the best way to predict as accurately as possible the presence of AFB1 at the early stages of contamination. Germination, mycelial growth, probability of growth and probability of AFB1 production of an isolate of Aspergillus flavus were determined at 25 °C and two water activities (0.85 and 0.87) on 3% Pistachio Extract Agar (PEA). The percentage of germinated spores versus time was fitted to the modified Gompertz equation to estimate the germination parameters (geometrical germination time and germination rate). The radial growth curve for each colony was fitted to a linear model to estimate the apparent lag time for growth and the growth rate; in addition, the time to visible growth was estimated. Binary data obtained from the growth and AFB1 studies were modeled using logistic regression analysis. Both water activities led to similar fungal growth and AFB1 production. In this study, given the suboptimal set conditions, it was observed that germination is a stage far removed from the AFB1 production process. Once the probability of growth started to increase, it took 6 days to produce AFB1, and when the probability of growth was 100%, only a 40-57% probability of detection of AFB1 production was predicted. Moreover, colony sizes with a radius of 1-2 mm could be a helpful indicator of possible AFB1 contamination in the commodity. Although growth models may overestimate the presence of AFB1, their use would be a helpful tool for producers and manufacturers; from our data, 5% probability of AFB1 production (initiation of production) would occur when a minimum of 60% probability of growth is observed. Legal restrictions are quite severe for these toxins, thus their control from the early stages of contamination throughout the food chain is of paramount importance. Copyright © 2016 Elsevier Ltd. All rights reserved.
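
    A sketch of the germination-curve fit, assuming the Zwietering parameterization of the modified Gompertz equation (A: asymptote, mu: maximum rate, lambda: lag/geometrical germination time). The data points are synthetic, not the study's measurements:

```python
import numpy as np
from scipy.optimize import curve_fit

def gompertz(t, A, mu, lam):
    """Modified Gompertz: A exp(-exp(mu*e/A*(lam - t) + 1))."""
    return A * np.exp(-np.exp(mu * np.e / A * (lam - t) + 1.0))

t = np.arange(0, 15.0)                                        # days
y = gompertz(t, 95.0, 20.0, 3.0) \
    + np.random.default_rng(6).normal(0, 2, t.size)           # noisy % germination

(A, mu, lam), _ = curve_fit(gompertz, t, y, p0=[90.0, 10.0, 2.0])
print(f"asymptote={A:.1f}%, rate={mu:.1f} %/day, lag={lam:.2f} days")
```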

  14. Predicted sequence of cortical tau and amyloid-β deposition in Alzheimer disease spectrum.

    PubMed

    Cho, Hanna; Lee, Hye Sun; Choi, Jae Yong; Lee, Jae Hoon; Ryu, Young Hoon; Lee, Myung Sik; Lyoo, Chul Hyoung

    2018-04-17

    We investigated the sequential order between tau and amyloid-β (Aβ) deposition in the Alzheimer disease spectrum using a conditional probability method. Two hundred twenty participants underwent 18F-flortaucipir and 18F-florbetaben positron emission tomography scans and neuropsychological tests. The presence of tau and Aβ in each region and impairment in each cognitive domain were determined by Z-score cutoffs. By comparing pairs of conditional probabilities, the sequential order of tau and Aβ deposition was determined. The probability for the presence of tau in the entorhinal cortex was higher than that of Aβ in all cortical regions, and in the medial temporal cortices, the probability for the presence of tau was higher than that of Aβ. Conversely, in the remaining neocortex above the inferior temporal cortex, the probability for the presence of Aβ was always higher than that of tau. Tau pathology in the entorhinal cortex may appear earlier than neocortical Aβ and may spread in the absence of Aβ within the neighboring medial temporal regions. However, Aβ may be required for massive tau deposition in the distant cortical areas. Copyright © 2018 Elsevier Inc. All rights reserved.
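
    The conditional-probability logic can be illustrated in a few lines: if nearly every Aβ-positive case is also tau-positive but not vice versa, tau is inferred to appear first. The binary data below are invented to make that asymmetry explicit:

```python
import numpy as np

rng = np.random.default_rng(7)
tau = rng.random(220) < 0.5               # tau-positive (e.g., entorhinal cortex)
abeta = tau & (rng.random(220) < 0.6)     # Abeta constructed to occur only given tau

p_tau_given_abeta = tau[abeta].mean()     # ~1.0 -> tau precedes Abeta
p_abeta_given_tau = abeta[tau].mean()     # < 1.0
print(p_tau_given_abeta, p_abeta_given_tau)
```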

  15. Under-sampling trajectory design for compressed sensing based DCE-MRI.

    PubMed

    Liu, Duan-duan; Liang, Dong; Zhang, Na; Liu, Xin; Zhang, Yuan-ting

    2013-01-01

    Dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) needs high temporal and spatial resolution to accurately estimate quantitative parameters and characterize tumor vasculature. Compressed Sensing (CS) has the potential to deliver both. However, the randomness in a CS under-sampling trajectory designed using the traditional variable density (VD) scheme may translate into uncertainty in kinetic parameter estimation when high reduction factors are used. Therefore, accurate parameter estimation using the VD scheme usually needs multiple adjustments of the parameters of the Probability Density Function (PDF), and multiple reconstructions even with a fixed PDF, which is impractical for DCE-MRI. In this paper, an under-sampling trajectory design which is robust to changes in the PDF parameters and to randomness with a fixed PDF is studied. The strategy is to adaptively segment k-space into low- and high-frequency domains, and only apply the VD scheme in the high-frequency domain. Simulation results demonstrate high accuracy and robustness compared to the VD design.
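
    A sketch of the strategy described above, under illustrative assumptions about sizes and the density function: a low-frequency core of k-space is sampled deterministically, and variable-density random selection is applied only to the high frequencies:

```python
import numpy as np

n_lines, core, accel = 256, 32, 4            # phase encodes, fixed core, reduction factor
rng = np.random.default_rng(8)

k = np.arange(n_lines) - n_lines // 2        # k-space line indices
pdf = 1.0 / (1.0 + np.abs(k)) ** 1.5         # sampling density decays with |k| (assumed)
pdf[np.abs(k) < core // 2] = 0.0             # core is handled deterministically

n_random = n_lines // accel - core           # budget left for random high-frequency lines
pdf /= pdf.sum()
chosen = rng.choice(n_lines, size=n_random, replace=False, p=pdf)

mask = np.zeros(n_lines, dtype=bool)
mask[np.abs(k) < core // 2] = True           # fully sampled low-frequency block
mask[chosen] = True
print(mask.sum(), "of", n_lines, "lines sampled")
```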

  16. Markov Chain Monte Carlo estimation of species distributions: a case study of the swift fox in western Kansas

    USGS Publications Warehouse

    Sargeant, Glen A.; Sovada, Marsha A.; Slivinski, Christiane C.; Johnson, Douglas H.

    2005-01-01

    Accurate maps of species distributions are essential tools for wildlife research and conservation. Unfortunately, biologists often are forced to rely on maps derived from observed occurrences recorded opportunistically during observation periods of variable length. Spurious inferences are likely to result because such maps are profoundly affected by the duration and intensity of observation and by methods used to delineate distributions, especially when detection is uncertain. We conducted a systematic survey of swift fox (Vulpes velox) distribution in western Kansas, USA, and used Markov chain Monte Carlo (MCMC) image restoration to rectify these problems. During 1997–1999, we searched 355 townships (ca. 93 km²) 1–3 times each for an average cost of $7,315 per year and achieved a detection rate (probability of detecting swift foxes, if present, during a single search) of 0.69 (95% Bayesian confidence interval [BCI] = [0.60, 0.77]). Our analysis produced an estimate of the underlying distribution, rather than a map of observed occurrences, that reflected the uncertainty associated with estimates of model parameters. To evaluate our results, we analyzed simulated data with similar properties. Results of our simulations suggest negligible bias and good precision when probabilities of detection on ≥1 survey occasions (cumulative probabilities of detection) exceed 0.65. Although the use of MCMC image restoration has been limited by theoretical and computational complexities, alternatives do not possess the same advantages. Image models accommodate uncertain detection, do not require spatially independent data or a census of map units, and can be used to estimate species distributions directly from observations without relying on habitat covariates or parameters that must be estimated subjectively. These features facilitate economical surveys of large regions, the detection of temporal trends in distribution, and assessments of landscape-level relations between species and habitats. Requirements for the use of MCMC image restoration include study areas that can be partitioned into regular grids of mapping units, spatially contagious species distributions, reliable methods for identifying target species, and cumulative probabilities of detection ≥0.65.

  17. Markov chain Monte Carlo estimation of species distributions: A case study of the swift fox in western Kansas

    USGS Publications Warehouse

    Sargeant, G.A.; Sovada, M.A.; Slivinski, C.C.; Johnson, D.H.

    2005-01-01

    Accurate maps of species distributions are essential tools for wildlife research and conservation. Unfortunately, biologists often are forced to rely on maps derived from observed occurrences recorded opportunistically during observation periods of variable length. Spurious inferences are likely to result because such maps are profoundly affected by the duration and intensity of observation and by methods used to delineate distributions, especially when detection is uncertain. We conducted a systematic survey of swift fox (Vulpes velox) distribution in western Kansas, USA, and used Markov chain Monte Carlo (MCMC) image restoration to rectify these problems. During 1997-1999, we searched 355 townships (ca. 93 km²) 1-3 times each for an average cost of $7,315 per year and achieved a detection rate (probability of detecting swift foxes, if present, during a single search) of 0.69 (95% Bayesian confidence interval [BCI] = [0.60, 0.77]). Our analysis produced an estimate of the underlying distribution, rather than a map of observed occurrences, that reflected the uncertainty associated with estimates of model parameters. To evaluate our results, we analyzed simulated data with similar properties. Results of our simulations suggest negligible bias and good precision when probabilities of detection on ≥1 survey occasions (cumulative probabilities of detection) exceed 0.65. Although the use of MCMC image restoration has been limited by theoretical and computational complexities, alternatives do not possess the same advantages. Image models accommodate uncertain detection, do not require spatially independent data or a census of map units, and can be used to estimate species distributions directly from observations without relying on habitat covariates or parameters that must be estimated subjectively. These features facilitate economical surveys of large regions, the detection of temporal trends in distribution, and assessments of landscape-level relations between species and habitats. Requirements for the use of MCMC image restoration include study areas that can be partitioned into regular grids of mapping units, spatially contagious species distributions, reliable methods for identifying target species, and cumulative probabilities of detection ≥0.65.
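
    The design rule quoted in both records reduces to a small calculation: with per-search detection probability p, the cumulative probability of detection after k searches is 1 - (1 - p)^k, and the number of searches needed to reach the 0.65 guideline follows directly:

```python
import math

p = 0.69                                   # per-search detection probability (study value)
for k in (1, 2, 3):
    print(k, 1 - (1 - p) ** k)             # 0.69, 0.904, 0.970

k_needed = math.ceil(math.log(1 - 0.65) / math.log(1 - p))
print("searches needed for 0.65:", k_needed)   # 1 at p = 0.69
```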

  18. Higher temporal variability of forest breeding bird communities in fragmented landscapes

    USGS Publications Warehouse

    Boulinier, T.; Nichols, J.D.; Hines, J.E.; Sauer, J.R.; Flather, C.H.; Pollock, K.H.

    1998-01-01

    Understanding the relationship between animal community dynamics and landscape structure has become a priority for biodiversity conservation. In particular, predicting the effects of habitat destruction that confine species to networks of small patches is an important prerequisite to conservation plan development. Theoretical models that predict the occurrence of species in fragmented landscapes, and relationships between stability and diversity do exist. However, reliable empirical investigations of the dynamics of biodiversity have been prevented by differences in species detection probabilities among landscapes. Using long-term data sampled at a large spatial scale in conjunction with a capture-recapture approach, we developed estimates of parameters of community changes over a 22-year period for forest breeding birds in selected areas of the eastern United States. We show that forest fragmentation was associated not only with a reduced number of forest bird species, but also with increased temporal variability in the number of species. This higher temporal variability was associated with higher local extinction and turnover rates. These results have major conservation implications. Moreover, the approach used provides a practical tool for the study of the dynamics of biodiversity.

  19. How temporal cues can aid colour constancy

    PubMed Central

    Foster, David H.; Amano, Kinjiro; Nascimento, Sérgio M. C.

    2007-01-01

    Colour constancy assessed by asymmetric simultaneous colour matching usually reveals limited levels of performance in the unadapted eye. Yet observers can readily discriminate illuminant changes on a scene from changes in the spectral reflectances of the surfaces making up the scene. This ability is probably based on judgements of relational colour constancy, in turn based on the physical stability of spatial ratios of cone excitations under illuminant changes. Evidence is presented suggesting that the ability to detect violations in relational colour constancy depends on temporal transient cues. Because colour constancy and relational colour constancy are closely connected, it should be possible to improve estimates of colour constancy by introducing similar transient cues into the matching task. To test this hypothesis, an experiment was performed in which observers made surface-colour matches between patterns presented in the same position in an alternating sequence with period 2 s or, as a control, presented simultaneously, side-by-side. The degree of constancy was significantly higher for sequential presentation, reaching 87% for matches averaged over 20 observers. Temporal cues may offer a useful source of information for making colour-constancy judgements. PMID:17515948

  20. Analyzing the evolution of young people's brain cancer mortality in Spanish provinces.

    PubMed

    Ugarte, M D; Adin, A; Goicoa, T; López-Abente, G

    2015-06-01

    To analyze the spatio-temporal evolution of brain cancer relative mortality risks in the young population (under 20 years of age) in Spanish provinces during the period 1986-2010. A new and flexible conditional autoregressive spatio-temporal model with two levels of spatial aggregation was used. Brain cancer relative mortality risks in the young population in Spanish provinces decreased during the last years of the study period, although a clear increase was observed during the 1990s. The global geographical pattern emphasized a high relative mortality risk in Navarre and a low relative mortality risk in Madrid. Although there is a specific Autonomous Region-time interaction effect on the relative mortality risks, this effect is weak in the final estimates when compared to the global spatial and temporal effects. Differences in mortality between regions and over time may be caused by the increase in survival rates, differences in treatment, or the availability of diagnostic tools. The increase in relative risks observed in the 1990s was probably due to improved diagnostics with computerized axial tomography and magnetic resonance imaging techniques. Copyright © 2015 Elsevier Ltd. All rights reserved.

  1. Temporal-varying failures of nodes in networks

    NASA Astrophysics Data System (ADS)

    Knight, Georgie; Cristadoro, Giampaolo; Altmann, Eduardo G.

    2015-08-01

    We consider networks in which random walkers are removed because of the failure of specific nodes. We interpret the rate of loss as a measure of the importance of nodes, a notion we denote as failure centrality. We show that the degree of the node is not sufficient to determine this measure and that, in a first approximation, the shortest loops through the node have to be taken into account. We propose approximations of the failure centrality which are valid for temporal-varying failures, and we dwell on the possibility of externally changing the relative importance of nodes in a given network by exploiting the interference between the loops of a node and the cycles of the temporal pattern of failures. In the limit of long failure cycles we show analytically that the escape at a node is larger than that estimated from a stochastic failure with the same failure probability. We test our general formalism in two real-world networks (air-transportation and e-mail users) and show how communities lead to deviations from predictions for failures in hubs.

  2. Spatio-temporal variations in storm surges along the North Atlantic coasts

    NASA Astrophysics Data System (ADS)

    Marcos, Marta; Woodworth, Philip

    2017-04-01

    Extreme sea levels along the coasts of the North Atlantic Ocean and the Gulf of Mexico have been investigated using hourly tide gauge records compiled in the recently released GESLA-2 data set (www.gesla.org). These regions are among the most densely monitored coasts worldwide, with more than 300 high-frequency, quality-controlled tide gauge time series available. Here we estimate the storm surge component of extreme sea levels using both tidal residuals and skew surges, and we explore the spatial and temporal coherence of their intensities, durations and frequencies. We quantify the relationship of extremes with dominant large-scale climate patterns and discuss the impact of mean sea level changes. Finally, we test the assumption of stationarity of the probability of extreme occurrence and the extent to which it holds when mean sea level changes are considered in combination with storm surges.

  3. Ecological risk assessment of TBT in Ise Bay.

    PubMed

    Yamamoto, Joji; Yonezawa, Yoshitaka; Nakata, Kisaburo; Horiguchi, Fumio

    2009-02-01

    An ecological risk assessment of tributyltin (TBT) in Ise Bay was conducted using the margin of exposure (MOE) method. The assessment endpoint was defined to protect the survival, growth and reproduction of marine organisms. Sources of TBT in this study were assumed to be commercial vessels in harbors and navigation routes. Concentrations of TBT in Ise Bay were estimated using a three-dimensional hydrodynamic model, an ecosystem model and a chemical fate model. Estimated MOEs for marine organisms for 1990 and 2008 were approximately 0.1-2.0 and over 100 respectively, indicating a declining temporal trend in the probability of adverse effects. The chemical fate model predicts a much longer persistence of TBT in sediments than in the water column. Therefore, it is necessary to monitor the harmful effects of TBT on benthic organisms.

  4. Evaluation and comparison of statistical methods for early temporal detection of outbreaks: A simulation-based study

    PubMed Central

    Le Strat, Yann

    2017-01-01

    The objective of this paper is to evaluate a panel of statistical algorithms for temporal outbreak detection. Based on a large dataset of simulated weekly surveillance time series, we performed a systematic assessment of 21 statistical algorithms, 19 implemented in the R package surveillance and two other methods. We estimated false positive rate (FPR), probability of detection (POD), probability of detection during the first week, sensitivity, specificity, negative and positive predictive values and F1-measure for each detection method. Then, to identify the factors associated with these performance measures, we ran multivariate Poisson regression models adjusted for the characteristics of the simulated time series (trend, seasonality, dispersion, outbreak sizes, etc.). The FPR ranged from 0.7% to 59.9% and the POD from 43.3% to 88.7%. Some methods had a very high specificity, up to 99.4%, but a low sensitivity. Methods with a high sensitivity (up to 79.5%) had a low specificity. All methods had a high negative predictive value, over 94%, while positive predictive values ranged from 6.5% to 68.4%. Multivariate Poisson regression models showed that performance measures were strongly influenced by the characteristics of time series. Past or current outbreak size and duration strongly influenced detection performances. PMID:28715489
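
    A sketch of how the weekly performance measures used in this comparison are computed from alarm and outbreak indicator series; the two boolean series below are invented, and per-outbreak POD would additionally require grouping consecutive outbreak weeks:

```python
import numpy as np

rng = np.random.default_rng(9)
outbreak = rng.random(520) < 0.05          # 10 years of weekly truth (synthetic)
alarm = (outbreak & (rng.random(520) < 0.7)) | (rng.random(520) < 0.02)

tp = np.sum(alarm & outbreak)
fp = np.sum(alarm & ~outbreak)
fn = np.sum(~alarm & outbreak)
tn = np.sum(~alarm & ~outbreak)

fpr = fp / (fp + tn)                       # false positive rate
sens = tp / (tp + fn)                      # sensitivity (weekly detection)
ppv = tp / (tp + fp)                       # positive predictive value
f1 = 2 * ppv * sens / (ppv + sens)
print(f"FPR={fpr:.3f} sensitivity={sens:.3f} PPV={ppv:.3f} F1={f1:.3f}")
```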

  5. A Multi-Temporal Remote Sensing Approach to Freshwater Turtle Conservation

    NASA Astrophysics Data System (ADS)

    Mui, Amy B.

    Freshwater turtles are a globally declining group, and estimates of population status are not available for many species. Primary causes of decline stem from widespread habitat loss and degradation, and obtaining spatially-explicit information on remaining habitat across a relevant spatial scale has proven challenging. The discipline of remote sensing science has been employed widely in studies of biodiversity conservation, but it has not been utilized as frequently for cryptic and less vagile species such as turtles, despite their vulnerable status. The work presented in this thesis investigates how multi-temporal remote sensing imagery can contribute key information for building spatially-explicit and temporally dynamic models of habitat and connectivity for the threatened Blanding's turtle (Emydoidea blandingii) in southern Ontario, Canada. I began by outlining a methodological approach for delineating freshwater wetlands from high spatial resolution remote sensing imagery, using a geographic object-based image analysis (GEOBIA) approach. This method was applied to three different landscapes in southern Ontario, and across two biologically relevant seasons during the active (non-hibernating) period of Blanding's turtles. Next, relevant environmental variables associated with turtle presence were extracted from remote sensing imagery, and a boosted regression tree model was developed to predict the probability of occurrence of this species. Finally, I analysed the movement potential for Blanding's turtles in a disturbed landscape using a combination of approaches. Results indicate that (1) a parsimonious GEOBIA approach to land cover mapping, incorporating texture, spectral indices, and topographic information, can map heterogeneous land cover with high accuracy, (2) remote-sensing derived environmental variables can be used to build habitat models with strong predictive power, and (3) connectivity potential is best estimated using a variety of approaches, though obtaining accurate estimates across human-altered landscapes is challenging. Overall, this body of work supports the use of remote sensing imagery in species distribution models to strengthen the precision and power of predictive models, and also draws attention to the need to consider a multi-temporal examination of species habitat requirements.

  6. Bayesian switching factor analysis for estimating time-varying functional connectivity in fMRI.

    PubMed

    Taghia, Jalil; Ryali, Srikanth; Chen, Tianwen; Supekar, Kaustubh; Cai, Weidong; Menon, Vinod

    2017-07-15

    There is growing interest in understanding the dynamical properties of functional interactions between distributed brain regions. However, robust estimation of temporal dynamics from functional magnetic resonance imaging (fMRI) data remains challenging due to limitations in extant multivariate methods for modeling time-varying functional interactions between multiple brain areas. Here, we develop a Bayesian generative model for fMRI time-series within the framework of hidden Markov models (HMMs). The model is a dynamic variant of the static factor analysis model (Ghahramani and Beal, 2000). We refer to this model as Bayesian switching factor analysis (BSFA) as it integrates factor analysis into a generative HMM in a unified Bayesian framework. In BSFA, brain dynamic functional networks are represented by latent states which are learnt from the data. Crucially, BSFA is a generative model which estimates the temporal evolution of brain states and transition probabilities between states as a function of time. An attractive feature of BSFA is the automatic determination of the number of latent states via Bayesian model selection arising from penalization of excessively complex models. Key features of BSFA are validated using extensive simulations on carefully designed synthetic data. We further validate BSFA using fingerprint analysis of multisession resting-state fMRI data from the Human Connectome Project (HCP). Our results show that modeling temporal dependencies in the generative model of BSFA results in improved fingerprinting of individual participants. Finally, we apply BSFA to elucidate the dynamic functional organization of the salience, central-executive, and default mode networks, three core neurocognitive systems with a central role in cognitive and affective information processing (Menon, 2011). Across two HCP sessions, we demonstrate a high level of dynamic interactions between these networks and determine that the salience network has the highest temporal flexibility among the three networks. Our proposed methods provide a novel and powerful generative model for investigating dynamic brain connectivity. Copyright © 2017 Elsevier Inc. All rights reserved.
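
    As a rough illustration of the state-and-transition component only (not BSFA itself, whose emission model is a factor analyzer and whose state count is chosen by Bayesian model selection), a standard Gaussian HMM can be fit with the hmmlearn package; the data here are synthetic noise standing in for regional fMRI time series.

    ```python
    import numpy as np
    from hmmlearn.hmm import GaussianHMM  # simplified stand-in for BSFA's emission model

    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 4))  # stand-in for fMRI time series: 500 TRs, 4 regions

    # Fit a 3-state HMM; BSFA instead infers the number of states automatically.
    model = GaussianHMM(n_components=3, covariance_type="full", n_iter=100, random_state=0)
    model.fit(X)

    print(model.transmat_)        # estimated state transition probabilities
    print(model.predict(X)[:20])  # most likely state sequence (temporal evolution)
    ```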

  7. Anurans in a Subarctic Tundra Landscape Near Cape Churchill, Manitoba

    USGS Publications Warehouse

    Reiter, M.E.; Boal, C.W.; Andersen, D.E.

    2008-01-01

    Distribution, abundance, and habitat relationships of anurans inhabiting subarctic regions are poorly understood, and anuran monitoring protocols developed for temperate regions may not be applicable across large roadless areas of northern landscapes. In addition, arctic and subarctic regions of North America are predicted to experience changes in climate and, in some areas, are experiencing habitat alteration due to high rates of herbivory by breeding and migrating waterfowl. To better understand subarctic anuran abundance, distribution, and habitat associations, we conducted anuran calling surveys in the Cape Churchill region of Wapusk National Park, Manitoba, Canada, in 2004 and 2005. We conducted surveys along ~1-km transects distributed across three landscape types (coastal tundra, interior sedge meadow-tundra, and boreal forest-tundra interface) to estimate densities and probabilities of detection of Boreal Chorus Frogs (Pseudacris maculata) and Wood Frogs (Lithobates sylvaticus). We detected a Wood Frog or Boreal Chorus Frog on 22 (87%) of 26 transects surveyed, but probability of detection varied between years and species and among landscape types. Estimated densities of both species increased from the coastal zone inland toward the boreal forest edge. Our results suggest anurans occur across all three landscape types in our study area, but that species-specific spatial patterns exist in their abundances. Considerations for both spatial and temporal variation in abundance and detection probability need to be incorporated into surveys and monitoring programs for subarctic anurans.

  8. Markov Chain-Based Acute Effect Estimation of Air Pollution on Elder Asthma Hospitalization

    PubMed Central

    Luo, Li; Zhang, Fengyi; Sun, Lin; Li, Chunyang; Huang, Debin; Han, Gao; Wang, Bin

    2017-01-01

    Background Asthma causes a substantial economic and health care burden and is sensitive to air pollution; the effect is particularly pronounced in elderly asthma patients (older than 65). The aim of this study is to investigate the Markov-based acute effects of air pollution on elderly asthma hospitalizations, in the form of transition probabilities. Methods A retrospective, population-based study design was used to assess temporal patterns in hospitalizations for asthma in a region of Sichuan province, China. Approximately 12 million residents were covered during the study period. Relative risk analysis and a Markov chain model were employed to estimate daily hospitalization states. Results Among PM2.5, PM10, NO2, and SO2, only SO2 was significant. When air pollution is severe, the transition probability from a low-admission state (previous day) to a high-admission state (next day) is 35.46%, compared with 20.08% when air pollution is mild. In particular, for the female-cold subgroup (female admissions during the cold season), the corresponding probabilities are 30.06% and 0.01%, respectively. Conclusions SO2 was a significant risk factor for elderly asthma hospitalization. When air pollution worsened, the transition probabilities from each state to high-admission states increased dramatically, most evidently in the female-cold subgroup. Based on this work, admission forecasting, asthma intervention, and corresponding healthcare allocation can be undertaken. PMID:29147496
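
    Transition probabilities of this kind can be estimated by counting day-to-day state changes and row-normalizing. A minimal sketch with a fabricated two-state admission sequence (the study additionally stratifies by pollution severity and subgroup):

    ```python
    import numpy as np

    # Hypothetical daily admission states: 0 = low-admission day, 1 = high-admission day.
    states = [0, 0, 1, 0, 0, 0, 1, 1, 0, 1, 0, 0, 1, 0]

    n_states = 2
    counts = np.zeros((n_states, n_states))
    for today, tomorrow in zip(states[:-1], states[1:]):
        counts[today, tomorrow] += 1  # count observed day-to-day transitions

    # Row-normalize counts to obtain the transition probability matrix.
    transition = counts / counts.sum(axis=1, keepdims=True)
    print(transition)  # transition[0, 1] estimates P(high tomorrow | low today)
    ```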

  9. Endogenous modulation of low frequency oscillations by temporal expectations

    PubMed Central

    Cravo, Andre M.; Rohenkohl, Gustavo; Wyart, Valentin

    2011-01-01

    Recent studies have associated increasing temporal expectations with synchronization of higher frequency oscillations and suppression of lower frequencies. In this experiment, we explore a proposal that low-frequency oscillations provide a mechanism for regulating temporal expectations. We used a speeded Go/No-go task and manipulated temporal expectations by changing the probability of target presentation after certain intervals. Across two conditions, the temporal conditional probability of target events differed substantially at the first of three possible intervals. We found that reaction times differed significantly at this first interval across conditions, decreasing with higher temporal expectations. Interestingly, the power of theta activity (4–8 Hz), distributed over central midline sites, also differed significantly across conditions at this first interval. Furthermore, we found a transient coupling between theta phase and beta power after the first interval in the condition with high temporal expectation for targets at this time point. Our results suggest that the adjustments in theta power and the phase-power coupling between theta and beta contribute to a central mechanism for controlling neural excitability according to temporal expectations. PMID:21900508

  10. Multiscale spatial and temporal estimation of the b-value

    NASA Astrophysics Data System (ADS)

    García-Hernández, R.; D'Auria, L.; Barrancos, J.; Padilla, G.

    2017-12-01

    The estimation of the spatial and temporal variations of the Gutenberg-Richter b-value is of great importance in different seismological applications. One of the problems affecting its estimation is the heterogeneous distribution of the seismicity, which makes the estimate strongly dependent upon the selected spatial and/or temporal scale. This is especially important in volcanoes, where dense clusters of earthquakes often overlap the background seismicity. Proposed solutions for estimating temporal variations of the b-value include considering equally spaced time intervals or variable intervals having an equal number of earthquakes. Similar approaches have been proposed to image the spatial variations of this parameter as well. We propose a novel multiscale approach, based on the method of Ogata and Katsura (1993), allowing a consistent estimation of the b-value regardless of the considered spatial and/or temporal scales. Our method, named MUST-B (MUltiscale Spatial and Temporal characterization of the B-value), consists of computing estimates of the b-value at multiple temporal and spatial scales, extracting for a given spatio-temporal point a statistical estimate of the b-value as well as an indication of the characteristic spatio-temporal scale. This approach also includes a consistent estimation of the completeness magnitude (Mc) and of the uncertainties over both b and Mc. We applied this method to example datasets for volcanic (Tenerife, El Hierro) and tectonic areas (Central Italy), as well as an example application at the global scale.
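
    For context, the standard point estimator that such windowed approaches apply is Aki's maximum-likelihood b-value; the sketch below (not MUST-B itself) includes the usual half-bin correction for binned magnitudes and uses a synthetic catalogue.

    ```python
    import numpy as np

    def b_value_mle(mags, mc, dm=0.1):
        """Aki (1965) maximum-likelihood b-value with a half-bin correction.

        mags: binned magnitudes; mc: completeness magnitude; dm: bin width.
        """
        m = np.asarray(mags)
        m = m[m >= mc]
        return np.log10(np.e) / (m.mean() - (mc - dm / 2.0))

    # Synthetic Gutenberg-Richter catalogue, true b = 1.0, binned to 0.1 units.
    rng = np.random.default_rng(1)
    raw = 1.95 + rng.exponential(scale=np.log10(np.e), size=5000)  # continuous magnitudes
    mags = np.round(raw / 0.1) * 0.1                               # bin to dm = 0.1
    print(b_value_mle(mags, mc=2.0, dm=0.1))  # should be close to 1.0
    ```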

  11. A Spatio-Temporally Explicit Random Encounter Model for Large-Scale Population Surveys

    PubMed Central

    Jousimo, Jussi; Ovaskainen, Otso

    2016-01-01

    Random encounter models can be used to estimate population abundance from indirect data collected by non-invasive sampling methods, such as track counts or camera-trap data. The classical Formozov–Malyshev–Pereleshin (FMP) estimator converts track counts into an estimate of mean population density, assuming that data on the daily movement distances of the animals are available. We utilize generalized linear models with spatio-temporal error structures to extend the FMP estimator into a flexible Bayesian modelling approach that estimates not only total population size, but also spatio-temporal variation in population density. We also introduce a weighting scheme to estimate density on habitats that are not covered by survey transects, assuming that movement data on a subset of individuals is available. We test the performance of spatio-temporal and temporal approaches by a simulation study mimicking the Finnish winter track count survey. The results illustrate how the spatio-temporal modelling approach is able to borrow information from observations made on neighboring locations and times when estimating population density, and that spatio-temporal and temporal smoothing models can provide improved estimates of total population size compared to the FMP method. PMID:27611683
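
    The classical FMP conversion that this Bayesian approach generalizes is a one-line formula. A sketch under its usual assumptions (crossings counted over one day, known mean daily movement distance), with invented numbers:

    ```python
    import math

    def fmp_density(crossings: float, transect_km: float, daily_move_km: float) -> float:
        """Formozov-Malyshev-Pereleshin estimator of animal density (per km^2).

        crossings: track intersections with the transect over one day;
        transect_km: total transect length; daily_move_km: mean daily travel distance.
        """
        return (math.pi / 2.0) * crossings / (transect_km * daily_move_km)

    # Hypothetical survey: 38 crossings on 25 km of transect, animals moving 4 km/day.
    print(fmp_density(38, 25.0, 4.0))  # ~0.6 animals per km^2
    ```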

  12. Estimating movement and survival rates of a small saltwater fish using autonomous antenna receiver arrays and passive integrated transponder tags

    USGS Publications Warehouse

    Rudershausen, Paul J.; Buckel, Jeffery A.; Dubreuil, Todd; O'Donnell, Matthew J.; Hightower, Joseph E.; Poland, Steven J.; Letcher, Benjamin H.

    2014-01-01

    We evaluated the performance of small (12.5 mm long) passive integrated transponder (PIT) tags and custom detection antennas for obtaining fine-scale movement and demographic data of mummichog Fundulus heteroclitus in a salt marsh creek. Apparent survival and detection probability were estimated using a Cormack-Jolly-Seber (CJS) model fitted to detection data collected by an array of 3 vertical antennas from November 2010 to March 2011 and by a single horizontal antenna from April to August 2011. Movement of mummichogs was monitored during the period when the array of vertical antennas was used. Antenna performance was examined in situ using tags placed in wooden dowels (drones) and in live mummichogs. Of the 44 tagged fish, 42 were resighted over the 9 mo monitoring period. The in situ detection probabilities of the drone and live mummichogs were high (~80-100%) when the ambient water depth was less than ~0.8 m. Upstream and downstream movement of mummichogs was related to hourly water depth and direction of tidal current in a way that maximized time periods over which mummichogs utilized the intertidal vegetated marsh. Apparent survival was lower during periods of colder water temperatures in December 2010 and early January 2011 (median estimate of daily apparent survival = 0.979) than during other periods of the study (median estimate of daily apparent survival = 0.992). During late fall and winter, temperature had a positive effect on the CJS detection probability of a tagged mummichog, likely due to greater fish activity over warmer periods. During the spring and summer, this pattern reversed possibly due to mummichogs having reduced activity during the hottest periods. This study demonstrates the utility of PIT tags and continuously operating autonomous detection systems for tracking fish at fine temporal scales, and improving estimates of demographic parameters in salt marsh creeks that are difficult or impractical to sample with active fishing gear.

  13. Factors influencing nest survival and productivity of Red-throated Loons (Gavia stellata) in Alaska

    USGS Publications Warehouse

    Rizzolo, Daniel; Schmutz, Joel A.; McCloskey, Sarah E.; Fondell, Thomas F.

    2014-01-01

    Red-throated Loon (Gavia stellata) numbers in Alaska have fluctuated dramatically over the past 3 decades; however, the demographic processes contributing to these population dynamics are poorly understood. To examine spatial and temporal variation in productivity, we estimated breeding parameters at 5 sites in Alaska: at Cape Espenberg and the Copper River Delta we estimated nest survival, and at 3 sites within the Yukon-Kuskokwim Delta we estimated nest survival and productivity. Nest survival varied broadly among sites and years; annual estimates (lower, upper 95% confidence interval) ranged from 0.09 (0.03, 0.29) at Cape Espenberg in 2001 to 0.93 (0.76, 0.99) at the Copper River Delta in 2002. Annual variation among sites was not concordant, suggesting that site-scale factors had a strong influence on nest survival. Models of nest survival indicated that visits to monitor nests had a negative effect on nest daily survival probability, which if not accounted for biased nest survival strongly downward. The sensitivity of breeding Red-throated Loons to nest monitoring suggests other sources of disturbance that cause incubating birds to flush from their nests may also reduce nest survival. Nest daily survival probability at the Yukon-Kuskokwim Delta was negatively associated with an annual index of fox occurrence. Survival through the incubation and chick-rearing periods on the Yukon-Kuskokwim Delta ranged from 0.09 (0.001, 0.493) to 0.50 (0.04, 0.77). Daily survival probability during the chick-rearing period was lower for chicks that had a sibling in 2 of 3 years, consistent with the hypothesis that food availability was limited. Estimates of annual productivity on the Yukon-Kuskokwim Delta ranged from 0.17 to 1.0 chicks per pair. Productivity was not sufficient to maintain population stability in 2 of 3 years, indicating that nest depredation by foxes and poor foraging conditions during chick rearing can have important effects on productivity.
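
    Daily survival probabilities of the kind reported here are commonly summarized by a Mayfield-style exposure-day calculation; the sketch below shows that core arithmetic with invented numbers (the study itself fit covariate models of daily survival, which this does not reproduce).

    ```python
    # Mayfield-style daily survival rate (DSR) from exposure days; numbers invented.
    exposure_days = 950     # total nest-days over which nests were observed
    failures = 19           # nests lost during those exposure days

    dsr = 1.0 - failures / exposure_days       # daily survival probability
    incubation_days = 27                       # assumed incubation period length
    nest_survival = dsr ** incubation_days     # probability a nest survives incubation
    print(round(dsr, 4), round(nest_survival, 3))  # e.g. 0.98 daily, ~0.58 overall
    ```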

  14. Are ranger patrols effective in reducing poaching-related threats within protected areas?

    USGS Publications Warehouse

    Moore, Jennifer F.; Mulindahabi, Felix; Masozera, Michel K.; Nichols, James; Hines, James; Turikunkiko, Ezechiel; Oli, Madan K.

    2018-01-01

    Poaching is one of the greatest threats to wildlife conservation world-wide. However, the spatial and temporal patterns of poaching activities within protected areas, and the effectiveness of ranger patrols and ranger posts in mitigating these threats, are relatively unknown. We used 10 years (2006–2015) of ranger-based monitoring data and dynamic multi-season occupancy models to quantify poaching-related threats, to examine factors influencing the spatio-temporal dynamics of these threats and to test the efficiency of management actions to combat poaching in Nyungwe National Park (NNP), Rwanda. The probability of occurrence of poaching-related threats was highest at lower elevations (1,801–2,200 m), especially in areas that were close to roads and tourist trails; conversely, occurrence probability was lowest at high elevation sites (2,601–3,000 m), and near the park boundary and ranger posts. The number of ranger patrols substantially increased the probability that poaching-related threats disappear at a site if threats were originally present (i.e. the probability of extinction of threats). Without ranger visits, the annual probability of extinction of poaching-related threats was an estimated 7%; this probability would increase to 20% and 57% with 20 and 50 ranger visits per year, respectively. Our results suggest that poaching-related threats can be effectively reduced in NNP by adding ranger posts in areas where they do not currently exist, and by increasing the number of patrols to sites where the probability of poaching activities is high. Synthesis and applications. Our application of dynamic occupancy models to predict the probability of presence of poaching-related threats is novel, and explicitly considers imperfect detection of illegal activities. Based on the modelled relationships, we identify areas that are most vulnerable to poaching, and offer insights regarding how ranger patrols can be optimally deployed to reduce poaching-related threats and other illegal activities, while taking into account potential sampling biases. We show that poaching can be effectively reduced by increasing ranger patrols to areas under high risk of poaching activities, and by adding ranger posts near these sites. These findings are broadly applicable to national parks and protected areas experiencing a high degree of poaching and other illegal activities.
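
    The reported extinction probabilities are consistent with a logit-linear effect of patrol effort. The back-calculation below is our illustration, not the authors' fitted model; anchoring a logistic curve on two of the published values approximately recovers the third.

    ```python
    import math

    def logit(p: float) -> float:
        return math.log(p / (1 - p))

    def inv_logit(x: float) -> float:
        return 1 / (1 + math.exp(-x))

    # Anchor a logit-linear model on two reported values (illustrative only):
    # extinction probability 0.07 with 0 visits/year and 0.20 with 20 visits/year.
    b0 = logit(0.07)
    b1 = (logit(0.20) - b0) / 20.0

    for visits in (0, 20, 50):
        print(visits, round(inv_logit(b0 + b1 * visits), 2))  # 0.07, 0.20, ~0.60 (paper: 0.57)
    ```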

  15. The non-uniformity of fossil preservation.

    PubMed

    Holland, Steven M

    2016-07-19

    The fossil record provides the primary source of data for calibrating the origin of clades. Although minimum ages of clades are given by the oldest preserved fossil, these underestimate the true age, which must be bracketed by probabilistic methods based on multiple fossil occurrences. Although most of these methods assume uniform preservation rates, this assumption is unsupported over geological timescales. On geologically long timescales (more than 10 Myr), the origin and cessation of sedimentary basins, and long-term variations in tectonic subsidence, eustatic sea level and sedimentation rate control the availability of depositional facies that preserve the environments in which species lived. The loss of doomed sediments, those with a low probability of preservation, imparts a secular trend to fossil preservation. As a result, the fossil record is spatially and temporally non-uniform. Models of fossil preservation should reflect this non-uniformity by using empirical estimates of fossil preservation that are spatially and temporally partitioned, or by using indirect proxies of fossil preservation. Geologically realistic models of preservation will provide substantially more reliable estimates of the origination of clades. This article is part of the themed issue 'Dating species divergences using rocks and clocks'. © 2016 The Author(s).

  16. Quantifying temporal isolation: a modelling approach assessing the effect of flowering time differences on crop-to-weed pollen flow in sunflower

    PubMed Central

    Roumet, Marie; Cayre, Adeline; Latreille, Muriel; Muller, Marie-Hélène

    2015-01-01

    Flowering time divergence can be a crucial component of reproductive isolation between sympatric populations, but few studies have quantified its actual contribution to the reduction of gene flow. In this study, we aimed at estimating pollen-mediated gene flow between cultivated sunflower and a weedy conspecific sunflower population growing in the same field, and at quantifying how it is affected by the weeds' flowering time. For that purpose, we extended an existing mating model by including a temporal distance (i.e. flowering time difference between potential parents) effect on mating probabilities. Using phenological and genotypic data gathered on the crop and on a sample of the weedy population and its offspring, we estimated an average hybridization rate of approximately 10%. This rate varied strongly, from 30% on average for weeds flowering at the crop flowering peak to 0% when the crop finished flowering, and was affected by the local density of weeds. Our results also suggested the occurrence of other factors limiting crop-to-weed gene flow. This level of gene flow and its dependence on flowering time might influence the evolutionary fate of weedy sunflower populations sympatric to their crop relative. PMID:25667603

  17. Factors influencing detection of the federally endangered Diamond Darter Crystallaria cincotta: Implications for long-term monitoring strategies

    USGS Publications Warehouse

    Rizzo, Austin A.; Brown, Donald J.; Welsh, Stuart A.; Thompson, Patricia A.

    2017-01-01

    Population monitoring is an essential component of endangered species recovery programs. The federally endangered Diamond Darter Crystallaria cincotta is in need of an effective monitoring design to improve our understanding of its distribution and track population trends. Because of their small size, cryptic coloration, and nocturnal behavior, along with limitations associated with current sampling methods, individuals are difficult to detect at known occupied sites. Therefore, research is needed to determine if survey efforts can be improved by increasing the probability of individual detection. The primary objective of this study was to determine if there are seasonal and diel patterns in Diamond Darter detectability during population surveys. In addition to temporal factors, we also assessed five habitat variables that might influence individual detection. We used N-mixture models to estimate site abundances and relationships between covariates and individual detectability, and ranked models using Akaike's information criterion. During 2015, three known occupied sites were each sampled 15 times between May and October. The best supported model included water temperature as a quadratic function influencing individual detectability, with temperatures around 22 C resulting in the highest detection probability. Detection probability when surveying at the optimal temperature was approximately 6% and 7.5% greater than when surveying at 16 C and 29 C, respectively. Time of night and day of year were not strong predictors of Diamond Darter detectability. The results of this study will allow researchers and agencies to maximize detection probability when surveying populations, resulting in greater monitoring efficiency and likely more precise abundance estimates.
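
    A quadratic temperature term on the logit scale produces a detection curve peaking at -b1/(2*b2). The sketch below uses hypothetical coefficients chosen to peak at 22 C and roughly reproduce the reported 6-7.5 percentage-point differences; they are not the study's fitted values.

    ```python
    import math

    # Hypothetical logit-quadratic detection model: logit(p) = b0 + b1*T + b2*T^2.
    # Coefficients chosen so the curve peaks at T = -b1/(2*b2) = 22 C; illustrative only.
    b2 = -0.008
    b1 = -2 * b2 * 22.0          # forces the peak at 22 C
    b0 = math.log(0.40 / 0.60) - b1 * 22.0 - b2 * 22.0**2  # sets p(22) = 0.40 (hypothetical)

    def detection(temp_c: float) -> float:
        eta = b0 + b1 * temp_c + b2 * temp_c**2
        return 1 / (1 + math.exp(-eta))

    for t in (16.0, 22.0, 29.0):
        print(t, round(detection(t), 3))  # ~0.33, 0.40, ~0.31
    ```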

  18. High and variable mortality of leatherback turtles reveal possible anthropogenic impacts.

    PubMed

    Santidrián Tomillo, P; Robinson, N J; Sanz-Aguilar, A; Spotila, J R; Paladino, F V; Tavecchia, G

    2017-08-01

    The number of nesting leatherback turtles (Dermochelys coriacea) in the eastern Pacific Ocean has declined dramatically since the late 1980s. This decline has been attributed to egg poaching and interactions with fisheries. However, it is not clear how much of the decline should also be ascribed to variability in the physical characteristics of the ocean. We used data on individually marked turtles that nest at Playa Grande, Costa Rica, to address whether climatic variability affects survival and inter-breeding interval. Because some turtles might nest undetected, we used capture-recapture models to model survival probability while accounting for detection failure. In addition, as the probability of reproduction is constrained by past nesting events, we formulated a new parameterization to estimate inter-breeding intervals and contrast hypotheses on the role of climatic covariates in reproductive frequency. Average annual survival for the period 1993-2011 was low (0.78) and varied over time, ranging from 0.49 to 0.99 with a negative temporal trend mainly due to the high mortality values registered after 2004. Survival probability was not associated with the Multivariate ENSO Index of the South Pacific Ocean (MEI), but this index explained 24% of the temporal variability in reproductive frequency. The probability that a turtle permanently left after the first encounter was 26%. This high proportion of transients might be associated with a high mortality cost of the first reproduction or with long-distance nesting dispersal after the first nesting season. Although current data do not allow us to separate these two hypotheses, the low encounter rate at other locations and the high investment in reproduction support the first hypothesis. The low and variable annual survival probability has largely contributed to the decline of this leatherback population. The lack of correlation between survival probability and the most important climatic driver of oceanic processes in the Pacific argues against a climate-related decline and points to anthropogenic sources of mortality as the main causes of the observed population decline. © 2017 by the Ecological Society of America.

  19. Heterogeneous occupancy and density estimates of the pathogenic fungus Batrachochytrium dendrobatidis in waters of North America

    USGS Publications Warehouse

    Chestnut, Tara E.; Anderson, Chauncey; Popa, Radu; Blaustein, Andrew R.; Voytek, Mary; Olson, Deanna H.; Kirshtein, Julie

    2014-01-01

    Biodiversity losses are occurring worldwide due to a combination of stressors. For example, by one estimate, 40% of amphibian species are vulnerable to extinction, and disease is one threat to amphibian populations. The emerging infectious disease chytridiomycosis, caused by the aquatic fungus Batrachochytrium dendrobatidis (Bd), is a contributor to amphibian declines worldwide. Bd research has focused on the dynamics of the pathogen in its amphibian hosts, with little emphasis on investigating the dynamics of free-living Bd. Therefore, we investigated patterns of Bd occupancy and density in amphibian habitats using occupancy models, powerful tools for estimating site occupancy and detection probability. Occupancy models have been used to investigate diseases where the focus was on pathogen occurrence in the host. We applied occupancy models to investigate free-living Bd in North American surface waters to determine Bd seasonality, relationships between Bd site occupancy and habitat attributes, and probability of detection from water samples as a function of the number of samples, sample volume, and water quality. We also report on the temporal patterns of Bd density from a 4-year case study of a Bd-positive wetland. We provide evidence that Bd occurs in the environment year-round. Bd exhibited temporal and spatial heterogeneity in density, but did not exhibit seasonality in occupancy. Bd was detected in all months, typically at less than 100 zoospores L−1. The highest density observed was ∼3 million zoospores L−1. We detected Bd in 47% of sites sampled, but estimated that Bd occupied 61% of sites, highlighting the importance of accounting for imperfect detection. When Bd was present, there was a 95% chance of detecting it with four samples of 600 mL of water or five samples of 60 mL. Our findings provide important baseline information to advance the study of Bd disease ecology, and advance our understanding of amphibian exposure to free-living Bd in aquatic habitats over time.
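
    The four-samples-for-95% statement follows from treating samples as independent detections, P(detect) = 1 - (1 - p)^n. The sketch below inverts that relationship; the per-sample probabilities it prints are implied values, not estimates reported in the paper.

    ```python
    # Independent-sample detection: P(at least one detection in n samples) = 1 - (1 - p)^n.

    def p_detect(per_sample_p: float, n: int) -> float:
        return 1.0 - (1.0 - per_sample_p) ** n

    def per_sample_from_overall(overall: float, n: int) -> float:
        return 1.0 - (1.0 - overall) ** (1.0 / n)

    p4 = per_sample_from_overall(0.95, 4)   # four 600-mL samples
    p5 = per_sample_from_overall(0.95, 5)   # five 60-mL samples
    print(round(p4, 2), round(p5, 2))       # ~0.53 and ~0.45 per sample
    print(round(p_detect(p4, 4), 2))        # sanity check: 0.95
    ```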

  1. Diagnostic causal reasoning with verbal information.

    PubMed

    Meder, Björn; Mayrhofer, Ralf

    2017-08-01

    In diagnostic causal reasoning, the goal is to infer the probability of causes from one or multiple observed effects. Typically, studies investigating such tasks provide subjects with precise quantitative information regarding the strength of the relations between causes and effects or sample data from which the relevant quantities can be learned. By contrast, we sought to examine people's inferences when causal information is communicated through qualitative, rather vague verbal expressions (e.g., "X occasionally causes A"). We conducted three experiments using a sequential diagnostic inference task, where multiple pieces of evidence were obtained one after the other. Quantitative predictions of different probabilistic models were derived using the numerical equivalents of the verbal terms, taken from an unrelated study with different subjects. We present a novel Bayesian model that allows for incorporating the temporal weighting of information in sequential diagnostic reasoning, which can be used to model both primacy and recency effects. On the basis of 19,848 judgments from 292 subjects, we found a remarkably close correspondence between the diagnostic inferences made by subjects who received only verbal information and those of a matched control group to whom information was presented numerically. Whether information was conveyed through verbal terms or numerical estimates, diagnostic judgments closely resembled the posterior probabilities entailed by the causes' prior probabilities and the effects' likelihoods. We observed interindividual differences regarding the temporal weighting of evidence in sequential diagnostic reasoning. Our work provides pathways for investigating judgment and decision making with verbal information within a computational modeling framework. Copyright © 2017 Elsevier Inc. All rights reserved.
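
    A minimal sketch of sequential diagnostic updating with temporal weighting of evidence; the geometric decay on log-likelihoods and all numbers are illustrative stand-ins, not the paper's model.

    ```python
    import numpy as np

    # Two candidate causes with prior probabilities, three sequentially observed effects.
    priors = np.array([0.3, 0.7])                 # P(cause); illustrative
    likelihoods = np.array([[0.8, 0.2],           # P(effect_k | cause) for each of
                            [0.6, 0.5],           # three observed effects (rows);
                            [0.1, 0.4]])          # columns index the two causes

    decay = 0.7  # geometric decay: older evidence is down-weighted (recency effect)
    n = likelihoods.shape[0]
    weights = decay ** np.arange(n - 1, -1, -1)   # oldest evidence -> smallest weight

    # Weighted Bayesian update: posterior proportional to prior * prod_k P(e_k|cause)^w_k.
    log_post = np.log(priors) + weights @ np.log(likelihoods)
    posterior = np.exp(log_post - log_post.max())
    posterior /= posterior.sum()
    print(posterior)  # a decay > 1 would instead down-weight recent evidence (primacy)
    ```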

  2. Bayesian analysis of multimodal data and brain imaging

    NASA Astrophysics Data System (ADS)

    Assadi, Amir H.; Eghbalnia, Hamid; Backonja, Miroslav; Wakai, Ronald T.; Rutecki, Paul; Haughton, Victor

    2000-06-01

    It is often the case that information about a process can be obtained using a variety of methods. Each method is employed because of specific advantages over the competing alternatives. An example in medical neuro-imaging is the choice between fMRI and MEG modes, where fMRI can provide high spatial resolution in comparison to the superior temporal resolution of MEG. The combination of data from varying modes provides the opportunity to infer results that may not be possible by means of any one mode alone. We discuss a Bayesian and learning theoretic framework for enhanced feature extraction that is particularly suited to multi-modal investigations of massive data sets from multiple experiments. In the following Bayesian approach, acquired knowledge (information) regarding various aspects of the process is directly incorporated into the formulation. This information can come from a variety of sources. In our case, it represents statistical information obtained from other modes of data collection. The information is used to train a learning machine to estimate a probability distribution, which is used in turn by a second machine as a prior, in order to produce a more refined estimation of the distribution of events. The computational demand of the algorithm is handled by a proposed distributed parallel implementation on a cluster of workstations that can be scaled to address real-time needs if required. We provide a simulation of these methods on a set of synthetically generated MEG and EEG data. We show how spatial and temporal resolutions improve by using prior distributions. The method on fMRI signals permits one to construct the probability distribution of the non-linear hemodynamics of the human brain (real data). These computational results are in agreement with biologically based measurements of other labs, as reported to us by researchers from the UK. We also provide preliminary analysis involving multi-electrode cortical recording that accompanies behavioral data in pain experiments on freely moving mice subjected to moderate heat delivered by an electric bulb. Summary of new or breakthrough ideas: (1) a new method to estimate the probability distribution for measurements of the nonlinear hemodynamics of the brain from multi-modal neuronal data; to our knowledge, this is the first time such an idea has been tried; (2) a breakthrough in improving the time resolution of fMRI signals using (1) above.

  3. Behavior of sensitivities in the one-dimensional advection-dispersion equation: Implications for parameter estimation and sampling design

    USGS Publications Warehouse

    Knopman, Debra S.; Voss, Clifford I.

    1987-01-01

    The spatial and temporal variability of sensitivities has a significant impact on parameter estimation and sampling design for studies of solute transport in porous media. Physical insight into the behavior of sensitivities is offered through an analysis of analytically derived sensitivities for the one-dimensional form of the advection-dispersion equation. When parameters are estimated in regression models of one-dimensional transport, the spatial and temporal variability in sensitivities influences variance and covariance of parameter estimates. Several principles account for the observed influence of sensitivities on parameter uncertainty. (1) Information about a physical parameter may be most accurately gained at points in space and time with a high sensitivity to the parameter. (2) As the distance of observation points from the upstream boundary increases, maximum sensitivity to velocity during passage of the solute front increases and the consequent estimate of velocity tends to have lower variance. (3) The frequency of sampling must be “in phase” with the S shape of the dispersion sensitivity curve to yield the most information on dispersion. (4) The sensitivity to the dispersion coefficient is usually at least an order of magnitude less than the sensitivity to velocity. (5) The assumed probability distribution of random error in observations of solute concentration determines the form of the sensitivities. (6) If variance in random error in observations is large, trends in sensitivities of observation points may be obscured by noise and thus have limited value in predicting variance in parameter estimates among designs. (7) Designs that minimize the variance of one parameter may not necessarily minimize the variance of other parameters. (8) The time and space interval over which an observation point is sensitive to a given parameter depends on the actual values of the parameters in the underlying physical system.
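
    To make the notion of a sensitivity concrete: for a step input, the one-dimensional solution is approximately C(x,t) = (C0/2) erfc((x - vt) / (2*sqrt(D*t))), and sensitivities to v and D can be probed numerically. A sketch under that leading-term approximation (not the authors' exact analytical derivation):

    ```python
    import numpy as np
    from scipy.special import erfc

    def conc(x, t, v, D, c0=1.0):
        """Step-input solution of the 1-D advection-dispersion equation (leading term)."""
        return 0.5 * c0 * erfc((x - v * t) / (2.0 * np.sqrt(D * t)))

    def sensitivity(x, t, v, D, param, eps=1e-6):
        """Central-difference sensitivity of concentration to 'v' or 'D'."""
        dv, dD = (eps, 0.0) if param == "v" else (0.0, eps)
        return (conc(x, t, v + dv, D + dD) - conc(x, t, v - dv, D - dD)) / (2 * eps)

    x, v, D = 10.0, 1.0, 0.5
    for t in (5.0, 10.0, 15.0):  # before, during, and after passage of the solute front
        sv = sensitivity(x, t, v, D, "v")
        sD = sensitivity(x, t, v, D, "D")
        print(f"t={t}: dC/dv={sv:.4f}, dC/dD={sD:.4f}")
    ```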

  4. Rainfall estimation for real time flood monitoring using geostationary meteorological satellite data

    NASA Astrophysics Data System (ADS)

    Veerakachen, Watcharee; Raksapatcharawong, Mongkol

    2015-09-01

    Rainfall estimation by geostationary meteorological satellite data provides good spatial and temporal resolutions. This is advantageous for real time flood monitoring and warning systems. However, a rainfall estimation algorithm developed in one region needs to be adjusted for another climatic region. This work proposes computationally-efficient rainfall estimation algorithms based on an Infrared Threshold Rainfall (ITR) method calibrated with regional ground truth. Hourly rain gauge data collected from 70 stations around the Chao-Phraya river basin were used for calibration and validation of the algorithms. The algorithm inputs were derived from FY-2E satellite observations consisting of infrared and water vapor imagery. The results were compared with the Global Satellite Mapping of Precipitation (GSMaP) near real time product (GSMaP_NRT) using the probability of detection (POD), root mean square error (RMSE) and linear correlation coefficient (CC) as performance indices. Comparison with the GSMaP_NRT product for real time monitoring purposes shows that hourly rain estimates from the proposed algorithm with the error adjustment technique (ITR_EA) offer higher POD and approximately the same RMSE and CC with less data latency.
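
    A sketch of the three verification indices named above, computed over a pair of invented hourly series; the rain/no-rain threshold is an illustrative choice, not the paper's.

    ```python
    import numpy as np

    # Hypothetical hourly series: gauge truth vs satellite estimate (mm/h).
    gauge = np.array([0.0, 1.2, 3.5, 0.0, 0.4, 7.9, 0.0, 2.1])
    sat   = np.array([0.1, 0.8, 2.9, 0.0, 0.0, 6.5, 0.3, 2.6])

    threshold = 0.1  # rain/no-rain threshold (mm/h); illustrative choice
    rain_obs = gauge > threshold
    rain_est = sat > threshold

    pod = np.sum(rain_obs & rain_est) / np.sum(rain_obs)   # probability of detection
    rmse = np.sqrt(np.mean((sat - gauge) ** 2))            # root mean square error
    cc = np.corrcoef(gauge, sat)[0, 1]                     # linear correlation coefficient
    print(round(pod, 2), round(rmse, 2), round(cc, 2))
    ```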

  5. Abundance models improve spatial and temporal prioritization of conservation resources.

    PubMed

    Johnston, Alison; Fink, Daniel; Reynolds, Mark D; Hochachka, Wesley M; Sullivan, Brian L; Bruns, Nicholas E; Hallstein, Eric; Merrifield, Matt S; Matsumoto, Sandi; Kelling, Steve

    2015-10-01

    Conservation prioritization requires knowledge about organism distribution and density. This information is often inferred from models that estimate the probability of species occurrence rather than from models that estimate species abundance, because abundance data are harder to obtain and model. However, occurrence and abundance may not display similar patterns, and therefore development of robust, scalable abundance models is critical to ensuring that scarce conservation resources are applied where they can have the greatest benefits. Motivated by a dynamic land conservation program, we develop and assess a general method for modeling relative abundance using citizen science monitoring data. Weekly estimates of relative abundance and occurrence were compared for prioritizing times and locations of conservation actions for migratory waterbird species in California, USA. We found that abundance estimates consistently provided better rankings of observed counts than occurrence estimates. Additionally, the relationship between abundance and occurrence was nonlinear and varied by species and season. Across species, locations prioritized by occurrence models had only 10-58% overlap with locations prioritized by abundance models, highlighting that occurrence models will not typically identify the locations of highest abundance that are vital for conservation of populations.

  6. A capture-recapture model of amphidromous fish dispersal.

    PubMed

    Smith, W E; Kwak, T J

    2014-04-01

    Adult movement scale was quantified for two tropical Caribbean diadromous fishes, bigmouth sleeper Gobiomorus dormitor and mountain mullet Agonostomus monticola, using passive integrated transponders (PITs) and radio-telemetry. Large numbers of fishes were tagged in Río Mameyes, Puerto Rico, U.S.A., with PITs and monitored at three fixed locations over a 2.5 year period to estimate transition probabilities between upper and lower elevations and survival probabilities with a multistate Cormack-Jolly-Seber model. A sub-set of fishes were tagged with radio-transmitters and tracked at weekly intervals to estimate fine-scale dispersal. Changes in spatial and temporal distributions of tagged fishes indicated that neither G. dormitor nor A. monticola moved into the lowest, estuarine reaches of Río Mameyes during two consecutive reproductive periods, thus demonstrating that both species follow an amphidromous, rather than catadromous, migratory strategy. Further, both species were relatively sedentary, with restricted linear ranges. While substantial dispersal of these species occurs at the larval stage during recruitment to fresh water, the results indicate minimal dispersal in spawning adults. Successful conservation of diadromous fauna on tropical islands requires management at both broad basin and localized spatial scales. © 2014 The Fisheries Society of the British Isles.

  7. Trap configuration and spacing influences parameter estimates in spatial capture-recapture models

    USGS Publications Warehouse

    Sun, Catherine C.; Fuller, Angela K.; Royle, J. Andrew

    2014-01-01

    An increasing number of studies employ spatial capture-recapture models to estimate population size, but there has been limited research on how different spatial sampling designs and trap configurations influence parameter estimators. Spatial capture-recapture models provide an advantage over non-spatial models by explicitly accounting for heterogeneous detection probabilities among individuals that arise due to the spatial organization of individuals relative to sampling devices. We simulated black bear (Ursus americanus) populations and spatial capture-recapture data to evaluate the influence of trap configuration and trap spacing on estimates of population size and a spatial scale parameter, sigma, that relates to home range size. We varied detection probability and home range size, and considered three trap configurations common to large-mammal mark-recapture studies: regular spacing, clustered, and a temporal sequence of different cluster configurations (i.e., trap relocation). We explored trap spacing and number of traps per cluster by varying the number of traps. The clustered arrangement performed well when detection rates were low, and provides for easier field implementation than the sequential trap arrangement. However, performance differences between trap configurations diminished as home range size increased. Our simulations suggest it is important to consider trap spacing relative to home range sizes, with traps ideally spaced no more than twice the spatial scale parameter. While spatial capture-recapture models can accommodate different sampling designs and still estimate parameters with accuracy and precision, our simulations demonstrate that aspects of sampling design, namely trap configuration and spacing, must consider study area size, ranges of individual movement, and home range sizes in the study population.
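
    In spatial capture-recapture, sigma typically enters through a half-normal detection function, which is why spacing traps much beyond 2*sigma risks leaving individuals between traps effectively undetectable. A sketch with illustrative values:

    ```python
    import math

    def half_normal_detection(d: float, g0: float, sigma: float) -> float:
        """Half-normal detection probability at distance d from an activity center."""
        return g0 * math.exp(-d**2 / (2.0 * sigma**2))

    g0, sigma = 0.2, 1.5  # illustrative baseline detection and spatial scale (km)
    for spacing in (1.0, 3.0, 6.0):  # trap spacing; midpoint is spacing/2 from each trap
        midpoint = spacing / 2.0
        print(spacing, round(half_normal_detection(midpoint, g0, sigma), 4))
    # At spacing = 2*sigma (3.0 km) the midpoint detection is still ~0.12;
    # at 4*sigma (6.0 km) it drops to ~0.03, risking undetected individuals.
    ```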

  8. Seismic hazard and risk assessment in the intraplate environment: The New Madrid seismic zone of the central United States

    USGS Publications Warehouse

    Wang, Z.

    2007-01-01

    Although the causes of large intraplate earthquakes are still not fully understood, they pose real hazard and risk to societies. Estimating hazard and risk in these regions is difficult because of the lack of earthquake records. The New Madrid seismic zone is one such region where large and rare intraplate earthquakes (M = 7.0 or greater) pose significant hazard and risk. Many different definitions of hazard and risk have been used, and the resulting estimates differ dramatically. In this paper, seismic hazard is defined as the natural phenomenon generated by earthquakes, such as ground motion, and is quantified by two parameters: a level of hazard and its occurrence frequency or mean recurrence interval; seismic risk is defined as the probability of occurrence of a specific level of seismic hazard over a certain time and is quantified by three parameters: probability, a level of hazard, and exposure time. Probabilistic seismic hazard analysis (PSHA), a commonly used method for estimating seismic hazard and risk, derives a relationship between a ground motion parameter and its return period (hazard curve). The return period is not an independent temporal parameter but a mathematical extrapolation of the recurrence interval of earthquakes and the uncertainty of ground motion. Therefore, it is difficult to understand and use PSHA. A new method is proposed and applied here for estimating seismic hazard in the New Madrid seismic zone. This method provides hazard estimates that are consistent with the state of our knowledge and can be easily applied to other intraplate regions. © 2007 The Geological Society of America.
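
    Under a Poisson occurrence model, the three risk parameters are linked by P = 1 - exp(-t/T), where T is the mean recurrence interval and t the exposure time. A sketch of this generic relationship (not the author's full method):

    ```python
    import math

    def exceedance_probability(exposure_yr: float, recurrence_yr: float) -> float:
        """Poisson probability of at least one event in 'exposure_yr' years,
        given a mean recurrence interval of 'recurrence_yr' years."""
        return 1.0 - math.exp(-exposure_yr / recurrence_yr)

    # Example: a ground-motion level with a 500-yr mean recurrence interval
    # has roughly a 10% chance of being exceeded during a 50-yr exposure time.
    print(round(exceedance_probability(50.0, 500.0), 3))   # ~0.095
    print(round(exceedance_probability(50.0, 2500.0), 3))  # ~0.02 for a rarer event
    ```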

  9. Towards the dynamic prediction of wildfire danger. Modeling temporal scenarios of fire-occurrence in Northeast Spain

    NASA Astrophysics Data System (ADS)

    Martín, Yago; Rodrigues, Marcos

    2017-04-01

    To date, models of human-caused ignition probability have commonly been developed from a static or structural point of view, regardless of the time cycles that drive human behavior or environmental conditions. However, human drivers mostly have a temporal dimension, and fuel conditions are subject to temporal changes as well, which is why a historical/temporal perspective is often required. Previous studies in the region suggest that human driving factors of wildfires have undergone significant shifts in inter-annual occurrence probability models, thus varying over time. On the other hand, an increasing role of environmental conditions has also been reported. This research comprehensively analyzes the intra-annual dimension of fire occurrence and fire-triggering factors using NW Spain as a test area, moving one step forward towards achieving more accurate predictions, to ultimately develop dynamic predictive models. To this end, several intra-annual presence-only models have been calibrated, exploring seasonal variations of environmental conditions and short-term cycles of human activity (working vs. non-working days). Models were developed from accurately geolocated fire data in the 2008-2012 period, and GIS and remote sensing (MOD1A2 and MOD16) information. Specifically, 8 occurrence data subsets (scenarios) were constructed by splitting fire records into 4 seasons (winter, spring, summer and autumn), then separating each season into 2 new categories (working and non-working days). This allows analyzing the temporal variation of socioeconomic (urban and agricultural interfaces, transport and road networks, and human settlements) and environmental (fuel conditions) factors associated with occurrence. Models were calibrated applying the Maximum Entropy algorithm (MaxEnt). The MaxEnt algorithm was selected as it is the most widespread approach for dealing with presence-only data, as is the case with fire occurrence. The dependent variable for each scenario was created on a conceptual framework which assumed that there were no true cases of fire absence. Model accuracy was assessed using a k-fold cross-validation procedure, whereas variable importance was addressed using a jackknife approach combined with AUC estimation. Results reported model performances around 0.8 AUC in all temporal scenarios. In addition, large variability was observed in the contribution of explanatory factors, with accessibility variables and fuel conditions as key factors across models. Overall, we believe our approach is reliable enough to derive dynamic predictions of human-caused fire occurrence probability. To our knowledge, this is the first attempt to combine presence-only models based on XY-located fire data with remote sensing information and intra-annual scenarios that also include cycles of human activity.

  10. Factors influencing reporting and harvest probabilities in North American geese

    USGS Publications Warehouse

    Zimmerman, G.S.; Moser, T.J.; Kendall, W.L.; Doherty, P.F.; White, Gary C.; Caswell, D.F.

    2009-01-01

    We assessed variation in reporting probabilities of standard bands among species, populations, harvest locations, and size classes of North American geese to enable estimation of unbiased harvest probabilities. We included reward (US$10, $20, $30, $50, or $100) and control ($0) banded geese from 16 recognized goose populations of 4 species: Canada (Branta canadensis), cackling (B. hutchinsii), Ross's (Chen rossii), and snow geese (C. caerulescens). We incorporated spatially explicit direct recoveries and live recaptures into a multinomial model to estimate reporting, harvest, and band-retention probabilities. We compared various models for estimating harvest probabilities at country (United States vs. Canada), flyway (5 administrative regions), and harvest area (i.e., flyways divided into northern and southern sections) scales. Mean reporting probability of standard bands was 0.73 (95% CI 0.69-0.77). Point estimates of reporting probabilities for goose populations or spatial units varied from 0.52 to 0.93, but confidence intervals for individual estimates overlapped and model selection indicated that models with species, population, or spatial effects were less parsimonious than those without these effects. Our estimates were similar to recently reported estimates for mallards (Anas platyrhynchos). We provide current harvest probability estimates for these populations using our direct measures of reporting probability, improving the accuracy of previous estimates obtained from recovery probabilities alone. Goose managers and researchers throughout North America can use our reporting probabilities to correct recovery probabilities estimated from standard banding operations for deriving spatially explicit harvest probabilities.
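
    The correction described amounts to dividing a direct recovery probability by the reporting probability. A one-line sketch using the paper's mean reporting probability and a hypothetical recovery rate:

    ```python
    # Harvest probability = direct recovery probability / reporting probability.
    # Reporting probability 0.73 is the paper's mean estimate; recovery rate is hypothetical.

    reporting_prob = 0.73          # mean reporting probability of standard bands (paper)
    direct_recovery_rate = 0.045   # hypothetical: bands reported shot / bands at risk

    harvest_prob = direct_recovery_rate / reporting_prob
    print(round(harvest_prob, 3))  # ~0.062: recovery rate alone would understate harvest
    ```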

  11. Unmanned aerial vehicles for surveying marine fauna: assessing detection probability.

    PubMed

    Hodgson, Amanda; Peel, David; Kelly, Natalie

    2017-06-01

    Aerial surveys are conducted for various fauna to assess abundance, distribution, and habitat use over large spatial scales. They are traditionally conducted using light aircraft with observers recording sightings in real time. Unmanned Aerial Vehicles (UAVs) offer an alternative with many potential advantages, including eliminating human risk. To be effective, this emerging platform needs to provide detection rates of animals comparable to traditional methods. UAVs can also acquire new types of information, and this new data requires a reevaluation of traditional analyses used in aerial surveys; including estimating the probability of detecting animals. We conducted 17 replicate UAV surveys of humpback whales (Megaptera novaeangliae) while simultaneously obtaining a 'census' of the population from land-based observations, to assess UAV detection probability. The ScanEagle UAV, carrying a digital SLR camera, continuously captured images (with 75% overlap) along transects covering the visual range of land-based observers. We also used ScanEagle to conduct focal follows of whale pods (n = 12, mean duration = 40 min), to assess a new method of estimating availability. A comparison of the whale detections from the UAV to the land-based census provided an estimated UAV detection probability of 0.33 (CV = 0.25; incorporating both availability and perception biases), which was not affected by environmental covariates (Beaufort sea state, glare, and cloud cover). According to our focal follows, the mean availability was 0.63 (CV = 0.37), with pods including mother/calf pairs having a higher availability (0.86, CV = 0.20) than those without (0.59, CV = 0.38). The follows also revealed (and provided a potential correction for) a downward bias in group size estimates from the UAV surveys, which resulted from asynchronous diving within whale pods, and a relatively short observation window of 9 s. We have shown that UAVs are an effective alternative to traditional methods, providing a detection probability that is within the range of previous studies for our target species. We also describe a method of assessing availability bias that represents spatial and temporal characteristics of a survey, from the same perspective as the survey platform, is benign, and provides additional data on animal behavior. © 2017 by the Ecological Society of America.

  12. A method to assess the inter-annual weather-dependent variability in air pollution concentration and deposition based on weather typing

    NASA Astrophysics Data System (ADS)

    Pleijel, Håkan; Grundström, Maria; Karlsson, Gunilla Pihl; Karlsson, Per Erik; Chen, Deliang

    2016-02-01

    Annual anomalies in air pollutant concentrations, and deposition (bulk and throughfall) of sulphate, nitrate and ammonium, in the Gothenburg region, south-west Sweden, were correlated with optimized linear combinations of the yearly frequency of Lamb Weather Types (LWTs) to determine the extent to which the year-to-year variation in pollution exposure can be partly explained by weather related variability. Air concentrations of urban NO2, CO, PM10, as well as O3 at both an urban and a rural monitoring site, and the deposition of sulphate, nitrate and ammonium for the period 1997-2010 were included in the analysis. Linear detrending of the time series was performed to estimate trend-independent anomalies. These estimated anomalies were subtracted from observed annual values. Then the statistical significance of temporal trends with and without LWT adjustment was tested. For the pollutants studied, the annual anomaly was well correlated with the annual LWT combination (R2 in the range 0.52-0.90). Some negative (annual average [NO2], ammonia bulk deposition) or positive (average urban [O3]) temporal trends became statistically significant (p < 0.05) when the LWT adjustment was applied. In all the cases but one (NH4 throughfall, for which no temporal trend existed) the significance of temporal trends became stronger with LWT adjustment. For nitrate and ammonium, the LWT based adjustment explained a larger fraction of the inter-annual variation for bulk deposition than for throughfall. This is probably linked to the longer time scale of canopy related dry deposition processes influencing throughfall being explained to a lesser extent by LWTs than the meteorological factors controlling bulk deposition. The proposed novel methodology can be used by authorities responsible for air pollution management, and by researchers studying temporal trends in pollution, to evaluate e.g. the relative importance of changes in emissions and weather variability in annual air pollution exposure.

  13. Multiscale Characterization of the Probability Density Functions of Velocity and Temperature Increment Fields

    NASA Astrophysics Data System (ADS)

    DeMarco, Adam Ward

    The turbulent motions within the atmospheric boundary layer exist over a wide range of spatial and temporal scales and are very difficult to characterize. Thus, to explore the behavior of such complex flow environments, it is customary to examine their properties from a statistical perspective. Utilizing the probability density functions of velocity and temperature increments, Δu and ΔT, respectively, this work investigates their multiscale behavior to uncover unique traits that have yet to be thoroughly studied. Drawing on diverse datasets, including idealized wind tunnel experiments, atmospheric turbulence field measurements, multi-year ABL tower observations, and mesoscale model simulations, this study reveals remarkable similarities (and some differences) between the small- and larger-scale components of the increment probability density functions. This comprehensive analysis also utilizes a set of statistical distributions to showcase their ability to capture features of the velocity and temperature increments' probability density functions (pdfs) across multiscale atmospheric motions. An approach is proposed for estimating these pdfs using the maximum likelihood estimation (MLE) technique, which has not previously been applied to atmospheric data of this kind. Using this technique, we show that higher-order moments can be estimated accurately with a limited sample size, which has been a persistent concern in atmospheric turbulence research. Using robust goodness-of-fit (GoF) metrics, we quantitatively assess the accuracy of the distributions across the diverse datasets. Through this analysis, it is shown that the normal inverse Gaussian (NIG) distribution is a prime candidate for estimating the increment pdfs. Therefore, using the NIG model and its parameters, we display the variations in the increments over a range of scales, revealing some unique scale-dependent qualities under various stability and flow conditions. This approach provides a method of characterizing increment fields with only four pdf parameters. We also investigate the capability of current state-of-the-art mesoscale atmospheric models to predict these features and highlight the potential for use in future model development. With the knowledge gained in this study, a number of applications can benefit from our methodology, including wind energy and optical wave propagation.
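
    One way to carry out the MLE step with an off-the-shelf tool is sketched below; synthetic draws stand in for measured increments, and scipy's four-parameter norminvgauss parameterization (shape a, skew b, location, scale) is assumed.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        # Stand-in for velocity increments at one separation scale.
        du = stats.norminvgauss(a=1.5, b=0.3).rvs(size=5000, random_state=rng)

        # Maximum likelihood fit of the four NIG parameters.
        a, b, loc, scale = stats.norminvgauss.fit(du)

        # A simple goodness-of-fit check via the Kolmogorov-Smirnov statistic.
        ks = stats.kstest(du, 'norminvgauss', args=(a, b, loc, scale))
        print(a, b, loc, scale, ks.statistic)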

  14. Temporal and Statistical Information in Causal Structure Learning

    ERIC Educational Resources Information Center

    McCormack, Teresa; Frosch, Caren; Patrick, Fiona; Lagnado, David

    2015-01-01

    Three experiments examined children's and adults' abilities to use statistical and temporal information to distinguish between common cause and causal chain structures. In Experiment 1, participants were provided with conditional probability information and/or temporal information and asked to infer the causal structure of a 3-variable mechanical…

  15. Fossil preservation and the stratigraphic ranges of taxa

    NASA Technical Reports Server (NTRS)

    Foote, M.; Raup, D. M.

    1996-01-01

    The incompleteness of the fossil record hinders the inference of evolutionary rates and patterns. Here, we derive relationships among true taxonomic durations, preservation probability, and observed taxonomic ranges. We use these relationships to estimate original distributions of taxonomic durations, preservation probability, and completeness (proportion of taxa preserved), given only the observed ranges. No data on occurrences within the ranges of taxa are required. When preservation is random and the original distribution of durations is exponential, the inference of durations, preservability, and completeness is exact. However, reasonable approximations are possible given non-exponential duration distributions and temporal and taxonomic variation in preservability. Thus, the approaches we describe have great potential in studies of taphonomy, evolutionary rates and patterns, and genealogy. Analyses of Upper Cambrian-Lower Ordovician trilobite species, Paleozoic crinoid genera, Jurassic bivalve species, and Cenozoic mammal species yield the following results: (1) The preservation probability inferred from stratigraphic ranges alone agrees with that inferred from the analysis of stratigraphic gaps when data on the latter are available. (2) Whereas median durations based on simple tabulations of observed ranges are biased by stratigraphic resolution, our estimates of median duration, extinction rate, and completeness are not biased. (3) The shorter geologic ranges of mammalian species relative to those of bivalves cannot be attributed to a difference in preservation potential. However, we cannot rule out the contribution of taxonomic practice to this difference. (4) In the groups studied, completeness (proportion of species [trilobites, bivalves, mammals] or genera [crinoids] preserved) ranges from 60% to 90%. The higher estimates of completeness at smaller geographic scales support previous suggestions that the incompleteness of the fossil record reflects loss of fossiliferous rock more than failure of species to enter the fossil record in the first place.
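
    For the exact exponential-durations, random-preservation case mentioned above, the per-interval preservation probability can be estimated from range-frequency counts alone; a minimal sketch of Foote and Raup's FreqRat estimator with hypothetical counts:

        # FreqRat (Foote & Raup 1996): R = f2**2 / (f1 * f3), where f_i is the
        # number of taxa whose observed range spans exactly i intervals.
        # The counts below are hypothetical.
        f1, f2, f3 = 120, 65, 40
        R = f2**2 / (f1 * f3)
        print(f"estimated per-interval preservation probability: {R:.2f}")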

  16. Estimation of post-test probabilities by residents: Bayesian reasoning versus heuristics?

    PubMed

    Hall, Stacey; Phang, Sen Han; Schaefer, Jeffrey P; Ghali, William; Wright, Bruce; McLaughlin, Kevin

    2014-08-01

    Although the process of diagnosing invariably begins with a heuristic, we encourage our learners to support their diagnoses by analytical cognitive processes, such as Bayesian reasoning, in an attempt to mitigate the effects of heuristics on diagnosing. There are, however, limited data on the use and/or impact of Bayesian reasoning on the accuracy of disease probability estimates. In this study, our objective was to explore whether Internal Medicine residents use a Bayesian process to estimate disease probabilities, by comparing their disease probability estimates to literature-derived Bayesian post-test probabilities. We gave 35 Internal Medicine residents four clinical vignettes in the form of a referral letter and asked them to estimate the post-test probability of the target condition in each case. We then compared these to literature-derived probabilities. For each vignette, the estimated probability was significantly different from the literature-derived probability. For the two cases with low literature-derived probability, our participants significantly overestimated the probability of these target conditions being the correct diagnosis, whereas for the two cases with high literature-derived probability the estimated probability was significantly lower than the calculated value. Our results suggest that residents generate inaccurate post-test probability estimates. Possible explanations include ineffective application of Bayesian reasoning, attribute substitution, whereby a complex cognitive task is replaced by an easier one (e.g., a heuristic), or systematic rater bias, such as central-tendency bias. Further studies are needed to identify the reasons for the inaccuracy of disease probability estimates and to explore ways of improving accuracy.
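
    For reference, the literature-derived benchmark in such comparisons is the Bayesian post-test probability, most easily computed in odds form; a minimal sketch with hypothetical vignette numbers:

        def post_test_probability(pretest: float, lr: float) -> float:
            """Bayes' theorem in odds form: post-odds = pre-odds x likelihood ratio."""
            pre_odds = pretest / (1.0 - pretest)
            post_odds = pre_odds * lr
            return post_odds / (1.0 + post_odds)

        # Hypothetical vignette: 20% pretest probability, positive test with LR+ = 6.
        print(f"{post_test_probability(0.20, 6.0):.2f}")  # -> 0.60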

  17. Framework for probabilistic flood risk assessment in an Alpine region

    NASA Astrophysics Data System (ADS)

    Schneeberger, Klaus; Huttenlau, Matthias; Steinberger, Thomas; Achleitner, Stefan; Stötter, Johann

    2014-05-01

    Flooding is among the natural hazards that regularly cause significant losses to property and human lives. The assessment of flood risk delivers crucial information for all participants involved in flood risk management, especially local authorities and insurance companies, in order to estimate possible flood losses. Therefore, a framework for assessing flood risk has been developed and is introduced in the present contribution. Flood risk is thereby defined as the combination of the probability of flood events and the potential flood damages. The probability of occurrence is described through the spatial and temporal characterisation of floods. The potential flood damages are determined in the course of a vulnerability assessment, whereby the exposure and the vulnerability of the elements at risk are considered. Direct costs caused by flooding, with a focus on residential buildings, are analysed. The innovative part of this contribution lies in the development of a framework which takes the probability of flood events and their spatio-temporal characteristics into account. Usually, the probability of flooding is determined by means of recurrence intervals for an entire catchment without any spatial variation, which may lead to a misinterpretation of the flood risk. Within the presented framework, the probabilistic flood risk assessment is based on the analysis of a large number of spatially correlated flood events. Since the number of historic flood events is relatively small, additional events have to be generated synthetically. This temporal extrapolation is realised by means of the method proposed by Heffernan and Tawn (2004), which is used to generate a large number of possible spatially correlated flood events within a larger catchment. The approach is based on the modelling of multivariate extremes, considering the spatial dependence structure of flood events. The inputs for this approach are time series derived from river gauging stations. In a next step, the historic and synthetic flood events are spatially interpolated from the point scale (i.e. river gauges) to the river network. For this, topological kriging (top-kriging), proposed by Skøien et al. (2006), is applied. Top-kriging considers the nested structure of river networks and is therefore suitable for regionalising flood characteristics. Thus, the characteristics of a large number of possible flood events can be transferred to arbitrary locations (e.g. the community level) on the river network within a study region. This framework has been used to generate a set of spatially correlated river flood events in the Austrian federal province of Vorarlberg. In addition, loss-probability curves for each community have been calculated based on official inundation maps from public authorities, the elements at risk, and their vulnerability. One location along the river network within each community serves as the interface between the set of flood events and the individual loss-probability relationships of the communities. Consequently, every flood event from the historic and synthetically generated dataset can be evaluated in monetary terms. Thus, a time series comprising a large number of flood events and their corresponding monetary losses serves as the basis for a probabilistic flood risk assessment, including expected annual losses and estimates of extreme event losses occurring over the course of a certain time period. The results provide essential decision support for primary insurers, reinsurance companies and public authorities in setting up a scale-adequate risk management.
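
    A sketch of the final step, deriving the expected annual damage and a loss-exceedance estimate from a long simulated event-loss series; the event frequency and loss magnitudes below are invented, not Vorarlberg results.

        import numpy as np

        rng = np.random.default_rng(2)

        # Hypothetical synthetic event set: 10,000 simulated years, Poisson
        # event occurrence, heavy-tailed per-event monetary losses (EUR).
        n_years = 10_000
        annual_loss = np.array([
            (rng.pareto(2.0, size=rng.poisson(0.8)) * 1e6).sum()
            for _ in range(n_years)
        ])

        # Expected annual damage and an empirical loss-probability curve.
        ead = annual_loss.mean()
        losses_desc = np.sort(annual_loss)[::-1]
        exceed_prob = np.arange(1, n_years + 1) / (n_years + 1)
        loss_100yr = np.interp(1 / 100, exceed_prob, losses_desc)
        print(f"EAD = {ead:,.0f} EUR, 100-year loss = {loss_100yr:,.0f} EUR")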

  18. The relationship between species detection probability and local extinction probability

    USGS Publications Warehouse

    Alpizar-Jara, R.; Nichols, J.D.; Hines, J.E.; Sauer, J.R.; Pollock, K.H.; Rosenberry, C.S.

    2004-01-01

    In community-level ecological studies, generally not all species present in sampled areas are detected. Many authors have proposed the use of estimation methods that allow detection probabilities that are < 1 and that are heterogeneous among species. These methods can also be used to estimate community-dynamic parameters such as species local extinction probability and turnover rates (Nichols et al. Ecol Appl 8:1213-1225; Conserv Biol 12:1390-1398). Here, we present an ad hoc approach to estimating community-level vital rates in the presence of joint heterogeneity of detection probabilities and vital rates. The method consists of partitioning the number of species into two groups using the detection frequencies and then estimating vital rates (e.g., local extinction probabilities) for each group. Estimators from each group are combined in a weighted estimator of vital rates that accounts for the effect of heterogeneity. Using data from the North American Breeding Bird Survey, we computed such estimates and tested the hypothesis that detection probabilities and local extinction probabilities were negatively related. Our analyses support the hypothesis that species detection probability covaries negatively with local probability of extinction and turnover rates. A simulation study was conducted to assess the performance of vital parameter estimators as well as other estimators relevant to questions about heterogeneity, such as coefficient of variation of detection probabilities and proportion of species in each group. Both the weighted estimator suggested in this paper and the original unweighted estimator for local extinction probability performed fairly well and provided no basis for preferring one to the other.
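
    The combination step is a simple weighted average; a minimal sketch with hypothetical numbers for the two detection-frequency groups:

        # Species are split into low- and high-detectability groups, vital
        # rates are estimated per group, and the community-level estimate is
        # weighted by the estimated number of species in each group.
        n_low, n_high = 38, 62            # estimated species per group (hypothetical)
        ext_low, ext_high = 0.25, 0.10    # group-specific local extinction estimates

        weighted = (n_low * ext_low + n_high * ext_high) / (n_low + n_high)
        print(f"community-level extinction probability: {weighted:.3f}")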

  19. Dynamics of Surfactant Clustering at Interfaces and Its Influence on the Interfacial Tension: Atomistic Simulation of a Sodium Hexadecane-Benzene Sulfonate-Tetradecane-Water System.

    PubMed

    Paredes, Ricardo; Fariñas-Sánchez, Ana Isabel; Medina-Rodríguez, Bryan; Samaniego, Samantha; Aray, Yosslen; Álvarez, Luis Javier

    2018-03-06

    The process of equilibration of the tetradecane-water interface in the presence of sodium hexadecane-benzene sulfonate is studied using intensive atomistic molecular dynamics simulations. Starting from an initial configuration with all of the surfactants at the interface, we find that the equilibration time of the interface (several microseconds) is orders of magnitude longer than previously reported simulation times. There is strong evidence that this slow equilibration is due to the aggregation of surfactant molecules at the interface. To establish this, the temporal evolution of the interfacial tension and the interfacial formation energy is studied, and their temporal variations are correlated with cluster formation. To study cluster evolution, the mean cluster size and the probability that a surfactant molecule chosen at random is free are obtained as functions of time. The cluster size distribution is estimated, and it is observed that some of the molecules remain free, whereas the rest agglomerate. Additionally, the temporal evolution of the interfacial thickness and the structure of the surfactant molecules at the interface are studied. It is observed how this structure depends on whether the molecules agglomerate or not.

  20. Temporal bird community dynamics are strongly affected by landscape fragmentation in a Central American tropical forest region

    USGS Publications Warehouse

    Blandón, A.C.; Perelman, S.B.; Ramírez, M.; López, A.; Javier, O.; Robbins, Chandler S.

    2016-01-01

    Habitat loss and fragmentation are considered the main causes of species extinctions, particularly in tropical ecosystems. The objective of this work was to evaluate the temporal dynamics of tropical bird communities in landscapes with different levels of fragmentation in eastern Guatemala. We evaluated five bird community dynamic parameters for forest specialists and generalists: (1) species extinction, (2) species turnover, (3) number of colonizing species, (4) relative species richness, and (5) a homogeneity index. For each of 24 landscapes, community dynamic parameters were estimated from bird point count data for the 1998–1999 and 2008–2009 periods, accounting for species' detection probability. Forest specialists had higher extinction rates and fewer colonizing species in landscapes with higher fragmentation, and thus lower species richness in both time periods. Forest generalists, in contrast, exhibited a completely different pattern, showing a curvilinear association with forest fragmentation for most parameters. Thus, greater community dynamism for forest generalists was observed in landscapes with intermediate levels of fragmentation. Our study supports general theory regarding the expected negative effects of habitat loss and fragmentation on the temporal dynamics of biotic communities, particularly for forest specialists, providing strong evidence from understudied tropical bird communities.

  1. Atmospheric Tracer Inverse Modeling Using Markov Chain Monte Carlo (MCMC)

    NASA Astrophysics Data System (ADS)

    Kasibhatla, P.

    2004-12-01

    In recent years, there has been an increasing emphasis on the use of Bayesian statistical estimation techniques to characterize the temporal and spatial variability of atmospheric trace gas sources and sinks. The applications have been varied in terms of the particular species of interest, as well as in terms of the spatial and temporal resolution of the estimated fluxes. However, one common characteristic has been the use of relatively simple statistical models for describing the measurement and chemical transport model error statistics and prior source statistics. For example, multivariate normal probability distribution functions (pdfs) are commonly used to model these quantities, and inverse source estimates are derived for fixed values of pdf parameters. While the advantage of this approach is that closed-form analytical solutions for the a posteriori pdfs of interest are available, it is worth exploring Bayesian analysis approaches which allow for a more general treatment of error and prior source statistics. Here, we present an application of the Markov Chain Monte Carlo (MCMC) methodology to an atmospheric tracer inversion problem to demonstrate how more general statistical models for errors can be incorporated into the analysis in a relatively straightforward manner. The MCMC approach to Bayesian analysis, which has found wide application in a variety of fields, is a statistical simulation approach that involves computing moments of interest of the a posteriori pdf by efficiently sampling this pdf. The specific inverse problem that we focus on is the annual mean CO2 source/sink estimation problem considered by the TransCom3 project. TransCom3 was a collaborative effort involving various modeling groups that followed a common modeling and analysis protocol. As such, this problem provides a convenient case study to demonstrate the applicability of the MCMC methodology to atmospheric tracer source/sink estimation problems.
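
    The flavor of the approach can be conveyed with a toy linear tracer inversion sampled by random-walk Metropolis; the transport matrix, noise level, and priors below are all invented, and a real application would use transport-model sensitivities.

        import numpy as np

        rng = np.random.default_rng(3)

        # Toy linear problem: observations y = H @ s + noise, Gaussian prior on s.
        H = rng.normal(size=(50, 4))
        s_true = np.array([2.0, -1.0, 0.5, 1.5])
        y = H @ s_true + rng.normal(0, 0.5, size=50)

        def log_post(s, sigma=0.5, prior_sd=5.0):
            log_like = -0.5 * np.sum((y - H @ s) ** 2) / sigma**2
            log_prior = -0.5 * np.sum(s**2) / prior_sd**2
            return log_like + log_prior

        # Random-walk Metropolis sampling of the a posteriori pdf.
        s, logp, chain = np.zeros(4), log_post(np.zeros(4)), []
        for _ in range(20_000):
            prop = s + rng.normal(0, 0.05, size=4)
            logp_prop = log_post(prop)
            if np.log(rng.uniform()) < logp_prop - logp:
                s, logp = prop, logp_prop
            chain.append(s)
        print("posterior mean:", np.array(chain)[5_000:].mean(axis=0))  # burn-in discarded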

  2. Multi-voxel patterns of visual category representation during episodic encoding are predictive of subsequent memory

    PubMed Central

    Kuhl, Brice A.; Rissman, Jesse; Wagner, Anthony D.

    2012-01-01

    Successful encoding of episodic memories is thought to depend on contributions from prefrontal and temporal lobe structures. Neural processes that contribute to successful encoding have been extensively explored through univariate analyses of neuroimaging data that compare mean activity levels elicited during the encoding of events that are subsequently remembered vs. those subsequently forgotten. Here, we applied pattern classification to fMRI data to assess the degree to which distributed patterns of activity within prefrontal and temporal lobe structures elicited during the encoding of word-image pairs were diagnostic of the visual category (Face or Scene) of the encoded image. We then assessed whether representation of category information was predictive of subsequent memory. Classification analyses indicated that temporal lobe structures contained information robustly diagnostic of visual category. Information in prefrontal cortex was less diagnostic of visual category, but was nonetheless associated with highly reliable classifier-based evidence for category representation. Critically, trials associated with greater classifier-based estimates of category representation in temporal and prefrontal regions were associated with a higher probability of subsequent remembering. Finally, consideration of trial-by-trial variance in classifier-based measures of category representation revealed positive correlations between prefrontal and temporal lobe representations, with the strength of these correlations varying as a function of the category of image being encoded. Together, these results indicate that multi-voxel representations of encoded information can provide unique insights into how visual experiences are transformed into episodic memories. PMID:21925190

  3. Prenatal choline supplementation increases sensitivity to time by reducing non-scalar sources of variance in adult temporal processing

    PubMed Central

    Cheng, Ruey-Kuang; Meck, Warren H.

    2009-01-01

    Choline supplementation of the maternal diet has a long-term facilitative effect on the timing and temporal memory of the offspring. To further delineate the impact of early nutritional status on interval timing, we examined the effects of prenatal choline supplementation on the temporal sensitivity of adult (6 mo) male rats. Rats that were given sufficient choline in their chow (CON: 1.1 g/kg) or supplemental choline added to their drinking water (SUP: 3.5 g/kg) during embryonic days (ED) 12–17 were trained with a peak-interval procedure that was shifted among 75%, 50%, and 25% probabilities of reinforcement with transitions from 18 s → 36 s → 72 s temporal criteria. Prenatal choline supplementation systematically sharpened interval-timing functions by reducing the associative/non-temporal response-enhancing effects of reinforcement probability on the Start response threshold, thereby reducing non-scalar sources of variance in the left-hand portion of the Gaussian-shaped response functions. No effect was observed for the Stop response threshold as a function of any of these manipulations. In addition, independence of peak time and peak rate was demonstrated as a function of reinforcement probability for both prenatal-choline-supplemented and control rats. Overall, these results suggest that prenatal choline supplementation facilitates timing by reducing impulsive responding early in the interval, thereby improving the superimposition of peak functions for different temporal criteria. PMID:17996223

  4. Modelling the 2013 North Aegean (Greece) seismic sequence: geometrical and frictional constraints, and aftershock probabilities

    NASA Astrophysics Data System (ADS)

    Karakostas, Vassilis; Papadimitriou, Eleftheria; Gospodinov, Dragomir

    2014-04-01

    The 2013 January 8 Mw 5.8 North Aegean earthquake sequence took place on one of the ENE-WSW-trending parallel dextral strike-slip fault branches in this area, in the continuation of the 1968 large (M = 7.5) rupture. The source mechanism of the main event indicates predominantly strike-slip faulting, in agreement with what is expected from regional seismotectonics. It was the largest event to have occurred in the area since the establishment of the Hellenic Unified Seismological Network (HUSN), with an adequate number of stations at close distances and full azimuthal coverage, thus providing the chance of an exhaustive analysis of its aftershock sequence. The main shock was followed by a handful of aftershocks with M ≥ 4.0 and tens with M ≥ 3.0. Relocation was performed using the recordings from HUSN and a proper crustal model for the area, along with time corrections for each station relative to the model used. Investigation of the spatial and temporal behaviour of seismicity revealed possible triggering of adjacent fault segments. Theoretical static stress changes from the main shock give a preliminary explanation for the aftershock distribution away from the main rupture. The off-fault seismicity is perfectly explained if μ > 0.5 and B = 0.0, evidencing high fault friction. In an attempt to forecast occurrence probabilities of the strong events (Mw ≥ 5.0), estimations were performed following the Restricted Epidemic Type Aftershock Sequence (RETAS) model. The identified best-fitting MOF model was used to execute one-day forecasts for such aftershocks and to follow the probability evolution in time during the sequence. Forecasting was also implemented on the basis of a temporal model of aftershock occurrence different from the modified Omori formula (the ETAS model), which resulted in a (though small) probability gain in strong-aftershock forecasting for the beginning of the sequence.
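
    The forecasting step can be illustrated with a non-homogeneous Poisson probability built on a modified Omori rate; the parameter values below are purely illustrative, not those of the RETAS fit.

        import numpy as np
        from scipy.integrate import quad

        K, c, p = 0.8, 0.05, 1.1    # illustrative Omori parameters, Mw >= 5.0 subset

        def omori_rate(t):          # events per day, t in days since the mainshock
            return K / (t + c) ** p

        # P(at least one Mw >= 5.0 event in [t, t+1] days) = 1 - exp(-integrated rate)
        for t in (1, 5, 10):
            mu, _ = quad(omori_rate, t, t + 1)
            print(f"day {t}-{t + 1}: P = {1 - np.exp(-mu):.2f}")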

  5. A coupled weather generator - rainfall-runoff approach on hourly time steps for flood risk analysis

    NASA Astrophysics Data System (ADS)

    Winter, Benjamin; Schneeberger, Klaus; Dung Nguyen, Viet; Vorogushyn, Sergiy; Huttenlau, Matthias; Merz, Bruno; Stötter, Johann

    2017-04-01

    The evaluation of the potential monetary damage of flooding is an essential part of flood risk management. One possibility for estimating the monetary risk is to analyse long time series of observed flood events and their corresponding damages. In reality, however, only few flood events are documented. This limitation can be overcome by generating a set of synthetic, physically and spatially plausible flood events and subsequently estimating the resulting monetary damages. In the present work, a set of synthetic flood events is generated by continuous rainfall-runoff simulation in combination with a coupled weather generator and a temporal disaggregation procedure for the study area of Vorarlberg (Austria). Most flood risk studies focus on daily time steps; however, the mesoscale alpine study area is characterized by short concentration times, leading to large differences between daily mean and daily maximum discharge. Accordingly, an hourly time step is needed for the simulations. The hourly meteorological input for the rainfall-runoff model is generated in a two-step approach. A synthetic daily dataset is generated by a multivariate and multisite weather generator and subsequently disaggregated to hourly time steps with a k-nearest-neighbour model. Following the event generation procedure, the negative consequences of flooding are analysed. The corresponding flood damage for each synthetic event is estimated by combining the synthetic discharge at representative points of the river network with a loss-probability relation for each community in the study area. The loss-probability relation is based on exposure and susceptibility analyses on a single-object basis (residential buildings) for certain return periods. For these impact analyses, official inundation maps of the study area are used. Finally, by analysing the total event time series of damages, the expected annual damage or the losses associated with a certain probability of occurrence can be estimated for the entire study area.

  6. Predicting spatial spread of rabies in skunk populations using surveillance data reported by the public

    PubMed Central

    Streicker, Daniel G.; Fischer, Justin W.; VerCauteren, Kurt C.; Gilbert, Amy T.

    2017-01-01

    Background: Prevention and control of wildlife disease invasions relies on the ability to predict spatio-temporal dynamics and understand the role of factors driving spread rates, such as seasonality and transmission distance. Passive disease surveillance (i.e., case reports by the public) is a common method of monitoring the emergence of wildlife diseases, but can be challenging to interpret due to spatial biases and limitations in data quantity and quality. Methodology/Principal findings: We obtained passive rabies surveillance data from dead striped skunks (Mephitis mephitis) in an epizootic in northern Colorado, USA. We developed a dynamic patch-occupancy model which predicts spatio-temporal spreading while accounting for heterogeneous sampling. We estimated the distance travelled per transmission event, direction of invasion, rate of spatial spread, and effects of infection density and season. We also estimated mean transmission distance and rates of spatial spread using a phylogeographic approach on a subsample of viral sequences from the same epizootic. Both the occupancy and phylogeographic approaches predicted similar rates of spatio-temporal spread. Estimated mean transmission distances were 2.3 km (95% Highest Posterior Density (HPD95): 0.02, 11.9; phylogeographic) and 3.9 km (95% credible intervals (CI95): 1.4, 11.3; occupancy). Estimated rates of spatial spread in km/year were: 29.8 (HPD95: 20.8, 39.8; phylogeographic, branch velocity, homogenous model), 22.6 (HPD95: 15.3, 29.7; phylogeographic, diffusion rate, homogenous model) and 21.1 (CI95: 16.7, 25.5; occupancy). Initial colonization probability was twice as high in spring relative to fall. Conclusions/Significance: Skunk-to-skunk transmission was primarily local (< 4 km), suggesting that if interventions were needed, they could be applied at the wave front. Slower viral invasions of skunk rabies in the western USA compared to a similar epizootic in raccoons in the eastern USA imply that host species or landscape factors underlie the dynamics of rabies invasions. Our framework provides a straightforward method for estimating rates of spatial spread of wildlife diseases. PMID:28759576

  7. PRODIGEN: visualizing the probability landscape of stochastic gene regulatory networks in state and time space.

    PubMed

    Ma, Chihua; Luciani, Timothy; Terebus, Anna; Liang, Jie; Marai, G Elisabeta

    2017-02-15

    Visualizing the complex probability landscape of stochastic gene regulatory networks can further biologists' understanding of phenotypic behavior associated with specific genes. We present PRODIGEN (PRObability DIstribution of GEne Networks), a web-based visual analysis tool for the systematic exploration of probability distributions over simulation time and state space in such networks. PRODIGEN was designed in collaboration with bioinformaticians who research stochastic gene networks. The analysis tool combines, in a novel way, existing, expanded, and new visual encodings to capture the time-varying characteristics of probability distributions: spaghetti plots over one-dimensional projections, heatmaps of distributions over 2D projections enhanced with overlaid time curves to display temporal changes, and novel individual glyphs of state information corresponding to particular peaks. We demonstrate the effectiveness of the tool through two case studies on the computed probabilistic landscape of a gene regulatory network and of a toggle-switch network. Domain expert feedback indicates that our visual approach can help biologists: (1) visualize probabilities of stable states, (2) explore the temporal probability distributions, and (3) discover small peaks in the probability landscape that have potential relation to specific diseases.

  8. A Physically-Based and Distributed Tool for Modeling the Hydrological and Mechanical Processes of Shallow Landslides

    NASA Astrophysics Data System (ADS)

    Arnone, E.; Noto, L. V.; Dialynas, Y. G.; Caracciolo, D.; Bras, R. L.

    2015-12-01

    This work presents the capabilities of the tRIBS-VEGGIE-Landslide model in two versions: one developed within a probabilistic framework, and one coupled with a root-cohesion module. The probabilistic version treats geotechnical and soil-retention-curve parameters as random variables across the basin and estimates theoretical probability distributions of slope stability and of the associated "factor of safety" commonly used to describe the occurrence of shallow landslides. The derived distributions are used to obtain the spatio-temporal dynamics of the probability of failure, conditioned on soil moisture dynamics at each watershed location. The framework has been tested in the Luquillo Experimental Forest (Puerto Rico), where shallow landslides are common. In particular, the methodology was used to evaluate how the spatial and temporal patterns of precipitation, whose variability is significant over the basin, affect the distribution of the probability of failure. The other version of the model accounts for the additional cohesion exerted by vegetation roots. The approach is to use the Fiber Bundle Model (FBM) framework, which allows the evaluation of root strength as a function of the stress-strain relationships of bundles of fibers. The model requires knowledge of the root architecture to evaluate the additional reinforcement from each root diameter class. The root architecture is represented with a branching topology model based on Leonardo's rule. The methodology has been tested on a simple case study to explore the role of both hydrological and mechanical root effects. Results demonstrate that the effects of root water uptake can at times be more significant than the mechanical reinforcement, and that the additional resistance provided by roots depends heavily on the vegetation root structure and length.
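
    For intuition on the stability side, a sketch of an infinite-slope factor of safety with a root-cohesion term; this is a standard formulation for shallow landslides, used here as a stand-in for the model's own stability equations, with invented parameter values.

        import numpy as np

        def factor_of_safety(theta_deg, z, c_soil, c_root, phi_deg, m,
                             gamma=19e3, gamma_w=9.81e3):
            """Infinite-slope FS: theta = slope angle, z = soil depth (m),
            c_soil/c_root = soil and root cohesion (Pa), phi = friction angle,
            m = saturated fraction of the soil column."""
            theta, phi = np.radians(theta_deg), np.radians(phi_deg)
            resisting = (c_soil + c_root +
                         (gamma - m * gamma_w) * z * np.cos(theta)**2 * np.tan(phi))
            driving = gamma * z * np.sin(theta) * np.cos(theta)
            return resisting / driving

        # Wetter soil (larger m) lowers FS; root cohesion raises it.
        print(factor_of_safety(35, 1.0, 2e3, 3e3, 33, m=0.2))  # ~1.4
        print(factor_of_safety(35, 1.0, 2e3, 3e3, 33, m=0.9))  # ~1.1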

  9. Probability density and exceedance rate functions of locally Gaussian turbulence

    NASA Technical Reports Server (NTRS)

    Mark, W. D.

    1989-01-01

    A locally Gaussian model of turbulence velocities is postulated which consists of the superposition of a slowly varying strictly Gaussian component representing slow temporal changes in the mean wind speed and a more rapidly varying locally Gaussian turbulence component possessing a temporally fluctuating local variance. Series expansions of the probability density and exceedance rate functions of the turbulence velocity model, based on Taylor's series, are derived. Comparisons of the resulting two-term approximations with measured probability density and exceedance rate functions of atmospheric turbulence velocity records show encouraging agreement, thereby confirming the consistency of the measured records with the locally Gaussian model. Explicit formulas are derived for computing all required expansion coefficients from measured turbulence records.

  10. Estimating earthquake magnitudes from reported intensities in the central and eastern United States

    USGS Publications Warehouse

    Boyd, Oliver; Cramer, Chris H.

    2014-01-01

    A new macroseismic intensity prediction equation is derived for the central and eastern United States and is used to estimate the magnitudes of the 1811–1812 New Madrid, Missouri, and 1886 Charleston, South Carolina, earthquakes. This work improves upon previous derivations of intensity prediction equations by including additional intensity data, correcting magnitudes in the intensity datasets to moment magnitude, and accounting for the spatial and temporal population distributions. The new relation leads to moment magnitude estimates for the New Madrid earthquakes that are toward the lower range of previous studies. Depending on the intensity dataset to which the new macroseismic intensity prediction equation is applied, mean estimates for the 16 December 1811, 23 January 1812, and 7 February 1812 mainshocks, and 16 December 1811 dawn aftershock range from 6.9 to 7.1, 6.8 to 7.1, 7.3 to 7.6, and 6.3 to 6.5, respectively. One‐sigma uncertainties on any given estimate could be as high as 0.3–0.4 magnitude units. We also estimate a magnitude of 6.9±0.3 for the 1886 Charleston, South Carolina, earthquake. We find a greater range of magnitude estimates when also accounting for multiple macroseismic intensity prediction equations. The inability to accurately and precisely ascertain magnitude from intensities increases the uncertainty of the central United States earthquake hazard by nearly a factor of two. Relative to the 2008 national seismic hazard maps, our range of possible 1811–1812 New Madrid earthquake magnitudes increases the coefficient of variation of seismic hazard estimates for Memphis, Tennessee, by 35%–42% for ground motions expected to be exceeded with a 2% probability in 50 years and by 27%–35% for ground motions expected to be exceeded with a 10% probability in 50 years.

  11. Plant-hummingbird interactions and temporal nectar availability in a restinga from Brazil.

    PubMed

    Fonseca, Lorena C N; Vizentin-Bugoni, Jeferson; Rech, André R; Alves, Maria Alice S

    2015-01-01

    Hummingbirds are the most important and specialized group of pollinating birds in the Neotropics, and their interactions with plants are key components of many communities. In the present study we identified the assemblage of plants visited by hummingbirds and investigated the temporal availability of floral resources in an area of restinga, the sandy-plain coastal vegetation associated with the Atlantic forest, in southeastern Brazil. We recorded flower and nectar features, flowering phenology, and interactions between plants and hummingbirds, and estimated the amount of calories produced per hectare from June 2005 to August 2006. Ten plant species were visited by two hummingbirds, Amazilia fimbriata and Eupetomena macroura. Resource availability was highly variable among plant species and over time. Nectar volume and concentration per flower were similar to other Neotropical hummingbird-visited plant assemblages. The estimated nectar resource availability between months varied from 0.85 to 5.97 kcal per hectare per day, demanding an area of between 1 and 6.8 ha to support a single hummingbird. Our study reports an unusual tropical setting where almost all interactions between hummingbirds and plants were performed by a single hummingbird species, A. fimbriata. Hence, the variable nectar availability is probably influencing hummingbird movements and foraging areas, and consequently plant pollination.

  12. A new method for ultrasound detection of interfacial position in gas-liquid two-phase flow.

    PubMed

    Coutinho, Fábio Rizental; Ofuchi, César Yutaka; de Arruda, Lúcia Valéria Ramos; Neves, Flávio; Morales, Rigoberto E M

    2014-05-22

    Ultrasonic measurement techniques for velocity estimation are currently widely used in fluid flow studies and applications. An accurate determination of interfacial position in gas-liquid two-phase flows is still an open problem. The quality of this information directly reflects on the accuracy of void fraction measurement, and it provides a means of discriminating velocity information of both phases. The algorithm known as Velocity Matched Spectrum (VM Spectrum) is a velocity estimator that stands out from other methods by returning a spectrum of velocities for each interrogated volume sample. Interface detection of free-rising bubbles in quiescent liquid presents some difficulties due to abrupt changes in interface inclination. In this work a method based on the velocity spectrum curve shape is used to generate a spatial-temporal mapping, which, after spatial filtering, yields an accurate contour of the air-water interface. It is shown that the proposed technique yields an RMS error between 1.71 and 3.39 and a probability of detection failure and false detection between 0.89% and 11.9% in determining the spatial-temporal gas-liquid interface position in the flow of free-rising bubbles in stagnant liquid. This result is valid both for a free path and with the transducer emitting through a metallic plate or a Plexiglas pipe.

  13. A New Method for Ultrasound Detection of Interfacial Position in Gas-Liquid Two-Phase Flow

    PubMed Central

    Coutinho, Fábio Rizental; Ofuchi, César Yutaka; de Arruda, Lúcia Valéria Ramos; Neves Jr., Flávio; Morales, Rigoberto E. M.

    2014-01-01

    Ultrasonic measurement techniques for velocity estimation are currently widely used in fluid flow studies and applications. An accurate determination of interfacial position in gas-liquid two-phase flows is still an open problem. The quality of this information directly reflects on the accuracy of void fraction measurement, and it provides a means of discriminating velocity information of both phases. The algorithm known as Velocity Matched Spectrum (VM Spectrum) is a velocity estimator that stands out from other methods by returning a spectrum of velocities for each interrogated volume sample. Interface detection of free-rising bubbles in quiescent liquid presents some difficulties due to abrupt changes in interface inclination. In this work a method based on the velocity spectrum curve shape is used to generate a spatial-temporal mapping, which, after spatial filtering, yields an accurate contour of the air-water interface. It is shown that the proposed technique yields an RMS error between 1.71 and 3.39 and a probability of detection failure and false detection between 0.89% and 11.9% in determining the spatial-temporal gas-liquid interface position in the flow of free-rising bubbles in stagnant liquid. This result is valid both for a free path and with the transducer emitting through a metallic plate or a Plexiglas pipe. PMID:24858961

  14. Resolution of deep eudicot phylogeny and their temporal diversification using nuclear genes from transcriptomic and genomic datasets.

    PubMed

    Zeng, Liping; Zhang, Ning; Zhang, Qiang; Endress, Peter K; Huang, Jie; Ma, Hong

    2017-05-01

    Explosive diversification is widespread in eukaryotes, making it difficult to resolve phylogenetic relationships. Eudicots contain c. 75% of extant flowering plants, are important for human livelihoods and terrestrial ecosystems, and have probably experienced explosive diversifications. Eudicot phylogenetic relationships, especially among those of the Pentapetalae, remain unresolved. Here, we present a highly supported eudicot phylogeny and diversification rate shifts using 31 newly generated transcriptomes and 88 other datasets covering 70% of eudicot orders. The highly supported eudicot phylogeny divides Pentapetalae into two groups: one with rosids, Saxifragales, Vitales and Santalales; the other containing asterids, Caryophyllales and Dilleniaceae, with uncertainty for Berberidopsidales. Molecular clock analysis estimated that crown eudicots originated c. 146 Ma, considerably earlier than the earliest tricolpate pollen fossils and most other molecular clock estimates, and that Pentapetalae sequentially diverged into eight major lineages within c. 15 Myr. Two identified increases of diversification rate are located on the stems leading to Pentapetalae and asterids, and lagged behind the gamma hexaploidization. The nuclear genes from newly generated transcriptomes revealed a well-resolved eudicot phylogeny, the sequential separation of major core eudicot lineages, and the temporal mode of diversifications, providing new insights into evolutionary trends in morphology and contributions to the diversification of eudicots. © 2017 The Authors. New Phytologist © 2017 New Phytologist Trust.

  15. Probability Theory Plus Noise: Descriptive Estimation and Inferential Judgment.

    PubMed

    Costello, Fintan; Watts, Paul

    2018-01-01

    We describe a computational model of two central aspects of people's probabilistic reasoning: descriptive probability estimation and inferential probability judgment. This model assumes that people's reasoning follows standard frequentist probability theory, but it is subject to random noise. This random noise has a regressive effect in descriptive probability estimation, moving probability estimates away from normative probabilities and toward the center of the probability scale. This random noise has an anti-regressive effect in inferential judgment, however. These regressive and anti-regressive effects explain various reliable and systematic biases seen in people's descriptive probability estimation and inferential probability judgment. This model predicts that these contrary effects will tend to cancel out in tasks that involve both descriptive estimation and inferential judgment, leading to unbiased responses in those tasks. We test this model by applying it to one such task, described by Gallistel et al. Participants' median responses in this task were unbiased, agreeing with normative probability theory over the full range of responses. Our model captures the pattern of unbiased responses in this task, while simultaneously explaining the systematic biases away from normatively correct probabilities seen in other tasks. Copyright © 2018 Cognitive Science Society, Inc.
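
    The regressive effect in descriptive estimation is easy to reproduce in simulation; a sketch assuming the simple read-noise mechanism described above (each retrieved sample is flipped with probability d, so the mean estimate is (1 - 2d)p + d):

        import numpy as np

        rng = np.random.default_rng(4)

        def noisy_estimate(p, d, n_samples=300):
            events = rng.random(n_samples) < p    # true event occurrences
            flips = rng.random(n_samples) < d     # noisy reads
            return np.mean(events ^ flips)        # proportion read as 'event'

        for p in (0.1, 0.5, 0.9):
            est = np.mean([noisy_estimate(p, d=0.15) for _ in range(2000)])
            print(f"true p = {p:.1f}, mean estimate = {est:.2f}")  # pulled toward 0.5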

  16. Sensitivity of a Bayesian atmospheric-transport inversion model to spatio-temporal sensor resolution applied to the 2006 North Korean nuclear test

    NASA Astrophysics Data System (ADS)

    Lundquist, K. A.; Jensen, D. D.; Lucas, D. D.

    2017-12-01

    Atmospheric source reconstruction allows for the probabilistic estimation of the source characteristics of an atmospheric release using observations of the release. Performance of the inversion depends partially on the temporal frequency and spatial scale of the observations. The objective of this study is to quantify the sensitivity of the source reconstruction method to sparse spatial and temporal observations. To this end, simulations of the atmospheric transport of noble gases are created for the 2006 nuclear test at the Punggye-ri nuclear test site. Synthetic observations are collected from the simulation and are taken as "ground truth". Data denial techniques are used to progressively coarsen the temporal and spatial resolution of the synthetic observations, while the source reconstruction model seeks to recover the true input parameters from the synthetic observations. Reconstructed parameters considered here are source location, source timing and source quantity. Reconstruction is achieved by running an ensemble of thousands of dispersion model runs that sample from a uniform distribution of the input parameters. Machine learning is used to train a computationally efficient surrogate model from the ensemble simulations. Monte Carlo sampling and Bayesian inversion are then used in conjunction with the surrogate model to quantify the posterior probability density functions of the source input parameters. This research seeks to inform decision makers of the tradeoffs between more expensive, high-frequency observations and less expensive, low-frequency observations.

  17. The effects of anterior arcuate and dorsomedial frontal cortex lesions on visually guided eye movements: 2. Paired and multiple targets.

    PubMed

    Schiller, P H; Chou, I

    2000-01-01

    This study examined the effects of anterior arcuate and dorsomedial frontal cortex lesions on the execution of saccadic eye movements made to paired and multiple targets in rhesus monkeys. Identical paired targets were presented with various temporal asynchronies to determine the temporal offset required to yield equal-probability choices to either target. In the intact animal, equal-probability choices were typically obtained when the targets appeared simultaneously. After unilateral anterior arcuate lesions, a major shift arose in the temporal offset required to obtain equal-probability choices for paired targets: the target in the hemifield contralateral to the lesion had to be presented more than 100 ms prior to the target in the ipsilateral hemifield. This deficit was still pronounced 1 year after the lesion. Dorsomedial frontal cortex lesions produced much smaller but significant shifts in target selection that recovered more rapidly. Paired lesions produced deficits similar to those observed with anterior arcuate lesions alone. Major deficits were also observed on a multiple-target temporal discrimination task after anterior arcuate but not after dorsomedial frontal cortex lesions. These results suggest that the frontal eye fields, which reside in the anterior bank of the arcuate sulcus, play an important role in temporal processing and in target selection. Dorsomedial frontal cortex, which contains the medial eye fields, plays a much less important role in the execution of these tasks.

  18. Wildfire Risk Mapping over the State of Mississippi: Land Surface Modeling Approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cooke, William H.; Mostovoy, Georgy; Anantharaj, Valentine G

    2012-01-01

    Three fire risk indexes based on soil moisture estimates were applied to simulate wildfire probability over the southern part of Mississippi using the logistic regression approach. The fire indexes were retrieved from: (1) the accumulated difference between daily precipitation and potential evapotranspiration (P-E); (2) the top 10 cm soil moisture content simulated by the Mosaic land surface model; and (3) the Keetch-Byram drought index (KBDI). The P-E, KBDI, and soil moisture based indexes were estimated from gridded atmospheric and Mosaic-simulated soil moisture data available from the North American Land Data Assimilation System (NLDAS-2). Normalized deviations of these indexes from the 31-year mean (1980-2010) were fitted into the logistic regression model describing the probability of wildfire occurrence as a function of the fire index. It was assumed that such normalization provides a more robust and adequate description of the temporal dynamics of soil moisture anomalies than the original (not normalized) set of indexes. The logistic model parameters were evaluated for 0.25° × 0.25° latitude/longitude cells and for the probability of at least one fire event occurring during 5 consecutive days. A 23-year (1986-2008) forest fire record was used. Two periods were selected and examined (January to mid-June and mid-September to December). The application of the logistic model provides an overall good agreement between empirical/observed and model-fitted fire probabilities over the study area during both seasons. The fire risk indexes based on the top 10 cm soil moisture and KBDI have the largest impact on the wildfire odds (increasing them by almost 2 times in response to each unit change of the corresponding fire risk index during the January to mid-June period, and by nearly 1.5 times during mid-September to December) observed over 0.25° × 0.25° cells located along the Mississippi coastline. This result suggests a rather strong control of fire risk indexes on fire occurrence probability over this region.
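
    A sketch of the core regression, fitting fire occurrence in 5-day windows against a normalized drought-index deviation for one cell; the data are simulated so that the fitted odds multiplier comes out near the reported factor of about 2.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(5)

        # Simulated records for one cell: normalized KBDI deviation from the
        # 31-year mean, and fire occurrence in the following 5-day window.
        kbdi_dev = rng.normal(0, 1, size=4000).reshape(-1, 1)
        p_true = 1 / (1 + np.exp(-(-3.0 + 0.7 * kbdi_dev.ravel())))
        fire = rng.random(4000) < p_true

        model = LogisticRegression().fit(kbdi_dev, fire)
        print(f"odds multiplier per unit deviation: {np.exp(model.coef_[0][0]):.2f}")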

  19. Groupwise registration of cardiac perfusion MRI sequences using normalized mutual information in high dimension

    NASA Astrophysics Data System (ADS)

    Hamrouni, Sameh; Rougon, Nicolas; Prêteux, Françoise

    2011-03-01

    In perfusion MRI (p-MRI) exams, short-axis (SA) image sequences are captured at multiple slice levels along the long axis of the heart during the transit of a vascular contrast agent (Gd-DTPA) through the cardiac chambers and muscle. Compensating for cardio-thoracic motions is a requirement for enabling computer-aided quantitative assessment of myocardial ischaemia from contrast-enhanced p-MRI sequences. The classical paradigm consists of registering each sequence frame on a reference image using some intensity-based matching criterion. In this paper, we introduce a novel unsupervised method for the spatio-temporal groupwise registration of cardiac p-MRI exams based on normalized mutual information (NMI) between high-dimensional feature distributions. Here, local contrast-enhancement curves are used as a dense set of spatio-temporal features and statistically matched through variational optimization to a target feature distribution derived from a registered reference template. The hard issue of probability density estimation in high-dimensional state spaces is bypassed by using consistent geometric entropy estimators, allowing NMI to be computed directly from feature samples. Specifically, a computationally efficient kth-nearest-neighbor (kNN) estimation framework is retained, leading to closed-form expressions for the gradient flow of NMI over finite- and infinite-dimensional motion spaces. This approach is applied to the groupwise alignment of cardiac p-MRI exams using a free-form deformation (FFD) model for cardio-thoracic motions. Experiments on simulated and natural datasets suggest its accuracy and robustness for registering p-MRI exams comprising more than 30 frames.
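
    The geometric entropy estimation underlying this kind of NMI computation can be sketched with the Kozachenko-Leonenko kNN estimator of differential entropy (one member of the family of consistent geometric estimators; the exact variant used in the paper may differ):

        import numpy as np
        from scipy.spatial import cKDTree
        from scipy.special import digamma, gamma

        def knn_entropy(x, k=4):
            """Kozachenko-Leonenko kNN estimate of differential entropy (nats)."""
            n, d = x.shape
            # Distance to the k-th neighbor; the query returns the point itself first.
            eps = cKDTree(x).query(x, k=k + 1)[0][:, -1]
            log_unit_ball = (d / 2) * np.log(np.pi) - np.log(gamma(d / 2 + 1))
            return digamma(n) - digamma(k) + log_unit_ball + d * np.mean(np.log(eps))

        # Sanity check against the analytic entropy of a standard 3-D Gaussian.
        rng = np.random.default_rng(6)
        x = rng.normal(size=(5000, 3))
        print(knn_entropy(x), 1.5 * np.log(2 * np.pi * np.e))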

  20. Active temporal multiplexing of indistinguishable heralded single photons

    PubMed Central

    Xiong, C.; Zhang, X.; Liu, Z.; Collins, M. J.; Mahendra, A.; Helt, L. G.; Steel, M. J.; Choi, D. -Y.; Chae, C. J.; Leong, P. H. W.; Eggleton, B. J.

    2016-01-01

    It is a fundamental challenge in quantum optics to deterministically generate indistinguishable single photons through non-deterministic nonlinear optical processes, due to the intrinsic coupling of single- and multi-photon-generation probabilities in these processes. Actively multiplexing photons generated in many temporal modes can decouple these probabilities, but key issues are to minimize resource requirements to allow scalability, and to ensure indistinguishability of the generated photons. Here we demonstrate the multiplexing of photons from four temporal modes solely using fibre-integrated optics and off-the-shelf electronic components. We show a 100% enhancement to the single-photon output probability without introducing additional multi-photon noise. Photon indistinguishability is confirmed by a fourfold Hong–Ou–Mandel quantum interference with a 91±16% visibility after subtracting multi-photon noise due to high pump power. Our demonstration paves the way for scalable multiplexing of many non-deterministic photon sources to a single near-deterministic source, which will be of benefit to future quantum photonic technologies. PMID:26996317
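
    The decoupling argument can be seen from the idealized (lossless) multiplexing arithmetic below; with switching and storage losses, the practical gain for four modes is smaller, consistent with the reported 100% enhancement.

        # With per-mode heralding probability p and N switched temporal modes,
        # P(at least one mode heralds a photon) = 1 - (1 - p)**N, while the
        # multi-photon noise per mode stays fixed (pump power unchanged).
        p = 0.1  # illustrative per-mode heralding probability
        for n_modes in (1, 2, 4):
            print(n_modes, 1 - (1 - p) ** n_modes)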

  1. Modeling storms improves estimates of long-term shoreline change

    NASA Astrophysics Data System (ADS)

    Frazer, L. Neil; Anderson, Tiffany R.; Fletcher, Charles H.

    2009-10-01

    Large storms make it difficult to extract the long-term trend of erosion or accretion from shoreline position data. Here we make storms part of the shoreline change model by means of a storm function. The data determine storm amplitudes and the rate at which the shoreline recovers from storms. Historical shoreline data are temporally sparse, and inclusion of all storms in one model over-fits the data, but a probability-weighted average model shows effects from all storms, illustrating how model averaging incorporates information from good models that might otherwise have been discarded as un-parsimonious. Data from Cotton Patch Hill, DE, yield a long-term shoreline loss rate of 0.49 ± 0.01 m/yr, about 16% less than published estimates. A minimum loss rate of 0.34 ± 0.01 m/yr is given by a model containing the 1929, 1962 and 1992 storms.
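
    A sketch of the model class: a linear long-term trend plus a storm function in which each storm imposes an offset that decays as the beach recovers; the exponential-recovery form and all numbers here are a plausible reading for illustration, not the paper's fitted model.

        import numpy as np

        def shoreline(t, y0, rate, storms, tau):
            """Trend + storm function; tau is the beach-recovery time scale (yr)."""
            y = y0 + rate * (t - t[0])
            for t_s, amp in storms:
                dt = np.clip(t - t_s, 0.0, None)
                y = y - amp * np.exp(-dt / tau) * (t >= t_s)
            return y

        t = np.linspace(1920, 2000, 400)
        storms = [(1929, 20.0), (1962, 35.0), (1992, 15.0)]  # year, offset (m)
        y = shoreline(t, y0=0.0, rate=-0.49, storms=storms, tau=5.0)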

  2. Environmental risk assessment of water quality in harbor areas: a new methodology applied to European ports.

    PubMed

    Gómez, Aina G; Ondiviela, Bárbara; Puente, Araceli; Juanes, José A

    2015-05-15

    This work presents a standard and unified procedure for assessment of environmental risks at the contaminant source level in port aquatic systems. Using this method, port managers and local authorities will be able to hierarchically classify environmental hazards and proceed with the most suitable management actions. This procedure combines rigorously selected parameters and indicators to estimate the environmental risk of each contaminant source based on its probability, consequences and vulnerability. The spatio-temporal variability of multiple stressors (agents) and receptors (endpoints) is taken into account to provide accurate estimations for application of precisely defined measures. The developed methodology is tested on a wide range of different scenarios via application in six European ports. The validation process confirms its usefulness, versatility and adaptability as a management tool for port water quality in Europe and worldwide. Copyright © 2015 Elsevier Ltd. All rights reserved.

  3. NIRS-SPM: statistical parametric mapping for near infrared spectroscopy

    NASA Astrophysics Data System (ADS)

    Tak, Sungho; Jang, Kwang Eun; Jung, Jinwook; Jang, Jaeduck; Jeong, Yong; Ye, Jong Chul

    2008-02-01

    Even though there exists a powerful statistical parametric mapping (SPM) tool for fMRI, similar public domain tools are not available for near infrared spectroscopy (NIRS). In this paper, we describe a new public domain statistical toolbox called NIRS-SPM for quantitative analysis of NIRS signals. Specifically, NIRS-SPM statistically analyzes the NIRS data using the general linear model (GLM) and makes inferences based on the excursion probability of random fields interpolated from the sparse measurements. In order to obtain correct inference, NIRS-SPM offers pre-coloring and pre-whitening methods for temporal correlation estimation. For simultaneous recording of NIRS signals with fMRI, the spatial mapping between the fMRI image and real coordinates from a 3-D digitizer is estimated using Horn's algorithm. These powerful tools allow super-resolution localization of brain activation, which is not possible using conventional NIRS analysis tools.

  4. Modelling above-ground carbon dynamics using multi-temporal airborne lidar: insights from a Mediterranean woodland

    NASA Astrophysics Data System (ADS)

    Simonson, W.; Ruiz-Benito, P.; Valladares, F.; Coomes, D.

    2015-09-01

    Woodlands represent highly significant carbon sinks globally, though they could lose this function under future climatic change. Effective large-scale monitoring of these woodlands has a critical role to play in mitigating for, and adapting to, climate change. Mediterranean woodlands have low carbon densities, but represent important global carbon stocks due to their extensiveness and are particularly vulnerable because the region is predicted to become much hotter and drier over the coming century. Airborne lidar is already recognized as an excellent approach for high-fidelity carbon mapping, but few studies have used multi-temporal lidar surveys to measure carbon fluxes in forests and none have worked with Mediterranean woodlands. We use a multi-temporal (five year interval) airborne lidar dataset for a region of central Spain to estimate above-ground biomass (AGB) and carbon dynamics in typical mixed broadleaved/coniferous Mediterranean woodlands. Field calibration of the lidar data enabled the generation of grid-based maps of AGB for 2006 and 2011, and the resulting AGB change was estimated. There was a close agreement between the lidar-based AGB growth estimate (1.22 Mg ha-1 year-1) and those derived from two independent sources: the Spanish National Forest Inventory, and a tree-ring based analysis (1.19 and 1.13 Mg ha-1 year-1, respectively). We parameterised a simple simulator of forest dynamics using the lidar carbon flux measurements, and used it to explore four scenarios of fire occurrence. Under undisturbed conditions (no fire occurrence) an accelerating accumulation of biomass and carbon is evident over the next 100 years with an average carbon sequestration rate of 1.95 Mg C ha-1 year-1. This rate reduces by almost a third when fire probability is increased to 0.01, as has been predicted under climate change. Our work shows the power of multi-temporal lidar surveying to map woodland carbon fluxes and provide parameters for carbon dynamics models. Space deployment of lidar instruments in the near future could open the way for rolling out wide-scale forest carbon stock monitoring to inform management and governance responses to future environmental change.

  5. Modelling above-ground carbon dynamics using multi-temporal airborne lidar: insights from a Mediterranean woodland

    NASA Astrophysics Data System (ADS)

    Simonson, W.; Ruiz-Benito, P.; Valladares, F.; Coomes, D.

    2016-02-01

    Woodlands represent highly significant carbon sinks globally, though they could lose this function under future climatic change. Effective large-scale monitoring of these woodlands has a critical role to play in mitigating for, and adapting to, climate change. Mediterranean woodlands have low carbon densities, but represent important global carbon stocks due to their extensiveness and are particularly vulnerable because the region is predicted to become much hotter and drier over the coming century. Airborne lidar is already recognized as an excellent approach for high-fidelity carbon mapping, but few studies have used multi-temporal lidar surveys to measure carbon fluxes in forests and none have worked with Mediterranean woodlands. We use a multi-temporal (5-year interval) airborne lidar data set for a region of central Spain to estimate above-ground biomass (AGB) and carbon dynamics in typical mixed broadleaved and/or coniferous Mediterranean woodlands. Field calibration of the lidar data enabled the generation of grid-based maps of AGB for 2006 and 2011, and the resulting AGB change was estimated. There was a close agreement between the lidar-based AGB growth estimate (1.22 Mg ha-1 yr-1) and those derived from two independent sources: the Spanish National Forest Inventory, and a tree-ring based analysis (1.19 and 1.13 Mg ha-1 yr-1, respectively). We parameterised a simple simulator of forest dynamics using the lidar carbon flux measurements, and used it to explore four scenarios of fire occurrence. Under undisturbed conditions (no fire) an accelerating accumulation of biomass and carbon is evident over the next 100 years with an average carbon sequestration rate of 1.95 Mg C ha-1 yr-1. This rate reduces by almost a third when fire probability is increased to 0.01 (fire return rate of 100 years), as has been predicted under climate change. Our work shows the power of multi-temporal lidar surveying to map woodland carbon fluxes and provide parameters for carbon dynamics models. Space deployment of lidar instruments in the near future could open the way for rolling out wide-scale forest carbon stock monitoring to inform management and governance responses to future environmental change.
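    The kind of simple fire-scenario simulator described in both versions of this record can be sketched as follows (Python). Only the growth rate is taken from the abstract; the initial stock, the linear accumulation (the paper reports accelerating accumulation), and the assumption that fire resets the carbon stock to zero are all simplifications.

        import numpy as np

        # Sketch of a fire-disturbance carbon simulator: carbon accumulates each
        # year and a fire (probability p_fire per year) resets the stock. The
        # growth rate is from the abstract; everything else is assumed.
        def simulate(p_fire, growth=1.95, years=100, n_runs=2000, seed=1):
            rng = np.random.default_rng(seed)
            totals = np.zeros(n_runs)
            for i in range(n_runs):
                c = 20.0                        # assumed initial carbon stock (Mg/ha)
                for _ in range(years):
                    c = 0.0 if rng.random() < p_fire else c + growth
                totals[i] = c
            return totals.mean()

        for p in (0.0, 0.01):
            print(f"p_fire={p}: mean carbon after 100 yr = {simulate(p):.1f} Mg/ha")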

  6. Impact of Partial Time Delay on Temporal Dynamics of Watts-Strogatz Small-World Neuronal Networks

    NASA Astrophysics Data System (ADS)

    Yan, Hao; Sun, Xiaojuan

    2017-06-01

    In this paper, we discuss the effects of partial time delay on the temporal dynamics of Watts-Strogatz (WS) small-world neuronal networks by controlling two parameters: the time delay τ and the probability of partial time delay pdelay. Temporal dynamics of WS small-world neuronal networks are characterized with the aid of temporal coherence and mean firing rate. The simulation results reveal that for small time delay τ, increasing pdelay weakens temporal coherence and increases the mean firing rate of the networks, indicating that it promotes neuronal firing while degrading firing regularity. For large time delay τ, temporal coherence and mean firing rate change little with respect to pdelay. Time delay τ itself always has a strong influence on both temporal coherence and mean firing rate, regardless of the value of pdelay. Moreover, analysis of spike trains and interspike-interval histograms of neurons inside the networks indicates that the effects of partial time delay on temporal coherence and mean firing rate may result from locking between the period of neuronal firing activity and the value of time delay τ. In brief, partial time delay can strongly influence the temporal dynamics of the neuronal networks.

  7. Estimating the concordance probability in a survival analysis with a discrete number of risk groups.

    PubMed

    Heller, Glenn; Mo, Qianxing

    2016-04-01

    A clinical risk classification system is an important component of a treatment decision algorithm. A measure used to assess the strength of a risk classification system is discrimination, and when the outcome is survival time, the most commonly applied global measure of discrimination is the concordance probability. The concordance probability represents the pairwise probability of lower patient risk given longer survival time. The c-index and the concordance probability estimate have been used to estimate the concordance probability when patient-specific risk scores are continuous. In the current paper, the concordance probability estimate and an inverse probability censoring weighted c-index are modified to account for discrete risk scores. Simulations are generated to assess the finite sample properties of the concordance probability estimate and the weighted c-index. An application of these measures of discriminatory power to a metastatic prostate cancer risk classification system is examined.
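    For intuition, an illustrative pairwise estimator of the concordance probability with discrete risk groups, ignoring censoring, is sketched below (Python). The paper's estimators additionally handle censoring, e.g. by inverse probability weighting, and treat ties more carefully.

        import itertools

        # Illustrative concordance probability for discrete risk groups, ignoring
        # censoring. Ties in risk count as half-concordant; pairs with tied
        # survival times carry no usable ordering and are skipped.
        def concordance(times, risks):
            conc = total = 0.0
            for (t1, r1), (t2, r2) in itertools.combinations(zip(times, risks), 2):
                if t1 == t2:
                    continue
                lo_risk = r1 if t1 > t2 else r2    # risk of the longer survivor
                hi_risk = r2 if t1 > t2 else r1
                total += 1
                if lo_risk < hi_risk:
                    conc += 1
                elif lo_risk == hi_risk:
                    conc += 0.5
            return conc / total

        times = [14, 3, 9, 21, 7, 12]              # survival times
        risks = [1, 3, 2, 1, 3, 2]                 # discrete risk groups 1-3
        print("concordance =", round(concordance(times, risks), 3))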

  8. Neural dynamics of reward probability coding: a Magnetoencephalographic study in humans

    PubMed Central

    Thomas, Julie; Vanni-Mercier, Giovanna; Dreher, Jean-Claude

    2013-01-01

    Prediction of future rewards and discrepancy between actual and expected outcomes (prediction error) are crucial signals for adaptive behavior. In humans, a number of fMRI studies demonstrated that reward probability modulates these two signals in a large brain network. Yet, the spatio-temporal dynamics underlying the neural coding of reward probability remain unknown. Here, using magnetoencephalography, we investigated the neural dynamics of prediction and reward prediction error computations while subjects learned to associate cues of slot machines with monetary rewards at different probabilities. We showed that event-related magnetic fields (ERFs) arising from the visual cortex coded the expected reward value 155 ms after the cue, demonstrating that reward value signals emerge early in the visual stream. Moreover, a prediction error was reflected in an ERF peaking 300 ms after the rewarded outcome and showing decreasing amplitude with higher reward probability. This prediction error signal was generated in a network including the anterior and posterior cingulate cortex. These findings pinpoint the spatio-temporal characteristics underlying reward probability coding. Together, our results provide insights into the neural dynamics underlying the ability to learn probabilistic stimulus-reward contingencies. PMID:24302894

  9. An efficient deterministic-probabilistic approach to modeling regional groundwater flow: 2. Application to Owens Valley, California

    USGS Publications Warehouse

    Guymon, Gary L.; Yen, Chung-Cheng

    1990-01-01

    The applicability of a deterministic-probabilistic model for predicting water tables in southern Owens Valley, California, is evaluated. The model is based on a two-layer deterministic model that is cascaded with a two-point probability model. To reduce the potentially large number of uncertain variables in the deterministic model, lumping of uncertain variables was evaluated by sensitivity analysis to reduce the total number of uncertain variables to three variables: hydraulic conductivity, storage coefficient or specific yield, and source-sink function. Results demonstrate that lumping of uncertain parameters reduces computational effort while providing sufficient precision for the case studied. Simulated spatial coefficients of variation for water table temporal position in most of the basin are small, which suggests that deterministic models can predict water tables in these areas with good precision. However, in several important areas where pumping occurs or the geology is complex, the simulated spatial coefficients of variation are overestimated by the two-point probability method.

  10. An efficient deterministic-probabilistic approach to modeling regional groundwater flow: 2. Application to Owens Valley, California

    NASA Astrophysics Data System (ADS)

    Guymon, Gary L.; Yen, Chung-Cheng

    1990-07-01

    The applicability of a deterministic-probabilistic model for predicting water tables in southern Owens Valley, California, is evaluated. The model is based on a two-layer deterministic model that is cascaded with a two-point probability model. To reduce the potentially large number of uncertain variables in the deterministic model, lumping of uncertain variables was evaluated by sensitivity analysis to reduce the total number of uncertain variables to three variables: hydraulic conductivity, storage coefficient or specific yield, and source-sink function. Results demonstrate that lumping of uncertain parameters reduces computational effort while providing sufficient precision for the case studied. Simulated spatial coefficients of variation for water table temporal position in most of the basin are small, which suggests that deterministic models can predict water tables in these areas with good precision. However, in several important areas where pumping occurs or the geology is complex, the simulated spatial coefficients of variation are overestimated by the two-point probability method.
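    A two-point probability method of the kind cascaded with the deterministic model can be sketched as follows (Python): the model is run at the mean plus or minus one standard deviation of each uncertain input and the 2^k outputs are combined. The head() function is a toy stand-in, not the authors' groundwater model, and the input statistics are hypothetical.

        import itertools
        import numpy as np

        # Sketch of a two-point (Rosenblueth-type) probability method: evaluate
        # the deterministic model at mu +/- sigma for each uncertain input and
        # summarize the 2^k outputs. Toy model and hypothetical statistics.
        def head(K, S, Q):
            """Toy stand-in for a deterministic water-table model."""
            return 100.0 - Q / (K * S)

        inputs = {                      # (mean, std) of uncertain variables
            "K": (10.0, 2.0),           # hydraulic conductivity
            "S": (0.15, 0.03),          # storage coefficient / specific yield
            "Q": (5.0, 1.0),            # source-sink term
        }

        outs = []
        for signs in itertools.product((-1, 1), repeat=len(inputs)):
            vals = [m + s * sg for (m, s), sg in zip(inputs.values(), signs)]
            outs.append(head(*vals))

        outs = np.array(outs)
        mean, std = outs.mean(), outs.std()
        print(f"mean head = {mean:.2f}, coefficient of variation = {std / abs(mean):.3f}")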

  11. Electroencephalogram-based indices applied to dogs' depth of anaesthesia monitoring.

    PubMed

    Brás, S; Georgakis, A; Ribeiro, L; Ferreira, D A; Silva, A; Antunes, L; Nunes, C S

    2014-12-01

    Hypnotic drug administration causes alterations in the electroencephalogram (EEG) in a dose-dependent manner. These changes cannot be identified easily in the raw EEG, therefore EEG-based indices were adopted for assessing depth of anaesthesia (DoA). This study examines several indices for estimating dogs' DoA. Data (EEG, clinical end-points) were collected from 8 dogs anaesthetized with propofol. EEG was initially collected without propofol. Then, propofol 1% was infused at 100 ml h⁻¹ (1000 mg h⁻¹) until a deep anaesthetic stage was reached. The infusion rate was temporarily increased to 200 ml h⁻¹ (2000 mg h⁻¹) to achieve 80% of burst suppression. Index performance was assessed by the correlation coefficient with the propofol concentrations, and by the prediction probability with the anaesthetic clinical end-points. The temporal entropy and the averaged instantaneous frequency were the best indices because they exhibit: (a) strong correlations with propofol concentrations, (b) high probabilities of predicting anaesthesia clinical end-points. Copyright © 2014 Elsevier Ltd. All rights reserved.

  12. Seismic potential for large and great interplate earthquakes along the Chilean and Southern Peruvian Margins of South America: A quantitative reappraisal

    NASA Astrophysics Data System (ADS)

    Nishenko, Stuart P.

    1985-04-01

    The seismic potential of the Chilean and southern Peruvian margins of South America is reevaluated to delineate those areas or segments of the margin that may be expected to experience large or great interplate earthquakes within the next 20 years (1984-2004). Long-term estimates of seismic potential (or the conditional probability of recurrence within a specified period of time) are based on (1) statistical analysis of historic repeat time data using Weibull distributions and (2) deterministic estimates of recurrence times based on the time-predictable model of earthquake recurrence. Both methods emphasize the periodic nature of large and great earthquake recurrence, and are compared with estimates of probability based on the assumption of Poisson-type behavior. The estimates of seismic potential presented in this study are long-term forecasts only, as the temporal resolution (or standard deviation) of both methods is taken to range from ±15% to ±25% of the average or estimated repeat time. At present, the Valparaiso region of central Chile (32°-35°S) has a high potential or probability of recurrence in the next 20 years. Coseismic uplift data associated with previous shocks in 1822 and 1906 suggest that this area may have already started to rerupture in 1971-1973. Average repeat times also suggest this area is due for a great shock within the next 20 years. Flanking segments of the Chilean margin, Coquimbo-Illapel (30°-32°S) and Talca-Concepcion (35°-38°S), presently have poorly constrained but possibly quite high potentials for a series of large or great shocks within the next 20 years. In contrast, the rupture zone of the great 1960 earthquake (37°-46°S) has the lowest potential along the margin and is not expected to rerupture in a great earthquake within the next 100 years. In the north, the seismic potentials of the Mollendo-Arica (17°-18°S) and Arica-Antofagasta (18°-24°S) segments (which last ruptured during great earthquakes in 1868 and 1877) are also high, but poorly constrained.
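    The conditional probability of recurrence from a Weibull repeat-time distribution follows from the survival function S: P(T <= t+w | T > t) = 1 - S(t+w)/S(t). The sketch below (Python) uses illustrative shape and scale values, not Nishenko's fitted parameters; the elapsed time follows the abstract's 1906-to-1984 example.

        import math

        # Conditional probability that a segment ruptures within the next 20
        # years, given it has been quiet for t years, for Weibull repeat times.
        # Shape and scale below are illustrative (scale chosen so the mean
        # repeat time is roughly 86 yr), not fitted values from the paper.
        def weibull_sf(t, shape, scale):
            return math.exp(-((t / scale) ** shape))

        def cond_prob(t_elapsed, window, shape, scale):
            s_now = weibull_sf(t_elapsed, shape, scale)
            s_later = weibull_sf(t_elapsed + window, shape, scale)
            return 1.0 - s_later / s_now

        # e.g. 78 yr elapsed since the 1906 Valparaiso shock (to 1984), 20 yr window
        print(round(cond_prob(78, 20, shape=3.0, scale=96.0), 3))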

  13. Estimating parameters for probabilistic linkage of privacy-preserved datasets.

    PubMed

    Brown, Adrian P; Randall, Sean M; Ferrante, Anna M; Semmens, James B; Boyd, James H

    2017-07-10

    Probabilistic record linkage is a process used to bring together person-based records from within the same dataset (de-duplication) or from disparate datasets using pairwise comparisons and matching probabilities. The linkage strategy and associated match probabilities are often estimated through investigations into data quality and manual inspection. However, as privacy-preserved datasets comprise encrypted data, such methods are not possible. In this paper, we present a method for estimating the probabilities and threshold values for probabilistic privacy-preserved record linkage using Bloom filters. Our method was tested through a simulation study using synthetic data, followed by an application using real-world administrative data. Synthetic datasets were generated with error rates from zero to 20% error. Our method was used to estimate parameters (probabilities and thresholds) for de-duplication linkages. Linkage quality was determined by F-measure. Each dataset was privacy-preserved using separate Bloom filters for each field. Match probabilities were estimated using the expectation-maximisation (EM) algorithm on the privacy-preserved data. Threshold cut-off values were determined by an extension to the EM algorithm allowing linkage quality to be estimated for each possible threshold. De-duplication linkages of each privacy-preserved dataset were performed using both estimated and calculated probabilities. Linkage quality using the F-measure at the estimated threshold values was also compared to the highest F-measure. Three large administrative datasets were used to demonstrate the applicability of the probability and threshold estimation technique on real-world data. Linkage of the synthetic datasets using the estimated probabilities produced an F-measure that was comparable to the F-measure using calculated probabilities, even with up to 20% error. Linkage of the administrative datasets using estimated probabilities produced an F-measure that was higher than the F-measure using calculated probabilities. Further, the threshold estimation yielded results for F-measure that were only slightly below the highest possible for those probabilities. The method appears highly accurate across a spectrum of datasets with varying degrees of error. As there are few alternatives for parameter estimation, the approach is a major step towards providing a complete operational approach for probabilistic linkage of privacy-preserved datasets.
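    The core EM step for match probabilities can be sketched as follows (Python, synthetic agreement data): a two-class mixture over binary field-agreement vectors in the Fellegi-Sunter style. Bloom-filter similarity scoring and the paper's threshold-selection extension are not shown.

        import numpy as np

        # Minimal EM sketch for match probabilities over binary agreement
        # vectors (1 = fields agree). Synthetic data: 10% true matches with
        # high per-field agreement (m) vs. low chance agreement (u).
        rng = np.random.default_rng(0)
        n_pairs, n_fields = 5000, 3
        is_match = rng.random(n_pairs) < 0.1
        gamma = np.where(is_match[:, None],
                         rng.random((n_pairs, n_fields)) < 0.9,
                         rng.random((n_pairs, n_fields)) < 0.2).astype(float)

        p, m, u = 0.5, np.full(n_fields, 0.8), np.full(n_fields, 0.4)
        for _ in range(50):
            # E-step: posterior probability that each pair is a match
            lm = p * np.prod(m**gamma * (1 - m)**(1 - gamma), axis=1)
            lu = (1 - p) * np.prod(u**gamma * (1 - u)**(1 - gamma), axis=1)
            w = lm / (lm + lu)
            # M-step: update mixture weight and per-field agreement probabilities
            p = w.mean()
            m = (w[:, None] * gamma).sum(0) / w.sum()
            u = ((1 - w)[:, None] * gamma).sum(0) / (1 - w).sum()

        print("P(match) =", round(p, 3), " m =", np.round(m, 2), " u =", np.round(u, 2))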

  14. A cautionary note on substituting spatial subunits for repeated temporal sampling in studies of site occupancy

    USGS Publications Warehouse

    Kendall, William L.; White, Gary C.

    2009-01-01

    1. Assessing the probability that a given site is occupied by a species of interest is important to resource managers, as well as metapopulation or landscape ecologists. Managers require accurate estimates of the state of the system, in order to make informed decisions. Models that yield estimates of occupancy, while accounting for imperfect detection, have proven useful by removing a potentially important source of bias. To account for detection probability, multiple independent searches per site for the species are required, under the assumption that the species is available for detection during each search of an occupied site. 2. We demonstrate that when multiple samples per site are defined by searching different locations within a site, absence of the species from a subset of these spatial subunits induces estimation bias when locations are exhaustively assessed or sampled without replacement. 3. We further demonstrate that this bias can be removed by choosing sampling locations with replacement, or if the species is highly mobile over a short period of time. 4. Resampling an existing data set does not mitigate bias due to exhaustive assessment of locations or sampling without replacement. 5. Synthesis and applications. Selecting sampling locations for presence/absence surveys with replacement is practical in most cases. Such an adjustment to field methods will prevent one source of bias, and therefore produce more robust statistical inferences about species occupancy. This will in turn permit managers to make resource decisions based on better knowledge of the state of the system.

  15. Improving PERSIANN-CCS rain estimation using probabilistic approach and multi-sensors information

    NASA Astrophysics Data System (ADS)

    Karbalaee, N.; Hsu, K. L.; Sorooshian, S.; Kirstetter, P.; Hong, Y.

    2016-12-01

    This presentation discusses the recent implemented approaches to improve the rainfall estimation from Precipitation Estimation from Remotely Sensed Information using Artificial Neural Network-Cloud Classification System (PERSIANN-CCS). PERSIANN-CCS is an infrared (IR) based algorithm being integrated in the IMERG (Integrated Multi-Satellite Retrievals for the Global Precipitation Mission GPM) to create a precipitation product in 0.1x0.1degree resolution over the chosen domain 50N to 50S every 30 minutes. Although PERSIANN-CCS has a high spatial and temporal resolution, it overestimates or underestimates due to some limitations.PERSIANN-CCS can estimate rainfall based on the extracted information from IR channels at three different temperature threshold levels (220, 235, and 253k). This algorithm relies only on infrared data to estimate rainfall indirectly from this channel which cause missing the rainfall from warm clouds and false estimation for no precipitating cold clouds. In this research the effectiveness of using other channels of GOES satellites such as visible and water vapors has been investigated. By using multi-sensors the precipitation can be estimated based on the extracted information from multiple channels. Also, instead of using the exponential function for estimating rainfall from cloud top temperature, the probabilistic method has been used. Using probability distributions of precipitation rates instead of deterministic values has improved the rainfall estimation for different type of clouds.

  16. The contribution of threat probability estimates to reexperiencing symptoms: a prospective analog study.

    PubMed

    Regambal, Marci J; Alden, Lynn E

    2012-09-01

    Individuals with posttraumatic stress disorder (PTSD) are hypothesized to have a "sense of current threat." Perceived threat from the environment (i.e., external threat), can lead to overestimating the probability of the traumatic event reoccurring (Ehlers & Clark, 2000). However, it is unclear if external threat judgments are a pre-existing vulnerability for PTSD or a consequence of trauma exposure. We used trauma analog methodology to prospectively measure probability estimates of a traumatic event, and investigate how these estimates were related to cognitive processes implicated in PTSD development. 151 participants estimated the probability of being in car-accident related situations, watched a movie of a car accident victim, and then completed a measure of data-driven processing during the movie. One week later, participants re-estimated the probabilities, and completed measures of reexperiencing symptoms and symptom appraisals/reactions. Path analysis revealed that higher pre-existing probability estimates predicted greater data-driven processing which was associated with negative appraisals and responses to intrusions. Furthermore, lower pre-existing probability estimates and negative responses to intrusions were both associated with a greater change in probability estimates. Reexperiencing symptoms were predicted by negative responses to intrusions and, to a lesser degree, by greater changes in probability estimates. The undergraduate student sample may not be representative of the general public. The reexperiencing symptoms are less severe than what would be found in a trauma sample. Threat estimates present both a vulnerability and a consequence of exposure to a distressing event. Furthermore, changes in these estimates are associated with cognitive processes implicated in PTSD. Copyright © 2012 Elsevier Ltd. All rights reserved.

  17. Pretest expectations strongly influence interpretation of abnormal laboratory results and further management

    PubMed Central

    2010-01-01

    Background Abnormal results of diagnostic laboratory tests can be difficult to interpret when disease probability is very low. Although most physicians generally do not use Bayesian calculations to interpret abnormal results, their estimates of pretest disease probability and reasons for ordering diagnostic tests may - in a more implicit manner - influence test interpretation and further management. A better understanding of this influence may help to improve test interpretation and management. Therefore, the objective of this study was to examine the influence of physicians' pretest disease probability estimates, and their reasons for ordering diagnostic tests, on test result interpretation, posttest probability estimates and further management. Methods Prospective study among 87 primary care physicians in the Netherlands who each ordered laboratory tests for 25 patients. They recorded their reasons for ordering the tests (to exclude or confirm disease or to reassure patients) and their pretest disease probability estimates. Upon receiving the results they recorded how they interpreted the tests, their posttest probability estimates and further management. Logistic regression was used to analyse whether the pretest probability and the reasons for ordering tests influenced the interpretation, the posttest probability estimates and the decisions on further management. Results The physicians ordered tests for diagnostic purposes for 1253 patients; 742 patients had an abnormal result (64%). Physicians' pretest probability estimates and their reasons for ordering diagnostic tests influenced test interpretation, posttest probability estimates and further management. Abnormal results of tests ordered for reasons of reassurance were significantly more likely to be interpreted as normal (65.8%) compared to tests ordered to confirm a diagnosis or exclude a disease (27.7% and 50.9%, respectively). The odds for abnormal results to be interpreted as normal were much lower when the physician estimated a high pretest disease probability, compared to a low pretest probability estimate (OR = 0.18, 95% CI = 0.07-0.52, p < 0.001). Conclusions Interpretation and management of abnormal test results were strongly influenced by physicians' estimation of pretest disease probability and by the reason for ordering the test. By relating abnormal laboratory results to their pretest expectations, physicians may seek a balance between over- and under-reacting to laboratory test results. PMID:20158908

  18. Pretest expectations strongly influence interpretation of abnormal laboratory results and further management.

    PubMed

    Houben, Paul H H; van der Weijden, Trudy; Winkens, Bjorn; Winkens, Ron A G; Grol, Richard P T M

    2010-02-16

    Abnormal results of diagnostic laboratory tests can be difficult to interpret when disease probability is very low. Although most physicians generally do not use Bayesian calculations to interpret abnormal results, their estimates of pretest disease probability and reasons for ordering diagnostic tests may--in a more implicit manner--influence test interpretation and further management. A better understanding of this influence may help to improve test interpretation and management. Therefore, the objective of this study was to examine the influence of physicians' pretest disease probability estimates, and their reasons for ordering diagnostic tests, on test result interpretation, posttest probability estimates and further management. Prospective study among 87 primary care physicians in the Netherlands who each ordered laboratory tests for 25 patients. They recorded their reasons for ordering the tests (to exclude or confirm disease or to reassure patients) and their pretest disease probability estimates. Upon receiving the results they recorded how they interpreted the tests, their posttest probability estimates and further management. Logistic regression was used to analyse whether the pretest probability and the reasons for ordering tests influenced the interpretation, the posttest probability estimates and the decisions on further management. The physicians ordered tests for diagnostic purposes for 1253 patients; 742 patients had an abnormal result (64%). Physicians' pretest probability estimates and their reasons for ordering diagnostic tests influenced test interpretation, posttest probability estimates and further management. Abnormal results of tests ordered for reasons of reassurance were significantly more likely to be interpreted as normal (65.8%) compared to tests ordered to confirm a diagnosis or exclude a disease (27.7% and 50.9%, respectively). The odds for abnormal results to be interpreted as normal were much lower when the physician estimated a high pretest disease probability, compared to a low pretest probability estimate (OR = 0.18, 95% CI = 0.07-0.52, p < 0.001). Interpretation and management of abnormal test results were strongly influenced by physicians' estimation of pretest disease probability and by the reason for ordering the test. By relating abnormal laboratory results to their pretest expectations, physicians may seek a balance between over- and under-reacting to laboratory test results.
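    The Bayesian calculation that most physicians skip is short; the worked sketch below (Python, with illustrative sensitivity and specificity, not values from the study) shows why an abnormal result at a very low pretest probability still leaves a low posttest probability, consistent with such results often being read as normal.

        # Worked Bayes example: posttest probability of disease after an
        # abnormal (positive) result. Sensitivity/specificity are illustrative.
        def posttest(pretest, sens=0.9, spec=0.9):
            lr_pos = sens / (1 - spec)               # positive likelihood ratio
            odds = pretest / (1 - pretest) * lr_pos  # posterior odds
            return odds / (1 + odds)

        for pre in (0.01, 0.10, 0.50):
            print(f"pretest {pre:.0%} -> posttest {posttest(pre):.0%}")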

  19. Temporal rainfall estimation using input data reduction and model inversion

    NASA Astrophysics Data System (ADS)

    Wright, A. J.; Vrugt, J. A.; Walker, J. P.; Pauwels, V. R. N.

    2016-12-01

    Floods are devastating natural hazards. To provide accurate, precise and timely flood forecasts there is a need to understand the uncertainties associated with temporal rainfall and model parameters. The estimation of temporal rainfall and model parameter distributions from streamflow observations in complex dynamic catchments adds skill to current areal rainfall estimation methods, allows for the uncertainty of rainfall input to be considered when estimating model parameters and provides the ability to estimate rainfall from poorly gauged catchments. Current methods to estimate temporal rainfall distributions from streamflow are unable to adequately explain and invert complex non-linear hydrologic systems. This study uses the Discrete Wavelet Transform (DWT) to reduce rainfall dimensionality for the catchment of Warwick, Queensland, Australia. The reduction of rainfall to DWT coefficients allows the input rainfall time series to be simultaneously estimated along with model parameters. The estimation process is conducted using multi-chain Markov chain Monte Carlo simulation with the DREAMZS algorithm. The use of a likelihood function that considers both rainfall and streamflow error allows for model parameter and temporal rainfall distributions to be estimated. Estimating the wavelet approximation coefficients of lower-order decomposition structures yielded the most realistic temporal rainfall distributions. These rainfall estimates were all able to simulate streamflow that was superior to the results of a traditional calibration approach. It is shown that the choice of wavelet has a considerable impact on the robustness of the inversion. The results demonstrate that streamflow data contain sufficient information to estimate temporal rainfall and model parameter distributions. The range and variance of rainfall time series able to simulate streamflow superior to that of a traditional calibration approach demonstrate equifinality. The use of a likelihood function that considers both rainfall and streamflow error, combined with the use of the DWT as a model data reduction technique, allows the joint inference of hydrologic model parameters along with rainfall.
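    The dimensionality-reduction step can be sketched with the PyWavelets library (Python, synthetic rainfall): the series is represented by its low-order approximation coefficients and reconstructed. The MCMC inversion of those coefficients against streamflow is not shown, and the wavelet choice and level are assumptions.

        import numpy as np
        import pywt  # PyWavelets

        # Sketch of the DWT data-reduction step: keep only the approximation
        # coefficients of a rainfall series and reconstruct a smoothed version.
        rng = np.random.default_rng(2)
        rain = rng.gamma(0.3, 5.0, 256) * (rng.random(256) < 0.3)  # intermittent rain

        coeffs = pywt.wavedec(rain, "db4", level=4)
        n_kept = len(coeffs[0])                       # approximation coefficients only
        zeroed = [coeffs[0]] + [np.zeros_like(c) for c in coeffs[1:]]
        smooth = pywt.waverec(zeroed, "db4")[: len(rain)]

        print(f"kept {n_kept} of {rain.size} values; "
              f"total rain {rain.sum():.1f} vs reconstructed {smooth.sum():.1f}")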

  20. The Relationship between Spatial and Temporal Magnitude Estimation of Scientific Concepts at Extreme Scales

    NASA Astrophysics Data System (ADS)

    Price, Aaron; Lee, H.

    2010-01-01

    Many astronomical objects, processes, and events exist and occur at extreme scales of spatial and temporal magnitudes. Our research draws upon the psychological literature, replete with evidence of linguistic and metaphorical links between the spatial and temporal domains, to compare how students estimate spatial and temporal magnitudes associated with objects and processes typically taught in science class. We administered spatial and temporal scale estimation tests, with many astronomical items, to 417 students enrolled in 12 undergraduate science courses. Results show that while the temporal test was more difficult, students’ overall performance patterns between the two tests were mostly similar. However, asymmetrical correlations between the two tests indicate that students think of the extreme ranges of spatial and temporal scales in different ways, which is likely influenced by their classroom experience. When making incorrect estimations, students tended to underestimate the difference between the everyday scale and the extreme scales on both tests. This suggests the use of a common logarithmic mental number line for both spatial and temporal magnitude estimation. However, there are differences between the two tests in the errors students make in the everyday range. Among the implications discussed is the use of spatio-temporal reference frames, instead of smooth bootstrapping, to help students maneuver between scales of magnitude and the use of logarithmic transformations between reference frames. Implications for astronomy range from learning about spectra to large-scale galaxy structure.

  1. Probabilistic estimation of residential air exchange rates for ...

    EPA Pesticide Factsheets

    Residential air exchange rates (AERs) are a key determinant in the infiltration of ambient air pollution indoors. Population-based human exposure models using probabilistic approaches to estimate personal exposure to air pollutants have relied on input distributions from AER measurements. An algorithm for probabilistically estimating AER was developed based on the Lawrence Berkeley National Laboratory infiltration model utilizing housing characteristics and meteorological data with adjustment for window opening behavior. The algorithm was evaluated by comparing modeled and measured AERs in four US cities (Los Angeles, CA; Detroit, MI; Elizabeth, NJ; and Houston, TX) inputting study-specific data. The impact on the modeled AER of using publicly available housing data representative of the region for each city was also assessed. Finally, modeled AER based on region-specific inputs was compared with those estimated using literature-based distributions. While modeled AERs were similar in magnitude to the measured AERs, they were consistently lower for all cities except Houston. AERs estimated using region-specific inputs were lower than those using study-specific inputs due to differences in window opening probabilities. The algorithm produced more spatially and temporally variable AERs compared with literature-based distributions, reflecting within- and between-city differences and helping reduce error in estimates of air pollutant exposure.
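    The infiltration form that such an algorithm builds on can be sketched as below (Python): airflow through an effective leakage area driven by stack and wind terms, divided by house volume. The stack and wind coefficients depend on house height and wind shielding and should be taken from published tables; the values passed here are placeholder magnitudes, and the paper's window-opening adjustment is omitted.

        import math

        # Sketch of an LBL-style infiltration calculation. Coefficients c_stack
        # and c_wind are placeholders and must come from published tables for
        # the house configuration; the window-opening adjustment is omitted.
        def aer_lbl(ela_m2, volume_m3, dT_K, wind_ms, c_stack, c_wind):
            """Air exchange rate (1/h): Q = ELA*sqrt(Cs*|dT| + Cw*U^2), AER = 3600*Q/V."""
            q = ela_m2 * math.sqrt(c_stack * abs(dT_K) + c_wind * wind_ms ** 2)  # m3/s
            return 3600.0 * q / volume_m3

        # Placeholder inputs: 0.05 m2 leakage area, 400 m3 house, 10 K
        # indoor-outdoor temperature difference, 4 m/s wind.
        print(round(aer_lbl(0.05, 400.0, 10.0, 4.0, c_stack=1.45e-4, c_wind=1.04e-4), 3))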

  2. Spatiotemporal modelling of groundwater extraction in semi-arid central Queensland, Australia

    NASA Astrophysics Data System (ADS)

    Keir, Greg; Bulovic, Nevenka; McIntyre, Neil

    2016-04-01

    The semi-arid Surat Basin in central Queensland, Australia, forms part of the Great Artesian Basin, a groundwater resource of national significance. While this area relies heavily on groundwater supply bores to sustain agricultural industries and rural life in general, measurement of groundwater extraction rates is very limited. Consequently, regional groundwater extraction rates are not well known, which may have implications for regional numerical groundwater modelling. However, flows from a small number of bores are metered, and less precise anecdotal estimates of extraction are increasingly available. There is also an increasing number of other spatiotemporal datasets which may help predict extraction rates (e.g. rainfall, temperature, soils, stocking rates etc.). These can be used to construct spatial multivariate regression models to estimate extraction. The data exhibit complicated statistical features, such as zero-valued observations, non-Gaussianity, and non-stationarity, which limit the use of many classical estimation techniques, such as kriging. As well, water extraction histories may exhibit temporal autocorrelation. To account for these features, we employ a separable space-time model to predict bore extraction rates using the R-INLA package for computationally efficient Bayesian inference. A joint approach is used to model both the probability (using a binomial likelihood) and magnitude (using a gamma likelihood) of extraction. The correlation between extraction rates in space and time is modelled using a Gaussian Markov Random Field (GMRF) with a Matérn spatial covariance function which can evolve over time according to an autoregressive model. To reduce computational burden, we allow the GMRF to be evaluated at a relatively coarse temporal resolution, while still allowing predictions to be made at arbitrarily small time scales. We describe the process of model selection and inference using an information criterion approach, and present some preliminary results from the study area. We conclude by discussing issues related with upscaling of the modelling approach to the entire basin, including merging of extraction rate observations with different precision, temporal resolution, and even potentially different likelihoods.

  3. Transmission parameters estimated for Salmonella typhimurium in swine using susceptible-infectious-resistant models and a Bayesian approach

    PubMed Central

    2014-01-01

    Background Transmission models can aid understanding of disease dynamics and are useful in testing the efficiency of control measures. The aim of this study was to formulate an appropriate stochastic Susceptible-Infectious-Resistant/Carrier (SIR) model for Salmonella Typhimurium in pigs and thus estimate the transmission parameters between states. Results The transmission parameters were estimated using data from a longitudinal study of three Danish farrow-to-finish pig herds known to be infected. A Bayesian model framework was proposed, which comprised Binomial components for the transition from susceptible to infectious and from infectious to carrier; and a Poisson component for carrier to infectious. Cohort random effects were incorporated into these models to allow for unobserved cohort-specific variables as well as unobserved sources of transmission, thus enabling a more realistic estimation of the transmission parameters. In the case of the transition from susceptible to infectious, the cohort random effects were also time varying. The number of infectious pigs not detected by the parallel testing was treated as unknown, and the probability of non-detection was estimated using information about the sensitivity and specificity of the bacteriological and serological tests. The estimate of the transmission rate from susceptible to infectious was 0.33 [0.06, 1.52], from infectious to carrier was 0.18 [0.14, 0.23] and from carrier to infectious was 0.01 [0.0001, 0.04]. The estimate for the basic reproduction ratio (R0) was 1.91 [0.78, 5.24]. The probability of non-detection was estimated to be 0.18 [0.12, 0.25]. Conclusions The proposed framework for stochastic SIR models was successfully implemented to estimate transmission rate parameters for Salmonella Typhimurium in swine field data. R0 was 1.91, implying that there was dissemination of the infection within pigs of the same cohort. There was significant temporal-cohort variability, especially at the susceptible to infectious stage. The model adequately fitted the data, allowing for both observed and unobserved sources of uncertainty (cohort effects, diagnostic test sensitivity), so leading to more reliable estimates of transmission parameters. PMID:24774444
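    A discrete-time chain-binomial sketch of the susceptible-infectious-carrier transitions, using the posterior-mean rates quoted above, is given below (Python). The time step, the cohort random effects and the detection model of the paper are all omitted, so this only illustrates how the estimated rates drive within-cohort spread.

        import numpy as np

        # Chain-binomial sketch of S -> I -> C dynamics with carrier reactivation,
        # using the abstract's posterior-mean rates. Time step and all structural
        # simplifications are assumptions, not the paper's model.
        beta, gamma, delta = 0.33, 0.18, 0.01   # S->I, I->C, C->I rates
        rng = np.random.default_rng(3)

        S, I, C = 29, 1, 0                      # one infectious pig in a cohort of 30
        for t in range(30):
            n = S + I + C
            p_inf = 1 - np.exp(-beta * I / n)   # per-susceptible infection probability
            new_inf = rng.binomial(S, p_inf)
            new_car = rng.binomial(I, 1 - np.exp(-gamma))
            react = rng.binomial(C, 1 - np.exp(-delta))
            S -= new_inf
            I += new_inf - new_car + react
            C += new_car - react
        print("final state S, I, C =", S, I, C)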

  4. Making Energy-Water Nexus Scenarios more Fit-for-Purpose through Better Characterization of Extremes

    NASA Astrophysics Data System (ADS)

    Yetman, G.; Levy, M. A.; Chen, R. S.; Schnarr, E.

    2017-12-01

    Often quantitative scenarios of future trends exhibit less variability than the historic data upon which the models that generate them are based. The problem of dampened variability, which typically also entails dampened extremes, manifests both temporally and spatially. As a result, risk assessments that rely on such scenarios are in danger of producing misleading results. This danger is pronounced in nexus issues, because of the multiple dimensions of change that are relevant. We illustrate the above problem by developing alternative joint distributions of the probability of drought and of human population totals, across U.S. counties over the period 2010-2030. For the dampened-extremes case we use drought frequencies derived from climate models used in the U.S. National Climate Assessment and the Environmental Protection Agency's population and land use projections contained in its Integrated Climate and Land Use Scenarios (ICLUS). For the elevated-extremes case we use an alternative spatial drought frequency estimate based on tree-ring data, covering a 555-year period (Ho et al 2017); and we introduce greater temporal and spatial extremes in the ICLUS socioeconomic projections so that they conform to observed extremes in the historical U.S. spatial census data 1790-present (National Historical Geographic Information System). We use spatial and temporal coincidence of high population and extreme drought as a proxy for energy-water nexus risk. We compare the representation of risk in the dampened-extreme and elevated-extreme scenario analyses. We identify areas of the country where using more realistic portrayals of extremes makes the biggest difference in estimated risk and suggest implications for future risk assessments. References: Michelle Ho, Upmanu Lall, Xun Sun, Edward R. Cook. 2017. Multiscale temporal variability and regional patterns in 555 years of conterminous U.S. streamflow. Water Resources Research. doi: 10.1002/2016WR019632

  5. Temporal and spatial distribution of waterborne mercury in a gold miner's river.

    PubMed

    Picado, Francisco; Bengtsson, Göran

    2012-10-26

    We examined the spatial and temporal (hourly) variation of aqueous concentrations of mercury in a gold miner's river to determine factors that control transport, retention, and export of mercury. The mercury flux was estimated to account for episodic inputs of mercury through mining tailings, variations in flow rate, and the partitioning of mercury between dissolved and particulate phases. Water samples were collected upstream and downstream of two gold mining sites in the Artiguas river, Nicaragua. The samples were analyzed for dissolved and suspended mercury, total solids, dissolved organic carbon, and total iron in water. Water velocity was also measured at the sampling sites. We found that mercury was mainly transported in a suspended phase, with a temporal pattern of diurnal peaks corresponding to the amalgamation schedules at the mining plants. The concentrations decreased with distance from the mining sites, suggesting dilution by tributaries or sedimentation of particle-bound mercury. The lowest total mercury concentrations in the water were less than 0.1 μg l(-1) and the highest concentration was 5.0 μg l(-1). The mercury concentrations are below the present WHO guidelines of 6 μg l(-1) but are considered to lead to a higher risk to aquatic bacteria and fish in the stream than to humans. The aqueous concentrations exceed the hazard endpoints for both groups with a probability of about 1%. Particulate mercury accounted for the largest variation of mercury fluxes, whereas dissolved mercury made up most of the long-range transport along the stream. The estimated total mass of mercury retained due to sedimentation of suspended solids was 2.7 kg per year, and the total mass exported downstream from the mining area was 1.6 kg per year. This study demonstrates the importance of the temporal and spatial resolution of observations in describing the occurrence and fate of mercury in a river affected by anthropogenic activities.

  6. Pseudo Bayes Estimates for Test Score Distributions and Chained Equipercentile Equating. Research Report. ETS RR-09-47

    ERIC Educational Resources Information Center

    Moses, Tim; Oh, Hyeonjoo J.

    2009-01-01

    Pseudo Bayes probability estimates are weighted averages of raw and modeled probabilities; these estimates have been studied primarily in nonpsychometric contexts. The purpose of this study was to evaluate pseudo Bayes probability estimates as applied to the estimation of psychometric test score distributions and chained equipercentile equating…
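    The core pseudo Bayes idea fits in a few lines (Python): shrink raw score-distribution proportions toward a modeled distribution. The fixed weight and the uniform stand-in model below are illustrative only; the report studies principled choices for both.

        import numpy as np

        # Pseudo Bayes in miniature: a weighted average of raw proportions and a
        # model-based distribution. Weight and stand-in model are illustrative.
        counts = np.array([1, 0, 4, 9, 18, 12, 5, 1])   # raw score frequencies
        raw = counts / counts.sum()
        model = np.ones_like(raw) / raw.size            # stand-in fitted distribution
        w = 0.8                                         # illustrative weight on the raw data
        pseudo_bayes = w * raw + (1 - w) * model
        print(np.round(pseudo_bayes, 3))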

  7. Population structure and covariate analysis based on pairwise microsatellite allele matching frequencies.

    PubMed

    Givens, Geof H; Ozaksoy, Isin

    2007-01-01

    We describe a general model for pairwise microsatellite allele matching probabilities. The model can be used for analysis of population substructure, and is particularly focused on relating genetic correlation to measurable covariates. The approach is intended for cases when the existence of subpopulations is uncertain and a priori assignment of samples to hypothesized subpopulations is difficult. Such a situation arises, for example, with western Arctic bowhead whales, where genetic samples are available only from a possibly mixed migratory assemblage. We estimate genetic structure associated with spatial, temporal, or other variables that may confound the detection of population structure. In the bowhead case, the model permits detection of genetic patterns associated with a temporally pulsed multi-population assemblage in the annual migration. Hypothesis tests for population substructure and for covariate effects can be carried out using permutation methods. Simulated and real examples illustrate the effectiveness and reliability of the approach and enable comparisons with other familiar approaches. Analysis of the bowhead data finds no evidence for two temporally pulsed subpopulations using the best available data, although a significant pattern found by other researchers using preliminary data is also confirmed here. Code in the R language is available from www.stat.colostate.edu/~geof/gammmp.html.

  8. Age-related differences in time-based prospective memory: The role of time estimation in the clock monitoring strategy.

    PubMed

    Vanneste, Sandrine; Baudouin, Alexia; Bouazzaoui, Badiâa; Taconnat, Laurence

    2016-07-01

    Time-based prospective memory (TBPM) is required when it is necessary to remember to perform an action at a specific future point in time. This type of memory has been found to be particularly sensitive to ageing, probably because it requires a self-initiated response at a specific time. In this study, we sought to examine the involvement of temporal processes in the time monitoring strategy, which has been demonstrated to be a decisive factor in TBPM efficiency. We compared the performance of young and older adults in a TBPM task in which they had to press a response button every minute while categorising words. The design allowed participants to monitor time by checking a clock whenever they decided. Participants also completed a classic time-production task and several executive tasks assessing inhibition, updating and shifting processes. Our results confirm an age-related lack of accuracy in prospective memory performance, which seems to be related to a deficient strategic use of time monitoring. This could in turn be partially explained by age-related temporal deficits, as evidenced in the duration production task. These findings suggest that studies designed to investigate the age effect in TBPM tasks should consider the contribution of temporal mechanisms.

  9. Incorporating detection probability into northern Great Plains pronghorn population estimates

    USGS Publications Warehouse

    Jacques, Christopher N.; Jenks, Jonathan A.; Grovenburg, Troy W.; Klaver, Robert W.; DePerno, Christopher S.

    2014-01-01

    Pronghorn (Antilocapra americana) abundances commonly are estimated using fixed-wing surveys, but these estimates are likely to be negatively biased because of violations of key assumptions underpinning line-transect methodology. Reducing bias and improving precision of abundance estimates through use of detection probability and mark-resight models may allow for more responsive pronghorn management actions. Given their potential application in population estimation, we evaluated detection probability and mark-resight models for use in estimating pronghorn population abundance. We used logistic regression to quantify probabilities that detecting pronghorn might be influenced by group size, animal activity, percent vegetation, cover type, and topography. We estimated pronghorn population size by study area and year using mixed logit-normal mark-resight (MLNM) models. Pronghorn detection probability increased with group size, animal activity, and percent vegetation; overall detection probability was 0.639 (95% CI = 0.612–0.667) with 396 of 620 pronghorn groups detected. Despite model selection uncertainty, the best detection probability models were 44% (range = 8–79%) and 180% (range = 139–217%) greater than traditional pronghorn population estimates. Similarly, the best MLNM models were 28% (range = 3–58%) and 147% (range = 124–180%) greater than traditional population estimates. Detection probability of pronghorn was not constant but depended on both intrinsic and extrinsic factors. When pronghorn detection probability is a function of animal group size, animal activity, landscape complexity, and percent vegetation, traditional aerial survey techniques will result in biased pronghorn abundance estimates. Standardizing survey conditions, increasing resighting occasions, or accounting for variation in individual heterogeneity in mark-resight models will increase the accuracy and precision of pronghorn population estimates.
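    The detection-probability step can be sketched as a logistic regression of detected/undetected groups on covariates such as group size (Python, synthetic data; the assumed true relation is hypothetical, and the mark-resight abundance models are not shown).

        import numpy as np

        # Sketch of the detection-probability step: logistic regression of
        # detection (0/1) on group size, fitted by Newton-Raphson (IRLS).
        # Synthetic data; other covariates would enter the design matrix analogously.
        rng = np.random.default_rng(4)
        group_size = rng.integers(1, 30, 620)
        logit_p = -0.5 + 0.15 * group_size                  # assumed true relation
        detected = rng.random(620) < 1 / (1 + np.exp(-logit_p))

        X = np.column_stack([np.ones(620), group_size])
        b = np.zeros(2)
        for _ in range(25):
            p = 1 / (1 + np.exp(-X @ b))
            W = p * (1 - p)
            b += np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (detected - p))
        print("intercept, slope =", np.round(b, 3))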

  10. Regional And Seasonal Aspects Of Within-The-Hour Tec Statistics

    NASA Astrophysics Data System (ADS)

    Koroglu, Ozan; Arikan, Feza; Koroglu, Meltem

    2015-04-01

    The ionosphere is a layer of the atmosphere with a plasma structure. Several mechanisms originating both in space and on Earth, such as solar radiation and geomagnetic effects, govern this plasma layer. The ionosphere plays an important role in HF and satellite communication and in space-based positioning systems. Therefore, determining the statistical behavior of the ionosphere is of utmost importance. The variability of the ionosphere has complex spatio-temporal characteristics, which depend on solar, geomagnetic, gravitational and seismic activities. Total Electron Content (TEC) is one of the major observables for investigating and determining this variability. In this study, the spatio-temporal within-the-hour statistical behavior of TEC is determined for Turkey, which is located at mid-latitude, using TEC estimates from the Turkish National Permanent GPS Network (TNPGN)-Active between the years 2009 and 2012. TEC estimates are obtained as IONOLAB-TEC, developed by the IONOLAB group (www.ionolab.org) at Hacettepe University. IONOLAB-TEC for each station in TNPGN-Active is organized in a database and grouped with respect to years, ionospheric seasons, hours and regions of 2 degrees by 3 degrees in latitude and longitude, respectively. The data sets are used to calculate within-the-hour parametric Probability Density Functions (PDFs). For every year, every region and every hour, a representative PDF is determined. It is observed that TEC values have a strong hourly, seasonal and positional dependence in the east-west direction, and the growing trend shifts according to sunrise and sunset times. The data are distributed predominantly as Lognormal and Weibull. The averages and standard deviations of the chosen distributions follow the trends of the 24-hour diurnal and 11-year solar cycle periods. The regional and seasonal behavior of the PDFs is investigated using a representative GPS station within each region. Within-the-hour PDF estimates are grouped into the ionospheric seasons of Winter, Summer, March equinox and September equinox. In the winter and summer seasons, the Lognormal distribution is observed. During the equinox seasons, the Weibull distribution is observed more frequently. Furthermore, all hourly TEC values in the same region are combined in order to improve the reliability and accuracy of the probability density function estimates. It is observed that, being in the mid-latitude region, the ionosphere over Turkey has robust characteristics, distributed as Lognormal and Weibull. Statistical observations on PDF estimates of TEC of the ionosphere over Turkey will contribute to developing a regional and seasonal random field model, which will further contribute to HF channel characterization. This study is supported by a joint grant of TUBITAK 112E568 and RFBR 13-02-91370-CT_a.
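    The distribution-identification step can be sketched with SciPy (Python, synthetic TEC values): fit Lognormal and Weibull candidates to an hourly sample and compare log-likelihoods. Fixing the location parameter at zero is an assumption made here for simplicity.

        import numpy as np
        from scipy import stats

        # Sketch of the PDF-identification step: fit lognormal and Weibull models
        # to an hourly TEC sample and compare log-likelihoods. Data are synthetic.
        rng = np.random.default_rng(5)
        tec = rng.lognormal(mean=2.5, sigma=0.25, size=500)   # synthetic TECU values

        for name, dist in [("lognormal", stats.lognorm), ("weibull", stats.weibull_min)]:
            params = dist.fit(tec, floc=0)                    # fix location at zero
            ll = dist.logpdf(tec, *params).sum()
            print(f"{name}: log-likelihood = {ll:.1f}")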

  11. Evaluation of Uncertainty in Runoff Analysis Incorporating Theory of Stochastic Process

    NASA Astrophysics Data System (ADS)

    Yoshimi, Kazuhiro; Wang, Chao-Wen; Yamada, Tadashi

    2015-04-01

    The aim of this paper is to provide a theoretical framework for uncertainty estimation in rainfall-runoff analysis based on the theory of stochastic processes. SDEs (stochastic differential equations) based on this theory have been widely used in mathematical finance to model stock price movements. Some researchers in civil engineering have also investigated applications of SDEs (e.g. Kurino et al., 1999; Higashino and Kanda, 2001). However, there have been no studies evaluating uncertainty in runoff phenomena based on the correspondence between SDEs and the Fokker-Planck equation. The Fokker-Planck equation is a partial differential equation that describes the temporal evolution of a PDF (probability density function), and it is mathematically equivalent to the corresponding SDE. In this paper, therefore, the effect of rainfall uncertainty on discharge uncertainty is explained theoretically and mathematically by introducing the theory of stochastic processes. The lumped rainfall-runoff model is represented as an SDE, written in difference form, because the temporal variation of rainfall is expressed as its average plus a deviation approximated by a Gaussian distribution. This approximation is based on rainfall observed by rain-gauge stations and a radar rain-gauge system. As a result, this paper shows that it is possible to evaluate the uncertainty of discharge by using the relationship between an SDE and the Fokker-Planck equation. Moreover, the results of this study show that the uncertainty of discharge increases as rainfall intensity rises and as the non-linearity of the resistance term grows stronger. These results are clarified by the PDFs of discharge that satisfy the Fokker-Planck equation. This means that a reasonable discharge estimate can be made based on the theory of stochastic processes, which can be applied to the probabilistic risk assessment of flood management.
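    The correspondence between the two descriptions can be illustrated with an Euler-Maruyama simulation (Python): an ensemble of SDE paths for a lumped storage model driven by noisy rainfall approximates the discharge PDF whose evolution the Fokker-Planck equation describes. The model form and all parameter values below are illustrative, not the authors' formulation.

        import numpy as np

        # Euler-Maruyama sketch: ds = (r_mean - k*s**m) dt + sigma dW, with
        # discharge q = k*s**m. The path ensemble approximates the discharge PDF
        # governed by the corresponding Fokker-Planck equation. Values illustrative.
        rng = np.random.default_rng(6)
        dt, n_steps, n_paths = 0.01, 2000, 5000
        r_mean, sigma, k, m = 1.0, 0.4, 0.5, 1.5

        s = np.full(n_paths, 1.0)                     # initial storage
        for _ in range(n_steps):
            drift = r_mean - k * s ** m
            noise = sigma * np.sqrt(dt) * rng.normal(size=n_paths)
            s = np.maximum(s + drift * dt + noise, 0.0)   # storage stays non-negative

        q = k * s ** m                                # discharge ensemble
        print(f"discharge mean {q.mean():.3f}, std {q.std():.3f}")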

  12. Occupancy in continuous habitat

    USGS Publications Warehouse

    Efford, Murray G.; Dawson, Deanna K.

    2012-01-01

    The probability that a site has at least one individual of a species ('occupancy') has come to be widely used as a state variable for animal population monitoring. The available statistical theory for estimation when detection is imperfect applies particularly to habitat patches or islands, although it is also used for arbitrary plots in continuous habitat. The probability that such a plot is occupied depends on plot size and home-range characteristics (size, shape and dispersion) as well as population density. Plot size is critical to the definition of occupancy as a state variable, but clear advice on plot size is missing from the literature on the design of occupancy studies. We describe models for the effects of varying plot size and home-range size on expected occupancy. Temporal, spatial, and species variation in average home-range size is to be expected, but information on home ranges is difficult to retrieve from species presence/absence data collected in occupancy studies. The effect of variable home-range size is negligible when plots are very large (>100 x area of home range), but large plots pose practical problems. At the other extreme, sampling of 'point' plots with cameras or other passive detectors allows the true 'proportion of area occupied' to be estimated. However, this measure equally reflects home-range size and density, and is of doubtful value for population monitoring or cross-species comparisons. Plot size is ill-defined and variable in occupancy studies that detect animals at unknown distances, the commonest example being unlimited-radius point counts of song birds. We also find that plot size is ill-defined in recent treatments of "multi-scale" occupancy; the respective scales are better interpreted as temporal (instantaneous and asymptotic) rather than spatial. Occupancy is an inadequate metric for population monitoring when it is confounded with home-range size or detection distance.

  13. Development and neurophysiology of mentalizing.

    PubMed Central

    Frith, Uta; Frith, Christopher D

    2003-01-01

    The mentalizing (theory of mind) system of the brain is probably in operation from ca. 18 months of age, allowing implicit attribution of intentions and other mental states. Between the ages of 4 and 6 years explicit mentalizing becomes possible, and from this age children are able to explain the misleading reasons that have given rise to a false belief. Neuroimaging studies of mentalizing have so far only been carried out in adults. They reveal a system with three components consistently activated during both implicit and explicit mentalizing tasks: medial prefrontal cortex (MPFC), temporal poles and posterior superior temporal sulcus (STS). The functions of these components can be elucidated, to some extent, from their role in other tasks used in neuroimaging studies. Thus, the MPFC region is probably the basis of the decoupling mechanism that distinguishes mental state representations from physical state representations; the STS region is probably the basis of the detection of agency, and the temporal poles might be involved in access to social knowledge in the form of scripts. The activation of these components in concert appears to be critical to mentalizing. PMID:12689373

  14. Comparing Approaches to Deal With Non-Gaussianity of Rainfall Data in Kriging-Based Radar-Gauge Rainfall Merging

    NASA Astrophysics Data System (ADS)

    Cecinati, F.; Wani, O.; Rico-Ramirez, M. A.

    2017-11-01

    Merging radar and rain gauge rainfall data is a technique used to improve the quality of spatial rainfall estimates; in particular, Kriging with External Drift (KED) is a very effective radar-rain gauge rainfall merging technique. However, kriging interpolations assume Gaussianity of the process. Rainfall has a strongly skewed, positive probability distribution, characterized by a discontinuity due to intermittency. In KED, rainfall residuals are used, implicitly calculated as the difference between rain gauge data and a linear function of the radar estimates. Rainfall residuals are non-Gaussian as well. The aim of this work is to evaluate the impact of applying KED to non-Gaussian rainfall residuals, and to assess the best techniques to improve Gaussianity. We compare Box-Cox transformations with λ parameters equal to 0.5, 0.25, and 0.1, Box-Cox with time-variant optimization of λ, normal score transformation, and a singularity analysis technique. The results suggest that Box-Cox with λ = 0.1 and the singularity analysis are not suitable for KED. Normal score transformation, Box-Cox with optimized λ, and Box-Cox with λ = 0.25 produce satisfactory results in terms of Gaussianity of the residuals, probability distribution of the merged rainfall products, and rainfall estimate quality, when validated through cross-validation. However, it is observed that Box-Cox transformations are strongly dependent on the temporal and spatial variability of rainfall and on the units used for the rainfall intensity. Overall, applying transformations results in a quantitative improvement of the rainfall estimates only if the correct transformations for the specific data set are used.
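
    For reference, the Box-Cox transformation compared in the study has the standard form z = (x^λ − 1)/λ for λ ≠ 0 (log x for λ = 0). A small sketch, with synthetic skewed "rainfall" values standing in for real residuals, of how the λ choices above change skewness:

```python
import numpy as np
from scipy import stats

def box_cox(x, lam):
    """Standard Box-Cox transform; x must be positive."""
    return np.log(x) if lam == 0 else (x**lam - 1.0) / lam

rng = np.random.default_rng(1)
rain = rng.lognormal(mean=0.5, sigma=1.0, size=5000)   # skewed, positive "rainfall"

for lam in (0.5, 0.25, 0.1):
    z = box_cox(rain, lam)
    print(f"lambda={lam}: skewness {stats.skew(z):+.2f}")

# scipy can also optimize lambda by maximum likelihood, akin to the
# time-variant optimization compared in the paper:
z_opt, lam_opt = stats.boxcox(rain)
print(f"optimized lambda = {lam_opt:.2f}")
```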

  15. Crop identification and acreage measurement utilizing ERTS imagery. [Missouri, Kansas, Idaho, and South Dakota]

    NASA Technical Reports Server (NTRS)

    Wigton, W. H.; Vonsteen, D. H.

    1974-01-01

    The Statistical Reporting Service of the U.S. Department of Agriculture is evaluating ERTS-1 imagery as a potential tool for estimating crop acreage. A main data source for the estimates is obtained by enumerating small land parcels that have been randomly selected from the total U.S. land area. These small parcels are being used as ground observations in this investigation. The test sites are located in Missouri, Kansas, Idaho, and South Dakota. The major crops of interest are wheat, cotton, corn, soybeans, sugar beets, potatoes, oats, alfalfa, and grain sorghum. Some of the crops are unique to a given site while others are common in two or three states. This provides an opportunity to observe crops grown under different conditions. Results for the Missouri test site are presented. Results of temporal overlays, unequal prior probabilities, and sample classifiers are discussed. The amount of improvement that each technique contributes is shown in terms of overall performance. The results show that useful information for making crop acreage estimates can be obtained from ERTS-1 data.

  16. Motion illusions in optical art presented for long durations are temporally distorted.

    PubMed

    Nather, Francisco Carlos; Mecca, Fernando Figueiredo; Bueno, José Lino Oliveira

    2013-01-01

    Static figurative images implying human body movements observed for shorter and longer durations affect the perception of time. This study examined whether images of static geometric shapes would affect the perception of time. Undergraduate participants observed two Optical Art paintings by Bridget Riley for 9 or 36 s (groups G9 and G36, respectively). Paintings implying different intensities of movement (2.0- and 6.0-point stimuli) were randomly presented. The prospective paradigm in the reproduction method was used to record time estimations. Data analysis did not show time distortions in the G9 group. In the G36 group the paintings were perceived differently: exposure durations for the 2.0-point painting were estimated to be shorter than those for the 6.0-point painting. Also for G36, the 2.0-point painting was underestimated in comparison with the actual time of exposure. Motion illusions in static images affected time estimation according to the attention given by the observer to the complexity of movement, probably leading to changes in the storage velocity of internal clock pulses.

  17. Application of remotely sensed land-use information to improve estimates of streamflow characteristics, volume 8. [Maryland, Virginia, and Delaware

    NASA Technical Reports Server (NTRS)

    Pluhowski, E. J. (Principal Investigator)

    1977-01-01

    The author has identified the following significant results. Land use data derived from high altitude photography and satellite imagery were studied for 49 basins in Delaware and eastern Maryland and Virginia. Applying multiple regression techniques to a network of gaging stations monitoring runoff from 39 of the basins demonstrated that land use data from high altitude photography provided an effective means of significantly improving estimates of streamflow. Forty streamflow characteristic equations incorporating remotely sensed land use information were compared with a control set of equations using map-derived land cover. Significant improvement was detected in six equations where level 1 data were added and in five equations where level 2 information was utilized. Only four equations were improved significantly using land use data derived from LANDSAT imagery. Significant losses in accuracy due to the use of remotely sensed land use information were detected only in estimates of flood peaks. Losses in accuracy for flood peaks were probably due to land cover changes associated with temporal differences among the primary land use data sources.

  18. Risk assessment of TBT in the Japanese short-neck clam ( Ruditapes philippinarum) of Tokyo Bay using a chemical fate model

    NASA Astrophysics Data System (ADS)

    Horiguchi, Fumio; Nakata, Kisaburo; Ito, Naganori; Okawa, Ken

    2006-12-01

    A risk assessment of Tributyltin (TBT) in Tokyo Bay was conducted using the Margin of Exposure (MOE) method at the species level using the Japanese short-neck clam, Ruditapes philippinarum. The assessment endpoint was defined to protect R. philippinarum in Tokyo Bay from TBT (growth effects). A No Observed Effect Concentration (NOEC) for this species with respect to growth reduction induced by TBT was estimated from experimental results published in the scientific literature. Sources of TBT in this study were assumed to be commercial vessels in harbors and navigation routes. Concentrations of TBT in Tokyo Bay were estimated using a three-dimensional hydrodynamic model, an ecosystem model and a chemical fate model. MOEs for this species were estimated for the years 1990, 2000, and 2007. Estimated MOEs for R. philippinarum for 1990, 2000, and 2007 were approximately 1-3, 10, and 100, respectively, indicating a declining temporal trend in the probability of adverse growth effects. A simplified software package called RAMTB was developed by incorporating the chemical fate model and the databases of seasonal flow fields and distributions of organic substances (phytoplankton and detritus) in Tokyo Bay, simulated by the hydrodynamic and ecological model, respectively.
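
    The Margin of Exposure computation itself is simple arithmetic: MOE = NOEC / predicted exposure concentration, with larger values indicating a lower probability of adverse effects. A sketch with made-up concentrations chosen only to mirror the reported ~1-3, ~10, ~100 trend (the actual NOEC and modeled exposures are in the paper):

```python
# MOE = NOEC / estimated exposure concentration (all values illustrative).
noec_ng_per_l = 100.0                            # hypothetical NOEC for growth effects
exposure = {1990: 50.0, 2000: 10.0, 2007: 1.0}   # hypothetical modeled TBT (ng/L)

for year, conc in exposure.items():
    print(year, "MOE =", round(noec_ng_per_l / conc, 1))
```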

  19. Individual prediction of change in delayed recall of prose passages after left-sided anterior temporal lobectomy.

    PubMed

    Jokeit, H; Ebner, A; Holthausen, H; Markowitsch, H J; Moch, A; Pannek, H; Schulz, R; Tuxhorn, I

    1997-08-01

    Prognostic variables for individual memory outcome after left anterior temporal lobectomy (ATL) were studied in 27 patients with refractory temporal lobe epilepsy. The difference between pre- and postoperative performance in the delayed recall of two prose passages (Stories A and B) from the Wechsler Memory Scale served as the measure of postoperative memory change. Fifteen independent clinical, neuropsychological, and electrophysiological variables were submitted to a multiple linear regression analysis. Preoperative immediate and delayed recall of story content and right-hemisphere Wada memory performance for pictorial and verbal items explained postoperative memory changes in recall of Story B very well. Delayed recall of Story B, but not of Story A, had high concurrent validity with other measures of memory. Patients who became seizure-free did not differ in memory change from patients who continued to have seizures after ATL. The variables age at epilepsy onset and probable age at temporal lobe damage provided complementary information for individual prediction, but with less effectiveness than Wada test data. Our model confirmed that good preoperative memory functioning and impaired right-hemispheric Wada memory performance for pictorial items predict a high risk of memory loss after left ATL. The analyses demonstrate that the combination of independent measures delivers more information than Wada test performance or any other variable alone. The suggested function can be used routinely to estimate the individual severity of verbal episodic memory impairment that might occur after left-sided ATL and offers a rational basis for the counseling of patients.

  20. Large Scale Crop Classification in Ukraine using Multi-temporal Landsat-8 Images with Missing Data

    NASA Astrophysics Data System (ADS)

    Kussul, N.; Skakun, S.; Shelestov, A.; Lavreniuk, M. S.

    2014-12-01

    At present, there are no globally available Earth observation (EO) derived crop map products. This issue is being addressed within the Sentinel-2 for Agriculture initiative, where a number of test sites (including from JECAM) participate to provide coherent protocols and best practices for various global agriculture systems, and subsequently crop maps from Sentinel-2. One of the problems in dealing with optical images for large territories (more than 10,000 sq. km) is the presence of clouds and shadows that result in missing values in data sets. In this abstract, a new approach to the classification of multi-temporal optical satellite imagery with missing data due to clouds and shadows is proposed. First, self-organizing Kohonen maps (SOMs) are used to restore missing pixel values in a time series of satellite imagery. SOMs are trained for each spectral band separately using non-missing values. Missing values are restored through a special procedure that substitutes an input sample's missing components with the neuron's weight coefficients. After missing data restoration, a supervised classification is performed for the multi-temporal satellite images. For this, an ensemble of neural networks, in particular multilayer perceptrons (MLPs), is proposed. Ensembling of neural networks is done by the technique of average committee, i.e. calculating the average class probability over classifiers and selecting the class with the highest average posterior probability for the given input sample (see the sketch below). The proposed approach is applied to large-scale crop classification using multi-temporal Landsat-8 images for the JECAM test site in Ukraine [1-2]. It is shown that an ensemble of MLPs provides better performance than a single neural network in terms of overall classification accuracy and kappa coefficient. The obtained classification map is also validated through estimated crop and forest areas and comparison to official statistics. 1. A.Yu. Shelestov et al., "Geospatial information system for agricultural monitoring," Cybernetics Syst. Anal., vol. 49, no. 1, pp. 124-132, 2013. 2. J. Gallego et al., "Efficiency Assessment of Different Approaches to Crop Classification Based on Satellite and Ground Observations," J. Autom. Inform. Sci., vol. 44, no. 5, pp. 67-80, 2012.
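
    A minimal sketch of the average-committee rule, assuming scikit-learn MLPs on toy data (the study itself used Landsat-8 time series with SOM-restored inputs; the data, hidden-layer sizes, and committee size below are illustrative):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=500, n_features=8, n_classes=3,
                           n_informative=5, random_state=0)

# Train an ensemble of MLPs that differ only in their random initialization.
committee = [MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000,
                           random_state=seed).fit(X, y) for seed in range(5)]

# Average committee: mean posterior probability over members, then argmax.
avg_proba = np.mean([m.predict_proba(X) for m in committee], axis=0)
y_pred = avg_proba.argmax(axis=1)
print("ensemble training accuracy:", (y_pred == y).mean())
```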

  1. Probability estimation with machine learning methods for dichotomous and multicategory outcome: theory.

    PubMed

    Kruppa, Jochen; Liu, Yufeng; Biau, Gérard; Kohler, Michael; König, Inke R; Malley, James D; Ziegler, Andreas

    2014-07-01

    Probability estimation for binary and multicategory outcome using logistic and multinomial logistic regression has a long-standing tradition in biostatistics. However, biases may occur if the model is misspecified. In contrast, outcome probabilities for individuals can be estimated consistently with machine learning approaches, including k-nearest neighbors (k-NN), bagged nearest neighbors (b-NN), random forests (RF), and support vector machines (SVM). Because machine learning methods are rarely used by applied biostatisticians, the primary goal of this paper is to explain the concept of probability estimation with these methods and to summarize recent theoretical findings. Probability estimation in k-NN, b-NN, and RF can be embedded into the class of nonparametric regression learning machines; therefore, we start with the construction of nonparametric regression estimates and review results on consistency and rates of convergence. In SVMs, outcome probabilities for individuals are estimated consistently by repeatedly solving classification problems. For SVMs, we first review the classification problem and then dichotomous probability estimation. Next we extend the algorithms for estimating probabilities using k-NN, b-NN, and RF to multicategory outcomes and discuss approaches for the multicategory probability estimation problem using SVM. In simulation studies for dichotomous and multicategory dependent variables we demonstrate the general validity of the machine learning methods and compare them with logistic regression. However, each method fails in at least one simulation scenario. We conclude with a discussion of the failures and give recommendations for selecting and tuning the methods. Applications to real data and example code are provided in a companion article (doi:10.1002/bimj.201300077). © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
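
    A sketch of the core k-NN idea: the estimated class probability for an individual is the fraction of its k nearest neighbors with each outcome, i.e., a nonparametric regression estimate of P(Y = 1 | x). The toy logistic data and the choice of k below are illustrative, not from the paper:

```python
import numpy as np

def knn_probability(X_train, y_train, x, k=25):
    """Estimate P(Y=1 | x) as the proportion of 1s among the k nearest
    neighbors -- a nonparametric regression estimate of the class probability."""
    dists = np.linalg.norm(X_train - x, axis=1)
    nearest = np.argsort(dists)[:k]
    return y_train[nearest].mean()

rng = np.random.default_rng(2)
X = rng.normal(size=(1000, 2))
true_p = 1.0 / (1.0 + np.exp(-2.0 * X[:, 0]))      # true logistic probability
y = (rng.uniform(size=1000) < true_p).astype(int)

x_new = np.array([1.0, 0.0])
print("estimated:", knn_probability(X, y, x_new),
      "true:", 1 / (1 + np.exp(-2.0)))
```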

  2. Temporal Variability of Oral Microbiota over 10 Months and the Implications for Future Epidemiologic Studies.

    PubMed

    Vogtmann, Emily; Hua, Xing; Zhou, Liang; Wan, Yunhu; Suman, Shalabh; Zhu, Bin; Dagnall, Casey L; Hutchinson, Amy; Jones, Kristine; Hicks, Belynda D; Sinha, Rashmi; Shi, Jianxin; Abnet, Christian C

    2018-05-01

    Background: Few studies have prospectively evaluated the association between oral microbiota and health outcomes. Precise estimates of the intrasubject microbial metric stability will allow better study planning. Therefore, we conducted a study to evaluate the temporal variability of oral microbiota. Methods: Forty individuals provided six oral samples using the OMNIgene ORAL kit and Scope mouthwash oral rinses approximately every two months over 10 months. DNA was extracted using the QIAsymphony and the V4 region of the 16S rRNA gene was amplified and sequenced using the MiSeq. To estimate temporal variation, we calculated intraclass correlation coefficients (ICCs) for a variety of metrics and examined stability after clustering samples into distinct community types using Dirichlet multinomial models (DMMs). Results: The ICCs for the alpha diversity measures were high, including for number of observed bacterial species [0.74; 95% confidence interval (CI): 0.65-0.82 and 0.79; 95% CI: 0.75-0.94] from OMNIgene ORAL and Scope mouthwash, respectively. The ICCs for the relative abundance of the top four phyla and beta diversity matrices were lower. Three clusters provided the best model fit for the DMM from the OMNIgene ORAL samples, and the probability of remaining in a specific cluster was high (59.5%-80.7%). Conclusions: The oral microbiota appears to be stable over time for multiple metrics, but some measures, particularly relative abundance, were less stable. Impact: We used this information to calculate stability-adjusted power calculations that will inform future field study protocols and experimental analytic designs. Cancer Epidemiol Biomarkers Prev; 27(5); 594-600. ©2018 AACR . ©2018 American Association for Cancer Research.
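
    A sketch of the stability metric used above: the one-way random-effects ICC is the between-subject share of total variance, estimated from a subjects-by-visits ANOVA. The data shape mirrors the study design (40 subjects, 6 visits), but the values are simulated, not the microbiome measurements:

```python
import numpy as np

def icc_oneway(data):
    """One-way random-effects ICC(1); data is (subjects x repeated visits)."""
    n, k = data.shape
    grand = data.mean()
    ms_between = k * ((data.mean(axis=1) - grand) ** 2).sum() / (n - 1)
    ms_within = ((data - data.mean(axis=1, keepdims=True)) ** 2).sum() / (n * (k - 1))
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

rng = np.random.default_rng(3)
subject_mean = rng.normal(50, 10, size=(40, 1))          # stable subject effect
visits = subject_mean + rng.normal(0, 5, size=(40, 6))   # 6 visits with noise
print("ICC ~", round(icc_oneway(visits), 2))             # high: metric is stable
```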

  3. Analysis of hourly crash likelihood using unbalanced panel data mixed logit model and real-time driving environmental big data.

    PubMed

    Chen, Feng; Chen, Suren; Ma, Xiaoxiang

    2018-06-01

    Driving environment, including road surface conditions and traffic states, often changes over time and influences crash probability considerably. Traditional crash frequency models developed at large temporal scales struggle to capture the time-varying characteristics of these factors, which may cause substantial loss of critical driving-environment information for crash prediction. Crash prediction models with refined temporal data (hourly records) are developed to characterize the time-varying nature of these contributing factors. Unbalanced panel data mixed logit models are developed to analyze hourly crash likelihood of highway segments. The refined temporal driving environmental data, including road surface and traffic condition, obtained from the Road Weather Information System (RWIS), are incorporated into the models. Model estimation results indicate that traffic speed, traffic volume, curvature and the chemically wet road surface indicator are better modeled as random parameters. The estimation results of the mixed logit models based on unbalanced panel data show that there are a number of factors related to crash likelihood on I-25. Specifically, the weekend indicator, November indicator, low speed limit and long remaining service life of rutting indicator are found to increase crash likelihood, while the 5-am indicator and the number of merging ramps per lane per mile are found to decrease crash likelihood. The study underscores and confirms the unique and significant impacts on crash risk imposed by real-time weather, road surface, and traffic conditions. With the unbalanced panel data structure, the rich information from real-time driving environmental big data can be well incorporated. Copyright © 2018 National Safety Council and Elsevier Ltd. All rights reserved.

  4. Long‐term creep rates on the Hayward Fault: evidence for controls on the size and frequency of large earthquakes

    USGS Publications Warehouse

    Lienkaemper, James J.; McFarland, Forrest S.; Simpson, Robert W.; Bilham, Roger; Ponce, David A.; Boatwright, John; Caskey, S. John

    2012-01-01

    The Hayward fault (HF) in California exhibits large (Mw 6.5–7.1) earthquakes with short recurrence times (161±65 yr), probably kept short by a 26%–78% aseismic release rate (including postseismic). Its interseismic release rate varies locally over time, as we infer from many decades of surface creep data. Earliest estimates of creep rate, primarily from infrequent surveys of offset cultural features, revealed distinct spatial variation in rates along the fault, but no detectable temporal variation. Since the 1989 Mw 6.9 Loma Prieta earthquake (LPE), monitoring on 32 alinement arrays and 5 creepmeters has greatly improved the spatial and temporal resolution of creep rate. We now identify significant temporal variations, mostly associated with local and regional earthquakes. The largest rate change was a 6‐yr cessation of creep along a 5‐km length near the south end of the HF, attributed to a regional stress drop from the LPE, ending in 1996 with a 2‐cm creep event. North of there near Union City starting in 1991, rates apparently increased by 25% above pre‐LPE levels on a 16‐km‐long reach of the fault. Near Oakland in 2007 an Mw 4.2 earthquake initiated a 1–2 cm creep event extending 10–15 km along the fault. Using new better‐constrained long‐term creep rates, we updated earlier estimates of depth to locking along the HF. The locking depths outline a single, ∼50‐km‐long locked or retarded patch with the potential for an Mw∼6.8 event equaling the 1868 HF earthquake. We propose that this inferred patch regulates the size and frequency of large earthquakes on HF.

  5. Naive Probability: Model-Based Estimates of Unique Events.

    PubMed

    Khemlani, Sangeet S; Lotstein, Max; Johnson-Laird, Philip N

    2015-08-01

    We describe a dual-process theory of how individuals estimate the probabilities of unique events, such as Hillary Clinton becoming U.S. President. It postulates that uncertainty is a guide to improbability. In its computer implementation, an intuitive system 1 simulates evidence in mental models and forms analog non-numerical representations of the magnitude of degrees of belief. This system has minimal computational power and combines evidence using a small repertoire of primitive operations. It resolves the uncertainty of divergent evidence for single events, for conjunctions of events, and for inclusive disjunctions of events, by taking a primitive average of non-numerical probabilities. It computes conditional probabilities in a tractable way, treating the given event as evidence that may be relevant to the probability of the dependent event. A deliberative system 2 maps the resulting representations into numerical probabilities. With access to working memory, it carries out arithmetical operations in combining numerical estimates. Experiments corroborated the theory's predictions. Participants concurred in estimates of real possibilities. They violated the complete joint probability distribution in the predicted ways, when they made estimates about conjunctions: P(A), P(B), P(A and B), disjunctions: P(A), P(B), P(A or B or both), and conditional probabilities P(A), P(B), P(B|A). They were faster to estimate the probabilities of compound propositions when they had already estimated the probabilities of each of their components. We discuss the implications of these results for theories of probabilistic reasoning. © 2014 Cognitive Science Society, Inc.

  6. Demographics of an ornate box turtle population experiencing minimal human-induced disturbances

    USGS Publications Warehouse

    Converse, S.J.; Iverson, J.B.; Savidge, J.A.

    2005-01-01

    Human-induced disturbances may threaten the viability of many turtle populations, including populations of North American box turtles. Evaluation of the potential impacts of these disturbances can be aided by long-term studies of populations subject to minimal human activity. In such a population of ornate box turtles (Terrapene ornata ornata) in western Nebraska, we examined survival rates and population growth rates from 1981-2000 based on mark-recapture data. The average annual apparent survival rate of adult males was 0.883 (SE = 0.021) and of adult females was 0.932 (SE = 0.014). Minimum winter temperature was the best of five climate variables as a predictor of adult survival. Survival rates were highest in years with low minimum winter temperatures, suggesting that global warming may result in declining survival. We estimated an average adult population growth rate (λ) of 1.006 (SE = 0.065), with an estimated temporal process variance (σ²) of 0.029 (95% CI = 0.005-0.176). Stochastic simulations suggest that this mean and temporal process variance would result in a 58% probability of a population decrease over a 20-year period. This research provides evidence that, unless unknown density-dependent mechanisms are operating in the adult age class, significant human disturbances, such as commercial harvest or turtle mortality on roads, represent a potential risk to box turtle populations. © 2005 by the Ecological Society of America.
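
    A hedged sketch of this kind of stochastic projection: draw annual growth rates with the reported mean and temporal process variance and estimate the probability that the population ends 20 years below its starting size. The lognormal draw is an illustrative assumption, not necessarily the authors' simulation design, but with the reported values it reproduces a decline probability close to the 58% cited above:

```python
import numpy as np

rng = np.random.default_rng(4)
mean_lambda, process_var = 1.006, 0.029    # reported estimates
n_years, n_reps = 20, 100_000

# Lognormal parameterization matching the given mean and variance.
mu = np.log(mean_lambda**2 / np.sqrt(process_var + mean_lambda**2))
sigma = np.sqrt(np.log(1 + process_var / mean_lambda**2))
lam = rng.lognormal(mu, sigma, size=(n_reps, n_years))

final_ratio = lam.prod(axis=1)             # N_20 / N_0 for each replicate
print("P(decline over 20 yr) ~", (final_ratio < 1).mean())
```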

  7. Cluster Adjusted Regression for Displaced Subject Data (CARDS): Marginal Inference under Potentially Informative Temporal Cluster Size Profiles

    PubMed Central

    Bible, Joe; Beck, James D.; Datta, Somnath

    2016-01-01

    Summary: Ignorance of the mechanisms responsible for the availability of information presents an unusual problem for analysts. It is often the case that the availability of information is dependent on the outcome. In the analysis of cluster data, we say that a condition of informative cluster size (ICS) exists when the inference drawn from analysis of hypothetical balanced data varies from the inference drawn on observed data. Much work has been done to address the analysis of clustered data with informative cluster size; examples include Inverse Probability Weighting (IPW), Cluster Weighted Generalized Estimating Equations (CWGEE), and Doubly Weighted Generalized Estimating Equations (DWGEE). When cluster size changes with time, i.e., the data set possesses temporally varying cluster sizes (TVCS), these methods may produce biased inference for the underlying marginal distribution of interest. We propose a new marginalization that may be appropriate for addressing clustered longitudinal data with TVCS. The principal motivation for our present work is to analyze the periodontal data collected by Beck et al. (1997, Journal of Periodontal Research 6, 497–505). Longitudinal periodontal data often exhibit both ICS and TVCS, as the number of teeth possessed by participants at the onset of the study is not constant and teeth as well as individuals may be displaced throughout the study. PMID:26682911

  8. Spatial and temporal variability in the R-5 infiltration data set: Déjà vu and rainfall-runoff simulations

    NASA Astrophysics Data System (ADS)

    Loague, Keith; Kyriakidis, Phaedon C.

    1997-12-01

    This paper is a continuation of the event-based rainfall-runoff model evaluation study reported by Loague and Freeze [1985]. Here we reevaluate the performance of a quasi-physically based rainfall-runoff model for three large events from the well-known R-5 catchment. Five different statistical criteria are used to quantitatively judge model performance. Temporal variability in the large R-5 infiltration data set [Loague and Gander, 1990] is filtered by working in terms of permeability. The transformed data set is reanalyzed via geostatistical methods to model the spatial distribution of permeability across the R-5 catchment. We present new estimates of the spatial distribution of infiltration that are in turn used in our rainfall-runoff simulations with the Horton rainfall-runoff model. The new rainfall-runoff simulations, complicated by reinfiltration impacts at the smaller scales of characterization, indicate that the near-surface hydrologic response of the R-5 catchment is most probably dominated by a combination of the Horton and Dunne overland flow mechanisms.

  9. Multi-Scale Modeling to Improve Single-Molecule, Single-Cell Experiments

    NASA Astrophysics Data System (ADS)

    Munsky, Brian; Shepherd, Douglas

    2014-03-01

    Single-cell, single-molecule experiments are producing an unprecedented amount of data to capture the dynamics of biological systems. When integrated with computational models, observations of spatial, temporal and stochastic fluctuations can yield powerful quantitative insight. We concentrate on experiments that localize and count individual molecules of mRNA. These high precision experiments have large imaging and computational processing costs, and we explore how improved computational analyses can dramatically reduce overall data requirements. In particular, we show how analyses of spatial, temporal and stochastic fluctuations can significantly enhance parameter estimation results for small, noisy data sets. We also show how full probability distribution analyses can constrain parameters with far less data than bulk analyses or statistical moment closures. Finally, we discuss how a systematic modeling progression from simple to more complex analyses can reduce total computational costs by orders of magnitude. We illustrate our approach using single-molecule, spatial mRNA measurements of Interleukin 1-alpha mRNA induction in human THP1 cells following stimulation. Our approach could improve the effectiveness of single-molecule gene regulation analyses for many other processes.

  10. Probability shapes perceptual precision: A study in orientation estimation.

    PubMed

    Jabar, Syaheed B; Anderson, Britt

    2015-12-01

    Probability is known to affect perceptual estimations, but an understanding of mechanisms is lacking. Moving beyond binary classification tasks, we had naive participants report the orientation of briefly viewed gratings where we systematically manipulated contingent probability. Participants rapidly developed faster and more precise estimations for high-probability tilts. The shapes of their error distributions, as indexed by a kurtosis measure, also showed a distortion from Gaussian. This kurtosis metric was robust, capturing probability effects that were graded, contextual, and varying as a function of stimulus orientation. Our data can be understood as a probability-induced reduction in the variability or "shape" of estimation errors, as would be expected if probability affects the perceptual representations. As probability manipulations are an implicit component of many endogenous cuing paradigms, changes at the perceptual level could account for changes in performance that might have traditionally been ascribed to "attention." © 2015 APA, all rights reserved.

  11. Diminished caudate and superior temporal gyrus responses to effort-based decision making in patients with first-episode major depressive disorder.

    PubMed

    Yang, Xin-hua; Huang, Jia; Lan, Yong; Zhu, Cui-ying; Liu, Xiao-qun; Wang, Ye-fei; Cheung, Eric F C; Xie, Guang-rong; Chan, Raymond C K

    2016-01-04

    Anhedonia, the loss of interest or pleasure in reward processing, is a hallmark feature of major depressive disorder (MDD), but its underlying neurobiological mechanism is largely unknown. The present study aimed to examine the underlying neural mechanism of reward-related decision-making in patients with MDD. We examined behavioral and neural responses to rewards in patients with first-episode MDD (N=25) and healthy controls (N=25) using the Effort-Expenditure for Rewards Task (EEfRT). The task involved choices about possible rewards of varying magnitude and probability. We tested the hypothesis that individuals with MDD would exhibit a reduced neural response in reward-related brain structures involved in cost-benefit decision-making. Compared with healthy controls, patients with MDD showed significantly weaker responses in the left caudate nucleus when contrasting the 'high reward'-'low reward' condition, and blunted responses in the left superior temporal gyrus and the right caudate nucleus when contrasting high and low probabilities. In addition, hard tasks chosen during high probability trials were negatively correlated with superior temporal gyrus activity in MDD patients, while the same choices were negatively correlated with caudate nucleus activity in healthy controls. These results indicate that reduced caudate nucleus and superior temporal gyrus activation may underpin abnormal cost-benefit decision-making in MDD. Copyright © 2015 Elsevier Inc. All rights reserved.

  12. Temporal resolution of the Florida manatee (Trichechus manatus latirostris) auditory system.

    PubMed

    Mann, David A; Colbert, Debborah E; Gaspard, Joseph C; Casper, Brandon M; Cook, Mandy L H; Reep, Roger L; Bauer, Gordon B

    2005-10-01

    Auditory evoked potential (AEP) measurements of two Florida manatees (Trichechus manatus latirostris) were measured in response to amplitude modulated tones. The AEP measurements showed weak responses to test stimuli from 4 kHz to 40 kHz. The manatee modulation rate transfer function (MRTF) is maximally sensitive to 150 and 600 Hz amplitude modulation (AM) rates. The 600 Hz AM rate is midway between the AM sensitivities of terrestrial mammals (chinchillas, gerbils, and humans) (80-150 Hz) and dolphins (1,000-1,200 Hz). Audiograms estimated from the input-output functions of the EPs greatly underestimate behavioral hearing thresholds measured in two other manatees. This underestimation is probably due to the electrodes being located several centimeters from the brain.

  13. Communication: Coordinate-dependent diffusivity from single molecule trajectories

    NASA Astrophysics Data System (ADS)

    Berezhkovskii, Alexander M.; Makarov, Dmitrii E.

    2017-11-01

    Single-molecule observations of biomolecular folding are commonly interpreted using the model of one-dimensional diffusion along a reaction coordinate, with a coordinate-independent diffusion coefficient. Recent analysis, however, suggests that more general models are required to account for single-molecule measurements performed with high temporal resolution. Here, we consider one such generalization: a model where the diffusion coefficient can be an arbitrary function of the reaction coordinate. Assuming Brownian dynamics along this coordinate, we derive an exact expression for the coordinate-dependent diffusivity in terms of the splitting probability within an arbitrarily chosen interval and the mean transition path time between the interval boundaries. This formula can be used to estimate the effective diffusion coefficient along a reaction coordinate directly from single-molecule trajectories.
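
    The paper's exact expression uses splitting probabilities and mean transition-path times; the sketch below is not that formula but the common baseline estimator D(x) ≈ ⟨Δx²⟩/(2Δt) applied to coordinate-binned increments of a synthetic trajectory, which readers can use as a reference point. The test diffusivity profile and simulation settings are illustrative:

```python
import numpy as np

def local_diffusivity(x, dt, n_bins=16):
    """Estimate D(x) ~ <dx^2>/(2 dt) by binning increments on their start coordinate."""
    dx = np.diff(x)
    bins = np.linspace(x.min(), x.max(), n_bins + 1)
    idx = np.digitize(x[:-1], bins) - 1
    centers = 0.5 * (bins[:-1] + bins[1:])
    D_est = np.array([np.mean(dx[idx == b] ** 2) / (2 * dt) if (idx == b).sum() > 10
                      else np.nan for b in range(n_bins)])
    return centers, D_est

# Synthetic trajectory with a known coordinate-dependent diffusivity D(x) = 1 + 0.5 sin x.
rng = np.random.default_rng(5)
dt, n = 1e-3, 200_000
x = np.zeros(n)
for i in range(1, n):
    D_here = 1.0 + 0.5 * np.sin(x[i - 1])
    x[i] = x[i - 1] + np.sqrt(2.0 * D_here * dt) * rng.normal()

for c, d in zip(*local_diffusivity(x, dt)):
    print(f"x={c:+.2f}  D_est={d:.2f}  D_true={1 + 0.5*np.sin(c):.2f}")
```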

  14. Spatio-temporal water quality mapping from satellite images using geographically and temporally weighted regression

    NASA Astrophysics Data System (ADS)

    Chu, Hone-Jay; Kong, Shish-Jeng; Chang, Chih-Hua

    2018-03-01

    The turbidity (TB) of a water body varies with time and space. Water quality is traditionally estimated via linear regression based on satellite images. However, estimating and mapping water quality require a spatio-temporally nonstationary model, and TB mapping therefore calls for geographically and temporally weighted regression (GTWR) and geographically weighted regression (GWR) models, both of which are more precise than linear regression. Given the temporal nonstationarity of water quality, GTWR offers the best option for estimating regional water quality. Compared with GWR, GTWR provides highly reliable information for water quality mapping, boasts a relatively high goodness of fit, improves the explained variance from 44% to 87%, and shows sufficient space-time explanatory power. The seasonal patterns of TB and the main spatial patterns of TB variability can be identified using the estimated TB maps from GTWR and by conducting an empirical orthogonal function (EOF) analysis.
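
    A sketch of the core GTWR step: each regression point gets its own weighted-least-squares fit, with weights from a space-time kernel so that observations close in both space and time dominate. The Gaussian kernel form, bandwidths, and synthetic data below are illustrative assumptions, not the paper's calibration:

```python
import numpy as np

def gtwr_coefficients(X, y, coords, times, s0, t0, h_s=3.0, h_t=2.0):
    """Local WLS fit at location s0 and time t0 with a Gaussian space-time kernel."""
    d_s = np.linalg.norm(coords - s0, axis=1)          # spatial distances
    d_t = np.abs(times - t0)                           # temporal distances
    w = np.exp(-(d_s / h_s) ** 2 - (d_t / h_t) ** 2)   # combined kernel weights
    W = np.diag(w)
    Xd = np.column_stack([np.ones(len(y)), X])         # add intercept
    return np.linalg.solve(Xd.T @ W @ Xd, Xd.T @ W @ y)  # local intercept, slopes

rng = np.random.default_rng(6)
coords = rng.uniform(0, 10, size=(200, 2))
times = rng.uniform(0, 12, size=200)                   # e.g., months
X = rng.normal(size=(200, 1))                          # satellite-derived predictor
y = 2.0 + (0.5 + 0.1 * times) * X[:, 0] + rng.normal(0, 0.2, 200)  # drifting slope

# The locally fitted slope tracks the temporal nonstationarity:
print(gtwr_coefficients(X, y, coords, times, s0=np.array([5, 5]), t0=2.0))
print(gtwr_coefficients(X, y, coords, times, s0=np.array([5, 5]), t0=10.0))
```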

  15. Temporal variability of local abundance, sex ratio and activity in the Sardinian chalk hill blue butterfly

    USGS Publications Warehouse

    Casula, P.; Nichols, J.D.

    2003-01-01

    When capturing and marking of individuals is possible, the application of newly developed capture-recapture models can remove several sources of bias in the estimation of population parameters such as local abundance and sex ratio. For example, observation of distorted sex ratios in counts or captures can reflect either different abundances of the sexes or different sex-specific capture probabilities, and capture-recapture models can help distinguish between these two possibilities. Robust design models and a model selection procedure based on information-theoretic methods were applied to study the local population structure of the endemic Sardinian chalk hill blue butterfly, Polyommatus coridon gennargenti. Seasonal variations of abundance, plus daily and weather-related variations of active populations of males and females were investigated. Evidence was found of protandry and male pioneering of the breeding space. Temporary emigration probability, which describes the proportion of the population not exposed to capture (e.g. absent from the study area) during the sampling process, was estimated, differed between sexes, and was related to temperature, a factor known to influence animal activity. The correlation between temporary emigration and average daily temperature suggested interpreting temporary emigration as inactivity of animals. Robust design models were used successfully to provide a detailed description of the population structure and activity in this butterfly and are recommended for studies of local abundance and animal activity in the field.

  16. Estimating Density and Temperature Dependence of Juvenile Vital Rates Using a Hidden Markov Model

    PubMed Central

    McElderry, Robert M.

    2017-01-01

    Organisms in the wild have cryptic life stages that are sensitive to changing environmental conditions and can be difficult to survey. In this study, I used mark-recapture methods to repeatedly survey Anaea aidea (Nymphalidae) caterpillars in nature, then modeled caterpillar demography as a hidden Markov process to assess if temporal variability in temperature and density influence the survival and growth of A. aidea over time. Individual encounter histories result from the joint likelihood of being alive and observed in a particular stage, and I have included hidden states by separating demography and observations into parallel and independent processes. I constructed a demographic matrix containing the probabilities of all possible fates for each stage, including hidden states, e.g., eggs and pupae. I observed both dead and live caterpillars with high probability. Peak caterpillar abundance attracted multiple predators, and survival of fifth instars declined as per capita predation rate increased through spring. A time lag between predator and prey abundance was likely the cause of improved fifth instar survival estimated at high density. Growth rates showed an increase with temperature, but the preferred model did not include temperature. This work illustrates how state-space models can include unobservable stages and hidden state processes to evaluate how environmental factors influence vital rates of cryptic life stages in the wild. PMID:28505138
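
    A generic sketch of the hidden-Markov machinery involved: a stage-transition matrix (including unobservable stages such as egg or pupa) combines with stage-specific observation probabilities via the forward algorithm to give the likelihood of an encounter history. The three-stage setup and all probabilities below are illustrative, not the A. aidea model:

```python
import numpy as np

# Hidden states: 0 = egg (never observable), 1 = caterpillar, 2 = dead.
T = np.array([[0.5, 0.4, 0.1],    # egg: stay, hatch, die  (illustrative values)
              [0.0, 0.7, 0.3],    # caterpillar: stay, die
              [0.0, 0.0, 1.0]])   # dead is absorbing
p_obs = np.array([0.0, 0.6, 0.0]) # P(detected | state): only live caterpillars seen

def encounter_history_likelihood(history, start=np.array([1.0, 0.0, 0.0])):
    """Forward algorithm over hidden stages; history[t] is 1 if seen at survey t."""
    alpha = start
    for seen in history:
        alpha = alpha @ T                           # propagate stage probabilities
        emit = p_obs if seen else (1.0 - p_obs)     # observation probabilities
        alpha = alpha * emit
    return alpha.sum()

print(encounter_history_likelihood([0, 1, 1, 0]))
```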

  17. Using effort information with change-in-ratio data for population estimation

    USGS Publications Warehouse

    Udevitz, Mark S.; Pollock, Kenneth H.

    1995-01-01

    Most change-in-ratio (CIR) methods for estimating fish and wildlife population sizes have been based only on assumptions about how encounter probabilities vary among population subclasses. When information on sampling effort is available, it is also possible to derive CIR estimators based on assumptions about how encounter probabilities vary over time. This paper presents a generalization of previous CIR models that allows explicit consideration of a range of assumptions about the variation of encounter probabilities among subclasses and over time. Explicit estimators are derived under this model for specific sets of assumptions about the encounter probabilities. Numerical methods are presented for obtaining estimators under the full range of possible assumptions. Likelihood ratio tests for these assumptions are described. Emphasis is on obtaining estimators based on assumptions about variation of encounter probabilities over time.
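
    The classic two-sample CIR estimator, a special case of the generalized model, illustrates the idea: with subclass-x proportions p1 before and p2 after a known removal, the pre-removal population size is N̂ = (Rx − R·p2)/(p1 − p2), where R is the total removal and Rx the removal of subclass x. A sketch with self-consistent illustrative numbers:

```python
def cir_estimate(p1, p2, R_x, R_total):
    """Classic two-sample change-in-ratio estimator of pre-removal population size.
    p1, p2: subclass-x proportions observed before and after removal;
    R_x, R_total: known removals of subclass x and of all animals."""
    if p1 == p2:
        raise ValueError("subclass proportions must change for CIR to work")
    return (R_x - R_total * p2) / (p1 - p2)

# Illustrative: 60% males before harvesting 300 animals (250 of them male),
# 50% males after; the true pre-harvest size in this constructed example is 1000.
print("N-hat =", cir_estimate(p1=0.60, p2=0.50, R_x=250, R_total=300))
```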

  18. Statistical methods for incomplete data: Some results on model misspecification.

    PubMed

    McIsaac, Michael; Cook, R J

    2017-02-01

    Inverse probability weighted estimating equations and multiple imputation are two of the most studied frameworks for dealing with incomplete data in clinical and epidemiological research. We examine the limiting behaviour of estimators arising from inverse probability weighted estimating equations, augmented inverse probability weighted estimating equations and multiple imputation when the requisite auxiliary models are misspecified. We compute limiting values for settings involving binary responses and covariates and illustrate the effects of model misspecification using simulations based on data from a breast cancer clinical trial. We demonstrate that, even when both auxiliary models are misspecified, the asymptotic biases of double-robust augmented inverse probability weighted estimators are often smaller than the asymptotic biases of estimators arising from complete-case analyses, inverse probability weighting or multiple imputation. We further demonstrate that use of inverse probability weighting or multiple imputation with slightly misspecified auxiliary models can actually result in greater asymptotic bias than the use of naïve, complete case analyses. These asymptotic results are shown to be consistent with empirical results from simulation studies.
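
    A sketch of the inverse-probability-weighting idea under an assumed missing-at-random mechanism: fit a model for the probability of being observed, then weight complete cases by its inverse. The variable names, logistic observation model, and target (a simple mean) are illustrative, not the paper's trial setting:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(7)
n = 20_000
x = rng.normal(size=n)                      # fully observed covariate
y = 1.0 + 2.0 * x + rng.normal(size=n)      # outcome, to be partially missing
p_obs = 1 / (1 + np.exp(-(0.5 + 1.5 * x)))  # missingness depends on x (MAR)
observed = rng.uniform(size=n) < p_obs

# Complete-case mean of y is biased because observation depends on x:
print("complete-case mean:", y[observed].mean(), " true mean ~ 1.0")

# IPW: model P(observed | x), weight complete cases by 1 / p-hat.
ps = LogisticRegression().fit(x[:, None], observed).predict_proba(x[:, None])[:, 1]
w = 1.0 / ps[observed]
print("IPW mean:", np.average(y[observed], weights=w))
```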

  19. Deficits in Visuo-Motor Temporal Integration Impacts Manual Dexterity in Probable Developmental Coordination Disorder.

    PubMed

    Nobusako, Satoshi; Sakai, Ayami; Tsujimoto, Taeko; Shuto, Takashi; Nishi, Yuki; Asano, Daiki; Furukawa, Emi; Zama, Takuro; Osumi, Michihiro; Shimada, Sotaro; Morioka, Shu; Nakai, Akio

    2018-01-01

    The neurological basis of developmental coordination disorder (DCD) is thought to be deficits in the internal model and mirror-neuron system (MNS) in the parietal lobe and cerebellum. However, it is not clear if the visuo-motor temporal integration in the internal model and automatic-imitation function in the MNS differs between children with DCD and those with typical development (TD). The current study aimed to investigate these differences. Using the manual dexterity test of the Movement Assessment Battery for Children (second edition), the participants were either assigned to the probable DCD (pDCD) group or TD group. The former was comprised of 29 children with clumsy manual dexterity, while the latter consisted of 42 children with normal manual dexterity. Visuo-motor temporal integration ability and automatic-imitation function were measured using the delayed visual feedback detection task and motor interference task, respectively. Further, the current study investigated whether autism-spectrum disorder (ASD) traits, attention-deficit hyperactivity disorder (ADHD) traits, and depressive symptoms differed among the two groups, since these symptoms are frequent comorbidities of DCD. In addition, correlation and multiple regression analyses were performed to extract factors affecting clumsy manual dexterity. In the results, the delay-detection threshold (DDT) and steepness of the delay-detection probability curve, which indicated visuo-motor temporal integration ability, were significantly prolonged and decreased, respectively, in children with pDCD. The interference effect, which indicated automatic-imitation function, was also significantly reduced in this group. These results highlighted that children with clumsy manual dexterity have deficits in visuo-motor temporal integration and automatic-imitation function. There were significant correlations between manual dexterity and measures of visuo-motor temporal integration, as well as ASD and ADHD traits. Multiple regression analysis revealed that the DDT, which indicated visuo-motor temporal integration, was the greatest predictor of poor manual dexterity. The current results supported and provided further evidence for the internal model deficit hypothesis. Further, they suggested a neurorehabilitation technique that improved visuo-motor temporal integration could be therapeutically effective for children with DCD.

  1. Recent Advances on INSAR Temporal Decorrelation: Theory and Observations Using UAVSAR

    NASA Technical Reports Server (NTRS)

    Lavalle, M.; Hensley, S.; Simard, M.

    2011-01-01

    We review our recent advances in understanding the role of temporal decorrelation in SAR interferometry and polarimetric SAR interferometry. We developed a physical model of temporal decorrelation based on Gaussian-statistic motion that varies along the vertical direction in forest canopies. Temporal decorrelation depends on structural parameters such as forest height, is sensitive to polarization, and affects coherence amplitude and phase. A model of temporal-volume decorrelation valid for arbitrary spatial baseline is discussed. We tested the inversion of this model to estimate forest height from model simulations supported by JPL/UAVSAR data and lidar LVIS data. We found generally good agreement between forest height estimated from radar data and forest height estimated from lidar data.

  2. Multinomial mixture model with heterogeneous classification probabilities

    USGS Publications Warehouse

    Holland, M.D.; Gray, B.R.

    2011-01-01

    Royle and Link (Ecology 86(9):2505-2512, 2005) proposed an analytical method that allowed estimation of multinomial distribution parameters and classification probabilities from categorical data measured with error. While useful, we demonstrate algebraically and by simulations that this method yields biased multinomial parameter estimates when the probabilities of correct category classifications vary among sampling units. We address this shortcoming by treating these probabilities as logit-normal random variables within a Bayesian framework. We use Markov chain Monte Carlo to compute Bayes estimates from a simulated sample from the posterior distribution. Based on simulations, this elaborated Royle-Link model yields nearly unbiased estimates of multinomial and correct classification probabilities when classification probabilities are allowed to vary according to the normal distribution on the logit scale or according to the Beta distribution. The method is illustrated using categorical submersed aquatic vegetation data. © 2010 Springer Science+Business Media, LLC.

  3. Solute concentration at a well in non-Gaussian aquifers under constant and time-varying pumping schedule

    NASA Astrophysics Data System (ADS)

    Libera, Arianna; de Barros, Felipe P. J.; Riva, Monica; Guadagnini, Alberto

    2017-10-01

    Our study is keyed to the analysis of the interplay between engineering factors (i.e., transient pumping rates versus less realistic but commonly analyzed uniform extraction rates) and the heterogeneous structure of the aquifer (as expressed by the probability distribution characterizing transmissivity) on contaminant transport. We explore the joint influence of diverse (a) groundwater pumping schedules (constant and variable in time) and (b) representations of the stochastic heterogeneous transmissivity (T) field on temporal histories of solute concentrations observed at an extraction well. The stochastic nature of T is rendered by modeling its natural logarithm, Y = ln T, through a typical Gaussian representation and the recently introduced Generalized sub-Gaussian (GSG) model. The latter has the unique property of embedding scale-dependent non-Gaussian features of the main statistics of Y and its (spatial) increments, which have been documented in a variety of studies. We rely on numerical Monte Carlo simulations and compute the temporal evolution at the well of low order moments of the solute concentration (C), as well as statistics of the peak concentration (Cp), identified as the environmental performance metric of interest in this study. We show that the pumping schedule strongly affects the pattern of the temporal evolution of the first two statistical moments of C, regardless of the nature (Gaussian or non-Gaussian) of the underlying Y field, whereas the latter quantitatively influences their magnitude. Our results show that uncertainty associated with C and Cp estimates is larger when operating under a transient extraction scheme than under the action of a uniform withdrawal schedule. The probability density function (PDF) of Cp displays a long positive tail in the presence of a time-varying pumping schedule. All these aspects are magnified in the presence of non-Gaussian Y fields. Additionally, the PDF of Cp displays a bimodal shape for all types of pumping schemes analyzed, independent of the type of heterogeneity considered.

  4. Probability machines: consistent probability estimation using nonparametric learning machines.

    PubMed

    Malley, J D; Kruppa, J; Dasgupta, A; Malley, K G; Ziegler, A

    2012-01-01

    Most machine learning approaches only provide a classification for binary responses. However, probabilities are required for risk estimation using individual patient characteristics. It has been shown recently that every statistical learning machine known to be consistent for a nonparametric regression problem is a probability machine that is provably consistent for this estimation problem. The aim of this paper is to show how random forests and nearest neighbors can be used for consistent estimation of individual probabilities. Two random forest algorithms and two nearest neighbor algorithms are described in detail for estimation of individual probabilities. We discuss the consistency of random forests, nearest neighbors and other learning machines in detail. We conduct a simulation study to illustrate the validity of the methods. We exemplify the algorithms by analyzing two well-known data sets on the diagnosis of appendicitis and the diagnosis of diabetes in Pima Indians. Simulations demonstrate the validity of the method. With the real data application, we show the accuracy and practicality of this approach. We provide sample code from R packages in which the probability estimation is already available. This means that all calculations can be performed using existing software. Random forest algorithms as well as nearest neighbor approaches are valid machine learning methods for estimating individual probabilities for binary responses. Freely available implementations are available in R and may be used for applications.
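
    The paper itself provides sample code from R packages; as a hedged Python analogue, scikit-learn's random forest and nearest-neighbor classifiers expose individual probability estimates directly through predict_proba (the synthetic data below stand in for the appendicitis and Pima Indians sets):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier

X, y = make_classification(n_samples=2000, n_features=10, random_state=0)

# Both learners output individual probabilities, not just class labels:
rf = RandomForestClassifier(n_estimators=500, random_state=0).fit(X, y)
knn = KNeighborsClassifier(n_neighbors=50).fit(X, y)

print("RF  P(y=1) for first 3 cases:", rf.predict_proba(X[:3])[:, 1])
print("kNN P(y=1) for first 3 cases:", knn.predict_proba(X[:3])[:, 1])
```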

  5. Spatial and temporal variation of life-history traits documented using capture-mark-recapture methods in the vector snail Bulinus truncatus.

    PubMed

    Chlyeh, G; Henry, P Y; Jarne, P

    2003-09-01

    The population biology of the schistosome-vector snail Bulinus truncatus was studied in an irrigation area near Marrakech, Morocco, using demographic approaches, in order to estimate life-history parameters. The survey was conducted using 2 capture-mark-recapture analyses in 2 separate sites from the irrigation area, the first one in 1999 and the second one in 2000. Individuals larger than 5 mm were considered. The capture probability varied through time and space in both analyses. Apparent survival (from 0.7 to 1 per period of 2-3 days) varied with time and space (a series of sinks was considered), as well as with a quadratic function of size. These results suggest variation in the population intrinsic rate of increase. They also suggest that results from more classical analyses of population demography, aiming, for example, at estimating population size, should be interpreted with caution. Together with other results obtained in the same irrigation area, they also lead to some suggestions for population control.

  6. Effects of numerical dissipation and unphysical excursions on scalar-mixing estimates in large-eddy simulations

    NASA Astrophysics Data System (ADS)

    Sharan, Nek; Matheou, Georgios; Dimotakis, Paul

    2017-11-01

    Artificial numerical dissipation decreases dispersive oscillations and can play a key role in mitigating unphysical scalar excursions in large eddy simulations (LES). Its influence on scalar mixing can be assessed through the resolved-scale scalar, Z, its probability density function (PDF), variance, spectra, and the budget of the horizontally averaged equation for Z². LES of incompressible temporally evolving shear flow enabled us to study the influence of numerical dissipation on unphysical scalar excursions and mixing estimates. Flows with different mixing behavior, with both marching and non-marching scalar PDFs, are studied. Scalar fields for each flow are compared for different grid resolutions and numerical scalar-convection term schemes. As expected, increasing numerical dissipation enhances scalar mixing in the development stage of shear flow characterized by organized large-scale pairings with a non-marching PDF, but has little influence in the self-similar stage of flows with marching PDFs. Flow parameters and regimes sensitive to numerical dissipation help identify approaches to mitigate unphysical excursions while minimizing dissipation.

  7. Remote sensing applied to forest resources

    NASA Technical Reports Server (NTRS)

    Hernandezfilho, P. (Principal Investigator)

    1984-01-01

    The development of methodologies to classify reforested areas using remotely sensed data is discussed. A preliminary study was carried out in the northeast of Sao Paulo State in 1978. The reforested areas of Pinus spp. and Eucalyptus spp. were identified based on the spectral, spatial and temporal characteristics of LANDSAT imagery. Afterwards, a more detailed study was carried out in Mato Grosso do Sul State. The reforested areas were mapped as a function of stand age (from 0 to 1 year, 1 to 2 years, 2 to 3 years, 3 to 4 years, 4 to 5 years and 5 to 6 years) and of stand heterogeneity (from 0 to 20%, 20 to 40%, 40 to 60%, 60 to 80% and 80 to 100%). The relative differences between the artificial forest areas estimated from LANDSAT data and ground information varied from -8.72 to +9.49%. The estimation of forest volume through a multistage sampling technique, with probability proportional to size, is also discussed.

  8. Challenges estimating the return period of extreme floods for reinsurance applications

    NASA Astrophysics Data System (ADS)

    Raven, Emma; Busby, Kathryn; Liu, Ye

    2013-04-01

    Mapping and modelling extreme natural events is fundamental within the insurance and reinsurance industry for assessing risk. For example, insurers might use a 1 in 100-year flood hazard map to set the annual premium of a property, whilst a reinsurer might assess the national scale loss associated with the 1 in 200-year return period for capital and regulatory requirements. Using examples from a range of international flood projects, we focus on exploring how to define what the n-year flood looks like for predictive uses in re/insurance applications, whilst considering challenges posed by short historical flow records and the spatial and temporal complexities of flooding. First, we shall explore the use of extreme value theory (EVT) statistics for extrapolating data beyond the range of observations in a marginal analysis. In particular, we discuss how to estimate the return period of historical flood events and explore the impact that various statistical decisions have on these estimates. Decisions include: (1) selecting which distribution type to apply (e.g. generalised Pareto distribution (GPD) vs. generalised extreme value distribution (GEV)); (2) if the former, the choice of the threshold above which the GPD is fitted to the data; and (3) the necessity to perform a cluster analysis to group flow peaks to temporally represent individual flood events. Second, we summarise a specialised multivariate extreme value model, which combines the marginal analysis above with dependence modelling to generate industry standard event sets containing thousands of simulated, equi-probable floods across a region/country. These events represent the typical range of anticipated flooding across a region and can be used to estimate the largest or most widespread events that are expected to occur. Finally, we summarise how a reinsurance catastrophe model combines the event set with detailed flood hazard maps to estimate the financial cost of floods, both for the full event set and for individual extreme events. Since the predicted loss estimates, typically in the form of a curve plotting return period against modelled loss, are used in the pricing of reinsurance, we demonstrate the importance of the estimated return period and understanding the uncertainties associated with it.
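
    A minimal sketch of the marginal EVT step described above, fitting a generalised Pareto distribution to peaks over a threshold and reading off return levels; the synthetic flow record, the 99th-percentile threshold, and the 30-year record length are illustrative assumptions only:

    ```python
    import numpy as np
    from scipy.stats import genpareto

    # Hedged sketch of a peaks-over-threshold analysis; the synthetic flow
    # record, threshold choice, and record length are illustrative assumptions.
    rng = np.random.default_rng(1)
    daily_flow = rng.gamma(shape=2.0, scale=50.0, size=30 * 365)  # ~30 years

    u = np.quantile(daily_flow, 0.99)        # threshold: decision (2) in the text
    exceed = daily_flow[daily_flow > u] - u
    lam = len(exceed) / 30.0                 # mean exceedances per year

    shape, _, scale = genpareto.fit(exceed, floc=0)

    def return_level(T):
        """T-year return level from the fitted GPD (xi != 0 branch)."""
        return u + scale / shape * ((lam * T) ** shape - 1.0)

    for T in (10, 100, 200):
        print(f"1-in-{T}-year flow: {return_level(T):.1f}")
    ```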

  9. Horvitz-Thompson survey sample methods for estimating large-scale animal abundance

    USGS Publications Warehouse

    Samuel, M.D.; Garton, E.O.

    1994-01-01

    Large-scale surveys to estimate animal abundance can be useful for monitoring population status and trends, for measuring responses to management or environmental alterations, and for testing ecological hypotheses about abundance. However, large-scale surveys may be expensive and logistically complex. To ensure resources are not wasted on unattainable targets, the goals and uses of each survey should be specified carefully and alternative methods for addressing these objectives always should be considered. During survey design, the importance of each survey error component (spatial design, proportion of detected animals, precision in detection) should be considered carefully to produce a complete statistically based survey. Failure to address these three survey components may produce population estimates that are inaccurate (biased low), have unrealistic precision (too precise) and do not satisfactorily meet the survey objectives. Optimum survey design requires trade-offs in these sources of error relative to the costs of sampling plots and detecting animals on plots, considerations that are specific to the spatial logistics and survey methods. The Horvitz-Thompson estimators provide a comprehensive framework for considering all three survey components during the design and analysis of large-scale wildlife surveys. Problems of spatial and temporal (especially survey to survey) heterogeneity in detection probabilities have received little consideration, but failure to account for heterogeneity produces biased population estimates. The goal of producing unbiased population estimates is in conflict with the increased variation from heterogeneous detection in the population estimate. One solution to this conflict is to use an MSE-based approach to achieve a balance between bias reduction and increased variation. Further research is needed to develop methods that address spatial heterogeneity in detection, evaluate the effects of temporal heterogeneity on survey objectives and optimize decisions related to survey bias and variance. Finally, managers and researchers involved in the survey design process must realize that obtaining the best survey results requires an interactive and recursive process of survey design, execution, analysis and redesign. Survey refinements will be possible as further knowledge is gained on the actual abundance and distribution of the population and on the most efficient techniques for detecting animals.
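
    The core of the Horvitz-Thompson idea is inverse-probability weighting over both the spatial design and detection. A toy sketch under assumed plot inclusion and detection probabilities (all numbers invented):

    ```python
    import numpy as np

    # Minimal sketch (made-up numbers) of the Horvitz-Thompson idea behind
    # large-scale abundance surveys: each counted animal is weighted by the
    # inverse of the probability that it entered the sample at all, i.e. the
    # plot inclusion probability times the detection probability.
    counts = np.array([14, 3, 22, 7])             # animals detected per plot
    pi_plot = np.array([0.10, 0.05, 0.10, 0.08])  # plot inclusion probabilities
    p_detect = np.array([0.7, 0.6, 0.8, 0.7])     # estimated detection probabilities

    N_hat = np.sum(counts / (pi_plot * p_detect))
    print(f"Horvitz-Thompson abundance estimate: {N_hat:.0f}")
    # Ignored heterogeneity in p_detect (e.g. using a single average detection
    # probability) biases N_hat, which is the entry's central caution.
    ```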

  10. Pyroclastic density current hazard maps at Campi Flegrei caldera (Italy): the effects of event scale, vent location and time forecasts.

    NASA Astrophysics Data System (ADS)

    Bevilacqua, Andrea; Neri, Augusto; Esposti Ongaro, Tomaso; Isaia, Roberto; Flandoli, Franco; Bisson, Marina

    2016-04-01

    Today hundreds of thousands of people live inside the Campi Flegrei caldera (Italy) and in the adjacent part of the city of Naples, making a future eruption of such a volcano an event with huge consequences. Very high risks are associated with the occurrence of pyroclastic density currents (PDCs). Mapping of background or long-term PDC hazard in the area is a great challenge due to the unknown eruption time, scale and vent location of the next event as well as the complex dynamics of the flow over the caldera topography. This is additionally complicated by the remarkable epistemic uncertainty on the eruptive record, affecting the time of past events, the location of vents as well as the PDC areal extent estimates. First probability maps of PDC invasion were produced combining a vent-opening probability map, statistical estimates concerning the eruptive scales and a Cox-type temporal model including self-excitement effects, based on the eruptive record of the last 15 kyr. Maps were produced by using a Monte Carlo approach and adopting a simplified inundation model based on the "box model" integral approximation tested with 2D transient numerical simulations of flow dynamics. In this presentation we illustrate the independent effects of eruption scale, vent location and time of forecast of the next event. Specific focus was given to the remarkable differences between the eastern and western sectors of the caldera and their effects on the hazard maps. The analysis allowed us to identify areas with elevated probabilities of flow invasion as a function of the diverse assumptions made. By quantifying some sources of uncertainty in the system, we were also able to provide mean and percentile maps of PDC hazard levels.

  11. Rapid spread and association of Schmallenberg virus with ruminant abortions and foetal death in Austria in 2012/2013.

    PubMed

    Steinrigl, Adolf; Schiefer, Peter; Schleicher, Corina; Peinhopf, Walter; Wodak, Eveline; Bagó, Zoltán; Schmoll, Friedrich

    2014-10-15

    Schmallenberg virus (SBV) has emerged in summer-autumn 2011 in north-western Europe. Since then, SBV has been continuously spreading over Europe, including Austria, where antibodies to SBV, as well as SBV genome, were first detected in autumn 2012. This study was performed to demonstrate the dynamics of SBV spread within Austria, after its probable first introduction in summer 2012. True seroprevalence estimates for cattle and small ruminants were calculated to demonstrate temporal and regional differences of infection. Furthermore, the probability of SBV genome detection in foetal tissues of aborted or stillborn cattle and small ruminants as well as in allantoic fluid samples from cows with early foetal losses was retrospectively assessed. SBV first reached Austria most likely in July-August 2012, as indicated by retrospective detection of SBV antibodies and SBV genome in archived samples. From August to October 2012, a rapid increase in seroprevalence to over 98% in cattle and a contemporaneous peak in the detection of SBV genome in foetal tissues and allantoic fluid samples was noted, indicating widespread acute infections. Notably, foetal malformations were absent in RT-qPCR positive foetuses at this time of the epidemic. SBV spread within Austrian cattle reached a plateau phase as early as October 2012, without significant regional differences in SBV seroprevalence (98.4-100%). Estimated true seroprevalences among small ruminants were comparatively lower than in cattle and regionally different (58.3-95.6% in October 2012), potentially indicating an eastward spread of the infection, as well as different infection dynamics between cattle and small ruminants. Additionally, the probability of SBV genome detection over time differed significantly between small ruminant and cattle samples subjected to RT-qPCR testing. Copyright © 2014 Elsevier B.V. All rights reserved.

  12. Critical behavior in earthquake energy dissipation

    NASA Astrophysics Data System (ADS)

    Wanliss, James; Muñoz, Víctor; Pastén, Denisse; Toledo, Benjamín; Valdivia, Juan Alejandro

    2017-09-01

    We explore bursty multiscale energy dissipation from earthquakes within the region bounded by latitudes 29° S and 35.5° S and longitudes 69.501° W and 73.944° W (the Chilean central zone). Our work compares the predictions of a theory of nonequilibrium phase transitions with nonstandard statistical signatures of earthquake complex scaling behaviors. For temporal scales less than 84 hours, the time development of earthquake radiated energy activity follows an algebraic arrangement consistent with estimates from the theory of nonequilibrium phase transitions. There are no characteristic scales for probability distributions of sizes and lifetimes of the activity bursts in the scaling region. The power-law exponents describing the probability distributions suggest that the main energy dissipation takes place due to the largest bursts of activity, such as major earthquakes, as opposed to smaller activations which contribute less significantly though they occur far more frequently. The results obtained provide statistical evidence that earthquake energy dissipation mechanisms are essentially "scale-free", displaying statistical and dynamical self-similarity. Our results provide some evidence that earthquake radiated energy and directed percolation belong to a similar universality class.

  13. Bayesian time series analysis of segments of the Rocky Mountain trumpeter swan population

    USGS Publications Warehouse

    Wright, Christopher K.; Sojda, Richard S.; Goodman, Daniel

    2002-01-01

    A Bayesian time series analysis technique, the dynamic linear model, was used to analyze counts of Trumpeter Swans (Cygnus buccinator) summering in Idaho, Montana, and Wyoming from 1931 to 2000. For the Yellowstone National Park segment of white birds (sub-adults and adults combined) the estimated probability of a positive growth rate is 0.01. The estimated probability of achieving the Subcommittee on Rocky Mountain Trumpeter Swans 2002 population goal of 40 white birds for the Yellowstone segment is less than 0.01. Outside of Yellowstone National Park, Wyoming white birds are estimated to have a 0.79 probability of a positive growth rate with a 0.05 probability of achieving the 2002 objective of 120 white birds. In the Centennial Valley in southwest Montana, results indicate a probability of 0.87 that the white bird population is growing at a positive rate with considerable uncertainty. The estimated probability of achieving the 2002 Centennial Valley objective of 160 white birds is 0.14 but under an alternative model falls to 0.04. The estimated probability that the Targhee National Forest segment of white birds has a positive growth rate is 0.03. In Idaho outside of the Targhee National Forest, white birds are estimated to have a 0.97 probability of a positive growth rate with a 0.18 probability of attaining the 2002 goal of 150 white birds.
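
    The study's dynamic linear model is not reproduced here, but the flavour of its probability statements can be sketched with a much simpler hedged stand-in: a random walk with drift on log counts, whose posterior drift gives P(positive growth rate). The counts below are invented:

    ```python
    import numpy as np
    from scipy import stats

    # Not the authors' dynamic linear model: a simpler stand-in. Under a flat
    # prior, the posterior for the mean drift of log counts is Student-t,
    # giving a direct P(growth rate > 0). Counts are invented for illustration.
    counts = np.array([52, 55, 49, 61, 58, 66, 63, 70, 74, 71])
    d = np.diff(np.log(counts))                  # yearly log-growth increments

    n, mean, sd = len(d), d.mean(), d.std(ddof=1)
    t_post = stats.t(df=n - 1, loc=mean, scale=sd / np.sqrt(n))
    print(f"P(growth rate > 0) ~= {1 - t_post.cdf(0.0):.2f}")
    ```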

  14. Temporally diffeomorphic cardiac motion estimation from three-dimensional echocardiography by minimization of intensity consistency error.

    PubMed

    Zhang, Zhijun; Ashraf, Muhammad; Sahn, David J; Song, Xubo

    2014-05-01

    Quantitative analysis of cardiac motion is important for evaluation of heart function. Three dimensional (3D) echocardiography is among the most frequently used imaging modalities for motion estimation because it is convenient, real-time, low-cost, and nonionizing. However, motion estimation from 3D echocardiographic sequences is still a challenging problem due to low image quality and image corruption by noise and artifacts. The authors have developed a temporally diffeomorphic motion estimation approach in which the velocity field instead of the displacement field was optimized. The optimal velocity field optimizes a novel similarity function, which we call the intensity consistency error, defined by evolving multiple consecutive frames to each time point. The optimization problem is solved by using the steepest descent method. Experiments with simulated datasets, images of an ex vivo rabbit phantom, images of in vivo open-chest pig hearts, and healthy human images were used to validate the authors' method. Tests on simulated and real cardiac sequences showed that the authors' method is more accurate than competing temporally diffeomorphic methods. Tests with sonomicrometry showed that the tracked crystal positions have good agreement with ground truth and the authors' method has higher accuracy than the temporal diffeomorphic free-form deformation (TDFFD) method. Validation with an open-access human cardiac dataset showed that the authors' method has smaller feature tracking errors than both TDFFD and frame-to-frame methods. The authors proposed a diffeomorphic motion estimation method with temporal smoothness by constraining the velocity field to have maximum local intensity consistency within multiple consecutive frames. The estimated motion using the authors' method has good temporal consistency and is more accurate than other temporally diffeomorphic motion estimation methods.

  15. Crash probability estimation via quantifying driver hazard perception.

    PubMed

    Li, Yang; Zheng, Yang; Wang, Jianqiang; Kodaka, Kenji; Li, Keqiang

    2018-07-01

    Crash probability estimation is an important method to predict the potential reduction of crash probability contributed by forward collision avoidance technologies (FCATs). In this study, we propose a practical approach to estimate crash probability, which combines a field operational test and numerical simulations of a typical rear-end crash model. To consider driver hazard perception characteristics, we define a novel hazard perception measure, called driver risk response time, by considering both time-to-collision (TTC) and driver braking response to impending collision risk in a near-crash scenario. Also, we establish a driving database under mixed Chinese traffic conditions based on a CMBS (Collision Mitigation Braking Systems)-equipped vehicle. Applying the crash probability estimation to this database, we estimate the potential decrease in crash probability owing to use of CMBS. A comparison of the results with CMBS on and off shows a 13.7% reduction of crash probability in a typical rear-end near-crash scenario with a one-second delay of the driver's braking response. These results indicate that CMBS is effective in collision prevention, especially in the case of inattentive or older drivers. The proposed crash probability estimation offers a practical way for evaluating the safety benefits in the design and testing of FCATs. Copyright © 2017 Elsevier Ltd. All rights reserved.
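
    The building blocks of the hazard-perception measure, time-to-collision and a braking-response delay, can be sketched as follows; the exact composition of the authors' driver risk response time is not reproduced, and all numbers are illustrative:

    ```python
    # Illustrative building blocks only; the authors' exact definition of
    # driver risk response time is not reproduced, and all numbers are invented.
    def time_to_collision(gap_m: float, closing_speed_mps: float) -> float:
        """TTC = relative distance / closing speed (when closing on the leader)."""
        if closing_speed_mps <= 0:
            return float("inf")          # not closing: no collision course
        return gap_m / closing_speed_mps

    ttc = time_to_collision(gap_m=18.0, closing_speed_mps=9.0)   # 2.0 s
    brake_delay_s = 1.0                  # the one-second response delay studied
    print(f"TTC = {ttc:.1f} s; margin after delay = {ttc - brake_delay_s:.1f} s")
    ```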

  16. Estimating adult sex ratios in nature.

    PubMed

    Ancona, Sergio; Dénes, Francisco V; Krüger, Oliver; Székely, Tamás; Beissinger, Steven R

    2017-09-19

    Adult sex ratio (ASR, the proportion of males in the adult population) is a central concept in population and evolutionary biology, and is also emerging as a major factor influencing mate choice, pair bonding and parental cooperation in both human and non-human societies. However, estimating ASR is fraught with difficulties stemming from the effects of spatial and temporal variation in the numbers of males and females, and detection/capture probabilities that differ between the sexes. Here, we critically evaluate methods for estimating ASR in wild animal populations, reviewing how recent statistical advances can be applied to handle some of these challenges. We review methods that directly account for detection differences between the sexes using counts of unmarked individuals (observed, trapped or killed) and counts of marked individuals using mark-recapture models. We review a third class of methods that do not directly sample the number of males and females, but instead estimate the sex ratio indirectly using relationships that emerge from demographic measures, such as survival, age structure, reproduction and assumed dynamics. We recommend that detection-based methods be used for estimating ASR in most situations, and point out that studies are needed that compare different ASR estimation methods and control for sex differences in dispersal. This article is part of the themed issue 'Adult sex ratios and reproductive decisions: a critical re-examination of sex differences in human and animal societies'. © 2017 The Author(s).

  17. The cometary and asteroidal impactor flux at the earth

    NASA Technical Reports Server (NTRS)

    Weissman, Paul R.

    1988-01-01

    The cratering records on the Earth and the lunar maria provide upper limits on the total impactor flux at the Earth's orbit over the past 600 Myr and the past 3.3 Gyr, respectively. These limits can be compared with estimates of the expected cratering rate from observed comets and asteroids in Earth-crossing orbits, corrected for observational selection effects and incompleteness, and including expected temporal variations in the impactor flux. Both estimates can also be used to calculate the probability of large impacts which may result in biological extinction events on the Earth. The estimated cratering rate on the Earth for craters greater than 10 km in diameter, based on counted craters on dated surfaces, is (2.2 ± 1.1) × 10^-14 km^-2 yr^-1 (Shoemaker et al., 1979). Using a revised mass distribution for cometary nuclei based on the results of the spacecraft flybys of Comet Halley in 1986, and other refinements in the estimate of the cometary flux in the terrestrial planets zone, it is now estimated that long-period comets account for 11 percent of the cratering on the Earth (scaled to the estimate above), and short-period comets account for 4 percent (Weissman, 1987). However, the greatest contribution is from large but infrequent, random cometary showers, accounting for 22 percent of the terrestrial cratering.

  18. Estimating the probability that the Taser directly causes human ventricular fibrillation.

    PubMed

    Sun, H; Haemmerich, D; Rahko, P S; Webster, J G

    2010-04-01

    This paper describes the first methodology and results for estimating the order of probability for Tasers directly causing human ventricular fibrillation (VF). The probability of an X26 Taser causing human VF was estimated using: (1) current density near the human heart estimated by using 3D finite-element (FE) models; (2) prior data of the maximum dart-to-heart distances that caused VF in pigs; (3) minimum skin-to-heart distances measured in erect humans by echocardiography; and (4) dart landing distribution estimated from police reports. The estimated mean probability of human VF was 0.001 for data from a pig having a chest wall resected to the ribs and 0.000006 for data from a pig with no resection when inserting a blunt probe. The VF probability for a given dart location decreased with the dart-to-heart horizontal distance (radius) on the skin surface.

  19. Identification of Stochastically Perturbed Autonomous Systems from Temporal Sequences of Probability Density Functions

    NASA Astrophysics Data System (ADS)

    Nie, Xiaokai; Luo, Jingjing; Coca, Daniel; Birkin, Mark; Chen, Jing

    2018-03-01

    The paper introduces a method for reconstructing one-dimensional iterated maps that are driven by an external control input and subjected to an additive stochastic perturbation, from sequences of probability density functions that are generated by the stochastic dynamical systems and observed experimentally.

  20. Human papillomavirus in oropharyngeal cancer in Canada: analysis of 5 comprehensive cancer centres using multiple imputation

    PubMed Central

    Habbous, Steven; Chu, Karen P.; Lau, Harold; Schorr, Melissa; Belayneh, Mathieos; Ha, Michael N.; Murray, Scott; O’Sullivan, Brian; Huang, Shao Hui; Snow, Stephanie; Parliament, Matthew; Hao, Desiree; Cheung, Winson Y.; Xu, Wei; Liu, Geoffrey

    2017-01-01

    BACKGROUND: The incidence of oropharyngeal cancer has risen over the past 2 decades. This rise has been attributed to human papillomavirus (HPV), but information on temporal trends in incidence of HPV-associated cancers across Canada is limited. METHODS: We collected social, clinical and demographic characteristics and p16 protein status (p16-positive or p16-negative, using this immunohistochemistry variable as a surrogate marker of HPV status) for 3643 patients with oropharyngeal cancer diagnosed between 2000 and 2012 at comprehensive cancer centres in British Columbia (6 centres), Edmonton, Calgary, Toronto and Halifax. We used receiver operating characteristic curves and multiple imputation to estimate the p16 status for missing values. We chose a best-imputation probability cut point on the basis of accuracy in samples with known p16 status and through an independent relation between p16 status and overall survival. We used logistic and Cox proportional hazard regression. RESULTS: We found no temporal changes in p16-positive status initially, but there was significant selection bias, with p16 testing significantly more likely to be performed in males, lifetime never-smokers, patients with tonsillar or base-of-tongue tumours and those with nodal involvement (p < 0.05 for each variable). We used the following variables associated with p16-positive status for multiple imputation: male sex, tonsillar or base-of-tongue tumours, smaller tumours, nodal involvement, less smoking and lower alcohol consumption (p < 0.05 for each variable). Using sensitivity analyses, we showed that different imputation probability cut points for p16-positive status each identified a rise from 2000 to 2012, with the best-probability cut point identifying an increase from 47.3% in 2000 to 73.7% in 2012 (p < 0.001). INTERPRETATION: Across multiple centres in Canada, there was a steady rise in the proportion of oropharyngeal cancers attributable to HPV from 2000 to 2012. PMID:28808115

  1. Temporal plus epilepsy is a major determinant of temporal lobe surgery failures.

    PubMed

    Barba, Carmen; Rheims, Sylvain; Minotti, Lorella; Guénot, Marc; Hoffmann, Dominique; Chabardès, Stephan; Isnard, Jean; Kahane, Philippe; Ryvlin, Philippe

    2016-02-01

    Reasons for failed temporal lobe epilepsy surgery remain unclear. Temporal plus epilepsy, characterized by a primary temporal lobe epileptogenic zone extending to neighbouring regions, might account for a yet unknown proportion of these failures. In this study all patients from two epilepsy surgery programmes who fulfilled the following criteria were included: (i) underwent an anterior temporal lobectomy or disconnection between January 1990 and December 2001; (ii) magnetic resonance imaging normal or showing signs of hippocampal sclerosis; and (iii) postoperative follow-up ≥ 24 months for seizure-free patients. Patients were classified as suffering from unilateral temporal lobe epilepsy, bitemporal epilepsy or temporal plus epilepsy based on available presurgical data. Kaplan-Meier survival analysis was used to calculate the probability of seizure freedom over time. Predictors of seizure recurrence were investigated using a Cox proportional hazards model. Of 168 patients included, 108 (63.7%) underwent stereoelectroencephalography, 131 (78%) had hippocampal sclerosis, 149 suffered from unilateral temporal lobe epilepsy (88.7%), one from bitemporal epilepsy (0.6%) and 18 (10.7%) from temporal plus epilepsy. The probability of Engel class I outcome at 10 years of follow-up was 67.3% (95% CI: 63.4-71.2) for the entire cohort, 74.5% (95% CI: 70.6-78.4) for unilateral temporal lobe epilepsy, and 14.8% (95% CI: 5.9-23.7) for temporal plus epilepsy. Multivariate analyses demonstrated four predictors of seizure relapse: temporal plus epilepsy (P < 0.001), postoperative hippocampal remnant (P = 0.001), past history of traumatic or infectious brain insult (P = 0.022), and secondary generalized tonic-clonic seizures (P = 0.023). Risk of temporal lobe surgery failure was 5.06 (95% CI: 2.36-10.382) times greater in patients with temporal plus epilepsy than in those with unilateral temporal lobe epilepsy. Temporal plus epilepsy represents a hitherto unrecognized prominent cause of temporal lobe surgery failures. In patients with temporal plus epilepsy, anterior temporal lobectomy appears very unlikely to control seizures and should not be advised. Whether larger resection of temporal plus epileptogenic zones offers greater chance of seizure freedom remains to be investigated. © The Author (2015). Published by Oxford University Press on behalf of the Guarantors of Brain. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
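
    A hedged sketch of the survival machinery used above, the Kaplan-Meier probability of remaining seizure-free, here via the lifelines package with fabricated follow-up data:

    ```python
    # Minimal Kaplan-Meier sketch with fabricated follow-up data, using the
    # `lifelines` package; this is not the study's dataset.
    from lifelines import KaplanMeierFitter

    years_followed = [1.5, 3.0, 10.0, 2.0, 8.0, 10.0, 4.5, 10.0, 0.5, 6.0]
    relapsed =       [1,   1,   0,    1,   0,   0,    1,   0,    1,   1]  # 0 = censored

    kmf = KaplanMeierFitter()
    kmf.fit(durations=years_followed, event_observed=relapsed)
    print(kmf.survival_function_at_times([5.0, 10.0]))
    # The study's regression step (lifelines.CoxPHFitter) would then relate
    # relapse risk to predictors such as temporal plus epilepsy.
    ```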

  2. A robust measure of HIV-1 population turnover within chronically infected individuals.

    PubMed

    Achaz, G; Palmer, S; Kearney, M; Maldarelli, F; Mellors, J W; Coffin, J M; Wakeley, J

    2004-10-01

    A simple nonparametric test for population structure was applied to temporally spaced samples of HIV-1 sequences from the gag-pol region within two chronically infected individuals. The results show that temporal structure can be detected for samples separated by about 22 months or more. The performance of the method, which was originally proposed to detect geographic structure, was tested for temporally spaced samples using neutral coalescent simulations. Simulations showed that the method is robust to variation in sample sizes and mutation rates, to the presence/absence of recombination, and that the power to detect temporal structure is high. By comparing levels of temporal structure in simulations to the levels observed in real data, we estimate the effective intra-individual population size of HIV-1 to be between 10^3 and 10^4 viruses, which is in agreement with some previous estimates. Using this estimate and a simple measure of sequence diversity, we estimate an effective neutral mutation rate of about 5 × 10^-6 per site per generation in the gag-pol region. The definition and interpretation of estimates of such "effective" population parameters are discussed.

  3. A likelihood framework for joint estimation of salmon abundance and migratory timing using telemetric mark-recapture

    USGS Publications Warehouse

    Bromaghin, Jeffrey F.; Gates, Kenneth S.; Palmer, Douglas E.

    2010-01-01

    Many fisheries for Pacific salmon Oncorhynchus spp. are actively managed to meet escapement goal objectives. In fisheries where the demand for surplus production is high, an extensive assessment program is needed to achieve the opposing objectives of allowing adequate escapement and fully exploiting the available surplus. Knowledge of abundance is a critical element of such assessment programs. Abundance estimation using mark-recapture experiments in combination with telemetry has become common in recent years, particularly within Alaskan river systems. Fish are typically captured and marked in the lower river while migrating in aggregations of individuals from multiple populations. Recapture data are obtained using telemetry receivers that are co-located with abundance assessment projects near spawning areas, which provide large sample sizes and information on population-specific mark rates. When recapture data are obtained from multiple populations, unequal mark rates may reflect a violation of the assumption of homogeneous capture probabilities. A common analytical strategy is to test the hypothesis that mark rates are homogeneous and combine all recapture data if the test is not significant. However, mark rates are often low, and a test of homogeneity may lack sufficient power to detect meaningful differences among populations. In addition, differences among mark rates may provide information that could be exploited during parameter estimation. We present a temporally stratified mark-recapture model that permits capture probabilities and migratory timing through the capture area to vary among strata. Abundance information obtained from a subset of populations after the populations have segregated for spawning is jointly modeled with telemetry distribution data by use of a likelihood function. Maximization of the likelihood produces estimates of the abundance and timing of individual populations migrating through the capture area, thus yielding substantially more information than the total abundance estimate provided by the conventional approach. The utility of the model is illustrated with data for coho salmon O. kisutch from the Kasilof River in south-central Alaska.

  4. Assessment of Rainfall Estimates Using a Standard Z-R Relationship and the Probability Matching Method Applied to Composite Radar Data in Central Florida

    NASA Technical Reports Server (NTRS)

    Crosson, William L.; Duchon, Claude E.; Raghavan, Ravikumar; Goodman, Steven J.

    1996-01-01

    Precipitation estimates from radar systems are a crucial component of many hydrometeorological applications, from flash flood forecasting to regional water budget studies. For analyses on large spatial scales and long timescales, it is frequently necessary to use composite reflectivities from a network of radar systems. Such composite products are useful for regional or national studies, but introduce a set of difficulties not encountered when using single radars. For instance, each contributing radar has its own calibration and scanning characteristics, but radar identification may not be retained in the compositing procedure. As a result, range effects on signal return cannot be taken into account. This paper assesses the accuracy with which composite radar imagery can be used to estimate precipitation in the convective environment of Florida during the summer of 1991. Results using Z = 300R^1.4 (the WSR-88D default Z-R relationship) are compared with those obtained using the probability matching method (PMM). Rainfall derived from the power-law Z-R was found to be highly biased (+90% to +110%) compared to rain gauge measurements for various temporal and spatial integrations. Application of a 36.5-dBZ reflectivity threshold (determined via the PMM) was found to improve the performance of the power-law Z-R, reducing the biases substantially to 20%-33%. Correlations between precipitation estimates obtained with either Z-R relationship and mean gauge values are much higher for areal averages than for point locations. Precipitation estimates from the PMM are an improvement over those obtained using the power law in that biases and root-mean-square errors are much lower. The minimum timescale for application of the PMM with the composite radar dataset was found to be several days for area-average precipitation. The minimum spatial scale is harder to quantify, although it is concluded that it is less than 350 sq km. Implications relevant to the WSR-88D system are discussed.
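
    The power-law step under discussion is easy to make concrete. A small sketch that inverts Z = 300R^1.4 for rain rate; exactly how the 36.5-dBZ threshold was applied operationally is an assumption here (rain assigned only at or above the threshold):

    ```python
    import numpy as np

    # Sketch of the power-law step: convert reflectivity (dBZ) to rain rate by
    # inverting Z = 300 * R**1.4; the treatment of the 36.5-dBZ threshold is an
    # assumption for illustration.
    def rain_rate_mm_per_h(dbz, threshold_dbz=36.5):
        dbz = np.asarray(dbz, dtype=float)
        z = 10.0 ** (dbz / 10.0)               # dBZ -> linear reflectivity factor
        r = (z / 300.0) ** (1.0 / 1.4)         # invert Z = 300 R^1.4
        return np.where(dbz >= threshold_dbz, r, 0.0)

    print(rain_rate_mm_per_h([20.0, 36.5, 45.0, 55.0]))
    ```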

  5. Natal location influences movement and survival of a spatially structured population of snail kites

    USGS Publications Warehouse

    Martin, J.; Kitchens, W.M.; Hines, J.E.

    2007-01-01

    Despite the accepted importance of the need to better understand how natal location affects movement decisions and survival of animals, robust estimates of movement and survival in relation to the natal location are lacking. Our study focuses on movement and survival related to the natal location of snail kites in Florida and shows that kites, in addition to exhibiting a high level of site tenacity to breeding regions, also exhibit particular attraction to their natal region. More specifically, we found that estimates of movement from post-dispersal regions were greater toward natal regions than toward non-natal regions (differences were significant for three of four regions). We also found that estimates of natal philopatry were greater than estimates of philopatry to non-natal regions (differences were statistically significant for two of four regions). A previous study indicated an effect of natal region on juvenile survival; in this study, we show an effect of natal region on adult survival. Estimates of adult survival varied among kites that were hatched in different regions. Adults experienced mortality rates characteristic of the region occupied at the time when survival was measured, but because there is a greater probability that kites will return to their natal region than to any other regions, their survival was ultimately influenced by their natal region. In most years, kites hatched in southern regions had greater survival probabilities than did kites hatched in northern regions. However, during a multiregional drought, one of the northern regions served as a refuge from drought, and during this perturbation, survival was greater for birds hatched in the north. Our study shows that natal location may be important in influencing the ecological dynamics of kites but also highlights the importance of considering temporal variation in habitat conditions of spatially structured systems when attempting to evaluate the conservation value of habitats.

  6. Unbiased multi-fidelity estimate of failure probability of a free plane jet

    NASA Astrophysics Data System (ADS)

    Marques, Alexandre; Kramer, Boris; Willcox, Karen; Peherstorfer, Benjamin

    2017-11-01

    Estimating failure probability related to fluid flows is a challenge because it requires a large number of evaluations of expensive models. We address this challenge by leveraging multiple low fidelity models of the flow dynamics to create an optimal unbiased estimator. In particular, we investigate the effects of uncertain inlet conditions in the width of a free plane jet. We classify a condition as failure when the corresponding jet width is below a small threshold, such that failure is a rare event (failure probability is smaller than 0.001). We estimate failure probability by combining the frameworks of multi-fidelity importance sampling and optimal fusion of estimators. Multi-fidelity importance sampling uses a low fidelity model to explore the parameter space and create a biasing distribution. An unbiased estimate is then computed with a relatively small number of evaluations of the high fidelity model. In the presence of multiple low fidelity models, this framework offers multiple competing estimators. Optimal fusion combines all competing estimators into a single estimator with minimal variance. We show that this combined framework can significantly reduce the cost of estimating failure probabilities, and thus can have a large impact in fluid flow applications. This work was funded by DARPA.
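
    A toy version of the importance-sampling half of this framework, with a one-dimensional invented "jet width" response standing in for the expensive flow model:

    ```python
    import numpy as np
    from scipy import stats

    # Toy illustration of multi-fidelity importance sampling's key move: a
    # cheap surrogate suggests where failures live, a biasing distribution is
    # centred there, and reweighting keeps the estimate unbiased. The 1-D
    # "jet width" model is an invented stand-in.
    rng = np.random.default_rng(7)

    def jet_width(x):                        # pretend high-fidelity response
        return 1.0 + 0.3 * x

    nominal = stats.norm(0.0, 1.0)           # uncertain inlet condition
    fail = lambda x: jet_width(x) < 0.1      # rare event: width below threshold

    # Biasing distribution centred on the failure region (found, in practice,
    # from low-fidelity model runs rather than known in advance).
    biasing = stats.norm(-3.0, 1.0)
    x = biasing.rvs(size=20_000, random_state=rng)
    weights = nominal.pdf(x) / biasing.pdf(x)
    p_fail = np.mean(fail(x) * weights)      # unbiased despite biased sampling
    print(f"estimated failure probability: {p_fail:.2e}")
    ```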

  7. Gap Detection and Temporal Modulation Transfer Function as Behavioral Estimates of Auditory Temporal Acuity Using Band-Limited Stimuli in Young and Older Adults

    PubMed Central

    Shen, Yi

    2015-01-01

    Purpose Gap detection and the temporal modulation transfer function (TMTF) are 2 common methods to obtain behavioral estimates of auditory temporal acuity. However, the agreement between the 2 measures is not clear. This study compares results from these 2 methods and their dependencies on listener age and hearing status. Method Gap detection thresholds and the parameters that describe the TMTF (sensitivity and cutoff frequency) were estimated for young and older listeners who were naive to the experimental tasks. Stimuli were 800-Hz-wide noises with upper frequency limits of 2400 Hz, presented at 85 dB SPL. A 2-track procedure (Shen & Richards, 2013) was used for the efficient estimation of the TMTF. Results No significant correlation was found between gap detection threshold and the sensitivity or the cutoff frequency of the TMTF. No significant effect of age and hearing loss on either the gap detection threshold or the TMTF cutoff frequency was found, while the TMTF sensitivity improved with increasing hearing threshold and worsened with increasing age. Conclusion Estimates of temporal acuity using gap detection and TMTF paradigms do not seem to provide a consistent description of the effects of listener age and hearing status on temporal envelope processing. PMID:25087722

  8. On the nonlinearity of spatial scales in extreme weather attribution statements

    NASA Astrophysics Data System (ADS)

    Angélil, Oliver; Stone, Daíthí; Perkins-Kirkpatrick, Sarah; Alexander, Lisa V.; Wehner, Michael; Shiogama, Hideo; Wolski, Piotr; Ciavarella, Andrew; Christidis, Nikolaos

    2018-04-01

    In the context of ongoing climate change, extreme weather events are drawing increasing attention from the public and news media. A question often asked is how the likelihood of extremes might have been changed by anthropogenic greenhouse-gas emissions. Answers to the question are strongly influenced by the model used and by the duration, spatial extent, and geographic location of the event, some of which are often overlooked. Using output from four global climate models, we provide attribution statements characterised by a change in probability of occurrence due to anthropogenic greenhouse-gas emissions, for rainfall and temperature extremes occurring at seven discretised spatial scales and three temporal scales. An understanding of the sensitivity of attribution statements to a range of spatial and temporal scales of extremes allows for the scaling of attribution statements, rendering them relevant to other extremes having similar but non-identical characteristics. This is a procedure simple enough to approximate timely estimates of the anthropogenic contribution to the event probability. Furthermore, since real extremes do not have well-defined physical borders, scaling can help quantify uncertainty around attribution results due to uncertainty around the event definition. Results suggest that the sensitivity of attribution statements to spatial scale is similar across models and that the sensitivity of attribution statements to the model used is often greater than the sensitivity to a doubling or halving of the spatial scale of the event. The use of a range of spatial scales allows us to identify a nonlinear relationship between the spatial scale of the event studied and the attribution statement.
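
    Attribution statements of the kind described are commonly summarised as a probability ratio between a factual and a counterfactual ensemble. A minimal sketch with invented exceedance data:

    ```python
    import numpy as np

    # Hedged sketch of the attribution statement itself: the change in event
    # probability reported as a probability ratio between ensembles with and
    # without anthropogenic emissions. All values below are invented.
    threshold = 35.0  # event definition, e.g. area-mean temperature (deg C)

    actual = np.array([33.1, 35.4, 34.0, 36.2, 35.9, 34.8])   # all-forcings runs
    natural = np.array([32.0, 33.5, 33.9, 35.1, 33.2, 34.4])  # natural-only runs

    p1 = np.mean(actual >= threshold)
    p0 = np.mean(natural >= threshold)
    print(f"probability ratio PR = p1/p0 = {p1 / p0:.1f}")
    # Re-computing PR while doubling or halving the spatial averaging region is
    # the scaling experiment the paper performs across its seven spatial scales.
    ```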

  9. Population viability of Pediocactus bradyi (Cactaceae) in a changing climate.

    PubMed

    Shryock, Daniel F; Esque, Todd C; Hughes, Lee

    2014-11-01

    A key question concerns the vulnerability of desert species adapted to harsh, variable climates to future climate change. Evaluating this requires coupling long-term demographic models with information on past and projected future climates. We investigated climatic drivers of population growth using a 22-yr demographic model for Pediocactus bradyi, an endangered cactus in northern Arizona. We used a matrix model to calculate stochastic population growth rates (λs) and the relative influences of life-cycle transitions on population growth. Regression models linked population growth with climatic variability, while stochastic simulations were used to (1) understand how predicted increases in drought frequency and extreme precipitation would affect λs, and (2) quantify variability in λs based on temporal replication of data. Overall λs was below unity (0.961). Population growth was equally influenced by fecundity and survival and significantly correlated with increased annual precipitation and higher winter temperatures. Stochastic simulations increasing the probability of drought and extreme precipitation reduced λs, but less than simulations increasing the probability of drought alone. Simulations varying the temporal replication of data suggested 14 yr were required for accurate λs estimates. Pediocactus bradyi may be vulnerable to increases in the frequency and intensity of extreme climatic events, particularly drought. Biotic interactions resulting in low survival during drought years outweighed increased seedling establishment following heavy precipitation. Climatic extremes beyond historical ranges of variability may threaten rare desert species with low population growth rates and therefore high susceptibility to stochastic events. © 2014 Botanical Society of America, Inc.
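
    A hedged sketch of the stochastic matrix-model simulation underlying λs estimates of this kind; the two stage-structured matrices and the drought frequency below are invented, not the fitted P. bradyi parameters:

    ```python
    import numpy as np

    # Sketch of a stochastic matrix-model simulation: draw a projection matrix
    # each year ("normal" vs "drought"; both matrices and the drought frequency
    # are invented) and estimate the stochastic growth rate lambda_s as the
    # long-run geometric mean of one-step growth.
    rng = np.random.default_rng(0)

    A_normal = np.array([[0.00, 0.80],    # [juvenile, adult] stage model:
                         [0.15, 0.95]])   # fecundity on top, survival below
    A_drought = np.array([[0.00, 0.30],
                          [0.05, 0.85]])

    p_drought, years = 0.3, 20_000
    n = np.array([0.5, 0.5])
    log_growth = []
    for _ in range(years):
        A = A_drought if rng.random() < p_drought else A_normal
        n = A @ n
        s = n.sum()
        log_growth.append(np.log(s))
        n /= s                            # renormalise to avoid overflow
    print(f"stochastic growth rate ~ {np.exp(np.mean(log_growth)):.3f}")  # <1: decline
    ```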

  10. Counterion effects on the ultrafast dynamics of charge-transfer-to-solvent electrons.

    PubMed

    Rivas, N; Moriena, G; Domenianni, L; Hodak, J H; Marceca, E

    2017-12-06

    We performed femtosecond transient absorption (TA) experiments to monitor the solvation dynamics of charge-transfer-to-solvent (CTTS) electrons originating from UV photoexcitation of ammoniated iodide in close proximity to the counterions. Solutions of KI were prepared in liquid ammonia and TA experiments were carried out at different temperatures and densities, along the liquid-gas coexistence curve of the fluid. The results complement previous femtosecond TA work by P. Vöhringer's group in neat ammonia via multiphoton ionization. The dynamics of CTTS-detached electrons in ammonia was found to be strongly affected by ion pairing. Geminate recombination time constants as well as escape probabilities were determined from the measured temporal profiles and analysed as a function of the medium density. A fast unresolved (τ < 250 fs) increase of absorption related to the creation/thermalization of solvated electron species was followed by two decay components: one with a characteristic time around 10 ps, and a slower one that remains active for hundreds of picoseconds. While the first process is attributed to an early recombination of (I, e−) pairs, the second decay and its asymptote reflect the effect of the K+ counterion on the geminate recombination dynamics, rate and yield. The cation basically acts as an electron anchor that restricts the ejection distance, leading to solvent-separated counterion-electron species. The formation of (K+, NH3, e−) pairs close to the parent iodine atom brings the electron escape probability to very low values. Transient spectra of the electron species have also been estimated as a function of time by probing the temporal profiles at different wavelengths.

  11. A Repeated Trajectory Class Model for Intensive Longitudinal Categorical Outcome

    PubMed Central

    Lin, Haiqun; Han, Ling; Peduzzi, Peter N.; Murphy, Terrence E.; Gill, Thomas M.; Allore, Heather G.

    2014-01-01

    This paper presents a novel repeated latent class model for a longitudinal response that is frequently measured as in our prospective study of older adults with monthly data on activities of daily living (ADL) for more than ten years. The proposed method is especially useful when the longitudinal response is measured much more frequently than other relevant covariates. The repeated trajectory classes represent distinct temporal patterns of the longitudinal response wherein an individual’s membership in the trajectory classes may renew or change over time. Within a trajectory class, the longitudinal response is modeled by a class-specific generalized linear mixed model. Effectively, an individual may remain in a trajectory class or switch to another as the class membership predictors are updated periodically over time. The identification of a common set of trajectory classes allows changes among the temporal patterns to be distinguished from local fluctuations in the response. An informative event such as death is jointly modeled by class-specific probability of the event through shared random effects. We do not impose the conditional independence assumption given the classes. The method is illustrated by analyzing the change over time in ADL trajectory class among 754 older adults with 70500 person-months of follow-up in the Precipitating Events Project. We also investigate the impact of jointly modeling the class-specific probability of the event on the parameter estimates in a simulation study. The primary contribution of our paper is the periodic updating of trajectory classes for a longitudinal categorical response without assuming conditional independence. PMID:24519416

  12. The estimation of tree posterior probabilities using conditional clade probability distributions.

    PubMed

    Larget, Bret

    2013-07-01

    In this article I introduce the idea of conditional independence of separated subtrees as a principle by which to estimate the posterior probability of trees using conditional clade probability distributions rather than simple sample relative frequencies. I describe an algorithm for these calculations and software which implements these ideas. I show that these alternative calculations are very similar to simple sample relative frequencies for high probability trees but are substantially more accurate for relatively low probability trees. The method allows the posterior probability of unsampled trees to be calculated when these trees contain only clades that are in other sampled trees. Furthermore, the method can be used to estimate the total probability of the set of sampled trees which provides a measure of the thoroughness of a posterior sample.

  13. A probability model for evaluating the bias and precision of influenza vaccine effectiveness estimates from case-control studies.

    PubMed

    Haber, M; An, Q; Foppa, I M; Shay, D K; Ferdinands, J M; Orenstein, W A

    2015-05-01

    As influenza vaccination is now widely recommended, randomized clinical trials are no longer ethical in many populations. Therefore, observational studies on patients seeking medical care for acute respiratory illnesses (ARIs) are a popular option for estimating influenza vaccine effectiveness (VE). We developed a probability model for evaluating and comparing bias and precision of estimates of VE against symptomatic influenza from two commonly used case-control study designs: the test-negative design and the traditional case-control design. We show that when vaccination does not affect the probability of developing non-influenza ARI then VE estimates from test-negative design studies are unbiased even if vaccinees and non-vaccinees have different probabilities of seeking medical care against ARI, as long as the ratio of these probabilities is the same for illnesses resulting from influenza and non-influenza infections. Our numerical results suggest that in general, estimates from the test-negative design have smaller bias compared to estimates from the traditional case-control design as long as the probability of non-influenza ARI is similar among vaccinated and unvaccinated individuals. We did not find consistent differences between the standard errors of the estimates from the two study designs.
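
    The paper's central unbiasedness condition can be checked by simulation. The sketch below (all rates invented) encodes a vaccine that reduces influenza ARI but not non-influenza ARI, with care-seeking ratios equal across illness types, and recovers the true VE from the test-negative odds ratio:

    ```python
    import numpy as np

    # Simulation sketch of the test-negative design result above (all rates
    # invented): the vaccine cuts influenza ARI but not non-influenza ARI, and
    # care-seeking differs by vaccination status but identically for the two
    # illness types, so the odds-ratio VE from attended cases stays unbiased.
    rng = np.random.default_rng(3)
    N, cov, VE_true = 1_000_000, 0.5, 0.6

    vax = rng.random(N) < cov
    flu = rng.random(N) < np.where(vax, 0.04 * (1 - VE_true), 0.04)
    other = ~flu & (rng.random(N) < 0.06)           # vaccine-independent ARI
    seek = rng.random(N) < np.where(vax, 0.7, 0.5)  # same ratio for both illnesses

    a, b = np.sum(flu & vax & seek), np.sum(flu & ~vax & seek)
    c, d = np.sum(other & vax & seek), np.sum(other & ~vax & seek)
    print(f"estimated VE = {1 - (a / c) / (b / d):.2f} (true {VE_true})")
    ```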

  14. Nonparametric Estimation of the Probability of Ruin.

    DTIC Science & Technology

    1985-02-01

    [Abstract not available: the scanned record preserves only OCR fragments of the report header (Edward W. Frees, Mathematics Research Center, University of Wisconsin-Madison, February 1985).]

  15. A two-parameter design storm for Mediterranean convective rainfall

    NASA Astrophysics Data System (ADS)

    García-Bartual, Rafael; Andrés-Doménech, Ignacio

    2017-05-01

    The following research explores the feasibility of building effective design storms for extreme hydrological regimes, such as the one which characterizes the rainfall regime of the east and south-east of the Iberian Peninsula, without employing intensity-duration-frequency (IDF) curves as a starting point. Nowadays, after decades of operating automatic hydrological networks, there is an abundance of high-resolution rainfall data with reasonable statistical representation, which enables direct study of the temporal patterns and internal structure of rainfall events at a given geographic location, with the aim of establishing a statistical synthesis directly based on those observed patterns. The authors propose a temporal design storm defined in analytical terms, through a two-parameter gamma-type function. The two parameters are directly estimated from 73 independent storms identified from rainfall records of high temporal resolution in Valencia (Spain). All the relevant analytical properties derived from that function are developed in order to use this storm in real applications. In particular, in order to assign a probability to the design storm (return period), an auxiliary variable combining maximum intensity and total cumulated rainfall is introduced. As a result, for a given return period, a set of three storms with different duration, depth and peak intensity are defined. The consistency of the results is verified by means of comparison with the classic method of alternating blocks based on an IDF curve, for the above mentioned study case.
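
    A hedged sketch of a gamma-type two-parameter design hyetograph of the general kind the authors propose; the functional form shown, i(t) proportional to t^a e^(-bt), and all parameter values are illustrative assumptions rather than the fitted Valencia values:

    ```python
    import numpy as np

    # Gamma-type design hyetograph sketch: i(t) proportional to t**a * exp(-b*t),
    # rescaled to a chosen peak intensity. Parameters are illustrative, not the
    # values fitted to the Valencia record.
    def hyetograph(t_min, a=1.5, b=0.08, peak_mm_h=60.0):
        """Rainfall intensity (mm/h) at time t (minutes) with a single peak."""
        shape = t_min ** a * np.exp(-b * t_min)
        return peak_mm_h * shape / shape.max()

    t = np.arange(1.0, 181.0)                 # a 3-hour storm at 1-min resolution
    i = hyetograph(t)
    depth_mm = i.sum() / 60.0                 # mm/h sampled each minute -> mm
    print(f"peak {i.max():.0f} mm/h at t = {t[i.argmax()]:.0f} min, "
          f"total depth {depth_mm:.0f} mm")
    ```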

  16. Local cerebral glucose utilization during status epilepticus in newborn primates

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fujikawa, D.G.; Dwyer, B.E.; Lake, R.R.

    1989-06-01

    The effect of bicuculline-induced status epilepticus (SE) on local cerebral metabolic rates for glucose (LCMRglc) was studied in 2-wk-old ketamine-anesthetized marmoset monkeys, using the 2-[14C]deoxy-D-glucose autoradiographic technique. To estimate LCMRglc in cerebral cortex and thalamus during SE, the lumped constant (LC) for 2-deoxy-D-glucose (2-DG) and the rate constants for 2-DG and glucose were calculated for these regions. The control LC was 0.43 in frontoparietal cortex, 0.51 in temporal cortex, and 0.50 in thalamus; it increased to 1.07 in frontoparietal cortex, 1.13 in temporal cortex, and 1.25 in thalamus after 30 min of seizures. With control LC values, LCMRglc in frontoparietal cortex, temporal cortex, and dorsomedial thalamus appeared to increase four to sixfold. With seizure LC values, LCMRglc increased 1.5- to 2-fold and only in cortex. During 45-min seizures, LCMRglc in cortex and thalamus probably increases 4- to 6-fold initially and later falls to the 1.5- to 2-fold level as tissue glucose concentrations decrease. Together with our previous results demonstrating depletion of high-energy phosphates and glucose in these regions, the data suggest that energy demands exceed glucose supply. The long-term effects of these metabolic changes on the developing brain remain to be determined.

  17. Temporal variability and memory in sediment transport in an experimental step-pool channel

    NASA Astrophysics Data System (ADS)

    Saletti, Matteo; Molnar, Peter; Zimmermann, André; Hassan, Marwan A.; Church, Michael

    2015-11-01

    The temporal dynamics of sediment transport in steep channels are studied using two experiments performed in a steep flume (8% gradient) with natural sediment composed of 12 grain sizes. High-resolution (1 s) time series of sediment transport were measured for individual grain-size classes at the outlet of the flume for different combinations of sediment input rates and flow discharges. Our aim in this paper is to quantify (a) the relation between discharge and sediment transport and (b) the nature and strength of memory in grain-size-dependent transport. None of the simple statistical descriptors of sediment transport (mean, extreme values, and quantiles) display a clear relation with water discharge; in fact, large variability between discharge and sediment transport is observed. Instantaneous transport rates have probability density functions with heavy tails. Bed load bursts have a coarser grain-size distribution than that of the entire experiment. We quantify the strength and nature of memory in sediment transport rates by estimating the Hurst exponent and the autocorrelation coefficient of the time series for different grain sizes. Our results show the presence of the Hurst phenomenon in transport rates, indicating long-term memory which is grain-size dependent. The short-term memory in coarse-grain transport increases with temporal aggregation, and this reveals the importance of the sampling duration of bed load transport rates in natural streams, especially for large fractions.
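
    The long-memory statistic used above can be estimated with a minimal rescaled-range (R/S) procedure; applied to white noise, as in this sketch, the Hurst exponent should come out near 0.5, with persistent (long-memory) transport series expected to give larger values:

    ```python
    import numpy as np

    # Minimal rescaled-range (R/S) estimate of the Hurst exponent; the input
    # here is white noise, so H should come out near 0.5.
    def hurst_rs(x, window_sizes=(16, 32, 64, 128, 256, 512)):
        rs = []
        for w in window_sizes:
            chunks = x[: len(x) // w * w].reshape(-1, w)
            dev = chunks - chunks.mean(axis=1, keepdims=True)
            z = np.cumsum(dev, axis=1)                 # cumulative deviations
            r = z.max(axis=1) - z.min(axis=1)          # range within each window
            s = chunks.std(axis=1, ddof=1)             # within-window std dev
            rs.append(np.mean(r / s))
        slope, _ = np.polyfit(np.log(window_sizes), np.log(rs), 1)
        return slope                # H: ~0.5 no memory, >0.5 persistent

    x = np.random.default_rng(5).normal(size=8192)
    print(f"Hurst exponent ~ {hurst_rs(x):.2f}")
    ```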

  18. Demography of a reintroduced population: moving toward management models for an endangered species, the whooping crane

    USGS Publications Warehouse

    Servanty, Sabrina; Converse, Sarah J.; Bailey, Larissa L.

    2014-01-01

    The reintroduction of threatened and endangered species is now a common method for reestablishing populations. Typically, a fundamental objective of reintroduction is to establish a self-sustaining population. Estimation of demographic parameters in reintroduced populations is critical, as these estimates serve multiple purposes. First, they support evaluation of progress toward the fundamental objective via construction of population viability analyses (PVAs) to predict metrics such as probability of persistence. Second, PVAs can be expanded to support evaluation of management actions, via management modeling. Third, the estimates themselves can support evaluation of the demographic performance of the reintroduced population, e.g., via comparison with wild populations. For each of these purposes, thorough treatment of uncertainties in the estimates is critical. Recently developed statistical methods - namely, hierarchical Bayesian implementations of state-space models - allow for effective integration of different types of uncertainty in estimation. We undertook a demographic estimation effort for a reintroduced population of endangered whooping cranes with the purpose of ultimately developing a Bayesian PVA for determining progress toward establishing a self-sustaining population, and for evaluating potential management actions via a Bayesian PVA-based management model. We evaluated individual and temporal variation in demographic parameters based upon a multi-state mark-recapture model. We found that survival was relatively high across time and varied little by sex. There was some indication that survival varied by release method. Survival was similar to that observed in the wild population. Although overall reproduction in this reintroduced population is poor, birds formed social pairs when relatively young, and once a bird was in a social pair, it had a nearly 50% chance of nesting the following breeding season. Also, once a bird had nested, it had a high probability of nesting again. These results are encouraging considering that survival and reproduction have been major challenges in past reintroductions of this species. The demographic estimates developed will support construction of a management model designed to facilitate exploration of management actions of interest, and will provide critical guidance in future planning for this reintroduction. An approach similar to what we describe could be usefully applied to many reintroduced populations.

  19. A Bayesian Assessment of Seismic Semi-Periodicity Forecasts

    NASA Astrophysics Data System (ADS)

    Nava, F.; Quinteros, C.; Glowacka, E.; Frez, J.

    2016-01-01

    Among the schemes for earthquake forecasting, the search for semi-periodicity during large earthquakes in a given seismogenic region plays an important role. When considering earthquake forecasts based on semi-periodic sequence identification, the Bayesian formalism is a useful tool for: (1) assessing how well a given earthquake satisfies a previously made forecast; (2) re-evaluating the semi-periodic sequence probability; and (3) testing other prior estimations of the sequence probability. A comparison of Bayesian estimates with updated estimates of semi-periodic sequences that incorporate new data not used in the original estimates shows extremely good agreement, indicating that: (1) the probability that a semi-periodic sequence is not due to chance is an appropriate estimate for the prior sequence probability estimate; and (2) the Bayesian formalism does a very good job of estimating corrected semi-periodicity probabilities, using slightly less data than that used for updated estimates. The Bayesian approach is exemplified explicitly by its application to the Parkfield semi-periodic forecast, and results are given for its application to other forecasts in Japan and Venezuela.

  20. Estimating the effect of treatment rate changes when treatment benefits are heterogeneous: antibiotics and otitis media.

    PubMed

    Park, Tae-Ryong; Brooks, John M; Chrischilles, Elizabeth A; Bergus, George

    2008-01-01

    We contrast methods for assessing the health effects of a treatment rate change when treatment benefits are heterogeneous across patients. Antibiotic prescribing for children with otitis media (OM) in Iowa Medicaid is the empirical example. Instrumental variable (IV) and linear probability model (LPM) methods are used to estimate the effect of antibiotic treatments on cure probabilities for children with OM in Iowa Medicaid. Local area physician supply per capita is the instrument in the IV models. Estimates are contrasted in terms of their ability to make inferences for patients whose treatment choices may be affected by a change in population treatment rates. The instrument was positively related to the probability of being prescribed an antibiotic. LPM estimates showed a positive effect of antibiotics on OM patient cure probability, while IV estimates showed no relationship between antibiotics and patient cure probability. Linear probability model estimation yields the average effects of the treatment on patients who were treated. IV estimation yields the average effects for patients whose treatment choices were affected by the instrument. As antibiotic treatment effects are heterogeneous across OM patients, our estimates from these approaches are aligned with clinical evidence and theory. The average estimate for treated patients (higher severity) from the LPM model is greater than estimates for patients whose treatment choices are affected by the instrument (lower severity) from the IV models. Based on our IV estimates, it appears that lowering antibiotic use in OM patients in Iowa Medicaid did not result in lost cures.
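
    The LPM/IV contrast can be reproduced on synthetic data. A minimal sketch, assuming a simulated confounder that plays the role of unobserved severity and a hand-rolled two-stage least squares (all coefficients invented):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5_000

# Simulated data in the spirit of the study (all values invented):
# z = physician supply (instrument), u = unobserved severity that drives
# both treatment choice and outcome, d = antibiotic treatment, y = cure.
z = rng.normal(size=n)
u = rng.normal(size=n)
d = (0.5 * z + 0.8 * u + rng.normal(size=n) > 0).astype(float)
y = (1.0 * u + rng.normal(size=n) > 0).astype(float)  # true effect of d = 0

X = np.column_stack([np.ones(n), d])
Z = np.column_stack([np.ones(n), z])

# Linear probability model (OLS): biased when treatment is confounded.
beta_lpm = np.linalg.lstsq(X, y, rcond=None)[0]

# Two-stage least squares: instrument d with z.
d_hat = Z @ np.linalg.lstsq(Z, d, rcond=None)[0]
X_iv = np.column_stack([np.ones(n), d_hat])
beta_iv = np.linalg.lstsq(X_iv, y, rcond=None)[0]

print("LPM effect estimate:", round(beta_lpm[1], 3))  # spuriously positive
print("IV  effect estimate:", round(beta_iv[1], 3))   # near the true 0
```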

  1. Impact of probability estimation on frequency of urine culture requests in ambulatory settings.

    PubMed

    Gul, Naheed; Quadri, Mujtaba

    2012-07-01

    To determine the perceptions of the medical community about urine culture in diagnosing urinary tract infections. The cross-sectional survey, based on consecutive sampling, was conducted at Shifa International Hospital, Islamabad, on 200 doctors, including medical students of the Shifa College of Medicine, from April to October 2010. A questionnaire with three common clinical scenarios of low, intermediate and high pre-test probability for urinary tract infection was used to assess how the respondents decided whether to order a urine culture test. The differences between the reference estimates and the respondents' estimates of pre- and post-test probability were assessed. The association of estimated probabilities with the number of tests ordered was also evaluated. The respondents were also asked about the cost-effectiveness and safety of urine culture and sensitivity. Data were analysed using SPSS version 15. In low pre-test probability settings, the disease probability was over-estimated, suggesting the participants' inability to rule out the disease. The post-test probabilities were, however, under-estimated by the doctors as compared to the students. In intermediate and high pre-test probability settings, both over- and underestimation of probabilities were noticed. Doctors were more likely to consider ordering the test as the disease probability increased. Most of the respondents were of the opinion that urine culture was a cost-effective test and there was no associated potential harm. The wide variation in the clinical use of urine culture necessitates the formulation of appropriate guidelines for the diagnostic use of urine culture, and application of Bayesian probabilistic thinking to real clinical situations.
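
    The Bayesian updating the authors advocate is the standard pre-test/post-test calculation via likelihood ratios. A minimal sketch with invented sensitivity, specificity, and pre-test values (not figures from the study):

```python
# Convert a pre-test probability to a post-test probability through
# the test's likelihood ratio.
def post_test_probability(pre_test, sensitivity, specificity, positive=True):
    odds = pre_test / (1.0 - pre_test)
    if positive:
        lr = sensitivity / (1.0 - specificity)
    else:
        lr = (1.0 - sensitivity) / specificity
    post_odds = odds * lr
    return post_odds / (1.0 + post_odds)

# Low pre-test probability scenario: even a positive result moves the
# probability less than intuition often suggests.
print(post_test_probability(0.10, sensitivity=0.90, specificity=0.80))  # ~0.33
print(post_test_probability(0.10, 0.90, 0.80, positive=False))          # ~0.014
```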

  2. Robust location and spread measures for nonparametric probability density function estimation.

    PubMed

    López-Rubio, Ezequiel

    2009-10-01

    Robustness against outliers is a desirable property of any unsupervised learning scheme. In particular, probability density estimators benefit from incorporating this feature. A possible strategy to achieve this goal is to substitute the sample mean and the sample covariance matrix by more robust location and spread estimators. Here we use the L1-median to develop a nonparametric probability density function (PDF) estimator. We prove its most relevant properties, and we show its performance in density estimation and classification applications.
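
    The L1-median itself is straightforward to compute. A minimal sketch using Weiszfeld's algorithm, one standard way to obtain the geometric median; the paper builds a full density estimator around such robust statistics, and this shows only the location step:

```python
import numpy as np

def l1_median(X, tol=1e-8, max_iter=1000):
    """Geometric (L1) median via Weiszfeld's algorithm.

    A robust location estimator: unlike the mean, it is resistant to
    outliers, which is why it is attractive inside density estimators.
    """
    y = X.mean(axis=0)  # start from the (non-robust) mean
    for _ in range(max_iter):
        d = np.linalg.norm(X - y, axis=1)
        d = np.where(d < 1e-12, 1e-12, d)  # guard against zero distances
        w = 1.0 / d
        y_new = (w[:, None] * X).sum(axis=0) / w.sum()
        if np.linalg.norm(y_new - y) < tol:
            return y_new
        y = y_new
    return y

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
X[:10] += 50.0  # 5% gross outliers
print("mean:     ", X.mean(axis=0))   # dragged toward the outliers
print("L1-median:", l1_median(X))     # stays near the bulk of the data
```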

  3. Estimating the Probability of Traditional Copying, Conditional on Answer-Copying Statistics.

    PubMed

    Allen, Jeff; Ghattas, Andrew

    2016-06-01

    Statistics for detecting copying on multiple-choice tests produce p values measuring the probability of a value at least as large as that observed, under the null hypothesis of no copying. The posterior probability of copying is arguably more relevant than the p value, but cannot be derived from Bayes' theorem unless the population probability of copying and probability distribution of the answer-copying statistic under copying are known. In this article, the authors develop an estimator for the posterior probability of copying that is based on estimable quantities and can be used with any answer-copying statistic. The performance of the estimator is evaluated via simulation, and the authors demonstrate how to apply the formula using actual data. Potential uses, generalizability to other types of cheating, and limitations of the approach are discussed.
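
    The Bayes'-theorem structure of such an estimator can be sketched with assumed densities. This is illustrative only: the null and alternative distributions of a real answer-copying statistic, and the population copying rate, would have to be estimated as the article describes:

```python
from scipy import stats

# Invented sampling densities for the answer-copying statistic under
# "no copying" (null) and "copying" (alt); the prior is the assumed
# population copying rate.
def posterior_copying(t, prior, null=stats.norm(0, 1), alt=stats.norm(3, 1)):
    """P(copying | statistic = t) via Bayes' theorem; densities are invented."""
    like_alt = alt.pdf(t)
    like_null = null.pdf(t)
    return like_alt * prior / (like_alt * prior + like_null * (1 - prior))

# The posterior rises steeply with the observed statistic.
for t in (1.0, 2.0, 3.0, 4.0):
    print(t, round(posterior_copying(t, prior=0.02), 3))
```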

  4. Optimal estimation for discrete time jump processes

    NASA Technical Reports Server (NTRS)

    Vaca, M. V.; Tretter, S. A.

    1977-01-01

    Optimum estimates of nonobservable random variables or random processes which influence the rate functions of a discrete time jump process (DTJP) are obtained. The approach is based on the a posteriori probability of a nonobservable event expressed in terms of the a priori probability of that event and of the sample function probability of the DTJP. A general representation for optimum estimates and recursive equations for minimum mean squared error (MMSE) estimates are obtained. MMSE estimates are nonlinear functions of the observations. The problem of estimating the rate of a DTJP when the rate is a random variable with a probability density function of the form c x^k (1-x)^m is considered, and it is shown that the MMSE estimates are linear in this case. This class of density functions explains why there are insignificant differences between optimum unconstrained and linear MMSE estimates in a variety of problems.

  5. Optimal estimation for discrete time jump processes

    NASA Technical Reports Server (NTRS)

    Vaca, M. V.; Tretter, S. A.

    1978-01-01

    Optimum estimates of nonobservable random variables or random processes which influence the rate functions of a discrete time jump process (DTJP) are derived. The approach used is based on the a posteriori probability of a nonobservable event expressed in terms of the a priori probability of that event and of the sample function probability of the DTJP. Thus a general representation is obtained for optimum estimates, and recursive equations are derived for minimum mean-squared error (MMSE) estimates. In general, MMSE estimates are nonlinear functions of the observations. The problem is considered of estimating the rate of a DTJP when the rate is a random variable with a beta probability density function and the jump amplitudes are binomially distributed. It is shown that the MMSE estimates are linear. The class of beta density functions is rather rich and explains why there are insignificant differences between optimum unconstrained and linear MMSE estimates in a variety of problems.

  6. Trends and geographic patterns in drug-poisoning death rates in the U.S., 1999-2009.

    PubMed

    Rossen, Lauren M; Khan, Diba; Warner, Margaret

    2013-12-01

    Drug poisoning mortality has increased substantially in the U.S. over the past 3 decades. Previous studies have described state-level variation and urban-rural differences in drug-poisoning deaths, but variation at the county level has largely not been explored in part because crude county-level death rates are often highly unstable. The goal of the study was to use small-area estimation techniques to produce stable county-level estimates of age-adjusted death rates (AADR) associated with drug poisoning for the U.S., 1999-2009, in order to examine geographic and temporal variation. Population-based observational study using data on 304,087 drug-poisoning deaths in the U.S. from the 1999-2009 National Vital Statistics Multiple Cause of Death Files (analyzed in 2012). Because of the zero-inflated and right-skewed distribution of drug-poisoning death rates, a two-stage modeling procedure was used in which the first stage modeled the probability of observing a death for a given county and year, and the second stage modeled the log-transformed drug-poisoning death rate given that a death occurred. Empirical Bayes estimates of county-level drug-poisoning death rates were mapped to explore temporal and geographic variation. Only 3% of counties had drug-poisoning AADRs greater than ten per 100,000 per year in 1999-2000, compared to 54% in 2008-2009. Drug-poisoning AADRs grew by 394% in rural areas compared to 279% for large central metropolitan counties, but the highest drug-poisoning AADRs were observed in central metropolitan areas from 1999 to 2009. There was substantial geographic variation in drug-poisoning mortality across the U.S. Published by the American Journal of Preventive Medicine.
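
    The two-stage structure described above is compact to sketch on synthetic data. A minimal sketch, with invented covariates and coefficients standing in for the county-level predictors (and ignoring the log-normal variance correction for brevity):

```python
import numpy as np
from sklearn.linear_model import LinearRegression, LogisticRegression

rng = np.random.default_rng(0)

# Synthetic county-year data standing in for the Multiple Cause of Death
# files: one covariate (e.g., an urbanization index), many zero rates.
n = 5_000
x = rng.normal(size=(n, 1))
any_death = rng.random(n) < 1 / (1 + np.exp(-(-0.5 + 1.0 * x[:, 0])))
log_rate = 1.0 + 0.5 * x[:, 0] + rng.normal(scale=0.3, size=n)

# Stage 1: probability that a county-year records any drug-poisoning death.
stage1 = LogisticRegression().fit(x, any_death)

# Stage 2: log death rate, fitted only where a death occurred.
stage2 = LinearRegression().fit(x[any_death], log_rate[any_death])

# Combined expectation for a new county: P(death) * E[rate | death]
# (ignoring the log-normal variance correction for brevity).
x_new = np.array([[1.0]])
p_death = stage1.predict_proba(x_new)[0, 1]
rate_given_death = np.exp(stage2.predict(x_new)[0])
print("expected death rate:", round(p_death * rate_given_death, 2))
```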

  7. Use of radar QPE for the derivation of Intensity-Duration-Frequency curves in a range of climatic regimes

    NASA Astrophysics Data System (ADS)

    Marra, Francesco; Morin, Efrat

    2015-12-01

    Intensity-Duration-Frequency (IDF) curves are widely used in flood risk management because they provide an easy link between the characteristics of a rainfall event and the probability of its occurrence. Weather radars provide distributed rainfall estimates with high spatial and temporal resolutions and overcome the limited representativeness of point-based rainfall measurements in regions characterized by large gradients in rainfall climatology. This work explores the use of radar quantitative precipitation estimation (QPE) for the identification of IDF curves over a region with steep climatic transitions (Israel) using a unique radar data record (23 yr) and combined physical and empirical adjustment of the radar data. IDF relationships were derived by fitting a generalized extreme value distribution to the annual maximum series for durations of 20 min, 1 h and 4 h. Arid, semi-arid and Mediterranean climates were explored using 14 study cases. IDF curves derived from the study rain gauges were compared to those derived from radar and from nearby rain gauges characterized by similar climatology, taking into account the uncertainty linked with the fitting technique. Radar annual maxima and IDF curves were generally overestimated but in 70% of the cases (60% for a 100 yr return period), they lay within the rain gauge IDF confidence intervals. Overestimation tended to increase with return period, and this effect was enhanced in arid climates. This was mainly associated with radar estimation uncertainty, even if other effects, such as rain gauge temporal resolution, cannot be neglected. Climatological classification remained meaningful for the analysis of rainfall extremes and radar was able to discern climatology from rainfall frequency analysis.
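
    Fitting a GEV to annual maxima and reading off return levels is the core of the IDF derivation. A minimal sketch with synthetic annual maxima standing in for the 23 yr radar record (SciPy's genextreme; all values invented):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Hypothetical annual-maximum 1 h rain intensities (mm/h) for a 23 yr
# record, standing in for the radar QPE series used in the study.
annual_max = stats.genextreme.rvs(c=-0.1, loc=25, scale=8,
                                  size=23, random_state=rng)

# Fit a GEV distribution to the annual maxima, as in the IDF derivation.
c, loc, scale = stats.genextreme.fit(annual_max)

# Intensity for a given return period T: the (1 - 1/T) quantile.
for T in (2, 10, 50, 100):
    print(f"{T:>3} yr return level: "
          f"{stats.genextreme.ppf(1 - 1/T, c, loc, scale):.1f} mm/h")
```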

  8. Capturing contextual effects in spectro-temporal receptive fields.

    PubMed

    Westö, Johan; May, Patrick J C

    2016-09-01

    Spectro-temporal receptive fields (STRFs) are thought to provide descriptive images of the computations performed by neurons along the auditory pathway. However, their validity can be questioned because they rely on a set of assumptions that are probably not fulfilled by real neurons exhibiting contextual effects, that is, nonlinear interactions in the time or frequency dimension that cannot be described with a linear filter. We used a novel approach to investigate how a variety of contextual effects, due to facilitating nonlinear interactions and synaptic depression, affect different STRF models, and if these effects can be captured with a context field (CF). Contextual effects were incorporated in simulated networks of spiking neurons, allowing one to define the true STRFs of the neurons. This, in turn, made it possible to evaluate the performance of each STRF model by comparing the estimations with the true STRFs. We found that currently used STRF models are particularly poor at estimating inhibitory regions. Specifically, contextual effects make estimated STRFs dependent on stimulus density in a contrasting fashion: inhibitory regions are underestimated at lower densities while artificial inhibitory regions emerge at higher densities. The CF was found to provide a solution to this dilemma, but only when it is used together with a generalized linear model. Our results therefore highlight the limitations of the traditional STRF approach and provide useful recipes for how different STRF models and stimuli can be used to arrive at reliable quantifications of neural computations in the presence of contextual effects. The results therefore push the purpose of STRF analysis from simply finding an optimal stimulus toward describing context-dependent computations of neurons along the auditory pathway. Copyright © 2016 Elsevier B.V. All rights reserved.

  9. Estimating the Probability of a Diffusing Target Encountering a Stationary Sensor.

    DTIC Science & Technology

    1985-07-01

    Naval Postgraduate School, Monterey, California. Technical report NPS55-85-013: Estimating the Probability of a Diffusing Target Encountering a Stationary Sensor.

  10. Interval Estimation of Seismic Hazard Parameters

    NASA Astrophysics Data System (ADS)

    Orlecka-Sikora, Beata; Lasocki, Stanislaw

    2017-03-01

    The paper considers Poisson temporal occurrence of earthquakes and presents a way to integrate the uncertainties of the estimates of the mean activity rate and of the magnitude cumulative distribution function into the interval estimation of the most widely used seismic hazard functions, such as the exceedance probability and the mean return period. The proposed algorithm can be used either when the Gutenberg-Richter model of magnitude distribution is accepted or when nonparametric estimation is in use. When the Gutenberg-Richter model is used, the interval estimation of its parameters is based on the asymptotic normality of the maximum likelihood estimator. When nonparametric kernel estimation of the magnitude distribution is used, we propose the iterated bias-corrected and accelerated method for interval estimation, based on the smoothed bootstrap and second-order bootstrap samples. The changes resulting from the integrated approach to interval estimation of the seismic hazard functions, relative to the approach that neglects the uncertainty of the mean activity rate estimates, were studied using Monte Carlo simulations and two real-dataset examples. The results indicate that the uncertainty of the mean activity rate significantly affects the interval estimates of the hazard functions only when the product of the activity rate and the time period for which the hazard is estimated is no more than 5.0. When this product becomes greater than 5.0, the impact of the uncertainty of the magnitude cumulative distribution function dominates the impact of the uncertainty of the mean activity rate in the aggregated uncertainty of the hazard functions, and the interval estimates with and without inclusion of the uncertainty of the mean activity rate converge. The presented algorithm is generic and can also be applied to capture the propagation of the uncertainty of estimates that are parameters of a multiparameter function onto that function.
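
    Point-estimate versions of the two hazard functions are simple to compute under Poisson occurrence and a Gutenberg-Richter magnitude law; the paper's contribution is the interval estimation around them. A sketch with illustrative parameter values:

```python
import numpy as np

# Illustrative parameters (not from the paper).
lam = 2.0          # mean activity rate (events/yr above magnitude m0)
b = 1.0            # Gutenberg-Richter b-value
m0, m = 4.0, 6.0   # completeness magnitude and target magnitude
t = 50.0           # exposure time (yr)

beta = b * np.log(10.0)
p_exceed_mag = np.exp(-beta * (m - m0))        # P(M >= m | event), GR law
rate_m = lam * p_exceed_mag                    # rate of events with M >= m

exceedance_prob = 1.0 - np.exp(-rate_m * t)    # P(at least one M >= m in t)
mean_return_period = 1.0 / rate_m

print(f"exceedance probability in {t:.0f} yr: {exceedance_prob:.3f}")
print(f"mean return period: {mean_return_period:.0f} yr")
```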

  11. PMP Estimations at Sparsely Controlled Andinian Basins and Climate Change Projections

    NASA Astrophysics Data System (ADS)

    Lagos Zúñiga, M. A.; Vargas, X.

    2012-12-01

    Probable Maximum Precipitation (PMP) estimation implies an extensive review of hydrometeorological data and an understanding of precipitation formation processes. Different methodological approaches exist for its estimation, and all of them require a good spatial and temporal representation of storms. Estimating hydrometeorological PMP in sparsely controlled basins is a difficult task, especially if the study area is subject to an important orographic effect and to mixed (rain and snow) precipitation during the most severe storms. The main aim of this study is to propose and apply a PMP estimation method for a sparsely controlled basin with abrupt topography and mixed hydrology, and to analyze the statistical uncertainties of the estimates and the possible effects of climate change on them. PMP estimation under statistical and hydrometeorological approaches (watershed-based and traditional depth-area-duration analysis) was carried out for a semi-arid zone at Puclaro dam in northern Chile. Because of the lack of good spatial meteorological representation in the study zone, we propose a methodology to account for the orographic effects of the Andes based on orographic patterns from the RCM PRECIS-DGF and annual isohyetal maps. The estimates were validated against precipitation patterns for given winters, considering snow-course and rainfall gauges along the prevailing wind direction, with good results. The estimates are also compared with the largest areal storms in the USA, Australia, India and China and with frequency analyses at local rain gauge stations in order to select the most adequate approach for the study zone. Climate change projections were evaluated with the ECHAM5 GCM, chosen for its good representation of the seasonality and magnitude of the relevant meteorological variables. Temperature projections for the 2040-2065 period show a rise in the contributing area of the catchment that would lead to an increase in the average liquid precipitation over the basin. Temperature projections would also affect the maximization factors in the calculation of the PMP, increasing it by up to 126.6% and 62.5% in scenarios A2 and B1, respectively. These projections matter because of the implications of PMP for the hydrologic design of major hydraulic works through the Probable Maximum Flood (PMF). We propose that the methodology presented in this study could also be used in other basins with similar characteristics.

  12. The clinical diagnostic reasoning process determining the use of endoscopy in diagnosing peptic ulcer disease.

    PubMed

    Gul, Naheed; Quadri, Mujtaba

    2011-09-01

    To evaluate the clinical diagnostic reasoning process as a tool to decrease the number of unnecessary endoscopies for diagnosing peptic ulcer disease. Study design: cross-sectional KAP study. Shifa College of Medicine, Islamabad, from April to August 2010. Two hundred doctors were assessed with three common clinical scenarios of low, intermediate and high pre-test probability for peptic ulcer disease using a questionnaire. The differences between the reference estimates and the respondents' estimates of pre-test and post-test probability were used to assess the ability to estimate the pre-test and post-test probability of the disease. Doctors were also asked about the cost-effectiveness and safety of endoscopy. A consecutive sampling technique was used and the data were analyzed using SPSS version 16. In the low pre-test probability settings, overestimation of the disease probability suggested the doctors' inability to rule out the disease. The post-test probabilities were similarly overestimated. In intermediate pre-test probability settings, both over- and underestimation of probabilities were noticed. In the high pre-test probability setting, there was no significant difference between the reference and the responders' intuitive estimates of post-test probability. Doctors were more likely to consider ordering the test as the disease probability increased. Most respondents were of the opinion that endoscopy is not a cost-effective procedure and may be associated with potential harm. Improvement is needed in doctors' diagnostic ability, with more emphasis on clinical decision-making and the application of Bayesian probabilistic thinking to real clinical situations.

  13. The Estimation of Tree Posterior Probabilities Using Conditional Clade Probability Distributions

    PubMed Central

    Larget, Bret

    2013-01-01

    In this article I introduce the idea of conditional independence of separated subtrees as a principle by which to estimate the posterior probability of trees using conditional clade probability distributions rather than simple sample relative frequencies. I describe an algorithm for these calculations and software which implements these ideas. I show that these alternative calculations are very similar to simple sample relative frequencies for high probability trees but are substantially more accurate for relatively low probability trees. The method allows the posterior probability of unsampled trees to be calculated when these trees contain only clades that are in other sampled trees. Furthermore, the method can be used to estimate the total probability of the set of sampled trees which provides a measure of the thoroughness of a posterior sample. [Bayesian phylogenetics; conditional clade distributions; improved accuracy; posterior probabilities of trees.] PMID:23479066

  14. Estimating temporary emigration and breeding proportions using capture-recapture data with Pollock's robust design

    USGS Publications Warehouse

    Kendall, W.L.; Nichols, J.D.; Hines, J.E.

    1997-01-01

    Statistical inference for capture-recapture studies of open animal populations typically relies on the assumption that all emigration from the studied population is permanent. However, there are many instances in which this assumption is unlikely to be met. We define two general models for the process of temporary emigration, completely random and Markovian. We then consider effects of these two types of temporary emigration on Jolly-Seber (Seber 1982) estimators and on estimators arising from the full-likelihood approach of Kendall et al. (1995) to robust design data. Capture-recapture data arising from Pollock's (1982) robust design provide the basis for obtaining unbiased estimates of demographic parameters in the presence of temporary emigration and for estimating the probability of temporary emigration. We present a likelihood-based approach to dealing with temporary emigration that permits estimation under different models of temporary emigration and yields tests for completely random and Markovian emigration. In addition, we use the relationship between capture probability estimates based on closed and open models under completely random temporary emigration to derive three ad hoc estimators for the probability of temporary emigration, two of which should be especially useful in situations where capture probabilities are heterogeneous among individual animals. Ad hoc and full-likelihood estimators are illustrated for small mammal capture-recapture data sets. We believe that these models and estimators will be useful for testing hypotheses about the process of temporary emigration, for estimating demographic parameters in the presence of temporary emigration, and for estimating probabilities of temporary emigration. These latter estimates are frequently of ecological interest as indicators of animal movement and, in some sampling situations, as direct estimates of breeding probabilities and proportions.
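
    One of the ad hoc estimators has a particularly simple form under completely random temporary emigration: the open-model capture probability is the product of the probability of being present and the closed-model capture probability, so an estimate of temporary emigration follows from their ratio. A sketch under that assumption, with illustrative numbers:

```python
# Under completely random temporary emigration, the capture probability
# estimated by an open model (p_open) equals (1 - gamma) times the capture
# probability estimated by a closed model within a primary period
# (p_closed), where gamma is the temporary emigration probability.
# Numbers below are illustrative, not from the paper.
p_open_hat = 0.42    # capture probability from the open (Jolly-Seber) model
p_closed_hat = 0.60  # capture probability from the closed model

gamma_hat = 1.0 - p_open_hat / p_closed_hat
print(f"estimated temporary emigration probability: {gamma_hat:.2f}")  # 0.30
```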

  15. Transactional Problem Content in Cost Discounting: Parallel Effects for Probability and Delay

    ERIC Educational Resources Information Center

    Jones, Stephen; Oaksford, Mike

    2011-01-01

    Four experiments investigated the effects of transactional content on temporal and probabilistic discounting of costs. Kusev, van Schaik, Ayton, Dent, and Chater (2009) have shown that content other than gambles can alter decision-making behavior even when associated value and probabilities are held constant. Transactions were hypothesized to lead…

  16. Fisher classifier and its probability of error estimation

    NASA Technical Reports Server (NTRS)

    Chittineni, C. B.

    1979-01-01

    Computationally efficient expressions are derived for estimating the probability of error using the leave-one-out method. The optimal threshold for the classification of patterns projected onto Fisher's direction is derived. A simple generalization of the Fisher classifier to multiple classes is presented. Computational expressions are developed for estimating the probability of error of the multiclass Fisher classifier.
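
    The paper derives closed-form expressions for the leave-one-out error; a brute-force leave-one-out estimate of the same quantity can serve as a reference. A minimal sketch with simulated Gaussian classes, using scikit-learn's LDA as the Fisher classifier:

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import LeaveOneOut, cross_val_score

rng = np.random.default_rng(0)

# Two Gaussian classes; Fisher's linear discriminant is the classical
# projection direction for this setting.
X = np.vstack([rng.normal(0.0, 1.0, size=(100, 4)),
               rng.normal(1.0, 1.0, size=(100, 4))])
y = np.repeat([0, 1], 100)

# Leave-one-out error estimate for the Fisher/LDA classifier, obtained
# here by brute force rather than the paper's efficient expressions.
acc = cross_val_score(LinearDiscriminantAnalysis(), X, y, cv=LeaveOneOut())
print("leave-one-out probability of error:", round(1.0 - acc.mean(), 3))
```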

  17. Future Volcanism at Yucca Mountain - Statistical Insights from the Non-Detection of Basalt Intrusions in the Potential Repository

    NASA Astrophysics Data System (ADS)

    Coleman, N.; Abramson, L.

    2004-05-01

    Yucca Mt. (YM) is a potential repository site for high-level radioactive waste and spent fuel. One issue is the potential for future igneous activity to intersect the repository. If the event probability is <1E-8/yr, it need not be considered in licensing. Plio-Quaternary volcanoes and older basalts occur near YM. Connor et al (JGR, 2000) estimate a probability of 1E-8/yr to 1E-7/yr for a basaltic dike to intersect the potential repository. Based on aeromagnetic data, Hill and Stamatakos (CNWRA, 2002) propose that additional volcanoes may lie buried in nearby basins. They suggest that if these volcanoes are part of temporally clustered volcanic activity, the probability of an intrusion may be as high as 1E-6/yr. We examine whether recurrence probabilities >2E-7/yr are realistic given that no dikes have been found in or above the 1.3E7 yr-old potential repository block. For 2E-7/yr (or 1E-6/yr), the expected number of penetrating dikes is 2.6 (respectively, 13), and the probability of at least one penetration is 0.93 (0.999). These results are not consistent with the exploration evidence. YM is one of the most intensively studied places on Earth. Over 20 yrs of studies have included surface and subsurface mapping, geophysical surveys, construction of 10+ km of tunnels in the mountain, drilling of many boreholes, and construction of many pits (DOE, Site Recommendation, 2002). It seems unlikely that multiple dikes could exist within the proposed repository footprint and escape detection. A dike complex dated 11.7 Ma (Smith et al, UNLV, 1997) or 10 Ma (Carr and Parrish, 1985) does exist NW of YM and west of the main Solitario Canyon Fault. These basalts intruded the Tiva Canyon Tuff (12.7 Ma) in an epoch of caldera-forming pyroclastic eruptions that ended millions of yrs ago. We would conclude that basaltic volcanism related to Miocene silicic volcanism may also have ended. Given the nondetection of dikes in the potential repository, we can use a Poisson model to estimate an upper-bound probability of 2E-7/yr (95% conf. level) for an igneous intrusion over the next 1E4 yrs. If we assume one undiscovered dike exists, the upper-bound probability would rise to 4E-7/yr. Higher probabilities may be possible if conditions that fostered Plio-Quaternary volcanism became enhanced over time. To the contrary, basalts of the past 11 Ma in Crater Flat have erupted in four episodes that together show a declining trend in erupted magma volume (DOE, TBD13, 2003). Smith et al (GSA Today, 2002) suggest there may be a common magma source for volcanism in Crater Flat and the Lunar Crater volcanic field, and that recurrence rates for YM could be underestimated. Their interpretation is highly speculative given the 130-km (80-mi) distance between these zones. A claim that crustal extension at YM is anomalously large, possibly favoring renewed volcanism (Wernicke et al, Science, 1999), was contradicted by later work (Savage et al, JGR, 2000). Spatial-temporal models that predict future intrusion probabilities of >2E-7/yr may be overly conservative and unrealistic. Along with currently planned site characterization activities, realistic models could be developed by considering the non-detection of basaltic dikes in the potential repository footprint.
(The views expressed are the authors' and do not reflect any final judgment or determination by the Advisory Committee on Nuclear Waste or the Nuclear Regulatory Commission regarding the matters addressed or the acceptability of a license application for a geologic repository at Yucca Mt.)
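
    The Poisson arithmetic quoted in the abstract is easy to verify. A short check of the expected number of penetrating dikes and the probability of at least one penetration for the two candidate rates:

```python
import math

# Expected number of dikes intersecting the repository block over its
# ~1.3e7 yr age, and the probability of at least one, for the two
# candidate recurrence rates.
age_yr = 1.3e7
for rate_per_yr in (2e-7, 1e-6):
    expected = rate_per_yr * age_yr
    p_at_least_one = 1.0 - math.exp(-expected)
    print(f"rate {rate_per_yr:.0e}/yr: expected dikes = {expected:.1f}, "
          f"P(>=1 penetration) = {p_at_least_one:.6f}")
# rate 2e-07/yr: expected dikes = 2.6,  P(>=1 penetration) = 0.925726
# rate 1e-06/yr: expected dikes = 13.0, P(>=1 penetration) = 0.999998
```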

  18. A double-observer approach for estimating detection probability and abundance from point counts

    USGS Publications Warehouse

    Nichols, J.D.; Hines, J.E.; Sauer, J.R.; Fallon, F.W.; Fallon, J.E.; Heglund, P.J.

    2000-01-01

    Although point counts are frequently used in ornithological studies, basic assumptions about detection probabilities often are untested. We apply a double-observer approach developed to estimate detection probabilities for aerial surveys (Cook and Jacobson 1979) to avian point counts. At each point count, a designated 'primary' observer indicates to another ('secondary') observer all birds detected. The secondary observer records all detections of the primary observer as well as any birds not detected by the primary observer. Observers alternate primary and secondary roles during the course of the survey. The approach permits estimation of observer-specific detection probabilities and bird abundance. We developed a set of models that incorporate different assumptions about sources of variation (e.g. observer, bird species) in detection probability. Seventeen field trials were conducted, and models were fit to the resulting data using program SURVIV. Single-observer point counts generally miss varying proportions of the birds actually present, and observer and bird species were found to be relevant sources of variation in detection probabilities. Overall detection probabilities (probability of being detected by at least one of the two observers) estimated using the double-observer approach were very high (>0.95), yielding precise estimates of avian abundance. We consider problems with the approach and recommend possible solutions, including restriction of the approach to fixed-radius counts to reduce the effect of variation in the effective radius of detection among various observers and to provide a basis for using spatial sampling to estimate bird abundance on large areas of interest. We believe that most questions meriting the effort required to carry out point counts also merit serious attempts to estimate detection probabilities associated with the counts. The double-observer approach is a method that can be used for this purpose.

  19. Using a Betabinomial distribution to estimate the prevalence of adherence to physical activity guidelines among children and youth.

    PubMed

    Garriguet, Didier

    2016-04-01

    Estimates of the prevalence of adherence to physical activity guidelines in the population are generally the result of averaging individual probability of adherence based on the number of days people meet the guidelines and the number of days they are assessed. Given this number of active and inactive days (days assessed minus days active), the conditional probability of meeting the guidelines that has been used in the past is a Beta(1 + active days, 1 + inactive days) distribution assuming the probability p of a day being active is bounded by 0 and 1 and averages 50%. A change in the assumption about the distribution of p is required to better match the discrete nature of the data and to better assess the probability of adherence when the percentage of active days in the population differs from 50%. Using accelerometry data from the Canadian Health Measures Survey, the probability of adherence to physical activity guidelines is estimated using a conditional probability given the number of active and inactive days distributed as a Betabinomial(n, α + active days, β + inactive days), assuming that p is randomly distributed as Beta(α, β) where the parameters α and β are estimated by maximum likelihood. The resulting Betabinomial distribution is discrete. For children aged 6 or older, the probability of meeting physical activity guidelines 7 out of 7 days is similar to published estimates. For pre-schoolers, the Betabinomial distribution yields higher estimates of adherence to the guidelines than the Beta distribution, in line with the probability of being active on any given day. In estimating the probability of adherence to physical activity guidelines, the Betabinomial distribution has several advantages over the previously used Beta distribution. It is a discrete distribution and maximizes the richness of accelerometer data.
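
    The Betabinomial calculation is directly available in SciPy. A minimal sketch with invented hyperparameters; in the study, α and β are estimated from accelerometer data by maximum likelihood:

```python
from scipy import stats  # scipy >= 1.4 provides betabinom

# Invented hyperparameters for illustration only.
alpha, beta = 3.0, 2.0     # Beta(alpha, beta) distribution of p
active, inactive = 4, 3    # days meeting / not meeting the guidelines

n = 7  # days in a week
posterior = stats.betabinom(n, alpha + active, beta + inactive)

# Probability of meeting the guidelines on all 7 of 7 days.
print("P(active 7/7 days):", round(posterior.pmf(7), 3))
print("full pmf:", [round(posterior.pmf(k), 3) for k in range(n + 1)])
```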

  20. Temporally Adaptive Sampling: A Case Study in Rare Species Survey Design with Marbled Salamanders (Ambystoma opacum)

    PubMed Central

    Charney, Noah D.; Kubel, Jacob E.; Eiseman, Charles S.

    2015-01-01

    Improving detection rates for elusive species with clumped distributions is often accomplished through adaptive sampling designs. This approach can be extended to include species with temporally variable detection probabilities. By concentrating survey effort in years when the focal species are most abundant or visible, overall detection rates can be improved. This requires either long-term monitoring at a few locations where the species are known to occur or models capable of predicting population trends using climatic and demographic data. For marbled salamanders (Ambystoma opacum) in Massachusetts, we demonstrate that annual variation in detection probability of larvae is regionally correlated. In our data, the difference in survey success between years was far more important than the difference among the three survey methods we employed: diurnal surveys, nocturnal surveys, and dipnet surveys. Based on these data, we simulate future surveys to locate unknown populations under a temporally adaptive sampling framework. In the simulations, when pond dynamics are correlated over the focal region, the temporally adaptive design improved mean survey success by as much as 26% over a non-adaptive sampling design. Employing a temporally adaptive strategy costs very little, is simple, and has the potential to substantially improve the efficient use of scarce conservation funds. PMID:25799224

  1. The temporal program of chromosome replication: genomewide replication in clb5Δ Saccharomyces cerevisiae.

    PubMed

    McCune, Heather J; Danielson, Laura S; Alvino, Gina M; Collingwood, David; Delrow, Jeffrey J; Fangman, Walton L; Brewer, Bonita J; Raghuraman, M K

    2008-12-01

    Temporal regulation of origin activation is widely thought to explain the pattern of early- and late-replicating domains in the Saccharomyces cerevisiae genome. Recently, single-molecule analysis of replication suggested that stochastic processes acting on origins with different probabilities of activation could generate the observed kinetics of replication without requiring an underlying temporal order. To distinguish between these possibilities, we examined a clb5Δ strain, where origin firing is largely limited to the first half of S phase, to ask whether all origins nonspecifically show decreased firing (as expected for disordered firing) or if only some origins ("late" origins) are affected. Approximately half the origins in the mutant genome show delayed replication while the remainder replicate largely on time. The delayed regions can encompass hundreds of kilobases and generally correspond to regions that replicate late in wild-type cells. Kinetic analysis of replication in wild-type cells reveals broad windows of origin firing for both early and late origins. Our results are consistent with a temporal model in which origins can show some heterogeneity in both time and probability of origin firing, but clustering of temporally like origins nevertheless yields a genome that is organized into blocks showing different replication times.

  2. Integrating Eddy Covariance, Penman-Monteith and METRIC based Evapotranspiration estimates to generate high resolution space-time ET over the Brazos River Basin

    NASA Astrophysics Data System (ADS)

    Mbabazi, D.; Mohanty, B.; Gaur, N.

    2017-12-01

    Evapotranspiration (ET) is an important component of the water and energy balance and accounts for 60-70% of precipitation losses. However, ET is difficult to quantify accurately at varying spatial and temporal scales. Eddy covariance methods estimate ET at high temporal resolutions but without capturing the spatial variation in ET within its footprint. On the other hand, remote sensing methods using Landsat imagery provide ET with high spatial resolution but low temporal resolution (16 days). In this study, we used both eddy covariance and remote sensing methods to generate high space-time resolution ET. Daily, monthly and seasonal ET estimates were obtained using the eddy covariance (EC) method, Penman-Monteith (PM) and Mapping Evapotranspiration with Internalized Calibration (METRIC) models to determine cotton and native prairie ET dynamics in the Brazos river basin characterized by varying hydro-climatic and geological gradients. Daily estimates of spatially distributed ET (30 m resolution) were generated using spatial autocorrelation and temporal interpolations between the EC flux variable footprints and METRIC ET for the 2016 and 2017 growing seasons. A comparison of the 2016 and 2017 preliminary daily ET estimates showed similar ET dynamics/trends among the EC, PM and METRIC methods, and 5-20% differences in seasonal ET estimates. This study will improve the spatial estimates of EC ET and the temporal resolution of satellite-derived ET, thus providing better ET data for water use management.

  3. An estimate of post-depositional remanent magnetization lock-in depth in organic rich varved lake sediments

    NASA Astrophysics Data System (ADS)

    Snowball, Ian; Mellström, Anette; Ahlstrand, Emelie; Haltia, Eeva; Nilsson, Andreas; Ning, Wenxin; Muscheler, Raimund; Brauer, Achim

    2013-11-01

    We studied the paleomagnetic properties of relatively organic rich, annually laminated (varved) sediments of Holocene age in Gyltigesjön, which is a lake in southern Sweden. An age-depth model was based on a regional lead pollution isochron and Bayesian modelling of radiocarbon ages of bulk sediments and terrestrial macrofossils, which included a radiocarbon wiggle-matched series of 873 varves that accumulated between 3000 and 2000 Cal a BP (Mellström et al., 2013). Mineral magnetic data and first order reversal curves suggest that the natural remanent magnetization is carried by stable single-domain grains of magnetite, probably of magnetosomal origin. Discrete samples taken from overlapping piston cores were used to produce smoothed paleomagnetic secular variation (inclination and declination) and relative paleointensity data sets. Alternative temporal trends in the paleomagnetic data were obtained by correcting for paleomagnetic lock-in depths between 0 and 70 cm and taking into account changes in sediment accumulation rate. These temporal trends were regressed against reference curves for the same region (FENNOSTACK and FENNORPIS; Snowball et al., 2007). The best statistical matches to the reference curves are obtained when we apply lock-in depths of 21-34 cm to the Gyltigesjön paleomagnetic data, although these are most likely minimum estimates. Our study suggests that a significant paleomagnetic lock-in depth can affect the acquisition of post-depositional remanent magnetization even where bioturbation is absent and no mixed sediment surface layer exists.

  4. Reducing uncertainty with flood frequency analysis: The contribution of paleoflood and historical flood information

    NASA Astrophysics Data System (ADS)

    Lam, Daryl; Thompson, Chris; Croke, Jacky; Sharma, Ashneel; Macklin, Mark

    2017-03-01

    Using a combination of stream gauge, historical, and paleoflood records to extend extreme flood records has proven to be useful in improving flood frequency analysis (FFA). The approach has typically been applied in localities with long historical records and/or suitable river settings for paleoflood reconstruction from slack-water deposits (SWDs). However, many regions around the world have neither extensive historical information nor bedrock gorges suitable for SWDs preservation and paleoflood reconstruction. This study from subtropical Australia demonstrates that confined, semialluvial channels such as macrochannels provide relatively stable boundaries over the 1000-2000 year time period and the preserved SWDs enabled paleoflood reconstruction and their incorporation into FFA. FFA for three sites in subtropical Australia with the integration of historical and paleoflood data using Bayesian Inference methods showed a significant reduction in uncertainty associated with the estimated discharge of a flood quantile. Uncertainty associated with estimated discharge for the 1% Annual Exceedance Probability (AEP) flood is reduced by more than 50%. In addition, sensitivity analysis of possible within-channel boundary changes shows that FFA is not significantly affected by any associated changes in channel capacity. Therefore, a greater range of channel types may be used for reliable paleoflood reconstruction by evaluating the stability of inset alluvial units, thereby increasing the quantity of temporal data available for FFA. The reduction in uncertainty, particularly in the prediction of the ≤1% AEP design flood, will improve flood risk planning and management in regions with limited temporal flood data.

  5. Rare event simulation in radiation transport

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kollman, Craig

    1993-10-01

    This dissertation studies methods for estimating extremely small probabilities by Monte Carlo simulation. Problems in radiation transport typically involve estimating very rare events or the expected value of a random variable which is, with overwhelming probability, equal to zero. These problems often have high-dimensional state spaces and irregular geometries, so analytic solutions are not possible. Monte Carlo simulation must be used to estimate the radiation dosage being transported to a particular location. If the area is well shielded, the probability of any one particular particle getting through is very small. Because of the large number of particles involved, even a tiny fraction penetrating the shield may represent an unacceptable level of radiation. It therefore becomes critical to be able to accurately estimate this extremely small probability. Importance sampling is a well-known technique for improving the efficiency of rare-event calculations. Here, a new set of probabilities is used in the simulation runs. The results are multiplied by the likelihood ratio between the true and simulated probabilities so as to keep the estimator unbiased. The variance of the resulting estimator is very sensitive to the choice of the new set of transition probabilities. It is shown that a zero-variance estimator does exist, but that its computation requires exact knowledge of the solution. A simple random walk with an associated killing model for the scatter of neutrons is introduced. Large-deviation results for optimal importance sampling in random walks are extended to the case where killing is present. An adaptive "learning" algorithm for implementing importance sampling is given for more general Markov chain models of neutron scatter. For finite state spaces this algorithm is shown to give, with probability one, a sequence of estimates converging exponentially fast to the true solution.
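
    The importance-sampling mechanism described here, simulating under altered probabilities and reweighting by the likelihood ratio, can be shown on a toy rare-event problem. A minimal sketch estimating P(X > 4) for standard normal X (true value about 3.17e-5), a stand-in for the shielding problem:

```python
import numpy as np

rng = np.random.default_rng(0)

n = 100_000
threshold = 4.0

# Naive estimator: very few (often zero) samples exceed the threshold.
x = rng.normal(size=n)
p_naive = np.mean(x > threshold)

# Importance sampling: sample from the shifted proposal N(threshold, 1)
# and weight by the likelihood ratio f(x)/g(x) of the two normal densities.
y = rng.normal(loc=threshold, size=n)
log_w = -0.5 * y**2 + 0.5 * (y - threshold) ** 2   # log N(0,1) - log N(4,1)
p_is = np.mean((y > threshold) * np.exp(log_w))

print(f"naive MC estimate:   {p_naive:.2e}")
print(f"importance sampling: {p_is:.2e}")   # close to 3.17e-5
```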

  6. Characterization of H 1743-322 during its 2003 outburst with TCAF Solution.

    NASA Astrophysics Data System (ADS)

    Nagarkoti, Shreeram; Debnath, Dipak; Chakrabarti, Sandip Kumar; Mondal, Santanu; Chatterjee, Arka

    2016-07-01

    Transient black hole candidate (BHC) H 1743-322 became active in X-rays on 2003 March 21 after remaining dormant for around two decades. We study both the spectral and temporal properties of the source during its 2003 outburst under the TCAF paradigm. The classification of different spectral states (hard, hard-intermediate, soft-intermediate, soft) and the transitions between them become clearer from the variation of the TCAF model fitted/derived physical flow parameters and the nature of quasi-periodic oscillations (QPOs), when present. We also studied the evolution of low-frequency QPOs during the rising and declining phases of the outburst with the propagating oscillatory shock (POS) model. We obtain a good estimate of the probable mass range of the object from prediction methods using the TCAF and POS models, as discussed in Molla et al. (2016).

  7. A new approach for the description of discharge extremes in small catchments

    NASA Astrophysics Data System (ADS)

    Pavia Santolamazza, Daniela; Lebrenz, Henning; Bárdossy, András

    2017-04-01

    Small catchment basins in Northwestern Switzerland, characterized by short concentration times, are frequently affected by floods. The peak and volume of these floods are commonly estimated by a frequency analysis of occurrence and described by a random variable, assuming a uniformly distributed probability and stationary input drivers (e.g. precipitation, temperature). For these small catchments, we attempt to describe and identify the mechanisms and dynamics underlying the occurrence of extremes by means of available high-temporal-resolution (10 min) observations, and to explore the possibilities of regionalizing hydrological parameters for short intervals. We therefore investigate new concepts for flood description, such as entropy as a measure of the disorder and dispersion of precipitation. First findings and conclusions of this ongoing research are presented.

  8. Using Multiple and Logistic Regression to Estimate the Median WillCost and Probability of Cost and Schedule Overrun for Program Managers

    DTIC Science & Technology

    2017-03-23

    PUBLIC RELEASE; DISTRIBUTION UNLIMITED. Thesis by Ryan C. Trudelle: Using Multiple and Logistic Regression to Estimate the Median Will-Cost and Probability of Cost and Schedule Overrun for Program Managers. Available at https://scholar.afit.edu.

  9. Electrofishing capture probability of smallmouth bass in streams

    USGS Publications Warehouse

    Dauwalter, D.C.; Fisher, W.L.

    2007-01-01

    Abundance estimation is an integral part of understanding the ecology and advancing the management of fish populations and communities. Mark-recapture and removal methods are commonly used to estimate the abundance of stream fishes. Alternatively, abundance can be estimated by dividing the number of individuals sampled by the probability of capture. We conducted a mark-recapture study and used multiple repeated-measures logistic regression to determine the influence of fish size, sampling procedures, and stream habitat variables on the cumulative capture probability for smallmouth bass Micropterus dolomieu in two eastern Oklahoma streams. The predicted capture probability was used to adjust the number of individuals sampled to obtain abundance estimates. The observed capture probabilities were higher for larger fish and decreased with successive electrofishing passes for larger fish only. Model selection suggested that the number of electrofishing passes, fish length, and mean thalweg depth affected capture probabilities the most; there was little evidence for any effect of electrofishing power density and woody debris density on capture probability. Leave-one-out cross validation showed that the cumulative capture probability model predicts smallmouth abundance accurately. © Copyright by the American Fisheries Society 2007.
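
    The abundance adjustment is simply the sampled count divided by a modeled capture probability. A minimal sketch with an invented logistic model; the coefficients are placeholders, not the fitted values from this study:

```python
import numpy as np

# Predicted cumulative capture probability from a logistic model.
# The coefficients below are invented placeholders for illustration.
def capture_probability(length_mm, n_passes, depth_m,
                        b0=-2.0, b_len=0.008, b_pass=0.6, b_depth=-0.5):
    eta = b0 + b_len * length_mm + b_pass * n_passes + b_depth * depth_m
    return 1.0 / (1.0 + np.exp(-eta))

# 40 smallmouth bass of ~200 mm sampled in 3 passes at 0.6 m mean depth.
p_hat = capture_probability(200.0, 3, 0.6)
n_sampled = 40
n_estimated = n_sampled / p_hat

print(f"capture probability: {p_hat:.2f}")
print(f"abundance estimate:  {n_estimated:.0f}")
```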

  10. Testing the molecular clock using mechanistic models of fossil preservation and molecular evolution

    PubMed Central

    2017-01-01

    Molecular sequence data provide information about relative times only, and fossil-based age constraints are the ultimate source of information about absolute times in molecular clock dating analyses. Thus, fossil calibrations are critical to molecular clock dating, but competing methods are difficult to evaluate empirically because the true evolutionary time scale is never known. Here, we combine mechanistic models of fossil preservation and sequence evolution in simulations to evaluate different approaches to constructing fossil calibrations and their impact on Bayesian molecular clock dating, and the relative impact of fossil versus molecular sampling. We show that divergence time estimation is impacted by the model of fossil preservation, sampling intensity and tree shape. The addition of sequence data may improve molecular clock estimates, but accuracy and precision are dominated by the quality of the fossil calibrations. Posterior means and medians are poor representatives of true divergence times; posterior intervals provide a much more accurate estimate of divergence times, though they may be wide and often do not have high coverage probability. Our results highlight the importance of increased fossil sampling and improved statistical approaches to generating calibrations, which should incorporate the non-uniform nature of ecological and temporal fossil species distributions. PMID:28637852

  11. Incorporating parametric uncertainty into population viability analysis models

    USGS Publications Warehouse

    McGowan, Conor P.; Runge, Michael C.; Larson, Michael A.

    2011-01-01

    Uncertainty in parameter estimates from sampling variation or expert judgment can introduce substantial uncertainty into ecological predictions based on those estimates. However, in standard population viability analyses, one of the most widely used tools for managing plant, fish and wildlife populations, parametric uncertainty is often ignored in or discarded from model projections. We present a method for explicitly incorporating this source of uncertainty into population models to fully account for risk in management and decision contexts. Our method involves a two-step simulation process where parametric uncertainty is incorporated into the replication loop of the model and temporal variance is incorporated into the loop for time steps in the model. Using the piping plover, a federally threatened shorebird in the USA and Canada, as an example, we compare abundance projections and extinction probabilities from simulations that exclude and include parametric uncertainty. Although final abundance was very low for all sets of simulations, estimated extinction risk was much greater for the simulation that incorporated parametric uncertainty in the replication loop. Decisions about species conservation (e.g., listing, delisting, and jeopardy) might differ greatly depending on the treatment of parametric uncertainty in population models.
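
    The two-loop structure is compact to write down. A minimal sketch with invented parameter values (not piping plover estimates): the outer replication loop draws a parameter from its sampling distribution, and the inner loop adds temporal variance:

```python
import numpy as np

rng = np.random.default_rng(1)

# Parametric uncertainty enters in the outer (replication) loop,
# temporal variation in the inner (time) loop. All values invented.
n_reps, n_years = 1000, 50
n0, quasi_extinction = 200, 20

extinct = 0
for _ in range(n_reps):
    # Outer loop: draw a growth rate from its sampling distribution.
    mean_log_lambda = rng.normal(loc=-0.01, scale=0.02)
    n = float(n0)
    for _ in range(n_years):
        # Inner loop: temporal (environmental) variation around that draw.
        n *= np.exp(rng.normal(loc=mean_log_lambda, scale=0.10))
        if n < quasi_extinction:
            extinct += 1
            break

print("quasi-extinction probability:", extinct / n_reps)
```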

  12. Factors associated with automobile accidents and survival.

    PubMed

    Kim, Hong Sok; Kim, Hyung Jin; Son, Bongsoo

    2006-09-01

    This paper develops an econometric model for vehicles' inherent mortality rates and estimates the probabilities of accidents and of survival in the United States. A logistic regression model is used to estimate the probability of survival, and a censored regression model is used to estimate the probability of accidents. The estimation results indicated that the probabilities of accident and survival are influenced by the physical characteristics of the vehicles involved in the accident and by the characteristics of the driver and the occupants. Using a restraint system and riding in a heavy vehicle increased the survival rate. Middle-aged drivers are less likely to be involved in an accident, and, surprisingly, female drivers are more likely to have an accident than male drivers. Riding in powerful vehicles (high horsepower) and driving late at night increase the probability of accident. Overall, driving behavior and vehicle characteristics matter and affect the probability of a fatal accident for different types of vehicles.

  13. Optimizing Probability of Detection Point Estimate Demonstration

    NASA Technical Reports Server (NTRS)

    Koshti, Ajay M.

    2017-01-01

    Probability of detection (POD) analysis is used in assessing the reliably detectable flaw size in nondestructive evaluation (NDE). MIL-HDBK-1823 and the associated mh1823 POD software give the most common methods of POD analysis. These NDE methods are intended to detect real flaws such as cracks and crack-like flaws. A reliably detectable crack size is required for safe-life analysis of fracture-critical parts. The paper provides a discussion of optimizing probability of detection (POD) demonstration experiments using the point estimate method, which is used by NASA for qualifying special NDE procedures. The point estimate method uses the binomial distribution for the probability density. Normally, a set of 29 flaws of the same size (within some tolerance) is used in the demonstration. The optimization is performed to provide an acceptable value for the probability of passing the demonstration (PPD) and an acceptable value for the probability of false (POF) calls, while keeping the flaw sizes in the set as small as possible.
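
    The binomial logic behind the 29-flaw set is a one-liner to check: if the true POD is 0.90, the chance of 29 hits out of 29 is just under 5%, which is what makes 29/29 a 90/95 demonstration. A short sketch, also showing the probability of passing the demonstration (PPD) for better techniques:

```python
# If the true probability of detection is 0.90, the chance of detecting
# all of 29 independent flaws is just under 5%, so 29 of 29 hits
# demonstrates 90% POD with 95% confidence.
pod = 0.90
n = 29
p_all_hits = pod ** n
print(f"P(29/29 hits | POD = 0.90) = {p_all_hits:.4f}")  # ~0.0471 < 0.05

# Probability of passing the demonstration (PPD) rises with true POD.
for true_pod in (0.90, 0.95, 0.98, 0.99):
    print(f"true POD {true_pod:.2f}: PPD = {true_pod ** n:.3f}")
```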

  14. Extrapolating regional probability of drying of headwater streams using discrete observations and gauging networks

    NASA Astrophysics Data System (ADS)

    Beaufort, Aurélien; Lamouroux, Nicolas; Pella, Hervé; Datry, Thibault; Sauquet, Eric

    2018-05-01

    Headwater streams represent a substantial proportion of river systems and many of them have intermittent flows due to their upstream position in the network. These intermittent rivers and ephemeral streams have recently seen a marked increase in interest, especially to assess the impact of drying on aquatic ecosystems. The objective of this paper is to quantify how discrete (in space and time) field observations of flow intermittence help to extrapolate over time the daily probability of drying (defined at the regional scale). Two empirical models based on linear or logistic regressions have been developed to predict the daily probability of intermittence at the regional scale across France. Explanatory variables were derived from available daily discharge and groundwater-level data of a dense gauging/piezometer network, and models were calibrated using discrete series of field observations of flow intermittence. The robustness of the models was tested using an independent, dense regional dataset of intermittence observations and observations of the year 2017 excluded from the calibration. The resulting models were used to extrapolate the daily regional probability of drying in France: (i) over the period 2011-2017 to identify the regions most affected by flow intermittence; (ii) over the period 1989-2017, using a reduced input dataset, to analyse temporal variability of flow intermittence at the national level. The two empirical regression models performed equally well between 2011 and 2017. The accuracy of predictions depended on the number of continuous gauging/piezometer stations and intermittence observations available to calibrate the regressions. Regions with the highest performance were located in sedimentary plains, where the monitoring network was dense and where the regional probability of drying was the highest. Conversely, the worst performances were obtained in mountainous regions. Finally, temporal projections (1989-2016) suggested the highest probabilities of intermittence (> 35 %) in 1989-1991, 2003 and 2005. A high density of intermittence observations improved the information provided by gauging stations and piezometers to extrapolate the temporal variability of intermittent rivers and ephemeral streams.
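
    A minimal sketch of the logistic variant on synthetic data; the covariates (standardized discharge and groundwater indices) and coefficients are invented stand-ins for the gauging/piezometer predictors described above:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic observations: predict whether a reach is observed dry from
# standardized discharge and groundwater-level indices of the regional
# gauging/piezometer network (all coefficients invented).
n = 2_000
discharge_idx = rng.normal(size=n)      # low values -> drier conditions
groundwater_idx = rng.normal(size=n)
logit = -1.0 - 1.5 * discharge_idx - 0.8 * groundwater_idx
dry = rng.random(n) < 1 / (1 + np.exp(-logit))

X = np.column_stack([discharge_idx, groundwater_idx])
model = LogisticRegression().fit(X, dry)

# Daily regional probability of drying for a new (dry-summer) situation.
summer_day = np.array([[-2.0, -1.0]])   # low flow, low groundwater
print("P(drying):", model.predict_proba(summer_day)[0, 1].round(2))
```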

  15. Optimizing probability of detection point estimate demonstration

    NASA Astrophysics Data System (ADS)

    Koshti, Ajay M.

    2017-04-01

    The paper discusses optimizing probability of detection (POD) demonstration experiments using the point estimate method. The optimization is performed to provide an acceptable value for the probability of passing the demonstration (PPD) and an acceptable value for the probability of false (POF) calls, while keeping the flaw sizes in the set as small as possible. The POD point estimate method is used by NASA for qualifying special NDE procedures. The point estimate method uses the binomial distribution for the probability density. Normally, a set of 29 flaws of the same size, within some tolerance, is used in the demonstration. Traditionally, the largest flaw size in the set is considered a conservative estimate of the flaw size with minimum 90% probability and 95% confidence, denoted α90/95PE. The paper investigates the relationship between the range of flaw sizes and α90, i.e. the 90% probability flaw size, to provide a desired PPD. The range of flaw sizes is expressed as a proportion of the standard deviation of the probability density distribution, as is the difference between the median or average of the 29 flaw sizes and α90. In general, it is concluded that, if the probability of detection increases with flaw size, the average of the 29 flaw sizes is always larger than or equal to α90 and is an acceptable measure of α90/95PE. If the NDE technique has sufficient sensitivity and signal-to-noise ratio, then the 29-flaw set can be optimized to meet requirements on minimum PPD, maximum allowable POF, flaw size tolerance about the mean flaw size, and flaw size detectability. The paper provides a procedure for optimizing flaw sizes in the point estimate demonstration flaw set.

  16. A Probabilistic Model of Illegal Drug Trafficking Operations in the Eastern Pacific and Caribbean Sea

    DTIC Science & Technology

    2013-09-01

    partner agencies and nations, detects, tracks, and interdicts illegal drug-trafficking in this region. In this thesis, we develop a probability model based on intelligence inputs to generate a spatial-temporal heat map specifying the...complement and vet such complicated simulation by developing more analytically tractable models. We develop probability models to generate a heat map

  17. Estimating The Probability Of Achieving Shortleaf Pine Regeneration At Variable Specified Levels

    Treesearch

    Thomas B. Lynch; Jean Nkouka; Michael M. Huebschmann; James M. Guldin

    2002-01-01

    A model was developed that can be used to estimate the probability of achieving regeneration at a variety of specified stem density levels. The model was fitted to shortleaf pine (Pinus echinata Mill.) regeneration data, and can be used to estimate the probability of achieving desired levels of regeneration between 300 and 700 stems per acre 9-10...

  18. Estimating unbiased phenological trends by adapting site-occupancy models.

    PubMed

    Roth, Tobias; Strebel, Nicolas; Amrhein, Valentin

    2014-08-01

    As a response to climate warming, many animals and plants have been found to shift phenologies, such as appearance in spring or timing of reproduction. However, traditional measures for shifts in phenology that are based on observational data likely are biased due to a large influence of population size, observational effort, starting date of a survey, or other causes that may affect the probability of detecting a species. Understanding phenological responses of species to climate change, however, requires a robust measure that could be compared among studies and study years. Here, we developed a new method for estimating arrival and departure dates based on site-occupancy models. Using simulated data, we show that our method provided virtually unbiased estimates of phenological events even if detection probability or the number of sites occupied by the species is changing over time. To illustrate the flexibility of our method, we analyzed spring arrival of two long-distance migrant songbirds and the length of the flight period of two butterfly species, using data from a long-term biodiversity monitoring program in Switzerland. In contrast to many birds that migrate short distances, the two long-distance migrant songbirds tended to postpone average spring arrival by -0.5 days per year between 1995 and 2012. Furthermore, the flight period of the short-distance-flying butterfly species apparently became even shorter over the study period, while the flight period of the longer-distance-flying butterfly species remained relatively stable. Our method could be applied to temporally and spatially extensive data from a wide range of monitoring programs and citizen science projects, to help unravel how species and communities respond to global warming.

  19. Parameter estimation for a cohesive sediment transport model by assimilating satellite observations in the Hangzhou Bay: Temporal variations and spatial distributions

    NASA Astrophysics Data System (ADS)

    Wang, Daosheng; Zhang, Jicai; He, Xianqiang; Chu, Dongdong; Lv, Xianqing; Wang, Ya Ping; Yang, Yang; Fan, Daidu; Gao, Shu

    2018-01-01

    Model parameters in suspended cohesive sediment transport models are critical for the accurate simulation of suspended sediment concentrations (SSCs). Difficulties in estimating the model parameters still prevent numerical modeling of the sediment transport from achieving a high level of predictability. Based on a three-dimensional cohesive sediment transport model and its adjoint model, the satellite remote sensing data of SSCs during both spring tide and neap tide, retrieved from the Geostationary Ocean Color Imager (GOCI), are assimilated to synchronously estimate four spatially and temporally varying parameters in the Hangzhou Bay in China, including settling velocity, resuspension rate, inflow open boundary conditions and initial conditions. After data assimilation, the model performance is significantly improved. Through several sensitivity experiments, the spatial and temporal variation tendencies of the estimated model parameters are verified to be robust and not affected by model settings. The pattern for the variations of the estimated parameters is analyzed and summarized. The temporal variations and spatial distributions of the estimated settling velocity are negatively correlated with current speed, which can be explained using a combination of the flocculation process and Stokes' law. The temporal variations and spatial distributions of the estimated resuspension rate are also negatively correlated with current speed, which are related to the grain size of the seabed sediments under different current velocities. In addition, the estimated inflow open boundary conditions reach their local maximum values near the low water slack conditions, and the estimated initial conditions are negatively correlated with water depth, which is consistent with the general understanding. The relationships between the estimated parameters and the hydrodynamic fields can inform improvements to the parameterization of cohesive sediment transport models.

  20. United States Geological Survey fire science: fire danger monitoring and forecasting

    USGS Publications Warehouse

    Eidenshink, Jeff C.; Howard, Stephen M.

    2012-01-01

    Each day, the U.S. Geological Survey produces 7-day forecasts for all Federal lands, covering the distribution of the number of ignitions, the number of fires above a given size, and the conditional probability of fires growing larger than a specified size. The large fire probability map is an estimate of the likelihood that ignitions will become large fires. The large fire forecast map is a probability estimate of the number of fires on Federal land exceeding 100 acres in the forthcoming week. The ignition forecast map is a probability estimate of the number of fires on Federal land greater than 1 acre in the forthcoming week. The extreme event forecast is the probability estimate of the number of fires on Federal land that may exceed 5,000 acres in the forthcoming week.

  1. Brookian stratigraphic plays in the National Petroleum Reserve - Alaska (NPRA)

    USGS Publications Warehouse

    Houseknecht, David W.

    2003-01-01

    The Brookian megasequence in the National Petroleum Reserve in Alaska (NPRA) includes bottomset and clinoform seismic facies of the Torok Formation (mostly Albian age) and generally coeval, topset seismic facies of the uppermost Torok Formation and the Nanushuk Group. These strata are part of a composite total petroleum system involving hydrocarbons expelled from three stratigraphic intervals of source rocks, the Lower Cretaceous gamma-ray zone (GRZ), the Lower Jurassic Kingak Shale, and the Triassic Shublik Formation. The potential for undiscovered oil and gas resources in the Brookian megasequence in NPRA was assessed by defining five plays (assessment units), one in the topset seismic facies and four in the bottomset-clinoform seismic facies. The Brookian Topset Play is estimated to contain between 60 (95-percent probability) and 465 (5-percent probability) million barrels of technically recoverable oil, with a mean (expected value) of 239 million barrels. The Brookian Topset Play is estimated to contain between 0 (95-percent probability) and 679 (5-percent probability) billion cubic feet of technically recoverable, nonassociated natural gas, with a mean (expected value) of 192 billion cubic feet. The Brookian Clinoform North Play, which extends across northern NPRA, is estimated to contain between 538 (95-percent probability) and 2,257 (5-percent probability) million barrels of technically recoverable oil, with a mean (expected value) of 1,306 million barrels. The Brookian Clinoform North Play is estimated to contain between 0 (95-percent probability) and 1,969 (5-percent probability) billion cubic feet of technically recoverable, nonassociated natural gas, with a mean (expected value) of 674 billion cubic feet. The Brookian Clinoform Central Play, which extends across central NPRA, is estimated to contain between 299 (95-percent probability) and 1,849 (5-percent probability) million barrels of technically recoverable oil, with a mean (expected value) of 973 million barrels. The Brookian Clinoform Central Play is estimated to contain between 1,806 (95-percent probability) and 10,076 (5-percent probability) billion cubic feet of technically recoverable, nonassociated natural gas, with a mean (expected value) of 5,405 billion cubic feet. The Brookian Clinoform South-Shallow Play is estimated to contain between 0 (95-percent probability) and 1,254 (5-percent probability) million barrels of technically recoverable oil, with a mean (expected value) of 508 million barrels. The Brookian Clinoform South-Shallow Play is estimated to contain between 0 (95-percent probability) and 5,809 (5-percent probability) billion cubic feet of technically recoverable, nonassociated natural gas, with a mean (expected value) of 2,405 billion cubic feet. The Brookian Clinoform South-Deep Play is estimated to contain between 0 (95-percent probability) and 8,796 (5-percent probability) billion cubic feet of technically recoverable, nonassociated natural gas, with a mean (expected value) of 3,788 billion cubic feet. No technically recoverable oil is assessed in the Brookian Clinoform South-Deep Play, as it lies at depths that are entirely in the gas window. 
Among the Brookian stratigraphic plays in NPRA, the Brookian Clinoform North Play and the Brookian Clinoform Central Play are most likely to be objectives of exploration activity in the near-term future because they are estimated to contain multiple oil accumulations larger than 128 million barrels of technically recoverable oil, and because some of those accumulations may occur near existing infrastructure in the eastern parts of the plays. The other Brookian stratigraphic plays are not likely to be the focus of exploration activity because they are estimated to contain maximum accumulation sizes that are smaller, but they may be an objective of satellite exploration if infrastructure is extended into the play areas. The total volumes of natural gas estimated to occur in B

  2. Spatiotemporal reconstruction of list-mode PET data.

    PubMed

    Nichols, Thomas E; Qi, Jinyi; Asma, Evren; Leahy, Richard M

    2002-04-01

    We describe a method for computing a continuous time estimate of tracer density using list-mode positron emission tomography data. The tracer density in each voxel is modeled as an inhomogeneous Poisson process whose rate function is represented using a cubic B-spline basis. The rate functions are estimated by maximizing the likelihood of the arrival times of detected photon pairs over the control vertices of the spline, modified by quadratic spatial and temporal smoothness penalties and a penalty term to enforce nonnegativity. Randoms rate functions are estimated by assuming independence between the spatial and temporal randoms distributions. Similarly, scatter rate functions are estimated by assuming spatiotemporal independence and that the temporal distribution of the scatter is proportional to the temporal distribution of the trues. A quantitative evaluation was performed using simulated data, and the method is also demonstrated in a human study using 11C-raclopride.
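
    A one-dimensional caricature of the rate estimation step may help. The sketch below fits a single voxel's rate function as a cubic B-spline by maximizing the inhomogeneous-Poisson likelihood of arrival times with a quadratic smoothness penalty. It reparametrizes the coefficients as exponentials to keep the rate nonnegative (the paper instead uses a nonnegativity penalty), and the knot placement, penalty weight, and synthetic event times are assumptions.

    ```python
    # 1-D single-voxel sketch of penalized inhomogeneous-Poisson rate fitting
    # with a cubic B-spline basis. Requires scipy >= 1.8 for design_matrix.
    import numpy as np
    from scipy.interpolate import BSpline
    from scipy.optimize import minimize

    rng = np.random.default_rng(1)
    T = 60.0
    events = np.sort(rng.uniform(0, T, 300))   # stand-in for photon arrival times

    k = 3                                      # cubic splines, clamped knots
    knots = np.concatenate([[0.0] * k, np.linspace(0, T, 12), [T] * k])
    n_coef = len(knots) - k - 1
    grid = np.linspace(0, T, 600)              # quadrature grid for the integral
    B_ev = BSpline.design_matrix(events, knots, k).toarray()
    B_gr = BSpline.design_matrix(grid, knots, k).toarray()

    D2 = np.diff(np.eye(n_coef), n=2, axis=0)  # second-difference operator
    lam = 1.0                                  # smoothness weight (tunable)
    w = T / len(grid)                          # simple quadrature weight

    def neg_pen_loglik(u):
        c = np.exp(u)                          # nonnegative spline coefficients
        rate_ev = B_ev @ c                     # rate at each event time
        integral = w * np.sum(B_gr @ c)        # approx. integral of the rate
        smooth = lam * np.sum((D2 @ c) ** 2)   # quadratic smoothness penalty
        return -(np.sum(np.log(rate_ev)) - integral) + smooth

    res = minimize(neg_pen_loglik, np.zeros(n_coef), method="L-BFGS-B")
    rate_hat = B_gr @ np.exp(res.x)            # estimated rate over time (~5/s here)
    ```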

  3. Aspects of spatial and temporal aggregation in estimating regional carbon dioxide fluxes from temperate forest soils

    NASA Technical Reports Server (NTRS)

    Kicklighter, David W.; Melillo, Jerry M.; Peterjohn, William T.; Rastetter, Edward B.; Mcguire, A. David; Steudler, Paul A.; Aber, John D.

    1994-01-01

    We examine the influence of aggregation errors on developing estimates of regional soil-CO2 flux from temperate forests. We find daily soil-CO2 fluxes to be more sensitive to changes in soil temperatures (Q(sub 10) = 3.08) than air temperatures (Q(sub 10) = 1.99). The direct use of mean monthly air temperatures with a daily flux model underestimates regional fluxes by approximately 4%. Temporal aggregation error varies with spatial resolution. Overall, our calibrated modeling approach reduces spatial aggregation error by 9.3% and temporal aggregation error by 15.5%. After minimizing spatial and temporal aggregation errors, mature temperate forest soils are estimated to contribute 12.9 Pg C/yr to the atmosphere as carbon dioxide. Georeferenced model estimates agree well with annual soil-CO2 fluxes measured during chamber studies in mature temperate forest stands around the globe.

  4. Role of the site of synaptic competition and the balance of learning forces for Hebbian encoding of probabilistic Markov sequences

    PubMed Central

    Bouchard, Kristofer E.; Ganguli, Surya; Brainard, Michael S.

    2015-01-01

    The majority of distinct sensory and motor events occur as temporally ordered sequences with rich probabilistic structure. Sequences can be characterized by the probability of transitioning from the current state to upcoming states (forward probability), as well as the probability of having transitioned to the current state from previous states (backward probability). Despite the prevalence of probabilistic sequencing of both sensory and motor events, the Hebbian mechanisms that mold synapses to reflect the statistics of experienced probabilistic sequences are not well understood. Here, we show through analytic calculations and numerical simulations that Hebbian plasticity (correlation, covariance, and STDP) with pre-synaptic competition can develop synaptic weights equal to the conditional forward transition probabilities present in the input sequence. In contrast, post-synaptic competition can develop synaptic weights proportional to the conditional backward probabilities of the same input sequence. We demonstrate that to stably reflect the conditional probability of a neuron's inputs and outputs, local Hebbian plasticity requires balance between competitive learning forces that promote synaptic differentiation and homogenizing learning forces that promote synaptic stabilization. The balance between these forces dictates a prior over the distribution of learned synaptic weights, strongly influencing both the rate at which structure emerges and the entropy of the final distribution of synaptic weights. Together, these results demonstrate a simple correspondence between the biophysical organization of neurons, the site of synaptic competition, and the temporal flow of information encoded in synaptic weights by Hebbian plasticity while highlighting the utility of balancing learning forces to accurately encode probability distributions, and prior expectations over such probability distributions. PMID:26257637
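
    One reading of the presynaptic-competition result can be reproduced in a few lines: a Hebbian outer-product update combined with normalization of each presynaptic neuron's outgoing weights drives the weight matrix toward the forward transition probabilities of the input Markov sequence. The transition matrix, learning rate, and normalization scheme below are illustrative assumptions, not the paper's exact model.

    ```python
    # Hedged sketch: Hebbian learning with presynaptic competition (modeled as
    # normalization of each input neuron's outgoing weights) on a Markov sequence.
    import numpy as np

    rng = np.random.default_rng(2)
    P = np.array([[0.1, 0.6, 0.3],    # true forward transitions P[next, current]
                  [0.7, 0.2, 0.4],
                  [0.2, 0.2, 0.3]])   # each column sums to 1
    n = P.shape[0]

    T = 20000                         # generate a long probabilistic sequence
    states = np.zeros(T, dtype=int)
    for t in range(1, T):
        states[t] = rng.choice(n, p=P[:, states[t - 1]])

    W = np.full((n, n), 1.0 / n)      # synaptic weights, uniform initial prior
    eta = 0.01
    for t in range(T - 1):
        pre = np.eye(n)[states[t]]        # presynaptic activity (current state)
        post = np.eye(n)[states[t + 1]]   # postsynaptic activity (next state)
        W += eta * np.outer(post, pre)            # Hebbian correlation term
        W /= W.sum(axis=0, keepdims=True)         # presynaptic competition:
                                                  # outgoing weights compete
    print(np.round(W, 2))             # approaches the forward probabilities P
    ```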

  5. Internal Medicine residents use heuristics to estimate disease probability.

    PubMed

    Phang, Sen Han; Ravani, Pietro; Schaefer, Jeffrey; Wright, Bruce; McLaughlin, Kevin

    2015-01-01

    Training in Bayesian reasoning may have limited impact on accuracy of probability estimates. In this study, our goal was to explore whether residents previously exposed to Bayesian reasoning use heuristics rather than Bayesian reasoning to estimate disease probabilities. We predicted that if residents use heuristics then post-test probability estimates would be increased by non-discriminating clinical features or a high anchor for a target condition. We randomized 55 Internal Medicine residents to different versions of four clinical vignettes and asked them to estimate probabilities of target conditions. We manipulated the clinical data for each vignette to be consistent with either 1) using a representativeness heuristic, by adding non-discriminating prototypical clinical features of the target condition, or 2) using an anchoring-with-adjustment heuristic, by providing a high or low anchor for the target condition. When presented with additional non-discriminating data, the odds of diagnosing the target condition were increased (odds ratio (OR) 2.83, 95% confidence interval [1.30, 6.15], p = 0.009). Similarly, the odds of diagnosing the target condition were increased when a high anchor preceded the vignette (OR 2.04, [1.09, 3.81], p = 0.025). Our findings suggest that despite previous exposure to Bayesian reasoning, residents use heuristics such as the representativeness heuristic and anchoring with adjustment to estimate probabilities. Potential reasons for attribute substitution include the relative cognitive ease of heuristics compared with Bayesian reasoning, or the possibility that residents in clinical practice rely on gist traces rather than precise probability estimates when diagnosing.
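
    For reference, the normative Bayesian computation that the vignettes probe is a one-liner: convert the pre-test probability to odds, multiply by the likelihood ratio of the finding, and convert back. A non-discriminating feature has a likelihood ratio of 1 and should leave the estimate unchanged; the numbers below are illustrative.

    ```python
    # Standard post-test probability via Bayes' theorem in odds form.
    def post_test_probability(pretest: float, likelihood_ratio: float) -> float:
        odds = pretest / (1 - pretest)            # pre-test odds
        post_odds = odds * likelihood_ratio       # update by the finding's LR
        return post_odds / (1 + post_odds)        # back to a probability

    print(post_test_probability(0.20, 1.0))  # 0.20 -- prototypical but
                                             # non-discriminating feature (LR = 1)
    print(post_test_probability(0.20, 8.0))  # ~0.67 -- discriminating finding
    ```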

  6. Temporal variability of spectro-temporal receptive fields in the anesthetized auditory cortex.

    PubMed

    Meyer, Arne F; Diepenbrock, Jan-Philipp; Ohl, Frank W; Anemüller, Jörn

    2014-01-01

    Temporal variability of neuronal response characteristics during sensory stimulation is a ubiquitous phenomenon that may reflect processes such as stimulus-driven adaptation, top-down modulation or spontaneous fluctuations. It poses a challenge to functional characterization methods such as the receptive field, since these often assume stationarity. We propose a novel method for estimation of sensory neurons' receptive fields that extends the classic static linear receptive field model to the time-varying case. Here, the long-term estimate of the static receptive field serves as the mean of a probabilistic prior distribution from which the short-term temporally localized receptive field may deviate stochastically with time-varying standard deviation. The derived corresponding generalized linear model permits robust characterization of temporal variability in receptive field structure also for highly non-Gaussian stimulus ensembles. We computed and analyzed short-term auditory spectro-temporal receptive field (STRF) estimates with a characteristic temporal resolution of 5-30 s, based on model simulations and responses from in total 60 single-unit recordings in anesthetized Mongolian gerbil auditory midbrain and cortex. Stimulation was performed with short (100 ms) overlapping frequency-modulated tones. Results demonstrate identification of time-varying STRFs, with obtained predictive model likelihoods exceeding those from baseline static STRF estimation. Quantitative characterization of STRF variability reveals a higher degree of variability in auditory cortex than in midbrain. Cluster analysis indicates that significant deviations from the long-term static STRF are brief, but reliably estimated. We hypothesize that the observed variability more likely reflects spontaneous or state-dependent internal fluctuations that interact with stimulus-induced processing, rather than artifacts of the experimental or stimulus design.
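
    The core estimation idea, a short-term receptive field constrained by a prior centred on the long-term static estimate, can be sketched in a linear-Gaussian setting. The paper works with a generalized linear model; the dimensions, penalty weight, and simulated data below are assumptions.

    ```python
    # Linear-Gaussian stand-in for the time-varying receptive field estimator:
    # MAP estimate with a Gaussian prior centred on the long-term static RF.
    import numpy as np

    def local_rf(X, y, w_static, lam):
        """Short-term RF estimate; lam encodes the prior's inverse variance."""
        d = X.shape[1]
        return np.linalg.solve(X.T @ X + lam * np.eye(d),
                               X.T @ y + lam * w_static)

    rng = np.random.default_rng(3)
    d, n = 20, 400
    w_static = rng.normal(0, 1, d)                 # long-term receptive field
    X = rng.normal(0, 1, (n, d))                   # stimulus ensemble (frames)
    w_true = w_static + 0.5 * rng.normal(0, 1, d)  # short-term deviation
    y = X @ w_true + rng.normal(0, 1, n)           # responses in one time window

    w_hat = local_rf(X, y, w_static, lam=10.0)     # shrinks toward w_static
    ```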

  7. Fine-temporal forecasting of outbreak probability and severity: Ross River virus in Western Australia.

    PubMed

    Koolhof, I S; Bettiol, S; Carver, S

    2017-10-01

    Health warnings of mosquito-borne disease risk require forecasts that are accurate at fine-temporal resolutions (weekly scales); however, most forecasting is coarse (monthly). We use environmental and Ross River virus (RRV) surveillance to predict weekly outbreak probabilities and incidence spanning tropical, semi-arid, and Mediterranean regions of Western Australia (1991-2014). Hurdle and linear models were used to predict outbreak probabilities and incidence, respectively, using time-lagged environmental variables. Forecast accuracy was assessed by model fit and cross-validation. Residual RRV notification data were also examined against mitigation expenditure for one site, Mandurah, 2007-2014. Models were predictive of RRV activity, except at one site (Capel). Minimum temperature was an important predictor of RRV outbreaks and incidence at all predicted sites. Precipitation was more likely to cause outbreaks and greater incidence among tropical and semi-arid sites. While variable, mitigation expenditure coincided positively with increased RRV incidence (r2 = 0.21). Our research demonstrates capacity to accurately predict mosquito-borne disease outbreaks and incidence at fine-temporal resolutions. We apply our findings, developing a user-friendly tool enabling managers to easily adopt this research to forecast region-specific RRV outbreaks and incidence. Approaches here may be of value to fine-scale forecasting of RRV in other areas of Australia, and other mosquito-borne diseases.

  8. The estimated lifetime probability of acquiring human papillomavirus in the United States.

    PubMed

    Chesson, Harrell W; Dunne, Eileen F; Hariri, Susan; Markowitz, Lauri E

    2014-11-01

    Estimates of the lifetime probability of acquiring human papillomavirus (HPV) can help to quantify HPV incidence, illustrate how common HPV infection is, and highlight the importance of HPV vaccination. We developed a simple model, based primarily on the distribution of lifetime numbers of sex partners across the population and the per-partnership probability of acquiring HPV, to estimate the lifetime probability of acquiring HPV in the United States in the time frame before HPV vaccine availability. We estimated the average lifetime probability of acquiring HPV among those with at least 1 opposite sex partner to be 84.6% (range, 53.6%-95.0%) for women and 91.3% (range, 69.5%-97.7%) for men. Under base case assumptions, more than 80% of women and men acquire HPV by age 45 years. Our results are consistent with estimates in the existing literature suggesting a high lifetime probability of HPV acquisition and are supported by cohort studies showing high cumulative HPV incidence over a relatively short period, such as 3 to 5 years.
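
    The underlying model reduces to averaging the per-person probability 1 − (1 − p)^n over the population distribution of lifetime partner numbers n. The sketch below illustrates the arithmetic with made-up numbers; the paper's actual partner distribution and per-partnership probabilities differ.

    ```python
    # Illustrative arithmetic only: lifetime acquisition probability as an
    # average of 1 - (1 - p)^n over a partner-number distribution.
    import numpy as np

    partners = np.array([1, 2, 4, 8, 16, 30])               # partner categories
    share = np.array([0.25, 0.20, 0.25, 0.15, 0.10, 0.05])  # population shares
    p_per_partner = 0.40                     # per-partnership acquisition prob.

    lifetime_prob = np.sum(share * (1 - (1 - p_per_partner) ** partners))
    print(round(lifetime_prob, 3))           # high even with modest p
    ```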

  9. Accurate estimation of camera shot noise in the real-time

    NASA Astrophysics Data System (ADS)

    Cheremkhin, Pavel A.; Evtikhiev, Nikolay N.; Krasnov, Vitaly V.; Rodin, Vladislav G.; Starikov, Rostislav S.

    2017-10-01

    Nowadays digital cameras are essential parts of various technological processes and daily tasks. They are widely used in optics and photonics, astronomy, biology, and other fields of science and technology, such as control systems and video-surveillance monitoring. One of the main information limitations of photo- and video cameras is the noise of the photosensor pixels. A camera's photosensor noise can be divided into random and pattern components: temporal noise includes the random component, while spatial noise includes the pattern component. Temporal noise can be divided into signal-dependent shot noise and signal-independent dark temporal noise. The most widely used methods for measuring camera noise characteristics are standards (for example, EMVA Standard 1288), which allow precise shot and dark temporal noise measurement but are difficult to implement and time-consuming. Earlier we proposed a method for measuring the temporal noise of photo- and video cameras based on the automatic segmentation of nonuniform targets (ASNT); only two frames are sufficient for noise measurement with the modified method. In this paper, we registered frames and estimated the shot and dark temporal noise of cameras in real time using the modified ASNT method. Estimation was performed for the following cameras: the consumer photocamera Canon EOS 400D (CMOS, 10.1 MP, 12-bit ADC), the scientific camera MegaPlus II ES11000 (CCD, 10.7 MP, 12-bit ADC), the industrial camera PixeLink PL-B781F (CMOS, 6.6 MP, 10-bit ADC), and the video-surveillance camera Watec LCL-902C (CCD, 0.47 MP, external 8-bit ADC). Experimental dependencies of temporal noise on signal value are in good agreement with fitted curves based on a Poisson distribution, excluding areas near saturation. The time for registering and processing the frames used for temporal noise estimation was measured: using a standard computer, frames were registered and processed in a fraction of a second to several seconds. The accuracy of the obtained temporal noise values was also estimated.
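
    The generic two-frame idea, though not the ASNT algorithm itself, is straightforward: the per-pixel difference of two frames isolates temporal noise, and binning its variance against the mean signal recovers the linear variance-signal relation expected from Poisson shot noise. Gain, read noise, and scene values below are synthetic assumptions.

    ```python
    # Sketch of two-frame temporal-noise estimation on a synthetic sensor
    # (not the authors' ASNT method; all parameters are assumptions).
    import numpy as np

    rng = np.random.default_rng(4)
    gain, read_sigma = 0.5, 2.0                  # DN per e-, dark temporal noise
    signal = rng.uniform(50, 4000, (512, 512))   # per-pixel scene, in electrons

    def capture():
        return gain * rng.poisson(signal) + rng.normal(0, read_sigma, signal.shape)

    f1, f2 = capture(), capture()                # only two frames are needed
    mean = 0.5 * (f1 + f2)
    half_sq_diff = 0.5 * (f1 - f2) ** 2          # unbiased temporal-variance sample

    # Bin by signal level: variance rises linearly with signal for shot noise.
    bins = np.linspace(mean.min(), mean.max(), 20)
    idx = np.digitize(mean.ravel(), bins)
    var_per_bin = np.array([half_sq_diff.ravel()[idx == i].mean()
                            for i in range(1, len(bins))])
    slope = np.polyfit(0.5 * (bins[:-1] + bins[1:]), var_per_bin, 1)[0]
    print(slope)   # ~gain, since var(DN) = gain * signal(DN) + read_sigma**2
    ```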

  10. Short-term impact of atmospheric pollution on fecundability.

    PubMed

    Slama, Rémy; Bottagisi, Sébastien; Solansky, Ivo; Lepeule, Johanna; Giorgis-Allemand, Lise; Sram, Radim

    2013-11-01

    Epidemiologic studies have reported associations between air pollution levels and semen characteristics, which might in turn affect a couple's ability to achieve a live birth. Our aim was to characterize short-term effects of atmospheric pollutants on fecundability (the month-specific probability of pregnancy among noncontracepting couples). For a cohort of births between 1994 and 1999 in Teplice (Czech Republic), we averaged fine particulate matter (PM2.5), carcinogenic polycyclic aromatic hydrocarbons, ozone, nitrogen dioxide (NO2), and sulfur dioxide levels estimated from a central measurement site over the 60-day period before the end of the first month of unprotected intercourse. We estimated changes in the probability of occurrence of a pregnancy during the first month of unprotected intercourse associated with exposure, using binomial regression and adjusting for maternal behaviors and time trends. Among the 1,916 recruited couples, 486 (25%) conceived during the first month of unprotected intercourse. Each increase of 10 µg/m³ in PM2.5 levels was associated with an adjusted decrease in fecundability of 22% (95% confidence interval = 6%-35%). NO2 levels were also associated with decreased fecundability. There was no evidence of adverse effects with the other pollutants considered. Biases related to pregnancy planning or temporal trends in air pollution were unlikely to explain the observed associations. In this polluted area, we highlighted short-term decreases in a couple's ability to conceive in association with PM2.5 and NO2 levels assessed in a central monitoring station.

  11. Hierarchical faunal filters: An approach to assessing effects of habitat and nonnative species on native fishes

    USGS Publications Warehouse

    Quist, M.C.; Rahel, F.J.; Hubert, W.A.

    2005-01-01

    Understanding factors related to the occurrence of species across multiple spatial and temporal scales is critical to the conservation and management of native fishes, especially for those species at the edge of their natural distribution. We used the concept of hierarchical faunal filters to provide a framework for investigating the influence of habitat characteristics and nonnative piscivores on the occurrence of 10 native fishes in streams of the North Platte River watershed in Wyoming. Three faunal filters were developed for each species: (i) large-scale biogeographic, (ii) local abiotic, and (iii) biotic. The large-scale biogeographic filter, composed of elevation and stream-size thresholds, was used to determine the boundaries within which each species might be expected to occur. Then, a local abiotic filter (i.e., habitat associations), developed using binary logistic-regression analysis, estimated the probability of occurrence of each species from features such as maximum depth, substrate composition, submergent aquatic vegetation, woody debris, and channel morphology (e.g., amount of pool habitat). Lastly, a biotic faunal filter was developed using binary logistic regression to estimate the probability of occurrence of each species relative to the abundance of nonnative piscivores in a reach. Conceptualising fish assemblages within a framework of hierarchical faunal filters is simple and logical, helps direct conservation and management activities, and provides important information on the ecology of fishes in the western Great Plains of North America. © Blackwell Munksgaard, 2004.

  12. Inverse sequential detection of parameter changes in developing time series

    NASA Technical Reports Server (NTRS)

    Radok, Uwe; Brown, Timothy J.

    1992-01-01

    Progressive values of two probabilities are obtained for parameter estimates derived from an existing set of values and from the same set enlarged by one or more new values, respectively. One probability is that of erroneously preferring the second of these estimates for the existing data ('type 1 error'), while the second probability is that of erroneously accepting their estimates for the enlarged set ('type 2 error'). A more stable combined 'no change' probability, which always falls between 0.5 and 0, is derived from the (logarithmic) width of the uncertainty region of an equivalent 'inverted' sequential probability ratio test (SPRT, Wald 1945), in which the error probabilities are calculated rather than prescribed. A parameter change is indicated when the compound probability undergoes a progressive decrease. The test is explicitly formulated and exemplified for Gaussian samples.
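
    For orientation, the classic Wald SPRT that the paper inverts compares a cumulative log-likelihood ratio against two thresholds set by the prescribed error probabilities. The Gaussian-mean version below is a standard textbook sketch, not the paper's inverted variant.

    ```python
    # Standard Wald SPRT for a Gaussian mean with known sigma (textbook form;
    # the paper's "inverted" test computes error probabilities instead).
    import numpy as np

    def sprt(x, mu0, mu1, sigma, alpha=0.05, beta=0.05):
        """Return 'H0', 'H1', or 'continue' after seeing the samples in x."""
        llr = np.cumsum((x * (mu1 - mu0) - 0.5 * (mu1**2 - mu0**2)) / sigma**2)
        upper = np.log((1 - beta) / alpha)     # accept H1 above this
        lower = np.log(beta / (1 - alpha))     # accept H0 below this
        for v in llr:
            if v >= upper:
                return "H1"
            if v <= lower:
                return "H0"
        return "continue"

    rng = np.random.default_rng(5)
    # Data drawn under H0 (no parameter change), so 'H0' is the typical outcome.
    print(sprt(rng.normal(0.0, 1.0, 200), mu0=0.0, mu1=0.5, sigma=1.0))
    ```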

  13. Evolution of the polarization of the optical afterglow of the gamma-ray burst GRB030329.

    PubMed

    Greiner, Jochen; Klose, Sylvio; Reinsch, Klaus; Schmid, Hans Martin; Sari, Re'em; Hartmann, Dieter H; Kouveliotou, Chryssa; Rau, Arne; Palazzi, Eliana; Straubmeier, Christian; Stecklum, Bringfried; Zharikov, Sergej; Tovmassian, Gaghik; Bärnbantner, Otto; Ries, Christoph; Jehin, Emmanuel; Henden, Arne; Kaas, Anlaug A; Grav, Tommy; Hjorth, Jens; Pedersen, Holger; Wijers, Ralph A M J; Kaufer, Andreas; Park, Hye-Sook; Williams, Grant; Reimer, Olaf

    2003-11-13

    The association of a supernova with GRB030329 strongly supports the 'collapsar' model of gamma-ray bursts, where a relativistic jet forms after the progenitor star collapses. Such jets cannot be spatially resolved because gamma-ray bursts lie at cosmological distances; their existence is instead inferred from 'breaks' in the light curves of the afterglows, and from the theoretical desire to reduce the estimated total energy of the burst by proposing that most of it comes out in narrow beams. Temporal evolution of the polarization of the afterglows may provide independent evidence for the jet structure of the relativistic outflow. Small-level polarization (approximately 1-3 per cent) has been reported for a few bursts, but its temporal evolution has yet to be established. Here we report polarimetric observations of the afterglow of GRB030329. We establish the polarization light curve, detect sustained polarization at the per cent level, and find significant variability. The data imply that the afterglow magnetic field has a small coherence length and is mostly random, probably generated by turbulence, in contrast with the picture arising from the high polarization detected in the prompt gamma-rays from GRB021206 (ref. 18).

  14. Hypothesis tests for the detection of constant speed radiation moving sources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dumazert, Jonathan; Coulon, Romain; Kondrasovs, Vladimir

    2015-07-01

    Radiation Portal Monitors are deployed in linear networks to detect radiological material in motion. As a complement to single- and multichannel detection algorithms, which are inefficient under too low signal-to-noise ratios, temporal correlation algorithms have been introduced. Hypothesis test methods based on empirically estimated means and variances of the signals delivered by the different channels have shown significant gains in the tradeoff between detection sensitivity and false alarm probability. This paper discloses the concept of a new hypothesis test for temporal correlation detection methods, taking advantage of the Poisson nature of the registered counting signals, and establishes a benchmark between this test and its empirical counterpart. The simulation study validates that, in the four relevant configurations of a pedestrian source carrier under respectively high and low count rate radioactive background, and a vehicle source carrier under the same respectively high and low count rate radioactive background, the newly introduced hypothesis test ensures a significantly improved compromise between sensitivity and false alarm, while guaranteeing the stability of its optimization parameter for signal-to-noise ratio variations between 2 and 0.8. (authors)

  15. Quantifying Melt Ponds in the Beaufort MIZ using Linear Support Vector Machines from High Resolution Panchromatic Images

    NASA Astrophysics Data System (ADS)

    Ortiz, M.; Graber, H. C.; Wilkinson, J.; Nyman, L. M.; Lund, B.

    2017-12-01

    Much work has been done on determining changes in summer ice albedo and morphological properties of melt ponds, such as depth, shape, and distribution, using in-situ measurements and satellite-based sensors. Although these studies represent much pioneering work in this area, coverage at sufficient spatial and temporal scales is still lacking. We present a prototype algorithm using Linear Support Vector Machines (LSVMs) designed to quantify the evolution of melt pond fraction from a recently government-declassified high-resolution panchromatic optical dataset. The study area of interest lies within the Beaufort marginal ice zone (MIZ), where several in-situ instruments were deployed by the British Antarctic Survey jointly with the MIZ Program from April to September 2014. The LSVM uses four-dimensional feature data from the intensity image itself and from various textures calculated with a modified first-order histogram technique using the probability density of occurrences. We explore both the temporal evolution of melt ponds and spatial statistics such as pond fraction, pond area, and pond number density, to name a few. We also introduce a linear regression model that can potentially be used to estimate average pond area by ingesting several melt pond statistics and shape parameters.
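
    A minimal stand-in for the classification step: a linear SVM over a four-dimensional feature vector (intensity plus texture statistics), from which the pond fraction follows directly. Features, labels, and data below are synthetic assumptions, not the declassified imagery pipeline.

    ```python
    # Hedged sketch of per-pixel melt pond classification with a linear SVM.
    import numpy as np
    from sklearn.svm import LinearSVC
    from sklearn.preprocessing import StandardScaler
    from sklearn.pipeline import make_pipeline

    rng = np.random.default_rng(6)
    n = 2000
    X = rng.normal(size=(n, 4))          # [intensity, tex1, tex2, tex3] per pixel
    # Synthetic labels: ponds are darker/smoother in this toy construction.
    y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 0.5, n)) > 0

    clf = make_pipeline(StandardScaler(), LinearSVC()).fit(X, y)
    pond_mask = clf.predict(X)           # classify every pixel
    pond_fraction = pond_mask.mean()     # melt pond fraction statistic
    ```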

  16. Temporal uncertainty analysis of human errors based on interrelationships among multiple factors: a case of Minuteman III missile accident.

    PubMed

    Rong, Hao; Tian, Jin; Zhao, Tingdi

    2016-01-01

    In traditional approaches to human reliability assessment (HRA), the definition of the error producing conditions (EPCs) and the supporting guidance are such that some of the conditions (especially organizational or managerial conditions) can hardly be included, so the analysis is incomplete and fails to reflect the temporal trend of human reliability. A method based on system dynamics (SD), which highlights interrelationships among technical and organizational aspects that may contribute to human errors, is presented to facilitate quantitatively estimating the human error probability (HEP) and its related variables as they change over a long period. Taking the Minuteman III missile accident in 2008 as a case, the proposed HRA method is applied to assess HEP during missile operations over 50 years by analyzing the interactions among the variables involved in human-related risks; the critical factors are also determined in terms of the impact that the variables have on risks in different time periods. It is indicated that both technical and organizational aspects should be addressed to minimize human errors in the long run. Copyright © 2015 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  17. A simulation to study the feasibility of improving the temporal resolution of LAGEOS geodynamic solutions by using a sequential process noise filter

    NASA Technical Reports Server (NTRS)

    Hartman, Brian Davis

    1995-01-01

    A key drawback to estimating geodetic and geodynamic parameters over time based on satellite laser ranging (SLR) observations is the inability to accurately model all the forces acting on the satellite. Errors associated with the observations and the measurement model can detract from the estimates as well. These 'model errors' corrupt the solutions obtained from the satellite orbit determination process. Dynamical models for satellite motion utilize known geophysical parameters to mathematically detail the forces acting on the satellite. However, these parameters, while estimated as constants, vary over time. These temporal variations must be accounted for in some fashion to maintain meaningful solutions. The primary goal of this study is to analyze the feasibility of using a sequential process noise filter for estimating geodynamic parameters over time from the Laser Geodynamics Satellite (LAGEOS) SLR data. This evaluation is achieved by first simulating a sequence of realistic LAGEOS laser ranging observations. These observations are generated using models with known temporal variations in several geodynamic parameters (along track drag and the J(sub 2), J(sub 3), J(sub 4), and J(sub 5) geopotential coefficients). A standard (non-stochastic) filter and a stochastic process noise filter are then utilized to estimate the model parameters from the simulated observations. The standard non-stochastic filter estimates these parameters as constants over consecutive fixed time intervals. Thus, the resulting solutions contain constant estimates of parameters that vary in time which limits the temporal resolution and accuracy of the solution. The stochastic process noise filter estimates these parameters as correlated process noise variables. As a result, the stochastic process noise filter has the potential to estimate the temporal variations more accurately since the constraint of estimating the parameters as constants is eliminated. A comparison of the temporal resolution of solutions obtained from standard sequential filtering methods and process noise sequential filtering methods shows that the accuracy is significantly improved using process noise. The results show that the positional accuracy of the orbit is improved as well. The temporal resolution of the resulting solutions are detailed, and conclusions drawn about the results. Benefits and drawbacks of using process noise filtering in this type of scenario are also identified.
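
    The contrast between the two filters can be caricatured with a scalar random-walk parameter: a batch fit over a fixed interval returns one constant, while a process-noise (Kalman) filter tracks the temporal variation. The noise levels and random-walk model below are illustrative assumptions, far simpler than the LAGEOS orbit-determination problem.

    ```python
    # Scalar caricature: constant batch estimate vs. process-noise Kalman filter
    # for a slowly drifting parameter observed with noise.
    import numpy as np

    rng = np.random.default_rng(7)
    T = 300
    truth = np.cumsum(rng.normal(0, 0.05, T))   # slowly drifting parameter
    obs = truth + rng.normal(0, 1.0, T)         # noisy observations

    q, r = 0.05**2, 1.0**2                      # process / measurement variances
    x, p = 0.0, 10.0                            # state estimate and its variance
    x_hat = np.empty(T)
    for t in range(T):
        p += q                                  # predict: random-walk process noise
        k = p / (p + r)                         # Kalman gain
        x += k * (obs[t] - x)                   # update with the measurement
        p *= (1 - k)
        x_hat[t] = x

    const_est = obs.mean()                      # "standard filter": one constant
    # x_hat tracks the temporal variation that const_est averages away.
    ```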

  18. Nonparametric probability density estimation by optimization theoretic techniques

    NASA Technical Reports Server (NTRS)

    Scott, D. W.

    1976-01-01

    Two nonparametric probability density estimators are considered. The first is the kernel estimator. The problem of choosing the kernel scaling factor based solely on a random sample is addressed. An interactive mode is discussed and an algorithm proposed to choose the scaling factor automatically. The second nonparametric probability estimate uses penalty function techniques with the maximum likelihood criterion. A discrete maximum penalized likelihood estimator is proposed and is shown to be consistent in the mean square error. A numerical implementation technique for the discrete solution is discussed and examples displayed. An extensive simulation study compares the integrated mean square error of the discrete and kernel estimators. The robustness of the discrete estimator is demonstrated graphically.
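
    A data-driven stand-in for the automatic choice of the kernel scaling factor is cross-validated likelihood maximization, sketched below; the paper's algorithm and criterion may differ, and the mixture data are synthetic.

    ```python
    # Kernel density estimation with the bandwidth (scaling factor) chosen
    # automatically by cross-validated log-likelihood.
    import numpy as np
    from sklearn.neighbors import KernelDensity
    from sklearn.model_selection import GridSearchCV

    rng = np.random.default_rng(8)
    sample = np.concatenate([rng.normal(-2, 0.5, 200), rng.normal(1, 1.0, 300)])

    grid = GridSearchCV(KernelDensity(kernel="gaussian"),
                        {"bandwidth": np.logspace(-1.5, 0.5, 25)}, cv=5)
    grid.fit(sample[:, None])
    x_eval = np.linspace(-4, 4, 200)[:, None]
    density = np.exp(grid.best_estimator_.score_samples(x_eval))  # density estimate
    ```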

  19. Maximum likelihood estimation for predicting the probability of obtaining variable shortleaf pine regeneration densities

    Treesearch

    Thomas B. Lynch; Jean Nkouka; Michael M. Huebschmann; James M. Guldin

    2003-01-01

    A logistic equation is the basis for a model that predicts the probability of obtaining regeneration at specified densities. The density of regeneration (trees/ha) for which an estimate of probability is desired can be specified by means of independent variables in the model. When estimating parameters, the dependent variable is set to 1 if the regeneration density (...
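
    In the same spirit, a minimal sketch: code the response as 1 when the observed regeneration meets the specified density and include that density as a covariate, so a single logistic fit yields probabilities for any specified level. Covariates, coefficients, and data below are invented for illustration.

    ```python
    # Hedged sketch of a logistic model for P(achieving a specified regeneration
    # density); the specified density itself enters as an independent variable.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(9)
    n = 800
    site_index = rng.uniform(15, 25, n)                    # illustrative covariate
    spec_density = rng.choice([300, 400, 500, 600, 700], n)  # stems/acre target
    logit = 0.3 * (site_index - 20) - 0.006 * (spec_density - 500)
    achieved = rng.random(n) < 1 / (1 + np.exp(-logit))    # synthetic outcomes

    X = np.column_stack([site_index, spec_density])
    model = LogisticRegression().fit(X, achieved)
    # Probability of achieving an easier (300) vs. harder (700) target level:
    print(model.predict_proba([[20.0, 300], [20.0, 700]])[:, 1])
    ```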

  20. Annual survival of Snail Kites in Florida: Radio telemetry versus capture-resighting data

    USGS Publications Warehouse

    Bennetts, R.E.; Dreitz, V.J.; Kitchens, W.M.; Hines, J.E.; Nichols, J.D.

    1999-01-01

    We estimated annual survival of Snail Kites (Rostrhamus sociabilis) in Florida using the Kaplan-Meier estimator with data from 271 radio-tagged birds over a three-year period and capture-recapture (resighting) models with data from 1,319 banded birds over a six-year period. We tested the hypothesis that survival differed among three age classes using both data sources. We tested additional hypotheses about spatial and temporal variation using a combination of data from radio telemetry and single- and multistrata capture-recapture models. Results from these data sets were similar in their indications of the sources of variation in survival, but they differed in some parameter estimates. Both data sources indicated that survival was higher for adults than for juveniles, but they did not support delineation of a subadult age class. Our data also indicated that survival differed among years and regions for juveniles but not for adults. Estimates of juvenile survival using radio telemetry data were higher than estimates using capture-recapture models for two of three years (1992 and 1993). Ancillary evidence based on censored birds indicated that some mortality of radio-tagged juveniles went undetected during those years, resulting in biased estimates. Thus, we have greater confidence in our estimates of juvenile survival using capture-recapture models. Precision of estimates reflected the number of parameters estimated and was surprisingly similar between radio telemetry and single-stratum capture-recapture models, given the substantial differences in sample sizes. Not having to estimate resighting probability likely offsets, to some degree, the smaller sample sizes from our radio telemetry data. Precision of capture-recapture models was lower using multistrata models where region-specific parameters were estimated than using single-stratum models, where spatial variation in parameters was not taken into account.

  1. Subjective Probabilities in Household Surveys

    PubMed Central

    Hurd, Michael D.

    2011-01-01

    Subjective probabilities are now collected on a number of large household surveys with the objective of providing data to better understand inter-temporal decision making. Comparison of subjective probabilities with actual outcomes shows that the probabilities have considerable predictive power in situations where individuals have considerable private information such as survival and retirement. In contrast the subjective probability of a stock market gain varies greatly across individuals even though no one has private information and the outcome is the same for everyone. An explanation is that there is considerable variation in accessing and processing information. Further, the subjective probability of a stock market gain is considerably lower than historical averages, providing an explanation for the relatively low frequency of stock holding. An important research objective will be to understand how individuals form their subjective probabilities. PMID:21643535

  2. Joint estimation of habitat dynamics and species interactions: Disturbance reduces co-occurrence of non-native predators with an endangered toad

    USGS Publications Warehouse

    Miller, David A.W.; Brehme, Cheryl S.; Hines, James E.; Nichols, James D.; Fisher, Robert N.

    2012-01-01

    1. Ecologists have long been interested in the processes that determine patterns of species occurrence and co-occurrence. Potential short-comings of many existing empirical approaches that address these questions include a reliance on patterns of occurrence at a single time point, failure to account properly for imperfect detection and treating the environment as a static variable.2. We fit detection and non-detection data collected from repeat visits using a dynamic site occupancy model that simultaneously accounts for the temporal dynamics of a focal prey species, its predators and its habitat. Our objective was to determine how disturbance and species interactions affect the co-occurrence probabilities of an endangered toad and recently introduced non-native predators in stream breeding habitats. For this, we determined statistical support for alternative processes that could affect co-occurrence frequency in the system.3. We collected occurrence data at stream segments in two watersheds where streams were largely ephemeral and one watershed dominated by perennial streams. Co-occurrence probabilities of toads with non-native predators were related to disturbance frequency, with low co-occurrence in the ephemeral watershed and high co-occurrence in the perennial watershed. This occurred because once predators were established at a site, they were rarely lost from the site except in cases when the site dried out. Once dry sites became suitable again, toads colonized them much more rapidly than predators, creating a period of predator-free space.4. We attribute the dynamics to a storage effect, where toads persisting outside the stream environment during periods of drought rapidly colonized sites when they become suitable again. Our results support that even in highly connected stream networks, temporal disturbance can structure frequencies with which breeding amphibians encounter non-native predators.5. Dynamic multi-state occupancy models are a powerful tool for rigorously examining hypotheses about inter-species and species–habitat interactions. In contrast to previous methods that infer dynamic processes based on static patterns in occupancy, the approach we took allows the dynamic processes that determine species–species and species–habitat interactions to be directly estimated.

  3. Effective connectivity between superior temporal gyrus and Heschl's gyrus during white noise listening: linear versus non-linear models.

    PubMed

    Hamid, Ka; Yusoff, An; Rahman, Mza; Mohamad, M; Hamid, Aia

    2012-04-01

    This fMRI study is about modelling the effective connectivity between Heschl's gyrus (HG) and the superior temporal gyrus (STG) in human primary auditory cortices. MATERIALS & METHODS: Ten healthy male participants were required to listen to white noise stimuli during functional magnetic resonance imaging (fMRI) scans. Statistical parametric mapping (SPM) was used to generate individual and group brain activation maps. For input region determination, two intrinsic connectivity models comprising bilateral HG and STG were constructed using dynamic causal modelling (DCM). The models were estimated and inferred using DCM while Bayesian Model Selection (BMS) for group studies was used for model comparison and selection. Based on the winning model, six linear and six non-linear causal models were derived and were again estimated, inferred, and compared to obtain a model that best represents the effective connectivity between HG and the STG, balancing accuracy and complexity. Group results indicated significant asymmetrical activation (p(uncorr) < 0.001) in bilateral HG and STG. Model comparison results showed strong evidence of STG as the input centre. The winning model is preferred by 6 out of 10 participants. The results were supported by BMS results for group studies with the expected posterior probability, r = 0.7830 and exceedance probability, ϕ = 0.9823. One-sample t-tests performed on connection values obtained from the winning model indicated that the valid connections for the winning model are the unidirectional parallel connections from STG to bilateral HG (p < 0.05). Subsequent model comparison between linear and non-linear models using BMS prefers non-linear connection (r = 0.9160, ϕ = 1.000) from which the connectivity between STG and the ipsi- and contralateral HG is gated by the activity in STG itself. We are able to demonstrate that the effective connectivity between HG and STG while listening to white noise for the respective participants can be explained by a non-linear dynamic causal model with the activity in STG influencing the STG-HG connectivity non-linearly.

  4. Estimating Local Chlamydia Incidence and Prevalence Using Surveillance Data

    PubMed Central

    White, Peter J.

    2017-01-01

    Background: Understanding patterns of chlamydia prevalence is important for addressing inequalities and planning cost-effective control programs. Population-based surveys are costly; the best data for England come from the Natsal national surveys, which are only available once per decade, and are nationally representative but not powered to compare prevalence in different localities. Prevalence estimates at finer spatial and temporal scales are required. Methods: We present a method for estimating local prevalence by modeling the infection, testing, and treatment processes. Prior probability distributions for parameters describing natural history and treatment-seeking behavior are informed by the literature or calibrated using national prevalence estimates. By combining them with surveillance data on numbers of chlamydia tests and diagnoses, we obtain estimates of local screening rates, incidence, and prevalence. We illustrate the method by application to data from England. Results: Our estimates of national prevalence by age group agree with the Natsal-3 survey. They could be improved by additional information on the number of diagnosed cases that were asymptomatic. There is substantial local-level variation in prevalence, with more infection in deprived areas. Incidence in each sex is strongly correlated with prevalence in the other. Importantly, we find that positivity (the proportion of tests which were positive) does not provide a reliable proxy for prevalence. Conclusion: This approach provides local chlamydia prevalence estimates from surveillance data, which could inform analyses to identify and understand local prevalence patterns and assess local programs. Estimates could be more accurate if surveillance systems recorded additional information, including on symptoms. See video abstract at http://links.lww.com/EDE/B211. PMID:28306613

  5. The Everett-Wheeler interpretation and the open future

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sudbery, Anthony

    2011-03-28

    I discuss the meaning of probability in the Everett-Wheeler interpretation of quantum mechanics, together with the problem of defining histories. To resolve these, I propose an understanding of probability arising from a form of temporal logic: the probability of a future-tense proposition is identified with its truth value in a many-valued and context-dependent logic. In short, probability is degree of truth. These ideas relate to traditional naive ideas of time and chance. Indeed, I argue that Everettian quantum mechanics is the only form of scientific theory that truly incorporates the perception that the future is open.

  6. Variance components estimation for continuous and discrete data, with emphasis on cross-classified sampling designs

    USGS Publications Warehouse

    Gray, Brian R.; Gitzen, Robert A.; Millspaugh, Joshua J.; Cooper, Andrew B.; Licht, Daniel S.

    2012-01-01

    Variance components may play multiple roles (cf. Cox and Solomon 2003). First, magnitudes and relative magnitudes of the variances of random factors may have important scientific and management value in their own right. For example, variation in levels of invasive vegetation among and within lakes may suggest causal agents that operate at both spatial scales – a finding that may be important for scientific and management reasons. Second, variance components may also be of interest when they affect precision of means and covariate coefficients. For example, variation in the effect of water depth on the probability of aquatic plant presence in a study of multiple lakes may vary by lake. This variation will affect the precision of the average depth-presence association. Third, variance component estimates may be used when designing studies, including monitoring programs. For example, to estimate the numbers of years and of samples per year required to meet long-term monitoring goals, investigators need estimates of within and among-year variances. Other chapters in this volume (Chapters 7, 8, and 10) as well as extensive external literature outline a framework for applying estimates of variance components to the design of monitoring efforts. For example, a series of papers with an ecological monitoring theme examined the relative importance of multiple sources of variation, including variation in means among sites, years, and site-years, for the purposes of temporal trend detection and estimation (Larsen et al. 2004, and references therein).
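
    As a concrete illustration of the design use described above, the sketch below estimates among-year and within-year variance components from synthetic monitoring data with a random-intercept mixed model; variable names and variances are assumptions.

    ```python
    # Hedged sketch: variance components for a monitoring design question
    # (how much variance lies among years vs. among samples within a year?).
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    rng = np.random.default_rng(10)
    years = np.repeat(np.arange(10), 20)                  # 10 years, 20 samples/yr
    year_effect = rng.normal(0, 2.0, 10)[years]           # among-year sd = 2
    y = 50 + year_effect + rng.normal(0, 5.0, len(years)) # within-year sd = 5

    df = pd.DataFrame({"y": y, "year": years})
    fit = sm.MixedLM.from_formula("y ~ 1", groups="year", data=df).fit()
    print(fit.cov_re)   # estimated among-year variance component (~4)
    print(fit.scale)    # estimated within-year (residual) variance (~25)
    ```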

  7. Age-related differences in striatal, medial temporal, and frontal involvement during value-based decision processing.

    PubMed

    Su, Yu-Shiang; Chen, Jheng-Ting; Tang, Yong-Jheng; Yuan, Shu-Yun; McCarrey, Anna C; Goh, Joshua Oon Soo

    2018-05-21

    Appropriate neural representation of value and application of decision strategies are necessary to make optimal investment choices in real life. Normative human aging alters neural selectivity and control processing in brain regions implicated in value-based decision processing including striatal, medial temporal, and frontal areas. However, the specific neural mechanisms of how these age-related functional brain changes modulate value processing in older adults remain unclear. Here, young and older adults performed a lottery-choice functional magnetic resonance imaging experiment in which probabilities of winning different magnitudes of points constituted expected values of stakes. Increasing probability of winning modulated striatal responses in young adults, but modulated medial temporal and ventromedial prefrontal areas instead in older adults. Older adults additionally engaged higher responses in dorso-medio-lateral prefrontal cortices to more unfavorable stakes. Such extrastriatal involvement mediated age-related increase in risk-taking decisions. Furthermore, lower resting-state functional connectivity between lateral prefrontal and striatal areas also predicted lottery-choice task risk-taking that was mediated by higher functional connectivity between prefrontal and medial temporal areas during the task, with this mediation relationship being stronger in older than younger adults. Overall, we report evidence of a systemic neural mechanistic change in processing of probability in mixed-lottery values with age that increases risk-taking of unfavorable stakes in older adults. Moreover, individual differences in age-related effects on baseline frontostriatal communication may be a central determinant of such subsequent age differences in value-based decision neural processing and resulting behaviors. Copyright © 2018 Elsevier Inc. All rights reserved.

  8. Towards a theoretical determination of the geographical probability distribution of meteoroid impacts on Earth

    NASA Astrophysics Data System (ADS)

    Zuluaga, Jorge I.; Sucerquia, Mario

    2018-06-01

    The Tunguska and Chelyabinsk impact events occurred inside a geographical area covering only 3.4 per cent of the Earth's surface. Although two events hardly constitute a statistically significant demonstration of a geographical pattern of impacts, their spatial coincidence is at least tantalizing. To understand whether this concurrence reflects an underlying geographical and/or temporal pattern, we must aim at predicting the spatio-temporal distribution of meteoroid impacts on Earth. For this purpose we designed, implemented, and tested a novel numerical technique, `Gravitational Ray Tracing' (GRT), which computes the relative impact probability (RIP) on the surface of any planet. GRT is inspired by the ray-casting techniques used to render realistic images of complex 3D scenes. In this paper we describe the method and the results of testing it at the times of large impact events. Our findings suggest a non-trivial pattern of impact probabilities at any given time on the Earth. Locations at 60-90° from the apex are more prone to impacts, especially at midnight. Counterintuitively, sites close to the apex direction have the lowest RIP, while around the antapex the RIP is slightly larger than average. We present preliminary maps of RIP at the times of the Tunguska and Chelyabinsk events and find no evidence of a spatial or temporal pattern, suggesting that their coincidence was fortuitous. We apply the GRT method to compute theoretical RIP at the locations and times of 394 large fireballs. Although the predicted spatio-temporal impact distribution only marginally matches the observed events, we successfully predict their impact speed distribution.

  9. Predicting the temporal and spatial probability of orographic cloud cover in the Luquillo Experimental Forest in Puerto Rico using generalized linear (mixed) models.

    Treesearch

    Wei Wu; Charles Hall; Lianjun Zhang

    2006-01-01

    We predicted the spatial pattern of the hourly probability of cloud cover in the Luquillo Experimental Forest (LEF) in northeastern Puerto Rico using four different models. The probability of cloud cover (defined in this paper as “the percentage of the area covered by clouds in each pixel on the map”) at any hour and any place is a function of three topographic variables...

  10. Digital signaling decouples activation probability and population heterogeneity.

    PubMed

    Kellogg, Ryan A; Tian, Chengzhe; Lipniacki, Tomasz; Quake, Stephen R; Tay, Savaş

    2015-10-21

    Digital signaling enhances robustness of cellular decisions in noisy environments, but it is unclear how digital systems transmit temporal information about a stimulus. To understand how temporal input information is encoded and decoded by the NF-κB system, we studied transcription factor dynamics and gene regulation under dose- and duration-modulated inflammatory inputs. Mathematical modeling predicted, and microfluidic single-cell experiments confirmed, that the integral of the stimulus (its area, concentration × duration) controls the fraction of cells that activate NF-κB in the population. However, the temporal profile of the stimulus determined NF-κB dynamics, cell-to-cell variability, and gene expression phenotype. A sustained, weak stimulus led to heterogeneous activation and delayed timing that was transmitted to gene expression. In contrast, a transient, strong stimulus with the same area caused rapid and uniform dynamics. These results show that digital NF-κB signaling enables multidimensional control of cellular phenotype via the input profile, allowing parallel and independent control of single-cell activation probability and population heterogeneity.

  11. Analysis of noise-induced temporal correlations in neuronal spike sequences

    NASA Astrophysics Data System (ADS)

    Reinoso, José A.; Torrent, M. C.; Masoller, Cristina

    2016-11-01

    We investigate temporal correlations in sequences of noise-induced neuronal spikes, using a symbolic method of time-series analysis. We focus on the sequence of time-intervals between consecutive spikes (inter-spike-intervals, ISIs). The analysis method, known as ordinal analysis, transforms the ISI sequence into a sequence of ordinal patterns (OPs), which are defined in terms of the relative ordering of consecutive ISIs. The ISI sequences are obtained from extensive simulations of two neuron models (FitzHugh-Nagumo, FHN, and integrate-and-fire, IF), with correlated noise. We find that, as the noise strength increases, temporal order gradually emerges, revealed by the existence of more frequent ordinal patterns in the ISI sequence. While in the FHN model the most frequent OP depends on the noise strength, in the IF model it is independent of the noise strength. In both models, the correlation time of the noise affects the OP probabilities but does not modify the most probable pattern.
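
    To make the ordinal-analysis step concrete, here is a minimal Python sketch (hypothetical data, not the authors' code) that converts an ISI sequence into ordinal-pattern probabilities in the Bandt-Pompe style described above:

      import numpy as np
      from collections import Counter

      def ordinal_pattern_probabilities(isi, order=3):
          """Map each window of `order` consecutive inter-spike intervals to its
          ordinal pattern (the permutation that sorts the window) and return the
          relative frequency of each pattern."""
          isi = np.asarray(isi, dtype=float)
          patterns = [tuple(np.argsort(isi[i:i + order]))
                      for i in range(len(isi) - order + 1)]
          counts = Counter(patterns)
          total = sum(counts.values())
          return {p: c / total for p, c in counts.items()}

      # Toy ISI sequence standing in for simulated FHN/IF spike trains.
      rng = np.random.default_rng(0)
      isi = rng.exponential(1.0, 500)
      probs = ordinal_pattern_probabilities(isi)
      print(max(probs, key=probs.get))  # the most frequent ordinal pattern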

  12. Developing a probability-based model of aquifer vulnerability in an agricultural region

    NASA Astrophysics Data System (ADS)

    Chen, Shih-Kai; Jang, Cheng-Shin; Peng, Yi-Huei

    2013-04-01

    Hydrogeological settings of aquifers strongly influence regional groundwater movement and pollution processes. Establishing a map of aquifer vulnerability is critical for planning a scheme of groundwater quality protection. This study developed a novel probability-based DRASTIC model of aquifer vulnerability in the Choushui River alluvial fan, Taiwan, using indicator kriging to determine various risk categories of contamination potential based on estimated vulnerability indexes. Categories and ratings of the six parameters in the probability-based DRASTIC model were probabilistically characterized according to two parameter classification methods: selecting the maximum estimation probability and calculating an expected value. Moreover, the probability-based estimation and assessment provide insight into how parameter uncertainty arising from limited observation data propagates. To examine the model's capacity to predict pollution, the medium, high, and very high risk categories of contamination potential were compared with observed nitrate-N exceeding 0.5 mg/L, an indicator of anthropogenic groundwater pollution. The results reveal that the developed probability-based DRASTIC model is capable of predicting high nitrate-N groundwater pollution and of characterizing parameter uncertainty via the probability estimation processes.

  13. Estimating the Probability of Rare Events Occurring Using a Local Model Averaging.

    PubMed

    Chen, Jin-Hua; Chen, Chun-Shu; Huang, Meng-Fan; Lin, Hung-Chih

    2016-10-01

    In statistical applications, logistic regression is a popular method for analyzing binary data accompanied by explanatory variables. But when one of the two outcomes is rare, the estimation of model parameters has been shown to be severely biased, and hence estimates of the probability of rare events based on a logistic regression model can be inaccurate. In this article, we focus on estimating the probability of rare events occurring based on logistic regression models. Instead of selecting a single best model, we propose a local model averaging procedure based on a data perturbation technique applied to different information criteria to obtain different probability estimates of rare events occurring. An approximately unbiased estimator of the Kullback-Leibler loss is then used to choose the best one among them. We conduct simulations to show the effectiveness of our approach. For illustration, a necrotizing enterocolitis (NEC) data set is analyzed.
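
    The article's procedure combines data perturbation with a Kullback-Leibler loss criterion; the simpler, related idea of averaging rare-event probability estimates across candidate logistic models with information-criterion weights can be sketched as follows (Python with statsmodels; all names and data are hypothetical, and this is not the authors' algorithm):

      import numpy as np
      import statsmodels.api as sm

      def averaged_rare_event_probs(y, X, candidate_cols, X_new):
          """Average predicted probabilities from candidate logistic models,
          weighting each model by exp(-0.5 * delta_AIC) (Akaike weights)."""
          fits, aics = [], []
          for cols in candidate_cols:
              res = sm.Logit(y, sm.add_constant(X[:, cols])).fit(disp=0)
              fits.append((cols, res))
              aics.append(res.aic)
          delta = np.asarray(aics) - min(aics)
          w = np.exp(-0.5 * delta)
          w /= w.sum()
          preds = np.column_stack([
              res.predict(sm.add_constant(X_new[:, cols], has_constant='add'))
              for cols, res in fits])
          return preds @ w

      rng = np.random.default_rng(2)
      X = rng.normal(size=(500, 3))
      p = 1 / (1 + np.exp(-(-3.5 + X[:, 0])))   # rare outcome, roughly 3% events
      y = rng.binomial(1, p)
      print(averaged_rare_event_probs(y, X, [[0], [0, 1], [0, 1, 2]], X[:5]))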

  14. Frequency domain analysis of errors in cross-correlations of ambient seismic noise

    NASA Astrophysics Data System (ADS)

    Liu, Xin; Ben-Zion, Yehuda; Zigone, Dimitri

    2016-12-01

    We analyse random errors (variances) in cross-correlations of ambient seismic noise in the frequency domain, which differs from previous time domain methods. Extending previous theoretical results on the ensemble-averaged cross-spectrum, we estimate the confidence interval of the stacked cross-spectrum of a finite amount of data at each frequency, using non-overlapping windows of fixed length. The extended theory also connects amplitude and phase variances with the variance of each complex spectrum value. Analysis of synthetic stationary ambient noise is used to estimate the confidence interval of the stacked cross-spectrum obtained with different lengths of noise data, corresponding to different numbers of evenly spaced windows of the same duration. This method allows estimating the signal-to-noise ratio (SNR) of noise cross-correlation in the frequency domain, without specifying the filter bandwidth or signal/noise windows that are needed for time domain SNR estimates. Based on synthetic ambient noise data, we also compare the probability distributions, causal part amplitude, and SNR of the stacked cross-spectrum obtained using one-bit normalization or pre-whitening with those obtained without these pre-processing steps. Natural continuous noise records contain both ambient noise and small earthquakes that are inseparable from the noise with the existing pre-processing steps. Using probability distributions of random cross-spectrum values based on the theoretical results provides an effective way to exclude such small earthquakes, and additional data segments (outliers) contaminated by signals with different statistics (e.g. rain, cultural noise), from continuous noise waveforms. This technique is applied to constrain the values and uncertainties of the amplitude and phase velocity of the stacked noise cross-spectrum at different frequencies, using data from southern California at both regional scale (˜35 km) and a dense linear array (˜20 m) across the plate-boundary faults. A block bootstrap resampling method is used to account for temporal correlation of the noise cross-spectrum at low frequencies (0.05-0.2 Hz) near the ocean microseismic peaks.
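
    A minimal sketch of the windowed stacking idea (synthetic data, not the authors' code or their full theory): estimate a stacked cross-spectrum from non-overlapping windows and form a frequency-domain SNR proxy from the window-to-window scatter.

      import numpy as np

      def stacked_cross_spectrum(x, y, nwin, fs=1.0):
          """Split x and y into `nwin` non-overlapping windows, compute the
          cross-spectrum of each, and return the stacked (averaged) cross-spectrum
          with a frequency-by-frequency standard error from window scatter."""
          n = (len(x) // nwin) * nwin
          xw = np.asarray(x[:n]).reshape(nwin, -1)
          yw = np.asarray(y[:n]).reshape(nwin, -1)
          X = np.fft.rfft(xw, axis=1)
          Y = np.fft.rfft(yw, axis=1)
          cs = X * np.conj(Y)                            # per-window cross-spectra
          stack = cs.mean(axis=0)
          se = cs.std(axis=0, ddof=1) / np.sqrt(nwin)    # scatter-based standard error
          freqs = np.fft.rfftfreq(xw.shape[1], d=1.0 / fs)
          return freqs, stack, se

      rng = np.random.default_rng(3)
      common = rng.normal(size=20000)
      x = common + 0.5 * rng.normal(size=20000)
      y = np.roll(common, 5) + 0.5 * rng.normal(size=20000)
      f, s, se = stacked_cross_spectrum(x, y, nwin=40)
      snr = np.abs(s) / se    # no filter bandwidth or signal window needed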

  15. Correcting length-frequency distributions for imperfect detection

    USGS Publications Warehouse

    Breton, André R.; Hawkins, John A.; Winkelman, Dana L.

    2013-01-01

    Sampling gear selects for specific sizes of fish, which may bias the length-frequency distributions that are commonly used to assess population size structure, recruitment patterns, growth, and survival. To properly correct for sampling biases caused by gear and other sources, length-frequency distributions need to be corrected for imperfect detection. We describe a method for adjusting length-frequency distributions when capture and recapture probabilities are a function of fish length, temporal variation, and capture history. The method is applied to a study involving the removal of Smallmouth Bass Micropterus dolomieu by boat electrofishing from a 38.6-km reach on the Yampa River, Colorado. Smallmouth Bass longer than 100 mm were marked and released alive from 2005 to 2010 on one or more electrofishing passes and removed from the population on all other passes. Using the Huggins mark–recapture model, we detected a significant effect of fish total length, previous capture history (behavior), year, pass, year×behavior, and year×pass on capture and recapture probabilities. We demonstrate how to partition the Huggins estimate of abundance into length frequencies to correct for these effects. Uncorrected length frequencies of fish removed from Little Yampa Canyon were negatively biased in every year by as much as 88% relative to mark–recapture estimates for the smallest length-class in our analysis (100–110 mm). Bias declined but remained high even for adult length-classes (≥200 mm). The pattern of bias across length-classes varied across years. The percentages of unadjusted counts that fell below the lower 95% confidence limits of our adjusted length-frequency estimates were 95, 89, 84, 78, 81, and 92% from 2005 to 2010, respectively. Length-frequency distributions are widely used in fisheries science and management. Our simple method for correcting length-frequency estimates for imperfect detection could be widely applied when mark–recapture data are available.
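
    The study partitions a Huggins abundance estimate across length-classes; the core correction logic can be illustrated with a simpler Horvitz-Thompson-style sketch in which each observed fish stands in for 1/p fish (all numbers hypothetical, not the study's data):

      import numpy as np

      # Hypothetical counts of removed fish per 10-mm length-class and the
      # capture probabilities estimated for those classes (e.g., from a
      # mark-recapture model with a length covariate).
      counts = np.array([120, 210, 260, 180, 90])        # 100-110, ..., 140-150 mm
      p_capture = np.array([0.12, 0.20, 0.28, 0.33, 0.36])

      # Horvitz-Thompson-style correction: each observed fish represents 1/p fish.
      corrected = counts / p_capture
      bias_pct = 100 * (counts - corrected) / corrected  # negative bias of raw counts
      for c, adj, b in zip(counts, corrected, bias_pct):
          print(f"raw={c:4d}  corrected={adj:7.1f}  bias={b:6.1f}%")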

  16. Data Analysis of GPM Constellation Satellites-IMERG and ERA-Interim precipitation products over West of Iran

    NASA Astrophysics Data System (ADS)

    Sharifi, Ehsan; Steinacker, Reinhold; Saghafian, Bahram

    2016-04-01

    Precipitation is a critical component of the Earth's hydrological cycle. The primary requirement in precipitation measurement is to know where and how much precipitation is falling at any given time. Especially in data-sparse regions with insufficient radar coverage, satellite information can provide a spatial and temporal context. Nonetheless, evaluation of satellite precipitation is essential prior to operational use, which is why many previous studies are devoted to the validation of satellite estimates. Accurate quantitative precipitation estimation over mountainous basins is of great importance because of their susceptibility to hazards. In situ observations over mountainous areas are mostly limited, but currently available satellite precipitation products can potentially provide the precipitation estimates needed for meteorological and hydrological applications. One of the newest blended methods, using multiple satellites and sensors, has been developed for estimating global precipitation: the Integrated Multi-satellitE Retrievals (IMERG) data set, routinely produced by the GPM (Global Precipitation Measurement) constellation satellites. Moreover, recent efforts have been put into the improvement of precipitation products derived from reanalysis systems, which has led to significant progress. One of the best and most widely used models is that of the European Centre for Medium-Range Weather Forecasts (ECMWF), which produces a global reanalysis daily precipitation product known as ERA-Interim. This study evaluated one year of precipitation data from the GPM-IMERG and ERA-Interim reanalysis daily time series over western Iran. Both IMERG and ERA-Interim underestimate the observed values, although IMERG underestimated only slightly and performed better when precipitation exceeded 10 mm. Furthermore, with respect to probability of detection (POD), threat score (TS), false alarm ratio (FAR), and probability of false detection (POFD), IMERG yields better values than ERA-Interim on all four measures. Overall, the ERA-Interim product produced less robust results than IMERG.

  17. Utilizing Adjoint-Based Error Estimates for Surrogate Models to Accurately Predict Probabilities of Events

    DOE PAGES

    Butler, Troy; Wildey, Timothy

    2018-01-01

    In this study, we develop a procedure to utilize error estimates for samples of a surrogate model to compute robust upper and lower bounds on estimates of probabilities of events. We show that these error estimates can also be used in an adaptive algorithm to simultaneously reduce the computational cost and increase the accuracy in estimating probabilities of events using computationally expensive high-fidelity models. Specifically, we introduce the notion of reliability of a sample of a surrogate model, and we prove that utilizing the surrogate model for the reliable samples and the high-fidelity model for the unreliable samples gives precisely the same estimate of the probability of the output event as would be obtained by evaluation of the original model for each sample. The adaptive algorithm uses the additional evaluations of the high-fidelity model for the unreliable samples to locally improve the surrogate model near the limit state, which significantly reduces the number of high-fidelity model evaluations as the limit state is resolved. Numerical results based on a recently developed adjoint-based approach for estimating the error in samples of a surrogate are provided to demonstrate (1) the robustness of the bounds on the probability of an event, and (2) that the adaptive enhancement algorithm provides a more accurate estimate of the probability of the QoI event than standard response surface approximation methods at a lower computational cost.
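
    A minimal sketch of the reliability idea under stated assumptions (a known pointwise error bound for the surrogate; toy functions, not the authors' adjoint-based machinery): a sample is reliable when the surrogate value plus or minus its error bound cannot change which side of the threshold the sample falls on, so only unreliable samples need the expensive model.

      import numpy as np

      def estimate_event_probability(samples, surrogate, error_bound,
                                     high_fidelity, threshold):
          """Estimate P(Q(sample) > threshold), evaluating the high-fidelity
          model only where the surrogate's error bound straddles the threshold."""
          q_s = np.array([surrogate(s) for s in samples])
          err = np.array([error_bound(s) for s in samples])
          unreliable = np.abs(q_s - threshold) <= err
          q = q_s.copy()
          q[unreliable] = [high_fidelity(s)
                           for s in np.asarray(samples)[unreliable]]
          return (q > threshold).mean(), unreliable.sum()

      # Toy stand-ins: a quadratic QoI, a coarse surrogate, and a known error bound.
      hf = lambda s: s ** 2
      sur = lambda s: s ** 2 + 0.05 * np.sin(10 * s)
      bound = lambda s: 0.05
      rng = np.random.default_rng(4)
      prob, n_hf = estimate_event_probability(rng.uniform(-1, 1, 10000),
                                              sur, bound, hf, threshold=0.5)
      print(prob, n_hf)   # probability estimate, number of expensive evaluations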

  19. Spatio-temporal Granger causality: a new framework

    PubMed Central

    Luo, Qiang; Lu, Wenlian; Cheng, Wei; Valdes-Sosa, Pedro A.; Wen, Xiaotong; Ding, Mingzhou; Feng, Jianfeng

    2015-01-01

    That physiological oscillations of various frequencies are present in fMRI signals is the rule, not the exception. Herein, we propose a novel theoretical framework, spatio-temporal Granger causality, which allows us to more reliably and precisely estimate the Granger causality from experimental datasets possessing time-varying properties caused by physiological oscillations. Within this framework, Granger causality is redefined as a global index measuring the directed information flow between two time series with time-varying properties. Both theoretical analyses and numerical examples demonstrate that Granger causality is a monotonically increasing function of the temporal resolution used in the estimation. This is consistent with the general principle of coarse graining, which causes information loss by smoothing out very fine-scale details in time and space. Our results confirm that the Granger causality at the finer spatio-temporal scales considerably outperforms the traditional approach in terms of an improved consistency between two resting-state scans of the same subject. To optimally estimate the Granger causality, the proposed theoretical framework is implemented through a combination of several approaches, such as dividing the optimal time window and estimating the parameters at the fine temporal and spatial scales. Taken together, our approach provides a novel and robust framework for estimating the Granger causality from fMRI, EEG, and other related data. PMID:23643924
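
    The framework extends classical Granger causality; for readers new to the underlying quantity, a minimal time-domain sketch (synthetic data, not the authors' spatio-temporal estimator) compares the residual variances of restricted and full autoregressions:

      import numpy as np

      def granger_causality(x, y, p=2):
          """Granger causality from x to y with lag order p: log ratio of residual
          variances of the restricted (y's own lags) and full (y's and x's lags)
          least-squares autoregressive models."""
          n = len(y)
          Y = y[p:]
          lags_y = np.column_stack([y[p - k:n - k] for k in range(1, p + 1)])
          lags_x = np.column_stack([x[p - k:n - k] for k in range(1, p + 1)])
          ones = np.ones((n - p, 1))
          A_r = np.hstack([ones, lags_y])
          A_f = np.hstack([ones, lags_y, lags_x])
          res_r = Y - A_r @ np.linalg.lstsq(A_r, Y, rcond=None)[0]
          res_f = Y - A_f @ np.linalg.lstsq(A_f, Y, rcond=None)[0]
          return np.log(res_r.var() / res_f.var())

      rng = np.random.default_rng(5)
      x = rng.normal(size=5000)
      y = np.zeros(5000)
      for t in range(2, 5000):                  # y driven by lagged x
          y[t] = 0.4 * y[t - 1] + 0.5 * x[t - 1] + rng.normal()
      print(granger_causality(x, y))            # clearly > 0
      print(granger_causality(y, x))            # near 0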

  20. Estimating total suspended sediment yield with probability sampling

    Treesearch

    Robert B. Thomas

    1985-01-01

    The ""Selection At List Time"" (SALT) scheme controls sampling of concentration for estimating total suspended sediment yield. The probability of taking a sample is proportional to its estimated contribution to total suspended sediment discharge. This procedure gives unbiased estimates of total suspended sediment yield and the variance of the...

  1. Influences of Availability on Parameter Estimates from Site Occupancy Models with Application to Submersed Aquatic Vegetation

    USGS Publications Warehouse

    Gray, Brian R.; Holland, Mark D.; Yi, Feng; Starcevich, Leigh Ann Harrod

    2013-01-01

    Site occupancy models are commonly used by ecologists to estimate the probabilities of species site occupancy and of species detection. This study addresses the influence of variation in species availability among surveys within sites on site occupancy and detection estimates. Such variation in availability may result from temporary emigration, nonavailability of the species for detection, and sampling sites spatially when species presence is not uniform within sites. We demonstrate, using Monte Carlo simulations and aquatic vegetation data, that variation in availability and heterogeneity in the probability of availability may yield biases in the expected values of the site occupancy and detection estimates that have traditionally been associated with low detection probabilities and heterogeneity in those probabilities. These findings confirm that the effects of availability may be important for ecologists and managers, and that where such effects are expected, modification of sampling designs and/or analytical methods should be considered. Failure to limit the effects of availability may preclude reliable estimation of the probability of site occupancy.

  2. Mice plan decision strategies based on previously learned time intervals, locations, and probabilities

    PubMed Central

    Tosun, Tuğçe; Gür, Ezgi; Balcı, Fuat

    2016-01-01

    Animals can shape their timed behaviors based on experienced probabilistic relations in a nearly optimal fashion. On the other hand, it is not clear if they adopt these timed decisions by making computations based on previously learnt task parameters (time intervals, locations, and probabilities) or if they gradually develop their decisions based on trial and error. To address this question, we tested mice in the timed-switching task, which required them to anticipate when (after a short or long delay) and at which of the two delay locations a reward would be presented. The probability of short trials differed between test groups in two experiments. Critically, we first trained mice on relevant task parameters by signaling the active trial with a discriminative stimulus and delivered the corresponding reward after the associated delay without any response requirement (without inducing switching behavior). During the test phase, both options were presented simultaneously to characterize the emergence and temporal characteristics of the switching behavior. Mice exhibited timed-switching behavior starting from the first few test trials, and their performance remained stable throughout testing in the majority of the conditions. Furthermore, as the probability of the short trial increased, mice waited longer before switching from the short to long location (experiment 1). These behavioral adjustments were in directions predicted by reward maximization. These results suggest that rather than gradually adjusting their time-dependent choice behavior, mice abruptly adopted temporal decision strategies by directly integrating their previous knowledge of task parameters into their timed behavior, supporting the model-based representational account of temporal risk assessment. PMID:26733674

  3. Models based on value and probability in health improve shared decision making.

    PubMed

    Ortendahl, Monica

    2008-10-01

    Diagnostic reasoning and treatment decisions are a key competence of doctors. A model based on values and probability provides a conceptual framework for clinical judgments and decisions, and also facilitates the integration of clinical and biomedical knowledge into a diagnostic decision. Both value and probability are usually estimated quantities in clinical decision making. Therefore, model assumptions and parameter estimates should be continually assessed against data, and models should be revised accordingly. Introducing parameter estimates for both value and probability, as is usual in clinical work, yields the model labelled subjective expected utility. Estimated values and probabilities are involved sequentially at every step of the decision-making process. Introducing decision-analytic modelling gives a more complete picture of the variables that influence the decisions carried out by the doctor and the patient. A model revised for the values and probabilities perceived by both the doctor and the patient could be used as a tool for engaging in a mutual and shared decision-making process in clinical work.

  4. Can you hear me now? Range-testing a submerged passive acoustic receiver array in a Caribbean coral reef habitat

    USGS Publications Warehouse

    Selby, Thomas H.; Hart, Kristen M.; Fujisaki, Ikuko; Smith, Brian J.; Pollock, Clayton J; Hillis-Star, Zandy M; Lundgren, Ian; Oli, Madan K.

    2016-01-01

    Submerged passive acoustic technology allows researchers to investigate spatial and temporal movement patterns of many marine and freshwater species. The technology uses receivers to detect and record acoustic transmissions emitted from tags attached to an individual. Acoustic signal strength naturally attenuates over distance, but numerous environmental variables also affect the probability that a tag is detected. Knowledge of receiver range is crucial for designing acoustic arrays and analyzing telemetry data. Here, we present a method for testing a relatively large-scale receiver array in a dynamic Caribbean coastal environment intended for long-term monitoring of multiple species. The U.S. Geological Survey and several academic institutions, in collaboration with resource management at Buck Island Reef National Monument (BIRNM) off the coast of St. Croix, recently deployed an array of 52 passive acoustic receivers. We targeted 19 array-representative receivers for range-testing by submerging fixed-delay-interval range-testing tags at various distance intervals in each cardinal direction from a receiver for a minimum of an hour. Using a generalized linear mixed model (GLMM), we estimated the probability of detection across the array and assessed the effect of water depth, habitat, wind, temperature, and time of day on the probability of detection. The predicted probability of detection across the entire array at 100 m distance from a receiver was 58.2% (95% CI: 44.0–73.0%) and dropped to 26.0% (95% CI: 11.4–39.3%) at 200 m from a receiver, indicating a somewhat constrained effective detection range. Detection probability varied across habitat classes, with the greatest effective detection range occurring in homogeneous sand substrate and the smallest in high-rugosity reef. The predicted probability of detection across BIRNM highlights potential gaps in coverage using the current array, as well as limitations of passive acoustic technology within a complex coral reef environment.

  5. Dynamic Rainfall Patterns and the Simulation of Changing Scenarios: A behavioral watershed response

    NASA Astrophysics Data System (ADS)

    Chu, M.; Guzman, J.; Steiner, J. L.; Hou, C.; Moriasi, D.

    2015-12-01

    Rainfall is one of the fundamental drivers that control hydrologic responses, including runoff production and transport phenomena, which consequently drive changes in aquatic ecosystems. Quantifying the hydrologic responses to changing scenarios (e.g., climate, land use, and management) using environmental models requires a realistic representation of probable rainfall in its most sensible spatio-temporal dimensions, matching those of the phenomenon under investigation. Downscaling projected rainfall from global circulation models (GCMs) is the most common practice for deriving the rainfall datasets used as main inputs to hydrologic models, which in turn are used to assess the impacts of climate change on ecosystems. Downscaling assumes that local climate is a combination of large-scale climatic/atmospheric conditions and local conditions; however, the representation of the latter is generally beyond the capacity of current GCMs. The main objective of this study was to develop and implement a synthetic rainfall generator to downscale expected rainfall trends to 1 x 1 km daily rainfall patterns that mimic the dynamic propagation of probability distribution functions (pdf) derived from historic rainfall data (rain-gauge or radar estimated). Future projections were determined based on actual and expected changes in the pdf, with stochastic processes to account for variability. Watershed responses in terms of streamflow and nutrient loads were evaluated using synthetically generated rainfall patterns and actual data. The framework developed in this study will allow practitioners to generate rainfall datasets that mimic the temporal and spatial patterns exclusive to their study area under full disclosure of the uncertainties involved. This is expected to provide significantly more accurate environmental models than are currently available and to give practitioners ways to evaluate the spectrum of systemic responses to changing scenarios.

  6. Internal Medicine residents use heuristics to estimate disease probability

    PubMed Central

    Phang, Sen Han; Ravani, Pietro; Schaefer, Jeffrey; Wright, Bruce; McLaughlin, Kevin

    2015-01-01

    Background Training in Bayesian reasoning may have limited impact on accuracy of probability estimates. In this study, our goal was to explore whether residents previously exposed to Bayesian reasoning use heuristics rather than Bayesian reasoning to estimate disease probabilities. We predicted that if residents use heuristics then post-test probability estimates would be increased by non-discriminating clinical features or a high anchor for a target condition. Method We randomized 55 Internal Medicine residents to different versions of four clinical vignettes and asked them to estimate probabilities of target conditions. We manipulated the clinical data for each vignette to be consistent with either 1) using a representative heuristic, by adding non-discriminating prototypical clinical features of the target condition, or 2) using anchoring with adjustment heuristic, by providing a high or low anchor for the target condition. Results When presented with additional non-discriminating data the odds of diagnosing the target condition were increased (odds ratio (OR) 2.83, 95% confidence interval [1.30, 6.15], p = 0.009). Similarly, the odds of diagnosing the target condition were increased when a high anchor preceded the vignette (OR 2.04, [1.09, 3.81], p = 0.025). Conclusions Our findings suggest that despite previous exposure to the use of Bayesian reasoning, residents use heuristics, such as the representative heuristic and anchoring with adjustment, to estimate probabilities. Potential reasons for attribute substitution include the relative cognitive ease of heuristics vs. Bayesian reasoning or perhaps residents in their clinical practice use gist traces rather than precise probability estimates when diagnosing. PMID:27004080

  7. Estimating site occupancy rates for aquatic plants using spatial sub-sampling designs when detection probabilities are less than one

    USGS Publications Warehouse

    Nielson, Ryan M.; Gray, Brian R.; McDonald, Lyman L.; Heglund, Patricia J.

    2011-01-01

    Estimation of site occupancy rates when detection probabilities are <1 is well established in wildlife science. Data from multiple visits to a sample of sites are used to estimate detection probabilities and the proportion of sites occupied by focal species. In this article we describe how site occupancy methods can be applied to estimate occupancy rates of plants and other sessile organisms. We illustrate this approach and the pitfalls of ignoring incomplete detection using spatial data for 2 aquatic vascular plants collected under the Upper Mississippi River's Long Term Resource Monitoring Program (LTRMP). Site occupancy models considered include: a naïve model that ignores incomplete detection, a simple site occupancy model assuming a constant occupancy rate and a constant probability of detection across sites, several models that allow site occupancy rates and probabilities of detection to vary with habitat characteristics, and mixture models that allow for unexplained variation in detection probabilities. We used information theoretic methods to rank competing models and bootstrapping to evaluate the goodness-of-fit of the final models. Results of our analysis confirm that ignoring incomplete detection can result in biased estimates of occupancy rates. Estimates of site occupancy rates for 2 aquatic plant species were 19–36% higher compared to naive estimates that ignored probabilities of detection <1. Simulations indicate that final models have little bias when 50 or more sites are sampled, and that little gain in precision could be expected for sample sizes >300. We recommend applying site occupancy methods for monitoring presence of aquatic species.
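
    The simple constant-rate occupancy model described above has a closed-form likelihood that is easy to maximize directly; a minimal sketch (synthetic data, not the LTRMP analysis) follows:

      import numpy as np
      from scipy.optimize import minimize

      def fit_occupancy(detections):
          """Maximum-likelihood fit of the basic site occupancy model.
          `detections` is a sites x visits 0/1 array; returns (psi_hat, p_hat).
          Per-site likelihood: psi * p^d * (1-p)^(K-d), plus (1-psi) if d == 0."""
          d = detections.sum(axis=1)
          K = detections.shape[1]

          def negloglik(theta):
              psi, p = 1 / (1 + np.exp(-theta))    # logit scale -> (0, 1)
              lik = psi * p ** d * (1 - p) ** (K - d) + (1 - psi) * (d == 0)
              return -np.log(lik).sum()

          res = minimize(negloglik, x0=[0.0, 0.0], method="Nelder-Mead")
          return 1 / (1 + np.exp(-res.x))

      rng = np.random.default_rng(7)
      z = rng.binomial(1, 0.6, 200)                       # true occupancy, psi = 0.6
      obs = rng.binomial(1, 0.3, (200, 4)) * z[:, None]   # detection p = 0.3
      print(fit_occupancy(obs))                           # approx [0.6, 0.3]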

  8. Characterization of water bodies for mosquito habitat using a multi-sensor approach

    NASA Astrophysics Data System (ADS)

    Midekisa, A.; Wimberly, M. C.; Senay, G. B.

    2012-12-01

    Malaria is a major health problem in Ethiopia. Anopheles arabiensis, which inhabits and breeds in a variety of aquatic habitats, is the major mosquito vector for malaria transmission in the region. In the Amhara region of Ethiopia, mosquito breeding sites are heterogeneously distributed. Therefore, accurate characterization of aquatic habitats and potential breeding sites can be used as a proxy to measure the spatial distribution of malaria risk. Satellite remote sensing provides the ability to map the spatial distribution and monitor the temporal dynamics of surface water. The objective of this study is to map the probability of surface water accumulation to identify potential vector breeding sites for Anopheles arabiensis using remote sensing data from sensors at multiple spatial and temporal resolutions. The normalized difference water index (NDWI), which is based on reflectance in the green and the near infrared (NIR) bands were used to estimate fractional cover of surface water. Temporal changes in surface water were mapped using NDWI indices derived from MODIS surface reflectance product (MOD09A1) for the period 2001-2012. Landsat TM and ETM+ imagery were used to train and calibrate model results from MODIS. Results highlighted interannual variation and seasonal changes in surface water that were observed from the MODIS time series. Static topographic indices that estimate the potential for water accumulation were generated from 30 meter Shuttle Radar Topography Mission (SRTM) elevation data. Integrated fractional surface water cover was developed by combining the static topographic indices and dynamic NDWI indices using Geographic Information System (GIS) overlay methods. Accuracy of the results was evaluated based on ground truth data that was collected on presence and absence of surface water immediately after the rainy season. The study provided a multi-sensor approach for mapping areas with a high potential for surface water accumulation that are potential breeding habitats for anopheline mosquitoes. The resulting products are useful for public health decision making towards effective prevention and control of the malaria burden in the Amhara region of Ethiopia.

  9. An evaluation of the efficiency of minnow traps for estimating the abundance of minnows in desert spring systems

    USGS Publications Warehouse

    Peterson, James T.; Scheerer, Paul D.; Clements, Shaun

    2015-01-01

    Desert springs are sensitive aquatic ecosystems that pose unique challenges to natural resource managers and researchers. Among the most important of these is the need to accurately quantify population parameters for resident fish, particularly when the species are of special conservation concern. We evaluated the efficiency of baited minnow traps for estimating the abundance of two at-risk species, Foskett Speckled Dace Rhinichthys osculus ssp. and Borax Lake Chub Gila boraxobius, in desert spring systems in southeastern Oregon. We evaluated alternative sample designs using simulation and found that capture–recapture designs with four capture occasions would maximize the accuracy of estimates and minimize fish handling. We implemented the design and estimated capture and recapture probabilities using the Huggins closed-capture estimator. Trap capture probabilities averaged 23% and 26% for Foskett Speckled Dace and Borax Lake Chub, respectively, but differed substantially among sample locations, through time, and nonlinearly with fish body size. Recapture probabilities for Foskett Speckled Dace were, on average, 1.6 times greater than (first) capture probabilities, suggesting “trap-happy” behavior. Comparison of population estimates from the Huggins model with the commonly used Lincoln–Petersen estimator indicated that the latter underestimated Foskett Speckled Dace and Borax Lake Chub population size by 48% and by 20%, respectively. These biases were due to variability in capture and recapture probabilities. Simulation of fish monitoring that included the observed range of capture and recapture probabilities indicated that temporal variability in capture and recapture probabilities negatively affected the ability to detect annual decreases in fish population size of up to 20%. Failure to account for variability in capture and recapture probabilities can lead to poor quality data and study inferences. Therefore, we recommend that fishery researchers and managers employ sample designs and estimators that can account for this variability.
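
    For reference, the Lincoln–Petersen idea the authors compare against can be written in a few lines using Chapman's bias-corrected form (toy numbers, not the study's data); the Huggins model generalizes this by conditioning on capture histories and letting capture and recapture probabilities vary with covariates.

      # Chapman's bias-corrected Lincoln-Petersen estimator for a two-sample
      # mark-recapture survey: M marked on pass 1, C captured on pass 2,
      # R of which were recaptures. All numbers are hypothetical.
      M, C, R = 150, 130, 22
      n_hat = (M + 1) * (C + 1) / (R + 1) - 1
      print(f"Estimated abundance: {n_hat:.0f}")
      # When capture probability varies with size, time, or prior capture,
      # this estimator is biased -- the motivation for the Huggins model above.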

  10. Estimate of tephra accumulation probabilities for the U.S. Department of Energy's Hanford Site, Washington

    USGS Publications Warehouse

    Hoblitt, Richard P.; Scott, William E.

    2011-01-01

    In response to a request from the U.S. Department of Energy, we estimate the thickness of tephra accumulation that has an annual probability of 1 in 10,000 of being equaled or exceeded at the Hanford Site in south-central Washington State, where a project to build the Tank Waste Treatment and Immobilization Plant is underway. We follow the methodology of a 1987 probabilistic assessment of tephra accumulation in the Pacific Northwest. For a given thickness of tephra, we calculate the product of three probabilities: (1) the annual probability of an eruption producing 0.1 km3 (bulk volume) or more of tephra, (2) the probability that the wind will be blowing toward the Hanford Site, and (3) the probability that tephra accumulations will equal or exceed the given thickness at a given distance. Mount St. Helens, which lies about 200 km upwind from the Hanford Site, has been the most prolific source of tephra fallout among Cascade volcanoes in the recent geologic past and its annual eruption probability based on this record (0.008) dominates assessment of future tephra falls at the site. The probability that the prevailing wind blows toward Hanford from Mount St. Helens is 0.180. We estimate exceedance probabilities of various thicknesses of tephra fallout from an analysis of 14 eruptions of the size expectable from Mount St. Helens and for which we have measurements of tephra fallout at 200 km. The result is that the estimated thickness of tephra accumulation that has an annual probability of 1 in 10,000 of being equaled or exceeded is about 10 centimeters. It is likely that this thickness is a maximum estimate because we used conservative estimates of eruption and wind probabilities and because the 14 deposits we used probably provide an over-estimate. The use of deposits in this analysis that were mostly compacted by the time they were studied and measured implies that the bulk density of the tephra fallout we consider here is in the range of 1,000-1,250 kg/m3. The load of 10 cm of such tephra fallout on a flat surface would therefore be in the range of 100-125 kg/m2; addition of water from rainfall or snowmelt would provide additional load.
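
    The product-of-probabilities logic can be made explicit with the figures quoted above (a back-of-envelope check, not the authors' full calculation):

      # Annual exceedance probability = P(eruption) * P(wind toward site)
      #                                 * P(thickness exceeded | eruption, wind).
      p_eruption = 0.008   # annual probability of a >= 0.1 km^3 tephra eruption
      p_wind = 0.180       # probability wind blows from Mount St. Helens toward Hanford
      target = 1e-4        # required annual exceedance probability
      p_exceed = target / (p_eruption * p_wind)
      print(f"Required conditional exceedance probability: {p_exceed:.3f}")  # ~0.069
      # The ~10 cm thickness is the one whose conditional exceedance probability,
      # read from the 14-eruption fallout data at 200 km, is about this value.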

  11. A spatio-temporal model for probabilistic seismic hazard zonation of Tehran

    NASA Astrophysics Data System (ADS)

    Hashemi, Mahdi; Alesheikh, Ali Asghar; Zolfaghari, Mohammad Reza

    2013-08-01

    A precondition for all disaster management steps, building damage prediction, and construction code development is a hazard assessment that shows the exceedance probabilities of different ground motion levels at a site, considering different near- and far-field earthquake sources. Seismic sources are usually categorized as time-independent area sources and time-dependent fault sources. While the former incorporates small and medium events, the latter takes into account only large characteristic earthquakes. In this article, a probabilistic approach is proposed to aggregate the effects of time-dependent and time-independent sources on seismic hazard. The methodology is then applied to generate three probabilistic seismic hazard maps of Tehran for 10%, 5%, and 2% exceedance probabilities in 50 years. The results indicate an increase in peak ground acceleration (PGA) values toward the southeastern part of the study area, and the PGA variations are mostly controlled by the shear wave velocities across the city. In addition, the implementation of the methodology takes advantage of GIS capabilities, especially raster-based analyses and representations. During the estimation of the PGA exceedance rates, emphasis has been placed on incorporating the effects of different attenuation relationships and seismic source models by using a logic tree.

  12. Small-world networks exhibit pronounced intermittent synchronization

    NASA Astrophysics Data System (ADS)

    Choudhary, Anshul; Mitra, Chiranjit; Kohar, Vivek; Sinha, Sudeshna; Kurths, Jürgen

    2017-11-01

    We report the phenomenon of temporally intermittently synchronized and desynchronized dynamics in Watts-Strogatz networks of chaotic Rössler oscillators. We consider topologies for which the master stability function (MSF) predicts stable synchronized behaviour, as the rewiring probability (p) is tuned from 0 to 1. MSF essentially utilizes the largest non-zero Lyapunov exponent transversal to the synchronization manifold in making stability considerations, thereby ignoring the other Lyapunov exponents. However, for an N-node networked dynamical system, we observe that the difference in its Lyapunov spectra (corresponding to the N - 1 directions transversal to the synchronization manifold) is crucial and serves as an indicator of the presence of intermittently synchronized behaviour. In addition to the linear stability-based (MSF) analysis, we further provide global stability estimate in terms of the fraction of state-space volume shared by the intermittently synchronized state, as p is varied from 0 to 1. This fraction becomes appreciably large in the small-world regime, which is surprising, since this limit has been otherwise considered optimal for synchronized dynamics. Finally, we characterize the nature of the observed intermittency and its dominance in state-space as network rewiring probability (p) is varied.

  13. Modulation of cognitive control levels via manipulation of saccade trial-type probability assessed with event-related BOLD fMRI.

    PubMed

    Pierce, Jordan E; McDowell, Jennifer E

    2016-02-01

    Cognitive control supports flexible behavior adapted to meet current goals and can be modeled through investigation of saccade tasks with varying cognitive demands. Basic prosaccades (rapid glances toward a newly appearing stimulus) are supported by neural circuitry including occipital and posterior parietal cortex, frontal and supplementary eye fields, and basal ganglia. These trials can be contrasted with complex antisaccades (glances toward the mirror-image location of a stimulus), which are characterized by greater functional magnetic resonance imaging (fMRI) blood oxygenation level-dependent (BOLD) signal in the aforementioned regions and recruitment of additional regions such as dorsolateral prefrontal cortex. The current study manipulated the cognitive demands of these saccade tasks by presenting three rapid event-related runs of mixed saccades with a varying probability of antisaccade vs. prosaccade trials (25, 50, or 75%). Behavioral results showed an effect of trial-type probability on reaction time, with slower responses in runs with a high antisaccade probability. Imaging results exhibited an effect of probability in bilateral pre- and postcentral gyrus, bilateral superior temporal gyrus, and medial frontal gyrus. Additionally, the interaction between saccade trial type and probability revealed a strong probability effect for prosaccade trials, showing a linear increase in activation parallel to antisaccade probability in bilateral temporal/occipital, posterior parietal, medial frontal, and lateral prefrontal cortex. In contrast, antisaccade trials showed elevated activation across all runs. Overall, this study demonstrated that improbable performance of a typically simple prosaccade task led to augmented BOLD signal to support changing cognitive control demands, resulting in activation levels similar to the more complex antisaccade task.

  14. Wave-driven spatial and temporal variability in sea-floor sediment mobility in the Monterey Bay, Cordell Bank, and Gulf of the Farallones National Marine Sanctuaries

    USGS Publications Warehouse

    Storlazzi, Curt D.; Reid, Jane A.; Golden, Nadine E.

    2007-01-01

    Wind and wave patterns affect many aspects of the geomorphic evolution of continental shelves and shorelines. Although our understanding of the processes controlling sediment suspension on continental shelves has improved over the past decade, our ability to predict sediment mobility over large spatial and temporal scales remains limited. The deployment of robust operational buoys along the U.S. West Coast in the early 1980s provides large quantities of high-resolution oceanographic and meteorologic data. By 2006, these data sets were long enough to clearly identify long-term trends and compute statistically significant probability estimates of wave and wind behavior during annual and interannual climatic cycles (that is, El Niño and La Niña). Wave-induced sediment mobility on the shelf and upper slope off central California was modeled using synthesized oceanographic and meteorologic data as boundary input for the Delft SWAN model, sea-floor grain-size data provided by the usSEABED database, and regional bathymetry. Differences in waves (heights, periods, and directions) and winds (speeds and directions) between El Niño and La Niña months cause temporal and spatial variations in peak wave-induced bed shear stresses. These variations, in conjunction with spatially heterogeneous unconsolidated sea-floor sedimentary cover, result in predicted sediment mobility that varies widely in both time and space. These findings indicate that these factors have significant consequences for both geological and biological processes.

  15. Visualization of spatial patterns and temporal trends for aerial surveillance of illegal oil discharges in western Canadian marine waters.

    PubMed

    Serra-Sogas, Norma; O'Hara, Patrick D; Canessa, Rosaline; Keller, Peter; Pelot, Ronald

    2008-05-01

    This paper examines the use of exploratory spatial analysis for identifying hotspots of shipping-based oil pollution in the Pacific Region of Canada's Exclusive Economic Zone. It makes use of data collected from fiscal years 1997/1998 to 2005/2006 by the National Aerial Surveillance Program, the primary tool for monitoring and enforcing the provisions imposed by MARPOL 73/78. First, we present oil spill data as points in a "dot map" relative to coastlines, harbors, and the aerial surveillance distribution. Then, we explore the intensity of oil spill events using the Quadrat Count method and Kernel Density Estimation with both fixed and adaptive bandwidths. We found that oil spill hotspots were more clearly defined using Kernel Density Estimation with an adaptive bandwidth, probably because of the clustered distribution of oil spill occurrences. Finally, we discuss the importance of standardizing oil spill data by controlling for surveillance effort to provide a better understanding of the distribution of illegal oil spills, and how these results can ultimately benefit a monitoring program.
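
    A minimal 1-D sketch of the fixed- versus adaptive-bandwidth distinction (Abramson-style local bandwidths; synthetic data, not the surveillance data set, and simplified to one dimension for brevity):

      import numpy as np
      from scipy.stats import gaussian_kde

      def adaptive_kde(points, grid, alpha=0.5):
          """Abramson-style adaptive kernel density in 1-D: a fixed-bandwidth
          pilot estimate sets local bandwidth factors, so the kernel widens in
          sparse areas and sharpens in dense clusters -- the behavior that makes
          hotspots stand out in clustered point data."""
          pilot = gaussian_kde(points)
          f_pilot = pilot(points)
          g = np.exp(np.log(f_pilot).mean())            # geometric mean of pilot
          lam = (f_pilot / g) ** (-alpha)               # local bandwidth factors
          h = pilot.factor * points.std(ddof=1)         # global bandwidth
          bw = h * lam
          # Average of Gaussian kernels with point-specific bandwidths.
          z = (grid[:, None] - points[None, :]) / bw[None, :]
          dens = np.exp(-0.5 * z ** 2) / (np.sqrt(2 * np.pi) * bw[None, :])
          return dens.mean(axis=1)

      rng = np.random.default_rng(8)
      pts = np.concatenate([rng.normal(0, 0.3, 300), rng.normal(4, 1.5, 100)])
      grid = np.linspace(-2, 9, 400)
      dens = adaptive_kde(pts, grid)    # sharper peak at the tight cluster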

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Park, J.; Kucharek, H.; Möbius, E.

    In this paper we report on a two-year study to estimate the Ne/O abundance ratio in the gas phase of the local interstellar cloud (LIC). Based on the first two years of observations with the Interstellar Boundary Explorer, we determined the fluxes of interstellar neutral (ISN) O and Ne atoms at the Earth's orbit in spring 2009 and 2010. A temporal variation of the Ne/O abundance ratio at the Earth's orbit could be expected due to solar cycle-related effects such as changes of ionization. However, this study shows that there is no significant change in the Ne/O ratio at the Earth's orbit from 2009 to 2010. We used time-dependent survival probabilities of the ISNs to calculate the Ne/O abundance ratio at the termination shock. Then we estimated the Ne/O abundance ratio in the gas phase of the LIC with the use of filtration factors and the ionization fractions. From our analysis, the Ne/O abundance ratio in the LIC is 0.33 ± 0.07, which is in agreement with the abundance ratio inferred from pickup-ion measurements.

  17. Image Motion Detection And Estimation: The Modified Spatio-Temporal Gradient Scheme

    NASA Astrophysics Data System (ADS)

    Hsin, Cheng-Ho; Inigo, Rafael M.

    1990-03-01

    The detection and estimation of motion are generally involved in computing a velocity field of time-varying images. A completely new modified spatio-temporal gradient scheme to determine motion is proposed, derived by using gradient methods and properties of biological vision. A set of general constraints is proposed to derive motion constraint equations. The constraints are that the second directional derivatives of image intensity at an edge point in the smoothed image will be constant at times t and t+L. This scheme has two stages: spatio-temporal filtering, and velocity estimation. Initially, image sequences are processed by a set of oriented spatio-temporal filters which are designed using a Gaussian derivative model. The velocity is then estimated for these filtered image sequences based on the gradient approach. From a computational standpoint, this scheme offers at least three advantages over current methods. The greatest advantage of the modified spatio-temporal gradient scheme over traditional ones is that an infinite number of motion constraint equations are derived instead of only one; it therefore solves the aperture problem without requiring any additional assumptions and is simply a local process. The second advantage is that, because of the spatio-temporal filtering, the direct computation of image gradients (discrete derivatives) is avoided, so the error in gradient measurement is reduced significantly. The third advantage is that, during the motion detection and estimation algorithm, image features (edges) are produced concurrently with motion information. The reliable range of detected velocity is determined by the parameters of the oriented spatio-temporal filters. Knowing the velocity sensitivity of a single motion detection channel, a multiple-channel mechanism for estimating image velocity, seldom addressed by other motion schemes in machine vision, can be constructed by appropriately choosing and combining different sets of parameters. By applying this mechanism, a great range of velocities can be detected. The scheme has been tested on both synthetic and real images, and the results of simulations are very satisfactory.
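
    The single motion-constraint equation of traditional gradient schemes, which the proposed method generalizes, can be sketched with a classical Lucas-Kanade-style least-squares solve (synthetic images; this illustrates the baseline gradient approach, not the authors' filter bank):

      import numpy as np

      def lucas_kanade_velocity(I0, I1, r=7):
          """Estimate one velocity (u, v) for the window centered on the image
          pair by solving the brightness-constancy constraint
          Ix*u + Iy*v + It = 0 in least squares over all window pixels."""
          Ix = np.gradient(I0, axis=1)
          Iy = np.gradient(I0, axis=0)
          It = I1 - I0
          cy, cx = np.array(I0.shape) // 2
          sl = (slice(cy - r, cy + r + 1), slice(cx - r, cx + r + 1))
          A = np.column_stack([Ix[sl].ravel(), Iy[sl].ravel()])
          b = -It[sl].ravel()
          (u, v), *_ = np.linalg.lstsq(A, b, rcond=None)
          return u, v

      # Toy pair: a smooth blob shifted by one pixel to the right.
      yy, xx = np.mgrid[0:64, 0:64]
      blob = np.exp(-((xx - 32.0) ** 2 + (yy - 32.0) ** 2) / 50.0)
      blob_shifted = np.exp(-((xx - 33.0) ** 2 + (yy - 32.0) ** 2) / 50.0)
      print(lucas_kanade_velocity(blob, blob_shifted))  # u close to 1, v close to 0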

  18. Bootstrap imputation with a disease probability model minimized bias from misclassification due to administrative database codes.

    PubMed

    van Walraven, Carl

    2017-04-01

    Diagnostic codes used in administrative databases cause bias due to misclassification of patient disease status, and it is unclear which methods minimize this bias. Serum creatinine measures were used to determine severe renal failure status in 50,074 hospitalized patients. The true prevalence of severe renal failure and its association with covariates were measured. These were compared to results in which renal failure status was determined using surrogate measures, including: (1) diagnostic codes; (2) categorization of probability estimates of renal failure determined from a previously validated model; or (3) bootstrap imputation of disease status using model-derived probability estimates. Bias in estimates of severe renal failure prevalence and its association with covariates was minimal when bootstrap methods were used to impute renal failure status from model-based probability estimates. In contrast, biases were extensive when renal failure status was determined using codes or methods in which the model-based condition probability was categorized. Bias due to misclassification from inaccurate diagnostic codes can thus be minimized by using bootstrap methods to impute condition status from multivariable model-derived probability estimates.
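
    A minimal sketch of the bootstrap-imputation idea (synthetic probabilities, not the validated renal failure model): draw each patient's disease status from their model probability, repeat, and summarize across draws.

      import numpy as np

      def bootstrap_imputed_prevalence(p_model, n_boot, rng):
          """Impute disease status by drawing Bernoulli(p_i) from model-based
          probabilities for each patient, repeated n_boot times; returns the
          per-draw prevalences, whose average is a (nearly) unbiased prevalence
          estimate, unlike thresholding p_i into 0/1 categories."""
          draws = rng.binomial(1, p_model, size=(n_boot, len(p_model)))
          return draws.mean(axis=1)

      rng = np.random.default_rng(9)
      p_model = rng.beta(0.5, 8, 50000)     # skewed probabilities, rare condition
      prev = bootstrap_imputed_prevalence(p_model, 200, rng)
      print(prev.mean(), np.percentile(prev, [2.5, 97.5]))
      # Compare with a categorization rule, which is badly biased here:
      print((p_model > 0.5).mean())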

  19. Use of uninformative priors to initialize state estimation for dynamical systems

    NASA Astrophysics Data System (ADS)

    Worthy, Johnny L.; Holzinger, Marcus J.

    2017-10-01

    The admissible region must be expressed probabilistically in order to be used in Bayesian estimation schemes. When treated as a probability density function (PDF), a uniform admissible region can be shown to have non-uniform probability density after a transformation. An alternative approach can be used to express the admissible region probabilistically according to the Principle of Transformation Groups. This paper uses a fundamental multivariate probability transformation theorem to show that regardless of which state space an admissible region is expressed in, the probability density must remain the same under the Principle of Transformation Groups. The admissible region can be shown to be analogous to an uninformative prior with a probability density that remains constant under reparameterization. This paper introduces requirements on how these uninformative priors may be transformed and used for state estimation and the difference in results when initializing an estimation scheme via a traditional transformation versus the alternative approach.
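
    A compact statement of the transformation fact the abstract relies on (the standard change-of-variables theorem, not quoted from the paper): for a bijective, differentiable reparameterization y = g(x) of the state space,

      p_Y(\mathbf{y}) \;=\; p_X\!\big(g^{-1}(\mathbf{y})\big)\,\left|\det J_{g^{-1}}(\mathbf{y})\right|,

    so a density that is uniform in one parameterization is generally non-uniform after the transformation, which is why a uniform admissible region must instead be handled as an uninformative prior whose form is preserved under the Principle of Transformation Groups.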

  20. High throughput nonparametric probability density estimation.

    PubMed

    Farmer, Jenny; Jacobs, Donald

    2018-01-01

    In high throughput applications, such as those found in bioinformatics and finance, it is important to determine accurate probability distribution functions despite only minimal information about data characteristics, and without using human subjectivity. Such an automated process for univariate data is implemented to achieve this goal by merging the maximum entropy method with single order statistics and maximum likelihood. The only required properties of the random variables are that they are continuous and that they are, or can be approximated as, independent and identically distributed. A quasi-log-likelihood function based on single order statistics for sampled uniform random data is used to empirically construct a sample size invariant universal scoring function. Then a probability density estimate is determined by iteratively improving trial cumulative distribution functions, where better estimates are quantified by the scoring function that identifies atypical fluctuations. This criterion resists under- and overfitting of the data as an alternative to employing the Bayesian or Akaike information criterion. Multiple estimates for the probability density reflect uncertainties due to statistical fluctuations in random samples. Scaled quantile residual plots are also introduced as an effective diagnostic to visualize the quality of the estimated probability densities. Benchmark tests show that estimates for the probability density function (PDF) converge to the true PDF as sample size increases on particularly difficult test probability densities that include cases with discontinuities, multi-resolution scales, heavy tails, and singularities. These results indicate the method has general applicability for high throughput statistical inference.

  1. High throughput nonparametric probability density estimation

    PubMed Central

    Farmer, Jenny

    2018-01-01

    In high throughput applications, such as those found in bioinformatics and finance, it is important to determine accurate probability distribution functions despite only minimal information about data characteristics, and without using human subjectivity. Such an automated process for univariate data is implemented to achieve this goal by merging the maximum entropy method with single order statistics and maximum likelihood. The only required properties of the random variables are that they are continuous and that they are, or can be approximated as, independent and identically distributed. A quasi-log-likelihood function based on single order statistics for sampled uniform random data is used to empirically construct a sample size invariant universal scoring function. Then a probability density estimate is determined by iteratively improving trial cumulative distribution functions, where better estimates are quantified by the scoring function that identifies atypical fluctuations. This criterion resists underfitting and overfitting the data as an alternative to employing the Bayesian or Akaike information criterion. Multiple estimates for the probability density reflect uncertainties due to statistical fluctuations in random samples. Scaled quantile residual plots are also introduced as an effective diagnostic to visualize the quality of the estimated probability densities. Benchmark tests show that estimates for the probability density function (PDF) converge to the true PDF as sample size increases on particularly difficult test probability densities that include cases with discontinuities, multi-resolution scales, heavy tails, and singularities. These results indicate the method has general applicability for high throughput statistical inference. PMID:29750803

  2. Weighing the value of memory loss in the surgical evaluation of left temporal lobe epilepsy: A decision analysis

    PubMed Central

    Akama-Garren, Elliot H.; Bianchi, Matt T.; Leveroni, Catherine; Cole, Andrew J.; Cash, Sydney S.; Westover, M. Brandon

    2016-01-01

    SUMMARY Objectives Anterior temporal lobectomy is curative for many patients with disabling medically refractory temporal lobe epilepsy, but carries an inherent risk of disabling verbal memory loss. Although accurate prediction of iatrogenic memory loss is becoming increasingly possible, it remains unclear how much weight such predictions should have in surgical decision making. Here we aim to create a framework that facilitates a systematic and integrated assessment of the relative risks and benefits of surgery versus medical management for patients with left temporal lobe epilepsy. Methods We constructed a Markov decision model to evaluate the probabilistic outcomes and associated health utilities associated with choosing to undergo a left anterior temporal lobectomy versus continuing with medical management for patients with medically refractory left temporal lobe epilepsy. Three base-cases were considered, representing a spectrum of surgical candidates encountered in practice, with varying degrees of epilepsy-related disability and potential for decreased quality of life in response to post-surgical verbal memory deficits. Results For patients with moderately severe seizures and moderate risk of verbal memory loss, medical management was the preferred decision, with increased quality-adjusted life expectancy. However, the preferred choice was sensitive to clinically meaningful changes in several parameters, including quality of life impact of verbal memory decline, quality of life with seizures, mortality rate with medical management, probability of remission following surgery, and probability of remission with medical management. Significance Our decision model suggests that for patients with left temporal lobe epilepsy, quantitative assessment of risk and benefit should guide recommendation of therapy. In particular, risk for and potential impact of verbal memory decline should be carefully weighed against the degree of disability conferred by continued seizures on a patient-by-patient basis. PMID:25244498
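
    A minimal Markov cohort sketch of the comparison described above; all states, utilities, and transition probabilities below are hypothetical placeholders (the paper additionally penalizes quality of life for post-surgical verbal memory decline, which is omitted here):

        import numpy as np

        # Three-state annual-cycle cohort model: seizure-free, seizures, dead.
        utility = np.array([0.95, 0.70, 0.0])  # hypothetical health utilities

        def qale(transition, start, years=35):
            """Quality-adjusted life expectancy of a starting cohort
            distribution under a fixed annual transition matrix."""
            dist, total = np.array(start, dtype=float), 0.0
            for _ in range(years):
                total += dist @ utility
                dist = dist @ transition
            return total

        surgery = np.array([[0.97, 0.02, 0.01],   # hypothetical transitions
                            [0.05, 0.93, 0.02],
                            [0.00, 0.00, 1.00]])
        medical = np.array([[0.90, 0.08, 0.02],
                            [0.03, 0.94, 0.03],
                            [0.00, 0.00, 1.00]])

        print(qale(surgery, [0.65, 0.35, 0.0]))  # e.g., 65% seizure-free post-op
        print(qale(medical, [0.05, 0.95, 0.0]))

    Sensitivity analysis then amounts to sweeping the utilities and transition probabilities over clinically plausible ranges and recording where the preferred strategy flips.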

  3. Weighing the value of memory loss in the surgical evaluation of left temporal lobe epilepsy: a decision analysis.

    PubMed

    Akama-Garren, Elliot H; Bianchi, Matt T; Leveroni, Catherine; Cole, Andrew J; Cash, Sydney S; Westover, M Brandon

    2014-11-01

    Anterior temporal lobectomy is curative for many patients with disabling medically refractory temporal lobe epilepsy, but carries an inherent risk of disabling verbal memory loss. Although accurate prediction of iatrogenic memory loss is becoming increasingly possible, it remains unclear how much weight such predictions should have in surgical decision making. Here we aim to create a framework that facilitates a systematic and integrated assessment of the relative risks and benefits of surgery versus medical management for patients with left temporal lobe epilepsy. We constructed a Markov decision model to evaluate the probabilistic outcomes and associated health utilities associated with choosing to undergo a left anterior temporal lobectomy versus continuing with medical management for patients with medically refractory left temporal lobe epilepsy. Three base-cases were considered, representing a spectrum of surgical candidates encountered in practice, with varying degrees of epilepsy-related disability and potential for decreased quality of life in response to post-surgical verbal memory deficits. For patients with moderately severe seizures and moderate risk of verbal memory loss, medical management was the preferred decision, with increased quality-adjusted life expectancy. However, the preferred choice was sensitive to clinically meaningful changes in several parameters, including quality of life impact of verbal memory decline, quality of life with seizures, mortality rate with medical management, probability of remission following surgery, and probability of remission with medical management. Our decision model suggests that for patients with left temporal lobe epilepsy, quantitative assessment of risk and benefit should guide recommendation of therapy. In particular, risk for and potential impact of verbal memory decline should be carefully weighed against the degree of disability conferred by continued seizures on a patient-by-patient basis. Wiley Periodicals, Inc. © 2014 International League Against Epilepsy.

  4. Cluster membership probability: polarimetric approach

    NASA Astrophysics Data System (ADS)

    Medhi, Biman J.; Tamura, Motohide

    2013-04-01

    Interstellar polarimetric data of the six open clusters Hogg 15, NGC 6611, NGC 5606, NGC 6231, NGC 5749 and NGC 6250 have been used to estimate the membership probability for the stars within them. For proper-motion member stars, the membership probability estimated using the polarimetric data is in good agreement with the proper-motion cluster membership probability. However, for proper-motion non-member stars, the membership probability estimated by the polarimetric method is in total disagreement with the proper-motion cluster membership probability. The inconsistencies in the determined memberships may be because of the fundamental differences between the two methods of determination: one is based on stellar proper motion in space and the other is based on selective extinction of the stellar output by the asymmetric aligned dust grains present in the interstellar medium. The results and analysis suggest that the scatter of the Stokes vectors q (per cent) and u (per cent) for the proper-motion member stars depends on the interstellar and intracluster differential reddening in the open cluster. It is found that this method could be used to estimate the cluster membership probability if we have additional polarimetric and photometric information for a star to identify it as a probable member/non-member of a particular cluster, such as the maximum wavelength value (λmax), the unit weight error of the fit (σ1), the dispersion in the polarimetric position angles (ε̄), reddening (E(B - V)) or the differential intracluster reddening (ΔE(B - V)). This method could also be used to estimate the membership probability of known member stars having no membership probability as well as to resolve disagreements about membership among different proper-motion surveys.

  5. A multimodal approach to estimating vigilance using EEG and forehead EOG.

    PubMed

    Zheng, Wei-Long; Lu, Bao-Liang

    2017-04-01

    Covert aspects of ongoing user mental states provide key context information for user-aware human computer interactions. In this paper, we focus on the problem of estimating the vigilance of users using EEG and EOG signals. The PERCLOS index, used as the vigilance annotation, is obtained from eye-tracking glasses. To improve the feasibility and wearability of vigilance estimation devices for real-world applications, we adopt a novel electrode placement for forehead EOG and extract various eye movement features, which contain the principal information of traditional EOG. We explore the effects of EEG from different brain areas and combine EEG and forehead EOG to leverage their complementary characteristics for vigilance estimation. Because vigilance is a dynamically changing process that reflects the temporal evolution of users' intrinsic mental states, we introduce continuous conditional neural field and continuous conditional random field models to capture this dynamic temporal dependency. We propose a multimodal approach to estimating vigilance by combining EEG and forehead EOG and incorporating the temporal dependency of vigilance into model training. The experimental results demonstrate that modality fusion can improve the performance compared with a single modality, that EOG and EEG contain complementary information for vigilance estimation, and that the temporal dependency-based models can enhance the performance of vigilance estimation. From the experimental results, we observe that theta and alpha frequency activities are increased, while gamma frequency activities are decreased, in drowsy states in contrast to awake states. The forehead setup allows for the simultaneous collection of EEG and EOG and achieves comparable performance using only four shared electrodes in comparison with the temporal and posterior sites.

  6. Relationship of climate, geography, and geology to the incidence of Rift Valley fever in Kenya during the 2006-2007 outbreak.

    PubMed

    Hightower, Allen; Kinkade, Carl; Nguku, Patrick M; Anyangu, Amwayi; Mutonga, David; Omolo, Jared; Njenga, M Kariuki; Feikin, Daniel R; Schnabel, David; Ombok, Maurice; Breiman, Robert F

    2012-02-01

    We estimated Rift Valley fever (RVF) incidence as a function of geological, geographical, and climatological factors during the 2006-2007 RVF epidemic in Kenya. Location information was obtained for 214 of 340 (63%) confirmed and probable RVF cases that occurred during an outbreak from November 1, 2006 to February 28, 2007. Locations with subtypes of solonetz, calcisols, solonchaks, and planosols soil types were highly associated with RVF occurrence during the outbreak period. Increased rainfall and higher greenness measures before the outbreak were associated with increased risk. RVF was more likely to occur on plains, in densely bushed areas, at lower elevations, and in the Somalia acacia ecological zone. Cases occurred in three spatio-temporal clusters that differed by the date of associated rainfall, soil type, and land usage.

  7. Statistical Surrogate Models for Estimating Probability of High-Consequence Climate Change

    NASA Astrophysics Data System (ADS)

    Field, R.; Constantine, P.; Boslough, M.

    2011-12-01

    We have posed the climate change problem in a framework similar to that used in safety engineering, by acknowledging that probabilistic risk assessments focused on low-probability, high-consequence climate events are perhaps more appropriate than studies focused simply on best estimates. To properly explore the tails of the distribution requires extensive sampling, which is not possible with existing coupled atmospheric models due to the high computational cost of each simulation. We have developed specialized statistical surrogate models (SSMs) that can be used to make predictions about the tails of the associated probability distributions. An SSM differs from a deterministic surrogate model in that it represents each climate variable of interest as a space/time random field, that is, as a random variable at every fixed location in the atmosphere and at all times. The SSM can be calibrated to available spatial and temporal data from existing climate databases, or to a collection of outputs from general circulation models. Because of its reduced size and complexity, the realization of a large number of independent model outputs from an SSM becomes computationally straightforward, so that quantifying the risk associated with low-probability, high-consequence climate events becomes feasible. A Bayesian framework was also developed to provide quantitative measures of confidence, via Bayesian credible intervals, to assess these risks. To illustrate the use of the SSM, we considered two collections of NCAR CCSM 3.0 output data. The first collection corresponds to average December surface temperature for years 1990-1999 based on a collection of 8 different model runs obtained from the Program for Climate Model Diagnosis and Intercomparison (PCMDI). We calibrated the surrogate model to the available model data and made various point predictions. We also analyzed average precipitation rate in June, July, and August over a 54-year period assuming a cyclic Y2K ocean model. We applied the calibrated surrogate model to study the probability that the precipitation rate falls below certain thresholds and utilized the Bayesian approach to quantify our confidence in these predictions. Sandia is a multiprogram laboratory operated by Sandia Corporation, a Lockheed Martin Company, for the United States Department of Energy's National Nuclear Security Administration under Contract DE-AC04-94AL85000.
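
    The tail-probability step lends itself to a compact sketch: draw many cheap realizations from the calibrated surrogate, count tail events, and place a Bayesian credible interval on the exceedance probability. The scalar Gaussian stand-in below only illustrates that step; the actual SSM is a calibrated space/time random field, and all numbers are hypothetical.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(2)

        # Stand-in surrogate draws of a climate variable (hypothetical units)
        samples = rng.normal(loc=3.0, scale=0.4, size=100_000)

        threshold = 2.0                        # low-precipitation tail event
        k = int((samples < threshold).sum())   # number of tail events
        n = samples.size

        # With a uniform Beta(1, 1) prior on the tail probability, the
        # posterior is Beta(k + 1, n - k + 1); its quantiles give a
        # Bayesian credible interval for the estimated risk.
        posterior = stats.beta(k + 1, n - k + 1)
        print(posterior.mean(), posterior.interval(0.95))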

  8. Probability of success for phase III after exploratory biomarker analysis in phase II.

    PubMed

    Götte, Heiko; Kirchner, Marietta; Sailer, Martin Oliver

    2017-05-01

    The probability of success or average power describes the potential of a future trial by weighting the power with a probability distribution of the treatment effect. The treatment effect estimate from a previous trial can be used to define such a distribution. During the development of targeted therapies, it is common practice to look for predictive biomarkers. The consequence is that the trial population for phase III is often selected on the basis of the most extreme result from phase II biomarker subgroup analyses. In such a case, there is a tendency to overestimate the treatment effect. We investigate whether the overestimation of the treatment effect estimate from phase II is transformed into a positive bias for the probability of success for phase III. We simulate a phase II/III development program for targeted therapies. This simulation allows us to investigate selection probabilities and to compare the estimated with the true probability of success. We consider the estimated probability of success with and without subgroup selection. Depending on the true treatment effects, there is a negative bias without selection because of the weighting by the phase II distribution. In comparison, selection increases the estimated probability of success. Thus, selection does not lead to a bias in probability of success if underestimation due to the phase II distribution and overestimation due to selection cancel each other out. We recommend performing similar simulations in practice to obtain the necessary information about the risks and chances associated with such subgroup selection designs. Copyright © 2017 John Wiley & Sons, Ltd.
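
    A minimal Monte Carlo sketch of the probability-of-success (average power) calculation underlying the abstract, without the subgroup-selection step; the trial setup and all numbers are hypothetical:

        import numpy as np
        from scipy import stats

        def probability_of_success(theta_hat, se_phase2, se_phase3,
                                   alpha=0.025, n_sim=100_000,
                                   rng=np.random.default_rng(3)):
            """Weight phase III power by the (normal) sampling distribution
            of the phase II treatment effect estimate."""
            theta = rng.normal(theta_hat, se_phase2, size=n_sim)
            z_alpha = stats.norm.ppf(1 - alpha)
            power = stats.norm.cdf(theta / se_phase3 - z_alpha)
            return power.mean()

        # Hypothetical: phase II effect 0.3 (SE 0.15), phase III SE 0.08
        print(probability_of_success(0.3, 0.15, 0.08))

    Selecting the most extreme phase II subgroup estimate before this calculation inflates theta_hat, which is exactly the bias mechanism the abstract investigates.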

  9. TEMPORALLY-RESOLVED AMMONIA EMISSION INVENTORIES: CURRENT ESTIMATES, EVALUATION TOOLS, AND MEASUREMENT NEEDS

    EPA Science Inventory

    In this study, we evaluate the suitability of a three-dimensional chemical transport model (CTM) as a tool for assessing ammonia emission inventories, calculate the improvement in CTM performance owing to recent advances in temporally-varying ammonia emission estimates, and ident...

  10. Seismicity alert probabilities at Parkfield, California, revisited

    USGS Publications Warehouse

    Michael, A.J.; Jones, L.M.

    1998-01-01

    For a decade, the US Geological Survey has used the Parkfield Earthquake Prediction Experiment scenario document to estimate the probability that earthquakes observed on the San Andreas fault near Parkfield will turn out to be foreshocks followed by the expected magnitude six mainshock. During this time, we have learned much about the seismogenic process at Parkfield, about the long-term probability of the Parkfield mainshock, and about the estimation of these types of probabilities. The probabilities for potential foreshocks at Parkfield are reexamined and revised in light of these advances. As part of this process, we have confirmed both the rate of foreshocks before strike-slip earthquakes in the San Andreas physiographic province and the uniform distribution of foreshocks with magnitude proposed by earlier studies. Compared to the earlier assessment, these new estimates of the long-term probability of the Parkfield mainshock are lower, our estimate of the rate of background seismicity is higher, and we find that the assumption that foreshocks at Parkfield occur in a unique way is not statistically significant at the 95% confidence level. While the exact numbers vary depending on the assumptions that are made, the new alert probabilities are lower than previously estimated. Considering the various assumptions and the statistical uncertainties in the input parameters, we also compute a plausible range for the probabilities. The range is large, partly due to the extra knowledge that exists for the Parkfield segment, making us question the usefulness of these numbers.
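
    Alert probabilities of this kind have a generic Bayesian form; as a hedged sketch (not the paper's exact expression), if P_F is the probability that a mainshock occurs in the alert window and is preceded by a candidate event of the observed type, and P_B is the probability of an unrelated background event of that type in the same window, then

        P(\text{mainshock} \mid \text{candidate event}) \approx \frac{P_F}{P_F + P_B},

    which makes explicit why a lower long-term mainshock probability and a higher background seismicity rate both pull the alert probability down.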

  11. Optimized two-frequency phase-measuring-profilometry light-sensor temporal-noise sensitivity.

    PubMed

    Li, Jielin; Hassebrook, Laurence G; Guan, Chun

    2003-01-01

    Temporal frame-to-frame noise in multipattern structured light projection can significantly corrupt depth measurement repeatability. We present a rigorous stochastic analysis of phase-measuring-profilometry temporal noise as a function of the pattern parameters and the reconstruction coefficients. The analysis is used to optimize the two-frequency phase measurement technique. In phase-measuring profilometry, a sequence of phase-shifted sine-wave patterns is projected onto a surface. In two-frequency phase measurement, two sets of pattern sequences are used. The first, low-frequency set establishes a nonambiguous depth estimate, and the second, high-frequency set is unwrapped, based on the low-frequency estimate, to obtain an accurate depth estimate. If the second frequency is too low, then depth error is caused directly by temporal noise in the phase measurement. If the second frequency is too high, temporal noise triggers ambiguous unwrapping, resulting in depth measurement error. We present a solution for finding the second frequency, where intensity noise variance is at its minimum.

  12. Integrating field plots, lidar, and landsat time series to provide temporally consistent annual estimates of biomass from 1990 to present

    Treesearch

    Warren B. Cohen; Hans-Erik Andersen; Sean P. Healey; Gretchen G. Moisen; Todd A. Schroeder; Christopher W. Woodall; Grant M. Domke; Zhiqiang Yang; Robert E. Kennedy; Stephen V. Stehman; Curtis Woodcock; Jim Vogelmann; Zhe Zhu; Chengquan Huang

    2015-01-01

    We are developing a system that provides temporally consistent biomass estimates for national greenhouse gas inventory reporting to the United Nations Framework Convention on Climate Change. Our model-assisted estimation framework relies on remote sensing to scale from plot measurements to lidar strip samples, to Landsat time series-based maps. As a demonstration, new...

  13. A removal model for estimating detection probabilities from point-count surveys

    USGS Publications Warehouse

    Farnsworth, G.L.; Pollock, K.H.; Nichols, J.D.; Simons, T.R.; Hines, J.E.; Sauer, J.R.

    2000-01-01

    We adapted a removal model to estimate detection probability during point count surveys. The model assumes that one factor influencing detection during point counts is the singing frequency of birds. This may be true for surveys recording forest songbirds when most detections are by sound. The model requires counts to be divided into several time intervals. We used time intervals of 2, 5, and 10 min to develop a maximum-likelihood estimator for the detectability of birds during such surveys. We applied this technique to data from bird surveys conducted in Great Smoky Mountains National Park. We used model selection criteria to identify whether detection probabilities varied among species, throughout the morning, throughout the season, and among different observers. The overall detection probability for all birds was 75%. We found differences in detection probability among species. Species that sing frequently, such as Winter Wren and Acadian Flycatcher, had high detection probabilities (about 90%), and species that call infrequently, such as Pileated Woodpecker, had low detection probability (36%). We also found detection probabilities varied with the time of day for some species (e.g. thrushes) and between observers for other species. This method of estimating detectability during point count surveys offers a promising new approach to using count data to address questions of bird abundance, density, and population trends.
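
    A sketch of one common removal-model formulation, assuming a constant per-minute detection (singing) rate so that first-detection times are exponential; the counts and interval bounds below are hypothetical, and the paper's estimator may differ in detail:

        import numpy as np
        from scipy.optimize import minimize_scalar

        # First detections in the 0-2, 2-5, and 5-10 minute intervals of a
        # 10-minute point count (hypothetical counts).
        counts = np.array([60, 35, 20])
        bounds = np.array([0.0, 2.0, 5.0, 10.0])

        def neg_log_lik(rate):
            """Multinomial likelihood of first-detection intervals,
            conditional on detection within the full count period."""
            cell = np.exp(-rate * bounds[:-1]) - np.exp(-rate * bounds[1:])
            p_detect = 1.0 - np.exp(-rate * bounds[-1])
            return -(counts * np.log(cell / p_detect)).sum()

        fit = minimize_scalar(neg_log_lik, bounds=(1e-4, 5.0), method="bounded")
        p10 = 1.0 - np.exp(-fit.x * 10.0)
        print(f"Estimated detection probability over 10 min: {p10:.2f}")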

  14. Trends in Timing of Dialysis Initiation within Versus Outside the Department of Veterans Affairs.

    PubMed

    Yu, Margaret K; O'Hare, Ann M; Batten, Adam; Sulc, Christine A; Neely, Emily L; Liu, Chuan-Fen; Hebert, Paul L

    2015-08-07

    The secular trend toward dialysis initiation at progressively higher levels of eGFR is not well understood. This study compared temporal trends in eGFR at dialysis initiation within versus outside the Department of Veterans Affairs (VA), the largest non-fee-for-service health system in the United States. The study used linked data from the US Renal Data System, VA, and Medicare to compare temporal trends in eGFR at dialysis initiation between 2000 and 2009 (n=971,543). Veterans who initiated dialysis within the VA were compared with three groups who initiated dialysis outside the VA: (1) veterans whose dialysis was paid for by the VA, (2) veterans whose dialysis was not paid for by the VA, and (3) nonveterans. Logistic regression was used to estimate average predicted probabilities of dialysis initiation at an eGFR≥10 ml/min per 1.73 m(2). The adjusted probability of starting dialysis at an eGFR≥10 ml/min per 1.73 m(2) increased over time for all groups but was lower for veterans who started dialysis within the VA (0.31; 95% confidence interval [95% CI], 0.30 to 0.32) than for those starting outside the VA, including veterans whose dialysis was (0.36; 95% CI, 0.35 to 0.38) and was not (0.40; 95% CI, 0.40 to 0.40) paid for by the VA and nonveterans (0.39; 95% CI, 0.39 to 0.39). Differences in eGFR at initiation within versus outside the VA were most pronounced among older patients (P for interaction <0.001) and those with a higher risk of 1-year mortality (P for interaction <0.001). Temporal trends in eGFR at dialysis initiation within the VA mirrored those in the wider United States dialysis population, but eGFR at initiation was consistently lowest among those who initiated within the VA. Differences in eGFR at initiation within versus outside the VA were especially pronounced in older patients and those with higher 1-year mortality risk. Copyright © 2015 by the American Society of Nephrology.

  15. Trends in Timing of Dialysis Initiation within Versus Outside the Department of Veterans Affairs

    PubMed Central

    O’Hare, Ann M.; Batten, Adam; Sulc, Christine A.; Neely, Emily L.; Liu, Chuan-Fen; Hebert, Paul L.

    2015-01-01

    Background and objectives The secular trend toward dialysis initiation at progressively higher levels of eGFR is not well understood. This study compared temporal trends in eGFR at dialysis initiation within versus outside the Department of Veterans Affairs (VA)—the largest non–fee-for-service health system in the United States. Design, setting, participants, & measurements The study used linked data from the US Renal Data System, VA, and Medicare to compare temporal trends in eGFR at dialysis initiation between 2000 and 2009 (n=971,543). Veterans who initiated dialysis within the VA were compared with three groups who initiated dialysis outside the VA: (1) veterans whose dialysis was paid for by the VA, (2) veterans whose dialysis was not paid for by the VA, and (3) nonveterans. Logistic regression was used to estimate average predicted probabilities of dialysis initiation at an eGFR≥10 ml/min per 1.73 m2. Results The adjusted probability of starting dialysis at an eGFR≥10 ml/min per 1.73 m2 increased over time for all groups but was lower for veterans who started dialysis within the VA (0.31; 95% confidence interval [95% CI], 0.30 to 0.32) than for those starting outside the VA, including veterans whose dialysis was (0.36; 95% CI, 0.35 to 0.38) and was not (0.40; 95% CI, 0.40 to 0.40) paid for by the VA and nonveterans (0.39; 95% CI, 0.39 to 0.39). Differences in eGFR at initiation within versus outside the VA were most pronounced among older patients (P for interaction <0.001) and those with a higher risk of 1-year mortality (P for interaction <0.001). Conclusions Temporal trends in eGFR at dialysis initiation within the VA mirrored those in the wider United States dialysis population, but eGFR at initiation was consistently lowest among those who initiated within the VA. Differences in eGFR at initiation within versus outside the VA were especially pronounced in older patients and those with higher 1-year mortality risk. PMID:26206891

  16. On the quantification and efficient propagation of imprecise probabilities resulting from small datasets

    NASA Astrophysics Data System (ADS)

    Zhang, Jiaxin; Shields, Michael D.

    2018-01-01

    This paper addresses the problem of uncertainty quantification and propagation when data for characterizing probability distributions are scarce. We propose a methodology wherein the full uncertainty associated with probability model form and parameter estimation are retained and efficiently propagated. This is achieved by applying the information-theoretic multimodel inference method to identify plausible candidate probability densities and the associated probability that each is the best model in the Kullback-Leibler sense. The joint parameter densities for each plausible model are then estimated using Bayes' rule. We then propagate this full set of probability models by estimating an optimal importance sampling density that is representative of all plausible models, propagating this density, and reweighting the samples according to each of the candidate probability models. This is in contrast with conventional methods that try to identify a single probability model that encapsulates the full uncertainty caused by lack of data and consequently underestimate uncertainty. The result is a complete probabilistic description of both aleatory and epistemic uncertainty achieved with several orders of magnitude reduction in computational cost. It is shown how the model can be updated to adaptively accommodate added data and added candidate probability models. The method is applied for uncertainty analysis of plate buckling strength, where it is demonstrated how dataset size affects the confidence (or lack thereof) we can place in statistical estimates of response when data are lacking.
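
    A condensed sketch of this pipeline, using Akaike weights as the model probabilities and a mixture importance density; the paper works with full Bayesian parameter posteriors, whereas this toy version collapses each candidate to its maximum likelihood fit, and the data and threshold are hypothetical:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(4)
        data = rng.lognormal(0.1, 0.3, size=25)   # scarce hypothetical dataset

        # Fit candidate probability models and compute Akaike weights
        fits = []
        for dist in (stats.norm, stats.lognorm, stats.gamma):
            params = dist.fit(data)
            aic = 2 * len(params) - 2 * dist.logpdf(data, *params).sum()
            fits.append((dist, params, aic))
        aics = np.array([f[2] for f in fits])
        w = np.exp(-0.5 * (aics - aics.min()))
        w /= w.sum()   # probability each candidate is the best K-L model

        # Sample once from the weighted mixture, then reweight per model
        n = 10_000
        which = rng.choice(len(fits), size=n, p=w)
        x = np.empty(n)
        for i, (dist, params, _) in enumerate(fits):
            idx = which == i
            x[idx] = dist.rvs(*params, size=idx.sum(), random_state=rng)
        mix_pdf = sum(wi * d.pdf(x, *p) for wi, (d, p, _) in zip(w, fits))
        for (dist, params, _), wi in zip(fits, w):
            iw = dist.pdf(x, *params) / mix_pdf      # importance weights
            print(dist.name, np.average(x > 1.5, weights=iw))  # P(X > 1.5)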

  17. Spectral-temporal EEG dynamics of speech discrimination processing in infants during sleep.

    PubMed

    Gilley, Phillip M; Uhler, Kristin; Watson, Kaylee; Yoshinaga-Itano, Christine

    2017-03-22

    Oddball paradigms are frequently used to study auditory discrimination by comparing event-related potential (ERP) responses from a standard, high probability sound and a deviant, low probability sound. Previous research has established that responses from such paradigms, such as the mismatch response or mismatch negativity, are useful for examining auditory processes in young children and infants across various sleep and attention states. The extent to which oddball ERP responses may reflect subtle discrimination effects, such as speech discrimination, is largely unknown, especially in infants that have not yet acquired speech and language. Mismatch responses for three contrasts (non-speech, vowel, and consonant) were computed as a spectral-temporal probability function in 24 infants, and analyzed at the group level by a modified multidimensional scaling. Immediately following an onset gamma response (30-50 Hz), the emergence of a beta oscillation (12-30 Hz) was temporally coupled with a lower frequency theta oscillation (2-8 Hz). The spectral-temporal probability of this coupling effect relative to a subsequent theta modulation corresponds with discrimination difficulty for non-speech, vowel, and consonant contrast features. The theta modulation effect suggests that unexpected sounds are encoded as a probabilistic measure of surprise. These results support the notion that auditory discrimination is driven by the development of brain networks for predictive processing, and can be measured in infants during sleep. The results presented here have implications for the interpretation of discrimination as a probabilistic process, and may provide a basis for the development of single-subject and single-trial classification in a clinically useful context. An infant's brain is processing information about the environment and performing computations, even during sleep. These computations reflect subtle differences in acoustic feature processing that are necessary for language-learning. Results from this study suggest that brain responses to deviant sounds in an oddball paradigm follow a cascade of oscillatory modulations. This cascade begins with a gamma response that later emerges as a beta synchronization, which is temporally coupled with a theta modulation, and followed by a second, subsequent theta modulation. The difference in frequency and timing of the theta modulations appears to reflect a measure of surprise. These insights into the neurophysiological mechanisms of auditory discrimination provide a basis for exploring the clinical utility of the MMR TF and other auditory oddball responses.

  18. Accelerated long-term forgetting in temporal lobe epilepsy: evidence of improvement after left temporal pole lobectomy.

    PubMed

    Gallassi, Roberto; Sambati, Luisa; Poda, Roberto; Stanzani Maserati, Michelangelo; Oppi, Federico; Giulioni, Marco; Tinuper, Paolo

    2011-12-01

    Accelerated long-term forgetting (ALF) is a characteristic cognitive aspect in patients affected by temporal lobe epilepsy that is probably due to an impairment of memory consolidation and retrieval caused by epileptic activity in hippocampal and parahippocampal regions. We describe a case of a patient with TLE who showed improvement in ALF and in remote memory impairment after an anterior left temporal pole lobectomy including the uncus and amygdala. Our findings confirm that impairment of hippocampal functioning leads to pathological ALF, whereas restoration of hippocampal functioning brings ALF to a level comparable to that of controls. Copyright © 2011 Elsevier Inc. All rights reserved.

  19. Pair Production Induced by Ultrashort and Ultraintense Laser Pulses in Plasmas

    NASA Astrophysics Data System (ADS)

    Luo, Yue-E.; Wang, Xue-Wen; Wang, Yuan-Sheng; Ji, Shen-Tong; Yu, Hong

    2018-06-01

    The probability of Schwinger pair production induced by an ultraintense and ultrashort laser pulse propagating in a plasma is calculated. The dependence of the probability on the amplitude of the laser pulse and the plasma frequency is analyzed. In particular, the effect of the pulse duration on the probability is discussed by introducing a pulse-shape function to describe the temporal shape of the laser pulse. The results show that a laser with a shorter pulse is more efficient at pair production, and the probability of pair production increases as the pulse duration becomes comparable to the laser period.
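
    For reference, the benchmark that pulsed-field calculations of this kind modify is Schwinger's constant-field result; in natural units, the pair-creation probability per unit volume and time in a constant electric field E is (standard QED, not the paper's pulsed-field expression)

        w = \frac{(eE)^2}{4\pi^3} \sum_{n=1}^{\infty} \frac{1}{n^2}
            \exp\!\left(-\frac{n \pi m^2}{eE}\right),

    whose exponential suppression for fields far below the critical strength E_cr = m^2/e is what short, intense pulses are intended to mitigate.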

  20. Pretest probability estimation in the evaluation of patients with possible deep vein thrombosis.

    PubMed

    Vinson, David R; Patel, Jason P; Irving, Cedric S

    2011-07-01

    An estimation of pretest probability is integral to the proper interpretation of a negative compression ultrasound in the diagnostic assessment of lower-extremity deep vein thrombosis. We sought to determine the rate, method, and predictors of pretest probability estimation in such patients. This cross-sectional study of outpatients was conducted in a suburban community hospital in 2006. Estimation of pretest probability was done by enzyme-linked immunosorbent assay d-dimer, Wells criteria, and unstructured clinical impression. Using logistic regression analysis, we measured predictors of documented risk assessment. A cohort analysis was undertaken to compare 3-month thromboembolic outcomes between risk groups. Among 524 cases, 289 (55.2%) underwent pretest probability estimation using the following methods: enzyme-linked immunosorbent assay d-dimer (228; 43.5%), clinical impression (106; 20.2%), and Wells criteria (24; 4.6%), with 69 (13.2%) patients undergoing a combination of at least two methods. Patient factors were not predictive of pretest probability estimation, but the specialty of the clinician was predictive; emergency physicians (P < .0001) and specialty clinicians (P = .001) were less likely than primary care clinicians to perform risk assessment. Thromboembolic events within 3 months were experienced by 0 of 52 patients in the explicitly low-risk group, 4 (1.8%) of 219 in the explicitly moderate- to high-risk group, and 1 (0.4%) of 226 in the group that did not undergo explicit risk assessment. Negative ultrasounds in the workup of deep vein thrombosis are commonly interpreted in isolation apart from pretest probability estimations. Risk assessments varied by physician specialties. Opportunities exist for improvement in the diagnostic evaluation of these patients. Copyright © 2011 Elsevier Inc. All rights reserved.

  1. Probabilistic tsunami hazard assessment at Seaside, Oregon, for near-and far-field seismic sources

    USGS Publications Warehouse

    Gonzalez, F.I.; Geist, E.L.; Jaffe, B.; Kanoglu, U.; Mofjeld, H.; Synolakis, C.E.; Titov, V.V.; Arcas, D.; Bellomo, D.; Carlton, D.; Horning, T.; Johnson, J.; Newman, J.; Parsons, T.; Peters, R.; Peterson, C.; Priest, G.; Venturato, A.; Weber, J.; Wong, F.; Yalciner, A.

    2009-01-01

    The first probabilistic tsunami flooding maps have been developed. The methodology, called probabilistic tsunami hazard assessment (PTHA), integrates tsunami inundation modeling with methods of probabilistic seismic hazard assessment (PSHA). Application of the methodology to Seaside, Oregon, has yielded estimates of the spatial distribution of 100- and 500-year maximum tsunami amplitudes, i.e., amplitudes with 1% and 0.2% annual probability of exceedance. The 100-year tsunami is generated most frequently by far-field sources in the Alaska-Aleutian Subduction Zone and is characterized by maximum amplitudes that do not exceed 4 m, with an inland extent of less than 500 m. In contrast, the 500-year tsunami is dominated by local sources in the Cascadia Subduction Zone and is characterized by maximum amplitudes in excess of 10 m and an inland extent of more than 1 km. The primary sources of uncertainty in these results include those associated with interevent time estimates, modeling of background sea level, and accounting for temporal changes in bathymetry and topography. Nonetheless, PTHA represents an important contribution to tsunami hazard assessment techniques; viewed in the broader context of risk analysis, PTHA provides a method for quantifying estimates of the likelihood and severity of the tsunami hazard, which can then be combined with vulnerability and exposure to yield estimates of tsunami risk. Copyright 2009 by the American Geophysical Union.
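
    The quoted return periods translate to probabilities in the usual way: an annual exceedance probability p corresponds to a return period T = 1/p, and the chance of at least one exceedance during an exposure window of t years is

        P(\text{at least one exceedance in } t \text{ years}) = 1 - (1 - p)^t,

    so the 500-year tsunami (p = 0.002) has roughly a 1 - 0.998^{50} ≈ 10% chance of occurring within a 50-year planning horizon.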

  2. Quantifying the transmission potential of pandemic influenza

    NASA Astrophysics Data System (ADS)

    Chowell, Gerardo; Nishiura, Hiroshi

    2008-03-01

    This article reviews quantitative methods to estimate the basic reproduction number of pandemic influenza, a key threshold quantity to help determine the intensity of interventions required to control the disease. Although it is difficult to assess the transmission potential of a probable future pandemic, historical epidemiologic data are readily available from previous pandemics, and as a reference quantity for future pandemic planning, mathematical and statistical analyses of historical data are crucial. In particular, because many historical records tend to document only the temporal distribution of cases or deaths (i.e. the epidemic curve), our review focuses on methods to maximize the utility of time-evolution data and to clarify the detailed mechanisms of the spread of influenza. First, we highlight structured epidemic models and their parameter estimation methods, which can quantify the detailed disease dynamics including those we cannot observe directly. Duration-structured epidemic systems are subsequently presented, offering a firm understanding of the definitions of the basic and effective reproduction numbers. When the initial growth phase of an epidemic is investigated, the distribution of the generation time is key statistical information for appropriately estimating the transmission potential from the intrinsic growth rate. Applications of stochastic processes are also highlighted to estimate the transmission potential using similar data. Critically important characteristics of influenza data are subsequently summarized, followed by our conclusions to suggest potential future methodological improvements.
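
    The growth-rate route the review emphasizes can be stated compactly: with intrinsic growth rate r estimated from the initial exponential phase of the epidemic curve and generation-time density g(t), the Euler-Lotka relation gives

        R = \frac{1}{\int_0^{\infty} e^{-rt}\, g(t)\, dt},

    which for a gamma-distributed generation time with shape a and rate b reduces to R = (1 + r/b)^a. Misspecifying g(t) therefore propagates directly into the estimate of the reproduction number.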

  3. Value-based decision-making battery: A Bayesian adaptive approach to assess impulsive and risky behavior.

    PubMed

    Pooseh, Shakoor; Bernhardt, Nadine; Guevara, Alvaro; Huys, Quentin J M; Smolka, Michael N

    2018-02-01

    Using simple mathematical models of choice behavior, we present a Bayesian adaptive algorithm to assess measures of impulsive and risky decision making. Practically, these measures are characterized by discounting rates and are used to classify individuals or population groups, to distinguish unhealthy behavior, and to predict developmental courses. However, a constant demand for improved tools to assess these constructs remains unmet. The algorithm is based on trial-by-trial observations. At each step, a choice is made between immediate (certain) and delayed (risky) options. Then the current parameter estimates are updated by the likelihood of observing the choice, and the next offers are provided from the indifference point, so that they will acquire the most informative data based on the current parameter estimates. The procedure continues for a certain number of trials in order to reach a stable estimate. The algorithm is discussed in detail for the delay discounting case, and results from decision making under risk for gains, losses, and mixed prospects are also provided. Simulated experiments using prescribed parameter values were performed to justify the algorithm in terms of the reproducibility of its parameters for individual assessments, and to test the reliability of the estimation procedure in a group-level analysis. The algorithm was implemented as an experimental battery to measure temporal and probability discounting rates together with loss aversion, and was tested on a healthy participant sample.
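
    A toy grid-filter version of the trial-by-trial update for the delay discounting case; the parameterization (hyperbolic discounting plus a softmax choice rule) is standard, but the grids, prior, and offer rule here are illustrative stand-ins rather than the battery's exact algorithm:

        import numpy as np

        # Grid posterior over discount rate k and softmax inverse temperature
        ks = np.logspace(-3, 0, 60)
        betas = np.linspace(0.5, 5.0, 20)
        K, B = np.meshgrid(ks, betas, indexing="ij")
        log_post = np.zeros_like(K)   # uniform prior on the grid

        def p_later(a_now, a_later, delay):
            """P(choose delayed) under hyperbolic discounting + softmax."""
            v_later = a_later / (1.0 + K * delay)
            return 1.0 / (1.0 + np.exp(-B * (v_later - a_now)))

        def update(a_now, a_later, delay, chose_later):
            """Bayes update of the grid posterior after one observed choice."""
            global log_post
            p = p_later(a_now, a_later, delay)
            log_post += np.log(p if chose_later else 1.0 - p)
            log_post -= log_post.max()   # numerical stabilization

        def next_offer(a_now, delay):
            """Delayed amount at the indifference point implied by the
            posterior-mean k, so the next trial is maximally informative."""
            post = np.exp(log_post)
            post /= post.sum()
            k_hat = float((post * K).sum())
            return a_now * (1.0 + k_hat * delay)

        # One simulated trial: the subject prefers 28 in 30 days over 10 now
        update(a_now=10.0, a_later=28.0, delay=30, chose_later=True)
        print(next_offer(a_now=10.0, delay=30))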

  4. Using open robust design models to estimate temporary emigration from capture-recapture data.

    PubMed

    Kendall, W L; Bjorkland, R

    2001-12-01

    Capture-recapture studies are crucial in many circumstances for estimating demographic parameters for wildlife and fish populations. Pollock's robust design, involving multiple sampling occasions per period of interest, provides several advantages over classical approaches. This includes the ability to estimate the probability of being present and available for detection, which in some situations is equivalent to breeding probability. We present a model for estimating availability for detection that relaxes two assumptions required in previous approaches. The first is that the sampled population is closed to additions and deletions across samples within a period of interest. The second is that each member of the population has the same probability of being available for detection in a given period. We apply our model to estimate survival and breeding probability in a study of hawksbill sea turtles (Eretmochelys imbricata), where previous approaches are not appropriate.

  5. Using open robust design models to estimate temporary emigration from capture-recapture data

    USGS Publications Warehouse

    Kendall, W.L.; Bjorkland, R.

    2001-01-01

    Capture-recapture studies are crucial in many circumstances for estimating demographic parameters for wildlife and fish populations. Pollock's robust design, involving multiple sampling occasions per period of interest, provides several advantages over classical approaches. This includes the ability to estimate the probability of being present and available for detection, which in some situations is equivalent to breeding probability. We present a model for estimating availability for detection that relaxes two assumptions required in previous approaches. The first is that the sampled population is closed to additions and deletions across samples within a period of interest. The second is that each member of the population has the same probability of being available for detection in a given period. We apply our model to estimate survival and breeding probability in a study of hawksbill sea turtles (Eretmochelys imbricata), where previous approaches are not appropriate.

  6. Wetlands Research Program. Corps of Engineers Wetlands Delineation Manual. Appendix C. Sections 1 and 2. Region 4 - North Plains.

    DTIC Science & Technology

    1987-01-01

    wetlands, are as follows (Category / Symbol / Definition):

    OBLIGATE WETLAND PLANTS (OBL): Plants that occur almost always (estimated probability >99%) in ...
    ... (estimated probability 1% to 33%) in nonwetlands.
    FACULTATIVE PLANTS (FAC): Plants with a similar likelihood (estimated probability 33% to 67%) of ...

    Symbols appearing in the list under the indicator status column are as follows: "+": A "+" sign following an indicator status denotes that the

  7. Risk estimation using probability machines

    PubMed Central

    2014-01-01

    Background Logistic regression has been the de facto, and often the only, model used in the description and analysis of relationships between a binary outcome and observed features. It is widely used to obtain the conditional probabilities of the outcome given predictors, as well as predictor effect size estimates using conditional odds ratios. Results We show how statistical learning machines for binary outcomes, provably consistent for the nonparametric regression problem, can be used to provide both consistent conditional probability estimation and conditional effect size estimates. Effect size estimates from learning machines leverage our understanding of counterfactual arguments central to the interpretation of such estimates. We show that, if the data generating model is logistic, we can recover accurate probability predictions and effect size estimates with nearly the same efficiency as a correct logistic model, both for main effects and interactions. We also propose a method using learning machines to scan for possible interaction effects quickly and efficiently. Simulations using random forest probability machines are presented. Conclusions The models we propose make no assumptions about the data structure, and capture the patterns in the data by just specifying the predictors involved and not any particular model structure. So they do not run the same risks of model mis-specification and the resultant estimation biases as a logistic model. This methodology, which we call a “risk machine”, will share properties of the statistical machine from which it is derived. PMID:24581306

  8. Risk estimation using probability machines.

    PubMed

    Dasgupta, Abhijit; Szymczak, Silke; Moore, Jason H; Bailey-Wilson, Joan E; Malley, James D

    2014-03-01

    Logistic regression has been the de facto, and often the only, model used in the description and analysis of relationships between a binary outcome and observed features. It is widely used to obtain the conditional probabilities of the outcome given predictors, as well as predictor effect size estimates using conditional odds ratios. We show how statistical learning machines for binary outcomes, provably consistent for the nonparametric regression problem, can be used to provide both consistent conditional probability estimation and conditional effect size estimates. Effect size estimates from learning machines leverage our understanding of counterfactual arguments central to the interpretation of such estimates. We show that, if the data generating model is logistic, we can recover accurate probability predictions and effect size estimates with nearly the same efficiency as a correct logistic model, both for main effects and interactions. We also propose a method using learning machines to scan for possible interaction effects quickly and efficiently. Simulations using random forest probability machines are presented. The models we propose make no assumptions about the data structure, and capture the patterns in the data by just specifying the predictors involved and not any particular model structure. So they do not run the same risks of model mis-specification and the resultant estimation biases as a logistic model. This methodology, which we call a "risk machine", will share properties of the statistical machine from which it is derived.
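
    A short sketch of a probability machine in the sense described above: a consistent nonparametric regression of a 0/1 outcome estimates the conditional probability directly, and counterfactual shifts of a predictor give effect sizes on the risk scale. The data and settings below are hypothetical.

        import numpy as np
        from sklearn.ensemble import RandomForestRegressor

        rng = np.random.default_rng(5)

        # Synthetic logistic data with a main effect and an interaction
        n = 5_000
        X = rng.normal(size=(n, 3))
        logit = 0.8 * X[:, 0] - 0.5 * X[:, 1] + 0.7 * X[:, 0] * X[:, 2]
        y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(float)

        # Regression forest on the 0/1 outcome: predictions estimate P(y=1|x)
        rf = RandomForestRegressor(n_estimators=300, min_samples_leaf=25,
                                   random_state=0).fit(X, y)

        # Counterfactual effect size: average change in predicted risk when
        # x0 is shifted by +1 SD with the other predictors held fixed
        X_hi = X.copy()
        X_hi[:, 0] += 1.0
        print((rf.predict(X_hi) - rf.predict(X)).mean())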

  9. Visuospatial information processing load and the ratio between parietal cue and target P3 amplitudes in the Attentional Network Test.

    PubMed

    Abramov, Dimitri M; Pontes, Monique; Pontes, Adailton T; Mourao-Junior, Carlos A; Vieira, Juliana; Quero Cunha, Carla; Tamborino, Tiago; Galhanone, Paulo R; deAzevedo, Leonardo C; Lazarev, Vladimir V

    2017-04-24

    In ERP studies of cognitive processes during attentional tasks, cue signals containing information about the target can increase the amplitude of the parietal cue P3 in relation to the 'neutral' temporal cue, and reduce the subsequent target P3 when this information is valid, i.e., when it corresponds to the target's attributes. The present study compared the cue-to-target P3 ratios in neutral and visuospatial cueing, in order to estimate the contribution of valid visuospatial information from the cue stage to the target stage of task performance, in terms of cognitive load. The P3 characteristics were also correlated with the results of individuals' performance of the visuospatial tasks, in order to estimate the relationship of the observed ERP with spatial reasoning. In 20 typically developing boys, aged 10-13 years (11.3±0.86), the intelligence quotient (I.Q.) was estimated by the Block Design and Vocabulary subtests from the WISC-III. The subjects performed the Attentional Network Test (ANT) accompanied by EEG recording. The cued two-choice task had three equiprobable cue conditions: No cue, with no information about the target; Neutral (temporal) cue, with an asterisk in the center of the visual field, predicting the target onset; and Spatial cues, with an asterisk in the upper or lower hemifield, predicting the onset and corresponding location of the target. The ERPs were estimated for the mid-frontal (Fz) and mid-parietal (Pz) scalp derivations. In the Pz, the Neutral cue P3 had a lower amplitude than the Spatial cue P3, whereas for the target ERPs, the P3 of the Neutral cue condition was larger than that of the Spatial cue condition. However, the sums of the magnitudes of the cue and target P3 were equal in spatial and neutral cueing, probably indicating that an equivalent information-processing load is carried by the cue response in one case and by the target response in the other. Meanwhile, in the Fz, the analogous ERP components for both the cue and target stimuli did not depend on the cue condition. The results show that, in the parietal site, the spatial cue P3 reflects the processing of visuospatial information regarding the target position. This contributes to the subsequent "decision-making", thus reducing the information-processing load on the target response, which is probably reflected in the lower P3. This finding is consistent with the positive correlation of parietal cue P3 with the individual's ability to perform spatial tasks as scored by the Block Design subtest. Copyright © 2017 Elsevier B.V. All rights reserved.

  10. On the use of secondary capture-recapture samples to estimate temporary emigration and breeding proportions

    USGS Publications Warehouse

    Kendall, W.L.; Nichols, J.D.; North, P.M.; Nichols, J.D.

    1995-01-01

    The use of the Cormack-Jolly-Seber model under a standard sampling scheme of one sample per time period, when the Jolly-Seber assumption that all emigration is permanent does not hold, leads to the confounding of temporary emigration probabilities with capture probabilities. This biases the estimates of capture probability when temporary emigration is a completely random process, and both capture and survival probabilities when there is a temporary trap response in temporary emigration, or it is Markovian. The use of secondary capture samples over a shorter interval within each period, during which the population is assumed to be closed (Pollock's robust design), provides a second source of information on capture probabilities. This solves the confounding problem, and thus temporary emigration probabilities can be estimated. This process can be accomplished in an ad hoc fashion for completely random temporary emigration and to some extent in the temporary trap response case, but modelling the complete sampling process provides more flexibility and permits direct estimation of variances. For the case of Markovian temporary emigration, a full likelihood is required.
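
    The confounding can be written compactly; as a sketch of the standard identifiability argument (hedged, not the paper's notation), under completely random temporary emigration with probability gamma of being off the study area in a given period, a single sample per period identifies only the product

        p^{*} = (1 - \gamma)\, p,

    where p is the true capture probability for animals present. Closed-population secondary samples within each primary period estimate p separately, so gamma becomes estimable as 1 - p^{*}/p; Markovian emigration requires modelling the full sampling process.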

  11. Optimum space shuttle launch times relative to natural environment

    NASA Technical Reports Server (NTRS)

    King, R. L.

    1977-01-01

    The probabilities of favorable and unfavorable weather conditions for launch and landing of the STS under different criteria were computed for every three hours on a yearly basis using 14 years of weather data. These temporal probability distributions were considered for three sets of weather criteria encompassing benign, moderate, and severe weather conditions for both Kennedy Space Center and Edwards Air Force Base. In addition, the conditional probabilities were computed for unfavorable weather conditions occurring after a delay which may or may not be due to weather conditions. Also computed for KSC were the probabilities of favorable landing conditions at various times after favorable launch conditions had prevailed. The probabilities were computed to indicate the significance of each weather element to the overall result.

  12. Population Immunity against Serotype-2 Poliomyelitis Leading up to the Global Withdrawal of the Oral Poliovirus Vaccine: Spatio-temporal Modelling of Surveillance Data.

    PubMed

    Pons-Salort, Margarita; Molodecky, Natalie A; O'Reilly, Kathleen M; Wadood, Mufti Zubair; Safdar, Rana M; Etsano, Andrew; Vaz, Rui Gama; Jafari, Hamid; Grassly, Nicholas C; Blake, Isobel M

    2016-10-01

    Global withdrawal of serotype-2 oral poliovirus vaccine (OPV2) took place in April 2016. This marked a milestone in global polio eradication and was a public health intervention of unprecedented scale, affecting 155 countries. Achieving high levels of serotype-2 population immunity before OPV2 withdrawal was critical to avoid subsequent outbreaks of serotype-2 vaccine-derived polioviruses (VDPV2s). In August 2015, we estimated vaccine-induced population immunity against serotype-2 poliomyelitis for 1 January 2004-30 June 2015 and produced forecasts for April 2016 by district in Nigeria and Pakistan. Population immunity was estimated from the vaccination histories of children <36 mo old identified with non-polio acute flaccid paralysis (AFP) reported through polio surveillance, information on immunisation activities with different oral poliovirus vaccine (OPV) formulations, and serotype-specific estimates of the efficacy of these OPVs against poliomyelitis. District immunity estimates were spatio-temporally smoothed using a Bayesian hierarchical framework. Coverage estimates for immunisation activities were also obtained, allowing for heterogeneity within and among districts. Forward projections of immunity, based on these estimates and planned immunisation activities, were produced through to April 2016 using a cohort model. Estimated population immunity was negatively correlated with the probability of VDPV2 poliomyelitis being reported in a district. In Nigeria and Pakistan, declines in immunity during 2008-2009 and 2012-2013, respectively, were associated with outbreaks of VDPV2. Immunity has since improved in both countries as a result of increased use of trivalent OPV, and projections generally indicated sustained or improved immunity in April 2016, such that the majority of districts (99% [95% uncertainty interval 97%-100%] in Nigeria and 84% [95% uncertainty interval 77%-91%] in Pakistan) had >70% population immunity among children <36 mo old. Districts with lower immunity were clustered in northeastern Nigeria and northwestern Pakistan. The accuracy of immunity estimates was limited by the small numbers of non-polio AFP cases in some districts, which was reflected by large uncertainty intervals. Forecasted improvements in immunity for April 2016 were robust to the uncertainty in estimates of baseline immunity (January-June 2015), vaccine coverage, and vaccine efficacy. Immunity against serotype-2 poliomyelitis was forecasted to improve in April 2016 compared to the first half of 2015 in Nigeria and Pakistan. These analyses informed the endorsement of OPV2 withdrawal in April 2016 by the WHO Strategic Advisory Group of Experts on Immunization.

  13. Population Immunity against Serotype-2 Poliomyelitis Leading up to the Global Withdrawal of the Oral Poliovirus Vaccine: Spatio-temporal Modelling of Surveillance Data

    PubMed Central

    O’Reilly, Kathleen M.; Etsano, Andrew; Vaz, Rui Gama; Jafari, Hamid; Grassly, Nicholas C.; Blake, Isobel M.

    2016-01-01

    Background Global withdrawal of serotype-2 oral poliovirus vaccine (OPV2) took place in April 2016. This marked a milestone in global polio eradication and was a public health intervention of unprecedented scale, affecting 155 countries. Achieving high levels of serotype-2 population immunity before OPV2 withdrawal was critical to avoid subsequent outbreaks of serotype-2 vaccine-derived polioviruses (VDPV2s). Methods and Findings In August 2015, we estimated vaccine-induced population immunity against serotype-2 poliomyelitis for 1 January 2004–30 June 2015 and produced forecasts for April 2016 by district in Nigeria and Pakistan. Population immunity was estimated from the vaccination histories of children <36 mo old identified with non-polio acute flaccid paralysis (AFP) reported through polio surveillance, information on immunisation activities with different oral poliovirus vaccine (OPV) formulations, and serotype-specific estimates of the efficacy of these OPVs against poliomyelitis. District immunity estimates were spatio-temporally smoothed using a Bayesian hierarchical framework. Coverage estimates for immunisation activities were also obtained, allowing for heterogeneity within and among districts. Forward projections of immunity, based on these estimates and planned immunisation activities, were produced through to April 2016 using a cohort model. Estimated population immunity was negatively correlated with the probability of VDPV2 poliomyelitis being reported in a district. In Nigeria and Pakistan, declines in immunity during 2008–2009 and 2012–2013, respectively, were associated with outbreaks of VDPV2. Immunity has since improved in both countries as a result of increased use of trivalent OPV, and projections generally indicated sustained or improved immunity in April 2016, such that the majority of districts (99% [95% uncertainty interval 97%–100%] in Nigeria and 84% [95% uncertainty interval 77%–91%] in Pakistan) had >70% population immunity among children <36 mo old. Districts with lower immunity were clustered in northeastern Nigeria and northwestern Pakistan. The accuracy of immunity estimates was limited by the small numbers of non-polio AFP cases in some districts, which was reflected by large uncertainty intervals. Forecasted improvements in immunity for April 2016 were robust to the uncertainty in estimates of baseline immunity (January–June 2015), vaccine coverage, and vaccine efficacy. Conclusions Immunity against serotype-2 poliomyelitis was forecasted to improve in April 2016 compared to the first half of 2015 in Nigeria and Pakistan. These analyses informed the endorsement of OPV2 withdrawal in April 2016 by the WHO Strategic Advisory Group of Experts on Immunization. PMID:27701425

  14. Clinical judgment to estimate pretest probability in the diagnosis of Cushing's syndrome under a Bayesian perspective.

    PubMed

    Cipoli, Daniel E; Martinez, Edson Z; Castro, Margaret de; Moreira, Ayrton C

    2012-12-01

    To estimate the pretest probability of Cushing's syndrome (CS) diagnosis by a Bayesian approach using intuitive clinical judgment. Physicians were requested, in seven endocrinology meetings, to answer three questions: "Based on your personal expertise, after obtaining clinical history and physical examination, without using laboratorial tests, what is your probability of diagnosing Cushing's Syndrome?"; "For how long have you been practicing Endocrinology?"; and "Where do you work?". A Bayesian beta regression, using the WinBUGS software, was employed. We obtained 294 questionnaires. The mean pretest probability of CS diagnosis was 51.6% (95% CI: 48.7-54.3). The probability was directly related to experience in endocrinology, but not to place of work. The pretest probability of CS diagnosis was estimated using a Bayesian methodology. Although pretest likelihood can be context-dependent, experience based on years of practice may help the practitioner to diagnose CS.
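
    A minimal sketch of the Bayesian updating this framing implies, in Python. The sensitivity and specificity below are hypothetical placeholders (the abstract reports only the pretest probability), so this shows the arithmetic rather than the study's own beta-regression model:

      def posttest_probability(pretest: float, sensitivity: float, specificity: float) -> float:
          """Posterior probability of disease after a positive test result."""
          lr_positive = sensitivity / (1.0 - specificity)  # positive likelihood ratio
          prior_odds = pretest / (1.0 - pretest)
          posterior_odds = prior_odds * lr_positive
          return posterior_odds / (1.0 + posterior_odds)

      # Mean pretest probability reported in the study was 0.516.
      print(posttest_probability(pretest=0.516, sensitivity=0.95, specificity=0.90))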

  15. A probabilistic approach for shallow rainfall-triggered landslide modeling at basin scale. A case study in the Luquillo Forest, Puerto Rico

    NASA Astrophysics Data System (ADS)

    Dialynas, Y. G.; Arnone, E.; Noto, L. V.; Bras, R. L.

    2013-12-01

    Slope stability depends on geotechnical and hydrological factors that exhibit wide natural spatial variability, yet sufficient measurements of the related parameters are rarely available over entire study areas. The uncertainty associated with the inability to fully characterize hydrologic behavior has an impact on any attempt to model landslide hazards. This work suggests a way to systematically account for this uncertainty in coupled distributed hydrological-stability models for shallow landslide hazard assessment. A probabilistic approach for the prediction of rainfall-triggered landslide occurrence at basin scale was implemented in an existing distributed eco-hydrological and landslide model, tRIBS-VEGGIE-Landslide (Triangulated Irregular Network (TIN)-based Real-time Integrated Basin Simulator - VEGetation Generator for Interactive Evolution). More precisely, we upgraded tRIBS-VEGGIE-Landslide to assess the likelihood of shallow landslides by accounting for uncertainty related to geotechnical and hydrological factors that directly affect slope stability. Natural variability of geotechnical soil characteristics was considered by randomizing soil cohesion and friction angle. Hydrological uncertainty related to the estimation of matric suction was taken into account by treating soil retention parameters as correlated random variables. The probability of failure is estimated through an assumed theoretical Factor of Safety (FS) distribution, conditioned on soil moisture content. At each cell, the temporally varying FS statistics are approximated by the First Order Second Moment (FOSM) method, as a function of the statistical properties of the parameters. The model was applied to the Rio Mameyes Basin, located in the Luquillo Experimental Forest in Puerto Rico, where previous landslide analyses have been carried out. At each time step, model outputs include the probability of landslide occurrence across the basin and the most probable depth of failure at each soil column. The proposed probabilistic approach to shallow landslide prediction is able to reveal and quantify landslide risk on slopes assessed as stable by simpler deterministic methods.
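
    A minimal sketch of the FOSM step described above, assuming a simplified dry infinite-slope factor of safety and illustrative parameter moments (none of these values come from the study); the failure probability is then read off an assumed normal FS distribution:

      import math
      from scipy.stats import norm

      SLOPE = math.radians(30.0)   # slope angle
      GAMMA = 18.0e3               # soil unit weight, N/m^3
      DEPTH = 1.5                  # failure-plane depth, m

      def factor_of_safety(cohesion, friction_angle):
          """Dry infinite-slope FS (simplified)."""
          resisting = cohesion + GAMMA * DEPTH * math.cos(SLOPE) ** 2 * math.tan(friction_angle)
          driving = GAMMA * DEPTH * math.sin(SLOPE) * math.cos(SLOPE)
          return resisting / driving

      mu = {"c": 5.0e3, "phi": math.radians(32.0)}   # means (Pa, rad)
      sd = {"c": 1.5e3, "phi": math.radians(3.0)}    # standard deviations

      # First-order Taylor expansion around the means (numerical partials).
      eps = 1e-6
      fs_mean = factor_of_safety(mu["c"], mu["phi"])
      d_c = (factor_of_safety(mu["c"] + eps, mu["phi"]) - fs_mean) / eps
      d_phi = (factor_of_safety(mu["c"], mu["phi"] + eps) - fs_mean) / eps
      fs_var = (d_c * sd["c"]) ** 2 + (d_phi * sd["phi"]) ** 2  # independent inputs

      p_failure = norm.cdf(1.0, loc=fs_mean, scale=math.sqrt(fs_var))  # P(FS < 1)
      print(f"FS mean={fs_mean:.2f}, sd={math.sqrt(fs_var):.2f}, P(failure)={p_failure:.4f}")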

  16. Measuring the effect of fuel treatments on forest carbon using landscape risk analysis

    NASA Astrophysics Data System (ADS)

    Ager, A. A.; Finney, M. A.; McMahan, A.; Cathcart, J.

    2010-12-01

    Wildfire simulation modelling was used to examine whether fuel reduction treatments can potentially reduce future wildfire emissions and provide carbon benefits. In contrast to previous reports, the current study modelled landscape-scale effects of fuel treatments on fire spread and intensity, and used a probabilistic framework to quantify wildfire effects on carbon pools to account for stochastic wildfire occurrence. The study area was a 68 474 ha watershed located on the Fremont-Winema National Forest in southeastern Oregon, USA. Fuel reduction treatments were simulated on 10% of the watershed (19% of federal forestland). We simulated 30 000 wildfires with random ignition locations under both treated and untreated landscapes to estimate the change in burn probability by flame length class resulting from the treatments. Carbon loss functions were then calculated with the Forest Vegetation Simulator for each stand in the study area to quantify change in carbon as a function of flame length. We then calculated the expected change in carbon from a random ignition and wildfire as the sum of the products of the carbon losses and the burn probabilities by flame length class. The expected carbon difference between the non-treatment and treatment scenarios was then calculated to quantify the effect of fuel treatments. Overall, the results show that the carbon loss from implementing fuel reduction treatments exceeded the expected carbon benefit associated with lowered burn probabilities and reduced fire severity on the treated landscape. Thus, fuel management activities resulted in an expected net loss of carbon immediately after treatment. However, the findings represent a point-in-time estimate (wildfire immediately after treatments), and a temporal analysis within the probabilistic framework used here is needed to model carbon dynamics over the life cycle of the fuel treatments. Of particular importance is the long-term balance between emissions from the decay of dead trees killed by fire and carbon sequestration by forest regeneration following wildfire.
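
    The expected-value calculation at the heart of the study fits in a few lines; the numbers below are invented for illustration (carbon losses per stand and burn probabilities per flame-length class):

      flame_classes = ["0-2 ft", "2-4 ft", "4-8 ft", ">8 ft"]

      def expected_carbon_loss(burn_prob, carbon_loss):
          """E[loss] = sum over flame-length classes of p_i * L_i."""
          return sum(p * loss for p, loss in zip(burn_prob, carbon_loss))

      losses = [2.0, 5.0, 9.0, 14.0]  # hypothetical carbon loss (Mg C) by class
      untreated = expected_carbon_loss([0.010, 0.006, 0.003, 0.001], losses)
      treated = expected_carbon_loss([0.008, 0.004, 0.001, 0.0005], losses)
      # Expected wildfire-carbon benefit of treatment; the study found this was
      # smaller than the carbon removed by the treatments themselves.
      print(untreated - treated)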

  17. Quality metrics for sensor images

    NASA Technical Reports Server (NTRS)

    Ahumada, A. L.

    1993-01-01

    Methods are needed for evaluating the quality of augmented visual displays (AVID). Computational quality metrics will help summarize, interpolate, and extrapolate the results of human performance tests with displays. The FLM Vision group at NASA Ames has been developing computational models of visual processing and using them to develop computational metrics for similar problems. For example, display modeling systems use metrics for comparing proposed displays, halftoning optimizing methods use metrics to evaluate the difference between the halftone and the original, and image compression methods minimize the predicted visibility of compression artifacts. The visual discrimination models take as input two arbitrary images A and B and compute an estimate of the probability that a human observer will report that A is different from B. If A is an image that one desires to display and B is the actual displayed image, such an estimate can be regarded as an image quality metric reflecting how well B approximates A. There are additional complexities associated with the problem of evaluating the quality of radar and IR enhanced displays for AVID tasks. One important problem is the question of whether intruding obstacles are detectable in such displays. Although the discrimination model can handle detection situations by making B the original image A plus the intrusion, this detection model makes the inappropriate assumption that the observer knows where the intrusion will be. Effects of signal uncertainty need to be added to our models. A pilot needs to make decisions rapidly. The models need to predict not just the probability of a correct decision, but the probability of a correct decision by the time the decision needs to be made. That is, the models need to predict latency as well as accuracy. Luce and Green have generated models for auditory detection latencies. Similar models are needed for visual detection. Most image quality models are designed for static imagery. Watson has been developing a general spatial-temporal vision model to optimize video compression techniques. These models need to be adapted and calibrated for AVID applications.

  18. Temporal interpolation alters motion in fMRI scans: Magnitudes and consequences for artifact detection.

    PubMed

    Power, Jonathan D; Plitt, Mark; Kundu, Prantik; Bandettini, Peter A; Martin, Alex

    2017-01-01

    Head motion can be estimated at any point of fMRI image processing. Processing steps involving temporal interpolation (e.g., slice time correction or outlier replacement) often precede motion estimation in the literature. From first principles it can be anticipated that temporal interpolation will alter head motion in a scan. Here we demonstrate this effect and its consequences in five large fMRI datasets. Estimated head motion was reduced by 10-50% or more following temporal interpolation, and reductions were often visible to the naked eye. Such reductions make the data seem to be of improved quality. Such reductions also degrade the sensitivity of analyses aimed at detecting motion-related artifact and can cause a dataset with artifact to falsely appear artifact-free. These reduced motion estimates will be particularly problematic for studies needing estimates of motion in time, such as studies of dynamics. Based on these findings, it is sensible to obtain motion estimates prior to any image processing (regardless of subsequent processing steps and the actual timing of motion correction procedures, which need not be changed). We also find that outlier replacement procedures change signals almost entirely during times of motion and therefore have notable similarities to motion-targeting censoring strategies (which withhold or replace signals entirely during times of motion).
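
    One standard summary of the head motion at issue here is Power-style framewise displacement (FD): the sum of absolute backward differences of the six rigid-body parameters, with rotations converted to millimetres of arc on a 50 mm sphere. Computing FD on parameters estimated before versus after temporal interpolation would expose the reductions reported above. A sketch with synthetic data (not the paper's code):

      import numpy as np

      def framewise_displacement(params: np.ndarray, radius_mm: float = 50.0) -> np.ndarray:
          """params: (T, 6) array of [x, y, z translations (mm), 3 rotations (rad)]."""
          deltas = np.abs(np.diff(params, axis=0))
          deltas[:, 3:] *= radius_mm        # radians -> mm of arc on a sphere
          return deltas.sum(axis=1)         # FD per frame transition

      rng = np.random.default_rng(0)
      motion = np.cumsum(rng.normal(0, 0.02, size=(200, 6)), axis=0)  # synthetic drift
      print(framewise_displacement(motion).mean())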

  19. Temporal interpolation alters motion in fMRI scans: Magnitudes and consequences for artifact detection

    PubMed Central

    Power, Jonathan D.; Plitt, Mark; Kundu, Prantik; Bandettini, Peter A.; Martin, Alex

    2017-01-01

    Head motion can be estimated at any point of fMRI image processing. Processing steps involving temporal interpolation (e.g., slice time correction or outlier replacement) often precede motion estimation in the literature. From first principles it can be anticipated that temporal interpolation will alter head motion in a scan. Here we demonstrate this effect and its consequences in five large fMRI datasets. Estimated head motion was reduced by 10–50% or more following temporal interpolation, and reductions were often visible to the naked eye. Such reductions make the data seem to be of improved quality. Such reductions also degrade the sensitivity of analyses aimed at detecting motion-related artifact and can cause a dataset with artifact to falsely appear artifact-free. These reduced motion estimates will be particularly problematic for studies needing estimates of motion in time, such as studies of dynamics. Based on these findings, it is sensible to obtain motion estimates prior to any image processing (regardless of subsequent processing steps and the actual timing of motion correction procedures, which need not be changed). We also find that outlier replacement procedures change signals almost entirely during times of motion and therefore have notable similarities to motion-targeting censoring strategies (which withhold or replace signals entirely during times of motion). PMID:28880888

  20. Investigation of estimators of probability density functions

    NASA Technical Reports Server (NTRS)

    Speed, F. M.

    1972-01-01

    Four research projects are summarized which include: (1) the generation of random numbers on the IBM 360/44, (2) statistical tests used to check out random number generators, (3) Specht density estimators, and (4) use of estimators of probability density functions in analyzing large amounts of data.

  1. Hong-Ou-Mandel effect in terms of the temporal biphoton wave function with two arrival-time variables

    NASA Astrophysics Data System (ADS)

    Fedorov, M. V.; Sysoeva, A. A.; Vintskevich, S. V.; Grigoriev, D. A.

    2018-03-01

    The well-known Hong-Ou-Mandel effect is revisited. Two physical reasons are discussed for the effect to be less pronounced or even to disappear: differing polarizations of the photons arriving at the beamsplitter, and a delay of the photons in one of the two channels. For the latter we use the concepts of biphoton frequency and temporal wave functions, depending, respectively, on two continuous frequency variables of the photons and on two time variables t1 and t2 interpreted as the arrival times of the photons at the beamsplitter. Explicit expressions are found for the probability densities and total probabilities for photon pairs to be split between the two channels after the beamsplitter or to remain unsplit, with both photons appearing together in one of the two channels.

  2. On the use of satellite-based estimates of rainfall temporal distribution to simulate the potential for malaria transmission in rural Africa

    NASA Astrophysics Data System (ADS)

    Yamana, Teresa K.; Eltahir, Elfatih A. B.

    2011-02-01

    This paper describes the use of satellite-based estimates of rainfall to force the Hydrology, Entomology and Malaria Transmission Simulator (HYDREMATS), a hydrology-based mechanistic model of malaria transmission. We first examined the temporal resolution of rainfall input required by HYDREMATS. Simulations conducted over Banizoumbou village in Niger showed that for reasonably accurate simulation of mosquito populations, the model requires rainfall data with at least 1 h resolution. We then investigated whether HYDREMATS could be effectively forced by satellite-based estimates of rainfall instead of ground-based observations. The Climate Prediction Center morphing technique (CMORPH) precipitation estimates distributed by the National Oceanic and Atmospheric Administration are available at a 30 min temporal resolution and 8 km spatial resolution. We compared mosquito populations simulated by HYDREMATS when the model is forced by adjusted CMORPH estimates and by ground observations. The results demonstrate that adjusted rainfall estimates from satellites can be used with a mechanistic model to accurately simulate the dynamics of mosquito populations.

  3. Testing the molecular clock using mechanistic models of fossil preservation and molecular evolution.

    PubMed

    Warnock, Rachel C M; Yang, Ziheng; Donoghue, Philip C J

    2017-06-28

    Molecular sequence data provide information about relative times only, and fossil-based age constraints are the ultimate source of information about absolute times in molecular clock dating analyses. Thus, fossil calibrations are critical to molecular clock dating, but competing methods are difficult to evaluate empirically because the true evolutionary time scale is never known. Here, we combine mechanistic models of fossil preservation and sequence evolution in simulations to evaluate different approaches to constructing fossil calibrations and their impact on Bayesian molecular clock dating, and the relative impact of fossil versus molecular sampling. We show that divergence time estimation is impacted by the model of fossil preservation, sampling intensity and tree shape. The addition of sequence data may improve molecular clock estimates, but accuracy and precision are dominated by the quality of the fossil calibrations. Posterior means and medians are poor representatives of true divergence times; posterior intervals provide a much more accurate estimate of divergence times, though they may be wide and often do not have high coverage probability. Our results highlight the importance of increased fossil sampling and improved statistical approaches to generating calibrations, which should incorporate the non-uniform nature of ecological and temporal fossil species distributions. © 2017 The Authors.

  4. Constrained motion estimation-based error resilient coding for HEVC

    NASA Astrophysics Data System (ADS)

    Guo, Weihan; Zhang, Yongfei; Li, Bo

    2018-04-01

    Unreliable communication channels may introduce packet losses and bit errors into the videos transmitted through them, causing severe video quality degradation. This is even worse for HEVC, since more advanced and powerful motion estimation methods are introduced to further remove the inter-frame dependency and thus improve the coding efficiency. Once a Motion Vector (MV) is lost or corrupted, it will cause distortion in the decoded frame. More importantly, due to motion compensation, the error will propagate along the motion prediction path, accumulate over time, and significantly degrade the overall video presentation quality. To address this problem, we study encoder-side error resilient coding for HEVC and propose a constrained motion estimation scheme to mitigate the propagation of errors to subsequent frames. The approach works by cutting off MV dependencies and limiting the block regions that are predicted by the temporal motion vector. The experimental results show that the proposed method can effectively suppress the error propagation caused by bit errors in motion vectors and can improve the robustness of the stream over bit-error channels. When the bit error probability is 10^-5, an increase in decoded video quality (PSNR) of up to 1.310 dB, and on average 0.762 dB, can be achieved compared to the reference HEVC.

  5. Stereotactic probability and variability of speech arrest and anomia sites during stimulation mapping of the language dominant hemisphere.

    PubMed

    Chang, Edward F; Breshears, Jonathan D; Raygor, Kunal P; Lau, Darryl; Molinaro, Annette M; Berger, Mitchel S

    2017-01-01

    OBJECTIVE: Functional mapping using direct cortical stimulation is the gold standard for the prevention of postoperative morbidity during resective surgery in dominant-hemisphere perisylvian regions. Its role is necessitated by the significant interindividual variability that has been observed for essential language sites. The aim in this study was to determine the statistical probability distribution of eliciting aphasic errors for any given stereotactically based cortical position in a patient cohort and to quantify the variability at each cortical site. METHODS: Patients undergoing awake craniotomy for dominant-hemisphere primary brain tumor resection between 1999 and 2014 at the authors' institution were included in this study, which included counting and picture-naming tasks during dense speech mapping via cortical stimulation. Positive and negative stimulation sites were collected using an intraoperative frameless stereotactic neuronavigation system and were converted to Montreal Neurological Institute coordinates. Data were iteratively resampled to create mean and standard deviation probability maps for speech arrest and anomia. Patients were divided into groups with a "classic" or an "atypical" location of speech function, based on the resultant probability maps. Patient and clinical factors were then assessed for their association with an atypical location of speech sites by univariate and multivariate analysis. RESULTS: Across 102 patients undergoing speech mapping, the overall probabilities of speech arrest and anomia were 0.51 and 0.33, respectively. Speech arrest was most likely to occur with stimulation of the posterior inferior frontal gyrus (maximum probability from individual bin = 0.025), and variance was highest in the dorsal premotor cortex and the posterior superior temporal gyrus. In contrast, stimulation within the posterior perisylvian cortex resulted in the maximum mean probability of anomia (maximum probability = 0.012), with large variance in the regions surrounding the posterior superior temporal gyrus, including the posterior middle temporal, angular, and supramarginal gyri. Patients with atypical speech localization were far more likely to have tumors in canonical Broca's or Wernicke's areas (OR 7.21, 95% CI 1.67-31.09, p < 0.01) or to have multilobar tumors (OR 12.58, 95% CI 2.22-71.42, p < 0.01), than were patients with classic speech localization. CONCLUSIONS: This study provides statistical probability distribution maps for aphasic errors during cortical stimulation mapping in a patient cohort. Thus, the authors provide an expected probability of inducing speech arrest and anomia from specific 10-mm² cortical bins in an individual patient. In addition, they highlight key regions of interindividual mapping variability that should be considered preoperatively. They believe these results will aid surgeons in their preoperative planning of eloquent cortex resection.
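
    The iterative-resampling step can be sketched as a patient-level bootstrap of the per-bin probability of an aphasic error; the data structure and function below are hypothetical stand-ins for the paper's pipeline:

      import numpy as np

      def bootstrap_bin_probability(stim_results: dict, bin_id, n_boot: int = 1000,
                                    rng=np.random.default_rng(9)):
          """stim_results: hypothetical {patient: {bin: 0/1 error outcome}} mapping."""
          outcomes = np.array([bins[bin_id] for bins in stim_results.values()
                               if bin_id in bins], dtype=float)
          means = [rng.choice(outcomes, size=outcomes.size, replace=True).mean()
                   for _ in range(n_boot)]
          return float(np.mean(means)), float(np.std(means))  # mean and SD map entries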

  6. Estimating trace-suspect match probabilities for singleton Y-STR haplotypes using coalescent theory.

    PubMed

    Andersen, Mikkel Meyer; Caliebe, Amke; Jochens, Arne; Willuweit, Sascha; Krawczak, Michael

    2013-02-01

    Estimation of match probabilities for singleton haplotypes of lineage markers, i.e. for haplotypes observed only once in a reference database augmented by a suspect profile, is an important problem in forensic genetics. We compared the performance of four estimators of singleton match probabilities for Y-STRs, namely the count estimate, both with and without Brenner's so-called 'kappa correction', the surveying estimate, and a previously proposed, but rarely used, coalescent-based approach implemented in the BATWING software. Extensive simulation with BATWING of the underlying population history, haplotype evolution and subsequent database sampling revealed that the coalescent-based approach is characterized by lower bias and lower mean squared error than the uncorrected count estimator and the surveying estimator. Moreover, in contrast to the two count estimators, both the surveying and the coalescent-based approach exhibited a good correlation between the estimated and true match probabilities. However, although its overall performance is thus better than that of any other recognized method, the coalescent-based estimator remains so computationally intensive as to verge on general impracticability. Its application in forensic practice will therefore have to be limited to small reference databases, or to isolated cases of particular interest, until more powerful algorithms for coalescent simulation become available. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  7. Spatio-temporal models of mental processes from fMRI.

    PubMed

    Janoos, Firdaus; Machiraju, Raghu; Singh, Shantanu; Morocz, Istvan Ákos

    2011-07-15

    Understanding the highly complex, spatially distributed and temporally organized phenomena entailed by mental processes using functional MRI is an important research problem in cognitive and clinical neuroscience. Conventional analysis methods focus on the spatial dimension of the data discarding the information about brain function contained in the temporal dimension. This paper presents a fully spatio-temporal multivariate analysis method using a state-space model (SSM) for brain function that yields not only spatial maps of activity but also its temporal structure along with spatially varying estimates of the hemodynamic response. Efficient algorithms for estimating the parameters along with quantitative validations are given. A novel low-dimensional feature-space for representing the data, based on a formal definition of functional similarity, is derived. Quantitative validation of the model and the estimation algorithms is provided with a simulation study. Using a real fMRI study for mental arithmetic, the ability of this neurophysiologically inspired model to represent the spatio-temporal information corresponding to mental processes is demonstrated. Moreover, by comparing the models across multiple subjects, natural patterns in mental processes organized according to different mental abilities are revealed. Copyright © 2011 Elsevier Inc. All rights reserved.

  8. Estimating soil moisture exceedance probability from antecedent rainfall

    NASA Astrophysics Data System (ADS)

    Cronkite-Ratcliff, C.; Kalansky, J.; Stock, J. D.; Collins, B. D.

    2016-12-01

    The first storms of the rainy season in coastal California, USA, add moisture to soils but rarely trigger landslides. Previous workers proposed that antecedent rainfall, the cumulative seasonal rain from October 1 onwards, had to exceed specific amounts in order to trigger landsliding. Recent monitoring of soil moisture upslope of historic landslides in the San Francisco Bay Area shows that storms can cause positive pressure heads once soil moisture values exceed a threshold of volumetric water content (VWC). We propose that antecedent rainfall could be used to estimate the probability that VWC exceeds this threshold. A major challenge to estimating the probability of exceedance is that rain gauge records are frequently incomplete. We developed a stochastic model to impute (infill) missing hourly precipitation data. This model uses nearest neighbor-based conditional resampling of the gauge record using data from nearby rain gauges. Using co-located VWC measurements, imputed data can be used to estimate the probability that VWC exceeds a specific threshold for a given antecedent rainfall. The stochastic imputation model can also provide an estimate of uncertainty in the exceedance probability curve. Here we demonstrate the method using soil moisture and precipitation data from several sites located throughout Northern California. Results show a significant variability between sites in the sensitivity of VWC exceedance probability to antecedent rainfall.
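
    Both steps can be sketched compactly, under simplifying assumptions (a single, gap-free neighbor gauge; k-nearest-hour conditional resampling; empirical exceedance estimation; names and k are illustrative):

      import numpy as np

      def impute_missing(target: np.ndarray, neighbor: np.ndarray, k: int = 5,
                         rng=np.random.default_rng(1)) -> np.ndarray:
          """Fill NaN hours in `target` by resampling its values from hours whose
          neighbor-gauge readings are most similar to the gap's neighbor reading."""
          filled = target.copy()
          valid = ~np.isnan(target)
          for t in np.flatnonzero(np.isnan(target)):
              dist = np.abs(neighbor[valid] - neighbor[t])
              pool = filled[valid][np.argsort(dist)[:k]]  # k most similar hours
              filled[t] = rng.choice(pool)                # conditional resample
          return filled

      def exceedance_probability(antecedent: np.ndarray, vwc: np.ndarray,
                                 rain_level: float, vwc_threshold: float) -> float:
          """Empirical P(VWC > threshold) among hours with antecedent rain >= level."""
          mask = antecedent >= rain_level
          return float((vwc[mask] > vwc_threshold).mean()) if mask.any() else float("nan")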

  9. A multimodal approach to estimating vigilance using EEG and forehead EOG

    NASA Astrophysics Data System (ADS)

    Zheng, Wei-Long; Lu, Bao-Liang

    2017-04-01

    Objective. Covert aspects of ongoing user mental states provide key context information for user-aware human computer interactions. In this paper, we focus on the problem of estimating the vigilance of users using EEG and EOG signals. Approach. The PERCLOS index as vigilance annotation is obtained from eye tracking glasses. To improve the feasibility and wearability of vigilance estimation devices for real-world applications, we adopt a novel electrode placement for forehead EOG and extract various eye movement features, which contain the principal information of traditional EOG. We explore the effects of EEG from different brain areas and combine EEG and forehead EOG to leverage their complementary characteristics for vigilance estimation. Considering that the vigilance of users is a dynamic changing process because the intrinsic mental states of users involve temporal evolution, we introduce continuous conditional neural field and continuous conditional random field models to capture dynamic temporal dependency. Main results. We propose a multimodal approach to estimating vigilance by combining EEG and forehead EOG and incorporating the temporal dependency of vigilance into model training. The experimental results demonstrate that modality fusion can improve the performance compared with a single modality, EOG and EEG contain complementary information for vigilance estimation, and the temporal dependency-based models can enhance the performance of vigilance estimation. From the experimental results, we observe that theta and alpha frequency activities are increased, while gamma frequency activities are decreased in drowsy states in contrast to awake states. Significance. The forehead setup allows for the simultaneous collection of EEG and EOG and achieves comparable performance using only four shared electrodes in comparison with the temporal and posterior sites.

  10. LCDs are better: psychophysical and photometric estimates of the temporal characteristics of CRT and LCD monitors.

    PubMed

    Lagroix, Hayley E P; Yanko, Matthew R; Spalek, Thomas M

    2012-07-01

    Many cognitive and perceptual phenomena, such as iconic memory and temporal integration, require brief displays. A critical requirement is that the image not remain visible after its offset. It is commonly believed that liquid crystal displays (LCD) are unsuitable because of their poor temporal response characteristics relative to cathode-ray-tube (CRT) screens. Remarkably, no psychophysical estimates of visible persistence are available to verify this belief. A series of experiments in which white stimuli on a black background produced discernible persistence on CRT but not on LCD screens, during both dark- and light-adapted viewing, falsified this belief. Similar estimates using black stimuli on a white background produced no visible persistence on either screen. That said, photometric measurements are available that seem to confirm the poor temporal characteristics of LCD screens, but they were obtained before recent advances in LCD technology. Using current LCD screens, we obtained photometric estimates of rise time far shorter (1-6 ms) than earlier estimates (20-150 ms), and approaching those of CRTs (<1 ms). We conclude that LCDs are preferable to CRTs when visible persistence is a concern, except when black-on-white displays are used.
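
    The photometric rise-time measurement reduces to finding the 10% and 90% crossings in a photodiode trace; a sketch assuming a hypothetical trace and sample rate:

      import numpy as np

      def rise_time_ms(luminance: np.ndarray, sample_rate_hz: float) -> float:
          """Time from 10% to 90% of the settled level after a black-to-white step."""
          lo, hi = luminance.min(), luminance[-50:].mean()    # settled "white" level
          t10 = np.argmax(luminance >= lo + 0.1 * (hi - lo))  # first 10% crossing
          t90 = np.argmax(luminance >= lo + 0.9 * (hi - lo))  # first 90% crossing
          return (t90 - t10) / sample_rate_hz * 1000.0

      # e.g. rise_time_ms(trace, 20_000.0) for a 20 kHz photodiode recording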

  11. Bayesian inverse modeling and source location of an unintended 131I release in Europe in the fall of 2011

    NASA Astrophysics Data System (ADS)

    Tichý, Ondřej; Šmídl, Václav; Hofman, Radek; Šindelářová, Kateřina; Hýža, Miroslav; Stohl, Andreas

    2017-10-01

    In the fall of 2011, iodine-131 (131I) was detected at several radionuclide monitoring stations in central Europe. After investigation, the International Atomic Energy Agency (IAEA) was informed by Hungarian authorities that 131I was released from the Institute of Isotopes Ltd. in Budapest, Hungary. It was reported that a total activity of 342 GBq of 131I was emitted between 8 September and 16 November 2011. In this study, we use the ambient concentration measurements of 131I to determine the location of the release as well as its magnitude and temporal variation. As the location of the release and an estimate of the source strength became eventually known, this accident represents a realistic test case for inversion models. For our source reconstruction, we use no prior knowledge. Instead, we estimate the source location and emission variation using only the available 131I measurements. Subsequently, we use the partial information about the source term available from the Hungarian authorities for validation of our results. For the source determination, we first perform backward runs of atmospheric transport models and obtain source-receptor sensitivity (SRS) matrices for each grid cell of our study domain. We use two dispersion models, FLEXPART and Hysplit, driven with meteorological analysis data from the global forecast system (GFS) and from European Centre for Medium-range Weather Forecasts (ECMWF) weather forecast models. Second, we use a recently developed inverse method, least-squares with adaptive prior covariance (LS-APC), to determine the 131I emissions and their temporal variation from the measurements and computed SRS matrices. For each grid cell of our simulation domain, we evaluate the probability that the release was generated in that cell using Bayesian model selection. The model selection procedure also provides information about the most suitable dispersion model for the source term reconstruction. Third, we select the most probable location of the release with its associated source term and perform a forward model simulation to study the consequences of the iodine release. Results of these procedures are compared with the known release location and reported information about its time variation. We find that our algorithm could successfully locate the actual release site. The estimated release period is also in agreement with the values reported by IAEA and the reported total released activity of 342 GBq is within the 99 % confidence interval of the posterior distribution of our most likely model.
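
    A highly simplified sketch of the grid-search part of such an inversion (not the paper's LS-APC method): for each candidate source cell, fit non-negative emissions to the observations through that cell's source-receptor sensitivity (SRS) matrix and score the cell by residual misfit:

      import numpy as np
      from scipy.optimize import nnls

      def locate_source(observations: np.ndarray, srs_by_cell: dict) -> dict:
          """srs_by_cell maps cell id -> (n_obs, n_times) SRS matrix."""
          scores = {}
          for cell, M in srs_by_cell.items():
              emissions, residual = nnls(M, observations)  # y ~ M @ x with x >= 0
              scores[cell] = (residual, emissions)
          best = min(scores, key=lambda c: scores[c][0])   # lowest misfit wins
          return {"best_cell": best, "emissions": scores[best][1]}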

  12. Postglacial trends of hillslope development in two glacially formed mountain valleys in western Norway

    NASA Astrophysics Data System (ADS)

    Laute, K.; Beylich, A. A.

    2012-04-01

    Although rockfall talus slopes occur in all regions where rock weathering products accumulate beneath rock faces and cliffs, they are particularly common in glacially formed mountain landscapes. The retreat of glacier ice from glaciated valleys, whose rock slopes have probably been oversteepened by glacial erosion, causes paraglacial destabilization of the valley sidewalls related to stress relief, unloading, frost weathering and/or degradation of mountain permafrost. Large areas of the Norwegian fjord landscapes are occupied by hillslopes that are still shaped by the glacial inheritance of the Last Glacial Maximum (LGM). This study focuses on postglacial trends of hillslope development in two glacially formed mountain valleys in western Norway (Erdalen and Bødalen). The research is part of a doctoral thesis integrated in the Norwegian Research Council (NFR)-funded SedyMONT-Norway project within the ESF TOPO-EUROPE SedyMONT (Timescales of sediment dynamics, climate and topographic change in mountain landscapes) Programme. The main aspects addressed in this study are (i) the spatio-temporal variability of denudative slope processes over the Holocene and (ii) the postglacial modification of the glacial relief. The applied process-based approach includes detailed geomorphological field mapping combined with terrestrial laser scanning (LIDAR) of slope deposits to identify possible deposition processes and their spatial variability, relative dating techniques (tree rings and lichens) to analyse subrecent temporal variations, detailed surface mapping with additional geophysical subsurface investigations to estimate regolith thicknesses, and CIR- and orthophoto delineation combined with GIS and DEM computing to calculate estimates of average valley-wide rockwall retreat rates. Results show Holocene rockwall retreat rates for the two valleys that are in a range comparable with estimates of rockwall retreat rates in other cold mountain environments worldwide. Furthermore, the results indicate probably higher accumulation rates of slope deposits, mainly through enhanced rockfall activity shortly after glacier retreat (at about 10,000 yr BP), as compared with subrecent and contemporary rates. The overall tendency of landscape development is a postglacial modification of the glacially defined U-shaped valley morphometry (valley widening) through rockwall retreat and the connected accumulation of debris material beneath the rockwalls. Active fluvial removal of material at the base of slopes is almost negligible due to very limited hillslope-channel coupling in both valleys. So far, the glacially sculpted relief has not adapted to the denudative surface processes occurring under recent environmental conditions.

  13. The Probabilities of Unique Events

    PubMed Central

    Khemlani, Sangeet S.; Lotstein, Max; Johnson-Laird, Phil

    2012-01-01

    Many theorists argue that the probabilities of unique events, even real possibilities such as President Obama's re-election, are meaningless. As a consequence, psychologists have seldom investigated them. We propose a new theory (implemented in a computer program) in which such estimates depend on an intuitive non-numerical system capable only of simple procedures, and a deliberative system that maps intuitions into numbers. The theory predicts that estimates of the probabilities of conjunctions should often tend to split the difference between the probabilities of the two conjuncts. We report two experiments showing that individuals commit such violations of the probability calculus, and corroborating other predictions of the theory, e.g., individuals err in the same way even when they make non-numerical verbal estimates, such as that an event is highly improbable. PMID:23056224
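
    The paper's signature prediction contrasts two one-line rules: the probability calculus bounds P(A and B) by min(P(A), P(B)), while intuitive estimates often split the difference between the conjuncts:

      def rational_conjunction_bound(p_a: float, p_b: float) -> float:
          return min(p_a, p_b)        # upper bound from the probability calculus

      def split_difference_estimate(p_a: float, p_b: float) -> float:
          return (p_a + p_b) / 2.0    # the intuitive pattern the theory predicts

      # With P(A) = 0.9 and P(B) = 0.2, the midpoint 0.55 violates the 0.2 bound.
      print(rational_conjunction_bound(0.9, 0.2), split_difference_estimate(0.9, 0.2))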

  14. Statistical surrogate models for prediction of high-consequence climate change.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Constantine, Paul; Field, Richard V., Jr.; Boslough, Mark Bruce Elrick

    2011-09-01

    In safety engineering, performance metrics are defined using probabilistic risk assessments focused on the low-probability, high-consequence tail of the distribution of possible events, as opposed to best estimates based on central tendencies. We frame the climate change problem and its associated risks in a similar manner. To properly explore the tails of the distribution requires extensive sampling, which is not possible with existing coupled atmospheric models due to the high computational cost of each simulation. We therefore propose the use of specialized statistical surrogate models (SSMs) for the purpose of exploring the probability law of various climate variables of interest. An SSM differs from a deterministic surrogate model in that it represents each climate variable of interest as a space/time random field. The SSM can be calibrated to available spatial and temporal data from existing climate databases, e.g., the Program for Climate Model Diagnosis and Intercomparison (PCMDI), or to a collection of outputs from a General Circulation Model (GCM), e.g., the Community Earth System Model (CESM) and its predecessors. Because of its reduced size and complexity, the realization of a large number of independent model outputs from an SSM becomes computationally straightforward, so that quantifying the risk associated with low-probability, high-consequence climate events becomes feasible. A Bayesian framework is developed to provide quantitative measures of confidence, via Bayesian credible intervals, in the use of the proposed approach to assess these risks.
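
    Once a surrogate can generate cheap independent realizations, the tail-risk piece of this workflow is straightforward; a sketch using a Beta posterior (uniform prior) for the exceedance probability, with the surrogate sampler left abstract:

      import numpy as np
      from scipy.stats import beta

      def tail_risk_credible_interval(samples: np.ndarray, threshold: float,
                                      level: float = 0.95):
          """Estimate P(variable > threshold) with a Bayesian credible interval."""
          k = int((samples > threshold).sum())      # exceedances among n draws
          n = samples.size
          post = beta(k + 1, n - k + 1)             # Beta(1, 1) prior
          lo, hi = post.ppf((1 - level) / 2), post.ppf(1 - (1 - level) / 2)
          return k / n, (lo, hi)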

  15. Real-time prediction of rain-triggered lahars: incorporating seasonality and catchment recovery

    NASA Astrophysics Data System (ADS)

    Jones, Robbie; Manville, Vern; Peakall, Jeff; Froude, Melanie J.; Odbert, Henry M.

    2017-12-01

    Rain-triggered lahars are a significant secondary hydrological and geomorphic hazard at volcanoes where unconsolidated pyroclastic material produced by explosive eruptions is exposed to intense rainfall, often occurring for years to decades after the initial eruptive activity. Previous studies have shown that secondary lahar initiation is a function of rainfall parameters, source material characteristics and time since eruptive activity. In this study, probabilistic rain-triggered lahar forecasting models are developed using the lahar occurrence and rainfall record of the Belham River valley at the Soufrière Hills volcano (SHV), Montserrat, collected between April 2010 and April 2012. In addition to the use of peak rainfall intensity (PRI) as a base forecasting parameter, considerations for the effects of rainfall seasonality and catchment evolution upon the initiation of rain-triggered lahars and the predictability of lahar generation are also incorporated into these models. Lahar probability increases with peak 1 h rainfall intensity throughout the 2-year dataset and is higher under given rainfall conditions in year 1 than year 2. The probability of lahars is also enhanced during the wet season, when large-scale synoptic weather systems (including tropical cyclones) are more common and antecedent rainfall and thus levels of deposit saturation are typically increased. The incorporation of antecedent conditions and catchment evolution into logistic-regression-based rain-triggered lahar probability estimation models is shown to enhance model performance and displays the potential for successful real-time prediction of lahars, even in areas featuring strongly seasonal climates and temporal catchment recovery.
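
    A sketch of a logistic-regression lahar model of the kind described, with illustrative covariates (peak 1 h intensity, an antecedent-rainfall proxy, and a wet-season flag) and toy data rather than the Belham valley record:

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      # Columns: peak 1 h intensity (mm/h), 7-day antecedent rain (mm), wet-season flag
      X = np.array([[5, 10, 0], [22, 60, 1], [8, 15, 0], [30, 90, 1],
                    [12, 40, 1], [3, 5, 0], [25, 70, 1], [6, 20, 0]], dtype=float)
      y = np.array([0, 1, 0, 1, 0, 0, 1, 0])  # lahar observed in the window?

      model = LogisticRegression().fit(X, y)
      print(model.predict_proba([[20.0, 55.0, 1.0]])[0, 1])  # P(lahar) for a new storm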

  16. Neural Mechanisms Underlying Risk and Ambiguity Attitudes.

    PubMed

    Blankenstein, Neeltje E; Peper, Jiska S; Crone, Eveline A; van Duijvenvoorde, Anna C K

    2017-11-01

    Individual differences in attitudes to risk (a taste for risk, known probabilities) and ambiguity (a tolerance for uncertainty, unknown probabilities) differentially influence risky decision-making. However, it is not well understood whether risk and ambiguity are coded differently within individuals. Here, we tested whether individual differences in risk and ambiguity attitudes were reflected in distinct neural correlates during choice and outcome processing of risky and ambiguous gambles. To these ends, we developed a neuroimaging task in which participants (n = 50) chose between a sure gain and a gamble, which was either risky or ambiguous, and presented decision outcomes (gains, no gains). From a separate task in which the amount, probability, and ambiguity level were varied, we estimated individuals' risk and ambiguity attitudes. Although there was pronounced neural overlap between risky and ambiguous gambling in a network typically related to decision-making under uncertainty, relatively more risk-seeking attitudes were associated with increased activation in valuation regions of the brain (medial and lateral OFC), whereas relatively more ambiguity-seeking attitudes were related to temporal cortex activation. In addition, although striatum activation was observed during reward processing irrespective of a prior risky or ambiguous gamble, reward processing after an ambiguous gamble resulted in enhanced dorsomedial PFC activation, possibly functioning as a general signal of uncertainty coding. These findings suggest that different neural mechanisms reflect individual differences in risk and ambiguity attitudes and that risk and ambiguity may impact overt risk-taking behavior in different ways.

  17. Temporal Lobe Reactions After Radiotherapy With Carbon Ions: Incidence and Estimation of the Relative Biological Effectiveness by the Local Effect Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schlampp, Ingmar; Karger, Christian P.; Jaekel, Oliver

    2011-07-01

    Purpose: To identify predictors for the development of temporal lobe reactions (TLR) after carbon ion radiation therapy (RT) for radiation-resistant tumors in the central nervous system and to evaluate the predictions of the local effect model (LEM) used for calculation of the biologically effective dose. Methods and Materials: This retrospective study reports the TLR rates in patients with skull base chordomas and chondrosarcomas irradiated with carbon ions at GSI, Darmstadt, Germany, in the years 2002 and 2003. Calculation of the relative biological effectiveness and dose optimization of treatment plans were performed on the basis of the LEM. Clinical examinations and magnetic resonance imaging (MRI) were performed at 3, 6, and 12 months after RT and annually thereafter. Local contrast medium enhancement in temporal lobes, as detected on MRI, was regarded as radiation-induced TLR. Dose-volume histograms of 118 temporal lobes in 59 patients were analyzed, and 16 therapy-associated and 2 patient-associated factors were statistically evaluated for their predictive value for the occurrence of TLR. Results: Median follow-up was 2.5 years (range, 0.3-6.6 years). Age and the maximum dose applied to at least 1 cm³ of the temporal lobe (D_max,V-1cm³, the maximum dose in the temporal lobe volume remaining after exclusion of the 1 cm³ receiving the highest dose) were found to be the most important predictors for TLR. Dose-response curves of D_max,V-1cm³ were calculated. The biologically equivalent tolerance doses for the 5% and 50% probabilities of developing TLR were 68.8 ± 3.3 Gy equivalents (GyE) and 87.3 ± 2.8 GyE, respectively. Conclusions: D_max,V-1cm³ is predictive for radiation-induced TLR. The tolerance doses obtained seem to be consistent with published data for highly conformal photon and proton irradiations. We could not detect any clinically relevant deviations between clinical findings and expectations based on predictions of the LEM.
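
    The reported tolerance doses correspond to reading a fitted dose-response curve at 5% and 50% probability; a sketch with a logistic curve parameterized by TD50 and a slope (the slope here is a placeholder chosen only so that the 5% dose roughly reproduces the reported 68.8 GyE):

      import math

      def tlr_probability(dose_gye: float, td50: float = 87.3, slope: float = 0.16) -> float:
          """Logistic dose-response: P = 1 / (1 + exp(-slope * (D - TD50)))."""
          return 1.0 / (1.0 + math.exp(-slope * (dose_gye - td50)))

      def tolerance_dose(p: float, td50: float = 87.3, slope: float = 0.16) -> float:
          """Invert the curve: dose at which the TLR probability equals p."""
          return td50 + math.log(p / (1.0 - p)) / slope

      print(tolerance_dose(0.05), tolerance_dose(0.50))  # ~TD5 and TD50 in GyE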

  18. Red-shouldered hawk occupancy surveys in central Minnesota, USA

    USGS Publications Warehouse

    Henneman, C.; McLeod, M.A.; Andersen, D.E.

    2007-01-01

    Forest-dwelling raptors are often difficult to detect because many species occur at low density or are secretive. Broadcasting conspecific vocalizations can increase the probability of detecting forest-dwelling raptors and has been shown to be an effective method for locating raptors and assessing their relative abundance. Recent advances in statistical techniques based on presence-absence data use probabilistic arguments to derive probability of detection when it is <1 and to provide a model and likelihood-based method for estimating proportion of sites occupied. We used these maximum-likelihood models with data from red-shouldered hawk (Buteo lineatus) call-broadcast surveys conducted in central Minnesota, USA, in 1994-1995 and 2004-2005. Our objectives were to obtain estimates of occupancy and detection probability 1) over multiple sampling seasons (yr), 2) incorporating within-season time-specific detection probabilities, 3) with call type and breeding stage included as covariates in models of probability of detection, and 4) with different sampling strategies. We visited individual survey locations 2-9 times per year, and estimates of both probability of detection (range = 0.28-0.54) and site occupancy (range = 0.81-0.97) varied among years. Detection probability was affected by inclusion of a within-season time-specific covariate, call type, and breeding stage. In 2004 and 2005 we used survey results to assess the effect that number of sample locations, double sampling, and discontinued sampling had on parameter estimates. We found that estimates of probability of detection and proportion of sites occupied were similar across different sampling strategies, and we suggest ways to reduce sampling effort in a monitoring program.
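
    The likelihood behind such presence-absence models is compact enough to sketch: with occupancy psi and per-visit detection probability p, a site with d detections over K visits contributes psi * p^d * (1-p)^(K-d), and an all-zero history contributes psi * (1-p)^K + (1-psi). A minimal maximizer with toy detection histories:

      import numpy as np
      from scipy.optimize import minimize

      def neg_log_lik(theta, histories):
          psi, p = 1 / (1 + np.exp(-theta))          # logit scale -> (0, 1)
          ll = 0.0
          for h in histories:                        # h: list of 0/1 detections
              k, d = len(h), int(sum(h))
              if d > 0:
                  ll += np.log(psi) + d * np.log(p) + (k - d) * np.log(1 - p)
              else:
                  ll += np.log(psi * (1 - p) ** k + (1 - psi))
          return -ll

      histories = [[1, 0, 1], [0, 0, 0], [0, 1, 0], [0, 0, 0], [1, 1, 0]]  # toy
      fit = minimize(neg_log_lik, x0=np.zeros(2), args=(histories,))
      print(1 / (1 + np.exp(-fit.x)))                # [psi_hat, p_hat]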

  19. The effects of vent location, event scale and time forecasts on pyroclastic density current hazard maps at Campi Flegrei caldera (Italy)

    NASA Astrophysics Data System (ADS)

    Bevilacqua, Andrea; Neri, Augusto; Bisson, Marina; Esposti Ongaro, Tomaso; Flandoli, Franco; Isaia, Roberto; Rosi, Mauro; Vitale, Stefano

    2017-09-01

    This study presents a new method for producing long-term hazard maps for pyroclastic density currents (PDC) originating at Campi Flegrei caldera. The method is based on a doubly stochastic approach and combines uncertainty assessments of the spatial location of the volcanic vent, the size of the flow, and the expected time of such an event. The results are obtained using a Monte Carlo approach and adopting a simplified invasion model based on the box model integral approximation. Temporal assessments are modelled through a Cox-type process including self-excitement effects, based on the eruptive record of the last 15 kyr. Mean and percentile maps of PDC invasion probability are produced, exploring their sensitivity to some sources of uncertainty and to the effects of the dependence between PDC scales and the caldera sector where they originated. Conditional maps representative of PDC originating inside limited zones of the caldera, or of PDC with a limited range of scales, are also produced. Finally, the effect of assuming different time windows for the hazard estimates is explored, including the potential occurrence of a sequence of multiple events. Assuming that the last eruption of Monte Nuovo (A.D. 1538) marked the beginning of a new epoch of activity similar to the previous ones, results of the statistical analysis indicate a mean probability of PDC invasion above 5% in the next 50 years over almost the entire caldera (with a probability peak of 25% in the central part of the caldera). In contrast, probability values are reduced by a factor of about 3 if the entire eruptive record of the last 15 kyr is considered, i.e., including both eruptive epochs and quiescent periods.
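
    The Monte Carlo structure can be sketched schematically; the vent and scale distributions and the circular scale-to-runout rule below are placeholders standing in for the paper's calibrated inputs and box-model propagation:

      import numpy as np

      rng = np.random.default_rng(42)
      GRID = np.zeros((100, 100))                        # invasion counts per cell
      CELLS = np.indices(GRID.shape).transpose(1, 2, 0)  # (i, j) coordinates
      N = 10_000

      for _ in range(N):
          vent = rng.uniform(20, 80, size=2)     # sampled vent location (cell units)
          volume = rng.lognormal(mean=0.0, sigma=1.0)
          runout = 10.0 * volume ** (1 / 3)      # assumed scale -> runout relation
          dist = np.linalg.norm(CELLS - vent, axis=2)
          GRID += dist <= runout                 # cells invaded by this event

      invasion_probability = GRID / N            # mean probability map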

  20. Water level dynamics in wetlands and nesting success of Black Terns in Maine

    USGS Publications Warehouse

    Gilbert, A.T.; Servello, F.A.

    2005-01-01

    The Black Tern (Chlidonias niger) nests in freshwater wetlands that are prone to water level fluctuations, and nest losses to flooding are common. We examined temporal patterns in water levels at six sites with Black Tern colonies in Maine and determined probabilities of flood events and associated nest loss at Douglas Pond, the location of the largest breeding colony. Daily precipitation data from weather stations and water flow data from a flow gauge below Douglas Pond were obtained for 1960-1999. Information on nest losses from three floods at Douglas Pond in 1997-1999 was used to characterize small (6% nest loss), medium (56% nest loss) and large (94% nest loss) flood events, and we calculated the probabilities of these three levels of flooding occurring at Douglas Pond using historic water level data. Water levels generally decreased gradually during the nesting season at colony sites, except at Douglas Pond where water levels fluctuated substantially in response to rain events. Annual probabilities of small, medium, and large flood events were 68%, 35%, and 13% for nests initiated during 23 May-12 July, with similar probabilities for early (23 May-12 June) and late (13 June-12 July) periods. An index of potential nest loss indicated that medium floods at Douglas Pond had the greatest potential effect on nest success because they occurred relatively frequently and inundated large proportions of nests. Nest losses at other colonies were estimated to be approximately 30% of those at Douglas Pond. Nest losses to flooding appear to be common for the Black Tern in Maine and related to spring precipitation patterns, but ultimate effects on breeding productivity are uncertain.

  1. Rare Event Simulation in Radiation Transport

    NASA Astrophysics Data System (ADS)

    Kollman, Craig

    This dissertation studies methods for estimating extremely small probabilities by Monte Carlo simulation. Problems in radiation transport typically involve estimating very rare events or the expected value of a random variable which is with overwhelming probability equal to zero. These problems often have high dimensional state spaces and irregular geometries so that analytic solutions are not possible. Monte Carlo simulation must be used to estimate the radiation dosage being transported to a particular location. If the area is well shielded the probability of any one particular particle getting through is very small. Because of the large number of particles involved, even a tiny fraction penetrating the shield may represent an unacceptable level of radiation. It therefore becomes critical to be able to accurately estimate this extremely small probability. Importance sampling is a well known technique for improving the efficiency of rare event calculations. Here, a new set of probabilities is used in the simulation runs. The results are multiplied by the likelihood ratio between the true and simulated probabilities so as to keep our estimator unbiased. The variance of the resulting estimator is very sensitive to which new set of transition probabilities are chosen. It is shown that a zero variance estimator does exist, but that its computation requires exact knowledge of the solution. A simple random walk with an associated killing model for the scatter of neutrons is introduced. Large deviation results for optimal importance sampling in random walks are extended to the case where killing is present. An adaptive "learning" algorithm for implementing importance sampling is given for more general Markov chain models of neutron scatter. For finite state spaces this algorithm is shown to give, with probability one, a sequence of estimates converging exponentially fast to the true solution. In the final chapter, an attempt to generalize this algorithm to a continuous state space is made. This involves partitioning the space into a finite number of cells. There is a tradeoff between additional computation per iteration and variance reduction per iteration that arises in determining the optimal grid size. All versions of this algorithm can be thought of as a compromise between deterministic and Monte Carlo methods, capturing advantages of both techniques.
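
    The core identity the dissertation builds on fits in a few lines: sample from a tilted density g, and multiply by the likelihood ratio f/g to keep the rare-event estimator unbiased. A sketch for P(X > a) with X ~ Exp(1):

      import numpy as np

      rng = np.random.default_rng(7)
      a, n = 20.0, 100_000                  # true answer: exp(-20) ~ 2.06e-9

      lam = 1.0 / a                         # proposal Exp(lam), pushing mass into the tail
      x = rng.exponential(scale=1.0 / lam, size=n)
      weights = np.exp(-x) / (lam * np.exp(-lam * x))   # likelihood ratio f(x)/g(x)
      estimate = np.mean((x > a) * weights)
      print(estimate, np.exp(-a))           # agree to within Monte Carlo error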

  2. Probabilistic Assessment of Cancer Risk from Solar Particle Events

    NASA Astrophysics Data System (ADS)

    Kim, Myung-Hee Y.; Cucinotta, Francis A.

    For long duration missions outside of the protection of the Earth's magnetic field, space radiation presents significant health risks including cancer mortality. Space radiation consists of solar particle events (SPEs), comprised largely of medium energy protons (less than several hundred MeV); and galactic cosmic rays (GCR), which include high energy protons and heavy ions. While the frequency distribution of SPEs depends strongly upon the phase within the solar activity cycle, the individual SPE occurrences themselves are random in nature. We estimated the probability of SPE occurrence using a non-homogeneous Poisson model to fit the historical database of proton measurements. Distributions of particle fluences of SPEs for a specified mission period were simulated ranging from the 5th to the 95th percentile to assess the cancer risk distribution. Spectral variability of SPEs was also examined, because the detailed energy spectra of protons are important especially at high energy levels for assessing the cancer risk associated with energetic particles for large events. We estimated the overall cumulative probability of GCR environment for a specified mission period using a solar modulation model for the temporal characterization of the GCR environment represented by the deceleration potential (φ). Probabilistic assessment of cancer fatal risk was calculated for various periods of lunar and Mars missions. This probabilistic approach to risk assessment from space radiation is in support of mission design and operational planning for future manned space exploration missions. In future work, this probabilistic approach to the space radiation will be combined with a probabilistic approach to the radiobiological factors that contribute to the uncertainties in projecting cancer risks.
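
    Simulating occurrence times from a non-homogeneous Poisson process is commonly done by thinning (Lewis-Shedler); a sketch with a toy solar-cycle-like intensity, not the fitted model:

      import math
      import numpy as np

      def simulate_nhpp(intensity, t_end: float, lam_max: float,
                        rng=np.random.default_rng(3)):
          """Accept candidates from a rate-lam_max homogeneous process with
          probability intensity(t) / lam_max."""
          t, events = 0.0, []
          while True:
              t += rng.exponential(1.0 / lam_max)
              if t > t_end:
                  return np.array(events)
              if rng.uniform() < intensity(t) / lam_max:
                  events.append(t)

      cycle = lambda t: 2.0 + 1.8 * math.sin(2 * math.pi * t / 11.0)  # events/yr
      print(simulate_nhpp(cycle, t_end=11.0, lam_max=3.8).round(2))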

  3. Probabilistic Assessment of Cancer Risk from Solar Particle Events

    NASA Technical Reports Server (NTRS)

    Kim, Myung-Hee Y.; Cucinotta, Francis A.

    2010-01-01

    For long duration missions outside of the protection of the Earth's magnetic field, space radiation presents significant health risks including cancer mortality. Space radiation consists of solar particle events (SPEs), comprised largely of medium energy protons (less than several hundred MeV); and galactic cosmic rays (GCR), which include high energy protons and heavy ions. While the frequency distribution of SPEs depends strongly upon the phase within the solar activity cycle, the individual SPE occurrences themselves are random in nature. We estimated the probability of SPE occurrence using a non-homogeneous Poisson model to fit the historical database of proton measurements. Distributions of particle fluences of SPEs for a specified mission period were simulated ranging from the 5th to the 95th percentile to assess the cancer risk distribution. Spectral variability of SPEs was also examined, because the detailed energy spectra of protons are important especially at high energy levels for assessing the cancer risk associated with energetic particles for large events. We estimated the overall cumulative probability of GCR environment for a specified mission period using a solar modulation model for the temporal characterization of the GCR environment represented by the deceleration potential (φ). Probabilistic assessment of cancer fatal risk was calculated for various periods of lunar and Mars missions. This probabilistic approach to risk assessment from space radiation is in support of mission design and operational planning for future manned space exploration missions. In future work, this probabilistic approach to the space radiation will be combined with a probabilistic approach to the radiobiological factors that contribute to the uncertainties in projecting cancer risks.

  4. Population density approach for discrete mRNA distributions in generalized switching models for stochastic gene expression.

    PubMed

    Stinchcombe, Adam R; Peskin, Charles S; Tranchina, Daniel

    2012-06-01

    We present a generalization of a population density approach for modeling and analysis of stochastic gene expression. In the model, the gene of interest fluctuates stochastically between an inactive state, in which transcription cannot occur, and an active state, in which discrete transcription events occur; the individual mRNA molecules are degraded stochastically in an independent manner. This sort of model, in its simplest form with exponential dwell times, has been used to explain experimental estimates of the discrete distribution of random mRNA copy number. In our generalization, the random dwell times in the inactive and active states, T_{0} and T_{1}, respectively, are independent random variables drawn from any specified distributions. Consequently, the probability per unit time of switching out of a state depends on the time since entering that state. Our method exploits a connection between the fully discrete random process and a related continuous process. We present numerical methods for computing steady-state mRNA distributions and an analytical derivation of the mRNA autocovariance function. We find that empirical estimates of the steady-state mRNA probability mass function from Monte Carlo simulations of laboratory data do not allow one to distinguish between underlying models with exponential and nonexponential dwell times in some relevant parameter regimes. However, in these parameter regimes and where the autocovariance function has negative lobes, the autocovariance function disambiguates the two types of models. Our results strongly suggest that temporal data beyond the autocovariance function are required in general to characterize gene switching.
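
    The switching model described above lends itself to a simple Monte Carlo check. The sketch below simulates mRNA copy number for a two-state gene with gamma-distributed (nonexponential) dwell times; the rates and dwell-time distributions are assumptions chosen for illustration, not the authors' parameter values.

      import numpy as np

      rng = np.random.default_rng(7)
      beta, delta = 20.0, 1.0                   # transcription, degradation rates (assumed)
      draw_off = lambda: rng.gamma(2.0, 1.0)    # inactive dwell T0 (assumed nonexponential)
      draw_on = lambda: rng.gamma(2.0, 0.5)     # active dwell T1

      def mrna_at_time(T):
          """One realization: mRNA copy number at time T."""
          t, active, births = 0.0, False, []
          while t < T:
              dwell = draw_on() if active else draw_off()
              if active:
                  # Transcription events form a Poisson process within the ON window.
                  window = min(dwell, T - t)
                  n = rng.poisson(beta * window)
                  births.extend(t + rng.uniform(0.0, window, n))
              t += dwell
              active = not active
          # Each transcript survives to T independently with prob exp(-delta * age).
          births = np.asarray(births)
          return int(np.sum(rng.random(births.size) < np.exp(-delta * (T - births))))

      samples = np.array([mrna_at_time(30.0) for _ in range(5000)])
      pmf = np.bincount(samples) / samples.size   # empirical steady-state mRNA distribution
      print("mean copy number:", samples.mean())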

  5. Methods for estimating drought streamflow probabilities for Virginia streams

    USGS Publications Warehouse

    Austin, Samuel H.

    2014-01-01

    Maximum likelihood logistic regression model equations used to estimate drought flow probabilities for Virginia streams are presented for 259 hydrologic basins in Virginia. Winter streamflows were used to estimate the likelihood of streamflows during the subsequent drought-prone summer months. The maximum likelihood logistic regression models identify probable streamflows from 5 to 8 months in advance. More than 5 million daily streamflow values collected between January 1, 1900 and May 16, 2012 were compiled and analyzed, with individual periods of record ranging from 10 to 112 years. The analysis yielded 46,704 equations with statistically significant fit statistics and parameter ranges, published in two tables in this report. These model equations produce summer-month (July, August, and September) drought flow threshold probabilities as a function of streamflows during the previous winter months (November, December, January, and February). Example calculations demonstrate how to use the equations to estimate probable streamflows as much as 8 months in advance.
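
    A minimal sketch of the kind of maximum likelihood logistic regression fit described above, using invented data in place of the Virginia streamflow records: the model relates winter streamflow to the probability that summer flow falls below a drought threshold.

      import numpy as np
      from scipy.special import expit
      from scipy.optimize import minimize

      # Synthetic illustration: winter mean flow (log units) vs. an indicator of
      # summer flow dropping below a drought threshold. Values are invented.
      rng = np.random.default_rng(0)
      winter = rng.normal(4.0, 0.8, 200)            # log winter streamflow
      p_true = expit(3.0 - 1.2 * winter)            # assumed true relation
      drought = rng.random(200) < p_true            # 1 = summer drought flow

      def nll(theta):
          """Negative log-likelihood of the logistic model."""
          p = np.clip(expit(theta[0] + theta[1] * winter), 1e-12, 1 - 1e-12)
          return -np.sum(np.where(drought, np.log(p), np.log1p(-p)))

      b0, b1 = minimize(nll, x0=[0.0, 0.0]).x       # maximum likelihood fit
      print(f"P(summer drought | winter log-flow = 3.5) = {expit(b0 + b1 * 3.5):.2f}")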

  6. Calibrated fMRI in the Medial Temporal Lobe During a Memory Encoding Task

    PubMed Central

    Restom, Khaled; Perthen, Joanna E.; Liu, Thomas T.

    2008-01-01

    Prior measures of the blood oxygenation level dependent (BOLD) and cerebral blood flow (CBF) responses to a memory encoding task within the medial temporal lobe have suggested that the coupling between functional changes in CBF and changes in the cerebral metabolic rate of oxygen (CMRO2) may be tighter in the medial temporal lobe than in the primary sensory areas. In this study, we used a calibrated functional magnetic resonance imaging (fMRI) approach to directly estimate memory-encoding-related changes in CMRO2 and to assess the coupling between CBF and CMRO2 in the medial temporal lobe. The CBF-CMRO2 coupling ratio was estimated using a linear fit to the flow and metabolism changes observed across subjects. In addition, we examined the effect of region-of-interest (ROI) selection on the estimates. In response to the memory encoding task, CMRO2 increased by 23.1 ± 8.8% to 25.3 ± 5.7% (depending upon ROI), with an estimated CBF-CMRO2 coupling ratio of 1.66 ± 0.07 to 1.75 ± 0.16. There was no significant effect of ROI selection on either the CMRO2 or the coupling ratio estimates. The observed coupling ratios were significantly lower than the values (2 to 4.5) reported in previous calibrated fMRI studies of the visual and motor cortices. In addition, the estimated coupling ratio was found to be less sensitive to the calibration procedure for functional responses in the medial temporal lobe than in the primary sensory areas. PMID:18329291

  7. Temporal Coherence: A Model for Non-Stationarity in Natural and Simulated Wind Records

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rinker, Jennifer M.; Gavin, Henri P.; Clifton, Andrew

    We present a novel methodology for characterizing and simulating non-stationary stochastic wind records. In this new method, non-stationarity is characterized and modelled via temporal coherence, which is quantified in the discrete frequency domain by probability distributions of the differences in phase between adjacent Fourier components. Temporal coherence can also be used to quantify non-stationary characteristics in wind data. Three case studies are presented that analyze the non-stationarity of turbulent wind data obtained at the National Wind Technology Center near Boulder, Colorado, USA. The first study compares the temporal and spectral characteristics of a stationary wind record and a non-stationary wind record in order to highlight their differences in temporal coherence. The second study examines the distribution of one of the proposed temporal coherence parameters and uses it to quantify the prevalence of non-stationarity in the dataset. The third study examines how temporal coherence varies with a range of atmospheric parameters to determine what conditions produce more non-stationarity.
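
    The phase-difference quantification described above can be sketched in a few lines: compute the Fourier phases of a record, difference adjacent components, and measure how concentrated the wrapped differences are. The records below are synthetic stand-ins, not NWTC data.

      import numpy as np

      def adjacent_phase_differences(x):
          """Phase differences between adjacent Fourier components of a record.

          For a stationary record the wrapped differences are near-uniform; a
          concentrated distribution indicates temporal coherence (non-stationarity).
          """
          phases = np.angle(np.fft.rfft(x))
          dphi = np.diff(phases)
          return np.angle(np.exp(1j * dphi))    # wrap to (-pi, pi]

      rng = np.random.default_rng(3)
      stationary = rng.normal(size=4096)
      # A localized gust-like burst makes the record non-stationary.
      burst = stationary * np.exp(-((np.arange(4096) - 2048) / 400.0) ** 2)

      for name, rec in [("stationary", stationary), ("burst", burst)]:
          dphi = adjacent_phase_differences(rec)
          # Mean resultant length: ~0 for uniform phases, ~1 for aligned phases.
          print(name, "phase concentration:", abs(np.mean(np.exp(1j * dphi))))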

  8. Spatio-temporal interpolation of precipitation during monsoon periods in Pakistan

    NASA Astrophysics Data System (ADS)

    Hussain, Ijaz; Spöck, Gunter; Pilz, Jürgen; Yu, Hwa-Lung

    2010-08-01

    Spatio-temporal estimation of precipitation over a region is essential to the modeling of hydrologic processes for water resources management. The changing magnitude and space-time heterogeneity of rainfall observations make space-time estimation of precipitation a challenging task. In this paper we propose a Box-Cox-transformed hierarchical Bayesian multivariate spatio-temporal interpolation method for the skewed response variable. The proposed method is applied to estimate space-time monthly precipitation in the monsoon periods during 1974-2000, using 27 years of monthly precipitation data from 51 stations in Pakistan. The results of the transformed hierarchical Bayesian multivariate spatio-temporal interpolation are compared to those of non-transformed hierarchical Bayesian interpolation using cross-validation. The software developed by [11] is used for Bayesian non-stationary multivariate space-time interpolation. It is observed that the transformed hierarchical Bayesian method provides more accuracy than the non-transformed hierarchical Bayesian method.
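
    As a sketch of the transformation step only (the hierarchical Bayesian interpolation itself does not fit in a few lines), the following applies a Box-Cox transform to a skewed synthetic precipitation sample and back-transforms a prediction; the data are invented.

      import numpy as np
      from scipy import stats

      # Skewed synthetic "monthly precipitation" sample (values invented).
      rng = np.random.default_rng(2)
      precip = rng.gamma(shape=1.5, scale=40.0, size=300)   # mm/month

      # Box-Cox transform toward symmetry before Gaussian spatio-temporal modeling.
      z, lam = stats.boxcox(precip)
      print("estimated lambda:", lam,
            " skew before/after:", stats.skew(precip), stats.skew(z))

      # Back-transform a prediction on the transformed scale to precipitation units.
      z_pred = z.mean()
      precip_pred = (np.power(lam * z_pred + 1.0, 1.0 / lam) if lam != 0
                     else np.exp(z_pred))
      print("back-transformed prediction (mm/month):", precip_pred)

    Naively back-transforming a mean on the transformed scale incurs bias, which is one reason the cross-validation comparison in the record above matters.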

  9. A traditionally administered short course failed to improve medical students' diagnostic performance. A quantitative evaluation of diagnostic thinking.

    PubMed

    Noguchi, Yoshinori; Matsui, Kunihiko; Imura, Hiroshi; Kiyota, Masatomo; Fukui, Tsuguya

    2004-05-01

    Quite often medical students or novice residents have difficulty ruling out diseases even when those diseases are quite unlikely and, because of this difficulty, unnecessarily repeat laboratory or imaging tests. We explored whether a carefully designed short training course teaching Bayesian probabilistic thinking improves the diagnostic ability of medical students. Ninety students at 2 medical schools were presented with clinical scenarios of coronary artery disease corresponding to high, low, and intermediate pretest probabilities. The students' estimates of the test characteristics of the exercise stress test, and of the pretest and posttest probability for each scenario, were evaluated before and after the short course. The students' pretest probability estimates, as well as their proficiency in applying Bayes's theorem, improved in the high pretest probability scenario after the short course. However, estimates of pretest probability in the low pretest probability scenario, and proficiency in applying Bayes's theorem in the intermediate and low pretest probability scenarios, showed essentially no improvement. A carefully designed, but traditionally administered, short course could not improve the students' ability to estimate pretest probability in a low pretest probability setting, and consequently students remained incompetent at ruling out disease. We need to develop educational methods that cultivate a well-balanced clinical sense, enabling students to choose a suitable diagnostic strategy as needed in a clinical setting without being one-sided toward the "rule-in conscious paradigm."
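
    The calculation the course aimed to teach can be made concrete with the odds-likelihood form of Bayes's theorem. The sketch below computes posttest probabilities for high, intermediate, and low pretest scenarios; the exercise stress test sensitivity and specificity used here (0.68 and 0.77) are illustrative assumptions, not values from the study.

      def posttest_probability(pretest, sensitivity, specificity, positive=True):
          """Apply Bayes's theorem in odds-likelihood-ratio form."""
          lr = (sensitivity / (1 - specificity) if positive
                else (1 - sensitivity) / specificity)
          pre_odds = pretest / (1 - pretest)
          post_odds = pre_odds * lr
          return post_odds / (1 + post_odds)

      # Assumed exercise stress test characteristics: sens 0.68, spec 0.77.
      for pretest in (0.90, 0.50, 0.10):    # high, intermediate, low scenarios
          pos = posttest_probability(pretest, 0.68, 0.77, True)
          neg = posttest_probability(pretest, 0.68, 0.77, False)
          print(f"pretest {pretest:.2f}: positive -> {pos:.2f}, negative -> {neg:.2f}")

    The low-pretest row illustrates the rule-out logic the students struggled with: even a positive test leaves the probability modest, and a negative test drives it very low.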

  10. Validation of Satellite Precipitation (trmm 3B43) in Ecuadorian Coastal Plains, Andean Highlands and Amazonian Rainforest

    NASA Astrophysics Data System (ADS)

    Ballari, D.; Castro, E.; Campozano, L.

    2016-06-01

    Precipitation monitoring is of utmost importance for water resource management. However, in regions of complex terrain such as Ecuador, the high spatio-temporal variability of precipitation and the scarcity of rain gauges make it difficult to obtain accurate estimates of precipitation. Remotely sensed precipitation estimates, such as the Multi-satellite Precipitation Analysis TRMM, can cope with this problem after a validation process that must be representative in space and time. In this work we validate monthly estimates from TRMM 3B43 satellite precipitation (0.25° x 0.25° resolution) using ground data from 14 rain gauges in Ecuador. The stations are located in the 3 most differentiated regions of the country: the Pacific coastal plains, the Andean highlands, and the Amazon rainforest. Time series of imagery and rain gauge data between 1998 and 2010 were compared using statistical error metrics such as bias, root mean square error, and Pearson correlation, and detection indexes such as probability of detection, equitable threat score, false alarm rate, and frequency bias index. The results showed that precipitation seasonality is well represented and that TRMM 3B43 acceptably estimates monthly precipitation in the three regions of the country. According to both the statistical error metrics and the detection indexes, the coastal and Amazon regions are estimated better than the Andean highlands. Additionally, estimates were found to be better for light precipitation rates. The present validation of TRMM 3B43 provides important results to support further studies on calibration and bias correction of precipitation in ungauged watershed basins.
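
    The detection indexes named above are simple functions of a 2x2 rain/no-rain contingency table. The sketch below computes them from invented counts; note that "false alarm rate" is computed here as the false alarm ratio, one of two common conventions.

      def detection_indexes(hits, misses, false_alarms, correct_negatives):
          """Standard 2x2 verification scores for event detection."""
          n = hits + misses + false_alarms + correct_negatives
          pod = hits / (hits + misses)                    # probability of detection
          far = false_alarms / (hits + false_alarms)      # false alarm ratio
          bias = (hits + false_alarms) / (hits + misses)  # frequency bias index
          hits_random = (hits + misses) * (hits + false_alarms) / n
          ets = (hits - hits_random) / (hits + misses + false_alarms - hits_random)
          return pod, far, bias, ets

      # Invented monthly counts for one gauge-pixel pair, threshold = 1 mm/day.
      pod, far, bias, ets = detection_indexes(hits=95, misses=15,
                                              false_alarms=20, correct_negatives=26)
      print(f"POD={pod:.2f} FAR={far:.2f} bias={bias:.2f} ETS={ets:.2f}")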

  11. Aro: a machine learning approach to identifying single molecules and estimating classification error in fluorescence microscopy images.

    PubMed

    Wu, Allison Chia-Yi; Rifkin, Scott A

    2015-03-27

    Recent techniques for tagging and visualizing single molecules in fixed or living organisms and cell lines have been revolutionizing our understanding of the spatial and temporal dynamics of fundamental biological processes. However, fluorescence microscopy images are often noisy, and it can be difficult to distinguish a fluorescently labeled single molecule from background speckle. We present a computational pipeline to distinguish the true signal of fluorescently labeled molecules from background fluorescence and noise. We test our technique using the challenging case of wide-field, epifluorescence microscope image stacks from single molecule fluorescence in situ experiments on nematode embryos, where there can be substantial out-of-focus light and structured noise. The software recognizes and classifies individual mRNA spots by measuring several features of local intensity maxima and classifying them with a supervised random forest classifier. A key innovation of this software is that, by estimating in a statistically principled way the probability that each local maximum is a true spot, it makes it possible to estimate the error introduced by image classification. This can be used to assess the quality of the data and to estimate a confidence interval for the molecule count estimate, all of which are important for quantitative interpretations of the results of single-molecule experiments. The software classifies spots in these images well, achieving >95% AUROC on realistic artificial data, and outperforms other commonly used techniques on challenging real data. Its interval estimates provide a unique measure of the quality of an image and of confidence in the classification.

  12. Calibrating random forests for probability estimation.

    PubMed

    Dankowski, Theresa; Ziegler, Andreas

    2016-09-30

    Probabilities can be consistently estimated using random forests. It is, however, unclear how random forests should be updated to make predictions for other centers or at different time points. In this work, we present two approaches for updating random forests for probability estimation. The first method has been proposed by Elkan and may be used for updating any machine learning approach yielding consistent probabilities, so-called probability machines. The second approach is a new strategy specifically developed for random forests. Using the terminal nodes, which represent conditional probabilities, the random forest is first translated to logistic regression models. These are, in turn, used for re-calibration. The two updating strategies were compared in a simulation study and are illustrated with data from the German Stroke Study Collaboration. In most simulation scenarios, both methods led to similar improvements. In the simulation scenario in which the stricter assumptions of Elkan's method were not met, the logistic regression-based re-calibration approach for random forests outperformed Elkan's method. It also performed better on the stroke data than Elkan's method. The strength of Elkan's method is its general applicability to any probability machine. However, if the strict assumptions underlying this approach are not met, the logistic regression-based approach is preferable for updating random forests for probability estimation. © 2016 The Authors. Statistics in Medicine published by John Wiley & Sons Ltd.
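
    A generic sketch of logistic-regression-based re-calibration for a random forest, in the spirit of the second approach above. The authors work from the terminal nodes; this sketch simply refits a logistic model on the forest's predicted log-odds using data from the new center, and all data and parameters are synthetic.

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(5)

      def make_center(n, shift):
          """Synthetic center data; `shift` moves the event rate between centers."""
          X = rng.normal(size=(n, 4))
          logit = X[:, 0] + 0.5 * X[:, 1] + shift
          return X, rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

      X_dev, y_dev = make_center(2000, 0.0)    # development center
      X_new, y_new = make_center(500, 1.0)     # new center, higher event rate

      forest = RandomForestClassifier(n_estimators=300, random_state=0).fit(X_dev, y_dev)

      # Re-calibration: logistic regression on the forest's predicted log-odds,
      # fit with data from the new center.
      p = np.clip(forest.predict_proba(X_new)[:, 1], 1e-4, 1 - 1e-4)
      log_odds = np.log(p / (1 - p)).reshape(-1, 1)
      recal = LogisticRegression().fit(log_odds, y_new)
      p_updated = recal.predict_proba(log_odds)[:, 1]

      print("event rate at new center:", y_new.mean())
      print("mean forest probability:", p.mean(), "-> mean updated:", p_updated.mean())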

  13. Multiple-Parameter Estimation Method Based on Spatio-Temporal 2-D Processing for Bistatic MIMO Radar

    PubMed Central

    Yang, Shouguo; Li, Yong; Zhang, Kunhui; Tang, Weiping

    2015-01-01

    A novel spatio-temporal 2-dimensional (2-D) processing method that can jointly estimate the transmitting-receiving azimuth and Doppler frequency for bistatic multiple-input multiple-output (MIMO) radar in the presence of spatial colored noise and an unknown number of targets is proposed. In the temporal domain, the cross-correlation of the matched filters' outputs for different time-delay sampling is used to eliminate the spatial colored noise. In the spatial domain, the proposed method uses a diagonal loading method and subspace theory to estimate the direction of departure (DOD) and direction of arrival (DOA), and the Doppler frequency can then be accurately estimated through the estimation of the DOD and DOA. By skipping target number estimation and the eigenvalue decomposition (EVD) of the data covariance matrix, and requiring only a one-dimensional search, the proposed method achieves low computational complexity. Furthermore, the proposed method is suitable for bistatic MIMO radar with an arbitrary transmitting and receiving geometrical configuration. The correctness and efficiency of the proposed method are verified by computer simulation results. PMID:26694385

  14. Multiple-Parameter Estimation Method Based on Spatio-Temporal 2-D Processing for Bistatic MIMO Radar.

    PubMed

    Yang, Shouguo; Li, Yong; Zhang, Kunhui; Tang, Weiping

    2015-12-14

    A novel spatio-temporal 2-dimensional (2-D) processing method that can jointly estimate the transmitting-receiving azimuth and Doppler frequency for bistatic multiple-input multiple-output (MIMO) radar in the presence of spatial colored noise and an unknown number of targets is proposed. In the temporal domain, the cross-correlation of the matched filters' outputs for different time-delay sampling is used to eliminate the spatial colored noise. In the spatial domain, the proposed method uses a diagonal loading method and subspace theory to estimate the direction of departure (DOD) and direction of arrival (DOA), and the Doppler frequency can then be accurately estimated through the estimation of the DOD and DOA. By skipping target number estimation and the eigenvalue decomposition (EVD) of the data covariance matrix, and requiring only a one-dimensional search, the proposed method achieves low computational complexity. Furthermore, the proposed method is suitable for bistatic MIMO radar with an arbitrary transmitting and receiving geometrical configuration. The correctness and efficiency of the proposed method are verified by computer simulation results.

  15. Soil Erosion as a stochastic process

    NASA Astrophysics Data System (ADS)

    Casper, Markus C.

    2015-04-01

    The main tools for estimating the risk and amount of erosion are different types of soil erosion models: on the one hand, there are empirically based model concepts; on the other hand, there are more physically based or process-based models. However, both types of models have substantial weak points. All empirical model concepts are only capable of providing rough estimates over larger temporal and spatial scales, and they do not account for many driving factors that are in the scope of scenario-related analysis. The physically based models, in turn, contain important empirical parts, and hence the demand for universality and transferability is not met. As a common feature, we find that all models rely on parameters and input variables which are, to a certain extent, spatially and temporally averaged. A central question is whether the apparent heterogeneity of soil properties or the random nature of driving forces needs to be better considered in our modelling concepts. Traditionally, researchers have attempted to remove spatial and temporal variability through homogenization. However, homogenization has been achieved through physical manipulation of the system or by statistical averaging procedures. The price for obtaining these homogenized (average) model concepts of soils and soil-related processes has often been a failure to recognize the profound importance of heterogeneity in many of the properties and processes that we study. Soil infiltrability and erosion resistance (also called "critical shear stress" or "critical stream power") are the most important empirical factors of physically based erosion models. The erosion resistance is theoretically a substrate-specific parameter, but in reality the threshold where soil erosion begins is determined experimentally. Soil infiltrability is often calculated with empirical relationships (e.g., based on grain size distribution) and, consequently, needs to be corrected experimentally to better fit reality. To overcome this disadvantage of our current models, soil erosion models are needed that can directly use stochastic variables and parameter distributions. There are only a few minor approaches in this direction; the most advanced is the model "STOSEM" proposed by Sidorchuk in 2005. In this model, only a small part of the soil erosion process is described: aggregate detachment and aggregate transport by flowing water. The concept is highly simplified; for example, many parameters are temporally invariant. The main problem, however, is that our existing measurements and experiments are not geared to provide stochastic parameters (e.g., as probability density functions); in the best case they deliver a statistical validation of the mean values. Again, we get effective parameters, spatially and temporally averaged. There is an urgent need for laboratory and field experiments on overland flow structure, raindrop effects, and erosion rate which deliver information on the spatial and temporal structure of soil and surface properties and processes.

  16. Dynamic Encoding of Speech Sequence Probability in Human Temporal Cortex

    PubMed Central

    Leonard, Matthew K.; Bouchard, Kristofer E.; Tang, Claire

    2015-01-01

    Sensory processing involves identification of stimulus features, but also integration with the surrounding sensory and cognitive context. Previous work in animals and humans has shown fine-scale sensitivity to context in the form of learned knowledge about the statistics of the sensory environment, including relative probabilities of discrete units in a stream of sequential auditory input. These statistics are a defining characteristic of one of the most important sequential signals humans encounter: speech. For speech, extensive exposure to a language tunes listeners to the statistics of sound sequences. To address how speech sequence statistics are neurally encoded, we used high-resolution direct cortical recordings from human lateral superior temporal cortex as subjects listened to words and nonwords with varying transition probabilities between sound segments. In addition to their sensitivity to acoustic features (including contextual features, such as coarticulation), we found that neural responses dynamically encoded the language-level probability of both preceding and upcoming speech sounds. Transition probability first negatively modulated neural responses, followed by positive modulation of neural responses, consistent with coordinated predictive and retrospective recognition processes, respectively. Furthermore, transition probability encoding was different for real English words compared with nonwords, providing evidence for online interactions with high-order linguistic knowledge. These results demonstrate that sensory processing of deeply learned stimuli involves integrating physical stimulus features with their contextual sequential structure. Despite not being consciously aware of phoneme sequence statistics, listeners use this information to process spoken input and to link low-level acoustic representations with linguistic information about word identity and meaning. PMID:25948269

  17. Estimating the empirical probability of submarine landslide occurrence

    USGS Publications Warehouse

    Geist, Eric L.; Parsons, Thomas E.; Mosher, David C.; Shipp, Craig; Moscardelli, Lorena; Chaytor, Jason D.; Baxter, Christopher D. P.; Lee, Homa J.; Urgeles, Roger

    2010-01-01

    The empirical probability for the occurrence of submarine landslides at a given location can be estimated from age dates of past landslides. In this study, tools developed to estimate earthquake probability from paleoseismic horizons are adapted to estimate submarine landslide probability. In both types of estimates, one has to account for the uncertainty associated with age-dating individual events as well as the open time intervals before and after the observed sequence of landslides. For observed sequences of submarine landslides, we typically only have the age date of the youngest event and possibly of a seismic horizon that lies below the oldest event in a landslide sequence. We use an empirical Bayes analysis based on the Poisson-Gamma conjugate prior model specifically applied to the landslide probability problem. This model assumes that landslide events as imaged in geophysical data are independent and occur in time according to a Poisson distribution characterized by a rate parameter λ. With this method, we are able to estimate the most likely value of λ and, importantly, the range of uncertainty in this estimate. Examples considered include landslide sequences observed in the Santa Barbara Channel, California, and in Port Valdez, Alaska. We confirm that, given the uncertainties of age dating, landslide complexes can be treated as single events by performing a statistical test on age dates representing the main failure episode of the Holocene Storegga landslide complex.
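
    The Poisson-Gamma conjugate model above has a closed-form update, sketched below with invented hyperparameters and event counts: a Gamma(α, β) prior on λ combined with n dated landslides in a record of length T gives a Gamma(α + n, β + T) posterior, and the probability of at least one event in the next interval t, marginalized over λ, is 1 − (β/(β + t))^α with the updated parameters.

      from scipy import stats

      # Gamma prior on the landslide rate lambda (per kyr); hyperparameters invented.
      alpha0, beta0 = 1.0, 1.0           # shape, rate
      n_events, record_kyr = 6, 12.0     # hypothetical dated landslides in a 12-kyr record

      # Conjugate update: posterior is Gamma(alpha0 + n, beta0 + T).
      alpha, beta = alpha0 + n_events, beta0 + record_kyr
      post = stats.gamma(a=alpha, scale=1.0 / beta)
      print("posterior mean rate:", post.mean(), "events/kyr")
      print("95% credible interval:", post.interval(0.95))

      # P(>=1 event in the next t kyr), marginalized over lambda:
      # E[1 - exp(-lambda t)] under the Gamma posterior = 1 - (beta/(beta+t))**alpha.
      t = 1.0
      print("P(>=1 landslide in next 1 kyr):", 1 - (beta / (beta + t)) ** alpha)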

  18. Estimating the probability of rare events: addressing zero failure data.

    PubMed

    Quigley, John; Revie, Matthew

    2011-07-01

    Traditional statistical procedures for estimating the probability of an event result in an estimate of zero when no events are realized. Alternative inferential procedures have been proposed for the situation where zero events have been realized, but often these are ad hoc, relying on methods selected according to the data that have been realized. Such data-dependent inference decisions violate fundamental statistical principles, resulting in estimation procedures whose benefits are difficult to assess. In this article, we propose estimating the probability of an event occurring through minimax inference on the probability that future samples of equal size realize no more events than that in the data on which the inference is based. Although motivated by inference on rare events, the method is not restricted to zero-event data and closely approximates the maximum likelihood estimate (MLE) for nonzero data. The minimax procedure provides a risk-averse inferential procedure where no events are realized. A comparison is made with the MLE, and regions of the underlying probability are identified where this approach is superior. Moreover, a comparison is made with three standard approaches to supporting inference where no event data are realized, which we argue are unduly pessimistic. We show that for situations of zero events the estimator can be simply approximated by 1/(2.5n), where n is the number of trials. © 2011 Society for Risk Analysis.
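
    The closing approximation is easy to tabulate. The sketch below compares the degenerate MLE, the article's 1/(2.5n) approximation, and the familiar rule-of-three 95% upper bound for a few sample sizes.

      def zero_failure_estimates(n_trials):
          """Point estimates of event probability when 0 events occur in n trials."""
          return {
              "MLE": 0.0,                                 # degenerate at zero
              "minimax approx": 1 / (2.5 * n_trials),     # approximation from the article
              "rule of three (95% upper bound)": 3 / n_trials,
          }

      for n in (10, 100, 1000):
          print(n, zero_failure_estimates(n))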

  19. Multifractals embedded in short time series: An unbiased estimation of probability moment

    NASA Astrophysics Data System (ADS)

    Qiu, Lu; Yang, Tianguang; Yin, Yanhua; Gu, Changgui; Yang, Huijie

    2016-12-01

    An exact estimation of probability moments is the basis for several essential concepts, such as multifractals, the Tsallis entropy, and the transfer entropy. By means of approximation theory we propose a new method called factorial-moment-based estimation of probability moments. Theoretical prediction and computational results show that it provides an unbiased estimation of probability moments of continuous order. Calculations on a probability redistribution model verify that it can extract multifractal behaviors exactly from several hundred recordings. Its power in monitoring the evolution of scaling behaviors is exemplified by two empirical cases, i.e., the gait time series for fast, normal, and slow trials of a healthy volunteer, and the closing price series of the Shanghai stock market. Using short time series with lengths of several hundred, a comparison with well-established tools displays significant performance advantages over the other methods. The factorial-moment-based estimation can evaluate correctly the scaling behaviors in a scale range about three generations wider than the multifractal detrended fluctuation analysis and the basic estimation. The estimation of the partition function given by the wavelet transform modulus maxima has unacceptable fluctuations. Besides the scaling invariance focused on in the present paper, the proposed factorial moment of continuous order can find various uses, such as finding nonextensive behaviors of a complex system and reconstructing the causality relationship network between elements of a complex system.
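
    For integer orders, the identity underlying factorial-moment estimation is that the ratio of falling factorials of box counts is an unbiased estimator of the probability moment. The sketch below verifies this by simulation for q = 2; the continuous-order extension that is the paper's actual contribution is not reproduced here.

      import numpy as np

      def falling(x, q):
          """Falling factorial x (x-1) ... (x-q+1), elementwise."""
          out = np.ones_like(x, dtype=float)
          for j in range(q):
              out *= (x - j)
          return out

      rng = np.random.default_rng(11)
      p = np.array([0.5, 0.3, 0.2])      # true box probabilities (assumed)
      N, q = 400, 2                      # sample size, integer moment order

      # Unbiased estimator of sum_i p_i^q from multinomial counts:
      # sum_i n_i(n_i-1).../(N(N-1)...) -- the factorial-moment identity.
      est = [np.sum(falling(rng.multinomial(N, p), q)) / falling(np.array([N]), q)[0]
             for _ in range(20000)]
      print("mean estimate:", np.mean(est), " true sum p^q:", np.sum(p ** q))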

  20. Methods to assess performance of models estimating risk of death in intensive care patients: a review.

    PubMed

    Cook, D A

    2006-04-01

    Models that estimate the probability of death of intensive care unit patients can be used to stratify patients according to the severity of their condition and to control for casemix and severity of illness. These models have been used for risk adjustment in quality monitoring, administration, management, and research, and as an aid to clinical decision making. Models such as the Mortality Prediction Model family, SAPS II, APACHE II, APACHE III, and the organ system failure models provide estimates of the probability of in-hospital death of ICU patients. This review examines methods to assess the performance of these models. The key attributes of a model are discrimination (the accuracy of the ranking in order of probability of death) and calibration (the extent to which the model's prediction of probability of death reflects the true risk of death). These attributes should be assessed in existing models that predict the probability of patient mortality, and in any subsequent model developed for the purposes of estimating these probabilities. The literature contains a range of approaches to assessment, which are reviewed, and a survey of the methodologies used in studies of intensive care mortality models is presented. The systematic approach used by the Standards for Reporting Diagnostic Accuracy provides a framework to incorporate these theoretical considerations of model assessment, and recommendations are made for evaluating and presenting the performance of models that estimate the probability of death of intensive care patients.
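
    The two key attributes can each be computed in a few lines, sketched below on synthetic predictions: discrimination via the area under the ROC curve, and calibration via the decile comparison of predicted and observed mortality that underlies the Hosmer-Lemeshow test.

      import numpy as np
      from sklearn.metrics import roc_auc_score

      rng = np.random.default_rng(9)
      p_pred = rng.beta(2, 5, 1000)              # model's predicted death risks
      died = rng.random(1000) < p_pred * 0.8     # outcomes; model over-predicts by design

      # Discrimination: can the model rank non-survivors above survivors?
      print("AUROC:", roc_auc_score(died, p_pred))

      # Calibration: compare observed vs. predicted mortality within risk deciles
      # (the comparison underlying the Hosmer-Lemeshow test).
      edges = np.quantile(p_pred, np.linspace(0, 1, 11))
      groups = np.digitize(p_pred, edges[1:-1])  # 10 risk groups
      for g in range(10):
          m = groups == g
          print(f"decile {g}: predicted {p_pred[m].mean():.2f}  "
                f"observed {died[m].mean():.2f}")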
