Sample records for conditional probability hypothesis

  1. Students' Understanding of Conditional Probability on Entering University

    ERIC Educational Resources Information Center

    Reaburn, Robyn

    2013-01-01

    An understanding of conditional probability is essential for students of inferential statistics as it is used in Null Hypothesis Tests. Conditional probability is also used in Bayes' theorem, in the interpretation of medical screening tests and in quality control procedures. This study examines the understanding of conditional probability of…

  2. A test of the substitution-habitat hypothesis in amphibians.

    PubMed

    Martínez-Abraín, Alejandro; Galán, Pedro

    2018-06-01

    Most examples that support the substitution-habitat hypothesis (human-made habitats act as substitutes for original habitat) deal with birds and mammals. We tested this hypothesis in 14 amphibian species by using percentage occupancy as a proxy of habitat quality (i.e., higher occupancy percentages indicate higher quality). We classified water body types as original habitat (no or little human influence), substitution habitat, or refuge habitat, depending on the anatomical, behavioral, or physiological adaptations of each amphibian species. Ten species had relatively high probabilities (0.16-0.28) of occurrence in original habitats, moderate probabilities of occurrence in substitution habitats (0.11-0.14), and low probabilities of occurrence in refuge habitats (0.05-0.08). Thus, the substitution-habitat hypothesis only partially applies to amphibians because the low occupancy of refuges could be due to the negligible human persecution of this group (indicating good conservation status). However, low occupancy of refuges could also be due to low tolerance of refuge conditions, which could have led to selective extinction or to colonization problems caused by poor dispersal capabilities. That original habitats had the highest probabilities of occupancy suggests amphibians have a good conservation status in the region. They also appeared highly adaptable to anthropogenic substitution habitats. © 2017 Society for Conservation Biology.

  3. Tier-Adjacency Is Not a Necessary Condition for Learning Phonotactic Dependencies

    ERIC Educational Resources Information Center

    Koo, Hahn; Callahan, Lydia

    2012-01-01

    One hypothesis raised by Newport and Aslin to explain how speakers learn dependencies between nonadjacent phonemes is that speakers track bigram probabilities between two segments that are adjacent to each other within a tier of their own. The hypothesis predicts that a dependency between segments separated from each other at the tier level cannot…

  4. Hypothesis testing and earthquake prediction.

    PubMed

    Jackson, D D

    1996-04-30

    Requirements for testing include advance specification of the conditional rate density (probability per unit time, area, and magnitude) or, alternatively, probabilities for specified intervals of time, space, and magnitude. Here I consider testing fully specified hypotheses, with no parameter adjustments or arbitrary decisions allowed during the test period. Because it may take decades to validate prediction methods, it is worthwhile to formulate testable hypotheses carefully in advance. Earthquake prediction generally implies that the probability will be temporarily higher than normal. Such a statement requires knowledge of "normal behavior"--that is, it requires a null hypothesis. Hypotheses can be tested in three ways: (i) by comparing the number of actual earthquakes to the number predicted, (ii) by comparing the likelihood score of actual earthquakes to the predicted distribution, and (iii) by comparing the likelihood ratio to that of a null hypothesis. The first two tests are purely self-consistency tests, while the third is a direct comparison of two hypotheses. Predictions made without a statement of probability are very difficult to test, and any test must be based on the ratio of earthquakes in and out of the forecast regions.
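    The first of the three tests, comparing the number of actual earthquakes to the number predicted, is commonly carried out against a Poisson null for the event count. A minimal sketch of such a "number test" (the Poisson form, the forecast rate, and the observed count are illustrative assumptions, not values from this paper):

```python
import math

def poisson_pmf(k, lam):
    # P(N = k) for a Poisson-distributed count with expected value lam
    return math.exp(-lam) * lam ** k / math.factorial(k)

def number_test(observed, predicted):
    # Tail probabilities of the observed count under the fully specified
    # forecast: evidence of over-prediction (too few events) and of
    # under-prediction (too many events), respectively
    p_too_few = sum(poisson_pmf(k, predicted) for k in range(observed + 1))
    p_too_many = 1.0 - sum(poisson_pmf(k, predicted) for k in range(observed))
    return p_too_few, p_too_many

# Hypothetical forecast of 7.5 events over the test period, 12 observed
p_low, p_high = number_test(12, 7.5)
print(p_low, p_high)  # a small p_high would suggest the forecast under-predicted
```

    Either tail being small rejects the forecast as inconsistent with the observed seismicity; this is a self-consistency test in the paper's sense, since no competing hypothesis is involved.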

  5. Hypothesis testing and earthquake prediction.

    PubMed Central

    Jackson, D D

    1996-01-01

    Requirements for testing include advance specification of the conditional rate density (probability per unit time, area, and magnitude) or, alternatively, probabilities for specified intervals of time, space, and magnitude. Here I consider testing fully specified hypotheses, with no parameter adjustments or arbitrary decisions allowed during the test period. Because it may take decades to validate prediction methods, it is worthwhile to formulate testable hypotheses carefully in advance. Earthquake prediction generally implies that the probability will be temporarily higher than normal. Such a statement requires knowledge of "normal behavior"--that is, it requires a null hypothesis. Hypotheses can be tested in three ways: (i) by comparing the number of actual earthquakes to the number predicted, (ii) by comparing the likelihood score of actual earthquakes to the predicted distribution, and (iii) by comparing the likelihood ratio to that of a null hypothesis. The first two tests are purely self-consistency tests, while the third is a direct comparison of two hypotheses. Predictions made without a statement of probability are very difficult to test, and any test must be based on the ratio of earthquakes in and out of the forecast regions. PMID:11607663

  6. A comparator-hypothesis account of biased contingency detection.

    PubMed

    Vadillo, Miguel A; Barberia, Itxaso

    2018-02-12

    Our ability to detect statistical dependencies between different events in the environment is strongly biased by the number of coincidences between them. Even when there is no true covariation between a cue and an outcome, if the marginal probability of either of them is high, people tend to perceive some degree of statistical contingency between both events. The present paper explores the ability of the Comparator Hypothesis to explain the general pattern of results observed in this literature. Our simulations show that this model can account for the biasing effects of the marginal probabilities of cues and outcomes. Furthermore, the overall fit of the Comparator Hypothesis to a sample of experimental conditions from previous studies is comparable to that of the popular Rescorla-Wagner model. These results should encourage researchers to further explore and put to the test the predictions of the Comparator Hypothesis in the domain of biased contingency detection. Copyright © 2018 Elsevier B.V. All rights reserved.

  7. Probability in reasoning: a developmental test on conditionals.

    PubMed

    Barrouillet, Pierre; Gauffroy, Caroline

    2015-04-01

    Probabilistic theories have been claimed to constitute a new paradigm for the psychology of reasoning. A key assumption of these theories is captured by what they call the Equation, the hypothesis that the meaning of the conditional is probabilistic in nature and that the probability of If p then q is the conditional probability, in such a way that P(if p then q)=P(q|p). Using the probabilistic truth-table task in which participants are required to evaluate the probability of If p then q sentences, the present study explored the pervasiveness of the Equation across ages (from early adolescence to adulthood), types of conditionals (basic, causal, and inducements), and contents. The results reveal that the Equation is a late developmental achievement endorsed only by a narrow majority of educated adults for certain types of conditionals, depending on the content they involve. Age-related changes in evaluating the probability of all the conditionals studied closely mirror the development of truth-value judgements observed in previous studies with traditional truth-table tasks. We argue that our modified mental model theory can account for this development, and hence for the findings from the probability task, which consequently do not support the probabilistic approach to human reasoning over alternative theories. Copyright © 2014 Elsevier B.V. All rights reserved.
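    The Equation tested in this record can be made concrete with a toy frequency table (the counts and the card-based conditional below are illustrative assumptions, not the study's stimuli): the probability assigned to "If p then q" is computed as P(q|p) = P(p and q)/P(p), ignoring the cases in which p is false.

```python
from fractions import Fraction

# Illustrative truth-table counts for a conditional such as
# "If the card is red (p), then it is a face card (q)"
counts = {
    ("p", "q"): 6,        # p true,  q true
    ("p", "not q"): 20,   # p true,  q false
    ("not p", "q"): 6,    # p false, q true
    ("not p", "not q"): 20,
}

total = sum(counts.values())
p_and_q = Fraction(counts[("p", "q")], total)
p_true = Fraction(counts[("p", "q")] + counts[("p", "not q")], total)

# The Equation: P(if p then q) = P(q | p) = P(p and q) / P(p)
p_if_then = p_and_q / p_true
print(p_if_then)  # 3/13
```

    Participants who instead judge the conditional by the full truth table, or by the conjunction P(p and q), would give systematically different answers from this ratio, which is what the developmental comparison exploits.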

  8. Learning Problem-Solving Rules as Search Through a Hypothesis Space.

    PubMed

    Lee, Hee Seung; Betts, Shawn; Anderson, John R

    2016-07-01

    Learning to solve a class of problems can be characterized as a search through a space of hypotheses about the rules for solving these problems. A series of four experiments studied how different learning conditions affected the search among hypotheses about the solution rule for a simple computational problem. Experiment 1 showed that a problem property such as computational difficulty of the rules biased the search process and so affected learning. Experiment 2 examined the impact of examples as instructional tools and found that their effectiveness was determined by whether they uniquely pointed to the correct rule. Experiment 3 compared verbal directions with examples and found that both could guide search. The final experiment tried to improve learning by using more explicit verbal directions or by adding scaffolding to the example. While both manipulations improved learning, learning still took the form of a search through a hypothesis space of possible rules. We describe a model that embodies two assumptions: (1) the instruction can bias the rules participants hypothesize rather than directly be encoded into a rule; (2) participants do not have memory for past wrong hypotheses and are likely to retry them. These assumptions are realized in a Markov model that fits all the data by estimating two sets of probabilities. First, the learning condition induced one set of Start probabilities of trying various rules. Second, should this first hypothesis prove wrong, the learning condition induced a second set of Choice probabilities of considering various rules. These findings broaden our understanding of effective instruction and provide implications for instructional design. Copyright © 2015 Cognitive Science Society, Inc.

  9. A mechanism producing power law etc. distributions

    NASA Astrophysics Data System (ADS)

    Li, Heling; Shen, Hongjun; Yang, Bin

    2017-07-01

    Power-law distributions play an increasingly important role in the study of complex systems. Motivated by the intractability of complex systems, the idea of incomplete statistics is adopted and extended: three different exponential factors are introduced into the normalization condition, the statistical average, and the Shannon entropy, and probability distribution functions of exponential form, of power-law form, and of the product form between a power function and an exponential function are derived from the Shannon entropy and the maximum entropy principle. It is thus shown that the maximum entropy principle can entirely replace the equal-probability hypothesis. Because the power-law distribution and the product-form distribution, which cannot be derived from the equal-probability hypothesis, can be derived with the aid of the maximum entropy principle, we conclude that the maximum entropy principle is a more basic principle: it embodies broader concepts and reveals the laws governing the motion of objects more fundamentally. At the same time, this principle also reveals the intrinsic links between nature, the different objects of human society, and the principles they all obey.

  10. Bayes factor and posterior probability: Complementary statistical evidence to p-value.

    PubMed

    Lin, Ruitao; Yin, Guosheng

    2015-09-01

    As a convention, a p-value is often computed in hypothesis testing and compared with the nominal level of 0.05 to determine whether to reject the null hypothesis. Although a smaller p-value indicates a more significant test, it is difficult to interpret the p-value on a probability scale and to quantify it as the strength of the data against the null hypothesis. In contrast, the Bayesian posterior probability of the null hypothesis has an explicit interpretation of how strongly the data support the null. We make a comparison of the p-value and the posterior probability by considering a recent clinical trial. The results show that even when we reject the null hypothesis, there is still a substantial probability (around 20%) that the null is true. Not only should we examine whether the data would have rarely occurred under the null hypothesis, but we also need to know whether the data would be rare under the alternative. As a result, the p-value only provides one side of the information, for which the Bayes factor and posterior probability may offer complementary evidence. Copyright © 2015 Elsevier Inc. All rights reserved.
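    The contrast drawn in this record can be sketched numerically. The example below is an assumption-laden illustration, not the trial analysed in the paper: it converts a z-statistic into a two-sided p-value, and into a posterior probability of the null via Goodman's minimum Bayes factor exp(-z^2/2) with 50:50 prior odds assumed.

```python
import math

def two_sided_p(z):
    # Two-sided p-value for a standard-normal test statistic
    return math.erfc(abs(z) / math.sqrt(2))

def min_bayes_factor(z):
    # Goodman's minimum Bayes factor exp(-z^2/2): the strongest evidence
    # against the null attainable over simple normal alternatives
    return math.exp(-z * z / 2)

def posterior_null(z, prior_null=0.5):
    # Posterior odds = Bayes factor * prior odds; convert back to probability
    odds = min_bayes_factor(z) * prior_null / (1 - prior_null)
    return odds / (1 + odds)

z = 1.96  # "just significant" at the 0.05 level
print(round(two_sided_p(z), 3))     # 0.05
print(round(posterior_null(z), 2))  # 0.13: the null retains substantial probability
```

    Even this most favourable Bayes factor leaves the null with roughly a 13% posterior probability at p = 0.05, echoing the paper's point that a "significant" p-value does not make the null improbable.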

  11. Non-Bayesian Inference: Causal Structure Trumps Correlation

    ERIC Educational Resources Information Center

    Bes, Benedicte; Sloman, Steven; Lucas, Christopher G.; Raufaste, Eric

    2012-01-01

    The study tests the hypothesis that conditional probability judgments can be influenced by causal links between the target event and the evidence even when the statistical relations among variables are held constant. Three experiments varied the causal structure relating three variables and found that (a) the target event was perceived as more…

  12. Estimating the Probability of Traditional Copying, Conditional on Answer-Copying Statistics.

    PubMed

    Allen, Jeff; Ghattas, Andrew

    2016-06-01

    Statistics for detecting copying on multiple-choice tests produce p values measuring the probability of a value at least as large as that observed, under the null hypothesis of no copying. The posterior probability of copying is arguably more relevant than the p value, but cannot be derived from Bayes' theorem unless the population probability of copying and probability distribution of the answer-copying statistic under copying are known. In this article, the authors develop an estimator for the posterior probability of copying that is based on estimable quantities and can be used with any answer-copying statistic. The performance of the estimator is evaluated via simulation, and the authors demonstrate how to apply the formula using actual data. Potential uses, generalizability to other types of cheating, and limitations of the approach are discussed.

  13. A field test for differences in condition among trapped and shot mallards

    USGS Publications Warehouse

    Reinecke, K.J.; Shaiffer, C.W.

    1988-01-01

    We tested predictions from the condition bias hypothesis (Weatherhead and Greenwood 1981) regarding the effects of sampling methods on body weights of mallards (Anas platyrhynchos) at White River National Wildlife Refuge (WRNWR), Arkansas, during 24 November-8 December 1985. Body weights of 84 mallards caught with unbaited rocket nets in a natural wetland were used as experimental controls and compared to the body weights of 70 mallards captured with baited rocket nets, 86 mallards captured with baited swim-in traps, and 130 mallards killed by hunters. We found no differences (P > 0.27) in body weight among sampling methods, but body condition (wt/wing length) of the birds killed by hunters was less (P 0.75 for differences > 50 g. The condition bias hypothesis probably applies to ducks killed by hunters but not to trapping operations when substantial (> 20 at 1 time) numbers of birds are captured.

  14. Interactive effects of senescence and natural disturbance on the annual survival probabilities of snail kites

    USGS Publications Warehouse

    Reichert, Brian E.; Martin, J.; Kendall, William L.; Cattau, Christopher E.; Kitchens, Wiley M.

    2010-01-01

    Individuals in wild populations face risks associated with both intrinsic (i.e. aging) and external (i.e. environmental) sources of mortality. Condition-dependent mortality occurs when there is an interaction between such factors; however, few studies have clearly demonstrated condition-dependent mortality and some have even argued that condition-dependent mortality does not occur in wild avian populations. Using large sample sizes (2084 individuals, 3746 re-sights) of individual-based longitudinal data collected over a 33 year period (1976-2008) on multiple cohorts, we used a capture-mark-recapture framework to model age-dependent survival in the snail kite Rostrhamus sociabilis plumbeus population in Florida. Adding to the growing amount of evidence for actuarial senescence in wild populations, we found evidence of senescent declines in survival probabilities in adult kites. We also tested the hypothesis that older kites experienced condition-dependent mortality during a range-wide drought event (2000-2002). The results provide convincing evidence that the annual survival probability of senescent kites was disproportionately affected by the drought relative to the survival probability of prime-aged adults. To our knowledge, this is the first evidence of condition-dependent mortality to be demonstrated in a wild avian population, a finding which challenges recent conclusions drawn in the literature. Our study suggests that senescence and condition-dependent mortality can affect the demography of wild avian populations. Accounting for these sources of variation may be particularly important to appropriately compute estimates of population growth rate, and probabilities of quasi-extinctions.

  15. Précis of statistical significance: rationale, validity, and utility.

    PubMed

    Chow, S L

    1998-04-01

    The null-hypothesis significance-test procedure (NHSTP) is defended in the context of the theory-corroboration experiment, as well as the following contrasts: (a) substantive hypotheses versus statistical hypotheses, (b) theory corroboration versus statistical hypothesis testing, (c) theoretical inference versus statistical decision, (d) experiments versus nonexperimental studies, and (e) theory corroboration versus treatment assessment. The null hypothesis can be true because it is the hypothesis that errors are randomly distributed in data. Moreover, the null hypothesis is never used as a categorical proposition. Statistical significance means only that chance influences can be excluded as an explanation of data; it does not identify the nonchance factor responsible. The experimental conclusion is drawn with the inductive principle underlying the experimental design. A chain of deductive arguments gives rise to the theoretical conclusion via the experimental conclusion. The anomalous relationship between statistical significance and the effect size often used to criticize NHSTP is more apparent than real. The absolute size of the effect is not an index of evidential support for the substantive hypothesis. Nor is the effect size, by itself, informative as to the practical importance of the research result. Being a conditional probability, statistical power cannot be the a priori probability of statistical significance. The validity of statistical power is debatable because statistical significance is determined with a single sampling distribution of the test statistic based on H0, whereas it takes two distributions to represent statistical power or effect size. Sample size should not be determined in the mechanical manner envisaged in power analysis. It is inappropriate to criticize NHSTP for nonstatistical reasons. 
At the same time, neither effect size, nor confidence interval estimate, nor posterior probability can be used to exclude chance as an explanation of data. Neither can any of them fulfill the nonstatistical functions expected of them by critics.

  16. Unification of field theory and maximum entropy methods for learning probability densities

    NASA Astrophysics Data System (ADS)

    Kinney, Justin B.

    2015-09-01

    The need to estimate smooth probability distributions (a.k.a. probability densities) from finite sampled data is ubiquitous in science. Many approaches to this problem have been described, but none is yet regarded as providing a definitive solution. Maximum entropy estimation and Bayesian field theory are two such approaches. Both have origins in statistical physics, but the relationship between them has remained unclear. Here I unify these two methods by showing that every maximum entropy density estimate can be recovered in the infinite smoothness limit of an appropriate Bayesian field theory. I also show that Bayesian field theory estimation can be performed without imposing any boundary conditions on candidate densities, and that the infinite smoothness limit of these theories recovers the most common types of maximum entropy estimates. Bayesian field theory thus provides a natural test of the maximum entropy null hypothesis and, furthermore, returns an alternative (lower entropy) density estimate when the maximum entropy hypothesis is falsified. The computations necessary for this approach can be performed rapidly for one-dimensional data, and software for doing this is provided.

  17. Unification of field theory and maximum entropy methods for learning probability densities.

    PubMed

    Kinney, Justin B

    2015-09-01

    The need to estimate smooth probability distributions (a.k.a. probability densities) from finite sampled data is ubiquitous in science. Many approaches to this problem have been described, but none is yet regarded as providing a definitive solution. Maximum entropy estimation and Bayesian field theory are two such approaches. Both have origins in statistical physics, but the relationship between them has remained unclear. Here I unify these two methods by showing that every maximum entropy density estimate can be recovered in the infinite smoothness limit of an appropriate Bayesian field theory. I also show that Bayesian field theory estimation can be performed without imposing any boundary conditions on candidate densities, and that the infinite smoothness limit of these theories recovers the most common types of maximum entropy estimates. Bayesian field theory thus provides a natural test of the maximum entropy null hypothesis and, furthermore, returns an alternative (lower entropy) density estimate when the maximum entropy hypothesis is falsified. The computations necessary for this approach can be performed rapidly for one-dimensional data, and software for doing this is provided.
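    As a concrete instance of the maximum entropy estimates discussed here: on [0, ∞) with only the sample mean constrained, the maximum entropy density is the exponential density with rate 1/mean (a standard textbook result; the data below are made up for illustration and are not the paper's examples).

```python
import math

def maxent_exponential(sample):
    # Maximum entropy density on [0, inf) subject to a fixed mean m:
    # f(x) = (1/m) * exp(-x / m), i.e. the exponential distribution
    m = sum(sample) / len(sample)
    rate = 1.0 / m
    return lambda x: rate * math.exp(-rate * x)

data = [0.4, 1.1, 2.3, 0.8, 1.9]  # illustrative nonnegative sample, mean 1.3
f = maxent_exponential(data)
print(round(f(0.0), 3))  # density at 0 equals the rate 1/mean, about 0.769
```

    In the paper's framing, this estimate is what a Bayesian field theory recovers in the infinite-smoothness limit; at finite smoothness the field theory can return a lower-entropy density that deviates from the pure exponential.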

  18. To P or Not to P: Backing Bayesian Statistics.

    PubMed

    Buchinsky, Farrel J; Chadha, Neil K

    2017-12-01

    In biomedical research, it is imperative to differentiate chance variation from truth before we generalize what we see in a sample of subjects to the wider population. For decades, we have relied on null hypothesis significance testing, where we calculate P values for our data to decide whether to reject a null hypothesis. This methodology is subject to substantial misinterpretation and errant conclusions. Instead of working backward by calculating the probability of our data if the null hypothesis were true, Bayesian statistics allow us instead to work forward, calculating the probability of our hypothesis given the available data. This methodology gives us a mathematical means of incorporating our "prior probabilities" from previous study data (if any) to produce new "posterior probabilities." Bayesian statistics tell us how confidently we should believe what we believe. It is time to embrace and encourage their use in our otolaryngology research.
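    The prior-to-posterior updating described in this record is just Bayes' rule; a minimal sketch with made-up numbers (the 25% prior and the likelihoods are assumptions of this illustration):

```python
from fractions import Fraction

def posterior(prior_h, p_data_given_h, p_data_given_not_h):
    # Bayes' rule: P(H | D) = P(D|H) P(H) / [P(D|H) P(H) + P(D|~H) P(~H)]
    num = p_data_given_h * prior_h
    return num / (num + p_data_given_not_h * (1 - prior_h))

# Hypothetical: prior belief of 25% in the hypothesis, and data that are
# four times as likely under H as under not-H
post = posterior(Fraction(1, 4), Fraction(8, 10), Fraction(2, 10))
print(post)  # 4/7
```

    This is the "working forward" direction the authors advocate: the output is the probability of the hypothesis given the data, and today's posterior becomes the prior for the next study.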

  19. Maternal condition and previous reproduction interact to affect offspring sex in a wild mammal.

    PubMed

    Douhard, Mathieu; Festa-Bianchet, Marco; Pelletier, Fanie

    2016-08-01

    Trivers and Willard proposed that offspring sex ratio should vary with maternal condition when condition, meant as maternal capacity to care, has different fitness consequences for sons and daughters. In polygynous and dimorphic species, mothers in good condition should preferentially produce sons, whereas mothers in poor condition should produce more daughters. Despite its logical appeal, support for this hypothesis has been inconsistent. Sex-ratio variation may be influenced by additional factors, such as environmental conditions and previous reproduction, which are often ignored in empirical studies. We analysed 39 years of data on bighorn sheep (Ovis canadensis) that fit all the assumptions of the Trivers-Willard hypothesis. Production of sons increased with maternal condition only for mothers that weaned a son the previous year. This relationship likely reflects a mother's ability to bear the higher reproductive costs of sons. The interaction between maternal condition and previous weaning success on the probability of producing a son was independent of the positive effect of paternal reproductive success. Maternal and paternal effects accounted for similar proportions of the variance in offspring sex. Maternal reproductive history should be considered in addition to current condition in studies of sex allocation. © 2016 The Author(s).

  20. Pre-conception energy balance and secondary sex ratio--partial support for the Trivers-Willard hypothesis in dairy cows.

    PubMed

    Roche, J R; Lee, J M; Berry, D P

    2006-06-01

    According to the Trivers-Willard hypothesis, maternal condition at or around conception affects the secondary sex ratio in mammals. However, few or no data are available relating indicators of maternal condition in dairy cows to the sex of the resultant offspring. A total of 76,607 body condition score (BCS; scale of 1 to 5) records and 76,611 body weight (BW) records from 3,209 lactations across 1,172 cows were extracted from a research database collated from one research herd between 1986 and 2004, inclusive. Exclusion of multiple births and cows with no information before calving (e.g., nulliparous animals) resulted in 2,029 records with BCS and BW observations from the previous calving, and 2,002 and 1,872 lactations with BCS and BW observations at conception and midgestation, respectively. Change in BCS and BW between calving and conception and between conception and midgestation was calculated per lactation. Generalized estimating equations were used to model the logit of the probability of a male calf, in which cow was included as a repeated effect with a first-order autoregressive correlation structure assumed among records within cow. Of the BCS variables investigated, there was a linear relationship between the logit of the probability of a male calf and BCS change between calving and conception, the rate of BCS change over this period (BCS divided by days in milk), and BCS at the calving event immediately before conception. The birth of a bull calf was 1.85 times more likely in cows that lost no BCS from calving to conception compared with cows that lost one BCS unit from calving to conception. This increase in odds was equivalent to a 14% unit increase in the probability of a male calf (from 54 to 68%). The amount of BW lost between calving and conception and the rate of loss affected the sex of the resultant offspring. Less BW loss or greater BW gain between calving and conception was associated with greater likelihood of a male calf. Results suggested a positive effect of pre-conception BCS and BW change on secondary sex ratio, agreeing with the Trivers-Willard hypothesis that females in good physiological condition are more likely to produce male offspring.

  1. Addendum to the article: Misuse of null hypothesis significance testing: Would estimation of positive and negative predictive values improve certainty of chemical risk assessment?

    PubMed

    Bundschuh, Mirco; Newman, Michael C; Zubrod, Jochen P; Seitz, Frank; Rosenfeldt, Ricki R; Schulz, Ralf

    2015-03-01

    We argued recently that the positive predictive value (PPV) and the negative predictive value (NPV) are valuable metrics to include during null hypothesis significance testing: They inform the researcher about the probability of statistically significant and non-significant test outcomes actually being true. Although commonly misunderstood, a reported p value estimates only the probability of obtaining the results, or more extreme results, if the null hypothesis of no effect were true. Calculations of the more informative PPV and NPV require an a priori estimate of the probability (R). The present document discusses challenges of estimating R.

  2. P value and the theory of hypothesis testing: an explanation for new researchers.

    PubMed

    Biau, David Jean; Jolles, Brigitte M; Porcher, Raphaël

    2010-03-01

    In the 1920s, Ronald Fisher developed the theory behind the p value, and Jerzy Neyman and Egon Pearson developed the theory of hypothesis testing. These distinct theories have provided researchers with important quantitative tools to confirm or refute their hypotheses. The p value is the probability of obtaining an effect equal to or more extreme than the one observed, presuming the null hypothesis of no effect is true; it gives researchers a measure of the strength of evidence against the null hypothesis. As commonly used, investigators will select a threshold p value below which they will reject the null hypothesis. The theory of hypothesis testing allows researchers to reject a null hypothesis in favor of an alternative hypothesis of some effect. As commonly used, investigators choose Type I error (rejecting the null hypothesis when it is true) and Type II error (accepting the null hypothesis when it is false) levels and determine some critical region. If the test statistic falls into that critical region, the null hypothesis is rejected in favor of the alternative hypothesis. Despite similarities between the two, the p value and the theory of hypothesis testing are different theories that often are misunderstood and confused, leading researchers to improper conclusions. Perhaps the most common misconception is to consider the p value as the probability that the null hypothesis is true rather than the probability of obtaining the difference observed, or one that is more extreme, considering the null is true. Another concern is the risk that an important proportion of statistically significant results are falsely significant. Researchers should have a minimum understanding of these two theories so that they are better able to plan, conduct, interpret, and report scientific experiments.
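    The Neyman-Pearson machinery summarized in this record (choose a Type I error level, fix a critical region, reject when the statistic falls inside it) can be sketched by simulation. The sample size, the one-sided alpha of 0.05, and the normal model with known variance are all assumptions of this illustration:

```python
import math
import random

random.seed(0)
Z_CRIT = 1.6449  # one-sided critical value for alpha = 0.05 under a standard normal

def z_statistic(sample):
    # z = sqrt(n) * (sample mean - mu0) / sigma, with mu0 = 0 and sigma = 1 known
    n = len(sample)
    return (sum(sample) / n) * math.sqrt(n)

def reject(sample):
    # Critical region: statistic beyond the one-sided 5% cutoff
    return z_statistic(sample) > Z_CRIT

# Empirical Type I error rate: how often we reject when H0 (mu = 0) is true
trials = 20000
rejections = sum(reject([random.gauss(0.0, 1.0) for _ in range(25)]) for _ in range(trials))
type1 = rejections / trials
print(round(type1, 2))  # should be close to the chosen alpha of 0.05
```

    Repeating the simulation with data drawn under an alternative mean would estimate the Type II error, the other quantity the Neyman-Pearson framework asks investigators to fix in advance.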

  3. An Empirical Comparison of Selected Two-Sample Hypothesis Testing Procedures Which Are Locally Most Powerful Under Certain Conditions.

    ERIC Educational Resources Information Center

    Hoover, H. D.; Plake, Barbara

    The relative power of the Mann-Whitney statistic, the t-statistic, the median test, a test based on exceedances (A,B), and two special cases of (A,B) the Tukey quick test and the revised Tukey quick test, was investigated via a Monte Carlo experiment. These procedures were compared across four population probability models: uniform, beta, normal,…

  4. Universal laws of human society's income distribution

    NASA Astrophysics Data System (ADS)

    Tao, Yong

    2015-10-01

    General equilibrium equations in economics play the same role as many-body Newtonian equations in physics. Accordingly, each solution of the general equilibrium equations can be regarded as a possible microstate of the economic system. Because Arrow's impossibility theorem and Rawls' principle of social fairness provide powerful support for the hypothesis of equal probability, the principle of maximum entropy is applicable in a just and equilibrium economy, so that an income distribution will occur spontaneously (with the largest probability). Remarkably, some scholars have observed such an income distribution in some democratic countries, e.g., the USA. This result implies that the hypothesis of equal probability may be suitable only for some "fair" systems (economic or physical). In this sense, non-equilibrium systems may be "unfair", so that the hypothesis of equal probability does not apply to them.

  5. Octopamine influences honey bee foraging preference.

    PubMed

    Giray, Tugrul; Galindo-Cardona, Alberto; Oskay, Devrim

    2007-07-01

    Colony condition and differences in individual preferences influence the forage type collected by bees. The physiological bases for the changing preferences of individual foragers are just beginning to be examined. Recently, octopamine has been shown to influence honey bees' age at onset of foraging and their probability of dancing for rewards. However, octopamine has not been causally linked with foraging preference in the field. We tested the hypothesis that changes in octopamine may alter forage type (preference hypothesis). We treated identified foragers orally with octopamine, its immediate precursor tyramine, or sucrose syrup (control). Octopamine-treated foragers switched the type of material collected; control bees did not. Results for the tyramine group did not differ from the control group. In addition, sugar concentrations of nectar collected by foragers after octopamine treatment were lower than before treatment, indicating a change in preference. In contrast, before and after nectar concentrations for bees in the control group were similar. Taken together, these results support the preference hypothesis.

  6. Seasonal fecundity is not related to geographic position ...

    EPA Pesticide Factsheets

    Aim: Sixty-five years ago, Theodosius Dobzhansky suggested that individuals of a species face greater challenges from abiotic stressors at high latitudes and from biotic stressors at their low-latitude range edges. This idea has been expanded to the hypothesis that species' ranges are limited by abiotic and biotic stressors at high and low latitudes, respectively. Support has been found in many systems, but this hypothesis has almost never been tested with demographic data. We present an analysis of fecundity across the breeding range of a species as a test of this hypothesis. Location: 575 km of tidal marshes in the northeastern United States. Methods: We monitored saltmarsh sparrow (Ammodramus caudacutus) nests at twenty-three sites from Maine to New Jersey, USA. With data from 840 nests, we calculated daily nest failure probabilities due to competing abiotic (flooding) and biotic (depredation) stressors. Results: We observed that abiotic stress (nest flooding probability) was greater than biotic stress (nest depredation probability) at the high-latitude range edge of saltmarsh sparrows, consistent with Dobzhansky's hypothesis. Similarly, biotic stress decreased with increasing latitude throughout the range, whereas abiotic stress was not predicted by latitude alone. Instead, nest flooding probability was best predicted by date, maximum high tide, and extremity of rare flooding events. Main conclusions: Our results provide support for Dobzhansky's hypothesis across the…
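
    Daily failure probabilities from competing hazards, as in the study above, compound over the exposure period. The sketch below shows the arithmetic with made-up daily hazard rates and exposure length (not the study's estimates), assuming independent daily risks.

```python
def nest_success_probability(p_flood, p_depredation, exposure_days):
    """Probability a nest escapes both competing daily hazards
    (flooding and depredation) over the full exposure period,
    assuming independent, constant daily risks."""
    daily_survival = 1.0 - p_flood - p_depredation
    return daily_survival ** exposure_days

# Hypothetical daily hazards: flooding dominates at high latitude,
# depredation at low latitude (values are illustrative only).
high_lat = nest_success_probability(p_flood=0.03, p_depredation=0.01,
                                    exposure_days=23)
low_lat = nest_success_probability(p_flood=0.01, p_depredation=0.04,
                                   exposure_days=23)
```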

  7. Automatic registration of terrestrial point clouds based on panoramic reflectance images and efficient BaySAC

    NASA Astrophysics Data System (ADS)

    Kang, Zhizhong

    2013-10-01

    This paper presents a new approach to automatic registration of terrestrial laser scanning (TLS) point clouds utilizing a novel robust estimation method, an efficient BaySAC (BAYes SAmpling Consensus). The proposed method directly generates reflectance images from 3D point clouds and then extracts keypoints with the SIFT algorithm to identify corresponding image points. The 3D corresponding points, from which transformation parameters between point clouds are computed, are acquired by mapping the 2D ones onto the point cloud. To remove falsely accepted correspondences, we implement a conditional sampling method that selects the n data points with the highest inlier probabilities as a hypothesis set and updates the inlier probability of each data point using a simplified Bayes' rule, for the purpose of improving computational efficiency. The prior probability is estimated by verifying the distance invariance between correspondences. The proposed approach is tested on four data sets acquired by three different scanners. The results show that, compared with RANSAC, BaySAC leads to fewer iterations and lower computation cost when the hypothesis set is contaminated with more outliers. The registration results also indicate that the proposed algorithm can achieve high registration accuracy on all experimental datasets.

  8. Concerns regarding a call for pluralism of information theory and hypothesis testing

    USGS Publications Warehouse

    Lukacs, P.M.; Thompson, W.L.; Kendall, W.L.; Gould, W.R.; Doherty, P.F.; Burnham, K.P.; Anderson, D.R.

    2007-01-01

    1. Stephens et al. (2005) argue for 'pluralism' in statistical analysis, combining null hypothesis testing and information-theoretic (I-T) methods. We show that I-T methods are more informative even in single variable problems and we provide an ecological example. 2. I-T methods allow inferences to be made from multiple models simultaneously. We believe multimodel inference is the future of data analysis, which cannot be achieved with null hypothesis-testing approaches. 3. We argue for a stronger emphasis on critical thinking in science in general and less reliance on exploratory data analysis and data dredging. Deriving alternative hypotheses is central to science; deriving a single interesting science hypothesis and then comparing it to a default null hypothesis (e.g. 'no difference') is not an efficient strategy for gaining knowledge. We think this single-hypothesis strategy has been relied upon too often in the past. 4. We clarify misconceptions presented by Stephens et al. (2005). 5. We think inference should be made about models, directly linked to scientific hypotheses, and their parameters conditioned on data, Prob(Hj | data). I-T methods provide a basis for this inference. Null hypothesis testing merely provides a probability statement about the data conditioned on a null model, Prob(data | H0). 6. Synthesis and applications. I-T methods provide a more informative approach to inference. I-T methods provide a direct measure of evidence for or against hypotheses and a means to consider simultaneously multiple hypotheses as a basis for rigorous inference. Progress in our science can be accelerated if modern methods can be used intelligently; this includes various I-T and Bayesian methods.
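
    A standard I-T tool for weighing multiple models simultaneously is the Akaike weight, which converts each model's AIC into a share of evidence within the candidate set. The sketch below shows the computation with made-up AIC scores; it illustrates the general technique, not any analysis from this paper.

```python
import math

def akaike_weights(aic_scores):
    """Convert a list of AIC scores into model weights: each weight
    is exp(-delta_i / 2) normalized over the candidate set, where
    delta_i is the difference from the best (lowest) AIC."""
    best = min(aic_scores)
    rel = [math.exp(-(a - best) / 2) for a in aic_scores]
    total = sum(rel)
    return [r / total for r in rel]

# Hypothetical AIC scores for three candidate models
weights = akaike_weights([100.0, 102.0, 110.0])
```

    Here the best model carries about 73% of the weight and the worst is effectively ruled out, giving the kind of graded, multi-model evidence statement the abstract contrasts with a single accept/reject decision.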

  9. Comparative study of probability distribution distances to define a metric for the stability of multi-source biomedical research data.

    PubMed

    Sáez, Carlos; Robles, Montserrat; García-Gómez, Juan Miguel

    2013-01-01

    Research biobanks are often composed of data from multiple sources. In some cases, these different subsets of data may present dissimilarities among their probability density functions (PDF) due to spatial shifts. This may lead to wrong hypotheses when treating the data as a whole, and the overall quality of the data is diminished. With the purpose of developing a generic and comparable metric to assess the stability of multi-source datasets, we have studied the applicability and behaviour of several PDF distances over shifts under different conditions (such as uni- and multivariate data, different types of variable, and multi-modality) which may appear in real biomedical data. From the studied distances, we found the information-theoretic distances and the Earth Mover's Distance to be the most practical for most conditions. We discuss the properties and usefulness of each distance according to the possible requirements of a general stability metric.
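
    One of the information-theoretic distances commonly used for this kind of comparison is the Jensen-Shannon divergence, which is symmetric and bounded. The sketch below implements it for discrete PDFs as a generic illustration; the paper's actual metric construction is not reproduced here.

```python
import math

def kl(p, q):
    """Kullback-Leibler divergence in bits; 0 * log(0) treated as 0."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def js_divergence(p, q):
    """Jensen-Shannon divergence: a symmetric information-theoretic
    distance between two discrete PDFs, bounded between 0 and 1 bit."""
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

identical = js_divergence([0.25, 0.75], [0.25, 0.75])  # no shift
disjoint = js_divergence([1.0, 0.0], [0.0, 1.0])       # maximal shift
```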

  10. Judging the Probability of Hypotheses Versus the Impact of Evidence: Which Form of Inductive Inference Is More Accurate and Time-Consistent?

    PubMed

    Tentori, Katya; Chater, Nick; Crupi, Vincenzo

    2016-04-01

    Inductive reasoning requires exploiting links between evidence and hypotheses. This can be done focusing either on the posterior probability of the hypothesis when updated on the new evidence or on the impact of the new evidence on the credibility of the hypothesis. But are these two cognitive representations equally reliable? This study investigates this question by comparing probability and impact judgments on the same experimental materials. The results indicate that impact judgments are more consistent in time and more accurate than probability judgments. Impact judgments also predict the direction of errors in probability judgments. These findings suggest that human inductive reasoning relies more on estimating evidential impact than on posterior probability. Copyright © 2015 Cognitive Science Society, Inc.

  11. Investigating prior probabilities in a multiple hypothesis test for use in space domain awareness

    NASA Astrophysics Data System (ADS)

    Hardy, Tyler J.; Cain, Stephen C.

    2016-05-01

    The goal of this research effort is to improve Space Domain Awareness (SDA) capabilities of current telescope systems through improved detection algorithms. Ground-based optical SDA telescopes are often spatially under-sampled, or aliased. This fact negatively impacts the detection performance of traditionally proposed binary and correlation-based detection algorithms. A Multiple Hypothesis Test (MHT) algorithm has been previously developed to mitigate the effects of spatial aliasing. This is done by testing potential Resident Space Objects (RSOs) against several sub-pixel-shifted Point Spread Functions (PSFs). An MHT has been shown to increase detection performance for the same false alarm rate. In this paper, the assumption of a priori probabilities used in an MHT algorithm is investigated. First, an analysis of the pixel decision space is completed to determine alternative hypothesis prior probabilities. These probabilities are then implemented in an MHT algorithm, and the algorithm is tested against previous MHT algorithms using simulated RSO data. Results are reported with Receiver Operating Characteristic (ROC) curves and probability of detection, Pd, analysis.
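
    The core of any MHT with non-uniform priors is a Bayes update across the hypothesis set. The sketch below shows that update in its generic form; the priors, likelihoods, and the idea of a "no object" hypothesis are illustrative assumptions, not values from the paper.

```python
def update_hypothesis_probabilities(priors, likelihoods):
    """Bayes' rule across a set of competing hypotheses:
    posterior_i is proportional to prior_i * likelihood_i."""
    joint = [p * l for p, l in zip(priors, likelihoods)]
    total = sum(joint)
    return [j / total for j in joint]

# Hypothetical priors over three sub-pixel-shift hypotheses plus a
# "no object present" hypothesis, and made-up pixel likelihoods.
priors = [0.2, 0.2, 0.2, 0.4]
likelihoods = [0.9, 0.4, 0.1, 0.05]
posterior = update_hypothesis_probabilities(priors, likelihoods)
```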

  12. A single-gene explanation for the probability of having idiopathic talipes equinovarus

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rebbeck, T.R.; Buetow, K.H.; Dietz, F.R.

    1993-11-01

    It has been hypothesized that the pathogenesis of idiopathic talipes equinovarus (ITEV, or clubfoot) is explained by genetic regulation of development and growth. The objective of the present study was to determine whether a single Mendelian gene explains the probability of having ITEV in a sample of 143 Caucasian pedigrees from Iowa. These pedigrees were ascertained through probands with ITEV. Complex segregation analyses were undertaken using a regressive logistic model. The results of these analyses strongly rejected the hypotheses that the probability of having ITEV in these pedigrees was explained by a non-Mendelian pattern of transmission with residual sibling correlation, a nontransmitted (environmental) factor with residual sibling correlation, or residual sibling correlation alone. These results were consistent with the hypothesis that the probability of having ITEV was explained by the Mendelian segregation of a single gene with two alleles plus the effects of some unmeasured factor(s) shared among siblings. The segregation of alleles at this single Mendelian gene indicated that the disease allele A was incompletely dominant to the nondisease allele B. The disease allele A, associated with ITEV affection, was estimated to occur in the population of inference with a frequency of 0.007. After adjusting for sex-specific population incidences of ITEV, the conditional probability (penetrance) of ITEV affection given the AA, AB, and BB genotypes was computed to be 1.0, 0.039, and 0.0006, respectively. Individual pedigrees in this sample that most strongly supported the single Mendelian gene hypothesis were identified. These pedigrees are candidates for genetic linkage analyses or DNA association studies. 35 refs., 2 figs., 7 tabs.
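
    The reported allele frequency and penetrances combine, via the law of total probability, into a population incidence. The sketch below does that arithmetic, assuming Hardy-Weinberg genotype frequencies (an assumption of this illustration, not stated in the abstract).

```python
q_A = 0.007  # estimated frequency of the disease allele A
penetrance = {"AA": 1.0, "AB": 0.039, "BB": 0.0006}

# Hardy-Weinberg genotype frequencies (illustrative assumption)
genotype_freq = {
    "AA": q_A ** 2,
    "AB": 2 * q_A * (1 - q_A),
    "BB": (1 - q_A) ** 2,
}

# Law of total probability:
# P(ITEV) = sum over genotypes of P(ITEV | genotype) * P(genotype)
incidence = sum(penetrance[g] * genotype_freq[g] for g in penetrance)
```

    The result is roughly 1.2 per 1000, in line with the commonly cited population incidence of clubfoot of about 1 per 1000 live births.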

  13. Self-reported hand washing behaviors and foodborne illness: a propensity score matching approach.

    PubMed

    Ali, Mir M; Verrill, Linda; Zhang, Yuanting

    2014-03-01

    Hand washing is a simple and effective but easily overlooked way to reduce cross-contamination and the transmission of foodborne pathogens. In this study, we used the propensity score matching methodology to account for potential selection bias to explore our hypothesis that always washing hands before food preparation tasks is associated with a reduction in the probability of reported foodborne illness. Propensity score matching can simulate random assignment to a condition so that pretreatment observable differences between a treatment group and a control group are homogeneous on all the covariates except the treatment variable. Using the U.S. Food and Drug Administration's 2010 Food Safety Survey, we estimated the effect of self-reported hand washing behavior on the probability of self-reported foodborne illness. Our results indicate that reported washing of hands with soap always before food preparation leads to a reduction in the probability of reported foodborne illness.

  14. Autosomal STRs provide genetic evidence for the hypothesis that Tai people originate from southern China.

    PubMed

    Sun, Hao; Zhou, Chi; Huang, Xiaoqin; Lin, Keqin; Shi, Lei; Yu, Liang; Liu, Shuyuan; Chu, Jiayou; Yang, Zhaoqing

    2013-01-01

    Tai people are widely distributed in Thailand, Laos and southwestern China and are a large population of Southeast Asia. Although most anthropologists and historians agree that modern Tai people are from southwestern China and northern Thailand, the place from which they historically migrated remains controversial. Three popular hypotheses have been proposed: a northern origin, a southern origin, or an indigenous origin. We compared the genetic relationships between the Tai in China and their "siblings" to test the different hypotheses by analyzing 10 autosomal microsatellites. The genetic data of 916 samples from 19 populations were analyzed in this survey. The autosomal STR data from 15 of the 19 populations came from our previous study (Lin et al., 2010); 194 samples from four additional populations were genotyped in this study: Han (Yunnan), Dai (Dehong), Dai (Yuxi) and Mongolian. The results of genetic distance comparisons, genetic structure analyses and admixture analyses all indicate that the populations implied by the northern origin hypothesis show large genetic distances from, and are clearly differentiated from, the Tai. The simulation-based ABC analysis also indicates this: the posterior probability of the northern origin hypothesis is just 0.04 [95% CI: (0.01-0.06)]. Conversely, genetic relationships were very close between the Tai and the populations implied by the southern origin or indigenous origin hypotheses. Simulation-based ABC analyses were also used to distinguish the southern origin hypothesis from the indigenous origin hypothesis. The results indicate that the posterior probability of the southern origin hypothesis [0.640, 95% CI: (0.524-0.757)] is greater than that of the indigenous origin hypothesis [0.324, 95% CI: (0.211-0.438)]. Therefore, we propose that the genetic evidence does not support the hypothesis of a northern origin. Our genetic data indicate that the southern origin hypothesis is statistically more probable than the other two hypotheses, suggesting that the Tai people most likely originated from southern China.
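
    The posterior model probabilities quoted above come from approximate Bayesian computation (ABC). The toy sketch below shows rejection-sampling ABC in its simplest form, with made-up one-dimensional simulators standing in for the competing demographic models; it is a schematic of the method, not a reproduction of the study's analysis.

```python
import random

def abc_model_probabilities(observed, simulators, eps,
                            n_draws=20000, seed=1):
    """Rejection-sampling ABC with equal model priors: draw a model
    at random, simulate a summary statistic, keep the draw if it
    falls within eps of the observed statistic. Posterior model
    probabilities are the accepted fractions."""
    random.seed(seed)
    accepted = [0] * len(simulators)
    for _ in range(n_draws):
        m = random.randrange(len(simulators))
        if abs(simulators[m]() - observed) < eps:
            accepted[m] += 1
    total = sum(accepted) or 1
    return [a / total for a in accepted]

# Toy stand-ins for two competing origin models, each emitting a
# one-dimensional summary statistic (means and spread are arbitrary).
model_a = lambda: random.gauss(3.0, 1.0)
model_b = lambda: random.gauss(1.0, 1.0)
post = abc_model_probabilities(observed=2.6,
                               simulators=[model_a, model_b], eps=0.5)
```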

  15. What Would Jaws Do? The Tyranny of Film and the Relationship between Gaze and Higher-Level Narrative Film Comprehension

    PubMed Central

    Loschky, Lester C.; Larson, Adam M.; Magliano, Joseph P.; Smith, Tim J.

    2015-01-01

    What is the relationship between film viewers’ eye movements and their film comprehension? Typical Hollywood movies induce strong attentional synchrony—most viewers look at the same things at the same time. Thus, we asked whether film viewers’ eye movements would differ based on their understanding—the mental model hypothesis—or whether any such differences would be overwhelmed by viewers’ attentional synchrony—the tyranny of film hypothesis. To investigate this question, we manipulated the presence/absence of prior film context and measured resulting differences in film comprehension and eye movements. Viewers watched a 12-second James Bond movie clip, ending just as a critical predictive inference should be drawn that Bond’s nemesis, “Jaws,” would fall from the sky onto a circus tent. The No-context condition saw only the 12-second clip, but the Context condition also saw the preceding 2.5 minutes of the movie before seeing the critical 12-second portion. Importantly, the Context condition viewers were more likely to draw the critical inference and were more likely to perceive coherence across the entire 6 shot sequence (as shown by event segmentation), indicating greater comprehension. Viewers’ eye movements showed strong attentional synchrony in both conditions as compared to a chance level baseline, but smaller differences between conditions. Specifically, the Context condition viewers showed slightly, but significantly, greater attentional synchrony and lower cognitive load (as shown by fixation probability) during the critical first circus tent shot. Thus, overall, the results were more consistent with the tyranny of film hypothesis than the mental model hypothesis. These results suggest the need for a theory that encompasses processes from the perception to the comprehension of film. PMID:26606606

  16. Upper bounds on the error probabilities and asymptotic error exponents in quantum multiple state discrimination

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Audenaert, Koenraad M. R., E-mail: koenraad.audenaert@rhul.ac.uk; Department of Physics and Astronomy, University of Ghent, S9, Krijgslaan 281, B-9000 Ghent; Mosonyi, Milán, E-mail: milan.mosonyi@gmail.com

    2014-10-01

    We consider the multiple hypothesis testing problem for symmetric quantum state discrimination between r given states σ₁, …, σᵣ. By splitting up the overall test into multiple binary tests in various ways we obtain a number of upper bounds on the optimal error probability in terms of the binary error probabilities. These upper bounds allow us to deduce various bounds on the asymptotic error rate, for which it has been hypothesized that it is given by the multi-hypothesis quantum Chernoff bound (or Chernoff divergence) C(σ₁, …, σᵣ), as recently introduced by Nussbaum and Szkoła in analogy with Salikhov's classical multi-hypothesis Chernoff bound. This quantity is defined as the minimum of the pairwise binary Chernoff divergences min_j…
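
    The classical analogue referred to above (Salikhov's bound) can be sketched directly: the binary Chernoff divergence between two discrete distributions is C(p, q) = -log min over s in [0, 1] of the sum of p_i^s * q_i^(1-s), and the multi-hypothesis quantity is the minimum over all pairs. The grid search below is a numerical illustration of the classical quantity only, not of the quantum version studied in the paper.

```python
import math

def chernoff_divergence(p, q, grid=1000):
    """Classical binary Chernoff divergence between two discrete
    distributions, via a grid search over the exponent s."""
    best = min(
        sum(pi ** s * qi ** (1 - s) for pi, qi in zip(p, q))
        for s in (k / grid for k in range(grid + 1))
    )
    return -math.log(best)

def multi_chernoff(dists):
    """Salikhov-style multi-hypothesis bound: the minimum of the
    pairwise binary Chernoff divergences."""
    return min(
        chernoff_divergence(dists[i], dists[j])
        for i in range(len(dists)) for j in range(i + 1, len(dists))
    )

c_pair = chernoff_divergence([0.9, 0.1], [0.1, 0.9])
c_multi = multi_chernoff([[0.9, 0.1], [0.1, 0.9], [0.5, 0.5]])
```

    As expected, adding a third distribution that is harder to distinguish from the others drags the multi-hypothesis bound below the easiest pairwise value.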

  17. Entropy generation in biophysical systems

    NASA Astrophysics Data System (ADS)

    Lucia, U.; Maino, G.

    2013-03-01

    Recently, in theoretical biology and in biophysical engineering, the entropy production has been observed to asymptotically approach its maximum rate, using the probability of individual elementary modes distributed in accordance with the Boltzmann distribution. The basis of this approach is the hypothesis that the entropy production rate is maximum at the stationary state. In the present work, this hypothesis is explained and motivated, starting from the entropy generation analysis. The latter quantity is obtained from the entropy balance for open systems, considering the lifetime of the natural real process. The Lagrangian formalism is introduced in order to develop an analytical approach to the thermodynamic analysis of open irreversible systems. The stationary conditions of the open systems are thus obtained in relation to the entropy generation and the least action principle. Consequently, the considered hypothesis is analytically proved, and it represents an original basic approach in theoretical and mathematical biology and also in biophysical engineering. It is worth remarking that the present results show that entropy generation not only increases but increases as fast as possible.

  18. Dopaminergic Drug Effects on Probability Weighting during Risky Decision Making.

    PubMed

    Ojala, Karita E; Janssen, Lieneke K; Hashemi, Mahur M; Timmer, Monique H M; Geurts, Dirk E M; Ter Huurne, Niels P; Cools, Roshan; Sescousse, Guillaume

    2018-01-01

    Dopamine has been associated with risky decision-making, as well as with pathological gambling, a behavioral addiction characterized by excessive risk-taking behavior. However, the specific mechanisms through which dopamine might act to foster risk-taking and pathological gambling remain elusive. Here we test the hypothesis that this might be achieved, in part, via modulation of subjective probability weighting during decision making. Human healthy controls (n = 21) and pathological gamblers (n = 16) played a decision-making task involving choices between sure monetary options and risky gambles both in the gain and loss domains. Each participant played the task twice, either under placebo or the dopamine D2/D3 receptor antagonist sulpiride, in a double-blind counterbalanced design. A prospect theory modelling approach was used to estimate subjective probability weighting and sensitivity to monetary outcomes. Consistent with prospect theory, we found that participants presented a distortion in the subjective weighting of probabilities, i.e., they overweighted low probabilities and underweighted moderate to high probabilities, both in the gain and loss domains. Compared with placebo, sulpiride attenuated this distortion in the gain domain. Across drugs, the groups did not differ in their probability weighting, although gamblers consistently underweighted losing probabilities in the placebo condition. Overall, our results reveal that dopamine D2/D3 receptor antagonism modulates the subjective weighting of probabilities in the gain domain, in the direction of more objective, economically rational decision making.
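
    The inverse-S distortion described above is usually modelled with the Tversky-Kahneman weighting function w(p) = p^γ / (p^γ + (1-p)^γ)^(1/γ). The sketch below shows the overweighting/underweighting pattern; the value γ = 0.61 is the original 1992 estimate for gains, used here purely for illustration rather than a parameter fitted in this study.

```python
def tk_weight(p, gamma=0.61):
    """Tversky-Kahneman probability weighting function. For
    gamma < 1 it produces the inverse-S distortion: low
    probabilities are overweighted, moderate-to-high
    probabilities underweighted."""
    num = p ** gamma
    return num / (num + (1 - p) ** gamma) ** (1 / gamma)

overweighted = tk_weight(0.01)   # comes out above 0.01
underweighted = tk_weight(0.9)   # comes out below 0.9
```

    In this framework, the paper's finding amounts to sulpiride pushing the gain-domain weighting curve closer to the identity line w(p) = p.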

  19. Wald Sequential Probability Ratio Test for Space Object Conjunction Assessment

    NASA Technical Reports Server (NTRS)

    Carpenter, James R.; Markley, F Landis

    2014-01-01

    This paper shows how satellite owner/operators may use sequential estimates of collision probability, along with a prior assessment of the base risk of collision, in a compound hypothesis ratio test to inform decisions concerning collision risk mitigation maneuvers. The compound hypothesis test reduces to a simple probability ratio test, which appears to be a novel result. The test satisfies tolerances related to targeted false alarm and missed detection rates. This result is independent of the method one uses to compute the probability density that one integrates to compute collision probability. A well-established test case from the literature shows that this test yields acceptable results within the constraints of a typical operational conjunction assessment decision timeline. Another example illustrates the use of the test in a practical conjunction assessment scenario based on operations of the International Space Station.
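
    Wald's sequential probability ratio test, the basis of the test above, accumulates a log-likelihood ratio and stops as soon as it crosses thresholds set by the targeted false alarm and missed detection rates. The Bernoulli sketch below illustrates the generic mechanism, not the paper's collision-probability formulation.

```python
import math

def wald_sprt(observations, p0, p1, alpha=0.05, beta=0.05):
    """Wald SPRT for a Bernoulli stream: H0 (success prob p0) vs
    H1 (success prob p1), with targeted false alarm rate alpha
    and missed detection rate beta."""
    upper = math.log((1 - beta) / alpha)  # cross above: accept H1
    lower = math.log(beta / (1 - alpha))  # cross below: accept H0
    llr = 0.0
    for step, x in enumerate(observations, start=1):
        llr += math.log(p1 / p0) if x else math.log((1 - p1) / (1 - p0))
        if llr >= upper:
            return "H1", step
        if llr <= lower:
            return "H0", step
    return "continue", len(observations)

# Seven straight successes are already decisive at these settings.
decision, step = wald_sprt([1] * 10, p0=0.5, p1=0.8)
```

    The appeal for conjunction assessment is the same as in any sequential setting: the test typically decides well before a fixed-sample test would, which matters on an operational decision timeline.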

  20. The introduction of an additional probability coefficient in evaluating the possibility of the existence of extraterrestrial intelligent beings

    NASA Astrophysics Data System (ADS)

    Barth, H.

    A hypothesis is presented concerning the crucial influence of tides on the evolutionary transition from aquatic to land animal forms. The hypothesis suggests that the evolution of higher forms of life on a planet also depends on the existence of a planet-moon system in which the mass ratio of the two constituents is approximately equal to that of the earth-moon system, 81:1. The hypothesis is taken into account in the form of a probability factor fb in Drake's formula for estimating the number of presumed extraterrestrial civilizations in the Milky Way that may conceivably make contact.
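
    Since the Drake formula is a plain product of factors, adding fb simply scales the estimate. The sketch below shows that arithmetic; every input value is an illustrative guess, not an estimate from the paper.

```python
def drake_with_tidal_factor(R, fp, ne, fl, fi, fc, L, fb):
    """Drake's formula extended with the proposed factor fb: the
    probability that a life-bearing planet sits in a planet-moon
    system with roughly the Earth-Moon mass ratio (~81:1)."""
    return R * fp * ne * fl * fi * fc * L * fb

# Illustrative inputs only: star formation rate R, fraction with
# planets fp, habitable planets per system ne, life fraction fl,
# intelligence fraction fi, communication fraction fc, civilization
# lifetime L (years), and the new tidal factor fb.
N = drake_with_tidal_factor(R=10, fp=0.5, ne=2, fl=0.33, fi=0.01,
                            fc=0.01, L=10000, fb=0.1)
```

    With fb = 0.1, the estimate is one tenth of what the same inputs give without the tidal factor, which is the whole effect of the proposed coefficient.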

  1. FERTILITY INTENTIONS AND EARLY LIFE HEALTH STRESS AMONG WOMEN IN EIGHT INDIAN CITIES: TESTING THE REPRODUCTIVE ACCELERATION HYPOTHESIS.

    PubMed

    Kulathinal, Sangita; Säävälä, Minna

    2015-09-01

    In life history theory, early life adversity is associated with an accelerated reproductive tempo. In the harsh and unpredictable conditions of developing societies, fertility is generally higher and the reproductive tempo faster than in more secure environments. This paper examines whether differences in female anthropometry, particularly adult height, are associated with the fertility intentions of women in urban environments in India. The study population consists of women aged 15-29 (N=4485) in slums and non-slums of eight Indian cities in the National Family Health Survey (NFHS) of 2005-2006. Adult height is taken as a proxy for early childhood health and nutritional condition. Fertility intentions are examined using two variables: the desire to have a child or another child, and to have it relatively soon, as indicative of accelerated reproductive scheduling. Evidence supporting the acceleration hypothesis is found in two urban frames out of the 26 examined in a two-stage multinomial logistic model. In three cases, the relationship between fertility intentions and height is the opposite of that expected under the acceleration hypothesis: taller women have a higher predictive probability of desiring a(nother) child and/or narrower birth spacing. Potential explanations for the partly contradictory relationship between the childhood health indicator and fertility intentions are discussed.

  2. Direct evidence for a dual process model of deductive inference.

    PubMed

    Markovits, Henry; Brunet, Marie-Laurence; Thompson, Valerie; Brisson, Janie

    2013-07-01

    In 2 experiments, we tested a strong version of a dual process theory of conditional inference (cf. Verschueren et al., 2005a, 2005b) that assumes that most reasoners have 2 strategies available, the choice of which is determined by situational variables, cognitive capacity, and metacognitive control. The statistical strategy evaluates inferences probabilistically, accepting those with high conditional probability. The counterexample strategy rejects inferences when a counterexample shows the inference to be invalid. To discriminate strategy use, we presented reasoners with conditional statements (if p, then q) and explicit statistical information about the conditional probability of q given p (50% vs. 90%). A statistical strategy would accept the more probable inferences more frequently, whereas the counterexample one would reject both. In Experiment 1, reasoners under time pressure used the statistical strategy more but switched to the counterexample strategy when time constraints were removed; the former took less time than the latter. These data are consistent with the hypothesis that the statistical strategy is the default heuristic. Under a free-time condition, reasoners preferred the counterexample strategy and kept it when put under time pressure. Thus, it is not simply a lack of capacity that produces a statistical strategy; instead, it seems that time pressure disrupts the ability to make good metacognitive choices. In line with this conclusion, in a second experiment, we measured reasoners' confidence in their performance; those under time pressure were less confident in the statistical than the counterexample strategy and more likely to switch strategies under free-time conditions. PsycINFO Database Record (c) 2013 APA, all rights reserved.
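
    The two strategies contrasted above make opposite predictions that are easy to state as decision rules. The toy sketch below encodes them for the modus ponens inference "if p then q; p; therefore q"; the acceptance threshold is an arbitrary illustrative choice, not a parameter from the paper.

```python
def statistical_strategy(p_q_given_p, threshold=0.75):
    """Accept the inference when the conditional probability
    P(q|p) is high enough (threshold is illustrative)."""
    return p_q_given_p >= threshold

def counterexample_strategy(counterexamples_found):
    """Accept the inference only when no counterexample
    (a case of p without q) can be retrieved."""
    return counterexamples_found == 0

# The experiment's two frequency conditions, each with
# counterexamples available:
accept_high = statistical_strategy(0.9)                      # accepts
accept_low = statistical_strategy(0.5)                       # rejects
accept_ce = counterexample_strategy(counterexamples_found=3) # rejects
```

    This reproduces the diagnostic pattern the authors exploit: the statistical strategy discriminates between the 90% and 50% conditions, while the counterexample strategy rejects both.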

  3. Tracking Object Existence From an Autonomous Patrol Vehicle

    NASA Technical Reports Server (NTRS)

    Wolf, Michael; Scharenbroich, Lucas

    2011-01-01

    An autonomous vehicle patrols a large region, during which an algorithm receives measurements of detected potential objects within its sensor range. The goal of the algorithm is to track all objects in the region over time. This problem differs from traditional multi-target tracking scenarios because the region of interest is much larger than the sensor range, so coverage relies on the movement of the sensor through the region. The goal is to know whether anything has changed between visits to the same location. In particular, two kinds of alert conditions must be detected: (1) a previously detected object has disappeared and (2) a new object has appeared in a location already checked. For the time an object is within sensor range, the object can be assumed to remain stationary, changing position only between visits. The problem is difficult because the upstream object detection processing is likely to make many errors, resulting in heavy clutter (false positives) and missed detections (false negatives), and because only noisy, bearings-only measurements are available. This work has three main goals: (1) Associate incoming measurements with known objects or mark them as new objects or false positives, as appropriate. For this, a multiple hypothesis tracker was adapted to this scenario. (2) Localize the objects using multiple bearings-only measurements to provide estimates of global position (e.g., latitude and longitude). A nonlinear Kalman filter extension provides these 2D position estimates using the 1D measurements. (3) Calculate the probability that a suspected object truly exists (in the estimated position), and determine whether alert conditions have been triggered (for new objects or disappeared objects). The concept of a probability of existence was created, and a new Bayesian method for updating this probability at each time step was developed.
    A probabilistic multiple hypothesis approach was chosen because of its superiority in handling the uncertainty arising from errors in sensors and upstream processes. However, traditional target tracking methods typically assume a stationary detection volume of interest, whereas in this case one must make adjustments for being able to see only a small portion of the region of interest and understand when an alert situation has occurred. To track object existence inside and outside the vehicle's sensor range, a probability of existence was defined for each hypothesized object, and this value was updated at every time step in a Bayesian manner based on expected characteristics of the sensor and object and on whether the object was detected in the most recent time step. This value then feeds into a sequential probability ratio test (SPRT) to determine the status of the object (suspected, confirmed, or deleted). Alerts are sent upon selected status transitions. Additionally, in order to track objects that move in and out of sensor range and update the probability of existence appropriately, a variable probability of detection was defined and the hypothesis probability equations were re-derived to accommodate this change. Unsupervised object tracking is a pervasive issue in automated perception systems. This work could apply to any mobile platform (ground vehicle, sea vessel, air vehicle, or orbiter) that intermittently revisits regions of interest and needs to determine whether anything interesting has changed.
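
    A single Bayesian update of a probability of existence can be sketched in a few lines. The function below is a generic illustration of the idea (detection raises the probability, a miss lowers it, and a probability of detection of zero outside sensor range freezes it); it is not the paper's re-derived hypothesis equations, and all numbers are made up.

```python
def update_existence(p_exist, detected, p_d, p_fa):
    """One Bayesian update of an object's probability of existence,
    given whether it was detected this time step, the probability
    of detection p_d (zero when outside sensor range), and the
    false alarm probability p_fa."""
    if detected:
        num = p_exist * p_d
        den = num + (1 - p_exist) * p_fa
    else:
        num = p_exist * (1 - p_d)
        den = num + (1 - p_exist) * (1 - p_fa)
    return num / den

# A detection raises the probability; a subsequent miss (with the
# same sensor characteristics) pulls it back down.
p1 = update_existence(0.5, detected=True, p_d=0.9, p_fa=0.1)
p2 = update_existence(p1, detected=False, p_d=0.9, p_fa=0.1)
```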

  4. Non-linear relationship of cell hit and transformation probabilities in a low dose of inhaled radon progenies.

    PubMed

    Balásházy, Imre; Farkas, Arpád; Madas, Balázs Gergely; Hofmann, Werner

    2009-06-01

    Cellular hit probabilities of alpha particles emitted by inhaled radon progenies in sensitive bronchial epithelial cell nuclei were simulated at low exposure levels to obtain useful data for the rejection or support of the linear-non-threshold (LNT) hypothesis. In this study, local distributions of deposited inhaled radon progenies in airway bifurcation models were computed under exposure conditions characteristic of homes and uranium mines. Then, maximum local deposition enhancement factors at bronchial airway bifurcations, expressed as the ratio of local to average deposition densities, were determined to characterise the inhomogeneity of deposition and to elucidate their effect on resulting hit probabilities. The results obtained suggest that in the vicinity of the carinal regions of the central airways the probability of multiple hits can be quite high, even at low average doses. Assuming a uniform distribution of activity, there are practically no multiple hits and the hit probability as a function of dose exhibits a linear shape in the low dose range. The results are quite the opposite in the case of hot spots revealed by realistic deposition calculations, where practically all cells receive multiple hits and the hit probability as a function of dose is non-linear in the average dose range of 10-100 mGy.
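
    The contrast between uniform deposition and hot spots can be illustrated with a simple Poisson hit model (an assumption of this sketch, not the paper's full simulation): the chance of two or more hits is 1 - e^(-λ)(1 + λ), where λ is the mean number of hits per cell nucleus, and the two λ values below are arbitrary.

```python
import math

def prob_multiple_hits(mean_hits):
    """Probability of two or more alpha-particle hits per cell
    nucleus, assuming Poisson-distributed hit counts:
    P(k >= 2) = 1 - exp(-lambda) * (1 + lambda)."""
    return 1.0 - math.exp(-mean_hits) * (1.0 + mean_hits)

uniform_low_dose = prob_multiple_hits(0.1)  # multiple hits negligible
hot_spot = prob_multiple_hits(5.0)          # multiple hits dominate
```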

  5. A shift from significance test to hypothesis test through power analysis in medical research.

    PubMed

    Singh, G

    2006-01-01

Until recently, the medical research literature exhibited substantial dominance of Fisher's significance-test approach to statistical inference, which concentrates on the probability of type I error, over Neyman-Pearson's hypothesis test, which considers the probabilities of both type I and type II errors. Fisher's approach dichotomises results into significant or non-significant with a P value. The Neyman-Pearson approach speaks of acceptance or rejection of the null hypothesis. Based on the same theory, these two approaches address the same objective and reach conclusions in their own ways. Advances in computing techniques and the availability of statistical software have resulted in the increasing application of power calculations in medical research, and thereby in reporting the results of significance tests in the light of the power of the test as well. The significance-test approach, when it incorporates power analysis, contains the essence of the hypothesis-test approach. It may be safely argued that the rising application of power analysis in medical research may have initiated a shift from Fisher's significance test to the Neyman-Pearson hypothesis-test procedure.
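A power calculation of the kind the abstract refers to can be sketched with a two-sided one-sample z-test. This is a simplification: real studies typically use t-tests and dedicated software, but the z-test version shows how power ties the type II error rate to effect size and sample size:

```python
from statistics import NormalDist

def power_one_sample_z(effect_size, n, alpha=0.05):
    """Power of a two-sided one-sample z-test for the standardized
    effect size d = (mu1 - mu0) / sigma with n observations.
    Power = P(reject H0 | H1 true) = 1 - P(type II error)."""
    z = NormalDist()
    z_crit = z.inv_cdf(1.0 - alpha / 2.0)   # rejection threshold
    shift = effect_size * n ** 0.5          # mean of z-statistic under H1
    return z.cdf(shift - z_crit) + z.cdf(-shift - z_crit)
```

With a zero effect the function returns alpha (the test's size); with a moderate effect (d = 0.5) and n = 32, power is about 0.8, a conventional planning target.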

  6. Dopaminergic Drug Effects on Probability Weighting during Risky Decision Making

    PubMed Central

    Timmer, Monique H. M.; ter Huurne, Niels P.

    2018-01-01

    Abstract Dopamine has been associated with risky decision-making, as well as with pathological gambling, a behavioral addiction characterized by excessive risk-taking behavior. However, the specific mechanisms through which dopamine might act to foster risk-taking and pathological gambling remain elusive. Here we test the hypothesis that this might be achieved, in part, via modulation of subjective probability weighting during decision making. Human healthy controls (n = 21) and pathological gamblers (n = 16) played a decision-making task involving choices between sure monetary options and risky gambles both in the gain and loss domains. Each participant played the task twice, either under placebo or the dopamine D2/D3 receptor antagonist sulpiride, in a double-blind counterbalanced design. A prospect theory modelling approach was used to estimate subjective probability weighting and sensitivity to monetary outcomes. Consistent with prospect theory, we found that participants presented a distortion in the subjective weighting of probabilities, i.e., they overweighted low probabilities and underweighted moderate to high probabilities, both in the gain and loss domains. Compared with placebo, sulpiride attenuated this distortion in the gain domain. Across drugs, the groups did not differ in their probability weighting, although gamblers consistently underweighted losing probabilities in the placebo condition. Overall, our results reveal that dopamine D2/D3 receptor antagonism modulates the subjective weighting of probabilities in the gain domain, in the direction of more objective, economically rational decision making. PMID:29632870
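The distortion described, overweighting low probabilities and underweighting moderate-to-high ones, is commonly captured with the one-parameter Tversky-Kahneman weighting function. Whether the study fit exactly this form is an assumption; the sketch below simply illustrates the shape:

```python
def tk_weight(p, gamma):
    """Tversky-Kahneman (1992) one-parameter probability weighting
    function. gamma = 1 gives objective (linear) weighting; gamma < 1
    overweights small probabilities and underweights moderate-to-large
    ones, producing the inverse-S shape of prospect theory."""
    return p ** gamma / (p ** gamma + (1.0 - p) ** gamma) ** (1.0 / gamma)

# A typical distorted curve (gamma = 0.6) versus objective weighting:
gamma = 0.6
w_small = tk_weight(0.05, gamma)  # > 0.05: low probability overweighted
w_mid = tk_weight(0.5, gamma)     # < 0.5: moderate probability underweighted
```

In these terms, the reported effect of sulpiride in the gain domain corresponds to moving gamma toward 1, i.e., flattening the distortion toward objective weighting.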

  7. Thermal and mechanical quantitative sensory testing in Chinese patients with burning mouth syndrome--a probable neuropathic pain condition?

    PubMed

    Mo, Xueyin; Zhang, Jinglu; Fan, Yuan; Svensson, Peter; Wang, Kelun

    2015-01-01

To explore the hypothesis that burning mouth syndrome (BMS) is probably a neuropathic pain condition, thermal and mechanical sensory and pain thresholds were tested and compared with age- and gender-matched control participants using a standardized battery of psychophysical techniques. Twenty-five BMS patients (men: 8, women: 17, age: 49.5 ± 11.4 years) and 19 age- and gender-matched healthy control participants were included. The cold detection threshold (CDT), warm detection threshold (WDT), cold pain threshold (CPT), heat pain threshold (HPT), mechanical detection threshold (MDT) and mechanical pain threshold (MPT), in accordance with the German Network of Neuropathic Pain guidelines, were measured at the following four sites: the dorsum of the left hand (hand), the skin at the mental foramen (chin), the tip of the tongue (tongue), and the mucosa of the lower lip (lip). Statistical analysis was performed using ANOVA with repeated measures to compare the means within and between groups. Furthermore, Z-score profiles were generated, and exploratory correlation analyses between QST and clinical variables were performed. Two-tailed tests with a significance level of 5% were used throughout. CDTs (P < 0.02) were significantly lower (less sensitivity) and HPTs (P < 0.001) were significantly higher (less sensitivity) at the tongue and lip in BMS patients compared to control participants. WDT (P = 0.007) was also significantly higher at the tongue in BMS patients compared to control subjects. There were no significant differences in MDT and MPT between the BMS patients and healthy subjects at any of the four test sites. Z-scores showed that significant loss of function can be identified for CDT (Z-score = -0.9 ± 1.1) and HPT (Z-score = 1.5 ± 0.4). There were no significant correlations between QST and clinical variables (pain intensity, duration, depression scores). BMS patients had a significant loss of thermal function but not mechanical function, supporting the hypothesis that BMS may be a probable neuropathic pain condition. Further studies including, e.g., electrophysiological or imaging techniques are needed to clarify the underlying mechanisms of BMS.

  8. Molecular phylogeny of broken-back shrimps (genus Lysmata and allies): a test of the 'Tomlinson-Ghiselin' hypothesis explaining the evolution of hermaphroditism.

    PubMed

    Baeza, J Antonio

    2013-10-01

The 'Tomlinson-Ghiselin' hypothesis (TGh) predicts that outcrossing simultaneous hermaphroditism (SH) is advantageous when population density is low because the probability of finding sexual partners is negligible. In shrimps from the family Lysmatidae, Bauer's historical contingency hypothesis (HCh) suggests that SH evolved in an ancestral tropical species that adopted a symbiotic lifestyle with, e.g., sea anemones and became a specialized fish-cleaner. Restricted mobility of shrimps due to their association with a host, and hence reduced probability of encountering mating partners, would have favored SH. The HCh is a special case of the TGh. Herein, I examined within a phylogenetic framework whether the TGh/HCh explains the origin of SH in shrimps. A phylogeny of caridean broken-back shrimps in the families Lysmatidae, Barbouriidae, and Merguiidae was first developed using nuclear and mitochondrial markers. Complete-evidence phylogenetic analyses using maximum likelihood (ML) and Bayesian inference (BI) demonstrated that Lysmatidae+Barbouriidae are monophyletic. In turn, Merguiidae is sister to Lysmatidae+Barbouriidae. ML and BI ancestral character-state reconstruction in the resulting phylogenetic trees indicated that the ancestral Lysmatidae was either gregarious or lived in small groups and was not symbiotic. Four different evolutionary transitions from a free-living to a symbiotic lifestyle occurred in shrimps. Therefore, the evolution of SH in shrimps cannot be explained by the TGh/HCh; reduced probability of encountering mating partners in an ancestral species due to its association with a sessile host did not favor SH in the Lysmatidae. It is proposed that two conditions acting together in the past, low male mating opportunities and brooding constraints, might have favored SH in the ancestral Lysmatidae+Barbouriidae. Additional studies on the life history and phylogenetics of broken-back shrimps are needed to understand the evolution of SH in the ecologically diverse Caridea. Copyright © 2013 Elsevier Inc. All rights reserved.

  9. Urban sprawl and delayed ambulance arrival in the U.S.

    PubMed

    Trowbridge, Matthew J; Gurka, Matthew J; O'Connor, Robert E

    2009-11-01

Minimizing emergency medical service (EMS) response time is a central objective of prehospital care, yet the potential influence of built environment features such as urban sprawl on EMS system performance is often not considered. This study measures the association between urban sprawl and EMS response time to test the hypothesis that features of sprawling development increase the probability of delayed ambulance arrival. In 2008, EMS response times for 43,424 motor-vehicle crashes were obtained from the Fatality Analysis Reporting System, a national census of crashes involving ≥1 fatality. Sprawl at each crash location was measured using a continuous county-level index previously developed by Ewing et al. The association between sprawl and the probability of a delayed ambulance arrival (≥8 minutes) was then measured using generalized linear mixed modeling to account for correlation among crashes from the same county. Urban sprawl is significantly associated with increased EMS response time and a higher probability of delayed ambulance arrival (p=0.03). This probability increases quadratically as the severity of sprawl increases while controlling for nighttime crash occurrence, road conditions, and presence of construction. For example, in sprawling counties (e.g., Fayette County GA), the probability of a delayed ambulance arrival for daytime crashes in dry conditions without construction was 69% (95% CI=66%, 72%) compared with 31% (95% CI=28%, 35%) in counties with prominent smart-growth characteristics (e.g., Delaware County PA). Urban sprawl is significantly associated with increased EMS response time and a higher probability of delayed ambulance arrival following motor-vehicle crashes in the U.S. The results of this study suggest that promotion of community design and development that follows smart-growth principles and regulates urban sprawl may improve EMS performance and reliability.
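The quadratic relationship the study reports can be illustrated with a toy logistic model in which the log-odds of delay rise quadratically with a standardized sprawl index. The coefficients below are hypothetical placeholders, not the fitted values from the study, and the mixed-model (county random effect) structure is omitted:

```python
import math

def p_delayed(sprawl, b0=-1.5, b1=0.6, b2=0.4):
    """Illustrative logistic model: probability of a delayed ambulance
    arrival (>= 8 min) as a function of a standardized sprawl index.
    Coefficients b0, b1, b2 are hypothetical, for illustration only."""
    log_odds = b0 + b1 * sprawl + b2 * sprawl ** 2
    return 1.0 / (1.0 + math.exp(-log_odds))

# Smart-growth county (low index) vs sprawling county (high index):
compact = p_delayed(-1.0)
sprawling = p_delayed(1.0)
```

The quadratic term makes the probability accelerate as sprawl severity grows, which is the qualitative pattern the abstract describes.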

  10. An Exercise for Illustrating the Logic of Hypothesis Testing

    ERIC Educational Resources Information Center

    Lawton, Leigh

    2009-01-01

    Hypothesis testing is one of the more difficult concepts for students to master in a basic, undergraduate statistics course. Students often are puzzled as to why statisticians simply don't calculate the probability that a hypothesis is true. This article presents an exercise that forces students to lay out on their own a procedure for testing a…

  11. Sex and Adolescent Ethanol Exposure Influence Pavlovian Conditioned Approach

    PubMed Central

    Madayag, Aric C.; Stringfield, Sierra J.; Reissner, Kathryn J.; Boettiger, Charlotte A.; Robinson, Donita L.

    2017-01-01

BACKGROUND Alcohol use among adolescents is widespread and a growing concern due to long-term behavioral deficits, including altered Pavlovian behavior, that potentially contribute to addiction vulnerability. We tested the hypothesis that adolescent intermittent ethanol (AIE) exposure alters Pavlovian behavior in males and females as measured by a shift from goal-tracking to sign-tracking. Additionally, we investigated GLT-1, an astrocytic glutamate transporter, as a potential contributor to a sign-tracking phenotype. METHODS Male and female Sprague-Dawley rats were exposed to AIE (5 g/kg, intragastric) or water intermittently, 2 days on and 2 days off, from postnatal day (P) 25 to 54. Around P70, animals began 20 daily sessions of Pavlovian conditioned approach, where they learned that a cue predicted non-contingent reward delivery. Lever pressing indicated interaction with the cue, or sign-tracking, and receptacle entries indicated approach to the reward delivery location, or goal-tracking. To test for effects of AIE on nucleus accumbens excitatory signaling, we isolated membrane subfractions and measured protein levels of the glutamate transporter GLT-1 after animals completed behavior as a measure of glutamate homeostasis. RESULTS Females exhibited elevated sign-tracking compared to males, with significantly more lever presses, faster latency to first lever press, and greater probability of a lever press in a trial. AIE significantly increased lever pressing while blunting goal-tracking, as indicated by fewer cue-evoked receptacle entries, slower latency to receptacle entry, and lower probability of entering the receptacle in a trial. No significant Sex-by-Exposure interactions were observed in sign- or goal-tracking metrics. Moreover, we found no significant effects of Sex or Exposure on membrane GLT-1 expression in the nucleus accumbens.
CONCLUSIONS Females exhibited enhanced sign-tracking compared to males, while AIE decreased goal-tracking compared to control exposure. Our findings support the hypothesis that adolescent binge ethanol can shift conditioned behavior from goal- to cue-directed in Pavlovian conditioned approach, especially in females. PMID:28196273

  12. [Significance of motivation balance for a choice of dog's behavior under conditions of environmental uncertainty].

    PubMed

    Chilingarian, L I; Grigor'ian, G A

    2007-01-01

Two experimental models with a choice between two reinforcements were used to assess the individual typological features of dogs. In the first model, dogs were given a choice between homogeneous food reinforcements: a less valuable, constantly delivered reinforcement and a more valuable reinforcement delivered with low probability. In the second model, the dogs chose between heterogeneous reinforcements: performing alimentary or defensive reactions. Under conditions of rising uncertainty, owing to a decrease in the probability of getting the valuable food, two dogs continued to prefer the valuable reinforcement, while the third animal gradually shifted its behavior from the choice of a highly valuable but infrequent reward to a less valuable but easily obtained reinforcement. Under the condition of choice between the valuable food reinforcement and avoidance of electrocutaneous stimulation, the first two dogs preferred food, whereas the third animal, which had previously been oriented to the choice of the low-value constant reinforcement, steadily preferred the avoidance behavior. The data obtained are consistent with the hypothesis that the individual typological characteristics of animal (and human) behavior substantially depend on two parameters: the extent of environmental uncertainty and subjective features of reinforcement assessment.

  13. The Heuristic Value of p in Inductive Statistical Inference

    PubMed Central

    Krueger, Joachim I.; Heck, Patrick R.

    2017-01-01

    Many statistical methods yield the probability of the observed data – or data more extreme – under the assumption that a particular hypothesis is true. This probability is commonly known as ‘the’ p-value. (Null Hypothesis) Significance Testing ([NH]ST) is the most prominent of these methods. The p-value has been subjected to much speculation, analysis, and criticism. We explore how well the p-value predicts what researchers presumably seek: the probability of the hypothesis being true given the evidence, and the probability of reproducing significant results. We also explore the effect of sample size on inferential accuracy, bias, and error. In a series of simulation experiments, we find that the p-value performs quite well as a heuristic cue in inductive inference, although there are identifiable limits to its usefulness. We conclude that despite its general usefulness, the p-value cannot bear the full burden of inductive inference; it is but one of several heuristic cues available to the data analyst. Depending on the inferential challenge at hand, investigators may supplement their reports with effect size estimates, Bayes factors, or other suitable statistics, to communicate what they think the data say. PMID:28649206

  14. The Heuristic Value of p in Inductive Statistical Inference.

    PubMed

    Krueger, Joachim I; Heck, Patrick R

    2017-01-01

Many statistical methods yield the probability of the observed data - or data more extreme - under the assumption that a particular hypothesis is true. This probability is commonly known as 'the' p-value. (Null Hypothesis) Significance Testing ([NH]ST) is the most prominent of these methods. The p-value has been subjected to much speculation, analysis, and criticism. We explore how well the p-value predicts what researchers presumably seek: the probability of the hypothesis being true given the evidence, and the probability of reproducing significant results. We also explore the effect of sample size on inferential accuracy, bias, and error. In a series of simulation experiments, we find that the p-value performs quite well as a heuristic cue in inductive inference, although there are identifiable limits to its usefulness. We conclude that despite its general usefulness, the p-value cannot bear the full burden of inductive inference; it is but one of several heuristic cues available to the data analyst. Depending on the inferential challenge at hand, investigators may supplement their reports with effect size estimates, Bayes factors, or other suitable statistics, to communicate what they think the data say.
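The central question, how well significance predicts that the hypothesis is true, can be probed with a Monte Carlo sketch along the lines of the paper's simulation experiments. The design below (one-sample z-tests, half the studies under a true effect) is my own simplified construction, not the authors' exact setup:

```python
import random
from statistics import NormalDist

def prob_h1_given_significant(n_studies=4000, n=25, effect=0.5,
                              p_h1=0.5, alpha=0.05, seed=1):
    """Simulate many studies, each a two-sided one-sample z-test of n
    unit-variance draws. In a fraction p_h1 of studies the alternative
    (mean = effect) is true. Return the fraction of significant results
    that came from studies where the alternative was in fact true."""
    rng = random.Random(seed)
    z = NormalDist()
    sig_h1 = sig_total = 0
    for _ in range(n_studies):
        h1_true = rng.random() < p_h1
        mu = effect if h1_true else 0.0
        mean = sum(rng.gauss(mu, 1.0) for _ in range(n)) / n
        zstat = mean * n ** 0.5
        p = 2.0 * (1.0 - z.cdf(abs(zstat)))
        if p < alpha:
            sig_total += 1
            sig_h1 += h1_true
    return sig_h1 / sig_total
```

With these settings (power around 0.7, base rate 0.5), roughly nine in ten significant results come from true effects; lowering the base rate or the power degrades the p-value's performance as a heuristic cue, which matches the paper's point about identifiable limits.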

  15. Could Hypomanic Traits Explain Selective Migration? Verifying the Hypothesis by the Surveys on Sardinian Migrants

    PubMed Central

    MG, Carta; MF, Moro; V, Kovess; MV, Brasesco; KM, Bhat; MC, Angermeyer; HS, Akiskal

    2012-01-01

Introduction: A recent survey put forward the hypothesis that the emigration that occurred from Sardinia from the 1960’s to the 1980’s selected people with a hypomanic temperament. The paper aims to verify whether the people who migrated from Sardinia in that period have shown a high risk of mood disorders in the surveys carried out in their host countries, and whether the results are consistent with this hypothesis. Methods: This is a systematic review. Results: In the 1970’s, when examining the attitudes towards migration in Sardinian couples waiting to emigrate, Rudas found that the decision to emigrate was principally taken by males. Females showed lower self-esteem than male emigrants. A study on Sardinian immigrants in Argentina carried out in 2001-02, at the peak of the economic crisis, found a high risk of depressive disorders in women only. These results were the opposite of the findings recorded ten years earlier in a survey on Sardinian immigrants in Paris, where the risk of a Depressive Episode was higher in young men only. Discussion: The data point to a bipolar disorder risk for young (probably hypomanic) male migrants in competitive, challenging conditions, and a different kind of depressive episode for women in trying economic conditions. The results of the surveys on Sardinian migrants are partially in agreement with the hypothesis of a selective migration of people with a hypomanic temperament. Early motivations and self-esteem seem related to the ways mood disorders are expressed, and to the vulnerability to specific triggering situations in the host country. PMID:23248679

  16. Could hypomanic traits explain selective migration? Verifying the hypothesis by the surveys on sardinian migrants.

    PubMed

    Giovanni, Carta Mauro; Francesca, Moro Maria; Viviane, Kovess; Brasesco, Maria Veronica; Bhat, Krishna M; Matthias, Angermeyer C; Akiskal, Hagop S

    2012-01-01

A recent survey put forward the hypothesis that the emigration that occurred from Sardinia from the 1960's to the 1980's selected people with a hypomanic temperament. The paper aims to verify whether the people who migrated from Sardinia in that period have shown a high risk of mood disorders in the surveys carried out in their host countries, and whether the results are consistent with this hypothesis. This is a systematic review. In the 1970's, when examining the attitudes towards migration in Sardinian couples waiting to emigrate, Rudas found that the decision to emigrate was principally taken by males. Females showed lower self-esteem than male emigrants. A study on Sardinian immigrants in Argentina carried out in 2001-02, at the peak of the economic crisis, found a high risk of depressive disorders in women only. These results were the opposite of the findings recorded ten years earlier in a survey on Sardinian immigrants in Paris, where the risk of a Depressive Episode was higher in young men only. The data point to a bipolar disorder risk for young (probably hypomanic) male migrants in competitive, challenging conditions, and a different kind of depressive episode for women in trying economic conditions. The results of the surveys on Sardinian migrants are partially in agreement with the hypothesis of a selective migration of people with a hypomanic temperament. Early motivations and self-esteem seem related to the ways mood disorders are expressed, and to the vulnerability to specific triggering situations in the host country.

  17. Female infidelity is constrained by El Niño conditions in a long-lived bird.

    PubMed

    Kiere, Lynna Marie; Drummond, Hugh

    2016-07-01

    Explaining the remarkable variation in socially monogamous females' extrapair (EP) behaviour revealed by decades of molecular paternity testing remains an important challenge. One hypothesis proposes that restrictive environmental conditions (e.g. extreme weather, food scarcity) limit females' resources and increase EP behaviour costs, forcing females to reduce EP reproductive behaviours. For the first time, we tested this hypothesis by directly quantifying within-pair and EP behaviours rather than inferring behaviour from paternity. We evaluated whether warmer sea surface temperatures depress total pre-laying reproductive behaviours, and particularly EP behaviours, in socially paired female blue-footed boobies (Sula nebouxii). Warm waters in the Eastern Pacific are associated with El Niño Southern Oscillation and lead to decreased food availability and reproductive success in this and other marine predators. With warmer waters, females decreased their neighbourhood attendance, total copulation frequency and laying probability, suggesting that they contend with restricted resources by prioritizing self-maintenance and committing less to reproduction, sometimes abandoning the attempt altogether. Females were also less likely to participate in EP courtship and copulations, but when they did, rates of these behaviours were unaffected by water temperature. Females' neighbourhood attendance, total copulation frequency and EP courtship probability responded to temperature differences at the between-season scale, and neighbourhood attendance and EP copulation probability were affected by within-season fluctuations. Path analysis indicated that decreased EP participation was not attributable to reduced female time available for EP activities. Together, our results suggest that immediate time and energy constraints were not the main factors limiting females' infidelity. 
Our study shows that El Niño conditions depress female boobies' EP participation and total reproductive activity. In addition to increasing general self-maintenance and reproductive costs, warm waters may increase costs specific to EP behaviours including divorce, reduced male parental care, or pathogen exposure. Our results suggest that female boobies strategically refrained from EP behaviours to avoid these or other longer-term costs, rather than being compelled by immediate constraints. This study demonstrates that current environmental conditions affect females' mating decisions, contributing to variation in EP behaviours, even in a long-lived, iteroparous species that can buffer against temporary adversity. © 2016 The Authors. Journal of Animal Ecology © 2016 British Ecological Society.

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Velsko, Stephan P.

This report provides an introduction to the topic of conditional dependence in the context of microbial forensic assays. Conditional dependence between two items of evidence E1 and E2 occurs when they are both used to support a hypothesis, but E1 affects the probability of E2 and vice versa. Ignoring this dependence can lead to very large errors in estimating the diagnosticity of the combined evidence. To introduce readers to this concept, a number of definitions of conditional dependence that have been used by authors in the past have been collected together and compared. Formal mathematical relationships that constrain conditional dependence are summarized. There are several specific scenarios in which unrecognized conditional dependence can arise in microbial forensic contexts. This report provides some notional examples that illustrate dramatic effects of conditional dependence on the weight of microbial forensic evidence, and discusses the relevance of these observations for the validation of microbial forensic assays. A two-parameter model that describes the transition between various limiting forms of conditional dependence relations is provided in an appendix.
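The "very large errors" the report warns about are easy to reproduce numerically. The combined likelihood ratio for two evidence items is P(E1,E2|H)/P(E1,E2|A); multiplying the individual likelihood ratios as if the items were independent can grossly overstate diagnosticity when E2 is nearly determined by E1 under both hypotheses. The numbers below are illustrative, not from the report:

```python
def likelihood_ratio(p_e1_h, p_e2_given_e1_h, p_e1_a, p_e2_given_e1_a):
    """Combined likelihood ratio for two items of evidence, using the
    chain rule P(E1,E2|.) = P(E1|.) * P(E2|E1,.), so conditional
    dependence is handled correctly."""
    return (p_e1_h * p_e2_given_e1_h) / (p_e1_a * p_e2_given_e1_a)

# Each item alone has LR = 0.8 / 0.1 = 8, so naive independence gives
# a combined LR of 64:
naive = (0.8 / 0.1) * (0.8 / 0.1)

# But if E2 almost always accompanies E1 under BOTH hypotheses
# (P(E2|E1) = 0.95 either way), E2 adds almost no new information:
dependent = likelihood_ratio(0.8, 0.95, 0.1, 0.95)  # = 8, not 64
```

Here the naive calculation overstates the weight of evidence by a factor of eight; with more items the error compounds multiplicatively.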

  19. Ecological constraints influence the emergence of cooperative breeding when population dynamics determine the fitness of helpers.

    PubMed

    McLeod, David V; Wild, Geoff

    2013-11-01

    Cooperative breeding is a system in which certain individuals facilitate the production of offspring by others. The ecological constraints hypothesis states that ecological conditions deter individuals from breeding independently, and so individuals breed cooperatively to make the best of a bad situation. Current theoretical support for the ecological constraints hypothesis is lacking. We formulate a mathematical model that emphasizes the underlying ecology of cooperative breeders. Our goal is to derive theoretical support for the ecological constraints hypothesis using an ecological model of population dynamics. We consider a population composed of two kinds of individuals, nonbreeders (auxiliaries) and breeders. We suppose that help provided by an auxiliary increases breeder fecundity, but reduces the probability with which the auxiliary becomes a breeder. Our main result is a condition that guarantees success of auxiliary help. We predict that increasing the cost of dispersal promotes helping, in agreement with verbal theory. We also predict that increasing breeder mortality can either hinder helping (at high population densities), or promote it (at low population densities). We conclude that ecological constraints can exert influence over the evolution of auxiliary help when population dynamics are considered; moreover, that influence need not coincide with direct fitness benefits as previously found. © 2013 The Author(s). Evolution © 2013 The Society for the Study of Evolution.

  20. Unless reasoning.

    PubMed

    García-Madruga, Juan A; Carriedo, Nuria; Moreno-Ríos, Sergio; Gutiérrez, Francisco; Schaeken, Walter

    2008-11-01

We report the results of two experiments investigating conditional inferences from unless assertions, such as "Juan is not in León unless Nuria is in Madrid". Experiments 1 and 2 check Fillenbaum's hypothesis about the semantic similarity of unless to if not and only if assertions; both also examine inferential endorsements (Experiment 1) and endorsements and latencies (Experiment 2) for the four logically equivalent conditional formulations: if A then B, if not-B then not-A, A only if B, and not-A unless B. The results of these experiments show the similarity of unless and only if, confirming that the representation of both conditionals probably includes, from the outset, two possibilities directionally oriented from B to A; the results also confirm the special difficulty of unless assertions. The implications of the results are discussed in the context of recent psychological and linguistic theories of the meaning of unless.
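Read as bare material conditionals, which abstracts away from the directional and pragmatic differences the paper actually investigates, the four formulations are logically equivalent, and a truth table confirms it:

```python
from itertools import product

# Material-conditional readings of the four formulations compared in
# the experiments. "A only if B" and "not-A unless B" both rule out
# the case where A holds without B, just like "if A then B".
forms = {
    "if A then B":         lambda a, b: (not a) or b,
    "if not-B then not-A": lambda a, b: b or (not a),
    "A only if B":         lambda a, b: (not a) or b,
    "not-A unless B":      lambda a, b: b or (not a),
}

# Evaluate each formulation on every assignment of A and B:
table = {name: [f(a, b) for a, b in product([True, False], repeat=2)]
         for name, f in forms.items()}
```

All four columns of the table agree; the experiments show that despite this logical equivalence, human reasoners represent and process the formulations differently.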

  1. Expert Financial Advice Neurobiologically “Offloads” Financial Decision-Making under Risk

    PubMed Central

    Engelmann, Jan B.; Capra, C. Monica; Noussair, Charles; Berns, Gregory S.

    2009-01-01

    Background Financial advice from experts is commonly sought during times of uncertainty. While the field of neuroeconomics has made considerable progress in understanding the neurobiological basis of risky decision-making, the neural mechanisms through which external information, such as advice, is integrated during decision-making are poorly understood. In the current experiment, we investigated the neurobiological basis of the influence of expert advice on financial decisions under risk. Methodology/Principal Findings While undergoing fMRI scanning, participants made a series of financial choices between a certain payment and a lottery. Choices were made in two conditions: 1) advice from a financial expert about which choice to make was displayed (MES condition); and 2) no advice was displayed (NOM condition). Behavioral results showed a significant effect of expert advice. Specifically, probability weighting functions changed in the direction of the expert's advice. This was paralleled by neural activation patterns. Brain activations showing significant correlations with valuation (parametric modulation by value of lottery/sure win) were obtained in the absence of the expert's advice (NOM) in intraparietal sulcus, posterior cingulate cortex, cuneus, precuneus, inferior frontal gyrus and middle temporal gyrus. Notably, no significant correlations with value were obtained in the presence of advice (MES). These findings were corroborated by region of interest analyses. Neural equivalents of probability weighting functions showed significant flattening in the MES compared to the NOM condition in regions associated with probability weighting, including anterior cingulate cortex, dorsolateral PFC, thalamus, medial occipital gyrus and anterior insula. Finally, during the MES condition, significant activations in temporoparietal junction and medial PFC were obtained. 
Conclusions/Significance These results support the hypothesis that one effect of expert advice is to “offload” the calculation of value of decision options from the individual's brain. PMID:19308261

  2. Using the Coefficient of Confidence to Make the Philosophical Switch from a Posteriori to a Priori Inferential Statistics

    ERIC Educational Resources Information Center

    Trafimow, David

    2017-01-01

    There has been much controversy over the null hypothesis significance testing procedure, with much of the criticism centered on the problem of inverse inference. Specifically, p gives the probability of the finding (or one more extreme) given the null hypothesis, whereas the null hypothesis significance testing procedure involves drawing a…

  3. Correlation of probability scores of placenta accreta on magnetic resonance imaging with hemorrhagic morbidity.

    PubMed

    Lim, Grace; Horowitz, Jeanne M; Berggruen, Senta; Ernst, Linda M; Linn, Rebecca L; Hewlett, Bradley; Kim, Jennifer; Chalifoux, Laurie A; McCarthy, Robert J

    2016-11-01

To evaluate the hypothesis that assigning grades to magnetic resonance imaging (MRI) findings of suspected placenta accreta will correlate with hemorrhagic outcomes, we chose a single-center, retrospective, observational design. Nulliparous or multiparous women who had antenatal placental MRI performed at a tertiary-level academic hospital were included and compared with cases in which no MRI was performed. Two radiologists assigned a probability score for accreta to each study. Estimated blood loss and transfusion requirements were compared among groups by the Kruskal-Wallis H test. Thirty-five cases had placental MRI performed. MRI performance was associated with higher blood loss compared with the non-MRI group (2600 [1400-4500] mL vs 900 [600-1500] mL, P<.001). There was no difference in estimated blood loss (P=.31) or transfusion (P=.57) among the MRI probability groups. In cases of suspected placenta accreta, probability scores for antenatal placental MRI may not be associated with increasing degrees of hemorrhage. Continued research is warranted to determine the effectiveness of assigning probability scores for antenatal accreta imaging studies, combined with clinical indices of suspicion, in assisting with antenatal multidisciplinary team planning for the operative management of this morbid condition. Copyright © 2016 Elsevier Inc. All rights reserved.

  4. A Comprehensive Breath Plume Model for Disease Transmission via Expiratory Aerosols

    NASA Astrophysics Data System (ADS)

    Halloran, S. K.; Wexler, A. S.; Ristenpart, W. D.

    2012-11-01

    The peak in influenza incidence during wintertime represents a longstanding unresolved scientific question. One hypothesis is that the efficacy of airborne transmission via aerosols is increased at low humidity and temperature, conditions that prevail in wintertime. Recent experiments with guinea pigs suggest that transmission is indeed maximized at low humidity and temperature, a finding which has been widely interpreted in terms of airborne influenza virus survivability. This interpretation, however, neglects the effect of the airflow on the transmission probability. Here we provide a comprehensive model for assessing the probability of disease transmission via expiratory aerosols between test animals in laboratory conditions. The spread of aerosols emitted from an infected animal is modeled using dispersion theory for a homogeneous turbulent airflow. The concentration and size distribution of the evaporating droplets in the resulting ``Gaussian breath plume'' are calculated as functions of downstream position. We demonstrate that the breath plume model is broadly consistent with the guinea pig experiments, without invoking airborne virus survivability. Moreover, the results highlight the need for careful characterization of the airflow in airborne transmission experiments.

  5. Ockham's razor and Bayesian analysis. [statistical theory for systems evaluation]

    NASA Technical Reports Server (NTRS)

    Jefferys, William H.; Berger, James O.

    1992-01-01

    'Ockham's razor', the ad hoc principle enjoining the greatest possible simplicity in theoretical explanations, is presently shown to be justifiable as a consequence of Bayesian inference; Bayesian analysis can, moreover, clarify the nature of the 'simplest' hypothesis consistent with the given data. By choosing the prior probabilities of hypotheses, it becomes possible to quantify the scientific judgment that simpler hypotheses are more likely to be correct. Bayesian analysis also shows that a hypothesis with fewer adjustable parameters intrinsically possesses an enhanced posterior probability, due to the clarity of its predictions.
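    The "enhanced posterior probability" of a hypothesis with fewer adjustable parameters can be sketched numerically: a model with a free parameter spreads its predictive probability over many possible outcomes, so when the data land where the simpler model predicts, the simpler model wins. The toy models and numbers below are illustrative assumptions, not from the paper:

```python
# Sketch of the "Bayesian Ockham's razor". Numbers are illustrative.

def marginal_likelihood_uniform(x, lo, hi):
    """Likelihood of observing x under a model that spreads its
    predictions uniformly over [lo, hi]."""
    return 1.0 / (hi - lo) if lo <= x <= hi else 0.0

x = 0.1  # observed datum
# Simple model: no free parameter, predicts x in [0, 1].
like_simple = marginal_likelihood_uniform(x, 0.0, 1.0)
# Complex model: a free parameter lets it accommodate x anywhere in [-10, 10].
like_complex = marginal_likelihood_uniform(x, -10.0, 10.0)

# With equal priors, the posterior odds equal the Bayes factor:
bayes_factor = like_simple / like_complex
print(bayes_factor)  # 20.0: the sharper (simpler) hypothesis is favored
```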

  6. Do observed or perceived characteristics of the neighborhood environment mediate associations between neighborhood poverty and cumulative biological risk?

    PubMed Central

    Schulz, Amy J.; Mentz, Graciela; Lachance, Laurie; Zenk, Shannon N.; Johnson, Jonetta; Stokes, Carmen; Mandell, Rebecca

    2013-01-01

    Objective: To examine contributions of observed and perceived neighborhood characteristics in explaining associations between neighborhood poverty and cumulative biological risk (CBR) in an urban community. Methods: Multilevel regression analyses were conducted using cross-sectional data from a probability sample survey (n=919), and observational and census data. Dependent variable: CBR. Independent variables: neighborhood disorder, deterioration, and characteristics; perceived neighborhood social environment, physical environment, and neighborhood environment. Covariates: neighborhood and individual demographics, health-related behaviors. Results: Observed and perceived indicators of neighborhood conditions were significantly associated with CBR, after accounting for both neighborhood- and individual-level socioeconomic indicators. Observed and perceived neighborhood environmental conditions mediated associations between neighborhood poverty and CBR. Conclusions: Findings were consistent with the hypothesis that neighborhood conditions associated with economic divestment mediate associations between neighborhood poverty and CBR. PMID:24100238

  7. A critique of statistical hypothesis testing in clinical research

    PubMed Central

    Raha, Somik

    2011-01-01

    Many have documented the difficulty of using the current paradigm of Randomized Controlled Trials (RCTs) to test and validate the effectiveness of alternative medical systems such as Ayurveda. This paper critiques the applicability of RCTs for all clinical knowledge-seeking endeavors, of which Ayurveda research is a part. This is done by examining statistical hypothesis testing, the underlying foundation of RCTs, from a practical and philosophical perspective. In the philosophical critique, the two main worldviews of probability are that of the Bayesian and the frequentist. The frequentist worldview is a special case of the Bayesian worldview requiring the unrealistic assumptions of knowing nothing about the universe and believing that all observations are unrelated to each other. Many have claimed that the first belief is necessary for science, and this claim is debunked by comparing variations in learning with different prior beliefs. Moving beyond the Bayesian and frequentist worldviews, the notion of hypothesis testing itself is challenged on the grounds that a hypothesis is an unclear distinction, and assigning a probability on an unclear distinction is an exercise that does not lead to clarity of action. This critique is of the theory itself and not any particular application of statistical hypothesis testing. A decision-making frame is proposed as a way of both addressing this critique and transcending ideological debates on probability. An example of a Bayesian decision-making approach is shown as an alternative to statistical hypothesis testing, utilizing data from a past clinical trial that studied the effect of Aspirin on heart attacks in a sample population of doctors. Because a major reason for the prevalence of RCTs in academia is legislation requiring them, the ethics of legislating the use of statistical methods for clinical research is also examined. PMID:22022152

  8. Is chronic insomnia a precursor to major depression? Epidemiological and biological findings.

    PubMed

    Baglioni, Chiara; Riemann, Dieter

    2012-10-01

    Insomnia has been found to be a clinical predictor of subsequent depression. Nevertheless, the biological processes underlying this causal relationship are not yet fully understood. Both conditions share a common imbalance of the arousal system. Patients with insomnia present fragmented REM sleep, which probably interferes with basal processes of emotion regulation. With the persistence of the disorder, the interaction between the arousal and affective systems could also slowly alter the cognitive system and lead to depression. Although preliminary results seem to support this hypothesis, the data are still too few to draw valid conclusions.

  9. Probability of stress-corrosion fracture under random loading

    NASA Technical Reports Server (NTRS)

    Yang, J. N.

    1974-01-01

    The mathematical formulation is based on a cumulative-damage hypothesis and experimentally determined stress-corrosion characteristics. Under stationary random loading, the mean value and variance of the cumulative damage are obtained. The probability of stress-corrosion fracture is then evaluated using the principle of maximum entropy.
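    The final step can be sketched under two common assumptions (not necessarily Yang's exact formulation): fracture occurs when cumulative damage exceeds unity, and the maximum-entropy distribution for a given mean and variance is Gaussian, so the fracture probability is a normal tail:

```python
# Sketch: P(fracture) = P(D > 1) for a maximum-entropy (Gaussian) damage
# model given only the mean and variance of D. Numbers are illustrative.
import math

def fracture_probability(mean_damage, var_damage, threshold=1.0):
    """P(D > threshold) via the Gaussian tail (complementary error function)."""
    z = (threshold - mean_damage) / math.sqrt(var_damage)
    return 0.5 * math.erfc(z / math.sqrt(2.0))

# Illustrative numbers: mean damage 0.6, variance 0.04 after some exposure.
p = fracture_probability(0.6, 0.04)
print(round(p, 4))  # ≈ 0.0228, the upper tail beyond two standard deviations
```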

  10. Application of Second-Moment Source Analysis to Three Problems in Earthquake Forecasting

    NASA Astrophysics Data System (ADS)

    Donovan, J.; Jordan, T. H.

    2011-12-01

    Though earthquake forecasting models have often represented seismic sources as space-time points (usually hypocenters), a more complete hazard analysis requires the consideration of finite-source effects, such as rupture extent, orientation, directivity, and stress drop. The most compact source representation that includes these effects is the finite moment tensor (FMT), which approximates the degree-two polynomial moments of the stress glut by its projection onto the seismic (degree-zero) moment tensor. This projection yields a scalar space-time source function whose degree-one moments define the centroid moment tensor (CMT) and whose degree-two moments define the FMT. We apply this finite-source parameterization to three forecasting problems. The first is the question of hypocenter bias: can we reject the null hypothesis that the conditional probability of hypocenter location is uniformly distributed over the rupture area? This hypothesis is currently used to specify rupture sets in the "extended" earthquake forecasts that drive simulation-based hazard models, such as CyberShake. Following McGuire et al. (2002), we test the hypothesis using the distribution of FMT directivity ratios calculated from a global data set of source slip inversions. The second is the question of source identification: given an observed FMT (and its errors), can we identify it with an FMT in the complete rupture set that represents an extended fault-based rupture forecast? Solving this problem will facilitate operational earthquake forecasting, which requires the rapid updating of earthquake triggering and clustering models. Our proposed method uses the second-order uncertainties as a norm on the FMT parameter space to identify the closest member of the hypothetical rupture set and to test whether this closest member is an adequate representation of the observed event. 
Finally, we address the aftershock excitation problem: given a mainshock, what is the spatial distribution of aftershock probabilities? The FMT representation allows us to generalize the models typically used for this purpose (e.g., marked point process models, such as ETAS), which will again be necessary in operational earthquake forecasting. To quantify aftershock probabilities, we compare mainshock FMTs with the first and second spatial moments of weighted aftershock hypocenters. We will describe applications of these results to the Uniform California Earthquake Rupture Forecast, version 3, which is now under development by the Working Group on California Earthquake Probabilities.

  11. Two models for microsimulation of family life cycle and family structure.

    PubMed

    Bertino, S; Pinnelli, A; Vichi, M

    1988-01-01

    2 models are proposed for the microsimulation of the family and analysis of family structure and life cycle. These models were devised primarily for teaching purposes. The families are composed of 3 generations (parents, grandparents, children). Cohabitation is not considered. The 1st model is governed by a transition mechanism based on the rules of a multidimensional, nonhomogeneous Markov chain. The 2nd model is based on stochastic point processes. Input data comprise the annual mortality probability according to sex, age, and civil status; the annual probability of 1st marriage; the age combinations between the spouses; and the probability of having a 1st, 2nd, or 3rd child at 6-month intervals from the previous event (marriage or birth of the nth child). The applications of the 1st model are presented using 2 mortality and fertility hypotheses (high and low) and a nuptiality hypothesis (West European pattern). The various features of family composition are analyzed according to the duration of a couple's marriage and the age of the individual, as well as the characteristic features of the individual and family life cycle given these 2 demographic conditions.
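    The first model's transition mechanism amounts to a yearly state update with age-dependent (nonhomogeneous) probabilities. A minimal sketch; the states, rates, and age limits below are illustrative assumptions, not the paper's input data:

```python
# Toy nonhomogeneous Markov chain for civil status; rates are illustrative.
import random

def annual_step(state, age, rng):
    """One year of a simplified chain: mortality rises with age (toy
    Gompertz-like rate); marriage is only possible from 'single'."""
    q_death = min(0.0005 * 1.09 ** age, 1.0)
    if rng.random() < q_death:
        return "dead"
    if state == "single" and 18 <= age <= 50 and rng.random() < 0.08:
        return "married"
    return state

def simulate_life(rng):
    state, age = "single", 0
    while state != "dead" and age < 110:
        state = annual_step(state, age, rng)
        age += 1
    return age  # age at death (or censoring at 110)

rng = random.Random(42)
ages = [simulate_life(rng) for _ in range(1000)]
print(sum(ages) / len(ages))  # mean simulated lifespan under the toy rates
```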

  12. Multiple model cardinalized probability hypothesis density filter

    NASA Astrophysics Data System (ADS)

    Georgescu, Ramona; Willett, Peter

    2011-09-01

    The Probability Hypothesis Density (PHD) filter propagates the first-moment approximation to the multi-target Bayesian posterior distribution while the Cardinalized PHD (CPHD) filter propagates both the posterior likelihood of (an unlabeled) target state and the posterior probability mass function of the number of targets. Extensions of the PHD filter to the multiple model (MM) framework have been published and were implemented either with a Sequential Monte Carlo or a Gaussian Mixture approach. In this work, we introduce the multiple model version of the more elaborate CPHD filter. We present the derivation of the prediction and update steps of the MMCPHD particularized for the case of two target motion models and proceed to show that in the case of a single model, the new MMCPHD equations reduce to the original CPHD equations.
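    One ingredient that distinguishes the CPHD from the PHD filter — propagating the posterior probability mass function of the number of targets — can be sketched for the prediction step: the prior cardinality pmf is binomially thinned by the survival probability and convolved with a birth pmf. The survival and birth parameters below are illustrative, not from the paper:

```python
# Sketch of CPHD-style cardinality prediction. Values are illustrative.
from math import comb

def predict_cardinality(p_prior, ps, p_birth):
    """p_prior, p_birth: pmfs over target count, as lists indexed by count."""
    # Binomial thinning: each of n prior targets survives with probability ps.
    survived = [0.0] * len(p_prior)
    for n, pn in enumerate(p_prior):
        for k in range(n + 1):
            survived[k] += pn * comb(n, k) * ps**k * (1 - ps)**(n - k)
    # Convolve with the birth cardinality pmf.
    out = [0.0] * (len(survived) + len(p_birth) - 1)
    for i, a in enumerate(survived):
        for j, b in enumerate(p_birth):
            out[i + j] += a * b
    return out

p_pred = predict_cardinality(p_prior=[0.0, 0.5, 0.5],  # 1 or 2 targets
                             ps=0.9,
                             p_birth=[0.8, 0.2])       # 0 or 1 birth
print([round(x, 4) for x in p_pred])  # [0.044, 0.443, 0.432, 0.081]
assert abs(sum(p_pred) - 1.0) < 1e-9  # a valid pmf is preserved
```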

  13. Power Enhancement in High Dimensional Cross-Sectional Tests

    PubMed Central

    Fan, Jianqing; Liao, Yuan; Yao, Jiawei

    2016-01-01

    We propose a novel technique to boost the power of testing a high-dimensional hypothesis H0 : θ = 0 against sparse alternatives where the null hypothesis is violated only by a couple of components. Existing tests based on quadratic forms such as the Wald statistic often suffer from low powers due to the accumulation of errors in estimating high-dimensional parameters. More powerful tests for sparse alternatives such as thresholding and extreme-value tests, on the other hand, require either stringent conditions or bootstrap to derive the null distribution and often suffer from size distortions due to the slow convergence. Based on a screening technique, we introduce a “power enhancement component”, which is zero under the null hypothesis with high probability, but diverges quickly under sparse alternatives. The proposed test statistic combines the power enhancement component with an asymptotically pivotal statistic, and strengthens the power under sparse alternatives. The null distribution does not require stringent regularity conditions, and is completely determined by that of the pivotal statistic. As specific applications, the proposed methods are applied to testing the factor pricing models and validating the cross-sectional independence in panel data models. PMID:26778846
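    The screening idea can be sketched as follows: a component J0 sums standardized squares only over components exceeding a slowly growing threshold, so it is exactly zero with high probability under the null but diverges under a sparse alternative. The threshold rule and data below are illustrative assumptions, not the paper's exact construction:

```python
# Sketch of a screening-based power enhancement component. Illustrative only.
import math, random

def power_enhancement(theta_hat, se, n):
    """J0 = sum of standardized squares over the screened set
    S = {j : |theta_hat_j| > se_j * delta}, with a slowly growing delta."""
    delta = math.sqrt(2.0 * math.log(len(theta_hat)) * math.log(math.log(n)))
    return sum((t / s) ** 2 for t, s in zip(theta_hat, se)
               if abs(t) > s * delta)

rng = random.Random(0)
p, n = 500, 200
se = [1.0 / math.sqrt(n)] * p
# Null: every component is pure noise -> J0 is typically exactly 0.
null_theta = [rng.gauss(0.0, s) for s in se]
# Sparse alternative: only two large components violate the null.
alt_theta = list(null_theta)
alt_theta[0], alt_theta[1] = 0.5, -0.5
print(power_enhancement(null_theta, se, n),
      power_enhancement(alt_theta, se, n))
```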

  14. A Bayesian Method for Evaluating and Discovering Disease Loci Associations

    PubMed Central

    Jiang, Xia; Barmada, M. Michael; Cooper, Gregory F.; Becich, Michael J.

    2011-01-01

    Background A genome-wide association study (GWAS) typically involves examining representative SNPs in individuals from some population. A GWAS data set can concern a million SNPs and may soon concern billions. Researchers investigate the association of each SNP individually with a disease, and it is becoming increasingly commonplace to also analyze multi-SNP associations. Techniques for handling so many hypotheses include the Bonferroni correction and recently developed Bayesian methods. These methods can encounter problems. Most importantly, they are not applicable to a complex multi-locus hypothesis which has several competing hypotheses rather than only a null hypothesis. A method that computes the posterior probability of complex hypotheses is a pressing need. Methodology/Findings We introduce the Bayesian network posterior probability (BNPP) method which addresses the difficulties. The method represents the relationship between a disease and SNPs using a directed acyclic graph (DAG) model, and computes the likelihood of such models using a Bayesian network scoring criterion. The posterior probability of a hypothesis is computed based on the likelihoods of all competing hypotheses. The BNPP can not only be used to evaluate a hypothesis that has previously been discovered or suspected, but also to discover new disease loci associations. The results of experiments using simulated and real data sets are presented. Our results concerning simulated data sets indicate that the BNPP exhibits both better evaluation and discovery performance than does a p-value based method. For the real data sets, previous findings in the literature are confirmed and additional findings are found. Conclusions/Significance We conclude that the BNPP resolves a pressing problem by providing a way to compute the posterior probability of complex multi-locus hypotheses. A researcher can use the BNPP to determine the expected utility of investigating a hypothesis further. 
Furthermore, we conclude that the BNPP is a promising method for discovering disease loci associations. PMID:21853025
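    The final normalization over competing hypotheses is ordinary Bayes: each hypothesis's posterior is its likelihood times its prior, divided by the sum over all competitors. The hypothesis names and likelihood values below are illustrative stand-ins, not outputs of the Bayesian network scoring criterion:

```python
# Sketch: posterior of each competing hypothesis from its likelihood.
# Likelihood values and hypothesis names are illustrative.

def posteriors(likelihoods, priors=None):
    """P(h_i | D) = P(D | h_i) P(h_i) / sum_j P(D | h_j) P(h_j)."""
    if priors is None:
        priors = {h: 1.0 / len(likelihoods) for h in likelihoods}
    joint = {h: likelihoods[h] * priors[h] for h in likelihoods}
    z = sum(joint.values())
    return {h: v / z for h, v in joint.items()}

# Competing hypotheses for one locus pair: no association, SNP1 only,
# SNP2 only, or an interacting multi-locus effect.
scores = {"null": 1e-6, "snp1": 4e-6, "snp2": 2e-6, "snp1+snp2": 13e-6}
post = posteriors(scores)
print({h: round(v, 3) for h, v in post.items()})  # multi-locus dominates
```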

  15. The Role of Labour Market Expectations and Admission Probabilities in Students' Application Decisions on Higher Education: The Case of Hungary

    ERIC Educational Resources Information Center

    Varga, Julia

    2006-01-01

    This paper analyses students' application strategies to higher education, the effects of labour market expectations and admission probabilities. The starting hypothesis of this study is that students consider the expected utility of their choices, a function of expected net lifetime earnings and the probability of admission. Based on a survey…

  16. Masculinity-femininity predicts sexual orientation in men but not in women.

    PubMed

    Udry, J Richard; Chantala, Kim

    2006-11-01

    Using the nationally representative sample of about 15,000 Add Health respondents in Wave III, the hypothesis is tested that masculinity-femininity in adolescence is correlated with sexual orientation 5 years later and 6 years later: that is, that for adolescent males in 1995 and again in 1996, more feminine males have a higher probability of self-identifying as homosexuals in 2001-02. It is predicted that for adolescent females in 1995 and 1996, more masculine females have a higher probability of self-identifying as homosexuals in 2001-02. Masculinity-femininity is measured by the classical method used by Terman & Miles. For both time periods, the hypothesis was strongly confirmed for males: the more feminine males had several times the probability of being attracted to same-sex partners, several times the probability of having same-sex partners, and several times the probability of self-identifying as homosexuals, compared with more masculine males. For females, no relationship was found at either time period between masculinity and sex of preference. The biological mechanism underlying homosexuality may be different for males and females.

  17. The relationship between species detection probability and local extinction probability

    USGS Publications Warehouse

    Alpizar-Jara, R.; Nichols, J.D.; Hines, J.E.; Sauer, J.R.; Pollock, K.H.; Rosenberry, C.S.

    2004-01-01

    In community-level ecological studies, generally not all species present in sampled areas are detected. Many authors have proposed the use of estimation methods that allow detection probabilities that are < 1 and that are heterogeneous among species. These methods can also be used to estimate community-dynamic parameters such as species local extinction probability and turnover rates (Nichols et al. Ecol Appl 8:1213-1225; Conserv Biol 12:1390-1398). Here, we present an ad hoc approach to estimating community-level vital rates in the presence of joint heterogeneity of detection probabilities and vital rates. The method consists of partitioning the number of species into two groups using the detection frequencies and then estimating vital rates (e.g., local extinction probabilities) for each group. Estimators from each group are combined in a weighted estimator of vital rates that accounts for the effect of heterogeneity. Using data from the North American Breeding Bird Survey, we computed such estimates and tested the hypothesis that detection probabilities and local extinction probabilities were negatively related. Our analyses support the hypothesis that species detection probability covaries negatively with local probability of extinction and turnover rates. A simulation study was conducted to assess the performance of vital parameter estimators as well as other estimators relevant to questions about heterogeneity, such as coefficient of variation of detection probabilities and proportion of species in each group. Both the weighted estimator suggested in this paper and the original unweighted estimator for local extinction probability performed fairly well and provided no basis for preferring one to the other.
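    The weighted combination step can be sketched directly: estimate the vital rate within each detection-frequency group, then weight each group's estimate by its share of species. The group sizes and extinction estimates below are illustrative:

```python
# Sketch of a weighted vital-rate estimator across detection groups.
# Group sizes and estimates are illustrative assumptions.

def weighted_extinction(groups):
    """groups: list of (n_species, extinction_estimate) per detection group."""
    total = sum(n for n, _ in groups)
    return sum(n * e for n, e in groups) / total

# Species partitioned by detection frequency: rarely detected species tend
# to have a higher estimated local extinction probability.
low_detect = (30, 0.25)   # 30 species, extinction estimate 0.25
high_detect = (70, 0.10)  # 70 species, extinction estimate 0.10
print(round(weighted_extinction([low_detect, high_detect]), 3))  # 0.145
```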

  18. Similar brain networks for detecting visuo-motor and visuo-proprioceptive synchrony.

    PubMed

    Balslev, Daniela; Nielsen, Finn A; Lund, Torben E; Law, Ian; Paulson, Olaf B

    2006-05-15

    The ability to recognize feedback from own movement as opposed to the movement of someone else is important for motor control and social interaction. The neural processes involved in feedback recognition are incompletely understood. Two competing hypotheses have been proposed: the stimulus is compared with either (a) the proprioceptive feedback or with (b) the motor command and if they match, then the external stimulus is identified as feedback. Hypothesis (a) predicts that the neural mechanisms or brain areas involved in distinguishing self from other during passive and active movement are similar, whereas hypothesis (b) predicts that they are different. In this fMRI study, healthy subjects saw visual cursor movement that was either synchronous or asynchronous with their active or passive finger movements. The aim was to identify the brain areas where the neural activity depended on whether the visual stimulus was feedback from own movement and to contrast the functional activation maps for active and passive movement. We found activity increases in the right temporoparietal cortex in the condition with asynchronous relative to synchronous visual feedback from both active and passive movements. However, no statistically significant difference was found between these sets of activated areas when the active and passive movement conditions were compared. With a posterior probability of 0.95, no brain voxel had a contrast effect above 0.11% of the whole-brain mean signal. These results do not support the hypothesis that recognition of visual feedback during active and passive movement relies on different brain areas.

  19. Sex and Adolescent Ethanol Exposure Influence Pavlovian Conditioned Approach.

    PubMed

    Madayag, Aric C; Stringfield, Sierra J; Reissner, Kathryn J; Boettiger, Charlotte A; Robinson, Donita L

    2017-04-01

    Alcohol use among adolescents is widespread and a growing concern due to long-term behavioral deficits, including altered Pavlovian behavior, that potentially contribute to addiction vulnerability. We tested the hypothesis that adolescent intermittent ethanol (AIE) exposure alters Pavlovian behavior in males and females as measured by a shift from goal-tracking to sign-tracking. Additionally, we investigated GLT-1, an astrocytic glutamate transporter, as a potential contributor to a sign-tracking phenotype. Male and female Sprague-Dawley rats were exposed to AIE (5 g/kg, intragastric) or water intermittently 2 days on and 2 days off from postnatal day (P) 25 to 54. Around P70, animals began 20 daily sessions of Pavlovian conditioned approach (PCA), where they learned that a cue predicted noncontingent reward delivery. Lever pressing indicated interaction with the cue, or sign-tracking, and receptacle entries indicated approach to the reward delivery location, or goal-tracking. To test for effects of AIE on nucleus accumbens (NAcc) excitatory signaling, we isolated membrane subfractions and measured protein levels of the glutamate transporter GLT-1 after animals completed behavior as a measure of glutamate homeostasis. Females exhibited elevated sign-tracking compared to males with significantly more lever presses, faster latency to first lever press, and greater probability to lever press in a trial. AIE significantly increased lever pressing while blunting goal-tracking, as indicated by fewer cue-evoked receptacle entries, slower latency to receptacle entry, and lower probability to enter the receptacle in a trial. No significant sex-by-exposure interactions were observed in sign- or goal-tracking metrics. Moreover, we found no significant effects of sex or exposure on membrane GLT-1 expression in the NAcc. Females exhibited enhanced sign-tracking compared to males, while AIE decreased goal-tracking compared to control exposure. 
Our findings support the hypothesis that adolescent binge ethanol can shift conditioned behavior from goal- to cue-directed in PCA, especially in females. Copyright © 2017 by the Research Society on Alcoholism.

  20. Spatial and temporal patterns of coexistence between competing Aedes mosquitoes in urban Florida

    PubMed Central

    Juliano, S. A.

    2009-01-01

    Understanding mechanisms fostering coexistence between invasive and resident species is important in predicting ecological, economic, or health impacts of invasive species. The mosquito Aedes aegypti coexists at some urban sites in southeastern United States with invasive Aedes albopictus, which is often superior in interspecific competition. We tested predictions for three hypotheses of species coexistence: seasonal condition-specific competition, aggregation among individual water-filled containers, and colonization–competition tradeoff across spatially partitioned habitat patches (cemeteries) that have high densities of containers. We measured spatial and temporal patterns of abundance for both species among water-filled resident cemetery vases and experimentally positioned standard cemetery vases and ovitraps in metropolitan Tampa, Florida. Consistent with the seasonal condition-specific competition hypothesis, abundances of both species in resident and standard cemetery vases were higher early in the wet season (June) versus late in the wet season (September), but the proportional increase of A. albopictus was greater than that of A. aegypti, presumably due to higher dry-season egg mortality and strong wet-season competitive superiority of larval A. albopictus. Spatial partitioning was not evident among cemeteries, a result inconsistent with the colonization-competition tradeoff hypothesis, but both species were highly independently aggregated among standard cemetery vases and ovitraps, which is consistent with the aggregation hypothesis. Densities of A. aegypti but not A. albopictus differed among land use categories, with A. aegypti more abundant in ovitraps in residential areas compared to industrial and commercial areas. Spatial partitioning among land use types probably results from effects of land use on conditions in both terrestrial and aquatic-container environments. 
These results suggest that both temporal and spatial variation may contribute to local coexistence between these Aedes in urban areas. PMID:19263086

  2. Explorations in Statistics: Power

    ERIC Educational Resources Information Center

    Curran-Everett, Douglas

    2010-01-01

    Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This fifth installment of "Explorations in Statistics" revisits power, a concept fundamental to the test of a null hypothesis. Power is the probability that we reject the null hypothesis when it is false. Four…

  3. Killeen's (2005) "p[subscript rep]" Coefficient: Logical and Mathematical Problems

    ERIC Educational Resources Information Center

    Maraun, Michael; Gabriel, Stephanie

    2010-01-01

    In his article, "An Alternative to Null-Hypothesis Significance Tests," Killeen (2005) urged the discipline to abandon the practice of "p[subscript obs]"-based null hypothesis testing and to quantify the signal-to-noise characteristics of experimental outcomes with replication probabilities. He described the coefficient that he…

  4. A Comprehensive Breath Plume Model for Disease Transmission via Expiratory Aerosols

    PubMed Central

    Halloran, Siobhan K.; Wexler, Anthony S.; Ristenpart, William D.

    2012-01-01

    The peak in influenza incidence during wintertime in temperate regions represents a longstanding, unresolved scientific question. One hypothesis is that the efficacy of airborne transmission via aerosols is increased at lower humidities and temperatures, conditions that prevail in wintertime. Recent work with a guinea pig model by Lowen et al. indicated that humidity and temperature do modulate airborne influenza virus transmission, and several investigators have interpreted the observed humidity dependence in terms of airborne virus survivability. This interpretation, however, neglects two key observations: the effect of ambient temperature on the viral growth kinetics within the animals, and the strong influence of the background airflow on transmission. Here we provide a comprehensive theoretical framework for assessing the probability of disease transmission via expiratory aerosols between test animals in laboratory conditions. The spread of aerosols emitted from an infected animal is modeled using dispersion theory for a homogeneous turbulent airflow. The concentration and size distribution of the evaporating droplets in the resulting “Gaussian breath plume” are calculated as functions of position, humidity, and temperature. The overall transmission probability is modeled with a combination of the time-dependent viral concentration in the infected animal and the probability of droplet inhalation by the exposed animal downstream. We demonstrate that the breath plume model is broadly consistent with the results of Lowen et al., without invoking airborne virus survivability. The results also suggest that, at least for guinea pigs, variation in viral kinetics within the infected animals is the dominant factor explaining the increased transmission probability observed at lower temperatures. PMID:22615902
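    The "Gaussian breath plume" rests on standard dispersion theory for a homogeneous turbulent flow. A sketch of a textbook point-source form — with an assumed linear growth of plume width, not necessarily the authors' exact parameterization:

```python
# Textbook Gaussian plume concentration downstream of a point source.
# The sigma_coeff growth law is an illustrative assumption.
import math

def plume_concentration(q, u, x, y, z, sigma_coeff=0.1):
    """Mean concentration at (x, y, z) from a continuous point source.

    q: emission rate (particles/s), u: mean airflow speed (m/s),
    x: downstream distance (m); plume widths grow linearly with x here,
    sigma_y = sigma_z = sigma_coeff * x.
    """
    sy = sz = sigma_coeff * x
    return (q / (2.0 * math.pi * u * sy * sz)
            * math.exp(-y**2 / (2.0 * sy**2))
            * math.exp(-z**2 / (2.0 * sz**2)))

# Concentration on the plume centerline falls off as 1/x^2:
c1 = plume_concentration(q=1e4, u=0.5, x=1.0, y=0.0, z=0.0)
c2 = plume_concentration(q=1e4, u=0.5, x=2.0, y=0.0, z=0.0)
print(c1 / c2)  # ~4: doubling downstream distance quarters the exposure
```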

  5. Test of a hypothesis of realism in quantum theory using a Bayesian approach

    NASA Astrophysics Data System (ADS)

    Nikitin, N.; Toms, K.

    2017-05-01

    In this paper we propose a time-independent equality and a time-dependent inequality, suitable for an experimental test of the hypothesis of realism. The derivation of these relations is based on the concept of conditional probability and on Bayes' theorem in the framework of Kolmogorov's axiomatics of probability theory. The equality obtained is intrinsically different from the well-known Greenberger-Horne-Zeilinger (GHZ) equality and its variants, because violation of the proposed equality might be tested in experiments with only two microsystems in a maximally entangled Bell state |Ψ⁻⟩, while a test of the GHZ equality requires at least three quantum systems in a special state |Ψ_GHZ⟩. The obtained inequality differs from Bell's, Wigner's, and Leggett-Garg inequalities, because it deals with spin s = 1/2 projections onto only two nonparallel directions at two different moments of time, while a test of the Bell and Wigner inequalities requires at least three nonparallel directions, and a test of the Leggett-Garg inequalities requires at least three distinct moments of time. Hence, the proposed inequality seems to open an additional experimental possibility to avoid the "contextuality loophole." Violation of the proposed equality and inequality is illustrated with the behavior of a pair of anticorrelated spins in an external magnetic field and also with the oscillations of flavor-entangled pairs of neutral pseudoscalar mesons.

  6. Tracing the footsteps of Sherlock Holmes: cognitive representations of hypothesis testing.

    PubMed

    Van Wallendael, L R; Hastie, R

    1990-05-01

    A well-documented phenomenon in opinion-revision literature is subjects' failure to revise probability estimates for an exhaustive set of mutually exclusive hypotheses in a complementary manner. However, prior research has not addressed the question of whether such behavior simply represents a misunderstanding of mathematical rules, or whether it is a consequence of a cognitive representation of hypotheses that is at odds with the Bayesian notion of a set relationship. Two alternatives to the Bayesian representation, a belief system (Shafer, 1976) and a system of independent hypotheses, were proposed, and three experiments were conducted to examine cognitive representations of hypothesis sets in the testing of multiple competing hypotheses. Subjects were given brief murder mysteries to solve and allowed to request various types of information about the suspects; after having received each new piece of information, subjects rated each suspect's probability of being the murderer. Presence and timing of suspect eliminations were varied in the first two experiments; the final experiment involved the varying of percentages of clues that referred to more than one suspect (for example, all of the female suspects). The noncomplementarity of opinion revisions remained a strong phenomenon in all conditions. Information-search data refuted the idea that subjects represented hypotheses as a Bayesian set; further study of the independent hypotheses theory and Shaferian belief functions as descriptive models is encouraged.

  7. Fisher information framework for time series modeling

    NASA Astrophysics Data System (ADS)

    Venkatesan, R. C.; Plastino, A.

    2017-08-01

    A robust prediction model invoking the Takens embedding theorem, whose working hypothesis is obtained via an inference procedure based on the minimum Fisher information principle, is presented. The coefficients of the ansatz, central to the working hypothesis, satisfy a time-independent Schrödinger-like equation in a vector setting. The inference of (i) the probability density function of the coefficients of the working hypothesis and (ii) the constraint-driven pseudo-inverse condition for the modeling phase of the prediction scheme is made, for the case of normal distributions, with the aid of the quantum mechanical virial theorem. The well-known reciprocity relations and the associated Legendre transform structure for the Fisher information measure (FIM, hereafter)-based model in a vector setting (with least square constraints) are self-consistently derived. These relations are demonstrated to yield an intriguing form of the FIM for the modeling phase, which defines the working hypothesis, solely in terms of the observed data. Prediction is demonstrated on time series obtained from (i) the Mackey-Glass delay-differential equation, (ii) one ECG signal from the MIT-Beth Israel Deaconess Hospital (MIT-BIH) cardiac arrhythmia database, and (iii) one ECG signal from the Creighton University ventricular tachyarrhythmia database. The ECG samples were obtained from the Physionet online repository. These examples demonstrate the efficiency of the prediction model. Numerical examples for exemplary cases are provided.
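
The "Schrödinger-like equation" mentioned above arises from the standard minimum-Fisher-information variational argument. The following LaTeX sketch shows the scalar, moment-constrained case only; the notation is ours, and the paper's vector setting is not reproduced.

```latex
% Fisher information of a probability density p(x):
I[p] = \int \frac{1}{p(x)}\left(\frac{dp}{dx}\right)^{2}\,dx .
% Writing p(x) = \psi^{2}(x) gives I = 4\int (\psi')^{2}\,dx.
% Extremizing I subject to moment constraints
% \int A_k(x)\,p(x)\,dx = \langle A_k \rangle with Lagrange
% multipliers \lambda_k yields, schematically, a time-independent
% Schr\"odinger-like equation for the real amplitude \psi:
\frac{d^{2}\psi}{dx^{2}} + \frac{1}{4}\Big(\sum_k \lambda_k A_k(x)\Big)\,\psi = 0 .
```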

  8. Surveillance system and method having an adaptive sequential probability fault detection test

    NASA Technical Reports Server (NTRS)

    Herzog, James P. (Inventor); Bickford, Randall L. (Inventor)

    2005-01-01

    System and method providing surveillance of an asset such as a process and/or apparatus by providing training and surveillance procedures that numerically fit a probability density function to an observed residual error signal distribution that is correlative to normal asset operation and then utilizes the fitted probability density function in a dynamic statistical hypothesis test for providing improved asset surveillance.
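
The patent's training/surveillance split — fit a probability density function to residual errors from normal operation, then use that fitted density in a sequential hypothesis test — can be sketched as follows. This is a minimal illustration under Gaussian assumptions; the function names, the fixed mean-shift fault model, and the threshold value are our assumptions, not the patented method.

```python
import statistics

def fit_normal(residuals):
    # "Training": fit a Gaussian PDF to residuals observed during
    # normal asset operation.
    mu = statistics.fmean(residuals)
    sigma = statistics.stdev(residuals)
    return mu, sigma

def log_likelihood_ratio(x, mu, sigma, fault_shift):
    # Log-ratio of a shifted-mean "fault" Gaussian vs. the fitted
    # "normal" one; for equal variances it is linear in x.
    return (fault_shift * (x - mu) - 0.5 * fault_shift**2) / sigma**2

def surveillance(residual_stream, mu, sigma, fault_shift=1.0, threshold=5.0):
    # "Surveillance": accumulate evidence sequentially; alarm when the
    # accumulated log-likelihood ratio crosses the threshold.
    evidence = 0.0
    for i, x in enumerate(residual_stream):
        evidence += log_likelihood_ratio(x, mu, sigma, fault_shift)
        if evidence >= threshold:
            return i  # sample index at which a fault is declared
    return None

mu, sigma = fit_normal([0.1, -0.2, 0.05, -0.1, 0.15, -0.05])
alarm_at = surveillance([2.0] * 20, mu, sigma)
```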

  9. Surveillance system and method having an adaptive sequential probability fault detection test

    NASA Technical Reports Server (NTRS)

    Bickford, Randall L. (Inventor); Herzog, James P. (Inventor)

    2006-01-01

    System and method providing surveillance of an asset such as a process and/or apparatus by providing training and surveillance procedures that numerically fit a probability density function to an observed residual error signal distribution that is correlative to normal asset operation and then utilizes the fitted probability density function in a dynamic statistical hypothesis test for providing improved asset surveillance.

  10. Surveillance System and Method having an Adaptive Sequential Probability Fault Detection Test

    NASA Technical Reports Server (NTRS)

    Bickford, Randall L. (Inventor); Herzog, James P. (Inventor)

    2008-01-01

    System and method providing surveillance of an asset such as a process and/or apparatus by providing training and surveillance procedures that numerically fit a probability density function to an observed residual error signal distribution that is correlative to normal asset operation and then utilizes the fitted probability density function in a dynamic statistical hypothesis test for providing improved asset surveillance.

  11. Distributed Immune Systems for Wireless Network Information Assurance

    DTIC Science & Technology

    2010-04-26

    ratio test (SPRT), where the goal is to optimize a hypothesis testing problem given a trade-off between the probability of errors and the...using cumulative sum (CUSUM) and Girshik-Rubin-Shiryaev (GRSh) statistics. In sequential versions of the problem the sequential probability ratio ...the more complicated problems, in particular those where no clear mean can be established. We developed algorithms based on the sequential probability
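
The CUSUM statistic named in the snippet accumulates evidence of an upward shift in a monitored mean. A minimal stdlib sketch; the slack and threshold values are illustrative assumptions.

```python
def cusum_alarm(xs, target_mean=0.0, slack=0.5, threshold=4.0):
    """One-sided CUSUM: detect an upward shift in the mean of xs.

    S_t = max(0, S_{t-1} + (x_t - target_mean - slack)); alarm when
    S_t exceeds threshold. The slack (k) tunes sensitivity, the
    threshold (h) the false-alarm rate.
    """
    s = 0.0
    for i, x in enumerate(xs):
        s = max(0.0, s + (x - target_mean - slack))
        if s > threshold:
            return i  # index at which the alarm fires
    return None

# In-control data hovers near 0; the shifted stream drifts upward at t=10.
in_control = [0.1, -0.2, 0.3, -0.1, 0.0] * 4
shifted = in_control[:10] + [1.5] * 10
```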

  12. A Gaussian Model-Based Probabilistic Approach for Pulse Transit Time Estimation.

    PubMed

    Jang, Dae-Geun; Park, Seung-Hun; Hahn, Minsoo

    2016-01-01

    In this paper, we propose a new probabilistic approach to pulse transit time (PTT) estimation using a Gaussian distribution model. It is motivated basically by the hypothesis that PTTs normalized by RR intervals follow the Gaussian distribution. To verify the hypothesis, we demonstrate the effects of arterial compliance on the normalized PTTs using the Moens-Korteweg equation. Furthermore, we observe a Gaussian distribution of the normalized PTTs on real data. In order to estimate the PTT using the hypothesis, we first assumed that R-waves in the electrocardiogram (ECG) can be correctly identified. The R-waves limit searching ranges to detect pulse peaks in the photoplethysmogram (PPG) and to synchronize the results with cardiac beats--i.e., the peaks of the PPG are extracted within the corresponding RR interval of the ECG as pulse peak candidates. Their probabilities of being the actual pulse peak are then calculated using a Gaussian probability function. The parameters of the Gaussian function are automatically updated when a new pulse peak is identified. This update makes the probability function adaptive to variations of cardiac cycles. Finally, the pulse peak is identified as the candidate with the highest probability. The proposed approach is tested on a database where ECG and PPG waveforms are collected simultaneously during the submaximal bicycle ergometer exercise test. The results are promising, suggesting that the method provides a simple but more accurate PTT estimation in real applications.
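
The candidate-scoring scheme described above — normalize each candidate PTT by the RR interval, score it with a Gaussian probability function, pick the highest-scoring candidate, and adaptively update the Gaussian parameters — can be sketched as follows. The particular update rule and all numeric values are our illustrative assumptions, not the authors' exact formulas.

```python
import math

def gaussian_score(x, mu, sigma):
    # Likelihood of a normalized PTT candidate under the Gaussian model.
    return math.exp(-(x - mu)**2 / (2 * sigma**2)) / (sigma * math.sqrt(2 * math.pi))

def pick_pulse_peak(candidate_ptts, rr_interval, mu, sigma):
    # Normalize each candidate PTT by the current RR interval, score it,
    # and return the candidate with the highest Gaussian probability.
    scored = [(gaussian_score(ptt / rr_interval, mu, sigma), ptt)
              for ptt in candidate_ptts]
    return max(scored)[1]

def update_model(mu, sigma, new_value, alpha=0.1):
    # Adaptive update: nudge the Gaussian parameters toward each newly
    # accepted normalized PTT (exponential forgetting at rate alpha).
    mu = (1 - alpha) * mu + alpha * new_value
    sigma = (1 - alpha) * sigma + alpha * abs(new_value - mu)
    return mu, sigma

# RR interval of 0.8 s; model says normalized PTTs cluster near 0.25
mu, sigma = 0.25, 0.05
best = pick_pulse_peak([0.12, 0.20, 0.31], rr_interval=0.8, mu=mu, sigma=sigma)
mu, sigma = update_model(mu, sigma, best / 0.8)
```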

  13. Information Use Differences in Hot and Cold Risk Processing: When Does Information About Probability Count in the Columbia Card Task?

    PubMed

    Markiewicz, Łukasz; Kubińska, Elżbieta

    2015-01-01

    This paper aims to provide insight into information processing differences between hot and cold risk taking decision tasks within a single domain. Decision theory defines risky situations using at least three parameters: outcome one (often a gain) with its probability and outcome two (often a loss) with a complementary probability. Although a rational agent should consider all of the parameters, s/he could potentially narrow their focus to only some of them, particularly when explicit Type 2 processes do not have the resources to override implicit Type 1 processes. Here we investigate differences in risky situation parameters' influence on hot and cold decisions. Although previous studies show lower information use in hot than in cold processes, they do not provide decision weight changes and therefore do not explain whether this difference results from worse concentration on each parameter of a risky situation (probability, gain amount, and loss amount) or from ignoring some parameters. Two studies were conducted, with participants performing the Columbia Card Task (CCT) in either its Cold or Hot version. In the first study, participants also performed the Cognitive Reflection Test (CRT) to monitor their ability to override Type 1 processing cues (implicit processes) with Type 2 explicit processes. Because hypothesis testing required comparison of the relative importance of risky situation decision weights (gain, loss, probability), we developed a novel way of measuring information use in the CCT by employing a conjoint analysis methodology. Across the two studies, results indicated that in the CCT Cold condition decision makers concentrate on each information type (gain, loss, probability), but in the CCT Hot condition they concentrate mostly on a single parameter: probability of gain/loss. We also show that an individual's CRT score correlates with information use propensity in cold but not hot tasks. 
Thus, the affective dimension of hot tasks inhibits correct information processing, probably because it is difficult to engage Type 2 processes in such circumstances. Individuals' Type 2 processing abilities (measured by the CRT) assist greater use of information in cold tasks but do not help in hot tasks.

  14. Information Use Differences in Hot and Cold Risk Processing: When Does Information About Probability Count in the Columbia Card Task?

    PubMed Central

    Markiewicz, Łukasz; Kubińska, Elżbieta

    2015-01-01

    Objective: This paper aims to provide insight into information processing differences between hot and cold risk taking decision tasks within a single domain. Decision theory defines risky situations using at least three parameters: outcome one (often a gain) with its probability and outcome two (often a loss) with a complementary probability. Although a rational agent should consider all of the parameters, s/he could potentially narrow their focus to only some of them, particularly when explicit Type 2 processes do not have the resources to override implicit Type 1 processes. Here we investigate differences in risky situation parameters' influence on hot and cold decisions. Although previous studies show lower information use in hot than in cold processes, they do not provide decision weight changes and therefore do not explain whether this difference results from worse concentration on each parameter of a risky situation (probability, gain amount, and loss amount) or from ignoring some parameters. Methods: Two studies were conducted, with participants performing the Columbia Card Task (CCT) in either its Cold or Hot version. In the first study, participants also performed the Cognitive Reflection Test (CRT) to monitor their ability to override Type 1 processing cues (implicit processes) with Type 2 explicit processes. Because hypothesis testing required comparison of the relative importance of risky situation decision weights (gain, loss, probability), we developed a novel way of measuring information use in the CCT by employing a conjoint analysis methodology. Results: Across the two studies, results indicated that in the CCT Cold condition decision makers concentrate on each information type (gain, loss, probability), but in the CCT Hot condition they concentrate mostly on a single parameter: probability of gain/loss. We also show that an individual's CRT score correlates with information use propensity in cold but not hot tasks. 
Thus, the affective dimension of hot tasks inhibits correct information processing, probably because it is difficult to engage Type 2 processes in such circumstances. Individuals' Type 2 processing abilities (measured by the CRT) assist greater use of information in cold tasks but do not help in hot tasks. PMID:26635652

  15. Computing under-ice discharge: A proof-of-concept using hydroacoustics and the Probability Concept

    NASA Astrophysics Data System (ADS)

    Fulton, John W.; Henneberg, Mark F.; Mills, Taylor J.; Kohn, Michael S.; Epstein, Brian; Hittle, Elizabeth A.; Damschen, William C.; Laveau, Christopher D.; Lambrecht, Jason M.; Farmer, William H.

    2018-07-01

    Under-ice discharge is estimated using open-water reference hydrographs; however, the ratings for ice-affected sites are generally qualified as poor. The U.S. Geological Survey (USGS), in collaboration with the Colorado Water Conservation Board, conducted a proof-of-concept to develop an alternative method for computing under-ice discharge using hydroacoustics and the Probability Concept. The study site was located south of Minturn, Colorado (CO), USA, and was selected because of (1) its proximity to the existing USGS streamgage 09064600 Eagle River near Minturn, CO, and (2) its ease-of-access to verify discharge using a variety of conventional methods. From late September 2014 to early March 2015, hydraulic conditions varied from open water to under ice. These temporal changes led to variations in water depth and velocity. Hydroacoustics (tethered and uplooking acoustic Doppler current profilers and acoustic Doppler velocimeters) were deployed to measure the vertical-velocity profile at a singularly important vertical of the channel-cross section. Because the velocity profile was non-standard and cannot be characterized using a Power Law or Log Law, velocity data were analyzed using the Probability Concept, which is a probabilistic formulation of the velocity distribution. The Probability Concept-derived discharge was compared to conventional methods including stage-discharge and index-velocity ratings and concurrent field measurements; each is complicated by the dynamics of ice formation, pressure influences on stage measurements, and variations in cross-sectional area due to ice formation. No particular discharge method was assigned as truth. Rather, one statistical metric (Kolmogorov-Smirnov; KS), agreement plots, and concurrent measurements provided a measure of comparability between various methods. 
Regardless of the method employed, comparisons among the methods revealed encouraging results depending on the flow conditions and the absence or presence of ice cover. For example, during lower discharges dominated by under-ice and transition (intermittent open-water and under-ice) conditions, the KS metric suggests there is not sufficient information to reject the null hypothesis and implies that the Probability Concept and index-velocity rating represent similar distributions. During high-flow, open-water conditions, the comparisons are less definitive; therefore, it is important that the appropriate analytical method and instrumentation be selected. Six conventional discharge measurements were collected concurrently with Probability Concept-derived discharges with percent differences (%) of -9.0%, -21%, -8.6%, 17.8%, 3.6%, and -2.3%. This proof-of-concept demonstrates that riverine discharges can be computed using the Probability Concept for a range of hydraulic extremes (variations in discharge, open-water and under-ice conditions) immediately after the siting phase is complete, which typically requires one day. Computing real-time discharges is particularly important at sites where (1) new streamgages are planned, (2) river hydraulics are complex, and (3) shifts in the stage-discharge rating are needed to correct the streamflow record. Use of the Probability Concept does not preclude the need to maintain a stage-area relation. Both the Probability Concept and index-velocity rating offer water-resource managers and decision makers alternatives for computing real-time discharge for open-water and under-ice conditions.
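
The KS comparison and percent differences used in the study can be illustrated with a stdlib-only sketch. The two discharge series below are hypothetical, and this computes only the two-sample KS statistic, not its significance level.

```python
import bisect

def percent_difference(q_method, q_reference):
    # Percent difference between a method-derived discharge and a
    # concurrent reference measurement, as reported in the study.
    return 100.0 * (q_method - q_reference) / q_reference

def ks_statistic(sample_a, sample_b):
    # Two-sample Kolmogorov-Smirnov statistic: the maximum vertical
    # distance between the two empirical CDFs.
    a, b = sorted(sample_a), sorted(sample_b)

    def ecdf(sorted_xs, v):
        # fraction of sorted_xs less than or equal to v
        return bisect.bisect_right(sorted_xs, v) / len(sorted_xs)

    values = sorted(set(a) | set(b))
    return max(abs(ecdf(a, v) - ecdf(b, v)) for v in values)

# Two hypothetical daily-discharge series (m^3/s) from different methods
prob_concept = [1.1, 1.3, 1.2, 1.4, 1.2, 1.5]
index_velocity = [1.2, 1.3, 1.1, 1.4, 1.3, 1.2]
d = ks_statistic(prob_concept, index_velocity)
```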

  16. Computing under-ice discharge: A proof-of-concept using hydroacoustics and the Probability Concept

    USGS Publications Warehouse

    Fulton, John W.; Henneberg, Mark F.; Mills, Taylor J.; Kohn, Michael S.; Epstein, Brian; Hittle, Elizabeth A.; Damschen, William C.; Laveau, Christopher D.; Lambrecht, Jason M.; Farmer, William H.

    2018-01-01

    Under-ice discharge is estimated using open-water reference hydrographs; however, the ratings for ice-affected sites are generally qualified as poor. The U.S. Geological Survey (USGS), in collaboration with the Colorado Water Conservation Board, conducted a proof-of-concept to develop an alternative method for computing under-ice discharge using hydroacoustics and the Probability Concept. The study site was located south of Minturn, Colorado (CO), USA, and was selected because of (1) its proximity to the existing USGS streamgage 09064600 Eagle River near Minturn, CO, and (2) its ease-of-access to verify discharge using a variety of conventional methods. From late September 2014 to early March 2015, hydraulic conditions varied from open water to under ice. These temporal changes led to variations in water depth and velocity. Hydroacoustics (tethered and uplooking acoustic Doppler current profilers and acoustic Doppler velocimeters) were deployed to measure the vertical-velocity profile at a singularly important vertical of the channel-cross section. Because the velocity profile was non-standard and cannot be characterized using a Power Law or Log Law, velocity data were analyzed using the Probability Concept, which is a probabilistic formulation of the velocity distribution. The Probability Concept-derived discharge was compared to conventional methods including stage-discharge and index-velocity ratings and concurrent field measurements; each is complicated by the dynamics of ice formation, pressure influences on stage measurements, and variations in cross-sectional area due to ice formation. No particular discharge method was assigned as truth. Rather, one statistical metric (Kolmogorov-Smirnov; KS), agreement plots, and concurrent measurements provided a measure of comparability between various methods. 
Regardless of the method employed, comparisons among the methods revealed encouraging results depending on the flow conditions and the absence or presence of ice cover. For example, during lower discharges dominated by under-ice and transition (intermittent open-water and under-ice) conditions, the KS metric suggests there is not sufficient information to reject the null hypothesis and implies that the Probability Concept and index-velocity rating represent similar distributions. During high-flow, open-water conditions, the comparisons are less definitive; therefore, it is important that the appropriate analytical method and instrumentation be selected. Six conventional discharge measurements were collected concurrently with Probability Concept-derived discharges with percent differences (%) of −9.0%, −21%, −8.6%, 17.8%, 3.6%, and −2.3%. This proof-of-concept demonstrates that riverine discharges can be computed using the Probability Concept for a range of hydraulic extremes (variations in discharge, open-water and under-ice conditions) immediately after the siting phase is complete, which typically requires one day. Computing real-time discharges is particularly important at sites where (1) new streamgages are planned, (2) river hydraulics are complex, and (3) shifts in the stage-discharge rating are needed to correct the streamflow record. Use of the Probability Concept does not preclude the need to maintain a stage-area relation. Both the Probability Concept and index-velocity rating offer water-resource managers and decision makers alternatives for computing real-time discharge for open-water and under-ice conditions.

  17. A further test of sequential-sampling models that account for payoff effects on response bias in perceptual decision tasks.

    PubMed

    Diederich, Adele

    2008-02-01

    Recently, Diederich and Busemeyer (2006) evaluated three hypotheses formulated as particular versions of a sequential-sampling model to account for the effects of payoffs in a perceptual decision task with time constraints. The bound-change hypothesis states that payoffs affect the distance of the starting position of the decision process to each decision bound. The drift-rate-change hypothesis states that payoffs affect the drift rate of the decision process. The two-stage-processing hypothesis assumes two processes, one for processing payoffs and another for processing stimulus information, and that on a given trial, attention switches from one process to the other. The latter hypothesis gave the best account of their data. The present study investigated two questions: (1) Does the experimental setting influence decisions, and consequently affect the fits of the hypotheses? A task was conducted in two experimental settings--either the time limit or the payoff matrix was held constant within a given block of trials, using three different payoff matrices and four different time limits--in order to answer this question. (2) Could it be that participants neglect payoffs on some trials and stimulus information on others? To investigate this idea, a further hypothesis was considered, the mixture-of-processes hypothesis. Like the two-stage-processing hypothesis, it postulates two processes, one for payoffs and another for stimulus information. However, it differs from the previous hypothesis in assuming that on a given trial exactly one of the processes operates, never both. The present design had no effect on choice probability but may have affected choice response times (RTs). Overall, the two-stage-processing hypothesis gave the best account, with respect both to choice probabilities and to observed mean RTs and mean RT patterns within a choice pair.
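
The bound-change and drift-rate-change hypotheses can be illustrated with a toy sequential-sampling (diffusion) simulation: shifting the starting point toward a bound, or biasing the drift rate, both raise the probability of the corresponding choice. This is a minimal Monte Carlo sketch; all parameter values are illustrative assumptions, not the fitted models of the study.

```python
import random

def choice_probability(drift, start, bound=1.0, dt=0.01, noise=1.0,
                       n_trials=2000, seed=0):
    """Monte Carlo estimate of P(hit upper bound) for a diffusion
    process started at `start` in (0, bound) with the given drift."""
    rng = random.Random(seed)
    upper_hits = 0
    for _ in range(n_trials):
        x = start
        while 0.0 < x < bound:
            # Euler step: deterministic drift plus Gaussian noise
            x += drift * dt + noise * rng.gauss(0.0, dt ** 0.5)
        if x >= bound:
            upper_hits += 1
    return upper_hits / n_trials

baseline = choice_probability(drift=0.0, start=0.5)
bound_change = choice_probability(drift=0.0, start=0.7)  # payoff shifts start point
drift_change = choice_probability(drift=0.5, start=0.5)  # payoff shifts drift rate
```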

  18. Medial prefrontal cortex reacts to unfairness if this damages the self: a tDCS study

    PubMed Central

    Miniussi, Carlo; Rumiati, Raffaella I.

    2015-01-01

    Neural correlates of unfairness perception depend on who is the target of the unfair treatment. Previous findings suggest that the activation of medial prefrontal cortex (MPFC) is related to unfairness perception only when the subject of the measurement is also the person affected by the unfair treatment. We aim to demonstrate the specificity of MPFC involvement using transcranial direct current stimulation (tDCS), a technique that induces cortical excitability changes in the targeted region. We use a modified version of the Ultimatum Game, in which responders play both for themselves (myself—MS condition) and on behalf of an unknown third party (TP condition), where they respond to unfairness without being the target of it. We find that the application of cathodal tDCS over MPFC decreases the probability of rejecting unfair offers in MS, but not in TP; conversely, the same stimulation increases the probability of rejecting fair offers in TP, but not in MS. We confirm the hypothesis that MPFC is specifically related to processing unfairness when the self is involved, and discuss possible explanations for the opposite effect of the stimulation in TP. PMID:25552567

  19. Why Current Statistics of Complementary Alternative Medicine Clinical Trials is Invalid.

    PubMed

    Pandolfi, Maurizio; Carreras, Giulia

    2018-06-07

    It is not sufficiently known that frequentist statistics cannot provide direct information on the probability that the research hypothesis tested is correct. The error resulting from this misunderstanding is compounded when the hypotheses under scrutiny have precarious scientific bases, as those of complementary alternative medicine (CAM) generally do. In such cases, it is mandatory to use inferential methods that account for the prior probability that the hypothesis tested is true, such as Bayesian statistics. The authors show that, under such circumstances, no real statistical significance can be achieved in CAM clinical trials. In this respect, CAM trials involving human material are also hardly defensible from an ethical viewpoint.
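
The authors' point — that a "significant" frequentist result says little when the prior probability of the hypothesis is low — follows directly from Bayes' theorem. A minimal sketch; the power, alpha, and prior values are illustrative assumptions.

```python
def posterior_probability_true(prior, power=0.8, alpha=0.05):
    """Probability that the research hypothesis is true given a
    statistically significant result, via Bayes' theorem:
    P(H|sig) = P(sig|H)P(H) / [P(sig|H)P(H) + P(sig|not H)P(not H)],
    with P(sig|H) = power and P(sig|not H) = alpha.
    """
    return (power * prior) / (power * prior + alpha * (1 - prior))

# A mainstream hypothesis with decent prior support
mainstream = posterior_probability_true(prior=0.5)
# A hypothesis with a precarious scientific basis (low prior)
cam = posterior_probability_true(prior=0.01)
```

With the same significant p-value, the mainstream hypothesis ends up very probably true while the low-prior one remains very probably false, which is the core of the authors' argument.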

  20. Investigating Soil Moisture Feedbacks on Precipitation With Tests of Granger Causality

    NASA Astrophysics Data System (ADS)

    Salvucci, G. D.; Saleem, J. A.; Kaufmann, R.

    2002-05-01

    Granger causality (GC) is used in the econometrics literature to identify the presence of one- and two-way coupling between terms in noisy multivariate dynamical systems. Here we test for the presence of GC to identify a soil moisture (S) feedback on precipitation (P) using data from Illinois. In this framework, S is said to Granger cause P if F(P_t; A_(t-dt)) does not equal F(P_t; (A-S)_(t-dt)), where F denotes the conditional distribution of P at time t, A_(t-dt) represents the set of all knowledge available at time t-dt, and (A-S)_(t-dt) represents all knowledge available at t-dt except S. Critical for land-atmosphere interaction research is that A_(t-dt) includes all past information on P as well as S. Therefore that part of the relation between past soil moisture and current precipitation which results from precipitation autocorrelation and soil water balance will be accounted for and not attributed to causality. Tests for GC usually specify all relevant variables in a coupled vector autoregressive (VAR) model and then calculate the significance level of decreased predictability as various coupling coefficients are omitted. But because the data (daily precipitation and soil moisture) are distinctly non-Gaussian, we avoid using a VAR and instead express the daily precipitation events as a Markov model. We then test whether the probability of storm occurrence, conditioned on past information on precipitation, changes with information on soil moisture. Past information on precipitation is expressed both as the occurrence of previous day precipitation (to account for storm-scale persistence) and as a simple soil moisture-like precipitation-wetness index derived solely from precipitation (to account for seasonal-scale persistence). In this way only those fluctuations in moisture not attributable to past fluctuations in precipitation (e.g., those due to temperature) can influence the outcome of the test. 
The null hypothesis (no moisture influence) is evaluated by comparing observed changes in storm probability to Monte-Carlo simulated differences generated with unconditional occurrence probabilities. The null hypothesis is not rejected (p>0.5) suggesting that contrary to recently published results, insufficient evidence exists to support an influence of soil moisture on precipitation in Illinois.
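
The Monte Carlo evaluation of the null hypothesis described above can be sketched as a permutation test on the conditional storm probability. The data here are synthetic and the test is deliberately simplified (a single binary soil-moisture label, no conditioning on past precipitation), so it only illustrates the mechanics, not the study's full Markov-model design.

```python
import random

def storm_prob_difference(storms, soil_wet):
    # Difference in storm-occurrence frequency between wet-soil and
    # dry-soil prior days.
    wet = [s for s, w in zip(storms, soil_wet) if w]
    dry = [s for s, w in zip(storms, soil_wet) if not w]
    return sum(wet) / len(wet) - sum(dry) / len(dry)

def permutation_p_value(storms, soil_wet, n_perm=2000, seed=0):
    # Monte Carlo null: shuffling the soil-moisture labels destroys any
    # real dependence of storm probability on soil state, so the observed
    # difference is compared against the shuffled-label distribution.
    rng = random.Random(seed)
    observed = storm_prob_difference(storms, soil_wet)
    labels = list(soil_wet)
    exceed = 0
    for _ in range(n_perm):
        rng.shuffle(labels)
        if storm_prob_difference(storms, labels) >= observed:
            exceed += 1
    return exceed / n_perm

# Hypothetical daily records: storm indicator vs. wet-soil label
rng = random.Random(42)
soil = [rng.random() < 0.5 for _ in range(200)]
storms = [int(rng.random() < 0.3) for _ in soil]  # independent of soil
p_independent = permutation_p_value(storms, soil)

# With storms perfectly tied to soil state, the null is clearly rejected
storms_dependent = [int(w) for w in soil]
p_dependent = permutation_p_value(storms_dependent, soil)
```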

  1. Hypotheses to explain the origin of species in Amazonia.

    PubMed

    Haffer, J

    2008-11-01

    The main hypotheses proposed to explain barrier formation separating populations and causing the differentiation of species in Amazonia during the course of geological history are based on different factors, as follows: (1) changes in the distribution of land and sea or in the landscape due to tectonic movements or sea level fluctuations (Paleogeography hypothesis), (2) the barrier effect of Amazonian rivers (River hypothesis), (3) a combination of the barrier effect of broad rivers and vegetational changes in northern and southern Amazonia (River-refuge hypothesis), (4) the isolation of humid rainforest blocks near areas of surface relief in the periphery of Amazonia separated by dry forests, savannas and other intermediate vegetation types during dry climatic periods of the Tertiary and Quaternary (Refuge hypothesis), (5) changes in canopy density due to climatic reversals (Canopy-density hypothesis), (6) the isolation and speciation of animal populations in small montane habitat pockets around Amazonia due to climatic fluctuations without major vegetational changes (Museum hypothesis), (7) competitive species interactions and local species isolations in peripheral regions of Amazonia due to invasion and counterinvasion during cold/warm periods of the Pleistocene (Disturbance-vicariance hypothesis), and (8) parapatric speciation across steep environmental gradients without separation of the respective populations (Gradient hypothesis). Several of these hypotheses are probably relevant to different degrees for the speciation processes in different faunal groups or during different geological periods. The basic paleogeography model refers mainly to faunal differentiation during the Tertiary and in combination with the Refuge hypothesis. 
Milankovitch cycles leading to global climatic-vegetational changes affected the biomes of the world not only during the Pleistocene but also during the Tertiary and earlier geological periods. 
New geoscientific evidence for the effect of dry climatic periods in Amazonia supports the predictions of the Refuge hypothesis. The disturbance-vicariance hypothesis refers to the presumed effect of cold/warm climatic phases of the Pleistocene only and is of limited general relevance because most extant species originated earlier and probably through paleogeographic changes and the formation of ecological refuges during the Tertiary.

  2. Bayesian enhancement two-stage design for single-arm phase II clinical trials with binary and time-to-event endpoints.

    PubMed

    Shi, Haolun; Yin, Guosheng

    2018-02-21

    Simon's two-stage design is one of the most commonly used methods in phase II clinical trials with binary endpoints. The design tests the null hypothesis that the response rate is less than an uninteresting level, versus the alternative hypothesis that the response rate is greater than a desirable target level. From a Bayesian perspective, we compute the posterior probabilities of the null and alternative hypotheses given that a promising result is declared in Simon's design. Our study reveals that because the frequentist hypothesis testing framework places its focus on the null hypothesis, a potentially efficacious treatment identified by rejecting the null under Simon's design could have only less than 10% posterior probability of attaining the desirable target level. Due to the indifference region between the null and alternative, rejecting the null does not necessarily mean that the drug achieves the desirable response level. To clarify such ambiguity, we propose a Bayesian enhancement two-stage (BET) design, which guarantees a high posterior probability of the response rate reaching the target level, while allowing for early termination and sample size saving in case that the drug's response rate is smaller than the clinically uninteresting level. Moreover, the BET design can be naturally adapted to accommodate survival endpoints. We conduct extensive simulation studies to examine the empirical performance of our design and present two trial examples as applications. © 2018, The International Biometric Society.
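
The posterior probabilities at the heart of the BET argument follow from Beta-Binomial conjugacy. Below is a minimal Monte Carlo sketch using only the standard library; the trial numbers, uniform prior, and p0/p1 levels are illustrative assumptions, not the paper's worked examples. It illustrates the paper's point: an outcome that comfortably clears the null can still have well under a 50% posterior probability of reaching the desirable target.

```python
import random

def posterior_prob_exceeds(responses, n, target, a_prior=1.0, b_prior=1.0,
                           n_draws=20000, seed=0):
    """Posterior probability that the true response rate exceeds `target`,
    given `responses` successes out of `n` patients and a
    Beta(a_prior, b_prior) prior (Beta-Binomial conjugacy), estimated
    by Monte Carlo sampling from the Beta posterior.
    """
    rng = random.Random(seed)
    a = a_prior + responses            # posterior alpha
    b = b_prior + n - responses        # posterior beta
    draws = (rng.betavariate(a, b) for _ in range(n_draws))
    return sum(1 for p in draws if p > target) / n_draws

# Simon-style outcome: 12/35 responses; null p0 = 0.20, target p1 = 0.40
prob_above_null = posterior_prob_exceeds(12, 35, target=0.20)
prob_above_target = posterior_prob_exceeds(12, 35, target=0.40)
```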

  3. Sequential Probability Ratio Test for Collision Avoidance Maneuver Decisions Based on a Bank of Norm-Inequality-Constrained Epoch-State Filters

    NASA Technical Reports Server (NTRS)

    Carpenter, J. R.; Markley, F. L.; Alfriend, K. T.; Wright, C.; Arcido, J.

    2011-01-01

    Sequential probability ratio tests explicitly allow decision makers to incorporate false alarm and missed detection risks, and are potentially less sensitive to modeling errors than a procedure that relies solely on a probability of collision threshold. Recent work on constrained Kalman filtering has suggested an approach to formulating such a test for collision avoidance maneuver decisions: a filter bank with two norm-inequality-constrained epoch-state extended Kalman filters. One filter models the null hypothesis that the miss distance is inside the combined hard body radius at the predicted time of closest approach, and one filter models the alternative hypothesis. The epoch-state filter developed for this method explicitly accounts for any process noise present in the system. The method appears to work well using a realistic example based on an upcoming highly-elliptical orbit formation flying mission.
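
    The decision logic of a sequential probability ratio test can be illustrated with Wald's classic scalar form. This is a generic sketch, not the filter-bank formulation above; the Gaussian model and the risk levels are illustrative assumptions.

```python
import math

def wald_sprt(samples, mu0, mu1, sigma, alpha=0.01, beta=0.10):
    """Wald's SPRT for the mean of a Gaussian with known sigma.
    alpha = false alarm risk, beta = missed detection risk.
    Returns ('H0' | 'H1' | 'continue', number of samples used)."""
    upper = math.log((1 - beta) / alpha)  # cross above: accept H1
    lower = math.log(beta / (1 - alpha))  # cross below: accept H0
    llr = 0.0
    for n, x in enumerate(samples, start=1):
        # log-likelihood ratio increment for one observation
        llr += ((x - mu0) ** 2 - (x - mu1) ** 2) / (2 * sigma ** 2)
        if llr >= upper:
            return "H1", n
        if llr <= lower:
            return "H0", n
    return "continue", len(samples)

# Observations at the alternative mean trigger H1 after a handful of samples;
# observations at the null mean trigger H0 even sooner (beta > alpha here,
# so the H0 threshold is closer).
print(wald_sprt([1.0] * 50, mu0=0.0, mu1=1.0, sigma=1.0))
print(wald_sprt([0.0] * 50, mu0=0.0, mu1=1.0, sigma=1.0))
```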

  4. Assessing flight safety differences between the United States regional and major airlines

    NASA Astrophysics Data System (ADS)

    Sharp, Broderick H.

    During 2008, U.S. domestic airline departures exceeded 28,000 flights per day. Thirty-nine of these flights, fewer than 0.2 of 1%, resulted in operational incidents or accidents; even this low percentage, however, continues to cause human suffering and property loss. This study compared the safety histories of U.S. major and regional airlines, spanning safety events from January 1982 through December 2008. In this quantitative analysis, domestic major and regional airlines were statistically tested for differences in flight safety. Four major airlines and thirty-seven regional airlines qualified for the study, which compared the airline groups' fatal accidents, incidents, non-fatal accidents, pilot errors, and the six remaining probable cause types: mechanical failure, weather, air traffic control, maintenance, other, and unknown causes. The National Transportation Safety Board investigated each airline safety event and assigned a probable cause to each. A sample of 500 events was randomly selected from the population of 1,391 airline accidents and incidents. The airline groups' safety event probabilities were estimated using least squares linear regression, and a significance level of 5% was chosen to decide each research question's hypothesis. The significance levels for airline fatal accidents and incidents were 1.2% and 0.05% respectively; because these fell below the 5% threshold, the fatal accident and non-destructive incident results favored the hypothesis that the airline groups differ in safety. The linear regression estimates for the remaining three research questions were significance levels of 71.5% for non-fatal accidents, 21.8% for pilot errors, and 7.4% for the six probable causes; because these exceed the 5% level, the three research questions favored the hypothesis that the airline groups are similar in safety. The study indicates that U.S. domestic major airlines were safer than regional airlines. Ideas for potential airline safety progress include examining pilot fatigue, the airline groups' hiring policies, the government's airline oversight personnel, and comparisons of individual airlines' operational policies.

  5. Test of association: which one is the most appropriate for my study?

    PubMed

    Gonzalez-Chica, David Alejandro; Bastos, João Luiz; Duquia, Rodrigo Pereira; Bonamigo, Renan Rangel; Martínez-Mesa, Jeovany

    2015-01-01

    Hypothesis tests are statistical tools widely used for assessing whether or not there is an association between two or more variables. These tests provide a probability of the type 1 error (p-value), which is used to accept or reject the null study hypothesis. This article provides a practical guide to help researchers carefully select the most appropriate procedure to answer their research question. We discuss the logic of hypothesis testing and present the prerequisites of each procedure based on practical examples.

  6. Planned Hypothesis Tests Are Not Necessarily Exempt from Multiplicity Adjustment

    ERIC Educational Resources Information Center

    Frane, Andrew V.

    2015-01-01

    Scientific research often involves testing more than one hypothesis at a time, which can inflate the probability that a Type I error (false discovery) will occur. To prevent this Type I error inflation, adjustments can be made to the testing procedure that compensate for the number of tests. Yet many researchers believe that such adjustments are…
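
    A standard family-wise adjustment of the kind the article discusses can be sketched as follows. Holm's step-down procedure is one common choice; the p-values below are made up for illustration.

```python
def holm_bonferroni(pvals, alpha=0.05):
    """Holm's step-down adjustment: returns, per hypothesis, whether it is
    rejected while keeping the family-wise Type I error rate at alpha."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    reject = [False] * m
    for rank, i in enumerate(order):
        if pvals[i] <= alpha / (m - rank):
            reject[i] = True
        else:
            break  # once one ordered test fails, all larger p-values fail too
    return reject

# Three planned tests that would all pass naive 0.05 cutoffs,
# but only the smallest p-value survives the multiplicity adjustment.
print(holm_bonferroni([0.001, 0.03, 0.04]))
```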

  7. Ecological niche models reveal the importance of climate variability for the biogeography of protosteloid amoebae

    PubMed Central

    Aguilar, María; Lado, Carlos

    2012-01-01

    Habitat availability and environmental preferences of species are among the most important factors in determining the success of dispersal processes and therefore in shaping the distribution of protists. We explored the differences in fundamental niches and potential distributions of an ecological guild of slime moulds—protosteloid amoebae—in the Iberian Peninsula. A large set of samples collected in a north-east to south-west transect of approximately 1000 km along the peninsula was used to test the hypothesis that, together with the existence of suitable microhabitats, climate conditions may determine the probability of survival of species. Although protosteloid amoebae share similar morphologies and life history strategies, canonical correspondence analyses showed that they have varied ecological optima, and that climate conditions have an important effect in niche differentiation. Maxent environmental niche models provided consistent predictions of the probability of presence of the species based on climate data, and they were used to generate maps of potential distribution in an ‘everything is everywhere' scenario. The most important climatic factors were, in both analyses, variables that measure changes in conditions throughout the year, confirming that the alternation of fruiting bodies, cysts and amoeboid stages in the life cycles of protosteloid amoebae constitutes an advantage for surviving in a changing environment. Microhabitat affinity seems to be influenced by climatic conditions, which suggests that the micro-environment may vary at a local scale and change together with the external climate at a larger scale. PMID:22402402

  8. Ecological niche models reveal the importance of climate variability for the biogeography of protosteloid amoebae.

    PubMed

    Aguilar, María; Lado, Carlos

    2012-08-01

    Habitat availability and environmental preferences of species are among the most important factors in determining the success of dispersal processes and therefore in shaping the distribution of protists. We explored the differences in fundamental niches and potential distributions of an ecological guild of slime moulds-protosteloid amoebae-in the Iberian Peninsula. A large set of samples collected in a north-east to south-west transect of approximately 1000 km along the peninsula was used to test the hypothesis that, together with the existence of suitable microhabitats, climate conditions may determine the probability of survival of species. Although protosteloid amoebae share similar morphologies and life history strategies, canonical correspondence analyses showed that they have varied ecological optima, and that climate conditions have an important effect in niche differentiation. Maxent environmental niche models provided consistent predictions of the probability of presence of the species based on climate data, and they were used to generate maps of potential distribution in an 'everything is everywhere' scenario. The most important climatic factors were, in both analyses, variables that measure changes in conditions throughout the year, confirming that the alternation of fruiting bodies, cysts and amoeboid stages in the life cycles of protosteloid amoebae constitutes an advantage for surviving in a changing environment. Microhabitat affinity seems to be influenced by climatic conditions, which suggests that the micro-environment may vary at a local scale and change together with the external climate at a larger scale.

  9. The Stranding Anomaly as Population Indicator: The Case of Harbour Porpoise Phocoena phocoena in North-Western Europe

    PubMed Central

    Peltier, Helene; Baagøe, Hans J.; Camphuysen, Kees C. J.; Czeck, Richard; Dabin, Willy; Daniel, Pierre; Deaville, Rob; Haelters, Jan; Jauniaux, Thierry; Jensen, Lasse F.; Jepson, Paul D.; Keijl, Guido O.; Siebert, Ursula; Van Canneyt, Olivier; Ridoux, Vincent

    2013-01-01

    Ecological indicators for monitoring strategies are expected to combine three major characteristics: ecological significance, statistical credibility, and cost-effectiveness. Strategies based on stranding networks rank highly in cost-effectiveness, but their ecological significance and statistical credibility are disputed. Our present goal is to improve the value of stranding data as population indicator as part of monitoring strategies by constructing the spatial and temporal null hypothesis for strandings. The null hypothesis is defined as: small cetacean distribution and mortality are uniform in space and constant in time. We used a drift model to map stranding probabilities and predict stranding patterns of cetacean carcasses under H0 across the North Sea, the Channel and the Bay of Biscay, for the period 1990–2009. As the most common cetacean occurring in this area, we chose the harbour porpoise Phocoena phocoena for our modelling. The difference between these strandings expected under H0 and observed strandings is defined as the stranding anomaly. It constituted the stranding data series corrected for drift conditions. Seasonal decomposition of stranding anomaly suggested that drift conditions did not explain observed seasonal variations of porpoise strandings. Long-term stranding anomalies increased first in the southern North Sea, the Channel and Bay of Biscay coasts, and finally the eastern North Sea. The hypothesis of changes in porpoise distribution was consistent with local visual surveys, mostly SCANS surveys (1994 and 2005). This new indicator could be applied to cetacean populations across the world and more widely to marine megafauna. PMID:23614031
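
    The anomaly itself is a simple correction of the observed series by the drift-model expectation. A minimal sketch with hypothetical monthly counts (not the study's data):

```python
def stranding_anomaly(observed, expected):
    """Stranding anomaly: observed strandings minus the counts expected
    under H0 (uniform distribution and mortality pushed through a drift model)."""
    return [o - e for o, e in zip(observed, expected)]

# Hypothetical monthly counts: drift conditions alone (expected) do not
# reproduce the observed summer peak, so the anomaly itself is seasonal.
observed = [3, 4, 6, 9, 14, 18, 20, 16, 10, 6, 4, 3]
expected = [5, 5, 6, 7, 8, 9, 9, 8, 7, 6, 5, 5]
print(stranding_anomaly(observed, expected))
```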

  10. Kidnapping of chicks in emperor penguins: a hormonal by-product?

    PubMed

    Angelier, Frédéric; Barbraud, Christophe; Lormée, Hervé; Prud'homme, François; Chastel, Olivier

    2006-04-01

    The function and causes of kidnapping juveniles are little understood because individuals sustain some breeding costs to rear an unrelated offspring. Here we focus on the proximal causes of this behaviour in emperor penguins (Aptenodytes forsteri), whose failed breeders often kidnap chicks. We experimentally tested the hypothesis that kidnapping behaviour was the result of high residual levels of prolactin (PRL), a hormone involved in parental behaviour. Penguins whose PRL levels were artificially decreased by bromocriptine administration kidnapped chicks less often than control penguins. Within the bromocriptine-treated group, kidnapping behaviour was not totally suppressed, and the probability of kidnapping a chick was positively correlated with PRL levels measured before treatment. During breeding, emperor penguins have to forage in remote ice-free areas. In these birds, PRL secretion is poorly influenced by chick stimuli and has probably evolved to maintain a willingness to return to the colony after a long absence at sea. Therefore, penguins that have lost their chick during a foraging trip still maintain high residual PRL levels and this, combined with colonial breeding, probably facilitates kidnapping. We suggest that kidnapping in non-cooperative systems may result from a hormonal by-product of a reproductive adaptation to extreme conditions.

  11. Making inference from wildlife collision data: inferring predator absence from prey strikes

    PubMed Central

    Hosack, Geoffrey R.; Barry, Simon C.

    2017-01-01

    Wildlife collision data are ubiquitous, though challenging for making ecological inference due to typically irreducible uncertainty relating to the sampling process. We illustrate a new approach that is useful for generating inference from predator data arising from wildlife collisions. By simply conditioning on a second prey species sampled via the same collision process, and by using a biologically realistic numerical response function, we can produce a coherent numerical response relationship between predator and prey. This relationship can then be used to make inference on the population size of the predator species, including the probability of extinction. The statistical conditioning enables us to account for unmeasured variation in factors influencing the runway strike incidence for individual airports and to make valid comparisons. A practical application of the approach for testing hypotheses about the distribution and abundance of a predator species is illustrated using the hypothesized red fox incursion into Tasmania, Australia. We estimate that conditional on the numerical response between fox and lagomorph runway strikes on mainland Australia, the predictive probability of observing no runway strikes of foxes in Tasmania after observing 15 lagomorph strikes is 0.001. We conclude there is enough evidence to safely reject the null hypothesis that there is a widespread red fox population in Tasmania at a population density consistent with prey availability. The method is novel and has potential wider application. PMID:28243534
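
    The headline predictive probability has the flavor of a zero-count calculation. A much-simplified Poisson sketch: the actual analysis conditions on the mainland numerical response, and the strike ratio below is a made-up illustration, not the paper's estimate.

```python
import math

def prob_zero_strikes(lagomorph_strikes, fox_per_lagomorph):
    """P(zero fox strikes) under a Poisson model whose expected fox-strike
    count is proportional to the observed lagomorph strikes."""
    lam = fox_per_lagomorph * lagomorph_strikes
    return math.exp(-lam)

# Hypothetical ratio of ~0.46 fox strikes per lagomorph strike: after 15
# lagomorph strikes, observing no fox strikes at all would be very unlikely
# if a widespread fox population were present.
print(prob_zero_strikes(15, 0.46))
```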

  12. How Often Is p[subscript rep] Close to the True Replication Probability?

    ERIC Educational Resources Information Center

    Trafimow, David; MacDonald, Justin A.; Rice, Stephen; Clason, Dennis L.

    2010-01-01

    Largely due to dissatisfaction with the standard null hypothesis significance testing procedure, researchers have begun to consider alternatives. For example, Killeen (2005a) has argued that researchers should calculate p[subscript rep] that is purported to indicate the probability that, if the experiment in question were replicated, the obtained…

  13. Toward "Constructing" the Concept of Statistical Power: An Optical Analogy.

    ERIC Educational Resources Information Center

    Rogers, Bruce G.

    This paper presents a visual analogy that may be used by instructors to teach the concept of statistical power in statistical courses. Statistical power is mathematically defined as the probability of rejecting a null hypothesis when that null is false, or, equivalently, the probability of detecting a relationship when it exists. The analogy…

  14. Do men believe that physically attractive women are more healthy and capable of having children?

    PubMed

    Mathes, Eugene W; Arms, Clarissa; Bryant, Alicia; Fields, Jeni; Witowski, Aggie

    2005-06-01

    The purpose of this research was to test the hypothesis that men view physical attractiveness as an index of a woman's health and her capacity to have children. 21 men and 26 women from an introductory psychology course were shown photographs from 1972 of men and women college students, judged in 2002 to be attractive or unattractive. Subjects were asked to rate the photographed individuals' current health, the probability that they were married, the probability that they had children, and whether they had reproductive problems. The hypothesis was generally supported; the men rated the photographs of attractive women as healthier, more likely to be married, and more likely to have children.

  15. Perturbation analysis for patch occupancy dynamics

    USGS Publications Warehouse

    Martin, Julien; Nichols, James D.; McIntyre, Carol L.; Ferraz, Goncalo; Hines, James E.

    2009-01-01

    Perturbation analysis is a powerful tool to study population and community dynamics. This article describes expressions for sensitivity metrics reflecting changes in equilibrium occupancy resulting from small changes in the vital rates of patch occupancy dynamics (i.e., probabilities of local patch colonization and extinction). We illustrate our approach with a case study of occupancy dynamics of Golden Eagle (Aquila chrysaetos) nesting territories. Examination of the hypothesis of system equilibrium suggests that the system satisfies equilibrium conditions. Estimates of vital rates obtained using patch occupancy models are used to estimate equilibrium patch occupancy of eagles. We then compute estimates of sensitivity metrics and discuss their implications for eagle population ecology and management. Finally, we discuss the intuition underlying our sensitivity metrics and then provide examples of ecological questions that can be addressed using perturbation analyses. For instance, the sensitivity metrics lead to predictions about the relative importance of local colonization and local extinction probabilities in influencing equilibrium occupancy for rare and common species.
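
    For the standard patch-occupancy model this abstract builds on, equilibrium occupancy and its sensitivities have closed forms: with local colonization probability gamma and local extinction probability eps, psi* = gamma / (gamma + eps). A sketch with hypothetical rates (the eagle estimates are not reproduced here):

```python
def equilibrium_occupancy(gamma, eps):
    """Equilibrium occupancy psi* = gamma / (gamma + eps), where gamma is the
    local colonization probability and eps the local extinction probability."""
    return gamma / (gamma + eps)

def occupancy_sensitivities(gamma, eps):
    """Partial derivatives of psi* with respect to gamma and eps."""
    denom = (gamma + eps) ** 2
    return eps / denom, -gamma / denom

# Hypothetical rates for a rare species (low gamma relative to eps):
# equilibrium occupancy responds more strongly, in absolute value,
# to colonization than to extinction.
psi = equilibrium_occupancy(0.1, 0.3)
d_gamma, d_eps = occupancy_sensitivities(0.1, 0.3)
print(psi, d_gamma, d_eps)
```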

  16. Gaussian Hypothesis Testing and Quantum Illumination.

    PubMed

    Wilde, Mark M; Tomamichel, Marco; Lloyd, Seth; Berta, Mario

    2017-09-22

    Quantum hypothesis testing is one of the most basic tasks in quantum information theory and has fundamental links with quantum communication and estimation theory. In this paper, we establish a formula that characterizes the decay rate of the minimal type-II error probability in a quantum hypothesis test of two Gaussian states given a fixed constraint on the type-I error probability. This formula is a direct function of the mean vectors and covariance matrices of the quantum Gaussian states in question. We give an application to quantum illumination, which is the task of determining whether there is a low-reflectivity object embedded in a target region with a bright thermal-noise bath. For the asymmetric-error setting, we find that a quantum illumination transmitter can achieve an error probability exponent stronger than a coherent-state transmitter of the same mean photon number, and furthermore, that it requires far fewer trials to do so. This occurs when the background thermal noise is either low or bright, which means that a quantum advantage is even easier to witness than in the symmetric-error setting because it occurs for a larger range of parameters. Going forward from here, we expect our formula to have applications in settings well beyond those considered in this paper, especially to quantum communication tasks involving quantum Gaussian channels.

  17. More attention when speaking: does it help or does it hurt?

    PubMed Central

    Nozari, Nazbanou; Thompson-Schill, Sharon L.

    2013-01-01

    Paying selective attention to a word in a multi-word utterance results in a decreased probability of error on that word (benefit), but an increased probability of error on the other words (cost). We ask whether excitation of the prefrontal cortex helps or hurts this cost. One hypothesis (the resource hypothesis) predicts a decrease in the cost due to the deployment of more attentional resources, while another (the focus hypothesis) predicts even greater costs due to further fine-tuning of selective attention. Our results are more consistent with the focus hypothesis: prefrontal stimulation caused a reliable increase in the benefit and a marginal increase in the cost of selective attention. To ensure that the effects are due to changes to the prefrontal cortex, we provide two checks: We show that the pattern of results is quite different if, instead, the primary motor cortex is stimulated. We also show that the stimulation-related benefits in the verbal task correlate with the stimulation-related benefits in an N-back task, which is known to tap into a prefrontal function. Our results shed light on how selective attention affects language production, and more generally, on how selective attention affects production of a sequence over time. PMID:24012690

  18. The Global Phylogeography of Lyssaviruses - Challenging the 'Out of Africa' Hypothesis

    PubMed Central

    Fooks, Anthony R.; Marston, Denise A.; Garcia-R, Juan C.

    2016-01-01

    Rabies virus kills tens of thousands of people globally each year, especially in resource-limited countries. Yet, there are genetically- and antigenically-related lyssaviruses, all capable of causing the disease rabies, circulating globally among bats without causing conspicuous disease outbreaks. The species richness and greater genetic diversity of African lyssaviruses, along with the lack of antibody cross-reactivity among them, has led to the hypothesis that Africa is the origin of lyssaviruses. This hypothesis was tested using a probabilistic phylogeographical approach. The nucleoprotein gene sequences from 153 representatives of 16 lyssavirus species, collected between 1956 and 2015, were used to develop a phylogenetic tree which incorporated relevant geographic and temporal data relating to the viruses. In addition, complete genome sequences from all 16 (putative) species were analysed. The most probable ancestral distribution for the internal nodes was inferred using three different approaches and was confirmed by analysis of complete genomes. These results support a Palearctic origin for lyssaviruses (posterior probability = 0.85), challenging the ‘out of Africa’ hypothesis, and suggest three independent transmission events to the Afrotropical region, representing the three phylogroups that form the three major lyssavirus clades. PMID:28036390

  19. Microcephalic osteodysplastic primordial dwarfism type I/III in sibs.

    PubMed

    Meinecke, P; Passarge, E

    1991-11-01

    The clinical and radiological findings in a pair of sibs with microcephalic osteodysplastic primordial dwarfism (MOPD) are described, a boy who survived for 5 1/2 years and his more severely affected younger sister, who died at the age of 6 months. Neuropathological studies in this girl showed marked micrencephaly with severely hypoplastic, poorly gyrated frontal lobes and absent corpus callosum. Our observation supports the hypothesis that types I and III MOPD probably constitute a spectrum of one and the same entity and published data together with this report are consistent with autosomal recessive inheritance. The pathogenesis of this condition is as yet unknown, but its characteristics indicate a basic defect affecting cell proliferation and tissue differentiation.

  20. Steady State Condition in the Measurement of VO2 and VCO2 by Indirect Calorimetry.

    PubMed

    Cadena, M; Sacristan, E; Infante, O; Escalante, B; Rodriguez, F

    2005-01-01

    Resting Metabolic Rate (RMR) is computed from VO2 and VCO2 measured over a short 15-minute window with Indirect Calorimetry (IC) instruments designed with a mixing chamber. Achieving a steady-state condition, defined by a 10% variation coefficient criterion, is the main requirement for reliable long-term metabolic prediction. This study addresses how susceptible the steady-state VO2 and VCO2 measurement condition is to the clino-orthostatic physiological maneuver. Thirty young healthy subjects were analyzed; only 18 passed the 10% variation coefficient inclusion criterion. They were exposed to 10 minutes in the clino stage and 10 minutes in the ortho stage. The hypothesis tests showed no statistical significance (p < 0.1) in the analysis of averages and variances. It is concluded that the steady state is not influenced by patient position during the IC test, probably because mixing-chamber IC instruments are too insensitive to detect the major physiological dynamic changes that could modify the steady-state definition.
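
    The 10% variation-coefficient criterion can be checked directly; the minute-by-minute VO2 readings below are invented for illustration.

```python
from statistics import mean, stdev

def is_steady_state(vo2_series, max_cv=0.10):
    """Steady-state check used in indirect calorimetry: the coefficient of
    variation (sd / mean) over the window must stay below max_cv."""
    cv = stdev(vo2_series) / mean(vo2_series)
    return cv, cv <= max_cv

# Hypothetical minute-by-minute VO2 readings (mL/min) over one window:
print(is_steady_state([250, 255, 248, 252, 260, 249, 251, 254]))
```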

  1. Predicting redox-sensitive contaminant concentrations in groundwater using random forest classification

    NASA Astrophysics Data System (ADS)

    Tesoriero, Anthony J.; Gronberg, Jo Ann; Juckem, Paul F.; Miller, Matthew P.; Austin, Brian P.

    2017-08-01

    Machine learning techniques were applied to a large (n > 10,000) compliance monitoring database to predict the occurrence of several redox-active constituents in groundwater across a large watershed. Specifically, random forest classification was used to determine the probabilities of detecting elevated concentrations of nitrate, iron, and arsenic in the Fox, Wolf, Peshtigo, and surrounding watersheds in northeastern Wisconsin. Random forest classification is well suited to describe the nonlinear relationships observed among several explanatory variables and the predicted probabilities of elevated concentrations of nitrate, iron, and arsenic. Maps of the probability of elevated nitrate, iron, and arsenic can be used to assess groundwater vulnerability and the vulnerability of streams to contaminants derived from groundwater. Processes responsible for elevated concentrations are elucidated using partial dependence plots. For example, an increase in the probability of elevated iron and arsenic occurred when well depths coincided with the glacial/bedrock interface, suggesting a bedrock source for these constituents. Furthermore, groundwater in contact with Ordovician bedrock has a higher likelihood of elevated iron concentrations, which supports the hypothesis that groundwater liberates iron from a sulfide-bearing secondary cement horizon of Ordovician age. Application of machine learning techniques to existing compliance monitoring data offers an opportunity to broadly assess aquifer and stream vulnerability at regional and national scales and to better understand geochemical processes responsible for observed conditions.
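
    Probability-of-detection maps of this kind come from a classifier's predicted class probabilities. A self-contained sketch on synthetic data, assuming scikit-learn is available; the features, the bedrock rule, and all numbers are invented for illustration, not the study's model.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n = 2000
depth_to_bedrock = rng.uniform(0, 100, n)   # hypothetical explanatory variable (m)
well_depth = rng.uniform(0, 150, n)         # hypothetical well depth (m)

# Invented rule: elevated iron becomes likely once the well reaches bedrock.
p_true = 1 / (1 + np.exp(-(well_depth - depth_to_bedrock) / 10))
elevated = rng.random(n) < p_true

X = np.column_stack([depth_to_bedrock, well_depth])
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, elevated)

# Predicted probability of elevated iron for a shallow vs deep well
# at the same bedrock depth (50 m):
shallow, deep = clf.predict_proba([[50.0, 10.0], [50.0, 140.0]])[:, 1]
print(shallow, deep)
```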

  2. Predicting redox-sensitive contaminant concentrations in groundwater using random forest classification

    USGS Publications Warehouse

    Tesoriero, Anthony J.; Gronberg, Jo Ann M.; Juckem, Paul F.; Miller, Matthew P.; Austin, Brian P.

    2017-01-01

    Machine learning techniques were applied to a large (n > 10,000) compliance monitoring database to predict the occurrence of several redox-active constituents in groundwater across a large watershed. Specifically, random forest classification was used to determine the probabilities of detecting elevated concentrations of nitrate, iron, and arsenic in the Fox, Wolf, Peshtigo, and surrounding watersheds in northeastern Wisconsin. Random forest classification is well suited to describe the nonlinear relationships observed among several explanatory variables and the predicted probabilities of elevated concentrations of nitrate, iron, and arsenic. Maps of the probability of elevated nitrate, iron, and arsenic can be used to assess groundwater vulnerability and the vulnerability of streams to contaminants derived from groundwater. Processes responsible for elevated concentrations are elucidated using partial dependence plots. For example, an increase in the probability of elevated iron and arsenic occurred when well depths coincided with the glacial/bedrock interface, suggesting a bedrock source for these constituents. Furthermore, groundwater in contact with Ordovician bedrock has a higher likelihood of elevated iron concentrations, which supports the hypothesis that groundwater liberates iron from a sulfide-bearing secondary cement horizon of Ordovician age. Application of machine learning techniques to existing compliance monitoring data offers an opportunity to broadly assess aquifer and stream vulnerability at regional and national scales and to better understand geochemical processes responsible for observed conditions.

  3. Development of a methodology for probable maximum precipitation estimation over the American River watershed using the WRF model

    NASA Astrophysics Data System (ADS)

    Tan, Elcin

    A new physically-based methodology for probable maximum precipitation (PMP) estimation is developed over the American River Watershed (ARW) using the Weather Research and Forecast (WRF-ARW) model. A persistent moisture flux convergence pattern, called Pineapple Express, is analyzed for 42 historical extreme precipitation events, and it is found that Pineapple Express causes extreme precipitation over the basin of interest. An average correlation between moisture flux convergence and maximum precipitation is estimated as 0.71 for 42 events. The performance of the WRF model is verified for precipitation by means of calibration and independent validation of the model. The calibration procedure is performed only for the top-ranked flood event, the 1997 case, whereas the WRF model is validated for 42 historical cases. Three nested model domains are set up with horizontal resolutions of 27 km, 9 km, and 3 km over the basin of interest. As a result of Chi-square goodness-of-fit tests, the hypothesis that "the WRF model can be used in the determination of PMP over the ARW for both areal average and point estimates" is accepted at the 5% level of significance. The sensitivities of model physics options on precipitation are determined using 28 combinations of microphysics, atmospheric boundary layer, and cumulus parameterization schemes. It is concluded that the best triplet option is Thompson microphysics, Grell 3D ensemble cumulus, and YSU boundary layer (TGY), based on 42 historical cases, and this TGY triplet is used for all analyses of this research. Four techniques are proposed to evaluate physically possible maximum precipitation using the WRF: 1. Perturbations of atmospheric conditions; 2. Shift in atmospheric conditions; 3. Replacement of atmospheric conditions among historical events; and 4. Thermodynamically possible worst-case scenario creation. Moreover, the effect of climate change on precipitation is discussed, with emphasis on temperature increase, in order to determine the physically possible upper limits of precipitation due to climate change. The simulation results indicate that the meridional shift in atmospheric conditions is the optimum method to determine maximum precipitation in consideration of cost and efficiency. Finally, exceedance probability analyses of the model results of 42 historical extreme precipitation events demonstrate that the 72-hr basin averaged probable maximum precipitation is 21.72 inches for the exceedance probability of 0.5 percent. On the other hand, the current operational PMP estimation for the American River Watershed is 28.57 inches as published in hydrometeorological report no. 59, and a previous PMP value was 31.48 inches as published in hydrometeorological report no. 36. According to the exceedance probability analyses of this proposed method, the exceedance probabilities of these two estimations correspond to 0.036 percent and 0.011 percent, respectively.

  4. By-product mutualism and the ambiguous effects of harsher environments - A game-theoretic model.

    PubMed

    De Jaegher, Kris; Hoyer, Britta

    2016-03-21

    We construct two-player two-strategy game-theoretic models of by-product mutualism, where our focus lies on the way in which the probability of cooperation among players is affected by the degree of adversity facing the players. In our first model, cooperation consists of the production of a public good, and adversity is linked to the degree of complementarity of the players' efforts in producing the public good. In our second model, cooperation consists of the defense of a public, and/or a private good with by-product benefits, and adversity is measured by the number of random attacks (e.g., by a predator) facing the players. In both of these models, our analysis confirms the existence of the so-called boomerang effect, which states that in a harsh environment, the individual player has few incentives to unilaterally defect in a situation of joint cooperation. Focusing on such an effect in isolation leads to the "common-enemy" hypothesis that a larger degree of adversity increases the probability of cooperation. Yet, we also find that a sucker effect may simultaneously exist, which says that in a harsh environment, the individual player has few incentives to unilaterally cooperate in a situation of joint defection. Looked at in isolation, the sucker effect leads to the competing hypothesis that a larger degree of adversity decreases the probability of cooperation. Our analysis predicts circumstances in which the "common enemy" hypothesis prevails, and circumstances in which the competing hypothesis prevails. Copyright © 2016 Elsevier Ltd. All rights reserved.
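
    The two effects can coexist in a single symmetric 2x2 game: when both joint cooperation and joint defection are equilibria, adversity's net effect on cooperation is ambiguous. A sketch with made-up payoffs, not the paper's parameterization:

```python
def symmetric_equilibria(payoff):
    """payoff[(a, b)] = the row player's payoff when row plays a and column
    plays b, in a symmetric game. Returns the symmetric pure-strategy Nash
    equilibria: (s, s) is one iff deviating to the other strategy pays no more."""
    equilibria = []
    for s in ("C", "D"):
        other = "D" if s == "C" else "C"
        if payoff[(s, s)] >= payoff[(other, s)]:
            equilibria.append((s, s))
    return equilibria

# Hypothetical stag-hunt-like payoffs under a harsh environment: joint
# cooperation is stable (boomerang effect), but so is joint defection
# (sucker effect).
harsh = {("C", "C"): 4, ("C", "D"): 0, ("D", "C"): 3, ("D", "D"): 1}
print(symmetric_equilibria(harsh))
```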

  5. The role of control groups in mutagenicity studies: matching biological and statistical relevance.

    PubMed

    Hauschke, Dieter; Hothorn, Torsten; Schäfer, Juliane

    2003-06-01

    The statistical test of the conventional hypothesis of "no treatment effect" is commonly used in the evaluation of mutagenicity experiments. Failing to reject the hypothesis often leads to the conclusion in favour of safety. The major drawback of this indirect approach is that what is controlled by a prespecified level alpha is the probability of erroneously concluding hazard (producer risk). However, the primary concern of safety assessment is the control of the consumer risk, i.e. limiting the probability of erroneously concluding that a product is safe. In order to restrict this risk, safety has to be formulated as the alternative, and hazard, i.e. the opposite, has to be formulated as the hypothesis. The direct safety approach is examined for the case when the corresponding threshold value is expressed either as a fraction of the population mean for the negative control, or as a fraction of the difference between the positive and negative controls.

  6. To Do or Not to Do: Dopamine, Affordability and the Economics of Opportunity.

    PubMed

    Beeler, Jeff A; Mourra, Devry

    2018-01-01

Five years ago, we introduced the thrift hypothesis of dopamine (DA), suggesting that the primary role of DA in adaptive behavior is regulating behavioral energy expenditure to match the prevailing economic conditions of the environment. Here we elaborate that hypothesis with several new ideas. First, we introduce the concept of affordability, suggesting that costs must necessarily be evaluated with respect to the availability of resources to the organism, which computes a value not only for the potential reward opportunity, but also the value of resources expended. Placing both costs and benefits within the context of the larger economy in which the animal is functioning requires consideration of the different timescales against which to compute resource availability, or average reward rate. Appropriate windows of computation for tracking resources require corresponding neural substrates that operate on these different timescales. In discussing temporal patterns of DA signaling, we focus on a neglected form of DA plasticity and adaptation, changes in the physical substrate of the DA system itself, such as up- and down-regulation of receptors or release probability. We argue that changes in the DA substrate itself fundamentally alter its computational function, which we propose mediates adaptations to longer temporal horizons and economic conditions. In developing our hypothesis, we focus on DA D2 receptors (D2R), arguing that D2R implements a form of "cost control" in response to the environmental economy, serving as the "brain's comptroller". We propose that the balance between the direct and indirect pathway, regulated by relative expression of D1 and D2 DA receptors, implements affordability.
Finally, as we review data, we discuss limitations in current approaches that impede fully investigating the proposed hypothesis and highlight alternative, more semi-naturalistic strategies more conducive to neuroeconomic investigations on the role of DA in adaptive behavior.

  7. To Do or Not to Do: Dopamine, Affordability and the Economics of Opportunity

    PubMed Central

    Beeler, Jeff A.; Mourra, Devry

    2018-01-01

Five years ago, we introduced the thrift hypothesis of dopamine (DA), suggesting that the primary role of DA in adaptive behavior is regulating behavioral energy expenditure to match the prevailing economic conditions of the environment. Here we elaborate that hypothesis with several new ideas. First, we introduce the concept of affordability, suggesting that costs must necessarily be evaluated with respect to the availability of resources to the organism, which computes a value not only for the potential reward opportunity, but also the value of resources expended. Placing both costs and benefits within the context of the larger economy in which the animal is functioning requires consideration of the different timescales against which to compute resource availability, or average reward rate. Appropriate windows of computation for tracking resources require corresponding neural substrates that operate on these different timescales. In discussing temporal patterns of DA signaling, we focus on a neglected form of DA plasticity and adaptation, changes in the physical substrate of the DA system itself, such as up- and down-regulation of receptors or release probability. We argue that changes in the DA substrate itself fundamentally alter its computational function, which we propose mediates adaptations to longer temporal horizons and economic conditions. In developing our hypothesis, we focus on DA D2 receptors (D2R), arguing that D2R implements a form of “cost control” in response to the environmental economy, serving as the “brain’s comptroller”. We propose that the balance between the direct and indirect pathway, regulated by relative expression of D1 and D2 DA receptors, implements affordability.
Finally, as we review data, we discuss limitations in current approaches that impede fully investigating the proposed hypothesis and highlight alternative, more semi-naturalistic strategies more conducive to neuroeconomic investigations on the role of DA in adaptive behavior. PMID:29487508

  8. Altitudinal gradients of generalist and specialist herbivory on three montane Asteraceae

    NASA Astrophysics Data System (ADS)

    Scheidel, U.; Röhl, S.; Bruelheide, H.

Different functional types of herbivory on three montane Asteraceae were investigated in natural populations in central Germany to test the hypothesis that herbivory decreases with altitude. Generalist herbivory was assessed as leaf area loss, mainly caused by slugs, and, in Petasites albus, as rhizome mining by oligophagous insect larvae. Capitules were found to be parasitized by oligophagous insects in Centaurea pseudophrygia and by the specialist fly Tephritis arnicae in Arnica montana. Only the damage to leaves of P. albus showed the hypothesized decrease with increasing altitude. No altitudinal gradient could be found in the leaf and capitule damage to C. pseudophrygia. In A. montana, capitule damage increased with increasing elevation. The data suggest that abundance and activity of generalist herbivores are more affected by climatic conditions along altitudinal gradients than specialist herbivores. In all probability, specialist herbivores depend less on abiotic conditions than on their host's population characteristics, such as host population size.

  9. Load-carriage distance run and push-ups tests: no body mass bias and occupationally relevant.

    PubMed

    Vanderburgh, Paul M; Mickley, Nicholas S; Anloague, Philip A

    2011-09-01

    Recent research has demonstrated body mass (M) bias in military physical fitness tests favoring lighter, not just leaner, service members. Mathematical modeling predicts that a distance run carrying a backpack of 30 lbs would eliminate M-bias. The purpose of this study was to empirically test this prediction for the U.S. Army push-ups and 2-mile run tests. Two tests were performed for both events for each of 56 university Reserve Officer Training Corps male cadets: with (loaded) and without backpack (unloaded). Results indicated significant M-bias in the unloaded and no M-bias in the loaded condition for both events. Allometrically scaled scores for both events were worse in the loaded vs. unloaded conditions, supporting a hypothesis not previously tested. The loaded push-ups and 2-mile run appear to remove M-bias and are probably more occupationally relevant as military personnel are often expected to carry external loads.

  10. The frequentist implications of optional stopping on Bayesian hypothesis tests.

    PubMed

    Sanborn, Adam N; Hills, Thomas T

    2014-04-01

Null hypothesis significance testing (NHST) is the most commonly used statistical methodology in psychology. The probability of achieving a value as extreme or more extreme than the statistic obtained from the data is evaluated, and if it is low enough, the null hypothesis is rejected. However, because common experimental practice often clashes with the assumptions underlying NHST, these calculated probabilities are often incorrect. Most commonly, experimenters use tests that assume that sample sizes are fixed in advance of data collection but then use the data to determine when to stop; in the limit, experimenters can use data monitoring to guarantee that the null hypothesis will be rejected. Bayesian hypothesis testing (BHT) provides a solution to these ills because the stopping rule used is irrelevant to the calculation of a Bayes factor. In addition, there are strong mathematical guarantees on the frequentist properties of BHT that are comforting for researchers concerned that stopping rules could influence the Bayes factors produced. Here, we show that these guaranteed bounds have limited scope and often do not apply in psychological research. Specifically, we quantitatively demonstrate the impact of optional stopping on the resulting Bayes factors in two common situations: (1) when the truth is a combination of the hypotheses, such as in a heterogeneous population, and (2) when a hypothesis is composite (taking multiple parameter values), such as the alternative hypothesis in a t-test. We found that, for these situations, while the Bayesian interpretation remains correct regardless of the stopping rule used, the choice of stopping rule can, in some situations, greatly increase the chance of experimenters finding evidence in the direction they desire. We suggest ways to control these frequentist implications of stopping rules on BHT.
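The frequentist effect of optional stopping on a Bayes factor can be simulated directly. The sketch below is not the paper's composite-hypothesis case: it uses the simplest setting of two point hypotheses with known variance (an assumption made here for brevity), where Ville's inequality bounds the chance of ever crossing a Bayes-factor threshold k under H0 by 1/k.

```python
# Sketch: chance of obtaining a Bayes factor favouring H1 under optional
# stopping when H0 is in fact true. Point hypotheses H0: mu = 0 vs
# H1: mu = 0.5, known sigma = 1 (a simplification of the paper's setting).
import math
import random

random.seed(1)

def run_experiment(n_max=100, threshold=3.0):
    """Collect data under H0, stopping early if BF10 exceeds `threshold`."""
    log_bf = 0.0
    for _ in range(n_max):
        x = random.gauss(0.0, 1.0)           # truth: H0
        # log likelihood ratio of one observation, H1 vs H0
        log_bf += (-0.5 * (x - 0.5) ** 2) - (-0.5 * x ** 2)
        if math.exp(log_bf) > threshold:
            return True                       # stop and report "evidence" for H1
    return False

hits = sum(run_experiment() for _ in range(2000))
print(f"P(BF10 > 3 under H0, optional stopping) ~ {hits / 2000:.3f}")
```

The simulated rate stays below the 1/3 martingale bound for this point-vs-point case; the paper's argument is that for composite or mixed truths such clean bounds no longer apply.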

  11. The "where is it?" reflex: autoshaping the orienting response.

    PubMed

    Buzsáki, G

    1982-05-01

    The goal of this review is to compare two divergent lines of research on signal-centered behavior: the orienting reflex (OR) and autoshaping. A review of conditioning experiments in animals and humans suggests that the novelty hypothesis of the OR is no longer tenable. Only stimuli that represent biological "relevance" elicit ORs. A stimulus may be relevant a priori (i.e., unconditioned) or as a result of conditioning. Exposure to a conditioned stimulus (CS) that predicts a positive reinforcer causes the animal to orient to it throughout conditioning. Within the CS-US interval, the initial CS-directed orienting response is followed by US-directed tendencies. Experimental evidence is shown that the development and maintenance of the conditioned OR occur in a similar fashion both in response-independent (classical) and response-dependent (instrumental) paradigms. It is proposed that the conditioned OR and the signal-directed autoshaped response are identical. Signals predicting aversive events repel the subject from the source of the CS. It is suggested that the function of the CS is not only to signal the probability of US occurrence, but also to serve as a spatial cue to guide the animal in the environment.

  12. The "where is it?" reflex: autoshaping the orienting response.

    PubMed Central

    Buzsáki, G

    1982-01-01

    The goal of this review is to compare two divergent lines of research on signal-centered behavior: the orienting reflex (OR) and autoshaping. A review of conditioning experiments in animals and humans suggests that the novelty hypothesis of the OR is no longer tenable. Only stimuli that represent biological "relevance" elicit ORs. A stimulus may be relevant a priori (i.e., unconditioned) or as a result of conditioning. Exposure to a conditioned stimulus (CS) that predicts a positive reinforcer causes the animal to orient to it throughout conditioning. Within the CS-US interval, the initial CS-directed orienting response is followed by US-directed tendencies. Experimental evidence is shown that the development and maintenance of the conditioned OR occur in a similar fashion both in response-independent (classical) and response-dependent (instrumental) paradigms. It is proposed that the conditioned OR and the signal-directed autoshaped response are identical. Signals predicting aversive events repel the subject from the source of the CS. It is suggested that the function of the CS is not only to signal the probability of US occurrence, but also to serve as a spatial cue to guide the animal in the environment. PMID:7097153

  13. Neutral aggregation in finite-length genotype space

    NASA Astrophysics Data System (ADS)

    Houchmandzadeh, Bahram

    2017-01-01

The advent of modern genome sequencing techniques allows for a more stringent test of the neutrality hypothesis of Darwinian evolution, where all individuals have the same fitness. Using the individual-based model of Wright and Fisher, we compute the amplitude of neutral aggregation in the genome space, i.e., the probability of finding two individuals at genetic (Hamming) distance k as a function of the genome size L, population size N, and mutation probability per base ν. In well-mixed populations, we show that for Nν < 1/L, neutral aggregation is the dominant force and most individuals are found at short genetic distances from each other. For Nν > 1, on the contrary, individuals are randomly dispersed in genome space. The results are extended to a geographically dispersed population, where the controlling parameter is shown to be a combination of mutation and migration probability. The theory we develop can be used to test the neutrality hypothesis in various ecological and evolutionary systems.
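The two regimes can be seen in a small individual-based simulation. The parameter values (N, L, ν, number of generations) below are illustrative choices, not the paper's, and the binary-genome model is a simplification:

```python
# Sketch: Wright-Fisher simulation of neutral aggregation in genotype space.
# Binary genomes of length L; drift plus symmetric per-base mutation.
import random

random.seed(2)

def mean_hamming(N=50, L=20, nu=0.0002, generations=500):
    """Mean pairwise Hamming distance after `generations` of drift+mutation."""
    pop = [[0] * L for _ in range(N)]
    for _ in range(generations):
        # Wright-Fisher resampling: each offspring copies a random parent...
        pop = [random.choice(pop)[:] for _ in range(N)]
        # ...then mutates each base independently with probability nu.
        for g in pop:
            for i in range(L):
                if random.random() < nu:
                    g[i] ^= 1
    dists = [sum(a != b for a, b in zip(pop[i], pop[j]))
             for i in range(N) for j in range(i + 1, N)]
    return sum(dists) / len(dists)

low = mean_hamming(nu=0.0002)   # N*nu = 0.01 < 1/L = 0.05: aggregation regime
high = mean_hamming(nu=0.05)    # N*nu = 2.5 > 1: dispersed regime
print(f"mean Hamming distance: low-mutation {low:.2f}, high-mutation {high:.2f}")
```

In the low-mutation regime most pairs sit at short Hamming distance; in the high-mutation regime pairwise distances approach the random-dispersal value of order L/2.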

  14. Introduction and Application of non-stationary Standardized Precipitation Index Considering Probability Distribution Function and Return Period

    NASA Astrophysics Data System (ADS)

    Park, J.; Lim, Y. J.; Sung, J. H.; Kang, H. S.

    2017-12-01

The widely used meteorological drought index, the Standardized Precipitation Index (SPI), basically assumes stationarity, but recent changes in the climate have led to a need to review this hypothesis. In this study, a new non-stationary SPI that considers not only the modified probability distribution parameter but also the return period under the non-stationary process has been proposed. The results are evaluated for two severe drought cases during the last 10 years in South Korea. As a result, SPIs that considered the non-stationary hypothesis underestimated the drought severity relative to the stationary SPI, even though these past two droughts were recognized as significantly severe. This may be because the variances of summer and autumn precipitation become larger over time, which makes the shape of the probability distribution function wider than before. This understanding implies that drought expressions by a statistical index such as the SPI can be distorted by the stationarity assumption, and a cautious approach is needed when deciding drought levels under a changing climate.

  15. Introduction and application of non-stationary standardized precipitation index considering probability distribution function and return period

    NASA Astrophysics Data System (ADS)

    Park, Junehyeong; Sung, Jang Hyun; Lim, Yoon-Jin; Kang, Hyun-Suk

    2018-05-01

The widely used meteorological drought index, the Standardized Precipitation Index (SPI), basically assumes stationarity, but recent changes in the climate have led to a need to review this hypothesis. In this study, a new non-stationary SPI that considers not only the modified probability distribution parameter but also the return period under the non-stationary process was proposed. The results were evaluated for two severe drought cases during the last 10 years in South Korea. As a result, SPIs that considered the non-stationary hypothesis underestimated the drought severity relative to the stationary SPI, even though these past two droughts were recognized as significantly severe. This may be because the variances of summer and autumn precipitation become larger over time, which makes the probability distribution wider than before. This implies that drought expressions by a statistical index such as the SPI can be distorted by the stationarity assumption, and a cautious approach is needed when deciding drought levels under a changing climate.
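The core SPI transformation, mapping precipitation accumulations onto standard-normal quantiles, can be sketched minimally. The operational SPI fits a gamma distribution to the accumulations; the rank-based (plotting-position) variant below is a deliberate simplification, and the precipitation series is synthetic:

```python
# Sketch: a minimal nonparametric SPI. Each accumulation is mapped to its
# standard-normal quantile via an empirical (Weibull plotting-position) CDF.
import random
from statistics import NormalDist

random.seed(3)
# Hypothetical 3-month precipitation accumulations (mm), 10 years of months.
precip = [random.gammavariate(2.0, 40.0) for _ in range(120)]

def spi(values):
    """Map each accumulation to its standard-normal quantile.
    Assumes distinct values (true here since inputs are random floats)."""
    n = len(values)
    ranks = {v: r for r, v in enumerate(sorted(values), start=1)}
    nd = NormalDist()
    return [nd.inv_cdf(ranks[v] / (n + 1)) for v in values]

scores = spi(precip)
print(f"min SPI {min(scores):.2f}, max SPI {max(scores):.2f}")
```

A non-stationary SPI as proposed in the paper would instead let the fitted distribution's parameters vary with time before applying the same normal-quantile transformation.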

  16. Formulating appropriate statistical hypotheses for treatment comparison in clinical trial design and analysis.

    PubMed

    Huang, Peng; Ou, Ai-hua; Piantadosi, Steven; Tan, Ming

    2014-11-01

We discuss the problem of properly defining treatment superiority through the specification of hypotheses in clinical trials. The need to precisely define the notion of superiority in a one-sided hypothesis test problem has been well recognized by many authors. Ideally, the null and alternative hypotheses should correspond to a partition of all possible scenarios of underlying true probability models P = {P(ω): ω ∈ Ω} such that the alternative hypothesis Ha = {P(ω): ω ∈ Ωa} can be inferred upon the rejection of the null hypothesis Ho = {P(ω): ω ∈ Ωo}. However, in many cases, tests are carried out and recommendations are made without a precise definition of superiority or a specification of the alternative hypothesis. Moreover, in some applications, the union of probability models specified by the chosen null and alternative hypotheses does not constitute the complete model collection P (i.e., Ho ∪ Ha is smaller than P). This not only imposes a strong non-validated assumption on the underlying true models, but also leads to different superiority claims depending on which test is used, rather than on scientific plausibility. Different ways to partition P for testing treatment superiority often have different implications for sample size, power, and significance in both efficacy and comparative effectiveness trial design. Such differences are often overlooked. We provide a theoretical framework for evaluating the statistical properties of different specifications of superiority in typical hypothesis testing. This can help investigators select proper hypotheses for treatment comparison in clinical trial design. Copyright © 2014 Elsevier Inc. All rights reserved.

  17. Dual-Processes in Learning and Judgment: Evidence from the Multiple Cue Probability Learning Paradigm

    ERIC Educational Resources Information Center

    Rolison, Jonathan J.; Evans, Jonathan St. B. T.; Dennis, Ian; Walsh, Clare R.

    2012-01-01

    Multiple cue probability learning (MCPL) involves learning to predict a criterion based on a set of novel cues when feedback is provided in response to each judgment made. But to what extent does MCPL require controlled attention and explicit hypothesis testing? The results of two experiments show that this depends on cue polarity. Learning about…

  18. The Risk of Reduced Physical Activity in Children with Probable Developmental Coordination Disorder: A Prospective Longitudinal Study

    ERIC Educational Resources Information Center

    Green, Dido; Lingam, Raghu; Mattocks, Calum; Riddoch, Chris; Ness, Andy; Emond, Alan

    2011-01-01

    The aim of the current study was to test the hypothesis that children with probable Developmental Coordination Disorder have an increased risk of reduced moderate to vigorous physical activity (MVPA), using data from a large population based study. Prospectively collected data from 4331 children (boys = 2065, girls = 2266) who had completed motor…

  19. Testing for variation in taxonomic extinction probabilities: a suggested methodology and some results

    USGS Publications Warehouse

    Conroy, M.J.; Nichols, J.D.

    1984-01-01

Several important questions in evolutionary biology and paleobiology involve sources of variation in extinction rates. In all cases of which we are aware, extinction rates have been estimated from data in which the probability that an observation (e.g., a fossil taxon) will occur is related both to extinction rates and to what we term encounter probabilities. Any statistical method for analyzing fossil data should at a minimum permit separate inferences on these two components. We develop a method for estimating taxonomic extinction rates from stratigraphic range data and for testing hypotheses about variability in these rates. We use this method to estimate extinction rates and to test the hypothesis of constant extinction rates for several sets of stratigraphic range data. The results of our tests support the hypothesis that extinction rates varied over the geologic time periods examined. We also present a test that can be used to identify periods of high or low extinction probabilities and provide an example using Phanerozoic invertebrate data. Extinction rates should be analyzed using stochastic models, in which it is recognized that stratigraphic samples are random variates and that sampling is imperfect.

  20. Maternal investment in the swordtail fish Xiphophorus multilineatus: support for the differential allocation hypothesis.

    PubMed

    Rios-Cardenas, Oscar; Brewer, Jason; Morris, Molly R

    2013-01-01

    The differential allocation hypothesis predicts that reproductive investment will be influenced by mate attractiveness, given a cost to reproduction and a tradeoff between current and future reproduction. We tested the differential allocation hypothesis in the swordtail fish Xiphophorus multilineatus, where males have genetically influenced (patroclinous inheritance) alternative mating tactics (ARTs) maintained by a tradeoff between being more attractive to females (mature later as larger courting males) and a higher probability of reaching sexual maturity (mature earlier as smaller sneaker males). Males in X. multilineatus do not provide parental care or other resources to the offspring. Allelic variation and copy number of the Mc4R gene on the Y-chromosome influences the size differences between males, however there is no variation in this gene on the X-chromosome. Therefore, to determine if mothers invested more in offspring of the larger courter males, we examined age to sexual maturity for daughters. We confirmed a tradeoff between number of offspring and female offspring's age to sexual maturity, corroborating that there is a cost to reproduction. In addition, the ART of their fathers significantly influenced the age at which daughters reached sexual maturity, suggesting increased maternal investment to daughters of courter males. The differential allocation we detected was influenced by how long the wild-caught mother had been in the laboratory, as there was a brood order by father genotype (ART) interaction. These results suggest that females can adjust their reproductive investment strategy, and that differential allocation is context specific. We hypothesize that one of two aspects of laboratory conditions produced this shift: increased female condition due to higher quality diet, and/or assessment of future mating opportunities due to isolation from males.

  1. Maternal Investment in the Swordtail Fish Xiphophorus multilineatus: Support for the Differential Allocation Hypothesis

    PubMed Central

    Rios-Cardenas, Oscar; Brewer, Jason; Morris, Molly R.

    2013-01-01

    The differential allocation hypothesis predicts that reproductive investment will be influenced by mate attractiveness, given a cost to reproduction and a tradeoff between current and future reproduction. We tested the differential allocation hypothesis in the swordtail fish Xiphophorus multilineatus, where males have genetically influenced (patroclinous inheritance) alternative mating tactics (ARTs) maintained by a tradeoff between being more attractive to females (mature later as larger courting males) and a higher probability of reaching sexual maturity (mature earlier as smaller sneaker males). Males in X. multilineatus do not provide parental care or other resources to the offspring. Allelic variation and copy number of the Mc4R gene on the Y-chromosome influences the size differences between males, however there is no variation in this gene on the X-chromosome. Therefore, to determine if mothers invested more in offspring of the larger courter males, we examined age to sexual maturity for daughters. We confirmed a tradeoff between number of offspring and female offspring’s age to sexual maturity, corroborating that there is a cost to reproduction. In addition, the ART of their fathers significantly influenced the age at which daughters reached sexual maturity, suggesting increased maternal investment to daughters of courter males. The differential allocation we detected was influenced by how long the wild-caught mother had been in the laboratory, as there was a brood order by father genotype (ART) interaction. These results suggest that females can adjust their reproductive investment strategy, and that differential allocation is context specific. We hypothesize that one of two aspects of laboratory conditions produced this shift: increased female condition due to higher quality diet, and/or assessment of future mating opportunities due to isolation from males. PMID:24349348

  2. Exploiting target amplitude information to improve multi-target tracking

    NASA Astrophysics Data System (ADS)

    Ehrman, Lisa M.; Blair, W. Dale

    2006-05-01

    Closely-spaced (but resolved) targets pose a challenge for measurement-to-track data association algorithms. Since the Mahalanobis distances between measurements collected on closely-spaced targets and tracks are similar, several elements of the corresponding kinematic measurement-to-track cost matrix are also similar. Lacking any other information on which to base assignments, it is not surprising that data association algorithms make mistakes. One ad hoc approach for mitigating this problem is to multiply the kinematic measurement-to-track likelihoods by amplitude likelihoods. However, this can actually be detrimental to the measurement-to-track association process. With that in mind, this paper pursues a rigorous treatment of the hypothesis probabilities for kinematic measurements and features. Three simple scenarios are used to demonstrate the impact of basing data association decisions on these hypothesis probabilities for Rayleigh, fixed-amplitude, and Rician targets. The first scenario assumes that the tracker carries two tracks but only one measurement is collected. This provides insight into more complex scenarios in which there are fewer measurements than tracks. The second scenario includes two measurements and one track. This extends naturally to the case with more measurements than tracks. Two measurements and two tracks are present in the third scenario, which provides insight into the performance of this method when the number of measurements equals the number of tracks. In all cases, basing data association decisions on the hypothesis probabilities leads to good results.

  3. Investigating soil moisture feedbacks on precipitation with tests of Granger causality

    NASA Astrophysics Data System (ADS)

    Salvucci, Guido D.; Saleem, Jennifer A.; Kaufmann, Robert

Granger causality (GC) is used in the econometrics literature to identify the presence of one- and two-way coupling between terms in noisy multivariate dynamical systems. Here we test for the presence of GC to identify a soil moisture (S) feedback on precipitation (P) using data from Illinois. In this framework S is said to Granger cause P if F(P_t | Ω_{t-Δt}) ≠ F(P_t | Ω_{t-Δt} - S_{t-Δt}), where F denotes the conditional distribution of P, Ω_{t-Δt} represents the set of all knowledge available at time t-Δt, and Ω_{t-Δt} - S_{t-Δt} represents all knowledge except S. Critical for land-atmosphere interaction research is that Ω_{t-Δt} includes all past information on P as well as S. Therefore that part of the relation between past soil moisture and current precipitation which results from precipitation autocorrelation and soil water balance will be accounted for and not attributed to causality. Tests for GC usually specify all relevant variables in a coupled vector autoregressive (VAR) model and then calculate the significance level of decreased predictability as various coupling coefficients are omitted. But because the data (daily precipitation and soil moisture) are distinctly non-Gaussian, we avoid using a VAR and instead express the daily precipitation events as a Markov model. We then test whether the probability of storm occurrence, conditioned on past information on precipitation, changes with information on soil moisture. Past information on precipitation is expressed both as the occurrence of previous-day precipitation (to account for storm-scale persistence) and as a simple soil-moisture-like precipitation-wetness index derived solely from precipitation (to account for seasonal-scale persistence). In this way only those fluctuations in moisture not attributable to past fluctuations in precipitation (e.g., those due to temperature) can influence the outcome of the test.
The null hypothesis (no moisture influence) is evaluated by comparing observed changes in storm probability to Monte-Carlo-simulated differences generated with unconditional occurrence probabilities. The null hypothesis is not rejected (p > 0.5), suggesting that, contrary to recently published results, insufficient evidence exists to support an influence of soil moisture on precipitation in Illinois.
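A Monte-Carlo test in the spirit of this Markov approach can be sketched with synthetic data. Everything below (series lengths, transition probabilities, the AR(1) soil-moisture proxy, the permutation scheme) is a hypothetical stand-in for the Illinois records, constructed so that the null of no moisture influence is true by design:

```python
# Sketch: does storm occurrence depend on prior-day soil moisture beyond
# prior-day precipitation? Permutation test on synthetic daily series.
import random

random.seed(4)
n = 2000
# Storm occurrence follows a Markov chain (storm-scale persistence);
# soil moisture is an independent AR(1), so the null holds by construction.
occ = [0]
for _ in range(1, n):
    p = 0.5 if occ[-1] else 0.2
    occ.append(1 if random.random() < p else 0)
soil = [0.0]
for _ in range(1, n):
    soil.append(0.9 * soil[-1] + random.gauss(0, 1))

def stat(occ_s, soil_s):
    """Difference in storm probability between wet and dry prior-day soil,
    within the no-storm-yesterday stratum (controls for persistence)."""
    med = sorted(soil_s)[len(soil_s) // 2]
    wet = [occ_s[t] for t in range(1, len(occ_s))
           if occ_s[t - 1] == 0 and soil_s[t - 1] > med]
    dry = [occ_s[t] for t in range(1, len(occ_s))
           if occ_s[t - 1] == 0 and soil_s[t - 1] <= med]
    return sum(wet) / len(wet) - sum(dry) / len(dry)

obs = stat(occ, soil)
null = []
for _ in range(500):
    shuffled = soil[:]
    random.shuffle(shuffled)          # break any moisture-storm pairing
    null.append(stat(occ, shuffled))
p_value = sum(abs(s) >= abs(obs) for s in null) / len(null)
print(f"observed difference {obs:+.3f}, permutation p = {p_value:.2f}")
```

Because past precipitation is conditioned on inside the statistic, any detected moisture effect could not be an artifact of precipitation autocorrelation, which is the central design point of the paper's test.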

  4. Compensating for geographic variation in detection probability with water depth improves abundance estimates of coastal marine megafauna.

    PubMed

    Hagihara, Rie; Jones, Rhondda E; Sobtzick, Susan; Cleguer, Christophe; Garrigue, Claire; Marsh, Helene

    2018-01-01

The probability of an aquatic animal being available for detection is typically <1. Accounting for covariates that reduce the probability of detection is important for obtaining robust estimates of the population abundance and determining its status and trends. The dugong (Dugong dugon) is a bottom-feeding marine mammal and a seagrass community specialist. We hypothesized that the probability of a dugong being available for detection is dependent on water depth and that dugongs spend more time underwater in deep-water seagrass habitats than in shallow-water seagrass habitats. We tested this hypothesis by quantifying the depth use of 28 wild dugongs fitted with GPS satellite transmitters and time-depth recorders (TDRs) at three sites with distinct seagrass depth distributions: 1) open waters supporting extensive seagrass meadows to 40 m deep (Torres Strait, 6 dugongs, 2015); 2) a protected bay (average water depth 6.8 m) with extensive shallow seagrass beds (Moreton Bay, 13 dugongs, 2011 and 2012); and 3) a mixture of lagoon, coral and seagrass habitats to 60 m deep (New Caledonia, 9 dugongs, 2013). The fitted instruments were used to measure the times the dugongs spent in the experimentally determined detection zones under various environmental conditions. The estimated probability of detection was applied to aerial survey data previously collected at each location. In general, dugongs were least available for detection in Torres Strait, and the population estimates increased 6-7 fold using depth-specific availability correction factors compared with earlier estimates that assumed homogeneous detection probability across water depth and location. Detection probabilities were higher in Moreton Bay and New Caledonia than in Torres Strait because the water transparency in these two locations was much greater than in Torres Strait, and the effect of correcting for depth-specific detection probability was much smaller.
The methodology has application to visual survey of coastal megafauna including surveys using Unmanned Aerial Vehicles.
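The arithmetic of an availability correction is simple to illustrate: each stratum's count is divided by the probability that an animal there was available for detection. The strata, counts, and probabilities below are hypothetical, not the study's values:

```python
# Sketch: correcting a survey count for depth-specific availability.
# All numbers are illustrative placeholders.
counts = {"0-5 m": 40, "5-15 m": 25, "15-40 m": 10}        # animals seen
availability = {"0-5 m": 0.50, "5-15 m": 0.25, "15-40 m": 0.10}

naive = sum(counts.values())
corrected = sum(c / availability[d] for d, c in counts.items())
print(f"naive count {naive}, availability-corrected estimate {corrected:.0f}")
```

Because deep strata have low availability, they dominate the corrected total, which is how a homogeneous-detection assumption can understate abundance severalfold in deep-water habitat such as Torres Strait.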

  5. Developing a Hypothetical Learning Trajectory for the Sampling Distribution of the Sample Means

    NASA Astrophysics Data System (ADS)

    Syafriandi

    2018-04-01

    Special types of probability distribution are sampling distributions that are important in hypothesis testing. The concept of a sampling distribution may well be the key concept in understanding how inferential procedures work. In this paper, we will design a hypothetical learning trajectory (HLT) for the sampling distribution of the sample mean, and we will discuss how the sampling distribution is used in hypothesis testing.
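The sampling distribution at the heart of the HLT can be made concrete by simulation. A minimal sketch, with illustrative parameters:

```python
import random
import statistics

# Simulate the sampling distribution of the sample mean: draw many samples of
# size n from a skewed population and record each sample's mean. The
# exponential population (mean 1, sd 1) and the sizes are illustrative.
random.seed(1)
n = 30        # sample size
reps = 5000   # number of samples drawn

means = [statistics.fmean(random.expovariate(1.0) for _ in range(n))
         for _ in range(reps)]

# Despite the skewed population, the distribution of sample means is centred
# on the population mean with spread close to sigma / sqrt(n).
print(round(statistics.fmean(means), 2), round(statistics.stdev(means), 2))
```

The simulated means cluster around the population mean with spread near σ/√n, which is the behaviour a hypothetical learning trajectory for this topic aims to have students discover.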

  6. More attention when speaking: does it help or does it hurt?

    PubMed

    Nozari, Nazbanou; Thompson-Schill, Sharon L

    2013-11-01

    Paying selective attention to a word in a multi-word utterance results in a decreased probability of error on that word (benefit), but an increased probability of error on the other words (cost). We ask whether excitation of the prefrontal cortex helps or hurts this cost. One hypothesis (the resource hypothesis) predicts a decrease in the cost due to the deployment of more attentional resources, while another (the focus hypothesis) predicts even greater costs due to further fine-tuning of selective attention. Our results are more consistent with the focus hypothesis: prefrontal stimulation caused a reliable increase in the benefit and a marginal increase in the cost of selective attention. To ensure that the effects are due to changes to the prefrontal cortex, we provide two checks: We show that the pattern of results is quite different if, instead, the primary motor cortex is stimulated. We also show that the stimulation-related benefits in the verbal task correlate with the stimulation-related benefits in an N-back task, which is known to tap into a prefrontal function. Our results shed light on how selective attention affects language production, and more generally, on how selective attention affects production of a sequence over time. Copyright © 2013 Elsevier Ltd. All rights reserved.

  7. Paleoindian demography and the extraterrestrial impact hypothesis

    NASA Astrophysics Data System (ADS)

    Buchanan, Briggs; Collard, Mark; Edinborough, Kevan

    2008-08-01

    Recently it has been suggested that one or more large extraterrestrial (ET) objects struck northern North America 12,900 ± 100 calendar years before present (calBP) [Firestone RB, et al. (2007) Proc Natl Acad Sci USA 104: 16016-16021]. This impact is claimed to have triggered the Younger Dryas major cooling event and resulted in the extinction of the North American megafauna. The impact is also claimed to have caused major cultural changes and population decline among the Paleoindians. Here, we report a study in which ≈1,500 radiocarbon dates from archaeological sites in Canada and the United States were used to test the hypothesis that the ET impact resulted in a population decline among the Paleoindians. Following recent studies [e.g., Gamble C, Davies W, Pettitt P, Hazelwood L, Richards M (2005) Camb Archaeol J 15:193-223], the summed probability distribution of the calibrated dates was used to identify probable changes in human population size between 15,000 and 9,000 calBP. Subsequently, potential biases were evaluated by modeling and spatial analysis of the dated occupations. The results of the analyses were not consistent with the predictions of the extraterrestrial impact hypothesis. No evidence of a population decline among the Paleoindians at 12,900 ± 100 calBP was found. Thus, minimally, the study suggests the extraterrestrial impact hypothesis should be amended.

  8. Paleoindian demography and the extraterrestrial impact hypothesis.

    PubMed

    Buchanan, Briggs; Collard, Mark; Edinborough, Kevan

    2008-08-19

    Recently it has been suggested that one or more large extraterrestrial (ET) objects struck northern North America 12,900 +/- 100 calendar years before present (calBP) [Firestone RB, et al. (2007) Proc Natl Acad Sci USA 104: 16016-16021]. This impact is claimed to have triggered the Younger Dryas major cooling event and resulted in the extinction of the North American megafauna. The impact is also claimed to have caused major cultural changes and population decline among the Paleoindians. Here, we report a study in which approximately 1,500 radiocarbon dates from archaeological sites in Canada and the United States were used to test the hypothesis that the ET impact resulted in a population decline among the Paleoindians. Following recent studies [e.g., Gamble C, Davies W, Pettitt P, Hazelwood L, Richards M (2005) Camb Archaeol J 15:193-223], the summed probability distribution of the calibrated dates was used to identify probable changes in human population size between 15,000 and 9,000 calBP. Subsequently, potential biases were evaluated by modeling and spatial analysis of the dated occupations. The results of the analyses were not consistent with the predictions of the extraterrestrial impact hypothesis. No evidence of a population decline among the Paleoindians at 12,900 +/- 100 calBP was found. Thus, minimally, the study suggests the extraterrestrial impact hypothesis should be amended.
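The demographic proxy used in the study is a summed probability distribution (SPD): each calibrated date contributes a probability density over calendar years, and the densities are summed. A toy sketch, with hypothetical dates and Gaussian densities standing in for real calibration output:

```python
import math

# Toy summed probability distribution (SPD): each radiocarbon date contributes
# a calibrated probability density over calendar years, and the densities are
# summed to give a relative-population proxy. Real calibrated densities are
# irregular; Gaussians here are a stand-in, and the dates are hypothetical.
def gaussian(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

dates = [(12900, 60), (12850, 80), (12500, 70), (11000, 90)]  # (mean calBP, sd)
grid = range(9000, 15001, 10)  # calendar grid in calBP, 10-year steps

spd = [sum(gaussian(t, mu, sd) for mu, sd in dates) for t in grid]
peak = max(zip(spd, grid))[1]  # grid year with the highest summed density
print(peak)
```

A population decline would appear as a sustained dip in the SPD after a given calBP horizon; the study found no such dip at 12,900 ± 100 calBP.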

  9. Statistical hypothesis tests of some micrometeorological observations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    SethuRaman, S.; Tichler, J.

    Chi-square goodness-of-fit is used to test the hypothesis that the medium scale of turbulence in the atmospheric surface layer is normally distributed. Coefficients of skewness and excess are computed from the data. If the data are not normal, these coefficients are used in Edgeworth's asymptotic expansion of the Gram-Charlier series to determine an alternate probability density function. The observed data are then compared with the modified probability densities and the new chi-square values computed. Seventy percent of the data analyzed was either normal or approximately normal. The coefficient of skewness g1 has a good correlation with the chi-square values. Events with |g1| < 0.21 were normal to begin with, and those with 0.21 …
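The moment coefficients and the chi-square comparison of binned counts against normal expectations can be sketched as follows, on simulated data:

```python
import math
import random

# Normality testing as the abstract describes: compute the coefficients of
# skewness (g1) and excess (g2) from sample moments, then a chi-square
# goodness-of-fit statistic comparing binned counts with normal-curve
# expectations. The data are simulated, not the micrometeorological records.
random.seed(7)
x = [random.gauss(0.0, 1.0) for _ in range(1000)]

n = len(x)
mean = sum(x) / n
m2 = sum((v - mean) ** 2 for v in x) / n
m3 = sum((v - mean) ** 3 for v in x) / n
m4 = sum((v - mean) ** 4 for v in x) / n
g1 = m3 / m2 ** 1.5       # coefficient of skewness
g2 = m4 / m2 ** 2 - 3.0   # coefficient of excess (kurtosis - 3)

def norm_cdf(z):
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Chi-square statistic over equal-width bins of the standardized sample.
edges = [-3.0 + 0.5 * i for i in range(13)]  # edges from -3.0 to 3.0
sd = math.sqrt(m2)
chi2 = 0.0
for lo, hi in zip(edges[:-1], edges[1:]):
    observed = sum(lo <= (v - mean) / sd < hi for v in x)
    expected = n * (norm_cdf(hi) - norm_cdf(lo))
    chi2 += (observed - expected) ** 2 / expected
print(round(g1, 2), round(g2, 2), round(chi2, 1))
```

For genuinely normal data, g1 and g2 sit near zero and the chi-square statistic stays near its degrees of freedom; large |g1| would instead motivate the Edgeworth-modified density the abstract mentions.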

  10. Functional imaging of brain responses to different outcomes of hypothesis testing: revealed in a category induction task.

    PubMed

    Li, Fuhong; Cao, Bihua; Luo, Yuejia; Lei, Yi; Li, Hong

    2013-02-01

    Functional magnetic resonance imaging (fMRI) was used to examine differences in brain activation that occur when a person receives the different outcomes of hypothesis testing (HT). Participants were provided with a series of images of batteries and were asked to learn a rule governing what kinds of batteries were charged. Within each trial, the first two charged batteries were sequentially displayed, and participants would generate a preliminary hypothesis based on the perceptual comparison. Next, a third battery that served to strengthen, reject, or was irrelevant to the preliminary hypothesis was displayed. The fMRI results revealed that (1) no significant differences in brain activation were found between the 2 hypothesis-maintain conditions (i.e., strengthen and irrelevant conditions); and (2) compared with the hypothesis-maintain conditions, the hypothesis-reject condition activated the left medial frontal cortex, bilateral putamen, left parietal cortex, and right cerebellum. These findings are discussed in terms of the neural correlates of the subcomponents of HT and working memory manipulation. Copyright © 2012 Elsevier Inc. All rights reserved.

  11. Bladder pain syndrome/interstitial cystitis as a functional somatic syndrome.

    PubMed

    Warren, John W

    2014-12-01

    To determine whether bladder pain syndrome/interstitial cystitis (BPS/IC) has the characteristics of a functional somatic syndrome (FSS). There is no accepted definition of an FSS. Consequently, this paper reviewed the literature for common FSS characteristics and for reports that BPS/IC has these characteristics. Eleven articles met inclusion and exclusion criteria and yielded 18 FSS characteristics. BPS/IC patients manifest all but two: the exceptions were normal light microscopic anatomy (after hydrodistention under anesthesia, some BPS/IC bladders have Hunner's lesions and most have petechial hemorrhages) and normal laboratory tests (many BPS/IC patients have hematuria). Petechial hemorrhages and hematuria are probably related and may appear during naturally-occurring bladder distention. Without such distention, then, the 90% of BPS/IC patients without a Hunner's lesion have all the characteristics of an FSS. Comparisons in the opposite direction were consistent: several additional features of BPS/IC were found in FSSs. This systematic but untested method is consistent with but does not test the hypothesis that BPS/IC in some patients might best be understood as an FSS. Like most conditions, BPS/IC is probably heterogeneous; hence only a proportion of BPS/IC cases are likely to be manifestations of an FSS. This hypothesis has several implications. Explorations of processes that connect the FSSs might contribute to understanding the pathogenesis of BPS/IC. Patients with FSSs are at risk for BPS/IC and may benefit from future preventive strategies. Therapies that are useful in FSSs also may be useful in some cases of BPS/IC. Copyright © 2014 Elsevier Inc. All rights reserved.

  12. Some properties of a 5-parameter bivariate probability distribution

    NASA Technical Reports Server (NTRS)

    Tubbs, J. D.; Brewer, D. W.; Smith, O. E.

    1983-01-01

    A five-parameter bivariate gamma distribution having two shape parameters, two location parameters and a correlation parameter was developed. This more general bivariate gamma distribution reduces to the known four-parameter distribution. The five-parameter distribution gives a better fit to the gust data. The statistical properties of this general bivariate gamma distribution and a hypothesis test were investigated. Although these developments have come too late in the Shuttle program to be used directly as design criteria for ascent wind gust loads, the new wind gust model has helped to explain the wind profile conditions which cause large dynamic loads. Other potential applications of the newly developed five-parameter bivariate gamma distribution are in the areas of reliability theory, signal noise, and vibration mechanics.

  13. Paroxysmal Kinesigenic Dyskinesia.

    PubMed

    Mallik, Ritwika; Nandi, Sitansu Sekhar

    2016-04-01

    We present a case of paroxysmal kinesigenic dyskinesia (PKD) in a 21-year-old girl with no family history of similar episodes. The episodes were short (lasting less than a minute), frequent (occurring 5 to 10 times a day), self-limiting dystonic attacks of her right upper limb precipitated by sudden movement. She also had a past history of partial seizures with secondary generalization in her childhood. She responded to phenytoin, with cessation of events after 1 month of treatment. This case supports the hypothesis of an association between seizure activity and PKD, probably due to a common focus of origin. Awareness of this condition is required, as it is easily treatable but frequently misdiagnosed. © Journal of the Association of Physicians of India 2011.

  14. The Coding Question.

    PubMed

    Gallistel, C R

    2017-07-01

    Recent electrophysiological results imply that the duration of the stimulus onset asynchrony in eyeblink conditioning is encoded by a mechanism intrinsic to the cerebellar Purkinje cell. This raises the general question - how is quantitative information (durations, distances, rates, probabilities, amounts, etc.) transmitted by spike trains and encoded into engrams? The usual assumption is that information is transmitted by firing rates. However, rate codes are energetically inefficient and computationally awkward. A combinatorial code is more plausible. If the engram consists of altered synaptic conductances (the usual assumption), then we must ask how numbers may be written to synapses. It is much easier to formulate a coding hypothesis if the engram is realized by a cell-intrinsic molecular mechanism. Copyright © 2017 Elsevier Ltd. All rights reserved.

  15. Microcephalic osteodysplastic primordial dwarfism type I/III in sibs.

    PubMed Central

    Meinecke, P; Passarge, E

    1991-01-01

    The clinical and radiological findings in a pair of sibs with microcephalic osteodysplastic primordial dwarfism (MOPD) are described, a boy who survived for 5 1/2 years and his more severely affected younger sister, who died at the age of 6 months. Neuropathological studies in this girl showed marked micrencephaly with severely hypoplastic, poorly gyrated frontal lobes and absent corpus callosum. Our observation supports the hypothesis that types I and III MOPD probably constitute a spectrum of one and the same entity and published data together with this report are consistent with autosomal recessive inheritance. The pathogenesis of this condition is as yet unknown, but its characteristics indicate a basic defect affecting cell proliferation and tissue differentiation. PMID:1770539

  16. The Use of Probability Theory as a Basis for Planning and Controlling Overhead Costs in Education and Industry. Final Report.

    ERIC Educational Resources Information Center

    Vinson, R. B.

    In this report, the author suggests changes in the treatment of overhead costs by hypothesizing that "the effectiveness of standard costing in planning and controlling overhead costs can be increased through the use of probability theory and associated statistical techniques." To test the hypothesis, the author (1) presents an overview of the…

  17. The thresholds for statistical and clinical significance – a five-step procedure for evaluation of intervention effects in randomised clinical trials

    PubMed Central

    2014-01-01

    Background Thresholds for statistical significance are insufficiently demonstrated by 95% confidence intervals or P-values when assessing results from randomised clinical trials. First, a P-value only shows the probability of getting a result assuming that the null hypothesis is true and does not reflect the probability of getting a result assuming an alternative hypothesis to the null hypothesis is true. Second, a confidence interval or a P-value showing significance may be caused by multiplicity. Third, statistical significance does not necessarily result in clinical significance. Therefore, assessment of intervention effects in randomised clinical trials deserves more rigour in order to become more valid. Methods Several methodologies for assessing the statistical and clinical significance of intervention effects in randomised clinical trials were considered. Balancing simplicity and comprehensiveness, a simple five-step procedure was developed. Results For a more valid assessment of results from a randomised clinical trial we propose the following five-steps: (1) report the confidence intervals and the exact P-values; (2) report Bayes factor for the primary outcome, being the ratio of the probability that a given trial result is compatible with a ‘null’ effect (corresponding to the P-value) divided by the probability that the trial result is compatible with the intervention effect hypothesised in the sample size calculation; (3) adjust the confidence intervals and the statistical significance threshold if the trial is stopped early or if interim analyses have been conducted; (4) adjust the confidence intervals and the P-values for multiplicity due to number of outcome comparisons; and (5) assess clinical significance of the trial results. Conclusions If the proposed five-step procedure is followed, this may increase the validity of assessments of intervention effects in randomised clinical trials. PMID:24588900
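Step (2)'s Bayes factor can be computed directly when the trial result is summarised as a z-statistic. A minimal sketch with illustrative numbers (the z = 3.24 design value is an assumption, not from the paper):

```python
import math

# Bayes factor as described in step (2): the probability of the observed trial
# result under a 'null' effect divided by its probability under the effect
# assumed in the sample size calculation. For a z-statistic this is a ratio of
# normal densities. All numbers are illustrative.
def normal_pdf(z, mu=0.0, sd=1.0):
    return math.exp(-0.5 * ((z - mu) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))

def bayes_factor(z_observed, z_hypothesised):
    return normal_pdf(z_observed, 0.0) / normal_pdf(z_observed, z_hypothesised)

# Observed z = 2.0 against a design effect corresponding to z = 3.24:
bf = bayes_factor(2.0, 3.24)
print(round(bf, 2))  # a value < 1 favours the hypothesised effect over the null
```

Here the data are about 0.29 times as probable under the null as under the design effect, so the trial result, though "significant", only modestly favours the hypothesised effect.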

  18. Maestlin's teaching of Copernicus. The evidence of his university textbook and disputations.

    NASA Astrophysics Data System (ADS)

    Methuen, C.

    1996-06-01

    Michael Maestlin (1550 - 1631), professor of mathematics at the University of Tübingen from 1584 until his death, is probably best known as the teacher of Johannes Kepler. As such he has merited more attention from historians than most other sixteenth-century German professors of mathematics. While Maestlin's own achievements (for instance, his correct description of earthshine, his observation and identification of the nova of 1572, and his attempt to determine the orbit of the comet of 1577 - 1578) have been noted, Kepler's testimony that he learned the Copernican system from Maestlin has meant that attention has been focused on Maestlin's attitude toward the Copernican hypothesis. Central to the ensuing discussion has been the question of how Maestlin taught Copernicus, and particularly the light shed on it by such of Maestlin's teaching materials as survive; that question forms the subject of this paper. Copernicus's planetary hypothesis was probably not taught to every student in the University of Tübingen, but Maestlin seems always to have made the new hypothesis available to those students who had the interest and ability to pursue it.

  19. Deterministic versus evidence-based attitude towards clinical diagnosis.

    PubMed

    Soltani, Akbar; Moayyeri, Alireza

    2007-08-01

    Generally, two basic classes have been proposed for scientific explanation of events. Deductive reasoning emphasizes on reaching conclusions about a hypothesis based on verification of universal laws pertinent to that hypothesis, while inductive or probabilistic reasoning explains an event by calculation of some probabilities for that event to be related to a given hypothesis. Although both types of reasoning are used in clinical practice, evidence-based medicine stresses on the advantages of the second approach for most instances in medical decision making. While 'probabilistic or evidence-based' reasoning seems to involve more mathematical formulas at the first look, this attitude is more dynamic and less imprisoned by the rigidity of mathematics comparing with 'deterministic or mathematical attitude'. In the field of medical diagnosis, appreciation of uncertainty in clinical encounters and utilization of likelihood ratio as measure of accuracy seem to be the most important characteristics of evidence-based doctors. Other characteristics include use of series of tests for refining probability, changing diagnostic thresholds considering external evidences and nature of the disease, and attention to confidence intervals to estimate uncertainty of research-derived parameters.
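The likelihood-ratio update the abstract highlights is a three-line calculation: convert the pre-test probability to odds, multiply by the test's likelihood ratio, and convert back. A minimal sketch with illustrative numbers:

```python
# Updating a diagnostic probability with a likelihood ratio, the core
# evidence-based step the abstract describes. All numbers are illustrative.
def post_test_probability(pre_test_prob, likelihood_ratio):
    pre_odds = pre_test_prob / (1.0 - pre_test_prob)
    post_odds = pre_odds * likelihood_ratio
    return post_odds / (1.0 + post_odds)

# A positive test with LR+ = 10 applied to a 20% pre-test probability:
p = post_test_probability(0.20, 10.0)
print(round(p, 3))  # 0.714
```

A strongly positive test (high LR+) can push a modest clinical suspicion above a treatment threshold, while an LR near 1 leaves the probability essentially unchanged, which is why the likelihood ratio serves as the measure of a test's accuracy here.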

  20. Examinations of the reward comparison hypothesis: The modulation of gender and footshock.

    PubMed

    Huang, Andrew Chih Wei; Wang, Cheng Chung; Wang, Shiun

    2015-11-01

    The reward comparison hypothesis suggests that the conditioned suppression of saccharin intake induced by drugs of abuse occurs because the reward value of the drug outweighs that of the saccharin solution, dissociating this suppression from the aversive LiCl-induced conditioned taste aversion (CTA). Huang and Hsiao (2008) provided conflicting data that challenge the reward comparison hypothesis. Whether the rewarding drug-induced conditioned suppression and the aversive LiCl-induced CTA result from aversion or reward should be addressed. The present study investigated how gender and footshock affect aversive LiCl- and rewarding morphine- and methamphetamine (MAMPH)-induced conditioned suppression to re-examine the reward comparison hypothesis. The results indicated that gender and footshock did not directly influence the aversive LiCl-induced CTA or rewarding morphine- and MAMPH-induced conditioned suppression. The gender effect interacted with the drug effect in the aversive LiCl- and rewarding MAMPH-induced conditioned suppression but not in the rewarding morphine-induced conditioned suppression. Footshock interacted with the drug effect in rewarding morphine- and MAMPH-induced conditioned suppression, but not in the aversive LiCl-induced CTA. Therefore, the gender and footshock effects might play a modulatory (but not a mediating) role with respect to the drug effect. The present data indicated that footshock modulates drug-induced conditioned suppression, which is consistent with the reward comparison hypothesis, but our findings regarding the modulatory role of the gender effect do not support this hypothesis. The reward comparison hypothesis should be discussed and possibly reconsidered. Copyright © 2015. Published by Elsevier Inc.

  1. A note on the IQ of monozygotic twins raised apart and the order of their birth.

    PubMed

    Pencavel, J H

    1976-10-01

    This note examines James Shields' sample of monozygotic twins raised apart to entertain the hypothesis that there is a significant association between the measured IQ of these twins and the order of their birth. A non-parametric test supports this hypothesis and then a linear probability function is estimated that discriminates the effects on IQ of birth order from the effects of birth weight.
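A linear probability function of the kind estimated in the note is least squares applied to a 0/1 outcome. A sketch on synthetic data (not Shields' sample); the 0.6/0.4 outcome probabilities and the binary birth-order covariate are assumptions for illustration:

```python
import random

# Linear probability model: least-squares regression of a 0/1 outcome on a
# single covariate, using closed-form simple-regression formulas. The data
# are synthetic stand-ins for birth order and an IQ-based indicator.
random.seed(3)
first_born = [random.randint(0, 1) for _ in range(200)]  # binary covariate
# Outcome probability 0.6 for first-borns, 0.4 otherwise (assumed effect).
higher_iq = [1 if random.random() < (0.6 if f else 0.4) else 0
             for f in first_born]

n = len(first_born)
mx = sum(first_born) / n
my = sum(higher_iq) / n
sxy = sum((x - mx) * (y - my) for x, y in zip(first_born, higher_iq))
sxx = sum((x - mx) ** 2 for x in first_born)
slope = sxy / sxx              # estimated change in outcome probability
intercept = my - slope * mx    # estimated baseline probability
print(round(intercept, 2), round(slope, 2))
```

With a binary covariate, the fitted slope is just the difference in outcome proportions between the two groups, which is what makes the linear probability function a natural complement to the note's non-parametric test.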

  2. Optimum space shuttle launch times relative to natural environment

    NASA Technical Reports Server (NTRS)

    King, R. L.

    1977-01-01

    Three sets of meteorological criteria were analyzed to determine the probabilities of favorable launch and landing conditions. Probabilities were computed for every 3 hours on a yearly basis using 14 years of weather data. These temporal probability distributions, applicable to the three sets of weather criteria encompassing benign, moderate and severe weather conditions, were computed for both Kennedy Space Center (KSC) and Edwards Air Force Base. In addition, conditional probabilities were computed for unfavorable weather conditions occurring after a delay which may or may not be due to weather conditions. Also, for KSC, the probabilities of favorable landing conditions at various times after favorable launch conditions have prevailed have been computed so that mission probabilities may be more accurately computed for those time periods when persistence strongly correlates weather conditions. Moreover, the probabilities and conditional probabilities of the occurrence of both favorable and unfavorable events for each individual criterion were computed to indicate the significance of each weather element to the overall result.
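The conditional probabilities the study tabulates follow from joint and marginal frequencies. A minimal sketch with hypothetical counts:

```python
# Conditional probability from joint frequencies, the quantity the study
# computes for launch/landing planning:
#   P(favorable landing | favorable launch) = P(both) / P(favorable launch).
# The counts below are hypothetical 3-hourly weather records.
favorable_launch = 620
favorable_both = 527          # favorable launch AND favorable landing later
total_periods = 1000

p_launch = favorable_launch / total_periods
p_both = favorable_both / total_periods
p_landing_given_launch = p_both / p_launch
print(round(p_landing_given_launch, 3))  # 0.85
```

Persistence is what makes this conditional probability exceed the unconditional one: weather that is favorable at launch tends to remain favorable over the correlated time window, so mission probabilities computed from the marginal alone would be pessimistic.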

  3. Passage and survival probabilities of juvenile Chinook salmon at Cougar Dam, Oregon, 2012

    USGS Publications Warehouse

    Beeman, John W.; Evans, Scott D.; Haner, Philip V.; Hansel, Hal C.; Hansen, Amy C.; Smith, Collin D.; Sprando, Jamie M.

    2014-01-01

    This report describes studies of juvenile-salmon dam passage and apparent survival at Cougar Dam, Oregon, during two operating conditions in 2012. Cougar Dam is a 158-meter tall rock-fill dam used primarily for flood control, and passes water through a temperature control tower to either a powerhouse penstock or to a regulating outlet (RO). The temperature control tower has moveable weir gates to enable water of different elevations and temperatures to be drawn through the dam to control water temperatures downstream. A series of studies of downstream dam passage of juvenile salmonids were begun after the National Oceanic and Atmospheric Administration determined that Cougar Dam was impacting the viability of anadromous fish stocks. The primary objectives of the studies described in this report were to estimate the route-specific fish passage probabilities at the dam and to estimate the survival probabilities of fish passing through the RO. The first set of dam operating conditions, studied in November, consisted of (1) a mean reservoir elevation of 1,589 feet, (2) water entering the temperature control tower through the weir gates, (3) most water routed through the turbines during the day and through the RO during the night, and (4) mean RO gate openings of 1.2 feet during the day and 3.2 feet during the night. The second set of dam operating conditions, studied in December, consisted of (1) a mean reservoir elevation of 1,507 ft, (2) water entering the temperature control tower through the RO bypass, (3) all water passing through the RO, and (4) mean RO gate openings of 7.3 feet during the day and 7.5 feet during the night. The studies were based on juvenile Chinook salmon (Oncorhynchus tshawytscha) surgically implanted with radio transmitters and passive integrated transponder (PIT) tags. Inferences about general dam passage percentage and timing of volitional migrants were based on surface-acclimated fish released in the reservoir. 
Dam passage and apparent survival probabilities were estimated using the Route-Specific-Survival Model with data from surface-acclimated fish released near the water surface directly upstream of the temperature control tower (treatment group) and slightly downstream of the dam (control group). In this study, apparent survival is the joint probability of surviving and migrating through the study area during the life of the transmitters. Two rearing groups were used to enable sufficient sample sizes for the studies. The groups differed in feed type, and for the December study only, the rearing location. Fish from each group were divided nearly equally among all combinations of release sites, release times, and surgeons. The sizes, travel times, and survivals of the two rearing groups were similar. There were statistical differences in fish lengths and travel times of the two groups, but they were small and likely were not biologically meaningful. There also was evidence of a difference in single-release estimates of survival between the rearing groups during the December study, but the differences had little effect on the relative survival estimates so the analyses of passage and survival were based on data from the rearing groups pooled. Conditions during the December study were more conducive to passing volitionally migrating fish than conditions during the November study. The passage percentage of the fish released in the reservoir was similar between studies (about 70 percent), but the passage occurred in a median of 1.0 day during the December study and a median of 9.3 days during the November study. More than 93 percent of the dam passage of volitionally migrating fish occurred at night during each study. This finding corroborates results of previous studies at Cougar Dam and suggests that the operating conditions at night are most important to volitionally migrating fish, given the current configuration of the dam. 
Most fish released near the temperature control tower passed through the RO. A total of 92.2 percent of the treatment group passed through the RO during the November study and the RO was the only route open during the December study. The assumptions of the survival model were either met or adjusted for during each study. There was little evidence that tagger skill or premature failure of radio transmitters had an effect on survival estimates. There were statistically significant differences in travel times between treatment and control groups through several of the river reaches they had in common, but the differences were typically only a few hours, and the two groups likely experienced the same in-river conditions. There was direct evidence of bias due to detection of euthanized fish with live transmitters released as part of the study design. The bias was ameliorated by adjusting the survival estimates for the probability of detecting dead fish with live transmitters, which reduced the estimated survival probabilities by about 0.02. The data and models indicated that the treatment effect was not fully expressed until the study reach terminating with Marshall Island Park on the Willamette River, a distance of 105.8 kilometers downstream of Cougar Dam. This was the first reach in which the 95-percent confidence interval of the estimated reach-specific relative survival overlapped 1.0, indicating similar survival of treatment and control groups. The median travel time of the treatment group from release to Marshall Island Park was 1.64 days during the November study and 1.36 days during the December study. The survival probability of fish that passed into the RO was greater during the December study than during the November study. The relative survival probability of fish passing through the RO was 0.4594 (standard error [SE] 0.0543) during the November study and 0.7389 (SE 0.1160) during the December study. 
These estimates represent relative survival probabilities from release near Cougar Dam to the Marshall Island site. The estimated survival probability of RO passage was lower than previous studies based on balloon and PIT tags, but higher than a similar study based on radio transmitters. We suggest that, apart from dam operations, the differences in survival primarily are due to the release location. We hypothesize that the balloon- and PIT-tagged fish released through a hose at a point near the RO gate opening experienced more benign conditions than the radio-tagged fish passing the RO volitionally. This hypothesis could be tested with further study. An alternative hypothesis is that some live fish remained within the study area beyond the life of their radio transmitter. The results from these and previous studies indicate that entrainment and survival of juvenile salmonids passing Cougar Dam varies with dam operating conditions. The condition most conducive to dam passage has been the discharge and low pool elevation condition tested during December 2012. That condition included large RO gate openings and was the condition with the highest dam passage survival.

  4. [Lifestyle and probability of dementia in the elderly].

    PubMed

    León-Ortiz, Pablo; Ruiz-Flores, Manuel Leonardo; Ramírez-Bermúdez, Jesús; Sosa-Ortiz, Ana Luisa

    2013-01-01

    There is evidence of a relationship between physical and cognitive activity and the development of dementia, although this hypothesis has not been tested in the Mexican population. We analyzed the association between increased participation in physical and cognitive activities and the probability of having dementia, using an open Mexican population sample. We conducted a cross-sectional survey of urban and rural residents aged 65 and older; we performed cognitive assessments to identify subjects with dementia, as well as questionnaires to assess the level of participation in physical and cognitive activities. We performed a binary logistic regression analysis to establish the association between participation and the probability of having dementia. We included 2,003 subjects, 180 with a diagnosis of dementia. Subjects with dementia were older, had less education, and had a higher prevalence of some chronic diseases. Low participation in cognitive activities was associated with a higher probability of having dementia. Patients with dementia had significantly lower scores on physical activity scales. This study supports the hypothesis of a relationship between low cognitive and physical activity and the presentation of dementia.
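With a single binary predictor, a binary logistic regression of the kind used in the study estimates the same association as the odds ratio from a 2x2 table. A sketch with hypothetical counts (not the study's data):

```python
import math

# Association between low cognitive activity and dementia via an odds ratio
# from a 2x2 table, which is what a binary logistic regression with one
# binary predictor estimates. All counts are hypothetical.
low_activity_dementia = 120
low_activity_no_dementia = 680
high_activity_dementia = 60
high_activity_no_dementia = 1143

odds_ratio = (low_activity_dementia * high_activity_no_dementia) / (
    low_activity_no_dementia * high_activity_dementia)

# Approximate 95% confidence interval via the standard error of log(OR).
log_or_se = math.sqrt(sum(1.0 / c for c in (
    low_activity_dementia, low_activity_no_dementia,
    high_activity_dementia, high_activity_no_dementia)))
ci = (math.exp(math.log(odds_ratio) - 1.96 * log_or_se),
      math.exp(math.log(odds_ratio) + 1.96 * log_or_se))
print(round(odds_ratio, 2), tuple(round(v, 2) for v in ci))
```

An odds ratio above 1 with a confidence interval excluding 1 would indicate the kind of association the study reports between low cognitive activity and dementia.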

  5. A scan statistic for binary outcome based on hypergeometric probability model, with an application to detecting spatial clusters of Japanese encephalitis.

    PubMed

    Zhao, Xing; Zhou, Xiao-Hua; Feng, Zijian; Guo, Pengfei; He, Hongyan; Zhang, Tao; Duan, Lei; Li, Xiaosong

    2013-01-01

    As a useful tool for geographical cluster detection of events, the spatial scan statistic is widely applied in many fields and plays an increasingly important role. The classic version of the spatial scan statistic for a binary outcome was developed by Kulldorff, based on the Bernoulli or the Poisson probability model. In this paper, we apply the hypergeometric probability model to construct the likelihood function under the null hypothesis. Compared with existing methods, the likelihood function under the null hypothesis offers an alternative and indirect way to identify the potential cluster, with the test statistic being the extreme value of the likelihood function. As in Kulldorff's methods, we adopt a Monte Carlo test for the assessment of significance. Both methods are applied to detecting spatial clusters of Japanese encephalitis in Sichuan province, China, in 2009, and the detected clusters are identical. A simulation using independent benchmark data indicates that the test statistic based on the hypergeometric model outperforms Kulldorff's statistics for clusters of high population density or large size; otherwise, Kulldorff's statistics are superior.
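The Monte Carlo test of significance the authors adopt can be sketched for a single candidate window under the hypergeometric null; all counts are hypothetical:

```python
import math
import random

# Monte Carlo significance test for a scan-type statistic under the
# hypergeometric null: given N units with K cases overall, how unusual is
# observing k cases among the n units inside a candidate window?
def log_hypergeom_pmf(k, n, K, N):
    return (math.log(math.comb(K, k)) + math.log(math.comb(N - K, n - k))
            - math.log(math.comb(N, n)))

N, K = 200, 30   # total units, total cases (hypothetical)
n, k = 40, 15    # units and cases inside the scanned window (hypothetical)

# Statistic: improbability of the observed window under the null.
observed_stat = -log_hypergeom_pmf(k, n, K, N)

random.seed(11)
reps = 2000
exceed = 0
cases = [1] * K + [0] * (N - K)
for _ in range(reps):
    random.shuffle(cases)       # randomly relabel which units are cases
    k_sim = sum(cases[:n])      # cases falling in a random window of size n
    if -log_hypergeom_pmf(k_sim, n, K, N) >= observed_stat:
        exceed += 1
p_value = (exceed + 1) / (reps + 1)
print(round(p_value, 4))
```

A full scan statistic maximises over many candidate windows before the Monte Carlo step; this single-window sketch shows the core mechanics of ranking the observed likelihood against its null distribution.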

  6. Territory occupancy and breeding success of Peregrine Falcons Falco peregrinus at various stages of population recovery

    USGS Publications Warehouse

    McGrady, Michael J.; Hines, James; Rollie, Chris; Smith, George D.; Morton, Elise R.; Moore, Jennifer F.; Mearns, Richard M.; Newton, Ian; Murillo-Garcia, Oscar E.; Oli, Madan K.

    2017-01-01

    Organochlorine pesticides disrupted reproduction and killed many raptorial birds, and contributed to population declines during the 1940s to 1970s. We sought to discern whether and to what extent territory occupancy and breeding success changed from the pesticide era to recent years in a resident population of Peregrine Falcons Falco peregrinus in southern Scotland using long-term (1964–2015) field data and multi-state, multi-season occupancy models. Peregrine territories that were occupied with successful reproduction in one year were much more likely to be occupied and experience reproductive success in the following year, compared with those that were unoccupied or occupied by unsuccessful breeders in the previous year. Probability of territory occupancy differed between territories in the eastern and western parts of the study area, and varied over time. The probability of occupancy of territories that were unoccupied and those that were occupied with successful reproduction during the previous breeding season generally increased over time, whereas the probability of occupancy of territories that were occupied after failed reproduction decreased. The probability of reproductive success (conditional on occupancy) in territories that were occupied during the previous breeding season increased over time. Specifically, for territories that had been successful in the previous year, the probability of occupancy as well as reproductive success increased steadily over time; these probabilities were substantially higher in recent years than earlier, when the population was still exposed to direct or residual effects of organochlorine pesticides. These results are consistent with the hypothesis that progressive reduction, followed by a complete ban, in the use of organochlorine pesticides improved reproductive success of Peregrines in southern Scotland. 
Differences in the temporal pattern of probability of reproductive success between south-eastern and south-western Scotland suggest that the effect of organochlorine pesticides on Peregrine reproductive success and/or the recovery from pesticide effects varied geographically and was possibly affected by other factors such as persecution.

  7. The Hypothesis-Driven Physical Examination.

    PubMed

    Garibaldi, Brian T; Olson, Andrew P J

    2018-05-01

    The physical examination remains a vital part of the clinical encounter. However, physical examination skills have declined in recent years, in part because of decreased time at the bedside. Many clinicians question the relevance of physical examinations in the age of technology. A hypothesis-driven approach to teaching and practicing the physical examination emphasizes the performance of maneuvers that can alter the likelihood of disease. Likelihood ratios are diagnostic weights that allow clinicians to estimate the post-test probability of disease. This hypothesis-driven approach to the physical examination increases its value and efficiency, while preserving its cultural role in the patient-physician relationship. Copyright © 2017 Elsevier Inc. All rights reserved.

  8. A predictive approach to selecting the size of a clinical trial, based on subjective clinical opinion.

    PubMed

    Spiegelhalter, D J; Freedman, L S

    1986-01-01

    The 'textbook' approach to determining sample size in a clinical trial has some fundamental weaknesses which we discuss. We describe a new predictive method which takes account of prior clinical opinion about the treatment difference. The method adopts the point of clinical equivalence (determined by interviewing the clinical participants) as the null hypothesis. Decision rules at the end of the study are based on whether the interval estimate of the treatment difference (classical or Bayesian) includes the null hypothesis. The prior distribution is used to predict the probabilities of making the decisions to use one or other treatment or to reserve final judgement. It is recommended that sample size be chosen to control the predicted probability of the last of these decisions. An example is given from a multi-centre trial of superficial bladder cancer.
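
    The predictive calculation can be sketched by simulation: draw the true treatment difference from the prior, simulate the trial's interval estimate, and tally which decision results. Everything below (parameter names, numbers, a normal prior, a 95% interval on a difference of means) is an illustrative assumption, not the authors' method:

```python
import math
import random

def predicted_decision_probs(prior_mean, prior_sd, equiv_point,
                             n_per_arm, sigma, n_sims=20000, seed=0):
    """Predicted probabilities of the three end-of-trial decisions,
    averaging over a normal prior on the true treatment difference.
    A decision is read off from whether the 95% interval for the
    observed difference lies above, below, or across the point of
    clinical equivalence."""
    rng = random.Random(seed)
    se = sigma * math.sqrt(2.0 / n_per_arm)   # SE of the difference in means
    counts = {"use_new": 0, "use_standard": 0, "reserve_judgement": 0}
    for _ in range(n_sims):
        true_diff = rng.gauss(prior_mean, prior_sd)
        observed = rng.gauss(true_diff, se)
        lo, hi = observed - 1.96 * se, observed + 1.96 * se
        if lo > equiv_point:
            counts["use_new"] += 1
        elif hi < equiv_point:
            counts["use_standard"] += 1
        else:
            counts["reserve_judgement"] += 1
    return {k: v / n_sims for k, v in counts.items()}
```

    Sample size would then be increased until the predicted probability of reserving judgement falls below an acceptable level.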

  9. Extended target recognition in cognitive radar networks.

    PubMed

    Wei, Yimin; Meng, Huadong; Liu, Yimin; Wang, Xiqin

    2010-01-01

    We address the problem of adaptive waveform design for extended target recognition in cognitive radar networks. A closed-loop active target recognition radar system is extended to the case of a centralized cognitive radar network, in which a generalized likelihood ratio (GLR) based sequential hypothesis testing (SHT) framework is employed. Using Doppler velocities measured by multiple radars, the target aspect angle for each radar is calculated. The joint probability of each target hypothesis is then updated using observations from different radar lines of sight (LOS). Based on these probabilities, a minimum correlation algorithm is proposed to adaptively design the transmit waveform for each radar in an amplitude fluctuation situation. Simulation results demonstrate performance improvements due to the cognitive radar network and adaptive waveform design. Our minimum correlation algorithm outperforms the eigen-waveform solution and other non-cognitive waveform design approaches.
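
    The joint-probability update across target hypotheses is, at its core, a Bayes step. The paper's GLR-based sequential test involves considerably more machinery; this sketch shows only the per-observation update, with hypothetical inputs:

```python
def update_hypothesis_probs(priors, likelihoods):
    """One step of a multi-hypothesis Bayesian update: weight each
    hypothesis probability by the likelihood of the newest observation
    under that hypothesis, then renormalize."""
    joint = [p * l for p, l in zip(priors, likelihoods)]
    total = sum(joint)
    return [j / total for j in joint]
```

    In a network setting, each radar's observation likelihood would be folded in sequentially until one hypothesis probability crosses a decision threshold.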

  10. Does prediction error drive one-shot declarative learning?

    PubMed

    Greve, Andrea; Cooper, Elisa; Kaula, Alexander; Anderson, Michael C; Henson, Richard

    2017-06-01

    The role of prediction error (PE) in driving learning is well-established in fields such as classical and instrumental conditioning, reward learning and procedural memory; however, its role in human one-shot declarative encoding is less clear. According to one recent hypothesis, PE reflects the divergence between two probability distributions: one reflecting the prior probability (from previous experiences) and the other reflecting the sensory evidence (from the current experience). Assuming unimodal probability distributions, PE can be manipulated in three ways: (1) the distance between the mode of the prior and evidence, (2) the precision of the prior, and (3) the precision of the evidence. We tested these three manipulations across five experiments, in terms of people's ability to encode a single presentation of a scene-item pairing as a function of previous exposures to that scene and/or item. Memory was probed by presenting the scene together with three choices for the previously paired item, in which the two foil items were from other pairings within the same condition as the target item. In Experiment 1, we manipulated the evidence to be either consistent or inconsistent with prior expectations, predicting PE to be larger, and hence memory better, when the new pairing was inconsistent. In Experiments 2a-c, we manipulated the precision of the priors, predicting better memory for a new pairing when the (inconsistent) priors were more precise. In Experiment 3, we manipulated both visual noise and prior exposure for unfamiliar faces, before pairing them with scenes, predicting better memory when the sensory evidence was more precise. In all experiments, the PE hypotheses were supported. We discuss alternative explanations of individual experiments, and conclude that the Predictive Interactive Multiple Memory Signals (PIMMS) framework provides the most parsimonious account of the full pattern of results.

  11. Paleoindian demography and the extraterrestrial impact hypothesis

    PubMed Central

    Buchanan, Briggs; Collard, Mark; Edinborough, Kevan

    2008-01-01

    Recently it has been suggested that one or more large extraterrestrial (ET) objects struck northern North America 12,900 ± 100 calendar years before present (calBP) [Firestone RB, et al. (2007) Proc Natl Acad Sci USA 104: 16016–16021]. This impact is claimed to have triggered the Younger Dryas major cooling event and resulted in the extinction of the North American megafauna. The impact is also claimed to have caused major cultural changes and population decline among the Paleoindians. Here, we report a study in which ≈1,500 radiocarbon dates from archaeological sites in Canada and the United States were used to test the hypothesis that the ET impact resulted in population decline among the Paleoindians. Following recent studies [e.g., Gamble C, Davies W, Pettitt P, Hazelwood L, Richards M (2005) Camb Archaeol J 15:193–223], the summed probability distribution of the calibrated dates was used to identify probable changes in human population size between 15,000 and 9,000 calBP. Subsequently, potential biases were evaluated by modeling and spatial analysis of the dated occupations. The results of the analyses were not consistent with the predictions of the extraterrestrial impact hypothesis. No evidence of a population decline among the Paleoindians at 12,900 ± 100 calBP was found. Thus, minimally, the study suggests the extraterrestrial impact hypothesis should be amended. PMID:18697936
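
    A summed probability distribution of this kind can be sketched by summing one probability density per dated sample over a calendar grid. A real analysis would first calibrate each radiocarbon date against a calibration curve (e.g., IntCal); the plain normal densities below are a simplifying assumption for illustration only:

```python
import math

def summed_probability_distribution(dates, errors, grid):
    """Sum one normal density per dated sample, evaluated at each point
    of a calendar grid.  Peaks suggest periods with more dated
    occupations, a rough proxy for relative population size."""
    spd = []
    for t in grid:
        density = sum(
            math.exp(-0.5 * ((t - mu) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))
            for mu, sd in zip(dates, errors))
        spd.append(density)
    return spd
```

    A population-decline signal at 12,900 calBP would appear as a trough in the curve around that grid position.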

  12. Glassfrog embryos hatch early after parental desertion.

    PubMed

    Delia, Jesse R J; Ramírez-Bautista, Aurelio; Summers, Kyle

    2014-06-22

    Both parental care and hatching plasticity can improve embryo survival. Research has found that parents can alter hatching time owing to a direct effect of care on embryogenesis or via forms of care that cue the hatching process. Because parental care alters conditions critical for offspring development, hatching plasticity could allow embryos to exploit variation in parental behaviour. However, this interaction of parental care and hatching plasticity remains largely unexplored. We tested the hypothesis that embryos hatch early to cope with paternal abandonment in the glassfrog Hyalinobatrachium fleischmanni (Centrolenidae). We conducted male-removal experiments in a wild population, and examined embryos' response to conditions with and without fathers. Embryos hatched early when abandoned, but extended development in the egg stage when fathers continued care. Paternal care had no effect on developmental rate. Rather, hatching plasticity was due to embryos actively hatching at different developmental stages, probably in response to deteriorating conditions without fathers. Our experimental results are supported by a significant correlation between the natural timing of abandonment and hatching in an unmanipulated population. This study demonstrates that embryos can respond to conditions resulting from parental abandonment, and provides insights into how variation in care can affect selection on egg-stage adaptations.

  13. Glassfrog embryos hatch early after parental desertion

    PubMed Central

    Delia, Jesse R. J.; Ramírez-Bautista, Aurelio; Summers, Kyle

    2014-01-01

    Both parental care and hatching plasticity can improve embryo survival. Research has found that parents can alter hatching time owing to a direct effect of care on embryogenesis or via forms of care that cue the hatching process. Because parental care alters conditions critical for offspring development, hatching plasticity could allow embryos to exploit variation in parental behaviour. However, this interaction of parental care and hatching plasticity remains largely unexplored. We tested the hypothesis that embryos hatch early to cope with paternal abandonment in the glassfrog Hyalinobatrachium fleischmanni (Centrolenidae). We conducted male-removal experiments in a wild population, and examined embryos' response to conditions with and without fathers. Embryos hatched early when abandoned, but extended development in the egg stage when fathers continued care. Paternal care had no effect on developmental rate. Rather, hatching plasticity was due to embryos actively hatching at different developmental stages, probably in response to deteriorating conditions without fathers. Our experimental results are supported by a significant correlation between the natural timing of abandonment and hatching in an unmanipulated population. This study demonstrates that embryos can respond to conditions resulting from parental abandonment, and provides insights into how variation in care can affect selection on egg-stage adaptations. PMID:24789892

  14. Novel exposure units for at-home personalized testing of electromagnetic sensibility.

    PubMed

    Huss, Anke; Murbach, Manuel; van Moorselaar, Imke; Kuster, Niels; van Strien, Rob; Kromhout, Hans; Vermeulen, Roel; Slottje, Pauline

    2016-01-01

    Previous experimental studies on electromagnetic hypersensitivity have been criticized regarding inflexibility of choice of exposure and of study locations. We developed and tested novel portable exposure units that can generate different output levels of various extremely low frequency magnetic fields (ELF-MF; 50 Hz field plus harmonics) and radiofrequency electromagnetic fields (RF-EMF). Testing was done with a group of healthy volunteers (n = 25 for 5 ELF-MF and n = 25 for 5 RF-EMF signals) to assess if units were indeed able to produce double-blind exposure conditions. Results substantiated that double-blind conditions were met; on average participants scored 50.6% of conditions correct on the ELF-MF, and 50.0% on the RF-EMF unit, which corresponds to guessing probability. No cues as to exposure conditions were reported. We aim to use these units in a future experiment with subjects who wish to test their personal hypothesis of being able to sense or experience when being exposed to EMF. The new units allow for a high degree of flexibility regarding choice of applied electromagnetic signal, output power level and location (at home or another environment of subjects' choosing). © 2015 Wiley Periodicals, Inc.

  15. The late Neandertal supraorbital fossils from Vindija Cave, Croatia: a biased sample?

    PubMed

    Ahern, James C M; Lee, Sang-Hee; Hawks, John D

    2002-09-01

    The late Neandertal sample from Vindija (Croatia) has been described as transitional between the earlier Central European Neandertals from Krapina (Croatia) and modern humans. However, the morphological differences indicating this transition may rather be the result of different sex and/or age compositions between the samples. This study tests the hypothesis that the metric differences between the Krapina and Vindija supraorbital samples are due to sampling bias. We focus upon the supraorbital region because past studies have posited this region as particularly indicative of the Vindija sample's transitional nature. Furthermore, the supraorbital region varies significantly with both age and sex. We analyzed four chords and two derived indices of supraorbital torus form as defined by Smith & Ranyard (1980, Am. J. phys. Anthrop. 93, pp. 589-610). For each variable, we analyzed relative sample bias of the Krapina and Vindija samples using three sampling methods. In order to test the hypothesis that the Vindija sample contains an over-representation of females and/or young while the Krapina sample is normal or also female/young biased, we determined the probability of drawing a sample of the same size as and with a mean equal to or less than Vindija's from a Krapina-based population. In order to test the hypothesis that the Vindija sample is female/young biased while the Krapina sample is male/old biased, we determined the probability of drawing a sample of the same size as and with a mean equal to or less than Vindija's from a generated population whose mean is halfway between Krapina's and Vindija's. Finally, in order to test the hypothesis that the Vindija sample is normal while the Krapina sample contains an over-representation of males and/or old, we determined the probability of drawing a sample of the same size as and with a mean equal to or greater than Krapina's from a Vindija-based population.
Unless we assume that the Vindija sample is female/young and the Krapina sample is male/old biased, our results falsify the hypothesis that the metric differences between the Krapina and Vindija samples are due to sample bias.
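
    The sampling tests described above can be sketched as a Monte Carlo draw: repeatedly sample from the reference population and count how often the sample mean is at least as extreme as the observed one. A minimal illustration with hypothetical inputs, not the authors' code:

```python
import random
import statistics

def prob_sample_mean_at_most(population, sample_size, observed_mean,
                             n_draws=20000, seed=0):
    """Monte Carlo estimate of the probability of drawing a sample of
    `sample_size` from `population` whose mean is <= `observed_mean`."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_draws):
        sample = rng.sample(population, sample_size)
        if statistics.fmean(sample) <= observed_mean:
            hits += 1
    return hits / n_draws
```

    A small probability would indicate that the smaller sample's mean is unlikely under the reference population, i.e., evidence of sampling bias rather than a real morphological shift.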

  16. Geochemical climate proxies applied to the Neoproterozoic glacial succession on the Yangtze Platform, South China

    NASA Astrophysics Data System (ADS)

    Dobrzinski, Nicole; Bahlburg, Heinrich; Strauss, Harald; Zhang, Qirui

    A Neoproterozoic succession of glaciomarine deposits of probably Sturtian age is preserved on the Yangtze Platform in South China. At that time, the South China block was located in intermediate to low paleolatitudes at ca. 40°. The snowball Earth hypothesis offers one possible explanation for the occurrence of low latitude tillites. The hypothesis is largely based on geological and geochemical observations made in deposits underlying or overlying such tillites on several continents. In contrast, our study focuses on evidence offered by the tillites themselves. We use major, trace and rare earth element geochemistry to evaluate the environmental conditions prevailing during the glaciation. Of particular interest are the intensity of chemical weathering and the relative degree of oxygenation of Neoproterozoic (Nanhuan-Sinian) marine bottom waters. CIA values were obtained from preglacial sand- and siltstones, the matrix of the glacial deposits, fine-grained clastic sediments of a unit intercalated in the glacial succession, and postglacial siltstones and black shales. The data indicate relatively low degrees of chemical weathering for the glacial deposits. In contrast, pre- and postglacial deposits display comparatively elevated levels. This is also true for the intercalated unit, which we interpret as the product of a warmer and more humid interglacial period. Data for S/TOC, U/Th, Cd, Mo, and the Ce anomaly of the glaciomarine samples indicate the presence of oxic bottom waters during the glaciation. The snowball Earth hypothesis predicts the shutdown of chemical weathering on the continents and complete anoxia of the global ocean largely covered by sea ice for several million years. The geochemical record of the Neoproterozoic tillites on the Yangtze Platform is difficult to reconcile with the snowball Earth hypothesis.

  17. A Comparison of Urge Intensity and the Probability of Tic Completion During Tic Freely and Tic Suppression Conditions.

    PubMed

    Specht, Matt W; Nicotra, Cassandra M; Kelly, Laura M; Woods, Douglas W; Ricketts, Emily J; Perry-Parrish, Carisa; Reynolds, Elizabeth; Hankinson, Jessica; Grados, Marco A; Ostrander, Rick S; Walkup, John T

    2014-03-01

    Tic-suppression-based treatments (TSBTs) represent a safe and effective treatment option for Chronic Tic Disorders (CTDs). Prior research has demonstrated that treatment naive youths with CTDs have the capacity to safely and effectively suppress tics for prolonged periods. It remains unclear how tic suppression is achieved. The current study principally examines how effective suppression is achieved, along with preliminary correlates of the ability to suppress tics. Twelve youths, ages 10 to 17 years, with moderate-to-marked CTDs participated in an alternating sequence of tic freely and reinforced tic suppression conditions during which urge intensity and tic frequency were frequently assessed. Tics were half as likely to occur following high-intensity urges during tic suppression (31%) as following low-intensity urges during tic freely conditions (60%). Age was not associated with ability to suppress. Intelligence indices were associated with or trended toward greater ability to suppress tics. Attention difficulties were not associated with ability to suppress but were associated with tic severity. In contrast to our "selective suppression" hypothesis, we found participants equally capable of suppressing their tics regardless of urge intensity during reinforced tic suppression. Tic suppression was achieved with an "across-the-board" effort to resist urges. Preliminary data suggest that ability to suppress may be associated with general cognitive variables rather than age, tic severity, urge severity, and attention. Treatment naive youths appear to possess a capacity for robust tic suppression. TSBTs may bolster these capacities and/or enable their broader implementation, resulting in symptom improvement. © The Author(s) 2014.

  18. Selective Attention in Pigeon Temporal Discrimination.

    PubMed

    Subramaniam, Shrinidhi; Kyonka, Elizabeth

    2017-07-27

    Cues can vary in how informative they are about when specific outcomes, such as food availability, will occur. This study was an experimental investigation of the functional relation between cue informativeness and temporal discrimination in a peak-interval (PI) procedure. Each session consisted of fixed-interval (FI) 2-s and 4-s schedules of food and occasional, 12-s PI trials during which pecks had no programmed consequences. Across conditions, the phi (ϕ) correlation between key light color and FI schedule value was manipulated. Red and green key lights signaled the onset of either or both FI schedules. Different colors were either predictive (ϕ = 1), moderately predictive (ϕ = 0.2-0.8), or not predictive (ϕ = 0) of a specific FI schedule. This study tested the hypothesis that temporal discrimination is a function of the momentary conditional probability of food; that is, pigeons peck the most at either 2 s or 4 s when ϕ = 1 and peck at both intervals when ϕ < 1. Response distributions were bimodal Gaussian curves; distributions from red- and green-key PI trials converged when ϕ ≤ 0.6. Peak times estimated by summed Gaussian functions, averaged across conditions and pigeons, were 1.85 s and 3.87 s; however, pigeons did not always maximize the momentary probability of food. When key light color was highly correlated with FI schedules (ϕ ≥ 0.6), estimates of peak times indicated that temporal discrimination accuracy was reduced at the unlikely interval, but not the likely interval. The mechanism of this reduced temporal discrimination accuracy could be interpreted as an attentional process.
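
    The summed-Gaussian description of response distributions can be sketched as follows; estimated peak times then fall out as the local maxima of the fitted curve. Parameter values below are hypothetical, and the grid-search peak finder stands in for a proper curve fit:

```python
import math

def summed_gaussians(t, params):
    """Value at time t of a sum of Gaussian components;
    params is a list of (amplitude, mean, sd) triples."""
    return sum(a * math.exp(-0.5 * ((t - m) / s) ** 2) for a, m, s in params)

def local_peaks(params, t_max=12.0, step=0.01):
    """Grid-search for local maxima of the summed curve on [0, t_max];
    for a bimodal response curve these are the two peak times."""
    ts = [i * step for i in range(int(t_max / step) + 1)]
    ys = [summed_gaussians(t, params) for t in ts]
    return [ts[i] for i in range(1, len(ts) - 1)
            if ys[i] > ys[i - 1] and ys[i] > ys[i + 1]]
```

    With components centered near the two FI values, the recovered peaks land close to 2 s and 4 s, as in the abstract's estimates.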

  19. Goodman and Kruskal's TAU-B Statistics: A Fortran-77 Subroutine.

    ERIC Educational Resources Information Center

    Berry, Kenneth J.; Mielke, Paul W., Jr.

    1986-01-01

    An algorithm and associated FORTRAN-77 computer subroutine are described for computing Goodman and Kruskal's tau-b statistic along with the associated nonasymptotic probability value under the null hypothesis tau = 0. (Author)
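
    The statistic itself is straightforward to compute from a contingency table: Goodman and Kruskal's tau is the proportional reduction in Gini prediction error for the dependent (here, column) variable once the row category is known. A Python sketch rather than the FORTRAN-77 original; the nonasymptotic p-value would come from a separate permutation step not shown here:

```python
def goodman_kruskal_tau(table):
    """Goodman and Kruskal's tau with the column variable dependent:
    (error from column marginals - error given row category) / marginal error."""
    n = sum(sum(row) for row in table)
    n_cols = len(table[0])
    col_totals = [sum(row[j] for row in table) for j in range(n_cols)]
    e1 = 1.0 - sum((c / n) ** 2 for c in col_totals)   # Gini error, marginals only
    within = 0.0
    for row in table:
        r = sum(row)
        if r:
            within += sum(cell * cell for cell in row) / (r * n)
    e2 = 1.0 - within                                  # Gini error given row
    return (e1 - e2) / e1
```

    Tau is 1 for a perfectly predictive table and 0 under independence.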

  20. Long-term social bonds promote cooperation in the iterated Prisoner's Dilemma.

    PubMed

    St-Pierre, Angèle; Larose, Karine; Dubois, Frédérique

    2009-12-07

    Reciprocal altruism, one of the most probable explanations for cooperation among non-kin, has been modelled as a Prisoner's Dilemma. According to this game, cooperation could evolve when individuals, who expect to play again, use conditional strategies like tit-for-tat or Pavlov. There is evidence that humans use such strategies to achieve mutual cooperation, but most controlled experiments with non-human animals have failed to find cooperation. One reason for this could be that subjects fail to cooperate because they behave as if they were to play only once. To assess this hypothesis, we conducted an experiment with monogamous zebra finches (Taeniopygia guttata) that were tested in a two-choice apparatus, with either their social partner or an experimental opponent of the opposite sex. We found that zebra finches maintained high levels of cooperation in an iterated Prisoner's Dilemma game only when interacting with their social partner. Although other mechanisms may have contributed to the observed difference between the two treatments, our results support the hypothesis that animals do not systematically give in to the short-term temptation of cheating when long-term benefits exist. Thus, our findings contradict the commonly accepted idea that reciprocal altruism will be rare in non-human animals.

  1. CPROB: A COMPUTATIONAL TOOL FOR CONDUCTING CONDITIONAL PROBABILITY ANALYSIS

    EPA Science Inventory

    Conditional probability analysis measures the probability of observing one event given that another event has occurred. In an environmental context, conditional probability analysis helps assess the association between an environmental contaminant (i.e. the stressor) and the ec...
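
    The core computation is simple: among observations where the stressor exceeds a threshold, what fraction show the ecological response of interest? CPROB itself is an EPA tool not reproduced here; this is only a minimal sketch with hypothetical thresholds:

```python
def conditional_probability(stressor, response, s_threshold, r_threshold):
    """P(response >= r_threshold | stressor >= s_threshold), estimated
    from paired observations of a stressor and an ecological response."""
    exceed = [r for s, r in zip(stressor, response) if s >= s_threshold]
    if not exceed:
        return float("nan")   # no observations exceed the stressor threshold
    return sum(r >= r_threshold for r in exceed) / len(exceed)
```

    Sweeping `s_threshold` across the observed range traces out how the probability of impairment changes with increasing contaminant levels.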

  2. Giant Panda Maternal Care: A Test of the Experience Constraint Hypothesis.

    PubMed

    Snyder, Rebecca J; Perdue, Bonnie M; Zhang, Zhihe; Maple, Terry L; Charlton, Benjamin D

    2016-06-07

    The body condition constraint and the experience constraint hypotheses have both been proposed to account for differences in reproductive success between multiparous (experienced) and primiparous (first-time) mothers. However, because primiparous mothers are typically characterized by both inferior body condition and lack of experience when compared to multiparous mothers, interpreting experience-related differences in maternal care as support for either the body condition constraint hypothesis or the experience constraint hypothesis is extremely difficult. Here, we examined maternal behaviour in captive giant pandas, allowing us to simultaneously control for body condition and provide a rigorous test of the experience constraint hypothesis in this endangered animal. We found that multiparous mothers spent more time engaged in key maternal behaviours (nursing, grooming, and holding cubs) and had significantly less vocal cubs than primiparous mothers. This study provides the first evidence supporting the experience constraint hypothesis in the order Carnivora, and may have utility for captive breeding programs in which it is important to monitor the welfare of this species' highly altricial cubs, whose survival is almost entirely dependent on receiving adequate maternal care during the first few weeks of life.

  3. Study on conditional probability of surface rupture: effect of fault dip and width of seismogenic layer

    NASA Astrophysics Data System (ADS)

    Inoue, N.

    2017-12-01

    The conditional probability of surface rupture is affected by various factors, such as shallow material properties, earthquake processes, ground motions and so on. Toda (2013) pointed out differences in the conditional probability between strike-slip and reverse faults arising from fault dip and the width of the seismogenic layer. This study evaluated the conditional probability of surface rupture with the following procedure. Fault geometry was determined from a randomly generated magnitude based on the method of The Headquarters for Earthquake Research Promotion (2017). If the defined fault plane did not saturate the assumed width of the seismogenic layer, the fault plane depth was assigned randomly within the seismogenic layer. Logistic analysis was performed on two data sets: surface displacement calculated by dislocation methods (Wang et al., 2003) from the defined source fault, and the depth of the top of the defined source fault. The conditional probability estimated from surface displacement was higher for reverse faults than for strike-slip faults, a result that coincides with previous similar studies (i.e., Kagawa et al., 2004; Kataoka and Kusakabe, 2005). In contrast, the probability estimated from the depth of the source fault was higher for thrust faults than for strike-slip and reverse faults, a trend similar to the conditional probability of PFDHA results (Youngs et al., 2003; Moss and Ross, 2011). The combined simulated results for thrust and reverse faults also show low probability. The worldwide compiled reverse-fault data include earthquakes with low fault dip angles. In the case of Japanese reverse faults, by contrast, the conditional probability computed with fewer low-dip-angle earthquakes may be low and similar to that of strike-slip faults (i.e., Takao et al., 2013). 
    In the future, numerical simulations considering the failure conditions of the surface above the source fault should be performed, in order to examine the amount of displacement and the conditional probability quantitatively.
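
    A logistic analysis of the kind described maps a predictor (for instance, the depth of the top of the source fault, or magnitude) to a rupture probability. Below is a minimal from-scratch sketch with invented data and gradient ascent on the log-likelihood; it is not the study's actual analysis:

```python
import math

def fit_logistic(x, y, lr=0.1, epochs=5000):
    """Fit P(rupture = 1 | x) = 1 / (1 + exp(-(a + b*x))) by gradient
    ascent on the Bernoulli log-likelihood."""
    a, b = 0.0, 0.0
    n = len(x)
    for _ in range(epochs):
        ga = gb = 0.0
        for xi, yi in zip(x, y):
            p = 1.0 / (1.0 + math.exp(-(a + b * xi)))
            ga += yi - p          # gradient w.r.t. intercept
            gb += (yi - p) * xi   # gradient w.r.t. slope
        a += lr * ga / n
        b += lr * gb / n
    return a, b

def predict(a, b, xi):
    """Predicted rupture probability at predictor value xi."""
    return 1.0 / (1.0 + math.exp(-(a + b * xi)))
```

    With hypothetical magnitudes and rupture outcomes, the fitted curve assigns larger rupture probabilities to larger events.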

  4. Diminished caudate and superior temporal gyrus responses to effort-based decision making in patients with first-episode major depressive disorder.

    PubMed

    Yang, Xin-hua; Huang, Jia; Lan, Yong; Zhu, Cui-ying; Liu, Xiao-qun; Wang, Ye-fei; Cheung, Eric F C; Xie, Guang-rong; Chan, Raymond C K

    2016-01-04

    Anhedonia, the loss of interest or pleasure in reward processing, is a hallmark feature of major depressive disorder (MDD), but its underlying neurobiological mechanism is largely unknown. The present study aimed to examine the underlying neural mechanism of reward-related decision-making in patients with MDD. We examined behavioral and neural responses to rewards in patients with first-episode MDD (N=25) and healthy controls (N=25) using the Effort-Expenditure for Rewards Task (EEfRT). The task involved choices about possible rewards of varying magnitude and probability. We tested the hypothesis that individuals with MDD would exhibit a reduced neural response in reward-related brain structures involved in cost-benefit decision-making. Compared with healthy controls, patients with MDD showed significantly weaker responses in the left caudate nucleus when contrasting the 'high reward' and 'low reward' conditions, and blunted responses in the left superior temporal gyrus and the right caudate nucleus when contrasting high and low probabilities. In addition, hard tasks chosen during high probability trials were negatively correlated with superior temporal gyrus activity in MDD patients, while the same choices were negatively correlated with caudate nucleus activity in healthy controls. These results indicate that reduced caudate nucleus and superior temporal gyrus activation may underpin abnormal cost-benefit decision-making in MDD. Copyright © 2015 Elsevier Inc. All rights reserved.

  5. Neotropical forest expansion during the last glacial period challenges refuge hypothesis.

    PubMed

    Leite, Yuri L R; Costa, Leonora P; Loss, Ana Carolina; Rocha, Rita G; Batalha-Filho, Henrique; Bastos, Alex C; Quaresma, Valéria S; Fagundes, Valéria; Paresque, Roberta; Passamani, Marcelo; Pardini, Renata

    2016-01-26

    The forest refuge hypothesis (FRH) has long been a paradigm for explaining the extreme biological diversity of tropical forests. According to this hypothesis, forest retraction and fragmentation during glacial periods would have promoted reproductive isolation and consequently speciation in forest patches (ecological refuges) surrounded by open habitats. The recent use of paleoclimatic models of species and habitat distributions revitalized the FRH, not by considering refuges as the main drivers of allopatric speciation, but instead by suggesting that high contemporary diversity is associated with historically stable forest areas. However, the role of the emerged continental shelf on the Atlantic Forest biodiversity hotspot of eastern South America during glacial periods has been ignored in the literature. Here, we combined results of species distribution models with coalescent simulations based on DNA sequences to explore the congruence between scenarios of forest dynamics through time and the genetic structure of mammal species cooccurring in the central region of the Atlantic Forest. Contrary to the FRH predictions, we found more fragmentation of suitable habitats during the last interglacial (LIG) and the present than in the last glacial maximum (LGM), probably due to topography. We also detected expansion of suitable climatic conditions onto the emerged continental shelf during the LGM, which would have allowed forests and forest-adapted species to expand. The interplay of sea level and land distribution must have been crucial in the biogeographic history of the Atlantic Forest, and forest refuges played only a minor role, if any, in this biodiversity hotspot during glacial periods.

  6. The Italian national trends in smoking initiation and cessation according to gender and education.

    PubMed

    Sardu, C; Mereu, A; Minerba, L; Contu, P

    2009-09-01

OBJECTIVES. This study aims to assess the trend in initiation and cessation of smoking across successive birth cohorts, according to gender and education, in order to provide useful suggestions for tobacco control policy. STUDY DESIGN. The study is based on data from the "Health conditions and resort to sanitary services" survey carried out in Italy from October 2004 to September 2005 by the National Institute of Statistics. Through a multisampling procedure, a sample representative of the entire national territory was selected. In order to calculate trends in smoking initiation and cessation, data were stratified by birth cohort, gender and education level, and analyzed through the life table method. The cumulative probability of smoking initiation, across subsequent generations, shows a downward trend followed by a plateau. This result highlights that there is no evidence to support the hypothesis of earlier smoking initiation across generations. The cumulative probability of quitting, across subsequent generations, follows an upward trend, highlighting the growing tendency of smokers to become "early quitters", who give up before 30 years of age. Results suggest that the Italian antismoking approach, for the most part targeted at preventing the initiation of smoking by emphasising its negative consequences, has an effect on early smoking cessation. Health policies should reinforce the existing trend of early quitting through specific actions. In addition, our results show that men with low education exhibit the highest probability of smoking initiation and the lowest probability of early quitting, and therefore should be targeted with special attention.
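The life table method mentioned above can be sketched numerically. The counts below are invented for illustration (they are not the ISTAT survey data): within each age interval, the interval-specific initiation probability is the number of initiations divided by the number still at risk, and the cumulative probability of initiation is one minus the running product of the complements.

```python
# Illustrative life-table calculation (made-up counts, not the ISTAT data):
# cumulative probability of smoking initiation by age interval.

at_risk = [1000, 940, 700, 520, 480]   # never-smokers entering each interval
events  = [  60,  240, 180,  40,  10]  # initiations within each interval

surv = 1.0
cumulative = []
for n, d in zip(at_risk, events):
    surv *= 1 - d / n                  # probability of remaining a never-smoker
    cumulative.append(1 - surv)

print([round(p, 3) for p in cumulative])
```

Stratifying these counts by birth cohort, gender, and education would reproduce the kind of trend comparison the abstract describes.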

  7. A Tutorial in Bayesian Potential Outcomes Mediation Analysis.

    PubMed

    Miočević, Milica; Gonzalez, Oscar; Valente, Matthew J; MacKinnon, David P

    2018-01-01

    Statistical mediation analysis is used to investigate intermediate variables in the relation between independent and dependent variables. Causal interpretation of mediation analyses is challenging because randomization of subjects to levels of the independent variable does not rule out the possibility of unmeasured confounders of the mediator to outcome relation. Furthermore, commonly used frequentist methods for mediation analysis compute the probability of the data given the null hypothesis, which is not the probability of a hypothesis given the data as in Bayesian analysis. Under certain assumptions, applying the potential outcomes framework to mediation analysis allows for the computation of causal effects, and statistical mediation in the Bayesian framework gives indirect effects probabilistic interpretations. This tutorial combines causal inference and Bayesian methods for mediation analysis so the indirect and direct effects have both causal and probabilistic interpretations. Steps in Bayesian causal mediation analysis are shown in the application to an empirical example.

  8. Upper bounds on the error probabilities and asymptotic error exponents in quantum multiple state discrimination

    NASA Astrophysics Data System (ADS)

    Audenaert, Koenraad M. R.; Mosonyi, Milán

    2014-10-01

We consider the multiple hypothesis testing problem for symmetric quantum state discrimination between r given states σ1, …, σr. By splitting up the overall test into multiple binary tests in various ways we obtain a number of upper bounds on the optimal error probability in terms of the binary error probabilities. These upper bounds allow us to deduce various bounds on the asymptotic error rate, which has been hypothesized to be given by the multi-hypothesis quantum Chernoff bound (or Chernoff divergence) C(σ1, …, σr), as recently introduced by Nussbaum and Szkoła in analogy with Salikhov's classical multi-hypothesis Chernoff bound. This quantity is defined as the minimum of the pairwise binary Chernoff divergences, min_{j<k} C(σj, σk).
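As a numerical sketch (my illustration, not code from the paper), the multi-hypothesis bound can be evaluated as the minimum pairwise binary quantum Chernoff divergence, C(ρ, σ) = −ln min_{0≤s≤1} Tr(ρ^s σ^(1−s)); the three single-qubit density matrices below are toy examples:

```python
import numpy as np
from scipy.optimize import minimize_scalar

def mat_power(rho, s):
    """Matrix power of a positive semidefinite density matrix via eigh."""
    w, v = np.linalg.eigh(rho)
    w = np.clip(w, 0.0, None)          # guard against tiny negative eigenvalues
    return (v * w**s) @ v.conj().T

def chernoff_divergence(rho, sigma):
    """Binary quantum Chernoff divergence C(rho, sigma)."""
    q = minimize_scalar(
        lambda s: np.trace(mat_power(rho, s) @ mat_power(sigma, 1 - s)).real,
        bounds=(0.0, 1.0), method="bounded")
    return -np.log(q.fun)

def multi_chernoff(states):
    """Multi-hypothesis bound: minimum over all pairwise divergences."""
    return min(chernoff_divergence(a, b)
               for i, a in enumerate(states) for b in states[i + 1:])

z0 = np.array([[1.0, 0.0], [0.0, 0.0]])      # |0><0|
x0 = np.array([[0.5, 0.5], [0.5, 0.5]])      # |+><+|
mixed = np.array([[0.7, 0.0], [0.0, 0.3]])   # a mixed state
c = multi_chernoff([z0, x0, mixed])
print(c)
```

Here the minimum is attained by the (|0⟩⟨0|, mixed) pair, close to −ln 0.7.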

  9. On the alleged collisional origin of the Kirkwood Gaps. [in asteroid belt

    NASA Technical Reports Server (NTRS)

    Heppenheimer, T. A.

    1975-01-01

    This paper examines two proposed mechanisms whereby asteroidal collisions and close approaches may have given rise to the Kirkwood Gaps. The first hypothesis is that asteroids in near-resonant orbits have markedly increased collision probabilities and so are preferentially destroyed, or suffer decay in population density, within the resonance zones. A simple order-of-magnitude analysis shows that this hypothesis is untenable since it leads to conclusions which are either unrealistic or not in accord with present understanding of asteroidal physics. The second hypothesis is the Brouwer-Jefferys theory that collisions would smooth an asteroidal distribution function, as a function of Jacobi constant, thus forming resonance gaps. This hypothesis is examined by direct numerical integration of 50 asteroid orbits near the 2:1 resonance, with collisions simulated by random variables. No tendency to form a gap was observed.

  10. Seasonal variation in non-structural carbohydrates, sucrolytic activity and secondary metabolites in deciduous and perennial Diospyros species sampled in Western Mexico

    PubMed Central

    Ramírez-Briones, Ernesto; Rodríguez-Macías, Ramón; Salcedo-Pérez, Eduardo; Martínez-Gallardo, Norma; Tiessen, Axel; Molina-Torres, Jorge; Délano-Frier, John P.; Zañudo-Hernández, Julia

    2017-01-01

This study was performed to test the working hypothesis that seasonally driven modifications in carbon mobilization and other key biochemical parameters in leaves of the poorly known Diospyros digyna (Ddg; semi-domesticated; perennial) and D. rekoi (Dre; undomesticated; deciduous) trees are determined primarily by environmental growing conditions, agronomic management and physiological plasticity. Thus, biochemical changes in leaves of both trees were recorded seasonally during two successive fruiting years. Trees were randomly sampled in Western Mexico habitats with differing soil quality, climatic conditions, luminosity, and cultivation practices. Leaves of Ddg had consistently higher total chlorophyll contents (CT) that, unexpectedly, peaked in the winter of 2015. In Dre, the highest leaf CT values, recorded in the summer of 2015, inversely correlated with low average luminosity and high Chl a/Chl b ratios. The seasonal CT variations in Dre were congruent with varying luminosity, whereas those in Ddg were probably affected by other factors, such as fluctuating leaf protein contents and the funneling of light energy to the accumulation of foliar non-structural carbohydrates (NSCs), which were consistently higher than those detected in Dre leaves. Seasonal foliar NSC fluctuations in both species were in agreement with the carbon (C) demands of flowering, fruiting and/or leaf regrowth. Seasonal changes in foliar hexose to sucrose (Hex/Suc) ratios coincided with cell wall invertase activity in both species. In Dre, high Hex/Suc ratios in spring leaves possibly allowed an accumulation of phenolic acids, not observed in Ddg. The above results supported the proposed hypothesis by showing that leaf responses to changing environmental conditions differ in perennial and deciduous Diospyros trees, including a dynamic adjustment of NSCs to supply the C demands imposed by reproduction, leaf regrowth and, possibly, stress. PMID:29073239

  11. Timing and Causality in the Generation of Learned Eyelid Responses

    PubMed Central

    Sánchez-Campusano, Raudel; Gruart, Agnès; Delgado-García, José M.

    2011-01-01

The cerebellum-red nucleus-facial motoneuron (Mn) pathway has been reported as being involved in the proper timing of classically conditioned eyelid responses. This special type of associative learning serves as a model of event timing for studying the role of the cerebellum in dynamic motor control. Here, we have re-analyzed the firing activities of cerebellar posterior interpositus (IP) neurons and orbicularis oculi (OO) Mns in alert behaving cats during classical eyeblink conditioning, using a delay paradigm. The aim was to revisit the hypothesis that the IP neurons (IPns) can be considered a neuronal phase-modulating device supporting OO Mns firing with an emergent timing mechanism and an explicit correlation code during learned eyelid movements. Optimized experimental and computational tools allowed us to determine the different causal relationships (temporal order and correlation code) during and between trials. These intra- and inter-trial timing strategies, extending from the sub-second range (millisecond timing) to longer-lasting ranges (interval timing), expanded the functional domain of cerebellar timing beyond motor control. Interestingly, the results supported the above-mentioned hypothesis. The causal inferences were influenced by the precise motor and pre-motor spike timing in the cause-effect interval, and, in addition, the timing of the learned responses depended on cerebellar–Mn network causality. Furthermore, the timing of CRs depended upon the probability of simulated causal conditions in the cause-effect interval and not the mere duration of the inter-stimulus interval. In this work, the close relation between timing and causality was verified. It could thus be concluded that the firing activities of IPns may be related more to the proper performance of ongoing CRs (i.e., the proper timing as a consequence of the pertinent causality) than to their generation and/or initiation. PMID:21941469

  12. Implications of Cognitive Load for Hypothesis Generation and Probability Judgment

    PubMed Central

    Sprenger, Amber M.; Dougherty, Michael R.; Atkins, Sharona M.; Franco-Watkins, Ana M.; Thomas, Rick P.; Lange, Nicholas; Abbs, Brandon

    2011-01-01

We tested the predictions of HyGene (Thomas et al., 2008) that divided attention both during encoding and during judgment should affect the degree to which participants' probability judgments violate the principle of additivity. In two experiments, we showed that divided attention during judgment leads to an increase in subadditivity, suggesting that the comparison process for probability judgments is capacity limited. Contrary to the predictions of HyGene, a third experiment revealed that divided attention during encoding leads to an increase in the subadditivity of later probability judgments made under full attention. The effect of divided attention during encoding on judgment was completely mediated by the number of hypotheses participants generated, indicating that limitations in both encoding and recall can cascade into biases in judgments. PMID:21734897

  13. PASTIS: Bayesian extrasolar planet validation - I. General framework, models, and performance

    NASA Astrophysics Data System (ADS)

    Díaz, R. F.; Almenara, J. M.; Santerne, A.; Moutou, C.; Lethuillier, A.; Deleuil, M.

    2014-06-01

A large fraction of the smallest transiting planet candidates discovered by the Kepler and CoRoT space missions cannot be confirmed by a dynamical measurement of the mass using currently available observing facilities. To establish their planetary nature, the concept of planet validation has been advanced. This technique compares the probability of the planetary hypothesis against that of all reasonably conceivable alternative false positive (FP) hypotheses. The candidate is considered validated if the posterior probability of the planetary hypothesis is sufficiently larger than the sum of the probabilities of all FP scenarios. In this paper, we present PASTIS, the Planet Analysis and Small Transit Investigation Software, a tool designed to perform a rigorous model comparison of the hypotheses involved in the problem of planet validation, and to fully exploit the information available in the candidate light curves. PASTIS self-consistently models the transit light curves and follow-up observations. Its object-oriented structure offers a large flexibility for defining the scenarios to be compared. The performance is explored using artificial transit light curves of planets and FPs with a realistic error distribution obtained from a Kepler light curve. We find that data support the correct hypothesis strongly only when the signal is high enough (transit signal-to-noise ratio above 50 for the planet case) and remain inconclusive otherwise. PLAnetary Transits and Oscillations of stars (PLATO) shall provide transits with high enough signal-to-noise ratio, but to establish the true nature of the vast majority of Kepler and CoRoT transit candidates additional data or strong reliance on hypothesis priors is needed.
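The validation criterion described here is ordinary Bayesian model comparison: each hypothesis gets a posterior proportional to its prior times its evidence. A minimal sketch with invented priors and evidences (not PASTIS output, and not its actual scenario set):

```python
# Toy Bayesian model comparison (numbers invented for illustration):
# posterior probability of each hypothesis given prior and evidence p(D|H).
priors   = {"planet": 0.5, "blended EB": 0.3, "background EB": 0.2}
evidence = {"planet": 2.0e-3, "blended EB": 1.0e-4, "background EB": 5.0e-5}

unnorm = {h: priors[h] * evidence[h] for h in priors}
total = sum(unnorm.values())
posterior = {h: v / total for h, v in unnorm.items()}
print(posterior)
```

With these invented numbers the planetary hypothesis dominates the sum of the false-positive scenarios, which is the validation condition the abstract describes.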

  14. Time, Chance, and Reduction

    NASA Astrophysics Data System (ADS)

    Ernst, Gerhard; Hüttemann, Andreas

    2010-01-01

List of contributors; 1. Introduction Gerhard Ernst and Andreas Hüttemann; Part I. The Arrows of Time: 2. Does a low-entropy constraint prevent us from influencing the past? Mathias Frisch; 3. The past hypothesis meets gravity Craig Callender; 4. Quantum gravity and the arrow of time Claus Kiefer; Part II. Probability and Chance: 5. The natural-range conception of probability Jacob Rosenthal; 6. Probability in Boltzmannian statistical mechanics Roman Frigg; 7. Humean mechanics versus a metaphysics of powers Michael Esfeld; Part III. Reduction: 8. The crystallisation of Clausius's phenomenological thermodynamics C. Ulises Moulines; 9. Reduction and renormalization Robert W. Batterman; 10. Irreversibility in stochastic dynamics Jos Uffink; Index.

  15. Giant Panda Maternal Care: A Test of the Experience Constraint Hypothesis

    PubMed Central

    Snyder, Rebecca J.; Perdue, Bonnie M.; Zhang, Zhihe; Maple, Terry L.; Charlton, Benjamin D.

    2016-01-01

The body condition constraint and the experience constraint hypotheses have both been proposed to account for differences in reproductive success between multiparous (experienced) and primiparous (first-time) mothers. However, because primiparous mothers are typically characterized by both inferior body condition and lack of experience when compared to multiparous mothers, interpreting experience related differences in maternal care as support for either the body condition constraint hypothesis or the experience constraint hypothesis is extremely difficult. Here, we examined maternal behaviour in captive giant pandas, allowing us to simultaneously control for body condition and provide a rigorous test of the experience constraint hypothesis in this endangered animal. We found that multiparous mothers spent more time engaged in key maternal behaviours (nursing, grooming, and holding cubs) and had significantly less vocal cubs than primiparous mothers. This study provides the first evidence supporting the experience constraint hypothesis in the order Carnivora, and may have utility for captive breeding programs in which it is important to monitor the welfare of this species' highly altricial cubs, whose survival is almost entirely dependent on receiving adequate maternal care during the first few weeks of life. PMID:27272352

  16. Optimum space shuttle launch times relative to natural environment

    NASA Technical Reports Server (NTRS)

    King, R. L.

    1977-01-01

The probabilities of favorable and unfavorable weather conditions for launch and landing of the STS under different criteria were computed for every three hours on a yearly basis using 14 years of weather data. These temporal probability distributions were considered for three sets of weather criteria encompassing benign, moderate and severe weather conditions for both Kennedy Space Center and Edwards Air Force Base. In addition, the conditional probabilities were computed for unfavorable weather conditions occurring after a delay which may or may not be due to weather conditions. Also computed for KSC were the probabilities of favorable landing conditions at various times after favorable launch conditions prevailed. The probabilities were computed to indicate the significance of each weather element to the overall result.

  17. Incidence of Lower Respiratory Tract Infections and Atopic Conditions in Boys and Young Male Adults: Royal College of General Practitioners Research and Surveillance Centre Annual Report 2015-2016

    PubMed Central

    Correa, Ana; Pebody, Richard; Yonova, Ivelina; Smith, Gillian; Byford, Rachel; Pathirannehelage, Sameera Rankiri; McGee, Christopher; Elliot, Alex J; Hriskova, Mariya; Ferreira, Filipa IM; Rafi, Imran; Jones, Simon

    2018-01-01

    Background The Royal College of General Practitioners Research and Surveillance Centre comprises more than 150 general practices, with a combined population of more than 1.5 million, contributing to UK and European public health surveillance and research. Objective The aim of this paper was to report gender differences in the presentation of infectious and respiratory conditions in children and young adults. Methods Disease incidence data were used to test the hypothesis that boys up to puberty present more with lower respiratory tract infection (LRTI) and asthma. Incidence rates were reported for infectious conditions in children and young adults by gender. We controlled for ethnicity, deprivation, and consultation rates. We report odds ratios (OR) with 95% CI, P values, and probability of presenting. Results Boys presented more with LRTI, largely due to acute bronchitis. The OR of males consulting was greater across the youngest 3 age bands (OR 1.59, 95% CI 1.35-1.87; OR 1.13, 95% CI 1.05-1.21; OR 1.20, 95% CI 1.09-1.32). Allergic rhinitis and asthma had a higher OR of presenting in boys aged 5 to 14 years (OR 1.52, 95% CI 1.37-1.68; OR 1.31, 95% CI 1.17-1.48). Upper respiratory tract infection (URTI) and urinary tract infection (UTI) had lower odds of presenting in boys, especially those older than 15 years. The probability of presenting showed different patterns for LRTI, URTI, and atopic conditions. Conclusions Boys younger than 15 years have greater odds of presenting with LRTI and atopic conditions, whereas girls may present more with URTI and UTI. These differences may provide insights into disease mechanisms and for health service planning. PMID:29712621
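The odds ratios with 95% CIs reported above come from standard 2×2-table arithmetic; a hedged sketch with made-up counts (not the RCGP data) using the log-OR normal approximation:

```python
import math

# Invented 2x2 table: presentation with LRTI by gender (not the RCGP counts).
a, b = 300, 9700   # boys: presented, did not present
c, d = 200, 9800   # girls: presented, did not present

or_ = (a * d) / (b * c)                          # odds ratio
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)     # SE of ln(OR)
lo = math.exp(math.log(or_) - 1.96 * se_log_or)  # 95% CI lower bound
hi = math.exp(math.log(or_) + 1.96 * se_log_or)  # 95% CI upper bound
print(f"OR = {or_:.2f}, 95% CI {lo:.2f}-{hi:.2f}")
```

An OR above 1 with a CI excluding 1, as in the paper's youngest age bands, indicates greater odds of presentation in boys.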

  18. Distinguishing between the partial-mapping preparation hypothesis and the failure-to-engage hypothesis of residual switch costs.

    PubMed

    Lindsen, Job P; de Jong, Ritske

    2010-10-01

Lien, Ruthruff, Remington, & Johnston (2005) reported residual switch cost differences between stimulus-response (S-R) pairs and proposed the partial-mapping preparation (PMP) hypothesis, which states that advance preparation will typically be limited to a subset of S-R pairs because of structural capacity limitations, to account for these differences. Alternatively, the failure-to-engage (FTE) hypothesis does not allow for differences in probability of advance preparation between S-R pairs within a set; it accounts for residual switch cost differences by assuming that benefits of advance preparation may differ between S-R pairs. Three experiments were designed to test between these hypotheses. No capacity limitations of the type assumed by the PMP hypothesis were found for many participants in Experiment 1. In Experiments 2 and 3, no evidence was found for the dependency of residual switch cost differences between S-R pairs on response-stimulus interval that is predicted by the PMP hypothesis. Mixture-model analysis of reaction time distributions in Experiment 3 provided strong support for the FTE hypothesis over the PMP hypothesis. Simulation studies with a computational implementation of the FTE hypothesis showed that it is able to account in great detail for the results of the present study. Together, these results provide strong evidence against the PMP hypothesis and support the FTE hypothesis that advance preparation probabilistically fails or succeeds at the level of the task set. (PsycINFO Database Record (c) 2010 APA, all rights reserved).

  19. Quantum simulation of the integer factorization problem: Bell states in a Penning trap

    NASA Astrophysics Data System (ADS)

    Rosales, Jose Luis; Martin, Vicente

    2018-03-01

The arithmetic problem of factoring an integer N can be translated into the physics of a quantum device, a result that supports Pólya's and Hilbert's conjecture to demonstrate Riemann's hypothesis. The energies of this system, being univocally related to the factors of N, are the eigenvalues of a bounded Hamiltonian. Here we solve the quantum conditions and show that the histogram of the discrete energies, provided by the spectrum of the system, should be interpreted in number theory as the relative probability for a prime to be a factor candidate of N. This is equivalent to a quantum sieve that is shown to require only O((ln √N)^3) energy measurements to solve the problem, recovering Shor's complexity result. Hence the outcome can be seen as a probability map that a pair of primes solve the given factorization problem. Furthermore, we show that a possible embodiment of this quantum simulator corresponds to two entangled particles in a Penning trap. The possibility to build the simulator experimentally is studied in detail. The results show that factoring numbers, many orders of magnitude larger than those computed with experimentally available quantum computers, is achievable using typical parameters in Penning traps.

  20. The ranking probability approach and its usage in design and analysis of large-scale studies.

    PubMed

    Kuo, Chia-Ling; Zaykin, Dmitri

    2013-01-01

In experiments with many statistical tests there is need to balance type I and type II error rates while taking multiplicity into account. In the traditional approach, the nominal [Formula: see text]-level such as 0.05 is adjusted by the number of tests, [Formula: see text], i.e., as 0.05/[Formula: see text]. Assuming that some proportion of tests represent "true signals", that is, originate from a scenario where the null hypothesis is false, power depends on the number of true signals and the respective distribution of effect sizes. One way to define power is for it to be the probability of making at least one correct rejection at the assumed [Formula: see text]-level. We advocate an alternative way of establishing how "well-powered" a study is. In our approach, useful for studies with multiple tests, the ranking probability [Formula: see text] is controlled, defined as the probability of making at least [Formula: see text] correct rejections while rejecting hypotheses with [Formula: see text] smallest P-values. The two approaches are statistically related. Probability that the smallest P-value is a true signal (i.e., [Formula: see text]) is equal to the power at the level [Formula: see text], to an excellent approximation. Ranking probabilities are also related to the false discovery rate and to the Bayesian posterior probability of the null hypothesis. We study properties of our approach when the effect size distribution is replaced for convenience by a single "typical" value taken to be the mean of the underlying distribution. We conclude that its performance is often satisfactory under this simplification; however, substantial imprecision is to be expected when [Formula: see text] is very large and [Formula: see text] is small. Precision is largely restored when three values with the respective abundances are used instead of a single typical effect size value.
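The traditional definition of power mentioned above (probability of at least one correct rejection at the Bonferroni-adjusted level) is easy to estimate by simulation; the effect size, signal count, and test count below are invented for illustration, not taken from the paper:

```python
import numpy as np
from scipy.stats import norm

# Monte Carlo sketch (my illustration): power as the probability of at least
# one correct rejection under a Bonferroni-adjusted two-sided z-test.
rng = np.random.default_rng(0)

m = 1000           # total number of tests
m1 = 50            # number of true signals (assumption)
effect = 3.0       # common "typical" effect size on the z-scale (assumption)
alpha = 0.05
alpha_adj = alpha / m               # Bonferroni-adjusted level
z_crit = norm.isf(alpha_adj / 2)    # two-sided critical value

n_sim = 2000
hits = 0
for _ in range(n_sim):
    z = rng.standard_normal(m)
    z[:m1] += effect                # shift the true signals
    if np.any(np.abs(z[:m1]) >= z_crit):
        hits += 1

power_at_least_one = hits / n_sim
print(f"P(at least one correct rejection) ~ {power_at_least_one:.3f}")
```

The ranking-probability quantity advocated in the abstract would instead count, among the k smallest P-values, how many are true signals; the simulation loop generalizes straightforwardly.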

  1. The Estimation of Tree Posterior Probabilities Using Conditional Clade Probability Distributions

    PubMed Central

    Larget, Bret

    2013-01-01

    In this article I introduce the idea of conditional independence of separated subtrees as a principle by which to estimate the posterior probability of trees using conditional clade probability distributions rather than simple sample relative frequencies. I describe an algorithm for these calculations and software which implements these ideas. I show that these alternative calculations are very similar to simple sample relative frequencies for high probability trees but are substantially more accurate for relatively low probability trees. The method allows the posterior probability of unsampled trees to be calculated when these trees contain only clades that are in other sampled trees. Furthermore, the method can be used to estimate the total probability of the set of sampled trees which provides a measure of the thoroughness of a posterior sample. [Bayesian phylogenetics; conditional clade distributions; improved accuracy; posterior probabilities of trees.] PMID:23479066
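A toy sketch of the conditional-clade idea (my illustration, not the article's software): a rooted binary tree is scored as the product, over its internal nodes, of the sample frequency with which the parent clade splits into its two child clades, conditional on the parent clade appearing at all.

```python
from collections import Counter

def leaves(t):
    """Set of taxon labels below a node (trees are nested 2-tuples of strings)."""
    return frozenset([t]) if isinstance(t, str) else leaves(t[0]) | leaves(t[1])

def splits(t):
    """(parent clade, unordered pair of child clades) for each internal node."""
    if isinstance(t, str):
        return []
    left, right = leaves(t[0]), leaves(t[1])
    return ([(left | right, frozenset([left, right]))]
            + splits(t[0]) + splits(t[1]))

def ccd_probability(tree, sample):
    """Estimate P(tree) as a product of conditional clade frequencies."""
    split_counts = Counter(pair for t in sample for pair in splits(t))
    parent_counts = Counter(parent for t in sample for parent, _ in splits(t))
    prob = 1.0
    for parent, children in splits(tree):
        prob *= split_counts[(parent, children)] / parent_counts[parent]
    return prob

# A tiny posterior "sample" of 10 rooted trees on four taxa.
sample = [((("A", "B"), "C"), "D")] * 8 + [((("A", "C"), "B"), "D")] * 2
p = ccd_probability(((("A", "B"), "C"), "D"), sample)
print(p)
```

In this tiny example the estimate coincides with the simple relative frequency 8/10; the two estimates diverge, as the abstract notes, for low-probability or unsampled trees whose clades recombine splits seen in different sampled trees.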

  2. An Alternative Version of Conditional Probabilities and Bayes' Rule: An Application of Probability Logic

    ERIC Educational Resources Information Center

    Satake, Eiki; Amato, Philip P.

    2008-01-01

    This paper presents an alternative version of formulas of conditional probabilities and Bayes' rule that demonstrate how the truth table of elementary mathematical logic applies to the derivations of the conditional probabilities of various complex, compound statements. This new approach is used to calculate the prior and posterior probabilities…
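Bayes' rule itself, applied to the medical-screening setting mentioned alongside these records, is easy to sketch numerically (my example with invented test characteristics, not the paper's truth-table derivation):

```python
# Bayes' rule for a screening test (illustrative numbers):
# P(disease | positive) = P(positive | disease) P(disease) / P(positive).
prevalence = 0.01      # P(disease), an assumption
sensitivity = 0.95     # P(positive | disease), an assumption
specificity = 0.90     # P(negative | no disease), an assumption

p_pos = sensitivity * prevalence + (1 - specificity) * (1 - prevalence)
p_disease_given_pos = sensitivity * prevalence / p_pos
print(round(p_disease_given_pos, 3))
```

Even with a fairly accurate test, the low prior drags the posterior below 10%, the classic base-rate result that makes conditional probability counterintuitive for students.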

  3. Testing neuropsychological hypotheses for cognitive deficits in psychopathic criminals: a study of global-local processing.

    PubMed

    Kosson, David S; Miller, Sarah K; Byrnes, Katherine A; Leveroni, Catherine L

    2007-03-01

    Competing hypotheses about neuropsychological mechanisms underlying psychopathy are seldom examined in the same study. We tested the left hemisphere activation hypothesis and the response modulation hypothesis of psychopathy in 172 inmates completing a global-local processing task under local bias, global bias, and neutral conditions. Consistent with the left hemisphere activation hypothesis, planned comparisons showed that psychopathic inmates classified local targets more slowly than nonpsychopathic inmates in a local bias condition and exhibited a trend toward similar deficits for global targets in this condition. However, contrary to the response modulation hypothesis, psychopaths were no slower to respond to local targets in a global bias condition. Because psychopathic inmates were not generally slower to respond to local targets, results are also not consistent with a general left hemisphere dysfunction account. Correlational analyses also indicated deficits specific to conditions presenting most targets at the local level initially. Implications for neuropsychological conceptualizations of psychopathy are considered.

  4. Evidence for an All-Or-None Perceptual Response: Single-Trial Analyses of Magnetoencephalography Signals Indicate an Abrupt Transition Between Visual Perception and Its Absence

    PubMed Central

    Sekar, Krithiga; Findley, William M.; Llinás, Rodolfo R.

    2014-01-01

Whether consciousness is an all-or-none or graded phenomenon is an area of inquiry that has received considerable interest in neuroscience and is, as yet, still debated. In this magnetoencephalography (MEG) study we used a single stimulus paradigm with sub-threshold, threshold and supra-threshold duration inputs to assess whether stimulus perception is continuous with or abruptly differentiated from unconscious stimulus processing in the brain. By grouping epochs according to stimulus identification accuracy and exposure duration, we were able to investigate whether a high-amplitude perception-related cortical event was (1) only evoked for conditions where perception was most probable (2) had invariant amplitude once evoked and (3) was largely absent for conditions where perception was least probable (criteria satisfying an all-or-none hypothesis). We found that averaged evoked responses showed a gradual increase in amplitude with increasing perceptual strength. However, single trial analyses demonstrated that stimulus perception was correlated with an all-or-none response, the temporal precision of which increased systematically as perception transitioned from ambiguous to robust states. Due to poor signal-to-noise resolution of single trial data, whether perception-related responses, whenever present, were invariant in amplitude could not be unambiguously demonstrated. However, our findings strongly suggest that visual perception of simple stimuli is associated with an all-or-none cortical evoked response the temporal precision of which varies as a function of perceptual strength. PMID:22020091

  5. Emergency medical services and congestion : urban sprawl and pre-hospital emergency care time.

    DOT National Transportation Integrated Search

    2009-01-01

    This research measured the association between urban sprawl and emergency medical service (EMS) response time. The purpose was to test the hypothesis that features of the built environment increase the probability of delayed ambulance arrival. Using ...

  6. New methods of testing nonlinear hypothesis using iterative NLLS estimator

    NASA Astrophysics Data System (ADS)

    Mahaboob, B.; Venkateswarlu, B.; Mokeshrayalu, G.; Balasiddamuni, P.

    2017-11-01

This research paper discusses the method of testing nonlinear hypotheses using an iterative Nonlinear Least Squares (NLLS) estimator. Takeshi Amemiya [1] explained this method. However, in the present research paper, a modified Wald test statistic due to Engle, Robert [6] is proposed to test nonlinear hypotheses using the iterative NLLS estimator. An alternative method for testing nonlinear hypotheses using an iterative NLLS estimator based on nonlinear studentized residuals has been proposed. In this research article an innovative method of testing nonlinear hypotheses using an iterative restricted NLLS estimator is derived. Pesaran and Deaton [10] explained the methods of testing nonlinear hypotheses. This paper uses the asymptotic properties of the nonlinear least squares estimator proposed by Jenrich [8]. The main purpose of this paper is to provide innovative methods of testing nonlinear hypotheses using the iterative NLLS estimator, the iterative NLLS estimator based on nonlinear studentized residuals, and the iterative restricted NLLS estimator. Eakambaram et al. [12] discussed least absolute deviation estimation versus nonlinear regression models with heteroscedastic errors, and also studied the problem of heteroscedasticity with reference to nonlinear regression models with suitable illustration. William Greene [13] examined the interaction effect in nonlinear models discussed by Ai and Norton [14] and suggested ways to examine the effects that do not involve statistical testing. Peter [15] provided guidelines for identifying composite hypotheses and addressing the probability of false rejection for multiple hypotheses.

  7. Distribution pattern and number of ticks on lizards.

    PubMed

    Dudek, Krzysztof; Skórka, Piotr; Sajkowska, Zofia Anna; Ekner-Grzyb, Anna; Dudek, Monika; Tryjanowski, Piotr

    2016-02-01

The success of ectoparasites depends primarily on the site of attachment and body condition of their hosts. Ticks usually tend to aggregate on vertebrate hosts in specific areas, but the distribution pattern may depend on host body size and condition, sex, life stage or skin morphology. Here, we studied the distribution of ticks on lizards and tested the following hypothesis: occurrence or high abundance of ticks is confined to body parts with smaller scales and larger interscalar length because such sites should provide ticks with superior attachment conditions. This study was performed in field conditions in central Poland in 2008-2011. In total, 500 lizards (Lacerta agilis) were caught and 839 ticks (Ixodes ricinus, larvae and nymphs) were collected from them. Using generalised linear mixed models, we found that the ticks were most abundant on forelimbs and their axillae, with 90% of ticks attached there. This part of the lizard body and the region behind the hindlimb were covered by the smallest scales with relatively wide gaps between them. This does not fully support our hypothesis that ticks prefer locations with easy access to skin between scales, because it does not explain why so few ticks were in the hindlimb area. We found that the abundance of ticks was positively correlated with lizard body size index (snout-vent length). Tick abundance was also higher in male and mature lizards than in female and young individuals. Autotomy had no effect on tick abundance. We found no correlation between tick size and lizard morphology, sex, autotomy and body size index. The probability of occurrence of dead ticks was positively linked with the total number of ticks on the lizard, but there was no relationship between dead tick presence and lizard size, sex or age. Thus lizard body size and sex are the major factors affecting the abundance of ticks, and these parasites are distributed nearly exclusively on the host's forelimbs and their axillae.
Copyright © 2015 Elsevier GmbH. All rights reserved.

  8. Friedrich Nietzsche: the wandering and learned neuropath under Dionisius.

    PubMed

    Gomes, Marleide da Mota

    2015-11-01

    Friedrich Nietzsche (1844-1900) was a remarkable philologist-philosopher despite a lifelong condition of ill health. Issues are presented concerning his wandering/disruptive behavior, which may have been a consequence of, and/or a protection against, his cognitive decline and multifaceted disease. The complex that invites speculation about etiology comprises insight, creativity and wandering behavior, besides several symptoms and signs of disease(s), mainly neurological. The most important issue to consider is not the diagnosis itself (e.g., Lissauer's general paresis or CADASIL), but Nietzsche's probable great cognitive reserve, linked to a multifactorial etiology (genetic and environmental) and to characteristics shared by creativity and psychopathology. This makes any disease seem exceptional in Nietzsche's case, and whatever the diagnostic hypothesis, it must take his unique background into account.

  9. Mineralogical Plasticity Acts as a Compensatory Mechanism to the Impacts of Ocean Acidification.

    PubMed

    Leung, Jonathan Y S; Russell, Bayden D; Connell, Sean D

    2017-03-07

    Calcifying organisms are considered particularly susceptible to the future impacts of ocean acidification (OA), but recent evidence suggests that they may be able to maintain calcification and overall fitness. The underlying mechanism remains unclear but may be attributed to mineralogical plasticity, which modifies the energetic cost of calcification. To test the hypothesis that mineralogical plasticity enables the maintenance of shell growth and functionality under OA conditions, we assessed the biological performance of a gastropod (respiration rate, feeding rate, somatic growth, and shell growth of Austrocochlea constricta) and analyzed its shell mechanical and geochemical properties (shell hardness, elastic modulus, amorphous calcium carbonate, calcite to aragonite ratio, and magnesium to calcium ratio). Despite minor metabolic depression and no increase in feeding rate, shell growth was faster under OA conditions, probably due to increased precipitation of calcite and trade-offs against inner shell density. In addition, the resulting shell was functionally suitable for increasingly "corrosive" oceans, i.e., harder and less soluble. We conclude that mineralogical plasticity may act as a compensatory mechanism to maintain the overall performance of calcifying organisms under OA conditions, and could be a cornerstone of the ability of calcifying organisms to acclimate to acidifying oceans and maintain their ecological functions there.

  10. An Alternative Teaching Method of Conditional Probabilities and Bayes' Rule: An Application of the Truth Table

    ERIC Educational Resources Information Center

    Satake, Eiki; Vashlishan Murray, Amy

    2015-01-01

    This paper presents a comparison of three approaches to the teaching of probability to demonstrate how the truth table of elementary mathematical logic can be used to teach the calculations of conditional probabilities. Students are typically introduced to the topic of conditional probabilities--especially the ones that involve Bayes' rule--with…
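    The calculation at issue in such lessons is just Bayes' rule, P(H|E) = P(E|H)P(H) / P(E). A minimal sketch in Python, using hypothetical screening-test numbers (illustrative, not taken from the paper):

```python
def bayes(p_h, p_e_given_h, p_e_given_not_h):
    """P(H|E) via Bayes' rule, expanding P(E) with the law of total probability."""
    p_e = p_e_given_h * p_h + p_e_given_not_h * (1 - p_h)
    return p_e_given_h * p_h / p_e

# Hypothetical medical screening test: 1% prevalence,
# 95% sensitivity, 5% false-positive rate.
posterior = bayes(0.01, 0.95, 0.05)
print(round(posterior, 3))  # 0.161
```

Even with a highly sensitive test, the low prior keeps the posterior small, which is exactly the kind of result students find counterintuitive about conditional probabilities.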

  11. Egg production of turbot, Scophthalmus maximus, in the Baltic Sea

    NASA Astrophysics Data System (ADS)

    Nissling, Anders; Florin, Ann-Britt; Thorsen, Anders; Bergström, Ulf

    2013-11-01

    In the brackish-water Baltic Sea, turbot spawn at ~6-9 psu along the coast and on offshore banks in ICES SD 24-29, with salinity influencing the reproductive success. The potential fecundity (the stock of vitellogenic oocytes in the pre-spawning ovary), egg size (diameter and dry weight of artificially fertilized 1-day-old eggs) and gonad dry weight were assessed for fish sampled in SD 25 and SD 28. Multiple regression analysis identified somatic weight, or total length in combination with Fulton's condition factor, as main predictors of fecundity and gonad dry weight, with stage of maturity (oocyte packing density or leading cohort) as an additional predictor. For egg size, somatic weight was identified as the main predictor, while otolith weight (a proxy for age) was an additional predictor. Univariate analysis using GLM revealed significantly higher fecundity and gonad dry weight for turbot from SD 28 (3378-3474 oocytes/g somatic weight) compared to those from SD 25 (2343 oocytes/g somatic weight), with no difference in egg size (1.05 ± 0.03 mm diameter and 46.8 ± 6.5 μg dry weight; mean ± sd). The difference in egg production matched egg survival probabilities in relation to salinity conditions, suggesting selection for higher fecundity as a consequence of poorer reproductive success at lower salinities. This supports the hypothesis of higher size-specific fecundity towards the limit of a species' distribution as an adaptation to harsher environmental conditions and lower offspring survival probabilities. Within SD 28, comparisons were made between two major fishing areas targeting spawning aggregations and a marine protected area without fishing. The outcome was inconclusive and is discussed with respect to potential fishery-induced effects, effects of the salinity gradient, effects of specific year-classes, and effects of the maturation status of sampled fish.

  12. An experimental test of whether habitat corridors affect pollen transfer.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Townsend, Patricia A.; Levey, Douglas J.

    Abstract. Negative effects of habitat fragmentation are thought to be diminished when habitat patches are joined by a corridor. A key assumption is that corridors facilitate exchange rates of organisms between otherwise isolated patches. If the organisms are pollinators, corridors may be important for maintaining genetically viable populations of the plants that they pollinate. We tested the hypothesis that corridors increase the movement of insect pollinators into patches of habitat and thereby increase pollen transfer for two species of plants, one pollinated by butterflies (Lantana camara) and the other by bees and wasps (Rudbeckia hirta). We worked in an experimental landscape consisting of 40 ≥1-ha patches of early-successional habitat in a matrix of forest. Within each of eight experimental units, two patches were connected by a corridor (150 × 25 m), and three were not. Patch shape varied to control for the area added by the presence of a corridor. Differences in patch shape also allowed us to test alternative hypotheses of how corridors might function. The Traditional Corridor Hypothesis posits that corridors increase immigration and emigration by functioning as movement conduits between patches. The Drift Fence Hypothesis posits that corridors function by "capturing" organisms dispersing through the matrix, redirecting them into associated habitat patches. Using fluorescent powder to track pollen, we found that pollen transfer by butterflies between patches connected by a corridor was significantly higher than between unconnected patches (all values mean ± 1 SE: 59% ± 9.2% vs. 25% ± 5.2% of flowers receiving pollen). Likewise, pollen transfer by bees and wasps was significantly higher between connected patches than between unconnected patches (30% ± 4.2% vs. 14.5% ± 2.2%). These results support the Traditional Corridor Hypothesis.
There was little support, however, for the Drift Fence Hypothesis. To generalize our results to a larger scale, we measured the probability of pollen transfer by butterflies as a function of distance along a 2000 × 75 m corridor. Pollen transfer probability declined exponentially with distance and successfully predicted pollen transfer probability on the scale of our previous experiment. These results suggest that corridors facilitate pollen transfer in fragmented landscapes.

  13. Probability density function of non-reactive solute concentration in heterogeneous porous formations.

    PubMed

    Bellin, Alberto; Tonina, Daniele

    2007-10-30

    Available models of solute transport in heterogeneous formations fail to provide a complete characterization of the predicted concentration. This is a serious drawback, especially in risk analysis, where confidence intervals and probabilities of exceeding threshold values are required. To fill this gap, we propose a probability distribution model for the local concentration of conservative tracers migrating in heterogeneous aquifers. Our model accounts for dilution, mechanical mixing within the sampling volume and spreading due to formation heterogeneity. It is developed by modeling local concentration dynamics with an Itô Stochastic Differential Equation (SDE) that, under the hypothesis of statistical stationarity, leads to the Beta probability distribution function (pdf) for the solute concentration. This model shows large flexibility in capturing the smoothing effect of the sampling volume and the associated reduction of the probability of exceeding large concentrations. Furthermore, it is fully characterized by the first two moments of the solute concentration, the same pieces of information required for standard geostatistical techniques employing Normal or Log-Normal distributions. Additionally, we show that in the absence of pore-scale dispersion and for point concentrations the pdf model converges to the binary distribution of [Dagan, G., 1982. Stochastic modeling of groundwater flow by unconditional and conditional probabilities, 2, The solute transport. Water Resour. Res. 18 (4), 835-848.], while it approaches the Normal distribution for sampling volumes much larger than the characteristic scale of the aquifer heterogeneity. Furthermore, we demonstrate that the same model, with the spatial moments replacing the statistical moments, can be applied to estimate the proportion of the plume volume where solute concentrations are above or below critical thresholds.
Application of this model to point and vertically averaged bromide concentrations from the first Cape Cod tracer test and to a set of numerical simulations confirms the above findings and shows, for the first time, the superiority of the Beta model over both the Normal and Log-Normal models in interpreting field data. Furthermore, we show that assuming a priori that local concentrations are normally or log-normally distributed may result in a severe underestimate of the probability of exceeding large concentrations.
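    The moment-matching step the abstract relies on, fitting a Beta pdf from the concentration mean and variance, is straightforward. A sketch in Python with hypothetical normalized concentration statistics (the numbers are illustrative, not from the paper):

```python
import random

def beta_params(mean, var):
    """Method-of-moments Beta(a, b) fit for a concentration normalized to [0, 1].
    Requires var < mean * (1 - mean)."""
    common = mean * (1 - mean) / var - 1.0
    return mean * common, (1.0 - mean) * common

a, b = beta_params(0.2, 0.01)
print(round(a, 6), round(b, 6))  # 3.0 12.0

# Monte Carlo estimate of the exceedance probability P(C > 0.4)
# under the fitted Beta pdf (stdlib betavariate sampler).
random.seed(0)
n = 100_000
exceed = sum(random.betavariate(a, b) > 0.4 for _ in range(n)) / n
print(exceed)  # about 0.04 for these moments
```

Both Beta parameters come from the first two moments alone, which is the property the abstract highlights as making the model compatible with standard geostatistical inputs.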

  14. Quantitative tests of a reconstitution model for RNA folding thermodynamics and kinetics.

    PubMed

    Bisaria, Namita; Greenfeld, Max; Limouse, Charles; Mabuchi, Hideo; Herschlag, Daniel

    2017-09-12

    Decades of study of the architecture and function of structured RNAs have led to the perspective that RNA tertiary structure is modular, made of locally stable domains that retain their structure across RNAs. We formalize a hypothesis inspired by this modularity: that RNA folding thermodynamics and kinetics can be quantitatively predicted from separable energetic contributions of the individual components of a complex RNA. This reconstitution hypothesis considers RNA tertiary folding in terms of ΔG_align, the probability of aligning tertiary contact partners, and ΔG_tert, the favorable energetic contribution from the formation of tertiary contacts in an aligned state. This hypothesis predicts that changes in the alignment of tertiary contacts from different connecting helices and junctions (ΔG_HJH) or from changes in the electrostatic environment (ΔG_+/-) will not affect the energetic perturbation from a mutation in a tertiary contact (ΔΔG_tert). Consistent with these predictions, single-molecule FRET measurements of folding of model RNAs revealed constant ΔΔG_tert values for mutations in a tertiary contact embedded in different structural contexts and under different electrostatic conditions. The kinetic effects of these mutations provide further support for modular behavior of RNA elements and suggest that tertiary mutations may be used to identify rate-limiting steps and dissect folding and assembly pathways for complex RNAs. Overall, our model and results are foundational for a predictive understanding of RNA folding that will allow manipulation of RNA folding thermodynamics and kinetics. Conversely, the approaches herein can identify cases where an independent, additive model cannot be applied and so require additional investigation.

  15. Neotropical forest expansion during the last glacial period challenges refuge hypothesis

    PubMed Central

    Costa, Leonora P.; Loss, Ana Carolina; Rocha, Rita G.; Batalha-Filho, Henrique; Bastos, Alex C.; Quaresma, Valéria S.; Fagundes, Valéria; Paresque, Roberta; Passamani, Marcelo; Pardini, Renata

    2016-01-01

    The forest refuge hypothesis (FRH) has long been a paradigm for explaining the extreme biological diversity of tropical forests. According to this hypothesis, forest retraction and fragmentation during glacial periods would have promoted reproductive isolation and consequently speciation in forest patches (ecological refuges) surrounded by open habitats. The recent use of paleoclimatic models of species and habitat distributions revitalized the FRH, not by considering refuges as the main drivers of allopatric speciation, but instead by suggesting that high contemporary diversity is associated with historically stable forest areas. However, the role of the emerged continental shelf on the Atlantic Forest biodiversity hotspot of eastern South America during glacial periods has been ignored in the literature. Here, we combined results of species distribution models with coalescent simulations based on DNA sequences to explore the congruence between scenarios of forest dynamics through time and the genetic structure of mammal species cooccurring in the central region of the Atlantic Forest. Contrary to the FRH predictions, we found more fragmentation of suitable habitats during the last interglacial (LIG) and the present than in the last glacial maximum (LGM), probably due to topography. We also detected expansion of suitable climatic conditions onto the emerged continental shelf during the LGM, which would have allowed forests and forest-adapted species to expand. The interplay of sea level and land distribution must have been crucial in the biogeographic history of the Atlantic Forest, and forest refuges played only a minor role, if any, in this biodiversity hotspot during glacial periods. PMID:26755597

  16. The Probability Approach to English If-Conditional Sentences

    ERIC Educational Resources Information Center

    Wu, Mei

    2012-01-01

    Users of the Probability Approach choose the right one from four basic types of conditional sentences--factual, predictive, hypothetical and counterfactual conditionals, by judging how likely (i.e. the probability) the event in the result-clause will take place when the condition in the if-clause is met. Thirty-three students from the experimental…

  17. PROBABILITY SURVEYS, CONDITIONAL PROBABILITIES AND ECOLOGICAL RISK ASSESSMENT

    EPA Science Inventory

    We show that probability-based environmental resource monitoring programs, such as the U.S. Environmental Protection Agency's (U.S. EPA) Environmental Monitoring and Assessment Program, and conditional probability analysis can serve as a basis for estimating ecological risk over ...

  18. PROBABILITY SURVEYS, CONDITIONAL PROBABILITIES, AND ECOLOGICAL RISK ASSESSMENT

    EPA Science Inventory

    We show that probability-based environmental resource monitoring programs, such as the U.S. Environmental Protection Agency's (U.S. EPA) Environmental Monitoring and Assessment Program (EMAP), can be analyzed with a conditional probability analysis (CPA) to conduct quantitative probabi...

  19. [Tonsillopharyngitis outbreak caused by foodborne group A beta-hemolytic Streptococcus].

    PubMed

    Nieto Vera, Juan; Figueroa Murillo, Estrella; Cruz Calderón, María Victoria; Pérez Alonso, Aránzazu

    2011-08-01

    Although infrequent, outbreaks of foodborne tonsillopharyngitis have been reported. On May 11, 2010, a series of cases of tonsillopharyngitis among those attending a fellowship meeting on 8 March was notified to the Epidemiological Surveillance Network in Andalusia (SVEA). The aim of this study is to characterise the outbreak epidemiologically. We performed a descriptive analysis of reported cases and a case-control analysis of exposure to the implicated foods. The variables taken into account were age, sex, symptoms and date of onset. The sources of information were the records of the SVEA and the individual digital health record (DIRAYA). Frequencies and attack rates were calculated, and a Bayesian analysis comparing differences in proportions of disease was carried out for a 95% probability or credibility interval (IP). Among the 130 attendees at a communion, 41 cases of tonsillopharyngitis were detected (attack rate 31.5%), and Group A beta-hemolytic Streptococcus was isolated from throat smears. The most affected age group was the 25-44 year-olds, with 16 cases (39.0%); 68.6% (24) were female. The egg salad showed a probability greater than 80%, P(Δ>0.10 and Δ>0.15), for a 95% IP of risk of disease after intake, and a probability of a lower risk of no disease. This was a Group A beta-hemolytic streptococcal outbreak; the epidemiological evidence indicates exposure to a common single source, supporting the hypothesis of a dietary origin, with egg salad as the implicated food. Contributing factors could have been cross-contamination after preparation, favoured by poor handling practices and the conditions of the premises.

  20. Variations in the survival probabilities of the PVC-protected red mangrove propagules: testing the encased replanting technique.

    PubMed

    López-Ortiz, M I; Pérez, C M; Suárez, E; Ríos-Dávila, R

    1999-12-01

    The EcoEléctrica Mangrove Planting Project, a five-year voluntary effort, has the purpose of testing a recently developed mangrove planting technique at the EcoEléctrica site in Peñuelas, Puerto Rico. The goal of the project is to provide empirical validation to promote or improve the technique to be used in recovering mangrove ecosystems in Puerto Rico and the United States. The research presented here analyzed the information collected during the first two years of the project. The proportions of remaining casings and seeds per study zone were compared using the chi-square distribution. Zone 1 had the fewest pipes lost while Zone 4 had the most (p < 0.05). Forty-three percent of the seeds in Zone 1 remained in the casing, while 26% remained in Zone 2 (p = 0.03). Median growth rates of seeds per study zone showed that Zone 1 had the highest median growth rates. Survival analysis described the survival experience of the seeds, and differences in survival probabilities were compared with the log-rank test. Zone 1 seeds had a better survival experience compared to Zones 2, 3 and 4 (p < 0.0001). Survival probabilities for being free of spots were over 60% during the whole study period. No significant differences were observed in the survival experience with the use or non-use of casing extensions (p = 0.40), or the use or non-use of nursed seeds (p = 0.26). Differences in survival probabilities might be attributed to variations in wave energy, depth or substrate conditions. This hypothesis will be evaluated in the second phase of the study.
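    The survival-analysis machinery used here (survival curves compared with a log-rank test) is standard. As a sketch, a minimal Kaplan-Meier estimator in Python on toy data (not the project's data):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival curve from (time, event) pairs, where
    event=1 marks a death/failure and event=0 a censored observation."""
    pairs = sorted(zip(times, events))
    at_risk = len(pairs)
    surv = 1.0
    curve = []
    i = 0
    while i < len(pairs):
        t = pairs[i][0]
        deaths = censored = 0
        # Group all observations sharing the same event time.
        while i < len(pairs) and pairs[i][0] == t:
            deaths += pairs[i][1]
            censored += 1 - pairs[i][1]
            i += 1
        if deaths:
            surv *= 1.0 - deaths / at_risk
            curve.append((t, surv))
        at_risk -= deaths + censored
    return curve

# Toy data: failures at t = 1, 2, 2, 5 and one censored observation at t = 4.
print(kaplan_meier([1, 2, 2, 4, 5], [1, 1, 1, 0, 1]))  # [(1, 0.8), (2, 0.4), (5, 0.0)]
```

Censored observations (here the seed still alive when observation stopped) leave the curve unchanged but shrink the at-risk count, which is what distinguishes this estimator from a naive survival fraction.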

  1. Behavioral and physiological responses to male handicap in chick-rearing black-legged kittiwakes

    USGS Publications Warehouse

    Leclaire, S.; Bourret, V.; Wagner, R.H.; Hatch, Shyla A.; Helfenstein, F.; Chastel, O.; Danchin, E.

    2011-01-01

    Parental investment entails a trade-off between the benefits of effort in current offspring and the costs to future reproduction. Long-lived species are predicted to be reluctant to increase parental effort to avoid affecting their survival. We tested this hypothesis in black-legged kittiwakes Rissa tridactyla by clipping flight feathers of experimental males at the beginning of the chick-rearing period. We analyzed the consequences of this handicap on feeding and attendance behavior, body condition, integument coloration, and circulating levels of corticosterone and prolactin in handicapped males and their mates in comparison to unmanipulated controls. Chicks in both groups were compared in terms of aggressive behavior, growth, and mortality. Handicapped males lost more mass, had less bright integuments, and attended the nest less often than controls. Nevertheless, they fed their chicks at the same rate and had similar corticosterone and prolactin levels. Compared with control females, females mated with handicapped males showed a lower provisioning rate and higher nest attendance in the first days after manipulation. Their lower feeding rate probably triggered the increased sibling aggression and mortality observed in experimental broods. Our findings suggest that experimental females adaptively adjusted their effort to their mate's perceived quality or that their provisioning was constrained by their higher nest attendance. Overall, our results suggest that kittiwake males can decrease their condition for the sake of their chicks, which seems to contradict the hypothesis that kittiwakes should be reluctant to increase parental effort to avoid affecting their survival. © 2011 The Author. Published by Oxford University Press on behalf of the International Society for Behavioral Ecology. All rights reserved.

  2. Nest-site selection in the acorn woodpecker

    USGS Publications Warehouse

    Hooge, P.N.; Stanback, M.T.; Koenig, Walter D.

    1999-01-01

    Acorn Woodpeckers (Melanerpes formicivorus) at Hastings Reservation in central California prefer to nest in dead limbs in large, dead valley oaks (Quercus lobata) and California sycamores (Platanus racemosa) that are also frequently used as acorn storage trees. Based on 232 nest cavities used over an 18-year period, we tested whether preferred or modal nest-site characters were associated with increased reproductive success (the "nest-site quality" hypothesis). We also examined whether more successful nests were likely to experience more favorable microclimatic conditions or to be less accessible to terrestrial predators. We found only equivocal support for the nest-site quality hypothesis: only 1 of 5 preferred characters and 2 of 10 characters exhibiting a clear modality were correlated with higher reproductive success. All three characteristics of nests known or likely to be associated with a more favorable microclimate, and two of five characteristics likely to render nests less accessible to predators, were correlated with higher reproductive success. These results suggest that nest cavities in this population are built in part to take advantage of favorable microclimatic conditions and, to a lesser extent, to reduce access to predators. However, despite benefits of particular nest characteristics, birds frequently nested in apparently suboptimal cavities. We also found a significant relationship between mean group size and the history of occupancy of particular territories and the probability of nest cavities being built in microclimatically favorable live limbs, suggesting that larger groups residing on more stable territories were better able to construct nests with optimal characteristics. This indicates that there may be demographic, as well as ecological, constraints on nest-site selection in this primary cavity nester.

  3. Probability of stress-corrosion fracture under random loading.

    NASA Technical Reports Server (NTRS)

    Yang, J.-N.

    1972-01-01

    A method is developed for predicting the probability of stress-corrosion fracture of structures under random loadings. The formulation is based on the cumulative damage hypothesis and the experimentally determined stress-corrosion characteristics. Under both stationary and nonstationary random loadings, the mean value and the variance of the cumulative damage are obtained. The probability of stress-corrosion fracture is then evaluated using the principle of maximum entropy. It is shown that, under stationary random loadings, the standard deviation of the cumulative damage increases in proportion to the square root of time, while the coefficient of variation (dispersion) decreases in inverse proportion to the square root of time. Numerical examples are worked out to illustrate the general results.
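    The √t scaling of the damage dispersion is easy to illustrate numerically. A toy Monte Carlo sketch in Python (exponentially distributed i.i.d. damage increments are an assumption made for illustration, not the paper's load model):

```python
import random
import statistics

random.seed(1)

def damage_stats(steps, trials=5_000):
    """Mean, standard deviation and coefficient of variation of damage
    accumulated as a sum of i.i.d. random increments (stationary loading)."""
    totals = [sum(random.expovariate(1.0) for _ in range(steps))
              for _ in range(trials)]
    mu = statistics.fmean(totals)
    sd = statistics.stdev(totals)
    return mu, sd, sd / mu

for t in (25, 100, 400):
    mu, sd, cv = damage_stats(t)
    print(t, round(mu, 1), round(sd, 1), round(cv, 3))
# Quadrupling t roughly doubles the standard deviation and halves the
# coefficient of variation, matching the sqrt(t) and 1/sqrt(t) results.
```

For independent increments the variance of the sum grows linearly in time, so the standard deviation grows as √t while the mean grows as t, giving the 1/√t decay of the coefficient of variation stated in the abstract.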

  4. The role of probability arguments in the history of science.

    PubMed

    Weinert, Friedel

    2010-03-01

    The paper examines Wesley Salmon's claim that the primary role of plausibility arguments in the history of science is to impose constraints on the prior probability of hypotheses (in the language of Bayesian confirmation theory). A detailed look at Copernicanism and Darwinism and, more briefly, Rutherford's discovery of the atomic nucleus reveals a further and arguably more important role of plausibility arguments. It resides in the consideration of likelihoods, which state how likely a given hypothesis makes a given piece of evidence. In each case the likelihoods raise the probability of one of the competing hypotheses and diminish the credibility of its rival, and this may happen either on the basis of 'old' or 'new' evidence.

  5. Landscape characteristics influence pond occupancy by frogs after accounting for detectability

    USGS Publications Warehouse

    Mazerolle, M.J.; Desrochers, A.; Rochefort, L.

    2005-01-01

    Many investigators have hypothesized that landscape attributes such as the amount and proximity of habitat are important for amphibian spatial patterns. This has produced a number of studies focusing on the effects of landscape characteristics on amphibian patterns of occurrence in patches or ponds, most of which conclude that the landscape is important. We identified two concerns associated with these studies: one deals with their applicability to other landscape types, as most have been conducted in agricultural landscapes; the other highlights the need to account for the probability of detection. We tested the hypothesis that landscape characteristics influence spatial patterns of amphibian occurrence at ponds after accounting for the probability of detection in little-studied peatland landscapes undergoing peat mining. We also illustrated the costs of not accounting for the probability of detection by comparing our results to conventional logistic regression analyses. Results indicate that frog occurrence increased with the percent cover of ponds within 100, 250, and 1000 m, as well as the amount of forest cover within 1000 m. However, forest cover at 250 m had a negative influence on frog presence at ponds. Not accounting for the probability of detection resulted in underestimating the influence of most variables on frog occurrence, whereas a few were overestimated. Regardless, we show that conventional logistic regression can lead to different conclusions than analyses accounting for detectability. Our study is consistent with the hypothesis that landscape characteristics are important in determining the spatial patterns of frog occurrence at ponds. We strongly recommend estimating the probability of detection in field surveys, as this will increase the quality and conservation potential of models derived from such data. © 2005 by the Ecological Society of America.

  6. Probability Surveys, Conditional Probability, and Ecological Risk Assessment

    EPA Science Inventory

    We show that probability-based environmental resource monitoring programs, such as the U.S. Environmental Protection Agency’s (U.S. EPA) Environmental Monitoring and Assessment Program, and conditional probability analysis can serve as a basis for estimating ecological risk over ...

  7. Affective and cognitive factors influencing sensitivity to probabilistic information.

    PubMed

    Tyszka, Tadeusz; Sawicki, Przemyslaw

    2011-11-01

    In study 1 different groups of female students were randomly assigned to one of four probabilistic information formats. Five different levels of probability of a genetic disease in an unborn child were presented to participants (within-subject factor). After the presentation of the probability level, participants were requested to indicate the acceptable level of pain they would tolerate to avoid the disease (in their unborn child), their subjective evaluation of the disease risk, and their subjective evaluation of being worried by this risk. The results of study 1 confirmed the hypothesis that an experience-based probability format decreases the subjective sense of worry about the disease, thus, presumably, weakening the tendency to overrate the probability of rare events. Study 2 showed that for the emotionally laden stimuli, the experience-based probability format resulted in higher sensitivity to probability variations than other formats of probabilistic information. These advantages of the experience-based probability format are interpreted in terms of two systems of information processing: the rational deliberative versus the affective experiential and the principle of stimulus-response compatibility. © 2011 Society for Risk Analysis.

  8. The Hygiene Hypothesis in the Age of the Microbiome.

    PubMed

    Ege, Markus J

    2017-11-01

    The original version of the hygiene hypothesis suggested that infections transmitted early in life by "unhygienic contact" prevented allergies. Examples were endemic fecal-oral infections by viral, bacterial, or protozoic pathogens, such as hepatitis A virus, Helicobacter pylori, or Toxoplasma gondii. Later, this concept also included microorganisms beyond pathogens, such as commensals and symbionts, and the hygiene hypothesis was extended to inflammatory diseases in general. An impressive illustration of the hygiene hypothesis was found in the consistent farm effect on asthma and allergies, which has partly been attributed to immunomodulatory properties of endotoxin as emitted by livestock. Assessment of environmental microorganisms by molecular techniques suggested an additional protective effect of microbial diversity on asthma beyond atopy. Whether microbial diversity stands for a higher probability to encounter protective clusters of microorganisms or whether it is a proxy of a balanced environmental exposure remains elusive. Diversity of the mucosal microbiome of the upper airways probably reflects an undisturbed balance of beneficial microorganisms and pathogens, such as Moraxella catarrhalis, which has been associated with subsequent development of asthma and pneumonia. In addition, specific fermenters of plant fibers, such as the genera Ruminococcus and Bacteroides, have been implicated in asthma protection through production of short-chain fatty acids, volatile substances with the capability to reduce T-helper cell type 2-mediated allergic airway inflammation. Evolutionary thinking may offer a key to understanding noncommunicable inflammatory diseases as delayed adaptation to a world of fast and profound environmental changes. Better adaptation may be fostered by growing insight into the interplay between man and microbiome and an adequate choice of the environmental exposure.

  9. Research Design and Statistics for Applied Linguistics.

    ERIC Educational Resources Information Center

    Hatch, Evelyn; Farhady, Hossein

    An introduction to the conventions of research design and statistical analysis is presented for graduate students of applied linguistics. The chapters cover such concepts as the definition of research, variables, research designs, research report formats, sorting and displaying data, probability and hypothesis testing, comparing means,…

  10. A Student’s t Mixture Probability Hypothesis Density Filter for Multi-Target Tracking with Outliers

    PubMed Central

    Liu, Zhuowei; Chen, Shuxin; Wu, Hao; He, Renke; Hao, Lin

    2018-01-01

    In multi-target tracking, outlier-corrupted process and measurement noises can severely degrade the performance of the probability hypothesis density (PHD) filter. To solve this problem, this paper proposes a novel PHD filter, called the Student’s t mixture PHD (STM-PHD) filter. The proposed filter models the heavy-tailed process and measurement noises as Student’s t distributions and approximates the multi-target intensity as a mixture of Student’s t components to be propagated in time. A closed-form PHD recursion is then obtained based on the Student’s t approximation. Our approach exploits the heavy-tailed character of the Student’s t distribution to handle situations with heavy-tailed process and measurement noises. The simulation results verify that the proposed filter overcomes the negative effect of outliers and maintains good tracking accuracy in the simultaneous presence of process and measurement outliers. PMID:29617348
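
The reason a Student's t model copes with outliers is its polynomial tails: a measurement several standard deviations out is orders of magnitude more likely under a t density than under a Gaussian, so a single outlier does not dominate the filter update. A minimal sketch of that contrast (generic densities only, not the STM-PHD recursion itself):

```python
import math

def normal_pdf(x, sigma=1.0):
    """Standard normal density."""
    return math.exp(-0.5 * (x / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def student_t_pdf(x, nu=3.0):
    """Student's t density with nu degrees of freedom."""
    c = math.gamma((nu + 1) / 2) / (math.sqrt(nu * math.pi) * math.gamma(nu / 2))
    return c * (1 + x * x / nu) ** (-(nu + 1) / 2)

# Near the center the two densities are comparable...
print(normal_pdf(0.0), student_t_pdf(0.0))
# ...but five standard deviations out the t density is over a thousand times
# larger, so an outlier there is plausible under t yet "impossible" under a
# Gaussian -- which is why Gaussian filters overreact to such points.
print(normal_pdf(5.0), student_t_pdf(5.0))
```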

  11. Enhanced tau neutrino appearance through invisible decay

    NASA Astrophysics Data System (ADS)

    Pagliaroli, Giulia; Di Marco, Natalia; Mannarelli, Massimo

    2016-06-01

    The decay of neutrino mass eigenstates leads to a change in the conversion and survival probabilities of neutrino flavor eigenstates. Exploiting recent results released by the long-baseline OPERA experiment, we perform a statistical investigation of the neutrino invisible decay hypothesis in the νμ→ντ appearance channel. We find that neutrino decay provides an enhancement of the expected tau appearance signal with respect to the standard oscillation scenario for the long-baseline OPERA experiment. The increase of the νμ→ντ conversion probability by the decay of one of the mass eigenstates is due to a reduction of the "destructive interference" among the different massive neutrino components. Although the data show only a very mild preference for invisible decay over the oscillation-only hypothesis, we provide a lower limit on the neutrino decay lifetime in this channel of τ3/m3 ≳ 1.3 × 10⁻¹³ s/eV at the 90% confidence level.
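
The interference argument can be illustrated with a toy two-component flavor amplitude (hypothetical mixing coefficients and phase, not the OPERA analysis): unitarity makes the two terms cancel at zero phase, and damping one of them by invisible decay weakens that cancellation, raising the appearance probability.

```python
import cmath

# Toy, unitarity-inspired mixing coefficients (hypothetical values):
# at phi = 0 the two terms cancel exactly.
a, b = 0.5, -0.5

def conversion_prob(phi, damping=1.0):
    """|a * damping * e^{i*phi} + b|^2; damping < 1 models invisible decay
    of one mass eigenstate over the baseline."""
    return abs(a * damping * cmath.exp(1j * phi) + b) ** 2

phi = 0.5                                    # toy oscillation phase ~ Δm² L / (2E)
p_osc   = conversion_prob(phi)               # standard oscillation
p_decay = conversion_prob(phi, damping=0.6)  # one eigenstate partially decayed

# Damping one of the interfering terms weakens the destructive interference,
# so the appearance probability increases, as the abstract describes.
print(p_osc, p_decay)   # ≈ 0.061 vs ≈ 0.077
```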

  12. Type I error probability spending for post-market drug and vaccine safety surveillance with binomial data.

    PubMed

    Silva, Ivair R

    2018-01-15

    Type I error probability spending functions are commonly used for designing sequential analyses of binomial data in clinical trials, and they are quickly being adopted for near-continuous sequential analysis in post-market drug and vaccine safety surveillance. It is well known that, for clinical trials, when the null hypothesis is not rejected it is still important to minimize the sample size. In post-market safety surveillance, by contrast, this is not the priority: especially when the surveillance involves identification of potential signals, the meaningful statistical performance measure to be minimized is the expected sample size when the null hypothesis is rejected. The present paper shows that, instead of the convex Type I error spending shape conventionally used in clinical trials, a concave shape is better suited to post-market drug and vaccine safety surveillance. This is shown for both continuous and group sequential analysis. Copyright © 2017 John Wiley & Sons, Ltd.
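
The convex-versus-concave contrast can be illustrated with the common power family of spending functions, α(t) = α·t^ρ over information time t ∈ [0, 1] (a standard family, not necessarily the exact shapes studied in the paper): ρ > 1 is convex and saves the error budget for late looks, while ρ < 1 is concave and spends it early, which is what favors a small expected sample size when the null is rejected.

```python
ALPHA = 0.05  # overall Type I error budget

def power_spending(t, rho):
    """Power-family spending function ALPHA * t**rho over information
    time t in [0, 1].  rho > 1: convex (clinical-trial style, spends
    error late); rho < 1: concave (front-loads the error budget)."""
    return ALPHA * t ** rho

t = 0.2  # 20% of the maximum surveillance length
early_convex  = power_spending(t, rho=3.0)
early_concave = power_spending(t, rho=0.5)

# The concave shape has already spent far more of its error budget early,
# enabling earlier rejection when a safety signal is real.
print(early_convex, early_concave)   # 0.0004 vs ≈ 0.0224
```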

  13. Nature vs. nurture: can enrichment rescue the behavioural phenotype of BDNF heterozygous mice?

    PubMed

    Chourbaji, Sabine; Brandwein, Christiane; Vogt, Miriam A; Dormann, Christof; Hellweg, Rainer; Gass, Peter

    2008-10-10

    In earlier experiments we demonstrated that group-housing in a rather impoverished "standard" environment can be a crucial stress factor in male C57Bl/6 mice. The present study investigated the effect of combining a probable genetic vulnerability (postulated by the "Neurotrophin Hypothesis of Depression") with the potentially modulating influence of a stressful environment such as "impoverished" standard housing conditions. For that purpose, mice with a partial deletion of brain-derived neurotrophic factor (BDNF) were group-housed under standard and enriched housing conditions and analysed in a well-established test battery for emotional behaviours. Standard group-housing affected emotional behaviour in male and female BDNF heterozygous mice, causing increased anxiety and changes in exploration and nociception. Providing the animals' cages with supplementary enrichment, however, rescued these emotional alterations, which emphasises the significance of external factors and their relevance for a valid investigation of genetic aspects in these and other mutants that may be examined in terms of stress-responsiveness or emotionality.

  14. Decision-making when data and inferences are not conclusive: risk-benefit and acceptable regret approach.

    PubMed

    Hozo, Iztok; Schell, Michael J; Djulbegovic, Benjamin

    2008-07-01

    The absolute truth in research is unobtainable, as no evidence or research hypothesis is ever 100% conclusive. Therefore, all data and inferences can in principle be considered "inconclusive." Scientific inference and decision-making need to take into account errors, which are unavoidable in the research enterprise. Errors can occur at the level of conclusions, which aim to discern the truthfulness of a research hypothesis from the accuracy of the research evidence, and at the level of decisions, whose goal is to enable optimal decision-making under present and specific circumstances. To optimize the chance of both correct conclusions and correct decisions, a synthesis of all major statistical approaches to clinical research is needed. The integration of these approaches (frequentist, Bayesian, and decision-analytic) can be accomplished through formal risk:benefit (R:B) analysis. This chapter illustrates the rational choice of a research hypothesis using R:B analysis based on a decision-theoretic expected-utility framework and the concept of "acceptable regret" to calculate the threshold probability of "truth" above which the benefit of accepting a research hypothesis outweighs its risks.

  15. Latitudinal variation in reproductive strategies by the migratory Louisiana Waterthrush

    USGS Publications Warehouse

    Mattsson, B.J.; Latta, S.C.; Cooper, R.J.; Mulvihill, R.S.

    2011-01-01

    We evaluated hypotheses that seek to explain breeding strategies of the Louisiana Waterthrush (Parkesia motacilla) that vary across a latitudinal gradient. On the basis of data from 418 nests of color-banded individuals in southwestern Pennsylvania and 700 km south in the Georgia Piedmont, we found that clutch size in replacement nests and probability of renesting were significantly greater in Pennsylvania (clutch size 4.4; renesting probability 0.66) than in Georgia (clutch size 3.8; renesting probability 0.54). Contrasts of the remaining measures of breeding were not statistically significant, and, in particular, mean daily nest survival in the two study areas was nearly identical (0.974 in Pennsylvania; 0.975 in Georgia). An individual-based model of fecundity (i.e., number of fledged young per adult female) predicted that approximately half of the females in both Pennsylvania and Georgia fledge at least one young; mean fecundity in Pennsylvania and Georgia was 2.28 and 1.91, respectively. On the basis of greater support for the food-limitation hypothesis than for the season-length hypothesis, the trade-off between breeding in a region with more food but making a longer migration may be greater for waterthrushes breeding farther north than for those breeding farther south. © The Cooper Ornithological Society 2011.

  16. Mediating role of activity level in the depressive realism effect.

    PubMed

    Blanco, Fernando; Matute, Helena; A Vadillo, Miguel

    2012-01-01

    Several classic studies have concluded that the accuracy of identifying uncontrollable situations depends heavily on depressive mood. Nondepressed participants tend to exhibit an optimistic illusion of control, whereas depressed participants tend to better detect a lack of control. Recently, we suggested that the different activity levels (measured as the probability of responding during a contingency learning task) exhibited by depressed and nondepressed individuals are partly responsible for this effect. The two studies presented in this paper provide further support for this mediational hypothesis, in which mood is the distal cause of the illusion of control operating through activity level, the proximal cause. In Study 1, the probability of responding, P(R), was found to be a mediator variable between the depressive symptoms and the judgments of control. In Study 2, we intervened directly on the mediator variable: The P(R) for both depressed and nondepressed participants was manipulated through instructions. Our results confirm that P(R) manipulation produced differences in the participants' perceptions of uncontrollability. Importantly, the intervention on the mediator variable cancelled the effect of the distal cause; the participants' judgments of control were no longer mood dependent when the P(R) was manipulated. This result supports the hypothesis that the so-called depressive realism effect is actually mediated by the probability of responding.
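
The mediation logic (mood as the distal cause, activity level P(R) as the proximal cause) can be sketched with simulated data and Baron-Kenny-style regressions; all effect sizes below are hypothetical, chosen only to illustrate the indirect-effect calculation.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000

# Hypothetical data-generating process: higher depressive symptoms lower
# the activity level P(R), and a lower P(R) lowers the judged control.
mood      = rng.normal(size=n)                               # depressive symptoms
p_respond = -0.9 * mood + rng.normal(scale=0.5, size=n)      # mediator: P(R)
judgment  = 0.5 * p_respond + rng.normal(scale=0.5, size=n)  # judged control

# Path a: mood -> mediator (simple regression slope).
a = np.polyfit(mood, p_respond, 1)[0]

# Path b: mediator -> outcome, controlling for mood.
X = np.column_stack([p_respond, mood, np.ones(n)])
b = np.linalg.lstsq(X, judgment, rcond=None)[0][0]

indirect_effect = a * b   # roughly (-0.9) * 0.5 = -0.45
print(a, b, indirect_effect)
```

With the direct mood-to-judgment path set to zero in the simulation, fixing P(R) directly (as in Study 2's instruction manipulation) would remove any mood dependence of the judgments, which is the paper's key test.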

  17. Estimating nest detection probabilities for white-winged dove nest transects in Tamaulipas, Mexico

    USGS Publications Warehouse

    Nichols, J.D.; Tomlinson, R.E.; Waggerman, G.

    1986-01-01

    Nest transects in nesting colonies provide one source of information on White-winged Dove (Zenaida asiatica asiatica) population status and reproduction. Nests are counted along transects using standardized field methods each year in Texas and northeastern Mexico by personnel associated with Mexico's Office of Flora and Fauna, the Texas Parks and Wildlife Department, and the U.S. Fish and Wildlife Service. Nest counts on transects are combined with information on the size of nesting colonies to estimate total numbers of nests in sampled colonies. Historically, these estimates have been based on the actual nest counts on transects and thus have required the assumption that all nests lying within transect boundaries are detected (seen) with a probability of one. Our objectives were to test the hypothesis that nest detection probability is one and, if rejected, to estimate this probability.
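
The abstract does not say which estimator was used, but a standard way to test whether detection probability equals one is to survey the same transects twice and compare the overlap; the Lincoln-Petersen logic below (with hypothetical counts) shows the idea.

```python
# Hypothetical counts from two independent surveys of the same transects.
n1 = 80   # nests detected by survey 1
n2 = 70   # nests detected by survey 2
m  = 56   # nests detected by both surveys

# Under independent detection, the fraction of survey 2's nests that
# survey 1 also found estimates survey 1's detection probability.
p1_hat = m / n2          # 0.8
p2_hat = m / n1          # 0.7
N_hat  = n1 * n2 / m     # Lincoln-Petersen estimate of total nests: 100.0

# Estimates of p well below 1 would reject the historical assumption that
# every nest within transect boundaries is seen with probability one.
print(p1_hat, p2_hat, N_hat)
```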

  18. Cancer immunotherapy by immunosuppression.

    PubMed

    Prehn, Richmond T; Prehn, Liisa M

    2010-12-15

    We have previously suggested that the stimulatory effect of a weak immune reaction on tumor growth may be necessary for the growth of incipient tumors. In the present paper, we enlarge upon and extend that idea by collecting evidence in the literature bearing upon the new hypothesis that a growing cancer, whether in man or mouse, is, throughout its lifespan, probably growing and progressing because of continued immune stimulation by a weak immune reaction. We also suggest that prolonged immunosuppression might interfere with progression and thus be an aid to therapy. While most of the considerable evidence supporting the hypothesis comes from observations of experimental mouse tumors, there is suggestive evidence that human tumors may behave in much the same way, and as far as we can ascertain, there is no present evidence that necessarily refutes the hypothesis.

  19. Apparent inferiority of first-time breeders in the kittiwake: The role of heterogeneity among age classes

    USGS Publications Warehouse

    Cam, E.; Monnat, J.-Y.

    2000-01-01

    1. Many studies have provided evidence that first-time breeders have a lower survival, a lower probability of success, or of breeding, in the following year. Hypotheses based on reproductive costs have often been proposed to explain this. However, because of the intrinsic relationship between age and experience, the apparent inferiority of first-time breeders at the population level may result from selection, and experience may not influence performance within each individual. In this paper we address the question of phenotypic correlations between fitness components. This addresses differences in individual quality, a prerequisite for a selection process to occur. We also test the hypothesis of an influence of experience on these components while taking age and reproductive success into account: two factors likely to play a key role in a selection process. 2. Using data from a long-term study on the kittiwake, we found that first-time breeders have a lower probability of success, a lower survival and a lower probability of breeding in the next year than experienced breeders. However, neither experienced nor inexperienced breeders have a lower survival or a lower probability of breeding in the following year than birds that skipped a breeding opportunity. This suggests heterogeneity in quality among individuals. 3. Failed birds have a lower survival and a lower probability of breeding in the following year regardless of experience. This can be interpreted in the light of the selection hypothesis. The inferiority of inexperienced breeders may be linked to a higher proportion of lower-quality individuals in younger age classes. When age and breeding success are controlled for, there is no evidence of an influence of experience on survival or future breeding probability. 4. Using data from individuals whose reproductive life lasted the same number of years, we investigated the influence of experience on reproductive performance within individuals. 
There is no strong evidence that a process operating within individuals explains the improvement in performance observed at the population level.

  20. An observational estimate of the probability of encounters between mass-losing evolved stars and molecular clouds

    NASA Astrophysics Data System (ADS)

    Kastner, Joel H.; Myers, P. C.

    1994-02-01

    One hypothesis for the elevated abundance of Al-26 present during the formation of the solar system is that an asymptotic giant branch (AGB) star expired within the molecular cloud (MC) containing the protosolar nebula. To test this hypothesis for star-forming clouds at the present epoch, we compared nearly complete lists of rapidly mass-losing AGB stars and MCs in the solar neighborhood and identified those stars which are most likely to encounter a nearby cloud. Roughly 10 stars satisfy our selection criteria. We estimated probabilities of encounter for these stars from the position of each star relative to cloud CO emission and the likely star-cloud distance along the line of sight. Typical encounter probabilities are approximately 1%. The number of potential encounters and the probability for each star-cloud pair to result in an encounter suggest that within 1 kpc of the Sun, there is an approximately 1% chance that a given cloud will be visited by a mass-losing AGB star over the next million years. The estimate is dominated by the possibility of encounters involving the stars IRC +60041 and S Cep. Over a MC lifetime, the probability for AGB encounter may be as high as approximately 70%. We discuss the implications of these results for theories of Al-26 enrichment of processed and unprocessed meteoritic inclusions. If the Al-26 in either type of inclusion arose from AGB-MC interaction, the low probability estimated here seems to require that AGB-MC encounters trigger multiple star formation and/or that the production rate of AGB stars was higher during the epoch of solar system formation than at present. Various lines of evidence suggest only the more massive (5-8 solar mass) AGB stars can produce significant Al-26 enrichment of star-forming clouds.

  1. The estimation of tree posterior probabilities using conditional clade probability distributions.

    PubMed

    Larget, Bret

    2013-07-01

    In this article I introduce the idea of conditional independence of separated subtrees as a principle by which to estimate the posterior probability of trees using conditional clade probability distributions rather than simple sample relative frequencies. I describe an algorithm for these calculations and software which implements these ideas. I show that these alternative calculations are very similar to simple sample relative frequencies for high probability trees but are substantially more accurate for relatively low probability trees. The method allows the posterior probability of unsampled trees to be calculated when these trees contain only clades that are in other sampled trees. Furthermore, the method can be used to estimate the total probability of the set of sampled trees which provides a measure of the thoroughness of a posterior sample.
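
The factorization behind the method can be sketched directly: a tree's posterior probability is estimated as the product, over its clades, of the conditional probability of each clade's split given its parent clade, with those conditionals estimated from sample counts. The toy four-taxon example below (hypothetical sample counts; not Larget's software) recovers the sample relative frequencies exactly, but the same machinery also scores unsampled trees whose clades all appear in sampled trees.

```python
from collections import Counter

def splits(tree):
    """Return (parent_clade, child_clade) pairs for a nested-tuple tree,
    taking the lexicographically smaller child of each split as canonical."""
    pairs = []
    def walk(node):
        if isinstance(node, tuple):
            left, right = walk(node[0]), walk(node[1])
            parent = left | right
            pairs.append((parent, min(left, right, key=sorted)))
            return parent
        return frozenset([node])
    walk(tree)
    return pairs

# Hypothetical posterior sample over rooted 4-taxon topologies.
sample = [(("A", "B"), ("C", "D"))] * 60 + [(("A", "C"), ("B", "D"))] * 40

clade_counts, split_counts = Counter(), Counter()
for t in sample:
    for parent, child in splits(t):
        clade_counts[parent] += 1
        split_counts[(parent, child)] += 1

def ccd_probability(tree):
    """Product of estimated conditional clade probabilities
    P(split | parent clade).  Clades never seen in the sample would
    need special handling (omitted in this sketch)."""
    p = 1.0
    for parent, child in splits(tree):
        p *= split_counts[(parent, child)] / clade_counts[parent]
    return p

print(ccd_probability((("A", "B"), ("C", "D"))))  # 0.6
print(ccd_probability((("A", "C"), ("B", "D"))))  # 0.4
```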

  2. Exact probability distribution function for the volatility of cumulative production

    NASA Astrophysics Data System (ADS)

    Zadourian, Rubina; Klümper, Andreas

    2018-04-01

    In this paper we study the volatility, and its probability distribution function, of cumulative production based on the experience curve hypothesis. This work presents a generalization of the study of volatility in Lafond et al. (2017), which addressed the effects of normally distributed noise in the production process. Owing to its wide applicability in industrial and technological activities, we present here the mathematical foundation for an arbitrary distribution function of the process, which we expect will pave the way for future research on forecasting the production process.

  3. Neural implementation of operations used in quantum cognition.

    PubMed

    Busemeyer, Jerome R; Fakhari, Pegah; Kvam, Peter

    2017-11-01

    Quantum probability theory has been successfully applied outside of physics to account for numerous findings from psychology regarding human judgement and decision making behavior. However, the researchers who have made these applications do not rely on the hypothesis that the brain is some type of quantum computer. This raises the question of how the brain could implement quantum algorithms by means other than quantum physical operations. This article outlines one way that a neurally based system could perform the computations required by applications of quantum probability to human behavior. Copyright © 2017 Elsevier Ltd. All rights reserved.

  4. Near-Infrared Spectroscopy – Electroencephalography-Based Brain-State-Dependent Electrotherapy: A Computational Approach Based on Excitation–Inhibition Balance Hypothesis

    PubMed Central

    Dagar, Snigdha; Chowdhury, Shubhajit Roy; Bapi, Raju Surampudi; Dutta, Anirban; Roy, Dipanjan

    2016-01-01

    Stroke is the leading cause of severe chronic disability and the second leading cause of death worldwide, with 15 million new cases and 50 million stroke survivors. Poststroke chronic disability may be ameliorated with early neurorehabilitation, where non-invasive brain stimulation (NIBS) techniques can be used as an adjuvant treatment to hasten the effects. However, the heterogeneity of the lesioned brain requires individualized NIBS intervention, where the innovative neuroimaging technologies of portable electroencephalography (EEG) and functional near-infrared spectroscopy (fNIRS) can be leveraged for Brain State Dependent Electrotherapy (BSDE). In this hypothesis and theory article, we propose a computational approach based on the excitation–inhibition (E–I) balance hypothesis to objectively quantify the poststroke individual brain state using online fNIRS–EEG joint imaging. One of the key events following stroke is an imbalance in the local E–I ratio (that is, the ratio of Glutamate/GABA), which may be targeted with NIBS using a computational pipeline that includes individual "forward models" to predict current flow patterns through the lesioned brain or brain target region. The current flow will polarize the neurons, which can be captured with E–I-based brain models. Furthermore, the E–I balance hypothesis can be used to find the consequences of cellular polarization on neuronal information processing, which can then be implicated in changes in function. We first review the evidence showing how this local E–I imbalance, which leads to functional dysfunction, can be restored at targeted sites with NIBS (motor cortex and somatosensory cortex), resulting in large-scale plastic reorganization over the cortex and probably facilitating recovery of function. Second, we show evidence for how BSDE based on the E–I balance hypothesis may target a specific brain site or network as an adjuvant treatment.
Hence, computational neural-mass-model-based integration of neurostimulation with online neuroimaging systems may provide a less ambiguous, more robust optimization of NIBS and of its application to neurological conditions and disorders across individual patients. PMID:27551273

  5. Term Projects on Interstellar Comets

    ERIC Educational Resources Information Center

    Mack, John E.

    1975-01-01

    Presents two calculations of the probability of detection of an interstellar comet, under the hypothesis that such comets would escape from comet clouds similar to that believed to surround the sun. Proposes three problems, each of which would be a reasonable term project for a motivated undergraduate. (Author/MLH)

  6. Using Astrology to Teach Research Methods to Introductory Psychology Students.

    ERIC Educational Resources Information Center

    Ward, Roger A.; Grasha, Anthony F.

    1986-01-01

    Provides a classroom demonstration designed to test an astrological hypothesis and help teach introductory psychology students about research design and data interpretation. Illustrates differences between science and nonscience, the role of theory in developing and testing hypotheses, making comparisons among groups, probability and statistical…

  7. Hypercalculia in savant syndrome: central executive failure?

    PubMed

    González-Garrido, Andrés Antonio; Ruiz-Sandoval, José Luis; Gómez-Velázquez, Fabiola R; de Alba, José Luis Oropeza; Villaseñor-Cabrera, Teresa

    2002-01-01

    The existence of outstanding cognitive talent in mentally retarded subjects persists as a challenge to present knowledge. We report the case of a 16-year-old male patient with exceptional mental calculation abilities and moderate mental retardation. The patient was clinically evaluated. Data from standard magnetic resonance imaging (MRI) and two 99mTc-ethyl cysteine dimer (ECD) single-photon emission computed tomography (SPECT) studies (at rest and while performing a mental calculation task) were analyzed. The main neurologic findings were brachycephaly, right-side neurologic soft signs, an obsessive personality profile, a low color-word interference effect in the Stroop test, and diffusely increased cerebral blood flow during the calculation task on 99mTc-ECD SPECT. MRI showed inverse asymmetry of the temporal plane. The evidence appears to support the hypothesis that savant skill is related to excessive and erroneous use of cognitive processing resources instigated by a probable failure in central executive control mechanisms.

  8. Multifocal oral melanoacanthoma associated with Addison's disease and hyperthyroidism: a case report.

    PubMed

    Dantas, Thinali Sousa; Nascimento, Isabelly Vidal do; Verde, Maria Elisa Quezado Lima; Alves, Ana Paula Negreiros Nunes; Sousa, Fabrício Bitu; Mota, Mário Rogério Lima

    2017-01-01

    Oral melanoacanthoma is a mucocutaneous, pigmented, rare, benign, and probably reactive lesion. This paper reports for the first time in the literature a case of multifocal oral melanoacanthoma in a patient diagnosed with Addison's disease and concomitant Graves' disease with hyperthyroidism. The patient presented with oral pigmented lesions, which were hypothesized to be mucosal pigmentation associated with Addison's disease. Due to their unusual clinical pattern, these oral lesions were biopsied and diagnosed as oral melanoacanthoma on histopathology and immunohistochemistry for HMB-45. At the moment of this report, the patient was being treated for her systemic conditions, but the lesions had not regressed. Reactive hyperpigmentation of the skin and mucous membranes may be found in Addison's disease and hyperthyroidism. This case reinforces the hypothesis of a reactive nature for oral melanoacanthoma and highlights the need for investigation of endocrine disorders in patients with multifocal oral melanoacanthoma.

  9. Fluvial valleys in the heavily cratered terrains of Mars: Evidence for paleoclimatic change?

    NASA Technical Reports Server (NTRS)

    Gulick, V. C.; Baker, V. R.

    1993-01-01

    Whether the formation of the Martian valley networks provides unequivocal evidence for drastically different climatic conditions remains debatable. Recent theoretical climate modeling precludes the existence of a temperate climate early in Mars' geological history. An alternative hypothesis suggests that Mars had a globally higher heat flow early in its geological history, bringing water tables to within 350 m of the surface. While a globally higher heat flow would initiate ground water circulation at depth, the valley networks probably required water tables to be even closer to the surface. Additionally, it was previously reported that the clustered distribution of the valley networks within terrain types, particularly in the heavily cratered highlands, suggests regional hydrological processes were important. The case for localized hydrothermal systems is summarized and estimates of both erosion volumes and of the implied water volumes for several Martian valley systems are presented.

  10. Did Father Cicero suffer from rheumatism?

    PubMed

    Rocha, Francisco Airton Castro

    Father Cicero Romao Batista is probably the most famous figure from Ceará of all time. An important protagonist in the Cariri region, in the south of Ceará State, from the late nineteenth century through the first third of the twentieth, Father Cicero was intensely active in politics and religion, and had other less well-known achievements, for instance the ecological teachings that earned him the title of "Patron of Forests", besides enormous effort and personal sacrifice to improve the conditions of human life. Inspired by reading his biography, we find that "Padim Ciço" could have had an inflammatory spondyloarthropathy. In this article, we present the plausibility of this diagnostic hypothesis, seeking to emphasize that an attentive ear and clinical observation, albeit indirect and without the privilege of personal contact with the patient, are unparalleled tools for reaching a diagnosis. Copyright © 2016. Published by Elsevier Editora Ltda.

  11. Parents who influence their children to become scientists: effects of gender and parental education.

    PubMed

    Sonnert, Gerhard

    2009-12-01

    In this paper we report on testing the 'role-model' and 'opportunity-structure' hypotheses about the parents whom scientists mentioned as career influencers. According to the role-model hypothesis, the gender match between scientist and influencer is paramount (for example, women scientists would disproportionately often mention their mothers as career influencers). According to the opportunity-structure hypothesis, the parent's educational level predicts his/her probability of being mentioned as a career influencer (that is, parents with higher educational levels would be more likely to be named). The examination of a sample of American scientists who had received prestigious postdoctoral fellowships resulted in rejecting the role-model hypothesis and corroborating the opportunity-structure hypothesis. There were a few additional findings. First, women scientists were more likely than men scientists to mention parental influencers. Second, fathers were more likely than mothers to be mentioned as influencers. Third, an interaction was found between the scientist's gender and parental education when predicting a parent's nomination as influencer.

  12. Applications of Bayesian Statistics to Problems in Gamma-Ray Bursts

    NASA Technical Reports Server (NTRS)

    Meegan, Charles A.

    1997-01-01

    This presentation will describe two applications of Bayesian statistics to Gamma-Ray Bursts (GRBs). The first attempts to quantify the evidence for a cosmological versus galactic origin of GRBs using only the observations of the dipole and quadrupole moments of the angular distribution of bursts. The cosmological hypothesis predicts isotropy, while the galactic hypothesis is assumed to produce a uniform probability distribution over positive values for these moments. The observed isotropic distribution indicates that the Bayes factor for the cosmological hypothesis over the galactic hypothesis is about 300. Another application of Bayesian statistics is in the estimation of chance associations of optical counterparts with galaxies. The Bayesian approach is preferred to frequentist techniques here because the Bayesian approach easily accounts for galaxy mass distributions and because one can incorporate three disjoint hypotheses: (1) bursts come from galactic centers, (2) bursts come from galaxies in proportion to luminosity, and (3) bursts do not come from external galaxies. This technique was used in the analysis of the optical counterpart to GRB970228.
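
The structure of such a Bayes-factor computation can be sketched with toy numbers (the abstract reports only the resulting factor of about 300; the observation and priors below are hypothetical): a sharp isotropic prediction for a moment is compared with a diffuse prior over positive moment values, and a near-zero observation rewards the sharper hypothesis.

```python
import math

def normal_pdf(x, mu, sd):
    """Gaussian density, modeling the measurement error of the moment."""
    return math.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))

# Hypothetical observed dipole moment and its measurement uncertainty.
observed, sigma = 0.01, 0.02

# Cosmological hypothesis: isotropy, so the true moment is 0.
like_cosmo = normal_pdf(observed, 0.0, sigma)

# Galactic hypothesis: true moment uniform over (0, 0.5); its marginal
# likelihood is the prior-averaged likelihood (midpoint rule).
steps = 10_000
like_gal = sum(normal_pdf(observed, 0.5 * (i + 0.5) / steps, sigma)
               for i in range(steps)) / steps

bayes_factor = like_cosmo / like_gal
# > 1: the near-isotropic observation favors the cosmological hypothesis,
# because the diffuse galactic prior wastes mass on large moments.
print(bayes_factor)
```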

  13. Testing the assumptions of the pyrodiversity begets biodiversity hypothesis for termites in semi-arid Australia.

    PubMed

    Davis, Hayley; Ritchie, Euan G; Avitabile, Sarah; Doherty, Tim; Nimmo, Dale G

    2018-04-01

    Fire shapes the composition and functioning of ecosystems globally. In many regions, fire is actively managed to create diverse patch mosaics of fire-ages under the assumption that a diversity of post-fire-age classes will provide a greater variety of habitats, thereby enabling species with differing habitat requirements to coexist, and enhancing species diversity (the pyrodiversity begets biodiversity hypothesis). However, studies provide mixed support for this hypothesis. Here, using termite communities in a semi-arid region of southeast Australia, we test four key assumptions of the pyrodiversity begets biodiversity hypothesis (i) that fire shapes vegetation structure over sufficient time frames to influence species' occurrence, (ii) that animal species are linked to resources that are themselves shaped by fire and that peak at different times since fire, (iii) that species' probability of occurrence or abundance peaks at varying times since fire and (iv) that providing a diversity of fire-ages increases species diversity at the landscape scale. Termite species and habitat elements were sampled in 100 sites across a range of fire-ages, nested within 20 landscapes chosen to represent a gradient of low to high pyrodiversity. We used regression modelling to explore relationships between termites, habitat and fire. Fire affected two habitat elements (coarse woody debris and the cover of woody vegetation) that were associated with the probability of occurrence of three termite species and overall species richness, thus supporting the first two assumptions of the pyrodiversity hypothesis. However, this did not result in those species or species richness being affected by fire history per se. Consequently, landscapes with a low diversity of fire histories had similar numbers of termite species as landscapes with high pyrodiversity. 
Our work suggests that encouraging a diversity of fire-ages is not necessary for enhancing termite species richness in this study region.

  14. Testing the assumptions of the pyrodiversity begets biodiversity hypothesis for termites in semi-arid Australia

    PubMed Central

    Davis, Hayley; Ritchie, Euan G.; Avitabile, Sarah; Doherty, Tim

    2018-01-01

    Fire shapes the composition and functioning of ecosystems globally. In many regions, fire is actively managed to create diverse patch mosaics of fire-ages under the assumption that a diversity of post-fire-age classes will provide a greater variety of habitats, thereby enabling species with differing habitat requirements to coexist and enhancing species diversity (the pyrodiversity begets biodiversity hypothesis). However, studies provide mixed support for this hypothesis. Here, using termite communities in a semi-arid region of southeast Australia, we test four key assumptions of the pyrodiversity begets biodiversity hypothesis: (i) that fire shapes vegetation structure over sufficient time frames to influence species' occurrence; (ii) that animal species are linked to resources that are themselves shaped by fire and that peak at different times since fire; (iii) that species' probability of occurrence or abundance peaks at varying times since fire; and (iv) that providing a diversity of fire-ages increases species diversity at the landscape scale. Termite species and habitat elements were sampled in 100 sites across a range of fire-ages, nested within 20 landscapes chosen to represent a gradient of low to high pyrodiversity. We used regression modelling to explore relationships between termites, habitat and fire. Fire affected two habitat elements (coarse woody debris and the cover of woody vegetation) that were associated with the probability of occurrence of three termite species and overall species richness, thus supporting the first two assumptions of the pyrodiversity hypothesis. However, this did not result in those species or species richness being affected by fire history per se. Consequently, landscapes with a low diversity of fire histories had similar numbers of termite species as landscapes with high pyrodiversity.
Our work suggests that encouraging a diversity of fire-ages to enhance termite species richness in this study region is not necessary. PMID:29765661

  15. Bayesian learning

    NASA Technical Reports Server (NTRS)

    Denning, Peter J.

    1989-01-01

    In 1983 and 1984, the Infrared Astronomical Satellite (IRAS) detected 5,425 stellar objects and measured their infrared spectra. In 1987 a program called AUTOCLASS used Bayesian inference methods to discover the classes present in these data and determine the most probable class of each object, revealing unknown phenomena in astronomy. AUTOCLASS has rekindled the old debate on the suitability of Bayesian methods, which are computationally intensive, interpret probabilities as plausibility measures rather than frequencies, and appear to depend on a subjective assessment of the probability of a hypothesis before the data were collected. Modern statistical methods have, however, recently been shown to also depend on subjective elements. These debates bring into question the whole tradition of scientific objectivity and offer scientists a new way to take responsibility for their findings and conclusions.
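
    The Bayesian classification idea behind AUTOCLASS — a posterior over classes proportional to prior times likelihood — can be sketched in a few lines. This is a toy illustration with two hypothetical one-dimensional Gaussian classes, not the AUTOCLASS code itself:

    ```python
    import math

    def gaussian_pdf(x, mu, sigma):
        """Density of a normal distribution at x."""
        return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

    def class_posteriors(x, classes):
        """P(class | x) ∝ P(x | class) * P(class), normalized over all classes."""
        weighted = [prior * gaussian_pdf(x, mu, sigma) for (prior, mu, sigma) in classes]
        total = sum(weighted)
        return [w / total for w in weighted]

    # Two hypothetical spectral classes with equal prior probability
    classes = [(0.5, 0.0, 1.0), (0.5, 3.0, 1.0)]
    post = class_posteriors(2.5, classes)  # observation lying nearer the second class
    ```

    The prior here is the "subjective assessment" the abstract mentions: it enters the posterior explicitly, before any data are seen.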

  16. Prediction of Conditional Probability of Survival After Surgery for Gastric Cancer: A Study Based on Eastern and Western Large Data Sets.

    PubMed

    Zhong, Qing; Chen, Qi-Yue; Li, Ping; Xie, Jian-Wei; Wang, Jia-Bin; Lin, Jian-Xian; Lu, Jun; Cao, Long-Long; Lin, Mi; Tu, Ru-Hong; Zheng, Chao-Hui; Huang, Chang-Ming

    2018-04-20

    The dynamic prognosis of patients who have undergone curative surgery for gastric cancer has yet to be reported. Our objective was to devise an accurate tool for predicting the conditional probability of survival for these patients. We analyzed 11,551 gastric cancer patients from the Surveillance, Epidemiology, and End Results database. Two-thirds of the patients were selected randomly for the development set and one-third for the validation set. Two nomograms were constructed to predict the conditional probability of overall survival and the conditional probability of disease-specific survival, using conditional survival methods. We then applied these nomograms to the 4,001 patients in the database from Fujian Medical University Union Hospital, Fuzhou, China, one of the most active Chinese institutes. The 5-year conditional probability of overall survival of the patients was 41.6% immediately after resection and increased to 52.8%, 68.2%, and 80.4% at 1, 2, and 3 years after gastrectomy. The 5-year conditional probability of disease-specific survival "increased" from 48.9% at the time of gastrectomy to 59.8%, 74.7%, and 85.5% for patients surviving 1, 2, and 3 years, respectively. Sex; race; age; depth of tumor invasion; lymph node metastasis; and tumor size, site, and grade were associated with overall survival and disease-specific survival (P <.05). Within the Surveillance, Epidemiology, and End Results validation set, the accuracy of the conditional probability of overall survival nomogram was 0.77, 0.81, 0.82, and 0.82 at 1, 3, 5, and 10 years after gastrectomy, respectively. Within the other validation set from the Fujian Medical University Union Hospital (n = 4,001), the accuracy of the conditional probability of overall survival nomogram was 0.76, 0.79, 0.77, and 0.77 at 1, 3, 5, and 10 years, respectively. The accuracy of the conditional probability of disease-specific survival model was also favorable. 
The calibration curve demonstrated good agreement between the predicted and observed survival rates. Based on the large Eastern and Western data sets, we developed and validated the first conditional nomogram for prediction of conditional probability of survival for patients with gastric cancer to allow consideration of the duration of survivorship. Copyright © 2018 Elsevier Inc. All rights reserved.
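
    The conditional-survival quantity behind these nomograms is CS(t | s) = S(s + t) / S(s), the probability of surviving t more years given s years already survived. A minimal sketch, using a hypothetical Weibull survival curve with decreasing hazard (the pattern under which conditional survival improves over time, as in the abstract), not the study's fitted model:

    ```python
    import math

    def conditional_survival(surv, s, t):
        """CS(t | s): probability of surviving t more years given survival to
        year s, computed as S(s + t) / S(s)."""
        return surv(s + t) / surv(s)

    # Hypothetical Weibull survival function with shape < 1 (decreasing hazard)
    surv = lambda t: math.exp(-((t / 4.0) ** 0.5))

    baseline = surv(5)                      # 5-year survival at time of surgery
    cs3 = conditional_survival(surv, 3, 5)  # 5-year survival given 3 years survived
    ```

    With a decreasing hazard, cs3 exceeds the baseline 5-year survival, mirroring the abstract's 41.6% rising to 80.4% pattern.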

  17. Neyman-Pearson classification algorithms and NP receiver operating characteristics

    PubMed Central

    Tong, Xin; Feng, Yang; Li, Jingyi Jessica

    2018-01-01

    In many binary classification applications, such as disease diagnosis and spam detection, practitioners commonly face the need to limit type I error (that is, the conditional probability of misclassifying a class 0 observation as class 1) so that it remains below a desired threshold. To address this need, the Neyman-Pearson (NP) classification paradigm is a natural choice; it minimizes type II error (that is, the conditional probability of misclassifying a class 1 observation as class 0) while enforcing an upper bound, α, on the type I error. Despite its century-long history in hypothesis testing, the NP paradigm has not been well recognized and implemented in classification schemes. Common practices that directly limit the empirical type I error to no more than α do not satisfy the type I error control objective because the resulting classifiers are likely to have type I errors much larger than α, and the NP paradigm has not been properly implemented in practice. We develop the first umbrella algorithm that implements the NP paradigm for all scoring-type classification methods, such as logistic regression, support vector machines, and random forests. Powered by this algorithm, we propose a novel graphical tool for NP classification methods: NP receiver operating characteristic (NP-ROC) bands motivated by the popular ROC curves. NP-ROC bands will help choose α in a data-adaptive way and compare different NP classifiers. We demonstrate the use and properties of the NP umbrella algorithm and NP-ROC bands, available in the R package nproc, through simulation and real data studies. PMID:29423442
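
    The failure of the common practice the abstract describes can be seen in a small simulation: thresholding at the empirical (1 − α) quantile of class-0 training scores yields a true type I error above α roughly half the time. This is an illustrative sketch, not the nproc umbrella algorithm itself:

    ```python
    import math
    import random

    def phi(x):
        """Standard normal CDF."""
        return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

    def naive_threshold(scores, alpha):
        """Empirical (1 - alpha) quantile of class-0 training scores."""
        s = sorted(scores)
        k = int(math.ceil((1.0 - alpha) * len(s))) - 1
        return s[k]

    random.seed(0)
    alpha, n, reps = 0.05, 200, 2000
    violations = 0
    for _ in range(reps):
        scores = [random.gauss(0.0, 1.0) for _ in range(n)]  # class-0 scores ~ N(0, 1)
        thr = naive_threshold(scores, alpha)
        true_type1 = 1.0 - phi(thr)  # P(class-0 score > thr) under N(0, 1)
        if true_type1 > alpha:
            violations += 1
    rate = violations / reps  # ≈ 0.5: the naive rule violates the bound about half the time
    ```

    The NP umbrella algorithm instead chooses a more conservative order statistic so that the probability of the type I error exceeding α is bounded by a user-chosen δ.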

  18. Landscape conditions predisposing grizzly bears to conflicts on private agricultural lands in the western USA

    USGS Publications Warehouse

    Wilson, S.M.; Madel, M.J.; Mattson, D.J.; Graham, J.M.; Merrill, T.

    2006-01-01

    We used multiple logistic regression to model how different landscape conditions contributed to the probability of human-grizzly bear conflicts on private agricultural ranch lands. We used locations of livestock pastures, traditional livestock carcass disposal areas (boneyards), beehives, and wetland-riparian associated vegetation to model the locations of 178 reported human-grizzly bear conflicts along the Rocky Mountain East Front, Montana, USA during 1986-2001. We surveyed 61 livestock producers in the upper Teton watershed of north-central Montana, to collect spatial and temporal data on livestock pastures, boneyards, and beehives for the same period, accounting for changes in livestock and boneyard management and beehive location and protection, for each season. We used 2032 random points to represent the null hypothesis of random location relative to potential explanatory landscape features, and used Akaike's Information Criteria (AIC/AICC) and Hosmer-Lemeshow goodness-of-fit statistics for model selection. We used a resulting "best" model to map contours of predicted probabilities of conflict, and used this map for verification with an independent dataset of conflicts to provide additional insights regarding the nature of conflicts. The presence of riparian vegetation and distances to spring, summer, and fall sheep or cattle pastures, calving and sheep lambing areas, unmanaged boneyards, and fenced and unfenced beehives were all associated with the likelihood of human-grizzly bear conflicts. Our model suggests that collections of attractants concentrated in high quality bear habitat largely explain broad patterns of human-grizzly bear conflicts on private agricultural land in our study area. © 2005 Elsevier Ltd. All rights reserved.
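
    A multiple logistic regression of this kind maps landscape covariates to a predicted conflict probability through the logit link. A minimal sketch with hypothetical coefficients and covariates (not the fitted values from this study):

    ```python
    import math

    def conflict_probability(intercept, coefs, covariates):
        """Logistic model: P(conflict) = 1 / (1 + exp(-(b0 + sum(b_i * x_i)))).
        All coefficient values here are hypothetical, for illustration only."""
        eta = intercept + sum(b * x for b, x in zip(coefs, covariates))
        return 1.0 / (1.0 + math.exp(-eta))

    # Hypothetical covariates: riparian vegetation present (1),
    # distance to nearest unmanaged boneyard in km
    p_near = conflict_probability(-2.0, [1.5, -0.8], [1.0, 0.5])  # 0.5 km away
    p_far = conflict_probability(-2.0, [1.5, -0.8], [1.0, 5.0])   # 5 km away
    ```

    Mapping such predicted probabilities over a grid of locations is what produces the contour map of conflict risk described in the abstract.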

  19. Healthy-unhealthy weight and time preference. Is there an association? An analysis through a consumer survey.

    PubMed

    Cavaliere, Alessia; De Marchi, Elisa; Banterle, Alessandro

    2014-12-01

    Individual time preference has been recognized as a key driver in explaining consumers' probability of having a healthy weight or incurring excess weight problems. The term time preference refers to the rate at which a person is disposed to trade a current satisfaction for a future benefit. This characteristic may affect the extent to which individuals invest in health and may influence diet choices. The purpose of this paper is to analyse the role time preference (measured in terms of diet-related behaviours) may play in explaining consumers' healthy or unhealthy body weight. The analysis also considers other drivers predicted to influence BMI, specifically information searching, health-related activities and socio-demographic conditions. The survey was based on face-to-face interviews with a sample of 240 consumers living in Milan. To test the hypothesis, we performed a set of seven ORM regressions, all having consumers' BMI as the dependent variable. Each ORM contains a different block of explanatory variables, while time preference is always included among the regressors. The results suggest that the healthy weight condition is associated with a high orientation to the future, a high interest in nutrition claims, low attention to health-related claims, and a high level of education. By contrast, the probability of being overweight or obese increases when consumers are less future-oriented and is associated with less searching for nutrition claims and a high interest in health claims. Copyright © 2014 Elsevier Ltd. All rights reserved.

  20. Neyman-Pearson classification algorithms and NP receiver operating characteristics.

    PubMed

    Tong, Xin; Feng, Yang; Li, Jingyi Jessica

    2018-02-01

    In many binary classification applications, such as disease diagnosis and spam detection, practitioners commonly face the need to limit type I error (that is, the conditional probability of misclassifying a class 0 observation as class 1) so that it remains below a desired threshold. To address this need, the Neyman-Pearson (NP) classification paradigm is a natural choice; it minimizes type II error (that is, the conditional probability of misclassifying a class 1 observation as class 0) while enforcing an upper bound, α, on the type I error. Despite its century-long history in hypothesis testing, the NP paradigm has not been well recognized and implemented in classification schemes. Common practices that directly limit the empirical type I error to no more than α do not satisfy the type I error control objective because the resulting classifiers are likely to have type I errors much larger than α, and the NP paradigm has not been properly implemented in practice. We develop the first umbrella algorithm that implements the NP paradigm for all scoring-type classification methods, such as logistic regression, support vector machines, and random forests. Powered by this algorithm, we propose a novel graphical tool for NP classification methods: NP receiver operating characteristic (NP-ROC) bands motivated by the popular ROC curves. NP-ROC bands will help choose α in a data-adaptive way and compare different NP classifiers. We demonstrate the use and properties of the NP umbrella algorithm and NP-ROC bands, available in the R package nproc, through simulation and real data studies.

  1. The impact of an emergency fee increase on the composition of patients visiting emergency departments.

    PubMed

    Jung, Hyemin; Do, Young Kyung; Kim, Yoon; Ro, Junsoo

    2014-11-01

    This study aimed to test our hypothesis that the increase in the emergency fee implemented on March 1, 2013 raised the proportion of patients with emergent symptoms by discouraging non-urgent emergency department visits. We conducted an analysis of 728 736 patients registered in the National Emergency Department Information System who visited level 1 and level 2 emergency medical institutes in the two-month period from February 1, 2013, one month before the fee increase, to March 31, 2013, one month after. A difference-in-difference method was used to estimate the net effect of the fee increase on the probability that an emergency visit is for urgent conditions. The percentage of emergency department visits by urgent or equivalent patients increased by 2.4 percentage points, from 74.2% before to 76.6% after the policy implementation. In the group of patients transferred by public transport or ambulance, who were assumed to be least conscious of cost, the change in the proportion of urgent patients was not statistically significant. On the other hand, patients directly presenting to the emergency department by private transport, assumed to be most conscious of cost, showed a 2.4 percentage point increase in the probability of urgent conditions (p<0.001). This trend was consistent across the level 1 and level 2 emergency medical institutes. The increase in the emergency fee implemented on March 1, 2013 raised the proportion of urgent patients among total emergency visits by reducing emergency department visits by non-urgent patients.
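
    The difference-in-difference estimator used here subtracts the control group's before-after change from the treated group's. A minimal sketch with hypothetical proportions of urgent visits (not the study's exact subgroup figures):

    ```python
    def did_estimate(pre_treat, post_treat, pre_ctrl, post_ctrl):
        """Difference-in-differences: change in the treated group minus
        change in the control group."""
        return (post_treat - pre_treat) - (post_ctrl - pre_ctrl)

    # Hypothetical: cost-conscious walk-in patients (treated by the fee change)
    # vs. ambulance arrivals (assumed insensitive to cost), before and after
    effect = did_estimate(0.740, 0.764, 0.750, 0.750)  # ≈ 0.024, i.e. 2.4 points
    ```

    The control group's flat trend nets out any common time effect, attributing the remaining 2.4-point change to the policy.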

  2. Educability and Group Differences.

    ERIC Educational Resources Information Center

    Jensen, Arthur R.

    This pivotal analysis of the genetic factor in intelligence and educability argues that those qualities which seem most closely related to educability cannot be accounted for by a traditional environmentalist hypothesis. It is more probable that they have a substantial genetic basis. Educability, as defined in this book, is the ability to learn…

  3. Using the Nobel Laureates in Economics to Teach Quantitative Methods

    ERIC Educational Resources Information Center

    Becker, William E.; Greene, William H.

    2005-01-01

    The authors show how the work of Nobel Laureates in economics can enhance student understanding and bring them up to date on topics such as probability, uncertainty and decision theory, hypothesis testing, regression to the mean, instrumental variable techniques, discrete choice modeling, and time-series analysis. (Contains 2 notes.)

  4. Mind Your p's and Alphas.

    ERIC Educational Resources Information Center

    Stallings, William M.

    In the educational research literature alpha, the a priori level of significance, and p, the a posteriori probability of obtaining a test statistic of at least a certain value when the null hypothesis is true, are often confused. Explanations for this confusion are offered. Paradoxically, alpha retains a prominent place in textbook discussions of…
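
    The distinction is easy to make concrete: α is fixed before the data are seen, while p is computed afterward from the observed test statistic. A small sketch for a two-sided z-test with a hypothetical statistic value:

    ```python
    import math

    def two_sided_p_from_z(z):
        """p-value: probability, under the null hypothesis, of observing a
        test statistic at least as extreme as |z|."""
        return 2.0 * (1.0 - 0.5 * (1.0 + math.erf(abs(z) / math.sqrt(2.0))))

    alpha = 0.05               # a priori: chosen before collecting any data
    z = 2.17                   # hypothetical observed test statistic
    p = two_sided_p_from_z(z)  # a posteriori: computed from the data (≈ 0.03)
    reject = p < alpha         # the decision compares the two, but they differ in kind
    ```

    Conflating the two quantities is exactly the confusion the piece describes: α is a decision rule, p is an observed result.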

  5. A Paradigm for the Telephonic Assessment of Suicidal Ideation

    ERIC Educational Resources Information Center

    Halderman, Brent L.; Eyman, James R.; Kerner, Lisa; Schlacks, Bill

    2009-01-01

    A three-stage paradigm for telephonically assessing suicidal risk and triaging suicidal callers as practiced in an Employee Assistance Program Call Center was investigated. The first hypothesis was that the use of the procedure would increase the probability that callers would accept the clinician's recommendations, evidenced by fewer police…

  6. Constructing the Exact Significance Level for a Person-Fit Statistic.

    ERIC Educational Resources Information Center

    Liou, Michelle; Chang, Chih-Hsin

    1992-01-01

    An extension is proposed for the network algorithm introduced by C.R. Mehta and N.R. Patel to construct exact tail probabilities for testing the general hypothesis that item responses are distributed according to the Rasch model. A simulation study indicates the efficiency of the algorithm. (SLD)

  7. Bayesian Posterior Odds Ratios: Statistical Tools for Collaborative Evaluations

    ERIC Educational Resources Information Center

    Hicks, Tyler; Rodríguez-Campos, Liliana; Choi, Jeong Hoon

    2018-01-01

    To begin statistical analysis, Bayesians quantify their confidence in modeling hypotheses with priors. A prior describes the probability of a certain modeling hypothesis apart from the data. Bayesians should be able to defend their choice of prior to a skeptical audience. Collaboration between evaluators and stakeholders could make their choices…

  8. Skipping of Chinese characters does not rely on word-based processing.

    PubMed

    Lin, Nan; Angele, Bernhard; Hua, Huimin; Shen, Wei; Zhou, Junyi; Li, Xingshan

    2018-02-01

    Previous eye-movement studies have indicated that people tend to skip extremely high-frequency words in sentence reading, such as "the" in English and "的/de" in Chinese. Two alternative hypotheses have been proposed to explain how this frequent skipping happens in Chinese reading: one assumes that skipping happens when the preview has been fully identified at the word level (word-based skipping); the other assumes that skipping happens whenever the preview character is easy to identify, regardless of whether lexical processing has been completed (character-based skipping). Using the gaze-contingent display change paradigm, we examined the two hypotheses by substituting the preview of the third character of a four-character Chinese word with the high-frequency Chinese character "的/de", which should disrupt the ongoing word-level processing. The character-based skipping hypothesis predicts that this manipulation will enhance the skipping probability of the target character (i.e., the third character of the target word), because the character "的/de" has a much higher character frequency than the original character. The word-based skipping hypothesis instead predicts a reduction in the skipping probability of the target character because the presence of the character "的/de" is lexically infelicitous at the word level. The results supported the character-based skipping hypothesis, indicating that in Chinese reading the decision to skip a character can be made before integrating it into a word.

  9. Statistics 101 for Radiologists.

    PubMed

    Anvari, Arash; Halpern, Elkan F; Samir, Anthony E

    2015-10-01

    Diagnostic tests have wide clinical applications, including screening, diagnosis, measuring treatment effect, and determining prognosis. Interpreting diagnostic test results requires an understanding of key statistical concepts used to evaluate test efficacy. This review explains descriptive statistics and discusses probability, including mutually exclusive and independent events and conditional probability. In the inferential statistics section, a statistical perspective on study design is provided, together with an explanation of how to select appropriate statistical tests. Key concepts in recruiting study samples are discussed, including representativeness and random sampling. Variable types are defined, including predictor, outcome, and covariate variables, and the relationship of these variables to one another. In the hypothesis testing section, we explain how to determine if observed differences between groups are likely to be due to chance. We explain type I and II errors, statistical significance, and study power, followed by an explanation of effect sizes and how confidence intervals can be used to generalize observed effect sizes to the larger population. Statistical tests are explained in four categories: t tests and analysis of variance, proportion analysis tests, nonparametric tests, and regression techniques. We discuss sensitivity, specificity, accuracy, receiver operating characteristic analysis, and likelihood ratios. Measures of reliability and agreement, including κ statistics, intraclass correlation coefficients, and Bland-Altman graphs and analysis, are introduced. © RSNA, 2015.
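
    Several of the screening-test quantities reviewed here follow directly from a 2×2 table of test results against disease status. A minimal sketch with hypothetical counts:

    ```python
    def diagnostic_metrics(tp, fp, fn, tn):
        """Core diagnostic-test statistics from a 2x2 confusion table."""
        sensitivity = tp / (tp + fn)                # P(test+ | disease)
        specificity = tn / (tn + fp)                # P(test- | no disease)
        accuracy = (tp + tn) / (tp + fp + fn + tn)
        lr_pos = sensitivity / (1.0 - specificity)  # positive likelihood ratio
        lr_neg = (1.0 - sensitivity) / specificity  # negative likelihood ratio
        return {"sensitivity": sensitivity, "specificity": specificity,
                "accuracy": accuracy, "LR+": lr_pos, "LR-": lr_neg}

    # Hypothetical screening results: 90 true positives, 10 false negatives,
    # 30 false positives, 870 true negatives
    m = diagnostic_metrics(tp=90, fp=30, fn=10, tn=870)
    ```

    Note that sensitivity and specificity are conditional probabilities, conditioning on true disease status rather than on the test result.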

  10. Evidence of Methane Outgassing During MIS3 in the Bering Sea

    NASA Astrophysics Data System (ADS)

    Cook, M. S.; Keigwin, L. D.

    2005-12-01

    There are multiple negative excursions in planktonic and benthic foraminifer δ13C in a core from 1467 m in the southeast Bering Sea. These excursions occur episodically during the last glacial period, and may coincide with Dansgaard-Oeschger (D-O) events. Measured foraminifer δ13C values during the excursions are as low as -14‰ and are probably the result of overgrowths of diagenetic calcium carbonate. We estimate overgrowth δ13C at -23‰, and hypothesize that the occurrence of overgrowths is associated with anaerobic oxidation of biogenic methane. The likely pressure and temperature conditions at this site during the last glacial period were well within the zone of methane-hydrate stability, so the source of methane is probably not destabilization of methane hydrate at this depth. The methane may have originated from increased in-situ methanogenesis resulting from greater burial of organic carbon, or from destabilization of methane hydrate at shallower sites near the methane-hydrate stability threshold. Both these scenarios could be active, consistent with the "Clathrate Gun Hypothesis" (Kennett et al., 2003), in which marine methane hydrates are destabilized widely during D-O events and methane gas is both oxidized within the water column and escapes to the atmosphere.

  11. Option volatility and the acceleration Lagrangian

    NASA Astrophysics Data System (ADS)

    Baaquie, Belal E.; Cao, Yang

    2014-01-01

    This paper develops a volatility formula for an option on an asset from an acceleration Lagrangian model, and the formula is calibrated with market data. The Black-Scholes model is a simpler case with a velocity-dependent Lagrangian. The acceleration Lagrangian is defined, and the classical solution of the system in Euclidean time is obtained by choosing proper boundary conditions. The conditional probability distribution of the final position given the initial position is obtained from the transition amplitude. The volatility is the standard deviation of this conditional probability distribution. Using the conditional probability and the path integral method, the martingale condition is applied, fixing one of the parameters in the Lagrangian. The call option price is then obtained using the conditional probability and the path integral method.

  12. The Relationship Between Specific Pavlovian Instrumental Transfer and Instrumental Reward Probability

    PubMed Central

    Cartoni, Emilio; Moretta, Tania; Puglisi-Allegra, Stefano; Cabib, Simona; Baldassarre, Gianluca

    2015-01-01

    Goal-directed behavior is influenced by environmental cues: in particular, cues associated with a reward can bias action choice toward actions directed to that same reward. This effect is studied experimentally as specific Pavlovian-instrumental transfer (specific PIT). We have investigated the hypothesis that cues associated with an outcome elicit specific PIT by raising the estimates of reward probability of actions associated with that same outcome. In other words, cues reduce the uncertainty about the efficacy of instrumental actions. We used a human PIT experimental paradigm to test the effects of two different instrumental contingencies: one group of participants had a 33% chance of being rewarded for each button press, while another had a 100% chance. The group trained with 33% reward probability showed a stronger PIT effect than the 100% group, in line with the hypothesis that Pavlovian cues linked to an outcome work by reducing the uncertainty of receiving it. The 100% group also showed a significant specific PIT effect, highlighting additional factors that could contribute to specific PIT beyond the instrumental training contingency. We hypothesize that the uncertainty about reward delivery due to testing in extinction might be one of these factors. These results add knowledge of how goal-directed behavior is influenced by the presence of environmental cues associated with a reward: such influence depends on the probability of reaching the reward, namely, when there is less chance of getting a reward we are more influenced by cues associated with it, and vice versa. PMID:26635645

  13. The Effects of Temperature and Diet during Development, Adulthood, and Mating on Reproduction in the Red Flour Beetle.

    PubMed

    Scharf, Inon; Braf, Hila; Ifrach, Naama; Rosenstein, Shai; Subach, Aziz

    2015-01-01

    The effects of different temperatures and diets experienced during distinct life stages are not necessarily similar. The silver-spoon hypothesis predicts that developing under favorable conditions will always lead to better-performing adults under all adult conditions. The environment-matching hypothesis suggests that a match between developmental and adult conditions will lead to the best-performing adults. Similar to the latter, the beneficial-acclimation hypothesis suggests that either developing at or acclimating as an adult to the test temperature will improve later performance at that temperature. Here we disentangled the effects of growth, adult, and mating conditions (temperature and diet) on reproduction in the red flour beetle (Tribolium castaneum), with reference to the reproduction success rate, the number of viable offspring produced, and the mean offspring mass 13 days after mating. The most influential stage affecting reproduction differed between the diet and temperature experiments: adult temperature vs. parental growth diet. Generally, a yeast-rich diet or warmer temperature improved reproduction, supporting the silver-spoon hypothesis. However, interactions between life stages made the results more complex, also fitting the environment-matching hypothesis. Warm growth temperature positively affected reproduction success, but only when adults were kept under the same warm temperature. When the parental growth and adult diets matched, the mean offspring mass was greater than in a mismatch between the two. Additionally, a match between warm adult temperature and warm offspring growth temperature led to the largest offspring mass. These findings provide evidence for all three hypotheses and demonstrate that parental effects and plasticity may be induced by temperature and diet.

  14. Type I error probabilities based on design-stage strategies with applications to noninferiority trials.

    PubMed

    Rothmann, Mark

    2005-01-01

    When testing the equality of means from two different populations, a t-test or large-sample normal test is typically performed. For these tests, when the sample size or design for the second sample depends on the results of the first sample, the type I error probability is altered for each specific possibility in the null hypothesis. We examine the impact on the type I error probabilities for two confidence interval procedures and for procedures using test statistics when the design for the second sample or experiment depends on the results from the first sample or experiment (or series of experiments). Ways of controlling a desired maximum type I error probability or a desired type I error rate are discussed. Results are applied to the setting of noninferiority comparisons in active controlled trials, where the use of a placebo is unethical.
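
    One simple case of the design dependence described here: testing the first sample at the 5% level and, only if it is not significant, collecting and testing a fresh second sample. Analytically the overall type I error is then 1 − 0.95² ≈ 0.0975, not 0.05. A simulation sketch of that one case (illustrative, not the paper's procedures):

    ```python
    import random

    def two_stage_rejects(rng, n, z_crit=1.96):
        """Test sample 1 at the two-sided 5% level; if not significant, run and
        test a fresh sample 2. Both samples are drawn under the null (mean 0)."""
        for _ in range(2):
            mean = sum(rng.gauss(0.0, 1.0) for _ in range(n)) / n
            z = mean * n ** 0.5  # z-statistic for a known-variance normal sample
            if abs(z) > z_crit:
                return True
        return False

    rng = random.Random(1)
    reps = 4000
    rate = sum(two_stage_rejects(rng, 25) for _ in range(reps)) / reps
    # rate is near 1 - 0.95**2 ≈ 0.0975, well above the nominal 0.05
    ```

    Giving the null hypothesis two chances to be rejected is the simplest way such data-dependent designs inflate the type I error probability.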

  15. Importance of structural stability to success of mourning dove nests

    USGS Publications Warehouse

    Coon, R.A.; Nichols, J.D.; Percival, H.F.

    1981-01-01

    Studies of nest-site selection and nesting habitats often involve a "characterization" of nests and of habitats in which nests are found. Our objective in the present work is to identify nest-site characteristics that are associated with variation in components of Mourning Dove (Zenaida macroura) fitness (e.g. the probability of a nest succeeding), as opposed to simply "characterizing" dove nest sites. If certain nest- site characteristics affect the probability that a nest will succeed, then we suspect that these characteristics will be associated with either concealment (the probability of detection by certain predators) or structural stability (the probability of eggs or entire nests falling to the ground as a result of wind, rain storms, parental activity, etc.). Although other workers agree that structural stability is an important determinant of Mourning Dove nesting success (e.g. McClure 1944: 384; Woolfenden and Rohwer 1969: 59), we are aware of no actual tests of this hypothesis.

  16. Bivariate normal, conditional and rectangular probabilities: A computer program with applications

    NASA Technical Reports Server (NTRS)

    Swaroop, R.; Brownlow, J. D.; Ashworth, G. R.; Winter, W. R.

    1980-01-01

    Some results for bivariate normal distribution analysis are presented. Computer programs for conditional normal probabilities, marginal probabilities, and joint probabilities for rectangular regions are given; routines for computing fractile points and distribution functions are also presented. Some examples from a closed-circuit television experiment are included.
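
    For a bivariate normal, the conditional distribution of Y given X = x is itself normal, with mean μ₂ + ρ(σ₂/σ₁)(x − μ₁) and standard deviation σ₂√(1 − ρ²), so conditional probabilities reduce to the univariate normal CDF (rectangular joint probabilities generally require numerical integration instead). A sketch of the conditional case, not the original program:

    ```python
    import math

    def norm_cdf(x, mu=0.0, sigma=1.0):
        """Univariate normal CDF via the error function."""
        return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

    def conditional_prob_y_le(y, x, mu1, mu2, s1, s2, rho):
        """P(Y <= y | X = x) for a bivariate normal: Y | X = x is normal with
        mean mu2 + rho * (s2 / s1) * (x - mu1) and sd s2 * sqrt(1 - rho^2)."""
        mu = mu2 + rho * (s2 / s1) * (x - mu1)
        sd = s2 * math.sqrt(1.0 - rho ** 2)
        return norm_cdf(y, mu, sd)

    # With rho = 0 the condition is uninformative: conditional equals marginal
    p_indep = conditional_prob_y_le(1.0, x=5.0, mu1=0, mu2=0, s1=1, s2=1, rho=0.0)
    # Positive correlation plus a large observed x shifts the conditional upward,
    # making P(Y <= 1) much smaller
    p_corr = conditional_prob_y_le(1.0, x=5.0, mu1=0, mu2=0, s1=1, s2=1, rho=0.8)
    ```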

  17. The Effects of Phonotactic Probability and Neighborhood Density on Adults' Word Learning in Noisy Conditions

    PubMed Central

    Han, Min Kyung; Storkel, Holly L.; Lee, Jaehoon; Cox, Casey

    2016-01-01

    Purpose: Noisy conditions make auditory processing difficult. This study explores whether noisy conditions influence the effects of phonotactic probability (the likelihood of occurrence of a sound sequence) and neighborhood density (phonological similarity among words) on adults' word learning. Method: Fifty-eight adults learned nonwords varying in phonotactic probability and neighborhood density in either an unfavorable (0-dB signal-to-noise ratio [SNR]) or a favorable (+8-dB SNR) listening condition. Word learning was assessed using a picture naming task by scoring the proportion of phonemes named correctly. Results: The unfavorable 0-dB SNR condition showed a significant interaction between phonotactic probability and neighborhood density in the absence of main effects. In particular, adults learned more words when phonotactic probability and neighborhood density were both low or both high. The +8-dB SNR condition did not show this interaction. These results are inconsistent with those from a prior adult word learning study conducted under quiet listening conditions that showed main effects of word characteristics. Conclusions: As the listening condition worsens, adult word learning benefits from a convergence of phonotactic probability and neighborhood density. Clinical implications are discussed for potential populations who experience difficulty with auditory perception or processing, making them more vulnerable to noise. PMID:27788276

  18. The Effects of Phonotactic Probability and Neighborhood Density on Adults' Word Learning in Noisy Conditions.

    PubMed

    Han, Min Kyung; Storkel, Holly L; Lee, Jaehoon; Cox, Casey

    2016-11-01

    Noisy conditions make auditory processing difficult. This study explores whether noisy conditions influence the effects of phonotactic probability (the likelihood of occurrence of a sound sequence) and neighborhood density (phonological similarity among words) on adults' word learning. Fifty-eight adults learned nonwords varying in phonotactic probability and neighborhood density in either an unfavorable (0-dB signal-to-noise ratio [SNR]) or a favorable (+8-dB SNR) listening condition. Word learning was assessed using a picture naming task by scoring the proportion of phonemes named correctly. The unfavorable 0-dB SNR condition showed a significant interaction between phonotactic probability and neighborhood density in the absence of main effects. In particular, adults learned more words when phonotactic probability and neighborhood density were both low or both high. The +8-dB SNR condition did not show this interaction. These results are inconsistent with those from a prior adult word learning study conducted under quiet listening conditions that showed main effects of word characteristics. As the listening condition worsens, adult word learning benefits from a convergence of phonotactic probability and neighborhood density. Clinical implications are discussed for potential populations who experience difficulty with auditory perception or processing, making them more vulnerable to noise.

  19. Retrodiction for Bayesian multiple-hypothesis/multiple-target tracking in densely cluttered environment

    NASA Astrophysics Data System (ADS)

    Koch, Wolfgang

    1996-05-01

    Sensor data processing in a dense target/dense clutter environment is inevitably confronted with data association conflicts which correspond to the multiple hypothesis character of many modern approaches (MHT: multiple hypothesis tracking). In this paper we analyze the efficiency of retrodictive techniques that generalize standard fixed interval smoothing to MHT applications. 'Delayed estimation' based on retrodiction provides uniquely interpretable and accurate trajectories from ambiguous MHT output if a certain time delay is tolerated. In a Bayesian framework the theoretical background of retrodiction and its intimate relation to Bayesian MHT is sketched. By a simulated example with two closely spaced targets, relatively low detection probabilities, and rather high false return densities, we demonstrate the benefits of retrodiction and quantitatively discuss the achievable track accuracies and the time delays involved for typical radar parameters.
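Retrodiction generalizes fixed-interval smoothing to the multiple-hypothesis setting. As a minimal illustration of the underlying idea only (a single scalar state, no clutter or association ambiguity, hypothetical random-walk dynamics; none of this is the paper's MHT machinery), a Kalman filter followed by a Rauch-Tung-Striebel backward pass shows how later data refine earlier estimates at the cost of delay:

```python
def kalman_rts(zs, q, r, x0=0.0, p0=1.0):
    """Scalar random-walk Kalman filter plus RTS fixed-interval smoothing.

    zs: measurements; q: process-noise variance; r: measurement-noise variance.
    Returns (filtered means, smoothed means).
    """
    xf, pf, xp, pp = [], [], [], []  # filtered and predicted means/variances
    x, p = x0, p0
    for z in zs:
        # predict (random walk: x_k = x_{k-1} + w, w ~ N(0, q))
        x_pred, p_pred = x, p + q
        xp.append(x_pred); pp.append(p_pred)
        # update with the new measurement
        k = p_pred / (p_pred + r)
        x = x_pred + k * (z - x_pred)
        p = (1 - k) * p_pred
        xf.append(x); pf.append(p)
    # backward (RTS) pass: retrodict each state using all later data
    xs, ps = xf[:], pf[:]
    for t in range(len(zs) - 2, -1, -1):
        g = pf[t] / pp[t + 1]
        xs[t] = xf[t] + g * (xs[t + 1] - xp[t + 1])
        ps[t] = pf[t] + g * g * (ps[t + 1] - pp[t + 1])
    return xf, xs
```

On a noisy constant signal, the smoothed estimate at early time steps is pulled toward the consensus of later measurements, which is exactly the "delayed estimation" benefit the abstract describes.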

  20. Cancer immunotherapy by immunosuppression

    PubMed Central

    2010-01-01

    We have previously suggested that the stimulatory effect of a weak immune reaction on tumor growth may be necessary for the growth of incipient tumors. In the present paper, we enlarge upon and extend that idea by collecting evidence in the literature bearing upon this new hypothesis that a growing cancer, whether in man or mouse, is throughout its lifespan, probably growing and progressing because of continued immune stimulation by a weak immune reaction. We also suggest that prolonged immunosuppression might interfere with progression and thus be an aid to therapy. While most of the considerable evidence that supports the hypothesis comes from observations of experimental mouse tumors, there is suggestive evidence that human tumors may behave in much the same way, and as far as we can ascertain, there is no present evidence that necessarily refutes the hypothesis. PMID:21159199

  1. Dynamic test input generation for multiple-fault isolation

    NASA Technical Reports Server (NTRS)

    Schaefer, Phil

    1990-01-01

    Recent work in Causal Reasoning has provided practical techniques for multiple fault diagnosis. These techniques provide a hypothesis/measurement diagnosis cycle. Using probabilistic methods, they choose the best measurements to make, then update fault hypotheses in response. For many applications such as computers and spacecraft, few measurement points may be accessible, or values may change quickly as the system under diagnosis operates. In these cases, a hypothesis/measurement cycle is insufficient. A technique is presented for a hypothesis/test-input/measurement diagnosis cycle. In contrast to generating tests a priori for determining device functionality, it dynamically generates tests in response to current knowledge about fault probabilities. It is shown how the mathematics previously used for measurement specification can be applied to the test input generation process. An example from an efficient implementation called Multi-Purpose Causal (MPC) is presented.

  2. UNIFORMLY MOST POWERFUL BAYESIAN TESTS

    PubMed Central

    Johnson, Valen E.

    2014-01-01

    Uniformly most powerful tests are statistical hypothesis tests that provide the greatest power against a fixed null hypothesis among all tests of a given size. In this article, the notion of uniformly most powerful tests is extended to the Bayesian setting by defining uniformly most powerful Bayesian tests to be tests that maximize the probability that the Bayes factor, in favor of the alternative hypothesis, exceeds a specified threshold. Like their classical counterpart, uniformly most powerful Bayesian tests are most easily defined in one-parameter exponential family models, although extensions outside of this class are possible. The connection between uniformly most powerful tests and uniformly most powerful Bayesian tests can be used to provide an approximate calibration between p-values and Bayes factors. Finally, issues regarding the strong dependence of resulting Bayes factors and p-values on sample size are discussed. PMID:24659829
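For the one-parameter exponential-family case the abstract highlights, a one-sided test of a normal mean with known variance admits a closed-form UMPBT alternative. The sketch below assumes that specific setting (H0: mu = 0 versus a point alternative, gamma the evidence threshold on the Bayes factor); it is an illustration of the construction, not code from the paper:

```python
import math

def umpbt_alternative(sigma, n, gamma):
    """Point alternative mu1 that maximizes P(BF10 > gamma) uniformly in the
    data-generating mean, for a one-sided z test of H0: mu = 0."""
    return sigma * math.sqrt(2.0 * math.log(gamma) / n)

def bayes_factor(xbar, sigma, n, mu1):
    """BF10 for the point hypotheses H1: mu = mu1 vs H0: mu = 0 (known sigma)."""
    return math.exp((n / sigma ** 2) * (mu1 * xbar - mu1 ** 2 / 2.0))
```

By construction the Bayes factor equals gamma exactly when the sample mean equals the UMPBT alternative, which is what ties the Bayesian threshold to a classical rejection region and enables the p-value/Bayes-factor calibration mentioned above.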

  3. Hypothesis Testing as an Act of Rationality

    NASA Astrophysics Data System (ADS)

    Nearing, Grey

    2017-04-01

    Statistical hypothesis testing is ad hoc in two ways. First, setting probabilistic rejection criteria is, as Neyman (1957) put it, an act of will rather than an act of rationality. Second, physical theories like conservation laws do not inherently admit probabilistic predictions, and so we must use what are called epistemic bridge principles to connect model predictions with the actual methods of hypothesis testing. In practice, these bridge principles are likelihood functions, error functions, or performance metrics. I propose that the reason we are faced with these problems is because we have historically failed to account for a fundamental component of basic logic - namely the portion of logic that explains how epistemic states evolve in the presence of empirical data. This component of Cox's (1946) calculus of logic is called information theory (Knuth, 2005), and adding information theory to our hypothetico-deductive account of science yields straightforward solutions to both of the above problems. This also yields a straightforward method for dealing with Popper's (1963) problem of verisimilitude by facilitating a quantitative approach to measuring process isomorphism. In practice, this involves data assimilation. Finally, information theory allows us to reliably bound measures of epistemic uncertainty, thereby avoiding the problem of Bayesian incoherency under misspecified priors (Grünwald, 2006). I therefore propose solutions to four of the fundamental problems inherent in both hypothetico-deductive and/or Bayesian hypothesis testing. - Neyman (1957) Inductive Behavior as a Basic Concept of Philosophy of Science. - Cox (1946) Probability, Frequency and Reasonable Expectation. - Knuth (2005) Lattice Duality: The Origin of Probability and Entropy. - Grünwald (2006) Bayesian Inconsistency under Misspecification. - Popper (1963) Conjectures and Refutations: The Growth of Scientific Knowledge.

  4. Testing Pairwise Association between Spatially Autocorrelated Variables: A New Approach Using Surrogate Lattice Data

    PubMed Central

    Deblauwe, Vincent; Kennel, Pol; Couteron, Pierre

    2012-01-01

    Background: Independence between observations is a standard prerequisite of traditional statistical tests of association. This condition is, however, violated when autocorrelation is present within the data. In the case of variables that are regularly sampled in space (i.e. lattice data or images), such as those provided by remote-sensing or geographical databases, this problem is particularly acute. Because analytic derivation of the null probability distribution of the test statistic (e.g. Pearson's r) is not always possible when autocorrelation is present, we propose instead the use of a Monte Carlo simulation with surrogate data. Methodology/Principal Findings: The null hypothesis that two observed mapped variables are the result of independent pattern generating processes is tested here by generating sets of random image data while preserving the autocorrelation function of the original images. Surrogates are generated by matching the dual-tree complex wavelet spectra (and hence the autocorrelation functions) of white noise images with the spectra of the original images. The generated images can then be used to build the probability distribution function of any statistic of association under the null hypothesis. We demonstrate the validity of a statistical test of association based on these surrogates with both actual and synthetic data and compare it with a corrected parametric test and three existing methods that generate surrogates (randomization, random rotations and shifts, and iterative amplitude adjusted Fourier transform). Type I error control was excellent, even with strong and long-range autocorrelation, which is not the case for alternative methods. Conclusions/Significance: The wavelet-based surrogates are particularly appropriate in cases where autocorrelation appears at all scales or is direction-dependent (anisotropy). We explore the potential of the method for association tests involving a lattice of binary data and discuss its potential for validation of species distribution models. An implementation of the method in Java for the generation of wavelet-based surrogates is available online as supporting material. PMID:23144961
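The wavelet-based surrogate generator itself is beyond a short sketch, but the Monte Carlo logic is the same for the simpler surrogate schemes the paper compares against. The sketch below uses random circular shifts (one of the alternative methods mentioned) on one-dimensional series rather than images: each shift preserves the series' autocorrelation (up to wrap-around) while destroying any genuine cross-association, so the shifted copies supply the null distribution of Pearson's r:

```python
import random

def pearson_r(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return sxy / (sx * sy)

def shift_test(x, y, n_surr=999, seed=1):
    """Two-sided Monte Carlo p-value for |r| using circular-shift surrogates."""
    rng = random.Random(seed)
    obs = abs(pearson_r(x, y))
    hits = 0
    for _ in range(n_surr):
        k = rng.randrange(1, len(y))       # random nonzero shift
        if abs(pearson_r(x, y[k:] + y[:k])) >= obs:
            hits += 1
    return (hits + 1) / (n_surr + 1)       # add-one correction for the observed value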

  5. Isolation and molecular characterization of ERF1, an ethylene response factor gene from durum wheat (Triticum turgidum L. subsp. durum), potentially involved in salt-stress responses.

    PubMed

    Makhloufi, Emna; Yousfi, Fatma-Ezzahra; Marande, William; Mila, Isabelle; Hanana, Mohsen; Bergès, Hélène; Mzid, Rim; Bouzayen, Mondher

    2014-12-01

    As a food crop, wheat is of prime importance for human society. Nevertheless, our understanding of the genetic and molecular mechanisms controlling wheat productivity has been, so far, hampered by the lack of sufficient genomic resources. The present work describes the isolation and characterization of TdERF1, an ERF gene from durum wheat (Triticum turgidum L. subsp. durum). The structural features of TdERF1 supported the hypothesis that it is a novel member of the ERF family in durum wheat and, considering its close similarity to TaERF1 of Triticum aestivum, it probably plays a similar role in mediating responses to environmental stresses. TdERF1 displayed an expression pattern that discriminated between two durum wheat genotypes contrasted with regard to salt-stress tolerance. The high number of cis-regulatory elements related to stress responses present in the TdERF1 promoter and the ability of TdERF1 to regulate the transcription of ethylene and drought-responsive promoters clearly indicated its potential role in mediating plant responses to a wide variety of environmental constraints. TdERF1 was also regulated by abscisic acid, ethylene, auxin, and salicylic acid, suggesting that it may be at the crossroads of multiple hormone signalling pathways. Four TdERF1 allelic variants have been identified in durum wheat genome, all shown to be transcriptionally active. Interestingly, the expression of one allelic form is specific to the tolerant genotype, further supporting the hypothesis that this gene is probably associated with the susceptibility/tolerance mechanism to salt stress. In this regard, the TdERF1 gene may provide a discriminating marker between tolerant and sensitive wheat varieties. © The Author 2014. Published by Oxford University Press on behalf of the Society for Experimental Biology. All rights reserved. For permissions, please email: journals.permissions@oup.com.

  6. Rare targets are less susceptible to attention capture once detection has begun.

    PubMed

    Hon, Nicholas; Ng, Gavin; Chan, Gerald

    2016-04-01

    Rare or low probability targets are detected more slowly and/or less accurately than higher probability counterparts. Various proposals have implicated perceptual and response-based processes in this deficit. Recent evidence, however, suggests that it is attentional in nature, with low probability targets requiring more attentional resources than high probability ones to detect. This difference in attentional requirements, in turn, suggests the possibility that low and high probability targets may have different susceptibilities to attention capture, which is also known to be resource-dependent. Supporting this hypothesis, we found that, once attentional resources have begun to be engaged by detection processes, low, but not high, probability targets have a reduced susceptibility to capture. Our findings speak to several issues. First, they indicate that the likelihood of attention capture occurring when a given task-relevant stimulus is being processed is dependent, to some extent, on how said stimulus is represented within mental task sets. Second, they provide added support for the idea that the behavioural deficit associated with low probability targets is attention-based. Finally, the current data point to reduced top-down biasing of target templates as a likely mechanism underlying the attentional locus of the deficit in question.

  7. Differential effects of vitamins E and C and carotenoids on growth, resistance to oxidative stress, fledging success and plumage colouration in wild great tits.

    PubMed

    Marri, Viviana; Richner, Heinz

    2014-05-01

    Oxidative stress is the imbalance between the production of reactive species and antioxidants, which causes damage to lipids, proteins and DNA. Antioxidants, like vitamins and carotenoids, can limit oxidative damage and can therefore regulate the trade-off between growth, which is a period of high reactive species production, and self-maintenance. However, the role of carotenoids as antioxidants in vivo has been debated, and it has been suggested that carotenoid-based signals indicate the availability of non-pigmentary antioxidants (e.g. vitamins) that protect carotenoids from oxidation, known as the 'protection hypothesis'. To evaluate the importance of vitamins versus carotenoids as antioxidants during growth and to test the protection hypothesis, we supplemented nestling great tits, Parus major, 3, 5 and 7 days after hatching with a single dose of carotenoids and/or vitamins in a 2×2 full-factorial design. We subsequently measured body condition, antioxidant capacity, oxidative damage, fledging success and plumage reflectance. Vitamins enhanced antioxidant capacity, but did not affect oxidative damage. Vitamin-treated nestlings had higher growth rates and higher probability of fledging. In contrast, carotenoids did not affect any of these traits. Furthermore, carotenoid-based colouration increased over the breeding season in nestlings that received vitamins only. This study shows that vitamins are limiting for growth rate and fledging success, and suggests that vitamins could regulate the trade-off between growth and self-maintenance in favour of the former. Moreover, our results are consistent with the idea that carotenoids are minor antioxidants in birds, but they do not support the protection hypothesis.

  8. Complexity, information loss, and model building: from neuro- to cognitive dynamics

    NASA Astrophysics Data System (ADS)

    Arecchi, F. Tito

    2007-06-01

    A scientific problem described within a given code is mapped to a corresponding computational problem. We call (algorithmic) complexity the bit length of the shortest instruction which solves the problem. Deterministic chaos in general affects dynamical systems, making the corresponding problem experimentally and computationally heavy, since one must reset the initial conditions at a rate higher than that of information loss (Kolmogorov entropy). One can control chaos by adding to the system new degrees of freedom (information swapping: information lost by chaos is replaced by that arising from the new degrees of freedom). This implies a change of code, or a new augmented model. Within a single code, changing hypotheses is equivalent to fixing different sets of control parameters, each with a different a-priori probability, to be then confirmed and transformed to an a-posteriori probability via Bayes' theorem. Sequential application of Bayes' rule is nothing else than the Darwinian strategy in evolutionary biology. The sequence is a steepest-ascent algorithm, which stops once maximum probability has been reached. At this point the hypothesis exploration stops. By changing code (and hence the set of relevant variables) one can start again to formulate new classes of hypotheses. We call semantic complexity the number of accessible scientific codes, or models, that describe a situation. It is however a fuzzy concept, in so far as this number changes due to interaction of the operator with the system under investigation. These considerations are illustrated with reference to a cognitive task, starting from synchronization of neuron arrays in a perceptual area and tracing the putative path toward a model building.
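The sequential application of Bayes' rule described above is easy to make concrete: over a fixed set of competing hypotheses, each posterior becomes the prior for the next observation, and mass accumulates on the hypothesis that best predicts the data. A minimal sketch (the likelihood values are illustrative, not from the paper):

```python
def bayes_update(prior, likelihoods):
    """One Bayes step: posterior proportional to prior times likelihood,
    normalized over the competing hypotheses."""
    post = [p * l for p, l in zip(prior, likelihoods)]
    z = sum(post)
    return [p / z for p in post]

# Sequential application: each posterior becomes the next prior.
prior = [0.5, 0.5]                       # two hypotheses, initially equiprobable
for lik in ([0.8, 0.2], [0.7, 0.4], [0.9, 0.3]):
    prior = bayes_update(prior, lik)     # hypothesis 0 keeps fitting the data better
```

After a few rounds the distribution concentrates on hypothesis 0, which is the "steepest ascent" behaviour the abstract likens to a Darwinian selection strategy.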

  9. Solar Wind Strahl Observations and Their Implication to the Core-Halo Formation due to Scattering

    NASA Technical Reports Server (NTRS)

    Vinas, Adolfo F.

    2011-01-01

    A study of the kinetic properties of the strahl electron velocity distribution functions (VDFs) in the solar wind is presented. This study focuses on the mechanisms that control and regulate the electron VDFs and the stability of the strahl electrons in the solar wind, mechanisms that are not yet well understood. Various parameters are investigated, such as the strahl-electron density, temperature anisotropy, and electron heat flux. These parameters are used to investigate the stability of the strahl population. The analysis checks whether the strahl electrons are constrained by some instability (e.g., the whistler or KAW instabilities) or are maintained by other types of processes. The electron heat flux and temperature anisotropy are determined by modeling the 3D VDFs, from which the moment properties of the various populations are obtained. The results of this study have profound implications for the current hypothesis about the probable formation of the solar wind halo electrons produced from the scattering of the strahl population. This hypothesis is strengthened by direct observations of the strahl electrons being scattered into the core-halo in an isolated event. The observation implies that the scattering of the strahl is not a continuous process but occurs in bursts in regions where conditions for wave growth providing the scattering are optimal. Sometimes observations indicate that the strahl component is anisotropic (T⊥/T∥ ≈ 2). This provides a possible free-energy source for the excitation of whistler waves as a possible scattering mechanism; however, this condition is not always observed. The study is based on high-time-resolution data from the Cluster/PEACE electron spectrometer.

  10. Beginning Bayes

    ERIC Educational Resources Information Center

    Erickson, Tim

    2017-01-01

    Understanding a Bayesian perspective demands comfort with conditional probability and with probabilities that appear to change as we acquire additional information. This paper suggests a simple context in conditional probability that helps develop the understanding students would need for a successful introduction to Bayesian reasoning.
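A standard introductory context of the kind this paper targets (and that the head of this collection mentions explicitly) is the interpretation of a medical screening test: the probability of disease given a positive result follows from Bayes' theorem applied to prevalence, sensitivity, and specificity. A minimal sketch with illustrative numbers:

```python
def posterior_positive(prevalence, sensitivity, specificity):
    """P(disease | positive test) via Bayes' theorem.

    True positives come from the diseased fraction, false positives
    from the healthy fraction; the posterior is their ratio's share.
    """
    true_pos = prevalence * sensitivity
    false_pos = (1 - prevalence) * (1 - specificity)
    return true_pos / (true_pos + false_pos)
```

With a 1% prevalence, 90% sensitivity, and 95% specificity, the posterior is only about 15%, the classic surprise that motivates teaching conditional probability before Bayesian reasoning.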

  11. The evolution of intelligence in mammalian carnivores.

    PubMed

    Holekamp, Kay E; Benson-Amram, Sarah

    2017-06-06

    Although intelligence should theoretically evolve to help animals solve specific types of problems posed by the environment, it is unclear which environmental challenges favour enhanced cognition, or how general intelligence evolves along with domain-specific cognitive abilities. The social intelligence hypothesis posits that big brains and great intelligence have evolved to cope with the labile behaviour of group mates. We have exploited the remarkable convergence in social complexity between cercopithecine primates and spotted hyaenas to test predictions of the social intelligence hypothesis in regard to both cognition and brain size. Behavioural data indicate that there has been considerable convergence between primates and hyaenas with respect to their social cognitive abilities. Moreover, compared with other hyaena species, spotted hyaenas have larger brains and expanded frontal cortex, as predicted by the social intelligence hypothesis. However, broader comparative study suggests that domain-general intelligence in carnivores probably did not evolve in response to selection pressures imposed specifically in the social domain. The cognitive buffer hypothesis, which suggests that general intelligence evolves to help animals cope with novel or changing environments, appears to offer a more robust explanation for general intelligence in carnivores than any hypothesis invoking selection pressures imposed strictly by sociality or foraging demands.

  12. The evolution of intelligence in mammalian carnivores

    PubMed Central

    Benson-Amram, Sarah

    2017-01-01

    Although intelligence should theoretically evolve to help animals solve specific types of problems posed by the environment, it is unclear which environmental challenges favour enhanced cognition, or how general intelligence evolves along with domain-specific cognitive abilities. The social intelligence hypothesis posits that big brains and great intelligence have evolved to cope with the labile behaviour of group mates. We have exploited the remarkable convergence in social complexity between cercopithecine primates and spotted hyaenas to test predictions of the social intelligence hypothesis in regard to both cognition and brain size. Behavioural data indicate that there has been considerable convergence between primates and hyaenas with respect to their social cognitive abilities. Moreover, compared with other hyaena species, spotted hyaenas have larger brains and expanded frontal cortex, as predicted by the social intelligence hypothesis. However, broader comparative study suggests that domain-general intelligence in carnivores probably did not evolve in response to selection pressures imposed specifically in the social domain. The cognitive buffer hypothesis, which suggests that general intelligence evolves to help animals cope with novel or changing environments, appears to offer a more robust explanation for general intelligence in carnivores than any hypothesis invoking selection pressures imposed strictly by sociality or foraging demands. PMID:28479979

  13. Fatigue Failure of External Hexagon Connections on Cemented Implant-Supported Crowns.

    PubMed

    Malta Barbosa, João; Navarro da Rocha, Daniel; Hirata, Ronaldo; Freitas, Gileade; Bonfante, Estevam A; Coelho, Paulo G

    2018-01-17

    To evaluate the probability of survival and failure modes of different external hexagon connection systems restored with anterior cement-retained single-unit crowns. The postulated null hypothesis was that there would be no differences under accelerated life testing. Fifty-four external hexagon dental implants (∼4 mm diameter) were used for single cement-retained crown replacement and divided into 3 groups: (3i) Full OSSEOTITE, Biomet 3i (n = 18); (OL) OEX P4, Osseolife Implants (n = 18); and (IL) Unihex, Intra-Lock International (n = 18). Abutments were torqued to the implants, and maxillary central incisor crowns were cemented and subjected to step-stress-accelerated life testing in water. Use-level probability Weibull curves and probability of survival for a mission of 100,000 cycles at 200 N (95% 2-sided confidence intervals) were calculated. Stereo and scanning electron microscopes were used for failure inspection. The beta values for 3i, OL, and IL (1.60, 1.69, and 1.23, respectively) indicated that fatigue accelerated the failure of the 3 groups. Reliability for the 3i and OL (41% and 68%, respectively) was not different between each other, but both were significantly lower than IL group (98%). Abutment screw fracture was the failure mode consistently observed in all groups. Because the reliability was significantly different between the 3 groups, our postulated null hypothesis was rejected.
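The reliability figures quoted come from a Weibull life model, under which a shape parameter beta > 1 (as the 1.23-1.69 values here) indicates fatigue-driven, wear-out failure. Given beta and a characteristic life eta, mission survival probability follows directly; the record does not report eta, so the values in the sketch below are hypothetical:

```python
import math

def weibull_reliability(cycles, beta, eta):
    """Probability a specimen survives `cycles` under a two-parameter
    Weibull life model with shape beta and characteristic life eta."""
    return math.exp(-(cycles / eta) ** beta)
```

At `cycles == eta` the reliability is exp(-1) ≈ 0.368 for any beta; for missions shorter than eta, a larger beta yields higher reliability, which is why shape and scale must be reported together.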

  14. Know the risk, take the win: how executive functions and probability processing influence advantageous decision making under risk conditions.

    PubMed

    Brand, Matthias; Schiebener, Johannes; Pertl, Marie-Theres; Delazer, Margarete

    2014-01-01

    Recent models on decision making under risk conditions have suggested that numerical abilities are important ingredients of advantageous decision-making performance, but empirical evidence is still limited. The results of our first study show that logical reasoning and basic mental calculation capacities predict ratio processing and that ratio processing predicts decision making under risk. In the second study, logical reasoning together with executive functions predicted probability processing (numeracy and probability knowledge), and probability processing predicted decision making under risk. These findings suggest that increasing an individual's understanding of ratios and probabilities should lead to more advantageous decisions under risk conditions.

  15. Physiopathological Hypothesis of Cellulite

    PubMed Central

    de Godoy, José Maria Pereira; de Godoy, Maria de Fátima Guerreiro

    2009-01-01

    A series of questions are asked concerning this condition, including its name, the consensus about the histopathological findings, the physiological hypothesis, and the treatment of the disease. We established a hypothesis for cellulite and confirmed that the clinical response is compatible with this hypothesis. Hence this novel approach brings a modern physiological concept with a physiopathologic basis and clinical proof of the hypothesis. We emphasize that the choice of patient, correct diagnosis of cellulite, and the technique employed are fundamental to success. PMID:19756187

  16. Incidence of Lower Respiratory Tract Infections and Atopic Conditions in Boys and Young Male Adults: Royal College of General Practitioners Research and Surveillance Centre Annual Report 2015-2016.

    PubMed

    de Lusignan, Simon; Correa, Ana; Pebody, Richard; Yonova, Ivelina; Smith, Gillian; Byford, Rachel; Pathirannehelage, Sameera Rankiri; McGee, Christopher; Elliot, Alex J; Hriskova, Mariya; Ferreira, Filipa Im; Rafi, Imran; Jones, Simon

    2018-04-30

    The Royal College of General Practitioners Research and Surveillance Centre comprises more than 150 general practices, with a combined population of more than 1.5 million, contributing to UK and European public health surveillance and research. The aim of this paper was to report gender differences in the presentation of infectious and respiratory conditions in children and young adults. Disease incidence data were used to test the hypothesis that boys up to puberty present more with lower respiratory tract infection (LRTI) and asthma. Incidence rates were reported for infectious conditions in children and young adults by gender. We controlled for ethnicity, deprivation, and consultation rates. We report odds ratios (OR) with 95% CI, P values, and probability of presenting. Boys presented more with LRTI, largely due to acute bronchitis. The OR of males consulting was greater across the youngest 3 age bands (OR 1.59, 95% CI 1.35-1.87; OR 1.13, 95% CI 1.05-1.21; OR 1.20, 95% CI 1.09-1.32). Allergic rhinitis and asthma had a higher OR of presenting in boys aged 5 to 14 years (OR 1.52, 95% CI 1.37-1.68; OR 1.31, 95% CI 1.17-1.48). Upper respiratory tract infection (URTI) and urinary tract infection (UTI) had lower odds of presenting in boys, especially those older than 15 years. The probability of presenting showed different patterns for LRTI, URTI, and atopic conditions. Boys younger than 15 years have greater odds of presenting with LRTI and atopic conditions, whereas girls may present more with URTI and UTI. These differences may provide insights into disease mechanisms and for health service planning. ©Simon de Lusignan, Ana Correa, Richard Pebody, Ivelina Yonova, Gillian Smith, Rachel Byford, Sameera Rankiri Pathirannehelage, Christopher McGee, Alex J. Elliot, Mariya Hriskova, Filipa IM Ferreira, Imran Rafi, Simon Jones. Originally published in JMIR Public Health and Surveillance (http://publichealth.jmir.org), 30.04.2018.
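The odds ratios and 95% confidence intervals reported above follow the standard 2x2-table calculation (log-OR with a normal approximation for its standard error). A sketch with illustrative counts, not the study's data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """OR and approximate 95% CI from a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    odds_ratio = (a * d) / (b * c)
    se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of ln(OR)
    lower = math.exp(math.log(odds_ratio) - z * se_log_or)
    upper = math.exp(math.log(odds_ratio) + z * se_log_or)
    return odds_ratio, lower, upper
```

An OR whose interval excludes 1 corresponds to the statistically significant gender differences the study reports, such as the OR 1.59 (95% CI 1.35-1.87) for LRTI in the youngest boys.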

  17. Contact networks and the study of contagion.

    PubMed

    Hartigan, P M

    1980-09-01

    The contact network among individuals in a patient group and in a control group is examined. The probability of knowing another person is modelled with parameters assigned to various factors, such as age, sex or disease, which may influence this probability. Standard likelihood techniques are used to estimate the parameters and to test the significance of the hypotheses, in particular the hypothesis of contagion, generated in the modelling process. The method is illustrated in a study of the Yale student body, in which infectious mononucleosis patients of the opposite sex are shown to know each other significantly more frequently than expected.

  18. Detoxification of mercury, cadmium, and lead in Klebsiella aerogenes NCTC 418 growing in continuous culture

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aiking, H.; Govers, H.; van 'T Riet, J.

    1985-11-01

    Klebsiella aerogenes NCTC 418 growing in the presence of cadmium under glucose-, sulfate-, or phosphate-limited conditions in continuous culture exhibited sulfide formation and Pi accumulation as the only demonstrable detoxification mechanisms. In the presence of mercury under similar conditions only HgS formation could be confirmed, by an increased sensitivity to mercury under sulfate-limited conditions, among others. The fact that the cells were most sensitive to cadmium under conditions of phosphate limitation and most sensitive to mercury under conditions of sulfate limitation led to the hypothesis that these inorganic detoxification mechanisms generally depended on a kind of facilitated precipitation. The process was coined thus because heavy metals were probably accumulated and precipitated near the cell perimeter due to the relatively high local concentrations of sulfide and phosphate there. Depending on the growth-limiting nutrient, mercury proved to be 25-fold (phosphate limitation), 75-fold (glycerol limitation), or 150-fold (sulfate limitation) more toxic than cadmium to this organism. In the presence of lead, PbS formation was suggested. Since no other detoxification mechanisms were detected (for example, rendering heavy metal ions innocuous as metallo-organic compounds), it was concluded that formation of heavy metal precipitates is crucially important to this organism. In addition, it was observed that several components of a defined mineral medium were able to reduce mercuric ions to elemental mercury. This abiotic mercury volatilization was studied in detail, and its general and environmental implications are discussed.

  19. Better Child Support Enforcement: Can It Reduce Teenage Premarital Childbearing?

    ERIC Educational Resources Information Center

    Plotnick, Robert D.; Garfinkel, Irwin; McLanahan, Sara S.; Ku, Inhoe

    2004-01-01

    Stricter child support enforcement may reduce unwed childbearing by raising the costs of fatherhood. The authors investigate this hypothesis using a sample of young women from the National Longitudinal Survey of Youth, to which they add information on state child support enforcement. Models of the probability of a teenage premarital birth and of…

  20. Responding to Crimes of Violence against Women: Gender Differences versus Organizational Imperatives.

    ERIC Educational Resources Information Center

    Buzawa, Eve; And Others

    1995-01-01

    Reports results of a study testing the hypothesis that an inverse relationship exists between level of intimacy between perpetrator and victim in incidents of violence and likelihood of arrest. Notwithstanding relevant elements of probable cause, such as the presence of weapons, witnesses, injury, and the offender, results supported the…

  1. Changing the Subject: The Place of Revisions in Grammatical Development

    ERIC Educational Resources Information Center

    Rispoli, Matthew

    2018-01-01

    Purpose: This article focuses on toddlers' revisions of the sentence subject and tests the hypothesis that subject diversity (i.e., the number of different subjects produced) increases the probability of subject revision. Method: One-hour language samples were collected from 61 children (32 girls) at 27 months. Spontaneously produced, active…

  2. Birth Order and Sibling Sex Ratio in Homosexual Male Adolescents and Probably Prehomosexual Feminine Boys.

    ERIC Educational Resources Information Center

    Blanchard, Ray; And Others

    1995-01-01

    Examined the hypothesis that male homosexuals have a greater than average proportion of male siblings and a later than average birth order, by comparing a group of prehomosexual boys (individuals exhibiting cross-gender behaviors) and homosexual adolescents with a control group. Both predicted results were confirmed. (MDM)

  3. Disadvantages of the Horsfall-Barratt Scale for estimating severity of citrus canker

    USDA-ARS?s Scientific Manuscript database

    Direct visual estimation of disease severity to the nearest percent was compared to using the Horsfall-Barratt (H-B) scale. Data from a simulation model designed to sample two diseased populations were used to investigate the probability that the two methods would reject a null hypothesis (H0) using a t-...

  4. Is Variability in Mate Choice Similar for Intelligence and Personality Traits? Testing a Hypothesis about the Evolutionary Genetics of Personality

    ERIC Educational Resources Information Center

    Stone, Emily A.; Shackelford, Todd K.; Buss, David M.

    2012-01-01

    This study tests the hypothesis presented by Penke, Denissen, and Miller (2007a) that condition-dependent traits, including intelligence, attractiveness, and health, are universally and uniformly preferred as characteristics in a mate relative to traits that are less indicative of condition, including personality traits. We analyzed…

  5. Sex-Typical Play: Masculinization/Defeminization in Girls with an Autism Spectrum Condition

    ERIC Educational Resources Information Center

    Knickmeyer, Rebecca C.; Wheelwright, Sally; Baron-Cohen, Simon B.

    2008-01-01

    We tested the hypothesis that prenatal masculinization of the brain by androgens increases risk of developing an autism spectrum condition (ASC). Sex-typical play was measured in n = 66 children diagnosed with an ASC and n = 55 typically developing age-matched controls. Consistent with the hypothesis, girls with autism did not show the…

  6. Heuristic analogy in Ars Conjectandi: From Archimedes' De Circuli Dimensione to Bernoulli's theorem.

    PubMed

    Campos, Daniel G

    2018-02-01

    This article investigates the way in which Jacob Bernoulli proved the main mathematical theorem that undergirds his art of conjecturing: the theorem that founded, historically, the field of mathematical probability. It aims to contribute a perspective into the question of problem-solving methods in mathematics while also contributing to the comprehension of the historical development of mathematical probability. It argues that Bernoulli proved his theorem by a process of mathematical experimentation in which the central heuristic strategy was analogy. In this context, the analogy functioned as an experimental hypothesis. The article expounds, first, Bernoulli's reasoning for proving his theorem, describing it as a process of experimentation in which hypothesis-making is crucial. Next, it investigates the analogy between his reasoning and Archimedes' approximation of the value of π, by clarifying both Archimedes' own experimental approach to the said approximation and its heuristic influence on Bernoulli's problem-solving strategy. The discussion includes some general considerations about analogy as a heuristic technique to make experimental hypotheses in mathematics. Copyright © 2018 Elsevier Ltd. All rights reserved.

  7. The Pavlovian analysis of instrumental conditioning.

    PubMed

    Gormezano, I; Tait, R W

    1976-01-01

    An account was given of the development within the Russian literature of a uniprocess formulation of classical and instrumental conditioning, known as the bidirectional conditioning hypothesis. The hypothesis purports to offer a single set of Pavlovian principles to account for both paradigms, based upon a neural model which assumes that bidirectional (forward and backward) connections are formed in both classical and instrumental conditioning situations. In instrumental conditioning, the bidirectional connections are hypothesized to be simply more complex than those in classical conditioning, and any differences in empirical functions are presumed to lie not in a difference in mechanism, but in the strength of the forward and backward connections. Although bidirectional connections are assumed to develop in instrumental conditioning, the experimental investigation of the bidirectional conditioning hypothesis has been essentially restricted to the classical conditioning operations of pairing two CSs (sensory preconditioning training), a US followed by a CS (backward conditioning training), and two USs. However, the paradigm involving the pairing of two USs, because of theoretical and analytical considerations, is the one most commonly employed by Russian investigators. The results of an initial experiment involving the pairing of two USs, and reference to the results of a more extensive investigation, lead us tentatively to question the validity of the bidirectional conditioning account of instrumental conditioning.

  8. Phonotactics, Neighborhood Activation, and Lexical Access for Spoken Words

    PubMed Central

    Vitevitch, Michael S.; Luce, Paul A.; Pisoni, David B.; Auer, Edward T.

    2012-01-01

    Probabilistic phonotactics refers to the relative frequencies of segments and sequences of segments in spoken words. Neighborhood density refers to the number of words that are phonologically similar to a given word. Despite a positive correlation between phonotactic probability and neighborhood density, nonsense words with high probability segments and sequences are responded to more quickly than nonsense words with low probability segments and sequences, whereas real words occurring in dense similarity neighborhoods are responded to more slowly than real words occurring in sparse similarity neighborhoods. This contradiction may be resolved by hypothesizing that effects of probabilistic phonotactics have a sublexical focus and that effects of similarity neighborhood density have a lexical focus. The implications of this hypothesis for models of spoken word recognition are discussed. PMID:10433774

  9. The slavery hypothesis for hypertension among African Americans: the historical evidence.

    PubMed Central

    Curtin, P D

    1992-01-01

    The slavery hypothesis for hypertension has stated that the high blood pressures sometimes measured in African Americans are caused by one or more of these conditions: first, salt deficiency in the parts of Africa that supplied slaves for the Americas; second, the trauma of the slave trade itself; third, conditions of slavery in the United States. A review of the historical evidence shows that there was no salt deficiency in those parts of Africa, nor do present-day West Africans have a high incidence of hypertension. Historical evidence does not support the hypothesis that deaths aboard slave ships were caused mainly by conditions that might be conducive to hypertension, such as salt-depleting diseases. Finally, the hypothesis has depended heavily on evidence from the West Indies, which is not relevant for the United States. There is no evidence that diet or the resulting patterns of disease and demography among slaves in the American South were significantly different from those of other poor southerners. PMID:1456349

  10. Evidence supporting the match/mismatch hypothesis of psychiatric disorders.

    PubMed

    Santarelli, Sara; Lesuis, Sylvie L; Wang, Xiao-Dong; Wagner, Klaus V; Hartmann, Jakob; Labermaier, Christiana; Scharf, Sebastian H; Müller, Marianne B; Holsboer, Florian; Schmidt, Mathias V

    2014-06-01

    Chronic stress is one of the predominant environmental risk factors for a number of psychiatric disorders, particularly for major depression. Different hypotheses have been formulated to address the interaction between early and adult chronic stress in psychiatric disease vulnerability. The match/mismatch hypothesis of psychiatric disease states that the early life environment shapes coping strategies in a manner that enables individuals to optimally face similar environments later in life. We tested this hypothesis in female Balb/c mice that underwent either stress or enrichment early in life and were in adulthood further subdivided into single- or group-housed conditions, in order to provide aversive or positive adult environments, respectively. We studied the effects of the environmental manipulation on anxiety-like, depressive-like and sociability behaviors and gene expression profiles. We show that continuous exposure to adverse environments (matched condition) does not necessarily result in a phenotype opposite to that produced by a continuous supportive environment (matched condition). Rather, animals with mismatched environmental conditions differed from animals with matched environments in anxiety-like, social and depressive-like phenotypes. These results further support the match/mismatch hypothesis and illustrate how mild or moderate aversive conditions during development can shape an individual to be optimally adapted to similar conditions later in life. Copyright © 2014 Elsevier B.V. and ECNP. All rights reserved.

  11. An Inverse Problem for a Class of Conditional Probability Measure-Dependent Evolution Equations

    PubMed Central

    Mirzaev, Inom; Byrne, Erin C.; Bortz, David M.

    2016-01-01

    We investigate the inverse problem of identifying a conditional probability measure in measure-dependent evolution equations arising in size-structured population modeling. We formulate the inverse problem as a least squares problem for the probability measure estimation. Using the Prohorov metric framework, we prove existence and consistency of the least squares estimates and outline a discretization scheme for approximating a conditional probability measure. For this scheme, we prove general method stability. The work is motivated by Partial Differential Equation (PDE) models of flocculation for which the shape of the post-fragmentation conditional probability measure greatly impacts the solution dynamics. To illustrate our methodology, we apply the theory to a particular PDE model that arises in the study of population dynamics for flocculating bacterial aggregates in suspension, and provide numerical evidence for the utility of the approach. PMID:28316360

  12. High probability neurotransmitter release sites represent an energy efficient design

    PubMed Central

    Lu, Zhongmin; Chouhan, Amit K.; Borycz, Jolanta A.; Lu, Zhiyuan; Rossano, Adam J; Brain, Keith L.; Zhou, You; Meinertzhagen, Ian A.; Macleod, Gregory T.

    2016-01-01

    Nerve terminals contain multiple sites specialized for the release of neurotransmitters. Release usually occurs with low probability, a design thought to confer many advantages. High probability release sites are not uncommon but their advantages are not well understood. Here we test the hypothesis that high probability release sites represent an energy efficient design. We examined release site probabilities and energy efficiency at the terminals of two glutamatergic motor neurons synapsing on the same muscle fiber in Drosophila larvae. Through electrophysiological and ultrastructural measurements we calculated release site probabilities to differ considerably between terminals (0.33 vs. 0.11). We estimated the energy required to release and recycle glutamate from the same measurements. The energy required to remove calcium and sodium ions subsequent to nerve excitation was estimated through microfluorimetric and morphological measurements. We calculated energy efficiency as the number of glutamate molecules released per ATP molecule hydrolyzed, and high probability release site terminals were found to be more efficient (0.13 vs. 0.06). Our analytical model indicates that energy efficiency is optimal (~0.15) at high release site probabilities (~0.76). As limitations in energy supply constrain neural function, high probability release sites might ameliorate such constraints by demanding less energy. Energy efficiency can be viewed as one aspect of nerve terminal function, in balance with others, because high efficiency terminals depress significantly during episodic bursts of activity. PMID:27593375

  13. The relation between proactive environmental strategies and competitive advantage

    NASA Astrophysics Data System (ADS)

    Butnariu, A.; Avasilcăi, S.

    2015-11-01

    There are two distinct orientations of environmental management that companies may adopt: the compliance model and the strategic model. The strategic model treats environmental expenses as investments that will lead to competitive advantage for the company. Nevertheless, there are few scientific works that demonstrate the relation between corporate environmental investments and competitive advantage. Therefore, to clarify the profound implications of environmental investments, in the first stage of our research we proposed the hypothesis that environmental investments lead to competitive advantage by creating capabilities that mediate this relation. In the second stage we tested this hypothesis using the survey research method. A questionnaire was sent to managers in the Romanian textile industry, and 109 answers were received. The data were analysed using the linear multiple regression method, and the results confirm our hypothesis.

  14. SU-F-T-683: Cancer Stem Cell Hypothesis and Radiation Treatments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fourkal, E

    Purpose: The tumor control probability (TCP) in radiation therapy allows different radiation treatments to be compared by calculating the probability that a prescribed dose of radiation eradicates or controls the tumor. In the conventional approach, all cancer cells can divide an unlimited number of times, and tumor control often means eradicating every malignant cell with radiation. In recent years, however, there is a mounting consensus that in a given tumor volume there is a sub-population of cells, known as cancer stem cells (CSCs), that are responsible for tumor initiation and growth. Other, or progenitor, cancer cells can only divide a limited number of times. This entails that only cancer stem cells may need to be eliminated in order to control the tumor. Thus one may define TCP as the probability of eliminating CSCs for the given dose of radiation. Methods: Using stochastic methods, specifically birth-and-death Markov processes, an infinite system of equations is set up for the probabilities of having m cancer stem cells at time t after the start of radiation. The TCP is calculated as the probability of no cancer stem cells surviving the radiation. Two scenarios are studied. In the first, the TCP is calculated for a unidirectional case in which a CSC gives birth to another CSC or a progenitor cell. In the second, a bidirectional model is studied in which a progenitor cell can also give rise to a CSC. Results: The calculations show that the TCP for CSCs depends on whether one adopts the unidirectional or the bidirectional conversion model. The bidirectional model shows significantly lower TCP values for the given dose delivered to the tumor. Conclusion: Incorporating the CSC hypothesis into TCP modeling may notably influence the dose prescription as well as the expected TCP after radiation treatments.
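    The birth-and-death formulation described above can be illustrated with a small Monte Carlo sketch. All rates below are hypothetical illustrations, not fitted radiobiological parameters: TCP is estimated as the probability that the CSC count reaches zero before the population escapes past a cap.

```python
import random

# Monte Carlo sketch: TCP as the extinction probability of cancer stem
# cells (CSCs) modelled as a linear birth-death process. The rates and
# initial count are hypothetical, chosen only for illustration.
def tcp_birth_death(n0=3, birth=0.4, death=0.6, cap=100, runs=4000, seed=7):
    """Estimate P(CSC count hits 0 before growing past `cap`)."""
    rng = random.Random(seed)
    p_birth = birth / (birth + death)  # jump-chain probability of a birth
    extinct = 0
    for _ in range(runs):
        n = n0
        while 0 < n < cap:
            n += 1 if rng.random() < p_birth else -1
        extinct += (n == 0)
    return extinct / runs

# Radiation-dominated kinetics (death > birth): extinction is certain.
print(tcp_birth_death(birth=0.4, death=0.6))  # ~1.0
# Supercritical growth: the analytic extinction probability is
# (death/birth)**n0, about 0.296 for these values.
print(tcp_birth_death(birth=0.6, death=0.4))  # ~0.3
```

    The cap plays the role of "tumor escapes control"; by the gambler's-ruin formula the chance of later extinction from the cap is negligible, so the estimate approximates the true extinction probability.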

  15. The origins of levels-of-processing effects in a conceptual test: evidence for automatic influences of memory from the process-dissociation procedure.

    PubMed

    Bergerbest, Dafna; Goshen-Gottstein, Yonatan

    2002-12-01

    In three experiments, we explored automatic influences of memory in a conceptual memory task, as affected by a levels-of-processing (LoP) manipulation. We also explored the origins of the LoP effect by examining whether the effect emerged only when participants in the shallow condition truncated the perceptual processing (the lexical-processing hypothesis) or even when the entire word was encoded in this condition (the conceptual-processing hypothesis). Using the process-dissociation procedure and an implicit association-generation task, we found that the deep encoding condition yielded higher estimates of automatic influences than the shallow condition. In support of the conceptual processing hypothesis, the LoP effect was found even when the shallow task did not lead to truncated processing of the lexical units. We suggest that encoding for meaning is a prerequisite for automatic processing on conceptual tests of memory.

  16. An experimental re-examination of the inferential confusion hypothesis of obsessive-compulsive doubt.

    PubMed

    Gangemi, Amelia; Mancini, Francesco; Dar, Reuven

    2015-09-01

    The inferential confusion hypothesis postulates that obsessive doubt is perpetuated by a subjective form of reasoning characterized primarily by a distrust of reality and an overreliance on imagined possibilities. However, experimental evidence for this hypothesis may be compromised by a potential confound between type of information (reality vs. possibility) and its valence (danger vs. safety). In the present study we aimed to untangle this potential confound. Forty OCD and 40 non-clinical participants underwent two versions of the Inferential Processes Task (Aardema, F., et al. (2009). The quantification of doubt in obsessive-compulsive disorder. International Journal of Cognitive Therapy, 2, 188-205). In the original version, the reality-based information is congruent with the safety hypothesis, whereas the possibility-based information is congruent with the danger hypothesis. In the modified version incorporated in the present study, the reality-based information is congruent with the danger hypothesis, whereas the possibility-based information is congruent with the safety hypothesis. Our findings did not support the inferential confusion hypothesis: both OCD and control participants changed their estimations of the probability of unwanted events based on the type of information they received (whether it conveyed danger or safety) regardless of whether it was framed as reality or possibility. The design of the present study does not lend itself to examining alternative explanations for the persistence of doubt in OCD. The hypothesized inferential confusion in OCD requires further validation. It is particularly important to demonstrate that findings do not reflect a prudential reasoning strategy. Copyright © 2015 Elsevier Ltd. All rights reserved.

  17. Re-evaluation of the reward comparison hypothesis for alcohol abuse.

    PubMed

    He, Alan Bo-Han; Chang, Yu-Chieh; Meng, Anna Wan Yun; Huang, Andrew Chih Wei

    2017-08-14

    This study examined whether various doses of ethanol induced reward or aversion and then evaluated Grigson's reward comparison hypothesis (1997). Rats were given a 0.1% saccharin solution (conditioned stimulus 1 [CS1]) 15 min prior to administration of a 0, 0.05, 0.125, 0.20, 0.35, or 0.50 g/kg dose of ethanol (unconditioned stimulus [US]). The rats were then exposed to a paired compartment (CS2) for 30 min. The low dose of 0.05 g/kg ethanol did not induce conditioned suppression (i.e., conditioned taste aversion [CTA]) or conditioned place preference (CPP). The dose of 0.125 g/kg ethanol induced CPP but not CTA. High doses of ethanol, including 0.35 g/kg and 0.50 g/kg, produced CTA but not CPP. The middle dose of 0.20 g/kg ethanol simultaneously induced CTA and CPP. As a result, the reward comparison hypothesis cannot explain the present finding that the middle dose of ethanol induced CTA and CPP. Meanwhile, the high doses of ethanol induced motivationally aversive CTA but not rewarding CPP. The reward comparison hypothesis should be updated further. Copyright © 2017 Elsevier B.V. All rights reserved.

  18. Magnetoreception and its trigeminal mediation in the homing pigeon

    NASA Astrophysics Data System (ADS)

    Mora, Cordula V.; Davison, Michael; Martin Wild, J.; Walker, Michael M.

    2004-11-01

    Two conflicting hypotheses compete to explain how a homing pigeon can return to its loft over great distances. One proposes the use of atmospheric odours and the other the Earth's magnetic field in the `map' step of the `map and compass' hypothesis of pigeon homing. Although magnetic effects on pigeon orientation provide indirect evidence for a magnetic `map', numerous conditioning experiments have failed to demonstrate reproducible responses to magnetic fields by pigeons. This has led to suggestions that homing pigeons and other birds have no useful sensitivity to the Earth's magnetic field. Here we demonstrate that homing pigeons (Columba livia) can discriminate between the presence and absence of a magnetic anomaly in a conditioned choice experiment. This discrimination is impaired by attachment of a magnet to the cere, local anaesthesia of the upper beak area, and bilateral section of the ophthalmic branch of the trigeminal nerve, but not of the olfactory nerve. These results suggest that magnetoreception (probably magnetite-based) occurs in the upper beak area of the pigeon. Traditional methods of rendering pigeons anosmic might therefore cause simultaneous impairment of magnetoreception so that future orientation experiments will require independent evaluation of the pigeon's magnetic and olfactory systems.

  19. Altering BDNF expression by genetics and/or environment: impact for emotional and depression-like behaviour in laboratory mice.

    PubMed

    Chourbaji, Sabine; Brandwein, Christiane; Gass, Peter

    2011-01-01

    According to the "neurotrophin hypothesis", brain-derived neurotrophic factor (BDNF) is an important candidate gene in depression. Moreover, environmental stress is known to represent a risk factor in the pathophysiology and treatment of this disease. To elucidate whether changes in BDNF availability signify cause or consequence of depressive-like alterations, it is essential to look for endophenotypes under distinct genetic conditions (e.g. altered BDNF expression). Furthermore it is crucial to examine environment-driven BDNF regulation and its effect on depression-linked features. Consequently, gene × environment studies investigating prospective genetic mouse models of depression in different environmental contexts become increasingly important. The present review summarizes recent findings in BDNF-mutant mice, which have been controversially discussed as models of depression and anxiety. It furthermore illustrates the potential of the environment to serve as a naturalistic stressor that modulates the phenotype in wildtype and mutant mice. Moreover, environment may exert protective effects by regulating BDNF levels, as attributed to "environmental enrichment". The effect of this beneficial condition will also be discussed with regard to probable "curative/therapeutic" approaches. Copyright © 2010 Elsevier Ltd. All rights reserved.

  1. An Embodied Multi-Sensor Fusion Approach to Visual Motion Estimation Using Unsupervised Deep Networks.

    PubMed

    Shamwell, E Jared; Nothwang, William D; Perlis, Donald

    2018-05-04

    Aimed at improving size, weight, and power (SWaP)-constrained robotic vision-aided state estimation, we describe our unsupervised, deep convolutional-deconvolutional sensor fusion network, Multi-Hypothesis DeepEfference (MHDE). MHDE learns to intelligently combine noisy heterogeneous sensor data to predict several probable hypotheses for the dense, pixel-level correspondence between a source image and an unseen target image. We show how our multi-hypothesis formulation provides increased robustness against dynamic, heteroscedastic sensor and motion noise by computing hypothesis image mappings and predictions at 76–357 Hz depending on the number of hypotheses being generated. MHDE fuses noisy, heterogeneous sensory inputs using two parallel, inter-connected architectural pathways and n (1–20 in this work) multi-hypothesis generating sub-pathways to produce n global correspondence estimates between a source and a target image. We evaluated MHDE on the KITTI Odometry dataset and benchmarked it against the vision-only DeepMatching and Deformable Spatial Pyramids algorithms and were able to demonstrate a significant runtime decrease and a performance increase compared to the next-best performing method.

  2. Binary Hypothesis Testing With Byzantine Sensors: Fundamental Tradeoff Between Security and Efficiency

    NASA Astrophysics Data System (ADS)

    Ren, Xiaoqiang; Yan, Jiaqi; Mo, Yilin

    2018-03-01

    This paper studies binary hypothesis testing based on measurements from a set of sensors, a subset of which can be compromised by an attacker. The measurements from a compromised sensor can be manipulated arbitrarily by the adversary. The asymptotic exponential rate with which the probability of error goes to zero is adopted to indicate the detection performance of a detector. In practice, we expect attacks on sensors to be sporadic, so the system may operate with all sensors benign for extended periods of time. This motivates us to consider the trade-off between the detection performance of a detector, i.e., the probability of error, when the attacker is absent (defined as efficiency) and the worst-case detection performance when the attacker is present (defined as security). We first provide the fundamental limits of this trade-off and then propose a detection strategy that achieves these limits. We then consider a special case in which there is no trade-off between security and efficiency; in other words, our detection strategy can achieve maximal efficiency and maximal security simultaneously. Two extensions of the secure hypothesis testing problem are also studied, with fundamental limits and achievability results provided: 1) a subset of sensors, namely "secure" sensors, are assumed to be equipped with better security countermeasures and hence are guaranteed to be benign; and 2) detection performance with an unknown number of compromised sensors. Numerical examples are given to illustrate the main results.
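    The exponential error-rate criterion used as the performance index above can be illustrated with a simple simulation. The per-sensor Gaussian model below (H0: N(0,1) vs. H1: N(1,1)) is an assumed example for the attack-free case, not the paper's setup:

```python
import math
import random

# Sketch: with benign sensors, the error probability of the likelihood-ratio
# test decays exponentially in the number of sensors n; the decay rate plays
# the role of the "efficiency" exponent. Gaussian model is hypothetical.
def error_prob(n, trials=20000, seed=3):
    rng = random.Random(seed)
    errors = 0
    for _ in range(trials):
        h1_true = rng.random() < 0.5                     # equiprobable hypotheses
        mean = 1.0 if h1_true else 0.0
        s = sum(rng.gauss(mean, 1.0) for _ in range(n))  # sufficient statistic
        decide_h1 = s > n / 2                            # LRT threshold
        errors += (decide_h1 != h1_true)
    return errors / trials

for n in (1, 4, 16):
    # The Chernoff exponent for this model is 1/8, so P(err) shrinks roughly
    # like exp(-n/8), up to sub-exponential factors.
    print(n, error_prob(n), math.exp(-n / 8))
```

    Comparing detectors by this exponent, rather than by the raw error probability at a fixed n, is what makes the security-efficiency trade-off well defined asymptotically.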

  3. Statistical Hypothesis Testing in Intraspecific Phylogeography: NCPA versus ABC

    PubMed Central

    Templeton, Alan R.

    2009-01-01

    Nested clade phylogeographic analysis (NCPA) and approximate Bayesian computation (ABC) have been used to test phylogeographic hypotheses. Multilocus NCPA tests null hypotheses, whereas ABC discriminates among a finite set of alternatives. The interpretive criteria of NCPA are explicit and allow complex models to be built from simple components. The interpretive criteria of ABC are ad hoc and require the specification of a complete phylogeographic model. The conclusions from ABC are often influenced by implicit assumptions arising from the many parameters needed to specify a complex model. These complex models confound many assumptions, so that biological interpretations are difficult. Sampling error is accounted for in NCPA, but ABC ignores important sources of sampling error that create pseudo-statistical power. NCPA generates the full sampling distribution of its statistics, but ABC only yields local probabilities, which in turn make it impossible to distinguish between a good-fitting model, a non-informative model, and an over-determined model. Both NCPA and ABC use approximations, but the convergence of the approximations used in NCPA is well defined, whereas that in ABC is not. NCPA can analyze a large number of locations, but ABC cannot. Finally, the dimensionality of the tested hypothesis is known in NCPA, but not in ABC. As a consequence, the “probabilities” generated by ABC are not true probabilities and are statistically non-interpretable. Accordingly, ABC should not be used for hypothesis testing, but simulation approaches are valuable when used in conjunction with NCPA or other methods that do not rely on highly parameterized models. PMID:19192182

  4. Updating: Learning versus Supposing

    ERIC Educational Resources Information Center

    Zhao, Jiaying; Crupi, Vincenzo; Tentori, Katya; Fitelson, Branden; Osherson, Daniel

    2012-01-01

    Bayesian orthodoxy posits a tight relationship between conditional probability and updating. Namely, the probability of an event "A" after learning "B" should equal the conditional probability of "A" given "B" prior to learning "B". We examine whether ordinary judgment conforms to the orthodox view. In three experiments we found substantial…
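    The orthodox updating rule described in this record can be checked numerically. The joint distribution below is hypothetical, chosen only for illustration:

```python
# Orthodox Bayesian updating: the probability of A after learning B should
# equal the prior conditional probability P(A|B). Numbers are hypothetical.
p_joint = {("A", "B"): 0.12, ("A", "notB"): 0.18,
           ("notA", "B"): 0.28, ("notA", "notB"): 0.42}

p_b = p_joint[("A", "B")] + p_joint[("notA", "B")]  # P(B) = 0.40
prior_a_given_b = p_joint[("A", "B")] / p_b         # P(A|B) = 0.30

# "Learning B": restrict to the B-worlds and renormalise.
posterior = {k: v / p_b for k, v in p_joint.items() if k[1] == "B"}
posterior_a = posterior[("A", "B")]

print(prior_a_given_b, posterior_a)  # identical, as orthodoxy requires
```

    The experiments in this record test whether ordinary human judgment actually tracks this identity when B is learned versus merely supposed.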

  5. Music-evoked incidental happiness modulates probability weighting during risky lottery choices

    PubMed Central

    Schulreich, Stefan; Heussen, Yana G.; Gerhardt, Holger; Mohr, Peter N. C.; Binkofski, Ferdinand C.; Koelsch, Stefan; Heekeren, Hauke R.

    2014-01-01

    We often make decisions with uncertain consequences. The outcomes of the choices we make are usually not perfectly predictable but probabilistic, and the probabilities can be known or unknown. Probability judgments, i.e., the assessment of unknown probabilities, can be influenced by evoked emotional states. This suggests that also the weighting of known probabilities in decision making under risk might be influenced by incidental emotions, i.e., emotions unrelated to the judgments and decisions at issue. Probability weighting describes the transformation of probabilities into subjective decision weights for outcomes and is one of the central components of cumulative prospect theory (CPT) that determine risk attitudes. We hypothesized that music-evoked emotions would modulate risk attitudes in the gain domain and in particular probability weighting. Our experiment featured a within-subject design consisting of four conditions in separate sessions. In each condition, the 41 participants listened to a different kind of music—happy, sad, or no music, or sequences of random tones—and performed a repeated pairwise lottery choice task. We found that participants chose the riskier lotteries significantly more often in the “happy” than in the “sad” and “random tones” conditions. Via structural regressions based on CPT, we found that the observed changes in participants' choices can be attributed to changes in the elevation parameter of the probability weighting function: in the “happy” condition, participants showed significantly higher decision weights associated with the larger payoffs than in the “sad” and “random tones” conditions. Moreover, elevation correlated positively with self-reported music-evoked happiness. Thus, our experimental results provide evidence in favor of a causal effect of incidental happiness on risk attitudes that can be explained by changes in probability weighting. PMID:24432007
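
    The elevation parameter discussed above has a standard expression in the Goldstein-Einhorn (linear-in-log-odds) form of the probability weighting function, one common CPT parameterization; the sketch below is illustrative only, not the authors' fitted specification, and the parameter values are hypothetical.

```python
def weight_ge(p, delta, gamma):
    """Goldstein-Einhorn (linear-in-log-odds) probability weighting.
    delta is the elevation parameter, gamma the curvature parameter."""
    num = delta * p ** gamma
    return num / (num + (1.0 - p) ** gamma)

# Higher elevation (e.g., the "happy" condition) raises decision weights
# at a given probability; lower elevation (e.g., "sad") lowers them.
w_happy = weight_ge(0.5, 1.2, 0.7)
w_sad = weight_ge(0.5, 0.8, 0.7)
```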

  7. CProb: a computational tool for conducting conditional probability analysis.

    PubMed

    Hollister, Jeffrey W; Walker, Henry A; Paul, John F

    2008-01-01

    Conditional probability is the probability of observing one event given that another event has occurred. In an environmental context, conditional probability helps to assess the association between an environmental contaminant (i.e., the stressor) and the ecological condition of a resource (i.e., the response). These analyses, when combined with controlled experiments and other methodologies, show great promise in evaluating ecological conditions from observational data and in defining water quality and other environmental criteria. Current applications of conditional probability analysis (CPA) are largely done via scripts or cumbersome spreadsheet routines, which may prove daunting to end-users and do not provide access to the underlying scripts. Combining spreadsheets with scripts eases computation through a familiar interface (i.e., Microsoft Excel) and creates a transparent process through full accessibility to the scripts. With this in mind, we developed a software application, CProb, as an Add-in for Microsoft Excel with R, R(D)com Server, and Visual Basic for Applications. CProb computes conditional probabilities and plots scatterplots, empirical cumulative distribution functions, and conditional probability curves. In this short communication, we describe CPA, our motivation for developing a CPA tool, and our implementation of CPA as a Microsoft Excel Add-in. Further, we illustrate the use of our software with two examples: a water quality example and a landscape example. CProb is freely available for download at http://www.epa.gov/emap/nca/html/regions/cprob.
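
    The core CPA quantity, the probability of an impaired response given that the stressor meets or exceeds a threshold, can be sketched directly; the monitoring data and cutoffs below are hypothetical, not from the CProb examples.

```python
def conditional_probability(stressor, response, threshold, impaired_below):
    """Pr(response impaired | stressor >= threshold), the core CPA quantity.
    A site counts as impaired when its response is at or below `impaired_below`."""
    exceed = [r for s, r in zip(stressor, response) if s >= threshold]
    if not exceed:
        return float("nan")
    return sum(r <= impaired_below for r in exceed) / len(exceed)

# Hypothetical monitoring data: sediment contaminant level vs. benthic index.
stressor = [0.1, 0.4, 0.5, 0.9, 1.2, 1.6, 2.0, 2.5]
benthic = [4.0, 3.8, 3.1, 2.9, 2.2, 1.8, 1.5, 1.1]

p_all = conditional_probability(stressor, benthic, 0.0, 2.5)   # all sites: 0.5
p_high = conditional_probability(stressor, benthic, 1.0, 2.5)  # stressed sites: 1.0
```

Sweeping `threshold` over the observed stressor range yields the conditional probability curve that CPA tools plot.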

  8. Supporting shared hypothesis testing in the biomedical domain.

    PubMed

    Agibetov, Asan; Jiménez-Ruiz, Ernesto; Ondrésik, Marta; Solimando, Alessandro; Banerjee, Imon; Guerrini, Giovanna; Catalano, Chiara E; Oliveira, Joaquim M; Patanè, Giuseppe; Reis, Rui L; Spagnuolo, Michela

    2018-02-08

    Pathogenesis of inflammatory diseases can be tracked by studying the causality relationships among the factors contributing to their development. We could, for instance, hypothesize on the connections of the pathogenesis outcomes to the observed conditions. To prove such causal hypotheses, we would need a full understanding of the causal relationships, and we would have to provide all the necessary evidence to support our claims. In practice, however, we might not possess all the background knowledge on the causality relationships, and we might be unable to collect all the evidence needed to prove our hypotheses. In this work we propose a methodology for translating biological knowledge on the causality relationships of biological processes, and their effects on conditions, into a computational framework for hypothesis testing. The methodology consists of two main parts: construction of a hypothesis graph from the formalization of the background knowledge on causality relationships, and measurement of confidence in a causality hypothesis as a normalized weighted path computation in the hypothesis graph. In this framework, we can simulate the collection of evidence and assess confidence in a causality hypothesis, measuring it in proportion to the amount of available knowledge and collected evidence. We evaluate our methodology on a hypothesis graph that represents both the contributing factors that may cause cartilage degradation and the factors that might be caused by cartilage degradation during osteoarthritis. Hypothesis graph construction has proven robust to the addition of potentially contradictory information on simultaneously positive and negative effects. The obtained confidence measures for the specific causality hypotheses have been validated by our domain experts and correspond closely to their subjective assessments of confidence in the investigated hypotheses. Overall, our methodology for a shared hypothesis-testing framework exhibits properties that researchers will find useful in reviewing the literature for their experimental studies, in planning and prioritizing evidence-collection procedures, and in testing their hypotheses with different depths of knowledge on the causal dependencies of biological processes and their effects on the observed conditions.

  9. Bayesian hypothesis testing for human threat conditioning research: an introduction and the condir R package

    PubMed Central

    Krypotos, Angelos-Miltiadis; Klugkist, Irene; Engelhard, Iris M.

    2017-01-01

    Threat conditioning procedures have allowed the experimental investigation of the pathogenesis of Post-Traumatic Stress Disorder. The findings of these procedures have also provided stable foundations for the development of relevant intervention programs (e.g. exposure therapy). Statistical inference of threat conditioning procedures is commonly based on p-values and Null Hypothesis Significance Testing (NHST). Nowadays, however, there is a growing concern about this statistical approach, as many scientists point to the various limitations of p-values and NHST. As an alternative, the use of Bayes factors and Bayesian hypothesis testing has been suggested. In this article, we apply this statistical approach to threat conditioning data. In order to enable the easy computation of Bayes factors for threat conditioning data we present a new R package named condir, which can be used either via the R console or via a Shiny application. This article provides both a non-technical introduction to Bayesian analysis for researchers using the threat conditioning paradigm, and the necessary tools for computing Bayes factors easily. PMID:29038683
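
    condir itself computes Bayes factors in R; a rough sketch of the general idea is the BIC approximation to the Bayes factor (BF10 ≈ exp((BIC_null − BIC_alt)/2)), shown below for hypothetical paired difference scores. This is an illustrative approximation, not condir's actual method.

```python
import math

def bf10_bic(diffs):
    """BIC approximation to the Bayes factor for a one-sample (paired) test:
    BF10 ~ exp((BIC_null - BIC_alt) / 2), where the null fixes the mean at 0."""
    n = len(diffs)
    mean = sum(diffs) / n
    rss_null = sum(d * d for d in diffs)           # mean fixed at 0
    rss_alt = sum((d - mean) ** 2 for d in diffs)  # mean estimated (1 parameter)
    bic_null = n * math.log(rss_null / n)
    bic_alt = n * math.log(rss_alt / n) + math.log(n)
    return math.exp((bic_null - bic_alt) / 2.0)

# Hypothetical CS+ minus CS- difference scores from a conditioning study:
clear_effect = [1.8, 2.2, 2.1, 1.9, 2.0, 2.3, 1.7, 2.0, 2.1, 1.9]
no_effect = [0.5, -0.4, 0.3, -0.2, 0.1, -0.5, 0.4, -0.3, 0.2, -0.1]
```

A BF10 well above 1 favors a differential response (CS+ ≠ CS-); a BF10 below 1 favors the null.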

  10. Global climatic changes during the Devonian-Mississippian: Stable isotope biogeochemistry of brachiopods

    NASA Astrophysics Data System (ADS)

    Brand, Uwe

    1989-12-01

    A progressive trend towards heavier δ13C values of Devonian-Mississippian brachiopods from North America, Europe, Afghanistan and Algeria probably reflects expansion of the terrestrial and/or marine biomass and/or burial of carbon in soils/sediments. Oceanic productivity crises, based on perturbations in the overall δ13C trend, are recognized for the Mid Givetian, Early Famennian, Late Kinderhookian, Late Osagean and Early and Late Meramecian. The Givetian productivity crisis was probably accompanied by massive overturn of biologically toxic deep-ocean water. Temperature data, adjusted for the possible secular variation of seawater, support the hypothesis of global greenhouse conditions for the Devonian (mean of 30°C; mean of 26°C if extrinsic data are deleted) and icehouse conditions for the Mississippian (mean of 17°C). During the Mid Givetian, Frasnian and Early Famennian, calculated water temperatures for tropical epeiric seas were generally above the thermal threshold limit (˜38°C) of most marine invertebrates, or epeiric seawater was characterized by unusually low salinities, or a combination of the two. These elevated water temperatures and/or low salinities, in conjunction with the postulated productivity crises and overturning of toxic deep waters, are considered prime causes of the biotic crisis of the Late Devonian. In addition, a presumed expanding oxygen-minimum zone and general anoxia in the oceans prevented shallow-water organisms from escaping these inhospitable conditions. Re-population of the tropical seas occurred sometime during the Mid-Late Famennian, after water temperatures had dropped below the thermal threshold limit and/or salinities had returned to normal, and oceanic productivity had increased due to more vigorous oceanic circulation. Migration of eurythermal, shallow- and deeper-water organisms into the vacant niches of the shallow seas was possible because of generally slightly lower sea levels but, more importantly, because of a more restricted oxygen-minimum zone and generally reduced oceanic anoxia.

  11. The effect of audiovisual and binaural listening on the acceptable noise level (ANL): establishing an ANL conceptual model.

    PubMed

    Wu, Yu-Hsiang; Stangl, Elizabeth; Pang, Carol; Zhang, Xuyang

    2014-02-01

    Little is known regarding the acoustic features of a stimulus used by listeners to determine the acceptable noise level (ANL). Features suggested by previous research include speech intelligibility (noise is unacceptable when it degrades speech intelligibility to a certain degree; the intelligibility hypothesis) and loudness (noise is unacceptable when the speech-to-noise loudness ratio is poorer than a certain level; the loudness hypothesis). The purpose of the study was to investigate if speech intelligibility or loudness is the criterion feature that determines ANL. To achieve this, test conditions were chosen so that the intelligibility and loudness hypotheses would predict different results. In Experiment 1, the effect of audiovisual (AV) and binaural listening on ANL was investigated; in Experiment 2, the effect of interaural correlation (ρ) on ANL was examined. A single-blinded, repeated-measures design was used. Thirty-two and twenty-five younger adults with normal hearing participated in Experiments 1 and 2, respectively. In Experiment 1, both ANL and speech recognition performance were measured using the AV version of the Connected Speech Test (CST) in three conditions: AV-binaural, auditory only (AO)-binaural, and AO-monaural. Lipreading skill was assessed using the Utley lipreading test. In Experiment 2, ANL and speech recognition performance were measured using the Hearing in Noise Test (HINT) in three binaural conditions, wherein the interaural correlation of noise was varied: ρ = 1 (N(o)S(o) [a listening condition wherein both speech and noise signals are identical across two ears]), -1 (NπS(o) [a listening condition wherein speech signals are identical across two ears whereas the noise signals of two ears are 180 degrees out of phase]), and 0 (N(u)S(o) [a listening condition wherein speech signals are identical across two ears whereas noise signals are uncorrelated across ears]). 
The results were compared to the predictions made based on the intelligibility and loudness hypotheses. The results of the AV and AO conditions appeared to support the intelligibility hypothesis due to the significant correlation between visual benefit in ANL (AV re: AO ANL) and (1) visual benefit in CST performance (AV re: AO CST) and (2) lipreading skill. The results of the N(o)S(o), NπS(o), and N(u)S(o) conditions negated the intelligibility hypothesis because binaural processing benefit (NπS(o) re: N(o)S(o), and N(u)S(o) re: N(o)S(o)) in ANL was not correlated to that in HINT performance. Instead, the results somewhat supported the loudness hypothesis because the pattern of ANL results across the three conditions (N(o)S(o) ≈ NπS(o) ≈ N(u)S(o) ANL) was more consistent with what was predicted by the loudness hypothesis (N(o)S(o) ≈ NπS(o) < N(u)S(o) ANL) than by the intelligibility hypothesis (NπS(o) < N(u)S(o) < N(o)S(o) ANL). The results of the binaural and monaural conditions supported neither hypothesis because (1) binaural benefit (binaural re: monaural) in ANL was not correlated to that in speech recognition performance, and (2) the pattern of ANL results across conditions (binaural < monaural ANL) was not consistent with the prediction made based on previous binaural loudness summation research (binaural ≥ monaural ANL). The study suggests that listeners may use multiple acoustic features to make ANL judgments. The binaural/monaural results showing that neither hypothesis was supported further indicate that factors other than speech intelligibility and loudness, such as psychological factors, may affect ANL. The weightings of different acoustic features in ANL judgments may vary widely across individuals and listening conditions. American Academy of Audiology.

  12. Internal Medicine residents use heuristics to estimate disease probability.

    PubMed

    Phang, Sen Han; Ravani, Pietro; Schaefer, Jeffrey; Wright, Bruce; McLaughlin, Kevin

    2015-01-01

    Training in Bayesian reasoning may have limited impact on the accuracy of probability estimates. In this study, our goal was to explore whether residents previously exposed to Bayesian reasoning use heuristics rather than Bayesian reasoning to estimate disease probabilities. We predicted that if residents use heuristics, then post-test probability estimates would be increased by non-discriminating clinical features or a high anchor for a target condition. We randomized 55 Internal Medicine residents to different versions of four clinical vignettes and asked them to estimate the probabilities of target conditions. We manipulated the clinical data for each vignette to be consistent with either 1) use of the representativeness heuristic, by adding non-discriminating prototypical clinical features of the target condition, or 2) use of the anchoring-with-adjustment heuristic, by providing a high or low anchor for the target condition. When presented with additional non-discriminating data, the odds of diagnosing the target condition were increased (odds ratio (OR) 2.83, 95% confidence interval [1.30, 6.15], p = 0.009). Similarly, the odds of diagnosing the target condition were increased when a high anchor preceded the vignette (OR 2.04, [1.09, 3.81], p = 0.025). Our findings suggest that despite previous exposure to Bayesian reasoning, residents use heuristics, such as the representativeness heuristic and anchoring with adjustment, to estimate probabilities. Potential reasons for attribute substitution include the relative cognitive ease of heuristics compared with Bayesian reasoning, or the possibility that residents in their clinical practice use gist traces rather than precise probability estimates when diagnosing.
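
    The normative Bayesian benchmark against which such heuristic adjustments are judged is Bayes' rule in odds form: a non-discriminating feature has a likelihood ratio of 1 and should leave the post-test probability unchanged. The probabilities below are hypothetical.

```python
def post_test_probability(pretest_p, likelihood_ratio):
    """Bayes' rule in odds form: posterior odds = prior odds * likelihood ratio."""
    prior_odds = pretest_p / (1.0 - pretest_p)
    post_odds = prior_odds * likelihood_ratio
    return post_odds / (1.0 + post_odds)

# A non-discriminating feature (LR = 1) leaves a 20% pretest probability at 20%;
# a feature with LR = 4 raises it to 50%.
p_unchanged = post_test_probability(0.2, 1.0)
p_raised = post_test_probability(0.2, 4.0)
```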

  13. Risk dishabituation: in repeated gambling, risk is reduced following low-probability "surprising" events (wins or losses).

    PubMed

    Demaree, Heath A; Burns, Kevin J; Dedonno, Michael A; Agarwala, Edward K; Everhart, D Erik

    2012-06-01

    In path-dependent risk taking, like playing a slot machine, the wager on one trial may be affected by the outcome of the preceding trial. Previous studies have shown that a person's risk-taking preferences may change as a result of the preceding trial (win or loss). For example, the "house money effect" suggests that risk taking may increase after a win, whereas the "break even effect" posits that risk taking increases after a loss. Independent of those findings, a person's emotional state has been found to influence risk taking. For example, the "mood maintenance hypothesis" supports the notion that positive affect decreases risk taking, and related research finds that increased negative affect increases risk taking. Because winning and losing may influence one's emotional state, we sought to investigate how both previous outcomes, as well as a person's emotional responses to those outcomes, independently influence subsequent risk taking. To do this, data were collected using three simplified slot machines where the chance of winning each trial was set to 13%, 50%, and 87%, respectively. Evidence for the break even and house money effects were found on the 13% and 87% games, respectively. Likewise, emotional valence was found to predict risk taking on these two tasks, with emotional valence fully explaining the break even effect observed on the 13% game. In addition to these results, the present research revealed that risk taking is reduced following low-probability ("surprising") events (i.e., a win in the 13% condition or loss in the 87% condition). Dubbed "risk dishabituation," this phenomenon is discussed, along with its likely corresponding emotional experience--surprise.

  14. Educational Subculture and Dropping out in Higher Education: A Longitudinal Case Study

    ERIC Educational Resources Information Center

    Venuleo, C.; Mossi, P.; Salvatore, S.

    2016-01-01

    The paper tests longitudinally the hypothesis that educational subcultures in terms of which students interpret their role and their educational setting affect the probability of dropping out of higher education. A logistic regression model was performed to predict drop out at the beginning of the second academic year for the 823 freshmen of a…

  15. Convergence and divergence in leisure style among Whites and African Americans: toward an interracial contact hypothesis

    Treesearch

    Myron F. Floyd; Kimberly J. Shinew

    1999-01-01

    Drawing upon structural theory and social group perspectives, this study examined two propositions developed to explain the relationship between interracial contact and leisure preferences among African Americans and Whites. The first proposition stated that as interracial contact increases, the greater the probability of observing similarity in the leisure...

  16. Delinquency and Crime Prevention: Overview of Research Comparing Treatment Foster Care and Group Care

    ERIC Educational Resources Information Center

    Osei, Gershon K.; Gorey, Kevin M.; Jozefowicz, Debra M. Hernandez

    2016-01-01

    Background: Evidence of treatment foster care (TFC) and group care's (GC) potential to prevent delinquency and crime has been developing. Objectives: We clarified the state of comparative knowledge with a historical overview. Then we explored the hypothesis that smaller, probably better resourced group homes with smaller staff/resident ratios have…

  17. Drought forecasting in Luanhe River basin involving climatic indices

    NASA Astrophysics Data System (ADS)

    Ren, Weinan; Wang, Yixuan; Li, Jianzhu; Feng, Ping; Smith, Ronald J.

    2017-11-01

    Drought is regarded as one of the most severe natural disasters globally. This is especially the case in Tianjin City, Northern China, where drought can affect economic development and people's livelihoods. Drought forecasting, the basis of drought management, is an important mitigation strategy. In this paper, we develop a probabilistic forecasting model that forecasts transition probabilities from a current Standardized Precipitation Index (SPI) value to a future SPI class, based on the conditional distribution of a multivariate normal distribution so as to incorporate two large-scale climatic indices at the same time, and apply the forecasting model to 26 rain gauges in the Luanhe River basin in North China. The establishment of the model and the derivation of the SPI rest on the hypothesis that aggregated monthly precipitation is normally distributed. Pearson correlation and Shapiro-Wilk normality tests are used to select an appropriate SPI time scale and large-scale climatic indices. Findings indicated that longer-term aggregated monthly precipitation was, in general, more likely to be normally distributed, and that forecasting models should be applied to each gauge individually rather than to the whole basin. Taking Liying Gauge as an example, we illustrate the impact of the SPI time scale and lead time on transition probabilities. Then, the controlling climatic indices for every gauge are selected by the Pearson correlation test, and the multivariate normality of the current month's SPI, the corresponding climatic indices, and the SPI 1, 2, and 3 months later is demonstrated using the Shapiro-Wilk normality test. Subsequently, we illustrate the impact of large-scale oceanic-atmospheric circulation patterns on transition probabilities. Finally, we use a score method to evaluate and compare the performance of the three forecasting models and compare them with two traditional models that forecast transition probabilities from a current to a future SPI class. The results show that the three proposed models outperform the two traditional models and that incorporating large-scale climatic indices improves forecasting accuracy.
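
    The paper's core construction, a transition probability from a current SPI value to a future SPI class via the conditional normal distribution, can be sketched for the bivariate case: if current and future SPI are standard bivariate normal with correlation rho, then future | current ~ N(rho * current, 1 - rho^2). The correlation value and the class boundaries below are illustrative, not the paper's fitted values.

```python
import math

def norm_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def spi_transition_probs(current_spi, rho):
    """P(future SPI class | current SPI) under a bivariate normal model."""
    mu = rho * current_spi
    sigma = math.sqrt(1.0 - rho ** 2)
    # Common SPI class boundaries (illustrative).
    bounds = [-2.0, -1.5, -1.0, 1.0, 1.5, 2.0]
    labels = ["extreme drought", "severe drought", "moderate drought",
              "near normal", "moderately wet", "very wet", "extremely wet"]
    edges = [-math.inf] + bounds + [math.inf]
    return {lab: norm_cdf((hi - mu) / sigma) - norm_cdf((lo - mu) / sigma)
            for lab, lo, hi in zip(labels, edges[:-1], edges[1:])}

# A gauge currently in extreme drought, with 0.8 month-to-month correlation,
# is most likely to stay in a drought class next month.
probs = spi_transition_probs(-2.0, 0.8)
```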

  18. A Waveform Detector that Targets Template-Decorrelated Signals and Achieves its Predicted Performance: Demonstration with IMS Data

    NASA Astrophysics Data System (ADS)

    Carmichael, J.

    2016-12-01

    Waveform correlation detectors used in seismic monitoring scan multichannel data to test two competing hypotheses: that the data contain (1) a noisy, amplitude-scaled version of a template waveform, or (2) only noise. In reality, seismic wavefields include signals triggered by non-target sources (background seismicity) and target signals that are only partially correlated with the waveform template. We reformulate the waveform correlation detector hypothesis test to accommodate deterministic uncertainty in template/target waveform similarity and thereby derive a new detector from convex set projections (the "cone detector") for use in explosion monitoring. Our analyses give probability density functions that quantify the detectors' degraded performance with decreasing waveform similarity. We then apply our results to three announced North Korean nuclear tests and use International Monitoring System (IMS) arrays to determine the probability that low-magnitude, off-site explosions can be reliably detected with a given waveform template. We demonstrate that cone detectors provide (1) an improved predictive capability over correlation detectors to identify such spatially separated explosive sources, (2) competitive detection rates, and (3) reduced false alarms on background seismicity. Figure Caption: Observed and predicted receiver operating characteristic (ROC) curves for the correlation statistic r(x) (left) and the cone statistic s(x) (right) versus semi-empirical explosion magnitude. a: Shaded region shows the range of ROC curves for r(x) that give the predicted detection performance in noise conditions recorded over 24 hr on 8 October 2006. The superimposed stair plot shows the empirical detection performance (recorded detections/total events) averaged over 24 hr of data. Error bars indicate the demeaned range in observed detection probability over the day; means are removed to avoid the risk of misinterpreting the range to indicate probabilities can exceed one. b: Shaded region shows the range of ROC curves for s(x) that give the predicted detection performance for the cone detector. The superimposed stair plot shows observed detection performance averaged over 24 hr of data, analogous to that shown in a.
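
    The correlation-detector half of the comparison can be sketched as a normalized cross-correlation scan; the toy single-channel series below is illustrative, whereas real IMS processing is multichannel and noise-normalized.

```python
import math

def normalized_correlation(template, window):
    """Normalized cross-correlation r between a template and an equal-length
    data window: the statistic a correlation detector thresholds."""
    dot = sum(t * w for t, w in zip(template, window))
    nt = math.sqrt(sum(t * t for t in template))
    nw = math.sqrt(sum(w * w for w in window))
    if nt == 0.0 or nw == 0.0:
        return 0.0
    return dot / (nt * nw)

def scan(template, data):
    """Slide the template across the data, returning r(x) at each offset."""
    m = len(template)
    return [normalized_correlation(template, data[i:i + m])
            for i in range(len(data) - m + 1)]

# A buried, amplitude-scaled copy of the template yields r = 1 at its offset.
template = [0.0, 1.0, -1.0, 0.5]
data = [0.0, 0.0, 0.0, 0.0, 2.0, -2.0, 1.0, 0.0, 0.0]
rs = scan(template, data)
```

Because r(x) is scale-invariant, a scaled copy scores perfectly; the cone detector relaxes this test to tolerate partial template/target similarity.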

  19. An Extension to the Constructivist Coding Hypothesis as a Learning Model for Selective Feedback when the Base Rate Is High

    ERIC Educational Resources Information Center

    Ghaffarzadegan, Navid; Stewart, Thomas R.

    2011-01-01

    Elwin, Juslin, Olsson, and Enkvist (2007) and Henriksson, Elwin, and Juslin (2010) offered the constructivist coding hypothesis to describe how people code the outcomes of their decisions when availability of feedback is conditional on the decision. They provided empirical evidence only for the 0.5 base rate condition. This commentary argues that…

  20. Bayesian Probability Theory

    NASA Astrophysics Data System (ADS)

    von der Linden, Wolfgang; Dose, Volker; von Toussaint, Udo

    2014-06-01

    Preface; Part I. Introduction: 1. The meaning of probability; 2. Basic definitions; 3. Bayesian inference; 4. Combinatorics; 5. Random walks; 6. Limit theorems; 7. Continuous distributions; 8. The central limit theorem; 9. Poisson processes and waiting times; Part II. Assigning Probabilities: 10. Transformation invariance; 11. Maximum entropy; 12. Qualified maximum entropy; 13. Global smoothness; Part III. Parameter Estimation: 14. Bayesian parameter estimation; 15. Frequentist parameter estimation; 16. The Cramer-Rao inequality; Part IV. Testing Hypotheses: 17. The Bayesian way; 18. The frequentist way; 19. Sampling distributions; 20. Bayesian vs frequentist hypothesis tests; Part V. Real World Applications: 21. Regression; 22. Inconsistent data; 23. Unrecognized signal contributions; 24. Change point problems; 25. Function estimation; 26. Integral equations; 27. Model selection; 28. Bayesian experimental design; Part VI. Probabilistic Numerical Techniques: 29. Numerical integration; 30. Monte Carlo methods; 31. Nested sampling; Appendixes; References; Index.

  1. A New Bayesian Approach for Estimating the Presence of a Suspected Compound in Routine Screening Analysis.

    PubMed

    Woldegebriel, Michael; Vivó-Truyols, Gabriel

    2016-10-04

    A novel method for compound identification in liquid chromatography-high resolution mass spectrometry (LC-HRMS) is proposed. The method, based on Bayesian statistics, accommodates all possible uncertainties involved, from instrumentation through data analysis, in a single model yielding the probability that the compound of interest is present or absent in the sample. This approach differs from the classical methods in two ways. First, it is probabilistic (instead of deterministic); hence, it computes the probability that the compound is (or is not) present in a sample. Second, it addresses the hypothesis "the compound is present", as opposed to answering the question "the compound feature is present". This second difference implies a shift in the way data analysis is tackled, since the probability of interfering compounds (i.e., isomers and isobaric compounds) is also taken into account.

  2. Conditional, Time-Dependent Probabilities for Segmented Type-A Faults in the WGCEP UCERF 2

    USGS Publications Warehouse

    Field, Edward H.; Gupta, Vipin

    2008-01-01

    This appendix presents elastic-rebound-theory (ERT) motivated time-dependent probabilities, conditioned on the date of the last earthquake, for the segmented type-A fault models of the 2007 Working Group on California Earthquake Probabilities (WGCEP). These probabilities are included as one option in the WGCEP's Uniform California Earthquake Rupture Forecast 2 (UCERF 2), with the other options being time-independent Poisson probabilities and an "Empirical" model based on observed seismicity rate changes. A more general discussion of the pros and cons of all methods for computing time-dependent probabilities, as well as the justification of those chosen for UCERF 2, is given in the main body of this report (and the "Empirical" model is also discussed in Appendix M). What this appendix addresses is the computation of conditional, time-dependent probabilities when both single- and multi-segment ruptures are included in the model. Computing conditional probabilities is relatively straightforward when a fault is assumed to obey strict segmentation, in the sense that no multi-segment ruptures occur (e.g., WGCEP (1988, 1990); see Field (2007) for a review of all previous WGCEPs; from here on we assume basic familiarity with conditional probability calculations). However, as we'll see below, the calculation is not straightforward when multi-segment ruptures are included, in essence because we are attempting to apply a point-process model to a non-point process. The next section gives a review and evaluation of the single- and multi-segment rupture probability-calculation methods used in the most recent statewide forecast for California (WGCEP UCERF 1; Petersen et al., 2007). We then present results for the methodology adopted here for UCERF 2. We finish with a discussion of issues and possible alternative approaches that could be explored and perhaps applied in the future. A fault-by-fault comparison of UCERF 2 probabilities with those of previous studies is given in the main part of this report.
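
    For a single segment, the conditional time-dependent probability is the standard renewal-model quantity P(t0 < T ≤ t0 + ΔT | T > t0) = (F(t0+ΔT) − F(t0)) / (1 − F(t0)). The sketch below uses a lognormal recurrence distribution (one of the renewal models used by past WGCEPs) with hypothetical parameters; it does not address the multi-segment complication that the appendix is about.

```python
import math

def lognorm_cdf(t, mu, sigma):
    """CDF of a lognormal recurrence-time distribution."""
    return 0.5 * (1.0 + math.erf((math.log(t) - mu) / (sigma * math.sqrt(2.0))))

def conditional_rupture_prob(t_elapsed, dt, mu, sigma):
    """P(rupture within the next dt years | quiet for t_elapsed years)."""
    f0 = lognorm_cdf(t_elapsed, mu, sigma)
    f1 = lognorm_cdf(t_elapsed + dt, mu, sigma)
    return (f1 - f0) / (1.0 - f0)

# Hypothetical segment: median recurrence 150 yr, modest variability.
mu, sigma = math.log(150.0), 0.3
p_early = conditional_rupture_prob(50.0, 30.0, mu, sigma)   # shortly after last event
p_late = conditional_rupture_prob(140.0, 30.0, mu, sigma)   # long quiescence
```

Unlike the time-independent Poisson option, the conditional probability here grows as elapsed time approaches the median recurrence interval.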

  3. Patients' attitudes vs. physicians' determination: implications for cesarean sections.

    PubMed

    Lo, Joan C

    2003-07-01

    Most research studies identifying non-clinical factors that influence the choice of cesarean section as a method of obstetric delivery assume that the physician makes the decision. This paper, by contrast, demonstrates the role played by the mother. Because Chinese people generally believe that choosing the right days for certain life events, such as marriage, can change a person's fate for the better, we test the hypothesis that the probability of cesarean sections being performed is significantly higher on auspicious days and significantly lower on inauspicious days. By employing a logistic model and utilizing 1998 birth certificate data for Taiwan, we show that the hypothesis is supported.
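    The abstract's logistic model can be illustrated with a toy calculation: with a single day-type indicator as the only predictor, the fitted coefficient equals the log of the odds ratio from the 2x2 table of day type by delivery mode. The counts below are hypothetical values invented for illustration; they are not the Taiwan 1998 birth-certificate data.

```python
import math

# Hypothetical counts (NOT the Taiwan 1998 data): deliveries cross-tabulated
# by day type and mode of delivery, as (cesarean, vaginal).
counts = {
    "auspicious": (620, 1380),
    "ordinary":   (480, 1520),
}

def log_odds(cesarean, vaginal):
    """Log-odds of cesarean delivery for one day type."""
    return math.log(cesarean / vaginal)

# In a logistic model with a single day-type indicator, the coefficient on
# "auspicious" is the difference of the two log-odds, i.e. the log odds ratio.
beta = log_odds(*counts["auspicious"]) - log_odds(*counts["ordinary"])
odds_ratio = math.exp(beta)
print(f"odds ratio (auspicious vs ordinary): {odds_ratio:.2f}")
```

An odds ratio above 1 corresponds to the hypothesized elevated probability of cesarean sections on auspicious days.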

  4. Dopamine and reward: the anhedonia hypothesis 30 years on.

    PubMed

    Wise, Roy A

    2008-10-01

    The anhedonia hypothesis--that brain dopamine plays a critical role in the subjective pleasure associated with positive rewards--was intended to draw the attention of psychiatrists to the growing evidence that dopamine plays a critical role in the objective reinforcement and incentive motivation associated with food and water, brain stimulation reward, and psychomotor stimulant and opiate reward. The hypothesis called attention to the apparent paradox that neuroleptics, drugs used to treat a condition involving anhedonia (schizophrenia), attenuated in laboratory animals the positive reinforcement that we normally associate with pleasure. The hypothesis held only brief interest for psychiatrists, who pointed out that the animal studies reflected acute actions of neuroleptics whereas the treatment of schizophrenia appears to result from neuroadaptations to chronic neuroleptic administration, and that it is the positive symptoms of schizophrenia that neuroleptics alleviate, rather than the negative symptoms that include anhedonia. Perhaps for these reasons, the hypothesis has had minimal impact in the psychiatric literature. Despite its limited heuristic value for the understanding of schizophrenia, however, the anhedonia hypothesis has had major impact on biological theories of reinforcement, motivation, and addiction. Brain dopamine plays a very important role in reinforcement of response habits, conditioned preferences, and synaptic plasticity in cellular models of learning and memory. The notion that dopamine plays a dominant role in reinforcement is fundamental to the psychomotor stimulant theory of addiction, to most neuroadaptation theories of addiction, and to current theories of conditioned reinforcement and reward prediction. Properly understood, it is also fundamental to recent theories of incentive motivation.

  5. The Formalism of Generalized Contexts and Decay Processes

    NASA Astrophysics Data System (ADS)

    Losada, Marcelo; Laura, Roberto

    2013-04-01

    The formalism of generalized contexts for quantum histories is used to investigate the possibility of interpreting the survival probability as the probability of the no-decay property at a given time, conditional on the no-decay property at an earlier time. A negative result is found for an isolated system. The inclusion of two quantum measurement instruments at two different times makes it possible to interpret the survival probability as a conditional probability of the whole system.

  6. Jaguar interactions with pumas and prey at the northern edge of jaguars’ range

    PubMed Central

    2017-01-01

    We present the first study that evaluates jaguar-puma interactions in the arid lands of northern Mexico, where jaguars have their northernmost breeding population and both predators are persecuted for livestock depredation. We tested whether jaguars are the dominant species in this unique ecosystem, where: (1) pumas outnumber jaguars, (2) pumas are better adapted to arid environments, and (3) jaguars and pumas are of similar size. We analyzed four years of data with two approaches: a two-species conditional occupancy model and an activity-patterns analysis. We used camera location and prey presence as covariates for jaguar and puma detection and presence probabilities. We also explored overlap in activities of predators and prey. Where both species were detected, peccary presence was positively correlated with both jaguar and puma presence, whereas in areas where jaguars were detected but pumas were not, deer presence explained the probability of jaguar presence. We found that both predators were more likely to co-occur than to be found independently, and so we rejected the hypothesis that jaguars were the dominant species in our study area. Predators were mainly nocturnal and their activity patterns overlapped by 60%. Jaguars, as compared with pumas, overlapped more with deer and calves; pumas overlapped with calves more than with other prey, suggesting a preference. We believe exploring predator relationships at different scales may help elucidate mechanisms that regulate their coexistence. PMID:28133569

  7. Causes and consequences of marine mammal population declines in southwest Alaska: a food-web perspective.

    PubMed

    Estes, J A; Doak, D F; Springer, A M; Williams, T M

    2009-06-27

    Populations of sea otters, seals and sea lions have collapsed across much of southwest Alaska over the past several decades. The sea otter decline set off a trophic cascade in which the coastal marine ecosystem underwent a phase shift from kelp forests to deforested sea urchin barrens. This interaction in turn affected the distribution, abundance and productivity of numerous other species. Ecological consequences of the pinniped declines are largely unknown. Increased predation by transient (marine mammal-eating) killer whales probably caused the sea otter declines and may have caused the pinniped declines as well. Springer et al. proposed that killer whales, which purportedly fed extensively on great whales, expanded their diets to include a higher percentage of sea otters and pinnipeds following a sharp reduction in great whale numbers from post World War II industrial whaling. Critics of this hypothesis claim that great whales are not now and probably never were an important nutritional resource for killer whales. We used demographic/energetic analyses to evaluate whether or not a predator-prey system involving killer whales and the smaller marine mammals would be sustainable without some nutritional contribution from the great whales. Our results indicate that while such a system is possible, it could only exist under a narrow range of extreme conditions and is therefore highly unlikely.

  8. Permutation entropy of finite-length white-noise time series.

    PubMed

    Little, Douglas J; Kane, Deb M

    2016-08-01

    Permutation entropy (PE) is commonly used to discriminate complex structure from white noise in a time series. While the PE of white noise is well understood in the long time-series limit, analysis in the general case is currently lacking. Here the expectation value and variance of white-noise PE are derived as functions of the number of ordinal pattern trials, N, and the embedding dimension, D. It is demonstrated that the probability distribution of the white-noise PE converges to a χ^{2} distribution with D!-1 degrees of freedom as N becomes large. It is further demonstrated that the PE variance for an arbitrary time series can be estimated as the variance of a related metric, the Kullback-Leibler entropy (KLE), allowing the qualitative N≫D! condition to be recast as a quantitative estimate of the N required to achieve a desired PE calculation precision. Application of this theory to statistical inference is demonstrated in the case of an experimentally obtained noise series, where the probability of obtaining the observed PE value was calculated assuming a white-noise time series. Standard statistical inference can then be used to decide whether the white-noise null hypothesis should be accepted or rejected. This methodology can be applied to other null hypotheses, such as discriminating whether two time series are generated from different complex system states.
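    A minimal stdlib sketch (not the authors' code) of the quantity under discussion: the permutation entropy of a finite white-noise series, which converges toward its maximum value ln(D!) as the number of ordinal-pattern trials N grows, with a small finite-N bias.

```python
import math
import random
from collections import Counter

def permutation_entropy(series, D):
    """Permutation entropy (natural log) with embedding dimension D:
    tally the ordinal pattern (argsort) of each length-D window, then
    take the Shannon entropy of the pattern frequencies."""
    patterns = Counter()
    for i in range(len(series) - D + 1):
        window = series[i:i + D]
        patterns[tuple(sorted(range(D), key=window.__getitem__))] += 1
    n = sum(patterns.values())
    return -sum((c / n) * math.log(c / n) for c in patterns.values())

random.seed(0)
D = 3
N = 10_000
white = [random.gauss(0.0, 1.0) for _ in range(N)]
pe = permutation_entropy(white, D)

# For white noise all D! ordinal patterns are equiprobable, so PE approaches
# ln(D!); the finite-N estimate sits slightly below that maximum.
print(f"PE = {pe:.4f}, ln(D!) = {math.log(math.factorial(D)):.4f}")
```

A fully ordered series, by contrast, produces a single ordinal pattern and a PE of zero, which is the contrast the discrimination test exploits.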

  9. Mistletoe Infection in an Oak Forest Is Influenced by Competition and Host Size

    PubMed Central

    Matula, Radim; Svátek, Martin; Pálková, Marcela; Volařík, Daniel; Vrška, Tomáš

    2015-01-01

    Host size and distance from an infected plant have been previously found to affect mistletoe occurrence in woody vegetation but the effect of host plant competition on mistletoe infection has not been empirically tested. For an individual tree, increasing competition from neighbouring trees decreases its resource availability, and resource availability is also known to affect the establishment of mistletoes on host trees. Therefore, competition is likely to affect mistletoe infection but evidence for such a mechanism is lacking. Based on this, we hypothesised that the probability of occurrence as well as the abundance of mistletoes on a tree would increase not only with increasing host size and decreasing distance from an infected tree but also with decreasing competition by neighbouring trees. Our hypothesis was tested using generalized linear models (GLMs) with data on Loranthus europaeus Jacq., one of the two most common mistletoes in Europe, on 1015 potential host stems collected in a large fully mapped plot in the Czech Republic. Because many trees were multi-stemmed, we ran the analyses for both individual stems and whole trees. We found that the probability of mistletoe occurrence on individual stems was affected mostly by stem size, whereas competition had the most important effects on the probability of mistletoe occurrence on whole trees as well as on mistletoe abundance. Therefore, we confirmed our hypothesis that competition among trees has a negative effect on mistletoe occurrence. PMID:25992920

  10. Derived distribution of floods based on the concept of partial area coverage with a climatic appeal

    NASA Astrophysics Data System (ADS)

    Iacobellis, Vito; Fiorentino, Mauro

    2000-02-01

    A new rationale is presented for deriving the probability distribution of floods and for helping to understand the physical processes underlying the distribution itself. On this basis, a model incorporating a number of new assumptions is developed. The basic ideas are as follows: (1) The peak direct streamflow Q can always be expressed as the product of two random variates, namely, the average runoff per unit area ua and the peak contributing area a; (2) the distribution of ua conditional on a can be related to that of the rainfall depth occurring in a duration equal to a characteristic response time τa of the contributing part of the basin; and (3) τa is assumed to vary with a according to a power law. Consequently, the probability density function of Q can be found as the integral, over the total basin area A, of the density function of a times the density function of ua given a. It is suggested that ua can be expressed as a fraction of the excess rainfall and that the annual flood distribution can be related to that of Q by the hypothesis that the flood occurrence process is Poissonian. In the proposed model it is assumed, as an exploratory attempt, that a and ua are gamma and Weibull distributed, respectively. The model was applied to the annual flood series of eight gauged basins in Basilicata (southern Italy) with catchment areas ranging from 40 to 1600 km2. The results showed strong physical consistency, as the parameters tended to assume values in good agreement with well-consolidated geomorphologic knowledge and suggested a new key to understanding the climatic control of the probability distribution of floods.
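    The derived-distribution idea can be sketched with a toy Monte Carlo: draw the peak contributing area a from a gamma distribution and the unit runoff ua from a Weibull distribution, form Q = ua * a, and take annual maxima over a Poisson number of flood events. All parameter values below are illustrative assumptions, not the fitted Basilicata values, and the model's conditional dependence of ua on a is simplified here to independence.

```python
import math
import random

random.seed(42)

# Illustrative parameters only (not fitted to the Basilicata basins):
A_SHAPE, A_SCALE = 2.0, 50.0   # gamma distribution of peak contributing area a
U_SCALE, U_SHAPE = 8.0, 1.5    # Weibull distribution of unit runoff ua
LAMBDA = 5.0                   # mean flood events per year (Poisson occurrence)

def poisson(lam):
    """Knuth's method for sampling a Poisson count (the stdlib has none)."""
    limit = math.exp(-lam)
    k, p = 0, 1.0
    while p > limit:
        k += 1
        p *= random.random()
    return k - 1

def peak_flow():
    """One peak direct streamflow Q = ua * a (independence assumed here,
    unlike the paper's conditional distribution of ua given a)."""
    a = random.gammavariate(A_SHAPE, A_SCALE)
    ua = random.weibullvariate(U_SCALE, U_SHAPE)
    return ua * a

def annual_max():
    """Annual flood: largest peak over a Poisson number of events (0 if none)."""
    n = poisson(LAMBDA)
    return max((peak_flow() for _ in range(n)), default=0.0)

years = sorted(annual_max() for _ in range(2000))
print(f"median annual flood: {years[1000]:.1f} (arbitrary units)")
```

The empirical distribution of `years` is the Monte Carlo analogue of the annual flood distribution the paper derives analytically.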

  11. No compelling positive association between ovarian hormones and wearing red clothing when using multinomial analyses.

    PubMed

    Blake, Khandis R; Dixson, Barnaby J W; O'Dean, Siobhan M; Denson, Thomas F

    2017-04-01

    Several studies report that wearing red clothing enhances women's attractiveness and signals sexual proceptivity to men. The associated hypothesis that women will choose to wear red clothing when fertility is highest, however, has received mixed support from empirical studies. One possible cause of these mixed findings may be methodological. The current study aimed to replicate recent findings suggesting a positive association between hormonal profiles associated with high fertility (high estradiol to progesterone ratios) and the likelihood of wearing red. We compared the effect of the estradiol to progesterone ratio on the probability of wearing: red versus non-red (binary logistic regression); red versus neutral, black, blue, green, orange, multi-color, and gray (multinomial logistic regression); and each of these same colors in separate binary models (e.g., green versus non-green). Red versus non-red analyses showed a positive trend between a high estradiol to progesterone ratio and wearing red, but the effect only arose for younger women and was not robust across samples. We found no compelling evidence for ovarian hormones increasing the probability of wearing red in the other analyses. However, we did find that the probability of wearing neutral was positively associated with the estradiol to progesterone ratio, though the effect did not reach conventional levels of statistical significance. Findings suggest that although ovarian hormones may affect younger women's preference for red clothing under some conditions, the effect is not robust when differentiating amongst other colors of clothing. In addition, the effect of ovarian hormones on clothing color preference may not be specific to the color red. Copyright © 2017 Elsevier Inc. All rights reserved.

  12. Transition metal catalysis in the generation of petroleum and natural gas. Progress report, 1992--1993

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mango, F.

    1993-08-01

    A new hypothesis is introduced for the generation of petroleum and natural gas. The transition metals, activated under the reducing conditions of diagenesis, are proposed as catalysts in the generation of light hydrocarbons. The objective of this proposal is to test that hypothesis. Transition metals (Ni, V, Ti, Co, Fe), in kerogen, porphyrins, and as pure compounds, will be tested under catagenic conditions for catalytic activity in the conversion of normal paraffins and hydrogen into light hydrocarbons. If the hypothesis is correct, kerogenous transition metals should become catalytically active under the reducing conditions of diagenesis and catalyze the conversion of paraffins into the light hydrocarbons seen in petroleum. Moreover, the C1-C4 hydrocarbons generated catalytically should be similar in molecular and isotopic compositions to natural gas.

  13. Initial conditions for cosmological perturbations

    NASA Astrophysics Data System (ADS)

    Ashtekar, Abhay; Gupt, Brajesh

    2017-02-01

    Penrose proposed that the big bang singularity should be constrained by requiring that the Weyl curvature vanishes there. The idea behind this past hypothesis is attractive because it constrains the initial conditions for the universe in geometric terms and is not confined to a specific early universe paradigm. However, the precise statement of Penrose’s hypothesis is tied to classical space-times and furthermore restricts only the gravitational degrees of freedom. These are encapsulated only in the tensor modes of the commonly used cosmological perturbation theory. Drawing inspiration from the underlying idea, we propose a quantum generalization of Penrose’s hypothesis using the Planck regime in place of the big bang, and simultaneously incorporating tensor as well as scalar modes. Initial conditions selected by this generalization constrain the universe to be as homogeneous and isotropic in the Planck regime as permitted by the Heisenberg uncertainty relations.

  14. Using dynamic geometry software for teaching conditional probability with area-proportional Venn diagrams

    NASA Astrophysics Data System (ADS)

    Radakovic, Nenad; McDougall, Douglas

    2012-10-01

    This classroom note illustrates how dynamic visualization can be used to teach conditional probability and Bayes' theorem. There are two features of the visualization that make it an ideal pedagogical tool in probability instruction. The first feature is the use of area-proportional Venn diagrams that, along with showing qualitative relationships, describe the quantitative relationship between two sets. The second feature is the slider and animation component of dynamic geometry software enabling students to observe how the change in the base rate of an event influences conditional probability. A hypothetical instructional sequence using a well-known breast cancer example is described.
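    As an illustration of the Bayes'-theorem calculation the note teaches, here is a short Python sketch using the commonly cited screening figures (base rate 1%, sensitivity 80%, false-positive rate 9.6%); these are standard illustrative values, not numbers taken from the article.

```python
# Commonly cited illustrative screening numbers (assumed, not from the note):
p_cancer = 0.01            # base rate P(C)
p_pos_given_cancer = 0.80  # sensitivity P(+|C)
p_pos_given_healthy = 0.096  # false-positive rate P(+|not C)

# Bayes' theorem: P(C|+) = P(+|C) P(C) / P(+)
p_pos = p_pos_given_cancer * p_cancer + p_pos_given_healthy * (1 - p_cancer)
p_cancer_given_pos = p_pos_given_cancer * p_cancer / p_pos
print(f"P(cancer | positive test) = {p_cancer_given_pos:.3f}")

# The slider idea from the dynamic-geometry software: raising the base rate
# enlarges the 'cancer' region of the area-proportional Venn diagram and
# raises the conditional probability.
for base in (0.01, 0.05, 0.20):
    p = p_pos_given_cancer * base
    p /= p + p_pos_given_healthy * (1 - base)
    print(f"base rate {base:.2f} -> P(cancer | +) = {p:.3f}")
```

The low result at a 1% base rate (under 8%) is exactly the counter-intuitive outcome the area-proportional diagram makes visible: the positive-test region is dominated by healthy cases.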

  15. The World According to de Finetti: On de Finetti's Theory of Probability and Its Application to Quantum Mechanics

    NASA Astrophysics Data System (ADS)

    Berkovitz, Joseph

    Bruno de Finetti is one of the founding fathers of the subjectivist school of probability, where probabilities are interpreted as rational degrees of belief. His work on the relation between the theorems of probability and rationality is among the corner stones of modern subjective probability theory. De Finetti maintained that rationality requires that degrees of belief be coherent, and he argued that the whole of probability theory could be derived from these coherence conditions. De Finetti's interpretation of probability has been highly influential in science. This paper focuses on the application of this interpretation to quantum mechanics. We argue that de Finetti held that the coherence conditions of degrees of belief in events depend on their verifiability. Accordingly, the standard coherence conditions of degrees of belief that are familiar from the literature on subjective probability only apply to degrees of belief in events which could (in principle) be jointly verified; and the coherence conditions of degrees of belief in events that cannot be jointly verified are weaker. While the most obvious explanation of de Finetti's verificationism is the influence of positivism, we argue that it could be motivated by the radical subjectivist and instrumental nature of probability in his interpretation; for as it turns out, in this interpretation it is difficult to make sense of the idea of coherent degrees of belief in, and accordingly probabilities of unverifiable events. We then consider the application of this interpretation to quantum mechanics, concentrating on the Einstein-Podolsky-Rosen experiment and Bell's theorem.

  16. Muscarinic receptors in amygdala control trace fear conditioning.

    PubMed

    Baysinger, Amber N; Kent, Brianne A; Brown, Thomas H

    2012-01-01

    Intelligent behavior requires transient memory, which entails the ability to retain information over short time periods. A newly emerging hypothesis posits that endogenous persistent firing (EPF) is the neurophysiological foundation for aspects or types of transient memory. EPF is enabled by the activation of muscarinic acetylcholine receptors (mAChRs) and is triggered by suprathreshold stimulation. EPF occurs in several brain regions, including the lateral amygdala (LA). The present study examined the role of amygdalar mAChRs in trace fear conditioning, a paradigm that requires transient memory. If mAChR-dependent EPF selectively supports transient memory, then blocking amygdalar mAChRs should impair trace conditioning, while sparing delay and context conditioning, which presumably do not rely upon transient memory. To test the EPF hypothesis, LA was bilaterally infused, prior to trace or delay conditioning, with either a mAChR antagonist (scopolamine) or saline. Computerized video analysis quantified the amount of freezing elicited by the cue and by the training context. Scopolamine infusion profoundly reduced freezing in the trace conditioning group but had no significant effect on delay or context conditioning. This pattern of results was uniquely anticipated by the EPF hypothesis. The present findings are discussed in terms of a systems-level theory of how EPF in LA and several other brain regions might help support trace fear conditioning.

  17. Prediction and visualization of redox conditions in the groundwater of Central Valley, California

    NASA Astrophysics Data System (ADS)

    Rosecrans, Celia Z.; Nolan, Bernard T.; Gronberg, JoAnn M.

    2017-03-01

    Regional-scale, three-dimensional continuous probability models were constructed for aspects of redox conditions in the groundwater system of the Central Valley, California. These models yield grids depicting the probability that groundwater in a particular location will have dissolved oxygen (DO) concentrations less than selected threshold values representing anoxic groundwater conditions, or will have dissolved manganese (Mn) concentrations greater than selected threshold values representing secondary drinking water-quality contaminant levels (SMCL) and health-based screening levels (HBSL). The probability models were constrained by the alluvial boundary of the Central Valley to a depth of approximately 300 m. Probability distribution grids can be extracted from the 3-D models at any desired depth, and are of interest to water-resource managers, water-quality researchers, and groundwater modelers concerned with the occurrence of natural and anthropogenic contaminants related to anoxic conditions. Models were constructed using a Boosted Regression Trees (BRT) machine learning technique that produces many trees as part of an additive model and has the ability to handle many variables, automatically incorporate interactions, and resist collinearity. Machine learning methods for statistical prediction are becoming increasingly popular in that they do not require the assumptions associated with traditional hypothesis testing. Models were constructed using measured dissolved oxygen and manganese concentrations sampled from 2767 wells within the alluvial boundary of the Central Valley, and over 60 explanatory variables representing regional-scale soil properties, soil chemistry, land use, aquifer textures, and aquifer hydrologic properties. Models were trained on a USGS dataset of 932 wells, and evaluated on an independent hold-out dataset of 1835 wells from the California Division of Drinking Water. 
We used cross-validation to assess the predictive performance of models of varying complexity, as a basis for selecting final models. Trained models were applied to cross-validation testing data and a separate hold-out dataset to evaluate model predictive performance, emphasizing three fit metrics: Kappa, accuracy, and the area under the receiver operating characteristic curve (ROC). The final trained models were used for mapping predictions at discrete depths to a depth of 304.8 m. Trained DO and Mn models had accuracies of 86-100%, Kappa values of 0.69-0.99, and ROC values of 0.92-1.0. Model accuracies for cross-validation testing datasets were 82-95% and ROC values were 0.87-0.91, indicating good predictive performance. Kappas for the cross-validation testing dataset were 0.30-0.69, indicating fair to substantial agreement between testing observations and model predictions. Hold-out data were available for the manganese model only and indicated accuracies of 89-97%, ROC values of 0.73-0.75, and Kappa values of 0.06-0.30. The predictive performance of both the DO and Mn models was reasonable, considering all three of these fit metrics and the low percentages of low-DO and high-Mn events in the data.
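    The three fit metrics used above (accuracy, Cohen's kappa, ROC AUC) can be computed from scratch with the stdlib. The labels and predicted probabilities below are a tiny hypothetical hold-out set invented for illustration, not the Central Valley data.

```python
# Hypothetical hold-out set (1 = anoxic, i.e. DO below threshold) and
# predicted probabilities from some classifier -- invented for illustration.
y_true = [1, 0, 1, 1, 0, 0, 1, 0, 0, 1]
p_hat = [0.9, 0.2, 0.8, 0.3, 0.6, 0.1, 0.7, 0.4, 0.35, 0.55]

def accuracy_and_kappa(y, p, thr=0.5):
    """Accuracy at a probability threshold, and Cohen's kappa, which
    discounts the agreement expected by chance from the marginals."""
    yhat = [1 if v >= thr else 0 for v in p]
    n = len(y)
    acc = sum(a == b for a, b in zip(y, yhat)) / n
    p1t, p1h = sum(y) / n, sum(yhat) / n
    p_chance = p1t * p1h + (1 - p1t) * (1 - p1h)
    return acc, (acc - p_chance) / (1 - p_chance)

def roc_auc(y, p):
    """AUC as the probability that a random positive outscores a random
    negative (ties count half)."""
    pos = [v for v, t in zip(p, y) if t == 1]
    neg = [v for v, t in zip(p, y) if t == 0]
    wins = sum((a > b) + 0.5 * (a == b) for a in pos for b in neg)
    return wins / (len(pos) * len(neg))

acc, kappa = accuracy_and_kappa(y_true, p_hat)
print(f"accuracy={acc:.2f} kappa={kappa:.2f} auc={roc_auc(y_true, p_hat):.2f}")
```

Note how kappa (0.6 here) is well below accuracy (0.8): with balanced marginals, half the raw agreement is expected by chance, which is why the paper reports kappa alongside accuracy for its imbalanced low-DO and high-Mn classes.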

  18. Internal Medicine residents use heuristics to estimate disease probability

    PubMed Central

    Phang, Sen Han; Ravani, Pietro; Schaefer, Jeffrey; Wright, Bruce; McLaughlin, Kevin

    2015-01-01

    Background Training in Bayesian reasoning may have limited impact on the accuracy of probability estimates. In this study, our goal was to explore whether residents previously exposed to Bayesian reasoning use heuristics rather than Bayesian reasoning to estimate disease probabilities. We predicted that if residents use heuristics then post-test probability estimates would be increased by non-discriminating clinical features or a high anchor for a target condition. Method We randomized 55 Internal Medicine residents to different versions of four clinical vignettes and asked them to estimate probabilities of target conditions. We manipulated the clinical data for each vignette to be consistent with either 1) use of the representativeness heuristic, by adding non-discriminating prototypical clinical features of the target condition, or 2) use of the anchoring-with-adjustment heuristic, by providing a high or low anchor for the target condition. Results When presented with additional non-discriminating data the odds of diagnosing the target condition were increased (odds ratio (OR) 2.83, 95% confidence interval [1.30, 6.15], p = 0.009). Similarly, the odds of diagnosing the target condition were increased when a high anchor preceded the vignette (OR 2.04, [1.09, 3.81], p = 0.025). Conclusions Our findings suggest that despite previous exposure to Bayesian reasoning, residents use heuristics, such as representativeness and anchoring with adjustment, to estimate probabilities. Potential reasons for attribute substitution include the relative cognitive ease of heuristics vs. Bayesian reasoning, or the possibility that residents in clinical practice use gist traces rather than precise probability estimates when diagnosing. PMID:27004080
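    The normative calculation the residents were expected to apply can be written on the odds scale: posterior odds = prior odds x likelihood ratio. A non-discriminating feature has a likelihood ratio of 1 and so should leave the estimate unchanged; the numbers below are illustrative, not from the study.

```python
def post_test_probability(pretest, likelihood_ratio):
    """Bayes on the odds scale: posterior odds = prior odds * LR."""
    odds = pretest / (1 - pretest) * likelihood_ratio
    return odds / (1 + odds)

pretest = 0.20  # illustrative pre-test probability of the target condition

# A discriminating finding (LR = 4) should raise the probability...
print(post_test_probability(pretest, 4.0))  # prints 0.5
# ...but a non-discriminating prototypical feature has LR = 1 and, under
# Bayesian reasoning, changes nothing. The residents' estimates rose anyway,
# the signature of the representativeness heuristic.
print(post_test_probability(pretest, 1.0))  # prints 0.2
```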

  19. Clairvoyant fusion: a new methodology for designing robust detection algorithms

    NASA Astrophysics Data System (ADS)

    Schaum, Alan

    2016-10-01

    Many realistic detection problems cannot be solved with simple statistical tests for known alternative probability models. Uncontrollable environmental conditions, imperfect sensors, and other uncertainties transform simple detection problems with likelihood ratio solutions into composite hypothesis (CH) testing problems. Recently many multi- and hyperspectral sensing CH problems have been addressed with a new approach. Clairvoyant fusion (CF) integrates the optimal detectors ("clairvoyants") associated with every unspecified value of the parameters appearing in a detection model. For problems with discrete parameter values, logical rules emerge for combining the decisions of the associated clairvoyants. For many problems with continuous parameters, analytic methods of CF have been found that produce closed-form solutions, or approximations for intractable problems. Here the principles of CF are reviewed and mathematical insights are described that have proven useful in the derivation of solutions. It is also shown how a second-stage fusion procedure can be used to create theoretically superior detection algorithms for all discrete parameter problems.

  20. Heat conduction in periodic laminates with probabilistic distribution of material properties

    NASA Astrophysics Data System (ADS)

    Ostrowski, Piotr; Jędrysiak, Jarosław

    2017-04-01

    This contribution deals with heat conduction in a two-phase laminate made of micro-laminas periodically distributed along one direction. In general, Fourier's law for heat conduction in such a composite has highly oscillating and discontinuous coefficients. Therefore, the tolerance averaging technique (cf. Woźniak et al. in Thermomechanics of microheterogeneous solids and structures. Monografie - Politechnika Łódzka, Wydawnictwo Politechniki Łódzkiej, Łódź, 2008) is applied. Based on this technique, the averaged differential equations for a tolerance-asymptotic model are derived and solved analytically for given initial-boundary conditions. The second part of this contribution investigates the effect of the material-properties ratio ω of the two components on the total temperature field θ, under the assumption that the conductivities of the micro-laminas are not necessarily uniquely determined. Numerical experiments (Monte Carlo simulation) are executed under the assumption that ω is a random variable with a fixed probability distribution. At the end, based on the obtained results, a crucial hypothesis is formulated.
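    The Monte Carlo idea can be illustrated with a much simpler model than the paper's tolerance-averaged equations: the classical effective-conductivity bounds for a two-phase laminate, the harmonic mean across the layers and the arithmetic mean along them, evaluated over random draws of the conductivity ratio ω. The uniform distribution and volume fractions below are assumptions for illustration only.

```python
import random
import statistics

random.seed(1)

K1 = 1.0           # conductivity of phase 1 (fixed reference)
F1, F2 = 0.5, 0.5  # volume fractions of the two micro-laminas

def k_series(k1, k2):
    """Effective conductivity for heat flow across the layers (harmonic mean)."""
    return 1.0 / (F1 / k1 + F2 / k2)

def k_parallel(k1, k2):
    """Effective conductivity for heat flow along the layers (arithmetic mean)."""
    return F1 * k1 + F2 * k2

# Monte Carlo over the material-properties ratio w = k2/k1, assumed uniform
# on [0.2, 5.0] (an illustrative distribution, not the paper's).
samples = [random.uniform(0.2, 5.0) for _ in range(5000)]
series = [k_series(K1, K1 * w) for w in samples]
parallel = [k_parallel(K1, K1 * w) for w in samples]
print(f"mean k_series   = {statistics.mean(series):.3f}")
print(f"mean k_parallel = {statistics.mean(parallel):.3f}")
```

For every draw the arithmetic (parallel) bound dominates the harmonic (series) bound, so the spread between the two sample means gives a first feel for how the randomness of ω propagates into the effective thermal response.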

  1. [Hypopituitarism following traumatic brain injury: diagnostic and therapeutic issues].

    PubMed

    Lecoq, A-L; Chanson, P

    2015-10-01

    Traumatic Brain Injury (TBI) is a well-known public health problem worldwide and is a leading cause of death and disability, particularly in young adults. Besides neurological and psychiatric issues, pituitary dysfunction can also occur after TBI, in the acute or chronic phase. The exact prevalence of post-traumatic hypopituitarism is difficult to assess due to the wide heterogeneity of published studies and bias in interpretation of hormonal test results in this specific population. Predictive factors for hypopituitarism have been proposed and are helpful for the screening. The pathophysiology of pituitary dysfunction after TBI is not well understood but the vascular hypothesis is privileged. Activation of pituitary stem/progenitor cells is probably involved in the recovery of pituitary functions. Those cells also play a role in the induction of pituitary tumors, highlighting their crucial place in pituitary conditions. This review updates the current data related to anterior pituitary dysfunction after TBI and discusses the bias and difficulties encountered in its diagnosis. Copyright © 2015 Elsevier Masson SAS. All rights reserved.

  2. Interaction of mercury and selenium in the larval stage zebrafish vertebrate model.

    PubMed

    MacDonald, Tracy C; Korbas, Malgorzata; James, Ashley K; Sylvain, Nicole J; Hackett, Mark J; Nehzati, Susan; Krone, Patrick H; George, Graham N; Pickering, Ingrid J

    2015-08-01

    The compounds of mercury can be more toxic than those of any other non-radioactive heavy element. Despite this, environmental mercury pollution and human exposure to mercury are widespread, and are increasing. While the unusual ability of selenium to cancel the toxicity of mercury compounds has been known for nearly five decades, only recently have some aspects of the molecular mechanisms begun to be understood. We report herein a study of the interaction of mercury and selenium in the larval stage zebrafish, a model vertebrate system, using X-ray fluorescence imaging. Exposure of larval zebrafish to inorganic mercury shows nano-scale structures containing co-localized mercury and selenium. No such co-localization is seen with methylmercury exposure under similar conditions. Micro X-ray absorption spectra support the hypothesis that the co-localized deposits are most likely composed of the highly insoluble mixed chalcogenide HgSxSe(1-x), where x is 0.4-0.9, probably with the cubic zincblende structure.

  3. Thinking about touch facilitates tactile but not auditory processing.

    PubMed

    Anema, Helen A; de Haan, Alyanne M; Gebuis, Titia; Dijkerman, H Chris

    2012-05-01

    Mental imagery is considered to be important for normal conscious experience. It is most frequently investigated in the visual, auditory, and motor domains (imagination of movement), while studies on tactile imagery (imagination of touch) are scarce. The current study investigated the effect of tactile and auditory imagery on left/right discriminations of tactile and auditory stimuli. In line with our hypothesis, we observed that after tactile imagery, tactile stimuli were responded to faster than auditory stimuli, and vice versa. On average, tactile stimuli were responded to faster than auditory stimuli, and stimuli in the imagery condition were on average responded to more slowly than at baseline (left/right discrimination without an imagery assignment). The former is probably due to the spatial and somatotopic proximity of the fingers receiving the taps and the thumbs performing the response (button press), the latter to a dual-task cost. Together, these results provide the first evidence of a behavioural effect of a tactile imagery assignment on the perception of real tactile stimuli.

  4. Rule-based reasoning is fast and belief-based reasoning can be slow: Challenging current explanations of belief-bias and base-rate neglect.

    PubMed

    Newman, Ian R; Gibb, Maia; Thompson, Valerie A

    2017-07-01

    It is commonly assumed that belief-based reasoning is fast and automatic, whereas rule-based reasoning is slower and more effortful. Dual-Process theories of reasoning rely on this speed-asymmetry explanation to account for a number of reasoning phenomena, such as base-rate neglect and belief-bias. The goal of the current study was to test this hypothesis about the relative speed of belief-based and rule-based processes. Participants solved base-rate problems (Experiment 1) and conditional inferences (Experiment 2) under a challenging deadline; they then gave a second response in free time. We found that fast responses were informed by rules of probability and logical validity, and that slow responses incorporated belief-based information. Implications for Dual-Process theories and future research options for dissociating Type I and Type II processes are discussed. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  5. The Tortoise and the Hare. Small-Game Use, the Broad-Spectrum Revolution, and Paleolithic Demography.

    PubMed

    Stiner; Munro; Surovell

    2000-02-01

    This study illustrates the potential of small-game data for identifying and dating Paleolithic demographic pulses such as those associated with modern human origins and the later evolution of food-producing economies. Archaeofaunal series from Israel and Italy serve as our examples. Three important implications of this study are that (1) early Middle Paleolithic populations were exceptionally small and highly dispersed, (2) the first major population growth pulse in the eastern Mediterranean probably occurred before the end of the Middle Paleolithic, and (3) subsequent demographic pulses in the Upper and Epi-Paleolithic greatly reshaped the conditions of selection that operated on human subsistence ecology, technology, and society. The findings of this study are consistent with the main premise of Flannery's broad-spectrum-revolution hypothesis. However, ranking small prey in terms of work of capture (in the absence of special harvesting tools) proved far more effective in this investigation of human diet breadth than have the taxonomic-diversity analyses published previously.

  6. One explanatory basis for the discrepancy of reported prevalences of sleep paralysis among healthy respondents.

    PubMed

    Fukuda, K

    1993-12-01

    In a previous study, the author and coworkers found that 39.8% of healthy young adults had experienced sleep paralysis. Some other studies reported prevalences about the same as or higher than that estimate (i.e., 40.7% to 62.0%), while yet other studies, including Goode's work cited by the ASDC and ASDA classifications, suggested much lower prevalences (i.e., 4.7% to 26.2%). The author tested the hypothesis that this discrepancy among the reported prevalences is partly due to the expression used in each questionnaire. University students who answered the questionnaire using the term 'transient paralysis' reported the lower prevalence (26.4%), while the second group of respondents who answered the questionnaire using the term kanashibari, the Japanese folklore expression for sleep paralysis, gave the higher prevalence (39.3%). The third group, who answered the questionnaire with the term 'condition,' probably a rather neutral expression, fell in the middle (31.0%).

  7. Oxidative peptide (and amide) formation from Schiff base complexes

    NASA Technical Reports Server (NTRS)

    Strehler, B. L.; Li, M. P.; Martin, K.; Fliss, H.; Schmid, P.

    1982-01-01

    One hypothesis of the origin of pre-modern forms of life is that the original replicating molecules were specific polypeptides which acted as templates for the assembly of poly-Schiff bases complementary to the template, and that these polymers were then oxidized to peptide linkages, probably by photo-produced oxidants. A double cycle of such anti-parallel complementary replication would yield the original peptide polymer. If this model were valid, the Schiff base between an N-acyl alpha-amino aldehyde and an amino acid should yield a dipeptide in aqueous solution in the presence of an appropriate oxidant. In the present study it is shown that the substituted dipeptide, N-acetyl-tyrosyl-tyrosine, is produced in high yield in aqueous solution at pH 9 through the action of H2O2 on the Schiff-base complex between N-acetyl-tyrosinal and tyrosine, and that a great variety of N-acyl amino acids are formed from amino acids and aliphatic aldehydes under similar conditions.

  8. Conservative Analytical Collision Probabilities for Orbital Formation Flying

    NASA Technical Reports Server (NTRS)

    Carpenter, J. Russell

    2004-01-01

    The literature offers a number of approximations for analytically and/or efficiently computing the probability of collision between two space objects. However, only one of these techniques is a completely analytical approximation that is suitable for use in the preliminary design phase, when it is more important to quickly analyze a large segment of the trade space than it is to precisely compute collision probabilities. Unfortunately, among the types of formations that one might consider, some combine a range of conditions for which this analytical method is less suitable. This work proposes a simple, conservative approximation that produces reasonable upper bounds on the collision probability in such conditions. Although its estimates are much too conservative under other conditions, such conditions are typically well suited for use of the existing method.
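
    The abstract does not give the paper's analytical approximation, but the quantity being bounded can be illustrated with a brute-force numerical reference: in the standard short-encounter framing, the collision probability is the integral of the relative-position error density (here a 2-D Gaussian with diagonal covariance, an assumption) over the combined hard-body circle. The function name and grid scheme below are illustrative, not from the paper.

```python
import math

def collision_probability(hbr, mu, sigma, n=400):
    """Grid-integrate a 2-D Gaussian (diagonal covariance) over a circle of
    radius hbr (combined hard-body radius) centered at the origin.
    mu: mean relative position (miss-distance components); sigma: std devs."""
    mx, my = mu
    sx, sy = sigma
    h = 2.0 * hbr / n
    total = 0.0
    for i in range(n):
        x = -hbr + (i + 0.5) * h
        for j in range(n):
            y = -hbr + (j + 0.5) * h
            if x * x + y * y <= hbr * hbr:  # keep only points inside the circle
                total += math.exp(-0.5 * (((x - mx) / sx) ** 2
                                          + ((y - my) / sy) ** 2))
    return total * h * h / (2.0 * math.pi * sx * sy)
```

    A conservative analytical method of the kind described would aim to upper-bound this integral cheaply across a whole formation trade space rather than evaluate it point by point.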

  10. The comparison of socio-economic conditions and personal hygiene habits of neuro-Behçet's disease and multiple sclerosis patients.

    PubMed

    Pehlivan, Münevver; Kürtüncü, Murat; Tüzün, Erdem; Shugaiv, Erkingül; Mutlu, Melike; Eraksoy, Mefküre; Akman-Demir, Gülşen

    2011-07-01

    The "hygiene hypothesis" suggests that a reduction in exposure to infectious agents due to improved health conditions has contributed to the increased incidence of autoimmune disorders in developed countries. In keeping with the hygiene hypothesis, many autoimmune disorders such as multiple sclerosis (MS) are more frequently observed in developed countries. To assess the relevance of the hygiene hypothesis to neuro-Behçet's disease (NBD), another chronic inflammatory disease of the central nervous system, we developed and administered a multiple-choice questionnaire to evaluate the hygiene conditions and practices of age- and gender-matched NBD patients (n = 50) and control MS (n = 50) and headache (n = 50) patients. Overall, MS patients ranked highest in socio-economic and hygiene features, whereas NBD patients belonged to a lower socio-economic status group and showed poorer hygiene conditions than MS and headache controls. These poor hygiene conditions might increase exposure to infectious agents that might, at least in part, trigger the inflammatory responses involved in NBD pathogenesis. Copyright © 2011 Elsevier GmbH. All rights reserved.

  11. Test for age-specificity in survival of the common tern

    USGS Publications Warehouse

    Nisbet, I.C.T.; Cam, E.

    2002-01-01

    Much effort in life-history theory has been addressed to the dependence of life-history traits on age, especially the phenomenon of senescence and its evolution. Although senescent declines in survival are well documented in humans and in domestic and laboratory animals, evidence for their occurrence and importance in wild animal species remains limited and equivocal. Several recent papers have suggested that methodological issues may contribute to this problem, and have encouraged investigators to improve sampling designs and to analyse their data using recently developed approaches to modelling of capture-mark-recapture data. Here we report on a three-year, two-site, mark-recapture study of known-aged common terns (Sterna hirundo) in the north-eastern USA. The study was nested within a long-term ecological study in which large numbers of chicks had been banded in each year for > 25 years. We used a range of models to test the hypothesis of an influence of age on survival probability. We also tested for a possible influence of sex on survival. The cross-sectional design of the study (one year's parameter estimates) avoided the possible confounding of effects of age and time. The study was conducted at a time when one of the study sites was being colonized and numbers were increasing rapidly. We detected two-way movements between the sites and estimated movement probabilities in the year for which they could be modelled. We also obtained limited data on emigration from our study area to more distant sites. We found no evidence that survival depended on either sex or age, except that survival was lower among the youngest birds (ages 2-3 years). Despite the large number of birds included in the study (1599 known-aged birds, 2367 total), confidence limits on estimates of survival probability were wide, especially for the oldest age-classes, so that a slight decline in survival late in life could not have been detected. In addition, the cross-sectional design of this study meant that a decline in survival probability within individuals (actuarial senescence) could have been masked by heterogeneity in survival probability among individuals (mortality selection). This emphasizes the need for the development of modelling tools permitting separation of these two phenomena, valid under field conditions in which the recapture probabilities are less than one.

  12. Analysis of kinematically redundant reaching movements using the equilibrium-point hypothesis.

    PubMed

    Cesari, P; Shiratori, T; Olivato, P; Duarte, M

    2001-03-01

    Six subjects performed a planar reaching arm movement to a target while unpredictable perturbations were applied to the endpoint; the perturbations consisted of pulling springs of different stiffness. Two conditions were applied: in the first, subjects had to reach the target despite the perturbation; in the second, subjects were asked not to correct the motion when a perturbation was applied. We analyzed the kinematic profiles of the three arm segments and, by means of inverse dynamics, calculated the joint torques. The framework of the equilibrium-point (EP) hypothesis, the lambda model, allowed reconstruction of the control variables, the "equilibrium trajectories", in the "do not correct" condition for the wrist and elbow joints as well as for the endpoint final position, while for the other condition the reconstruction was less reliable. The findings support the paradigm of the EP hypothesis under the "do not correct" instruction and extend it to a multiple-joint planar movement.

  13. Implementation of the Iterative Proportion Fitting Algorithm for Geostatistical Facies Modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li Yupeng, E-mail: yupeng@ualberta.ca; Deutsch, Clayton V.

    2012-06-15

    In geostatistics, most stochastic algorithms for the simulation of categorical variables such as facies or rock types require a conditional probability distribution. The multivariate probability distribution of all the grouped locations, including the unsampled location, permits calculation of the conditional probability directly from its definition. In this article, the iterative proportion fitting (IPF) algorithm is implemented to infer this multivariate probability. Using the IPF algorithm, the multivariate probability is obtained by iterative modification of an initial estimated multivariate probability, using lower-order bivariate probabilities as constraints. The imposed bivariate marginal probabilities are inferred from profiles along drill holes or wells. In the IPF process, a sparse matrix is used to calculate the marginal probabilities from the multivariate probability, which makes the iterative fitting more tractable and practical. This algorithm can be extended to higher-order marginal probability constraints as used in multiple-point statistics. The theoretical framework is developed and illustrated with estimation and simulation examples.
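
    The core fitting step can be sketched in a few lines: starting from an initial joint table, alternately rescale rows and columns until the table's marginals match the imposed ones. This minimal two-way sketch (function name and the example marginals are illustrative, not from the paper) converges to the unique table with the given marginals that is closest, in the Kullback-Leibler sense, to the starting estimate.

```python
def ipf(joint, row_marg, col_marg, iters=50):
    """Iterative proportional fitting: alternately rescale the rows and
    columns of an initial joint probability table until its marginals
    match the imposed bivariate marginals."""
    p = [row[:] for row in joint]
    nr, nc = len(p), len(p[0])
    for _ in range(iters):
        for i in range(nr):                      # fit row marginals
            s = sum(p[i])
            p[i] = [v * row_marg[i] / s for v in p[i]]
        for j in range(nc):                      # fit column marginals
            s = sum(p[i][j] for i in range(nr))
            for i in range(nr):
                p[i][j] *= col_marg[j] / s
    return p

# Two binary facies indicators, starting from a uniform joint table:
fitted = ipf([[0.25, 0.25], [0.25, 0.25]], [0.7, 0.3], [0.6, 0.4])
```

    Starting from a uniform (independent) table, the fitted result is simply the product of the marginals; a non-uniform starting table would retain its interaction structure while honoring the constraints.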

  14. Using Dynamic Geometry Software for Teaching Conditional Probability with Area-Proportional Venn Diagrams

    ERIC Educational Resources Information Center

    Radakovic, Nenad; McDougall, Douglas

    2012-01-01

    This classroom note illustrates how dynamic visualization can be used to teach conditional probability and Bayes' theorem. There are two features of the visualization that make it an ideal pedagogical tool in probability instruction. The first feature is the use of area-proportional Venn diagrams that, along with showing qualitative relationships,…
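
    The relationship such area-proportional diagrams visualize is just Bayes' theorem applied to two events. A minimal numeric sketch, using hypothetical screening-test numbers rather than anything from the note:

```python
def bayes_posterior(prior, sensitivity, specificity):
    """P(condition | positive test) via Bayes' theorem:
    P(C|+) = P(+|C) * P(C) / P(+)."""
    p_pos = sensitivity * prior + (1.0 - specificity) * (1.0 - prior)
    return sensitivity * prior / p_pos

# Hypothetical screening test: 1% prevalence, 90% sensitivity, 95% specificity.
ppv = bayes_posterior(0.01, 0.90, 0.95)
```

    With these numbers the posterior is only about 0.15: most positives come from the large healthy population, which is exactly the proportion an area-proportional Venn diagram makes visible at a glance.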

  15. Differences in cytocompatibility, dynamics of the oxide layers' formation, and nickel release between superelastic and thermo-activated nickel-titanium archwires.

    PubMed

    Čolić, Miodrag; Tomić, Sergej; Rudolf, Rebeka; Marković, Evgenija; Šćepan, Ivana

    2016-08-01

    Superelastic (SE) and thermo-activated (TA) nickel-titanium (NiTi) archwires are used in everyday orthodontic practice, based on their acceptable biocompatibility and well-defined shape-memory properties. However, the differences in their surface microstructure and cytotoxicity have not been clearly defined, and the standard cytotoxicity tests are too robust to detect small differences in the cytotoxicity of these alloys, all of which can lead to unexpected adverse reactions in some patients. Therefore, we tested the hypothesis that differences in the manufacture and microstructure of commercially available SE and TA archwires may influence their biocompatibility. The archwires were studied as received and after conditioning for 24 h or 35 days in a cell culture medium under static conditions. All of the tested archwires, including their conditioned medium (CM), were non-cytotoxic for L929 cells, but Rematitan SE (both as received and conditioned) induced apoptosis of rat thymocytes in direct contact. In contrast, TruFlex SE and Equire TA increased the proliferation of thymocytes. The cytotoxic effect of Rematitan SE correlated with a higher release of Ni ions into the CM, a higher concentration of surface Ni, and an increased oxide-layer thickness after conditioning. In conclusion, the apoptosis assay on rat thymocytes, in contrast to the less sensitive standard assay on L929 cells, revealed that Rematitan SE was less cytocompatible than the other archwires, an effect most probably associated with higher exposure of the cells to Ni on the surface of the archwire, due to the formation of an unstable oxide layer.

  16. Surveyor V: Discussion of chemical analysis

    USGS Publications Warehouse

    Gault, D.E.; Adams, J.B.; Collins, R.J.; Green, J.; Kuiper, G.P.; Mazursky, H.; O'Keefe, J. A.; Phinney, R.A.; Shoemaker, E.M.

    1967-01-01

    Material of basaltic composition at the Surveyor V landing site implies that differentiation has occurred in the moon, probably due to internal sources of heat. The results are consistent with the hypothesis that extensive volcanic flows have been responsible for flooding and filling the mare basins. The processes and products of lunar magmatic activity are apparently similar to those of the earth.

  18. Proceedings of the Third Annual Symposium on Mathematical Pattern Recognition and Image Analysis

    NASA Technical Reports Server (NTRS)

    Guseman, L. F., Jr.

    1985-01-01

    Topics addressed include: multivariate spline method; normal mixture analysis applied to remote sensing; image data analysis; classifications in spatially correlated environments; probability density functions; graphical nonparametric methods; subpixel registration analysis; hypothesis integration in image understanding systems; rectification of satellite scanner imagery; spatial variation in remotely sensed images; smooth multidimensional interpolation; and optimal frequency domain textural edge detection filters.

  19. Second-order asymptotics for quantum hypothesis testing in settings beyond i.i.d. - quantum lattice systems and more

    NASA Astrophysics Data System (ADS)

    Datta, Nilanjana; Pautrat, Yan; Rouzé, Cambyse

    2016-06-01

    Quantum Stein's lemma is a cornerstone of quantum statistics and concerns the problem of correctly identifying a quantum state, given the knowledge that it is one of two specific states (ρ or σ). It was originally derived in the asymptotic i.i.d. setting, in which arbitrarily many (say, n) identical copies of the state (ρ⊗n or σ⊗n) are considered to be available. In this setting, the lemma states that, for any given upper bound on the probability αn of erroneously inferring the state to be σ, the probability βn of erroneously inferring the state to be ρ decays exponentially in n, with the rate of decay converging to the relative entropy of the two states. The second order asymptotics for quantum hypothesis testing, which establishes the speed of convergence of this rate of decay to its limiting value, was derived in the i.i.d. setting independently by Tomamichel and Hayashi, and Li. We extend this result to settings beyond i.i.d. Examples of these include Gibbs states of quantum spin systems (with finite-range, translation-invariant interactions) at high temperatures, and quasi-free states of fermionic lattice gases.
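
    For reference, the first- and second-order statements in the i.i.d. setting can be written compactly as follows (a sketch of the standard results the abstract describes; the paper's contribution is extending the second-order expansion beyond i.i.d.):

```latex
% Quantum Stein's lemma (i.i.d. setting): with the type-I error held below
% a fixed \varepsilon \in (0,1), the optimal type-II error satisfies
\lim_{n\to\infty} -\frac{1}{n}\log\beta_n
  = D(\rho\|\sigma)
  := \operatorname{Tr}\!\bigl[\rho\,(\log\rho - \log\sigma)\bigr].

% Second-order refinement (Tomamichel--Hayashi; Li):
-\log\beta_n
  = n\,D(\rho\|\sigma) + \sqrt{n\,V(\rho\|\sigma)}\;\Phi^{-1}(\varepsilon)
    + O(\log n),
% where V(\rho\|\sigma) is the quantum information variance and
% \Phi^{-1} is the inverse of the standard normal CDF.
```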

  20. Induced vibrations facilitate traversal of cluttered obstacles

    NASA Astrophysics Data System (ADS)

    Thoms, George; Yu, Siyuan; Kang, Yucheng; Li, Chen

    When negotiating cluttered terrains such as grass-like beams, cockroaches and legged robots with rounded body shapes most often rolled their bodies to traverse narrow gaps between beams. Recent locomotion energy landscape modeling suggests that this locomotor pathway overcomes the lowest potential energy barriers. Here, we tested the hypothesis that body vibrations induced by intermittent leg-ground contact facilitate obstacle traversal by allowing exploration of the locomotion energy landscape to find this lowest-barrier pathway. To mimic a cockroach or legged robot pushing against two adjacent blades of grass, we developed an automated robotic system to move an ellipsoidal body into two adjacent beams, and varied body vibrations by controlling an oscillation actuator. A novel gyroscope mechanism allowed the body to freely rotate in response to interaction with the beams, and an IMU and cameras recorded the motion of the body and beams. We discovered that body vibrations facilitated body rolling, significantly increasing traversal probability and reducing traversal time (P < 0.0001, ANOVA). Traversal probability increased, and traversal time decreased, with beam separation. These results confirmed our hypothesis and support the plausibility of locomotion energy landscapes for understanding the formation of locomotor pathways in complex 3-D terrains.

  1. Multireader multicase reader studies with binary agreement data: simulation, analysis, validation, and sizing.

    PubMed

    Chen, Weijie; Wunderlich, Adam; Petrick, Nicholas; Gallas, Brandon D

    2014-10-01

    We treat multireader multicase (MRMC) reader studies for which a reader's diagnostic assessment is converted to binary agreement (1: agree with the truth state, 0: disagree with the truth state). We present a mathematical model for simulating binary MRMC data with a desired correlation structure across readers, cases, and two modalities, assuming the expected probability of agreement is equal for the two modalities (P1 = P2). This model can be used to validate the coverage probabilities of 95% confidence intervals (of P1, P2, or P1 - P2 when P1 - P2 = 0), validate the type I error of a superiority hypothesis test, and size a noninferiority hypothesis test (which assumes P1 = P2). To illustrate the utility of our simulation model, we adapt the Obuchowski-Rockette-Hillis (ORH) method for the analysis of MRMC binary agreement data. Moreover, we use our simulation model to validate the ORH method for binary data and to illustrate sizing in a noninferiority setting. Our software package is publicly available on the Google code project hosting site for use in simulation, analysis, validation, and sizing of MRMC reader studies with binary agreement data.
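
    The kind of coverage validation described can be sketched in miniature for the simplest case: i.i.d. binary agreement scores and a Wald interval, ignoring the reader/case correlation structure the paper's model is designed to capture. The function name and study sizes below are illustrative assumptions.

```python
import random

def wald_coverage(p_true, n_cases, n_trials=2000, z=1.96, seed=0):
    """Monte Carlo estimate of the coverage of a 95% Wald interval for an
    agreement probability, for i.i.d. binary agreement scores."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_trials):
        k = sum(rng.random() < p_true for _ in range(n_cases))  # agreements
        phat = k / n_cases
        half = z * (phat * (1.0 - phat) / n_cases) ** 0.5
        hits += phat - half <= p_true <= phat + half
    return hits / n_trials

cov = wald_coverage(0.8, 200)  # should land near the nominal 0.95
```

    A full MRMC validation replaces the i.i.d. draws with the correlated reader-by-case structure and checks coverage of the variance estimator (here ORH) under that structure.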

  3. Bayesian adaptive phase II screening design for combination trials.

    PubMed

    Cai, Chunyan; Yuan, Ying; Johnson, Valen E

    2013-01-01

    Trials of combination therapies for the treatment of cancer are playing an increasingly important role in the battle against this disease. To more efficiently handle the large number of combination therapies that must be tested, we propose a novel Bayesian phase II adaptive screening design to select simultaneously among possible treatment combinations involving multiple agents. Our design is based on formulating the selection procedure as a Bayesian hypothesis testing problem in which the superiority of each treatment combination is equated to a single hypothesis. During trial conduct, we use the current values of the posterior probabilities of all hypotheses to adaptively allocate patients to treatment combinations. Simulation studies show that the proposed design substantially outperforms the conventional multiarm balanced factorial trial design: it yields a significantly higher probability of selecting the best treatment while allocating substantially more patients to efficacious treatments, and provides higher power to identify the best treatment at the end of the trial. The design is most appropriate for trials that combine multiple agents and screen for the efficacious combinations to be investigated further.
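
    The adaptive-allocation ingredient can be illustrated with the simplest conjugate version: Beta-Binomial posteriors per combination and a Monte Carlo estimate of the probability that each combination has the highest response rate. This is a generic sketch under independent Beta(1, 1) priors, not the paper's specific hypothesis formulation; the counts are hypothetical.

```python
import random

def prob_best(successes, trials, draws=20000, seed=1):
    """Posterior probability that each treatment combination has the
    highest response rate, under independent Beta(1, 1) priors."""
    rng = random.Random(seed)
    wins = [0] * len(successes)
    for _ in range(draws):
        # One joint posterior draw: a Beta sample per combination.
        sample = [rng.betavariate(1 + s, 1 + n - s)
                  for s, n in zip(successes, trials)]
        wins[sample.index(max(sample))] += 1
    return [w / draws for w in wins]

# Three hypothetical combinations with 40 patients each:
post = prob_best([12, 18, 25], [40, 40, 40])
```

    In an adaptive design, probabilities like these would be refreshed as data accrue and used to tilt randomization toward the currently best-performing combinations.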

  4. Perception of Sentence Stress in Speech Correlates With the Temporal Unpredictability of Prosodic Features.

    PubMed

    Kakouros, Sofoklis; Räsänen, Okko

    2016-09-01

    Numerous studies have examined the acoustic correlates of sentential stress and its underlying linguistic functionality. However, the mechanism that connects stress cues to the listener's attentional processing has remained unclear. Also, the learnability versus innateness of stress perception has not been widely discussed. In this work, we introduce a novel perspective to the study of sentential stress and put forward the hypothesis that perceived sentence stress in speech is related to the unpredictability of prosodic features, thereby capturing the attention of the listener. As predictability is based on the statistical structure of the speech input, the hypothesis also suggests that stress perception is a result of general statistical learning mechanisms. To study this idea, computational simulations are performed where temporal prosodic trajectories are modeled with an n-gram model. Probabilities of the feature trajectories are subsequently evaluated on a set of novel utterances and compared to human perception of stress. The results show that the low-probability regions of F0 and energy trajectories are strongly correlated with stress perception, giving support to the idea that attention and unpredictability of sensory stimulus are mutually connected. Copyright © 2015 Cognitive Science Society, Inc.
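
    The modeling idea, scoring each point of a discretized prosodic trajectory by its improbability under an n-gram model trained on prior speech, can be sketched with a bigram model and add-alpha smoothing. The function, the three-level F0 discretization, and the toy sequences are illustrative assumptions, not the paper's features.

```python
from collections import Counter
import math

def bigram_surprisal(train, test, alpha=1.0):
    """Per-symbol surprisal (-log2 P) of `test` under an add-alpha-smoothed
    bigram model of `train`; high-surprisal symbols are 'unpredictable'."""
    vocab = set(train) | set(test)
    big = Counter(zip(train, train[1:]))    # bigram counts
    uni = Counter(train[:-1])               # context counts
    out = []
    for prev, cur in zip(test, test[1:]):
        p = (big[(prev, cur)] + alpha) / (uni[prev] + alpha * len(vocab))
        out.append(-math.log2(p))
    return out

# Discretized F0 contour with 'L'ow / 'M'id / 'H'igh levels; the rare
# jump to 'H' in the test utterance gets high surprisal:
s = bigram_surprisal("LLMLLMLLML" * 20, "LLMLLHHLLM")
```

    Under the paper's hypothesis, the high-surprisal stretch is where listeners would be predicted to perceive stress.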

  5. Sounds can boost the awareness of visual events through attention without cross-modal integration.

    PubMed

    Pápai, Márta Szabina; Soto-Faraco, Salvador

    2017-01-31

    Cross-modal interactions can lead to enhancement of visual perception, even for visual events below awareness. However, the underlying mechanism is still unclear. Can purely bottom-up cross-modal integration break through the threshold of awareness? We used a binocular rivalry paradigm to measure perceptual switches after brief flashes or sounds which sometimes co-occurred. When flashes at the suppressed eye coincided with sounds, perceptual switches occurred earliest. Yet, contrary to the hypothesis of cross-modal integration, this facilitation never surpassed the prediction of probability summation of independent sensory signals. A follow-up experiment replicated the same pattern of results using silent gaps embedded in continuous noise instead of sounds. This manipulation should weaken putative sound-flash integration while keeping the gaps salient as bottom-up attention cues. Additional results showed that spatial congruency between flashes and sounds did not determine the effectiveness of cross-modal facilitation, which again was no better than probability summation. Thus, the present findings fail to fully support the hypothesis of bottom-up cross-modal integration, above and beyond the independent contribution of two transient signals, as an account for cross-modal enhancement of visual events below the level of awareness.
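
    The probability-summation benchmark the study compares against is a one-line computation: if each signal independently triggers the effect with some probability, the combined probability is one minus the product of the miss probabilities. The detection rates below are hypothetical.

```python
def prob_summation(p_each):
    """Effect probability expected if independent signals merely race
    (no integration): one minus the product of the miss probabilities."""
    miss = 1.0
    for p in p_each:
        miss *= 1.0 - p
    return 1.0 - miss

# Hypothetical unisensory rates for a flash and a sound:
benchmark = prob_summation([0.4, 0.5])
```

    Only facilitation exceeding this benchmark would count as evidence of genuine cross-modal integration, which is exactly the test the observed effects failed.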

  6. Protecting Privacy Using k-Anonymity

    PubMed Central

    El Emam, Khaled; Dankar, Fida Kamal

    2008-01-01

    Objective There is increasing pressure to share health information and even make it publicly available. However, such disclosures of personal health information raise serious privacy concerns. To alleviate such concerns, it is possible to anonymize the data before disclosure. One popular anonymization approach is k-anonymity. There have been no evaluations of the actual re-identification probability of k-anonymized data sets. Design Through a simulation, we evaluated the re-identification risk of k-anonymization and three different improvements on three large data sets. Measurement Re-identification probability is measured under two different re-identification scenarios. Information loss is measured by the commonly used discernability metric. Results For one of the re-identification scenarios, k-anonymity consistently over-anonymizes data sets, with this over-anonymization being most pronounced with small sampling fractions. Over-anonymization results in excessive distortions to the data (i.e., high information loss), making the data less useful for subsequent analysis. We found that a hypothesis testing approach provided the best control over re-identification risk and reduces the extent of information loss compared to baseline k-anonymity. Conclusion Guidelines are provided on when to use the hypothesis testing approach instead of baseline k-anonymity. PMID:18579830
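
    The k-anonymity property itself is easy to check: group records by their quasi-identifier values and take the smallest group size. A minimal sketch with hypothetical records and column names (not the paper's data or tooling):

```python
from collections import Counter

def k_of(records, quasi_identifiers):
    """k of a table: the smallest equivalence-class size over the
    quasi-identifier columns. A table is k-anonymous if every record
    shares its QI values with at least k - 1 others."""
    classes = Counter(tuple(r[q] for q in quasi_identifiers)
                      for r in records)
    return min(classes.values())

# Hypothetical generalized records (age band, truncated ZIP):
rows = [
    {"age": "30-39", "zip": "021**", "dx": "flu"},
    {"age": "30-39", "zip": "021**", "dx": "asthma"},
    {"age": "40-49", "zip": "021**", "dx": "flu"},
    {"age": "40-49", "zip": "021**", "dx": "flu"},
]
k = k_of(rows, ["age", "zip"])
```

    The paper's point is that this worst-case guarantee can be much more conservative than the actual re-identification probability, which also depends on the sampling fraction and the adversary's scenario.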

  7. Statistical learning of action: the role of conditional probability.

    PubMed

    Meyer, Meredith; Baldwin, Dare

    2011-12-01

    Identification of distinct units within a continuous flow of human action is fundamental to action processing. Such segmentation may rest in part on statistical learning. In a series of four experiments, we examined what types of statistics people can use to segment a continuous stream involving many brief, goal-directed action elements. The results of Experiment 1 showed no evidence for sensitivity to conditional probability, whereas Experiment 2 displayed learning based on joint probability. In Experiment 3, we demonstrated that additional exposure to the input failed to engender sensitivity to conditional probability. However, the results of Experiment 4 showed that a subset of adults-namely, those more successful at identifying actions that had been seen more frequently than comparison sequences-were also successful at learning conditional-probability statistics. These experiments help to clarify the mechanisms subserving processing of intentional action, and they highlight important differences from, as well as similarities to, prior studies of statistical learning in other domains, including language.
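
    The two statistics the experiments contrast can be computed directly from a symbol stream: the joint probability of an adjacent pair versus the conditional probability of the second element given the first. The helper and toy stream below are illustrative.

```python
from collections import Counter

def pair_statistics(stream):
    """Joint probability P(x, y) and conditional probability P(y | x)
    for adjacent pairs in a sequence."""
    pairs = Counter(zip(stream, stream[1:]))
    firsts = Counter(stream[:-1])
    n = len(stream) - 1
    joint = {p: c / n for p, c in pairs.items()}
    cond = {p: c / firsts[p[0]] for p, c in pairs.items()}
    return joint, cond

# 'b' always follows 'a' (conditional probability 1), even though the
# pair 'ab' makes up under half of all transitions (joint probability 3/7):
joint, cond = pair_statistics("abxabyab")
```

    The dissociation matters because the two statistics can disagree exactly like this, which is how the experiments could show learning from joint probability without sensitivity to conditional probability.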

  8. Body Condition Indices Predict Reproductive Success but Not Survival in a Sedentary, Tropical Bird

    PubMed Central

    Milenkaya, Olga; Catlin, Daniel H.; Legge, Sarah; Walters, Jeffrey R.

    2015-01-01

    Body condition may predict individual fitness because those in better condition have more resources to allocate towards improving their fitness. However, the hypothesis that condition indices are meaningful proxies for fitness has been questioned. Here, we ask if intraspecific variation in condition indices predicts annual reproductive success and survival. We monitored a population of Neochmia phaeton (crimson finch), a sedentary, tropical passerine, for reproductive success and survival over four breeding seasons, and sampled them for commonly used condition indices: mass adjusted for body size, muscle and fat scores, packed cell volume, hemoglobin concentration, total plasma protein, and heterophil to lymphocyte ratio. Our study population is well suited for this research because individuals forage in common areas and do not hold territories such that variation in condition between individuals is not confounded by differences in habitat quality. Furthermore, we controlled for factors that are known to impact condition indices in our study population (e.g., breeding stage) such that we assessed individual condition relative to others in the same context. Condition indices that reflect energy reserves predicted both the probability of an individual fledging young and the number of young produced that survived to independence, but only during some years. Those that were relatively heavy for their body size produced about three times more independent young compared to light individuals. That energy reserves are a meaningful predictor of reproductive success in a sedentary passerine supports the idea that energy reserves are at least sometimes predictors of fitness. However, hematological indices failed to predict reproductive success and none of the indices predicted survival. Therefore, some but not all condition indices may be informative, but because we found that most indices did not predict any component of fitness, we question the ubiquitous interpretation of condition indices as surrogates for individual quality and fitness. PMID:26305457

  9. Bayesian Model Selection in Geophysics: The evidence

    NASA Astrophysics Data System (ADS)

    Vrugt, J. A.

    2016-12-01

Bayesian inference has found widespread application and use in science and engineering to reconcile Earth system models with data, including prediction in space (interpolation), prediction in time (forecasting), assimilation of observations and deterministic/stochastic model output, and inference of the model parameters. Per Bayes' theorem, the posterior probability, P(H|D), of a hypothesis, H, given the data D, is equal to the product of its prior probability, P(H), and likelihood, L(H|D), divided by a normalization constant, P(D). In geophysics, the hypothesis, H, often constitutes a description (parameterization) of the subsurface for some entity of interest (e.g. porosity, moisture content). The normalization constant, P(D), is not required for inference of the subsurface structure, yet is of great value for model selection. Unfortunately, it is not particularly easy to estimate P(D) in practice. Here, I will introduce the various building blocks of a general-purpose method which provides robust and unbiased estimates of the evidence, P(D). This method uses multi-dimensional numerical integration of the posterior (parameter) distribution. I will then illustrate this new estimator by application to three competing subsurface models (hypotheses) using GPR travel time data from the South Oyster Bacterial Transport Site in Virginia, USA. The three subsurface models differ in their treatment of the porosity distribution and use (a) horizontal layering with fixed layer thicknesses, (b) vertical layering with fixed layer thicknesses and (c) a multi-Gaussian field. The results of the new estimator are compared against the brute force Monte Carlo method and the Laplace-Metropolis method.
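
    The role of the evidence P(D) in model selection can be illustrated with a toy calculation. The sketch below is a generic marginal-likelihood integration, not Vrugt's estimator: it compares a hypothesis with no free parameters against one with a free mean under a uniform prior, using trapezoidal quadrature (all data values and prior bounds are illustrative assumptions).

```python
import math

def gaussian_loglik(data, mu, sigma=1.0):
    """Log-likelihood of data under a Normal(mu, sigma) model."""
    return sum(-0.5 * math.log(2 * math.pi * sigma ** 2)
               - (x - mu) ** 2 / (2 * sigma ** 2) for x in data)

def evidence_fixed_mu(data, mu0):
    """P(D|H) for a hypothesis with no free parameters (mu fixed at mu0)."""
    return math.exp(gaussian_loglik(data, mu0))

def evidence_free_mu(data, lo=-5.0, hi=5.0, n=2001):
    """P(D|H) for a hypothesis with mu free under a uniform prior on [lo, hi]:
    the evidence is the prior-weighted integral of the likelihood, computed
    here by trapezoidal quadrature on a grid."""
    h = (hi - lo) / (n - 1)
    prior = 1.0 / (hi - lo)
    vals = [math.exp(gaussian_loglik(data, lo + i * h)) * prior for i in range(n)]
    return h * (0.5 * vals[0] + sum(vals[1:-1]) + 0.5 * vals[-1])

# Data clustered near zero: the Bayes factor rewards the simpler hypothesis,
# because the free-parameter model spreads its prior over implausible values.
data = [0.3, -0.1, 0.2, 0.5, -0.2, 0.1]
bayes_factor = evidence_fixed_mu(data, 0.0) / evidence_free_mu(data)
```

    Here the Bayes factor exceeds one, showing the built-in Occam penalty that makes the evidence useful for choosing between competing parameterizations.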

  10. The Cross-Cultural Dementia Screening (CCD): A new neuropsychological screening instrument for dementia in elderly immigrants.

    PubMed

    Goudsmit, Miriam; Uysal-Bozkir, Özgül; Parlevliet, Juliette L; van Campen, Jos P C M; de Rooij, Sophia E; Schmand, Ben

    2017-03-01

    Currently, approximately 3.9% of the European population are non-EU citizens, and a large part of these people are from "non-Western" societies, such as Turkey and Morocco. For various reasons, the incidence of dementia in this group is expected to increase. However, cognitive testing is challenging due to language barriers and low education and/or illiteracy. The newly developed Cross-Cultural Dementia Screening (CCD) can be administered without an interpreter. It contains three subtests that assess memory, mental speed, and executive function. We hypothesized the CCD to be a culture-fair test that could discriminate between demented patients and cognitively healthy controls. To test this hypothesis, 54 patients who had probable dementia were recruited via memory clinics. Controls (N = 1625) were recruited via their general practitioners. All patients and controls were aged 55 years and older and of six different self-defined ethnicities (Dutch, Turkish, Moroccan-Arabic, Moroccan-Berber, Surinamese-Creole, and Surinamese-Hindustani). Exclusion criteria included current or previous conditions that affect cognitive functioning. There were performance differences between the ethnic groups, but these disappeared after correcting for age and education differences between the groups, which supports our central hypothesis that the CCD is a culture-fair test. Receiver-operating characteristic (ROC) and logistic regression analyses showed that the CCD has high predictive validity for dementia (sensitivity: 85%; specificity: 89%). The CCD is a sensitive and culture-fair neuropsychological instrument for dementia screening in low-educated immigrant populations.

  11. The neurobiology of tobacco dependence: a preclinical perspective on the role of the dopamine projections to the nucleus accumbens [corrected].

    PubMed

    Balfour, David J K

    2004-12-01

It is now widely accepted that nicotine is the primary addictive component of tobacco smoke and that a majority of habitual smokers find it difficult to quit smoking because of their dependence upon this component of the smoke. However, although nicotine replacement therapy elicits a clinically valuable and significant improvement in the number of quit attempts that are ultimately successful, its efficacy remains disappointingly low. This review considers some of the reasons for this problem. It focuses on the hypothesis that stimulation of the dopamine (DA) projections to the medial shell and the core of the nucleus accumbens plays complementary roles in the development of nicotine dependence. The hypothesis proposes that increased extra-synaptic DA in the medial shell of the accumbens confers hedonic properties on behaviors, such as smoking, which deliver nicotine, and thereby increases the probability that the response is learned. It also summarizes the evidence that the primary role of the increased DA overflow, observed in the accumbal core of nicotine-pretreated individuals, challenged with nicotine, is the attribution of incentive salience to cues associated with delivery of the drug and the transition to Pavlovian responding to these conditioned stimuli. The review argues that sensitization of the DA projections to the accumbal core, and the behaviors that depend upon this process, play a pivotal role in the maintenance of the tobacco smoking habit and that it is this component of the dependence that is inadequately addressed by nicotine replacement therapy.

  12. Montmorillonite protection of an UV-irradiated hairpin ribozyme: evolution of the RNA world in a mineral environment

    PubMed Central

    Biondi, Elisa; Branciamore, Sergio; Maurel, Marie-Christine; Gallori, Enzo

    2007-01-01

Background The hypothesis of an RNA-based origin of life, known as the "RNA world", is strongly affected by the hostile environmental conditions probably present in the early Earth. In particular, strong UV and X-ray radiation could have been a major obstacle to the formation and evolution of the first biomolecules. In 1951, J. D. Bernal first proposed that clay minerals could have served as the sites of accumulation and protection from degradation of the first biopolymers, providing the right physical setting for the evolution of more complex systems. Numerous subsequent experimental studies have reinforced this hypothesis. Results The ability of the possibly widespread prebiotic clay mineral montmorillonite to protect the catalytic RNA molecule ADHR1 (Adenine Dependent Hairpin Ribozyme 1) from UV-induced damage was tested experimentally. In particular, the self-cleavage reaction of the ribozyme was evaluated after UV-irradiation of the molecule in the absence or presence of clay particles. The results showed a three-fold greater retention of the self-cleavage activity of the montmorillonite-protected molecule with respect to the same reaction performed by the ribozyme irradiated in the absence of the clay. Conclusion These results suggest a means by which RNA, or RNA-like molecules, could have overcome the problem of UV irradiation in the RNA world era, and suggest that a clay-rich environment could have favoured not only the formation of the first genetic molecules, but also their evolution towards increasingly complex molecular organization. PMID:17767730

  13. An evaluation of inferential procedures for adaptive clinical trial designs with pre-specified rules for modifying the sample size.

    PubMed

    Levin, Gregory P; Emerson, Sarah C; Emerson, Scott S

    2014-09-01

Many papers have introduced adaptive clinical trial methods that allow modifications to the sample size based on interim estimates of treatment effect. There has been extensive commentary on type I error control and efficiency considerations, but little research on estimation after an adaptive hypothesis test. We evaluate the reliability and precision of different inferential procedures in the presence of an adaptive design with pre-specified rules for modifying the sampling plan. We extend group sequential orderings of the outcome space based on the stage at stopping, likelihood ratio statistic, and sample mean to the adaptive setting in order to compute median-unbiased point estimates, exact confidence intervals, and P-values uniformly distributed under the null hypothesis. The likelihood ratio ordering is found to yield shorter confidence intervals on average and to produce higher probabilities of P-values below important thresholds than alternative approaches. The bias adjusted mean demonstrates the lowest mean squared error among candidate point estimates. A conditional error-based approach in the literature has the benefit of being the only method that accommodates unplanned adaptations. We compare the performance of this and other methods in order to quantify the cost of failing to plan ahead in settings where adaptations could realistically be pre-specified at the design stage. We find the cost to be meaningful for all designs and treatment effects considered, and to be substantial for designs frequently proposed in the literature. © 2014, The International Biometric Society.

  14. Snowball Earth: Response of the biosphere?

    NASA Astrophysics Data System (ADS)

    Runnegar, B.

    2001-05-01

Snowball Earth is a script for global catastrophe that rivals giant impact theories in the likely severity of its environmental effects. This is particularly true for the "hard" version of the hypothesis, which requires the atmosphere to be effectively isolated from the ocean so that its carbon dioxide concentration can build up to the level ( ~100 PAL) ultimately required to melt the ice. However, coupled GCM-EMB models (Hyde et al. Nature 405, 425-430; Crowley & Hyde, GRL 28, 283-286) allow equatorial open water solutions under plausible Neoproterozoic conditions. These "softer" scenarios are more appealing if one considers the possible effects of snowball Earth episodes on the global biosphere. The meager Neoproterozoic fossil record makes it difficult to observe the biospheric response directly, but we know from evolutionary trees constructed from aligned protein and DNA sequences from living organisms, calibrated by the fossil record, that many lines of descent passed through the Cryogenian glacial periods. They include various kinds of prokaryotic and eukaryotic algae, a range of protists, and probably, a number of different kinds of animals and fungi. In addition, most of the microbial groups shown on comprehensive 16S rRNA trees have molecular clock ages that predate the snowball episodes. As the global environmental perturbations associated with the "hard" snowball hypothesis (freezing temperatures; huge and rapid changes in temperature; sudden carbon dioxide overload) are thought to have been biologically limiting during the Phanerozoic, the inferred response of the biosphere to Neoproterozoic glaciations may, indeed, provide a way of testing alternative snowball Earth scenarios.

  15. [Effects of prefrontal ablations on the reaction of the active choice of feeder under different probability and value of the reinforcement on dog].

    PubMed

    Preobrazhenskaia, L A; Ioffe, M E; Mats, V N

    2004-01-01

The role of the prefrontal cortex in the active choice between two feeders was investigated under changes in the value and probability of reinforcement. The experiments were performed on two dogs with prefrontal ablations (g. proreus). Before the lesions, the dogs had been trained to receive food from two different feeders in response to conditioned stimuli with equally probable alimentary reinforcement. After ablation, the dogs ran from one feeder to the other during the inter-trial intervals and, in response to the conditioned stimuli, repeatedly chose the same feeder. This disturbance of behavior disappeared completely after some time. In experiments that set the probability of reinforcement against its value, the dogs chose the feeder with the lower-probability but better-quality reinforcement. With equal value but different probability, intact dogs chose the feeder with the higher probability of reinforcement, whereas the dogs with prefrontal lesions chose each feeder equiprobably. Thus, under conditions of free behavior, one of the functions of the prefrontal cortex is the choice of the response with the higher probability of reinforcement.

  16. Human Inferences about Sequences: A Minimal Transition Probability Model

    PubMed Central

    2016-01-01

The brain constantly infers the causes of the inputs it receives and uses these inferences to generate statistical expectations about future observations. Experimental evidence for these expectations and their violations includes explicit reports, sequential effects on reaction times, and mismatch or surprise signals recorded in electrophysiology and functional MRI. Here, we explore the hypothesis that the brain acts as a near-optimal inference device that constantly attempts to infer the time-varying matrix of transition probabilities between the stimuli it receives, even when those stimuli are in fact fully unpredictable. This parsimonious Bayesian model, with a single free parameter, accounts for a broad range of findings on surprise signals, sequential effects and the perception of randomness. Notably, it explains the pervasive asymmetry between repetitions and alternations encountered in those studies. Our analysis suggests that a neural machinery for inferring transition probabilities lies at the core of human sequence knowledge. PMID:28030543
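
    The core idea, tracking time-varying transition probabilities with a single free parameter, can be sketched as a leaky (exponentially discounted) transition counter. This is an illustrative assumption about the mechanism, not the paper's exact formulation; the decay time constant plays the role of the single free parameter.

```python
import math

def transition_estimates(seq, omega=10.0):
    """Track p(next = 1 | previous symbol) in a binary sequence using leaky
    transition counts; omega is the leak time constant (the single free
    parameter of this sketch)."""
    decay = math.exp(-1.0 / omega)
    counts = [[1.0, 1.0], [1.0, 1.0]]    # counts[prev][next], Laplace prior
    estimates = []
    for prev, nxt in zip(seq, seq[1:]):
        # mean posterior estimate of p(next = 1 | prev), before seeing nxt
        p1 = counts[prev][1] / (counts[prev][0] + counts[prev][1])
        estimates.append(p1)
        for a in (0, 1):                  # forget old evidence
            for b in (0, 1):
                counts[a][b] *= decay
        counts[prev][nxt] += 1.0          # learn from the new transition
    return estimates

# On a strictly alternating sequence the model quickly becomes confident
# that a 0 is followed by a 1 (and would register surprise at a repetition).
alternating = [0, 1] * 20
est = transition_estimates(alternating)
```

    Because old evidence leaks away, the estimates stay adaptive: the model can follow a transition matrix that changes over time, which is what makes surprise signals possible even in long sessions.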

  17. The idiosyncratic nature of confidence

    PubMed Central

    Navajas, Joaquin; Hindocha, Chandni; Foda, Hebah; Keramati, Mehdi; Latham, Peter E; Bahrami, Bahador

    2017-01-01

    Confidence is the ‘feeling of knowing’ that accompanies decision making. Bayesian theory proposes that confidence is a function solely of the perceived probability of being correct. Empirical research has suggested, however, that different individuals may perform different computations to estimate confidence from uncertain evidence. To test this hypothesis, we collected confidence reports in a task where subjects made categorical decisions about the mean of a sequence. We found that for most individuals, confidence did indeed reflect the perceived probability of being correct. However, in approximately half of them, confidence also reflected a different probabilistic quantity: the perceived uncertainty in the estimated variable. We found that the contribution of both quantities was stable over weeks. We also observed that the influence of the perceived probability of being correct was stable across two tasks, one perceptual and one cognitive. Overall, our findings provide a computational interpretation of individual differences in human confidence. PMID:29152591

  18. Sequential Probability Ratio Test for Spacecraft Collision Avoidance Maneuver Decisions

    NASA Technical Reports Server (NTRS)

    Carpenter, J. Russell; Markley, F. Landis

    2013-01-01

A document discusses sequential probability ratio tests that explicitly allow decision-makers to incorporate false alarm and missed detection risks, and are potentially less sensitive to modeling errors than a procedure that relies solely on a probability of collision threshold. Recent work on constrained Kalman filtering has suggested an approach to formulating such a test for collision avoidance maneuver decisions: a filter bank with two norm-inequality-constrained epoch-state extended Kalman filters. One filter models the null hypothesis that the miss distance is inside the combined hard body radius at the predicted time of closest approach, and one filter models the alternative hypothesis. The epoch-state filter developed for this method explicitly accounts for any process noise present in the system. The method appears to work well using a realistic example based on an upcoming, highly elliptical orbit formation flying mission.
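
    The underlying machinery is Wald's sequential probability ratio test, in which the false-alarm risk alpha and missed-detection risk beta set the decision thresholds directly. The sketch below is the generic SPRT for Gaussian observations, not the constrained-filter-bank formulation described in the document; all parameter values are illustrative.

```python
import math

def sprt(samples, mu0, mu1, sigma=1.0, alpha=0.01, beta=0.01):
    """Wald's sequential probability ratio test for the mean of Gaussian
    observations, H0: mu = mu0 versus H1: mu = mu1. alpha is the admissible
    false-alarm probability and beta the missed-detection probability.
    Returns (decision, samples_used)."""
    upper = math.log((1.0 - beta) / alpha)   # cross upward  -> accept H1
    lower = math.log(beta / (1.0 - alpha))   # cross downward -> accept H0
    llr = 0.0
    for n, x in enumerate(samples, start=1):
        # log-likelihood-ratio increment for one Gaussian observation
        llr += ((x - mu0) ** 2 - (x - mu1) ** 2) / (2.0 * sigma ** 2)
        if llr >= upper:
            return "H1", n
        if llr <= lower:
            return "H0", n
    return "continue", len(samples)          # thresholds not yet crossed
```

    With observations centered on the H1 mean, the test accepts H1 after roughly log((1-beta)/alpha) divided by the per-sample expected increment, stopping as soon as the evidence suffices rather than at a fixed sample size.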

  19. The Context Matters: Outcome Probability and Expectation Mismatch Modulate the Feedback Negativity When Self-Evaluation of Response Correctness Is Possible

    PubMed Central

    Leue, Anja; Cano Rodilla, Carmen; Beauducel, André

    2015-01-01

    Individuals typically evaluate whether their performance and the obtained feedback match. Previous research has shown that feedback negativity (FN) depends on outcome probability and feedback valence. It is, however, less clear to what extent previous effects of outcome probability on FN depend on self-evaluations of response correctness. Therefore, we investigated the effects of outcome probability on FN amplitude in a simple go/no-go task that allowed for the self-evaluation of response correctness. We also investigated effects of performance incompatibility and feedback valence. In a sample of N = 22 participants, outcome probability was manipulated by means of precues, feedback valence by means of monetary feedback, and performance incompatibility by means of feedback that induced a match versus mismatch with individuals' performance. We found that the 100% outcome probability condition induced a more negative FN following no-loss than the 50% outcome probability condition. The FN following loss was more negative in the 50% compared to the 100% outcome probability condition. Performance-incompatible loss resulted in a more negative FN than performance-compatible loss. Our results indicate that the self-evaluation of the correctness of responses should be taken into account when the effects of outcome probability and expectation mismatch on FN are investigated. PMID:26783525

  1. Body condition of Morelet’s Crocodiles (Crocodylus moreletii) from northern Belize

    USGS Publications Warehouse

    Mazzotti, Frank J.; Cherkiss, Michael S.; Brandt, Laura A.; Fujisaki, Ikuko; Hart, Kristen; Jeffery, Brian; McMurry, Scott T.; Platt, Steven G.; Rainwater, Thomas R.; Vinci, Joy

    2012-01-01

    Body condition factors have been used as an indicator of health and well-being of crocodilians. We evaluated body condition of Morelet's Crocodiles (Crocodylus moreletii) in northern Belize in relation to biotic (size, sex, and habitat) and abiotic (location, water level, and air temperature) factors. We also tested the hypothesis that high water levels and warm temperatures combine or interact to result in a decrease in body condition. Size class, temperature, and water level explained 20% of the variability in condition of Morelet's Crocodiles in this study. We found that adult crocodiles had higher condition scores than juveniles/subadults but that sex, habitat, and site had no effect. We confirmed our hypothesis that warm temperatures and high water levels interact to decrease body condition. We related body condition of Morelet's Crocodiles to natural fluctuations in air temperatures and water levels in northern Belize, providing baseline conditions for population and ecosystem monitoring.

  2. The Trivers–Willard hypothesis: sex ratio or investment?

    PubMed Central

    Veller, Carl; Haig, David; Nowak, Martin A.

    2016-01-01

    The Trivers–Willard hypothesis has commonly been considered to predict two things. First, that a mother in good condition should bias the sex ratio of her offspring towards males (if males exhibit greater variation in reproductive value). Second, that a mother in good condition should invest more per son than per daughter. These two predictions differ empirically, mechanistically and, as we demonstrate here, theoretically too. We construct a simple model of sex allocation that allows simultaneous analysis of both versions of the Trivers–Willard hypothesis. We show that the sex ratio version holds under very general conditions, being valid for a large class of male and female fitness functions. The investment version, on the other hand, is shown to hold only for a small subset of male and female fitness functions. Our results help to make sense of the observation that the sex ratio version is empirically more successful than the investment version. PMID:27170721

  3. Illuminating the dual-hormone hypothesis: About chronic dominance and the interaction of cortisol and testosterone.

    PubMed

    Pfattheicher, Stefan

    2017-01-01

The dual-hormone hypothesis suggests that testosterone is positively associated with status-seeking tendencies such as aggression and dominance, particularly in individuals with low levels of cortisol. Although recent research supports the dual-hormone hypothesis, the boundary conditions under which the dual-hormone interaction is likely to emerge are not clearly understood. In the present study (N = 153), the dual-hormone hypothesis was empirically tested in the context of an economic game that included a decision whether to dominate another individual. We also examined whether the dual-hormone interaction is more likely to be found in individuals who are chronically prone to dominance tendencies. Results revealed a significant testosterone × cortisol interaction in line with the dual-hormone hypothesis. Additionally, the testosterone × cortisol interaction was only significant in individuals with a high level of chronic dominance. Overall, the present work suggests that chronic personality tendencies should be taken into account in order to explore hormone-behavior associations and their boundary conditions. Aggr. Behav. 43:85-92, 2017. © 2016 Wiley Periodicals, Inc.

  4. A review of empirical evidence on different uncanny valley hypotheses: support for perceptual mismatch as one road to the valley of eeriness

    PubMed Central

    Kätsyri, Jari; Förger, Klaus; Mäkäräinen, Meeri; Takala, Tapio

    2015-01-01

The uncanny valley hypothesis, first proposed in the 1970s, suggests that almost but not fully humanlike artificial characters will trigger a profound sense of unease. This hypothesis has become widely acknowledged both in the popular media and scientific research. Surprisingly, empirical evidence for the hypothesis has remained inconsistent. In the present article, we reinterpret the original uncanny valley hypothesis and review empirical evidence for different theoretically motivated uncanny valley hypotheses. The uncanny valley could be understood as the naïve claim that any kind of human-likeness manipulation will lead to experienced negative affinity at close-to-realistic levels. More recent hypotheses have suggested that the uncanny valley would be caused by artificial–human categorization difficulty or by a perceptual mismatch between artificial and human features. The original formulation also suggested that movement would modulate the uncanny valley. The reviewed empirical literature failed to provide consistent support for the naïve uncanny valley hypothesis or the modulatory effects of movement. Results on the categorization difficulty hypothesis were still too scarce to allow drawing firm conclusions. In contrast, good support was found for the perceptual mismatch hypothesis. Taken together, the present review findings suggest that the uncanny valley exists only under specific conditions. More research is still needed to pinpoint the exact conditions under which the uncanny valley phenomenon manifests itself. PMID:25914661

  5. A time-varying effect model for examining group differences in trajectories of zero-inflated count outcomes with applications in substance abuse research.

    PubMed

    Yang, Songshan; Cranford, James A; Jester, Jennifer M; Li, Runze; Zucker, Robert A; Buu, Anne

    2017-02-28

This study proposes a time-varying effect model for examining group differences in trajectories of zero-inflated count outcomes. The motivating example demonstrates that this zero-inflated Poisson model allows investigators to study group differences in different aspects of substance use (e.g., the probability of abstinence and the quantity of alcohol use) simultaneously. The simulation study shows that the accuracy of estimation of trajectory functions improves as the sample size increases; the accuracy under equal group sizes is only higher when the sample size is small (100). In terms of the performance of the hypothesis testing, the type I error rates are close to their corresponding significance levels under all settings. Furthermore, the power increases as the alternative hypothesis deviates more from the null hypothesis, and the rate of this increasing trend is higher when the sample size is larger. Moreover, the hypothesis test for the group difference in the zero component tends to be less powerful than the test for the group difference in the Poisson component. Copyright © 2016 John Wiley & Sons, Ltd.
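
    The zero-inflated Poisson distribution that underlies the model mixes structural zeros (with probability pi, e.g. abstinence) with ordinary Poisson counts (mean lam, e.g. quantity consumed). A minimal sketch of the pmf, likelihood, and a sampler (the parameter names and values are illustrative, not the paper's):

```python
import math
import random

def zip_pmf(k, pi, lam):
    """P(Y = k) under a zero-inflated Poisson: with probability pi the count
    is a structural zero, otherwise it is drawn from Poisson(lam)."""
    poisson = math.exp(-lam) * lam ** k / math.factorial(k)
    return pi * (k == 0) + (1.0 - pi) * poisson

def zip_loglik(counts, pi, lam):
    """Log-likelihood of observed counts; the two parameters separate the
    zero component from the Poisson component, which is why the model can
    test group differences in each component separately."""
    return sum(math.log(zip_pmf(k, pi, lam)) for k in counts)

def zip_sample(pi, lam, n, rng):
    """Draw n zero-inflated Poisson variates (Poisson by CDF inversion)."""
    out = []
    for _ in range(n):
        if rng.random() < pi:
            out.append(0)                   # structural zero
            continue
        u, k, term, cum = rng.random(), 0, math.exp(-lam), math.exp(-lam)
        while u > cum:
            k += 1
            term *= lam / k
            cum += term
        out.append(k)
    return out
```

    Note that observed zeros are a mixture: P(Y = 0) = pi + (1 - pi)e^(-lam), which is why the zero component is estimated less sharply, and tests on it tend to be less powerful, than tests on the Poisson component.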

  6. Hypothesis tests for the detection of constant speed radiation moving sources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dumazert, Jonathan; Coulon, Romain; Kondrasovs, Vladimir

    2015-07-01

Radiation Portal Monitors are deployed in linear networks to detect radiological material in motion. As a complement to single- and multichannel detection algorithms, which are inefficient under too low signal-to-noise ratios, temporal correlation algorithms have been introduced. Hypothesis test methods based on the empirically estimated mean and variance of the signals delivered by the different channels have shown significant gains in the tradeoff between detection sensitivity and false alarm probability. This paper discloses the concept of a new hypothesis test for temporal correlation detection methods, taking advantage of the Poisson nature of the registered counting signals, and establishes a benchmark between this test and its empirical counterpart. The simulation study validates that in the four relevant configurations of a pedestrian source carrier under respectively high and low count rate radioactive background, and a vehicle source carrier under the same respectively high and low count rate radioactive background, the newly introduced hypothesis test ensures a significantly improved compromise between sensitivity and false alarm, while guaranteeing the stability of its optimization parameter for signal-to-noise ratio variations between 2 and 0.8. (authors)
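
    The benefit of exploiting the Poisson nature of counting signals can be sketched as a one-sided tail test against a known background rate. This is a generic illustration of a Poisson count test, not the authors' exact statistic; the count values and false-alarm level are assumptions.

```python
import math

def poisson_sf(n, lam):
    """P(N >= n) for N ~ Poisson(lam), via the complementary CDF
    accumulated term by term (no special functions needed)."""
    if n <= 0:
        return 1.0
    term, cdf = math.exp(-lam), math.exp(-lam)
    for k in range(1, n):
        term *= lam / k
        cdf += term
    return max(0.0, 1.0 - cdf)

def source_detected(count, background_mean, p_fa=1e-3):
    """Flag a channel if its count is improbably large under a Poisson
    background of known mean; p_fa is the admissible false-alarm
    probability per test."""
    return poisson_sf(count, background_mean) < p_fa
```

    Because the Poisson model fixes the variance equal to the mean, the threshold needs no empirical variance estimate, which is one reason such a test can stay stable as the signal-to-noise ratio varies.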

  7. Statistical modeling, detection, and segmentation of stains in digitized fabric images

    NASA Astrophysics Data System (ADS)

    Gururajan, Arunkumar; Sari-Sarraf, Hamed; Hequet, Eric F.

    2007-02-01

This paper will describe a novel and automated system based on a computer vision approach, for objective evaluation of stain release on cotton fabrics. Digitized color images of the stained fabrics are obtained, and the pixel values in the color and intensity planes of these images are probabilistically modeled as a Gaussian Mixture Model (GMM). Stain detection is posed as a decision theoretic problem, where the null hypothesis corresponds to absence of a stain. The null hypothesis and the alternate hypothesis mathematically translate into a first order GMM and a second order GMM respectively. The parameters of the GMM are estimated using a modified Expectation-Maximization (EM) algorithm. Minimum Description Length (MDL) is then used as the test statistic to decide the verity of the null hypothesis. The stain is then segmented by a decision rule based on the probability map generated by the EM algorithm. The proposed approach was tested on a dataset of 48 fabric images soiled with stains of ketchup, corn oil, mustard, Ragu sauce, Revlon makeup and grape juice. The decision theoretic part of the algorithm produced a correct detection rate (true positive) of 93% and a false alarm rate of 5% on this set of images.
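
    The two building blocks of that decision pipeline, EM estimation of a Gaussian mixture and an MDL comparison of model orders, can be sketched in one dimension. The paper works with multivariate color data and a modified EM; this is a simplified, assumed-form illustration.

```python
import math

def _npdf(x, mu, sigma):
    """Normal probability density."""
    return math.exp(-(x - mu) ** 2 / (2.0 * sigma ** 2)) / (sigma * math.sqrt(2.0 * math.pi))

def em_gmm2(xs, iters=200):
    """EM for a two-component 1-D Gaussian mixture. Returns the weights,
    means, standard deviations, and final log-likelihood."""
    mu = [min(xs), max(xs)]              # crude but adequate initialization
    sigma = [1.0, 1.0]
    w = [0.5, 0.5]
    for _ in range(iters):
        # E-step: responsibility of each component for each point
        resp = []
        for x in xs:
            p = [w[j] * _npdf(x, mu[j], sigma[j]) for j in (0, 1)]
            total = p[0] + p[1]
            resp.append([p[0] / total, p[1] / total])
        # M-step: re-estimate parameters from the responsibilities
        for j in (0, 1):
            nj = sum(r[j] for r in resp)
            w[j] = nj / len(xs)
            mu[j] = sum(r[j] * x for r, x in zip(resp, xs)) / nj
            var = sum(r[j] * (x - mu[j]) ** 2 for r, x in zip(resp, xs)) / nj
            sigma[j] = max(math.sqrt(var), 1e-6)   # guard against collapse
    ll = sum(math.log(w[0] * _npdf(x, mu[0], sigma[0]) +
                      w[1] * _npdf(x, mu[1], sigma[1])) for x in xs)
    return w, mu, sigma, ll

def mdl(loglik, n_params, n_obs):
    """Two-part minimum description length; the model with the smaller
    score is preferred, so extra components must earn their keep."""
    return -loglik + 0.5 * n_params * math.log(n_obs)
```

    In the detection step, the MDL scores of the one- and two-component fits are compared: a bimodal pixel distribution (background plus stain) makes the second-order GMM win despite its parameter penalty, which corresponds to rejecting the null hypothesis.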

  8. Complex interplay of body condition, life history, and prevailing environment shapes immune defenses of garter snakes in the wild.

    PubMed

    Palacios, Maria G; Cunnick, Joan E; Bronikowski, Anne M

    2013-01-01

    The immunocompetence "pace-of-life" hypothesis proposes that fast-living organisms should invest more in innate immune defenses and less in adaptive defenses compared to slow-living ones. We found some support for this hypothesis in two life-history ecotypes of the snake Thamnophis elegans; fast-living individuals show higher levels of innate immunity compared to slow-living ones. Here, we optimized a lymphocyte proliferation assay to assess the complementary prediction that slow-living snakes should in turn show stronger adaptive defenses. We also assessed the "environmental" hypothesis that predicts that slow-living snakes should show lower levels of immune defenses (both innate and adaptive) given the harsher environment they live in. Proliferation of B- and T-lymphocytes of free-living individuals was on average higher in fast-living than slow-living snakes, opposing the pace-of-life hypothesis and supporting the environmental hypothesis. Bactericidal capacity of plasma, an index of innate immunity, did not differ between fast-living and slow-living snakes in this study, contrasting the previously documented pattern and highlighting the importance of annual environmental conditions as determinants of immune profiles of free-living animals. Our results do not negate a link between life history and immunity, as indicated by ecotype-specific relationships between lymphocyte proliferation and body condition, but suggest more subtle nuances than those currently proposed.

  9. Arousal and hallucinatory activity under two isolation conditions

    NASA Technical Reports Server (NTRS)

    Levin, J.

    1974-01-01

    Experimental exploration of the hypothesis that soundproof-room and water-immersion isolation environments differ with respect to the variety of physiological responses and reported hallucinations they elicit. The results obtained support the hypothesis in regard to physiological responses only.

  10. The case against climate regulation via oceanic phytoplankton sulphur emissions.

    PubMed

    Quinn, P K; Bates, T S

    2011-11-30

    More than twenty years ago, a biological regulation of climate was proposed whereby emissions of dimethyl sulphide from oceanic phytoplankton resulted in the formation of aerosol particles that acted as cloud condensation nuclei in the marine boundary layer. In this hypothesis--referred to as CLAW--the increase in cloud condensation nuclei led to an increase in cloud albedo with the resulting changes in temperature and radiation initiating a climate feedback altering dimethyl sulphide emissions from phytoplankton. Over the past two decades, observations in the marine boundary layer, laboratory studies and modelling efforts have been conducted seeking evidence for the CLAW hypothesis. The results indicate that a dimethyl sulphide biological control over cloud condensation nuclei probably does not exist and that sources of these nuclei to the marine boundary layer and the response of clouds to changes in aerosol are much more complex than was recognized twenty years ago. These results indicate that it is time to retire the CLAW hypothesis.

  11. The line-locking hypothesis, absorption by intervening galaxies, and the z = 1.95 peak in redshifts

    NASA Technical Reports Server (NTRS)

    Burbidge, G.

    1978-01-01

    The controversy over whether the absorption spectrum in QSOs is intrinsic or extrinsic is approached with attention to the peak of redshifts at z = 1.95. Also considered are the line-locking and the intervening galaxy hypotheses. The line-locking hypothesis is based on observations that certain ratios found in absorption line QSOs are preferred, and leads inevitably to the conclusion that the absorption line systems are intrinsic. The intervening galaxy hypothesis is based on absorption redshifts resulting from given absorption cross-sections of galactic clusters and the intergalactic medium, and would lead to the theoretical conclusion that most QSOs show strong absorption, a conclusion which is not supported by empirical data. The 1.95 peak, on the other hand, is most probably an intrinsic property of QSOs. The peak is enhanced by redshift, and it is noted that both an emission and an absorption redshift peak are seen at 1.95.

  12. Perceiving expressions of emotion: What evidence could bear on questions about perceptual experience of mental states?

    PubMed

    Butterfill, Stephen A

    2015-11-01

    What evidence could bear on questions about whether humans ever perceptually experience any of another's mental states, and how might those questions be made precise enough to test experimentally? This paper focusses on emotions and their expression. It is proposed that research on perceptual experiences of physical properties provides one model for thinking about what evidence concerning expressions of emotion might reveal about perceptual experiences of others' mental states. This proposal motivates consideration of the hypothesis that categorical perception of expressions of emotion occurs, can be facilitated by information about agents' emotions, and gives rise to phenomenal expectations. It is argued that the truth of this hypothesis would support a modest version of the claim that humans sometimes perceptually experience some of another's mental states. Much available evidence is consistent with, but insufficient to establish, the truth of the hypothesis. We are probably not yet in a position to know whether humans ever perceptually experience others' mental states. Copyright © 2015 Elsevier Inc. All rights reserved.

  13. Long- and short-term influence of environment on recruitment in a species with highly delayed maturity.

    PubMed

    Nevoux, Marie; Weimerskirch, Henri; Barbraud, Christophe

    2010-02-01

    Short-term effects of environmental perturbations on various life history traits are reasonably well documented in birds and mammals. But, in the present context of global climate change, there is a need to consider potential long-term effects of natal conditions to better understand and predict the consequences of these changes on population dynamics. The environmental conditions affecting offspring during their early development may determine their lifetime reproductive performance, and therefore the number of recruits produced by a cohort. In this study, we attempted to link recruitment to natal and recent (previous year) conditions in the long-lived black-browed albatross (Thalassarche melanophrys) at Kerguelen Islands. The environmental variability was described using both climatic variables over breeding (sea surface temperature anomaly) and non-breeding grounds (Southern Oscillation index), and variables related to the colony (breeding success and colony size). Immature survival was linked to the breeding success of the colony in the year of birth, which was expected to reflect the average seasonal parental investment. At the cohort level, this initial mortality event may act as a selective filter shaping the number, and presumably the quality (breeding frequency, breeding success probability), of the individuals that recruit into the breeding population. The decision to start breeding was strongly structured by the age of the individuals and adjusted according to recent conditions. An effect of natal conditions was not detected on this parameter, supporting the selection hypothesis. Recruitment, as a whole, was thus influenced by a combination of long- and short-term environmental impacts. Our results highlight the complexity of the influence of environmental factors on such long-lived species, due to the time-lag (associated with a delayed maturity) between the impact of natal conditions on individuals and their repercussion on the breeding population.

  14. Insights into Nitrate-Reducing Fe(II) Oxidation Mechanisms through Analysis of Cell-Mineral Associations, Cell Encrustation, and Mineralogy in the Chemolithoautotrophic Enrichment Culture KS

    PubMed Central

    Nordhoff, M.; Tominski, C.; Halama, M.; Byrne, J. M.; Obst, M.; Behrens, S.

    2017-01-01

    ABSTRACT Most described nitrate-reducing Fe(II)-oxidizing bacteria (NRFeOB) are mixotrophic and depend on organic cosubstrates for growth. Encrustation of cells in Fe(III) minerals has been observed for mixotrophic NRFeOB but not for autotrophic phototrophic and microaerophilic Fe(II) oxidizers. So far, little is known about cell-mineral associations in the few existing autotrophic NRFeOB. Here, we investigate whether the designated autotrophic Fe(II)-oxidizing strain (closely related to Gallionella and Sideroxydans) or the heterotrophic nitrate reducers that are present in the autotrophic nitrate-reducing Fe(II)-oxidizing enrichment culture KS form mineral crusts during Fe(II) oxidation under autotrophic and mixotrophic conditions. In the mixed culture, we found no significant encrustation of any of the cells both during autotrophic oxidation of 8 to 10 mM Fe(II) coupled to nitrate reduction and during cultivation under mixotrophic conditions with 8 to 10 mM Fe(II), 5 mM acetate, and 4 mM nitrate, where higher numbers of heterotrophic nitrate reducers were present. Two pure cultures of heterotrophic nitrate reducers (Nocardioides and Rhodanobacter) isolated from culture KS were analyzed under mixotrophic growth conditions. We found green rust formation, no cell encrustation, and only a few mineral particles on some cell surfaces with 5 mM Fe(II) and some encrustation with 10 mM Fe(II). Our findings suggest that enzymatic, autotrophic Fe(II) oxidation coupled to nitrate reduction forms poorly crystalline Fe(III) oxyhydroxides and proceeds without cellular encrustation while indirect Fe(II) oxidation via heterotrophic nitrate-reduction-derived nitrite can lead to green rust as an intermediate mineral and significant cell encrustation. The extent of encrustation caused by indirect Fe(II) oxidation by reactive nitrogen species depends on Fe(II) concentrations and is probably negligible under environmental conditions in most habitats. 
IMPORTANCE Most described nitrate-reducing Fe(II)-oxidizing bacteria (NRFeOB) are mixotrophic (their growth depends on organic cosubstrates) and can become encrusted in Fe(III) minerals. Encrustation is expected to be harmful and poses a threat to cells if it also occurs under environmentally relevant conditions. Nitrite produced during heterotrophic denitrification reacts with Fe(II) abiotically and is probably the reason for encrustation in mixotrophic NRFeOB. Little is known about cell-mineral associations in autotrophic NRFeOB such as the enrichment culture KS. Here, we show that no encrustation occurs in culture KS under autotrophic and mixotrophic conditions while heterotrophic nitrate-reducing isolates from culture KS become encrusted. These findings support the hypothesis that encrustation in mixotrophic cultures is caused by the abiotic reaction of Fe(II) with nitrite and provide evidence that Fe(II) oxidation in culture KS is enzymatic. Furthermore, we show that the extent of encrustation caused by indirect Fe(II) oxidation by reactive nitrogen species depends on Fe(II) concentrations and is probably negligible in most environmental habitats. PMID:28455336

  15. Determining probability distribution of coherent integration time near 133 Hz and 1346 km in the Pacific Ocean.

    PubMed

    Spiesberger, John L

    2013-02-01

    The hypothesis tested is that internal gravity waves limit the coherent integration time of sound at 1346 km in the Pacific Ocean at 133 Hz and a pulse resolution of 0.06 s. Six months of continuous transmissions at about 18 min intervals are examined. The source and receiver are mounted on the bottom of the ocean with timing governed by atomic clocks, so the measured variability is due solely to fluctuations in the ocean. A model for the propagation of sound through fluctuating internal waves is run without any tuning to the data. Excellent agreement is found between the modeled and measured probability distributions of integration time up to five hours.

  16. Learning probability distributions from smooth observables and the maximum entropy principle: some remarks

    NASA Astrophysics Data System (ADS)

    Obuchi, Tomoyuki; Monasson, Rémi

    2015-09-01

    The maximum entropy principle (MEP) is a very useful working hypothesis in a wide variety of inference problems, ranging from biological to engineering tasks. To better understand the reasons for the success of the MEP, we propose a statistical-mechanical formulation to treat the space of probability distributions constrained by the measurements of (experimental) observables. In this paper we first review the results of a detailed analysis of the simplest case, that of randomly chosen observables. In addition, we investigate by numerical and analytical means the case of smooth observables, which is of practical relevance. Our preliminary results are presented and discussed with respect to the efficiency of the MEP.
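
    A minimal sketch of the principle the abstract analyzes (not the authors' formulation): with a single measured observable, the maximum-entropy distribution is an exponential family, and its Lagrange multiplier can be found by bisection because the implied mean is monotone in the multiplier.

```python
import math

def maxent_distribution(xs, target_mean, lo=-50.0, hi=50.0, tol=1e-12):
    """Maximum-entropy distribution over states `xs` with a fixed mean.

    The constrained solution is p_i proportional to exp(lam * x_i); the
    multiplier lam is found by bisection on the implied mean, which is
    strictly increasing in lam (its derivative is a variance).
    """
    def mean_for(lam):
        ws = [math.exp(lam * x) for x in xs]
        z = sum(ws)
        return sum(w * x for w, x in zip(ws, xs)) / z

    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if mean_for(mid) < target_mean:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    ws = [math.exp(lam * x) for x in xs]
    z = sum(ws)
    return [w / z for w in ws]

# Constraining the mean of {0, 1, 2} to its midpoint recovers the uniform law.
p = maxent_distribution([0, 1, 2], 1.0)
```

    Richer constraint sets (the "smooth observables" of the abstract) add one multiplier per observable, but the same exponential-family structure carries over.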

  17. Carryover effects associated with winter location affect fitness, social status, and population dynamics in a long-distance migrant

    USGS Publications Warehouse

    Sedinger, James S.; Schamber, Jason L.; Ward, David H.; Nicolai, Christopher A.; Conant, Bruce

    2011-01-01

    We used observations of individually marked female black brant geese (Branta bernicla nigricans; brant) at three wintering lagoons on the Pacific coast of Baja California—Laguna San Ignacio (LSI), Laguna Ojo de Liebre (LOL), and Bahía San Quintín (BSQ)—and the Tutakoke River breeding colony in Alaska to assess hypotheses about carryover effects on breeding and distribution of individuals among wintering areas. We estimated transition probabilities from wintering locations to breeding and nonbreeding by using multistratum robust-design capture-mark-recapture models. We also examined the effect of breeding on migration to wintering areas to assess the hypothesis that individuals in family groups occupied higher-quality wintering locations. We used 4,538 unique female brant in our analysis of the relationship between winter location and breeding probability. All competitive models of breeding probability contained additive effects of wintering location and the 1997–1998 El Niño–Southern Oscillation (ENSO) event on probability of breeding. Probability of breeding in non-ENSO years was 0.98 ± 0.02, 0.68 ± 0.04, and 0.91 ± 0.11 for females wintering at BSQ, LOL, and LSI, respectively. After the 1997–1998 ENSO event, breeding probability was between 2% (BSQ) and 38% (LOL) lower than in other years. Individuals that bred had the highest probability of migrating the next fall to the wintering area producing the highest probability of breeding.

  18. Estimating the population size and colony boundary of subterranean termites by using the density functions of directionally averaged capture probability.

    PubMed

    Su, Nan-Yao; Lee, Sang-Hee

    2008-04-01

    Marked termites were released in a linear-connected foraging arena, and the spatial heterogeneity of their capture probabilities was averaged for both directions at distance r from the release point to obtain a symmetrical distribution, from which the density function of directionally averaged capture probability P(x) was derived. We hypothesized that as marked termites move into the population, given sufficient time, the directionally averaged capture probability would reach an equilibrium P(e) over the distance r and thus satisfy the equal-mixing assumption of the mark-recapture protocol. The equilibrium capture probability P(e) was used to estimate the population size N. The hypothesis was tested in a 50-m extended foraging arena to simulate the distance factor of field colonies of subterranean termites. Over the 42-d test period, the density functions of directionally averaged capture probability P(x) exhibited four phases: an exponential decline phase, a linear decline phase, an equilibrium phase, and a postequilibrium phase. The equilibrium capture probability P(e), derived as the intercept of the linear regression during the equilibrium phase, correctly projected N estimates that were not significantly different from the known number of workers in the arena. Because the area beneath the probability density function is a constant (50% in this study), preequilibrium regression parameters and P(e) were used to estimate the population boundary distance l, which is the distance between the release point and the boundary beyond which the population is absent.
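
    The equal-mixing logic underlying the estimator above can be illustrated with the classic two-sample Lincoln-Petersen relation; this is a simplified stand-in, not the study's equilibrium-capture-probability estimator, which instead fits P(e) from the density function of capture probability.

```python
def lincoln_petersen(marked, captured, recaptured):
    """Classic mark-recapture estimate N = M * C / R.

    Assumes marked individuals have mixed evenly into the population
    (the 'equilibrium phase' of the abstract), so the fraction of marks
    in a later sample equals the fraction of marks in the population.
    """
    if recaptured == 0:
        raise ValueError("need at least one marked recapture")
    return marked * captured / recaptured

# 500 marked termites released; a later sample captures 400, of which
# 40 carry marks, giving an estimated population of 5000.
n_hat = lincoln_petersen(500, 400, 40)
```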

  19. Drawbacks of the ancient RNA-based life-like system under primitive earth conditions.

    PubMed

    Kawamura, Kunio

    2012-07-01

    Following the discovery of ribozymes, the "RNA world" hypothesis has become the most accepted hypothesis concerning the origin of life and genetic information. However, this hypothesis has several drawbacks. Verification of the hypothesis from different viewpoints led us to proposals from the viewpoint of the hydrothermal origin of life, solubility of RNA and related biopolymers, and the possibility of creating an evolutionary system comparable to the in vitro selection technique for functional RNA molecules based on molecular biology. Copyright © 2012 Elsevier Masson SAS. All rights reserved.

  20. The researcher and the consultant: from testing to probability statements.

    PubMed

    Hamra, Ghassan B; Stang, Andreas; Poole, Charles

    2015-09-01

    In the first instalment of this series, Stang and Poole provided an overview of Fisher significance testing (ST), Neyman-Pearson null hypothesis testing (NHT), and their unfortunate and unintended offspring, null hypothesis significance testing. In addition to elucidating the distinction between the first two and the evolution of the third, the authors alluded to alternative models of statistical inference; namely, Bayesian statistics. Bayesian inference has experienced a revival in recent decades, with many researchers advocating for its use as both a complement and an alternative to NHT and ST. This article will continue in the direction of the first instalment, providing practicing researchers with an introduction to Bayesian inference. Our work will draw on the examples and discussion of the previous dialogue.
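
    A worked example of the Bayesian update the article introduces, in the medical-screening setting mentioned at the top of this record set (the numbers are illustrative, not taken from the text):

```python
def posterior_positive(prevalence, sensitivity, specificity):
    """P(disease | positive test) via Bayes' theorem."""
    true_pos = sensitivity * prevalence
    false_pos = (1.0 - specificity) * (1.0 - prevalence)
    return true_pos / (true_pos + false_pos)

# A 95%-sensitive, 95%-specific test for a 1%-prevalence condition still
# leaves the posterior near 0.16: most positives are false positives.
p_disease = posterior_positive(0.01, 0.95, 0.95)
```

    The contrast with null hypothesis testing is that the output is a direct probability statement about the hypothesis given the data, rather than a statement about the data given a null hypothesis.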

  1. Intelligence and homosexuality.

    PubMed

    Kanazawa, Satoshi

    2012-09-01

    The origin of preferences and values is an unresolved theoretical problem in behavioural sciences. The Savanna-IQ Interaction Hypothesis, derived from the Savanna Principle and a theory of the evolution of general intelligence, suggests that more intelligent individuals are more likely to acquire and espouse evolutionarily novel preferences and values than less intelligent individuals, but general intelligence has no effect on the acquisition and espousal of evolutionarily familiar preferences and values. Ethnographies of traditional societies suggest that exclusively homosexual behaviour was probably rare in the ancestral environment, so the Hypothesis would predict that more intelligent individuals are more likely to identify themselves as homosexual and engage in homosexual behaviour. Analyses of three large, nationally representative samples (two of which are prospectively longitudinal) from two different nations confirm the prediction.

  2. Emotional Sentence Annotation Helps Predict Fiction Genre.

    PubMed

    Samothrakis, Spyridon; Fasli, Maria

    2015-01-01

    Fiction, a prime form of entertainment, has evolved into multiple genres, which one can broadly attribute to different forms of stories. In this paper, we examine the hypothesis that works of fiction can be characterised by the emotions they portray. To investigate this hypothesis, we use works of fiction from Project Gutenberg and attribute basic emotional content to each individual sentence using Ekman's model. A time-smoothed version of the emotional content for each basic emotion is used to train extremely randomized trees. We show through 10-fold cross-validation that the emotional content of each work of fiction can help identify its genre with significantly higher probability than chance. We also show that the most important differentiator between genre novels is fear.

  3. Probabilistic objective functions for sensor management

    NASA Astrophysics Data System (ADS)

    Mahler, Ronald P. S.; Zajic, Tim R.

    2004-08-01

    This paper continues the investigation of a foundational and yet potentially practical basis for control-theoretic sensor management, using a comprehensive, intuitive, system-level Bayesian paradigm based on finite-set statistics (FISST). In this paper we report our most recent progress, focusing on multistep look-ahead -- i.e., allocation of sensor resources throughout an entire future time-window. We determine future sensor states in the time-window using a "probabilistically natural" sensor management objective function, the posterior expected number of targets (PENT). This objective function is constructed using a new "maxi-PIMS" optimization strategy that hedges against unknowable future observation-collections. PENT is used in conjunction with approximate multitarget filters: the probability hypothesis density (PHD) filter or the multi-hypothesis correlator (MHC) filter.

  4. How did you guess? Or, what do multiple-choice questions measure?

    PubMed

    Cox, K R

    1976-06-05

    Multiple-choice questions classified as requiring problem-solving skills have been interpreted as measuring problem-solving skills within students, with the implicit hypothesis that questions needing an increasingly complex intellectual process should present increasing difficulty to the student. This hypothesis was tested in a 150-question paper taken by 721 students in seven Australian medical schools. No correlation was observed between difficulty and assigned process. Consequently, the question-answering process was explored with a group of final-year students. Anecdotal recall by students gave heavy weight to knowledge rather than problem solving in answering these questions. Assignment of the 150 questions to the classification by three teachers and six students showed their congruence to be a little above random probability.

  5. Encounter risk analysis of rainfall and reference crop evapotranspiration in the irrigation district

    NASA Astrophysics Data System (ADS)

    Zhang, Jinping; Lin, Xiaomin; Zhao, Yong; Hong, Yang

    2017-09-01

    Rainfall and reference crop evapotranspiration are random but mutually dependent variables in an irrigation district, and their encounter situation determines water shortage risk under natural water supply and demand. In reality, however, rainfall and reference crop evapotranspiration may have different marginal distributions, and their relations are nonlinear. In this study, based on the annual rainfall and reference crop evapotranspiration data series from 1970 to 2013 in the Luhun irrigation district of China, the joint probability distribution of rainfall and reference crop evapotranspiration is developed with the Frank copula function. Using the joint probability distribution, the synchronous-asynchronous encounter risk, conditional joint probability, and conditional return period of different combinations of rainfall and reference crop evapotranspiration are analyzed. The results show that the copula-based joint probability distribution of rainfall and reference crop evapotranspiration is reasonable. The asynchronous encounter probability of rainfall and reference crop evapotranspiration is greater than their synchronous encounter probability, so a water shortage associated with meteorological drought (i.e., rainfall variability) is the more likely outcome. Compared with other states, the conditional joint probability is higher, and the conditional return period lower, under either low rainfall or high reference crop evapotranspiration. For a specific high reference crop evapotranspiration of a certain frequency, the encounter risk of low rainfall with high reference crop evapotranspiration increases as that frequency decreases; for a specific low rainfall of a certain frequency, this encounter risk decreases as that frequency decreases.
    When either the high reference crop evapotranspiration exceeds a certain frequency or the low rainfall does not exceed a certain frequency, the higher conditional joint probability and lower conditional return period of the various combinations make a water shortage likely, although not a severe one.
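
    A sketch of the Frank copula used in the abstract to join the two marginals; the dependence parameter theta here is illustrative, not the value fitted to the Luhun data.

```python
import math

def frank_copula(u, v, theta):
    """Frank copula C(u, v; theta) joining two marginal CDF values u, v."""
    if theta == 0.0:
        return u * v  # independence limit
    num = (math.exp(-theta * u) - 1.0) * (math.exp(-theta * v) - 1.0)
    return -math.log(1.0 + num / (math.exp(-theta) - 1.0)) / theta

def p_v_given_u(u, v, theta):
    """Conditional probability P(V <= v | U <= u) = C(u, v) / u."""
    return frank_copula(u, v, theta) / u

# E.g. the probability that reference crop evapotranspiration falls below
# its median given that rainfall is below its 20th percentile, under a
# hypothetical positive dependence of theta = 3.
p = p_v_given_u(0.2, 0.5, 3.0)
```

    The encounter-risk quantities in the abstract are combinations of such joint and conditional probabilities evaluated at chosen frequency thresholds.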

  6. Report on the search for atmospheric holes using airs image data

    NASA Technical Reports Server (NTRS)

    Reinleitner, Lee A.

    1991-01-01

    Frank et al. (1986) presented a very controversial hypothesis which states that the Earth is being bombarded by water-vapor clouds resulting from the disruption and vaporization of small comets. This hypothesis was based on single-pixel intensity decreases in images of the earth's dayglow emissions at vacuum-ultraviolet (VUV) wavelengths from the DE-1 imager. These dark spots, or atmospheric holes, are hypothesized to be the result of VUV absorption by a water-vapor cloud between the imager and the dayglow-emitting region. Examined here is the VUV data set from the Auroral Ionospheric Remote Sensor (AIRS) instrument that was flown on the Polar BEAR satellite. AIRS was uniquely situated to test this hypothesis: due to the altitude of the sensor, the holes should show multi-pixel intensity decreases in a scan line. A statistical estimate indicated that sufficient 130.4-nm data from AIRS existed to detect eight to nine such holes, but none was detected; the probability of this occurring by chance is less than 1.0 x 10^-4. A statistical estimate indicated that sufficient 135.6-nm data from AIRS existed to detect approximately two holes, and two ambiguous cases are shown. In spite of the two ambiguous cases, the 135.6-nm data did not show clear support for the small-comet hypothesis, and the 130.4-nm data clearly do not support it.

  7. Testing the niche variation hypothesis with a measure of body condition

    EPA Science Inventory

    Individual variation and fitness are cornerstones of evolution by natural selection. The niche variation hypothesis (NVH) posits that when interspecific competition is relaxed, intraspecific competition should drive niche expansion by selection favoring use of novel resources. Po...

  8. Statistical learning of an auditory sequence and reorganization of acquired knowledge: A time course of word segmentation and ordering.

    PubMed

    Daikoku, Tatsuya; Yatomi, Yutaka; Yumoto, Masato

    2017-01-27

    Previous neural studies have supported the hypothesis that statistical learning mechanisms are used broadly across different domains such as language and music. However, these studies have investigated only a single aspect of statistical learning at a time, such as recognizing word boundaries or learning word-order patterns. In this study, we investigated neurally how the two levels of statistical learning, recognizing word boundaries and word ordering, are reflected in neuromagnetic responses, and how acquired statistical knowledge is reorganized when the syntactic rules are revised. Neuromagnetic responses to a Japanese-vowel sequence (a, e, i, o, and u), presented every 0.45 s, were recorded from 14 right-handed Japanese participants. The vowel order was constrained by a Markov stochastic model such that five nonsense words (aue, eao, iea, oiu, and uoi) were chained with an either-or rule: the probability of the forthcoming word was statistically defined (80% for one word; 20% for the other) by the most recent two words. All of the word transition probabilities (80% and 20%) were switched in the middle of the sequence. In the first and second quarters of the sequence, the neuromagnetic responses to words that appeared with the higher transitional probability were significantly reduced compared with those that appeared with the lower transitional probability. After the word transition probabilities were switched, the response reduction was replicated in the last quarter of the sequence. The responses to the final vowels in the words were significantly reduced compared with those to the initial vowels in the last quarter of the sequence. The results suggest that both within-word and between-word statistical learning are reflected in neural responses. The present study supports the hypothesis that listeners learn larger structures, such as phrases, first, and subsequently extract smaller structures, such as words, from the learned phrases. 
The present study provides the first neurophysiological evidence that the correction of statistical knowledge requires more time than the acquisition of new statistical knowledge. Copyright © 2016 Elsevier Ltd. All rights reserved.
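
    The stimulus construction described above can be sketched generatively. The study's actual transition table is not given, so the pairing of "likely" and "unlikely" successors below is made up; only the 80/20 either-or rule and the five vowel words come from the abstract.

```python
import random

WORDS = ["aue", "eao", "iea", "oiu", "uoi"]

# Hypothetical either-or table: each pair of preceding words (by index)
# selects a likely successor (p = 0.8) and an unlikely one (p = 0.2).
TABLE = {(i, j): ((i + j) % 5, (i + j + 1) % 5)
         for i in range(5) for j in range(5)}

def generate_words(n, seed=0):
    rng = random.Random(seed)
    seq = [0, 1]  # two seed words to condition the first transition
    while len(seq) < n:
        likely, unlikely = TABLE[(seq[-2], seq[-1])]
        seq.append(likely if rng.random() < 0.8 else unlikely)
    return [WORDS[k] for k in seq]

stream = "".join(generate_words(200))  # 200 words -> a 600-vowel stream
```

    Switching all 80% and 20% entries of TABLE midway reproduces the rule revision whose neural consequences the study tracked.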

  9. Research Needs for Human Factors.

    DTIC Science & Technology

    1983-01-19

    the parties aggregate their perspectives through some structured interaction (Sachman, 1975; Steiner, 1972). This approach, well worked by students of...be thought of as an action, so may each action be thought of as a decision. Most students of decision making would probably agree with the hypothesis...structuring has become part of the training of some medical students . The user of computerized information retrieval systems (e.g., Prestel, Teletext) might

  10. Early Evidence for Zika Virus Circulation among Aedes aegypti Mosquitoes, Rio de Janeiro, Brazil

    PubMed Central

    Ayllón, Tania; Campos, Renata de Mendonça; Brasil, Patrícia; Morone, Fernanda Cristina; Câmara, Daniel Cardoso Portela; Meira, Guilherme Louzada Silva; Tannich, Egbert; Yamamoto, Kristie Aimi; Carvalho, Marilia Sá; Pedro, Renata Saraiva; Cadar, Daniel; Ferreira, Davis Fernandes; Honório, Nildimar Alves

    2017-01-01

    During 2014–2016, we conducted mosquito-based Zika virus surveillance in Rio de Janeiro, Brazil. Results suggest that Zika virus was probably introduced into the area during May–November 2013 via multiple in-country sources. Furthermore, our results strengthen the hypothesis that Zika virus in the Americas originated in Brazil during October 2012–May 2013. PMID:28628464

  11. Tracking the Sensory Environment: An ERP Study of Probability and Context Updating in ASD

    PubMed Central

    Westerfield, Marissa A.; Zinni, Marla; Vo, Khang; Townsend, Jeanne

    2014-01-01

    We recorded visual event-related brain potentials (ERPs) from 32 adult male participants (16 high-functioning participants diagnosed with Autism Spectrum Disorder (ASD) and 16 typically developing (TD) control participants, ranging in age from 18 to 53 years) during a three-stimulus oddball paradigm. Target and non-target stimulus probability was varied across three probability conditions, whereas the probability of a third non-target stimulus was held constant in all conditions. P3 amplitude to target stimuli was more sensitive to probability in ASD than in TD participants, whereas P3 amplitude to non-target stimuli was less responsive to probability in ASD participants. This suggests that neural responses to changes in event probability are attention-dependent in high-functioning ASD. The implications of these findings for higher-level behaviors such as prediction and planning are discussed. PMID:24488156

  12. Capacity of optical communications over a lossy bosonic channel with a receiver employing the most general coherent electro-optic feedback control

    NASA Astrophysics Data System (ADS)

    Chung, Hye Won; Guha, Saikat; Zheng, Lizhong

    2017-07-01

    We study the problem of designing optical receivers to discriminate between multiple coherent states using coherent processing receivers—i.e., one that uses arbitrary coherent feedback control and quantum-noise-limited direct detection—which was shown by Dolinar to achieve the minimum error probability in discriminating any two coherent states. We first derive and reinterpret Dolinar's binary-hypothesis minimum-probability-of-error receiver as the one that optimizes the information efficiency at each time instant, based on recursive Bayesian updates within the receiver. Using this viewpoint, we propose a natural generalization of Dolinar's receiver design to discriminate M coherent states, each of which could now be a codeword, i.e., a sequence of N coherent states, each drawn from a modulation alphabet. We analyze the channel capacity of the pure-loss optical channel with a general coherent-processing receiver in the low-photon number regime and compare it with the capacity achievable with direct detection and the Holevo limit (achieving the latter would require a quantum joint-detection receiver). We show compelling evidence that despite the optimal performance of Dolinar's receiver for the binary coherent-state hypothesis test (either in error probability or mutual information), the asymptotic communication rate achievable by such a coherent-processing receiver is only as good as direct detection. This suggests that in the infinitely long codeword limit, all potential benefits of coherent processing at the receiver can be obtained by designing a good code and direct detection, with no feedback within the receiver.
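
    The binary benchmark behind this discussion can be written down directly: for equiprobable coherent states |alpha> and |-alpha>, the Dolinar receiver attains the Helstrom (minimum) error probability, which depends only on the squared state overlap exp(-4|alpha|^2).

```python
import math

def helstrom_error(nbar):
    """Minimum error probability for discriminating equiprobable coherent
    states |alpha> and |-alpha>, with mean photon number nbar = |alpha|^2.

    Helstrom bound: P_e = (1 - sqrt(1 - |<alpha|-alpha>|^2)) / 2,
    where |<alpha|-alpha>|^2 = exp(-4 * nbar).
    """
    overlap_sq = math.exp(-4.0 * nbar)
    return 0.5 * (1.0 - math.sqrt(1.0 - overlap_sq))

# At nbar = 0 the two states coincide and guessing is optimal (P_e = 0.5);
# the error probability falls rapidly as photons are added.
errors = [helstrom_error(n) for n in (0.0, 0.5, 1.0)]
```

    The paper's point is that this single-shot optimality does not translate into a capacity advantage over direct detection in the long-codeword limit.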

  13. Mathematical Capture of Human Data for Computer Model Building and Validation

    DTIC Science & Technology

    2014-04-03

    weapon. The Projectile, the VDE, and the IDE weapons had effects of financial loss for the targeted participant, while the MRAD yielded its own...for LE, Centroid and TE for the baseline and the VDE weapon conditions since p-values exceeded α. All other conditions rejected the null...hypothesis except the LE for the VDE weapon. The K-S statistics were correspondingly lower for the measures that failed to reject the null hypothesis. The CDF

  14. Are atmospheric surface layer flows ergodic?

    NASA Astrophysics Data System (ADS)

    Higgins, Chad W.; Katul, Gabriel G.; Froidevaux, Martin; Simeonov, Valentin; Parlange, Marc B.

    2013-06-01

    The transposition of atmospheric turbulence statistics from the time domain, as conventionally sampled in field experiments, to the ensemble domain is explained by the so-called ergodic hypothesis. In micrometeorology, this hypothesis assumes that the time average of a measured flow variable represents an ensemble of independent realizations from similar meteorological states and boundary conditions. That is, the averaging duration must be sufficiently long to include a large number of independent realizations of the sampled flow variable so as to represent the ensemble. While the validity of the ergodic hypothesis for turbulence has been confirmed in laboratory experiments and numerical simulations for idealized conditions, evidence for its validity in the atmospheric surface layer (ASL), especially for nonideal conditions, continues to defy experimental efforts. There is some urgency to make progress on this problem given the proliferation of tall-tower scalar concentration networks that aim to constrain climate models yet are impacted by nonideal conditions at the land surface. Recent advancements in water vapor concentration lidar measurements that simultaneously sample spatial and temporal series in the ASL are used to investigate the validity of the ergodic hypothesis for the first time. It is shown that ergodicity is valid in a strict sense above uniform surfaces away from abrupt surface transitions. Surprisingly, ergodicity may be used to infer the ensemble concentration statistics of a composite grass-lake system using only water vapor concentration measurements collected above the sharp transition delineating the lake from the grass surface.
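
    The time-average versus ensemble-average equivalence at the heart of the ergodic hypothesis can be illustrated with a toy simulation (an illustrative sketch, not the lidar analysis above): for a stationary, ergodic AR(1) process, the time average of one long record agrees with the ensemble average over many independent realizations.

```python
import random

random.seed(1)

def ar1_path(n, phi=0.8, sigma=1.0):
    """One realization of a stationary AR(1) process
    x_t = phi * x_{t-1} + e_t.  Such a process is ergodic: its time
    average converges to the ensemble mean (zero here) as the record
    length grows."""
    x, path = 0.0, []
    for _ in range(n):
        x = phi * x + random.gauss(0.0, sigma)
        path.append(x)
    return path

# Ensemble average at a fixed time across many realizations,
# versus the time average of a single long record.
ensemble = [ar1_path(200)[-1] for _ in range(2000)]
ensemble_mean = sum(ensemble) / len(ensemble)
time_mean = sum(ar1_path(200000)) / 200000

print(ensemble_mean, time_mean)  # both close to the true mean, 0
```

    For nonstationary or spatially heterogeneous conditions, the kind studied in the record above, this equivalence is exactly what can no longer be taken for granted.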

  15. Phase II design with sequential testing of hypotheses within each stage.

    PubMed

    Poulopoulou, Stavroula; Karlis, Dimitris; Yiannoutsos, Constantin T; Dafni, Urania

    2014-01-01

    The main goal of a Phase II clinical trial is to decide whether a particular therapeutic regimen is effective enough to warrant further study. The hypothesis tested by Fleming's Phase II design (Fleming, 1982) is H0: p ≤ p0 versus HA: p ≥ p1, with level α and with power 1 − β at p = p1, where p0 is chosen to represent the response probability achievable with standard treatment and p1 is chosen such that the difference p1 − p0 represents a targeted improvement with the new treatment. This hypothesis creates a misinterpretation, mainly among clinicians, that rejection of the null hypothesis is tantamount to accepting the alternative, and vice versa. As mentioned by Storer (1992), this introduces ambiguity in the evaluation of type I and II errors and the choice of the appropriate decision at the end of the study. Instead of testing this hypothesis, an alternative class of designs is proposed in which two hypotheses are tested sequentially. The hypothesis H0: p ≤ p0 versus HA: p > p0 is tested first. If this null hypothesis is rejected, the hypothesis H0: p ≤ p1 versus HA: p > p1 is tested next, in order to examine whether the therapy is effective enough to consider further testing in a Phase III study. For the derivation of the proposed design the exact binomial distribution is used to calculate the decision cut-points. The optimal design parameters are chosen so as to minimize the average sample number (ASN) under specific upper bounds for the error levels. The optimal values for the design were found using a simulated annealing method.
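
    The exact-binomial cut-point calculation that designs of this kind are built from can be sketched for a single stage. This is an illustrative simplification (the proposed design uses staged, sequential versions of these calculations; p0, p1, and the sample size here are generic placeholders):

```python
from math import comb

def binom_tail(n, r, p):
    """Exact P(X >= r) for X ~ Binomial(n, p)."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(r, n + 1))

def single_stage_cutpoint(n, p0, p1, alpha=0.05):
    """Smallest response count r such that rejecting H0: p <= p0 when
    X >= r keeps the exact type I error at or below alpha.  Returns r,
    the attained size, and the power at p = p1."""
    for r in range(n + 1):
        size = binom_tail(n, r, p0)
        if size <= alpha:
            return r, size, binom_tail(n, r, p1)
    return None

# Example: n = 25 patients, standard response rate p0 = 0.2,
# target p1 = 0.4.
print(single_stage_cutpoint(25, 0.2, 0.4))
```

    A sequential design of the kind proposed above repeats this style of calculation at each stage, with the cut-points chosen jointly to control both error rates while minimizing the ASN.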

  16. Elucidating mechanisms for insect body size: partial support for the oxygen-dependent induction of moulting hypothesis.

    PubMed

    Kivelä, Sami M; Viinamäki, Sonja; Keret, Netta; Gotthard, Karl; Hohtola, Esa; Välimäki, Panu

    2018-01-25

    Body size is a key life history trait, and knowledge of its mechanistic basis is crucial in life history biology. Such knowledge is accumulating for holometabolous insects, whose growth is characterised and body size affected by moulting. According to the oxygen-dependent induction of moulting (ODIM) hypothesis, moult is induced at a critical mass at which oxygen demand of growing tissues overrides the supply from the tracheal respiratory system, which principally grows only at moults. Support for the ODIM hypothesis is controversial, partly because of a lack of proper data to explicitly test the hypothesis. The ODIM hypothesis predicts that the critical mass is positively correlated with oxygen partial pressure ( P O 2 ) and negatively with temperature. To resolve the controversy that surrounds the ODIM hypothesis, we rigorously test these predictions by exposing penultimate-instar Orthosia gothica (Lepidoptera: Noctuidae) larvae to temperature and moderate P O 2  manipulations in a factorial experiment. The relative mass increment in the focal instar increased along with increasing P O 2 , as predicted, but there was only weak suggestive evidence of the temperature effect. Probably owing to a high measurement error in the trait, the effect of P O 2  on the critical mass was sex specific; high P O 2  had a positive effect only in females, whereas low P O 2  had a negative effect only in males. Critical mass was independent of temperature. Support for the ODIM hypothesis is partial because of only suggestive evidence of a temperature effect on moulting, but the role of oxygen in moult induction seems unambiguous. The ODIM mechanism thus seems worth considering in body size analyses. © 2018. Published by The Company of Biologists Ltd.

  17. Decomposition of conditional probability for high-order symbolic Markov chains.

    PubMed

    Melnik, S S; Usatenko, O V

    2017-07-01

    The main goal of this paper is to develop an estimate for the conditional probability function of random stationary ergodic symbolic sequences with elements belonging to a finite alphabet. We elaborate on a decomposition procedure for the conditional probability function of sequences considered to be high-order Markov chains. We represent the conditional probability function as the sum of multilinear memory function monomials of different orders (from zero up to the chain order). This allows us to introduce a family of Markov chain models and to construct artificial sequences via a method of successive iterations, taking into account at each step increasingly high correlations among random elements. At weak correlations, the memory functions are uniquely expressed in terms of the high-order symbolic correlation functions. The proposed method fills the gap between two approaches, namely the likelihood estimation and the additive Markov chains. The obtained results may have applications for sequential approximation of artificial neural network training.
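
    The brute-force empirical estimate that the proposed decomposition interpolates away from (the likelihood end of the spectrum) is simple to sketch; the memory-function machinery itself is the paper's contribution and is not reproduced here.

```python
from collections import Counter, defaultdict

def conditional_probabilities(seq, order):
    """Empirical estimate of P(a | w) for every context w of length
    `order` observed in a symbolic sequence, by direct counting.
    This is the plain likelihood estimate, which becomes data-hungry
    as the chain order grows."""
    counts = defaultdict(Counter)
    for i in range(order, len(seq)):
        counts[seq[i - order:i]][seq[i]] += 1
    return {w: {a: n / sum(c.values()) for a, n in c.items()}
            for w, c in counts.items()}

# A short binary sequence in which the next symbol tends to repeat
# the previous one (illustrative data).
seq = "0011001110011000110011100110"
probs = conditional_probabilities(seq, 1)
print(probs["0"], probs["1"])
```

    For an order-L chain over an alphabet of size m there are m^L contexts to estimate; the memory-function decomposition above tames this growth by expanding the conditional probability in correlation orders instead.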

  18. Decomposition of conditional probability for high-order symbolic Markov chains

    NASA Astrophysics Data System (ADS)

    Melnik, S. S.; Usatenko, O. V.

    2017-07-01

    The main goal of this paper is to develop an estimate for the conditional probability function of random stationary ergodic symbolic sequences with elements belonging to a finite alphabet. We elaborate on a decomposition procedure for the conditional probability function of sequences considered to be high-order Markov chains. We represent the conditional probability function as the sum of multilinear memory function monomials of different orders (from zero up to the chain order). This allows us to introduce a family of Markov chain models and to construct artificial sequences via a method of successive iterations, taking into account at each step increasingly high correlations among random elements. At weak correlations, the memory functions are uniquely expressed in terms of the high-order symbolic correlation functions. The proposed method fills the gap between two approaches, namely the likelihood estimation and the additive Markov chains. The obtained results may have applications for sequential approximation of artificial neural network training.

  19. Nonthermal effects of therapeutic ultrasound: the frequency resonance hypothesis.

    PubMed

    Johns, Lennart D

    2002-07-01

    To present the frequency resonance hypothesis, a possible mechanical mechanism by which treatment with nonthermal levels of ultrasound stimulates therapeutic effects. The review encompasses a 4-decade history but focuses on recent reports describing the effects of nonthermal therapeutic levels of ultrasound at the cellular and molecular levels. A search of MEDLINE from 1965 through 2000 using the terms ultrasound and therapeutic ultrasound. The literature provides a number of examples in which exposure of cells to therapeutic ultrasound under nonthermal conditions modified cellular functions. Nonthermal levels of ultrasound are reported to modulate membrane properties, alter cellular proliferation, and produce increases in proteins associated with inflammation and injury repair. Combined, these data suggest that nonthermal effects of therapeutic ultrasound can modify the inflammatory response. The concept of the absorption of ultrasonic energy by enzymatic proteins leading to changes in the enzyme's activity is not novel. However, recent reports demonstrating that ultrasound affects enzyme activity and possibly gene regulation provide sufficient data to present a probable molecular mechanism of ultrasound's nonthermal therapeutic action. The frequency resonance hypothesis describes 2 possible biological mechanisms that may alter protein function as a result of the absorption of ultrasonic energy. First, absorption of mechanical energy by a protein may produce a transient conformational shift (modifying the 3-dimensional structure) and alter the protein's functional activity. Second, the resonance or shearing properties of the wave (or both) may dissociate a multimolecular complex, thereby disrupting the complex's function.
This review focuses on recent studies that have reported cellular and molecular effects of therapeutic ultrasound and presents a mechanical mechanism that may lead to a better understanding of how the nonthermal effects of ultrasound may be therapeutic. Moreover, a better understanding of ultrasound's mechanical mechanism could lead to a better understanding of how and when ultrasound should be employed as a therapeutic modality.

  20. Evolutionary relatedness does not predict competition and co-occurrence in natural or experimental communities of green algae

    PubMed Central

    Alexandrou, Markos A.; Cardinale, Bradley J.; Hall, John D.; Delwiche, Charles F.; Fritschie, Keith; Narwani, Anita; Venail, Patrick A.; Bentlage, Bastian; Pankey, M. Sabrina; Oakley, Todd H.

    2015-01-01

    The competition-relatedness hypothesis (CRH) predicts that the strength of competition is strongest among closely related species and decreases as species become less related. This hypothesis is based on the assumption that common ancestry causes close relatives to share biological traits that lead to greater ecological similarity. Although intuitively appealing, the extent to which phylogeny can predict competition and co-occurrence among species has only recently been rigorously tested, with mixed results. When studies have failed to support the CRH, critics have pointed out at least three limitations: (i) the use of data-poor phylogenies that provide inaccurate estimates of species relatedness, (ii) the use of inappropriate statistical models that fail to detect relationships between relatedness and species interactions amidst nonlinearities and heteroskedastic variances, and (iii) overly simplified laboratory conditions that fail to allow eco-evolutionary relationships to emerge. Here, we address these limitations and find they do not explain why evolutionary relatedness fails to predict the strength of species interactions or probabilities of coexistence among freshwater green algae. First, we construct a new data-rich, transcriptome-based phylogeny of freshwater green algae that are commonly cultured and used in laboratory experiments. Using this new phylogeny, we re-analyse ecological data from three previously published laboratory experiments. After accounting for the possibility of nonlinearities and heterogeneity of variances across levels of relatedness, we find no relationship between phylogenetic distance and ecological traits. In addition, we show that communities of North American green algae are randomly composed with respect to their evolutionary relationships in 99% of 1077 lakes spanning the continental United States.
Together, these analyses result in one of the most comprehensive case studies of how evolutionary history influences species interactions and community assembly in both natural and experimental systems. Our results challenge the generality of the CRH and suggest it may be time to re-evaluate the validity and assumptions of this hypothesis. PMID:25473009

  1. Fear appeals and attitude change: effects of a threat's noxiousness, probability of occurrence, and the efficacy of coping responses.

    PubMed

    Rogers, R W; Mewborn, C R

    1976-07-01

    Three factorial experiments examined the persuasive effects of the noxiousness of a threatened event, its probability of occurrence, and the efficacy of recommended protective measures. A total of 176 students participated in separate studies on the topics of cigarette smoking, driving safety, and venereal disease. The results disclosed that increments in the efficacy variable increased intentions to adopt the recommended practices. Interaction effects revealed that when the preventive practices were effective, increments in the noxiousness and probability variables facilitated attitude change; however, when the coping responses were ineffective, increments in noxiousness and probability either had no effect or a deleterious effect, respectively. These interaction effects were discussed in terms of a defensive avoidance hypothesis, the crucial component of which was an inability to ward off the danger. Furthermore, the effect of the emotion of fear upon intentions was found to be mediated by the cognitive appraisal of the severity of the threat. Finally, similarities with and extensions of previous studies were reviewed.

  2. A chi-square goodness-of-fit test for non-identically distributed random variables: with application to empirical Bayes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Conover, W.J.; Cox, D.D.; Martz, H.F.

    1997-12-01

    When using parametric empirical Bayes estimation methods for estimating the binomial or Poisson parameter, the validity of the assumed beta or gamma conjugate prior distribution is an important diagnostic consideration. Chi-square goodness-of-fit tests of the beta or gamma prior hypothesis are developed for use when the binomial sample sizes or Poisson exposure times vary. Nine examples illustrate the application of the methods, using real data from such diverse applications as the loss of feedwater flow rates in nuclear power plants, the probability of failure to run on demand and the failure rates of the high pressure coolant injection systems at US commercial boiling water reactors, the probability of failure to run on demand of emergency diesel generators in US commercial nuclear power plants, the rate of failure of aircraft air conditioners, baseball batting averages, the probability of testing positive for toxoplasmosis, and the probability of tumors in rats. The tests are easily applied in practice by means of corresponding Mathematica® computer programs which are provided.
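
    The marginal distribution being tested can be sketched in the equal-sample-size special case (the tests above additionally handle varying binomial sample sizes). Here a Pearson chi-square statistic is computed against a fitted beta-binomial marginal, with illustrative data and prior parameters that are hypothetical, not from the nine examples:

```python
from math import comb, exp, lgamma

def log_beta(a, b):
    """log of the Beta function B(a, b)."""
    return lgamma(a) + lgamma(b) - lgamma(a + b)

def beta_binomial_pmf(k, n, a, b):
    """Marginal P(X = k) when X | p ~ Binomial(n, p) and p ~ Beta(a, b)."""
    return comb(n, k) * exp(log_beta(k + a, n - k + b) - log_beta(a, b))

def chi_square_stat(successes, n, a, b):
    """Pearson chi-square comparing the observed distribution of
    success counts (one count per unit, all units with n trials here)
    against the fitted beta-binomial marginal."""
    m = len(successes)
    stat = 0.0
    for k in range(n + 1):
        expected = m * beta_binomial_pmf(k, n, a, b)
        observed = successes.count(k)
        stat += (observed - expected) ** 2 / expected
    return stat

# Illustrative data: successes out of n = 5 trials for 12 units,
# with hypothetical prior guesses a = 1, b = 4.
successes = [0, 1, 1, 2, 0, 1, 3, 1, 2, 1, 0, 2]
print(chi_square_stat(successes, 5, 1.0, 4.0))
```

    In practice the prior parameters would be estimated from the data, and cells with small expected counts would be pooled before referring the statistic to a chi-square distribution, as the cited methods prescribe in more general form.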

  3. New normative standards of conditional reasoning and the dual-source model

    PubMed Central

    Singmann, Henrik; Klauer, Karl Christoph; Over, David

    2014-01-01

    There has been a major shift in research on human reasoning toward Bayesian and probabilistic approaches, which has been called a new paradigm. The new paradigm sees most everyday and scientific reasoning as taking place in a context of uncertainty, and inference is from uncertain beliefs and not from arbitrary assumptions. In this manuscript we present an empirical test of normative standards in the new paradigm using a novel probabilized conditional reasoning task. Our results indicated that for everyday conditionals with at least a weak causal connection between antecedent and consequent, only the conditional probability of the consequent given the antecedent contributes unique variance to predicting the probability of the conditional, but not the probability of the conjunction, nor the probability of the material conditional. Regarding normative accounts of reasoning, we found significant evidence that participants' responses were confidence preserving (i.e., p-valid in the sense of Adams, 1998) for MP inferences, but not for MT inferences. Additionally, only for MP inferences and to a lesser degree for DA inferences did the rate of responses inside the coherence intervals defined by mental probability logic (Pfeifer and Kleiter, 2005, 2010) exceed chance levels. In contrast to the normative accounts, the dual-source model (Klauer et al., 2010) is a descriptive model. It posits that participants integrate their background knowledge (i.e., the type of information primary to the normative approaches) and their subjective probability that a conclusion is seen as warranted based on its logical form. Model fits showed that the dual-source model, which employed participants' responses to a deductive task with abstract contents to estimate the form-based component, provided as good an account of the data as a model that solely used data from the probabilized conditional reasoning task. PMID:24860516
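
    The coherence intervals of mental probability logic referred to above have a simple closed form for modus ponens. The following sketch is standard total-probability reasoning, not code from the study: it computes the coherent bounds on P(q) given P(p) and P(q|p).

```python
def mp_coherence_interval(p_antecedent, p_cond):
    """Coherent bounds on P(q) for modus ponens, given P(p) and
    P(q|p).  By total probability,
        P(q) = P(q|p) * P(p) + P(q|not-p) * (1 - P(p)),
    and since the premises leave P(q|not-p) unconstrained in [0, 1],
    P(q) can coherently lie anywhere in the interval below."""
    lower = p_antecedent * p_cond
    upper = lower + (1.0 - p_antecedent)
    return lower, upper

lo, hi = mp_coherence_interval(0.9, 0.8)
print(lo, hi)  # roughly 0.72 and 0.82
```

    A participant's response counts as coherent in the sense above when it falls inside this interval; the width 1 − P(p) shows why MP intervals are tight when the antecedent is near certain.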

  4. New normative standards of conditional reasoning and the dual-source model.

    PubMed

    Singmann, Henrik; Klauer, Karl Christoph; Over, David

    2014-01-01

    There has been a major shift in research on human reasoning toward Bayesian and probabilistic approaches, which has been called a new paradigm. The new paradigm sees most everyday and scientific reasoning as taking place in a context of uncertainty, and inference is from uncertain beliefs and not from arbitrary assumptions. In this manuscript we present an empirical test of normative standards in the new paradigm using a novel probabilized conditional reasoning task. Our results indicated that for everyday conditionals with at least a weak causal connection between antecedent and consequent, only the conditional probability of the consequent given the antecedent contributes unique variance to predicting the probability of the conditional, but not the probability of the conjunction, nor the probability of the material conditional. Regarding normative accounts of reasoning, we found significant evidence that participants' responses were confidence preserving (i.e., p-valid in the sense of Adams, 1998) for MP inferences, but not for MT inferences. Additionally, only for MP inferences and to a lesser degree for DA inferences did the rate of responses inside the coherence intervals defined by mental probability logic (Pfeifer and Kleiter, 2005, 2010) exceed chance levels. In contrast to the normative accounts, the dual-source model (Klauer et al., 2010) is a descriptive model. It posits that participants integrate their background knowledge (i.e., the type of information primary to the normative approaches) and their subjective probability that a conclusion is seen as warranted based on its logical form. Model fits showed that the dual-source model, which employed participants' responses to a deductive task with abstract contents to estimate the form-based component, provided as good an account of the data as a model that solely used data from the probabilized conditional reasoning task.

  5. Probable Posttraumatic Stress Disorder in the US Veteran Population According to DSM-5: Results From the National Health and Resilience in Veterans Study.

    PubMed

    Wisco, Blair E; Marx, Brian P; Miller, Mark W; Wolf, Erika J; Mota, Natalie P; Krystal, John H; Southwick, Steven M; Pietrzak, Robert H

    2016-11-01

    With the publication of DSM-5, important changes were made to the diagnostic criteria for posttraumatic stress disorder (PTSD), including the addition of 3 new symptoms. Some have argued that these changes will further increase the already high rates of comorbidity between PTSD and other psychiatric disorders. This study examined the prevalence of DSM-5 PTSD, conditional probability of PTSD given certain trauma exposures, endorsement of specific PTSD symptoms, and psychiatric comorbidities in the US veteran population. Data were analyzed from the National Health and Resilience in Veterans Study (NHRVS), a Web-based survey of a cross-sectional, nationally representative, population-based sample of 1,484 US veterans, which was fielded from September through October 2013. Probable PTSD was assessed using the PTSD Checklist-5. The weighted lifetime and past-month prevalence of probable DSM-5 PTSD was 8.1% (SE = 0.7%) and 4.7% (SE = 0.6%), respectively. Conditional probability of lifetime probable PTSD ranged from 10.1% (sudden death of close family member or friend) to 28.0% (childhood sexual abuse). The DSM-5 PTSD symptoms with the lowest prevalence among veterans with probable PTSD were trauma-related amnesia and reckless and self-destructive behavior. Probable PTSD was associated with increased odds of mood and anxiety disorders (OR = 7.6-62.8, P < .001), substance use disorders (OR = 3.9-4.5, P < .001), and suicidal behaviors (OR = 6.7-15.1, P < .001). In US veterans, the prevalence of DSM-5 probable PTSD, conditional probability of probable PTSD, and odds of psychiatric comorbidity were similar to prior findings with DSM-IV-based measures; we found no evidence that changes in DSM-5 increase psychiatric comorbidity. Results underscore the high rates of exposure to both military and nonmilitary trauma and the high public health burden of DSM-5 PTSD and comorbid conditions in veterans. © Copyright 2016 Physicians Postgraduate Press, Inc.
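
    The "conditional probability of PTSD given a trauma type" and the reported odds ratios are simple functions of cell counts. A minimal sketch with hypothetical counts (illustrative only, not the NHRVS data):

```python
def conditional_probability(cases_exposed, total_exposed):
    """P(outcome | exposure): cases among the exposed, divided by
    all exposed individuals."""
    return cases_exposed / total_exposed

def odds_ratio(a, b, c, d):
    """Odds ratio from a 2x2 table: a = exposed cases, b = exposed
    non-cases, c = unexposed cases, d = unexposed non-cases."""
    return (a * d) / (b * c)

# Hypothetical counts: 28 probable-PTSD cases among 100 veterans
# reporting a given trauma type.
print(conditional_probability(28, 100))  # 0.28
print(odds_ratio(30, 70, 5, 95))
```

    In the survey itself these quantities are computed with sampling weights, which changes the arithmetic from raw counts to weighted totals but not the underlying definitions.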

  6. Multiple benefits of alloparental care in a fluctuating environment.

    PubMed

    Guindre-Parker, Sarah; Rubenstein, Dustin R

    2018-02-01

    Although cooperatively breeding vertebrates occur disproportionately in unpredictable environments, the underlying mechanism shaping this biogeographic pattern remains unclear. Cooperative breeding may buffer against harsh conditions (hard life hypothesis), or additionally allow for sustained breeding under benign conditions (temporal variability hypothesis). To distinguish between the hard life and temporal variability hypotheses, we investigated whether the number of alloparents at a nest increased reproductive success or load-lightening in superb starlings (Lamprotornis superbus), and whether these two types of benefits varied in harsh and benign years. We found that mothers experienced both types of benefits consistent with the temporal variability hypothesis, as larger contingents of alloparents increased the number of young fledged while simultaneously allowing mothers to reduce their provisioning rates under both harsh and benign rainfall conditions. By contrast, fathers experienced load-lightening only under benign rainfall conditions, suggesting that cooperative breeding may serve to take advantage of unpredictable benign breeding seasons when they do occur. Cooperative breeding in unpredictable environments may thus promote flexibility in offspring care behaviour, which could mitigate variability in the cost of raising young. Our results highlight the importance of considering how offspring care decisions vary among breeding roles and across fluctuating environmental conditions.

  7. Multiple benefits of alloparental care in a fluctuating environment

    PubMed Central

    2018-01-01

    Although cooperatively breeding vertebrates occur disproportionately in unpredictable environments, the underlying mechanism shaping this biogeographic pattern remains unclear. Cooperative breeding may buffer against harsh conditions (hard life hypothesis), or additionally allow for sustained breeding under benign conditions (temporal variability hypothesis). To distinguish between the hard life and temporal variability hypotheses, we investigated whether the number of alloparents at a nest increased reproductive success or load-lightening in superb starlings (Lamprotornis superbus), and whether these two types of benefits varied in harsh and benign years. We found that mothers experienced both types of benefits consistent with the temporal variability hypothesis, as larger contingents of alloparents increased the number of young fledged while simultaneously allowing mothers to reduce their provisioning rates under both harsh and benign rainfall conditions. By contrast, fathers experienced load-lightening only under benign rainfall conditions, suggesting that cooperative breeding may serve to take advantage of unpredictable benign breeding seasons when they do occur. Cooperative breeding in unpredictable environments may thus promote flexibility in offspring care behaviour, which could mitigate variability in the cost of raising young. Our results highlight the importance of considering how offspring care decisions vary among breeding roles and across fluctuating environmental conditions. PMID:29515910

  8. Apgar score and dental caries risk in the primary dentition of five year olds.

    PubMed

    Sanders, A E; Slade, G D

    2010-09-01

    Conditions in utero and early life underlie risk for several childhood disorders. This study tested the hypothesis that the Apgar score predicted dental caries in the primary dentition. A retrospective cohort study conducted in 2003 examined associations between conditions at birth and early life with dental caries experience at five years. Dental examination data for a random sample of five-year-old South Australian children were obtained from School Dental Service electronic records. A questionnaire mailed to the parents obtained information about neonatal status at delivery (five-minute Apgar score, birthweight, plurality, gestational age) and details about birth order, weaning, and behavioural, familial and sociodemographic characteristics. Of the 1398 sampled children with a completed questionnaire (response rate=64.6%), 1058 were singleton term deliveries among whom prevalence of dental caries was 40.1%. In weighted log-binomial regression analysis, children with an Apgar score of <=8 relative to a score of 9-10 had greater probability of dental caries in the primary dentition after adjusting for sociodemographic and behavioural covariates and water fluoridation concentration (adjusted PR=1.47, 95% CI=1.11, 1.95). Readily accessible markers of early life, such as the Apgar score, may guide clinicians in identifying children at potentially heightened risk for dental caries and aid decision-making in allocating preventive services.

  9. Understanding the ontogeny of foraging behaviour: insights from combining marine predator bio-logging with satellite-derived oceanography in hidden Markov models.

    PubMed

    Grecian, W James; Lane, Jude V; Michelot, Théo; Wade, Helen M; Hamer, Keith C

    2018-06-01

    The development of foraging strategies that enable juveniles to efficiently identify and exploit predictable habitat features is critical for survival and long-term fitness. In the marine environment, meso- and sub-mesoscale features such as oceanographic fronts offer a visible cue to enhanced foraging conditions, but how individuals learn to identify these features is a mystery. In this study, we investigate age-related differences in the fine-scale foraging behaviour of adult (aged ≥ 5 years) and immature (aged 2-4 years) northern gannets Morus bassanus. Using high-resolution GPS-loggers, we reveal that adults have a much narrower foraging distribution than immature birds and much higher individual foraging site fidelity. By conditioning the transition probabilities of a hidden Markov model on satellite-derived measures of frontal activity, we then demonstrate that adults show a stronger response to frontal activity than immature birds, and are more likely to commence foraging behaviour as frontal intensity increases. Together, these results indicate that adult gannets are more proficient foragers than immatures, supporting the hypothesis that foraging specializations are learned during individual exploratory behaviour in early life. Such memory-based individual foraging strategies may also explain the extended period of immaturity observed in gannets and many other long-lived species. © 2018 The Authors.

  10. Catalytic Role of Manganese Oxides in Prebiotic Nucleobases Synthesis from Formamide.

    PubMed

    Bhushan, Brij; Nayak, Arunima; Kamaluddin

    2016-06-01

    Origin of life processes might have begun with the formation of important biomonomers, such as amino acids and nucleotides, from simple molecules present in the prebiotic environment and their subsequent condensation to biopolymers. While studying the prebiotic synthesis of naturally occurring purine and pyrimidine derivatives from formamide, the manganese oxides demonstrated not only good binding for formamide but also novel catalytic activity. A novel one-pot manganese oxide catalyzed synthesis of pyrimidine nucleobases like thymine is reported, along with the formation of other nucleobases like purine, 9-(hydroxyacetyl)purine, cytosine, 4(3H)-pyrimidinone and adenine in acceptable amounts. The work reported is significant in the sense that the synthesis of thymine has exhibited difficulties, especially under one-pot conditions, and has previously been reported only under the catalytic activity of TiO2. The lower oxides of manganese were reported to show higher potential as catalysts, and their existence was favored by the reducing atmospheric conditions prevalent on early Earth, thereby confirming the hypothesis that minerals having metals in reduced form might have been more active during the course of chemical evolution. Our results further confirm the role of formamide as a probable precursor for the formation of purine and pyrimidine bases during the course of chemical evolution and the origin of life.

  11. The impact of reproduction on the stress axis of free-living male northern red backed voles (Myodes rutilus).

    PubMed

    Fletcher, Quinn E; Dantzer, Ben; Boonstra, Rudy

    2015-12-01

    Activation of the hypothalamic-pituitary-adrenal (HPA) axis culminates in the release of glucocorticoids (henceforth CORT), which have wide-reaching physiological effects. Three hypotheses potentially explain seasonal variation in CORT. The enabling hypothesis predicts that reproductive season CORT exceeds post-reproductive season CORT because CORT enables reproductive investment. The inhibitory hypothesis predicts the opposite because CORT can negatively affect reproductive function. The costs of reproduction hypothesis predicts that HPA axis condition declines over and following the reproductive season. We tested these hypotheses in wild male red-backed voles (Myodes rutilus) during the reproductive and post-reproductive seasons. We quantified CORT levels in response to restraint stress tests consisting of three blood samples (initial, stress-induced, and recovery). Mineralocorticoid (MR) and glucocorticoid (GR) receptor mRNA levels in the brain were also quantified over the reproductive season. Total CORT (tCORT) in the initial and stress-induced samples was greater in the post-reproductive than in the reproductive season, which supported the inhibitory hypothesis. Conversely, free CORT (fCORT) did not differ between the reproductive and post-reproductive seasons, which was counter to both the enabling and inhibitory hypotheses. Evidence for HPA axis condition decline in CORT as well as GR and MR mRNA over the reproductive season (i.e., the costs of reproduction hypothesis) was mixed. Moreover, all of the parameters that showed signs of declining condition over the reproductive season did not also show signs of declining condition over the post-reproductive season, suggesting that the costs resulting from reproductive investment had subsided. In conclusion, our results suggest that different aspects of the HPA axis respond differently to seasonal changes and reproductive investment. Copyright © 2015 Elsevier Inc. All rights reserved.

  12. Physiological condition of autumn-banded mallards and its relationship to hunting vulnerability

    USGS Publications Warehouse

    Hepp, G.R.; Blohm, R.J.; Reynolds, R.E.; Hines, J.E.; Nichols, J.D.

    1986-01-01

An important topic of waterfowl ecology concerns the relationship between the physiological condition of ducks during the nonbreeding season and fitness, i.e., survival and future reproductive success. We investigated this subject using direct band recovery records of mallards (Anas platyrhynchos) banded in autumn (1 Oct-15 Dec) 1981-83 in the Mississippi Alluvial Valley (MAV) [USA]. A condition index, weight (g)/wing length (mm), was calculated for each duck, and we tested whether condition of mallards at time of banding was related to their probability of recovery during the hunting season. In 3 years, 5,610 mallards were banded and there were 234 direct recoveries. A binary regression model was used to test the relationship between recovery probability and condition, and likelihood-ratio tests were conducted to determine the most suitable model. For mallards banded in autumn there was a negative relationship between physical condition and the probability of recovery: mallards in poor condition at the time of banding had a greater probability of being recovered during the hunting season. In general, this was true for all age and sex classes; however, the strongest relationship occurred for adult males.
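The condition-recovery relationship described above can be sketched as a binary (logistic) regression of recovery on a condition index. The data below are simulated stand-ins: the coefficients, sample size, and condition-index scale are assumptions for illustration, not the study's values.

```python
import math, random

def fit_logistic(x, y, lr=0.1, epochs=2000):
    """Fit P(recovery) = 1 / (1 + exp(-(b0 + b1 * condition))) by gradient ascent."""
    b0, b1 = 0.0, 0.0
    n = len(x)
    for _ in range(epochs):
        g0 = g1 = 0.0
        for xi, yi in zip(x, y):
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * xi)))
            g0 += yi - p
            g1 += (yi - p) * xi
        b0 += lr * g0 / n
        b1 += lr * g1 / n
    return b0, b1

random.seed(1)
# Hypothetical condition index (weight/wing length), centered at its mean;
# simulated so that poorer condition raises the odds of band recovery
cond = [random.gauss(0.0, 0.4) for _ in range(500)]
recov = [1 if random.random() < 1.0 / (1.0 + math.exp(1.0 + 1.5 * c)) else 0
         for c in cond]
b0, b1 = fit_logistic(cond, recov)
print(round(b0, 2), round(b1, 2))  # the condition coefficient b1 comes out negative
```

A negative fitted slope reproduces the paper's qualitative finding: lower condition at banding, higher recovery probability.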

  13. Killing Me Softly: The Fetal Origins Hypothesis*

    PubMed Central

    Almond, Douglas

    2013-01-01

    In the epidemiological literature, the fetal origins hypothesis associated with David J. Barker posits that chronic, degenerative conditions of adult health, including heart disease and type 2 diabetes, may be triggered by circumstance decades earlier, in utero nutrition in particular. Economists have expanded on this hypothesis, investigating a broader range of fetal shocks and circumstances and have found a wealth of later-life impacts on outcomes including test scores, educational attainment, and income, along with health. In the process, they have provided some of the most credible observational evidence in support of the hypothesis. The magnitude of the impacts is generally large. Thus, the fetal origins hypothesis has not only survived contact with economics, but has flourished. PMID:25152565

  14. Quantitative assessment of building fire risk to life safety.

    PubMed

    Guanquan, Chu; Jinhua, Sun

    2008-06-01

This article presents a quantitative risk assessment framework for evaluating fire risk to life safety. Fire risk is divided into two parts: the probability and the corresponding consequence of every fire scenario. The time-dependent event tree technique is used to analyze probable fire scenarios based on the effect of fire protection systems on fire spread and smoke movement. To obtain the variation of occurrence probability with time, a Markov chain is combined with the time-dependent event tree for stochastic analysis of the occurrence probability of fire scenarios. To obtain the consequences of every fire scenario, several uncertainties are considered in the risk analysis process. When calculating the onset time to untenable conditions, a range of design fires is constructed based on different fire growth rates, after which the uncertainty of the onset time to untenable conditions can be characterized by a probability distribution. When calculating occupant evacuation time, occupant pre-movement time is treated as a probability distribution. The consequences of a fire scenario can then be evaluated from the probability distributions of evacuation time and onset time to untenable conditions, and fire risk to life safety can be evaluated from the occurrence probability and consequences of every fire scenario. To illustrate the assessment method in detail, a commercial building is presented as a case study, and a discussion compares the assessment result with fire statistics.
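The Markov-chain evolution of scenario probabilities can be sketched with a small transition matrix; the states and per-minute transition probabilities below are illustrative assumptions, not values from the article.

```python
# States: 0 = fire suppressed, 1 = fire growing, 2 = flashover (absorbing)
# Hypothetical per-minute transition probabilities
P = [
    [1.00, 0.00, 0.00],   # suppressed stays suppressed
    [0.30, 0.55, 0.15],   # growing: suppressed, keeps growing, or flashes over
    [0.00, 0.00, 1.00],   # flashover is absorbing
]

def step(p, P):
    """One Markov-chain step: p_t+1[j] = sum_i p_t[i] * P[i][j]."""
    return [sum(p[i] * P[i][j] for i in range(len(P))) for j in range(len(P))]

p = [0.0, 1.0, 0.0]       # the fire has just ignited and is growing
for t in range(10):       # evolve scenario probabilities over 10 minutes
    p = step(p, P)
print([round(x, 3) for x in p])
```

After enough steps almost all probability mass sits in the two absorbing outcomes, which is what makes the scenario probabilities time-dependent.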

  15. Technology-enhanced Interactive Teaching of Marginal, Joint and Conditional Probabilities: The Special Case of Bivariate Normal Distribution

    PubMed Central

    Dinov, Ivo D.; Kamino, Scott; Bhakhrani, Bilal; Christou, Nicolas

    2014-01-01

Data analysis requires subtle probability reasoning to answer questions like What is the chance of event A occurring, given that event B was observed? This generic question arises in discussions of many intriguing scientific questions such as What is the probability that an adolescent weighs between 120 and 140 pounds given that they are of average height? and What is the probability of (monetary) inflation exceeding 4% and housing price index below 110? To address such problems, learning some applied, theoretical or cross-disciplinary probability concepts is necessary. Teaching such courses can be improved by utilizing modern information technology resources. Students’ understanding of multivariate distributions, conditional probabilities, correlation and causation can be significantly strengthened by employing interactive web-based science educational resources. Independent of the type of a probability course (e.g. majors, minors or service probability course, rigorous measure-theoretic, applied or statistics course) student motivation, learning experiences and knowledge retention may be enhanced by blending modern technological tools within the classical conceptual pedagogical models. We have designed, implemented and disseminated a portable open-source web-application for teaching multivariate distributions, marginal, joint and conditional probabilities using the special case of bivariate Normal distribution. A real adolescent height and weight dataset is used to demonstrate the classroom utilization of the new web-application to address problems of parameter estimation, univariate and multivariate inference. PMID:25419016

  16. Technology-enhanced Interactive Teaching of Marginal, Joint and Conditional Probabilities: The Special Case of Bivariate Normal Distribution.

    PubMed

    Dinov, Ivo D; Kamino, Scott; Bhakhrani, Bilal; Christou, Nicolas

    2013-01-01

    Data analysis requires subtle probability reasoning to answer questions like What is the chance of event A occurring, given that event B was observed? This generic question arises in discussions of many intriguing scientific questions such as What is the probability that an adolescent weighs between 120 and 140 pounds given that they are of average height? and What is the probability of (monetary) inflation exceeding 4% and housing price index below 110? To address such problems, learning some applied, theoretical or cross-disciplinary probability concepts is necessary. Teaching such courses can be improved by utilizing modern information technology resources. Students' understanding of multivariate distributions, conditional probabilities, correlation and causation can be significantly strengthened by employing interactive web-based science educational resources. Independent of the type of a probability course (e.g. majors, minors or service probability course, rigorous measure-theoretic, applied or statistics course) student motivation, learning experiences and knowledge retention may be enhanced by blending modern technological tools within the classical conceptual pedagogical models. We have designed, implemented and disseminated a portable open-source web-application for teaching multivariate distributions, marginal, joint and conditional probabilities using the special case of bivariate Normal distribution. A real adolescent height and weight dataset is used to demonstrate the classroom utilization of the new web-application to address problems of parameter estimation, univariate and multivariate inference.
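The adolescent weight-given-height question in the abstract reduces to the conditional distribution of a bivariate normal: given H = h, W is normal with mean mu_W + rho * (sd_W / sd_H) * (h - mu_H) and standard deviation sd_W * sqrt(1 - rho^2). The sketch below uses hypothetical parameters (the means, standard deviations, and correlation are assumptions, not the dataset's values).

```python
import math

def norm_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def conditional_prob(w_lo, w_hi, h, mu_w, sd_w, mu_h, sd_h, rho):
    """P(w_lo < W < w_hi | H = h) under a bivariate normal model."""
    mu = mu_w + rho * sd_w / sd_h * (h - mu_h)   # conditional mean
    sd = sd_w * math.sqrt(1.0 - rho ** 2)        # conditional std dev
    return norm_cdf((w_hi - mu) / sd) - norm_cdf((w_lo - mu) / sd)

# Hypothetical parameters: weight ~ N(130, 20) lb, height ~ N(65, 4) in, rho = 0.6
p = conditional_prob(120, 140, 65, 130, 20, 65, 4, 0.6)
print(round(p, 3))
```

Conditioning on an average height shrinks the spread of weight (by the factor sqrt(1 - rho^2)), so the conditional probability differs from the marginal one.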

  17. Metocean design parameter estimation for fixed platform based on copula functions

    NASA Astrophysics Data System (ADS)

    Zhai, Jinjin; Yin, Qilin; Dong, Sheng

    2017-08-01

Considering the dependence among wave height, wind speed, and current velocity, we construct novel trivariate joint probability distributions via Archimedean copula functions. Thirty years of hindcast wave height, wind speed, and current velocity data for the Bohai Sea are sampled for a case study. Four distributions, namely the Gumbel, lognormal, Weibull, and Pearson Type III distributions, are candidate models for the marginal distributions of wave height, wind speed, and current velocity; the Pearson Type III distribution is selected as the optimal model. Bivariate and trivariate probability distributions of these environmental conditions are established based on four bivariate and trivariate Archimedean copulas, namely the Clayton, Frank, Gumbel-Hougaard, and Ali-Mikhail-Haq copulas. These joint probability models preserve the marginal information and the dependence among the three variables. The design return values of the three variables can be obtained by three methods: univariate probability, conditional probability, and joint probability. The joint return periods of different load combinations are estimated by the proposed models, and platform responses (including base shear, overturning moment, and deck displacement) are further calculated. For the same return period, the design values of wave height, wind speed, and current velocity obtained by the conditional and joint probability models are much smaller than those obtained by univariate probability. By accounting for the dependence among variables, the multivariate probability distributions provide design parameters closer to the actual sea state for ocean platform design.
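A joint return period calculation of the kind described can be sketched with the bivariate Gumbel-Hougaard copula; the marginal non-exceedance probabilities and the dependence parameter below are assumptions for illustration, not fitted Bohai Sea values.

```python
import math

def gumbel_hougaard(u, v, theta):
    """Gumbel-Hougaard copula C(u, v); theta >= 1 controls upper-tail dependence."""
    return math.exp(-(((-math.log(u)) ** theta
                       + (-math.log(v)) ** theta) ** (1.0 / theta)))

# Marginal non-exceedance probabilities of annual-maximum wave height and wind speed
u = 0.98      # 50-year wave height
v = 0.98      # 50-year wind speed
theta = 2.0   # hypothetical dependence parameter

C = gumbel_hougaard(u, v, theta)
T_or = 1.0 / (1.0 - C)             # "OR" return period: either variable exceeds
T_and = 1.0 / (1.0 - u - v + C)    # "AND" return period: both exceed in one year
print(round(T_or, 1), round(T_and, 1))
```

With positive dependence, the "OR" event is more frequent than either 50-year marginal event and the "AND" event is rarer, which is why joint models give different design combinations than univariate analysis.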

  18. Information-Theoretic Perspectives on Geophysical Models

    NASA Astrophysics Data System (ADS)

    Nearing, Grey

    2016-04-01

To test any hypothesis about any dynamic system, it is necessary to build a model that places that hypothesis into the context of everything else that we know about the system: initial and boundary conditions and interactions between various governing processes (Hempel and Oppenheim, 1948; Cartwright, 1983). No hypothesis can be tested in isolation, and no hypothesis can be tested without a model (for a geoscience-related discussion see Clark et al., 2011). Science is (currently) fundamentally reductionist in the sense that we seek some small set of governing principles that can explain all phenomena in the universe, and such laws are ontological in the sense that they describe the object under investigation (Davies, 1990 gives several competing perspectives on this claim). However, since we cannot build perfect models of complex systems, any model that does not also contain an epistemological component (i.e., a statement, like a probability distribution, that refers directly to the quality of the information from the model) is falsified immediately (in the sense of Popper, 2002) given only a small number of observations. Models necessarily contain both ontological and epistemological components, and what this means is that the purpose of any robust scientific method is to measure the amount and quality of information provided by models. I believe that any viable philosophy of science must be reducible to this statement. The first step toward a unified theory of scientific models (and therefore a complete philosophy of science) is a quantitative language that applies to both ontological and epistemological questions.
Information theory is one such language: Cox's (1946) theorem (see Van Horn, 2003) tells us that probability theory is the (only) calculus that is consistent with Classical Logic (Jaynes, 2003; chapter 1), and information theory is simply the integration of convex transforms of probability ratios (integration reduces density functions to scalar metrics) (Csiszár, 1972). Fundamentally, models can only translate existing information - they cannot create information. That is, all of the information about any future (or otherwise unobserved) event is contained in the initial and boundary conditions of whatever model we will use to predict that phenomenon (Gong et al., 2013). A model simply tells us how to process the available information in a way that is as close as possible to isomorphic with how the system itself processes information. As such, models can only lose or corrupt information, because at best a model can perfectly extract all information contained in its input data; this is a theorem called the Data Processing Inequality (Cover and Thomas, 1991), and this perspective represents a purely ontological treatment of information in models. In practice, however, models provide information to scientists about how to translate information, and in this epistemic sense, models can provide positive quantities of information. During engineering-type efforts, where our goal is fundamentally to make predictions, we would measure the (possibly positive) net epistemic information from some hypothesized model relative to some uninformative prior, or relative to some competing model(s), to measure how much information we gain by running the model (Nearing and Gupta, 2015). True science-focused efforts, however, where the intent is learning rather than prediction, cannot rely on this type of comparative hypothesis testing.
We therefore encourage scientists to take the first perspective outlined above and to attempt to measure the ontological information that is lost by their models, rather than the epistemological information that is gained from their models. This represents a radical departure from how scientists usually approach the problem of model evaluation, but it turns out that it is possible to approximate this objective in practice. We are aware of no existing efforts to this effect in either the philosophy or practice of science (except by Gong et al., 2013, whose fundamental insight is the basis for this talk), and here I offer two examples of practical methods that scientists might use to approximately measure ontological information. I place this practical discussion in the context of several recent and high-profile experiments that have found that simple out-of-sample statistical models typically (vastly) outperform our most sophisticated terrestrial hydrology models. I offer some perspective on several open questions about how to use these findings to improve our models and understanding of these systems. Cartwright, N. (1983) How the Laws of Physics Lie. New York, NY: Cambridge Univ Press. Clark, M. P., Kavetski, D. and Fenicia, F. (2011) 'Pursuing the method of multiple working hypotheses for hydrological modeling', Water Resources Research, 47(9). Cover, T. M. and Thomas, J. A. (1991) Elements of Information Theory. New York, NY: Wiley-Interscience. Cox, R. T. (1946) 'Probability, frequency and reasonable expectation', American Journal of Physics, 14, pp. 1-13. Csiszár, I. (1972) 'A Class of Measures of Informativity of Observation Channels', Periodica Mathematica Hungarica, 2(1), pp. 191-213. Davies, P. C. W. (1990) 'Why is the physical world so comprehensible', Complexity, entropy and the physics of information, pp. 61-70. Gong, W., Gupta, H. V., Yang, D., Sricharan, K. and Hero, A. O.
(2013) 'Estimating Epistemic & Aleatory Uncertainties During Hydrologic Modeling: An Information Theoretic Approach', Water Resources Research, 49(4), pp. 2253-2273. Jaynes, E. T. (2003) Probability Theory: The Logic of Science. New York, NY: Cambridge University Press. Nearing, G. S. and Gupta, H. V. (2015) 'The quantity and quality of information in hydrologic models', Water Resources Research, 51(1), pp. 524-538. Popper, K. R. (2002) The Logic of Scientific Discovery. New York: Routledge. Van Horn, K. S. (2003) 'Constructing a logic of plausible inference: a guide to cox's theorem', International Journal of Approximate Reasoning, 34(1), pp. 3-24.
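The Data Processing Inequality invoked in the abstract above can be verified numerically for a minimal Markov chain X -> Y -> Z built from two binary symmetric channels; the flip probabilities are arbitrary choices for illustration.

```python
import math

def h2(p):
    """Binary entropy in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# X ~ Bernoulli(0.5); Y flips X with probability e1; Z flips Y with probability e2
e1, e2 = 0.1, 0.2                         # assumed channel error rates
e_total = e1 * (1 - e2) + (1 - e1) * e2   # effective flip probability X -> Z

I_xy = 1.0 - h2(e1)       # mutual information surviving the first stage
I_xz = 1.0 - h2(e_total)  # mutual information surviving both stages
print(round(I_xy, 3), round(I_xz, 3))
```

Because the effective error rate of the composed channel is larger, I(X;Z) <= I(X;Y): each processing stage can only lose information, never create it.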

  19. Probability based models for estimation of wildfire risk

    Treesearch

    Haiganoush Preisler; D. R. Brillinger; R. E. Burgan; John Benoit

    2004-01-01

    We present a probability-based model for estimating fire risk. Risk is defined using three probabilities: the probability of fire occurrence; the conditional probability of a large fire given ignition; and the unconditional probability of a large fire. The model is based on grouped data at the 1 km²-day cell level. We fit a spatially and temporally explicit non-...
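The three-probability decomposition described above can be sketched directly: the unconditional probability of a large fire is the product of the occurrence probability and the conditional probability of a large fire given ignition. The per-cell-day numbers below are hypothetical, not fitted values from the model.

```python
# Hypothetical per-cell-day estimates for one 1 km^2 grid cell
p_ignition = 0.002             # probability of fire occurrence
p_large_given_ignition = 0.05  # conditional probability a fire becomes large

# Unconditional probability of a large fire in that cell on that day
p_large = p_ignition * p_large_given_ignition
print(round(p_large, 6))

# Expected number of large fires in the cell over a 180-day fire season
print(round(180 * p_large, 4))
```

In the actual model these probabilities vary over space and time, so the products are computed cell by cell and day by day rather than with a single pair of constants.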

  20. Targeting the probability versus cost of feared outcomes in public speaking anxiety.

    PubMed

    Nelson, Elizabeth A; Deacon, Brett J; Lickel, James J; Sy, Jennifer T

    2010-04-01

    Cognitive-behavioral theory suggests that social phobia is maintained, in part, by overestimates of the probability and cost of negative social events. Indeed, empirically supported cognitive-behavioral treatments directly target these cognitive biases through the use of in vivo exposure or behavioral experiments. While cognitive-behavioral theories and treatment protocols emphasize the importance of targeting probability and cost biases in the reduction of social anxiety, few studies have examined specific techniques for reducing probability and cost bias, and thus the relative efficacy of exposure to the probability versus cost of negative social events is unknown. In the present study, 37 undergraduates with high public speaking anxiety were randomly assigned to a single-session intervention designed to reduce either the perceived probability or the perceived cost of negative outcomes associated with public speaking. Compared to participants in the probability treatment condition, those in the cost treatment condition demonstrated significantly greater improvement on measures of public speaking anxiety and cost estimates for negative social events. The superior efficacy of the cost treatment condition was mediated by greater treatment-related changes in social cost estimates. The clinical implications of these findings are discussed. Published by Elsevier Ltd.

  1. Meteorological risks are drivers of environmental innovation in agro-ecosystem management

    NASA Astrophysics Data System (ADS)

    Gobin, Anne; Van de Vijver, Hans; Vanwindekens, Frédéric; de Frutos Cachorro, Julia; Verspecht, Ann; Planchon, Viviane; Buyse, Jeroen

    2017-04-01

Agricultural crop production is to a great extent determined by weather conditions. The research hypothesis is that meteorological risks act as drivers of environmental innovation in agro-ecosystem management. The methodology comprised five major parts: the hazard, its impact on different agro-ecosystems, vulnerability, risk management and risk communication. Generalized Extreme Value (GEV) theory was used to model annual maxima of meteorological variables based on a location, scale and shape parameter that determine, respectively, the center of the distribution, the spread around the location parameter, and the upper-tail decay. Spatial interpolation of GEV-derived return levels resulted in spatial temperature extremes, precipitation deficits and wet periods. The temporal overlap between extreme weather conditions and sensitive periods in the agro-ecosystem was realised using a bio-physically based modelling framework that couples phenology, a soil water balance and crop growth. 20-year return values for drought and waterlogging during different crop stages were related to arable yields. The method helped quantify agricultural production risks and rate both weather-based and crop-based agricultural insurance. The spatial extent of vulnerability was mapped from different layers of geo-information, including meteorology, soil-landscapes, crop cover and management. Vulnerability of agro-ecosystems was mapped based on rules set by experts' knowledge and implemented with Fuzzy Inference System modelling and Geographical Information System tools. The approach was applied to cropland vulnerability to heavy rain and grassland vulnerability to drought. The level of vulnerability and resilience of an agro-ecosystem was also determined by risk management, which differed across sectors and farm types. A calibrated agro-economic model demonstrated a marked influence of climate-adapted land allocation and crop management on individual utility.
The "chain of risk" approach allowed for investigating the hypothesis that meteorological risks act as drivers for agricultural innovation. Risk types were quantified in terms of probability and distribution, and further distinguished according to production type. Examples of strategies and options were provided at field, farm and policy level using different modelling methods.
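The GEV return-level computation underlying the risk quantification above can be sketched as follows; the fitted location, scale, and shape parameters are hypothetical.

```python
import math

def gev_return_level(mu, sigma, xi, T):
    """T-year return level of a GEV(mu, sigma, xi) annual-maximum distribution:
    z_T = mu - (sigma / xi) * (1 - y**(-xi)), with y = -ln(1 - 1/T)."""
    y = -math.log(1.0 - 1.0 / T)
    if abs(xi) < 1e-9:                  # Gumbel limit as the shape parameter -> 0
        return mu - sigma * math.log(y)
    return mu - sigma / xi * (1.0 - y ** (-xi))

# Hypothetical annual-maximum temperature parameters at one station (deg C)
mu, sigma, xi = 30.0, 2.0, -0.1         # negative shape: bounded upper tail
for T in (5, 20, 100):
    print(T, round(gev_return_level(mu, sigma, xi, T), 2))
```

Return levels grow with the return period T; spatially interpolating such levels across stations gives the extreme-weather maps described in the abstract.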

  2. Testing the hypothesis that treatment can eliminate HIV: a nationwide, population-based study of the Danish HIV epidemic in men who have sex with men.

    PubMed

    Okano, Justin T; Robbins, Danielle; Palk, Laurence; Gerstoft, Jan; Obel, Niels; Blower, Sally

    2016-07-01

    Worldwide, approximately 35 million individuals are infected with HIV; about 25 million of these live in sub-Saharan Africa. WHO proposes using treatment as prevention (TasP) to eliminate HIV. Treatment suppresses viral load, decreasing the probability an individual transmits HIV. The elimination threshold is one new HIV infection per 1000 individuals. Here, we test the hypothesis that TasP can substantially reduce epidemics and eliminate HIV. We estimate the impact of TasP, between 1996 and 2013, on the Danish HIV epidemic in men who have sex with men (MSM), an epidemic UNAIDS has identified as a priority for elimination. We use a CD4-staged Bayesian back-calculation approach to estimate incidence, and the hidden epidemic (the number of HIV-infected undiagnosed MSM). To develop the back-calculation model, we use data from an ongoing nationwide population-based study: the Danish HIV Cohort Study. Incidence, and the hidden epidemic, decreased substantially after treatment was introduced in 1996. By 2013, incidence was close to the elimination threshold: 1·4 (median, 95% Bayesian credible interval [BCI] 0·4-2·1) new HIV infections per 1000 MSM and there were only 617 (264-858) undiagnosed MSM. Decreasing incidence and increasing treatment coverage were highly correlated; a treatment threshold effect was apparent. Our study is the first to show that TasP can substantially reduce a country's HIV epidemic, and bring it close to elimination. However, we have shown the effectiveness of TasP under optimal conditions: very high treatment coverage, and exceptionally high (98%) viral suppression rate. Unless these extremely challenging conditions can be met in sub-Saharan Africa, the WHO's global elimination strategy is unlikely to succeed. National Institute of Allergy and Infectious Diseases. Copyright © 2016 Elsevier Ltd. All rights reserved.

  3. Aetiology of teenage childbearing: reasons for familial effects.

    PubMed

    Olausson, P O; Lichtenstein, P; Cnattingius, S

    2000-03-01

    The aims of the present study were to evaluate the contribution of the genetic and environmental factors to the risk of teenage childbearing, and to study whether life style, socio-economic conditions, and personality traits could explain possible familial effects. We linked two population-based registers: the Swedish Twin Register and the Swedish Medical Birth Register. The study covers female twin pairs born between 1953 and 1958, having their first infant before the age of 30 years (n = 1885). In order to separate familial effects from other environmental influences, and genetic effects from shared environmental effects, only complete twin pairs with known zygosity were included, in all 260 monozygotic and 370 dizygotic twin pairs. We used quantitative genetic analyses to evaluate the importance of genetic and environmental effects for liability to teenage childbearing. Logistic regression analyses were used to estimate the effects of life style, socio-economic situation, and personality on the probability of teenage childbearing, and to study whether psychosocial factors could explain possible familial effects. Fifty-nine percent (0-76%) of the variance in being a teenage mother was attributable to heritable factors; 0% (0-49%) was due to shared environmental factors; and 41% (23-67%) was explained by non-shared environmental factors. Thus, the data were consistent with the hypothesis that the familial aggregation of teenage childbearing is completely explained by genetic factors, although the alternative hypothesis that familial aggregation is entirely explained by shared environmental factors cannot be ruled out. Significant effects of smoking habits, housing conditions, and educational level were found in relation to liability to teenage childbearing. However, the familial effects on risk of teenage childbearing were not mediated through similarities in life style and socio-economic factors. 
When studying risk factors for teenage childbearing, it is recommended to include life style and socio-economic variables as well as information about family history of teenage childbearing. Twin Research (2000) 3, 23-27.
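The variance decomposition reported above (59% genetic, 0% shared environment, 41% non-shared environment) can be reproduced with Falconer-style point estimates from twin-pair correlations. The full study used quantitative genetic model fitting, so this is only a simplified sketch, and the correlations below are hypothetical values chosen to match the reported split.

```python
def ace_estimates(r_mz, r_dz):
    """Falconer-style point estimates of additive genetic (A), shared (C),
    and non-shared (E) environmental variance from MZ and DZ twin correlations."""
    a2 = 2.0 * (r_mz - r_dz)   # heritability
    c2 = 2.0 * r_dz - r_mz     # shared environment
    e2 = 1.0 - r_mz            # non-shared environment (plus measurement error)
    return a2, c2, e2

# Hypothetical twin-pair correlations for liability to teenage childbearing
a2, c2, e2 = ace_estimates(r_mz=0.59, r_dz=0.295)
print(round(a2, 2), round(c2, 2), round(e2, 2))
```

When the DZ correlation is exactly half the MZ correlation, the shared-environment estimate is zero, which is the pattern consistent with the study's purely genetic explanation of familial aggregation.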

  4. The Gumbel hypothesis test for left censored observations using regional earthquake records as an example

    NASA Astrophysics Data System (ADS)

    Thompson, E. M.; Hewlett, J. B.; Baise, L. G.; Vogel, R. M.

    2011-01-01

Annual maximum (AM) time series are incomplete (i.e., censored) when no events are included above the assumed censoring threshold (i.e., magnitude of completeness). We introduce a distributional hypothesis test for left-censored Gumbel observations based on the probability plot correlation coefficient (PPCC). Critical values of the PPCC hypothesis test statistic are computed from Monte Carlo simulations and are a function of sample size, censoring level, and significance level. When applied to a global catalog of earthquake observations, the left-censored Gumbel PPCC tests are unable to reject the Gumbel hypothesis for 45 of 46 seismic regions. We apply four different field significance tests for combining individual tests into a collective hypothesis test. None of the field significance tests are able to reject the global hypothesis that AM earthquake magnitudes arise from a Gumbel distribution. Because the field significance levels are not conclusive, we also compute the likelihood that these field significance tests are unable to reject the Gumbel model when the samples arise from a more complex distributional alternative. A power study documents that the censored Gumbel PPCC test is unable to reject some important and viable Generalized Extreme Value (GEV) alternatives. Thus, we cannot rule out the possibility that the global AM earthquake time series could arise from a GEV distribution with a finite upper bound, also known as a reverse Weibull distribution. Our power study also indicates that the binomial and uniform field significance tests are substantially more powerful than the more commonly used Bonferroni and false discovery rate multiple comparison procedures.
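A minimal, uncensored version of the Gumbel PPCC test can be sketched as follows. The published test additionally handles left censoring, which is omitted here; the plotting-position formula and simulation settings are assumptions for illustration.

```python
import math, random

def gumbel_ppcc(sample):
    """Probability plot correlation coefficient of a sample
    against Gumbel reduced variates."""
    x = sorted(sample)
    n = len(x)
    # Gringorten plotting positions mapped through the inverse Gumbel CDF
    q = [-math.log(-math.log((i + 1 - 0.44) / (n + 0.12))) for i in range(n)]
    mx, mq = sum(x) / n, sum(q) / n
    num = sum((a - mx) * (b - mq) for a, b in zip(x, q))
    den = math.sqrt(sum((a - mx) ** 2 for a in x) * sum((b - mq) ** 2 for b in q))
    return num / den

def gumbel_sample(n, rng):
    """Draw n standard Gumbel variates by inverting the CDF."""
    return [-math.log(-math.log(rng.random())) for _ in range(n)]

def critical_value(n, alpha=0.05, reps=2000, seed=0):
    """Monte Carlo critical value: the alpha-quantile of the PPCC under H0."""
    rng = random.Random(seed)
    stats = sorted(gumbel_ppcc(gumbel_sample(n, rng)) for _ in range(reps))
    return stats[int(alpha * reps)]

rng = random.Random(1)
r = gumbel_ppcc(gumbel_sample(50, rng))
crit = critical_value(50)
print(round(r, 3), round(crit, 3), r > crit)
```

The Gumbel hypothesis is rejected only when the sample's PPCC falls below the simulated critical value; in the censored version of the test, the critical values also depend on the censoring level.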

  5. Microsaccade production during saccade cancelation in a stop-signal task

    PubMed Central

    Godlove, David C.; Schall, Jeffrey D.

    2014-01-01

    We obtained behavioral data to evaluate two alternative hypotheses about the neural mechanisms of gaze control. The “fixation” hypothesis states that neurons in rostral superior colliculus (SC) enforce fixation of gaze. The “microsaccade” hypothesis states that neurons in rostral SC encode microsaccades rather than fixation per se. Previously reported neuronal activity in monkey SC during the saccade stop-signal task leads to specific, dissociable behavioral predictions of these two hypotheses. When subjects are required to cancel partially-prepared saccades, imbalanced activity spreads across rostral and caudal SC with a reliable temporal profile. The microsaccade hypothesis predicts that this imbalance will lead to elevated microsaccade production biased toward the target location, while the fixation hypothesis predicts reduced microsaccade production. We tested these predictions by analyzing the microsaccades produced by 4 monkeys while they voluntarily canceled partially prepared eye movements in response to explicit stop signals. Consistent with the fixation hypothesis and contradicting the microsaccade hypothesis, we found that each subject produced significantly fewer microsaccades when normal saccades were successfully canceled. The few microsaccades escaping this inhibition tended to be directed toward the target location. We additionally investigated interactions between initiating microsaccades and inhibiting normal saccades. Reaction times were longer when microsaccades immediately preceded target presentation. However, pre-target microsaccade production did not affect stop-signal reaction time or alter the probability of canceling saccades following stop signals. These findings demonstrate that imbalanced activity within SC does not necessarily produce microsaccades and add to evidence that saccade preparation and cancelation are separate processes. PMID:25448116

  6. The Scientific Method, Diagnostic Bayes, and How to Detect Epistemic Errors

    NASA Astrophysics Data System (ADS)

    Vrugt, J. A.

    2015-12-01

In the past decades, Bayesian methods have found widespread application and use in environmental systems modeling. Bayes' theorem states that the posterior probability P(H|D̂) of a hypothesis H is proportional to the product of the prior probability P(H) of this hypothesis and the likelihood L(H|D̂) of the same hypothesis given the new/incoming observations D̂. In science and engineering, H often constitutes some numerical simulation model, D = F(x, ·), which summarizes, using algebraic, empirical, and differential equations, state variables and fluxes, all our theoretical and/or practical knowledge of the system of interest, and x are the d unknown parameters which are subject to inference using some data D̂ of the observed system response. The Bayesian approach is intimately related to the scientific method and uses an iterative cycle of hypothesis formulation (model), experimentation and data collection, and theory/hypothesis refinement to elucidate the rules that govern the natural world. Unfortunately, model refinement has proven to be very difficult, in large part because of the poor diagnostic power of residual-based likelihood functions (Gupta et al., 2008). This has inspired Vrugt and Sadegh (2013) to advocate the use of 'likelihood-free' inference using approximate Bayesian computation (ABC). This approach uses one or more summary statistics S(D̂) of the original data D̂, designed ideally to be sensitive only to one particular process in the model. Any mismatch between the observed and simulated summary metrics is then easily linked to a specific model component. A recurrent issue with the application of ABC is the sufficiency of the summary statistics: in theory, S(·) should contain as much information as the original data itself, yet complex systems rarely admit sufficient statistics.
In this article, we propose to combine the ideas of ABC and regular Bayesian inference to guarantee that no information is lost in diagnostic model evaluation. This hybrid approach, coined diagnostic Bayes, uses the summary metrics as prior distribution and the original data in the likelihood function, or P(x|D̂) ∝ P(x|S(D̂)) L(x|D̂). A case study illustrates the ability of the proposed methodology to diagnose epistemic errors and provide guidance on model refinement.
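The diagnostic-Bayes factorization, posterior proportional to a summary-based prior times the full-data likelihood, can be sketched on a toy one-parameter model; the linear model, noise level, summary statistic, and kernel width below are all illustrative assumptions.

```python
import math, random

rng = random.Random(0)
t = [i / 10 for i in range(1, 21)]           # observation times
x_true = 1.5                                 # parameter to recover
data = [x_true * ti + rng.gauss(0.0, 0.1) for ti in t]

def model(x):
    return [x * ti for ti in t]

def summary(y):                              # S(D): one summary metric (the mean)
    return sum(y) / len(y)

def log_like(x):                             # L(x|D): Gaussian residual likelihood
    return -sum((d - m) ** 2 for d, m in zip(data, model(x))) / (2 * 0.1 ** 2)

def log_summary_prior(x, eps=0.05):          # P(x|S(D)) via an ABC-style kernel
    return -((summary(model(x)) - summary(data)) ** 2) / (2 * eps ** 2)

# Diagnostic Bayes on a grid: posterior ∝ summary-based prior × full-data likelihood
grid = [0.5 + 0.01 * i for i in range(200)]
log_post = [log_summary_prior(x) + log_like(x) for x in grid]
x_map = grid[log_post.index(max(log_post))]
print(round(x_map, 2))
```

In a diagnostic setting a mismatch between the summary-based prior and the full-data likelihood would flag the model component the summary statistic was designed to probe.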

  7. Stochastic approach for an unbiased estimation of the probability of a successful separation in conventional chromatography and sequential elution liquid chromatography.

    PubMed

    Ennis, Erin J; Foley, Joe P

    2016-07-15

    A stochastic approach was utilized to estimate the probability of a successful isocratic or gradient separation in conventional chromatography for numbers of sample components, peak capacities, and saturation factors ranging from 2 to 30, 20-300, and 0.017-1, respectively. The stochastic probabilities were obtained under conditions of (i) constant peak width ("gradient" conditions) and (ii) peak width increasing linearly with time ("isocratic/constant N" conditions). The isocratic and gradient probabilities obtained stochastically were compared with the probabilities predicted by Martin et al. [Anal. Chem., 58 (1986) 2200-2207] and Davis and Stoll [J. Chromatogr. A, (2014) 128-142]; for a given number of components and peak capacity the same trend is always observed: probability obtained with the isocratic stochastic approach
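The stochastic estimate of a successful separation can be sketched as a Monte Carlo point-process calculation: component retention positions fall uniformly on the normalized axis, and a run succeeds only if every adjacent pair is at least one peak-width apart. The component counts, peak capacity, and resolution criterion below are illustrative assumptions.

```python
import random

def p_success(m, peak_capacity, trials=20000, rng=None):
    """Monte Carlo probability that m randomly placed peaks are all resolved,
    i.e. every adjacent pair is at least one peak-width (1/peak_capacity) apart."""
    rng = rng or random.Random(0)
    width = 1.0 / peak_capacity
    hits = 0
    for _ in range(trials):
        pos = sorted(rng.random() for _ in range(m))
        if all(b - a >= width for a, b in zip(pos, pos[1:])):
            hits += 1
    return hits / trials

print(round(p_success(5, 50), 3))   # modest component count, moderate capacity
print(round(p_success(20, 50), 3))  # crowding: success probability collapses
```

The sharp drop with component count illustrates why the probability of a complete separation is low even when the peak capacity far exceeds the number of components.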

  8. Rooting phylogenetic trees under the coalescent model using site pattern probabilities.

    PubMed

    Tian, Yuan; Kubatko, Laura

    2017-12-19

    Phylogenetic tree inference is a fundamental tool to estimate ancestor-descendant relationships among different species. In phylogenetic studies, identification of the root - the most recent common ancestor of all sampled organisms - is essential for a complete understanding of the evolutionary relationships. Rooted trees benefit most downstream applications of phylogenies, such as species classification or the study of adaptation. Often, trees can be rooted by using outgroups, which are species that are known to be more distantly related to the sampled organisms than any other species in the phylogeny. However, outgroups are not always available in evolutionary research. In this study, we develop a new method for rooting species trees under the coalescent model, by developing a series of hypothesis tests for rooting quartet phylogenies using site pattern probabilities. The power of this method is examined by simulation studies and by application to an empirical North American rattlesnake data set. The method shows high accuracy across the simulation conditions considered, and performs well for the rattlesnake data. Thus, it provides a computationally efficient way to accurately root species-level phylogenies that incorporates the coalescent process. The method is robust to variation in substitution model, but is sensitive to the assumption of a molecular clock. Our study establishes a computationally practical method for rooting species trees that is more efficient than traditional methods. The method will benefit numerous evolutionary studies that require rooting a phylogenetic tree without having to specify outgroups.

  9. The impact of nectar chemical features on phenotypic variation in two related nectar yeasts.

    PubMed

    Pozo, María I; Herrera, Carlos M; Van den Ende, Wim; Verstrepen, Kevin; Lievens, Bart; Jacquemyn, Hans

    2015-06-01

    Floral nectars become easily colonized by microbes, most often species of the ascomycetous yeast genus Metschnikowia. Although it is known that nectar composition can vary tremendously among plant species, most probably corresponding to the nutritional requirements of their main pollinators, far less is known about how variation in nectar chemistry affects intraspecific variation in nectarivorous yeasts. Because variation in nectar traits probably affects growth and abundance of nectar yeasts, nectar yeasts can be expected to display large phenotypic variation in order to cope with varying nectar conditions. To test this hypothesis, we related variation in the phenotypic landscape of a vast collection of nectar-living yeast isolates from two Metschnikowia species (M. reukaufii and M. gruessii) to nectar chemical traits using non-linear redundancy analyses. Nectar yeasts were collected from 19 plant species from different plant families to include as much variation in nectar chemical traits as possible. As expected, nectar yeasts displayed large variation in phenotypic traits, particularly in traits related to growth performance in carbon sources and inhibitors, which was significantly related to the host plant from which they were isolated. Total sugar concentration and relative fructose content significantly explained the observed variation in the phenotypic profile of the investigated yeast species, indicating that sugar concentration and composition are the key traits that affect phenotypic variation in nectarivorous yeasts. © FEMS 2015. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  10. Analytical performance evaluation of SAR ATR with inaccurate or estimated models

    NASA Astrophysics Data System (ADS)

    DeVore, Michael D.

    2004-09-01

    Hypothesis testing algorithms for automatic target recognition (ATR) are often formulated in terms of some assumed distribution family. The parameter values corresponding to a particular target class together with the distribution family constitute a model for the target's signature. In practice such models exhibit inaccuracy because of incorrect assumptions about the distribution family and/or because of errors in the assumed parameter values, which are often determined experimentally. Model inaccuracy can have a significant impact on performance predictions for target recognition systems. Such inaccuracy often causes model-based predictions that ignore the difference between assumed and actual distributions to be overly optimistic. This paper reports on research to quantify the effect of inaccurate models on performance prediction and to estimate the effect using only trained parameters. We demonstrate that for large observation vectors the class-conditional probabilities of error can be expressed as a simple function of the difference between two relative entropies. These relative entropies quantify the discrepancies between the actual and assumed distributions and can be used to express the difference between actual and predicted error rates. Focusing on the problem of ATR from synthetic aperture radar (SAR) imagery, we present estimators of the probabilities of error in both ideal and plug-in tests expressed in terms of the trained model parameters. These estimators are defined in terms of unbiased estimates for the first two moments of the sample statistic. We present an analytical treatment of these results and include demonstrations from simulated radar data.
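The role of the two relative entropies can be illustrated with a minimal sketch for 1-D Gaussian class models; the closed-form KL divergence is standard, but all parameter values below are invented:

```python
import numpy as np

# Sketch of the two relative entropies that control the error rates, for
# 1-D Gaussian class models; all parameter values here are invented.
def kl_gauss(mu0, s0, mu1, s1):
    """KL( N(mu0, s0^2) || N(mu1, s1^2) ) in nats (closed form)."""
    return np.log(s1 / s0) + (s0**2 + (mu0 - mu1) ** 2) / (2 * s1**2) - 0.5

# actual class distribution vs. its (slightly inaccurate) trained model
kl_own = kl_gauss(0.0, 1.0, 0.1, 1.1)
# actual class distribution vs. the trained model of the competing class
kl_other = kl_gauss(0.0, 1.0, 2.0, 1.0)

# A large positive gap means the wrong-class model explains the data far
# worse, so the class-conditional error probability shrinks as the
# observation vector grows (a Stein-like exponent).
gap = kl_other - kl_own
```

The nonzero kl_own term is what separates the actual error rate from the overly optimistic prediction made under the assumed model.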

  11. Orbital apocenter is not a sufficient condition for HST/STIS detection of Europa's water vapor aurora.

    PubMed

    Roth, Lorenz; Retherford, Kurt D; Saur, Joachim; Strobel, Darrell F; Feldman, Paul D; McGrath, Melissa A; Nimmo, Francis

    2014-12-02

    We report far-ultraviolet observations of Jupiter's moon Europa taken by Space Telescope Imaging Spectrograph (STIS) of the Hubble Space Telescope (HST) in January and February 2014 to test the hypothesis that the discovery of a water vapor aurora in December 2012 by local hydrogen (H) and oxygen (O) emissions with the STIS originated from plume activity possibly correlated with Europa's distance from Jupiter through tidal stress variations. The 2014 observations were scheduled with Europa near the apocenter similar to the orbital position of its previous detection. Tensile stresses on south polar fractures are expected to be highest in this orbital phase, potentially maximizing the probability for plume activity. No local H and O emissions were detected in the new STIS images. In the south polar region where the emission surpluses were observed in 2012, the brightnesses are sufficiently low in the 2014 images to be consistent with any H2O abundance from (0-5)×10^15 cm^-2. Large high-latitude plumes should have been detectable by the STIS, independent of the observing conditions and geometry. Because electron excitation of water vapor remains the only viable explanation for the 2012 detection, the new observations indicate that although the same orbital position of Europa for plume activity may be a necessary condition, it is not a sufficient condition. However, the December 2012 detection of coincident HI Lyman-α and OI 1304-Å emission surpluses in an ∼200-km high region well separated above Europa's limb is a firm result and not invalidated by our 2014 STIS observations.

  12. Orbital apocenter is not a sufficient condition for HST/STIS detection of Europa's water vapor aurora

    NASA Astrophysics Data System (ADS)

    Roth, Lorenz; Retherford, Kurt D.; Saur, Joachim; Strobel, Darrell F.; Feldman, Paul D.; McGrath, Melissa A.; Nimmo, Francis

    2014-12-01

    We report far-ultraviolet observations of Jupiter's moon Europa taken by Space Telescope Imaging Spectrograph (STIS) of the Hubble Space Telescope (HST) in January and February 2014 to test the hypothesis that the discovery of a water vapor aurora in December 2012 by local hydrogen (H) and oxygen (O) emissions with the STIS originated from plume activity possibly correlated with Europa's distance from Jupiter through tidal stress variations. The 2014 observations were scheduled with Europa near the apocenter similar to the orbital position of its previous detection. Tensile stresses on south polar fractures are expected to be highest in this orbital phase, potentially maximizing the probability for plume activity. No local H and O emissions were detected in the new STIS images. In the south polar region where the emission surpluses were observed in 2012, the brightnesses are sufficiently low in the 2014 images to be consistent with any H2O abundance from (0-5)×10^15 cm^-2. Large high-latitude plumes should have been detectable by the STIS, independent of the observing conditions and geometry. Because electron excitation of water vapor remains the only viable explanation for the 2012 detection, the new observations indicate that although the same orbital position of Europa for plume activity may be a necessary condition, it is not a sufficient condition. However, the December 2012 detection of coincident HI Lyman-α and OI 1304-Å emission surpluses in an ∼200-km high region well separated above Europa's limb is a firm result and not invalidated by our 2014 STIS observations.

  13. Earthquake likelihood model testing

    USGS Publications Warehouse

    Schorlemmer, D.; Gerstenberger, M.C.; Wiemer, S.; Jackson, D.D.; Rhoades, D.A.

    2007-01-01

    INTRODUCTION The Regional Earthquake Likelihood Models (RELM) project aims to produce and evaluate alternate models of earthquake potential (probability per unit volume, magnitude, and time) for California. Based on differing assumptions, these models are produced to test the validity of their assumptions and to explore which models should be incorporated in seismic hazard and risk evaluation. Tests based on physical and geological criteria are useful but we focus on statistical methods using future earthquake catalog data only. We envision two evaluations: a test of consistency with observed data and a comparison of all pairs of models for relative consistency. Both tests are based on the likelihood method, and both are fully prospective (i.e., the models are not adjusted to fit the test data). To be tested, each model must assign a probability to any possible event within a specified region of space, time, and magnitude. For our tests the models must use a common format: earthquake rates in specified “bins” with location, magnitude, time, and focal mechanism limits. Seismology cannot yet deterministically predict individual earthquakes; however, it should seek the best possible models for forecasting earthquake occurrence. This paper describes the statistical rules of an experiment to examine and test earthquake forecasts. The primary purposes of the tests described below are to evaluate physical models for earthquakes, assure that source models used in seismic hazard and risk studies are consistent with earthquake data, and provide quantitative measures by which models can be assigned weights in a consensus model or be judged as suitable for particular regions. In this paper we develop a statistical method for testing earthquake likelihood models.
A companion paper (Schorlemmer and Gerstenberger 2007, this issue) discusses the actual implementation of these tests in the framework of the RELM initiative. Statistical testing of hypotheses is a common task and a wide range of possible testing procedures exist. Jolliffe and Stephenson (2003) present different forecast verifications from atmospheric science, among them likelihood testing of probability forecasts and testing the occurrence of binary events. Testing binary events requires that for each forecasted event, the spatial, temporal and magnitude limits be given. Although major earthquakes can be considered binary events, the models within the RELM project express their forecasts on a spatial grid and in 0.1 magnitude units; thus the results are a distribution of rates over space and magnitude. These forecasts can be tested with likelihood tests. In general, likelihood tests assume a valid null hypothesis against which a given hypothesis is tested. The outcome is either a rejection of the null hypothesis in favor of the test hypothesis or a nonrejection, meaning the test hypothesis cannot outperform the null hypothesis at a given significance level. Within RELM, there is no accepted null hypothesis and thus the likelihood test needs to be expanded to allow comparable testing of equipollent hypotheses. To test models against one another, we require that forecasts are expressed in a standard format: the average rate of earthquake occurrence within pre-specified limits of hypocentral latitude, longitude, depth, magnitude, time period, and focal mechanisms. Focal mechanisms should either be described as the inclination of P-axis, declination of P-axis, and inclination of the T-axis, or as strike, dip, and rake angles. Schorlemmer and Gerstenberger (2007, this issue) designed classes of these parameters such that similar models will be tested against each other. These classes make the forecasts comparable between models.
Additionally, we are limited to testing only what is precisely defined and consistently reported in earthquake catalogs. Therefore it is currently not possible to test such information as fault rupture length or area, asperity location, etc. Also, to account for data quality issues, we allow for location and magnitude uncertainties as well as the probability that an event is dependent on another event. As we mentioned above, only models with comparable forecasts can be tested against each other. Our current tests are designed to examine grid-based models. This requires that any fault-based model be adapted to a grid before testing is possible. While this is a limitation of the testing, it is an inherent difficulty in any such comparative testing. Please refer to appendix B for a statistical evaluation of the application of the Poisson hypothesis to fault-based models. The testing suite we present consists of three different tests: L-Test, N-Test, and R-Test. These tests are defined similarly to those of Kagan and Jackson (1995). The first two tests examine the consistency of the hypotheses with the observations while the last test compares the spatial performances of the models.
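A consistency check in the spirit of the N-Test can be sketched as follows, assuming independent Poisson rates per forecast bin; the grid size, rates, and observed count are invented for illustration:

```python
import numpy as np

# Sketch of an N-test-style consistency check: is the observed total number
# of events plausible under a grid of forecast Poisson rates? The grid,
# rates, and observed count are invented for illustration.
rng = np.random.default_rng(2)
forecast_rates = np.full(100, 0.05)   # expected events per space-magnitude bin
n_expected = forecast_rates.sum()     # total forecast rate (5.0)
n_observed = 7                        # events in the test catalog

# Simulate catalogs from the forecast; two-sided tail probability of the
# observed total under the model.
sims = rng.poisson(forecast_rates, size=(50_000, forecast_rates.size)).sum(axis=1)
p_value = 2 * min((sims >= n_observed).mean(), (sims <= n_observed).mean())
# a tiny p_value would mean the forecast's total rate is inconsistent
# with the observed catalog
```

The L-Test and R-Test work analogously but on the joint log-likelihood of the binned counts rather than on the total number alone.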

  14. Directional effect in double conditionals with a construction task: The semantic hypothesis.

    PubMed

    Espino, Orlando; Morales, Tarek; Bolaños-Medina, Alicia

    2017-09-01

    The goal of this paper is to test the main predictions of the semantic hypothesis about the directional effect in double conditionals (such as, 'A only if B/only if C, B') with a construction task. The semantic hypothesis claims that directional effect can be explained by the inherent directionality of the relation between the relatum and the target object of the premises. According to this hypothesis, a directional effect should occur if only one of the end-terms of the premises takes the role of relatum: a) if the end-term that plays the role of relatum is in the first premise, a forward directional effect is predicted (from A to C); and b) if the end-term that plays the role of relatum is in the second premise, a backward directional effect is predicted (from C to A). On the other hand, it claims that there should be no directional effect when both end-terms take the role of relatum or when neither of the end-terms plays the role of relatum. Three experiments confirmed the main predictions of the semantic hypothesis in a construction task. Copyright © 2017 Elsevier B.V. All rights reserved.

  15. Ecological, Evolutionary and Social Constraints on Reproductive Effort: Are Hoary Marmots Really Biennial Breeders?

    PubMed Central

    Patil, Vijay P.; Karels, Timothy J.; Hik, David S.

    2015-01-01

    Biennial breeding is a rare life-history trait observed in animal species living in harsh, unproductive environments. This reproductive pattern is thought to occur in 10 of 14 species in the genus Marmota, making marmots useful model organisms for studying its ecological and evolutionary implications. Biennial breeding in marmots has been described as an obligate pattern which evolved as a mechanism to mitigate the energetic costs of reproduction (Evolved Constraint hypothesis). However, recent anecdotal evidence suggests that it is a facultative pattern controlled by annual variation in climate and food availability (Environmental Constraint hypothesis). Finally, in social animals like marmots, biennial breeding could result from reproductive competition between females within social groups (Social Constraint hypothesis). We evaluated these three hypotheses using mark-recapture data from an 8-year study of hoary marmot (Marmota caligata) population dynamics in the Yukon. Annual variation in breeding probability was modeled using multi-state mark-recapture models, while other reproductive life-history traits were modeled with generalized linear mixed models. Hoary marmots were neither obligate nor facultative biennial breeders, and breeding probability was insensitive to evolved, environmental, or social factors. However, newly mature females were significantly less likely to breed than older individuals. Annual breeding did not result in increased mortality. Female survival and, to a lesser extent, average fecundity were correlated with winter climate, as indexed by the Pacific Decadal Oscillation. Hoary marmots are less conservative breeders than previously believed, and the evidence for biennial breeding throughout Marmota, and in other arctic/alpine/antarctic animals, should be re-examined. 
Prediction of future population dynamics requires an accurate understanding of life history strategies, and of how life history traits allow animals to cope with changes in weather and other demographic influences. PMID:25768300

  16. Prior cocaine exposure disrupts extinction of fear conditioning

    PubMed Central

    Burke, Kathryn A.; Franz, Theresa M.; Gugsa, Nishan; Schoenbaum, Geoffrey

    2008-01-01

    Psychostimulant exposure has been shown to cause molecular and cellular changes in prefrontal cortex. It has been hypothesized that these drug-induced changes might affect the operation of prefrontal-limbic circuits, disrupting their normal role in controlling behavior and thereby leading to compulsive drug-seeking. To test this hypothesis, we tested cocaine-treated rats in a fear conditioning, inflation, and extinction task, known to depend on medial prefrontal cortex and amygdala. Cocaine-treated rats conditioned and inflated similar to saline controls but displayed slower extinction learning. These results support the hypothesis that control processes in the medial prefrontal cortex are impaired by cocaine exposure. PMID:16847305

  17. Prior cocaine exposure disrupts extinction of fear conditioning.

    PubMed

    Burke, Kathryn A; Franz, Theresa M; Gugsa, Nishan; Schoenbaum, Geoffrey

    2006-01-01

    Psychostimulant exposure has been shown to cause molecular and cellular changes in prefrontal cortex. It has been hypothesized that these drug-induced changes might affect the operation of prefrontal-limbic circuits, disrupting their normal role in controlling behavior and thereby leading to compulsive drug-seeking. To test this hypothesis, we tested cocaine-treated rats in a fear conditioning, inflation, and extinction task, known to depend on medial prefrontal cortex and amygdala. Cocaine-treated rats conditioned and inflated similar to saline controls but displayed slower extinction learning. These results support the hypothesis that control processes in the medial prefrontal cortex are impaired by cocaine exposure.

  18. Height probabilities in the Abelian sandpile model on the generalized finite Bethe lattice

    NASA Astrophysics Data System (ADS)

    Chen, Haiyan; Zhang, Fuji

    2013-08-01

    In this paper, we study the sandpile model on the generalized finite Bethe lattice with a particular boundary condition. Using a combinatorial method, we give the exact expressions for all single-site probabilities and some two-site joint probabilities. As a by-product, we prove that the height probabilities of bulk vertices are all the same for the Bethe lattice with certain given boundary condition, which was found from numerical evidence by Grassberger and Manna ["Some more sandpiles," J. Phys. (France) 51, 1077-1098 (1990)], 10.1051/jphys:0199000510110107700 but without a proof.

  19. Multiple Hypothesis Tracking (MHT) for Space Surveillance: Results and Simulation Studies

    DTIC Science & Technology

    2013-09-01

    1. INTRODUCTION The Joint Space Operations Center (JSpOC) currently tracks more than 22,000 satellites and space debris orbiting the Earth [1, 2]. With the anticipated installation of more accurate sensors and the increased probability of future collisions between space objects, the ...

  20. About the cumulants of periodic signals

    NASA Astrophysics Data System (ADS)

    Barrau, Axel; El Badaoui, Mohammed

    2018-01-01

    This note studies the cumulants of time series. These functions originate in probability theory but are commonly used as features of deterministic signals, so we examine their classical properties in this modified framework. We show that the additivity of cumulants, which is ensured in the case of independent random variables, requires a different hypothesis here. Practical applications are proposed, in particular an analysis of the failure of the JADE algorithm to separate some specific periodic signals.
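Cumulant additivity under independence, the classical property the note starts from, can be checked exactly on a toy example; the dice distributions below are our own illustration, not from the note:

```python
import numpy as np

# Numerical check of cumulant additivity for *independent* random
# variables: kappa_n(X+Y) = kappa_n(X) + kappa_n(Y) for the first three
# cumulants, using two fair dice as a toy example.
def first_three_cumulants(values, probs):
    m1 = np.sum(values * probs)
    m2 = np.sum(values**2 * probs)
    m3 = np.sum(values**3 * probs)
    # kappa1 = m1, kappa2 = variance, kappa3 = third central moment
    return m1, m2 - m1**2, m3 - 3 * m1 * m2 + 2 * m1**3

v = np.arange(1, 7, dtype=float)      # faces of one fair die
p = np.full(6, 1 / 6)

# distribution of the sum of two independent dice (discrete convolution)
p_sum = np.convolve(p, p)
v_sum = np.arange(2, 13, dtype=float)

k_single = first_three_cumulants(v, p)
k_sum = first_three_cumulants(v_sum, p_sum)
additive = np.allclose(k_sum, [2 * k for k in k_single])   # True: independent
```

For deterministic periodic signals, time averages replace the expectations above and independence no longer guarantees this additivity, which is the note's point of departure.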

  1. On 3D flow-structures behind an inclined plate

    NASA Astrophysics Data System (ADS)

    Uruba, Václav; Pavlík, David; Procházka, Pavel; Skála, Vladislav; Kopecký, Václav

    Stereo PIV measurements have been performed behind an inclined plate at angles of attack of 5 and 10 deg. The occurrence and dynamics of streamwise structures behind the plate trailing edge have been studied in detail using the POD method. The streamwise structures are represented by vortices and low- and high-velocity regions, probably streaks. The obtained results support the hypothesis of an airfoil-flow force interaction by Hoffman and Johnson [1,2].

  2. Second-order asymptotics for quantum hypothesis testing in settings beyond i.i.d.—quantum lattice systems and more

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Datta, Nilanjana; Rouzé, Cambyse; Pautrat, Yan

    2016-06-15

    Quantum Stein’s lemma is a cornerstone of quantum statistics and concerns the problem of correctly identifying a quantum state, given the knowledge that it is one of two specific states (ρ or σ). It was originally derived in the asymptotic i.i.d. setting, in which arbitrarily many (say, n) identical copies of the state (ρ^⊗n or σ^⊗n) are considered to be available. In this setting, the lemma states that, for any given upper bound on the probability α_n of erroneously inferring the state to be σ, the probability β_n of erroneously inferring the state to be ρ decays exponentially in n, with the rate of decay converging to the relative entropy of the two states. The second-order asymptotics for quantum hypothesis testing, which establishes the speed of convergence of this rate of decay to its limiting value, was derived in the i.i.d. setting independently by Tomamichel and Hayashi, and Li. We extend this result to settings beyond i.i.d. Examples of these include Gibbs states of quantum spin systems (with finite-range, translation-invariant interactions) at high temperatures, and quasi-free states of fermionic lattice gases.
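The decay rate in Stein's lemma is the quantum relative entropy D(ρ||σ) = Tr[ρ(log ρ − log σ)], which for commuting (diagonal) states reduces to the classical KL divergence; a short sketch can verify this numerically. The qubit states below are invented:

```python
import numpy as np

# Sketch: the Stein exponent is the quantum relative entropy
# D(rho||sigma) = Tr[rho (log rho - log sigma)]; toy qubit states.
def logm_hermitian(a):
    """Matrix logarithm of a positive-definite Hermitian matrix."""
    w, v = np.linalg.eigh(a)
    return (v * np.log(w)) @ v.conj().T   # v diag(log w) v^dagger

def rel_entropy(rho, sigma):
    return float(np.real(np.trace(rho @ (logm_hermitian(rho)
                                         - logm_hermitian(sigma)))))

rho = np.diag([0.9, 0.1])
sigma = np.diag([0.5, 0.5])
d = rel_entropy(rho, sigma)   # commuting states: equals the classical KL
```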

  3. An immune reaction may be necessary for cancer development.

    PubMed

    Prehn, Richmond T

    2006-02-03

    The hypothesis of immunosurveillance suggests that new neoplasms arise very frequently, but most are destroyed almost at their inception by an immune response. Its correctness has been debated for many years. In its support, it has been shown that the incidences of many tumor types, though apparently not all, tend to be increased in immunodeficient animals or humans, but this observation does not end the debate. There is an alternative to the surveillance hypothesis; numerous studies have shown that the effect of an immune reaction on a tumor is biphasic. For each tumor, there is some quantitatively low level of immune reaction that, relative to no reaction, is facilitating, perhaps even necessary for the tumor's growth in vivo. The optimum level of this facilitating reaction may often be less than the level of immunity that the tumor might engender in a normal subject. The failure of a tumor to grow as well in the normal as it does in the immunosuppressed host is probably not caused by a lack of tumor-cell killing in the suppressed host. Instead, the higher level of immune response in a normal animal, even if it does not rise to tumor-inhibitory levels, probably gives less positive support to tumor growth. This seems more than a semantic distinction.

  4. An immune reaction may be necessary for cancer development

    PubMed Central

    Prehn, Richmond T

    2006-01-01

    Background The hypothesis of immunosurveillance suggests that new neoplasms arise very frequently, but most are destroyed almost at their inception by an immune response. Its correctness has been debated for many years. In its support, it has been shown that the incidences of many tumor types, though apparently not all, tend to be increased in immunodeficient animals or humans, but this observation does not end the debate. Alternative model There is an alternative to the surveillance hypothesis; numerous studies have shown that the effect of an immune reaction on a tumor is biphasic. For each tumor, there is some quantitatively low level of immune reaction that, relative to no reaction, is facilitating, perhaps even necessary for the tumor's growth in vivo. The optimum level of this facilitating reaction may often be less than the level of immunity that the tumor might engender in a normal subject. Conclusion The failure of a tumor to grow as well in the normal as it does in the immunosuppressed host is probably not caused by a lack of tumor-cell killing in the suppressed host. Instead, the higher level of immune response in a normal animal, even if it does not rise to tumor-inhibitory levels, probably gives less positive support to tumor growth. This seems more than a semantic distinction. PMID:16457723

  5. Sunspot random walk and 22-year variation

    USGS Publications Warehouse

    Love, Jeffrey J.; Rigler, E. Joshua

    2012-01-01

    We examine two stochastic models for consistency with observed long-term secular trends in sunspot number and a faint, but semi-persistent, 22-yr signal: (1) a null hypothesis, a simple one-parameter random-walk model of sunspot-number cycle-to-cycle change, and, (2) an alternative hypothesis, a two-parameter random-walk model with an imposed 22-yr alternating amplitude. The observed secular trend in sunspots, seen from solar cycle 5 to 23, would not be an unlikely result of the accumulation of multiple random-walk steps. Statistical tests show that a 22-yr signal can be resolved in historical sunspot data; that is, the probability is low that it would be realized from random data. On the other hand, the 22-yr signal has a small amplitude compared to random variation, and so it has a relatively small effect on sunspot predictions. Many published predictions for cycle 24 sunspots fall within the dispersion of previous cycle-to-cycle sunspot differences. The probability is low that the Sun will, with the accumulation of random steps over the next few cycles, walk down to a Dalton-like minimum. Our models support published interpretations of sunspot secular variation and 22-yr variation resulting from cycle-to-cycle accumulation of dynamo-generated magnetic energy.
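The null hypothesis, a one-parameter random walk of cycle-to-cycle amplitude change, can be sketched as follows; the starting amplitude, step size, and "Dalton-like" threshold are illustrative guesses, not the fitted values from the paper:

```python
import numpy as np

# Sketch of the null model: cycle amplitude as a one-parameter random walk,
# A_{k+1} = A_k + eps_k with eps_k ~ N(0, step^2). Starting amplitude, step
# size, and "Dalton-like" threshold are illustrative, not fitted values.
rng = np.random.default_rng(3)
start, step, n_cycles, n_paths = 120.0, 25.0, 4, 50_000

paths = start + np.cumsum(rng.normal(0.0, step, size=(n_paths, n_cycles)), axis=1)
# fraction of walks dropping to a "Dalton-like" minimum within a few cycles
p_dalton = (paths.min(axis=1) < 50.0).mean()   # small, as the abstract notes
```

The alternative hypothesis would add a deterministic 22-yr alternating term to each step before accumulating.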

  6. The socioeconomic health gradient across the life cycle: what role for selective mortality and institutionalization?

    PubMed Central

    Baeten, Steef; Van Ourti, Tom; van Doorslaer, Eddy

    2013-01-01

    Several studies have documented the now fairly stylized fact that health inequalities by income differ across the age distribution: in cross-sections the health gap between rich and poor tends to widen until about age 50 and then declines at higher ages. It has been suggested that selective mortality and institutionalization could be important factors driving the convergence at higher ages. We use eight waves of a health survey linked to four registries (on mortality, hospitalizations, (municipal) residence status and taxable incomes) to test this hypothesis. We construct life cycle profiles of health for birth year/gender/income groups from the health surveys (based on 128,689 observations) and exploit the registries to obtain precise estimates of individual probabilities of mortality and institutionalization using a seven year observation period for 2,521,122 individuals. We generate selection corrected health profiles using an inverse probability weighting procedure and find that attrition is indeed not random: older, poorer and unhealthier individuals are significantly more likely not to survive the next year and to be admitted to an institution. While these selection effects are very significant, they are not very large. We therefore reject the hypothesis that selective dropout is an important determinant of the differential health trajectories by income over the life course in the Netherlands. PMID:24161090
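The inverse probability weighting correction can be illustrated with a synthetic sketch in which dropout depends on health by construction; unlike in the study, the selection probabilities here are known exactly rather than estimated from registry data:

```python
import numpy as np

# Sketch of inverse probability weighting: unhealthy individuals drop out
# more often, biasing the naive mean of observed health upward; weighting
# survivors by 1/P(observed) restores the population mean. The selection
# model is known by construction here, not estimated from registries.
rng = np.random.default_rng(4)
n = 200_000
health = rng.normal(0.0, 1.0, size=n)                 # population mean is 0
p_obs = 1.0 / (1.0 + np.exp(-(0.5 + 1.5 * health)))   # healthier -> likelier observed
observed = rng.random(n) < p_obs

naive_mean = health[observed].mean()                  # biased upward
ipw_mean = np.average(health[observed],
                      weights=1.0 / p_obs[observed])  # close to 0 again
```

In practice the weights come from a fitted dropout model (e.g. a logistic regression of survival/institutionalization on age, income, and health), which is where the registry linkage enters.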

  7. Bayesian adaptive phase II screening design for combination trials

    PubMed Central

    Cai, Chunyan; Yuan, Ying; Johnson, Valen E

    2013-01-01

    Background Trials of combination therapies for the treatment of cancer are playing an increasingly important role in the battle against this disease. To more efficiently handle the large number of combination therapies that must be tested, we propose a novel Bayesian phase II adaptive screening design to simultaneously select among possible treatment combinations involving multiple agents. Methods Our design is based on formulating the selection procedure as a Bayesian hypothesis testing problem in which the superiority of each treatment combination is equated to a single hypothesis. During the trial conduct, we use the current values of the posterior probabilities of all hypotheses to adaptively allocate patients to treatment combinations. Results Simulation studies show that the proposed design substantially outperforms the conventional multiarm balanced factorial trial design. The proposed design yields a significantly higher probability for selecting the best treatment while allocating substantially more patients to efficacious treatments. Limitations The proposed design is most appropriate for the trials combining multiple agents and screening out the efficacious combination to be further investigated. Conclusions The proposed Bayesian adaptive phase II screening design substantially outperformed the conventional complete factorial design. Our design allocates more patients to better treatments while providing higher power to identify the best treatment at the end of the trial. PMID:23359875

  8. The Universal Plausibility Metric (UPM) & Principle (UPP).

    PubMed

    Abel, David L

    2009-12-03

    Mere possibility is not an adequate basis for asserting scientific plausibility. A precisely defined universal bound is needed beyond which the assertion of plausibility, particularly in life-origin models, can be considered operationally falsified. But can something so seemingly relative and subjective as plausibility ever be quantified? Amazingly, the answer is, "Yes." A method of objectively measuring the plausibility of any chance hypothesis (The Universal Plausibility Metric [UPM]) is presented. A numerical inequality is also provided whereby any chance hypothesis can be definitively falsified when its UPM metric ξ is < 1 (The Universal Plausibility Principle [UPP]). Both UPM and UPP pre-exist and are independent of any experimental design and data set. No low-probability hypothetical plausibility assertion should survive peer-review without subjection to the UPP inequality standard of formal falsification (ξ < 1).

  9. Calculating ellipse area by the Monte Carlo method and analysing dice poker with Excel at high school

    NASA Astrophysics Data System (ADS)

    Benacka, Jan

    2016-08-01

    This paper reports on lessons in which 18-19 year old high school students modelled random processes with Excel. In the first lesson, 26 students formulated a hypothesis on the area of an ellipse by using the analogy between the areas of a circle, a square and a rectangle. They verified the hypothesis by the Monte Carlo method with a spreadsheet model developed in the lesson. In the second lesson, 27 students analysed the dice poker game. First, they calculated the probabilities of the hands by combinatorial formulae. Then, they verified the result with a spreadsheet model developed in the lesson. The students were given a questionnaire to find out if they found the lessons interesting and contributing to their mathematical and technological knowledge.
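The first lesson's verification translates directly from a spreadsheet into code: sample uniform points in the ellipse's bounding rectangle and scale the hit fraction by the rectangle's area. A minimal sketch (the students' spreadsheet version would use RAND() cells instead):

```python
import random

def mc_ellipse_area(a, b, samples=100_000, rng=None):
    """Estimate the area of an ellipse with semi-axes a and b by
    sampling uniform points in its 2a x 2b bounding rectangle and
    counting the fraction satisfying (x/a)^2 + (y/b)^2 <= 1."""
    rng = rng or random.Random(42)
    inside = 0
    for _ in range(samples):
        x = rng.uniform(-a, a)
        y = rng.uniform(-b, b)
        if (x / a) ** 2 + (y / b) ** 2 <= 1:
            inside += 1
    return 4 * a * b * inside / samples

# The students' circle-to-ellipse analogy predicts area = pi * a * b;
# for a = 3, b = 2 that is about 18.85.
estimate = mc_ellipse_area(3.0, 2.0)
```

With 100,000 samples the estimate typically lands within a fraction of a percent of pi*a*b, which is the agreement the students would observe in the spreadsheet model.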

  10. Emotional Sentence Annotation Helps Predict Fiction Genre

    PubMed Central

    Samothrakis, Spyridon; Fasli, Maria

    2015-01-01

    Fiction, a prime form of entertainment, has evolved into multiple genres, which one can broadly attribute to different forms of stories. In this paper, we examine the hypothesis that works of fiction can be characterised by the emotions they portray. To investigate this hypothesis, we use works of fiction from Project Gutenberg and attribute basic emotional content to each individual sentence using Ekman’s model. A time-smoothed version of the emotional content for each basic emotion is used to train extremely randomized trees. We show through 10-fold cross-validation that the emotional content of each work of fiction can help identify its genre with significantly higher probability than random. We also show that the most important differentiator between genre novels is fear. PMID:26524352
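The feature construction hinges on turning a noisy per-sentence emotion signal into a smooth trajectory. The abstract does not specify the smoothing used, so the sketch below stands in with a simple centered moving average over a binary per-sentence "fear" indicator; the function names are illustrative only.

```python
def smooth(scores, window=5):
    """Centered moving average of a per-sentence emotion score.
    A simple stand-in for the paper's time-smoothing step, whose
    exact form is not given in the abstract."""
    n = len(scores)
    half = window // 2
    out = []
    for i in range(n):
        lo, hi = max(0, i - half), min(n, i + half + 1)
        out.append(sum(scores[lo:hi]) / (hi - lo))
    return out

# Binary "fear" indicator per sentence, smoothed into a trajectory
# that a tree ensemble could consume as a per-novel feature vector.
fear = [0, 0, 1, 1, 0, 0, 0, 1, 0, 0]
trajectory = smooth(fear, window=3)
```

Smoothed trajectories like this, one per basic emotion, are the kind of fixed-length input a classifier such as extremely randomized trees can be trained on.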

  11. Self-organized network of fractal-shaped components coupled through statistical interaction.

    PubMed

    Ugajin, R

    2001-09-01

    A dissipative dynamics is introduced to generate self-organized networks of interacting objects, which we call coupled-fractal networks. The growth model is constructed based on a growth hypothesis in which the growth rate of each object is the product of the probability of receiving source materials from far away and the probability of receiving adhesives from other grown objects; each object grows into a random fractal if isolated, but connects with others if glued. The network is governed by the statistical interaction between fractal-shaped components, which can only be identified in a statistical manner over ensembles. This interaction is investigated using the degree of correlation between fractal-shaped components, enabling us to determine whether it is attractive or repulsive.

  12. Revisiting the Role of Bad News in Maintaining Human Observing Behavior

    ERIC Educational Resources Information Center

    Fantino, Edmund; Silberberg, Alan

    2010-01-01

    Results from studies of observing responses have suggested that stimuli maintain observing owing to their special relationship to primary reinforcement (the conditioned- reinforcement hypothesis), and not because they predict the availability and nonavailability of reinforcement (the information hypothesis). The present article first reviews a…

  13. Effects of forest management legacies on spruce budworm (Choristoneura fumiferana) outbreaks

    Treesearch

    Louis-Etienne Robert; Daniel Kneeshaw; Brian R. Sturtevant

    2012-01-01

    The "silvicultural hypothesis" of spruce budworm (Choristoneura fumiferana Clem.) dynamics postulates that increasing severity of spruce budworm outbreaks over the last century resulted from forest conditions created by past management activities. Yet, definitive tests of the hypothesis remain elusive. We examined spruce budworm outbreak...

  14. Conditional Probability Analysis: A Statistical Tool for Environmental Analysis.

    EPA Science Inventory

    The use and application of environmental conditional probability analysis (CPA) is relatively recent. The first presentation using CPA was made in 2002 at the New England Association of Environmental Biologists Annual Meeting in Newport, Rhode Island. CPA has been used since the...

  15. Prediction and visualization of redox conditions in the groundwater of Central Valley, California

    USGS Publications Warehouse

    Rosecrans, Celia Z.; Nolan, Bernard T.; Gronberg, JoAnn M.

    2017-01-01

    Regional-scale, three-dimensional continuous probability models were constructed for aspects of redox conditions in the groundwater system of the Central Valley, California. These models yield grids depicting the probability that groundwater in a particular location will have dissolved oxygen (DO) concentrations less than selected threshold values representing anoxic groundwater conditions, or will have dissolved manganese (Mn) concentrations greater than selected threshold values representing secondary drinking water-quality contaminant levels (SMCL) and health-based screening levels (HBSL). The probability models were constrained by the alluvial boundary of the Central Valley to a depth of approximately 300 m. Probability distribution grids can be extracted from the 3-D models at any desired depth, and are of interest to water-resource managers, water-quality researchers, and groundwater modelers concerned with the occurrence of natural and anthropogenic contaminants related to anoxic conditions. Models were constructed using a Boosted Regression Trees (BRT) machine learning technique that produces many trees as part of an additive model, can handle many variables, automatically incorporates interactions, and is resistant to collinearity. Machine learning methods for statistical prediction are becoming increasingly popular because they do not require the assumptions associated with traditional hypothesis testing. Models were constructed using measured dissolved oxygen and manganese concentrations sampled from 2767 wells within the alluvial boundary of the Central Valley, and over 60 explanatory variables representing regional-scale soil properties, soil chemistry, land use, aquifer textures, and aquifer hydrologic properties. Models were trained on a USGS dataset of 932 wells, and evaluated on an independent hold-out dataset of 1835 wells from the California Division of Drinking Water.
We used cross-validation to assess the predictive performance of models of varying complexity, as a basis for selecting final models. Trained models were applied to cross-validation testing data and a separate hold-out dataset to evaluate model predictive performance, emphasizing three model fit metrics: Kappa, accuracy, and the area under the receiver operating characteristic curve (ROC). The final trained models were used for mapping predictions at discrete depths to a depth of 304.8 m. Trained DO and Mn models had accuracies of 86–100%, Kappa values of 0.69–0.99, and ROC values of 0.92–1.0. Model accuracies for cross-validation testing datasets were 82–95% and ROC values were 0.87–0.91, indicating good predictive performance. Kappas for the cross-validation testing dataset were 0.30–0.69, indicating fair to substantial agreement between testing observations and model predictions. Hold-out data were available for the manganese model only and indicated accuracies of 89–97%, ROC values of 0.73–0.75, and Kappa values of 0.06–0.30. The predictive performance of both the DO and Mn models was reasonable, considering all three of these fit metrics and the low percentages of low-DO and high-Mn events in the data.
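Two of the fit metrics reported above, accuracy and Cohen's kappa, are simple to compute from binary predictions; kappa corrects raw agreement for the agreement expected by chance from the marginal class frequencies, which matters when events (e.g., low-DO wells) are rare. A minimal sketch with illustrative labels, not the study's data:

```python
def accuracy_and_kappa(observed, predicted):
    """Accuracy and Cohen's kappa for binary class labels.
    Kappa discounts the agreement expected by chance alone."""
    n = len(observed)
    agree = sum(o == p for o, p in zip(observed, predicted)) / n
    # Chance agreement from the marginal frequencies of each class
    p_obs1 = sum(observed) / n
    p_pred1 = sum(predicted) / n
    chance = p_obs1 * p_pred1 + (1 - p_obs1) * (1 - p_pred1)
    kappa = (agree - chance) / (1 - chance)
    return agree, kappa

# Illustrative labels only (1 = event, e.g. high-Mn well)
obs  = [1, 1, 1, 0, 0, 0, 0, 0, 0, 0]
pred = [1, 1, 0, 0, 0, 0, 0, 0, 0, 1]
acc, kappa = accuracy_and_kappa(obs, pred)
```

With rare events, accuracy can look high while kappa stays modest, which is exactly the pattern in the hold-out results above (accuracies of 89–97% alongside kappas of 0.06–0.30).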

  16. The Affective Bases of Risk Perception: Negative Feelings and Stress Mediate the Relationship between Mental Imagery and Risk Perception.

    PubMed

    Sobkow, Agata; Traczyk, Jakub; Zaleskiewicz, Tomasz

    2016-01-01

    Recent research has documented that affect plays a crucial role in risk perception. When no information about numerical risk estimates is available (e.g., probability of loss or magnitude of consequences), people may rely on positive and negative affect toward perceived risk. However, determinants of affective reactions to risks are poorly understood. In a series of three experiments, we addressed the question of whether and to what degree mental imagery eliciting negative affect and stress influences risk perception. In each experiment, participants were instructed to visualize consequences of risk taking and to rate riskiness. In Experiment 1, participants who imagined negative risk consequences reported more negative affect and perceived risk as higher compared to the control condition. In Experiment 2, we found that this effect was driven by affect elicited by mental imagery rather than its vividness and intensity. In this study, imagining positive risk consequences led to lower perceived risk than visualizing negative risk consequences. Finally, we tested the hypothesis that negative affect related to higher perceived risk was caused by negative feelings of stress. In Experiment 3, we introduced risk-irrelevant stress to show that participants in the stress condition rated perceived risk as higher in comparison to the control condition. This experiment showed that higher ratings of perceived risk were influenced by psychological stress. Taken together, our results demonstrate that affect-laden mental imagery dramatically changes risk perception through negative affect (i.e., psychological stress).

  17. Multiscale control of flooding and riparian-forest composition in Lower Michigan, USA.

    PubMed

    Baker, Matthew E; Wiley, Michael J

    2009-01-01

    Despite general agreement that river-valley hydrology shapes riparian ecosystems, the relevant processes are difficult to distinguish and often inadequately specified in riparian studies. We hypothesize that physical constraints imposed by broad-scale watershed characteristics and river valleys modify local site conditions in a predictable and probabilistic fashion. To test this hypothesis, we employ a series of structural equations that decompose the occurrence of riparian ecotypes into regional temperature, catchment storm response, valley hydraulics, and local site wetness via a priori specification of factor structure, and ask (1) Is there evidence for multiscale hydrologic control of riparian diversity across Lower Michigan? (2) Do representations of key constraints on flood dynamics distinguish regional patterns of riparian vegetation? (3) How important are these effects? Cross-correlation among geospatial predictors initially obscured much of the variation revealed through analysis of semipartial variance. Causal relationships implied by our model were consistent with observed variation in riparian conditions (chi-square P = 0.43) and accounted for between 84% and 99% of the occurrence probability of five riparian ecotypes at 94 locations. Through explicit quantification of relative flood frequency, duration, intensity, and overall inundation, the results suggest strong variation in the effects of regional climate, and in both the relative importance and spatial scale of the hydrologic factors influencing riparian vegetation. Although climate and hydrology are not the only determinants of riparian conditions, interactions of hydrologic sourcing and flood dynamics described by our spatial models drive a significant portion of the variation in riparian ecosystem character throughout Lower Michigan, USA.

  18. The Affective Bases of Risk Perception: Negative Feelings and Stress Mediate the Relationship between Mental Imagery and Risk Perception

    PubMed Central

    Sobkow, Agata; Traczyk, Jakub; Zaleskiewicz, Tomasz

    2016-01-01

    Recent research has documented that affect plays a crucial role in risk perception. When no information about numerical risk estimates is available (e.g., probability of loss or magnitude of consequences), people may rely on positive and negative affect toward perceived risk. However, determinants of affective reactions to risks are poorly understood. In a series of three experiments, we addressed the question of whether and to what degree mental imagery eliciting negative affect and stress influences risk perception. In each experiment, participants were instructed to visualize consequences of risk taking and to rate riskiness. In Experiment 1, participants who imagined negative risk consequences reported more negative affect and perceived risk as higher compared to the control condition. In Experiment 2, we found that this effect was driven by affect elicited by mental imagery rather than its vividness and intensity. In this study, imagining positive risk consequences led to lower perceived risk than visualizing negative risk consequences. Finally, we tested the hypothesis that negative affect related to higher perceived risk was caused by negative feelings of stress. In Experiment 3, we introduced risk-irrelevant stress to show that participants in the stress condition rated perceived risk as higher in comparison to the control condition. This experiment showed that higher ratings of perceived risk were influenced by psychological stress. Taken together, our results demonstrate that affect-laden mental imagery dramatically changes risk perception through negative affect (i.e., psychological stress). PMID:27445901

  19. Evaluation of the risk of endocarditis and other cardiovascular events on the basis of the severity of periodontal disease in dogs.

    PubMed

    Glickman, Lawrence T; Glickman, Nita W; Moore, George E; Goldstein, Gary S; Lewis, Hugh B

    2009-02-15

    To test the hypothesis that increased severity of periodontal disease in dogs is associated with an increased risk of cardiovascular-related events, such as endocarditis and cardiomyopathy, as well as markers of inflammation. Historical cohort observational study. 59,296 dogs with a history of periodontal disease (periodontal cohort), of which 23,043 had stage 1 disease, 20,732 had stage 2 disease, and 15,521 had stage 3 disease; and an age-matched comparison group of 59,296 dogs with no history of periodontal disease (nonperiodontal cohort). Cox proportional hazard regression models were used to estimate the risk of cardiovascular-related diagnoses and examination findings in dogs as a function of the stage of periodontal disease (1, 2, or 3 or no periodontal disease) over time while controlling for the effect of potential confounding factors. Significant associations were detected between the severity of periodontal disease and the subsequent risk of cardiovascular-related conditions, such as endocarditis and cardiomyopathy, but not between the severity of periodontal disease and the risk of a variety of other common noncardiovascular-related conditions. The findings of this observational study, similar to epidemiologic studies in humans, suggested that periodontal disease was associated with cardiovascular-related conditions, such as endocarditis and cardiomyopathy. Chronic inflammation is probably an important mechanism connecting bacterial flora in the oral cavity of dogs with systemic disease. Canine health may be improved if veterinarians and pet owners place a higher priority on routine dental care.

  20. European regional efficiency and geographical externalities: a spatial nonparametric frontier analysis

    NASA Astrophysics Data System (ADS)

    Ramajo, Julián; Cordero, José Manuel; Márquez, Miguel Ángel

    2017-10-01

    This paper analyses region-level technical efficiency in nine European countries over the 1995-2007 period. We propose the application of a nonparametric conditional frontier approach to account for the presence of heterogeneous conditions in the form of geographical externalities. Such environmental factors are beyond the control of regional authorities, but may affect the production function. Therefore, they need to be considered in the frontier estimation. Specifically, a spatial autoregressive term is included as an external conditioning factor in a robust order-m model. Thus we can test the hypothesis of non-separability (the external factor impacts both the input-output space and the distribution of efficiencies), demonstrating the existence of significant global interregional spillovers into the production process. Our findings show that geographical externalities affect both the frontier level and the probability of being more or less efficient. Specifically, the results indicate that the spatial lag variable has an inverted U-shaped non-linear impact on the performance of regions. This finding can be interpreted as a differential effect of interregional spillovers depending on the size of the neighboring economies: positive externalities for small values, possibly related to agglomeration economies, and negative externalities for high values, indicating the possibility of production congestion. Additionally, evidence of the existence of a strong geographic pattern of European regional efficiency is reported and the levels of technical efficiency are acknowledged to have converged during the period under analysis.
