Sample records for probability theory

  1. On Replacing "Quantum Thinking" with Counterfactual Reasoning

    NASA Astrophysics Data System (ADS)

    Narens, Louis

    The probability theory used in quantum mechanics is currently being employed by psychologists to model the impact of context on decisions. Its event space consists of closed subspaces of a Hilbert space, and its probability functions sometimes violate the law of finite additivity of probabilities. Results from the quantum mechanics literature indicate that such a "Hilbert space probability theory" cannot be extended in a useful way to standard, finitely additive probability theory by the addition of new events with specific probabilities. This chapter presents a new kind of probability theory that shares many fundamental algebraic characteristics with Hilbert space probability theory but does extend to standard probability theory by adjoining new events with specific probabilities. The new probability theory arises from considerations about how psychological experiments are related through counterfactual reasoning.

  2. Naive Probability: A Mental Model Theory of Extensional Reasoning.

    ERIC Educational Resources Information Center

    Johnson-Laird, P. N.; Legrenzi, Paolo; Girotto, Vittorio; Legrenzi, Maria Sonino; Caverni, Jean-Paul

    1999-01-01

    Outlines a theory of naive probability in which individuals who are unfamiliar with the probability calculus can infer the probabilities of events in an "extensional" way. The theory accommodates reasoning based on numerical premises, and explains how naive reasoners can infer posterior probabilities without relying on Bayes's theorem.…

  3. Propensity, Probability, and Quantum Theory

    NASA Astrophysics Data System (ADS)

    Ballentine, Leslie E.

    2016-08-01

    Quantum mechanics and probability theory share one peculiarity. Both have well established mathematical formalisms, yet both are subject to controversy about the meaning and interpretation of their basic concepts. Since probability plays a fundamental role in QM, the conceptual problems of one theory can affect the other. We first classify the interpretations of probability into three major classes: (a) inferential probability, (b) ensemble probability, and (c) propensity. Class (a) is the basis of inductive logic; (b) deals with the frequencies of events in repeatable experiments; (c) describes a form of causality that is weaker than determinism. An important, but neglected, paper by P. Humphreys demonstrated that propensity must differ mathematically, as well as conceptually, from probability, but he did not develop a theory of propensity. Such a theory is developed in this paper. Propensity theory shares many, but not all, of the axioms of probability theory. As a consequence, propensity supports the Law of Large Numbers from probability theory, but does not support Bayes theorem. Although there are particular problems within QM to which any of the classes of probability may be applied, it is argued that the intrinsic quantum probabilities (calculated from a state vector or density matrix) are most naturally interpreted as quantum propensities. This does not alter the familiar statistical interpretation of QM. But the interpretation of quantum states as representing knowledge is untenable. Examples show that a density matrix fails to represent knowledge.

  4. On Probability Domains IV

    NASA Astrophysics Data System (ADS)

    Frič, Roman; Papčo, Martin

    2017-12-01

    Stressing a categorical approach, we continue our study of fuzzified domains of probability, in which classical random events are replaced by measurable fuzzy random events. In operational probability theory (S. Bugajski), classical random variables are replaced by statistical maps (generalized distribution maps induced by random variables), and in fuzzy probability theory (S. Gudder) the central role is played by observables (maps between probability domains). We show that to each of the two generalized probability theories there corresponds a suitable category and that the two resulting categories are dually equivalent. Statistical maps and observables become morphisms. A statistical map can send a degenerate (pure) state to a non-degenerate one (a quantum phenomenon) and, dually, an observable can map a crisp random event to a genuine fuzzy random event (a fuzzy phenomenon). The dual equivalence means that operational probability theory and fuzzy probability theory coincide and that the resulting generalized probability theory has two dual aspects: quantum and fuzzy. We close with some notes on products and coproducts in the dual categories.

  5. Hepatitis disease detection using Bayesian theory

    NASA Astrophysics Data System (ADS)

    Maseleno, Andino; Hidayati, Rohmah Zahroh

    2017-02-01

    This paper presents a diagnosis of hepatitis disease using Bayesian theory, with the aim of illustrating the theory more clearly. In this research, we used Bayesian theory to detect hepatitis disease and to display the result of the diagnosis process. The Bayesian approach, rediscovered and perfected by Laplace, rests on a simple idea: starting from a known prior probability and the conditional probability of the observed data, Bayes' theorem is used to calculate the corresponding posterior probability, which is then used for inference and decision making. Bayesian methods thus combine existing knowledge (prior probabilities) with additional knowledge derived from new data (the likelihood function). The initial symptoms of hepatitis include malaise, fever, and headache, and the quantity of interest is the probability of hepatitis given the presence of these symptoms. The results show that Bayesian theory successfully identified the presence of hepatitis disease.
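
    As a rough sketch of the kind of calculation described above, the following applies Bayes' theorem to the three symptoms under a naive (conditional independence) assumption; the prior and likelihood values are hypothetical and are not taken from the paper.

      # Hypothetical illustration of Bayes' theorem for diagnosis (all numbers are made up).
      def posterior(prior, p_symptoms_given_disease, p_symptoms_given_healthy):
          """P(disease | symptoms) from a prior and two likelihoods."""
          evidence = (p_symptoms_given_disease * prior
                      + p_symptoms_given_healthy * (1.0 - prior))
          return p_symptoms_given_disease * prior / evidence

      prior = 0.01                        # assumed prevalence of hepatitis
      p_given_disease = 0.8 * 0.7 * 0.6   # P(malaise, fever, headache | hepatitis), independence assumed
      p_given_healthy = 0.2 * 0.1 * 0.1   # P(malaise, fever, headache | no hepatitis)
      print(posterior(prior, p_given_disease, p_given_healthy))   # roughly 0.63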

  6. Heuristics can produce surprisingly rational probability estimates: Comment on Costello and Watts (2014).

    PubMed

    Nilsson, Håkan; Juslin, Peter; Winman, Anders

    2016-01-01

    Costello and Watts (2014) present a model assuming that people's knowledge of probabilities adheres to probability theory, but that their probability judgments are perturbed by random noise in retrieval from memory. Predictions for the relationships between probability judgments for constituent events and their disjunctions and conjunctions, as well as for sums of such judgments, were derived from probability theory. Costello and Watts (2014) report behavioral data showing that subjective probability judgments accord with these predictions. Based on the finding that subjective probability judgments follow probability theory, Costello and Watts (2014) conclude that the results imply that people's probability judgments embody the rules of probability theory and thereby refute theories of heuristic processing. Here, we demonstrate the invalidity of this conclusion by showing that all of the tested predictions follow straightforwardly from an account assuming heuristic probability integration (Nilsson, Winman, Juslin, & Hansson, 2009). We end with a discussion of a number of previous findings that harmonize very poorly with the predictions of the model suggested by Costello and Watts (2014). (c) 2015 APA, all rights reserved.

  7. Econophysics and individual choice

    NASA Astrophysics Data System (ADS)

    Bordley, Robert F.

    2005-08-01

    The subjectivist theory of probability specifies certain axioms of rationality which together lead to both a theory of probability and a theory of preference. The theory of probability is used throughout the sciences while the theory of preferences is used in economics. Results in quantum physics challenge the adequacy of the subjectivist theory of probability. As we show, answering this challenge requires modifying an Archimedean axiom in the subjectivist theory. But changing this axiom modifies the subjectivist theory of preference and therefore has implications for economics. As this paper notes, these implications are consistent with current empirical findings in psychology and economics. As we show, these results also have implications for pricing in securities markets. This suggests further directions for research in econophysics.

  8. Alternative probability theories for cognitive psychology.

    PubMed

    Narens, Louis

    2014-01-01

    Various proposals for generalizing event spaces for probability functions have been put forth in the mathematical, scientific, and philosophic literatures. In cognitive psychology such generalizations are used for explaining puzzling results in decision theory and for modeling the influence of context effects. This commentary discusses proposals for generalizing probability theory to event spaces that are not necessarily boolean algebras. Two prominent examples are quantum probability theory, which is based on the set of closed subspaces of a Hilbert space, and topological probability theory, which is based on the set of open sets of a topology. Both have been applied to a variety of cognitive situations. This commentary focuses on how event space properties can influence probability concepts and impact cognitive modeling. Copyright © 2013 Cognitive Science Society, Inc.

  9. Decision analysis with cumulative prospect theory.

    PubMed

    Bayoumi, A M; Redelmeier, D A

    2000-01-01

    Individuals sometimes express preferences that do not follow expected utility theory. Cumulative prospect theory adjusts for some phenomena by using decision weights rather than probabilities when analyzing a decision tree. The authors examined how probability transformations from cumulative prospect theory might alter a decision analysis of a prophylactic therapy in AIDS, eliciting utilities from patients with HIV infection (n = 75) and calculating expected outcomes using an established Markov model. They next focused on transformations of three sets of probabilities: 1) the probabilities used in calculating standard-gamble utility scores; 2) the probabilities of being in discrete Markov states; 3) the probabilities of transitioning between Markov states. The same prophylaxis strategy yielded the highest quality-adjusted survival under all transformations. For the average patient, prophylaxis appeared relatively less advantageous when standard-gamble utilities were transformed. Prophylaxis appeared relatively more advantageous when state probabilities were transformed and relatively less advantageous when transition probabilities were transformed. Transforming standard-gamble and transition probabilities simultaneously decreased the gain from prophylaxis by almost half. Sensitivity analysis indicated that even near-linear probability weighting transformations could substantially alter quality-adjusted survival estimates. The magnitude of benefit estimated in a decision-analytic model can change significantly after using cumulative prospect theory. Incorporating cumulative prospect theory into decision analysis can provide a form of sensitivity analysis and may help describe when people deviate from expected utility theory.
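
    To make the probability-transformation step concrete, the sketch below applies the Tversky-Kahneman (1992) weighting function to a row of Markov transition probabilities and renormalizes the row; this is only one plausible reading of such a transformation, not necessarily the authors' exact procedure, and the parameter value and probabilities are illustrative.

      # Tversky-Kahneman (1992) probability weighting function (illustrative parameter value).
      def tk_weight(p, gamma=0.61):
          return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

      # Hypothetical Markov transition row, transformed and renormalized to sum to 1.
      row = [0.85, 0.10, 0.05]
      weighted = [tk_weight(p) for p in row]
      transformed = [w / sum(weighted) for w in weighted]
      print(transformed)   # small probabilities are inflated, the large one is deflated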

  10. Surprisingly rational: probability theory plus noise explains biases in judgment.

    PubMed

    Costello, Fintan; Watts, Paul

    2014-07-01

    The systematic biases seen in people's probability judgments are typically taken as evidence that people do not use the rules of probability theory when reasoning about probability but instead use heuristics, which sometimes yield reasonable judgments and sometimes yield systematic biases. This view has had a major impact in economics, law, medicine, and other fields; indeed, the idea that people cannot reason with probabilities has become a truism. We present a simple alternative to this view, where people reason about probability according to probability theory but are subject to random variation or noise in the reasoning process. In this account the effect of noise is canceled for some probabilistic expressions. Analyzing data from 2 experiments, we find that, for these expressions, people's probability judgments are strikingly close to those required by probability theory. For other expressions, this account produces systematic deviations in probability estimates. These deviations explain 4 reliable biases in human probabilistic reasoning (conservatism, subadditivity, conjunction, and disjunction fallacies). These results suggest that people's probability judgments embody the rules of probability theory and that biases in those judgments are due to the effects of random noise. (c) 2014 APA, all rights reserved.
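
    A minimal simulation of this kind of account, under the assumption (mine, for illustration) that each judgment is the proportion of n memory samples, each read incorrectly with probability d: for the addition-law expression P(A) + P(B) - P(A and B) - P(A or B), the noise contributions cancel in expectation, while individual judgments remain biased toward the middle of the scale.

      import random

      def noisy_estimate(p, d=0.25, n=100):
          """Estimate a probability from n memory samples, each read incorrectly with probability d."""
          hits = sum((random.random() < p) != (random.random() < d) for _ in range(n))
          return hits / n

      random.seed(1)
      pA, pB, pAB, pAorB = 0.4, 0.5, 0.2, 0.7   # a consistent set of true probabilities
      expr = [noisy_estimate(pA) + noisy_estimate(pB)
              - noisy_estimate(pAB) - noisy_estimate(pAorB) for _ in range(5000)]
      print(sum(expr) / len(expr))                                  # near 0: noise cancels here
      print(sum(noisy_estimate(pAB) for _ in range(5000)) / 5000)   # near 0.35, biased away from 0.2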

  11. Topological and Orthomodular Modeling of Context in Behavioral Science

    NASA Astrophysics Data System (ADS)

    Narens, Louis

    2017-02-01

    Two non-Boolean methods are discussed for modeling context in behavioral data and theory. The first is based on intuitionistic logic, which is similar to classical logic except that not every event has a complement. Its probability theory is also similar to classical probability theory except that the definition of a probability function needs to be generalized to unions of events instead of applying only to unions of disjoint events. The generalization is needed because intuitionistic event spaces may not contain enough disjoint events for the classical definition to be effective. The second method develops a version of quantum logic for its underlying probability theory. It differs in a variety of ways from the Hilbert space logic used in quantum mechanics as a foundation for quantum probability theory. John von Neumann and others have commented on the lack of a relative frequency approach and a rational foundation for that probability theory. This article argues that its version of quantum probability theory does not have such issues. The method based on intuitionistic logic is useful for modeling cognitive interpretations that vary with context, for example, the mood of the decision maker, the context produced by the influence of other items in a choice experiment, etc. The method based on this article's quantum logic is useful for modeling probabilities across contexts, for example, how probabilities of events from different experiments are related.

  12. Probability theory plus noise: Replies to Crupi and Tentori (2016) and to Nilsson, Juslin, and Winman (2016).

    PubMed

    Costello, Fintan; Watts, Paul

    2016-01-01

    A standard assumption in much of current psychology is that people do not reason about probability using the rules of probability theory but instead use various heuristics or "rules of thumb," which can produce systematic reasoning biases. In Costello and Watts (2014), we showed that a number of these biases can be explained by a model where people reason according to probability theory but are subject to random noise. More importantly, that model also predicted agreement with probability theory for certain expressions that cancel the effects of random noise: Experimental results strongly confirmed this prediction, showing that probabilistic reasoning is simultaneously systematically biased and "surprisingly rational." In their commentaries on that paper, both Crupi and Tentori (2016) and Nilsson, Juslin, and Winman (2016) point to various experimental results that, they suggest, our model cannot explain. In this reply, we show that our probability theory plus noise model can in fact explain every one of the results identified by these authors. This gives a degree of additional support to the view that people's probability judgments embody the rational rules of probability theory and that biases in those judgments can be explained as simply effects of random noise. (c) 2015 APA, all rights reserved.

  13. The relevance of the early history of probability theory to current risk assessment practices in mental health care.

    PubMed

    Large, Matthew

    2013-12-01

    Probability theory is at the base of modern concepts of risk assessment in mental health. The aim of the current paper is to review the key developments in the early history of probability theory in order to enrich our understanding of current risk assessment practices.

  14. Causal inference, probability theory, and graphical insights.

    PubMed

    Baker, Stuart G

    2013-11-10

    Causal inference from observational studies is a fundamental topic in biostatistics. The causal graph literature typically views probability theory as insufficient to express causal concepts in observational studies. In contrast, the view here is that probability theory is a desirable and sufficient basis for many topics in causal inference for the following two reasons. First, probability theory is generally more flexible than causal graphs: Besides explaining such causal graph topics as M-bias (adjusting for a collider) and bias amplification and attenuation (when adjusting for an instrumental variable), probability theory is also the foundation of the paired availability design for historical controls, which does not fit into a causal graph framework. Second, probability theory is the basis for insightful graphical displays including the BK-Plot for understanding Simpson's paradox with a binary confounder, the BK2-Plot for understanding bias amplification and attenuation in the presence of an unobserved binary confounder, and the PAD-Plot for understanding the principal stratification component of the paired availability design. Published 2013. This article is a US Government work and is in the public domain in the USA.

  15. The internal consistency of the standard gamble: tests after adjusting for prospect theory.

    PubMed

    Oliver, Adam

    2003-07-01

    This article reports a study that tests whether the internal consistency of the standard gamble can be improved upon by incorporating loss weighting and probability transformation parameters in the standard gamble valuation procedure. Five alternatives to the standard EU formulation are considered: (1) probability transformation within an EU framework; and, within a prospect theory framework, (2) loss weighting and full probability transformation, (3) no loss weighting and full probability transformation, (4) loss weighting and no probability transformation, and (5) loss weighting and partial probability transformation. Of the five alternatives, only the prospect theory formulation with loss weighting and no probability transformation offers an improvement in internal consistency over the standard EU valuation procedure.

  16. Excluding joint probabilities from quantum theory

    NASA Astrophysics Data System (ADS)

    Allahverdyan, Armen E.; Danageozian, Arshag

    2018-03-01

    Quantum theory does not provide a unique definition for the joint probability of two noncommuting observables, which is the next important question after Born's probability for a single observable. Instead, various definitions have been suggested, e.g., via quasiprobabilities or via hidden-variable theories. After reviewing open issues of the joint probability, we relate it to quantum imprecise probabilities, which are noncontextual and are consistent with all constraints expected from a quantum probability. We study two noncommuting observables in a two-dimensional Hilbert space and show that there is no precise joint probability that applies for any quantum state and is consistent with imprecise probabilities. This contrasts with theorems by Bell and Kochen-Specker that exclude joint probabilities for more than two noncommuting observables in Hilbert spaces of dimension larger than two. If measurement contexts are included in the definition, joint probabilities are no longer excluded, but they are still constrained by imprecise probabilities.

  17. The Cognitive Substrate of Subjective Probability

    ERIC Educational Resources Information Center

    Nilsson, Hakan; Olsson, Henrik; Juslin, Peter

    2005-01-01

    The prominent cognitive theories of probability judgment were primarily developed to explain cognitive biases rather than to account for the cognitive processes in probability judgment. In this article the authors compare 3 major theories of the processes and representations in probability judgment: the representativeness heuristic, implemented as…

  18. Probability theory, not the very guide of life.

    PubMed

    Juslin, Peter; Nilsson, Håkan; Winman, Anders

    2009-10-01

    Probability theory has long been taken as the self-evident norm against which to evaluate inductive reasoning, and classical demonstrations of violations of this norm include the conjunction error and base-rate neglect. Many of these phenomena require multiplicative probability integration, whereas people seem more inclined to linear additive integration, in part, at least, because of well-known capacity constraints on controlled thought. In this article, the authors show with computer simulations that when based on approximate knowledge of probabilities, as is routinely the case in natural environments, linear additive integration can yield as accurate estimates, and as good average decision returns, as estimates based on probability theory. It is proposed that in natural environments people have little opportunity or incentive to induce the normative rules of probability theory and, given their cognitive constraints, linear additive integration may often offer superior bounded rationality.

  19. Extended Importance Sampling for Reliability Analysis under Evidence Theory

    NASA Astrophysics Data System (ADS)

    Yuan, X. K.; Chen, B.; Zhang, B. Q.

    2018-05-01

    In early engineering practice, the lack of data and information makes uncertainty difficult to deal with. However, evidence theory has been proposed as an alternative to traditional probability theory for handling uncertainty with limited information. In this contribution, a simulation-based approach called ‘extended importance sampling’ is proposed, based on evidence theory, to handle problems with epistemic uncertainty. The proposed approach stems from traditional importance sampling for reliability analysis under probability theory and is developed to handle problems with epistemic uncertainty. It first introduces a nominal instrumental probability density function (PDF) for every epistemic uncertainty variable, so that an ‘equivalent’ reliability problem under probability theory is obtained. Samples of these variables are then generated by importance sampling. Based on these samples, the plausibility and belief (the upper and lower bounds on the probability) can be estimated. The approach is more efficient than direct Monte Carlo simulation. Numerical and engineering examples are given to illustrate the efficiency and feasibility of the proposed approach.
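
    The ordinary importance-sampling step on which the proposed approach builds can be sketched as follows for a purely probabilistic reliability problem; the limit-state function, the distributions and the instrumental density below are illustrative, and the evidence-theory part (propagating interval focal elements to plausibility and belief) is not reproduced here.

      import math, random

      def limit_state(x1, x2):
          return 6.0 - x1 - x2          # hypothetical limit state; failure when g <= 0

      def std_normal_pdf(x):
          return math.exp(-0.5 * x * x) / math.sqrt(2.0 * math.pi)

      # Draw from an instrumental density centred near the failure region (normals with mean 3),
      # then reweight each failing sample by the ratio of the original to the instrumental density.
      random.seed(0)
      N, acc = 20000, 0.0
      for _ in range(N):
          x1, x2 = random.gauss(3.0, 1.0), random.gauss(3.0, 1.0)
          if limit_state(x1, x2) <= 0:
              acc += (std_normal_pdf(x1) * std_normal_pdf(x2)) / \
                     (std_normal_pdf(x1 - 3.0) * std_normal_pdf(x2 - 3.0))
      print(acc / N)   # estimate of P(g <= 0) under the original standard-normal model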

  20. Psychophysics of the probability weighting function

    NASA Astrophysics Data System (ADS)

    Takahashi, Taiki

    2011-03-01

    A probability weighting function w(p) for an objective probability p in decision under risk plays a pivotal role in Kahneman-Tversky prospect theory. Although recent studies in econophysics and neuroeconomics have widely utilized probability weighting functions, the psychophysical foundations of these functions have been unknown. Notably, the behavioral economist Prelec (1998) [4] axiomatically derived the probability weighting function w(p) = exp(-(-ln p)^α) (with 0 < α < 1, w(0) = 0, w(1/e) = 1/e, and w(1) = 1), which has been extensively studied in behavioral neuroeconomics. The present study utilizes psychophysical theory to derive Prelec's probability weighting function from psychophysical laws of perceived waiting time in probabilistic choices. The relations between the parameters in the probability weighting function and the probability discounting function in behavioral psychology are also derived. Future directions in the application of the psychophysical theory of the probability weighting function in econophysics and neuroeconomics are discussed.
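
    For reference, a minimal implementation of Prelec's one-parameter weighting function as reconstructed above; the parameter value is purely illustrative.

      import math

      def prelec_weight(p, alpha=0.65):
          """Prelec (1998) weighting function w(p) = exp(-(-ln p)**alpha)."""
          if p == 0.0:
              return 0.0
          return math.exp(-((-math.log(p)) ** alpha))

      for p in (0.01, 1 / math.e, 0.5, 0.99):
          print(p, prelec_weight(p))   # w(1/e) = 1/e for any alpha; small p are overweighted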

  1. Deriving Laws from Ordering Relations

    NASA Technical Reports Server (NTRS)

    Knuth, Kevin H.

    2004-01-01

    The effect of Richard T. Cox's contribution to probability theory was to generalize Boolean implication among logical statements to degrees of implication, which are manipulated using rules derived from consistency with Boolean algebra. These rules are known as the sum rule, the product rule and Bayes' theorem, and the measure resulting from this generalization is probability. In this paper, I will describe how Cox's technique can be further generalized to include other algebras and hence other problems in science and mathematics. The result is a methodology that can be used to generalize an algebra to a calculus by relying on consistency with order theory to derive the laws of the calculus. My goals are to clear up the mysteries as to why the same basic structure found in probability theory appears in other contexts, to better understand the foundations of probability theory, and to extend these ideas to other areas by developing new mathematics and new physics. The relevance of this methodology will be demonstrated using examples from probability theory, number theory, geometry, information theory, and quantum mechanics.

  2. Chance, determinism and the classical theory of probability.

    PubMed

    Vasudevan, Anubav

    2018-02-01

    This paper situates the metaphysical antinomy between chance and determinism in the historical context of some of the earliest developments in the mathematical theory of probability. Since Hacking's seminal work on the subject, it has been a widely held view that the classical theorists of probability were guilty of an unwitting equivocation between a subjective, or epistemic, interpretation of probability, on the one hand, and an objective, or statistical, interpretation, on the other. While there is some truth to this account, I argue that the tension at the heart of the classical theory of probability is not best understood in terms of the duality between subjective and objective interpretations of probability. Rather, the apparent paradox of chance and determinism, when viewed through the lens of the classical theory of probability, manifests itself in a much deeper ambivalence on the part of the classical probabilists as to the rational commensurability of causal and probabilistic reasoning. Copyright © 2017 Elsevier Ltd. All rights reserved.

  3. Quantum probabilities as Dempster-Shafer probabilities in the lattice of subspaces

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vourdas, A.

    2014-08-15

    The orthocomplemented modular lattice of subspaces L[H(d)] of a quantum system with d-dimensional Hilbert space H(d) is considered. A generalized additivity relation which holds for Kolmogorov probabilities is violated by quantum probabilities in the full lattice L[H(d)] (it is only valid within the Boolean subalgebras of L[H(d)]). This suggests the use of more general (than Kolmogorov) probability theories, and here the Dempster-Shafer probability theory is adopted. An operator D(H_1, H_2), which quantifies deviations from Kolmogorov probability theory, is introduced, and it is shown to be intimately related to the commutator of the projectors P(H_1), P(H_2) onto the subspaces H_1, H_2. As an application, it is shown that the proof of the inequalities of Clauser, Horne, Shimony, and Holt for a system of two spin-1/2 particles is valid for Kolmogorov probabilities but not for Dempster-Shafer probabilities. The violation of these inequalities in experiments supports the interpretation of quantum probabilities as Dempster-Shafer probabilities.

  4. Probability Theory Plus Noise: Descriptive Estimation and Inferential Judgment.

    PubMed

    Costello, Fintan; Watts, Paul

    2018-01-01

    We describe a computational model of two central aspects of people's probabilistic reasoning: descriptive probability estimation and inferential probability judgment. This model assumes that people's reasoning follows standard frequentist probability theory, but that it is subject to random noise. This random noise has a regressive effect in descriptive probability estimation, moving probability estimates away from normative probabilities and toward the center of the probability scale. This random noise has an anti-regressive effect in inferential judgment, however. These regressive and anti-regressive effects explain various reliable and systematic biases seen in people's descriptive probability estimation and inferential probability judgment. The model predicts that these contrary effects will tend to cancel out in tasks that involve both descriptive estimation and inferential judgment, leading to unbiased responses in those tasks. We test this model by applying it to one such task, described by Gallistel et al. Participants' median responses in this task were unbiased, agreeing with normative probability theory over the full range of responses. Our model captures the pattern of unbiased responses in this task, while simultaneously explaining the systematic biases away from normatively correct probabilities seen in other tasks. Copyright © 2018 Cognitive Science Society, Inc.
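
    As a pointer to the form of the regressive effect, if one assumes the symmetric-noise reading of the model (each memory sample is read incorrectly with probability d; this specific parametric form is my assumption here), the expected descriptive estimate of a true probability p is

      E[\hat{p}] = (1 - 2d)\,p + d, \qquad 0 \le d < \tfrac{1}{2},

    which pulls estimates toward the midpoint 1/2 of the probability scale, as described above.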

  5. Probability Theory, Not the Very Guide of Life

    ERIC Educational Resources Information Center

    Juslin, Peter; Nilsson, Hakan; Winman, Anders

    2009-01-01

    Probability theory has long been taken as the self-evident norm against which to evaluate inductive reasoning, and classical demonstrations of violations of this norm include the conjunction error and base-rate neglect. Many of these phenomena require multiplicative probability integration, whereas people seem more inclined to linear additive…

  6. Help-Seeking Decisions of Battered Women: A Test of Learned Helplessness and Two Stress Theories.

    ERIC Educational Resources Information Center

    Wauchope, Barbara A.

    This study tested the learned helplessness theory, stress theory, and a modified stress theory to determine the best model for predicting the probability that a woman would seek help when she experienced severe violence from a male partner. The probability was hypothesized to increase as the stress of the violence experienced increased. Data were…

  7. Beable-guided quantum theories: Generalizing quantum probability laws

    NASA Astrophysics Data System (ADS)

    Kent, Adrian

    2013-02-01

    Beable-guided quantum theories (BGQT) are generalizations of quantum theory, inspired by Bell's concept of beables. They modify the quantum probabilities for some specified set of fundamental events, histories, or other elements of quasiclassical reality by probability laws that depend on the realized configuration of beables. For example, they may define an additional probability weight factor for a beable configuration, independent of the quantum dynamics. Beable-guided quantum theories can be fitted to observational data to provide foils against which to compare explanations based on standard quantum theory. For example, a BGQT could, in principle, characterize the effects attributed to dark energy or dark matter, or any other deviation from the predictions of standard quantum dynamics, without introducing extra fields or a cosmological constant. The complexity of the beable-guided theory would then parametrize how far we are from a standard quantum explanation. Less conservatively, we give reasons for taking suitably simple beable-guided quantum theories as serious phenomenological theories in their own right. Among these are the possibility that cosmological models defined by BGQT might in fact fit the empirical data better than any standard quantum explanation, and the fact that BGQT suggest potentially interesting nonstandard ways of coupling quantum matter to gravity.

  8. The Effects of Framing, Reflection, Probability, and Payoff on Risk Preference in Choice Tasks.

    PubMed

    Kühberger; Schulte-Mecklenbeck; Perner

    1999-06-01

    A meta-analysis of Asian-disease-like studies is presented to identify the factors which determine risk preference. First the confoundings between probability levels, payoffs, and framing conditions are clarified in a task analysis. Then the role of framing, reflection, probability, type, and size of payoff is evaluated in a meta-analysis. It is shown that bidirectional framing effects exist for gains and for losses. Presenting outcomes as gains tends to induce risk aversion, while presenting outcomes as losses tends to induce risk seeking. Risk preference is also shown to depend on the size of the payoffs, on the probability levels, and on the type of good at stake (money/property vs human lives). In general, higher payoffs lead to increasing risk aversion. Higher probabilities lead to increasing risk aversion for gains and to increasing risk seeking for losses. These findings are confirmed by a subsequent empirical test. Shortcomings of existing formal theories, such as prospect theory, cumulative prospect theory, venture theory, and Markowitz's utility theory, are identified. It is shown that it is not probabilities or payoffs, but the framing condition, which explains most variance. These findings are interpreted as showing that no linear combination of formally relevant predictors is sufficient to capture the essence of the framing phenomenon. Copyright 1999 Academic Press.

  9. The Probabilities of Unique Events

    PubMed Central

    Khemlani, Sangeet S.; Lotstein, Max; Johnson-Laird, Phil

    2012-01-01

    Many theorists argue that the probabilities of unique events, even real possibilities such as President Obama's re-election, are meaningless. As a consequence, psychologists have seldom investigated them. We propose a new theory (implemented in a computer program) in which such estimates depend on an intuitive non-numerical system capable only of simple procedures, and a deliberative system that maps intuitions into numbers. The theory predicts that estimates of the probabilities of conjunctions should often tend to split the difference between the probabilities of the two conjuncts. We report two experiments showing that individuals commit such violations of the probability calculus, and corroborating other predictions of the theory, e.g., individuals err in the same way even when they make non-numerical verbal estimates, such as that an event is highly improbable. PMID:23056224

  10. The place of probability in Hilbert's axiomatization of physics, ca. 1900-1928

    NASA Astrophysics Data System (ADS)

    Verburgt, Lukas M.

    2016-02-01

    Although it has become commonplace to refer to the 'sixth problem' of Hilbert's (1900) Paris lecture as the starting point for modern axiomatized probability theory, his own views on probability have received comparatively little explicit attention. The central aim of this paper is to provide a detailed account of this topic in light of the central observation that the development of Hilbert's project of the axiomatization of physics went hand-in-hand with a redefinition of the status of probability theory and the meaning of probability. Where Hilbert first regarded the theory as a mathematizable physical discipline and later approached it as a 'vague' mathematical application in physics, he eventually understood probability, first, as a feature of human thought and, then, as an implicitly defined concept without a fixed physical interpretation. It thus becomes possible to suggest that Hilbert came to question, from the early 1920s on, the very possibility of achieving the goal of the axiomatization of probability as described in the 'sixth problem' of 1900.

  11. Probability and Quantum Paradigms: the Interplay

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kracklauer, A. F.

    Since the introduction of Born's interpretation of quantum wave functions as yielding the probability density of presence, Quantum Theory and Probability have lived in a troubled symbiosis. Problems arise with this interpretation because quantum probabilities exhibit features alien to usual probabilities, namely non-Boolean structure and non-positive-definite phase space probability densities. This has inspired research into both elaborate formulations of Probability Theory and alternate interpretations for wave functions. Herein the latter tactic is taken, and a suggested variant interpretation of wave functions based on photo detection physics is proposed; some empirical consequences are considered. Although incomplete in a few details, this variant is appealing in its reliance on well-tested concepts and technology.

  12. Probability and Quantum Paradigms: the Interplay

    NASA Astrophysics Data System (ADS)

    Kracklauer, A. F.

    2007-12-01

    Since the introduction of Born's interpretation of quantum wave functions as yielding the probability density of presence, Quantum Theory and Probability have lived in a troubled symbiosis. Problems arise with this interpretation because quantum probabilities exhibit features alien to usual probabilities, namely non-Boolean structure and non-positive-definite phase space probability densities. This has inspired research into both elaborate formulations of Probability Theory and alternate interpretations for wave functions. Herein the latter tactic is taken, and a suggested variant interpretation of wave functions based on photo detection physics is proposed; some empirical consequences are considered. Although incomplete in a few details, this variant is appealing in its reliance on well-tested concepts and technology.

  13. A Short History of Probability Theory and Its Applications

    ERIC Educational Resources Information Center

    Debnath, Lokenath; Basu, Kanadpriya

    2015-01-01

    This paper deals with a brief history of probability theory and its applications to Jacob Bernoulli's famous law of large numbers and theory of errors in observations or measurements. Included are the major contributions of Jacob Bernoulli and Laplace. It is written to pay the tricentennial tribute to Jacob Bernoulli, since the year 2013…

  14. What Can Quantum Optics Say about Computational Complexity Theory?

    NASA Astrophysics Data System (ADS)

    Rahimi-Keshari, Saleh; Lund, Austin P.; Ralph, Timothy C.

    2015-02-01

    Considering the problem of sampling from the output photon-counting probability distribution of a linear-optical network for input Gaussian states, we obtain results that are of interest from the points of view of both quantum theory and computational complexity theory. We derive a general formula for calculating the output probabilities, and by considering input thermal states, we show that the output probabilities are proportional to permanents of positive-semidefinite Hermitian matrices. It is believed that approximating permanents of complex matrices in general is a #P-hard problem. However, we show that these permanents can be approximated with an algorithm in the BPP^NP complexity class, as there exists an efficient classical algorithm for sampling from the output probability distribution. We further consider input squeezed-vacuum states and discuss the complexity of sampling from the probability distribution at the output.

  15. Statistical Inference in Graphical Models

    DTIC Science & Technology

    2008-06-17

    fuse probability theory and graph theory in such a way as to permit efficient representation and computation with probability distributions. They...message passing. In approaching real-world problems, we often need to deal with uncertainty. Probability and statistics provide a...dynamic programming methods. However, for many sensors of interest, the signal-to-noise ratio does not allow such a treatment. Another source of

  16. The Probabilities of Unique Events

    DTIC Science & Technology

    2012-08-30

    social justice and also participated in antinuclear demonstrations. The participants ranked the probability that Linda is a feminist bank teller as...investigated them. We propose a new theory (implemented in a computer program) in which such estimates depend on an intuitive non-numerical system capable only...of simple procedures, and a deliberative system that maps intuitions into numbers. The theory predicts that estimates of the probabilities of

  17. Neural response to reward anticipation under risk is nonlinear in probabilities.

    PubMed

    Hsu, Ming; Krajbich, Ian; Zhao, Chen; Camerer, Colin F

    2009-02-18

    A widely observed phenomenon in decision making under risk is the apparent overweighting of unlikely events and the underweighting of nearly certain events. This violates standard assumptions in expected utility theory, which requires that expected utility be linear (objective) in probabilities. Models such as prospect theory have relaxed this assumption and introduced the notion of a "probability weighting function," which captures the key properties found in experimental data. This study reports functional magnetic resonance imaging (fMRI) data showing that the neural response to expected reward is nonlinear in probabilities. Specifically, we found that activity in the striatum during valuation of monetary gambles is nonlinear in probabilities in the pattern predicted by prospect theory, suggesting that probability distortion is reflected at the level of the reward encoding process. The degree of nonlinearity reflected in individual subjects' decisions is also correlated with striatal activity across subjects. Our results shed light on the neural mechanisms of reward processing and have implications for future neuroscientific studies of decision making involving extreme tails of the distribution, where probability weighting provides an explanation for commonly observed behavioral anomalies.

  18. Does a better model yield a better argument? An info-gap analysis

    NASA Astrophysics Data System (ADS)

    Ben-Haim, Yakov

    2017-04-01

    Theories, models and computations underlie reasoned argumentation in many areas. The possibility of error in these arguments, though of low probability, may be highly significant when the argument is used in predicting the probability of rare high-consequence events. This implies that the choice of a theory, model or computational method for predicting rare high-consequence events must account for the probability of error in these components. However, error may result from lack of knowledge or surprises of various sorts, and predicting the probability of error is highly uncertain. We show that the putatively best, most innovative and sophisticated argument may not actually have the lowest probability of error. Innovative arguments may entail greater uncertainty than more standard but less sophisticated methods, creating an innovation dilemma in formulating the argument. We employ info-gap decision theory to characterize and support the resolution of this problem and present several examples.

  19. Estimation of the lower and upper bounds on the probability of failure using subset simulation and random set theory

    NASA Astrophysics Data System (ADS)

    Alvarez, Diego A.; Uribe, Felipe; Hurtado, Jorge E.

    2018-02-01

    Random set theory is a general framework which comprises uncertainty in the form of probability boxes, possibility distributions, cumulative distribution functions, Dempster-Shafer structures or intervals; in addition, the dependence between the input variables can be expressed using copulas. In this paper, the lower and upper bounds on the probability of failure are calculated by means of random set theory. In order to accelerate the calculation, a well-known and efficient probability-based reliability method known as subset simulation is employed. This method is especially useful for finding small failure probabilities in both low- and high-dimensional spaces, disjoint failure domains and nonlinear limit state functions. The proposed methodology represents a drastic reduction of the computational labor implied by plain Monte Carlo simulation for problems defined with a mixture of representations for the input variables, while delivering similar results. Numerical examples illustrate the efficiency of the proposed approach.
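
    A plain Monte Carlo version of the bounding idea (without the subset-simulation acceleration described in the paper) can be sketched as follows: each sampled focal element is an interval box, and it counts toward the upper bound (plausibility) if any point of the box fails and toward the lower bound (belief) only if every point fails. The limit-state function, the interval construction and the grid search over each box are all illustrative assumptions.

      import random

      def g(x1, x2):
          return 5.0 - x1 - x2          # hypothetical limit state; failure when g <= 0

      def box_extremes(lo1, hi1, lo2, hi2, k=5):
          """Approximate min and max of g over a box using a k-by-k grid."""
          vals = [g(lo1 + i * (hi1 - lo1) / (k - 1), lo2 + j * (hi2 - lo2) / (k - 1))
                  for i in range(k) for j in range(k)]
          return min(vals), max(vals)

      random.seed(0)
      N, bel, pl = 20000, 0, 0
      for _ in range(N):
          # Sample an interval focal element around an aleatory draw (illustrative construction).
          c1, c2 = random.gauss(2.0, 1.0), random.gauss(2.0, 1.0)
          gmin, gmax = box_extremes(c1 - 0.5, c1 + 0.5, c2 - 0.5, c2 + 0.5)
          bel += gmax <= 0              # whole box fails: contributes to the lower bound
          pl += gmin <= 0               # some point of the box fails: contributes to the upper bound
      print(bel / N, pl / N)            # lower and upper bounds on the failure probability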

  20. [Biometric bases: basic concepts of probability calculation].

    PubMed

    Dinya, E

    1998-04-26

    The author gives an outline of the basic concepts of probability theory. The basics of event algebra, the definition of probability, the classical probability model, and the random variable are presented.

  1. Decision making generalized by a cumulative probability weighting function

    NASA Astrophysics Data System (ADS)

    dos Santos, Lindomar Soares; Destefano, Natália; Martinez, Alexandre Souto

    2018-01-01

    Typical examples of intertemporal decision making involve situations in which individuals must choose between a smaller but more immediate reward and a larger one delivered later. Analogously, probabilistic decision making involves choices between options whose consequences differ in their probability of being received. In economics, expected utility theory (EUT) and discounted utility theory (DUT) are the traditionally accepted normative models for describing, respectively, probabilistic and intertemporal decision making. A large number of experiments have confirmed that the linearity assumed by the EUT does not explain some observed behaviors, such as nonlinear preference, risk-seeking and loss aversion. That observation led to the development of new theoretical models, called non-expected utility theories (NEUT), which include a nonlinear transformation of the probability scale. An essential feature of the so-called preference function of these theories is that the probabilities are transformed into decision weights by means of a (cumulative) probability weighting function, w(p). In this article we obtain a generalized function for the probabilistic discount process. This function has as particular cases mathematical forms already consecrated in the literature, including discount models that consider effects of psychophysical perception. We also propose a new generalized function for the functional form of w. The limiting cases of this function encompass some parametric forms already proposed in the literature. Far beyond a mere generalization, our function allows the interpretation of probabilistic decision-making theories based on the assumption that individuals behave similarly in the face of probabilities and delays, and is supported by phenomenological models.

  2. Probability in reasoning: a developmental test on conditionals.

    PubMed

    Barrouillet, Pierre; Gauffroy, Caroline

    2015-04-01

    Probabilistic theories have been claimed to constitute a new paradigm for the psychology of reasoning. A key assumption of these theories is captured by what they call the Equation, the hypothesis that the meaning of the conditional is probabilistic in nature and that the probability of If p then q is the conditional probability, in such a way that P(if p then q)=P(q|p). Using the probabilistic truth-table task in which participants are required to evaluate the probability of If p then q sentences, the present study explored the pervasiveness of the Equation through ages (from early adolescence to adulthood), types of conditionals (basic, causal, and inducements) and contents. The results reveal that the Equation is a late developmental achievement only endorsed by a narrow majority of educated adults for certain types of conditionals depending on the content they involve. Age-related changes in evaluating the probability of all the conditionals studied closely mirror the development of truth-value judgements observed in previous studies with traditional truth-table tasks. We argue that our modified mental model theory can account for this development, and hence for the findings related with the probability task, which do not consequently support the probabilistic approach of human reasoning over alternative theories. Copyright © 2014 Elsevier B.V. All rights reserved.

  3. The Priority Heuristic: Making Choices Without Trade-Offs

    PubMed Central

    Brandstätter, Eduard; Gigerenzer, Gerd; Hertwig, Ralph

    2010-01-01

    Bernoulli's framework of expected utility serves as a model for various psychological processes, including motivation, moral sense, attitudes, and decision making. To account for evidence at variance with expected utility, we generalize the framework of fast and frugal heuristics from inferences to preferences. The priority heuristic predicts (i) Allais' paradox, (ii) risk aversion for gains if probabilities are high, (iii) risk seeking for gains if probabilities are low (lottery tickets), (iv) risk aversion for losses if probabilities are low (buying insurance), (v) risk seeking for losses if probabilities are high, (vi) certainty effect, (vii) possibility effect, and (viii) intransitivities. We test how accurately the heuristic predicts people's choices, compared to previously proposed heuristics and three modifications of expected utility theory: security-potential/aspiration theory, transfer-of-attention-exchange model, and cumulative prospect theory. PMID:16637767

  4. The force distribution probability function for simple fluids by density functional theory.

    PubMed

    Rickayzen, G; Heyes, D M

    2013-02-28

    Classical density functional theory (DFT) is used to derive a formula for the probability density distribution function, P(F), and the probability distribution function, W(F), for simple fluids, where F is the net force on a particle. The final formula is P(F) ∝ exp(-A F^2), where A depends on the fluid density, the temperature, and the Fourier transform of the pair potential. The form of DFT used is only applicable to bounded-potential fluids. When combined with the hypernetted chain closure of the Ornstein-Zernike equation, the DFT theory for W(F) agrees with molecular dynamics computer simulations for the Gaussian and bounded soft-sphere fluids at high density. The Gaussian form for P(F) is still accurate at lower densities (but not too low a density) for the two potentials, but with a smaller value for the constant A than that predicted by the DFT theory.
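
    On one possible reading (an assumption on my part; the abstract does not spell this out), F is the magnitude of the net force on a particle in three dimensions, in which case the normalized density and the corresponding distribution of the force magnitude would be

      P(\mathbf{F}) = \left(\frac{A}{\pi}\right)^{3/2} e^{-A F^{2}},
      \qquad
      W(F) = 4\pi F^{2} \, P(\mathbf{F}).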

  5. Prospect theory on the brain? Toward a cognitive neuroscience of decision under risk.

    PubMed

    Trepel, Christopher; Fox, Craig R; Poldrack, Russell A

    2005-04-01

    Most decisions must be made without advance knowledge of their consequences. Economists and psychologists have devoted much attention to modeling decisions made under conditions of risk, in which options can be characterized by a known probability distribution over possible outcomes. The descriptive shortcomings of classical economic models motivated the development of prospect theory (D. Kahneman, A. Tversky, Prospect theory: An analysis of decision under risk. Econometrica, 47 (1979) 263-291; A. Tversky, D. Kahneman, Advances in prospect theory: Cumulative representation of uncertainty. Journal of Risk and Uncertainty, 5 (4) (1992) 297-323), the most successful behavioral model of decision under risk. In prospect theory, subjective value is modeled by a value function that is concave for gains, convex for losses, and steeper for losses than for gains; the impact of probabilities is characterized by a weighting function that overweights low probabilities and underweights moderate to high probabilities. We outline the possible neural bases of the components of prospect theory, surveying evidence from human imaging, lesion, and neuropharmacology studies as well as animal neurophysiology studies. These results provide preliminary suggestions concerning the neural bases of prospect theory that include a broad set of brain regions and neuromodulatory systems. These data suggest that focused studies of decision making in the context of quantitative models may provide substantial leverage towards a fuller understanding of the cognitive neuroscience of decision making.
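
    The functional forms most commonly attached to these two components are the Tversky-Kahneman (1992) parametric ones, sketched below; the abstract does not commit to a specific parametrization, so the forms and parameter values here are a common convention rather than the authors' choice.

      def value(x, alpha=0.88, beta=0.88, lam=2.25):
          """Value function: concave for gains, convex and steeper (loss-averse) for losses."""
          return x**alpha if x >= 0 else -lam * ((-x) ** beta)

      def weight(p, gamma=0.61):
          """Weighting function: overweights low and underweights moderate-to-high probabilities."""
          return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

      # Subjective value of a hypothetical mixed gamble: win 100 with probability 0.05, else lose 50.
      # (Simplification: the same weighting function is applied to the gain and the loss branch.)
      print(weight(0.05) * value(100) + weight(0.95) * value(-50))   # negative: unattractive here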

  6. Dangerous "spin": the probability myth of evidence-based prescribing - a Merleau-Pontyian approach.

    PubMed

    Morstyn, Ron

    2011-08-01

    The aim of this study was to examine the logical positivist statistical probability statements used to support and justify "evidence-based" prescribing rules in psychiatry, as viewed from the major philosophical theories of probability, and to propose "phenomenological probability", based on Maurice Merleau-Ponty's philosophy of "phenomenological positivism", as a better clinical and ethical basis for psychiatric prescribing. The logical positivist statistical probability statements which are currently used to support "evidence-based" prescribing rules in psychiatry have little clinical or ethical justification when subjected to critical analysis from any of the major theories of probability, and they represent dangerous "spin" because they necessarily exclude the individual, intersubjective and ambiguous meaning of mental illness. A concept of "phenomenological probability" founded on Merleau-Ponty's philosophy of "phenomenological positivism" overcomes the clinically destructive "objectivist" and "subjectivist" consequences of logical positivist statistical probability and allows psychopharmacological treatments to be appropriately integrated into psychiatric treatment.

  7. Independent events in elementary probability theory

    NASA Astrophysics Data System (ADS)

    Csenki, Attila

    2011-07-01

    In Probability and Statistics taught to mathematicians as a first introduction or to a non-mathematical audience, joint independence of events is introduced by requiring that the multiplication rule is satisfied. The following statement is usually tacitly assumed to hold (and, at best, intuitively motivated): If the n events E_1, E_2, …, E_n are jointly independent, then any two events A and B built in finitely many steps from two disjoint subsets of E_1, E_2, …, E_n are also independent. The operations 'union', 'intersection' and 'complementation' are permitted only when forming the events A and B. Here we examine this statement from the point of view of elementary probability theory. The approach described here is accessible also to users of probability theory and is believed to be novel.
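
    A quick numerical sanity check of the quoted statement, with two events built from disjoint subsets of four jointly independent events; the particular events A and B and the probabilities are arbitrary illustrative choices.

      import random

      random.seed(0)
      N = 200000
      countA = countB = countAB = 0
      for _ in range(N):
          # Four jointly independent events with assumed probabilities.
          E1, E2, E3, E4 = (random.random() < p for p in (0.3, 0.6, 0.5, 0.2))
          A = E1 or not E2              # built from {E1, E2}
          B = E3 and not E4             # built from the disjoint subset {E3, E4}
          countA += A
          countB += B
          countAB += A and B
      print(countAB / N, (countA / N) * (countB / N))   # the two numbers agree closely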

  8. Large Deviations: Advanced Probability for Undergrads

    ERIC Educational Resources Information Center

    Rolls, David A.

    2007-01-01

    In the branch of probability called "large deviations," rates of convergence (e.g. of the sample mean) are considered. The theory makes use of the moment generating function. So, particularly for sums of independent and identically distributed random variables, the theory can be made accessible to senior undergraduates after a first course in…

  9. [Determinants of pride and shame: outcome, expected success and attribution].

    PubMed

    Schützwohl, A

    1991-01-01

    In two experiments we investigated the relationship between subjective probability of success and pride and shame. According to Atkinson (1957), pride (the incentive of success) is an inverse linear function of the probability of success, shame (the incentive of failure) being a negative linear function. Attribution theory predicts an inverse U-shaped relationship between subjective probability of success and pride and shame. The results presented here are at variance with both theories: Pride and shame do not vary with subjective probability of success. However, pride and shame are systematically correlated with internal attributions of action outcome.

  10. Naive Probability: Model-Based Estimates of Unique Events.

    PubMed

    Khemlani, Sangeet S; Lotstein, Max; Johnson-Laird, Philip N

    2015-08-01

    We describe a dual-process theory of how individuals estimate the probabilities of unique events, such as Hillary Clinton becoming U.S. President. It postulates that uncertainty is a guide to improbability. In its computer implementation, an intuitive system 1 simulates evidence in mental models and forms analog non-numerical representations of the magnitude of degrees of belief. This system has minimal computational power and combines evidence using a small repertoire of primitive operations. It resolves the uncertainty of divergent evidence for single events, for conjunctions of events, and for inclusive disjunctions of events, by taking a primitive average of non-numerical probabilities. It computes conditional probabilities in a tractable way, treating the given event as evidence that may be relevant to the probability of the dependent event. A deliberative system 2 maps the resulting representations into numerical probabilities. With access to working memory, it carries out arithmetical operations in combining numerical estimates. Experiments corroborated the theory's predictions. Participants concurred in estimates of real possibilities. They violated the complete joint probability distribution in the predicted ways, when they made estimates about conjunctions: P(A), P(B), P(A and B), disjunctions: P(A), P(B), P(A or B or both), and conditional probabilities P(A), P(B), P(B|A). They were faster to estimate the probabilities of compound propositions when they had already estimated the probabilities of each of their components. We discuss the implications of these results for theories of probabilistic reasoning. © 2014 Cognitive Science Society, Inc.

  11. A quantum probability explanation for violations of ‘rational’ decision theory

    PubMed Central

    Pothos, Emmanuel M.; Busemeyer, Jerome R.

    2009-01-01

    Two experimental tasks in psychology, the two-stage gambling game and the Prisoner's Dilemma game, show that people violate the sure thing principle of decision theory. These paradoxical findings have resisted explanation by classical decision theory for over a decade. A quantum probability model, based on a Hilbert space representation and Schrödinger's equation, provides a simple and elegant explanation for this behaviour. The quantum model is compared with an equivalent Markov model and it is shown that the latter is unable to account for violations of the sure thing principle. Accordingly, it is argued that quantum probability provides a better framework for modelling human decision-making. PMID:19324743

  12. A Tale of Two Probabilities

    ERIC Educational Resources Information Center

    Falk, Ruma; Kendig, Keith

    2013-01-01

    Two contestants debate the notorious probability problem of the sex of the second child. The conclusions boil down to explication of the underlying scenarios and assumptions. Basic principles of probability theory are highlighted.

  13. Unification of field theory and maximum entropy methods for learning probability densities

    NASA Astrophysics Data System (ADS)

    Kinney, Justin B.

    2015-09-01

    The need to estimate smooth probability distributions (a.k.a. probability densities) from finite sampled data is ubiquitous in science. Many approaches to this problem have been described, but none is yet regarded as providing a definitive solution. Maximum entropy estimation and Bayesian field theory are two such approaches. Both have origins in statistical physics, but the relationship between them has remained unclear. Here I unify these two methods by showing that every maximum entropy density estimate can be recovered in the infinite smoothness limit of an appropriate Bayesian field theory. I also show that Bayesian field theory estimation can be performed without imposing any boundary conditions on candidate densities, and that the infinite smoothness limit of these theories recovers the most common types of maximum entropy estimates. Bayesian field theory thus provides a natural test of the maximum entropy null hypothesis and, furthermore, returns an alternative (lower entropy) density estimate when the maximum entropy hypothesis is falsified. The computations necessary for this approach can be performed rapidly for one-dimensional data, and software for doing this is provided.

  14. Unification of field theory and maximum entropy methods for learning probability densities.

    PubMed

    Kinney, Justin B

    2015-09-01

    The need to estimate smooth probability distributions (a.k.a. probability densities) from finite sampled data is ubiquitous in science. Many approaches to this problem have been described, but none is yet regarded as providing a definitive solution. Maximum entropy estimation and Bayesian field theory are two such approaches. Both have origins in statistical physics, but the relationship between them has remained unclear. Here I unify these two methods by showing that every maximum entropy density estimate can be recovered in the infinite smoothness limit of an appropriate Bayesian field theory. I also show that Bayesian field theory estimation can be performed without imposing any boundary conditions on candidate densities, and that the infinite smoothness limit of these theories recovers the most common types of maximum entropy estimates. Bayesian field theory thus provides a natural test of the maximum entropy null hypothesis and, furthermore, returns an alternative (lower entropy) density estimate when the maximum entropy hypothesis is falsified. The computations necessary for this approach can be performed rapidly for one-dimensional data, and software for doing this is provided.

  15. Probabilities and Possibilities: The Strategic Counseling Implications of the Chaos Theory of Careers

    ERIC Educational Resources Information Center

    Pryor, Robert G. L.; Amundson, Norman E.; Bright, Jim E. H.

    2008-01-01

    The chaos theory of careers emphasizes both stability and change in its account of career development. This article outlines counseling strategies derived from this emphasis in terms of convergent or probability thinking and emergent or possibility thinking. These 2 perspectives are characterized, and practical counseling strategy implications are…

  16. Lattice Theory, Measures and Probability

    NASA Astrophysics Data System (ADS)

    Knuth, Kevin H.

    2007-11-01

    In this tutorial, I will discuss the concepts behind generalizing ordering to measuring and apply these ideas to the derivation of probability theory. The fundamental concept is that anything that can be ordered can be measured. Since we are in the business of making statements about the world around us, we focus on ordering logical statements according to implication. This results in a Boolean lattice, which is related to the fact that the corresponding logical operations form a Boolean algebra. The concept of logical implication can be generalized to degrees of implication by generalizing the zeta function of the lattice. The rules of probability theory arise naturally as a set of constraint equations. Through this construction we are able to neatly connect the concepts of order, structure, algebra, and calculus. The meaning of probability is inherited from the meaning of the ordering relation, implication, rather than being imposed in an ad hoc manner at the start.

  17. Algorithmic universality in F-theory compactifications

    NASA Astrophysics Data System (ADS)

    Halverson, James; Long, Cody; Sung, Benjamin

    2017-12-01

    We study universality of geometric gauge sectors in the string landscape in the context of F-theory compactifications. A finite time construction algorithm is presented for 4/3 × 2.96 × 10^755 F-theory geometries that are connected by a network of topological transitions in a connected moduli space. High probability geometric assumptions uncover universal structures in the ensemble without explicitly constructing it. For example, non-Higgsable clusters of seven-branes with intricate gauge sectors occur with a probability above 1 - 1.01 × 10^-755, and the geometric gauge group rank is above 160 with probability 0.999995. In the latter case there are at least 10 E8 factors, the structure of which fixes the gauge groups on certain nearby seven-branes. Visible sectors may arise from E6 or SU(3) seven-branes, which occur in certain random samples with probability ≃ 1/200.

  18. The priority heuristic: making choices without trade-offs.

    PubMed

    Brandstätter, Eduard; Gigerenzer, Gerd; Hertwig, Ralph

    2006-04-01

    Bernoulli's framework of expected utility serves as a model for various psychological processes, including motivation, moral sense, attitudes, and decision making. To account for evidence at variance with expected utility, the authors generalize the framework of fast and frugal heuristics from inferences to preferences. The priority heuristic predicts (a) the Allais paradox, (b) risk aversion for gains if probabilities are high, (c) risk seeking for gains if probabilities are low (e.g., lottery tickets), (d) risk aversion for losses if probabilities are low (e.g., buying insurance), (e) risk seeking for losses if probabilities are high, (f) the certainty effect, (g) the possibility effect, and (h) intransitivities. The authors test how accurately the heuristic predicts people's choices, compared with previously proposed heuristics and 3 modifications of expected utility theory: security-potential/aspiration theory, transfer-of-attention-exchange model, and cumulative prospect theory. ((c) 2006 APA, all rights reserved).

  19. Quantum mechanics: The Bayesian theory generalized to the space of Hermitian matrices

    NASA Astrophysics Data System (ADS)

    Benavoli, Alessio; Facchini, Alessandro; Zaffalon, Marco

    2016-10-01

    We consider the problem of gambling on a quantum experiment and enforce rational behavior by a few rules. These rules yield, in the classical case, the Bayesian theory of probability via duality theorems. In our quantum setting, they yield the Bayesian theory generalized to the space of Hermitian matrices. This very theory is quantum mechanics: in fact, we derive all its four postulates from the generalized Bayesian theory. This implies that quantum mechanics is self-consistent. It also leads us to reinterpret the main operations in quantum mechanics as probability rules: Bayes' rule (measurement), marginalization (partial tracing), independence (tensor product). To say it with a slogan, we obtain that quantum mechanics is the Bayesian theory in the complex numbers.

  20. On computational Gestalt detection thresholds.

    PubMed

    Grompone von Gioi, Rafael; Jakubowicz, Jérémie

    2009-01-01

    The aim of this paper is to show some recent developments of computational Gestalt theory, as pioneered by Desolneux, Moisan and Morel. The new results make it possible to predict the detection thresholds much more accurately. This step is unavoidable if one wants to analyze visual detection thresholds in the light of computational Gestalt theory. The paper first recalls the main elements of computational Gestalt theory. It points out a precision issue in this theory, essentially due to the use of discrete probability distributions. It then proposes to overcome this issue by using continuous probability distributions and illustrates the approach on the meaningful alignment detector of Desolneux et al.

  1. Explanation of asymmetric dynamics of human water consumption in arid regions: prospect theory versus expected utility theory

    NASA Astrophysics Data System (ADS)

    Tian, F.; Lu, Y.

    2017-12-01

    Based on socioeconomic and hydrological data from three arid inland basins, together with error analysis, the dynamics of human water consumption (HWC) are shown to be asymmetric: HWC increases rapidly in wet periods but is maintained, or decreases only slightly, in dry periods. Beyond the qualitative explanation that abundant water in wet periods encourages HWC to grow quickly, while in dry periods the now-expanded economy is sustained through over-exploitation, two quantitative models are established and tested, based on expected utility theory (EUT) and prospect theory (PT), respectively. EUT states that humans decide on the basis of total expected utility, i.e., the sum over outcomes of the utility of each outcome multiplied by its probability, whereas PT defines the utility (value) function separately over gains and losses and replaces probabilities with a probability weighting function.
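
    The contrast between the two decision rules can be sketched as follows. The value-function and weighting-function forms, and the parameter values (alpha = 0.88, lambda = 2.25, gamma = 0.61), are the commonly cited Tversky-Kahneman (1992) estimates; they are used here only for illustration, are not the parameters fitted in this study, and the prospect value shown is the simple non-cumulative form.

        def expected_utility(outcomes, probs, u=lambda x: x):
            """EUT: sum over outcomes of utility times probability."""
            return sum(pr * u(x) for x, pr in zip(outcomes, probs))

        def tk_value(x, alpha=0.88, lam=2.25):
            """Prospect-theory value function: concave for gains, steeper for losses."""
            return x ** alpha if x >= 0 else -lam * (-x) ** alpha

        def tk_weight(p, gamma=0.61):
            """Probability weighting function (overweights small probabilities)."""
            return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

        def prospect_value(outcomes, probs):
            """Simple (non-cumulative) prospect value: weighted sum of gains and losses."""
            return sum(tk_weight(pr) * tk_value(x) for x, pr in zip(outcomes, probs))

        # A wet-period water gain and a dry-period shortfall of equal size and probability.
        print(expected_utility([+10, -10], [0.5, 0.5]))  # 0.0: EUT treats them symmetrically
        print(prospect_value([+10, -10], [0.5, 0.5]))    # negative: losses loom larger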

  2. The World According to de Finetti: On de Finetti's Theory of Probability and Its Application to Quantum Mechanics

    NASA Astrophysics Data System (ADS)

    Berkovitz, Joseph

    Bruno de Finetti is one of the founding fathers of the subjectivist school of probability, where probabilities are interpreted as rational degrees of belief. His work on the relation between the theorems of probability and rationality is among the cornerstones of modern subjective probability theory. De Finetti maintained that rationality requires that degrees of belief be coherent, and he argued that the whole of probability theory could be derived from these coherence conditions. De Finetti's interpretation of probability has been highly influential in science. This paper focuses on the application of this interpretation to quantum mechanics. We argue that de Finetti held that the coherence conditions of degrees of belief in events depend on their verifiability. Accordingly, the standard coherence conditions of degrees of belief that are familiar from the literature on subjective probability only apply to degrees of belief in events which could (in principle) be jointly verified; and the coherence conditions of degrees of belief in events that cannot be jointly verified are weaker. While the most obvious explanation of de Finetti's verificationism is the influence of positivism, we argue that it could be motivated by the radical subjectivist and instrumental nature of probability in his interpretation; for as it turns out, in this interpretation it is difficult to make sense of the idea of coherent degrees of belief in, and accordingly probabilities of, unverifiable events. We then consider the application of this interpretation to quantum mechanics, concentrating on the Einstein-Podolsky-Rosen experiment and Bell's theorem.

  3. Toward an Objectivistic Theory of Probability

    DTIC Science & Technology

    1956-01-01

    …three potential acts: the individual may choose an apple, an orange or a banana. Each of these acts corresponds to a point… its veneer having begun to peel at one corner, etc., etc. Its future there-ness lies in that it may have its legs gnawed at by the new puppy in the…

  4. Models of multidimensional discrete distribution of probabilities of random variables in information systems

    NASA Astrophysics Data System (ADS)

    Gromov, Yu Yu; Minin, Yu V.; Ivanova, O. G.; Morozova, O. N.

    2018-03-01

    Multidimensional discrete probability distributions of independent random variables were obtained. Their one-dimensional distributions are widely used in probability theory. Generating functions of these multidimensional distributions were also obtained.

  5. How Inhomogeneous Site Percolation Works on Bethe Lattices: Theory and Application

    NASA Astrophysics Data System (ADS)

    Ren, Jingli; Zhang, Liying; Siegmund, Stefan

    2016-03-01

    Because of its closer relationship to real-life systems, inhomogeneous percolation can be more useful and reasonable than homogeneous percolation for illustrating the critical phenomena and dynamical behaviour of complex networks. However, due to its intricacy, the theoretical framework of inhomogeneous percolation is far from complete and many challenging problems remain open. In this paper, we first investigate inhomogeneous site percolation on Bethe lattices with two occupation probabilities, and then extend the result to percolation with m occupation probabilities. The critical behaviour of this inhomogeneous percolation is shown clearly by formulating the percolation probability for a given occupation probability p, the critical occupation probability, and the average cluster size. Moreover, using the above theory, we discuss in detail the diffusion behaviour of an infectious disease (SARS) and present specific disease-control strategies that take into account groups with different infection probabilities.
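
    As background for the homogeneous case only (the paper's inhomogeneous formulas are not reproduced here), the classical site-percolation threshold on a Bethe lattice of coordination number z is p_c = 1/(z - 1), and the percolation probability follows from a standard fixed-point equation. The sketch below assumes this textbook treatment.

        def percolation_probability(p: float, z: int, iters: int = 10_000) -> float:
            """Homogeneous site percolation on a Bethe lattice with coordination number z.
            Q, the probability that one branch fails to reach infinity, is the stable fixed
            point of Q = 1 - p + p * Q**(z - 1); the origin percolates with probability
            p * (1 - Q**z)."""
            q = 0.0
            for _ in range(iters):
                q = 1.0 - p + p * q ** (z - 1)
            return p * (1.0 - q ** z)

        z = 3
        p_c = 1.0 / (z - 1)   # classical threshold: 0.5 for coordination number 3
        print(p_c, percolation_probability(0.4, z), percolation_probability(0.6, z))
        # below p_c the percolation probability vanishes; above p_c it is strictly positive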

  6. Non-Kolmogorovian Approach to the Context-Dependent Systems Breaking the Classical Probability Law

    NASA Astrophysics Data System (ADS)

    Asano, Masanari; Basieva, Irina; Khrennikov, Andrei; Ohya, Masanori; Yamato, Ichiro

    2013-07-01

    There exist several phenomena breaking the classical probability laws. The systems related to such phenomena are context-dependent, so that they are adaptive to other systems. In this paper, we present a new mathematical formalism to compute the joint probability distribution for two event-systems by using concepts of the adaptive dynamics and quantum information theory, e.g., quantum channels and liftings. In physics the basic example of the context-dependent phenomena is the famous double-slit experiment. Recently similar examples have been found in biological and psychological sciences. Our approach is an extension of traditional quantum probability theory, and it is general enough to describe aforementioned contextual phenomena outside of quantum physics.

  7. Factorization of Observables

    NASA Astrophysics Data System (ADS)

    Eliaš, Peter; Frič, Roman

    2017-12-01

    Categorical approach to probability leads to better understanding of basic notions and constructions in generalized (fuzzy, operational, quantum) probability, where observables—dual notions to generalized random variables (statistical maps)—play a major role. First, to avoid inconsistencies, we introduce three categories L, S, and P, the objects and morphisms of which correspond to basic notions of fuzzy probability theory and operational probability theory, and describe their relationships. To illustrate the advantages of categorical approach, we show that two categorical constructions involving observables (related to the representation of generalized random variables via products, or smearing of sharp observables, respectively) can be described as factorizing a morphism into composition of two morphisms having desired properties. We close with a remark concerning products.

  8. Reweighting Data in the Spirit of Tukey: Using Bayesian Posterior Probabilities as Rasch Residuals for Studying Misfit

    ERIC Educational Resources Information Center

    Dardick, William R.; Mislevy, Robert J.

    2016-01-01

    A new variant of the iterative "data = fit + residual" data-analytical approach described by Mosteller and Tukey is proposed and implemented in the context of item response theory psychometric models. Posterior probabilities from a Bayesian mixture model of a Rasch item response theory model and an unscalable latent class are expressed…

  9. The Theory of Probable Cause and Searches of Innocent Persons: The Fourth Amendment and "Stanford Daily."

    ERIC Educational Resources Information Center

    Morrison, David

    1978-01-01

    Develops three views of the theory underlying the probable cause requirement of the Fourth Amendment. The advantage of the approach based on the rights prohibition of violent intrusions is that it provides standards based on the individual conduct rather than on governmental acts. Available from UCLA Law Review, 405 Hilgard Avenue, Los Angeles,…

  10. Ergodic Theory, Interpretations of Probability and the Foundations of Statistical Mechanics

    NASA Astrophysics Data System (ADS)

    van Lith, Janneke

    The traditional use of ergodic theory in the foundations of equilibrium statistical mechanics is that it provides a link between thermodynamic observables and microcanonical probabilities. First of all, the ergodic theorem demonstrates the equality of microcanonical phase averages and infinite time averages (albeit for a special class of systems, and up to a measure zero set of exceptions). Secondly, one argues that actual measurements of thermodynamic quantities yield time averaged quantities, since measurements take a long time. The combination of these two points is held to be an explanation why calculating microcanonical phase averages is a successful algorithm for predicting the values of thermodynamic observables. It is also well known that this account is problematic. This survey intends to show that ergodic theory nevertheless may have important roles to play, and it explores three other uses of ergodic theory. Particular attention is paid, firstly, to the relevance of specific interpretations of probability, and secondly, to the way in which the concern with systems in thermal equilibrium is translated into probabilistic language. With respect to the latter point, it is argued that equilibrium should not be represented as a stationary probability distribution as is standardly done; instead, a weaker definition is presented.

  11. Four-dimensional symmetry from a broad viewpoint. II Invariant distribution of quantized field oscillators and questions on infinities

    NASA Technical Reports Server (NTRS)

    Hsu, J. P.

    1983-01-01

    The foundation of the quantum field theory is changed by introducing a new universal probability principle into field operators: one single inherent and invariant probability distribution P(/k/) is postulated for boson and fermion field oscillators. This can be accomplished only when one treats the four-dimensional symmetry from a broad viewpoint. Special relativity is too restrictive to allow such a universal probability principle. A radical length, R, appears in physics through the probability distribution P(/k/). The force between two point particles vanishes when their relative distance tends to zero. This appears to be a general property for all forces and resembles the property of asymptotic freedom. The usual infinities in vacuum fluctuations and in local interactions, however complicated they may be, are all removed from quantum field theories. In appendix A a simple finite and unitary theory of unified electroweak interactions is discussed without assuming Higgs scalar bosons.

  12. The Misapplication of Probability Theory in Quantum Mechanics

    NASA Astrophysics Data System (ADS)

    Racicot, Ronald

    2014-03-01

    This article is a revision of two papers submitted to the APS in the past two and a half years. In these papers, arguments and proofs are summarized for the following: (1) The wrong conclusion by EPR that Quantum Mechanics is incomplete, perhaps requiring the addition of ``hidden variables'' for completion. Theorems that assume such ``hidden variables,'' such as Bell's theorem, are also wrong. (2) Quantum entanglement is not a realizable physical phenomenon and is based entirely on assuming a probability superposition model for quantum spin. Such a model directly violates conservation of angular momentum. (3) Simultaneous multiple-paths followed by a quantum particle traveling through space also cannot possibly exist. Besides violating Noether's theorem, the multiple-paths theory is based solely on probability calculations. Probability calculations by themselves cannot possibly represent simultaneous physically real events. None of the reviews of the submitted papers actually refuted the arguments and evidence that was presented. These analyses should therefore be carefully evaluated since the conclusions reached have such important impact in quantum mechanics and quantum information theory.

  13. Training in Small Business Retailing: Testing Human Capital Theory.

    ERIC Educational Resources Information Center

    Barcala, Marta Fernandez; Perez, Maria Jose Sanzo; Gutierrez, Juan Antonio Trespalacios

    1999-01-01

    Looks at four models of training demand: (1) probability of attending training in the near future; (2) probability of having attended training in the past; (3) probability of being willing to follow multimedia and correspondence courses; and (4) probability of repeating the experience of attending another training course in the near future.…

  14. Introduction to Probability, Part 1 - Basic Concepts. Student Text. Revised Edition.

    ERIC Educational Resources Information Center

    Blakeslee, David W.; And Others

    This book is designed to introduce the reader to some fundamental ideas about probability. The mathematical theory of probability plays an increasingly important role in science, government, industry, business, and economics. An understanding of the basic concepts of probability is essential for the study of statistical methods that are widely…

  15. New paradoxes of risky decision making.

    PubMed

    Birnbaum, Michael H

    2008-04-01

    During the last 25 years, prospect theory and its successor, cumulative prospect theory, replaced expected utility as the dominant descriptive theories of risky decision making. Although these models account for the original Allais paradoxes, 11 new paradoxes show where prospect theories lead to self-contradiction or systematic false predictions. The new findings are consistent with and, in several cases, were predicted in advance by simple "configural weight" models in which probability-consequence branches are weighted by a function that depends on branch probability and ranks of consequences on discrete branches. Although they have some similarities to later models called "rank-dependent utility," configural weight models do not satisfy coalescing, the assumption that branches leading to the same consequence can be combined by adding their probabilities. Nor do they satisfy cancellation, the "independence" assumption that branches common to both alternatives can be removed. The transfer of attention exchange model, with parameters estimated from previous data, correctly predicts results with all 11 new paradoxes. Apparently, people do not frame choices as prospects but, instead, as trees with branches.

  16. The Use of Probability Theory as a Basis for Planning and Controlling Overhead Costs in Education and Industry. Final Report.

    ERIC Educational Resources Information Center

    Vinson, R. B.

    In this report, the author suggests changes in the treatment of overhead costs by hypothesizing that "the effectiveness of standard costing in planning and controlling overhead costs can be increased through the use of probability theory and associated statistical techniques." To test the hypothesis, the author (1) presents an overview of the…

  17. Generalized Quantum Theory of Bianchi IX Cosmologies

    NASA Astrophysics Data System (ADS)

    Craig, David; Hartle, James

    2003-04-01

    We apply sum-over-histories generalized quantum theory to the closed homogeneous minisuperspace Bianchi IX cosmological model. We sketch how the probabilities in decoherent sets of alternative, coarse-grained histories of this model universe are calculated. We consider, in particular, the probabilities for classical evolution in a suitable coarse-graining. For a restricted class of initial conditions and coarse-grainings we exhibit the approximate decoherence of alternative histories in which the universe behaves classically and those in which it does not, illustrating the prediction that these universes will evolve in an approximately classical manner with a probability near unity.

  18. Quantum probability and quantum decision-making.

    PubMed

    Yukalov, V I; Sornette, D

    2016-01-13

    A rigorous general definition of quantum probability is given, which is valid not only for elementary events but also for composite events, for operationally testable measurements as well as for inconclusive measurements, and also for non-commuting observables in addition to commutative observables. Our proposed definition of quantum probability makes it possible to describe quantum measurements and quantum decision-making on the same common mathematical footing. Conditions are formulated for the case when quantum decision theory reduces to its classical counterpart and for the situation where the use of quantum decision theory is necessary. © 2015 The Author(s).

  19. Predicting non-square 2D dice probabilities

    NASA Astrophysics Data System (ADS)

    Pender, G. A. T.; Uhrin, M.

    2014-07-01

    The prediction of the final state probabilities of a general cuboid randomly thrown onto a surface is a problem that naturally arises in the minds of men and women familiar with regular cubic dice and the basic concepts of probability. Indeed, it was considered by Newton in 1664 (Newton 1967 The Mathematical Papers of Isaac Newton vol I (Cambridge: Cambridge University Press) pp 60-1). In this paper we make progress on the 2D problem (which can be realized in 3D by considering a long cuboid, or alternatively a rectangular cross-sectioned dreidel). For the two-dimensional case we suggest that the ratio of the probabilities of landing on each of the two sides is given by \frac{\sqrt{k^2+l^2}-k}{\sqrt{k^2+l^2}-l}\,\frac{\arctan(l/k)}{\arctan(k/l)} where k and l are the lengths of the two sides. We test this theory both experimentally and computationally, and find good agreement between our theory, experimental and computational results. Our theory is known, from its derivation, to be an approximation for particularly bouncy or ‘grippy’ surfaces where the die rolls through many revolutions before settling. On real surfaces we would expect (and we observe) that the true probability ratio for a 2D die is somewhat closer to unity than predicted by our theory. This problem may also have wider relevance in the testing of physics engines.
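
    The quoted ratio can be evaluated directly. The sketch below simply implements the displayed expression and normalizes it into two landing probabilities; which side corresponds to the numerator is not specified in the abstract, and no attempt is made to reproduce the experiments or simulations.

        import math

        def side_ratio(k: float, l: float) -> float:
            """Ratio of the two landing probabilities for a 2D die with side lengths k and l,
            as given by the expression quoted in the abstract."""
            r = math.hypot(k, l)   # sqrt(k**2 + l**2)
            return ((r - k) / (r - l)) * (math.atan(l / k) / math.atan(k / l))

        def side_probabilities(k: float, l: float):
            """Normalize the ratio into two probabilities that sum to one."""
            ratio = side_ratio(k, l)
            return ratio / (1.0 + ratio), 1.0 / (1.0 + ratio)

        print(side_ratio(1.0, 1.0))          # 1.0 for a square cross-section, as expected
        print(side_probabilities(2.0, 1.0))  # an asymmetric cross-section gives unequal probabilities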

  20. Representation of analysis results involving aleatory and epistemic uncertainty.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, Jay Dean; Helton, Jon Craig; Oberkampf, William Louis

    2008-08-01

    Procedures are described for the representation of results in analyses that involve both aleatory uncertainty and epistemic uncertainty, with aleatory uncertainty deriving from an inherent randomness in the behavior of the system under study and epistemic uncertainty deriving from a lack of knowledge about the appropriate values to use for quantities that are assumed to have fixed but poorly known values in the context of a specific study. Aleatory uncertainty is usually represented with probability and leads to cumulative distribution functions (CDFs) or complementary cumulative distribution functions (CCDFs) for analysis results of interest. Several mathematical structures are available for the representation of epistemic uncertainty, including interval analysis, possibility theory, evidence theory and probability theory. In the presence of epistemic uncertainty, there is not a single CDF or CCDF for a given analysis result. Rather, there is a family of CDFs and a corresponding family of CCDFs that derive from epistemic uncertainty and have an uncertainty structure that derives from the particular uncertainty structure (i.e., interval analysis, possibility theory, evidence theory, probability theory) used to represent epistemic uncertainty. Graphical formats for the representation of epistemic uncertainty in families of CDFs and CCDFs are investigated and presented for the indicated characterizations of epistemic uncertainty.

  1. Theories of the Alcoholic Personality.

    ERIC Educational Resources Information Center

    Cox, W. Miles

    Several theories of the alcoholic personality have been devised to determine the relationship between the clusters of personality characteristics of alcoholics and their abuse of alcohol. The oldest and probably best known theory is the dependency theory, formulated in the tradition of classical psychoanalysis, which associates the alcoholic's…

  2. Ponder This

    ERIC Educational Resources Information Center

    Yevdokimov, Oleksiy

    2009-01-01

    This article presents a problem set which includes a selection of probability problems. Probability theory started essentially as an empirical science and developed on the mathematical side later. The problems featured in this article demonstrate diversity of ideas and different concepts of probability, in particular, they refer to Laplace and…

  3. Analyzing force concept inventory with item response theory

    NASA Astrophysics Data System (ADS)

    Wang, Jing; Bao, Lei

    2010-10-01

    Item response theory is a popular assessment method used in education. It rests on the assumption of a probability framework that relates students' innate ability and their performance on test questions. Item response theory transforms students' raw test scores into a scaled proficiency score, which can be used to compare results obtained with different test questions. The scaled score also addresses the issues of ceiling effects and guessing, which commonly exist in quantitative assessment. We used item response theory to analyze the force concept inventory (FCI). Our results show that item response theory can be useful for analyzing physics concept surveys such as the FCI and produces results about the individual questions and student performance that are beyond the capability of classical statistics. The theory yields detailed measurement parameters regarding the difficulty, discrimination features, and probability of correct guess for each of the FCI questions.
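
    A minimal sketch of the kind of item response function such an analysis rests on; the three-parameter logistic form below matches the difficulty, discrimination and guessing parameters mentioned above, but the numerical values are hypothetical and are not taken from the FCI analysis.

        import math

        def three_pl(theta: float, a: float, b: float, c: float) -> float:
            """Three-parameter logistic item response function: probability of a correct
            response given ability theta, discrimination a, difficulty b and guessing floor c."""
            return c + (1.0 - c) / (1.0 + math.exp(-a * (theta - b)))

        # Hypothetical item: moderate discrimination, average difficulty, 20% guessing floor.
        for theta in (-2.0, 0.0, 2.0):
            print(theta, round(three_pl(theta, a=1.2, b=0.0, c=0.2), 3))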

  4. Contiguous triple spinal dysraphism associated with Chiari malformation Type II and hydrocephalus: an embryological conundrum between the unified theory of Pang and the unified theory of McLone.

    PubMed

    Dhandapani, Sivashanmugam; Srinivasan, Anirudh

    2016-01-01

    Triple spinal dysraphism is extremely rare. There are published reports of multiple discrete neural tube defects with intervening normal segments that are explained by the multisite closure theory of primary neurulation, having an association with Chiari malformation Type II consistent with the unified theory of McLone. The authors report on a 1-year-old child with contiguous myelomeningocele and lipomyelomeningocele centered on Type I split cord malformation with Chiari malformation Type II and hydrocephalus. This composite anomaly is probably due to select abnormalities of the neurenteric canal during gastrulation, with a contiguous cascading impact on both dysjunction of the neural tube and closure of the neuropore, resulting in a small posterior fossa, probably bringing the unified theory of McLone closer to the unified theory of Pang.

  5. Lattice Duality: The Origin of Probability and Entropy

    NASA Technical Reports Server (NTRS)

    Knuth, Kevin H.

    2004-01-01

    Bayesian probability theory is an inference calculus, which originates from a generalization of inclusion on the Boolean lattice of logical assertions to a degree of inclusion represented by a real number. Dual to this lattice is the distributive lattice of questions constructed from the ordered set of down-sets of assertions, which forms the foundation of the calculus of inquiry, a generalization of information theory. In this paper we introduce this novel perspective on these spaces in which machine learning is performed and discuss the relationship between these results and several proposed generalizations of information theory in the literature.

  6. Academic decision making and prospect theory.

    PubMed

    Mowrer, Robert R; Davidson, William B

    2011-08-01

    Two studies are reported that investigate the applicability of prospect theory to college students' academic decision making. Exp. 1 failed to provide support for the risk-seeking portion of the fourfold pattern predicted by prospect theory but did find the greater weighting of losses over gains. Using a more sensitive dependent measure, in Exp. 2 the results of the first experiment were replicated in terms of the gain-loss effect and also found some support for the fourfold pattern in the interaction between probabilities and gain versus loss. The greatest risk-seeking was found in the high probability loss condition.

  7. Information theoretic quantification of diagnostic uncertainty.

    PubMed

    Westover, M Brandon; Eiseman, Nathaniel A; Cash, Sydney S; Bianchi, Matt T

    2012-01-01

    Diagnostic test interpretation remains a challenge in clinical practice. Most physicians receive training in the use of Bayes' rule, which specifies how the sensitivity and specificity of a test for a given disease combine with the pre-test probability to quantify the change in disease probability incurred by a new test result. However, multiple studies demonstrate physicians' deficiencies in probabilistic reasoning, especially with unexpected test results. Information theory, a branch of probability theory dealing explicitly with the quantification of uncertainty, has been proposed as an alternative framework for diagnostic test interpretation, but is even less familiar to physicians. We have previously addressed one key challenge in the practical application of Bayes theorem: the handling of uncertainty in the critical first step of estimating the pre-test probability of disease. This essay aims to present the essential concepts of information theory to physicians in an accessible manner, and to extend previous work regarding uncertainty in pre-test probability estimation by placing this type of uncertainty within a principled information theoretic framework. We address several obstacles hindering physicians' application of information theoretic concepts to diagnostic test interpretation. These include issues of terminology (mathematical meanings of certain information theoretic terms differ from clinical or common parlance) as well as the underlying mathematical assumptions. Finally, we illustrate how, in information theoretic terms, one can understand the effect on diagnostic uncertainty of considering ranges instead of simple point estimates of pre-test probability.
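
    For concreteness, here is a minimal sketch of the Bayes'-rule step discussed above, together with binary entropy, one standard information-theoretic measure of diagnostic uncertainty; the sensitivity, specificity and pre-test probability are hypothetical.

        import math

        def post_test_probability(prior: float, sensitivity: float, specificity: float,
                                  positive: bool) -> float:
            """Bayes' rule for a binary diagnostic test result."""
            if positive:
                num = sensitivity * prior
                den = num + (1.0 - specificity) * (1.0 - prior)
            else:
                num = (1.0 - sensitivity) * prior
                den = num + specificity * (1.0 - prior)
            return num / den

        def binary_entropy(p: float) -> float:
            """Diagnostic uncertainty in bits for a disease probability p."""
            if p in (0.0, 1.0):
                return 0.0
            return -(p * math.log2(p) + (1.0 - p) * math.log2(1.0 - p))

        prior = 0.10   # hypothetical pre-test probability
        post = post_test_probability(prior, sensitivity=0.9, specificity=0.8, positive=True)
        print(round(post, 3), round(binary_entropy(prior), 3), round(binary_entropy(post), 3))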

  8. Elapsed decision time affects the weighting of prior probability in a perceptual decision task

    PubMed Central

    Hanks, Timothy D.; Mazurek, Mark E.; Kiani, Roozbeh; Hopp, Elizabeth; Shadlen, Michael N.

    2012-01-01

    Decisions are often based on a combination of new evidence with prior knowledge of the probable best choice. Optimal combination requires knowledge about the reliability of evidence, but in many realistic situations, this is unknown. Here we propose and test a novel theory: the brain exploits elapsed time during decision formation to combine sensory evidence with prior probability. Elapsed time is useful because (i) decisions that linger tend to arise from less reliable evidence, and (ii) the expected accuracy at a given decision time depends on the reliability of the evidence gathered up to that point. These regularities allow the brain to combine prior information with sensory evidence by weighting the latter in accordance with reliability. To test this theory, we manipulated the prior probability of the rewarded choice while subjects performed a reaction-time discrimination of motion direction using a range of stimulus reliabilities that varied from trial to trial. The theory explains the effect of prior probability on choice and reaction time over a wide range of stimulus strengths. We found that prior probability was incorporated into the decision process as a dynamic bias signal that increases as a function of decision time. This bias signal depends on the speed-accuracy setting of human subjects, and it is reflected in the firing rates of neurons in the lateral intraparietal cortex (LIP) of rhesus monkeys performing this task. PMID:21525274

  9. Elapsed decision time affects the weighting of prior probability in a perceptual decision task.

    PubMed

    Hanks, Timothy D; Mazurek, Mark E; Kiani, Roozbeh; Hopp, Elisabeth; Shadlen, Michael N

    2011-04-27

    Decisions are often based on a combination of new evidence with prior knowledge of the probable best choice. Optimal combination requires knowledge about the reliability of evidence, but in many realistic situations, this is unknown. Here we propose and test a novel theory: the brain exploits elapsed time during decision formation to combine sensory evidence with prior probability. Elapsed time is useful because (1) decisions that linger tend to arise from less reliable evidence, and (2) the expected accuracy at a given decision time depends on the reliability of the evidence gathered up to that point. These regularities allow the brain to combine prior information with sensory evidence by weighting the latter in accordance with reliability. To test this theory, we manipulated the prior probability of the rewarded choice while subjects performed a reaction-time discrimination of motion direction using a range of stimulus reliabilities that varied from trial to trial. The theory explains the effect of prior probability on choice and reaction time over a wide range of stimulus strengths. We found that prior probability was incorporated into the decision process as a dynamic bias signal that increases as a function of decision time. This bias signal depends on the speed-accuracy setting of human subjects, and it is reflected in the firing rates of neurons in the lateral intraparietal area (LIP) of rhesus monkeys performing this task.

  10. Hypergame theory applied to cyber attack and defense

    NASA Astrophysics Data System (ADS)

    House, James Thomas; Cybenko, George

    2010-04-01

    This work concerns cyber attack and defense in the context of game theory--specifically hypergame theory. Hypergame theory extends classical game theory with the ability to deal with differences in players' expertise, differences in their understanding of game rules, misperceptions, and so forth. Each of these different sub-scenarios, or subgames, is associated with a probability--representing the likelihood that the given subgame is truly "in play" at a given moment. In order to form an optimal attack or defense policy, these probabilities must be learned if they're not known a priori. We present hidden Markov model and maximum entropy approaches for accurately learning these probabilities through multiple iterations of both normal and modified game play. We also give a widely applicable approach for the analysis of cases where an opponent is aware that he is being studied, and intentionally plays to spoil the process of learning and thereby obfuscate his attributes. These are considered in the context of a generic, abstract cyber attack example. We demonstrate that machine learning efficacy can be heavily dependent on the goals and styles of participant behavior. To this end, detailed simulation results under various combinations of attacker and defender behaviors are presented and analyzed.

  11. Re-assessment of road accident data-analysis policy : applying theory from involuntary, high-consequence, low-probability events like nuclear power plant meltdowns to voluntary, low-consequence, high-probability events like traffic accidents

    DOT National Transportation Integrated Search

    2002-02-01

    This report examines the literature on involuntary, high-consequence, low-probability (IHL) events like nuclear power plant meltdowns to determine what can be applied to the problem of voluntary, low-consequence high-probability (VLH) events like tra...

  12. Quantum probability and cognitive modeling: some cautions and a promising direction in modeling physics learning.

    PubMed

    Franceschetti, Donald R; Gire, Elizabeth

    2013-06-01

    Quantum probability theory offers a viable alternative to classical probability, although there are some ambiguities inherent in transferring the quantum formalism to a less determined realm. A number of physicists are now looking at the applicability of quantum ideas to the assessment of physics learning, an area particularly suited to quantum probability ideas.

  13. Using the Reliability Theory for Assessing the Decision Confidence Probability for Comparative Life Cycle Assessments.

    PubMed

    Wei, Wei; Larrey-Lassalle, Pyrène; Faure, Thierry; Dumoulin, Nicolas; Roux, Philippe; Mathias, Jean-Denis

    2016-03-01

    Comparative decision making is widely used to identify which option (system, product, service, etc.) has the smaller environmental footprint and to provide recommendations that help stakeholders take future decisions. However, uncertainty complicates both the comparison and the decision making. Probability-based decision support in LCA is a way to help stakeholders in their decision-making process. It calculates the decision confidence probability, which expresses the probability that one option has a smaller environmental impact than another. Here we apply reliability theory to approximate the decision confidence probability. We compare the traditional Monte Carlo method with a reliability method called the FORM method. The Monte Carlo method requires substantial computational time to calculate the decision confidence probability. The FORM method approximates the decision confidence probability with fewer simulations than the Monte Carlo method by approximating the response surface. Moreover, the FORM method calculates the associated importance factors, which correspond to a sensitivity analysis with respect to the probability. The importance factors allow stakeholders to determine which factors influence their decision. Our results clearly show that the reliability method provides additional useful information to stakeholders while also reducing the computational time.
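
    A minimal Monte Carlo sketch of the decision confidence probability itself (the FORM approximation described in the paper is not reproduced here); the lognormal impact distributions and their parameters are purely hypothetical.

        import random

        def decision_confidence(n_trials: int = 100_000, seed: int = 1) -> float:
            """Monte Carlo estimate of P(impact of option A < impact of option B) when both
            impacts are uncertain (hypothetical lognormal parameters)."""
            rng = random.Random(seed)
            wins = 0
            for _ in range(n_trials):
                impact_a = rng.lognormvariate(1.0, 0.3)   # mu, sigma of the underlying normal
                impact_b = rng.lognormvariate(1.1, 0.3)
                if impact_a < impact_b:
                    wins += 1
            return wins / n_trials

        print(decision_confidence())   # confidence that option A has the smaller footprint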

  14. Modelling uncertainty with generalized credal sets: application to conjunction and decision

    NASA Astrophysics Data System (ADS)

    Bronevich, Andrey G.; Rozenberg, Igor N.

    2018-01-01

    To model conflict, non-specificity and contradiction in information, upper and lower generalized credal sets are introduced. Any upper generalized credal set is a convex subset of plausibility measures interpreted as lower probabilities whose bodies of evidence consist of singletons and a certain event. Analogously, contradiction is modelled in the theory of evidence by a belief function that is greater than zero at the empty set. Based on generalized credal sets, we extend the conjunctive rule for contradictory sources of information, introduce constructions like natural extension in the theory of imprecise probabilities and show that the model of generalized credal sets coincides with the model of imprecise probabilities if the profile of a generalized credal set consists of probability measures. We show how the introduced model can be applied to decision problems.

  15. Realistic neurons can compute the operations needed by quantum probability theory and other vector symbolic architectures.

    PubMed

    Stewart, Terrence C; Eliasmith, Chris

    2013-06-01

    Quantum probability (QP) theory can be seen as a type of vector symbolic architecture (VSA): mental states are vectors storing structured information and manipulated using algebraic operations. Furthermore, the operations needed by QP match those in other VSAs. This allows existing biologically realistic neural models to be adapted to provide a mechanistic explanation of the cognitive phenomena described in the target article by Pothos & Busemeyer (P&B).

  16. The Research on Tunnel Surrounding Rock Classification Based on Geological Radar and Probability Theory

    NASA Astrophysics Data System (ADS)

    Xiao Yong, Zhao; Xin, Ji Yong; Shuang Ying, Zuo

    2018-03-01

    In order to classify tunnel surrounding rock types effectively, a multi-factor tunnel surrounding rock classification method based on GPR and probability theory is proposed. Geological radar is used to identify the geology of the surrounding rock ahead of the tunnel face and to evaluate the quality of the rock at the face. Based on previous survey data, the uniaxial compressive strength of the rock, the integrity index, fissuring and groundwater were selected as classification factors. Probability theory combines these factors into a multi-factor classification and assigns the surrounding rock to the class with the greatest probability. Applying this method to the surrounding rock of the Ma’anshan tunnel gives classes that are essentially the same as the actual surrounding rock types, which shows that the method is a simple, efficient and practical rock classification approach that can be used in tunnel construction.

  17. Introducing Disjoint and Independent Events in Probability.

    ERIC Educational Resources Information Center

    Kelly, I. W.; Zwiers, F. W.

    Two central concepts in probability theory are those of independence and mutually exclusive events. This document is intended to provide suggestions to teachers that can be used to equip students with an intuitive, comprehensive understanding of these basic concepts in probability. The first section of the paper delineates mutually exclusive and…

  18. A Quantum Theoretical Explanation for Probability Judgment Errors

    ERIC Educational Resources Information Center

    Busemeyer, Jerome R.; Pothos, Emmanuel M.; Franco, Riccardo; Trueblood, Jennifer S.

    2011-01-01

    A quantum probability model is introduced and used to explain human probability judgment errors including the conjunction and disjunction fallacies, averaging effects, unpacking effects, and order effects on inference. On the one hand, quantum theory is similar to other categorization and memory models of cognition in that it relies on vector…

  19. ON CONTINUOUS-REVIEW (S-1,S) INVENTORY POLICIES WITH STATE-DEPENDENT LEADTIMES,

    DTIC Science & Technology

    (*INVENTORY CONTROL, *REPLACEMENT THEORY), MATHEMATICAL MODELS, LEAD TIME, MANAGEMENT ENGINEERING, DISTRIBUTION FUNCTIONS, PROBABILITY, QUEUEING THEORY, COSTS, OPTIMIZATION, STATISTICAL PROCESSES, DIFFERENCE EQUATIONS

  20. Compound risk judgment in tasks with both idiosyncratic and systematic risk: The "Robust Beauty" of additive probability integration.

    PubMed

    Sundh, Joakim; Juslin, Peter

    2018-02-01

    In this study, we explore how people integrate risks of assets in a simulated financial market into a judgment of the conjunctive risk that all assets decrease in value, both when assets are independent and when there is a systematic risk present affecting all assets. Simulations indicate that while mental calculation according to naïve application of probability theory is best when the assets are independent, additive or exemplar-based algorithms perform better when systematic risk is high. Considering that people tend to intuitively approach compound probability tasks using additive heuristics, we expected the participants to find it easiest to master tasks with high systematic risk - the most complex tasks from the standpoint of probability theory - while they should shift to probability theory or exemplar memory with independence between the assets. The results from 3 experiments confirm that participants shift between strategies depending on the task, starting off with the default of additive integration. In contrast to results in similar multiple cue judgment tasks, there is little evidence for use of exemplar memory. The additive heuristics also appear to be surprisingly context-sensitive, with limited generalization across formally very similar tasks. Copyright © 2017 Elsevier B.V. All rights reserved.
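
    To illustrate the contrast between the integration rules (a sketch of the general idea, not the authors' simulation code): with independent assets the conjunctive risk is the product of the marginal risks, whereas under a strong common (systematic) factor the product rule underestimates it and a simple additive/average rule can come closer. The one-factor Gaussian setup below is hypothetical.

        import random
        from statistics import NormalDist

        def simulate_conjunctive_risk(marginal_risk=0.3, n_assets=3, rho=0.0,
                                      n_trials=200_000, seed=7):
            """Fraction of trials in which all assets decrease, under a one-factor Gaussian
            model: asset_i = rho * common + sqrt(1 - rho**2) * noise_i (rho = 0: independence)."""
            rng = random.Random(seed)
            threshold = NormalDist().inv_cdf(marginal_risk)   # P(asset decreases) = marginal_risk
            hits = 0
            for _ in range(n_trials):
                common = rng.gauss(0.0, 1.0)
                all_down = all(
                    rho * common + (1 - rho ** 2) ** 0.5 * rng.gauss(0.0, 1.0) < threshold
                    for _ in range(n_assets)
                )
                hits += all_down
            return hits / n_trials

        product_rule = 0.3 ** 3   # "naive" probability theory, assuming independence
        additive_rule = 0.3       # averaging the (identical) marginal risks
        print(product_rule, simulate_conjunctive_risk(rho=0.0))   # product rule matches
        print(additive_rule, simulate_conjunctive_risk(rho=0.9))  # strong systematic risk:
        # the simulated conjunctive risk moves toward the marginal risk, closer to the average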

  1. Unified nano-mechanics based probabilistic theory of quasibrittle and brittle structures: II. Fatigue crack growth, lifetime and scaling

    NASA Astrophysics Data System (ADS)

    Le, Jia-Liang; Bažant, Zdeněk P.

    2011-07-01

    This paper extends the theoretical framework presented in the preceding Part I to the lifetime distribution of quasibrittle structures failing at the fracture of one representative volume element under constant amplitude fatigue. The probability distribution of the critical stress amplitude is derived for a given number of cycles and a given minimum-to-maximum stress ratio. The physical mechanism underlying the Paris law for fatigue crack growth is explained under certain plausible assumptions about the damage accumulation in the cyclic fracture process zone at the tip of subcritical crack. This law is then used to relate the probability distribution of critical stress amplitude to the probability distribution of fatigue lifetime. The theory naturally yields a power-law relation for the stress-life curve (S-N curve), which agrees with Basquin's law. Furthermore, the theory indicates that, for quasibrittle structures, the S-N curve must be size dependent. Finally, physical explanation is provided to the experimentally observed systematic deviations of lifetime histograms of various ceramics and bones from the Weibull distribution, and their close fits by the present theory are demonstrated.

  2. Exaggerated risk: prospect theory and probability weighting in risky choice.

    PubMed

    Kusev, Petko; van Schaik, Paul; Ayton, Peter; Dent, John; Chater, Nick

    2009-11-01

    In 5 experiments, we studied precautionary decisions in which participants decided whether or not to buy insurance with specified cost against an undesirable event with specified probability and cost. We compared the risks taken for precautionary decisions with those taken for equivalent monetary gambles. Fitting these data to Tversky and Kahneman's (1992) prospect theory, we found that the weighting function required to model precautionary decisions differed from that required for monetary gambles. This result indicates a failure of the descriptive invariance axiom of expected utility theory. For precautionary decisions, people overweighted small, medium-sized, and moderately large probabilities; they exaggerated risks. This effect is not anticipated by prospect theory or experience-based decision research (Hertwig, Barron, Weber, & Erev, 2004). We found evidence that exaggerated risk is caused by the accessibility of events in memory: The weighting function varies as a function of the accessibility of events. This suggests that people's experiences of events leak into decisions even when risk information is explicitly provided. Our findings highlight a need to investigate how variation in decision content produces variation in preferences for risk.
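
    One way to picture the reported overweighting is through a two-parameter probability weighting function in which one parameter controls curvature and the other controls elevation; the linear-in-log-odds family and the parameter values below are illustrative choices, not the authors' fitted model.

        def weight(p: float, gamma: float = 0.6, delta: float = 1.0) -> float:
            """Linear-in-log-odds probability weighting: gamma controls curvature,
            delta controls elevation (overall over- or under-weighting)."""
            num = delta * p ** gamma
            return num / (num + (1.0 - p) ** gamma)

        for p in (0.01, 0.1, 0.5, 0.8):
            # delta = 1.0: a typical monetary-gamble-style fit; delta = 2.0: an elevated
            # function of the kind that would describe exaggerated risk (illustrative values)
            print(p, round(weight(p, delta=1.0), 3), round(weight(p, delta=2.0), 3))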

  3. Generalized Galilean transformations and the measurement problem in the entropic dynamics approach to quantum theory

    NASA Astrophysics Data System (ADS)

    Johnson, David T.

    Quantum mechanics is an extremely successful and accurate physical theory, yet since its inception, it has been afflicted with numerous conceptual difficulties. The primary subject of this thesis is the theory of entropic quantum dynamics (EQD), which seeks to avoid these conceptual problems by interpreting quantum theory from an informational perspective. We begin by reviewing Cox's work in describing probability theory as a means of rationally and consistently quantifying uncertainties. We then discuss how probabilities can be updated according to either Bayes' theorem or the extended method of maximum entropy (ME). After that discussion, we review the work of Caticha and Giffin that shows that Bayes' theorem is a special case of ME. This important result demonstrates that the ME method is the general method for updating probabilities. We then review some motivating difficulties in quantum mechanics before discussing Caticha's work in deriving quantum theory from the approach of entropic dynamics, which concludes our review. After entropic dynamics is introduced, we develop the concepts of symmetries and transformations from an informational perspective. The primary result is the formulation of a symmetry condition that any transformation must satisfy in order to qualify as a symmetry in EQD. We then proceed to apply this condition to the extended Galilean transformation. This transformation is of interest as it exhibits features of both special and general relativity. The transformation yields a gravitational potential that arises from an equivalence of information. We conclude the thesis with a discussion of the measurement problem in quantum mechanics. We discuss the difficulties that arise in the standard quantum mechanical approach to measurement before developing our theory of entropic measurement. In entropic dynamics, position is the only observable. We show how a theory built on this one observable can account for the multitude of measurements present in quantum theory. Furthermore, we show that the Born rule need not be postulated, but can be derived in EQD. Finally, we show how the wave function can be updated by the ME method as the phase is constructed purely in terms of probabilities.

  4. The coherence problem with the Unified Neutral Theory of biodiversity

    Treesearch

    James S. Clark

    2012-01-01

    The Unified Neutral Theory of Biodiversity (UNTB), proposed as an alternative to niche theory, has been viewed as a theory that species coexist without niche differences, without fitness differences, or with equal probability of success. Support is claimed when models lacking species differences predict highly aggregated metrics, such as species abundance distributions...

  5. Thermodynamics and the structure of quantum theory

    NASA Astrophysics Data System (ADS)

    Krumm, Marius; Barnum, Howard; Barrett, Jonathan; Müller, Markus P.

    2017-04-01

    Despite its enormous empirical success, the formalism of quantum theory still raises fundamental questions: why is nature described in terms of complex Hilbert spaces, and what modifications of it could we reasonably expect to find in some regimes of physics? Here we address these questions by studying how compatibility with thermodynamics constrains the structure of quantum theory. We employ two postulates that any probabilistic theory with reasonable thermodynamic behaviour should arguably satisfy. In the framework of generalised probabilistic theories, we show that these postulates already imply important aspects of quantum theory, like self-duality and analogues of projective measurements, subspaces and eigenvalues. However, they may still admit a class of theories beyond quantum mechanics. Using a thought experiment by von Neumann, we show that these theories admit a consistent thermodynamic notion of entropy, and prove that the second law holds for projective measurements and mixing procedures. Furthermore, we study additional entropy-like quantities based on measurement probabilities and convex decomposition probabilities, and uncover a relation between one of these quantities and Sorkin’s notion of higher-order interference.

  6. Climbing Mount Probable

    ERIC Educational Resources Information Center

    Harper, Marc Allen

    2009-01-01

    This work attempts to explain the relationships between natural selection, information theory, and statistical inference. In particular, a geometric formulation of information theory known as information geometry and its deep connections to evolutionary game theory inform the role of natural selection in evolutionary processes. The goals of this…

  7. Do recent observations of very large electromagnetic dissociation cross sections signify a transition towards non-perturbative QED?

    NASA Technical Reports Server (NTRS)

    Norbury, John W.

    1992-01-01

    The very large electromagnetic dissociation (EMD) cross sections recently observed by Hill, Wohn, Schwellenbach, and Smith do not agree with Weizsacker-Williams (WW) theory or any simple modification thereof. Calculations are presented for the reaction probabilities for this experiment and the entire single and double nucleon removal EMD data set. It is found that for those few reactions where theory and experiment disagree, the probabilities are exceptionally large. This indicates that WW theory is not valid for these reactions and that one must consider higher order corrections and perhaps even a non-perturbative approach to quantum electrodynamics (QED).

  8. Mathematical Games

    ERIC Educational Resources Information Center

    Gardner, Martin

    1978-01-01

    Describes the life and work of Charles Peirce, U.S. mathematician and philosopher. His accomplishments include contributions to logic, the foundations of mathematics and scientific method, and decision theory and probability theory. (MA)

  9. Hoeffding Type Inequalities and their Applications in Statistics and Operations Research

    NASA Astrophysics Data System (ADS)

    Daras, Tryfon

    2007-09-01

    Large Deviation theory is the branch of Probability theory that deals with rare events. Sometimes, these events can be described by the sum of random variables that deviates from its mean more than a "normal" amount. A precise calculation of the probabilities of such events turns out to be crucial in a variety of different contexts (e.g., in Probability Theory, Statistics, Operations Research, Statistical Physics, Financial Mathematics, etc.). Recent applications of the theory deal with random walks in random environments, interacting diffusions, heat conduction, polymer chains [1]. In this paper we prove an inequality of exponential type, namely theorem 2.1, which gives a large deviation upper bound for a specific sequence of r.v.s. Inequalities of this type have many applications in Combinatorics [2]. The inequality generalizes already proven results of this type in the case of symmetric probability measures. We get as consequences of the inequality: (a) large deviation upper bounds for exchangeable Bernoulli sequences of random variables, generalizing results proven for independent and identically distributed Bernoulli sequences of r.v.s., and (b) a general form of Bernstein's inequality. We compare the inequality with large deviation results already proven by the author and try to see its advantages. Finally, using the inequality, we solve one of the basic problems of Operations Research (the bin packing problem) in the case of exchangeable r.v.s.
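
    The exponential character of such bounds is easy to see numerically. The sketch below is an illustration only and does not reproduce the paper's theorem 2.1 or its exchangeable-sequence setting; it compares the classical Hoeffding upper bound exp(-2nt^2) for the deviation of the mean of n i.i.d. [0, 1]-valued variables with a simulated tail frequency.

      # Illustrative sketch: classical Hoeffding bound vs. a simulated tail
      # frequency for i.i.d. Uniform(0, 1) variables (bounded in [0, 1], mean 1/2).
      # This is NOT the paper's exchangeable-sequence inequality; it only shows
      # the generic exponential-type bound P(mean - 1/2 >= t) <= exp(-2 n t^2).
      import numpy as np

      rng = np.random.default_rng(0)
      n, t, trials = 200, 0.1, 20_000

      samples = rng.uniform(0.0, 1.0, size=(trials, n))
      deviations = samples.mean(axis=1) - 0.5

      empirical_tail = np.mean(deviations >= t)      # simulated P(mean - 1/2 >= t)
      hoeffding_bound = np.exp(-2.0 * n * t**2)      # exponential upper bound

      print(f"empirical tail  : {empirical_tail:.5f}")
      print(f"Hoeffding bound : {hoeffding_bound:.5f}")  # the bound dominates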

  10. Medical concepts related to individual risk are better explained with "plausibility" rather than "probability".

    PubMed

    Grossi, Enzo

    2005-09-27

    The concept of risk has pervaded medical literature in the last decades and has become a familiar topic, and the concept of probability, linked to a binary logic approach, is commonly applied in epidemiology and clinical medicine. The application of probability theory to groups of individuals is quite straightforward but can pose communication challenges at the individual level. Few articles, however, have tried to frame the concept of "risk" at the individual subject level rather than at the population level. The author has reviewed the conceptual framework which led to the use of probability theory in the medical field at a time when the principal causes of death were acute diseases, often of infective origin. In the present scenario, in which chronic degenerative diseases dominate and there are smooth transitions between health and disease, the use of fuzzy logic rather than binary logic would be more appropriate. Fuzzy logic, in which more than two possible truth-value assignments are allowed, overcomes the trap of probability theory when dealing with uncertain outcomes, thereby making the meaning of a given prognostic statement easier for the patient to understand. At the individual subject level, recourse to the term "plausibility", related to fuzzy logic, would help the physician communicate with the patient more effectively than the term "probability", related to binary logic. This would represent a clear advantage for the transfer of medical evidence to individual subjects.

  11. Assessing the Chances of Success: Naive Statistics versus Kind Experience

    ERIC Educational Resources Information Center

    Hogarth, Robin M.; Mukherjee, Kanchan; Soyer, Emre

    2013-01-01

    Additive integration of information is ubiquitous in judgment and has been shown to be effective even when multiplicative rules of probability theory are prescribed. We explore the generality of these findings in the context of estimating probabilities of success in contests. We first define a normative model of these probabilities that takes…

  12. Probability model for analyzing fire management alternatives: theory and structure

    Treesearch

    Frederick W. Bratten

    1982-01-01

    A theoretical probability model has been developed for analyzing program alternatives in fire management. It includes submodels or modules for predicting probabilities of fire behavior, fire occurrence, fire suppression, effects of fire on land resources, and financial effects of fire. Generalized "fire management situations" are used to represent actual fire...

  13. Seeing the Forest when Entry Is Unlikely: Probability and the Mental Representation of Events

    ERIC Educational Resources Information Center

    Wakslak, Cheryl J.; Trope, Yaacov; Liberman, Nira; Alony, Rotem

    2006-01-01

    Conceptualizing probability as psychological distance, the authors draw on construal level theory (Y. Trope & N. Liberman, 2003) to propose that decreasing an event's probability leads individuals to represent the event by its central, abstract, general features (high-level construal) rather than by its peripheral, concrete, specific features…

  14. Generalized ensemble theory with non-extensive statistics

    NASA Astrophysics Data System (ADS)

    Shen, Ke-Ming; Zhang, Ben-Wei; Wang, En-Ke

    2017-12-01

    The non-extensive canonical ensemble theory is reconsidered with the method of Lagrange multipliers by maximizing the Tsallis entropy, with the constraint that the normalization term of the Tsallis q-average of physical quantities, the sum ∑_j p_j^q, is independent of the probability p_i for Tsallis parameter q. The self-referential problem in the deduced probability and thermal quantities in non-extensive statistics is thus avoided, and thermodynamical relationships are obtained in a consistent and natural way. We also extend the study to the non-extensive grand canonical ensemble theory and obtain the q-deformed Bose-Einstein distribution as well as the q-deformed Fermi-Dirac distribution. The theory is further applied to the generalized Planck law to demonstrate the distinct behaviors of the various generalized q-distribution functions discussed in the literature.
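
    As a point of orientation, the sketch below evaluates the Tsallis entropy itself, S_q = (1 - ∑_j p_j^q)/(q - 1) (with k_B = 1), and its q → 1 limit, which recovers the Boltzmann-Gibbs/Shannon form; it is a minimal illustration and does not reproduce the paper's constrained maximization or its grand canonical extension.

      # Minimal sketch of the Tsallis entropy and its q -> 1 (Shannon) limit.
      # The constrained maximization and q-deformed distributions of the paper
      # are not reproduced here.
      import numpy as np

      def tsallis_entropy(p, q):
          p = np.asarray(p, dtype=float)
          p = p[p > 0]
          if np.isclose(q, 1.0):              # q -> 1 limit: -sum p ln p
              return -np.sum(p * np.log(p))
          return (1.0 - np.sum(p**q)) / (q - 1.0)

      p = np.array([0.5, 0.3, 0.2])
      for q in (0.5, 0.99, 1.0, 1.01, 2.0):
          print(f"q = {q:4}:  S_q = {tsallis_entropy(p, q):.6f}")
      # The values at q = 0.99 and q = 1.01 bracket the Shannon value at q = 1.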

  15. Targeting the probability versus cost of feared outcomes in public speaking anxiety.

    PubMed

    Nelson, Elizabeth A; Deacon, Brett J; Lickel, James J; Sy, Jennifer T

    2010-04-01

    Cognitive-behavioral theory suggests that social phobia is maintained, in part, by overestimates of the probability and cost of negative social events. Indeed, empirically supported cognitive-behavioral treatments directly target these cognitive biases through the use of in vivo exposure or behavioral experiments. While cognitive-behavioral theories and treatment protocols emphasize the importance of targeting probability and cost biases in the reduction of social anxiety, few studies have examined specific techniques for reducing probability and cost bias, and thus the relative efficacy of exposure to the probability versus cost of negative social events is unknown. In the present study, 37 undergraduates with high public speaking anxiety were randomly assigned to a single-session intervention designed to reduce either the perceived probability or the perceived cost of negative outcomes associated with public speaking. Compared to participants in the probability treatment condition, those in the cost treatment condition demonstrated significantly greater improvement on measures of public speaking anxiety and cost estimates for negative social events. The superior efficacy of the cost treatment condition was mediated by greater treatment-related changes in social cost estimates. The clinical implications of these findings are discussed. Published by Elsevier Ltd.

  16. Prospect Theory and Coercive Bargaining

    ERIC Educational Resources Information Center

    Butler, Christopher K.

    2007-01-01

    Despite many applications of prospect theory's concepts to explain political and strategic phenomena, formal analyses of strategic problems using prospect theory are rare. Using Fearon's model of bargaining, Tversky and Kahneman's value function, and an existing probability weighting function, I construct a model that demonstrates the differences…

  17. The Madelung Picture as a Foundation of Geometric Quantum Theory

    NASA Astrophysics Data System (ADS)

    Reddiger, Maik

    2017-10-01

    Despite its age, quantum theory still suffers from serious conceptual difficulties. To create clarity, mathematical physicists have been attempting to formulate quantum theory geometrically and to find a rigorous method of quantization, but this has not resolved the problem. In this article we argue that a quantum theory that relies on quantization algorithms is necessarily incomplete. To provide an alternative approach, we show that the Schrödinger equation is a consequence of three partial differential equations governing the time evolution of a given probability density. These equations, discovered by Madelung, naturally ground the Schrödinger theory in Newtonian mechanics and Kolmogorovian probability theory. A variety of far-reaching consequences for the projection postulate, the correspondence principle, the measurement problem, the uncertainty principle, and the modeling of particle creation and annihilation are immediate. We also give a speculative interpretation of the equations following Bohm, Vigier and Tsekov, by claiming that quantum mechanical behavior is possibly caused by gravitational background noise.

  18. Gossip in Random Networks

    NASA Astrophysics Data System (ADS)

    Malarz, K.; Szvetelszky, Z.; Szekf, B.; Kulakowski, K.

    2006-11-01

    We consider the average probability X of being informed on a gossip in a given social network. The network is modeled within the random graph theory of Erdős and Rényi. In this theory, a network is characterized by two parameters: the size N and the link probability p. Our experimental data suggest three levels of social inclusion of friendship. The critical value pc, for which half of the agents are informed, scales with the system size as N^-γ with γ ≈ 0.68. Computer simulations show that the probability X varies with p as a sigmoidal curve. The influence of correlations between neighbors is also evaluated: with increasing clustering coefficient C, X decreases.
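
    A gossip-spreading simulation on an Erdős–Rényi graph is simple to set up. The sketch below is an illustration only (the spreading rule here is plain breadth-first propagation to all neighbours, which may differ from the authors' model), but it reproduces the qualitative sigmoidal growth of the informed fraction X with the link probability p.

      # Illustrative sketch (not the authors' code): gossip spreading on an
      # Erdos-Renyi random graph G(N, p), with the gossip passed to every
      # neighbour of an informed agent (a simple breadth-first rule).
      import random
      from collections import deque

      def er_graph(n, p, rng):
          adj = [[] for _ in range(n)]
          for i in range(n):
              for j in range(i + 1, n):
                  if rng.random() < p:
                      adj[i].append(j)
                      adj[j].append(i)
          return adj

      def informed_fraction(n, p, rng):
          adj = er_graph(n, p, rng)
          start = rng.randrange(n)
          seen = {start}
          queue = deque([start])
          while queue:                      # breadth-first spread of the gossip
              u = queue.popleft()
              for v in adj[u]:
                  if v not in seen:
                      seen.add(v)
                      queue.append(v)
          return len(seen) / n

      rng = random.Random(1)
      n = 500
      for p in (0.001, 0.002, 0.004, 0.008, 0.016):
          x = sum(informed_fraction(n, p, rng) for _ in range(20)) / 20
          print(f"p = {p:.3f}:  average informed fraction X = {x:.2f}")
      # X rises along a sigmoid-like curve as p passes the connectivity threshold.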

  19. Simple artificial neural networks that match probability and exploit and explore when confronting a multiarmed bandit.

    PubMed

    Dawson, Michael R W; Dupuis, Brian; Spetch, Marcia L; Kelly, Debbie M

    2009-08-01

    The matching law (Herrnstein 1961) states that response rates become proportional to reinforcement rates; this is related to the empirical phenomenon called probability matching (Vulkan 2000). Here, we show that a simple artificial neural network generates responses consistent with probability matching. This behavior was then used to create an operant procedure for network learning. We use the multiarmed bandit (Gittins 1989), a classic problem of choice behavior, to illustrate that operant training balances exploiting the bandit arm expected to pay off most frequently with exploring other arms. Perceptrons provide a medium for relating results from neural networks, genetic algorithms, animal learning, contingency theory, reinforcement learning, and theories of choice.
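
    The matching behaviour itself can be illustrated without a network. The sketch below is only a toy agent, not the perceptron model of the paper: it chooses each arm of a two-armed bandit with probability proportional to that arm's estimated payoff rate, and its long-run choice proportions come to match the reinforcement rates.

      # Toy probability-matching agent on a two-armed bandit (illustration only;
      # the paper's perceptron model is not reproduced here).
      import numpy as np

      rng = np.random.default_rng(42)
      true_p = np.array([0.7, 0.3])       # payoff probabilities of the two arms
      counts = np.ones(2)                 # pulls per arm (start at 1 to avoid 0/0)
      rewards = np.full(2, 0.5)           # initial reward tallies
      choices = np.zeros(2)

      for trial in range(20_000):
          est = rewards / counts                    # estimated payoff rates
          probs = est / est.sum()                   # matching rule, not maximizing
          arm = rng.choice(2, p=probs)
          reward = float(rng.random() < true_p[arm])
          counts[arm] += 1
          rewards[arm] += reward
          choices[arm] += 1

      print("true payoff rates  :", true_p)
      print("choice proportions :", np.round(choices / choices.sum(), 3))
      # Choice proportions approach 0.7 / 0.3 (probability matching) rather than
      # the maximizing strategy of always choosing the better arm.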

  20. Modeling Stochastic Complexity in Complex Adaptive Systems: Non-Kolmogorov Probability and the Process Algebra Approach.

    PubMed

    Sulis, William H

    2017-10-01

    Walter Freeman III pioneered the application of nonlinear dynamical systems theories and methodologies in his work on mesoscopic brain dynamics. Sadly, mainstream psychology and psychiatry still cling to linear correlation-based data analysis techniques, which threaten to subvert the process of experimentation and theory building. In order to progress, it is necessary to develop tools capable of managing the stochastic complexity of complex biopsychosocial systems, which includes multilevel feedback relationships, nonlinear interactions, chaotic dynamics and adaptability. In addition, however, these systems exhibit intrinsic randomness, non-Gaussian probability distributions, non-stationarity, contextuality, and non-Kolmogorov probabilities, as well as the absence of mean and/or variance and conditional probabilities. These properties and their implications for statistical analysis are discussed. An alternative approach, the Process Algebra approach, is described. It is a generative model, capable of generating non-Kolmogorov probabilities. It has proven useful in addressing fundamental problems in quantum mechanics and in the modeling of developing psychosocial systems.

  1. Unified nano-mechanics based probabilistic theory of quasibrittle and brittle structures: I. Strength, static crack growth, lifetime and scaling

    NASA Astrophysics Data System (ADS)

    Le, Jia-Liang; Bažant, Zdeněk P.; Bazant, Martin Z.

    2011-07-01

    Engineering structures must be designed for an extremely low failure probability such as 10^-6, which is beyond the means of direct verification by histogram testing. This is not a problem for brittle or ductile materials because the type of probability distribution of structural strength is fixed and known, making it possible to predict the tail probabilities from the mean and variance. It is a problem, though, for quasibrittle materials for which the type of strength distribution transitions from Gaussian to Weibullian as the structure size increases. These are heterogeneous materials with brittle constituents, characterized by material inhomogeneities that are not negligible compared to the structure size. Examples include concrete, fiber composites, coarse-grained or toughened ceramics, rocks, sea ice, rigid foams and bone, as well as many materials used in nano- and microscale devices. This study presents a unified theory of strength and lifetime for such materials, based on activation energy controlled random jumps of the nano-crack front, and on the nano-macro multiscale transition of tail probabilities. Part I of this study deals with the case of monotonic and sustained (or creep) loading, and Part II with fatigue (or cyclic) loading. On the scale of the representative volume element of material, the probability distribution of strength has a Gaussian core onto which a remote Weibull tail is grafted at failure probability of the order of 10^-3. With increasing structure size, the Weibull tail penetrates into the Gaussian core. The probability distribution of static (creep) lifetime is related to the strength distribution by the power law for the static crack growth rate, for which a physical justification is given. The present theory yields a simple relation between the exponent of this law and the Weibull moduli for strength and lifetime. The benefit is that the lifetime distribution can be predicted from short-time tests of the mean size effect on strength and tests of the power law for the crack growth rate. The theory is shown to match closely numerous test data on strength and static lifetime of ceramics and concrete, and explains why their histograms deviate systematically from the straight line in Weibull scale. Although the present unified theory is built on several previous advances, new contributions are here made to address: (i) a crack in a disordered nano-structure (such as that of hydrated Portland cement), (ii) tail probability of a fiber bundle (or parallel coupling) model with softening elements, (iii) convergence of this model to the Gaussian distribution, (iv) the stress-life curve under constant load, and (v) a detailed random walk analysis of crack front jumps in an atomic lattice. The nonlocal behavior is captured in the present theory through the finiteness of the number of links in the weakest-link model, which explains why the mean size effect coincides with that of the previously formulated nonlocal Weibull theory. Brittle structures correspond to the large-size limit of the present theory. An important practical conclusion is that the safety factors for strength and tolerable minimum lifetime for large quasibrittle structures (e.g., concrete structures and composite airframes or ship hulls, as well as various micro-devices) should be calculated as a function of structure size and geometry.
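
    The weakest-link scaling that drives the Gaussian-to-Weibull transition can be sketched with a toy calculation. The parameters below (mean RVE strength, coefficient of variation, Weibull modulus, grafting probability) are illustrative assumptions, not the calibrated values of the paper.

      # Toy weakest-link sketch (illustrative parameters, not the paper's
      # calibration).  If one representative volume element (RVE) fails with
      # probability P1(s) at stress s, a chain of N links fails with
      # probability P_N(s) = 1 - (1 - P1(s))**N, so the power-law (Weibull)
      # lower tail of P1 increasingly controls P_N as N grows.
      import numpy as np
      from scipy.stats import norm

      def p1(s, mean=1.0, cv=0.2, m=24, graft=1e-3):
          """Toy RVE strength cdf: Gaussian core with a power-law tail of
          exponent m grafted on at probability `graft` (all values assumed)."""
          sd = cv * mean
          gauss = norm.cdf(s, loc=mean, scale=sd)
          s_graft = norm.ppf(graft, loc=mean, scale=sd)   # grafting stress
          tail = graft * (s / s_graft) ** m               # power-law continuation
          return np.where(s < s_graft, tail, gauss)

      stresses = np.linspace(0.3, 1.0, 8)
      for N in (1, 10, 1000, 100_000):
          pN = 1.0 - (1.0 - p1(stresses)) ** N
          print(f"N = {N:>7}:", np.round(pN, 6))
      # As N grows, failure at a given stress is dominated by the grafted tail,
      # and the strength distribution migrates from Gaussian toward Weibull.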

  2. Understanding clinical and non-clinical decisions under uncertainty: a scenario-based survey.

    PubMed

    Simianu, Vlad V; Grounds, Margaret A; Joslyn, Susan L; LeClerc, Jared E; Ehlers, Anne P; Agrawal, Nidhi; Alfonso-Cristancho, Rafael; Flaxman, Abraham D; Flum, David R

    2016-12-01

    Prospect theory suggests that when faced with an uncertain outcome, people display loss aversion by preferring to risk a greater loss rather than incurring a certain, lesser cost. Providing probability information improves decision making towards the economically optimal choice in these situations. Clinicians frequently make decisions when the outcome is uncertain, and loss aversion may influence choices. This study explores the extent to which prospect theory, loss aversion, and probability information in a non-clinical domain explain clinical decision making under uncertainty. Four hundred sixty-two participants (n = 117 non-medical undergraduates, n = 113 medical students, n = 117 resident trainees, and n = 115 medical/surgical faculty) completed a three-part online task. First, participants completed an iced-road salting task using temperature forecasts with or without explicit probability information. Second, participants chose between less or more risk-averse ("defensive medicine") decisions in standardized scenarios. Last, participants chose between recommending therapy with certain outcomes or risking additional years gained or lost. In the road salting task, the mean expected value for decisions made by clinicians was better than for non-clinicians (-$1,022 vs -$1,061; p < 0.001). Probability information improved decision making for all participants, but non-clinicians improved more (mean improvement of $64 versus $33; p = 0.027). Mean defensive decisions decreased across training level (medical students 2.1 ± 0.9, residents 1.6 ± 0.8, faculty 1.6 ± 1.1; p-trend < 0.001) and prospect-theory-concordant decisions increased (25.4%, 33.9%, and 40.7%; p-trend = 0.016). There was no relationship identified between road salting choices and defensive medicine or prospect-theory-concordant decisions. All participants made more economically rational decisions when provided explicit probability information in a non-clinical domain. However, choices in the non-clinical domain were not related to prospect-theory-concordant decision making and risk-aversion tendencies in the clinical domain. Recognizing this discordance may be important when applying prospect theory to interventions aimed at improving clinical care.

  3. Quantum Probability -- A New Direction for Modeling in Cognitive Science

    NASA Astrophysics Data System (ADS)

    Roy, Sisir

    2014-07-01

    Human cognition and its appropriate modeling remain puzzling issues in research. Cognition depends on how the brain behaves at a particular instant and identifies and responds to a signal among the myriad noises that are present in the surroundings (called external noise) as well as in the neurons themselves (called internal noise). Thus it is not surprising to assume that this functionality involves various uncertainties, possibly a mixture of aleatory and epistemic uncertainties. It is also possible that a complicated pathway consisting of both types of uncertainties in a continuum plays a major role in human cognition. For more than 200 years mathematicians and philosophers have been using probability theory to describe human cognition. Recently, in several experiments with human subjects, violations of traditional probability theory have been clearly revealed in many cases. A literature survey clearly suggests that classical probability theory fails to model human cognition beyond a certain limit. While the Bayesian approach may seem to be a promising candidate for this problem, the complete success story of Bayesian methodology is yet to be written. The major problem seems to be the presence of epistemic uncertainty and its effect on cognition at any given time. Moreover, the stochasticity in the model arises due to the unknown path or trajectory (the definite state of mind at each time point) that a person is following. To this end, a generalized version of probability theory borrowing ideas from quantum mechanics may be a plausible approach. A superposition state in quantum theory permits a person to be in an indefinite state at each point of time. Such an indefinite state allows all the states to have the potential to be expressed at each moment. Thus a superposition state appears better able to represent the uncertainty, ambiguity or conflict experienced by a person at any moment, suggesting that mental states follow quantum mechanics during perception and cognition of ambiguous figures.

  4. Reality, Causality, and Probability, from Quantum Mechanics to Quantum Field Theory

    NASA Astrophysics Data System (ADS)

    Plotnitsky, Arkady

    2015-10-01

    These three lectures consider the questions of reality, causality, and probability in quantum theory, from quantum mechanics to quantum field theory. They do so in part by exploring the ideas of the key founding figures of the theory, such as N. Bohr, W. Heisenberg, E. Schrödinger, or P. A. M. Dirac. However, while my discussion of these figures aims to be faithful to their thinking and writings, and while these lectures are motivated by my belief in the helpfulness of their thinking for understanding and advancing quantum theory, this project is not driven by loyalty to their ideas. In part for that reason, these lectures also present different and even conflicting ways of thinking in quantum theory, such as that of Bohr or Heisenberg vs. that of Schrödinger. The lectures, most especially the third one, also consider new physical, mathematical, and philosophical complexities brought in by quantum field theory vis-à-vis quantum mechanics. I close by briefly addressing some of the implications of the argument presented here for the current state of fundamental physics.

  5. Quantum Bayesian perspective for intelligence reservoir characterization, monitoring and management

    NASA Astrophysics Data System (ADS)

    Lozada Aguilar, Miguel Ángel; Khrennikov, Andrei; Oleschko, Klaudia; de Jesús Correa, María

    2017-10-01

    The paper starts with a brief review of the literature about uncertainty in geological, geophysical and petrophysical data. In particular, we present the viewpoints of experts in geophysics on the application of Bayesian inference and subjective probability. Then we present arguments that the use of classical probability theory (CP) does not match completely the structure of geophysical data. We emphasize that such data are characterized by contextuality and non-Kolmogorovness (the impossibility to use the CP model), incompleteness as well as incompatibility of some geophysical measurements. These characteristics of geophysical data are similar to the characteristics of quantum physical data. Notwithstanding all this, contextuality can be seen as a major deviation of quantum theory from classical physics. In particular, the contextual probability viewpoint is the essence of the Växjö interpretation of quantum mechanics. We propose to use quantum probability (QP) for decision-making during the characterization, modelling, exploring and management of the intelligent hydrocarbon reservoir. Quantum Bayesianism (QBism), one of the recently developed information interpretations of quantum theory, can be used as the interpretational basis for such QP decision-making in geology, geophysics and petroleum projects design and management. This article is part of the themed issue 'Second quantum revolution: foundational questions'.

  6. Dopaminergic Drug Effects on Probability Weighting during Risky Decision Making.

    PubMed

    Ojala, Karita E; Janssen, Lieneke K; Hashemi, Mahur M; Timmer, Monique H M; Geurts, Dirk E M; Ter Huurne, Niels P; Cools, Roshan; Sescousse, Guillaume

    2018-01-01

    Dopamine has been associated with risky decision-making, as well as with pathological gambling, a behavioral addiction characterized by excessive risk-taking behavior. However, the specific mechanisms through which dopamine might act to foster risk-taking and pathological gambling remain elusive. Here we test the hypothesis that this might be achieved, in part, via modulation of subjective probability weighting during decision making. Human healthy controls (n = 21) and pathological gamblers (n = 16) played a decision-making task involving choices between sure monetary options and risky gambles both in the gain and loss domains. Each participant played the task twice, either under placebo or the dopamine D2/D3 receptor antagonist sulpiride, in a double-blind counterbalanced design. A prospect theory modelling approach was used to estimate subjective probability weighting and sensitivity to monetary outcomes. Consistent with prospect theory, we found that participants presented a distortion in the subjective weighting of probabilities, i.e., they overweighted low probabilities and underweighted moderate to high probabilities, both in the gain and loss domains. Compared with placebo, sulpiride attenuated this distortion in the gain domain. Across drugs, the groups did not differ in their probability weighting, although gamblers consistently underweighted losing probabilities in the placebo condition. Overall, our results reveal that dopamine D2/D3 receptor antagonism modulates the subjective weighting of probabilities in the gain domain, in the direction of more objective, economically rational decision making.
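
    For readers unfamiliar with probability weighting, the sketch below shows a standard one-parameter weighting function of the Tversky-Kahneman (1992) form; the abstract does not state which parameterization was fitted, so this is purely illustrative.

      # Standard one-parameter probability weighting function (illustrative;
      # not necessarily the parameterization fitted by the authors):
      #     w(p) = p**g / (p**g + (1 - p)**g) ** (1 / g)
      # g < 1 overweights small and underweights large probabilities; g = 1
      # recovers objective, linear weighting.
      import numpy as np

      def weight(p, gamma):
          p = np.asarray(p, dtype=float)
          return p**gamma / (p**gamma + (1.0 - p)**gamma) ** (1.0 / gamma)

      probs = np.array([0.01, 0.1, 0.3, 0.5, 0.7, 0.9, 0.99])
      for gamma in (0.6, 0.8, 1.0):
          print(f"gamma = {gamma}:", np.round(weight(probs, gamma), 3))
      # Attenuating the distortion corresponds to gamma moving toward 1.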

  7. Cellular Automata Generalized To An Inferential System

    NASA Astrophysics Data System (ADS)

    Blower, David J.

    2007-11-01

    Stephen Wolfram popularized elementary one-dimensional cellular automata in his book, A New Kind of Science. Among many remarkable things, he proved that one of these cellular automata was a Universal Turing Machine. Such cellular automata can be interpreted in a different way by viewing them within the context of the formal manipulation rules from probability theory. Bayes's Theorem is the most famous of such formal rules. As a prelude, we recapitulate Jaynes's presentation of how probability theory generalizes classical logic using modus ponens as the canonical example. We emphasize the important conceptual standing of Boolean Algebra for the formal rules of probability manipulation and give an alternative demonstration augmenting and complementing Jaynes's derivation. We show the complementary roles played in arguments of this kind by Bayes's Theorem and joint probability tables. A good explanation for all of this is afforded by the expansion of any particular logic function via the disjunctive normal form (DNF). The DNF expansion is a useful heuristic emphasized in this exposition because such expansions point out where relevant 0s should be placed in the joint probability tables for logic functions involving any number of variables. It then becomes a straightforward exercise to rely on Boolean Algebra, Bayes's Theorem, and joint probability tables in extrapolating to Wolfram's cellular automata. Cellular automata are seen as purely deductive systems, just like classical logic, which probability theory is then able to generalize. Thus, any uncertainties which we might like to introduce into the discussion about cellular automata are handled with ease via the familiar inferential path. Most importantly, the difficult problem of predicting what cellular automata will do in the far future is treated like any inferential prediction problem.

  8. P value and the theory of hypothesis testing: an explanation for new researchers.

    PubMed

    Biau, David Jean; Jolles, Brigitte M; Porcher, Raphaël

    2010-03-01

    In the 1920s, Ronald Fisher developed the theory behind the p value and Jerzy Neyman and Egon Pearson developed the theory of hypothesis testing. These distinct theories have provided researchers important quantitative tools to confirm or refute their hypotheses. The p value is the probability to obtain an effect equal to or more extreme than the one observed presuming the null hypothesis of no effect is true; it gives researchers a measure of the strength of evidence against the null hypothesis. As commonly used, investigators will select a threshold p value below which they will reject the null hypothesis. The theory of hypothesis testing allows researchers to reject a null hypothesis in favor of an alternative hypothesis of some effect. As commonly used, investigators choose Type I error (rejecting the null hypothesis when it is true) and Type II error (accepting the null hypothesis when it is false) levels and determine some critical region. If the test statistic falls into that critical region, the null hypothesis is rejected in favor of the alternative hypothesis. Despite similarities between the two, the p value and the theory of hypothesis testing are different theories that often are misunderstood and confused, leading researchers to improper conclusions. Perhaps the most common misconception is to consider the p value as the probability that the null hypothesis is true rather than the probability of obtaining the difference observed, or one that is more extreme, considering the null is true. Another concern is the risk that an important proportion of statistically significant results are falsely significant. Researchers should have a minimum understanding of these two theories so that they are better able to plan, conduct, interpret, and report scientific experiments.
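
    The two quantities can be made concrete with a small simulation, shown below; it is an illustration under assumed normal data and a conventional alpha = 0.05, not an example from the paper.

      # Illustrative sketch: p values from a two-sided one-sample t test, and
      # simulated Type I error and power for the threshold alpha = 0.05.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(7)
      alpha, n, trials = 0.05, 30, 5_000

      def reject(mu_true):
          x = rng.normal(loc=mu_true, scale=1.0, size=n)
          p_value = stats.ttest_1samp(x, popmean=0.0).pvalue
          return p_value < alpha

      type_i = np.mean([reject(0.0) for _ in range(trials)])  # null is true
      power  = np.mean([reject(0.5) for _ in range(trials)])  # true effect: 0.5 SD

      print(f"Type I error rate ~ {type_i:.3f}  (close to alpha = 0.05)")
      print(f"Power             ~ {power:.3f}  (Type II error ~ {1 - power:.3f})")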

  9. Probability theory versus simulation of petroleum potential in play analysis

    USGS Publications Warehouse

    Crovelli, R.A.

    1987-01-01

    An analytic probabilistic methodology for resource appraisal of undiscovered oil and gas resources in play analysis is presented. This play-analysis methodology is a geostochastic system for petroleum resource appraisal in explored as well as frontier areas. An objective was to replace an existing Monte Carlo simulation method in order to increase the efficiency of the appraisal process. Underlying the two methods is a single geologic model which considers both the uncertainty of the presence of the assessed hydrocarbon and its amount if present. The results of the model are resource estimates of crude oil, nonassociated gas, dissolved gas, and gas for a geologic play in terms of probability distributions. The analytic method is based upon conditional probability theory and a closed form solution of all means and standard deviations, along with the probabilities of occurrence. © 1987 J.C. Baltzer A.G., Scientific Publishing Company.

  10. Theory of Stochastic Laplacian Growth

    NASA Astrophysics Data System (ADS)

    Alekseev, Oleg; Mineev-Weinstein, Mark

    2017-07-01

    We generalize the diffusion-limited aggregation by issuing many randomly-walking particles, which stick to a cluster at the discrete time unit providing its growth. Using simple combinatorial arguments we determine probabilities of different growth scenarios and prove that the most probable evolution is governed by the deterministic Laplacian growth equation. A potential-theoretical analysis of the growth probabilities reveals connections with the tau-function of the integrable dispersionless limit of the two-dimensional Toda hierarchy, normal matrix ensembles, and the two-dimensional Dyson gas confined in a non-uniform magnetic field. We introduce the time-dependent Hamiltonian, which generates transitions between different classes of equivalence of closed curves, and prove the Hamiltonian structure of the interface dynamics. Finally, we propose a relation between probabilities of growth scenarios and the semi-classical limit of certain correlation functions of "light" exponential operators in the Liouville conformal field theory on a pseudosphere.

  11. Monoamines and assessment of risks.

    PubMed

    Takahashi, Hidehiko

    2012-12-01

    Over the past decade, neuroeconomics studies utilizing neurophysiology methods (fMRI or EEG) have flourished, revealing the neural basis of 'boundedly rational' or 'irrational' decision-making that violates normative theory. The next question is how modulatory neurotransmission is involved in these central processes. Here I focused on recent efforts to understand how central monoamine transmission is related to nonlinear probability weighting and loss aversion, central features of prospect theory, which is a leading alternative to normative theory for decision-making under risk. Circumstantial evidence suggests that dopamine tone might be related to distortion of subjective reward probability and noradrenaline and serotonin tone might influence aversive emotional reaction to potential loss. Copyright © 2012 Elsevier Ltd. All rights reserved.

  12. Making Decisions about an Educational Game, Simulation or Workshop: A 'Game Theory' Perspective.

    ERIC Educational Resources Information Center

    Cryer, Patricia

    1988-01-01

    Uses game theory to help practitioners make decisions about educational games, simulations, or workshops whose outcomes depend to some extent on chance. Highlights include principles for making decisions involving risk; elementary laws of probability; utility theory; and principles for making decisions involving uncertainty. (eight references)…

  13. Uncertainty plus prior equals rational bias: an intuitive Bayesian probability weighting function.

    PubMed

    Fennell, John; Baddeley, Roland

    2012-10-01

    Empirical research has shown that when making choices based on probabilistic options, people behave as if they overestimate small probabilities, underestimate large probabilities, and treat positive and negative outcomes differently. These distortions have been modeled using a nonlinear probability weighting function, which is found in several nonexpected utility theories, including rank-dependent models and prospect theory; here, we propose a Bayesian approach to the probability weighting function and, with it, a psychological rationale. In the real world, uncertainty is ubiquitous and, accordingly, the optimal strategy is to combine probability statements with prior information using Bayes' rule. First, we show that any reasonable prior on probabilities leads to 2 of the observed effects: overweighting of low probabilities and underweighting of high probabilities. We then investigate 2 plausible kinds of priors: informative priors based on previous experience and uninformative priors of ignorance. Individually, these priors potentially lead to large problems of bias and inefficiency, respectively; however, when combined using Bayesian model comparison methods, both forms of prior can be applied adaptively, gaining the efficiency of empirical priors and the robustness of ignorance priors. We illustrate this for the simple case of generic good and bad options, using Internet blogs to estimate the relevant priors of inference. Given this combined ignorant/informative prior, the Bayesian probability weighting function is not only robust and efficient but also matches all of the major characteristics of the distortions found in empirical research. PsycINFO Database Record (c) 2012 APA, all rights reserved.
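
    One simple way to see the qualitative effect, under assumptions that are not the authors' exact model, is to treat a stated probability p as if it summarized a finite number m of observations and combine it with a symmetric Beta(a, a) prior by Bayes' rule; the posterior mean is pulled toward 1/2, which overweights small and underweights large probabilities.

      # Minimal sketch of the qualitative idea (assumed values m = 10, a = 1;
      # this is not the authors' exact model): a stated probability p combined
      # with a symmetric Beta(a, a) prior as if it summarized m observations.
      import numpy as np

      def posterior_mean(p, m=10, a=1.0):
          p = np.asarray(p, dtype=float)
          return (p * m + a) / (m + 2.0 * a)   # Beta-binomial posterior mean

      probs = np.array([0.01, 0.05, 0.2, 0.5, 0.8, 0.95, 0.99])
      print("stated p        :", probs)
      print("posterior weight:", np.round(posterior_mean(probs), 3))
      # Small probabilities are pulled up and large ones pulled down, matching
      # the shape of empirical probability weighting functions.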

  14. Inconclusive quantum measurements and decisions under uncertainty

    NASA Astrophysics Data System (ADS)

    Yukalov, Vyacheslav; Sornette, Didier

    2016-04-01

    We give a mathematical definition for the notion of inconclusive quantum measurements. In physics, such measurements occur at intermediate stages of a complex measurement procedure, with the final measurement result being operationally testable. Since the mathematical structure of Quantum Decision Theory has been developed in analogy with the theory of quantum measurements, the inconclusive quantum measurements correspond, in Quantum Decision Theory, to intermediate stages of decision making in the process of taking decisions under uncertainty. The general form of the quantum probability for a composite event is the sum of a utility factor, describing a rational evaluation of the considered prospect, and of an attraction factor, characterizing irrational, subconscious attitudes of the decision maker. Despite the involved irrationality, the probability of prospects can be evaluated. This is equivalent to the possibility of calculating quantum probabilities without specifying hidden variables. We formulate a general way of evaluation, based on the use of non-informative priors. As an example, we suggest the explanation of the decoy effect. Our quantitative predictions are in very good agreement with experimental data.

  15. Introducing the Qplex: a novel arena for quantum theory

    NASA Astrophysics Data System (ADS)

    Appleby, Marcus; Fuchs, Christopher A.; Stacey, Blake C.; Zhu, Huangjun

    2017-07-01

    We reconstruct quantum theory starting from the premise that, as Asher Peres remarked, "Unperformed experiments have no results." The tools of quantum information theory, and in particular the symmetric informationally complete (SIC) measurements, provide a concise expression of how exactly Peres's dictum holds true. That expression is a constraint on how the probability distributions for outcomes of different, hypothetical and mutually exclusive experiments ought to mesh together, a type of constraint not foreseen in classical thinking. Taking this as our foundational principle, we show how to reconstruct the formalism of quantum theory in finite-dimensional Hilbert spaces. The central variety of mathematical entity in our reconstruction is the qplex, a very particular type of subset of a probability simplex. Along the way, by closely studying the symmetry properties of qplexes, we derive a condition for the existence of a d-dimensional SIC.

  16. A cosmic book. [of physics of early universe

    NASA Technical Reports Server (NTRS)

    Peebles, P. J. E.; Silk, Joseph

    1988-01-01

    A system of assigning odds to the basic elements of cosmological theories is proposed in order to evaluate the strengths and weaknesses of the theories. A figure of merit for the theories is obtained by counting and weighing the plausibility of each of the basic elements that is not substantially supported by observation or mature fundamental theory. The magnetized string model is found to be the most probable. In order of decreasing probability, the ranking for the rest of the models is: (1) the magnetized string model with no exotic matter and the baryon adiabatic model; (2) the hot dark matter model and the model of cosmic string loops; (3) the canonical cold dark matter model, the cosmic string loops model with hot dark matter, and the baryonic isocurvature model; and (4) the cosmic string loops model with no exotic matter.

  17. Measurements and mathematical formalism of quantum mechanics

    NASA Astrophysics Data System (ADS)

    Slavnov, D. A.

    2007-03-01

    A scheme for constructing quantum mechanics is given that does not have Hilbert space and linear operators as its basic elements. Instead, a version of algebraic approach is considered. Elements of a noncommutative algebra (observables) and functionals on this algebra (elementary states) associated with results of single measurements are used as primary components of the scheme. On the one hand, it is possible to use within the scheme the formalism of the standard (Kolmogorov) probability theory, and, on the other hand, it is possible to reproduce the mathematical formalism of standard quantum mechanics, and to study the limits of its applicability. A short outline is given of the necessary material from the theory of algebras and probability theory. It is described how the mathematical scheme of the paper agrees with the theory of quantum measurements, and avoids quantum paradoxes.

  18. Betting on the outcomes of measurements: a Bayesian theory of quantum probability

    NASA Astrophysics Data System (ADS)

    Pitowsky, Itamar

    We develop a systematic approach to quantum probability as a theory of rational betting in quantum gambles. In these games of chance, the agent is betting in advance on the outcomes of several (finitely many) incompatible measurements. One of the measurements is subsequently chosen and performed and the money placed on the other measurements is returned to the agent. We show how the rules of rational betting imply all the interesting features of quantum probability, even in such finite gambles. These include the uncertainty principle and the violation of Bell's inequality among others. Quantum gambles are closely related to quantum logic and provide a new semantics for it. We conclude with a philosophical discussion on the interpretation of quantum mechanics.

  19. Simple and compact expressions for neutrino oscillation probabilities in matter

    DOE PAGES

    Minakata, Hisakazu; Parke, Stephen J.

    2016-01-29

    We reformulate perturbation theory for neutrino oscillations in matter with an expansion parameter related to the ratio of the solar to the atmospheric Δm² scales. Unlike previous works, we use a renormalized basis in which certain first-order effects are taken into account in the zeroth-order Hamiltonian. Using this perturbation theory we derive extremely compact expressions for the neutrino oscillation probabilities in matter. We find, for example, that the ν_e disappearance probability at this order is of a simple two-flavor form with an appropriately identified mixing angle and Δm². Furthermore, despite the exceptional simplicity of their forms, they accommodate all-order effects of θ13 and the matter potential.
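
    To make the "simple two-flavor form" concrete, the sketch below evaluates the textbook two-flavor vacuum survival probability; the paper's result replaces the mixing angle and Δm² by effective, matter-corrected values that are not derived here, and the oscillation parameters used are typical global-fit magnitudes quoted only for illustration.

      # Textbook two-flavour vacuum survival probability (illustration only;
      # not the matter-corrected expressions of the paper):
      #     P(nu_e -> nu_e) = 1 - sin^2(2 theta) * sin^2(1.267 dm2 L / E)
      # with dm2 in eV^2, L in km and E in GeV.  The parameter values below
      # (sin^2 2theta ~ 0.085, dm2 ~ 2.5e-3 eV^2) are assumed typical magnitudes.
      import numpy as np

      def survival(L_km, E_GeV, sin2_2theta=0.085, dm2_eV2=2.5e-3):
          phase = 1.267 * dm2_eV2 * L_km / E_GeV
          return 1.0 - sin2_2theta * np.sin(phase) ** 2

      for L, E in [(1.0, 0.004), (1.8, 0.004)]:      # reactor-like baselines
          print(f"L = {L:.1f} km, E = {E:.3f} GeV:  P_ee = {survival(L, E):.4f}")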

  20. Quantile Functions, Convergence in Quantile, and Extreme Value Distribution Theory.

    DTIC Science & Technology

    1980-11-01

    Gnanadesikan (1968). Quantile functions are advocated by Parzen (1979) as providing an approach to probability-based data analysis. Quantile functions are... Gnanadesikan, R. (1968). Probability Plotting Methods for the Analysis of Data, Biometrika, 55, 1-17.

  1. A brief introduction to probability.

    PubMed

    Di Paola, Gioacchino; Bertani, Alessandro; De Monte, Lavinia; Tuzzolino, Fabio

    2018-02-01

    The theory of probability has been debated for centuries: back in 1600, French mathematicians used the rules of probability to place and win bets. Subsequently, the knowledge of probability has significantly evolved and is now an essential tool for statistics. In this paper, the basic theoretical principles of probability will be reviewed, with the aim of facilitating the comprehension of statistical inference. After a brief general introduction on probability, we will review the concept of the "probability distribution", that is, a function providing the probabilities of occurrence of the different possible outcomes of a categorical or continuous variable. Specific attention will be focused on the normal distribution, which is the most relevant distribution applied in statistical analysis.
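
    As a concrete illustration of a probability distribution, the short sketch below (not from the paper) evaluates interval probabilities for the standard normal distribution, recovering the familiar 68/95/99.7 percentages.

      # Illustrative sketch: probabilities of intervals under the standard
      # normal distribution come from its cumulative distribution function.
      from scipy.stats import norm

      dist = norm(loc=0.0, scale=1.0)        # standard normal distribution
      for k in (1, 2, 3):
          p = dist.cdf(k) - dist.cdf(-k)     # P(-k < Z < k)
          print(f"P(|Z| < {k}) = {p:.4f}")
      # Output: ~0.6827, ~0.9545, ~0.9973 -- the 68/95/99.7 rule.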

  2. Chapter 5 Multiple, Localized, and Delocalized/Conjugated Bonds in the Orbital Communication Theory of Molecular Systems

    NASA Astrophysics Data System (ADS)

    Nalewajski, Roman F.

    The information theory (IT) probe of the molecular electronic structure, within the communication theory of chemical bonds (CTCB), uses the standard entropy/information descriptors of the Shannon theory of communication to characterize a scattering of the electronic probabilities and their information content throughout the system chemical bonds generated by the occupied molecular orbitals (MO). These "communications" between the basis-set orbitals are determined by the two-orbital conditional probabilities: one- and two-electron in character. They define the molecular information system, in which the electron-allocation "signals" are transmitted between various orbital "inputs" and "outputs". It is argued, using the quantum mechanical superposition principle, that the one-electron conditional probabilities are proportional to the squares of corresponding elements of the charge and bond-order (CBO) matrix of the standard LCAO MO theory. Therefore, the probability of the interorbital connections in the molecular communication system is directly related to Wiberg's quadratic covalency indices of chemical bonds. The conditional-entropy (communication "noise") and mutual-information (information capacity) descriptors of these molecular channels generate the IT-covalent and IT-ionic bond components, respectively. The former reflects the electron delocalization (indeterminacy) due to the orbital mixing, throughout all chemical bonds in the system under consideration. The latter characterizes the localization (determinacy) in the probability scattering in the molecule. These two IT indices, respectively, indicate a fraction of the input information lost in the channel output, due to the communication noise, and its surviving part, due to deterministic elements in probability scattering in the molecular network. Together, these two components generate the system overall bond index. By a straightforward output reduction (condensation) of the molecular channel, the IT indices of molecular fragments, for example, localized bonds, functional groups, and forward and back donations accompanying the bond formation, and so on, can be extracted. The flow of information in such molecular communication networks is investigated in several prototype molecules. These illustrative (model) applications of the orbital communication theory of chemical bonds (CTCB) deal with several classical issues in the electronic structure theory: atom hybridization/promotion, single and multiple chemical bonds, bond conjugation, and so on. The localized bonds in hydrides and delocalized π-bonds in simple hydrocarbons, as well as the multiple bonds in CO and CO2, are diagnosed using the entropy/information descriptors of CTCB. The atom promotion in hydrides and bond conjugation in π-electron systems are investigated in more detail. A major drawback of the previous two-electron approach to molecular channels, namely, too weak a bond differentiation in aromatic systems, has been shown to be remedied in the one-electron approach.

  3. A sampling-based computational strategy for the representation of epistemic uncertainty in model predictions with evidence theory.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, J. D.; Oberkampf, William Louis; Helton, Jon Craig

    2006-10-01

    Evidence theory provides an alternative to probability theory for the representation of epistemic uncertainty in model predictions that derives from epistemic uncertainty in model inputs, where the descriptor epistemic is used to indicate uncertainty that derives from a lack of knowledge with respect to the appropriate values to use for various inputs to the model. The potential benefit, and hence appeal, of evidence theory is that it allows a less restrictive specification of uncertainty than is possible within the axiomatic structure on which probability theory is based. Unfortunately, the propagation of an evidence theory representation for uncertainty through a model is more computationally demanding than the propagation of a probabilistic representation for uncertainty, with this difficulty constituting a serious obstacle to the use of evidence theory in the representation of uncertainty in predictions obtained from computationally intensive models. This presentation describes and illustrates a sampling-based computational strategy for the representation of epistemic uncertainty in model predictions with evidence theory. Preliminary trials indicate that the presented strategy can be used to propagate uncertainty representations based on evidence theory in analysis situations where naive sampling-based (i.e., unsophisticated Monte Carlo) procedures are impracticable due to computational cost.

  4. Using a Card Trick to Illustrate Fixed Points and Stability

    ERIC Educational Resources Information Center

    Champanerkar, Jyoti; Jani, Mahendra

    2015-01-01

    Mathematical ideas from number theory, group theory, dynamical systems, and computer science have often been used to explain card tricks. Conversely, playing cards have been often used to illustrate the mathematical concepts of probability distributions and group theory. In this paper, we describe how the 21-card trick may be used to illustrate…

  5. A Test of Two Theories in the Initial Process Stage of Coalition Formation.

    ERIC Educational Resources Information Center

    Flaherty, John F.; Arenson, Sidney J.

    1978-01-01

    Males and females participated in a coalition formation procedure by interacting with a computer program that simulated a pachisi game situation. Female partner preference data supported a weighted probability model of coalition formation over a bargaining theory. Male partner preference data did not support either theory. (Author)

  6. Dopaminergic Drug Effects on Probability Weighting during Risky Decision Making

    PubMed Central

    Timmer, Monique H. M.; ter Huurne, Niels P.

    2018-01-01

    Abstract Dopamine has been associated with risky decision-making, as well as with pathological gambling, a behavioral addiction characterized by excessive risk-taking behavior. However, the specific mechanisms through which dopamine might act to foster risk-taking and pathological gambling remain elusive. Here we test the hypothesis that this might be achieved, in part, via modulation of subjective probability weighting during decision making. Human healthy controls (n = 21) and pathological gamblers (n = 16) played a decision-making task involving choices between sure monetary options and risky gambles both in the gain and loss domains. Each participant played the task twice, either under placebo or the dopamine D2/D3 receptor antagonist sulpiride, in a double-blind counterbalanced design. A prospect theory modelling approach was used to estimate subjective probability weighting and sensitivity to monetary outcomes. Consistent with prospect theory, we found that participants presented a distortion in the subjective weighting of probabilities, i.e., they overweighted low probabilities and underweighted moderate to high probabilities, both in the gain and loss domains. Compared with placebo, sulpiride attenuated this distortion in the gain domain. Across drugs, the groups did not differ in their probability weighting, although gamblers consistently underweighted losing probabilities in the placebo condition. Overall, our results reveal that dopamine D2/D3 receptor antagonism modulates the subjective weighting of probabilities in the gain domain, in the direction of more objective, economically rational decision making. PMID:29632870

  7. An application of extreme value theory to the management of a hydroelectric dam.

    PubMed

    Minkah, Richard

    2016-01-01

    Assessing the probability of very low or high water levels is an important issue in the management of hydroelectric dams. In the case of the Akosombo dam, very low and high water levels result in load shedding of electrical power and flooding in communities downstream respectively. In this paper, we use extreme value theory to estimate the probability and return period of very low water levels that can result in load shedding or a complete shutdown of the dam's operations. In addition, we assess the probability and return period of high water levels near the height of the dam and beyond. This provides a framework for a possible extension of the dam to sustain the generation of electrical power and reduce the frequency of spillage that causes flooding in communities downstream. The results show that an extension of the dam can reduce the probability and prolong the return period of a flood. In addition, we found a negligible probability of a complete shutdown of the dam due to inadequate water level.
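
    A minimal version of this kind of analysis, on synthetic data rather than the Akosombo record, fits a generalized extreme value (GEV) distribution to annual maximum water levels and converts a level into an annual exceedance probability and return period T = 1/(1 - F(level)); the numbers below are assumptions for illustration only.

      # Illustrative sketch with synthetic data (not the Akosombo record):
      # fit a GEV distribution to annual maxima and report return periods.
      import numpy as np
      from scipy.stats import genextreme

      rng = np.random.default_rng(3)
      annual_maxima = rng.gumbel(84.0, 0.8, size=60)   # synthetic levels (m)

      shape, loc, scale = genextreme.fit(annual_maxima)
      fitted = genextreme(shape, loc=loc, scale=scale)

      for level in (85.0, 86.0, 87.0):
          exceed_p = 1.0 - fitted.cdf(level)           # annual exceedance prob.
          period = np.inf if exceed_p == 0 else 1.0 / exceed_p
          print(f"level {level:.1f} m: P(exceed in a year) = {exceed_p:.4f}, "
                f"return period ~ {period:.0f} years")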

  8. Theory of atomic spectral emission intensity

    NASA Astrophysics Data System (ADS)

    Yngström, Sten

    1994-07-01

    The theoretical derivation of a new spectral line intensity formula for atomic radiative emission is presented. The theory is based on first principles of quantum physics, electrodynamics, and statistical physics. Quantum rules lead to revision of the conventional principle of local thermal equilibrium of matter and radiation. Study of electrodynamics suggests absence of spectral emission from fractions of the numbers of atoms and ions in a plasma due to radiative inhibition caused by electromagnetic force fields. Statistical probability methods are extended by the statement: A macroscopic physical system develops in the most probable of all conceivable ways consistent with the constraining conditions for the system. The crucial role of statistical physics in transforming quantum logic into common sense logic is stressed. The theory is strongly supported by experimental evidence.

  9. Minimal entropy approximation for cellular automata

    NASA Astrophysics Data System (ADS)

    Fukś, Henryk

    2014-02-01

    We present a method for the construction of approximate orbits of measures under the action of cellular automata which is complementary to the local structure theory. The local structure theory is based on the idea of Bayesian extension, that is, construction of a probability measure consistent with given block probabilities and maximizing entropy. If instead of maximizing entropy one minimizes it, one can develop another method for the construction of approximate orbits, at the heart of which is the iteration of finite-dimensional maps, called minimal entropy maps. We present numerical evidence that the minimal entropy approximation sometimes outperforms the local structure theory in characterizing the properties of cellular automata. The density response curve for elementary CA rule 26 is used to illustrate this claim.

  10. Thermodynamic prediction of protein neutrality.

    PubMed

    Bloom, Jesse D; Silberg, Jonathan J; Wilke, Claus O; Drummond, D Allan; Adami, Christoph; Arnold, Frances H

    2005-01-18

    We present a simple theory that uses thermodynamic parameters to predict the probability that a protein retains the wild-type structure after one or more random amino acid substitutions. Our theory predicts that for large numbers of substitutions the probability that a protein retains its structure will decline exponentially with the number of substitutions, with the severity of this decline determined by properties of the structure. Our theory also predicts that a protein can gain extra robustness to the first few substitutions by increasing its thermodynamic stability. We validate our theory with simulations on lattice protein models and by showing that it quantitatively predicts previously published experimental measurements on subtilisin and our own measurements on variants of TEM1 beta-lactamase. Our work unifies observations about the clustering of functional proteins in sequence space, and provides a basis for interpreting the response of proteins to substitutions in protein engineering applications.
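
    The predicted exponential decline is easy to picture with a one-line model. The sketch below assumes, purely for illustration, that each random substitution independently preserves the fold with probability nu = 0.6, so that the retention probability after n substitutions is roughly nu**n; the paper's thermodynamic calculation of this quantity is not reproduced.

      # Toy illustration of the exponential decline in fold retention
      # (nu = 0.6 is an assumed per-substitution retention probability,
      # not a value from the paper).
      nu = 0.6
      for n in range(0, 9):
          print(f"substitutions = {n}:  P(fold retained) ~ {nu**n:.3f}")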

  11. Thermodynamic prediction of protein neutrality

    PubMed Central

    Bloom, Jesse D.; Silberg, Jonathan J.; Wilke, Claus O.; Drummond, D. Allan; Adami, Christoph; Arnold, Frances H.

    2005-01-01

    We present a simple theory that uses thermodynamic parameters to predict the probability that a protein retains the wild-type structure after one or more random amino acid substitutions. Our theory predicts that for large numbers of substitutions the probability that a protein retains its structure will decline exponentially with the number of substitutions, with the severity of this decline determined by properties of the structure. Our theory also predicts that a protein can gain extra robustness to the first few substitutions by increasing its thermodynamic stability. We validate our theory with simulations on lattice protein models and by showing that it quantitatively predicts previously published experimental measurements on subtilisin and our own measurements on variants of TEM1 β-lactamase. Our work unifies observations about the clustering of functional proteins in sequence space, and provides a basis for interpreting the response of proteins to substitutions in protein engineering applications. PMID:15644440

  12. The rational status of quantum cognition.

    PubMed

    Pothos, Emmanuel M; Busemeyer, Jerome R; Shiffrin, Richard M; Yearsley, James M

    2017-07-01

    Classical probability theory (CPT) is generally considered the rational way to make inferences, but there have been some empirical findings showing a divergence between reasoning and the principles of CPT, inviting the conclusion that humans are irrational. Perhaps the most famous of these findings is the conjunction fallacy (CF). Recently, the CF has been shown consistent with the principles of an alternative probabilistic framework, quantum probability theory (QPT). Does this imply that QPT is irrational or does QPT provide an alternative interpretation of rationality? Our presentation consists of 3 parts. First, we examine the putative rational status of QPT using the same argument as used to establish the rationality of CPT, the Dutch Book (DB) argument, according to which reasoners should not commit to bets guaranteeing a loss. We prove the rational status of QPT by formulating it as a particular case of an extended form of CPT, with separate probability spaces produced by changing context. Second, we empirically examine the key requirement for whether a CF can be rational or not; the results show that participants indeed behave rationally, at least relative to the representations they employ. Finally, we consider whether the conditions for the CF to be rational are applicable in the outside (nonmental) world. Our discussion provides a general and alternative perspective for rational probabilistic inference, based on the idea that contextuality requires either reasoning in separate CPT probability spaces or reasoning with QPT principles. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
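
    A toy numerical example (not taken from the paper) shows how non-commuting projectors in QPT can make a sequential judgment "A and then B" exceed the direct judgment of B, the signature of the conjunction fallacy; the state and the angles of the two events are arbitrary choices.

    import numpy as np

    def projector(theta):
        """Rank-1 projector onto the ray at angle theta in a 2-D real Hilbert space."""
        v = np.array([np.cos(theta), np.sin(theta)])
        return np.outer(v, v)

    psi = np.array([1.0, 0.0])                 # initial belief state
    P_A = projector(np.deg2rad(45))            # event A, compatible-looking with the state
    P_B = projector(np.deg2rad(80))            # event B, unlikely when judged directly

    p_B = np.linalg.norm(P_B @ psi) ** 2                     # P(B) by the quantum rule
    p_A_then_B = np.linalg.norm(P_B @ (P_A @ psi)) ** 2      # sequential P(A, then B)

    print(f"P(B) = {p_B:.3f}, P(A then B) = {p_A_then_B:.3f}")  # the sequence is judged more likely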

  13. Quantum Bayesian perspective for intelligence reservoir characterization, monitoring and management.

    PubMed

    Lozada Aguilar, Miguel Ángel; Khrennikov, Andrei; Oleschko, Klaudia; de Jesús Correa, María

    2017-11-13

    The paper starts with a brief review of the literature about uncertainty in geological, geophysical and petrophysical data. In particular, we present the viewpoints of experts in geophysics on the application of Bayesian inference and subjective probability. Then we present arguments that the use of classical probability theory (CP) does not completely match the structure of geophysical data. We emphasize that such data are characterized by contextuality and non-Kolmogorovness (the impossibility to use the CP model), incompleteness as well as incompatibility of some geophysical measurements. These characteristics of geophysical data are similar to the characteristics of quantum physical data. Notwithstanding all this, contextuality can be seen as a major deviation of quantum theory from classical physics. In particular, the contextual probability viewpoint is the essence of the Växjö interpretation of quantum mechanics. We propose to use quantum probability (QP) for decision-making during the characterization, modelling, exploring and management of the intelligent hydrocarbon reservoir. Quantum Bayesianism (QBism), one of the recently developed information interpretations of quantum theory, can be used as the interpretational basis for such QP decision-making in geology, geophysics and petroleum project design and management. This article is part of the themed issue 'Second quantum revolution: foundational questions'. © 2017 The Author(s).

  14. Quantum probability, choice in large worlds, and the statistical structure of reality.

    PubMed

    Ross, Don; Ladyman, James

    2013-06-01

    Classical probability models of incentive response are inadequate in "large worlds," where the dimensions of relative risk and the dimensions of similarity in outcome comparisons typically differ. Quantum probability models for choice in large worlds may be motivated pragmatically - there is no third theory - or metaphysically: statistical processing in the brain adapts to the true scale-relative structure of the universe.

  15. Time Evolving Fission Chain Theory and Fast Neutron and Gamma-Ray Counting Distributions

    DOE PAGES

    Kim, K. S.; Nakae, L. F.; Prasad, M. K.; ...

    2015-11-01

    Here, we solve a simple theoretical model of time evolving fission chains due to Feynman that generalizes and asymptotically approaches the point model theory. The point model theory has been used to analyze thermal neutron counting data. This extension of the theory underlies fast counting data for both neutrons and gamma rays from metal systems. Fast neutron and gamma-ray counting is now possible using liquid scintillator arrays with nanosecond time resolution. For individual fission chains, the differential equations describing three correlated probability distributions are solved: the time-dependent internal neutron population, accumulation of fissions in time, and accumulation of leaked neutrons in time. Explicit analytic formulas are given for correlated moments of the time evolving chain populations. The equations for random time gate fast neutron and gamma-ray counting distributions, due to randomly initiated chains, are presented. Correlated moment equations are given for both random time gate and triggered time gate counting. Explicit formulas are given for all correlated moments up to triple order, for all combinations of correlated fast neutrons and gamma rays. The nonlinear differential equations for probabilities for time-dependent fission chain populations have a remarkably simple Monte Carlo realization. A Monte Carlo code was developed for this theory and is shown to statistically realize the solutions to the fission chain theory probability distributions. Combined with random initiation of chains and detection of external quanta, the Monte Carlo code generates time-tagged data for neutron and gamma-ray counting and, from these data, the counting distributions.
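
    The Monte Carlo realization mentioned at the end can be caricatured with a bare-bones branching-process sketch (not the authors' code): each neutron in a chain either leaks or induces a fission that emits a Poisson number of new neutrons, and the number of leaked neutrons per chain is tallied; all parameters are illustrative only.

    import numpy as np

    rng = np.random.default_rng(2)
    p_fission = 0.3          # probability a chain neutron induces fission (otherwise it leaks)
    nu_mean = 2.4            # mean number of neutrons emitted per fission

    def leaked_per_chain():
        leaked, active = 0, 1                    # each chain starts from one source neutron
        while active:
            active -= 1
            if rng.random() < p_fission:
                active += rng.poisson(nu_mean)   # fission: add the emitted neutrons to the chain
            else:
                leaked += 1                      # leakage: this neutron could be detected
        return leaked

    counts = np.array([leaked_per_chain() for _ in range(100_000)])
    print("mean leaked neutrons per chain:", counts.mean())
    print("P(n leaked), n = 0..5:", [round(float((counts == n).mean()), 4) for n in range(6)])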

  16. Uncertain deduction and conditional reasoning.

    PubMed

    Evans, Jonathan St B T; Thompson, Valerie A; Over, David E

    2015-01-01

    There has been a paradigm shift in the psychology of deductive reasoning. Many researchers no longer think it is appropriate to ask people to assume premises and decide what necessarily follows, with the results evaluated by binary extensional logic. Most everyday and scientific inference is made from more or less confidently held beliefs and not assumptions, and the relevant normative standard is Bayesian probability theory. We argue that the study of "uncertain deduction" should directly ask people to assign probabilities to both premises and conclusions, and report an experiment using this method. We assess this reasoning by two Bayesian metrics: probabilistic validity and coherence according to probability theory. On both measures, participants perform above chance in conditional reasoning, but they do much better when statements are grouped as inferences, rather than evaluated in separate tasks.
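
    The two Bayesian metrics can be made concrete with a small sketch that is independent of the paper's data: the coherence interval for a modus ponens conclusion and a simple probabilistic-validity check; the probability judgments plugged in at the end are invented.

    def mp_coherence_interval(p_if, p_q_given_p):
        """Coherent bounds on P(q) given judged P(p) and P(q|p) for modus ponens."""
        low = p_if * p_q_given_p
        high = p_if * p_q_given_p + (1.0 - p_if)
        return low, high

    def p_valid_response(premise_probs, conclusion_prob):
        """Conforms to probabilistic validity if the conclusion's uncertainty does not
        exceed the summed uncertainty of the premises."""
        return (1.0 - conclusion_prob) <= sum(1.0 - p for p in premise_probs)

    p_p, p_q_given_p, p_q = 0.8, 0.9, 0.6        # a participant's hypothetical judgments
    low, high = mp_coherence_interval(p_p, p_q_given_p)
    print(f"coherent range for P(q): [{low:.2f}, {high:.2f}]; judged {p_q} coherent: {low <= p_q <= high}")
    print("conforms to probabilistic validity:", p_valid_response([p_p, p_q_given_p], p_q))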

  17. The boundaries of instance-based learning theory for explaining decisions from experience.

    PubMed

    Gonzalez, Cleotilde

    2013-01-01

    Most demonstrations of how people make decisions in risky situations rely on decisions from description, where outcomes and their probabilities are explicitly stated. But recently, more attention has been given to decisions from experience where people discover these outcomes and probabilities through exploration. More importantly, risky behavior depends on how decisions are made (from description or experience), and although prospect theory explains decisions from description, a comprehensive model of decisions from experience is yet to be found. Instance-based learning theory (IBLT) explains how decisions are made from experience through interactions with dynamic environments (Gonzalez et al., 2003). The theory has shown robust explanations of behavior across multiple tasks and contexts, but it is becoming unclear what the theory is able to explain and what it does not. The goal of this chapter is to start addressing this problem. I will introduce IBLT and a recent cognitive model based on this theory: the IBL model of repeated binary choice; then I will discuss the phenomena that the IBL model explains and those that the model does not. The argument is for the theory's robustness but also for clarity in terms of concrete effects that the theory can or cannot account for. Copyright © 2013 Elsevier B.V. All rights reserved.

  18. Stochastic processes, estimation theory and image enhancement

    NASA Technical Reports Server (NTRS)

    Assefi, T.

    1978-01-01

    An introductory account of stochastic processes, estimation theory, and image enhancement is presented. The book is primarily intended for first-year graduate students and practicing engineers and scientists whose work requires an acquaintance with the theory. Fundamental concepts of probability that are required to support the main topics are reviewed. The appendices discuss the remaining mathematical background.

  19. Balkanization and Unification of Probabilistic Inferences

    ERIC Educational Resources Information Center

    Yu, Chong-Ho

    2005-01-01

    Many research-related classes in social sciences present probability as a unified approach based upon mathematical axioms, but neglect the diversity of various probability theories and their associated philosophical assumptions. Although currently the dominant statistical and probabilistic approach is the Fisherian tradition, the use of Fisherian…

  20. Generalized quantum theory of recollapsing homogeneous cosmologies

    NASA Astrophysics Data System (ADS)

    Craig, David; Hartle, James B.

    2004-06-01

    A sum-over-histories generalized quantum theory is developed for homogeneous minisuperspace type A Bianchi cosmological models, focusing on the particular example of the classically recollapsing Bianchi type-IX universe. The decoherence functional for such universes is exhibited. We show how the probabilities of decoherent sets of alternative, coarse-grained histories of these model universes can be calculated. We consider in particular the probabilities for classical evolution defined by a suitable coarse graining. For a restricted class of initial conditions and coarse grainings we exhibit the approximate decoherence of alternative histories in which the universe behaves classically and those in which it does not. For these situations we show that the probability is near unity for the universe to recontract classically if it expands classically. We also determine the relative probabilities of quasiclassical trajectories for initial states of WKB form, recovering for such states a precise form of the familiar heuristic “J·dΣ” rule of quantum cosmology, as well as a generalization of this rule to generic initial states.

  1. The probability heuristics model of syllogistic reasoning.

    PubMed

    Chater, N; Oaksford, M

    1999-03-01

    A probability heuristic model (PHM) for syllogistic reasoning is proposed. An informational ordering over quantified statements suggests simple probability-based heuristics for syllogistic reasoning. The most important is the "min-heuristic": choose the type of the least informative premise as the type of the conclusion. The rationality of this heuristic is confirmed by an analysis of the probabilistic validity of syllogistic reasoning which treats logical inference as a limiting case of probabilistic inference. A meta-analysis of past experiments reveals close fits with PHM. PHM also compares favorably with alternative accounts, including mental logics, mental models, and deduction as verbal reasoning. Crucially, PHM extends naturally to generalized quantifiers, such as Most and Few, which have not been characterized logically and are, consequently, beyond the scope of current mental logic and mental model theories. Two experiments confirm the novel predictions of PHM when generalized quantifiers are used in syllogistic arguments. PHM suggests that syllogistic reasoning performance may be determined by simple but rational informational strategies justified by probability theory rather than by logic. Copyright 1999 Academic Press.
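
    The min-heuristic itself is easy to state in code. In the sketch below the informativeness ordering over quantifiers is an assumption made for illustration (most to least informative), not a quotation from the paper.

    INFORMATIVENESS = ["All", "Most", "Few", "Some", "None", "Some...not"]  # assumed ordering

    def min_heuristic_conclusion(premise_type_1, premise_type_2):
        """Min-heuristic: the conclusion takes the type of the least informative premise."""
        return max(premise_type_1, premise_type_2, key=INFORMATIVENESS.index)

    print(min_heuristic_conclusion("All", "Some"))   # -> Some
    print(min_heuristic_conclusion("Most", "Few"))   # -> Few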

  2. Modality, probability, and mental models.

    PubMed

    Hinterecker, Thomas; Knauff, Markus; Johnson-Laird, P N

    2016-10-01

    We report 3 experiments investigating novel sorts of inference, such as: A or B or both. Therefore, possibly (A and B). Where the contents were sensible assertions, for example, Space tourism will achieve widespread popularity in the next 50 years or advances in material science will lead to the development of antigravity materials in the next 50 years, or both. Most participants accepted the inferences as valid, though they are invalid in modal logic and in probabilistic logic too. But, the theory of mental models predicts that individuals should accept them. In contrast, inferences of this sort—A or B but not both. Therefore, A or B or both—are both logically valid and probabilistically valid. Yet, as the model theory also predicts, most reasoners rejected them. The participants’ estimates of probabilities showed that their inferences tended not to be based on probabilistic validity, but that they did rate acceptable conclusions as more probable than unacceptable conclusions. We discuss the implications of the results for current theories of reasoning. PsycINFO Database Record (c) 2016 APA, all rights reserved

  3. Sensitivity and Bias in Decision-Making under Risk: Evaluating the Perception of Reward, Its Probability and Value

    PubMed Central

    Sharp, Madeleine E.; Viswanathan, Jayalakshmi; Lanyon, Linda J.; Barton, Jason J. S.

    2012-01-01

    Background: There are few clinical tools that assess decision-making under risk. Tests that characterize sensitivity and bias in decisions between prospects varying in magnitude and probability of gain may provide insights in conditions with anomalous reward-related behaviour. Objective: We designed a simple test of how subjects integrate information about the magnitude and the probability of reward, which can determine discriminative thresholds and choice bias in decisions under risk. Design/Methods: Twenty subjects were required to choose between two explicitly described prospects, one with higher probability but lower magnitude of reward than the other, with the difference in expected value between the two prospects varying from 3 to 23%. Results: Subjects showed a mean threshold sensitivity of 43% difference in expected value. Regarding choice bias, there was a ‘risk premium’ of 38%, indicating a tendency to choose higher probability over higher reward. An analysis using prospect theory showed that this risk premium is the predicted outcome of hypothesized non-linearities in the subjective perception of reward value and probability. Conclusions: This simple test provides a robust measure of discriminative value thresholds and biases in decisions under risk. Prospect theory can also make predictions about decisions when subjective perception of reward or probability is anomalous, as may occur in populations with dopaminergic or striatal dysfunction, such as Parkinson's disease and schizophrenia. PMID:22493669

  4. Sensitivity and bias in decision-making under risk: evaluating the perception of reward, its probability and value.

    PubMed

    Sharp, Madeleine E; Viswanathan, Jayalakshmi; Lanyon, Linda J; Barton, Jason J S

    2012-01-01

    There are few clinical tools that assess decision-making under risk. Tests that characterize sensitivity and bias in decisions between prospects varying in magnitude and probability of gain may provide insights in conditions with anomalous reward-related behaviour. We designed a simple test of how subjects integrate information about the magnitude and the probability of reward, which can determine discriminative thresholds and choice bias in decisions under risk. Twenty subjects were required to choose between two explicitly described prospects, one with higher probability but lower magnitude of reward than the other, with the difference in expected value between the two prospects varying from 3 to 23%. Subjects showed a mean threshold sensitivity of 43% difference in expected value. Regarding choice bias, there was a 'risk premium' of 38%, indicating a tendency to choose higher probability over higher reward. An analysis using prospect theory showed that this risk premium is the predicted outcome of hypothesized non-linearities in the subjective perception of reward value and probability. This simple test provides a robust measure of discriminative value thresholds and biases in decisions under risk. Prospect theory can also make predictions about decisions when subjective perception of reward or probability is anomalous, as may occur in populations with dopaminergic or striatal dysfunction, such as Parkinson's disease and schizophrenia.

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, K. S.; Nakae, L. F.; Prasad, M. K.

    Here, we solve a simple theoretical model of time evolving fission chains due to Feynman that generalizes and asymptotically approaches the point model theory. The point model theory has been used to analyze thermal neutron counting data. This extension of the theory underlies fast counting data for both neutrons and gamma rays from metal systems. Fast neutron and gamma-ray counting is now possible using liquid scintillator arrays with nanosecond time resolution. For individual fission chains, the differential equations describing three correlated probability distributions are solved: the time-dependent internal neutron population, accumulation of fissions in time, and accumulation of leaked neutrons in time. Explicit analytic formulas are given for correlated moments of the time evolving chain populations. The equations for random time gate fast neutron and gamma-ray counting distributions, due to randomly initiated chains, are presented. Correlated moment equations are given for both random time gate and triggered time gate counting. Explicit formulas are given for all correlated moments up to triple order, for all combinations of correlated fast neutrons and gamma rays. The nonlinear differential equations for probabilities for time-dependent fission chain populations have a remarkably simple Monte Carlo realization. A Monte Carlo code was developed for this theory and is shown to statistically realize the solutions to the fission chain theory probability distributions. Combined with random initiation of chains and detection of external quanta, the Monte Carlo code generates time-tagged data for neutron and gamma-ray counting and, from these data, the counting distributions.

  6. Joint probabilities and quantum cognition

    NASA Astrophysics Data System (ADS)

    de Barros, J. Acacio

    2012-12-01

    In this paper we discuss the existence of joint probability distributions for quantumlike response computations in the brain. We do so by focusing on a contextual neural-oscillator model shown to reproduce the main features of behavioral stimulus-response theory. We then exhibit a simple example of contextual random variables not having a joint probability distribution, and describe how such variables can be obtained from neural oscillators, but not from a quantum observable algebra.
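
    The existence question for a joint distribution can be checked mechanically for three ±1 variables with given pairwise correlations, in the spirit of the example the paper describes (the correlation values below are illustrative, not the paper's): a linear program searches the eight joint atoms.

    import itertools
    import numpy as np
    from scipy.optimize import linprog

    def has_joint(e_ab, e_bc, e_ac):
        """True if some joint distribution over (A, B, C) in {-1, +1}^3 matches the correlations."""
        atoms = list(itertools.product([-1, 1], repeat=3))
        A_eq = [[1.0] * 8,                              # probabilities sum to one
                [a * b for a, b, _ in atoms],           # E[AB]
                [b * c for _, b, c in atoms],           # E[BC]
                [a * c for a, _, c in atoms]]           # E[AC]
        b_eq = [1.0, e_ab, e_bc, e_ac]
        res = linprog(c=np.zeros(8), A_eq=A_eq, b_eq=b_eq, bounds=[(0, 1)] * 8, method="highs")
        return res.success

    print(has_joint(-1.0, -1.0, -1.0))       # False: perfectly anticorrelated pairs are contradictory
    print(has_joint(-1/3, -1/3, -1/3))       # True: realizable, e.g. exactly one variable is +1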

  7. The Probability of Exceedance as a Nonparametric Person-Fit Statistic for Tests of Moderate Length

    ERIC Educational Resources Information Center

    Tendeiro, Jorge N.; Meijer, Rob R.

    2013-01-01

    To classify an item score pattern as not fitting a nonparametric item response theory (NIRT) model, the probability of exceedance (PE) of an observed response vector x can be determined as the sum of the probabilities of all response vectors that are, at most, as likely as x, conditional on the test's total score. Vector x is to be considered…

  8. Teaching Classic Probability Problems With Modern Digital Tools

    ERIC Educational Resources Information Center

    Abramovich, Sergei; Nikitin, Yakov Yu.

    2017-01-01

    This article is written to share teaching ideas about using commonly available computer applications--a spreadsheet, "The Geometer's Sketchpad", and "Wolfram Alpha"--to explore three classic and historically significant problems from the probability theory. These ideas stem from the authors' work with prospective economists,…

  9. Probabilistic Relational Structures and Their Applications

    ERIC Educational Resources Information Center

    Domotor, Zoltan

    The principal objects of the investigation reported were, first, to study qualitative probability relations on Boolean algebras, and secondly, to describe applications in the theories of probability logic, information, automata, and probabilistic measurement. The main contribution of this work is stated in 10 definitions and 20 theorems. The basic…

  10. Teaching Probability in Intermediate Grades

    ERIC Educational Resources Information Center

    Engel, Arthur

    1971-01-01

    A discussion of the importance and procedures for including probability in the elementary through secondary mathematics curriculum is presented. Many examples and problems are presented which the author feels students can understand and will be motivated to do. Random digits, Monte Carlo methods, combinatorial theory, and Markov chains are…

  11. The Notions of Chance and Probabilities in Preschoolers

    ERIC Educational Resources Information Center

    Nikiforidou, Zoi; Pange, Jenny

    2010-01-01

    Chance, randomness and probability constitute statistical notions that are interrelated and characterize the logicomathematical thinking of children. Traditional theories support that probabilistic thinking evolves after the age of 7. However, recent research has underlined that children, as young as 4, may possess and develop basic notions,…

  12. Covering Numbers for Semicontinuous Functions

    DTIC Science & Technology

    2016-04-29

    functions, epi-distance, Attouch-Wets topology, epi-convergence, epi-spline, approximation theory. Date: April 29, 2016. 1 Introduction. Covering numbers of... classes of functions play central roles in parts of information theory, statistics, and applications such as machine learning; see for example [26... probability theory because there the hypo-distance metrizes weak convergence of distribution functions on IRd, which obviously are usc [22]. Thus, as an

  13. The maximum entropy method of moments and Bayesian probability theory

    NASA Astrophysics Data System (ADS)

    Bretthorst, G. Larry

    2013-08-01

    The problem of density estimation occurs in many disciplines. For example, in MRI it is often necessary to classify the types of tissues in an image. To perform this classification one must first identify the characteristics of the tissues to be classified. These characteristics might be the intensity of a T1 weighted image and in MRI many other types of characteristic weightings (classifiers) may be generated. In a given tissue type there is no single intensity that characterizes the tissue, rather there is a distribution of intensities. Often these distributions can be characterized by a Gaussian, but just as often they are much more complicated. Either way, estimating the distribution of intensities is an inference problem. In the case of a Gaussian distribution, one must estimate the mean and standard deviation. However, in the non-Gaussian case the shape of the density function itself must be inferred. Three common techniques for estimating density functions are binned histograms [1, 2], kernel density estimation [3, 4], and the maximum entropy method of moments [5, 6]. In the introduction, the maximum entropy method of moments will be reviewed. Some of its problems and conditions under which it fails will be discussed. Then in later sections, the functional form of the maximum entropy method of moments probability distribution will be incorporated into Bayesian probability theory. It will be shown that Bayesian probability theory solves all of the problems with the maximum entropy method of moments. One gets posterior probabilities for the Lagrange multipliers, and, finally, one can put error bars on the resulting estimated density function.
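
    The maximum entropy method of moments reviewed here has a compact numerical form. The sketch below is a plain maxent fit, not the Bayesian extension developed in the paper: it recovers a density on [0, 1] of the form p(x) proportional to exp(-sum_k lambda_k x^k) from two target moments by minimizing the convex dual; the target moments are an invented example.

    import numpy as np
    from scipy.integrate import trapezoid
    from scipy.optimize import minimize

    x = np.linspace(0.0, 1.0, 2001)                     # support grid for numerical integration
    target_moments = np.array([0.4, 0.2])               # hypothetical E[x] and E[x^2]
    powers = np.vstack([x ** (k + 1) for k in range(len(target_moments))])

    def dual(lam):
        """Convex dual of the maxent problem; its minimizer reproduces the target moments."""
        w = np.exp(-lam @ powers)                       # unnormalized maxent density on the grid
        return np.log(trapezoid(w, x)) + lam @ target_moments

    result = minimize(dual, x0=np.zeros(len(target_moments)), method="BFGS")
    p = np.exp(-result.x @ powers)
    p /= trapezoid(p, x)                                # normalized maximum entropy density

    fitted = [trapezoid(p * x ** (k + 1), x) for k in range(len(target_moments))]
    print("Lagrange multipliers:", result.x, "fitted moments:", np.round(fitted, 3))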

  14. A new probability distribution model of turbulent irradiance based on Born perturbation theory

    NASA Astrophysics Data System (ADS)

    Wang, Hongxing; Liu, Min; Hu, Hao; Wang, Qian; Liu, Xiguo

    2010-10-01

    The subject of the PDF (Probability Density Function) of the irradiance fluctuations in a turbulent atmosphere is still unsettled. Theory reliably describes the behavior in the weak turbulence regime, but theoretical descriptions in the strong and whole turbulence regimes are still controversial. Based on Born perturbation theory, the physical manifestations and correlations of three typical PDF models (Rice-Nakagami, exponential-Bessel and negative-exponential distribution) were theoretically analyzed. It is shown that these models can be derived by separately making circular-Gaussian, strong-turbulence and strong-turbulence-circular-Gaussian approximations in Born perturbation theory, which refutes the viewpoint that the Rice-Nakagami model is only applicable in the extremely weak turbulence regime and provides theoretical arguments for choosing rational models in practical applications. In addition, a common shortcoming of the three models is that they are all approximations. A new model, called the Maclaurin-spread distribution, is proposed without any approximation except for assuming the correlation coefficient to be zero. The new model is therefore considered to reflect the Born perturbation theory exactly. Simulated results prove the accuracy of this new model.

  15. Uncertain deduction and conditional reasoning

    PubMed Central

    Evans, Jonathan St. B. T.; Thompson, Valerie A.; Over, David E.

    2015-01-01

    There has been a paradigm shift in the psychology of deductive reasoning. Many researchers no longer think it is appropriate to ask people to assume premises and decide what necessarily follows, with the results evaluated by binary extensional logic. Most everyday and scientific inference is made from more or less confidently held beliefs and not assumptions, and the relevant normative standard is Bayesian probability theory. We argue that the study of “uncertain deduction” should directly ask people to assign probabilities to both premises and conclusions, and report an experiment using this method. We assess this reasoning by two Bayesian metrics: probabilistic validity and coherence according to probability theory. On both measures, participants perform above chance in conditional reasoning, but they do much better when statements are grouped as inferences, rather than evaluated in separate tasks. PMID:25904888

  16. Human instrumental performance in ratio and interval contingencies: A challenge for associative theory.

    PubMed

    Pérez, Omar D; Aitken, Michael R F; Zhukovsky, Peter; Soto, Fabián A; Urcelay, Gonzalo P; Dickinson, Anthony

    2016-12-15

    Associative learning theories regard the probability of reinforcement as the critical factor determining responding. However, the role of this factor in instrumental conditioning is not completely clear. In fact, free-operant experiments show that participants respond at a higher rate on variable ratio than on variable interval schedules even though the reinforcement probability is matched between the schedules. This difference has been attributed to the differential reinforcement of long inter-response times (IRTs) by interval schedules, which acts to slow responding. In the present study, we used a novel experimental design to investigate human responding under random ratio (RR) and regulated probability interval (RPI) schedules, a type of interval schedule that sets a reinforcement probability independently of the IRT duration. Participants responded on each type of schedule before a final choice test in which they distributed responding between two schedules similar to those experienced during training. Although response rates did not differ during training, the participants responded at a lower rate on the RPI schedule than on the matched RR schedule during the choice test. This preference cannot be attributed to a higher probability of reinforcement for long IRTs and questions the idea that similar associative processes underlie classical and instrumental conditioning.

  17. Stochastic Laplacian growth

    NASA Astrophysics Data System (ADS)

    Alekseev, Oleg; Mineev-Weinstein, Mark

    2016-12-01

    A point source on a plane constantly emits particles which rapidly diffuse and then stick to a growing cluster. The growth probability of a cluster is presented as a sum over all possible scenarios leading to the same final shape. The classical point for the action, defined as minus the logarithm of the growth probability, describes the most probable scenario and reproduces the Laplacian growth equation, which embraces numerous fundamental free boundary dynamics in nonequilibrium physics. For nonclassical scenarios we introduce virtual point sources, in whose presence the action becomes the Kullback-Leibler entropy. Strikingly, this entropy is shown to be the sum of electrostatic energies of layers grown per elementary time unit. Hence the growth probability of the presented nonequilibrium process obeys the Gibbs-Boltzmann statistics, which, as a rule, does not apply out of equilibrium. Each layer's probability is expressed as a product of simple factors in an auxiliary complex plane after a properly chosen conformal map. The action at this plane is a sum of Robin functions, which solve the Liouville equation. Finally, we establish connections of our theory with the τ function of the integrable Toda hierarchy and with the Liouville theory for noncritical quantum strings.

  18. Origin of probabilities and their application to the multiverse

    NASA Astrophysics Data System (ADS)

    Albrecht, Andreas; Phillips, Daniel

    2014-12-01

    We argue using simple models that all successful practical uses of probabilities originate in quantum fluctuations in the microscopic physical world around us, often propagated to macroscopic scales. Thus we claim there is no physically verified fully classical theory of probability. We comment on the general implications of this view, and specifically question the application of purely classical probabilities to cosmology in cases where key questions are known to have no quantum answer. We argue that the ideas developed here may offer a way out of the notorious measure problems of eternal inflation.

  19. False memory for orthographically versus semantically similar words in adolescents with dyslexia: a fuzzy-trace theory perspective.

    PubMed

    Obidziński, Michał; Nieznański, Marek

    2017-10-01

    The presented research was conducted in order to investigate the connections between developmental dyslexia and the functioning of verbatim and gist memory traces assumed in fuzzy-trace theory. The participants were 71 high school students (33 with dyslexia and 38 without learning difficulties). The modified procedure and multinomial model of Stahl and Klauer (the simplified conjoint recognition model) were used to collect and analyze data. Results showed statistically significant differences in four of the model parameters: (a) the probability of verbatim trace recollection upon presentation of an orthographically similar stimulus was higher in the control than the dyslexia group, (b) the probability of verbatim trace recollection upon presentation of a semantically similar stimulus was higher in the control than the dyslexia group, (c) the probability of gist trace retrieval upon presentation of a semantically similar stimulus was higher in the dyslexia than the control group, and (d) the probability of gist trace retrieval upon target stimulus presentation (in the semantic condition) was higher in the control than the dyslexia group. The obtained results suggest differences in memory functioning, in terms of verbatim and gist trace retrieval, between people with and without dyslexia on the specific, elementary cognitive processes postulated by fuzzy-trace theory. These findings may indicate new approaches in the education of persons with developmental dyslexia, focused on specific impairments and the strengths of their memory functioning.

  20. Towards a Theory of Semantic Communication (Extended Technical Report)

    DTIC Science & Technology

    2011-03-01

    counting models of a sentence, when interpretations have different probabilities, what matters is the total probability of models of the sentence, not...of classic logics still hold in the LP semantics, e.g., De Morgan's laws. However, modus ponens does hold in the LP semantics

  1. Generalized Success-Breeds-Success Principle Leading to Time-Dependent Informetric Distributions.

    ERIC Educational Resources Information Center

    Egghe, Leo; Rousseau, Ronald

    1995-01-01

    Reformulates the success-breeds-success (SBS) principle in informetrics in order to generate a general theory of source-item relationships. Topics include a time-dependent probability, a new model for the expected probability that is compared with the SBS principle with exact combinatorial calculations, classical frequency distributions, and…

  2. A Brief Look at the History of Probability and Statistics.

    ERIC Educational Resources Information Center

    Lightner, James E.

    1991-01-01

    The historical development of probability theory is traced from its early origins in games of chance through its mathematical foundations in the work of Pascal and Fermat. The roots of statistics are also presented beginning with early actuarial developments through the work of Laplace, Gauss, and others. (MDH)

  3. Independent Events in Elementary Probability Theory

    ERIC Educational Resources Information Center

    Csenki, Attila

    2011-01-01

    In Probability and Statistics taught to mathematicians as a first introduction or to a non-mathematical audience, joint independence of events is introduced by requiring that the multiplication rule is satisfied. The following statement is usually tacitly assumed to hold (and, at best, intuitively motivated): If the n events E[subscript 1],…

  4. Asymmetries in Predictive and Diagnostic Reasoning

    ERIC Educational Resources Information Center

    Fernbach, Philip M.; Darlow, Adam; Sloman, Steven A.

    2011-01-01

    In this article, we address the apparent discrepancy between causal Bayes net theories of cognition, which posit that judgments of uncertainty are generated from causal beliefs in a way that respects the norms of probability, and evidence that probability judgments based on causal beliefs are systematically in error. One purported source of bias…

  5. From the Law of Large Numbers to Large Deviation Theory in Statistical Physics: An Introduction

    NASA Astrophysics Data System (ADS)

    Cecconi, Fabio; Cencini, Massimo; Puglisi, Andrea; Vergni, Davide; Vulpiani, Angelo

    This contribution aims at introducing the topics of this book. We start with a brief historical excursion on the developments from the law of large numbers to the central limit theorem and large deviations theory. The same topics are then presented using the language of probability theory. Finally, some applications of large deviations theory in physics are briefly discussed through examples taken from statistical mechanics, dynamical and disordered systems.
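
    A small numerical illustration in this spirit (not taken from the chapter): for fair coin flips, the empirical tail probability P(S_n/n >= a) is compared with the large-deviation estimate exp(-n I(a)), where I(a) = a ln(a/p) + (1-a) ln((1-a)/(1-p)) is Cramér's rate function; the two agree only to exponential order, since the rate function ignores subexponential prefactors.

    import numpy as np

    def cramer_rate(a, p=0.5):
        """Cramér rate function for the sample mean of Bernoulli(p) variables."""
        return a * np.log(a / p) + (1 - a) * np.log((1 - a) / (1 - p))

    rng = np.random.default_rng(3)
    n, a, trials = 100, 0.6, 200_000
    sample_means = rng.binomial(n, 0.5, size=trials) / n

    empirical = (sample_means >= a).mean()
    ldp_estimate = np.exp(-n * cramer_rate(a))

    print(f"empirical P(mean >= {a}) = {empirical:.4f}")
    print(f"large-deviation estimate  = {ldp_estimate:.4f}  (same exponential decay rate)")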

  6. The biological precedents for medieval impetus theory and its Aristotelian character.

    PubMed

    Fritsche, Johannes

    2011-03-01

    While the impetus theory is often regarded as a non-Aristotelian theory that could not have emerged within the development of Aristotelianism, I argue that it is essentially Aristotelian. Given the state of the theories of body, movement and sexual reproduction and the development of the theory of the four elements in the Latin West at the end of the thirteenth century, the impetus theory was probably developed as an application to projectiles of Aristotle's theories of the male semen and of family resemblance. In addition, the impetus theory was even a convenient expedient to simplify the Aristotelian theory of movement and prevent it from drifting into non-Aristotelian territory.

  7. Confirmation of the Conditional Outdoor Leadership Theory.

    ERIC Educational Resources Information Center

    Dixon, Tim; Priest, Simon

    1991-01-01

    Responses of 75 expert outdoor leaders from Canada and the United States concerning leadership in 12 hypothetical backpacking scenarios provided partial support for a theory that predicted probability of leadership style (democratic, autocratic, or abdicratic) based on favorability of conditions, task orientation, and relationship orientation.…

  8. Groundwater Remediation using Bayesian Information-Gap Decision Theory

    NASA Astrophysics Data System (ADS)

    O'Malley, D.; Vesselinov, V. V.

    2016-12-01

    Probabilistic analyses of groundwater remediation scenarios frequently fail because the probability of an adverse, unanticipated event occurring is often high. In general, models of flow and transport in contaminated aquifers are always simpler than reality. Further, when a probabilistic analysis is performed, probability distributions are usually chosen more for convenience than correctness. The Bayesian Information-Gap Decision Theory (BIGDT) was designed to mitigate the shortcomings of the models and probabilistic decision analyses by leveraging a non-probabilistic decision theory - information-gap decision theory. BIGDT considers possible models that have not been explicitly enumerated and does not require us to commit to a particular probability distribution for model and remediation-design parameters. Both the set of possible models and the set of possible probability distributions grow as the degree of uncertainty increases. The fundamental question that BIGDT asks is "How large can these sets be before a particular decision results in an undesirable outcome?". The decision that allows these sets to be the largest is considered to be the best option. In this way, BIGDT enables robust decision-support for groundwater remediation problems. Here we apply BIGDT in a representative groundwater remediation scenario where different options for hydraulic containment and pump & treat are being considered. BIGDT requires many model runs, and for complex models high-performance computing resources are needed. These analyses are carried out on synthetic problems, but are applicable to real-world problems such as LANL site contaminations. BIGDT is implemented in Julia (a high-level, high-performance dynamic programming language for technical computing) and is part of the MADS framework (http://mads.lanl.gov/ and https://github.com/madsjulia/Mads.jl).
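
    The information-gap robustness question underlying BIGDT can be sketched with a toy model (this is an invented illustration, not the MADS/BIGDT implementation): for a candidate pumping rate, how large a fractional deviation of an uncertain hydraulic conductivity from its nominal estimate can be tolerated before a predicted concentration violates its limit?

    import numpy as np

    def predicted_concentration(pump_rate, conductivity):
        """Hypothetical model of contaminant concentration at a compliance well."""
        return 100.0 * np.exp(-0.08 * pump_rate) * (conductivity / 10.0)

    nominal_k = 10.0        # nominal hydraulic conductivity (invented units)
    limit = 5.0             # regulatory concentration limit (invented)

    def robustness(pump_rate, horizons=np.linspace(0.0, 2.0, 2001)):
        """Largest fractional deviation alpha such that the worst case over
        |K - nominal_k| <= alpha * nominal_k still meets the limit."""
        best = 0.0
        for alpha in horizons:
            worst = predicted_concentration(pump_rate, nominal_k * (1 + alpha))  # higher K is worse here
            if worst <= limit:
                best = alpha
            else:
                break
        return best

    for q in (20.0, 40.0, 60.0):
        print(f"pump rate {q:4.0f}: robustness alpha = {robustness(q):.2f}")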

  9. A Bayesian account of quantum histories

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marlow, Thomas

    2006-05-15

    We investigate whether quantum history theories can be consistent with Bayesian reasoning and whether such an analysis helps clarify the interpretation of such theories. First, we summarise and extend recent work categorising two different approaches to formalising multi-time measurements in quantum theory. The standard approach consists of describing an ordered series of measurements in terms of history propositions with non-additive 'probabilities.' The non-standard approach consists of defining multi-time measurements to consist of sets of exclusive and exhaustive history propositions and recovering the single-time exclusivity of results when discussing single-time history propositions. We analyse whether such history propositions can be consistent with Bayes' rule. We show that a certain class of histories is given a natural Bayesian interpretation, namely, the linearly positive histories originally introduced by Goldstein and Page. Thus, we argue that this gives a certain amount of interpretational clarity to the non-standard approach. We also attempt a justification of our analysis using Cox's axioms of probability theory.

  10. An explanation of resisted discoveries based on construal-level theory.

    PubMed

    Fang, Hui

    2015-02-01

    New discoveries and theories are crucial for the development of science, but they are often initially resisted by the scientific community. This paper analyses resistance to scientific discoveries that supplement previous research results or conclusions with new phenomena, such as long chains in macromolecules, Alfvén waves, parity nonconservation in weak interactions and quasicrystals. Construal-level theory is used to explain that the probability of new discoveries may be underestimated because of psychological distance. Thus, the insufficiently examined scope of an accepted theory may lead to overstating the suitable scope and underestimating the probability of its undiscovered counter-examples. Therefore, psychological activity can result in people instinctively resisting new discoveries. Direct evidence can help people judge the validity of a hypothesis with rational thinking. The effects of authorities and textbooks on the resistance to discoveries are also discussed. From the results of our analysis, suggestions are provided to reduce resistance to real discoveries, which will benefit the development of science.

  11. A short course on measure and probability theories

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pebay, Philippe Pierre

    2004-02-01

    This brief Introduction to Measure Theory, and its applications to Probabilities, corresponds to the lecture notes of a seminar series given at Sandia National Laboratories in Livermore, during the spring of 2003. The goal of these seminars was to provide a minimal background to Computational Combustion scientists interested in using more advanced stochastic concepts and methods, e.g., in the context of uncertainty quantification. Indeed, most mechanical engineering curricula do not provide students with formal training in the field of probability, and even less in measure theory. However, stochastic methods have been used more and more extensively in the past decade, and have provided more successful computational tools. Scientists at the Combustion Research Facility of Sandia National Laboratories have been using computational stochastic methods for years. Addressing more and more complex applications, and facing difficult problems that arose in applications, showed the need for a better understanding of theoretical foundations. This is why the seminar series was launched, and these notes summarize most of the concepts which have been discussed. The goal of the seminars was to bring a group of mechanical engineers and computational combustion scientists to a full understanding of N. WIENER'S polynomial chaos theory. Therefore, these lecture notes are built along those lines, and are not intended to be exhaustive. In particular, the author welcomes any comments or criticisms.

  12. Beta-decay rate and beta-delayed neutron emission probability of improved gross theory

    NASA Astrophysics Data System (ADS)

    Koura, Hiroyuki

    2014-09-01

    A theoretical study has been carried out on the beta-decay rate and the beta-delayed neutron emission probability. The gross theory of beta decay is based on the idea of a sum rule for the beta-decay strength function, and has succeeded in describing beta-decay half-lives of nuclei over the whole nuclear mass region. The gross theory includes not only the allowed transitions, such as the Fermi and the Gamow-Teller, but also the first-forbidden transition. In this work, some improvements are introduced, namely a nuclear shell correction on nuclear level densities and nuclear deformation for the nuclear strength functions; these effects were not included in the original gross theory. The shell energy and the nuclear deformation for unmeasured nuclei are adopted from the KTUY nuclear mass formula, which is based on the spherical-basis method. Considering the properties of the integrated Fermi function, we can roughly categorize the excitation-energy region of a daughter nucleus into three regions: a highly excited energy region, which fully affects the delayed neutron probability; a middle energy region, which is estimated to contribute to the decay heat; and a region neighboring the ground state, which determines the beta-decay rate. Some results will be given in the presentation. This work is a result of "Comprehensive study of delayed-neutron yields for accurate evaluation of kinetics of high-burn up reactors" entrusted to Tokyo Institute of Technology by the Ministry of Education, Culture, Sports, Science and Technology of Japan.

  13. Bayesian Inference for Source Reconstruction: A Real-World Application

    DTIC Science & Technology

    2014-09-25

    deliberately or accidentally. Two examples of operational monitoring sensor networks are the deployment of biological sensor arrays by the Department of...remarkable paper, Cox [16] demonstrated that probability theory, when interpreted as logic, is the only calculus that conforms to a consistent theory...of inference. This demonstration provides the firm logical basis for asserting that probability calculus is the unique quantitative theory of

  14. Symmetric polynomials in information theory: Entropy and subentropy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jozsa, Richard; Mitchison, Graeme

    2015-06-15

    Entropy and other fundamental quantities of information theory are customarily expressed and manipulated as functions of probabilities. Here we study the entropy H and subentropy Q as functions of the elementary symmetric polynomials in the probabilities and reveal a series of remarkable properties. Derivatives of all orders are shown to satisfy a complete monotonicity property. H and Q themselves become multivariate Bernstein functions and we derive the density functions of their Levy-Khintchine representations. We also show that H and Q are Pick functions in each symmetric polynomial variable separately. Furthermore, we see that H and the intrinsically quantum informational quantity Q become surprisingly closely related in functional form, suggesting a special significance for the symmetric polynomials in quantum information theory. Using the symmetric polynomials, we also derive a series of further properties of H and Q.

  15. Decision making under time pressure, modeled in a prospect theory framework.

    PubMed

    Young, Diana L; Goodie, Adam S; Hall, Daniel B; Wu, Eric

    2012-07-01

    The current research examines the effects of time pressure on decision behavior based on a prospect theory framework. In Experiments 1 and 2, participants estimated certainty equivalents for binary gains-only bets in the presence or absence of time pressure. In Experiment 3, participants assessed comparable bets that were framed as losses. Data were modeled to establish psychological mechanisms underlying decision behavior. In Experiments 1 and 2, time pressure led to increased risk attractiveness, but no significant differences emerged in either probability discriminability or outcome utility. In Experiment 3, time pressure reduced probability discriminability, which was coupled with severe risk-seeking behavior for both conditions in the domain of losses. No significant effects of control over outcomes were observed. Results provide qualified support for theories that suggest increased risk-seeking for gains under time pressure.

  16. Decision making under time pressure, modeled in a prospect theory framework

    PubMed Central

    Young, Diana L.; Goodie, Adam S.; Hall, Daniel B.; Wu, Eric

    2012-01-01

    The current research examines the effects of time pressure on decision behavior based on a prospect theory framework. In Experiments 1 and 2, participants estimated certainty equivalents for binary gains-only bets in the presence or absence of time pressure. In Experiment 3, participants assessed comparable bets that were framed as losses. Data were modeled to establish psychological mechanisms underlying decision behavior. In Experiments 1 and 2, time pressure led to increased risk attractiveness, but no significant differences emerged in either probability discriminability or outcome utility. In Experiment 3, time pressure reduced probability discriminability, which was coupled with severe risk-seeking behavior for both conditions in the domain of losses. No significant effects of control over outcomes were observed. Results provide qualified support for theories that suggest increased risk-seeking for gains under time pressure. PMID:22711977

  17. Spatial estimation from remotely sensed data via empirical Bayes models

    NASA Technical Reports Server (NTRS)

    Hill, J. R.; Hinkley, D. V.; Kostal, H.; Morris, C. N.

    1984-01-01

    Multichannel satellite image data, available as LANDSAT imagery, are recorded as a multivariate time series (four channels, multiple passovers) in two spatial dimensions. The application of parametric empirical Bayes theory to classifying, and estimating the probability of, each crop type at each of a large number of pixels is considered. This theory involves both the probability distribution of imagery data, conditional on crop types, and the prior spatial distribution of crop types. For the latter, Markov models indexed by estimable parameters are used. A broad outline of the general theory reveals several questions for further research. Some detailed results are given for the special case of two crop types when only a line transect is analyzed. Finally, the estimation of an underlying continuous process on the lattice, which would be applicable to such quantities as crop yield, is discussed.

  18. Interstellar Travel and Galactic Colonization: Insights from Percolation Theory and the Yule Process

    NASA Astrophysics Data System (ADS)

    Lingam, Manasvi

    2016-06-01

    In this paper, percolation theory is employed to place tentative bounds on the probability p of interstellar travel and the emergence of a civilization (or panspermia) that colonizes the entire Galaxy. The ensuing ramifications with regard to the Fermi paradox are also explored. In particular, it is suggested that the correlation function of inhabited exoplanets can be used to observationally constrain p in the near future. It is shown, by using a mathematical evolution model known as the Yule process, that the probability distribution for civilizations with a given number of colonized worlds is likely to exhibit a power-law tail. Some of the dynamical aspects of this issue, including the question of timescales and generalizing percolation theory, were also studied. The limitations of these models, and other avenues for future inquiry, are also outlined.
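
    The power-law tail attributed to the Yule process can be seen in a short simulation (a generic Yule-Simon sketch, not the paper's model): with probability alpha a new civilization appears with one colonized world, otherwise an existing civilization gains a world with probability proportional to the worlds it already holds; the parameter values are arbitrary.

    import numpy as np

    rng = np.random.default_rng(4)
    alpha, steps = 0.2, 200_000

    owners = [0]                    # one entry per colonized world, labelled by its civilization
    n_civilizations = 1

    for _ in range(steps):
        if rng.random() < alpha:
            owners.append(n_civilizations)                         # a new civilization emerges
            n_civilizations += 1
        else:
            owners.append(owners[rng.integers(len(owners))])       # preferential attachment

    sizes = np.bincount(np.array(owners))
    for k in (1, 2, 4, 8, 16, 32, 64):
        print(f"fraction of civilizations with >= {k} worlds: {(sizes >= k).mean():.4f}")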

  19. Prospect evaluation as a function of numeracy and probability denominator.

    PubMed

    Millroth, Philip; Juslin, Peter

    2015-05-01

    This study examines how numeracy and probability denominator (a direct-ratio probability, a relative frequency with denominator 100, a relative frequency with denominator 10,000) affect the evaluation of prospects in an expected-value based pricing task. We expected that numeracy would affect the results due to differences in the linearity of number perception and the susceptibility to denominator neglect with different probability formats. An analysis with functional measurement verified that participants integrated value and probability into an expected value. However, a significant interaction between numeracy and probability format and subsequent analyses of the parameters of cumulative prospect theory showed that the manipulation of probability denominator changed participants' psychophysical response to probability and value. Standard methods in decision research may thus confound people's genuine risk attitude with their numerical capacities and the probability format used. Copyright © 2015 Elsevier B.V. All rights reserved.
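
    The cumulative prospect theory parameters analysed here rest on two standard functional forms (Tversky and Kahneman's 1992 value and probability-weighting functions). The sketch below uses illustrative parameter values, not estimates from this study, to compute certainty equivalents for a simple gains-only prospect; the overweighting of small probabilities is visible in the p = 0.01 case.

    def value(x, alpha=0.88):
        """Value function for gains."""
        return x ** alpha

    def weight(p, gamma=0.61):
        """Inverse-S probability weighting function."""
        return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

    def certainty_equivalent(x, p, alpha=0.88, gamma=0.61):
        """Certainty equivalent of the prospect 'win x with probability p, else nothing'."""
        return (weight(p, gamma) * value(x, alpha)) ** (1 / alpha)

    for p in (0.01, 0.10, 0.50, 0.90, 0.99):
        print(f"p = {p:.2f}: certainty equivalent of a 100-point prospect = {certainty_equivalent(100, p):.1f}")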

  20. Quantum probability rule: a generalization of the theorems of Gleason and Busch

    NASA Astrophysics Data System (ADS)

    Barnett, Stephen M.; Cresser, James D.; Jeffers, John; Pegg, David T.

    2014-04-01

    Busch's theorem deriving the standard quantum probability rule can be regarded as a more general form of Gleason's theorem. Here we show that a further generalization is possible by reducing the number of quantum postulates used by Busch. We do not assume that the positive measurement outcome operators are effects or that they form a probability operator measure. We derive a more general probability rule from which the standard rule can be obtained from the normal laws of probability when there is no measurement outcome information available, without the need for further quantum postulates. Our general probability rule has prediction-retrodiction symmetry and we show how it may be applied in quantum communications and in retrodictive quantum theory.
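
    The probability rule in question is easy to exercise numerically: p(E_i) = Tr(rho E_i) for a POVM {E_i}. The state and the two-outcome unsharp POVM below are arbitrary examples, not anything from the paper.

    import numpy as np

    rho = np.array([[0.7, 0.2], [0.2, 0.3]], dtype=complex)   # a valid qubit density matrix
    E0 = 0.6 * np.array([[1, 0], [0, 0]], dtype=complex)      # unsharp outcome "0"
    E1 = np.eye(2) - E0                                       # completes the POVM: E0 + E1 = I

    for name, effect in (("E0", E0), ("E1", E1)):
        prob = np.real(np.trace(rho @ effect))                # Born/Busch rule: p(E) = Tr(rho E)
        print(f"p({name}) = {prob:.2f}")

    print("total:", np.real(np.trace(rho @ (E0 + E1))))       # sums to 1, as required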

  1. Continuous Strategy Development for Effects-Based Operations

    DTIC Science & Technology

    2006-02-01

    the probability of COA success. The time slider from the “Time Selector” choice in the View menu may also be used to animate the probability coloring...will Deploy WMD, since this can be assumed to have the inverse probability (1-P) of our objective. Clausewitz theory teaches us that an enemy must be... using XSLT, a concise language for transforming XML documents, for forward and reverse conversion between the SDT and SMS plan formats. 2. Develop a

  2. Brain mechanisms of emotions.

    PubMed

    Simonov, P V

    1997-01-01

    At the 23rd International Congress of Physiological Sciences (Tokyo, 1965) the results of experiments led us to the conclusion that emotions were determined by the actual need and estimation of the probability (possibility) of its satisfaction. Low probability of need satisfaction leads to negative emotions actively minimized by the subject. Increased probability of satisfaction, as compared to the earlier forecast, generates positive emotions which the subject tries to maximize, that is, to enhance, to prolong, to repeat. We named our concept the Need-Informational Theory of Emotions. According to this theory, motivation, emotion, and estimation of probability have different neuromorphological substrates. Activation of the frontal parts of the neocortex through the motivatiogenic structures of the hypothalamus orients behavior toward signals with a high probability of reinforcement. At the same time the hippocampus is necessary for reactions to signals of low-probability events, which are typical for the emotionally excited brain. By comparison of motivational excitation with available stimuli or their engrams, the amygdala selects a dominant motivation, destined to be satisfied in the first instance. In the cases of classical conditioning and escape reaction the reinforcement was related to involvement of the negative emotion's hypothalamic neurons, while in the course of the avoidance reaction the positive emotion's neurons were involved. The role of the left and right frontal neocortex in the appearance of positive or negative emotions depends on their informational (cognitive) functions.

  3. [The brain mechanisms of emotions].

    PubMed

    Simonov, P V

    1997-01-01

    At the 23rd International Congress of Physiological Sciences (Tokyo, 1965) the results of experiments brought us to the conclusion that emotions were determined by the actual need and estimation of the probability (possibility) of its satisfaction. Low probability of need satisfaction leads to negative emotions actively minimized by the subject. Increased probability of satisfaction, as compared to the earlier forecast, generates positive emotions which the subject tries to maximize, that is, to enhance, to prolong, to repeat. We named our concept the Need-Informational Theory of Emotions. According to this theory, motivation, emotion and estimation of probability have different neuromorphological substrates. Activation of the frontal parts of the neocortex by the motivatiogenic structures of the hypothalamus orients behavior toward signals with a high probability of reinforcement. At the same time the hippocampus is necessary for reactions to signals of low-probability events, which are typical for the emotionally excited brain. By comparison of motivational excitation with available stimuli or their engrams, the amygdala selects a dominant motivation, destined to be satisfied in the first instance. In the cases of classical conditioning and escape reaction the reinforcement was related to involvement of the negative emotion's hypothalamic neurons, while in the course of the avoidance reaction the positive emotion's neurons were involved. The role of the left and right frontal neocortex in the appearance of positive or negative emotions depends on their informational (cognitive) functions.

  4. Re-Conceptualization of Modified Angoff Standard Setting: Unified Statistical, Measurement, Cognitive, and Social Psychological Theories

    ERIC Educational Resources Information Center

    Iyioke, Ifeoma Chika

    2013-01-01

    This dissertation describes a design for training, in accordance with probability judgment heuristics principles, for the Angoff standard setting method. The new training with instruction, practice, and feedback tailored to the probability judgment heuristics principles was called the Heuristic training and the prevailing Angoff method training…

  5. Attention as Inference: Selection Is Probabilistic; Responses Are All-or-None Samples

    ERIC Educational Resources Information Center

    Vul, Edward; Hanus, Deborah; Kanwisher, Nancy

    2009-01-01

    Theories of probabilistic cognition postulate that internal representations are made up of multiple simultaneously held hypotheses, each with its own probability of being correct (henceforth, "probability distributions"). However, subjects make discrete responses and report the phenomenal contents of their mind to be all-or-none states rather than…

  6. Exaggerated Risk: Prospect Theory and Probability Weighting in Risky Choice

    ERIC Educational Resources Information Center

    Kusev, Petko; van Schaik, Paul; Ayton, Peter; Dent, John; Chater, Nick

    2009-01-01

    In 5 experiments, we studied precautionary decisions in which participants decided whether or not to buy insurance with specified cost against an undesirable event with specified probability and cost. We compared the risks taken for precautionary decisions with those taken for equivalent monetary gambles. Fitting these data to Tversky and…

  7. Is This Radical Enough? Curriculum Reform, Change, and the Language of Probability.

    ERIC Educational Resources Information Center

    Deever, Bryan

    1996-01-01

    Discusses the need for and construction of a language of probability to complement the extant languages of critique and possibility in critical curriculum theory. Actions framed in the language, and the ensuing dialogue, could then be forces in radical reform through a process of infiltration, appropriation, and reconfiguration. (SM)

  8. Teaching Probability for Conceptual Change (La Ensenanza de la Probabilidad por Cambio Conceptual).

    ERIC Educational Resources Information Center

    Castro, Cesar Saenz

    1998-01-01

    Presents a theoretical proposal of a methodology for the teaching of probability theory. Discusses the importance of the epistemological approach of Lakatos and the perspective of the conceptual change. Discusses research using a proposed didactic method with Spanish high school students (N=6). Concludes that significant differences on all…

  9. The Sequential Probability Ratio Test and Binary Item Response Models

    ERIC Educational Resources Information Center

    Nydick, Steven W.

    2014-01-01

    The sequential probability ratio test (SPRT) is a common method for terminating item response theory (IRT)-based adaptive classification tests. To decide whether a classification test should stop, the SPRT compares a simple log-likelihood ratio, based on the classification bound separating two categories, to prespecified critical values. As has…

  10. Markov Processes: Exploring the Use of Dynamic Visualizations to Enhance Student Understanding

    ERIC Educational Resources Information Center

    Pfannkuch, Maxine; Budgett, Stephanie

    2016-01-01

    Finding ways to enhance introductory students' understanding of probability ideas and theory is a goal of many first-year probability courses. In this article, we explore the potential of a prototype tool for Markov processes using dynamic visualizations to develop in students a deeper understanding of the equilibrium and hitting times…

  11. Kolmogorov complexity, statistical regularization of inverse problems, and Birkhoff's formalization of beauty

    NASA Astrophysics Data System (ADS)

    Kreinovich, Vladik; Longpre, Luc; Koshelev, Misha

    1998-09-01

    Most practical applications of statistical methods are based on the implicit assumption that if an event has a very small probability, then it cannot occur. For example, the probability that a kettle placed on a cold stove would start boiling by itself is not 0, it is positive, but it is so small that physicists conclude that such an event is simply impossible. This assumption is difficult to formalize in traditional probability theory, because this theory only describes measures on sets and does not allow us to divide functions into 'random' and non-random ones. This distinction was made possible by the idea of algorithmic randomness, introduced by Kolmogorov and his student Martin-Löf in the 1960s. We show that this idea can also be used for inverse problems. In particular, we prove that for every probability measure, the corresponding set of random functions is compact, and, therefore, the corresponding restricted inverse problem is well-defined. The resulting technique turns out to be interestingly related to the qualitative esthetic measure introduced by G. Birkhoff as order/complexity.
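
    Kolmogorov complexity itself is uncomputable, but a compression ratio is a standard computable stand-in for separating 'random-looking' from highly regular data. The following sketch is not part of the paper; zlib and the example inputs are illustrative assumptions:

```python
import zlib
import os

def compression_ratio(data: bytes) -> float:
    """Upper-bound proxy for algorithmic (Kolmogorov) complexity:
    compressed length divided by original length. Values near 1 suggest
    'random-looking' data; values near 0 suggest strong regularity.
    This is only a computable surrogate -- true Kolmogorov complexity
    is uncomputable."""
    return len(zlib.compress(data, level=9)) / len(data)

# A highly regular sequence compresses well ...
regular = b"01" * 5000
# ... while typical (pseudo)random bytes barely compress at all.
random_like = os.urandom(10000)

print(f"regular sequence : {compression_ratio(regular):.3f}")
print(f"random-like bytes: {compression_ratio(random_like):.3f}")
```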

  12. Using optimal transport theory to estimate transition probabilities in metapopulation dynamics

    USGS Publications Warehouse

    Nichols, Jonathan M.; Spendelow, Jeffrey A.; Nichols, James D.

    2017-01-01

    This work considers the estimation of transition probabilities associated with populations moving among multiple spatial locations based on numbers of individuals at each location at two points in time. The problem is generally underdetermined as there exists an extremely large number of ways in which individuals can move from one set of locations to another. A unique solution therefore requires a constraint. The theory of optimal transport provides such a constraint in the form of a cost function, to be minimized in expectation over the space of possible transition matrices. We demonstrate the optimal transport approach on marked bird data and compare to the probabilities obtained via maximum likelihood estimation based on marked individuals. It is shown that by choosing the squared Euclidean distance as the cost, the estimated transition probabilities compare favorably to those obtained via maximum likelihood with marked individuals. Other implications of this cost are discussed, including the ability to accurately interpolate the population's spatial distribution at unobserved points in time and the more general relationship between the cost and minimum transport energy.
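
    As a rough illustration of the approach (not the authors' code; the site coordinates and counts are hypothetical), the transport plan minimizing the expected squared Euclidean distance can be found with a small linear program and then row-normalized into transition probabilities:

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical example: 3 sites with (x, y) coordinates and counts of
# individuals observed at two points in time (equal totals).
coords = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 2.0]])
n_t1 = np.array([60.0, 30.0, 10.0])   # counts at time 1
n_t2 = np.array([40.0, 40.0, 20.0])   # counts at time 2
k = len(n_t1)

# Cost: squared Euclidean distance between every pair of sites.
cost = ((coords[:, None, :] - coords[None, :, :]) ** 2).sum(axis=2)

# Linear program over the k*k entries of the transport plan T:
#   minimize sum_ij cost_ij * T_ij
#   subject to row sums = n_t1, column sums = n_t2, T_ij >= 0.
A_eq = np.zeros((2 * k, k * k))
for i in range(k):
    A_eq[i, i * k:(i + 1) * k] = 1.0   # row-sum constraints
    A_eq[k + i, i::k] = 1.0            # column-sum constraints
b_eq = np.concatenate([n_t1, n_t2])

res = linprog(cost.ravel(), A_eq=A_eq, b_eq=b_eq, bounds=(0, None),
              method="highs")
T = res.x.reshape(k, k)

# Row-normalize the optimal plan to get estimated transition probabilities.
P = T / n_t1[:, None]
print(np.round(P, 3))
```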

  13. Money, kisses, and electric shocks: on the affective psychology of risk.

    PubMed

    Rottenstreich, Y; Hsee, C K

    2001-05-01

    Prospect theory's S-shaped weighting function is often said to reflect the psychophysics of chance. We propose an affective rather than psychophysical deconstruction of the weighting function resting on two assumptions. First, preferences depend on the affective reactions associated with potential outcomes of a risky choice. Second, even with monetary values controlled, some outcomes are relatively affect-rich and others relatively affect-poor. Although the psychophysical and affective approaches are complementary, the affective approach has one novel implication: Weighting functions will be more S-shaped for lotteries involving affect-rich than affect-poor outcomes. That is, people will be more sensitive to departures from impossibility and certainty but less sensitive to intermediate probability variations for affect-rich outcomes. We corroborated this prediction by observing probability-outcome interactions: An affect-poor prize was preferred over an affect-rich prize under certainty, but the direction of preference reversed under low probability. We suggest that the assumption of probability-outcome independence, adopted by both expected-utility and prospect theory, may hold across outcomes of different monetary values, but not different affective values.
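
    The S-shaped weighting function referred to here is often parameterized with the one-parameter Tversky-Kahneman (1992) form; the sketch below (parameter values are illustrative, not estimates from this study) shows how a lower curvature parameter increases sensitivity near impossibility and certainty while flattening sensitivity to intermediate probabilities:

```python
import numpy as np

def tk_weight(p, gamma):
    """One-parameter Tversky-Kahneman (1992) probability weighting
    function: w(p) = p^g / (p^g + (1-p)^g)^(1/g). Smaller gamma yields
    stronger curvature: more overweighting of small probabilities and
    less sensitivity to intermediate probability changes."""
    p = np.asarray(p, dtype=float)
    return p**gamma / (p**gamma + (1.0 - p)**gamma) ** (1.0 / gamma)

probs = np.array([0.01, 0.10, 0.50, 0.90, 0.99])
# Illustrative parameter values only: the lower gamma mimics the stronger
# curvature reported for affect-rich outcomes.
print("affect-poor (gamma=0.7):", np.round(tk_weight(probs, 0.7), 3))
print("affect-rich (gamma=0.5):", np.round(tk_weight(probs, 0.5), 3))
```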

  14. Dependence in probabilistic modeling Dempster-Shafer theory and probability bounds analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ferson, Scott; Nelsen, Roger B.; Hajagos, Janos

    2015-05-01

    This report summarizes methods to incorporate information (or lack of information) about inter-variable dependence into risk assessments that use Dempster-Shafer theory or probability bounds analysis to address epistemic and aleatory uncertainty. The report reviews techniques for simulating correlated variates for a given correlation measure and dependence model, computation of bounds on distribution functions under a specified dependence model, formulation of parametric and empirical dependence models, and bounding approaches that can be used when information about the intervariable dependence is incomplete. The report also reviews several of the most pervasive and dangerous myths among risk analysts about dependence in probabilistic models.
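
    One of the simplest bounding approaches for the case of completely unknown dependence is the pair of Fréchet limits on conjunctions and disjunctions, which require only the marginal probabilities. A minimal sketch with illustrative numbers (not taken from the report):

```python
def and_bounds(pa: float, pb: float):
    """Frechet bounds on P(A and B) when the dependence between A and B
    is completely unknown."""
    return max(0.0, pa + pb - 1.0), min(pa, pb)

def or_bounds(pa: float, pb: float):
    """Frechet bounds on P(A or B) under unknown dependence."""
    return max(pa, pb), min(1.0, pa + pb)

pa, pb = 0.3, 0.6
print("P(A and B) in", and_bounds(pa, pb))   # (0.0, 0.3)
print("P(A or B)  in", or_bounds(pa, pb))    # (0.6, 0.9)
# Assuming independence instead would pin these to 0.18 and 0.72 --
# one of the simplifications risk analysts are warned about.
```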

  15. Classical Physics and the Bounds of Quantum Correlations.

    PubMed

    Frustaglia, Diego; Baltanás, José P; Velázquez-Ahumada, María C; Fernández-Prieto, Armando; Lujambio, Aintzane; Losada, Vicente; Freire, Manuel J; Cabello, Adán

    2016-06-24

    A unifying principle explaining the numerical bounds of quantum correlations remains elusive, despite the efforts devoted to identifying it. Here, we show that these bounds are indeed not exclusive to quantum theory: for any abstract correlation scenario with compatible measurements, models based on classical waves produce probability distributions indistinguishable from those of quantum theory and, therefore, share the same bounds. We demonstrate this finding by implementing classical microwaves that propagate along meter-size transmission-line circuits and reproduce the probabilities of three emblematic quantum experiments. Our results show that the "quantum" bounds would also occur in a classical universe without quanta. The implications of this observation are discussed.
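
    For context, the bounds at issue are exemplified by the CHSH scenario: local hidden-variable models cap the CHSH combination at 2, while the quantum correlation E(a, b) = -cos(a - b) (reproduced, per this paper, by classical waves) reaches 2√2. A brief check of that number using standard textbook analyzer settings, not the microwave experiment itself:

```python
import numpy as np

def E(a, b):
    """Correlation predicted for the singlet state when the two analyzers
    are set at angles a and b: E(a, b) = -cos(a - b)."""
    return -np.cos(a - b)

# Standard CHSH settings that saturate the Tsirelson bound 2*sqrt(2).
a, a2 = 0.0, np.pi / 2
b, b2 = np.pi / 4, 3 * np.pi / 4

S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
print(abs(S), 2 * np.sqrt(2))   # both ~2.828; the local-hidden-variable cap is 2
```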

  16. Going through a quantum phase

    NASA Technical Reports Server (NTRS)

    Shapiro, Jeffrey H.

    1992-01-01

    Phase measurements on a single-mode radiation field are examined from a system-theoretic viewpoint. Quantum estimation theory is used to establish the primacy of the Susskind-Glogower (SG) phase operator; its phase eigenkets generate the probability operator measure (POM) for maximum likelihood phase estimation. A commuting observables description for the SG-POM on a signal ⊗ apparatus state space is derived. It is analogous to the signal-band × image-band formulation for optical heterodyne detection. Because heterodyning realizes the annihilation operator POM, this analogy may help realize the SG-POM. The wave function representation associated with the SG POM is then used to prove the duality between the phase measurement and the number operator measurement, from which a number-phase uncertainty principle is obtained, via Fourier theory, without recourse to linearization. Fourier theory is also employed to establish the principle of number-ket causality, leading to a Paley-Wiener condition that must be satisfied by the phase-measurement probability density function (PDF) for a single-mode field in an arbitrary quantum state. Finally, a two-mode phase measurement is shown to afford phase-conjugate quantum communication at zero error probability with finite average photon number. Application of this construct to interferometric precision measurements is briefly discussed.

  17. Two heads are better than one, but how much? Evidence that people's use of causal integration rules does not always conform to normative standards.

    PubMed

    Vadillo, Miguel A; Ortega-Castro, Nerea; Barberia, Itxaso; Baker, A G

    2014-01-01

    Many theories of causal learning and causal induction differ in their assumptions about how people combine the causal impact of several causes presented in compound. Some theories propose that when several causes are present, their joint causal impact is equal to the linear sum of the individual impact of each cause. However, some recent theories propose that the causal impact of several causes needs to be combined by means of a noisy-OR integration rule. In other words, the probability of the effect given several causes would be equal to the sum of the probability of the effect given each cause in isolation minus the overlap between those probabilities. In the present series of experiments, participants were given information about the causal impact of several causes and then they were asked what compounds of those causes they would prefer to use if they wanted to produce the effect. The results of these experiments suggest that participants actually use a variety of strategies, including not only the linear and the noisy-OR integration rules, but also averaging the impact of several causes.
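
    A quick numerical contrast of the two integration rules discussed here (the cause probabilities are made up for illustration):

```python
def noisy_or(probs):
    """Noisy-OR combination: probability of the effect given several
    independent causes, each of which produces the effect with its own
    probability p_i in isolation."""
    q = 1.0
    for p in probs:
        q *= (1.0 - p)
    return 1.0 - q

def linear_sum(probs):
    """Linear-sum rule (can exceed 1, one argument against it)."""
    return sum(probs)

p1, p2 = 0.6, 0.7
print("noisy-OR  :", noisy_or([p1, p2]))     # 0.88 = 0.6 + 0.7 - 0.42
print("linear sum:", linear_sum([p1, p2]))   # 1.3  (not a valid probability)
```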

  18. Weighted Fuzzy Risk Priority Number Evaluation of Turbine and Compressor Blades Considering Failure Mode Correlations

    NASA Astrophysics Data System (ADS)

    Gan, Luping; Li, Yan-Feng; Zhu, Shun-Peng; Yang, Yuan-Jian; Huang, Hong-Zhong

    2014-06-01

    Failure mode, effects and criticality analysis (FMECA) and fault tree analysis (FTA) are powerful tools to evaluate the reliability of systems. Although the single-failure-mode case can be efficiently addressed by traditional FMECA, multiple failure modes and component correlations in complex systems cannot be effectively evaluated. In addition, correlated variables and parameters are often assumed to be precisely known in quantitative analysis. In fact, due to the lack of information, epistemic uncertainty commonly exists in engineering design. To solve these problems, the advantages of FMECA, FTA, fuzzy theory, and Copula theory are integrated into a unified hybrid method called the fuzzy probability weighted geometric mean (FPWGM) risk priority number (RPN) method. The epistemic uncertainty of risk variables and parameters is characterized by fuzzy numbers to obtain a fuzzy weighted geometric mean (FWGM) RPN for a single failure mode. Multiple failure modes are connected using minimum cut sets (MCS), and Boolean logic is used to combine the fuzzy risk priority number (FRPN) of each MCS. Moreover, Copula theory is applied to analyze the correlation of multiple failure modes in order to derive the failure probabilities of each MCS. Compared to the case where dependency among multiple failure modes is not considered, the Copula modeling approach eliminates the error of reliability analysis. Furthermore, for the purpose of quantitative analysis, probability importance weights derived from the failure probabilities are assigned to the FWGM RPN to reassess the risk priority, which generalizes the definitions of probability weight and FRPN and results in a more accurate estimation than that of the traditional models. Finally, a basic fatigue analysis case drawn from turbine and compressor blades in an aeroengine is used to demonstrate the effectiveness and robustness of the presented method. The result provides some important insights on fatigue reliability analysis and risk priority assessment of structural systems under failure correlations.

  19. Decision theory and information propagation in quantum physics

    NASA Astrophysics Data System (ADS)

    Forrester, Alan

    In recent papers, Zurek [(2005). Probabilities from entanglement, Born's rule p_k = |ψ_k|² from envariance. Physical Review A, 71, 052105] has objected to the decision-theoretic approach of Deutsch [(1999) Quantum theory of probability and decisions. Proceedings of the Royal Society of London A, 455, 3129-3137] and Wallace [(2003). Everettian rationality: defending Deutsch's approach to probability in the Everett interpretation. Studies in History and Philosophy of Modern Physics, 34, 415-438] to deriving the Born rule for quantum probabilities on the grounds that it courts circularity. Deutsch and Wallace assume that the many worlds theory is true and that decoherence gives rise to a preferred basis. However, decoherence arguments use the reduced density matrix, which relies upon the partial trace and hence upon the Born rule for its validity. Using the Heisenberg picture and quantum Darwinism-the notion that classical information is quantum information that can proliferate in the environment, pioneered in Ollivier et al. [(2004). Objective properties from subjective quantum states: Environment as a witness. Physical Review Letters, 93, 220401 and (2005). Environment as a witness: Selective proliferation of information and emergence of objectivity in a quantum universe. Physical Review A, 72, 042113]-I show that measurement interactions between two systems only create correlations between a specific set of commuting observables of system 1 and a specific set of commuting observables of system 2. This argument picks out a unique basis in which information flows in the correlations between those sets of commuting observables. I then derive the Born rule for both pure and mixed states and answer some other criticisms of the decision theoretic approach to quantum probability.

  20. Applications of a general random-walk theory for confined diffusion.

    PubMed

    Calvo-Muñoz, Elisa M; Selvan, Myvizhi Esai; Xiong, Ruichang; Ojha, Madhusudan; Keffer, David J; Nicholson, Donald M; Egami, Takeshi

    2011-01-01

    A general random walk theory for diffusion in the presence of nanoscale confinement is developed and applied. The random-walk theory contains two parameters describing confinement: a cage size and a cage-to-cage hopping probability. The theory captures the correct nonlinear dependence of the mean square displacement (MSD) on observation time for intermediate times. Because of its simplicity, the theory also has modest computational requirements and is thus able to simulate systems with very low diffusivities for sufficiently long times to reach the infinite-time-limit regime where the Einstein relation can be used to extract the self-diffusivity. The theory is applied to three practical cases in which the degree of order in confinement varies. The three systems include diffusion of (i) polyatomic molecules in metal organic frameworks, (ii) water in proton exchange membranes, and (iii) liquid and glassy iron. For all three cases, the comparison between theory and the results of molecular dynamics (MD) simulations indicates that the theory can describe the observed diffusion behavior with a small fraction of the computational expense. The confined-random-walk theory fit to the MSDs of very short MD simulations is capable of accurately reproducing the MSDs of much longer MD simulations. Furthermore, the values of the parameter for cage size correspond to the physical dimensions of the systems and the cage-to-cage hopping probability corresponds to the activation barrier for diffusion, indicating that the two parameters in the theory are not simply fitted values but correspond to real properties of the physical system.
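
    A minimal 1-D toy version of such a two-parameter confined walk (cage width and hop probability chosen arbitrarily; this is not the authors' implementation) already reproduces the qualitative MSD behaviour: free diffusion at short times, a plateau set by the cage size, and linear growth governed by the hopping rate at long times:

```python
import numpy as np

rng = np.random.default_rng(0)

def confined_walk_msd(n_walkers=2000, n_steps=5000, cage=5, p_hop=0.01):
    """1-D toy model: each walker moves +/-1 within its current cage of
    width `cage`; a step that would leave the cage is only accepted with
    probability `p_hop`. Returns the mean square displacement at each
    time step."""
    pos = np.zeros(n_walkers, dtype=int)
    cage_idx = np.zeros(n_walkers, dtype=int)   # which cage each walker occupies
    msd = np.empty(n_steps)
    for t in range(n_steps):
        step = rng.choice([-1, 1], size=n_walkers)
        new_pos = pos + step
        # Does the proposed step leave the current cage?
        leaving = (new_pos < cage_idx * cage) | (new_pos >= (cage_idx + 1) * cage)
        allowed = ~leaving | (rng.random(n_walkers) < p_hop)
        pos = np.where(allowed, new_pos, pos)
        cage_idx = pos // cage
        msd[t] = np.mean(pos.astype(float) ** 2)
    return msd

msd = confined_walk_msd()
# Early times ~ free diffusion, intermediate times a plateau on the order
# of the squared cage size, late times linear growth set by p_hop.
print(msd[10], msd[500], msd[-1])
```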

  1. Fitness Probability Distribution of Bit-Flip Mutation.

    PubMed

    Chicano, Francisco; Sutton, Andrew M; Whitley, L Darrell; Alba, Enrique

    2015-01-01

    Bit-flip mutation is a common mutation operator for evolutionary algorithms applied to optimize functions over binary strings. In this paper, we develop results from the theory of landscapes and Krawtchouk polynomials to exactly compute the probability distribution of fitness values of a binary string undergoing uniform bit-flip mutation. We prove that this probability distribution can be expressed as a polynomial in p, the probability of flipping each bit. We analyze these polynomials and provide closed-form expressions for an easy linear problem (Onemax), and an NP-hard problem, MAX-SAT. We also discuss a connection of the results with runtime analysis.
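
    For the Onemax case the exact fitness distribution after mutation can be written down directly: ones that survive are Binomial(k, 1-p), zeros that flip are Binomial(n-k, p), and the fitness is their sum. The sketch below computes this convolution; it is a simple special case, not the Krawtchouk-polynomial machinery of the paper:

```python
import numpy as np
from scipy.stats import binom

def onemax_fitness_dist(n: int, k: int, p: float):
    """Exact distribution of the Onemax fitness (number of ones) of an
    n-bit string with k ones after uniform bit-flip mutation with per-bit
    flip probability p. Surviving ones ~ Binomial(k, 1-p), newly flipped
    ones ~ Binomial(n-k, p); their sum gives the fitness, so the
    distribution is the convolution of the two binomials (each entry is a
    polynomial in p, as the paper proves in general)."""
    survive = binom.pmf(np.arange(k + 1), k, 1.0 - p)
    flipped = binom.pmf(np.arange(n - k + 1), n - k, p)
    return np.convolve(survive, flipped)   # index j holds P(fitness == j)

dist = onemax_fitness_dist(n=10, k=7, p=0.1)
print(np.round(dist, 4))
print("mean fitness:", np.dot(np.arange(11), dist))   # = k(1-p) + (n-k)p = 6.6
```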

  2. The Everett-Wheeler interpretation and the open future

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sudbery, Anthony

    2011-03-28

    I discuss the meaning of probability in the Everett-Wheeler interpretation of quantum mechanics, together with the problem of defining histories. To resolve these, I propose an understanding of probability arising from a form of temporal logic: the probability of a future-tense proposition is identified with its truth value in a many-valued and context-dependent logic. In short, probability is degree of truth. These ideas relate to traditional naive ideas of time and chance. Indeed, I argue that Everettian quantum mechanics is the only form of scientific theory that truly incorporates the perception that the future is open.

  3. Random walk in generalized quantum theory

    NASA Astrophysics Data System (ADS)

    Martin, Xavier; O'Connor, Denjoe; Sorkin, Rafael D.

    2005-01-01

    One can view quantum mechanics as a generalization of classical probability theory that provides for pairwise interference among alternatives. Adopting this perspective, we “quantize” the classical random walk by finding, subject to a certain condition of “strong positivity”, the most general Markovian, translationally invariant “decoherence functional” with nearest neighbor transitions.

  4. Cognitive Diagnostic Attribute-Level Discrimination Indices

    ERIC Educational Resources Information Center

    Henson, Robert; Roussos, Louis; Douglas, Jeff; He, Xuming

    2008-01-01

    Cognitive diagnostic models (CDMs) model the probability of correctly answering an item as a function of an examinee's attribute mastery pattern. Because estimation of the mastery pattern involves more than a continuous measure of ability, reliability concepts introduced by classical test theory and item response theory do not apply. The cognitive…
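
    As one concrete example of such a model (the DINA model, a common CDM, used here purely for illustration and not necessarily the model in this work), the item-response probability depends only on whether the examinee has mastered every attribute the item requires:

```python
import numpy as np

def dina_p_correct(alpha, q, slip, guess):
    """DINA model: an examinee with attribute mastery pattern `alpha`
    answers an item with Q-matrix row `q` correctly with probability
    (1 - slip) if all required attributes are mastered, and `guess`
    otherwise."""
    alpha = np.asarray(alpha, dtype=int)
    q = np.asarray(q, dtype=int)
    eta = int(np.all(alpha >= q))   # 1 iff every required attribute is mastered
    return (1.0 - slip) if eta else guess

# Hypothetical item requiring attributes 1 and 3 (Q-matrix row below).
q_row = [1, 0, 1]
print(dina_p_correct([1, 1, 1], q_row, slip=0.1, guess=0.2))  # 0.9
print(dina_p_correct([1, 1, 0], q_row, slip=0.1, guess=0.2))  # 0.2
```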

  5. Applications of presently planned interplanetary missions to testing gravitational theories

    NASA Technical Reports Server (NTRS)

    Friedman, L. D.

    1971-01-01

    A summary of the probable interplanetary missions for the 1970's is presented, which may prove useful in testing the general theory of relativity. Mission characteristics are discussed, as well as instrumentation. This last includes a low-level accelerometer and S-/X-band transponders and antennas.

  6. Fourth-Order Vibrational Transition State Theory and Chemical Kinetics

    NASA Astrophysics Data System (ADS)

    Stanton, John F.; Matthews, Devin A.; Gong, Justin Z.

    2015-06-01

    Second-order vibrational perturbation theory (VPT2) is an enormously successful and well-established theory for treating anharmonic effects on the vibrational levels of semi-rigid molecules. Partially as a consequence of the fact that the theory is exact for the Morse potential (which provides an appropriate qualitative model for stretching anharmonicity), VPT2 calculations for such systems with appropriate ab initio potential functions tend to give fundamental and overtone levels that fall within a handful of wavenumbers of experimentally measured positions. As a consequence, the next non-vanishing level of perturbation theory -- VPT4 -- offers only slight improvements over VPT2 and is not practical for most calculations since it requires information about force constants up through sextic. However, VPT4 (as well as VPT2) can be used for other applications such as the next vibrational correction to rotational constants (the "gammas") and other spectroscopic parameters. In addition, the marriage of VPT with the semi-classical transition state theory of Miller (SCTST) has recently proven to be a powerful and accurate treatment for chemical kinetics. In this talk, VPT4-based SCTST tunneling probabilities and cumulative reaction probabilities are given for the first time for selected low-dimensional model systems. The prospects for VPT4, both practical and intrinsic, will also be discussed.
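
    The simplest limit of the SCTST transmission probability is the analytic parabolic-barrier formula; the sketch below uses it with made-up barrier parameters purely to illustrate how the tunneling probability behaves around the barrier top (the full VPT2/VPT4-based theory replaces the exponent with an anharmonic barrier-penetration integral):

```python
import numpy as np

def parabolic_tunneling_prob(E, V0, omega_b):
    """Transmission probability through a parabolic barrier of height V0
    with barrier (imaginary) frequency magnitude omega_b, all quantities
    in the same energy units (e.g. cm^-1):
        P(E) = 1 / (1 + exp(2*pi*(V0 - E)/omega_b)).
    This is the harmonic limit of the barrier-penetration integral used
    in SCTST."""
    return 1.0 / (1.0 + np.exp(2.0 * np.pi * (V0 - E) / omega_b))

V0, omega_b = 2000.0, 1200.0          # hypothetical barrier, cm^-1
for E in (1000.0, 1800.0, 2000.0, 2200.0):
    print(E, round(parabolic_tunneling_prob(E, V0, omega_b), 4))
# At E = V0 the transmission is exactly 0.5, as expected for a parabola.
```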

  7. Decision theory for computing variable and value ordering decisions for scheduling problems

    NASA Technical Reports Server (NTRS)

    Linden, Theodore A.

    1993-01-01

    Heuristics that guide search are critical when solving large planning and scheduling problems, but most variable and value ordering heuristics are sensitive to only one feature of the search state. One wants to combine evidence from all features of the search state into a subjective probability that a value choice is best, but there has been no solid semantics for merging evidence when it is conceived in these terms. Instead, variable and value ordering decisions should be viewed as problems in decision theory. This led to two key insights: (1) The fundamental concept that allows heuristic evidence to be merged is the net incremental utility that will be achieved by assigning a value to a variable. Probability distributions about net incremental utility can merge evidence from the utility function, binary constraints, resource constraints, and other problem features. The subjective probability that a value is the best choice is then derived from probability distributions about net incremental utility. (2) The methods used for rumor control in Bayesian Networks are the primary way to prevent cycling in the computation of probable net incremental utility. These insights lead to semantically justifiable ways to compute heuristic variable and value ordering decisions that merge evidence from all available features of the search state.
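
    A generic sketch of where the second insight leads (not the paper's algorithm): if the merged evidence for each candidate value is summarized as a distribution over net incremental utility, the subjective probability that a value is best can be estimated by sampling. The Gaussian beliefs and numbers below are assumptions for illustration only:

```python
import numpy as np

rng = np.random.default_rng(1)

def prob_best(means, stds, n_samples=100_000):
    """Monte Carlo estimate of P(value i yields the highest net
    incremental utility), given independent Gaussian beliefs about each
    value's utility. The means/stds stand in for evidence merged from the
    utility function, constraints, and other search-state features."""
    means = np.asarray(means, dtype=float)
    stds = np.asarray(stds, dtype=float)
    draws = rng.normal(means, stds, size=(n_samples, len(means)))
    winners = np.argmax(draws, axis=1)
    return np.bincount(winners, minlength=len(means)) / n_samples

# Three candidate values for one variable (hypothetical beliefs).
print(prob_best(means=[5.0, 6.0, 4.0], stds=[2.0, 3.0, 0.5]))
```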

  8. Quantum-Like Model for Decision Making Process in Two Players Game. A Non-Kolmogorovian Model

    NASA Astrophysics Data System (ADS)

    Asano, Masanari; Ohya, Masanori; Khrennikov, Andrei

    2011-03-01

    In experiments on games, players frequently make choices which are regarded as irrational in game theory. In papers of Khrennikov (Information Dynamics in Cognitive, Psychological and Anomalous Phenomena. Fundamental Theories of Physics, Kluwer Academic, Norwell, 2004; Fuzzy Sets Syst. 155:4-17, 2005; Biosystems 84:225-241, 2006; Found. Phys. 35(10):1655-1693, 2005; in QP-PQ Quantum Probability and White Noise Analysis, vol. XXIV, pp. 105-117, 2009), it was pointed out that statistics collected in such experiments have "quantum-like" properties, which cannot be explained in classical probability theory. In this paper, we design a simple quantum-like model describing the decision-making process in a two-player game and try to explain a mechanism for the irrational behavior of players. Finally we discuss a mathematical frame for non-Kolmogorovian systems in terms of liftings (Accardi and Ohya, in Appl. Math. Optim. 39:33-59, 1999).

  9. Interstellar Travel and Galactic Colonization: Insights from Percolation Theory and the Yule Process.

    PubMed

    Lingam, Manasvi

    2016-06-01

    In this paper, percolation theory is employed to place tentative bounds on the probability p of interstellar travel and the emergence of a civilization (or panspermia) that colonizes the entire Galaxy. The ensuing ramifications with regard to the Fermi paradox are also explored. In particular, it is suggested that the correlation function of inhabited exoplanets can be used to observationally constrain p in the near future. It is shown, by using a mathematical evolution model known as the Yule process, that the probability distribution for civilizations with a given number of colonized worlds is likely to exhibit a power-law tail. Some of the dynamical aspects of this issue, including the question of timescales and generalizations of percolation theory, are also studied. The limitations of these models, and other avenues for future inquiry, are also outlined. Key Words: Complex life-Extraterrestrial life-Panspermia-Life detection-SETI. Astrobiology 16, 418-426.
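
    One standard discrete realization of the Yule process (Simon's preferential-growth scheme; the parameters below are illustrative, not the paper's) makes the power-law tail easy to see numerically:

```python
import numpy as np
from collections import Counter

rng = np.random.default_rng(42)

def yule_simon(n_steps=200_000, alpha=0.1):
    """Simon's discrete version of the Yule process: at each step a new
    civilization with a single world appears with probability alpha;
    otherwise a new world is added to an existing civilization chosen
    with probability proportional to its current number of worlds. The
    resulting size distribution develops a power-law tail."""
    owner_of_world = [0]   # one entry per world, storing its civilization id
    sizes = [1]            # worlds per civilization
    for _ in range(n_steps):
        if rng.random() < alpha:
            sizes.append(1)
            owner_of_world.append(len(sizes) - 1)
        else:
            # Picking a uniformly random existing world implements
            # size-proportional (preferential) attachment.
            c = owner_of_world[rng.integers(len(owner_of_world))]
            sizes[c] += 1
            owner_of_world.append(c)
    return sizes

sizes = yule_simon()
counts = Counter(sizes)
for s in (1, 2, 4, 8, 16, 32):
    print(s, counts.get(s, 0))   # counts fall off roughly as a power law in s
```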

  10. Comparison between the Health Belief Model and Subjective Expected Utility Theory: predicting incontinence prevention behaviour in post-partum women.

    PubMed

    Dolman, M; Chase, J

    1996-08-01

    A small-scale study was undertaken to test the relative predictive power of the Health Belief Model and Subjective Expected Utility Theory for the uptake of a behaviour (pelvic floor exercises) to reduce post-partum urinary incontinence in primigravida females. A structured questionnaire was used to gather data relevant to both models from a sample of antenatal and postnatal primigravida women. Questions examined the perceived probability of becoming incontinent, the perceived (dis)utility of incontinence, the perceived probability of pelvic floor exercises preventing future urinary incontinence, the costs and benefits of performing pelvic floor exercises, and sources of information and knowledge about incontinence. Multiple regression analysis focused on whether or not respondents intended to perform pelvic floor exercises and the factors influencing their decisions. Aggregated data were analysed to compare the Health Belief Model and Subjective Expected Utility Theory directly.
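
    The Subjective Expected Utility side of the comparison reduces to weighting each outcome's subjective utility by its perceived probability; the numbers below are purely illustrative, not data from the study:

```python
def seu(outcomes):
    """Subjective expected utility: sum of subjective probability times
    subjective utility over the outcomes of an action."""
    return sum(p * u for p, u in outcomes)

# Hypothetical beliefs of one respondent (probabilities and utilities
# are illustrative assumptions, not study data).
do_exercises = [(0.2, -100 - 5),   # still becomes incontinent, plus effort cost
                (0.8,    0 - 5)]   # stays continent, minus effort cost
no_exercises = [(0.5, -100),
                (0.5,    0)]

print("SEU(do exercises):   ", seu(do_exercises))    # -25.0
print("SEU(don't exercise): ", seu(no_exercises))    # -50.0
```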

  11. Information Theory - The Bridge Connecting Bounded Rational Game Theory and Statistical Physics

    NASA Technical Reports Server (NTRS)

    Wolpert, David H.

    2005-01-01

    A long-running difficulty with conventional game theory has been how to modify it to accommodate the bounded rationality of all real-world players. A recurring issue in statistical physics is how best to approximate joint probability distributions with decoupled (and therefore far more tractable) distributions. This paper shows that the same information-theoretic mathematical structure, known as Product Distribution (PD) theory, addresses both issues. In this, PD theory not only provides a principled formulation of bounded rationality and a set of new types of mean field theory in statistical physics; it also shows that those topics are fundamentally one and the same.
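
    The central object in this correspondence is the maximum-entropy (Boltzmann) mixed strategy of a bounded-rational player, p(a) ∝ exp(β E[u|a]), which is formally the same distribution that appears in mean-field statistical physics. A minimal sketch with made-up payoffs:

```python
import numpy as np

def boltzmann_strategy(expected_payoffs, beta):
    """Maximum-entropy mixed strategy for a bounded-rational player:
    p(a) proportional to exp(beta * E[u | a]). beta -> infinity recovers
    full rationality (best response); beta -> 0 gives uniform play."""
    x = beta * np.asarray(expected_payoffs, dtype=float)
    x -= x.max()                      # numerical stability
    w = np.exp(x)
    return w / w.sum()

payoffs = [1.0, 2.0, 1.5]             # hypothetical expected payoffs of 3 moves
for beta in (0.0, 1.0, 10.0):
    print(beta, np.round(boltzmann_strategy(payoffs, beta), 3))
```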

  12. Visual complexity: a review.

    PubMed

    Donderi, Don C

    2006-01-01

    The idea of visual complexity, the history of its measurement, and its implications for behavior are reviewed, starting with structuralism and Gestalt psychology at the beginning of the 20th century and ending with visual complexity theory, perceptual learning theory, and neural circuit theory at the beginning of the 21st. Evidence is drawn from research on single forms, form and texture arrays and visual displays. Form complexity and form probability are shown to be linked through their reciprocal relationship in complexity theory, which is in turn shown to be consistent with recent developments in perceptual learning and neural circuit theory. Directions for further research are suggested.

  13. The Classicist and the Frequentist Approach to Probability within a "TinkerPlots2" Combinatorial Problem

    ERIC Educational Resources Information Center

    Prodromou, Theodosia

    2012-01-01

    This article seeks to address a pedagogical theory of introducing the classicist and the frequentist approach to probability, by investigating important elements in 9th grade students' learning process while working with a "TinkerPlots2" combinatorial problem. Results from this research study indicate that, after the students had seen…

  14. Come on Down ... The Prize Is Right in Your Classroom

    ERIC Educational Resources Information Center

    Butterworth, William T.; Coe, Paul R.

    2004-01-01

    "The Price is Right" ("TPIR") is a rich source of examples of applied probability, combinatorics, and game theory. While some of the games played on stage by individual contestants stress a knowledge of pricing, many are also heavily based on probability. "TPIR" stage games are a treasury of interesting modules that can be effective learning tools…

  15. The Influence of Improper Sets of Information on Judgment: How Irrelevant Information Can Bias Judged Probability

    ERIC Educational Resources Information Center

    Dougherty, Michael R.; Sprenger, Amber

    2006-01-01

    This article introduces 2 new sources of bias in probability judgment, discrimination failure and inhibition failure, which are conceptualized as arising from an interaction between error prone memory processes and a support theory like comparison process. Both sources of bias stem from the influence of irrelevant information on participants'…

  16. Quantum probability assignment limited by relativistic causality.

    PubMed

    Han, Yeong Deok; Choi, Taeseung

    2016-03-14

    Quantum theory has nonlocal correlations, which bothered Einstein, but which were found to satisfy relativistic causality. Correlation for a shared quantum state manifests itself, in the standard quantum framework, through joint probability distributions that can be obtained by applying state reduction and the probability assignment rule known as the Born rule. Quantum correlations, which show nonlocality when the shared state is entangled, can be changed if we apply a different probability assignment rule. As a result, the amount of nonlocality in the quantum correlation will be changed. The issue is whether a change of the rule of quantum probability assignment breaks relativistic causality. We have shown that the Born rule for quantum measurement can be derived by requiring the relativistic causality condition. This shows how relativistic causality limits the upper bound of quantum nonlocality through the quantum probability assignment.

  17. Prospect balancing theory: Bounded rationality of drivers' speed choice.

    PubMed

    Schmidt-Daffy, Martin

    2014-02-01

    This paper introduces a new approach to model the psychological determinants of drivers' speed choice: prospect-balancing theory. The theory transfers psychological insight into the bounded rationality of human decision-making to the field of driving behaviour. Speed choice is conceptualized as a trade-off between two options for action: the option to drive slower and the option to drive faster. Each option is weighted according to a subjective value and a subjectively weighted probability attributed to the achievement of the associated action goal; e.g. to avoid an accident by driving more slowly. The theory proposes that the subjective values and weightings of probability differ systematically from the objective conditions and thereby usually favour a cautious speed choice. A driving simulation study with 24 male participants supports this assumption. In a conflict between a monetary gain in case of fast arrival and a monetary loss in case of a collision with a deer, participants chose a velocity lower than that which would maximize their pay-out. Participants' subjective certainty of arriving in time and of avoiding a deer collision assessed at different driving speeds diverged from the respective objective probabilities in accordance with the observed bias in choice of speed. Results suggest that the bounded rationality of drivers' speed choice might be used to support attempts to improve road safety. Thus, understanding the motivational and perceptual determinants of this intuitive mode of decision-making might be a worthwhile focus of future research. Copyright © 2013 Elsevier Ltd. All rights reserved.

  18. Incompleteness and limit of security theory of quantum key distribution

    NASA Astrophysics Data System (ADS)

    Hirota, Osamu; Murakami, Dan; Kato, Kentaro; Futami, Fumio

    2012-10-01

    It is claimed in many papers that the trace distance d guarantees universal composition security in quantum key distribution (QKD) protocols such as BB84. In this introductory paper, we first explain explicitly what the main misconception is in the claim of unconditional security for QKD theory. In general terms, the cause of the misunderstanding about the security claim is the Lemma in the paper of Renner. It suggests that the generation of a perfect random key is assured with probability (1-d), and that its failure probability is d. Thus, it concludes that the generated key provides a perfectly random key sequence when the protocol succeeds, so that QKD would provide perfect secrecy for the one-time pad. This is the reason for the composition claim. However, the trace distance (or variational distance) is not the probability of such an event. If d is not small enough, the generated key sequence is always non-uniform. One therefore needs to reconstruct the evaluation of the trace distance if one wants to use it. One should first go back to indistinguishability theory in the computational-complexity setting and clarify the meaning of the value of the variational distance; the same analysis is also necessary for the information-theoretic case. The recent series of papers by H. P. Yuen has given the answer to such questions. In this paper, we give a more concise description of Yuen's theory and argue that the upper-bound theories for the trace distance by Tomamichel et al. and Hayashi et al. are built on Renner's flawed reasoning and are unsuitable as a security analysis. Finally, we introduce a new macroscopic quantum communication scheme to replace qubit QKD.

  19. The Probabilistic Structure of Quantum Theory as Originating from Optimal Observation in the Face of the Observer's Lack of Knowledge of his Own State

    NASA Astrophysics Data System (ADS)

    Aerts, Sven

    2014-03-01

    One of the problems facing any attempt to understand quantum theory is that the theory does not seem to offer an explanation of the way the probabilities arise. Moreover, it is a commonly held view that no such explanation is compatible with the mathematical structure of quantum theory, i.e. that the theory is inherently indeterministic, simply because nature is like that. We propose an abstract formalisation of the observation of a system in which the interaction between the system and the observer deterministically produces one of n possible outcomes. If the observer consistently manages to realize the outcome which maximizes the likelihood ratio that the outcome was inferred from the state of the system under study (and not from his own state), he will be called optimal. The probability for a repeated measurement on an ensemble of identical system states is then derived as a measure over observer states. If the state of the system is a statistical mixture, the optimal observer produces an unbiased estimate of the components of the mixture. In the case where the state space is a complex Hilbert space, the resulting probability is equal to the one given by the Born rule. The proposal offers a concise interpretation of the meaning of the occurrence of a specific outcome as the unique outcome that, relative to the state of the system, is least dependent on the state of the observer. We note that a similar paradigm is used in the literature on perception to explain optical illusions in human visual perception. We argue that the result strengthens Helmholtz's view that all observation is in fact a form of inference.

  20. Scoring and Classifying Examinees Using Measurement Decision Theory

    ERIC Educational Resources Information Center

    Rudner, Lawrence M.

    2009-01-01

    This paper describes and evaluates the use of measurement decision theory (MDT) to classify examinees based on their item response patterns. The model has a simple framework that starts with the conditional probabilities of examinees in each category or mastery state responding correctly to each item. The presented evaluation investigates: (1) the…
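
    The classification step itself is a direct application of Bayes' rule to those conditional probabilities. A minimal sketch with hypothetical item parameters (two mastery states, four items), not the paper's evaluation code:

```python
import numpy as np

def mdt_posterior(responses, p_correct, prior):
    """Measurement decision theory classification: posterior probability
    of each mastery state given a dichotomous response pattern, computed
    from the conditional probabilities p_correct[state, item] of a
    correct response and a prior over states."""
    responses = np.asarray(responses)            # 1 = correct, 0 = incorrect
    p_correct = np.asarray(p_correct, dtype=float)
    like = np.prod(np.where(responses == 1, p_correct, 1.0 - p_correct), axis=1)
    post = prior * like
    return post / post.sum()

# Hypothetical 2-state (master / non-master), 4-item example.
p_correct = [[0.9, 0.8, 0.85, 0.9],   # masters
             [0.3, 0.4, 0.25, 0.2]]   # non-masters
prior = np.array([0.5, 0.5])
print(np.round(mdt_posterior([1, 1, 0, 1], p_correct, prior), 3))
```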

  1. Protection Motivation and Self-Efficacy: A Model of Health Enhancement.

    ERIC Educational Resources Information Center

    Stanley, Melinda A.

    Protection motivation theory proposes that a perceived threat to health activates cognitive appraisals of the severity of the threatened event, the probability of its occurrence, and the efficacy of a coping response; a recent reformulation of the theory incorporates self-efficacy expectancy as a fourth mediating cognitive process. To test the…

  2. An Attachment Theory Approach to Narrating the Faith Journey of Children of Parental Divorce

    ERIC Educational Resources Information Center

    Kiesling, Chris

    2011-01-01

    This study explores the effects of parental divorce on a child's faith. Drawing from attachment theory, Granqvist and Kirkpatrick proposed two probable developmental pathways to religion. For those with secure attachment, whose cumulative experiences of sensitive, religious caregivers enhance the development of a God image as loving; belief…

  3. Argumentation, Dialogue Theory, and Probability Modeling: Alternative Frameworks for Argumentation Research in Education

    ERIC Educational Resources Information Center

    Nussbaum, E. Michael

    2011-01-01

    Toulmin's model of argumentation, developed in 1958, has guided much argumentation research in education. However, argumentation theory in philosophy and cognitive science has advanced considerably since 1958. There are currently several alternative frameworks of argumentation that can be useful for both research and practice in education. These…

  4. Randomized Item Response Theory Models

    ERIC Educational Resources Information Center

    Fox, Jean-Paul

    2005-01-01

    The randomized response (RR) technique is often used to obtain answers to sensitive questions. A new method is developed to measure latent variables using the RR technique, because direct questioning leads to biased results. Within the RR technique, the probability of a true response is modeled by an item response theory (IRT) model. The RR…
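
    For orientation, the classic Warner randomized response design (the building block that such IRT extensions start from) already shows how a true prevalence can be recovered from deliberately randomized answers; the design parameters in the simulation below are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(7)

def warner_rr_estimate(answers, p_design):
    """Unbiased estimator of the sensitive-trait prevalence pi in
    Warner's randomized response design, where each respondent answers
    the sensitive statement with probability p_design and its negation
    otherwise: pi_hat = (lambda_hat + p - 1) / (2p - 1)."""
    lam_hat = np.mean(answers)
    return (lam_hat + p_design - 1.0) / (2.0 * p_design - 1.0)

# Simulate 10,000 respondents with true prevalence 0.20 and p = 0.75.
true_pi, p_design, n = 0.20, 0.75, 10_000
trait = rng.random(n) < true_pi
asked_sensitive = rng.random(n) < p_design
answers = np.where(asked_sensitive, trait, ~trait)   # "yes" coded as 1
print(round(warner_rr_estimate(answers, p_design), 3))   # close to 0.20
```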

  5. A Progress Report on a Multi-Systems Theory of Communication Behavior.

    ERIC Educational Resources Information Center

    Grunig, James E.

    Previous research is reviewed in which a multi-systems theory of communication behavior has been used to explain communication behavior of individuals and of several organization-related systems and subsystems. Recent research is then summarized which sought to develop conditional probabilities that communication behavior will occur in each of 16…

  6. Modeling Conditional Probabilities in Complex Educational Assessments. CSE Technical Report.

    ERIC Educational Resources Information Center

    Mislevy, Robert J.; Almond, Russell; Dibello, Lou; Jenkins, Frank; Steinberg, Linda; Yan, Duanli; Senturk, Deniz

    An active area in psychometric research is coordinated task design and statistical analysis built around cognitive models. Compared with classical test theory and item response theory, there is often less information from observed data about the measurement-model parameters. On the other hand, there is more information from the grounding…

  7. THREE-PEE SAMPLING THEORY and program 'THRP' for computer generation of selection criteria

    Treesearch

    L. R. Grosenbaugh

    1965-01-01

    Theory necessary for sampling with probability proportional to prediction ('three-pee,' or '3P,' sampling) is first developed and then exemplified by numerical comparisons of several estimators. Program 'THRP' for computer generation of appropriate 3P-sample-selection criteria is described, and convenient random integer dispensers are...
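
    The core 3P selection rule can be sketched in a few lines: each tree's volume is predicted on the spot, a random integer is drawn, and the tree is measured only if the integer does not exceed the prediction, so the inclusion probability is proportional to the prediction. The numbers and the simple ratio-expansion estimator below are illustrative assumptions, not Grosenbaugh's program:

```python
import numpy as np

rng = np.random.default_rng(3)

def three_p_sample(predictions, k):
    """3P selection: tree i is selected for measurement iff a random
    integer drawn uniformly from 1..k does not exceed its predicted
    volume, so its inclusion probability is proportional to the
    prediction. k is chosen so that sum(predictions)/k is roughly the
    desired sample size."""
    draws = rng.integers(1, k + 1, size=len(predictions))
    return np.flatnonzero(draws <= predictions)

# Hypothetical stand: predicted and 'measured' volumes for 500 trees.
pred = rng.uniform(5, 50, size=500)
true = pred * rng.normal(1.0, 0.1, size=500)

sample = three_p_sample(pred, k=200)             # expect ~ sum(pred)/200 trees
ratio = np.mean(true[sample] / pred[sample])     # mean measured/predicted ratio
estimate = ratio * pred.sum()                    # 3P estimate of total volume
print(len(sample), round(estimate, 1), round(true.sum(), 1))
```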

  8. Statistical computation of tolerance limits

    NASA Technical Reports Server (NTRS)

    Wheeler, J. T.

    1993-01-01

    Based on a new theory, two computer codes were developed specifically to calculate the exact statistical tolerance limits for normal distributions with unknown means and variances, for the one-sided and two-sided cases of the tolerance factor k. The quantity k is defined equivalently in terms of the noncentral t-distribution by the probability equation. Two of the four mathematical methods employ the theory developed for the numerical simulation. Several algorithms for numerically integrating and iteratively root-solving the working equations are written to augment the program simulation. The program codes generate tables of k values associated with varying values of the proportion and sample size for each given probability, to show the accuracy obtained for small sample sizes.
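
    For the one-sided case the exact tolerance factor has a closed form in terms of the noncentral t-distribution, k = t'_{confidence}(n-1, z_coverage·√n)/√n, which is presumably what such tables reproduce. A quick check against a commonly tabulated value (the specific numbers are an assumption for illustration):

```python
import numpy as np
from scipy.stats import nct, norm

def one_sided_tolerance_factor(n, coverage, confidence):
    """Exact one-sided tolerance factor k for a normal distribution with
    unknown mean and variance: xbar + k*s covers at least `coverage` of
    the population with probability `confidence`. k comes from the
    noncentral t distribution with df = n-1 and noncentrality
    z_coverage * sqrt(n)."""
    delta = norm.ppf(coverage) * np.sqrt(n)
    return nct.ppf(confidence, df=n - 1, nc=delta) / np.sqrt(n)

# n = 10, 95% coverage, 95% confidence; tabulated value is about 2.91.
print(round(one_sided_tolerance_factor(10, 0.95, 0.95), 3))
```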

  9. Ecological resilience in lakes and the conjunction fallacy.

    PubMed

    Spears, Bryan M; Futter, Martyn N; Jeppesen, Erik; Huser, Brian J; Ives, Stephen; Davidson, Thomas A; Adrian, Rita; Angeler, David G; Burthe, Sarah J; Carvalho, Laurence; Daunt, Francis; Gsell, Alena S; Hessen, Dag O; Janssen, Annette B G; Mackay, Eleanor B; May, Linda; Moorhouse, Heather; Olsen, Saara; Søndergaard, Martin; Woods, Helen; Thackeray, Stephen J

    2017-11-01

    There is a pressing need to apply stability and resilience theory to environmental management to restore degraded ecosystems effectively and to mitigate the effects of impending environmental change. Lakes represent excellent model case studies in this respect and have been used widely to demonstrate theories of ecological stability and resilience that are needed to underpin preventative management approaches. However, we argue that this approach is not yet fully developed because the pursuit of empirical evidence to underpin such theoretically grounded management continues in the absence of an objective probability framework. This has blurred the lines between intuitive logic (based on the elementary principles of probability) and extensional logic (based on assumption and belief) in this field.

  10. Quantum algorithms for quantum field theories.

    PubMed

    Jordan, Stephen P; Lee, Keith S M; Preskill, John

    2012-06-01

    Quantum field theory reconciles quantum mechanics and special relativity, and plays a central role in many areas of physics. We developed a quantum algorithm to compute relativistic scattering probabilities in a massive quantum field theory with quartic self-interactions (φ⁴ theory) in spacetime of four and fewer dimensions. Its run time is polynomial in the number of particles, their energy, and the desired precision, and the algorithm applies at both weak and strong coupling. In the strong-coupling and high-precision regimes, our quantum algorithm achieves exponential speedup over the fastest known classical algorithm.

  11. Learning and dynamics in social systems. Comment on "Collective learning modeling based on the kinetic theory of active particles" by D. Burini et al.

    NASA Astrophysics Data System (ADS)

    Dolfin, Marina

    2016-03-01

    The interesting novelty of the paper by Burini et al. [1] is that the authors present a survey of, and a new approach to, collective learning based on a suitable development of methods of kinetic theory [2] and theoretical tools of evolutionary game theory [3]. Methods of statistical dynamics and kinetic theory lead naturally to stochastic and collective dynamics. Indeed, the authors propose the use of games where the state of the interacting entities is described by probability distributions.

  12. Is There a Conjunction Fallacy in Legal Probabilistic Decision Making?

    PubMed

    Wojciechowski, Bartosz W; Pothos, Emmanuel M

    2018-01-01

    Classical probability theory (CPT) has represented the rational standard for decision making in human cognition. Even though CPT has provided many descriptively excellent decision models, there have also been some empirical results persistently problematic for CPT accounts. The tension between the normative prescription of CPT and human behavior is particularly acute in cases where we have higher expectations for rational decisions. One such case concerns legal decision making from legal experts, such as attorneys and prosecutors and, more so, judges. In the present research we explore one of the most influential CPT decision fallacies, the conjunction fallacy (CF), in a legal decision making task, involving assessing evidence that the same suspect had committed two separate crimes. The information for the two crimes was presented consecutively. Each participant was asked to provide individual ratings for the two crimes in some cases and conjunctive probability rating for both crimes in other cases, after all information had been presented. Overall, 360 probability ratings for guilt were collected from 120 participants, comprised of 40 judges, 40 attorneys and prosecutors, and 40 individuals without legal education. Our results provide evidence for a double conjunction fallacy (in this case, a higher probability of committing both crimes than the probability of committing either crime individually), in the group of individuals without legal education. These results are discussed in terms of their applied implications and in relation to a recent framework for understanding such results, quantum probability theory (QPT).

  14. Economic decision-making compared with an equivalent motor task.

    PubMed

    Wu, Shih-Wei; Delgado, Mauricio R; Maloney, Laurence T

    2009-04-14

    There is considerable evidence that human economic decision-making deviates from the predictions of expected utility theory (EUT) and that human performance conforms to EUT in many perceptual and motor decision tasks. It is possible that these results reflect a real difference in decision-making in the 2 domains but it is also possible that the observed discrepancy simply reflects typical differences in experimental design. We developed a motor task that is mathematically equivalent to choosing between lotteries and used it to compare how the same subject chose between classical economic lotteries and the same lotteries presented in equivalent motor form. In experiment 1, we found that subjects are more risk seeking in deciding between motor lotteries. In experiment 2, we used cumulative prospect theory to model choice and separately estimated the probability weighting functions and the value functions for each subject carrying out each task. We found no patterned differences in how subjects represented outcome value in the motor and the classical tasks. However, the probability weighting functions for motor and classical tasks were markedly and significantly different. Those for the classical task showed a typical tendency to overweight small probabilities and underweight large probabilities, and those for the motor task showed the opposite pattern of probability distortion. This outcome also accounts for the increased risk-seeking observed in the motor tasks of experiment 1. We conclude that the same subject distorts probability, but not value, differently in making identical decisions in motor and classical form.

  15. The exact probability distribution of the rank product statistics for replicated experiments.

    PubMed

    Eisinga, Rob; Breitling, Rainer; Heskes, Tom

    2013-03-18

    The rank product method is a widely accepted technique for detecting differentially regulated genes in replicated microarray experiments. To approximate the sampling distribution of the rank product statistic, the original publication proposed a permutation approach, whereas recently an alternative approximation based on the continuous gamma distribution was suggested. However, both approximations are imperfect for estimating small tail probabilities. In this paper we relate the rank product statistic to number theory and provide a derivation of its exact probability distribution and the true tail probabilities. Copyright © 2013 Federation of European Biochemical Societies. Published by Elsevier B.V. All rights reserved.
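
    Under the null hypothesis that a gene's rank in each replicate is independent and uniform, the exact distribution of the rank product can be obtained by brute-force convolution for small problems; the sketch below is only such a toy enumeration, not the authors' efficient number-theoretic derivation:

```python
from collections import defaultdict

def rank_product_null_dist(n_genes, k_replicates):
    """Exact null distribution of the rank product of one gene across k
    replicates, assuming its rank in each replicate is independently and
    uniformly distributed on 1..n_genes. Brute-force convolution; only
    practical for small n and k."""
    dist = {1: 1.0}
    for _ in range(k_replicates):
        new = defaultdict(float)
        for prod, p in dist.items():
            for r in range(1, n_genes + 1):
                new[prod * r] += p / n_genes
        dist = dict(new)
    return dist

dist = rank_product_null_dist(n_genes=10, k_replicates=3)
rho = 8
p_value = sum(p for prod, p in dist.items() if prod <= rho)
print(round(p_value, 5))   # P(rank product <= 8) under the null
```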

  16. The Tightness of the Kesten-Stigum Reconstruction Bound of Symmetric Model with Multiple Mutations

    NASA Astrophysics Data System (ADS)

    Liu, Wenjian; Jammalamadaka, Sreenivasa Rao; Ning, Ning

    2018-02-01

    It is well known that reconstruction problems, as an interdisciplinary subject, have been studied in numerous contexts including statistical physics, information theory and computational biology, to name a few. We consider a 2q-state symmetric model, with two categories of q states each, and three transition probabilities: the probability to remain in the same state, the probability to change states but remain in the same category, and the probability to change categories. We construct a nonlinear second-order dynamical system based on this model and show that the Kesten-Stigum reconstruction bound is not tight when q ≥ 4.
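
    The model's transition matrix is easy to write down from the three probabilities; the values below are illustrative and merely have to satisfy the row-sum constraint a + (q-1)b + qc = 1:

```python
import numpy as np

def symmetric_2q_matrix(q, a, b, c):
    """Transition matrix of the 2q-state symmetric model: probability a
    of staying in the same state, b of moving to any one of the other
    q-1 states in the same category, and c of moving to any one of the
    q states in the other category. Rows must sum to 1, i.e.
    a + (q-1)*b + q*c = 1."""
    assert abs(a + (q - 1) * b + q * c - 1.0) < 1e-12, "rows must sum to 1"
    M = np.full((2 * q, 2 * q), c)        # cross-category entries
    for cat in (0, 1):
        block = slice(cat * q, (cat + 1) * q)
        M[block, block] = b               # same category, different state
    np.fill_diagonal(M, a)                # stay in the same state
    return M

M = symmetric_2q_matrix(q=4, a=0.40, b=0.10, c=0.075)
print(M.sum(axis=1))   # all rows sum to 1
```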

  17. Single, Complete, Probability Spaces Consistent With EPR-Bohm-Bell Experimental Data

    NASA Astrophysics Data System (ADS)

    Avis, David; Fischer, Paul; Hilbert, Astrid; Khrennikov, Andrei

    2009-03-01

    We show that paradoxical consequences of violations of Bell's inequality are induced by the use of an unsuitable probabilistic description for the EPR-Bohm-Bell experiment. The conventional description (due to Bell) is based on a combination of statistical data collected for different settings of polarization beam splitters (PBSs). In fact, such data consist of conditional probabilities which only partially define a probability space. Ignoring this conditioning leads to apparent contradictions in the classical probabilistic model (due to Kolmogorov). We show how to make a completely consistent probabilistic model by taking into account the probabilities of selecting the settings of the PBSs. Our model both matches the experimental data and is consistent with classical probability theory.

  18. Logic, probability, and human reasoning.

    PubMed

    Johnson-Laird, P N; Khemlani, Sangeet S; Goodwin, Geoffrey P

    2015-04-01

    This review addresses the long-standing puzzle of how logic and probability fit together in human reasoning. Many cognitive scientists argue that conventional logic cannot underlie deductions, because it never requires valid conclusions to be withdrawn - not even if they are false; it treats conditional assertions implausibly; and it yields many vapid, although valid, conclusions. A new paradigm of probability logic allows conclusions to be withdrawn and treats conditionals more plausibly, although it does not address the problem of vapidity. The theory of mental models solves all of these problems. It explains how people reason about probabilities and postulates that the machinery for reasoning is itself probabilistic. Recent investigations accordingly suggest a way to integrate probability and deduction. Copyright © 2015 Elsevier Ltd. All rights reserved.

  19. Aspects of neutrino oscillation in alternative gravity theories

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chakraborty, Sumanta, E-mail: sumantac.physics@gmail.com

    2015-10-01

    Neutrino spin and flavour oscillation in curved spacetime have been studied for the most general static spherically symmetric configuration. Having exploited the spherical symmetry, we have confined ourselves to the equatorial plane in order to determine the spin and flavour oscillation frequencies in this general set-up. Using the symmetry properties, we have derived the spin oscillation frequency for a neutrino moving along a geodesic or in a circular orbit. Starting from the expression for the neutrino spin oscillation frequency, we have shown that even in this general context, in the high-energy limit, the spin oscillation frequency for a neutrino moving along a circular orbit vanishes. We have verified previous results along this line by transforming to Schwarzschild coordinates in the appropriate limit. This finally yields the probability of a neutrino helicity flip, which turns out to be non-zero. For neutrino flavour oscillation we have derived general results for the oscillation phase, which have subsequently been applied to three different gravity theories. One of them appears as a low-energy approximation to string theory, in which an additional field, namely a dilaton field, is coupled to the Maxwell field tensor; this yields a realization of the Reissner-Nordström solution in string theory at low energy. The next corresponds to a generalization of the Schwarzschild solution obtained by adding quadratic curvature terms of all possible forms to the Einstein-Hilbert action. Finally, we have also discussed regular black hole solutions. In all these cases the flavour oscillation probabilities can be determined for solar neutrinos and thus can be used to put bounds on the parameters of these gravity theories. For the spin oscillation probability, we have considered two cases: a Gauss-Bonnet term added to the Einstein-Hilbert action and f(R) gravity. In both cases we could impose bounds on the parameters that are consistent with previous considerations. In a nutshell, in this work we have presented both the spin and flavour oscillation frequencies of a neutrino in the most general static spherically symmetric spacetime, encompassing a vast class of solutions; applying these results to three alternative theories for flavour oscillation and two for spin oscillation puts bounds on the parameters of these theories. Implications are also discussed.

  20. A mathematical model for evolution and SETI.

    PubMed

    Maccone, Claudio

    2011-12-01

    Darwinian evolution theory may be regarded as a part of SETI theory in that the factor f(l) in the Drake equation represents the fraction of planets suitable for life on which life actually arose. In this paper we firstly provide a statistical generalization of the Drake equation where the factor f(l) is shown to follow the lognormal probability distribution. This lognormal distribution is a consequence of the Central Limit Theorem (CLT) of Statistics, stating that the product of a number of independent random variables whose probability densities are unknown and independent of each other approaches the lognormal distribution as the number of factors increases to infinity. In addition we show that the exponential growth of the number of species typical of Darwinian Evolution may be regarded as the geometric locus of the peaks of a one-parameter family of lognormal distributions (b-lognormals) constrained between the time axis and the exponential growth curve. Finally, since each b-lognormal distribution in the family may in turn be regarded as the product of a large number (actually "an infinity") of independent lognormal probability distributions, the mathematical way is paved to further cast Darwinian Evolution into a mathematical theory in agreement with both its typical exponential growth in the number of living species and the Statistical Drake Equation.
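
    The lognormal limit invoked here is easy to see numerically: the logarithm of a product of many independent positive factors is a sum of independent terms, to which the CLT applies. A minimal simulation sketch (the factor distributions below are arbitrary illustrative choices, not the Drake-equation factors themselves):

      import numpy as np

      rng = np.random.default_rng(1)

      # Product of many independent positive factors with different (here: mixed) distributions.
      n_factors, n_samples = 50, 100_000
      factors = np.concatenate([
          rng.uniform(0.5, 1.5, size=(n_factors // 2, n_samples)),
          rng.gamma(2.0, 0.5, size=(n_factors - n_factors // 2, n_samples)),
      ])
      product = factors.prod(axis=0)

      # If the product is approximately lognormal, log(product) is approximately normal.
      logs = np.log(product)
      skew = ((logs - logs.mean()) ** 3).mean() / logs.std() ** 3
      print(f"skewness of log(product): {skew:.3f}  (near 0 for a lognormal product)")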

  1. Modelling Evolution and SETI Mathematically

    NASA Astrophysics Data System (ADS)

    Maccone, Claudio

    2012-05-01

    Darwinian evolution theory may be regarded as a part of SETI theory in that the factor fl in the Drake equation represents the fraction of planets suitable for life on which life actually arose. In this paper we firstly provide a statistical generalization of the Drake equation where the factor fl is shown to follow the lognormal probability distribution. This lognormal distribution is a consequence of the Central Limit Theorem (CLT) of Statistics, stating that the product of a number of independent random variables whose probability densities are unknown and independent of each other approaches the lognormal distribution as the number of factors increases to infinity. In addition we show that the exponential growth of the number of species typical of Darwinian Evolution may be regarded as the geometric locus of the peaks of a one-parameter family of lognormal distributions constrained between the time axis and the exponential growth curve. Finally, since each lognormal distribution in the family may in turn be regarded as the product of a large number (actually "an infinity") of independent lognormal probability distributions, the mathematical way is paved to further cast Darwinian Evolution into a mathematical theory in agreement with both its typical exponential growth in the number of living species and the Statistical Drake Equation.

  2. A Mathematical Model for Evolution and SETI

    NASA Astrophysics Data System (ADS)

    Maccone, Claudio

    2011-12-01

    Darwinian evolution theory may be regarded as a part of SETI theory in that the factor fl in the Drake equation represents the fraction of planets suitable for life on which life actually arose. In this paper we firstly provide a statistical generalization of the Drake equation where the factor fl is shown to follow the lognormal probability distribution. This lognormal distribution is a consequence of the Central Limit Theorem (CLT) of Statistics, stating that the product of a number of independent random variables whose probability densities are unknown and independent of each other approaches the lognormal distribution as the number of factors increases to infinity. In addition we show that the exponential growth of the number of species typical of Darwinian Evolution may be regarded as the geometric locus of the peaks of a one-parameter family of lognormal distributions (b-lognormals) constrained between the time axis and the exponential growth curve. Finally, since each b-lognormal distribution in the family may in turn be regarded as the product of a large number (actually "an infinity") of independent lognormal probability distributions, the mathematical way is paved to further cast Darwinian Evolution into a mathematical theory in agreement with both its typical exponential growth in the number of living species and the Statistical Drake Equation.

  3. Quantum-correlation breaking channels, quantum conditional probability and Perron-Frobenius theory

    NASA Astrophysics Data System (ADS)

    Chruściński, Dariusz

    2013-03-01

    Using the quantum analogs of conditional probability and the classical Bayes theorem, we discuss some aspects of particular entanglement breaking channels: quantum-classical and classical-classical channels. Applying a quantum analog of the Perron-Frobenius theorem, we generalize the recent result of Korbicz et al. (2012) [8] on full and spectrum broadcasting from quantum-classical channels to arbitrary quantum channels.

  4. Brownian motion in time-dependent logarithmic potential: Exact results for dynamics and first-passage properties.

    PubMed

    Ryabov, Artem; Berestneva, Ekaterina; Holubec, Viktor

    2015-09-21

    The paper addresses Brownian motion in the logarithmic potential with time-dependent strength, U(x, t) = g(t)log(x), subject to an absorbing boundary at the origin of coordinates. Such a model can represent the kinetics of diffusion-controlled reactions of charged molecules or the escape of Brownian particles over a time-dependent entropic barrier at the end of a biological pore. We present a simple asymptotic theory which yields the long-time behavior of both the survival probability (first-passage properties) and the moments of the particle position (dynamics). The asymptotic survival probability, i.e., the probability that the particle will not hit the origin before a given time, is a functional of the potential strength. As such, it exhibits a rather varied behavior for different functions g(t). The latter can be grouped into three classes according to the regime of the asymptotic decay of the survival probability. We distinguish (1) the regular regime (power-law decay), (2) the marginal regime (power law times a slow function of time), and (3) the regime of enhanced absorption (decay faster than a power law, e.g., exponential). Results of the asymptotic theory show good agreement with numerical simulations.
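
    A minimal Euler-Maruyama sketch of the model described above (absorbing boundary at the origin, time-dependent strength g(t)); the parameter values and function names are illustrative assumptions, and it only estimates the survival probability at a fixed time rather than the asymptotic regimes classified in the paper.

      import numpy as np

      rng = np.random.default_rng(2)

      def survival_probability(g, t_max=20.0, dt=1e-2, n_paths=20_000, x0=1.0, D=1.0):
          """Monte Carlo estimate of the probability that x(t) has not hit the origin by t_max,
          for dx = -(g(t)/x) dt + sqrt(2 D) dW with an absorbing boundary at x = 0."""
          x = np.full(n_paths, x0)
          alive = np.ones(n_paths, dtype=bool)
          for step in range(int(t_max / dt)):
              t = step * dt
              noise = rng.normal(scale=np.sqrt(2.0 * D * dt), size=n_paths)
              x[alive] = x[alive] - (g(t) / x[alive]) * dt + noise[alive]
              alive &= x > 0.0
          return alive.mean()

      # constant strength vs. a strength that grows in time (enhanced absorption)
      print(survival_probability(lambda t: 1.0))
      print(survival_probability(lambda t: 1.0 + 0.5 * t))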

  5. Misinterpretation of statistical distance in security of quantum key distribution shown by simulation

    NASA Astrophysics Data System (ADS)

    Iwakoshi, Takehisa; Hirota, Osamu

    2014-10-01

    This study tests an interpretation in quantum key distribution (QKD) according to which the trace distance between the distributed quantum state and the ideal mixed state is a maximum failure probability of the protocol. Around 2004, this interpretation was proposed and standardized to provide both key uniformity in the context of universal composability and an operational meaning for the failure probability of key extraction. However, this proposal has not been verified concretely for many years, while H. P. Yuen and O. Hirota have cast doubt on the interpretation since 2009. To test it, a physical random number generator was employed to evaluate key uniformity in QKD. We calculated the statistical distance, which corresponds to the trace distance in quantum theory after a quantum measurement, and compared it with the failure probability to check whether universal composability was obtained. As a result, the statistical distance between the probability distribution of the physical random numbers and the ideal uniform distribution was very large. We also explain why the trace distance is not suitable to guarantee security in QKD, from the viewpoint of quantum binary decision theory.
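
    The statistical (total variation) distance referred to above can be computed directly from an empirical distribution. The sketch below is not the authors' evaluation procedure; it is a minimal illustration, assuming numpy-generated bits and a hypothetical block length.

      import numpy as np
      from collections import Counter

      rng = np.random.default_rng(3)

      def empirical_tv_distance(bits, block_len=8):
          """Total variation (statistical) distance between the empirical distribution of
          block_len-bit blocks and the ideal uniform distribution over 2**block_len values."""
          n_blocks = len(bits) // block_len
          blocks = bits[: n_blocks * block_len].reshape(n_blocks, block_len)
          values = blocks.dot(1 << np.arange(block_len))
          counts = Counter(values.tolist())
          uniform = 1.0 / 2 ** block_len
          empirical = np.array([counts.get(v, 0) / n_blocks for v in range(2 ** block_len)])
          return 0.5 * np.abs(empirical - uniform).sum()

      bits = rng.integers(0, 2, size=200_000)
      print(empirical_tv_distance(bits))   # > 0 even for an ideal generator, due to finite sampling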

  6. Quantum stochastic walks on networks for decision-making.

    PubMed

    Martínez-Martínez, Ismael; Sánchez-Burillo, Eduardo

    2016-03-31

    Recent experiments report violations of the classical law of total probability and incompatibility of certain mental representations when humans process and react to information. Evidence shows promise of a more general quantum theory providing a better explanation of the dynamics and structure of real decision-making processes than classical probability theory. Inspired by this, we show how the behavioral choice-probabilities can arise as the unique stationary distribution of quantum stochastic walkers on the classical network defined from Luce's response probabilities. This work is relevant because (i) we provide a very general framework integrating the positive characteristics of both quantum and classical approaches previously in confrontation, and (ii) we define a cognitive network which can be used to bring other connectivist approaches to decision-making into the quantum stochastic realm. We model the decision-maker as an open system in contact with her surrounding environment, and the duration of the decision-making process turns out also to be a measure of the process' degree of interplay between the unitary and irreversible dynamics. Implementing quantum coherence on classical networks may be a door to better integrate human-like reasoning biases in stochastic models for decision-making.

  7. Quantum stochastic walks on networks for decision-making

    NASA Astrophysics Data System (ADS)

    Martínez-Martínez, Ismael; Sánchez-Burillo, Eduardo

    2016-03-01

    Recent experiments report violations of the classical law of total probability and incompatibility of certain mental representations when humans process and react to information. Evidence shows promise of a more general quantum theory providing a better explanation of the dynamics and structure of real decision-making processes than classical probability theory. Inspired by this, we show how the behavioral choice-probabilities can arise as the unique stationary distribution of quantum stochastic walkers on the classical network defined from Luce’s response probabilities. This work is relevant because (i) we provide a very general framework integrating the positive characteristics of both quantum and classical approaches previously in confrontation, and (ii) we define a cognitive network which can be used to bring other connectivist approaches to decision-making into the quantum stochastic realm. We model the decision-maker as an open system in contact with her surrounding environment, and the duration of the decision-making process turns out also to be a measure of the process’ degree of interplay between the unitary and irreversible dynamics. Implementing quantum coherence on classical networks may be a door to better integrate human-like reasoning biases in stochastic models for decision-making.

  8. Quantum stochastic walks on networks for decision-making

    PubMed Central

    Martínez-Martínez, Ismael; Sánchez-Burillo, Eduardo

    2016-01-01

    Recent experiments report violations of the classical law of total probability and incompatibility of certain mental representations when humans process and react to information. Evidence shows promise of a more general quantum theory providing a better explanation of the dynamics and structure of real decision-making processes than classical probability theory. Inspired by this, we show how the behavioral choice-probabilities can arise as the unique stationary distribution of quantum stochastic walkers on the classical network defined from Luce’s response probabilities. This work is relevant because (i) we provide a very general framework integrating the positive characteristics of both quantum and classical approaches previously in confrontation, and (ii) we define a cognitive network which can be used to bring other connectivist approaches to decision-making into the quantum stochastic realm. We model the decision-maker as an open system in contact with her surrounding environment, and the duration of the decision-making process turns out also to be a measure of the process’ degree of interplay between the unitary and irreversible dynamics. Implementing quantum coherence on classical networks may be a door to better integrate human-like reasoning biases in stochastic models for decision-making. PMID:27030372

  9. Cultural theories of nursing responsive to human needs and values.

    PubMed

    Kikuchi, June F

    2005-01-01

    To present an alternative to the recent proposal that the extant theories of nursing be replaced with culture-specific theories of nursing in order to have theories of nursing that are culturally responsive. A philosophical analysis of the implications for nursing practice of adopting the proposal serves as the basis for recommending a different philosophically based theoretical solution. Anticipated probable objections to that recommendation are considered. Culture-specific theories of nursing might allow nurses to be culturally sensitive. However, because nurses must be humanly responsive to the needs of people as human beings and not just as cultural beings, cultural theories of nursing should be developed within the precepts of a transcultural theory of nursing grounded in a conception of objective human needs and values.

  10. Testing typicality in multiverse cosmology

    NASA Astrophysics Data System (ADS)

    Azhar, Feraz

    2015-05-01

    In extracting predictions from theories that describe a multiverse, we face the difficulty that we must assess probability distributions over possible observations prescribed not just by an underlying theory, but by a theory together with a conditionalization scheme that allows for (anthropic) selection effects. This means we usually need to compare distributions that are consistent with a broad range of possible observations with actual experimental data. One controversial means of making this comparison is by invoking the "principle of mediocrity": that is, the principle that we are typical of the reference class implicit in the conjunction of the theory and the conditionalization scheme. In this paper, we quantitatively assess the principle of mediocrity in a range of cosmological settings, employing "xerographic distributions" to impose a variety of assumptions regarding typicality. We find that for a fixed theory, the assumption that we are typical gives rise to higher likelihoods for our observations. If, however, one allows both the underlying theory and the assumption of typicality to vary, then the assumption of typicality does not always provide the highest likelihoods. Interpreted from a Bayesian perspective, these results support the claim that when one has the freedom to consider different combinations of theories and xerographic distributions (or different "frameworks"), one should favor the framework that has the highest posterior probability; and then from this framework one can infer, in particular, how typical we are. In this way, the invocation of the principle of mediocrity is more questionable than has been recently claimed.

  11. Discriminating Sea Spikes in Incoherent Radar Measurements of Sea Clutter

    DTIC Science & Technology

    2008-03-01

    Abbreviations used in the report (TNO-DV 2008 A067) include CFAR (Constant False-Alarm Rate), CST (Composite Surface Theory), FFT (Fast Fourier Transform), and PDF (Probability Density Function). Sea clutter is described by the composite surface theory (CST), which models the sea surface as small Bragg-resonant capillary waves riding on top of longer gravity waves.

  12. The first cycle of the reflective pedagogical paradigm implementation in the introduction probability theory course

    NASA Astrophysics Data System (ADS)

    Julie, Hongki

    2017-08-01

    One purpose of this study was to describe the steps of the teaching and learning process when the teacher in the Introduction to Probability Theory course wants to teach event probability using the reflective pedagogical paradigm (RPP), and to describe the results achieved by the students. The study consisted of three cycles, but the results presented in this paper are limited to those obtained in the first cycle. The stages conducted by the researcher in the first cycle can be divided into five: (1) getting to know the students' context, (2) planning and providing student learning experiences, (3) facilitating students in action, (4) asking students to make a reflection, and (5) evaluating. The research design was descriptive, combining qualitative and quantitative methods: the students' learning experiences, actions, and reflections are described qualitatively, and the students' evaluation results are described quantitatively. The research subjects were 38 students taking the Introduction to Probability Theory course in class C. The students' reflections showed that quite a lot of students did not completely write down the concepts they had learned and/or were imprecise in describing the relationships between those concepts. In the evaluation, 85.29% of the students scored below 7. Examined more deeply, the students' greatest difficulty lay in the horizontal mathematical process; as a result, they also had difficulty performing the vertical mathematical process.

  13. Inference of emission rates from multiple sources using Bayesian probability theory.

    PubMed

    Yee, Eugene; Flesch, Thomas K

    2010-03-01

    The determination of atmospheric emission rates from multiple sources using inversion (regularized least-squares or best-fit technique) is known to be very susceptible to measurement and model errors in the problem, rendering the solution unusable. In this paper, a new perspective is offered for this problem: namely, it is argued that the problem should be addressed as one of inference rather than inversion. Towards this objective, Bayesian probability theory is used to estimate the emission rates from multiple sources. The posterior probability distribution for the emission rates is derived, accounting fully for the measurement errors in the concentration data and the model errors in the dispersion model used to interpret the data. The Bayesian inferential methodology for emission rate recovery is validated against real dispersion data, obtained from a field experiment involving various source-sensor geometries (scenarios) consisting of four synthetic area sources and eight concentration sensors. The recovery of discrete emission rates from three different scenarios obtained using Bayesian inference and singular value decomposition inversion are compared and contrasted.
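
    For the linear source-sensor setting described above, a simplified Bayesian sketch with Gaussian noise and a Gaussian prior has a closed-form posterior; this is an illustrative assumption, not the paper's full treatment of model error. All names and values below are hypothetical.

      import numpy as np

      rng = np.random.default_rng(4)

      def gaussian_posterior(D, c, noise_var, prior_var):
          """Posterior mean and covariance of source emission rates q for the linear model
          c = D @ q + noise, with independent Gaussian noise and a zero-mean Gaussian prior."""
          n_src = D.shape[1]
          precision = D.T @ D / noise_var + np.eye(n_src) / prior_var
          cov = np.linalg.inv(precision)
          mean = cov @ D.T @ c / noise_var
          return mean, cov

      # 8 sensors observing 4 area sources through an (assumed known) dispersion matrix D
      D = rng.uniform(0.0, 1.0, size=(8, 4))
      q_true = np.array([1.0, 0.5, 2.0, 0.0])
      c = D @ q_true + rng.normal(scale=0.05, size=8)
      mean, cov = gaussian_posterior(D, c, noise_var=0.05**2, prior_var=10.0)
      print(mean.round(2), np.sqrt(np.diag(cov)).round(2))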

  14. Informational need of emotional stress

    NASA Astrophysics Data System (ADS)

    Simonov, P. V.; Frolov, M. V.

    According to the informational theory of emotions[1], emotions in humans depend on the strength of some need (motivation) and the subject's estimate of the probability (possibility) of satisfying that need (achieving the goal). A low probability of need satisfaction leads to negative emotions, which the subject actively minimizes. An increased probability of satisfaction, compared to an earlier forecast, generates positive emotions, which the subject tries to maximize, i.e. to enhance, to prolong, to repeat. The informational theory of emotions encompasses their reflective function, the laws of their appearance, the regulatory significance of emotions, and their role in the organization of behavior. The level of emotional stress influences the operator's performance. A decrease in emotional tonus leads to drowsiness, lack of vigilance, missing of significant signals, and slower reactions. An extremely high stress level disorganizes the activity and complicates it with a trend toward incorrect actions and reactions to insignificant signals (false alarms). The neurophysiological mechanisms of the influence of emotions on perceptual activity and operator performance, as well as the significance of individuality, are discussed.

  15. What subject matter questions motivate the use of machine learning approaches compared to statistical models for probability prediction?

    PubMed

    Binder, Harald

    2014-07-01

    This is a discussion of the following papers: "Probability estimation with machine learning methods for dichotomous and multicategory outcome: Theory" by Jochen Kruppa, Yufeng Liu, Gérard Biau, Michael Kohler, Inke R. König, James D. Malley, and Andreas Ziegler; and "Probability estimation with machine learning methods for dichotomous and multicategory outcome: Applications" by Jochen Kruppa, Yufeng Liu, Hans-Christian Diener, Theresa Holste, Christian Weimar, Inke R. König, and Andreas Ziegler. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  16. Quantum-like Modeling of Cognition

    NASA Astrophysics Data System (ADS)

    Khrennikov, Andrei

    2015-09-01

    This paper begins with a historical review of the mutual influence of physics and psychology, from Freud's invention of psychic energy, inspired by Boltzmann's thermodynamics, to the enrichment quantum physics gained from psychology through the notion of complementarity (the invention of Niels Bohr, who was inspired by William James); we also consider the resonance of the correspondence between Wolfgang Pauli and Carl Jung in both physics and psychology. We then turn to the development of mathematical models for the laws of thought, starting with Boolean logic and progressing towards the foundations of classical probability theory. Interestingly, the laws of classical logic and probability are routinely violated not only by quantum statistical phenomena but by cognitive phenomena as well. This is yet another common feature between quantum physics and psychology. In particular, cognitive data can exhibit a kind of probabilistic interference effect. This similarity with quantum physics convinced a multi-disciplinary group of scientists (physicists, psychologists, economists, sociologists) to apply the mathematical apparatus of quantum mechanics to the modeling of cognition. We illustrate this activity by considering a few concrete phenomena: the order and disjunction effects, recognition of ambiguous figures, and categorization-decision making. In Appendix 1 we briefly present the essentials of the theory of contextual probability and a method for representing contextual probabilities by complex probability amplitudes (a solution of the "inverse Born's problem") based on a quantum-like representation algorithm (QLRA).

  17. Optimal and Most Exact Confidence Intervals for Person Parameters in Item Response Theory Models

    ERIC Educational Resources Information Center

    Doebler, Anna; Doebler, Philipp; Holling, Heinz

    2013-01-01

    The common way to calculate confidence intervals for item response theory models is to assume that the standardized maximum likelihood estimator for the person parameter [theta] is normally distributed. However, this approximation is often inadequate for short and medium test lengths. As a result, the coverage probabilities fall below the given…

  18. The Task of Reviewing and Finding the Right Organizational Change Theory

    ERIC Educational Resources Information Center

    Alase, Abayomi Oluwatosin

    2017-01-01

    Organizational change is probably the single most important undertaking that many organizations wish they could carry out to improve their productivity and profitability performance. This review paper highlights some of the well-known theories of and approaches to organizational change. In the late 1990s and early 2000s, America had one of the best…

  19. Rasch Measurement and Item Banking: Theory and Practice.

    ERIC Educational Resources Information Center

    Nakamura, Yuji

    The Rasch Model is a one-parameter item response theory model stating that the probability of a correct response on a test item is a function of the difficulty of the item and the ability of the candidate. Item banking is useful for language testing. The Rasch Model provides estimates of item difficulties that are meaningful,…
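
    The probability statement in the abstract corresponds to the familiar logistic form of the one-parameter (Rasch) model; a minimal sketch, with illustrative ability and difficulty values:

      import math

      def rasch_probability(theta, b):
          """Rasch (1PL) model: probability of a correct response as a function of
          person ability theta and item difficulty b."""
          return 1.0 / (1.0 + math.exp(-(theta - b)))

      for b in (-1.0, 0.0, 1.0):                 # easy, medium, hard items
          print(b, round(rasch_probability(theta=0.5, b=b), 3))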

  20. Active ingredients of substance use-focused self-help groups.

    PubMed

    Moos, Rudolf H

    2008-03-01

    This paper provides an overview of some of the probable active ingredients of self-help groups for substance use disorders, in light of four related theories that identify common social processes that appear to underlie effective psychosocial treatments for, and continuing remission from, these disorders. Social control theory specifies active ingredients such as bonding, goal direction, and structure; social learning theory specifies the importance of norms and role models; behavioral economics and behavioral choice theory emphasize involvement in rewarding activities other than substance use; and stress and coping theory highlights building self-efficacy and effective coping skills. A review of existing studies suggests that the emphasis on these active ingredients probably underlies some aspects of the effectiveness of self-help groups. Several issues that need to be addressed to enhance understanding of the active ingredients of self-help groups are discussed, including consideration of indices of Alcoholics Anonymous (AA) affiliation as active ingredients, identification of personal characteristics that may moderate the influence of active ingredients on substance use outcomes, examination of whether active ingredients of self-help groups can amplify or compensate for treatment, identification of potential detrimental effects of involvement in self-help groups, and the link between active ingredients of self-help groups and other aspects of the overall recovery milieu, such as the family and social networks.

  1. Probability for Weather and Climate

    NASA Astrophysics Data System (ADS)

    Smith, L. A.

    2013-12-01

    Over the last 60 years, the availability of large-scale electronic computers has stimulated rapid and significant advances both in meteorology and in our understanding of the Earth System as a whole. The speed of these advances was due, in large part, to the sudden ability to explore nonlinear systems of equations. The computer allows the meteorologist to carry a physical argument to its conclusion; the time scales of weather phenomena then allow the refinement of physical theory, numerical approximation or both in light of new observations. Prior to this extension, as Charney noted, the practicing meteorologist could ignore the results of theory with good conscience. Today, neither the practicing meteorologist nor the practicing climatologist can do so, but to what extent, and in what contexts, should they place the insights of theory above quantitative simulation? And in what circumstances can one confidently estimate the probability of events in the world from model-based simulations? Despite solid advances of theory and insight made possible by the computer, the fidelity of our models of climate differs in kind from the fidelity of models of weather. While all prediction is extrapolation in time, weather resembles interpolation in state space, while climate change is fundamentally an extrapolation. The trichotomy of simulation, observation and theory which has proven essential in meteorology will remain incomplete in climate science. Operationally, the roles of probability, indeed the kinds of probability one has access to, are different in operational weather forecasting and climate services. Significant barriers to forming probability forecasts (which can be used rationally as probabilities) are identified. Monte Carlo ensembles can explore sensitivity, diversity, and (sometimes) the likely impact of measurement uncertainty and structural model error. The aims of different ensemble strategies, and fundamental differences in ensemble design to support decision making versus to advance science, are noted. It is argued that, just as no point forecast is complete without an estimate of its accuracy, no model-based probability forecast is complete without an estimate of its own irrelevance. The same nonlinearities that made the electronic computer so valuable link the selection and assimilation of observations, the formation of ensembles, the evolution of models, the casting of model simulations back into observables, and the presentation of this information to those who use it to take action or to advance science. Timescales of interest exceed the lifetime of a climate model and the career of a climate scientist, disarming the trichotomy that led to swift advances in weather forecasting. Providing credible, informative climate services is a more difficult task. In this context, the value of comparing the forecasts of simulation models not only with each other but also with the performance of simple empirical models, whenever possible, is stressed. The credibility of meteorology is based on its ability to forecast and explain the weather. The credibility of climatology will always be based on flimsier stuff. Solid insights of climate science may be obscured if the severe limits on our ability to see the details of the future even probabilistically are not communicated clearly.

  2. Probability density function learning by unsupervised neurons.

    PubMed

    Fiori, S

    2001-10-01

    In a recent work, we introduced the concept of the pseudo-polynomial adaptive activation function neuron (FAN) and presented an unsupervised information-theoretic learning theory for such a structure. The learning model is based on entropy optimization and provides a way of learning probability distributions from incomplete data. The aim of the present paper is to illustrate some theoretical features of the FAN neuron, to extend its learning theory to asymmetrical density function approximation, and to provide an analytical and numerical comparison with other known density function estimation methods, with special emphasis on the universal approximation ability. The paper also provides a survey of PDF learning from incomplete data, as well as results of several experiments performed on real-world problems and signals.

  3. Communication: Energy transfer and reaction dynamics for DCl scattering on Au(111): An ab initio molecular dynamics study.

    PubMed

    Kolb, Brian; Guo, Hua

    2016-07-07

    Scattering and dissociative chemisorption of DCl on Au(111) are investigated using ab initio molecular dynamics with a slab model, in which the top two layers of Au are mobile. Substantial kinetic energy loss in the scattered DCl is found, but the amount of energy transfer is notably smaller than that observed in the experiment. On the other hand, the dissociative chemisorption probability reproduces the experimental trend with respect to the initial kinetic energy, but is about one order of magnitude larger than the reported initial sticking probability. While the theory-experiment agreement is significantly improved from the previous rigid surface model, the remaining discrepancies are still substantial, calling for further scrutiny in both theory and experiment.

  4. Poisson statistics of PageRank probabilities of Twitter and Wikipedia networks

    NASA Astrophysics Data System (ADS)

    Frahm, Klaus M.; Shepelyansky, Dima L.

    2014-04-01

    We use the methods of quantum chaos and Random Matrix Theory for the analysis of statistical fluctuations of PageRank probabilities in directed networks. In this approach the effective energy levels are given by the logarithm of the PageRank probability at a given node. After the standard energy-level unfolding procedure, we establish that the nearest-spacing distribution of PageRank probabilities is described by the Poisson law typical for integrable quantum systems. Our studies are done for the Twitter network and three networks of Wikipedia editions in English, French and German. We argue that due to the absence of level repulsion the PageRank order of nearby nodes can be easily interchanged. The obtained Poisson law implies that the nearby PageRank probabilities fluctuate as random independent variables.
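
    A rough sketch of the procedure (not the paper's Twitter/Wikipedia analysis): take -log of the PageRank probabilities as effective energy levels, rescale spacings to unit mean (a crude stand-in for the proper unfolding step), and compare their quantiles with the Poisson law P(s) = exp(-s). The random graph below is an illustrative assumption.

      import numpy as np
      import networkx as nx

      # Directed random graph as an illustrative stand-in for a real link network.
      G = nx.gnp_random_graph(2000, 0.01, seed=5, directed=True)
      pr = np.array(list(nx.pagerank(G).values()))

      levels = np.sort(-np.log(pr))          # effective "energy levels"
      s = np.diff(levels)
      s = s / s.mean()                       # crude global rescaling to unit mean spacing

      # Poisson statistics predict quantile q at spacing -log(1 - q); compare a few quantiles.
      for q in (0.25, 0.5, 0.75):
          print(q, round(float(np.quantile(s, q)), 3), round(-np.log(1.0 - q), 3))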

  5. The General Necessary Condition for the Validity of Dirac's Transition Perturbation Theory

    NASA Technical Reports Server (NTRS)

    Quang, Nguyen Vinh

    1996-01-01

    For the first time, the general necessary condition for the validity of Dirac's method is explicitly established from the natural requirements of the successive approximation. It is proved that the concept of 'the transition probability per unit time' is not valid. The 'super-platinium rules' for calculating the transition probability are derived for the case of an arbitrarily strong time-independent perturbation.

  6. Various Effects of Embedded Intrapulse Communications on Pulsed Radar

    DTIC Science & Technology

    2017-06-01

    The report discusses the specific types of interference that may be encountered by radar and seeks not merely to understand the overall statistical performance of a pulsed radar with embedded intrapulse communications but to evaluate it in detail. Probability of detection, discussed in Chapter 4, assesses the statistical probability of the radar accurately identifying a target.

  7. Theoretical aspects and modelling of cellular decision making, cell killing and information-processing in photodynamic therapy of cancer.

    PubMed

    Gkigkitzis, Ioannis

    2013-01-01

    The aim of this report is to provide a mathematical model of the mechanism for making binary fate decisions about cell death or survival, during and after Photodynamic Therapy (PDT) treatment, and to supply the logical design for this decision mechanism as an application of rate distortion theory to the biochemical processing of information by the physical system of a cell. Based on system biology models of the molecular interactions involved in the PDT processes previously established, and regarding a cellular decision-making system as a noisy communication channel, we use rate distortion theory to design a time-dependent Blahut-Arimoto algorithm where the input is a stimulus vector composed of the time-dependent concentrations of three PDT-related cell death signaling molecules and the output is a cell fate decision. The molecular concentrations are determined by a group of rate equations. The basic steps are: initialize the probability of the cell fate decision, compute the conditional probability distribution that minimizes the mutual information between input and output, compute the probability of the cell fate decision that minimizes the mutual information, and repeat the last two steps until the probabilities converge; then advance to the next discrete time point and repeat the process. Based on the model from communication theory described in this work, and assuming that the activation of death signal processing occurs when any of the molecular stimulants rises above a predefined threshold (50% of the maximum concentration), for 1800 s of treatment the cell undergoes necrosis within the first 30 minutes with probability in the range 90.0%-99.99%, and in the case of repair/survival it goes through apoptosis within 3-4 hours with probability in the range 90.00%-99.00%. Although there is no experimental validation of the model at this moment, it reproduces some patterns of the survival ratios of predicted experimental data. Analytical modeling based on cell death signaling molecules has been shown to be an independent and useful tool for prediction of the cell survival response to PDT. The model can be adjusted to provide important insights into cellular responses to other treatments, such as hyperthermia, and diseases such as neurodegeneration.
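
    The iteration sketched in the abstract is a Blahut-Arimoto-type alternating update. Below is a minimal, fixed-parameter rate-distortion version in Python with a toy two-state stimulus and two cell fates; all inputs are hypothetical placeholders, not the paper's time-dependent, three-molecule model.

      import numpy as np

      def blahut_arimoto(p_x, distortion, beta, n_iter=200):
          """Minimal rate-distortion Blahut-Arimoto iteration.
          p_x:        probabilities of the input stimulus states (signalling-molecule patterns)
          distortion: d[x, y] cost of mapping stimulus x to decision y (death vs. survival)
          beta:       trade-off parameter; larger beta weights distortion more heavily."""
          n_x, n_y = distortion.shape
          q_y = np.full(n_y, 1.0 / n_y)                    # initial decision distribution
          for _ in range(n_iter):
              Q = q_y * np.exp(-beta * distortion)         # unnormalised conditional Q(y | x)
              Q /= Q.sum(axis=1, keepdims=True)
              q_y = p_x @ Q                                # updated marginal over decisions
          return Q, q_y

      # Two stimulus patterns ("high death signal", "low death signal"), two fates.
      p_x = np.array([0.6, 0.4])
      distortion = np.array([[0.0, 1.0],
                             [1.0, 0.0]])
      Q, q_y = blahut_arimoto(p_x, distortion, beta=4.0)
      print(Q.round(3), q_y.round(3))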

  8. Electron number probability distributions for correlated wave functions.

    PubMed

    Francisco, E; Martín Pendás, A; Blanco, M A

    2007-03-07

    Efficient formulas for computing the probability of finding exactly an integer number of electrons in an arbitrarily chosen volume are only known for single-determinant wave functions [E. Cances et al., Theor. Chem. Acc. 111, 373 (2004)]. In this article, an algebraic method is presented that extends these formulas to the case of multideterminant wave functions and any number of disjoint volumes. The derived expressions are applied to compute the probabilities within the atomic domains derived from the space partitioning based on the quantum theory of atoms in molecules. Results for a series of test molecules are presented, paying particular attention to the effects of electron correlation and of some numerical approximations on the computed probabilities.

  9. Coincidence and covariance data acquisition in photoelectron and -ion spectroscopy. I. Formal theory

    NASA Astrophysics Data System (ADS)

    Mikosch, Jochen; Patchkovskii, Serguei

    2013-10-01

    We derive a formal theory of noisy Poisson processes with multiple outcomes. We obtain simple, compact expressions for the probability distribution function of arbitrarily complex composite events and its moments. We illustrate the utility of the theory by analyzing properties of coincidence and covariance photoelectron-photoion detection involving single-ionization events. The results and techniques introduced in this work are directly applicable to more general coincidence and covariance experiments, including multiple ionization and multiple-ion fragmentation pathways.

  10. Brownian excursions on combs

    NASA Astrophysics Data System (ADS)

    Dean, David S.; Jansons, Kalvis M.

    1993-03-01

    In this paper we use techniques from Ito excursion theory to analyze Brownian motion on generalized combs. Ito excursion theory is a little-known area of probability theory and we therefore present a brief introduction for the uninitiated. A general method for analyzing transport along the backbone of the comb is demonstrated and the specific case of a comb whose teeth are scaling branching trees is examined. We then present a recursive method for evaluating the distribution of the first passage times on hierarchical combs.

  11. Solid Rocket Fuel Constitutive Theory and Polymer Cure

    NASA Technical Reports Server (NTRS)

    Ream, Robert

    2006-01-01

    Solid Rocket Fuel is a complex composite material for which no general constitutive theory, based on first principles, has been developed. One of the principles such a relation would depend on is the morphology of the binder. A theory of polymer curing is required to determine this morphology. During work on such a theory an algorithm was developed for counting the number of ways a polymer chain could assemble. The methods used to develop and check this algorithm led to an analytic solution to the problem. This solution is used in a probability distribution function which characterizes the morphology of the polymer.

  12. Vector-mean-field theory of the fractional quantum Hall effect

    NASA Astrophysics Data System (ADS)

    Rejaei, B.; Beenakker, C. W. J.

    1992-12-01

    A mean-field theory of the fractional quantum Hall effect is formulated based on the adiabatic principle of Greiter and Wilczek. The theory is tested on known bulk properties (excitation gap, fractional charge, and statistics), and then applied to a confined region in a two-dimensional electron gas (quantum dot). For a small number N of electrons in the dot, the exact ground-state energy has cusps at the same angular momentum values as the mean-field theory. For large N, Wen's algebraic decay of the probability for resonant tunneling through the dot is reproduced, albeit with a different exponent.

  13. Distribution of distances between DNA barcode labels in nanochannels close to the persistence length

    NASA Astrophysics Data System (ADS)

    Reinhart, Wesley F.; Reifenberger, Jeff G.; Gupta, Damini; Muralidhar, Abhiram; Sheats, Julian; Cao, Han; Dorfman, Kevin D.

    2015-02-01

    We obtained experimental extension data for barcoded E. coli genomic DNA molecules confined in nanochannels from 40 nm to 51 nm in width. The resulting data set consists of 1 627 779 measurements of the distance between fluorescent probes on 25 407 individual molecules. The probability density for the extension between labels is negatively skewed, and the magnitude of the skewness is relatively insensitive to the distance between labels. The two Odijk theories for DNA confinement bracket the mean extension and its variance, consistent with the scaling arguments underlying the theories. We also find that a harmonic approximation to the free energy, obtained directly from the probability density for the distance between barcode labels, leads to substantial quantitative error in the variance of the extension data. These results suggest that a theory for DNA confinement in such channels must account for the anharmonic nature of the free energy as a function of chain extension.

  14. Probabilistic representation in syllogistic reasoning: A theory to integrate mental models and heuristics.

    PubMed

    Hattori, Masasi

    2016-12-01

    This paper presents a new theory of syllogistic reasoning. The proposed model assumes there are probabilistic representations of given signature situations. Instead of conducting an exhaustive search, the model constructs an individual-based "logical" mental representation that expresses the most probable state of affairs, and derives a necessary conclusion that is not inconsistent with the model using heuristics based on informativeness. The model is a unification of previous influential models. Its descriptive validity has been evaluated against existing empirical data and two new experiments, and by qualitative analyses based on previous empirical findings, all of which supported the theory. The model's behavior is also consistent with findings in other areas, including working memory capacity. The results indicate that people assume the probabilities of all target events mentioned in a syllogism to be almost equal, which suggests links between syllogistic reasoning and other areas of cognition. Copyright © 2016 The Author(s). Published by Elsevier B.V. All rights reserved.

  15. The negated conditional: a litmus test for the suppositional conditional?

    PubMed

    Handley, Simon J; Evans, Jonathan St B T; Thompson, Valerie A

    2006-05-01

    Under the suppositional account of conditionals, when people think about a conditional assertion, "if p then q," they engage in a mental simulation in which they imagine p holds and evaluate the probability that q holds under this supposition. One implication of this account is that belief in a conditional equates to conditional probability [P(q/p)]. In this paper, the authors examine a further implication of this analysis with respect to the wide-scope negation of conditional assertions, "it is not the case that if p then q." Under the suppositional account, nothing categorically follows from the negation of a conditional, other than a second conditional, "if p then not-q." In contrast, according to the mental model theory, a negated conditional is consistent only with the determinate state of affairs, p and not-q. In 4 experiments, the authors compare the contrasting predictions that arise from each of these accounts. The findings are consistent with the suppositional theory but are incongruent with the mental model theory of conditionals.

  16. Generalized quantum Fokker-Planck, diffusion, and Smoluchowski equations with true probability distribution functions.

    PubMed

    Banik, Suman Kumar; Bag, Bidhan Chandra; Ray, Deb Shankar

    2002-05-01

    Traditionally, quantum Brownian motion is described by Fokker-Planck or diffusion equations in terms of quasiprobability distribution functions, e.g., Wigner functions. These often become singular or negative in the full quantum regime. In this paper a simple approach to non-Markovian theory of quantum Brownian motion using true probability distribution functions is presented. Based on an initial coherent state representation of the bath oscillators and an equilibrium canonical distribution of the quantum mechanical mean values of their coordinates and momenta, we derive a generalized quantum Langevin equation in c numbers and show that the latter is amenable to a theoretical analysis in terms of the classical theory of non-Markovian dynamics. The corresponding Fokker-Planck, diffusion, and Smoluchowski equations are the exact quantum analogs of their classical counterparts. The present work is independent of path integral techniques. The theory as developed here is a natural extension of its classical version and is valid for arbitrary temperature and friction (the Smoluchowski equation being considered in the overdamped limit).

  17. Beyond statistical inference: A decision theory for science

    PubMed Central

    KILLEEN, PETER R.

    2008-01-01

    Traditional null hypothesis significance testing does not yield the probability of the null or its alternative and, therefore, cannot logically ground scientific decisions. The decision theory proposed here calculates the expected utility of an effect on the basis of (1) the probability of replicating it and (2) a utility function on its size. It takes significance tests—which place all value on the replicability of an effect and none on its magnitude—as a special case, one in which the cost of a false positive is revealed to be an order of magnitude greater than the value of a true positive. More realistic utility functions credit both replicability and effect size, integrating them for a single index of merit. The analysis incorporates opportunity cost and is consistent with alternate measures of effect size, such as r2 and information transmission, and with Bayesian model selection criteria. An alternate formulation is functionally equivalent to the formal theory, transparent, and easy to compute. PMID:17201351

  18. Testing option pricing with the Edgeworth expansion

    NASA Astrophysics Data System (ADS)

    Balieiro Filho, Ruy Gabriel; Rosenfeld, Rogerio

    2004-12-01

    There is a well-developed framework, the Black-Scholes theory, for the pricing of contracts based on the future prices of certain assets, called options. This theory assumes that the probability distribution of the returns of the underlying asset is a Gaussian distribution. However, it is observed in the market that this hypothesis is flawed, leading to the introduction of a fudge factor, the so-called volatility smile. Therefore, it would be interesting to explore extensions of the Black-Scholes theory to non-Gaussian distributions. In this paper, we provide an explicit formula for the price of an option when the distributions of the returns of the underlying asset is parametrized by an Edgeworth expansion, which allows for the introduction of higher independent moments of the probability distribution, namely skewness and kurtosis. We test our formula with options in the Brazilian and American markets, showing that the volatility smile can be reduced. We also check whether our approach leads to more efficient hedging strategies of these instruments.
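
    For reference, a common fourth-order Gram-Charlier/Edgeworth-type expansion of the standardized return density, which introduces the skewness s and excess kurtosis k as the extra parameters mentioned above (the paper's exact parametrization may group terms differently), reads

      f(x) \approx \varphi(x)\left[1 + \frac{s}{3!}\,\mathrm{He}_3(x) + \frac{k}{4!}\,\mathrm{He}_4(x)\right],
      \qquad \mathrm{He}_3(x) = x^3 - 3x, \qquad \mathrm{He}_4(x) = x^4 - 6x^2 + 3,

    where \varphi(x) is the standard normal density; setting s = k = 0 recovers the Gaussian density and hence the Black-Scholes price.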

  19. Beyond statistical inference: a decision theory for science.

    PubMed

    Killeen, Peter R

    2006-08-01

    Traditional null hypothesis significance testing does not yield the probability of the null or its alternative and, therefore, cannot logically ground scientific decisions. The decision theory proposed here calculates the expected utility of an effect on the basis of (1) the probability of replicating it and (2) a utility function on its size. It takes significance tests--which place all value on the replicability of an effect and none on its magnitude--as a special case, one in which the cost of a false positive is revealed to be an order of magnitude greater than the value of a true positive. More realistic utility functions credit both replicability and effect size, integrating them for a single index of merit. The analysis incorporates opportunity cost and is consistent with alternate measures of effect size, such as r2 and information transmission, and with Bayesian model selection criteria. An alternate formulation is functionally equivalent to the formal theory, transparent, and easy to compute.

  20. Rejecting probability summation for radial frequency patterns, not so Quick!

    PubMed

    Baldwin, Alex S; Schmidtmann, Gunnar; Kingdom, Frederick A A; Hess, Robert F

    2016-05-01

    Radial frequency (RF) patterns are used to assess how the visual system processes shape. They are thought to be detected globally. This is supported by studies that have found summation for RF patterns to be greater than what is possible if the parts were being independently detected and performance only then improved with an increasing number of cycles by probability summation between them. However, the model of probability summation employed in these previous studies was based on High Threshold Theory (HTT), rather than Signal Detection Theory (SDT). We conducted rating scale experiments to investigate the receiver operating characteristics. We find these are of the curved form predicted by SDT, rather than the straight lines predicted by HTT. This means that to test probability summation we must use a model based on SDT. We conducted a set of summation experiments finding that thresholds decrease as the number of modulated cycles increases at approximately the same rate as previously found. As this could be consistent with either additive or probability summation, we performed maximum-likelihood fitting of a set of summation models (Matlab code provided in our Supplementary material) and assessed the fits using cross validation. We find we are not able to distinguish whether the responses to the parts of an RF pattern are combined by additive or probability summation, because the predictions are too similar. We present similar results for summation between separate RF patterns, suggesting that the summation process there may be the same as that within a single RF. Copyright © 2016 Elsevier Ltd. All rights reserved.
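
    To make the two modelling frameworks contrasted above concrete, here is a minimal sketch (illustrative parameters, not the paper's fitted models): HTT probability summation combines independent detectors as 1 - (1 - p)^n, while an SDT alternative lets the observer take the maximum of n noisy channel responses in each 2AFC interval.

      import numpy as np

      rng = np.random.default_rng(6)

      def htt_probability_summation(p_single, n):
          """High Threshold Theory: detection by any of n independent detectors."""
          return 1.0 - (1.0 - p_single) ** n

      def sdt_max_rule_2afc(d_prime, n, n_trials=200_000):
          """Signal Detection Theory alternative: 2AFC proportion correct when the observer
          chooses the interval whose maximum of n channel responses is larger."""
          signal = rng.normal(d_prime, 1.0, size=(n_trials, n)).max(axis=1)
          noise = rng.normal(0.0, 1.0, size=(n_trials, n)).max(axis=1)
          return (signal > noise).mean()

      for n in (1, 2, 4, 8):
          print(n, round(htt_probability_summation(0.5, n), 3), round(sdt_max_rule_2afc(1.0, n), 3))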

  1. A general formula for computing maximum proportion correct scores in various psychophysical paradigms with arbitrary probability distributions of stimulus observations.

    PubMed

    Dai, Huanping; Micheyl, Christophe

    2015-05-01

    Proportion correct (Pc) is a fundamental measure of task performance in psychophysics. The maximum Pc score that can be achieved by an optimal (maximum-likelihood) observer in a given task is of both theoretical and practical importance, because it sets an upper limit on human performance. Within the framework of signal detection theory, analytical solutions for computing the maximum Pc score have been established for several common experimental paradigms under the assumption of Gaussian additive internal noise. However, as the scope of applications of psychophysical signal detection theory expands, the need is growing for psychophysicists to compute maximum Pc scores for situations involving non-Gaussian (internal or stimulus-induced) noise. In this article, we provide a general formula for computing the maximum Pc in various psychophysical experimental paradigms for arbitrary probability distributions of sensory activity. Moreover, easy-to-use MATLAB code implementing the formula is provided. Practical applications of the formula are illustrated, and its accuracy is evaluated, for two paradigms and two types of probability distributions (uniform and Gaussian). The results demonstrate that Pc scores computed using the formula remain accurate even for continuous probability distributions, as long as the conversion from continuous probability density functions to discrete probability mass functions is supported by a sufficiently high sampling resolution. We hope that the exposition in this article, and the freely available MATLAB code, facilitates calculations of maximum performance for a wider range of experimental situations, as well as explorations of the impact of different assumptions concerning internal-noise distributions on maximum performance in psychophysical experiments.
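
    The authors provide MATLAB code; as an independent illustration of the same idea (not their implementation), the maximum proportion correct for a maximum-likelihood observer in 2AFC can also be estimated by Monte Carlo for arbitrary scipy.stats distributions. A minimal sketch, with illustrative distributions:

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(7)

      def max_pc_2afc(dist_a, dist_b, n_trials=500_000):
          """Monte Carlo estimate of the maximum 2AFC proportion correct for an ideal
          (maximum-likelihood) observer, given the observation distributions in the two intervals."""
          xa = dist_a.rvs(size=n_trials, random_state=rng)
          xb = dist_b.rvs(size=n_trials, random_state=rng)
          # log-likelihood ratio of "came from A" for each observation
          llr_a = dist_a.logpdf(xa) - dist_b.logpdf(xa)
          llr_b = dist_a.logpdf(xb) - dist_b.logpdf(xb)
          return (llr_a > llr_b).mean() + 0.5 * (llr_a == llr_b).mean()

      # Gaussian case: matches the classical Phi(d'/sqrt(2)) result
      print(max_pc_2afc(stats.norm(1.0, 1.0), stats.norm(0.0, 1.0)))
      print(stats.norm.cdf(1.0 / np.sqrt(2.0)))
      # A non-Gaussian (uniform) example of the kind the general formula targets
      print(max_pc_2afc(stats.uniform(0.5, 2.0), stats.uniform(0.0, 2.0)))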

  2. The theory, direction, and magnitude of ecosystem fire probability as constrained by precipitation and temperature.

    PubMed

    Guyette, Richard; Stambaugh, Michael C; Dey, Daniel; Muzika, Rose Marie

    2017-01-01

    The effects of climate on wildland fire confront society across a range of different ecosystems. Water and temperature affect combustion dynamics, whether associated with carbon-fueled motors or ecosystems, but through different chemical, physical, and biological processes. We use an ecosystem combustion equation developed with the physical chemistry of atmospheric variables to estimate and simulate fire probability and mean fire interval (MFI). The calibration of ecosystem fire probability with basic combustion chemistry and physics offers a quantitative method to address wildland fire in addition to the well-studied forcing factors such as topography, ignition, and vegetation. We develop a graphic analysis tool for estimating climate-forced fire probability with temperature and precipitation based on an empirical assessment of combustion theory and fire prediction in ecosystems. Climate-affected fire probability for any period, past or future, is estimated with given temperature and precipitation. A graphic analysis of wildland fire dynamics driven by climate supports a dialectic in hydrologic processes that affect ecosystem combustion: 1) the water needed by plants to produce carbon bonds (fuel) and 2) the inhibition of successful reactant collisions by water molecules (humidity and fuel moisture). These two postulates enable a classification scheme for ecosystems into three or more climate categories using their position relative to change points defined by precipitation in combustion dynamics equations. The three classifications of combustion dynamics in ecosystem fire probability are: 1) precipitation insensitive, 2) precipitation unstable, and 3) precipitation sensitive. All three classifications interact in different ways with variable levels of temperature.

  3. The theory, direction, and magnitude of ecosystem fire probability as constrained by precipitation and temperature

    PubMed Central

    Guyette, Richard; Stambaugh, Michael C.; Dey, Daniel

    2017-01-01

    The effects of climate on wildland fire confront society across a range of different ecosystems. Water and temperature affect combustion dynamics, whether associated with carbon-fueled motors or ecosystems, but through different chemical, physical, and biological processes. We use an ecosystem combustion equation developed with the physical chemistry of atmospheric variables to estimate and simulate fire probability and mean fire interval (MFI). The calibration of ecosystem fire probability with basic combustion chemistry and physics offers a quantitative method to address wildland fire in addition to the well-studied forcing factors such as topography, ignition, and vegetation. We develop a graphic analysis tool for estimating climate-forced fire probability with temperature and precipitation based on an empirical assessment of combustion theory and fire prediction in ecosystems. Climate-affected fire probability for any period, past or future, is estimated with given temperature and precipitation. A graphic analysis of wildland fire dynamics driven by climate supports a dialectic in hydrologic processes that affect ecosystem combustion: 1) the water needed by plants to produce carbon bonds (fuel) and 2) the inhibition of successful reactant collisions by water molecules (humidity and fuel moisture). These two postulates enable a classification scheme for ecosystems into three or more climate categories using their position relative to change points defined by precipitation in combustion dynamics equations. The three classifications of combustion dynamics in ecosystem fire probability are: 1) precipitation insensitive, 2) precipitation unstable, and 3) precipitation sensitive. All three classifications interact in different ways with variable levels of temperature. PMID:28704457

  4. Entanglement-enhanced Neyman-Pearson target detection using quantum illumination

    NASA Astrophysics Data System (ADS)

    Zhuang, Quntao; Zhang, Zheshen; Shapiro, Jeffrey H.

    2017-08-01

    Quantum illumination (QI) provides entanglement-based target detection---in an entanglement-breaking environment---whose performance is significantly better than that of optimum classical-illumination target detection. QI's performance advantage was established in a Bayesian setting with the target presumed equally likely to be absent or present and error probability employed as the performance metric. Radar theory, however, eschews that Bayesian approach, preferring the Neyman-Pearson performance criterion to avoid the difficulties of accurately assigning prior probabilities to target absence and presence and appropriate costs to false-alarm and miss errors. We have recently reported an architecture---based on sum-frequency generation (SFG) and feedforward (FF) processing---for minimum error-probability QI target detection with arbitrary prior probabilities for target absence and presence. In this paper, we use our results for FF-SFG reception to determine the receiver operating characteristic---detection probability versus false-alarm probability---for optimum QI target detection under the Neyman-Pearson criterion.

  5. Predictive Game Theory

    NASA Technical Reports Server (NTRS)

    Wolpert, David H.

    2005-01-01

    Probability theory governs the outcome of a game; there is a distribution over mixed strategies, not a single "equilibrium". To predict a single mixed strategy we must use a loss function (external to the game's players). This provides a quantification of any strategy's rationality. We prove that rationality falls as the cost of computation rises (for players who have not previously interacted). All of this extends to games with varying numbers of players.

  6. Structuring an Adult Learning Environment. Part IV: Establishing an Environment for Problem Solving.

    ERIC Educational Resources Information Center

    Frankel, Alan; Brennan, James

    Through the years, many researchers have advanced theories of problem solving. Probably the best definition of problem solving to apply to adult learning programs is Wallas' (1926) four-stage theory. The stages are (1) a preparation, (2) an incubation period, (3) a moment of illumination, and (4) final application or verification of the solution.…

  7. Introduction to Fuzzy Set Theory

    NASA Technical Reports Server (NTRS)

    Kosko, Bart

    1990-01-01

    An introduction to fuzzy set theory is described. Topics covered include: neural networks and fuzzy systems; the dynamical systems approach to machine intelligence; intelligent behavior as adaptive model-free estimation; fuzziness versus probability; fuzzy sets; the entropy-subsethood theorem; adaptive fuzzy systems for backing up a truck-and-trailer; product-space clustering with differential competitive learning; and adaptive fuzzy system for target tracking.

  8. Math Ties: Problem Solving, Logic Teasers, and Math Puzzles All "Tied" To the Math Curriculum. Book B1.

    ERIC Educational Resources Information Center

    Santi, Terri

    This book contains a classroom-tested approach to the teaching of problem solving to all students in Grades 6-8, regardless of ability. Information on problem solving in general is provided, then mathematical problems on logic, exponents, fractions, pre-algebra, algebra, geometry, number theory, set theory, ratio, proportion, percent, probability,…

  9. A Multidimensional Ideal Point Item Response Theory Model for Binary Data

    ERIC Educational Resources Information Center

    Maydeu-Olivares, Albert; Hernandez, Adolfo; McDonald, Roderick P.

    2006-01-01

    We introduce a multidimensional item response theory (IRT) model for binary data based on a proximity response mechanism. Under the model, a respondent at the mode of the item response function (IRF) endorses the item with probability one. The mode of the IRF is the ideal point, or in the multidimensional case, an ideal hyperplane. The model…

  10. Non-Gaussian statistics of soliton timing jitter induced by amplifier noise.

    PubMed

    Ho, Keang-Po

    2003-11-15

    Based on first-order perturbation theory of the soliton, the Gordon-Haus timing jitter induced by amplifier noise is found to be non-Gaussian distributed. Both frequency and timing jitter have larger tail probabilities than the Gaussian distributions given by the linearized perturbation theory. The timing jitter has a larger discrepancy from the Gaussian distribution than does the frequency jitter.

  11. A Test of Turk's Theory of Norm Resistance Using Observational Data on Police-Suspect Encounters

    ERIC Educational Resources Information Center

    Weidner, Robert R.; Terrill, William

    2005-01-01

    Turk's theory of norm resistance explains how authority-subject relations can be structured in manners that have different probabilities of overt conflict (norm resistance). Building on previous research by Lanza-Kaduce and Greenleaf, this study uses data collected as part of an observational study of the police in Indianapolis, Indiana, and St.…

  12. Unsupervised Learning and Pattern Recognition of Biological Data Structures with Density Functional Theory and Machine Learning.

    PubMed

    Chen, Chien-Chang; Juan, Hung-Hui; Tsai, Meng-Yuan; Lu, Henry Horng-Shing

    2018-01-11

    By introducing machine learning methods into density functional theory, we take a detour to construct the most probable density function, which can be estimated by learning relevant features from the system of interest. Using the properties of the universal functional, the vital core of density functional theory, the most probable cluster numbers and the corresponding cluster boundaries in a system under study can be determined simultaneously and automatically, with plausibility grounded in the Hohenberg-Kohn theorems. For method validation and pragmatic applications, interdisciplinary problems ranging from physical to biological systems were examined. The amalgamation of uncharged atomic clusters validated the unsupervised search for cluster numbers, and the corresponding cluster boundaries were exhibited likewise. Highly accurate clustering results on Fisher's iris dataset showed the feasibility and flexibility of the proposed scheme. Brain tumor detection from low-dimensional magnetic resonance imaging datasets and segmentation of high-dimensional neural network imagery in the Brainbow system were also used to assess the method's practicality. The experimental results exhibit a successful connection between the physical theory and machine learning methods and will benefit clinical diagnosis.

  13. Transition probability spaces in loop quantum gravity

    NASA Astrophysics Data System (ADS)

    Guo, Xiao-Kan

    2018-03-01

    We study the (generalized) transition probability spaces, in the sense of Mielnik and Cantoni, for spacetime quantum states in loop quantum gravity. First, we show that loop quantum gravity admits the structures of transition probability spaces. This is exemplified by first checking such structures in covariant quantum mechanics and then identifying the transition probability spaces in spin foam models via a simplified version of general boundary formulation. The transition probability space thus defined gives a simple way to reconstruct the discrete analog of the Hilbert space of the canonical theory and the relevant quantum logical structures. Second, we show that the transition probability space and in particular the spin foam model are 2-categories. Then we discuss how to realize in spin foam models two proposals by Crane about the mathematical structures of quantum gravity, namely, the quantum topos and causal sites. We conclude that transition probability spaces provide us with an alternative framework to understand various foundational questions of loop quantum gravity.

  14. Improvement of gross theory of beta-decay for application to nuclear data

    NASA Astrophysics Data System (ADS)

    Koura, Hiroyuki; Yoshida, Tadashi; Tachibana, Takahiro; Chiba, Satoshi

    2017-09-01

    A theoretical study of β decay and delayed neutrons has been carried out with a global β-decay model, the gross theory. The gross theory is based on a consideration of the sum rule of the β-strength function, and gives reasonable results for β-decay rates and delayed neutrons over the entire nuclear mass region. In a fissioning nucleus, neutrons are produced by β decay of neutron-rich fission fragments from actinides; these are known as delayed neutrons. The average number of delayed neutrons is estimated from the sum of the β-delayed neutron-emission probabilities multiplied by the cumulative fission yield for each nucleus. Such behavior is important for operating nuclear reactors, and if new high-burn-up reactors are adopted, the properties of minor actinides will play an important role in the system; however, these data have not been sufficient. We re-analyze and improve the gross theory. For example, we consider the parity of neutrons and protons at the Fermi surface, and treat a suppression of the allowed transitions within the framework of the gross theory. With the improved gross theory, the underestimated half-lives of the neutron-rich indium isotopes and the neighboring region increase and consequently follow the experimental trend. The ability to reproduce (and also predict) β-decay rates and delayed-neutron emission probabilities is discussed. As part of this work, we have developed a program implementing the gross theory of β-decay, including the improvements; once preparation is finished, this code can be released to the nuclear data community.

  15. Aneuploidy theory explains tumor formation, the absence of immune surveillance, and the failure of chemotherapy.

    PubMed

    Rasnick, David

    2002-07-01

    The autocatalyzed progression of aneuploidy accounts for all cancer-specific phenotypes, the Hayflick limit of cultured cells, carcinogen-induced tumors in mice, the age distribution of human cancer, and multidrug-resistance. Here aneuploidy theory addresses tumor formation. The logistic equation, φ(n+1) = rφ(n)[1 − φ(n)], models the autocatalyzed progression of aneuploidy in vivo and in vitro. The variable φ(n+1) is the average aneuploid fraction of a population of cells at the (n+1)th cell division and is determined by the value at the nth cell division. The value r is the growth control parameter. The logistic equation was used to compute the probability distribution for values of φ after numerous divisions of aneuploid cells. The autocatalyzed progression of aneuploidy follows the laws of deterministic chaos, which means that certain values of φ are more probable than others. The probability map of the logistic equation shows that: 1) an aneuploid fraction of at least 0.30 is necessary to sustain a population of cancer cells; and 2) the most likely aneuploid fraction after many population doublings is 0.70, which is equivalent to a DNA index of 1.7, the point of maximum disorder of the genome that still sustains life. Aneuploidy theory also explains the lack of immune surveillance and the failure of chemotherapy.
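
    A minimal Python sketch of the logistic-map iteration described above; the initial fraction and the growth parameter r used here are illustrative values in the chaotic regime, not those fitted in the paper.

      import numpy as np

      def aneuploid_fraction_trajectory(phi0=0.35, r=3.9, n_divisions=10_000, burn_in=500):
          # Iterate the logistic map phi(n+1) = r*phi(n)*(1 - phi(n)) and return
          # the trajectory after a burn-in, as a stand-in for the autocatalyzed
          # progression of the aneuploid fraction over successive cell divisions.
          phi, out = phi0, []
          for n in range(n_divisions):
              phi = r * phi * (1.0 - phi)
              if n >= burn_in:
                  out.append(phi)
          return np.array(out)

      # Crude histogram of which aneuploid fractions the map visits most often.
      traj = aneuploid_fraction_trajectory()
      hist, edges = np.histogram(traj, bins=20, range=(0.0, 1.0), density=True)
      for lo, h in zip(edges[:-1], hist):
          print(f"{lo:4.2f}-{lo + 0.05:4.2f}: {h:5.2f}")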

  16. Predicting Rotator Cuff Tears Using Data Mining and Bayesian Likelihood Ratios

    PubMed Central

    Lu, Hsueh-Yi; Huang, Chen-Yuan; Su, Chwen-Tzeng; Lin, Chen-Chiang

    2014-01-01

    Objectives Rotator cuff tear is a common cause of shoulder diseases. Correct diagnosis of rotator cuff tears can save patients from further invasive, costly and painful tests. This study used predictive data mining and Bayesian theory to improve the accuracy of diagnosing rotator cuff tears by clinical examination alone. Methods In this retrospective study, 169 patients who had a preliminary diagnosis of rotator cuff tear on the basis of clinical evaluation followed by confirmatory MRI between 2007 and 2011 were identified. MRI was used as a reference standard to classify rotator cuff tears. The predictor variable was the clinical assessment results, which consisted of 16 attributes. This study employed 2 data mining methods (ANN and the decision tree) and a statistical method (logistic regression) to classify the rotator cuff diagnosis into “tear” and “no tear” groups. Likelihood ratio and Bayesian theory were applied to estimate the probability of rotator cuff tears based on the results of the prediction models. Results Our proposed data mining procedures outperformed the classic statistical method. The correct classification rate, sensitivity, specificity and area under the ROC curve for predicting a rotator cuff tear were statistically better in the ANN and decision tree models than in logistic regression. Based on likelihood ratios derived from our prediction models, Fagan's nomogram could be constructed to assess the probability that a patient has a rotator cuff tear using a pretest probability and a prediction result (tear or no tear). Conclusions Our predictive data mining models, combined with likelihood ratios and Bayesian theory, appear to be good tools to classify rotator cuff tears as well as to determine the probability of the presence of the disease, enhancing diagnostic decision making for rotator cuff tears. PMID:24733553
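
    The likelihood-ratio step can be made concrete with Bayes' theorem in odds form, which is what Fagan's nomogram encodes. The sketch below uses hypothetical numbers, not values from the study.

      def post_test_probability(pre_test_prob, likelihood_ratio):
          # Bayes' theorem in odds form, as read off Fagan's nomogram:
          # post-test odds = pre-test odds * likelihood ratio.
          pre_odds = pre_test_prob / (1.0 - pre_test_prob)
          post_odds = pre_odds * likelihood_ratio
          return post_odds / (1.0 + post_odds)

      # Hypothetical example: a 40% pre-test probability of a tear and a
      # positive-prediction likelihood ratio of 5 give roughly a 77% post-test
      # probability.
      print(post_test_probability(0.40, 5.0))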

  17. Quantum mechanical reaction probability of triplet ketene at the multireference second-order perturbation level of theory.

    PubMed

    Ogihara, Yusuke; Yamamoto, Takeshi; Kato, Shigeki

    2010-09-23

    Triplet ketene exhibits a steplike structure in the experimentally observed dissociation rates, but its mechanism is still unknown despite many theoretical efforts in the past decades. In this paper we revisit this problem by quantum mechanically calculating the reaction probability with multireference-based electronic structure theory. Specifically, we first construct an analytical potential energy surface of the triplet state by fitting it to about 6000 ab initio energies computed at the multireference second-order Møller-Plesset perturbation (MRMP2) level. We then evaluate the cumulative reaction probability by using the transition state wave packet method together with an adiabatically constrained Hamiltonian. The result shows that the imaginary barrier frequency on the triplet surface is 328i cm-1, which is close to the CCSD(T) result (321i cm-1) but is likely too large for reproducing the experimentally observed steps. Indeed, our calculated reaction probability exhibits no signature of steps, reflecting an excessively strong tunneling effect along the reaction coordinate. Nevertheless, it is emphasized that the flatness of the potential profile in the transition-state region (which governs the degree of tunneling) depends strongly on the level of electronic structure calculation, thus leaving some possibility that the use of more accurate theories might lead to the observed steps. We also demonstrate that the triplet potential surface differs significantly between the CASSCF and MRMP2 results, particularly in the transition-state region. This fact seems to require more attention when studying the "nonadiabatic" scenario for the steps, in which the crossing seam between S0 and T1 surfaces is assumed to play a central role.

  18. Weak Measurement and Quantum Smoothing of a Superconducting Qubit

    NASA Astrophysics Data System (ADS)

    Tan, Dian

    In quantum mechanics, the measurement outcome of an observable in a quantum system is intrinsically random, yielding a probability distribution. The state of the quantum system can be described by a density matrix rho(t), which depends on the information accumulated until time t, and represents our knowledge about the system. The density matrix rho(t) gives probabilities for the outcomes of measurements at time t. Further probing of the quantum system allows us to refine our prediction in hindsight. In this thesis, we experimentally examine a quantum smoothing theory in a superconducting qubit by introducing an auxiliary matrix E(t) which is conditioned on information obtained from time t to a final time T. With the complete information before and after time t, the pair of matrices [rho(t), E(t)] can be used to make smoothed predictions for the measurement outcome at time t. We apply the quantum smoothing theory in the case of continuous weak measurement unveiling the retrodicted quantum trajectories and weak values. In the case of strong projective measurement, while the density matrix rho(t) with only diagonal elements in a given basis |n〉 may be treated as a classical mixture, we demonstrate a failure of this classical mixture description in determining the smoothed probabilities for the measurement outcome at time t with both diagonal rho(t) and diagonal E(t). We study the correlations between quantum states and weak measurement signals and examine aspects of the time symmetry of continuous quantum measurement. We also extend our study of quantum smoothing theory to the case of resonance fluorescence of a superconducting qubit with homodyne measurement and observe some interesting effects such as the modification of the excited state probabilities, weak values, and evolution of the predicted and retrodicted trajectories.

  19. Multiple statistical tests: Lessons from a d20.

    PubMed

    Madan, Christopher R

    2016-01-01

    Statistical analyses are often conducted with α = .05. When multiple statistical tests are conducted, this procedure needs to be adjusted to compensate for the otherwise inflated Type I error. In tabletop gaming, it is sometimes desired to roll a 20-sided die (or 'd20') twice and take the greater outcome. Here I draw from probability theory and the case of a d20, where the probability of obtaining any specific outcome is 1/20, to determine the probability of obtaining a specific outcome (Type I error) at least once across repeated, independent statistical tests.
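
    A worked version of the underlying arithmetic, assuming independent tests: the probability of at least one Type I error across n tests at level α is 1 − (1 − α)^n, which for two rolls of a d20 (α = 1/20, n = 2) is 0.0975. A minimal Python sketch:

      def familywise_error(alpha, n_tests):
          # Probability of at least one Type I error across n independent tests,
          # each run at level alpha: 1 - (1 - alpha)^n.
          return 1.0 - (1.0 - alpha) ** n_tests

      # d20 analogy: the chance of seeing a specific face at least once when
      # rolling twice and taking the greater outcome.
      print(familywise_error(1 / 20, 2))   # 0.0975
      print(familywise_error(0.05, 2))     # the same value for two tests at alpha = .05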

  20. Properties of the probability density function of the non-central chi-squared distribution

    NASA Astrophysics Data System (ADS)

    András, Szilárd; Baricz, Árpád

    2008-10-01

    In this paper we consider the probability density function (pdf) of a non-central χ2 distribution with an arbitrary number of degrees of freedom. For this function we prove that it can be represented as a finite sum, and we deduce a partial derivative formula. Moreover, we show that the pdf is log-concave when the number of degrees of freedom is greater than or equal to 2. At the end of this paper we present some Turán-type inequalities for this function, and an elegant application of the monotone form of l'Hospital's rule in probability theory is given.
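
    For readers who want to evaluate this pdf numerically, the Python sketch below uses the standard Poisson-weighted mixture of central χ2 densities (an infinite series, truncated here; the paper's finite-sum representation is a different, closed form) and checks it against SciPy's built-in implementation.

      import numpy as np
      from scipy import stats

      def ncx2_pdf_series(x, df, nc, terms=200):
          # Non-central chi-squared pdf as a Poisson-weighted mixture of central
          # chi-squared pdfs, truncated after `terms` terms.
          j = np.arange(terms)
          weights = stats.poisson.pmf(j, nc / 2.0)
          return np.sum(weights[:, None] * stats.chi2.pdf(x, df + 2 * j[:, None]), axis=0)

      x = np.linspace(0.1, 30.0, 5)
      print(ncx2_pdf_series(x, df=4, nc=3.0))
      print(stats.ncx2.pdf(x, 4, 3.0))   # should agree closely with the series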

  1. Probability theory for 3-layer remote sensing in ideal gas law environment.

    PubMed

    Ben-David, Avishai; Davidson, Charles E

    2013-08-26

    We extend the probability model for 3-layer radiative transfer [Opt. Express 20, 10004 (2012)] to ideal gas conditions where a correlation exists between transmission and temperature of each of the 3 layers. The effect on the probability density function for the at-sensor radiances is surprisingly small, and thus the added complexity of addressing the correlation can be avoided. The small overall effect is due to (a) small perturbations by the correlation on variance population parameters and (b) cancellation of perturbation terms that appear with opposite signs in the model moment expressions.

  2. Investigation of Bose-Einstein Condensates in q-Deformed Potentials with First Order Perturbation Theory

    NASA Astrophysics Data System (ADS)

    Nutku, Ferhat; Aydıner, Ekrem

    2018-02-01

    The Gross-Pitaevskii equation, the governing equation of Bose-Einstein condensates, is solved by first-order perturbation expansion under various q-deformed potentials. Stationary probability distributions reveal one- and two-soliton behavior depending on the type of the q-deformed potential. Additionally, a spatial shift of the probability distribution is found for the dark soliton solution when the q parameter is changed.

  3. The Poisson Random Process. Applications of Probability Theory to Operations Research. Modules and Monographs in Undergraduate Mathematics and Its Applications Project. UMAP Unit 340.

    ERIC Educational Resources Information Center

    Wilde, Carroll O.

    The Poisson probability distribution is seen to provide a mathematical model from which useful information can be obtained in practical applications. The distribution and some situations to which it applies are studied, and ways to find answers to practical questions are noted. The unit includes exercises and a model exam, and provides answers to…
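
    A minimal sketch of the distribution the unit is built around; the arrival rate below is a hypothetical example, not taken from the module.

      from math import exp, factorial

      def poisson_pmf(k, lam):
          # P(N = k) for a Poisson random process with rate * time = lam.
          return exp(-lam) * lam ** k / factorial(k)

      # Hypothetical queueing question: with an average of 3 arrivals per hour,
      # what is the chance of seeing at most one arrival in an hour?
      lam = 3.0
      print(poisson_pmf(0, lam) + poisson_pmf(1, lam))   # about 0.199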

  4. Statistical hydrodynamics and related problems in spaces of probability measures

    NASA Astrophysics Data System (ADS)

    Dostoglou, Stamatios

    2017-11-01

    A rigorous theory of statistical solutions of the Navier-Stokes equations, suitable for exploring Kolmogorov's ideas, has been developed by M.I. Vishik and A.V. Fursikov, culminating in their monograph "Mathematical problems of Statistical Hydromechanics." We review some progress made in recent years following this approach, with emphasis on problems concerning the correlation of velocities and corresponding questions in the space of probability measures on Hilbert spaces.

  5. Accurate step-hold tracking of smoothly varying periodic and aperiodic probability.

    PubMed

    Ricci, Matthew; Gallistel, Randy

    2017-07-01

    Subjects observing many samples from a Bernoulli distribution are able to perceive an estimate of the generating parameter. A question of fundamental importance is how the current percept-what we think the probability now is-depends on the sequence of observed samples. Answers to this question are strongly constrained by the manner in which the current percept changes in response to changes in the hidden parameter. Subjects do not update their percept trial-by-trial when the hidden probability undergoes unpredictable and unsignaled step changes; instead, they update it only intermittently in a step-hold pattern. It could be that the step-hold pattern is not essential to the perception of probability and is only an artifact of step changes in the hidden parameter. However, we now report that the step-hold pattern obtains even when the parameter varies slowly and smoothly. It obtains even when the smooth variation is periodic (sinusoidal) and perceived as such. We elaborate on a previously published theory that accounts for: (i) the quantitative properties of the step-hold update pattern; (ii) subjects' quick and accurate reporting of changes; (iii) subjects' second thoughts about previously reported changes; (iv) subjects' detection of higher-order structure in patterns of change. We also call attention to the challenges these results pose for trial-by-trial updating theories.

  6. A wave function for stock market returns

    NASA Astrophysics Data System (ADS)

    Ataullah, Ali; Davidson, Ian; Tippett, Mark

    2009-02-01

    The instantaneous return on the Financial Times-Stock Exchange (FTSE) All Share Index is viewed as a frictionless particle moving in a one-dimensional square well, but where there is a non-trivial probability of the particle tunneling into the well’s retaining walls. Our analysis demonstrates how the complementarity principle from quantum mechanics applies to stock market prices and how the resulting wave function leads to a probability density that exhibits strong compatibility with returns earned on the FTSE All Share Index. In particular, our analysis shows that the probability density for stock market returns is highly leptokurtic with slight (though not significant) negative skewness. Moreover, the moments of the probability density determined under the complementarity principle employed here are all convergent - in contrast to many of the probability density functions on which the received theory of finance is based.

  7. Stimulus probability effects in absolute identification.

    PubMed

    Kent, Christopher; Lamberts, Koen

    2016-05-01

    This study investigated the effect of stimulus presentation probability on accuracy and response times in an absolute identification task. Three schedules of presentation were used to investigate the interaction between presentation probability and stimulus position within the set. Data from individual participants indicated strong effects of presentation probability on both proportion correct and response times. The effects were moderated by the ubiquitous stimulus position effect. The accuracy and response time data were predicted by an exemplar-based model of perceptual cognition (Kent & Lamberts, 2005). The bow in discriminability was also attenuated when presentation probability for middle items was relatively high, an effect that will constrain future model development. The study provides evidence for item-specific learning in absolute identification. Implications for other theories of absolute identification are discussed. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  8. Six-dimensional quantum dynamics study for the dissociative adsorption of HCl on Au(111) surface

    NASA Astrophysics Data System (ADS)

    Liu, Tianhui; Fu, Bina; Zhang, Dong H.

    2013-11-01

    The six-dimensional quantum dynamics calculations for the dissociative chemisorption of HCl on Au(111) are carried out using the time-dependent wave-packet approach, based on an accurate PES which was recently developed by neural network fitting to density functional theory energy points. The influence of vibrational excitation and rotational orientation of HCl on the reactivity is investigated by calculating the exact six-dimensional dissociation probabilities, as well as the four-dimensional fixed-site dissociation probabilities. The vibrational excitation of HCl enhances the reactivity, and the helicopter orientation yields a higher dissociation probability than the cartwheel orientation. An interesting new site-averaging effect is found for the title molecule-surface system: the six-dimensional dissociation probability can essentially be reproduced by averaging the four-dimensional dissociation probabilities over 25 fixed sites.

  9. An Inverse Problem for a Class of Conditional Probability Measure-Dependent Evolution Equations

    PubMed Central

    Mirzaev, Inom; Byrne, Erin C.; Bortz, David M.

    2016-01-01

    We investigate the inverse problem of identifying a conditional probability measure in measure-dependent evolution equations arising in size-structured population modeling. We formulate the inverse problem as a least squares problem for the probability measure estimation. Using the Prohorov metric framework, we prove existence and consistency of the least squares estimates and outline a discretization scheme for approximating a conditional probability measure. For this scheme, we prove general method stability. The work is motivated by Partial Differential Equation (PDE) models of flocculation for which the shape of the post-fragmentation conditional probability measure greatly impacts the solution dynamics. To illustrate our methodology, we apply the theory to a particular PDE model that arises in the study of population dynamics for flocculating bacterial aggregates in suspension, and provide numerical evidence for the utility of the approach. PMID:28316360

  10. Abort Trigger False Positive and False Negative Analysis Methodology for Threshold-Based Abort Detection

    NASA Technical Reports Server (NTRS)

    Melcher, Kevin J.; Cruz, Jose A.; Johnson, Stephen B.; Lo, Yunnhon

    2015-01-01

    This paper describes a quantitative methodology for bounding the false positive (FP) and false negative (FN) probabilities associated with a human-rated launch vehicle abort trigger (AT) that includes sensor data qualification (SDQ). In this context, an AT is a hardware and software mechanism designed to detect the existence of a specific abort condition. Also, SDQ is an algorithmic approach used to identify sensor data suspected of being corrupt so that suspect data does not adversely affect an AT's detection capability. The FP and FN methodologies presented here were developed to support estimation of the probabilities of loss of crew and loss of mission for the Space Launch System (SLS), which is being developed by the National Aeronautics and Space Administration (NASA). The paper provides a brief overview of system health management as an extension of control theory, and describes how ATs and the calculation of FP and FN probabilities relate to this theory. The discussion leads to a detailed presentation of the FP and FN methodology and an example showing how the FP and FN calculations are performed. This detailed presentation includes a methodology for calculating the change in FP and FN probabilities that result from including SDQ in the AT architecture. To avoid proprietary and sensitive data issues, the example incorporates a mixture of open literature and fictitious reliability data. Results presented in the paper demonstrate the effectiveness of the approach in providing quantitative estimates that bound the probability of a FP or FN abort determination.
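
    As an illustration of the threshold-based FP/FN idea only (the paper's methodology additionally folds in sensor data qualification, redundancy logic, and reliability data), the Python sketch below computes single-sample FP and FN probabilities for a Gaussian-noise measurement compared against a fixed abort threshold; all numbers are hypothetical.

      from statistics import NormalDist

      def fp_fn_probabilities(threshold, mu_nominal, mu_abort, sigma):
          # Single-sample false-positive / false-negative probabilities for a
          # threshold-based abort trigger with Gaussian measurement noise.
          nominal = NormalDist(mu_nominal, sigma)
          abort = NormalDist(mu_abort, sigma)
          fp = 1.0 - nominal.cdf(threshold)   # nominal reading exceeds the threshold
          fn = abort.cdf(threshold)           # a true abort condition stays below it
          return fp, fn

      print(fp_fn_probabilities(threshold=8.0, mu_nominal=5.0, mu_abort=10.0, sigma=1.0))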

  11. Scalar decay in two-dimensional chaotic advection and Batchelor-regime turbulence

    NASA Astrophysics Data System (ADS)

    Fereday, D. R.; Haynes, P. H.

    2004-12-01

    This paper considers the decay in time of an advected passive scalar in a large-scale flow. The relation between the decay predicted by "Lagrangian stretching theories," which consider evolution of the scalar field within a small fluid element and then average over many such elements, and that observed at large times in numerical simulations, associated with emergence of a "strange eigenmode" is discussed. Qualitative arguments are supported by results from numerical simulations of scalar evolution in two-dimensional spatially periodic, time aperiodic flows, which highlight the differences between the actual behavior and that predicted by the Lagrangian stretching theories. In some cases the decay rate of the scalar variance is different from the theoretical prediction and determined globally and in other cases it apparently matches the theoretical prediction. An updated theory for the wavenumber spectrum of the scalar field and a theory for the probability distribution of the scalar concentration are presented. The wavenumber spectrum and the probability density function both depend on the decay rate of the variance, but can otherwise be calculated from the statistics of the Lagrangian stretching history. In cases where the variance decay rate is not determined by the Lagrangian stretching theory, the wavenumber spectrum for scales that are much smaller than the length scale of the flow but much larger than the diffusive scale is argued to vary as k^(-1+ρ), where k is wavenumber, and ρ is a positive number which depends on the decay rate of the variance γ2 and on the Lagrangian stretching statistics. The probability density function for the scalar concentration is argued to have algebraic tails, with exponent roughly -3 and with a cutoff that is determined by diffusivity κ and scales roughly as κ^(-1/2), and these predictions are shown to be in good agreement with numerical simulations.

  12. Quantum gravity in timeless configuration space

    NASA Astrophysics Data System (ADS)

    Gomes, Henrique

    2017-12-01

    On the path towards quantum gravity we find friction between temporal relations in quantum mechanics (QM) (where they are fixed and field-independent), and in general relativity (where they are field-dependent and dynamic). This paper aims to attenuate that friction, by encoding gravity in the timeless configuration space of spatial fields with dynamics given by a path integral. The framework demands that boundary conditions for this path integral be uniquely given, but unlike other approaches where they are prescribed—such as the no-boundary and the tunneling proposals—here I postulate basic principles to identify boundary conditions in a large class of theories. Uniqueness arises only if a reduced configuration space can be defined and if it has a profoundly asymmetric fundamental structure. These requirements place strong restrictions on the field and symmetry content of theories encompassed here; shape dynamics is one such theory. When these constraints are met, any emerging theory will have a Born rule given merely by a particular volume element built from the path integral in (reduced) configuration space. Also as in other boundary proposals, Time, including space-time, emerges as an effective concept; valid for certain curves in configuration space but not assumed from the start. When some such notion of time becomes available, conservation of (positive) probability currents ensues. I show that, in the appropriate limits, a Schrödinger equation dictates the evolution of weakly coupled source fields on a classical gravitational background. Due to the asymmetry of reduced configuration space, these probabilities and currents avoid a known difficulty of standard WKB approximations for Wheeler DeWitt in minisuperspace: the selection of a unique Hamilton–Jacobi solution to serve as background. I illustrate these constructions with a simple example of a full quantum gravitational theory (i.e. not in minisuperspace) for which the formalism is applicable, and give a formula for calculating gravitational semi-classical relative probabilities in it.

  13. Complex and unpredictable Cardano

    NASA Astrophysics Data System (ADS)

    Ekert, Artur

    2008-08-01

    This purely recreational paper is about one of the most colorful characters of the Italian Renaissance, Girolamo Cardano, and the discovery of two basic ingredients of quantum theory, probability and complex numbers.

  14. Identification of dynamic systems, theory and formulation

    NASA Technical Reports Server (NTRS)

    Maine, R. E.; Iliff, K. W.

    1985-01-01

    The problem of estimating parameters of dynamic systems is addressed in order to present the theoretical basis of system identification and parameter estimation in a manner that is complete and rigorous, yet understandable with minimal prerequisites. Maximum likelihood and related estimators are highlighted. The approach used requires familiarity with calculus, linear algebra, and probability, but does not require knowledge of stochastic processes or functional analysis. The treatment emphasizes unification of the various areas in estimation; estimation in dynamic systems is treated as a direct outgrowth of static system theory. Topics covered include basic concepts and definitions; numerical optimization methods; probability; statistical estimators; estimation in static systems; stochastic processes; state estimation in dynamic systems; output error, filter error, and equation error methods of parameter estimation in dynamic systems; and the accuracy of the estimates.

  15. Finding Bounded Rational Equilibria. Part 1; Iterative Focusing

    NASA Technical Reports Server (NTRS)

    Wolpert, David H.

    2004-01-01

    A long-running difficulty with conventional game theory has been how to modify it to accommodate the bounded rationality characterizing all real-world players. A recurring issue in statistical physics is how best to approximate joint probability distributions with decoupled (and therefore far more tractable) distributions. It has recently been shown that the same information-theoretic mathematical structure, known as Probability Collectives (PC), underlies both issues. This relationship between statistical physics and game theory allows techniques and insights from the one field to be applied to the other. In particular, PC provides a formal model-independent definition of the degree of rationality of a player and of bounded rationality equilibria. This pair of papers extends previous work on PC by introducing new computational approaches to effectively find bounded rationality equilibria of common-interest (team) games.

  16. What is Quantum Information?

    NASA Astrophysics Data System (ADS)

    Lombardi, Olimpia; Fortin, Sebastian; Holik, Federico; López, Cristian

    2017-04-01

    Preface; Introduction; Part I. About the Concept of Information: 1. About the concept of information Sebastian Fortin and Olimpia Lombardi; 2. Representation, information, and theories of information Armond Duwell; 3. Information, communication, and manipulability Olimpia Lombardi and Cristian López; Part II. Information and quantum mechanics: 4. Quantum versus classical information Jeffrey Bub; 5. Quantum information and locality Dennis Dieks; 6. Pragmatic information in quantum mechanics Juan Roederer; 7. Interpretations of quantum theory: a map of madness Adán Cabello; Part III. Probability, Correlations, and Information: 8. On the tension between ontology and epistemology in quantum probabilities Amit Hagar; 9. Inferential versus dynamical conceptions of physics David Wallace; 10. Classical models for quantum information Federico Holik and Gustavo Martin Bosyk; 11. On the relative character of quantum correlations Guido Bellomo and Ángel Ricardo Plastino; Index.

  17. Do juries meet our expectations?

    PubMed

    Arkes, Hal R; Mellers, Barbara A

    2002-12-01

    Surveys of public opinion indicate that people have high expectations for juries. When it comes to serious crimes, most people want errors of convicting the innocent (false positives) or acquitting the guilty (false negatives) to fall well below 10%. Using expected utility theory, Bayes' Theorem, signal detection theory, and empirical evidence from detection studies of medical decision making, eyewitness testimony, and weather forecasting, we argue that the frequency of mistakes probably far exceeds these "tolerable" levels. We are not arguing against the use of juries. Rather, we point out that a closer look at jury decisions reveals a serious gap between what we expect from juries and what probably occurs. When deciding issues of guilt and/or punishing convicted criminals, we as a society should recognize and acknowledge the abundance of error.

  18. Spatial dynamics of invasion: the geometry of introduced species.

    PubMed

    Korniss, Gyorgy; Caraco, Thomas

    2005-03-07

    Many exotic species combine low probability of establishment at each introduction with rapid population growth once introduction does succeed. To analyse this phenomenon, we note that invaders often cluster spatially when rare, and consequently an introduced exotic's population dynamics should depend on locally structured interactions. Ecological theory for spatially structured invasion relies on deterministic approximations, and determinism does not address the observed uncertainty of the exotic-introduction process. We take a new approach to the population dynamics of invasion and, by extension, to the general question of invasibility in any spatial ecology. We apply the physical theory for nucleation of spatial systems to a lattice-based model of competition between plant species, a resident and an invader, and the analysis reaches conclusions that differ qualitatively from the standard ecological theories. Nucleation theory distinguishes between dynamics of single- and multi-cluster invasion. Low introduction rates and small system size produce single-cluster dynamics, where success or failure of introduction is inherently stochastic. Single-cluster invasion occurs only if the cluster reaches a critical size, typically preceded by a number of failed attempts. For this case, we identify the functional form of the probability distribution of time elapsing until invasion succeeds. Although multi-cluster invasion for sufficiently large systems exhibits spatial averaging and almost-deterministic dynamics of the global densities, an analytical approximation from nucleation theory, known as Avrami's law, describes our simulation results far better than standard ecological approximations.

  19. Dynamics of Markets

    NASA Astrophysics Data System (ADS)

    McCauley, Joseph L.

    2009-09-01

    Preface; 1. Econophysics: why and what; 2. Neo-classical economic theory; 3. Probability and stochastic processes; 4. Introduction to financial economics; 5. Introduction to portfolio selection theory; 6. Scaling, pair correlations, and conditional densities; 7. Statistical ensembles: deducing dynamics from time series; 8. Martingale option pricing; 9. FX market globalization: evolution of the dollar to worldwide reserve currency; 10. Macroeconomics and econometrics: regression models vs. empirically based modeling; 11. Complexity; Index.

  20. Hardy's argument and successive spin-s measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ahanj, Ali

    2010-07-15

    We consider a hidden-variable theoretic description of successive measurements of noncommuting spin observables on an input spin-s state. In this scenario, the hidden-variable theory leads to a Hardy-type argument that quantum predictions violate it. We show that the maximum probability of success of Hardy's argument in quantum theory is (1/2)^(4s), which is more than in the spatial case.

  1. Boolean approach to dichotomic quantum measurement theories

    NASA Astrophysics Data System (ADS)

    Nagata, K.; Nakamura, T.; Batle, J.; Abdalla, S.; Farouk, A.

    2017-02-01

    Recently, a new measurement theory based on truth values was proposed by Nagata and Nakamura [Int. J. Theor. Phys. 55, 3616 (2016)], that is, a theory where the results of measurements are either 0 or 1. The standard measurement theory accepts a hidden variable model for a single Pauli observable. Hence, we can introduce a classical probability space for the measurement theory in this particular case. Additionally, we discuss in the present contribution the fact that projective measurement theories (the results of which are either +1 or -1) imply the Bell, Kochen, and Specker (BKS) paradox for a single Pauli observable. To justify our assertion, we present the BKS theorem in almost all the two-dimensional states by using a projective measurement theory. As an example, we present the BKS theorem in two-dimensions with white noise. Our discussion provides new insight into the quantum measurement problem by using this measurement theory based on the truth values.

  2. Graph Theory-Based Pinning Synchronization of Stochastic Complex Dynamical Networks.

    PubMed

    Li, Xiao-Jian; Yang, Guang-Hong

    2017-02-01

    This paper is concerned with the adaptive pinning synchronization problem of stochastic complex dynamical networks (CDNs). Based on algebraic graph theory and Lyapunov theory, pinning controller design conditions are derived, and the rigorous convergence analysis of synchronization errors in the probability sense is also conducted. Compared with the existing results, the topology structures of stochastic CDN are allowed to be unknown due to the use of graph theory. In particular, it is shown that the selection of nodes for pinning depends on the unknown lower bounds of coupling strengths. Finally, an example on a Chua's circuit network is given to validate the effectiveness of the theoretical results.

  3. From axiomatics of quantum probability to modelling geological uncertainty and management of intelligent hydrocarbon reservoirs with the theory of open quantum systems.

    PubMed

    Lozada Aguilar, Miguel Ángel; Khrennikov, Andrei; Oleschko, Klaudia

    2018-04-28

    As was recently shown by the authors, quantum probability theory can be used for the modelling of the process of decision-making (e.g. probabilistic risk analysis) for macroscopic geophysical structures such as hydrocarbon reservoirs. This approach can be considered as a geophysical realization of Hilbert's programme on axiomatization of statistical models in physics (the famous sixth Hilbert problem). In this conceptual paper, we continue development of this approach to decision-making under uncertainty which is generated by complexity, variability, heterogeneity, anisotropy, as well as the restrictions to accessibility of subsurface structures. The belief state of a geological expert about the potential of exploring a hydrocarbon reservoir is continuously updated by outputs of measurements, and selection of mathematical models and scales of numerical simulation. These outputs can be treated as signals from the information environment E. The dynamics of the belief state can be modelled with the aid of the theory of open quantum systems: a quantum state (representing uncertainty in beliefs) is dynamically modified through coupling with E; stabilization to a steady state determines a decision strategy. In this paper, the process of decision-making about hydrocarbon reservoirs (e.g. 'explore or not?'; 'open new well or not?'; 'contaminated by water or not?'; 'double or triple porosity medium?') is modelled by using the Gorini-Kossakowski-Sudarshan-Lindblad equation. In our model, this equation describes the evolution of experts' predictions about a geophysical structure. We proceed with the information approach to quantum theory and the subjective interpretation of quantum probabilities (due to quantum Bayesianism). This article is part of the theme issue 'Hilbert's sixth problem'. © 2018 The Author(s).

  4. From axiomatics of quantum probability to modelling geological uncertainty and management of intelligent hydrocarbon reservoirs with the theory of open quantum systems

    NASA Astrophysics Data System (ADS)

    Lozada Aguilar, Miguel Ángel; Khrennikov, Andrei; Oleschko, Klaudia

    2018-04-01

    As was recently shown by the authors, quantum probability theory can be used for the modelling of the process of decision-making (e.g. probabilistic risk analysis) for macroscopic geophysical structures such as hydrocarbon reservoirs. This approach can be considered as a geophysical realization of Hilbert's programme on axiomatization of statistical models in physics (the famous sixth Hilbert problem). In this conceptual paper, we continue development of this approach to decision-making under uncertainty which is generated by complexity, variability, heterogeneity, anisotropy, as well as the restrictions to accessibility of subsurface structures. The belief state of a geological expert about the potential of exploring a hydrocarbon reservoir is continuously updated by outputs of measurements, and selection of mathematical models and scales of numerical simulation. These outputs can be treated as signals from the information environment E. The dynamics of the belief state can be modelled with the aid of the theory of open quantum systems: a quantum state (representing uncertainty in beliefs) is dynamically modified through coupling with E; stabilization to a steady state determines a decision strategy. In this paper, the process of decision-making about hydrocarbon reservoirs (e.g. `explore or not?'; `open new well or not?'; `contaminated by water or not?'; `double or triple porosity medium?') is modelled by using the Gorini-Kossakowski-Sudarshan-Lindblad equation. In our model, this equation describes the evolution of experts' predictions about a geophysical structure. We proceed with the information approach to quantum theory and the subjective interpretation of quantum probabilities (due to quantum Bayesianism). This article is part of the theme issue `Hilbert's sixth problem'.

  5. Randomized central limit theorems: A unified theory.

    PubMed

    Eliazar, Iddo; Klafter, Joseph

    2010-08-01

    The central limit theorems (CLTs) characterize the macroscopic statistical behavior of large ensembles of independent and identically distributed random variables. The CLTs assert that the universal probability laws governing ensembles' aggregate statistics are either Gaussian or Lévy, and that the universal probability laws governing ensembles' extreme statistics are Fréchet, Weibull, or Gumbel. The scaling schemes underlying the CLTs are deterministic: all ensemble components are scaled by a common deterministic scale. However, there are "random environment" settings in which the underlying scaling schemes are stochastic: the ensemble components are scaled by different random scales. Examples of such settings include Holtsmark's law for gravitational fields and the Stretched Exponential law for relaxation times. In this paper we establish a unified theory of randomized central limit theorems (RCLTs), in which the deterministic CLT scaling schemes are replaced with stochastic scaling schemes, and present "randomized counterparts" to the classic CLTs. The RCLT scaling schemes are shown to be governed by Poisson processes with power-law statistics, and the RCLTs are shown to universally yield the Lévy, Fréchet, and Weibull probability laws.
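
    The classic (deterministic-scaling) extreme-value statement can be checked numerically. The Python sketch below, with illustrative sample sizes, compares the empirical distribution of centred maxima of exponential variables with the Gumbel law.

      import numpy as np

      rng = np.random.default_rng(1)

      # The maximum of n i.i.d. Exponential(1) variables, centred by log(n),
      # converges in distribution to the standard Gumbel law exp(-exp(-x)).
      n, ensembles = 1_000, 10_000
      maxima = rng.exponential(size=(ensembles, n)).max(axis=1) - np.log(n)

      # Compare the empirical CDF with the Gumbel CDF at a few points.
      for x in (-1.0, 0.0, 1.0, 2.0):
          empirical = (maxima <= x).mean()
          gumbel = np.exp(-np.exp(-x))
          print(f"x={x:+.1f}  empirical={empirical:.3f}  Gumbel={gumbel:.3f}")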

  6. Randomized central limit theorems: A unified theory

    NASA Astrophysics Data System (ADS)

    Eliazar, Iddo; Klafter, Joseph

    2010-08-01

    The central limit theorems (CLTs) characterize the macroscopic statistical behavior of large ensembles of independent and identically distributed random variables. The CLTs assert that the universal probability laws governing ensembles’ aggregate statistics are either Gaussian or Lévy, and that the universal probability laws governing ensembles’ extreme statistics are Fréchet, Weibull, or Gumbel. The scaling schemes underlying the CLTs are deterministic—scaling all ensemble components by a common deterministic scale. However, there are “random environment” settings in which the underlying scaling schemes are stochastic—scaling the ensemble components by different random scales. Examples of such settings include Holtsmark’s law for gravitational fields and the Stretched Exponential law for relaxation times. In this paper we establish a unified theory of randomized central limit theorems (RCLTs)—in which the deterministic CLT scaling schemes are replaced with stochastic scaling schemes—and present “randomized counterparts” to the classic CLTs. The RCLT scaling schemes are shown to be governed by Poisson processes with power-law statistics, and the RCLTs are shown to universally yield the Lévy, Fréchet, and Weibull probability laws.

  7. Weight for Stephen Finlay.

    PubMed

    Evers, Daan

    2013-04-01

    According to Stephen Finlay, 'A ought to X' means that X-ing is more conducive to contextually salient ends than relevant alternatives. This in turn is analysed in terms of probability. I show why this theory of 'ought' is hard to square with a theory of a reason's weight which could explain why 'A ought to X' logically entails that the balance of reasons favours that A X-es. I develop two theories of weight to illustrate my point. I first look at the prospects of a theory of weight based on expected utility theory. I then suggest a simpler theory. Although neither allows that 'A ought to X' logically entails that the balance of reasons favours that A X-es, this price may be accepted. For there remains a strong pragmatic relation between these claims.

  8. MaxEnt-Based Ecological Theory: A Template for Integrated Catchment Theory

    NASA Astrophysics Data System (ADS)

    Harte, J.

    2017-12-01

    The maximum information entropy procedure (MaxEnt) is both a powerful tool for inferring least-biased probability distributions from limited data and a framework for the construction of complex systems theory. The maximum entropy theory of ecology (METE) describes remarkably well widely observed patterns in the distribution, abundance and energetics of individuals and taxa in relatively static ecosystems. An extension to ecosystems undergoing change in response to disturbance or natural succession (DynaMETE) is in progress. I describe the structure of both the static and the dynamic theory and show a range of comparisons with census data. I then propose a generalization of the MaxEnt approach that could provide a framework for a predictive theory of both static and dynamic, fully-coupled, eco-socio-hydrological catchment systems.

  9. [Knowledge and destiny or longevity and old age: the heritage of Homo sapiens].

    PubMed

    Goddio, A S

    1994-12-01

    Several theories have been proposed to explain ageing: limitation of the number of cell divisions or Hayflick's limit, the genetic theory, the action of free radicals, immune deficiency, etc. All of these theories share several points in common: their genetic determinism or repercussions which appear to be part of the heritage of complex organisms. Progress in genetics with chromosome decoding to localise genes and genetic manipulations or control of gene expression will probably allow an increased life expectancy, perhaps in the near future.

  10. Study on combat effectiveness of air defense missile weapon system based on queuing theory

    NASA Astrophysics Data System (ADS)

    Zhao, Z. Q.; Hao, J. X.; Li, L. J.

    2017-01-01

    Queuing theory provides a method for analyzing the combat effectiveness of an air defense missile weapon system. A service probability model based on queuing theory was constructed and applied to analyzing the combat effectiveness of the "Sidewinder" and "Tor-M1" air defense missile weapon systems. Finally, for different target densities, the combat effectiveness of different combat units of the two types of air defense missile weapon system is calculated. This method can be used to analyze the usefulness of an air defense missile weapon system.
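    As a rough illustration of the kind of service-probability calculation a queuing model involves (the paper's own model is not reproduced in this record), the sketch below treats a fire unit as an M/M/c/c loss system and computes the Erlang-B blocking probability; the arrival rate, engagement time, and channel count are hypothetical.

```python
from math import factorial

def erlang_b(channels: int, offered_load: float) -> float:
    """Blocking probability for an M/M/c/c loss system (Erlang B formula)."""
    numer = offered_load ** channels / factorial(channels)
    denom = sum(offered_load ** k / factorial(k) for k in range(channels + 1))
    return numer / denom

# Hypothetical numbers: a fire unit with 4 launch channels, targets arriving
# at 0.5 per minute, mean engagement time 2 minutes -> offered load a = 1.0.
arrival_rate, engagement_time, channels = 0.5, 2.0, 4
a = arrival_rate * engagement_time
blocking = erlang_b(channels, a)
print(f"P(all channels busy) = {blocking:.4f}")
print(f"Service probability  = {1 - blocking:.4f}")
```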

  11. Fundamentals and Special Problems of Synthetic Aperture Radar (SAR) (Les Aspects Fondamentaux et les Problemes Specifiques aux Radars a Ouverture Synthetique (SAR)

    DTIC Science & Technology

    1992-08-01

    limits of these topics will be included. Digital processing is indispensable for SAR. Theories and special algorithms will be given along with basic...digital processing is indispensable for SAR; theories and specific algorithms will be proposed, as well as processor configurations...equation If N independent pixel values are added, then it follows from the laws of probability theory that the mean value of the sum is identical with

  12. Probability of loss of assured safety in temperature dependent systems with multiple weak and strong links.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, Jay Dean; Oberkampf, William Louis; Helton, Jon Craig

    2004-12-01

    Relationships to determine the probability that a weak link (WL)/strong link (SL) safety system will fail to function as intended in a fire environment are investigated. In the systems under study, failure of the WL system before failure of the SL system is intended to render the overall system inoperational and thus prevent the possible occurrence of accidents with potentially serious consequences. Formal developments of the probability that the WL system fails to deactivate the overall system before failure of the SL system (i.e., the probability of loss of assured safety, PLOAS) are presented for several WL/SL configurations: (i) one WL, one SL, (ii) multiple WLs, multiple SLs with failure of any SL before any WL constituting failure of the safety system, (iii) multiple WLs, multiple SLs with failure of all SLs before any WL constituting failure of the safety system, and (iv) multiple WLs, multiple SLs and multiple sublinks in each SL with failure of any sublink constituting failure of the associated SL and failure of all SLs before failure of any WL constituting failure of the safety system. The indicated probabilities derive from time-dependent temperatures in the WL/SL system and variability (i.e., aleatory uncertainty) in the temperatures at which the individual components of this system fail and are formally defined as multidimensional integrals. Numerical procedures based on quadrature (i.e., trapezoidal rule, Simpson's rule) and also on Monte Carlo techniques (i.e., simple random sampling, importance sampling) are described and illustrated for the evaluation of these integrals. Example uncertainty and sensitivity analyses for PLOAS involving the representation of uncertainty (i.e., epistemic uncertainty) with probability theory and also with evidence theory are presented.
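    The report's quadrature and Monte Carlo procedures are not reproduced here; as a minimal sketch under stated assumptions, the code below uses simple random sampling to estimate PLOAS for a configuration in which failure of any SL before any WL defeats the safety function. It assumes every link sees the same monotonically rising temperature (so "fails first" is equivalent to "has the lowest failure temperature") and uses hypothetical normal failure-temperature distributions rather than the report's data.

```python
import numpy as np

rng = np.random.default_rng(0)

def ploas_any_sl_before_any_wl(n_wl, n_sl, n_samples=100_000):
    """Monte Carlo estimate of P(loss of assured safety) for the configuration
    in which failure of ANY strong link before ANY weak link defeats the
    safety function.  Assumes all links experience the same monotonically
    rising temperature, so failure order matches the order of failure
    temperatures.  Failure-temperature distributions are hypothetical."""
    wl_temps = rng.normal(loc=350.0, scale=20.0, size=(n_samples, n_wl))
    sl_temps = rng.normal(loc=450.0, scale=30.0, size=(n_samples, n_sl))
    return np.mean(sl_temps.min(axis=1) < wl_temps.min(axis=1))

print(ploas_any_sl_before_any_wl(n_wl=2, n_sl=2))
```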

  13. War Termination Concepts and Political, Economic and Military Targeting

    DTIC Science & Technology

    1978-03-01

    imposed by the level of military technology, also served to reduce the probability of transcultural conflict; thus shared value systems probably...from the Berlin Blockade in 1948 through the Cuban Missile Crisis, and, utilizing the "lessons learned" approach, reformulates deterrence theory on...aspect of which is the "general crisis of capitalism," presently said to be in its most severe phase since the 1930s. The present crisis, Soviet

  14. Higher order Stark effect and transition probabilities on hyperfine structure components of hydrogen like atoms

    NASA Astrophysics Data System (ADS)

    Pal'Chikov, V. G.

    2000-08-01

    A quantum-electrodynamical (QED) perturbation theory is developed for hydrogen and hydrogen-like atomic systems with interaction between bound electrons and radiative field being treated as the perturbation. The dependence of the perturbed energy of levels on hyperfine structure (hfs) effects and on the higher-order Stark effect is investigated. Numerical results have been obtained for the transition probability between the hfs components of hydrogen-like bismuth.

  15. Predictability of currency market exchange

    NASA Astrophysics Data System (ADS)

    Ohira, Toru; Sazuka, Naoya; Marumo, Kouhei; Shimizu, Tokiko; Takayasu, Misako; Takayasu, Hideki

    2002-05-01

    We analyze tick data of yen-dollar exchange with a focus on its up and down movement. We show that there exists a rather particular conditional probability structure with such high frequency data. This result provides us with evidence to question one of the basic assumptions of the traditional market theory, where such bias in high frequency price movements is regarded as not present. We also construct systematically a random walk model reflecting this probability structure.
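    A minimal sketch of the kind of conditional-probability estimate described: given a 0/1 series of up/down moves, tabulate the frequency of "up" following each recent history. The data here are simulated coin flips standing in for real tick data, and the function and variable names are illustrative.

```python
from collections import Counter
import random

# Hypothetical binary series of tick moves (1 = up, 0 = down); real tick data
# would replace this simulated, unbiased sequence.
random.seed(1)
moves = [random.randint(0, 1) for _ in range(10_000)]

def conditional_up_probabilities(moves, order=2):
    """Estimate P(next move is up | last `order` moves) from a 0/1 series."""
    counts, ups = Counter(), Counter()
    for i in range(order, len(moves)):
        history = tuple(moves[i - order:i])
        counts[history] += 1
        ups[history] += moves[i]
    return {h: ups[h] / counts[h] for h in counts}

for history, p_up in sorted(conditional_up_probabilities(moves).items()):
    print(history, round(p_up, 3))
```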

  16. Constraints on scattering amplitudes in multistate Landau-Zener theory

    NASA Astrophysics Data System (ADS)

    Sinitsyn, Nikolai A.; Lin, Jeffmin; Chernyak, Vladimir Y.

    2017-01-01

    We derive a set of constraints, which we will call hierarchy constraints, on scattering amplitudes of an arbitrary multistate Landau-Zener model (MLZM). The presence of additional symmetries can transform such constraints into nontrivial relations between elements of the transition probability matrix. This observation can be used to derive complete solutions of some MLZMs or, for models that cannot be solved completely, to reduce the number of independent elements of the transition probability matrix.

  17. Phase transitions in Nowak Sznajd opinion dynamics

    NASA Astrophysics Data System (ADS)

    Wołoszyn, Maciej; Stauffer, Dietrich; Kułakowski, Krzysztof

    2007-05-01

    The Nowak modification of the Sznajd opinion dynamics model on the square lattice assumes that with probability β the opinions flip due to mass-media advertising from down to up, and vice versa. Besides, with probability α the Sznajd rule applies that a neighbour pair agreeing in its two opinions convinces all its six neighbours of that opinion. Our Monte Carlo simulations and mean-field theory find sharp phase transitions in the parameter space.

  18. Noncommutative Valuation of Options

    NASA Astrophysics Data System (ADS)

    Herscovich, Estanislao

    2016-12-01

    The aim of this note is to show that the classical results in finance theory for the pricing of derivatives, obtained by making use of the replication principle, can be extended to the noncommutative world. We believe that this could be of interest in quantum probability. The main result, called the first fundamental theorem of asset pricing, states that a noncommutative stock market admits no arbitrage if and only if it admits a noncommutative equivalent martingale probability.

  19. Single photon radioluminescence. I. Theory and spectroscopic properties.

    PubMed Central

    Bicknese, S; Shahrokh, Z; Shohet, S B; Verkman, A S

    1992-01-01

    The excitation of a fluorescent molecule by a beta-decay electron (radioluminescence) depends upon the electron energy, the distance between radioactive 'donor' and fluorescent 'acceptor', and the excitation characteristics and solvent environment of the fluorophore. The theory for calculation of single photon radioluminescence (SPR) signals is developed here; in the accompanying paper, measurement methods and biological applications are presented. To calculate the three-dimensional spatial profile for electron energy deposition in an aqueous environment, a Monte Carlo calculation was performed incorporating theories of electron energy distributions, energy loss due to interactions with matter, and deflections in electron motion due to collisions. For low energy beta emitters, 50% of energy deposition occurs within 0.63 micron (3H, 18.5 keV), 22 microns (14C, 156 keV), 25 microns (35S, 167 keV), and 260 microns (36Cl, 712 keV) of the radioisotope. In close proximity to the beta emitter (100 nm, 3H; 10 microns, 14C) the probability for fluorophore excitation is approximately proportional to the inverse square of the distance between the beta emitter and fluorophore. To investigate the other factors that determine the probability for fluorophore excitation, SPR measurements were carried out in solutions containing 3H and a series of fluorophores in different solvents. In water, the probability of fluorescence excitation was nearly proportional to the integrated absorbance over a > 1,000-fold variation in absorbances. The probability of fluorescence excitation was enhanced up to 2,600-fold when the fluorophore was in a "scintillant" aromatic or hydrocarbon solvent. SPR emission spectra were similar to fluorescence emission spectra obtained with photon excitation. The single photon signal due to Bremsstrahlung increased with wavelength in agreement with theory. The distance dependence for the SPR signal predicted by the model was in good agreement with measurements in which a 14C donor was separated by known thicknesses of water from a fluorescently-coated coverglass. Quantitative predictions for radioluminescence signal as a function of donor-acceptor distance were developed for specific radioisotope-fluorophore geometries in biological samples. PMID:1477277

  20. Constructor theory of probability

    PubMed Central

    2016-01-01

    Unitary quantum theory, having no Born Rule, is non-probabilistic. Hence the notorious problem of reconciling it with the unpredictability and appearance of stochasticity in quantum measurements. Generalizing and improving upon the so-called ‘decision-theoretic approach’, I shall recast that problem in the recently proposed constructor theory of information—where quantum theory is represented as one of a class of superinformation theories, which are local, non-probabilistic theories conforming to certain constructor-theoretic conditions. I prove that the unpredictability of measurement outcomes (to which constructor theory gives an exact meaning) necessarily arises in superinformation theories. Then I explain how the appearance of stochasticity in (finitely many) repeated measurements can arise under superinformation theories. And I establish sufficient conditions for a superinformation theory to inform decisions (made under it) as if it were probabilistic, via a Deutsch–Wallace-type argument—thus defining a class of decision-supporting superinformation theories. This broadens the domain of applicability of that argument to cover constructor-theory compliant theories. In addition, in this version some of the argument's assumptions, previously construed as merely decision-theoretic, follow from physical properties expressed by constructor-theoretic principles. PMID:27616914

  1. Time preference and its relationship with age, health, and survival probability

    PubMed Central

    Chao, Li-Wei; Szrek, Helena; Pereira, Nuno Sousa; Pauly, Mark V.

    2009-01-01

    Although theories from economics and evolutionary biology predict that one's age, health, and survival probability should be associated with one's subjective discount rate (SDR), few studies have empirically tested for these links. Our study analyzes in detail how the SDR is related to age, health, and survival probability, by surveying a sample of individuals in townships around Durban, South Africa. In contrast to previous studies, we find that age is not significantly related to the SDR, but both physical health and survival expectations have a U-shaped relationship with the SDR. Individuals in very poor health have high discount rates, and those in very good health also have high discount rates. Similarly, those with expected survival probability on the extremes have high discount rates. Therefore, health and survival probability, and not age, seem to be predictors of one's SDR in an area of the world with high morbidity and mortality. PMID:20376300

  2. Six-dimensional quantum dynamics study for the dissociative adsorption of HCl on Au(111) surface

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Tianhui; Fu, Bina; Zhang, Dong H., E-mail: zhangdh@dicp.ac.cn

    The six-dimensional quantum dynamics calculations for the dissociative chemisorption of HCl on Au(111) are carried out using the time-dependent wave-packet approach, based on an accurate PES which was recently developed by neural network fitting to density functional theory energy points. The influence of vibrational excitation and rotational orientation of HCl on the reactivity is investigated by calculating the exact six-dimensional dissociation probabilities, as well as the four-dimensional fixed-site dissociation probabilities. The vibrational excitation of HCl enhances the reactivity and the helicopter orientation yields higher dissociation probability than the cartwheel orientation. A new and interesting site-averaging effect is found for the title molecule-surface system: one can essentially reproduce the six-dimensional dissociation probability by averaging the four-dimensional dissociation probabilities over 25 fixed sites.

  3. Fixation Probability in a Haploid-Diploid Population

    PubMed Central

    Bessho, Kazuhiro; Otto, Sarah P.

    2017-01-01

    Classical population genetic theory generally assumes either a fully haploid or fully diploid life cycle. However, many organisms exhibit more complex life cycles, with both free-living haploid and diploid stages. Here we ask what the probability of fixation is for selected alleles in organisms with haploid-diploid life cycles. We develop a genetic model that considers the population dynamics using both the Moran model and Wright–Fisher model. Applying a branching process approximation, we obtain an accurate fixation probability assuming that the population is large and the net effect of the mutation is beneficial. We also find the diffusion approximation for the fixation probability, which is accurate even in small populations and for deleterious alleles, as long as selection is weak. These fixation probabilities from branching process and diffusion approximations are similar when selection is weak for beneficial mutations that are not fully recessive. In many cases, particularly when one phase predominates, the fixation probability differs substantially for haploid-diploid organisms compared to either fully haploid or diploid species. PMID:27866168

  4. Transition probabilities for general birth-death processes with applications in ecology, genetics, and evolution

    PubMed Central

    Crawford, Forrest W.; Suchard, Marc A.

    2011-01-01

    A birth-death process is a continuous-time Markov chain that counts the number of particles in a system over time. In the general process with n current particles, a new particle is born with instantaneous rate λn and a particle dies with instantaneous rate μn. Currently no robust and efficient method exists to evaluate the finite-time transition probabilities in a general birth-death process with arbitrary birth and death rates. In this paper, we first revisit the theory of continued fractions to obtain expressions for the Laplace transforms of these transition probabilities and make explicit an important derivation connecting transition probabilities and continued fractions. We then develop an efficient algorithm for computing these probabilities that analyzes the error associated with approximations in the method. We demonstrate that this error-controlled method agrees with known solutions and outperforms previous approaches to computing these probabilities. Finally, we apply our novel method to several important problems in ecology, evolution, and genetics. PMID:21984359
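    The paper's continued-fraction algorithm is not reproduced in this record; as a baseline illustration of the quantity being computed, the sketch below obtains finite-time transition probabilities by exponentiating a truncated generator matrix, with hypothetical linear birth and death rates. This brute-force approach is only feasible for modest truncation levels and is named here as an alternative check, not as the authors' method.

```python
import numpy as np
from scipy.linalg import expm

def transition_probabilities(birth, death, n_max, t):
    """Finite-time transition matrix P(t) = exp(Q t) for a birth-death chain
    truncated at n_max particles.  `birth(n)` and `death(n)` return the
    instantaneous rates lambda_n and mu_n."""
    size = n_max + 1
    Q = np.zeros((size, size))
    for n in range(size):
        lam = birth(n) if n < n_max else 0.0   # reflecting truncation at n_max
        mu = death(n) if n > 0 else 0.0
        if n < n_max:
            Q[n, n + 1] = lam
        if n > 0:
            Q[n, n - 1] = mu
        Q[n, n] = -(lam + mu)
    return expm(Q * t)

# Hypothetical linear rates: lambda_n = 0.5 n, mu_n = 0.3 n.
P = transition_probabilities(lambda n: 0.5 * n, lambda n: 0.3 * n, n_max=50, t=2.0)
print("P(5 particles -> 8 particles in t=2):", P[5, 8])
```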

  5. Application of fuzzy fault tree analysis based on modified fuzzy AHP and fuzzy TOPSIS for fire and explosion in the process industry.

    PubMed

    Yazdi, Mohammad; Korhan, Orhan; Daneshvar, Sahand

    2018-05-09

    This study aimed at establishing fault tree analysis (FTA) using expert opinion to compute the probability of an event. To find the probability of the top event (TE), the probabilities of all basic events (BEs) should be available when the FTA is drawn. In such cases, expert judgment can be used as an alternative when failure data are difficult to obtain. The fuzzy analytical hierarchy process, as a standard technique, is used to give a specific weight to each expert, and fuzzy set theory is used to aggregate the expert opinions. On this basis, the probabilities of the BEs are computed and, consequently, the probability of the TE is obtained using Boolean algebra. Additionally, to reduce the probability of the TE in terms of three parameters (safety consequences, cost and benefit), the importance measurement technique and a modified TOPSIS were employed. The effectiveness of the proposed approach is demonstrated with a real-life case study.
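    A minimal illustration of how a top-event probability follows from basic-event probabilities via Boolean algebra, assuming independent basic events and crisp (already defuzzified) probabilities; the gate structure and the numbers are hypothetical, not taken from the case study.

```python
def or_gate(probs):
    """P(at least one input event occurs), assuming independent events."""
    p = 1.0
    for q in probs:
        p *= (1.0 - q)
    return 1.0 - p

def and_gate(probs):
    """P(all input events occur), assuming independent events."""
    p = 1.0
    for q in probs:
        p *= q
    return p

# Hypothetical basic-event probabilities (in the paper these would come from
# aggregated, defuzzified expert judgments): TE = (BE1 AND BE2) OR BE3.
be1, be2, be3 = 0.02, 0.05, 0.001
p_te = or_gate([and_gate([be1, be2]), be3])
print(f"P(top event) = {p_te:.5f}")
```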

  6. An artificial system for selecting the optimal surgical team.

    PubMed

    Saberi, Nahid; Mahvash, Mohsen; Zenati, Marco

    2015-01-01

    We introduce an intelligent system to optimize a team composition based on the team's historical outcomes and apply this system to compose a surgical team. The system relies on a record of the procedures performed in the past. The optimal team composition is the one with the lowest probability of an unfavorable outcome. We use probability theory and the inclusion-exclusion principle to model the probability of the team outcome for a given composition. A probability value is assigned to each person in the database, and the probability of a team composition is calculated from these values. The model allows the probability of all possible team compositions to be determined even if there is no recorded procedure for some of them. From an analytical perspective, assembling an optimal team is equivalent to minimizing the overlap of team members who have a recurring tendency to be involved in procedures with unfavorable results. A conceptual example shows the accuracy of the proposed system in obtaining the optimal team.
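    A minimal sketch of the selection idea, assuming independent per-person probabilities of contributing to an unfavorable outcome; the staff names, the numbers, and the team size are hypothetical, and the paper's actual model is richer than this illustration.

```python
from itertools import combinations

# Hypothetical per-person probabilities of contributing to an unfavorable
# outcome (in the paper these would be derived from historical records).
staff = {"A": 0.10, "B": 0.04, "C": 0.07, "D": 0.03, "E": 0.08}

def team_unfavorable_probability(members):
    """P(at least one member contributes an unfavorable outcome), computed by
    inclusion-exclusion under an independence assumption."""
    total = 0.0
    for k in range(1, len(members) + 1):
        for subset in combinations(members, k):
            joint = 1.0
            for person in subset:
                joint *= staff[person]
            total += (-1) ** (k + 1) * joint
    return total

team_size = 3
best = min(combinations(staff, team_size), key=team_unfavorable_probability)
print(best, round(team_unfavorable_probability(best), 4))
```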

  7. Epistemic-based investigation of the probability of hazard scenarios using Bayesian network for the lifting operation of floating objects

    NASA Astrophysics Data System (ADS)

    Toroody, Ahmad Bahoo; Abaiee, Mohammad Mahdi; Gholamnia, Reza; Ketabdari, Mohammad Javad

    2016-09-01

    Owing to the increase in unprecedented accidents with new root causes in almost all operational areas, the importance of risk management has dramatically risen. Risk assessment, one of the most significant aspects of risk management, has a substantial impact on the system-safety level of organizations, industries, and operations. If the causes of all kinds of failure and the interactions between them are considered, effective risk assessment can be highly accurate. A combination of traditional risk assessment approaches and modern scientific probability methods can help in realizing better quantitative risk assessment methods. Most researchers face the problem of minimal field data with respect to the probability and frequency of each failure. Because of this limitation in the availability of epistemic knowledge, it is important to conduct epistemic estimations by applying the Bayesian theory for identifying plausible outcomes. In this paper, we propose an algorithm and demonstrate its application in a case study for a light-weight lifting operation in the Persian Gulf of Iran. First, we identify potential accident scenarios and present them in an event tree format. Next, excluding human error, we use the event tree to roughly estimate the prior probability of other hazard-promoting factors using a minimal amount of field data. We then use the Success Likelihood Index Method (SLIM) to calculate the probability of human error. On the basis of the proposed event tree, we use the Bayesian network of the provided scenarios to compensate for the lack of data. Finally, we determine the resulting probability of each event based on its evidence in the epistemic estimation format by building on two Bayesian network types: the probability of hazard promotion factors and the Bayesian theory. The study results indicate that despite the lack of available information on the operation of floating objects, a satisfactory result can be achieved using epistemic data.

  8. Feedback produces divergence from prospect theory in descriptive choice.

    PubMed

    Jessup, Ryan K; Bishara, Anthony J; Busemeyer, Jerome R

    2008-10-01

    A recent study demonstrated that individuals making experience-based choices underweight small probabilities, in contrast to the overweighting observed in a typical descriptive paradigm. We tested whether trial-by-trial feedback in a repeated descriptive paradigm would engender choices more correspondent with experiential or descriptive paradigms. The results of a repeated gambling task indicated that individuals receiving feedback underweighted small probabilities, relative to their no-feedback counterparts. These results implicate feedback as a critical component during the decision-making process, even in the presence of fully specified descriptive information. A model comparison at the individual-subject level suggested that feedback drove individuals' decision weights toward objective probability weighting.

  9. Structure induction in diagnostic causal reasoning.

    PubMed

    Meder, Björn; Mayrhofer, Ralf; Waldmann, Michael R

    2014-07-01

    Our research examines the normative and descriptive adequacy of alternative computational models of diagnostic reasoning from single effects to single causes. Many theories of diagnostic reasoning are based on the normative assumption that inferences from an effect to its cause should reflect solely the empirically observed conditional probability of cause given effect. We argue against this assumption, as it neglects alternative causal structures that may have generated the sample data. Our structure induction model of diagnostic reasoning takes into account the uncertainty regarding the underlying causal structure. A key prediction of the model is that diagnostic judgments should not only reflect the empirical probability of cause given effect but should also depend on the reasoner's beliefs about the existence and strength of the link between cause and effect. We confirmed this prediction in 2 studies and showed that our theory better accounts for human judgments than alternative theories of diagnostic reasoning. Overall, our findings support the view that in diagnostic reasoning people go "beyond the information given" and use the available data to make inferences on the (unobserved) causal rather than on the (observed) data level. (c) 2014 APA, all rights reserved.

  10. A kinetic theory for age-structured stochastic birth-death processes

    NASA Astrophysics Data System (ADS)

    Chou, Tom; Greenman, Chris

    Classical age-structured mass-action models such as the McKendrick-von Foerster equation have been extensively studied but they are structurally unable to describe stochastic fluctuations or population-size-dependent birth and death rates. Conversely, current theories that include size-dependent population dynamics (e.g., carrying capacity) cannot be easily extended to take into account age-dependent birth and death rates. In this paper, we present a systematic derivation of a new fully stochastic kinetic theory for interacting age-structured populations. By defining multiparticle probability density functions, we derive a hierarchy of kinetic equations for the stochastic evolution of an aging population undergoing birth and death. We show that the fully stochastic age-dependent birth-death process precludes factorization of the corresponding probability densities, which then must be solved by using a BBGKY-like hierarchy. Our results generalize both deterministic models and existing master equation approaches by providing an intuitive and efficient way to simultaneously model age- and population-dependent stochastic dynamics applicable to the study of demography, stem cell dynamics, and disease evolution. NSF.

  11. Evolving the capacity to understand actions, intentions, and goals.

    PubMed

    Hauser, Marc; Wood, Justin

    2010-01-01

    We synthesize the contrasting predictions of motor simulation and teleological theories of action comprehension and present evidence from a series of studies showing that monkeys and apes, like humans, extract the meaning of an event by (a) going beyond the surface appearance of actions, attributing goals and intentions to the agent; (b) using details about the environment to infer when an action is rational or irrational; (c) making predictions about an agent's goal and the most probable action to obtain the goal, within the constraints of the situation; (d) predicting the most probable outcome of actions even when they are physiologically incapable of producing the actions; and (e) combining information about means and outcomes to make decisions about social interactions, some with moral relevance. These studies reveal the limitations of motor simulation theories, especially those that rely on the notion of direct matching and mirror neuron activation. They provide support, however, for a teleological theory, rooted in an inferential process that extracts information about action means, potential goals, and the environmental constraints that limit rational action.

  12. The limits of weak selection and large population size in evolutionary game theory.

    PubMed

    Sample, Christine; Allen, Benjamin

    2017-11-01

    Evolutionary game theory is a mathematical approach to studying how social behaviors evolve. In many recent works, evolutionary competition between strategies is modeled as a stochastic process in a finite population. In this context, two limits are both mathematically convenient and biologically relevant: weak selection and large population size. These limits can be combined in different ways, leading to potentially different results. We consider two orderings: the wN limit, in which weak selection is applied before the large population limit, and the Nw limit, in which the order is reversed. Formal mathematical definitions of the wN and Nw limits are provided. Applying these definitions to the Moran process of evolutionary game theory, we obtain asymptotic expressions for fixation probability and conditions for success in these limits. We find that the asymptotic expressions for fixation probability, and the conditions for a strategy to be favored over a neutral mutation, are different in the wN and Nw limits. However, the ordering of limits does not affect the conditions for one strategy to be favored over another.
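    For orientation only, the sketch below evaluates the standard fixation probability of the Moran process with constant relative fitness r = 1 + w and shows how it behaves as N grows at fixed small w. This is the textbook constant-fitness special case, not the paper's game-theoretic analysis, and the selection intensity used is hypothetical.

```python
def moran_fixation_probability(r: float, N: int, i: int = 1) -> float:
    """Fixation probability of i mutants with constant relative fitness r in a
    Moran process of population size N (standard textbook result)."""
    if r == 1.0:
        return i / N          # neutral case
    num = 1.0 - (1.0 / r) ** i
    den = 1.0 - (1.0 / r) ** N
    return num / den

w = 0.01                      # hypothetical selection intensity
r = 1.0 + w                   # mutant fitness under weak selection
for N in (10, 100, 1000, 10000):
    print(N, moran_fixation_probability(r, N))
```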

  13. Research in Stochastic Processes

    DTIC Science & Technology

    1988-10-10

    To appear in Proceedings Volume, Oberwolfach Conf. on Extremal Value Theory, Ed. J. Hüsler and R. Reiss, Springer. 4. M.R. Leadbetter. The exceedance...Hsing, J. Hüsler and M.R. Leadbetter, On the exceedance point process for a stationary sequence, Probability Theor. Rel. Fields, 20, 1988, 97-112 Z.J...Oberwolfach Conf. on Extreme Value Theory. J. Hüsler and R. Reiss, eds., Springer, to appear. V. Mandrekar, On a limit theorem and invariance

  14. Proposed Test of Relative Phase as Hidden Variable in Quantum Mechanics

    DTIC Science & Technology

    2012-01-01

    implicitly due to its ubiquity in quantum theory, but searches for dependence of measurement outcome on other parameters have been lacking. For a two-state...implementation for the specific case of an atomic two-state system with laser-induced fluorescence for measurement. Keywords Quantum measurement...Measurement postulate · Born rule 1 Introduction 1.1 Problems with Quantum Measurement Quantum theory prescribes probabilities for outcomes of measurements

  15. Superconducting surface impedance under radiofrequency field

    DOE PAGES

    Xiao, Binping P.; Reece, Charles E.; Kelley, Michael J.

    2013-04-26

    Based on BCS theory with moving Cooper pairs, the electron states distribution at 0K and the probability of electron occupation with finite temperature have been derived and applied to anomalous skin effect theory to obtain the surface impedance of a superconductor under radiofrequency (RF) field. We present the numerical results for Nb and compare these with representative RF field-dependent effective surface resistance measurements from a 1.5 GHz resonant structure.

  16. A pedagogical derivation of the matrix element method in particle physics data analysis

    NASA Astrophysics Data System (ADS)

    Sumowidagdo, Suharyo

    2018-03-01

    The matrix element method provides a direct connection between the underlying theory of particle physics processes and detector-level physical observables. I present a pedagogically-oriented derivation of the matrix element method, drawing from elementary concepts in probability theory, statistics, and the process of experimental measurements. The level of treatment should be suitable for a beginning research student in phenomenology or experimental high-energy physics.

  17. Assessing Goodness of Fit in Item Response Theory with Nonparametric Models: A Comparison of Posterior Probabilities and Kernel-Smoothing Approaches

    ERIC Educational Resources Information Center

    Sueiro, Manuel J.; Abad, Francisco J.

    2011-01-01

    The distance between nonparametric and parametric item characteristic curves has been proposed as an index of goodness of fit in item response theory in the form of a root integrated squared error index. This article proposes to use the posterior distribution of the latent trait as the nonparametric model and compares the performance of an index…

  18. Co-evolving prisoner's dilemma: Performance indicators and analytic approaches

    NASA Astrophysics Data System (ADS)

    Zhang, W.; Choi, C. W.; Li, Y. S.; Xu, C.; Hui, P. M.

    2017-02-01

    Understanding the intrinsic relation between the dynamical processes in a co-evolving network and the necessary ingredients in formulating a reliable theory is an important question and a challenging task. Using two slightly different definitions of performance indicator in the context of a co-evolving prisoner's dilemma game, it is shown that very different cooperative levels result and theories of different complexity are required to understand the key features. When the payoff per opponent is used as the indicator (Case A), non-cooperative strategy has an edge and dominates in a large part of the parameter space formed by the cutting-and-rewiring probability and the strategy imitation probability. When the payoff from all opponents is used (Case B), cooperative strategy has an edge and dominates the parameter space. Two distinct phases, one homogeneous and dynamical and another inhomogeneous and static, emerge and the phase boundary in the parameter space is studied in detail. A simple theory assuming an average competing environment for cooperative agents and another for non-cooperative agents is shown to perform well in Case A. The same theory, however, fails badly for Case B. It is necessary to include more spatial correlation into a theory for Case B. We show that the local configuration approximation, which takes into account the different competing environments for agents with different strategies and degrees, is needed to give reliable results for Case B. The results illustrate that formulating a proper theory requires both a conceptual understanding of the effects of the adaptive processes in the problem and a delicate balance between simplicity and accuracy.

  19. University Students’ Conceptual Knowledge of Randomness and Probability in the Contexts of Evolution and Mathematics

    PubMed Central

    Fiedler, Daniela; Tröbst, Steffen; Harms, Ute

    2017-01-01

    Students of all ages face severe conceptual difficulties regarding key aspects of evolution—the central, unifying, and overarching theme in biology. Aspects strongly related to abstract “threshold” concepts like randomness and probability appear to pose particular difficulties. A further problem is the lack of an appropriate instrument for assessing students’ conceptual knowledge of randomness and probability in the context of evolution. To address this problem, we have developed two instruments, Randomness and Probability Test in the Context of Evolution (RaProEvo) and Randomness and Probability Test in the Context of Mathematics (RaProMath), that include both multiple-choice and free-response items. The instruments were administered to 140 university students in Germany, then the Rasch partial-credit model was applied to assess them. The results indicate that the instruments generate reliable and valid inferences about students’ conceptual knowledge of randomness and probability in the two contexts (which are separable competencies). Furthermore, RaProEvo detected significant differences in knowledge of randomness and probability, as well as evolutionary theory, between biology majors and preservice biology teachers. PMID:28572180

  20. Large-Scale Diffraction Patterns from Circular Objects

    ERIC Educational Resources Information Center

    Rinard, Phillip M.

    1976-01-01

    Quantitatively investigates the diffraction of light by a U.S. penny and by an aperture of the same size. Differences noted between theory and measurements are discussed, with probable causes indicated. (Author/CP)

  1. Laplace Transforms without Integration

    ERIC Educational Resources Information Center

    Robertson, Robert L.

    2017-01-01

    Calculating Laplace transforms from the definition often requires tedious integrations. This paper provides an integration-free technique for calculating Laplace transforms of many familiar functions. It also shows how the technique can be applied to probability theory.
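    One illustrative integration-free route (an assumption about what such a technique can look like, not necessarily the article's own construction) is to obtain a transform from the differential equation the function satisfies, using only the derivative rule of the transform:

```latex
% For f(t) = e^{at}: f'(t) = a f(t) with f(0) = 1.
% Apply the derivative rule  \mathcal{L}\{f'\}(s) = s F(s) - f(0)  to both sides:
\begin{align*}
  s F(s) - 1 &= a F(s)
  \quad\Longrightarrow\quad
  F(s) = \mathcal{L}\{e^{at}\}(s) = \frac{1}{s-a}, \qquad s > a,
\end{align*}
% so the transform follows from algebra alone, with no integral evaluated.
```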

  2. Finding Bounded Rational Equilibria. Part 2; Alternative Lagrangians and Uncountable Move Spaces

    NASA Technical Reports Server (NTRS)

    Wolpert, David H.

    2004-01-01

    A long-running difficulty with conventional game theory has been how to modify it to accommodate the bounded rationality characterizing all real-world players. A recurring issue in statistical physics is how best to approximate joint probability distributions with decoupled (and therefore far more tractable) distributions. It has recently been shown that the same information theoretic mathematical structure, known as Probability Collectives (PC), underlies both issues. This relationship between statistical physics and game theory allows techniques and insights from one field to be applied to the other. In particular, PC provides a formal model-independent definition of the degree of rationality of a player and of bounded rationality equilibria. This pair of papers extends previous work on PC by introducing new computational approaches to effectively find bounded rationality equilibria of common-interest (team) games.

  3. Some Remarks on Knowledge and Probability Arising from Counterfactual Quantum Effects

    NASA Astrophysics Data System (ADS)

    Lupacchini, Rossella

    Can the mere possibility of a physical phenomenon affect the outcome of an experiment? In fact, quantum theory presents us with actual physical effects arising from "counterfactuals", that is, physical effects brought about by things that might have happened, although they did not happen. How can it be? After a short outline of the quantum-mechanical description of physical reality, the occurrence of such counterfactual effects in quantum theory is illustrated by means of a Mach-Zehnder interferometer. These paradoxical phenomena, which undermine the very notion of a physical event and raise the question of what knowledge of physical reality can ever be obtained, will then be analysed using a classical possible-worlds model of knowledge and probability. Finally, a surprising application of counterfactual quantum effects producing a new kind of computing with no classical analogue will be shown.

  4. Wave theory of turbulence in compressible media (acoustic theory of turbulence)

    NASA Technical Reports Server (NTRS)

    Kentzer, C. P.

    1975-01-01

    The generation and the transmission of sound in turbulent flows are treated as one of the several aspects of wave propagation in turbulence. Fluid fluctuations are decomposed into orthogonal Fourier components, with five interacting modes of wave propagation: two vorticity modes, one entropy mode, and two acoustic modes. Wave interactions, governed by the inhomogeneous and nonlinear terms of the perturbed Navier-Stokes equations, are modeled by random functions which give the rates of change of wave amplitudes equal to the averaged interaction terms. The statistical framework adopted is a quantum-like formulation in terms of complex distribution functions. The spatial probability distributions are given by the squares of the absolute values of the complex characteristic functions. This formulation results in nonlinear diffusion-type transport equations for the probability densities of the five modes of wave propagation.

  5. What Eye Movements Can Tell about Theory of Mind in a Strategic Game

    PubMed Central

    Meijering, Ben; van Rijn, Hedderik; Taatgen, Niels A.; Verbrugge, Rineke

    2012-01-01

    This study investigates strategies in reasoning about mental states of others, a process that requires theory of mind. It is a first step in studying the cognitive basis of such reasoning, as strategies affect tradeoffs between cognitive resources. Participants were presented with a two-player game that required reasoning about the mental states of the opponent. Game theory literature discerns two candidate strategies that participants could use in this game: either forward reasoning or backward reasoning. Forward reasoning proceeds from the first decision point to the last, whereas backward reasoning proceeds in the opposite direction. Backward reasoning is the only optimal strategy, because the optimal outcome is known at each decision point. Nevertheless, we argue that participants prefer forward reasoning because it is similar to causal reasoning. Causal reasoning, in turn, is prevalent in human reasoning. Eye movements were measured to discern between forward and backward progressions of fixations. The observed fixation sequences corresponded best with forward reasoning. Early in games, the probability of observing a forward progression of fixations is higher than the probability of observing a backward progression. Later in games, the probabilities of forward and backward progressions are similar, which seems to imply that participants were either applying backward reasoning or jumping back to previous decision points while applying forward reasoning. Thus, the game-theoretical favorite strategy, backward reasoning, does seem to exist in human reasoning. However, participants preferred the more familiar, practiced, and prevalent strategy: forward reasoning. PMID:23029341

  6. Theoretical study of the dynamic magnetic response of ferrofluid to static and alternating magnetic fields

    NASA Astrophysics Data System (ADS)

    Batrudinov, Timur M.; Ambarov, Alexander V.; Elfimova, Ekaterina A.; Zverev, Vladimir S.; Ivanov, Alexey O.

    2017-06-01

    The dynamic magnetic response of ferrofluid in a static uniform external magnetic field to a weak, linearly polarized, alternating magnetic field is investigated theoretically. The ferrofluid is modeled as a system of dipolar hard spheres, suspended in a long cylindrical tube whose long axis is parallel to the direction of the static and alternating magnetic fields. The theory is based on the Fokker-Planck-Brown equation formulated for the case when both the static and alternating magnetic fields are applied. The solution of the Fokker-Planck-Brown equation describing the orientational probability density of a randomly chosen dipolar particle is expressed as a series in terms of the spherical Legendre polynomials. The obtained analytical expression connecting three neighboring coefficients of the series makes it possible to determine the probability density to any order of accuracy in terms of Legendre polynomials. The analytical formula for the probability density truncated at the first Legendre polynomial is evaluated and used for the calculation of the magnetization and dynamic susceptibility spectra. In the absence of the static magnetic field the presented theory gives the correct single-particle Debye-theory result, which is the exact solution of the Fokker-Planck-Brown equation for the case of an applied weak alternating magnetic field. The influence of the static magnetic field on the dynamic susceptibility is analyzed in terms of the low-frequency behavior of the real part and the position of the peak in the imaginary part.
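    For reference, the zero-static-field limit quoted in the abstract is the single-particle Debye spectrum; written in conventional notation (static susceptibility \chi_0, Brownian relaxation time \tau_B, e^{-i\omega t} time convention), which is standard usage rather than notation taken from the paper itself, it reads:

```latex
\chi(\omega) \;=\; \frac{\chi_0}{1 - i\omega\tau_B}
            \;=\; \underbrace{\frac{\chi_0}{1+(\omega\tau_B)^2}}_{\chi'(\omega)}
            \;+\; i\,\underbrace{\frac{\chi_0\,\omega\tau_B}{1+(\omega\tau_B)^2}}_{\chi''(\omega)} .
```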

  7. Decision theory applied to image quality control in radiology.

    PubMed

    Lessa, Patrícia S; Caous, Cristofer A; Arantes, Paula R; Amaro, Edson; de Souza, Fernando M Campello

    2008-11-13

    The present work aims at the application of decision theory to radiological image quality control (QC) in the diagnostic routine. The main problem addressed in the framework of decision theory is whether to accept or reject a film lot of a radiology service. The probability of each decision, given a determined set of variables, was obtained from the selected films. Based on a radiology service routine, a decision probability function was determined for each considered group of combined characteristics. These characteristics were related to film quality control. These parameters were also framed in a set of 8 possibilities, resulting in 256 possible decision rules. In order to determine a general utility function to assess the decision risk, we used a single parameter, called r. The payoffs chosen were: diagnostic result (correct/incorrect), cost (high/low), and patient satisfaction (yes/no), resulting in eight possible combinations. Depending on the value of r, more or less risk will be associated with the decision-making. The utility function was evaluated in order to determine the probability of a decision. The decision was made using the opinions of patients or administrators from a radiology service center. The model is a formal quantitative approach to making decisions related to medical imaging quality, providing an instrument to discriminate what is really necessary in order to accept or reject a film or a film lot. The method presented herein can help to assess the risk level of an incorrect radiological diagnosis decision.
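    A quick check of the combinatorics described: three binary payoff attributes give 2^3 = 8 states, and a decision rule assigns accept/reject to each state, giving 2^8 = 256 rules. The attribute names mirror the abstract; the enumeration itself is illustrative.

```python
from itertools import product

# The three binary payoff attributes mentioned in the abstract.
attributes = [("diagnosis", ("correct", "incorrect")),
              ("cost", ("low", "high")),
              ("satisfaction", ("yes", "no"))]

# The 8 possible attribute combinations ...
states = list(product(*(values for _, values in attributes)))
print(len(states))            # -> 8

# ... and the 2**8 = 256 decision rules, each mapping every state to
# accept/reject the film lot.
rules = list(product(("accept", "reject"), repeat=len(states)))
print(len(rules))             # -> 256
```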

  8. What eye movements can tell about theory of mind in a strategic game.

    PubMed

    Meijering, Ben; van Rijn, Hedderik; Taatgen, Niels A; Verbrugge, Rineke

    2012-01-01

    This study investigates strategies in reasoning about mental states of others, a process that requires theory of mind. It is a first step in studying the cognitive basis of such reasoning, as strategies affect tradeoffs between cognitive resources. Participants were presented with a two-player game that required reasoning about the mental states of the opponent. Game theory literature discerns two candidate strategies that participants could use in this game: either forward reasoning or backward reasoning. Forward reasoning proceeds from the first decision point to the last, whereas backward reasoning proceeds in the opposite direction. Backward reasoning is the only optimal strategy, because the optimal outcome is known at each decision point. Nevertheless, we argue that participants prefer forward reasoning because it is similar to causal reasoning. Causal reasoning, in turn, is prevalent in human reasoning. Eye movements were measured to discern between forward and backward progressions of fixations. The observed fixation sequences corresponded best with forward reasoning. Early in games, the probability of observing a forward progression of fixations is higher than the probability of observing a backward progression. Later in games, the probabilities of forward and backward progressions are similar, which seems to imply that participants were either applying backward reasoning or jumping back to previous decision points while applying forward reasoning. Thus, the game-theoretical favorite strategy, backward reasoning, does seem to exist in human reasoning. However, participants preferred the more familiar, practiced, and prevalent strategy: forward reasoning.

  9. A hierarchical instrumental decision theory of nicotine dependence.

    PubMed

    Hogarth, Lee; Troisi, Joseph R

    2015-01-01

    It is important to characterize the learning processes governing tobacco-seeking in order to understand how best to treat this behavior. Most drug learning theories have adopted a Pavlovian framework wherein the conditioned response is the main motivational process. We favor instead a hierarchical instrumental decision account, wherein expectations about the instrumental contingency between voluntary tobacco-seeking and the receipt of nicotine reward determine the probability of executing this behavior. To support this view, we review titration and nicotine discrimination research showing that internal signals for deprivation/satiation modulate expectations about the current incentive value of smoking, thereby modulating the propensity of this behavior. We also review research on cue-reactivity which has shown that external smoking cues modulate expectations about the probability of the tobacco-seeking response being effective, thereby modulating the propensity of this behavior. Economic decision theory is then considered to elucidate how expectations about the value and probability of the response-nicotine contingency are integrated to form an overall utility estimate for that option, for comparison with qualitatively different, nonsubstitute reinforcers, to determine response selection. As an applied test for this hierarchical instrumental decision framework, we consider how well it accounts for individual liability to smoking uptake and perseveration, pharmacotherapy, cue-extinction therapies, and plain packaging. We conclude that the hierarchical instrumental account is successful in reconciling this broad range of phenomena precisely because it accepts that multiple diverse sources of internal and external information must be integrated to shape the decision to smoke.

  10. Ultraviolet divergences in non-renormalizable supersymmetric theories

    NASA Astrophysics Data System (ADS)

    Smilga, A.

    2017-03-01

    We present a pedagogical review of our current understanding of the ultraviolet structure of N = (1,1) 6D supersymmetric Yang-Mills theory and of N = 8 4D supergravity. These theories are not renormalizable, they involve power ultraviolet divergences and, in all probability, an infinite set of higher-dimensional counterterms that contribute to on-mass-shell scattering amplitudes. A specific feature of supersymmetric theories (especially, of extended supersymmetric theories) is that these counterterms may not be invariant off shell under the full set of supersymmetry transformations. The lowest-dimensional nontrivial counterterm is supersymmetric on shell. Still higher counterterms may lose even the on-shell invariance. On the other hand, the full effective Lagrangian, generating the amplitudes and representing an infinite sum of counterterms, still enjoys the complete symmetry of the original theory. We also discuss simple supersymmetric quantum-mechanical models that exhibit the same behaviour.

  11. Prospect theory reflects selective allocation of attention.

    PubMed

    Pachur, Thorsten; Schulte-Mecklenbeck, Michael; Murphy, Ryan O; Hertwig, Ralph

    2018-02-01

    There is a disconnect in the literature between analyses of risky choice based on cumulative prospect theory (CPT) and work on predecisional information processing. One likely reason is that for expectation models (e.g., CPT), it is often assumed that people behaved only as if they conducted the computations leading to the predicted choice and that the models are thus mute regarding information processing. We suggest that key psychological constructs in CPT, such as loss aversion and outcome and probability sensitivity, can be interpreted in terms of attention allocation. In two experiments, we tested hypotheses about specific links between CPT parameters and attentional regularities. Experiment 1 used process tracing to monitor participants' predecisional attention allocation to outcome and probability information. As hypothesized, individual differences in CPT's loss-aversion, outcome-sensitivity, and probability-sensitivity parameters (estimated from participants' choices) were systematically associated with individual differences in attention allocation to outcome and probability information. For instance, loss aversion was associated with the relative attention allocated to loss and gain outcomes, and a more strongly curved weighting function was associated with less attention allocated to probabilities. Experiment 2 manipulated participants' attention to losses or gains, causing systematic differences in CPT's loss-aversion parameter. This result indicates that attention allocation can to some extent cause choice regularities that are captured by CPT. Our findings demonstrate an as-if model's capacity to reflect characteristics of information processing. We suggest that the observed CPT-attention links can be harnessed to inform the development of process models of risky choice. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
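    For concreteness, here is a minimal sketch of a CPT evaluation of a two-outcome mixed gamble, using the standard Tversky-Kahneman parametric forms with their published median parameter values; these are illustrative defaults, not the parameters estimated from participants in this study, and the example gamble is hypothetical.

```python
def value(x, alpha=0.88, beta=0.88, lam=2.25):
    """Tversky-Kahneman value function: concave for gains, convex and steeper
    (loss aversion lam) for losses."""
    return x ** alpha if x >= 0 else -lam * (-x) ** beta

def weight(p, gamma=0.61):
    """Tversky-Kahneman probability weighting function."""
    return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

def cpt_mixed_gamble(gain, p_gain, loss, gamma_gain=0.61, gamma_loss=0.69):
    """CPT valuation of a two-outcome mixed gamble: `gain` with probability
    p_gain, `loss` (a negative number) otherwise."""
    return (weight(p_gain, gamma_gain) * value(gain)
            + weight(1 - p_gain, gamma_loss) * value(loss))

# Hypothetical gamble: win 100 with probability 0.5, otherwise lose 50.
print(cpt_mixed_gamble(gain=100, p_gain=0.5, loss=-50))
```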

  12. Continuous-time random-walk model for financial distributions

    NASA Astrophysics Data System (ADS)

    Masoliver, Jaume; Montero, Miquel; Weiss, George H.

    2003-02-01

    We apply the formalism of the continuous-time random walk to the study of financial data. The entire distribution of prices can be obtained once two auxiliary densities are known. These are the probability density for the pausing time between successive jumps and the corresponding probability density for the magnitude of a jump. We have applied the formalism to data on the U.S. dollar-deutsche mark futures exchange, finding good agreement between theory and the observed data.
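    A minimal simulation sketch of the formalism, with the two auxiliary densities chosen arbitrarily for illustration (exponential pausing times and Gaussian jump magnitudes) rather than fitted to the exchange data analyzed in the paper.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_ctrw(n_jumps, mean_wait=1.0, jump_scale=0.5):
    """Simulate one continuous-time random walk path.  The pausing-time and
    jump-magnitude densities here are illustrative choices, not the densities
    estimated from financial data."""
    waits = rng.exponential(mean_wait, size=n_jumps)
    jumps = rng.normal(0.0, jump_scale, size=n_jumps)
    times = np.cumsum(waits)
    prices = np.cumsum(jumps)
    return times, prices

times, log_prices = simulate_ctrw(1000)
print("final time:", times[-1], "final log-price change:", log_prices[-1])
```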

  13. Constraints on scattering amplitudes in multistate Landau-Zener theory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sinitsyn, Nikolai A.; Lin, Jeffmin; Chernyak, Vladimir Y.

    2017-01-30

    Here, we derive a set of constraints, which we will call hierarchy constraints, on scattering amplitudes of an arbitrary multistate Landau-Zener model (MLZM). The presence of additional symmetries can transform such constraints into nontrivial relations between elements of the transition probability matrix. This observation can be used to derive complete solutions of some MLZMs or, for models that cannot be solved completely, to reduce the number of independent elements of the transition probability matrix.

  14. Unsolved Problems in Evolutionary Theory

    DTIC Science & Technology

    1967-01-01

    finding the probability of survival of a single new mutant). Most natural populations probably satisfy these conditions, as is illustrated by the...(y_{kl}) of small quantities adding to zero. Then, under suitable conditions on the function f(x), (3) x_i + y_{i,t+1} = f_i(x) + \sum_j y_j \, \partial f_i/\partial x_j + O(y^2)...It is clear that a sufficient condition for the point x to be locally stable is that all the roots of the matrix (4) (a_{ij}) = \partial f_i/\partial x_j should have moduli

  15. Ecologically rational choice and the structure of the environment.

    PubMed

    Pleskac, Timothy J; Hertwig, Ralph

    2014-10-01

    In life, risk is reward and vice versa. Unfortunately, the big rewards people desire are relatively unlikely to occur. This relationship between risk and reward or probabilities and payoffs seems obvious to the financial community and to laypeople alike. Yet theories of decision making have largely ignored it. We conducted an ecological analysis of life's gambles, ranging from the domains of roulette and life insurance to scientific publications and artificial insemination. Across all domains, payoffs and probabilities proved intimately tied, with payoff magnitudes signaling their probabilities. In some cases, the constraints of the market result in these two core elements of choice being related via a power function; in other cases, other factors such as social norms appear to produce the inverse relationship between risks and rewards. We offer evidence that decision makers exploit this relationship in the form of a heuristic--the risk-reward heuristic--to infer the probability of a payoff during decisions under uncertainty. We demonstrate how the heuristic can help explain observed ambiguity aversion. We further show how this ecological relationship can inform other aspects of decision making, particularly the approach of using monetary lotteries to study choice under risk and uncertainty. Taken together, these findings suggest that theories of decision making need to model not only the decision process but also the environment to which the process is adapted.

  16. Role of environmental variability in the evolution of life history strategies.

    PubMed

    Hastings, A; Caswell, H

    1979-09-01

    We reexamine the role of environmental variability in the evolution of life history strategies. We show that normally distributed deviations in the quality of the environment should lead to normally distributed deviations in the logarithm of year-to-year survival probabilities, which leads to interesting consequences for the evolution of annual and perennial strategies and reproductive effort. We also examine the effects of using differing criteria to determine the outcome of selection. Some predictions of previous theory are reversed, allowing distinctions between r and K theory and a theory based on variability. However, these distinctions require information about both the environment and the selection process not required by current theory.

  17. Probability and Locality: Determinism Versus Indeterminism in Quantum Mechanics

    NASA Astrophysics Data System (ADS)

    Dickson, William Michael

    1995-01-01

    Quantum mechanics is often taken to be necessarily probabilistic. However, this view of quantum mechanics appears to be more the result of historical accident than of careful analysis. Moreover, quantum mechanics in its usual form faces serious problems. Although the mathematical core of quantum mechanics--quantum probability theory--does not face conceptual difficulties, the application of quantum probability to the physical world leads to problems. In particular, quantum mechanics seems incapable of describing our everyday macroscopic experience. Therefore, several authors have proposed new interpretations--including (but not limited to) modal interpretations, spontaneous localization interpretations, the consistent histories approach, and the Bohm theory--each of which deals with quantum-mechanical probabilities differently. Each of these interpretations promises to describe our macroscopic experience and, arguably, each succeeds. Is there any way to compare them? Perhaps, if we turn to another troubling aspect of quantum mechanics, non-locality. Non-locality is troubling because prima facie it threatens the compatibility of quantum mechanics with special relativity. This prima facie threat is mitigated by the no-signalling theorems in quantum mechanics, but nonetheless one may find a 'conflict of spirit' between nonlocality in quantum mechanics and special relativity. Do any of these interpretations resolve this conflict of spirit? There is a strong relation between how an interpretation deals with quantum-mechanical probabilities and how it deals with non-locality. The main argument here is that only a completely deterministic interpretation can be completely local. That is, locality together with the empirical predictions of quantum mechanics (specifically, its strict correlations) entails determinism. But even with this entailment in hand, comparison of the various interpretations requires a look at each, to see how non-locality arises, or in the case of deterministic interpretations, whether it arises. The result of this investigation is that, at the least, deterministic interpretations are no worse off with respect to special relativity than indeterministic interpretations. This conclusion runs against a common view that deterministic interpretations, specifically the Bohm theory, have more difficulty with special relativity than other interpretations.

  18. An information theory framework for dynamic functional domain connectivity.

    PubMed

    Vergara, Victor M; Miller, Robyn; Calhoun, Vince

    2017-06-01

    Dynamic functional network connectivity (dFNC) analyzes time evolution of coherent activity in the brain. In this technique dynamic changes are considered for the whole brain. This paper proposes an information theory framework to measure information flowing among subsets of functional networks called functional domains. Our method aims at estimating bits of information contained and shared among domains. The succession of dynamic functional states is estimated at the domain level. Information quantity is based on the probabilities of observing each dynamic state. Mutual information measurement is then obtained from probabilities across domains. Thus, we named this value the cross domain mutual information (CDMI). Strong CDMIs were observed in relation to the subcortical domain. Domains related to sensorial input, motor control and cerebellum form another CDMI cluster. Information flow among other domains was seldom found. Other methods of dynamic connectivity focus on whole brain dFNC matrices. In the current framework, information theory is applied to states estimated from pairs of multi-network functional domains. In this context, we apply information theory to measure information flow across functional domains. Identified CDMI clusters point to known information pathways in the basal ganglia and also among areas of sensorial input, patterns found in static functional connectivity. In contrast, CDMI across brain areas of higher level cognitive processing follows a different pattern that indicates scarce information sharing. These findings show that employing information theory to formally measure information flow through brain domains reveals additional features of functional connectivity. Copyright © 2017 Elsevier B.V. All rights reserved.
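    The core quantity is a standard mutual information computed from state probabilities. A minimal sketch, assuming two already-estimated discrete state sequences for a pair of domains (the sequences and function name here are hypothetical, not the article's pipeline):

      from collections import Counter
      from math import log2

      def mutual_information(states_x, states_y):
          """Estimate I(X;Y) in bits from two aligned sequences of discrete state labels."""
          n = len(states_x)
          pxy = Counter(zip(states_x, states_y))
          px = Counter(states_x)
          py = Counter(states_y)
          mi = 0.0
          for (x, y), c in pxy.items():
              p_joint = c / n
              # p_joint / (p_x * p_y) written with counts to avoid extra divisions
              mi += p_joint * log2(p_joint * n * n / (px[x] * py[y]))
          return mi

      # Hypothetical windowed state labels for two functional domains.
      domain_a = [0, 0, 1, 1, 2, 2, 0, 1, 2, 0]
      domain_b = [0, 0, 1, 1, 2, 1, 0, 1, 2, 0]
      print(round(mutual_information(domain_a, domain_b), 3))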

  19. The evolution of genomic imprinting: theories, predictions and empirical tests

    PubMed Central

    Patten, M M; Ross, L; Curley, J P; Queller, D C; Bonduriansky, R; Wolf, J B

    2014-01-01

    The epigenetic phenomenon of genomic imprinting has motivated the development of numerous theories for its evolutionary origins and genomic distribution. In this review, we examine the three theories that have best withstood theoretical and empirical scrutiny. These are: Haig and colleagues' kinship theory; Day and Bonduriansky's sexual antagonism theory; and Wolf and Hager's maternal–offspring coadaptation theory. These theories have fundamentally different perspectives on the adaptive significance of imprinting. The kinship theory views imprinting as a mechanism to change gene dosage, with imprinting evolving because of the differential effect that gene dosage has on the fitness of matrilineal and patrilineal relatives. The sexual antagonism and maternal–offspring coadaptation theories view genomic imprinting as a mechanism to modify the resemblance of an individual to its two parents, with imprinting evolving to increase the probability of expressing the fitter of the two alleles at a locus. In an effort to stimulate further empirical work on the topic, we carefully detail the logic and assumptions of all three theories, clarify the specific predictions of each and suggest tests to discriminate between these alternative theories for why particular genes are imprinted. PMID:24755983

  20. The erroneous signals of detection theory.

    PubMed

    Trimmer, Pete C; Ehlman, Sean M; McNamara, John M; Sih, Andrew

    2017-10-25

    Signal detection theory has influenced the behavioural sciences for over 50 years. The theory provides a simple equation that indicates numerous 'intuitive' results; e.g. prey should be more prone to take evasive action (in response to an ambiguous cue) if predators are more common. Here, we use analytical and computational models to show that, in numerous biological scenarios, the standard results of signal detection theory do not apply; more predators can result in prey being less responsive to such cues. The standard results need not apply when the probability of danger pertains not just to the present, but also to future decisions. We identify how responses to risk should depend on background mortality and autocorrelation, and that predictions in relation to animal welfare can also be reversed from the standard theory. © 2017 The Author(s).
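    For reference, the 'simple equation' of classical signal detection theory can be written as a likelihood-ratio criterion (this is the standard textbook form, not the authors' dynamic model):

      \[
      \text{respond if}\quad
      \frac{f(x \mid \text{danger})}{f(x \mid \text{safe})} > \beta^{*},
      \qquad
      \beta^{*} = \frac{1-p}{p}\cdot
      \frac{V_{\text{correct rejection}} - V_{\text{false alarm}}}
           {V_{\text{hit}} - V_{\text{miss}}} .
      \]

    As the prior probability p of danger rises, the threshold beta* falls, so an ambiguous cue is more likely to trigger evasion; this is the 'intuitive' prediction that the paper shows can reverse once present decisions also affect future survival.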

  1. Ubiquitous Log Odds: A Common Representation of Probability and Frequency Distortion in Perception, Action, and Cognition

    PubMed Central

    Zhang, Hang; Maloney, Laurence T.

    2012-01-01

    In decision from experience, the source of probability information affects how probability is distorted in the decision task. Understanding how and why probability is distorted is a key issue in understanding the peculiar character of experience-based decision. We consider how probability information is used not just in decision-making but also in a wide variety of cognitive, perceptual, and motor tasks. Very similar patterns of distortion of probability/frequency information have been found in visual frequency estimation, frequency estimation based on memory, signal detection theory, and in the use of probability information in decision-making under risk and uncertainty. We show that distortion of probability in all cases is well captured as linear transformations of the log odds of frequency and/or probability, a model with a slope parameter, and an intercept parameter. We then consider how task and experience influence these two parameters and the resulting distortion of probability. We review how the probability distortions change in systematic ways with task and report three experiments on frequency distortion where the distortions change systematically in the same task. We found that the slope of frequency distortions decreases with the sample size, which is echoed by findings in decision from experience. We review previous models of the representation of uncertainty and find that none can account for the empirical findings. PMID:22294978
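    A minimal sketch of one common parameterization of the linear-in-log-odds distortion described above (slope and crossover values are illustrative, not the fitted estimates from the paper):

      import math

      def log_odds(p):
          return math.log(p / (1.0 - p))

      def llo_distortion(p, slope, crossover=0.5):
          """Linear-in-log-odds model: Lo(w(p)) = slope*Lo(p) + (1 - slope)*Lo(crossover)."""
          lo = slope * log_odds(p) + (1.0 - slope) * log_odds(crossover)
          return 1.0 / (1.0 + math.exp(-lo))

      # A slope < 1 compresses probabilities toward the crossover point,
      # overweighting small p and underweighting large p.
      for p in (0.01, 0.1, 0.5, 0.9, 0.99):
          print(p, round(llo_distortion(p, slope=0.6), 3))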

  2. Through the looking glass. A '70s lesbian feminist considers queer theory.

    PubMed

    Cruikshank, Margaret

    2007-01-01

    Lesbian feminists who began their work in the 1970s probably share my mixed feelings about and attitudes towards Queer Theory: curiosity, envy, indignation and occasional agreement. The solar center of mostly male Queer Theory has young lesbian scholars orbiting around it. Gender used to share the stage with sexuality but now seems relegated to the wings. Like Marxists in the 1950s who remembered the heady days of the 1930s, we veteran lesbian feminists cannot help recalling the excitement and sense of possibility in Lesbian Studies twenty-five years ago.

  3. Quantum Information as a Non-Kolmogorovian Generalization of Shannon's Theory

    NASA Astrophysics Data System (ADS)

    Holik, Federico; Bosyk, Gustavo; Bellomo, Guido

    2015-10-01

    In this article we discuss the formal structure of a generalized information theory based on the extension of the probability calculus of Kolmogorov to a (possibly) non-commutative setting. By studying this framework, we argue that quantum information can be considered as a particular case of a huge family of non-commutative extensions of its classical counterpart. In any conceivable information theory, the possibility of dealing with different kinds of information measures plays a key role. Here, we generalize a notion of state spectrum, allowing us to introduce a majorization relation and a new family of generalized entropic measures.

  4. Real-time first-principles simulations of thermionic emission from N-doped diamond surfaces

    NASA Astrophysics Data System (ADS)

    Shinozaki, Tomoki; Hagiwara, Satoshi; Morioka, Naoya; Kimura, Yuji; Watanabe, Kazuyuki

    2018-06-01

    We investigate thermionic emission from N-doped C(100) surfaces terminated with H or Li atoms using finite-temperature real-time density functional theory simulations. The current–temperature characteristics are found to follow the Richardson–Dushman (RD) equation, which was derived from a semiclassical theory. However, the Richardson constants are two orders of magnitude smaller than the ideal values from the RD theory. This considerable reduction is attributed primarily to the extremely low transmission probability of electrons from the surfaces toward the vacuum. The present method enables straightforward evaluation of the ideal efficiency of a thermionic energy converter.
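    The Richardson-Dushman relation mentioned above can be evaluated directly; the sketch below uses placeholder numbers (work function, temperature, and a reduced effective Richardson constant) purely for illustration, not the values computed in the paper.

      import math

      K_B = 8.617333262e-5   # Boltzmann constant, eV/K
      A_IDEAL = 120.17       # ideal Richardson constant, A cm^-2 K^-2

      def richardson_dushman(T, work_function_eV, A_eff=A_IDEAL):
          """Thermionic current density J = A_eff * T**2 * exp(-W / (k_B * T)), in A/cm^2."""
          return A_eff * T**2 * math.exp(-work_function_eV / (K_B * T))

      # Placeholder case: a low-work-function surface at 800 K with an effective
      # Richardson constant two orders of magnitude below the ideal value.
      print(richardson_dushman(800.0, 1.3, A_eff=A_IDEAL / 100))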

  5. Rise of planetary bodies.

    NASA Astrophysics Data System (ADS)

    Czechowski, Z.; Leliwa-Kopystyński, J.; Teisseyre, R.

    Contents: 1. On the probability of the formation of planetary systems. 2. Condensation triggered by supernova explosion and tidal capture theory. 3. Foundations of accretion theory. 4. The structure and evolution of the protoplanetary disk. 5. Coagulation of orbiting bodies. 6. Collision phenomena related to planetology: accretion, fragmentation, cratering. 7. Dynamics of planetesimals: Introduction, Safronov's approach, elements of the kinetic theory of gases, Nakagawa's approach, approaches considering inelastic collisions and gravitational encounters of planetesimals, Hämeen-Anttila approach, planetesimals with different masses. 8. Growth of the planetary embryo: Basic equations, model of growth of planetary embryos. 9. Origin of the Moon and the satellites.

  6. Judgment, Probability, and Aristotle's Rhetoric.

    ERIC Educational Resources Information Center

    Warnick, Barbara

    1989-01-01

    Discusses Aristotle's five means of making judgments: intelligence, "episteme" (scientific knowledge), "sophia" (theoretical wisdom), "techne" (art), and "phronesis" (practical wisdom). Sets Aristotle's theory of rhetorical argument within the context of his overall view of human judgment. Notes that…

  7. Development of a Nonlinear Probability of Collision Tool for the Earth Observing System

    NASA Technical Reports Server (NTRS)

    McKinley, David P.

    2006-01-01

    The Earth Observing System (EOS) spacecraft Terra, Aqua, and Aura fly in constellation with several other spacecraft in 705-kilometer mean altitude sun-synchronous orbits. All three spacecraft are operated by the Earth Science Mission Operations (ESMO) Project at Goddard Space Flight Center (GSFC). In 2004, the ESMO project began assessing the probability of collision of the EOS spacecraft with other space objects. In addition to conjunctions with high relative velocities, the collision assessment method for the EOS spacecraft must address conjunctions with low relative velocities during potential collisions between constellation members. Probability of Collision algorithms that are based on assumptions of high relative velocities and linear relative trajectories are not suitable for these situations; therefore an algorithm for handling the nonlinear relative trajectories was developed. This paper describes this algorithm and presents results from its validation for operational use. The probability of collision is typically calculated by integrating a Gaussian probability distribution over the volume swept out by a sphere representing the size of the space objects involved in the conjunction. This sphere is defined as the Hard Body Radius. With the assumption of linear relative trajectories, this volume is a cylinder, which translates into simple limits of integration for the probability calculation. For the case of nonlinear relative trajectories, the volume becomes a complex geometry. However, with an appropriate choice of coordinate systems, the new algorithm breaks down the complex geometry into a series of simple cylinders that have simple limits of integration. This nonlinear algorithm will be discussed in detail in the paper. The nonlinear Probability of Collision algorithm was first verified by showing that, when used in high relative velocity cases, it yields similar answers to existing high relative velocity linear relative trajectory algorithms. The comparison with the existing high velocity/linear theory will also be used to determine at what relative velocity the analysis should use the new nonlinear theory in place of the existing linear theory. The nonlinear algorithm was also compared to a known exact solution for the probability of collision between two objects when the relative motion is strictly circular and the error covariance is spherically symmetric. Figure I shows preliminary results from this comparison by plotting the probabilities calculated from the new algorithm and those from the exact solution versus the Hard Body Radius to Covariance ratio. These results show about 5% error when the Hard Body Radius is equal to one half the spherical covariance magnitude. The algorithm was then combined with a high fidelity orbit state and error covariance propagator into a useful tool for analyzing low relative velocity nonlinear relative trajectories. The high fidelity propagator is capable of using atmospheric drag, central body gravitational, solar radiation, and third body forces to provide accurate prediction of the relative trajectories and covariance evolution. The covariance propagator also includes a process noise model to ensure realistic evolutions of the error covariance. This paper will describe the integration of the nonlinear probability algorithm and the propagators into a useful collision assessment tool. 
Finally, a hypothetical case study involving a low relative velocity conjunction between members of the Earth Observation System constellation will be presented.
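    For the linear, high-relative-velocity case described above, the probability of collision reduces to integrating a bivariate Gaussian over the hard-body disk in the encounter plane. The sketch below is a generic numerical version of that textbook calculation (all names and numbers are illustrative; it is not the paper's nonlinear algorithm).

      import numpy as np

      def collision_probability(miss_x, miss_y, sigma_x, sigma_y, hard_body_radius, n=400):
          """Midpoint-rule integration of a diagonal bivariate Gaussian over the hard-body disk."""
          dr = hard_body_radius / n
          dth = 2.0 * np.pi / n
          r = (np.arange(n) + 0.5) * dr
          th = (np.arange(n) + 0.5) * dth
          R, TH = np.meshgrid(r, th, indexing="ij")
          x = R * np.cos(TH)
          y = R * np.sin(TH)
          density = np.exp(-0.5 * (((x - miss_x) / sigma_x) ** 2
                                   + ((y - miss_y) / sigma_y) ** 2)) / (2.0 * np.pi * sigma_x * sigma_y)
          return float(np.sum(density * R) * dr * dth)   # R is the polar Jacobian

      # Illustrative numbers only: 20 m hard-body radius, 300 m miss distance,
      # 200 m x 400 m one-sigma position uncertainty in the encounter plane.
      print(collision_probability(300.0, 0.0, 200.0, 400.0, 20.0))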

  8. Probabilistic Estimation of Rare Random Collisions in 3 Space

    DTIC Science & Technology

    2009-03-01

    extended Poisson process as a feature of probability theory. With the bulk of research in extended Poisson processes going into parameter estimation, the...application of extended Poisson processes to spatial processes is largely untouched. Faddy performed a short study of spatial data, but overtly...the theory of extended Poisson processes. To date, the processes are limited in that the rates only depend on the number of arrivals at some time

  9. Approximations of Two-Attribute Utility Functions

    DTIC Science & Technology

    1976-09-01

    preferred to") be a bina-zy relation on the set • of simple probability measures or ’gambles’ defined on a set T of consequences. Throughout this study it...simplifying independence assumptions. Although there are several approaches to this problem, the21 present study will focus on approximations of u... study will elicit additional interest in the topic. 2. REMARKS ON APPROXIMATION THEORY This section outlines a few basic ideas of approximation theory

  10. On the Black-Scholes European Option Pricing Model Robustness and Generality

    NASA Astrophysics Data System (ADS)

    Takada, Hellinton Hatsuo; de Oliveira Siqueira, José

    2008-11-01

    The common presentation of the widely known and accepted Black-Scholes European option pricing model explicitly imposes some restrictions, such as the geometric Brownian motion assumption for the underlying stock price. In this paper, these usual restrictions are relaxed by using the maximum entropy principle of information theory, Pearson's distribution system, and frictionless-market and risk-neutrality theories to calculate a unique risk-neutral probability measure calibrated with market parameters.
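    For reference, the standard Black-Scholes call price against which any such risk-neutral construction can be compared is the textbook formula below (generic implementation with illustrative parameters; it is not the paper's entropy-based derivation).

      from math import erf, exp, log, sqrt

      def norm_cdf(z):
          return 0.5 * (1.0 + erf(z / sqrt(2.0)))

      def black_scholes_call(S0, K, r, sigma, T):
          """European call price under the risk-neutral lognormal (geometric Brownian motion) model."""
          d1 = (log(S0 / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
          d2 = d1 - sigma * sqrt(T)
          return S0 * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)

      # Illustrative parameters only.
      print(round(black_scholes_call(S0=100.0, K=105.0, r=0.03, sigma=0.2, T=0.5), 4))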

  11. Orbital electron capture by the nucleus

    NASA Technical Reports Server (NTRS)

    Bambynek, W.; Behrens, H.; Chen, M. H.; Crasemann, B.; Fitzpatrick, M. L.; Ledingham, K. W. D.; Genz, H.; Mutterer, M.; Intemann, R. L.

    1976-01-01

    The theory of nuclear electron capture is reviewed in the light of current understanding of weak interactions. Experimental methods and results regarding capture probabilities, capture ratios, and EC/Beta(+) ratios are summarized. Radiative electron capture is discussed, including both theory and experiment. Atomic wave function overlap and electron exchange effects are covered, as are atomic transitions that accompany nuclear electron capture. Tables are provided to assist the reader in determining quantities of interest for specific cases.

  12. A quantum description of linear, and non-linear optical interactions in arrays of plasmonic nanoparticles

    NASA Astrophysics Data System (ADS)

    Arabahmadi, Ehsan; Ahmadi, Zabihollah; Rashidian, Bizhan

    2018-06-01

    A quantum theory for describing the interaction of photons and plasmons in one- and two-dimensional arrays is presented. Ohmic losses and inter-band transitions are not considered. We use a macroscopic approach and quantum field theory methods, including S-matrix expansion and Feynman diagrams, for this purpose. Non-linear interactions are also studied, and ways of increasing the probability of such interactions, together with their applications, are discussed.

  13. Potential and flux field landscape theory. I. Global stability and dynamics of spatially dependent non-equilibrium systems.

    PubMed

    Wu, Wei; Wang, Jin

    2013-09-28

    We established a potential and flux field landscape theory to quantify the global stability and dynamics of general spatially dependent non-equilibrium deterministic and stochastic systems. We extended our potential and flux landscape theory for spatially independent non-equilibrium stochastic systems described by Fokker-Planck equations to spatially dependent stochastic systems governed by general functional Fokker-Planck equations as well as functional Kramers-Moyal equations derived from master equations. Our general theory is applied to reaction-diffusion systems. For equilibrium spatially dependent systems with detailed balance, the potential field landscape alone, defined in terms of the steady state probability distribution functional, determines the global stability and dynamics of the system. The global stability of the system is closely related to the topography of the potential field landscape in terms of the basins of attraction and barrier heights in the field configuration state space. The effective driving force of the system is generated by the functional gradient of the potential field alone. For non-equilibrium spatially dependent systems, the curl probability flux field is indispensable in breaking detailed balance and creating non-equilibrium condition for the system. A complete characterization of the non-equilibrium dynamics of the spatially dependent system requires both the potential field and the curl probability flux field. While the non-equilibrium potential field landscape attracts the system down along the functional gradient similar to an electron moving in an electric field, the non-equilibrium flux field drives the system in a curly way similar to an electron moving in a magnetic field. In the small fluctuation limit, the intrinsic potential field as the small fluctuation limit of the potential field for spatially dependent non-equilibrium systems, which is closely related to the steady state probability distribution functional, is found to be a Lyapunov functional of the deterministic spatially dependent system. Therefore, the intrinsic potential landscape can characterize the global stability of the deterministic system. The relative entropy functional of the stochastic spatially dependent non-equilibrium system is found to be the Lyapunov functional of the stochastic dynamics of the system. Therefore, the relative entropy functional quantifies the global stability of the stochastic system with finite fluctuations. Our theory offers an alternative general approach to other field-theoretic techniques, to study the global stability and dynamics of spatially dependent non-equilibrium field systems. It can be applied to many physical, chemical, and biological spatially dependent non-equilibrium systems.
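    Schematically, for the spatially independent analogue with constant diffusion, the decomposition described above can be written as follows (the spatially dependent case replaces gradients with functional derivatives):

      \[
      \mathbf{F}(\mathbf{x}) \;=\; -\,\mathbf{D}\,\nabla U(\mathbf{x}) \;+\; \frac{\mathbf{J}_{ss}(\mathbf{x})}{P_{ss}(\mathbf{x})},
      \qquad
      U(\mathbf{x}) = -\ln P_{ss}(\mathbf{x}),
      \qquad
      \nabla \cdot \mathbf{J}_{ss} = 0 .
      \]

    Here P_ss is the steady-state probability distribution, U the potential landscape, and J_ss the divergence-free steady-state flux; detailed balance corresponds to J_ss = 0, in which case the gradient term alone drives the dynamics.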

  14. Generalized probability theories: what determines the structure of quantum theory?

    NASA Astrophysics Data System (ADS)

    Janotta, Peter; Hinrichsen, Haye

    2014-08-01

    The framework of generalized probabilistic theories is a powerful tool for studying the foundations of quantum physics. It provides the basis for a variety of recent findings that significantly improve our understanding of the rich physical structure of quantum theory. This review paper tries to present the framework and recent results to a broader readership in an accessible manner. To achieve this, we follow a constructive approach. Starting from a few basic physically motivated assumptions we show how a given set of observations can be manifested in an operational theory. Furthermore, we characterize consistency conditions limiting the range of possible extensions. In this framework classical and quantum theory appear as special cases, and the aim is to understand what distinguishes quantum mechanics as the fundamental theory realized in nature. It turns out that non-classical features of single systems can equivalently result from higher-dimensional classical theories that have been restricted. Entanglement and non-locality, however, are shown to be genuine non-classical features.

  15. More Intense Experiences, Less Intense Forecasts: Why People Overweight Probability Specifications in Affective Forecasts

    PubMed Central

    Buechel, Eva C.; Zhang, Jiao; Morewedge, Carey K.; Vosgerau, Joachim

    2014-01-01

    We propose that affective forecasters overestimate the extent to which experienced hedonic responses to an outcome are influenced by the probability of its occurrence. The experience of an outcome (e.g., winning a gamble) is typically more affectively intense than the simulation of that outcome (e.g., imagining winning a gamble) upon which the affective forecast for it is based. We suggest that, as a result, experiencers allocate a larger share of their attention toward the outcome (e.g., winning the gamble) and less to its probability specifications than do affective forecasters. Consequently, hedonic responses to an outcome are less sensitive to its probability specifications than are affective forecasts for that outcome. The results of 6 experiments provide support for our theory. Affective forecasters overestimated how sensitive experiencers would be to the probability of positive and negative outcomes (Experiments 1 and 2). Consistent with our attentional account, differences in sensitivity to probability specifications disappeared when the attention of forecasters was diverted from probability specifications (Experiment 3) or when the attention of experiencers was drawn toward probability specifications (Experiment 4). Finally, differences in sensitivity to probability specifications between forecasters and experiencers were diminished when the forecasted outcome was more affectively intense (Experiments 5 and 6). PMID:24128184

  16. More intense experiences, less intense forecasts: why people overweight probability specifications in affective forecasts.

    PubMed

    Buechel, Eva C; Zhang, Jiao; Morewedge, Carey K; Vosgerau, Joachim

    2014-01-01

    We propose that affective forecasters overestimate the extent to which experienced hedonic responses to an outcome are influenced by the probability of its occurrence. The experience of an outcome (e.g., winning a gamble) is typically more affectively intense than the simulation of that outcome (e.g., imagining winning a gamble) upon which the affective forecast for it is based. We suggest that, as a result, experiencers allocate a larger share of their attention toward the outcome (e.g., winning the gamble) and less to its probability specifications than do affective forecasters. Consequently, hedonic responses to an outcome are less sensitive to its probability specifications than are affective forecasts for that outcome. The results of 6 experiments provide support for our theory. Affective forecasters overestimated how sensitive experiencers would be to the probability of positive and negative outcomes (Experiments 1 and 2). Consistent with our attentional account, differences in sensitivity to probability specifications disappeared when the attention of forecasters was diverted from probability specifications (Experiment 3) or when the attention of experiencers was drawn toward probability specifications (Experiment 4). Finally, differences in sensitivity to probability specifications between forecasters and experiencers were diminished when the forecasted outcome was more affectively intense (Experiments 5 and 6).

  17. Is Einsteinian no-signalling violated in Bell tests?

    NASA Astrophysics Data System (ADS)

    Kupczynski, Marian

    2017-11-01

    Relativistic invariance is a physical law verified in several domains of physics. The impossibility of faster-than-light influences is not questioned by quantum theory. In quantum electrodynamics, in quantum field theory and in the standard model, relativistic invariance is incorporated by construction. Quantum mechanics predicts strong long-range correlations between outcomes of spin projection measurements performed in distant laboratories. In spite of these strong correlations, marginal probability distributions should not depend on what was measured in the other laboratory, a property known for short as no-signalling. In several experiments, performed to test various Bell-type inequalities, some unexplained dependence of empirical marginal probability distributions on distant settings was observed. In this paper we demonstrate how a particular identification and selection procedure of paired distant outcomes is the most probable cause of this apparent violation of the no-signalling principle. Thus this unexpected setting dependence does not prove the existence of superluminal influences, and the Einsteinian no-signalling principle has to be tested differently in dedicated experiments. We propose a detailed protocol telling how such experiments should be designed in order to be conclusive. We also explain how magical quantum correlations may be explained in a locally causal way.

  18. The functional transfer of genes from the mitochondria to the nucleus: the effects of selection, mutation, population size and rate of self-fertilization.

    PubMed

    Brandvain, Yaniv; Wade, Michael J

    2009-08-01

    The transfer of mitochondrial genes to the nucleus is a recurrent and consistent feature of eukaryotic genome evolution. Although many theories have been proposed to explain such transfers, little relevant data exist. The observation that clonal and self-fertilizing plants transfer more mitochondrial genes to their nuclei than do outcrossing plants contradicts predictions of major theories based on nuclear recombination and leaves a gap in our conceptual understanding of how the observed pattern of gene transfer could arise. Here, with a series of deterministic and stochastic simulations, we show how epistatic selection and relative mutation rates of mitochondrial and nuclear genes influence mitochondrial-to-nuclear gene transfer. Specifically, we show that when there is a benefit to having a mitochondrial gene present in the nucleus, but absent in the mitochondria, self-fertilization dramatically increases both the rate and the probability of gene transfer. However, absent such a benefit, when mitochondrial mutation rates exceed those of the nucleus, self-fertilization decreases the rate and probability of transfer. This latter effect, however, is much weaker than the former. Our results are relevant to understanding the probabilities of fixation when loci in different genomes interact.

  19. Fault and event tree analyses for process systems risk analysis: uncertainty handling formulations.

    PubMed

    Ferdous, Refaul; Khan, Faisal; Sadiq, Rehan; Amyotte, Paul; Veitch, Brian

    2011-01-01

    Quantitative risk analysis (QRA) is a systematic approach for evaluating likelihood, consequences, and risk of adverse events. QRA based on event (ETA) and fault tree analyses (FTA) employs two basic assumptions. The first assumption is related to likelihood values of input events, and the second assumption is regarding interdependence among the events (for ETA) or basic events (for FTA). Traditionally, FTA and ETA both use crisp probabilities; however, to deal with uncertainties, the probability distributions of input event likelihoods are assumed. These probability distributions are often hard to come by and even if available, they are subject to incompleteness (partial ignorance) and imprecision. Furthermore, both FTA and ETA assume that events (or basic events) are independent. In practice, these two assumptions are often unrealistic. This article focuses on handling uncertainty in a QRA framework of a process system. Fuzzy set theory and evidence theory are used to describe the uncertainties in the input event likelihoods. A method based on a dependency coefficient is used to express interdependencies of events (or basic events) in ETA and FTA. To demonstrate the approach, two case studies are discussed. © 2010 Society for Risk Analysis.
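    As a minimal illustration of propagating interval-valued likelihoods through fault tree gates (an independence assumption is made here; the article's fuzzy-set and evidence-theory formulation with a dependency coefficient is richer than this, and all numbers below are hypothetical):

      def and_gate(intervals):
          """Interval probability of an AND gate for independent basic events."""
          lo = hi = 1.0
          for l, u in intervals:
              lo *= l
              hi *= u
          return lo, hi

      def or_gate(intervals):
          """Interval probability of an OR gate for independent basic events."""
          prod_lo = prod_hi = 1.0
          for l, u in intervals:
              prod_lo *= (1.0 - l)
              prod_hi *= (1.0 - u)
          return 1.0 - prod_lo, 1.0 - prod_hi

      # Hypothetical basic-event likelihood intervals (lower, upper).
      basic_events = [(1e-3, 5e-3), (2e-3, 4e-3), (1e-4, 1e-3)]
      print(or_gate(basic_events))       # top-event interval for an OR gate
      print(and_gate(basic_events[:2]))  # intermediate-event interval for an AND gate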

  20. Uncertainty, imprecision, and the precautionary principle in climate change assessment.

    PubMed

    Borsuk, M E; Tomassini, L

    2005-01-01

    Statistical decision theory can provide useful support for climate change decisions made under conditions of uncertainty. However, the probability distributions used to calculate expected costs in decision theory are themselves subject to uncertainty, disagreement, or ambiguity in their specification. This imprecision can be described using sets of probability measures, from which upper and lower bounds on expectations can be calculated. However, many representations, or classes, of probability measures are possible. We describe six of the more useful classes and demonstrate how each may be used to represent climate change uncertainties. When expected costs are specified by bounds, rather than precise values, the conventional decision criterion of minimum expected cost is insufficient to reach a unique decision. Alternative criteria are required, and the criterion of minimum upper expected cost may be desirable because it is consistent with the precautionary principle. Using simple climate and economics models as an example, we determine the carbon dioxide emissions levels that have minimum upper expected cost for each of the selected classes. There can be wide differences in these emissions levels and their associated costs, emphasizing the need for care when selecting an appropriate class.
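    A minimal sketch of the precautionary decision rule described above, choosing the action with minimum upper expected cost over a finite set of candidate probability measures (all measures, costs, and action labels below are invented placeholders, not the paper's climate and economics models):

      import numpy as np

      # Rows: candidate probability measures over climate-sensitivity scenarios.
      # Columns: scenarios (low, medium, high). Values are hypothetical.
      candidate_measures = np.array([
          [0.5, 0.4, 0.1],
          [0.3, 0.5, 0.2],
          [0.2, 0.4, 0.4],
      ])

      # cost[action, scenario]: hypothetical total costs (abatement plus damages).
      cost = np.array([
          [9.0, 6.0, 4.0],   # aggressive abatement
          [4.0, 6.0, 12.0],  # moderate abatement
          [1.0, 7.0, 25.0],  # business as usual
      ])

      expected = cost @ candidate_measures.T           # expected cost of each action under each measure
      lower, upper = expected.min(axis=1), expected.max(axis=1)
      best = int(np.argmin(upper))                     # minimum upper expected cost criterion
      print(lower, upper, "choose action", best)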

  1. Current Fluctuations in Stochastic Lattice Gases

    NASA Astrophysics Data System (ADS)

    Bertini, L.; de Sole, A.; Gabrielli, D.; Jona-Lasinio, G.; Landim, C.

    2005-01-01

    We study current fluctuations in lattice gases in the macroscopic limit, extending the dynamic approach for density fluctuations developed in previous articles. More precisely, we establish a large deviation theory for the space-time fluctuations of the empirical current which includes the previous results. We then estimate the probability of a fluctuation of the average current over a large time interval. It turns out that recent results by Bodineau and Derrida [Phys. Rev. Lett. 92, 180601 (2004)] in certain cases underestimate this probability due to the occurrence of dynamical phase transitions.

  2. Quantum-like dynamics of decision-making in prisoner's dilemma game

    NASA Astrophysics Data System (ADS)

    Asano, Masanari; Basieva, Irina; Khrennikov, Andrei; Ohya, Masanori; Tanaka, Yoshiharu

    2012-03-01

    In cognitive psychology, several game experiments have been reported [1, 2, 3, 4], demonstrating that real players did not use the "rational strategy" provided by classical game theory. To discuss the probabilities of such "irrational choice", we recently proposed a decision-making model which is based on the formalism of quantum mechanics [5, 6, 7, 8]. In this paper, we briefly explain the above model and calculate the probability of irrational choice in several prisoner's dilemma (PD) games.

  3. Adolphe Quetelet: Prophet of the New Statistics,

    DTIC Science & Technology

    1980-05-01

    study human societies. He ended by insisting that a complete census of the population was necessary. Such a census was decreed in 1828 to take place in...What a sorry state for the human race. The parts played by the prisons, the fetters, the death penalty, seem fixed with as much probability as the...book on probability theory that "frequency regularities" imply "constant causes" in the world of the social side of humanity, and had hinted that a

  4. An appreciation of Richard Threlkeld Cox

    NASA Astrophysics Data System (ADS)

    Tribus, Myron

    2002-05-01

    Richard T. Cox's contributions to the foundations of probability theory and inductive logic are not generally appreciated or understood. This paper reviews his life and accomplishments, especially those in his book The Algebra of Probable Inference and his final publication Inference and Inquiry which, in this author's opinion, has the potential to influence in a significant way the design and analysis of self organizing systems which learn from experience. A simple application to the simulation of a neuron is presented as an example of the power of Cox's contribution.

  5. An experimental study of the surface elevation probability distribution and statistics of wind-generated waves

    NASA Technical Reports Server (NTRS)

    Huang, N. E.; Long, S. R.

    1980-01-01

    Laboratory experiments were performed to measure the surface elevation probability density function and associated statistical properties for a wind-generated wave field. The laboratory data along with some limited field data were compared. The statistical properties of the surface elevation were processed for comparison with the results derived from the Longuet-Higgins (1963) theory. It is found that, even for the highly non-Gaussian cases, the distribution function proposed by Longuet-Higgins still gives good approximations.

  6. Noisy probability judgment, the conjunction fallacy, and rationality: Comment on Costello and Watts (2014).

    PubMed

    Crupi, Vincenzo; Tentori, Katya

    2016-01-01

    According to Costello and Watts (2014), probability theory can account for key findings in human judgment research provided that random noise is embedded in the model. We concur with a number of Costello and Watts's remarks, but challenge the empirical adequacy of their model in one of their key illustrations (the conjunction fallacy) on the basis of recent experimental findings. We also discuss how our argument bears on heuristic and rational thinking. (c) 2015 APA, all rights reserved.

  7. Exact one-sided confidence limits for the difference between two correlated proportions.

    PubMed

    Lloyd, Chris J; Moldovan, Max V

    2007-08-15

    We construct exact and optimal one-sided upper and lower confidence bounds for the difference between two probabilities based on matched binary pairs using well-established optimality theory of Buehler. Starting with five different approximate lower and upper limits, we adjust them to have coverage probability exactly equal to the desired nominal level and then compare the resulting exact limits by their mean size. Exact limits based on the signed root likelihood ratio statistic are preferred and recommended for practical use.

  8. A discriminative test among the different theories proposed to explain the origin of the genetic code: the coevolution theory finds additional support.

    PubMed

    Giulio, Massimo Di

    2018-05-19

    A discriminative statistical test among the different theories proposed to explain the origin of the genetic code is presented. The amino acids are gathered into polarity classes (an expression of the physicochemical theory of the origin of the genetic code) and into biosynthetic classes (an expression of the coevolution theory), and these classes are evaluated with Fisher's exact test to establish their significance within the genetic code table. By attaching to the rows and columns of the genetic code the probabilities that express the statistical significance of these classes, I am finally able to calculate a χ value for both the physicochemical theory and the coevolution theory that expresses the level of corroboration of each theory. The comparison between these two χ values showed that, in this strictly empirical analysis, the coevolution theory is able to explain the origin of the genetic code better than the physicochemical theory. Copyright © 2018 Elsevier B.V. All rights reserved.
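    Fisher's exact test on a 2x2 contingency table is readily available in standard libraries; a minimal sketch with entirely hypothetical counts (not the genetic code tables analyzed in the paper):

      from scipy.stats import fisher_exact

      # Hypothetical 2x2 table: rows = amino acid in class / not in class,
      # columns = assigned to a given column of the genetic code table / elsewhere.
      table = [[9, 7],
               [3, 45]]
      odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
      print(odds_ratio, p_value)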

  9. Two statistical mechanics aspects of complex networks

    NASA Astrophysics Data System (ADS)

    Thurner, Stefan; Biely, Christoly

    2006-12-01

    By adopting an ensemble interpretation of non-growing rewiring networks, network theory can be reduced to a counting problem of possible network states and an identification of their associated probabilities. We present two scenarios of how different rewirement schemes can be used to control the state probabilities of the system. In particular, we review how by generalizing the linking rules of random graphs, in combination with superstatistics and quantum mechanical concepts, one can establish an exact relation between the degree distribution of any given network and the nodes’ linking probability distributions. In a second approach, we control state probabilities by a network Hamiltonian, whose characteristics are motivated by biological and socio-economical statistical systems. We demonstrate that a thermodynamics of networks becomes a fully consistent concept, allowing one to study, e.g., ‘phase transitions’ and to compute entropies through thermodynamic relations.

  10. Trapping dynamics of xenon on Pt(111)

    NASA Astrophysics Data System (ADS)

    Arumainayagam, Christopher R.; Madix, Robert J.; Mcmaster, Mark C.; Suzawa, Valerie M.; Tully, John C.

    1990-02-01

    The dynamics of Xe trapping on Pt(111) was studied using supersonic atomic beam techniques. Initial trapping probabilities (S0) were measured directly as a function of incident translational energy (ET) and angle of incidence (θi) at a surface temperature (Ts) of 95 K. The initial trapping probability decreases smoothly with increasing ET cos θi, rather than ET cos²θi, suggesting participation of parallel momentum in the trapping process. Accordingly, the measured initial trapping probability falls off more slowly with increasing incident translational energy than predicted by one-dimensional theories. This finding is in near agreement with previous mean translational energy measurements for Xe desorbing near the Pt(111) surface normal, assuming detailed balance applies. Three-dimensional stochastic classical trajectory calculations presented herein also exhibit the importance of tangential momentum in trapping and satisfactorily reproduce the experimental initial trapping probabilities.

  11. Quantum Probability Cancellation Due to a Single-Photon State

    NASA Technical Reports Server (NTRS)

    Ou, Z. Y.

    1996-01-01

    When an N-photon state enters a lossless symmetric beamsplitter from one input port, the photon distribution for the two output ports has the form of a Bernoulli binomial, with highest probability at equal partition (N/2 at one output port and N/2 at the other). However, injection of a single-photon state at the other input port can dramatically change the photon distribution at the outputs, resulting in zero probability at equal partition. Such a strong deviation from classical particle theory stems from quantum probability amplitude cancellation. The effect persists even if the N-photon state is replaced by an arbitrary state of light. A special case is the coherent state, which corresponds to homodyne detection of a single-photon state and can lead to the measurement of the wave function of a single photon state.

  12. Detection of nuclear resonance signals: modification of the receiver operating characteristics using feedback.

    PubMed

    Blauch, A J; Schiano, J L; Ginsberg, M D

    2000-06-01

    The performance of a nuclear resonance detection system can be quantified using binary detection theory. Within this framework, signal averaging increases the probability of a correct detection and decreases the probability of a false alarm by reducing the variance of the noise in the average signal. In conjunction with signal averaging, we propose another method based on feedback control concepts that further improves detection performance. By maximizing the nuclear resonance signal amplitude, feedback raises the probability of correct detection. Furthermore, information generated by the feedback algorithm can be used to reduce the probability of false alarm. We discuss the advantages afforded by feedback that cannot be obtained using signal averaging. As an example, we show how this method is applicable to the detection of explosives using nuclear quadrupole resonance. Copyright 2000 Academic Press.

  13. Uncertainty and probability for branching selves

    NASA Astrophysics Data System (ADS)

    Lewis, Peter J.

    Everettian accounts of quantum mechanics entail that people branch; every possible result of a measurement actually occurs, and I have one successor for each result. Is there room for probability in such an account? The prima facie answer is no; there are no ontic chances here, and no ignorance about what will happen. But since any adequate quantum mechanical theory must make probabilistic predictions, much recent philosophical labor has gone into trying to construct an account of probability for branching selves. One popular strategy involves arguing that branching selves introduce a new kind of subjective uncertainty. I argue here that the variants of this strategy in the literature all fail, either because the uncertainty is spurious, or because it is in the wrong place to yield probabilistic predictions. I conclude that uncertainty cannot be the ground for probability in Everettian quantum mechanics.

  14. Questions Revisited: A Close Examination of Calculus of Inference and Inquiry

    NASA Technical Reports Server (NTRS)

    Knuth, Kevin H.; Koga, Dennis (Technical Monitor)

    2003-01-01

    In this paper I examine more closely the way in which probability theory, the calculus of inference, is derived from the Boolean lattice structure of logical assertions ordered by implication. I demonstrate how the duality between the logical conjunction and disjunction in Boolean algebra is lost when deriving the probability calculus. In addition, I look more closely at the other lattice identities to verify that they are satisfied by the probability calculus. Last, I look towards developing the calculus of inquiry demonstrating that there is a sum and product rule for the relevance measure as well as a Bayes theorem. Current difficulties in deriving the complete inquiry calculus will also be discussed.
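    For reference, the product rule, sum rule, and Bayes theorem referred to above take the familiar form (written in one common notation):

      \[
      p(A \wedge B \mid C) = p(A \mid B \wedge C)\, p(B \mid C), \qquad
      p(A \mid C) + p(\lnot A \mid C) = 1, \qquad
      p(A \mid B \wedge C) = \frac{p(B \mid A \wedge C)\, p(A \mid C)}{p(B \mid C)} .
      \]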

  15. Nonlinear resonance scattering of femtosecond X-ray pulses on atoms in plasmas

    NASA Astrophysics Data System (ADS)

    Rosmej, F. B.; Astapenko, V. A.; Lisitsa, V. S.; Moroz, N. N.

    2017-11-01

    It is shown that for sufficiently short pulses the resonance scattering probability becomes a nonlinear function of the pulse duration. For fs X-ray pulses scattered on atoms in plasmas maxima and minima develop in the nonlinear regime whereas in the limit of long pulses the probability becomes linear and turns over into the standard description of the electromagnetic pulse scattering. Numerical calculations are carried out in terms of a generalized scattering probability for the total time of pulse duration including fine structure splitting and ion Doppler broadening in hot plasmas. For projected X-ray monocycles, the generalized nonlinear approach differs by 1-2 orders of magnitude from the standard theory.

  16. Neutrino oscillation processes in a quantum-field-theoretical approach

    NASA Astrophysics Data System (ADS)

    Egorov, Vadim O.; Volobuev, Igor P.

    2018-05-01

    It is shown that neutrino oscillation processes can be consistently described in the framework of quantum field theory using only the plane wave states of the particles. Namely, the oscillating electron survival probabilities in experiments with neutrino detection by charged-current and neutral-current interactions are calculated in the quantum field-theoretical approach to neutrino oscillations based on a modification of the Feynman propagator in the momentum representation. The approach is most similar to the standard Feynman diagram technique. It is found that the oscillating distance-dependent probabilities of detecting an electron in experiments with neutrino detection by charged-current and neutral-current interactions exactly coincide with the corresponding probabilities calculated in the standard approach.
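    For orientation, in the simplest two-flavour case the distance-dependent survival probability that such calculations reproduce has the standard form (quoted here as background, not as a result of the paper):

      \[
      P_{\nu_e \to \nu_e}(L) \;=\; 1 - \sin^2 2\theta\, \sin^2\!\left(\frac{\Delta m^2 L}{4E}\right)
      \;=\; 1 - \sin^2 2\theta\, \sin^2\!\left(1.27\,\frac{\Delta m^2[\mathrm{eV}^2]\, L[\mathrm{km}]}{E[\mathrm{GeV}]}\right).
      \]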

  17. The role of probability arguments in the history of science.

    PubMed

    Weinert, Friedel

    2010-03-01

    The paper examines Wesley Salmon's claim that the primary role of plausibility arguments in the history of science is to impose constraints on the prior probability of hypotheses (in the language of Bayesian confirmation theory). A detailed look at Copernicanism and Darwinism and, more briefly, Rutherford's discovery of the atomic nucleus reveals a further and arguably more important role of plausibility arguments. It resides in the consideration of likelihoods, which state how likely a given hypothesis makes a given piece of evidence. In each case the likelihoods raise the probability of one of the competing hypotheses and diminish the credibility of its rival, and this may happen either on the basis of 'old' or 'new' evidence.

  18. The complexity of divisibility.

    PubMed

    Bausch, Johannes; Cubitt, Toby

    2016-09-01

    We address two sets of long-standing open questions in linear algebra and probability theory, from a computational complexity perspective: stochastic matrix divisibility, and divisibility and decomposability of probability distributions. We prove that finite divisibility of stochastic matrices is an NP-complete problem, and extend this result to nonnegative matrices, and completely-positive trace-preserving maps, i.e. the quantum analogue of stochastic matrices. We further prove a complexity hierarchy for the divisibility and decomposability of probability distributions, showing that finite distribution divisibility is in P, but decomposability is NP-hard. For the former, we give an explicit polynomial-time algorithm. All results on distributions extend to weak-membership formulations, proving that the complexity of these problems is robust to perturbations.

  19. Die Kosmogonie Anton von Zachs.

    NASA Astrophysics Data System (ADS)

    Brosche, P.

    In his "Cosmogenische Betrachtungen" (1804), Anton von Zach rediscovered - probably independently - some aspects of the theories of Kant and Laplace. More originally, he envisaged also the consequences of an era of heavy impacts in the early history of the Earth.

  20. A scaling law for random walks on networks

    PubMed Central

    Perkins, Theodore J.; Foxall, Eric; Glass, Leon; Edwards, Roderick

    2014-01-01

    The dynamics of many natural and artificial systems are well described as random walks on a network: the stochastic behaviour of molecules, traffic patterns on the internet, fluctuations in stock prices and so on. The vast literature on random walks provides many tools for computing properties such as steady-state probabilities or expected hitting times. Previously, however, there has been no general theory describing the distribution of possible paths followed by a random walk. Here, we show that for any random walk on a finite network, there are precisely three mutually exclusive possibilities for the form of the path distribution: finite, stretched exponential and power law. The form of the distribution depends only on the structure of the network, while the stepping probabilities control the parameters of the distribution. We use our theory to explain path distributions in domains such as sports, music, nonlinear dynamics and stochastic chemical kinetics. PMID:25311870
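    A small simulation sketch of the object being classified above, the distribution over distinct paths followed by a random walk between a source and a target node (the toy network and function names are hypothetical; the paper's classification is analytical, not simulation-based):

      import random
      from collections import Counter

      # Toy directed network as an adjacency list; node 3 is absorbing.
      graph = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1, 3], 3: [3]}

      def sample_path(source=0, target=3, rng=random):
          """Uniform random walk from source until the target is first reached; returns the node sequence."""
          node, path = source, (source,)
          while node != target:
              node = rng.choice(graph[node])
              path += (node,)
          return path

      n_samples = 100000
      counts = Counter(sample_path() for _ in range(n_samples))
      for path, c in counts.most_common(5):
          print(path, c / n_samples)   # empirical path probabilities, most likely first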

  1. A Generalized Quantum Theory

    NASA Astrophysics Data System (ADS)

    Niestegge, Gerd

    2014-09-01

    In quantum mechanics, the selfadjoint Hilbert space operators play a triple role as observables, generators of the dynamical groups and statistical operators defining the mixed states. One might expect that this is typical of Hilbert space quantum mechanics, but it is not. The same triple role occurs for the elements of a certain ordered Banach space in a much more general theory based upon quantum logics and a conditional probability calculus (which is a quantum logical model of the Lueders-von Neumann measurement process). It is shown how positive groups, automorphism groups, Lie algebras and statistical operators emerge from one major postulate - the non-existence of third-order interference (third-order interference and its impossibility in quantum mechanics were discovered by R. Sorkin in 1994). This again underlines the power of the combination of the conditional probability calculus with the postulate that there is no third-order interference. In two earlier papers, its impact on contextuality and nonlocality had already been revealed.

  2. Statistical distribution of the vacuum energy density in racetrack Kähler uplift models in string theory

    NASA Astrophysics Data System (ADS)

    Sumitomo, Yoske; Tye, S.-H. Henry; Wong, Sam S. C.

    2013-07-01

    We study a racetrack model in the presence of the leading α'-correction in flux compactification in Type IIB string theory, for the purpose of getting conceivable de-Sitter vacua in the large compactified volume approximation. Unlike the Kähler Uplift model studied previously, the α'-correction is more controllable for the meta-stable de-Sitter vacua in the racetrack case since the constraint on the compactified volume size is very much relaxed. We find that the vacuum energy density Λ for de-Sitter vacua approaches zero exponentially as the volume grows. We also analyze properties of the probability distribution of Λ in this class of models. As in other cases studied earlier, the probability distribution again peaks sharply at Λ = 0. We also study the Racetrack Kähler Uplift model in the Swiss-Cheese type model.

  3. Critical behavior of the contact process on small-world networks

    NASA Astrophysics Data System (ADS)

    Ferreira, Ronan S.; Ferreira, Silvio C.

    2013-11-01

    We investigate the role of clustering on the critical behavior of the contact process (CP) on small-world networks using the Watts-Strogatz (WS) network model with an edge rewiring probability p. The critical point is well predicted by a homogeneous cluster-approximation for the limit of vanishing clustering ( p → 1). The critical exponents and dimensionless moment ratios of the CP are in agreement with those predicted by the mean-field theory for any p > 0. This independence on the network clustering shows that the small-world property is a sufficient condition for the mean-field theory to correctly predict the universality of the model. Moreover, we compare the CP dynamics on WS networks with rewiring probability p = 1 and random regular networks and show that the weak heterogeneity of the WS network slightly changes the critical point but does not alter other critical quantities of the model.

  4. The fuzzy cube and causal efficacy: representation of concomitant mechanisms in stroke.

    PubMed

    Jobe, Thomas H.; Helgason, Cathy M.

    1998-04-01

    Twentieth century medical science has embraced nineteenth century Boolean probability theory based upon two-valued Aristotelian logic. With the later addition of bit-based, von Neumann structured computational architectures, an epistemology based on randomness has led to a bivalent epidemiological methodology that dominates medical decision making. In contrast, fuzzy logic, based on twentieth century multi-valued logic, and computational structures that are content addressed and adaptively modified, has advanced a new scientific paradigm for the twenty-first century. Diseases such as stroke involve multiple concomitant causal factors that are difficult to represent using conventional statistical methods. We tested which paradigm best represented this complex multi-causal clinical phenomenon-stroke. We show that the fuzzy logic paradigm better represented clinical complexity in cerebrovascular disease than current probability theory based methodology. We believe this finding is generalizable to all of clinical science since multiple concomitant causal factors are involved in nearly all known pathological processes.

  5. Asteroid orbital error analysis: Theory and application

    NASA Technical Reports Server (NTRS)

    Muinonen, K.; Bowell, Edward

    1992-01-01

    We present a rigorous Bayesian theory for asteroid orbital error estimation in which the probability density of the orbital elements is derived from the noise statistics of the observations. For Gaussian noise in a linearized approximation the probability density is also Gaussian, and the errors of the orbital elements at a given epoch are fully described by the covariance matrix. The law of error propagation can then be applied to calculate past and future positional uncertainty ellipsoids (Cappellari et al. 1976, Yeomans et al. 1987, Whipple et al. 1991). To our knowledge, this is the first time a Bayesian approach has been formulated for orbital element estimation. In contrast to the classical Fisherian school of statistics, the Bayesian school allows a priori information to be formally present in the final estimation. However, Bayesian estimation does give the same results as Fisherian estimation when no a priori information is assumed (Lehtinen 1988, and references therein).

  6. A scaling law for random walks on networks

    NASA Astrophysics Data System (ADS)

    Perkins, Theodore J.; Foxall, Eric; Glass, Leon; Edwards, Roderick

    2014-10-01

    The dynamics of many natural and artificial systems are well described as random walks on a network: the stochastic behaviour of molecules, traffic patterns on the internet, fluctuations in stock prices and so on. The vast literature on random walks provides many tools for computing properties such as steady-state probabilities or expected hitting times. Previously, however, there has been no general theory describing the distribution of possible paths followed by a random walk. Here, we show that for any random walk on a finite network, there are precisely three mutually exclusive possibilities for the form of the path distribution: finite, stretched exponential and power law. The form of the distribution depends only on the structure of the network, while the stepping probabilities control the parameters of the distribution. We use our theory to explain path distributions in domains such as sports, music, nonlinear dynamics and stochastic chemical kinetics.

  7. Optimized nested Markov chain Monte Carlo sampling: theory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coe, Joshua D; Shaw, M Sam; Sewell, Thomas D

    2009-01-01

    Metropolis Monte Carlo sampling of a reference potential is used to build a Markov chain in the isothermal-isobaric ensemble. At the endpoints of the chain, the energy is reevaluated at a different level of approximation (the 'full' energy) and a composite move encompassing all of the intervening steps is accepted on the basis of a modified Metropolis criterion. By manipulating the thermodynamic variables characterizing the reference system we maximize the average acceptance probability of composite moves, lengthening significantly the random walk made between consecutive evaluations of the full energy at a fixed acceptance probability. This provides maximally decorrelated samples of the full potential, thereby lowering the total number required to build ensemble averages of a given variance. The efficiency of the method is illustrated using model potentials appropriate to molecular fluids at high pressure. Implications for ab initio or density functional theory (DFT) treatment are discussed.
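    A minimal canonical-ensemble (NVT) sketch of the nested acceptance idea with toy one-dimensional potentials; the article works in the isothermal-isobaric ensemble with molecular models, so everything below is only illustrative.

      import math, random

      BETA = 1.0
      def e_ref(x):  return 0.5 * x * x                  # cheap reference potential (harmonic)
      def e_full(x): return 0.5 * x * x + 0.3 * x**4     # 'full' potential, evaluated only at chain endpoints

      def reference_chain(x, n_steps=20, step=0.5):
          """Metropolis walk on the reference potential; returns the endpoint."""
          for _ in range(n_steps):
              y = x + random.uniform(-step, step)
              d = e_ref(y) - e_ref(x)
              if d <= 0.0 or random.random() < math.exp(-BETA * d):
                  x = y
          return x

      def composite_move(x):
          """Accept the whole sub-chain via a modified Metropolis criterion on full-minus-reference energy differences."""
          y = reference_chain(x)
          d = (e_full(y) - e_full(x)) - (e_ref(y) - e_ref(x))
          if d <= 0.0 or random.random() < math.exp(-BETA * d):
              return y
          return x

      x, samples = 0.0, []
      for _ in range(5000):
          x = composite_move(x)
          samples.append(x)
      print(sum(samples) / len(samples))   # should fluctuate around 0 for this symmetric potential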

  8. A scaling law for random walks on networks.

    PubMed

    Perkins, Theodore J; Foxall, Eric; Glass, Leon; Edwards, Roderick

    2014-10-14

    The dynamics of many natural and artificial systems are well described as random walks on a network: the stochastic behaviour of molecules, traffic patterns on the internet, fluctuations in stock prices and so on. The vast literature on random walks provides many tools for computing properties such as steady-state probabilities or expected hitting times. Previously, however, there has been no general theory describing the distribution of possible paths followed by a random walk. Here, we show that for any random walk on a finite network, there are precisely three mutually exclusive possibilities for the form of the path distribution: finite, stretched exponential and power law. The form of the distribution depends only on the structure of the network, while the stepping probabilities control the parameters of the distribution. We use our theory to explain path distributions in domains such as sports, music, nonlinear dynamics and stochastic chemical kinetics.

  9. Electromagnetic cellular interactions.

    PubMed

    Cifra, Michal; Fields, Jeremy Z; Farhadi, Ashkan

    2011-05-01

    Chemical and electrical interaction within and between cells is well established. Just the opposite is true about cellular interactions via other physical fields. The most probable candidate for another form of cellular interaction is the electromagnetic field. We review theories and experiments on how cells can generate and detect electromagnetic fields generally, and whether the cell-generated electromagnetic field can mediate cellular interactions. We do not limit ourselves here to specialized electro-excitable cells. Rather we describe physical processes that are of a more general nature and probably present in almost every type of living cell. The spectral range included is broad; from kHz to the visible part of the electromagnetic spectrum. We show that there is a rather large number of theories on how cells can generate and detect electromagnetic fields and discuss experimental evidence on electromagnetic cellular interactions in the modern scientific literature. Although small, this body of evidence is continuously accumulating. Copyright © 2010 Elsevier Ltd. All rights reserved.

  10. Prospect theory in the health domain: a quantitative assessment.

    PubMed

    Attema, Arthur E; Brouwer, Werner B F; l'Haridon, Olivier

    2013-12-01

    It is well-known that expected utility (EU) has empirical deficiencies. Cumulative prospect theory (CPT) has developed as an alternative with more descriptive validity. However, CPT's full function had not yet been quantified in the health domain. This paper is therefore the first to simultaneously measure utility of life duration, probability weighting, and loss aversion in this domain. We observe loss aversion and risk aversion for gains and losses, which for gains can be explained by probabilistic pessimism. Utility for gains is almost linear. For losses, we find less weighting of probability 1/2 and concave utility. This contrasts with the common finding of convex utility for monetary losses. However, CPT was proposed to explain choices among lotteries involving monetary outcomes. Life years are arguably very different from monetary outcomes and need not generate convex utility for losses. Moreover, utility of life duration reflects discounting, causing concave utility. Copyright © 2013 Elsevier B.V. All rights reserved.

  11. Site-percolation threshold of carbon nanotube fibers-Fast inspection of percolation with Markov stochastic theory

    NASA Astrophysics Data System (ADS)

    Xu, Fangbo; Xu, Zhiping; Yakobson, Boris I.

    2014-08-01

    We present a site-percolation model based on a modified FCC lattice, as well as an efficient algorithm for inspecting percolation that takes advantage of Markov stochastic theory, in order to study the percolation threshold of carbon nanotube (CNT) fibers. Our Markov-chain based algorithm carries out the inspection of percolation by performing repeated sparse matrix-vector multiplications, which allows parallelized computation to accelerate the inspection for a given configuration. With this approach, we determine that the site-percolation transition of CNT fibers occurs at pc=0.1533±0.0013, and analyze the dependence of the effective percolation threshold (corresponding to 0.5 percolation probability) on the length and the aspect ratio of a CNT fiber on a finite-size-scaling basis. We also discuss the aspect ratio dependence of percolation probability with various values of p (not restricted to pc).
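
    A minimal version of the inspection-by-sparse-matrix-vector-products idea, applied for simplicity to a site-percolation configuration on a plain 2-D square lattice rather than the paper's modified FCC lattice: reachability from one edge is propagated one matrix-vector product per sweep.

    ```python
    import numpy as np
    from scipy import sparse

    # "Percolation" here means a path of occupied nearest-neighbour sites connecting the
    # left edge to the right edge of an L x L square lattice (illustrative setup only).
    rng = np.random.default_rng(0)
    L, p = 64, 0.6
    occupied = rng.random((L, L)) < p

    # Sparse adjacency matrix of occupied nearest-neighbour sites.
    idx = -np.ones((L, L), dtype=int)
    idx[occupied] = np.arange(occupied.sum())
    rows, cols = [], []
    for i, j in zip(*np.nonzero(occupied)):
        for di, dj in ((1, 0), (0, 1)):
            ni, nj = i + di, j + dj
            if ni < L and nj < L and occupied[ni, nj]:
                rows += [idx[i, j], idx[ni, nj]]
                cols += [idx[ni, nj], idx[i, j]]
    n = int(occupied.sum())
    A = sparse.csr_matrix((np.ones(len(rows)), (rows, cols)), shape=(n, n))

    # Propagate reachability from the occupied sites on the left edge, one matvec per sweep.
    reach = np.zeros(n)
    reach[idx[occupied[:, 0], 0]] = 1.0
    while True:
        new = np.minimum(1.0, reach + A @ reach)
        if np.array_equal(new, reach):
            break
        reach = new

    print("percolates:", bool(reach[idx[occupied[:, -1], -1]].any()))
    ```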

  12. Hybrid Evidence Theory-based Finite Element/Statistical Energy Analysis method for mid-frequency analysis of built-up systems with epistemic uncertainties

    NASA Astrophysics Data System (ADS)

    Yin, Shengwen; Yu, Dejie; Yin, Hui; Lü, Hui; Xia, Baizhan

    2017-09-01

    Considering the epistemic uncertainties within the hybrid Finite Element/Statistical Energy Analysis (FE/SEA) model when it is used for the response analysis of built-up systems in the mid-frequency range, the hybrid Evidence Theory-based Finite Element/Statistical Energy Analysis (ETFE/SEA) model is established by introducing the evidence theory. Based on the hybrid ETFE/SEA model and the sub-interval perturbation technique, the hybrid Sub-interval Perturbation and Evidence Theory-based Finite Element/Statistical Energy Analysis (SIP-ETFE/SEA) approach is proposed. In the hybrid ETFE/SEA model, the uncertainty in the SEA subsystem is modeled by a non-parametric ensemble, while the uncertainty in the FE subsystem is described by focal elements and basic probability assignments (BPA) and dealt with by evidence theory. Within the hybrid SIP-ETFE/SEA approach, the mid-frequency response of interest, such as the ensemble average of the energy response and the cross-spectrum response, is calculated analytically by using the conventional hybrid FE/SEA method. Inspired by probability theory, the intervals of the mean value, variance and cumulative distribution are used to describe the distribution characteristics of mid-frequency responses of built-up systems with epistemic uncertainties. In order to alleviate the computational burden of the extreme value analysis, the sub-interval perturbation technique based on the first-order Taylor series expansion is used in the ETFE/SEA model to acquire the lower and upper bounds of the mid-frequency responses over each focal element. Three numerical examples are given to illustrate the feasibility and effectiveness of the proposed method.

  13. Estimates of annual survival probabilities for adult Florida manatees (Trichechus manatus latirostris)

    USGS Publications Warehouse

    Langtimm, C.A.; O'Shea, T.J.; Pradel, R.; Beck, C.A.

    1998-01-01

    The population dynamics of large, long-lived mammals are particularly sensitive to changes in adult survival. Understanding factors affecting survival patterns is therefore critical for developing and testing theories of population dynamics and for developing management strategies aimed at preventing declines or extinction in such taxa. Few studies have used modern analytical approaches for analyzing variation and testing hypotheses about survival probabilities in large mammals. This paper reports a detailed analysis of annual adult survival in the Florida manatee (Trichechus manatus latirostris), an endangered marine mammal, based on a mark-recapture approach. Natural and boat-inflicted scars distinctively 'marked' individual manatees that were cataloged in a computer-based photographic system. Photo-documented resightings provided 'recaptures.' Using open population models, annual adult-survival probabilities were estimated for manatees observed in winter in three areas of Florida: Blue Spring, Crystal River, and the Atlantic coast. After using goodness-of-fit tests in Program RELEASE to search for violations of the assumptions of mark-recapture analysis, survival and sighting probabilities were modeled under several different biological hypotheses with Program SURGE. Estimates of mean annual probability of sighting varied from 0.948 for Blue Spring to 0.737 for Crystal River and 0.507 for the Atlantic coast. At Crystal River and Blue Spring, annual survival probabilities were best estimated as constant over the study period at 0.96 (95% CI = 0.951-0.975 and 0.900-0.985, respectively). On the Atlantic coast, where manatees are impacted more by human activities, annual survival probabilities had a significantly lower mean estimate of 0.91 (95% CI = 0.887-0.926) and varied unpredictably over the study period. For each study area, survival did not differ between sexes and was independent of relative adult age. The high constant adult-survival probabilities estimated for manatees in the Blue Spring and Crystal River areas were consistent with current mammalian life history theory and other empirical data available for large, long-lived mammals. Adult survival probabilities in these areas appeared high enough to maintain growing populations if other traits such as reproductive rates and juvenile survival were also sufficiently high. Lower and variable survival rates on the Atlantic coast are cause for concern.

  14. Assessing the chances of success: naïve statistics versus kind experience.

    PubMed

    Hogarth, Robin M; Mukherjee, Kanchan; Soyer, Emre

    2013-01-01

    Additive integration of information is ubiquitous in judgment and has been shown to be effective even when multiplicative rules of probability theory are prescribed. We explore the generality of these findings in the context of estimating probabilities of success in contests. We first define a normative model of these probabilities that takes account of relative skill levels in contests where only a limited number of entrants can win. We then report 4 experiments using a scenario about a competition. Experiments 1 and 2 both elicited judgments of probabilities, and, although participants' responses demonstrated considerable variability, their mean judgments provide a good fit to a simple linear model. Experiment 3 explored choices. Most participants entered most contests and showed little awareness of appropriate probabilities. Experiment 4 investigated effects of providing aids to calculate probabilities, specifically, access to expert advice and 2 simulation tools. With these aids, estimates were accurate and decisions varied appropriately with economic consequences. We discuss implications by considering when additive decision rules are dysfunctional, the interpretation of overconfidence based on contest-entry behavior, and the use of aids to help people make better decisions.

  15. Probability theory for 3-layer remote sensing radiative transfer model: univariate case.

    PubMed

    Ben-David, Avishai; Davidson, Charles E

    2012-04-23

    A probability model for a 3-layer radiative transfer model (foreground layer, cloud layer, background layer, and an external source at the end of line of sight) has been developed. The 3-layer model is fundamentally important as the primary physical model in passive infrared remote sensing. The probability model is described by the Johnson family of distributions that are used as a fit for theoretically computed moments of the radiative transfer model. From the Johnson family we use the SU distribution that can address a wide range of skewness and kurtosis values (in addition to addressing the first two moments, mean and variance). In the limit, SU can also describe lognormal and normal distributions. With the probability model one can evaluate the potential for detecting a target (vapor cloud layer), the probability of observing thermal contrast, and evaluate performance (receiver operating characteristics curves) in clutter-noise limited scenarios. This is (to our knowledge) the first probability model for the 3-layer remote sensing geometry that treats all parameters as random variables and includes higher-order statistics. © 2012 Optical Society of America
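
    For readers unfamiliar with the Johnson SU family, the snippet below shows how such a distribution exposes its first four moments and a threshold-exceedance probability in scipy; the parameter values are arbitrary placeholders, not the paper's moment-fitted values.

    ```python
    from scipy import stats

    # Johnson SU in scipy's parameterization (shape parameters a, b, plus loc and scale);
    # the numbers are assumed for illustration only.
    radiance = stats.johnsonsu(a=-1.2, b=1.5, loc=0.0, scale=1.0)

    mean, var, skew, kurt = radiance.stats(moments="mvsk")
    print(mean, var, skew, kurt)          # the first four moments implied by these parameters
    print(radiance.sf(2.0))               # probability of exceeding a detection threshold of 2.0
    ```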

  16. Fixation Probability in a Haploid-Diploid Population.

    PubMed

    Bessho, Kazuhiro; Otto, Sarah P

    2017-01-01

    Classical population genetic theory generally assumes either a fully haploid or fully diploid life cycle. However, many organisms exhibit more complex life cycles, with both free-living haploid and diploid stages. Here we ask what the probability of fixation is for selected alleles in organisms with haploid-diploid life cycles. We develop a genetic model that considers the population dynamics using both the Moran model and Wright-Fisher model. Applying a branching process approximation, we obtain an accurate fixation probability assuming that the population is large and the net effect of the mutation is beneficial. We also find the diffusion approximation for the fixation probability, which is accurate even in small populations and for deleterious alleles, as long as selection is weak. These fixation probabilities from branching process and diffusion approximations are similar when selection is weak for beneficial mutations that are not fully recessive. In many cases, particularly when one phase predominates, the fixation probability differs substantially for haploid-diploid organisms compared to either fully haploid or diploid species. Copyright © 2017 by the Genetics Society of America.
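
    For orientation, the sketch below evaluates the two classical single-ploidy baselines that such haploid-diploid results generalize (these are not the paper's own formulas): Haldane's branching-process approximation and Kimura's diffusion approximation for a new semidominant beneficial mutation, with assumed values of N and s.

    ```python
    import math

    # Classical baselines for the fixation probability of a single new beneficial copy in a
    # diploid Wright-Fisher population of size N with selection coefficient s (assumed values).
    N, s = 1000, 0.01
    p0 = 1.0 / (2 * N)                       # initial frequency of one new copy

    branching = 2 * s                        # Haldane's approximation (small s, large N)
    diffusion = (1 - math.exp(-4 * N * s * p0)) / (1 - math.exp(-4 * N * s))   # Kimura's formula

    print(branching, diffusion)              # ~0.020 vs ~0.0198
    ```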

  17. Six-dimensional quantum dynamics study for the dissociative adsorption of DCl on Au(111) surface

    NASA Astrophysics Data System (ADS)

    Liu, Tianhui; Fu, Bina; Zhang, Dong H.

    2014-04-01

    We carried out six-dimensional quantum dynamics calculations for the dissociative adsorption of deuterium chloride (DCl) on the Au(111) surface using the initial state-selected time-dependent wave packet approach. The four-dimensional dissociation probabilities are also obtained with the center of mass of DCl fixed at various sites. These calculations were all performed based on an accurate potential energy surface recently constructed by neural network fitting to density functional theory energy points. The origin of the extremely small dissociation probability for DCl/HCl (v = 0, j = 0) fixed at the top site compared to other fixed sites is elucidated in this study. The influence of vibrational excitation and rotational orientation of DCl on the reactivity was investigated by calculating six-dimensional dissociation probabilities. The vibrational excitation of DCl enhances the reactivity substantially and the helicopter orientation yields higher dissociation probability than the cartwheel orientation. The site-averaged dissociation probability over 25 fixed sites obtained from four-dimensional quantum dynamics calculations can accurately reproduce the six-dimensional dissociation probability.

  18. Six-dimensional quantum dynamics study for the dissociative adsorption of DCl on Au(111) surface

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Tianhui; Fu, Bina, E-mail: bina@dicp.ac.cn; Zhang, Dong H., E-mail: zhangdh@dicp.ac.cn

    We carried out six-dimensional quantum dynamics calculations for the dissociative adsorption of deuterium chloride (DCl) on the Au(111) surface using the initial state-selected time-dependent wave packet approach. The four-dimensional dissociation probabilities are also obtained with the center of mass of DCl fixed at various sites. These calculations were all performed based on an accurate potential energy surface recently constructed by neural network fitting to density functional theory energy points. The origin of the extremely small dissociation probability for DCl/HCl (v = 0, j = 0) fixed at the top site compared to other fixed sites is elucidated in this study. The influence of vibrational excitation and rotational orientation of DCl on the reactivity was investigated by calculating six-dimensional dissociation probabilities. The vibrational excitation of DCl enhances the reactivity substantially and the helicopter orientation yields higher dissociation probability than the cartwheel orientation. The site-averaged dissociation probability over 25 fixed sites obtained from four-dimensional quantum dynamics calculations can accurately reproduce the six-dimensional dissociation probability.

  19. The double copy: gravity from gluons

    NASA Astrophysics Data System (ADS)

    White, C. D.

    2018-04-01

    Three of the four fundamental forces in nature are described by so-called gauge theories, which include the effects of both relativity and quantum mechanics. Gravity, on the other hand, is described by General Relativity, and the lack of a well-behaved quantum theory - believed to be relevant at the centre of black holes, and at the Big Bang itself - remains a notorious unsolved problem. Recently a new correspondence, the double copy, has been discovered between scattering amplitudes (quantities related to the probability for particles to interact) in gravity, and their gauge theory counterparts. This has subsequently been extended to other quantities, providing gauge theory analogues of e.g. black holes. We here review current research on the double copy, and describe some possible applications.

  20. Stochastic theory of fatigue corrosion

    NASA Astrophysics Data System (ADS)

    Hu, Haiyun

    1999-10-01

    A stochastic theory of corrosion has been constructed. The stochastic equations are described, giving the transportation corrosion rate and the fluctuation corrosion coefficient. In addition, the pit diameter distribution function, the average pit diameter and the most probable pit diameter, together with other related empirical formulas, have been derived. In order to clarify the effect of stress range on the initiation and growth behaviour of pitting corrosion, round smooth specimens were tested under cyclic loading in 3.5% NaCl solution.

  1. Detection of digital FSK using a phase-locked loop

    NASA Technical Reports Server (NTRS)

    Lindsey, W. C.; Simon, M. K.

    1975-01-01

    A theory is presented for the design of a digital FSK receiver which employs a phase-locked loop to set up the desired matched filter as the arriving signal frequency switches. The developed mathematical model makes it possible to establish the error probability performance of systems which employ a class of digital FM modulations. The noise mechanism which accounts for decision errors is modeled on the basis of the Meyr distribution and renewal Markov process theory.

  2. A General Instance-Based Learning Framework for Studying Intuitive Decision-Making in a Cognitive Architecture (Open Access, Publisher’s Version)

    DTIC Science & Technology

    2014-06-19

    decision theory (Berger, 1985), and quantum probability theory (Busemeyer, Pothos, Franco, & Trueblood, 2011). Similarly, explanations in the ... (strengths of associations) are consciously inaccessible and constitute the implicit knowledge of the model (Gonzalez & Lebiere, 2005; Lebiere, Wallach, ...) ... in memory (Lebiere et al., 2013). It is important to note that this wholly implicit process is not consciously available to the model. The second level ...

  3. Thermodynamically efficient solar concentrators

    NASA Astrophysics Data System (ADS)

    Winston, Roland

    2012-10-01

    Non-imaging Optics is the theory of thermodynamically efficient optics and as such depends more on thermodynamics than on optics. Hence in this paper a condition for the "best" design is proposed based on purely thermodynamic arguments, which we believe has profound consequences for design of thermal and even photovoltaic systems. This new way of looking at the problem of efficient concentration depends on probabilities, the ingredients of entropy and information theory while "optics" in the conventional sense recedes into the background.

  4. Applying generalized stochastic Petri nets to manufacturing systems containing nonexponential transition functions

    NASA Technical Reports Server (NTRS)

    Watson, James F., III; Desrochers, Alan A.

    1991-01-01

    Generalized stochastic Petri nets (GSPNs) are applied to flexible manufacturing systems (FMSs). Throughput subnets and s-transitions are presented. Two FMS examples containing nonexponential distributions which were analyzed in previous papers by queuing theory and probability theory, respectively, are treated using GSPNs developed using throughput subnets and s-transitions. The GSPN results agree with the previous results, and developing and analyzing the GSPN models are straightforward and relatively easy compared to other methodologies.

  5. Experimental non-classicality of an indivisible quantum system.

    PubMed

    Lapkiewicz, Radek; Li, Peizhe; Schaeff, Christoph; Langford, Nathan K; Ramelow, Sven; Wieśniak, Marcin; Zeilinger, Anton

    2011-06-22

    In contrast to classical physics, quantum theory demands that not all properties can be simultaneously well defined; the Heisenberg uncertainty principle is a manifestation of this fact. Alternatives have been explored--notably theories relying on joint probability distributions or non-contextual hidden-variable models, in which the properties of a system are defined independently of their own measurement and any other measurements that are made. Various deep theoretical results imply that such theories are in conflict with quantum mechanics. Simpler cases demonstrating this conflict have been found and tested experimentally with pairs of quantum bits (qubits). Recently, an inequality satisfied by non-contextual hidden-variable models and violated by quantum mechanics for all states of two qubits was introduced and tested experimentally. A single three-state system (a qutrit) is the simplest system in which such a contradiction is possible; moreover, the contradiction cannot result from entanglement between subsystems, because such a three-state system is indivisible. Here we report an experiment with single photonic qutrits which provides evidence that no joint probability distribution describing the outcomes of all possible measurements--and, therefore, no non-contextual theory--can exist. Specifically, we observe a violation of the Bell-type inequality found by Klyachko, Can, Binicioğlu and Shumovsky. Our results illustrate a deep incompatibility between quantum mechanics and classical physics that cannot in any way result from entanglement.

  6. Geotechnical parameter spatial distribution stochastic analysis based on multi-precision information assimilation

    NASA Astrophysics Data System (ADS)

    Wang, C.; Rubin, Y.

    2014-12-01

    The spatial distribution of an important geotechnical parameter, the compression modulus Es, contributes considerably to the understanding of the underlying geological processes and to an adequate assessment of the effects of Es on the differential settlement of large continuous structure foundations. These analyses should be derived using an assimilating approach that combines in-situ static cone penetration tests (CPT) with borehole experiments. To achieve such a task, the Es distribution of a stratum of silty clay in region A of the China Expo Center (Shanghai) is studied using the Bayesian-maximum entropy method. This method integrates rigorously and efficiently multi-precision data from different geotechnical investigations and sources of uncertainty. Single CPT samplings were modeled as a rational probability density curve by maximum entropy theory. The spatial prior multivariate probability density function (PDF) and the likelihood PDF of the CPT positions were built from the borehole experiments and the potential value of the prediction point; then, after numerical integration over the CPT probability density curves, the posterior probability density curve of the prediction point was calculated by the Bayesian reverse interpolation framework. The results were compared between the Gaussian Sequential Stochastic Simulation and Bayesian methods. The differences between single CPT samplings with a normal distribution and the simulated probability density curve based on maximum entropy theory were also discussed. It is shown that the study of Es spatial distributions can be improved by properly incorporating CPT sampling variation into the interpolation process, whereas more informative estimations are generated by considering CPT uncertainty at the estimation points. The calculation illustrates the significance of stochastic Es characterization in a stratum, and identifies limitations associated with inadequate geostatistical interpolation techniques. These characterization results will provide a multi-precision information assimilation method for other geotechnical parameters.

  7. The role of demographic compensation theory in incidental take assessments for endangered species

    USGS Publications Warehouse

    McGowan, Conor P.; Ryan, Mark R.; Runge, Michael C.; Millspaugh, Joshua J.; Cochrane, Jean Fitts

    2011-01-01

    Many endangered species laws provide exceptions to legislated prohibitions through incidental take provisions as long as take is the result of unintended consequences of an otherwise legal activity. These allowances presumably invoke the theory of demographic compensation, commonly applied to harvested species, by allowing limited harm as long as the probability of the species' survival or recovery is not reduced appreciably. Demographic compensation requires some density-dependent limits on survival or reproduction in a species' annual cycle that can be alleviated through incidental take. Using a population model for piping plovers in the Great Plains, we found that when the population is in rapid decline or when there is no density dependence, the probability of quasi-extinction increased linearly with increasing take. However, when the population is near stability and subject to density-dependent survival, there was no relationship between quasi-extinction probability and take rates. We note however, that a brief examination of piping plover demography and annual cycles suggests little room for compensatory capacity. We argue that a population's capacity for demographic compensation of incidental take should be evaluated when considering incidental allowances because compensation is the only mechanism whereby a population can absorb the negative effects of take without incurring a reduction in the probability of survival in the wild. With many endangered species there is probably little known about density dependence and compensatory capacity. Under these circumstances, using multiple system models (with and without compensation) to predict the population's response to incidental take and implementing follow-up monitoring to assess species response may be valuable in increasing knowledge and improving future decision making.

  8. Gas Hydrate Formation Probability Distributions: The Effect of Shear and Comparisons with Nucleation Theory.

    PubMed

    May, Eric F; Lim, Vincent W; Metaxas, Peter J; Du, Jianwei; Stanwix, Paul L; Rowland, Darren; Johns, Michael L; Haandrikman, Gert; Crosby, Daniel; Aman, Zachary M

    2018-03-13

    Gas hydrate formation is a stochastic phenomenon of considerable significance for any risk-based approach to flow assurance in the oil and gas industry. In principle, well-established results from nucleation theory offer the prospect of predictive models for hydrate formation probability in industrial production systems. In practice, however, heuristics are relied on when estimating formation risk for a given flowline subcooling or when quantifying kinetic hydrate inhibitor (KHI) performance. Here, we present statistically significant measurements of formation probability distributions for natural gas hydrate systems under shear, which are quantitatively compared with theoretical predictions. Distributions with over 100 points were generated using low-mass, Peltier-cooled pressure cells, cycled in temperature between 40 and -5 °C at up to 2 K·min⁻¹ and analyzed with robust algorithms that automatically identify hydrate formation and initial growth rates from dynamic pressure data. The application of shear had a significant influence on the measured distributions: at 700 rpm mass-transfer limitations were minimal, as demonstrated by the kinetic growth rates observed. The formation probability distributions measured at this shear rate had mean subcoolings consistent with theoretical predictions and steel-hydrate-water contact angles of 14-26°. However, the experimental distributions were substantially wider than predicted, suggesting that phenomena acting on macroscopic length scales are responsible for much of the observed stochastic formation. Performance tests of a KHI provided new insights into how such chemicals can reduce the risk of hydrate blockage in flowlines. Our data demonstrate that the KHI not only reduces the probability of formation (by both shifting and sharpening the distribution) but also reduces hydrate growth rates by a factor of 2.

  9. An efficient algorithm to compute marginal posterior genotype probabilities for every member of a pedigree with loops

    PubMed Central

    2009-01-01

    Background: Marginal posterior genotype probabilities need to be computed for genetic analyses such as genetic counseling in humans and selective breeding in animal and plant species. Methods: In this paper, we describe a peeling-based, deterministic, exact algorithm to compute efficiently genotype probabilities for every member of a pedigree with loops without recourse to junction-tree methods from graph theory. The efficiency in computing the likelihood by peeling comes from storing intermediate results in multidimensional tables called cutsets. Computing marginal genotype probabilities for individual i requires recomputing the likelihood for each of the possible genotypes of individual i. This can be done efficiently by storing intermediate results in two types of cutsets called anterior and posterior cutsets and reusing these intermediate results to compute the likelihood. Examples: A small example is used to illustrate the theoretical concepts discussed in this paper, and marginal genotype probabilities are computed at a monogenic disease locus for every member in a real cattle pedigree. PMID:19958551
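
    To make concrete what is being computed (though not how peeling computes it efficiently), the sketch below obtains marginal posterior genotype probabilities for a father-mother-child trio at a biallelic disease locus by brute-force enumeration; the allele frequency, penetrances and phenotypes are all invented for the example.

    ```python
    from itertools import product

    genotypes = ("AA", "Aa", "aa")
    q = 0.1                                              # frequency of the disease allele 'a' (assumed)
    hw = {"AA": (1-q)**2, "Aa": 2*q*(1-q), "aa": q**2}   # founder priors (Hardy-Weinberg)
    pen = {"AA": 0.01, "Aa": 0.01, "aa": 0.80}           # P(affected | genotype), assumed
    phen = {"father": 0, "mother": 0, "child": 1}        # observed phenotypes, 1 = affected

    def mendel(child, father, mother):                   # P(child genotype | parental genotypes)
        pf, pm = father.count("a") / 2.0, mother.count("a") / 2.0
        return {"aa": pf*pm, "Aa": pf*(1-pm) + (1-pf)*pm, "AA": (1-pf)*(1-pm)}[child]

    def phen_lik(member, gt):
        return pen[gt] if phen[member] == 1 else 1.0 - pen[gt]

    posterior = {m: dict.fromkeys(genotypes, 0.0) for m in phen}
    for gf, gm, gc in product(genotypes, repeat=3):      # enumerate every joint genotype assignment
        joint = (hw[gf] * hw[gm] * mendel(gc, gf, gm)
                 * phen_lik("father", gf) * phen_lik("mother", gm) * phen_lik("child", gc))
        posterior["father"][gf] += joint
        posterior["mother"][gm] += joint
        posterior["child"][gc] += joint

    for member, dist in posterior.items():
        z = sum(dist.values())
        print(member, {g: round(p / z, 3) for g, p in dist.items()})
    ```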

  10. Rethinking the learning of belief network probabilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Musick, R.

    Belief networks are a powerful tool for knowledge discovery that provide concise, understandable probabilistic models of data. There are methods grounded in probability theory to incrementally update the relationships described by the belief network when new information is seen, to perform complex inferences over any set of variables in the data, to incorporate domain expertise and prior knowledge into the model, and to automatically learn the model from data. This paper concentrates on part of the belief network induction problem, that of learning the quantitative structure (the conditional probabilities), given the qualitative structure. In particular, the current practice of rote learning the probabilities in belief networks can be significantly improved upon. We advance the idea of applying any learning algorithm to the task of conditional probability learning in belief networks, discuss potential benefits, and show results of applying neural networks and other algorithms to a medium sized car insurance belief network. The results demonstrate from 10 to 100% improvements in model error rates over the current approaches.
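
    A minimal sketch of replacing rote probability assignment with a smoothed maximum-likelihood estimate of a conditional probability table from data counts; the tiny Rain -> WetGrass fragment and its data are invented, and this is not the paper's neural-network approach.

    ```python
    import numpy as np

    data = np.array([                        # columns: Rain, WetGrass (0/1); made-up observations
        [1, 1], [1, 1], [1, 0], [0, 0],
        [0, 0], [0, 1], [1, 1], [0, 0],
    ])

    alpha = 1.0                              # Laplace pseudo-count (acts as a simple prior)
    cpt = np.zeros((2, 2))                   # cpt[r, w] = P(WetGrass = w | Rain = r)
    for r in (0, 1):
        rows = data[data[:, 0] == r]
        counts = np.bincount(rows[:, 1], minlength=2) + alpha
        cpt[r] = counts / counts.sum()

    print(cpt)
    ```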

  11. Quantum-Bayesian coherence

    NASA Astrophysics Data System (ADS)

    Fuchs, Christopher A.; Schack, Rüdiger

    2013-10-01

    In the quantum-Bayesian interpretation of quantum theory (or QBism), the Born rule cannot be interpreted as a rule for setting measurement-outcome probabilities from an objective quantum state. But if not, what is the role of the rule? In this paper, the argument is given that it should be seen as an empirical addition to Bayesian reasoning itself. Particularly, it is shown how to view the Born rule as a normative rule in addition to usual Dutch-book coherence. It is a rule that takes into account how one should assign probabilities to the consequences of various intended measurements on a physical system, but explicitly in terms of prior probabilities for and conditional probabilities consequent upon the imagined outcomes of a special counterfactual reference measurement. This interpretation is exemplified by representing quantum states in terms of probabilities for the outcomes of a fixed, fiducial symmetric informationally complete measurement. The extent to which the general form of the new normative rule implies the full state-space structure of quantum mechanics is explored.
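
    For concreteness, the Born rule rewritten with respect to a fiducial SIC reference measurement is usually quoted in the QBism literature in a form along the following lines (notation here is ours): d is the Hilbert-space dimension, P(H_i) are the probabilities assigned to the d^2 outcomes of the counterfactual reference measurement, and P(D_j | H_i) are the conditionals for the measurement actually performed.

    ```latex
    q(D_j) \;=\; \sum_{i=1}^{d^2} \left[(d+1)\,P(H_i) - \frac{1}{d}\right] P(D_j \mid H_i)
    ```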

  12. A quantum probability framework for human probabilistic inference.

    PubMed

    Trueblood, Jennifer S; Yearsley, James M; Pothos, Emmanuel M

    2017-09-01

    There is considerable variety in human inference (e.g., a doctor inferring the presence of a disease, a juror inferring the guilt of a defendant, or someone inferring future weight loss based on diet and exercise). As such, people display a wide range of behaviors when making inference judgments. Sometimes, people's judgments appear Bayesian (i.e., normative), but in other cases, judgments deviate from the normative prescription of classical probability theory. How can we combine both Bayesian and non-Bayesian influences in a principled way? We propose a unified explanation of human inference using quantum probability theory. In our approach, we postulate a hierarchy of mental representations, from 'fully' quantum to 'fully' classical, which could be adopted in different situations. In our hierarchy of models, moving from the lowest level to the highest involves changing assumptions about compatibility (i.e., how joint events are represented). Using results from 3 experiments, we show that our modeling approach explains 5 key phenomena in human inference including order effects, reciprocity (i.e., the inverse fallacy), memorylessness, violations of the Markov condition, and antidiscounting. As far as we are aware, no existing theory or model can explain all 5 phenomena. We also explore transitions in our hierarchy, examining how representations change from more quantum to more classical. We show that classical representations provide a better account of data as individuals gain familiarity with a task. We also show that representations vary between individuals, in a way that relates to a simple measure of cognitive style, the Cognitive Reflection Test. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  13. Microwave backscattering theory and active remote sensing of the ocean surface

    NASA Technical Reports Server (NTRS)

    Brown, G. S.; Miller, L. S.

    1977-01-01

    The status of electromagnetic scattering theory relative to the interpretation of microwave remote sensing data acquired from spaceborne platforms over the ocean surface is reviewed. Particular emphasis is given to the assumptions which are either implicit or explicit in the theory. The multiple scale scattering theory developed during this investigation is extended to non-Gaussian surface statistics. It is shown that the important statistic for this case is the probability density function of the small scale heights conditioned on the large scale slopes; this dependence may explain the anisotropic scattering measurements recently obtained with the AAFE Radscat. It is noted that present surface measurements are inadequate to verify or reject the existing scattering theories. Surface measurements are recommended for qualifying sensor data from radar altimeters and scatterometers. Additional scattering investigations are suggested for imaging type radars employing synthetically generated apertures.

  14. The theory precision analyse of RFM localization of satellite remote sensing imagery

    NASA Astrophysics Data System (ADS)

    Zhang, Jianqing; Xv, Biao

    2009-11-01

    The traditional method of assessing the precision of the Rational Function Model (RFM) is to use a great number of check points, calculating the mean square error by comparing computed coordinates with known coordinates. This method comes from probability theory: it statistically estimates the mean square error from a large number of samples, and the estimate can be regarded as approaching the true value when the sample is sufficiently large. This paper instead starts from the standpoint of survey adjustment, takes the law of propagation of error as its theoretical basis, and calculates the theoretical precision of RFM localization. SPOT5 three-line array imagery is then used as experimental data, and the results of the traditional method and of the method described in this paper are compared. The comparison confirms that the traditional method is feasible and answers the question of its theoretical precision from the standpoint of survey adjustment.

  15. The general theory of relativity - Why 'It is probably the most beautiful of all existing theories'

    NASA Astrophysics Data System (ADS)

    Chandrasekhar, S.

    1984-03-01

    An attempt is made to objectively evaluate the frequent judgment of Einstein's general theory of relativity, by such distinguished physicists as Pauli (1921), Dirac, Born, and Rutherford, as 'beautiful' and 'a work of art'. The criteria applied are that of Francis Bacon ('There is no excellent beauty that hath not some strangeness in the proportions') and that of Heisenberg ('Beauty is the proper conformity of the parts to one another and to the whole'). The strangeness in the proportions of the theory of general relativity consists in its relating, through juxtaposition, the concepts of space and time and those of matter and motion; these had previously been considered entirely independent. The criterion of 'conformity' is illustrated through the directness with which the theory allows the description of black holes.

  16. Operational formulation of time reversal in quantum theory

    NASA Astrophysics Data System (ADS)

    Oreshkov, Ognyan; Cerf, Nicolas J.

    2015-10-01

    The symmetry of quantum theory under time reversal has long been a subject of controversy because the transition probabilities given by Born’s rule do not apply backward in time. Here, we resolve this problem within a rigorous operational probabilistic framework. We argue that reconciling time reversal with the probabilistic rules of the theory requires a notion of operation that permits realizations through both pre- and post-selection. We develop the generalized formulation of quantum theory that stems from this approach and give a precise definition of time-reversal symmetry, emphasizing a previously overlooked distinction between states and effects. We prove an analogue of Wigner’s theorem, which characterizes all allowed symmetry transformations in this operationally time-symmetric quantum theory. Remarkably, we find larger classes of symmetry transformations than previously assumed, suggesting a possible direction in the search for extensions of known physics.

  17. Critical 2-D Percolation: Crossing Probabilities, Modular Forms and Factorization

    NASA Astrophysics Data System (ADS)

    Kleban, Peter

    2007-03-01

    We first consider crossing probabilities in critical 2-D percolation in rectangular geometries, derived via conformal field theory. These quantities are shown to exhibit interesting modular behavior [1], although the physical meaning of modular transformations in this context is not clear. We show that in many cases these functions are completely characterized by very simple transformation properties. In particular, Cardy's function for the percolation crossing probability (including the conformal dimension 1/3) follows from a simple modular argument. We next consider the probability of crossing between various points for percolation in the upper half-plane. For two points, with the point x on the edge of the system, the probability is P(x,z) = k₁ y^{5/48} φ(x,z)^{1/3}, where φ is the potential at z of a 2-D dipole located at x, and k₁ is a non-universal constant. For three points, one finds the exact and universal factorization [2,3] P(x₁,x₂,z) = C √(P(x₁,z) P(x₂,z) P(x₁,x₂)), with C = 8√2 π^{5/2} / (3^{3/4} Γ(1/3)^{9/2}). These results are calculated by use of conformal field theory. Computer simulations verify them very precisely. Furthermore, simulations show that the same factorization holds asymptotically, with the same value of C, when one or both of the points x_i are moved from the edge into the bulk. [1] Peter Kleban and Don Zagier, Crossing probabilities and modular forms, J. Stat. Phys. 113, 431-454 (2003) [arXiv: math-ph/0209023]. [2] Peter Kleban, Jacob J. H. Simmons, and Robert M. Ziff, Anchored critical percolation clusters and 2-d electrostatics, Phys. Rev. Lett. 97, 115702 (2006) [arXiv: cond-mat/0605120]. [3] Jacob J. H. Simmons and Peter Kleban, in preparation.
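
    A quick numerical check of the factorization constant as reconstructed above:

    ```python
    import math

    # C = 8*sqrt(2) * pi^(5/2) / (3^(3/4) * Gamma(1/3)^(9/2)), as written in the abstract above.
    C = 8 * math.sqrt(2) * math.pi**2.5 / (3**0.75 * math.gamma(1 / 3)**4.5)
    print(C)   # approximately 1.030
    ```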

  18. Process, System, Causality, and Quantum Mechanics: A Psychoanalysis of Animal Faith

    NASA Astrophysics Data System (ADS)

    Etter, Tom; Noyes, H. Pierre

    We shall argue in this paper that a central piece of modern physics does not really belong to physics at all but to elementary probability theory. Given a joint probability distribution J on a set of random variables containing x and y, define a link between x and y to be the condition x=y on J. Define the state D of a link x=y as the joint probability distribution matrix on x and y without the link. The two core laws of quantum mechanics are the Born probability rule, and the unitary dynamical law whose best known form is the Schrödinger equation. Von Neumann formulated these two laws in the language of Hilbert space as prob(P) = trace(PD) and D'T = TD respectively, where P is a projection, D and D' are (von Neumann) density matrices, and T is a unitary transformation. We'll see that if we regard link states as density matrices, the algebraic forms of these two core laws occur as completely general theorems about links. When we extend probability theory by allowing cases to count negatively, we find that the Hilbert space framework of quantum mechanics proper emerges from the assumption that all D's are symmetrical in rows and columns. On the other hand, Markovian systems emerge when we assume that one of every linked variable pair has a uniform probability distribution. By representing quantum and Markovian structure in this way, we see clearly both how they differ, and also how they can coexist in natural harmony with each other, as they must in quantum measurement, which we'll examine in some detail. Looking beyond quantum mechanics, we see how both structures have their special places in a much larger continuum of formal systems that we have yet to look for in nature.

  19. Monte Carlo Perturbation Theory Estimates of Sensitivities to System Dimensions

    DOE PAGES

    Burke, Timothy P.; Kiedrowski, Brian C.

    2017-12-11

    Here, Monte Carlo methods are developed using adjoint-based perturbation theory and the differential operator method to compute the sensitivities of the k-eigenvalue, linear functions of the flux (reaction rates), and bilinear functions of the forward and adjoint flux (kinetics parameters) to system dimensions for uniform expansions or contractions. The calculation of sensitivities to system dimensions requires computing scattering and fission sources at material interfaces using collisions occurring at the interface, which is a set of events with infinitesimal probability. Kernel density estimators are used to estimate the source at interfaces using collisions occurring near the interface. The methods for computing sensitivities of linear and bilinear ratios are derived using the differential operator method and adjoint-based perturbation theory and are shown to be equivalent to methods previously developed using a collision history-based approach. The methods for determining sensitivities to system dimensions are tested on a series of fast, intermediate, and thermal critical benchmarks as well as a pressurized water reactor benchmark problem with iterated fission probability used for adjoint-weighting. The estimators are shown to agree within 5% and 3σ of reference solutions obtained using direct perturbations with central differences for the majority of test problems.

  20. Enhanced peculiar velocities in brane-induced gravity

    NASA Astrophysics Data System (ADS)

    Wyman, Mark; Khoury, Justin

    2010-08-01

    The mounting evidence for anomalously large peculiar velocities in our Universe presents a challenge for the ΛCDM paradigm. The recent estimates of the large-scale bulk flow by Watkins et al. are inconsistent at the nearly 3σ level with ΛCDM predictions. Meanwhile, Lee and Komatsu have recently estimated that the occurrence of high-velocity merging systems such as the bullet cluster (1E0657-57) is unlikely at a 6.5-5.8σ level, with an estimated probability between 3.3×10⁻¹¹ and 3.6×10⁻⁹ in ΛCDM cosmology. We show that these anomalies are alleviated in a broad class of infrared-modified gravity theories, called brane-induced gravity, in which gravity becomes higher-dimensional at ultralarge distances. These theories include additional scalar forces that enhance gravitational attraction and therefore speed up structure formation at late times and on sufficiently large scales. The peculiar velocities are enhanced by 24-34% compared to standard gravity, with the maximal enhancement nearly consistent at the 2σ level with bulk flow observations. The occurrence of the bullet cluster in these theories is ≈10⁴ times more probable than in ΛCDM cosmology.

  1. Enhanced peculiar velocities in brane-induced gravity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wyman, Mark; Khoury, Justin

    The mounting evidence for anomalously large peculiar velocities in our Universe presents a challenge for the ΛCDM paradigm. The recent estimates of the large-scale bulk flow by Watkins et al. are inconsistent at the nearly 3σ level with ΛCDM predictions. Meanwhile, Lee and Komatsu have recently estimated that the occurrence of high-velocity merging systems such as the bullet cluster (1E0657-57) is unlikely at a 6.5-5.8σ level, with an estimated probability between 3.3×10⁻¹¹ and 3.6×10⁻⁹ in ΛCDM cosmology. We show that these anomalies are alleviated in a broad class of infrared-modified gravity theories, called brane-induced gravity, in which gravity becomes higher-dimensional at ultralarge distances. These theories include additional scalar forces that enhance gravitational attraction and therefore speed up structure formation at late times and on sufficiently large scales. The peculiar velocities are enhanced by 24-34% compared to standard gravity, with the maximal enhancement nearly consistent at the 2σ level with bulk flow observations. The occurrence of the bullet cluster in these theories is ≈10⁴ times more probable than in ΛCDM cosmology.

  2. Theory of rotational transition in atom-diatom chemical reaction

    NASA Astrophysics Data System (ADS)

    Nakamura, Masato; Nakamura, Hiroki

    1989-05-01

    Rotational transition in atom-diatom chemical reactions is theoretically studied. A new approximate theory (which we call the IOS-DW approximation) is proposed on the basis of the physical idea that rotational transition in reaction is induced by the following two different mechanisms: rotationally inelastic half collisions in both the initial and final arrangement channels, and coordinate transformation in the reaction zone. This theory gives a fairly compact expression for the state-to-state transition probability. Introducing the additional, physically reasonable assumption that reaction (particle rearrangement) takes place in a spatially localized region, we have reduced this expression into a simpler analytical form which can explicitly give the overall rotational state distribution in reaction. Numerical application was made to the H+H2 reaction and demonstrated the effectiveness and simplicity of the theory. A further simplified, most naive approximation, i.e., the independent-events approximation, was also proposed and demonstrated to work well in the test calculation of H+H2. The overall rotational state distribution is expressed simply by a product sum of the transition probabilities for the three consecutive processes in reaction: inelastic transition in the initial half collision, transition due to particle rearrangement, and inelastic transition in the final half collision.
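
    The independent-events picture lends itself to a one-line linear-algebra illustration: chain one transition-probability matrix for each of the three consecutive processes. The matrices below are arbitrary row-stochastic placeholders, not quantities computed from H+H2 dynamics.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def random_stochastic(n):                            # placeholder row-stochastic matrix
        m = rng.random((n, n))
        return m / m.sum(axis=1, keepdims=True)

    P_initial, P_rearrange, P_final = (random_stochastic(5) for _ in range(3))

    P_total = P_initial @ P_rearrange @ P_final          # overall j -> j'' transition probabilities
    print(np.eye(5)[0] @ P_total)                        # rotational distribution starting from j = 0
    ```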

  3. A random walk rule for phase I clinical trials.

    PubMed

    Durham, S D; Flournoy, N; Rosenberger, W F

    1997-06-01

    We describe a family of random walk rules for the sequential allocation of dose levels to patients in a dose-response study, or phase I clinical trial. Patients are sequentially assigned the next higher, same, or next lower dose level according to some probability distribution, which may be determined by ethical considerations as well as the patient's response. It is shown that one can choose these probabilities in order to center dose level assignments unimodally around any target quantile of interest. Estimation of the quantile is discussed; the maximum likelihood estimator and its variance are derived under a two-parameter logistic distribution, and the maximum likelihood estimator is compared with other nonparametric estimators. Random walk rules have clear advantages: they are simple to implement, and finite and asymptotic distribution theory is completely worked out. For a specific random walk rule, we compute finite and asymptotic properties and give examples of its use in planning studies. Having the finite distribution theory available and tractable obviates the need for elaborate simulation studies to analyze the properties of the design. The small sample properties of our rule, as determined by exact theory, compare favorably to those of the continual reassessment method, determined by simulation.
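
    As an illustration of one member of this family, the sketch below simulates a biased-coin up-and-down rule that targets the dose whose toxicity probability is Gamma; the dose-toxicity curve is invented, and this is a schematic of the general idea rather than the specific rule analyzed in the paper.

    ```python
    import random

    random.seed(2)
    tox_prob = [0.05, 0.10, 0.20, 0.33, 0.50, 0.70]    # assumed true toxicity probability per dose level
    Gamma = 0.33                                        # target quantile
    b = Gamma / (1 - Gamma)                             # up-move probability after a non-toxic response

    level, assignments = 0, []
    for _ in range(200):                                # 200 sequentially enrolled patients
        assignments.append(level)
        toxic = random.random() < tox_prob[level]
        if toxic:
            level = max(level - 1, 0)                   # step down after a toxicity
        elif random.random() < b:
            level = min(level + 1, len(tox_prob) - 1)   # step up with probability b, otherwise stay
    print({d: assignments.count(d) for d in range(len(tox_prob))})
    ```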

  4. Balloon borne Antarctic frost point measurements and their impact on polar stratospheric cloud theories

    NASA Technical Reports Server (NTRS)

    Rosen, James M.; Hofmann, D. J.; Carpenter, J. R.; Harder, J. W.; Oltmans, S. J.

    1988-01-01

    The first balloon-borne frost point measurements over Antarctica were made during September and October, 1987 as part of the NOZE 2 effort at McMurdo. The results indicate water vapor mixing ratios on the order of 2 ppmv in the 15 to 20 km region, which is somewhat smaller than the typical values currently being used in polar stratospheric cloud (PSC) theories. The observed water vapor mixing ratio would correspond to saturated conditions for what is thought to be the lowest stratospheric temperatures encountered over the Antarctic. Through the use of available lidar observations there appears to be significant evidence that some PSCs form at temperatures higher than the local frost point (with respect to water) in the 10 to 20 km region, thus supporting the nitric acid theory of PSC composition. Clouds near 15 km and below appear to form in regions saturated with respect to water and thus are probably mostly ice water clouds, although they could contain relatively small amounts of other constituents. Photographic evidence suggests that the clouds forming above the frost point probably have an appearance quite different from the lower altitude iridescent, colored nacreous clouds.

  5. Statistical theory of combinatorial libraries of folding proteins: energetic discrimination of a target structure.

    PubMed

    Zou, J; Saven, J G

    2000-02-11

    A self-consistent theory is presented that can be used to estimate the number and composition of sequences satisfying a predetermined set of constraints. The theory is formulated so as to examine the features of sequences having a particular value of Δ = E_f − ⟨E_u⟩, where E_f is the energy of a sequence when in a target structure and ⟨E_u⟩ is an average energy over non-target structures. The theory yields the probabilities w_i(α) that each position i in the sequence is occupied by a particular monomer type α. The theory is applied to a simple lattice model of proteins. Excellent agreement is observed between the theory and the results of exact enumerations. The theory provides a quantitative framework for the design and interpretation of combinatorial experiments involving proteins, where a library of amino acid sequences is searched for sequences that fold to a desired structure. Copyright 2000 Academic Press.

  6. Preface of the special issue quantum foundations: information approach

    PubMed Central

    2016-01-01

    This special issue is based on the contributions of a group of top experts in quantum foundations and in quantum information and probability. It sheds light on a number of interpretational, mathematical and experimental problems of quantum theory. PMID:27091161

  7. Global warming precipitation accumulation increases above the current-climate cutoff scale

    PubMed Central

    Sahany, Sandeep; Stechmann, Samuel N.; Bernstein, Diana N.

    2017-01-01

    Precipitation accumulations, integrated over rainfall events, can be affected by both intensity and duration of the storm event. Thus, although precipitation intensity is widely projected to increase under global warming, a clear framework for predicting accumulation changes has been lacking, despite the importance of accumulations for societal impacts. Theory for changes in the probability density function (pdf) of precipitation accumulations is presented with an evaluation of these changes in global climate model simulations. We show that a simple set of conditions implies roughly exponential increases in the frequency of the very largest accumulations above a physical cutoff scale, increasing with event size. The pdf exhibits an approximately power-law range where probability density drops slowly with each order of magnitude size increase, up to a cutoff at large accumulations that limits the largest events experienced in current climate. The theory predicts that the cutoff scale, controlled by the interplay of moisture convergence variance and precipitation loss, tends to increase under global warming. Thus, precisely the large accumulations above the cutoff that are currently rare will exhibit increases in the warmer climate as this cutoff is extended. This indeed occurs in the full climate model, with a 3 °C end-of-century global-average warming yielding regional increases of hundreds of percent to >1,000% in the probability density of the largest accumulations that have historical precedents. The probabilities of unprecedented accumulations are also consistent with the extension of the cutoff. PMID:28115693
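
    The argument can be made concrete with a schematic accumulation pdf of the form p(s) ∝ s^(-τ) exp(-s/s_L): extending the cutoff s_L multiplies the probability of an accumulation of size s by a factor that grows roughly exponentially with s. The exponent and the two cutoff values below are illustrative assumptions, not the model's.

    ```python
    import numpy as np

    # Risk ratio implied by a schematic pdf p(s) ~ s**(-tau) * exp(-s / sL) when the cutoff grows;
    # tau and both cutoff values are assumed for illustration only.
    tau = 1.5
    sL_now, sL_warm = 60.0, 80.0                    # current vs warmer-climate cutoff (arbitrary units)

    s = np.array([50.0, 100.0, 200.0, 400.0])       # accumulation sizes, from near the cutoff to far above
    ratio = (s**-tau * np.exp(-s / sL_warm)) / (s**-tau * np.exp(-s / sL_now))
    for size, r in zip(s, ratio):
        print(size, round(float(r), 2))             # the increase grows roughly exponentially with size
    ```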

  8. Behavioral economic insights into physician tobacco treatment decision-making.

    PubMed

    Leone, Frank T; Evers-Casey, Sarah; Graden, Sarah; Schnoll, Robert

    2015-03-01

    Physicians self-report high adherence rates for Ask and Advise behaviors of tobacco dependence treatment but are much less likely to engage in "next steps" consistent with sophisticated management of chronic illness. A variety of potential explanations have been offered, yet each lacks face validity in light of experience with other challenging medical conditions. We conduct a preliminary exploration of the behavioral economics of tobacco treatment decision-making in the face of uncertain outcomes, seeking evidence that behaviors may be explained within the framework of Prospect Theory. Four physician cohorts were polled regarding their impressions of the utility of tobacco use treatment and their estimations of "success" probabilities. Contingent valuation was estimated by asking respondents to make monetary tradeoffs relative to three common chronic conditions. Responses from all four cohorts showed a similar pattern of high utility of tobacco use treatment but low success probability when compared with the other chronic medical conditions. Following instructional methods aimed at controverting cognitive biases related to tobacco, this pattern was reversed, with success probabilities attaining higher valuation than for diabetes. Important presuppositions regarding the potential "success" of tobacco-related patient interactions are likely limiting physician engagement by favoring the most secure visit outcome despite the limited potential for health gains. Under these conditions, low engagement rates would be consistent with Prospect Theory predictions. Interventions aimed at counteracting the cognitive biases limiting estimations of success probabilities seem to effectively reverse this pattern and provide clues to improving the adoption of target clinical behaviors.
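
    To make the probability-weighting and loss-aversion ideas concrete, the snippet below evaluates a 50-50 gamble with the standard Tversky-Kahneman one-parameter functional forms and their textbook 1992 parameter estimates; these are illustrative only and are not the valuations elicited from the physician cohorts here.

    ```python
    # Standard CPT building blocks (single weighting function used for gains and losses, for brevity).
    def weight(p, gamma=0.61):                      # inverse-S probability weighting
        return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

    def value(x, alpha=0.88, lam=2.25):             # concave for gains, steeper (loss-averse) for losses
        return x**alpha if x >= 0 else -lam * (-x)**alpha

    gain, loss = 10.0, -10.0                        # a 50-50 gamble over +/- 10 units of outcome
    cpt_value = weight(0.5) * value(gain) + weight(0.5) * value(loss)
    print(weight(0.5), cpt_value)                   # w(0.5) < 0.5, and the gamble is valued negatively
    ```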

  9. Global warming precipitation accumulation increases above the current-climate cutoff scale

    NASA Astrophysics Data System (ADS)

    Neelin, J. David; Sahany, Sandeep; Stechmann, Samuel N.; Bernstein, Diana N.

    2017-02-01

    Precipitation accumulations, integrated over rainfall events, can be affected by both intensity and duration of the storm event. Thus, although precipitation intensity is widely projected to increase under global warming, a clear framework for predicting accumulation changes has been lacking, despite the importance of accumulations for societal impacts. Theory for changes in the probability density function (pdf) of precipitation accumulations is presented with an evaluation of these changes in global climate model simulations. We show that a simple set of conditions implies roughly exponential increases in the frequency of the very largest accumulations above a physical cutoff scale, increasing with event size. The pdf exhibits an approximately power-law range where probability density drops slowly with each order of magnitude size increase, up to a cutoff at large accumulations that limits the largest events experienced in current climate. The theory predicts that the cutoff scale, controlled by the interplay of moisture convergence variance and precipitation loss, tends to increase under global warming. Thus, precisely the large accumulations above the cutoff that are currently rare will exhibit increases in the warmer climate as this cutoff is extended. This indeed occurs in the full climate model, with a 3 °C end-of-century global-average warming yielding regional increases of hundreds of percent to >1,000% in the probability density of the largest accumulations that have historical precedents. The probabilities of unprecedented accumulations are also consistent with the extension of the cutoff.

  10. Atom-counting in High Resolution Electron Microscopy: TEM or STEM - That's the question.

    PubMed

    Gonnissen, J; De Backer, A; den Dekker, A J; Sijbers, J; Van Aert, S

    2017-03-01

    In this work, a recently developed quantitative approach based on the principles of detection theory is used in order to determine the possibilities and limitations of High Resolution Scanning Transmission Electron Microscopy (HR STEM) and HR TEM for atom-counting. So far, HR STEM has been shown to be an appropriate imaging mode to count the number of atoms in a projected atomic column. Recently, it has been demonstrated that HR TEM, when using negative spherical aberration imaging, is suitable for atom-counting as well. The capabilities of both imaging techniques are investigated and compared using the probability of error as a criterion. It is shown that for the same incoming electron dose, HR STEM outperforms HR TEM under common practice standards, i.e. when the decision is based on the probability function of the peak intensities in HR TEM and of the scattering cross-sections in HR STEM. If the atom-counting decision is based on the joint probability function of the image pixel values, the dependence of all image pixel intensities as a function of thickness should be known accurately. Under this assumption, the probability of error may decrease significantly for atom-counting in HR TEM and may, in theory, become lower as compared to HR STEM under the predicted optimal experimental settings. However, the commonly used standard for atom-counting in HR STEM leads to a high performance and has been shown to work in practice.
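
    The probability-of-error criterion can be illustrated with a minimal binary hypothesis test. The sketch below assumes that the measured quantity (a scattering cross-section in HR STEM, or a peak intensity in HR TEM) for a column of n atoms is approximately Gaussian, that the priors for n and n+1 atoms are equal, and that the decision threshold sits at the midpoint of the two means; the means, noise levels, dose scaling, and function names are hypothetical and not taken from the paper.

```python
import math

def prob_error_two_hypotheses(mu_n, mu_n1, sigma):
    """Minimum probability of error for deciding between 'n atoms'
    (measurement mean mu_n) and 'n+1 atoms' (mean mu_n1) from a single
    Gaussian measurement with common standard deviation sigma, equal
    priors, and the decision threshold at the midpoint of the means."""
    d = abs(mu_n1 - mu_n) / sigma                     # separation in noise units
    return 0.5 * math.erfc(d / (2 * math.sqrt(2)))    # P_e = Q(d / 2)

# Hypothetical numbers: cross-sections (arbitrary units) for columns of
# 10 and 11 atoms, with noise shrinking as the electron dose increases.
mu_10, mu_11 = 100.0, 108.0
for dose_factor in (1, 4, 16):
    sigma = 6.0 / math.sqrt(dose_factor)              # shot-noise-like scaling
    pe = prob_error_two_hypotheses(mu_10, mu_11, sigma)
    print(f"dose x{dose_factor:>2}: P_e = {pe:.3f}")
```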

  11. Global warming precipitation accumulation increases above the current-climate cutoff scale

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Neelin, J. David; Sahany, Sandeep; Stechmann, Samuel N.

    Precipitation accumulations, integrated over rainfall events, can be affected by both intensity and duration of the storm event. Thus, although precipitation intensity is widely projected to increase under global warming, a clear framework for predicting accumulation changes has been lacking, despite the importance of accumulations for societal impacts. Theory for changes in the probability density function (pdf) of precipitation accumulations is presented with an evaluation of these changes in global climate model simulations. We show that a simple set of conditions implies roughly exponential increases in the frequency of the very largest accumulations above a physical cutoff scale, increasing with event size. The pdf exhibits an approximately power-law range where probability density drops slowly with each order of magnitude size increase, up to a cutoff at large accumulations that limits the largest events experienced in current climate. The theory predicts that the cutoff scale, controlled by the interplay of moisture convergence variance and precipitation loss, tends to increase under global warming. Thus, precisely the large accumulations above the cutoff that are currently rare will exhibit increases in the warmer climate as this cutoff is extended. This indeed occurs in the full climate model, with a 3 °C end-of-century global-average warming yielding regional increases of hundreds of percent to >1,000% in the probability density of the largest accumulations that have historical precedents. The probabilities of unprecedented accumulations are also consistent with the extension of the cutoff.

  12. Global warming precipitation accumulation increases above the current-climate cutoff scale.

    PubMed

    Neelin, J David; Sahany, Sandeep; Stechmann, Samuel N; Bernstein, Diana N

    2017-02-07

    Precipitation accumulations, integrated over rainfall events, can be affected by both intensity and duration of the storm event. Thus, although precipitation intensity is widely projected to increase under global warming, a clear framework for predicting accumulation changes has been lacking, despite the importance of accumulations for societal impacts. Theory for changes in the probability density function (pdf) of precipitation accumulations is presented with an evaluation of these changes in global climate model simulations. We show that a simple set of conditions implies roughly exponential increases in the frequency of the very largest accumulations above a physical cutoff scale, increasing with event size. The pdf exhibits an approximately power-law range where probability density drops slowly with each order of magnitude size increase, up to a cutoff at large accumulations that limits the largest events experienced in current climate. The theory predicts that the cutoff scale, controlled by the interplay of moisture convergence variance and precipitation loss, tends to increase under global warming. Thus, precisely the large accumulations above the cutoff that are currently rare will exhibit increases in the warmer climate as this cutoff is extended. This indeed occurs in the full climate model, with a 3 °C end-of-century global-average warming yielding regional increases of hundreds of percent to >1,000% in the probability density of the largest accumulations that have historical precedents. The probabilities of unprecedented accumulations are also consistent with the extension of the cutoff.

  13. Global warming precipitation accumulation increases above the current-climate cutoff scale

    DOE PAGES

    Neelin, J. David; Sahany, Sandeep; Stechmann, Samuel N.; ...

    2017-01-23

    Precipitation accumulations, integrated over rainfall events, can be affected by both intensity and duration of the storm event. Thus, although precipitation intensity is widely projected to increase under global warming, a clear framework for predicting accumulation changes has been lacking, despite the importance of accumulations for societal impacts. Theory for changes in the probability density function (pdf) of precipitation accumulations is presented with an evaluation of these changes in global climate model simulations. We show that a simple set of conditions implies roughly exponential increases in the frequency of the very largest accumulations above a physical cutoff scale, increasing with event size. The pdf exhibits an approximately power-law range where probability density drops slowly with each order of magnitude size increase, up to a cutoff at large accumulations that limits the largest events experienced in current climate. The theory predicts that the cutoff scale, controlled by the interplay of moisture convergence variance and precipitation loss, tends to increase under global warming. Thus, precisely the large accumulations above the cutoff that are currently rare will exhibit increases in the warmer climate as this cutoff is extended. This indeed occurs in the full climate model, with a 3 °C end-of-century global-average warming yielding regional increases of hundreds of percent to >1,000% in the probability density of the largest accumulations that have historical precedents. The probabilities of unprecedented accumulations are also consistent with the extension of the cutoff.

  14. Local Structure Theory for Cellular Automata.

    NASA Astrophysics Data System (ADS)

    Gutowitz, Howard Andrew

    The local structure theory (LST) is a generalization of the mean field theory for cellular automata (CA). The mean field theory makes the assumption that iterative application of the rule does not introduce correlations between the states of cells in different positions. This assumption allows the derivation of a simple formula for the limit density of each possible state of a cell. The most striking feature of CA is that they may well generate correlations between the states of cells as they evolve. The LST takes the generation of correlation explicitly into account. It thus has the potential to describe statistical characteristics in detail. The basic assumption of the LST is that though correlation may be generated by CA evolution, this correlation decays with distance. This assumption allows the derivation of formulas for the estimation of the probability of large blocks of states in terms of smaller blocks of states. Given the probabilities of blocks of size n, probabilities may be assigned to blocks of arbitrary size such that these probability assignments satisfy the Kolmogorov consistency conditions and hence may be used to define a measure on the set of all possible (infinite) configurations. Measures defined in this way are called finite (or n-) block measures. A function called the scramble operator of order n maps a measure to an approximating n-block measure. The action of a CA on configurations induces an action on measures on the set of all configurations. The scramble operator is combined with the CA map on measures to form the local structure operator (LSO). The LSO of order n maps the set of n-block measures into itself. It is hypothesised that the LSO applied to n-block measures approximates the rule itself on general measures, and does so increasingly well as n increases. The fundamental advantage of the LSO is that its action is explicitly computable from a finite system of rational recursion equations. Empirical study of a number of CA rules demonstrates the potential of the LST to describe the statistical features of CA. The behavior of some simple rules is derived analytically. Other rules have more complex, chaotic behavior. Even for these rules, the LST yields an accurate portrait of both small- and large-time statistics.
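
    The order-1 case of this hierarchy is ordinary mean field theory, which can be written down explicitly for elementary cellular automata. The sketch below iterates the mean-field density map for rule 22; it is only an illustration of the approximation that the LST generalizes, not an implementation of the higher-order local structure operator, and the function names are ad hoc.

```python
from itertools import product

def mean_field_map(rule_number, p):
    """One step of the mean-field (order-1 local structure) map for an
    elementary CA: assuming cells are independent and each is 1 with
    probability p, return the probability that a cell is 1 after one
    application of the rule."""
    rule = [(rule_number >> i) & 1 for i in range(8)]   # Wolfram numbering
    p_next = 0.0
    for a, b, c in product((0, 1), repeat=3):
        if rule[a * 4 + b * 2 + c]:
            ones = a + b + c
            p_next += p**ones * (1 - p)**(3 - ones)
    return p_next

# Iterate to the mean-field fixed point for rule 22.  The fixed point
# (about 0.42) ignores correlations and overestimates the limit density
# observed in simulations; correcting this is precisely what the
# higher-order local structure operators are designed to do.
p = 0.3
for _ in range(200):
    p = mean_field_map(22, p)
print(f"mean-field limit density for rule 22: {p:.4f}")
```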

  15. Group-level traits can be studied with standard evolutionary theory.

    PubMed

    Scott-Phillips, Thomas C; Dickins, Thomas E

    2014-06-01

    Smaldino's target article draws on and seeks to add to a literature that has partially rejected orthodox, gene-centric evolutionary theory. However, orthodox theory has much to say about group-level traits. The target article does not reference or refute these views, and provides no explicit arguments for this narrow approach. In this commentary we: (i) give two examples of topics that the target article might and probably should have discussed (cultural epidemiology and the psychology of individual differences); and (ii) argue that the orthodox approach has much more to say about the emergence of group-level traits than the target article recognises, or gives credit for.

  16. Statistical foundations of liquid-crystal theory: II: Macroscopic balance laws.

    PubMed

    Seguin, Brian; Fried, Eliot

    2013-01-01

    Working on a state space determined by considering a discrete system of rigid rods, we use nonequilibrium statistical mechanics to derive macroscopic balance laws for liquid crystals. A probability function that satisfies the Liouville equation serves as the starting point for deriving each macroscopic balance. The terms appearing in the derived balances are interpreted as expected values and explicit formulas for these terms are obtained. Among the list of derived balances appear two, the tensor moment of inertia balance and the mesofluctuation balance, that are not standard in previously proposed macroscopic theories for liquid crystals but which have precedents in other theories for structured media.
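
    In generic notation (not necessarily that of the paper), with f(q,p,t) the phase-space probability density of the discrete rod system, H its Hamiltonian, and A any phase function whose average yields a macroscopic field, the starting point and the interpretation of the derived terms can be summarized as:

```latex
\begin{align}
  \frac{\partial f}{\partial t} + \{ f, H \} &= 0
    && \text{(Liouville equation governing } f(q,p,t)\text{)} \\
  \langle A \rangle(t) &= \int A(q,p)\, f(q,p,t)\, \mathrm{d}q\, \mathrm{d}p
    && \text{(macroscopic quantity as an expected value)}
\end{align}
```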

  17. Introduction to focus issue: intrinsic and designed computation: information processing in dynamical systems--beyond the digital hegemony.

    PubMed

    Crutchfield, James P; Ditto, William L; Sinha, Sudeshna

    2010-09-01

    How dynamical systems store and process information is a fundamental question that touches a remarkably wide set of contemporary issues: from the breakdown of Moore's scaling laws--that predicted the inexorable improvement in digital circuitry--to basic philosophical problems of pattern in the natural world. It is a question that also returns one to the earliest days of the foundations of dynamical systems theory, probability theory, mathematical logic, communication theory, and theoretical computer science. We introduce the broad and rather eclectic set of articles in this Focus Issue that highlights a range of current challenges in computing and dynamical systems.

  18. Statistical foundations of liquid-crystal theory

    PubMed Central

    Seguin, Brian; Fried, Eliot

    2013-01-01

    Working on a state space determined by considering a discrete system of rigid rods, we use nonequilibrium statistical mechanics to derive macroscopic balance laws for liquid crystals. A probability function that satisfies the Liouville equation serves as the starting point for deriving each macroscopic balance. The terms appearing in the derived balances are interpreted as expected values and explicit formulas for these terms are obtained. Among the list of derived balances appear two, the tensor moment of inertia balance and the mesofluctuation balance, that are not standard in previously proposed macroscopic theories for liquid crystals but which have precedents in other theories for structured media. PMID:23554513

  19. Generalized probabilistic scale space for image restoration.

    PubMed

    Wong, Alexander; Mishra, Akshaya K

    2010-10-01

    A novel generalized sampling-based probabilistic scale space theory is proposed for image restoration. We explore extending the definition of scale space to better account for both noise and observation models, which is important for producing accurately restored images. A new class of scale-space realizations based on sampling and probability theory is introduced to realize this extended definition in the context of image restoration. Experimental results using 2-D images show that generalized sampling-based probabilistic scale-space theory can be used to produce more accurate restored images when compared with state-of-the-art scale-space formulations, particularly under situations characterized by low signal-to-noise ratios and image degradation.
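
    As a hedged illustration of the sampling idea (not the estimator of Wong and Mishra, whose observation and noise models are not reproduced here), a Gaussian scale-space level can itself be realized by Monte Carlo sampling of pixel offsets rather than by deterministic convolution; in such a formulation, noise and observation models could enter through the sampling distribution and sample weights. All function names and parameters below are illustrative.

```python
import numpy as np

def sampled_scale_space(image, t, n_samples=64, rng=None):
    """Toy sampling-based realization of a Gaussian scale-space level:
    the smoothed value at each pixel is a Monte Carlo average of
    intensities at offsets drawn from N(0, t*I), rather than the result
    of a deterministic convolution."""
    rng = np.random.default_rng(rng)
    h, w = image.shape
    out = np.zeros_like(image, dtype=float)
    ys, xs = np.mgrid[0:h, 0:w]
    for _ in range(n_samples):
        dy, dx = rng.normal(scale=np.sqrt(t), size=2)
        yy = np.clip(np.round(ys + dy).astype(int), 0, h - 1)
        xx = np.clip(np.round(xs + dx).astype(int), 0, w - 1)
        out += image[yy, xx]
    return out / n_samples

# Usage: a noisy test image smoothed at scale t = 4 (sigma = 2 pixels);
# the pixel-value spread shrinks relative to the raw noise.
img = np.random.default_rng(0).normal(size=(64, 64))
smooth = sampled_scale_space(img, t=4.0, n_samples=128, rng=1)
print(img.std(), smooth.std())
```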

  20. What Information Theory Says about Bounded Rational Best Response

    NASA Technical Reports Server (NTRS)

    Wolpert, David H.

    2005-01-01

    Probability Collectives (PC) provides the information-theoretic extension of conventional full-rationality game theory to bounded rational games. Here an explicit solution to the equations giving the bounded rationality equilibrium of a game is presented. Then PC is used to investigate games in which the players use bounded rational best-response strategies. Next it is shown that in the continuum-time limit, bounded rational best response games result in a variant of the replicator dynamics of evolutionary game theory. It is then shown that for team (shared-payoff) games, this variant of replicator dynamics is identical to Newton-Raphson iterative optimization of the shared utility function.
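
    The bounded rational best response referred to here is, in information-theoretic terms, a maximum-entropy (Boltzmann, or logit) distribution over actions. The sketch below iterates such logit responses for a two-player shared-payoff (team) game with a hypothetical payoff matrix and rationality parameter; it illustrates the general idea rather than reproducing the paper's explicit PC equations, and the function names are ad hoc.

```python
import numpy as np

def logit_response(payoffs, beta):
    """Boltzmann / logit bounded rational best response: a mixed strategy
    weighting each action by exp(beta * expected payoff).  As beta grows,
    this approaches the full-rationality best response."""
    z = np.exp(beta * (payoffs - payoffs.max()))
    return z / z.sum()

# Shared-payoff (team) game: both players receive U[i, j] when player 1
# plays action i and player 2 plays action j.  Values are hypothetical.
U = np.array([[3.0, 0.0],
              [0.0, 2.0]])
beta = 2.0                        # finite rationality parameter

p = np.full(2, 0.5)               # player 1's mixed strategy
q = np.full(2, 0.5)               # player 2's mixed strategy
for _ in range(100):
    p = logit_response(U @ q, beta)    # expected payoff of each row against q
    q = logit_response(U.T @ p, beta)  # expected payoff of each column against p
print("p =", np.round(p, 3), " q =", np.round(q, 3))
```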
