NASA Astrophysics Data System (ADS)
Berkovitz, Joseph
Bruno de Finetti is one of the founding fathers of the subjectivist school of probability, where probabilities are interpreted as rational degrees of belief. His work on the relation between the theorems of probability and rationality is among the cornerstones of modern subjective probability theory. De Finetti maintained that rationality requires that degrees of belief be coherent, and he argued that the whole of probability theory could be derived from these coherence conditions. De Finetti's interpretation of probability has been highly influential in science. This paper focuses on the application of this interpretation to quantum mechanics. We argue that de Finetti held that the coherence conditions of degrees of belief in events depend on their verifiability. Accordingly, the standard coherence conditions of degrees of belief that are familiar from the literature on subjective probability only apply to degrees of belief in events which could (in principle) be jointly verified; and the coherence conditions of degrees of belief in events that cannot be jointly verified are weaker. While the most obvious explanation of de Finetti's verificationism is the influence of positivism, we argue that it could be motivated by the radical subjectivist and instrumental nature of probability in his interpretation; for as it turns out, in this interpretation it is difficult to make sense of the idea of coherent degrees of belief in, and accordingly probabilities of, unverifiable events. We then consider the application of this interpretation to quantum mechanics, concentrating on the Einstein-Podolsky-Rosen experiment and Bell's theorem.
Students' Understanding of Conditional Probability on Entering University
ERIC Educational Resources Information Center
Reaburn, Robyn
2013-01-01
An understanding of conditional probability is essential for students of inferential statistics as it is used in Null Hypothesis Tests. Conditional probability is also used in Bayes' theorem, in the interpretation of medical screening tests and in quality control procedures. This study examines the understanding of conditional probability of…
The Formalism of Generalized Contexts and Decay Processes
NASA Astrophysics Data System (ADS)
Losada, Marcelo; Laura, Roberto
2013-04-01
The formalism of generalized contexts for quantum histories is used to investigate the possibility of considering the survival probability as the probability of a no-decay property at a given time, conditional on a no-decay property at an earlier time. A negative result is found for an isolated system. The inclusion of two quantum measurement instruments at two different times makes it possible to interpret the survival probability as a conditional probability of the whole system.
Probability in the Many-Worlds Interpretation of Quantum Mechanics
NASA Astrophysics Data System (ADS)
Vaidman, Lev
It is argued that, although in the Many-Worlds Interpretation of quantum mechanics there is no "probability" for an outcome of a quantum experiment in the usual sense, we can understand why we have an illusion of probability. The explanation involves: (a) A "sleeping pill" gedanken experiment which sets up a correspondence between an illegitimate question, "What is the probability of an outcome of a quantum measurement?", and a legitimate question, "What is the probability that `I' am in the world corresponding to that outcome?"; (b) A gedanken experiment which splits the world into several worlds which are identical according to some symmetry condition; and (c) Relativistic causality, which together with (b) explains the Born rule of standard quantum mechanics. The Quantum Sleeping Beauty controversy and a "caring measure" replacing the probability measure are also discussed.
Use and interpretation of logistic regression in habitat-selection studies
Keating, Kim A.; Cherry, Steve
2004-01-01
Logistic regression is an important tool for wildlife habitat-selection studies, but the method frequently has been misapplied due to an inadequate understanding of the logistic model, its interpretation, and the influence of sampling design. To promote better use of this method, we review its application and interpretation under 3 sampling designs: random, case-control, and use-availability. Logistic regression is appropriate for habitat use-nonuse studies employing random sampling and can be used to directly model the conditional probability of use in such cases. Logistic regression also is appropriate for studies employing case-control sampling designs, but careful attention is required to interpret results correctly. Unless bias can be estimated or probability of use is small for all habitats, results of case-control studies should be interpreted as odds ratios, rather than probability of use or relative probability of use. When data are gathered under a use-availability design, logistic regression can be used to estimate approximate odds ratios if probability of use is small, at least on average. More generally, however, logistic regression is inappropriate for modeling habitat selection in use-availability studies. In particular, using logistic regression to fit the exponential model of Manly et al. (2002:100) does not guarantee maximum-likelihood estimates, valid probabilities, or valid likelihoods. We show that the resource selection function (RSF) commonly used for the exponential model is proportional to a logistic discriminant function. Thus, it may be used to rank habitats with respect to probability of use and to identify important habitat characteristics or their surrogates, but it is not guaranteed to be proportional to probability of use. Other problems associated with the exponential model also are discussed. We describe an alternative model based on Lancaster and Imbens (1996) that offers a method for estimating conditional probability of use in use-availability studies. Although promising, this model fails to converge to a unique solution in some important situations. Further work is needed to obtain a robust method that is broadly applicable to use-availability studies.
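As an illustration of the distinction drawn above, the sketch below (hypothetical coefficients, plain NumPy) contrasts the conditional probability of use from a use-nonuse logistic fit with the exponential RSF, which equals the odds of use and is therefore proportional to the probability of use only when that probability is small.

```python
import numpy as np

def logistic_p_use(x, b0, b1):
    """Conditional probability of use from a use/nonuse logistic fit."""
    eta = b0 + b1 * x
    return 1.0 / (1.0 + np.exp(-eta))

def exponential_rsf(x, b0, b1):
    """Exponential resource selection function w(x) = exp(b0 + b1*x)."""
    return np.exp(b0 + b1 * x)

# Hypothetical coefficients for one habitat covariate (e.g., percent canopy cover).
b0, b1 = -3.0, 0.08
x = np.linspace(0, 100, 5)

p = logistic_p_use(x, b0, b1)
w = exponential_rsf(x, b0, b1)

# The exponential RSF equals the odds p/(1-p), so it preserves the ranking of
# habitats but is proportional to the probability of use only when p is small.
for xi, pi, wi in zip(x, p, w):
    print(f"x={xi:5.1f}  P(use)={pi:.3f}  odds={pi/(1-pi):.3f}  RSF={wi:.3f}")
```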
NASA Astrophysics Data System (ADS)
Fuchs, Christopher A.; Schack, Rüdiger
2013-10-01
In the quantum-Bayesian interpretation of quantum theory (or QBism), the Born rule cannot be interpreted as a rule for setting measurement-outcome probabilities from an objective quantum state. But if not, what is the role of the rule? In this paper, the argument is given that it should be seen as an empirical addition to Bayesian reasoning itself. In particular, it is shown how to view the Born rule as a normative rule in addition to usual Dutch-book coherence. It is a rule that takes into account how one should assign probabilities to the consequences of various intended measurements on a physical system, but explicitly in terms of prior probabilities for, and conditional probabilities consequent upon, the imagined outcomes of a special counterfactual reference measurement. This interpretation is exemplified by representing quantum states in terms of probabilities for the outcomes of a fixed, fiducial symmetric informationally complete (SIC) measurement. The extent to which the general form of the new normative rule implies the full state-space structure of quantum mechanics is explored.
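For reference, the normative rule described here is usually written in the SIC representation roughly as follows; this is a schematic rendering, and the exact normalization conventions vary across the QBist literature:

```latex
% Born rule recast as a relation among probabilities (the QBist "urgleichung"):
% P(H_i) are probabilities for the d^2 outcomes of the counterfactual SIC
% reference measurement; P(D_j | H_i) are conditionals for the outcomes of
% the measurement actually performed.
q(D_j) \;=\; \sum_{i=1}^{d^2}\left[(d+1)\,P(H_i)-\frac{1}{d}\right]P(D_j \mid H_i)
```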
Realistic Clocks for a Universe Without Time
NASA Astrophysics Data System (ADS)
Bryan, K. L. H.; Medved, A. J. M.
2018-01-01
There are a number of problematic features within the current treatment of time in physical theories, including the "timelessness" of the Universe as encapsulated by the Wheeler-DeWitt equation. This paper considers one particular investigation into resolving this issue: a conditional probability interpretation that was first proposed by Page and Wootters. Those authors addressed the apparent timelessness by subdividing a faux Universe into two entangled parts, "the clock" and "the remainder of the Universe", and then synchronizing the effective dynamics of the two subsystems by way of conditional probabilities. The current treatment focuses on the possibility of using a (somewhat) realistic clock system; namely, a coherent-state description of a damped harmonic oscillator. This clock proves to be consistent with the conditional probability interpretation; in particular, a standard evolution operator is identified with the position of the clock playing the role of time for the rest of the Universe. Restrictions on the damping factor are determined and, perhaps contrary to expectations, the optimal choice of clock is not necessarily one of minimal damping.
NASA Astrophysics Data System (ADS)
Lee, Jaeha; Tsutsui, Izumi
2017-05-01
We show that the joint behavior of an arbitrary pair of (generally noncommuting) quantum observables can be described by quasi-probabilities, which are an extended version of the standard probabilities used for describing the outcome of measurement for a single observable. The physical situations that require these quasi-probabilities arise when one considers quantum measurement of an observable conditioned by some other variable, with the notable example being the weak measurement employed to obtain Aharonov's weak value. Specifically, we present a general prescription for the construction of quasi-joint probability (QJP) distributions associated with a given combination of observables. These QJP distributions are introduced in two complementary approaches: one from a bottom-up, strictly operational construction realized by examining the mathematical framework of the conditioned measurement scheme, and the other from a top-down viewpoint realized by applying the results of the spectral theorem for normal operators and their Fourier transforms. It is then revealed that, for a pair of simultaneously measurable observables, the QJP distribution reduces to the unique standard joint probability distribution of the pair, whereas for a noncommuting pair there exists an inherent indefiniteness in the choice of such QJP distributions, admitting a multitude of candidates that may equally be used for describing the joint behavior of the pair. In the course of our argument, we find that the QJP distributions furnish the space of operators in the underlying Hilbert space with their characteristic geometric structures such that the orthogonal projections and inner products of observables can be given statistical interpretations as, respectively, “conditionings” and “correlations”. The weak value Aw for an observable A is then given a geometric/statistical interpretation as either the orthogonal projection of A onto the subspace generated by another observable B, or equivalently, as the conditioning of A given B with respect to the QJP distribution under consideration.
NASA Astrophysics Data System (ADS)
Niestegge, Gerd
2010-12-01
In the quantum mechanical Hilbert space formalism, the probabilistic interpretation is a later ad-hoc add-on, more or less enforced by the experimental evidence, but not motivated by the mathematical model itself. A model involving a clear probabilistic interpretation from the very beginning is provided by the quantum logics with unique conditional probabilities. It includes the projection lattices in von Neumann algebras and here probability conditionalization becomes identical with the state transition of the Lüders-von Neumann measurement process. This motivates the definition of a hierarchy of five compatibility and comeasurability levels in the abstract setting of the quantum logics with unique conditional probabilities. Their meanings are: the absence of quantum interference or influence, the existence of a joint distribution, simultaneous measurability, and the independence of the final state after two successive measurements from the sequential order of these two measurements. A further level means that two elements of the quantum logic (events) belong to the same Boolean subalgebra. In the general case, the five compatibility and comeasurability levels appear to differ, but they all coincide in the common Hilbert space formalism of quantum mechanics, in von Neumann algebras, and in some other cases.
The role of probabilities in physics.
Le Bellac, Michel
2012-09-01
Although modern physics was born in the XVIIth century as a fully deterministic theory in the form of Newtonian mechanics, the use of probabilistic arguments turned out later on to be unavoidable. Three main situations can be distinguished. (1) When the number of degrees of freedom is very large, on the order of Avogadro's number, a detailed dynamical description is not possible, and in fact not useful: we do not care about the velocity of a particular molecule in a gas, all we need is the probability distribution of the velocities. This statistical description introduced by Maxwell and Boltzmann allows us to recover equilibrium thermodynamics, gives a microscopic interpretation of entropy and underlies our understanding of irreversibility. (2) Even when the number of degrees of freedom is small (but larger than three) sensitivity to initial conditions of chaotic dynamics makes determinism irrelevant in practice, because we cannot control the initial conditions with infinite accuracy. Although die tossing is in principle predictable, the approach to chaotic dynamics in some limit implies that our ignorance of initial conditions is translated into a probabilistic description: each face comes up with probability 1/6. (3) As is well-known, quantum mechanics is incompatible with determinism. However, quantum probabilities differ in an essential way from the probabilities introduced previously: it has been shown from the work of John Bell that quantum probabilities are intrinsic and cannot be given an ignorance interpretation based on a hypothetical deeper level of description. Copyright © 2012 Elsevier Ltd. All rights reserved.
Using Playing Cards to Differentiate Probability Interpretations
ERIC Educational Resources Information Center
López Puga, Jorge
2014-01-01
The aprioristic (classical, naïve and symmetric) and frequentist interpretations of probability are commonly known. Bayesian or subjective interpretation of probability is receiving increasing attention. This paper describes an activity to help students differentiate between the three types of probability interpretations.
NASA Astrophysics Data System (ADS)
Ovchinnikov, Igor V.; Schwartz, Robert N.; Wang, Kang L.
2016-03-01
The concept of deterministic dynamical chaos has a long history and is well established by now. Nevertheless, its field-theoretic essence and its stochastic generalization have been revealed only very recently. Within the newly found supersymmetric theory of stochastics (STS), all stochastic differential equations (SDEs) possess topological or de Rham supersymmetry, and stochastic chaos is the phenomenon of its spontaneous breakdown. Even though the STS is free of approximations and thus is technically solid, it is still missing a firm interpretational basis in order to be physically sound. Here, we make a few important steps toward the construction of the interpretational foundation for the STS. In particular, we discuss that one way to understand why the ground states of chaotic SDEs are conditional (not total) probability distributions is that some of the variables have infinite memory of initial conditions and thus are not "thermalized", i.e., cannot be described by the initial-conditions-independent probability distributions. As a result, the definitive assumption of physical statistics that the ground state is a steady-state total probability distribution is not valid for chaotic SDEs.
Econophysics: Two-phase behaviour of financial markets
NASA Astrophysics Data System (ADS)
Plerou, Vasiliki; Gopikrishnan, Parameswaran; Stanley, H. Eugene
2003-01-01
Buying and selling in financial markets is driven by demand, which can be quantified by the imbalance in the number of shares transacted by buyers and sellers over a given time interval. Here we analyse the probability distribution of demand, conditioned on its local noise intensity Σ, and discover the surprising existence of a critical threshold, Σc. For Σ < Σc, the most probable value of demand is roughly zero; we interpret this as an equilibrium phase in which neither buying nor selling predominates. For Σ > Σc, two most probable values emerge that are symmetrical around zero demand, corresponding to excess demand and excess supply; we interpret this as an out-of-equilibrium phase in which the market behaviour is mainly buying for half of the time, and mainly selling for the other half.
2010-01-01
Background Abnormal results of diagnostic laboratory tests can be difficult to interpret when disease probability is very low. Although most physicians generally do not use Bayesian calculations to interpret abnormal results, their estimates of pretest disease probability and reasons for ordering diagnostic tests may - in a more implicit manner - influence test interpretation and further management. A better understanding of this influence may help to improve test interpretation and management. Therefore, the objective of this study was to examine the influence of physicians' pretest disease probability estimates, and their reasons for ordering diagnostic tests, on test result interpretation, posttest probability estimates and further management. Methods Prospective study among 87 primary care physicians in the Netherlands who each ordered laboratory tests for 25 patients. They recorded their reasons for ordering the tests (to exclude or confirm disease or to reassure patients) and their pretest disease probability estimates. Upon receiving the results they recorded how they interpreted the tests, their posttest probability estimates and further management. Logistic regression was used to analyse whether the pretest probability and the reasons for ordering tests influenced the interpretation, the posttest probability estimates and the decisions on further management. Results The physicians ordered tests for diagnostic purposes for 1253 patients; 742 patients had an abnormal result (64%). Physicians' pretest probability estimates and their reasons for ordering diagnostic tests influenced test interpretation, posttest probability estimates and further management. Abnormal results of tests ordered for reasons of reassurance were significantly more likely to be interpreted as normal (65.8%) compared to tests ordered to confirm a diagnosis or exclude a disease (27.7% and 50.9%, respectively). The odds for abnormal results to be interpreted as normal were much lower when the physician estimated a high pretest disease probability, compared to a low pretest probability estimate (OR = 0.18, 95% CI = 0.07-0.52, p < 0.001). Conclusions Interpretation and management of abnormal test results were strongly influenced by physicians' estimation of pretest disease probability and by the reason for ordering the test. By relating abnormal laboratory results to their pretest expectations, physicians may seek a balance between over- and under-reacting to laboratory test results. PMID:20158908
Houben, Paul H H; van der Weijden, Trudy; Winkens, Bjorn; Winkens, Ron A G; Grol, Richard P T M
2010-02-16
Abnormal results of diagnostic laboratory tests can be difficult to interpret when disease probability is very low. Although most physicians generally do not use Bayesian calculations to interpret abnormal results, their estimates of pretest disease probability and reasons for ordering diagnostic tests may--in a more implicit manner--influence test interpretation and further management. A better understanding of this influence may help to improve test interpretation and management. Therefore, the objective of this study was to examine the influence of physicians' pretest disease probability estimates, and their reasons for ordering diagnostic tests, on test result interpretation, posttest probability estimates and further management. Prospective study among 87 primary care physicians in the Netherlands who each ordered laboratory tests for 25 patients. They recorded their reasons for ordering the tests (to exclude or confirm disease or to reassure patients) and their pretest disease probability estimates. Upon receiving the results they recorded how they interpreted the tests, their posttest probability estimates and further management. Logistic regression was used to analyse whether the pretest probability and the reasons for ordering tests influenced the interpretation, the posttest probability estimates and the decisions on further management. The physicians ordered tests for diagnostic purposes for 1253 patients; 742 patients had an abnormal result (64%). Physicians' pretest probability estimates and their reasons for ordering diagnostic tests influenced test interpretation, posttest probability estimates and further management. Abnormal results of tests ordered for reasons of reassurance were significantly more likely to be interpreted as normal (65.8%) compared to tests ordered to confirm a diagnosis or exclude a disease (27.7% and 50.9%, respectively). The odds for abnormal results to be interpreted as normal were much lower when the physician estimated a high pretest disease probability, compared to a low pretest probability estimate (OR = 0.18, 95% CI = 0.07-0.52, p < 0.001). Interpretation and management of abnormal test results were strongly influenced by physicians' estimation of pretest disease probability and by the reason for ordering the test. By relating abnormal laboratory results to their pretest expectations, physicians may seek a balance between over- and under-reacting to laboratory test results.
2016-04-19
event is the same as conditioning on the event being certain, which formalizes the standard informal interpretation of conditional probability. The game-theoretic application of our model, discussed within an example, sheds light on a number of issues in the analysis of extensive form games. Published in Games and Economic Behavior, Vol. 87 (2014).
On the reality of the conjunction fallacy.
Sides, Ashley; Osherson, Daniel; Bonini, Nicolao; Viale, Riccardo
2002-03-01
Attributing higher "probability" to a sentence of form p-and-q, relative to p, is a reasoning fallacy only if (1) the word probability carries its modern, technical meaning and (2) the sentence p is interpreted as a conjunct of the conjunction p-and-q. Legitimate doubts arise about both conditions in classic demonstrations of the conjunction fallacy. We used betting paradigms and unambiguously conjunctive statements to reduce these sources of ambiguity about conjunctive reasoning. Despite the precautions, conjunction fallacies were as frequent under betting instructions as under standard probability instructions.
A short note on probability in clinical medicine.
Upshur, Ross E G
2013-06-01
Probability claims are ubiquitous in clinical medicine, yet exactly how clinical events relate to interpretations of probability has not been well explored. This brief essay examines the major interpretations of probability and how these interpretations may account for the probabilistic nature of clinical events. It is argued that there are significant problems with the unquestioned application of interpretations of probability to clinical events. The essay concludes by suggesting other avenues to understand uncertainty in clinical medicine. © 2013 John Wiley & Sons Ltd.
Probability, arrow of time and decoherence
NASA Astrophysics Data System (ADS)
Bacciagaluppi, Guido
This paper relates both to the metaphysics of probability and to the physics of time asymmetry. Using the formalism of decoherent histories, it investigates whether intuitions about intrinsic time directedness that are often associated with probability can be justified in the context of no-collapse approaches to quantum mechanics. The standard (two-vector) approach to time symmetry in the decoherent histories literature is criticised, and an alternative approach is proposed, based on two decoherence conditions ('forwards' and 'backwards') within the one-vector formalism. In turn, considerations of forwards and backwards decoherence and of decoherence and recoherence suggest that a time-directed interpretation of probabilities, if adopted, should be both contingent and perspectival.
Chance and time: Cutting the Gordian knot
NASA Astrophysics Data System (ADS)
Hagar, Amit
One of the recurrent problems in the foundations of physics is to explain why we rarely observe certain phenomena that are allowed by our theories and laws. In thermodynamics, for example, the spontaneous approach towards equilibrium is ubiquitous, yet the time-reversal-invariant laws that presumably govern thermal behaviour at the microscopic level equally allow spontaneous approach away from equilibrium to occur. Why are the former processes frequently observed while the latter are almost never reported? Another example comes from quantum mechanics where the formalism, if considered complete and universally applicable, predicts the existence of macroscopic superpositions (monstrous Schrödinger cats), and these are never observed: while electrons and atoms enjoy the cloudiness of waves, macroscopic objects are always localized to definite positions. A well-known explanatory framework due to Ludwig Boltzmann traces the rarity of "abnormal" thermodynamic phenomena to the scarcity of the initial conditions that lead to them. After all, physical laws are no more than algorithms, and these are expected to generate different results according to different initial conditions, hence Boltzmann's insight that violations of thermodynamic laws are possible but highly improbable. Yet Boltzmann introduces probabilities into this explanatory scheme, and since the latter is couched in terms of classical mechanics, these probabilities must be interpreted as a result of ignorance of the exact state the system is in. Quantum mechanics has taught us otherwise. Here the attempts to explain why we never observe macroscopic superpositions have led to different interpretations of the formalism and to different solutions to the quantum measurement problem. These solutions introduce additional interpretations to the meaning of probability over and above ignorance of the definite state of the physical system: quantum probabilities may result from pure chance. Notwithstanding the success of the Boltzmannian framework in explaining the thermodynamic arrow in time, it leaves us with a foundational puzzle: how can ignorance play a role in the scientific explanation of objective reality? It turns out that two opposing solutions to the quantum measurement problem in which probabilities arise from the stochastic character of the underlying dynamics may scratch this explanatory itch. By offering a dynamical justification to the probabilities employed in classical statistical mechanics, these two interpretations complete the Boltzmannian explanatory scheme and allow us to exorcize ignorance from scientific explanations of unobserved phenomena. In this thesis I argue that the puzzle of the thermodynamic arrow in time is closely related to the problem of interpreting quantum mechanics, i.e., to the measurement problem. We may solve one by fiat and thus solve the other, but it seems unwise to try solving them independently. I substantiate this claim by presenting two possible interpretations of non-relativistic quantum mechanics. Differing as they do on the meaning of the probabilities they introduce into the otherwise deterministic dynamics, these interpretations offer alternative explanatory schemes to the standard Boltzmannian statistical mechanical explanation of thermodynamic approach to equilibrium. I then show how, notwithstanding their current empirical equivalence, the two approaches diverge at the continental divide between scientific realism and anti-realism.
Probability and Quantum Paradigms: the Interplay
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kracklauer, A. F.
Since the introduction of Born's interpretation of quantum wave functions as yielding the probability density of presence, Quantum Theory and Probability have lived in a troubled symbiosis. Problems arise with this interpretation because quantum probabilities exhibit features alien to usual probabilities, namely non Boolean structure and non positive-definite phase space probability densities. This has inspired research into both elaborate formulations of Probability Theory and alternate interpretations for wave functions. Herein the latter tactic is taken and a suggested variant interpretation of wave functions based on photo detection physics proposed, and some empirical consequences are considered. Although incomplete in a few details, this variant is appealing in its reliance on well tested concepts and technology.
Probability and Quantum Paradigms: the Interplay
NASA Astrophysics Data System (ADS)
Kracklauer, A. F.
2007-12-01
Since the introduction of Born's interpretation of quantum wave functions as yielding the probability density of presence, Quantum Theory and Probability have lived in a troubled symbiosis. Problems arise with this interpretation because quantum probabilities exhibit features alien to usual probabilities, namely non Boolean structure and non positive-definite phase space probability densities. This has inspired research into both elaborate formulations of Probability Theory and alternate interpretations for wave functions. Herein the latter tactic is taken and a suggested variant interpretation of wave functions based on photo detection physics proposed, and some empirical consequences are considered. Although incomplete in a few details, this variant is appealing in its reliance on well tested concepts and technology.
Bivariate categorical data analysis using normal linear conditional multinomial probability model.
Sun, Bingrui; Sutradhar, Brajendra
2015-02-10
Bivariate multinomial data such as the left- and right-eye retinopathy status data are analyzed either by using a joint bivariate probability model or by exploiting certain odds ratio-based association models. However, the joint bivariate probability model yields marginal probabilities, which are complicated functions of marginal and association parameters for both variables, and the odds ratio-based association model treats the odds ratios involved in the joint probabilities as 'working' parameters, which are consequently estimated through certain arbitrary 'working' regression models. Also, this latter odds ratio-based model does not provide any easy interpretations of the correlations between two categorical variables. On the basis of pre-specified marginal probabilities, in this paper, we develop a bivariate normal type linear conditional multinomial probability model to understand the correlations between two categorical variables. The parameters involved in the model are consistently estimated using the optimal likelihood and generalized quasi-likelihood approaches. The proposed model and the inferences are illustrated through an intensive simulation study as well as an analysis of the well-known Wisconsin Diabetic Retinopathy status data. Copyright © 2014 John Wiley & Sons, Ltd.
78 FR 55775 - Pipeline Safety: Information Collection Activities
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-11
...'' is open to wide interpretation and suggests that ``awareness'' be replaced with ``discovery'', which... conditions characterize ``discovery'' as ``when an operator's representative has adequate information from... ``adequate'' and ``probable'' in the definition of ``discovery'' provides additional clarity. Part A18 of the...
Enter the reverend: introduction to and application of Bayes' theorem in clinical ophthalmology.
Thomas, Ravi; Mengersen, Kerrie; Parikh, Rajul S; Walland, Mark J; Muliyil, Jayprakash
2011-12-01
Ophthalmic practice utilizes numerous diagnostic tests, some of which are used to screen for disease. Interpretation of test results and many clinical management issues are actually problems in inverse probability that can be solved using Bayes' theorem. Use two-by-two tables to understand Bayes' theorem and apply it to clinical examples. Specific examples of the utility of Bayes' theorem in diagnosis and management. Two-by-two tables are used to introduce concepts and understand the theorem. The application in interpretation of diagnostic tests is explained. Clinical examples demonstrate its potential use in making management decisions. Positive predictive value and conditional probability. The theorem demonstrates the futility of testing when prior probability of disease is low. Application to untreated ocular hypertension demonstrates that the estimate of glaucomatous optic neuropathy is similar to that obtained from the Ocular Hypertension Treatment Study. Similar calculations are used to predict the risk of acute angle closure in a primary angle closure suspect, the risk of pupillary block in a diabetic undergoing cataract surgery, and the probability that an observed decrease in intraocular pressure is due to the medication that has been started. The examples demonstrate how data required for management can at times be easily obtained from available information. Knowledge of Bayes' theorem helps in interpreting test results and supports the clinical teaching that testing for conditions with a low prevalence has a poor predictive value. In some clinical situations Bayes' theorem can be used to calculate vital data required for patient management. © 2011 The Authors. Clinical and Experimental Ophthalmology © 2011 Royal Australian and New Zealand College of Ophthalmologists.
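As a worked illustration of the inverse-probability point made in this abstract, the short sketch below (hypothetical test characteristics, not taken from the paper) applies Bayes' theorem to show how the positive predictive value collapses when prior disease probability is low:

```python
def post_test_probability(prevalence, sensitivity, specificity):
    """Bayes' theorem: P(disease | positive test result)."""
    p_pos_given_disease = sensitivity
    p_pos_given_no_disease = 1.0 - specificity
    numerator = p_pos_given_disease * prevalence
    denominator = numerator + p_pos_given_no_disease * (1.0 - prevalence)
    return numerator / denominator

# Hypothetical screening test: 90% sensitive, 95% specific.
for prevalence in (0.001, 0.01, 0.10, 0.30):
    ppv = post_test_probability(prevalence, 0.90, 0.95)
    print(f"prior {prevalence:5.1%} -> P(disease | +) = {ppv:5.1%}")
# At low prior probability, even a fairly specific test yields a poor positive
# predictive value, which is the futility-of-testing point made in the abstract.
```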
Takach Lapner, Sarah; Julian, Jim A; Linkins, Lori-Ann; Bates, Shannon; Kearon, Clive
2017-10-05
Two new strategies for interpreting D-dimer results have been proposed: i) using a progressively higher D-dimer threshold with increasing age (age-adjusted strategy) and ii) using a D-dimer threshold in patients with low clinical probability that is twice the threshold used in patients with moderate clinical probability (clinical probability-adjusted strategy). Our objective was to compare the diagnostic accuracy of age-adjusted and clinical probability-adjusted D-dimer interpretation in patients with a low or moderate clinical probability of venous thromboembolism (VTE). We performed a retrospective analysis of clinical data and blood samples from two prospective studies. We compared the negative predictive value (NPV) for VTE, and the proportion of patients with a negative D-dimer result, using two D-dimer interpretation strategies: the age-adjusted strategy, which uses a progressively higher D-dimer threshold with increasing age over 50 years (age in years × 10 µg/L FEU); and the clinical probability-adjusted strategy, which uses a D-dimer threshold of 1000 µg/L FEU in patients with low clinical probability and 500 µg/L FEU in patients with moderate clinical probability. A total of 1649 outpatients with low or moderate clinical probability for a first suspected deep vein thrombosis or pulmonary embolism were included. The NPVs of both the clinical probability-adjusted strategy (99.7 %) and the age-adjusted strategy (99.6 %) were similar. However, the proportion of patients with a negative result was greater with the clinical probability-adjusted strategy (56.1 % vs. 50.9 %; difference 5.2 %; 95 % CI 3.5 % to 6.8 %). These findings suggest that clinical probability-adjusted D-dimer interpretation is a better way of interpreting D-dimer results compared to age-adjusted interpretation.
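A minimal sketch of the two interpretation strategies as described in the abstract; the 500 µg/L FEU baseline for patients aged 50 or younger is an assumed conventional threshold, and the patient values are hypothetical:

```python
def age_adjusted_threshold(age_years):
    """Age-adjusted D-dimer cut-off (µg/L FEU): age x 10 above 50 years,
    otherwise an assumed conventional baseline of 500 µg/L FEU."""
    return age_years * 10.0 if age_years > 50 else 500.0

def probability_adjusted_threshold(clinical_probability):
    """Clinical probability-adjusted cut-off from the abstract:
    1000 µg/L FEU for low, 500 µg/L FEU for moderate clinical probability."""
    return {"low": 1000.0, "moderate": 500.0}[clinical_probability]

# Hypothetical patient: 72 years old, low clinical probability, D-dimer 650 µg/L FEU.
age, prob, d_dimer = 72, "low", 650.0
print("negative by age-adjusted rule:        ", d_dimer < age_adjusted_threshold(age))
print("negative by probability-adjusted rule:", d_dimer < probability_adjusted_threshold(prob))
```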
Ortega, Pedro A; Braun, Daniel A
2015-01-01
Free energy models of learning and acting care not only about utility or extrinsic value, but also about intrinsic value, that is, the information value stemming from probability distributions that represent beliefs or strategies. While these intrinsic values can be interpreted as epistemic values or exploration bonuses under certain conditions, the framework of bounded rationality offers a complementary interpretation in terms of information-processing costs, which we discuss here.
Propensity, Probability, and Quantum Theory
NASA Astrophysics Data System (ADS)
Ballentine, Leslie E.
2016-08-01
Quantum mechanics and probability theory share one peculiarity. Both have well established mathematical formalisms, yet both are subject to controversy about the meaning and interpretation of their basic concepts. Since probability plays a fundamental role in QM, the conceptual problems of one theory can affect the other. We first classify the interpretations of probability into three major classes: (a) inferential probability, (b) ensemble probability, and (c) propensity. Class (a) is the basis of inductive logic; (b) deals with the frequencies of events in repeatable experiments; (c) describes a form of causality that is weaker than determinism. An important, but neglected, paper by P. Humphreys demonstrated that propensity must differ mathematically, as well as conceptually, from probability, but he did not develop a theory of propensity. Such a theory is developed in this paper. Propensity theory shares many, but not all, of the axioms of probability theory. As a consequence, propensity supports the Law of Large Numbers from probability theory, but does not support Bayes theorem. Although there are particular problems within QM to which any of the classes of probability may be applied, it is argued that the intrinsic quantum probabilities (calculated from a state vector or density matrix) are most naturally interpreted as quantum propensities. This does not alter the familiar statistical interpretation of QM. But the interpretation of quantum states as representing knowledge is untenable. Examples show that a density matrix fails to represent knowledge.
Dynamical interpretation of conditional patterns
NASA Technical Reports Server (NTRS)
Adrian, R. J.; Moser, R. D.; Moin, P.
1988-01-01
While great progress is being made in characterizing the 3-D structure of organized turbulent motions using conditional averaging analysis, there is a lack of theoretical guidance regarding the interpretation and utilization of such information. Questions concerning the significance of the structures, their contributions to various transport properties, and their dynamics cannot be answered without recourse to appropriate dynamical governing equations. One approach which addresses some of these questions uses the conditional fields as initial conditions and calculates their evolution from the Navier-Stokes equations, yielding valuable information about stability, growth, and longevity of the mean structure. To interpret statistical aspects of the structures, a different type of theory which deals with the structures in the context of their contributions to the statistics of the flow is needed. As a first step toward this end, an effort was made to integrate the structural information from the study of organized structures with a suitable statistical theory. This is done by stochastically estimating the two-point conditional averages that appear in the equation for the one-point probability density function, and relating the structures to the conditional stresses. Salient features of the estimates are identified, and the structure of the one-point estimates in channel flow is defined.
NASA Astrophysics Data System (ADS)
Chen, Po-Hao; Botzolakis, Emmanuel; Mohan, Suyash; Bryan, R. N.; Cook, Tessa
2016-03-01
In radiology, diagnostic errors occur either through failure of detection or through incorrect interpretation. Errors are estimated to occur in 30-35% of all exams and contribute to 40-54% of medical malpractice litigations. In this work, we focus on reducing incorrect interpretation of known imaging features. Existing literature categorizes the cognitive biases leading a radiologist to an incorrect diagnosis despite having correctly recognized the abnormal imaging features: anchoring bias, framing effect, availability bias, and premature closure. Computational methods make a unique contribution, as they do not exhibit the same cognitive biases as a human. Bayesian networks formalize the diagnostic process. They modify pre-test diagnostic probabilities using clinical and imaging features, arriving at a post-test probability for each possible diagnosis. To translate Bayesian networks to clinical practice, we implemented an entirely web-based open-source software tool. In this tool, the radiologist first selects a network of choice (e.g. basal ganglia). Then, large, clearly labeled buttons displaying salient imaging features are displayed on the screen, serving both as a checklist and as an input mechanism. As the radiologist inputs the value of an extracted imaging feature, the conditional probabilities of each possible diagnosis are updated. The software presents its level of diagnostic discrimination using a Pareto distribution chart, updated with each additional imaging feature. Active collaboration with the clinical radiologist is a feasible approach to software design and leads to design decisions closely coupling the complex mathematics of conditional probability in Bayesian networks with practice.
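As a rough sketch of the kind of update such a tool performs each time a feature is entered, the following uses a naive-Bayes simplification (conditional independence of features given the diagnosis) rather than a full Bayesian network; all diagnoses, features, priors, and likelihoods are hypothetical placeholders:

```python
# Hypothetical pre-test probabilities for three candidate diagnoses.
priors = {"dx_A": 0.6, "dx_B": 0.3, "dx_C": 0.1}

# Hypothetical P(feature present | diagnosis) tables, one per imaging feature.
likelihoods = {
    "calcification": {"dx_A": 0.7, "dx_B": 0.2, "dx_C": 0.4},
    "T2_hyperintensity": {"dx_A": 0.3, "dx_B": 0.9, "dx_C": 0.5},
}

def update(posterior, feature, present=True):
    """One Bayesian update step, as the radiologist enters a feature value."""
    new = {}
    for dx, p in posterior.items():
        lk = likelihoods[feature][dx]
        new[dx] = p * (lk if present else 1.0 - lk)
    z = sum(new.values())
    return {dx: v / z for dx, v in new.items()}

posterior = dict(priors)
for feature, present in [("calcification", True), ("T2_hyperintensity", False)]:
    posterior = update(posterior, feature, present)
    print(feature, {dx: round(p, 3) for dx, p in posterior.items()})
```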
Sanfilippo, Paul G; Hewitt, Alex W; Mackey, David A
2017-04-01
To outline and detail the importance of conditional probability in clinical decision making and discuss the various diagnostic measures eye care practitioners should be aware of in order to improve the scope of their clinical practice. We conducted a review of the importance of conditional probability in diagnostic testing for the eye care practitioner. Eye care practitioners use diagnostic tests on a daily basis to assist in clinical decision making and in optimizing patient care and management. These tests provide probabilistic information that can enable the clinician to increase (or decrease) their level of certainty about the presence of a particular condition. While an understanding of the characteristics of diagnostic tests is essential to facilitate proper interpretation of test results and disease risk, many practitioners either confuse or misinterpret these measures. In the interests of their patients, practitioners should be aware of the basic concepts associated with diagnostic testing and the simple mathematical rule that underpins them. Importantly, the practitioner needs to recognize that the prevalence of a disease in the population greatly determines the clinical value of a diagnostic test.
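The "simple mathematical rule" referred to is Bayes' theorem, conveniently applied in odds form with likelihood ratios; a brief sketch with hypothetical test characteristics:

```python
def post_test_prob(pre_test_prob, sensitivity, specificity, positive=True):
    """Bayes' rule in odds form: post-test odds = pre-test odds x likelihood ratio."""
    if positive:
        lr = sensitivity / (1.0 - specificity)      # LR+ for a positive result
    else:
        lr = (1.0 - sensitivity) / specificity      # LR- for a negative result
    pre_odds = pre_test_prob / (1.0 - pre_test_prob)
    post_odds = pre_odds * lr
    return post_odds / (1.0 + post_odds)

# Hypothetical test with sensitivity 0.85 and specificity 0.90:
print(post_test_prob(0.02, 0.85, 0.90))   # low-prevalence screening, positive result: ~0.15
print(post_test_prob(0.30, 0.85, 0.90))   # referred patient with a higher prior: ~0.78
```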
Risk estimation using probability machines
2014-01-01
Background Logistic regression has been the de facto, and often the only, model used in the description and analysis of relationships between a binary outcome and observed features. It is widely used to obtain the conditional probabilities of the outcome given predictors, as well as predictor effect size estimates using conditional odds ratios. Results We show how statistical learning machines for binary outcomes, provably consistent for the nonparametric regression problem, can be used to provide both consistent conditional probability estimation and conditional effect size estimates. Effect size estimates from learning machines leverage our understanding of counterfactual arguments central to the interpretation of such estimates. We show that, if the data generating model is logistic, we can recover accurate probability predictions and effect size estimates with nearly the same efficiency as a correct logistic model, both for main effects and interactions. We also propose a method using learning machines to scan for possible interaction effects quickly and efficiently. Simulations using random forest probability machines are presented. Conclusions The models we propose make no assumptions about the data structure, and capture the patterns in the data by just specifying the predictors involved and not any particular model structure. So they do not run the same risks of model mis-specification and the resultant estimation biases as a logistic model. This methodology, which we call a “risk machine”, will share properties from the statistical machine that it is derived from. PMID:24581306
Risk estimation using probability machines.
Dasgupta, Abhijit; Szymczak, Silke; Moore, Jason H; Bailey-Wilson, Joan E; Malley, James D
2014-03-01
Logistic regression has been the de facto, and often the only, model used in the description and analysis of relationships between a binary outcome and observed features. It is widely used to obtain the conditional probabilities of the outcome given predictors, as well as predictor effect size estimates using conditional odds ratios. We show how statistical learning machines for binary outcomes, provably consistent for the nonparametric regression problem, can be used to provide both consistent conditional probability estimation and conditional effect size estimates. Effect size estimates from learning machines leverage our understanding of counterfactual arguments central to the interpretation of such estimates. We show that, if the data generating model is logistic, we can recover accurate probability predictions and effect size estimates with nearly the same efficiency as a correct logistic model, both for main effects and interactions. We also propose a method using learning machines to scan for possible interaction effects quickly and efficiently. Simulations using random forest probability machines are presented. The models we propose make no assumptions about the data structure, and capture the patterns in the data by just specifying the predictors involved and not any particular model structure. So they do not run the same risks of model mis-specification and the resultant estimation biases as a logistic model. This methodology, which we call a "risk machine", will share properties from the statistical machine that it is derived from.
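A minimal sketch of a "probability machine" in the sense described above, using scikit-learn's random forest to estimate conditional probabilities and a counterfactual risk difference; the data-generating model and all settings are illustrative, not taken from the paper:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Simulated data from a logistic model with a continuous and a binary predictor.
n = 5000
x1, x2 = rng.normal(size=n), rng.integers(0, 2, size=n)
logit = -1.0 + 0.8 * x1 + 1.2 * x2
y = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))
X = np.column_stack([x1, x2])

# "Probability machine": a consistent nonparametric estimator of P(y = 1 | x).
rf = RandomForestClassifier(n_estimators=500, min_samples_leaf=25, random_state=0)
rf.fit(X, y)

# Counterfactual effect size for x2: average difference in predicted risk
# when x2 is set to 1 versus 0 for every subject (an average risk difference).
X1, X0 = X.copy(), X.copy()
X1[:, 1], X0[:, 1] = 1, 0
risk_diff = np.mean(rf.predict_proba(X1)[:, 1] - rf.predict_proba(X0)[:, 1])
print(f"estimated risk difference for x2: {risk_diff:.3f}")
```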
The comparability of different survey designs needs to be established to facilitate integration of data across scales and interpretation of trends over time. Probability-based survey designs are now being investigated to allow condition to be assessed at the watershed scale, an...
A Comprehensive Breath Plume Model for Disease Transmission via Expiratory Aerosols
NASA Astrophysics Data System (ADS)
Halloran, S. K.; Wexler, A. S.; Ristenpart, W. D.
2012-11-01
The peak in influenza incidence during wintertime represents a longstanding unresolved scientific question. One hypothesis is that the efficacy of airborne transmission via aerosols is increased at low humidity and temperature, conditions that prevail in wintertime. Recent experiments with guinea pigs suggest that transmission is indeed maximized at low humidity and temperature, a finding which has been widely interpreted in terms of airborne influenza virus survivability. This interpretation, however, neglects the effect of the airflow on the transmission probability. Here we provide a comprehensive model for assessing the probability of disease transmission via expiratory aerosols between test animals in laboratory conditions. The spread of aerosols emitted from an infected animal is modeled using dispersion theory for a homogeneous turbulent airflow. The concentration and size distribution of the evaporating droplets in the resulting ``Gaussian breath plume'' are calculated as functions of downstream position. We demonstrate that the breath plume model is broadly consistent with the guinea pig experiments, without invoking airborne virus survivability. Moreover, the results highlight the need for careful characterization of the airflow in airborne transmission experiments.
A Stochastic Super-Exponential Growth Model for Population Dynamics
NASA Astrophysics Data System (ADS)
Avila, P.; Rekker, A.
2010-11-01
A super-exponential growth model with environmental noise has been studied analytically. A super-exponential growth rate is a property of dynamical systems exhibiting endogenous nonlinear positive feedback, i.e., of self-reinforcing systems. Environmental noise acts on the growth rate multiplicatively and is assumed to be Gaussian white noise in the Stratonovich interpretation. An analysis of the stochastic super-exponential growth model, with derivations of exact analytical formulae for the conditional probability density and the mean value of the population abundance, is presented. Interpretations and various applications of the results are discussed.
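The exact model specification is given in the paper; the sketch below simulates one plausible reading of it (super-exponential drift with a multiplicatively perturbed growth rate, integrated with a Stratonovich-consistent Heun scheme) to illustrate how the conditional statistics of abundance could be checked numerically. All parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed model form: dN = r*N**(1+beta) dt + sigma*N**(1+beta) o dW (Stratonovich).
r, beta, sigma = 0.5, 0.2, 0.25
dt, n_steps, n_paths = 1e-3, 2000, 10_000
N = np.full(n_paths, 1.0)

def drift(x):
    return r * x ** (1.0 + beta)

def diffusion(x):
    return sigma * x ** (1.0 + beta)

for _ in range(n_steps):
    dW = rng.normal(0.0, np.sqrt(dt), n_paths)
    # Heun predictor-corrector step, consistent with the Stratonovich interpretation.
    N_pred = N + drift(N) * dt + diffusion(N) * dW
    N = N + 0.5 * (drift(N) + drift(N_pred)) * dt + 0.5 * (diffusion(N) + diffusion(N_pred)) * dW

print("sample mean abundance at t = 2:", N.mean())
print("sample 5th/95th percentiles:", np.percentile(N, [5, 95]))
```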
The determination of substrate conditions from the orientations of solitary rugose corals
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bolton, J.C.; Driese, S.G.
1990-10-01
The substrate conditions of mudstone strata formed in ancient epicontinental settings may be determined from taphonomic assemblages of solitary rugose corals. Equal-area plots of the orientations of preserved corals can be used to infer whether subsequent hydrodynamic conditions affected any post-mortem reworking of the corals. Mechanically stable positions for curved corals can be determined. Curved corals preserved in mechanically stable positions are interpreted to have been deposited on firm or hard substrates. Curved corals preserved in mechanically unstable positions were probably embedded in soft or soupy substrates.
Quantifying seining detection probability for fishes of Great Plains sand‐bed rivers
Mollenhauer, Robert; Logue, Daniel R.; Brewer, Shannon K.
2018-01-01
Species detection error (i.e., imperfect and variable detection probability) is an essential consideration when investigators map distributions and interpret habitat associations. When fish detection error that is due to highly variable instream environments needs to be addressed, sand‐bed streams of the Great Plains represent a unique challenge. We quantified seining detection probability for diminutive Great Plains fishes across a range of sampling conditions in two sand‐bed rivers in Oklahoma. Imperfect detection resulted in underestimates of species occurrence using naïve estimates, particularly for less common fishes. Seining detection probability also varied among fishes and across sampling conditions. We observed a quadratic relationship between water depth and detection probability, in which the exact nature of the relationship was species‐specific and dependent on water clarity. Similarly, the direction of the relationship between water clarity and detection probability was species‐specific and dependent on differences in water depth. The relationship between water temperature and detection probability was also species dependent, where both the magnitude and direction of the relationship varied among fishes. We showed how ignoring detection error confounded an underlying relationship between species occurrence and water depth. Despite imperfect and heterogeneous detection, our results support that determining species absence can be accomplished with two to six spatially replicated seine hauls per 200‐m reach under average sampling conditions; however, required effort would be higher under certain conditions. Detection probability was low for the Arkansas River Shiner Notropis girardi, which is federally listed as threatened, and more than 10 seine hauls per 200‐m reach would be required to assess presence across sampling conditions. Our model allows scientists to estimate sampling effort to confidently assess species occurrence, which maximizes the use of available resources. Increased implementation of approaches that consider detection error promote ecological advancements and conservation and management decisions that are better informed.
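The reported effort requirements follow from the cumulative detection probability 1 - (1 - p)^n over n independent replicate hauls; a small sketch with illustrative (not study-estimated) per-haul detection probabilities:

```python
import math

def hauls_needed(p_detect, target=0.95):
    """Seine hauls required so that the probability of at least one detection,
    1 - (1 - p)**n, reaches the target, assuming independent replicate hauls."""
    return math.ceil(math.log(1.0 - target) / math.log(1.0 - p_detect))

# Illustrative per-haul detection probabilities (placeholders, not values from the study).
for species, p in [("common minnow", 0.60), ("uncommon minnow", 0.40),
                   ("Arkansas River Shiner", 0.20)]:
    print(f"{species:22s} p={p:.2f} -> {hauls_needed(p)} hauls for 95% confidence")
```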
1982-09-01
considered to be Markovian, and the fact that Ehrenberg has been openly critical of the use of first-order Markov processes in describing consumer behavior disinclines us from treating these data in this manner. We shall therefore interpret the p(i,i) as joint rather than conditional probabilities
Three-Valued Logics and Conditional Event Algebras
1990-12-05
order) over R; Bruno & Gilio (10), among others. In all of the … represented by the relation a ≤ b iff a = a∧b iff b = b∨a; … above, only De Finetti and Schay … (10) G. Bruno and A. Gilio, "Confronto fra eventi … with interpretations for their outcomes through … condizionati di probabilità nulli nell'infer…
A Comprehensive Breath Plume Model for Disease Transmission via Expiratory Aerosols
Halloran, Siobhan K.; Wexler, Anthony S.; Ristenpart, William D.
2012-01-01
The peak in influenza incidence during wintertime in temperate regions represents a longstanding, unresolved scientific question. One hypothesis is that the efficacy of airborne transmission via aerosols is increased at lower humidities and temperatures, conditions that prevail in wintertime. Recent work with a guinea pig model by Lowen et al. indicated that humidity and temperature do modulate airborne influenza virus transmission, and several investigators have interpreted the observed humidity dependence in terms of airborne virus survivability. This interpretation, however, neglects two key observations: the effect of ambient temperature on the viral growth kinetics within the animals, and the strong influence of the background airflow on transmission. Here we provide a comprehensive theoretical framework for assessing the probability of disease transmission via expiratory aerosols between test animals in laboratory conditions. The spread of aerosols emitted from an infected animal is modeled using dispersion theory for a homogeneous turbulent airflow. The concentration and size distribution of the evaporating droplets in the resulting “Gaussian breath plume” are calculated as functions of position, humidity, and temperature. The overall transmission probability is modeled with a combination of the time-dependent viral concentration in the infected animal and the probability of droplet inhalation by the exposed animal downstream. We demonstrate that the breath plume model is broadly consistent with the results of Lowen et al., without invoking airborne virus survivability. The results also suggest that, at least for guinea pigs, variation in viral kinetics within the infected animals is the dominant factor explaining the increased transmission probability observed at lower temperatures. PMID:22615902
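A generic Gaussian plume calculation of the kind this framework builds on is sketched below; the dispersion coefficients, emission rate, and geometry are placeholders rather than the paper's calibrated values, and droplet evaporation and inhalation probability are omitted.

```python
import numpy as np

def gaussian_plume_concentration(x, y, z, Q, u, H, a=0.1, b=0.9):
    """Generic continuous point-source Gaussian plume with placeholder dispersion
    coefficients sigma = a * x**b; not the paper's calibrated parameters."""
    sigma_y = sigma_z = a * x ** b
    lateral = np.exp(-y**2 / (2 * sigma_y**2))
    vertical = np.exp(-(z - H)**2 / (2 * sigma_z**2))
    return Q / (2 * np.pi * u * sigma_y * sigma_z) * lateral * vertical

# Hypothetical numbers: emission rate Q (droplets/s), airflow u (m/s), source height H (m).
for x in (0.2, 0.5, 1.0, 2.0):
    c = gaussian_plume_concentration(x, y=0.0, z=0.15, Q=100.0, u=0.1, H=0.15)
    print(f"downstream x={x:4.1f} m -> centreline concentration {c:12.1f} droplets/m^3")
```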
Adaptation in pronoun resolution: Evidence from Brazilian and European Portuguese.
Fernandes, Eunice G; Luegi, Paula; Correa Soares, Eduardo; de la Fuente, Israel; Hemforth, Barbara
2018-04-26
Previous research accounting for pronoun resolution as a problem of probabilistic inference has not explored the phenomenon of adaptation, whereby the processor constantly tracks and adapts, rationally, to changes in a statistical environment. We investigate whether Brazilian (BP) and European Portuguese (EP) speakers adapt to variations in the probability of occurrence of ambiguous overt and null pronouns, in two experiments assessing resolution toward subject and object referents. For each variety (BP, EP), participants were faced with either the same number of null and overt pronouns (equal distribution), or with an environment with fewer overt (than null) pronouns (unequal distribution). We find that the preference for interpreting overt pronouns as referring back to an object referent (object-biased interpretation) is higher when there are fewer overt pronouns (i.e., in the unequal, relative to the equal distribution condition). This is especially the case for BP, a variety with higher prior frequency and smaller object-biased interpretation of overt pronouns, suggesting that participants adapted incrementally and integrated prior statistical knowledge with the knowledge obtained in the experiment. We hypothesize that comprehenders adapted rationally, with the goal of maintaining, across variations in pronoun probability, the likelihood of subject and object referents. Our findings unify insights from research in pronoun resolution and in adaptation, and add to previous studies in both topics: They provide evidence for the influence of pronoun probability in pronoun resolution, and for an adaptation process whereby the language processor not only tracks statistical information, but uses it to make interpretational inferences. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
The Effects of Framing, Reflection, Probability, and Payoff on Risk Preference in Choice Tasks.
Kühberger; Schulte-Mecklenbeck; Perner
1999-06-01
A meta-analysis of Asian-disease-like studies is presented to identify the factors which determine risk preference. First the confoundings between probability levels, payoffs, and framing conditions are clarified in a task analysis. Then the role of framing, reflection, probability, type, and size of payoff is evaluated in a meta-analysis. It is shown that bidirectional framing effects exist for gains and for losses. Presenting outcomes as gains tends to induce risk aversion, while presenting outcomes as losses tends to induce risk seeking. Risk preference is also shown to depend on the size of the payoffs, on the probability levels, and on the type of good at stake (money/property vs human lives). In general, higher payoffs lead to increasing risk aversion. Higher probabilities lead to increasing risk aversion for gains and to increasing risk seeking for losses. These findings are confirmed by a subsequent empirical test. Shortcomings of existing formal theories, such as prospect theory, cumulative prospect theory, venture theory, and Markowitz's utility theory, are identified. It is shown that it is not probabilities or payoffs, but the framing condition, which explains most variance. These findings are interpreted as showing that no linear combination of formally relevant predictors is sufficient to capture the essence of the framing phenomenon. Copyright 1999 Academic Press.
On variational definition of quantum entropy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Belavkin, Roman V.
Entropy of a distribution P can be defined in at least three different ways: 1) as the expectation of the Kullback-Leibler (KL) divergence of P from elementary δ-measures (in this case, it is interpreted as expected surprise); 2) as a negative KL-divergence of some reference measure ν from the probability measure P; 3) as the supremum of Shannon’s mutual information taken over all channels such that P is the output probability, in which case it is the dual of a transportation problem. In classical (i.e. commutative) probability, all three definitions lead to the same quantity, providing only different interpretations of entropy. In non-commutative (i.e. quantum) probability, however, these definitions are not equivalent. In particular, the third definition, where the supremum is taken over all entanglements of two quantum systems with P being the output state, leads to a quantity that can be twice the von Neumann entropy. It was proposed originally by V. Belavkin and Ohya [1] and called the proper quantum entropy, because it allows one to define a quantum conditional entropy that is always non-negative. Here we extend these ideas to also define quantum counterparts of proper cross-entropy and cross-information. We also show an inequality between the values of classical and quantum information.
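As a reading aid, the three classical definitions can be written out explicitly. The notation below is my own sketch rather than the paper's formulas; ν is taken to be the counting measure, and the divergence in the second line is written in the direction that recovers the Shannon entropy in the finite, commutative case.

```latex
% Three classical (commutative) definitions of entropy; notation assumed, not the authors'.
\begin{aligned}
  S_1(P) &= \mathbb{E}_{x\sim P}\bigl[D_{\mathrm{KL}}(\delta_x \,\|\, P)\bigr]
          = -\sum_x P(x)\log P(x)            && \text{(expected surprise)}\\
  S_2(P) &= -D_{\mathrm{KL}}(P \,\|\, \nu)    && \text{(relative to a reference measure } \nu\text{)}\\
  S_3(P) &= \sup_{\text{channels with output } P} I(X;Y) && \text{(maximal mutual information)}
\end{aligned}
```

In the quantum case, as the abstract notes, the analogue of the third expression (a supremum over entanglements) can be as large as twice the von Neumann entropy, which is why the three definitions come apart.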
On the zigzagging causality model of EPR correlations and on the interpretation of quantum mechanics
NASA Astrophysics Data System (ADS)
de Beauregard, O. Costa
1988-09-01
Being formalized inside the S-matrix scheme, the zigzagging causality model of EPR correlations has full Lorentz and CPT invariance. EPR correlations, proper or reversed, and Wheeler's smoky dragon metaphor are respectively pictured in spacetime or in the momentum-energy space, as V-shaped, A-shaped, or C-shaped ABC zigzags, with a summation at B over virtual states |B>.
Interpretations of Probability in Quantum Mechanics: A Case of "Experimental Metaphysics"
NASA Astrophysics Data System (ADS)
Hellman, Geoffrey
After reviewing paradigmatic cases of "experimental metaphysics" basing inferences against local realism and determinism on experimental tests of Bell's theorem (and successors), we concentrate on clarifying the meaning and status of "objective probability" in quantum mechanics. The terms "objective" and "subjective" are found ambiguous and inadequate, masking crucial differences turning on the question of what the numerical values of probability functions measure vs. the question of the nature of the "events" on which such functions are defined. This leads naturally to a 2×2 matrix of types of interpretations, which are then illustrated with salient examples. (Of independent interest are the splitting of "Copenhagen interpretation" into "objective" and "subjective" varieties in one of the dimensions and the splitting of Bohmian hidden variables from (other) modal interpretations along that same dimension.) It is then explained why Everett interpretations are difficult to categorize in these terms. Finally, we argue that Bohmian mechanics does not seriously threaten the experimental-metaphysical case for ultimate randomness and purely physical probabilities.
Aitken, C G
1999-07-01
It is thought that, in a consignment of discrete units, a certain proportion of the units contain illegal material. A sample of the consignment is to be inspected. Various methods for the determination of the sample size are compared. The consignment will be considered as a random sample from some super-population of units, a certain proportion of which contain drugs. For large consignments, a probability distribution, known as the beta distribution, for the proportion of the consignment which contains illegal material is obtained. This distribution is based on prior beliefs about the proportion. Under certain specific conditions the beta distribution gives the same numerical results as an approach based on the binomial distribution. The binomial distribution provides a probability for the number of units in a sample which contain illegal material, conditional on knowing the proportion of the consignment which contains illegal material. This is in contrast to the beta distribution which provides probabilities for the proportion of a consignment which contains illegal material, conditional on knowing the number of units in the sample which contain illegal material. The interpretation when the beta distribution is used is much more intuitively satisfactory. It is also much more flexible in its ability to cater for prior beliefs which may vary given the different circumstances of different crimes. For small consignments, a distribution, known as the beta-binomial distribution, for the number of units in the consignment which are found to contain illegal material, is obtained, based on prior beliefs about the number of units in the consignment which are thought to contain illegal material. As with the beta and binomial distributions for large samples, it is shown that, in certain specific conditions, the beta-binomial and hypergeometric distributions give the same numerical results. However, the beta-binomial distribution, as with the beta distribution, has a more intuitively satisfactory interpretation and greater flexibility. The beta and the beta-binomial distributions provide methods for the determination of the minimum sample size to be taken from a consignment in order to satisfy a certain criterion. The criterion requires the specification of a proportion and a probability.
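A minimal numerical sketch of the large-consignment case described above, assuming a uniform Beta(1, 1) prior and a sample in which every inspected unit contains drugs; the proportion threshold and probability criterion below are illustrative choices, not values from the paper.

```python
# Minimum sample size for consignment inspection using a beta posterior.
from scipy.stats import beta

def min_sample_size(theta0=0.5, p0=0.95, a=1.0, b=1.0, n_max=1000):
    """Smallest n such that, if all n inspected units contain drugs, the posterior
    probability that the consignment proportion exceeds theta0 is at least p0.
    The prior on the proportion is Beta(a, b)."""
    for n in range(1, n_max + 1):
        # Posterior after observing n positives in n inspected units: Beta(a + n, b)
        if beta.sf(theta0, a + n, b) >= p0:   # sf(x) = P(proportion > x)
            return n
    return None

print(min_sample_size())
```

With these illustrative values the answer is 4: if four randomly chosen units all contain drugs, the posterior probability that more than half of the consignment contains drugs exceeds 0.95.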
O'Doherty, Kieran C
2007-02-01
The question of what probability actually is has long been debated in philosophy and statistics. Although the concept of probability is fundamental to many applications in the health sciences, these debates are generally not well known to health professionals. This paper begins with an outline of some of the different interpretations of probability. Examples are provided of how each interpretation manifests in clinical practice. The discipline of genetic counselling (familial cancer) is used to ground the discussion. In the second part of the paper, some of the implications that different interpretations of probability may have in practice are examined. The main purpose of the paper is to draw attention to the fact that there is much contention as to the nature of the concept of probability. In practice, this creates the potential for ambiguity and confusion. This paper constitutes a call for deeper engagement with the ways in which probability and risk are understood in health research and practice.
Chance, determinism and the classical theory of probability.
Vasudevan, Anubav
2018-02-01
This paper situates the metaphysical antinomy between chance and determinism in the historical context of some of the earliest developments in the mathematical theory of probability. Since Hacking's seminal work on the subject, it has been a widely held view that the classical theorists of probability were guilty of an unwitting equivocation between a subjective, or epistemic, interpretation of probability, on the one hand, and an objective, or statistical, interpretation, on the other. While there is some truth to this account, I argue that the tension at the heart of the classical theory of probability is not best understood in terms of the duality between subjective and objective interpretations of probability. Rather, the apparent paradox of chance and determinism, when viewed through the lens of the classical theory of probability, manifests itself in a much deeper ambivalence on the part of the classical probabilists as to the rational commensurability of causal and probabilistic reasoning. Copyright © 2017 Elsevier Ltd. All rights reserved.
Mowlavi, Gholamreza; Kacki, Sacha; Dupouy-Camet, Jean; Mobedi, Iraj; Makki, Mahsasadat; Harandi, Majid Fasihi; Naddaf, Saied Reza
2014-01-01
Two calcified objects recovered from a 3rd to 4th-century grave of an adolescent in Amiens (Northern France) were identified as probable hydatid cysts. By using thin-section petrographic techniques, probable Calodium hepaticum (syn. Capillaria hepatica) eggs were identified in the wall of the cysts. Human hepatic capillariosis has not been reported from archaeological material so far, but could be expected given the poor level of environmental hygiene prevalent in this period. Identification of tissue-dwelling parasites such as C. hepaticum in archaeological remains is particularly dependent on preservation conditions and taphonomic changes and should be interpreted with caution due to morphological similarities with Trichuris sp. eggs. © G. Mowlavi et al., published by EDP Sciences, 2014.
Probability and Locality: Determinism Versus Indeterminism in Quantum Mechanics
NASA Astrophysics Data System (ADS)
Dickson, William Michael
1995-01-01
Quantum mechanics is often taken to be necessarily probabilistic. However, this view of quantum mechanics appears to be more the result of historical accident than of careful analysis. Moreover, quantum mechanics in its usual form faces serious problems. Although the mathematical core of quantum mechanics--quantum probability theory--does not face conceptual difficulties, the application of quantum probability to the physical world leads to problems. In particular, quantum mechanics seems incapable of describing our everyday macroscopic experience. Therefore, several authors have proposed new interpretations--including (but not limited to) modal interpretations, spontaneous localization interpretations, the consistent histories approach, and the Bohm theory--each of which deals with quantum-mechanical probabilities differently. Each of these interpretations promises to describe our macroscopic experience and, arguably, each succeeds. Is there any way to compare them? Perhaps, if we turn to another troubling aspect of quantum mechanics, non-locality. Non-locality is troubling because prima facie it threatens the compatibility of quantum mechanics with special relativity. This prima facie threat is mitigated by the no-signalling theorems in quantum mechanics, but nonetheless one may find a 'conflict of spirit' between nonlocality in quantum mechanics and special relativity. Do any of these interpretations resolve this conflict of spirit? There is a strong relation between how an interpretation deals with quantum-mechanical probabilities and how it deals with non-locality. The main argument here is that only a completely deterministic interpretation can be completely local. That is, locality together with the empirical predictions of quantum mechanics (specifically, its strict correlations) entails determinism. But even with this entailment in hand, comparison of the various interpretations requires a look at each, to see how non-locality arises, or in the case of deterministic interpretations, whether it arises. The result of this investigation is that, at the least, deterministic interpretations are no worse off with respect to special relativity than indeterministic interpretations. This conclusion runs against a common view that deterministic interpretations, specifically the Bohm theory, have more difficulty with special relativity than other interpretations.
Martens, Brian K; DiGennaro, Florence D; Reed, Derek D; Szczech, Frances M; Rosenthal, Blair D
2008-01-01
Descriptive assessment methods have been used in applied settings to identify consequences for problem behavior, thereby aiding in the design of effective treatment programs. Consensus has not been reached, however, regarding the types of data or analytic strategies that are most useful for describing behavior–consequence relations. One promising approach involves the analysis of conditional probabilities from sequential recordings of behavior and events that follow its occurrence. In this paper we review several strategies for identifying contingent relations from conditional probabilities, and propose an alternative strategy known as a contingency space analysis (CSA). Step-by-step procedures for conducting and interpreting a CSA using sample data are presented, followed by discussion of the potential use of a CSA for conducting descriptive assessments, informing intervention design, and evaluating changes in reinforcement contingencies following treatment. PMID:18468280
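The core of a contingency space analysis can be sketched in a few lines: estimate the probability of the consequence given the behavior and given its absence, and compare the two. The event coding and sample counts below are hypothetical, not the paper's sample data.

```python
# Contingency space analysis (CSA) sketch: the two conditional probabilities that
# locate a behavior-consequence relation in contingency space.

def csa(records):
    """records: list of (behavior_occurred, consequence_followed) booleans
    for successive observation intervals."""
    b  = [c for beh, c in records if beh]        # intervals with the behavior
    nb = [c for beh, c in records if not beh]    # intervals without it
    p_c_given_b  = sum(b)  / len(b)  if b  else float("nan")
    p_c_given_nb = sum(nb) / len(nb) if nb else float("nan")
    return p_c_given_b, p_c_given_nb             # equal values suggest no contingency

# Hypothetical example: attention follows problem behavior in 6 of 8 intervals with
# the behavior, but in only 2 of 12 intervals without it.
records = ([(True, True)] * 6 + [(True, False)] * 2 +
           [(False, True)] * 2 + [(False, False)] * 10)
print(csa(records))  # (0.75, ~0.17): attention appears contingent on the behavior
```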
Mollenhauer, Robert; Mouser, Joshua B.; Brewer, Shannon K.
2018-01-01
Temporal and spatial variability in streams result in heterogeneous gear capture probability (i.e., the proportion of available individuals identified) that confounds interpretation of data used to monitor fish abundance. We modeled tow-barge electrofishing capture probability at multiple spatial scales for nine Ozark Highland stream fishes. In addition to fish size, we identified seven reach-scale environmental characteristics associated with variable capture probability: stream discharge, water depth, conductivity, water clarity, emergent vegetation, wetted width–depth ratio, and proportion of riffle habitat. The magnitude of the relationship between capture probability and both discharge and depth varied among stream fishes. We also identified lithological characteristics among stream segments as a coarse-scale source of variable capture probability. The resulting capture probability model can be used to adjust catch data and derive reach-scale absolute abundance estimates across a wide range of sampling conditions with similar effort as used in more traditional fisheries surveys (i.e., catch per unit effort). Adjusting catch data based on variable capture probability improves the comparability of data sets, thus promoting both well-informed conservation and management decisions and advances in stream-fish ecology.
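The basic adjustment that such a capture-probability model supports can be sketched as follows. The catches and capture probabilities are invented for illustration; in the study itself, capture probability is predicted from the reach- and segment-scale covariates listed above rather than assumed.

```python
# Sketch: adjusting raw catch by an estimated capture probability to obtain an
# absolute abundance estimate for a sampled reach (illustrative values only).
def adjusted_abundance(catch, capture_prob):
    """Estimated number of individuals present = catch / capture probability."""
    if not (0.0 < capture_prob <= 1.0):
        raise ValueError("capture probability must be in (0, 1]")
    return catch / capture_prob

# Two reaches with the same catch but different sampling conditions
print(adjusted_abundance(30, 0.60))  # 50 individuals where conditions favour capture
print(adjusted_abundance(30, 0.25))  # 120 individuals where capture probability is low
```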
Computer-aided diagnosis with potential application to rapid detection of disease outbreaks.
Burr, Tom; Koster, Frederick; Picard, Rick; Forslund, Dave; Wokoun, Doug; Joyce, Ed; Brillman, Judith; Froman, Phil; Lee, Jack
2007-04-15
Our objectives are to quickly interpret symptoms of emergency patients to identify likely syndromes and to improve population-wide disease outbreak detection. We constructed a database of 248 syndromes, each syndrome having an estimated probability of producing any of 85 symptoms, with some two-way, three-way, and five-way probabilities reflecting correlations among symptoms. Using these multi-way probabilities in conjunction with an iterative proportional fitting algorithm allows estimation of full conditional probabilities. Combining these conditional probabilities with misdiagnosis error rates and incidence rates via Bayes theorem, the probability of each syndrome is estimated. We tested a prototype of computer-aided differential diagnosis (CADDY) on simulated data and on more than 100 real cases, including West Nile Virus, Q fever, SARS, anthrax, plague, tularaemia and toxic shock cases. We conclude that: (1) it is important to determine whether the unrecorded positive status of a symptom means that the status is negative or that the status is unknown; (2) inclusion of misdiagnosis error rates produces more realistic results; (3) the naive Bayes classifier, which assumes all symptoms behave independently, is slightly outperformed by CADDY, which includes available multi-symptom information on correlations; as more information regarding symptom correlations becomes available, the advantage of CADDY over the naive Bayes classifier should increase; (4) overlooking low-probability, high-consequence events is less likely if the standard output summary is augmented with a list of rare syndromes that are consistent with observed symptoms, and (5) accumulating patient-level probabilities across a larger population can aid in biosurveillance for disease outbreaks. Copyright 2007 John Wiley & Sons, Ltd.
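For orientation, the naive Bayes baseline that CADDY is compared against can be sketched in a few lines. The syndrome/symptom table and incidence rates below are invented for illustration; the multi-way symptom probabilities, iterative proportional fitting, and misdiagnosis error rates that distinguish CADDY are not reproduced here.

```python
# Naive-Bayes baseline: combine per-symptom probabilities with syndrome incidence
# rates via Bayes' theorem, assuming symptoms are conditionally independent.
def posterior(symptoms_present, p_symptom_given_syndrome, incidence):
    post = {}
    for s, prior in incidence.items():
        like = prior
        for sym, present in symptoms_present.items():
            p = p_symptom_given_syndrome[s].get(sym, 0.01)  # small default probability
            like *= p if present else (1.0 - p)
        post[s] = like
    z = sum(post.values())
    return {s: v / z for s, v in post.items()}

# Invented two-syndrome, two-symptom example
p_symptom = {"influenza": {"fever": 0.9, "rash": 0.05},
             "anthrax":   {"fever": 0.8, "rash": 0.30}}
incidence = {"influenza": 0.99, "anthrax": 0.01}
print(posterior({"fever": True, "rash": True}, p_symptom, incidence))
```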
Interpretation of the results of statistical measurements. [search for basic probability model
NASA Technical Reports Server (NTRS)
Olshevskiy, V. V.
1973-01-01
For random processes, the calculated probability characteristic and the measured statistical estimate are used in a quality functional which defines the difference between the two functions. Based on the assumption that the statistical measurement procedure is organized so that the parameters of the selected model are optimized, it is shown that the interpretation of experimental research is a search for a basic probability model.
Lohmann, W
1978-01-01
The shape of the survivorship curve is easily interpreted if the probability of death is proportional to an exponentially rising function of age. Following Ries's use of a sum of terms to determine the age index, we investigated the extent to which the survivorship curve can be approximated by a sum of exponentials. For plausible parameter values, the difference between the pure exponential function and a sum of exponentials lies within the random variation. Because the probability of death varies between diseases, the new formulation is the better one.
Profit intensity and cases of non-compliance with the law of demand/supply
NASA Astrophysics Data System (ADS)
Makowski, Marcin; Piotrowski, Edward W.; Sładkowski, Jan; Syska, Jacek
2017-05-01
We consider properties of the measurement intensity ρ of a random variable for which the probability density function, represented by the corresponding Wigner function, attains negative values on part of the domain. We consider a simple economic interpretation of this problem. The model is used to demonstrate the applicability of the method to the analysis of negative probabilities in markets where there are anomalies in the law of supply and demand (e.g. Giffen goods). It turns out that the new conditions to optimize the intensity ρ require a new strategy. We propose such a strategy (the so-called à rebours strategy), based on the fixed point method, and explore its effectiveness.
Predictive models attribute effects on fish assemblages to toxicity and habitat alteration.
de Zwart, Dick; Dyer, Scott D; Posthuma, Leo; Hawkins, Charles P
2006-08-01
Biological assessments should both estimate the condition of a biological resource (magnitude of alteration) and provide environmental managers with a diagnosis of the potential causes of impairment. Although methods of quantifying condition are well developed, identifying and proportionately attributing impairment to probable causes remain problematic. Furthermore, analyses of both condition and cause have often been difficult to communicate. We developed an approach that (1) links fish, habitat, and chemistry data collected from hundreds of sites in Ohio (USA) streams, (2) assesses the biological condition at each site, (3) attributes impairment to multiple probable causes, and (4) provides the results of the analyses in simple-to-interpret pie charts. The data set was managed using a geographic information system. Biological condition was assessed using a RIVPACS (river invertebrate prediction and classification system)-like predictive model. The model provided probabilities of capture for 117 fish species based on the geographic location of sites and local habitat descriptors. Impaired biological condition was defined as the proportion of those native species predicted to occur at a site that were observed. The potential toxic effects of exposure to mixtures of contaminants were estimated using species sensitivity distributions and mixture toxicity principles. Generalized linear regression models described species abundance as a function of habitat characteristics. Statistically linking biological condition, habitat characteristics including mixture risks, and species abundance allowed us to evaluate the losses of species with environmental conditions. Results were mapped as simple effect and probable-cause pie charts (EPC pie diagrams), with pie sizes corresponding to magnitude of local impairment, and slice sizes to the relative probable contributions of different stressors. The types of models we used have been successfully applied in ecology and ecotoxicology, but they have not previously been used in concert to quantify impairment and its likely causes. Although data limitations constrained our ability to examine complex interactions between stressors and species, the direct relationships we detected likely represent conservative estimates of stressor contributions to local impairment. Future refinements of the general approach and specific methods described here should yield even more promising results.
Probability Matching in the Right Hemisphere
ERIC Educational Resources Information Center
Miller, M.B.; Valsangkar-Smyth, M.
2005-01-01
Previously it has been shown that the left hemisphere, but not the right, of split-brain patients tends to match the frequency of previous occurrences in probability-guessing paradigms (Wolford, Miller, & Gazzaniga, 2000). This phenomenon has been attributed to an "interpreter," a mechanism for making interpretations and forming hypotheses,…
Recursive recovery of Markov transition probabilities from boundary value data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Patch, Sarah Kathyrn
1994-04-01
In an effort to mathematically describe the anisotropic diffusion of infrared radiation in biological tissue, Gruenbaum posed an anisotropic diffusion boundary value problem in 1989. In order to accommodate anisotropy, he discretized the temporal as well as the spatial domain. The probabilistic interpretation of the diffusion equation is retained; radiation is assumed to travel according to a random walk (of sorts). In this random walk the probabilities with which photons change direction depend upon their previous as well as present location. The forward problem gives boundary value data as a function of the Markov transition probabilities. The inverse problem requires finding the transition probabilities from boundary value data. Problems in the plane are studied carefully in this thesis. Consistency conditions amongst the data are derived. These conditions have two effects: they prohibit inversion of the forward map but permit smoothing of noisy data. Next, a recursive algorithm which yields a family of solutions to the inverse problem is detailed. This algorithm takes advantage of all independent data and generates a system of highly nonlinear algebraic equations. Pluecker-Grassmann relations are instrumental in simplifying the equations. The algorithm is used to solve the 4 x 4 problem. Finally, the smallest nontrivial problem in three dimensions, the 2 x 2 x 2 problem, is solved.
The Everett-Wheeler interpretation and the open future
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sudbery, Anthony
2011-03-28
I discuss the meaning of probability in the Everett-Wheeler interpretation of quantum mechanics, together with the problem of defining histories. To resolve these, I propose an understanding of probability arising from a form of temporal logic: the probability of a future-tense proposition is identified with its truth value in a many-valued and context-dependent logic. In short, probability is degree of truth. These ideas relate to traditional naive ideas of time and chance. Indeed, I argue that Everettian quantum mechanics is the only form of scientific theory that truly incorporates the perception that the future is open.
On the predictability of outliers in ensemble forecasts
NASA Astrophysics Data System (ADS)
Siegert, S.; Bröcker, J.; Kantz, H.
2012-03-01
In numerical weather prediction, ensembles are used to retrieve probabilistic forecasts of future weather conditions. We consider events where the verification is smaller than the smallest, or larger than the largest ensemble member of a scalar ensemble forecast. These events are called outliers. In a statistically consistent K-member ensemble, outliers should occur with a base rate of 2/(K+1). In operational ensembles this base rate tends to be higher. We study the predictability of outlier events in terms of the Brier Skill Score and find that forecast probabilities can be calculated which are more skillful than the unconditional base rate. This is shown analytically for statistically consistent ensembles. Using logistic regression, forecast probabilities for outlier events in an operational ensemble are calculated. These probabilities exhibit positive skill which is quantitatively similar to the analytical results. Possible causes of these results as well as their consequences for ensemble interpretation are discussed.
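The 2/(K+1) base rate is easy to check by simulation. The sketch below uses synthetic Gaussian ensembles and verifications (an assumption of convenience, since any statistically consistent ensemble would do) and defines the Brier skill score used to judge whether a forecast beats that unconditional base rate.

```python
# Sketch: outlier base rate of a statistically consistent K-member ensemble, and the
# Brier skill score of a probability forecast of the outlier event.  Data are synthetic.
import numpy as np

def outlier_events(ensembles, verifications):
    """1 where the verification falls outside the ensemble range, else 0."""
    lo = ensembles.min(axis=1)
    hi = ensembles.max(axis=1)
    return ((verifications < lo) | (verifications > hi)).astype(float)

def brier_skill(p_forecast, outcome, p_reference):
    bs     = np.mean((p_forecast  - outcome) ** 2)
    bs_ref = np.mean((p_reference - outcome) ** 2)
    return 1.0 - bs / bs_ref

K, n = 10, 5000
rng = np.random.default_rng(0)
ens = rng.normal(size=(n, K))
ver = rng.normal(size=n)                   # consistent ensemble: same distribution as members
y   = outlier_events(ens, ver)
base_rate = 2.0 / (K + 1)                  # theoretical rate, here 2/11 ~ 0.18
print(y.mean(), base_rate)                 # empirical rate should be close to the theory

p_clim = np.full(n, base_rate)             # unconditional base-rate forecast
print(brier_skill(p_clim, y, p_clim))      # 0 by construction: the reference to beat
```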
Communicating the Threat of a Tropical Cyclone to the Eastern Range
NASA Technical Reports Server (NTRS)
Winters, Katherine A.; Roeder, William P.; McAleenan, Mike; Belson, Brian L.; Shafer, Jaclyn A.
2012-01-01
The 45th Weather Squadron (45 WS) has developed a tool to help visualize the Wind Speed Probability product from the National Hurricane Center (NHC) and to help communicate that information to space launch customers and decision makers at the 45th Space Wing (45 SW) and Kennedy Space Center (KSC) located in east central Florida. This paper reviews previous work and presents the new visualization tool, including initial feedback as well as the pros and cons. The NHC began issuing their Wind Speed Probability product for tropical cyclones publicly in 2006. The 45 WS uses this product to provide a threat assessment to 45 SW and KSC leadership for risk evaluations with an approaching tropical cyclone. Although the wind speed probabilities convey the uncertainty of a tropical cyclone well, communicating this information to customers is a challenge. The 45 WS continually strives to provide the wind speed probability information to customers in a context which clearly communicates the threat of a tropical cyclone. First, an intern from the Florida Institute of Technology (FIT) Atmospheric Sciences department, sponsored by Scitor Corporation, independently evaluated the NHC wind speed probability product. This work was later extended into a M.S. thesis at FIT, partially funded by Scitor Corporation and KSC. A second thesis at FIT further extended the evaluation partially funded by KSC. Using this analysis, the 45 WS categorized the probabilities into five probability interpretation categories: Very Low, Low, Moderate, High, and Very High. These probability interpretation categories convert the forecast probability and forecast interval into easily understood categories that are consistent across all ranges of probabilities and forecast intervals. As a follow-on project, KSC funded a summer intern to evaluate the human factors of the probability interpretation categories, which ultimately refined some of the thresholds. The 45 WS created a visualization tool to express the timing and risk for multiple locations in a single graphic. Preliminary results on an on-going project by FIT will be included in this paper. This project is developing a new method of assigning the probability interpretation categories and updating the evaluation of the performance of the NHC wind speed probability analysis.
Comparison of formation and fluid-column logs in a heterogeneous basalt aquifer
Paillet, F.L.; Williams, J.H.; Oki, D.S.; Knutson, K.D.
2002-01-01
Deep observation boreholes in the vicinity of active production wells in Honolulu, Hawaii, exhibit the anomalous condition that fluid-column electrical conductivity logs and apparent profiles of pore-water electrical conductivity derived from induction conductivity logs are nearly identical if a formation factor of 12.5 is assumed. This condition is documented in three boreholes where fluid-column logs clearly indicate the presence of strong borehole flow induced by withdrawal from partially penetrating water-supply wells. This result appears to contradict the basic principles of conductivity-log interpretation. Flow conditions in one of these boreholes was investigated in detail by obtaining flow profiles under two water production conditions using the electromagnetic flowmeter. The flow-log interpretation demonstrates that the fluid-column log resembles the induction log because the amount of inflow to the borehole increases systematically upward through the transition zone between deeper salt water and shallower fresh water. This condition allows the properties of the fluid column to approximate the properties of water entering the borehole as soon as the upflow stream encounters that producing zone. Because this condition occurs in all three boreholes investigated, the similarity of induction and fluid-column logs is probably not a coincidence, and may relate to aquifer response under the influence of pumping from production wells.
Rapidly locating and characterizing pollutant releases in buildings.
Sohn, Michael D; Reynolds, Pamela; Singh, Navtej; Gadgil, Ashok J
2002-12-01
Releases of airborne contaminants in or near a building can lead to significant human exposures unless prompt response measures are taken. However, possible responses can include conflicting strategies, such as shutting the ventilation system off versus running it in a purge mode or having occupants evacuate versus sheltering in place. The proper choice depends in part on knowing the source locations, the amounts released, and the likely future dispersion routes of the pollutants. We present an approach that estimates this information in real time. It applies Bayesian statistics to interpret measurements of airborne pollutant concentrations from multiple sensors placed in the building and computes best estimates and uncertainties of the release conditions. The algorithm is fast, capable of continuously updating the estimates as measurements stream in from sensors. We demonstrate the approach using a hypothetical pollutant release in a five-room building. Unknowns to the interpretation algorithm include location, duration, and strength of the source, and some building and weather conditions. Two sensor sampling plans and three levels of data quality are examined. Data interpretation in all examples is rapid; however, locating and characterizing the source with high probability depends on the amount and quality of data and the sampling plan.
Blackmore, C Craig; Terasawa, Teruhiko
2006-02-01
Error in radiology can be reduced by standardizing the interpretation of imaging studies to the optimum sensitivity and specificity. In this report, the authors demonstrate how the optimal interpretation of appendiceal computed tomography (CT) can be determined and how it varies in different clinical scenarios. Utility analysis and receiver operating characteristic (ROC) curve modeling were used to evaluate the trade-off between false-positive and false-negative test results and to determine the optimal operating point on the ROC curve for the interpretation of appendicitis CT. Modeling was based on a previous meta-analysis of the accuracy of CT and on literature estimates of the utilities of various health states. The posttest probability of appendicitis was derived using Bayes's theorem. At a low prevalence of disease (screening), appendicitis CT should be interpreted at high specificity (97.7%), even at the expense of lower sensitivity (75%). Conversely, at a high probability of disease, high sensitivity (97.4%) is preferred (specificity 77.8%). When the clinical diagnosis of appendicitis is equivocal, CT interpretation should emphasize both sensitivity and specificity (sensitivity 92.3%, specificity 91.5%). Radiologists can potentially decrease medical error and improve patient health by varying the interpretation of appendiceal CT on the basis of the clinical probability of appendicitis. This report is an example of how utility analysis can be used to guide radiologists in the interpretation of imaging studies and provide guidance on appropriate targets for the standardization of interpretation.
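A hedged sketch of the post-test calculation: Bayes' theorem applied to the three operating points quoted above. The pre-test probabilities (0.10, 0.50, 0.90) are illustrative stand-ins for the screening, equivocal, and high-probability scenarios and are not values from the report, nor are the utilities used to select the operating points reproduced.

```python
# Post-test probability of disease from sensitivity, specificity and pre-test probability.
def post_test(pretest, sens, spec, positive=True):
    if positive:
        return (sens * pretest) / (sens * pretest + (1 - spec) * (1 - pretest))
    return ((1 - sens) * pretest) / ((1 - sens) * pretest + spec * (1 - pretest))

# Screening (low prevalence): high-specificity reading, positive CT
print(post_test(0.10, 0.75, 0.977))
# Equivocal clinical picture: balanced reading, positive CT
print(post_test(0.50, 0.923, 0.915))
# High clinical probability: high-sensitivity reading, negative CT
print(post_test(0.90, 0.974, 0.778, positive=False))
```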
Complementarity and Correlations
NASA Astrophysics Data System (ADS)
Maccone, Lorenzo; Bruß, Dagmar; Macchiavello, Chiara
2015-04-01
We provide an interpretation of entanglement based on classical correlations between measurement outcomes of complementary properties: States that have correlations beyond a certain threshold are entangled. The reverse is not true, however. We also show that, surprisingly, all separable nonclassical states exhibit smaller correlations for complementary observables than some strictly classical states. We use mutual information as a measure of classical correlations, but we conjecture that the first result holds also for other measures (e.g., the Pearson correlation coefficient or the sum of conditional probabilities).
System for information discovery
Pennock, Kelly A [Richland, WA; Miller, Nancy E [Kennewick, WA
2002-11-19
A sequence of word filters is used to eliminate terms in the database which do not discriminate document content, resulting in a filtered word set and a topic word set whose members are highly predictive of content. These two word sets are then formed into a two-dimensional matrix with entries calculated as the conditional probability that a document will contain the word in a row given that it contains the word in a column. The matrix representation allows the resulting vectors to be used to interpret document contents.
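A small sketch of the matrix construction described above. The toy "documents" and topic words are invented, and the real system would apply the word filters before estimating the conditional probabilities.

```python
# Conditional-probability word matrix: entry [row][col] estimates
# P(document contains the row word | it contains the column word).
from collections import defaultdict

def cooccurrence_matrix(docs, topic_words):
    contains = {w: {i for i, d in enumerate(docs) if w in d} for w in topic_words}
    matrix = defaultdict(dict)
    for row in topic_words:
        for col in topic_words:
            col_docs = contains[col]
            matrix[row][col] = (len(contains[row] & col_docs) / len(col_docs)
                                if col_docs else 0.0)
    return matrix

# Toy corpus: each document is a set of topic words it contains
docs = [{"probability", "quantum"}, {"probability", "bayes"}, {"quantum", "entropy"}]
m = cooccurrence_matrix(docs, ["probability", "quantum", "bayes"])
print(m["quantum"]["probability"])   # P(doc has "quantum" | doc has "probability") = 0.5
```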
Self-imposed timeouts under increasing response requirements.
NASA Technical Reports Server (NTRS)
Dardano, J. F.
1973-01-01
Three male White Carneaux pigeons were used in the investigation. None of the results obtained contradicts the interpretation of self-imposed timeouts as an escape response reinforced by the removal of unfavorable reinforcement conditions, although some details of the performances reflect either a weak control and/or operation of other controlling variables. Timeout key responding can be considered as one of several classes of behavior having a low probability of occurrence, all of which compete with the behavior maintained by positive reinforcement schedule.
ERIC Educational Resources Information Center
Salleh, Safrul Izani Mohd; Gardner, John C.; Sulong, Zunaidah; McGowan, Carl B., Jr.
2011-01-01
This study examines the differences in the interpretation of ten "in context" verbal probability expressions used in accounting standards between native Chinese speaking and native English speaking accounting students in United Kingdom universities. The study assesses the degree of grouping factors consensus on the numerical…
NASA Astrophysics Data System (ADS)
Iwakoshi, Takehisa; Hirota, Osamu
2014-10-01
This study tests an interpretation in quantum key distribution (QKD) according to which the trace distance between the distributed quantum state and the ideal mixed state is the maximum failure probability of the protocol. This interpretation was proposed around 2004 and standardized to satisfy both key uniformity, in the sense of universal composability, and an operational meaning for the failure probability of the key extraction. However, the proposal has not been verified concretely in the intervening years, and H. P. Yuen and O. Hirota have cast doubt on the interpretation since 2009. To assess it, a physical random number generator was employed to evaluate key uniformity in QKD. We calculated the statistical distance, which corresponds to the trace distance in quantum theory once a quantum measurement has been made, and compared it with the failure probability to check whether universal composability was obtained. The statistical distance between the probability distribution of the physical random numbers and the ideal uniform distribution turned out to be very large. We also explain, from the viewpoint of quantum binary decision theory, why trace distance is not suitable for guaranteeing security in QKD.
Incorporating uncertainty into medical decision making: an approach to unexpected test results.
Bianchi, Matt T; Alexander, Brian M; Cash, Sydney S
2009-01-01
The utility of diagnostic tests derives from the ability to translate the population concepts of sensitivity and specificity into information that will be useful for the individual patient: the predictive value of the result. As the array of available diagnostic testing broadens, there is a temptation to de-emphasize history and physical findings and defer to the objective rigor of technology. However, diagnostic test interpretation is not always straightforward. One significant barrier to routine use of probability-based test interpretation is the uncertainty inherent in pretest probability estimation, the critical first step of Bayesian reasoning. The context in which this uncertainty presents the greatest challenge is when test results oppose clinical judgment. It is this situation when decision support would be most helpful. The authors propose a simple graphical approach that incorporates uncertainty in pretest probability and has specific application to the interpretation of unexpected results. This method quantitatively demonstrates how uncertainty in disease probability may be amplified when test results are unexpected (opposing clinical judgment), even for tests with high sensitivity and specificity. The authors provide a simple nomogram for determining whether an unexpected test result suggests that one should "switch diagnostic sides." This graphical framework overcomes the limitation of pretest probability uncertainty in Bayesian analysis and guides decision making when it is most challenging: interpretation of unexpected test results.
A fuzzy Bayesian network approach to quantify the human behaviour during an evacuation
NASA Astrophysics Data System (ADS)
Ramli, Nurulhuda; Ghani, Noraida Abdul; Ahmad, Nazihah
2016-06-01
The Bayesian Network (BN) has been regarded as a successful representation of the inter-relationships among factors affecting human behavior during an emergency. This paper extends earlier work on quantifying the variables involved in a BN model of human behavior during an evacuation using a well-known direct probability elicitation technique. To overcome judgment bias and reduce the expert's burden in providing precise probability values, a new approach to elicitation is required. This study proposes a new fuzzy BN approach for quantifying human behavior during an evacuation. Three major phases of methodology are involved: 1) development of a qualitative model representing human factors during an evacuation, 2) quantification of the BN model using fuzzy probability, and 3) inference from and interpretation of the BN results. A case study of three interdependent human evacuation factors, namely danger assessment ability, information about the threat, and stressful conditions, is used to illustrate the application of the proposed method. The approach serves as an alternative to the conventional probability elicitation technique for understanding human behavior during an evacuation.
Warren, Tessa; Dickey, Michael Walsh; Liburd, Teljer L
2017-07-01
The rational inference, or noisy channel, account of language comprehension predicts that comprehenders are sensitive to the probabilities of different interpretations for a given sentence and adapt as these probabilities change (Gibson, Bergen & Piantadosi, 2013). This account provides an important new perspective on aphasic sentence comprehension: aphasia may increase the likelihood of sentence distortion, leading people with aphasia (PWA) to rely more on the prior probability of an interpretation and less on the form or structure of the sentence (Gibson, Sandberg, Fedorenko, Bergen & Kiran, 2015). We report the results of a sentence-picture matching experiment that tested the predictions of the rational inference account and other current models of aphasic sentence comprehension across a variety of sentence structures. Consistent with the rational inference account, PWA showed similar sensitivity to the probability of particular kinds of form distortions as age-matched controls, yet overall their interpretations relied more on prior probability and less on sentence form. As predicted by rational inference, but not by other models of sentence comprehension in aphasia, PWA's interpretations were more faithful to the form for active and passive sentences than for direct object and prepositional object sentences. However contra rational inference, there was no evidence that individual PWA's severity of syntactic or semantic impairment predicted their sensitivity to form versus the prior probability of a sentence, as cued by semantics. These findings confirm and extend previous findings that suggest the rational inference account holds promise for explaining aphasic and neurotypical comprehension, but they also raise new challenges for the account. Copyright © 2017 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Lin, Wei; Chen, Yu-hua; Wang, Ji-yuan; Gao, Hong-sheng; Wang, Ji-jun; Su, Rong-hua; Mao, Wei
2011-04-01
Detection probability is an important index for representing and estimating target viability, providing a basis for target recognition and decision-making. In practice, however, obtaining detection probability requires a great deal of time and manpower, and because interpreters differ in practical knowledge and experience, the data obtained often vary widely. By studying the relationship between image features and perceived quantity through psychological experiments, a probability model has been established as follows. First, four image features that directly affect detection were extracted and quantified, and four feature similarity degrees between target and background were defined. Second, the relationship between each single-feature similarity degree and perceived quantity was established on psychological principles, and target-interpretation experiments were designed involving about five hundred interpreters and two hundred images. To reduce correlation among image features, a large set of synthetic images was produced, each differing from its background in a single feature: brightness, chromaticity, texture, or shape. The model parameters were then determined by analyzing and fitting the large body of experimental data. Finally, by applying statistical decision theory to the experimental results, the relationship between perceived quantity and target detection probability was found. Verified against extensive target interpretation in practice, the model yields target detection probability quickly and objectively.
Dopkins, Stephen; Varner, Kaitlin; Hoyer, Darin
2017-10-01
In word recognition semantic priming of test words increased the false-alarm rate and the mean of confidence ratings to lures. Such priming also increased the standard deviation of confidence ratings to lures and the slope of the z-ROC function, suggesting that the priming increased the standard deviation of the lure evidence distribution. The Unequal Variance Signal Detection (UVSD) model interpreted the priming as increasing the standard deviation of the lure evidence distribution. Without additional parameters the Dual Process Signal Detection (DPSD) model could only accommodate the results by fitting the data for related and unrelated primes separately, interpreting the priming, implausibly, as decreasing the probability of target recollection (DPSD). With an additional parameter, for the probability of false (lure) recollection the model could fit the data for related and unrelated primes together, interpreting the priming as increasing the probability of false recollection. These results suggest that DPSD estimates of target recollection probability will decrease with increases in the lure confidence/evidence standard deviation unless a parameter is included for false recollection. Unfortunately the size of a given lure confidence/evidence standard deviation relative to other possible lure confidence/evidence standard deviations is often unspecified by context. Hence the model often has no way of estimating false recollection probability and thereby correcting its estimates of target recollection probability.
Detection of the toughest: Pedestrian injury risk as a smooth function of age.
Niebuhr, Tobias; Junge, Mirko
2017-07-04
Though it is common to refer to age-specific groups (e.g., children, adults, elderly), smooth trends conditional on age are mainly ignored in the literature. The present study examines the pedestrian injury risk in full-frontal pedestrian-to-passenger car accidents and incorporates age, in addition to collision speed and injury severity, as a plug-in parameter. Recent work introduced a model for pedestrian injury risk functions using explicit formulae with easily interpretable model parameters. This model is expanded by pedestrian age as another model parameter. Using the German In-Depth Accident Study (GIDAS) to obtain age-specific risk proportions, the model parameters are fitted to the raw data and then smoothed by broken-line regression. The approach supplies explicit probabilities for pedestrian injury risk conditional on pedestrian age, collision speed, and injury severity under investigation. All results are mutually consistent in the sense that risks for more severe injuries are less probable than those for less severe injuries. As a side product, the approach indicates specific ages at which the risk behavior fundamentally changes. These threshold values can be interpreted as the most robust ages for pedestrians. The obtained age-wise risk functions can be aggregated and adapted to any population. The presented approach is formulated in such general terms that it can be directly used for other data sets or additional parameters; for example, the pedestrian's sex. Thus far, no other study using age as a plug-in parameter can be found.
A Framework for Multi-Stakeholder Decision-Making and ...
This contribution describes the implementation of the conditional-value-at-risk (CVaR) metric to create a general multi-stakeholder decision-making framework. It is observed that stakeholder dissatisfactions (distance to their individual ideal solutions) can be interpreted as random variables. We thus shape the dissatisfaction distribution and find an optimal compromise solution by solving a CVaR minimization problem parameterized in the probability level. This enables us to generalize multi-stakeholder settings previously proposed in the literature that minimizes average and worst-case dissatisfactions. We use the concept of the CVaR norm to give a geometric interpretation to this problem and use the properties of this norm to prove that the CVaR minimization problem yields Pareto optimal solutions for any choice of the probability level. We discuss a broad range of potential applications of the framework. We demonstrate the framework in a bio-waste processing facility location case study, where we seek compromise solutions (facility locations) that balance stakeholder priorities on transportation, safety, water quality, and capital costs. This conference presentation abstract explains a new decision-making framework that computes compromise solution alternatives (reach consensus) by mitigating dissatisfactions among stakeholders as needed for SHC Decision Science and Support Tools project.
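A minimal sketch of the CVaR computation at the heart of the framework, assuming stakeholder dissatisfactions have already been computed for each candidate solution. The candidate names, dissatisfaction values, and probability level are invented, and the real framework solves a parameterized optimization problem rather than enumerating candidates.

```python
# Conditional value-at-risk (CVaR) of stakeholder dissatisfactions for each candidate
# compromise solution, and a brute-force choice of the candidate minimizing it.
import numpy as np

def cvar(dissatisfactions, alpha):
    """Mean of the worst (1 - alpha) fraction of dissatisfactions."""
    x = np.sort(np.asarray(dissatisfactions, dtype=float))
    var = np.quantile(x, alpha)                 # value-at-risk at probability level alpha
    return x[x >= var].mean()

candidates = {                                  # dissatisfaction of four stakeholders (invented)
    "site A": [0.1, 0.7, 0.2, 0.9],
    "site B": [0.4, 0.4, 0.5, 0.4],
}
alpha = 0.75                                    # probability level (assumed)
best = min(candidates, key=lambda c: cvar(candidates[c], alpha))
print(best, {c: round(cvar(v, alpha), 3) for c, v in candidates.items()})
```

Sweeping alpha from 0 to 1 recovers the spectrum mentioned above: alpha near 0 minimizes the average dissatisfaction, while alpha near 1 minimizes the worst-case dissatisfaction.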
Chen, Junwen; Milne, Kirby; Dayman, Janet; Kemps, Eva
2018-05-23
Two studies aimed to examine whether high socially anxious individuals are more likely to negatively interpret ambiguous social scenarios and facial expressions compared to low socially anxious individuals. We also examined whether interpretation bias serves as a mediator of the relationship between trait social anxiety and state anxiety responses, in particular current state anxiety, bodily sensations, and perceived probability and cost of negative evaluation pertaining to a speech task. Study 1 used ambiguous social scenarios and Study 2 used ambiguous facial expressions as stimuli to objectively assess interpretation bias. Undergraduate students with high and low social anxiety completed measures of state anxiety responses at three time points: baseline, after the interpretation bias task, and after the preparation for an impromptu speech. Results showed that high socially anxious individuals were more likely to endorse threat interpretations for ambiguous social scenarios and to interpret ambiguous faces as negative than low socially anxious individuals. Furthermore, negative interpretations mediated the relationship between trait social anxiety and perceived probability of negative evaluation pertaining to the speech task in Study 1 but not Study 2. The present studies provide new insight into the role of interpretation bias in social anxiety.
ERIC Educational Resources Information Center
Karelitz, Tzur M.; Budescu, David V.
2004-01-01
When forecasters and decision makers describe uncertain events using verbal probability terms, there is a risk of miscommunication because people use different probability phrases and interpret them in different ways. In an effort to facilitate the communication process, the authors investigated various ways of converting the forecasters' verbal…
Information theoretic quantification of diagnostic uncertainty.
Westover, M Brandon; Eiseman, Nathaniel A; Cash, Sydney S; Bianchi, Matt T
2012-01-01
Diagnostic test interpretation remains a challenge in clinical practice. Most physicians receive training in the use of Bayes' rule, which specifies how the sensitivity and specificity of a test for a given disease combine with the pre-test probability to quantify the change in disease probability incurred by a new test result. However, multiple studies demonstrate physicians' deficiencies in probabilistic reasoning, especially with unexpected test results. Information theory, a branch of probability theory dealing explicitly with the quantification of uncertainty, has been proposed as an alternative framework for diagnostic test interpretation, but is even less familiar to physicians. We have previously addressed one key challenge in the practical application of Bayes theorem: the handling of uncertainty in the critical first step of estimating the pre-test probability of disease. This essay aims to present the essential concepts of information theory to physicians in an accessible manner, and to extend previous work regarding uncertainty in pre-test probability estimation by placing this type of uncertainty within a principled information theoretic framework. We address several obstacles hindering physicians' application of information theoretic concepts to diagnostic test interpretation. These include issues of terminology (mathematical meanings of certain information theoretic terms differ from clinical or common parlance) as well as the underlying mathematical assumptions. Finally, we illustrate how, in information theoretic terms, one can understand the effect on diagnostic uncertainty of considering ranges instead of simple point estimates of pre-test probability.
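The information-theoretic quantities involved can be sketched for a binary disease state and a binary test: entropy before the test, expected entropy after it, and their difference, which is the mutual information between disease and test result. The sensitivity, specificity, and pre-test probability below are illustrative values, not from the paper.

```python
# Diagnostic uncertainty in bits before and after a test, and the expected reduction.
import math

def H(p):                                   # binary entropy in bits
    return 0.0 if p in (0.0, 1.0) else -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

def diagnostic_information(pre, sens, spec):
    p_pos = sens * pre + (1 - spec) * (1 - pre)          # P(test positive)
    post_pos = sens * pre / p_pos                        # P(disease | positive)
    post_neg = (1 - sens) * pre / (1 - p_pos)            # P(disease | negative)
    h_before = H(pre)
    h_after  = p_pos * H(post_pos) + (1 - p_pos) * H(post_neg)
    return h_before, h_after, h_before - h_after         # last value = mutual information

print(diagnostic_information(pre=0.3, sens=0.9, spec=0.8))
```

The expected reduction is always non-negative, even though, as the essay emphasizes, an individual unexpected result can leave the clinician more uncertain than before.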
Habitable periglacial landscapes in martian mid-latitudes
NASA Astrophysics Data System (ADS)
Ulrich, M.; Wagner, D.; Hauber, E.; de Vera, J.-P.; Schirrmeister, L.
2012-05-01
Subsurface permafrost environments on Mars are considered to be zones where extant life could have survived. For the identification of possible habitats it is important to understand periglacial landscape evolution and related subsurface and environmental conditions. Many landforms that are interpreted to be related to ground ice are located in the martian mid-latitudinal belts. This paper summarizes the insights gained from studies of terrestrial analogs to permafrost landforms on Mars. The potential habitability of martian mid-latitude periglacial landscapes is exemplarily deduced for one such landscape, that of Utopia Planitia, by a review and discussion of environmental conditions influencing periglacial landscape evolution. Based on recent calculations of the astronomical forcing of climate changes, specific climate periods are identified within the last 10 Ma when thaw processes and liquid water were probably important for the development of permafrost geomorphology. No periods could be identified within the last 4 Ma which met the suggested threshold criteria for liquid water and habitable conditions. Implications of past and present environmental conditions such as temperature variations, ground-ice conditions, and liquid water activity are discussed with respect to the potential survival of highly-specialized microorganisms known from terrestrial permafrost. We conclude that possible habitable subsurface niches might have been developed in close relation to specific permafrost landform morphology on Mars. These would have probably been dominated by lithoautotrophic microorganisms (i.e. methanogenic archaea).
The Irrelevance of the Risk-Uncertainty Distinction.
Roser, Dominic
2017-10-01
Precautionary Principles are often said to be appropriate for decision-making in contexts of uncertainty such as climate policy. Contexts of uncertainty are contrasted to contexts of risk depending on whether we have probabilities or not. Against this view, I argue that the risk-uncertainty distinction is practically irrelevant. I start by noting that the history of the distinction between risk and uncertainty is more varied than is sometimes assumed. In order to examine the distinction, I unpack the idea of having probabilities, in particular by distinguishing three interpretations of probability: objective, epistemic, and subjective probability. I then claim that if we are concerned with whether we have probabilities at all, regardless of how low their epistemic credentials are, then we almost always have probabilities for policy-making. The reason is that subjective and epistemic probability are the relevant interpretations of probability and we almost always have subjective and epistemic probabilities. In contrast, if we are only concerned with probabilities that have sufficiently high epistemic credentials, then we obviously do not always have probabilities. Climate policy, for example, would then be a case of decision-making under uncertainty. But, so I argue, we should not dismiss probabilities with low epistemic credentials. Rather, when they are the best available probabilities our decision principles should make use of them. And, since they are almost always available, the risk-uncertainty distinction remains irrelevant.
Skill of Ensemble Seasonal Probability Forecasts
NASA Astrophysics Data System (ADS)
Smith, Leonard A.; Binter, Roman; Du, Hailiang; Niehoerster, Falk
2010-05-01
In operational forecasting, the computational complexity of large simulation models is, ideally, justified by enhanced performance over simpler models. We will consider probability forecasts and contrast the skill of ENSEMBLES-based seasonal probability forecasts of interest to the finance sector (specifically temperature forecasts for Nino 3.4 and the Atlantic Main Development Region (MDR)). The ENSEMBLES model simulations will be contrasted against forecasts from statistical models based on the observations (climatological distributions) and empirical dynamics based on the observations but conditioned on the current state (dynamical climatology). For some start dates, individual ENSEMBLES models yield significant skill even at a lead-time of 14 months. The nature of this skill is discussed, and prospects for application are noted. Questions surrounding the interpretation of probability forecasts based on these multi-model ensemble simulations are then considered; the distributions considered are formed by kernel dressing the ensemble and blending with the climatology. The sources of apparent (RMS) skill in distributions based on multi-model simulations are discussed, and it is demonstrated that the inclusion of "zero-skill" models in the long range can improve Root-Mean-Square-Error scores, casting some doubt on the common justification for the claim that all models should be included in forming an operational probability forecast. It is argued that the rational response varies with lead time.
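The kernel dressing and climatology blending step mentioned above can be made concrete with a small sketch. This is a minimal illustration rather than the ENSEMBLES processing chain: the ensemble values, climatological sample, kernel width and blending weight alpha below are all invented for demonstration, and the example relies on scipy.

```python
import numpy as np
from scipy.stats import norm

def blended_density(x, ensemble, climatology, kernel_sigma, alpha):
    """Kernel-dressed ensemble density blended with a climatological density."""
    dressed = np.mean([norm.pdf(x, m, kernel_sigma) for m in ensemble], axis=0)
    clim = norm.pdf(x, np.mean(climatology), np.std(climatology))
    return alpha * dressed + (1.0 - alpha) * clim

# Hypothetical temperature anomalies: 9 ensemble members, 30 years of
# climatological observations, and an assumed blending weight alpha.
x = np.linspace(-3, 3, 601)
ensemble = [0.8, 1.1, 0.9, 1.4, 0.7, 1.0, 1.2, 0.6, 1.3]
climatology = np.random.default_rng(6).normal(0.0, 1.0, 30)
pdf = blended_density(x, ensemble, climatology, kernel_sigma=0.4, alpha=0.7)
print(np.trapz(pdf, x))   # ~1: the blend remains a probability density
```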
Extreme river flow dependence in Northern Scotland
NASA Astrophysics Data System (ADS)
Villoria, M. Franco; Scott, M.; Hoey, T.; Fischbacher-Smith, D.
2012-04-01
Various methods for the spatial analysis of hydrologic data have been developed recently. Here we present results using the conditional probability approach proposed by Keef et al. [Appl. Stat. (2009): 58,601-18] to investigate spatial interdependence in extreme river flows in Scotland. This approach does not require the specification of a correlation function and is most suitable for relatively small geographical areas. The work is motivated by the Flood Risk Management (Scotland) Act 2009, which requires maps of flood risk that take account of spatial dependence in extreme river flow. The method is based on two conditional measures of spatial flood risk: firstly the conditional probability PC(p) that a set of sites Y = (Y1,...,Yd) within a region C of interest exceed a flow threshold Qp at time t (or any lag of t), given that flow at the specified conditioning site exceeds Qp (X > Qp); and, secondly the expected number of sites within C that will exceed a flow Qp on average (given that X > Qp). The conditional probabilities are estimated using the conditional distribution of Y |X = x (for large x), which can be modeled using a semi-parametric approach (Heffernan and Tawn [Roy. Statist. Soc. Ser. B (2004): 66,497-546]). Once the model is fitted, pseudo-samples can be generated to estimate functionals of the joint tails of the distribution of (Y,X). Conditional return level plots were directly compared to traditional return level plots, thus improving our understanding of the dependence structure of extreme river flow events. Confidence intervals were calculated using block bootstrapping methods (100 replicates). We report results from applying this approach to a set of four rivers (Dulnain, Lossie, Ewe and Ness) in Northern Scotland. These sites were chosen based on data quality, spatial location and catchment characteristics. The river Ness, being the largest (catchment size 1839.1 km2), was chosen as the conditioning river. Both the Ewe (441.1 km2) and Ness catchments have predominantly impermeable bedrock, with the Ewe's catchment being particularly wet. The Lossie (216 km2) and Dulnain (272.2 km2) both contain significant areas of glacial deposits. River flow in the Dulnain is usually affected by snowmelt. In all cases, the conditional probability of each of the three rivers (Dulnain, Lossie, Ewe) decreases as the event in the conditioning river (Ness) becomes more extreme. The Ewe, despite being the furthest of the three sites from the Ness, shows the strongest dependence, with relatively high (>0.4) conditional probabilities even for very extreme events (>0.995). Although the Lossie is closer geographically to the Ness than the Ewe, it shows relatively low conditional probabilities and can be considered independent of the Ness for very extreme events (>0.990). The conditional probabilities seem to reflect the different catchment characteristics and dominant precipitation-generating events, with the Ewe being more similar to the Ness than the other two rivers. This interpretation suggests that the conditional method may yield improved estimates of extreme events, but the approach is time-consuming. An alternative model that is easier to implement, using spatial quantile regression, is currently being investigated; this would also allow the introduction of further covariates, which will be essential as the effects of climate change are incorporated into estimation procedures.
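As a simplest-case illustration of the conditional measure described above, the sketch below estimates P(Y > Qp | X > Qp) empirically for a single pair of sites from synthetic correlated flow data. It is not the Heffernan and Tawn semi-parametric model; the data, correlation strength and quantile levels are invented.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic paired daily flows for a conditioning site X (e.g. the Ness)
# and a dependent site Y (e.g. the Ewe); correlated lognormal surrogate data.
n = 10_000
z = rng.multivariate_normal([0.0, 0.0], [[1.0, 0.7], [0.7, 1.0]], size=n)
x, y = np.exp(z[:, 0]), np.exp(z[:, 1])

def conditional_exceedance(x, y, p):
    """Empirical P(Y > Q_p(Y) | X > Q_p(X)) for a common quantile level p."""
    qx, qy = np.quantile(x, p), np.quantile(y, p)
    cond = x > qx
    return np.mean(y[cond] > qy)

for p in (0.90, 0.99, 0.995):
    print(p, round(conditional_exceedance(x, y, p), 3))
```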
ANALYSIS OF A CLASSIFICATION ERROR MATRIX USING CATEGORICAL DATA TECHNIQUES.
Rosenfield, George H.; Fitzpatrick-Lins, Katherine
1984-01-01
Summary form only given. A classification error matrix typically contains tabulation results of an accuracy evaluation of a thematic classification, such as that of a land use and land cover map. The diagonal elements of the matrix represent the correctly classified counts, and the usual designation of classification accuracy has been the total percent correct. The nondiagonal elements of the matrix have usually been neglected. The classification error matrix is known in statistical terms as a contingency table of categorical data. As an example, an application of these methodologies to a problem of remotely sensed data concerning two photointerpreters and four categories of classification indicated that there is no significant difference in the interpretation between the two photointerpreters, and that there are significant differences among the interpreted category classifications. However, two categories, oak and cottonwood, are not separable in classification in this experiment at the 0.51 percent probability level. A coefficient of agreement is determined for the interpreted map as a whole, and individually for each of the interpreted categories. A conditional coefficient of agreement for the individual categories is compared to other methods for expressing category accuracy which have already been presented in the remote sensing literature.
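The coefficient of agreement referred to above is commonly computed as Cohen's kappa, which corrects the overall proportion correct for chance agreement. The sketch below assumes that identification; the 4-category error matrix is invented for illustration.

```python
import numpy as np

def kappa(matrix):
    """Cohen's kappa coefficient of agreement for a square error matrix."""
    m = np.asarray(matrix, dtype=float)
    n = m.sum()
    observed = np.trace(m) / n                                 # overall proportion correct
    expected = (m.sum(axis=0) * m.sum(axis=1)).sum() / n**2    # chance agreement
    return (observed - expected) / (1.0 - expected)

# Hypothetical 4-category error matrix (rows: reference, columns: interpreted).
m = [[35,  3,  1,  1],
     [ 4, 30,  4,  2],
     [ 2,  5, 28,  5],
     [ 1,  2,  6, 31]]
print(round(kappa(m), 3))   # ~0.70 for this made-up matrix
```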
Damage evaluation by a guided wave-hidden Markov model based method
NASA Astrophysics Data System (ADS)
Mei, Hanfei; Yuan, Shenfang; Qiu, Lei; Zhang, Jinjin
2016-02-01
Guided wave based structural health monitoring has shown great potential in aerospace applications. However, one of the key challenges of practical engineering applications is the accurate interpretation of the guided wave signals under time-varying environmental and operational conditions. This paper presents a guided wave-hidden Markov model based method to improve the damage evaluation reliability of real aircraft structures under time-varying conditions. In the proposed approach, an HMM-based unweighted moving average trend estimation method, which can capture the trend of damage propagation from the posterior probability obtained by HMM modeling, is used to achieve a probabilistic evaluation of the structural damage. To validate the developed method, experiments are performed on a hole-edge crack specimen under fatigue loading conditions and a real aircraft wing spar under changing structural boundary conditions. Experimental results show the advantage of the proposed method.
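The trend-estimation step can be illustrated without the full HMM machinery: given a time series of posterior probabilities for the "damage" state (here simulated rather than produced by an actual HMM fit to guided-wave features), an unweighted moving average smooths the noisy posteriors into a propagation trend. The window length and the synthetic series are assumptions.

```python
import numpy as np

# Hypothetical posterior probabilities of the "damage" state over time,
# standing in for the output of an HMM fitted to guided-wave damage indices.
rng = np.random.default_rng(1)
posterior = np.clip(np.linspace(0.05, 0.9, 120) + rng.normal(0, 0.15, 120), 0, 1)

def unweighted_moving_average(p, window=10):
    """Unweighted moving average trend of a posterior-probability sequence."""
    kernel = np.ones(window) / window
    return np.convolve(p, kernel, mode="valid")

trend = unweighted_moving_average(posterior, window=10)
print(trend[:3], trend[-3:])   # noisy posteriors smoothed into a damage trend
```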
Understanding disease mechanisms with models of signaling pathway activities.
Sebastian-Leon, Patricia; Vidal, Enrique; Minguez, Pablo; Conesa, Ana; Tarazona, Sonia; Amadoz, Alicia; Armero, Carmen; Salavert, Francisco; Vidal-Puig, Antonio; Montaner, David; Dopazo, Joaquín
2014-10-25
Understanding the aspects of the cell functionality that account for disease or drug action mechanisms is one of the main challenges in the analysis of genomic data and lies at the basis of the future implementation of precision medicine. Here we propose a simple probabilistic model in which signaling pathways are separated into elementary sub-pathways or signal transmission circuits (which ultimately trigger cell functions), and gene expression measurements are then transformed into probabilities of activation of such signal transmission circuits. Using this model, differential activation of such circuits between biological conditions can be estimated. Thus, circuit activation statuses can be interpreted as biomarkers that discriminate among the compared conditions. This type of mechanism-based biomarker accounts for cell functional activities and can easily be associated with disease or drug action mechanisms. The accuracy of the proposed model is demonstrated with simulations and real datasets. The proposed model provides detailed information that enables the interpretation of disease mechanisms as a consequence of the complex combinations of altered gene expression values. Moreover, it offers a framework for suggesting possible ways of therapeutic intervention in a pathologically perturbed system.
NASA Astrophysics Data System (ADS)
Stamenkovic, Philippe
2016-08-01
This paper tries to reconstruct Ernst Cassirer's potential reception of the EPR argument, as presented by Einstein in his letter to Cassirer of March 1937. It is shown that, in conformity with his transcendental epistemology taking the conditions of accessibility as constitutive of the quantum object, Cassirer would probably have rejected the argument. Indeed, Cassirer would probably not have subscribed to its separability/local causality presupposition (which goes against his interpretation of the quantum formalism as a self-sufficient condition constitutive of the quantum object, without any reliance on spatial intuition), nor to its completeness requirement (as his partial endorsement of Bohr's complementarity, and his rejection of the Kantian "idea of complete determination", illustrate). By rejecting both of its premises, Cassirer's philosophy of physics thus makes it possible to escape the EPR dilemma, and exhibits what, in Kantian terms, might be called a "negative utility" with respect to physical science. A further investigation of the anti-reductionist utility of Cassirer's systematic philosophy with respect to physics and other "symbolic forms" is finally suggested.
Extensional versus Intuitive Reasoning: The Conjunction Fallacy in Probability Judgment.
ERIC Educational Resources Information Center
Tversky, Amos; Kahneman, Daniel
1983-01-01
Judgments under uncertainty are often mediated by intuitive heuristics that are not bound by the conjunction rule of probability. Representativeness and availability heuristics can make a conjunction appear more probable than one of its constituents. Alternative interpretations of this conjunction fallacy are discussed and attempts to combat it…
NASA Technical Reports Server (NTRS)
Merrill, John T.; Rodriguez, Jose M.
1991-01-01
Trajectory and photochemical model calculations based on retrospective meteorological data for the operations areas of the NASA Pacific Exploratory Mission (PEM)-West are summarized. The trajectory climatology discussed here is intended to provide guidance for flight planning and initial data interpretation during the field phase of the expedition by indicating the most probable paths air parcels take to reach various points in the area. The photochemical model calculations which are discussed indicate the sensitivity of the chemical environment to various initial chemical concentrations and to conditions along the trajectory. In the post-expedition analysis these calculations will be used to provide a climatological context for the meteorological conditions which are encountered in the field.
NASA Technical Reports Server (NTRS)
Celaya, Jose R.; Saxen, Abhinav; Goebel, Kai
2012-01-01
This article discusses several aspects of uncertainty representation and management for model-based prognostics methodologies based on our experience with Kalman Filters when applied to prognostics for electronics components. In particular, it explores the implications of modeling remaining useful life prediction as a stochastic process and how it relates to uncertainty representation, management, and the role of prognostics in decision-making. A distinction between the interpretations of estimated remaining useful life probability density function and the true remaining useful life probability density function is explained and a cautionary argument is provided against mixing interpretations for the two while considering prognostics in making critical decisions.
Quantum Bayesian perspective for intelligence reservoir characterization, monitoring and management
NASA Astrophysics Data System (ADS)
Lozada Aguilar, Miguel Ángel; Khrennikov, Andrei; Oleschko, Klaudia; de Jesús Correa, María
2017-10-01
The paper starts with a brief review of the literature about uncertainty in geological, geophysical and petrophysical data. In particular, we present the viewpoints of experts in geophysics on the application of Bayesian inference and subjective probability. Then we present arguments that the use of classical probability theory (CP) does not match completely the structure of geophysical data. We emphasize that such data are characterized by contextuality and non-Kolmogorovness (the impossibility to use the CP model), incompleteness as well as incompatibility of some geophysical measurements. These characteristics of geophysical data are similar to the characteristics of quantum physical data. Notwithstanding all this, contextuality can be seen as a major deviation of quantum theory from classical physics. In particular, the contextual probability viewpoint is the essence of the Växjö interpretation of quantum mechanics. We propose to use quantum probability (QP) for decision-making during the characterization, modelling, exploring and management of the intelligent hydrocarbon reservoir. Quantum Bayesianism (QBism), one of the recently developed information interpretations of quantum theory, can be used as the interpretational basis for such QP decision-making in geology, geophysics and petroleum projects design and management. This article is part of the themed issue `Second quantum revolution: foundational questions'.
On the Determinants of the Conjunction Fallacy: Probability versus Inductive Confirmation
ERIC Educational Resources Information Center
Tentori, Katya; Crupi, Vincenzo; Russo, Selena
2013-01-01
Major recent interpretations of the conjunction fallacy postulate that people assess the probability of a conjunction according to (non-normative) averaging rules as applied to the constituents' probabilities or represent the conjunction fallacy as an effect of random error in the judgment process. In the present contribution, we contrast such…
Serang, Oliver; MacCoss, Michael J.; Noble, William Stafford
2010-01-01
The problem of identifying proteins from a shotgun proteomics experiment has not been definitively solved. Identifying the proteins in a sample requires ranking them, ideally with interpretable scores. In particular, “degenerate” peptides, which map to multiple proteins, have made such a ranking difficult to compute. The problem of computing posterior probabilities for the proteins, which can be interpreted as confidence in a protein’s presence, has been especially daunting. Previous approaches have either ignored the peptide degeneracy problem completely, addressed it by computing a heuristic set of proteins or heuristic posterior probabilities, or by estimating the posterior probabilities with sampling methods. We present a probabilistic model for protein identification in tandem mass spectrometry that recognizes peptide degeneracy. We then introduce graph-transforming algorithms that facilitate efficient computation of protein probabilities, even for large data sets. We evaluate our identification procedure on five different well-characterized data sets and demonstrate our ability to efficiently compute high-quality protein posteriors. PMID:20712337
NASA Astrophysics Data System (ADS)
Smith, L. A.
2016-12-01
While probability forecasting has many philosophical and mathematical attractions, it is something of a dishonest nonsense if acting on such forecasts is expected to lead to rapid ruin. Model-based probabilities, when interpreted as actionable, are shown to lead to the rapid ruin of a cooperative entity offering odds interpreting the probability forecasts at face value. Arguably, these odds would not be considered "fair", but inasmuch as some definitions of "fair odds" include this case, this presentation will focus on "sustainable odds": Odds which are not expected to lead to the rapid ruin of the cooperative under the assumption that those placing bets have no information beyond that available to the forecast system. It is argued that sustainable odds will not correspond to probabilities outside the Perfect Model Scenario, that the "implied probabilities" determined from sustainable odds will always sum to more than one, and that the excess of this sum over one reflects the skill of the forecast system, being a quantitative measure of structural model error.
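The claim that implied probabilities from sustainable odds sum to more than one can be illustrated with a toy example. The decimal odds below are invented and are not from the presentation; the point is only that the excess of the sum over one ("overround") is where the forecast system's allowance for structural model error sits.

```python
# Illustrative sketch: "implied probabilities" from decimal odds.
# A cooperative quoting decimal odds o_i pays o_i per unit stake on outcome i;
# the implied probability of outcome i is 1 / o_i.  For sustainable odds the
# implied probabilities sum to more than one.

decimal_odds = {"above normal": 2.2, "near normal": 2.8, "below normal": 3.1}
implied = {k: 1.0 / o for k, o in decimal_odds.items()}
print(implied)
print("sum of implied probabilities:", round(sum(implied.values()), 3))  # > 1
```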
Diagnosing pulmonary embolisms: the clinician's point of view.
Carrillo Alcaraz, A; Martínez, A López; Solano, F J Sotos
Pulmonary thromboembolism is common and potentially severe. To ensure the correct approach to the diagnostic workup of pulmonary thromboembolism, it is essential to know the basic concepts governing the use of the different tests available. The diagnostic approach to pulmonary thromboembolism is an example of the application of the conditional probabilities of Bayes' theorem in daily practice. To interpret the available diagnostic tests correctly, it is necessary to analyze different concepts that are fundamental for decision making. Thus, it is necessary to know what the likelihood ratios, 95% confidence intervals, and decision thresholds mean. Whether to determine the D-dimer concentration or to do CT angiography or other imaging tests depends on their capacity to modify the pretest probability of having the disease to a posttest probability that is higher or lower than the thresholds for action. This review aims to clarify the diagnostic sequence of thromboembolic pulmonary disease, analyzing the main diagnostic tools (clinical examination, laboratory tests, and imaging tests), placing special emphasis on the principles that govern evidence-based medicine. Copyright © 2016 SERAM. Publicado por Elsevier España, S.L.U. All rights reserved.
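A hedged sketch of the odds form of Bayes' theorem that underlies this diagnostic sequence: the pre-test probability is converted to odds, multiplied by a likelihood ratio, and converted back to a probability, which is then compared with the decision thresholds. The pre-test probability and likelihood ratios below are placeholders, not values endorsed by the review.

```python
def posttest_probability(pretest_prob, likelihood_ratio):
    """Convert a pre-test probability to a post-test probability via odds."""
    pretest_odds = pretest_prob / (1.0 - pretest_prob)
    posttest_odds = pretest_odds * likelihood_ratio
    return posttest_odds / (1.0 + posttest_odds)

# Hypothetical numbers: intermediate clinical probability of PE (0.30),
# a negative D-dimer with LR- ~ 0.1, a positive CT angiogram with LR+ ~ 20.
print(posttest_probability(0.30, 0.1))   # ~0.04 -> may fall below the action threshold
print(posttest_probability(0.30, 20.0))  # ~0.90 -> may exceed the action threshold
```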
Opinion evolution and rare events in an open community
NASA Astrophysics Data System (ADS)
Ye, Yusong; Yang, Zhuoqin; Zhang, Zili
2016-11-01
There are many multi-stable phenomena in society. To explain these multi-stable phenomena, we have studied opinion evolution in an open community. We focus on the probability of transition (or the mean transition time) for the system to transfer from one state to another. We suggest a bistable model to provide an interpretation of these phenomena. The quasi-potential method that we use is the key method for calculating the transition time, and it can also be used to determine the whole probability density. We study the condition of bistability and then discuss rare events in a multi-stable system. In our model, we find that two parameters, "temperature" and "persuading intensity," influence the behavior of the system; a suitable "persuading intensity" and low "temperature" make the system more stable. This means that the transition rarely happens. The asymmetric phenomenon caused by "public opinion" is also discussed.
Generalized monogamy of contextual inequalities from the no-disturbance principle.
Ramanathan, Ravishankar; Soeda, Akihito; Kurzyński, Paweł; Kaszlikowski, Dagomir
2012-08-03
In this Letter, we demonstrate that the property of monogamy of Bell violations seen for no-signaling correlations in composite systems can be generalized to the monogamy of contextuality in single systems obeying the Gleason property of no disturbance. We show how one can construct monogamies for contextual inequalities by using the graph-theoretic technique of vertex decomposition of a graph representing a set of measurements into subgraphs of suitable independence numbers that themselves admit a joint probability distribution. After establishing that all the subgraphs that are chordal graphs admit a joint probability distribution, we formulate a precise graph-theoretic condition that gives rise to the monogamy of contextuality. We also show how such monogamies arise within quantum theory for a single four-dimensional system and interpret violation of these relations in terms of a violation of causality. These monogamies can be tested with current experimental techniques.
Fermi's paradox, extraterrestrial life and the future of humanity: a Bayesian analysis
NASA Astrophysics Data System (ADS)
Verendel, Vilhelm; Häggström, Olle
2017-01-01
The Great Filter interpretation of Fermi's great silence asserts that Npq is not a very large number, where N is the number of potentially life-supporting planets in the observable universe, p is the probability that a randomly chosen such planet develops intelligent life to the level of present-day human civilization, and q is the conditional probability that it then goes on to develop a technological supercivilization visible all over the observable universe. Evidence suggests that N is huge, which implies that pq is very small. Hanson (1998) and Bostrom (2008) have argued that the discovery of extraterrestrial life would point towards p not being small and therefore a very small q, which can be seen as bad news for humanity's prospects of colonizing the universe. Here we investigate whether a Bayesian analysis supports their argument, and the answer turns out to depend critically on the choice of prior distribution.
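The arithmetic of the Great Filter argument is easy to make explicit. The numbers below (N, the "not very large" bound on Npq, and the trial values of p) are illustrative assumptions only, chosen to show how a given p forces an upper bound on q.

```python
# Toy illustration (numbers are assumptions, not from the paper): if N*p*q is
# at most of order one and N is huge, then p*q must be tiny; learning that p is
# not small therefore forces q to be small.

N = 1e22          # assumed number of potentially life-supporting planets
Npq_max = 1.0     # "not a very large number" taken as order one
pq_max = Npq_max / N
for p in (1e-3, 1e-6, 1e-12):
    print(f"p = {p:g}  ->  q <= {pq_max / p:g}")
```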
Bradley, Paul M.
2011-01-01
Chlororespiration is common in shallow aquifer systems under conditions nominally identified as anoxic. Consequently, chlororespiration is a key component of remediation at many chloroethene-contaminated sites. In some instances, limited accumulation of reductive dechlorination daughter products is interpreted as evidence that natural attenuation is not adequate for site remediation. This conclusion is justified when evidence for parent compound (tetrachloroethene, PCE, or trichloroethene, TCE) degradation is lacking. For many chloroethene-contaminated shallow aquifer systems, however, nonconservative losses of the parent compounds are clear but the mass balance between parent compound attenuation and accumulation of reductive dechlorination daughter products is incomplete. Incomplete mass balance indicates a failure to account for important contaminant attenuation mechanisms and is consistent with contaminant degradation to nondiagnostic mineralization products like CO2. While anoxic mineralization of chloroethene compounds has been proposed previously, recent results suggest that oxygen-based mineralization of chloroethenes also can be significant at dissolved oxygen concentrations below the currently accepted field standard for nominally anoxic conditions. Thus, reassessment of the role and potential importance of low concentrations of oxygen in chloroethene biodegradation are needed, because mischaracterization of operant biodegradation processes can lead to expensive and ineffective remedial actions. A modified interpretive framework is provided for assessing the potential for chloroethene biodegradation under different redox conditions and the probable role of oxygen in chloroethene biodegradation.
On the generation of climate model ensembles
NASA Astrophysics Data System (ADS)
Haughton, Ned; Abramowitz, Gab; Pitman, Andy; Phipps, Steven J.
2014-10-01
Climate model ensembles are used to estimate uncertainty in future projections, typically by interpreting the ensemble distribution for a particular variable probabilistically. There are, however, different ways to produce climate model ensembles that yield different results, and therefore different probabilities for a future change in a variable. Perhaps equally importantly, there are different approaches to interpreting the ensemble distribution that lead to different conclusions. Here we use a reduced-resolution climate system model to compare three common ways to generate ensembles: initial conditions perturbation, physical parameter perturbation, and structural changes. Despite these three approaches conceptually representing very different categories of uncertainty within a modelling system, when comparing simulations to observations of surface air temperature they can be very difficult to separate. Using the twentieth century CMIP5 ensemble for comparison, we show that initial conditions ensembles, in theory representing internal variability, significantly underestimate observed variance. Structural ensembles, perhaps less surprisingly, exhibit over-dispersion in simulated variance. We argue that future climate model ensembles may need to include parameter or structural perturbation members in addition to perturbed initial conditions members to ensure that they sample uncertainty due to internal variability more completely. We note that where ensembles are over- or under-dispersive, such as for the CMIP5 ensemble, estimates of uncertainty need to be treated with care.
Aspects of Geodesical Motion with Fisher-Rao Metric: Classical and Quantum
NASA Astrophysics Data System (ADS)
Ciaglia, Florio M.; Cosmo, Fabio Di; Felice, Domenico; Mancini, Stefano; Marmo, Giuseppe; Pérez-Pardo, Juan M.
The purpose of this paper is to exploit the geometric structure of quantum mechanics and of statistical manifolds to study the qualitative effect that the quantum properties have in the statistical description of a system. We show that the end points of geodesics in the classical setting coincide with the probability distributions that minimise Shannon’s entropy, i.e. with distributions of zero dispersion. In the quantum setting this happens only for particular initial conditions, which in turn correspond to classical submanifolds. This result can be interpreted as a geometric manifestation of the uncertainty principle.
The emergent Copenhagen interpretation of quantum mechanics
NASA Astrophysics Data System (ADS)
Hollowood, Timothy J.
2014-05-01
We introduce a new and conceptually simple interpretation of quantum mechanics based on reduced density matrices of sub-systems from which the standard Copenhagen interpretation emerges as an effective description of macroscopically large systems. This interpretation describes a world in which definite measurement results are obtained with probabilities that reproduce the Born rule. Wave function collapse is seen to be a useful but fundamentally unnecessary piece of prudent book keeping which is only valid for macro-systems. The new interpretation lies in a class of modal interpretations in that it applies to quantum systems that interact with a much larger environment. However, we show that it does not suffer from the problems that have plagued similar modal interpretations like macroscopic superpositions and rapid flipping between macroscopically distinct states. We describe how the interpretation fits neatly together with fully quantum formulations of statistical mechanics and that a measurement process can be viewed as a process of ergodicity breaking analogous to a phase transition. The key feature of the new interpretation is that joint probabilities for the ergodic subsets of states of disjoint macro-systems only arise as emergent quantities. Finally we give an account of the EPR-Bohm thought experiment and show that the interpretation implies the violation of the Bell inequality characteristic of quantum mechanics but in a way that is rather novel. The final conclusion is that the Copenhagen interpretation gives a completely satisfactory phenomenology of macro-systems interacting with micro-systems.
Probable reasons for expressed agitation in persons with dementia.
Ragneskog, H; Gerdner, L A; Josefsson, K; Kihlgren, M
1998-05-01
Nursing home patients with dementia were videotaped in three previous studies. Sixty sequences of nine patients exhibiting agitated behaviors were examined to identify the most probable antecedents to agitation. Probable reasons were interpreted and applied to the Progressively Lowered Stress Threshold model, which suggests that agitation is stress related. Analysis suggests that agitation often serves as a form of communication. Two underlying reasons seem to be that the patient had loss of control over the situation and deficient autonomy. The most common causes for expressed agitation were interpreted as discomfort, a wish to be served immediately, conflict between patients or with nursing staff, reactions to environmental noises or sound, and invasion of personal space. It is recommended that nursing staff promote autonomy and independency for this group of patients whenever possible. By evaluating probable reasons for expressed agitation, the nursing staff can take steps to prevent or alleviate agitation.
Dalthorp, Daniel; Huso, Manuela
2015-12-02
Confirming the accuracy of predicted take and providing evidence that permitted take levels have not been exceeded can be challenging because carcasses may be detected with probability much less than 1, and often no carcasses are observed. When detection probability is high, finding 0 carcasses can be interpreted as evidence that none (or few) were actually killed. As the probability of observing an individual decreases, the likelihood of missing carcasses increases, making it unclear how to interpret having observed 0 (or few) carcasses. In a practical sense, the consequences of incorrect inference can be significant: overestimating take could result in costly and unjustified mitigation, whereas underestimating could result in unanticipated declines in species populations already at risk.
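A minimal sketch of the underlying evidence problem, under the simplifying assumption that each carcass is detected independently with probability g: the chance of observing zero carcasses when m were actually killed is binomial. The detection probabilities and take values below are invented.

```python
from math import comb

def prob_observe_x(m, g, x=0):
    """Binomial probability of finding x carcasses when m were killed
    and each is detected independently with probability g."""
    return comb(m, x) * g**x * (1 - g)**(m - x)

# With high detection probability, zero observed carcasses is strong evidence
# of low take; with low detection probability, it is weak evidence.
for g in (0.9, 0.3, 0.05):
    print(g, [round(prob_observe_x(m, g), 3) for m in (0, 1, 5, 10)])
```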
Interpreting null results from measurements with uncertain correlations: an info-gap approach.
Ben-Haim, Yakov
2011-01-01
Null events—not detecting a pernicious agent—are the basis for declaring the agent is absent. Repeated nulls strengthen confidence in the declaration. However, correlations between observations are difficult to assess in many situations and introduce uncertainty in interpreting repeated nulls. We quantify uncertain correlations using an info-gap model, which is an unbounded family of nested sets of possible probabilities. An info-gap model is nonprobabilistic and entails no assumption about a worst case. We then evaluate the robustness, to uncertain correlations, of estimates of the probability of a null event. This is then the basis for evaluating a nonprobabilistic robustness-based confidence interval for the probability of a null. © 2010 Society for Risk Analysis.
Integration of Geophysical Methods By A Generalised Probability Tomography Approach
NASA Astrophysics Data System (ADS)
Mauriello, P.; Patella, D.
In modern science, the propensity interpretative approach stands on the assumption that any physical system consists of two kinds of reality: actual and potential. Also geophysical data systems have potentialities that extend far beyond the few actual models normally attributed to them. Indeed, any geophysical data set is in itself quite inherently ambiguous. Classical deterministic inversion, including tomography, usually forces a measured data set to collapse into a few rather subjective models based on some available a priori information. Classical interpretation is thus an intrinsically limited approach requiring a very deep logical extension. We think that a way to highlight a system's full potentiality is to introduce probability as the leading paradigm in dealing with field data systems. Probability tomography has been recently introduced as a completely new approach to data interpretation. Probability tomography was originally formulated for the self-potential method. It has then been extended to geoelectric, natural source electromagnetic induction, gravity and magnetic methods. Following the same rationale, in this paper we generalize the probability tomography theory to a generic geophysical anomaly vector field, including the treatment for scalar fields as a particular case. This generalization then makes it possible to address for the first time the problem of the integration of different methods by a conjoint probability tomography imaging procedure. The aim is to infer the existence of an unknown buried object through the analysis of an ad hoc occurrence probability function, blending the physical messages brought forth by a set of singularly observed anomalies.
Evaluation of the river die-away biodegradation test
Wylie, Glenn D.; Jones, John R.; Johnson, B. Thomas
1982-01-01
The reliability of the river die-away (RDA) test for establishing the biodegradability of chemicals was assessed. Reproducibility of biodegradation in the RDA test was analyzed under conditions in which the test is commonly done. Biodegradation results were not reproducible for di-2-ethylhexyl phthalate (DEHP) and phthalic acid in replicated RDA tests using Missouri River water. Chemical and biological changes during the RDA tests probably reflected relative laboratory conditions. Initial suspended solids and subsequent DEHP biodegradation were directly related. Interpretation of RDA test results is enhanced by replicating experiments and comparing biodegradation of the test compound with a compound whose degradation properties are known. However, biodegradation measured with the RDA test is too variable and too dependent on laboratory treatment of samples to apply results directly to the aquatic environment.
The Earth Through Time: Implications for Searching for Habitability and Life on Exoplanets
NASA Technical Reports Server (NTRS)
Pilcher, Carl B.
2016-01-01
The Earth has been both a habitable and inhabited planet for around 4 billion years, yet distant observers studying Earth at different epochs in our history would have detected substantially different and probably varying conditions. Understanding Earth's history thus has much to tell us about how to interpret observations of potentially habitable exoplanets. In this talk I will review the history of life on Earth, from the earliest microbial biosphere living under a relatively methane-rich atmosphere to the modern world of animals, plants, and atmospheric oxygen, with a focus on how observable conditions on Earth changed as the planet and its biosphere evolved. I'll discuss the implications of this history for assessing the habitability of-or presence of life on-planets around other stars.
Focused Assessment with Sonography for Trauma in weightlessness: a feasibility study
NASA Technical Reports Server (NTRS)
Kirkpatrick, Andrew W.; Hamilton, Douglas R.; Nicolaou, Savvas; Sargsyan, Ashot E.; Campbell, Mark R.; Feiveson, Alan; Dulchavsky, Scott A.; Melton, Shannon; Beck, George; Dawson, David L.
2003-01-01
BACKGROUND: The Focused Assessment with Sonography for Trauma (FAST) examines for fluid in gravitationally dependent regions. There is no prior experience with this technique in weightlessness, such as on the International Space Station, where sonography is currently the only diagnostic imaging tool. STUDY DESIGN: A ground-based (1 g) porcine model for sonography was developed. We examined both the feasibility and the comparative performance of the FAST examination in parabolic flight. Sonographic detection and fluid behavior were evaluated in four animals during alternating weightlessness (0 g) and hypergravity (1.8 g) periods. During flight, boluses of fluid were incrementally introduced into the peritoneal cavity. Standardized sonographic windows were recorded. Postflight, the video recordings were divided into 169 20-second segments for subsequent interpretation by 12 blinded ultrasonography experts. Reviewers first decided whether a video segment was of sufficient diagnostic quality to analyze (determinate). Determinate segments were then analyzed as containing or not containing fluid. A probit regression model compared the probability of a positive fluid diagnosis to actual fluid levels (0 to 500 mL) under both 0-g and 1.8-g conditions. RESULTS: The in-flight sonographers found real-time scanning and interpretation technically similar to that of terrestrial conditions, as long as restraint was maintained. On blinded review, 80% of the recorded ultrasound segments were considered determinate. The best sensitivity for diagnosis in 0 g was found to be from the subhepatic space, with probability of a positive fluid diagnosis ranging from 9% (no fluid) to 51% (500 mL fluid). CONCLUSIONS: The FAST examination is technically feasible in weightlessness, and merits operational consideration for clinical contingencies in space.
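A probit regression of diagnosis probability on fluid volume, as described in the study design, can be sketched as follows. The data are simulated (the coefficients, gravity effect and sample size are assumptions, not the study's results), and the example relies on the statsmodels package.

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical re-creation of the analysis design (data are simulated):
# probability of a positive fluid diagnosis as a probit function of fluid
# volume (0-500 mL) under weightless (0 g) and hypergravity (1.8 g) conditions.
rng = np.random.default_rng(2)
n = 400
volume = rng.uniform(0, 500, n)
hyper_g = rng.integers(0, 2, n)                 # 0 = weightless, 1 = hypergravity
latent = -1.3 + 0.004 * volume + 0.3 * hyper_g  # assumed true coefficients
positive = (rng.normal(size=n) < latent).astype(int)

X = sm.add_constant(np.column_stack([volume, hyper_g]))
result = sm.Probit(positive, X).fit(disp=False)
print(result.params)                                    # intercept, volume, gravity
print(result.predict([[1, 0, 0], [1, 500, 0]]))         # P(+) at 0 vs 500 mL in 0 g
```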
NASA Astrophysics Data System (ADS)
Peeters, L. J.; Mallants, D.; Turnadge, C.
2017-12-01
Groundwater impact assessments are increasingly being undertaken in a probabilistic framework whereby various sources of uncertainty (model parameters, model structure, boundary conditions, and calibration data) are taken into account. This has resulted in groundwater impact metrics being presented as probability density functions and/or cumulative distribution functions, spatial maps displaying isolines of percentile values for specific metrics, etc. Groundwater management on the other hand typically uses single values (i.e., in a deterministic framework) to evaluate what decisions are required to protect groundwater resources. For instance, in New South Wales, Australia, a nominal drawdown value of two metres is specified by the NSW Aquifer Interference Policy as a trigger-level threshold. In many cases, when drawdowns induced by groundwater extraction exceed two metres, "make-good" provisions are enacted (such as the surrendering of extraction licenses). The information obtained from a quantitative uncertainty analysis can be used to guide decision making in several ways. Two examples are discussed here: the first of which would not require modification of existing "deterministic" trigger or guideline values, whereas the second example assumes that the regulatory criteria are also expressed in probabilistic terms. The first example is a straightforward interpretation of calculated percentile values for specific impact metrics. The second example goes a step further, as the previous deterministic thresholds do not currently allow for a probabilistic interpretation; e.g., there is no statement that "the probability of exceeding the threshold shall not be larger than 50%". It would indeed be sensible to have a set of thresholds with an associated acceptable probability of exceedance (or probability of not exceeding a threshold) that decreases as the impact increases. Here we illustrate how both the prediction uncertainty and management rules can be expressed in a probabilistic framework, using groundwater metrics derived for a highly stressed groundwater system.
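A minimal sketch of the second example, assuming an ensemble (or posterior sample) of predicted drawdowns at a receptor and a probabilistically expressed criterion such as "P(drawdown > 2 m) must not exceed 0.5". The drawdown distribution and the acceptable exceedance probability are invented for illustration.

```python
import numpy as np

# Assumed posterior/ensemble samples of predicted drawdown at a receptor,
# compared against a 2 m trigger level expressed probabilistically.
rng = np.random.default_rng(3)
drawdown = rng.lognormal(mean=0.2, sigma=0.6, size=5000)   # metres

threshold = 2.0
max_acceptable_prob = 0.5
p_exceed = np.mean(drawdown > threshold)
print(f"P(drawdown > {threshold} m) = {p_exceed:.2f}",
      "-> acceptable" if p_exceed <= max_acceptable_prob else "-> make-good provisions")
```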
Quantum Bayesian perspective for intelligence reservoir characterization, monitoring and management.
Lozada Aguilar, Miguel Ángel; Khrennikov, Andrei; Oleschko, Klaudia; de Jesús Correa, María
2017-11-13
The paper starts with a brief review of the literature about uncertainty in geological, geophysical and petrophysical data. In particular, we present the viewpoints of experts in geophysics on the application of Bayesian inference and subjective probability. Then we present arguments that the use of classical probability theory (CP) does not match completely the structure of geophysical data. We emphasize that such data are characterized by contextuality and non-Kolmogorovness (the impossibility to use the CP model), incompleteness as well as incompatibility of some geophysical measurements. These characteristics of geophysical data are similar to the characteristics of quantum physical data. Notwithstanding all this, contextuality can be seen as a major deviation of quantum theory from classical physics. In particular, the contextual probability viewpoint is the essence of the Växjö interpretation of quantum mechanics. We propose to use quantum probability (QP) for decision-making during the characterization, modelling, exploring and management of the intelligent hydrocarbon reservoir. Quantum Bayesianism (QBism), one of the recently developed information interpretations of quantum theory, can be used as the interpretational basis for such QP decision-making in geology, geophysics and petroleum projects design and management. This article is part of the themed issue 'Second quantum revolution: foundational questions'. © 2017 The Author(s).
ERIC Educational Resources Information Center
Riggs, Peter J.
2013-01-01
Students often wrestle unsuccessfully with the task of correctly calculating momentum probability densities and have difficulty in understanding their interpretation. In the case of a particle in an "infinite" potential well, its momentum can take values that are not just those corresponding to the particle's quantised energies but…
O'Shea, T.J.; Ellison, L.E.; Neubaum, D.J.; Neubaum, M.A.; Reynolds, C.A.; Bowen, R.A.
2010-01-01
We used mark-recapture estimation techniques and radiography to test hypotheses about 3 important aspects of recruitment in big brown bats (Eptesicus fuscus) in Fort Collins, Colorado: adult breeding probabilities, litter size, and 1st-year survival of young. We marked 2,968 females with passive integrated transponder (PIT) tags at multiple sites during 2001-2005 and based our assessments on direct recaptures (breeding probabilities) and passive detection with automated PIT tag readers (1st-year survival). We interpreted our data in relation to hypotheses regarding demographic influences of bat age, roost, and effects of years with unusual environmental conditions: extreme drought (2002) and arrival of a West Nile virus epizootic (2003). Conditional breeding probabilities at 6 roosts sampled in 2002-2005 were estimated as 0.64 (95% confidence interval [95% CI] = 0.53-0.73) in 1-year-old females, but were consistently high (95% CI = 0.94-0.96) and did not vary by roost, year, or prior year breeding status in older adults. Mean litter size was 1.11 (95% CI = 1.05-1.17), based on examination of 112 pregnant females by radiography. Litter size was not higher in older or larger females and was similar to results of other studies in western North America despite wide variation in latitude. First-year survival was estimated as 0.67 (95% CI = 0.61-0.73) for weaned females at 5 maternity roosts over 5 consecutive years, was lower than adult survival (0.79; 95% CI = 0.77-0.81), and varied by roost. Based on model selection criteria, strong evidence exists for complex roost and year effects on 1st-year survival. First-year survival was lowest in bats born during the drought year. Juvenile females that did not return to roosts as 1-year-olds had lower body condition indices in late summer of their natal year than those known to survive. © 2009 American Society of Mammalogists.
Applications of conformal field theory to problems in 2D percolation
NASA Astrophysics Data System (ADS)
Simmons, Jacob Joseph Harris
This thesis explores critical two-dimensional percolation in bounded regions in the continuum limit. The main method which we employ is conformal field theory (CFT). Our specific results follow from the null-vector structure of the c = 0 CFT that applies to critical two-dimensional percolation. We also make use of the duality symmetry obeyed at the percolation point, and the fact that percolation may be understood as the q-state Potts model in the limit q → 1. Our first results describe the correlations between points in the bulk and boundary intervals or points, i.e. the probability that the various points or intervals are in the same percolation cluster. These quantities correspond to order-parameter profiles under the given conditions, or cluster connection probabilities. We consider two specific cases: an anchoring interval, and two anchoring points. We derive results for these and related geometries using the CFT null-vectors for the corresponding boundary condition changing (bcc) operators. In addition, we exhibit several exact relationships between these probabilities. These relations between the various bulk-boundary connection probabilities involve parameters of the CFT called operator product expansion (OPE) coefficients. We then compute several of these OPE coefficients, including those arising in our new probability relations. Beginning with the familiar CFT operator φ1,2, which corresponds to a free-fixed spin boundary change in the q-state Potts model, we then develop physical interpretations of the bcc operators. We argue that, when properly normalized, higher-order bcc operators correspond to successive fusions of multiple φ1,2, operators. Finally, by identifying the derivative of φ1,2 with the operator φ1,4, we derive several new quantities called first crossing densities. These new results are then combined and integrated to obtain the three previously known crossing quantities in a rectangle: the probability of a horizontal crossing cluster, the probability of a cluster crossing both horizontally and vertically, and the expected number of horizontal crossing clusters. These three results were known to be solutions to a certain fifth-order differential equation, but until now no physically meaningful explanation had appeared. This differential equation arises naturally in our derivation.
NASA Astrophysics Data System (ADS)
Guillier, Bertrand; Chatelain, Jean-Luc
2006-06-01
The high activity level of Hybrid Events (HE) detected beneath the Cayambe volcano since 1989 has been more thoroughly investigated with data from a temporary array. The unusual HE spectral content allows a high-frequency signal riding on a low-frequency one to be separated, with a probable single source. HEs are interpreted as high-frequency VT events, produced by the interaction between magmatic heat and an underground water system fed by thaw water from the summit glacier, which trigger simultaneous low-frequency fluid resonance in the highly fractured adjacent medium. Pure VTs are interpreted as 'aborted' HEs probably occurring in the oldest and coldest part of the volcano complex. To cite this article: B. Guillier, J.-L. Chatelain, C. R. Geoscience 338 (2006).
Nelson, Douglas L; Dyrdal, Gunvor M; Goodmon, Leilani B
2005-08-01
Measuring lexical knowledge poses a challenge to the study of the influence of preexisting knowledge on the retrieval of new memories. Many tasks focus on word pairs, but words are embedded in associative networks, so how should preexisting pair strength be measured? It has been measured by free association, similarity ratings, and co-occurrence statistics. Researchers interpret free association response probabilities as unbiased estimates of forward cue-to-target strength. In Study 1, analyses of large free association and extralist cued recall databases indicate that this interpretation is incorrect. Competitor and backward strengths bias free association probabilities, and as with other recall tasks, preexisting strength is described by a ratio rule. In Study 2, associative similarity ratings are predicted by forward and backward, but not by competitor, strength. Preexisting strength is not a unitary construct, because its measurement varies with method. Furthermore, free association probabilities predict extralist cued recall better than do ratings and co-occurrence statistics. The measure that most closely matches the criterion task may provide the best estimate of the identity of preexisting strength.
Intracranial Self-Stimulation to Evaluate Abuse Potential of Drugs
Miller, Laurence L.
2014-01-01
Intracranial self-stimulation (ICSS) is a behavioral procedure in which operant responding is maintained by pulses of electrical brain stimulation. In research to study abuse-related drug effects, ICSS relies on electrode placements that target the medial forebrain bundle at the level of the lateral hypothalamus, and experimental sessions manipulate frequency or amplitude of stimulation to engender a wide range of baseline response rates or response probabilities. Under these conditions, drug-induced increases in low rates/probabilities of responding maintained by low frequencies/amplitudes of stimulation are interpreted as an abuse-related effect. Conversely, drug-induced decreases in high rates/probabilities of responding maintained by high frequencies/amplitudes of stimulation can be interpreted as an abuse-limiting effect. Overall abuse potential can be inferred from the relative expression of abuse-related and abuse-limiting effects. The sensitivity and selectivity of ICSS to detect abuse potential of many classes of abused drugs is similar to the sensitivity and selectivity of drug self-administration procedures. Moreover, similar to progressive-ratio drug self-administration procedures, ICSS data can be used to rank the relative abuse potential of different drugs. Strengths of ICSS in comparison with drug self-administration include 1) potential for simultaneous evaluation of both abuse-related and abuse-limiting effects, 2) flexibility for use with various routes of drug administration or drug vehicles, 3) utility for studies in drug-naive subjects as well as in subjects with controlled levels of prior drug exposure, and 4) utility for studies of drug time course. Taken together, these considerations suggest that ICSS can make significant contributions to the practice of abuse potential testing. PMID:24973197
Timeless Configuration Space and the Emergence of Classical Behavior
NASA Astrophysics Data System (ADS)
Gomes, Henrique
2018-06-01
The inherent difficulty in talking about quantum decoherence in the context of quantum cosmology is that decoherence requires subsystems, and cosmology is the study of the whole Universe. Consistent histories gave a possible answer to this conundrum, by phrasing decoherence as loss of interference between alternative histories of closed systems. When one can apply Boolean logic to a set of histories, it is deemed `consistent'. However, the vast majority of the sets of histories that are merely consistent are blatantly nonclassical in other respects, and further constraints than just consistency need to be invoked. In this paper, I attempt to give an alternative answer to the issues faced by consistent histories, by exploring a timeless interpretation of quantum mechanics of closed systems. This is done solely in terms of path integrals in non-relativistic, timeless, configuration space. What prompts a fresh look at such foundational problems in this context is the advent of multiple gravitational models in which Lorentz symmetry is not fundamental, but only emergent. And what allows this approach to overcome previous barriers to a timeless, conditional probabilities interpretation of quantum mechanics is the new notion of records—made possible by an inherent asymmetry of configuration space. I outline and explore consequences of this approach for foundational issues of quantum mechanics, such as the natural emergence of the Born rule, conservation of probabilities, and the Sleeping Beauty paradox.
NASA Astrophysics Data System (ADS)
Sui, Xiukai; Wu, Bin; Wang, Long
2015-12-01
The likelihood that a mutant fixates in the wild population, i.e., the fixation probability, has been intensively studied in evolutionary game theory, where individuals' fitness is frequency dependent. However, the fixation probability is of limited interest when it takes a long time for the mutant to take over. Thus the speed of evolution becomes an important issue. In general, it is still unclear how fixation times are affected by the population structure, although the fixation times have already been addressed in well-mixed populations. Here we theoretically address this issue by pair approximation and diffusion approximation on regular graphs. It is shown (i) that under neutral selection, both unconditional and conditional fixation times are shortened by increasing the number of neighbors; (ii) that under weak selection, for the simplified prisoner's dilemma game, if the benefit-to-cost ratio exceeds the degree of the graph, then the unconditional fixation time of a single cooperator is longer than that in the neutral case; and (iii) that under weak selection, for the conditional fixation time, limited neighbor size dilutes the counterintuitive stochastic slowdown which was found in well-mixed populations. Interestingly, we find that all of our results can be interpreted as those in the well-mixed population with a transformed payoff matrix. This interpretation is also valid for both death-birth and birth-death processes on graphs. This interpretation bridges the fixation time in the structured population and that in the well-mixed population. Thus it opens the avenue to investigate the challenging fixation time in structured populations by the known results in well-mixed populations.
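For the well-mixed baseline against which the above results are compared, fixation probability and conditional fixation time can be estimated directly by simulating a Moran process. The sketch below is a generic constant-fitness Moran simulation, not the pair-approximation calculation of the paper; the population size, fitness and number of runs are arbitrary.

```python
import numpy as np

def moran_fixation(N, r=1.0, rng=None):
    """Simulate a Moran birth-death process until fixation or loss.
    Returns (fixed, steps) for a single mutant with relative fitness r."""
    rng = rng or np.random.default_rng()
    i, steps = 1, 0
    while 0 < i < N:
        p_birth_mutant = r * i / (r * i + (N - i))   # birth chosen by fitness
        birth_is_mutant = rng.random() < p_birth_mutant
        death_is_mutant = rng.random() < i / N       # death chosen uniformly
        i += int(birth_is_mutant) - int(death_is_mutant)
        steps += 1
    return i == N, steps

rng = np.random.default_rng(5)
runs = [moran_fixation(50, r=1.0, rng=rng) for _ in range(2000)]
fix_prob = np.mean([f for f, _ in runs])             # ~1/N under neutrality
cond_time = np.mean([t for f, t in runs if f])       # conditional fixation time (steps)
print(round(fix_prob, 3), round(cond_time, 1))
```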
Elmore, Joann G.; Nelson, Heidi D.; Pepe, Margaret S.; Longton, Gary M.; Tosteson, Anna N.A.; Geller, Berta; Onega, Tracy; Carney, Patricia A.; Jackson, Sara L.; Allison, Kimberly H.; Weaver, Donald L.
2016-01-01
Background The effect of physician diagnostic variability on accuracy at a population level depends on the prevalence of diagnoses. Objective To estimate how diagnostic variability affects accuracy from the perspective of a U.S. woman aged 50 to 59 years having a breast biopsy. Design Applied probability using Bayes theorem. Setting B-Path (Breast Pathology) Study comparing pathologists’ interpretations of a single biopsy slide versus a reference consensus interpretation from 3 experts. Participants 115 practicing pathologists (6900 total interpretations from 240 distinct cases). Measurements A single representative slide from each of the 240 cases was used to estimate the proportion of biopsies with a diagnosis that would be verified if the same slide were interpreted by a reference group of 3 expert pathologists. Probabilities of confirmation (predictive values) were estimated using B-Path Study results and prevalence of biopsy diagnoses for women aged 50 to 59 years in the Breast Cancer Surveillance Consortium. Results Overall, if 1 representative slide were used per case, 92.3% (95% CI, 91.4% to 93.1%) of breast biopsy diagnoses would be verified by reference consensus diagnoses, with 4.6% (CI, 3.9% to 5.3%) overinterpreted and 3.2% (CI, 2.7% to 3.6%) underinterpreted. Verification of invasive breast cancer and benign without atypia diagnoses is highly probable; estimated predictive values were 97.7% (CI, 96.5% to 98.7%) and 97.1% (CI, 96.7% to 97.4%), respectively. Verification is less probable for atypia (53.6% overinterpreted and 8.6% underinterpreted) and ductal carcinoma in situ (DCIS) (18.5% overinterpreted and 11.8% underinterpreted). Limitations Estimates are based on a testing situation with 1 slide used per case and without access to second opinions. Population-adjusted estimates may differ for women from other age groups, unscreened women, or women in different practice settings. Conclusion This analysis, based on interpretation of a single breast biopsy slide per case, predicts a low likelihood that a diagnosis of atypia or DCIS would be verified by a reference consensus diagnosis. This diagnostic gray zone should be considered in clinical management decisions in patients with these diagnoses. Primary Funding Source National Cancer Institute. PMID:26999810
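The Bayes-theorem step that converts interpretation-given-truth probabilities and population prevalence into a probability of verification can be sketched generically. The prevalence vector and confusion matrix below are invented placeholders, not the B-Path or Breast Cancer Surveillance Consortium estimates.

```python
import numpy as np

# Hedged illustration with made-up numbers: categories of breast biopsy
# diagnosis, assumed population prevalence of the true (reference) diagnosis,
# and an assumed probability that a pathologist assigns each diagnosis given
# the true one.  Bayes' theorem then gives the probability that an assigned
# diagnosis would be verified by the reference consensus.
categories = ["benign", "atypia", "DCIS", "invasive"]
prevalence = np.array([0.70, 0.10, 0.12, 0.08])          # P(true category)
# rows: true category, columns: assigned category, P(assigned | true)
confusion = np.array([[0.95, 0.04, 0.01, 0.00],
                      [0.30, 0.55, 0.14, 0.01],
                      [0.05, 0.12, 0.80, 0.03],
                      [0.00, 0.01, 0.04, 0.95]])

p_assigned = prevalence @ confusion                      # P(assigned category)
for j, cat in enumerate(categories):
    ppv = prevalence[j] * confusion[j, j] / p_assigned[j]   # P(true = j | assigned = j)
    print(f"{cat:9s} P(verified | assigned) = {ppv:.2f}")
```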
NASA Astrophysics Data System (ADS)
Ishibashi, Yoshihiro; Fukui, Minoru
2018-03-01
The effect of the probabilistic delayed start on the one-dimensional traffic flow is investigated on the basis of several models. Analogy with the degeneracy of the states and its resolution, as well as that with the mathematical procedures adopted for them, is utilized. The perturbation is assumed to be proportional to the probability of the delayed start, and the perturbation function is determined so that imposed conditions are fulfilled. The obtained formulas coincide with those previously derived on the basis of the mean-field analyses of the Nagel-Schreckenberg and Fukui-Ishibashi models, and reproduce the cellular automaton simulation results.
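For readers unfamiliar with these cellular automaton traffic models, the sketch below simulates a minimal Nagel-Schreckenberg ring in which the random-braking probability also acts as a probabilistic delayed start for stopped cars. It is not the authors' specific model or their mean-field formulas; the road length, car number, vmax and p are arbitrary.

```python
import numpy as np

def nasch_step(pos, vel, L, vmax, p, rng):
    """One update of a minimal Nagel-Schreckenberg ring; the random-braking
    probability p acts as a probabilistic delayed start for stopped cars."""
    order = np.argsort(pos)
    pos, vel = pos[order], vel[order]
    gaps = (np.roll(pos, -1) - pos - 1) % L            # empty cells ahead of each car
    vel = np.minimum(vel + 1, vmax)                    # accelerate
    vel = np.minimum(vel, gaps)                        # avoid collisions
    vel = np.where(rng.random(len(vel)) < p,
                   np.maximum(vel - 1, 0), vel)        # random braking / delayed start
    pos = (pos + vel) % L
    return pos, vel

rng = np.random.default_rng(4)
L, n_cars, vmax, p = 200, 50, 5, 0.3
pos = rng.choice(L, size=n_cars, replace=False)
vel = np.zeros(n_cars, dtype=int)
for _ in range(1000):
    pos, vel = nasch_step(pos, vel, L, vmax, p, rng)
print("mean flow (density x mean speed):", n_cars * vel.mean() / L)
```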
ERIC Educational Resources Information Center
Bao, Lei; Redish, Edward F.
2002-01-01
Explains the critical role of probability in making sense of quantum physics and addresses the difficulties science and engineering undergraduates experience, with the aim of helping students build a model of how to think about probability in physical systems. (Contains 17 references.) (Author/YDS)
Hilderink, Judith M; Rennenberg, Roger J M W; Vanmolkot, Floris H M; Bekers, Otto; Koopmans, Richard P; Meex, Steven J R
2017-09-01
When monitoring patients over time, clinicians may struggle to distinguish 'real changes' in consecutive blood parameters from so-called natural fluctuations. In practice, they have to do so by relying on their clinical experience and intuition. We developed Labtracker+, a medical app that calculates the probability that an increase or decrease over time in a specific blood parameter is real, given the time between measurements. We presented patient cases to 135 participants to examine whether there is a difference between medical students, residents and experienced clinicians when it comes to interpreting changes between consecutive laboratory results. Participants were asked to interpret whether changes in consecutive laboratory values were likely to be 'real' or rather due to natural fluctuations. The answers of the study participants were compared with the probabilities calculated by the app Labtracker+, and the concordance rates were assessed. The participants were medical students (n=92), medical residents from the department of internal medicine (n=19) and internists (n=24) at a Dutch University Medical Centre. Concordance rates between the study participants and the probabilities calculated by the app Labtracker+ were compared. In addition, we tested whether physicians with clinical experience scored better concordance rates with the app Labtracker+ than inexperienced clinicians. Medical residents and internists showed significantly better concordance rates with the probabilities calculated by the app Labtracker+ than medical students, regarding their interpretation of differences between consecutive laboratory results (p=0.009 and p<0.001, respectively). The app Labtracker+ could serve as a clinical decision tool in the interpretation of consecutive laboratory test results and could contribute to rapid recognition of parameter changes by physicians. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
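The abstract does not state Labtracker+'s algorithm. A common textbook treatment (an assumption here, not necessarily what the app implements) models the difference between two results as normally distributed with variance 2(sa^2 + si^2), where sa is the analytical and si the within-subject standard deviation:

```python
from math import erf, sqrt

def probability_change_is_real(x1, x2, sd_analytical, sd_within_subject):
    """One-sided probability that the difference between two consecutive results
    reflects a real change rather than analytical plus biological fluctuation.
    Textbook 'reference change' reasoning; not necessarily the app's algorithm."""
    sd_diff = sqrt(2.0) * sqrt(sd_analytical**2 + sd_within_subject**2)
    z = abs(x2 - x1) / sd_diff
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))          # standard normal CDF at z

# e.g. creatinine 90 -> 102 umol/L with sa = 3 and si = 4 umol/L (made-up values)
print(probability_change_is_real(90, 102, 3, 4))
```

Note that this sketch ignores the time elapsed between measurements, which the app explicitly conditions on.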
Veterinary diagnostic imaging: Probability, accuracy and impact.
Lamb, Christopher R
2016-09-01
Diagnostic imaging is essential for diagnosis and management of many common problems in veterinary medicine, but imaging is not 100% accurate and does not always benefit the animal in the way intended. When assessing the need for imaging, the probability that the animal has a morphological lesion, the accuracy of the imaging and the likelihood of a beneficial impact on the animal must all be considered. Few imaging tests are sufficiently accurate that they enable a diagnosis to be ruled in or out; instead, the results of imaging only modify the probability of a diagnosis. Potential problems with excessive use of imaging include false positive diagnoses, detection of incidental findings and over-diagnosis, all of which may contribute to a negative benefit to the animal. Veterinary clinicians must be selective in their use of imaging, use existing clinical information when interpreting images and sensibly apply the results of imaging in the context of the needs of individual animals. There is a need for more clinical research to assess the impact of diagnostic imaging for animals with common conditions to help clinicians make decisions conducive to optimal patient care. Copyright © 2016 Elsevier Ltd. All rights reserved.
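The point that imaging results "only modify the probability of a diagnosis" is usually formalized with likelihood ratios; a small sketch with hypothetical numbers:

```python
def post_test_probability(pre_test_prob, likelihood_ratio):
    """Convert a pre-test probability into a post-test probability via odds."""
    pre_odds = pre_test_prob / (1 - pre_test_prob)
    post_odds = pre_odds * likelihood_ratio
    return post_odds / (1 + post_odds)

# hypothetical: 30% pre-test probability of a lesion, positive imaging with LR+ = 5
print(post_test_probability(0.30, 5.0))   # ~0.68 -- raised, but not conclusive
```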
Optimum space shuttle launch times relative to natural environment
NASA Technical Reports Server (NTRS)
King, R. L.
1977-01-01
Three sets of meteorological criteria were analyzed to determine the probabilities of favorable launch and landing conditions. Probabilities were computed for every 3 hours on a yearly basis using 14 years of weather data. These temporal probability distributions, applicable to the three sets of weather criteria encompassing benign, moderate and severe weather conditions, were computed for both Kennedy Space Center (KSC) and Edwards Air Force Base. In addition, conditional probabilities were computed for unfavorable weather conditions occurring after a delay which may or may not be due to weather conditions. Also, for KSC, the probabilities of favorable landing conditions at various times after favorable launch conditions have prevailed have been computed so that mission probabilities may be more accurately computed for those time periods when persistence strongly correlates weather conditions. Moreover, the probabilities and conditional probabilities of the occurrence of both favorable and unfavorable events for each individual criterion were computed to indicate the significance of each weather element to the overall result.
The rational status of quantum cognition.
Pothos, Emmanuel M; Busemeyer, Jerome R; Shiffrin, Richard M; Yearsley, James M
2017-07-01
Classic probability theory (CPT) is generally considered the rational way to make inferences, but there have been some empirical findings showing a divergence between reasoning and the principles of CPT, inviting the conclusion that humans are irrational. Perhaps the most famous of these findings is the conjunction fallacy (CF). Recently, the CF has been shown to be consistent with the principles of an alternative probabilistic framework, quantum probability theory (QPT). Does this imply that QPT is irrational, or does QPT provide an alternative interpretation of rationality? Our presentation consists of three parts. First, we examine the putative rational status of QPT using the same argument used to establish the rationality of CPT, the Dutch Book (DB) argument, according to which reasoners should not commit to bets guaranteeing a loss. We prove the rational status of QPT by formulating it as a particular case of an extended form of CPT, with separate probability spaces produced by changing context. Second, we empirically examine the key requirement for whether a CF can be rational or not; the results show that participants indeed behave rationally, at least relative to the representations they employ. Finally, we consider whether the conditions for the CF to be rational are applicable in the outside (nonmental) world. Our discussion provides a general and alternative perspective for rational probabilistic inference, based on the idea that contextuality requires either reasoning in separate CPT probability spaces or reasoning with QPT principles. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
Oceanic ridges and transform faults: Their intersection angles and resistance to plate motion
Lachenbruch, A.H.; Thompson, G.A.
1972-01-01
The persistent near-orthogonal pattern formed by oceanic ridges and transform faults defies explanation in terms of rigid plates because it probably depends on the energy associated with deformation. For passive spreading, it is likely that the ridges and transforms adjust to a configuration offering minimum resistance to plate separation. This leads to a simple geometric model which yields conditions for the occurrence of transform faults and an aid to interpretation of structural patterns in the sea floor. Under reasonable assumptions, it is much more difficult for diverging plates to spread a kilometer of ridge than to slip a kilometer of transform fault, and the patterns observed at spreading centers might extend to lithospheric depths. Under these conditions, the resisting force at spreading centers could play a significant role in the dynamics of plate-tectonic systems. © 1972.
Striatal activity is modulated by target probability.
Hon, Nicholas
2017-06-14
Target probability has well-known neural effects. In the brain, target probability is known to affect frontal activity, with lower probability targets producing more prefrontal activation than those that occur with higher probability. Although the effect of target probability on cortical activity is well specified, its effect on subcortical structures such as the striatum is less well understood. Here, I examined this issue and found that the striatum was highly responsive to target probability. This is consistent with its hypothesized role in the gating of salient information into higher-order task representations. The current data are interpreted in light of the fact that different components of the striatum are sensitive to different types of task-relevant information.
Consistency of response and image recognition, pulmonary nodules
Liu, M A Q; Galvan, E; Bassett, R; Murphy, W A; Matamoros, A; Marom, E M
2014-01-01
Objective: To investigate the effect of recognition of a previously encountered radiograph on consistency of response in localizing pulmonary nodules. Methods: 13 radiologists interpreted 40 radiographs each to locate pulmonary nodules. A few days later, they again interpreted 40 radiographs. Half of the images in the second set were new. We asked the radiologists whether each image had been in the first set. We used Fisher's exact test and the Kruskal–Wallis test to evaluate the correlation between recognition of an image and consistency in its interpretation. We evaluated the data using all possible recognition levels (definitely, probably or possibly included vs definitely, probably or possibly not included), by collapsing the recognition levels into two and by eliminating the "possibly included" and "possibly not included" scores. Results: With all but one of six methods of looking at the data, there was no significant correlation between consistency in interpretation and recognition of the image. When the possibly included and possibly not included scores were eliminated, there was borderline statistical significance (p = 0.04), with slightly greater consistency in the interpretation of recognized than of non-recognized images. Conclusion: We found no convincing evidence that radiologists' recognition of images in an observer performance study affects their interpretation on a second encounter. Advances in knowledge: Conscious recognition of chest radiographs did not result in a greater degree of consistency in interpretation than for images that were not recognized. PMID:24697724
A local structure model for network analysis
Casleton, Emily; Nordman, Daniel; Kaiser, Mark
2017-04-01
The statistical analysis of networks is a popular research topic with ever widening applications. Exponential random graph models (ERGMs), which specify a model through interpretable, global network features, are common for this purpose. In this study we introduce a new class of models for network analysis, called local structure graph models (LSGMs). In contrast to an ERGM, a LSGM specifies a network model through local features and allows for an interpretable and controllable local dependence structure. In particular, LSGMs are formulated by a set of full conditional distributions for each network edge, e.g., the probability of edge presence/absence, depending on neighborhoods of other edges. Additional model features are introduced to aid in specification and to help alleviate a common issue (occurring also with ERGMs) of model degeneracy. Finally, the proposed models are demonstrated on a network of tornadoes in Arkansas where a LSGM is shown to perform significantly better than a model without local dependence.
Bolann, B J; Asberg, A
2004-01-01
The deviation of test results from patients' homeostatic set points in steady-state conditions may complicate interpretation of the results and the comparison of results with clinical decision limits. In this study the total deviation from the homeostatic set point is defined as the maximum absolute deviation for 95% of measurements, and we present analytical quality requirements that prevent analytical error from increasing this deviation to more than about 12% above the value caused by biology alone. These quality requirements are: 1) the stable systematic error should be approximately 0, and 2) a systematic error that will be detected by the control program with 90% probability should not be larger than half the value of the combined analytical and intra-individual standard deviation. As a result, when the most common control rules are used, the analytical standard deviation may be up to 0.15 times the intra-individual standard deviation. Analytical improvements beyond these requirements have little impact on the interpretability of measurement results.
A Dynamic Model for Decision Making During Memory Retrieval
2015-10-26
Subject terms: quantum probability decision making. Fragmentary abstract: ...making can be interpreted in terms of humans' knowledge of probability being rooted in quantum probability... over brief periods of time so that the changes are not perceived consciously, the effects seen...
Quantum probabilistic logic programming
NASA Astrophysics Data System (ADS)
Balu, Radhakrishnan
2015-05-01
We describe a quantum-mechanics-based logic programming language that supports Horn clauses, random variables, and covariance matrices to express and solve problems in probabilistic logic. The Horn clauses of the language wrap random variables, including infinite valued, to express probability distributions and statistical correlations, a powerful feature to capture relationships between distributions that are not independent. The expressive power of the language is based on a mechanism to implement statistical ensembles and to solve the underlying SAT instances using quantum mechanical machinery. We exploit the fact that classical random variables have quantum decompositions to build the Horn clauses. We establish the semantics of the language in a rigorous fashion by considering an existing probabilistic logic language called PRISM with classical probability measures defined on the Herbrand base and extending it to the quantum context. In the classical case, H-interpretations form the sample space and probability measures defined on them lead to a consistent definition of probabilities for well formed formulae. In the quantum counterpart, we define probability amplitudes on H-interpretations, facilitating model generation and verification via quantum mechanical superpositions and entanglements. We cast the well formed formulae of the language as quantum mechanical observables, thus providing an elegant interpretation for their probabilities. We discuss several examples to combine statistical ensembles and predicates of first order logic to reason about situations involving uncertainty.
On Measuring Quantitative Interpretations of Reasonable Doubt
ERIC Educational Resources Information Center
Dhami, Mandeep K.
2008-01-01
Beyond reasonable doubt represents a probability value that acts as the criterion for conviction in criminal trials. I introduce the membership function (MF) method as a new tool for measuring quantitative interpretations of reasonable doubt. Experiment 1 demonstrated that three different methods (i.e., direct rating, decision theory based, and…
Interpretation of Confidence Interval Facing the Conflict
ERIC Educational Resources Information Center
Andrade, Luisa; Fernández, Felipe
2016-01-01
As literature has reported, it is usual that university students in statistics courses, and even statistics teachers, interpret the confidence level associated with a confidence interval as the probability that the parameter value will be between the lower and upper interval limits. To confront this misconception, class activities have been…
Bieber, Frederick R; Buckleton, John S; Budowle, Bruce; Butler, John M; Coble, Michael D
2016-08-31
The evaluation and interpretation of forensic DNA mixture evidence face growing challenges as mixture evidence becomes increasingly complex. Such challenges include: casework involving low quantity or degraded evidence leading to allele and locus dropout; allele sharing of contributors leading to allele stacking; and differentiation of PCR stutter artifacts from true alleles. There is variation in statistical approaches used to evaluate the strength of the evidence when inclusion of a specific known individual(s) is determined, and the approaches used must be supportable. There are concerns that methods utilized for interpretation of complex forensic DNA mixtures may not be implemented properly in some casework. Similar questions are being raised in a number of U.S. jurisdictions, leading to some confusion about mixture interpretation for current and previous casework. Key elements necessary for the interpretation and statistical evaluation of forensic DNA mixtures are described. Given that the most common method for statistical evaluation of DNA mixtures in many parts of the world, including the USA, is the Combined Probability of Inclusion/Exclusion (CPI/CPE), exposition and elucidation of this method and a protocol for its use are the focus of this article. Formulae and other supporting materials are provided. Guidance and details of a DNA mixture interpretation protocol are provided for application of the CPI/CPE method in the analysis of more complex forensic DNA mixtures. This description, in turn, should help reduce the variability of interpretation with application of this methodology and thereby improve the quality of DNA mixture interpretation throughout the forensic community.
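A minimal sketch of the CPI arithmetic itself (the article's protocol covers far more, e.g. stutter, dropout and when the method may be applied at all): at each locus the probability of inclusion is the squared sum of the population frequencies of the alleles observed in the mixture, and the combined value is the product across loci.

```python
def combined_probability_of_inclusion(loci_allele_freqs):
    """loci_allele_freqs: list of lists; each inner list holds the population
    frequencies of the alleles observed in the mixture at one locus."""
    cpi = 1.0
    for freqs in loci_allele_freqs:
        cpi *= sum(freqs) ** 2        # probability a random person is included at this locus
    return cpi

# two hypothetical loci with three and two observed alleles
cpi = combined_probability_of_inclusion([[0.10, 0.15, 0.05], [0.20, 0.10]])
print(cpi, 1.0 - cpi)                 # CPI and the combined probability of exclusion (CPE)
```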
Ice Flow in Debris Aprons and Central Peaks, and the Application of Crater Counts
NASA Astrophysics Data System (ADS)
Hartmann, W. K.; Quantin, C.; Werner, S. C.; Popova, O.
2009-03-01
We apply studies of decameter-scale craters to studies of probable ice-flow-related features on Mars, to interpret both chronometry and geological processes among the features. We find losses of decameter-scale craters relative to nearby plains, probably due to sublimation.
Towards a Theory of Semantic Communication (Extended Technical Report)
2011-03-01
...counting models of a sentence, when interpretations have different probabilities, what matters is the total probability of models of the sentence, not... of classic logics still hold in the LP semantics, e.g., De Morgan's laws. However, modus ponens does hold in the LP semantics...
Wijeysundera, Duminda N; Austin, Peter C; Hux, Janet E; Beattie, W Scott; Laupacis, Andreas
2009-01-01
Randomized trials generally use "frequentist" statistics based on P-values and 95% confidence intervals. Frequentist methods have limitations that might be overcome, in part, by Bayesian inference. To illustrate these advantages, we re-analyzed randomized trials published in four general medical journals during 2004. We used Medline to identify randomized superiority trials with two parallel arms, individual-level randomization and dichotomous or time-to-event primary outcomes. Studies with P<0.05 in favor of the intervention were deemed "positive"; otherwise, they were "negative." We used several prior distributions and exact conjugate analyses to calculate Bayesian posterior probabilities for clinically relevant effects. Of 88 included studies, 39 were positive using a frequentist analysis. Although the Bayesian posterior probabilities of any benefit (relative risk or hazard ratio<1) were high in positive studies, these probabilities were lower and variable for larger benefits. The positive studies had only moderate probabilities for exceeding the effects that were assumed for calculating the sample size. By comparison, there were moderate probabilities of any benefit in negative studies. Bayesian and frequentist analyses complement each other when interpreting the results of randomized trials. Future reports of randomized trials should include both.
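A sketch of the kind of posterior-probability calculation involved, using a normal prior and normal likelihood on the log relative risk with hypothetical trial numbers; the paper itself used exact conjugate analyses and several prior distributions:

```python
from math import erf, log, sqrt

def posterior_probability_of_benefit(log_rr_hat, se_log_rr, prior_mean=0.0, prior_sd=1.0):
    """Normal prior and normal likelihood on log(relative risk); returns P(RR < 1 | data).
    A normal-approximation sketch, not the exact conjugate analyses used in the paper."""
    prior_prec = 1.0 / prior_sd ** 2
    data_prec = 1.0 / se_log_rr ** 2
    post_mean = (prior_prec * prior_mean + data_prec * log_rr_hat) / (prior_prec + data_prec)
    post_sd = sqrt(1.0 / (prior_prec + data_prec))
    z = (0.0 - post_mean) / post_sd
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))           # P(log RR < 0)

# hypothetical trial: observed relative risk 0.80 with 95% CI 0.65 to 0.98
se = (log(0.98) - log(0.65)) / (2 * 1.96)
print(posterior_probability_of_benefit(log(0.80), se))
```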
Biasi, G.P.; Weldon, R.J.; Fumal, T.E.; Seitz, G.G.
2002-01-01
We introduce a quantitative approach to paleoearthquake dating and apply it to paleoseismic data from the Wrightwood and Pallett Creek sites on the southern San Andreas fault. We illustrate how stratigraphic ordering, sedimentological, and historical data can be used quantitatively in the process of estimating earthquake ages. Calibrated radiocarbon age distributions are used directly from layer dating through recurrence intervals and recurrence probability estimation. The method does not eliminate subjective judgements in event dating, but it does provide a means of systematically and objectively approaching the dating process. Date distributions for the most recent 14 events at Wrightwood are based on sample and contextual evidence in Fumal et al. (2002) and site context and slip history in Weldon et al. (2002). Pallett Creek event and dating descriptions are from published sources. For the five most recent events at Wrightwood, our results are consistent with previously published estimates, with generally comparable or narrower uncertainties. For Pallett Creek, our earthquake date estimates generally overlap with previous results but typically have broader uncertainties. Some event date estimates are very sensitive to details of data interpretation. The historical earthquake in 1857 ruptured the ground at both sites but is not constrained by radiocarbon data. Radiocarbon ages, peat accumulation rates, and historical constraints at Pallett Creek for event X yield a date estimate in the earliest 1800s and preclude a date in the late 1600s. This event is almost certainly the historical 1812 earthquake, as previously concluded by Sieh et al. (1989). This earthquake also produced ground deformation at Wrightwood. All events at Pallett Creek, except for event T, about A.D. 1360, and possibly event I, about A.D. 960, have corresponding events at Wrightwood with some overlap in age ranges. Event T falls during a period of low sedimentation at Wrightwood when conditions were not favorable for recording earthquake evidence. Previously proposed correlations of Pallett Creek X with Wrightwood W3 in the 1690s and Pallett Creek event V with W5 around 1480 (Fumal et al., 1993) appear unlikely after our dating reevaluation. Apparent internal inconsistencies among event, layer, and dating relationships around events R and V identify them as candidates for further investigation at the site. Conditional probabilities of earthquake recurrence were estimated using Poisson, lognormal, and empirical models. The presence of 12 or 13 events at Wrightwood during the same interval that 10 events are reported at Pallett Creek is reflected in mean recurrence intervals of 105 and 135 years, respectively. Average Poisson model 30-year conditional probabilities are about 20% at Pallett Creek and 25% at Wrightwood. The lognormal model conditional probabilities are somewhat higher, about 25% for Pallett Creek and 34% for Wrightwood. Lognormal variance σln estimates of 0.76 and 0.70, respectively, imply only weak time predictability. Conditional probabilities of 29% and 46%, respectively, were estimated for an empirical distribution derived from the data alone. Conditional probability uncertainties are dominated by the brevity of the event series; dating uncertainty contributes only secondarily.
Wrightwood and Pallett Creek event chronologies both suggest variations in recurrence interval with time, hinting that some form of recurrence rate modulation may be at work, but formal testing shows that neither series is more ordered than might be produced by a Poisson process.
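The Poisson 30-year conditional probabilities quoted above follow directly from the mean recurrence intervals (the lognormal and empirical estimates require the full event-date distributions); a quick check:

```python
from math import exp

def poisson_conditional_probability(mean_recurrence_years, window_years=30):
    """Probability of at least one event in the next window under a Poisson
    (memoryless) model with the given mean recurrence interval."""
    return 1.0 - exp(-window_years / mean_recurrence_years)

print(poisson_conditional_probability(135))   # Pallett Creek: ~0.20
print(poisson_conditional_probability(105))   # Wrightwood:    ~0.25
```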
Simulation of target interpretation based on infrared image features and psychology principle
NASA Astrophysics Data System (ADS)
Lin, Wei; Chen, Yu-hua; Gao, Hong-sheng; Wang, Zhan-feng; Wang, Ji-jun; Su, Rong-hua; Huang, Yan-ping
2009-07-01
Target feature extraction and identification are important and complicated processes in target interpretation; they directly affect the interpreter's psychosensorial response to the target infrared image and ultimately determine target viability. Using statistical decision theory and psychological principles, and by designing four psychophysical experiments, an interpretation model of the infrared target is established. The model obtains the target detection probability by calculating the similarity of four features between the target region and the background region delineated on the infrared image. Verified against a large number of practical target interpretations, the model simulates the target interpretation and detection process effectively and yields objective interpretation results, which can provide technical support for target extraction, identification and decision-making.
A Tutorial in Bayesian Potential Outcomes Mediation Analysis.
Miočević, Milica; Gonzalez, Oscar; Valente, Matthew J; MacKinnon, David P
2018-01-01
Statistical mediation analysis is used to investigate intermediate variables in the relation between independent and dependent variables. Causal interpretation of mediation analyses is challenging because randomization of subjects to levels of the independent variable does not rule out the possibility of unmeasured confounders of the mediator to outcome relation. Furthermore, commonly used frequentist methods for mediation analysis compute the probability of the data given the null hypothesis, which is not the probability of a hypothesis given the data as in Bayesian analysis. Under certain assumptions, applying the potential outcomes framework to mediation analysis allows for the computation of causal effects, and statistical mediation in the Bayesian framework gives indirect effects probabilistic interpretations. This tutorial combines causal inference and Bayesian methods for mediation analysis so the indirect and direct effects have both causal and probabilistic interpretations. Steps in Bayesian causal mediation analysis are shown in the application to an empirical example.
Vink, W D; Jones, G; Johnson, W O; Brown, J; Demirkan, I; Carter, S D; French, N P
2009-11-15
Bovine digital dermatitis (BDD) is an epidermitis which is a leading cause of infectious lameness. The only recognized diagnostic test is foot inspection, which is a labour-intensive procedure. There is no universally recognized, standardized lesion scoring system. As small lesions are easily missed, foot inspection has limited diagnostic sensitivity. Furthermore, interpretation is subjective, and prone to observer bias. Serology is more convenient to carry out and is potentially a more sensitive indicator of infection. By carrying out 20 serological assays using lesion-associated Treponema spp. isolates, three serogroups were identified. The reliability of the tests was established by assessing the level of agreement and the concordance correlation coefficient. Subsequently, an ELISA suitable for routine use was developed. The benchmark of diagnostic test validation is conventionally the determination of the key test parameters, sensitivity and specificity. This requires the imposition of a cut-off point. For serological assays with outcomes on a continuous scale, the degree by which the test result differs from this cut-off is disregarded. Bayesian statistical methodology has been developed which enables the assay result also to be interpreted on a continuous scale, thereby optimizing the information inherent in the test. Using a cross-sectional study dataset carried out on 8 representative dairy farms in the UK, the probability of infection, P(I), of each individual animal was estimated in the absence of a 'Gold Standard' by modelling I as a latent variable which was determined by lesion status, L as well as serology, S. Covariate data (foot hygiene score and age) were utilized to estimate P(L) when no lesion inspection was performed. Informative prior distributions were elicited where possible. The model was utilized for predictive inference, by computing estimates of P(I) and P(L) independently of the data. A more detailed and informative analysis of the farm-level distribution of infection could thus be performed. Also, biases associated with the subjective interpretation of lesion status were minimized. Model outputs showed that young stock were unlikely to be infected, whereas cows tended to have high or low probabilities of being infected. Estimates of probability of infection were considerably higher for animals with lesions than for those without. Associations were identified between both covariates and probability of infection in cows, but not in the young stock. Under the condition that the model assumptions are valid for the larger population, the results of this work can be generalized by predictive inference.
CPROB: A COMPUTATIONAL TOOL FOR CONDUCTING CONDITIONAL PROBABILITY ANALYSIS
Conditional probability analysis measures the probability of observing one event given that another event has occurred. In an environmental context, conditional probability analysis helps assess the association between an environmental contaminant (i.e. the stressor) and the ec...
NASA Astrophysics Data System (ADS)
Inoue, N.
2017-12-01
The conditional probability of surface rupture is affected by various factors, such as shallow material properties, the earthquake process, ground motions and so on. Toda (2013) pointed out differences in the conditional probability between strike-slip and reverse faults by considering the fault dip and the width of the seismogenic layer. This study evaluated the conditional probability of surface rupture using the following procedure. Fault geometry was determined from a randomly generated magnitude based on the method of The Headquarters for Earthquake Research Promotion (2017). If the defined fault plane did not saturate the assumed width of the seismogenic layer, the fault plane depth was assigned randomly within the seismogenic layer. A logistic analysis was performed on two data sets: the surface displacement calculated by dislocation methods (Wang et al., 2003) from the defined source fault, and the depth of the top of the defined source fault. The conditional probability estimated from surface displacement was higher for reverse faults than for strike-slip faults, a result that coincides with previous similar studies (e.g. Kagawa et al., 2004; Kataoka and Kusakabe, 2005). On the contrary, the probability estimated from the depth of the source fault was higher for thrust faults than for strike-slip and reverse faults, a trend similar to the conditional probabilities of PFDHA results (Youngs et al., 2003; Moss and Ross, 2011). The combined simulated results for thrust and reverse faults also show a low probability. The worldwide compiled reverse-fault data include low-dip-angle earthquakes. On the other hand, in the case of Japanese reverse faults, with fewer low-dip-angle earthquakes, the conditional probability of reverse faults may be low and similar to that of strike-slip faults (cf. Takao et al., 2013). In the future, numerical simulations that consider the failure conditions of the surface above the source fault will be performed in order to examine the amount of displacement and the conditional probability quantitatively.
NASA Astrophysics Data System (ADS)
Nalewajski, Roman F.
The flow of information in the molecular communication networks in the (condensed) atomic orbital (AO) resolution is investigated and the plane-wave (momentum-space) interpretation of the average Fisher information in the molecular information system is given. It is argued using the quantum-mechanical superposition principle that, in the LCAO MO theory the squares of corresponding elements of the Charge and Bond-Order (CBO) matrix determine the conditional probabilities between AO, which generate the molecular communication system of the Orbital Communication Theory (OCT) of the chemical bond. The conditional-entropy ("noise," information-theoretic "covalency") and the mutual-information (information flow, information-theoretic "ionicity") descriptors of these molecular channels are related to Wiberg's covalency indices of chemical bonds. The illustrative application of OCT to the three-orbital model of the chemical bond X-Y, which is capable of describing the forward- and back-donations as well as the atom promotion accompanying the bond formation, is reported. It is demonstrated that the entropy/information characteristics of these separate bond-effects can be extracted by an appropriate reduction of the output of the molecular information channel, carried out by combining several exits into a single (condensed) one. The molecular channels in both the AO and hybrid orbital representations are examined for both the molecular and representative promolecular input probabilities.
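A toy illustration of the bookkeeping described (with an assumed row normalization of the squared CBO elements, not Nalewajski's full formalism): squared CBO elements define AO-to-AO conditional probabilities, from which the channel "noise" (conditional entropy) and information flow (mutual information) follow.

```python
import numpy as np

def orbital_channel_descriptors(cbo, input_prob):
    """Toy orbital-communication bookkeeping.
    cbo:        square charge-and-bond-order matrix in the AO basis (assumed real);
    input_prob: AO input probabilities (molecular or promolecular)."""
    weights = cbo ** 2
    cond = weights / weights.sum(axis=1, keepdims=True)   # P(b|a); row normalization assumed
    joint = input_prob[:, None] * cond                    # P(a, b)
    out = joint.sum(axis=0)                               # output probabilities P(b)
    with np.errstate(divide="ignore", invalid="ignore"):
        noise = -np.nansum(joint * np.log2(cond))         # conditional entropy ("covalency")
        flow = np.nansum(joint * np.log2(joint / (input_prob[:, None] * out)))  # mutual information ("ionicity")
    return noise, flow

cbo = np.array([[1.0, 0.8],
                [0.8, 1.0]])          # hypothetical two-AO model
p_in = np.array([0.5, 0.5])
print(orbital_channel_descriptors(cbo, p_in))
```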
ERIC Educational Resources Information Center
Harris, Adam J. L.; Corner, Adam
2011-01-01
Verbal probability expressions are frequently used to communicate risk and uncertainty. The Intergovernmental Panel on Climate Change (IPCC), for example, uses them to convey risks associated with climate change. Given the potential for human action to mitigate future environmental risks, it is important to understand how people respond to these…
Are quantum-mechanical-like models possible, or necessary, outside quantum physics?
NASA Astrophysics Data System (ADS)
Plotnitsky, Arkady
2014-12-01
This article examines some experimental conditions that invite and possibly require recourse to quantum-mechanical-like mathematical models (QMLMs), models based on the key mathematical features of quantum mechanics, in scientific fields outside physics, such as biology, cognitive psychology, or economics. In particular, I consider whether the following two correlative features of quantum phenomena that were decisive for establishing the mathematical formalism of quantum mechanics play similarly important roles in QMLMs elsewhere. The first is the individuality and discreteness of quantum phenomena, and the second is the irreducibly probabilistic nature of our predictions concerning them, coupled to the particular character of the probabilities involved, as different from the character of probabilities found in classical physics. I also argue that these features could be interpreted in terms of a particular form of epistemology that suspends and even precludes a causal and, in the first place, realist description of quantum objects and processes. This epistemology limits the descriptive capacity of quantum theory to the description, classical in nature, of the observed quantum phenomena manifested in measuring instruments. Quantum mechanics itself only provides descriptions, probabilistic in nature, concerning numerical data pertaining to such phenomena, without offering a physical description of quantum objects and processes. While QMLMs share their use of the quantum-mechanical or analogous mathematical formalism, they may differ by the roles, if any, the two features in question play in them and by different ways of interpreting the phenomena they considered and this formalism itself. This article will address those differences as well.
Equivalence principle for quantum systems: dephasing and phase shift of free-falling particles
NASA Astrophysics Data System (ADS)
Anastopoulos, C.; Hu, B. L.
2018-02-01
We ask the question of how the (weak) equivalence principle established in classical gravitational physics should be reformulated and interpreted for massive quantum objects that may also have internal degrees of freedom (dof). This inquiry is necessary because even elementary concepts like a classical trajectory are not well defined in quantum physics—trajectories originating from quantum histories become viable entities only under stringent decoherence conditions. From this investigation we posit two logically and operationally distinct statements of the equivalence principle for quantum systems. Version A: the probability distribution of position for a free-falling particle is the same as the probability distribution of a free particle, modulo a mass-independent shift of its mean. Version B: any two particles with the same velocity wave-function behave identically in free fall, irrespective of their masses. Both statements apply to all quantum states, including those without a classical correspondence, and also for composite particles with quantum internal dof. We also investigate the consequences of the interaction between internal and external dof induced by free fall. For a class of initial states, we find dephasing occurs for the translational dof, namely, the suppression of the off-diagonal terms of the density matrix, in the position basis. We also find a gravitational phase shift in the reduced density matrix of the internal dof that does not depend on the particle’s mass. For classical states, the phase shift has a natural classical interpretation in terms of gravitational red-shift and special relativistic time-dilation.
Rank and independence in contingency table
NASA Astrophysics Data System (ADS)
Tsumoto, Shusaku
2004-04-01
A contingency table summarizes the conditional frequencies of two attributes and shows how these two attributes are dependent on each other. Thus, this table is a fundamental tool for pattern discovery with conditional probabilities, such as rule discovery. In this paper, a contingency table is interpreted from the viewpoint of statistical independence and granular computing. The first important observation is that a contingency table compares two attributes with respect to the number of equivalence classes. For example, an n x n table compares two attributes with the same granularity, while an m x n (m >= n) table compares two attributes with different granularities. The second important observation is that matrix algebra is a key tool for the analysis of this table. In particular, the rank, as a measure of the degree of independence, plays a very important role in evaluating statistical independence. Relations between rank and the degree of dependence are also investigated.
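A small illustration of the rank criterion: a two-way table of exactly independent attributes is the outer product of its marginals and therefore has rank 1, while dependence shows up as higher rank.

```python
import numpy as np

def table_rank(table):
    """Matrix rank of a contingency table; rank 1 corresponds to exact independence."""
    return np.linalg.matrix_rank(np.asarray(table, dtype=float))

independent = np.outer([40, 60], [0.3, 0.7])        # outer product of marginals -> rank 1
dependent = np.array([[30.0, 10.0],
                      [10.0, 50.0]])
print(table_rank(independent), table_rank(dependent))   # 1, 2
```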
Gavett, Brandon E
2015-03-01
The base rates of abnormal test scores in cognitively normal samples have been a focus of recent research. The goal of the current study is to illustrate how Bayes' theorem uses these base rates--along with the same base rates in cognitively impaired samples and prevalence rates of cognitive impairment--to yield probability values that are more useful for making judgments about the absence or presence of cognitive impairment. Correlation matrices, means, and standard deviations were obtained from the Wechsler Memory Scale--4th Edition (WMS-IV) Technical and Interpretive Manual and used in Monte Carlo simulations to estimate the base rates of abnormal test scores in the standardization and special groups (mixed clinical) samples. Bayes' theorem was applied to these estimates to identify probabilities of normal cognition based on the number of abnormal test scores observed. Abnormal scores were common in the standardization sample (65.4% scoring below a scaled score of 7 on at least one subtest) and more common in the mixed clinical sample (85.6% scoring below a scaled score of 7 on at least one subtest). Probabilities varied according to the number of abnormal test scores, base rates of normal cognition, and cutoff scores. The results suggest that interpretation of base rates obtained from cognitively healthy samples must also account for data from cognitively impaired samples. Bayes' theorem can help neuropsychologists answer questions about the probability that an individual examinee is cognitively healthy based on the number of abnormal test scores observed.
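The Bayes'-theorem step itself is simple; a sketch with hypothetical base rates (the paper's actual rates come from Monte Carlo simulation of the WMS-IV standardization and clinical samples):

```python
def probability_cognitively_normal(p_k_given_normal, p_k_given_impaired, base_rate_normal):
    """P(cognitively normal | k abnormal scores), from the base rates of observing k
    abnormal scores in normal and impaired samples and the prevalence of normal cognition."""
    numerator = p_k_given_normal * base_rate_normal
    denominator = numerator + p_k_given_impaired * (1 - base_rate_normal)
    return numerator / denominator

# hypothetical: 2+ abnormal subtest scores occur in 35% of healthy and 80% of impaired
# examinees, and 90% of examinees in this setting are cognitively healthy
print(probability_cognitively_normal(0.35, 0.80, 0.90))
```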
Quantum computing and probability.
Ferry, David K
2009-11-25
Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction.
Karahan Şen, Nazlı Pınar; Bekiş, Recep; Ceylan, Ali; Derebek, Erkan
2016-07-01
Myocardial perfusion scintigraphy (MPS) is a diagnostic test which is frequently used in the diagnosis of coronary heart disease (CHD). MPS is generally interpreted as ischemia present or absent; however, it has a power in predicting the disease, similar to other diagnostic tests. In this study, we aimed to assist in directing the high-risk patients to undergo coronary angiography (CA) primarily by evaluating patients without prior CHD history with pre-test and post-test probabilities. The study was designed as a retrospective study. Between January 2008 and July 2011, 139 patients with positive MPS results and followed by CA recently (<6 months) were evaluated from patient files. Patients' pre-test probabilities based on the Diamond and Forrester method and the likelihood ratios that were obtained from the literature were used to calculate the patients' post exercise and post-MPS probabilities. Patients were evaluated in risk groups as low, intermediate, and high, and an ROC curve analysis was performed for the post-MPS probabilities. Coronary artery stenosis (CAS) was determined in 59 patients (42.4%). A significant difference was determined between the risk groups according to CAS, both for the pre-test and post-test probabilities (p<0.001, p=0.024). The ROC analysis provided a cut-off value of 80.4% for post- MPS probability in predicting CAS with 67.9% sensitivity and 77.8% specificity. When the post-MPS probability is ≥80% in patients who have reversible perfusion defects on MPS, we suggest interpreting the MPS as "high probability positive" to improve the selection of true-positive patients to undergo CA, and these patients should be primarily recommended CA.
Şen, Nazlı Pınar Karahan; Bekiş, Recep; Ceylan, Ali; Derebek, Erkan
2016-01-01
Objective: Myocardial perfusion scintigraphy (MPS) is a diagnostic test which is frequently used in the diagnosis of coronary heart disease (CHD). MPS is generally interpreted as ischemia present or absent; however, it has a power in predicting the disease, similar to other diagnostic tests. In this study, we aimed to assist in directing the high-risk patients to undergo coronary angiography (CA) primarily by evaluating patients without prior CHD history with pre-test and post-test probabilities. Methods: The study was designed as a retrospective study. Between January 2008 and July 2011, 139 patients with positive MPS results and followed by CA recently (<6 months) were evaluated from patient files. Patients’ pre-test probabilities based on the Diamond and Forrester method and the likelihood ratios that were obtained from the literature were used to calculate the patients’ post-exercise and post-MPS probabilities. Patients were evaluated in risk groups as low, intermediate, and high, and an ROC curve analysis was performed for the post-MPS probabilities. Results: Coronary artery stenosis (CAS) was determined in 59 patients (42.4%). A significant difference was determined between the risk groups according to CAS, both for the pre-test and post-test probabilities (p<0.001, p=0.024). The ROC analysis provided a cut-off value of 80.4% for post-MPS probability in predicting CAS with 67.9% sensitivity and 77.8% specificity. Conclusion: When the post-MPS probability is ≥80% in patients who have reversible perfusion defects on MPS, we suggest interpreting the MPS as “high probability positive” to improve the selection of true-positive patients to undergo CA, and these patients should be primarily recommended CA. PMID:27004704
NASA Astrophysics Data System (ADS)
Smith, Leonard A.
2010-05-01
This contribution concerns "deep" or "second-order" uncertainty, such as the uncertainty in our probability forecasts themselves. It asks the question: "Is it rational to take (or offer) bets using model-based probabilities as if they were objective probabilities?" If not, what alternative approaches for determining odds, perhaps non-probabilistic odds, might prove useful in practice, given the fact we know our models are imperfect? We consider the case where the aim is to provide sustainable odds: not to produce a profit but merely to rationally expect to break even in the long run. In other words, to run a quantified risk of ruin that is relatively small. Thus the cooperative insurance schemes of coastal villages provide a more appropriate parallel than a casino. A "better" probability forecast would lead to lower premiums charged and less volatile fluctuations in the cash reserves of the village. Note that the Bayesian paradigm does not constrain one to interpret model distributions as subjective probabilities, unless one believes the model to be empirically adequate for the task at hand. In geophysics, this is rarely the case. When a probability forecast is interpreted as the objective probability of an event, the odds on that event can be easily computed as one divided by the probability of the event, and one need not favour taking either side of the wager. (Here we are using "odds-for" not "odds-to", the difference being whether of not the stake is returned; odds of one to one are equivalent to odds of two for one.) The critical question is how to compute sustainable odds based on information from imperfect models. We suggest that this breaks the symmetry between the odds-on an event and the odds-against it. While a probability distribution can always be translated into odds, interpreting the odds on a set of events might result in "implied-probabilities" that sum to more than one. And/or the set of odds may be incomplete, not covering all events. We ask whether or not probabilities based on imperfect models can be expected to yield probabilistic odds which are sustainable. Evidence is provided that suggest this is not the case. Even with very good models (good in an Root-Mean-Square sense), the risk of ruin of probabilistic odds is significantly higher than might be expected. Methods for constructing model-based non-probabilistic odds which are sustainable are discussed. The aim here is to be relevant to real world decision support, and so unrealistic assumptions of equal knowledge, equal compute power, or equal access to information are to be avoided. Finally, the use of non-probabilistic odds as a method for communicating deep uncertainty (uncertainty in a probability forecast itself) is discussed in the context of other methods, such as stating one's subjective probability that the models will prove inadequate in each particular instance (that is, the Probability of a "Big Surprise").
Breaking down the contribution of different meteorological mechanisms
NASA Astrophysics Data System (ADS)
Dufour, Ambroise; Tilinina, Natalia; Zolina, Olga; Gulev, Sergey
2017-04-01
Several mechanisms are held responsible for extreme atmospheric moisture transport into the Arctic, our case study: extratropical cyclones, breaking Rossby waves, blocking events, etc. Based on composite analysis, all these phenomena have been associated with above-average meridional moisture transport. These individual conclusions call for a synthesis in order to share the credit between the different mechanisms. However, it is impossible to break down the respective contributions by simply using their composites, due to the risk of double counting. Indeed, the different phenomena may occur simultaneously and have overlapping regions of influence; as a result, building composites for one phenomenon will likely count in a portion of the others as well. This ambiguity is addressed within a probabilistic framework by viewing composites as conditional expectations. For a given event A, the composite is written as the sum of each event's contribution weighted by the event's conditional probability given A. The composites for a set of events can then be interpreted as a linear system whose coefficients are conditional probabilities and whose solution is each event's individual contribution. Using data from ERA-Interim and cyclone tracks from the Shirshov Institute of Oceanology, we solve the linear system in the case of moisture transport through 70°N. The main result is to downgrade the collective influence of extratropical cyclones, due to the predominance of weak, inconsequential cyclones. Transient eddies are nonetheless responsible for more than 90% of the transport: this undermines the common but untested assumption that transient eddies are identical to extratropical cyclones.
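A sketch of the decomposition described, with hypothetical numbers for two mechanisms: if each composite equals the sum of every mechanism's own contribution weighted by its conditional probability given the compositing event, the individual contributions follow by solving the linear system.

```python
import numpy as np

# Hypothetical two mechanisms: cyclones (row/column 0) and blocking (row/column 1).
# cond[i, j] = P(mechanism j is active | mechanism i is active); the diagonal is 1.
cond = np.array([[1.0, 0.2],
                 [0.5, 1.0]])
# composite moisture transport observed when each mechanism is active (made-up units)
composites = np.array([12.0, 18.0])

# composites = cond @ contributions  ->  solve for each mechanism's own contribution
contributions = np.linalg.solve(cond, composites)
print(contributions)
```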
Predicting the Occurrence of Haze Events in Southeast Asia using Machine Learning Algorithms
NASA Astrophysics Data System (ADS)
Lee, H. H.; Chulakadabba, A.; Tonks, A.; Yang, Z.; Wang, C.
2017-12-01
Severe local- and regional-scale air pollution episodes typically originate from 1) high emissions of air pollutants, 2) poor dispersion conditions, and 3) trans-boundary pollutant transport. Biomass burning activities have become more frequent in Southeast Asia, especially in Sumatra, Borneo, and the mainland Southeast. Trans-boundary transport of biomass burning aerosols often lead to air quality problems in the region. Furthermore, particulate pollutants from human activities besides biomass burning also play an important role in the air quality of Southeast Asia. Singapore, for example, has a dynamic industrial sector including chemical, electric and metallurgic industries, and is the region's major petroleum-refining center. In addition, natural gas and oil power plants, waste incinerators, active port traffic, and a major regional airport further complicate Singapore's air quality issues. In this study, we compare five Machine Learning algorithms: k-Nearest Neighbors, Linear Support Vector Machine, Decision Tree, Random Forest and Artificial Neural Network, to identify haze patterns and determine variable importance. The algorithms were trained using local atmospheric data (i.e. months, atmospheric conditions, wind direction and relative humidity) from three observation stations in Singapore (Changi, Seletar and Paya Labar). We find that the algorithms reveal the associations in data within and between the stations, and provide in-depth interpretation of the haze sources. The algorithms also allow us to predict the probability of haze episodes in Singapore and to determine the correlation between this probability and atmospheric conditions.
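A minimal sketch of the classification step with one of the five algorithms (Random Forest), using synthetic data and hypothetical feature names; the study's actual predictors were month, atmospheric conditions, wind direction and relative humidity at the three Singapore stations:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
# synthetic training data: [month, wind_direction_deg, relative_humidity_pct]
X = np.column_stack([rng.integers(1, 13, 500),
                     rng.uniform(0.0, 360.0, 500),
                     rng.uniform(40.0, 100.0, 500)])
y = (X[:, 2] < 60).astype(int)       # toy rule standing in for observed haze labels

model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
print(model.feature_importances_)                         # variable importance
print(model.predict_proba([[9, 225.0, 55.0]])[0, 1])      # probability of a haze episode
```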
On the validity of Freud's dream interpretations.
Michael, Michael
2008-03-01
In this article I defend Freud's method of dream interpretation against those who criticize it as involving a fallacy-namely, the reverse causal fallacy-and those who criticize it as permitting many interpretations, indeed any that the interpreter wants to put on the dream. The first criticism misconstrues the logic of the interpretative process: it does not involve an unjustified reversal of causal relations, but rather a legitimate attempt at an inference to the best explanation. The judgement of whether or not a particular interpretation is the best explanation depends on the details of the case in question. I outline the kinds of probabilities involved in making the judgement. My account also helps to cash out the metaphors of the jigsaw and crossword puzzles that Freudians have used in response to the 'many interpretations' objection. However, in defending Freud's method of dream interpretation, I do not thereby defend his theory of dreams, which cannot be justified by his interpretations alone.
Dependence of elastic hadron collisions on impact parameter
NASA Astrophysics Data System (ADS)
Procházka, Jiří; Lokajíček, Miloš V.; Kundrát, Vojtěch
2016-05-01
Elastic proton-proton collisions represent probably the largest ensemble of available measured data, the analysis of which may provide a large amount of new physical results concerning fundamental particles. It is, however, necessary first to analyze some conclusions concerning pp collisions whose interpretations differ fundamentally from our common macroscopic experience. It has been argued, for example, that elastic hadron collisions are more central than inelastic ones, even though no explanation has yet been given for the existence of such different processes, i.e., elastic and inelastic (with hundreds of secondary particles) collisions, under the same conditions. That conclusion has been based on a number of simplifying mathematical assumptions (already made in earlier calculations), without their influence on the physical interpretation being analyzed and justified; this influence has begun to be studied in an approach based on the eikonal model. The possibility of a peripheral interpretation of elastic collisions will be demonstrated and the corresponding results summarized. Arguments will be given as to why no preference may be given to the claimed centrality over the standard peripheral behaviour. The contemporary description of elastic hadronic collisions as a function of the impact parameter will be discussed and the justification of some important assumptions considered.
Garrido-Balsells, José María; Jurado-Navas, Antonio; Paris, José Francisco; Castillo-Vazquez, Miguel; Puerta-Notario, Antonio
2015-03-09
In this paper, a novel and deeper physical interpretation of the recently published Málaga or ℳ statistical distribution is provided. This distribution, which is gaining wide acceptance in the scientific community, models the optical irradiance scintillation induced by atmospheric turbulence. Here, the analytical expressions previously published are modified in order to express them as a mixture of the known Generalized-K and the discrete Binomial and Negative Binomial distributions. In particular, the probability density function (pdf) of the ℳ model is now obtained as a linear combination of Generalized-K pdfs, in which the coefficients depend directly on the parameters of the ℳ distribution. In this way, the Málaga model can be physically interpreted as a superposition of different optical sub-channels, each of them described by the corresponding Generalized-K fading model and weighted by the ℳ-dependent coefficients. The expressions proposed here are simpler than the equations of the original ℳ model and are validated by numerical simulations, generating ℳ-distributed random sequences and their associated histograms. This novel interpretation of the Málaga statistical distribution provides a valuable tool for analyzing the performance of atmospheric optical channels under every turbulence condition.
ERIC Educational Resources Information Center
Brown, Catherine A.; And Others
1988-01-01
Suggests that secondary school students seem to have reasonably good procedural knowledge in areas of mathematics as rational numbers, probability, measurement, and data organization and interpretation. It appears, however, that students are lacking the conceptual knowledge enabling them to successfully do the assessment items on applications,…
Cross-Over Between Different Symmetries
NASA Astrophysics Data System (ADS)
Frauendorf, S.
2014-09-01
The yrast states of even-even vibrational and transitional nuclei are interpreted as a rotating condensate of interacting d-bosons. The corresponding semi-classical tidal wave concept is used for microscopic calculations of energies and E2 transition probabilities. The strong octupole correlations in the light rare-earth and actinide nuclides are interpreted as rotation-induced condensation of interacting f-bosons.
Reliability Estimation of Parameters of Helical Wind Turbine with Vertical Axis
Dumitrascu, Adela-Eliza; Lepadatescu, Badea; Dumitrascu, Dorin-Ion; Nedelcu, Anisor; Ciobanu, Doina Valentina
2015-01-01
Because wind turbines are used for prolonged periods, they must be characterized by high reliability. This can be achieved through rigorous design, appropriate simulation and testing, and proper construction. The reliability prediction and analysis of these systems lead to identifying the critical components, increasing the operating time, minimizing the failure rate, and minimizing maintenance costs. To estimate the energy produced by the wind turbine, an evaluation approach based on a Monte Carlo simulation model is developed which enables us to estimate the probability of minimum and maximum parameters. In our simulation process we used triangular distributions. The analysis of the simulation results focuses on the interpretation of the relative frequency histograms and the cumulative distribution curve (ogive diagram), which indicates the probability of obtaining the daily or annual energy output as a function of wind speed. The experimental research consists of estimating the reliability and unreliability functions and the hazard rate of the helical vertical-axis wind turbine designed and patented for the climatic conditions of Romanian regions. In addition, the variation of power produced at different wind speeds, the Weibull distribution of wind probability, and the power generated were determined. The analysis of the experimental results indicates that this type of wind turbine is efficient at low wind speeds. PMID:26167524
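A minimal sketch of the kind of Monte Carlo evaluation described above, using triangular input distributions and summarizing the output with a relative-frequency histogram and cumulative (ogive) curve. The distributions, the toy power curve, and all numbers are assumptions for illustration, not the turbine parameters from the study.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical triangular distributions (min, mode, max) for daily wind speed
# and turbine availability; the paper's actual parameter values are not given.
n_trials = 100_000
wind_speed = rng.triangular(left=2.0, mode=5.0, right=12.0, size=n_trials)   # m/s
availability = rng.triangular(left=0.85, mode=0.95, right=1.0, size=n_trials)

def daily_energy_kwh(v, avail, rated_kw=3.0, cut_in=2.5, rated_v=10.0):
    """Toy power curve: zero below cut-in, cubic up to rated speed, flat above."""
    frac = np.clip((v - cut_in) / (rated_v - cut_in), 0.0, 1.0) ** 3
    return rated_kw * frac * 24.0 * avail

energy = daily_energy_kwh(wind_speed, availability)

# Relative-frequency histogram and ogive (cumulative curve), as in the abstract.
counts, edges = np.histogram(energy, bins=30)
ogive = np.cumsum(counts) / n_trials

print("P(daily energy <= 10 kWh) ~", np.mean(energy <= 10.0))
print("5th/95th percentiles:", np.percentile(energy, [5, 95]))
```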
Adams, Michael J.; Chelgren, Nathan; Reinitz, David M.; Cole, Rebecca A.; Rachowicz, L.J.; Galvan, Stephanie; Mccreary, Brome; Pearl, Christopher A.; Bailey, Larissa L.; Bettaso, Jamie B.; Bull, Evelyn L.; Leu, Matthias
2010-01-01
Batrachochytrium dendrobatidis is a fungal pathogen that is receiving attention around the world for its role in amphibian declines. Study of its occurrence patterns is hampered by false negatives: the failure to detect the pathogen when it is present. Occupancy models are a useful but currently underutilized tool for analyzing detection data when the probability of detecting a species is <1. We use occupancy models to evaluate hypotheses concerning the occurrence and prevalence of B. dendrobatidis and discuss how this application differs from a conventional occupancy approach. We found that the probability of detecting the pathogen, conditional on presence of the pathogen in the anuran population, was related to amphibian development stage, day of the year, elevation, and human activities. Batrachochytrium dendrobatidis was found throughout our study area but was only estimated to occur in 53.4% of 78 populations of native amphibians and 66.4% of 40 populations of nonnative Rana catesbeiana tested. We found little evidence to support any spatial hypotheses concerning the probability that the pathogen occurs in a population, but did find evidence of some taxonomic variation. We discuss the interpretation of occupancy model parameters, when, unlike a conventional occupancy application, the number of potential samples or observations is finite.
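A small simulation, under assumed values, of the core issue the occupancy analysis addresses: when the per-sample detection probability (conditional on presence) is below 1, the naive proportion of populations with detections understates true occupancy. The occupancy, detection probability, and sampling effort below are illustrative, not estimates from the study.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical values: true occupancy of the pathogen and per-sample detection
# probability conditional on presence (both illustrative, not from the paper).
psi_true, p_detect = 0.6, 0.4
n_populations, n_samples = 200, 5          # animals swabbed per population

occupied = rng.random(n_populations) < psi_true
detections = rng.binomial(n_samples, p_detect * occupied)   # 0 detections if unoccupied

naive_occupancy = np.mean(detections > 0)
# Probability that an occupied population yields zero detections (a false negative
# at the population level) under the assumed binomial detection model.
p_miss = (1 - p_detect) ** n_samples
corrected = naive_occupancy / (1 - p_miss)

print(f"true psi={psi_true}, naive estimate={naive_occupancy:.2f}, "
      f"detection-corrected estimate={corrected:.2f}")
```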
Hydrogeologic unit flow characterization using transition probability geostatistics.
Jones, Norman L; Walker, Justin R; Carle, Steven F
2005-01-01
This paper describes a technique for applying the transition probability geostatistics method for stochastic simulation to a MODFLOW model. Transition probability geostatistics has some advantages over traditional indicator kriging methods including a simpler and more intuitive framework for interpreting geologic relationships and the ability to simulate juxtapositional tendencies such as fining upward sequences. The indicator arrays generated by the transition probability simulation are converted to layer elevation and thickness arrays for use with the new Hydrogeologic Unit Flow package in MODFLOW 2000. This makes it possible to preserve complex heterogeneity while using reasonably sized grids and/or grids with nonuniform cell thicknesses.
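The sketch below illustrates the basic idea of simulating a facies sequence from a vertical transition-probability matrix, as used in transition probability geostatistics; the three hydrofacies and the matrix entries are invented for illustration (real applications derive them from measured transition rates and mean lens thicknesses, and work in three dimensions).

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical vertical transition-probability matrix between three hydrofacies
# (0=clay, 1=silt, 2=sand); each row sums to 1.
facies = ["clay", "silt", "sand"]
T = np.array([[0.80, 0.15, 0.05],
              [0.10, 0.75, 0.15],
              [0.05, 0.20, 0.75]])

n_cells = 50
column = [0]                               # start in clay at the bottom of the column
for _ in range(n_cells - 1):
    column.append(rng.choice(3, p=T[column[-1]]))

# A fining-upward tendency, for instance, would show up as sand->silt->clay
# transitions being more probable than the reverse order.
print(" ".join(facies[i][0] for i in column))
```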
The Conjunction Fallacy and the Many Meanings of "And"
ERIC Educational Resources Information Center
Hertwig, Ralph; Benz, Bjorn; Krauss, Stefan
2008-01-01
According to the conjunction rule, the probability of A "and" B cannot exceed the probability of either single event. This rule reads "and" in terms of the logical operator [inverted v], interpreting A and B as an intersection of two events. As linguists have long argued, in natural language "and" can convey a wide range of relationships between…
The Conjunction Fallacy: A Misunderstanding about Conjunction?
ERIC Educational Resources Information Center
Tentori, Katya; Bonini, Nicolao; Osherson, Daniel
2004-01-01
It is easy to construct pairs of sentences X, Y that lead many people to ascribe higher probability to the conjunction X-and-Y than to the conjuncts X, Y. Whether an error is thereby committed depends on reasoners' interpretation of the expressions "probability" and "and." We report two experiments designed to clarify the normative status of…
Toomey, D E; Yang, K H; Van Ee, C A
2014-01-01
Physical biomechanical surrogates are critical for testing the efficacy of injury-mitigating safety strategies. The interpretation of measured Hybrid III neck loads in test scenarios resulting in compressive loading modes would be aided by a further understanding of the correlation between the mechanical responses in the Hybrid III neck and the probability of injury in the human cervical spine. The anthropomorphic test device (ATD) peak upper and lower neck responses were measured during dynamic compressive loading conditions comparable to those of postmortem human subject (PMHS) experiments. The peak ATD response could then be compared to the PMHS injury outcomes. A Hybrid III 50th percentile ATD head and neck assembly was tested under conditions matching those of male PMHS tests conducted on an inverted drop track. This includes variation in impact plate orientation (4 sagittal plane and 2 frontal plane orientations), impact plate surface friction, and ATD initial head/neck orientation. This unique matched data with known injury outcomes were used to evaluate existing ATD neck injury criteria. The Hybrid III ATD head and neck assembly was found to be robust and repeatable under severe loading conditions. The initial axial force response of the ATD head and neck is very comparable to PMHS experiments up to the point of PMHS cervical column buckle or material failure. An ATD lower neck peak compressive force as low as 6,290 N was associated with an unstable orthopedic cervical injury in a PMHS under equivalent impact conditions. ATD upper neck peak compressive force associated with a 5% probability of unstable cervical orthopedic injury ranged from as low as 3,708 to 3,877 N depending on the initial ATD neck angle. The correlation between peak ATD compressive neck response and PMHS test outcome in the current study resulted in a relationship between axial load and injury probability consistent with the current Hybrid III injury assessment reference values. The results add to the current understanding of cervical injury probability based on ATD neck compressive loading in that it is the only known study, in addition to Mertz et al. (1978), formulated directly from ATD compressive loading scenarios with known human injury outcomes.
Optimum space shuttle launch times relative to natural environment
NASA Technical Reports Server (NTRS)
King, R. L.
1977-01-01
The probabilities of favorable and unfavorable weather conditions for launch and landing of the STS under different criteria were computed for every three hours on a yearly basis using 14 years of weather data. These temporal probability distributions were considered for three sets of weather criteria encompassing benign, moderate, and severe weather conditions for both Kennedy Space Center and Edwards Air Force Base. In addition, the conditional probabilities of unfavorable weather conditions occurring after a delay, which may or may not be due to weather, were computed. Also computed for KSC were the probabilities of favorable landing conditions at various times after favorable launch conditions had prevailed. The probabilities were computed so as to indicate the significance of each weather element to the overall result.
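Conditional probabilities of this kind can be estimated directly by counting over a long observation record; the sketch below does so on a simulated stand-in for the 3-hourly series (a real analysis would use the archived weather data and the specific launch criteria).

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy stand-in for 14 years of 3-hourly observations: True = launch criteria met.
n_obs = 14 * 365 * 8
favorable = rng.random(n_obs) < 0.7

def p_unfavorable_after_delay(favorable, delay_steps):
    """P(unfavorable at t+delay | favorable at t), estimated by counting."""
    now, later = favorable[:-delay_steps], favorable[delay_steps:]
    return np.mean(~later[now])

for hours in (3, 6, 24):
    print(f"P(unfavorable {hours} h after a favorable slot) ~ "
          f"{p_unfavorable_after_delay(favorable, hours // 3):.3f}")
```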
Radiology's Achilles' heel: error and variation in the interpretation of the Röntgen image.
Robinson, P J
1997-11-01
The performance of the human eye and brain has failed to keep pace with the enormous technical progress in the first full century of radiology. Errors and variations in interpretation now represent the weakest aspect of clinical imaging. Those interpretations which differ from the consensus view of a panel of "experts" may be regarded as errors; where experts fail to achieve consensus, differing reports are regarded as "observer variation". Errors arise from poor technique, failures of perception, lack of knowledge and misjudgments. Observer variation is substantial and should be taken into account when different diagnostic methods are compared; in many cases the difference between observers outweighs the difference between techniques. Strategies for reducing error include attention to viewing conditions, training of the observers, availability of previous films and relevant clinical data, dual or multiple reporting, standardization of terminology and report format, and assistance from computers. Digital acquisition and display will probably not affect observer variation but the performance of radiologists, as measured by receiver operating characteristic (ROC) analysis, may be improved by computer-directed search for specific image features. Other current developments show that where image features can be comprehensively described, computer analysis can replace the perception function of the observer, whilst the function of interpretation can in some cases be performed better by artificial neural networks. However, computer-assisted diagnosis is still in its infancy and complete replacement of the human observer is as yet a remote possibility.
The Estimation of Tree Posterior Probabilities Using Conditional Clade Probability Distributions
Larget, Bret
2013-01-01
In this article I introduce the idea of conditional independence of separated subtrees as a principle by which to estimate the posterior probability of trees using conditional clade probability distributions rather than simple sample relative frequencies. I describe an algorithm for these calculations and software which implements these ideas. I show that these alternative calculations are very similar to simple sample relative frequencies for high probability trees but are substantially more accurate for relatively low probability trees. The method allows the posterior probability of unsampled trees to be calculated when these trees contain only clades that are in other sampled trees. Furthermore, the method can be used to estimate the total probability of the set of sampled trees which provides a measure of the thoroughness of a posterior sample. [Bayesian phylogenetics; conditional clade distributions; improved accuracy; posterior probabilities of trees.] PMID:23479066
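A minimal sketch of the conditional-clade idea, assuming rooted binary trees encoded as nested tuples: the probability of a topology is estimated as the product, over its internal nodes, of the sampled frequency of each clade's split conditional on the parent clade appearing. The five-taxon sample below is invented for illustration.

```python
from collections import Counter

def clade(t):
    return frozenset([t]) if isinstance(t, str) else clade(t[0]) | clade(t[1])

def splits(t, out):
    """Collect (parent clade, {left clade, right clade}) for every internal node."""
    if isinstance(t, str):
        return
    left, right = clade(t[0]), clade(t[1])
    out.append((left | right, frozenset([left, right])))
    splits(t[0], out); splits(t[1], out)

def conditional_clade_tables(sampled_trees):
    parent_counts, split_counts = Counter(), Counter()
    for t in sampled_trees:
        out = []
        splits(t, out)
        for parent, split in out:
            parent_counts[parent] += 1
            split_counts[(parent, split)] += 1
    return parent_counts, split_counts

def tree_probability(t, parent_counts, split_counts):
    """Product of conditional clade probabilities; 0 if a clade was never sampled."""
    out = []
    splits(t, out)
    p = 1.0
    for parent, split in out:
        if parent_counts[parent] == 0:
            return 0.0
        p *= split_counts[(parent, split)] / parent_counts[parent]
    return p

# Hypothetical posterior sample of three topologies on five taxa.
sample = [((("A", "B"), "C"), ("D", "E"))] * 60 + \
         [((("A", "C"), "B"), ("D", "E"))] * 30 + \
         [((("B", "C"), "A"), ("D", "E"))] * 10
pc, sc = conditional_clade_tables(sample)
for t in set(sample):
    print(t, round(tree_probability(t, pc, sc), 3))
```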
ERIC Educational Resources Information Center
Satake, Eiki; Amato, Philip P.
2008-01-01
This paper presents an alternative version of formulas of conditional probabilities and Bayes' rule that demonstrate how the truth table of elementary mathematical logic applies to the derivations of the conditional probabilities of various complex, compound statements. This new approach is used to calculate the prior and posterior probabilities…
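For reference, the kind of conditional-probability calculation such a derivation targets is the familiar Bayes'-rule update; the screening-test numbers in this short sketch are illustrative only.

```python
# Prior and posterior probabilities for a hypothetical screening test.
prior = 0.01            # P(condition)
sensitivity = 0.95      # P(positive | condition)
false_positive = 0.05   # P(positive | no condition)

p_positive = sensitivity * prior + false_positive * (1 - prior)
posterior = sensitivity * prior / p_positive    # P(condition | positive)

print(f"P(positive) = {p_positive:.4f}")
print(f"P(condition | positive) = {posterior:.3f}")   # ~0.161
```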
Negative values of quasidistributions and quantum wave and number statistics
NASA Astrophysics Data System (ADS)
Peřina, J.; Křepelka, J.
2018-04-01
We consider nonclassical wave and number quantum statistics, and perform a decomposition of quasidistributions for nonlinear optical down-conversion processes using Bessel functions. We show that negative values of the quasidistribution do not directly represent probabilities; however, they directly influence measurable number statistics. Negative terms in the decomposition related to the nonclassical behavior with negative amplitudes of probability can be interpreted as positive amplitudes of probability in the negative orthogonal Bessel basis, whereas positive amplitudes of probability in the positive basis describe classical cases. However, probabilities are positive in all cases, including negative values of quasidistributions. Negative and positive contributions of decompositions to quasidistributions are estimated. The approach can be adapted to quantum coherence functions.
Can we expect to predict climate if we cannot shadow weather?
NASA Astrophysics Data System (ADS)
Smith, Leonard
2010-05-01
What limits our ability to predict (or project) useful statistics of future climate? And how might we quantify those limits? In the early 1960s, Ed Lorenz illustrated one constraint on point forecasts of the weather (chaos) while noting another (model imperfections). In the mid-sixties he went on to discuss climate prediction, noting that chaos, per se, need not limit accurate forecasts of averages and the distributions that define climate. In short, chaos might place draconian limits on what we can say about a particular summer day in 2010 (or 2040), but it need not limit our ability to make accurate and informative statements about the weather over this summer as a whole, or climate distributions of the 2040's. If not chaos, what limits our ability to produce decision relevant probability distribution functions (PDFs)? Is this just a question of technology (raw computer power) and uncertain boundary conditions (emission scenarios)? Arguably, current model simulations of the Earth's climate are limited by model inadequacy: not that the initial or boundary conditions are unknown but that state-of-the-art models would not yield decision-relevant probability distributions even if they were known. Or to place this statement in an empirically falsifiable format: that in 2100 when the boundary conditions are known and computer power is (hopefully) sufficient to allow exhaustive exploration of today's state-of-the-art models: we will find today's models do not admit a trajectory consistent with our knowledge of the state of the earth in 2009 which would prove of decision support relevance for, say, 25 km, hourly resolution. In short: today's models cannot shadow the weather of this century even after the fact. Restating this conjecture in a more positive frame: a 2100 historian of science will be able to determine the highest space and time scales on which 2009 models could have (i) produced trajectories plausibly consistent with the (by then) observed twenty-first century and (ii) produced probability distributions useful as such for decision support. As it will be some time until such conjectures can be refuted, how might we best advise decision makers of the detail (specifically, space and time resolution of a quantity of interest as a function of lead-time) that it is rational to interpret model-based PDFs as decision-relevant probability distributions? Given the nonlinearities already incorporated in our models, how far into the future can one expect a simulation to get the temperature "right" given the simulation has precipitation badly "wrong"? When can biases in local temperature which melt model-ice no longer be dismissed, and neglected by presenting model-anomalies? At what lead times will feedbacks due to model inadequacies cause the 2007 model simulations to drift away from what today's basic science (and 2100 computer power) would suggest? How might one justify quantitative claims regarding "extreme events" (or NUMB weather)? Models are unlikely to forecast things they cannot shadow, or at least track. There is no constraint on rational scientists to take model distributions as their subjective probabilities, unless they believe the model is empirically adequate. How then are we to use today's simulations to inform today's decisions? Two approaches are considered. The first augments the model-based PDF with an explicit subjective-probability of a "Big Surprise". 
The second is to look not for a PDF but, following Solvency II, consider the risk from any event that cannot be ruled out at, say, the one in 200 level. The fact that neither approach provides the simplicity and apparent confidence of interpreting model-based PDFs as if they were objective probabilities does not contradict the claim that either might lead to better decision-making.
Hamster Math: Authentic Experiences in Data Collection.
ERIC Educational Resources Information Center
Jorgensen, Beth
1996-01-01
Describes the data collection and interpretation project of primary grade students involving predicting, graphing, estimating, measuring, number problem construction, problem solving, and probability. (MKR)
NASA Astrophysics Data System (ADS)
Kergadallan, Xavier; Bernardara, Pietro; Benoit, Michel; Andreewsky, Marc; Weiss, Jérôme
2013-04-01
Estimating the probability of occurrence of extreme sea levels is a central issue for the protection of the coast. Return periods of sea level with the wave set-up contribution are estimated here at one site: Cherbourg, France, in the English Channel. The methodology follows two steps: the first is computation of the joint probability of simultaneous wave height and still sea level; the second is interpretation of those joint probabilities to assess a sea level for a given return period. Two different approaches were evaluated to compute the joint probability of simultaneous wave height and still sea level: the first uses multivariate extreme-value distributions of logistic type, in which all components of the variables become large simultaneously; the second is a conditional approach for multivariate extreme values, in which only one component of the variables has to be large. Two different methods were applied to estimate the sea level with wave set-up contribution for a given return period: Monte Carlo simulation, in which the estimation is more accurate but requires more computation time, and classical ocean-engineering design contours of inverse-FORM type, in which the method is simpler and allows more complex estimation of the wave set-up part (wave propagation to the coast, for example). We compare results from the two different approaches with the two different methods. To be able to use both the Monte Carlo simulation and design contours methods, the wave set-up is estimated with a simple empirical formula. We show the advantages of the conditional approach over the multivariate extreme-value approach when extreme sea levels occur because either the surge or the wave height is large. We discuss the validity of the ocean-engineering design contours method, which is an alternative when the computation of sea levels is too complex to use the Monte Carlo simulation method.
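The Monte Carlo route can be sketched as follows: simulate many storm events per year, convert wave height to set-up with a simple empirical factor, take annual maxima of still level plus set-up, and read return levels from the empirical quantiles. All distributions, the dependence structure, and the set-up formula below are placeholders, not the ones fitted for Cherbourg.

```python
import numpy as np

rng = np.random.default_rng(11)

n_years, events_per_year = 10_000, 50

def simulate_annual_maxima():
    # Illustrative marginals and a crude positive dependence between still level
    # and significant wave height; not the fitted joint model from the study.
    still = rng.gumbel(loc=4.0, scale=0.25, size=(n_years, events_per_year))   # m
    hs = rng.weibull(1.5, size=(n_years, events_per_year)) * 2.0               # m
    hs += 0.5 * (still - 4.0)
    setup = 0.2 * np.clip(hs, 0.0, None)        # simple empirical wave set-up
    return (still + setup).max(axis=1)

annual_max = simulate_annual_maxima()

for T in (10, 100, 1000):
    level = np.quantile(annual_max, 1.0 - 1.0 / T)
    print(f"{T:>4}-year sea level with set-up ~ {level:.2f} m")
```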
NASA Astrophysics Data System (ADS)
Roban, R. D.; Krézsek, C.; Melinte-Dobrinescu, M. C.
2017-06-01
The mid Cretaceous is characterized by high eustatic sea-levels with widespread oxic conditions that made possible the occurrence of globally correlated Oceanic Red Beds. However, very often, these eustatic signals have been overprinted by local tectonics, which in turn resulted in Lower Cretaceous closed and anoxic basins, as in the Eastern Carpathians. There, the black shale to red bed transition occurs in the latest Albian up to the early Cenomanian. Although earlier studies discussed the large-scale basin configuration, no detailed petrography and sedimentology study has been performed in the Eastern Carpathians. This paper describes the Hauterivian to Turonian lithofacies and interprets the depositional settings based on their sedimentological features. The studied sections crop out only in tectonic half windows of the Eastern Carpathians, part of the Vrancea Nappe. The lithofacies comprises black shales interbedded with siderites and sandstones, calcarenites, marls, radiolarites and red shales. The siliciclastic muddy lithofacies in general reflects accumulation by suspension settling of pelagites and hemipelagites in anoxic (black shale) to dysoxic (dark gray and gray to green shales) and oxic (red shales) conditions. The radiolarites alternate with siliceous shales and are considered as evidence of climate changes. The sandstones represent mostly low and high-density turbidite currents in deep-marine lobes, as well as channel/levee systems. The source area is an eastern one, e.g., the Eastern Carpathians Foreland, given the abundance of low grade metamorphic clasts. The Hauterivian - lower Albian sediments are interpreted as deep-marine, linear and multiple sourced mud dominated systems deposited in a mainly anoxic to dysoxic basin. The anoxic conditions existed in the early to late Albian, but sedimentation changed to a higher energy mud/sand-dominated submarine channels and levees. This coarsening upwards tendency is interpreted as the effect of the Aptian to Albian compressional tectonics of the Carpathians. The deepening of the Moldavide Basin from the Cenomanian is most probably linked to a significant sea-level rise.
ERIC Educational Resources Information Center
Satake, Eiki; Vashlishan Murray, Amy
2015-01-01
This paper presents a comparison of three approaches to the teaching of probability to demonstrate how the truth table of elementary mathematical logic can be used to teach the calculations of conditional probabilities. Students are typically introduced to the topic of conditional probabilities--especially the ones that involve Bayes' rule--with…
A Bayesian account of quantum histories
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marlow, Thomas
2006-05-15
We investigate whether quantum history theories can be consistent with Bayesian reasoning and whether such an analysis helps clarify the interpretation of such theories. First, we summarise and extend recent work categorising two different approaches to formalising multi-time measurements in quantum theory. The standard approach consists of describing an ordered series of measurements in terms of history propositions with non-additive 'probabilities.' The non-standard approach consists of defining multi-time measurements to consist of sets of exclusive and exhaustive history propositions and recovering the single-time exclusivity of results when discussing single-time history propositions. We analyse whether such history propositions can be consistent with Bayes' rule. We show that a certain class of histories is given a natural Bayesian interpretation, namely, the linearly positive histories originally introduced by Goldstein and Page. Thus, we argue that this gives a certain amount of interpretational clarity to the non-standard approach. We also attempt a justification of our analysis using Cox's axioms of probability theory.
The effects of proficiency and bias on residents' interpretation of the microscopic urinalysis.
Flach, Stephen D; Canaris, Gay J; Tape, Thomas G; Huntley, Kathryn M; Wigton, Robert S
2002-01-01
This study aims to determine whether residents are influenced by clinical information when interpreting microscopic urinalysis (UA) and estimating the probability of a urinary tract infection (UTI), and to determine the accuracy and reliability of UA readings. Residents estimated the UA white blood cell count and the probability of a UTI in vignettes using a fractional factorial design, varying symptoms, gender, and the white blood cell count on preprepared urine slides. Individual-level results indicated a clinical information bias and poor accuracy. Seventeen of 38 residents increased the white blood cell count in response to female gender; 14 increased the white blood cell count in response to UTI symptoms. Forty-nine percent of the readings were inaccurate; agreement ranged from 50% to 67% for white and red blood cells and bacteria. Many residents gave inaccurate UA readings, and many readings varied with clinical information. A significant portion of residents needs assistance in objectively and accurately interpreting the UA.
NASA Astrophysics Data System (ADS)
Ronde, Christian De
In classical physics, probabilistic or statistical knowledge has always been related to ignorance or inaccurate subjective knowledge about an actual state of affairs. This idea has been extended to quantum mechanics through a completely incoherent interpretation of the Fermi-Dirac and Bose-Einstein statistics in terms of "strange" quantum particles. This interpretation, naturalized through a widespread "way of speaking" in the physics community, contradicts Born's physical account of Ψ as a "probability wave" which provides statistical information about outcomes that, in fact, cannot be interpreted in terms of `ignorance about an actual state of affairs'. In the present paper we discuss how the metaphysics of actuality has played an essential role in limiting the possibilities of understanding things differently. We propose instead a metaphysical scheme in terms of immanent powers with definite potentia which allows us to consider quantum probability in a new light, namely, as providing objective knowledge about a potential state of affairs.
Geophysical interpretations west of and within the northwestern part of the Nevada Test Site
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grauch, V.J.; Sawyer, D.A.; Fridrich, C.J.
1997-12-31
This report focuses on interpretation of gravity and new magnetic data west of the Nevada Test Site (NTS) and within the northwestern part of NTS. The interpretations integrate the gravity and magnetic data with other geophysical, geological, and rock property data to put constraints on tectonic and magmatic features not exposed at the surface. West of NTS, where drill hole information is absent, these geophysical data provide the best available information on the subsurface. Interpreted subsurface features include calderas, intrusions, basalt flows and volcanoes, Tertiary basins, structurally high pre-Tertiary rocks, and fault zones. New features revealed by this study include (1) a north-south buried tectonic fault east of Oasis Mountain, which the authors call the Hogback fault; (2) an east striking fault or accommodation zone along the south side of Oasis Valley basin, which they call the Hot Springs fault; (3) a NNE striking structural zone coinciding with the western margins of the caldera complexes; (4) regional magnetic highs that probably represent a thick sequence of Tertiary volcanic rocks; and (5) two probable buried calderas that may be related to the tuffs of Tolicha Peak and of Sleeping Butte, respectively.
Quantifying Treatment Benefit in Molecular Subgroups to Assess a Predictive Biomarker
Iasonos, Alexia; Chapman, Paul B.; Satagopan, Jaya M.
2016-01-01
There is an increased interest in finding predictive biomarkers that can guide treatment options for both mutation carriers and non-carriers. The statistical assessment of variation in treatment benefit (TB) according to the biomarker carrier status plays an important role in evaluating predictive biomarkers. For time to event endpoints, the hazard ratio (HR) for interaction between treatment and a biomarker from a Proportional Hazards regression model is commonly used as a measure of variation in treatment benefit. While this can be easily obtained using available statistical software packages, the interpretation of HR is not straightforward. In this article, we propose different summary measures of variation in TB on the scale of survival probabilities for evaluating a predictive biomarker. The proposed summary measures can be easily interpreted as quantifying differential in TB in terms of relative risk or excess absolute risk due to treatment in carriers versus non-carriers. We illustrate the use and interpretation of the proposed measures using data from completed clinical trials. We encourage clinical practitioners to interpret variation in TB in terms of measures based on survival probabilities, particularly in terms of excess absolute risk, as opposed to HR. PMID:27141007
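As a simple illustration of the proposed scale, treatment benefit can be summarized from survival probabilities as a relative risk and an excess absolute risk in carriers versus non-carriers; the 5-year survival figures below are made up, not data from the trials analyzed in the article.

```python
# Hypothetical 5-year survival probabilities by biomarker status and treatment arm.
surv = {
    ("carrier",     "treated"): 0.80, ("carrier",     "control"): 0.60,
    ("non-carrier", "treated"): 0.70, ("non-carrier", "control"): 0.65,
}

def risk(group, arm):                 # risk of the event = 1 - survival probability
    return 1.0 - surv[(group, arm)]

for group in ("carrier", "non-carrier"):
    rr = risk(group, "treated") / risk(group, "control")
    excess = risk(group, "control") - risk(group, "treated")   # absolute risk reduction
    print(f"{group:12s}: relative risk = {rr:.2f}, excess absolute risk = {excess:.2f}")

# Variation in treatment benefit across biomarker status, on the absolute scale.
diff_in_excess = (risk("carrier", "control") - risk("carrier", "treated")) - \
                 (risk("non-carrier", "control") - risk("non-carrier", "treated"))
print(f"difference in absolute benefit (carriers - non-carriers) = {diff_in_excess:.2f}")
```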
Truth, models, model sets, AIC, and multimodel inference: a Bayesian perspective
Barker, Richard J.; Link, William A.
2015-01-01
Statistical inference begins with viewing data as realizations of stochastic processes. Mathematical models provide partial descriptions of these processes; inference is the process of using the data to obtain a more complete description of the stochastic processes. Wildlife and ecological scientists have become increasingly concerned with the conditional nature of model-based inference: what if the model is wrong? Over the last 2 decades, Akaike's Information Criterion (AIC) has been widely and increasingly used in wildlife statistics for 2 related purposes, first for model choice and second to quantify model uncertainty. We argue that for the second of these purposes, the Bayesian paradigm provides the natural framework for describing uncertainty associated with model choice and provides the most easily communicated basis for model weighting. Moreover, Bayesian arguments provide the sole justification for interpreting model weights (including AIC weights) as coherent (mathematically self consistent) model probabilities. This interpretation requires treating the model as an exact description of the data-generating mechanism. We discuss the implications of this assumption, and conclude that more emphasis is needed on model checking to provide confidence in the quality of inference.
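For concreteness, a short sketch of computing AIC differences and the usual AIC weights for a hypothetical candidate model set; these weights are the quantities that, as argued above, can be read as model probabilities only under a Bayesian interpretation (with equal prior model probabilities and the model treated as an exact description of the data-generating mechanism).

```python
import numpy as np

aic = np.array([1012.4, 1014.1, 1019.8])        # illustrative AIC values
delta = aic - aic.min()                         # AIC differences
weights = np.exp(-0.5 * delta)
weights /= weights.sum()                        # normalized AIC weights

for i, (d, w) in enumerate(zip(delta, weights), start=1):
    print(f"model {i}: dAIC = {d:5.1f}, weight = {w:.3f}")
```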
Walter G. Thies; Douglas J. Westlind; Mark Loewen; Greg Brenner
2008-01-01
The Malheur model for fire-caused delayed mortality is presented as an easily interpreted graph (mortality-probability calculator) as part of a one-page field guide that allows the user to determine postfire probability of mortality for ponderosa pine (Pinus ponderosa Dougl. ex Laws.). Following both prescribed burns and wildfires, managers need...
Ergodic Theory, Interpretations of Probability and the Foundations of Statistical Mechanics
NASA Astrophysics Data System (ADS)
van Lith, Janneke
The traditional use of ergodic theory in the foundations of equilibrium statistical mechanics is that it provides a link between thermodynamic observables and microcanonical probabilities. First of all, the ergodic theorem demonstrates the equality of microcanonical phase averages and infinite time averages (albeit for a special class of systems, and up to a measure zero set of exceptions). Secondly, one argues that actual measurements of thermodynamic quantities yield time averaged quantities, since measurements take a long time. The combination of these two points is held to be an explanation why calculating microcanonical phase averages is a successful algorithm for predicting the values of thermodynamic observables. It is also well known that this account is problematic. This survey intends to show that ergodic theory nevertheless may have important roles to play, and it explores three other uses of ergodic theory. Particular attention is paid, firstly, to the relevance of specific interpretations of probability, and secondly, to the way in which the concern with systems in thermal equilibrium is translated into probabilistic language. With respect to the latter point, it is argued that equilibrium should not be represented as a stationary probability distribution as is standardly done; instead, a weaker definition is presented.
Population-specific FST values for forensic STR markers: A worldwide survey.
Buckleton, John; Curran, James; Goudet, Jérôme; Taylor, Duncan; Thiery, Alexandre; Weir, B S
2016-07-01
The interpretation of matching between DNA profiles of a person of interest and an item of evidence is undertaken using population genetic models to predict the probability of matching by chance. Calculation of matching probabilities is straightforward if allelic probabilities are known, or can be estimated, in the relevant population. It is more often the case, however, that the relevant population has not been sampled and allele frequencies are available only from a broader collection of populations as might be represented in a national or regional database. Variation of allele probabilities among the relevant populations is quantified by the population structure quantity FST and this quantity affects matching proportions. Matching within a population can be interpreted only with respect to matching between populations and we show here that FST, can be estimated from sample allelic matching proportions within and between populations. We report such estimates from data we extracted from 250 papers in the forensic literature, representing STR profiles at up to 24 loci from nearly 500,000 people in 446 different populations. The results suggest that theta values in current forensic use do not have the buffer of conservatism often thought. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
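A minimal sketch of the estimation idea described above: contrasting the average allelic matching proportion within populations with the matching proportion between populations. The estimator form and the numbers below are an illustrative assumption, not the survey's reported calculations or values.

```python
def fst_from_matching(m_within, m_between):
    """FST ~ (Mw - Mb) / (1 - Mb), where Mw is the average probability that two
    alleles drawn within a population match and Mb the analogous probability
    for alleles drawn from two different populations."""
    return (m_within - m_between) / (1.0 - m_between)

# Hypothetical matching proportions at one STR locus.
m_within, m_between = 0.230, 0.215
print(f"estimated FST = {fst_from_matching(m_within, m_between):.4f}")   # ~0.019
```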
Population-specific FST values for forensic STR markers: A worldwide survey
Buckleton, John; Curran, James; Goudet, Jérôme; Taylor, Duncan; Thiery, Alexandre; Weir, B.S.
2016-01-01
The interpretation of matching between DNA profiles of a person of interest and an item of evidence is undertaken using population genetic models to predict the probability of matching by chance. Calculation of matching probabilities is straightforward if allelic probabilities are known, or can be estimated, in the relevant population. It is more often the case, however, that the relevant population has not been sampled and allele frequencies are available only from a broader collection of populations as might be represented in a national or regional database. Variation of allele probabilities among the relevant populations is quantified by the population structure quantity FST and this quantity affects matching proportions. Matching within a population can be interpreted only with respect to matching between populations and we show here that FST can be estimated from sample allelic matching proportions within and between populations. We report such estimates from data we extracted from 250 papers in the forensic literature, representing STR profiles at up to 24 loci from nearly 500,000 people in 446 different populations. The results suggest that theta values in current forensic use do not have the buffer of conservatism often thought. PMID:27082756
Physical and chemical properties of the Martian soil: Review of resources
NASA Technical Reports Server (NTRS)
Stoker, C. R.; Gooding, James L.; Banin, A.; Clark, Benton C.; Roush, Ted
1991-01-01
The chemical and physical properties of Martian surface materials are reviewed from the perspective of using these resources to support human settlement. The resource potential of Martian sediments and soils can only be inferred from limited analyses performed by the Viking Landers (VL), from information derived from remote sensing, and from analysis of the SNC meteorites thought to be from Mars. Bulk elemental compositions by the VL inorganic chemical (x-ray fluorescence) analysis experiments have been interpreted as evidence for clay minerals (possibly smectites) or mineraloids (palagonite) admixed with sulfate and chloride salts. The materials contained minerals bearing Fe, Ti, Al, Mg and Si. Martian surface materials may be used in many ways. Martian soil, with appropriate preconditioning, can probably be used as a plant growth medium, supplying mechanical support, nutrient elements, and water at optimal conditions to the plants. Loose Martian soils could be used to cover structures and provide radiation shielding for surface habitats. Martian soil could be wetted and formed into adobe bricks used for construction. Duricrete bricks, with strength comparable to concrete, can probably be formed using compressed muds made from Martian soil.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marchewka, M., E-mail: marmi@ur.edu.pl; Woźny, M.; Polit, J.
2014-03-21
To understand and interpret the experimental data on the phonon spectra of the solid solutions, it is necessary to describe mathematically the non-regular distribution of atoms in their lattices. It appears that such description is possible in case of the strongly stochastically homogenous distribution which requires a great number of atoms and very carefully mixed alloys. These conditions are generally fulfilled in case of high quality homogenous semiconductor solid solutions of the III–V and II–VI semiconductor compounds. In this case, we can use the Bernoulli relation describing probability of the occurrence of one n equivalent event which can be applied to the probability of finding one from n configurations in the solid solution lattice. The results described in this paper for ternary HgCdTe and GaAsP as well as quaternary ZnCdHgTe can provide an affirmative answer to the question: whether stochastic geometry, e.g., the Bernoulli relation, is enough to describe the observed phonon spectra.
NASA Astrophysics Data System (ADS)
Marchewka, M.; Woźny, M.; Polit, J.; Kisiel, A.; Robouch, B. V.; Marcelli, A.; Sheregii, E. M.
2014-03-01
To understand and interpret the experimental data on the phonon spectra of the solid solutions, it is necessary to describe mathematically the non-regular distribution of atoms in their lattices. It appears that such description is possible in case of the strongly stochastically homogenous distribution which requires a great number of atoms and very carefully mixed alloys. These conditions are generally fulfilled in case of high quality homogenous semiconductor solid solutions of the III-V and II-VI semiconductor compounds. In this case, we can use the Bernoulli relation describing probability of the occurrence of one n equivalent event which can be applied, to the probability of finding one from n configurations in the solid solution lattice. The results described in this paper for ternary HgCdTe and GaAsP as well as quaternary ZnCdHgTe can provide an affirmative answer to the question: whether stochastic geometry, e.g., the Bernoulli relation, is enough to describe the observed phonon spectra.
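Under the random (strongly stochastically homogeneous) mixing assumption invoked above, the probability of each local cation configuration follows directly from binomial (Bernoulli) statistics; the short sketch below evaluates these probabilities for the four nearest-neighbour cation sites around an anion in a ternary alloy such as Hg(1-x)Cd(x)Te, with an illustrative composition x.

```python
from math import comb

def configuration_probabilities(x, n_sites=4):
    """Binomial probability of k Cd atoms among n_sites nearest-neighbour cation
    sites, assuming a purely random cation distribution with Cd fraction x."""
    return [comb(n_sites, k) * x**k * (1 - x)**(n_sites - k) for k in range(n_sites + 1)]

x = 0.3   # illustrative Cd fraction
for k, p in enumerate(configuration_probabilities(x)):
    print(f"{k} Cd / {4 - k} Hg neighbours: P = {p:.3f}")
print("sum =", sum(configuration_probabilities(x)))   # = 1
```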
Bell's "Theorem": loopholes vs. conceptual flaws
NASA Astrophysics Data System (ADS)
Kracklauer, A. F.
2017-12-01
A historical overview and detailed explication are presented of a critical analysis of what has become known as Bell's Theorem, to the effect that it should be impossible to extend Quantum Theory with the addition of local, real variables so as to obtain a version free of the ambiguous and preternatural features of the currently accepted interpretations. The central point on which this critical analysis, due originally to Edwin Jaynes, rests is that Bell incorrectly applied probabilistic formulas involving conditional probabilities. In addition, mathematical technicalities that have complicated the understanding of the logical and mathematical setting in which current theory and experimentation are embedded are discussed. Finally, some historical speculations are offered on the sociological environment, in particular its misleading aspects, in which recent generations of physicists have lived and worked.
Stochastic foundations in nonlinear density-regulation growth
NASA Astrophysics Data System (ADS)
Méndez, Vicenç; Assaf, Michael; Horsthemke, Werner; Campos, Daniel
2017-08-01
In this work we construct individual-based models that give rise to the generalized logistic model at the mean-field deterministic level and that allow us to interpret the parameters of these models in terms of individual interactions. We also study the effect of internal fluctuations on the long-time dynamics for the different models that have been widely used in the literature, such as the theta-logistic and Savageau models. In particular, we determine the conditions for population extinction and calculate the mean time to extinction. If the population does not become extinct, we obtain analytical expressions for the population abundance distribution. Our theoretical results are based on WKB theory and the probability generating function formalism and are verified by numerical simulations.
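A minimal stochastic-simulation sketch of the kind of individual-based model discussed above: a birth-death (Gillespie) process whose mean-field limit is the logistic equation, used here to estimate the mean time to extinction by direct simulation. The rates and carrying capacity are illustrative assumptions (chosen small so extinction is fast), not the paper's WKB-based analytical results.

```python
import numpy as np

rng = np.random.default_rng(5)

# Birth rate b*n and death rate d*n + (b-d)*n^2/K give the mean-field limit
# dn/dt = r*n*(1 - n/K) with r = b - d (the theta=1 logistic case).
b, d, K = 1.2, 1.0, 10.0

def time_to_extinction(n0=10, max_events=1_000_000):
    n, t = n0, 0.0
    for _ in range(max_events):
        if n == 0:
            return t
        birth = b * n
        death = d * n + (b - d) * n * n / K
        total = birth + death
        t += rng.exponential(1.0 / total)
        n += 1 if rng.random() < birth / total else -1
    return np.nan        # did not go extinct within the event budget

times = np.array([time_to_extinction() for _ in range(200)])
print(f"mean time to extinction ~ {np.nanmean(times):.1f} "
      f"({np.isnan(times).sum()} runs hit the event cap)")
```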
How do pre-adolescent children interpret conditionals?
Markovits, Henry; Brisson, Janie; de Chantal, Pier-Luc
2016-12-01
Studies examining children's basic understanding of conditionals have led to very different conclusions. On the one hand, conditional inference tasks suggest that young children are able to interpret familiar conditionals in a complex manner. In contrast, truth-table tasks suggest that before adolescence, children have limited (conjunctive) representations of conditionals. We hypothesized that the latter results are due to use of what are essentially arbitrary conditionals. To examine this, we gave a truth-table task using two kinds of conditional rules, Arbitrary and Imaginary categorical rules (If an animal is a bori, then it has red wings) to 9- and 12-year-olds. Results with the Arbitrary rules were consistent with those found in previous studies, with the most frequent interpretation being the Conjunctive one. However, among even the youngest children, the most frequent interpretation of the Imaginary categorical rules was the defective conditional, which is only found with much older adolescents with Arbitrary rules. These results suggest that working memory limitations are not an important developmental factor in how young children interpret conditional rules.
The Probability Approach to English If-Conditional Sentences
ERIC Educational Resources Information Center
Wu, Mei
2012-01-01
Users of the Probability Approach choose the right one from four basic types of conditional sentences--factual, predictive, hypothetical and counterfactual conditionals, by judging how likely (i.e. the probability) the event in the result-clause will take place when the condition in the if-clause is met. Thirty-three students from the experimental…
Inferential revision in narrative texts: An ERP study.
Pérez, Ana; Cain, Kate; Castellanos, María C; Bajo, Teresa
2015-11-01
We evaluated the process of inferential revision during text comprehension in adults. Participants with high or low working memory read short texts, in which the introduction supported two plausible concepts (e.g., 'guitar/violin'), although one was more probable ('guitar'). There were three possible continuations: a neutral sentence, which did not refer back to either concept; a no-revise sentence, which referred to a general property consistent with either concept (e.g., '…beautiful curved body'); and a revise sentence, which referred to a property that was consistent with only the less likely concept (e.g., '…matching bow'). Readers took longer to read the sentence in the revise condition, indicating that they were able to evaluate their comprehension and detect a mismatch. In a final sentence, a target noun referred to the alternative concept supported in the revise condition (e.g., 'violin'). ERPs indicated that both working memory groups were able to evaluate their comprehension of the text (P3a), but only high working memory readers were able to revise their initial incorrect interpretation (P3b) and integrate the new information (N400) when reading the revise sentence. Low working memory readers had difficulties inhibiting the no-longer-relevant interpretation and thus failed to revise their situation model, and they experienced problems integrating semantically related information into an accurate memory representation.
PROBABILITY SURVEYS, CONDITIONAL PROBABILITIES AND ECOLOGICAL RISK ASSESSMENT
We show that probability-based environmental resource monitoring programs, such as the U.S. Environmental Protection Agency's (U.S. EPA) Environmental Monitoring and Assessment Program, and conditional probability analysis can serve as a basis for estimating ecological risk over ...
PROBABILITY SURVEYS, CONDITIONAL PROBABILITIES, AND ECOLOGICAL RISK ASSESSMENT
We show that probability-based environmental resource monitoring programs, such as the U.S. Environmental Protection Agency's (U.S. EPA) Environmental Monitoring and Assessment Program (EMAP) can be analyzed with a conditional probability analysis (CPA) to conduct quantitative probabi...
Secondary ion formation during electronic and nuclear sputtering of germanium
NASA Astrophysics Data System (ADS)
Breuer, L.; Ernst, P.; Herder, M.; Meinerzhagen, F.; Bender, M.; Severin, D.; Wucher, A.
2018-06-01
Using a time-of-flight mass spectrometer attached to the UNILAC beamline located at the GSI Helmholtz Centre for Heavy Ion Research, we investigate the formation of secondary ions sputtered from a germanium surface under irradiation by swift heavy ions (SHI) such as 5 MeV/u Au by simultaneously recording the mass spectra of the ejected secondary ions and their neutral counterparts. In these experiments, the sputtered neutral material is post-ionized via single photon absorption from a pulsed, intensive VUV laser. After post-ionization, the instrument cannot distinguish between secondary ions and post-ionized neutrals, so that both signals can be directly compared in order to investigate the ionization probability of different sputtered species. In order to facilitate an in-situ comparison with typical nuclear sputtering conditions, the system is also equipped with a conventional rare gas ion source delivering a 5 keV argon ion beam. For a dynamically sputter cleaned surface, it is found that the ionization probability of Ge atoms and Gen clusters ejected under electronic sputtering conditions is by more than an order of magnitude higher than that measured for keV sputtered particles. In addition, the mass spectra obtained under SHI irradiation show prominent signals of GenOm clusters, which are predominantly detected as positive or negative secondary ions. From the m-distribution for a given Ge nuclearity n, one can deduce that the sputtered material must originate from a germanium oxide matrix with approximate GeO stoichiometry, probably due to residual native oxide patches even at the dynamically cleaned surface. The results clearly demonstrate a fundamental difference between the ejection and ionization mechanisms in both cases, which is interpreted in terms of corresponding model calculations.
Exposure history determines pteropod vulnerability to ocean acidification along the US West Coast.
Bednaršek, N; Feely, R A; Tolimieri, N; Hermann, A J; Siedlecki, S A; Waldbusser, G G; McElhany, P; Alin, S R; Klinger, T; Moore-Maley, B; Pörtner, H O
2017-07-03
The pteropod Limacina helicina frequently experiences seasonal exposure to corrosive conditions (Ωar < 1) along the US West Coast and is recognized as one of the species most susceptible to ocean acidification (OA). Yet, little is known about their capacity to acclimatize to such conditions. We collected pteropods in the California Current Ecosystem (CCE) that differed in the severity of exposure to Ωar conditions in the natural environment. Combining field observations, high-CO2 perturbation experiment results, and retrospective ocean transport simulations, we investigated biological responses based on histories of magnitude and duration of exposure to Ωar < 1. Our results suggest that both exposure magnitude and duration affect pteropod responses in the natural environment. However, observed declines in calcification performance and survival probability under high-CO2 experimental conditions do not show acclimatization capacity or physiological tolerance related to history of exposure to corrosive conditions. Pteropods from the coastal CCE appear to be at or near the limit of their physiological capacity, and consequently, are already at extinction risk under projected acceleration of OA over the next 30 years. Our results demonstrate that Ωar exposure history largely determines pteropod response to experimental conditions and is essential to the interpretation of biological observations and experimental results.
Probability Surveys, Conditional Probability, and Ecological Risk Assessment
We show that probability-based environmental resource monitoring programs, such as the U.S. Environmental Protection Agency’s (U.S. EPA) Environmental Monitoring and Assessment Program, and conditional probability analysis can serve as a basis for estimating ecological risk over ...
How People Interpret Conditionals: Shifts toward the Conditional Event
ERIC Educational Resources Information Center
Fugard, Andrew J. B.; Pfeifer, Niki; Mayerhofer, Bastian; Kleiter, Gernot D.
2011-01-01
We investigated how people interpret conditionals and how stable their interpretation is over a long series of trials. Participants were shown the colored patterns on each side of a 6-sided die and were asked how sure they were that a conditional holds of the side landing upward when the die is randomly thrown. Participants were presented with 71…
Selective Attention in Pigeon Temporal Discrimination.
Subramaniam, Shrinidhi; Kyonka, Elizabeth
2017-07-27
Cues can vary in how informative they are about when specific outcomes, such as food availability, will occur. This study was an experimental investigation of the functional relation between cue informativeness and temporal discrimination in a peak-interval (PI) procedure. Each session consisted of fixed-interval (FI) 2-s and 4-s schedules of food and occasional, 12-s PI trials during which pecks had no programmed consequences. Across conditions, the phi (ϕ) correlation between key light color and FI schedule value was manipulated. Red and green key lights signaled the onset of either or both FI schedules. Different colors were either predictive (ϕ = 1), moderately predictive (ϕ = 0.2-0.8), or not predictive (ϕ = 0) of a specific FI schedule. This study tested the hypothesis that temporal discrimination is a function of the momentary conditional probability of food; that is, pigeons peck the most at either 2 s or 4 s when ϕ = 1 and peck at both intervals when ϕ < 1. Response distributions were bimodal Gaussian curves; distributions from red- and green-key PI trials converged when ϕ ≤ 0.6. Peak times estimated by summed Gaussian functions, averaged across conditions and pigeons, were 1.85 s and 3.87 s, however, pigeons did not always maximize the momentary probability of food. When key light color was highly correlated with FI schedules (ϕ ≥ 0.6), estimates of peak times indicated that temporal discrimination accuracy was reduced at the unlikely interval, but not the likely interval. The mechanism of this reduced temporal discrimination accuracy could be interpreted as an attentional process.
Application of a linked stress release model in Corinth Gulf and Central Ionian Islands (Greece)
NASA Astrophysics Data System (ADS)
Mangira, Ourania; Vasiliadis, Georgios; Papadimitriou, Eleftheria
2017-06-01
Spatio-temporal stress changes and interactions between adjacent fault segments constitute the most important component of seismic hazard assessment, as they can alter the occurrence probability of strong earthquakes on these segments. The investigation of the interactions between adjacent areas by means of the linked stress release model is attempted for moderate earthquakes (M ≥ 5.2) in the Corinth Gulf and the Central Ionian Islands (Greece). The study areas were divided into two subareas, based on seismotectonic criteria. The seismicity of each subarea is investigated by means of a stochastic point process, and its behavior is determined by the conditional intensity function, which usually takes an exponential form. A conditional intensity function of Weibull form is used for identifying the most appropriate among the models (simple, independent, and linked stress release models) for the interpretation of the earthquake generation process. The appropriateness of the models was decided after evaluation via the Akaike information criterion. Although the curves of the conditional intensity functions exhibit similar behavior, the exponential-type conditional intensity function seems to fit the data better.
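For orientation, a minimal sketch of the single-region stress release model's conditional intensity in its classic exponential form, lambda(t) = exp[mu + nu*(rho*t - sum of past stress drops)]; the paper evaluates a Weibull-form intensity and a linked, multi-region variant, and the catalog and parameter values below are invented.

```python
import numpy as np

# Illustrative parameters of the conditional intensity (not fitted values).
mu, nu, rho = -2.0, 0.8, 0.05

event_times = np.array([3.1, 11.4, 19.8, 27.0])        # years since catalog start
event_mags  = np.array([5.4, 5.2, 5.8, 5.3])
stress_drop = 10.0 ** (0.75 * (event_mags - 5.2))      # common Benioff-type scaling

def conditional_intensity(t):
    released = stress_drop[event_times < t].sum()
    return np.exp(mu + nu * (rho * t - released))       # drops after each event

for t in (10.0, 20.0, 30.0):
    print(f"lambda({t:.0f} yr) = {conditional_intensity(t):.4f} events/yr")
```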
Pretest probability estimation in the evaluation of patients with possible deep vein thrombosis.
Vinson, David R; Patel, Jason P; Irving, Cedric S
2011-07-01
An estimation of pretest probability is integral to the proper interpretation of a negative compression ultrasound in the diagnostic assessment of lower-extremity deep vein thrombosis. We sought to determine the rate, method, and predictors of pretest probability estimation in such patients. This cross-sectional study of outpatients was conducted in a suburban community hospital in 2006. Estimation of pretest probability was done by enzyme-linked immunosorbent assay d-dimer, Wells criteria, and unstructured clinical impression. Using logistic regression analysis, we measured predictors of documented risk assessment. A cohort analysis was undertaken to compare 3-month thromboembolic outcomes between risk groups. Among 524 cases, 289 (55.2%) underwent pretest probability estimation using the following methods: enzyme-linked immunosorbent assay d-dimer (228; 43.5%), clinical impression (106; 20.2%), and Wells criteria (24; 4.6%), with 69 (13.2%) patients undergoing a combination of at least two methods. Patient factors were not predictive of pretest probability estimation, but the specialty of the clinician was predictive; emergency physicians (P < .0001) and specialty clinicians (P = .001) were less likely than primary care clinicians to perform risk assessment. Thromboembolic events within 3 months were experienced by 0 of 52 patients in the explicitly low-risk group, 4 (1.8%) of 219 in the explicitly moderate- to high-risk group, and 1 (0.4%) of 226 in the group that did not undergo explicit risk assessment. Negative ultrasounds in the workup of deep vein thrombosis are commonly interpreted in isolation apart from pretest probability estimations. Risk assessments varied by physician specialties. Opportunities exist for improvement in the diagnostic evaluation of these patients. Copyright © 2011 Elsevier Inc. All rights reserved.
Modeling the effect of reward amount on probability discounting.
Myerson, Joel; Green, Leonard; Morris, Joshua
2011-03-01
The present study with college students examined the effect of amount on the discounting of probabilistic monetary rewards. A hyperboloid function accurately described the discounting of hypothetical rewards ranging in amount from $20 to $10,000,000. The degree of discounting increased continuously with amount of probabilistic reward. This effect of amount was not due to changes in the rate parameter of the discounting function, but rather was due to increases in the exponent. These results stand in contrast to those observed with the discounting of delayed monetary rewards, in which the degree of discounting decreases with reward amount due to amount-dependent decreases in the rate parameter. Taken together, this pattern of results suggests that delay and probability discounting reflect different underlying mechanisms. That is, the fact that the exponent in the delay discounting function is independent of amount is consistent with a psychophysical scaling interpretation, whereas the finding that the exponent of the probability-discounting function is amount-dependent is inconsistent with such an interpretation. Instead, the present results are consistent with the idea that the probability-discounting function is itself the product of a value function and a weighting function. This idea was first suggested by Kahneman and Tversky (1979), although their prospect theory does not predict amount effects like those observed. The effect of amount on probability discounting was parsimoniously incorporated into our hyperboloid discounting function by assuming that the exponent was proportional to the amount raised to a power. The amount-dependent exponent of the probability-discounting function may be viewed as reflecting the effect of amount on the weighting of the probability with which the reward will be received.
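A short sketch of the hyperboloid probability-discounting form described above, V = A/(1 + h*theta)**s with theta = (1 - p)/p the odds against receiving the reward, and with the exponent s assumed proportional to a power of the amount so that larger probabilistic rewards are discounted more steeply. All parameter values are illustrative, not the fitted estimates from the study.

```python
def subjective_value(amount, p, h=1.0, c=0.5, power=0.1):
    """Hyperboloid probability discounting with an amount-dependent exponent."""
    theta = (1.0 - p) / p              # odds against receiving the reward
    s = c * amount ** power            # exponent grows with reward amount (assumed form)
    return amount / (1.0 + h * theta) ** s

for amount in (20, 10_000, 10_000_000):
    v = subjective_value(amount, p=0.5)
    print(f"${amount:>10,}: discounted value = {v:,.2f} "
          f"({v / amount:.2%} of nominal)")
```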
Quantum probabilities as Dempster-Shafer probabilities in the lattice of subspaces
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vourdas, A.
2014-08-15
The orthocomplemented modular lattice of subspaces L[H(d)], of a quantum system with d-dimensional Hilbert space H(d), is considered. A generalized additivity relation which holds for Kolmogorov probabilities is violated by quantum probabilities in the full lattice L[H(d)] (it is only valid within the Boolean subalgebras of L[H(d)]). This suggests the use of more general (than Kolmogorov) probability theories, and here the Dempster-Shafer probability theory is adopted. An operator D(H_1, H_2), which quantifies deviations from Kolmogorov probability theory, is introduced, and it is shown to be intimately related to the commutator of the projectors P(H_1), P(H_2) to the subspaces H_1, H_2. As an application, it is shown that the proof of the inequalities of Clauser, Horne, Shimony, and Holt for a system of two spin-1/2 particles is valid for Kolmogorov probabilities, but it is not valid for Dempster-Shafer probabilities. The violation of these inequalities in experiments supports the interpretation of quantum probabilities as Dempster-Shafer probabilities.
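The additivity failure mentioned above can be checked numerically in a toy example. The sketch below is my own illustration, not the paper's operator D(H_1, H_2): it takes two generic one-dimensional subspaces of a three-dimensional Hilbert space and a random pure state, and compares p(H1 v H2) + p(H1 ^ H2) with p(H1) + p(H2), which would be equal in a Boolean (Kolmogorov) setting; the projector commutator is printed alongside.

import numpy as np

rng = np.random.default_rng(0)

def projector(vectors):
    # Orthogonal projector onto the span of the columns of `vectors`.
    q, _ = np.linalg.qr(vectors)
    return q @ q.conj().T

d = 3
v1 = rng.normal(size=(d, 1)) + 1j * rng.normal(size=(d, 1))
v2 = rng.normal(size=(d, 1)) + 1j * rng.normal(size=(d, 1))
P1, P2 = projector(v1), projector(v2)          # projectors onto H1, H2
P_join = projector(np.hstack([v1, v2]))        # H1 v H2 = span(H1 u H2)
P_meet = np.zeros_like(P1)                     # H1 ^ H2 = {0} for generic 1-d subspaces

psi = rng.normal(size=d) + 1j * rng.normal(size=d)
psi /= np.linalg.norm(psi)
prob = lambda P: float(np.real(psi.conj() @ P @ psi))

print("p(H1 v H2) + p(H1 ^ H2) =", round(prob(P_join) + prob(P_meet), 4))
print("p(H1) + p(H2)           =", round(prob(P1) + prob(P2), 4))
print("||[P1, P2]||            =", round(np.linalg.norm(P1 @ P2 - P2 @ P1), 4))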
The estimation of tree posterior probabilities using conditional clade probability distributions.
Larget, Bret
2013-07-01
In this article I introduce the idea of conditional independence of separated subtrees as a principle by which to estimate the posterior probability of trees using conditional clade probability distributions rather than simple sample relative frequencies. I describe an algorithm for these calculations and software which implements these ideas. I show that these alternative calculations are very similar to simple sample relative frequencies for high probability trees but are substantially more accurate for relatively low probability trees. The method allows the posterior probability of unsampled trees to be calculated when these trees contain only clades that are in other sampled trees. Furthermore, the method can be used to estimate the total probability of the set of sampled trees which provides a measure of the thoroughness of a posterior sample.
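A minimal sketch of the conditional clade idea, with a hypothetical six-taxon posterior sample (this is not the author's software, and the tree encoding is deliberately simplified): split probabilities are estimated conditional on each parent clade, and a tree's posterior probability is the product of its conditional split probabilities, so a tree never sampled but composed of sampled clades still receives a positive estimate.

from collections import Counter

def splits(tree):
    # Return (parent_clade, {child_clade_1, child_clade_2}) pairs for a rooted tree
    # written as nested 2-tuples of taxon names.
    out = []
    def clade(node):
        if isinstance(node, str):
            return frozenset([node])
        left, right = clade(node[0]), clade(node[1])
        out.append((left | right, frozenset([left, right])))
        return left | right
    clade(tree)
    return out

# Hypothetical MCMC sample: 60 copies of T1 and 40 copies of T2.
T1 = ((("A", "B"), "C"), (("D", "E"), "F"))
T2 = (("A", ("B", "C")), ("D", ("E", "F")))
sample = [T1] * 60 + [T2] * 40

parent_counts, split_counts = Counter(), Counter()
for t in sample:
    for parent, children in splits(t):
        parent_counts[parent] += 1
        split_counts[(parent, children)] += 1

def ccd_probability(tree):
    p = 1.0
    for parent, children in splits(tree):
        if parent_counts[parent] == 0:
            return 0.0                        # contains a clade never observed in the sample
        p *= split_counts[(parent, children)] / parent_counts[parent]
    return p

# T3 recombines clades seen in T1 and T2; its sample relative frequency is 0,
# but the conditional clade estimate is 0.6 * 0.4 = 0.24.
T3 = ((("A", "B"), "C"), ("D", ("E", "F")))
for name, t in [("T1", T1), ("T2", T2), ("T3 (unsampled)", T3)]:
    print(name, "relative frequency:", sample.count(t) / len(sample),
          " conditional clade estimate:", round(ccd_probability(t), 3))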
Cowley, Laura E; Maguire, Sabine; Farewell, Daniel M; Quinn-Scoggins, Harriet D; Flynn, Matthew O; Kemp, Alison M
2018-05-09
The validated Predicting Abusive Head Trauma (PredAHT) tool estimates the probability of abusive head trauma (AHT) based on combinations of six clinical features: head/neck bruising; apnea; seizures; rib fractures; long-bone fractures; retinal hemorrhages. We aimed to determine the acceptability of PredAHT to child protection professionals. We conducted qualitative semi-structured interviews with 56 participants: clinicians (25), child protection social workers (10), legal practitioners (9, including 4 judges), police officers (8), and pathologists (4), purposively sampled across southwest United Kingdom. Interviews were recorded, transcribed and imported into NVivo for thematic analysis (38% double-coded). We explored participants' evaluations of PredAHT, their opinions about the optimal way to present the calculated probabilities, and their interpretation of probabilities in the context of suspected AHT. Clinicians, child protection social workers and police thought PredAHT would be beneficial as an objective adjunct to their professional judgment, to give them greater confidence in their decisions. Lawyers and pathologists appreciated its value for prompting multidisciplinary investigations, but were uncertain of its usefulness in court. Perceived disadvantages included: possible over-reliance and false reassurance from a low score. Interpretations regarding which percentages equate to 'low', 'medium' or 'high' likelihood of AHT varied; participants preferred a precise % probability over these general terms. Participants would use PredAHT with provisos: if they received multi-agency training to define accepted risk thresholds for consistent interpretation; with knowledge of its development; if it was accepted by colleagues. PredAHT may therefore increase professionals' confidence in their decision-making when investigating suspected AHT, but may be of less value in court. Copyright © 2018 Elsevier Ltd. All rights reserved.
Particles, Waves, and the Interpretation of Quantum Mechanics
ERIC Educational Resources Information Center
Christoudouleas, N. D.
1975-01-01
Presents an explanation, without mathematical equations, of the basic principles of quantum mechanics. Includes wave-particle duality, the probability character of the wavefunction, and the uncertainty relations. (MLH)
Dynamic stress changes during earthquake rupture
Day, S.M.; Yu, G.; Wald, D.J.
1998-01-01
We assess two competing dynamic interpretations that have been proposed for the short slip durations characteristic of kinematic earthquake models derived by inversion of earthquake waveform and geodetic data. The first interpretation would require a fault constitutive relationship in which rapid dynamic restrengthening of the fault surface occurs after passage of the rupture front, a hypothesized mechanical behavior that has been referred to as "self-healing." The second interpretation would require sufficient spatial heterogeneity of stress drop to permit rapid equilibration of elastic stresses with the residual dynamic friction level, a condition we refer to as "geometrical constraint." These interpretations imply contrasting predictions for the time dependence of the fault-plane shear stresses. We compare these predictions with dynamic shear stress changes for the 1992 Landers (M 7.3), 1994 Northridge (M 6.7), and 1995 Kobe (M 6.9) earthquakes. Stress changes are computed from kinematic slip models of these earthquakes, using a finite-difference method. For each event, static stress drop is highly variable spatially, with high stress-drop patches embedded in a background of low, and largely negative, stress drop. The time histories of stress change show predominantly monotonic stress change after passage of the rupture front, settling to a residual level, without significant evidence for dynamic restrengthening. The stress change at the rupture front is usually gradual rather than abrupt, probably reflecting the limited resolution inherent in the underlying kinematic inversions. On the basis of this analysis, as well as recent similar results obtained independently for the Kobe and Morgan Hill earthquakes, we conclude that, at the present time, the self-healing hypothesis is unnecessary to explain earthquake kinematics.
An empirical probability model of detecting species at low densities.
Delaney, David G; Leung, Brian
2010-06-01
False negatives, not detecting things that are actually present, are an important but understudied problem. False negatives are the result of our inability to perfectly detect species, especially those at low density such as endangered species or newly arriving introduced species. They reduce our ability to interpret presence-absence survey data and make sound management decisions (e.g., rapid response). To reduce the probability of false negatives, we need to compare the efficacy and sensitivity of different sampling approaches and quantify an unbiased estimate of the probability of detection. We conducted field experiments in the intertidal zone of New England and New York to test the sensitivity of two sampling approaches (quadrat vs. total area search, TAS), given different target characteristics (mobile vs. sessile). Using logistic regression we built detection curves for each sampling approach that related the sampling intensity and the density of targets to the probability of detection. The TAS approach reduced the probability of false negatives and detected targets faster than the quadrat approach. Mobility of targets increased the time to detection but did not affect detection success. Finally, we interpreted two years of presence-absence data on the distribution of the Asian shore crab (Hemigrapsus sanguineus) in New England and New York, using our probability model for false negatives. The type of experimental approach in this paper can help to reduce false negatives and increase our ability to detect species at low densities by refining sampling approaches, which can guide conservation strategies and management decisions in various areas of ecology such as conservation biology and invasion ecology.
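The detection-curve step lends itself to a compact illustration. The sketch below (simulated data with made-up coefficients, not the authors' field data) fits a logistic regression of detection on sampling effort and target density and reads off the detection probability, whose complement is the expected false-negative rate.

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)

# Simulated surveys: detection becomes more likely with search effort (minutes)
# and with target density (individuals per square metre); coefficients are hypothetical.
n = 500
effort = rng.uniform(5, 60, n)
density = rng.uniform(0.01, 1.0, n)
logit = -4.0 + 0.05 * effort + 6.0 * density
detected = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

model = LogisticRegression().fit(np.column_stack([effort, density]), detected)

# Detection curve at a fixed low density across a range of sampling effort.
grid = np.column_stack([np.linspace(5, 60, 6), np.full(6, 0.05)])
for eff, p in zip(grid[:, 0], model.predict_proba(grid)[:, 1]):
    print(f"effort {eff:5.1f} min at density 0.05: P(detect) = {p:.2f}, "
          f"P(false negative) = {1 - p:.2f}")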
Direct measurements of the atmospheric conduction current
NASA Technical Reports Server (NTRS)
Burke, H. K.; Few, A. A.
1978-01-01
A method of measuring the atmospheric conduction current above the ground has been employed to obtain data for 12 weeks during the first half of 1974. The instrument consists of a split aluminum sphere suspended by insulated wires from a wooden frame. The measuring electronics and the transmitter are enclosed within the spherical structure. The interaction of the instrument with its atmospheric electrical environment is analyzed, and it is shown that in steady state conditions, predictable differences between the instrumentally measured currents and the atmospheric conduction current will be less than 5%, and in nonsteady state situations the difference is less than 20%. Diurnal variations, a probable winter-summer variation, sunrise, and fog effects were observed for the data obtained during fair-weather conditions. Disturbed weather data are interpreted for the effects of low clouds on the atmospheric current. The charge concentrations within overcast clouds sufficient to produce the observed reversed atmospheric currents are estimated to be small in relation to values in thunderclouds.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stevanovic, Vladan; Jones, Eric
With few systems of technological interest having been studied as extensively as elemental silicon, there currently exists a wide disparity between the number of predicted low-energy silicon polymorphs and those that have been experimentally realized as metastable at ambient conditions. We put forward an explanation for this disparity wherein the likelihood of formation of a given polymorph under near-equilibrium conditions can be estimated on the basis of mean-field isothermal-isobaric (N,p,T) ensemble statistics. The probability that a polymorph will be experimentally realized is shown to depend upon both the hypervolume of that structure's potential energy basin of attraction and a Boltzmann factor weight containing the polymorph's potential enthalpy per particle. Both attributes are calculated using density functional theory relaxations of randomly generated initial structures. We find that the metastable polymorphism displayed by silicon can be accounted for using this framework to the exclusion of a very large number of other low-energy structures.
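The stated recipe reduces to a weighted Boltzmann sum. The sketch below uses hypothetical basin-of-attraction fractions and enthalpies (not the paper's DFT results) to show how the two attributes combine into a formation probability for each polymorph.

import numpy as np

k_B = 8.617e-5                       # Boltzmann constant, eV/K
T = 300.0                            # near-ambient temperature, K (illustrative)

names = ["ground state", "polymorph A", "polymorph B", "polymorph C"]
basin_fraction = np.array([0.40, 0.25, 0.20, 0.15])   # hypothetical basin hypervolume fractions
delta_H = np.array([0.00, 0.05, 0.10, 0.30])          # hypothetical enthalpies per particle, eV

weights = basin_fraction * np.exp(-delta_H / (k_B * T))
p_realize = weights / weights.sum()
for name, p in zip(names, p_realize):
    print(f"{name:>13}: estimated formation probability {p:.3f}")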
An Investigation and Interpretation of Selected Topics in Uncertainty Reasoning
1989-12-01
Characterizing secondary uncertainty as spurious evidence and including it in the inference process, it was shown that probability ratio graphs are a ... in the inference process has great impact on the computational complexity of an inference process. ... Systems," he outlines a five-step process that incorporates Bayesian reasoning in the development of the expert system rule base: 1. A group of ...
Quantifying Treatment Benefit in Molecular Subgroups to Assess a Predictive Biomarker.
Iasonos, Alexia; Chapman, Paul B; Satagopan, Jaya M
2016-05-01
An increased interest has been expressed in finding predictive biomarkers that can guide treatment options for both mutation carriers and noncarriers. The statistical assessment of variation in treatment benefit (TB) according to the biomarker carrier status plays an important role in evaluating predictive biomarkers. For time-to-event endpoints, the hazard ratio (HR) for interaction between treatment and a biomarker from a proportional hazards regression model is commonly used as a measure of variation in TB. Although this can be easily obtained using available statistical software packages, the interpretation of HR is not straightforward. In this article, we propose different summary measures of variation in TB on the scale of survival probabilities for evaluating a predictive biomarker. The proposed summary measures can be easily interpreted as quantifying differential in TB in terms of relative risk or excess absolute risk due to treatment in carriers versus noncarriers. We illustrate the use and interpretation of the proposed measures with data from completed clinical trials. We encourage clinical practitioners to interpret variation in TB in terms of measures based on survival probabilities, particularly in terms of excess absolute risk, as opposed to HR. Clin Cancer Res; 22(9); 2114-20. ©2016 AACR. ©2016 American Association for Cancer Research.
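To make the recommended scale concrete, the short sketch below takes hypothetical survival probabilities at a fixed time point (not data from the cited trials) and reports treatment benefit in carriers and noncarriers as excess absolute risk and relative risk, together with the between-subgroup difference in absolute benefit.

# Hypothetical survival probabilities at a fixed landmark time.
surv = {
    ("carrier", "treatment"): 0.78, ("carrier", "control"): 0.55,
    ("noncarrier", "treatment"): 0.62, ("noncarrier", "control"): 0.58,
}

benefit = {}
for group in ("carrier", "noncarrier"):
    risk_trt = 1 - surv[(group, "treatment")]
    risk_ctl = 1 - surv[(group, "control")]
    benefit[group] = risk_ctl - risk_trt          # excess absolute risk removed by treatment
    print(f"{group:>10}: excess absolute benefit {benefit[group]:+.2f}, "
          f"relative risk {risk_trt / risk_ctl:.2f}")

print(f"difference in absolute benefit, carriers minus noncarriers: "
      f"{benefit['carrier'] - benefit['noncarrier']:+.2f}")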
Hawkins, Kirsten A; Cougle, Jesse R
2013-09-01
Research suggests that individuals high in anger have a bias for attributing hostile intentions to ambiguous situations. The current study tested whether this interpretation bias can be altered to influence anger reactivity to an interpersonal insult using a single-session cognitive bias modification program. One hundred thirty-five undergraduate students were randomized to receive positive training, negative training, or a control condition. Anger reactivity to insult was then assessed. Positive training led to significantly greater increases in positive interpretation bias relative to the negative group, though these increases were only marginally greater than the control group. Negative training led to increased negative interpretation bias relative to other groups. During the insult, participants in the positive condition reported less anger than those in the control condition. Observers rated participants in the positive condition as less irritated than those in the negative condition and more amused than the other two conditions. Though mediation of effects via bias modification was not demonstrated, among the positive condition posttraining interpretation bias was correlated with self-reported anger, suggesting that positive training reduced anger reactivity by influencing interpretation biases. Findings suggest that positive interpretation training may be a promising treatment for reducing anger. However, the current study was conducted with a non-treatment-seeking student sample; further research with a treatment-seeking sample with problematic anger is necessary. Copyright © 2013. Published by Elsevier Ltd.
Zhong, Qing; Chen, Qi-Yue; Li, Ping; Xie, Jian-Wei; Wang, Jia-Bin; Lin, Jian-Xian; Lu, Jun; Cao, Long-Long; Lin, Mi; Tu, Ru-Hong; Zheng, Chao-Hui; Huang, Chang-Ming
2018-04-20
The dynamic prognosis of patients who have undergone curative surgery for gastric cancer has yet to be reported. Our objective was to devise an accurate tool for predicting the conditional probability of survival for these patients. We analyzed 11,551 gastric cancer patients from the Surveillance, Epidemiology, and End Results database. Two-thirds of the patients were selected randomly for the development set and one-third for the validation set. Two nomograms were constructed to predict the conditional probability of overall survival and the conditional probability of disease-specific survival, using conditional survival methods. We then applied these nomograms to the 4,001 patients in the database from Fujian Medical University Union Hospital, Fuzhou, China, one of the most active Chinese institutes. The 5-year conditional probability of overall survival of the patients was 41.6% immediately after resection and increased to 52.8%, 68.2%, and 80.4% at 1, 2, and 3 years after gastrectomy. The 5-year conditional probability of disease-specific survival "increased" from 48.9% at the time of gastrectomy to 59.8%, 74.7%, and 85.5% for patients surviving 1, 2, and 3 years, respectively. Sex; race; age; depth of tumor invasion; lymph node metastasis; and tumor size, site, and grade were associated with overall survival and disease-specific survival (P <.05). Within the Surveillance, Epidemiology, and End Results validation set, the accuracy of the conditional probability of overall survival nomogram was 0.77, 0.81, 0.82, and 0.82 at 1, 3, 5, and 10 years after gastrectomy, respectively. Within the other validation set from the Fujian Medical University Union Hospital (n = 4,001), the accuracy of the conditional probability of overall survival nomogram was 0.76, 0.79, 0.77, and 0.77 at 1, 3, 5, and 10 years, respectively. The accuracy of the conditional probability of disease-specific survival model was also favorable. The calibration curve demonstrated good agreement between the predicted and observed survival rates. Based on the large Eastern and Western data sets, we developed and validated the first conditional nomogram for prediction of conditional probability of survival for patients with gastric cancer to allow consideration of the duration of survivorship. Copyright © 2018 Elsevier Inc. All rights reserved.
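The conditional survival quantity behind these numbers is simply S(s + t)/S(s). The sketch below applies it to a hypothetical survival curve (not the Surveillance, Epidemiology, and End Results data) to show why the 5-year outlook improves the longer a patient has already survived.

# Hypothetical overall survival probabilities by year after surgery.
S = {0: 1.00, 1: 0.78, 2: 0.64, 3: 0.55, 4: 0.49, 5: 0.44, 6: 0.40, 7: 0.37, 8: 0.35}

def conditional_survival(s, t, surv=S):
    # P(alive at s + t years | alive at s years) = S(s + t) / S(s)
    return surv[s + t] / surv[s]

for s in (0, 1, 2, 3):
    print(f"already survived {s} year(s): "
          f"P(surviving 5 more years) = {conditional_survival(s, 5):.1%}")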
Horton, G.E.; Letcher, B.H.
2008-01-01
The inability to account for the availability of individuals in the study area during capture-mark-recapture (CMR) studies and the resultant confounding of parameter estimates can make correct interpretation of CMR model parameter estimates difficult. Although important advances based on the Cormack-Jolly-Seber (CJS) model have resulted in estimators of true survival that work by unconfounding either death or recapture probability from availability for capture in the study area, these methods rely on the researcher's ability to select a method that is correctly matched to emigration patterns in the population. If incorrect assumptions regarding site fidelity (non-movement) are made, it may be difficult or impossible as well as costly to change the study design once the incorrect assumption is discovered. Subtleties in characteristics of movement (e.g. life history-dependent emigration, nomads vs territory holders) can lead to mixtures in the probability of being available for capture among members of the same population. The result of these mixtures may be only a partial unconfounding of emigration from other CMR model parameters. Biologically-based differences in individual movement can combine with constraints on study design to further complicate the problem. Because of the intricacies of movement and its interaction with other parameters in CMR models, quantification of and solutions to these problems are needed. Based on our work with stream-dwelling populations of Atlantic salmon Salmo salar, we used a simulation approach to evaluate existing CMR models under various mixtures of movement probabilities. The Barker joint data model provided unbiased estimates of true survival under all conditions tested. The CJS and robust design models provided similarly unbiased estimates of true survival but only when emigration information could be incorporated directly into individual encounter histories. For the robust design model, Markovian emigration (future availability for capture depends on an individual's current location) was a difficult emigration pattern to detect unless survival and especially recapture probability were high. Additionally, when local movement was high relative to study area boundaries and movement became more diffuse (e.g. a random walk), local movement and permanent emigration were difficult to distinguish and had consequences for correctly interpreting the survival parameter being estimated (apparent survival vs true survival). © 2008 The Authors.
Renal angina: concept and development of pretest probability assessment in acute kidney injury.
Chawla, Lakhmir S; Goldstein, Stuart L; Kellum, John A; Ronco, Claudio
2015-02-27
The context of a diagnostic test is a critical component for the interpretation of its result. This context defines the pretest probability of the diagnosis and forms the basis for the interpretation and value of adding the diagnostic test. In the field of acute kidney injury, a multitude of early diagnostic biomarkers have been developed, but utilization in the appropriate context is less well understood and has not been codified until recently. In order to better operationalize the context and pretest probability assessment for acute kidney injury diagnosis, the renal angina concept was proposed in 2010 for use in both children and adults. Renal angina has been assessed in approximately 1,000 subjects. However, renal angina as a concept is still unfamiliar to most clinicians and the rationale for introducing the term is not obvious. We therefore review the concept and development of renal angina, and the currently available data validating it. We discuss the various arguments for and against this construct. Future research testing the performance of renal angina with acute kidney injury biomarkers is warranted.
NASA Technical Reports Server (NTRS)
Galvan, Jose Ramon; Saxena, Abhinav; Goebel, Kai Frank
2012-01-01
This article discusses several aspects of uncertainty representation and management for model-based prognostics methodologies based on our experience with Kalman Filters when applied to prognostics for electronics components. In particular, it explores the implications of modeling remaining useful life prediction as a stochastic process, and how it relates to uncertainty representation, management and the role of prognostics in decision-making. A distinction between two interpretations of the estimated remaining useful life probability density function is explained, and a cautionary argument is provided against mixing the two interpretations when using prognostics to make critical decisions.
Option volatility and the acceleration Lagrangian
NASA Astrophysics Data System (ADS)
Baaquie, Belal E.; Cao, Yang
2014-01-01
This paper develops a volatility formula for an option on an asset from an acceleration Lagrangian model, and the formula is calibrated with market data. The Black-Scholes model is a simpler case that has a velocity-dependent Lagrangian. The acceleration Lagrangian is defined, and the classical solution of the system in Euclidean time is obtained by choosing proper boundary conditions. The conditional probability distribution of the final position given the initial position is obtained from the transition amplitude. The volatility is the standard deviation of the conditional probability distribution. Using the conditional probability and the path integral method, the martingale condition is applied, and one of the parameters in the Lagrangian is fixed. The call option price is obtained using the conditional probability and the path integral method.
Asymptotic expansion of pair production probability in a time-dependent electric field
NASA Astrophysics Data System (ADS)
Arai, Takashi
2015-12-01
We study particle creation in a single pulse of an electric field in scalar quantum electrodynamics. We investigate the parameter conditions under which dynamical pair creation and the Schwinger mechanism respectively dominate. Then, an asymptotic expansion for the particle distribution in terms of the time interval of the applied electric field is derived. We compare our result with particle creation in a constant electric field with a finite-time interval. These results coincide for an extremely strong field; however, they differ at general field strengths. We interpret this difference as a nonperturbative effect of high-frequency photons in external electric fields. Moreover, we find that the next-to-leading-order term in our asymptotic expansion coincides with the derivative expansion of the effective action.
Mendonça, J Ricardo G; Gevorgyan, Yeva
2017-05-01
We investigate one-dimensional elementary probabilistic cellular automata (PCA) whose dynamics in first-order mean-field approximation yields discrete logisticlike growth models for a single-species unstructured population with nonoverlapping generations. Beginning with a general six-parameter model, we find constraints on the transition probabilities of the PCA that guarantee that the ensuing approximations make sense in terms of population dynamics and classify the valid combinations thereof. Several possible models display a negative cubic term that can be interpreted as a weak Allee factor. We also investigate the conditions under which a one-parameter PCA derived from the more general six-parameter model can generate valid population growth dynamics. Numerical simulations illustrate the behavior of some of the PCA found.
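A minimal sketch of the setting (with made-up transition probabilities, not one of the classified families from the paper): a nearest-neighbour PCA on a ring is simulated directly, and its density is compared with the first-order mean-field map, which behaves like a discrete logistic growth law.

import numpy as np
from itertools import product

rng = np.random.default_rng(2)

# P(cell becomes 1 | left, centre, right); values are illustrative only.
p1 = {(0, 0, 0): 0.00, (0, 0, 1): 0.30, (1, 0, 0): 0.30, (0, 1, 0): 0.85,
      (1, 0, 1): 0.55, (1, 1, 0): 0.90, (0, 1, 1): 0.90, (1, 1, 1): 0.70}

def mean_field(x):
    # First-order mean-field map: expected density at t+1 given density x at t.
    return sum(p1[n] * x ** sum(n) * (1 - x) ** (3 - sum(n))
               for n in product((0, 1), repeat=3))

def step(cells):
    left, right = np.roll(cells, 1), np.roll(cells, -1)
    p = np.array([p1[(l, c, r)] for l, c, r in zip(left, cells, right)])
    return (rng.random(cells.size) < p).astype(int)

cells = (rng.random(10_000) < 0.05).astype(int)   # sparse initial population on a ring
x = cells.mean()
for t in range(1, 16):
    cells, x = step(cells), mean_field(x)
    if t in (1, 5, 10, 15):
        print(f"t = {t:2d}: simulated density {cells.mean():.3f}, mean-field {x:.3f}")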
Positive phase space distributions and uncertainty relations
NASA Technical Reports Server (NTRS)
Kruger, Jan
1993-01-01
In contrast to a widespread belief, Wigner's theorem allows the construction of true joint probabilities in phase space for distributions describing the object system as well as for distributions depending on the measurement apparatus. The fundamental role of Heisenberg's uncertainty relations in Schroedinger form (including correlations) is pointed out for these two possible interpretations of joint probability distributions. Hence, in order that a multivariate normal probability distribution in phase space may correspond to a Wigner distribution of a pure or a mixed state, it is necessary and sufficient that Heisenberg's uncertainty relation in Schroedinger form should be satisfied.
The effect of emotion on interpretation and logic in a conditional reasoning task.
Blanchette, Isabelle
2006-07-01
The effect of emotional content on logical reasoning is explored in three experiments. The participants completed a conditional reasoning task (If p, then q) with emotional and neutral contents. In Experiment 1, existing emotional and neutral words were used. The emotional value of initially neutral words was experimentally manipulated in Experiments 1B and 2, using classical conditioning. In all experiments, participants were less likely to provide normatively correct answers when reasoning about emotional stimuli, compared with neutral stimuli. This was true for both negative (Experiments 1B and 2) and positive contents (Experiment 2). The participants' interpretations of the conditional statements were also measured (perceived sufficiency, necessity, causality, and plausibility). The results showed the expected relationship between interpretation and reasoning. However, emotion did not affect interpretation. Emotional and neutral conditional statements were interpreted similarly. The results are discussed in light of current models of emotion and reasoning.
Bivariate normal, conditional and rectangular probabilities: A computer program with applications
NASA Technical Reports Server (NTRS)
Swaroop, R.; Brownlow, J. D.; Ashwworth, G. R.; Winter, W. R.
1980-01-01
Some results for the bivariate normal distribution analysis are presented. Computer programs for conditional normal probabilities, marginal probabilities, as well as joint probabilities for rectangular regions are given: routines for computing fractile points and distribution functions are also presented. Some examples from a closed circuit television experiment are included.
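A modern equivalent of those routines can be sketched with SciPy (the distribution parameters below are arbitrary examples, not the values from the television experiment): the rectangular probability comes from inclusion-exclusion on the bivariate CDF, and the conditional distribution uses the textbook normal conditioning formulas.

import numpy as np
from scipy.stats import multivariate_normal, norm

mu, sigma, rho = np.array([0.0, 1.0]), np.array([1.0, 2.0]), 0.6
cov = np.array([[sigma[0]**2, rho * sigma[0] * sigma[1]],
                [rho * sigma[0] * sigma[1], sigma[1]**2]])
bvn = multivariate_normal(mean=mu, cov=cov)

# P(a1 < X1 < b1, a2 < X2 < b2) by inclusion-exclusion on the joint CDF.
(a1, a2), (b1, b2) = (-1.0, 0.0), (1.0, 3.0)
p_rect = bvn.cdf([b1, b2]) - bvn.cdf([a1, b2]) - bvn.cdf([b1, a2]) + bvn.cdf([a1, a2])

# Conditional distribution X2 | X1 = x1 is normal with the usual mean and variance.
x1 = 0.5
m = mu[1] + rho * sigma[1] / sigma[0] * (x1 - mu[0])
s = sigma[1] * np.sqrt(1 - rho**2)
p_cond = norm.cdf(3.0, m, s) - norm.cdf(0.0, m, s)
q90 = norm.ppf(0.90, m, s)                     # a fractile point of the conditional law

print(f"P(rectangle) = {p_rect:.4f}")
print(f"P(0 < X2 < 3 | X1 = {x1}) = {p_cond:.4f}, conditional 90th percentile = {q90:.3f}")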
Storkel, Holly L.; Lee, Jaehoon; Cox, Casey
2016-01-01
Purpose: Noisy conditions make auditory processing difficult. This study explores whether noisy conditions influence the effects of phonotactic probability (the likelihood of occurrence of a sound sequence) and neighborhood density (phonological similarity among words) on adults' word learning. Method: Fifty-eight adults learned nonwords varying in phonotactic probability and neighborhood density in either an unfavorable (0-dB signal-to-noise ratio [SNR]) or a favorable (+8-dB SNR) listening condition. Word learning was assessed using a picture naming task by scoring the proportion of phonemes named correctly. Results: The unfavorable 0-dB SNR condition showed a significant interaction between phonotactic probability and neighborhood density in the absence of main effects. In particular, adults learned more words when phonotactic probability and neighborhood density were both low or both high. The +8-dB SNR condition did not show this interaction. These results are inconsistent with those from a prior adult word learning study conducted under quiet listening conditions that showed main effects of word characteristics. Conclusions: As the listening condition worsens, adult word learning benefits from a convergence of phonotactic probability and neighborhood density. Clinical implications are discussed for potential populations who experience difficulty with auditory perception or processing, making them more vulnerable to noise. PMID:27788276
Han, Min Kyung; Storkel, Holly L; Lee, Jaehoon; Cox, Casey
2016-11-01
Noisy conditions make auditory processing difficult. This study explores whether noisy conditions influence the effects of phonotactic probability (the likelihood of occurrence of a sound sequence) and neighborhood density (phonological similarity among words) on adults' word learning. Fifty-eight adults learned nonwords varying in phonotactic probability and neighborhood density in either an unfavorable (0-dB signal-to-noise ratio [SNR]) or a favorable (+8-dB SNR) listening condition. Word learning was assessed using a picture naming task by scoring the proportion of phonemes named correctly. The unfavorable 0-dB SNR condition showed a significant interaction between phonotactic probability and neighborhood density in the absence of main effects. In particular, adults learned more words when phonotactic probability and neighborhood density were both low or both high. The +8-dB SNR condition did not show this interaction. These results are inconsistent with those from a prior adult word learning study conducted under quiet listening conditions that showed main effects of word characteristics. As the listening condition worsens, adult word learning benefits from a convergence of phonotactic probability and neighborhood density. Clinical implications are discussed for potential populations who experience difficulty with auditory perception or processing, making them more vulnerable to noise.
Qualitative Amino Acid Analysis of Small Peptides by GC/MS.
ERIC Educational Resources Information Center
Mabbott, Gary A.
1990-01-01
Experiments designed to help undergraduate students gain experience operating instruments and interpreting gas chromatography and mass spectrometry data are presented. Experimental reagents, procedures, analysis, and probable results are discussed. (CW)
Defensive efficacy interim design: Dynamic benefit/risk ratio view using probability of success.
Tang, Zhongwen
2017-01-01
Traditional efficacy interim design is based on alpha spending, which does not have an intuitive interpretation and hence is difficult to communicate to non-statistician colleagues. The alpha-spending approach is based on efficacy alone and hence does not have the flexibility to incorporate a newly emerged safety signal. A newly emerged safety signal may nullify the originally set efficacy boundary. In contrast, the probability of success (POS) concept has an intuitive interpretation and hence can facilitate our communication with non-statistician colleagues and help to obtain health authority (HA) buy-in. The success criteria of POS are not restricted to statistical significance. Hence, POS has the capability to incorporate both efficacy and safety information. We propose to use POS and its credible interval to design the efficacy interim. In the proposed method, the efficacy boundary is adjustable to offset a newly emerged safety signal.
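For orientation, the sketch below computes two standard interim quantities on the Brownian-motion information scale: a Bayesian predictive probability of final success under a vague prior on the drift, and conditional power with the drift fixed at its interim estimate. This is a generic textbook formulation, not the specific POS-with-credible-interval procedure proposed in the paper.

from math import sqrt
from scipy.stats import norm

def interim_pos(z_interim, info_fraction, alpha=0.025):
    # Interim z-score at information fraction t; success = final z above the
    # one-sided critical value. Flat prior on the drift for the predictive version.
    t, z_crit = info_fraction, norm.ppf(1 - alpha)
    theta_hat = z_interim / sqrt(t)
    pos = norm.cdf((theta_hat - z_crit) * sqrt(t / (1 - t)))          # predictive POS
    cond_power = norm.cdf((theta_hat - z_crit) / sqrt(1 - t))         # conditional power
    return pos, cond_power

for z in (0.5, 1.5, 2.5):
    pos, cp = interim_pos(z, info_fraction=0.5)
    print(f"interim z = {z}: predictive POS = {pos:.2f}, conditional power = {cp:.2f}")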
NASA Astrophysics Data System (ADS)
Balbis, C.; Petrinovic, I. A.; Guzmán, S.
2016-11-01
We recognised and interpreted a recent pyroclastic density current (PDC) deposit at the Copahue volcano (Southern Andes), through a field survey and a sedimentological study. The relationships between the behaviour of the PDCs, the morphology of the Río Agrio valley and the eruptive dynamics were interpreted. We identified two lithofacies in the deposit that indicate variations in the eruptive dynamics: i) the opening of the conduit and the formation of a highly explosive eruption that formed a diluted PDC through the immediate collapse of the eruptive column; ii) a continued eruption which followed immediately and records the widening of the conduit, producing a dense PDC. The eruption occurred in 2000 CE, was phreatomagmatic (VEI ≤ 2), with a vesiculation level above 4000 m depth and fragmentation driven by the interaction of magma with an hydrothermal system at ca. 1500 m depth. As deduced from the comparison between the accessory lithics of this deposit and those of the 2012 CE eruption, the depth of onset of vesiculation and the fragmentation level in this volcano are constant in depth. In order to reproduce the distribution pattern of this PDC deposit and to simulate potential PDC-forming processes, we performed several computational simulations spanning "denser" to "more diluted" conditions. The latter fairly reproduces the distribution of the studied deposit and represents perhaps one of the most dangerous possible scenarios of Copahue volcanic activity. PDC occurrence has been considered in the latest volcanic hazards map as a low-probability process; the evidence found in this contribution suggests instead that PDCs should be included as more probable, and thus as very important for the hazards assessment of the Copahue volcano.
A stochastic Markov chain model to describe lung cancer growth and metastasis.
Newton, Paul K; Mason, Jeremy; Bethel, Kelly; Bazhenova, Lyudmila A; Nieva, Jorge; Kuhn, Peter
2012-01-01
A stochastic Markov chain model for metastatic progression is developed for primary lung cancer based on a network construction of metastatic sites with dynamics modeled as an ensemble of random walkers on the network. We calculate a transition matrix, with entries (transition probabilities) interpreted as random variables, and use it to construct a circular bi-directional network of primary and metastatic locations based on postmortem tissue analysis of 3827 autopsies on untreated patients documenting all primary tumor locations and metastatic sites from this population. The resulting 50 potential metastatic sites are connected by directed edges with distributed weightings, where the site connections and weightings are obtained by calculating the entries of an ensemble of transition matrices so that the steady-state distribution obtained from the long-time limit of the Markov chain dynamical system corresponds to the ensemble metastatic distribution obtained from the autopsy data set. We condition our search for a transition matrix on an initial distribution of metastatic tumors obtained from the data set. Through an iterative numerical search procedure, we adjust the entries of a sequence of approximations until a transition matrix with the correct steady-state is found (up to a numerical threshold). Since this constrained linear optimization problem is underdetermined, we characterize the statistical variance of the ensemble of transition matrices calculated using the means and variances of their singular value distributions as a diagnostic tool. We interpret the ensemble averaged transition probabilities as (approximately) normally distributed random variables. The model allows us to simulate and quantify disease progression pathways and timescales of progression from the lung position to other sites and we highlight several key findings based on the model.
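The computational core is the long-run behaviour of a row-stochastic matrix. The toy sketch below (a hypothetical four-site network, not the 50-site matrix inferred from the autopsy data) propagates an ensemble of random walkers started at the primary site and cross-checks the result against the leading left eigenvector.

import numpy as np

sites = ["lung (primary)", "lymph nodes", "liver", "bone"]
P = np.array([[0.55, 0.20, 0.15, 0.10],          # hypothetical transition probabilities
              [0.15, 0.55, 0.20, 0.10],          # (each row sums to one)
              [0.10, 0.20, 0.60, 0.10],
              [0.10, 0.15, 0.15, 0.60]])
assert np.allclose(P.sum(axis=1), 1.0)

dist = np.array([1.0, 0.0, 0.0, 0.0])            # all walkers start at the primary tumor
for _ in range(200):
    dist = dist @ P                              # one step of the Markov chain

vals, vecs = np.linalg.eig(P.T)                  # steady state = left eigenvector for eigenvalue 1
pi = np.real(vecs[:, np.argmax(np.real(vals))])
pi /= pi.sum()

for name, a, b in zip(sites, dist, pi):
    print(f"{name:>15}: long-run occupancy {a:.3f} (eigenvector {b:.3f})")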
Van Driessche, L; Valgaeren, B R; Gille, L; Boyen, F; Ducatelle, R; Haesebrouck, F; Deprez, P; Pardon, B
2017-05-01
Nonendoscopic bronchoalveolar lavage (BAL) is a practical alternative to a deep nasopharyngeal swab (DNS) to sample the airways of a large number of calves in a short period of time. The extent of commensal overgrowth and agreement of BAL with DNS culture results in preweaned calves are unknown. To compare commensal overgrowth and bacterial culture results between DNS and BAL samples. A total of 183 preweaned calves (144 with bovine respiratory disease and 39 healthy animals). Cross-sectional study. Deep nasopharyngeal swab and BAL samples were taken from each calf and cultured to detect Pasteurellaceae and Mycoplasma bovis. Agreement and associations between culture results of DNS and BAL samples were determined by kappa statistics and logistic regression. Bronchoalveolar lavage samples were less often polymicrobial, more frequently negative and yielded more pure cultures compared to DNS, leading to a clinically interpretable culture result in 79.2% of the cases compared to only 31.2% of the DNS samples. Isolation rates were lower in healthy animals, but not different between DNS and BAL samples. Only Histophilus somni was more likely to be isolated from BAL samples. In clinical cases, a polymicrobial DNS culture result did not increase the probability of a polymicrobial BAL result by ≥30%, nor did it influence the probability of a negative culture. A significant herd effect was noted for all observed relationships. Nonendoscopic BAL samples are far less overgrown by bacteria compared to DNS samples under the conditions of this study, facilitating clinical interpretation and resulting in a higher return on investment in bacteriologic culturing. Copyright © 2017 The Authors. Journal of Veterinary Internal Medicine published by Wiley Periodicals, Inc. on behalf of the American College of Veterinary Internal Medicine.
The place of probability in Hilbert's axiomatization of physics, ca. 1900-1928
NASA Astrophysics Data System (ADS)
Verburgt, Lukas M.
2016-02-01
Although it has become commonplace to refer to the 'sixth problem' of Hilbert's (1900) Paris lecture as the starting point for modern axiomatized probability theory, his own views on probability have received comparatively little explicit attention. The central aim of this paper is to provide a detailed account of this topic in light of the central observation that the development of Hilbert's project of the axiomatization of physics went hand-in-hand with a redefinition of the status of probability theory and the meaning of probability. Where Hilbert first regarded the theory as a mathematizable physical discipline and later approached it as a 'vague' mathematical application in physics, he eventually understood probability, first, as a feature of human thought and, then, as an implicitly defined concept without a fixed physical interpretation. It thus becomes possible to suggest that Hilbert came to question, from the early 1920s on, the very possibility of achieving the goal of the axiomatization of probability as described in the 'sixth problem' of 1900.
Internal validation of STRmix™ for the interpretation of single source and mixed DNA profiles.
Moretti, Tamyra R; Just, Rebecca S; Kehl, Susannah C; Willis, Leah E; Buckleton, John S; Bright, Jo-Anne; Taylor, Duncan A; Onorato, Anthony J
2017-07-01
The interpretation of DNA evidence can entail analysis of challenging STR typing results. Genotypes inferred from low quality or quantity specimens, or mixed DNA samples originating from multiple contributors, can result in weak or inconclusive match probabilities when a binary interpretation method and necessary thresholds (such as a stochastic threshold) are employed. Probabilistic genotyping approaches, such as fully continuous methods that incorporate empirically determined biological parameter models, enable usage of more of the profile information and reduce subjectivity in interpretation. As a result, software-based probabilistic analyses tend to produce more consistent and more informative results regarding potential contributors to DNA evidence. Studies to assess and internally validate the probabilistic genotyping software STRmix™ for casework usage at the Federal Bureau of Investigation Laboratory were conducted using lab-specific parameters and more than 300 single-source and mixed contributor profiles. Simulated forensic specimens, including constructed mixtures that included DNA from two to five donors across a broad range of template amounts and contributor proportions, were used to examine the sensitivity and specificity of the system via more than 60,000 tests comparing hundreds of known contributors and non-contributors to the specimens. Conditioned analyses, concurrent interpretation of amplification replicates, and application of an incorrect contributor number were also performed to further investigate software performance and probe the limitations of the system. In addition, the results from manual and probabilistic interpretation of both prepared and evidentiary mixtures were compared. The findings support that STRmix™ is sufficiently robust for implementation in forensic laboratories, offering numerous advantages over historical methods of DNA profile analysis and greater statistical power for the estimation of evidentiary weight, and can be used reliably in human identification testing. With few exceptions, likelihood ratio results reflected intuitively correct estimates of the weight of the genotype possibilities and known contributor genotypes. This comprehensive evaluation provides a model in accordance with SWGDAM recommendations for internal validation of a probabilistic genotyping system for DNA evidence interpretation. Copyright © 2017. Published by Elsevier B.V.
ERIC Educational Resources Information Center
Erickson, Tim
2017-01-01
Understanding a Bayesian perspective demands comfort with conditional probability and with probabilities that appear to change as we acquire additional information. This paper suggests a simple context in conditional probability that helps develop the understanding students would need for a successful introduction to Bayesian reasoning.
Global Patterns of Lightning Properties Derived by OTD and LIS
NASA Technical Reports Server (NTRS)
Beirle, Steffen; Koshak, W.; Blakeslee, R.; Wagner, T.
2014-01-01
The satellite instruments Optical Transient Detector (OTD) and Lightning Imaging Sensor (LIS) provide unique empirical data about the frequency of lightning flashes around the globe (OTD), and the tropics (LIS), which has been used before to compile a well-received global climatology of flash rate densities. Here we present a statistical analysis of various additional lightning properties derived from OTD/LIS, i.e. the number of so-called "events" and "groups" per flash, as well as the mean flash duration, footprint and radiance. These normalized quantities, which can be associated with the flash "strength", show consistent spatial patterns; most strikingly, oceanic flashes show higher values than continental flashes for all properties. Over land, regions with high (Eastern US) and low (India) flash strength can be clearly identified. We discuss possible causes and implications of the observed regional differences. Although a direct quantitative interpretation of the investigated flash properties is difficult, the observed spatial patterns provide valuable information for the interpretation and application of climatological flash rates. Due to the systematic regional variations of physical flash characteristics, viewing conditions, and/or measurement sensitivities, parametrisations of lightning NOx based on total flash rate densities alone are probably affected by regional biases.
Hydrogeologic Unit Flow Characterization Using Transition Probability Geostatistics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jones, N L; Walker, J R; Carle, S F
2003-11-21
This paper describes a technique for applying the transition probability geostatistics method for stochastic simulation to a MODFLOW model. Transition probability geostatistics has several advantages over traditional indicator kriging methods including a simpler and more intuitive framework for interpreting geologic relationships and the ability to simulate juxtapositional tendencies such as fining upwards sequences. The indicator arrays generated by the transition probability simulation are converted to layer elevation and thickness arrays for use with the new Hydrogeologic Unit Flow (HUF) package in MODFLOW 2000. This makes it possible to preserve complex heterogeneity while using reasonably sized grids. An application of the technique involving probabilistic capture zone delineation for the Aberjona Aquifer in Woburn, MA, is included.
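At its core, the method rests on a continuous-lag Markov chain for facies: a transition rate matrix R built from mean unit thicknesses and juxtaposition tendencies yields the lag-dependent transition probabilities T(h) = expm(R h). The sketch below uses hypothetical facies proportions and thicknesses, not the Aberjona Aquifer model.

import numpy as np
from scipy.linalg import expm

facies = ["sand", "silt", "clay"]
proportions = np.array([0.35, 0.20, 0.45])       # hypothetical volumetric proportions
mean_thickness = np.array([2.0, 1.0, 3.0])       # hypothetical mean unit thicknesses, metres

R = np.zeros((3, 3))
for i in range(3):
    others = [j for j in range(3) if j != i]
    w = proportions[others] / proportions[others].sum()
    R[i, others] = w / mean_thickness[i]         # juxtaposition weights scaled by 1/thickness
    R[i, i] = -1.0 / mean_thickness[i]           # rows of a transition rate matrix sum to zero

for h in (0.5, 2.0, 10.0):                       # vertical lag, metres
    T = expm(R * h)                              # lag-dependent transition probability matrix
    print(f"lag {h:4.1f} m: P(sand -> sand) = {T[0, 0]:.3f}, P(sand -> clay) = {T[0, 2]:.3f}")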
Brand, Matthias; Schiebener, Johannes; Pertl, Marie-Theres; Delazer, Margarete
2014-01-01
Recent models on decision making under risk conditions have suggested that numerical abilities are important ingredients of advantageous decision-making performance, but empirical evidence is still limited. The results of our first study show that logical reasoning and basic mental calculation capacities predict ratio processing and that ratio processing predicts decision making under risk. In the second study, logical reasoning together with executive functions predicted probability processing (numeracy and probability knowledge), and probability processing predicted decision making under risk. These findings suggest that increasing an individual's understanding of ratios and probabilities should lead to more advantageous decisions under risk conditions.
Training interpretation biases among individuals with body dysmorphic disorder symptoms.
Premo, Julie E; Sarfan, Laurel D; Clerkin, Elise M
2016-03-01
The current study provided an initial test of a Cognitive Bias Modification for Interpretations (CBM-I) training paradigm among a sample with elevated BDD symptoms (N=86). As expected, BDD-relevant interpretations were reduced among participants who completed a positive (vs. comparison) training program. Results also pointed to the intriguing possibility that modifying biased appearance-relevant interpretations is causally related to changes in biased, socially relevant interpretations. Further, providing support for cognitive behavioral models, residual change in interpretations was associated with some aspects of in vivo stressor responding. However, contrary to expectations there were no significant effects of condition on emotional vulnerability to a BDD stressor, potentially because participants in both training conditions experienced reductions in biased socially-threatening interpretations following training (suggesting that the "comparison" condition was not inert). These findings have meaningful theoretical and clinical implications, and fit with transdiagnostic conceptualizations of psychopathology. Copyright © 2015 Elsevier Ltd. All rights reserved.
The effect of exposure to multiple lineups on face identification accuracy.
Hinz, T; Pezdek, K
2001-04-01
This study examines the conditions under which an intervening lineup affects identification accuracy on a subsequent lineup. One hundred and sixty adults observed a photograph of one target individual for 60 s. One week later, they viewed an intervening target-absent lineup and were asked to identify the target individual. Two days later, participants were shown one of three 6-person lineups that included a different photograph of the target face (present or absent), a foil face from the intervening lineup (present or absent), plus additional foil faces. The hit rate was higher when the foil face from the intervening lineup was absent from the test lineup and the false alarm rate was greater when the target face was absent from the test lineup. The results suggest that simply being exposed to an innocent suspect in an intervening lineup, whether that innocent suspect is identified by the witness or not, increases the probability of misidentifying the innocent suspect and decreases the probability of correctly identifying the true perpetrator in a subsequent test lineup. The implications of these findings both for police lineup procedures and for the interpretation of lineup results in the courtroom are discussed.
Quantum simulation of the integer factorization problem: Bell states in a Penning trap
NASA Astrophysics Data System (ADS)
Rosales, Jose Luis; Martin, Vicente
2018-03-01
The arithmetic problem of factoring an integer N can be translated into the physics of a quantum device, a result that supports Pólya's and Hilbert's conjecture to demonstrate Riemann's hypothesis. The energies of this system, being univocally related to the factors of N, are the eigenvalues of a bounded Hamiltonian. Here we solve the quantum conditions and show that the histogram of the discrete energies, provided by the spectrum of the system, should be interpreted in number theory as the relative probability for a prime to be a factor candidate of N. This is equivalent to a quantum sieve that is shown to require only o(ln √N)^3 energy measurements to solve the problem, recovering Shor's complexity result. Hence the outcome can be seen as a probability map that a pair of primes solve the given factorization problem. Furthermore, we show that a possible embodiment of this quantum simulator corresponds to two entangled particles in a Penning trap. The possibility to build the simulator experimentally is studied in detail. The results show that factoring numbers, many orders of magnitude larger than those computed with experimentally available quantum computers, is achievable using typical parameters in Penning traps.
NASA Astrophysics Data System (ADS)
Kvaček, Zlatko; Teodoridis, Vasilis; Kováčová, Marianna; Schlögl, Ján; Sitár, Viliam
2014-06-01
A new plant assemblage of Cerová-Lieskové from Lower Miocene (Karpatian) deposits in the Vienna Basin (western Slovakia) is preserved in a relatively deep, upper-slope marine environment. Depositional conditions with high sedimentation rates allowed exceptional preservation of plant remains. The plant assemblage consists of (1) conifers represented by foliage of Pinus hepios and Tetraclinis salicornioides, a seed cone of Pinus cf. ornata, and by pollen of the Cupressaceae, Pinaceae, Pinus sp. and Cathaya sp., and (2) angiosperms represented by Cinnamomum polymorphum, Platanus neptuni, Potamogeton sp. and lauroid foliage, by pollen of Liquidambar sp., Engelhardia sp. and Craigia sp., and in particular by infructescences (so far interpreted as belonging to cereal ears). We validate genus and species assignments of the infructescences: they belong to Palaeotriticum Sitár, including P. mockii Sitár and P. carpaticum Sitár, and probably represent herbaceous monocots that inhabited coastal marshes, similar to the living grass Spartina. Similar infructescences occur in the Lower and Middle Miocene deposits of the Carpathian Foredeep (Slup in Moravia), Tunjice Hills (Žale in Slovenia), and probably also in the Swiss Molasse (Lausanne). This plant assemblage demonstrates that the paleovegetation was represented by evergreen woodland with pines and grasses in undergrowth, similar to vegetation inhabiting coastal brackish marshes today. It also indicates subtropical climatic conditions in the Vienna Basin (central Paratethys), similar to those implied by other coeval plant assemblages from Central Europe
Hidden in the background: a local approach to CMB anomalies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sánchez, Juan C. Bueno, E-mail: juan.c.bueno@correounivalle.edu.co
2016-09-01
We investigate a framework aiming to provide a common origin for the large-angle anomalies detected in the Cosmic Microwave Background (CMB), which are hypothesized as the result of the statistical inhomogeneity developed by different isocurvature fields of mass m ∼ H present during inflation. The inhomogeneity arises as the combined effect of (i) the initial conditions for isocurvature fields (obtained after a fast-roll stage finishing many e-foldings before cosmological scales exit the horizon), (ii) their inflationary fluctuations and (iii) their coupling to other degrees of freedom. Our case of interest is when these fields (interpreted as the precursors of large-angle anomalies) leave an observable imprint only in isolated patches of the Universe. When the latter intersect the last scattering surface, such imprints arise in the CMB. Nevertheless, due to their statistically inhomogeneous nature, these imprints are difficult to detect, for they become hidden in the background similarly to the Cold Spot. We then compute the probability that a single isocurvature field becomes inhomogeneous at the end of inflation and find that, if the appropriate conditions are given (which depend exclusively on the preexisting fast-roll stage), this probability is at the percent level. Finally, we discuss several mechanisms (including the curvaton and the inhomogeneous reheating) to investigate whether an initial statistically inhomogeneous isocurvature field fluctuation might give rise to some of the observed anomalies. In particular, we focus on the Cold Spot, the power deficit at low multipoles and the breaking of statistical isotropy.
Functional response and population dynamics for fighting predator, based on activity distribution.
Garay, József; Varga, Zoltán; Gámez, Manuel; Cabello, Tomás
2015-03-07
The classical Holling type II functional response, describing the per capita predation as a function of prey density, was modified by Beddington and de Angelis to include interference of predators that increases with predator density and decreases the number of killed prey. In the present paper we further generalize the Beddington-de Angelis functional response, considering that all predator activities (searching for and handling prey, fighting and recovery) have a time duration, and that the probabilities of predator activities depend on the encounter probabilities, and hence on the prey and predator abundance, too. Under these conditions, the aim of the study is to introduce a functional response for the fighting predator and to analyse the corresponding dynamics when predator-predator-prey encounters also occur. From this general approach, the Holling type functional responses can also be obtained as particular cases. In terms of the activity distribution, we give biologically interpretable sufficient conditions for stable coexistence. We consider two-individual (predator-prey) and three-individual (predator-predator-prey) encounters. In the three-individual encounter model there is a relatively higher fighting rate and a lower killing rate. Using numerical simulation, we surprisingly found that when the intrinsic prey growth rate and the conversion rate are small enough, the equilibrium predator abundance is higher in the three-individual encounter case. The above means that, when the equilibrium abundance of the predator is small, coexistence appears first in the three-individual encounter model. Copyright © 2014 Elsevier Ltd. All rights reserved.
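For readers unfamiliar with the starting point, the sketch below embeds the classical Beddington-DeAngelis response in a simple predator-prey system and integrates it numerically. It is only the baseline model with illustrative parameters, not the authors' generalized activity-distribution formulation.

import numpy as np
from scipy.integrate import solve_ivp

def beddington_deangelis(N, P, a=1.0, h=0.4, c=0.5):
    # Per-predator kill rate a*N / (1 + a*h*N + c*P); c > 0 models predator interference.
    return a * N / (1.0 + a * h * N + c * P)

def rhs(t, y, r=1.0, K=10.0, e=0.5, m=0.3):
    N, P = y
    f = beddington_deangelis(N, P)
    dN = r * N * (1 - N / K) - f * P             # logistic prey growth minus predation
    dP = e * f * P - m * P                       # conversion into predators minus mortality
    return [dN, dP]

sol = solve_ivp(rhs, (0.0, 200.0), [5.0, 1.0], rtol=1e-8)
N_end, P_end = sol.y[:, -1]
print(f"state at t = 200: prey {N_end:.2f}, predators {P_end:.2f}")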
Nowakowski, Matilda E; Antony, Martin M; Koerner, Naomi
2015-12-01
The present study investigated the effects of computerized interpretation training and cognitive restructuring on symptomatology, behavior, and physiological reactivity in an analogue social anxiety sample. Seventy-two participants with elevated social anxiety scores were randomized to one session of computerized interpretation training (n = 24), cognitive restructuring (n = 24), or an active placebo control condition (n = 24). Participants completed self-report questionnaires focused on interpretation biases and social anxiety symptomatology at pre and posttraining and a speech task at posttraining during which subjective, behavioral, and physiological measures of anxiety were assessed. Only participants in the interpretation training condition endorsed significantly more positive than negative interpretations of ambiguous social situations at posttraining. There was no evidence of generalizability of interpretation training effects to self-report measures of interpretation biases and symptomatology or the anxiety response during the posttraining speech task. Participants in the cognitive restructuring condition were rated as having higher quality speeches and showing fewer signs of anxiety during the posttraining speech task compared to participants in the interpretation training condition. The present study did not include baseline measures of speech performance or computer assessed interpretation biases. The results of the present study bring into question the generalizability of computerized interpretation training as well as the effectiveness of a single session of cognitive restructuring in modifying the full anxiety response. Clinical and theoretical implications are discussed. Copyright © 2015 Elsevier Ltd. All rights reserved.
Bellin, Alberto; Tonina, Daniele
2007-10-30
Available models of solute transport in heterogeneous formations fall short of providing a complete characterization of the predicted concentration. This is a serious drawback especially in risk analysis where confidence intervals and probability of exceeding threshold values are required. Our contribution to fill this gap of knowledge is a probability distribution model for the local concentration of conservative tracers migrating in heterogeneous aquifers. Our model accounts for dilution, mechanical mixing within the sampling volume and spreading due to formation heterogeneity. It is developed by modeling local concentration dynamics with an Ito Stochastic Differential Equation (SDE) that under the hypothesis of statistical stationarity leads to the Beta probability distribution function (pdf) for the solute concentration. This model shows large flexibility in capturing the smoothing effect of the sampling volume and the associated reduction of the probability of exceeding large concentrations. Furthermore, it is fully characterized by the first two moments of the solute concentration, and these are the same pieces of information required for standard geostatistical techniques employing Normal or Log-Normal distributions. Additionally, we show that in the absence of pore-scale dispersion and for point concentrations the pdf model converges to the binary distribution of [Dagan, G., 1982. Stochastic modeling of groundwater flow by unconditional and conditional probabilities, 2, The solute transport. Water Resour. Res. 18 (4), 835-848.], while it approaches the Normal distribution for sampling volumes much larger than the characteristic scale of the aquifer heterogeneity. Furthermore, we demonstrate that the same model with the spatial moments replacing the statistical moments can be applied to estimate the proportion of the plume volume where solute concentrations are above or below critical thresholds. Application of this model to point and vertically averaged bromide concentrations from the first Cape Cod tracer test and to a set of numerical simulations confirms the above findings and for the first time it shows the superiority of the Beta model over both Normal and Log-Normal models in interpreting field data. Furthermore, we show that assuming a-priori that local concentrations are normally or log-normally distributed may result in a severe underestimate of the probability of exceeding large concentrations.
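In practice the model is used as follows (numbers below are hypothetical, not the Cape Cod data): the first two concentration moments are converted into Beta shape parameters by moment matching, after which exceedance probabilities and confidence intervals follow directly.

from scipy.stats import beta

# Concentration normalized by the source concentration, so it takes values in [0, 1].
mean, var = 0.15, 0.015            # hypothetical ensemble mean and variance at a receptor

# Moment matching for the Beta distribution (valid when var < mean * (1 - mean)).
common = mean * (1 - mean) / var - 1.0
a, b = mean * common, (1 - mean) * common

threshold = 0.40                   # hypothetical regulatory threshold (normalized)
p_exceed = beta.sf(threshold, a, b)
lo, hi = beta.ppf([0.025, 0.975], a, b)
print(f"Beta({a:.2f}, {b:.2f}): P(C > {threshold}) = {p_exceed:.4f}, "
      f"95% interval for C = [{lo:.3f}, {hi:.3f}]")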
Distal facies variability within the Upper Triassic part of the Otuk Formation in northern Alaska
Whidden, Katherine J.; Dumoulin, Julie A.; Whalen, M.T.; Hutton, E.; Moore, Thomas; Gaswirth, Stephanie
2014-01-01
The Triassic-Jurassic Otuk Formation is a potentially important source rock in allochthonous structural positions in the northern foothills of the Brooks Range in the North Slope of Alaska. This study focuses on three localities of the Upper Triassic (Norian) limestone member, which form a present-day, 110-km-long, east-west transect in the central Brooks Range. All three sections are within the structurally lowest Endicott Mountain allochthon and are interpreted to have been deposited along a marine outer shelf with a ramp geometry. The uppermost limestone member of the Otuk was chosen for this study in order to better understand lateral and vertical variability within carbonate source rocks, to aid prediction of organic richness, and ultimately, to evaluate the potential for these units to act as continuous (or unconventional) reservoirs. At each locality, 1 to 4 m sections of the limestone member were measured and sampled in detail to capture fine-scale features. Hand sample and thin section descriptions reveal four major microfacies in the study area, and one diagenetically recrystallized microfacies. Microfacies 1 and 2 are interpreted to represent redeposition of material by downslope transport, whereas microfacies 3 and 4 have high total organic carbon (TOC) values and are classified as primary depositional organofacies. Microfacies 3 is interpreted to have been deposited under primarily high productivity conditions, with high concentrations of radiolarian tests. Microfacies 4 was deposited under the lowest relative-oxygen conditions, but abundant thin bivalve shells indicate that the sediment-water interface was probably not anoxic. The Otuk Formation is interpreted to have been deposited outboard of a southwest-facing ramp margin, with the location of the three limestone outcrops likely in relatively close proximity during deposition. All three sections have evidence of transported material, implying that the Triassic Alaskan Basin was not a low-energy, deep-water setting, but rather a dynamic system with intermittent, yet significant, downslope flow. Upwelling played an important role in the small-scale vertical variability in microfacies. The zone of upwelling and resultant oxygen-minimum zone may have migrated across the ramp during fourth- or fifth-order sea-level changes.
NASA Astrophysics Data System (ADS)
Johnson, David T.
Quantum mechanics is an extremely successful and accurate physical theory, yet since its inception, it has been afflicted with numerous conceptual difficulties. The primary subject of this thesis is the theory of entropic quantum dynamics (EQD), which seeks to avoid these conceptual problems by interpreting quantum theory from an informational perspective. We begin by reviewing Cox's work in describing probability theory as a means of rationally and consistently quantifying uncertainties. We then discuss how probabilities can be updated according to either Bayes' theorem or the extended method of maximum entropy (ME). After that discussion, we review the work of Caticha and Giffin that shows that Bayes' theorem is a special case of ME. This important result demonstrates that the ME method is the general method for updating probabilities. We then review some motivating difficulties in quantum mechanics before discussing Caticha's work in deriving quantum theory from the approach of entropic dynamics, which concludes our review. After entropic dynamics is introduced, we develop the concepts of symmetries and transformations from an informational perspective. The primary result is the formulation of a symmetry condition that any transformation must satisfy in order to qualify as a symmetry in EQD. We then proceed to apply this condition to the extended Galilean transformation. This transformation is of interest as it exhibits features of both special and general relativity. The transformation yields a gravitational potential that arises from an equivalence of information. We conclude the thesis with a discussion of the measurement problem in quantum mechanics. We discuss the difficulties that arise in the standard quantum mechanical approach to measurement before developing our theory of entropic measurement. In entropic dynamics, position is the only observable. We show how a theory built on this one observable can account for the multitude of measurements present in quantum theory. Furthermore, we show that the Born rule need not be postulated, but can be derived in EQD. Finally, we show how the wave function can be updated by the ME method as the phase is constructed purely in terms of probabilities.
NASA Astrophysics Data System (ADS)
May, J.-H.; Preusser, F.; Zech, R.; Ilgner, J.; Veit, H.
2009-04-01
Throughout the Central Andes, glacial landscapes have long been used for the reconstruction of Late Quaternary glaciations and landscape evolution. Much work has focused on the Andes in Peru, Chile and the Bolivian Altiplano, whereas relatively little data has been published on glaciation history in the eastern Andean ranges and slopes. Even less is known with regard to the postglacial evolution of these glacial landscapes. In the Cordillera de Cochabamba (Bolivia), local maximum advances probably peaked around 20-25 ka BP and were followed by significant readvances between ~12-16 ka BP. This generally points to temperature controlled maximum glacial advances along the humid eastern slopes of the Central Andes, which is supported by glacier-climate-modelling studies. However, most studies include only marginal information with regard to the complex geomorphic and sedimentary situation in the Cordillera de Cochabamba. Furthermore, the chronological results are afflicted with several methodological uncertainties inherent to surface exposure dating and call for application of alternative, independent age dating methods. Therefore this study aims at i) documenting and interpreting the complex glacial geomorphology of the Huara Loma valley in the Cordillera de Cochabamba (Bolivia), ii) analyzing the involved units of glacial sediments, and iii) improving the chronological framework by applying optically stimulated luminescence (OSL) and radiocarbon dating (14C). For this purpose, geomorphic mapping was combined with field documentation of sedimentary profiles. The involved sediments were subject to geochemical and mineralogical analysis in order to deduce information on their erosional and weathering histories. In addition, the interpretation of OSL ages from glacial and proglacial sediments integrated several methodological procedures with regard to sample preparation and statistical analysis of the measurements in order to increase the degree of confidence. These combined efforts confirm two major glacial advances in the Cordillera de Cochabamba, which took place during the global LGM and during the Lateglacial. However, their relative chronologies and sedimentary interpretation indicate that the maximum extent of glaciation at Huara Loma was reached during humid Lateglacial times whereas conditions during the LGM were probably too dry.
THE FLICKER RESPONSE CONTOURS FOR GENETICALLY RELATED FISHES. II
Crozier, W. J.; Wolf, Ernst
1939-01-01
The flicker response contour has been determined for several species and types of the teleosts Xiphophorus (X.) and Platypoecilius (P.) under the same conditions. The curve (F vs. log I_m) is the same for representatives of each generic type, but is different for the two genera. Its duplex nature is analyzable in each instance by application of the probability integral equation to the rod and cone constituent parts. The parameters of this function provide rational measures of invariant properties of the curves, which have specific values according to the genetic constitution of the animal. The F_1 hybrids (H'') of X. montezuma x P. variatus show dominance of the X. properties with respect to cone F_max and σ' log I, but an intermediate value of the abscissa of inflection (τ'). The rod segment shows dominance of σ' log I from P., but an intermediate value of F_max and of τ'. The composite flicker curve involves the operation of two distinct assemblages of excitable elements, differing quantitatively but not qualitatively in physicochemical organization, probably only secondarily related to the histological differentiation of rods and cones because almost certainly of central nervous locus, but following different rules in hereditary determination, and therefore necessarily different in physical organization. The interpretation of the diverse behavior of the three parameters of the probability summation is discussed, particularly in relation to the physical significance of these parameters as revealed by their quantitative relations to temperature, retinal area, and light time fraction in the flash cycle, and to their interrelations in producing the decline of rod effects at higher intensities. It is stressed that in general the properties of the parameters of a chosen interpretive analytical function must be shown experimentally to possess the physical properties implied by the equation selected before the equation can be regarded as describing those invariant properties of the organic system concerned upon which alone can deduction of the nature of the system proceed. The importance of genetic procedures in furthering demonstration that the biological performance considered in any particular case exhibits constitutionally invariant features provides a potentially powerful instrument in such rational analysis. PMID:19873115
Improving experimental phases for strong reflections prior to density modification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Uervirojnangkoorn, Monarin; Hilgenfeld, Rolf; Terwilliger, Thomas C.
Experimental phasing of diffraction data from macromolecular crystals involves deriving phase probability distributions. These distributions are often bimodal, making their weighted average, the centroid phase, improbable, so that electron-density maps computed using centroid phases are often non-interpretable. Density modification brings in information about the characteristics of electron density in protein crystals. In successful cases, this allows a choice between the modes in the phase probability distributions, and the maps can cross the borderline between non-interpretable and interpretable. Based on the suggestions by Vekhter [Vekhter (2005), Acta Cryst. D 61, 899–902], the impact of identifying optimized phases for a small number of strong reflections prior to the density-modification process was investigated while using the centroid phase as a starting point for the remaining reflections. A genetic algorithm was developed that optimizes the quality of such phases using the skewness of the density map as a target function. Phases optimized in this way are then used in density modification. In most of the tests, the resulting maps were of higher quality than maps generated from the original centroid phases. In one of the test cases, the new method sufficiently improved a marginal set of experimental SAD phases to enable successful map interpretation. Lastly, a computer program, SISA, has been developed to apply this method for phase improvement in macromolecular crystallography.
Harrigan, George G; Harrison, Jay M
2012-01-01
New transgenic (GM) crops are subjected to extensive safety assessments that include compositional comparisons with conventional counterparts as a cornerstone of the process. The influence of germplasm, location, environment, and agronomic treatments on compositional variability is, however, often obscured in these pair-wise comparisons. Furthermore, classical statistical significance testing can often provide an incomplete and over-simplified summary of highly responsive variables such as crop composition. In order to more clearly describe the influence of the numerous sources of compositional variation we present an introduction to two alternative but complementary approaches to data analysis and interpretation. These include i) exploratory data analysis (EDA) with its emphasis on visualization and graphics-based approaches and ii) Bayesian statistical methodology that provides easily interpretable and meaningful evaluations of data in terms of probability distributions. The EDA case-studies include analyses of herbicide-tolerant GM soybean and insect-protected GM maize and soybean. Bayesian approaches are presented in an analysis of herbicide-tolerant GM soybean. Advantages of these approaches over classical frequentist significance testing include the more direct interpretation of results in terms of probabilities pertaining to quantities of interest and no confusion over the application of corrections for multiple comparisons. It is concluded that a standardized framework for these methodologies could provide specific advantages through enhanced clarity of presentation and interpretation in comparative assessments of crop composition.
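A minimal sketch of the kind of Bayesian summary described above, assuming a simple normal model with a noninformative prior so that the posterior for the mean difference is a scaled, shifted t distribution; the data, the equivalence margin and the model choice are illustrative assumptions, not the authors' analysis.

```python
import numpy as np
from scipy import stats

def posterior_prob_within_margin(gm, conv, margin):
    """Posterior probability that the true mean difference (GM - conventional)
    lies within +/- margin, under a normal model with a noninformative prior
    (the posterior is then a scaled, shifted t distribution). Hedged sketch."""
    gm, conv = np.asarray(gm, float), np.asarray(conv, float)
    diff = gm.mean() - conv.mean()
    se = np.sqrt(gm.var(ddof=1) / len(gm) + conv.var(ddof=1) / len(conv))
    df = len(gm) + len(conv) - 2          # rough degrees of freedom
    post = stats.t(df, loc=diff, scale=se)
    return post.cdf(margin) - post.cdf(-margin)

# Hypothetical protein measurements (% dry weight) and a hypothetical margin
print(posterior_prob_within_margin([36.1, 35.8, 36.4], [35.9, 36.2, 36.0], margin=1.0))
```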
An Inverse Problem for a Class of Conditional Probability Measure-Dependent Evolution Equations
Mirzaev, Inom; Byrne, Erin C.; Bortz, David M.
2016-01-01
We investigate the inverse problem of identifying a conditional probability measure in measure-dependent evolution equations arising in size-structured population modeling. We formulate the inverse problem as a least squares problem for the probability measure estimation. Using the Prohorov metric framework, we prove existence and consistency of the least squares estimates and outline a discretization scheme for approximating a conditional probability measure. For this scheme, we prove general method stability. The work is motivated by Partial Differential Equation (PDE) models of flocculation for which the shape of the post-fragmentation conditional probability measure greatly impacts the solution dynamics. To illustrate our methodology, we apply the theory to a particular PDE model that arises in the study of population dynamics for flocculating bacterial aggregates in suspension, and provide numerical evidence for the utility of the approach. PMID:28316360
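The least-squares estimation of a discretized probability measure can be sketched as below, with a random linear map standing in for the PDE forward solve; the discretization, constraints and solver choice are assumptions for illustration only.

```python
import numpy as np
from scipy.optimize import minimize

def fit_discrete_measure(forward_matrix, data):
    """Least-squares fit of a discretized probability measure: forward_matrix[i, j]
    maps the weight of grid node j to observation i (a stand-in for the PDE forward
    solve); weights are nonnegative and sum to one. Illustrative sketch only."""
    n = forward_matrix.shape[1]
    objective = lambda w: np.sum((forward_matrix @ w - data) ** 2)
    constraints = ({"type": "eq", "fun": lambda w: np.sum(w) - 1.0},)
    bounds = [(0.0, 1.0)] * n
    w0 = np.full(n, 1.0 / n)
    res = minimize(objective, w0, bounds=bounds, constraints=constraints, method="SLSQP")
    return res.x

# Tiny synthetic example with a random linear forward map
rng = np.random.default_rng(0)
A = rng.random((20, 5))
true_w = np.array([0.1, 0.4, 0.3, 0.15, 0.05])
print(fit_discrete_measure(A, A @ true_w).round(3))
```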
Counterfactuality of ‘counterfactual’ communication
NASA Astrophysics Data System (ADS)
Vaidman, L.
2015-11-01
The counterfactuality of the recently proposed protocols for direct quantum communication is analyzed. It is argued that the protocols can be counterfactual only for one value of the transmitted bit. The protocols achieve a reduced probability of detection of the particle in the transmission channel by increasing the number of paths in the channel. However, this probability is not lower than the probability of detecting a particle actually passing through such a multi-path channel, which was found to be surprisingly small. The relation between security and counterfactuality of the protocols is discussed. An analysis of counterfactuality of the protocols in the framework of the Bohmian interpretation is performed.
20007: Quantum particle displacement by a moving localized potential trap
NASA Astrophysics Data System (ADS)
Granot, E.; Marchewka, A.
2009-04-01
We describe the dynamics of a bound state of an attractive δ-well under displacement of the potential. Exact analytical results are presented for the suddenly moved potential. Since this is a quantum system, only a fraction of the initially confined wave function remains confined to the moving potential. However, it is shown that besides the probability to remain confined to the moving barrier and the probability to remain in the initial position, there is also a certain probability for the particle to move at double speed. A quasi-classical interpretation for this effect is suggested. The temporal and spectral dynamics of each one of the scenarios is investigated.
NASA Astrophysics Data System (ADS)
Singh Pradhan, Ananta Man; Kang, Hyo-Sub; Kim, Yun-Tae
2016-04-01
This study uses a physically based approach to evaluate the factor of safety of hillslopes under different hydrological conditions in Mt Umyeon, south of Seoul. The hydrological conditions were determined from the rainfall intensity and duration associated with a known landslide inventory covering the whole of Korea. A quantile regression method was used to ascertain different probability warning levels on the basis of rainfall thresholds. Physically based models are easily interpreted and have high predictive capabilities but rely on spatially explicit and accurate parameterization, which is commonly not possible. Statistical probabilistic methods can include other causative factors which influence slope stability, such as forest, soil and geology, but rely on good landslide inventories of the site. This study therefore describes a hybrid approach that combines physically based landslide susceptibility for different hydrological conditions with a presence-only maximum entropy model used to analyze the relation of landslides to the conditioning factors. About 80% of the landslides were listed among the unstable sites identified by the proposed model, demonstrating its effectiveness and accuracy in determining unstable areas and areas that require evacuation. The cumulative rainfall thresholds provide a valuable reference to guide disaster prevention authorities in issuing warning levels, with the potential to reduce losses and save lives.
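A common building block of physically based landslide susceptibility tools is the infinite-slope factor of safety; the sketch below is a generic version of that calculation, not the model used in the study, and all parameter values are hypothetical.

```python
import numpy as np

def infinite_slope_fs(slope_deg, cohesion, unit_weight, depth, wetness,
                      friction_deg, water_unit_weight=9.81):
    """Factor of safety from the classic infinite-slope model:
    FS = [c + (gamma - m*gamma_w) z cos^2(beta) tan(phi)] / [gamma z sin(beta) cos(beta)].
    Parameter values are hypothetical; this is a sketch, not the study's model."""
    beta = np.radians(slope_deg)
    phi = np.radians(friction_deg)
    numerator = cohesion + (unit_weight - wetness * water_unit_weight) * depth \
        * np.cos(beta) ** 2 * np.tan(phi)
    denominator = unit_weight * depth * np.sin(beta) * np.cos(beta)
    return numerator / denominator

# Hypothetical hillslope: 35 degrees, 1 m of soil, fully saturated
print(infinite_slope_fs(slope_deg=35, cohesion=5.0, unit_weight=18.0,
                        depth=1.0, wetness=1.0, friction_deg=32))
```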
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peterson, J.W.; Geiger, C.A.
1990-03-01
The Hardwood Gneiss is an areally small unit of Precambrian granulite-grade rocks exposed in the Archean gneiss terrane of the southern Lake Superior region. The rocks are located in the southwestern portion of the Upper Peninsula of Michigan and consist of a structurally conformable package of quartzitic, metapelitic, amphibolitic, and metabasic units. Three texturally distinct garnet types are present in the metabasites and are interpreted to represent two metamorphic events. Geothermobarometry indicates conditions of ~8.2-11.6 kbar and ~770°C for M1, and conditions of ~6.0-10.1 kbar and ~610-740°C for M2. It is proposed that M1 was Archean and contemporaneous with a high-grade metamorphic event recorded in the Minnesota River Valley. The M2 event was probably Early Proterozoic and pre-Penokean, with metamorphic conditions more intense than those generally ascribed to the Penokean Orogeny in Michigan, but similar to the conditions reported for the Kapuskasing zone of Ontario. The high paleopressures and temperatures of the M1 event make the Hardwood Gneiss distinct from any rocks previously described in the southern Lake Superior region, and suggest intense tectonic activity during the Archean.
A new understanding of multiple-pulsed laser-induced retinal injury thresholds.
Lund, David J; Sliney, David H
2014-04-01
Laser safety standards committees have struggled for years to formulate adequately a sound method for treating repetitive-pulse laser exposures. Safety standards for lamps and LEDs have ignored this issue because averaged irradiance appeared to treat the issue adequately for large retinal image sizes and skin exposures. Several authors have recently questioned the current approach of three test conditions (i.e., limiting single-pulse exposure, average irradiance, and a single-pulse-reduction factor) as still insufficient to treat pulses of unequal energies or certain pulse groupings. Schulmeister et al. employed thermal modeling to show that a total-on-time pulse (TOTP) rule was conservative. Lund further developed the approach of probability summation proposed by Menendez et al. to explain pulse-additivity, whereby additivity is the result of an increasing probability of detecting injury with multiple pulse exposures. This latter argument relates the increase in detection probability to the slope of the probit curve for the threshold studies. Since the uncertainty in the threshold for producing an ophthalmoscopically detectable minimal visible lesion (MVL) is large for retinal exposure to a collimated laser beam, safety committees traditionally applied large risk reduction factors ("safety factors") of one order of magnitude when deriving intrabeam, "point-source" exposure limits. This reduction factor took into account the probability of visually detecting the low-contrast lesion among other factors. The reduction factor is smaller for large spot sizes where these difficulties are quite reduced. Thus the N⁻⁰·²⁵ reduction factor may result from the difficulties in detecting the lesion. Recent studies on repetitive pulse exposures in both animal and in vitro (retinal explant) models support this interpretation of the available data.
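The probability-summation account mentioned above can be sketched with a probit dose-response model: if one pulse produces a detectable lesion with probability given by a cumulative normal in log dose, then N statistically independent pulses give P_N(D) = 1 - (1 - P_1(D))^N, and the N-pulse ED50 falls as N grows. The slope and ED50 values below are hypothetical.

```python
import numpy as np
from scipy.stats import norm
from scipy.optimize import brentq

def single_pulse_prob(dose, ed50=1.0, probit_slope=6.0):
    """Probit dose-response for a single pulse: detection probability is a
    cumulative normal in log10 dose. Slope and ED50 are hypothetical."""
    return norm.cdf(probit_slope * np.log10(dose / ed50))

def n_pulse_ed50(n, **kwargs):
    """ED50 of an n-pulse train under pure probability summation,
    P_n(D) = 1 - (1 - P_1(D))**n."""
    f = lambda d: 1.0 - (1.0 - single_pulse_prob(d, **kwargs)) ** n - 0.5
    return brentq(f, 1e-3, 1e3)

# Per-pulse threshold reduction with pulse number, qualitatively like N^-0.25
for n in (1, 10, 100, 1000):
    print(n, round(n_pulse_ed50(n), 3))
```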
The effect of air temperature and human thermal indices on mortality in Athens, Greece
NASA Astrophysics Data System (ADS)
Nastos, Panagiotis T.; Matzarakis, Andreas
2012-05-01
This paper investigates whether there is any association between daily mortality in the wider region of Athens, Greece and the thermal conditions for the 10-year period 1992-2001. The daily mortality datasets were acquired from the Hellenic Statistical Service and the daily meteorological datasets, comprising daily maximum and minimum air temperature, from the Hellinikon/Athens meteorological station, established at the headquarters of the Greek Meteorological Service. In addition, the daily values of the thermal indices Physiologically Equivalent Temperature (PET) and Universal Thermal Climate Index (UTCI) were evaluated in order to assess the grade of physiological stress. The first step was the application of Pearson's χ² test to the compiled contingency tables, which showed that the probability of independence is essentially zero (p = 0.000); namely, mortality is closely related to air temperature and PET/UTCI. Furthermore, generalized linear models showed that statistically significant relationships (p < 0.01) between air temperature, PET, UTCI and mortality exist on the same day. More concretely, during the cold period (October-March), a 10°C decrease in daily maximum air temperature, minimum air temperature, temperature range, PET and UTCI is associated with an increase of 13%, 15%, 2%, 7% and 6%, respectively, in the probability of a death occurring. During the warm period (April-September), a 10°C increase in daily maximum air temperature, minimum air temperature, temperature range, PET and UTCI is associated with an increase of 3%, 1%, 10%, 3% and 5%, respectively, in the probability of a death occurring. Taking into consideration the time-lag effect of the examined parameters on mortality, significant effects appear at a 3-day lag during the cold period against a 1-day lag during the warm period. Although cold conditions appear, in general, to favour daily mortality, exceedances of air temperature and PET/UTCI over distribution-specific thresholds reveal that very hot conditions are also risk factors for daily mortality.
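A schematic of the two statistical steps described above: a Pearson chi-square test on a contingency table, and the translation of a log-linear temperature coefficient into a percent change per 10°C. The table and the coefficient are hypothetical, not the study's data.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical contingency table: days cross-classified by maximum air
# temperature class (rows) and mortality class (columns).
table = np.array([[120,  80],
                  [ 95, 105],
                  [ 70, 130]])
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p:.4f}")

# In a log-linear (e.g., Poisson GLM) fit, a coefficient b per degree Celsius
# translates into a percent change per 10 degree change as follows:
b = 0.012          # hypothetical coefficient
print(f"{(np.exp(10 * b) - 1) * 100:.1f}% change per 10 degree increase")
```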
NASA Astrophysics Data System (ADS)
Rodríguez-Tovar, F. J.; Uchman, A.; Orue-Etxebarria, X.; Apellaniz, E.
2013-02-01
Ichnological analysis was conducted in the Danian-Selandian (D-S) boundary interval from the Sopelana section (Basque Basin, northern Spain) to improve characterization of the recently defined Global Stratotype Section and Point of the base of the Selandian Stage (Middle Paleocene) in the nearby Zumaia section, and to interpret the Danian-Selandian boundary event with its associated palaeoenvironmental changes. The trace fossil assemblage of the boundary interval is relatively scarce and shows low diversity, consisting of Chondrites, Planolites, Thalassinoides, Trichichnus and Zoophycos, which cross-cut a diffuse, burrow-mottled background, typical of a normal burrowing tiered community. Distribution of trace fossils shows local drops in abundance and diversity just above the D-S boundary and about half a metre upwards into the succeeding Selandian. Generally, the Selandian part of the section has slightly lower trace fossil diversity and abundance. This is interpreted as due to a higher detrital food supply, corresponding to a sea-level fall, in contrast to a decreased food supply during the Selandian sea-level rise. Smaller-scale fluctuations of trace fossil diversity and abundance are also interpreted as due more to food content fluctuations in the sediment than to oxygenation of pore waters. Results reveal the minor influence of an extreme warming event (hyperthermal conditions) at the D-S boundary which affected the whole benthic habitat. Contrarily, a probable major effect of sea-level fluctuations can be envisaged, which determined variations in siliciclastic input and food content.
Quantifiers are incrementally interpreted in context, more than less
Urbach, Thomas P.; DeLong, Katherine A.; Kutas, Marta
2015-01-01
Language interpretation is often assumed to be incremental. However, our studies of quantifier expressions in isolated sentences found N400 event-related brain potential (ERP) evidence for partial but not full immediate quantifier interpretation (Urbach & Kutas, 2010). Here we tested similar quantifier expressions in pragmatically supporting discourse contexts (Alex was an unusual toddler. Most/Few kids prefer sweets/vegetables…) while participants made plausibility judgments (Experiment 1) or read for comprehension (Experiment 2). Control Experiments 3A (plausibility) and 3B (comprehension) removed the discourse contexts. Quantifiers always modulated typical and/or atypical word N400 amplitudes. However, only in Experiment 2 did the real-time N400 effects mirror the offline quantifier and typicality crossover interaction effects for plausibility ratings and cloze probabilities. We conclude that quantifier expressions can be interpreted fully and immediately, though pragmatic and task variables appear to impact the speed and/or depth of quantifier interpretation. PMID:26005285
What is the correct cost functional for variational data assimilation?
NASA Astrophysics Data System (ADS)
Bröcker, Jochen
2018-03-01
Variational approaches to data assimilation, and weakly constrained four-dimensional variational assimilation (WC-4DVar) in particular, are important in the geosciences but also in other communities (often under different names). The cost functions and the resulting optimal trajectories may have a probabilistic interpretation, for instance by linking data assimilation with maximum a posteriori (MAP) estimation. This is possible in particular if the unknown trajectory is modelled as the solution of a stochastic differential equation (SDE), as is increasingly the case in weather forecasting and climate modelling. In this situation, the MAP estimator (or "most probable path" of the SDE) is obtained by minimising the Onsager-Machlup functional. Although this fact is well known, there seems to be some confusion in the literature, with the energy (or "least squares") functional sometimes being claimed to yield the most probable path. The first aim of this paper is to address this confusion and show that the energy functional does not, in general, provide the most probable path. The second aim is to discuss the implications in practice. Although the mentioned results pertain to stochastic models in continuous time, they do have consequences in practice where SDEs are approximated by discrete time schemes. It turns out that using an approximation to the SDE and calculating its most probable path does not necessarily yield a good approximation to the most probable path of the SDE proper. This suggests that even in discrete time, a version of the Onsager-Machlup functional should be used, rather than the energy functional, at least if the solution is to be interpreted as a MAP estimator.
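For a scalar SDE dX = f(X) dt + σ dW, the two functionals contrasted above take, in one common convention (up to normalization), the forms below; the extra divergence term is what separates the most probable path from the least-squares path. This is a generic statement of the distinction, not necessarily the exact form used in the paper.

```latex
% Energy ("least-squares") functional vs. Onsager-Machlup functional
% for a path x(t), t in [0, T], of dX = f(X) dt + \sigma dW:
E[x] \;=\; \frac{1}{2\sigma^{2}}\int_{0}^{T}\bigl(\dot{x}(t)-f(x(t))\bigr)^{2}\,dt ,
\qquad
J_{\mathrm{OM}}[x] \;=\; E[x] \;+\; \frac{1}{2}\int_{0}^{T} f'\bigl(x(t)\bigr)\,dt .
```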
Corson-Knowles, Daniel; Russell, Frances M
2018-05-01
Clinical ultrasound (CUS) is highly specific for the diagnosis of acute appendicitis but is operator-dependent. The goal of this study was to determine if a heterogeneous group of emergency physicians (EP) could diagnose acute appendicitis on CUS in patients with a moderate to high pre-test probability. This was a prospective, observational study of a convenience sample of adult and pediatric patients with suspected appendicitis. Sonographers received a structured, 20-minute CUS training on appendicitis prior to patient enrollment. The presence of a dilated (>6 mm diameter), non-compressible, blind-ending tubular structure was considered a positive study. Non-visualization or indeterminate studies were considered negative. We collected pre-test probability of acute appendicitis based on a 10-point visual analog scale (moderate to high was defined as >3), and confidence in CUS interpretation. The primary objective was measured by comparing CUS findings to surgical pathology and one week follow-up. We enrolled 105 patients; 76 had moderate to high pre-test probability. Of these, 24 were children. The rate of appendicitis was 36.8% in those with moderate to high pre-test probability. CUS were recorded by 33 different EPs. The sensitivity, specificity, and positive and negative likelihood ratios of EP-performed CUS in patients with moderate to high pre-test probability were 42.8% (95% confidence interval [CI] [25-62.5%]), 97.9% (95% CI [87.5-99.8%]), 20.7 (95% CI [2.8-149.9]) and 0.58 (95% CI [0.42-0.8]), respectively. The 16 false negative scans were all interpreted as indeterminate. There was one false positive CUS diagnosis; however, the sonographer reported low confidence of 2/10. A heterogeneous group of EP sonographers can safely identify acute appendicitis with high specificity in patients with moderate to high pre-test probability. This data adds support for surgical consultation without further imaging beyond CUS in the appropriate clinical setting.
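The reported accuracy figures follow from a standard 2x2 calculation; in the sketch below the cell counts are back-calculated to be roughly consistent with the reported sensitivity and specificity and are illustrative only.

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, and likelihood ratios from a 2x2 table."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    lr_pos = sens / (1 - spec)
    lr_neg = (1 - sens) / spec
    return sens, spec, lr_pos, lr_neg

# A positive likelihood ratio near 20 means a positive scan raises the odds of
# appendicitis roughly twenty-fold (posttest odds = pretest odds * LR).
print(diagnostic_metrics(tp=12, fp=1, fn=16, tn=47))
```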
O'Toole, Mia S; Mennin, Douglas S; Hougaard, Esben; Zachariae, Robert; Rosenberg, Nicole K
2015-01-01
The objective of the study was to investigate variables, derived from both cognitive and emotion regulation conceptualizations of social anxiety disorder (SAD), as possible change processes in cognitive behaviour therapy (CBT) for SAD. Several proposed change processes were investigated: estimated probability, estimated cost, safety behaviours, acceptance of emotions, cognitive reappraisal and expressive suppression. Participants were 50 patients with SAD, receiving a standard manualized CBT program, conducted in groups or individually. All variables were measured pre-therapy, mid-therapy and post-therapy. Lower level mediation models revealed that while a change in most process measures significantly predicted clinical improvement, only changes in estimated probability and cost and acceptance of emotions showed significant indirect effects of CBT for SAD. The results are in accordance with previous studies supporting the mediating role of changes in cognitive distortions in CBT for SAD. In addition, acceptance of emotions may also be a critical component to clinical improvement in SAD during CBT, although more research is needed on which elements of acceptance are most helpful for individuals with SAD. The study's lack of a control condition limits any conclusion regarding the specificity of the findings to CBT. Change in estimated probability and cost, and acceptance of emotions showed an indirect effect of CBT for SAD. Cognitive distortions appear relevant to target with cognitive restructuring techniques. Finding acceptance to have an indirect effect could be interpreted as support for contemporary CBT approaches that include acceptance-based strategies. Copyright © 2014 John Wiley & Sons, Ltd.
Updating: Learning versus Supposing
ERIC Educational Resources Information Center
Zhao, Jiaying; Crupi, Vincenzo; Tentori, Katya; Fitelson, Branden; Osherson, Daniel
2012-01-01
Bayesian orthodoxy posits a tight relationship between conditional probability and updating. Namely, the probability of an event "A" after learning "B" should equal the conditional probability of "A" given "B" prior to learning "B". We examine whether ordinary judgment conforms to the orthodox view. In three experiments we found substantial…
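A toy numerical version of the orthodox norm, using an arbitrary joint distribution: after learning B, the probability of A should equal P(A | B) computed from the prior.

```python
# Hypothetical joint distribution over (A, B); numbers are arbitrary.
p = {("A", "B"): 0.20, ("A", "notB"): 0.30, ("notA", "B"): 0.10, ("notA", "notB"): 0.40}
p_b = p[("A", "B")] + p[("notA", "B")]
p_a_given_b = p[("A", "B")] / p_b
# Under Bayesian orthodoxy, this is what P(A) should become after learning B.
print(f"P(A | B) = {p_a_given_b:.2f}")
```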
Music-evoked incidental happiness modulates probability weighting during risky lottery choices
Schulreich, Stefan; Heussen, Yana G.; Gerhardt, Holger; Mohr, Peter N. C.; Binkofski, Ferdinand C.; Koelsch, Stefan; Heekeren, Hauke R.
2014-01-01
We often make decisions with uncertain consequences. The outcomes of the choices we make are usually not perfectly predictable but probabilistic, and the probabilities can be known or unknown. Probability judgments, i.e., the assessment of unknown probabilities, can be influenced by evoked emotional states. This suggests that also the weighting of known probabilities in decision making under risk might be influenced by incidental emotions, i.e., emotions unrelated to the judgments and decisions at issue. Probability weighting describes the transformation of probabilities into subjective decision weights for outcomes and is one of the central components of cumulative prospect theory (CPT) that determine risk attitudes. We hypothesized that music-evoked emotions would modulate risk attitudes in the gain domain and in particular probability weighting. Our experiment featured a within-subject design consisting of four conditions in separate sessions. In each condition, the 41 participants listened to a different kind of music—happy, sad, or no music, or sequences of random tones—and performed a repeated pairwise lottery choice task. We found that participants chose the riskier lotteries significantly more often in the “happy” than in the “sad” and “random tones” conditions. Via structural regressions based on CPT, we found that the observed changes in participants' choices can be attributed to changes in the elevation parameter of the probability weighting function: in the “happy” condition, participants showed significantly higher decision weights associated with the larger payoffs than in the “sad” and “random tones” conditions. Moreover, elevation correlated positively with self-reported music-evoked happiness. Thus, our experimental results provide evidence in favor of a causal effect of incidental happiness on risk attitudes that can be explained by changes in probability weighting. PMID:24432007
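The elevation-versus-curvature distinction can be illustrated with the linear-in-log-odds (Goldstein-Einhorn) weighting function often used in CPT work; whether this exact parameterization matches the study's structural model is an assumption.

```python
import numpy as np

def weight_goldstein_einhorn(p, elevation, curvature):
    """Linear-in-log-odds probability weighting:
    w(p) = d*p**g / (d*p**g + (1-p)**g), where d controls elevation and
    g controls curvature. Parameter values below are hypothetical."""
    p = np.asarray(p, dtype=float)
    num = elevation * p ** curvature
    return num / (num + (1.0 - p) ** curvature)

probs = np.array([0.1, 0.3, 0.5, 0.7, 0.9])
print(weight_goldstein_einhorn(probs, elevation=1.2, curvature=0.6).round(3))  # "happy"-like
print(weight_goldstein_einhorn(probs, elevation=0.8, curvature=0.6).round(3))  # "sad"-like
```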
CProb: a computational tool for conducting conditional probability analysis.
Hollister, Jeffrey W; Walker, Henry A; Paul, John F
2008-01-01
Conditional probability is the probability of observing one event given that another event has occurred. In an environmental context, conditional probability helps to assess the association between an environmental contaminant (i.e., the stressor) and the ecological condition of a resource (i.e., the response). These analyses, when combined with controlled experiments and other methodologies, show great promise in evaluating ecological conditions from observational data and in defining water quality and other environmental criteria. Current applications of conditional probability analysis (CPA) are largely done via scripts or cumbersome spreadsheet routines, which may prove daunting to end-users and do not provide access to the underlying scripts. Combining spreadsheets with scripts eases computation through a familiar interface (i.e., Microsoft Excel) and creates a transparent process through full accessibility to the scripts. With this in mind, we developed a software application, CProb, as an Add-in for Microsoft Excel with R, R(D)com Server, and Visual Basic for Applications. CProb calculates and plots scatterplots, empirical cumulative distribution functions, and conditional probability. In this short communication, we describe CPA, our motivation for developing a CPA tool, and our implementation of CPA as a Microsoft Excel Add-in. Further, we illustrate the use of our software with two examples: a water quality example and a landscape example. CProb is freely available for download at http://www.epa.gov/emap/nca/html/regions/cprob.
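The core computation behind conditional probability analysis can be sketched as below: for each observed stressor level x, estimate the probability that the response exceeds a criterion given that the stressor is at least x. This is an illustrative stand-in, not CProb itself.

```python
import numpy as np

def conditional_exceedance(stressor, response, response_threshold):
    """For each stressor level x, estimate P(response > threshold | stressor >= x).
    Sketch only; data and threshold are hypothetical."""
    stressor = np.asarray(stressor, float)
    response = np.asarray(response, float)
    levels = np.sort(np.unique(stressor))
    probs = [np.mean(response[stressor >= x] > response_threshold) for x in levels]
    return levels, np.array(probs)

# Hypothetical stressor (e.g., contaminant concentration) and response scores
rng = np.random.default_rng(1)
stressor = rng.uniform(0, 10, 200)
response = stressor + rng.normal(0, 2, 200)
levels, probs = conditional_exceedance(stressor, response, response_threshold=8.0)
print(levels[::50].round(2), probs[::50].round(2))
```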
Artificial stimulation of auroral electron acceleration by intense field aligned currents
NASA Technical Reports Server (NTRS)
Holmgren, G.; Bostrom, R.; Kelley, M. C.; Kintner, P. M.; Lundin, R.; Bering, E. A.; Sheldon, W. R.; Fahleson, U. V.
1979-01-01
A cesium-doped high-explosive charge was detonated at 165 km altitude in the auroral ionosphere during quiet conditions. An Alfven wave pulse with a 200-mV/m electric field was observed, with the peak occurring 135 ms after the explosion at a distance of about 1 km. The count rate of fixed-energy 2-keV electron detectors abruptly increased at 140 ms, peaked at 415 ms, and indicated a downward field-aligned beam of accelerated electrons. An anomalously intense field-aligned beam of backscattered electrons was also detected. The acceleration is interpreted as due to production of an electrostatic shock or double layer between 300 and 800 km altitude. The structure was probably formed by an instability of the intense field-aligned currents in the Alfven wave launched by the charge-separation electric field due to the explosion.
Microwave backscattering theory and active remote sensing of the ocean surface
NASA Technical Reports Server (NTRS)
Brown, G. S.; Miller, L. S.
1977-01-01
The status is reviewed of electromagnetic scattering theory relative to the interpretation of microwave remote sensing data acquired from spaceborne platforms over the ocean surface. Particular emphasis is given to the assumptions which are either implicit or explicit in the theory. The multiple scale scattering theory developed during this investigation is extended to non-Gaussian surface statistics. It is shown that the important statistic for the case is the probability density function of the small scale heights conditioned on the large scale slopes; this dependence may explain the anisotropic scattering measurements recently obtained with the AAFE Radscat. It is noted that present surface measurements are inadequate to verify or reject the existing scattering theories. Surface measurements are recommended for qualifying sensor data from radar altimeters and scatterometers. Additional scattering investigations are suggested for imaging type radars employing synthetically generated apertures.
Spatial structures arising along a surface wave produced plasma column: an experimental study
NASA Astrophysics Data System (ADS)
Atanassov, V.; Mateev, E.
2007-04-01
The formation of spatial structures in high-frequency and microwave discharges has been known for several decades. Nevertheless it still raises increased interest, probably due to the variety of the observed phenomena and the lack of adequate and systematic theoretical interpretation. In this paper we present preliminary results on observation of spatial structures appearing along a surface wave sustained plasma column. The experiments have been performed in noble gases (xenon and neon) at low to intermediate pressure and the surface wave has been launched by a surfatron. Under these conditions we have observed and documented: i) appearance of stationary plasma rings; ii) formation of standing-wave striationlike patterns; iii) contraction of the plasma column; iv) plasma column transition into moving plasma balls and filaments. Some of the existing theoretical considerations of these phenomena are reviewed and discussed.
Brody, Stuart; Krüger, Tillmann H C
2006-03-01
Research indicates that prolactin increases following orgasm are involved in a feedback loop that serves to decrease arousal through inhibitory central dopaminergic and probably peripheral processes. The magnitude of post-orgasmic prolactin increase is thus a neurohormonal index of sexual satiety. Using data from three studies of men and women engaging in masturbation or penile-vaginal intercourse to orgasm in the laboratory, we report that for both sexes (adjusted for prolactin changes in a non-sexual control condition), the magnitude of prolactin increase following intercourse is 400% greater than that following masturbation. The results are interpreted as an indication of intercourse being more physiologically satisfying than masturbation, and discussed in light of prior research reporting greater physiological and psychological benefits associated with coitus than with any other sexual activities.
NASA Technical Reports Server (NTRS)
Ogallagher, J. J.
1973-01-01
A simple one-dimensional time-dependent diffusion-convection model for the modulation of cosmic rays is presented. This model predicts that the observed intensity at a given time is approximately equal to the intensity given by the time-independent diffusion-convection solution under the interplanetary conditions which existed a time τ in the past, U(t_0) = U_s(t_0 - τ), where τ is the average time spent by a particle inside the modulating cavity. Delay times in excess of several hundred days are possible with reasonable modulation parameters. Interpretation of phase lags observed during the 1969 to 1970 solar maximum in terms of this model suggests that the modulating region is probably not less than 10 a.u. and may be as much as 35 a.u. in extent.
Probability in reasoning: a developmental test on conditionals.
Barrouillet, Pierre; Gauffroy, Caroline
2015-04-01
Probabilistic theories have been claimed to constitute a new paradigm for the psychology of reasoning. A key assumption of these theories is captured by what they call the Equation, the hypothesis that the meaning of the conditional is probabilistic in nature and that the probability of If p then q is the conditional probability, in such a way that P(if p then q)=P(q|p). Using the probabilistic truth-table task in which participants are required to evaluate the probability of If p then q sentences, the present study explored the pervasiveness of the Equation through ages (from early adolescence to adulthood), types of conditionals (basic, causal, and inducements) and contents. The results reveal that the Equation is a late developmental achievement only endorsed by a narrow majority of educated adults for certain types of conditionals depending on the content they involve. Age-related changes in evaluating the probability of all the conditionals studied closely mirror the development of truth-value judgements observed in previous studies with traditional truth-table tasks. We argue that our modified mental model theory can account for this development, and hence for the findings related with the probability task, which do not consequently support the probabilistic approach of human reasoning over alternative theories. Copyright © 2014 Elsevier B.V. All rights reserved.
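A toy version of the probabilistic truth-table task: given a frequency distribution of cases, the Equation says the judged probability of "if p then q" should equal the conditional probability computed from the p-cases alone; the counts are arbitrary.

```python
# Hypothetical counts of the four logical cases in a truth-table task.
counts = {"pq": 6, "p_notq": 2, "notp_q": 5, "notp_notq": 7}
# Under the Equation, P(if p then q) = P(q | p), which ignores the not-p cases.
p_q_given_p = counts["pq"] / (counts["pq"] + counts["p_notq"])
print(f"P(q | p) = {p_q_given_p:.2f}")
```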
Prescriptive models to support decision making in genetics.
Pauker, S G; Pauker, S P
1987-01-01
Formal prescriptive models can help patients and clinicians better understand the risks and uncertainties they face and better formulate well-reasoned decisions. Using Bayes rule, the clinician can interpret pedigrees, historical data, physical findings and laboratory data, providing individualized probabilities of various diagnoses and outcomes of pregnancy. With the advent of screening programs for genetic disease, it becomes increasingly important to consider the prior probabilities of disease when interpreting an abnormal screening test result. Decision trees provide a convenient formalism for structuring diagnostic, therapeutic and reproductive decisions; such trees can also enhance communication between clinicians and patients. Utility theory provides a mechanism for patients to understand the choices they face and to communicate their attitudes about potential reproductive outcomes in a manner which encourages the integration of those attitudes into appropriate decisions. Using a decision tree, the relevant probabilities and the patients' utilities, physicians can estimate the relative worth of various medical and reproductive options by calculating the expected utility of each. By performing relevant sensitivity analyses, clinicians and patients can understand the impact of various soft data, including the patients' attitudes toward various health outcomes, on the decision making process. Formal clinical decision analytic models can provide deeper understanding and improved decision making in clinical genetics.
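The point about prior probabilities and screening tests is the familiar Bayes' rule calculation sketched below; the prevalences, sensitivity and specificity are hypothetical.

```python
def positive_predictive_value(prevalence, sensitivity, specificity):
    """Bayes' rule applied to an abnormal screening result: the post-test
    probability of disease depends strongly on the prior (prevalence)."""
    p_pos = sensitivity * prevalence + (1 - specificity) * (1 - prevalence)
    return sensitivity * prevalence / p_pos

# The same test gives very different post-test probabilities at different priors
for prev in (0.001, 0.01, 0.1):
    print(prev, round(positive_predictive_value(prev, sensitivity=0.99, specificity=0.95), 3))
```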
Modelling uncertainty with generalized credal sets: application to conjunction and decision
NASA Astrophysics Data System (ADS)
Bronevich, Andrey G.; Rozenberg, Igor N.
2018-01-01
To model conflict, non-specificity and contradiction in information, upper and lower generalized credal sets are introduced. Any upper generalized credal set is a convex subset of plausibility measures interpreted as lower probabilities whose bodies of evidence consist of singletons and a certain event. Analogously, contradiction is modelled in the theory of evidence by a belief function that is greater than zero at empty set. Based on generalized credal sets, we extend the conjunctive rule for contradictory sources of information, introduce constructions like natural extension in the theory of imprecise probabilities and show that the model of generalized credal sets coincides with the model of imprecise probabilities if the profile of a generalized credal set consists of probability measures. We give ways how the introduced model can be applied to decision problems.
Perceptions and Expected Immediate Reactions to Severe Storm Displays.
Jon, Ihnji; Huang, Shih-Kai; Lindell, Michael K
2017-11-09
The National Weather Service has adopted warning polygons that more specifically indicate the risk area than its previous county-wide warnings. However, these polygons are not defined in terms of numerical strike probabilities (p_s). To better understand people's interpretations of warning polygons, 167 participants were shown 23 hypothetical scenarios in one of three information conditions: polygon-only (Condition A), polygon + tornadic storm cell (Condition B), and polygon + tornadic storm cell + flanking nontornadic storm cells (Condition C). Participants judged each polygon's p_s and reported the likelihood of taking nine different response actions. The polygon-only condition replicated the results of previous studies; p_s was highest at the polygon's centroid and declined in all directions from there. The two conditions displaying storm cells differed from the polygon-only condition only in having p_s just as high at the polygon's edge nearest the storm cell as at its centroid. Overall, p_s values were positively correlated with expectations of continuing normal activities, seeking information from social sources, seeking shelter, and evacuating by car. These results indicate that participants make more appropriate p_s judgments when polygons are presented in their natural context of radar displays than when they are presented in isolation. However, the fact that p_s judgments had moderately positive correlations with both sheltering (a generally appropriate response) and evacuation (a generally inappropriate response) suggests that experiment participants experience the same ambivalence about these two protective actions as people threatened by actual tornadoes. © 2017 Society for Risk Analysis.
Pavlovian conditioning enhances resistance to disruption of dogs performing an odor discrimination.
Hall, Nathaniel J; Smith, David W; Wynne, Clive D L
2015-05-01
Domestic dogs are used to aid in the detection of a variety of substances such as narcotics and explosives. Under real-world detection situations there are many variables that may disrupt the dog's performance. Prior research on behavioral momentum theory suggests that higher rates of reinforcement produce greater resistance to disruption, and that this is heavily influenced by the stimulus-reinforcer relationship. The present study tests the Pavlovian interpretation of resistance to change using dogs engaged in an odor discrimination task. Dogs were trained on two odor discriminations that alternated every six trials akin to a multiple schedule in which the reinforcement probability for a correct response was always 1. Dogs then received several sessions of either odor Pavlovian conditioning to the S+ of one odor discrimination (Pavlovian group) or explicitly unpaired exposure to the S+ of one odor discrimination (Unpaired group). The remaining odor discrimination pair for each dog always remained an unexposed control. Resistance to disruption was assessed under presession feeding, a food-odor disruptor condition, and extinction, with baseline sessions intervening between disruption conditions. Equivalent baseline detection rates were observed across experimental groups and odorant pairs. Under disruption conditions, Pavlovian conditioning led to enhanced resistance to disruption of detection performance compared to the unexposed control odor discrimination. Unpaired odor conditioning did not influence resistance to disruption. These results suggest that changes in Pavlovian contingencies are sufficient to influence resistance to change. © Society for the Experimental Analysis of Behavior.
An Attempt to Target Anxiety Sensitivity via Cognitive Bias Modification
Clerkin, Elise M.; Beard, Courtney; Fisher, Christopher R.; Schofield, Casey A
2015-01-01
Our goals in the present study were to test an adaptation of a Cognitive Bias Modification program to reduce anxiety sensitivity, and to evaluate the causal relationships between interpretation bias of physiological cues, anxiety sensitivity, and anxiety and avoidance associated with interoceptive exposures. Participants with elevated anxiety sensitivity who endorsed having a panic attack or limited symptom attack were randomly assigned to either an Interpretation Modification Program (IMP; n = 33) or a Control (n = 32) condition. During interpretation modification training (via the Word Sentence Association Paradigm), participants read short sentences describing ambiguous panic-relevant physiological and cognitive symptoms and were trained to endorse benign interpretations and reject threatening interpretations associated with these cues. Compared to the Control condition, IMP training successfully increased endorsements of benign interpretations and decreased endorsements of threatening interpretations at visit 2. Although self-reported anxiety sensitivity decreased from pre-selection to visit 1 and from visit 1 to visit 2, the reduction was not larger for the experimental versus control condition. Further, participants in IMP (vs. Control) training did not experience less anxiety and avoidance associated with interoceptive exposures. In fact, there was some evidence that those in the Control condition experienced less avoidance following training. Potential explanations for the null findings, including problems with the benign panic-relevant stimuli and limitations with the control condition, are discussed. PMID:25692491
Rita, Angelo; Borghetti, Marco; Todaro, Luigi; Saracino, Antonio
2016-01-01
In the Mediterranean region, the widely predicted rise in temperature, change in the precipitation pattern, and increase in the frequency of extreme climatic events are expected to alter the shape of ecological communities and to affect plant physiological processes that regulate ecosystem functioning. Although changes in mean values are important, there is increasing evidence that plant distribution, survival, and productivity respond to extremes rather than to the average climatic condition. The present study aims to assess the effects of both mean and extreme climatic conditions on radial growth and functional anatomical traits using long-term tree-ring time series of two co-existing Quercus spp. from a drought-prone site in Southern Italy. In particular, this is the first attempt to apply the Generalized Additive Model for Location, Scale, and Shape (GAMLSS) technique and Bayesian modeling procedures to a xylem traits data set, with the aim of (i) detecting non-linear long-term responses to climate and (ii) exploring relationships between climate extremes and xylem trait variability in terms of probability of occurrence. This study demonstrates the usefulness of long-term xylem trait chronologies as records of environmental conditions at annual resolution. Statistical analyses revealed that most of the variability in tree-ring width and specific hydraulic conductivity might be explained by cambial age. Additionally, results highlighted stronger relationships between xylem traits and climate variability than for tree-ring width, also supporting the evidence that plant hydraulic traits are closely linked to local climate extremes rather than average climatic conditions. We reported that the probability of extreme departure in specific hydraulic conductivity (Ks) rises at extreme values of the Standardized Precipitation Index (SPI). Therefore, changing frequency or intensity of extreme events might overcome the adaptive limits of vascular transport, resulting in a substantial reduction of hydraulic functionality and, hence, an increased incidence of xylem dysfunctions. PMID:27532008
People Like Logical Truth: Testing the Intuitive Detection of Logical Value in Basic Propositions.
Nakamura, Hiroko; Kawaguchi, Jun
2016-01-01
Recent studies on logical reasoning have suggested that people are intuitively aware of the logical validity of syllogisms or that they intuitively detect conflict between heuristic responses and logical norms via slight changes in their feelings. According to logical intuition studies, logically valid or heuristic logic no-conflict reasoning is fluently processed and induces positive feelings without conscious awareness. One criticism states that such effects of logicality disappear when confounding factors such as the content of syllogisms are controlled. The present study used abstract propositions and tested whether people intuitively detect logical value. Experiment 1 presented four logical propositions (conjunctive, biconditional, conditional, and material implications) regarding a target case and asked the participants to rate the extent to which they liked the statement. Experiment 2 tested the effects of matching bias, as well as intuitive logic, on the reasoners' feelings by manipulating whether the antecedent or consequent (or both) of the conditional was affirmed or negated. The results showed that both logicality and matching bias affected the reasoners' feelings, and people preferred logically true targets over logically false ones for all forms of propositions. These results suggest that people intuitively detect what is true from what is false during abstract reasoning. Additionally, a Bayesian mixed model meta-analysis of conditionals indicated that people's intuitive interpretation of the conditional "if p then q" fits better with the conditional probability, q given p.
Simulating reservoir lithologies by an actively conditioned Markov chain model
NASA Astrophysics Data System (ADS)
Feng, Runhai; Luthi, Stefan M.; Gisolf, Dries
2018-06-01
The coupled Markov chain model can be used to simulate reservoir lithologies between wells, by conditioning them on the observed data in the cored wells. However, with this method, only the state at the same depth as the current cell is going to be used for conditioning, which may be a problem if the geological layers are dipping. This will cause the simulated lithological layers to be broken or to become discontinuous across the reservoir. In order to address this problem, an actively conditioned process is proposed here, in which a tolerance angle is predefined. The states contained in the region constrained by the tolerance angle will be employed for conditioning in the horizontal chain first, after which a coupling concept with the vertical chain is implemented. In order to use the same horizontal transition matrix for different future states, the tolerance angle has to be small. This allows the method to work in reservoirs without complex structures caused by depositional processes or tectonic deformations. Directional artefacts in the modeling process are avoided through a careful choice of the simulation path. The tolerance angle and dipping direction of the strata can be obtained from a correlation between wells, or from seismic data, which are available in most hydrocarbon reservoirs, either by interpretation or by inversion that can also assist the construction of a horizontal probability matrix.
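The conditioning idea described above can be illustrated with a much simpler, generic calculation that is not the paper's actively conditioned scheme: for a plain first-order Markov chain, the probability of a facies state at an interior cell, given the states observed at two bounding wells, follows from transition-matrix powers. The three-state transition matrix below is purely hypothetical.

```python
import numpy as np

# Hypothetical 3-facies transition matrix (e.g., sand, shale, carbonate); illustrative only.
P = np.array([[0.6, 0.3, 0.1],
              [0.2, 0.7, 0.1],
              [0.1, 0.2, 0.7]])

def conditional_state_probs(P, start, end, n_steps, m):
    """P(X_m = k | X_0 = start, X_n = end) for a first-order Markov chain,
    i.e. conditioning an interior cell on states observed at two wells."""
    Pm = np.linalg.matrix_power(P, m)            # m steps from the start well
    Pr = np.linalg.matrix_power(P, n_steps - m)  # remaining steps to the end well
    numer = Pm[start, :] * Pr[:, end]
    return numer / numer.sum()

# Probability distribution of the facies 4 cells from a sand well, 6 cells from a carbonate well.
print(conditional_state_probs(P, start=0, end=2, n_steps=10, m=4))
```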
NASA Astrophysics Data System (ADS)
Haris, A.; Novriyani, M.; Suparno, S.; Hidayat, R.; Riyanto, A.
2017-07-01
This study presents the integration of seismic stochastic inversion and multi-attribute analysis for delineating the reservoir distribution, in terms of lithology and porosity, in the formation within the depth interval between the Top Sihapas and the Top Pematang. The method used is stochastic inversion integrated with seismic multi-attribute analysis through a Probabilistic Neural Network (PNN). Stochastic methods are used to predict the probability of sandstone by mapping the inverted impedance over 50 realizations, which yields a robust probability estimate. Stochastic seismic inversion is also more interpretive because it directly gives the value of the rock property. Our experiment shows that the acoustic impedance (AI) from stochastic inversion captures a more diverse range of uncertainty, so that the probability values are close to the actual values. The resulting AI is then used as an input to the multi-attribute analysis, which is used to predict the gamma-ray, density and porosity logs. To select the attributes that are used, a stepwise regression algorithm is applied, and the selected attributes feed the PNN. The PNN method is chosen because it yields the best correlation among the neural network methods tested. Finally, we interpret the products of the multi-attribute analysis, in the form of pseudo-gamma-ray, density and pseudo-porosity volumes, to delineate the reservoir distribution. Our interpretation shows that a structural trap is identified in the southeastern part of the study area, along the anticline.
A probabilistic model to predict clinical phenotypic traits from genome sequencing.
Chen, Yun-Ching; Douville, Christopher; Wang, Cheng; Niknafs, Noushin; Yeo, Grace; Beleva-Guthrie, Violeta; Carter, Hannah; Stenson, Peter D; Cooper, David N; Li, Biao; Mooney, Sean; Karchin, Rachel
2014-09-01
Genetic screening is becoming possible on an unprecedented scale. However, its utility remains controversial. Although most variant genotypes cannot be easily interpreted, many individuals nevertheless attempt to interpret their genetic information. Initiatives such as the Personal Genome Project (PGP) and Illumina's Understand Your Genome are sequencing thousands of adults, collecting phenotypic information and developing computational pipelines to identify the most important variant genotypes harbored by each individual. These pipelines consider database and allele frequency annotations and bioinformatics classifications. We propose that the next step will be to integrate these different sources of information to estimate the probability that a given individual has specific phenotypes of clinical interest. To this end, we have designed a Bayesian probabilistic model to predict the probability of dichotomous phenotypes. When applied to a cohort from PGP, predictions of Gilbert syndrome, Graves' disease, non-Hodgkin lymphoma, and various blood groups were accurate, as individuals manifesting the phenotype in question exhibited the highest, or among the highest, predicted probabilities. Thirty-eight PGP phenotypes (26%) were predicted with area-under-the-ROC curve (AUC)>0.7, and 23 (15.8%) of these were statistically significant, based on permutation tests. Moreover, in a Critical Assessment of Genome Interpretation (CAGI) blinded prediction experiment, the models were used to match 77 PGP genomes to phenotypic profiles, generating the most accurate prediction of 16 submissions, according to an independent assessor. Although the models are currently insufficiently accurate for diagnostic utility, we expect their performance to improve with growth of publicly available genomics data and model refinement by domain experts.
Internal Medicine residents use heuristics to estimate disease probability.
Phang, Sen Han; Ravani, Pietro; Schaefer, Jeffrey; Wright, Bruce; McLaughlin, Kevin
2015-01-01
Training in Bayesian reasoning may have limited impact on accuracy of probability estimates. In this study, our goal was to explore whether residents previously exposed to Bayesian reasoning use heuristics rather than Bayesian reasoning to estimate disease probabilities. We predicted that if residents use heuristics then post-test probability estimates would be increased by non-discriminating clinical features or a high anchor for a target condition. We randomized 55 Internal Medicine residents to different versions of four clinical vignettes and asked them to estimate probabilities of target conditions. We manipulated the clinical data for each vignette to be consistent with either 1) using a representative heuristic, by adding non-discriminating prototypical clinical features of the target condition, or 2) using anchoring with adjustment heuristic, by providing a high or low anchor for the target condition. When presented with additional non-discriminating data the odds of diagnosing the target condition were increased (odds ratio (OR) 2.83, 95% confidence interval [1.30, 6.15], p = 0.009). Similarly, the odds of diagnosing the target condition were increased when a high anchor preceded the vignette (OR 2.04, [1.09, 3.81], p = 0.025). Our findings suggest that despite previous exposure to the use of Bayesian reasoning, residents use heuristics, such as the representative heuristic and anchoring with adjustment, to estimate probabilities. Potential reasons for attribute substitution include the relative cognitive ease of heuristics vs. Bayesian reasoning or perhaps residents in their clinical practice use gist traces rather than precise probability estimates when diagnosing.
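For contrast with the heuristic estimates described above, the normative Bayesian update can be written in odds form, where a non-discriminating feature has a likelihood ratio of 1 and therefore should not move the post-test probability. The sketch below uses hypothetical numbers, not the study's vignettes.

```python
def post_test_probability(pre_test_prob, likelihood_ratio):
    """Bayes in odds form: posterior odds = prior odds * likelihood ratio."""
    prior_odds = pre_test_prob / (1.0 - pre_test_prob)
    post_odds = prior_odds * likelihood_ratio
    return post_odds / (1.0 + post_odds)

# Hypothetical example: 20% pre-test probability, a finding with LR = 3,
# and a non-discriminating finding with LR = 1 (which should leave the estimate unchanged).
print(post_test_probability(0.20, 3.0))  # ~0.43
print(post_test_probability(0.20, 1.0))  # 0.20, unchanged
```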
NASA Astrophysics Data System (ADS)
Dlugach, Zh. M.; Korablev, O. I.; Morozhenko, A. V.; Moroz, V. I.; Petrova, E. V.; Rodin, A. V.
2003-01-01
Atmospheric aerosols play an important role in forming the Martian climate. However, the basic physical properties of the Martian aerosols are still poorly known; there are many contradictions in their estimates. We present an analytical overview of the published results and potentialities of various methods. We consider mineral dust. Zonally averaged data obtained from mapping IR instruments (TES and IRTM) give the optical thickness of mineral aerosols τ9 = 0.05-0.1 in the 9-μm band for quiet atmospheric conditions. There is a problem of comparing these estimates with those obtained in the visible spectral range. We suggest that the commonly used ratio τvis/τ9 > 2 depends on the interpretation and it may actually be smaller. The ratio τvis/τ9 ~ 1 is in better agreement with the IRIS data (materials like montmorillonite). If we assume that τvis/τ9 = 1 and take into account the nonspherical particle shape, then the interpretation of ground-based integrated polarimetric observations (τ < 0.04) can be reconciled with IR measurements from the orbit. However, for thin layers, the sensitivity of both methods to the optical thickness is poorly understood: on the one hand, polarimetry depends on the cloud cover and, on the other hand, the interpretation of IR measurements requires that the atmospheric temperature profile and the surface temperature and emissivity be precisely known. For quiet atmospheric conditions, the local optical-thickness estimates obtained by the Bouguer-Lambert-Beer method and from the sky brightness measured from Viking 1 and 2 and Mars Pathfinder landers are much larger: τ = 0.3-0.6. Estimates of the contrasts in images from the Viking orbiters yield the same values. Thus, there is still a factor of 3 to 10 difference between different groups of optical-thickness estimates for the quiet atmosphere. This difference is probably explained by the contribution of condensation clouds and/or by local/time variations.
Implicit interpretation biases affect emotional vulnerability: a training study.
Tran, Tanya B; Siemer, Matthias; Joormann, Jutta
2011-04-01
Cognitive theories of emotion propose that the interpretation of emotion-eliciting situations crucially shapes affective responses. Implicit or automatic biases in these interpretations may hinder emotion regulation and thereby increase risk for the onset and maintenance of psychological disorders. In this study, participants were randomly assigned to a positive or negative interpretation bias training using ambiguous social scenarios. After the completion of the training, a stress task was administered and changes in positive and negative affect and self-esteem were assessed. The results demonstrate that the interpretation bias training was successful in that participants exhibited a tendency to interpret novel scenarios in accordance with their training condition. Importantly, the positive training condition also had a protective effect on self-esteem. Participants in this condition did not exhibit a decrease in self-esteem after the stress task, whereas participants in the negative condition did. These results demonstrate that implicit cognitive biases can be trained and that this training affects self-esteem. Implications of these findings for research on psychopathology and emotion regulation are discussed. © 2011 Psychology Press, an imprint of the Taylor & Francis Group, an Informa business
An activity canyon characterization of the pharmacological topography.
Kulkarni, Varsha S; Wild, David J
2016-01-01
Highly chemically similar drugs usually possess similar biological activities, but sometimes, small changes in chemistry can result in a large difference in biological effects. Chemically similar drug pairs that show extreme deviations in activity represent distinctive drug interactions having important implications. These associations between chemical and biological similarity are studied as discontinuities in activity landscapes. Particularly, activity cliffs are quantified by the drop in similar activity of chemically similar drugs. In this paper, we construct a landscape using a large drug-target network and consider the rises in similarity and variation in activity along the chemical space. Detailed analysis of structure and activity gives a rigorous quantification of distinctive pairs and the probability of their occurrence. We analyze pairwise similarity (s) and variation (d) in activity of drugs on proteins. Interactions between drugs are quantified by considering pairwise s and d weights jointly with corresponding chemical similarity (c) weights. Similarity and variation in activity are measured as the number of common and uncommon targets of two drugs respectively. Distinctive interactions occur between drugs having high c and above (below) average d (s). Computation of predicted probability of distinctiveness employs joint probability of c, s and of c, d assuming independence of structure and activity. Predictions conform with the observations at different levels of distinctiveness. Results are validated on the data used and another drug ensemble. In the landscape, while s and d decrease as c increases, d maintains value more than s. c ∈ [0.3, 0.64] is the transitional region where rises in d are significantly greater than drops in s. It is fascinating that distinctive interactions filtered with high d and low s are different in nature. It is crucial that high c interactions are more probable of having above average d than s. Identification of distinctive interactions is better with high d than low s. These interactions belong to diverse classes. d is greatest between drugs and analogs prepared for treatment of same class of ailments but with different therapeutic specifications. In contrast, analogs having low s would treat ailments from distinct classes. Intermittent spikes in d along the axis of c represent canyons in the activity landscape. This new representation accounts for distinctiveness through relative rises in s and d. It provides a mathematical basis for predicting the probability of occurrence of distinctiveness. It identifies the drug pairs at varying levels of distinctiveness and non-distinctiveness. The predicted probability formula is validated even if data approximately satisfy the conditions of its construction. Also, the postulated independence of structure and activity is of little significance to the overall assessment. The difference in distinctive interactions obtained by s and d highlights the importance of studying both of them, and reveals how the choice of measurement can affect the interpretation. The methods in this paper can be used to interpret whether or not drug interactions are distinctive and the probability of their occurrence. Practitioners and researchers can rely on this identification for quantitative modeling and assessment.
Preface of the special issue quantum foundations: information approach
2016-01-01
This special issue is based on the contributions of a group of top experts in quantum foundations and quantum information and probability. It sheds light on a number of interpretational, mathematical and experimental problems of quantum theory. PMID:27091161
Affective and cognitive factors influencing sensitivity to probabilistic information.
Tyszka, Tadeusz; Sawicki, Przemyslaw
2011-11-01
In study 1 different groups of female students were randomly assigned to one of four probabilistic information formats. Five different levels of probability of a genetic disease in an unborn child were presented to participants (within-subject factor). After the presentation of the probability level, participants were requested to indicate the acceptable level of pain they would tolerate to avoid the disease (in their unborn child), their subjective evaluation of the disease risk, and their subjective evaluation of being worried by this risk. The results of study 1 confirmed the hypothesis that an experience-based probability format decreases the subjective sense of worry about the disease, thus, presumably, weakening the tendency to overrate the probability of rare events. Study 2 showed that for the emotionally laden stimuli, the experience-based probability format resulted in higher sensitivity to probability variations than other formats of probabilistic information. These advantages of the experience-based probability format are interpreted in terms of two systems of information processing: the rational deliberative versus the affective experiential and the principle of stimulus-response compatibility. © 2011 Society for Risk Analysis.
Conditional, Time-Dependent Probabilities for Segmented Type-A Faults in the WGCEP UCERF 2
Field, Edward H.; Gupta, Vipin
2008-01-01
This appendix presents elastic-rebound-theory (ERT) motivated time-dependent probabilities, conditioned on the date of last earthquake, for the segmented type-A fault models of the 2007 Working Group on California Earthquake Probabilities (WGCEP). These probabilities are included as one option in the WGCEP's Uniform California Earthquake Rupture Forecast 2 (UCERF 2), with the other options being time-independent Poisson probabilities and an 'Empirical' model based on observed seismicity rate changes. A more general discussion of the pros and cons of all methods for computing time-dependent probabilities, as well as the justification of those chosen for UCERF 2, are given in the main body of this report (and the 'Empirical' model is also discussed in Appendix M). What this appendix addresses is the computation of conditional, time-dependent probabilities when both single- and multi-segment ruptures are included in the model. Computing conditional probabilities is relatively straightforward when a fault is assumed to obey strict segmentation in the sense that no multi-segment ruptures occur (e.g., WGCEP (1988, 1990) or see Field (2007) for a review of all previous WGCEPs; from here we assume basic familiarity with conditional probability calculations). However, and as we'll see below, the calculation is not straightforward when multi-segment ruptures are included, in essence because we are attempting to apply a point-process model to a non point process. The next section gives a review and evaluation of the single- and multi-segment rupture probability-calculation methods used in the most recent statewide forecast for California (WGCEP UCERF 1; Petersen et al., 2007). We then present results for the methodology adopted here for UCERF 2. We finish with a discussion of issues and possible alternative approaches that could be explored and perhaps applied in the future. A fault-by-fault comparison of UCERF 2 probabilities with those of previous studies is given in the main part of this report.
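The single-segment case mentioned above reduces to a standard renewal-model calculation: the probability of rupture in the next interval, conditioned on the elapsed time since the last event, is a ratio of recurrence-distribution quantities, P(t < T <= t+Δt | T > t) = (F(t+Δt) - F(t)) / (1 - F(t)). The sketch below uses a lognormal recurrence model with hypothetical parameters; it is not the WGCEP multi-segment methodology, nor the Brownian Passage Time model often used in practice.

```python
import numpy as np
from scipy.stats import lognorm

def conditional_rupture_prob(t_elapsed, dt, mean_recurrence, aperiodicity):
    """P(event in (t, t+dt] | no event by t) for a lognormal renewal model."""
    sigma = np.sqrt(np.log(1.0 + aperiodicity**2))    # lognormal shape from coefficient of variation
    scale = mean_recurrence / np.exp(0.5 * sigma**2)  # exp(mu), chosen so the mean matches
    F = lognorm(s=sigma, scale=scale).cdf
    return (F(t_elapsed + dt) - F(t_elapsed)) / (1.0 - F(t_elapsed))

# Hypothetical segment: mean recurrence 150 yr, aperiodicity 0.5, 100 yr since the last event.
print(conditional_rupture_prob(t_elapsed=100.0, dt=30.0, mean_recurrence=150.0, aperiodicity=0.5))
```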
Deposition and re-erosion studies by means of local impurity injection in TEXTOR
NASA Astrophysics Data System (ADS)
Textor Team Kirschner, A.; Kreter, A.; Wienhold, P.; Brezinsek, S.; Coenen, J. W.; Esser, H. G.; Pospieszczyk, A.; Schulz, Ch.; Breuer, U.; Borodin, D.; Clever, M.; Ding, R.; Galonska, A.; Huber, A.; Litnovsky, A.; Matveev, D.; Ohya, K.; Philipps, V.; Samm, U.; Schmitz, O.; Schweer, B.; Stoschus, H.
2011-08-01
Pioneering experiments to study local erosion and deposition processes have been carried out in TEXTOR by injecting 13C-marked hydrocarbons (CH4 and C2H4) as well as silane (SiD4) and tungsten-hexafluoride (WF6) through test limiters exposed to the edge plasma. The influence of various limiter materials (C, W, Mo) and surface roughness, different geometries (spherical or roof-like) and local plasma parameters has been studied. Depending on these conditions the local deposition efficiency of injected species varies between 0.1% and 9% - the largest deposition has been found for 13CH4 injection through an unpolished, spherical C test limiter under ohmic plasma conditions. The most striking result is that ERO modelling cannot reproduce these low deposition efficiencies using the common assumptions on sticking probabilities and physical and chemical re-erosion yields. As an explanation, large re-erosion due to the background plasma and possibly low "effective sticking" of returning species is invoked. This has been interpreted as enhanced re-erosion of re-deposits under simultaneous impact of high ion fluxes from the plasma background.
Interpreting Medical Information Using Machine Learning and Individual Conditional Expectation.
Nohara, Yasunobu; Wakata, Yoshifumi; Nakashima, Naoki
2015-01-01
Recently, machine-learning techniques have spread to many fields. However, machine learning is still not popular in the medical research field because its results are difficult to interpret. In this paper, we introduce a method for interpreting medical information using machine-learning techniques. The method gives a new explanation of the partial dependence plot and the individual conditional expectation plot for the medical research field.
Statistical Short-Range Guidance for Peak Wind Speed Forecasts at Edwards Air Force Base, CA
NASA Technical Reports Server (NTRS)
Dreher, Joseph; Crawford, Winifred; Lafosse, Richard; Hoeth, Brian; Burns, Kerry
2008-01-01
The peak winds near the surface are an important forecast element for Space Shuttle landings. As defined in the Shuttle Flight Rules (FRs), there are peak wind thresholds that cannot be exceeded in order to ensure the safety of the shuttle during landing operations. The National Weather Service Spaceflight Meteorology Group (SMG) is responsible for weather forecasts for all shuttle landings. They indicate peak winds are a challenging parameter to forecast. To alleviate the difficulty in making such wind forecasts, the Applied Meteorology Unit (AMU) developed a personal computer based graphical user interface (GUI) for displaying peak wind climatology and probabilities of exceeding peak-wind thresholds for the Shuttle Landing Facility (SLF) at Kennedy Space Center. However, the shuttle must land at Edwards Air Force Base (EAFB) in southern California when weather conditions at Kennedy Space Center in Florida are not acceptable, so SMG forecasters requested that a similar tool be developed for EAFB. Marshall Space Flight Center (MSFC) personnel archived and performed quality control of 2-minute average and 10-minute peak wind speeds at each tower adjacent to the main runway at EAFB from 1997-2004. They calculated wind climatologies and probabilities of average peak wind occurrence based on the average speed. The climatologies were calculated for each tower and month, and were stratified by hour, direction, and direction/hour. For the probabilities of peak wind occurrence, MSFC calculated empirical and modeled probabilities of meeting or exceeding specific 10-minute peak wind speeds using probability density functions. The AMU obtained and reformatted the data into Microsoft Excel PivotTables, which allows users to display different values with point-click-drag techniques. The GUI was then created from the PivotTables using Visual Basic for Applications code. The GUI is run through a macro within Microsoft Excel and allows forecasters to quickly display and interpret peak wind climatology and likelihoods in a fast-paced operational environment. A summary of how the peak wind climatologies and probabilities were created and an overview of the GUI will be presented.
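As a rough illustration of the empirical exceedance probabilities described above (not the MSFC climatology or its probability-density-function fits), the probability of meeting or exceeding a peak-wind threshold can be estimated directly as a relative frequency over an archive; the data below are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic stand-in for an archive of 10-minute peak wind speeds (knots); not real EAFB data.
peak_winds = rng.weibull(2.0, size=5000) * 12.0

def exceedance_probability(peaks, threshold):
    """Empirical probability that the 10-minute peak wind meets or exceeds a threshold."""
    return np.mean(peaks >= threshold)

for thr in (15, 20, 25):
    print(f"P(peak >= {thr} kt) = {exceedance_probability(peak_winds, thr):.3f}")
```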
Clerkin, Elise M.; Magee, Joshua C.; Parsons, E. Marie
2014-01-01
This study evaluated an adaptation of a Cognitive Bias Modification-Interpretation (CBM-I) procedure designed to shift interpretations of intrusive thoughts related to beliefs about the Importance and Control of Thoughts (ICT). Individuals high in the ICT belief domain were randomly assigned to one of two conditions: (a) a positive (n = 38) condition in which scenarios about intrusive thoughts were repeatedly paired with benign interpretations; or (b) a control (n = 39) condition in which scenarios about intrusive thoughts were paired with 50% benign and 50% threatening interpretations. Further, participants engaged in an ICT stressor task. Structural equation modeling with bias-corrected bootstrapping was used to examine the effects of training on ICT-relevant interpretations, beliefs, and ICT stressor responding. As predicted, individuals in a positive (vs. control) training condition reported decreases in ICT-relevant interpretations and beliefs. Further, there was a small, statistically significant indirect (i.e., mediated) effect of training on measures of ICT stressor responding, which occurred via decreases in ICT-relevant beliefs. In sum, results indicate that training was effective in influencing interpretations and beliefs tied to Importance/Control of Thoughts and that there may be clinical utility to shifting this belief domain. PMID:25414811
NASA Astrophysics Data System (ADS)
Radakovic, Nenad; McDougall, Douglas
2012-10-01
This classroom note illustrates how dynamic visualization can be used to teach conditional probability and Bayes' theorem. There are two features of the visualization that make it an ideal pedagogical tool in probability instruction. The first feature is the use of area-proportional Venn diagrams that, along with showing qualitative relationships, describe the quantitative relationship between two sets. The second feature is the slider and animation component of dynamic geometry software enabling students to observe how the change in the base rate of an event influences conditional probability. A hypothetical instructional sequence using a well-known breast cancer example is described.
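The quantitative relationship the note visualizes can also be written out directly. The sketch below applies Bayes' theorem to a screening scenario and sweeps the base rate to show how strongly it drives the conditional probability; the sensitivity and false-positive values are common illustrative figures and are assumptions here, not taken from the note.

```python
def posterior_given_positive(base_rate, sensitivity, false_positive_rate):
    """Bayes' theorem: P(disease | positive test)."""
    true_pos = base_rate * sensitivity
    false_pos = (1.0 - base_rate) * false_positive_rate
    return true_pos / (true_pos + false_pos)

# Commonly cited illustrative values: 1% prevalence, 80% sensitivity, 9.6% false-positive rate.
print(posterior_given_positive(0.01, 0.80, 0.096))   # ~0.078
# Varying the base rate shows its influence on the conditional probability.
for p in (0.001, 0.01, 0.1):
    print(p, round(posterior_given_positive(p, 0.80, 0.096), 3))
```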
Statistical Hypothesis Testing in Intraspecific Phylogeography: NCPA versus ABC
Templeton, Alan R.
2009-01-01
Nested clade phylogeographic analysis (NCPA) and approximate Bayesian computation (ABC) have been used to test phylogeographic hypotheses. Multilocus NCPA tests null hypotheses, whereas ABC discriminates among a finite set of alternatives. The interpretive criteria of NCPA are explicit and allow complex models to be built from simple components. The interpretive criteria of ABC are ad hoc and require the specification of a complete phylogeographic model. The conclusions from ABC are often influenced by implicit assumptions arising from the many parameters needed to specify a complex model. These complex models confound many assumptions so that biological interpretations are difficult. Sampling error is accounted for in NCPA, but ABC ignores important sources of sampling error that create pseudo-statistical power. NCPA generates the full sampling distribution of its statistics, but ABC only yields local probabilities, which in turn make it impossible to distinguish between a good-fitting model, a non-informative model, and an over-determined model. Both NCPA and ABC use approximations, but convergences of the approximations used in NCPA are well defined whereas those in ABC are not. NCPA can analyze a large number of locations, but ABC cannot. Finally, the dimensionality of the tested hypothesis is known in NCPA, but not for ABC. As a consequence, the “probabilities” generated by ABC are not true probabilities and are statistically non-interpretable. Accordingly, ABC should not be used for hypothesis testing, but simulation approaches are valuable when used in conjunction with NCPA or other methods that do not rely on highly parameterized models. PMID:19192182
Internal Medicine residents use heuristics to estimate disease probability
Phang, Sen Han; Ravani, Pietro; Schaefer, Jeffrey; Wright, Bruce; McLaughlin, Kevin
2015-01-01
Background: Training in Bayesian reasoning may have limited impact on accuracy of probability estimates. In this study, our goal was to explore whether residents previously exposed to Bayesian reasoning use heuristics rather than Bayesian reasoning to estimate disease probabilities. We predicted that if residents use heuristics then post-test probability estimates would be increased by non-discriminating clinical features or a high anchor for a target condition. Method: We randomized 55 Internal Medicine residents to different versions of four clinical vignettes and asked them to estimate probabilities of target conditions. We manipulated the clinical data for each vignette to be consistent with either 1) using a representative heuristic, by adding non-discriminating prototypical clinical features of the target condition, or 2) using anchoring with adjustment heuristic, by providing a high or low anchor for the target condition. Results: When presented with additional non-discriminating data the odds of diagnosing the target condition were increased (odds ratio (OR) 2.83, 95% confidence interval [1.30, 6.15], p = 0.009). Similarly, the odds of diagnosing the target condition were increased when a high anchor preceded the vignette (OR 2.04, [1.09, 3.81], p = 0.025). Conclusions: Our findings suggest that despite previous exposure to the use of Bayesian reasoning, residents use heuristics, such as the representative heuristic and anchoring with adjustment, to estimate probabilities. Potential reasons for attribute substitution include the relative cognitive ease of heuristics vs. Bayesian reasoning or perhaps residents in their clinical practice use gist traces rather than precise probability estimates when diagnosing. PMID:27004080
Two statistical mechanics aspects of complex networks
NASA Astrophysics Data System (ADS)
Thurner, Stefan; Biely, Christoly
2006-12-01
By adopting an ensemble interpretation of non-growing rewiring networks, network theory can be reduced to a counting problem of possible network states and an identification of their associated probabilities. We present two scenarios of how different rewiring schemes can be used to control the state probabilities of the system. In particular, we review how by generalizing the linking rules of random graphs, in combination with superstatistics and quantum mechanical concepts, one can establish an exact relation between the degree distribution of any given network and the nodes’ linking probability distributions. In a second approach, we control state probabilities by a network Hamiltonian, whose characteristics are motivated by biological and socio-economical statistical systems. We demonstrate that a thermodynamics of networks becomes a fully consistent concept, allowing one to study, e.g., ‘phase transitions’ and to compute entropies through thermodynamic relations.
Balsillie, J.H.; Donoghue, J.F.; Butler, K.M.; Koch, J.L.
2002-01-01
Two-dimensional plotting tools can be of invaluable assistance in analytical scientific pursuits, and have been widely used in the analysis and interpretation of sedimentologic data. We consider, in this work, the use of arithmetic probability paper (APP). Most statistical computer applications do not allow for the generation of APP plots, because of apparent intractable nonlinearity of the percentile (or probability) axis of the plot. We have solved this problem by identifying an equation(s) for determining plotting positions of Gaussian percentiles (or probabilities), so that APP plots can easily be computer generated. An EXCEL example is presented, and a programmed, simple-to-use EXCEL application template is hereby made publicly available, whereby a complete granulometric analysis including data listing, moment measure calculations, and frequency and cumulative APP plots, is automatically produced.
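The specific equation the authors identify is not reproduced in the abstract; one standard way to obtain linear plotting positions on a probability axis is the normal quantile (probit) transform, sketched below in Python rather than EXCEL, with hypothetical cumulative grain-size percentages.

```python
from statistics import NormalDist

def app_plot_positions(cumulative_percents):
    """Map cumulative percentages to standard-normal quantiles (probits) so that
    Gaussian data plot as a straight line, as on arithmetic probability paper."""
    nd = NormalDist()  # standard normal distribution
    return [nd.inv_cdf(p / 100.0) for p in cumulative_percents]

# Hypothetical cumulative grain-size percentages at successive phi classes.
cum = [2.3, 15.9, 50.0, 84.1, 97.7]
print([round(z, 2) for z in app_plot_positions(cum)])  # approximately [-2.0, -1.0, 0.0, 1.0, 2.0]
```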
Interactive visualisation for interpreting diagnostic test accuracy study results.
Fanshawe, Thomas R; Power, Michael; Graziadio, Sara; Ordóñez-Mena, José M; Simpson, John; Allen, Joy
2018-02-01
Information about the performance of diagnostic tests is typically presented in the form of measures of test accuracy such as sensitivity and specificity. These measures may be difficult to translate directly into decisions about patient treatment, for which information presented in the form of probabilities of disease after a positive or a negative test result may be more useful. These probabilities depend on the prevalence of the disease, which is likely to vary between populations. This article aims to clarify the relationship between pre-test (prevalence) and post-test probabilities of disease, and presents two free, online interactive tools to illustrate this relationship. These tools allow probabilities of disease to be compared with decision thresholds above and below which different treatment decisions may be indicated. They are intended to help those involved in communicating information about diagnostic test performance and are likely to be of benefit when teaching these concepts. A substantive example is presented using C reactive protein as a diagnostic marker for bacterial infection in the older adult population. The tools may also be useful for manufacturers of clinical tests in planning product development, for authors of test evaluation studies to improve reporting and for users of test evaluations to facilitate interpretation and application of the results. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
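A minimal numerical version of the pre-test/post-test relationship those tools display is sketched below; the sensitivity, specificity, and prevalence values are hypothetical and are not the C reactive protein figures from the article.

```python
def post_test_probs(prevalence, sensitivity, specificity):
    """Return P(disease | positive) and P(disease | negative) for a given pre-test prevalence."""
    p_pos = prevalence * sensitivity + (1 - prevalence) * (1 - specificity)
    p_dis_pos = prevalence * sensitivity / p_pos
    p_neg = prevalence * (1 - sensitivity) + (1 - prevalence) * specificity
    p_dis_neg = prevalence * (1 - sensitivity) / p_neg
    return p_dis_pos, p_dis_neg

# Hypothetical test with 85% sensitivity and 90% specificity across a range of prevalences.
for prev in (0.05, 0.20, 0.50):
    pos, neg = post_test_probs(prev, 0.85, 0.90)
    print(f"prevalence {prev:.2f}: P(D|+) = {pos:.2f}, P(D|-) = {neg:.2f}")
```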
Conservative Analytical Collision Probabilities for Orbital Formation Flying
NASA Technical Reports Server (NTRS)
Carpenter, J. Russell
2004-01-01
The literature offers a number of approximations for analytically and/or efficiently computing the probability of collision between two space objects. However, only one of these techniques is a completely analytical approximation that is suitable for use in the preliminary design phase, when it is more important to quickly analyze a large segment of the trade space than it is to precisely compute collision probabilities. Unfortunately, among the types of formations that one might consider, some combine a range of conditions for which this analytical method is less suitable. This work proposes a simple, conservative approximation that produces reasonable upper bounds on the collision probability in such conditions. Although its estimates are much too conservative under other conditions, such conditions are typically well suited for use of the existing method.
Conservative Analytical Collision Probability for Design of Orbital Formations
NASA Technical Reports Server (NTRS)
Carpenter, J. Russell
2004-01-01
The literature offers a number of approximations for analytically and/or efficiently computing the probability of collision between two space objects. However, only one of these techniques is a completely analytical approximation that is suitable for use in the preliminary design phase, when it is more important to quickly analyze a large segment of the trade space than it is to precisely compute collision probabilities. Unfortunately, among the types of formations that one might consider, some combine a range of conditions for which this analytical method is less suitable. This work proposes a simple, conservative approximation that produces reasonable upper bounds on the collision probability in such conditions. Although its estimates are much too conservative under other conditions, such conditions are typically well suited for use of the existing method.
Seeing the talker's face supports executive processing of speech in steady state noise.
Mishra, Sushmit; Lunner, Thomas; Stenfelt, Stefan; Rönnberg, Jerker; Rudner, Mary
2013-01-01
Listening to speech in noise depletes cognitive resources, affecting speech processing. The present study investigated how remaining resources or cognitive spare capacity (CSC) can be deployed by young adults with normal hearing. We administered a test of CSC (CSCT; Mishra et al., 2013) along with a battery of established cognitive tests to 20 participants with normal hearing. In the CSCT, lists of two-digit numbers were presented with and without visual cues in quiet, as well as in steady-state and speech-like noise at a high intelligibility level. In low load conditions, two numbers were recalled according to instructions inducing executive processing (updating, inhibition), and in high load conditions the participants were additionally instructed to recall one extra number, which was always the first item in the list. In line with previous findings, results showed that CSC was sensitive to memory load and executive function but generally not related to working memory capacity (WMC). Furthermore, CSCT scores in quiet were lowered by visual cues, probably due to distraction. In steady-state noise, the presence of visual cues improved CSCT scores, probably by enabling better encoding. Contrary to our expectation, CSCT performance was disrupted more in steady-state than speech-like noise, although only without visual cues, possibly because selective attention could be used to ignore the speech-like background and provide an enriched representation of target items in working memory similar to that obtained in quiet. This interpretation is supported by a consistent association between CSCT scores and updating skills.
Experimental analysis of the auditory detection process on avian point counts
Simons, T.R.; Alldredge, M.W.; Pollock, K.H.; Wettroth, J.M.
2007-01-01
We have developed a system for simulating the conditions of avian surveys in which birds are identified by sound. The system uses a laptop computer to control a set of amplified MP3 players placed at known locations around a survey point. The system can realistically simulate a known population of songbirds under a range of factors that affect detection probabilities. The goals of our research are to describe the sources and range of variability affecting point-count estimates and to find applications of sampling theory and methodologies that produce practical improvements in the quality of bird-census data. Initial experiments in an open field showed that, on average, observers tend to undercount birds on unlimited-radius counts, though the proportion of birds counted by individual observers ranged from 81% to 132% of the actual total. In contrast to the unlimited-radius counts, when data were truncated at a 50-m radius around the point, observers overestimated the total population by 17% to 122%. Results also illustrate how detection distances decline and identification errors increase with increasing levels of ambient noise. Overall, the proportion of birds heard by observers decreased by 28 ± 4.7% under breezy conditions, 41 ± 5.2% with the presence of additional background birds, and 42 ± 3.4% with the addition of 10 dB of white noise. These findings illustrate some of the inherent difficulties in interpreting avian abundance estimates based on auditory detections, and why estimates that do not account for variations in detection probability will not withstand critical scrutiny.
NASA Astrophysics Data System (ADS)
Utsumi, Yousuke; Tominaga, Nozomu; Tanaka, Masaomi; Morokuma, Tomoki; Yoshida, Michitoshi; Asakura, Yuichiro; Finet, François; Furusawa, Hisanori; Kawabata, Koji S.; Liu, Wei; Matsubayashi, Kazuya; Moritani, Yuki; Motohara, Kentaro; Nakata, Fumiaki; Ohta, Kouji; Terai, Tsuyoshi; Uemura, Makoto; Yasuda, Naoki
2018-01-01
We present the results of detailed analysis of an optical imaging survey conducted using the Subaru/Hyper Suprime-Cam (HSC) that aimed to identify an optical counterpart to the gravitational wave event GW151226. In half a night, the i- and z-band imaging survey by HSC covered 63.5 deg2 of the error region, which contains about 7% of the LIGO localization probability, and the same field was observed in three different epochs. The detectable magnitude of the candidates in a differenced image is evaluated as i ˜ 23.2 mag for the requirement of at least two 5 σ detections, and 1744 candidates are discovered. Assuming a kilonova as an optical counterpart, we compare the optical properties of the candidates with model predictions. A red and rapidly declining light curve condition enables the discrimination of a kilonova from other transients, and a small number of candidates satisfy this condition. The presence of stellar-like counterparts in the reference frame suggests that the surviving candidates are likely to be flare stars. The fact that most of those candidates are in the galactic plane, |b| < 5°, supports this interpretation. We also check whether the candidates are associated with the nearby GLADE galaxies, which reduces the number of contaminants even with a looser color cut. When a better probability map (with localization accuracy of ˜50 deg2) is available, kilonova searches of up to approximately 200 Mpc will become feasible by conducting immediate follow-up observations with an interval of 3-6 d.
Using the model statement to elicit information and cues to deceit in interpreter-based interviews.
Vrij, Aldert; Leal, Sharon; Mann, Samantha; Dalton, Gary; Jo, Eunkyung; Shaboltas, Alla; Khaleeva, Maria; Granskaya, Juliana; Houston, Kate
2017-06-01
We examined how the presence of an interpreter during an interview affects eliciting information and cues to deceit, while using a method that encourages interviewees to provide more detail (model statement, MS). A total of 199 Hispanic, Korean and Russian participants were interviewed either in their own native language without an interpreter, or through an interpreter. Interviewees either lied or told the truth about a trip they made during the last twelve months. Half of the participants listened to a MS at the beginning of the interview. The dependent variables were 'detail', 'complications', 'common knowledge details', 'self-handicapping strategies' and 'ratio of complications'. In the MS-absent condition, the interviews resulted in less detail when an interpreter was present than when an interpreter was absent. In the MS-present condition, the interviews resulted in a similar amount of detail in the interpreter present and absent conditions. Truthful statements included more complications and fewer common knowledge details and self-handicapping strategies than deceptive statements, and the ratio of complications was higher for truth tellers than liars. The MS strengthened these results, whereas an interpreter had no effect on these results. Copyright © 2017. Published by Elsevier B.V.
Implementation of the Iterative Proportion Fitting Algorithm for Geostatistical Facies Modeling
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li Yupeng, E-mail: yupeng@ualberta.ca; Deutsch, Clayton V.
2012-06-15
In geostatistics, most stochastic algorithms for the simulation of categorical variables such as facies or rock types require a conditional probability distribution. The multivariate probability distribution of all the grouped locations including the unsampled location permits calculation of the conditional probability directly based on its definition. In this article, the iterative proportion fitting (IPF) algorithm is implemented to infer this multivariate probability. Using the IPF algorithm, the multivariate probability is obtained by iterative modification to an initial estimated multivariate probability using lower order bivariate probabilities as constraints. The imposed bivariate marginal probabilities are inferred from profiles along drill holes or wells. In the IPF process, a sparse matrix is used to calculate the marginal probabilities from the multivariate probability, which makes the iterative fitting more tractable and practical. This algorithm can be extended to higher order marginal probability constraints as used in multiple point statistics. The theoretical framework is developed and illustrated with an estimation and simulation example.
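The article's implementation fits a full multivariate facies probability to lower-order bivariate constraints; the same iterative proportion fitting idea in its simplest two-dimensional form (adjusting a joint table until it matches imposed row and column marginals) is sketched below with hypothetical facies proportions.

```python
import numpy as np

def ipf_2d(seed, row_marginals, col_marginals, n_iter=100, tol=1e-10):
    """Iterative proportional fitting: adjust a seed joint probability table so its
    row and column sums match the imposed marginal probabilities."""
    p = seed.astype(float).copy()
    for _ in range(n_iter):
        p *= (row_marginals / p.sum(axis=1))[:, None]   # fit row sums
        p *= (col_marginals / p.sum(axis=0))[None, :]   # fit column sums
        if np.allclose(p.sum(axis=1), row_marginals, atol=tol):
            break
    return p

# Hypothetical 3-facies example: uniform seed table, marginals taken from two wells.
seed = np.full((3, 3), 1.0 / 9.0)
rows = np.array([0.5, 0.3, 0.2])
cols = np.array([0.4, 0.4, 0.2])
joint = ipf_2d(seed, rows, cols)
print(joint.round(3), joint.sum())
```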
ERIC Educational Resources Information Center
Radakovic, Nenad; McDougall, Douglas
2012-01-01
This classroom note illustrates how dynamic visualization can be used to teach conditional probability and Bayes' theorem. There are two features of the visualization that make it an ideal pedagogical tool in probability instruction. The first feature is the use of area-proportional Venn diagrams that, along with showing qualitative relationships,…
Statistical learning of action: the role of conditional probability.
Meyer, Meredith; Baldwin, Dare
2011-12-01
Identification of distinct units within a continuous flow of human action is fundamental to action processing. Such segmentation may rest in part on statistical learning. In a series of four experiments, we examined what types of statistics people can use to segment a continuous stream involving many brief, goal-directed action elements. The results of Experiment 1 showed no evidence for sensitivity to conditional probability, whereas Experiment 2 displayed learning based on joint probability. In Experiment 3, we demonstrated that additional exposure to the input failed to engender sensitivity to conditional probability. However, the results of Experiment 4 showed that a subset of adults-namely, those more successful at identifying actions that had been seen more frequently than comparison sequences-were also successful at learning conditional-probability statistics. These experiments help to clarify the mechanisms subserving processing of intentional action, and they highlight important differences from, as well as similarities to, prior studies of statistical learning in other domains, including language.
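The distinction tested in these experiments can be made concrete on a toy symbol stream: a pair of adjacent elements can have a modest joint probability yet a conditional (transitional) probability of 1. The sketch below computes both statistics; the stream and element labels are invented for illustration.

```python
from collections import Counter

def pair_statistics(stream):
    """Joint and conditional probabilities of adjacent element pairs in a stream."""
    pairs = list(zip(stream, stream[1:]))
    pair_counts = Counter(pairs)
    first_counts = Counter(stream[:-1])
    n_pairs = len(pairs)
    joint = {p: c / n_pairs for p, c in pair_counts.items()}
    conditional = {p: c / first_counts[p[0]] for p, c in pair_counts.items()}
    return joint, conditional

# Toy stream of action elements: 'a' is frequent overall, but 'b' is always followed by 'c'.
stream = list("abcaadabcaabcad")
joint, cond = pair_statistics(stream)
print("P(b,c) =", round(joint[("b", "c")], 2), " P(c|b) =", round(cond[("b", "c")], 2))
```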
Yuan, Chao; Wang, Xue-Min; Galzote, Carlos; Tan, Yi-Mei; Bhagat, Kamlesh V; Yuan, Zhi-Kang; Du, Jian-Fei; Tan, Yuan
2013-06-01
Human repeated insult patch test (HRIPT) is regarded as one of the confirmatory tests in determining the safety of skin sensitizers. A number of important factors should be considered when conducting and interpreting the results of the HRIPT. To investigate probable critical factors that influence the results of HRIPT with the same protocol in Shanghai and Mumbai. Two HRIPTs were carried out in Shanghai and Mumbai in 2011. Six identical products and 1% sodium lauryl sulfate were tested. Two Chinese dermatologists performed the grading in the two cities. Climate conditions of Shanghai and Mumbai were also recorded. For four lower reaction ratio products, cumulative irritation scores in the induction phase were higher in individuals whose ethnicity was Indian rather than Chinese. The reaction ratio of the same four products was highly correlated with the climatic parameters. The other two higher reaction ratio products and the positive control showed no difference between the two ethnicities. Greater attention ought to be paid to the impact of climate on the results of HRIPT, especially when interpreting results for mildly irritating cosmetics. Greater emphasis also needs to be placed on the ethnicity of the subjects. Crown Copyright © 2013. Published by Elsevier Inc. All rights reserved.
What is complementarity?: Niels Bohr and the architecture of quantum theory
NASA Astrophysics Data System (ADS)
Plotnitsky, Arkady
2014-12-01
This article explores Bohr’s argument, advanced under the heading of ‘complementarity,’ concerning quantum phenomena and quantum mechanics, and its physical and philosophical implications. In Bohr, the term complementarity designates both a particular concept and an overall interpretation of quantum phenomena and quantum mechanics, in part grounded in this concept. While the argument of this article is primarily philosophical, it will also address, historically, the development and transformations of Bohr’s thinking, under the impact of the development of quantum theory and Bohr’s confrontation with Einstein, especially their exchange concerning the EPR experiment, proposed by Einstein, Podolsky and Rosen in 1935. Bohr’s interpretation was progressively characterized by a more radical epistemology, in its ultimate form, which was developed in the 1930s and with which I shall be especially concerned here, defined by his new concepts of phenomenon and atomicity. According to this epistemology, quantum objects are seen as indescribable and possibly even as inconceivable, and as manifesting their existence only in the effects of their interactions with measuring instruments upon those instruments, effects that define phenomena in Bohr’s sense. The absence of causality is an automatic consequence of this epistemology. I shall also consider how probability and statistics work under these epistemological conditions.
Hydroclimate changes across the Amazon lowlands over the past 45,000 years
NASA Astrophysics Data System (ADS)
Wang, Xianfeng; Edwards, R. Lawrence; Auler, Augusto S.; Cheng, Hai; Kong, Xinggong; Wang, Yongjin; Cruz, Francisco W.; Dorale, Jeffrey A.; Chiang, Hong-Wei
2017-01-01
Reconstructing the history of tropical hydroclimates has been difficult, particularly for the Amazon basin—one of Earth’s major centres of deep atmospheric convection. For example, whether the Amazon basin was substantially drier or remained wet during glacial times has been controversial, largely because most study sites have been located on the periphery of the basin, and because interpretations can be complicated by sediment preservation, uncertainties in chronology, and topographical setting. Here we show that rainfall in the basin responds closely to changes in glacial boundary conditions in terms of temperature and atmospheric concentrations of carbon dioxide. Our results are based on a decadally resolved, uranium/thorium-dated, oxygen isotopic record for much of the past 45,000 years, obtained using speleothems from Paraíso Cave in eastern Amazonia; we interpret the record as being broadly related to precipitation. Relative to modern levels, precipitation in the region was about 58% during the Last Glacial Maximum (around 21,000 years ago) and 142% during the mid-Holocene epoch (about 6,000 years ago). We find that, as compared with cave records from the western edge of the lowlands, the Amazon was widely drier during the last glacial period, with much less recycling of water and probably reduced plant transpiration, although the rainforest persisted throughout this time.
Exploring possibilities of band gap measurement with off-axis EELS in TEM.
Korneychuk, Svetlana; Partoens, Bart; Guzzinati, Giulio; Ramaneti, Rajesh; Derluyn, Joff; Haenen, Ken; Verbeeck, Jo
2018-06-01
A technique to measure the band gap of dielectric materials with high refractive index by means of electron energy loss spectroscopy (EELS) is presented. The technique relies on the use of a circular (Bessel) aperture and suppresses Cherenkov losses and surface-guided light modes by enforcing a momentum transfer selection. The technique also strongly suppresses the elastic zero loss peak, making the acquisition, interpretation and signal to noise ratio of low loss spectra considerably better, especially for excitations in the first few eV of the EELS spectrum. Simulations of the low loss inelastic electron scattering probabilities demonstrate the beneficial influence of the Bessel aperture in this setup even for high accelerating voltages. The importance of selecting the optimal experimental convergence and collection angles is highlighted. The effect of the created off-axis acquisition conditions on the selection of the transitions from valence to conduction bands is discussed in detail on a simplified isotropic two band model. This opens the opportunity for deliberately selecting certain transitions by carefully tuning the microscope parameters. The suggested approach is experimentally demonstrated and provides good signal to noise ratio and interpretable band gap signals on reference samples of diamond, GaN and AlN while offering spatial resolution in the nm range. Copyright © 2018 Elsevier B.V. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
May, J.A.; Stonecipher, S.A.; Steinmetz, J.C.
1991-03-01
The correct interpretation of intercalated Miocene siliciclastics and evaporites of the Gemsa basin is crucial for understanding early rift tectonics of the Gulf of Suez, pinpointing the timing of isolation of the Gulf from the Mediterranean, and developing exploration plays. Evaporites of the Kareem Formation comprise celestites and massive, 'chicken-wire,' and laminated anhydrites. Although previously interpreted as sabkha deposits, sedimentologic, petrographic, and paleontologic analyses indicate these evaporites more likely formed in a submarine setting. Marls that encase the evaporites contain a diverse and abundant assemblage of nannoplankton, planktonic foraminifera, diatoms, pteropods, and fish scales indicative of basinal deposition. Associated turbidites also denote deep-water sedimentation. The paucity of benthic diatoms and foraminifera, plus the presence of unburrowed shales, phosphate nodules, early ferroan carbonate cements, and authigenic pyrite, suggests periodic anoxic, or at least dysaerobic, bottom waters. These sequences probably represent partial isolation of the Gulf of Suez by the middle Miocene, producing periodic basin restriction and evaporative drawdown. Episodes of increasing salinity likely caused the progressive decreases in foram abundance and diversity in marls beneath the anhydrites, culminating in subaqueous evaporite formation. Diverse, indigenous nannoplankton assemblages from shale seams within the anhydrites suggest the Gemsa basin was stratified; shallow open-marine conditions coexisted with anhydrite crystallization from deeper hypersaline waters.
ERIC Educational Resources Information Center
Riskowski, Jody L.; Olbricht, Gayla; Wilson, Jennifer
2010-01-01
Statistics is the art and science of gathering, analyzing, and making conclusions from data. However, many people do not fully understand how to interpret statistical results and conclusions. Placing students in a collaborative environment involving project-based learning may enable them to overcome misconceptions of probability and enhance the…
Mandated Reporting Thresholds for Community Professionals
ERIC Educational Resources Information Center
Crowell, Kathryn; Levi, Benjamin H.
2012-01-01
This study examines how community-based mandated reporters understand and interpret "reasonable suspicion", the standard threshold for mandated reporting of suspected child abuse. Respondents were asked to identify the probability necessary for "suspicion of child abuse" to constitute "reasonable suspicion". Data were analyzed for internal…
Non Kolmogorov Probability Models Outside Quantum Mechanics
NASA Astrophysics Data System (ADS)
Accardi, Luigi
2009-03-01
This paper is devoted to the analysis of the main conceptual problems in the interpretation of QM: reality, locality, determinism, physical state, the Heisenberg principle, "deterministic" and "exact" theories, laws of chance, the notion of event, statistical invariants, adaptive realism, EPR correlations and, finally, the EPR-chameleon experiment.
Prochazka, Brian; Coates, Peter S.; Ricca, Mark; Casazza, Michael L.; Gustafson, K. Ben; Hull, Josh M.
2016-01-01
Fine-scale spatiotemporal studies can better identify relationships between individual survival and habitat fragmentation so that mechanistic interpretations can be made at the population level. Recent advances in Global Positioning System (GPS) technology and statistical models capable of deconstructing high-frequency location data have facilitated interpretation of animal movement within a behaviorally mechanistic framework. Habitat fragmentation due to singleleaf pinyon (Pinus monophylla; hereafter pinyon) and Utah juniper (Juniperus osteosperma; hereafter juniper) encroachment into sagebrush (Artemisia spp.) communities is a commonly implicated perturbation that can adversely influence greater sage-grouse (Centrocercus urophasianus; hereafter sage-grouse) demographic rates. Using an extensive GPS data set (233 birds and 282,954 locations) across 12 study sites within the Great Basin, we conducted a behavioral change point analysis and subsequently constructed Brownian bridge movement models from each behaviorally homogenous section. We found a positive relationship between modeled movement rate and probability of encountering pinyon-juniper with significant variation among age classes. The probability of encountering pinyon-juniper among adults was two and three times greater than that of yearlings and juveniles, respectively. However, the movement rate in response to the probability of encountering pinyon-juniper trees was 1.5 times greater for juveniles. We then assessed the risk of mortality associated with an interaction between movement rate and the probability of encountering pinyon-juniper using shared frailty models. During pinyon-juniper encounters, on average, juvenile, yearling, and adult birds experienced a 10.4%, 0.2%, and 0.3% reduction in annual survival probabilities. Populations that used pinyon-juniper habitats with a frequency ≥ 3.8 times the overall mean experienced decreases in annual survival probabilities of 71.1%, 0.9%, and 0.9%. This analytical framework identifies a likely behavioral mechanism behind how pinyon-juniper encroachment decreases habitat suitability for sage-grouse, whereby encountering pinyon-juniper stimulates faster yet riskier movements that may make sage-grouse more vulnerable to visually acute predators.
Statistics 101 for Radiologists.
Anvari, Arash; Halpern, Elkan F; Samir, Anthony E
2015-10-01
Diagnostic tests have wide clinical applications, including screening, diagnosis, measuring treatment effect, and determining prognosis. Interpreting diagnostic test results requires an understanding of key statistical concepts used to evaluate test efficacy. This review explains descriptive statistics and discusses probability, including mutually exclusive and independent events and conditional probability. In the inferential statistics section, a statistical perspective on study design is provided, together with an explanation of how to select appropriate statistical tests. Key concepts in recruiting study samples are discussed, including representativeness and random sampling. Variable types are defined, including predictor, outcome, and covariate variables, and the relationship of these variables to one another. In the hypothesis testing section, we explain how to determine if observed differences between groups are likely to be due to chance. We explain type I and II errors, statistical significance, and study power, followed by an explanation of effect sizes and how confidence intervals can be used to generalize observed effect sizes to the larger population. Statistical tests are explained in four categories: t tests and analysis of variance, proportion analysis tests, nonparametric tests, and regression techniques. We discuss sensitivity, specificity, accuracy, receiver operating characteristic analysis, and likelihood ratios. Measures of reliability and agreement, including κ statistics, intraclass correlation coefficients, and Bland-Altman graphs and analysis, are introduced. © RSNA, 2015.
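As a concrete illustration of the conditional-probability quantities reviewed in the abstract above, here is a minimal Python sketch, using hypothetical counts (not taken from the article), that computes sensitivity, specificity, and likelihood ratios from a 2x2 table and applies Bayes' theorem in odds form:

```python
# Hedged sketch: hypothetical 2x2 diagnostic-test counts (not from the article).
tp, fn = 90, 10    # diseased patients: test positive / test negative
fp, tn = 30, 170   # healthy patients: test positive / test negative

sensitivity = tp / (tp + fn)               # P(test+ | disease)
specificity = tn / (tn + fp)               # P(test- | no disease)
lr_pos = sensitivity / (1 - specificity)   # positive likelihood ratio
lr_neg = (1 - sensitivity) / specificity   # negative likelihood ratio

# Bayes' theorem in odds form: post-test odds = pre-test odds * likelihood ratio
pretest_p = 0.20
pretest_odds = pretest_p / (1 - pretest_p)
posttest_odds = pretest_odds * lr_pos
posttest_p = posttest_odds / (1 + posttest_odds)

print(f"sensitivity={sensitivity:.2f}  specificity={specificity:.2f}")
print(f"LR+={lr_pos:.2f}  LR-={lr_neg:.2f}  post-test P(disease|test+)={posttest_p:.2f}")
```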
A framework for multi-stakeholder decision-making and ...
We propose a decision-making framework to compute compromise solutions that balance conflicting priorities of multiple stakeholders on multiple objectives. In our setting, we shape the stakeholder dissatisfaction distribution by solving a conditional-value-at-risk (CVaR) minimization problem. The CVaR problem is parameterized by a probability level that shapes the tail of the dissatisfaction distribution. The proposed approach allows us to compute a family of compromise solutions and generalizes multi-stakeholder settings previously proposed in the literature that minimize average and worst-case dissatisfactions. We use the concept of the CVaR norm to give a geometric interpretation to this problem and use the properties of this norm to prove that the CVaR minimization problem yields Pareto optimal solutions for any choice of the probability level. We discuss a broad range of potential applications of the framework that involve complex decision-making processes. We demonstrate the developments using a biowaste facility location case study in which we seek to balance stakeholder priorities on transportation, safety, water quality, and capital costs. This manuscript describes the methodology of a new decision-making framework that computes compromise solutions that balance conflicting priorities of multiple stakeholders on multiple objectives, as needed for the SHC Decision Science and Support Tools project. A biowaste facility location is employed as the case study.
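To make the role of the probability level concrete, the following minimal numpy sketch evaluates the CVaR of a sampled stakeholder-dissatisfaction distribution; the data and alpha values are illustrative placeholders, and the sketch shows only the CVaR statistic, not the full optimization framework described above:

```python
import numpy as np

def cvar(losses, alpha):
    """Conditional value-at-risk: mean of the worst (1 - alpha) fraction of losses."""
    losses = np.sort(np.asarray(losses))
    var = np.quantile(losses, alpha)      # value-at-risk at probability level alpha
    tail = losses[losses >= var]
    return tail.mean()

# Hypothetical dissatisfaction scores for one candidate facility location.
rng = np.random.default_rng(0)
dissatisfaction = rng.beta(2, 5, size=1000)

for a in (0.0, 0.5, 0.9, 0.99):
    print(f"alpha={a:4.2f}  CVaR={cvar(dissatisfaction, a):.3f}")
```

As the probability level approaches 0 the statistic recovers the average dissatisfaction, and as it approaches 1 it approaches the worst case, mirroring the generalization claim in the abstract.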
An inter-laboratory comparison study on transfer, persistence and recovery of DNA from cable ties.
Steensma, Kristy; Ansell, Ricky; Clarisse, Lindy; Connolly, Edward; Kloosterman, Ate D; McKenna, Louise G; van Oorschot, Roland A H; Szkuta, Bianca; Kokshoorn, Bas
2017-11-01
To address questions on the activity that led to the deposition of biological traces in a particular case, general information on the probabilities of transfer, persistence and recovery of cellular material in relevant scenarios is necessary. These figures may be derived from experimental data described in forensic literature when conditions relevant to the case were included. The experimental methodology regarding sampling, DNA extraction, DNA typing and profile interpretation that were used to generate these published data may differ from those applied in the case and thus the applicability of the literature data may be questioned. To assess the level of variability that different laboratories obtain when similar exhibits are analysed, we performed an inter-laboratory study between four partner laboratories. Five sets of 20 cable ties bound by different volunteers were distributed to the participating laboratories and sampled and processed according to the in-house protocols. Differences were found for the amount of retrieved DNA, as well as for the reportability and composition of the DNA profiles. These differences also resulted in different probabilities of transfer, persistence and recovery for each laboratory. Nevertheless, when applied to a case example, these differences resulted in similar assignments of weight of evidence given activity-level propositions. Copyright © 2017 Elsevier B.V. All rights reserved.
Cryptosporidiosis susceptibility and risk: a case study.
Makri, Anna; Modarres, Reza; Parkin, Rebecca
2004-02-01
Regional estimates of cryptosporidiosis risks from drinking water exposure were developed and validated, accounting for AIDS status and age. We constructed a model with probability distributions and point estimates representing Cryptosporidium in tap water, tap water consumed per day (exposure characterization); dose response, illness given infection, prolonged illness given illness; and three conditional probabilities describing the likelihood of case detection by active surveillance (health effects characterization). The model predictions were combined with population data to derive expected case numbers and incidence rates per 100,000 population, by age and AIDS status, borough specific and for New York City overall in 2000 (risk characterization). They were compared with same-year surveillance data, assumed to represent the true incidence of waterborne cryptosporidiosis, to evaluate predictive ability. The predicted mean risks, similar to previously published estimates for this region, overpredicted observed incidence, most extensively when accounting for AIDS status. The results suggest that overprediction may be due to conservative parameters applied to both non-AIDS and AIDS populations, and that biological differences for children need to be incorporated. Interpretations are limited by the unknown accuracy of available surveillance data, in addition to variability and uncertainty of model predictions. The model appears sensitive to geographical differences in AIDS prevalence. The use of surveillance data for validation and model parameters pertinent to susceptibility are discussed.
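A rough Monte Carlo sketch of the kind of conditional-probability chaining described above is given below; every distribution and parameter value here is an illustrative placeholder rather than a value from the study:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000                      # Monte Carlo draws (all values below are placeholders)

oocysts_per_L = rng.lognormal(mean=np.log(0.01), sigma=1.0, size=n)  # tap-water concentration
litres_per_day = rng.lognormal(mean=np.log(1.0), sigma=0.5, size=n)  # daily consumption
dose = oocysts_per_L * litres_per_day

r = 0.004                        # exponential dose-response slope (placeholder)
p_infection = 1 - np.exp(-r * dose)
p_illness_given_infection = 0.5
p_detected_given_illness = 0.1   # product of the surveillance conditional probabilities

daily_risk = p_infection * p_illness_given_infection * p_detected_given_illness
annual_risk = 1 - (1 - daily_risk) ** 365

population = 8_000_000
expected_cases = annual_risk.mean() * population
print(f"expected detected cases per year: {expected_cases:.0f}")
print(f"incidence per 100,000: {annual_risk.mean() * 1e5:.2f}")
```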
Clients' interpretation of risks provided in genetic counseling.
Wertz, D C; Sorenson, J R; Heeren, T C
1986-01-01
Clients in 544 genetic counseling sessions who were given numeric risks of having a child with a birth defect between 0% and 50% were asked to interpret these numeric risks on a five-point scale, ranging from very low to very high. Whereas clients' modal interpretation varied directly with numeric risks between 0% and 15%, the modal category of client risk interpretation remained "moderate" at risks between 15% and 50%. Uncertainty about the normalcy of the next child increased as numeric risk increased, and few clients were willing to indicate that the child would probably or definitely be affected, regardless of the numeric risk. Characteristics associated with clients' "pessimistic" interpretations of risk, identified by stepwise linear regression, included increased numeric risk, in-depth discussion during the counseling session of whether they would have a child, having a living affected child, discussion of the effects of an affected child on relationships with the client's other children, and seriousness of the disorder in question (causes intellectual impairment). Client interpretations are discussed in terms of recent developments in cognitive theory, including heuristics that influence judgments about risks, and implications for genetic counseling. PMID:3752089
Seroussi, Inbar; Grebenkov, Denis S.; Pasternak, Ofer; Sochen, Nir
2017-01-01
In order to bridge microscopic molecular motion with macroscopic diffusion MR signal in complex structures, we propose a general stochastic model for molecular motion in a magnetic field. The Fokker-Planck equation of this model governs the probability density function describing the diffusion-magnetization propagator. From the propagator we derive a generalized version of the Bloch-Torrey equation and the relation to the random phase approach. This derivation does not require assumptions such as a spatially constant diffusion coefficient, or ad-hoc selection of a propagator. In particular, the boundary conditions that implicitly incorporate the microstructure into the diffusion MR signal can now be included explicitly through a spatially varying diffusion coefficient. While our generalization is reduced to the conventional Bloch-Torrey equation for piecewise constant diffusion coefficients, it also predicts scenarios in which an additional term to the equation is required to fully describe the MR signal. PMID:28242566
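For reference, the conventional Bloch-Torrey equation to which the abstract says the generalization reduces (for piecewise-constant diffusion coefficients) has the standard textbook form below; the notation is assumed here and is not taken from the paper:

```latex
\frac{\partial m(\mathbf{r},t)}{\partial t}
  = -\,i\,\gamma\,\mathbf{g}(t)\cdot\mathbf{r}\; m(\mathbf{r},t)
    + \nabla\cdot\bigl(D\,\nabla m(\mathbf{r},t)\bigr)
    - \frac{m(\mathbf{r},t)}{T_2}
```

where m is the complex transverse magnetization, γ the gyromagnetic ratio, g(t) the applied gradient, D the diffusion coefficient, and T2 the transverse relaxation time.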
Fellinger, Johannes; Holzinger, Daniel; Pollard, Robert
2012-03-17
Deafness is a heterogeneous condition with far-reaching effects on social, emotional, and cognitive development. Onset of deafness before language has been established occurs in about seven per 10,000 people. Increased rates of mental health problems are reported in deaf people. Many regard themselves as members of a cultural minority who use sign language. In this Review, we describe discrepancies between a high burden of common mental health disorders and barriers to health care. About a quarter of deaf individuals have additional disabilities and a high probability of complex mental health needs. Research into factors affecting the mental health of deaf children shows that early access to effective communication with family members and peers is desirable. Improved access to health and mental health care can be achieved by provision of specialist services with professionals trained to communicate directly with deaf people and with sign-language interpreters. Copyright © 2012 Elsevier Ltd. All rights reserved.
Portera Sánchez, Alberto
2004-01-01
The contents of this presentation are the consequence of reading the book Infectious Diseases and Music, in which the authors, Drs. Gomis and Sánchez, describe the infections suffered by more than forty composers or performers. Although infections were more prevalent, intense psychological repercussions were also frequent. Reviewing the biographies of Bach, Mozart, Schubert and Beethoven, I have selected some especially dramatic passages from letters addressed to relatives and friends describing their intense and persistent physical and psychological disturbances, which probably influenced the contents and style of their creations. Depression, anxiety and especially bipolar conditions with frequent and intense manic phases were common, but not exclusive to composers; other artists such as painters or poets also complained of similar disturbances. During their manic states the artists perceived sounds and visual stimuli, as well as their personal experiences, with increased intensity and liveliness. Language became more fluid and their creativity and productivity more powerful.
Study of discharge cleaning process in JIPP T-2 Torus by residual gas analyzer
NASA Astrophysics Data System (ADS)
Noda, N.; Hirokura, S.; Taniguchi, Y.; Tanahashi, S.
1982-12-01
During discharge cleaning, the decay time of the water vapor pressure changes when the pressure reaches a certain level. A long decay time observed in the later phase can be interpreted as a result of a slow deoxidization rate of chromium oxide, which may dominate the cleaning process in this phase. Optimization of the plasma density for cleaning is discussed by comparing the experimental results on the density dependence of water vapor pressure with a result based on a zero-dimensional calculation for particle balance. One of the essential points for effective cleaning is raising the electron density of the plasma high enough that the dissociation loss rate of H2O is as large as the sticking loss rate. A density as high as 10^11 /cm^3 is required for a clean surface condition, where the sticking probability is presumed to be around 0.5.
A social model for the evolution of sexually transmitted diseases
NASA Astrophysics Data System (ADS)
Gonçalves, Sebastián; Kuperman, Marcelo; Ferreira da Costa Gomes, Marcelo
2004-10-01
We have recently introduced a model for the spread of sexually transmitted diseases in which social behavior is incorporated as a key factor for the further propagation of the infection. The system may be regarded as a society of agents where, in principle, anyone can sexually interact with any other individual in the population. The social behavior is taken into account by means of two parameters: the fraction of singles ρs and the promiscuity p. The promiscuity parameter defines the per-individual daily probability of going out to look for a sexual partner, abandoning the current mate, if any. In this contribution we show that the interaction between these two parameters gives rise to a non-trivial epidemic threshold condition when going from the homogeneous case (ρs=1) to heterogeneous cases (ρs<1). These results can have profound implications for the interpretation of real epidemic data.
Not feeling well … true or exaggerated? Self-assessed health as a leading health indicator.
Becchetti, Leonardo; Bachelet, Maria; Riccardini, Fabiola
2018-02-01
We provide original, international evidence documenting that self-assessed health (SAH) is a leading health indicator, that is, a significant predictor of future changes in health conditions, in a large sample of Europeans aged above 50 and living in 13 different countries. We find that, after controlling for attrition bias, lagged SAH is significantly and negatively correlated with changes in the number of chronic diseases, net of the correlations with levels, and changes in sociodemographic factors and health styles, country and regional health system effects, and declared symptoms. Illness-specific estimates document that lagged SAH significantly correlates with arthritis, cholesterol, and lung diseases (and weakly so with ulcer, hypertension, and cataracts) and has a significant correlation with the probability of contracting cancer. Interpretations and policy implications of our findings are discussed in the paper. Copyright © 2017 John Wiley & Sons, Ltd.
'Mommy, I miss daddy'. The effect of family structure on children's health in Brazil.
Ayllón, Sara; Ferreira-Batista, Natalia N
2015-12-01
This paper studies the relationship between single motherhood and children's height-for-age z-scores in Brazil. In order to isolate the causal effect between family structure and children's condition, we estimate an econometric model that uses male preference for firstborn sons and local sex ratios to instrument the probability of a woman becoming a single mother. Our results have a local average treatment effect interpretation (LATE). We find that children being raised by a single mother (whose marital status is affected by a firstborn girl and a low sex ratio) have a height-for-age z-score that is lower than that of children of similar characteristics that cohabit with both progenitors. We claim that the increasing trend of single motherhood in Brazil should be of concern in health policy design. Copyright © 2015 Elsevier B.V. All rights reserved.
Brocklehurst, K
1979-01-01
To facilitate mechanistic interpretation of the kinetics of time-dependent inhibition of enzymes and of similar protein modification reactions, it is important to know when the equilibrium assumption may be applied to the model: [formula: see text]. The conventional criterion of quasi-equilibrium, k+2 < k-1, is not always easy to assess, particularly when k+2 cannot be separately determined. It is demonstrated that the condition k+2 < k-1 is necessarily true, however, when the value of the apparent second-order rate constant for the modification reaction is much smaller than the value of k+1. Since k+1 is commonly at least 10^7 M^-1·s^-1 for substrates, it is probable that the equilibrium assumption may be properly applied to most irreversible inhibitions and modification reactions. PMID:518556
Preobrazhenskaia, L A; Ioffe, M E; Mats, V N
2004-01-01
The role of the prefrontal cortex in the active choice between two feeders was investigated under changes in the value and probability of reinforcement. The experiments were performed on 2 dogs with prefrontal ablation (g. proreus). Before the lesions, the dogs were taught to receive food in two different feeders in response to conditioned stimuli with equally probable alimentary reinforcement. After ablation, the dogs ran from one feeder to the other during the inter-trial intervals and, in response to the conditioned stimuli, repeatedly chose the same feeder. This disturbance of behavior eventually recovered completely. In experiments where the probability of reinforcement competed with its value, the dogs chose the feeder with the lower probability but better quality of reinforcement. In experiments with equal value but different probability, the intact dogs chose the feeder with the higher probability, whereas the dogs with prefrontal lesions chose each feeder equiprobably. Thus, under conditions of free behavior, one of the functions of the prefrontal cortex is choosing the reaction with the higher probability of reinforcement.
Song, Jeffery W; Small, Mitchell J; Casman, Elizabeth A
2017-12-15
Environmental DNA (eDNA) sampling is an emerging tool for monitoring the spread of aquatic invasive species. One confounding factor when interpreting eDNA sampling evidence is that eDNA can be present in the water in the absence of living target organisms, originating from excreta, dead tissue, boats, or sewage effluent, etc. In the Chicago Area Waterway System (CAWS), electric fish dispersal barriers were built to prevent non-native Asian carp species from invading Lake Michigan, and yet Asian carp eDNA has been detected above the barriers sporadically since 2009. In this paper the influence of stream flow characteristics in the CAWS on the probability of invasive Asian carp eDNA detection in the CAWS from 2009 to 2012 was examined. In the CAWS, the direction of stream flow is mostly away from Lake Michigan, though there are infrequent reversals in flow direction towards Lake Michigan during dry spells. We find that the flow reversal volume into the Lake has a statistically significant positive relationship with eDNA detection probability, while other covariates, like gage height, precipitation, season, water temperature, dissolved oxygen concentration, pH and chlorophyll concentration do not. This suggests that stream flow direction is highly influential on eDNA detection in the CAWS and should be considered when interpreting eDNA evidence. We also find that the beta-binomial regression model provides a stronger fit for eDNA detection probability compared to a binomial regression model. This paper provides a statistical modeling framework for interpreting eDNA sampling evidence and for evaluating covariates influencing eDNA detection. Copyright © 2017 Elsevier B.V. All rights reserved.
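The abstract's preference for a beta-binomial over a binomial detection model can be illustrated with a small, hypothetical example; the counts below are placeholders, and the comparison simply contrasts the two log-likelihoods using scipy:

```python
import numpy as np
from scipy.stats import binom, betabinom
from scipy.optimize import minimize

# Hypothetical sampling events: positive eDNA samples out of n per event (placeholders).
n_samples = np.array([20, 20, 20, 20, 20, 20])
positives = np.array([0, 1, 0, 7, 2, 0])

# Binomial model: one common detection probability p for every event.
p_hat = positives.sum() / n_samples.sum()
ll_binom = binom.logpmf(positives, n_samples, p_hat).sum()

# Beta-binomial model: detection probability varies between events (extra dispersion).
def neg_ll(params):
    a, b = np.exp(params)                      # keep the shape parameters positive
    return -betabinom.logpmf(positives, n_samples, a, b).sum()

res = minimize(neg_ll, x0=[0.0, 1.0], method="Nelder-Mead")
ll_bb = -res.fun

print(f"binomial log-likelihood      = {ll_binom:.2f}")
print(f"beta-binomial log-likelihood = {ll_bb:.2f}")
```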
Barazzetti Barbieri, Cristina; de Souza Sarkis, Jorge Eduardo
2018-07-01
The forensic interpretation of environmental analytical data is usually challenging due to the high geospatial variability of these data. The measurements' uncertainty includes contributions from the sampling and from the sample handling and preparation processes. These contributions are often disregarded in the quality assurance of analytical results. A pollution crime investigation case was used to apply a methodology able to address these uncertainties in two different environmental compartments, freshwater sediments and landfill leachate. The methodology used to estimate the uncertainty was the duplicate method (which replicates predefined steps of the measurement procedure in order to assess its precision), and the parameters used to investigate the pollution were metals (Cr, Cu, Ni, and Zn) in the leachate, the suspected source, and in the sediment, the possible sink. The metal analysis results were compared to statutory limits, and it was demonstrated that Cr and Ni concentrations in sediment samples exceeded the threshold levels at all sites downstream of the pollution sources, considering the expanded uncertainty U of the measurements and a probability of contamination >0.975 at most sites. Cu and Zn concentrations were above the statutory limits at two sites, but the classification was inconclusive considering the uncertainties of the measurements. Metal analyses in leachate revealed that Cr concentrations were above the statutory limits with a probability of contamination >0.975 in all leachate ponds, while the probability of contamination for Cu, Ni and Zn was below 0.025. The results demonstrated that the estimation of the sampling uncertainty, which was the dominant component of the combined uncertainty, is required for a comprehensive interpretation of environmental analysis results, particularly in forensic cases. Copyright © 2018 Elsevier B.V. All rights reserved.
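A minimal sketch of how an expanded uncertainty U (coverage factor k = 2) and a statutory threshold can be converted into a "probability of contamination" under a normality assumption is shown below; the concentrations, uncertainty, and threshold are placeholders, not the case data:

```python
from math import erf, sqrt

def prob_above_threshold(measured, U, threshold, k=2.0):
    """P(true value > threshold), assuming a normal measurement error with sd = U / k."""
    sd = U / k
    z = (measured - threshold) / sd
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

# Placeholder values: Cr in sediment, with the expanded uncertainty dominated by sampling.
measured_cr = 145.0    # mg/kg
U_cr = 40.0            # expanded uncertainty (k = 2)
threshold_cr = 90.0    # statutory limit

p = prob_above_threshold(measured_cr, U_cr, threshold_cr)
print(f"probability of contamination: {p:.3f}")   # > 0.975 would be classed as contaminated
```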
Leue, Anja; Cano Rodilla, Carmen; Beauducel, André
2015-01-01
Individuals typically evaluate whether their performance and the obtained feedback match. Previous research has shown that feedback negativity (FN) depends on outcome probability and feedback valence. It is, however, less clear to what extent previous effects of outcome probability on FN depend on self-evaluations of response correctness. Therefore, we investigated the effects of outcome probability on FN amplitude in a simple go/no-go task that allowed for the self-evaluation of response correctness. We also investigated effects of performance incompatibility and feedback valence. In a sample of N = 22 participants, outcome probability was manipulated by means of precues, feedback valence by means of monetary feedback, and performance incompatibility by means of feedback that induced a match versus mismatch with individuals' performance. We found that the 100% outcome probability condition induced a more negative FN following no-loss than the 50% outcome probability condition. The FN following loss was more negative in the 50% compared to the 100% outcome probability condition. Performance-incompatible loss resulted in a more negative FN than performance-compatible loss. Our results indicate that the self-evaluation of the correctness of responses should be taken into account when the effects of outcome probability and expectation mismatch on FN are investigated. PMID:26783525
Probability interpretations of intraclass reliabilities.
Ellis, Jules L
2013-11-20
Research where many organizations are rated by different samples of individuals such as clients, patients, or employees frequently uses reliabilities computed from intraclass correlations. Consumers of statistical information, such as patients and policy makers, may not have sufficient background for deciding which levels of reliability are acceptable. It is shown that the reliability is related to various probabilities that may be easier to understand, for example, the proportion of organizations that will be classed significantly above (or below) the mean and the probability that an organization is classed correctly given that it is classed significantly above (or below) the mean. One can view these probabilities as the amount of information of the classification and the correctness of the classification. These probabilities have an inverse relationship: given a reliability, one can 'buy' correctness at the cost of informativeness and conversely. This article discusses how this can be used to make judgments about the required level of reliabilities. Copyright © 2013 John Wiley & Sons, Ltd.
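The trade-off described above between informativeness and correctness can be illustrated with a small simulation; the design below (unit-variance observed means, a one-sided 5% criterion) is an illustration under assumed settings, not the article's derivation:

```python
import numpy as np

rng = np.random.default_rng(1)
reliability = 0.8              # assumed reliability of the organization means
n_orgs = 200_000
z_crit = 1.645                 # one-sided 5% significance criterion

# Decompose each observed organization mean into a true effect plus measurement error,
# scaled so that var(true) / var(observed) equals the chosen reliability.
true = rng.normal(0.0, np.sqrt(reliability), n_orgs)
error = rng.normal(0.0, np.sqrt(1.0 - reliability), n_orgs)
observed = true + error

# "Classed significantly above the mean": observed mean exceeds 0 by z_crit standard errors.
se = np.sqrt(1.0 - reliability)
classed_above = observed > z_crit * se

p_informative = classed_above.mean()            # proportion of organizations flagged
p_correct = (true[classed_above] > 0).mean()    # correctness given the flag

print(f"classed significantly above the mean: {p_informative:.3f}")
print(f"correct given classed above:          {p_correct:.3f}")
```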
Assessing the chances of success: naïve statistics versus kind experience.
Hogarth, Robin M; Mukherjee, Kanchan; Soyer, Emre
2013-01-01
Additive integration of information is ubiquitous in judgment and has been shown to be effective even when multiplicative rules of probability theory are prescribed. We explore the generality of these findings in the context of estimating probabilities of success in contests. We first define a normative model of these probabilities that takes account of relative skill levels in contests where only a limited number of entrants can win. We then report 4 experiments using a scenario about a competition. Experiments 1 and 2 both elicited judgments of probabilities, and, although participants' responses demonstrated considerable variability, their mean judgments provide a good fit to a simple linear model. Experiment 3 explored choices. Most participants entered most contests and showed little awareness of appropriate probabilities. Experiment 4 investigated effects of providing aids to calculate probabilities, specifically, access to expert advice and 2 simulation tools. With these aids, estimates were accurate and decisions varied appropriately with economic consequences. We discuss implications by considering when additive decision rules are dysfunctional, the interpretation of overconfidence based on contest-entry behavior, and the use of aids to help people make better decisions.
Evolution of ion emission yield of alloys with the nature of the solute. 2: Interpretation
NASA Technical Reports Server (NTRS)
Blaise, G.; Slodzian, G.
1977-01-01
Solid solutions of transition elements in copper, nickel, cobalt, iron, and aluminum matrices were analyzed by observing secondary ion emissions under bombardment with 6.2-keV argon ions. Enhancement of the production of solute-element ions was observed. An ion emission model is proposed according to which the ion yield is governed by the probability of an atom leaving the metal in a preionized state. The energy distribution of the valence electrons of the solute atoms is the basis of the probability calculation.
Correlation signatures of wet soils and snows. [algorithm development and computer programming
NASA Technical Reports Server (NTRS)
Phillips, M. R.
1972-01-01
Interpretation, analysis, and development of algorithms have provided the necessary computational programming tools for soil data processing, data handling, and analysis. Algorithms that have been developed thus far are adequate and have proven successful for several preliminary and fundamental applications such as software interfacing capabilities, probability distributions, grey level print plotting, contour plotting, isometric data displays, joint probability distributions, boundary mapping, channel registration, and ground scene classification. A description of an Earth Resources Flight Data Processor (ERFDP), which handles and processes earth resources data under a user's control, is provided.
Preferences for equity in health behind a veil of ignorance.
Andersson, F; Lyttkens, C H
1999-08-01
Individual attitudes to distributions of life years between two groups in a society are explored by means of an experiment. Subjects are asked to place themselves behind a veil of ignorance which is specified in terms of risk (known probabilities) for some subjects and in terms of uncertainty (unknown probabilities) for some subjects. The latter is argued to be the appropriate interpretation of Rawls' notion. It is found that subjects exhibit convex preferences over life years for the two groups, and that preferences do not differ between the risk and the uncertainty specifications.
NASA Astrophysics Data System (ADS)
Pessenda, Luiz Carlos Ruiz; Ribeiro, Adauto de Souza; Gouveia, Susy Eli Marques; Aravena, Ramon; Boulet, Rene; Bendassolli, José Albertino
2004-09-01
The study area is in the Barreirinhas region, Maranhão State, northeastern Brazil. A vegetation transect of 78 km was studied across four vegetation types: Restinga (coastal vegetation), Cerrado (woody savanna), Cerradão (dense woody savanna), and Forest, as well as three forested sites around Lagoa do Caçó, located approximately 10 km from the transect. Soil profiles along this transect were sampled for δ13C analysis, and buried charcoal fragments were used for 14C dating. The data interpretation indicated that approximately between 15,000 and ˜9000 14C yr B.P., arboreal vegetation prevailed along the whole transect, probably due to the presence of a humid climate. Approximately between ˜9000 and 4000-3000 14C yr B.P., the savanna expanded, probably related to the presence of a drier climate. From ˜4000-3000 14C yr B.P. to the present, the results indicated an increase in the arboreal density in the area, due to the return of a more humid climate, probably similar to the present one. The presence of buried charcoal fragments at several soil depths suggests the occurrence of palaeofires during the Holocene. The vegetation dynamics inferred in this study for northeastern Brazil are in agreement with the results obtained in areas of the Amazon region, based on pollen analysis of lake sediments and carbon isotope analysis of soil organic matter (SOM), implying that similar climatic conditions have affected these areas from the late Pleistocene until the present.
Bray, Christopher; Bell, Lauren N; Liang, Hong; Haykal, Rasha; Kaiksow, Farah; Mazza, Joseph J; Yale, Steven H
2016-12-01
Erythrocyte sedimentation rate (ESR) and C-reactive protein (CRP) are widely used laboratory markers of systemic inflammation. A thorough understanding of the similarities and differences between these two serological markers, including factors that affect their measurement, is necessary for the proper utilization and interpretation of ESR and CRP. This review summarizes the current published literature (searched on MEDLINE through February 2016) surrounding the history and utilization of ESR and CRP, and examines factors that affect ESR and CRP measurements and discordance between these two inflammatory markers. As ESR and CRP lack sensitivity and specificity, these tests should be used only in combination with clinical history and physical exam for diagnosis and monitoring of pathological conditions. These tests are best applied diagnostically to conditions in which there is a high or low clinical probability of disease. Importantly, discrepancies between ESR and CRP measurements have commonly been reported in both inpatient and outpatient settings, and this problem may be particularly prevalent in chronic inflammatory diseases. Numerous physiological factors, including noninfectious conditions and resolution of inflammation, can contribute to abnormally high ESR/low CRP readings or vice versa. Although discordance may be encountered in certain settings, proper utilization of ESR and CRP measurements continues to play an important role in the clinical management of many inflammatory and other conditions.
Conditional imitation might promote cooperation under high temptations to defect
NASA Astrophysics Data System (ADS)
Dai, Qionglin; Li, Haihong; Cheng, Hongyan; Qian, Xiaolan; Zhang, Mei; Yang, Junzhong
2012-07-01
In this paper we introduce a conditional imitation rule into an evolutionary game, in which the imitation probabilities of individuals are determined by a function of the payoff difference and two crucial parameters μ and σ. The parameter μ characterizes the most adequate goal for individuals and the parameter σ characterizes the tolerance of individuals. By using the pair approximation method and numerical simulations, we find an anomalous cooperation enhancement in which the cooperation level shows a nonmonotonic variation with the increase of temptation. The parameter μ affects the regime of the payoff parameter which supports the anomalous cooperation enhancement, whereas the parameter σ plays a decisive role in the appearance of the nonmonotonic variation of the cooperation level. Furthermore, to give explicit implications for the parameters μ and σ, we present an alternative form of the conditional imitation rule based on the benefit and the cost incurred by individuals during strategy updates. In this way, we also provide a phenomenological interpretation for the nonmonotonic behavior of cooperation with the increase of temptation. The results suggest that a higher cooperation level could be obtained in environments adverse to cooperation by applying the conditional imitation rule, which could plausibly be implemented in real life. More generally, the results in this work might point to an efficient way to maintain cooperation in environments that are risky for cooperators.
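The abstract does not give the exact functional form of the imitation probability; purely as an illustration, the sketch below assumes a Fermi-like function in which μ acts as an aspiration threshold on the payoff difference and σ as the tolerance:

```python
import numpy as np

def imitation_probability(payoff_diff, mu, sigma):
    """Illustrative conditional-imitation rule (assumed Fermi-like form, not the paper's
    exact function): imitation becomes likely only when the payoff difference exceeds
    the aspiration mu, with sigma controlling how tolerant the individual is."""
    return 1.0 / (1.0 + np.exp(-(payoff_diff - mu) / sigma))

for diff in (-1.0, 0.0, 0.5, 1.0, 2.0):
    print(diff, round(imitation_probability(diff, mu=0.5, sigma=0.2), 3))
```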
NASA Technical Reports Server (NTRS)
Kis, K. I.; Taylor, Patrick T.; Wittmann, G.; Toronyi, B.; Puszta, S.
2012-01-01
In this study we interpret the magnetic anomalies at satellite altitude over a part of Europe and the Pannonian Basin. These anomalies are derived from the total magnetic field measurements of the CHAMP satellite, reduced to an elevation of 324 km. An inversion method is used to interpret the total magnetic anomalies over the Pannonian Basin. A three-dimensional triangular model is used in the inversion, and two parameter distributions, Laplacian and Gaussian, are investigated. The regularized inversion is computed numerically with the Simplex and Simulated Annealing methods, and the anomalous source is located in the upper crust. A probable source of the magnetization is the exsolution of hematite-ilmenite minerals.
Against Many-Worlds Interpretations
NASA Astrophysics Data System (ADS)
Kent, Adrian
This is a critical review of the literature on many-worlds interpretations, MWI, with arguments drawn partly from earlier critiques by Bell and Stein. The essential postulates involved in various MWI are extracted, and their consistency with the evident physical world is examined. Arguments are presented against MWI proposed by Everett, Graham and DeWitt. The relevance of frequency operators to MWI is examined; it is argued that frequency operator theorems of Hartle and Farhi-Goldstone-Gutmann do not in themselves provide a probability interpretation for quantum mechanics, and thus neither support existing MWI nor would be useful in constructing new MWI. Comments are made on papers by Geroch and Deutsch that advocate MWI. It is concluded that no plausible set of axioms exists for an MWI that describes known physics.
Thompson, Cheryl Bagley
2009-01-01
This 13th article of the Basics of Research series is first in a short series on statistical analysis. These articles will discuss creating your statistical analysis plan, levels of measurement, descriptive statistics, probability theory, inferential statistics, and general considerations for interpretation of the results of a statistical analysis.
From Movements to Actions: Two Mechanisms for Learning Action Sequences
ERIC Educational Resources Information Center
Endress, Ansgar D.; Wood, Justin N.
2011-01-01
When other individuals move, we interpret their movements as discrete, hierarchically-organized, goal-directed actions. However, the mechanisms that integrate visible movement features into actions are poorly understood. Here, we consider two sequence learning mechanisms--transitional probability-based (TP) and position-based encoding…
NASA Astrophysics Data System (ADS)
Straus, D. M.
2007-12-01
The probability distribution (pdf) of errors is followed in identical twin studies using the COLA T63 AGCM, integrated with observed SST for 15 recent winters. 30 integrations per winter (for 15 winters) are available with initial errors that are extremely small. The evolution of the pdf is tested for multi-modality, and the results interpreted in terms of clusters / regimes found in: (a) the set of 15x30 integrations mentioned, and (b) a larger ensemble of 55x15 integrations made with the same GCM using the same SSTs. The mapping of pdf evolution and clusters is also carried out for each winter separately, using the clusters found in the 55-member ensemble for the same winter alone. This technique yields information on the change in regimes caused by different boundary forcing (Straus and Molteni, 2004; Straus, Corti and Molteni, 2006). Analysis of the growing errors in terms of baroclinic and barotropic components allows for interpretation of the corresponding instabilities.
Calculating the weight of evidence in low-template forensic DNA casework.
Lohmueller, Kirk E; Rudin, Norah
2013-01-01
Interpreting and assessing the weight of low-template DNA evidence presents a formidable challenge in forensic casework. This report describes a case in which a similar mixed DNA profile was obtained from four different bloodstains. The defense proposed that the low-level minor profile came from an alternate suspect, the defendant's mistress. The strength of the evidence was assessed using a probabilistic approach that employed likelihood ratios incorporating the probability of allelic drop-out. Logistic regression was used to model the probability of drop-out using empirical validation data from the government laboratory. The DNA profile obtained from the bloodstain described in this report is at least 47 billion times more likely if, in addition to the victim, the alternate suspect was the minor contributor, than if another unrelated individual was the minor contributor. This case illustrates the utility of the probabilistic approach for interpreting complex low-template DNA profiles. © 2012 American Academy of Forensic Sciences.
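A hedged sketch of the kind of logistic drop-out model mentioned above is given below; the validation data are simulated placeholders, and the covariate (log template amount) and fitted values are illustrative, not the government laboratory's actual model:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Simulated validation data: log template amount (pg) vs. whether an allele dropped out.
rng = np.random.default_rng(7)
log_template = rng.uniform(np.log(6.25), np.log(400), size=500)
true_p_dropout = 1 / (1 + np.exp(3.0 * (log_template - np.log(50))))   # assumed ground truth
dropout = rng.random(500) < true_p_dropout

model = LogisticRegression()
model.fit(log_template.reshape(-1, 1), dropout)

# Estimated P(drop-out) at a few template amounts, as would feed into a likelihood ratio.
for pg in (12.5, 25, 50, 100, 200):
    p = model.predict_proba(np.log([[pg]]))[0, 1]
    print(f"{pg:6.1f} pg  P(drop-out) = {p:.2f}")
```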
Generalized spherical and simplicial coordinates
NASA Astrophysics Data System (ADS)
Richter, Wolf-Dieter
2007-12-01
Elementary trigonometric quantities are defined in l_{2,p} analogously to those in l_{2,2}; the sine and cosine functions are generalized for each p>0 as functions sin_p and cos_p such that they satisfy the basic equation cos_p(φ)^p + sin_p(φ)^p = 1. The p-generalized radius coordinate of a point ξ ∈ R^n is defined for each p>0 as r_p(ξ) = (|ξ_1|^p + ... + |ξ_n|^p)^(1/p). On combining these quantities, l_{n,p}-spherical coordinates are defined. It is shown that these coordinates are closely related to l_{n,p}-simplicial coordinates. The Jacobians of these generalized coordinate transformations are derived. Applications and interpretations from analysis deal especially with the definition of a generalized surface content on l_{n,p}-spheres, which is closely related to a modified co-area formula and an extension of Cavalieri's and Torricelli's method of indivisibles, and with differential equations. Applications from probability theory deal especially with a geometric interpretation of the uniform probability distribution on the l_{n,p}-sphere and with the derivation of certain generalized statistical distributions.
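A small numeric check of the defining identity is sketched below; the normalized definitions of sin_p and cos_p used here (ordinary sine and cosine divided by the p-norm of the pair) are the standard p-generalization and are assumed to be consistent with the paper's:

```python
import numpy as np

def n_p(phi, p):
    """p-norm of (cos(phi), sin(phi)); strictly positive for every phi."""
    return (np.abs(np.cos(phi)) ** p + np.abs(np.sin(phi)) ** p) ** (1.0 / p)

def cos_p(phi, p):
    return np.cos(phi) / n_p(phi, p)

def sin_p(phi, p):
    return np.sin(phi) / n_p(phi, p)

phi = np.linspace(0, 2 * np.pi, 9)
for p in (0.5, 1.0, 2.0, 4.0):
    # The identity |cos_p|^p + |sin_p|^p = 1 holds for every p > 0 and every angle.
    identity = np.abs(cos_p(phi, p)) ** p + np.abs(sin_p(phi, p)) ** p
    print(p, np.allclose(identity, 1.0))
```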
A time series model of the occurrence of gastric dilatation-volvulus in a population of dogs
Levine, Michael; Moore, George E
2009-01-01
Background Gastric dilatation-volvulus (GDV) is a life-threatening condition of mammals, with increased risk in large breed dogs. The study of its etiological factors is difficult due to the variety of possible living conditions. The association between meteorological events and the occurrence of GDV has been postulated but remains unclear. This study introduces the binary time series approach to the investigation of the possible meteorological risk factors for GDV. The data collected in a population of high-risk working dogs in Texas were used. Results Minimum and maximum daily atmospheric pressure on the day of the GDV event and the maximum daily atmospheric pressure on the day before the GDV event were positively associated with the probability of GDV. All of the odds/multiplicative factors of a day being a GDV day were interpreted conditionally on past GDV occurrences. There was minimal difference between the binary and Poisson generalized linear models. Conclusion Time series modeling provided a novel method for evaluating the association between meteorological variables and GDV in a large population of dogs. Appropriate application of this method was enhanced by a common environment for the dogs and availability of meteorological data. The potential interaction between weather changes and patient risk factors for GDV deserves further investigation. PMID:19368730
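A minimal sketch of a binary time-series (logistic) regression of daily GDV occurrence on same-day and lagged pressure, conditioned on past occurrences, is shown below using statsmodels; the daily series are simulated placeholders, not the Texas data:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Placeholder daily series: GDV indicator and daily max/min atmospheric pressure (hPa).
rng = np.random.default_rng(3)
n_days = 1500
pmax = 1013 + rng.normal(0, 6, n_days)
pmin = pmax - rng.uniform(2, 10, n_days)
gdv = rng.random(n_days) < 0.02          # illustrative ~2% daily event rate

df = pd.DataFrame({"gdv": gdv.astype(int), "pmax": pmax, "pmin": pmin})
df["pmax_lag1"] = df["pmax"].shift(1)    # pressure on the day before
df["gdv_lag1"] = df["gdv"].shift(1)      # conditioning on past occurrences
df = df.dropna()

X = sm.add_constant(df[["pmax", "pmin", "pmax_lag1", "gdv_lag1"]])
fit = sm.Logit(df["gdv"], X).fit(disp=False)
print(np.exp(fit.params))                # multiplicative (odds) factors per unit change
```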
Maclaren, Oliver J; Parker, Aimée; Pin, Carmen; Carding, Simon R; Watson, Alastair J M; Fletcher, Alexander G; Byrne, Helen M; Maini, Philip K
2017-07-01
Our work addresses two key challenges, one biological and one methodological. First, we aim to understand how proliferation and cell migration rates in the intestinal epithelium are related under healthy, damaged (Ara-C treated) and recovering conditions, and how these relations can be used to identify mechanisms of repair and regeneration. We analyse new data, presented in more detail in a companion paper, in which BrdU/IdU cell-labelling experiments were performed under these respective conditions. Second, in considering how to more rigorously process these data and interpret them using mathematical models, we use a probabilistic, hierarchical approach. This provides a best-practice approach for systematically modelling and understanding the uncertainties that can otherwise undermine the generation of reliable conclusions: uncertainties in experimental measurement and treatment, difficult-to-compare mathematical models of underlying mechanisms, and unknown or unobserved parameters. Both spatially discrete and continuous mechanistic models are considered and related via hierarchical conditional probability assumptions. We perform model checks on both in-sample and out-of-sample datasets and use them to show how to test possible model improvements and assess the robustness of our conclusions. We conclude, for the present set of experiments, that a primarily proliferation-driven model suffices to predict labelled cell dynamics over most time-scales.
Shine, R; LeMaster, M P; Moore, I T; Olsson, M M; Mason, R T
2001-03-01
Huge breeding aggregations of red-sided garter snakes (Thamnophis sirtalis parietalis) at overwintering dens in Manitoba provide a unique opportunity to identify sources of mortality and to clarify factors that influence a snake's vulnerability to these factors. Comparisons of sexes, body sizes, and body condition of more than 1000 dead snakes versus live animals sampled at the same time reveal significant biases. Three primary sources of mortality were identified. Predation by crows, Corvus brachyrhynchos (590 snakes killed), was focussed mostly on small snakes of both sexes. Crows generally removed the snake's liver and left the carcass, but very small snakes were sometimes brought back to the nest. Suffocation beneath massive piles of other snakes within the den (301 dead animals) involved mostly small males and (to a lesser extent) large females; snakes in poor body condition were particularly vulnerable. Many emaciated snakes (n = 142, mostly females) also died without overt injuries, probably due to depleted energy reserves. These biases in vulnerability are readily interpretable from information on behavioral ecology of the snakes. For example, sex biases in mortality reflect differences in postemergence behavior and locomotor capacity, the greater attractiveness of larger females to males, and the high energy costs of reproduction for females.
NASA Astrophysics Data System (ADS)
Sari, Dwi Ivayana; Budayasa, I. Ketut; Juniati, Dwi
2017-08-01
The formulation of mathematical learning goals is now oriented not only toward cognitive products but also toward cognitive processes, one of which is probabilistic thinking. Students need probabilistic thinking in order to make decisions. Elementary school students are expected to develop probabilistic thinking as a foundation for learning probability at higher levels. A framework for students' probabilistic thinking had been developed using the SOLO taxonomy, consisting of prestructural, unistructural, multistructural and relational probabilistic thinking. This study aimed to analyze probability task completion based on this taxonomy of probabilistic thinking. The subjects were two fifth-grade students, a boy and a girl, selected on the basis of a mathematical ability test as having high mathematical ability. The subjects were given probability tasks covering sample space, the probability of an event and probability comparison. The data analysis consisted of categorization, reduction, interpretation and conclusion. The credibility of the data was established through time triangulation. The results showed that the boy's probabilistic thinking in completing the probability tasks was at the multistructural level, while the girl's was at the unistructural level; that is, the boy's level of probabilistic thinking was higher than the girl's. These results could help curriculum developers formulate probability learning goals for elementary school students, and teachers could take gender differences into account when teaching probability.
Mestres-Missé, Anna; Trampel, Robert; Turner, Robert; Kotz, Sonja A
2016-04-01
A key aspect of optimal behavior is the ability to predict what will come next. To achieve this, we must have a fairly good idea of the probability of occurrence of possible outcomes. This is based both on prior knowledge about a particular or similar situation and on immediately relevant new information. One question that arises is: when considering converging prior probability and external evidence, is the most probable outcome selected or does the brain represent degrees of uncertainty, even highly improbable ones? Using functional magnetic resonance imaging, the current study explored these possibilities by contrasting words that differ in their probability of occurrence, namely, unbalanced ambiguous words and unambiguous words. Unbalanced ambiguous words have a strong frequency-based bias towards one meaning, while unambiguous words have only one meaning. The current results reveal larger activation in lateral prefrontal and insular cortices in response to dominant ambiguous compared to unambiguous words even when prior and contextual information biases one interpretation only. These results suggest a probability distribution, whereby all outcomes and their associated probabilities of occurrence--even if very low--are represented and maintained.
Encounter risk analysis of rainfall and reference crop evapotranspiration in the irrigation district
NASA Astrophysics Data System (ADS)
Zhang, Jinping; Lin, Xiaomin; Zhao, Yong; Hong, Yang
2017-09-01
Rainfall and reference crop evapotranspiration are random but mutually affected variables in the irrigation district, and their encounter situation can determine water shortage risks under the contexts of natural water supply and demand. However, in reality, rainfall and reference crop evapotranspiration may have different marginal distributions, and their relationship is nonlinear. In this study, based on the annual rainfall and reference crop evapotranspiration data series from 1970 to 2013 in the Luhun irrigation district of China, the joint probability distribution of rainfall and reference crop evapotranspiration is developed with the Frank copula function. Using the joint probability distribution, the synchronous-asynchronous encounter risk, conditional joint probability, and conditional return period of different combinations of rainfall and reference crop evapotranspiration are analyzed. The results show that the copula-based joint probability distributions of rainfall and reference crop evapotranspiration are reasonable. The asynchronous encounter probability of rainfall and reference crop evapotranspiration is greater than their synchronous encounter probability, and the water shortage risk associated with meteorological drought (i.e. rainfall variability) is more likely to occur. Compared with other states, the conditional joint probability is higher and the conditional return period is lower under either low rainfall or high reference crop evapotranspiration. For a given high reference crop evapotranspiration at a certain frequency, the encounter risk of low rainfall and high reference crop evapotranspiration increases as that frequency decreases. For a given low rainfall at a certain frequency, the encounter risk of low rainfall and high reference crop evapotranspiration decreases as that frequency decreases. When either the high reference crop evapotranspiration exceeds a certain frequency or low rainfall does not exceed a certain frequency, the higher conditional joint probability and lower conditional return period of the various combinations are likely to cause a water shortage, but not a severe one.
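A minimal sketch, under illustrative assumptions, of how a Frank copula turns marginal non-exceedance probabilities into the synchronous, asynchronous, and conditional encounter probabilities discussed above; the dependence parameter and threshold probabilities below are hypothetical, not the fitted Luhun values.

```python
import numpy as np

def frank_copula(u, v, theta):
    """Frank copula C(u, v; theta) for theta != 0."""
    num = (np.exp(-theta * u) - 1.0) * (np.exp(-theta * v) - 1.0)
    den = np.exp(-theta) - 1.0
    return -np.log(1.0 + num / den) / theta

# Illustrative (hypothetical) values: theta measures the dependence between
# annual rainfall P and reference crop evapotranspiration ET0.
theta = -2.5   # negative dependence assumed purely for illustration
u = 0.375      # non-exceedance probability of a "low rainfall" threshold
v = 0.625      # non-exceedance probability of an ET0 threshold

C_uv = frank_copula(u, v, theta)

# Synchronous encounter: both variables below their thresholds, P(U<=u, V<=v).
p_both_low = C_uv
# Asynchronous example: low rainfall together with high ET0, P(U<=u, V>v).
p_low_rain_high_et = u - C_uv
# Conditional probability of low rainfall given high ET0.
p_cond = p_low_rain_high_et / (1.0 - v)

print(f"P(low P, low ET0)   = {p_both_low:.3f}")
print(f"P(low P, high ET0)  = {p_low_rain_high_et:.3f}")
print(f"P(low P | high ET0) = {p_cond:.3f}")
```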
Hansen, John P
2003-01-01
Healthcare quality improvement professionals need to understand and use inferential statistics to interpret sample data from their organizations. In quality improvement and healthcare research studies all the data from a population often are not available, so investigators take samples and make inferences about the population by using inferential statistics. This three-part series will give readers an understanding of the concepts of inferential statistics as well as the specific tools for calculating confidence intervals for samples of data. This article, Part 2, describes probability, populations, and samples. The uses of descriptive and inferential statistics are outlined. The article also discusses the properties and probability of normal distributions, including the standard normal distribution.
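A minimal sketch of the kind of normal-distribution probability calculation the article covers, using scipy; the lab-value mean and standard deviation below are hypothetical.

```python
from scipy.stats import norm

# Probability that a standard normal value falls within +/- 1.96 standard deviations.
p_within = norm.cdf(1.96) - norm.cdf(-1.96)

# Probability for a general normal population, e.g. a lab value with
# (hypothetical) mean 100 and standard deviation 15, falling between 110 and 130.
p_range = norm.cdf(130, loc=100, scale=15) - norm.cdf(110, loc=100, scale=15)

print(f"P(-1.96 < Z < 1.96) = {p_within:.3f}")   # about 0.95
print(f"P(110 < X < 130)    = {p_range:.3f}")
```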
NASA Astrophysics Data System (ADS)
Stephanik, Brian Michael
This dissertation describes the results of two related investigations into introductory student understanding of ideas from classical physics that are key elements of quantum mechanics. One investigation probes the extent to which students are able to interpret and apply potential energy diagrams (i.e., graphs of potential energy versus position). The other probes the extent to which students are able to reason classically about probability and spatial probability density. The results of these investigations revealed significant conceptual and reasoning difficulties that students encounter with these topics. The findings guided the design of instructional materials to address the major problems. Results from post-instructional assessments are presented that illustrate the impact of the curricula on student learning.
Betting on the outcomes of measurements: a Bayesian theory of quantum probability
NASA Astrophysics Data System (ADS)
Pitowsky, Itamar
We develop a systematic approach to quantum probability as a theory of rational betting in quantum gambles. In these games of chance, the agent is betting in advance on the outcomes of several (finitely many) incompatible measurements. One of the measurements is subsequently chosen and performed and the money placed on the other measurements is returned to the agent. We show how the rules of rational betting imply all the interesting features of quantum probability, even in such finite gambles. These include the uncertainty principle and the violation of Bell's inequality among others. Quantum gambles are closely related to quantum logic and provide a new semantics for it. We conclude with a philosophical discussion on the interpretation of quantum mechanics.
Deposition, exhumation, and paleoclimate of an ancient lake deposit, Gale crater, Mars
Grotzinger, J.P.; Gupta, S.; Malin, M.C.; Rubin, D.M.; Schieber, J.; Siebach, K.; Sumner, D.Y.; Stack, K.M.; Vasavada, A.R.; Arvidson, R.E.; Calef, F.; Edgar, Lauren; Fischer, W.F.; Grant, J.A.; Griffes, J.L.; Kah, L.C.; Lamb, M.P.; Lewis, K.W.; Mangold, N.; Minitti, M.E.; Palucis, M.C.; Rice, M.; Williams, R.M.E.; Yingst, R.A.; Blake, D.; Blaney, D.; Conrad, P.; Crisp, J.A.; Dietrich, W.E.; Dromart, G.; Edgett, K.S.; Ewing, R.C.; Gellert, R.; Hurowitz, J.A.; Kocurek, G.; Mahaffy, P.G.; McBride, M.J.; McLennan, S.M.; Mischna, M.A.; Ming, D.; Milliken, R.E.; Newsom, H.; Oehler, D.; Parker, T.J.; Vaniman, D.; Wiens, R.C.; Wilson, S.A.
2015-01-01
The landforms of northern Gale crater on Mars expose thick sequences of sedimentary rocks. Based on images obtained by the Curiosity rover, we interpret these outcrops as evidence for past fluvial, deltaic, and lacustrine environments. Degradation of the crater wall and rim probably supplied these sediments, which advanced inward from the wall, infilling both the crater and an internal lake basin to a thickness of at least 75 meters. This intracrater lake system probably existed intermittently for thousands to millions of years, implying a relatively wet climate that supplied moisture to the crater rim and transported sediment via streams into the lake basin. The deposits in Gale crater were then exhumed, probably by wind-driven erosion, creating Aeolis Mons (Mount Sharp).
NASA Technical Reports Server (NTRS)
Denning, Peter J.
1989-01-01
In 1983 and 1984, the Infrared Astronomical Satellite (IRAS) detected 5,425 stellar objects and measured their infrared spectra. In 1987 a program called AUTOCLASS used Bayesian inference methods to discover the classes present in these data and determine the most probable class of each object, revealing unknown phenomena in astronomy. AUTOCLASS has rekindled the old debate on the suitability of Bayesian methods, which are computationally intensive, interpret probabilities as plausibility measures rather than frequencies, and appear to depend on a subjective assessment of the probability of a hypothesis before the data were collected. Modern statistical methods have, however, recently been shown to also depend on subjective elements. These debates bring into question the whole tradition of scientific objectivity and offer scientists a new way to take responsibility for their findings and conclusions.
Characterising RNA secondary structure space using information entropy
2013-01-01
Comparative methods for RNA secondary structure prediction use evolutionary information from RNA alignments to increase prediction accuracy. The model is often described in terms of stochastic context-free grammars (SCFGs), which generate a probability distribution over secondary structures. It is, however, unclear how this probability distribution changes as a function of the input alignment. As prediction programs typically only return a single secondary structure, better characterisation of the underlying probability space of RNA secondary structures is of great interest. In this work, we show how to efficiently compute the information entropy of the probability distribution over RNA secondary structures produced for RNA alignments by a phylo-SCFG, and implement it for the PPfold model. We also discuss interpretations and applications of this quantity, including how it can clarify reasons for low prediction reliability scores. PPfold and its source code are available from http://birc.au.dk/software/ppfold/. PMID:23368905
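The efficient entropy computation inside the phylo-SCFG is not reproduced here; as a hedged illustration of the quantity itself, the snippet below applies the Shannon entropy definition to an explicit, hypothetical distribution over a few candidate structures. A low entropy corresponds to a concentrated (more reliable) prediction, a high entropy to a flat one.

```python
import numpy as np

def shannon_entropy(probs, base=2.0):
    """Entropy H = -sum p log(p) of a discrete distribution, ignoring zero terms."""
    p = np.asarray(probs, dtype=float)
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)) / np.log(base))

# Hypothetical probabilities assigned by a model to four candidate secondary structures.
print(f"H = {shannon_entropy([0.70, 0.20, 0.07, 0.03]):.3f} bits")   # confident prediction
print(f"H = {shannon_entropy([0.25, 0.25, 0.25, 0.25]):.3f} bits")   # maximally uncertain
```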
People Like Logical Truth: Testing the Intuitive Detection of Logical Value in Basic Propositions
2016-01-01
Recent studies on logical reasoning have suggested that people are intuitively aware of the logical validity of syllogisms or that they intuitively detect conflict between heuristic responses and logical norms via slight changes in their feelings. According to logical intuition studies, logically valid or heuristic logic no-conflict reasoning is fluently processed and induces positive feelings without conscious awareness. One criticism states that such effects of logicality disappear when confounding factors such as the content of syllogisms are controlled. The present study used abstract propositions and tested whether people intuitively detect logical value. Experiment 1 presented four logical propositions (conjunctive, biconditional, conditional, and material implications) regarding a target case and asked the participants to rate the extent to which they liked the statement. Experiment 2 tested the effects of matching bias, as well as intuitive logic, on the reasoners’ feelings by manipulating whether the antecedent or consequent (or both) of the conditional was affirmed or negated. The results showed that both logicality and matching bias affected the reasoners’ feelings, and people preferred logically true targets over logically false ones for all forms of propositions. These results suggest that people intuitively detect what is true from what is false during abstract reasoning. Additionally, a Bayesian mixed model meta-analysis of conditionals indicated that people’s intuitive interpretation of the conditional “if p then q” fits better with the conditional probability, q given p. PMID:28036402
Tracking the Sensory Environment: An ERP Study of Probability and Context Updating in ASD
Westerfield, Marissa A.; Zinni, Marla; Vo, Khang; Townsend, Jeanne
2014-01-01
We recorded visual event-related brain potentials (ERPs) from 32 adult male participants (16 high-functioning participants diagnosed with Autism Spectrum Disorder (ASD) and 16 typically developing (TD) control participants, ranging in age from 18–53 yrs) during a three-stimulus oddball paradigm. Target and non-target stimulus probability was varied across three probability conditions, whereas the probability of a third non-target stimulus was held constant in all conditions. P3 amplitude to target stimuli was more sensitive to probability in ASD than in TD participants, whereas P3 amplitude to non-target stimuli was less responsive to probability in ASD participants. This suggests that neural responses to changes in event probability are attention-dependent in high-functioning ASD. The implications of these findings for higher-level behaviors such as prediction and planning are discussed. PMID:24488156
Decomposition of conditional probability for high-order symbolic Markov chains.
Melnik, S S; Usatenko, O V
2017-07-01
The main goal of this paper is to develop an estimate for the conditional probability function of random stationary ergodic symbolic sequences with elements belonging to a finite alphabet. We elaborate on a decomposition procedure for the conditional probability function of sequences considered to be high-order Markov chains. We represent the conditional probability function as the sum of multilinear memory function monomials of different orders (from zero up to the chain order). This allows us to introduce a family of Markov chain models and to construct artificial sequences via a method of successive iterations, taking into account at each step increasingly high correlations among random elements. At weak correlations, the memory functions are uniquely expressed in terms of the high-order symbolic correlation functions. The proposed method fills the gap between two approaches, namely the likelihood estimation and the additive Markov chains. The obtained results may have applications for sequential approximation of artificial neural network training.
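The decomposition into memory-function monomials described above is not reproduced here; as a minimal sketch of the quantity being decomposed, the snippet below estimates the conditional probability function of a second-order binary Markov chain directly by counting, using a synthetic sequence with a weak built-in correlation. All values are illustrative.

```python
from collections import Counter
import random

def conditional_probs(seq, order):
    """Estimate P(next symbol | preceding `order` symbols) by counting."""
    ctx_counts = Counter()
    pair_counts = Counter()
    for i in range(order, len(seq)):
        ctx = tuple(seq[i - order:i])
        ctx_counts[ctx] += 1
        pair_counts[(ctx, seq[i])] += 1
    return {pair: n / ctx_counts[pair[0]] for pair, n in pair_counts.items()}

# Toy binary sequence with a weak second-order correlation (illustration only).
random.seed(0)
seq = [0, 1]
for _ in range(10_000):
    p_one = 0.5 + 0.1 * (1 if seq[-2] == seq[-1] else -1)
    seq.append(1 if random.random() < p_one else 0)

for (ctx, sym), p in sorted(conditional_probs(seq, order=2).items()):
    print(f"P({sym} | {ctx}) = {p:.3f}")
```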
Decomposition of conditional probability for high-order symbolic Markov chains
NASA Astrophysics Data System (ADS)
Melnik, S. S.; Usatenko, O. V.
2017-07-01
The main goal of this paper is to develop an estimate for the conditional probability function of random stationary ergodic symbolic sequences with elements belonging to a finite alphabet. We elaborate on a decomposition procedure for the conditional probability function of sequences considered to be high-order Markov chains. We represent the conditional probability function as the sum of multilinear memory function monomials of different orders (from zero up to the chain order). This allows us to introduce a family of Markov chain models and to construct artificial sequences via a method of successive iterations, taking into account at each step increasingly high correlations among random elements. At weak correlations, the memory functions are uniquely expressed in terms of the high-order symbolic correlation functions. The proposed method fills the gap between two approaches, namely the likelihood estimation and the additive Markov chains. The obtained results may have applications for sequential approximation of artificial neural network training.
Karl, Herman A.
1989-01-01
High-resolution seismic-reflection data have been used to a varying degree by geoscientists to interpret the history of marine sediment accumulations around Antarctica. Reconnaissance analysis of 1-, 3.5-, and 12-kHz data collected by the U.S. Geological Survey in the western Ross Sea has led to the identification of eight echo-character facies and six microtopographic facies in the sediment deposits that overlie the Ross Sea unconformity. Three depositional facies regions, each characterized by a particular assemblage of echo-character type and microtopographic facies, have been identified on the continental shelf. These suites of acoustic facies are the result of specific depositional processes that control type and accumulation of sediment in a region. Evidence of glacial processes and products is uncommon in regions 1 and 2, but is abundant in region 3. McMurdo Sound, region 1, is characterized by a monospecific set of acoustic facies. This unique assemblage probably represents turbidity current deposition in the western part of the basin. Most of the seafloor in region 2, from about latitude 77°S to 75°S, is deeper than 600 m below sea level. The microtopographic facies and echo-character facies observed on the lower slopes and basin floor there reflect the thin deposits of pelagic sediments that have accumulated in the low-energy conditions that are typical of deep-water environments. In shallower water near the boundary with region 3, the signature of the acoustic facies is different from that in deeper water and probably indicates higher energy conditions or, perhaps, ice-related processes. Thick deposits of tills emplaced by lodgement during the most recent advance of the West Antarctic Ice Sheet are common from latitude 75°S to the northern boundary of the study area just south of Coulman Island (region 3). The signature of microtopographic facies in this region reflects the relief of the base of the grounded ice sheet prior to decoupling from the seafloor. Current winnowing and scour of shallow parts of the seafloor inhibits sediment deposition and maintains the irregular, hummocky relief that characterizes much of the region. Seafloor relief of this type in other polar areas could indicate the former presence of grounded ice. © 1989.
New normative standards of conditional reasoning and the dual-source model
Singmann, Henrik; Klauer, Karl Christoph; Over, David
2014-01-01
There has been a major shift in research on human reasoning toward Bayesian and probabilistic approaches, which has been called a new paradigm. The new paradigm sees most everyday and scientific reasoning as taking place in a context of uncertainty, and inference is from uncertain beliefs and not from arbitrary assumptions. In this manuscript we present an empirical test of normative standards in the new paradigm using a novel probabilized conditional reasoning task. Our results indicated that, for everyday conditionals with at least a weak causal connection between antecedent and consequent, only the conditional probability of the consequent given the antecedent contributes unique variance to predicting the probability of the conditional, but not the probability of the conjunction, nor the probability of the material conditional. Regarding normative accounts of reasoning, we found significant evidence that participants' responses were confidence preserving (i.e., p-valid in the sense of Adams, 1998) for MP inferences, but not for MT inferences. Additionally, only for MP inferences and to a lesser degree for DA inferences did the rate of responses inside the coherence intervals defined by mental probability logic (Pfeifer and Kleiter, 2005, 2010) exceed chance levels. In contrast to the normative accounts, the dual-source model (Klauer et al., 2010) is a descriptive model. It posits that participants integrate their background knowledge (i.e., the type of information primary to the normative approaches) and their subjective probability that a conclusion is seen as warranted based on its logical form. Model fits showed that the dual-source model, which employed participants' responses to a deductive task with abstract contents to estimate the form-based component, provided as good an account of the data as a model that solely used data from the probabilized conditional reasoning task. PMID:24860516
New normative standards of conditional reasoning and the dual-source model.
Singmann, Henrik; Klauer, Karl Christoph; Over, David
2014-01-01
There has been a major shift in research on human reasoning toward Bayesian and probabilistic approaches, which has been called a new paradigm. The new paradigm sees most everyday and scientific reasoning as taking place in a context of uncertainty, and inference is from uncertain beliefs and not from arbitrary assumptions. In this manuscript we present an empirical test of normative standards in the new paradigm using a novel probabilized conditional reasoning task. Our results indicated that, for everyday conditionals with at least a weak causal connection between antecedent and consequent, only the conditional probability of the consequent given the antecedent contributes unique variance to predicting the probability of the conditional, but not the probability of the conjunction, nor the probability of the material conditional. Regarding normative accounts of reasoning, we found significant evidence that participants' responses were confidence preserving (i.e., p-valid in the sense of Adams, 1998) for MP inferences, but not for MT inferences. Additionally, only for MP inferences and to a lesser degree for DA inferences did the rate of responses inside the coherence intervals defined by mental probability logic (Pfeifer and Kleiter, 2005, 2010) exceed chance levels. In contrast to the normative accounts, the dual-source model (Klauer et al., 2010) is a descriptive model. It posits that participants integrate their background knowledge (i.e., the type of information primary to the normative approaches) and their subjective probability that a conclusion is seen as warranted based on its logical form. Model fits showed that the dual-source model, which employed participants' responses to a deductive task with abstract contents to estimate the form-based component, provided as good an account of the data as a model that solely used data from the probabilized conditional reasoning task.
Wisco, Blair E; Marx, Brian P; Miller, Mark W; Wolf, Erika J; Mota, Natalie P; Krystal, John H; Southwick, Steven M; Pietrzak, Robert H
2016-11-01
With the publication of DSM-5, important changes were made to the diagnostic criteria for posttraumatic stress disorder (PTSD), including the addition of 3 new symptoms. Some have argued that these changes will further increase the already high rates of comorbidity between PTSD and other psychiatric disorders. This study examined the prevalence of DSM-5 PTSD, conditional probability of PTSD given certain trauma exposures, endorsement of specific PTSD symptoms, and psychiatric comorbidities in the US veteran population. Data were analyzed from the National Health and Resilience in Veterans Study (NHRVS), a Web-based survey of a cross-sectional, nationally representative, population-based sample of 1,484 US veterans, which was fielded from September through October 2013. Probable PTSD was assessed using the PTSD Checklist-5. The weighted lifetime and past-month prevalence of probable DSM-5 PTSD was 8.1% (SE = 0.7%) and 4.7% (SE = 0.6%), respectively. Conditional probability of lifetime probable PTSD ranged from 10.1% (sudden death of close family member or friend) to 28.0% (childhood sexual abuse). The DSM-5 PTSD symptoms with the lowest prevalence among veterans with probable PTSD were trauma-related amnesia and reckless and self-destructive behavior. Probable PTSD was associated with increased odds of mood and anxiety disorders (OR = 7.6-62.8, P < .001), substance use disorders (OR = 3.9-4.5, P < .001), and suicidal behaviors (OR = 6.7-15.1, P < .001). In US veterans, the prevalence of DSM-5 probable PTSD, conditional probability of probable PTSD, and odds of psychiatric comorbidity were similar to prior findings with DSM-IV-based measures; we found no evidence that changes in DSM-5 increase psychiatric comorbidity. Results underscore the high rates of exposure to both military and nonmilitary trauma and the high public health burden of DSM-5 PTSD and comorbid conditions in veterans. © Copyright 2016 Physicians Postgraduate Press, Inc.
Improving experimental phases for strong reflections prior to density modification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Uervirojnangkoorn, Monarin; University of Lübeck, Ratzeburger Allee 160, 23538 Lübeck; Hilgenfeld, Rolf, E-mail: hilgenfeld@biochem.uni-luebeck.de
A genetic algorithm has been developed to optimize the phases of the strongest reflections in SIR/SAD data. This is shown to facilitate density modification and model building in several test cases. Experimental phasing of diffraction data from macromolecular crystals involves deriving phase probability distributions. These distributions are often bimodal, making their weighted average, the centroid phase, improbable, so that electron-density maps computed using centroid phases are often non-interpretable. Density modification brings in information about the characteristics of electron density in protein crystals. In successful cases, this allows a choice between the modes in the phase probability distributions, and the maps can cross the borderline between non-interpretable and interpretable. Based on the suggestions by Vekhter [Vekhter (2005), Acta Cryst. D61, 899–902], the impact of identifying optimized phases for a small number of strong reflections prior to the density-modification process was investigated while using the centroid phase as a starting point for the remaining reflections. A genetic algorithm was developed that optimizes the quality of such phases using the skewness of the density map as a target function. Phases optimized in this way are then used in density modification. In most of the tests, the resulting maps were of higher quality than maps generated from the original centroid phases. In one of the test cases, the new method sufficiently improved a marginal set of experimental SAD phases to enable successful map interpretation. A computer program, SISA, has been developed to apply this method for phase improvement in macromolecular crystallography.
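The SISA program itself is not reproduced here. The following is a minimal, hypothetical one-dimensional sketch of the idea: a handful of "strong reflections" with assumed amplitudes define a toy density, and a simple genetic algorithm searches for the phases that maximize the skewness of that density, mirroring the target function described above. The reflection indices, amplitudes, and GA settings are all illustrative assumptions, not the published method's parameters.

```python
import numpy as np
from scipy.stats import skew

rng = np.random.default_rng(1)

# Toy 1-D "crystal": a few strong reflections with assumed amplitudes.
h = np.array([1, 2, 3, 5])             # reflection indices (hypothetical)
amp = np.array([4.0, 3.0, 2.5, 2.0])   # structure-factor amplitudes (hypothetical)
x = np.linspace(0.0, 1.0, 256, endpoint=False)

def density(phases):
    """1-D electron-density analogue: sum of cosines with the trial phases."""
    return np.sum(amp[:, None] * np.cos(2 * np.pi * h[:, None] * x + phases[:, None]), axis=0)

def fitness(phases):
    return skew(density(phases))       # skewness of the map as the GA target

# Minimal genetic algorithm: truncation selection, uniform crossover, Gaussian mutation.
pop = rng.uniform(0, 2 * np.pi, size=(60, len(h)))
for generation in range(200):
    scores = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(scores)[-20:]]              # keep the fittest third
    children = []
    while len(children) < len(pop) - len(parents):
        a, b = parents[rng.integers(0, len(parents), 2)]
        mask = rng.random(len(h)) < 0.5                   # uniform crossover
        child = np.where(mask, a, b) + rng.normal(0, 0.2, len(h))  # mutation
        children.append(np.mod(child, 2 * np.pi))
    pop = np.vstack([parents, children])

best = pop[np.argmax([fitness(ind) for ind in pop])]
print("best skewness:", round(float(fitness(best)), 3))
print("optimized phases (rad):", np.round(best, 2))
```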
Self-Verification of Ability through Biased Performance Memory.
ERIC Educational Resources Information Center
Karabenick, Stuart A.; LeBlanc, Daniel
Evidence points to a pervasive tendency for persons to behave in ways that maintain their existing cognitive structures. One strategy by which this self-verification is made more probable involves information processing. Through attention, encoding and retrieval, and the interpretation of events, persons process information so that self-confirmatory…
Using Astrology to Teach Research Methods to Introductory Psychology Students.
ERIC Educational Resources Information Center
Ward, Roger A.; Grasha, Anthony F.
1986-01-01
Provides a classroom demonstration designed to test an astrological hypothesis and help teach introductory psychology students about research design and data interpretation. Illustrates differences between science and nonscience, the role of theory in developing and testing hypotheses, making comparisons among groups, probability and statistical…
Statement Verification: A Stochastic Model of Judgment and Response.
ERIC Educational Resources Information Center
Wallsten, Thomas S.; Gonzalez-Vallejo, Claudia
1994-01-01
A stochastic judgment model (SJM) is presented as a framework for addressing issues in statement verification and probability judgment. Results of 5 experiments with 264 undergraduates support the validity of the model and provide new information that is interpreted in terms of the SJM. (SLD)
Physiological condition of autumn-banded mallards and its relationship to hunting vulnerability
Hepp, G.R.; Blohm, R.J.; Reynolds, R.E.; Hines, J.E.; Nichols, J.D.
1986-01-01
An important topic of waterfowl ecology concerns the relationship between the physiological condition of ducks during the nonbreeding season and fitness, i.e., survival and future reproductive success. We investigated this subject using direct band recovery records of mallards (Anas platyrhynchos) banded in autumn (1 Oct-15 Dec) 1981-83 in the Mississippi Alluvial Valley (MAV) [USA]. A condition index, weight (g)/wing length (mm), was calculated for each duck, and we tested whether condition of mallards at time of banding was related to their probability of recovery during the hunting season. In 3 years, 5,610 mallards were banded and there were 234 direct recoveries. Three binary regression models were used to test the relationship between recovery probability and condition. Likelihood-ratio tests were conducted to determine the most suitable model. For mallards banded in autumn there was a negative relationship between physical condition and the probability of recovery. Mallards in poor condition at the time of banding had a greater probability of being recovered during the hunting season. In general, this was true for all age and sex classes; however, the strongest relationship occurred for adult males.
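As a hedged illustration of the kind of binary (logistic) regression described above, the sketch below fits recovery probability against a condition index using synthetic data with a built-in negative relationship. It is not the authors' model, which also considered age and sex classes; all numbers are invented.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)

# Synthetic stand-in for the banding data: condition index (weight/wing length)
# and whether the bird was recovered during the hunting season.
n = 2000
condition = rng.normal(4.0, 0.4, n)                   # hypothetical condition indices
logit_p = -2.0 - 1.2 * (condition - 4.0)              # negative slope: poorer condition -> higher recovery
recovered = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

X = sm.add_constant(condition)
fit = sm.Logit(recovered, X).fit(disp=False)
print(fit.params)        # negative coefficient on condition reproduces the reported direction
print(fit.llr_pvalue)    # likelihood-ratio test of the model
```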
From Inverse Problems in Mathematical Physiology to Quantitative Differential Diagnoses
Zenker, Sven; Rubin, Jonathan; Clermont, Gilles
2007-01-01
The improved capacity to acquire quantitative data in a clinical setting has generally failed to improve outcomes in acutely ill patients, suggesting a need for advances in computer-supported data interpretation and decision making. In particular, the application of mathematical models of experimentally elucidated physiological mechanisms could augment the interpretation of quantitative, patient-specific information and help to better target therapy. Yet, such models are typically complex and nonlinear, a reality that often precludes the identification of unique parameters and states of the model that best represent available data. Hypothesizing that this non-uniqueness can convey useful information, we implemented a simplified simulation of a common differential diagnostic process (hypotension in an acute care setting), using a combination of a mathematical model of the cardiovascular system, a stochastic measurement model, and Bayesian inference techniques to quantify parameter and state uncertainty. The output of this procedure is a probability density function on the space of model parameters and initial conditions for a particular patient, based on prior population information together with patient-specific clinical observations. We show that multimodal posterior probability density functions arise naturally, even when unimodal and uninformative priors are used. The peaks of these densities correspond to clinically relevant differential diagnoses and can, in the simplified simulation setting, be constrained to a single diagnosis by assimilating additional observations from dynamical interventions (e.g., fluid challenge). We conclude that the ill-posedness of the inverse problem in quantitative physiology is not merely a technical obstacle, but rather reflects clinical reality and, when addressed adequately in the solution process, provides a novel link between mathematically described physiological knowledge and the clinical concept of differential diagnoses. We outline possible steps toward translating this computational approach to the bedside, to supplement today's evidence-based medicine with a quantitatively founded model-based medicine that integrates mechanistic knowledge with patient-specific information. PMID:17997590
Interpretation of diagnostic data: 5. How to do it with simple maths.
1983-11-01
The use of simple maths with the likelihood ratio strategy fits in nicely with our clinical views. By making the most out of the entire range of diagnostic test results (i.e., several levels, each with its own likelihood ratio, rather than a single cut-off point and a single ratio) and by permitting us to keep track of the likelihood that a patient has the target disorder at each point along the diagnostic sequence, this strategy allows us to place patients at an extremely high or an extremely low likelihood of disease. Thus, the numbers of patients with ultimately false-positive results (who suffer the slings of labelling and the arrows of needless therapy) and of those with ultimately false-negative results (who therefore miss their chance for diagnosis and, possibly, efficacious therapy) will be dramatically reduced. The following guidelines will be useful in interpreting signs, symptoms and laboratory tests with the likelihood ratio strategy: Seek out, and demand from the clinical or laboratory experts who ought to know, the likelihood ratios for key symptoms and signs, and several levels (rather than just the positive and negative results) of diagnostic test results. Identify, when feasible, the logical sequence of diagnostic tests. Estimate the pretest probability of disease for the patient, and, using either the nomogram or the conversion formulas, apply the likelihood ratio that corresponds to the first diagnostic test result. While remembering that the resulting post-test probability or odds from the first test becomes the pretest probability or odds for the next diagnostic test, repeat the process for all the pertinent symptoms, signs and laboratory studies that pertain to the target disorder. However, these combinations may not be independent, and convergent diagnostic tests, if treated as independent, will combine to overestimate the final post-test probability of disease. You are now far more sophisticated in interpreting diagnostic tests than most of your teachers. In the last part of our series we will show you some rather complex strategies that combine diagnosis and therapy, quantify our as yet nonquantified ideas about use, and require the use of at least a hand calculator.
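A minimal sketch of the likelihood ratio arithmetic described above, assuming independence of the successive findings (a caveat the article itself raises for convergent tests); the pretest probability and likelihood ratios below are illustrative, not taken from the article.

```python
def post_test_probability(pretest_prob, likelihood_ratio):
    """Convert a pretest probability to a post-test probability via odds and a likelihood ratio."""
    pretest_odds = pretest_prob / (1 - pretest_prob)
    post_odds = pretest_odds * likelihood_ratio
    return post_odds / (1 + post_odds)

# Hypothetical diagnostic sequence: start at a 20% pretest probability and apply
# the likelihood ratio for each successive finding in turn.
prob = 0.20
for lr in (4.0, 0.8, 6.5):
    prob = post_test_probability(prob, lr)
    print(f"after LR={lr}: post-test probability = {prob:.2f}")
```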
Interpretation of diagnostic data: 5. How to do it with simple maths.
1983-01-01
The use of simple maths with the likelihood ratio strategy fits in nicely with our clinical views. By making the most out of the entire range of diagnostic test results (i.e., several levels, each with its own likelihood ratio, rather than a single cut-off point and a single ratio) and by permitting us to keep track of the likelihood that a patient has the target disorder at each point along the diagnostic sequence, this strategy allows us to place patients at an extremely high or an extremely low likelihood of disease. Thus, the numbers of patients with ultimately false-positive results (who suffer the slings of labelling and the arrows of needless therapy) and of those with ultimately false-negative results (who therefore miss their chance for diagnosis and, possibly, efficacious therapy) will be dramatically reduced. The following guidelines will be useful in interpreting signs, symptoms and laboratory tests with the likelihood ratio strategy: Seek out, and demand from the clinical or laboratory experts who ought to know, the likelihood ratios for key symptoms and signs, and several levels (rather than just the positive and negative results) of diagnostic test results. Identify, when feasible, the logical sequence of diagnostic tests. Estimate the pretest probability of disease for the patient, and, using either the nomogram or the conversion formulas, apply the likelihood ratio that corresponds to the first diagnostic test result. While remembering that the resulting post-test probability or odds from the first test becomes the pretest probability or odds for the next diagnostic test, repeat the process for all the pertinent symptoms, signs and laboratory studies that pertain to the target disorder. However, these combinations may not be independent, and convergent diagnostic tests, if treated as independent, will combine to overestimate the final post-test probability of disease. You are now far more sophisticated in interpreting diagnostic tests than most of your teachers. In the last part of our series we will show you some rather complex strategies that combine diagnosis and therapy, quantify our as yet nonquantified ideas about use, and require the use of at least a hand calculator. PMID:6671182
Quantitative assessment of building fire risk to life safety.
Guanquan, Chu; Jinhua, Sun
2008-06-01
This article presents a quantitative risk assessment framework for evaluating fire risk to life safety. Fire risk is divided into two parts: probability and corresponding consequence of every fire scenario. The time-dependent event tree technique is used to analyze probable fire scenarios based on the effect of fire protection systems on fire spread and smoke movement. To obtain the variation of occurrence probability with time, Markov chain is combined with a time-dependent event tree for stochastic analysis on the occurrence probability of fire scenarios. To obtain consequences of every fire scenario, some uncertainties are considered in the risk analysis process. When calculating the onset time to untenable conditions, a range of fires are designed based on different fire growth rates, after which uncertainty of onset time to untenable conditions can be characterized by probability distribution. When calculating occupant evacuation time, occupant premovement time is considered as a probability distribution. Consequences of a fire scenario can be evaluated according to probability distribution of evacuation time and onset time of untenable conditions. Then, fire risk to life safety can be evaluated based on occurrence probability and consequences of every fire scenario. To express the risk assessment method in detail, a commercial building is presented as a case study. A discussion compares the assessment result of the case study with fire statistics.
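As a hedged sketch of one component of such a framework, the snippet below uses Monte Carlo sampling to estimate the probability that evacuation is still in progress when conditions become untenable, and combines it with an assumed scenario occurrence probability. The distributions and numbers are illustrative only, not those of the case study.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 100_000

# Illustrative distributions (not from the article), in seconds:
# onset time of untenable conditions for a given design fire, and
# occupant pre-movement plus movement time.
onset_untenable = rng.lognormal(mean=np.log(480), sigma=0.25, size=n)
pre_movement = rng.lognormal(mean=np.log(120), sigma=0.5, size=n)
movement = rng.normal(180, 30, size=n)
evacuation = pre_movement + movement

# Consequence measure for this scenario: probability that evacuation is not
# complete before conditions become untenable.
p_untenable_before_evac = np.mean(evacuation > onset_untenable)

# Scenario risk combines occurrence probability (e.g. from the event-tree/Markov
# analysis) with this consequence; total risk sums such terms over scenarios.
p_scenario = 1e-3   # hypothetical occurrence probability of this scenario
print(f"P(evacuation incomplete)   = {p_untenable_before_evac:.3f}")
print(f"scenario risk contribution = {p_scenario * p_untenable_before_evac:.2e}")
```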
Dinov, Ivo D.; Kamino, Scott; Bhakhrani, Bilal; Christou, Nicolas
2014-01-01
Data analysis requires subtle probability reasoning to answer questions like What is the chance of event A occurring, given that event B was observed? This generic question arises in discussions of many intriguing scientific questions such as What is the probability that an adolescent weighs between 120 and 140 pounds given that they are of average height? and What is the probability of (monetary) inflation exceeding 4% and housing price index below 110? To address such problems, learning some applied, theoretical or cross-disciplinary probability concepts is necessary. Teaching such courses can be improved by utilizing modern information technology resources. Students’ understanding of multivariate distributions, conditional probabilities, correlation and causation can be significantly strengthened by employing interactive web-based science educational resources. Independent of the type of a probability course (e.g. majors, minors or service probability course, rigorous measure-theoretic, applied or statistics course) student motivation, learning experiences and knowledge retention may be enhanced by blending modern technological tools within the classical conceptual pedagogical models. We have designed, implemented and disseminated a portable open-source web-application for teaching multivariate distributions, marginal, joint and conditional probabilities using the special case of bivariate Normal distribution. A real adolescent height and weight dataset is used to demonstrate the classroom utilization of the new web-application to address problems of parameter estimation, univariate and multivariate inference. PMID:25419016
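A minimal sketch of the conditional bivariate-normal calculation that the opening question involves, using hypothetical height and weight parameters rather than the real adolescent dataset used by the web application.

```python
import math
from scipy.stats import norm

# Hypothetical bivariate-normal parameters for adolescent height (in) and weight (lb).
mu_h, sd_h = 64.0, 3.0
mu_w, sd_w = 125.0, 18.0
rho = 0.6

# The conditional distribution of weight given height h is normal with these parameters.
h = 64.0   # "average height"
mu_cond = mu_w + rho * sd_w / sd_h * (h - mu_h)
sd_cond = sd_w * math.sqrt(1 - rho ** 2)

# P(120 <= weight <= 140 | height = h)
p = norm.cdf(140, mu_cond, sd_cond) - norm.cdf(120, mu_cond, sd_cond)
print(f"P(120 <= W <= 140 | H = {h}) = {p:.3f}")
```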
Dinov, Ivo D; Kamino, Scott; Bhakhrani, Bilal; Christou, Nicolas
2013-01-01
Data analysis requires subtle probability reasoning to answer questions like What is the chance of event A occurring, given that event B was observed? This generic question arises in discussions of many intriguing scientific questions such as What is the probability that an adolescent weighs between 120 and 140 pounds given that they are of average height? and What is the probability of (monetary) inflation exceeding 4% and housing price index below 110? To address such problems, learning some applied, theoretical or cross-disciplinary probability concepts is necessary. Teaching such courses can be improved by utilizing modern information technology resources. Students' understanding of multivariate distributions, conditional probabilities, correlation and causation can be significantly strengthened by employing interactive web-based science educational resources. Independent of the type of a probability course (e.g. majors, minors or service probability course, rigorous measure-theoretic, applied or statistics course) student motivation, learning experiences and knowledge retention may be enhanced by blending modern technological tools within the classical conceptual pedagogical models. We have designed, implemented and disseminated a portable open-source web-application for teaching multivariate distributions, marginal, joint and conditional probabilities using the special case of bivariate Normal distribution. A real adolescent height and weight dataset is used to demonstrate the classroom utilization of the new web-application to address problems of parameter estimation, univariate and multivariate inference.
Metocean design parameter estimation for fixed platform based on copula functions
NASA Astrophysics Data System (ADS)
Zhai, Jinjin; Yin, Qilin; Dong, Sheng
2017-08-01
Considering the dependent relationship among wave height, wind speed, and current velocity, we construct novel trivariate joint probability distributions via Archimedean copula functions. A total of 30 years of wave height, wind speed, and current velocity data in the Bohai Sea are hindcast and sampled for the case study. Four kinds of distributions, namely, Gumbel distribution, lognormal distribution, Weibull distribution, and Pearson Type III distribution, are candidate models for the marginal distributions of wave height, wind speed, and current velocity. The Pearson Type III distribution is selected as the optimal model. Bivariate and trivariate probability distributions of these environmental conditions are established based on four bivariate and trivariate Archimedean copulas, namely, Clayton, Frank, Gumbel-Hougaard, and Ali-Mikhail-Haq copulas. These joint probability models can maximize marginal information and the dependence among the three variables. The design return values of these three variables can be obtained by three methods: univariate probability, conditional probability, and joint probability. The joint return periods of different load combinations are estimated by the proposed models. Platform responses (including base shear, overturning moment, and deck displacement) are further calculated. For the same return period, the design values of wave height, wind speed, and current velocity obtained by the conditional and joint probability models are much smaller than those obtained by univariate probability. Considering the dependence among variables, the multivariate probability distributions provide design parameters closer to the actual sea state for ocean platform design.
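A minimal sketch of how joint return periods can be read off a fitted copula, here a bivariate Gumbel-Hougaard copula with an assumed dependence parameter and assumed marginal non-exceedance probabilities; the values are illustrative and not the Bohai Sea estimates.

```python
import numpy as np

def gumbel_hougaard(u, v, theta):
    """Gumbel-Hougaard copula C(u, v; theta), theta >= 1."""
    return np.exp(-(((-np.log(u)) ** theta + (-np.log(v)) ** theta) ** (1.0 / theta)))

# Illustrative values only: annual non-exceedance probabilities of the design wave
# height and wind speed, and a dependence parameter assumed to be fitted elsewhere.
u, v, theta = 0.98, 0.98, 2.0      # each marginal alone corresponds to a 50-year event
C = gumbel_hougaard(u, v, theta)

T_or = 1.0 / (1.0 - C)             # return period: either variable exceeds its design value
T_and = 1.0 / (1.0 - u - v + C)    # return period: both exceed their design values
print(f"OR-case joint return period  = {T_or:.1f} years")
print(f"AND-case joint return period = {T_and:.1f} years")
```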
Probability based models for estimation of wildfire risk
Haiganoush Preisler; D. R. Brillinger; R. E. Burgan; John Benoit
2004-01-01
We present a probability-based model for estimating fire risk. Risk is defined using three probabilities: the probability of fire occurrence; the conditional probability of a large fire given ignition; and the unconditional probability of a large fire. The model is based on grouped data at the 1 km²-day cell level. We fit a spatially and temporally explicit non-...
18 CFR 284.6 - Rate interpretations.
Code of Federal Regulations, 2013 CFR
2013-04-01
... 18 Conservation of Power and Water Resources 1 2013-04-01 2013-04-01 false Rate interpretations... AUTHORITIES General Provisions and Conditions § 284.6 Rate interpretations. (a) Procedure. A pipeline may... rates and charges comply with the requirements of this part. (b) Address. Requests for interpretations...
18 CFR 284.6 - Rate interpretations.
Code of Federal Regulations, 2014 CFR
2014-04-01
... 18 Conservation of Power and Water Resources 1 2014-04-01 2014-04-01 false Rate interpretations... AUTHORITIES General Provisions and Conditions § 284.6 Rate interpretations. (a) Procedure. A pipeline may... rates and charges comply with the requirements of this part. (b) Address. Requests for interpretations...
18 CFR 284.6 - Rate interpretations.
Code of Federal Regulations, 2012 CFR
2012-04-01
... 18 Conservation of Power and Water Resources 1 2012-04-01 2012-04-01 false Rate interpretations... AUTHORITIES General Provisions and Conditions § 284.6 Rate interpretations. (a) Procedure. A pipeline may... rates and charges comply with the requirements of this part. (b) Address. Requests for interpretations...
18 CFR 284.6 - Rate interpretations.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Rate interpretations... AUTHORITIES General Provisions and Conditions § 284.6 Rate interpretations. (a) Procedure. A pipeline may... rates and charges comply with the requirements of this part. (b) Address. Requests for interpretations...
18 CFR 284.6 - Rate interpretations.
Code of Federal Regulations, 2011 CFR
2011-04-01
... 18 Conservation of Power and Water Resources 1 2011-04-01 2011-04-01 false Rate interpretations... AUTHORITIES General Provisions and Conditions § 284.6 Rate interpretations. (a) Procedure. A pipeline may... rates and charges comply with the requirements of this part. (b) Address. Requests for interpretations...
Targeting the probability versus cost of feared outcomes in public speaking anxiety.
Nelson, Elizabeth A; Deacon, Brett J; Lickel, James J; Sy, Jennifer T
2010-04-01
Cognitive-behavioral theory suggests that social phobia is maintained, in part, by overestimates of the probability and cost of negative social events. Indeed, empirically supported cognitive-behavioral treatments directly target these cognitive biases through the use of in vivo exposure or behavioral experiments. While cognitive-behavioral theories and treatment protocols emphasize the importance of targeting probability and cost biases in the reduction of social anxiety, few studies have examined specific techniques for reducing probability and cost bias, and thus the relative efficacy of exposure to the probability versus cost of negative social events is unknown. In the present study, 37 undergraduates with high public speaking anxiety were randomly assigned to a single-session intervention designed to reduce either the perceived probability or the perceived cost of negative outcomes associated with public speaking. Compared to participants in the probability treatment condition, those in the cost treatment condition demonstrated significantly greater improvement on measures of public speaking anxiety and cost estimates for negative social events. The superior efficacy of the cost treatment condition was mediated by greater treatment-related changes in social cost estimates. The clinical implications of these findings are discussed. Published by Elsevier Ltd.
Topological and Orthomodular Modeling of Context in Behavioral Science
NASA Astrophysics Data System (ADS)
Narens, Louis
2017-02-01
Two non-Boolean methods are discussed for modeling context in behavioral data and theory. The first is based on intuitionistic logic, which is similar to classical logic except that not every event has a complement. Its probability theory is also similar to classical probability theory except that the definition of a probability function needs to be generalized to unions of events instead of applying only to unions of disjoint events. The generalization is needed because intuitionistic event spaces may not contain enough disjoint events for the classical definition to be effective. The second method develops a version of quantum logic for its underlying probability theory. It differs from the Hilbert space logic used in quantum mechanics as a foundation for quantum probability theory in a variety of ways. John von Neumann and others have commented about the lack of a relative frequency approach and a rational foundation for this probability theory. This article argues that its version of quantum probability theory does not have such issues. The method based on intuitionistic logic is useful for modeling cognitive interpretations that vary with context, for example, the mood of the decision maker, the context produced by the influence of other items in a choice experiment, etc. The method based on this article's quantum logic is useful for modeling probabilities across contexts, for example, how probabilities of events from different experiments are related.
Supporting anticipation in driving through attentional and interpretational in-vehicle displays.
Stahl, Patrick; Donmez, Birsen; Jamieson, Greg A
2016-06-01
This paper evaluates two different types of in-vehicle interfaces to support anticipation in driving: one aids attention allocation and the other aids interpretation of traffic in addition to attention allocation. Anticipation is a competency that has been shown to facilitate safety and eco-driving through the efficient positioning of a vehicle for probable, upcoming changes in traffic. This competency has been shown to improve with driving experience. In an earlier simulator study, we showed that compared to novice drivers, experienced drivers exhibited a greater number of timely actions to avoid upcoming traffic conflicts. In this study, we seek to facilitate anticipation in general and for novice drivers in particular, who appear to lack the competency. We hypothesize that anticipation depends on two major steps and that it can be supported by aiding each: (1) conscious perception of relevant cues, and (2) effective processing of these cues to create a situational assessment as a basis for anticipation of future developments. We conducted a simulator experiment with 24 experienced and 24 novice drivers to evaluate two interfaces that were designed to aid the two hypothesized steps of anticipation. The attentional interface was designed to direct attention toward the most relevant cue. The interpretational interface represented several cues, and in addition to directing attention also aimed to aid sense-making of these cues. The results confirmed our hypothesis that novice drivers' anticipation performance, as measured through timely actions to avoid upcoming traffic conflicts, would be improved with either interface type. However, results contradicted our expectation that novice drivers would obtain larger improvements with the interpretational interface. Experienced drivers performed better than novice drivers to begin with and did not show any statistically significant improvements with either interface. Both interfaces improved anticipation performance for novice drivers. Future research should evaluate the effectiveness of these interfaces in a wider variety of driving conditions, such as when the driver is multitasking. Copyright © 2016 Elsevier Ltd. All rights reserved.
Ennis, Erin J; Foley, Joe P
2016-07-15
A stochastic approach was utilized to estimate the probability of a successful isocratic or gradient separation in conventional chromatography for numbers of sample components, peak capacities, and saturation factors ranging from 2 to 30, 20-300, and 0.017-1, respectively. The stochastic probabilities were obtained under conditions of (i) constant peak width ("gradient" conditions) and (ii) peak width increasing linearly with time ("isocratic/constant N" conditions). The isocratic and gradient probabilities obtained stochastically were compared with the probabilities predicted by Martin et al. [Anal. Chem., 58 (1986) 2200-2207] and Davis and Stoll [J. Chromatogr. A, (2014) 128-142]; for a given number of components and peak capacity the same trend is always observed: probability obtained with the isocratic stochastic approach
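A hedged Monte Carlo sketch of the constant-peak-width ("gradient-like") case: m component retention times are placed uniformly at random on a unit time axis, and a run counts as a successful separation when every gap between adjacent peaks exceeds 1/nc for peak capacity nc. This is only an illustration of the stochastic approach, not the authors' exact simulation or tabulated results.

```python
import numpy as np

def p_separation(m, peak_capacity, trials=20_000, rng=np.random.default_rng(0)):
    """Monte Carlo estimate of the probability that m randomly placed peaks are all resolved.

    Constant-peak-width assumption: peaks are resolved when every gap between
    adjacent retention times exceeds 1/peak_capacity.
    """
    min_gap = 1.0 / peak_capacity
    times = np.sort(rng.random((trials, m)), axis=1)
    gaps = np.diff(times, axis=1)
    return float(np.mean(np.all(gaps >= min_gap, axis=1)))

# Illustrative numbers within the ranges studied (not the paper's values).
for m, nc in [(5, 50), (10, 100), (20, 200), (30, 300)]:
    print(f"m={m:2d}, nc={nc:3d}: P(success) ~ {p_separation(m, nc):.3f}")
```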
When 95% Accurate Isn't: Exploring Bayes's Theorem
ERIC Educational Resources Information Center
CadwalladerOlsker, Todd D.
2011-01-01
Bayes's theorem is notorious for being a difficult topic to learn and to teach. Problems involving Bayes's theorem (either implicitly or explicitly) generally involve calculations based on two or more given probabilities and their complements. Further, a correct solution depends on students' ability to interpret the problem correctly. Most people…
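A short worked sketch of the point in the title: with Bayes's theorem, a test that is "95% accurate" (95% sensitivity and 95% specificity) yields a surprisingly low probability of the condition given a positive result when the condition is rare. The prevalences below are illustrative.

```python
def posterior_positive(prevalence, sensitivity, specificity):
    """P(condition | positive test) via Bayes's theorem."""
    p_pos = sensitivity * prevalence + (1 - specificity) * (1 - prevalence)
    return sensitivity * prevalence / p_pos

# The same test applied at different base rates.
for prevalence in (0.50, 0.10, 0.01):
    p = posterior_positive(prevalence, 0.95, 0.95)
    print(f"prevalence {prevalence:4.0%}: P(condition | positive) = {p:.2f}")
```

At a 1% prevalence the posterior is only about 0.16, which is the counterintuitive result the article uses to motivate careful interpretation of the theorem.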
Data Interpretation: Using Probability
ERIC Educational Resources Information Center
Drummond, Gordon B.; Vowler, Sarah L.
2011-01-01
Experimental data are analysed statistically to allow researchers to draw conclusions from a limited set of measurements. The hard fact is that researchers can never be certain that measurements from a sample will exactly reflect the properties of the entire group of possible candidates available to be studied (although using a sample is often the…
High Court's TB Ruling Probably Applies to AIDS.
ERIC Educational Resources Information Center
Sendor, Benjamin
1987-01-01
Discusses a United States Supreme Court decision upholding Section 504 protection for an elementary school teacher fired due to recurrent tuberculosis. The school board may need to make reasonable accommodation for employees handicapped by contagious diseases. The Court might also interpret Section 504 as covering AIDS carriers. (MLH)
Common quandaries and their practical solutions in Bayesian network modeling
Bruce G. Marcot
2017-01-01
Use and popularity of Bayesian network (BN) modeling has greatly expanded in recent years, but many common problems remain. Here, I summarize key problems in BN model construction and interpretation, along with suggested practical solutions. Problems in BN model construction include parameterizing probability values, variable definition, complex network structures,...
Causes of Effects and Effects of Causes
ERIC Educational Resources Information Center
Pearl, Judea
2015-01-01
This article summarizes a conceptual framework and simple mathematical methods of estimating the probability that one event was a necessary cause of another, as interpreted by lawmakers. We show that the fusion of observational and experimental data can yield informative bounds that, under certain circumstances, meet legal criteria of causation.…
Linear Programming Problems for Generalized Uncertainty
ERIC Educational Resources Information Center
Thipwiwatpotjana, Phantipa
2010-01-01
Uncertainty occurs when there is more than one realization that can represent an information. This dissertation concerns merely discrete realizations of an uncertainty. Different interpretations of an uncertainty and their relationships are addressed when the uncertainty is not a probability of each realization. A well known model that can handle…
Enhancing the Teaching and Learning of Mathematical Visual Images
ERIC Educational Resources Information Center
Quinnell, Lorna
2014-01-01
The importance of mathematical visual images is indicated by the introductory paragraph in the Statistics and Probability content strand of the Australian Curriculum, which draws attention to the importance of learners developing skills to analyse and draw inferences from data and "represent, summarise and interpret data and undertake…
Reporting Child Sexual Abuse: Ethical Dilemmas, and Guidelines for Decision Making.
ERIC Educational Resources Information Center
Zambelli, Grace C.; Lee, Sandra S.
All states have laws mandating that certain individuals report suspected occurrences of child abuse. Mandatory reporting statutes, their administration, and their judicial interpretation have created many ethical, legal, and clinical dilemmas. The abrogation of the confidentiality in the therapeutic relationship is probably the foremost ethical…
Propagation of landslide inventory errors on data driven landslide susceptibility models
NASA Astrophysics Data System (ADS)
Henriques, C. S.; Zezere, J. L.; Neves, M.; Garcia, R. A. C.; Oliveira, S. C.; Piedade, A.
2009-04-01
Research on landslide susceptibility assessment developed recently worldwide has shown that quality and reliability of modelling results are more sensitive to the quality and consistency of the cartographic database than to the statistical tools used in the modelling process. In particular, the quality of the landslide inventory is of crucial importance, because data-driven models used for landslide susceptibility evaluation are based on the spatial correlation between past landslide occurrences and a data set of thematic layers representing independent landslide predisposing factors. Uncertainty within landslide inventorying may be very high and is usually related to: (i) the geological and geomorphological complexity of the study area; (ii) the dominant land use and the rhythm and magnitude of land use change; (iii) the degree of preservation of landslide evidence (e.g., topography, vegetation, drainage), both in the field and in aerial photographs; and (iv) the experience of the geomorphologist(s) who built the landslide inventory. Traditionally, landslide inventories have been made through aerial-photo interpretation and field surveying using standard geomorphological techniques. More recently, the interpretation of detailed geo-referenced digital orthophotomaps (pixel = 0.5 m), combined with accurate topography, has become an additional analytical tool for landslide identification at the regional scale. The present study was performed in a test site (256 km2) within Caldas da Rainha County, located in the central part of Portugal. Detailed geo-referenced digital orthophotomaps obtained in 2004 were used to build three different landslide inventories. Landslide inventory #1 was constructed by a single trained geomorphologist using photo-interpretation; 408 probable slope movements were identified and geo-referenced by a point marked in the central part of the probable landslide rupture zone. Landslide inventory #2 was obtained through the examination of landslide inventory #1 by a senior geomorphologist. This second phase of photo and morphologic interpretation (pre-validation) allowed the selection of 204 probable slope movements from the first landslide inventory. Landslide inventory #3 was obtained by field verification of the total set of probable landslide zones (408 points), performed by six geomorphologists. This inventory has 193 validated slope movements and includes 101 "new landslides" that had not been recognized by orthophotomap interpretation. Additionally, the field work enabled the cartographic delimitation of the slope movement depletion and accumulation zones, and the definition of landslide type. Landslide susceptibility was assessed with the three landslide inventories using a single predictive model (logistic regression) and the same set of landslide predisposing factors to allow comparison of results. Uncertainty associated with landslide inventory errors and its propagation into landslide susceptibility results is evaluated and compared through the computation of success-rate and prediction-rate curves. The error derived from landslide inventorying is quantified by assessing the degree of overlap of the susceptible areas obtained from the different prediction models.
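As an illustration of the kind of comparison described above, the following sketch fits a logistic-regression susceptibility model to each inventory and derives a success-rate curve; it is a minimal outline under assumed data structures (a terrain-unit factor matrix X and 0/1 inventory vectors y1, y2, y3), not the authors' implementation.

```python
# Minimal sketch (assumed workflow, not the authors' code): fit logistic regression
# to each landslide inventory and compare success-rate curves.
import numpy as np
from sklearn.linear_model import LogisticRegression

def success_rate_curve(y_true, score):
    """Cumulative fraction of landslide units captured as terrain units are
    ranked from most to least susceptible."""
    order = np.argsort(-score)
    captured = np.cumsum(y_true[order]) / y_true.sum()
    area_fraction = np.arange(1, len(score) + 1) / len(score)
    return area_fraction, captured

def susceptibility_scores(X, y):
    model = LogisticRegression(max_iter=1000).fit(X, y)
    return model.predict_proba(X)[:, 1]

# X: predisposing factors per terrain unit; y1, y2, y3: the three inventories
# coded as 0/1 landslide presence (hypothetical placeholders, not provided here).
# curves = [success_rate_curve(y, susceptibility_scores(X, y)) for y in (y1, y2, y3)]
```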
Faithful Squashed Entanglement
NASA Astrophysics Data System (ADS)
Brandão, Fernando G. S. L.; Christandl, Matthias; Yard, Jon
2011-09-01
Squashed entanglement is a measure for the entanglement of bipartite quantum states. In this paper we present a lower bound for squashed entanglement in terms of a distance to the set of separable states. This implies that squashed entanglement is faithful, that is, it is strictly positive if and only if the state is entangled. We derive the lower bound on squashed entanglement from a lower bound on the quantum conditional mutual information which is used to define squashed entanglement. The quantum conditional mutual information corresponds to the amount by which strong subadditivity of von Neumann entropy fails to be saturated. Our result therefore sheds light on the structure of states that almost satisfy strong subadditivity with equality. The proof is based on two recent results from quantum information theory: the operational interpretation of the quantum mutual information as the optimal rate for state redistribution and the interpretation of the regularised relative entropy of entanglement as an error exponent in hypothesis testing. The distance to the set of separable states is measured in terms of the LOCC norm, an operationally motivated norm giving the optimal probability of distinguishing two bipartite quantum states, each shared by two parties, using any protocol formed by local quantum operations and classical communication (LOCC) between the parties. A similar result for the Frobenius or Euclidean norm follows as an immediate consequence. The result has two applications in complexity theory. The first application is a quasipolynomial-time algorithm solving the weak membership problem for the set of separable states in LOCC or Euclidean norm. The second application concerns quantum Merlin-Arthur games. Here we show that multiple provers are not more powerful than a single prover when the verifier is restricted to LOCC operations thereby providing a new characterisation of the complexity class QMA.
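For readers who want the quantity pinned down, the standard textbook definition of squashed entanglement in terms of the quantum conditional mutual information (not a result specific to this paper) reads:

```latex
% Standard definition of squashed entanglement (textbook form):
E_{\mathrm{sq}}(\rho_{AB}) = \tfrac{1}{2}\inf_{\rho_{ABE}} I(A;B\mid E)_{\rho},
\qquad
I(A;B\mid E) = S(AE) + S(BE) - S(ABE) - S(E),
```

where the infimum runs over all extensions ρ_ABE of ρ_AB and S is the von Neumann entropy; the faithfulness result above says this quantity is strictly positive exactly when ρ_AB is entangled.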
Use of historical and geospatial data to guide the restoration of a Lake Erie coastal marsh
Kowalski, Kurt P.; Wilcox, Douglas A.
1999-01-01
Historical and geospatial data were used to identify the relationships between water levels, wetland vegetation, littoral drift of sediments, and the condition of a protective barrier beach at Metzger Marsh, a coastal wetland in western Lake Erie, to enhance and guide a joint federal and state wetland restoration project. Eleven sets of large-scale aerial photographs dating from 1940 through 1994 were interpreted to delineate major vegetation types and boundaries of the barrier beach. A geographic information system (GIS) was then used to digitize the data and calculate the vegetated area and length of barrier beach. Supplemented by paleoecological and sedimentological analyses, aerial photographic interpretation revealed that Metzger Marsh was once a drowned-river-mouth wetland dominated by sedges and protected by a sand barrier beach. Extremely high water levels, storm events, and reduction of sediments in the littoral drift contributed to the complete destruction of the barrier beach in 1973 and prevented its recovery. The extent of wetland vegetation, correlated to water levels and condition of the barrier beach, decreased from a high of 108 ha in 1940 to a low of 33 ha in 1994. The lack of an adequate sediment supply and low probability of a period of extremely low lake levels in the near future made natural reestablishment of the barrier beach and wetland vegetation unlikely. Therefore, the federal and state managers chose to construct a dike to replace the protective barrier beach. Recommendations stemming from this historical analysis, however, resulted in the incorporation of a water-control structure in the dike that will retain a hydrologic connection between wetland and lake. Management of the wetland will seek to mimic processes natural to the wetland type identified by this analysis.
Shimatani, Ichiro Ken; Yoda, Ken; Katsumata, Nobuhiro; Sato, Katsufumi
2012-01-01
To analyze an animal's movement trajectory, a basic model is required that satisfies the following conditions: the model must have an ecological basis and the parameters used in the model must have ecological interpretations, a broad range of movement patterns can be explained by that model, and equations and probability distributions in the model should be mathematically tractable. Random walk models used in previous studies do not necessarily satisfy these requirements, partly because movement trajectories are often more oriented or tortuous than expected from the models. By improving the modeling for turning angles, this study aims to propose a basic movement model. On the basis of the recently developed circular auto-regressive model, we introduced a new movement model and extended its applicability to capture the asymmetric effects of external factors such as wind. The model was applied to GPS trajectories of a seabird (Calonectris leucomelas) to demonstrate its applicability to various movement patterns and to explain how the model parameters are ecologically interpreted under a general conceptual framework for movement ecology. Although it is based on a simple extension of a generalized linear model to circular variables, the proposed model enables us to evaluate the effects of external factors on movement separately from the animal's internal state. For example, maximum likelihood estimates and model selection suggested that in one homing flight section, the seabird intended to fly toward the island, but misjudged its navigation and was driven off-course by strong winds, while in the subsequent flight section, the seabird reset the focal direction, navigated the flight under strong wind conditions, and succeeded in approaching the island.
Assessing Lay Understanding of Common Presentations of Earthquake Hazard Information
NASA Astrophysics Data System (ADS)
Thompson, K. J.; Krantz, D. H.
2010-12-01
The Working Group on California Earthquake Probabilities (WGCEP) includes, in its introduction to earthquake rupture forecast maps, the assertion that "In daily living, people are used to making decisions based on probabilities -- from the flip of a coin (50% probability of heads) to weather forecasts (such as a 30% chance of rain) to the annual chance of being killed by lightning (about 0.0003%)." [3] However, psychology research identifies a large gap between lay and expert perception of risk for various hazards [2], and cognitive psychologists have shown in numerous studies [1,4-6] that people neglect, distort, misjudge, or misuse probabilities, even when given strong guidelines about the meaning of numerical or verbally stated probabilities [7]. The gap between lay and expert use of probability needs to be recognized more clearly by scientific organizations such as WGCEP. This study undertakes to determine how the lay public interprets earthquake hazard information, as presented in graphical map form by the Uniform California Earthquake Rupture Forecast (UCERF), compiled by the WGCEP and other bodies including the USGS and CGS. It also explores alternate ways of presenting hazard data, to determine which presentation format most effectively translates information from scientists to public. Participants both from California and from elsewhere in the United States are included, to determine whether familiarity -- either with the experience of an earthquake, or with the geography of the forecast area -- affects people's ability to interpret an earthquake hazards map. We hope that the comparisons between the interpretations by scientific experts and by different groups of laypeople will both enhance theoretical understanding of factors that affect information transmission and assist bodies such as the WGCEP in their laudable attempts to help people prepare themselves and their communities for possible natural hazards. [1] Kahneman, D & Tversky, A (1979). Prospect Theory: An Analysis of Decision under Risk. Econometrica, XLVII: 263-291. [2] Fischhoff, B, Slovic, P, Lichtenstein, S, Read, S & Combs, B (1978). How safe is safe enough? A psychometric study of attitudes towards technological risks and benefits. Pol Sci, 9, 127-152. [3] http://www.scec.org/ucerf/ [4] Hau, R, Pleskac, TJ, Kiefer, J & Hertwig, R (2008). The Description-Experience Gap in Risky Choice: The Role of Sample Size and Experienced Probabilities. J Behav Decis Making, 21: 493-518. [5] Lichtenstein, S, Slovic, P, Fischhoff, B, Layman, M & Combs, B (1978). Judged frequency of lethal events. J Exp Psy: Human Learning and Memory, 4, 551-578. [6] Hertwig, R, Barron, G, Weber, EU & Erev, I (2006). The role of information sampling in risky choice. In K Fiedler & P Juslin (Eds), Information sampling and adaptive cognition. Pp 75-91. New York: Cambridge University Press. [7] Budescu, DV, Broomell, S & Por HH (2009). Improving communication of uncertainty in the reports of the intergovernmental panel on climate change. Psychol Sci, 20(3), 299-308.
NASA Astrophysics Data System (ADS)
Boslough, M.
2011-12-01
Climate-related uncertainty is traditionally presented as an error bar, but it is becoming increasingly common to express it in terms of a probability density function (PDF). PDFs are a necessary component of probabilistic risk assessments, for which simple "best estimate" values are insufficient. Many groups have generated PDFs for climate sensitivity using a variety of methods. These PDFs are broadly consistent, but vary significantly in their details. One axiom of the verification and validation community is, "codes don't make predictions, people make predictions." This is a statement of the fact that subject domain experts generate results using assumptions within a range of epistemic uncertainty and interpret them according to their expert opinion. Different experts with different methods will arrive at different PDFs. For effective decision support, a single consensus PDF would be useful. We suggest that market methods can be used to aggregate an ensemble of opinions into a single distribution that expresses the consensus. Prediction markets have been shown to be highly successful at forecasting the outcome of events ranging from elections to box office returns. In prediction markets, traders can take a position on whether some future event will or will not occur. These positions are expressed as contracts that are traded in a double-auction market that aggregates price, which can be interpreted as a consensus probability that the event will take place. Since climate sensitivity cannot directly be measured, it cannot be predicted. However, the changes in global mean surface temperature are a direct consequence of climate sensitivity, changes in forcing, and internal variability. Viable prediction markets require an undisputed event outcome on a specific date. Climate-related markets exist on Intrade.com, an online trading exchange. One such contract is titled "Global Temperature Anomaly for Dec 2011 to be greater than 0.65 Degrees C." Settlement is based on global temperature anomaly data published by NASA GISS. Typical climate contracts predict the probability of a specified future temperature, but not the probability density or best estimate. One way to generate a probability distribution would be to create a family of contracts over a range of specified temperatures and interpret the price of each contract as its exceedance probability. The resulting plot of probability vs. anomaly is the market-based cumulative distribution function. The best estimate can be determined by interpolation, and the market-based uncertainty estimate can be based on the spread. One requirement for an effective prediction market is liquidity. Climate contracts are currently considered somewhat of a novelty and often lack sufficient liquidity, but climate change has the potential to generate both tremendous losses for some (e.g. agricultural collapse and extreme weather events) and wealth for others (access to natural resources and trading routes). Use of climate markets by large stakeholders has the potential to generate the liquidity necessary to make them viable. Sandia is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. DoE's NNSA under contract DE-AC04-94AL85000.
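The construction described above, turning a family of threshold contracts into a market-based distribution, can be sketched in a few lines; the thresholds and prices below are invented placeholders, not actual Intrade quotes.

```python
# Hypothetical sketch: read each contract price as the exceedance probability
# P(anomaly > threshold) and assemble a market-based CDF.
import numpy as np

thresholds = np.array([0.45, 0.55, 0.65, 0.75, 0.85])   # deg C (assumed contract levels)
prices     = np.array([0.95, 0.80, 0.55, 0.25, 0.08])   # contract prices (assumed)

cdf = 1.0 - prices                                # P(anomaly <= threshold)
best_estimate = np.interp(0.5, cdf, thresholds)   # median of the market-based distribution
spread = np.interp(0.84, cdf, thresholds) - np.interp(0.16, cdf, thresholds)
print(best_estimate, spread)                      # central value and a 1-sigma-like spread
```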
NASA Astrophysics Data System (ADS)
Marinov, D.; Lopatik, D.; Guaitella, O.; Hübner, M.; Ionikh, Y.; Röpcke, J.; Rousseau, A.
2012-05-01
A new method for determination of the wall de-excitation probability γ_N2 of vibrationally excited N2 on different surfaces exposed to low-pressure plasmas has been developed. A short dc discharge pulse of only a few milliseconds was applied to a mixture containing 0.05-1% of CO2 in N2 at a pressure of 133 Pa. Due to a nearly resonant fast vibrational transfer between N2(v) and the asymmetric ν3 mode of CO2, the vibrational excitation of these titrating molecules is an image of the degree of vibrational excitation of N2. In the afterglow, the vibrational relaxation of CO2 was monitored in situ using quantum cascade laser absorption spectroscopy. The experimental results were interpreted in terms of a numerical model of non-equilibrium vibrational kinetics in CO2-N2 mixtures. Heterogeneous relaxation was the main quenching process of N2(v) under the conditions of this study, which allowed determination of the value of γ_N2 from the best agreement between the experiment and the model. The new method is suitable for γ_N2 determination in a single plasma pulse with the discharge tube surface pretreated by a low-pressure plasma. The relaxation probability of the first vibrational level of nitrogen, γ1 = (1.1 ± 0.15) × 10^-3, found for Pyrex and silica is in reasonable agreement with the literature data. Using the new technique the N2(v = 1) quenching probability was measured on a TiO2 surface, γ1 = (9 ± 1) × 10^-3. A linear enhancement of the N2(v) wall deactivation probability with an increase in the admixture of CO2 was observed for all studied materials. In order to explain this effect, a vibrational energy transfer mechanism between N2(v) and adsorbed CO2 is proposed.
Espejo, L A; Zagmutt, F J; Groenendaal, H; Muñoz-Zanzi, C; Wells, S J
2015-11-01
The objective of this study was to evaluate the performance of bacterial culture of feces and serum ELISA to correctly identify cows with Mycobacterium avium ssp. paratuberculosis (MAP) at heavy, light, and non-fecal-shedding levels. A total of 29,785 parallel test results from bacterial culture of feces and serum ELISA were collected from 17 dairy herds in Minnesota, Pennsylvania, and Colorado. Samples were obtained from adult cows from dairy herds enrolled for up to 10 yr in the National Johne's Disease Demonstration Herd Project. A Bayesian latent class model was fitted to estimate the probabilities that bacterial culture of feces (using 72-h sedimentation or 30-min centrifugation methods) and serum ELISA results correctly identified cows as high positive, low positive, or negative given that cows were heavy, light, and non-shedders, respectively. The model assumed that no gold standard test was available and that conditional independence existed between diagnostic tests. The estimated conditional probabilities that bacterial culture of feces correctly identified heavy shedders, light shedders, and non-shedders were 70.9, 32.0, and 98.5%, respectively. The same values for the serum ELISA were 60.6, 18.7, and 99.5%, respectively. Differences in diagnostic test performance were observed among states. These results improve the interpretation of results from bacterial culture of feces and serum ELISA for detection of MAP and MAP antibody (respectively), which can support on-farm infection control decisions and can be used to evaluate disease-testing strategies, taking into account the accuracy of these tests. Copyright © 2015 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
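To show how such conditional probabilities might be used downstream, the sketch below combines them with an assumed herd composition to get the expected proportion of correct calls for each test; only the conditional probabilities come from the abstract, the prevalences are invented.

```python
# Expected proportion of correctly classified cows under an assumed herd composition.
# The correct-call probabilities are taken from the abstract; prevalences are hypothetical.
import numpy as np

prevalence        = np.array([0.02, 0.08, 0.90])       # heavy, light, non-shedder (assumed)
p_correct_culture = np.array([0.709, 0.320, 0.985])    # fecal culture, from the abstract
p_correct_elisa   = np.array([0.606, 0.187, 0.995])    # serum ELISA, from the abstract

print(float(prevalence @ p_correct_culture))   # expected fraction of correct culture calls
print(float(prevalence @ p_correct_elisa))     # expected fraction of correct ELISA calls
```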
Evans, Karla K.; Birdwell, Robyn L.; Wolfe, Jeremy M.
2013-01-01
Mammography is an important tool in the early detection of breast cancer. However, the perceptual task is difficult and a significant proportion of cancers are missed. Visual search experiments show that miss (false negative) errors are elevated when targets are rare (low prevalence) but it is unknown if low prevalence is a significant factor under real world, clinical conditions. Here we show that expert mammographers in a real, low-prevalence, clinical setting, miss a much higher percentage of cancers than are missed when the mammographers search for the same cancers under high prevalence conditions. We inserted 50 positive and 50 negative cases into the normal workflow of the breast cancer screening service of an urban hospital over the course of nine months. This rate was slow enough not to markedly raise disease prevalence in the radiologists’ daily practice. Six radiologists subsequently reviewed all 100 cases in a session where the prevalence of disease was 50%. In the clinical setting, participants missed 30% of the cancers. In the high prevalence setting, participants missed just 12% of the same cancers. Under most circumstances, this low prevalence effect is probably adaptive. It is usually wise to be conservative about reporting events with very low base rates (Was that a flying saucer? Probably not.). However, while this response to low prevalence appears to be strongly engrained in human visual search mechanisms, it may not be as adaptive in socially important, low prevalence tasks like medical screening. While the results of any one study must be interpreted cautiously, these data are consistent with the conclusion that this behavioral response to low prevalence could be a substantial contributor to miss errors in breast cancer screening. PMID:23737980
Politeness and the communication of uncertainty.
Holtgraves, Thomas; Perdew, Audrey
2016-09-01
Ambiguity in language derives, in part, from the multiple motivations that underlie the choice to use any particular expression. The use of some lexical items, such as probability expressions and scalar terms, can be motivated by a desire to communicate uncertainty as well as a desire to be polite (i.e., manage face). Research has demonstrated that the interpretation of these items can be influenced by the existence of a potential politeness motive. In general, communications about negative events, relative to positive events, result in higher likelihood estimates whenever politeness can be discerned as a potential motive. With few exceptions, however, this research has focused only on the hearer. In the present research we focused on the dyad and examined whether speakers vary their messages as a function of politeness, and the effect that this has on subsequent judgments made by a recipient. In two experiments we presented participants with situations that varied in terms of face-threat and asked them how they would communicate potentially threatening information. Both experiments included a second set of participants who read these utterances and provided judgments as to the degree of uncertainty conveyed by the utterance. In both experiments, messages in the face-threatening condition conveyed greater uncertainty than messages in the non-face-threatening condition, and the probability estimates made by the second set of participants varied as a function of conveyed uncertainty. This research demonstrates that when examining speakers and hearers together, severe events may be judged less likely (rather than more likely), because speakers tend to hedge the certainty with which they communicate the information. Copyright © 2016 Elsevier B.V. All rights reserved.
[Age index and an interpretation of survivorship curves (author's transl)].
Lohmann, W
1977-01-01
Clinical investigations showed that the age dependences of physiological functions do not show -- as generally assumed -- a linear increase with age, but an exponential one. Considering this result one can easily interpret the survivorship curve of a population (Gompertz plot). The only thing that is required is that the probability of death (death rate) is proportional to a function of ageing given by mu(t) = mu0 exp (alpha t). Considering survivorship curves resulting from annual death statistics and fitting them by suitable parameters, then the resulting alpha-values are in agreement with clinical data.
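The hazard function quoted above implies the familiar Gompertz survivorship curve; the following is a small numerical sketch of that standard result, with purely illustrative parameter values.

```python
# Survivorship implied by the hazard mu(t) = mu0 * exp(alpha * t):
#   S(t) = exp(-(mu0 / alpha) * (exp(alpha * t) - 1))   (standard Gompertz result)
# Parameter values are illustrative, not fitted to any data set.
import numpy as np

mu0, alpha = 1e-4, 0.085               # hypothetical baseline death rate and ageing rate
t = np.linspace(0.0, 100.0, 101)       # age in years
survival = np.exp(-(mu0 / alpha) * (np.exp(alpha * t) - 1.0))
```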
Microbiology of ancient and modern hydrothermal systems.
Reysenbach, A L; Cady, S L
2001-02-01
Hydrothermal systems have prevailed throughout geological history on Earth, and ancient Archaean hydrothermal deposits could provide clues to understanding Earth's earliest biosphere. Modern hydrothermal systems support a plethora of microorganisms and macroorganisms, and provide good comparisons for paleontological interpretation of ancient hydrothermal systems. However, all of the microfossils associated with ancient hydrothermal deposits reported to date are filamentous, and limited stable isotope analysis suggests that these microfossils were probably autotrophs. Therefore, the morphology and mode of carbon metabolism are attributes of microorganisms from modern hydrothermal systems that provide valuable information for interpreting the geological record using morphological and isotopic signatures.
Quantum nonlocality does not exist
Tipler, Frank J.
2014-01-01
Quantum nonlocality is shown to be an artifact of the Copenhagen interpretation, in which each observed quantity has exactly one value at any instant. In reality, all physical systems obey quantum mechanics, which obeys no such rule. Locality is restored if observed and observer are both assumed to obey quantum mechanics, as in the many-worlds interpretation (MWI). Using the MWI, I show that the quantum side of Bell’s inequality, generally believed nonlocal, is really due to a series of three measurements (not two as in the standard, oversimplified analysis), all three of which have only local effects. Thus, experiments confirming “nonlocality” are actually confirming the MWI. The mistaken interpretation of nonlocality experiments depends crucially on a question-begging version of the Born interpretation, which makes sense only in “collapse” versions of quantum theory, about the meaning of the modulus of the wave function, so I use the interpretation based on the MWI, namely that the wave function is a world density amplitude, not a probability amplitude. This view allows the Born interpretation to be derived directly from the Schrödinger equation, by applying the Schrödinger equation to both the observed and the observer. PMID:25015084
Approaches to Evaluating Probability of Collision Uncertainty
NASA Technical Reports Server (NTRS)
Hejduk, Matthew D.; Johnson, Lauren C.
2016-01-01
While the two-dimensional probability of collision (Pc) calculation has served as the main input to conjunction analysis risk assessment for over a decade, it has done this mostly as a point estimate, with relatively little effort made to produce confidence intervals on the Pc value based on the uncertainties in the inputs. The present effort seeks to try to carry these uncertainties through the calculation in order to generate a probability density of Pc results rather than a single average value. Methods for assessing uncertainty in the primary and secondary objects' physical sizes and state estimate covariances, as well as a resampling approach to reveal the natural variability in the calculation, are presented; and an initial proposal for operationally-useful display and interpretation of these data for a particular conjunction is given.
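One simple way to realise the resampling idea described above is a generic Monte Carlo over the uncertain inputs; the sketch below is an illustration only (not the operational tool), and every number in it, including the circular hard-body approximation, is assumed.

```python
# Generic Monte Carlo sketch: propagate uncertainty in hard-body radius and
# covariance realism into a distribution of Pc values instead of a point estimate.
import numpy as np

rng = np.random.default_rng(0)
miss = np.array([120.0, 40.0])                  # nominal miss vector in the encounter plane, m (assumed)
cov_nominal = np.diag([80.0**2, 50.0**2])       # combined position covariance, m^2 (assumed)

def pc_estimate(hbr, cov, n=100_000):
    """Fraction of sampled relative positions falling inside the hard-body radius."""
    xy = rng.multivariate_normal(miss, cov, size=n)
    return np.mean(np.hypot(xy[:, 0], xy[:, 1]) < hbr)

pcs = []
for _ in range(200):                            # resample the uncertain inputs
    hbr = rng.normal(20.0, 3.0)                 # combined object size, m (assumed)
    scale = rng.lognormal(0.0, 0.3)             # covariance realism factor (assumed)
    pcs.append(pc_estimate(hbr, scale * cov_nominal))

print(np.percentile(pcs, [5, 50, 95]))          # spread of Pc rather than a single value
```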
Deposition, exhumation, and paleoclimate of an ancient lake deposit, Gale crater, Mars.
Grotzinger, J P; Gupta, S; Malin, M C; Rubin, D M; Schieber, J; Siebach, K; Sumner, D Y; Stack, K M; Vasavada, A R; Arvidson, R E; Calef, F; Edgar, L; Fischer, W F; Grant, J A; Griffes, J; Kah, L C; Lamb, M P; Lewis, K W; Mangold, N; Minitti, M E; Palucis, M; Rice, M; Williams, R M E; Yingst, R A; Blake, D; Blaney, D; Conrad, P; Crisp, J; Dietrich, W E; Dromart, G; Edgett, K S; Ewing, R C; Gellert, R; Hurowitz, J A; Kocurek, G; Mahaffy, P; McBride, M J; McLennan, S M; Mischna, M; Ming, D; Milliken, R; Newsom, H; Oehler, D; Parker, T J; Vaniman, D; Wiens, R C; Wilson, S A
2015-10-09
The landforms of northern Gale crater on Mars expose thick sequences of sedimentary rocks. Based on images obtained by the Curiosity rover, we interpret these outcrops as evidence for past fluvial, deltaic, and lacustrine environments. Degradation of the crater wall and rim probably supplied these sediments, which advanced inward from the wall, infilling both the crater and an internal lake basin to a thickness of at least 75 meters. This intracrater lake system probably existed intermittently for thousands to millions of years, implying a relatively wet climate that supplied moisture to the crater rim and transported sediment via streams into the lake basin. The deposits in Gale crater were then exhumed, probably by wind-driven erosion, creating Aeolis Mons (Mount Sharp). Copyright © 2015, American Association for the Advancement of Science.
Height probabilities in the Abelian sandpile model on the generalized finite Bethe lattice
NASA Astrophysics Data System (ADS)
Chen, Haiyan; Zhang, Fuji
2013-08-01
In this paper, we study the sandpile model on the generalized finite Bethe lattice with a particular boundary condition. Using a combinatorial method, we give the exact expressions for all single-site probabilities and some two-site joint probabilities. As a by-product, we prove that the height probabilities of bulk vertices are all the same for the Bethe lattice with a certain given boundary condition, which was found from numerical evidence by Grassberger and Manna ["Some more sandpiles," J. Phys. (France) 51, 1077-1098 (1990), doi:10.1051/jphys:0199000510110107700], but without a proof.
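For settings where exact combinatorial results such as these are unavailable, height probabilities are commonly estimated by simulating the sandpile dynamics directly; the sketch below does this on a small square grid with open boundaries, purely as a generic illustration rather than the Bethe-lattice geometry treated in the paper.

```python
# Illustrative Abelian sandpile simulation on an L x L grid with open boundaries
# (grains toppled off the edge are lost); estimates stationary height probabilities
# at the central site. The grid is a stand-in for, not a model of, the Bethe lattice.
import numpy as np

rng = np.random.default_rng(1)
L = 15
h = np.zeros((L, L), dtype=int)

def relax(h):
    while True:
        unstable = np.argwhere(h >= 4)
        if unstable.size == 0:
            return
        for i, j in unstable:
            h[i, j] -= 4
            for ni, nj in ((i + 1, j), (i - 1, j), (i, j + 1), (i, j - 1)):
                if 0 <= ni < L and 0 <= nj < L:
                    h[ni, nj] += 1

counts = np.zeros(4)
for step in range(60_000):
    i, j = rng.integers(L, size=2)
    h[i, j] += 1                     # drop a grain at a random site
    relax(h)
    if step > 10_000:                # discard burn-in
        counts[h[L // 2, L // 2]] += 1

print(counts / counts.sum())         # estimated P(height = 0, 1, 2, 3) at the centre
```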
MacDonald, P M; Kirkpatrick, S W; Sullivan, L A
1996-11-01
Schematic drawings of facial expressions were evaluated as a possible assessment tool for research on emotion recognition and interpretation involving young children. A subset of Ekman and Friesen's (1976) Pictures of Facial Affect was used as the standard for comparison. Preschool children (N = 138) were shown drawings and photographs in two context conditions for six emotions (anger, disgust, fear, happiness, sadness, and surprise). The overall correlation between accuracy for the photographs and drawings was .677. A significant difference was found for the stimulus condition (photographs vs. drawings) but not for the administration condition (label-based vs. context-based). Children were significantly more accurate in interpreting drawings than photographs and tended to be more accurate in identifying facial expressions in the label-based administration condition for both photographs and drawings than in the context-based administration condition.
Deviation from threshold model in ultrafast laser ablation of graphene at sub-micron scale
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gil-Villalba, A.; Xie, C.; Salut, R.
We investigate a method to measure the ultrafast laser ablation threshold with respect to spot size. We use structured complex beams to generate a pattern of craters in CVD graphene with a single laser pulse. A direct comparison between beam profile and SEM characterization allows us to determine the dependence of ablation probability on spot size, for crater diameters ranging between 700 nm and 2.5 μm. We report a drastic decrease of ablation probability when the crater diameter is below 1 μm, which we interpret in terms of free-carrier diffusion.
Quantifying risk: verbal probability expressions in Spanish and English.
Cohn, Lawrence D; Vázquez, Miguel E Cortés; Alvarez, Adolfo
2009-01-01
To investigate how Spanish- and English-speaking adults interpret verbal probability expressions presented in Spanish and English (e.g., posiblemente and possibly, respectively). Professional translators and university students from México and the United States read a series of likelihood statements in Spanish or English and then estimated the certainty implied by each statement. Several terms that are regarded as cognates in English and Spanish elicited significantly different likelihood ratings. Several language equivalencies were also identified. These findings provide the first reported evaluation of Spanish likelihood terms for use in risk communications directed towards monolingual and bilingual Spanish speakers.
Isotopic effects in the collinear reactive FHH system
NASA Technical Reports Server (NTRS)
Lepetit, B.; Launay, J. M.; Le Dourneuf, M.
1986-01-01
Exact quantum reaction probabilities for a collinear model of the F + HH, HD, DD and DH reactions on the MV potential energy surface have been computed using hyperspherical coordinates. The results, obtained up to a total energy of 1.8 eV, show three main features: (1) resonances, whose positions and widths are analyzed simply in the hyperspherical formalism; (2) a slowly varying background increasing for FHD, decreasing for FDH, and oscillating for FHH and FDD, whose variations are interpreted by classical dynamics; and (3) partial reaction probabilities revealing decreasing vibrational adiabaticity in the order FHH-FDD-FHD-FDH.
Braubach, Matthias; Tobollik, Myriam; Mudu, Pierpaolo; Hiscock, Rosemary; Chapizanis, Dimitris; Sarigiannis, Denis A.; Keuken, Menno; Perez, Laura; Martuzzi, Marco
2015-01-01
Well-being impact assessments of urban interventions are a difficult challenge, as there is no agreed methodology and scarce evidence on the relationship between environmental conditions and well-being. The European Union (EU) project “Urban Reduction of Greenhouse Gas Emissions in China and Europe” (URGENCHE) explored a methodological approach to assess traffic noise-related well-being impacts of transport interventions in three European cities (Basel, Rotterdam and Thessaloniki) linking modeled traffic noise reduction effects with survey data indicating noise-well-being associations. Local noise models showed a reduction of high traffic noise levels in all cities as a result of different urban interventions. Survey data indicated that perception of high noise levels was associated with lower probability of well-being. Connecting the local noise exposure profiles with the noise-well-being associations suggests that the urban transport interventions may have a marginal but positive effect on population well-being. This paper also provides insight into the methodological challenges of well-being assessments and highlights the range of limitations arising from the current lack of reliable evidence on environmental conditions and well-being. Due to these limitations, the results should be interpreted with caution. PMID:26016437
DOE Office of Scientific and Technical Information (OSTI.GOV)
Meyer, M.A.; Booker, J.M.
1990-01-01
Expert opinion is frequently used in probabilistic safety assessment (PSA), particularly in estimating low-probability events. In this paper, we discuss some of the common problems encountered in eliciting and analyzing expert opinion data and offer solutions or recommendations. The problems are: that experts are not naturally Bayesian (people fail to update their existing information to account for new information as it becomes available, as would be predicted by the Bayesian philosophy); that experts cannot be fully calibrated (to calibrate experts, the feedback from the known quantities must be immediate, frequent, and specific to the task); that experts are limited in the number of things that they can mentally juggle at a time to 7 ± 2; that data gatherers and analysts can introduce bias by unintentionally altering the expert's thinking or answers; that the level of detail of the data, or granularity, can affect the analyses; and that the conditioning effect poses difficulties in gathering and analyzing expert data. The data that the expert gives can be conditioned on a variety of factors that can affect the analysis and the interpretation of the results. 31 refs.
Joint Sparse Recovery With Semisupervised MUSIC
NASA Astrophysics Data System (ADS)
Wen, Zaidao; Hou, Biao; Jiao, Licheng
2017-05-01
Discrete multiple signal classification (MUSIC), with its low computational cost and mild condition requirements, has become a significant noniterative algorithm for joint sparse recovery (JSR). However, it fails in rank-defective problems caused by coherent or too few multiple measurement vectors (MMVs). In this letter, we provide a new perspective on this problem by interpreting JSR as a binary classification problem with respect to atoms. Meanwhile, MUSIC essentially constructs a supervised classifier based on the labeled MMVs, so that its performance will heavily depend on the quality and quantity of these training samples. From this viewpoint, we develop a semisupervised MUSIC (SS-MUSIC) in the spirit of machine learning, which declares that the insufficient supervised information in the training samples can be compensated from those unlabeled atoms. Instead of constructing a classifier in a fully supervised manner, we iteratively refine a semisupervised classifier by exploiting the labeled MMVs and some reliable unlabeled atoms simultaneously. In this way, the required conditions and iterations can be greatly relaxed and reduced. Numerical experimental results demonstrate that SS-MUSIC can achieve much better recovery performance than other MUSIC extended algorithms as well as some typical greedy algorithms for JSR in terms of iterations and recovery probability.
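For context, the fully supervised baseline that SS-MUSIC refines can be written in a few lines: atoms are scored by how strongly they project onto the signal subspace of the measurement matrix. The sketch below is our illustration of that baseline (with synthetic sizes), not the authors' semisupervised algorithm.

```python
# Minimal MMV-MUSIC sketch (supervised baseline, not SS-MUSIC): rank dictionary
# atoms by their projection onto the signal subspace of the measurements Y.
import numpy as np

def music_support(A, Y, k):
    """A: m x n dictionary, Y: m x l measurements, k: assumed row sparsity (needs rank(Y) >= k)."""
    U, _, _ = np.linalg.svd(Y, full_matrices=False)
    Us = U[:, :k]                                          # signal subspace
    A_unit = A / np.linalg.norm(A, axis=0, keepdims=True)
    scores = np.linalg.norm(Us.conj().T @ A_unit, axis=0)  # alignment with the subspace
    return np.sort(np.argsort(-scores)[:k])

# Toy usage with synthetic data (all sizes hypothetical):
rng = np.random.default_rng(0)
A = rng.standard_normal((32, 128))
X = np.zeros((128, 8)); X[[5, 40, 77], :] = rng.standard_normal((3, 8))
Y = A @ X + 0.01 * rng.standard_normal((32, 8))
print(music_support(A, Y, 3))                              # ideally [ 5 40 77]
```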
Conditional Probability Analysis: A Statistical Tool for Environmental Analysis.
The use and application of environmental conditional probability analysis (CPA) is relatively recent. The first presentation using CPA was made in 2002 at the New England Association of Environmental Biologists Annual Meeting in Newport, Rhode Island. CPA has been used since the...
Bayes factor and posterior probability: Complementary statistical evidence to p-value.
Lin, Ruitao; Yin, Guosheng
2015-09-01
As a convention, a p-value is often computed in hypothesis testing and compared with the nominal level of 0.05 to determine whether to reject the null hypothesis. Although the smaller the p-value, the more significant the statistical test, it is difficult to perceive the p-value in a probability scale and quantify it as the strength of the data against the null hypothesis. In contrast, the Bayesian posterior probability of the null hypothesis has an explicit interpretation of how strong the data support the null. We make a comparison of the p-value and the posterior probability by considering a recent clinical trial. The results show that even when we reject the null hypothesis, there is still a substantial probability (around 20%) that the null is true. Not only should we examine whether the data would have rarely occurred under the null hypothesis, but we also need to know whether the data would be rare under the alternative. As a result, the p-value only provides one side of the information, for which the Bayes factor and posterior probability may offer complementary evidence. Copyright © 2015 Elsevier Inc. All rights reserved.
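The link between a Bayes factor and the posterior probability of the null that the authors exploit is the standard odds identity; a short sketch with an illustrative Bayes factor (the trial's actual numbers are not reproduced here) is:

```python
# Posterior probability of H0 from a Bayes factor (standard identity):
#   posterior odds(H0:H1) = BF01 * prior odds(H0:H1)
def posterior_prob_null(bf01, prior_null=0.5):
    prior_odds = prior_null / (1.0 - prior_null)
    post_odds = bf01 * prior_odds
    return post_odds / (1.0 + post_odds)

# Even data that favour H1 four-fold (BF01 = 0.25, an illustrative value) leave a
# posterior probability of about 20% on the null under equal prior odds:
print(posterior_prob_null(0.25))   # 0.2
```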
Twenty-five years of change in southern African passerine diversity: nonclimatic factors of change.
Péron, Guillaume; Altwegg, Res
2015-09-01
We analysed more than 25 years of change in passerine bird distribution in South Africa, Swaziland and Lesotho, to show that species distributions can be influenced by processes that are at least in part independent of the local strength and direction of climate change: land use and ecological succession. We used occupancy models that separate species' detection from species' occupancy probability, fitted to citizen science data from both phases of the Southern African Bird Atlas Project (1987-1996 and 2007-2013). Temporal trends in species' occupancy probability were interpreted in terms of local extinction/colonization, and temporal trends in detection probability were interpreted in terms of change in abundance. We found for the first time at this scale that, as predicted in the context of bush encroachment, closed-savannah specialists increased where open-savannah specialists decreased. In addition, the trend in the abundance of species a priori thought to be favoured by agricultural conversion was negatively correlated with human population density, which is in line with hypotheses explaining the decline in farmland birds in the Northern Hemisphere. In addition to climate, vegetation cover and the intensity and time since agricultural conversion constitute important predictors of biodiversity changes in the region. Their inclusion will improve the reliability of predictive models of species distribution. © 2015 John Wiley & Sons Ltd.
Bidirectional Classical Stochastic Processes with Measurements and Feedback
NASA Technical Reports Server (NTRS)
Hahne, G. E.
2005-01-01
A measurement on a quantum system is said to cause the "collapse" of the quantum state vector or density matrix. An analogous collapse occurs with measurements on a classical stochastic process. This paper addresses the question of describing the response of a classical stochastic process when there is feedback from the output of a measurement to the input, and is intended to give a model for quantum-mechanical processes that occur along a space-like reaction coordinate. The classical system can be thought of in physical terms as two counterflowing probability streams, which stochastically exchange probability currents in a way that the net probability current, and hence the overall probability, suitably interpreted, is conserved. The proposed formalism extends the mathematics of those stochastic processes describable with linear, single-step, unidirectional transition probabilities, known as Markov chains and stochastic matrices. It is shown that a certain rearrangement and combination of the input and output of two stochastic matrices of the same order yields another matrix of the same type. Each measurement causes the partial collapse of the probability current distribution in the midst of such a process, giving rise to calculable, but non-Markov, values for the ensuing modification of the system's output probability distribution. The paper concludes with an analysis of a classical probabilistic version of the so-called grandfather paradox.
Pilling, Michael; Gellatly, Angus
2013-07-01
We investigated the influence of dimensional set on report of object feature information using an immediate memory probe task. Participants viewed displays containing up to 36 coloured geometric shapes which were presented for several hundred milliseconds before one item was abruptly occluded by a probe. A cue presented simultaneously with the probe instructed participants to report either about the colour or shape of the probe item. A dimensional set towards the colour or shape of the presented items was induced by manipulating task probability - the relative probability with which the two feature dimensions required report. This was done across two participant groups: One group was given trials where there was a higher report probability of colour, the other a higher report probability of shape. Two experiments showed that features were reported most accurately when they were of high task probability, though in both cases the effect was largely driven by the colour dimension. Importantly the task probability effect did not interact with display set size. This is interpreted as tentative evidence that this manipulation influences feature processing in a global manner and at a stage prior to visual short term memory. Copyright © 2013 Elsevier B.V. All rights reserved.
On defense strategies for system of systems using aggregated correlations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rao, Nageswara S.; Imam, Neena; Ma, Chris Y. T.
2017-04-01
We consider a System of Systems (SoS) wherein each system Si, i = 1, 2, ..., N, is composed of discrete cyber and physical components which can be attacked and reinforced. We characterize the disruptions using aggregate failure correlation functions given by the conditional failure probability of SoS given the failure of an individual system. We formulate the problem of ensuring the survival of SoS as a game between an attacker and a provider, each with a utility function composed of a survival probability term and a cost term, both expressed in terms of the number of components attacked and reinforced. The survival probabilities of systems satisfy simple product-form, first-order differential conditions, which simplify the Nash Equilibrium (NE) conditions. We derive the sensitivity functions that highlight the dependence of SoS survival probability at NE on cost terms, correlation functions, and individual system survival probabilities. We apply these results to a simplified model of distributed cloud computing infrastructure.
On the determinants of the conjunction fallacy: probability versus inductive confirmation.
Tentori, Katya; Crupi, Vincenzo; Russo, Selena
2013-02-01
Major recent interpretations of the conjunction fallacy postulate that people assess the probability of a conjunction according to (non-normative) averaging rules as applied to the constituents' probabilities or represent the conjunction fallacy as an effect of random error in the judgment process. In the present contribution, we contrast such accounts with a different reading of the phenomenon based on the notion of inductive confirmation as defined by contemporary Bayesian theorists. Averaging rule hypotheses along with the random error model and many other existing proposals are shown to all imply that conjunction fallacy rates would rise as the perceived probability of the added conjunct does. By contrast, our account predicts that the conjunction fallacy depends on the added conjunct being perceived as inductively confirmed. Four studies are reported in which the judged probability versus confirmation of the added conjunct have been systematically manipulated and dissociated. The results consistently favor a confirmation-theoretic account of the conjunction fallacy against competing views. Our proposal is also discussed in connection with related issues in the study of human inductive reasoning. 2013 APA, all rights reserved
An evaluation of lithographed forest stereograms.
David A. Bernstein
1961-01-01
Aerial photo stereograms are valuable for showing neophyte photo interpreters the stereoscopic appearance of common objects and conditions. They are also useful for instruction in measuring heights, horizontal distances, and angles on photos. Collections of stereograms of known conditions are worthwhile reference material for interpretation work in unknown areas.
Single, Complete, Probability Spaces Consistent With EPR-Bohm-Bell Experimental Data
NASA Astrophysics Data System (ADS)
Avis, David; Fischer, Paul; Hilbert, Astrid; Khrennikov, Andrei
2009-03-01
We show that paradoxical consequences of violations of Bell's inequality are induced by the use of an unsuitable probabilistic description for the EPR-Bohm-Bell experiment. The conventional description (due to Bell) is based on a combination of statistical data collected for different settings of polarization beam splitters (PBSs). In fact, such data consists of some conditional probabilities which only partially define a probability space. Ignoring this conditioning leads to apparent contradictions in the classical probabilistic model (due to Kolmogorov). We show how to make a completely consistent probabilistic model by taking into account the probabilities of selecting the settings of the PBSs. Our model matches both the experimental data and is consistent with classical probability theory.
Logic, probability, and human reasoning.
Johnson-Laird, P N; Khemlani, Sangeet S; Goodwin, Geoffrey P
2015-04-01
This review addresses the long-standing puzzle of how logic and probability fit together in human reasoning. Many cognitive scientists argue that conventional logic cannot underlie deductions, because it never requires valid conclusions to be withdrawn - not even if they are false; it treats conditional assertions implausibly; and it yields many vapid, although valid, conclusions. A new paradigm of probability logic allows conclusions to be withdrawn and treats conditionals more plausibly, although it does not address the problem of vapidity. The theory of mental models solves all of these problems. It explains how people reason about probabilities and postulates that the machinery for reasoning is itself probabilistic. Recent investigations accordingly suggest a way to integrate probability and deduction. Copyright © 2015 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Eriksson, Patrick G.; Reczko, Boris F. F.
1998-09-01
Five genetic facies associations/architectural elements are recognised for the epeiric sea deposits preserved in the Early Proterozoic Timeball Hill Formation, South Africa. Basal carbonaceous mudrocks, interpreted as anoxic suspension deposits, grade up into sheet-like, laminated, graded mudrocks and succeeding sheets of laminated and cross-laminated siltstones and fine-grained sandstones. The latter two architectural elements are compatible with the Te, Td and Tc subdivisions of low-density turbidity current systems. Thin interbeds of stromatolitic carbonate within these first three facies associations support photic water depths up to about 100 m. Laterally extensive sheets of mature, cross-bedded sandstone disconformably overlie the turbidite deposits, and are ascribed to lower tidal flat processes. Interbedded lenticular, immature sandstones and mudrocks comprise the fifth architectural element, and are interpreted as medial to upper tidal flat sediments. Small lenses of coarse siltstone-very fine-grained sandstone, analogous to modern continental rise contourite deposits, occur within the suspension and distal turbidite sediments, and also form local wedges of inferred contourites at the transition from suspension to lowermost turbidite deposits. Blanketing and progressive shallowing of the floor of the Timeball Hill basin by basal suspension deposits greatly reduced wave action, thereby promoting preservation of low-density turbidity current deposits across the basin under stillstand or highstand conditions. A lowstand tidal flat facies tract laid down widespread sandy deposits of the medial Klapperkop Member within the formation. Salinity gradients and contemporaneous cold periglacial water masses were probably responsible for formation of the inferred contourites. The combination of the depositional systems interpreted for the Timeball Hill Formation may provide a provisional model for Early Proterozoic epeiric basin settings.
NASA Technical Reports Server (NTRS)
Gracey, William; Jewel, Joseph W., Jr.; Carpenter, Gene T.
1960-01-01
The overall errors of the service altimeter installations of a variety of civil transport, military, and general-aviation airplanes have been experimentally determined during normal landing-approach and take-off operations. The average height above the runway at which the data were obtained was about 280 feet for the landings and about 440 feet for the take-offs. An analysis of the data obtained from 196 airplanes during 415 landing approaches and from 70 airplanes during 152 take-offs showed that: 1. The overall error of the altimeter installations in the landing- approach condition had a probable value (50 percent probability) of +/- 36 feet and a maximum probable value (99.7 percent probability) of +/- 159 feet with a bias of +10 feet. 2. The overall error in the take-off condition had a probable value of +/- 47 feet and a maximum probable value of +/- 207 feet with a bias of -33 feet. 3. The overall errors of the military airplanes were generally larger than those of the civil transports in both the landing-approach and take-off conditions. In the landing-approach condition the probable error and the maximum probable error of the military airplanes were +/- 43 and +/- 189 feet, respectively, with a bias of +15 feet, whereas those for the civil transports were +/- 22 and +/- 96 feet, respectively, with a bias of +1 foot. 4. The bias values of the error distributions (+10 feet for the landings and -33 feet for the take-offs) appear to represent a measure of the hysteresis characteristics (after effect and recovery) and friction of the instrument and the pressure lag of the tubing-instrument system.
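The two quoted error levels are mutually consistent if the installation errors are treated as roughly normally distributed; under that assumption (ours, not a statement from the report), the probable error is 0.6745 σ and the 99.7 percent bound is 3 σ, as the sketch below checks.

```python
# Consistency check under an assumed normal error model (our assumption):
# probable error (50%) = 0.6745 * sigma, "maximum probable" (99.7%) = 3 * sigma.
for probable_error in (36.0, 47.0):              # feet: landing approach, take-off
    sigma = probable_error / 0.6745
    print(round(sigma, 1), round(3 * sigma, 1))  # ~160 ft and ~209 ft vs the quoted 159 and 207 ft
```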
DOT National Transportation Integrated Search
2009-10-13
This paper describes a probabilistic approach to estimate the conditional probability of release of hazardous materials from railroad tank cars during train accidents. Monte Carlo methods are used in developing a probabilistic model to simulate head ...
The Dependence Structure of Conditional Probabilities in a Contingency Table
ERIC Educational Resources Information Center
Joarder, Anwar H.; Al-Sabah, Walid S.
2002-01-01
Conditional probability and statistical independence can be better explained with contingency tables. In this note some special cases of 2 x 2 contingency tables are considered. In turn an interesting insight into statistical dependence as well as independence of events is obtained.
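A tiny worked example of the idea, with invented counts, is given below: the events are independent exactly when each conditional probability equals the corresponding marginal.

```python
# 2 x 2 contingency table with hypothetical counts: compare the conditional
# probabilities P(B|A) and P(B|not-A) with the marginal P(B).
import numpy as np

table = np.array([[30, 10],    # row 0: A      (columns: B, not-B)
                  [45, 15]])   # row 1: not-A
n = table.sum()
p_b_given_a     = table[0, 0] / table[0].sum()   # P(B | A)     = 0.75
p_b_given_not_a = table[1, 0] / table[1].sum()   # P(B | not-A) = 0.75
p_b             = table[:, 0].sum() / n          # P(B)         = 0.75
print(p_b_given_a, p_b_given_not_a, p_b)         # all equal -> A and B are independent
```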
NASA Astrophysics Data System (ADS)
Kaur, Prabhmandeep; Jain, Virander Kumar; Kar, Subrat
2014-12-01
In this paper, we investigate the performance of a Free Space Optical (FSO) link considering the impairments caused by the presence of various weather conditions such as very clear air, drizzle, haze, fog, etc., and turbulence in the atmosphere. An analytic expression for the outage probability is derived using the gamma-gamma distribution for turbulence and accounting for the effect of weather conditions using the Beer-Lambert law. The effect of receiver diversity schemes using aperture averaging and array receivers on the outage probability is studied and compared. As the aperture diameter is increased, the outage probability decreases irrespective of the turbulence strength (weak, moderate and strong) and weather conditions. Similar effects are observed when the number of direct detection receivers in the array is increased. However, it is seen that as the desired level of performance in terms of the outage probability decreases, the array receiver becomes the preferred choice as compared to the receiver with aperture averaging.
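A crude Monte Carlo counterpart of the outage calculation described above (a single aperture without diversity, all link parameters invented) samples the gamma-gamma irradiance as the product of two unit-mean gamma variates and applies a fixed Beer-Lambert weather loss.

```python
# Hedged sketch: outage probability of an FSO link under gamma-gamma turbulence
# with a deterministic weather attenuation (Beer-Lambert). All parameter values
# are illustrative; the paper's closed-form derivation is not reproduced here.
import numpy as np

rng = np.random.default_rng(0)
alpha, beta = 4.0, 2.0            # gamma-gamma turbulence parameters (assumed)
sigma_per_km, L_km = 0.43, 1.0    # attenuation coefficient (haze) and link length (assumed)
h_weather = np.exp(-sigma_per_km * L_km)      # Beer-Lambert loss
threshold = 0.2                   # normalised channel gain needed to close the link (assumed)

n = 1_000_000
irradiance = rng.gamma(alpha, 1.0 / alpha, n) * rng.gamma(beta, 1.0 / beta, n)  # unit-mean gamma-gamma
print(np.mean(h_weather * irradiance < threshold))   # Monte Carlo outage probability
```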
Ben-Shlomo, Yoav; Collin, Simon M; Quekett, James; Sterne, Jonathan A C; Whiting, Penny
2015-01-01
There is little evidence on how best to present diagnostic information to doctors and whether this makes any difference to clinical management. We undertook a randomised controlled trial to see if different data presentations altered clinicians' decision to further investigate or treat a patient with a fictitious disorder ("Green syndrome") and their ability to determine post-test probability. We recruited doctors registered with the United Kingdom's largest online network for medical doctors between 10 July and 6 November 2012. Participants were randomised to one of four arms: (a) text summary of sensitivity and specificity, (b) Fagan's nomogram, (c) probability-modifying plot (PMP), (d) natural frequency tree (NFT). The main outcome measures were the decision whether to treat, not treat or undertake a brain biopsy on the hypothetical patient, and the correct post-test probability. Secondary outcome measures included knowledge of diagnostic tests. 917 participants attempted the survey and complete data were available for 874 (95.3%). Doctors randomised to the PMP and NFT arms were more likely to treat the patient than those randomised to the text-only arm (ORs 1.49, 95% CI 1.02 to 2.16, and 1.43, 95% CI 0.98 to 2.08, respectively). More participants randomised to the PMP (87/218; 39.9%) and NFT (73/207; 35.3%) arms than the nomogram (50/194; 25.8%) or text-only (30/255; 11.8%) arms reported the correct post-test probability (p < 0.001). Younger age, postgraduate training and higher self-rated confidence all predicted better knowledge performance. Doctors with better knowledge were more likely to view an optional learning tutorial (OR per correct answer 1.18, 95% CI 1.06 to 1.31). Presenting diagnostic data using a probability-modifying plot or natural frequency tree influences the threshold for treatment and improves interpretation of test results compared to a text summary of sensitivity and specificity or Fagan's nomogram.
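The quantity participants were asked to report is a standard Bayes / likelihood-ratio calculation, the same one encoded graphically by Fagan's nomogram. A sketch with hypothetical numbers, since the vignette's sensitivity, specificity and pre-test probability are not given in the abstract:

```python
def post_test_probability(pre_test, sensitivity, specificity, positive=True):
    """Post-test probability via Bayes' theorem in odds/likelihood-ratio form."""
    lr = sensitivity / (1 - specificity) if positive else (1 - sensitivity) / specificity
    pre_odds = pre_test / (1 - pre_test)
    post_odds = pre_odds * lr
    return post_odds / (1 + post_odds)

# Hypothetical numbers only -- the vignette's test characteristics are not in the abstract.
print(post_test_probability(pre_test=0.30, sensitivity=0.90, specificity=0.80))
# -> 0.66, the kind of value participants were asked to report after a positive test
```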
NASA Astrophysics Data System (ADS)
Jablonski, Bryce V. J.; Dalrymple, Robert W.
2016-04-01
Inclined heterolithic stratification in the Lower Cretaceous McMurray Formation, exposed along the Steepbank River in north-eastern Alberta, Canada, accumulated on point bars of a 30 to 40 m deep continental-scale river in the fluvial-marine transition. This inclined heterolithic stratification consists of two alternating lithologies: sand beds and fine-grained beds. Sand beds were deposited rapidly by unidirectional currents and contain little or no bioturbation. Fine-grained beds contain rare tidal structures and are intensely bioturbated by low-diversity ichnofossil assemblages. The alternations between the sand and fine-grained beds are probably caused by strong variations in fluvial discharge that are believed to be seasonal (probably annual) in duration. The sand beds accumulated during river floods, under fluvially dominated conditions when the water was fresh, whereas the fine-grained beds accumulated during the late stages of the river flood, with deposition continuing under tidally influenced brackish-water conditions during times of low river flow (i.e. the interflood periods). These changes reflect the annual migration of the positions of the tidal and salinity limits within the fluvial-marine transition that results from changes in river discharge. Sand and fine-grained beds are cyclically organized in the studied outcrops, forming metre-scale cycles. A single metre-scale cycle is defined by a sharp base, an upward decrease in sand-bed thickness and upward increases in the preservation of fine-grained beds and the intensity of bioturbation. Metre-scale cycles are interpreted to be the product of a longer-term (decadal) cyclicity in fluvial discharge, probably caused by fluctuations in ocean or solar dynamics. The volumetric dominance of river-flood deposits within the succession suggests that accumulation occurred in a relatively landward position within the fluvial-marine transition. This study shows that careful observation can reveal much about the interplay of processes within the fluvial-marine transition, which in turn provides a powerful tool for determining the palaeo-environmental location of a deposit within the fluvial-marine transition.
Bayesian network interface for assisting radiology interpretation and education
NASA Astrophysics Data System (ADS)
Duda, Jeffrey; Botzolakis, Emmanuel; Chen, Po-Hao; Mohan, Suyash; Nasrallah, Ilya; Rauschecker, Andreas; Rudie, Jeffrey; Bryan, R. Nick; Gee, James; Cook, Tessa
2018-03-01
In this work, we present the use of Bayesian networks for radiologist decision support during clinical interpretation. This computational approach has the advantage of avoiding incorrect diagnoses that result from known human cognitive biases such as anchoring bias, framing effect, availability bias, and premature closure. To integrate Bayesian networks into clinical practice, we developed an open-source web application that provides diagnostic support for a variety of radiology disease entities (e.g., basal ganglia diseases, bone lesions). The Clinical tool presents the user with a set of buttons representing clinical and imaging features of interest. These buttons are used to set the value for each observed feature. As features are identified, the conditional probabilities for each possible diagnosis are updated in real time. Additionally, using sensitivity analysis, the interface may be set to inform the user which remaining imaging features provide maximum discriminatory information to choose the most likely diagnosis. The Case Submission tools allow the user to submit a validated case and the associated imaging features to a database, which can then be used for future tuning/testing of the Bayesian networks. These submitted cases are then reviewed by an assigned expert using the provided QC tool. The Research tool presents users with cases with previously labeled features and a chosen diagnosis, for the purpose of performance evaluation. Similarly, the Education page presents cases with known features, but provides real time feedback on feature selection.
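The real-time updating described above can be illustrated with a deliberately simplified (naive-Bayes) stand-in for the full Bayesian networks used in the application; the diseases, imaging features and conditional probabilities below are invented for illustration only:

```python
# Minimal sketch of updating diagnosis probabilities as imaging features are set.
# A naive-Bayes structure stands in for the richer Bayesian networks used in the
# application; all diagnoses, features and probabilities here are invented.
priors = {"disease_A": 0.5, "disease_B": 0.3, "disease_C": 0.2}
p_feature_given_dx = {
    "calcification":   {"disease_A": 0.8, "disease_B": 0.2, "disease_C": 0.3},
    "T2_hyperintense": {"disease_A": 0.4, "disease_B": 0.9, "disease_C": 0.5},
}

def update(posterior, feature, present=True):
    """Bayes update of the diagnosis distribution after observing one feature."""
    new = {}
    for dx, p in posterior.items():
        like = p_feature_given_dx[feature][dx]
        new[dx] = p * (like if present else 1.0 - like)
    z = sum(new.values())
    return {dx: v / z for dx, v in new.items()}

posterior = update(priors, "calcification", present=True)
posterior = update(posterior, "T2_hyperintense", present=False)
print(posterior)
```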
Logical reasoning versus information processing in the dual-strategy model of reasoning.
Markovits, Henry; Brisson, Janie; de Chantal, Pier-Luc
2017-01-01
One of the major debates concerning the nature of inferential reasoning is between counterexample-based strategies such as mental model theory and statistical strategies underlying probabilistic models. The dual-strategy model, proposed by Verschueren, Schaeken, & d'Ydewalle (2005a, 2005b), which suggests that people might have access to both kinds of strategy, has been supported by several recent studies. These have shown that statistical reasoners make inferences by using information about the premises to generate a likelihood estimate of conclusion probability. However, while results concerning counterexample reasoners are consistent with a counterexample detection model, these results could equally be interpreted as indicating a greater sensitivity to logical form. In order to distinguish these two interpretations, in Studies 1 and 2 we presented reasoners with modus ponens (MP) inferences with statistical information about premise strength, and in Studies 3 and 4, naturalistic MP inferences with premises having many disabling conditions. Statistical reasoners accepted the MP inference more often than counterexample reasoners in Studies 1 and 2, while the opposite pattern was observed in Studies 3 and 4. The results show that these strategies must be defined in terms of information processing, with no clear relation to "logical" reasoning. These results have additional implications for the underlying debate about the nature of human reasoning. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
Maceral distributions in Illinois coals and their paleoenvironmental implications
Harvey, R.D.; Dillon, J.W.
1985-01-01
For purposes of assessing the maceral distribution of Illinois (U.S.A.) coals, analyses were assembled for 326 face channel and drill core samples from 24 coal members of the Pennsylvanian System. The inertinite content of coals from the Missourian and Virgilian Series averages 16.1% (mineral free), compared to 9.4% for older coals from the Desmoinesian and older Series. This indicates there was generally a higher state of oxidation in the peat that formed the younger coals. This state probably resulted from greater exposure of these peats to weathering as the climate became drier and the water table lower than was the case for the older coals, although oxidation during allochthonous deposition of inertinite components is a genetic factor that needs further study to confirm the importance of the climate. Regional variation of the vitrinite-inertinite ratio (V-I), on a mineral- and micrinite-free basis, was observed in the Springfield (No. 5) and Herrin (No. 6) Coal Members to be related to the geographical position of paleochannel (river) deposits known to have been contemporaneous with the peats that formed these two coal strata. The V-I ratio is highest (generally 12-27) in samples from areas adjacent to the channels, and lower (5-11) some 10-20 km away. We interpret the V-I ratio to be an inverse index of the degree of oxidation to which the original peat was exposed. High V-I ratio coal located near the channels probably formed under more anoxic conditions than did the lower V-I ratio coal some distance away from the channels. The low V-I ratio coal probably formed in areas of the peat swamp where the water table was generally lower than in the channel areas. © 1986.
Clinical Assessment of a Nocardia PCR-Based Assay for Diagnosis of Nocardiosis.
Rouzaud, Claire; Rodriguez-Nava, Véronica; Catherinot, Emilie; Méchaï, Frédéric; Bergeron, Emmanuelle; Farfour, Eric; Scemla, Anne; Poirée, Sylvain; Delavaud, Christophe; Mathieu, Daniel; Durupt, Stéphane; Larosa, Fabrice; Lengelé, Jean-Philippe; Christophe, Jean-Louis; Suarez, Felipe; Lortholary, Olivier; Lebeaux, David
2018-06-01
The diagnosis of nocardiosis, a severe opportunistic infection, is challenging. We assessed the specificity and sensitivity of a 16S rRNA Nocardia PCR-based assay performed on clinical samples. In this multicenter study (January 2014 to April 2015), patients who were admitted to three hospitals and had an underlying condition favoring nocardiosis, clinical and radiological signs consistent with nocardiosis, and a Nocardia PCR assay result for a clinical sample were included. Patients were classified as negative control (NC) (negative Nocardia culture results and proven alternative diagnosis or improvement at 6 months without anti-Nocardia treatment), positive control (PC) (positive Nocardia culture results), or probable nocardiosis (positive Nocardia PCR results, negative Nocardia culture results, and no alternative diagnosis). Sixty-eight patients were included; 47 were classified as NC, 8 as PC, and 13 as probable nocardiosis. PCR results were negative for 35/47 NC patients (74%). For the 12 NC patients with positive PCR results, the PCR assay had been performed with respiratory samples. These NC patients had chronic bronchopulmonary disease more frequently than did the NC patients with negative PCR results (8/12 patients [67%] versus 11/35 patients [31%]; P = 0.044). PCR results were positive for 7/8 PC patients (88%). There were 13 cases of probable nocardiosis, diagnosed solely using the PCR results; 9 of those patients (69%) had lung involvement (consolidation or nodule). Nocardia PCR testing had a specificity of 74% and a sensitivity of 88% for the diagnosis of nocardiosis. Nocardia PCR testing may be helpful for the diagnosis of nocardiosis in immunocompromised patients but interpretation of PCR results from respiratory samples is difficult, because the PCR assay may also detect colonization. Copyright © 2018 American Society for Microbiology.
Reichert, Brian E.; Martin, J.; Kendall, William L.; Cattau, Christopher E.; Kitchens, Wiley M.
2010-01-01
Individuals in wild populations face risks associated with both intrinsic (i.e. aging) and external (i.e. environmental) sources of mortality. Condition-dependent mortality occurs when there is an interaction between such factors; however, few studies have clearly demonstrated condition-dependent mortality and some have even argued that condition-dependent mortality does not occur in wild avian populations. Using large sample sizes (2084 individuals, 3746 re-sights) of individual-based longitudinal data collected over a 33 year period (1976-2008) on multiple cohorts, we used a capture-mark-recapture framework to model age-dependent survival in the snail kite Rostrhamus sociabilis plumbeus population in Florida. Adding to the growing amount of evidence for actuarial senescence in wild populations, we found evidence of senescent declines in survival probabilities in adult kites. We also tested the hypothesis that older kites experienced condition-dependent mortality during a range-wide drought event (2000-2002). The results provide convincing evidence that the annual survival probability of senescent kites was disproportionately affected by the drought relative to the survival probability of prime-aged adults. To our knowledge, this is the first evidence of condition-dependent mortality to be demonstrated in a wild avian population, a finding which challenges recent conclusions drawn in the literature. Our study suggests that senescence and condition-dependent mortality can affect the demography of wild avian populations. Accounting for these sources of variation may be particularly important to appropriately compute estimates of population growth rate, and probabilities of quasi-extinctions.
Galvao-Carmona, Alejandro; González-Rosa, Javier J.; Hidalgo-Muñoz, Antonio R.; Páramo, Dolores; Benítez, María L.; Izquierdo, Guillermo; Vázquez-Marrufo, Manuel
2014-01-01
Background: The study of the attentional system remains a challenge for current neuroscience. The “Attention Network Test” (ANT) was designed to study simultaneously three different attentional networks (alerting, orienting, and executive) based in subtraction of different experimental conditions. However, some studies recommend caution with these calculations due to the interactions between the attentional networks. In particular, it is highly relevant that several interpretations about attentional impairment have arisen from these calculations in diverse pathologies. Event related potentials (ERPs) and neural source analysis can be applied to disentangle the relationships between these attentional networks not specifically shown by behavioral measures. Results: This study shows that there is a basic level of alerting (tonic alerting) in the no cue (NC) condition, represented by a slow negative trend in the ERP trace prior to the onset of the target stimuli. A progressive increase in the CNV amplitude related to the amount of information provided by the cue conditions is also shown. Neural source analysis reveals specific modulations of the CNV related to a task-related expectancy presented in the NC condition; a late modulation triggered by the central cue (CC) condition and probably representing a generic motor preparation; and an early and late modulation for spatial cue (SC) condition suggesting specific motor and sensory preactivation. Finally, the first component in the information processing of the target stimuli modulated by the interaction between orienting network and the executive system can be represented by N1. Conclusions: The ANT is useful as a paradigm to study specific attentional mechanisms and their interactions. However, calculation of network effects is based in subtractions with non-comparable experimental conditions, as evidenced by the present data, which can induce misinterpretations in the study of the attentional capacity in human subjects. PMID:25352800
BIODEGRADATION PROBABILITY PROGRAM (BIODEG)
The Biodegradation Probability Program (BIODEG) calculates the probability that a chemical under aerobic conditions with mixed cultures of microorganisms will biodegrade rapidly or slowly. It uses fragment constants developed using multiple linear and non-linear regressions and d...
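The description is truncated, but the stated idea, regression-derived fragment constants feeding a probability of rapid aerobic biodegradation, can be sketched as a logistic model. The fragments and coefficients below are placeholders, not the actual BIODEG constants:

```python
import math

# Illustrative fragment coefficients (placeholders, not the actual BIODEG constants).
coefficients = {"intercept": 0.75, "ester": 0.60, "aromatic_ring": -0.35,
                "chlorine": -0.90, "linear_C4_chain": 0.50}

def p_rapid_biodegradation(fragment_counts):
    """Logistic model: probability of rapid aerobic biodegradation from fragment counts."""
    z = coefficients["intercept"]
    for fragment, count in fragment_counts.items():
        z += coefficients.get(fragment, 0.0) * count
    return 1.0 / (1.0 + math.exp(-z))

# A hypothetical molecule with one ester group and one aromatic ring.
print(round(p_rapid_biodegradation({"ester": 1, "aromatic_ring": 1}), 3))
```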
Weather-centric rangeland revegetation planning
Hardegree, Stuart P.; Abatzoglou, John T.; Brunson, Mark W.; Germino, Matthew; Hegewisch, Katherine C.; Moffet, Corey A.; Pilliod, David S.; Roundy, Bruce A.; Boehm, Alex R.; Meredith, Gwendwr R.
2018-01-01
Invasive annual weeds negatively impact ecosystem services and pose a major conservation threat on semiarid rangelands throughout the western United States. Rehabilitation of these rangelands is challenging due to interannual climate and subseasonal weather variability that impacts seed germination, seedling survival and establishment, annual weed dynamics, wildfire frequency, and soil stability. Rehabilitation and restoration outcomes could be improved by adopting a weather-centric approach that uses the full spectrum of available site-specific weather information from historical observations, seasonal climate forecasts, and climate-change projections. Climate data can be used retrospectively to interpret success or failure of past seedings by describing seasonal and longer-term patterns of environmental variability subsequent to planting. A more detailed evaluation of weather impacts on site conditions may yield more flexible adaptive-management strategies for rangeland restoration and rehabilitation, as well as provide estimates of transition probabilities between desirable and undesirable vegetation states. Skillful seasonal climate forecasts could greatly improve the cost efficiency of management treatments by limiting revegetation activities to time periods where forecasts suggest higher probabilities of successful seedling establishment. Climate-change projections are key to the application of current environmental models for development of mitigation and adaptation strategies and for management practices that require a multidecadal planning horizon. Adoption of new weather technology will require collaboration between land managers and revegetation specialists and modifications to the way we currently plan and conduct rangeland rehabilitation and restoration in the Intermountain West.
Rule-based programming paradigm: a formal basis for biological, chemical and physical computation.
Krishnamurthy, V; Krishnamurthy, E V
1999-03-01
A rule-based programming paradigm is described as a formal basis for biological, chemical and physical computations. In this paradigm, the computations are interpreted as the outcome arising out of the interaction of elements in an object space. The interactions can create new elements (or the same elements with modified attributes) or annihilate old elements according to specific rules. Since the interaction rules are inherently parallel, any number of actions can be performed cooperatively or competitively among the subsets of elements, so that the elements evolve toward an equilibrium, unstable or chaotic state. Such an evolution may retain certain invariant properties of the attributes of the elements. The object space resembles a Gibbsian ensemble that corresponds to a distribution of points in the space of positions and momenta (called phase space). It permits the introduction of probabilities in rule applications. As each element of the ensemble changes over time, its phase point is carried into a new phase point. The evolution of this probability cloud in phase space corresponds to a distributed probabilistic computation. Thus, this paradigm can handle deterministic exact computation when the initial conditions are exactly specified and the trajectory of evolution is deterministic. It can also handle a probabilistic mode of computation if we want to derive macroscopic or bulk properties of matter. We also explain how to support this rule-based paradigm using relational-database-like query processing and transactions.
Carbon isotopes in mollusk shell carbonates
NASA Astrophysics Data System (ADS)
McConnaughey, Ted A.; Gillikin, David Paul
2008-10-01
Mollusk shells contain many isotopic clues about calcification physiology and environmental conditions at the time of shell formation. In this review, we use both published and unpublished data to discuss carbon isotopes in both bivalve and gastropod shell carbonates. Land snails construct their shells mainly from respired CO2, and shell δ13C reflects the local mix of C3 and C4 plants consumed. Shell δ13C is typically >10‰ heavier than diet, probably because respiratory gas exchange discards CO2, and retains the isotopically heavier HCO3 -. Respired CO2 contributes less to the shells of aquatic mollusks, because CO2/O2 ratios are usually higher in water than in air, leading to more replacement of respired CO2 by environmental CO2. Fluid exchange with the environment also brings additional dissolved inorganic carbon (DIC) into the calcification site. Shell δ13C is typically a few ‰ lower than ambient DIC, and often decreases with age. Shell δ13C retains clues about processes such as ecosystem metabolism and estuarine mixing. Ca2+ ATPase-based models of calcification physiology developed for corals and algae likely apply to mollusks, too, but lower pH and carbonic anhydrase at the calcification site probably suppress kinetic isotope effects. Carbon isotopes in biogenic carbonates are clearly complex, but cautious interpretation can provide a wealth of information, especially after vital effects are better understood.
Three-dimensional structure of the submarine flanks of La Réunion inferred from geophysical data
NASA Astrophysics Data System (ADS)
Gailler, Lydie-Sarah; Lénat, Jean-François
2010-12-01
La Réunion (Indian Ocean) constitutes a huge volcanic oceanic system of which most of the volume is submerged. We present a study of its submarine part based on the interpretation of magnetic and gravity data compiled from old and recent surveys. A model of the submarine internal structure is derived from 3-D and 2-D models using constraints from previous geological and geophysical studies. Two large-scale, previously unknown, buried volcanic construction zones are discovered in continuation of the island's construction. To the east, the Alizés submarine zone is interpreted as the remnants of the eastward flank of Les Alizés volcano, whose center is marked by a large hypovolcanic intrusion complex. To the southwest, the Etang Salé submarine zone is interpreted as an extension of Piton des Neiges, probably fed by a volcanic rift zone over a large extent. They were predominantly built during the Matuyama period and thus probably belong to early volcanism. A correlation exists between their tops and seismic horizons recognized in previous studies and interpreted as the base of the volcanic edifice. The morphology of these horizons had previously been taken to suggest lithospheric bulging beneath La Réunion, but such bulging is not required to explain our data, since the seismic interfaces match the top of our volcanic constructions. The coastal shelf coincides with a negative Bouguer anomaly belt, often associated with magnetic anomalies, suggesting a shelf built by hyaloclastites. A detailed analysis of the offshore continuation of La Montagne Massif to the north confirms this hypothesis. The gravity analysis confirms that the bathymetric bulges, forming the northern, eastern, southern, and western submarine flanks, are predominantly built by debris avalanche deposits at the surface.
Figure-ground organization in different phases of the perceptual alternation phenomenon.
Tuccio, M T
1995-12-01
Two experiments on figure-ground organization were designed to examine whether the regions of an ambiguous stimulus perceived as "figure" vary as a function of regional area and experience with the stimulus. In Exp. 1 the perceived duration of each interpretation was recorded during continuous viewing for 10 subjects who had been trained until both percepts appeared with statistical regularity (stationary phase). In Exp. 2 the first interpretation reported by 172 naive observers after a few seconds of pattern exposure was recorded. The well-known tendency to interpret smaller regions as figure was noted in Exp. 2, whereas the results of Exp. 1 suggested equal probability of the percepts. Overall, the results suggest that alternation is learned during the transient or "early" phase of perception, with some stimulus features and cultural factors influencing the figure-ground organization. During the stationary or late phase of perception the subject is well practiced and the alternating of interpretations becomes largely automatic.
Rotational Dynamics of Proteins from Spin Relaxation Times and Molecular Dynamics Simulations.
Ollila, O H Samuli; Heikkinen, Harri A; Iwaï, Hideo
2018-06-14
Conformational fluctuations and rotational tumbling of proteins can be experimentally accessed with nuclear spin relaxation experiments. However, interpretation of molecular dynamics from the experimental data is often complicated, especially for molecules with anisotropic shape. Here, we apply classical molecular dynamics simulations to interpret the conformational fluctuations and rotational tumbling of proteins with arbitrarily anisotropic shape. The direct calculation of spin relaxation times from simulation data did not reproduce the experimental data. This was successfully corrected by scaling the overall rotational diffusion coefficients around the protein inertia axes with a constant factor. The achieved good agreement with experiments allowed the interpretation of the internal and overall dynamics of proteins with significantly anisotropic shape. The overall rotational diffusion was found to be Brownian, having only a short subdiffusive region below 0.12 ns. The presented methodology can be applied to interpret rotational dynamics and conformation fluctuations of proteins with arbitrary anisotropic shape. However, a water model with more realistic dynamical properties is probably required for intrinsically disordered proteins.
Suspected pulmonary embolism and lung scan interpretation: Trial of a Bayesian reporting method
DOE Office of Scientific and Technical Information (OSTI.GOV)
Becker, D.M.; Philbrick, J.T.; Schoonover, F.W.
The objective of this research is to determine whether a Bayesian method of lung scan (LS) reporting could influence the management of patients with suspected pulmonary embolism (PE). The study is performed by the following: (1) A descriptive study of the diagnostic process for suspected PE using the new reporting method; (2) a non-experimental evaluation of the reporting method comparing prospective patients and historical controls; and (3) a survey of physicians' reactions to the reporting innovation. Of 148 consecutive patients enrolled at the time of LS, 129 were completely evaluated; 75 patients scanned the previous year served as controls. The LS results of patients with suspected PE were reported as posttest probabilities of PE calculated from physician-provided pretest probabilities and the likelihood ratios for PE of LS interpretations. Despite the Bayesian intervention, the confirmation or exclusion of PE was often based on inconclusive evidence. PE was considered by the clinician to be ruled out in 98% of patients with posttest probabilities less than 25% and ruled in for 95% of patients with posttest probabilities greater than 75%. Prospective patients and historical controls were similar in terms of tests ordered after the LS (e.g., pulmonary angiography). Patients with intermediate or indeterminate lung scan results had the highest proportion of subsequent testing. Most physicians (80%) found the reporting innovation to be helpful, either because it confirmed clinical judgement (94 cases) or because it led to additional testing (7 cases). Despite the probabilistic guidance provided by the study, the diagnosis of PE was often neither clearly established nor excluded. While physicians appreciated the innovation and were not confused by the terminology, their clinical decision making was not clearly enhanced.
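A sketch of the reporting arithmetic described above: the physician's pretest probability is converted to odds, multiplied by a likelihood ratio attached to the lung-scan reading, and converted back to a probability. The likelihood ratios are illustrative placeholders, not values from the study:

```python
# Likelihood ratios attached to scan readings are illustrative placeholders,
# not the values used in the study.
SCAN_LR = {"high probability": 18.0, "intermediate": 1.2, "indeterminate": 1.0,
           "low probability": 0.36, "normal": 0.1}

def posttest_probability(pretest, scan_reading):
    odds = pretest / (1.0 - pretest)          # physician-provided pretest probability
    odds *= SCAN_LR[scan_reading]             # likelihood ratio of the LS interpretation
    return odds / (1.0 + odds)

p = posttest_probability(pretest=0.40, scan_reading="low probability")
print(f"post-test probability of PE: {p:.2f}")   # < 25% -> typically 'ruled out'
```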
A Model for Managing Anger and Conflict.
ERIC Educational Resources Information Center
Hamilton, Beatrice
Anger is probably the most misunderstood and least expressed feeling. To understand anger, it is necessary to explore the process. Anger usually follows an experience of frustration, unmet expectations, loss of self-respect, and fear. The next stage seems to be anxiety, which may be interpreted as a disappointment, discomfort, or powerlessness.…
ERIC Educational Resources Information Center
Rosner, Burton S.; Kochanski, Greg
2009-01-01
Signal detection theory (SDT) makes the frequently challenged assumption that decision criteria have no variance. An extended model, the Law of Categorical Judgment, relaxes this assumption. The long accepted equation for the law, however, is flawed: It can generate negative probabilities. The correct equation, the Law of Categorical Judgment…
ERIC Educational Resources Information Center
Tchoshanov, Mourat; Quinones, Maria Cruz; Shakirova, Kadriya B.; Ibragimova, Elena N.; Shakirova, Liliana R.
2017-01-01
The interpretive cross-case study focused on the examination of connections between teacher and student topic-specific knowledge of lower secondary mathematics. Two teachers were selected for the study using non-probability purposive sampling technique. Teachers completed the Teacher Content Knowledge Survey before teaching a topic on division of…
An Experiment in Voice Data Entry for Imagery Interpretation Reporting.
1981-03-01
[OCR-garbled sample output from the voice data entry experiment: a vocabulary table pairing spoken and intended ship-class terms (e.g., KOTLIN CLASS, KOTLIN SAM CLASS, SKORY CLASS, RIGA CLASS, GRISHA CLASS), followed by an installation report listing probable and confirmed destroyer and torpedo-boat sightings.]
Educational Gerontology in Korea: An Interpretive and Critical Study
ERIC Educational Resources Information Center
Kee, Youngwha
2010-01-01
Wilma Donahue's book in 1955, "Education for Later Maturity", was considered the first major work to identify the educational needs of the aging person. Peterson considers it one of the earliest comprehensive surveys of older learners. However, the idea of educational gerontology was probably first used in 1970 at the University of…
Fibres, Blood and Broken Glass
ERIC Educational Resources Information Center
Tomlinson, Bob; Peacock, Alan
2005-01-01
Crime Scene Investigators (CSIs) are the bridge between the police and forensic science specialists. Their job is to recover physical evidence from the scene of a crime, and try to make sense of it to interpret and explain what probably happened--which is just what scientists also do. They recover many things: objects, photographs of the crime…
In Defense of the Chi-Square Continuity Correction.
ERIC Educational Resources Information Center
Veldman, Donald J.; McNemar, Quinn
Published studies of the sampling distribution of chi-square with and without Yates' correction for continuity have been interpreted as discrediting the correction. Yates' correction actually produces a biased chi-square value which in turn yields a better estimate of the exact probability of the discrete event concerned when used in conjunction…
Assigning and Combining Probabilities in Single-Case Studies
ERIC Educational Resources Information Center
Manolov, Rumen; Solanas, Antonio
2012-01-01
There is currently a considerable diversity of quantitative measures available for summarizing the results in single-case studies. Given that the interpretation of some of them is difficult due to the lack of established benchmarks, the current article proposes an approach for obtaining further numerical evidence on the importance of the results,…
Interpreting results of cluster surveys in emergency settings: is the LQAS test the best option?
Bilukha, Oleg O; Blanton, Curtis
2008-12-09
Cluster surveys are commonly used in humanitarian emergencies to measure health and nutrition indicators. Deitchler et al. have proposed to use Lot Quality Assurance Sampling (LQAS) hypothesis testing in cluster surveys to classify the prevalence of global acute malnutrition as exceeding or not exceeding the pre-established thresholds. Field practitioners and decision-makers must clearly understand the meaning and implications of using this test in interpreting survey results to make programmatic decisions. We demonstrate that the LQAS test--as proposed by Deitchler et al.--is prone to producing false-positive results and thus is likely to suggest interventions in situations where interventions may not be needed. As an alternative, to provide more useful information for decision-making, we suggest reporting the probability of an indicator's exceeding the threshold as a direct measure of "risk". Such probability can be easily determined in field settings by using a simple spreadsheet calculator. The "risk" of exceeding the threshold can then be considered in the context of other aggravating and protective factors to make informed programmatic decisions.
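The alternative the authors propose, reporting the probability that the indicator exceeds the threshold, which they note can be computed in a simple spreadsheet, can be sketched with a normal approximation to the survey estimate; the prevalence, standard error and threshold below are hypothetical:

```python
from math import erf, sqrt

def prob_exceeds_threshold(estimate, std_error, threshold):
    """Probability that the true prevalence exceeds a programme threshold, using
    a normal approximation to the survey estimate (the sort of calculation the
    authors suggest doing in a simple spreadsheet)."""
    z = (threshold - estimate) / std_error
    return 1.0 - 0.5 * (1.0 + erf(z / sqrt(2.0)))

# Hypothetical cluster-survey result: GAM prevalence 12% with a standard error of
# 2.5% (already inflated for the design effect), against a 15% threshold.
print(round(prob_exceeds_threshold(0.12, 0.025, 0.15), 2))   # ~0.12
```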
NASA Astrophysics Data System (ADS)
Hartmann, William K.; Werner, Stephanie C.
2010-06-01
Recent controversies about systems of crater-count dating have been largely resolved, and with continuing refinements, crater counts will offer a fundamental geological tool to interpret not only ages, but also the nature of geological processes altering the surface of Mars. As an example of the latter technique, we present data on two debris aprons east of Hellas. The aprons show much shorter survival times of small craters than do the nearby contiguous plains. The order-of-magnitude depths of layers involved in the loss process can be judged from the depths of the affected craters. We infer that ice-rich layers in the top tens of meters of both aprons have lost crater topography within the last few 10^8 yr, probably due to flow or sublimation of ice-rich materials. Mantling by ice-rich deposits, associated with climate change cycles of obliquity change, has probably also affected both the aprons and the plains. The crater-count tool thus adds chronological and vertical dimensional information to purely morphological studies.
Ducci, Daniela; de Melo, M Teresa Condesso; Preziosi, Elisabetta; Sellerino, Mariangela; Parrone, Daniele; Ribeiro, Luis
2016-11-01
The natural background level (NBL) concept is revisited and combined with the indicator kriging method to analyze the spatial distribution of groundwater quality within a groundwater body (GWB). The aim is to provide a methodology to easily identify areas with the same probability of exceeding a given threshold (which may be a groundwater quality criterion, standard, or recommended limit for selected properties and constituents). Three case studies with different hydrogeological settings, located in two countries (Portugal and Italy), are used to derive NBLs using the preselection method and to validate the proposed methodology, illustrating its main advantages over conventional statistical water quality analysis. Indicator kriging analysis was used to create probability maps of three potential groundwater contaminants. The results clearly indicate the areas within a groundwater body that are potentially contaminated, because the concentrations exceed the drinking water standards or even the local NBL and cannot be justified by a geogenic origin. The combined methodology facilitates the management of groundwater quality because it allows for the spatial interpretation of NBL values. Copyright © 2016 Elsevier B.V. All rights reserved.
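A minimal sketch of the indicator kriging step, assuming the pykrige package: the 0/1 indicator of exceeding a threshold is interpolated by ordinary kriging, and the resulting surface is read as a map of exceedance probability. The data values and the threshold are invented:

```python
import numpy as np
from pykrige.ok import OrdinaryKriging   # assumes the pykrige package

rng = np.random.default_rng(3)
x, y = rng.uniform(0, 10, 60), rng.uniform(0, 10, 60)       # sample locations (km)
nitrate = rng.lognormal(mean=2.0, sigma=0.6, size=60)       # mg/L, invented values

threshold = 10.0            # e.g. a quality standard or the local NBL
indicator = (nitrate > threshold).astype(float)

# Ordinary kriging of the 0/1 indicator: the interpolated surface estimates the
# probability of exceeding the threshold at each grid node.
ok = OrdinaryKriging(x, y, indicator, variogram_model="spherical")
gridx = np.linspace(0, 10, 50)
gridy = np.linspace(0, 10, 50)
prob_map, kriging_variance = ok.execute("grid", gridx, gridy)
prob_map = np.clip(prob_map, 0.0, 1.0)   # kriged indicators can drift outside [0, 1]
print(float(prob_map.min()), float(prob_map.max()))
```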
Cyber-Physical Correlations for Infrastructure Resilience: A Game-Theoretic Approach
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rao, Nageswara S; He, Fei; Ma, Chris Y. T.
In several critical infrastructures, the cyber and physical parts are correlated so that disruptions to one affect the other and hence the whole system. These correlations may be exploited to strategically launch component attacks, and hence must be accounted for to ensure infrastructure resilience, specified by its survival probability. We characterize the cyber-physical interactions at two levels: (i) the failure correlation function specifies the conditional survival probability of the cyber sub-infrastructure given the physical sub-infrastructure as a function of their marginal probabilities, and (ii) the individual survival probabilities of both sub-infrastructures are characterized by first-order differential conditions. We formulate a resilience problem for infrastructures composed of discrete components as a game between the provider and attacker, wherein their utility functions consist of an infrastructure survival probability term and a cost term expressed in terms of the number of components attacked and reinforced. We derive Nash Equilibrium conditions and sensitivity functions that highlight the dependence of infrastructure resilience on the cost term, correlation function and sub-infrastructure survival probabilities. These results generalize earlier ones based on linear failure correlation functions and independent component failures. We apply the results to models of cloud computing infrastructures and energy grids.
Body Condition Indices Predict Reproductive Success but Not Survival in a Sedentary, Tropical Bird
Milenkaya, Olga; Catlin, Daniel H.; Legge, Sarah; Walters, Jeffrey R.
2015-01-01
Body condition may predict individual fitness because those in better condition have more resources to allocate towards improving their fitness. However, the hypothesis that condition indices are meaningful proxies for fitness has been questioned. Here, we ask if intraspecific variation in condition indices predicts annual reproductive success and survival. We monitored a population of Neochmia phaeton (crimson finch), a sedentary, tropical passerine, for reproductive success and survival over four breeding seasons, and sampled them for commonly used condition indices: mass adjusted for body size, muscle and fat scores, packed cell volume, hemoglobin concentration, total plasma protein, and heterophil to lymphocyte ratio. Our study population is well suited for this research because individuals forage in common areas and do not hold territories such that variation in condition between individuals is not confounded by differences in habitat quality. Furthermore, we controlled for factors that are known to impact condition indices in our study population (e.g., breeding stage) such that we assessed individual condition relative to others in the same context. Condition indices that reflect energy reserves predicted both the probability of an individual fledging young and the number of young produced that survived to independence, but only during some years. Those that were relatively heavy for their body size produced about three times more independent young compared to light individuals. That energy reserves are a meaningful predictor of reproductive success in a sedentary passerine supports the idea that energy reserves are at least sometimes predictors of fitness. However, hematological indices failed to predict reproductive success and none of the indices predicted survival. Therefore, some but not all condition indices may be informative, but because we found that most indices did not predict any component of fitness, we question the ubiquitous interpretation of condition indices as surrogates for individual quality and fitness. PMID:26305457
Evaluation of Differential DependencY (EDDY) is a statistical test for the differential dependency relationship of a set of genes between two given conditions. For each condition, possible dependency network structures are enumerated and their likelihoods are computed to represent a probability distribution of dependency networks. The difference between the probability distributions of dependency networks is computed between conditions, and its statistical significance is evaluated with random permutations of condition labels on the samples.
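A sketch of the core comparison in EDDY: once each condition has a probability distribution over the enumerated dependency-network structures, a divergence between the two distributions (Jensen-Shannon here, as one common choice) is computed and then referred to a permutation distribution. The structure probabilities below are invented:

```python
import numpy as np

def js_divergence(p, q):
    """Jensen-Shannon divergence between two probability distributions defined
    over the same enumerated set of dependency-network structures."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    p, q = p / p.sum(), q / q.sum()
    m = 0.5 * (p + q)
    kl = lambda a, b: float(np.sum(np.where(a > 0, a * np.log2(a / b), 0.0)))
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

# Invented distributions over four enumerated network structures, one per condition.
p_condition_1 = [0.50, 0.30, 0.15, 0.05]
p_condition_2 = [0.10, 0.20, 0.30, 0.40]
observed = js_divergence(p_condition_1, p_condition_2)
print(round(observed, 3))
# Significance is then assessed by recomputing the statistic many times with the
# condition labels randomly permuted across samples and comparing the observed
# value to that permutation distribution.
```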
Conditional Probabilities and Collapse in Quantum Measurements
NASA Astrophysics Data System (ADS)
Laura, Roberto; Vanni, Leonardo
2008-09-01
We show that, by including both the system and the apparatus in the quantum description of the measurement process and by using the concept of conditional probabilities, it is possible to deduce the statistical operator of the system after a measurement with a given result, which gives the probability distribution for all possible consecutive measurements on the system. This statistical operator, representing the state of the system after the first measurement, is in general not the same as that which would be obtained using the postulate of collapse.
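In standard notation, the statistical operator described here is the conditional-probability (Lüders) update; this is a textbook expression consistent with the abstract, not necessarily the authors' exact derivation:

```latex
% Conditional-probability (Lüders) update: P_a projects onto the eigenspace of
% result a, Q_b onto that of a subsequent result b.
\[
  \Pr(b \mid a) \;=\;
  \frac{\operatorname{Tr}\!\left(Q_b\, P_a\, \rho\, P_a\right)}
       {\operatorname{Tr}\!\left(P_a\, \rho\, P_a\right)}
  \qquad\Longrightarrow\qquad
  \rho_{\mid a} \;=\; \frac{P_a\, \rho\, P_a}{\operatorname{Tr}\!\left(P_a\, \rho\right)} .
\]
```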
Probabilities of good, marginal, and poor flying conditions for space shuttle ferry flights
NASA Technical Reports Server (NTRS)
Whiting, D. M.; Guttman, N. B.
1977-01-01
Empirical probabilities are provided for good, marginal, and poor flying weather for ferrying the Space Shuttle Orbiter from Edwards AFB, California, to Kennedy Space Center, Florida, and from Edwards AFB to Marshall Space Flight Center, Alabama. Results are given by month for each overall route plus segments of each route. The criteria for defining a day as good, marginal, or poor and the method of computing the relative frequencies and conditional probabilities for monthly reference periods are described.
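A toy sketch of the two quantities mentioned, computed from one month of daily category records; whether the report's conditional probabilities are day-to-day persistence probabilities, as assumed here, is a guess, and the category sequence is invented:

```python
from collections import Counter
from itertools import pairwise   # Python 3.10+

# One month of hypothetical daily flying-weather categories along a ferry route.
days = list("GGGMMGPPGGGMGGGPMMGGGGGPGGMMGG")   # G=good, M=marginal, P=poor

# Relative frequency of each category for the monthly reference period.
freq = {c: n / len(days) for c, n in Counter(days).items()}

# Conditional (persistence) probabilities P(tomorrow = j | today = i).
pair_counts = Counter(pairwise(days))
today_counts = Counter(d for d, _ in pairwise(days))
cond = {(i, j): n / today_counts[i] for (i, j), n in pair_counts.items()}

print(freq)
print(cond[("G", "G")])   # chance tomorrow is also good, given good today
```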
Pre-Service Teachers' Conceptions of Probability
ERIC Educational Resources Information Center
Odafe, Victor U.
2011-01-01
Probability knowledge and skills are needed in science and in making daily decisions that are sometimes made under uncertain conditions. Hence, there is the need to ensure that the pre-service teachers of our children are well prepared to teach probability. Pre-service teachers' conceptions of probability are identified, and ways of helping them…
The Efficacy of Using Diagrams When Solving Probability Word Problems in College
ERIC Educational Resources Information Center
Beitzel, Brian D.; Staley, Richard K.
2015-01-01
Previous experiments have shown a deleterious effect of visual representations on college students' ability to solve total- and joint-probability word problems. The present experiments used conditional-probability problems, known to be more difficult than total- and joint-probability problems. The diagram group was instructed in how to use tree…
Relative Contributions of Three Descriptive Methods: Implications for Behavioral Assessment
ERIC Educational Resources Information Center
Pence, Sacha T.; Roscoe, Eileen M.; Bourret, Jason C.; Ahearn, William H.
2009-01-01
This study compared the outcomes of three descriptive analysis methods--the ABC method, the conditional probability method, and the conditional and background probability method--to each other and to the results obtained from functional analyses. Six individuals who had been diagnosed with developmental delays and exhibited problem behavior…
NASA Technical Reports Server (NTRS)
Whitnah, A. M.; Howes, D. B.
1971-01-01
Statistical information for the Apollo command module water landings is presented. This information includes the probability of occurrence of various impact conditions, a successful impact, and body X-axis loads of various magnitudes.
I show that a conditional probability analysis using a stressor-response model based on a logistic regression provides a useful approach for developing candidate water quality criteria from empirical data, such as the Maryland Biological Streams Survey (MBSS) data.
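A minimal sketch of the approach: a logistic stressor-response model gives the conditional probability of a degraded biological condition at a given stressor level, and a candidate criterion is the stressor level at which that probability reaches a chosen risk. The coefficients and the 0.5 risk level are illustrative, not MBSS results:

```python
import numpy as np

# Illustrative stressor-response model (coefficients invented, not MBSS results):
# P(degraded biological condition | conductivity x, in uS/cm) = logistic(b0 + b1*x)
b0, b1 = -4.0, 0.008

def p_degraded(x):
    return 1.0 / (1.0 + np.exp(-(b0 + b1 * np.asarray(x, float))))

# Candidate criterion: the stressor level at which the conditional probability
# of degradation reaches a chosen risk level (here 0.5).
risk = 0.5
criterion = (np.log(risk / (1.0 - risk)) - b0) / b1
print(round(float(criterion), 1), round(float(p_degraded(criterion)), 2))   # 500.0 0.5
```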
Spatial prediction models for the probable biological condition of streams and rivers in the USA
The National Rivers and Streams Assessment (NRSA) is a probability-based survey conducted by the US Environmental Protection Agency and its state and tribal partners. It provides information on the ecological condition of the rivers and streams in the conterminous USA, and the ex...
Random forest models for the probable biological condition of streams and rivers in the USA
The National Rivers and Streams Assessment (NRSA) is a probability based survey conducted by the US Environmental Protection Agency and its state and tribal partners. It provides information on the ecological condition of the rivers and streams in the conterminous USA, and the ex...
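A sketch of the modeling step named in the title, using scikit-learn's random forest to emit a probability of poor biological condition for unsampled reaches; the predictors and training data are synthetic stand-ins for the NRSA data:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Sketch of a random-forest model for the probable biological condition of a
# stream reach. The predictors and the synthetic training data are invented;
# NRSA would supply watershed/chemistry predictors and sampled condition classes.
rng = np.random.default_rng(7)
X = rng.random((500, 4))                       # e.g. % urban, % forest, slope, TN
y = (X[:, 0] * 2 + X[:, 3] > 1.4).astype(int)  # 1 = poor condition (synthetic rule)

model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
new_reaches = rng.random((3, 4))
print(model.predict_proba(new_reaches)[:, 1])  # probability of poor condition
```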
A conditional probability approach using monitoring data to develop geographic-specific water quality criteria for protection of aquatic life is presented. Typical methods to develop criteria using existing monitoring data are limited by two issues: (1) how to extrapolate to an...
Application of FTIR spectroscopy to the characterization of archeological wood.
Traoré, Mohamed; Kaal, Joeri; Martínez Cortizas, Antonio
2016-01-15
Two archeological wood samples were studied by attenuated total reflectance Fourier transform infrared (FTIR-ATR) spectroscopy. They originate from a shipwreck in Ribadeo Bay in northwest Spain and from beam wood of an old nave of the Cathedral of Segovia in central Spain. Principal component analysis was applied to the transposed data matrix (samples as columns and spectral bands as rows) of 43 recorded spectra (18 from the shipwreck and 25 from the beam wood). The results showed differences between the two samples, with a larger proportion of carbohydrates and a smaller proportion of lignin in the beam than in the shipwreck wood. Within the beam wood, lignin content was significantly lower in the recent than in the old tree rings (P=0.005). These variations can be attributed to species differences between the two woods (oak and pine, respectively): hardwood lignin contains a mixture of guaiacyl and syringyl units, whereas softwood lignin consists almost exclusively of guaiacyl moieties. The influence of environmental conditions on the FTIR fingerprint was probably reflected by enhanced oxidation of lignin in aerated conditions (beam wood) and hydrolysis of carbohydrates in submerged-anoxic conditions (shipwreck wood). Molecular characterization by analytical pyrolysis of selected samples from each wood type confirmed the interpretation of the mechanisms behind the variability in wood composition obtained by FTIR-ATR. Copyright © 2015 Elsevier B.V. All rights reserved.
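A sketch of the multivariate step described, PCA applied to the transposed matrix so that the spectra act as variables and the spectral bands as observations, using synthetic spectra in place of the 43 FTIR-ATR measurements:

```python
import numpy as np
from sklearn.decomposition import PCA

# Sketch of the analysis described: PCA on the transposed matrix, i.e. spectra
# as variables (columns) and wavenumbers/bands as observations (rows).
# The synthetic spectra below are random stand-ins for the 43 FTIR-ATR spectra.
rng = np.random.default_rng(0)
n_spectra, n_bands = 43, 1800
spectra = rng.random((n_spectra, n_bands))        # rows = samples, cols = bands

transposed = spectra.T                            # rows = bands, cols = samples
pca = PCA(n_components=2)
scores = pca.fit_transform(transposed)            # band scores
loadings = pca.components_                        # one loading per spectrum

print(scores.shape, loadings.shape, pca.explained_variance_ratio_)
```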