Science.gov

Sample records for probability learning

  1. Learning a Probability Distribution Efficiently and Reliably

    NASA Technical Reports Server (NTRS)

    Laird, Philip; Gamble, Evan

    1988-01-01

    A new algorithm, called the CDF-Inversion Algorithm, is described. Using it, one can efficiently learn a probability distribution over a finite set to a specified accuracy and confidence. The algorithm can be extended to learn joint distributions over a vector space. Some implementation results are described.
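
    For illustration only, a minimal sketch of the stated learning goal (estimate a distribution over a finite set to accuracy epsilon with confidence 1 - delta), using a plain empirical estimate with a sample size taken from the Dvoretzky-Kiefer-Wolfowitz inequality; this is a generic stand-in, not the CDF-Inversion Algorithm itself:

      import math
      import random
      from collections import Counter

      def learn_distribution(sample_fn, support_size, epsilon, delta):
          # DKW bound: with n >= ln(2/delta) / (2*epsilon^2) samples, the
          # empirical CDF is within epsilon of the truth with prob. >= 1 - delta.
          n = math.ceil(math.log(2.0 / delta) / (2.0 * epsilon ** 2))
          counts = Counter(sample_fn() for _ in range(n))
          return [counts[k] / n for k in range(support_size)]

      # Hypothetical usage: a biased three-sided die.
      die = lambda: random.choices(range(3), weights=[0.5, 0.3, 0.2])[0]
      print(learn_distribution(die, support_size=3, epsilon=0.05, delta=0.01))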

  2. Rethinking the learning of belief network probabilities

    SciTech Connect

    Musick, R.

    1996-03-01

    Belief networks are a powerful tool for knowledge discovery that provide concise, understandable probabilistic models of data. There are methods grounded in probability theory to incrementally update the relationships described by the belief network when new information is seen, to perform complex inferences over any set of variables in the data, to incorporate domain expertise and prior knowledge into the model, and to automatically learn the model from data. This paper concentrates on part of the belief network induction problem: learning the quantitative structure (the conditional probabilities), given the qualitative structure. In particular, the current practice of rote learning the probabilities in belief networks can be significantly improved upon. We advance the idea of applying any learning algorithm to the task of conditional probability learning in belief networks, discuss potential benefits, and show results of applying neural networks and other algorithms to a medium-sized car insurance belief network. The results demonstrate improvements of 10 to 100% in model error rates over current approaches.

  3. Probability density function learning by unsupervised neurons.

    PubMed

    Fiori, S

    2001-10-01

    In a recent work, we introduced the concept of the pseudo-polynomial adaptive activation function neuron (FAN) and presented an unsupervised information-theoretic learning theory for such a structure. The learning model is based on entropy optimization and provides a way of learning probability distributions from incomplete data. The aim of the present paper is to illustrate some theoretical features of the FAN neuron, to extend its learning theory to asymmetrical density function approximation, and to provide an analytical and numerical comparison with other known density function estimation methods, with special emphasis on universal approximation ability. The paper also provides a survey of PDF learning from incomplete data, as well as results of several experiments performed on real-world problems and signals. PMID:11709808

  4. Dynamic probability estimator for machine learning.

    PubMed

    Starzyk, Janusz A; Wang, Feng

    2004-03-01

    An efficient algorithm for dynamic estimation of probabilities, without division, over an unlimited number of input data is presented. The method estimates the probabilities of the sampled data from the raw sample count while keeping the total count value constant. The accuracy of the estimate depends on the counter size rather than on the total number of data points. The estimator follows variations in the incoming data probability within a fixed window size, without explicit implementation of a windowing technique. The total design area is very small, and all probabilities are estimated concurrently. The dynamic probability estimator was implemented using a programmable gate array from Xilinx, and the performance of this implementation is evaluated in terms of area efficiency and execution time. The method is suitable for highly integrated designs of artificial neural networks in which a large number of dynamic probability estimators can work concurrently. PMID:15384523
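
    As a rough illustration of the counter-based idea in this abstract, a sketch in which the total count is kept bounded by periodically halving all counters (a bit shift, so no true division is ever needed); this is a generic counter-halving scheme, not the authors' FPGA design:

      class DynamicProbabilityEstimator:
          def __init__(self, num_symbols, counter_bits=8):
              self.limit = 1 << counter_bits      # bound on the total count
              self.counts = [0] * num_symbols
              self.total = 0

          def update(self, symbol):
              self.counts[symbol] += 1
              self.total += 1
              if self.total >= self.limit:        # rescale: shifts, not division
                  self.counts = [c >> 1 for c in self.counts]
                  self.total = sum(self.counts)

          def probability(self, symbol):
              # Halving makes old samples decay geometrically, so the estimate
              # tracks roughly the last `limit` samples without storing a window.
              return self.counts[symbol] / max(self.total, 1)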

  5. Probability & Statistics: Modular Learning Exercises. Student Edition

    ERIC Educational Resources Information Center

    Actuarial Foundation, 2012

    2012-01-01

    The purpose of these modules is to provide an introduction to the world of probability and statistics to accelerated mathematics students at the high school level. The materials are centered on the fictional town of Happy Shores, a coastal community which is at risk for hurricanes. Actuaries at an insurance company figure out the risks and…

  6. Probability & Statistics: Modular Learning Exercises. Teacher Edition

    ERIC Educational Resources Information Center

    Actuarial Foundation, 2012

    2012-01-01

    The purpose of these modules is to provide an introduction to the world of probability and statistics to accelerated mathematics students at the high school level. The modules also introduce students to real world math concepts and problems that property and casualty actuaries come across in their work. They are designed to be used by teachers and…

  7. Choice Strategies in Multiple-Cue Probability Learning

    ERIC Educational Resources Information Center

    White, Chris M.; Koehler, Derek J.

    2007-01-01

    Choice strategies for selecting among outcomes in multiple-cue probability learning were investigated using a simulated medical diagnosis task. Expected choice probabilities (the proportion of times each outcome was selected given each cue pattern) under alternative choice strategies were constructed from corresponding observed judged…

  8. Fostering Positive Attitude in Probability Learning Using Graphing Calculator

    ERIC Educational Resources Information Center

    Tan, Choo-Kim; Harji, Madhubala Bava; Lau, Siong-Hoe

    2011-01-01

    Although a plethora of research evidence highlights positive and significant outcomes of the incorporation of the Graphing Calculator (GC) in mathematics education, its use in the teaching and learning process appears to be limited. The obvious need to revisit the teaching and learning of Probability has resulted in this study, i.e. to incorporate…

  9. Learning about Posterior Probability: Do Diagrams and Elaborative Interrogation Help?

    ERIC Educational Resources Information Center

    Clinton, Virginia; Alibali, Martha W.; Nathan, Mitchell J.

    2016-01-01

    To learn from a text, students must make meaningful connections among related ideas in that text. This study examined the effectiveness of two methods of improving connections--elaborative interrogation and diagrams--in written lessons about posterior probability. Undergraduate students (N = 198) read a lesson in one of three questioning…

  10. Computational Modelling and Simulation Fostering New Approaches in Learning Probability

    ERIC Educational Resources Information Center

    Kuhn, Markus; Hoppe, Ulrich; Lingnau, Andreas; Wichmann, Astrid

    2006-01-01

    Discovery learning in mathematics in the domain of probability based on hands-on experiments is normally limited because of the difficulty in providing sufficient materials and data volume in terms of repetitions of the experiments. Our cooperative, computational modelling and simulation environment engages students and teachers in composing and…

  11. Dual-Processes in Learning and Judgment: Evidence from the Multiple Cue Probability Learning Paradigm

    ERIC Educational Resources Information Center

    Rolison, Jonathan J.; Evans, Jonathan St. B. T.; Dennis, Ian; Walsh, Clare R.

    2012-01-01

    Multiple cue probability learning (MCPL) involves learning to predict a criterion based on a set of novel cues when feedback is provided in response to each judgment made. But to what extent does MCPL require controlled attention and explicit hypothesis testing? The results of two experiments show that this depends on cue polarity. Learning about…

  12. Supervised learning of probability distributions by neural networks

    NASA Technical Reports Server (NTRS)

    Baum, Eric B.; Wilczek, Frank

    1988-01-01

    Supervised learning algorithms for feedforward neural networks are investigated analytically. The back-propagation algorithm described by Werbos (1974), Parker (1985), and Rumelhart et al. (1986) is generalized by redefining the values of the input and output neurons as probabilities. The synaptic weights are then varied to follow gradients in the logarithm of likelihood rather than in the error. This modification is shown to provide a more rigorous theoretical basis for the algorithm and to permit more accurate predictions. A typical application involving a medical-diagnosis expert system is discussed.
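
    The modification described here amounts to descending the gradient of the negative log-likelihood (cross-entropy) rather than the squared error; a minimal single-neuron sketch with hypothetical data:

      import numpy as np

      def train_probabilistic_neuron(X, y, lr=0.1, epochs=500):
          rng = np.random.default_rng(0)
          w = rng.normal(size=X.shape[1])
          for _ in range(epochs):
              p = 1.0 / (1.0 + np.exp(-X @ w))   # output read as a probability
              w -= lr * X.T @ (p - y) / len(y)   # gradient of -log likelihood
          return w

      # Hypothetical data: one informative feature plus a bias column.
      rng = np.random.default_rng(1)
      X = np.hstack([rng.normal(size=(400, 1)), np.ones((400, 1))])
      y = (rng.random(400) < 1.0 / (1.0 + np.exp(-2.0 * X[:, 0]))).astype(float)
      print(train_probabilistic_neuron(X, y))    # first weight approaches ~2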

  13. Judgments of learning index relative confidence, not subjective probability.

    PubMed

    Zawadzka, Katarzyna; Higham, Philip A

    2015-11-01

    The underconfidence-with-practice (UWP) effect is a common finding in calibration studies concerned with judgments of learning (JOLs) elicited on a percentage scale. The UWP pattern is present when, in a procedure consisting of multiple study-test cycles, the mean scale JOLs underestimate the mean recall performance on Cycle 2 and beyond. Although this pattern is present both for items recalled and unrecalled on the preceding cycle, to date research has concentrated mostly on the sources of UWP for the latter type of items. In the present study, we aimed to bridge this gap. In three experiments, we examined calibration on the third of three cycles. The results of Experiment 1 demonstrated the typical pattern of higher recall and scale JOLs for previously recalled items than for unrecalled ones. More importantly, they also revealed that even though the UWP effect was found for items previously recalled both once and twice, its magnitude was greater for the former class of items. Experiments 2 and 3, which employed a binary betting task and a binary 0%/100% JOL task, respectively, demonstrated that people can accurately predict future recall for previously recalled items with binary decisions. In both experiments, the UWP effect was absent for both items recalled once and twice. We suggest that the sensitivity of scale JOLs, but not binary judgments, to the number of previous recall successes strengthens the claim of Hanczakowski, Zawadzka, Pasek, and Higham (Journal of Memory and Language 69:429-444, 2013) that scale JOLs reflect confidence in, rather than the subjective probability of, future recall. PMID:26111879

  14. A Mathematical Microworld for Students to Learn Introductory Probability.

    ERIC Educational Resources Information Center

    Jiang, Zhonghong; Potter, Walter D.

    1993-01-01

    Describes the Microworld Chance, a simulation-oriented computer environment that allows students to explore probability concepts in five subenvironments: coins, dice, spinners, thumbtacks, and marbles. Results of a teaching experiment to examine the effectiveness of the microworld in changing students' misconceptions about probability are…

  15. Paradoxes and counterexamples in teaching and learning of probability at university

    NASA Astrophysics Data System (ADS)

    Klymchuk, Sergiy; Kachapova, Farida

    2012-09-01

    This article is devoted to practical aspects of teaching and learning of probability at university. It presents the difficulties and attitudes of first-year university science and engineering students towards using paradoxes and counterexamples as a pedagogical strategy in teaching and learning of probability. It also presents a student's point of view on the effectiveness of this pedagogical strategy.

  16. Paradoxes and Counterexamples in Teaching and Learning of Probability at University

    ERIC Educational Resources Information Center

    Klymchuk, Sergiy; Kachapova, Farida

    2012-01-01

    This article is devoted to practical aspects of teaching and learning of probability at university. It presents the difficulties and attitudes of first-year university science and engineering students towards using paradoxes and counterexamples as a pedagogical strategy in teaching and learning of probability. It also presents a student's point of…

  17. The Effects of Memory and Abstractive Integration on Children's Probability Learning.

    ERIC Educational Resources Information Center

    Kreitler, Shulamith; And Others

    1983-01-01

    Examines the relation between children's (1) probability learning performance and a measure of their memory for items presented in a sequence and (2) probability learning and performance on a test of abstractive integration. Participating were 80 six- and seven-year-old boys and girls from both low and middle socioeconomic classes. (Author/RH)

  18. Illustrating Probability in Genetics with Hands-On Learning: Making the Math Real

    ERIC Educational Resources Information Center

    Pierce, Benjamin A.; Honeycutt, Brenda B.

    2007-01-01

    Probability is an essential tool for understanding heredity and modern genetics, yet many students have difficulty with this topic due to the abstract and quantitative nature of the subject. To facilitate student learning of probability in genetics, we have developed a set of hands-on, cooperative activities that allow students to determine…

  19. Sequence Learning in Infancy: The Independent Contributions of Conditional Probability and Pair Frequency Information

    ERIC Educational Resources Information Center

    Marcovitch, Stuart; Lewkowicz, David J.

    2009-01-01

    The ability to perceive sequences is fundamental to cognition. Previous studies have shown that infants can learn visual sequences as early as 2 months of age and it has been suggested that this ability is mediated by sensitivity to conditional probability information. Typically, conditional probability information has covaried with frequency…

  20. The Effect of Simulation-Based Learning on Prospective Teachers' Inference Skills in Teaching Probability

    ERIC Educational Resources Information Center

    Koparan, Timur; Yilmaz, Gül Kaleli

    2015-01-01

    This research examined the effect of simulation-based probability teaching on prospective teachers' inference skills. In line with this purpose, it aimed to examine the design, implementation and efficiency of a learning environment for experimental probability. Activities were built on modeling, simulation and the…

  21. Word Learning by Preschoolers with SLI: Effect of Phonotactic Probability and Object Familiarity

    ERIC Educational Resources Information Center

    Gray, Shelley; Brinkley, Shara; Svetina, Dubravka

    2012-01-01

    Purpose: In this study, the authors investigated whether previous findings of a low phonotactic probability/unfamiliar object word-learning advantage in preschoolers could be replicated, whether this advantage would be apparent at different "stages" of word learning, and whether findings would differ for preschoolers with specific language…

  22. Learning Probabilities in Computer Engineering by Using a Competency- and Problem-Based Approach

    ERIC Educational Resources Information Center

    Khoumsi, Ahmed; Hadjou, Brahim

    2005-01-01

    Our department has redesigned its electrical and computer engineering programs by adopting a learning methodology based on competence development, problem solving, and the realization of design projects. In this article, we show how this pedagogical approach has been successfully used for learning probabilities and their application to computer…

  23. Probability and Statistics in Astronomical Machine Learning and Data Mining

    NASA Astrophysics Data System (ADS)

    Scargle, Jeffrey

    2012-03-01

    Statistical issues peculiar to astronomy have implications for machine learning and data mining. It should be obvious that statistics lies at the heart of machine learning and data mining. Further it should be no surprise that the passive observational nature of astronomy, the concomitant lack of sampling control, and the uniqueness of its realm (the whole universe!) lead to some special statistical issues and problems. As described in the Introduction to this volume, data analysis technology is largely keeping up with major advances in astrophysics and cosmology, even driving many of them. And I realize that there are many scientists with good statistical knowledge and instincts, especially in the modern era I like to call the Age of Digital Astronomy. Nevertheless, old impediments still lurk, and the aim of this chapter is to elucidate some of them. Many experiences with smart people doing not-so-smart things (cf. the anecdotes collected in the Appendix here) have convinced me that the cautions given here need to be emphasized. Consider these four points: 1. Data analysis often involves searches of many cases, for example, outcomes of a repeated experiment, for a feature of the data. 2. The feature comprising the goal of such searches may not be defined unambiguously until the search is carried out, or perhaps vaguely even then. 3. The human visual system is very good at recognizing patterns in noisy contexts. 4. People are much easier to convince of something they want to believe, or already believe, as opposed to unpleasant or surprising facts. One can argue that all four are good things during the initial, exploratory phases of most data analysis. They represent the curiosity and creativity of the scientific process, especially during the exploration of data collections from new observational programs such as all-sky surveys in wavelengths not accessed before or sets of images of a planetary surface not yet explored. On the other hand, confirmatory scientific…
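
    Point 1 above is the familiar multiple-comparisons trap; a toy simulation (numbers hypothetical) of how searching many noise-only cases manufactures "detections":

      import numpy as np

      rng = np.random.default_rng(42)
      n_cases, n_points, threshold = 1000, 200, 3.0   # 1000 noise-only series
      hits = sum(np.any(np.abs(rng.standard_normal(n_points)) > threshold)
                 for _ in range(n_cases))
      # Per-point false-alarm rate is ~0.27%, but per-case it is
      # 1 - (1 - 0.0027)**200, i.e. roughly 40%.
      print(f"{hits} of {n_cases} pure-noise cases show a '3-sigma detection'")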

  24. The change probability effect: incidental learning, adaptability, and shared visual working memory resources.

    PubMed

    van Lamsweerde, Amanda E; Beck, Melissa R

    2011-12-01

    Statistical properties in the visual environment can be used to improve performance on visual working memory (VWM) tasks. The current study examined the ability to incidentally learn that a change is more likely to occur to a particular feature dimension (shape, color, or location) and use this information to improve change detection performance for that dimension (the change probability effect). Participants completed a change detection task in which one change type was more probable than others. Change probability effects were found for color and shape changes, but not location changes, and intentional strategies did not improve the effect. Furthermore, the change probability effect developed and adapted to new probability information quickly. Finally, in some conditions, an improvement in change detection performance for a probable change led to an impairment in change detection for improbable changes. PMID:21963330

  25. Incidental learning of probability information is differentially affected by the type of visual working memory representation.

    PubMed

    van Lamsweerde, Amanda E; Beck, Melissa R

    2015-12-01

    In this study, we investigated whether the ability to learn probability information is affected by the type of representation held in visual working memory. Across 4 experiments, participants detected changes to displays of coloured shapes. While participants detected changes in 1 dimension (e.g., colour), a feature from a second, nonchanging dimension (e.g., shape) predicted which object was most likely to change. In Experiments 1 and 3, items could be grouped by similarity in the changing dimension across items (e.g., colours and shapes were repeated in the display), while in Experiments 2 and 4 items could not be grouped by similarity (all features were unique). Probability information from the predictive dimension was learned and used to increase performance, but only when all of the features within a display were unique (Experiments 2 and 4). When it was possible to group by feature similarity in the changing dimension (e.g., 2 blue objects appeared within an array), participants were unable to learn probability information and use it to improve performance (Experiments 1 and 3). The results suggest that probability information can be learned in a dimension that is not explicitly task-relevant, but only when the probability information is represented with the changing dimension in visual working memory. PMID:26010021

  26. A Cross-Sectional Comparison of the Effects of Phonotactic Probability and Neighborhood Density on Word Learning by Preschool Children

    ERIC Educational Resources Information Center

    Hoover, Jill R.; Storkel, Holly L.; Hogan, Tiffany P.

    2010-01-01

    Two experiments examined the effects of phonotactic probability and neighborhood density on word learning by 3-, 4-, and 5-year-old children. Nonwords orthogonally varying in probability and density were taught with learning and retention measured via picture naming. Experiment 1 used a within story probability/across story density exposure…

  27. Influence of phonotactic probability/neighbourhood density on lexical learning in late talkers

    PubMed Central

    MacRoy-Higgins, Michelle; Schwartz, Richard G.; Shafer, Valerie L.; Marton, Klara

    2013-01-01

    Background: Toddlers who are late talkers demonstrate delays in phonological and lexical skills. However, the influence of phonological factors on lexical acquisition in toddlers who are late talkers has not been examined directly. Aims: To examine the influence of phonotactic probability/neighbourhood density on word learning in toddlers who were late talkers using comprehension, production and word recognition tasks. Methods & Procedures: Two-year-olds who were late talkers (n = 12) and typically developing toddlers (n = 12) were exposed to 12 novel pseudo-words for unfamiliar objects in ten training sessions. Pseudo-words contained high or low phonotactic probability English sound sequences. The toddlers' comprehension, speech production and detection of mispronunciation of the newly learned words were examined using a preferential looking paradigm. Outcomes & Results: Late talkers showed poorer performance than toddlers with typical language development in all three tasks: comprehension, production and detection of mispronunciations. The toddlers with typical language development showed better speech production and more sensitivity to mispronunciations for high than low phonotactic probability/neighbourhood density sequences. Phonotactic probability/neighbourhood density did not influence the late talkers' speech production or sensitivity to mispronunciations; they performed similarly for pseudo-words with high and low phonotactic probability/neighbourhood density sound sequences. Conclusions & Implications: The results indicate that some late talkers do not recognize statistical properties of their language, which may contribute to their slower lexical learning. PMID:23472958

  28. Effects of Multiple Simulation Presentation among Students of Different Anxiety Levels in the Learning of Probability

    ERIC Educational Resources Information Center

    Fong, Soon Fook; Por, Fei Ping; Tang, Ai Ling

    2012-01-01

    The purpose of this study was to investigate the effects of multiple simulation presentation in interactive multimedia on the achievement of students with different levels of anxiety in the learning of Probability. The interactive multimedia courseware was developed in two different modes, which were Multiple Simulation Presentation (MSP) and…

  29. Blind Students' Learning of Probability through the Use of a Tactile Model

    ERIC Educational Resources Information Center

    Vita, Aida Carvalho; Kataoka, Verônica Yumi

    2014-01-01

    The objective of this paper is to discuss how blind students learn basic concepts of probability using the tactile model proposed by Vita (2012). The activities were part of the teaching sequence "Jefferson's Random Walk", in which students built a tree diagram (using plastic trays, foam cards, and toys) and pictograms in 3D…

  30. The Influence of Phonotactic Probability and Neighborhood Density on Children's Production of Newly Learned Words

    ERIC Educational Resources Information Center

    Heisler, Lori; Goffman, Lisa

    2016-01-01

    A word learning paradigm was used to teach children novel words that varied in phonotactic probability and neighborhood density. The effects of frequency and density on speech production were examined when phonetic forms were nonreferential (i.e., when no referent was attached) and when phonetic forms were referential (i.e., when a referent was…

  31. Influence of Phonotactic Probability/Neighbourhood Density on Lexical Learning in Late Talkers

    ERIC Educational Resources Information Center

    MacRoy-Higgins, Michelle; Schwartz, Richard G.; Shafer, Valerie L.; Marton, Klara

    2013-01-01

    Background: Toddlers who are late talkers demonstrate delays in phonological and lexical skills. However, the influence of phonological factors on lexical acquisition in toddlers who are late talkers has not been examined directly. Aims: To examine the influence of phonotactic probability/neighbourhood density on word learning in toddlers who were…

  32. Splitting the variance of statistical learning performance: A parametric investigation of exposure duration and transitional probabilities.

    PubMed

    Bogaerts, Louisa; Siegelman, Noam; Frost, Ram

    2016-08-01

    What determines individuals' efficacy in detecting regularities in visual statistical learning? Our theoretical starting point assumes that the variance in performance of statistical learning (SL) can be split into the variance related to efficiency in encoding representations within a modality and the variance related to the relative computational efficiency of detecting the distributional properties of the encoded representations. Using a novel methodology, we dissociated encoding from higher-order learning factors, by independently manipulating exposure duration and transitional probabilities in a stream of visual shapes. Our results show that the encoding of shapes and the retrieving of their transitional probabilities are not independent and additive processes, but interact to jointly determine SL performance. The theoretical implications of these findings for a mechanistic explanation of SL are discussed. PMID:26743060

  33. Ensemble learning of inverse probability weights for marginal structural modeling in large observational datasets.

    PubMed

    Gruber, Susan; Logan, Roger W; Jarrín, Inmaculada; Monge, Susana; Hernán, Miguel A

    2015-01-15

    Inverse probability weights used to fit marginal structural models are typically estimated using logistic regression. However, a data-adaptive procedure may be able to better exploit information available in measured covariates. By combining predictions from multiple algorithms, ensemble learning offers an alternative to logistic regression modeling to further reduce bias in estimated marginal structural model parameters. We describe the application of two ensemble learning approaches to estimating stabilized weights: super learning (SL), an ensemble machine learning approach that relies on V-fold cross validation, and an ensemble learner (EL) that creates a single partition of the data into training and validation sets. Longitudinal data from two multicenter cohort studies in Spain (CoRIS and CoRIS-MD) were analyzed to estimate the mortality hazard ratio for initiation versus no initiation of combined antiretroviral therapy among HIV positive subjects. Both ensemble approaches produced hazard ratio estimates further away from the null, and with tighter confidence intervals, than logistic regression modeling. Computation time for EL was less than half that of SL. We conclude that ensemble learning using a library of diverse candidate algorithms offers an alternative to parametric modeling of inverse probability weights when fitting marginal structural models. With large datasets, EL provides a rich search over the solution space in less time than SL with comparable results. PMID:25316152
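
    A simplified, single-time-point sketch of stabilized inverse probability weights with the weight model swapped between logistic regression and an off-the-shelf ensemble (scikit-learn names assumed; the paper's setting is longitudinal and uses super learning, not this two-model comparison):

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import cross_val_predict

      def stabilized_weights(X, treated, model):
          # sw_i = P(A = a_i) / P(A = a_i | X_i), with the denominator fit
          # out-of-fold to avoid overfit (over-extreme) weights.
          p_tx = cross_val_predict(model, X, treated, cv=5,
                                   method="predict_proba")[:, 1]
          denom = np.where(treated == 1, p_tx, 1.0 - p_tx)
          numer = np.where(treated == 1, treated.mean(), 1.0 - treated.mean())
          return numer / denom

      # Hypothetical confounded data: X[:, 0] drives treatment assignment.
      rng = np.random.default_rng(1)
      X = rng.normal(size=(500, 3))
      treated = (rng.random(500) < 1.0 / (1.0 + np.exp(-X[:, 0]))).astype(int)
      for m in (LogisticRegression(max_iter=1000),
                RandomForestClassifier(n_estimators=200, random_state=0)):
          print(type(m).__name__, stabilized_weights(X, treated, m).mean())

    Well-behaved stabilized weights average close to 1; the weighted sample is then used to fit the marginal structural model.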

  34. Ensemble learning of inverse probability weights for marginal structural modeling in large observational datasets

    PubMed Central

    Gruber, Susan; Logan, Roger W.; Jarrín, Inmaculada; Monge, Susana; Hernán, Miguel A.

    2014-01-01

    Inverse probability weights used to fit marginal structural models are typically estimated using logistic regression. However, a data-adaptive procedure may be able to better exploit information available in measured covariates. By combining predictions from multiple algorithms, ensemble learning offers an alternative to logistic regression modeling to further reduce bias in estimated marginal structural model parameters. We describe the application of two ensemble learning approaches to estimating stabilized weights: super learning (SL), an ensemble machine learning approach that relies on V-fold cross validation, and an ensemble learner (EL) that creates a single partition of the data into training and validation sets. Longitudinal data from two multicenter cohort studies in Spain (CoRIS and CoRIS-MD) were analyzed to estimate the mortality hazard ratio for initiation versus no initiation of combined antiretroviral therapy among HIV positive subjects. Both ensemble approaches produced hazard ratio estimates further away from the null, and with tighter confidence intervals, than logistic regression modeling. Computation time for EL was less than half that of SL. We conclude that ensemble learning using a library of diverse candidate algorithms offers an alternative to parametric modeling of inverse probability weights when fitting marginal structural models. With large datasets, EL provides a rich search over the solution space in less time than SL with comparable results. PMID:25316152

  35. Computational Modeling of Statistical Learning: Effects of Transitional Probability versus Frequency and Links to Word Learning

    ERIC Educational Resources Information Center

    Mirman, Daniel; Estes, Katharine Graf; Magnuson, James S.

    2010-01-01

    Statistical learning mechanisms play an important role in theories of language acquisition and processing. Recurrent neural network models have provided important insights into how these mechanisms might operate. We examined whether such networks capture two key findings in human statistical learning. In Simulation 1, a simple recurrent network…

  36. More than words: Adults learn probabilities over categories and relationships between them

    PubMed Central

    Hudson Kam, Carla L.

    2009-01-01

    This study examines whether human learners can acquire statistics over abstract categories and their relationships to each other. Adult learners were exposed to miniature artificial languages containing variation in the ordering of the Subject, Object, and Verb constituents. Different orders (e.g. SOV, VSO) occurred in the input with different frequencies, but the occurrence of one order versus another was not predictable. Importantly, the language was constructed such that participants could only match the overall input probabilities if they were tracking statistics over abstract categories, not over individual words. At test, participants reproduced the probabilities present in the input with a high degree of accuracy. Closer examination revealed that learners were matching the probabilities associated with individual verbs rather than the category as a whole. However, individual nouns had no impact on word orders produced. Thus, participants learned the probabilities of a particular ordering of the abstract grammatical categories Subject and Object associated with each verb. Results suggest that statistical learning mechanisms are capable of tracking relationships between abstract linguistic categories in addition to individual items. PMID:20161375

  37. Impact of Statistical Learning Methods on the Predictive Power of Multivariate Normal Tissue Complication Probability Models

    SciTech Connect

    Xu Chengjian; Schaaf, Arjen van der; Schilstra, Cornelis; Langendijk, Johannes A.; Veld, Aart A. van't

    2012-03-15

    Purpose: To study the impact of different statistical learning methods on the prediction performance of multivariate normal tissue complication probability (NTCP) models. Methods and Materials: In this study, three learning methods, stepwise selection, least absolute shrinkage and selection operator (LASSO), and Bayesian model averaging (BMA), were used to build NTCP models of xerostomia following radiotherapy treatment for head and neck cancer. Performance of each learning method was evaluated by a repeated cross-validation scheme in order to obtain a fair comparison among methods. Results: It was found that the LASSO and BMA methods produced models with significantly better predictive power than that of the stepwise selection method. Furthermore, the LASSO method yields an easily interpretable model as the stepwise method does, in contrast to the less intuitive BMA method. Conclusions: The commonly used stepwise selection method, which is simple to execute, may be insufficient for NTCP modeling. The LASSO method is recommended.
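
    A sketch of the LASSO arm of such a comparison under a repeated cross-validation scheme (scikit-learn names assumed; the data are synthetic stand-ins for dose-volume predictors, not the xerostomia cohort):

      import numpy as np
      from sklearn.linear_model import LogisticRegressionCV
      from sklearn.model_selection import RepeatedStratifiedKFold, cross_val_score
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler

      rng = np.random.default_rng(0)
      X = rng.normal(size=(300, 40))                 # many candidate predictors
      y = (rng.random(300) < 1 / (1 + np.exp(-(X[:, 0] + 0.5 * X[:, 1])))).astype(int)

      # L1-penalized (LASSO-type) logistic model; the penalty is tuned by an
      # inner CV, and predictive power is judged by repeated outer CV.
      lasso = make_pipeline(StandardScaler(),
                            LogisticRegressionCV(penalty="l1", solver="liblinear", cv=5))
      cv = RepeatedStratifiedKFold(n_splits=5, n_repeats=10, random_state=1)
      scores = cross_val_score(lasso, X, y, scoring="roc_auc", cv=cv)
      print("mean AUC:", round(scores.mean(), 3))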

  38. Learning to Teach Probability: Relationships among Preservice Teachers' Beliefs and Orientations, Content Knowledge, and Pedagogical Content Knowledge of Probability

    ERIC Educational Resources Information Center

    Ives, Sarah Elizabeth

    2009-01-01

    The purposes of this study were to investigate preservice mathematics teachers' orientations, content knowledge, and pedagogical content knowledge of probability; the relationships among these three aspects; and the usefulness of tasks with respect to examining these aspects of knowledge. The design of the study was a multi-case study of five…

  39. Probability estimation with machine learning methods for dichotomous and multicategory outcome: applications.

    PubMed

    Kruppa, Jochen; Liu, Yufeng; Diener, Hans-Christian; Holste, Theresa; Weimar, Christian; König, Inke R; Ziegler, Andreas

    2014-07-01

    Machine learning methods are applied to three different large datasets, all dealing with probability estimation problems for dichotomous or multicategory data. Specifically, we investigate k-nearest neighbors, bagged nearest neighbors, random forests for probability estimation trees, and support vector machines with the kernels of Bessel, linear, Laplacian, and radial basis type. Comparisons are made with logistic regression. The dataset from the German Stroke Study Collaboration with dichotomous and three-category outcome variables allows, in particular, for temporal and external validation. The other two datasets are freely available from the UCI learning repository and provide dichotomous outcome variables. One of them, the Cleveland Clinic Foundation Heart Disease dataset, uses data from one clinic for training and from three clinics for external validation, while the other, the thyroid disease dataset, allows for temporal validation by separating data into training and test data by date of recruitment into study. For dichotomous outcome variables, we use receiver operating characteristics, areas under the curve values with bootstrapped 95% confidence intervals, and Hosmer-Lemeshow-type figures as comparison criteria. For dichotomous and multicategory outcomes, we calculated bootstrap Brier scores with 95% confidence intervals and also compared them through bootstrapping. In a supplement, we provide R code for performing the analyses and for random forest analyses in Random Jungle, version 2.1.0. The learning machines show promising performance over all constructed models. They are simple to apply and serve as an alternative approach to logistic or multinomial logistic regression analysis. PMID:24989843
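
    A minimal sketch of such a probability-estimation comparison on synthetic data (scikit-learn names assumed; the paper's datasets, bootstrap confidence intervals and Random Jungle runs are not reproduced):

      from sklearn.datasets import make_classification
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.linear_model import LogisticRegression
      from sklearn.metrics import brier_score_loss, roc_auc_score
      from sklearn.model_selection import train_test_split
      from sklearn.neighbors import KNeighborsClassifier

      X, y = make_classification(n_samples=2000, n_features=10, random_state=0)
      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.5, random_state=0)

      models = {"logistic": LogisticRegression(max_iter=1000),
                "k-NN": KNeighborsClassifier(n_neighbors=25),
                "random forest": RandomForestClassifier(n_estimators=300, random_state=0)}
      for name, m in models.items():
          p = m.fit(X_tr, y_tr).predict_proba(X_te)[:, 1]   # estimated P(y=1|x)
          print(f"{name:14s} Brier={brier_score_loss(y_te, p):.3f} "
                f"AUC={roc_auc_score(y_te, p):.3f}")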

  40. The effect of incremental changes in phonotactic probability and neighborhood density on word learning by preschool children

    PubMed Central

    Storkel, Holly L.; Bontempo, Daniel E.; Aschenbrenner, Andrew J.; Maekawa, Junko; Lee, Su-Yeon

    2013-01-01

    Purpose: Phonotactic probability and neighborhood density have predominately been defined using gross distinctions (i.e., low vs. high). The current studies examined the influence of finer changes in probability (Experiment 1) and density (Experiment 2) on word learning. Method: The full range of probability or density was examined by sampling five nonwords from each of four quartiles. Three- and 5-year-old children received training on nonword-nonobject pairs. Learning was measured in a picture-naming task immediately following training and 1 week after training. Results were analyzed using multi-level modeling. Results: A linear spline model best captured nonlinearities in phonotactic probability. Specifically, word learning improved as probability increased in the lowest quartile, worsened as probability increased in the mid-low quartile, and then remained stable and poor in the two highest quartiles. An ordinary linear model sufficiently described neighborhood density: word learning improved as density increased across all quartiles. Conclusion: Given these different patterns, phonotactic probability and neighborhood density appear to influence different word-learning processes. Specifically, phonotactic probability may affect recognition that a sound sequence is an acceptable word in the language and is a novel word for the child, whereas neighborhood density may influence creation of a new representation in long-term memory. PMID:23882005

  41. Under the hood of statistical learning: A statistical MMN reflects the magnitude of transitional probabilities in auditory sequences

    PubMed Central

    Koelsch, Stefan; Busch, Tobias; Jentschke, Sebastian; Rohrmeier, Martin

    2016-01-01

    Within the framework of statistical learning, many behavioural studies investigated the processing of unpredicted events. However, surprisingly few neurophysiological studies are available on this topic, and no statistical learning experiment has investigated electroencephalographic (EEG) correlates of processing events with different transition probabilities. We carried out an EEG study with a novel variant of the established statistical learning paradigm. Timbres were presented in isochronous sequences of triplets. The first two sounds of all triplets were equiprobable, while the third sound occurred with either low (10%), intermediate (30%), or high (60%) probability. Thus, the occurrence probability of the third item of each triplet (given the first two items) was varied. Compared to high-probability triplet endings, endings with low and intermediate probability elicited an early anterior negativity that had an onset around 100 ms and was maximal at around 180 ms. This effect was larger for events with low than for events with intermediate probability. Our results reveal that, when predictions are based on statistical learning, events that do not match a prediction evoke an early anterior negativity, with the amplitude of this mismatch response being inversely related to the probability of such events. Thus, we report a statistical mismatch negativity (sMMN) that reflects statistical learning of transitional probability distributions that go beyond auditory sensory memory capabilities. PMID:26830652
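
    A sketch of the stimulus construction described above (labels hypothetical): every triplet shares the same two context items, and the ending is drawn with low, intermediate, or high probability:

      import random

      def triplet_stream(n_triplets, seed=0):
          # Endings occur with probability 0.1 (low), 0.3 (intermediate),
          # or 0.6 (high), given the two preceding context items.
          rng = random.Random(seed)
          endings, weights = ["low", "mid", "high"], [0.1, 0.3, 0.6]
          seq = []
          for _ in range(n_triplets):
              seq += ["A", "B", rng.choices(endings, weights=weights)[0]]
          return seq

      print(triplet_stream(5))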

  42. The Effect of Incremental Changes in Phonotactic Probability and Neighborhood Density on Word Learning by Preschool Children

    ERIC Educational Resources Information Center

    Storkel, Holly L.; Bontempo, Daniel E.; Aschenbrenner, Andrew J.; Maekawa, Junko; Lee, Su-Yeon

    2013-01-01

    Purpose: Phonotactic probability or neighborhood density has predominately been defined through the use of gross distinctions (i.e., low vs. high). In the current studies, the authors examined the influence of finer changes in probability (Experiment 1) and density (Experiment 2) on word learning. Method: The authors examined the full range of…

  43. Triggering word learning in children with Language Impairment: the effect of phonotactic probability and neighbourhood density.

    PubMed

    McKean, Cristina; Letts, Carolyn; Howard, David

    2014-11-01

    The effect of phonotactic probability (PP) and neighbourhood density (ND) on triggering word learning was examined in children with Language Impairment (3;04-6;09) and compared to Typically Developing children. Nonwords, varying PP and ND orthogonally, were presented in a story context and their learning tested using a referent identification task. Group comparisons with receptive vocabulary as a covariate found no group differences in overall scores or in the influence of PP or ND. Therefore, there was no evidence of atypical lexical or phonological processing. 'Convergent' PP/ND (High PP/High ND; Low PP/Low ND) was optimal for word learning in both groups. This bias interacted with vocabulary knowledge. 'Divergent' PP/ND word scores (High PP/Low ND; Low PP/High ND) were positively correlated with vocabulary so the 'divergence disadvantage' reduced as vocabulary knowledge grew; an interaction hypothesized to represent developmental changes in lexical-phonological processing linked to the emergence of phonological representations. PMID:24191951

  44. The Distracting Effect of Material Reward: An Alternative Explanation for the Superior Performance of Reward Groups in Probability Learning

    ERIC Educational Resources Information Center

    McGraw, Kenneth O.; McCullers, John C.

    1974-01-01

    To determine whether the distraction effect associated with material rewards in discrimination learning can account for the superior performance of reward groups in probability learning, the performance of 144 school children (preschool, second, and fifth grades) on a two-choice successive discrimination task was compared under three reinforcement…

  45. ANNz2 - Photometric redshift and probability density function estimation using machine-learning

    NASA Astrophysics Data System (ADS)

    Sadeh, Iftach

    2014-05-01

    Large photometric galaxy surveys allow the study of questions at the forefront of science, such as the nature of dark energy. The success of such surveys depends on the ability to measure the photometric redshifts of objects (photo-zs), based on limited spectral data. A new major version of the public photo-z estimation software, ANNz, is presented here. The new code incorporates several machine-learning methods, such as artificial neural networks and boosted decision/regression trees, which are all used in concert. The objective of the algorithm is to dynamically optimize the performance of the photo-z estimation, and to properly derive the associated uncertainties. In addition to single-value solutions, the new code also generates full probability density functions in two independent ways.

  46. Learning concepts of fractals and probability by “doing science”

    NASA Astrophysics Data System (ADS)

    Stanley, H. Eugene

    1989-09-01

    Very recent advances in computer technology provide the power of mainframe systems in relatively compact and inexpensive personal computers; soon the computing power of even a supercomputer will be available on a desktop at a price comparable to today's personal computers. Over the next decade this tremendous computing power can and probably will become available in schools throughout the world. Here we discuss the possibility of harnessing this new technological resource as a teaching tool for specific topics in mathematics and science, focusing on random processes in nature and their deep connection to concepts in probability and fractal geometry. Such natural phenomena as the growth of snowflakes via random aggregation and the disordered geometric configurations of polymer chains demonstrate that fundamentally random microscopic processes can give rise to predictable macroscopic behaviors. They also give rise to random fractal structures of inherent interest and great beauty. Because it is impossible to view the underlying processes directly, computer simulation and visualization is an indispensable tool for understanding and studying these phenomena. In the process of “doing science” with both hands-on experiments and computer simulations, students would learn abstract mathematical concepts in a context which is at once concrete and inherently motivating. Furthermore, the techniques they could employ would mirror in most respects those in current use by researchers, thus forging an unprecedented link between this curriculum and the professional worlds of science and mathematics.
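
    In the spirit of the hands-on simulations advocated here, a minimal random-walk experiment: a fundamentally random microscopic process yields the predictable macroscopic scaling r_rms ~ sqrt(N):

      import random

      def rms_distance(n_steps, n_walkers=2000, seed=0):
          # Root-mean-square end-to-end distance of a 2-D lattice random walk.
          rng = random.Random(seed)
          total = 0.0
          for _ in range(n_walkers):
              x = y = 0
              for _ in range(n_steps):
                  dx, dy = rng.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
                  x, y = x + dx, y + dy
              total += x * x + y * y
          return (total / n_walkers) ** 0.5

      for n in (16, 64, 256):
          print(n, round(rms_distance(n), 2))   # about 4, 8, 16: sqrt(N) growth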

  47. Activity in Inferior Parietal and Medial Prefrontal Cortex Signals the Accumulation of Evidence in a Probability Learning Task

    PubMed Central

    d'Acremont, Mathieu; Fornari, Eleonora; Bossaerts, Peter

    2013-01-01

    In an uncertain environment, probabilities are key to predicting future events and making adaptive choices. However, little is known about how humans learn such probabilities and where and how they are encoded in the brain, especially when they concern more than two outcomes. During functional magnetic resonance imaging (fMRI), young adults learned the probabilities of uncertain stimuli through repetitive sampling. Stimuli represented payoffs and participants had to predict their occurrence to maximize their earnings. Choices indicated loss and risk aversion but unbiased estimation of probabilities. BOLD response in medial prefrontal cortex and angular gyri increased linearly with the probability of the currently observed stimulus, untainted by its value. Connectivity analyses during rest and task revealed that these regions belonged to the default mode network. The activation of past outcomes in memory is evoked as a possible mechanism to explain the engagement of the default mode network in probability learning. A BOLD response relating to value was detected only at decision time, mainly in striatum. It is concluded that activity in inferior parietal and medial prefrontal cortex reflects the amount of evidence accumulated in favor of competing and uncertain outcomes. PMID:23401673

  48. Computer-Based Graphical Displays for Enhancing Mental Animation and Improving Reasoning in Novice Learning of Probability

    ERIC Educational Resources Information Center

    Kaplan, Danielle E.; Wu, Erin Chia-ling

    2006-01-01

    Our research suggests static and animated graphics can lead to more animated thinking and more correct problem solving in computer-based probability learning. Pilot software modules were developed for graduate online statistics courses and representation research. A study with novice graduate student statisticians compared problem solving in five…

  49. Using Rasch Analysis to Explore What Students Learn about Probability Concepts

    ERIC Educational Resources Information Center

    Mahmud, Zamalia; Porter, Anne

    2015-01-01

    Students' understanding of probability concepts have been investigated from various different perspectives. This study was set out to investigate perceived understanding of probability concepts of forty-four students from the STAT131 Understanding Uncertainty and Variation course at the University of Wollongong, NSW. Rasch measurement which is…

  50. Calibrating perceived understanding and competency in probability concepts: A diagnosis of learning difficulties based on Rasch probabilistic model

    NASA Astrophysics Data System (ADS)

    Mahmud, Zamalia; Porter, Anne; Salikin, Masniyati; Ghani, Nor Azura Md

    2015-12-01

    Students' understanding of probability concepts has been investigated from various perspectives. Competency, on the other hand, is often measured separately in the form of a test structure. This study set out to show that perceived understanding and competency can be calibrated and assessed together using Rasch measurement tools. Forty-four students from the STAT131 Understanding Uncertainty and Variation course at the University of Wollongong, NSW volunteered to participate in the study. Rasch measurement, which is based on a probabilistic model, was used to calibrate the responses from two survey instruments and investigate the interactions between them. Data were captured from the e-learning platform Moodle, where students provided their responses through an online quiz. The study shows that the majority of students perceived little understanding of conditional and independent events prior to learning about them but tended to demonstrate a slightly higher competency level afterward. Based on the Rasch map, there is an indication of some increase in learning and knowledge about some probability concepts at the end of the two-week lessons on probability concepts.
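
    For reference, the dichotomous Rasch model underlying such calibration (a standard formula, not quoted from the paper): the probability that person n answers item i correctly is

      P(X_{ni} = 1) = \frac{\exp(\theta_n - b_i)}{1 + \exp(\theta_n - b_i)},

    where \theta_n is the person's ability and b_i the item's difficulty. Persons and items thus sit on a common logit scale, which is what the Rasch map mentioned above displays.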

  51. Value and probability coding in a feedback-based learning task utilizing food rewards

    PubMed Central

    Lempert, Karolina M.

    2014-01-01

    For the consequences of our actions to guide behavior, the brain must represent different types of outcome-related information. For example, an outcome can be construed as negative because an expected reward was not delivered or because an outcome of low value was delivered. Thus behavioral consequences can differ in terms of the information they provide about outcome probability and value. We investigated the role of the striatum in processing probability-based and value-based negative feedback by training participants to associate cues with food rewards and then employing a selective satiety procedure to devalue one food outcome. Using functional magnetic resonance imaging, we examined brain activity related to receipt of expected rewards, receipt of devalued outcomes, omission of expected rewards, omission of devalued outcomes, and expected omissions of an outcome. Nucleus accumbens activation was greater for rewarding outcomes than devalued outcomes, but activity in this region did not correlate with the probability of reward receipt. Activation of the right caudate and putamen, however, was largest in response to rewarding outcomes relative to expected omissions of reward. The dorsal striatum (caudate and putamen) at the time of feedback also showed a parametric increase correlating with the trialwise probability of reward receipt. Our results suggest that the ventral striatum is sensitive to the motivational relevance, or subjective value, of the outcome, while the dorsal striatum codes for a more complex signal that incorporates reward probability. Value and probability information may be integrated in the dorsal striatum, to facilitate action planning and allocation of effort. PMID:25339705

  52. Learning in reverse: 8-month-old infants track backward transitional probabilities

    PubMed Central

    Pelucchi, Bruna; Hay, Jessica F.; Saffran, Jenny R.

    2009-01-01

    Numerous recent studies suggest that human learners, including both infants and adults, readily track sequential statistics computed between adjacent elements. One such statistic, transitional probability, is typically calculated as the likelihood that one element predicts another. However, little is known about whether listeners are sensitive to the directionality of this computation. To address this issue, we tested 8-month-old infants in a word segmentation task, using fluent speech drawn from an unfamiliar natural language. Critically, test items were distinguished solely by their backward transitional probabilities. The results provide the first evidence that infants track backward statistics in fluent speech. PMID:19717144
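
    The directional statistic at issue is easy to state: forward TP(x -> y) = count(xy)/count(x), while backward TP(x -> y) = count(xy)/count(y). A small sketch over a hypothetical syllable sequence:

      from collections import Counter

      def transitional_probabilities(seq):
          pairs = Counter(zip(seq, seq[1:]))
          firsts, seconds = Counter(seq[:-1]), Counter(seq[1:])
          fwd = {p: c / firsts[p[0]] for p, c in pairs.items()}    # predictive
          bwd = {p: c / seconds[p[1]] for p, c in pairs.items()}   # retrodictive
          return fwd, bwd

      fwd, bwd = transitional_probabilities("ba bi bu ba bi go la bi".split())
      print(fwd[("ba", "bi")], bwd[("ba", "bi")])   # 1.0 forward, ~0.67 backward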

  53. We Can Still Learn About Probability by Rolling Dice and Tossing Coins

    ERIC Educational Resources Information Center

    Dunn, Peter K.

    2005-01-01

    Rolling dice and tossing coins can still be used to teach probability even if students know (or think they know) what happens in these experiments. This article considers many simple variations of these experiments which are interesting, potentially enjoyable and challenging. Using these variations can cause students (and teachers) to think again…

  54. Learning probability distributions from smooth observables and the maximum entropy principle: some remarks

    NASA Astrophysics Data System (ADS)

    Obuchi, Tomoyuki; Monasson, Rémi

    2015-09-01

    The maximum entropy principle (MEP) is a very useful working hypothesis in a wide variety of inference problems, ranging from biological to engineering tasks. To better understand the reasons for the success of the MEP, we propose a statistical-mechanical formulation to treat the space of probability distributions constrained by the measured values of (experimental) observables. In this paper we first review the results of a detailed analysis of the simplest case, that of randomly chosen observables. In addition, we investigate by numerical and analytical means the case of smooth observables, which is of practical relevance. Our preliminary results are presented and discussed with respect to the efficiency of the MEP.
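
    For reference, the textbook form behind the MEP (not quoted from the paper): maximizing the entropy subject to measured observable averages yields an exponential family,

      p_\lambda(x) = \frac{1}{Z(\lambda)} \exp\Big( \sum_i \lambda_i f_i(x) \Big),
      \qquad
      Z(\lambda) = \int \exp\Big( \sum_i \lambda_i f_i(x) \Big) \, dx,

    with the multipliers \lambda_i fixed by matching \langle f_i \rangle to the measured values.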

  55. Unification of field theory and maximum entropy methods for learning probability densities

    NASA Astrophysics Data System (ADS)

    Kinney, Justin B.

    2015-09-01

    The need to estimate smooth probability distributions (a.k.a. probability densities) from finite sampled data is ubiquitous in science. Many approaches to this problem have been described, but none is yet regarded as providing a definitive solution. Maximum entropy estimation and Bayesian field theory are two such approaches. Both have origins in statistical physics, but the relationship between them has remained unclear. Here I unify these two methods by showing that every maximum entropy density estimate can be recovered in the infinite smoothness limit of an appropriate Bayesian field theory. I also show that Bayesian field theory estimation can be performed without imposing any boundary conditions on candidate densities, and that the infinite smoothness limit of these theories recovers the most common types of maximum entropy estimates. Bayesian field theory thus provides a natural test of the maximum entropy null hypothesis and, furthermore, returns an alternative (lower entropy) density estimate when the maximum entropy hypothesis is falsified. The computations necessary for this approach can be performed rapidly for one-dimensional data, and software for doing this is provided.

  56. The Influence of Part-Word Phonotactic Probability/Neighborhood Density on Word Learning by Preschool Children Varying in Expressive Vocabulary

    ERIC Educational Resources Information Center

    Storkel, Holly L.; Hoover, Jill R.

    2011-01-01

    The goal of this study was to examine the influence of part-word phonotactic probability/neighborhood density on word learning by preschool children with normal vocabularies that varied in size. Ninety-eight children (ages 2;11-6;0) were taught consonant-vowel-consonant (CVC) nonwords orthogonally varying in the probability/density of the CV…

  57. A Study of Students' Learning Styles, Discipline Attitudes and Knowledge Acquisition in Technology-Enhanced Probability and Statistics Education.

    PubMed

    Christou, Nicolas; Dinov, Ivo D

    2010-09-01

    Many modern technological advances have a direct impact on the format, style and efficacy of delivery and consumption of educational content. For example, various novel communication and information technology tools and resources enable efficient, timely, interactive and graphical demonstrations of diverse scientific concepts. In this manuscript, we report on a meta-study of three controlled experiments of using the Statistics Online Computational Resources (SOCR) in probability and statistics courses. Web-accessible SOCR applets, demonstrations, simulations and virtual experiments were used in different courses as treatment and compared to matched control classes utilizing traditional pedagogical approaches. The qualitative and quantitative data we collected for all courses included the Felder-Silverman-Soloman index of learning styles, background assessment, pre- and post-surveys of attitude towards the subject, an end-point satisfaction survey, and a variety of quiz, laboratory and test scores. Our findings indicate that students' learning styles and attitudes towards a discipline may be important confounds of their final quantitative performance. The observed positive effects of integrating information technology with established pedagogical techniques may be valid across disciplines within the broader spectrum of courses in the science education curriculum. The two critical components of improving science education via blended instruction are instructor training and the development of appropriate activities, simulations and interactive resources. PMID:21603097

  58. Effect of Phonotactic Probability and Neighborhood Density on Word-Learning Configuration by Preschoolers with Typical Development and Specific Language Impairment

    ERIC Educational Resources Information Center

    Gray, Shelley; Pittman, Andrea; Weinhold, Juliet

    2014-01-01

    Purpose: In this study, the authors assessed the effects of phonotactic probability and neighborhood density on word-learning configuration by preschoolers with specific language impairment (SLI) and typical language development (TD). Method: One hundred thirty-one children participated: 48 with SLI, 44 with TD matched on age and gender, and 39…

  19. Prediction of seizure incidence probability in PTZ model of kindling through spatial learning ability in male and female rats.

    PubMed

    Haeri, Narges-Al-Sadat; Palizvan, Mohammad Reza; Sadegh, Mehdi; Aghaei, Zohre; Rafiei, Mohammad

    2016-07-01

    Epilepsy is a common neurological disease characterized by periodic seizures. Cognitive deficits and impairments in learning and memory are also associated with epilepsy. Neuronal changes and synaptic modifications in the kindling model of epilepsy are similar to those that occur during learning and memory formation. Herein we investigated whether seizure susceptibility in the pentylenetetrazol (PTZ) model of kindling is predictable from learning ability in the Morris water maze (MWM) task in male and female rats. Allocentric learning was tested using the MWM in the presence of light, while egocentric learning was evaluated using the MWM in a dark room. The results indicated no significant differences in allocentric learning ability between male and female rats. However, male rats were able to memorize the location of the platform more effectively than females in the egocentric test. In addition, a statistically significant negative correlation between learning ability (working memory) and seizure susceptibility was found in male rats, while this correlation was positive in female rats. On the other hand, although there was no significant correlation between retrieval (reference memory) of spatial memories and seizure parameters in male rats, female rats showed a significant negative correlation. These findings may provide some evidence for predicting seizure susceptibility from learning ability and memory retention. PMID:27098273

  20. Why Probability?

    ERIC Educational Resources Information Center

    Weatherly, Myra S.

    1984-01-01

    Instruction in mathematical probability to enhance higher levels of critical and creative thinking with gifted students is described. Among thinking skills developed by such an approach are analysis, synthesis, evaluation, fluency, and complexity. (CL)

  1. Probability workshop to be better in probability topic

    NASA Astrophysics Data System (ADS)

    Asmat, Aszila; Ujang, Suriyati; Wahid, Sharifah Norhuda Syed

    2015-02-01

    The purpose of the present study was to examine whether statistics anxiety and attitudes towards the probability topic among students in higher education have an effect on their performance. Sixty-two fourth-semester science students were given statistics anxiety questionnaires about their perception of the probability topic. Results indicated that students' performance on the probability topic was not related to anxiety level; that is, higher statistics anxiety did not lead to lower scores on the topic. The study also revealed that motivated students benefited from the probability workshop: their performance on the probability topic improved relative to before the workshop. In addition, there was a significant difference in performance between genders, with female students achieving better results than male students. Thus, more initiatives in learning programs with different teaching approaches are needed to provide useful information for improving student learning outcomes in higher learning institutions.

  2. Uncertainty quantification and integration of machine learning techniques for predicting acid rock drainage chemistry: a probability bounds approach.

    PubMed

    Betrie, Getnet D; Sadiq, Rehan; Morin, Kevin A; Tesfamariam, Solomon

    2014-08-15

    Acid rock drainage (ARD) is a major global pollution problem that has adversely impacted the environment. Identification and quantification of uncertainties are integral parts of ARD assessment and risk mitigation; however, previous studies on predicting ARD drainage chemistry have not fully addressed these uncertainties. In this study, artificial neural networks (ANN) and support vector machines (SVM) are used for the prediction of ARD drainage chemistry, and their predictive uncertainties are quantified using probability bounds analysis. Furthermore, the predictions of the ANN and SVM are integrated using four aggregation methods to improve on their individual predictions. The results of this study showed that the ANN performed better than the SVM in enveloping the observed concentrations. In addition, integrating the predictions of the ANN and SVM using the aggregation methods improved on the individual techniques. PMID:24852616
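
    The study's four aggregation methods are not spelled out in the abstract; the fragment below shows two generic ways of combining interval-valued predictions from two models (an envelope and an intersection), with purely hypothetical numbers, as a rough illustration of the idea.

        import numpy as np

        # Hypothetical lower/upper prediction bounds for one drainage-chemistry
        # variable at three time points, one pair of bounds per model.
        ann_lo, ann_hi = np.array([2.1, 3.0, 4.2]), np.array([3.5, 4.8, 6.0])
        svm_lo, svm_hi = np.array([2.4, 2.7, 4.6]), np.array([3.2, 5.1, 5.7])

        # Envelope: the widest bounds consistent with either model.
        env_lo, env_hi = np.minimum(ann_lo, svm_lo), np.maximum(ann_hi, svm_hi)

        # Intersection: the region on which both models agree (may be empty).
        int_lo, int_hi = np.maximum(ann_lo, svm_lo), np.minimum(ann_hi, svm_hi)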

  3. Adapting machine learning techniques to censored time-to-event health record data: A general-purpose approach using inverse probability of censoring weighting.

    PubMed

    Vock, David M; Wolfson, Julian; Bandyopadhyay, Sunayan; Adomavicius, Gediminas; Johnson, Paul E; Vazquez-Benitez, Gabriela; O'Connor, Patrick J

    2016-06-01

    Models for predicting the probability of experiencing various health outcomes or adverse events over a certain time frame (e.g., having a heart attack in the next 5 years) based on individual patient characteristics are important tools for managing patient care. Electronic health data (EHD) are appealing sources of training data because they provide access to large amounts of rich individual-level data from present-day patient populations. However, because EHD are derived by extracting information from administrative and clinical databases, some fraction of subjects will not be under observation for the entire time frame over which one wants to make predictions; this loss to follow-up is often due to disenrollment from the health system. For subjects without complete follow-up, whether or not they experienced the adverse event is unknown, and in statistical terms the event time is said to be right-censored. Most machine learning approaches to the problem have been relatively ad hoc; for example, common approaches for handling observations in which the event status is unknown include (1) discarding those observations, (2) treating them as non-events, and (3) splitting each into two observations: one where the event occurs and one where it does not. In this paper, we present a general-purpose approach to account for right-censored outcomes using inverse probability of censoring weighting (IPCW). We illustrate how IPCW can easily be incorporated into a number of existing machine learning algorithms used to mine big health care data, including Bayesian networks, k-nearest neighbors, decision trees, and generalized additive models. We then show that our approach leads to better calibrated predictions than the three ad hoc approaches when applied to predicting the 5-year risk of experiencing a cardiovascular adverse event, using EHD from a large U.S. Midwestern healthcare system. PMID:26992568
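
    A minimal sketch of the IPCW idea follows: estimate the censoring survival curve with a Kaplan-Meier estimator, then weight each fully observed subject by the inverse probability of remaining uncensored. This is a generic textbook construction, not the paper's code; ties and other practical details are ignored.

        import numpy as np

        def censoring_survival(time, event):
            # Kaplan-Meier curve for the censoring distribution S_C(t);
            # censoring (event == 0) plays the role of the "event" here.
            order = np.argsort(time)
            t, e = time[order], event[order]
            at_risk = len(t) - np.arange(len(t))
            surv = np.cumprod(1.0 - (e == 0) / at_risk)
            return t, surv

        def ipcw_weights(time, event, horizon):
            # Weights for predicting "event by horizon": subjects with an event
            # before the horizon, or followed past it, get 1/S_C(min(T, horizon));
            # subjects censored early get weight 0 (their label is unknown).
            t_sorted, surv = censoring_survival(time, event)

            def s_c(u):
                idx = np.searchsorted(t_sorted, u, side="right") - 1
                return surv[idx] if idx >= 0 else 1.0

            labeled = ((event == 1) & (time <= horizon)) | (time > horizon)
            w = np.zeros(len(time))
            for i in np.where(labeled)[0]:
                w[i] = 1.0 / s_c(min(time[i], horizon))
            return w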

  4. Evaluation of the probability of arrester failure in a high-voltage transmission line using a Q learning artificial neural network model

    NASA Astrophysics Data System (ADS)

    Ekonomou, L.; Karampelas, P.; Vita, V.; Chatzarakis, G. E.

    2011-04-01

    One of the most popular methods of protecting high-voltage transmission lines against lightning strikes and internal overvoltages is the use of arresters. The installation of arresters on high-voltage transmission lines can prevent or even reduce the lines' failure rate. Several studies based on simulation tools have been presented in order to estimate the critical currents that exceed the arresters' rated energy stress and to specify the arresters' installation interval. In this work, artificial intelligence, and more specifically a Q-learning artificial neural network (ANN) model, is applied to evaluate the arresters' failure probability. The aims of the paper are to describe in detail the developed Q-learning ANN model and to compare the results obtained by its application to operating 150 kV Greek transmission lines with those produced using a simulation tool. The satisfactory and accurate results of the proposed ANN model can make it a valuable tool for designers of electrical power systems seeking more effective lightning protection, reduced operational costs, and better continuity of service.

  5. Probability 1/e

    ERIC Educational Resources Information Center

    Koo, Reginald; Jones, Martin L.

    2011-01-01

    Quite a number of interesting problems in probability feature an event with probability equal to 1/e. This article discusses three such problems and attempts to explain why this probability occurs with such frequency.
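
    The abstract does not say which three problems are discussed, but the classic hat-check (derangement) problem is one standard example. The simulation below, with hypothetical parameters, shows the probability of no matches converging to 1/e.

        import random

        # Hat-check problem: n people get hats back at random; the probability
        # that nobody receives their own hat tends to 1/e as n grows.
        def no_fixed_point_probability(n=52, trials=100_000, seed=1):
            rng = random.Random(seed)
            hits = 0
            for _ in range(trials):
                perm = list(range(n))
                rng.shuffle(perm)
                if all(p != i for i, p in enumerate(perm)):
                    hits += 1
            return hits / trials

        print(no_fixed_point_probability())   # about 0.368, i.e. 1/e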

  6. On Probability Domains III

    NASA Astrophysics Data System (ADS)

    Frič, Roman; Papčo, Martin

    2015-12-01

    Domains of generalized probability have been introduced in order to provide a general construction of random events, observables and states. It is based on the notion of a cogenerator and the properties of product. We continue our previous study and show how some other quantum structures fit our categorical approach. We discuss how various epireflections implicitly used in the classical probability theory are related to the transition to fuzzy probability theory and describe the latter probability theory as a genuine categorical extension of the former. We show that the IF-probability can be studied via the fuzzy probability theory. We outline a "tensor modification" of the fuzzy probability theory.

  7. The Role of Cooperative Learning Type Team Assisted Individualization to Improve the Students' Mathematics Communication Ability in the Subject of Probability Theory

    ERIC Educational Resources Information Center

    Tinungki, Georgina Maria

    2015-01-01

    The importance of learning mathematics cannot be separated from its role in all aspects of life. Communicating ideas using the language of mathematics is even more practical, systematic, and efficient. In order to overcome the difficulties of students who have an insufficient understanding of mathematics material, good communications should be built in…

  8. Experience Matters: Information Acquisition Optimizes Probability Gain

    PubMed Central

    Nelson, Jonathan D.; McKenzie, Craig R.M.; Cottrell, Garrison W.; Sejnowski, Terrence J.

    2010-01-01

    Deciding which piece of information to acquire or attend to is fundamental to perception, categorization, medical diagnosis, and scientific inference. Four statistical theories of the value of information—information gain, Kullback-Leibler distance, probability gain (error minimization), and impact—are equally consistent with extant data on human information acquisition. Three experiments, designed via computer optimization to be maximally informative, tested which of these theories best describes human information search. Experiment 1, which used natural sampling and experience-based learning to convey environmental probabilities, found that probability gain explained subjects’ information search better than the other statistical theories or the probability-of-certainty heuristic. Experiments 1 and 2 found that subjects behaved differently when the standard method of verbally presented summary statistics (rather than experience-based learning) was used to convey environmental probabilities. Experiment 3 found that subjects’ preference for probability gain is robust, suggesting that the other models contribute little to subjects’ search behavior. PMID:20525915
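
    As a rough illustration of how two of these value-of-information measures differ, the sketch below computes the expected probability gain and expected information gain of a single binary query; the priors and likelihoods are hypothetical, and this is not the authors' optimization code.

        import numpy as np

        def entropy(p):
            p = p[p > 0]
            return -(p * np.log2(p)).sum()

        def query_values(prior, likelihoods):
            # prior: P(category); likelihoods: P(outcome | category),
            # shape (n_outcomes, n_categories).
            joint = likelihoods * prior              # P(outcome, category)
            p_out = joint.sum(axis=1)                # P(outcome)
            posteriors = joint / p_out[:, None]      # P(category | outcome)
            prob_gain = sum(p_out[o] * (posteriors[o].max() - prior.max())
                            for o in range(len(p_out)))
            info_gain = entropy(prior) - sum(p_out[o] * entropy(posteriors[o])
                                             for o in range(len(p_out)))
            return prob_gain, info_gain

        prior = np.array([0.7, 0.3])
        likelihoods = np.array([[0.9, 0.4],    # P(feature present | category)
                                [0.1, 0.6]])   # P(feature absent  | category)
        print(query_values(prior, likelihoods))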

  10. Probability and Relative Frequency

    NASA Astrophysics Data System (ADS)

    Drieschner, Michael

    2016-01-01

    The concept of probability seems to have been inexplicable since its invention in the seventeenth century. In its use in science, probability is closely related with relative frequency. So the task seems to be interpreting that relation. In this paper, we start with predicted relative frequency and show that its structure is the same as that of probability. I propose to call that the 'prediction interpretation' of probability. The consequences of that definition are discussed. The "ladder"-structure of the probability calculus is analyzed. The expectation of the relative frequency is shown to be equal to the predicted relative frequency. Probability is shown to be the most general empirically testable prediction.

  11. Evolution and Probability.

    ERIC Educational Resources Information Center

    Bailey, David H.

    2000-01-01

    Some of the most impressive-sounding criticisms of the conventional theory of biological evolution involve probability. Presents a few examples of how probability should and should not be used in discussing evolution. (ASK)

  12. BIODEGRADATION PROBABILITY PROGRAM (BIODEG)

    EPA Science Inventory

    The Biodegradation Probability Program (BIODEG) calculates the probability that a chemical under aerobic conditions with mixed cultures of microorganisms will biodegrade rapidly or slowly. It uses fragment constants developed using multiple linear and non-linear regressions and d...

  13. Probability on a Budget.

    ERIC Educational Resources Information Center

    Ewbank, William A.; Ginther, John L.

    2002-01-01

    Describes how to use common dice numbered 1-6 for simple mathematical situations including probability. Presents a lesson using regular dice and specially marked dice to explore some of the concepts of probability. (KHR)

  14. Spatial Probability Cuing and Right Hemisphere Damage

    ERIC Educational Resources Information Center

    Shaqiri, Albulena; Anderson, Britt

    2012-01-01

    In this experiment we studied statistical learning, inter-trial priming, and visual attention. We assessed healthy controls and right brain damaged (RBD) patients with and without neglect, on a simple visual discrimination task designed to measure priming effects and probability learning. All participants showed a preserved priming effect for item…

  15. Dependent Probability Spaces

    ERIC Educational Resources Information Center

    Edwards, William F.; Shiflett, Ray C.; Shultz, Harris

    2008-01-01

    The mathematical model used to describe independence between two events in probability has a non-intuitive consequence called dependent spaces. The paper begins with a very brief history of the development of probability, then defines dependent spaces, and reviews what is known about finite spaces with uniform probability. The study of finite…

  16. Searching with probabilities

    SciTech Connect

    Palay, A.J.

    1985-01-01

    This book examines how probability distributions can be used as a knowledge representation technique. It presents a mechanism that can be used to guide a selective search algorithm to solve a variety of tactical chess problems. Topics covered include probabilities and searching, the B* algorithm, chess probabilities in practice, examples, results, and future work.

  17. In All Probability, Probability is not All

    ERIC Educational Resources Information Center

    Helman, Danny

    2004-01-01

    The national lottery is often portrayed as a game of pure chance with no room for strategy. This misperception seems to stem from the application of probability instead of expectancy considerations, and can be utilized to introduce the statistical concept of expectation.

  18. Predicting accurate probabilities with a ranking loss

    PubMed Central

    Menon, Aditya Krishna; Jiang, Xiaoqian J; Vembu, Shankar; Elkan, Charles; Ohno-Machado, Lucila

    2013-01-01

    In many real-world applications of machine learning classifiers, it is essential to predict the probability of an example belonging to a particular class. This paper proposes a simple technique for predicting probabilities based on optimizing a ranking loss, followed by isotonic regression. This semi-parametric technique offers both good ranking and regression performance, and models a richer set of probability distributions than statistical workhorses such as logistic regression. We provide experimental results that show the effectiveness of this technique on real-world applications of probability prediction. PMID:25285328
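
    The sketch below shows the general shape of the technique: scores from a model are passed through isotonic regression to yield calibrated probabilities while preserving their ranking. A logistic model stands in for the authors' ranking-loss model, and the data are synthetic.

        import numpy as np
        from sklearn.isotonic import IsotonicRegression
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(0)
        X = rng.normal(size=(2000, 5))
        y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=2000) > 0).astype(int)

        X_tr, X_cal, y_tr, y_cal = train_test_split(X, y, random_state=0)
        scores_cal = LogisticRegression().fit(X_tr, y_tr).decision_function(X_cal)

        # Isotonic regression maps raw ranking scores to calibrated probabilities
        # through a monotone fit, so the score ordering (the ranking) is preserved.
        iso = IsotonicRegression(out_of_bounds="clip")
        probs = iso.fit_transform(scores_cal, y_cal)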

  19. A Posteriori Transit Probabilities

    NASA Astrophysics Data System (ADS)

    Stevens, Daniel J.; Gaudi, B. Scott

    2013-08-01

    Given the radial velocity (RV) detection of an unseen companion, it is often of interest to estimate the probability that the companion also transits the primary star. Typically, one assumes a uniform distribution for the cosine of the inclination angle i of the companion's orbit. This yields the familiar estimate for the prior transit probability of ~R*/a, given the primary radius R* and orbital semimajor axis a, and assuming small companions and a circular orbit. However, the posterior transit probability depends not only on the prior probability distribution of i but also on the prior probability distribution of the companion mass Mc, given a measurement of the product of the two (the minimum mass Mc sin i) from an RV signal. In general, the posterior can be larger or smaller than the prior transit probability. We derive analytic expressions for the posterior transit probability assuming a power-law form for the distribution of true masses, dΓ/dMc ∝ Mc^α, for integer values -3 ≤ α ≤ 3. We show that for low transit probabilities, these probabilities reduce to a constant multiplicative factor f_α of the corresponding prior transit probability, where f_α in general depends on α and an assumed upper limit on the true mass. The prior and posterior probabilities are equal for α = -1. The posterior transit probability is ~1.5 times larger than the prior for α = -3 and is ~4/π times larger for α = -2, but is less than the prior for α ≥ 0, and can be arbitrarily small for α > 1. We also calculate the posterior transit probability in different mass regimes for two physically motivated mass distributions of companions around Sun-like stars. We find that for Jupiter-mass planets, the posterior transit probability is roughly equal to the prior probability, whereas the posterior is likely higher for Super-Earths and Neptunes (10 M⊕ - 30 M⊕) and Super-Jupiters (3 MJup - 10 MJup), owing to the predicted steep rise in the mass function toward smaller…

  20. Normal tissue complication probability (NTCP) modelling using spatial dose metrics and machine learning methods for severe acute oral mucositis resulting from head and neck radiotherapy

    PubMed Central

    Dean, Jamie A; Wong, Kee H; Welsh, Liam C; Jones, Ann-Britt; Schick, Ulrike; Newbold, Kate L; Bhide, Shreerang A; Harrington, Kevin J; Nutting, Christopher M; Gulliford, Sarah L

    2016-01-01

    Background and Purpose: Severe acute mucositis commonly results from head and neck (chemo)radiotherapy. A predictive model of mucositis could guide clinical decision-making and inform treatment planning. We aimed to generate such a model using spatial dose metrics and machine learning. Material and Methods: Predictive models of severe acute mucositis were generated using radiotherapy dose (dose-volume and spatial dose metrics) and clinical data. Penalised logistic regression, support vector classification and random forest classification (RFC) models were generated and compared. Internal validation was performed (with 100-iteration cross-validation), using multiple metrics, including area under the receiver operating characteristic curve (AUC) and calibration slope, to assess performance. Associations between covariates and severe mucositis were explored using the models. Results: The dose-volume-based (standard) models performed equally to those incorporating spatial information. Discrimination was similar between models, but the standard RFC model had the best calibration. The mean AUC and calibration slope for this model were 0.71 (s.d. = 0.09) and 3.9 (s.d. = 2.2), respectively. The volumes of oral cavity receiving intermediate and high doses were associated with severe mucositis. Conclusions: The standard RFC model's performance is modest-to-good, but should be improved, and requires external validation. Reducing the volumes of oral cavity receiving intermediate and high doses may reduce mucositis incidence. PMID:27240717
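
    To make the two reported performance metrics concrete, the sketch below trains a random forest classifier on synthetic stand-in data and computes an AUC and a calibration slope (the slope of a logistic refit on the predicted log-odds; 1.0 is ideal). It mirrors the metrics only, not the study's pipeline or data.

        import numpy as np
        from sklearn.datasets import make_classification
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score
        from sklearn.model_selection import train_test_split

        # Stand-in data for dose-volume covariates and a mucositis label.
        X, y = make_classification(n_samples=400, n_features=10, random_state=0)
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

        model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
        p = np.clip(model.predict_proba(X_te)[:, 1], 1e-6, 1 - 1e-6)

        auc = roc_auc_score(y_te, p)                    # discrimination
        logit = np.log(p / (1 - p)).reshape(-1, 1)
        slope = LogisticRegression().fit(logit, y_te).coef_[0, 0]   # calibration
        print(f"AUC = {auc:.2f}, calibration slope = {slope:.2f}")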

  1. Probability, statistics, and computational science.

    PubMed

    Beerenwinkel, Niko; Siebourg, Juliane

    2012-01-01

    In this chapter, we review basic concepts from probability theory and computational statistics that are fundamental to evolutionary genomics. We provide a very basic introduction to statistical modeling and discuss general principles, including maximum likelihood and Bayesian inference. Markov chains, hidden Markov models, and Bayesian network models are introduced in more detail as they occur frequently and in many variations in genomics applications. In particular, we discuss efficient inference algorithms and methods for learning these models from partially observed data. Several simple examples are given throughout the text, some of which point to models that are discussed in more detail in subsequent chapters. PMID:22407706

  2. Estimation of State Transition Probabilities: A Neural Network Model

    NASA Astrophysics Data System (ADS)

    Saito, Hiroshi; Takiyama, Ken; Okada, Masato

    2015-12-01

    Humans and animals can predict future states on the basis of acquired knowledge. This prediction of the state transition is important for choosing the best action, and the prediction is only possible if the state transition probability has already been learned. However, how our brains learn the state transition probability is unknown. Here, we propose a simple algorithm for estimating the state transition probability by utilizing the state prediction error. We analytically and numerically confirmed that our algorithm is able to learn the probability completely with an appropriate learning rate. Furthermore, our learning rule reproduced experimentally reported psychometric functions and neural activities in the lateral intraparietal area in a decision-making task. Thus, our algorithm might describe the manner in which our brains learn state transition probabilities and predict future states.
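
    The paper's exact update rule is not given in the abstract, but a minimal error-driven estimator in the same spirit looks like the following: the running estimate moves toward each observed transition in proportion to the state prediction error.

        import numpy as np

        def learn_transition_probability(stream, eta=0.05):
            # The estimate moves toward each observed transition by a step
            # proportional to the state prediction error (s - p_hat).
            p_hat = 0.5                      # uninformative initial estimate
            for s in stream:
                p_hat += eta * (s - p_hat)
            return p_hat

        rng = np.random.default_rng(0)
        stream = (rng.random(2000) < 0.3).astype(float)   # true probability 0.3
        print(learn_transition_probability(stream))       # fluctuates around 0.3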

  3. A Comprehensive Probability Project for the Upper Division One-Semester Probability Course Using Yahtzee

    ERIC Educational Resources Information Center

    Wilson, Jason; Lawman, Joshua; Murphy, Rachael; Nelson, Marissa

    2011-01-01

    This article describes a probability project used in an upper division, one-semester probability course with third-semester calculus and linear algebra prerequisites. The student learning outcome focused on developing the skills necessary for approaching project-sized math/stat application problems. These skills include appropriately defining…

  4. Single-case probabilities

    NASA Astrophysics Data System (ADS)

    Miller, David

    1991-12-01

    The propensity interpretation of probability, bred by Popper in 1957 (K. R. Popper, in Observation and Interpretation in the Philosophy of Physics, S. Körner, ed. (Butterworth, London, 1957, and Dover, New York, 1962), p. 65; reprinted in Popper Selections, D. W. Miller, ed. (Princeton University Press, Princeton, 1985), p. 199) from pure frequency stock, is the only extant objectivist account that provides any proper understanding of single-case probabilities as well as of probabilities in ensembles and in the long run. In Sec. 1 of this paper I recall salient points of the frequency interpretations of von Mises and of Popper himself, and in Sec. 2 I filter out from Popper's numerous expositions of the propensity interpretation its most interesting and fertile strain. I then go on to assess it. First I defend it, in Sec. 3, against recent criticisms (P. Humphreys, Philos. Rev. 94, 557 (1985); P. Milne, Erkenntnis 25, 129 (1986)) to the effect that conditional [or relative] probabilities, unlike absolute probabilities, can only rarely be made sense of as propensities. I then challenge its predominance, in Sec. 4, by outlining a rival theory: an irreproachably objectivist theory of probability, fully applicable to the single case, that interprets physical probabilities as instantaneous frequencies.

  5. Probability with Roulette

    ERIC Educational Resources Information Center

    Marshall, Jennings B.

    2007-01-01

    This article describes how roulette can be used to teach basic concepts of probability. Various bets are used to illustrate the computation of expected value. A betting system shows variations in patterns that often appear in random events.
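
    A worked example of the expected value computation described here, using American roulette (38 pockets) and exact fractions:

        from fractions import Fraction

        # Expected profit per $1 bet: win the payout with probability p,
        # otherwise lose the stake.
        def expected_value(win_probability, payout_to_one):
            p = Fraction(win_probability)
            return p * payout_to_one - (1 - p)

        straight = expected_value(Fraction(1, 38), 35)   # single number, pays 35:1
        red      = expected_value(Fraction(18, 38), 1)   # red, pays 1:1
        print(straight, red)   # both -1/19, about -5.26 cents per dollar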

  6. Launch Collision Probability

    NASA Technical Reports Server (NTRS)

    Bollenbacher, Gary; Guptill, James D.

    1999-01-01

    This report analyzes the probability of a launch vehicle colliding with one of the nearly 10,000 tracked objects orbiting the Earth, given that an object on a near-collision course with the launch vehicle has been identified. Knowledge of the probability of collision throughout the launch window can be used to avoid launching at times when the probability of collision is unacceptably high. The analysis in this report assumes that the positions of the orbiting objects and the launch vehicle can be predicted as a function of time and therefore that any tracked object which comes close to the launch vehicle can be identified. The analysis further assumes that the position uncertainty of the launch vehicle and the approaching space object can be described with position covariance matrices. With these and some additional simplifying assumptions, a closed-form solution is developed using two approaches. The solution shows that the probability of collision is a function of position uncertainties, the size of the two potentially colliding objects, and the nominal separation distance at the point of closest approach. The impact of the simplifying assumptions on the accuracy of the final result is assessed and the application of the results to the Cassini mission, launched in October 1997, is described. Other factors that affect the probability of collision are also discussed. Finally, the report offers alternative approaches that can be used to evaluate the probability of collision.
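
    The report's closed-form solution is not reproduced in the abstract; the Monte Carlo sketch below encodes the same setup numerically, with position uncertainty as a bivariate normal in the encounter plane and collision defined by a combined hard-body radius. All numbers are hypothetical.

        import numpy as np

        def collision_probability(miss, cov, hard_body_radius, n=1_000_000, seed=0):
            # Probability that a bivariate normal relative-position error,
            # centred on the nominal miss vector, falls inside the combined radius.
            rng = np.random.default_rng(seed)
            pts = rng.multivariate_normal(miss, cov, size=n)
            return (np.linalg.norm(pts, axis=1) < hard_body_radius).mean()

        # Hypothetical numbers: 200 m nominal miss, 100 m sigmas, 20 m radius.
        cov = np.diag([100.0**2, 100.0**2])
        print(collision_probability(np.array([200.0, 0.0]), cov, 20.0))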

  7. Experimental Probability in Elementary School

    ERIC Educational Resources Information Center

    Andrew, Lane

    2009-01-01

    Concepts in probability can be more readily understood if students are first exposed to probability via experiment. Performing probability experiments encourages students to develop understandings of probability grounded in real events, as opposed to merely computing answers based on formulae.

  8. Acceptance, values, and probability.

    PubMed

    Steel, Daniel

    2015-10-01

    This essay makes a case for regarding personal probabilities used in Bayesian analyses of confirmation as objects of acceptance and rejection. That in turn entails that personal probabilities are subject to the argument from inductive risk, which aims to show non-epistemic values can legitimately influence scientific decisions about which hypotheses to accept. In a Bayesian context, the argument from inductive risk suggests that value judgments can influence decisions about which probability models to accept for likelihoods and priors. As a consequence, if the argument from inductive risk is sound, then non-epistemic values can affect not only the level of evidence deemed necessary to accept a hypothesis but also degrees of confirmation themselves. PMID:26386533

  9. Probability & Perception: The Representativeness Heuristic in Action

    ERIC Educational Resources Information Center

    Lu, Yun; Vasko, Francis J.; Drummond, Trevor J.; Vasko, Lisa E.

    2014-01-01

    If the prospective students of probability lack a background in mathematical proofs, hands-on classroom activities may work well to help them to learn to analyze problems correctly. For example, students may physically roll a die twice to count and compare the frequency of the sequences. Tools such as graphing calculators or Microsoft Excel®…

  10. Approximating Integrals Using Probability

    ERIC Educational Resources Information Center

    Maruszewski, Richard F., Jr.; Caudle, Kyle A.

    2005-01-01

    As part of a discussion on Monte Carlo methods, this article outlines how to use probability expectations to approximate the value of a definite integral. The purpose of this paper is to elaborate on this technique and then to show several examples using Visual Basic as a programming tool. It is an interesting method because it combines two branches of…
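
    The original examples were written in Visual Basic; a Python rendering of the same idea, estimating a definite integral as (b - a) times the expected value of f at a uniform random point, is sketched below.

        import random

        # Estimate the integral of f from a to b as (b - a) * E[f(U)],
        # where U is uniform on (a, b).
        def mc_integral(f, a, b, n=100_000, seed=0):
            rng = random.Random(seed)
            total = sum(f(rng.uniform(a, b)) for _ in range(n))
            return (b - a) * total / n

        print(mc_integral(lambda x: x * x, 0.0, 1.0))   # close to 1/3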

  11. Varga: On Probability.

    ERIC Educational Resources Information Center

    Varga, Tamas

    This booklet resulted from a 1980 visit by the author, a Hungarian mathematics educator, to the Teachers' Center Project at Southern Illinois University at Edwardsville. Included are activities and problems that make probablility concepts accessible to young children. The topics considered are: two probability games; choosing two beads; matching…

  12. Application of Quantum Probability

    NASA Astrophysics Data System (ADS)

    Bohdalová, Mária; Kalina, Martin; Nánásiová, Ol'ga

    2009-03-01

    This is the first attempt to smooth time series using estimators that apply quantum probability with causality (non-commutative s-maps on an orthomodular lattice). In this context, this means that we use a non-symmetric covariance matrix in the construction of our estimator.

  13. Univariate Probability Distributions

    ERIC Educational Resources Information Center

    Leemis, Lawrence M.; Luckett, Daniel J.; Powell, Austin G.; Vermeer, Peter E.

    2012-01-01

    We describe a web-based interactive graphic that can be used as a resource in introductory classes in mathematical statistics. This interactive graphic presents 76 common univariate distributions and gives details on (a) various features of the distribution such as the functional form of the probability density function and cumulative distribution…

  14. Waste Package Misload Probability

    SciTech Connect

    J.K. Knudsen

    2001-11-20

    The objective of this calculation is to calculate the probability of occurrence for fuel assembly (FA) misloads (i.e., an FA placed in the wrong location) and FA damage during FA movements. The scope of this calculation is provided by the information obtained from the Framatome ANP 2001a report. The first step in this calculation is to categorize each fuel-handling event that occurred at nuclear power plants. The different categories are based on FAs being damaged or misloaded. The next step is to determine the total number of FAs involved in the event. Using this information, a probability of occurrence will be calculated for FA misload and FA damage events. This calculation is an expansion of preliminary work performed by Framatome ANP 2001a.

  15. Stimulus probability effects in absolute identification.

    PubMed

    Kent, Christopher; Lamberts, Koen

    2016-05-01

    This study investigated the effect of stimulus presentation probability on accuracy and response times in an absolute identification task. Three schedules of presentation were used to investigate the interaction between presentation probability and stimulus position within the set. Data from individual participants indicated strong effects of presentation probability on both proportion correct and response times. The effects were moderated by the ubiquitous stimulus position effect. The accuracy and response time data were predicted by an exemplar-based model of perceptual cognition (Kent & Lamberts, 2005). The bow in discriminability was also attenuated when presentation probability for middle items was relatively high, an effect that will constrain future model development. The study provides evidence for item-specific learning in absolute identification. Implications for other theories of absolute identification are discussed. PMID:26478959

  16. Seismicity alert probabilities at Parkfield, California, revisited

    USGS Publications Warehouse

    Michael, A.J.; Jones, L.M.

    1998-01-01

    For a decade, the US Geological Survey has used the Parkfield Earthquake Prediction Experiment scenario document to estimate the probability that earthquakes observed on the San Andreas fault near Parkfield will turn out to be foreshocks followed by the expected magnitude six mainshock. During this time, we have learned much about the seismogenic process at Parkfield, about the long-term probability of the Parkfield mainshock, and about the estimation of these types of probabilities. The probabilities for potential foreshocks at Parkfield are reexamined and revised in light of these advances. As part of this process, we have confirmed both the rate of foreshocks before strike-slip earthquakes in the San Andreas physiographic province and the uniform distribution of foreshocks with magnitude proposed by earlier studies. Compared to the earlier assessment, these new estimates of the long-term probability of the Parkfield mainshock are lower, our estimate of the rate of background seismicity is higher, and we find that the assumption that foreshocks at Parkfield occur in a unique way is not statistically significant at the 95% confidence level. While the exact numbers vary depending on the assumptions that are made, the new alert probabilities are lower than previously estimated. Considering the various assumptions and the statistical uncertainties in the input parameters, we also compute a plausible range for the probabilities. The range is large, partly due to the extra knowledge that exists for the Parkfield segment, making us question the usefulness of these numbers.

  17. Probability mapping of contaminants

    SciTech Connect

    Rautman, C.A.; Kaplan, P.G.; McGraw, M.A.; Istok, J.D.; Sigda, J.M.

    1994-04-01

    Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. The probability mapping approach illustrated in this paper appears to offer site operators a reasonable, quantitative methodology for many environmental remediation decisions and allows evaluation of the risk associated with those decisions. For example, output from this approach can be used in quantitative, cost-based decision models for evaluating possible site characterization and/or remediation plans, resulting in selection of the risk-adjusted, least-cost alternative. The methodology is completely general, and the techniques are applicable to a wide variety of environmental restoration projects. The probability-mapping approach is illustrated by application to a contaminated site at the former DOE Feed Materials Production Center near Fernald, Ohio. Soil geochemical data, collected as part of the Uranium-in-Soils Integrated Demonstration Project, have been used to construct a number of geostatistical simulations of potential contamination for parcels approximately the size of a selective remediation unit (the 3-m width of a bulldozer blade). Each such simulation accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination (potential clean-up or personnel-hazard thresholds).
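
    The post-processing step described here reduces to an indicator average over realizations; a minimal sketch with stand-in random fields (rather than conditioned geostatistical simulations) follows.

        import numpy as np

        # sims: equally likely realizations of contaminant concentration, one
        # map per row; real simulations would honor the measured sample values.
        rng = np.random.default_rng(0)
        sims = rng.lognormal(mean=1.0, sigma=0.5, size=(500, 40, 40))

        threshold = 5.0   # hypothetical clean-up threshold
        # Probability map: fraction of realizations exceeding the threshold
        # at each parcel.
        prob_exceed = (sims > threshold).mean(axis=0)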

  18. Measurement Uncertainty and Probability

    NASA Astrophysics Data System (ADS)

    Willink, Robin

    2013-02-01

    Part I. Principles: 1. Introduction; 2. Foundational ideas in measurement; 3. Components of error or uncertainty; 4. Foundational ideas in probability and statistics; 5. The randomization of systematic errors; 6. Beyond the standard confidence interval; Part II. Evaluation of Uncertainty: 7. Final preparation; 8. Evaluation using the linear approximation; 9. Evaluation without the linear approximations; 10. Uncertainty information fit for purpose; Part III. Related Topics: 11. Measurement of vectors and functions; 12. Why take part in a measurement comparison?; 13. Other philosophies; 14. An assessment of objective Bayesian methods; 15. A guide to the expression of uncertainty in measurement; 16. Measurement near a limit - an insoluble problem?; References; Index.

  19. Emptiness Formation Probability

    NASA Astrophysics Data System (ADS)

    Crawford, Nicholas; Ng, Stephen; Starr, Shannon

    2016-08-01

    We present rigorous upper and lower bounds on the emptiness formation probability for the ground state of a spin-1/2 Heisenberg XXZ quantum spin system. For a d-dimensional system we find a rate of decay of the order exp(-c L^(d+1)), where L is the side length of the box in which we ask for the emptiness formation event to occur. In the d = 1 case this confirms previous predictions made in the integrable systems community, though our bounds do not achieve the precision predicted by Bethe ansatz calculations. On the other hand, our bounds in the case d ≥ 2 are new. The main tools we use are reflection positivity and a rigorous path integral expansion, which is a variation on those previously introduced by Toth, Aizenman-Nachtergaele and Ueltschi.

  20. Familiarity and preference for pitch probability profiles.

    PubMed

    Cui, Anja-Xiaoxing; Collett, Meghan J; Troje, Niko F; Cuddy, Lola L

    2015-05-01

    We investigated the familiarity and preference judgments of participants toward a novel musical system. We exposed participants to tone sequences generated from a novel pitch probability profile. Afterward, in a two-alternative forced-choice task, we asked participants to identify either the more familiar or the preferred tone sequence. The task paired a tone sequence generated from the pitch probability profile they had been exposed to with a tone sequence generated from another pitch probability profile, at three levels of distinctiveness. We found that participants identified tone sequences as more familiar if they were generated from the same pitch probability profile to which they had been exposed. However, participants did not prefer these tone sequences. We interpret this relationship between familiarity and preference to be consistent with an inverted U-shaped relationship between knowledge and affect. The fact that participants identified tone sequences as even more familiar if they were generated from the more distinctive (caricatured) version of the pitch probability profile to which they had been exposed suggests that statistical learning of the pitch probability profile is involved in the acquisition of musical knowledge. PMID:25838257

  1. Pointwise probability reinforcements for robust statistical inference.

    PubMed

    Frénay, Benoît; Verleysen, Michel

    2014-02-01

    Statistical inference using machine learning techniques may be difficult with small datasets because of abnormally frequent data (AFDs). AFDs are observations that are much more frequent in the training sample than they should be with respect to their theoretical probability, and include, e.g., outliers. Estimates of parameters tend to be biased towards models which support such data. This paper proposes to introduce pointwise probability reinforcements (PPRs): the probability of each observation is reinforced by a PPR, and a regularisation allows controlling the amount of reinforcement, which compensates for AFDs. The proposed solution is very generic, since it can be used to robustify any statistical inference method which can be formulated as a likelihood maximisation. Experiments show that PPRs can easily be used to tackle regression, classification and projection: models are freed from the influence of outliers. Moreover, outliers can be filtered manually, since an abnormality degree is obtained for each observation. PMID:24300550

  2. A "Virtual Spin" on the Teaching of Probability

    ERIC Educational Resources Information Center

    Beck, Shari A.; Huse, Vanessa E.

    2007-01-01

    This article, which describes integrating virtual manipulatives with the teaching of probability at the elementary level, puts a "virtual spin" on the teaching of probability to provide more opportunities for students to experience successful learning. The traditional use of concrete manipulatives is enhanced with virtual coins and spinners from…

  3. Visualizing and Understanding Probability and Statistics: Graphical Simulations Using Excel

    ERIC Educational Resources Information Center

    Gordon, Sheldon P.; Gordon, Florence S.

    2009-01-01

    The authors describe a collection of dynamic interactive simulations for teaching and learning most of the important ideas and techniques of introductory statistics and probability. The modules cover such topics as randomness, simulations of probability experiments such as coin flipping, dice rolling and general binomial experiments, a simulation…

  4. A Tale of Two Probabilities

    ERIC Educational Resources Information Center

    Falk, Ruma; Kendig, Keith

    2013-01-01

    Two contestants debate the notorious probability problem of the sex of the second child. The conclusions boil down to explication of the underlying scenarios and assumptions. Basic principles of probability theory are highlighted.

  5. Laboratory-tutorial activities for teaching probability

    NASA Astrophysics Data System (ADS)

    Wittmann, Michael C.; Morgan, Jeffrey T.; Feeley, Roger E.

    2006-12-01

    We report on the development of students’ ideas of probability and probability density in a University of Maine laboratory-based general education physics course called Intuitive Quantum Physics. Students in the course are generally math phobic with unfavorable expectations about the nature of physics and their ability to do it. We describe a set of activities used to teach concepts of probability and probability density. Rudimentary knowledge of mechanics is needed for one activity, but otherwise the material requires no additional preparation. Extensions of the activities include relating probability density to potential energy graphs for certain “touchstone” examples. Students have difficulties learning the target concepts, such as comparing the ratio of time in a region to total time in all regions. Instead, they often focus on edge effects, pattern match to previously studied situations, reason about necessary but incomplete macroscopic elements of the system, use the gambler’s fallacy, and use expectations about ensemble results rather than expectation values to predict future events. We map the development of their thinking to provide examples of problems rather than evidence of a curriculum’s success.
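
    The target concept the authors mention (the ratio of time in a region to total time) can be checked numerically; the sketch below does so for a classical oscillator, where the probability of finding the object in a region is the fraction of the period spent there.

        import numpy as np

        # Classical probability of finding an oscillating cart in a region equals
        # the fraction of the period it spends there: P(region) = t_region / t_total.
        t = np.linspace(0.0, 2 * np.pi, 200_000)   # one period, unit frequency
        x = np.cos(t)                              # position of the oscillator

        in_region = np.abs(x) > 0.5                # outer half of the track
        print(in_region.mean())                    # about 2/3: slowest near x = ±1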

  6. The Probability of Causal Conditionals

    ERIC Educational Resources Information Center

    Over, David E.; Hadjichristidis, Constantinos; Evans, Jonathan St. B. T.; Handley, Simon J.; Sloman, Steven A.

    2007-01-01

    Conditionals in natural language are central to reasoning and decision making. A theoretical proposal called the Ramsey test implies the conditional probability hypothesis: that the subjective probability of a natural language conditional, P(if p then q), is the conditional subjective probability, P(q | p). We report three experiments on…

  7. Quantum probability and many worlds

    NASA Astrophysics Data System (ADS)

    Hemmo, Meir; Pitowsky, Itamar

    We discuss the meaning of probabilities in the many worlds interpretation of quantum mechanics. We start by presenting very briefly the many worlds theory, how the problem of probability arises, and some unsuccessful attempts to solve it in the past. Then we criticize a recent attempt by Deutsch to derive the quantum mechanical probabilities from the non-probabilistic parts of quantum mechanics and classical decision theory. We further argue that the Born probability does not make sense even as an additional probability rule in the many worlds theory. Our conclusion is that the many worlds theory fails to account for the probabilistic statements of standard (collapse) quantum mechanics.

  8. Unders and Overs: Using a Dice Game to Illustrate Basic Probability Concepts

    ERIC Educational Resources Information Center

    McPherson, Sandra Hanson

    2015-01-01

    In this paper, the dice game "Unders and Overs" is described and presented as an active learning exercise to introduce basic probability concepts. The implementation of the exercise is outlined, and the resulting presentation of various probability concepts is described.
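
    Rules and payouts vary by classroom, but under one common version (even money on "under 7" and "over 7", 4:1 on exactly 7) a quick simulation shows all three bets share the same house edge.

        import random

        # Bet on the sum of two dice: under 7 (pays 1:1), over 7 (pays 1:1),
        # or exactly 7 (pays 4:1). Returns the profit on a $1 bet.
        def play(bet, rng):
            total = rng.randint(1, 6) + rng.randint(1, 6)
            if bet == "under" and total < 7:  return 1
            if bet == "over" and total > 7:   return 1
            if bet == "seven" and total == 7: return 4
            return -1

        rng = random.Random(0)
        for bet in ("under", "over", "seven"):
            mean = sum(play(bet, rng) for _ in range(200_000)) / 200_000
            print(bet, round(mean, 3))   # all near the house edge of -1/6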

  9. Propensity, Probability, and Quantum Theory

    NASA Astrophysics Data System (ADS)

    Ballentine, Leslie E.

    2016-08-01

    Quantum mechanics and probability theory share one peculiarity. Both have well established mathematical formalisms, yet both are subject to controversy about the meaning and interpretation of their basic concepts. Since probability plays a fundamental role in QM, the conceptual problems of one theory can affect the other. We first classify the interpretations of probability into three major classes: (a) inferential probability, (b) ensemble probability, and (c) propensity. Class (a) is the basis of inductive logic; (b) deals with the frequencies of events in repeatable experiments; (c) describes a form of causality that is weaker than determinism. An important, but neglected, paper by P. Humphreys demonstrated that propensity must differ mathematically, as well as conceptually, from probability, but he did not develop a theory of propensity. Such a theory is developed in this paper. Propensity theory shares many, but not all, of the axioms of probability theory. As a consequence, propensity supports the Law of Large Numbers from probability theory, but does not support Bayes theorem. Although there are particular problems within QM to which any of the classes of probability may be applied, it is argued that the intrinsic quantum probabilities (calculated from a state vector or density matrix) are most naturally interpreted as quantum propensities. This does not alter the familiar statistical interpretation of QM. But the interpretation of quantum states as representing knowledge is untenable. Examples show that a density matrix fails to represent knowledge.

  10. PROBABILITY SURVEYS, CONDITIONAL PROBABILITIES, AND ECOLOGICAL RISK ASSESSMENT

    EPA Science Inventory

    We show that probability-based environmental resource monitoring programs, such as U.S. Environmental Protection Agency's (U.S. EPA) Environmental Monitoring and Asscssment Program EMAP) can be analyzed with a conditional probability analysis (CPA) to conduct quantitative probabi...

  11. Probability Surveys, Conditional Probability, and Ecological Risk Assessment

    EPA Science Inventory

    We show that probability-based environmental resource monitoring programs, such as the U.S. Environmental Protection Agency’s (U.S. EPA) Environmental Monitoring and Assessment Program, and conditional probability analysis can serve as a basis for estimating ecological risk over ...

  12. The relationship between species detection probability and local extinction probability

    USGS Publications Warehouse

    Alpizar-Jara, R.; Nichols, J.D.; Hines, J.E.; Sauer, J.R.; Pollock, K.H.; Rosenberry, C.S.

    2004-01-01

    In community-level ecological studies, generally not all species present in sampled areas are detected. Many authors have proposed the use of estimation methods that allow detection probabilities that are <1 and that are heterogeneous among species. These methods can also be used to estimate community-dynamic parameters such as species local extinction probability and turnover rates (Nichols et al. Ecol Appl 8:1213-1225; Conserv Biol 12:1390-1398). Here, we present an ad hoc approach to estimating community-level vital rates in the presence of joint heterogeneity of detection probabilities and vital rates. The method consists of partitioning the number of species into two groups using the detection frequencies and then estimating vital rates (e.g., local extinction probabilities) for each group. Estimators from each group are combined in a weighted estimator of vital rates that accounts for the effect of heterogeneity. Using data from the North American Breeding Bird Survey, we computed such estimates and tested the hypothesis that detection probabilities and local extinction probabilities were negatively related. Our analyses support the hypothesis that species detection probability covaries negatively with local probability of extinction and turnover rates. A simulation study was conducted to assess the performance of vital parameter estimators as well as other estimators relevant to questions about heterogeneity, such as coefficient of variation of detection probabilities and proportion of species in each group. Both the weighted estimator suggested in this paper and the original unweighted estimator for local extinction probability performed fairly well and provided no basis for preferring one to the other.

  13. Updating: Learning versus Supposing

    ERIC Educational Resources Information Center

    Zhao, Jiaying; Crupi, Vincenzo; Tentori, Katya; Fitelson, Branden; Osherson, Daniel

    2012-01-01

    Bayesian orthodoxy posits a tight relationship between conditional probability and updating. Namely, the probability of an event "A" after learning "B" should equal the conditional probability of "A" given "B" prior to learning "B". We examine whether ordinary judgment conforms to the orthodox view. In three experiments we found substantial…

  14. The Probabilities of Conditionals Revisited

    ERIC Educational Resources Information Center

    Douven, Igor; Verbrugge, Sara

    2013-01-01

    According to what is now commonly referred to as "the Equation" in the literature on indicative conditionals, the probability of any indicative conditional equals the probability of its consequent of the conditional given the antecedent of the conditional. Philosophers widely agree in their assessment that the triviality arguments of…

  15. Minimizing the probable maximum flood

    SciTech Connect

    Woodbury, M.S.; Pansic, N. ); Eberlein, D.T. )

    1994-06-01

    This article examines Wisconsin Electric Power Company's efforts to determine an economical way to comply with Federal Energy Regulatory Commission requirements at two hydroelectric developments on the Michigamme River. Their efforts included refinement of the area's probable maximum flood model based, in part, on a newly developed probable maximum precipitation estimate.

  16. Decision analysis with approximate probabilities

    NASA Technical Reports Server (NTRS)

    Whalen, Thomas

    1992-01-01

    This paper concerns decisions under uncertainty in which the probabilities of the states of nature are only approximately known. Decision problems involving three states of nature are studied. This is due to the fact that some key issues do not arise in two-state problems, while probability spaces with more than three states of nature are essentially impossible to graph. The primary focus is on two levels of probabilistic information. In one level, the three probabilities are separately rounded to the nearest tenth. This can lead to sets of rounded probabilities which add up to 0.9, 1.0, or 1.1. In the other level, probabilities are rounded to the nearest tenth in such a way that the rounded probabilities are forced to sum to 1.0. For comparison, six additional levels of probabilistic information, previously analyzed, were also included in the present analysis. A simulation experiment compared four criteria for decisionmaking using linearly constrained probabilities (Maximin, Midpoint, Standard Laplace, and Extended Laplace) under the eight different levels of information about probability. The Extended Laplace criterion, which uses a second order maximum entropy principle, performed best overall.

  17. Probability of sea level rise

    SciTech Connect

    Titus, J.G.; Narayanan, V.K.

    1995-10-01

    The report develops probability-based projections that can be added to local tide-gage trends to estimate future sea level at particular locations. It uses the same models employed by previous assessments of sea level rise. The key coefficients in those models are based on subjective probability distributions supplied by a cross-section of climatologists, oceanographers, and glaciologists.

  18. Computation of Most Probable Numbers

    PubMed Central

    Russek, Estelle; Colwell, Rita R.

    1983-01-01

    A rapid computational method for maximum likelihood estimation of most-probable-number values, incorporating a modified Newton-Raphson method, is presented. The method offers a much greater reliability for the most-probable-number estimate of total viable bacteria, i.e., those capable of growth in laboratory media. PMID:6870242
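
    A minimal Newton-Raphson solver for the most-probable-number likelihood is sketched below (the paper's modified method is not reproduced here). Positive tube counts, tube numbers, and inoculum volumes are given per dilution, and the example figures are hypothetical.

        import numpy as np

        def mpn_mle(positive, tubes, volume, iters=50):
            # Maximum likelihood MPN (organisms per mL) for a dilution series,
            # via Newton-Raphson on the log-likelihood score.
            x, n, v = map(np.asarray, (positive, tubes, volume))
            lam = 1.0                                 # starting guess
            for _ in range(iters):
                e = np.exp(-lam * v)
                score = np.sum(x * v * e / (1 - e) - (n - x) * v)
                d_score = -np.sum(x * v**2 * e / (1 - e) ** 2)
                lam -= score / d_score
            return lam

        # 3-tube series at 10, 1, and 0.1 mL with 3, 1, 1 positive tubes.
        print(mpn_mle([3, 1, 1], [3, 3, 3], [10.0, 1.0, 0.1]))   # about 0.75/mL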

  19. The probabilities of unique events.

    PubMed

    Khemlani, Sangeet S; Lotstein, Max; Johnson-Laird, Phil

    2012-01-01

    Many theorists argue that the probabilities of unique events, even real possibilities such as President Obama's re-election, are meaningless. As a consequence, psychologists have seldom investigated them. We propose a new theory (implemented in a computer program) in which such estimates depend on an intuitive non-numerical system capable only of simple procedures, and a deliberative system that maps intuitions into numbers. The theory predicts that estimates of the probabilities of conjunctions should often tend to split the difference between the probabilities of the two conjuncts. We report two experiments showing that individuals commit such violations of the probability calculus, and corroborating other predictions of the theory, e.g., individuals err in the same way even when they make non-numerical verbal estimates, such as that an event is highly improbable. PMID:23056224

  20. Transition probabilities of Br II

    NASA Technical Reports Server (NTRS)

    Bengtson, R. D.; Miller, M. H.

    1976-01-01

    Absolute transition probabilities of the three most prominent visible Br II lines are measured in emission. Results compare well with Coulomb approximations and with line strengths extrapolated from trends in homologous atoms.

  2. VESPA: False positive probabilities calculator

    NASA Astrophysics Data System (ADS)

    Morton, Timothy D.

    2015-03-01

    Validation of Exoplanet Signals using a Probabilistic Algorithm (VESPA) calculates false positive probabilities and statistically validates transiting exoplanets. Written in Python, it uses isochrones [ascl:1503.010] and the package simpledist.

  3. Dinosaurs, Dinosaur Eggs, and Probability.

    ERIC Educational Resources Information Center

    Teppo, Anne R.; Hodgson, Ted

    2001-01-01

    Outlines several recommendations for teaching probability in the secondary school. Offers an activity that employs simulation by hand and using a programmable calculator in which geometry, analytical geometry, and discrete mathematics are explored. (KHR)

  4. A Quantum Probability Model of Causal Reasoning

    PubMed Central

    Trueblood, Jennifer S.; Busemeyer, Jerome R.

    2012-01-01

    People can often outperform statistical methods and machine learning algorithms in situations that involve making inferences about the relationship between causes and effects. While people are remarkably good at causal reasoning in many situations, there are several instances where they deviate from expected responses. This paper examines three situations where judgments related to causal inference problems produce unexpected results and describes a quantum inference model, based on the axiomatic principles of quantum probability theory, that can explain these effects. Two of the three phenomena arise from the comparison of predictive judgments (i.e., the conditional probability of an effect given a cause) with diagnostic judgments (i.e., the conditional probability of a cause given an effect). The third phenomenon is a new finding examining order effects in predictive causal judgments. The quantum inference model uses the notion of incompatibility among different causes to account for all three phenomena. Psychologically, the model assumes that individuals adopt different points of view when thinking about different causes. The model provides good fits to the data and offers a coherent account for all three causal reasoning effects, thus proving to be a viable new candidate for modeling human judgment. PMID:22593747
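
    A toy numerical illustration of the key mechanism (my own two-dimensional example, not the authors' model): when two judgments correspond to non-commuting projectors, the probability of affirming both depends on the order in which they are considered.

    ```python
    import numpy as np

    def projector(theta):
        """Rank-1 projector onto the direction (cos theta, sin theta)."""
        v = np.array([np.cos(theta), np.sin(theta)])
        return np.outer(v, v)

    psi = np.array([1.0, 0.0])          # initial belief state
    P_a = projector(np.pi / 4)          # "cause" question (hypothetical)
    P_b = projector(np.pi / 3)          # "effect" question (hypothetical)

    p_ab = np.linalg.norm(P_b @ P_a @ psi) ** 2   # answer A first, then B
    p_ba = np.linalg.norm(P_a @ P_b @ psi) ** 2   # answer B first, then A

    print(p_ab, p_ba)   # unequal, because P_a and P_b do not commute
    ```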

  5. Joint probabilities and quantum cognition

    SciTech Connect

    Acacio de Barros, J.

    2012-12-18

    In this paper we discuss the existence of joint probability distributions for quantumlike response computations in the brain. We do so by focusing on a contextual neural-oscillator model shown to reproduce the main features of behavioral stimulus-response theory. We then exhibit a simple example of contextual random variables not having a joint probability distribution, and describe how such variables can be obtained from neural oscillators, but not from a quantum observable algebra.
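
    A small feasibility check in the same spirit (the pairwise correlations are my example, not the paper's): three ±1 contextual variables with all pairwise correlations equal to -1 admit no joint probability distribution, which a linear program over the eight assignments confirms.

    ```python
    import itertools
    import numpy as np
    from scipy.optimize import linprog

    outcomes = list(itertools.product([-1, 1], repeat=3))

    A_eq = [[1.0] * 8,                        # probabilities sum to 1
            [x * y for x, y, z in outcomes],  # E[XY]
            [y * z for x, y, z in outcomes],  # E[YZ]
            [x * z for x, y, z in outcomes]]  # E[XZ]
    b_eq = [1.0, -1.0, -1.0, -1.0]            # perfectly anticorrelated pairs

    res = linprog(c=np.zeros(8), A_eq=A_eq, b_eq=b_eq, bounds=[(0, 1)] * 8)
    print("joint distribution exists:", res.success)   # False: no joint PMF
    ```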

  6. Joint probabilities and quantum cognition

    NASA Astrophysics Data System (ADS)

    de Barros, J. Acacio

    2012-12-01

    In this paper we discuss the existence of joint probability distributions for quantumlike response computations in the brain. We do so by focusing on a contextual neural-oscillator model shown to reproduce the main features of behavioral stimulus-response theory. We then exhibit a simple example of contextual random variables not having a joint probability distribution, and describe how such variables can be obtained from neural oscillators, but not from a quantum observable algebra.

  7. Evaluation of microbial release probabilities

    NASA Technical Reports Server (NTRS)

    1972-01-01

    Work undertaken to improve the estimation of the probability of release of microorganisms from unmanned Martian landing spacecraft is summarized. An analytical model is described for the development of numerical values for release parameters and release mechanisms applicable to flight missions are defined. Laboratory test data are used to evolve parameter values for use by flight projects in estimating numerical values for release probabilities. The analysis treats microbial burden located on spacecraft surfaces, between mated surfaces, and encapsulated within materials.

  8. Joint probability distributions for projection probabilities of random orthonormal states

    NASA Astrophysics Data System (ADS)

    Alonso, L.; Gorin, T.

    2016-04-01

    The quantum chaos conjecture applied to a finite dimensional quantum system implies that such a system has eigenstates whose statistical properties are similar to those of the column vectors of random orthogonal or unitary matrices. Here, we consider the different probabilities for obtaining a specific outcome in a projective measurement, provided the system is in one of its eigenstates. We then give analytic expressions for the joint probability density of these probabilities, with respect to the ensemble of random matrices. In the case of the unitary group, our results can also be applied to the phenomenon of universal conductance fluctuations, where the same mathematical quantities describe partial conductances in a two-terminal mesoscopic scattering problem with a finite number of modes in each terminal.
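
    A quick numerical companion (my own sanity check, not the paper's derivation): sampling Haar-random unitary columns and inspecting the projection probabilities, whose single-component marginal should follow a Beta(1, N-1) density with mean 1/N.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    N, samples = 8, 20000
    probs = np.empty((samples, N))

    for k in range(samples):
        ginibre = rng.normal(size=(N, N)) + 1j * rng.normal(size=(N, N))
        q, r = np.linalg.qr(ginibre)
        q = q * (np.diagonal(r) / np.abs(np.diagonal(r)))  # fix phases -> Haar measure
        psi = q[:, 0]                                      # one random orthonormal state
        probs[k] = np.abs(psi) ** 2                        # projection probabilities

    print(probs[:, 0].mean(), 1.0 / N)   # sample mean vs. Beta(1, N-1) mean
    ```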

  9. Can Personality Type Explain Heterogeneity in Probability Distortions?

    PubMed Central

    Capra, C. Monica; Jiang, Bing; Engelmann, Jan B.; Berns, Gregory S.

    2014-01-01

    There are two regularities we have learned from experimental studies of choice under risk. The first is that the majority of people weigh objective probabilities non-linearly. The second regularity, although less commonly acknowledged, is that there is a large amount of heterogeneity in how people distort probabilities. Despite of this, little effort has been made to identify the source of heterogeneity. In this paper, we explore the possibility that personality type is linked to probability distortions. Using validated psychological questionnaires, we clustered participants into distinct personality types: motivated, impulsive, and affective. We found that the motivated viewed gambling more attractive, whereas the impulsive were the most capable of discriminating non-extreme probabilities. Our results suggest that the observed heterogeneity in probability distortions may be explained by personality profiles, which can be elicited though standard psychological questionnaires. PMID:24639891

  10. Imprecise probabilities in engineering analyses

    NASA Astrophysics Data System (ADS)

    Beer, Michael; Ferson, Scott; Kreinovich, Vladik

    2013-05-01

    Probabilistic uncertainty and imprecision in structural parameters and in environmental conditions and loads are challenging phenomena in engineering analyses. They require appropriate mathematical modeling and quantification to obtain realistic results when predicting the behavior and reliability of engineering structures and systems. But modeling and quantification are complicated by the characteristics of the available information, which may involve sparse data, poor measurements, and subjective information. This raises the question of whether the available information is sufficient for probabilistic modeling or rather suggests a set-theoretical approach. The framework of imprecise probabilities provides a mathematical basis to deal with these problems, which involve both probabilistic and non-probabilistic information. A common feature of the various concepts of imprecise probabilities is the consideration of an entire set of probabilistic models in one analysis. The theoretical differences between the concepts mainly concern the mathematical description of the set of probabilistic models and the connection to the probabilistic models involved. This paper provides an overview of developments which involve imprecise probabilities for the solution of engineering problems. Evidence theory, probability bounds analysis with p-boxes, and fuzzy probabilities are discussed with emphasis on their key features and on their relationships to one another. This paper was especially prepared for this special issue and reflects, in various ways, the thinking and presentation preferences of the authors, who are also the guest editors for this special issue.
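
    A minimal sketch of interval-valued (imprecise) probability propagation, assuming two independent components with interval failure probabilities; the numbers are invented for illustration:

    ```python
    def series_failure(bounds_a, bounds_b):
        """Bounds on P(A or B) for independent events with interval probabilities."""
        lo = 1.0 - (1.0 - bounds_a[0]) * (1.0 - bounds_b[0])
        hi = 1.0 - (1.0 - bounds_a[1]) * (1.0 - bounds_b[1])
        return (lo, hi)

    p_a = (0.01, 0.05)   # sparse data -> wide interval (hypothetical)
    p_b = (0.02, 0.03)   # better data -> tighter interval (hypothetical)

    # The system-level answer is itself an interval, not a single number:
    print(series_failure(p_a, p_b))   # (0.0298, 0.0785)
    ```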

  11. Measure and probability in cosmology

    NASA Astrophysics Data System (ADS)

    Schiffrin, Joshua S.; Wald, Robert M.

    2012-07-01

    General relativity has a Hamiltonian formulation, which formally provides a canonical (Liouville) measure on the space of solutions. In ordinary statistical physics, the Liouville measure is used to compute probabilities of macrostates, and it would seem natural to use the similar measure arising in general relativity to compute probabilities in cosmology, such as the probability that the Universe underwent an era of inflation. Indeed, a number of authors have used the restriction of this measure to the space of homogeneous and isotropic universes with scalar field matter (minisuperspace)—namely, the Gibbons-Hawking-Stewart measure—to make arguments about the likelihood of inflation. We argue here that there are at least four major difficulties with using the measure of general relativity to make probability arguments in cosmology: (1) Equilibration does not occur on cosmological length scales. (2) Even in the minisuperspace case, the measure of phase space is infinite and the computation of probabilities depends very strongly on how the infinity is regulated. (3) The inhomogeneous degrees of freedom must be taken into account (we illustrate how) even if one is interested only in universes that are very nearly homogeneous. The measure depends upon how the infinite number of degrees of freedom are truncated, and how one defines “nearly homogeneous.” (4) In a Universe where the second law of thermodynamics holds, one cannot make use of our knowledge of the present state of the Universe to retrodict the likelihood of past conditions.

  12. Flood hazard probability mapping method

    NASA Astrophysics Data System (ADS)

    Kalantari, Zahra; Lyon, Steve; Folkeson, Lennart

    2015-04-01

    In Sweden, spatially explicit approaches have been applied in various disciplines, such as landslide modelling based on soil type data and flood risk modelling for large rivers. Regarding flood mapping, most previous studies have focused on complex hydrological modelling on a small scale, whereas only a few studies have used a robust GIS-based approach integrating most physical catchment descriptor (PCD) aspects on a larger scale. The aim of the present study was to develop a methodology for predicting the spatial probability of flooding on a general large scale. Factors such as topography, land use, soil data and other PCDs were analysed in terms of their relative importance for flood generation. The specific objective was to test the methodology using statistical methods to identify factors having a significant role in controlling flooding. A second objective was to generate an index quantifying a flood probability value for each cell, based on different weighted factors, in order to provide a more accurate analysis of potential high flood hazards than can be obtained using a single variable. The ability of indicator covariance to capture flooding probability was determined for different watersheds in central Sweden. Using data from this initial investigation, a method to extract spatial data for multiple catchments and to produce soft data for statistical analysis was developed. It allowed flood probability to be predicted from spatially sparse data without compromising the significant hydrological features of the landscape. By using PCD data, realistic representations of high-probability flood regions were made, regardless of the magnitude of rain events. This in turn allowed objective quantification of the probability of floods at the field scale for future model development and watershed management.
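
    A schematic of the weighted-index step (the factor names, weights, and toy raster are hypothetical, not the study's calibrated values):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    shape = (4, 4)                            # toy raster of cells

    factors = {                               # normalized PCD layers in [0, 1)
        "topographic_wetness": rng.random(shape),
        "soil_impermeability": rng.random(shape),
        "urban_land_use":      rng.random(shape),
    }
    weights = {"topographic_wetness": 0.5,    # weights sum to 1 (illustrative)
               "soil_impermeability": 0.3,
               "urban_land_use":      0.2}

    index = sum(weights[name] * grid for name, grid in factors.items())
    print(index.round(2))                     # per-cell flood probability index
    ```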

  13. Interference of probabilities in dynamics

    SciTech Connect

    Zak, Michail

    2014-08-15

    A new class of dynamical systems with a preset type of interference of probabilities is introduced. It is obtained from the extension of the Madelung equation by replacing the quantum potential with a specially selected feedback from the Liouville equation. It has been proved that these systems are different from both Newtonian and quantum systems, but they can be useful for modeling spontaneous collective novelty phenomena when emerging outputs are qualitatively different from the weighted sum of individual inputs. Formation of language and fast decision-making process as potential applications of the probability interference is discussed.

  14. Probability as a Physical Motive

    NASA Astrophysics Data System (ADS)

    Martin, Peter

    2007-06-01

    Recent theoretical progress in nonequilibrium thermodynamics, linking the physical principle of Maximum Entropy Production ("MEP") to the information-theoretical "MaxEnt" principle of scientific inference, together with conjectures from theoretical physics that there may be no fundamental causal laws but only probabilities for physical processes, and from evolutionary theory that biological systems expand "the adjacent possible" as rapidly as possible, all lend credence to the proposition that probability should be recognized as a fundamental physical motive. It is further proposed that spatial order and temporal order are two aspects of the same thing, and that this is the essence of the second law of thermodynamics.

  15. Knowledge typology for imprecise probabilities.

    SciTech Connect

    Wilson, G. D.; Zucker, L. J.

    2002-01-01

    When characterizing the reliability of a complex system there are often gaps in the data available for specific subsystems or other factors influencing total system reliability. At Los Alamos National Laboratory we employ ethnographic methods to elicit expert knowledge when traditional data is scarce. Typically, we elicit expert knowledge in probabilistic terms. This paper will explore how we might approach elicitation if methods other than probability (i.e., Dempster-Shafer, or fuzzy sets) prove more useful for quantifying certain types of expert knowledge. Specifically, we will consider if experts have different types of knowledge that may be better characterized in ways other than standard probability theory.
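
    A tiny Dempster-Shafer example of the kind of non-probabilistic elicitation discussed above (the masses are invented). Mass is assigned to sets of hypotheses; belief and plausibility then bracket the "probability" of an event from below and above.

    ```python
    masses = {
        frozenset({"works"}):          0.5,
        frozenset({"fails"}):          0.2,
        frozenset({"works", "fails"}): 0.3,   # ignorance the expert cannot split
    }

    def belief(event):
        """Total mass committed to subsets of the event (lower bound)."""
        return sum(m for s, m in masses.items() if s <= event)

    def plausibility(event):
        """Total mass not contradicting the event (upper bound)."""
        return sum(m for s, m in masses.items() if s & event)

    works = frozenset({"works"})
    print(belief(works), plausibility(works))   # 0.5 0.8 -> interval [0.5, 0.8]
    ```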

  16. Genetics as a Context for the Study of Probability.

    ERIC Educational Resources Information Center

    Brahier, Daniel J.

    1999-01-01

    Presents an activity in which students develop a handy strategy for exploring probability as applied in the life sciences by learning to use the Punnett square, an effective tool that helps students visualize sample spaces and determine the likelihood of events. (ASK)
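
    A short sketch of the Punnett-square calculation such an activity builds on (a standard monohybrid cross, not taken from the article itself):

    ```python
    from collections import Counter
    from itertools import product

    def punnett(parent1, parent2):
        """Genotype probabilities for offspring of two parents, e.g. 'Aa' x 'Aa'."""
        counts = Counter("".join(sorted(a + b)) for a, b in product(parent1, parent2))
        total = sum(counts.values())
        return {genotype: c / total for genotype, c in counts.items()}

    print(punnett("Aa", "Aa"))   # {'AA': 0.25, 'Aa': 0.5, 'aa': 0.25}
    ```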

  17. Teaching Probability with the Support of the R Statistical Software

    ERIC Educational Resources Information Center

    dos Santos Ferreira, Robson; Kataoka, Verônica Yumi; Karrer, Monica

    2014-01-01

    The objective of this paper is to discuss aspects of high school students' learning of probability in a context where they are supported by the statistical software R. We report on the application of a teaching experiment, constructed using the perspective of Gal's probabilistic literacy and Papert's constructionism. The results…

  18. Stretching Probability Explorations with Geoboards

    ERIC Educational Resources Information Center

    Wheeler, Ann; Champion, Joe

    2016-01-01

    Students are faced with many transitions in their middle school mathematics classes. To build knowledge, skills, and confidence in the key areas of algebra and geometry, students often need to practice using numbers and polygons in a variety of contexts. Teachers also want students to explore ideas from probability and statistics. Teachers know…

  19. ESTIMATION OF AGE TRANSITION PROBABILITIES.

    ERIC Educational Resources Information Center

    ZINTER, JUDITH R.

    THIS NOTE DESCRIBES THE PROCEDURES USED IN DETERMINING DYNAMOD II AGE TRANSITION MATRICES. A SEPARATE MATRIX FOR EACH SEX-RACE GROUP IS DEVELOPED. THESE MATRICES WILL BE USED AS AN AID IN ESTIMATING THE TRANSITION PROBABILITIES IN THE LARGER DYNAMOD II MATRIX RELATING AGE TO OCCUPATIONAL CATEGORIES. THREE STEPS WERE USED IN THE PROCEDURE--(1)…

  20. Probability Simulation in Middle School.

    ERIC Educational Resources Information Center

    Lappan, Glenda; Winter, M. J.

    1980-01-01

    Two simulations designed to teach probability to middle-school age pupils are presented. The first simulates the one-on-one foul shot simulation in basketball; the second deals with collecting a set of six cereal box prizes by buying boxes containing one toy each. (MP)
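
    A sketch of the first simulation in code (the shooting percentage and trial count are mine): in a one-and-one foul shot, the second free throw is attempted only if the first is made, so the player scores 0, 1, or 2 points.

    ```python
    import random

    def one_and_one(p, trials=100_000, seed=42):
        rng = random.Random(seed)
        counts = [0, 0, 0]                    # points scored: 0, 1, 2
        for _ in range(trials):
            if rng.random() < p:              # first shot made
                if rng.random() < p:          # bonus shot made
                    counts[2] += 1
                else:
                    counts[1] += 1
            else:
                counts[0] += 1
        return [c / trials for c in counts]

    # A 70% shooter most often scores 0 or 2, not 1 -- the surprise that makes
    # the simulation worthwhile; exact probabilities are 0.30, 0.21, 0.49.
    print(one_and_one(0.7))
    ```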

  1. Some Surprising Probabilities from Bingo.

    ERIC Educational Resources Information Center

    Mercer, Joseph O.

    1993-01-01

    Investigates the probability of winning the largest prize at Bingo through a series of five simpler problems. Investigations are conducted with the aid of either BASIC computer programs, spreadsheets, or a computer algebra system such as Mathematica. Provides sample data tables to illustrate findings. (MDH)

  2. GPS: Geometry, Probability, and Statistics

    ERIC Educational Resources Information Center

    Field, Mike

    2012-01-01

    It might be said that for most occupations there is now less of a need for mathematics than there was say fifty years ago. But, the author argues, geometry, probability, and statistics constitute essential knowledge for everyone. Maybe not the geometry of Euclid, but certainly geometrical ways of thinking that might enable us to describe the world…

  3. Conditional Independence in Applied Probability.

    ERIC Educational Resources Information Center

    Pfeiffer, Paul E.

    This material assumes the user has the background provided by a good undergraduate course in applied probability. It is felt that introductory courses in calculus, linear algebra, and perhaps some differential equations should provide the requisite experience and proficiency with mathematical concepts, notation, and argument. The document is…

  4. Dynamic SEP event probability forecasts

    NASA Astrophysics Data System (ADS)

    Kahler, S. W.; Ling, A.

    2015-10-01

    The forecasting of solar energetic particle (SEP) event probabilities at Earth has been based primarily on the estimates of magnetic free energy in active regions and on the observations of peak fluxes and fluences of large (≥ M2) solar X-ray flares. These forecasts are typically issued for the next 24 h or with no definite expiration time, which can be deficient for time-critical operations when no SEP event appears following a large X-ray flare. It is therefore important to decrease the event probability forecast with time as a SEP event fails to appear. We use the NOAA listing of major (≥10 pfu) SEP events from 1976 to 2014 to plot the delay times from X-ray peaks to SEP threshold onsets as a function of solar source longitude. An algorithm is derived to decrease the SEP event probabilities with time when no event is observed to reach the 10 pfu threshold. In addition, we use known SEP event size distributions to modify probability forecasts when SEP intensity increases occur below the 10 pfu event threshold. An algorithm to provide a dynamic SEP event forecast, Pd, for both situations of SEP intensities following a large flare is derived.
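
    A hedged sketch of the dynamic-discounting idea (the exponential delay-time distribution and its 12 h mean are assumptions for illustration; the paper derives its algorithm from the NOAA delay-time data): as hours pass after the X-ray peak with no SEP onset, Bayes' rule discounts the initial event probability.

    ```python
    import math

    def dynamic_sep_probability(p0, hours_elapsed, mean_delay_h=12.0):
        """P(SEP event | none observed yet), assuming exponential onset delays."""
        survival = math.exp(-hours_elapsed / mean_delay_h)   # P(onset later | event)
        return p0 * survival / (p0 * survival + (1.0 - p0))

    p0 = 0.6   # hypothetical forecast issued right after a large flare
    for t in (0, 6, 12, 24, 48):
        print(f"{t:2d} h after flare: P = {dynamic_sep_probability(p0, t):.2f}")
    ```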

  5. Probability densities in strong turbulence

    NASA Astrophysics Data System (ADS)

    Yakhot, Victor

    2006-03-01

    In this work, using Mellin's transform combined with the Gaussian large-scale boundary condition, we calculate the probability densities (PDFs) of velocity increments P(δu,r), of velocity derivatives P(∂u/∂x), and of the fluctuating dissipation scales Q(η,Re), where Re is the large-scale Reynolds number. The resulting expressions strongly deviate from the log-normal PDF often quoted in the literature. It is shown that the probability density of the small-scale velocity fluctuations includes information about the large (integral) scale dynamics, which is responsible for the deviation of P(δu,r) from Gaussian statistics. An expression for the function D(h) of the multifractal theory, free from the spurious logarithms recently discussed in [U. Frisch, M. Martins Afonso, A. Mazzino, V. Yakhot, J. Fluid Mech. 542 (2005) 97], is also obtained.

  6. Probability, Information and Statistical Physics

    NASA Astrophysics Data System (ADS)

    Kuzemsky, A. L.

    2016-03-01

    In this short survey review we discuss foundational issues of the probabilistic approach to information theory and statistical mechanics from a unified standpoint. Emphasis is on the inter-relations between the theories. The basic aim is tutorial, i.e., to provide a basic introduction to the analysis and applications of probabilistic concepts in the description of various aspects of complexity and stochasticity. We consider probability as a foundational concept in statistical mechanics and review selected advances in the theoretical understanding of the interrelation of probability, information, and statistical description with regard to the basic notions of the statistical mechanics of complex systems. The survey also includes a synthesis of past and present research and an overview of methodology. The purpose of this terse overview is to discuss and partially describe those probabilistic methods and approaches that are used in statistical mechanics, with the aim of making these ideas easier to understand and to apply.

  7. A Lakatosian Encounter with Probability

    ERIC Educational Resources Information Center

    Chick, Helen

    2010-01-01

    There is much to be learned and pondered by reading "Proofs and Refutations" by Imre Lakatos (Lakatos, 1976). It highlights the importance of mathematical definitions, and how definitions evolve to capture the essence of the object they are defining. It also provides an exhilarating encounter with the ups and downs of the mathematical reasoning…

  8. Probability for primordial black holes

    NASA Astrophysics Data System (ADS)

    Bousso, R.; Hawking, S. W.

    1995-11-01

    We consider two quantum cosmological models with a massive scalar field: an ordinary Friedmann universe and a universe containing primordial black holes. For both models we discuss the complex solutions to the Euclidean Einstein equations. Using the probability measure obtained from the Hartle-Hawking no-boundary proposal we find that the only unsuppressed black holes start at the Planck size but can grow with the horizon scale during the roll down of the scalar field to the minimum.

  9. Relative transition probabilities of cobalt

    NASA Technical Reports Server (NTRS)

    Roig, R. A.; Miller, M. H.

    1974-01-01

    Results of determinations of neutral-cobalt transition probabilities measured relative to Co I 4150.43 A and Co II 4145.15 A, using a gas-driven shock tube as the spectroscopic light source. Results are presented for 139 Co I lines in the range from 3940 to 6640 A and 11 Co II lines in the range from 3840 to 4730 A, which are estimated to have reliabilities ranging from 8 to 50%.

  10. Probability for Weather and Climate

    NASA Astrophysics Data System (ADS)

    Smith, L. A.

    2013-12-01

    Over the last 60 years, the availability of large-scale electronic computers has stimulated rapid and significant advances both in meteorology and in our understanding of the Earth System as a whole. The speed of these advances was due, in large part, to the sudden ability to explore nonlinear systems of equations. The computer allows the meteorologist to carry a physical argument to its conclusion; the time scales of weather phenomena then allow the refinement of physical theory, numerical approximation or both in light of new observations. Prior to this extension, as Charney noted, the practicing meteorologist could ignore the results of theory with good conscience. Today, neither the practicing meteorologist nor the practicing climatologist can do so, but to what extent, and in what contexts, should they place the insights of theory above quantitative simulation? And in what circumstances can one confidently estimate the probability of events in the world from model-based simulations? Despite solid advances of theory and insight made possible by the computer, the fidelity of our models of climate differs in kind from the fidelity of models of weather. While all prediction is extrapolation in time, weather resembles interpolation in state space, while climate change is fundamentally an extrapolation. The trichotomy of simulation, observation and theory, which has proven essential in meteorology, will remain incomplete in climate science. Operationally, the roles of probability, indeed the kinds of probability one has access to, are different in operational weather forecasting and climate services. Significant barriers to forming probability forecasts (which can be used rationally as probabilities) are identified. Monte Carlo ensembles can explore sensitivity, diversity, and (sometimes) the likely impact of measurement uncertainty and structural model error. The aims of different ensemble strategies, and fundamental differences in ensemble design to support of

  11. Probability of Detection Demonstration Transferability

    NASA Technical Reports Server (NTRS)

    Parker, Bradford H.

    2008-01-01

    The ongoing Mars Science Laboratory (MSL) Propellant Tank Penetrant Nondestructive Evaluation (NDE) Probability of Detection (POD) Assessment (NESC activity) has surfaced several issues associated with liquid penetrant POD demonstration testing. This presentation lists factors that may influence the transferability of POD demonstration tests. Initial testing will address the liquid penetrant inspection technique. Some of the factors to be considered in this task are crack aspect ratio, the extent of the crack opening, the material and the distance between the inspection surface and the inspector's eye.

  12. Lectures on probability and statistics

    SciTech Connect

    Yost, G.P.

    1984-09-01

    These notes are based on a set of statistics lectures delivered at Imperial College to the first-year postgraduate students in High Energy Physics. They are designed for the professional experimental scientist. We begin with the fundamentals of probability theory, in which one makes statements about the set of possible outcomes of an experiment, based upon a complete a priori understanding of the experiment. For example, in a roll of a set of (fair) dice, one understands a priori that any given side of each die is equally likely to turn up. From that, we can calculate the probability of any specified outcome. We finish with the inverse problem, statistics. Here, one begins with a set of actual data (e.g., the outcomes of a number of rolls of the dice), and attempts to make inferences about the state of nature which gave those data (e.g., the likelihood of seeing any given side of any given die turn up). This is a much more difficult problem, of course, and one's solutions often turn out to be unsatisfactory in one respect or another.

  13. Measure and Probability in Cosmology

    NASA Astrophysics Data System (ADS)

    Schiffrin, Joshua; Wald, Robert

    2012-03-01

    General relativity has a Hamiltonian formulation, which formally provides a canonical (Liouville) measure on the space of solutions. A number of authors have used the restriction of this measure to the space of homogeneous and isotropic universes with scalar field matter (minisuperspace)---namely, the Gibbons-Hawking-Stewart measure---to make arguments about the likelihood of inflation. We argue here that there are at least four major difficulties with using the measure of general relativity to make probability arguments in cosmology: (1) Equilibration does not occur on cosmological length scales. (2) Even in the minisuperspace case, the measure of phase space is infinite and the computation of probabilities depends very strongly on how the infinity is regulated. (3) The inhomogeneous degrees of freedom must be taken into account even if one is interested only in universes that are very nearly homogeneous. The measure depends upon how the infinite number of degrees of freedom are truncated, and how one defines ``nearly homogeneous''. (4) In a universe where the second law of thermodynamics holds, one cannot make use of our knowledge of the present state of the universe to ``retrodict'' the likelihood of past conditions.

  14. MSPI False Indication Probability Simulations

    SciTech Connect

    Dana Kelly; Kurt Vedros; Robert Youngblood

    2011-03-01

    This paper examines false indication probabilities in the context of the Mitigating System Performance Index (MSPI), in order to investigate the pros and cons of different approaches to resolving two coupled issues: (1) sensitivity to the prior distribution used in calculating the Bayesian-corrected unreliability contribution to the MSPI, and (2) whether (in a particular plant configuration) to model the fuel oil transfer pump (FOTP) as a separate component, or integrally to its emergency diesel generator (EDG). False indication probabilities were calculated for the following situations: (1) all component reliability parameters at their baseline values, so that the true indication is green, meaning that an indication of white or above would be false positive; (2) one or more components degraded to the extent that the true indication would be (mid) white, and "false" would be green (negative) or yellow (negative) or red (negative). In key respects, this was the approach taken in NUREG-1753. The prior distributions examined were the constrained noninformative (CNI) prior used currently by the MSPI, a mixture of conjugate priors, the Jeffreys noninformative prior, a nonconjugate log(istic)-normal prior, and the minimally informative prior investigated in (Kelly et al., 2010). The mid-white performance state was set at ΔCDF = 1 × 10^-6/yr. For each simulated time history, a check is made of whether the calculated ΔCDF is above or below 10^-6/yr. If the parameters were at their baseline values, and ΔCDF > 10^-6/yr, this is counted as a false positive. Conversely, if one or all of the parameters are set to values corresponding to ΔCDF > 10^-6/yr but that time history's ΔCDF < 10^-6/yr, this is counted as a false negative indication. The false indication (positive or negative) probability is then estimated as the number of false positive or negative counts divided by the number of time histories (100,000). Results are presented for a set of base case parameter values
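
    A greatly simplified sketch of the false-positive counting logic (the demand count, failure rate, prior, and threshold are invented; the actual study simulates full plant models with several candidate priors):

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    histories = 100_000
    demands = 50                   # demands per monitoring window (hypothetical)
    p_true = 0.01                  # baseline failure probability (true green state)
    a, b = 0.5, 49.5               # Beta prior with mean 0.01, standing in for the CNI prior
    threshold = 0.02               # unreliability above which the indication turns white

    failures = rng.binomial(demands, p_true, size=histories)
    posterior_mean = (a + failures) / (a + b + demands)   # Bayesian-corrected estimate

    # True state is green throughout, so any white indication is a false positive:
    false_positive = np.mean(posterior_mean > threshold)
    print(f"false positive probability ~ {false_positive:.3f}")
    ```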

  15. Associativity and normative credal probability.

    PubMed

    Snow, P

    2002-01-01

    Cox's Theorem is a widely cited motivation for probabilistic models of uncertain belief. The theorem relates the associativity of the logical connectives to that of the arithmetic operations of probability. Recent questions about the correctness of Cox's Theorem have been resolved, but there are new questions about one functional equation used by Cox in 1946. This equation is missing from his later work. Advances in knowledge since 1946 and changes in Cox's research interests explain the equation's disappearance. Other associativity-based motivations avoid functional equations altogether, and so may be more transparently applied to finite domains and discrete beliefs. A discrete counterpart of Cox's Theorem can be assembled from results that have been in the literature since 1959. PMID:18238098

  16. Imprecise probability for non-commuting observables

    NASA Astrophysics Data System (ADS)

    Allahverdyan, Armen E.

    2015-08-01

    It is known that non-commuting observables in quantum mechanics do not have joint probability. This statement refers to the precise (additive) probability model. I show that the joint distribution of any non-commuting pair of variables can be quantified via upper and lower probabilities, i.e. the joint probability is described by an interval instead of a number (imprecise probability). I propose transparent axioms from which the upper and lower probability operators follow. The imprecise probability depend on the non-commuting observables, is linear over the state (density matrix) and reverts to the usual expression for commuting observables.

  17. Exploring Student Difficulties with Multiplicity and Probability in Statistical Physics

    NASA Astrophysics Data System (ADS)

    Mountcastle, Donald; Thompson, John; Smith, Trevor

    2010-03-01

    We continue our research program on the teaching and learning of concepts in upper-division thermal physics at the University of Maine. Typical statistical physics textbooks introduce entropy (S) and multiplicity (w) [S = k ln(w)] with binary events such as flipping coins N times. Inherent confusion with probability and statistics, macrostates and microstates, and their varying dependence on N leads to student conceptual difficulties that persist after textbook-centered activities. We developed and implemented a guided-inquiry tutorial on the binomial distribution with student use of computational software to produce calculations of multiplicities, outcome probabilities, and graphs of their distributions as functions of N. This allows convenient exploration of statistics over more than seven orders of magnitude in N. Comparing student answers to pre- and post-tutorial questions, we find some, but not all of the intended learning results are achieved.
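
    A sketch of the computational core of such a tutorial (the choice of N values is ours): the log-gamma function keeps ln w tractable as N spans many orders of magnitude.

    ```python
    import math

    def ln_multiplicity(N, n):
        """ln w = ln C(N, n): multiplicity of the macrostate with n heads in N flips."""
        return math.lgamma(N + 1) - math.lgamma(n + 1) - math.lgamma(N - n + 1)

    for N in (10, 100, 10_000, 1_000_000, 10_000_000):
        ln_w = ln_multiplicity(N, N // 2)
        print(f"N = {N:>10,}   ln w(N/2) = {ln_w:.4g}")   # entropy S = k ln w
    ```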

  18. Fusion probability in heavy nuclei

    NASA Astrophysics Data System (ADS)

    Banerjee, Tathagata; Nath, S.; Pal, Santanu

    2015-03-01

    Background: Fusion between two massive nuclei is a very complex process and is characterized by three stages: (a) capture inside the potential barrier, (b) formation of an equilibrated compound nucleus (CN), and (c) statistical decay of the CN leading to a cold evaporation residue (ER) or fission. The second stage is the least understood of the three and is the most crucial in predicting yield of superheavy elements (SHE) formed in complete fusion reactions. Purpose: A systematic study of average fusion probability, , is undertaken to obtain a better understanding of its dependence on various reaction parameters. The study may also help to clearly demarcate onset of non-CN fission (NCNF), which causes fusion probability, PCN, to deviate from unity. Method: ER excitation functions for 52 reactions leading to CN in the mass region 170-220, which are available in the literature, have been compared with statistical model (SM) calculations. Capture cross sections have been obtained from a coupled-channels code. In the SM, shell corrections in both the level density and the fission barrier have been included. for these reactions has been extracted by comparing experimental and theoretical ER excitation functions in the energy range ˜5 %-35% above the potential barrier, where known effects of nuclear structure are insignificant. Results: has been shown to vary with entrance channel mass asymmetry, η (or charge product, ZpZt ), as well as with fissility of the CN, χCN. No parameter has been found to be adequate as a single scaling variable to determine . Approximate boundaries have been obtained from where starts deviating from unity. Conclusions: This study quite clearly reveals the limits of applicability of the SM in interpreting experimental observables from fusion reactions involving two massive nuclei. Deviation of from unity marks the beginning of the domain of dynamical models of fusion. Availability of precise ER cross

  19. Task specificity of attention training: the case of probability cuing

    PubMed Central

    Jiang, Yuhong V.; Swallow, Khena M.; Won, Bo-Yeong; Cistera, Julia D.; Rosenbaum, Gail M.

    2014-01-01

    Statistical regularities in our environment enhance perception and modulate the allocation of spatial attention. Surprisingly little is known about how learning-induced changes in spatial attention transfer across tasks. In this study, we investigated whether a spatial attentional bias learned in one task transfers to another. Most of the experiments began with a training phase in which a search target was more likely to be located in one quadrant of the screen than in the other quadrants. An attentional bias toward the high-probability quadrant developed during training (probability cuing). In a subsequent testing phase, the target's location distribution became random. In addition, the training and testing phases were based on different tasks. Probability cuing did not transfer between visual search and a foraging-like task. However, it did transfer between various types of visual search tasks that differed in stimuli and difficulty. These data suggest that different visual search tasks share a common and transferrable learned attentional bias. However, this bias is not shared by high-level, decision-making tasks such as foraging. PMID:25113853

  20. Exploring the Overestimation of Conjunctive Probabilities

    PubMed Central

    Nilsson, Håkan; Rieskamp, Jörg; Jenny, Mirjam A.

    2013-01-01

    People often overestimate probabilities of conjunctive events. The authors explored whether the accuracy of conjunctive probability estimates can be improved by increased experience with relevant constituent events and by using memory aids. The first experiment showed that increased experience with constituent events increased the correlation between the estimated and the objective conjunctive probabilities, but that it did not reduce overestimation of conjunctive probabilities. The second experiment showed that reducing cognitive load with memory aids for the constituent probabilities led to improved estimates of the conjunctive probabilities and to decreased overestimation of conjunctive probabilities. To explain the cognitive process underlying people’s probability estimates, the configural weighted average model was tested against the normative multiplicative model. The configural weighted average model generates conjunctive probabilities that systematically overestimate objective probabilities although the generated probabilities still correlate strongly with the objective probabilities. For the majority of participants this model was better than the multiplicative model in predicting the probability estimates. However, when memory aids were provided, the predictive accuracy of the multiplicative model increased. In sum, memory tools can improve people’s conjunctive probability estimates. PMID:23460026
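
    A side-by-side sketch of the two models tested above (the configural weight of 0.8 is an illustrative choice, not a fitted value):

    ```python
    def multiplicative(p_min, p_max):
        """Normative model: P(A and B) = P(A) * P(B) for independent events."""
        return p_min * p_max

    def configural_weighted_average(p_min, p_max, w=0.8):
        """Weighted average giving the smaller conjunct the larger weight."""
        return w * p_min + (1.0 - w) * p_max

    p_low, p_high = 0.3, 0.8
    print(multiplicative(p_low, p_high))               # 0.24 (normative)
    print(configural_weighted_average(p_low, p_high))  # 0.40 -> systematic overestimation
    ```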

  1. Direct probability mapping of contaminants

    SciTech Connect

    Rautman, C.A.

    1993-09-17

    Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. Geostatistical simulation provides powerful tools for investigating contaminant levels, and in particular, for identifying and using the spatial interrelationships among a set of isolated sample values. This additional information can be used to assess the likelihood of encountering contamination at unsampled locations and to evaluate the risk associated with decisions to remediate or not to remediate specific regions within a site. Past operation of the DOE Feed Materials Production Center has contaminated a site near Fernald, Ohio, with natural uranium. Soil geochemical data have been collected as part of the Uranium-in-Soils Integrated Demonstration Project. These data have been used to construct a number of stochastic images of potential contamination for parcels approximately the size of a selective remediation unit. Each such image accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely, statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination. Evaluation of the geostatistical simulations can yield maps representing the expected magnitude of the contamination for various regions and other information that may be important in determining a suitable remediation process or in sizing equipment to accomplish the restoration.
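
    A minimal sketch of the post-processing step (synthetic lognormal fields stand in for the conditional geostatistical simulations, and the action level is invented): stacking many equally likely realizations and counting exceedances gives the probability map directly.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    n_realizations, ny, nx = 500, 5, 5
    action_level = 35.0                    # illustrative cleanup threshold

    # Stand-ins for equally likely, statistically similar simulated images:
    realizations = rng.lognormal(mean=3.0, sigma=0.8, size=(n_realizations, ny, nx))

    prob_exceed = (realizations > action_level).mean(axis=0)
    print(prob_exceed.round(2))            # P(contamination > action level) per cell
    ```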

  2. Trajectory versus probability density entropy.

    PubMed

    Bologna, M; Grigolini, P; Karagiorgis, M; Rosa, A

    2001-07-01

    We show that the widely accepted conviction that a connection can be established between the probability density entropy and the Kolmogorov-Sinai (KS) entropy is questionable. We adopt the definition of density entropy as a functional of a distribution density whose time evolution is determined by a transport equation, conceived as the only prescription to use for the calculation. Although the transport equation is built up for the purpose of affording a picture equivalent to that stemming from trajectory dynamics, no direct use of trajectory time evolution is allowed, once the transport equation is defined. With this definition in mind we prove that the detection of a time regime of increase of the density entropy with a rate identical to the KS entropy is possible only in a limited number of cases. The proposals made by some authors to establish a connection between the two entropies in general violate our definition of density entropy and imply the concept of trajectory, which is foreign to that of density entropy. PMID:11461383

  3. Trajectory versus probability density entropy

    NASA Astrophysics Data System (ADS)

    Bologna, Mauro; Grigolini, Paolo; Karagiorgis, Markos; Rosa, Angelo

    2001-07-01

    We show that the widely accepted conviction that a connection can be established between the probability density entropy and the Kolmogorov-Sinai (KS) entropy is questionable. We adopt the definition of density entropy as a functional of a distribution density whose time evolution is determined by a transport equation, conceived as the only prescription to use for the calculation. Although the transport equation is built up for the purpose of affording a picture equivalent to that stemming from trajectory dynamics, no direct use of trajectory time evolution is allowed, once the transport equation is defined. With this definition in mind we prove that the detection of a time regime of increase of the density entropy with a rate identical to the KS entropy is possible only in a limited number of cases. The proposals made by some authors to establish a connection between the two entropies in general violate our definition of density entropy and imply the concept of trajectory, which is foreign to that of density entropy.

  4. Probability distributions of turbulent energy.

    PubMed

    Momeni, Mahdi; Müller, Wolf-Christian

    2008-05-01

    Probability density functions (PDFs) of scale-dependent energy fluctuations, P[δE(l)], are studied in high-resolution direct numerical simulations of Navier-Stokes and incompressible magnetohydrodynamic (MHD) turbulence. MHD flows with and without a strong mean magnetic field are considered. For all three systems it is found that the PDFs of inertial-range energy fluctuations exhibit self-similarity and monoscaling, in agreement with recent solar-wind measurements [Hnat, Geophys. Res. Lett. 29, 86 (2002)]. Furthermore, the energy PDFs exhibit similarity over all scales of the turbulent system, showing no substantial qualitative change of shape as the scale of the fluctuations varies. This is in contrast to the well-known behavior of PDFs of turbulent velocity fluctuations. In all three cases under consideration the P[δE(l)] resemble Lévy-type gamma distributions, approximately Δ^{-1} exp(-|δE|/Δ) |δE|^{-γ}. The observed gamma distributions exhibit a scale-dependent width Δ(l) and a system-dependent γ. The monoscaling property reflects the inertial-range scaling of the Elsässer-field fluctuations due to the lacking Galilei invariance of δE. The appearance of Lévy distributions is made plausible by a simple model of energy transfer. PMID:18643170

  5. THE BLACK HOLE FORMATION PROBABILITY

    SciTech Connect

    Clausen, Drew; Piro, Anthony L.; Ott, Christian D.

    2015-02-01

    A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a given zero-age main sequence (ZAMS) mass star (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may affect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that instead a probabilistic description of NS versus BH formation may be better suited to account for the current uncertainties in understanding how massive stars die. We present an initial exploration of the probability that a star will make a BH as a function of its ZAMS mass, P_BH(M_ZAMS). Although we find that it is difficult to derive a unique P_BH(M_ZAMS) using current measurements of both the BH mass distribution and the degree of chemical enrichment by massive stars, we demonstrate how P_BH(M_ZAMS) changes with these various observational and theoretical uncertainties. We anticipate that future studies of Galactic BHs and theoretical studies of core collapse will refine P_BH(M_ZAMS) and argue that this framework is an important new step toward better understanding BH formation. A probabilistic description of BH formation will be useful as input for future population synthesis studies that are interested in the formation of X-ray binaries, the nature and event rate of gravitational wave sources, and answering questions about chemical enrichment.

  6. The Black Hole Formation Probability

    NASA Astrophysics Data System (ADS)

    Clausen, Drew; Piro, Anthony L.; Ott, Christian D.

    2015-02-01

    A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a given zero-age main sequence (ZAMS) mass star (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may affect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that instead a probabilistic description of NS versus BH formation may be better suited to account for the current uncertainties in understanding how massive stars die. We present an initial exploration of the probability that a star will make a BH as a function of its ZAMS mass, P_BH(M_ZAMS). Although we find that it is difficult to derive a unique P_BH(M_ZAMS) using current measurements of both the BH mass distribution and the degree of chemical enrichment by massive stars, we demonstrate how P_BH(M_ZAMS) changes with these various observational and theoretical uncertainties. We anticipate that future studies of Galactic BHs and theoretical studies of core collapse will refine P_BH(M_ZAMS) and argue that this framework is an important new step toward better understanding BH formation. A probabilistic description of BH formation will be useful as input for future population synthesis studies that are interested in the formation of X-ray binaries, the nature and event rate of gravitational wave sources, and answering questions about chemical enrichment.

  7. The Probability Distribution for a Biased Spinner

    ERIC Educational Resources Information Center

    Foster, Colin

    2012-01-01

    This article advocates biased spinners as an engaging context for statistics students. Calculating the probability of a biased spinner landing on a particular side makes valuable connections between probability and other areas of mathematics. (Contains 2 figures and 1 table.)

  8. Subjective and objective probabilities in quantum mechanics

    SciTech Connect

    Srednicki, Mark

    2005-05-15

    We discuss how the apparently objective probabilities predicted by quantum mechanics can be treated in the framework of Bayesian probability theory, in which all probabilities are subjective. Our results are in accord with earlier work by Caves, Fuchs, and Schack, but our approach and emphasis are different. We also discuss the problem of choosing a noninformative prior for a density matrix.

  9. Using Playing Cards to Differentiate Probability Interpretations

    ERIC Educational Resources Information Center

    López Puga, Jorge

    2014-01-01

    The aprioristic (classical, naïve and symmetric) and frequentist interpretations of probability are commonly known. Bayesian or subjective interpretation of probability is receiving increasing attention. This paper describes an activity to help students differentiate between the three types of probability interpretations.

  10. Illustrating Basic Probability Calculations Using "Craps"

    ERIC Educational Resources Information Center

    Johnson, Roger W.

    2006-01-01

    Instructors may use the gambling game of craps to illustrate the use of a number of fundamental probability identities. For the "pass-line" bet we focus on the chance of winning and the expected game length. To compute these, probabilities of unions of disjoint events, probabilities of intersections of independent events, conditional probabilities…
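
    A worked version of the pass-line computation described above, using exact arithmetic; this reproduces the textbook result 244/495:

    ```python
    from fractions import Fraction

    def p_roll(total):
        """Probability that two fair dice sum to `total`."""
        ways = sum(1 for a in range(1, 7) for b in range(1, 7) if a + b == total)
        return Fraction(ways, 36)

    # Come-out roll: win on 7 or 11, lose on 2, 3, 12, otherwise a "point" is set.
    p_win = p_roll(7) + p_roll(11)
    for point in (4, 5, 6, 8, 9, 10):
        # Given a point, only `point` or 7 end the game, so P(point before 7)
        # is a conditional probability over those two disjoint outcomes.
        p_win += p_roll(point) * p_roll(point) / (p_roll(point) + p_roll(7))

    print(p_win, float(p_win))   # 244/495 ~ 0.4929
    ```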

  11. Pre-Service Teachers' Conceptions of Probability

    ERIC Educational Resources Information Center

    Odafe, Victor U.

    2011-01-01

    Probability knowledge and skills are needed in science and in making daily decisions that are sometimes made under uncertain conditions. Hence, there is the need to ensure that the pre-service teachers of our children are well prepared to teach probability. Pre-service teachers' conceptions of probability are identified, and ways of helping them…

  12. Teaching Probabilities and Statistics to Preschool Children

    ERIC Educational Resources Information Center

    Pange, Jenny

    2003-01-01

    This study considers the teaching of probabilities and statistics to a group of preschool children using traditional classroom activities and Internet games. It was clear from this study that children can show a high level of understanding of probabilities and statistics, and demonstrate high performance in probability games. The use of Internet…

  13. The Cognitive Substrate of Subjective Probability

    ERIC Educational Resources Information Center

    Nilsson, Hakan; Olsson, Henrik; Juslin, Peter

    2005-01-01

    The prominent cognitive theories of probability judgment were primarily developed to explain cognitive biases rather than to account for the cognitive processes in probability judgment. In this article the authors compare 3 major theories of the processes and representations in probability judgment: the representativeness heuristic, implemented as…

  14. Dynamic Encoding of Speech Sequence Probability in Human Temporal Cortex

    PubMed Central

    Leonard, Matthew K.; Bouchard, Kristofer E.; Tang, Claire

    2015-01-01

    Sensory processing involves identification of stimulus features, but also integration with the surrounding sensory and cognitive context. Previous work in animals and humans has shown fine-scale sensitivity to context in the form of learned knowledge about the statistics of the sensory environment, including relative probabilities of discrete units in a stream of sequential auditory input. These statistics are a defining characteristic of one of the most important sequential signals humans encounter: speech. For speech, extensive exposure to a language tunes listeners to the statistics of sound sequences. To address how speech sequence statistics are neurally encoded, we used high-resolution direct cortical recordings from human lateral superior temporal cortex as subjects listened to words and nonwords with varying transition probabilities between sound segments. In addition to their sensitivity to acoustic features (including contextual features, such as coarticulation), we found that neural responses dynamically encoded the language-level probability of both preceding and upcoming speech sounds. Transition probability first negatively modulated neural responses, followed by positive modulation of neural responses, consistent with coordinated predictive and retrospective recognition processes, respectively. Furthermore, transition probability encoding was different for real English words compared with nonwords, providing evidence for online interactions with high-order linguistic knowledge. These results demonstrate that sensory processing of deeply learned stimuli involves integrating physical stimulus features with their contextual sequential structure. Despite not being consciously aware of phoneme sequence statistics, listeners use this information to process spoken input and to link low-level acoustic representations with linguistic information about word identity and meaning. PMID:25948269
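
    A toy illustration of the statistic being tracked (the miniature "corpus" is mine): maximum-likelihood transition probabilities between adjacent speech sounds.

    ```python
    from collections import Counter

    corpus = ["s t r i t", "s t a r", "s p i n", "s t i k"]   # toy phoneme strings

    bigrams = Counter()
    unigrams = Counter()
    for word in corpus:
        phones = word.split()
        for first, second in zip(phones, phones[1:]):
            bigrams[(first, second)] += 1
            unigrams[first] += 1

    def transition_probability(first, second):
        """P(second | first), estimated from bigram counts."""
        return bigrams[(first, second)] / unigrams[first]

    print(transition_probability("s", "t"))   # 0.75: 's' is usually followed by 't'
    ```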

  15. Datamining approaches for modeling tumor control probability

    PubMed Central

    Naqa, Issam El; Deasy, Joseph O.; Mu, Yi; Huang, Ellen; Hope, Andrew J.; Lindsay, Patricia E.; Apte, Aditya; Alaly, James; Bradley, Jeffrey D.

    2016-01-01

    Background Tumor control probability (TCP) to radiotherapy is determined by complex interactions between tumor biology, tumor microenvironment, radiation dosimetry, and patient-related variables. The complexity of these heterogeneous variable interactions constitutes a challenge for building predictive models for routine clinical practice. We describe a datamining framework that can unravel the higher order relationships among dosimetric dose-volume prognostic variables, interrogate various radiobiological processes, and generalize to unseen data when applied prospectively. Material and methods Several datamining approaches are discussed, including dose-volume metrics, equivalent uniform dose, the mechanistic Poisson model, and model building methods using statistical regression and machine learning techniques. Institutional datasets of non-small cell lung cancer (NSCLC) patients are used to demonstrate these methods. The performance of the different methods was evaluated using bivariate Spearman rank correlations (rs). Over-fitting was controlled via resampling methods. Results Using a dataset of 56 patients with primary NSCLC tumors and 23 candidate variables, we estimated GTV volume and V75 to be the best model parameters for predicting TCP using statistical resampling and a logistic model. Using these variables, the support vector machine (SVM) kernel method provided superior performance for TCP prediction with an rs = 0.68 on leave-one-out testing, compared to logistic regression (rs = 0.4), Poisson-based TCP (rs = 0.33), and the cell kill equivalent uniform dose model (rs = 0.17). Conclusions The prediction of treatment response can be improved by utilizing datamining approaches, which are able to unravel important non-linear complex interactions among model variables and have the capacity to predict on unseen data for prospective clinical applications. PMID:20192878

  16. Prior probabilities modulate cortical surprise responses: A study of event-related potentials.

    PubMed

    Seer, Caroline; Lange, Florian; Boos, Moritz; Dengler, Reinhard; Kopp, Bruno

    2016-07-01

    The human brain predicts events in its environment based on expectations, and unexpected events are surprising. When probabilistic contingencies in the environment are precisely instructed, the individual can form expectations based on quantitative probabilistic information ('inference-based learning'). In contrast, when probabilistic contingencies are imprecisely instructed, expectations are formed based on the individual's cumulative experience ('experience-based learning'). Here, we used the urn-ball paradigm to investigate how variations in prior probabilities and in the precision of information about these priors modulate choice behavior and event-related potential (ERP) correlates of surprise. In the urn-ball paradigm, participants are repeatedly forced to infer hidden states responsible for generating observable events, given small samples of factual observations. We manipulated prior probabilities of the states, and we rendered the priors calculable or incalculable, respectively. The analysis of choice behavior revealed that the tendency to consider prior probabilities when making decisions about hidden states was stronger when prior probabilities were calculable, at least in some of our participants. Surprise-related P3b amplitudes were observed in both the calculable and the incalculable prior probability condition. In contrast, calculability of prior probabilities modulated anteriorly distributed ERP amplitudes: when prior probabilities were calculable, surprising events elicited enhanced P3a amplitudes. However, when prior probabilities were incalculable, surprise was associated with enhanced N2 amplitudes. Furthermore, interindividual variability in reliance on prior probabilities was associated with attenuated P3b surprise responses under calculable in comparison to incalculable prior probabilities. Our results suggest two distinct neural systems for probabilistic learning that are recruited depending on contextual cues such as the precision of
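
    A minimal sketch of urn-ball inference (the urn compositions, prior, and sample are illustrative, not the study's parameters): the posterior over hidden urns combines the instructed prior with the likelihood of a small observed sample.

    ```python
    from math import prod

    urns = {"mostly_red":  {"red": 0.7, "blue": 0.3},
            "mostly_blue": {"red": 0.3, "blue": 0.7}}
    prior = {"mostly_red": 0.8, "mostly_blue": 0.2}     # the instructed prior

    sample = ["blue", "blue", "red", "blue"]            # observed draws, with replacement

    likelihood = {u: prod(urns[u][ball] for ball in sample) for u in urns}
    evidence = sum(prior[u] * likelihood[u] for u in urns)
    posterior = {u: prior[u] * likelihood[u] / evidence for u in urns}

    print(posterior)   # the prior pulls toward mostly_red; the data pull toward mostly_blue
    ```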

  17. Bell Could Become the Copernicus of Probability

    NASA Astrophysics Data System (ADS)

    Khrennikov, Andrei

    2016-07-01

    Our aim is to emphasize the role of mathematical models in physics, especially models of geometry and probability. We briefly compare developments of geometry and probability by pointing to similarities and differences: from Euclid to Lobachevsky and from Kolmogorov to Bell. In probability, Bell could play the same role as Lobachevsky in geometry. In fact, violation of Bell’s inequality can be treated as implying the impossibility to apply the classical probability model of Kolmogorov (1933) to quantum phenomena. Thus the quantum probabilistic model (based on Born’s rule) can be considered as the concrete example of the non-Kolmogorovian model of probability, similarly to the Lobachevskian model — the first example of the non-Euclidean model of geometry. This is the “probability model” interpretation of the violation of Bell’s inequality. We also criticize the standard interpretation—an attempt to add to rigorous mathematical probability models additional elements such as (non)locality and (un)realism. Finally, we compare embeddings of non-Euclidean geometries into the Euclidean space with embeddings of the non-Kolmogorovian probabilities (in particular, quantum probability) into the Kolmogorov probability space. As an example, we consider the CHSH-test.

  18. Probability and Quantum Paradigms: the Interplay

    SciTech Connect

    Kracklauer, A. F.

    2007-12-03

    Since the introduction of Born's interpretation of quantum wave functions as yielding the probability density of presence, Quantum Theory and Probability have lived in a troubled symbiosis. Problems arise with this interpretation because quantum probabilities exhibit features alien to usual probabilities, namely non-Boolean structure and non-positive-definite phase space probability densities. This has inspired research into both elaborate formulations of Probability Theory and alternate interpretations for wave functions. Herein the latter tactic is taken: a variant interpretation of wave functions based on photodetection physics is proposed, and some empirical consequences are considered. Although incomplete in a few details, this variant is appealing in its reliance on well tested concepts and technology.

  19. Derivation of quantum probability from measurement

    NASA Astrophysics Data System (ADS)

    Herbut, Fedor

    2016-05-01

    To begin with, it is pointed out that the form of the quantum probability formula originates in the very initial state of the object system as seen when the state is expanded with the eigenprojectors of the measured observable. Making use of the probability reproducibility condition, which is a key concept in unitary measurement theory, one obtains the relevant coherent distribution of the complete-measurement results in the final unitary-measurement state in agreement with the mentioned probability formula. Treating the transition from the final unitary, or premeasurement, state, where all possible results are present, to one complete-measurement result sketchily in the usual way, the well-known probability formula is derived. In conclusion it is pointed out that the entire argument is only formal unless one makes it physical assuming that the quantum probability law is valid in the extreme case of probability-one (certain) events (projectors).

  20. Probability and Quantum Paradigms: the Interplay

    NASA Astrophysics Data System (ADS)

    Kracklauer, A. F.

    2007-12-01

    Since the introduction of Born's interpretation of quantum wave functions as yielding the probability density of presence, Quantum Theory and Probability have lived in a troubled symbiosis. Problems arise with this interpretation because quantum probabilities exhibit features alien to usual probabilities, namely non-Boolean structure and non-positive-definite phase space probability densities. This has inspired research into both elaborate formulations of Probability Theory and alternate interpretations for wave functions. Herein the latter tactic is taken: a variant interpretation of wave functions based on photodetection physics is proposed, and some empirical consequences are considered. Although incomplete in a few details, this variant is appealing in its reliance on well tested concepts and technology.

  1. Entropy analysis of systems exhibiting negative probabilities

    NASA Astrophysics Data System (ADS)

    Tenreiro Machado, J. A.

    2016-07-01

    This paper addresses the concept of negative probability and its impact upon entropy. An analogy between the probability generating functions, in the scope of quasiprobability distributions, and the Grünwald-Letnikov definition of fractional derivatives, is explored. Two distinct cases producing negative probabilities are formulated and their meanings clarified. Numerical calculations using the Shannon entropy further characterize the two limit cases.

  2. Calculating the CEP (Circular Error Probable)

    NASA Technical Reports Server (NTRS)

    1987-01-01

    This report compares the probability contained in the Circular Error Probable associated with an Elliptical Error Probable to that of the EEP at a given confidence level. The levels examined are 50 percent and 95 percent. The CEP is found to be both more conservative and less conservative than the associated EEP, depending on the eccentricity of the ellipse. The formulas used are derived in the appendix.

  3. Neural correlates of the divergence of instrumental probability distributions.

    PubMed

    Liljeholm, Mimi; Wang, Shuo; Zhang, June; O'Doherty, John P

    2013-07-24

    Flexible action selection requires knowledge about how alternative actions impact the environment: a "cognitive map" of instrumental contingencies. Reinforcement learning theories formalize this map as a set of stochastic relationships between actions and states, such that for any given action considered in a current state, a probability distribution is specified over possible outcome states. Here, we show that activity in the human inferior parietal lobule correlates with the divergence of such outcome distributions (a measure that reflects whether discrimination between alternative actions increases the controllability of the future) and, further, that this effect is dissociable from those of other information theoretic and motivational variables, such as outcome entropy, action values, and outcome utilities. Our results suggest that, although ultimately combined with reward estimates to generate action values, outcome probability distributions associated with alternative actions may be contrasted independently of valence computations, to narrow the scope of the action selection problem. PMID:23884955

  4. Neural Correlates of the Divergence of Instrumental Probability Distributions

    PubMed Central

    Wang, Shuo; Zhang, June; O'Doherty, John P.

    2013-01-01

    Flexible action selection requires knowledge about how alternative actions impact the environment: a “cognitive map” of instrumental contingencies. Reinforcement learning theories formalize this map as a set of stochastic relationships between actions and states, such that for any given action considered in a current state, a probability distribution is specified over possible outcome states. Here, we show that activity in the human inferior parietal lobule correlates with the divergence of such outcome distributions–a measure that reflects whether discrimination between alternative actions increases the controllability of the future–and, further, that this effect is dissociable from those of other information theoretic and motivational variables, such as outcome entropy, action values, and outcome utilities. Our results suggest that, although ultimately combined with reward estimates to generate action values, outcome probability distributions associated with alternative actions may be contrasted independently of valence computations, to narrow the scope of the action selection problem. PMID:23884955

  5. Psychophysics of the probability weighting function

    NASA Astrophysics Data System (ADS)

    Takahashi, Taiki

    2011-03-01

    A probability weighting function w(p) for an objective probability p in decision under risk plays a pivotal role in Kahneman-Tversky prospect theory. Although recent studies in econophysics and neuroeconomics widely utilized probability weighting functions, psychophysical foundations of the probability weighting functions have been unknown. Notably, the behavioral economist Prelec (1998) [4] axiomatically derived the probability weighting function w(p) = exp(-(-ln p)^α) (with 0 < α < 1, w(0) = 0, w(1/e) = 1/e, and w(1) = 1), which has extensively been studied in behavioral neuroeconomics. The present study utilizes psychophysical theory to derive Prelec's probability weighting function from psychophysical laws of perceived waiting time in probabilistic choices. Also, the relations between the parameters in the probability weighting function and the probability discounting function in behavioral psychology are derived. Future directions in the application of the psychophysical theory of the probability weighting function in econophysics and neuroeconomics are discussed.
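
    For reference, Prelec's function is a one-liner; a small sketch (the α value is arbitrary) showing the inverse-S shape and the fixed point w(1/e) = 1/e:

      import numpy as np

      def prelec_w(p, alpha=0.65):      # 0 < alpha < 1; value chosen for illustration
          p = np.asarray(p, dtype=float)
          return np.exp(-(-np.log(p)) ** alpha)

      probs = np.array([0.01, 1.0 / np.e, 0.5, 0.99])
      print(prelec_w(probs))  # small p overweighted, large p underweighted, w(1/e) = 1/e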

  6. Executable Code Recognition in Network Flows Using Instruction Transition Probabilities

    NASA Astrophysics Data System (ADS)

    Kim, Ikkyun; Kang, Koohong; Choi, Yangseo; Kim, Daewon; Oh, Jintae; Jang, Jongsoo; Han, Kijun

    The ability to quickly recognize executable code inside network flows is a prerequisite for malware detection. For this purpose, we introduce an instruction transition probability matrix (ITPX), which is built over the IA-32 instruction sets and reveals the characteristic instruction transition patterns of executable code. We then propose a simple algorithm to detect executable code inside network flows using a reference ITPX learned from known Windows Portable Executable files. We have tested the algorithm with thousands of executable and non-executable code samples. The results show that it is promising enough for real-world use.
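
    A rough sketch of the ITPX idea under simplifying assumptions: first-order transitions over an already-decoded opcode alphabet, scored by average log-likelihood. Real IA-32 disassembly and the authors' exact matching statistic are omitted, and the alphabet size is assumed.

      import numpy as np

      N_OPS = 256  # assumed opcode alphabet size

      def learn_itpx(sequences, smoothing=1.0):
          """Row-stochastic instruction transition probability matrix."""
          counts = np.full((N_OPS, N_OPS), smoothing)
          for seq in sequences:
              for a, b in zip(seq, seq[1:]):
                  counts[a, b] += 1
          return counts / counts.sum(axis=1, keepdims=True)

      def score(seq, itpx):
          """Average log-likelihood of an opcode stream under the reference ITPX;
          executable-like streams score higher than random payloads."""
          return float(np.mean([np.log(itpx[a, b]) for a, b in zip(seq, seq[1:])]))

      # Usage sketch: learn the reference ITPX from opcodes of known PE files,
      # then flag flows whose score exceeds a threshold set on held-out data.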

  7. Predetonation probability of a fission-bomb core

    NASA Astrophysics Data System (ADS)

    Reed, B. Cameron

    2010-08-01

    An undergraduate-level derivation of the probability that a uranium or plutonium fission bomb will suffer an uncontrolled predetonation due to neutrons liberated in spontaneous fissions in the fissile material is developed. Consistent with what was learned by Los Alamos bomb designers during World War II, it is shown why uncontrolled predetonation was not a problem for a U-235 bomb of the Little Boy "gun" design but necessitated development of implosion engineering for the Pu-239 Trinity and Fat Man bombs where the cores were contaminated with highly spontaneously fissile Pu-240.
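
    The headline quantity can be sketched with a generic Poisson argument; this is a back-of-envelope stand-in for the paper's derivation, and every number below is illustrative rather than historical.

      import math

      def p_predetonation(rate, window, p_init):
          """Chance that at least one spontaneous-fission neutron (Poisson rate
          `rate`, per second) starts a chain (probability `p_init` per neutron)
          during the assembly window (seconds)."""
          return 1.0 - math.exp(-rate * window * p_init)

      # Illustrative only: a low spontaneous-fission neutron rate keeps a slow
      # gun-type assembly safe, while a Pu-240-contaminated core does not.
      print(p_predetonation(rate=15.0, window=1e-3, p_init=0.1))  # ~1.5e-3
      print(p_predetonation(rate=1e6, window=1e-3, p_init=0.1))   # ~1.0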

  8. Lattice Duality: The Origin of Probability and Entropy

    NASA Technical Reports Server (NTRS)

    Knuth, Kevin H.

    2004-01-01

    Bayesian probability theory is an inference calculus, which originates from a generalization of inclusion on the Boolean lattice of logical assertions to a degree of inclusion represented by a real number. Dual to this lattice is the distributive lattice of questions constructed from the ordered set of down-sets of assertions, which forms the foundation of the calculus of inquiry, a generalization of information theory. In this paper we introduce this novel perspective on these spaces in which machine learning is performed and discuss the relationship between these results and several proposed generalizations of information theory in the literature.

  9. Simulations of Probabilities for Quantum Computing

    NASA Technical Reports Server (NTRS)

    Zak, M.

    1996-01-01

    It has been demonstrated that classical probabilities, and in particular the probabilistic Turing machine, can be simulated by combining chaos and non-Lipschitz dynamics, without utilization of any man-made devices (such as random number generators). Self-organizing properties of systems coupling simulated and calculated probabilities and their link to quantum computations are discussed.

  10. Correlation as Probability of Common Descent.

    ERIC Educational Resources Information Center

    Falk, Ruma; Well, Arnold D.

    1996-01-01

    One interpretation of the Pearson product-moment correlation ("r"), correlation as the probability of originating from common descent, important to the genetic measurement of inbreeding, is examined. The conditions under which "r" can be interpreted as the probability of "identity by descent" are specified, and the possibility of generalizing this…

  11. 47 CFR 1.1623 - Probability calculation.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 47 Telecommunication 1 2012-10-01 2012-10-01 false Probability calculation. 1.1623 Section 1.1623 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Grants by Random Selection Random Selection Procedures for Mass Media Services General Procedures § 1.1623 Probability calculation. (a)...

  12. Teaching Probability: A Socio-Constructivist Perspective

    ERIC Educational Resources Information Center

    Sharma, Sashi

    2015-01-01

    There is a considerable and rich literature on students' misconceptions in probability. However, less attention has been paid to the development of students' probabilistic thinking in the classroom. This paper offers a sequence, grounded in socio-constructivist perspective for teaching probability.

  13. Teaching Statistics and Probability: 1981 Yearbook.

    ERIC Educational Resources Information Center

    Shulte, Albert P., Ed.; Smart, James R., Ed.

    This 1981 yearbook of the National Council of Teachers of Mathematics (NCTM) offers classroom ideas for teaching statistics and probability, viewed as important topics in the school mathematics curriculum. Statistics and probability are seen as appropriate because they: (1) provide meaningful applications of mathematics at all levels; (2) provide…

  14. Phonotactic Probabilities in Young Children's Speech Production

    ERIC Educational Resources Information Center

    Zamuner, Tania S.; Gerken, Louann; Hammond, Michael

    2004-01-01

    This research explores the role of phonotactic probability in two-year-olds' production of coda consonants. Twenty-nine children were asked to repeat CVC non-words that were used as labels for pictures of imaginary animals. The CVC non-words were controlled for their phonotactic probabilities, neighbourhood densities, word-likelihood ratings, and…

  15. 47 CFR 1.1623 - Probability calculation.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 47 Telecommunication 1 2010-10-01 2010-10-01 false Probability calculation. 1.1623 Section 1.1623 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Random Selection Procedures for Mass Media Services General Procedures § 1.1623 Probability calculation. (a) All calculations shall...

  16. 47 CFR 1.1623 - Probability calculation.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 47 Telecommunication 1 2011-10-01 2011-10-01 false Probability calculation. 1.1623 Section 1.1623 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Random Selection Procedures for Mass Media Services General Procedures § 1.1623 Probability calculation. (a) All calculations shall...

  17. 47 CFR 1.1623 - Probability calculation.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 47 Telecommunication 1 2014-10-01 2014-10-01 false Probability calculation. 1.1623 Section 1.1623 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Grants by Random Selection Random Selection Procedures for Mass Media Services General Procedures § 1.1623 Probability calculation. (a)...

  18. 47 CFR 1.1623 - Probability calculation.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 47 Telecommunication 1 2013-10-01 2013-10-01 false Probability calculation. 1.1623 Section 1.1623 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Grants by Random Selection Random Selection Procedures for Mass Media Services General Procedures § 1.1623 Probability calculation. (a)...

  19. Average Transmission Probability of a Random Stack

    ERIC Educational Resources Information Center

    Lu, Yin; Miniatura, Christian; Englert, Berthold-Georg

    2010-01-01

    The transmission through a stack of identical slabs that are separated by gaps with random widths is usually treated by calculating the average of the logarithm of the transmission probability. We show how to calculate the average of the transmission probability itself with the aid of a recurrence relation and derive analytical upper and lower…

  20. Probability Simulations by Non-Lipschitz Chaos

    NASA Technical Reports Server (NTRS)

    Zak, Michail

    1996-01-01

    It has been demonstrated that classical probabilities, and in particular, probabilistic Turing machine, can be simulated by combining chaos and non-Lipschitz dynamics, without utilization of any man-made devices. Self-organizing properties of systems coupling simulated and calculated probabilities and their link to quantum computations are discussed.

  1. Probability: A Matter of Life and Death

    ERIC Educational Resources Information Center

    Hassani, Mehdi; Kippen, Rebecca; Mills, Terence

    2016-01-01

    Life tables are mathematical tables that document probabilities of dying and life expectancies at different ages in a society. Thus, the life table contains some essential features of the health of a population. Probability is often regarded as a difficult branch of mathematics. Life tables provide an interesting approach to introducing concepts…

  2. Stimulus Probability Effects in Absolute Identification

    ERIC Educational Resources Information Center

    Kent, Christopher; Lamberts, Koen

    2016-01-01

    This study investigated the effect of stimulus presentation probability on accuracy and response times in an absolute identification task. Three schedules of presentation were used to investigate the interaction between presentation probability and stimulus position within the set. Data from individual participants indicated strong effects of…

  3. Laboratory-Tutorial Activities for Teaching Probability

    ERIC Educational Resources Information Center

    Wittmann, Michael C.; Morgan, Jeffrey T.; Feeley, Roger E.

    2006-01-01

    We report on the development of students' ideas of probability and probability density in a University of Maine laboratory-based general education physics course called "Intuitive Quantum Physics". Students in the course are generally math phobic with unfavorable expectations about the nature of physics and their ability to do it. We describe a…

  4. Probability Issues in without Replacement Sampling

    ERIC Educational Resources Information Center

    Joarder, A. H.; Al-Sabah, W. S.

    2007-01-01

    Sampling without replacement is an important aspect in teaching conditional probabilities in elementary statistics courses. Different methods proposed in different texts for calculating probabilities of events in this context are reviewed and their relative merits and limitations in applications are pinpointed. An alternative representation of…

  5. Assessment of the probability of contaminating Mars

    NASA Technical Reports Server (NTRS)

    Judd, B. R.; North, D. W.; Pezier, J. P.

    1974-01-01

    New methodology is proposed to assess the probability that the planet Mars will be biologically contaminated by terrestrial microorganisms aboard a spacecraft. Present NASA methods are based on the Sagan-Coleman formula, which states that the probability of contamination is the product of the expected microbial release and a probability of growth. The proposed new methodology extends the Sagan-Coleman approach to permit utilization of detailed information on microbial characteristics, the lethality of release and transport mechanisms, and of other information about the Martian environment. Three different types of microbial release are distinguished in the model for assessing the probability of contamination. The number of viable microbes released by each mechanism depends on the bio-burden in various locations on the spacecraft and on whether the spacecraft landing is accomplished according to plan. For each of the three release mechanisms a probability of growth is computed, using a model for transport into an environment suited to microbial growth.
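
    The Sagan-Coleman structure referred to above is simple to state; a toy version with three release mechanisms, every number invented:

      # Expected-value form of Sagan-Coleman, summed over release mechanisms;
      # the sum is a valid probability only while it stays well below 1.
      release = {"hard_landing": 1e4, "erosion": 1e2, "ejecta": 1e1}      # expected viable microbes
      p_growth = {"hard_landing": 1e-8, "erosion": 1e-7, "ejecta": 1e-6}  # per-microbe growth probability

      p_contamination = sum(release[m] * p_growth[m] for m in release)
      print(p_contamination)  # 1.2e-4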

  6. Quantum probability assignment limited by relativistic causality.

    PubMed

    Han, Yeong Deok; Choi, Taeseung

    2016-01-01

    Quantum theory has nonlocal correlations, which bothered Einstein but were found to satisfy relativistic causality. Correlation for a shared quantum state manifests itself, in the standard quantum framework, by joint probability distributions that can be obtained by applying state reduction and the probability assignment rule called the Born rule. Quantum correlations, which show nonlocality when the shared state has entanglement, can be changed if we apply a different probability assignment rule. As a result, the amount of nonlocality in the quantum correlation will change. The issue is whether a change in the rule of quantum probability assignment breaks relativistic causality. We have shown that the Born rule on quantum measurement is derived by requiring the relativistic causality condition. This shows how relativistic causality limits the upper bound of quantum nonlocality through quantum probability assignment. PMID:26971717

  7. Multinomial mixture model with heterogeneous classification probabilities

    USGS Publications Warehouse

    Holland, M.D.; Gray, B.R.

    2011-01-01

    Royle and Link (Ecology 86(9):2505-2512, 2005) proposed an analytical method that allowed estimation of multinomial distribution parameters and classification probabilities from categorical data measured with error. While useful, we demonstrate algebraically and by simulations that this method yields biased multinomial parameter estimates when the probabilities of correct category classifications vary among sampling units. We address this shortcoming by treating these probabilities as logit-normal random variables within a Bayesian framework. We use Markov chain Monte Carlo to compute Bayes estimates from a simulated sample from the posterior distribution. Based on simulations, this elaborated Royle-Link model yields nearly unbiased estimates of the multinomial parameters and of the correct classification probabilities when the classification probabilities are allowed to vary according to the normal distribution on the logit scale or according to the Beta distribution. The method is illustrated using categorical submersed aquatic vegetation data. © 2010 Springer Science+Business Media, LLC.

  8. Quantum probability assignment limited by relativistic causality

    PubMed Central

    Han, Yeong Deok; Choi, Taeseung

    2016-01-01

    Quantum theory has nonlocal correlations, which bothered Einstein but were found to satisfy relativistic causality. Correlation for a shared quantum state manifests itself, in the standard quantum framework, by joint probability distributions that can be obtained by applying state reduction and the probability assignment rule called the Born rule. Quantum correlations, which show nonlocality when the shared state has entanglement, can be changed if we apply a different probability assignment rule. As a result, the amount of nonlocality in the quantum correlation will change. The issue is whether a change in the rule of quantum probability assignment breaks relativistic causality. We have shown that the Born rule on quantum measurement is derived by requiring the relativistic causality condition. This shows how relativistic causality limits the upper bound of quantum nonlocality through quantum probability assignment. PMID:26971717

  9. Survival probability in patients with liver trauma.

    PubMed

    Buci, Skender; Kukeli, Agim

    2016-08-01

    Purpose - The purpose of this paper is to assess the survival probability among patients with liver trauma injury using the anatomical and psychological scores of conditions, characteristics and treatment modes. Design/methodology/approach - A logistic model is used to estimate the survival probability of 173 patients. Data are taken from patient records. Only emergency room patients admitted to University Hospital of Trauma (former Military Hospital) in Tirana are included. Data are recorded anonymously, preserving the patients' privacy. Findings - When predicting correctly, the logistic models show that survival probability varies from 70.5 percent to 95.4 percent. The degree of trauma injury, trauma with liver and other organs, total days the patient was hospitalized, and treatment method (conservative vs intervention) are statistically important in explaining survival probability. Practical implications - The study gives patients, their relatives and physicians ample and sound information they can use to predict survival chances, the best treatment and resource management. Originality/value - This study, which has not been done previously, explores survival probability, success probability for conservative and non-conservative treatment, and success probability for single vs multiple injuries from liver trauma. PMID:27477933

  10. Liquefaction probability curves for surficial geologic deposits

    USGS Publications Warehouse

    Holzer, Thomas L.; Noce, Thomas E.; Bennett, Michael J.

    2011-01-01

    Liquefaction probability curves that predict the probability of surface manifestations of earthquake-induced liquefaction are developed for 14 different types of surficial geologic units. The units consist of alluvial fan, beach ridge, river delta topset and foreset beds, eolian dune, point bar, flood basin, natural river and alluvial fan levees, abandoned river channel, deep-water lake, lagoonal, sandy artificial fill, and valley train deposits. Probability is conditioned on earthquake magnitude and peak ground acceleration. Curves are developed for water table depths of 1.5 and 5.0 m. Probabilities are derived from complementary cumulative frequency distributions of the liquefaction potential index (LPI) that were computed from 927 cone penetration tests. For natural deposits with a water table at 1.5 m and subjected to a M7.5 earthquake with peak ground acceleration (PGA) = 0.25g, probabilities are highest, about 0.5, for beach ridge, point bar, and deltaic deposits. The curves also were used to assign ranges of liquefaction probabilities to the susceptibility categories proposed previously for different geologic deposits. For the earthquake described here, probabilities for susceptibility categories have ranges of 0-0.08 for low, 0.09-0.30 for moderate, 0.31-0.62 for high, and 0.63-1.00 for very high. Retrospective predictions of liquefaction during historical earthquakes based on the curves compare favorably to observations.

  11. Semigroups of tomographic probabilities and quantum correlations

    NASA Astrophysics Data System (ADS)

    Man'ko, V. I.

    2008-08-01

    Semigroups of stochastic and bistochastic matrices constructed by means of spin tomograms or tomographic probabilities and their relations to the problem of Bell's inequalities and entanglement are reviewed. The probability determining the quantum state of spins and the probability densities determining the quantum states of particles with continuous variables are considered. Entropies for semigroups of stochastic and bistochastic matrices are studied, in view of both the Shannon information entropy and its generalizations such as the Rényi entropy. Qubit portraits of qudit states are discussed in connection with the problem of Bell's inequality violation for entangled states.

  12. Probability distributions for a surjective unimodal map

    NASA Astrophysics Data System (ADS)

    Sun, Hongyan; Wang, Long

    1996-04-01

    In this paper we show that the probability distributions for a surjective unimodal map can be classified into three types, the δ-function, asymmetric, and symmetric types, by identifying the binary structures of its initial values. Borel's normal number theorem is equivalent or prior to the Frobenius-Perron operator in analyzing the probability distributions for this kind of map, and in particular we can construct a multifractal probability distribution from the surjective tent map by selecting a non-Borel normal number as the initial value.

  13. Robust location and spread measures for nonparametric probability density function estimation.

    PubMed

    López-Rubio, Ezequiel

    2009-10-01

    Robustness against outliers is a desirable property of any unsupervised learning scheme. In particular, probability density estimators benefit from incorporating this feature. A possible strategy to achieve this goal is to substitute the sample mean and the sample covariance matrix by more robust location and spread estimators. Here we use the L1-median to develop a nonparametric probability density function (PDF) estimator. We prove its most relevant properties, and we show its performance in density estimation and classification applications. PMID:19885963
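
    A compact sketch of the L1-median the estimator is built on, computed with Weiszfeld's standard fixed-point iteration; the stopping rule and the toy data are arbitrary.

      import numpy as np

      def l1_median(X, iters=500, eps=1e-9):
          m = X.mean(axis=0)  # initialize at the (non-robust) sample mean
          for _ in range(iters):
              d = np.maximum(np.linalg.norm(X - m, axis=1), eps)
              m_new = (X / d[:, None]).sum(axis=0) / (1.0 / d).sum()
              if np.linalg.norm(m_new - m) < eps:
                  break
              m = m_new
          return m

      rng = np.random.default_rng(0)
      X = np.vstack([rng.standard_normal((100, 2)), np.full((5, 2), 50.0)])
      print("mean:", X.mean(axis=0))     # dragged toward the outliers at (50, 50)
      print("L1-median:", l1_median(X))  # stays near the origin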

  14. Characteristic length of the knotting probability revisited

    NASA Astrophysics Data System (ADS)

    Uehara, Erica; Deguchi, Tetsuo

    2015-09-01

    We present a self-avoiding polygon (SAP) model for circular DNA in which the radius of impermeable cylindrical segments corresponds to the screening length of double-stranded DNA surrounded by counter ions. For the model we evaluate the probability for a generated SAP with N segments having a given knot K through simulation. We call it the knotting probability of a knot K with N segments for the SAP model. We show that when N is large the most significant factor in the knotting probability is given by the exponentially decaying part exp(-N/N_K), where the estimates of the parameter N_K are consistent with the same value for all the different knots we investigated. We thus call it the characteristic length of the knotting probability. We give formulae expressing the characteristic length as a function of the cylindrical radius r_ex, i.e. the screening length of double-stranded DNA.

  15. Inclusion probability with dropout: an operational formula.

    PubMed

    Milot, E; Courteau, J; Crispino, F; Mailly, F

    2015-05-01

    In forensic genetics, a mixture of two or more contributors to a DNA profile is often interpreted using the inclusion probabilities theory. In this paper, we present a general formula for estimating the probability of inclusion (PI, also known as the RMNE probability) from a subset of visible alleles when dropouts are possible. This one-locus formula can easily be extended to multiple loci using the cumulative probability of inclusion. We show that an exact formulation requires fixing the number of contributors, hence to slightly modify the classic interpretation of the PI. We discuss the implications of our results for the enduring debate over the use of PI vs likelihood ratio approaches within the context of low template amplifications. PMID:25559642
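
    For orientation, the classic no-dropout version of the quantities involved; the paper's dropout correction and fixed-contributor conditioning are not reproduced, and the allele frequencies are invented.

      from math import prod

      def pi_locus(visible_allele_freqs):
          """Classic one-locus RMNE: probability that a random person carries only
          alleles visible in the mixture, under Hardy-Weinberg proportions."""
          return sum(visible_allele_freqs) ** 2

      loci = [[0.12, 0.08, 0.22], [0.30, 0.05], [0.18, 0.09, 0.11]]
      cumulative_pi = prod(pi_locus(f) for f in loci)  # multi-locus inclusion probability
      print(cumulative_pi)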

  16. A Survey of Tables of Probability Distributions

    PubMed Central

    Kacker, Raghu; Olkin, Ingram

    2005-01-01

    This article is a survey of the tables of probability distributions published about or after the publication in 1964 of the Handbook of Mathematical Functions, edited by Abramowitz and Stegun. PMID:27308104

  17. On Convergent Probability of a Random Walk

    ERIC Educational Resources Information Center

    Lee, Y.-F.; Ching, W.-K.

    2006-01-01

    This note introduces an interesting random walk on a straight path with cards of random numbers. The method of recurrence relations is used to obtain the convergent probability of the random walk with different initial positions.
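
    The flavor of the recurrence-relation method can be shown on the standard absorbing random walk; this generic example is not the note's exact card-based walk.

      import numpy as np

      def reach_prob(N=10, p=0.5):
          """h(i) = P(hit N before 0 | start at i), solving the linear system
          h(i) = p*h(i+1) + (1-p)*h(i-1) with h(0) = 0 and h(N) = 1."""
          A = np.zeros((N + 1, N + 1))
          b = np.zeros(N + 1)
          A[0, 0] = A[N, N] = 1.0
          b[N] = 1.0
          for i in range(1, N):
              A[i, i - 1], A[i, i], A[i, i + 1] = -(1 - p), 1.0, -p
          return np.linalg.solve(A, b)

      print(reach_prob())  # the symmetric case recovers h(i) = i/N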

  18. Determining Probabilities by Examining Underlying Structure.

    ERIC Educational Resources Information Center

    Norton, Robert M.

    2001-01-01

    Discusses how dice games pose fairness issues that appeal to students and examines a structure for three games involving two dice in a way that leads directly to the theoretical probabilities for all possible outcomes. (YDS)

  19. Neutron initiation probability in fast burst reactor

    SciTech Connect

    Liu, X.; Du, J.; Xie, Q.; Fan, X.

    2012-07-01

    Based on the probability balance of neutron random events in a multiplying system, the four random processes of neutrons in a prompt supercritical state are described, and the equation for the neutron initiation probability W(r,E,Ω,t) is deduced. Under the assumptions of a static, slightly prompt supercritical system and the two-factorial approximation, the formula for the average initiation probability of one neutron is derived, which agrees with the result obtained from the point model. MC simulation using the point model is applied to Godiva-II and CFBR-II, and the simulated one-neutron initiation probability is consistent with theory: the initiation probabilities of the Godiva-II and CFBR-II burst reactors are 0.00032 and 0.00027, respectively, in ordinary burst operation. (authors)

  20. Probability tree algorithm for general diffusion processes

    NASA Astrophysics Data System (ADS)

    Ingber, Lester; Chen, Colleen; Mondescu, Radu Paul; Muzzall, David; Renedo, Marco

    2001-11-01

    Motivated by path-integral numerical solutions of diffusion processes, PATHINT, we present a tree algorithm, PATHTREE, which permits extremely fast accurate computation of probability distributions of a large class of general nonlinear diffusion processes.

  1. Transition Probability and the ESR Experiment

    ERIC Educational Resources Information Center

    McBrierty, Vincent J.

    1974-01-01

    Discusses the use of a modified electron spin resonance apparatus to demonstrate some features of the expression for the transition probability per second between two energy levels. Applications to the third year laboratory program are suggested. (CC)

  2. Infants Segment Continuous Events Using Transitional Probabilities

    ERIC Educational Resources Information Center

    Stahl, Aimee E.; Romberg, Alexa R.; Roseberry, Sarah; Golinkoff, Roberta Michnick; Hirsh-Pasek, Kathryn

    2014-01-01

    Throughout their 1st year, infants adeptly detect statistical structure in their environment. However, little is known about whether statistical learning is a primary mechanism for event segmentation. This study directly tests whether statistical learning alone is sufficient to segment continuous events. Twenty-eight 7- to 9-month-old infants…

  3. Non-Gaussian Photon Probability Distribution

    NASA Astrophysics Data System (ADS)

    Solomon, Benjamin T.

    2010-01-01

    This paper investigates the axiom that the photon's probability distribution is a Gaussian distribution. The Airy disc empirical evidence shows that the best fit, if not exact, distribution is a modified Gamma mΓ distribution (whose parameters are α = r and β = r/√u) in the plane orthogonal to the motion of the photon. This modified Gamma distribution is then used to reconstruct the probability distributions along the hypotenuse from the pinhole, arc from the pinhole, and a line parallel to photon motion. This reconstruction shows that the photon's probability distribution is not a Gaussian function. However, under certain conditions, the distribution can appear to be Normal, thereby accounting for the success of quantum mechanics. This modified Gamma distribution changes with the shape of objects around it and thus explains how the observer alters the observation. This property therefore places additional constraints on quantum entanglement experiments. This paper shows that photon interaction is a multi-phenomena effect consisting of the probability to interact Pi, the probabilistic function, and the ability to interact Ai, the electromagnetic function. Splitting the probability function Pi from the electromagnetic function Ai enables the investigation of the photon behavior from a purely probabilistic Pi perspective. The Probabilistic Interaction Hypothesis is proposed as a consistent method for handling the two different phenomena, the probability function Pi and the ability to interact Ai, thus redefining radiation shielding, stealth or cloaking, and invisibility as different effects of a single phenomenon Pi of the photon probability distribution. Sub wavelength photon behavior is successfully modeled as a multi-phenomena behavior. The Probabilistic Interaction Hypothesis provides a good fit to Otoshi's (1972) microwave shielding, Schurig et al. (2006) microwave cloaking, and Oulton et al. (2008) sub wavelength confinement; thereby providing a strong case that

  4. Robust satisficing and the probability of survival

    NASA Astrophysics Data System (ADS)

    Ben-Haim, Yakov

    2014-01-01

    Concepts of robustness are sometimes employed when decisions under uncertainty are made without probabilistic information. We present a theorem that establishes necessary and sufficient conditions for non-probabilistic robustness to be equivalent to the probability of satisfying the specified outcome requirements. When this holds, probability is enhanced (or maximised) by enhancing (or maximising) robustness. Two further theorems establish important special cases. These theorems have implications for success or survival under uncertainty. Applications to foraging and finance are discussed.

  5. The spline probability hypothesis density filter

    NASA Astrophysics Data System (ADS)

    Sithiravel, Rajiv; Tharmarasa, Ratnasingham; McDonald, Mike; Pelletier, Michel; Kirubarajan, Thiagalingam

    2012-06-01

    The Probability Hypothesis Density Filter (PHD) is a multitarget tracker for recursively estimating the number of targets and their state vectors from a set of observations. The PHD filter is capable of working well in scenarios with false alarms and missed detections. Two distinct PHD filter implementations are available in the literature: the Sequential Monte Carlo Probability Hypothesis Density (SMC-PHD) and the Gaussian Mixture Probability Hypothesis Density (GM-PHD) filters. The SMC-PHD filter uses particles to provide target state estimates, which can lead to a high computational load, whereas the GM-PHD filter does not use particles, but restricts to linear Gaussian mixture models. The SMC-PHD filter technique provides only weighted samples at discrete points in the state space instead of a continuous estimate of the probability density function of the system state and thus suffers from the well-known degeneracy problem. This paper proposes a B-Spline based Probability Hypothesis Density (S-PHD) filter, which has the capability to model any arbitrary probability density function. The resulting algorithm can handle linear, non-linear, Gaussian, and non-Gaussian models and the S-PHD filter can also provide continuous estimates of the probability density function of the system state. In addition, by moving the knots dynamically, the S-PHD filter ensures that the splines cover only the region where the probability of the system state is significant, hence the high efficiency of the S-PHD filter is maintained at all times. Also, unlike the SMC-PHD filter, the S-PHD filter is immune to the degeneracy problem due to its continuous nature. The S-PHD filter derivations and simulations are provided in this paper.

  6. Site occupancy models with heterogeneous detection probabilities

    USGS Publications Warehouse

    Royle, J. Andrew

    2006-01-01

    Models for estimating the probability of occurrence of a species in the presence of imperfect detection are important in many ecological disciplines. In these "site occupancy" models, the possibility of heterogeneity in detection probabilities among sites must be considered because variation in abundance (and other factors) among sampled sites induces variation in detection probability (p). In this article, I develop occurrence probability models that allow for heterogeneous detection probabilities by considering several common classes of mixture distributions for p. For any mixing distribution, the likelihood has the general form of a zero-inflated binomial mixture for which inference based upon integrated likelihood is straightforward. A recent paper by Link (2003, Biometrics 59, 1123-1130) demonstrates that in closed population models used for estimating population size, different classes of mixture distributions are indistinguishable from data, yet can produce very different inferences about population size. I demonstrate that this problem can also arise in models for estimating site occupancy in the presence of heterogeneous detection probabilities. The implications of this are discussed in the context of an application to avian survey data and the development of animal monitoring programs.
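
    A one-site sketch of the integrated likelihood described above, with a Beta mixing distribution for the detection probability; the parameter values are arbitrary.

      import numpy as np
      from scipy import integrate, stats

      def site_likelihood(y, J, psi, a, b):
          """Marginal likelihood of y detections in J visits: site unoccupied
          (possible only when y = 0) or occupied with Beta(a, b)-mixed detection."""
          integrand = lambda p: stats.binom.pmf(y, J, p) * stats.beta.pdf(p, a, b)
          marginal, _ = integrate.quad(integrand, 0.0, 1.0)
          return (1.0 - psi) * (y == 0) + psi * marginal

      print(site_likelihood(y=0, J=5, psi=0.6, a=2.0, b=5.0))
      print(site_likelihood(y=2, J=5, psi=0.6, a=2.0, b=5.0))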

  7. The cumulative reaction probability as eigenvalue problem

    NASA Astrophysics Data System (ADS)

    Manthe, Uwe; Miller, William H.

    1993-09-01

    It is shown that the cumulative reaction probability for a chemical reaction can be expressed (absolutely rigorously) as N(E) = ∑_k p_k(E), where {p_k} are the eigenvalues of a certain Hermitian matrix (or operator). The eigenvalues {p_k} all lie between 0 and 1 and thus have the interpretation as probabilities, eigenreaction probabilities which may be thought of as the rigorous generalization of the transmission coefficients for the various states of the activated complex in transition state theory. The eigenreaction probabilities {p_k} can be determined by diagonalizing a matrix that is directly available from the Hamiltonian matrix itself. It is also shown how a very efficient iterative method can be used to determine the eigenreaction probabilities for problems that are too large for a direct diagonalization to be possible. The number of iterations required is much smaller than that of previous methods, approximately the number of eigenreaction probabilities that are significantly different from zero. All of these new ideas are illustrated by application to three model problems—transmission through a one-dimensional (Eckart potential) barrier, the collinear H+H2→H2+H reaction, and the three-dimensional version of this reaction for total angular momentum J=0.
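
    The structure of the result is easy to demonstrate numerically; the matrix below is a random Hermitian stand-in, not a physical reaction-probability operator.

      import numpy as np

      rng = np.random.default_rng(1)
      A = rng.standard_normal((8, 8)) + 1j * rng.standard_normal((8, 8))
      P = A @ A.conj().T                  # Hermitian and positive semidefinite
      P /= np.linalg.eigvalsh(P).max()    # rescale so all eigenvalues lie in [0, 1]

      p_k = np.linalg.eigvalsh(P)         # the "eigenreaction probabilities"
      print(np.round(p_k, 3))
      print("N(E) =", round(float(p_k.sum()), 3))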

  8. The Estimation of Tree Posterior Probabilities Using Conditional Clade Probability Distributions

    PubMed Central

    Larget, Bret

    2013-01-01

    In this article I introduce the idea of conditional independence of separated subtrees as a principle by which to estimate the posterior probability of trees using conditional clade probability distributions rather than simple sample relative frequencies. I describe an algorithm for these calculations and software which implements these ideas. I show that these alternative calculations are very similar to simple sample relative frequencies for high probability trees but are substantially more accurate for relatively low probability trees. The method allows the posterior probability of unsampled trees to be calculated when these trees contain only clades that are in other sampled trees. Furthermore, the method can be used to estimate the total probability of the set of sampled trees which provides a measure of the thoroughness of a posterior sample. [Bayesian phylogenetics; conditional clade distributions; improved accuracy; posterior probabilities of trees.] PMID:23479066

  9. The Problem with Probability: Why rare hazards feel even rarer

    NASA Astrophysics Data System (ADS)

    Thompson, K. J.

    2013-12-01

    Even as scientists improve the accuracy of their forecasts for large-scale events like natural hazards and climate change, a gap remains between the confidence the scientific community has in those estimates, and the skepticism with which the lay public tends to view statements of uncertainty. Beyond the challenges of helping the public to understand probabilistic forecasts lies yet another barrier to effective communication: the fact that even when humans can estimate or state the correct probability of a rare event, we tend to distort that probability in our minds, acting as if the likelihood is higher or lower than we know it to be. A half century of empirical research in psychology and economics leaves us with a clear view of the ways that people interpret stated, or described probabilities--e.g., "There is a 6% chance of a Northridge-sized earthquake occurring in your area in the next 10 years." In the past decade, the focus of cognitive scientists has turned to the other method humans use to learn probabilities: intuitively estimating the chances of a rare event by assessing our personal experience with various outcomes. While it is well understood that described probabilities are over-weighted when they are small (e.g., a 5% chance might be treated more like a 10% or 12% chance), it appears that in many cases, experienced rare probabilities are in fact under-weighted. This distortion is not an under-estimation, and therefore cannot be prevented by reminding people of the described probability. This paper discusses the mechanisms and effects of this difference in the way probability is used when a number is provided, as opposed to when the frequency of a rare event is intuited. In addition to recommendations based on the current state of research on the way people appear to make decisions from experience, suggestions are made for how to present probabilistic information to best take advantage of people's tendencies to either amplify risk or ignore it, as well

  10. Minimal entropy probability paths between genome families.

    PubMed

    Ahlbrandt, Calvin; Benson, Gary; Casey, William

    2004-05-01

    We develop a metric for probability distributions with applications to biological sequence analysis. Our distance metric is obtained by minimizing a functional defined on the class of paths over probability measures on N categories. The underlying mathematical theory is connected to a constrained problem in the calculus of variations. The solution presented is a numerical solution, which approximates the true solution in a set of cases called rich paths where none of the components of the path is zero. The functional to be minimized is motivated by entropy considerations, reflecting the idea that nature might efficiently carry out mutations of genome sequences in such a way that the increase in entropy involved in transformation is as small as possible. We characterize sequences by frequency profiles or probability vectors, in the case of DNA where N is 4 and the components of the probability vector are the frequency of occurrence of each of the bases A, C, G and T. Given two probability vectors a and b, we define a distance function as the infimum of path integrals of the entropy function H(p) over all admissible paths p(t), 0 ≤ t ≤ 1, with p(t) a probability vector such that p(0) = a and p(1) = b. If the probability paths p(t) are parameterized as y(s) in terms of arc length s and the optimal path is smooth with arc length L, then smooth and "rich" optimal probability paths may be numerically estimated by a hybrid method of iterating Newton's method on solutions of a two point boundary value problem, with unknown distance L between the abscissas, for the Euler-Lagrange equations resulting from a multiplier rule for the constrained optimization problem, together with linear regression to improve the arc length estimate L. Matlab code for these numerical methods is provided which works only for "rich" optimal probability vectors. These methods motivate a definition of an elementary distance function which is easier and faster to calculate, works on non

  11. The role of probabilities in physics.

    PubMed

    Le Bellac, Michel

    2012-09-01

    Although modern physics was born in the XVIIth century as a fully deterministic theory in the form of Newtonian mechanics, the use of probabilistic arguments turned out later on to be unavoidable. Three main situations can be distinguished. (1) When the number of degrees of freedom is very large, on the order of Avogadro's number, a detailed dynamical description is not possible, and in fact not useful: we do not care about the velocity of a particular molecule in a gas, all we need is the probability distribution of the velocities. This statistical description introduced by Maxwell and Boltzmann allows us to recover equilibrium thermodynamics, gives a microscopic interpretation of entropy and underlies our understanding of irreversibility. (2) Even when the number of degrees of freedom is small (but larger than three) sensitivity to initial conditions of chaotic dynamics makes determinism irrelevant in practice, because we cannot control the initial conditions with infinite accuracy. Although die tossing is in principle predictable, the approach to chaotic dynamics in some limit implies that our ignorance of initial conditions is translated into a probabilistic description: each face comes up with probability 1/6. (3) As is well-known, quantum mechanics is incompatible with determinism. However, quantum probabilities differ in an essential way from the probabilities introduced previously: it has been shown from the work of John Bell that quantum probabilities are intrinsic and cannot be given an ignorance interpretation based on a hypothetical deeper level of description. PMID:22609725

  12. Effects of Neutrino Decay on Oscillation Probabilities

    NASA Astrophysics Data System (ADS)

    Leonard, Kayla; de Gouvêa, André

    2016-01-01

    It is now well accepted that neutrinos oscillate as a quantum mechanical result of a misalignment between their mass-eigenstates and the flavor-eigenstates. We study neutrino decay—the idea that there may be new, light states that the three Standard Model flavors may be able to decay into. We consider what effects this neutrino decay would have on the observed oscillation probabilities. The Hamiltonian governs how the states change with time, so we use it to calculate an oscillation amplitude, and from that, the oscillation probability. We simplify the theoretical probabilities using results from experimental data, such as the neutrino mixing angles and mass differences. By exploring what values of the decay parameters are physically allowable, we can begin to understand just how large the decay parameters can be. We compare the probabilities in the case of no neutrino decay and in the case of maximum neutrino decay to determine how much of an effect neutrino decay could have on observations, and discuss the ability of future experiments to detect these differences. We also examine neutrino decay in the realm of CP invariance, and find that it is a new source of CP violation. Our work indicates that there is a difference in the oscillation probabilities between particle transitions and their corresponding antiparticle transitions. If neutrino decay were proven true, it could be an important factor in understanding leptogenesis and the particle-antiparticle asymmetry present in our Universe.
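
    For scale, the undeformed two-flavor vacuum oscillation probability that decay would modify; the mixing parameters are rounded textbook values, and the decay terms studied in the work are not included.

      import numpy as np

      def p_osc(L_km, E_GeV, theta=0.785, dm2_eV2=2.5e-3):
          """P(nu_a -> nu_b) = sin^2(2*theta) * sin^2(1.27 * dm2 * L / E)."""
          return np.sin(2.0 * theta) ** 2 * np.sin(1.27 * dm2_eV2 * L_km / E_GeV) ** 2

      print(p_osc(L_km=295.0, E_GeV=0.6))  # near an oscillation maximum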

  13. Approximation of Failure Probability Using Conditional Sampling

    NASA Technical Reports Server (NTRS)

    Giesy. Daniel P.; Crespo, Luis G.; Kenney, Sean P.

    2008-01-01

    In analyzing systems which depend on uncertain parameters, one technique is to partition the uncertain parameter domain into a failure set and its complement, and judge the quality of the system by estimating the probability of failure. If this is done by a sampling technique such as Monte Carlo and the probability of failure is small, accurate approximation can require so many sample points that the computational expense is prohibitive. Previous work of the authors has shown how to bound the failure event by sets of such simple geometry that their probabilities can be calculated analytically. In this paper, it is shown how to make use of these failure bounding sets and conditional sampling within them to substantially reduce the computational burden of approximating failure probability. It is also shown how the use of these sampling techniques improves the confidence intervals for the failure probability estimate for a given number of sample points and how they reduce the number of sample point analyses needed to achieve a given level of confidence.
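
    The plain Monte Carlo baseline that conditional sampling improves on, with the usual binomial confidence interval; the limit-state function here is invented.

      import numpy as np

      rng = np.random.default_rng(0)
      n = 100_000
      x = rng.standard_normal((n, 2))           # uncertain parameters
      failures = x[:, 0] + x[:, 1] > 4.0        # hypothetical failure set
      p_hat = failures.mean()
      half_width = 1.96 * np.sqrt(p_hat * (1.0 - p_hat) / n)  # 95% normal-approx CI
      print(f"P(fail) ~ {p_hat:.2e} +/- {half_width:.1e}")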

  14. Computing Earthquake Probabilities on Global Scales

    NASA Astrophysics Data System (ADS)

    Holliday, James R.; Graves, William R.; Rundle, John B.; Turcotte, Donald L.

    2016-03-01

    Large events in systems such as earthquakes, typhoons, market crashes, electricity grid blackouts, floods, droughts, wars and conflicts, and landslides can be unexpected and devastating. Events in many of these systems display frequency-size statistics that are power laws. Previously, we presented a new method for calculating probabilities for large events in systems such as these. This method counts the number of small events since the last large event and then converts this count into a probability by using a Weibull probability law. We applied this method to the calculation of large earthquake probabilities in California-Nevada, USA. In that study, we considered a fixed geographic region and assumed that all earthquakes within that region, large magnitudes as well as small, were perfectly correlated. In the present article, we extend this model to systems in which the events have a finite correlation length. We modify our previous results by employing the correlation function for near mean field systems having long-range interactions, an example of which is earthquakes and elastic interactions. We then construct an application of the method and show examples of computed earthquake probabilities.
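
    A sketch of the count-to-probability conversion described above; the Weibull shape and scale are placeholders, not values fitted to seismicity.

      import numpy as np

      def next_large_event_prob(n_small, horizon=30, k=1.5, lam=300.0):
          """Conditional probability that the next large event arrives within
          `horizon` more small events, given n_small counted since the last
          large one, using a Weibull law in the small-event count."""
          F = lambda n: 1.0 - np.exp(-((n / lam) ** k))
          return (F(n_small + horizon) - F(n_small)) / (1.0 - F(n_small))

      print(next_large_event_prob(250))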

  15. Reconstructing the prior probabilities of allelic phylogenies.

    PubMed Central

    Golding, G Brian

    2002-01-01

    In general when a phylogeny is reconstructed from DNA or protein sequence data, it makes use only of the probabilities of obtaining some phylogeny given a collection of data. It is also possible to determine the prior probabilities of different phylogenies. This information can be of use in analyzing the biological causes for the observed divergence of sampled taxa. Unusually "rare" topologies for a given data set may be indicative of different biological forces acting. A recursive algorithm is presented that calculates the prior probabilities of a phylogeny for different allelic samples and for different phylogenies. This method is a straightforward extension of Ewens' sample distribution. The probability of obtaining each possible sample according to Ewens' distribution is further subdivided into each of the possible phylogenetic topologies. These probabilities depend not only on the identity of the alleles and on 4Nμ (four times the effective population size times the neutral mutation rate) but also on the phylogenetic relationships among the alleles. Illustrations of the algorithm are given to demonstrate how different phylogenies are favored under different conditions. PMID:12072482

  16. Classical and Quantum Probability for Biologists - Introduction

    NASA Astrophysics Data System (ADS)

    Khrennikov, Andrei

    2010-01-01

    The aim of this review (oriented to biologists looking for applications of QM) is to provide a detailed comparative analysis of classical (Kolmogorovian) and quantum (Dirac-von Neumann) models. We will stress differences in the definition of conditional probability and, as a consequence, in the structures of matrices of transition probabilities, especially the condition of double stochasticity which arises naturally in QM. One of the most fundamental differences between the two models is the deformation of the classical formula of total probability (FTP), which plays an important role in statistics and decision making. An additional term appears in the QM-version of FTP, the so-called interference term. Finally, we discuss Bell's inequality and show that the common viewpoint that its violation induces either nonlocality or the "death of realism" has not been completely justified. For us it is merely a sign of non-Kolmogorovianity of probabilistic data collected in a few experiments with incompatible setups of measurement devices.
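
    The deformed formula of total probability mentioned above can be written out for a dichotomous variable A; this is the form used in Khrennikov's comparative expositions, reproduced here from memory rather than quoted from this article.

      \[
      P(B=b) \;=\; \sum_{i=1,2} P(A=a_i)\,P(B=b \mid A=a_i)
      \;+\; 2\cos\theta \,\sqrt{\prod_{i=1,2} P(A=a_i)\,P(B=b \mid A=a_i)}
      \]

    The cosine term is the interference term; when it vanishes, the classical Kolmogorov FTP is recovered.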

  17. Detection probability of EBPSK-MODEM system

    NASA Astrophysics Data System (ADS)

    Yao, Yu; Wu, Lenan

    2016-07-01

    Since the impacting filter-based receiver is able to transform phase modulation into an amplitude peak, a simple threshold decision can detect the Extended Binary Phase Shift Keying (EBPSK) modulated ranging signal in a noisy environment. In this paper, an analysis of the EBPSK-MODEM system output gives the probability density function for EBPSK modulated signals plus noise. The equation of detection probability (Pd) for fluctuating and non-fluctuating targets is deduced. A comparison of the Pd for the EBPSK-MODEM system and a pulse radar receiver is made, and some results are plotted. Moreover, the probability curves of such a system with several modulation parameters are analysed. When the modulation parameter is at least 6, the detection performance of the EBPSK-MODEM system exceeds that of the traditional radar system. In addition to theoretical considerations, computer simulations are provided to illustrate the performance.
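
    The threshold-decision idea can be illustrated with the generic Gaussian detector below; this is a textbook non-fluctuating-target model, not the paper's exact pd derivation for the impacting filter output:

    ```python
    from scipy.stats import norm

    def pd_threshold(snr_db, pfa):
        """Detection probability of a simple amplitude-threshold detector in
        unit-variance Gaussian noise: the threshold is set from the desired
        false-alarm rate, and Pd follows by shifting the Gaussian by the
        signal amplitude."""
        amplitude = 10 ** (snr_db / 20)        # signal amplitude in noise-sigma units
        threshold = norm.isf(pfa)              # threshold for the noise-only case
        return norm.sf(threshold - amplitude)  # P(signal + noise exceeds threshold)

    for snr in (6, 10, 13):
        print(snr, "dB ->", pd_threshold(snr, pfa=1e-4))
    ```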

  18. Local Directed Percolation Probability in Two Dimensions

    NASA Astrophysics Data System (ADS)

    Inui, Norio; Konno, Norio; Komatsu, Genichi; Kameoka, Koichi

    1998-01-01

    Using the series expansion method and Monte Carlo simulation, we study the directed percolation probability on the square lattice V_n^0 = {(x, y) ∈ Z²: x + y even, 0 ≤ y ≤ n, −y ≤ x ≤ y}. We calculate the local percolation probability P_n^l, defined as the connection probability between the origin and the site (0, n). The critical behavior of P_∞^l is clearly different from that of the global percolation probability P_∞^g, which is characterized by a critical exponent β_g. An analysis based on Padé approximants shows β_l = 2β_g. In addition, we find that the series expansion of P_{2n}^l can be expressed as a function of P_n^g.
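
    A rough Monte Carlo estimate of the local quantity P_n^l is easy to set up; the sketch below uses bond directed percolation on the diagonal lattice (the paper works mainly with series expansions, and its microscopic rule may differ):

    ```python
    import random

    def local_percolation_prob(p, n, trials=20000, rng=random.Random(1)):
        """Monte Carlo estimate of the probability that the origin connects to
        the single site (0, n) through open directed bonds, each bond open
        with probability p.  n must be even for (0, n) to lie on the lattice."""
        hits = 0
        for _ in range(trials):
            wet = {0}                                  # x-coordinates reachable at row y
            for y in range(n):
                nxt = set()
                for x in wet:
                    if rng.random() < p: nxt.add(x - 1)   # bond to (x-1, y+1)
                    if rng.random() < p: nxt.add(x + 1)   # bond to (x+1, y+1)
                wet = nxt
                if not wet: break                      # cluster died out
            hits += 0 in wet
        return hits / trials

    print(local_percolation_prob(p=0.7, n=16))
    ```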

  19. Sampling Quantum Nonlocal Correlations with High Probability

    NASA Astrophysics Data System (ADS)

    González-Guillén, C. E.; Jiménez, C. H.; Palazuelos, C.; Villanueva, I.

    2016-05-01

    It is well known that quantum correlations for bipartite dichotomic measurements are those of the form γ = (⟨u_i, v_j⟩)_{i,j=1}^n, where the vectors u_i and v_j are in the unit ball of a real Hilbert space. In this work we study the probability of the nonlocal nature of these correlations as a function of α = m/n, where the vectors are sampled according to the Haar measure on the unit sphere of R^m. In particular, we prove the existence of an α_0 > 0 such that if α ≤ α_0, γ is nonlocal with probability tending to 1 as n → ∞, while for α > 2, γ is local with probability tending to 1 as n → ∞.
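
    The sampling model itself is simple to reproduce: normalizing i.i.d. Gaussian vectors yields exactly the Haar (uniform) measure on the sphere. Deciding whether a sampled γ is local is a separate, hard problem that this sketch does not attempt:

    ```python
    import numpy as np

    def sample_correlation_matrix(n, m, rng=np.random.default_rng(0)):
        """Sample gamma = (<u_i, v_j>)_{i,j} with u_i, v_j Haar-uniform on the
        unit sphere of R^m, via normalized Gaussian vectors."""
        u = rng.standard_normal((n, m))
        v = rng.standard_normal((n, m))
        u /= np.linalg.norm(u, axis=1, keepdims=True)
        v /= np.linalg.norm(v, axis=1, keepdims=True)
        return u @ v.T

    gamma = sample_correlation_matrix(n=100, m=25)   # alpha = m/n = 0.25
    print(gamma.shape, float(np.abs(gamma).max()))
    ```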

  20. Explosion probability of unexploded ordnance: expert beliefs.

    PubMed

    MacDonald, Jacqueline Anne; Small, Mitchell J; Morgan, M G

    2008-08-01

    This article reports on a study to quantify expert beliefs about the explosion probability of unexploded ordnance (UXO). Some 1,976 sites at closed military bases in the United States are contaminated with UXO and are slated for cleanup, at an estimated cost of $15-140 billion. Because no available technology can guarantee 100% removal of UXO, information about explosion probability is needed to assess the residual risks of civilian reuse of closed military bases and to make decisions about how much to invest in cleanup. This study elicited probability distributions for the chance of UXO explosion from 25 experts in explosive ordnance disposal, all of whom have had field experience in UXO identification and deactivation. The study considered six different scenarios: three different types of UXO handled in two different ways (one involving children and the other involving construction workers). We also asked the experts to rank by sensitivity to explosion 20 different kinds of UXO found at a case study site at Fort Ord, California. We found that the experts do not agree about the probability of UXO explosion, with significant differences among experts in their mean estimates of explosion probabilities and in the amount of uncertainty that they express in their estimates. In three of the six scenarios, the divergence was so great that the average of all the expert probability distributions was statistically indistinguishable from a uniform (0, 1) distribution, suggesting that the sum of expert opinion provides no information at all about the explosion risk. The experts' opinions on the relative sensitivity to explosion of the 20 UXO items also diverged. The average correlation between rankings of any pair of experts was 0.41, which, statistically, is barely significant (p = 0.049) at the 95% confidence level. Thus, one expert's rankings provide little predictive information about another's rankings. The lack of consensus among experts suggests that empirical studies

  1. Monte Carlo simulation of scenario probability distributions

    SciTech Connect

    Glaser, R.

    1996-10-23

    Suppose a scenario of interest can be represented as a series of events. A final result R may then be viewed as the intersection of three events, A, B, and C. The probability of the result P(R) in this case is the product P(R) = P(A) P(B | A) P(C | A ∩ B). An expert may be reluctant to estimate P(R) as a whole yet agree to supply his notions of the component probabilities in the form of prior distributions. Each component prior distribution may be viewed as the stochastic characterization of the expert's uncertainty regarding the true value of the component probability. Mathematically, the component probabilities are treated as independent random variables and P(R) as their product; the induced prior distribution for P(R) is determined which characterizes the expert's uncertainty regarding P(R). It may be both convenient and adequate to approximate the desired distribution by Monte Carlo simulation. Software has been written for this task that allows a variety of component priors that experts with good engineering judgment might feel comfortable with. The priors are mostly based on so-called likelihood classes. The software permits an expert to choose for a given component event probability one of six types of prior distributions, and the expert specifies the parameter value(s) for that prior. Each prior is unimodal. The expert essentially decides where the mode is, how the probability is distributed in the vicinity of the mode, and how rapidly it attenuates away. Limiting and degenerate applications allow the expert to be vague or precise.
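
    The simulation step is straightforward. A minimal sketch, using Beta priors purely as stand-ins for whatever the expert would choose from the six families the abstract mentions:

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    N = 100_000

    # Hypothetical component priors standing in for an expert's choices.
    p_a  = rng.beta(8, 2, N)    # prior on P(A)
    p_ba = rng.beta(5, 5, N)    # prior on P(B | A)
    p_c  = rng.beta(2, 8, N)    # prior on P(C | A and B)

    p_r = p_a * p_ba * p_c      # induced sample of P(R) = P(A) P(B|A) P(C|A∩B)

    print("mean P(R) =", p_r.mean())
    print("90% interval:", np.quantile(p_r, [0.05, 0.95]))
    ```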

  2. Electric quadrupole transition probabilities for atomic lithium

    SciTech Connect

    Çelik, Gültekin; Gökçe, Yasin; Yıldız, Murat

    2014-05-15

    Electric quadrupole transition probabilities for atomic lithium have been calculated using the weakest bound electron potential model theory (WBEPMT). We have employed numerical non-relativistic Hartree–Fock wavefunctions for expectation values of radii and the necessary energy values have been taken from the compilation at NIST. The results obtained with the present method agree very well with the Coulomb approximation results given by Caves (1975). Moreover, electric quadrupole transition probability values not existing in the literature for some highly excited levels have been obtained using the WBEPMT.

  3. Non-Gaussian Photon Probability Distribution

    SciTech Connect

    Solomon, Benjamin T.

    2010-01-28

    This paper investigates the axiom that the photon's probability distribution is a Gaussian distribution. The Airy disc empirical evidence shows that the best fit, if not exact, distribution is a modified Gamma (mΓ) distribution (whose parameters are α = r and β = r/√u) in the plane orthogonal to the motion of the photon. This modified Gamma distribution is then used to reconstruct the probability distributions along the hypotenuse from the pinhole, the arc from the pinhole, and a line parallel to photon motion. This reconstruction shows that the photon's probability distribution is not a Gaussian function. However, under certain conditions, the distribution can appear to be Normal, thereby accounting for the success of quantum mechanics. This modified Gamma distribution changes with the shape of objects around it and thus explains how the observer alters the observation. This property therefore places additional constraints on quantum entanglement experiments. This paper shows that photon interaction is a multi-phenomena effect consisting of the probability to interact P_i, the probabilistic function, and the ability to interact A_i, the electromagnetic function. Splitting the probability function P_i from the electromagnetic function A_i enables the investigation of photon behavior from a purely probabilistic P_i perspective. The Probabilistic Interaction Hypothesis is proposed as a consistent method for handling the two different phenomena, the probability function P_i and the ability to interact A_i, thus redefining radiation shielding, stealth or cloaking, and invisibility as different effects of a single phenomenon P_i of the photon probability distribution. Sub-wavelength photon behavior is successfully modeled as a multi-phenomena behavior. The Probabilistic Interaction Hypothesis provides a good fit to Otoshi's (1972) microwave shielding, Schurig et al.'s (2006) microwave cloaking, and Oulton et al.'s (2008) sub

  4. Quantum probability and quantum decision-making.

    PubMed

    Yukalov, V I; Sornette, D

    2016-01-13

    A rigorous general definition of quantum probability is given, which is valid not only for elementary events but also for composite events, for operationally testable measurements as well as for inconclusive measurements, and also for non-commuting observables in addition to commutative observables. Our proposed definition of quantum probability makes it possible to describe quantum measurements and quantum decision-making on the same mathematical footing. Conditions are formulated for the case when quantum decision theory reduces to its classical counterpart and for the situation where the use of quantum decision theory is necessary. PMID:26621989

  5. Steering in spin tomographic probability representation

    NASA Astrophysics Data System (ADS)

    Man'ko, V. I.; Markovich, L. A.

    2016-09-01

    The steering property, known for the two-qubit state in terms of specific inequalities for the correlation function, is translated to the state of a qudit with spin j = 3/2. Since most steering detection inequalities are based on correlation functions, we introduce analogs of such functions for single qudit systems. The tomographic probability representation for the qudit states is applied. The connection between the correlation function in the two-qubit system and the single qudit is presented in an integral form with an intertwining kernel calculated explicitly in tomographic probability terms.

  6. Practical algorithmic probability: an image inpainting example

    NASA Astrophysics Data System (ADS)

    Potapov, Alexey; Scherbakov, Oleg; Zhdanov, Innokentii

    2013-12-01

    The possibility of practical application of algorithmic probability is analyzed on the example of the image inpainting problem, which corresponds precisely to the prediction problem. Such consideration is fruitful both for the theory of universal prediction and for practical image inpainting methods. Efficient application of algorithmic probability implies that its computation is essentially optimized for some specific data representation. In this paper, we considered one image representation, namely the spectral representation, for which an image inpainting algorithm is proposed based on the spectrum entropy criterion. This algorithm showed promising results in spite of the very simple representation. The same approach can be used for introducing an ALP-based criterion for more powerful image representations.
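
    The abstract does not spell out the spectrum entropy criterion, so the sketch below is only a guess at its flavor: score a candidate filling of the missing region by the Shannon entropy of its normalized power spectrum, preferring simpler (lower-entropy) spectra:

    ```python
    import numpy as np

    def spectral_entropy(patch):
        """Shannon entropy (bits) of the normalized 2-D power spectrum of an
        image patch; lower values indicate more compressible, smoother content."""
        power = np.abs(np.fft.fft2(patch)) ** 2
        p = power / power.sum()
        p = p[p > 0]
        return float(-(p * np.log2(p)).sum())

    smooth = np.outer(np.hanning(16), np.hanning(16))
    noise = np.random.default_rng(0).random((16, 16))
    print(spectral_entropy(smooth), spectral_entropy(noise))  # smooth patch scores lower
    ```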

  7. Flood frequency: expected and unexpected probabilities

    USGS Publications Warehouse

    Thomas, D.M.

    1976-01-01

    Flood-frequency curves may be defined either with or without an 'expected probability' adjustment; the two curves differ in the way that they attempt to average the time-sampling uncertainties. A curve with no adjustment is shown to estimate a median value of both discharge and frequency of occurrence, while an expected probability curve is shown to estimate a mean frequency of flood years. The attributes and constraints of the two types of curves for various uses are discussed.

  8. Quantum probabilities as Dempster-Shafer probabilities in the lattice of subspaces

    NASA Astrophysics Data System (ADS)

    Vourdas, A.

    2014-08-01

    The orthocomplemented modular lattice of subspaces L[H(d)] of a quantum system with d-dimensional Hilbert space H(d) is considered. A generalized additivity relation which holds for Kolmogorov probabilities is violated by quantum probabilities in the full lattice L[H(d)] (it is only valid within the Boolean subalgebras of L[H(d)]). This suggests the use of more general (than Kolmogorov) probability theories, and here the Dempster-Shafer probability theory is adopted. An operator D(H_1, H_2), which quantifies deviations from Kolmogorov probability theory, is introduced, and it is shown to be intimately related to the commutator of the projectors P(H_1), P(H_2) onto the subspaces H_1, H_2. As an application, it is shown that the proof of the inequalities of Clauser, Horne, Shimony, and Holt for a system of two spin-1/2 particles is valid for Kolmogorov probabilities, but it is not valid for Dempster-Shafer probabilities. The violation of these inequalities in experiments supports the interpretation of quantum probabilities as Dempster-Shafer probabilities.

  10. Developing a Model and Applications for Probabilities of Student Success: A Case Study of Predictive Analytics

    ERIC Educational Resources Information Center

    Calvert, Carol Elaine

    2014-01-01

    This case study relates to distance learning students on open access courses. It demonstrates the use of predictive analytics to generate a model of the probabilities of success and retention at different points, or milestones, in a student journey. A core set of explanatory variables has been established and their varying relative importance at…

  11. There Once Was a 9-Block ...--A Middle-School Design for Probability and Statistics

    ERIC Educational Resources Information Center

    Abrahamson, Dor; Janusz, Ruth M.; Wilensky, Uri

    2006-01-01

    ProbLab is a probability-and-statistics unit developed at the Center for Connected Learning and Computer-Based Modeling, Northwestern University. Students analyze the combinatorial space of the 9-block, a 3-by-3 grid of squares, in which each square can be either green or blue. All 512 possible 9-blocks are constructed and assembled in a "bar…
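
    The combinatorial space the students assemble by hand is small enough to enumerate directly; a quick check (letters stand in for the two colors) confirming the 512-block count and its binomial structure:

    ```python
    from itertools import product
    from collections import Counter

    # Enumerate all 2**9 = 512 possible 9-blocks (3-by-3 grids, each square
    # green 'G' or blue 'B') and tally them by number of green squares.
    blocks = list(product("GB", repeat=9))
    histogram = Counter(block.count("G") for block in blocks)

    print(len(blocks))                            # 512
    for k in range(10):
        print(k, "green squares:", histogram[k])  # binomial counts C(9, k)
    ```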

  12. A Computational Model of Word Segmentation from Continuous Speech Using Transitional Probabilities of Atomic Acoustic Events

    ERIC Educational Resources Information Center

    Rasanen, Okko

    2011-01-01

    Word segmentation from continuous speech is a difficult task that is faced by human infants when they start to learn their native language. Several studies indicate that infants might use several different cues to solve this problem, including intonation, linguistic stress, and transitional probabilities between subsequent speech sounds. In this…
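
    The transitional-probability cue itself reduces to a simple conditional frequency. The sketch below operates on ready-made syllables for illustration, whereas the paper's model works from atomic acoustic events discovered in the signal:

    ```python
    from collections import Counter

    def transitional_probabilities(units):
        """Forward transitional probabilities TP(x -> y) = count(xy) / count(x)
        over a flat stream of units (here: syllables)."""
        pair_counts = Counter(zip(units, units[1:]))
        unit_counts = Counter(units[:-1])
        return {(x, y): c / unit_counts[x] for (x, y), c in pair_counts.items()}

    stream = "tu pi ro go la bu tu pi ro pa bi ku go la bu".split()
    for pair, tp in sorted(transitional_probabilities(stream).items()):
        print(pair, round(tp, 2))   # within-word pairs score higher than boundary pairs
    ```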

  13. The Independent Effects of Phonotactic Probability and Neighbourhood Density on Lexical Acquisition by Preschool Children

    ERIC Educational Resources Information Center

    Storkel, Holly L.; Lee, Su-Yeon

    2011-01-01

    The goal of this research was to disentangle effects of phonotactic probability, the likelihood of occurrence of a sound sequence, and neighbourhood density, the number of phonologically similar words, in lexical acquisition. Two word-learning experiments were conducted with 4-year-old children. Experiment 1 manipulated phonotactic probability…

  14. Math Academy: Are You Game? Explorations in Probability. Supplemental Math Materials for Grades 3-6

    ERIC Educational Resources Information Center

    Rimbey, Kimberly

    2007-01-01

    Created by teachers for teachers, the Math Academy tools and activities included in this booklet were designed to create hands-on activities and a fun learning environment for teaching mathematics to students. This booklet contains the themed program "Are You Game? Math Academy--Explorations in Probability," which teachers can use to…

  15. Technique for Evaluating Multiple Probability Occurrences /TEMPO/

    NASA Technical Reports Server (NTRS)

    Mezzacappa, M. A.

    1970-01-01

    Technique is described for adjustment of engineering response information by broadening the application of statistical subjective stimuli theory. The study is specifically concerned with a mathematical evaluation of the expected probability of relative occurrence which can be identified by comparison rating techniques.

  16. Assessing Schematic Knowledge of Introductory Probability Theory

    ERIC Educational Resources Information Center

    Birney, Damian P.; Fogarty, Gerard J.; Plank, Ashley

    2005-01-01

    The ability to identify schematic knowledge is an important goal for both assessment and instruction. In the current paper, schematic knowledge of statistical probability theory is explored from the declarative-procedural framework using multiple methods of assessment. A sample of 90 undergraduate introductory statistics students was required to…

  17. Automatic Item Generation of Probability Word Problems

    ERIC Educational Resources Information Center

    Holling, Heinz; Bertling, Jonas P.; Zeuch, Nina

    2009-01-01

    Mathematical word problems represent a common item format for assessing student competencies. Automatic item generation (AIG) is an effective way of constructing many items with predictable difficulties, based on a set of predefined task parameters. The current study presents a framework for the automatic generation of probability word problems…

  18. Phonotactic Probability Effects in Children Who Stutter

    ERIC Educational Resources Information Center

    Anderson, Julie D.; Byrd, Courtney T.

    2008-01-01

    Purpose: The purpose of this study was to examine the influence of "phonotactic probability", which is the frequency of different sound segments and segment sequences, on the overall fluency with which words are produced by preschool children who stutter (CWS) as well as to determine whether it has an effect on the type of stuttered disfluency…

  19. Estimating the Probability of Negative Events

    ERIC Educational Resources Information Center

    Harris, Adam J. L.; Corner, Adam; Hahn, Ulrike

    2009-01-01

    How well we are attuned to the statistics of our environment is a fundamental question in understanding human behaviour. It seems particularly important to be able to provide accurate assessments of the probability with which negative events occur so as to guide rational choice of preventative actions. One question that arises here is whether or…

  20. Large Deviations: Advanced Probability for Undergrads

    ERIC Educational Resources Information Center

    Rolls, David A.

    2007-01-01

    In the branch of probability called "large deviations," rates of convergence (e.g. of the sample mean) are considered. The theory makes use of the moment generating function. So, particularly for sums of independent and identically distributed random variables, the theory can be made accessible to senior undergraduates after a first course in…
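
    The central result the course builds toward can be stated in one line; a standard form of Cramér's theorem in the usual textbook notation (not taken from the article):

    ```latex
    % For i.i.d. X_1, ..., X_n with mean mu and moment generating function
    % M(t) = E[e^{tX}], the sample mean satisfies, for a > mu,
    P\left(\tfrac{1}{n}\textstyle\sum_{i=1}^{n} X_i \ge a\right) \approx e^{-n I(a)},
    \qquad
    I(a) = \sup_{t \ge 0}\,\bigl(ta - \log M(t)\bigr).
    ```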

  1. Simplicity and Probability in Causal Explanation

    ERIC Educational Resources Information Center

    Lombrozo, Tania

    2007-01-01

    What makes some explanations better than others? This paper explores the roles of simplicity and probability in evaluating competing causal explanations. Four experiments investigate the hypothesis that simpler explanations are judged both better and more likely to be true. In all experiments, simplicity is quantified as the number of causes…

  2. Exploring Concepts in Probability: Using Graphics Calculators

    ERIC Educational Resources Information Center

    Ghosh, Jonaki

    2004-01-01

    This article describes a project in which certain key concepts in probability were explored using graphics calculators with year 10 students. The lessons were conducted in the regular classroom where students were provided with a Casio CFX 9850 GB PLUS graphics calculator with which they were familiar from year 9. The participants in the…

  3. The Smart Potential behind Probability Matching

    ERIC Educational Resources Information Center

    Gaissmaier, Wolfgang; Schooler, Lael J.

    2008-01-01

    Probability matching is a classic choice anomaly that has been studied extensively. While many approaches assume that it is a cognitive shortcut driven by cognitive limitations, recent literature suggests that it is not a strategy per se, but rather another outcome of people's well-documented misperception of randomness. People search for patterns…

  4. Monte Carlo, Probability, Algebra, and Pi.

    ERIC Educational Resources Information Center

    Hinders, Duane C.

    1981-01-01

    The uses of random number generators are illustrated in three ways: (1) the solution of a probability problem using a coin; (2) the solution of a system of simultaneous linear equations using a die; and (3) the approximation of pi using darts. (MP)
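
    The third activity translates directly into a few lines of code; a minimal dart-throwing version:

    ```python
    import random

    def approx_pi(darts=1_000_000, rng=random.Random(0)):
        """Throw darts uniformly at the unit square; the fraction landing inside
        the inscribed quarter circle estimates pi/4."""
        inside = sum(rng.random() ** 2 + rng.random() ** 2 <= 1.0 for _ in range(darts))
        return 4 * inside / darts

    print(approx_pi())   # close to 3.14159 for large dart counts
    ```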

  5. On the bound of first excursion probability

    NASA Technical Reports Server (NTRS)

    Yang, J. N.

    1969-01-01

    A method has been developed to improve the lower bound of the first excursion probability; it can be applied to problems with either constant or time-dependent barriers. The method requires knowledge of the joint density function of the random process at two arbitrary instants.

  6. Quantum temporal probabilities in tunneling systems

    NASA Astrophysics Data System (ADS)

    Anastopoulos, Charis; Savvidou, Ntina

    2013-09-01

    We study the temporal aspects of quantum tunneling as manifested in time-of-arrival experiments in which the detected particle tunnels through a potential barrier. In particular, we present a general method for constructing temporal probabilities in tunneling systems that (i) defines 'classical' time observables for quantum systems and (ii) applies to relativistic particles interacting through quantum fields. We show that the relevant probabilities are defined in terms of specific correlation functions of the quantum field associated with tunneling particles. We construct a probability distribution with respect to the time of particle detection that contains all information about the temporal aspects of the tunneling process. In specific cases, this probability distribution leads to the definition of a delay time that, for parity-symmetric potentials, reduces to the phase time of Bohm and Wigner. We apply our results to piecewise constant potentials, by deriving the appropriate junction conditions on the points of discontinuity. For the double square potential, in particular, we demonstrate the existence of (at least) two physically relevant time parameters, the delay time and a decay rate that describes the escape of particles trapped in the inter-barrier region. Finally, we propose a resolution to the paradox of apparent superluminal velocities for tunneling particles. We demonstrate that the idea of faster-than-light speeds in tunneling follows from an inadmissible use of classical reasoning in the description of quantum systems.

  7. Conceptual Variation and Coordination in Probability Reasoning

    ERIC Educational Resources Information Center

    Nilsson, Per

    2009-01-01

    This study investigates students' conceptual variation and coordination among theoretical and experimental interpretations of probability. In the analysis we follow how Swedish students (12-13 years old) interact with a dice game, specifically designed to offer the students opportunities to elaborate on the logic of sample space,…

  8. Teaching Mathematics with Technology: Probability Simulations.

    ERIC Educational Resources Information Center

    Bright, George W.

    1989-01-01

    Discussed is the use of probability simulations in a mathematics classroom. Computer simulations using regular dice and special dice are described. Sample programs used to generate 100 rolls of a pair of dice in the BASIC and Logo languages are provided. (YP)
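
    A modern analogue of the article's BASIC/Logo listings, simulating 100 rolls of a pair of dice and tabulating the sums:

    ```python
    from collections import Counter
    import random

    rng = random.Random(0)
    sums = Counter(rng.randint(1, 6) + rng.randint(1, 6) for _ in range(100))
    for total in range(2, 13):
        print(f"{total:2d}", "#" * sums[total])   # crude text histogram of the sums
    ```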

  9. Probability in Action: The Red Traffic Light

    ERIC Educational Resources Information Center

    Shanks, John A.

    2007-01-01

    Emphasis on problem solving in mathematics has gained considerable attention in recent years. While statistics teaching has always been problem driven, the same cannot be said for the teaching of probability where discrete examples involving coins and playing cards are often the norm. This article describes an application of simple probability…

  10. Confusion between Odds and Probability, a Pandemic?

    ERIC Educational Resources Information Center

    Fulton, Lawrence V.; Mendez, Francis A.; Bastian, Nathaniel D.; Musal, R. Muzaffer

    2012-01-01

    This manuscript discusses the common confusion between the terms probability and odds. To emphasize the importance and responsibility of being meticulous in the dissemination of information and knowledge, this manuscript reveals five cases of sources of inaccurate statistical language imbedded in the dissemination of information to the general…

  11. Posterior Probabilities for a Consensus Ordering.

    ERIC Educational Resources Information Center

    Fligner, Michael A.; Verducci, Joseph S.

    1990-01-01

    The concept of consensus ordering is defined, and formulas for exact and approximate posterior probabilities for consensus ordering are developed under the assumption of a generalized Mallows' model with a diffuse conjugate prior. These methods are applied to a data set concerning 98 college students. (SLD)

  12. Five-Parameter Bivariate Probability Distribution

    NASA Technical Reports Server (NTRS)

    Tubbs, J.; Brewer, D.; Smith, O. W.

    1986-01-01

    NASA technical memorandum presents four papers about five-parameter bivariate gamma class of probability distributions. With some overlap of subject matter, papers address different aspects of theories of these distributions and use in forming statistical models of such phenomena as wind gusts. Provides acceptable results for defining constraints in problems designing aircraft and spacecraft to withstand large wind-gust loads.

  13. Independent Events in Elementary Probability Theory

    ERIC Educational Resources Information Center

    Csenki, Attila

    2011-01-01

    In Probability and Statistics taught to mathematicians as a first introduction or to a non-mathematical audience, joint independence of events is introduced by requiring that the multiplication rule is satisfied. The following statement is usually tacitly assumed to hold (and, at best, intuitively motivated): If the n events E_1,…

  14. Probability distribution functions in turbulent convection

    NASA Technical Reports Server (NTRS)

    Balachandar, S.; Sirovich, L.

    1991-01-01

    Results of an extensive investigation of probability distribution functions (pdfs) for Rayleigh-Benard convection, in hard turbulence regime, are presented. It is shown that the pdfs exhibit a high degree of internal universality. In certain cases this universality is established within two Kolmogorov scales of a boundary. A discussion of the factors leading to the universality is presented.

  15. Monte Carlo methods to calculate impact probabilities

    NASA Astrophysics Data System (ADS)

    Rickman, H.; Wiśniowski, T.; Wajer, P.; Gabryszewski, R.; Valsecchi, G. B.

    2014-09-01

    Context. Unraveling the events that took place in the solar system during the period known as the late heavy bombardment requires the interpretation of the cratered surfaces of the Moon and terrestrial planets. This, in turn, requires good estimates of the statistical impact probabilities for different source populations of projectiles, a subject that has received relatively little attention, since the works of Öpik (1951, Proc. R. Irish Acad. Sect. A, 54, 165) and Wetherill (1967, J. Geophys. Res., 72, 2429). Aims: We aim to work around the limitations of the Öpik and Wetherill formulae, which are caused by singularities due to zero denominators under special circumstances. Using modern computers, it is possible to make good estimates of impact probabilities by means of Monte Carlo simulations, and in this work, we explore the available options. Methods: We describe three basic methods to derive the average impact probability for a projectile with a given semi-major axis, eccentricity, and inclination with respect to a target planet on an elliptic orbit. One is a numerical averaging of the Wetherill formula; the next is a Monte Carlo super-sizing method using the target's Hill sphere. The third uses extensive minimum orbit intersection distance (MOID) calculations for a Monte Carlo sampling of potentially impacting orbits, along with calculations of the relevant interval for the timing of the encounter allowing collision. Numerical experiments are carried out for an intercomparison of the methods and to scrutinize their behavior near the singularities (zero relative inclination and equal perihelion distances). Results: We find an excellent agreement between all methods in the general case, while there appear large differences in the immediate vicinity of the singularities. With respect to the MOID method, which is the only one that does not involve simplifying assumptions and approximations, the Wetherill averaging impact probability departs by diverging toward

  16. The albedo effect on neutron transmission probability.

    PubMed

    Khanouchi, A; Sabir, A; Boulkheir, M; Ichaoui, R; Ghassoun, J; Jehouani, A

    1997-01-01

    The aim of this study is to evaluate the albedo effect on the neutron transmission probability through slab shields. For this reason we have considered an infinite homogeneous slab having a fixed thickness equal to 20 lambda (lambda is the mean free path of the neutron in the slab). This slab is characterized by the factor Ps (scattering probability) and contains a vacuum channel which is formed by two horizontal parts and an inclined one (David, M. C. (1962) Ducts and Voids in Shields. In Reactor Handbook, Vol. III, Part B, p. 166). The thickness of the vacuum channel is taken equal to 2 lambda. An infinite plane source of neutrons is placed on the first face of the slab (left face) and detectors, having windows equal to 2 lambda, are placed on the second face of the slab (right face). Neutron histories are sampled by the Monte Carlo method (Booth, T. E. and Hendricks, J. S. (1994) Nuclear Technology 5) using exponential biasing in order to increase the Monte Carlo calculation efficiency (Levitt, L. B. (1968) Nuclear Science and Engineering 31, 500-504; Jehouani, A., Ghassoun, J. and Abouker, A. (1994) In Proceedings of the 6th International Symposium on Radiation Physics, Rabat, Morocco), and we have applied the statistical weight method, which supposes that the neutron is born at the source with a unit statistical weight and after each collision this weight is corrected. For different values of the scattering probability and for different slopes of the inclined part of the channel, we have calculated the neutron transmission probability for different positions of the detectors versus the albedo at the vacuum channel-medium interface. Some analytical representations are also presented for these transmission probabilities. PMID:9463883

  18. Neural representation of probabilities for Bayesian inference.

    PubMed

    Rich, Dylan; Cazettes, Fanny; Wang, Yunyan; Peña, José Luis; Fischer, Brian J

    2015-04-01

    Bayesian models are often successful in describing perception and behavior, but the neural representation of probabilities remains in question. There are several distinct proposals for the neural representation of probabilities, but they have not been directly compared in an example system. Here we consider three models: a non-uniform population code where the stimulus-driven activity and distribution of preferred stimuli in the population represent a likelihood function and a prior, respectively; the sampling hypothesis which proposes that the stimulus-driven activity over time represents a posterior probability and that the spontaneous activity represents a prior; and the class of models which propose that a population of neurons represents a posterior probability in a distributed code. It has been shown that the non-uniform population code model matches the representation of auditory space generated in the owl's external nucleus of the inferior colliculus (ICx). However, the alternative models have not been tested, nor have the three models been directly compared in any system. Here we tested the three models in the owl's ICx. We found that spontaneous firing rate and the average stimulus-driven response of these neurons were not consistent with predictions of the sampling hypothesis. We also found that neural activity in ICx under varying levels of sensory noise did not reflect a posterior probability. On the other hand, the responses of ICx neurons were consistent with the non-uniform population code model. We further show that Bayesian inference can be implemented in the non-uniform population code model using one spike per neuron when the population is large and is thus able to support the rapid inference that is necessary for sound localization. PMID:25561333

  19. Learning to Learn

    ERIC Educational Resources Information Center

    Roberts, Dominic

    2010-01-01

    Everyone learns in a different way--for some, learning comes naturally, but for others it can be a real struggle. Many negative experiences of education are a result of individuals not knowing how they learn most effectively, or believing that they do not have the capacity to learn well. Addressing the issues of how individuals learn can help…

  20. Using High-Probability Foods to Increase the Acceptance of Low-Probability Foods

    ERIC Educational Resources Information Center

    Meier, Aimee E.; Fryling, Mitch J.; Wallace, Michele D.

    2012-01-01

    Studies have evaluated a range of interventions to treat food selectivity in children with autism and related developmental disabilities. The high-probability instructional sequence is one intervention with variable results in this area. We evaluated the effectiveness of a high-probability sequence using 3 presentations of a preferred food on…

  1. Killeen's Probability of Replication and Predictive Probabilities: How to Compute, Use, and Interpret Them

    ERIC Educational Resources Information Center

    Lecoutre, Bruno; Lecoutre, Marie-Paule; Poitevineau, Jacques

    2010-01-01

    P. R. Killeen's (2005a) probability of replication ("p_rep") of an experimental result is the fiducial Bayesian predictive probability of finding a same-sign effect in a replication of an experiment. "p_rep" is now routinely reported in "Psychological Science" and has also begun to appear in other journals. However, there is…

  2. VOLCANIC RISK ASSESSMENT - PROBABILITY AND CONSEQUENCES

    SciTech Connect

    G.A. Valentine; F.V. Perry; S. Dartevelle

    2005-08-26

    Risk is the product of the probability and consequences of an event. Both of these must be based upon sound science that integrates field data, experiments, and modeling, but must also be useful to decision makers who likely do not understand all aspects of the underlying science. We review a decision framework used in many fields such as performance assessment for hazardous and/or radioactive waste disposal sites that can serve to guide the volcanological community towards integrated risk assessment. In this framework the underlying scientific understanding of processes that affect probability and consequences drive the decision-level results, but in turn these results can drive focused research in areas that cause the greatest level of uncertainty at the decision level. We review two examples of the determination of volcanic event probability: (1) probability of a new volcano forming at the proposed Yucca Mountain radioactive waste repository, and (2) probability that a subsurface repository in Japan would be affected by the nearby formation of a new stratovolcano. We also provide examples of work on consequences of explosive eruptions, within the framework mentioned above. These include field-based studies aimed at providing data for "closure" of wall rock erosion terms in a conduit flow model, predictions of dynamic pressure and other variables related to damage by pyroclastic flow into underground structures, and vulnerability criteria for structures subjected to conditions of explosive eruption. Process models (e.g., multiphase flow) are important for testing the validity or relative importance of possible scenarios in a volcanic risk assessment. We show how time-dependent multiphase modeling of explosive "eruption" of basaltic magma into an open tunnel (drift) at the Yucca Mountain repository provides insight into proposed scenarios that include the development of secondary pathways to the Earth's surface. Addressing volcanic risk within a decision

  3. From data to probability densities without histograms

    NASA Astrophysics Data System (ADS)

    Berg, Bernd A.; Harris, Robert C.

    2008-09-01

    When one deals with data drawn from continuous variables, a histogram is often inadequate to display their probability density. It deals inefficiently with statistical noise, and binsizes are free parameters. In contrast to that, the empirical cumulative distribution function (obtained after sorting the data) is parameter free. But it is a step function, so that its differentiation does not give a smooth probability density. Based on Fourier series expansion and Kolmogorov tests, we introduce a simple method, which overcomes this problem. Error bars on the estimated probability density are calculated using a jackknife method. We give several examples and provide computer code reproducing them. You may want to look at the corresponding figures 4 to 9 first. Program summary: Program title: cdf_to_pd Catalogue identifier: AEBC_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEBC_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 2758 No. of bytes in distributed program, including test data, etc.: 18 594 Distribution format: tar.gz Programming language: Fortran 77 Computer: Any capable of compiling and executing Fortran code Operating system: Any capable of compiling and executing Fortran code Classification: 4.14, 9 Nature of problem: When one deals with data drawn from continuous variables, a histogram is often inadequate to display the probability density. It deals inefficiently with statistical noise, and binsizes are free parameters. In contrast to that, the empirical cumulative distribution function (obtained after sorting the data) is parameter free. But it is a step function, so that its differentiation does not give a smooth probability density. Solution method: Based on Fourier series expansion and Kolmogorov tests, we introduce a simple method, which
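
    The published program is Fortran 77; the sketch below reproduces only the core idea (map the data to [0, 1], expand the deviation of the empirical CDF from the uniform CDF in a sine series, differentiate the series) with a fixed term count, whereas the paper selects the truncation with Kolmogorov tests and attaches jackknife error bars:

    ```python
    import numpy as np

    def fourier_density(data, n_terms=8, grid=512):
        """Smooth density estimate on [0, 1] from the empirical CDF via a
        truncated sine-series expansion of ECDF(x) - x."""
        x = np.sort(np.asarray(data, dtype=float))
        u = (x - x[0]) / (x[-1] - x[0])                 # map sample onto [0, 1]
        t = np.linspace(0.0, 1.0, grid)
        ecdf = np.searchsorted(u, t, side="right") / len(u)
        r = ecdf - t                                    # deviation from uniform CDF
        k = np.arange(1, n_terms + 1)
        d = 2.0 * np.trapz(r[None, :] * np.sin(np.pi * k[:, None] * t), t, axis=1)
        density = 1.0 + (d[:, None] * np.pi * k[:, None]
                         * np.cos(np.pi * k[:, None] * t)).sum(axis=0)
        return t, density                               # density on the [0, 1] scale

    t, f = fourier_density(np.random.default_rng(1).normal(size=2000))
    print(float(f.max()), float(f.min()))   # peaked near the middle for Gaussian input
    ```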

  4. Nuclear data uncertainties: I, Basic concepts of probability

    SciTech Connect

    Smith, D.L.

    1988-12-01

    Some basic concepts of probability theory are presented from a nuclear-data perspective, in order to provide a foundation for thorough understanding of the role of uncertainties in nuclear data research. Topics included in this report are: events, event spaces, calculus of events, randomness, random variables, random-variable distributions, intuitive and axiomatic probability, calculus of probability, conditional probability and independence, probability distributions, binomial and multinomial probability, Poisson and interval probability, normal probability, the relationships existing between these probability laws, and Bayes' theorem. This treatment emphasizes the practical application of basic mathematical concepts to nuclear data research, and it includes numerous simple examples. 34 refs.

  5. Earthquake probabilities: theoretical assessments and reality

    NASA Astrophysics Data System (ADS)

    Kossobokov, V. G.

    2013-12-01

    It is common knowledge that earthquakes are complex phenomena whose classification and sizing remain serious problems in contemporary seismology. In general, their frequency-magnitude distributions exhibit power-law scaling. This scaling differs significantly when different time and/or space domains are considered. At the scale of a particular earthquake rupture zone, the frequency of similar-size events is usually estimated to be about once in several hundred years. Evidently, contemporary seismology does not possess enough reported instrumental data for any reliable quantification of an earthquake probability at a given place of an expected event. Regretfully, most of the state-of-the-art theoretical approaches to assessing the probability of seismic events are based on trivial (e.g. Poisson, periodic, etc.) or, conversely, delicately designed (e.g. STEP, ETAS, etc.) models of earthquake sequences. Some of these models are evidently erroneous, some can be rejected by the existing statistics, and some are hardly testable in our lifetime. Nevertheless, such probabilistic counts, including seismic hazard assessment and earthquake forecasting, when used in practice eventually lead to scientifically groundless advice communicated to decision makers and to inappropriate decisions. As a result, the population of seismic regions continues facing unexpected risk and losses. The international project Global Earthquake Model (GEM) is on the wrong track if it continues to base seismic risk estimates on the standard, mainly probabilistic, methodology of assessing seismic hazard. It is generally accepted that earthquakes are infrequent, low-probability events. However, they keep occurring at earthquake-prone areas with 100% certainty. Given the expectation of a seismic event once per hundred years, the daily probability of occurrence on a certain date may range from 0 to 100% depending on the choice of probability space (which is yet unknown and, therefore, made by a subjective lucky chance

  6. Approximate probability distributions of the master equation

    NASA Astrophysics Data System (ADS)

    Thomas, Philipp; Grima, Ramon

    2015-07-01

    Master equations are common descriptions of mesoscopic systems. Analytical solutions to these equations can rarely be obtained. We here derive an analytical approximation of the time-dependent probability distribution of the master equation using orthogonal polynomials. The solution is given in two alternative formulations: a series with continuous and a series with discrete support, both of which can be systematically truncated. While both approximations satisfy the system size expansion of the master equation, the continuous distribution approximations become increasingly negative and tend to oscillations with increasing truncation order. In contrast, the discrete approximations rapidly converge to the underlying non-Gaussian distributions. The theory is shown to lead to particularly simple analytical expressions for the probability distributions of molecule numbers in metabolic reactions and gene expression systems.

  7. Conflict Probability Estimation for Free Flight

    NASA Technical Reports Server (NTRS)

    Paielli, Russell A.; Erzberger, Heinz

    1996-01-01

    The safety and efficiency of free flight will benefit from automated conflict prediction and resolution advisories. Conflict prediction is based on trajectory prediction; however, the farther in advance the prediction is made, the less certain it is. An estimate is therefore needed of the probability that a conflict will occur, given a pair of predicted trajectories and their levels of uncertainty. A method is developed in this paper to estimate that conflict probability. The trajectory prediction errors are modeled as normally distributed, and the two error covariances for an aircraft pair are combined into a single equivalent covariance of the relative position. A coordinate transformation is then used to derive an analytical solution. Numerical examples and Monte Carlo validation are presented.
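
    The covariance-combination step can be illustrated numerically. The paper derives an analytical solution after a coordinate transformation; the sketch below instead integrates the combined Gaussian over a circular conflict zone on a grid, with entirely hypothetical numbers:

    ```python
    import numpy as np

    def conflict_probability(rel_mean, cov1, cov2, radius, n=400):
        """Probability that the relative position of two aircraft falls inside a
        disk of the given radius: the two prediction-error covariances are summed
        into one relative-position covariance, and the 2-D Gaussian is integrated
        over the disk by brute-force quadrature."""
        cov = np.asarray(cov1) + np.asarray(cov2)       # combined relative covariance
        inv, det = np.linalg.inv(cov), np.linalg.det(cov)
        xs = np.linspace(-radius, radius, n)
        xx, yy = np.meshgrid(xs, xs)
        d = np.stack([xx - rel_mean[0], yy - rel_mean[1]], axis=-1)
        pdf = np.exp(-0.5 * np.einsum("...i,ij,...j->...", d, inv, d)) \
              / (2 * np.pi * np.sqrt(det))
        cell = (xs[1] - xs[0]) ** 2
        return float((pdf * (xx**2 + yy**2 <= radius**2)).sum() * cell)

    # Hypothetical case: 5 nmi protection radius, predicted miss distance 4 nmi.
    print(conflict_probability([4.0, 0.0], np.diag([4.0, 1.0]), np.diag([2.0, 1.0]), 5.0))
    ```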

  8. Continuum ionization transition probabilities of atomic oxygen

    NASA Technical Reports Server (NTRS)

    Samson, J. R.; Petrosky, V. E.

    1973-01-01

    The technique of photoelectron spectroscopy was used to obtain the relative continuum transition probabilities of atomic oxygen at 584 Å for transitions from the 3P ground state into the 4S, 2D, and 2P states of the ion. Transition probability ratios for the 2D and 2P states relative to the 4S state of the ion are 1.57 ± 0.14 and 0.82 ± 0.07, respectively. In addition, transitions from the excited O2(a1Δg) state into O2+(2Φu and 2Δg) were observed. The adiabatic ionization potential of O2+(2Δg) was measured as 18.803 ± 0.006 eV.

  9. Approaches to Evaluating Probability of Collision Uncertainty

    NASA Technical Reports Server (NTRS)

    Hejduk, Matthew D.; Johnson, Lauren C.

    2016-01-01

    While the two-dimensional probability of collision (Pc) calculation has served as the main input to conjunction analysis risk assessment for over a decade, it has done this mostly as a point estimate, with relatively little effort made to produce confidence intervals on the Pc value based on the uncertainties in the inputs. The present effort seeks to try to carry these uncertainties through the calculation in order to generate a probability density of Pc results rather than a single average value. Methods for assessing uncertainty in the primary and secondary objects' physical sizes and state estimate covariances, as well as a resampling approach to reveal the natural variability in the calculation, are presented; and an initial proposal for operationally-useful display and interpretation of these data for a particular conjunction is given.

  10. Estimation of transition probabilities of credit ratings

    NASA Astrophysics Data System (ADS)

    Peng, Gan Chew; Hin, Pooi Ah

    2015-12-01

    The present research is based on the quarterly credit ratings of ten companies over 15 years taken from the database of the Taiwan Economic Journal. The components of the vector m_i = (m_i1, m_i2, …, m_i10) denote the credit ratings of the ten companies in the i-th quarter. The vector m_i+1 in the next quarter is modelled to be dependent on the vector m_i via a conditional distribution which is derived from a 20-dimensional power-normal mixture distribution. The transition probability P_kl(i, j) for getting m_i+1,j = l given that m_i,j = k is then computed from the conditional distribution. It is found that the variation of the transition probability P_kl(i, j) as i varies is able to give an indication of the possible transition of the credit rating of the j-th company in the near future.
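
    For contrast with the paper's power-normal mixture approach, the plain count-based Markov estimator of quarter-to-quarter rating transitions (pooled over companies and time, hence without the paper's time variation) looks like this:

    ```python
    import numpy as np

    def empirical_transition_matrix(ratings, n_states):
        """Count-based estimate of transition probabilities P(k -> l) between
        consecutive quarters, pooled over all companies' rating series."""
        counts = np.zeros((n_states, n_states))
        for series in ratings:                   # one rating series per company
            for k, l in zip(series, series[1:]):
                counts[k, l] += 1
        rows = counts.sum(axis=1, keepdims=True)
        return np.divide(counts, rows, out=np.zeros_like(counts), where=rows > 0)

    demo = [[2, 2, 3, 3, 2], [1, 2, 2, 2, 3]]    # toy quarterly ratings, states 0..3
    print(empirical_transition_matrix(demo, n_states=4))
    ```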

  11. Multiple model cardinalized probability hypothesis density filter

    NASA Astrophysics Data System (ADS)

    Georgescu, Ramona; Willett, Peter

    2011-09-01

    The Probability Hypothesis Density (PHD) filter propagates the first-moment approximation to the multi-target Bayesian posterior distribution while the Cardinalized PHD (CPHD) filter propagates both the posterior likelihood of (an unlabeled) target state and the posterior probability mass function of the number of targets. Extensions of the PHD filter to the multiple model (MM) framework have been published and were implemented either with a Sequential Monte Carlo or a Gaussian Mixture approach. In this work, we introduce the multiple model version of the more elaborate CPHD filter. We present the derivation of the prediction and update steps of the MMCPHD particularized for the case of two target motion models and proceed to show that in the case of a single model, the new MMCPHD equations reduce to the original CPHD equations.

  12. Volcano shapes, entropies, and eruption probabilities

    NASA Astrophysics Data System (ADS)

    Gudmundsson, Agust; Mohajeri, Nahid

    2014-05-01

    We propose that the shapes of polygenetic volcanic edifices reflect the shapes of the associated probability distributions of eruptions. In this view, the peak of a given volcanic edifice coincides roughly with the peak of the probability (or frequency) distribution of its eruptions. The broadness and slopes of the edifices vary widely, however. The shapes of volcanic edifices can be approximated by various distributions, either discrete (binning or histogram approximation) or continuous. For a volcano shape (profile) approximated by a normal curve, for example, the broadness would be reflected in its standard deviation (spread). Entropy (S) of a discrete probability distribution is a measure of the absolute uncertainty as to the next outcome/message: in this case, the uncertainty as to time and place of the next eruption. A uniform discrete distribution (all bins of equal height), representing a flat volcanic field or zone, has the largest entropy or uncertainty. For continuous distributions, we use differential entropy, which is a measure of relative uncertainty, or uncertainty change, rather than absolute uncertainty. Volcano shapes can be approximated by various distributions, from which the entropies and thus the uncertainties as regards future eruptions can be calculated. We use the Gibbs-Shannon formula for the discrete entropies and the analogous general formula for the differential entropies and compare their usefulness for assessing the probabilities of eruptions in volcanoes. We relate the entropies to the work done by the volcano during an eruption using the Helmholtz free energy. Many factors other than the frequency of eruptions determine the shape of a volcano. These include erosion, landslides, and the properties of the erupted materials (including their angle of repose). The exact functional relation between the volcano shape and the eruption probability distribution must be explored for individual volcanoes but, once established, can be used to

  13. Finding Possibility and Probability Lessons in Sports

    ERIC Educational Resources Information Center

    Busadee, Nutjira; Laosinchai, Parames; Panijpan, Bhinyo

    2011-01-01

    Today's students demand that their lessons be real, interesting, relevant, and manageable. Mathematics is one subject that eludes many students partly because its traditional presentation lacks those elements that encourage students to learn. Easy accessibility through electronic media has exposed people all over the world to a variety of sports…

  14. Probability with Collaborative Data Visualization Software

    ERIC Educational Resources Information Center

    Willis, Melinda B. N.; Hay, Sue; Martin, Fred G.; Scribner-MacLean, Michelle; Rudnicki, Ivan

    2015-01-01

    Mathematics teachers continually look for ways to make the learning of mathematics more active and engaging. Hands-on activities, in particular, have been demonstrated to improve student engagement and understanding in mathematics classes. Likewise, many scholars have emphasized the growing importance of giving students experience with the…

  15. Probability Explorations in a Multicultural Context

    ERIC Educational Resources Information Center

    Naresh, Nirmala; Harper, Suzanne R.; Keiser, Jane M.; Krumpe, Norm

    2014-01-01

    Mathematical ideas exist and develop in many different cultures. From this multicultural perspective, teachers can use a variety of approaches to acknowledge the role of culture in the teaching and learning of mathematics. Curricular materials that "emphasize both the mathematical and sociocultural aspects" not only help teachers achieve…

  16. Probability and Statistics in Aerospace Engineering

    NASA Technical Reports Server (NTRS)

    Rheinfurth, M. H.; Howell, L. W.

    1998-01-01

    This monograph was prepared to give the practicing engineer a clear understanding of probability and statistics with special consideration to problems frequently encountered in aerospace engineering. It is conceived to be both a desktop reference and a refresher for aerospace engineers in government and industry. It could also be used as a supplement to standard texts for in-house training courses on the subject.

  17. Investigation of Flood Inundation Probability in Taiwan

    NASA Astrophysics Data System (ADS)

    Wang, Chia-Ho; Lai, Yen-Wei; Chang, Tsang-Jung

    2010-05-01

    Taiwan is located at a special point in the path of typhoons from the northeast Pacific Ocean, and is situated in a tropical-subtropical transition zone. As a result, rainfall is abundant all year round, especially in summer and autumn. For flood inundation analysis in Taiwan, there exist many uncertainties in hydrological, hydraulic and land-surface topography characteristics, which can change flood inundation characteristics. According to the 7th work item of article 22 in the Disaster Prevention and Protection Act in Taiwan, to keep flood disasters from worsening, the investigation and analysis of disaster potentials, hazard degrees and situation simulations must proceed with scientific approaches. However, the flood potential analysis uses a deterministic approach to define flood inundation without considering data uncertainties. This research incorporates the concept of data uncertainty into flood inundation maps, showing the flood probability in each grid cell. The maps can serve as a basis for emergency evacuation as typhoons arrive and extremely torrential rain begins. The research selects the Hebauyu watershed of Chiayi County as the demonstration area. Owing to the uncertainties of the data used, sensitivity analysis is first conducted by using Latin Hypercube Sampling (LHS). The LHS data sets are next input into an integrated numerical model, which is herein developed to assess flood inundation hazards in coastal lowlands, based on the extension of a 1-D river routing model and a 2-D inundation routing model. Finally, the probability of flood inundation is calculated, and flood inundation probability maps are obtained. Flood inundation probability maps can be an alternative to the old flood potential maps when considering the building of new hydraulic infrastructure in the future.

  18. Sampling probability distributions of lesions in mammograms

    NASA Astrophysics Data System (ADS)

    Looney, P.; Warren, L. M.; Dance, D. R.; Young, K. C.

    2015-03-01

    One approach to image perception studies in mammography using virtual clinical trials involves the insertion of simulated lesions into normal mammograms. To facilitate this, a method has been developed that allows for sampling of lesion positions across the cranio-caudal and medio-lateral radiographic projections in accordance with measured distributions of real lesion locations. 6825 mammograms from our mammography image database were segmented to find the breast outline. The outlines were averaged and smoothed to produce an average outline for each laterality and radiographic projection. Lesions in 3304 mammograms with malignant findings were mapped on to a standardised breast image corresponding to the average breast outline using piecewise affine transforms. A four dimensional probability distribution function was found from the lesion locations in the cranio-caudal and medio-lateral radiographic projections for calcification and noncalcification lesions. Lesion locations sampled from this probability distribution function were mapped on to individual mammograms using a piecewise affine transform which transforms the average outline to the outline of the breast in the mammogram. The four dimensional probability distribution function was validated by comparing it to the two dimensional distributions found by considering each radiographic projection and laterality independently. The correlation of the location of the lesions sampled from the four dimensional probability distribution function across radiographic projections was shown to match the correlation of the locations of the original mapped lesion locations. The current system has been implemented as a web-service on a server using the Python Django framework. The server performs the sampling, performs the mapping and returns the results in a javascript object notation format.
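
    In the same spirit, a minimal sketch of sampling lesion locations from a discretized four-dimensional probability distribution; the histogram below is random filler standing in for the measured distribution.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)

    # Hypothetical normalized 4-D histogram over (CC x, CC y, ML x, ML y) bins,
    # standing in for the measured lesion-location distribution.
    hist = rng.random((20, 20, 20, 20))
    hist /= hist.sum()

    def sample_locations(n=1):
        """Draw lesion locations by sampling a flat bin index with probability
        proportional to the histogram, then unraveling back to 4-D bin indices."""
        flat = rng.choice(hist.size, size=n, p=hist.ravel())
        return np.column_stack(np.unravel_index(flat, hist.shape))

    print(sample_locations(3))   # three (CC x, CC y, ML x, ML y) bin tuples
    ```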

  19. Non-signalling Theories and Generalized Probability

    NASA Astrophysics Data System (ADS)

    Tylec, Tomasz I.; Kuś, Marek; Krajczok, Jacek

    2016-04-01

    We provide a mathematically rigorous justification for using the term probability in connection with the so-called non-signalling theories, also known as Popescu and Rohrlich box worlds. Not only do we prove the correctness of these models (in the sense that they describe a composite system of two independent subsystems), but we also obtain new properties of non-signalling boxes and expose new tools for further investigation. Moreover, the approach allows straightforward generalization to more complicated systems.

  1. Continuum ionization transition probabilities of atomic oxygen

    NASA Technical Reports Server (NTRS)

    Samson, J. A. R.; Petrosky, V. E.

    1974-01-01

    The technique of photoelectron spectroscopy was employed in the investigation. Atomic oxygen was produced in a microwave discharge operating at a power of 40 W and at a pressure of approximately 20 mtorr. The photoelectron spectrum of the oxygen with and without the discharge is shown. The atomic states can be clearly seen. In connection with the measurement of the probability for transitions into the various ionic states, the analyzer collection efficiency was determined as a function of electron energy.

  2. Computational methods for probability of instability calculations

    NASA Technical Reports Server (NTRS)

    Wu, Y.-T.; Burnside, O. H.

    1990-01-01

    This paper summarizes the development of methods and a computer program to compute the probability of instability of a dynamic system that can be represented by a system of second-order ordinary linear differential equations. Two instability criteria, based on the roots of the characteristic equation and on Routh-Hurwitz test functions, are investigated. Computational methods based on system reliability analysis methods and importance sampling concepts are proposed to perform efficient probabilistic analysis. Numerical examples are provided to demonstrate the methods.
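
    As a crude stand-in for the paper's reliability-analysis and importance-sampling machinery, plain Monte Carlo over the characteristic-equation roots already illustrates the quantity being computed; the distributions below are invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 100_000

    # Uncertain damping and stiffness in m*x'' + c*x' + k*x = 0 (hypothetical)
    m = 1.0
    c = rng.normal(0.05, 0.10, n)    # damping may go negative
    k = rng.normal(4.00, 0.50, n)

    # Roots of the characteristic equation m*s^2 + c*s + k = 0
    disc = np.sqrt(c**2 - 4.0 * m * k + 0j)
    roots = np.stack([(-c + disc) / (2 * m), (-c - disc) / (2 * m)])

    # Unstable if any root has a positive real part
    p_unstable = np.mean((roots.real > 0).any(axis=0))
    print(f"Monte Carlo probability of instability: {p_unstable:.4f}")
    ```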

  3. SureTrak Probability of Impact Display

    NASA Technical Reports Server (NTRS)

    Elliott, John

    2012-01-01

    The SureTrak Probability of Impact Display software was developed for use during rocket launch operations. The software displays probability of impact information for each ship near the hazardous area during the time immediately preceding the launch of an unguided vehicle. Wallops range safety officers need to be sure that the risk to humans is below a certain threshold during each use of the Wallops Flight Facility Launch Range. Under the variable conditions that can exist at launch time, the decision to launch must be made in a timely manner to ensure a successful mission while not exceeding those risk criteria. Range safety officers need a tool that can give them the needed probability of impact information quickly, and in a format that is clearly understandable. This application is meant to fill that need. The software is a reuse of part of software developed for an earlier project: Ship Surveillance Software System (S4). The S4 project was written in C++ using Microsoft Visual Studio 6. The data structures and dialog templates from it were copied into a new application that calls the implementation of the algorithms from S4 and displays the results as needed. In the S4 software, the list of ships in the area was received from one local radar interface and from operators who entered the ship information manually. The SureTrak Probability of Impact Display application receives ship data from two local radars as well as the SureTrak system, eliminating the need for manual data entry.

  4. Classical probabilities for Majorana and Weyl spinors

    SciTech Connect

    Wetterich, C.

    2011-08-15

    Highlights: Map of the classical statistical Ising model to fermionic quantum field theory. Lattice-regularized real Grassmann functional integral for a single Weyl spinor. Emerging complex structure characteristic for quantum physics. A classical statistical ensemble describes a quantum theory. Abstract: We construct a map between the quantum field theory of free Weyl or Majorana fermions and the probability distribution of a classical statistical ensemble for Ising spins or discrete bits. More precisely, a Grassmann functional integral based on a real Grassmann algebra specifies the time evolution of the real wave function q_τ(t) for the Ising states τ. The time-dependent probability distribution of a generalized Ising model is obtained as p_τ(t) = q_τ(t)². The functional integral employs a lattice regularization for single Weyl or Majorana spinors. We further introduce the complex structure characteristic for quantum mechanics. Probability distributions of the Ising model which correspond to one or many propagating fermions are discussed explicitly. Expectation values of observables can be computed equivalently in the classical statistical Ising model or in the quantum field theory for fermions.

  5. Chemisorptive electron emission versus sticking probability

    NASA Astrophysics Data System (ADS)

    Böttcher, Artur; Niehus, Horst

    2001-07-01

    The chemisorption of N2O on thin Cs films has been studied by monitoring the time evolution of the sticking probability as well as the kinetics of the low-energy electron emission. By combining the data sets, two time domains become distinguishable: the initial chemisorption stage is characterized by a high sticking probability (0.1 < s < 1), whereas the late stage shows a sticking probability of less than 0.01. Such evident anticoincidence between the exoemission and the chemisorption excludes the model of surface harpooning as the elementary process responsible for the electron emission in the late chemisorption stage. A long-term emission decay has also been observed after turning off the flux of chemisorbing molecules. A model is proposed that attributes both the late chemisorptive and the nonchemisorptive electron emission to the relaxation of a narrow state originating from an oxygen vacancy in the Cs oxide layer terminating the surface. The presence of such a state has been confirmed by metastable de-excitation spectroscopy [MDS, He*(2¹S)].

  6. The Probability Distribution of Daily Streamflow

    NASA Astrophysics Data System (ADS)

    Blum, A.; Vogel, R. M.

    2015-12-01

    Flow duration curves (FDCs) are a graphical illustration of the cumulative distribution of streamflow. Daily streamflows often range over many orders of magnitude, making it extremely challenging to find a probability distribution function (pdf) which can mimic the steady state or period-of-record FDC (POR-FDC). Median annual FDCs (MA-FDCs) describe the pdf of daily streamflow in a typical year. For POR- and MA-FDCs, L-moment diagrams, visual assessments of FDCs, and quantile-quantile probability plot correlation coefficients are used to evaluate goodness of fit (GOF) of candidate probability distributions. FDCs reveal that both four-parameter kappa (KAP) and three-parameter generalized Pareto (GP3) models result in very high GOF for the MA-FDC and a relatively lower GOF for POR-FDCs at over 500 rivers across the coterminous U.S. Physical basin characteristics, such as the baseflow index, as well as hydroclimatic indices such as the aridity index and the runoff ratio, are found to be correlated with one of the shape parameters (kappa) of the KAP and GP3 pdfs. Our work also reveals several important areas for future research, including improved parameter estimators for the KAP pdf, as well as increasing our understanding of the conditions which give rise to improved GOF of analytical pdfs to large samples of daily streamflows.
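
    For readers unfamiliar with FDCs, the empirical curve itself is straightforward; the sketch below uses Weibull plotting positions and synthetic lognormal flows, not the study's data.

    ```python
    import numpy as np

    def flow_duration_curve(daily_flows):
        """Empirical FDC: exceedance probability for each observed daily flow,
        using Weibull plotting positions p_i = i / (n + 1)."""
        q = np.sort(np.asarray(daily_flows))[::-1]     # flows in descending order
        p_exceed = np.arange(1, q.size + 1) / (q.size + 1.0)
        return p_exceed, q

    rng = np.random.default_rng(5)
    flows = rng.lognormal(mean=2.0, sigma=1.5, size=3650)  # ~10 years, synthetic
    p, q = flow_duration_curve(flows)
    ```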

  7. Bacteria survival probability in bactericidal filter paper.

    PubMed

    Mansur-Azzam, Nura; Hosseinidoust, Zeinab; Woo, Su Gyeong; Vyhnalkova, Renata; Eisenberg, Adi; van de Ven, Theo G M

    2014-05-01

    Bactericidal filter papers offer the simplicity of gravity filtration to simultaneously eradicate microbial contaminants and particulates. We previously detailed the development of biocidal block copolymer micelles that could be immobilized on a filter paper to actively eradicate bacteria. Despite the many advantages offered by this system, its widespread use is hindered by its unknown mechanism of action which can result in non-reproducible outcomes. In this work, we sought to investigate the mechanism by which a certain percentage of Escherichia coli cells survived when passing through the bactericidal filter paper. Through the process of elimination, the possibility that the bacterial survival probability was controlled by the initial bacterial load or the existence of resistant sub-populations of E. coli was dismissed. It was observed that increasing the thickness or the number of layers of the filter significantly decreased bacterial survival probability for the biocidal filter paper but did not affect the efficiency of the blank filter paper (no biocide). The survival probability of bacteria passing through the antibacterial filter paper appeared to depend strongly on the number of collision between each bacterium and the biocide-loaded micelles. It was thus hypothesized that during each collision a certain number of biocide molecules were directly transferred from the hydrophobic core of the micelle to the bacterial lipid bilayer membrane. Therefore, each bacterium must encounter a certain number of collisions to take up enough biocide to kill the cell and cells that do not undergo the threshold number of collisions are expected to survive. PMID:24681395
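
    The collision-threshold hypothesis has a simple probabilistic reading: if bacterium-micelle collisions are roughly Poisson and a lethal dose requires some minimum number of biocide transfers, survival is the Poisson lower tail. All numbers below are hypothetical.

    ```python
    from scipy.stats import poisson

    mean_collisions = 12.0   # average collisions in one pass through the filter (hypothetical)
    lethal_threshold = 8     # collisions needed to accumulate a lethal dose (hypothetical)

    # Survival: fewer than `lethal_threshold` collisions occurred
    p_survive_one_layer = poisson.cdf(lethal_threshold - 1, mean_collisions)
    # Doubling the filter thickness roughly doubles the expected collision count
    p_survive_two_layers = poisson.cdf(lethal_threshold - 1, 2 * mean_collisions)
    print(p_survive_one_layer, p_survive_two_layers)
    ```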

  8. Detection probabilities in fuel cycle oriented safeguards

    SciTech Connect

    Canty, J. J.; Stein, G.; Avenhaus, R.

    1987-01-01

    An intensified discussion of evaluation criteria for International Atomic Energy Agency (IAEA) safeguards effectiveness is currently under way. Considerations basic to the establishment of such criteria are derived from the model agreement INFCIRC/153 and include threshold amounts, strategic significance, conversion times, required assurances, cost-effectiveness, and nonintrusiveness. In addition to these aspects, the extent to which fuel cycle characteristics are taken into account in safeguards implementations (Article 81c of INFCIRC/153) will be reflected in the criteria. The effectiveness of safeguards implemented under given manpower constraints is evaluated. As the significant quantity and timeliness criteria have established themselves within the safeguards community, these are taken as fixed. Detection probabilities, on the other hand, still provide a certain degree of freedom in interpretation. The problem of randomization of inspection activities across a fuel cycle, or portions thereof, is formalized as a two-person zero-sum game, the payoff function of which is the detection probability achieved by the inspectorate. It is argued, from the point of view of risk of detection, that fuel cycle-independent, minimally accepted threshold criteria for such detection probabilities cannot and should not be applied.

  9. Augmenting Transition Probabilities for Neutral Atomic Nitrogen

    NASA Technical Reports Server (NTRS)

    Terrazas-Salines, Imelda; Park, Chul; Strawa, Anthony W.; Hartman, G. Joseph (Technical Monitor)

    1996-01-01

    The transition probability values for a number of neutral atomic nitrogen (NI) lines in the visible wavelength range are determined in order to augment those given in the National Bureau of Standards Tables. These values are determined from experimentation as well as by using the published results of other investigators. The experimental determination of the lines in the 410 to 430 nm range was made from the observation of the emission from the arc column of an arc-heated wind tunnel. The transition probability values of these NI lines are determined to an accuracy of +/- 30% by comparison of their measured intensities with those of the atomic oxygen (OI) multiplet at around 615 nm. The temperature of the emitting medium is determined both using a multiple-layer model, based on a theoretical model of the flow in the arc column, and an empirical single-layer model. The results show that the two models lead to the same values of transition probabilities for the NI lines.

  10. Determination of riverbank erosion probability using Locally Weighted Logistic Regression

    NASA Astrophysics Data System (ADS)

    Ioannidou, Elena; Flori, Aikaterini; Varouchakis, Emmanouil A.; Giannakis, Georgios; Vozinaki, Anthi Eirini K.; Karatzas, George P.; Nikolaidis, Nikolaos

    2015-04-01

    erosion occurrence probability can be calculated in conjunction with the model deviance regarding the independent variables tested. The most straightforward measure for goodness of fit is the G statistic. It is a simple and effective way to study and evaluate the Logistic Regression model efficiency and the reliability of each independent variable. The developed statistical model is applied to the Koiliaris River Basin on the island of Crete, Greece. Two datasets of river bank slope, river cross-section width and indications of erosion were available for the analysis (12 and 8 locations). Two different types of spatial dependence functions, exponential and tricubic, were examined to determine the local spatial dependence of the independent variables at the measurement locations. The results show a significant improvement when the tricubic function is applied as the erosion probability is accurately predicted at all eight validation locations. Results for the model deviance show that cross-section width is more important than bank slope in the estimation of erosion probability along the Koiliaris riverbanks. The proposed statistical model is a useful tool that quantifies the erosion probability along the riverbanks and can be used to assist managing erosion and flooding events. Acknowledgements This work is part of an on-going THALES project (CYBERSENSORS - High Frequency Monitoring System for Integrated Water Resources Management of Rivers). The project has been co-financed by the European Union (European Social Fund - ESF) and Greek national funds through the Operational Program "Education and Lifelong Learning" of the National Strategic Reference Framework (NSRF) - Research Funding Program: THALES. Investing in knowledge society through the European Social Fund.
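
    For illustration, a minimal sketch of the tricubic spatial dependence function used to weight nearby observations in the locally weighted fit; the bandwidth and distances are hypothetical.

    ```python
    import numpy as np

    def tricubic_weights(distances, bandwidth):
        """Tricubic kernel: w(d) = (1 - (d/h)^3)^3 for d < h, else 0."""
        u = np.minimum(np.abs(distances) / bandwidth, 1.0)
        return (1.0 - u**3) ** 3

    # Weights for observations at various distances (km) from a prediction site
    d = np.array([0.0, 0.5, 1.2, 3.0, 7.5])
    print(tricubic_weights(d, bandwidth=5.0))   # decays smoothly to 0 at 5 km
    ```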

  11. Implicit chaining in cotton-top tamarins (Saguinus oedipus) with elements equated for probability of reinforcement.

    PubMed

    Locurto, Charles; Dillon, Laura; Collins, Meaghan; Conway, Maura; Cunningham, Kate

    2013-07-01

    Three experiments examined the implicit learning of sequences under conditions in which the elements comprising a sequence were equated in terms of reinforcement probability. In Experiment 1 cotton-top tamarins (Saguinus oedipus) experienced a five-element sequence displayed serially on a touch screen in which reinforcement probability was equated across elements at .16 per element. Tamarins demonstrated learning of this sequence with higher latencies during a random test as compared to baseline sequence training. In Experiments 2 and 3, manipulations of the procedure used in the first experiment were undertaken to rule out a confound owing to the fact that the elements in Experiment 1 bore different temporal relations to the intertrial interval (ITI), an inhibitory period. The results of Experiments 2 and 3 indicated that the implicit learning observed in Experiment 1 was not due to temporal proximity between some elements and the inhibitory ITI. Taken together, the results support two conclusions: first, that tamarins engaged in sequence learning whether or not there was contingent reinforcement for learning the sequence; and second, that this learning was not due to subtle differences in associative strength between the elements of the sequence. PMID:23344718

  12. CPROB: A COMPUTATIONAL TOOL FOR CONDUCTING CONDITIONAL PROBABILITY ANALYSIS

    EPA Science Inventory

    Conditional probability analysis measures the probability of observing one event given that another event has occurred. In an environmental context, conditional probability analysis helps assess the association between an environmental contaminant (i.e. the stressor) and the ec...

  13. Taking the easy way out? Increasing implementation effort reduces probability maximizing under cognitive load.

    PubMed

    Schulze, Christin; Newell, Ben R

    2016-07-01

    Cognitive load has previously been found to have a positive effect on strategy selection in repeated risky choice. Specifically, whereas inferior probability matching often prevails under single-task conditions, optimal probability maximizing sometimes dominates when a concurrent task competes for cognitive resources. We examined the extent to which this seemingly beneficial effect of increased task demands hinges on the effort required to implement each of the choice strategies. Probability maximizing typically involves a simple repeated response to a single option, whereas probability matching requires choice proportions to be tracked carefully throughout a sequential choice task. Here, we flipped this pattern by introducing a manipulation that made the implementation of maximizing more taxing and, at the same time, allowed decision makers to probability match via a simple repeated response to a single option. The results from two experiments showed that increasing the implementation effort of probability maximizing resulted in decreased adoption rates of this strategy. This was the case both when decision makers simultaneously learned about the outcome probabilities and responded to a dual task (Exp. 1) and when these two aspects were procedurally separated in two distinct stages (Exp. 2). We conclude that the effort involved in implementing a choice strategy is a key factor in shaping repeated choice under uncertainty. Moreover, highlighting the importance of implementation effort casts new light on the sometimes surprising and inconsistent effects of cognitive load that have previously been reported in the literature. PMID:26884088
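
    The advantage of maximizing over matching is plain arithmetic: for a binary outcome whose more frequent option occurs with probability p, always choosing that option is correct with probability p, while matching is correct with probability p^2 + (1 - p)^2. An illustrative value:

    ```python
    p = 0.7                                   # probability of the more frequent outcome (illustrative)
    accuracy_maximizing = p                   # repeat the single best response
    accuracy_matching = p**2 + (1 - p)**2     # respond in proportion to the outcome probabilities
    print(accuracy_maximizing, accuracy_matching)   # 0.7 vs 0.58
    ```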

  14. Social Science and the Bayesian Probability Explanation Model

    NASA Astrophysics Data System (ADS)

    Yin, Jie; Zhao, Lei

    2014-03-01

    C. G. Hempel, one of the logical empiricists, built his probability explanation model on an empiricist view of probability; the model encountered many difficulties in scientific explanation that Hempel found hard to defend. Based on Bayesian probability theory, the Bayesian probability model instead provides a subjective probability explanation grounded in the subjectivist view of probability. On the one hand, this model establishes the epistemological status of the subject in social science; on the other hand, it provides a feasible explanation model for social-scientific explanation, which has important methodological significance.

  15. A probable probability distribution of a series nonequilibrium states in a simple system out of equilibrium

    NASA Astrophysics Data System (ADS)

    Gao, Haixia; Li, Ting; Xiao, Changming

    2016-05-01

    When a simple system is in a nonequilibrium state, it will shift toward its equilibrium state, passing through a series of nonequilibrium states along the way. With the assistance of Bayesian statistics and the hyperensemble, a probable probability distribution over these nonequilibrium states can be determined by maximizing the hyperensemble entropy. The equilibrium state has the largest probability, and the farther a nonequilibrium state is from equilibrium, the smaller its probability; the same conclusion also holds in the multi-state space. Furthermore, if the probability is interpreted as the relative time the corresponding nonequilibrium state persists, then the velocity with which a nonequilibrium state returns to equilibrium can be determined from the reciprocal of the derivative of this probability. The farther the state is from equilibrium, the faster the return velocity; as the system nears its equilibrium state, the velocity becomes smaller and smaller, finally tending to 0 when the equilibrium state is reached.

  16. Spin Glass Computations and Ruelle's Probability Cascades

    NASA Astrophysics Data System (ADS)

    Arguin, Louis-Pierre

    2007-03-01

    We study the Parisi functional, appearing in the Parisi formula for the pressure of the SK model, as a functional on Ruelle's Probability Cascades (RPC). Computation techniques for the RPC formulation of the functional are developed. They are used to derive continuity and monotonicity properties of the functional retrieving a theorem of Guerra. We also detail the connection between the Aizenman-Sims-Starr variational principle and the Parisi formula. As a final application of the techniques, we rederive the Almeida-Thouless line in the spirit of Toninelli but relying on the RPC structure.

  17. Snell Envelope with Small Probability Criteria

    SciTech Connect

    Del Moral, Pierre; Hu, Peng; Oudjane, Nadia

    2012-12-15

    We present a new algorithm to compute the Snell envelope in the specific case where the criterion to be optimized is associated with a small probability or a rare event. This new approach combines the Stochastic Mesh approach of Broadie and Glasserman with a particle approximation scheme based on a specific change of measure designed to concentrate the computational effort in regions pointed out by the criterion. The theoretical analysis of this new algorithm provides non-asymptotic convergence estimates. Finally, the numerical tests confirm the practical interest of this approach.

  18. Mapping probability of shipping sound exposure level.

    PubMed

    Gervaise, Cédric; Aulanier, Florian; Simard, Yvan; Roy, Nathalie

    2015-06-01

    Mapping vessel noise is emerging as one method of identifying areas where sound exposure due to shipping noise could have negative impacts on aquatic ecosystems. The probability distribution function (pdf) of sound exposure levels (SEL) is an important metric for identifying areas of concern. In this paper a probabilistic shipping SEL modeling method is described to obtain the pdf of SEL using the sonar equation and statistical relations linking the pdfs of ship traffic density, source levels, and transmission losses to their products and sums. PMID:26093451
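
    A crude Monte Carlo reading of the sonar-equation combination the abstract describes, with every distribution invented for illustration:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n = 200_000

    # Sonar equation per ship transit: received SEL = source level - transmission loss
    source_level = rng.normal(180.0, 5.0, n)      # dB re 1 uPa (hypothetical)
    transmission_loss = rng.normal(60.0, 8.0, n)  # dB (hypothetical)
    sel = source_level - transmission_loss

    # Empirical pdf of SEL for one map cell, plus an exceedance probability
    pdf, edges = np.histogram(sel, bins=100, density=True)
    p_above_120 = (sel > 120.0).mean()            # chance SEL exceeds a 120 dB criterion
    ```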

  19. Calculation of radiative transition probabilities and lifetimes

    NASA Technical Reports Server (NTRS)

    Zemke, W. T.; Verma, K. K.; Stwalley, W. C.

    1982-01-01

    Procedures for calculating bound-bound and bound-continuum (free) radiative transition probabilities and radiative lifetimes are summarized. Calculations include rotational dependence and R-dependent electronic transition moments (no Franck-Condon or R-centroid approximation). Detailed comparisons of theoretical results with experimental measurements are made for bound-bound transitions in the A-X systems of LiH and Na2. New bound-free results are presented for LiH. New bound-free results and comparisons with very recent fluorescence experiments are presented for Na2.

  20. Modulation Based on Probability Density Functions

    NASA Technical Reports Server (NTRS)

    Williams, Glenn L.

    2009-01-01

    A proposed method of modulating a sinusoidal carrier signal to convey digital information involves the use of histograms representing probability density functions (PDFs) that characterize samples of the signal waveform. The method is based partly on the observation that when a waveform is sampled (whether by analog or digital means) over a time interval at least as long as one half cycle of the waveform, the samples can be sorted by frequency of occurrence, thereby constructing a histogram representing a PDF of the waveform during that time interval.
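
    A toy version of the idea: label each symbol by the sample histogram of its waveform and detect by histogram matching. The two waveform shapes below are arbitrary stand-ins, not the proposed design.

    ```python
    import numpy as np

    def waveform_histogram(samples, bins=16):
        """Normalized histogram (empirical pdf) of one symbol interval's samples."""
        hist, _ = np.histogram(samples, bins=bins, range=(-1.0, 1.0))
        return hist / hist.sum()

    t = np.linspace(0.0, 1.0, 2048, endpoint=False)
    sine = np.sin(2 * np.pi * t)                  # bathtub-shaped pdf
    triangle = 2.0 * np.abs(2.0 * t - 1.0) - 1.0  # flat pdf

    templates = {0: waveform_histogram(sine), 1: waveform_histogram(triangle)}

    def detect(symbol_samples):
        """Classify a received symbol by the nearest histogram template (L1 distance)."""
        h = waveform_histogram(symbol_samples)
        return min(templates, key=lambda k: np.abs(templates[k] - h).sum())

    print(detect(sine), detect(triangle))   # 0 1
    ```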

  1. Learning How To Learn.

    ERIC Educational Resources Information Center

    Barnett, Demian

    2000-01-01

    In one California high school, learning to learn is a measurable outcome assessed by all students' participation in graduation by exhibition. Students must meet state requirements and demonstrate learning prowess by publicly exhibiting their skills in math, science, language arts, social science, service learning, and postgraduation planning. (MLH)

  2. The Human Bathtub: Safety and Risk Predictions Including the Dynamic Probability of Operator Errors

    SciTech Connect

    Duffey, Romney B.; Saull, John W.

    2006-07-01

    Reactor safety and risk are dominated by the potential for, and major contribution of, human error in the design, operation, control, management, regulation and maintenance of the plant, and hence in all accidents. Given the possibility of accidents and errors, we need to determine the outcome (error) probability, or the chance of failure. Conventionally, reliability engineering is associated with the failure rate of components, systems, or mechanisms, not of human beings in and interacting with a technological system. The probability of failure requires a prior knowledge of the total number of outcomes, which for any predictive purposes we do not know or have. Analysis of failure rates due to human error and the rate of learning allows a new determination of the dynamic human error rate in technological systems, consistent with and derived from the available world data. The basis for the analysis is the 'learning hypothesis': humans learn from experience, and consequently the accumulated experience defines the failure rate. A new 'best' equation has been derived for the human error, outcome or failure rate, which allows for calculation and prediction of the probability of human error. We also provide comparisons to the empirical Weibull parameter fitting used in and by conventional reliability engineering and probabilistic safety analysis methods. These new analyses show that arbitrary Weibull fitting parameters and typical empirical hazard function techniques cannot be used to predict the dynamics of human errors and outcomes in the presence of learning. Comparisons of these new insights show agreement with human error data from the world's commercial airlines, the two shuttle failures, and from nuclear plant operator actions and transient control behavior observed in transients in both plants and simulators. The results demonstrate that the human error probability (HEP) is dynamic, and that it may be predicted using the learning hypothesis and the minimum
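
    The abstract does not reproduce the derived equation, but the learning hypothesis is commonly expressed as a failure rate that declines exponentially with accumulated experience toward an irreducible minimum; the sketch and all constants below are illustrative only.

    ```python
    import numpy as np

    def error_rate(experience, lam0=1e-2, lam_min=1e-5, k=3.0):
        """Learning-hypothesis hazard: the outcome (error) rate decays
        exponentially with accumulated experience, flooring at lam_min."""
        return lam_min + (lam0 - lam_min) * np.exp(-k * experience)

    eps = np.linspace(0.0, 2.0, 5)   # accumulated experience, arbitrary units
    print(error_rate(eps))           # the declining limb of the "bathtub"
    ```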

  3. Sparsity-inspired nonparametric probability characterization for radio propagation in body area networks.

    PubMed

    Yang, Xiaodong; Yang, Shuyuan; Abbasi, Qammer Hussain; Zhang, Zhiya; Ren, Aifeng; Zhao, Wei; Alomainy, Akram

    2015-05-01

    Parametric probability models are common references for channel characterization. However, the limited number of samples and uncertainty of the propagation scenario affect the characterization accuracy of parametric models for body area networks. In this paper, we propose a sparse nonparametric probability model for body area wireless channel characterization. The path loss and root-mean-square delay, which are significant wireless channel parameters, can be learned from this nonparametric model. A comparison with available parametric models shows that the proposed model is very feasible for the body area propagation environment and can be seen as a significant supplement to parametric approaches. PMID:25014979
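
    As a generic nonparametric stand-in (a kernel density estimate, not the paper's sparsity-inspired model), one can already learn a channel pdf from a small measurement set:

    ```python
    import numpy as np
    from scipy.stats import gaussian_kde

    rng = np.random.default_rng(2)
    path_loss_db = rng.normal(70.0, 6.0, 40)   # sparse channel measurements (hypothetical)

    pdf = gaussian_kde(path_loss_db)           # nonparametric pdf of path loss
    grid = np.linspace(50.0, 90.0, 200)
    density = pdf(grid)                        # evaluate the learned density
    ```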

  4. Properties of atoms in molecules: Transition probabilities

    NASA Astrophysics Data System (ADS)

    Bader, R. F. W.; Bayles, D.; Heard, G. L.

    2000-06-01

    The transition probability for electric dipole transitions is a measurable property of a system and is therefore, partitionable into atomic contributions using the physics of a proper open system. The derivation of the dressed property density, whose averaging over an atomic basin yields the atomic contribution to a given oscillator strength, is achieved through the development of perturbation theory for an open system. A dressed density describes the local contribution resulting from the interaction of a single electron at some position r, as determined by the relevant observable, averaged over the motions of all of the remaining particles in the system. In the present work, the transition probability density expressed in terms of the relevant transition density, yields a local measure of the associated oscillator strength resulting from the interaction of the entire molecule with a radiation field. The definition of the atomic contributions to the oscillator strength enables one to determine the extent to which a given electronic or vibrational transition is spatially localized to a given atom or functional group. The concepts introduced in this article are applied to the Rydberg-type transitions observed in the electronic excitation of a nonbonding electron in formaldehyde and ammonia. The atomic partitioning of the molecular density distribution and of the molecular properties by surfaces of zero flux in the gradient vector field of the electron density, the boundary condition defining the physics of a proper open system, is found to apply to the density distributions of the excited, Rydberg states.

  5. Classical probability model for Bell inequality

    NASA Astrophysics Data System (ADS)

    Khrennikov, Andrei

    2014-04-01

    We show that by taking into account the randomness of realization of experimental contexts it is possible to construct a common Kolmogorov space for data collected in these contexts, even though they may be incompatible. We call such a construction "Kolmogorovization" of contextuality. This construction of a common probability space is applied to Bell's inequality. It is well known that its violation is a consequence of collecting statistical data in a few incompatible experiments. In experiments performed in quantum optics, contexts are determined by selections of pairs of angles (θi, θ'j) fixing orientations of polarization beam splitters. Contrary to the common opinion, we show that statistical data corresponding to measurements of polarizations of photons in the singlet state, e.g., in the form of correlations, can be described in the classical probabilistic framework. The crucial point is that in constructing the common probability space one has to take into account not only the randomness of the source (as Bell did), but also the randomness of context realizations (in particular, realizations of the pairs of angles (θi, θ'j)). One may (but need not) say that the randomness of "free will" has to be accounted for.

  6. On the probability of matching DNA fingerprints.

    PubMed

    Risch, N J; Devlin, B

    1992-02-01

    Forensic scientists commonly assume that DNA fingerprint patterns are infrequent in the general population and that genotypes are independent across loci. To test these assumptions, the number of matching DNA patterns in two large databases from the Federal Bureau of Investigation (FBI) and from Lifecodes was determined. No deviation from independence across loci in either database was apparent. For the Lifecodes database, the probability of a three-locus match ranges from 1 in 6,233 in Caucasians to 1 in 119,889 in Blacks. When considering all trios of five loci in the FBI database, there was only a single match observed out of more than 7.6 million comparisons. If independence is assumed, the probability of a five-locus match ranged from 1.32 x 10^-12 in Southeast Hispanics to 5.59 x 10^-14 in Blacks, implying that the minimum number of possible patterns for each ethnic group is several orders of magnitude greater than their corresponding population sizes in the United States. The most common five-locus pattern can have a frequency no greater than about 10^-6. Hence, individual five-locus DNA profiles are extremely uncommon, if not unique. PMID:1738844
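
    The independence assumption at work is just the product rule: a multi-locus match probability is the product of per-locus match probabilities. The values below are hypothetical, not the FBI or Lifecodes figures.

    ```python
    # Per-locus match probabilities (hypothetical values)
    per_locus = [0.01, 0.02, 0.005, 0.015, 0.008]

    p_match = 1.0
    for p in per_locus:
        p_match *= p   # independence across loci multiplies the probabilities

    print(f"Five-locus match probability: {p_match:.2e}")   # ~1.2e-10
    ```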

  7. Estimating flood exceedance probabilities in estuarine regions

    NASA Astrophysics Data System (ADS)

    Westra, Seth; Leonard, Michael

    2016-04-01

    Flood events in estuarine regions can arise from the interaction of extreme rainfall and storm surge. Determining flood level exceedance probabilities in these regions is complicated by the dependence of these processes for extreme events. A comprehensive study of tide and rainfall gauges along the Australian coastline was conducted to determine the dependence of these extremes using a bivariate logistic threshold-excess model. The dependence strength is shown to vary as a function of distance over many hundreds of kilometres, indicating that the dependence arises from synoptic-scale meteorological forcings. It is also shown to vary as a function of storm burst duration and of the time lag between the extreme rainfall and the storm surge event. The dependence estimates are then used with a bivariate design variable method to determine flood risk in estuarine regions for a number of case studies. Aspects of the method demonstrated in the case studies include the resolution and range of the hydraulic response table, the fitting of probability distributions, computational efficiency, uncertainty, potential variation in marginal distributions due to climate change, and application to two-dimensional output from hydraulic models. Case studies are located on the Swan River (Western Australia), Nambucca River and Hawkesbury-Nepean River (New South Wales).

  8. Measures, Probability and Holography in Cosmology

    NASA Astrophysics Data System (ADS)

    Phillips, Daniel

    This dissertation compiles four research projects on predicting values for cosmological parameters and models of the universe on the broadest scale. The first examines the Causal Entropic Principle (CEP) in inhomogeneous cosmologies. The CEP aims to predict the unexpectedly small value of the cosmological constant Lambda using a weighting by entropy increase on causal diamonds. The original work assumed a purely isotropic and homogeneous cosmology. But even the level of inhomogeneity observed in our universe forces reconsideration of certain arguments about entropy production. In particular, we must consider an ensemble of causal diamonds associated with each background cosmology and we can no longer immediately discard entropy production in the far future of the universe. Depending on our choices for a probability measure and our treatment of black hole evaporation, the prediction for Lambda may be left intact or dramatically altered. The second related project extends the CEP to universes with curvature. We have found that curvature values larger than rho_k = 40 rho_m are disfavored by more than 99.99%, with a peak value at rho_Lambda = 7.9 x 10^-123 and rho_k = 4.3 rho_m for open universes. For universes that allow only positive curvature or both positive and negative curvature, we find a correlation between curvature and dark energy that leads to an extended region of preferred values. Our universe is found to be disfavored to an extent depending on the priors on curvature. We also provide a comparison to previous anthropic constraints on open universes and discuss future directions for this work. The third project examines how cosmologists should formulate basic questions of probability. We argue using simple models that all successful practical uses of probabilities originate in quantum fluctuations in the microscopic physical world around us, often propagated to macroscopic scales. Thus we claim there is no physically verified fully classical theory of probability. We

  9. Risks and probabilities of breast cancer: short-term versus lifetime probabilities.

    PubMed Central

    Bryant, H E; Brasher, P M

    1994-01-01

    OBJECTIVE: To calculate age-specific short-term and lifetime probabilities of breast cancer among a cohort of Canadian women. DESIGN: Double decrement life table. SETTING: Alberta. SUBJECTS: Women with first invasive breast cancers registered with the Alberta Cancer Registry between 1985 and 1987. MAIN OUTCOME MEASURES: Lifetime probability of breast cancer from birth and for women at various ages; short-term (up to 10 years) probability of breast cancer for women at various ages. RESULTS: The lifetime probability of breast cancer is 10.17% at birth and peaks at 10.34% at age 25 years, after which it decreases owing to a decline in the number of years over which breast cancer risk will be experienced. However, the probability of manifesting breast cancer in the next year increases steadily from the age of 30 onward, reaching 0.36% at 85 years. The probability of manifesting the disease within the next 10 years peaks at 2.97% at age 70 and decreases thereafter, again owing to declining probabilities of surviving the interval. CONCLUSIONS: Given that the incidence of breast cancer among Albertan women during the study period was similar to the national average, we conclude that currently more than 1 in 10 women in Canada can expect to have breast cancer at some point during their life. However, risk varies considerably over a woman's lifetime, with most risk concentrated after age 49. On the basis of the shorter-term age-specific risks that we present, the clinician can put breast cancer risk into perspective for younger women and heighten awareness among women aged 50 years or more. PMID:8287343
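
    A minimal double-decrement life-table fragment shows how such probabilities combine incidence with competing mortality; all hazards below are invented, not the Alberta rates.

    ```python
    import numpy as np

    # Ten one-year intervals starting at age 50 (hypothetical hazards)
    incidence = np.full(10, 0.003)   # annual breast-cancer probability
    mortality = np.full(10, 0.010)   # annual other-cause mortality

    alive = 1.0      # probability of being alive and cancer-free
    p_cancer = 0.0   # cumulative probability of developing cancer
    for q_c, q_d in zip(incidence, mortality):
        p_cancer += alive * q_c               # develop cancer this year
        alive *= (1.0 - q_c) * (1.0 - q_d)    # survive both decrements
    print(f"10-year probability: {p_cancer:.2%}")
    ```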

  10. Towards an Emergent View of Learning Work

    ERIC Educational Resources Information Center

    Johnsson, Mary C.; Boud, David

    2010-01-01

    The purpose of this paper is to challenge models of workplace learning that seek to isolate or manipulate a limited set of features to increase the probability of learning. Such models typically attribute learning (or its absence) to individual engagement, manager expectations or organizational affordances and are therefore at least implicitly…

  11. Mixture Modeling of Individual Learning Curves

    ERIC Educational Resources Information Center

    Streeter, Matthew

    2015-01-01

    We show that student learning can be accurately modeled using a mixture of learning curves, each of which specifies error probability as a function of time. This approach generalizes Knowledge Tracing [7], which can be viewed as a mixture model in which the learning curves are step functions. We show that this generality yields order-of-magnitude…
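
    The abstract is truncated here, but the mixture idea can be sketched: each curve maps practice opportunities to an error probability, and the model averages curves under mixture weights. The power-law form below is an assumption for illustration, not necessarily the paper's parameterization.

    ```python
    import numpy as np

    def power_curve(t, a, b):
        """One learning curve: error probability after t practice opportunities."""
        return a * (t + 1.0) ** (-b)

    def mixture_error(t, weights, params):
        """Mixture of learning curves: weighted average error probability."""
        return sum(w * power_curve(t, a, b) for w, (a, b) in zip(weights, params))

    t = np.arange(10)
    print(mixture_error(t, [0.6, 0.4], [(0.5, 1.0), (0.3, 0.2)]))
    ```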

  12. Detecting Learning Moment-by-Moment

    ERIC Educational Resources Information Center

    Baker, Ryan S. J. D.; Goldstein, Adam B.; Heffernan, Neil T.

    2011-01-01

    Intelligent tutors have become increasingly accurate at detecting whether a student knows a skill, or knowledge component (KC), at a given time. However, current student models do not tell us exactly at which point a KC is learned. In this paper, we present a machine-learned model that assesses the probability that a student learned a KC at a…
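
    The student model underlying such detectors is typically Bayesian Knowledge Tracing; a standard single-step BKT update (standard BKT, not the paper's moment-by-moment extension; parameter values hypothetical) looks like this:

    ```python
    def bkt_update(p_known, correct, guess=0.2, slip=0.1, learn=0.15):
        """One Bayesian Knowledge Tracing step: condition P(known) on the
        observed response, then apply the learning transition."""
        if correct:
            posterior = p_known * (1 - slip) / (p_known * (1 - slip) + (1 - p_known) * guess)
        else:
            posterior = p_known * slip / (p_known * slip + (1 - p_known) * (1 - guess))
        return posterior + (1 - posterior) * learn

    p = 0.3
    for response in [True, True, False, True]:
        p = bkt_update(p, response)
    print(f"P(skill known) = {p:.3f}")
    ```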

  13. Obtaining Well Calibrated Probabilities Using Bayesian Binning

    PubMed Central

    Naeini, Mahdi Pakdaman; Cooper, Gregory F.; Hauskrecht, Milos

    2015-01-01

    Learning probabilistic predictive models that are well calibrated is critical for many prediction and decision-making tasks in artificial intelligence. In this paper we present a new non-parametric calibration method called Bayesian Binning into Quantiles (BBQ) which addresses key limitations of existing calibration methods. The method post-processes the output of a binary classification algorithm; thus, it can be readily combined with many existing classification algorithms. The method is computationally tractable and empirically accurate, as evidenced by the set of experiments reported here on both real and simulated datasets. PMID:25927013
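
    The single-binning special case conveys the flavor (BBQ itself averages over many binnings in a Bayesian way); this sketch maps raw scores to empirical event frequencies per bin:

    ```python
    import numpy as np

    def histogram_calibrate(scores, labels, bins=10):
        """Equal-width binning: replace each score with the empirical event
        frequency of its bin (the one-binning special case of BBQ)."""
        scores, labels = np.asarray(scores), np.asarray(labels)
        edges = np.linspace(0.0, 1.0, bins + 1)
        idx = np.clip(np.digitize(scores, edges) - 1, 0, bins - 1)
        cal = np.empty(bins)
        for b in range(bins):
            in_bin = idx == b
            cal[b] = labels[in_bin].mean() if in_bin.any() else 0.5 * (edges[b] + edges[b + 1])
        return edges, cal

    rng = np.random.default_rng(6)
    scores = rng.random(1000)
    labels = (rng.random(1000) < scores**2).astype(int)   # deliberately miscalibrated
    edges, cal = histogram_calibrate(scores, labels)
    ```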

  14. Motivation through Mastery Learning

    ERIC Educational Resources Information Center

    Josten, Denice

    2006-01-01

    Every developmental education teacher would probably agree that motivation to complete assignments is one of the biggest obstacles to getting students to practice necessary skills. Short stories and other types of recreational reading might help motivate students, but they need to learn specific skills to help them cope with college textbook…

  15. Learning from Our Students

    ERIC Educational Resources Information Center

    Noddings, Nel

    2004-01-01

    Most teachers have been good students. Some students are fast learners and attain the required knowledge and skills easily; others are obedient, hard workers. In either case, teachers are likely to believe that if students really try, they will do well. Listening to students over many years, the author has learned that this is probably not true.…

  16. Trending in Probability of Collision Measurements

    NASA Technical Reports Server (NTRS)

    Vallejo, J. J.; Hejduk, M. D.; Stamey, J. D.

    2015-01-01

    A simple model is proposed to predict the behavior of probabilities of collision (Pc) for conjunction events. The model attempts to predict the location and magnitude of the peak Pc value for an event by assuming the progression of Pc values can be modeled to first order by a downward-opening parabola. To incorporate prior information from a large database of past conjunctions, the Bayes paradigm is utilized; and the operating characteristics of the model are established through a large simulation study. Though the model is simple, it performs well in predicting the temporal location of the peak Pc and thus shows promise as a decision aid in operational conjunction assessment risk analysis.
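
    As a least-squares stand-in for the paper's Bayesian fit, the parabola model reduces to a quadratic polynomial fit; the Pc history below is invented.

    ```python
    import numpy as np

    # Hypothetical log10(Pc) history versus days to closest approach
    t = np.array([-5.0, -4.0, -3.0, -2.0, -1.5])
    log_pc = np.array([-7.2, -6.1, -5.4, -5.0, -5.1])

    a, b, c = np.polyfit(t, log_pc, 2)    # downward-opening parabola: a < 0
    t_peak = -b / (2.0 * a)               # predicted time of the peak Pc
    peak_log_pc = np.polyval([a, b, c], t_peak)
    print(f"peak log10(Pc) ~ {peak_log_pc:.2f} at t = {t_peak:.2f} days")
    ```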

  17. Quantum probabilities for inflation from holography

    NASA Astrophysics Data System (ADS)

    Hartle, James B.; Hawking, S. W.; Hertog, Thomas

    2014-01-01

    The evolution of the universe is determined by its quantum state. The wave function of the universe obeys the constraints of general relativity and in particular the Wheeler-DeWitt equation (WDWE). For non-zero Λ, we show that solutions of the WDWE at large volume have two domains in which geometries and fields are asymptotically real. In one the histories are Euclidean asymptotically anti-de Sitter, in the other they are Lorentzian asymptotically classical de Sitter. Further, the universal complex semiclassical asymptotic structure of solutions of the WDWE implies that the leading order in hbar quantum probabilities for classical, asymptotically de Sitter histories can be obtained from the action of asymptotically anti-de Sitter configurations. This leads to a promising, universal connection between quantum cosmology and holography.

  18. Audio feature extraction using probability distribution function

    NASA Astrophysics Data System (ADS)

    Suhaib, A.; Wan, Khairunizam; Aziz, Azri A.; Hazry, D.; Razlan, Zuradzman M.; Shahriman A., B.

    2015-05-01

    Voice recognition has been one of the popular applications in the robotics field, and it has also recently been used in biometric and multimedia information retrieval systems. This technology derives from successive research on audio feature extraction analysis. The probability distribution function (PDF) is a statistical method that is usually used as one step within more complex feature extraction methods such as GMM and PCA. In this paper, a new method for audio feature extraction is proposed that uses the PDF by itself as the feature extraction method for speech analysis. Certain pre-processing techniques are performed prior to the proposed feature extraction. Subsequently, the PDF values for each frame of sampled voice signals, obtained from a number of individuals, are plotted. From the experimental results, it can be seen visually from the plotted data that each individual's voice has comparable PDF values and shapes.

  19. Probability of Brownian motion hitting an obstacle

    SciTech Connect

    Knessl, C.; Keller, J.B.

    2000-02-01

    The probability p(x) that Brownian motion with drift, starting at x, hits an obstacle is analyzed. The obstacle Ω is a compact subset of R^n. It is shown that p(x) is expressible in terms of the field U(x) scattered by Ω when it is hit by a plane wave. Therefore results for U(x), and methods for finding U(x), can be used to determine p(x). The authors illustrate this by obtaining exact and asymptotic results for p(x) when Ω is a slit in R^2, and asymptotic results when Ω is a disc in R^3.

  20. Carrier Modulation Via Waveform Probability Density Function

    NASA Technical Reports Server (NTRS)

    Williams, Glenn L.

    2004-01-01

    Beyond the classic modes of carrier modulation by varying amplitude (AM), phase (PM), or frequency (FM), we extend the modulation domain of an analog carrier signal to include a class of general modulations which are distinguished by their probability density function histogram. Separate waveform states are easily created by varying the pdf of the transmitted waveform. Individual waveform states are assignable as proxies for digital ONEs or ZEROs. At the receiver, these states are easily detected by accumulating sampled waveform statistics and performing periodic pattern matching, correlation, or statistical filtering. No fundamental natural laws are broken in the detection process. We show how a typical modulation scheme would work in the digital domain and suggest how to build an analog version. We propose that clever variations of the modulating waveform (and thus the histogram) can provide simple steganographic encoding.

  1. Carrier Modulation Via Waveform Probability Density Function

    NASA Technical Reports Server (NTRS)

    Williams, Glenn L.

    2006-01-01

    Beyond the classic modes of carrier modulation by varying amplitude (AM), phase (PM), or frequency (FM), we extend the modulation domain of an analog carrier signal to include a class of general modulations which are distinguished by their probability density function histogram. Separate waveform states are easily created by varying the pdf of the transmitted waveform. Individual waveform states are assignable as proxies for digital ones or zeros. At the receiver, these states are easily detected by accumulating sampled waveform statistics and performing periodic pattern matching, correlation, or statistical filtering. No fundamental physical laws are broken in the detection process. We show how a typical modulation scheme would work in the digital domain and suggest how to build an analog version. We propose that clever variations of the modulating waveform (and thus the histogram) can provide simple steganographic encoding.

  2. 5426 Sharp: A Probable Hungaria Binary

    NASA Astrophysics Data System (ADS)

    Warner, Brian D.; Benishek, Vladimir; Ferrero, Andrea

    2015-07-01

    Initial CCD photometry observations of the Hungaria asteroid 5426 Sharp in 2014 December and 2015 January at the Center of Solar System Studies-Palmer Divide Station in Landers, CA, showed attenuations from the general lightcurve, indicating the possibility of the asteroid being a binary system. The secondary period was almost exactly an Earth day, prompting a collaboration with observers in Europe, which eventually allowed two periods to be established: P1 = 4.5609 ± 0.0003 h, A1 = 0.18 ± 0.01 mag, and P2 = 24.22 ± 0.02 h, A2 = 0.08 ± 0.01 mag. No mutual events, i.e., occultations and/or eclipses, were seen; the asteroid is therefore considered a probable, but not confirmed, binary.

  3. Objective Lightning Probability Forecast Tool Phase II

    NASA Technical Reports Server (NTRS)

    Lambert, Winnie

    2007-01-01

    This presentation describes the improvement of a set of lightning probability forecast equations that are used by the 45th Weather Squadron forecasters for their daily 1100 UTC (0700 EDT) weather briefing during the warm season months of May-September. This information is used for general scheduling of operations at Cape Canaveral Air Force Station and Kennedy Space Center. Forecasters at the Spaceflight Meteorology Group also make thunderstorm forecasts during Shuttle flight operations. Five modifications were made by the Applied Meteorology Unit: increased the period of record from 15 to 17 years, changed the method of calculating the flow regime of the day, calculated a new optimal layer relative humidity, used a new smoothing technique for the daily climatology, and used a new valid area. The test results indicated that the modified equations showed an increase in skill over the current equations, good reliability, and an ability to distinguish between lightning and non-lightning days.

  4. Quantum probabilities for inflation from holography

    SciTech Connect

    Hartle, James B.; Hawking, S. W.; Hertog, Thomas

    2014-01-01

    The evolution of the universe is determined by its quantum state. The wave function of the universe obeys the constraints of general relativity and in particular the Wheeler-DeWitt equation (WDWE). For non-zero Λ, we show that solutions of the WDWE at large volume have two domains in which geometries and fields are asymptotically real. In one the histories are Euclidean asymptotically anti-de Sitter, in the other they are Lorentzian asymptotically classical de Sitter. Further, the universal complex semiclassical asymptotic structure of solutions of the WDWE implies that the leading order in h-bar quantum probabilities for classical, asymptotically de Sitter histories can be obtained from the action of asymptotically anti-de Sitter configurations. This leads to a promising, universal connection between quantum cosmology and holography.

  5. On the probability of dinosaur fleas.

    PubMed

    Dittmar, Katharina; Zhu, Qiyun; Hastriter, Michael W; Whiting, Michael F

    2016-01-01

    Recently, a set of publications described flea fossils from Jurassic and Early Cretaceous geological strata in northeastern China, which were suggested to have parasitized feathered dinosaurs, pterosaurs, and early birds or mammals. In support of these fossils being fleas, a recent publication in BMC Evolutionary Biology described the extended abdomen of a female fossil specimen as due to blood feeding. We here comment on these findings, and conclude that the current interpretation of the evolutionary trajectory and ecology of these putative dinosaur fleas is based on appeal to probability, rather than evidence. Hence, their taxonomic positioning as fleas, or stem fleas, as well as their ecological classification as ectoparasites and blood feeders is not supported by currently available data. PMID:26754250

  6. Model estimates hurricane wind speed probabilities

    NASA Astrophysics Data System (ADS)

    Murnane, Richard J.; Barton, Chris; Collins, Eric; Donnelly, Jeffrey; Elsner, James; Emanuel, Kerry; Ginis, Isaac; Howard, Susan; Landsea, Chris; Liu, Kam-biu; Malmquist, David; McKay, Megan; Michaels, Anthony; Nelson, Norm; O'Brien, James; Scott, David; Webb, Thompson, III

    In the United States, intense hurricanes (category 3, 4, and 5 on the Saffir/Simpson scale) with winds greater than 50 m s^-1 have caused more damage than any other natural disaster [Pielke and Pielke, 1997]. Accurate estimates of wind speed exceedance probabilities (WSEP) due to intense hurricanes are therefore of great interest to (re)insurers, emergency planners, government officials, and populations in vulnerable coastal areas. The historical record of U.S. hurricane landfall is relatively complete only from about 1900, and most model estimates of WSEP are derived from this record. During the 1899-1998 period, only two category-5 and 16 category-4 hurricanes made landfall in the United States. The historical record therefore provides only a limited sample of the most intense hurricanes.

  7. Homonymous Hemianopsia Associated with Probable Alzheimer's Disease.

    PubMed

    Ishiwata, Akiko; Kimura, Kazumi

    2016-01-01

    Posterior cortical atrophy (PCA) is a rare neurodegenerative disorder involving cerebral atrophy in the parietal, occipital, or occipitotemporal cortices and characterized by visuospatial and visuoperceptual impairments. Most cases are pathologically compatible with Alzheimer's disease (AD). We describe a case of PCA in which a combination of imaging methods, in conjunction with symptoms and neurological and neuropsychological examinations, led to its being diagnosed and to AD being identified as its probable cause. Treatment with donepezil for 6 months mildly improved alexia symptoms, but other symptoms remained unchanged. A 59-year-old Japanese woman with progressive alexia, visual deficit, and mild memory loss was referred to our neurologic clinic for the evaluation of right homonymous hemianopsia. Our neurological examination showed alexia, constructional apraxia, mild disorientation, short-term memory loss, and right homonymous hemianopsia. These findings resulted in a score of 23 (of 30) points on the Mini-Mental State Examination. Occipital atrophy was identified, with magnetic resonance imaging (MRI) showing left-side dominance. The MRI data were quantified with voxel-based morphometry, and PCA was diagnosed on the basis of these findings. Single photon emission computed tomography with (123)I-N-isopropyl-p-iodoamphetamine showed hypoperfusion in the occipital lobes, corresponding to the voxel-based morphometry findings. Additionally, the finding of hypoperfusion in the posterior association cortex, posterior cingulate gyrus, and precuneus was consistent with AD. Therefore, the PCA was considered to be a result of AD. We considered Lewy body dementia as a differential diagnosis because of the presence of hypoperfusion in the occipital lobes. However, the patient did not meet the criteria for Lewy body dementia during the course of the disease. We therefore consider including PCA in the differential diagnoses to be important for patients with visual deficit, cognitive

  8. Repetition probability effects for inverted faces.

    PubMed

    Grotheer, Mareike; Hermann, Petra; Vidnyánszky, Zoltán; Kovács, Gyula

    2014-11-15

    It has been shown that the repetition-related reduction of the blood-oxygen level dependent (BOLD) signal is modulated by the probability of repetitions (P(rep)) for faces (Summerfield et al., 2008), providing support for the predictive coding (PC) model of visual perception (Rao and Ballard, 1999). However, the stage of face processing at which repetition suppression (RS) is modulated by P(rep) is still unclear. Face inversion is known to interrupt higher-level configural/holistic face processing steps; if modulation of RS by P(rep) takes place at these stages of face processing, P(rep) effects should be reduced for inverted as compared to upright faces. Therefore, here we aimed at investigating whether P(rep) effects on RS observed for face stimuli originate at the higher-level configural/holistic stages of face processing by comparing these effects for upright and inverted faces. As in previous studies, we manipulated P(rep) for pairs of stimuli in individual blocks of fMRI recordings. This manipulation significantly influenced repetition suppression in the posterior FFA, the OFA, and the LO, independently of stimulus orientation. Our results thus reveal that RS in the ventral visual stream is modulated by P(rep) even in the case of face inversion, and hence of strongly compromised configural/holistic face processing. An additional whole-brain analysis could not identify any areas where the modulatory effect of probability was orientation specific either. These findings imply that P(rep) effects on RS might originate from the earlier stages of face processing. PMID:25123974

  9. Direct Updating of an RNA Base-Pairing Probability Matrix with Marginal Probability Constraints

    PubMed Central

    2012-01-01

    A base-pairing probability matrix (BPPM) stores the probabilities for every possible base pair in an RNA sequence and has been used in many algorithms in RNA informatics (e.g., RNA secondary structure prediction and motif search). In this study, we propose a novel algorithm to perform iterative updates of a given BPPM, satisfying marginal probability constraints that are (approximately) given by recently developed biochemical experiments, such as SHAPE, PAR, and FragSeq. The method is easily implemented and is applicable to common models for RNA secondary structures, such as energy-based or machine-learning-based models. In this article, we focus mainly on the details of the algorithms, although preliminary computational experiments will also be presented. PMID:23210474
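
    The abstract does not spell out the update rule, but one simple way to push a BPPM toward given marginal pairing probabilities is symmetric iterative proportional scaling. The sketch below only illustrates that general idea (the target marginals q and the convergence details are assumptions, not the authors' algorithm):

        import numpy as np

        def scale_to_marginals(P, q, n_iter=100):
            # P: symmetric N x N base-pairing probability matrix (zero diagonal)
            # q: length-N target marginals, e.g. pairing probabilities suggested
            #    by SHAPE/PAR/FragSeq-type experiments (hypothetical inputs)
            P = P.copy()
            for _ in range(n_iter):
                r = P.sum(axis=1)                  # current marginal of each base
                s = np.where(r > 0, q / r, 0.0)    # per-base correction factors
                P *= np.sqrt(np.outer(s, s))       # symmetric rescaling of p_ij
            return P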

  10. A model selection algorithm for a posteriori probability estimation with neural networks.

    PubMed

    Arribas, Juan Ignacio; Cid-Sueiro, Jesús

    2005-07-01

    This paper proposes a novel algorithm to jointly determine the structure and the parameters of a posteriori probability models based on neural networks (NNs). It makes use of well-known ideas of pruning, splitting, and merging neural components and takes advantage of the probabilistic interpretation of these components. The algorithm, called a posteriori probability model selection (PPMS), is applied to an NN architecture called the generalized softmax perceptron (GSP), whose outputs can be understood as probabilities, although the results can be extended to more general network architectures. Learning rules are derived from the application of the expectation-maximization algorithm to the GSP-PPMS structure. Simulation results show the advantages of the proposed algorithm with respect to other schemes. PMID:16121722
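
    For concreteness, a minimal NumPy sketch of the GSP output stage as described: several softmax outputs (subclasses) are grouped per class, and the class posterior is the sum of its subclass outputs. The weight matrix and subclass-to-class map below are hypothetical, and the pruning/splitting/merging steps of PPMS are not shown:

        import numpy as np

        def gsp_posteriors(x, W, class_of):
            # W: (n_subclasses, n_features) linear weights (hypothetical)
            # class_of: (n_subclasses,) integer map from subclass to class
            z = W @ x
            y = np.exp(z - z.max())
            y /= y.sum()                       # softmax over all subclasses
            p = np.zeros(class_of.max() + 1)
            np.add.at(p, class_of, y)          # class posterior = sum of its subclasses
            return p

    PPMS would then split, merge, or prune subclasses by monitoring these subclass outputs during EM-style training.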

  11. Probability, conditional probability and complementary cumulative distribution functions in performance assessment for radioactive waste disposal

    SciTech Connect

    Helton, J.C.

    1996-03-01

    A formal description of the structure of several recent performance assessments (PAs) for the Waste Isolation Pilot Plant (WIPP) is given in terms of the following three components: a probability space (S_st, S_st, p_st) for stochastic uncertainty, a probability space (S_su, S_su, p_su) for subjective uncertainty, and a function (i.e., a random variable) defined on the product space associated with (S_st, S_st, p_st) and (S_su, S_su, p_su). The explicit recognition of the existence of these three components allows a careful description of the use of probability, conditional probability and complementary cumulative distribution functions within the WIPP PA. This usage is illustrated in the context of the U.S. Environmental Protection Agency's standard for the geologic disposal of radioactive waste (40 CFR 191, Subpart B). The paradigm described in this presentation can also be used to impose a logically consistent structure on PAs for other complex systems.
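
    In that notation, the central object is a CCDF over stochastic uncertainty, conditional on an element of the subjective-uncertainty space. A schematic rendering consistent with the abstract (a sketch, with f the consequence function, i.e., the abstract's random variable):

        % CCDF of consequence f at level R, conditional on x_su
        \mathrm{CCDF}(R \mid x_{su})
          = p_{st}\left( \{\, x_{st} \in S_{st} : f(x_{st}, x_{su}) > R \,\} \right)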

  12. Correlation between the clinical pretest probability score and the lung ventilation and perfusion scan probability

    PubMed Central

    Bhoobalan, Shanmugasundaram; Chakravartty, Riddhika; Dolbear, Gill; Al-Janabi, Mazin

    2013-01-01

    Purpose: The aim of the study was to determine the accuracy of the clinical pretest probability (PTP) score and its association with the lung ventilation and perfusion (VQ) scan. Materials and Methods: A retrospective analysis of 510 patients who had a lung VQ scan between 2008 and 2010 was performed. Of the 510 studies, the numbers of normal, low, and high probability VQ scans were 155 (30%), 289 (57%), and 55 (11%), respectively. Results: A total of 103 patients underwent computed tomography pulmonary angiography (CTPA), of whom 21 (20%) had a positive scan, 81 (79%) had a negative scan, and one (1%) had an equivocal result. The rates of PE in the normal, low-probability, and high-probability scan categories were 2 (9.5%), 10 (47.5%), and 9 (43%), respectively. A very low correlation (Pearson correlation coefficient r = 0.20) was found between the clinical PTP score and the lung VQ scan. The area under the curve (AUC) of the clinical PTP score was 52% when compared with the CTPA results. However, the accuracy of the lung VQ scan was better (AUC = 74%) when compared with the CTPA scan. Conclusion: The clinical PTP score is unreliable on its own; however, it may still aid in the interpretation of the lung VQ scan. The accuracy of the lung VQ scan was better in the assessment of underlying pulmonary embolism (PE). PMID:24379532

  13. Semi-supervised dimensionality reduction using estimated class membership probabilities

    NASA Astrophysics Data System (ADS)

    Li, Wei; Ruan, Qiuqi; Wan, Jun

    2012-10-01

    In solving pattern-recognition tasks with partially labeled training data, the semi-supervised dimensionality reduction method, which considers both labeled and unlabeled data, is preferable for improving the classification and generalization capability of the testing data. Among such techniques, graph-based semi-supervised learning methods have attracted a lot of attention due to their appealing properties in discovering discriminative structure and geometric structure of data points. Although they have achieved remarkable success, they cannot promise good performance when the size of the labeled data set is small, as a result of inaccurate class matrix variance approximated by insufficient labeled training data. In this paper, we tackle this problem by combining class membership probabilities estimated from unlabeled data and ground-truth class information associated with labeled data to more precisely characterize the class distribution. Therefore, it is expected to enhance performance in classification tasks. We refer to this approach as probabilistic semi-supervised discriminant analysis (PSDA). The proposed PSDA is applied to face and facial expression recognition tasks and is evaluated using the ORL, Extended Yale B, and CMU PIE face databases and the Cohn-Kanade facial expression database. The promising experimental results demonstrate the effectiveness of our proposed method.
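
    The core bookkeeping step, building a class-assignment matrix that mixes hard labels with estimated membership probabilities, might look like the following sketch (a generic reading of the idea, with hypothetical names, not the exact PSDA formulation):

        import numpy as np

        def soft_label_matrix(y, probs, n_classes):
            # y: (n_samples,) class index for labeled points, -1 for unlabeled
            # probs: (n_samples, n_classes) estimated membership probabilities,
            #        e.g. from a classifier trained on the labeled subset
            F = np.asarray(probs, dtype=float).copy()
            labeled = y >= 0
            F[labeled] = 0.0
            F[labeled, y[labeled]] = 1.0   # one-hot rows for labeled data
            return F
            # weighted class means would then be F.T @ X / F.sum(0)[:, None]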

  14. Atomic Transition Probabilities for Neutral Cerium

    NASA Astrophysics Data System (ADS)

    Lawler, J. E.; den Hartog, E. A.; Wood, M. P.; Nitz, D. E.; Chisholm, J.; Sobeck, J.

    2009-10-01

    The spectra of neutral cerium (Ce I) and singly ionized cerium (Ce II) are more complex than the spectra of other rare earth species. The resulting high density of lines in the visible makes Ce ideal for use in metal halide (MH) High Intensity Discharge (HID) lamps. Inclusion of cerium-iodide in a lamp dose can improve both the Color Rendering Index and the luminous efficacy of a MH-HID lamp. Basic spectroscopic data including absolute atomic transition probabilities for Ce I and Ce II are needed for diagnosing and modeling these MH-HID lamps. Recent work on Ce II [1] is now being augmented with similar work on Ce I. Radiative lifetimes from laser induced fluorescence measurements [2] on neutral Ce are being combined with emission branching fractions from spectra recorded using a Fourier transform spectrometer. A total of 14 high resolution spectra are being analyzed to determine branching fractions for 2000 to 3000 lines from 153 upper levels in neutral Ce. Representative data samples and progress to date will be presented. [1] J. E. Lawler, C. Sneden, J. J. Cowan, I. I. Ivans, and E. A. Den Hartog, Astrophys. J. Suppl. Ser. 182, 51-79 (2009). [2] E. A. Den Hartog, K. P. Buettner, and J. E. Lawler, J. Phys. B: Atomic, Molecular & Optical Physics 42, 085006 (7pp) (2009).

  15. Joint probability distributions and fluctuation theorems

    NASA Astrophysics Data System (ADS)

    García-García, Reinaldo; Lecomte, Vivien; Kolton, Alejandro B.; Domínguez, Daniel

    2012-02-01

    We derive various exact results for Markovian systems that spontaneously relax to a non-equilibrium steady state by using joint probability distribution symmetries of different entropy production decompositions. The analytical approach is applied to diverse problems such as the description of the fluctuations induced by experimental errors, the unveiling of symmetries of correlation functions appearing in fluctuation-dissipation relations recently generalized to non-equilibrium steady states, and the mapping of averages between different trajectory-based dynamical ensembles. Many known fluctuation theorems arise as special instances of our approach for particular twofold decompositions of the total entropy production. As a complement, we also briefly review and synthesize the variety of fluctuation theorems applying to stochastic dynamics of both continuous systems described by a Langevin dynamics and discrete systems obeying a Markov dynamics, emphasizing how these results emerge from distinct symmetries of the dynamical entropy of the trajectory followed by the system. For Langevin dynamics, we endow the 'dual dynamics' with a physical meaning, and for Markov systems we show how the fluctuation theorems translate into symmetries of modified evolution operators.

  16. Do aftershock probabilities decay with time?

    USGS Publications Warehouse

    Michael, Andrew J.

    2012-01-01

    So, do aftershock probabilities decay with time? Consider a thought experiment in which we are at the time of the mainshock and ask how many aftershocks will occur a day, week, month, year, or even a century from now. First we must decide how large a window to use around each point in time. Let's assume that, as we go further into the future, we are asking a less precise question. Perhaps a day from now means 1 day ± 10% of a day, a week from now means 1 week ± 10% of a week, and so on. If we ignore c because it is a small fraction of a day (e.g., Reasenberg and Jones, 1989, hereafter RJ89), and set p = 1 because it is usually close to 1 (its value in the original Omori law), then the rate of earthquakes (K/t) decays as 1/t. If the length of the windows being considered increases proportionally to t, then the number of earthquakes at any time from now is the same because the rate decrease is canceled by the increase in the window duration. Under these conditions we should never think "It's a bit late for this to be an aftershock."
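
    The arithmetic is easy to check numerically: with c ignored and p = 1, the expected count in a window of +/-10% around time t is the integral of K/t over [0.9t, 1.1t], which equals K ln(11/9) regardless of t. A small sketch (K is a hypothetical productivity constant):

        import math

        K = 100.0                                # hypothetical productivity
        for t in [1, 7, 30, 365, 36500]:         # day, week, month, year, century
            expected = K * math.log((1.1 * t) / (0.9 * t))   # integral of K/s
            print(t, round(expected, 2))         # identical for every t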

  17. Parametric probability distributions for anomalous change detection

    SciTech Connect

    Theiler, James P; Foy, Bernard R; Wohlberg, Brendt E; Scovel, James C

    2010-01-01

    The problem of anomalous change detection arises when two (or possibly more) images are taken of the same scene, but at different times. The aim is to discount the 'pervasive differences' that occur throughout the imagery, due to the inevitably different conditions under which the images were taken (caused, for instance, by differences in illumination, atmospheric conditions, sensor calibration, or misregistration), and to focus instead on the 'anomalous changes' that actually take place in the scene. In general, anomalous change detection algorithms attempt to model these normal or pervasive differences, based on data taken directly from the imagery, and then identify as anomalous those pixels for which the model does not hold. For many algorithms, these models are expressed in terms of probability distributions, and there is a class of such algorithms that assume the distributions are Gaussian. By considering a broader class of distributions, however, a new class of anomalous change detection algorithms can be developed. We consider several parametric families of such distributions, derive the associated change detection algorithms, and compare the performance with standard algorithms that are based on Gaussian distributions. We find that it is often possible to significantly outperform these standard algorithms, even using relatively simple non-Gaussian models.
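
    As a baseline for comparison, the Gaussian member of this class can be written compactly: score each pixel pair by how improbable the joint observation is relative to the two marginals. The sketch below is a generic Gaussian formulation under that reading, not the paper's non-Gaussian variants; per-pixel constants such as log-determinants are dropped since they shift all scores equally:

        import numpy as np

        def gaussian_acd_scores(X, Y):
            # X, Y: (n_pixels, n_bands) co-registered images at two times
            Z = np.hstack([X, Y])
            def neg_loglik(D):
                D = D - D.mean(axis=0)
                C = np.cov(D, rowvar=False) + 1e-9 * np.eye(D.shape[1])
                Ci = np.linalg.inv(C)
                # per-pixel Mahalanobis term of the Gaussian log-likelihood
                return 0.5 * np.einsum('ij,jk,ik->i', D, Ci, D)
            # high score = jointly improbable relative to the marginals
            return neg_loglik(Z) - neg_loglik(X) - neg_loglik(Y)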

  18. Lectures on probability and statistics. Revision

    SciTech Connect

    Yost, G.P.

    1985-06-01

    These notes are based on a set of statistics lectures delivered at Imperial College to the first-year postgraduate students in High Energy Physics. They are designed for the professional experimental scientist. They begin with the fundamentals of probability theory, in which one makes statements about the set of possible outcomes of an experiment, based upon a complete a priori understanding of the experiment. For example, in a roll of a set of (fair) dice, one understands a priori that any given side of each die is equally likely to turn up. From that, we can calculate the probability of any specified outcome. They finish with the inverse problem, statistics. Here, one begins with a set of actual data (e.g., the outcomes of a number of rolls of the dice), and attempts to make inferences about the state of nature which gave those data (e.g., the likelihood of seeing any given side of any given die turn up). This is a much more difficult problem, of course, and one's solutions often turn out to be unsatisfactory in one respect or another. Hopefully, the reader will come away from these notes with a feel for some of the problems and uncertainties involved. Although there are standard approaches, most of the time there is no cut-and-dried "best" solution, "best" according to every criterion.

  19. Enhanced awakening probability of repetitive impulse sounds.

    PubMed

    Vos, Joos; Houben, Mark M J

    2013-09-01

    In the present study relations between the level of impulse sounds and the observed proportion of behaviorally confirmed awakening reactions were determined. The sounds (shooting sounds, bangs produced by door slamming or by container transshipment, aircraft landings) were presented by means of loudspeakers in the bedrooms of 50 volunteers. The fragments for the impulse sounds consisted of single or multiple events. The sounds were presented during a 6-h period that started 75 min after the subjects wanted to sleep. In order to take account of habituation, each subject participated during 18 nights. At equal indoor A-weighted sound exposure levels, the proportion of awakening for the single impulse sounds was equal to that for the aircraft sounds. The proportion of awakening induced by the multiple impulse sounds, however, was significantly higher. For obtaining the same rate of awakening, the sound level of each of the successive impulses in a fragment had to be about 15-25 dB lower than the level of one single impulse. This level difference was largely independent of the degree of habituation. Various explanations for the enhanced awakening probability are discussed. PMID:23967934

  20. Essays on probability elicitation scoring rules

    NASA Astrophysics Data System (ADS)

    Firmino, Paulo Renato A.; dos Santos Neto, Ademir B.

    2012-10-01

    In probability elicitation exercises it has been usual to consider scoring rules (SRs) to measure the performance of experts when inferring about a given unknown, Θ, for which the true value, θ*, is (or will shortly be) known to the experimenter. Mathematically, SRs quantify the discrepancy between f(θ) (the distribution reflecting the expert's uncertainty about Θ) and d(θ), a zero-one indicator function of the observation θ*. Thus, a remarkable characteristic of SRs is to contrast an expert's beliefs with the observation θ*. The present work aims at extending SR concepts and formulas to the cases where Θ is aleatory, highlighting the advantages of goodness-of-fit and entropy-like measures. Conceptually, it is argued that besides evaluating the personal performance of the expert, SRs may also play a role when comparing the elicitation processes adopted to obtain f(θ). Mathematically, it is proposed to replace d(θ) by g(θ), the distribution that models the randomness of Θ, and to also consider goodness-of-fit and entropy-like metrics, leading to SRs that measure the adherence of f(θ) to g(θ). The implications of this alternative perspective are discussed and illustrated by means of case studies based on the simulation of controlled experiments. The usefulness of the proposed approach for evaluating the performance of experts and elicitation processes is investigated.
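
    A discretized illustration of replacing d(θ) by g(θ), using a quadratic (Brier-type) discrepancy on a grid; both densities below are hypothetical stand-ins for an expert's f and the aleatory model g:

        import numpy as np

        theta = np.linspace(0, 10, 501)
        f = np.exp(-0.5 * ((theta - 4.0) / 1.0) ** 2)   # expert's belief (made up)
        g = np.exp(-0.5 * ((theta - 5.0) / 1.5) ** 2)   # aleatory model (made up)
        f /= np.trapz(f, theta)
        g /= np.trapz(g, theta)

        # adherence of f to g: integral of (f - g)^2 over the grid
        quadratic_divergence = np.trapz((f - g) ** 2, theta)
        print(quadratic_divergence)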

  1. Probability of rupture of multiple fault segments

    USGS Publications Warehouse

    Andrews, D.J.; Schwerer, E.

    2000-01-01

    Fault segments identified from geologic and historic evidence have sometimes been adopted as features limiting the likely extents of earthquake ruptures. There is no doubt that individual segments can sometimes join together to produce larger earthquakes. This work is a trial of an objective method to determine the probability of multisegment ruptures. The frequency of occurrence of events on all conjectured combinations of adjacent segments in northern California is found by fitting to both geologic slip rates and an assumed distribution of event sizes for the region as a whole. Uncertainty in the shape of the distribution near the maximum magnitude has a large effect on the solution. Frequencies of individual events cannot be determined, but it is possible to find a set of frequencies that fits a model closely. A robust conclusion for the San Francisco Bay region is that large multisegment events occur on the San Andreas and San Gregorio faults, but single-segment events predominate on the extended Hayward and Calaveras strands of segments.

  2. Probability judgments under ambiguity and conflict

    PubMed Central

    Smithson, Michael

    2015-01-01

    Whether conflict and ambiguity are distinct kinds of uncertainty remains an open question, as does their joint impact on judgments of overall uncertainty. This paper reviews recent advances in our understanding of human judgment and decision making when both ambiguity and conflict are present, and presents two types of testable models of judgments under conflict and ambiguity. The first type concerns estimate-pooling to arrive at “best” probability estimates. The second type is models of subjective assessments of conflict and ambiguity. These models are developed for dealing with both described and experienced information. A framework for testing these models in the described-information setting is presented, including a reanalysis of a multi-nation data-set to test best-estimate models, and a study of participants' assessments of conflict, ambiguity, and overall uncertainty reported by Smithson (2013). A framework for research in the experienced-information setting is then developed, that differs substantially from extant paradigms in the literature. This framework yields new models of “best” estimates and perceived conflict. The paper concludes with specific suggestions for future research on judgment and decision making under conflict and ambiguity. PMID:26042081

  3. Levetiracetam: Probably Associated Diurnal Frequent Urination.

    PubMed

    Ju, Jun; Zou, Li-Ping; Shi, Xiu-Yu; Hu, Lin-Yan; Pang, Ling-Yu

    2016-01-01

    Diurnal frequent urination is a common condition in elementary school children, who are especially at risk for associated somatic and behavioral problems. Levetiracetam (LEV) is a broad-spectrum antiepileptic drug that has been used in both partial and generalized seizures, with less common adverse effects including psychiatric and behavioral problems. Diurnal frequent urination is not a well-known adverse effect of LEV. Here, we report 2 pediatric cases with epilepsy that developed diurnal frequent urination after LEV administration. Case 1 was a 6-year-old male patient who presented with urinary frequency and urgency in the daytime from the third day after LEV was given as adjunctive therapy. Symptoms increased as the dosage of LEV was raised. Laboratory tests and auxiliary examinations found no evidence of organic disease. Diurnal frequent urination due to LEV was suspected, and the drug was discontinued. As expected, his frequency of urination returned to normal levels. A 13-year-old female patient developed similar clinical manifestations after oral LEV monotherapy, and her symptoms were aggravated under stress. Since the most common causes of frequent micturition had been ruled out, the patient was considered to have LEV-associated psychogenic frequent urination. The dosage of LEV was reduced to one-third, and the frequency of urination was reduced by 60%. Both patients received a Naranjo score of 6, which indicated that LEV was a "probable" cause of diurnal frequent urination. Although a definite causal link between LEV and diurnal urinary frequency in the 2 cases remains to be established, we argue that diurnal frequent urination associated with LEV deserves clinicians' attention. PMID:26938751

  4. Movement disorders of probable infectious origin

    PubMed Central

    Jhunjhunwala, Ketan; Netravathi, M.; Pal, Pramod Kumar

    2014-01-01

    Background: Movement disorders (MDs) associated with infections remain an important group of debilitating disorders in Asian countries. Objectives: The objective of the present study was to report the clinical and imaging profile of a large cohort of patients with MDs probably associated with infection. Materials and Methods: This was a chart review of 35 patients (F:M = 15:20) presenting with MD to the Neurology services of the National Institute of Mental Health and Neurosciences, India. The demographic profile, type of infection, time from infection to MD, phenomenology of MD, and magnetic resonance imaging (MRI) findings were reviewed. Results: The mean age at presentation was 22.6 ± 13.3 years (range: 5-60), the mean age of onset of MD was 15.7 ± 15 years, and the mean duration of symptoms was 6.9 ± 8.1 years (42 days to 32 years). The mean latency of onset of MD after the infection was 5.9 ± 4.2 weeks. The phenomenology of MD was: (1) pure dystonia-28.6%, (2) dystonia with choreoathetosis-22.9%, (3) parkinsonism-14.6%, (4) pure tremor, hemiballismus, myoclonus, and chorea-2.9% each, and (5) mixed MD-22.9%. Most often the MD was generalized (60%), followed by right upper limb (31.4%) and left upper limb (8.6%). A viral encephalitic type of neuroinfection was the most common infection associated with MD (85.7%). Abnormalities on brain MRI, seen in 79.2%, included signal changes in (1) thalamus-52.0%, (2) putamen and subcortical white matter-16% each, (3) pons-12%, (4) striatopallidum, striatum, and grey matter-8% each, and (5) caudate, cerebellum, lentiform nucleus, midbrain, and subthalamic nucleus-4.0% each. Conclusions: MDs associated with infection were most often post-encephalitic. Dystonia was the most common MD, and the thalamus was the most common anatomical site involved. PMID:25221398

  5. Projecting Climate Change Impacts on Wildfire Probabilities

    NASA Astrophysics Data System (ADS)

    Westerling, A. L.; Bryant, B. P.; Preisler, H.

    2008-12-01

    We present preliminary results of the 2008 Climate Change Impact Assessment for wildfire in California, part of the second biennial science report to the California Climate Action Team organized via the California Climate Change Center by the California Energy Commission's Public Interest Energy Research Program pursuant to Executive Order S-03-05 of Governor Schwarzenegger. In order to support decision making by the State pertaining to mitigation of and adaptation to climate change and its impacts, we model wildfire occurrence monthly from 1950 to 2100 under a range of climate scenarios from the Intergovernmental Panel on Climate Change. We use six climate change models (GFDL CM2.1, NCAR PCM1, CNRM CM3, MPI ECHAM5, MIROC3.2 med, NCAR CCSM3) under two emissions scenarios--A2 (CO2 850 ppm max atmospheric concentration) and B1 (CO2 550 ppm max concentration). Climate model output has been downscaled to a 1/8 degree (~12 km) grid using two alternative methods: Bias Correction and Spatial Downscaling (BCSD) and Constructed Analogues (CA). Hydrologic variables have been simulated from temperature, precipitation, wind and radiation forcing data using the Variable Infiltration Capacity (VIC) Macroscale Hydrologic Model. We model wildfire as a function of temperature, moisture deficit, and land surface characteristics using nonlinear logistic regression techniques. Previous work on wildfire climatology and seasonal forecasting has demonstrated that these variables account for much of the inter-annual and seasonal variation in wildfire. The results of this study are monthly gridded probabilities of wildfire occurrence by fire size class, and estimates of the number of structures potentially affected by fires. In this presentation we will explore the range of modeled outcomes for wildfire in California, considering the effects of emissions scenarios, climate model sensitivities, downscaling methods, hydrologic simulations, statistical model specifications for

  6. Failure-probability driven dose painting

    SciTech Connect

    Vogelius, Ivan R.; Håkansson, Katrin; Due, Anne K.; Aznar, Marianne C.; Kristensen, Claus A.; Rasmussen, Jacob; Specht, Lena; Berthelsen, Anne K.; Bentzen, Søren M.

    2013-08-15

    Purpose: To demonstrate a data-driven dose-painting strategy based on the spatial distribution of recurrences in previously treated patients. The result is a quantitative way to define a dose prescription function, optimizing the predicted local control at constant treatment intensity. A dose planning study using the optimized dose prescription in 20 patients is performed. Methods: Patients treated at our center have five tumor subvolumes delineated, from the center of the tumor (the PET-positive volume) outward. The spatial distribution of 48 failures in patients with complete clinical response after (chemo)radiation is used to derive a model for tumor control probability (TCP). The total TCP is fixed to the clinically observed 70% actuarial TCP at five years. Additionally, the authors match the distribution of failures between the five subvolumes to the observed distribution. The steepness of the dose-response is extracted from the literature, and the authors assume 30% and 20% risk of subclinical involvement in the elective volumes. The result is a five-compartment dose response model matching the observed distribution of failures. The model is used to optimize the distribution of dose in individual patients, while keeping the treatment intensity constant and the maximum prescribed dose below 85 Gy. Results: The vast majority of failures occur centrally despite the small volumes of the central regions. Thus, optimizing the dose prescription yields higher doses to the central target volumes and lower doses to the elective volumes. The dose planning study shows that the modified prescription is clinically feasible. The optimized TCP is 89% (range: 82%-91%) as compared to the observed TCP of 70%. Conclusions: The observed distribution of locoregional failures was used to derive an objective, data-driven dose prescription function. The optimized dose is predicted to result in a substantial increase in local control without increasing the predicted risk of toxicity.
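
    A multi-compartment TCP model of this kind can be sketched generically with a Poisson tumor-control form per subvolume and the total TCP as the product over compartments. All parameter values below are invented for illustration; the paper fits its own model to the observed failure distribution:

        import numpy as np

        def compartment_tcp(dose, n0, alpha):
            # Poisson TCP: probability that no clonogen survives dose d,
            # with hypothetical exponential cell kill exp(-alpha * d)
            return np.exp(-n0 * np.exp(-alpha * dose))

        doses = np.array([82.0, 78.0, 74.0, 66.0, 60.0])   # Gy, center outward
        n0    = np.array([1e7, 1e6, 1e5, 1e3, 1e2])        # clonogen numbers (made up)
        alpha = 0.25                                        # Gy^-1 (made up)

        tcp_total = np.prod(compartment_tcp(doses, n0, alpha))
        print(f"predicted TCP: {tcp_total:.2f}")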

  7. On lacunary statistical convergence of order α in probability

    NASA Astrophysics Data System (ADS)

    Işık, Mahmut; Et, Kübra Elif

    2015-09-01

    In this study, we examine the concepts of lacunary statistical convergence of order α in probability and N_θ-convergence of order α in probability. We give some relations connected to these concepts.
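
    For reference, a sketch of the standard form of the first definition (see the paper for the authors' exact conditions): with θ = (k_r) a lacunary sequence, h_r = k_r - k_{r-1}, and I_r = (k_{r-1}, k_r], a sequence of random variables (X_k) is lacunary statistically convergent of order α (0 < α ≤ 1) in probability to X if

        \lim_{r \to \infty} \frac{1}{h_r^{\alpha}}
          \left| \{\, k \in I_r : P(|X_k - X| \ge \varepsilon) \ge \delta \,\} \right| = 0
        \quad \text{for every } \varepsilon, \delta > 0.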

  8. What is preexisting strength? Predicting free association probabilities, similarity ratings, and cued recall probabilities.

    PubMed

    Nelson, Douglas L; Dyrdal, Gunvor M; Goodmon, Leilani B

    2005-08-01

    Measuring lexical knowledge poses a challenge to the study of the influence of preexisting knowledge on the retrieval of new memories. Many tasks focus on word pairs, but words are embedded in associative networks, so how should preexisting pair strength be measured? It has been measured by free association, similarity ratings, and co-occurrence statistics. Researchers interpret free association response probabilities as unbiased estimates of forward cue-to-target strength. In Study 1, analyses of large free association and extralist cued recall databases indicate that this interpretation is incorrect. Competitor and backward strengths bias free association probabilities, and as with other recall tasks, preexisting strength is described by a ratio rule. In Study 2, associative similarity ratings are predicted by forward and backward, but not by competitor, strength. Preexisting strength is not a unitary construct, because its measurement varies with method. Furthermore, free association probabilities predict extralist cued recall better than do ratings and co-occurrence statistics. The measure that most closely matches the criterion task may provide the best estimate of the identity of preexisting strength. PMID:16447386

  9. Learning foraging thresholds for lizards

    SciTech Connect

    Goldberg, L.A.; Hart, W.E.; Wilson, D.B.

    1996-01-12

    This work gives a proof of convergence for a randomized learning algorithm that describes how anoles (lizards found in the Caribbean) learn a foraging threshold distance. This model assumes that an anole will pursue a prey if and only if it is within this threshold of the anole's perch. This learning algorithm was proposed by the biologist Roughgarden and his colleagues. They experimentally confirmed that this algorithm quickly converges to the foraging threshold that is predicted by optimal foraging theory. Our analysis provides an analytic confirmation that the learning algorithm converges to this optimal foraging threshold with high probability.

  10. Probability in Theories With Complex Dynamics and Hardy's Fifth Axiom

    NASA Astrophysics Data System (ADS)

    Burić, Nikola

    2010-08-01

    L. Hardy has formulated an axiomatization program of quantum mechanics and generalized probability theories that has been quite influential. In this paper, properties of typical Hamiltonian dynamical systems are used to argue that there are applications of probability in physical theories of systems with dynamical complexity that require continuous spaces of pure states. Hardy’s axiomatization program does not deal with such theories. In particular Hardy’s fifth axiom does not differentiate between such applications of classical probability and quantum probability.

  11. Asymptotic behavior of the supremum tail probability for anomalous diffusions

    NASA Astrophysics Data System (ADS)

    Michna, Zbigniew

    2008-01-01

    In this paper we investigate asymptotic behavior of the tail probability for subordinated self-similar processes with regularly varying tail probability. We show that the tail probability of the one-dimensional distributions and the supremum tail probability are regularly varying with the pre-factor depending on the moments of the subordinating process. We can apply our result to the so-called anomalous diffusion.

  12. Pretest probability assessment derived from attribute matching

    PubMed Central

    Kline, Jeffrey A; Johnson, Charles L; Pollack, Charles V; Diercks, Deborah B; Hollander, Judd E; Newgard, Craig D; Garvey, J Lee

    2005-01-01

    Background Pretest probability (PTP) assessment plays a central role in diagnosis. This report describes a novel attribute-matching method to generate a PTP for acute coronary syndrome (ACS) and compares it with a validated logistic regression equation (LRE). Methods Eight clinical variables (attributes) were chosen by classification and regression tree analysis of a prospectively collected reference database of 14,796 emergency department (ED) patients evaluated for possible ACS. For attribute matching, a computer program identifies patients within the database who have the exact profile defined by clinician input of the eight attributes. The novel method was compared with the LRE for the ability to produce a PTP estimate <2% in a validation set of 8,120 patients evaluated for possible ACS who did not have ST-segment elevation on ECG. 1,061 patients were excluded prior to validation analysis because of ST-segment elevation (713), missing data (77), or loss to follow-up (271). Results In the validation set, attribute matching produced 267 unique PTP estimates [median PTP value 6%, 1st–3rd quartile 1–10%] compared with the LRE, which produced 96 unique PTP estimates [median 24%, 1st–3rd quartile 10–30%]. The areas under the receiver operating characteristic curves were 0.74 (95% CI 0.65 to 0.82) for attribute matching and 0.68 (95% CI 0.62 to 0.77) for the LRE. The attribute-matching system categorized 1,670 (24%, 95% CI = 23–25%) patients as having a PTP < 2.0%; 28 developed ACS (1.7%, 95% CI = 1.1–2.4%). The LRE categorized 244 (4%, 95% CI = 3–4%) with PTP < 2.0%; four developed ACS (1.6%, 95% CI = 0.4–4.1%). Conclusion Attribute matching estimated a very low PTP for ACS in a significantly larger proportion of ED patients compared with a validated LRE. PMID:16095534
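
    The attribute-matching step itself is simple to express. A pandas sketch of the idea as the abstract describes it (the column names and outcome field are hypothetical, not those of the actual system):

        import pandas as pd

        def attribute_match_ptp(db: pd.DataFrame, profile: dict) -> float:
            # db: reference database, one row per prior patient, with the
            #     eight attribute columns plus an outcome column 'acs_outcome'
            # profile: {attribute_name: value} for the new patient
            mask = pd.Series(True, index=db.index)
            for attr, value in profile.items():
                mask &= db[attr] == value        # exact match on each attribute
            matches = db[mask]
            if matches.empty:
                return float('nan')              # no exact profile match found
            return matches['acs_outcome'].mean() # observed outcome rate = PTP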

  13. Factors influencing reporting and harvest probabilities in North American geese

    USGS Publications Warehouse

    Zimmerman, G.S.; Moser, T.J.; Kendall, W.L.; Doherty, P.F., Jr.; White, Gary C.; Caswell, D.F.

    2009-01-01

    We assessed variation in reporting probabilities of standard bands among species, populations, harvest locations, and size classes of North American geese to enable estimation of unbiased harvest probabilities. We included reward (US$10, $20, $30, $50, or $100) and control ($0) banded geese from 16 recognized goose populations of 4 species: Canada (Branta canadensis), cackling (B. hutchinsii), Ross's (Chen rossii), and snow geese (C. caerulescens). We incorporated spatially explicit direct recoveries and live recaptures into a multinomial model to estimate reporting, harvest, and band-retention probabilities. We compared various models for estimating harvest probabilities at country (United States vs. Canada), flyway (5 administrative regions), and harvest area (i.e., flyways divided into northern and southern sections) scales. Mean reporting probability of standard bands was 0.73 (95% CI 0.69-0.77). Point estimates of reporting probabilities for goose populations or spatial units varied from 0.52 to 0.93, but confidence intervals for individual estimates overlapped and model selection indicated that models with species, population, or spatial effects were less parsimonious than those without these effects. Our estimates were similar to recently reported estimates for mallards (Anas platyrhynchos). We provide current harvest probability estimates for these populations using our direct measures of reporting probability, improving the accuracy of previous estimates obtained from recovery probabilities alone. Goose managers and researchers throughout North America can use our reporting probabilities to correct recovery probabilities estimated from standard banding operations for deriving spatially explicit harvest probabilities.

  14. 14 CFR 417.224 - Probability of failure analysis.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ..., must account for launch vehicle failure probability in a consistent manner. A launch vehicle failure... probabilistically valid. For a launch vehicle with fewer than two flights, the failure probability estimate must... circumstances. For a launch vehicle with two or more flights, launch vehicle failure probability......

  15. 14 CFR 417.224 - Probability of failure analysis.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 14 Aeronautics and Space 4 2013-01-01 2013-01-01 false Probability of failure analysis. 417.224..., DEPARTMENT OF TRANSPORTATION LICENSING LAUNCH SAFETY Flight Safety Analysis § 417.224 Probability of failure..., must account for launch vehicle failure probability in a consistent manner. A launch vehicle...

  16. 14 CFR 417.224 - Probability of failure analysis.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 14 Aeronautics and Space 4 2014-01-01 2014-01-01 false Probability of failure analysis. 417.224..., DEPARTMENT OF TRANSPORTATION LICENSING LAUNCH SAFETY Flight Safety Analysis § 417.224 Probability of failure..., must account for launch vehicle failure probability in a consistent manner. A launch vehicle...

  17. Pig Data and Bayesian Inference on Multinomial Probabilities

    ERIC Educational Resources Information Center

    Kern, John C.

    2006-01-01

    Bayesian inference on multinomial probabilities is conducted based on data collected from the game Pass the Pigs[R]. Prior information on these probabilities is readily available from the instruction manual, and is easily incorporated in a Dirichlet prior. Posterior analysis of the scoring probabilities quantifies the discrepancy between empirical…
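
    The conjugate update at the heart of such an analysis is one line: add observed outcome counts to the Dirichlet pseudo-counts. A sketch with invented numbers (the manual's actual probabilities and any class data would replace them):

        import numpy as np

        outcomes = ['side', 'razorback', 'trotter', 'snouter', 'leaning jowler']
        prior_alpha = np.array([60.0, 22.0, 9.0, 7.0, 2.0])   # hypothetical prior
        counts      = np.array([124, 51, 15, 12, 3])          # hypothetical rolls

        posterior_alpha = prior_alpha + counts                # Dirichlet update
        posterior_mean = posterior_alpha / posterior_alpha.sum()
        for name, p in zip(outcomes, posterior_mean):
            print(f"{name}: {p:.3f}")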

  18. 28 CFR 2.214 - Probable cause hearing and determination.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 28 Judicial Administration 1 2010-07-01 2010-07-01 false Probable cause hearing and determination... § 2.214 Probable cause hearing and determination. (a) Hearing. A supervised releasee who is retaken... been convicted of a new crime, shall be given a probable cause hearing by an examiner of the...

  19. 28 CFR 2.101 - Probable cause hearing and determination.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 28 Judicial Administration 1 2010-07-01 2010-07-01 false Probable cause hearing and determination... Parolees § 2.101 Probable cause hearing and determination. (a) Hearing. A parolee who is retaken and held... convicted of a new crime, shall be given a probable cause hearing by an examiner of the Commission no...

  20. 21 CFR 1316.10 - Administrative probable cause.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 9 2010-04-01 2010-04-01 false Administrative probable cause. 1316.10 Section..., PRACTICES, AND PROCEDURES Administrative Inspections § 1316.10 Administrative probable cause. If the judge or magistrate is satisfied that “administrative probable cause,” as defined in section 510(d)(1)...

  1. Probability Constructs in Preschool Education and How they Are Taught

    ERIC Educational Resources Information Center

    Antonopoulos, Konstantinos; Zacharos, Konstantinos

    2013-01-01

    The teaching of Probability Theory constitutes a new trend in mathematics education internationally. The purpose of this research project was to explore the degree to which preschoolers understand key concepts of probabilistic thinking, such as sample space, the probability of an event and probability comparisons. At the same time, we evaluated an…

  2. Three Dimensional Probability Distributions of the Interplanetary Magnetic Field

    NASA Astrophysics Data System (ADS)

    Podesta, J. J.

    2014-12-01

    Empirical probability density functions (PDFs) of the interplanetary magnetic field (IMF) have been derived from spacecraft data since the early years of the space age. A survey of the literature shows that past studies have investigated the separate Cartesian components of the magnetic field, the vector magnitude, and the direction of the IMF by means of one-dimensional or two-dimensional PDFs. But, to my knowledge, there exist no studies which investigate the three dimensional nature of the IMF by means of three dimensional PDFs, either in (B_x, B_y, B_z)-coordinates or (B_R, B_T, B_N)-coordinates or some other appropriate system of coordinates. Likewise, there exist no studies which investigate three dimensional PDFs of magnetic field fluctuations, that is, vector differences B(t+τ) - B(t). In this talk, I shall present examples of three dimensional PDFs obtained from spacecraft data that demonstrate the solar wind magnetic field possesses a very interesting spatial structure that, to my knowledge, has not previously been identified. Perhaps because of the well known model of Barnes (1981) in which the magnitude of the IMF remains constant, it may be commonly believed that there is nothing new to learn from a full three dimensional PDF. To the contrary, there is much to learn from the investigation of three dimensional PDFs of the solar wind plasma velocity and the magnetic field, as well as three dimensional PDFs of their fluctuations. Knowledge of these PDFs will not only improve understanding of solar wind physics, it is an essential prerequisite for the construction of realistic models of the stochastic time series measured by a single spacecraft, one of the longstanding goals of space physics research. In addition, three dimensional PDFs contain valuable information about the anisotropy of solar wind fluctuations in three dimensional physical space, information that may help identify the reason why the three
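
    Constructing such a three dimensional PDF from spacecraft time series is mechanically straightforward; a minimal sketch using a normalized 3-D histogram (the bin count and lag are arbitrary choices of this sketch):

        import numpy as np

        def empirical_3d_pdf(B, bins=40):
            # B: (n_samples, 3) array of field components, e.g. (B_R, B_T, B_N)
            H, edges = np.histogramdd(B, bins=bins, density=True)
            return H, edges

        def increments(B, tau):
            # vector fluctuations B(t + tau) - B(t), with tau in samples
            return B[tau:] - B[:-tau]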

  3. Total probabilities of ensemble runoff forecasts

    NASA Astrophysics Data System (ADS)

    Olav Skøien, Jon; Bogner, Konrad; Salamon, Peter; Smith, Paul; Pappenberger, Florian

    2016-04-01

    Ensemble forecasting has long been used in meteorological modelling to indicate the uncertainty of the forecasts. However, as the ensembles often exhibit both bias and dispersion errors, it is necessary to calibrate and post-process them. Two of the most common methods for this are Bayesian Model Averaging (Raftery et al., 2005) and Ensemble Model Output Statistics (EMOS) (Gneiting et al., 2005). There are also methods for regionalizing these approaches (Berrocal et al., 2007) and for incorporating the correlation between lead times (Hemri et al., 2013). Engeland and Steinsland (2014) developed a framework which can estimate post-processing parameters that differ in space and time, but still give a spatially and temporally consistent output. However, their method is computationally complex for our larger number of stations, and cannot directly be regionalized in the way we would like, so we suggest a different path below. The target of our work is to create a mean forecast with uncertainty bounds for a large number of locations in the framework of the European Flood Awareness System (EFAS - http://www.efas.eu). We are therefore more interested in improving the forecast skill for high flows than the forecast skill at lower runoff levels. EFAS uses a combination of ensemble forecasts and deterministic forecasts from different forecasters to force a distributed hydrologic model and to compute runoff ensembles for each river pixel within the model domain. Instead of showing the mean and the variability of each forecast ensemble individually, we will now post-process all model outputs to find a total probability, the post-processed mean, and the uncertainty of all ensembles. The post-processing parameters are first calibrated for each calibration location, while assuring that they have some spatial correlation, by adding a spatial penalty in the calibration process. This can in some cases have a slight negative

  4. Pattern formation, logistics, and maximum path probability

    NASA Astrophysics Data System (ADS)

    Kirkaldy, J. S.

    1985-05-01

    The concept of pattern formation, which to current researchers is a synonym for self-organization, carries the connotation of deductive logic together with the process of spontaneous inference. Defining a pattern as an equivalence relation on a set of thermodynamic objects, we establish that a large class of irreversible pattern-forming systems, evolving along idealized quasisteady paths, approaches the stable steady state as a mapping upon the formal deductive imperatives of a propositional function calculus. In the preamble the classical reversible thermodynamics of composite systems is analyzed as an externally manipulated system of space partitioning and classification based on ideal enclosures and diaphragms. The diaphragms have discrete classification capabilities which are designated in relation to conserved quantities by descriptors such as impervious, diathermal, and adiabatic. Differentiability in the continuum thermodynamic calculus is invoked as equivalent to analyticity and consistency in the underlying class or sentential calculus. The seat of inference, however, rests with the thermodynamicist. In the transition to an irreversible pattern-forming system the defined nature of the composite reservoirs remains, but a given diaphragm is replaced by a pattern-forming system which by its nature is a spontaneously evolving volume partitioner and classifier of invariants. The seat of volition or inference for the classification system is thus transferred from the experimenter or theoretician to the diaphragm, and with it the full deductive facility. The equivalence relations or partitions associated with the emerging patterns may thus be associated with theorems of the natural pattern-forming calculus. The entropy function, together with its derivatives, is the vehicle which relates the logistics of reservoirs and diaphragms to the analog logistics of the continuum. Maximum path probability or second-order differentiability of the entropy in isolation are

  5. Probability Distribution for Flowing Interval Spacing

    SciTech Connect

    S. Kuzio

    2004-09-22

    Fracture spacing is a key hydrologic parameter in analyses of matrix diffusion. Although the individual fractures that transmit flow in the saturated zone (SZ) cannot be identified directly, it is possible to determine the fractured zones that transmit flow from flow meter survey observations. The fractured zones that transmit flow as identified through borehole flow meter surveys have been defined in this report as flowing intervals. The flowing interval spacing is measured between the midpoints of each flowing interval. The determination of flowing interval spacing is important because the flowing interval spacing parameter is a key hydrologic parameter in SZ transport modeling, which impacts the extent of matrix diffusion in the SZ volcanic matrix. The output of this report is input to the ''Saturated Zone Flow and Transport Model Abstraction'' (BSC 2004 [DIRS 170042]). Specifically, the analysis of data and development of a data distribution reported herein is used to develop the uncertainty distribution for the flowing interval spacing parameter for the SZ transport abstraction model. Figure 1-1 shows the relationship of this report to other model reports that also pertain to flow and transport in the SZ. Figure 1-1 also shows the flow of key information among the SZ reports. It should be noted that Figure 1-1 does not contain a complete representation of the data and parameter inputs and outputs of all SZ reports, nor does it show inputs external to this suite of SZ reports. Use of the developed flowing interval spacing probability distribution is subject to the limitations of the assumptions discussed in Sections 5 and 6 of this analysis report. The number of fractures in a flowing interval is not known. Therefore, the flowing intervals are assumed to be composed of one flowing zone in the transport simulations. This analysis may overestimate the flowing interval spacing because the number of fractures that contribute to a flowing interval cannot be

  6. Debris-flow hazard map units from gridded probabilities

    USGS Publications Warehouse

    Campbell, Russell H.; Bernknopf, Richard L.

    1997-01-01

    The common statistical practice of dividing a range of probabilities into equal probability intervals may not result in useful landslide-hazard map units for areas populated by equal-area cells, each of which has a unique probability. Most hazard map areas contain very large numbers of cells having low probability of failure, and as probability increases, the number of cells decreases in a non-linear fashion. Exploration of this distribution suggests that the spatial frequency of expected failures may be used to identify probability intervals that define map units. From a spatial database of gridded probabilities, map units that address the different objectives of land-use planners and emergency response officials can be defined.
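
    One concrete reading of this idea is to choose interval boundaries so that each map unit contains roughly the same number of expected failures (the sum of cell probabilities), rather than an equal width of probability. The sketch below illustrates that reading; it is not necessarily the authors' exact procedure:

        import numpy as np

        def map_units_by_expected_failures(p_grid, n_units=5):
            # p_grid: array of per-cell failure probabilities (equal-area cells)
            p = np.sort(p_grid.ravel())[::-1]      # high-probability cells first
            cum = np.cumsum(p)                     # cumulative expected failures
            targets = np.linspace(0, cum[-1], n_units + 1)[1:-1]
            edges_idx = np.searchsorted(cum, targets)
            return p[edges_idx]                    # probability thresholds between units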

  7. Conditional Probabilities for Large Events Estimated by Small Earthquake Rate

    NASA Astrophysics Data System (ADS)

    Wu, Yi-Hsuan; Chen, Chien-Chih; Li, Hsien-Chi

    2016-01-01

    We examined forecasting quiescence and activation models to obtain the conditional probability that a large earthquake will occur in a specific time period on different scales in Taiwan. The basic idea of the quiescence and activation models is to use earthquakes that have magnitudes larger than the completeness magnitude to compute the expected properties of large earthquakes. We calculated the probability time series for the whole Taiwan region and for three subareas of Taiwan—the western, eastern, and northeastern Taiwan regions—using 40 years of data from the Central Weather Bureau catalog. In the probability time series for the eastern and northeastern Taiwan regions, a high probability value is usually yielded in cluster events such as events with foreshocks and events that all occur in a short time period. In addition to the time series, we produced probability maps by calculating the conditional probability for every grid point at the time just before a large earthquake. The probability maps show that high probability values are yielded around the epicenter before a large earthquake. The receiver operating characteristic (ROC) curves of the probability maps demonstrate that the probability maps are not random forecasts, but also suggest that lowering the magnitude of a forecasted large earthquake may not improve the forecast method itself. From both the probability time series and probability maps, it can be observed that the probability obtained from the quiescence model increases before a large earthquake and the probability obtained from the activation model increases as the large earthquakes occur. The results lead us to conclude that the quiescence model has better forecast potential than the activation model.

  8. Young Star Probably Ejected From Triple System

    NASA Astrophysics Data System (ADS)

    2003-01-01

    Astronomers analyzing nearly 20 years of data from the National Science Foundation's Very Large Array radio telescope have discovered that a small star in a multiple-star system in the constellation Taurus probably has been ejected from the system after a close encounter with one of the system's more-massive components, presumed to be a compact double star. This is the first time any such event has been observed. [Figure: Path of Small Star, 1983-2001] "Our analysis shows a drastic change in the orbit of this young star after it made a close approach to another object in the system," said Luis Rodriguez of the Institute of Astronomy of the National Autonomous University of Mexico (UNAM). "The young star was accelerated to a large velocity by the close approach, and certainly now is in a very different, more remote orbit, and may even completely escape its companions," said Laurent Loinard, leader of the research team that also included Monica Rodriguez in addition to Luis Rodriguez. The UNAM astronomers presented their findings at the American Astronomical Society's meeting in Seattle, WA. The discovery of this chaotic event will be important for advancing our understanding of classical dynamic astronomy and of how stars evolve, including possibly providing an explanation for the production of the mysterious "brown dwarfs," the astronomers said. The scientists analyzed VLA observations of T Tauri, a multiple system of young stars some 450 light-years from Earth. The observations were made from 1983 to 2001. The T Tauri system includes a "Northern" star, the famous star that gives its name to the class of young visible stars, and a "Southern" system of stars, all orbiting each other. The VLA data were used to track the orbit of the smaller Southern star around the larger Southern object, presumed to be a pair of stars orbiting each other closely. The astronomers' plot of the smaller star's orbit shows that it followed an apparently elliptical orbit around its twin companions

  9. A physical-space approach for the probability hypothesis density and cardinalized probability hypothesis density filters

    NASA Astrophysics Data System (ADS)

    Erdinc, Ozgur; Willett, Peter; Bar-Shalom, Yaakov

    2006-05-01

    The probability hypothesis density (PHD) filter, an automatically track-managed multi-target tracker, is attracting increasing but cautious attention. Its derivation is elegant and mathematical, and thus of course many engineers fear it; perhaps that is currently limiting the number of researchers working on the subject. In this paper, we explore a physical-space approach - a bin model - which leads us to arrive at the same filter equations as the PHD. Unlike the original derivation of the PHD filter, the concepts used are the familiar ones of conditional probability. The original PHD suffers from a "target-death" problem in which even a single missed detection can lead to the apparent disappearance of a target. To obviate this, PHD originator Mahler has recently developed a new "cardinalized" version of PHD (CPHD). We are able to extend our physical-space derivation to the CPHD case as well. We stress that the original derivations are mathematically correct, and need no embellishment from us; our contribution here is to offer an alternative derivation, one that we find appealing.

  10. Learning to Learn Cooperatively

    ERIC Educational Resources Information Center

    Byrd, Anne Hammond

    2009-01-01

    Cooperative learning, put quite simply, is a type of instruction whereby students work together in small groups to achieve a common goal. Cooperative learning has become increasingly popular as a feature of Communicative Language Teaching (CLT) with benefits that include increased student interest due to the quick pace of cooperative tasks,…

  11. Learning about Learning

    ERIC Educational Resources Information Center

    Siegler, Robert S.

    2004-01-01

    The field of children's learning was thriving when the Merrill-Palmer Quarterly was launched; the field later went into eclipse and now is in the midst of a resurgence. This commentary examines reasons for these trends, and describes the emerging field of children's learning. In particular, the new field is seen as differing from the old in its…

  12. Predicting Robust Learning with the Visual Form of the Moment-by-Moment Learning Curve

    ERIC Educational Resources Information Center

    Baker, Ryan S.; Hershkovitz, Arnon; Rossi, Lisa M.; Goldstein, Adam B.; Gowda, Sujith M.

    2013-01-01

    We present a new method for analyzing a student's learning over time for a specific skill: analysis of the graph of the student's moment-by-moment learning over time. Moment-by-moment learning is calculated using a data-mined model that assesses the probability that a student learned a skill or concept at a specific time during learning…
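
    A minimal sketch of a moment-by-moment learning estimate under standard Bayesian Knowledge Tracing assumptions; the parameter values and response sequence below are invented, and the paper's own data-mined model is more elaborate than this:

        # Bayesian Knowledge Tracing-style estimate of P(student just learned).
        # Illustrative parameters (NOT fitted values from the paper).
        P_L0, P_T, P_G, P_S = 0.2, 0.1, 0.25, 0.1  # init, transit, guess, slip

        def posterior_known(p_known, correct):
            """P(skill known | response), via Bayes' rule with slip/guess."""
            if correct:
                num = p_known * (1 - P_S)
                den = num + (1 - p_known) * P_G
            else:
                num = p_known * P_S
                den = num + (1 - p_known) * (1 - P_G)
            return num / den

        p_known = P_L0
        for step, correct in enumerate([0, 0, 1, 1, 1], start=1):
            post = posterior_known(p_known, bool(correct))
            p_just_learned = (1 - post) * P_T   # learned at this very step
            p_known = post + p_just_learned     # standard BKT update
            print(f"step {step}: P(known)={p_known:.3f}, P(just learned)={p_just_learned:.3f}")

    Plotting the per-step values over time yields the kind of moment-by-moment learning curve whose visual form the paper analyzes.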

  13. ERP Correlates of Verbal and Numerical Probabilities in Risky Choices: A Two-Stage Probability Processing View

    PubMed Central

    Li, Shu; Du, Xue-Lei; Li, Qi; Xuan, Yan-Hua; Wang, Yun; Rao, Li-Lin

    2016-01-01

    Two kinds of probability expressions, verbal and numerical, have been used to characterize the uncertainty that people face. However, the question of whether verbal and numerical probabilities are cognitively processed in a similar manner remains unresolved. From a levels-of-processing perspective, verbal and numerical probabilities may be processed differently during early sensory processing but similarly in later semantic-associated operations. This event-related potential (ERP) study investigated the neural processing of verbal and numerical probabilities in risky choices. The results showed that verbal probability and numerical probability elicited different N1 amplitudes but that verbal and numerical probabilities elicited similar N2 and P3 waveforms in response to different levels of probability (high to low). These results were consistent with a levels-of-processing framework and suggest some internal consistency between the cognitive processing of verbal and numerical probabilities in risky choices. Our findings shed light on the possible mechanisms underlying probability expression and may provide neural evidence to support the translation of verbal to numerical probabilities (or vice versa). PMID:26834612

  14. Learning to aid learning.

    PubMed

    Richards, Jacqui

    2016-01-01

    The National Health Service (NHS) is one of the largest employers in the world and, with 1.3 million staff, the biggest employer in Europe. With over three hundred different careers on offer (NHS 2015), the acquisition of skills and qualifications, through academic and clinical training, is an integral part of day-to-day life in the health service. As such, mentoring has become a significant feature in the preparation of healthcare professionals, to support students and ensure learning needs and experiences are appropriate to competency. This article examines the mentor's role in relation to a teaching innovation designed to address students' identified learning needs and to meet the requirements of the multi-professional learning and assessment in practice course NM6156. The effectiveness of the aids to learning will be assessed through an online quiz, and its usefulness will be analysed with reference to educational theories of learning and development. PMID:26975128

  15. The Effect of Conditional Probability of Chord Progression on Brain Response: An MEG Study

    PubMed Central

    Kim, Seung-Goo; Kim, June Sic; Chung, Chun Kee

    2011-01-01

    Background Recent electrophysiological and neuroimaging studies have explored how and where musical syntax in Western music is processed in the human brain. An inappropriate chord progression elicits an event-related potential (ERP) component called an early right anterior negativity (ERAN) or simply an early anterior negativity (EAN) in an early stage of processing the musical syntax. Though the possible underlying mechanism of the EAN is assumed to be probabilistic learning, the effect of the probability of chord progressions on the EAN response has not been previously explored explicitly. Methodology/Principal Findings In the present study, the empirical conditional probabilities in a Western music corpus were employed as an approximation of the frequencies in previous exposure of participants. Three types of chord progression were presented to musicians and non-musicians in order to examine the correlation between the probability of chord progression and the neuromagnetic response using magnetoencephalography (MEG). Chord progressions were found to elicit early responses in a negatively correlating fashion with the conditional probability. Observed EANm (as a magnetic counterpart of the EAN component) responses were consistent with the previously reported EAN responses in terms of latency and location. The effect of conditional probability interacted with the effect of musical training. In addition, the neural response also correlated with the behavioral measures in the non-musicians. Conclusions/Significance Our study is the first to reveal the correlation between the probability of chord progression and the corresponding neuromagnetic response. The current results suggest that the physiological response is a reflection of the probabilistic representations of the musical syntax. Moreover, the results indicate that the probabilistic representation is related to the musical training as well as the sensitivity of an individual. PMID:21364895
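
    To make the corpus statistic concrete, a minimal sketch of estimating empirical conditional probabilities of chord progressions from bigram counts; the tiny corpus of Roman-numeral chord sequences is an invented stand-in for the Western music corpus used in the study:

        from collections import Counter

        corpus = [["I", "IV", "V", "I"], ["I", "V", "vi", "IV"], ["ii", "V", "I"]]

        bigrams, contexts = Counter(), Counter()
        for seq in corpus:
            for prev, nxt in zip(seq, seq[1:]):
                bigrams[(prev, nxt)] += 1
                contexts[prev] += 1          # count of prev as a (non-final) context

        def cond_prob(prev, nxt):
            """Empirical P(next chord | previous chord)."""
            return bigrams[(prev, nxt)] / contexts[prev] if contexts[prev] else 0.0

        print(cond_prob("V", "I"))    # frequent progression -> high probability (0.67)
        print(cond_prob("V", "IV"))   # unseen progression   -> probability 0.0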

  16. Exploring non-signalling polytopes with negative probability

    NASA Astrophysics Data System (ADS)

    Oas, G.; Acacio de Barros, J.; Carvalhaes, C.

    2014-12-01

    Bipartite and tripartite EPR-Bell type systems are examined via joint quasi-probability distributions where probabilities are permitted to be negative. It is shown that such distributions exist only when the no-signalling condition is satisfied. A characteristic measure, the probability mass, is introduced and, via its minimization, limits the number of quasi-distributions describing a given marginal probability distribution. The minimized probability mass is shown to be an alternative way to characterize non-local systems. Non-signalling polytopes for two to eight settings in the bipartite scenario are examined and compared to prior work. Examining perfect cloning of non-local systems within the tripartite scenario suggests defining two categories of signalling. It is seen that many properties of non-local systems can be efficiently described by quasi-probability theory.

  17. Path probability of stochastic motion: A functional approach

    NASA Astrophysics Data System (ADS)

    Hattori, Masayuki; Abe, Sumiyoshi

    2016-06-01

    The path probability of a particle undergoing stochastic motion is studied by the use of a functional technique, and a general formula is derived for the path-probability distribution functional. The probability of finding paths inside a tube/band, the center of which is stipulated by a given path, is analytically evaluated in a way analogous to continuous measurements in quantum mechanics. The formalism developed here is then applied to the stochastic dynamics of stock prices in finance.

  18. Posterior Probability Matching and Human Perceptual Decision Making

    PubMed Central

    Murray, Richard F.; Patel, Khushbu; Yee, Alan

    2015-01-01

    Probability matching is a classic theory of decision making that was first developed in models of cognition. Posterior probability matching, a variant in which observers match their response probabilities to the posterior probability of each response being correct, is being used increasingly often in models of perception. However, little is known about whether posterior probability matching is consistent with the vast literature on vision and hearing that has developed within signal detection theory. Here we test posterior probability matching models using two tools from detection theory. First, we examine the models’ performance in a two-pass experiment, where each block of trials is presented twice, and we measure the proportion of times that the model gives the same response twice to repeated stimuli. We show that at low performance levels, posterior probability matching models give highly inconsistent responses across repeated presentations of identical trials. We find that practised human observers are more consistent across repeated trials than these models predict, and we find some evidence that less practised observers are more consistent as well. Second, we compare the performance of posterior probability matching models on a discrimination task to the performance of a theoretical ideal observer that achieves the best possible performance. We find that posterior probability matching is very inefficient at low-to-moderate performance levels, and that human observers can be more efficient than is ever possible according to posterior probability matching models. These findings support classic signal detection models, and rule out a broad class of posterior probability matching models for expert performance on perceptual tasks that range in complexity from contrast discrimination to symmetry detection. However, our findings leave open the possibility that inexperienced observers may show posterior probability matching behaviour, and our methods provide new tools
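
    A minimal simulation sketch of the two-pass consistency logic, comparing a posterior-probability-matching observer with a deterministic maximum-a-posteriori observer; the sensitivity and noise values are assumed, not taken from the experiments:

        import math, random
        random.seed(1)

        D_PRIME = 0.5                       # low sensitivity (assumed)
        SIGMA_EXT, SIGMA_INT = 1.0, 0.5     # repeated external vs fresh internal noise
        N = 20000

        def posterior(x):
            """P(signal | x) for equal priors and Gaussian noise."""
            var = SIGMA_EXT**2 + SIGMA_INT**2
            lr = math.exp(D_PRIME * x / var - D_PRIME**2 / (2 * var))
            return lr / (1 + lr)

        def two_pass_agrees(matching):
            signal = random.random() < 0.5
            ext = (D_PRIME if signal else 0.0) + random.gauss(0.0, SIGMA_EXT)
            resp = []
            for _ in range(2):              # identical stimulus, fresh internal noise
                p = posterior(ext + random.gauss(0.0, SIGMA_INT))
                resp.append(random.random() < p if matching else p > 0.5)
            return resp[0] == resp[1]

        for m in (True, False):
            rate = sum(two_pass_agrees(m) for _ in range(N)) / N
            print("matching" if m else "MAP", "consistency:", round(rate, 3))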

  19. A SNP discovery method to assess variant allele probability from next-generation resequencing data

    PubMed Central

    Shen, Yufeng; Wan, Zhengzheng; Coarfa, Cristian; Drabek, Rafal; Chen, Lei; Ostrowski, Elizabeth A.; Liu, Yue; Weinstock, George M.; Wheeler, David A.; Gibbs, Richard A.; Yu, Fuli

    2010-01-01

    Accurate identification of genetic variants from next-generation sequencing (NGS) data is essential for immediate large-scale genomic endeavors such as the 1000 Genomes Project, and is crucial for further genetic analysis based on the discoveries. The key challenge in single nucleotide polymorphism (SNP) discovery is to distinguish true individual variants (occurring at a low frequency) from sequencing errors (often occurring at frequencies orders of magnitude higher). Therefore, knowledge of the error probabilities of base calls is essential. We have developed Atlas-SNP2, a computational tool that detects and accounts for systematic sequencing errors caused by context-related variables in a logistic regression model learned from training data sets. Subsequently, it estimates the posterior error probability for each substitution through a Bayesian formula that integrates prior knowledge of the overall sequencing error probability and the estimated SNP rate with the results from the logistic regression model for the given substitutions. The estimated posterior SNP probability can be used to distinguish true SNPs from sequencing errors. Validation results show that Atlas-SNP2 achieves a false-positive rate of lower than 10%, with an ∼5% or lower false-negative rate. PMID:20019143
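
    A minimal sketch of the Bayesian step described above, taking the per-substitution error probability as given (in Atlas-SNP2 it comes from the trained logistic regression); the prior and error values are illustrative only:

        def posterior_snp(p_err, prior_snp=1e-3):
            """P(true SNP | substitution observed), assuming a true SNP always
            shows the substitution while an error produces it with prob p_err."""
            num = prior_snp
            den = prior_snp + (1.0 - prior_snp) * p_err
            return num / den

        print(round(posterior_snp(1e-4), 3))   # low error prob  -> likely real (~0.91)
        print(round(posterior_snp(1e-1), 3))   # high error prob -> likely artifact (~0.01)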

  20. Learning about Sounds Contributes to Learning about Words: Effects of Prosody and Phonotactics on Infant Word Learning

    ERIC Educational Resources Information Center

    Estes, Katharine Graf; Bowen, Sara

    2013-01-01

    This research investigates how early learning about native language sound structure affects how infants associate sounds with meanings during word learning. Infants (19-month-olds) were presented with bisyllabic labels with high or low phonotactic probability (i.e., sequences of frequent or infrequent phonemes in English). The labels were produced…

  1. Anytime synthetic projection: Maximizing the probability of goal satisfaction

    NASA Technical Reports Server (NTRS)

    Drummond, Mark; Bresina, John L.

    1990-01-01

    A projection algorithm is presented for incremental control rule synthesis. The algorithm synthesizes an initial set of goal achieving control rules using a combination of situation probability and estimated remaining work as a search heuristic. This set of control rules has a certain probability of satisfying the given goal. The probability is incrementally increased by synthesizing additional control rules to handle 'error' situations the execution system is likely to encounter when following the initial control rules. By using situation probabilities, the algorithm achieves a computationally effective balance between the limited robustness of triangle tables and the absolute robustness of universal plans.

  2. Weibull probability graph paper: a call for standardization

    NASA Astrophysics Data System (ADS)

    Kane, Martin D.

    2001-04-01

    Weibull analysis of tensile strength data is routinely performed to determine the quality of optical fiber. A typical Weibull analysis includes setting up an experiment, testing the samples, plotting and interpreting the data, and performing a statistical analysis. One typical plot that is often included in the analysis is the Weibull probability plot in which the data are plotted as points on a special type of graph paper known as Weibull probability paper. If the data belong to a Weibull probability density function, they will fall approximately on a straight line. A search of the literature reveals that many Weibull analyses have been performed on optical fiber, but the associated Weibull probability plots have been drawn incorrectly. In some instances the plots have been shown with the ordinate (Probability) starting from 0% and ending at 100%. This has no physical meaning because the Weibull probability density function is a continuous distribution and is inherently not bounded. This paper will discuss the Weibull probability density function, the proper construction of Weibull probability graph paper, and interpretation of data through analysis of the associated probability plot.
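
    A minimal sketch of the linearization behind properly constructed Weibull paper, using Bernard's median-rank plotting positions; the strength values are invented:

        import math

        data = sorted([4.8, 5.1, 5.3, 5.6, 5.9, 6.0, 6.2])   # tensile strengths (illustrative)
        n = len(data)

        # Weibull CDF: F = 1 - exp(-(x/eta)**beta), so
        # ln(-ln(1 - F)) = beta*ln(x) - beta*ln(eta): a straight line on proper
        # Weibull paper (whose ordinate is unbounded, never a 0%..100% scale).
        points = []
        for i, x in enumerate(data, start=1):
            F = (i - 0.3) / (n + 0.4)            # Bernard's median-rank estimate
            points.append((math.log(x), math.log(-math.log(1.0 - F))))

        # Least-squares slope of the plotted points estimates the Weibull modulus.
        mx = sum(px for px, _ in points) / n
        my = sum(py for _, py in points) / n
        beta = (sum((px - mx) * (py - my) for px, py in points)
                / sum((px - mx) ** 2 for px, _ in points))
        print(f"estimated Weibull modulus: {beta:.1f}")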

  3. On the Role of Prior Probability in Adiabatic Quantum Algorithms

    NASA Astrophysics Data System (ADS)

    Sun, Jie; Lu, Songfeng; Yang, Liping

    2016-03-01

    In this paper, we study the role of prior probability in the efficiency of the quantum local adiabatic search algorithm. We find the following aspects of prior probability: firstly, only the probabilities of the marked states affect the running time of the adiabatic evolution; secondly, the prior probability can be used to improve the efficiency of the adiabatic algorithm; thirdly, as in the usual quantum adiabatic evolution, when the number of marked elements is much smaller than the size of the assigned set that contains them, the running time can be significantly longer than when the assigned set contains only the marked states.

  4. Oscillations in probability distributions for stochastic gene expression

    SciTech Connect

    Petrosyan, K. G. Hu, Chin-Kun

    2014-05-28

    The phenomenon of oscillations in probability distribution functions of the number of components is found for a model of stochastic gene expression. It takes place in cases of low levels of molecules or strong intracellular noise. The oscillations distinguish between more probable even and less probable odd numbers of particles. The even-odd symmetry is restored as the number of molecules increases, with the probability distribution function tending to a Poisson distribution. We discuss the possibility of observation of the phenomenon in gene, protein, and mRNA expression experiments.

  5. Code System to Calculate Pressure Vessel Failure Probabilities.

    Energy Science and Technology Software Center (ESTSC)

    2001-03-27

    Version 00 OCTAVIA (Operationally Caused Transients And Vessel Integrity Analysis) calculates the probability of pressure vessel failure from operationally-caused pressure transients which can occur in a pressurized water reactor (PWR). For specified vessel and operating environment characteristics the program computes the failure pressure at which the vessel will fail for different-sized flaws existing in the beltline and the probability of vessel failure per reactor year due to the flaw. The probabilities are summed over the various flaw sizes to obtain the total vessel failure probability. Sensitivity studies can be performed to investigate different vessel or operating characteristics in the same computer run.

  6. Category dimensionality and feature knowledge: When more features are learned as easily as fewer

    PubMed Central

    Hoffman, Aaron B.; Murphy, Gregory L.

    2006-01-01

    Three experiments compared the learning of lower-dimensional family-resemblance categories (four dimensions) to the learning of higher-dimensional ones (eight dimensions). Category learning models incorporating error-driven learning, hypothesis-testing, or limited capacity attention predict that additional dimensions should either increase learning difficulty or decrease learning of individual features. Contrary to these predictions, the experiments showed no slower learning of high-dimensional categories, while subjects learning high-dimensional categories learned more features than those learning low-dimensional categories. This result obtained both in standard learning with feedback and in noncontingent, observational learning. Our results show that, rather than interfering with learning, categories with more dimensions cause subjects to learn more. We contrast the learning of family-resemblance categories with learning in classical conditioning and probability learning paradigms, in which competition among features is well documented. PMID:16569148

  7. Individual Values, Learning Routines and Academic Procrastination

    ERIC Educational Resources Information Center

    Dietz, Franziska; Hofer, Manfred; Fries, Stefan

    2007-01-01

    Background: Academic procrastination, the tendency to postpone learning activities, is regarded as a consequence of postmodern values that are prominent in post-industrialized societies. When students strive for leisure goals and have no structured routines for academic tasks, delaying strenuous learning activities becomes probable. Aims: The…

  8. Student Understanding of Probability and Introductory Statistical Physics in Upper-division Courses on Thermal Physics

    NASA Astrophysics Data System (ADS)

    Loverude, Michael E.

    2006-12-01

    This talk describes part of an ongoing investigation of student learning in the context of upper-division courses in thermal physics. In particular, we will examine student understanding of the fundamental concepts of statistical physics, and the underlying mathematics of probability. Our results suggest that students lack a deep understanding of the statistics of binary systems like coin flips, calling into question their ability to apply these results to simple physical systems. We will provide examples of student responses and written explanations and discuss implications for instruction.
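
    For reference, the binary statistics at issue reduce to binomial coin-flip probabilities; a minimal sketch (the numbers are just an example):

        from math import comb

        def p_heads(k, n, p=0.5):
            """Binomial probability of exactly k heads in n coin flips."""
            return comb(n, k) * p**k * (1 - p)**(n - k)

        # Even the single most probable outcome of 100 flips is rare (~8%),
        # while outcomes near it dominate: the core idea behind the
        # statistics of binary physical systems.
        print(p_heads(50, 100))                              # ~0.0796
        print(sum(p_heads(k, 100) for k in range(45, 56)))   # ~0.73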

  9. Situated Learning in Young Romanian Roma Successful Learning Biographies

    ERIC Educational Resources Information Center

    Nistor, Nicolae; Stanciu, Dorin; Vanea, Cornelia; Sasu, Virginia Maria; Dragota, Maria

    2014-01-01

    European Roma are often associated with social problems and conflicts due to poverty and low formal education. Nevertheless, Roma communities traditionally develop expertise in ethnically specific domains, probably through alternative, informal means such as situated learning in communities of practice. Although predictable, empirical evidence of…

  10. Learning Disabilities

    MedlinePlus

    What are Learning Disabilities? Learning disabilities are disorders that affect the ability ...

  11. Teachers Learning How to Learn

    ERIC Educational Resources Information Center

    James, Mary; McCormick, Robert

    2009-01-01

    School pupils' learning how to learn (LHTL), aimed at helping them develop learning autonomy, requires teachers to develop new classroom practices. Hence teachers' LHTL is equally important. The TLRP "Learning How to Learn in Classrooms, Schools and Networks" project researched how practices were developed by teachers in 40 primary and secondary…

  12. Quantum particles from coarse grained classical probabilities in phase space

    SciTech Connect

    Wetterich, C.

    2010-07-15

    Quantum particles can be obtained from a classical probability distribution in phase space by a suitable coarse graining, whereby simultaneous classical information about position and momentum can be lost. For a suitable time evolution of the classical probabilities and choice of observables all features of a quantum particle in a potential follow from classical statistics. This includes interference, tunneling and the uncertainty relation.

  13. 14 CFR 417.224 - Probability of failure analysis.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    14 CFR § 417.224, Probability of failure analysis. Aeronautics and Space; COMMERCIAL SPACE TRANSPORTATION, FEDERAL AVIATION ADMINISTRATION, DEPARTMENT OF TRANSPORTATION; LICENSING; LAUNCH SAFETY; Flight Safety Analysis. (a) General. All flight safety analyses for a launch, regardless of hazard or phase of...

  14. Preservice Elementary Teachers and the Fundamentals of Probability

    ERIC Educational Resources Information Center

    Dollard, Clark

    2011-01-01

    This study examined how preservice elementary teachers think about situations involving probability. Twenty-four preservice elementary teachers who had not yet studied probability as part of their preservice elementary mathematics coursework were interviewed using a task-based interview. The participants' responses showed a wide variety of…

  15. The case for lower probabilities as measures of uncertainty

    SciTech Connect

    Tonn, B. ); Wagner, C. . Dept. of Mathematics)

    1991-01-01

    This paper presents the case for using lower probabilities as measures of uncertainty in expert systems. A debate has raged within the artificial intelligence community for years about how to represent uncertainty in expert systems. Several camps have emerged. One camp has focused on developing alternatives to probability theory, such as certainty factors, fuzzy sets, and endorsements. A second camp has focused on retrofitting classical, additive probability, for example, by developing a cautious approach to probabilistic reasoning and interpreting probability within a possible worlds framework. This paper falls into a third camp, which encompasses generalizations of probability theory. The most discussed generalization is known as Dempster-Shafer Theory, which is based on the combined work of Dempster and Shafer. Lower probabilities are actually a substantial generalization of DST. This paper has two parts. The first presents the definitions of lower probabilities, DST, and additive probability. This section includes a discussion of capacities, the most general type of uncertainty measure. The purpose of this section is to show the differences among the uncertainty measures.
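
    A minimal Dempster-Shafer example of the lower/upper-probability idea, with an invented mass assignment; belief sums the mass committed to subsets of an event, plausibility the mass merely consistent with it:

        frame = {"a", "b", "c"}
        # Basic mass assignment over subsets of the frame (illustrative numbers);
        # mass on the whole frame represents unresolved ignorance.
        mass = {frozenset({"a"}): 0.4,
                frozenset({"b", "c"}): 0.3,
                frozenset(frame): 0.3}

        def belief(event):
            """Lower probability: total mass committed to subsets of the event."""
            return sum(m for s, m in mass.items() if s <= event)

        def plausibility(event):
            """Upper probability: total mass whose focal set intersects the event."""
            return sum(m for s, m in mass.items() if s & event)

        e = frozenset({"a"})
        print(belief(e), plausibility(e))   # 0.4 0.7: the probability of "a" is bracketed

    Additive probability is recovered when all mass sits on singletons, in which case belief and plausibility coincide.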

  16. Use of External Visual Representations in Probability Problem Solving

    ERIC Educational Resources Information Center

    Corter, James E.; Zahner, Doris C.

    2007-01-01

    We investigate the use of external visual representations in probability problem solving. Twenty-six students enrolled in an introductory statistics course for social sciences graduate students (post-baccalaureate) solved eight probability problems in a structured interview format. Results show that students spontaneously use self-generated…

  17. Misconceptions in Rational Numbers, Probability, Algebra, and Geometry

    ERIC Educational Resources Information Center

    Rakes, Christopher R.

    2010-01-01

    In this study, the author examined the relationship of probability misconceptions to algebra, geometry, and rational number misconceptions and investigated the potential of probability instruction as an intervention to address misconceptions in all 4 content areas. Through a review of literature, 5 fundamental concepts were identified that, if…

  18. The Probability Approach to English If-Conditional Sentences

    ERIC Educational Resources Information Center

    Wu, Mei

    2012-01-01

    Users of the Probability Approach choose the right one from four basic types of conditional sentences--factual, predictive, hypothetical and counterfactual conditionals--by judging how likely it is (i.e., the probability) that the event in the result-clause will take place when the condition in the if-clause is met. Thirty-three students from the experimental…

  19. Public Attitudes toward Stuttering in Turkey: Probability versus Convenience Sampling

    ERIC Educational Resources Information Center

    Ozdemir, R. Sertan; St. Louis, Kenneth O.; Topbas, Seyhun

    2011-01-01

    Purpose: A Turkish translation of the "Public Opinion Survey of Human Attributes-Stuttering" ("POSHA-S") was used to compare probability versus convenience sampling to measure public attitudes toward stuttering. Method: A convenience sample of adults in Eskisehir, Turkey was compared with two replicates of a school-based, probability cluster…

  20. Prizes in Cereal Boxes: An Application of Probability.

    ERIC Educational Resources Information Center

    Litwiller, Bonnie H.; Duncan, David R.

    1992-01-01

    Presents four cases of real-world probabilistic situations to promote more effective teaching of probability. Calculates the probability of obtaining six of six different prizes successively in six, seven, eight, and nine boxes of cereal, generalizes the problem to n boxes of cereal, and offers suggestions to extend the problem. (MDH)
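
    The underlying computation is the coupon-collector probability, obtainable by inclusion-exclusion; a minimal sketch following the article's six-prize setup:

        from math import comb

        def p_all_prizes(n_boxes, prizes=6):
            """P(all distinct prizes appear among n_boxes), by inclusion-exclusion."""
            return sum((-1) ** j * comb(prizes, j) * ((prizes - j) / prizes) ** n_boxes
                       for j in range(prizes + 1))

        for n in (6, 7, 8, 9):
            print(n, round(p_all_prizes(n), 4))
        # 6 boxes gives 6!/6**6 ~ 0.0154; the chance climbs only slowly with more boxes.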

  1. Fisher classifier and its probability of error estimation

    NASA Technical Reports Server (NTRS)

    Chittineni, C. B.

    1979-01-01

    Computationally efficient expressions are derived for estimating the probability of error using the leave-one-out method. The optimal threshold for the classification of patterns projected onto Fisher's direction is derived. A simple generalization of the Fisher classifier to multiple classes is presented. Computational expressions are developed for estimating the probability of error of the multiclass Fisher classifier.
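
    A brute-force sketch of the setting on synthetic data: a two-class Fisher discriminant whose leave-one-out error is estimated by refitting n times; the paper's contribution is precisely the efficient closed-form expressions that avoid this refitting, which are not reproduced here:

        import numpy as np

        rng = np.random.default_rng(0)
        X = np.vstack([rng.normal([0, 0], 1.0, size=(30, 2)),    # class 0 (synthetic)
                       rng.normal([2, 1], 1.0, size=(30, 2))])   # class 1 (synthetic)
        y = np.array([0] * 30 + [1] * 30)

        def fisher_fit(X, y):
            m0, m1 = X[y == 0].mean(0), X[y == 1].mean(0)
            # Pooled within-class scatter matrix
            Sw = (np.cov(X[y == 0].T) * (sum(y == 0) - 1)
                  + np.cov(X[y == 1].T) * (sum(y == 1) - 1))
            w = np.linalg.solve(Sw, m1 - m0)    # Fisher direction
            thresh = w @ (m0 + m1) / 2          # simple midpoint threshold
            return w, thresh

        errors = 0
        for i in range(len(y)):                 # leave-one-out: refit without sample i
            keep = np.arange(len(y)) != i
            w, t = fisher_fit(X[keep], y[keep])
            errors += int((X[i] @ w > t) != y[i])
        print("LOO error estimate:", errors / len(y))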

  2. Multiple-event probability in general-relativistic quantum mechanics

    SciTech Connect

    Hellmann, Frank; Mondragon, Mauricio; Perez, Alejandro; Rovelli, Carlo

    2007-04-15

    We discuss the definition of quantum probability in the context of 'timeless' general-relativistic quantum mechanics. In particular, we study the probability of sequences of events, or multievent probability. In conventional quantum mechanics this can be obtained by means of the 'wave function collapse' algorithm. We first point out certain difficulties of some natural definitions of multievent probability, including the conditional probability widely considered in the literature. We then observe that multievent probability can be reduced to single-event probability, by taking into account the quantum nature of the measuring apparatus. In fact, by exploiting the von Neumann freedom of moving the quantum/classical boundary, one can always trade a sequence of noncommuting quantum measurements at different times for an ensemble of simultaneous commuting measurements on the joint system + apparatus system. This observation permits a formulation of quantum theory based only on single-event probability, where the results of the wave function collapse algorithm can nevertheless be recovered. The discussion also bears on the nature of the quantum collapse.

  3. A Quantum Theoretical Explanation for Probability Judgment Errors

    ERIC Educational Resources Information Center

    Busemeyer, Jerome R.; Pothos, Emmanuel M.; Franco, Riccardo; Trueblood, Jennifer S.

    2011-01-01

    A quantum probability model is introduced and used to explain human probability judgment errors including the conjunction and disjunction fallacies, averaging effects, unpacking effects, and order effects on inference. On the one hand, quantum theory is similar to other categorization and memory models of cognition in that it relies on vector…

  4. Probabilities of Natural Events Occurring at Savannah River Plant

    SciTech Connect

    Huang, J.C.

    2001-07-17

    This report documents the comprehensive evaluation of probability models of natural events which are applicable to the Savannah River Plant. The probability curves selected for these natural events are recommended for use by all SRP/SRL safety analysts. This will ensure consistency in analysis methodology for postulated SAR incidents involving natural phenomena.

  5. 14 CFR 417.224 - Probability of failure analysis.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    14 CFR § 417.224, Probability of failure analysis. Aeronautics and Space; COMMERCIAL SPACE TRANSPORTATION, FEDERAL AVIATION ADMINISTRATION, DEPARTMENT OF TRANSPORTATION; LICENSING; LAUNCH SAFETY; Flight Safety Analysis. (a) General. All...

  6. On the Provenance of Judgments of Conditional Probability

    ERIC Educational Resources Information Center

    Zhao, Jiaying; Shah, Anuj; Osherson, Daniel

    2009-01-01

    In standard treatments of probability, Pr(A|B) is defined as the ratio of Pr(A∩B) to Pr(B), provided that Pr(B) > 0. This account of conditional probability suggests a psychological question, namely, whether estimates of Pr(A|B) arise in the mind via implicit calculation of…

  7. Probability Theory, Not the Very Guide of Life

    ERIC Educational Resources Information Center

    Juslin, Peter; Nilsson, Hakan; Winman, Anders

    2009-01-01

    Probability theory has long been taken as the self-evident norm against which to evaluate inductive reasoning, and classical demonstrations of violations of this norm include the conjunction error and base-rate neglect. Many of these phenomena require multiplicative probability integration, whereas people seem more inclined to linear additive…

  8. A Probability Model of Accuracy in Deception Detection Experiments.

    ERIC Educational Resources Information Center

    Park, Hee Sun; Levine, Timothy R.

    2001-01-01

    Extends the recent work on the veracity effect in deception detection. Explains the probabilistic nature of a receiver's accuracy in detecting deception and analyzes a receiver's detection of deception in terms of set theory and conditional probability. Finds that accuracy is shown to be a function of the relevant conditional probability and the…

  9. The Influence of Phonotactic Probability on Word Recognition in Toddlers

    ERIC Educational Resources Information Center

    MacRoy-Higgins, Michelle; Shafer, Valerie L.; Schwartz, Richard G.; Marton, Klara

    2014-01-01

    This study examined the influence of phonotactic probability on word recognition in English-speaking toddlers. Typically developing toddlers completed a preferential looking paradigm using familiar words, which consisted of either high or low phonotactic probability sound sequences. The participants' looking behavior was recorded in response…

  10. A New Way to Evaluate the Probability and Fresnel Integrals

    ERIC Educational Resources Information Center

    Khalili, Parviz

    2007-01-01

    In this article, we show how the "Laplace Transform" may be used to evaluate a variety of nontrivial improper integrals, including the "Probability" and "Fresnel" integrals. The algorithm we have developed here to evaluate "Probability," "Fresnel," and other similar integrals seems to be new. This method transforms the evaluation of certain improper integrals…

  11. Approach to learning disability.

    PubMed

    Kulkarni, M; Kalantre, S; Upadhye, S; Karande, S; Ahuja, S

    2001-06-01

    Learning disabilities (LD) are one of the important causes of poor academic performance in school-going children. Learning disabilities are developmental disorders that usually manifest during the period of normal education. These disabilities create a significant gap between the true potential and the day-to-day performance of an individual. Dyslexia, dysgraphia and dyscalculia denote problems related to reading, writing and mathematics, respectively. Perinatal problems and certain neurological conditions are known to be associated with LD; however, genetic predisposition seems to be the most probable etiological factor. Evaluation of a child suspected of having LD consists of a medical examination, vision and hearing tests, and analysis of school performance. Psycho-behavioural assessment and educational testing are essential in the process of diagnosis. Persons experienced in the field of LD should interpret the results of such tests. With an Individualized Remedial Education Plan (IEP), most children learn to cope with the disability and may get integrated into the regular stream. PMID:11450386

  12. On the decode error probability for Reed-Solomon codes

    NASA Technical Reports Server (NTRS)

    Mceliece, R. J.; Swanson, L.

    1986-01-01

    Upper bounds on the decoder error probability for Reed-Solomon codes are derived. By definition, decoder error occurs when the decoder finds a codeword other than the transmitted codeword; this is in contrast to decoder failure, which occurs when the decoder fails to find any codeword at all. The results imply, for example, that for a t-error-correcting Reed-Solomon code of length q - 1 over GF(q), if more than t errors occur, the probability of decoder error is less than 1/t!. In particular, for the Voyager Reed-Solomon code, the probability of decoder error given a word error is smaller than 3 x 10 to the minus 14th power. Thus, in a typical operating region with probability 10 to the minus 5th power of word error, the probability of undetected word error is about 3 x 10 to the minus 19th power.

  13. On the decoder error probability for Reed-Solomon codes

    NASA Technical Reports Server (NTRS)

    Mceliece, Robert J.; Swanson, Laif

    1986-01-01

    Upper bounds on the decoder error probability for Reed-Solomon codes are derived. By definition, decoder error occurs when the decoder finds a codeword other than the transmitted codeword; this is in contrast to decoder failure, which occurs when the decoder fails to find any codeword at all. The results imply, for example, that for a t-error-correcting Reed-Solomon code of length q - 1 over GF(q), if more than t errors occur, the probability of decoder error is less than 1/t!. In particular, for the Voyager Reed-Solomon code, the probability of decoder error given a word error is smaller than 3 x 10 to the minus 14th power. Thus, in a typical operating region with probability 10 to the minus 5th power of word error, the probability of undetected word error is about 3 x 10 to the minus 19th power.

  14. On the decode error probability for Reed-Solomon codes

    NASA Astrophysics Data System (ADS)

    McEliece, R. J.; Swanson, L.

    1986-02-01

    Upper bounds on the decoder error probability for Reed-Solomon codes are derived. By definition, decoder error occurs when the decoder finds a codeword other than the transmitted codeword; this is in contrast to decoder failure, which occurs when the decoder fails to find any codeword at all. The results imply, for example, that for a t-error-correcting Reed-Solomon code of length q - 1 over GF(q), if more than t errors occur, the probability of decoder error is less than 1/t!. In particular, for the Voyager Reed-Solomon code, the probability of decoder error given a word error is smaller than 3 x 10 to the minus 14th power. Thus, in a typical operating region with probability 10 to the minus 5th power of word error, the probability of undetected word error is about 3 x 10 to the minus 19th power.
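
    As a quick numerical check of the figures quoted above (a sketch; t = 16 is the standard error-correction capability of the Voyager RS(255,223) code):

        from math import factorial

        t = 16   # Voyager RS(255,223) corrects t = 16 symbol errors
        print(f"1/t! = {1 / factorial(t):.1e}")      # ~4.8e-14, same order as the quoted 3e-14
        print(f"undetected: {1e-5 * 3e-14:.0e}")     # word-error prob 1e-5 times the bound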

  15. A discussion on the origin of quantum probabilities

    SciTech Connect

    Holik, Federico; Sáenz, Manuel; Plastino, Angel

    2014-01-15

    We study the origin of quantum probabilities as arising from non-Boolean propositional-operational structures. We apply the method developed by Cox to non distributive lattices and develop an alternative formulation of non-Kolmogorovian probability measures for quantum mechanics. By generalizing the method presented in previous works, we outline a general framework for the deduction of probabilities in general propositional structures represented by lattices (including the non-distributive case). -- Highlights: •Several recent works use a derivation similar to that of R.T. Cox to obtain quantum probabilities. •We apply Cox’s method to the lattice of subspaces of the Hilbert space. •We obtain a derivation of quantum probabilities which includes mixed states. •The method presented in this work is susceptible to generalization. •It includes quantum mechanics and classical mechanics as particular cases.

  16. Oil spill contamination probability in the southeastern Levantine basin.

    PubMed

    Goldman, Ron; Biton, Eli; Brokovich, Eran; Kark, Salit; Levin, Noam

    2015-02-15

    Recent gas discoveries in the eastern Mediterranean Sea led to multiple operations with substantial economic interest, and with them there is a risk of oil spills and their potential environmental impacts. To examine the potential spatial distribution of this threat, we created seasonal maps of the probability of oil spill pollution reaching an area in the Israeli coastal and exclusive economic zones, given knowledge of its initial sources. We performed simulations of virtual oil spills using realistic atmospheric and oceanic conditions. The resulting maps show dominance of the alongshore northerly current, which causes the high probability areas to be stretched parallel to the coast, increasing contamination probability downstream of source points. The seasonal westerly wind forcing determines how wide the high probability areas are, and may also restrict these to a small coastal region near source points. Seasonal variability in probability distribution, oil state, and pollution time is also discussed. PMID:25534630

  17. The Link between Statistical Segmentation and Word Learning in Adults

    ERIC Educational Resources Information Center

    Mirman, Daniel; Magnuson, James S.; Estes, Katharine Graf; Dixon, James A.

    2008-01-01

    Many studies have shown that listeners can segment words from running speech based on conditional probabilities of syllable transitions, suggesting that this statistical learning could be a foundational component of language learning. However, few studies have shown a direct link between statistical segmentation and word learning. We examined this…

  18. Adaptive Learning and Risk Taking

    ERIC Educational Resources Information Center

    Denrell, Jerker

    2007-01-01

    Humans and animals learn from experience by reducing the probability of sampling alternatives with poor past outcomes. Using simulations, J. G. March (1996) illustrated how such adaptive sampling could lead to risk-averse as well as risk-seeking behavior. In this article, the author develops a formal theory of how adaptive sampling influences risk…

  19. Stimulus probability effects on temporal bisection performance of mice (Mus musculus).

    PubMed

    Akdoğan, Başak; Balcı, Fuat

    2016-01-01

    In the temporal bisection task, participants classify experienced stimulus durations as short or long based on their temporal similarity to previously learned reference durations. Temporal decision making in this task should be influenced by the experienced probabilities of the reference durations for adaptiveness. In this study, we tested the temporal bisection performance of mice (Mus musculus) under different short and long reference duration probability conditions implemented across two experimental phases. In Phase 1, the proportion of reference durations (compared to probe durations) was 0.5, whereas in Phase 2 it was increased to 0.8 to further examine the adjustment of choice behavior with more frequent reference duration presentations (under higher reinforcement rate). Our findings suggest that mice developed adaptive biases in their choice behaviors. These adjustments in choice behavior were nearly optimal as the mice maximized their gain to a great extent which required them to monitor stimulus probabilities as well as the level of variability in their temporal judgments. We further found that short but not long categorization response times were sensitive to stimulus probability manipulations, which in turn suggests an asymmetry between short and long categorizations. Finally, we investigated the latent decision processes underlying the bias manifested in subjects' choice behavior within the diffusion model framework. Our results revealed that probabilistic information influenced the starting point and the rate of evidence accumulation process. Overall, the stimulus probability effects on choice behavior were modulated by the reinforcement rate. Our findings illustrate that mice can adapt their temporal behaviors with respect to the probabilistic contingencies in the environment. PMID:26242608

  20. A Semantic Web-Based Authoring Tool to Facilitate the Planning of Collaborative Learning Scenarios Compliant with Learning Theories

    ERIC Educational Resources Information Center

    Isotani, Seiji; Mizoguchi, Riichiro; Isotani, Sadao; Capeli, Olimpio M.; Isotani, Naoko; de Albuquerque, Antonio R. P. L.; Bittencourt, Ig. I.; Jaques, Patricia

    2013-01-01

    When the goal of group activities is to support long-term learning, the task of designing well-thought-out collaborative learning (CL) scenarios is an important key to success. To help students adequately acquire and develop their knowledge and skills, a teacher can plan a scenario that increases the probability for learning to occur. Such a…

  1. Learning Strategies for Learning Technologies.

    ERIC Educational Resources Information Center

    Olgren, Christine H.

    2000-01-01

    Underpinning the use of old or new learning technologies is what a learner has to do to process information effectively. A learner-centered approach should connect learning strategies (orientation, management, information processing, evaluation of outcomes) to learning technologies. (SK)

  2. Anosognosia and procedural learning in Alzheimer's disease.

    PubMed

    Starkstein, S E; Sabe, L; Cuerva, A G; Kuzis, G; Leiguarda, R

    1997-04-01

    Awareness of cognitive deficits may rely on the implicit learning of intellectual limitations, and anosognosia in Alzheimer's disease (AD) may result from deficits in implicit learning. To examine this hypothesis, a consecutive series of 55 patients with probable AD were divided into groups with mild (n = 13), severe (n = 12), or no anosognosia (n = 30) and were assessed with a neuropsychological battery that included tests of declarative and procedural learning. Whereas there were no significant between-group differences in tests of declarative learning (the Buschke Selective Reminding Test and the Benton Visual Retention Test), patients with severe anosognosia showed a significantly worse performance on procedural learning (as measured with the Maze Learning Test) and a test assessing set shifting abilities (the Wisconsin Card Sorting Test) than AD patients without anosognosia. The authors' results suggest that deficits in procedural learning and anosognosia in AD may result from dysfunction in habit-learning systems. PMID:9150509

  3. Electrofishing capture probability of smallmouth bass in streams

    USGS Publications Warehouse

    Dauwalter, D.C.; Fisher, W.L.

    2007-01-01

    Abundance estimation is an integral part of understanding the ecology and advancing the management of fish populations and communities. Mark-recapture and removal methods are commonly used to estimate the abundance of stream fishes. Alternatively, abundance can be estimated by dividing the number of individuals sampled by the probability of capture. We conducted a mark-recapture study and used multiple repeated-measures logistic regression to determine the influence of fish size, sampling procedures, and stream habitat variables on the cumulative capture probability for smallmouth bass Micropterus dolomieu in two eastern Oklahoma streams. The predicted capture probability was used to adjust the number of individuals sampled to obtain abundance estimates. The observed capture probabilities were higher for larger fish and decreased with successive electrofishing passes for larger fish only. Model selection suggested that the number of electrofishing passes, fish length, and mean thalweg depth affected capture probabilities the most; there was little evidence for any effect of electrofishing power density and woody debris density on capture probability. Leave-one-out cross validation showed that the cumulative capture probability model predicts smallmouth abundance accurately. © Copyright by the American Fisheries Society 2007.
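
    A minimal sketch of the abundance adjustment described in the abstract, assuming a constant per-pass capture probability instead of the paper's repeated-measures logistic-regression model; all numbers are invented:

        def abundance_estimate(catch, p_per_pass, n_passes):
            """N-hat = total catch / cumulative capture probability."""
            p_cum = 1.0 - (1.0 - p_per_pass) ** n_passes   # P(captured in >= 1 pass)
            return catch / p_cum

        # 42 smallmouth bass caught over three passes at p = 0.4 per pass
        print(round(abundance_estimate(catch=42, p_per_pass=0.4, n_passes=3), 1))  # ~53.6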

  4. Equal prior probabilities: can one do any better?

    PubMed

    Biedermann, A; Taroni, F; Garbolino, P

    2007-10-25

    This paper discusses recommendations concerning the use of prior probabilities that underlie recent, but in no way novel, proposals of presenting scientific evidence in terms of posterior probabilities, in the context sometimes referred to as the 'full Bayes' approach'. A chief issue of this procedure is the proposal that--given the unavailability of case-specific circumstantial information--scientists should consider the prior probabilities of the propositions under which scientific evidence is evaluated as equal. The discussion presented here draws the reader's attention to the fact that the philosophical foundations of such a recommendation (in particular, attempted justifications through the Principle of Maximum Entropy (PME)) are far more controversial than is actually admitted by the advocates for their use in the theory and practice of forensic science. Invoking only basic assumptions and the mathematical rules of probability calculus, the authors of this paper propose an argument that shows that there can be other more feasible and defensible strategies for eliciting reasonable prior probabilities. It is solely demanded that the reasoner is willing to make up his mind seriously on certain standard issues of fairly general criminal cases, such as evidential relevance or the probability of a suspect's guilt. However, because these issues intimately pertain to the responsibility of the trier of fact, it is argued here that scientists' attempts to define appropriate prior probabilities should continue to be considered untenable. PMID:17267153

  5. Naive Probability: Model-Based Estimates of Unique Events.

    PubMed

    Khemlani, Sangeet S; Lotstein, Max; Johnson-Laird, Philip N

    2015-08-01

    We describe a dual-process theory of how individuals estimate the probabilities of unique events, such as Hillary Clinton becoming U.S. President. It postulates that uncertainty is a guide to improbability. In its computer implementation, an intuitive system 1 simulates evidence in mental models and forms analog non-numerical representations of the magnitude of degrees of belief. This system has minimal computational power and combines evidence using a small repertoire of primitive operations. It resolves the uncertainty of divergent evidence for single events, for conjunctions of events, and for inclusive disjunctions of events, by taking a primitive average of non-numerical probabilities. It computes conditional probabilities in a tractable way, treating the given event as evidence that may be relevant to the probability of the dependent event. A deliberative system 2 maps the resulting representations into numerical probabilities. With access to working memory, it carries out arithmetical operations in combining numerical estimates. Experiments corroborated the theory's predictions. Participants concurred in estimates of real possibilities. They violated the complete joint probability distribution in the predicted ways, when they made estimates about conjunctions: P(A), P(B), P(A and B), disjunctions: P(A), P(B), P(A or B or both), and conditional probabilities P(A), P(B), P(B|A). They were faster to estimate the probabilities of compound propositions when they had already estimated the probabilities of each of their components. We discuss the implications of these results for theories of probabilistic reasoning. PMID:25363706

  6. Two-slit experiment: quantum and classical probabilities

    NASA Astrophysics Data System (ADS)

    Khrennikov, Andrei

    2015-06-01

    Inter-relation between quantum and classical probability models is one of the most fundamental problems of quantum foundations. Nowadays this problem also plays an important role in quantum technologies, in quantum cryptography and the theory of quantum random generators. In this letter, we compare the viewpoint of Richard Feynman that the behavior of quantum particles cannot be described by classical probability theory with the viewpoint that quantum-classical inter-relation is more complicated (cf., in particular, the tomographic model of quantum mechanics developed in detail by Vladimir Man'ko). As a basic example, we consider the two-slit experiment, which played a crucial role in quantum foundational debates at the beginning of quantum mechanics (QM). In particular, its analysis led Niels Bohr to the formulation of the principle of complementarity. First, we demonstrate that in complete accordance with Feynman's viewpoint, the probabilities for the two-slit experiment have a non-Kolmogorovian structure, since they violate one of the basic laws of classical probability theory, the law of total probability (the heart of the Bayesian analysis). However, we then show that these probabilities can be embedded in a natural way into the classical (Kolmogorov, 1933) probability model. To do this, one has to take into account the randomness of selection of different experimental contexts, the joint consideration of which led Feynman to a conclusion about the non-classicality of quantum probability. We compare this embedding of non-Kolmogorovian quantum probabilities into the Kolmogorov model with well-known embeddings of non-Euclidean geometries into Euclidean space (e.g., the Poincaré disk model for the Lobachevsky plane).
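
    A toy numerical illustration of the violated law of total probability in the two-slit setting; all values (equal amplitudes, conditional probabilities of 0.5) are invented for illustration:

        import math

        # With slit i alone open, P(x | slit_i) = 0.5 and P(slit_i) = 0.5, so the
        # classical law of total probability fixes P(x) at 0.5, independent of phase.
        classical = 0.5 * 0.5 + 0.5 * 0.5

        def quantum(phase):
            """|a1 + a2|^2 with equal amplitude-squares of 0.25 and a relative phase:
            the cosine cross term is exactly what the classical formula lacks."""
            return 0.25 + 0.25 + 2 * 0.25 * math.cos(phase)

        for phase in (0.0, math.pi / 2, math.pi):
            print(f"phase {phase:.2f}: quantum P(x) = {quantum(phase):.2f}, classical P(x) = {classical}")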

  7. Crossing probability for directed polymers in random media

    NASA Astrophysics Data System (ADS)

    De Luca, Andrea; Le Doussal, Pierre

    2015-10-01

    We study the probability that two directed polymers in the same random potential do not intersect. We use the replica method to map the problem onto the attractive Lieb-Liniger model with generalized statistics between particles. Employing both the nested Bethe ansatz and known formulas from Macdonald processes, we obtain analytical expressions for the first few moments of this probability and compare them to a numerical simulation of a discrete model at high temperature. From these observations, several large-time properties of the noncrossing probabilities are conjectured. Extensions of our formalism to more general observables are discussed.

  8. Fitness Probability Distribution of Bit-Flip Mutation.

    PubMed

    Chicano, Francisco; Sutton, Andrew M; Whitley, L Darrell; Alba, Enrique

    2015-01-01

    Bit-flip mutation is a common mutation operator for evolutionary algorithms applied to optimize functions over binary strings. In this paper, we develop results from the theory of landscapes and Krawtchouk polynomials to exactly compute the probability distribution of fitness values of a binary string undergoing uniform bit-flip mutation. We prove that this probability distribution can be expressed as a polynomial in p, the probability of flipping each bit. We analyze these polynomials and provide closed-form expressions for an easy linear problem (Onemax), and an NP-hard problem, MAX-SAT. We also discuss a connection of the results with runtime analysis. PMID:24885680
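
    A minimal sketch of this computation for Onemax: the child's fitness distribution under bit-flip mutation, obtained by summing over the numbers of 1-to-0 and 0-to-1 flips (the string length, parent fitness, and p are illustrative):

        from math import comb

        def onemax_mutation_pmf(n, k, p):
            """Exact P(child has j ones | parent has k ones of n) under
            independent per-bit flips with probability p; each entry is a
            polynomial in p, as the paper proves in general."""
            pmf = [0.0] * (n + 1)
            for d10 in range(k + 1):             # ones flipped to zeros
                for d01 in range(n - k + 1):     # zeros flipped to ones
                    j = k - d10 + d01
                    flips = d10 + d01
                    pmf[j] += (comb(k, d10) * comb(n - k, d01)
                               * p**flips * (1 - p)**(n - flips))
            return pmf

        pmf = onemax_mutation_pmf(n=10, k=7, p=0.1)
        print(round(sum(pmf), 12))                    # 1.0: sanity check
        print(max(range(11), key=pmf.__getitem__))    # most likely child fitness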

  9. Quantum correlations in terms of neutrino oscillation probabilities

    NASA Astrophysics Data System (ADS)

    Alok, Ashutosh Kumar; Banerjee, Subhashish; Uma Sankar, S.

    2016-08-01

    Neutrino oscillations provide evidence for the mode entanglement of neutrino mass eigenstates in a given flavour eigenstate. Given this mode entanglement, it is pertinent to consider the relation between the oscillation probabilities and other quantum correlations. In this work, we show that all the well-known quantum correlations, such as Bell's inequality, are directly related to the neutrino oscillation probabilities. The results of the neutrino oscillation experiments, which measure the neutrino survival probability to be less than unity, imply Bell's inequality violation.

  10. New methodology for assessing the probability of contaminating Mars

    NASA Technical Reports Server (NTRS)

    North, D. W.; Judd, B. R.; Pezier, J. P.

    1974-01-01

    Methodology is proposed to assess the probability that the planet Mars will be contaminated by terrestrial microorganisms aboard a spacecraft. The present NASA methods are extended to permit utilization of detailed information on microbial characteristics, the lethality of release and transport mechanisms, and other information about the Martian environment. Different types of microbial release are distinguished, and for each release mechanism a probability of growth is computed. Using this new methodology, an assessment was carried out for the 1975 Viking landings on Mars. The resulting probability of contamination for each Viking lander is 6 x 10 to the -6 power, and is amenable to revision as additional information becomes available.

  11. Standard quantum mechanics featuring probabilities instead of wave functions

    SciTech Connect

    Manko, V. I. Manko, O. V.

    2006-06-15

    A new formulation of quantum mechanics (the probability representation) is discussed. In this representation, a quantum state is described by a standard positive definite probability distribution (tomogram) rather than by a wave function. An unambiguous relation (an analog of the Radon transformation) between the density operator and a tomogram is constructed both for continuous coordinates and for spin variables. A novel feature of a state, tomographic entropy, is considered, and its connection with von Neumann entropy is discussed. A one-to-one map of quantum observables (Hermitian operators) onto positive probability distributions is found.

  12. Origin of probabilities and their application to the multiverse

    NASA Astrophysics Data System (ADS)

    Albrecht, Andreas; Phillips, Daniel

    2014-12-01

    We argue using simple models that all successful practical uses of probabilities originate in quantum fluctuations in the microscopic physical world around us, often propagated to macroscopic scales. Thus we claim there is no physically verified fully classical theory of probability. We comment on the general implications of this view, and specifically question the application of purely classical probabilities to cosmology in cases where key questions are known to have no quantum answer. We argue that the ideas developed here may offer a way out of the notorious measure problems of eternal inflation.

  13. Sampling of quasidistributions, nonclassical behavior and negative probabilities

    NASA Astrophysics Data System (ADS)

    Peřina, J.; Křepelka, J.

    2016-05-01

    We perform a sampling of the quasidistribution for the process of optical down-conversion in the nonclassical regime, in which negative values of the quasidistribution are exhibited, using the Shannon-Kotelnikov sampling formula. We show that negative values of the quasidistribution do not directly represent probabilities; however, negative terms in the sampling formula related to the nonclassical behavior can be interpreted as positive probabilities in the negative orthogonal sinc-basis, whereas positive probabilities in the positive sinc-basis describe classical cases.

  14. Imprecise probability assessment of tipping points in the climate system.

    PubMed

    Kriegler, Elmar; Hall, Jim W; Held, Hermann; Dawson, Richard; Schellnhuber, Hans Joachim

    2009-03-31

    Major restructuring of the Atlantic meridional overturning circulation, the Greenland and West Antarctic ice sheets, the Amazon rainforest and ENSO, are a source of concern for climate policy. We have elicited subjective probability intervals for the occurrence of such major changes under global warming from 43 scientists. Although the expert estimates highlight large uncertainty, they allocate significant probability to some of the events listed above. We deduce conservative lower bounds for the probability of triggering at least 1 of those events of 0.16 for medium (2-4 degrees C), and 0.56 for high global mean temperature change (above 4 degrees C) relative to year 2000 levels. PMID:19289827

  15. Two-Valued Probability Measure on the Pontryagin Space

    NASA Astrophysics Data System (ADS)

    Matvejchuk, Marjan; Utkina, Elena

    2015-12-01

    The well-known Kochen-Specker theorem addresses the problem of hidden variables in quantum mechanics. It states that there is no two-valued probability measure on the real Hilbert space of dimension three. In this paper we present an analogue of the Kochen-Specker theorem in Pontryagin space: a Pontryagin space H of dimension greater than or equal to three has a two-valued probability measure if and only if H has indefinite rank one, in which case any such two-valued probability measure on H is unique.

  16. Learning Disabilities.

    ERIC Educational Resources Information Center

    Clow, John, Ed.; Woolschlager, Ruth B., Ed.

    The learning disabilities monograph contains five brief articles dealing with various aspects of learning disabilities as they relate to business education. "Learning Disabilities: A Challenge for the Vocational Business Educator" (Dorothy Munger) concerns screening students with learning disabilities into rather than out of business education…

  17. Learning Styles.

    ERIC Educational Resources Information Center

    Ross, Dorian

    A variety of research findings and observations concerning learning styles are compiled in this guide to help teachers understand the implications of their students' learning preferences. The first section describes the Learning Preference Inventory (LPI) as an instrument that asks the student to select among eight learning situations, e.g.,…

  18. The Relationship between Socio-Economic Status, General Language Learning Outcome, and Beliefs about Language Learning

    ERIC Educational Resources Information Center

    Ariani, Mohsen Ghasemi; Ghafournia, Narjes

    2016-01-01

    The objective of this study is to explore the probable relationship between Iranian students' socioeconomic status, general language learning outcome, and their beliefs about language learning. To this end, 350 postgraduate students, doing English for specific courses at Islamic Azad University of Neyshabur participated in this study. They were…

  19. Wald Sequential Probability Ratio Test for Space Object Conjunction Assessment

    NASA Technical Reports Server (NTRS)

    Carpenter, James R.; Markley, F Landis

    2014-01-01

    This paper shows how satellite owner/operators may use sequential estimates of collision probability, along with a prior assessment of the base risk of collision, in a compound hypothesis ratio test to inform decisions concerning collision risk mitigation maneuvers. The compound hypothesis test reduces to a simple probability ratio test, which appears to be a novel result. The test satisfies tolerances related to targeted false alarm and missed detection rates. This result is independent of the method one uses to compute the probability density that one integrates to compute collision probability. A well-established test case from the literature shows that this test yields acceptable results within the constraints of a typical operational conjunction assessment decision timeline. Another example illustrates the use of the test in a practical conjunction assessment scenario based on operations of the International Space Station.
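
    As a concrete illustration of the underlying idea, here is a minimal generic Wald SPRT in Python. It is a sketch of the classical test, not the paper's compound collision-risk formulation; the likelihood-ratio inputs and tolerances are assumptions:

        import math

        def wald_sprt(log_lr_increments, alpha=0.01, beta=0.01):
            """Generic Wald sequential probability ratio test.

            log_lr_increments: per-observation values of log[p(x|H1)/p(x|H0)].
            alpha: tolerated false-alarm rate; beta: tolerated missed-detection rate.
            """
            upper = math.log((1.0 - beta) / alpha)   # cross above: accept H1
            lower = math.log(beta / (1.0 - alpha))   # cross below: accept H0
            s, n = 0.0, 0
            for inc in log_lr_increments:
                s += inc
                n += 1
                if s >= upper:
                    return "H1", n
                if s <= lower:
                    return "H0", n
            return "undecided", n

    The boundaries log((1-beta)/alpha) and log(beta/(1-alpha)) are what tie the stopping rule to the targeted false-alarm and missed-detection rates mentioned in the abstract.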

  20. 2. BARN. VIEW LOOKING NORTHWEST. THE ROLLING DOOR PROBABLY REPLACES ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    2. BARN. VIEW LOOKING NORTHWEST. THE ROLLING DOOR PROBABLY REPLACES AN ORIGINAL 4/4 DOUBLE-HUNG WINDOW. - Tonto Ranger Station, Barn, Forest Service Road 65 at Tonto Wash, Skull Valley, Yavapai County, AZ

  1. Modeling Finite-Time Failure Probabilities in Risk Analysis Applications.

    PubMed

    Dimitrova, Dimitrina S; Kaishev, Vladimir K; Zhao, Shouqi

    2015-10-01

    In this article, we introduce a framework for analyzing the risk of systems failure based on estimating the failure probability. The latter is defined as the probability that a certain risk process, characterizing the operations of a system, reaches a possibly time-dependent critical risk level within a finite-time interval. Under general assumptions, we define two dually connected models for the risk process and derive explicit expressions for the failure probability and also the joint probability of the time of the occurrence of failure and the excess of the risk process over the risk level. We illustrate how these probabilistic models and results can be successfully applied in several important areas of risk analysis, among which are systems reliability, inventory management, flood control via dam management, infectious disease spread, and financial insolvency. Numerical illustrations are also presented. PMID:26010201
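
    The failure probability defined here, the chance that a risk process first crosses a possibly time-dependent critical level within a finite horizon, can be approximated by direct simulation. A minimal Monte Carlo sketch follows; the random-walk dynamics and the level function are placeholders, not the authors' dually connected models:

        import random

        def failure_probability(horizon=100, level=lambda t: 20.0 + 0.1 * t,
                                n_paths=100_000, seed=1):
            """Estimate P(risk process reaches the critical level before `horizon`)."""
            rng = random.Random(seed)
            failures = 0
            for _ in range(n_paths):
                x = 0.0
                for t in range(1, horizon + 1):
                    x += rng.gauss(0.2, 1.0)   # placeholder drift + noise step
                    if x >= level(t):          # first crossing => system failure
                        failures += 1
                        break
            return failures / n_paths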

  2. Interval estimation of small tail probabilities - applications in food safety.

    PubMed

    Kedem, Benjamin; Pan, Lemeng; Zhou, Wen; Coelho, Carlos A

    2016-08-15

    Often in food safety and bio-surveillance it is desirable to estimate the probability that a contaminant, or a function thereof, exceeds an unsafe high threshold. The probability in question is very small. To estimate such a probability, we need information about large values. In many cases, the data do not contain information about exceedingly large contamination levels, which ostensibly renders the problem insolvable. A solution is suggested whereby more information about small tail probabilities is obtained by repeatedly combining the real data with computer-generated data. This method provides short yet reliable interval estimates based on moderately large samples. An illustration is provided in terms of lead exposure data. Copyright © 2016 John Wiley & Sons, Ltd. PMID:26891189

  3. 21. HISTORIC VIEW OF EARLY MIRAK DESIGN. PROBABLY AT THE ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    21. HISTORIC VIEW OF EARLY MIRAK DESIGN. PROBABLY AT THE FARM OF KLAUS RIEDEL'S GRANDPARENTS IN BERNSTADT, SAXONY, 1930. - Marshall Space Flight Center, Redstone Rocket (Missile) Test Stand, Dodd Road, Huntsville, Madison County, AL

  4. Maximum Probability Reaction Sequences in Stochastic Chemical Kinetic Systems

    PubMed Central

    Salehi, Maryam; Perkins, Theodore J.

    2010-01-01

    The detailed behavior of many molecular processes in the cell, such as protein folding, protein complex assembly, and gene regulation, transcription and translation, can often be accurately captured by stochastic chemical kinetic models. We investigate a novel computational problem involving these models – that of finding the most-probable sequence of reactions that connects two or more states of the system observed at different times. We describe an efficient method for computing the probability of a given reaction sequence, but argue that computing most-probable reaction sequences is EXPSPACE-hard. We develop exact (exhaustive) and approximate algorithms for finding most-probable reaction sequences. We evaluate these methods on test problems relating to a recently-proposed stochastic model of folding of the Trp-cage peptide. Our results provide new computational tools for analyzing stochastic chemical models, and demonstrate their utility in illuminating the behavior of real-world systems. PMID:21629860
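
    In the standard continuous-time Markov chain view of such models, the probability that reaction j fires next is its propensity divided by the total propensity, so the probability of a whole reaction sequence (marginalized over firing times) is the product of those ratios along the path. A sketch, with an invented toy network:

        def sequence_probability(x0, reactions, sequence):
            """Probability that the embedded jump chain follows `sequence`.

            x0:        initial species counts (tuple)
            reactions: list of (propensity_fn, state_change) pairs
            sequence:  list of reaction indices
            """
            x, prob = list(x0), 1.0
            for j in sequence:
                props = [a(x) for a, _ in reactions]
                total = sum(props)
                if total == 0 or props[j] == 0:
                    return 0.0
                prob *= props[j] / total        # chance reaction j fires next
                for i, dx in enumerate(reactions[j][1]):
                    x[i] += dx
            return prob

        # Toy network (invented rates): A -> B at 2.0*A, B -> A at 1.0*B.
        rxns = [(lambda x: 2.0 * x[0], (-1, +1)),
                (lambda x: 1.0 * x[1], (+1, -1))]
        print(sequence_probability((10, 0), rxns, [0, 0, 1]))

    Enumerating or searching over such products is what makes the most-probable-sequence problem combinatorial, consistent with the hardness result claimed in the abstract.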

  5. Quantum Probability Theory and the Foundations of Quantum Mechanics

    NASA Astrophysics Data System (ADS)

    Fröhlich, Jürg; Schubnel, Baptiste

    By and large, people are better at coining expressions than at filling them with interesting, concrete contents. Thus, it may not be very surprising that there are many professional probabilists who may have heard the expression but do not appear to be aware of the need to develop "quantum probability theory" into a thriving, rich, useful field featured at meetings and conferences on probability theory. Although our aim, in this essay, is not to contribute new results on quantum probability theory, we hope to be able to let the reader feel the enormous potential and richness of this field. What we intend to do, in the following, is to contribute some novel points of view to the "foundations of quantum mechanics", using mathematical tools from "quantum probability theory" (such as the theory of operator algebras).

  6. 42. VIEW EAST OF PLASTIC STACK (PROBABLY PVC) WHICH VENTED ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    42. VIEW EAST OF PLASTIC STACK (PROBABLY PVC) WHICH VENTED FUMES FROM THE DIPPING OPERATIONS IN BUILDING 49A; BUILDING 49 IS AT THE LEFT OF THE PHOTOGRAPH - Scovill Brass Works, 59 Mill Street, Waterbury, New Haven County, CT

  7. 10. NORTHWEST END OF WHITSETT PLANT SHOWING PIPELINES PROBABLY FOR ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    10. NORTHWEST END OF WHITSETT PLANT SHOWING PIPELINES PROBABLY FOR FIRE WATER STORAGE, LOOKING SOUTHWEST. - Whitsett Pump Plant, West side of Colorado River, north of Parker Dam, Parker Dam, San Bernardino County, CA

  8. A Probability Problem from Real Life: The Tire Exploded.

    ERIC Educational Resources Information Center

    Bartlett, Albert A.

    1993-01-01

    Discusses the probability of seeing a tire explode or disintegrate while traveling down the highway. Suggests that a person observing 10 hours a day would see a failure on the average of once every 300 years. (MVL)

  9. Adaptive MFR parameter control: fixed vs. variable probabilities of detection

    NASA Astrophysics Data System (ADS)

    Boers, Yvo; Driessen, Hans; Zwaga, Jitse

    2005-09-01

    In this paper an efficient adaptive parameter control scheme for Multi-Function Radar (MFR) is used. This scheme was introduced in Ref. 5. It has been designed in such a way that it meets constraints on specific quantities that are relevant for target tracking while minimizing the energy spent. It is shown here that this optimal scheme leads to considerable variation of the realized detection probability, even within a single scenario. We also show that constraining or fixing the probability of detection to a certain predefined value leads to a considerable increase in the energy spent on the target. This holds even when one optimizes the fixed probability of detection. The bottom-line message is that the detection probability is not a design parameter by itself, but merely the product of an optimal schedule.

  10. Review of Literature for Model Assisted Probability of Detection

    SciTech Connect

    Meyer, Ryan M.; Crawford, Susan L.; Lareau, John P.; Anderson, Michael T.

    2014-09-30

    This is a draft technical letter report, prepared for an NRC client, documenting a literature review of model-assisted probability of detection (MAPOD) for potential application to nuclear power plant components, with the aim of improving field NDE performance estimates.

  11. The use of sequential probabilities in the segmentation of speech.

    PubMed

    van der Lugt, A H

    2001-07-01

    The present investigation addresses the possible utility of sequential probabilities in the segmentation of spoken language. In a series of five word-spotting and two control lexical decision experiments, high- versus low-probability consonant-vowel (Experiments 1, 2, 5, and 7) and vowel-consonant (Experiments 1, 3, 4, and 6) strings were presented either in the nonsense contexts of target words (Experiments 1-3) or within the target words themselves (Experiments 4-7). The results suggest that listeners, at least for sequences in the onset position, indeed use sequential probabilities as cues for segmentation. The probability of a sound sequence influenced segmentation more when the sequence occurred within the target words (Experiments 4-7 vs. Experiments 1-3). Furthermore, the effects were reliable only when the sequences occurred in the onset position (Experiments 1, 2, 5, and 7 vs. Experiments 1, 3, 4, and 6). PMID:11521849

  12. Exploring Probability through an Evens-Odds Dice Game.

    ERIC Educational Resources Information Center

    Quinn, Robert J.; Wiest, Lynda R.

    1999-01-01

    Presents a dice game that students can use as a basis for exploring mathematical probabilities and making decisions while they also exercise skills in multiplication, pattern identification, proportional thinking, and communication. (ASK)

  13. PROBABILITY SAMPLING AND POPULATION INFERENCE IN MONITORING PROGRAMS

    EPA Science Inventory

    A fundamental difference between probability sampling and conventional statistics is that "sampling" deals with real, tangible populations, whereas "conventional statistics" usually deals with hypothetical populations that have no real-world realization. The focus here is on real ...

  14. Generalized Sequential Probability Ratio Test for Separate Families of Hypotheses

    PubMed Central

    Li, Xiaoou; Liu, Jingchen; Ying, Zhiliang

    2014-01-01

    In this paper, we consider the problem of testing two separate families of hypotheses via a generalization of the sequential probability ratio test. In particular, the generalized likelihood ratio statistic is considered and the stopping rule is the first boundary crossing of the generalized likelihood ratio statistic. We show that this sequential test is asymptotically optimal in the sense that it achieves asymptotically the shortest expected sample size as the maximal type I and type II error probabilities tend to zero.

  15. Multiobjective fuzzy stochastic linear programming problems with inexact probability distribution

    SciTech Connect

    Hamadameen, Abdulqader Othman; Zainuddin, Zaitul Marlizawati

    2014-06-19

    This study deals with multiobjective fuzzy stochastic linear programming problems with an inexact probability distribution, defined as fuzzy assertions by ambiguous experts. The problem formulation is presented, and two solution strategies are given: the fuzzy transformation via a ranking function, and the stochastic transformation, in which the α-cut technique and linguistic hedges are applied to the inexact probability distribution. A development of Sen's method is employed to find a compromise solution, supported by an illustrative numerical example.

  16. Weak measurements measure probability amplitudes (and very little else)

    NASA Astrophysics Data System (ADS)

    Sokolovski, D.

    2016-04-01

    Conventional quantum mechanics describes a pre- and post-selected system in terms of virtual (Feynman) paths via which the final state can be reached. In the absence of probabilities, a weak measurement (WM) determines the probability amplitudes for the paths involved. The weak values (WV) can be identified with these amplitudes, or their linear combinations. This allows us to explain the "unusual" properties of the WV, and avoid the "paradoxes" often associated with the WM.

  18. An Alternative Approach to the Total Probability Formula. Classroom Notes

    ERIC Educational Resources Information Center

    Wu, Dane W.; Bangerter, Laura M.

    2004-01-01

    Given a set of urns, each filled with a mix of black chips and white chips, what is the probability of drawing a black chip from the last urn after some sequential random shifts of chips among the urns? The Total Probability Formula (TPF) is the common tool to solve such a problem. However, when the number of urns is more than two and the number…
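
    A small worked instance of the formula in this urn setting (the chip counts are invented): suppose one chip is drawn at random from urn 1 and moved to urn 2 before drawing from urn 2; conditioning on the color of the transferred chip gives the TPF answer directly:

        from fractions import Fraction as F

        # Urn 1: 3 black, 2 white.  Urn 2: 1 black, 4 white.  (Invented counts.)
        p_black_moved = F(3, 5)
        p_white_moved = F(2, 5)

        # Urn 2 now holds 6 chips; total probability over the transferred color:
        p_black_from_urn2 = p_black_moved * F(2, 6) + p_white_moved * F(1, 6)
        print(p_black_from_urn2)   # 4/15

    With more urns and more shifts, the same conditioning must be nested once per shift, which is the combinatorial growth the article's alternative approach is presumably designed to tame.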

  19. A Bayesian Estimator of Protein-Protein Association Probabilities

    SciTech Connect

    Gilmore, Jason M.; Auberry, Deanna L.; Sharp, Julia L.; White, Amanda M.; Anderson, Kevin K.; Daly, Don S.

    2008-07-01

    The Bayesian Estimator of Protein-Protein Association Probabilities (BEPro3) is a software tool for estimating probabilities of protein-protein association between bait and prey protein pairs using data from multiple-bait, multiple-replicate, protein pull-down LC-MS assay experiments. BEPro3 is open source software that runs on both Windows XP and Mac OS 10.4 or newer versions, and is freely available from http://www.pnl.gov/statistics/BEPro3.

  20. Universality probability of a prefix-free machine.

    PubMed

    Barmpalias, George; Dowe, David L

    2012-07-28

    We study the notion of universality probability of a universal prefix-free machine, as introduced by C. S. Wallace. We show that it is random relative to the third iterate of the halting problem and determine its Turing degree and its place in the arithmetical hierarchy of complexity. Furthermore, we give a computational characterization of the real numbers that are universality probabilities of universal prefix-free machines. PMID:22711870

  1. Incorporating Skew into RMS Surface Roughness Probability Distribution

    NASA Technical Reports Server (NTRS)

    Stahl, Mark T.; Stahl, H. Philip.

    2013-01-01

    The standard treatment of RMS surface roughness data is the application of a Gaussian probability distribution. This handling of surface roughness ignores the skew present in the surface and overestimates the most probable RMS of the surface, the mode. Using experimental data we confirm the Gaussian distribution overestimates the mode and application of an asymmetric distribution provides a better fit. Implementing the proposed asymmetric distribution into the optical manufacturing process would reduce the polishing time required to meet surface roughness specifications.
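
    A hedged sketch of the comparison described: fit both a Gaussian and an asymmetric distribution to roughness samples and compare the implied modes. SciPy's skew-normal stands in for whatever asymmetric distribution the authors propose, and the data are synthetic:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        roughness = stats.skewnorm.rvs(a=4.0, loc=1.0, scale=0.3,
                                       size=2000, random_state=rng)  # synthetic RMS data

        mu, sigma = stats.norm.fit(roughness)          # Gaussian: mode equals the mean
        a, loc, scale = stats.skewnorm.fit(roughness)  # asymmetric alternative

        # Locate the skew-normal mode numerically from its fitted density.
        xs = np.linspace(roughness.min(), roughness.max(), 2000)
        mode_skew = xs[np.argmax(stats.skewnorm.pdf(xs, a, loc, scale))]

        print(f"Gaussian mode (mean): {mu:.3f}")
        print(f"Skew-normal mode:     {mode_skew:.3f}")

    For right-skewed data the mode sits below the mean, which reproduces the abstract's point that a Gaussian fit overestimates the most probable RMS.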

  2. Probability density function modeling for sub-powered interconnects

    NASA Astrophysics Data System (ADS)

    Pater, Flavius; Amaricǎi, Alexandru

    2016-06-01

    This paper proposes three mathematical models for the reliability probability density function of interconnects supplied at sub-threshold voltages: spline curve approximations, Gaussian models, and sine interpolation. The proposed analysis aims at determining the most appropriate fit for the switching delay versus probability of correct switching for sub-powered interconnects. We compare the three mathematical models with Monte Carlo simulations of interconnects for 45 nm CMOS technology supplied at 0.25 V.

  3. Learning English, Learning Science

    ERIC Educational Resources Information Center

    Nelson, Virginia

    2010-01-01

    Using science notebooks effectively in the classroom can encourage students who are learning English to keep up and keep interested. English language proficiency might head the list of content areas that schools can teach properly and effectively through science. Amaral, Garrison, and Klentschy (2002) reported that a successful inquiry-based…

  4. Learning to Love Learning

    ERIC Educational Resources Information Center

    Castleman, Ben; Littky, Dennis

    2007-01-01

    Too often, teachers in public schools do not have the time to get to know their students or tailor their instruction to students' interests. As a result, many students lose interest in school. The Met School, a public high school in Providence, Rhode Island, is designed to help students enjoy school while learning real-world skills. Castleman and…

  5. Learning How to Learn.

    ERIC Educational Resources Information Center

    Novak, Joseph D.; Gowin, D. Bob

    This eight-chapter book clearly presents a theory of how children learn and, therefore, how teachers and others can help children think about science as well as other topics. Its ideas and techniques may be adopted for preschoolers when objects are conceptually ordered, or for theoretical physicists when findings are conceptually organized. In…

  6. Probability of incipient spanning clusters in critical square bond percolation

    SciTech Connect

    Shchur, L.N.; Kosyakov, S.S.

    1997-06-01

    The probability of simultaneous occurrence of at least k spanning clusters has been studied by Monte Carlo simulations on the 2D square lattice with free boundaries at the bond percolation threshold p_c = 1/2. It is found that the probabilities of k and more Incipient Spanning Clusters (ISC) have the values P(k > 1) ≈ 0.00658(3) and P(k > 2) ≈ 0.00000148(21), provided that the limit of these probabilities for the infinite lattice exists. The probability P(k > 3) of more than three ISC can be estimated to be of the order of 10^-11, which is beyond what can be computed with present-day computers, so it is impossible to check in simulations the Aizenman law for the probabilities when k ≫ 1. We have detected a single sample with four ISC in a total of about 10^10 samples investigated, a relative frequency of roughly 10^-10. The influence of boundary conditions is discussed in the last section.
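
    A toy version of the kind of simulation described, testing left-right spanning on an L x L site grid with open bonds at p = 1/2 (union-find connectivity; illustrative code, not the authors'):

        import random

        def spans(L, p=0.5, rng=random):
            """True if open bonds connect the left edge to the right edge."""
            parent = list(range(L * L))

            def find(i):
                while parent[i] != i:
                    parent[i] = parent[parent[i]]   # path halving
                    i = parent[i]
                return i

            def union(i, j):
                ri, rj = find(i), find(j)
                if ri != rj:
                    parent[ri] = rj

            for y in range(L):
                for x in range(L):
                    if x + 1 < L and rng.random() < p:   # horizontal bond open
                        union(y * L + x, y * L + x + 1)
                    if y + 1 < L and rng.random() < p:   # vertical bond open
                        union(y * L + x, (y + 1) * L + x)

            left = {find(y * L) for y in range(L)}
            return any(find(y * L + L - 1) in left for y in range(L))

        trials = 2000
        print(sum(spans(64) for _ in range(trials)) / trials)  # near 1/2 at criticality

    Counting simultaneous disjoint spanning clusters, as in the paper, requires labeling all clusters rather than testing a single crossing, but the sampling loop has the same shape.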

  7. Revising probability estimates: Why increasing likelihood means increasing impact.

    PubMed

    Maglio, Sam J; Polman, Evan

    2016-08-01

    Forecasted probabilities rarely stay the same for long. Instead, they are subject to constant revision; as they move upward or downward, uncertain events become more or less likely. Yet little is known about how people interpret probability estimates beyond static snapshots, like a 30% chance of rain. Here, we consider the cognitive, affective, and behavioral consequences of revisions to probability forecasts. Stemming from a lay belief that revisions signal the emergence of a trend, we find in 10 studies (comprising uncertain events such as weather, climate change, sex, sports, and wine) that upward changes to event probability (e.g., increasing from 20% to 30%) cause events to feel less remote than downward changes (e.g., decreasing from 40% to 30%), and subsequently change people's behavior regarding those events despite the revised event probabilities being the same. Our research sheds light on how revising the probabilities of future events changes how people manage those uncertain events. PMID:27281350

  8. An improved probability mapping approach to assess genome mosaicism

    PubMed Central

    Zhaxybayeva, Olga; Gogarten, J Peter

    2003-01-01

    Background Maximum likelihood and posterior probability mapping are useful visualization techniques that are used to ascertain the mosaic nature of prokaryotic genomes. However, posterior probabilities, especially when calculated for four-taxon cases, tend to overestimate the support for tree topologies. Furthermore, because of poor taxon sampling, four-taxon analyses suffer from sensitivity to the long branch attraction artifact. Here we extend the probability mapping approach by improving taxon sampling of the analyzed datasets, and by using bootstrap support values, a more conservative tool to assess reliability. Results Quartets of orthologous proteins were complemented with homologs from selected reference genomes. The mapping of bootstrap support values from these extended datasets gives results similar to the original maximum likelihood and posterior probability mapping. The more conservative nature of the plotted support values allows further analyses to focus on those protein families that strongly disagree with the majority or plurality of genes present in the analyzed genomes. Conclusion Posterior probability is a non-conservative measure of support, and posterior probability mapping only provides a quick estimate of the phylogenetic information content of four genomes. This approach can be utilized as a pre-screen to select genes that might have been horizontally transferred. Better taxon sampling combined with subtree analyses prevents the inconsistencies associated with four-taxon analyses, but retains the power of visual representation. Nevertheless, a case-by-case inspection of individual multi-taxon phylogenies remains necessary to differentiate unrecognized paralogy and shared phylogenetic reconstruction artifacts from horizontal gene transfer events. PMID:12974984

  9. Rapidly assessing the probability of exceptionally high natural hazard losses

    NASA Astrophysics Data System (ADS)

    Gollini, Isabella; Rougier, Jonathan

    2014-05-01

    One of the objectives in catastrophe modeling is to assess the probability distribution of losses for a specified period, such as a year. From the point of view of an insurance company, the whole of the loss distribution is interesting, and valuable in determining insurance premiums. But the shape of the right-hand tail is critical, because it impinges on the solvency of the company. A simple measure of the risk of insolvency is the probability that the annual loss will exceed the company's current operating capital. Imposing an upper limit on this probability is one of the objectives of the EU Solvency II directive. If a probabilistic model is supplied for the loss process, then this tail probability can be computed, either directly or by simulation. This can be a lengthy calculation for complex losses. Given the inevitably subjective nature of quantifying loss distributions, computational resources might be better used in a sensitivity analysis. This requires either a quick approximation to the tail probability or an upper bound on the probability, ideally a tight one. We present several different bounds, all of which can be computed nearly instantly from a very general event loss table. We provide a numerical illustration, and discuss the conditions under which the bound is tight. Although we consider the perspective of insurance and reinsurance companies, exactly the same issues concern the risk manager, who is typically very sensitive to large losses.
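
    A sketch of the direct simulation that such bounds shortcut: draw annual event occurrences from a toy event loss table and estimate the probability that the annual loss exceeds the operating capital. The table, rates, and capital are invented:

        import random

        # Toy event loss table: (annual Poisson rate, loss if the event occurs).
        ELT = [(0.10, 5.0), (0.02, 40.0), (0.005, 250.0)]

        def tail_probability(capital=100.0, n_years=200_000, seed=7):
            """Monte Carlo estimate of P(annual loss > capital)."""
            rng = random.Random(seed)
            exceed = 0
            for _ in range(n_years):
                loss = 0.0
                for rate, severity in ELT:
                    t = rng.expovariate(rate)      # Poisson arrivals via
                    while t < 1.0:                 # exponential inter-event gaps
                        loss += severity
                        t += rng.expovariate(rate)
                if loss > capital:
                    exceed += 1
            return exceed / n_years

        print(tail_probability())

    The attraction of a closed-form upper bound is that it replaces this inner loop entirely, which matters when the loss model must be rerun many times inside a sensitivity analysis.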

  10. Estimating Second Order Probability Beliefs from Subjective Survival Data

    PubMed Central

    Hudomiet, Péter; Willis, Robert J.

    2013-01-01

    Based on subjective survival probability questions in the Health and Retirement Study (HRS), we use an econometric model to estimate the determinants of individual-level uncertainty about personal longevity. This model is built around the modal response hypothesis (MRH), a mathematical expression of the idea that survey responses of 0%, 50%, or 100% to probability questions indicate a high level of uncertainty about the relevant probability. We show that subjective survival expectations in 2002 line up very well with realized mortality of the HRS respondents between 2002 and 2010. We show that the MRH model performs better than typically used models in the literature of subjective probabilities. Our model gives more accurate estimates of low probability events and it is able to predict the unusually high fraction of focal 0%, 50%, and 100% answers observed in many data sets on subjective probabilities. We show that subjects place too much weight on parents’ age at death when forming expectations about their own longevity, whereas other covariates such as demographics, cognition, personality, subjective health, and health behavior are underweighted. We also find that less educated people, smokers, and women have less certain beliefs, and recent health shocks increase uncertainty about survival, too. PMID:24403866

  11. Anticipating abrupt shifts in temporal evolution of probability of eruption

    NASA Astrophysics Data System (ADS)

    Rohmer, Jeremy; Loschetter, Annick

    2016-04-01

    Estimating the probability of eruption by jointly accounting for different sources of monitoring parameters over time is a key component of volcano risk management. In the present study, we are interested in the transition from a state of low-to-moderate probability to one of high probability: the latter generally supports the call for evacuation. Using the data of the MESIMEX exercise at the Vesuvius volcano, we investigated the potential of time-varying indicators, related to the correlation structure or to the variability of the probability time series, for detecting this critical transition in advance. We found that changes in the power spectra and in the standard deviation estimated over a rolling time window both present an abrupt increase, which marks the approaching shift. Our numerical experiments revealed that the transition from an eruption probability of 10-15% to >70% could be identified up to 4 hours in advance, ~2.5 days before the evacuation call (decided for an eruption probability >80% during the MESIMEX exercise). This additional lead time could be useful to place different key services (e.g., emergency services for vulnerable groups, commandeering additional transportation means, etc.) on a higher level of alert before the actual call for evacuation.
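
    A sketch of the variance-based indicator mentioned: the standard deviation of the probability series over a trailing window, with an alarm when it jumps well above its own baseline. The window length and jump factor are arbitrary choices, not the study's calibrated values:

        import numpy as np

        def rolling_std(series, window=48):
            """Trailing-window standard deviation, one value per time step."""
            s = np.asarray(series, dtype=float)
            return np.array([s[max(0, i - window + 1): i + 1].std()
                             for i in range(len(s))])

        def first_alarm(series, window=48, jump=3.0):
            """Index where the rolling std first exceeds `jump` times its median.

            Assumes the series is longer than `window`.
            """
            r = rolling_std(series, window)
            baseline = float(np.median(r[window:])) or 1e-12
            hits = np.nonzero(r > jump * baseline)[0]
            return int(hits[0]) if hits.size else None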

  12. Probability fields revisited in the context of ensemble Kalman filtering

    NASA Astrophysics Data System (ADS)

    Xu, Teng; Gómez-Hernández, J. Jaime

    2015-12-01

    Hu et al. (2013) proposed an approach to update complex geological facies models generated by multiple-point geostatistical simulation while keeping geological and statistical consistency. Their approach is based on mapping the facies realization onto the spatially uncorrelated uniform random numbers used by the sequential multiple-point simulation to generate the facies realization itself. The ensemble Kalman filter was then used to update the uniform random number realizations, which were then used to generate a new facies realization by multiple-point simulation. This approach does not perform well, which we attribute to the fact that, the probabilities being random and spatially uncorrelated, their correlation with the state variable (piezometric heads) is very weak, so the Kalman gain is always small. The approach is reminiscent of probability field simulation, which also maps the conductivity realizations onto a field of uniform random numbers, although the mapping there is done using the local conditional distribution functions built from a prior statistical model and the conditioning data. Contrary to the approach of Hu et al. (2013), this field of uniform random numbers, termed a probability field, displays spatial patterns related to the conductivity spatial patterns, and therefore the correlation between probabilities and state variable is as strong as the correlation between conductivities and state variable could be. Similarly to Hu et al. (2013), we propose to use the ensemble Kalman filter to update the probability fields, and show that the existence of this correlation between probability values and state variables provides better results.
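
    For reference, the analysis step used to update the (transformed) probability fields is the generic stochastic ensemble Kalman filter update sketched below; this is not the authors' full multiple-point workflow, and the Gaussian transform of the uniform scores is left implicit:

        import numpy as np

        def enkf_update(X, Y, d, obs_std, rng):
            """Stochastic EnKF analysis step.

            X: (n_params, n_ens) ensemble of transformed probability-field values
            Y: (n_obs, n_ens) simulated observations (e.g., piezometric heads)
            d: (n_obs,) observed data; obs_std: observation noise std.
            """
            n_ens = X.shape[1]
            Xa = X - X.mean(axis=1, keepdims=True)
            Ya = Y - Y.mean(axis=1, keepdims=True)
            C_xy = Xa @ Ya.T / (n_ens - 1)     # parameter-observation covariance
            C_yy = Ya @ Ya.T / (n_ens - 1) + obs_std ** 2 * np.eye(len(d))
            # Perturbed-observation innovations, one column per ensemble member.
            innov = d[:, None] + obs_std * rng.standard_normal(Y.shape) - Y
            return X + C_xy @ np.linalg.solve(C_yy, innov)

    The paper's diagnosis lives in C_xy: if the mapped probabilities are spatially uncorrelated noise, C_xy is near zero and the update does almost nothing, whereas structured probability fields give a usable gain.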

  13. Probability measurements characterizing the classicality of a physical system

    NASA Astrophysics Data System (ADS)

    Dorninger, Dietmar; Länger, Helmut

    2014-02-01

    Let S be a set of states of a physical system. The probabilities p(s) of the occurrence of an event when the system is in different states s ∈ S define a function from S to [0,1] called a multidimensional probability. When appropriately structured with respect to the order, complements, and sums of functions, sets P of multidimensional probabilities give rise to so-called algebras of S-probabilities, which, in the case of classical physical systems, are Boolean algebras. Knowing only a (small) subset X of P, and not the whole of P, the question arises whether the functions of X indicate that one deals with a classical physical system or not. We show that this question can be settled by (experimentally) finding further multidimensional probabilities which are terms of the given ones and can be precalculated by a recursive procedure depending on the number of elements of X. Our main tool for this procedure is a characterization of commuting pairs of multidimensional probabilities.

  14. Probability analysis of position errors using uncooled IR stereo camera

    NASA Astrophysics Data System (ADS)

    Oh, Jun Ho; Lee, Sang Hwa; Lee, Boo Hwan; Park, Jong-Il

    2016-05-01

    This paper analyzes the random behavior of 3D positions when tracking moving objects with an infrared (IR) stereo camera, and proposes a probability model for the 3D positions. The proposed probability model integrates two random error phenomena. One is the pixel quantization error, which is caused by the discrete sampling pixels used in estimating the disparity values of the stereo camera. The other is the timing jitter, which results from the irregular acquisition timing of uncooled IR cameras. This paper derives a probability distribution function by combining the jitter model with the pixel quantization error. To verify the proposed probability function of 3D positions, experiments on tracking fast moving objects are performed using an IR stereo camera system. The 3D depths of the moving object are estimated by stereo matching and compared with the ground truth obtained by a laser scanner system. According to the experiments, the 3D depths of the moving object are estimated within the statistically reliable range derived from the proposed probability distribution. It is expected that the proposed probability model of 3D positions can be applied to various IR stereo camera systems that deal with fast moving objects.
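
    For context, the pixel-quantization half of such a model follows from textbook stereo geometry (this is the standard relation, not necessarily the paper's exact formulation). In LaTeX, with focal length f, baseline B, and disparity d:

        Z = \frac{fB}{d}, \qquad
        \Delta Z \approx \left| \frac{\partial Z}{\partial d} \right| \Delta d
                 = \frac{fB}{d^{2}} \, \Delta d
                 = \frac{Z^{2}}{fB} \, \Delta d

    A one-pixel disparity step therefore produces a depth error that grows quadratically with range; the paper's model additionally combines this quantization term with the timing-jitter error of the uncooled sensor.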

  15. On learning dynamics underlying the evolution of learning rules.

    PubMed

    Dridi, Slimane; Lehmann, Laurent

    2014-02-01

    In order to understand the development of non-genetically encoded actions during an animal's lifespan, it is necessary to analyze the dynamics and evolution of learning rules producing behavior. Owing to the intrinsic stochastic and frequency-dependent nature of learning dynamics, these rules are often studied in evolutionary biology via agent-based computer simulations. In this paper, we show that stochastic approximation theory can help to qualitatively understand learning dynamics and formulate analytical models for the evolution of learning rules. We consider a population of individuals repeatedly interacting during their lifespan, and where the stage game faced by the individuals fluctuates according to an environmental stochastic process. Individuals adjust their behavioral actions according to learning rules belonging to the class of experience-weighted attraction learning mechanisms, which includes standard reinforcement and Bayesian learning as special cases. We use stochastic approximation theory in order to derive differential equations governing action play probabilities, which turn out to have qualitative features of mutator-selection equations. We then perform agent-based simulations to find the conditions where the deterministic approximation is closest to the original stochastic learning process for standard 2-action 2-player fluctuating games, where interaction between learning rules and preference reversal may occur. Finally, we analyze a simplified model for the evolution of learning in a producer-scrounger game, which shows that the exploration rate can interact in a non-intuitive way with other features of co-evolving learning rules. Overall, our analyses illustrate the usefulness of applying stochastic approximation theory in the study of animal learning. PMID:24055617
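
    A compact sketch of the experience-weighted attraction update named in the abstract, in the standard Camerer-Ho parameterization (phi, rho, delta, lambda); the payoff inputs are placeholders:

        import math

        def ewa_step(A, N, chosen, payoffs, phi=0.9, rho=0.9, delta=0.5):
            """One EWA update.

            A: attractions per action; N: experience weight;
            chosen: index of the action actually played;
            payoffs: payoff each action would have earned this round.
            """
            N_new = rho * N + 1.0
            A_new = [(phi * N * A[j]
                      + (delta + (1.0 - delta) * (j == chosen)) * payoffs[j]) / N_new
                     for j in range(len(A))]
            return A_new, N_new

        def choice_probs(A, lam=2.0):
            """Logit choice rule over attractions."""
            m = max(A)
            w = [math.exp(lam * (a - m)) for a in A]
            return [x / sum(w) for x in w]

    Setting delta = 0 recovers a reinforcement-style rule that reinforces only the chosen action, one of the special cases the abstract mentions; iterating these maps is what the stochastic approximation analysis averages into differential equations for the choice probabilities.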

  16. More than Just Finding Color: Strategy in Global Visual Search Is Shaped by Learned Target Probabilities

    ERIC Educational Resources Information Center

    Williams, Carrick C.; Pollatsek, Alexander; Cave, Kyle R.; Stroud, Michael J.

    2009-01-01

    In 2 experiments, eye movements were examined during searches in which elements were grouped into four 9-item clusters. The target (a red or blue "T") was known in advance, and each cluster contained different numbers of target-color elements. Rather than color composition of a cluster invariantly guiding the order of search though clusters, the…

  17. Probability Learning as a Function of Age, Sex, and Type of Constraint

    ERIC Educational Resources Information Center

    Pecan, Erene V.; Schvaneveldt, Roger W.

    1970-01-01

    Higher levels of predicting the more frequent event were achieved with males than females; with the contingent than the noncontingent situation; and with adult males than boys in the noncontingent situation. Females were more likely to repeat an incorrect prediction. (MH)

  18. Transition Probabilities of the Rare Earth Neutral Lanthanum

    NASA Astrophysics Data System (ADS)

    Palmer, Andria; Lawler, James E.; Den Hartog, Elizabeth

    2015-01-01

    In continuation of a long-standing project to measure transition probabilities for rare earth elements, La I is currently being studied. Transition probabilities of the rare earths and other elements are determined in order to assist astronomers in making stellar spectroscopy more quantitative. Atomic spectroscopy is a key tool for astronomers, as it provides nearly all the details about the physics and chemistry of the universe outside of our solar system. Rare earth elements tend to have complex electronic structure due to their open 4f, 5d, 6s, and 6p shells. This leads to a rich spectrum throughout the ultraviolet, visible, and near-infrared, making them very accessible elements for study in stellar photospheric spectra. A transition probability is the probability per unit time for a transition to occur between an upper level and a lower level. Transition probabilities are measured using the well-established technique of time-resolved laser-induced fluorescence to determine the radiative lifetime of each upper level, combined with branching fractions measured using a 1 m high-resolution Fourier Transform Spectrometer. Radiative lifetimes for ~70 upper levels of neutral La, along with their associated branching fractions, will be reported, resulting in the determination of several hundred new transition probabilities. These transition probabilities will assist astronomers in analyzing the chemical compositions of older, cooler stars, which give insight into the origins of the chemical elements. This work was supported by NSF grant AST-1211055 (JEL & EDH) and by the NSF REU program (AJP).
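
    The combination step described here reduces to a simple worked relation (standard in this literature): the lifetime fixes the total decay rate of an upper level, and each branching fraction apportions it among the lines. In LaTeX:

        A_{ul} = \frac{\mathrm{BF}_{ul}}{\tau_u}, \qquad \sum_{l} \mathrm{BF}_{ul} = 1

    where A_{ul} is the transition probability (Einstein A coefficient) of the line from upper level u to lower level l, BF_{ul} is its measured branching fraction, and tau_u is the radiative lifetime of level u from laser-induced fluorescence.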

  19. On the probability of cure for heavy-ion radiotherapy

    NASA Astrophysics Data System (ADS)

    Hanin, Leonid; Zaider, Marco

    2014-07-01

    The probability of a cure in radiation therapy (RT), viewed as the probability of eventual extinction of all cancer cells, is unobservable, and the only way to compute it is by modeling the dynamics of the cancer cell population during and after treatment. The conundrum at the heart of biophysical models aimed at such prospective calculations is the absence of information on the initial size of the subpopulation of clonogenic cancer cells (also called stem-like cancer cells), which largely determines the outcome of RT in both individual and population settings. Other relevant parameters (e.g. potential doubling time, cell loss factor and survival probability as a function of dose) are, at least in principle, amenable to empirical determination. In this article we demonstrate that, for heavy-ion RT, microdosimetric considerations (justifiably ignored in conventional RT) combined with an expression for the clone extinction probability obtained from a mechanistic model of radiation cell survival lead to useful upper bounds on the size of the pre-treatment population of clonogenic cancer cells, as well as upper and lower bounds on the cure probability. The main practical impact of these limiting values is the ability to predict the probability of a cure for a given population of patients treated with newer, still unexplored treatment modalities from the empirically determined probability of a cure for the same or a similar population under conventional low linear energy transfer (typically photon/electron) RT. We also propose that the current trend to deliver a lower total dose in a smaller number of fractions, with larger-than-conventional doses per fraction, has physical limits that must be understood before embarking on a particular treatment schedule.
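
    For orientation, the simplest extinction-based cure model is the standard Poisson TCP expression, which the mechanistic model discussed here refines rather than reproduces exactly. In LaTeX:

        P_{\text{cure}} = \exp\!\big( -N_0 \, S(D) \big)

    where N_0 is the initial number of clonogenic cells and S(D) their survival probability after dose D. The bounds described in the abstract act on the unobservable N_0, which is exactly why bounding it translates into bounds on the cure probability.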

  20. Learn, how to learn

    NASA Astrophysics Data System (ADS)

    Narayanan, M.

    2002-12-01

    Ernest L. Boyer, in his 1990 book "Scholarship Reconsidered: Priorities of the Professoriate", cites some groundbreaking studies and offers a new paradigm that recognizes the growing conversation about teaching, scholarship, and research in the universities. The 'ACORN' model suggested by Hawkins and Winter for conquering and mastering change may offer some helpful hints for the novice professor whose primary objective is to teach students to 'learn how to learn'. Action: it is possible to effectively change things only when a teaching professor actually tries out a new idea. Communication: changes succeed only when the new ideas are effectively communicated and implemented. Ownership: support for change is extremely important and critical; only strong commitment to accepting changes demonstrates genuine leadership. Reflection: feedback supports thoughtful evaluation of the changes implemented; only reflection can provide a tool for continuous improvement. Nurture: implemented changes deliver results only when nurtured and promoted with the necessary support systems, documentation, and infrastructure. Inspired by the ACORN model, the author experimented with implementing certain principles of Total Quality Management in the classroom. The author believes that observing the following twenty principles would help students learn how to learn on their own, toward the goal of lifelong learning. The author uses the acronym QUOTES (Quality Underscored On Teaching Excellence Strategy) to describe his methods for improving classroom teacher-learner participation. 1. Break down all barriers. 2. Create consistency of purpose with a plan. 3. Adopt the new philosophy of quality. 4. Establish high standards. 5. Establish targets/goals. 6. Reduce dependence on lectures. 7. Employ modern methods. 8. Control the process. 9. Organize to reach goals. 10. Prevention vs. correction. 11. Periodic improvements. 12