Arons, Alexander M M; Krabbe, Paul F M
2013-02-01
Interest is rising in measuring subjective health outcomes, such as treatment outcomes that are not directly quantifiable (functional disability, symptoms, complaints, side effects and health-related quality of life). Health economists in particular have applied probabilistic choice models in the area of health evaluation. They increasingly use discrete choice models based on random utility theory to derive values for healthcare goods or services. Recent attempts have been made to use discrete choice models as an alternative method to derive values for health states. In this article, various probabilistic choice models are described according to their underlying theory. A historical overview traces their development and applications in diverse fields. The discussion highlights theoretical and technical aspects of the choice models and their similarities and dissimilarities. The objective of the article is to elucidate the position of each model and its applications to health-state valuation.
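The discrete choice models based on random utility theory mentioned above typically reduce, under i.i.d. Gumbel error assumptions, to multinomial logit choice probabilities. The sketch below is a minimal illustration of that standard form; the utility values are arbitrary and not taken from any of the cited studies.

```python
import numpy as np

def logit_choice_probabilities(utilities):
    """Multinomial logit choice probabilities from deterministic utilities.

    Under random utility theory, alternative i is chosen when
    U_i = V_i + eps_i is largest; i.i.d. Gumbel errors give the
    closed-form logit probabilities computed here.
    """
    v = np.asarray(utilities, dtype=float)
    v = v - v.max()                 # subtract max for numerical stability
    expv = np.exp(v)
    return expv / expv.sum()

# Hypothetical example: deterministic utilities of three health states
print(logit_choice_probabilities([1.2, 0.4, -0.3]))
```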
The probabilistic nature of preferential choice.
Rieskamp, Jörg
2008-11-01
Previous research has developed a variety of theories explaining when and why people's decisions under risk deviate from the standard economic view of expected utility maximization. These theories are limited in their predictive accuracy in that they do not explain the probabilistic nature of preferential choice, that is, why an individual makes different choices in nearly identical situations, or why the magnitude of these inconsistencies varies in different situations. To illustrate the advantage of probabilistic theories, three probabilistic theories of decision making under risk are compared with their deterministic counterparts. The probabilistic theories are (a) a probabilistic version of a simple choice heuristic, (b) a probabilistic version of cumulative prospect theory, and (c) decision field theory. By testing the theories with the data from three experimental studies, the superiority of the probabilistic models over their deterministic counterparts in predicting people's decisions under risk becomes evident. When testing the probabilistic theories against each other, decision field theory provides the best account of the observed behavior.
DOT National Transportation Integrated Search
2015-12-01
This study adopts a dwelling unit level of analysis and considers a probabilistic choice set generation approach for residential choice modeling. In doing so, we accommodate the fact that housing choices involve both characteristics of the dwelling u...
Relative Gains, Losses, and Reference Points in Probabilistic Choice in Rats
Marshall, Andrew T.; Kirkpatrick, Kimberly
2015-01-01
Theoretical reference points have been proposed to differentiate probabilistic gains from probabilistic losses in humans, but such a phenomenon in non-human animals has yet to be thoroughly elucidated. Three experiments evaluated the effect of reward magnitude on probabilistic choice in rats, seeking to determine reference point use by examining the effect of previous outcome magnitude(s) on subsequent choice behavior. Rats were trained to choose between an outcome that always delivered reward (low-uncertainty choice) and one that probabilistically delivered reward (high-uncertainty). The probability of high-uncertainty outcome receipt and the magnitudes of low-uncertainty and high-uncertainty outcomes were manipulated within and between experiments. Both the low- and high-uncertainty outcomes involved variable reward magnitudes, so that either a smaller or larger magnitude was probabilistically delivered, as well as reward omission following high-uncertainty choices. In Experiments 1 and 2, the between groups factor was the magnitude of the high-uncertainty-smaller (H-S) and high-uncertainty-larger (H-L) outcome, respectively. The H-S magnitude manipulation differentiated the groups, while the H-L magnitude manipulation did not. Experiment 3 showed that manipulating the probability of differential losses as well as the expected value of the low-uncertainty choice produced systematic effects on choice behavior. The results suggest that the reference point for probabilistic gains and losses was the expected value of the low-uncertainty choice. Current theories of probabilistic choice behavior have difficulty accounting for the present results, so an integrated theoretical framework is proposed. Overall, the present results have implications for understanding individual differences and corresponding underlying mechanisms of probabilistic choice behavior. PMID:25658448
Probabilistic reversal learning is impaired in Parkinson's disease
Peterson, David A.; Elliott, Christian; Song, David D.; Makeig, Scott; Sejnowski, Terrence J.; Poizner, Howard
2009-01-01
In many everyday settings, the relationship between our choices and their potentially rewarding outcomes is probabilistic and dynamic. In addition, the difficulty of the choices can vary widely. Although a large body of theoretical and empirical evidence suggests that dopamine mediates rewarded learning, the influence of dopamine in probabilistic and dynamic rewarded learning remains unclear. We adapted a probabilistic rewarded learning task originally used to study firing rates of dopamine cells in primate substantia nigra pars compacta (Morris et al. 2006) for use as a reversal learning task with humans. We sought to investigate how the dopamine depletion in Parkinson's disease (PD) affects probabilistic reward learning and adaptation to a reversal in reward contingencies. Over the course of 256 trials subjects learned to choose the more favorable from among pairs of images with small or large differences in reward probabilities. During a subsequent otherwise identical reversal phase, the reward probability contingencies for the stimuli were reversed. Seventeen Parkinson's disease (PD) patients of mild to moderate severity were studied off of their dopaminergic medications and compared to 15 age-matched controls. Compared to controls, PD patients had distinct pre- and post-reversal deficiencies depending upon the difficulty of the choices they had to learn. The patients also exhibited compromised adaptability to the reversal. A computational model of the subjects’ trial-by-trial choices demonstrated that the adaptability was sensitive to the gain with which patients weighted pre-reversal feedback. Collectively, the results implicate the nigral dopaminergic system in learning to make choices in environments with probabilistic and dynamic reward contingencies. PMID:19628022
NASA Astrophysics Data System (ADS)
Ershov, Egor; Karnaukhov, Victor; Mozerov, Mikhail
2016-02-01
Two consecutive frames of a lateral navigation camera video sequence can be considered as an appropriate approximation to epipolar stereo. To overcome edge-aware inaccuracy caused by occlusion, we propose a model that matches the current frame to the next and to the previous ones. The positive disparity of matching to the previous frame has its symmetric negative disparity to the next frame. The proposed algorithm performs a probabilistic choice for each matched pixel between the positive disparity and its symmetric disparity cost. A disparity map obtained by optimization over the cost volume composed of the proposed probabilistic choice is more accurate than the traditional left-to-right and right-to-left disparity maps cross-check. Also, our algorithm requires half as many computational operations per pixel as the cross-check technique. The effectiveness of our approach is demonstrated on synthetic data and on real video sequences with ground-truth values.
Vanderveldt, Ariana; Green, Leonard; Myerson, Joel
2014-01-01
The value of an outcome is affected both by the delay until its receipt (delay discounting) and by the likelihood of its receipt (probability discounting). Despite being well-described by the same hyperboloid function, delay and probability discounting involve fundamentally different processes, as revealed, for example, by the differential effects of reward amount. Previous research has focused on the discounting of delayed and probabilistic rewards separately, with little research examining more complex situations in which rewards are both delayed and probabilistic. In two experiments, participants made choices between smaller rewards that were both immediate and certain and larger rewards that were both delayed and probabilistic. Analyses revealed significant interactions between delay and probability factors inconsistent with an additive model. In contrast, a hyperboloid discounting model in which delay and probability were combined multiplicatively provided an excellent fit to the data. These results suggest that the hyperboloid is a good descriptor of decision making in complicated monetary choice situations like those people encounter in everyday life. PMID:24933696
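As described above, a multiplicative hyperboloid model fit the combined delay-and-probability data well. The sketch below shows one common parameterization of that idea (delay factor times odds-against factor); the discounting parameters, exponents, and the example amounts are illustrative, not estimates from the study.

```python
import numpy as np

def discounted_value(amount, delay, prob, k=0.05, s=1.0, h=1.5, t=1.0):
    """Multiplicative hyperboloid discounting of a delayed, probabilistic reward.

    delay factor:       1 / (1 + k*D)**s
    probability factor: 1 / (1 + h*theta)**t, with theta = (1 - p) / p
    The two factors are combined multiplicatively, consistent with the
    interaction results described in the abstract.
    """
    theta = (1.0 - prob) / prob                   # odds against receiving the reward
    delay_factor = 1.0 / (1.0 + k * delay) ** s
    prob_factor = 1.0 / (1.0 + h * theta) ** t
    return amount * delay_factor * prob_factor

# Hypothetical choice: $100 in 30 days with a 70% chance vs. $40 now for certain
print(discounted_value(100, 30, 0.7), discounted_value(40, 0, 1.0))
```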
A Probabilistic Strategy for Understanding Action Selection
Kim, Byounghoon; Basso, Michele A.
2010-01-01
Brain regions involved in transforming sensory signals into movement commands are the likely sites where decisions are formed. Once formed, a decision must be read-out from the activity of populations of neurons to produce a choice of action. How this occurs remains unresolved. We recorded from four superior colliculus (SC) neurons simultaneously while monkeys performed a target selection task. We implemented three models to gain insight into the computational principles underlying population coding of action selection. We compared the population vector average (PVA), winner-takes-all (WTA) and a Bayesian model, maximum a posteriori estimate (MAP) to determine which predicted choices most often. The probabilistic model predicted more trials correctly than both the WTA and the PVA. The MAP model predicted 81.88% whereas WTA predicted 71.11% and PVA/OLE predicted the least number of trials at 55.71 and 69.47%. Recovering MAP estimates using simulated, non-uniform priors that correlated with monkeys’ choice performance, improved the accuracy of the model by 2.88%. A dynamic analysis revealed that the MAP estimate evolved over time and the posterior probability of the saccade choice reached a maximum at the time of the saccade. MAP estimates also scaled with choice performance accuracy. Although there was overlap in the prediction abilities of all the models, we conclude that movement choice from populations of neurons may be best understood by considering frameworks based on probability. PMID:20147560
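The PVA, WTA, and MAP decoders compared above can be illustrated on simulated data. The sketch below uses fabricated Gaussian tuning curves over target direction and independent Poisson spiking with a uniform prior for the MAP decoder; none of the tuning parameters or task details correspond to the recorded SC neurons.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical tuning: 8 neurons with Gaussian tuning over target angle (degrees)
prefs = np.arange(0, 360, 45.0)                      # preferred directions

def rates(angle, gain=20.0, width=40.0, base=2.0):
    d = np.abs(((angle - prefs + 180) % 360) - 180)  # circular distance to prefs
    return base + gain * np.exp(-0.5 * (d / width) ** 2)

def decode(counts, candidates=np.arange(0, 360, 1.0)):
    # winner-takes-all: preferred direction of the most active neuron
    wta = prefs[np.argmax(counts)]
    # population vector average: rate-weighted circular mean of preferred directions
    ang = np.deg2rad(prefs)
    pva = np.rad2deg(np.arctan2((counts * np.sin(ang)).sum(),
                                (counts * np.cos(ang)).sum())) % 360
    # MAP: uniform prior, independent Poisson likelihoods over candidate angles
    logpost = np.array([np.sum(counts * np.log(rates(c)) - rates(c))
                        for c in candidates])
    map_est = candidates[np.argmax(logpost)]
    return wta, pva, map_est

true_angle = 90.0
counts = rng.poisson(rates(true_angle))
print(decode(counts))
```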
NASA Astrophysics Data System (ADS)
Teoh, Lay Eng; Khoo, Hooi Ling
2013-09-01
This study deals with two major aspects of airlines, i.e. supply and demand management. The aspect of supply focuses on the mathematical formulation of an optimal fleet management model to maximize operational profit of the airlines, while the aspect of demand focuses on the incorporation of mode choice modeling as part of the developed model. The proposed methodology is outlined in two stages: first, a Fuzzy Analytic Hierarchy Process is adopted to capture mode choice modeling in order to quantify the probability of probable phenomena (for the aircraft acquisition/leasing decision). Then, an optimization model is developed as a probabilistic dynamic programming model to determine the optimal number and types of aircraft to be acquired and/or leased in order to meet stochastic demand during the planning horizon. The findings of an illustrative case study show that the proposed methodology is viable. The results demonstrate that the incorporation of mode choice modeling could affect the operational profit and fleet management decision of the airlines at varying degrees.
ERIC Educational Resources Information Center
Kolodny, Oren; Lotem, Arnon; Edelman, Shimon
2015-01-01
We introduce a set of biologically and computationally motivated design choices for modeling the learning of language, or of other types of sequential, hierarchically structured experience and behavior, and describe an implemented system that conforms to these choices and is capable of unsupervised learning from raw natural-language corpora. Given…
Effects of Time between Trials on Rats' and Pigeons' Choices with Probabilistic Delayed Reinforcers
ERIC Educational Resources Information Center
Mazur, James E.; Biondi, Dawn R.
2011-01-01
Parallel experiments with rats and pigeons examined reasons for previous findings that in choices with probabilistic delayed reinforcers, rats' choices were affected by the time between trials whereas pigeons' choices were not. In both experiments, the animals chose between a standard alternative and an adjusting alternative. A choice of the…
Modeling Spanish Mood Choice in Belief Statements
ERIC Educational Resources Information Center
Robinson, Jason R.
2013-01-01
This work develops a computational methodology new to linguistics that empirically evaluates competing linguistic theories on Spanish verbal mood choice through the use of computational techniques to learn mood and other hidden linguistic features from Spanish belief statements found in corpora. The machine learned probabilistic linguistic models…
The Effects of the Previous Outcome on Probabilistic Choice in Rats
Marshall, Andrew T.; Kirkpatrick, Kimberly
2014-01-01
This study examined the effects of previous outcomes on subsequent choices in a probabilistic-choice task. Twenty-four rats were trained to choose between a certain outcome (1 or 3 pellets) versus an uncertain outcome (3 or 9 pellets), delivered with a probability of .1, .33, .67, or .9 in different phases. Uncertain outcome choices increased with the probability of uncertain food. Additionally, uncertain choices increased with the probability of uncertain food following both certain-choice outcomes and unrewarded uncertain choices. However, following uncertain-choice food outcomes, there was a tendency to choose the uncertain outcome in all cases, indicating that the rats continued to “gamble” after successful uncertain choices, regardless of the overall probability or magnitude of food. A subsequent manipulation, in which the probability of uncertain food varied within each session as a function of the previous uncertain outcome, examined how the previous outcome and probability of uncertain food affected choice in a dynamic environment. Uncertain-choice behavior increased with the probability of uncertain food. The rats exhibited increased sensitivity to probability changes and a greater degree of win–stay/lose–shift behavior than in the static phase. Simulations of two sequential choice models were performed to explore the possible mechanisms of reward value computations. The simulation results supported an exponentially decaying value function that updated as a function of trial (rather than time). These results emphasize the importance of analyzing global and local factors in choice behavior and suggest avenues for the future development of sequential-choice models. PMID:23205915
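An exponentially decaying, trial-based value update of the kind the simulations favored can be written as a simple error-driven rule. The sketch below is a minimal illustration with softmax choice; the learning rate, softmax temperature, and reward magnitudes are made up and do not reproduce the models compared in the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_choices(n_trials=500, p_uncertain=0.33, alpha=0.2, beta=1.0):
    """Trial-by-trial exponentially decaying value updating with softmax choice.

    v_new = v_old + alpha * (reward - v_old), i.e. past outcomes are
    weighted with exponentially decaying weights, updated per trial.
    """
    v = np.array([2.0, 3.0])              # [certain, uncertain] initial values
    choices, rewards = [], []
    for _ in range(n_trials):
        p = np.exp(beta * v) / np.exp(beta * v).sum()
        c = rng.choice(2, p=p)            # 0 = certain, 1 = uncertain
        r = 2.0 if c == 0 else (6.0 if rng.random() < p_uncertain else 0.0)
        v[c] += alpha * (r - v[c])
        choices.append(c)
        rewards.append(r)
    return np.array(choices), np.array(rewards)

choices, rewards = simulate_choices()
# win-stay probability after rewarded uncertain choices
won = (choices[:-1] == 1) & (rewards[:-1] > 0)
print("P(stay | uncertain win):", choices[1:][won].mean())
```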
Heck, Daniel W; Hilbig, Benjamin E; Moshagen, Morten
2017-08-01
Decision strategies explain how people integrate multiple sources of information to make probabilistic inferences. In the past decade, increasingly sophisticated methods have been developed to determine which strategy explains decision behavior best. We extend these efforts to test psychologically more plausible models (i.e., strategies), including a new, probabilistic version of the take-the-best (TTB) heuristic that implements a rank order of error probabilities based on sequential processing. Within a coherent statistical framework, deterministic and probabilistic versions of TTB and other strategies can directly be compared using model selection by minimum description length or the Bayes factor. In an experiment with inferences from given information, only three of 104 participants were best described by the psychologically plausible, probabilistic version of TTB. As in previous studies, most participants were classified as users of weighted-additive, a strategy that integrates all available information and approximates rational decisions. Copyright © 2017 Elsevier Inc. All rights reserved.
The Epistemic Representation of Information Flow Security in Probabilistic Systems
1995-06-01
The new characterization also means that our security criterion is expressible in a simpler logic and model. The report addresses multilevel security for systems that make probabilistic choices (e.g., via a random number generator) during execution; such probabilistic choices are useful in a multilevel security context.
Initial Correction versus Negative Marking in Multiple Choice Examinations
ERIC Educational Resources Information Center
Van Hecke, Tanja
2015-01-01
Optimal assessment tools should measure in a limited time the knowledge of students in a correct and unbiased way. A method for automating the scoring is multiple choice scoring. This article compares scoring methods from a probabilistic point of view by modelling the probability to pass: the number right scoring, the initial correction (IC) and…
Donati, Maria Anna; Panno, Angelo; Chiesi, Francesca; Primi, Caterina
2014-01-01
This study tested the mediating role of probabilistic reasoning ability in the relationship between fluid intelligence and advantageous decision making among adolescents in explicit situations of risk--that is, in contexts in which information on the choice options (gains, losses, and probabilities) was explicitly presented at the beginning of the task. Participants were 282 adolescents attending high school (77% males, mean age = 17.3 years). We first measured fluid intelligence and probabilistic reasoning ability. Then, to measure decision making under explicit conditions of risk, participants performed the Game of Dice Task, in which they have to decide among different alternatives that are explicitly linked to a specific amount of gain or loss and have obvious winning probabilities that are stable over time. Analyses showed a significant positive indirect effect of fluid intelligence on advantageous decision making through probabilistic reasoning ability, which acted as a mediator. Specifically, fluid intelligence may enhance the ability to reason in probabilistic terms, which in turn increases the likelihood of advantageous choices when adolescents are confronted with an explicit decisional context. Findings show that in experimental paradigm settings, adolescents are able to make advantageous decisions using cognitive abilities when faced with decisions under explicit risky conditions. This study suggests that interventions designed to promote probabilistic reasoning, for example by incrementing the mathematical prerequisites necessary to reason in probabilistic terms, may have a positive effect on adolescents' decision-making abilities.
Young children do not succeed in choice tasks that imply evaluating chances.
Girotto, Vittorio; Fontanari, Laura; Gonzalez, Michel; Vallortigara, Giorgio; Blaye, Agnès
2016-07-01
Preverbal infants manifest probabilistic intuitions in their reactions to the outcomes of simple physical processes and in their choices. Their ability conflicts with the evidence that, before the age of about 5 years, children's verbal judgments do not reveal probability understanding. To assess these conflicting results, three studies tested 3- to 5-year-olds on choice tasks on which infants perform successfully. The results showed that children of all age groups made optimal choices in tasks that did not require forming probabilistic expectations. In probabilistic tasks, however, only 5-year-olds made optimal choices. Younger children performed at random and/or were guided by superficial heuristics. These results suggest caution in interpreting infants' ability to evaluate chance, and indicate that the development of this ability may not follow a linear trajectory. Copyright © 2016 Elsevier B.V. All rights reserved.
Perceptual Decision-Making as Probabilistic Inference by Neural Sampling.
Haefner, Ralf M; Berkes, Pietro; Fiser, József
2016-05-04
We address two main challenges facing systems neuroscience today: understanding the nature and function of cortical feedback between sensory areas and of correlated variability. Starting from the old idea of perception as probabilistic inference, we show how to use knowledge of the psychophysical task to make testable predictions for the influence of feedback signals on early sensory representations. Applying our framework to a two-alternative forced choice task paradigm, we can explain multiple empirical findings that have been hard to account for by the traditional feedforward model of sensory processing, including the task dependence of neural response correlations and the diverging time courses of choice probabilities and psychophysical kernels. Our model makes new predictions and characterizes a component of correlated variability that represents task-related information rather than performance-degrading noise. It demonstrates a normative way to integrate sensory and cognitive components into physiologically testable models of perceptual decision-making. Copyright © 2016 Elsevier Inc. All rights reserved.
Probabilistic vs. non-probabilistic approaches to the neurobiology of perceptual decision-making
Drugowitsch, Jan; Pouget, Alexandre
2012-01-01
Optimal binary perceptual decision making requires accumulation of evidence in the form of a probability distribution that specifies the probability of the choices being correct given the evidence so far. Reward rates can then be maximized by stopping the accumulation when the confidence about either option reaches a threshold. Behavioral and neuronal evidence suggests that humans and animals follow such a probabilistic decision strategy, although its neural implementation has yet to be fully characterized. Here we show that diffusion decision models and attractor network models provide an approximation to the optimal strategy only under certain circumstances. In particular, neither model type is sufficiently flexible to encode the reliability of both the momentary and the accumulated evidence, which is a prerequisite for accumulating evidence of time-varying reliability. Probabilistic population codes, in contrast, can encode these quantities and, as a consequence, have the potential to implement the optimal strategy accurately. PMID:22884815
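The diffusion decision models referred to above accumulate noisy evidence to a bound. The sketch below simulates that standard mechanism with arbitrary drift, bound, and noise parameters; it is only a generic illustration, not the optimal strategy or the probabilistic population code discussed in the abstract.

```python
import numpy as np

rng = np.random.default_rng(2)

def ddm_trial(drift=0.8, bound=1.0, noise=1.0, dt=0.002, max_t=5.0):
    """Simulate one diffusion-to-bound decision; returns (choice, RT in s)."""
    x, t = 0.0, 0.0
    while abs(x) < bound and t < max_t:
        x += drift * dt + noise * np.sqrt(dt) * rng.standard_normal()
        t += dt
    return (1 if x >= bound else 0), t

trials = [ddm_trial() for _ in range(1000)]
accuracy = np.mean([c for c, _ in trials])   # drift > 0, so upper bound is "correct"
mean_rt = np.mean([t for _, t in trials])
print(f"accuracy={accuracy:.3f}, mean RT={mean_rt:.3f}s")
```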
The cerebellum and decision making under uncertainty.
Blackwood, Nigel; Ffytche, Dominic; Simmons, Andrew; Bentall, Richard; Murray, Robin; Howard, Robert
2004-06-01
This study aimed to identify the neural basis of probabilistic reasoning, a type of inductive inference that aids decision making under conditions of uncertainty. Eight normal subjects performed two separate two-alternative-choice tasks (the balls in a bottle and personality survey tasks) while undergoing functional magnetic resonance imaging (fMRI). The experimental conditions within each task were chosen so that they differed only in their requirement to make a decision under conditions of uncertainty (probabilistic reasoning and frequency determination required) or under conditions of certainty (frequency determination required). The same visual stimuli and motor responses were used in the experimental conditions. We provide evidence that the neo-cerebellum, in conjunction with the premotor cortex, inferior parietal lobule and medial occipital cortex, mediates the probabilistic inferences that guide decision making under uncertainty. We hypothesise that the neo-cerebellum constructs internal working models of uncertain events in the external world, and that such probabilistic models subserve the predictive capacity central to induction. Copyright 2004 Elsevier B.V.
A decision network account of reasoning about other people's choices
Jern, Alan; Kemp, Charles
2015-01-01
The ability to predict and reason about other people's choices is fundamental to social interaction. We propose that people reason about other people's choices using mental models that are similar to decision networks. Decision networks are extensions of Bayesian networks that incorporate the idea that choices are made in order to achieve goals. In our first experiment, we explore how people predict the choices of others. Our remaining three experiments explore how people infer the goals and knowledge of others by observing the choices that they make. We show that decision networks account for our data better than alternative computational accounts that do not incorporate the notion of goal-directed choice or that do not rely on probabilistic inference. PMID:26010559
The evolution of mate choice: a dialogue between theory and experiment.
Roff, Derek A
2015-12-01
Research on the evolution of mate choice has followed three avenues of investigation: (1) theoretical models of the evolution of preference and the preferred trait; (2) proposed models of mate choice; and (3) experiments and observations on mate choice, both in the laboratory and with free-ranging animals. However, there has been relatively little dialogue among these three areas. Most attempts to account for observations of mate choice using theoretical mate-choice models have focused only upon a subset of particular models and have generally failed to consider the difference between probabilistic and deterministic models. In this review, I outline the underlying reasoning of the commonly cited mate-choice models and review the conclusions of the empirical investigations. I present a brief outline of how one might go about testing these models. It remains uncertain if, in general, mate-choice models can be realistically analyzed. Although it is clear that females frequently discriminate among males, data also suggest that females may typically have a very limited number of males from which to choose. The extent to which female choice under natural conditions is relatively random because of limited opportunities remains an open question for the majority of species. © 2015 New York Academy of Sciences.
A Game-Theoretic Approach to Information-Flow Control via Protocol Composition
NASA Astrophysics Data System (ADS)
Alvim, Mário; Chatzikokolakis, Konstantinos; Kawamoto, Yusuke; Palamidessi, Catuscia
2018-05-01
In the inference attacks studied in Quantitative Information Flow (QIF), the attacker typically tries to interfere with the system in the attempt to increase its leakage of secret information. The defender, on the other hand, typically tries to decrease leakage by introducing some controlled noise. This noise introduction can be modeled as a type of protocol composition, i.e., a probabilistic choice among different protocols, and its effect on the amount of leakage depends heavily on whether or not this choice is visible to the attacker. In this work, we consider operators for modeling visible and hidden choice in protocol composition, and we study their algebraic properties. We then formalize the interplay between defender and attacker in a game-theoretic framework adapted to the specific issues of QIF, where the payoff is information leakage. We consider various kinds of leakage games, depending on whether players act simultaneously or sequentially, and on whether or not the choices of the defender are visible to the attacker. In the case of sequential games, the choice of the second player is generally a function of the choice of the first player, and his/her probabilistic choice can be either over the possible functions (mixed strategy) or it can be on the result of the function (behavioral strategy). We show that when the attacker moves first in a sequential game with a hidden choice, then behavioral strategies are more advantageous for the defender than mixed strategies. This contrasts with the standard game theory, where the two types of strategies are equivalent. Finally, we establish a hierarchy of these games in terms of their information leakage and provide methods for finding optimal strategies (at the points of equilibrium) for both attacker and defender in the various cases.
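The visible and hidden choice operators described above can be illustrated on channel matrices, together with a simple leakage measure. The sketch below composes two hypothetical two-input channels and computes min-entropy (Bayes-vulnerability) leakage for each composition; it assumes hidden choice is a convex combination of the matrices and visible choice juxtaposes the probability-scaled matrices, and it does not reproduce the paper's game-theoretic analysis.

```python
import numpy as np

def posterior_vulnerability(prior, channel):
    """Bayes (min-entropy) posterior vulnerability of a channel matrix."""
    joint = prior[:, None] * channel              # p(x) * p(y | x)
    return joint.max(axis=0).sum()                # adversary guesses best x per y

def hidden_choice(c1, c2, p):
    """Observer does not see which protocol ran: convex combination of channels."""
    return p * c1 + (1 - p) * c2

def visible_choice(c1, c2, p):
    """Observer sees which protocol ran: scaled channels placed side by side."""
    return np.hstack([p * c1, (1 - p) * c2])

prior = np.array([0.5, 0.5])                      # two equally likely secrets
c1 = np.array([[0.9, 0.1], [0.1, 0.9]])           # leaky protocol
c2 = np.array([[0.5, 0.5], [0.5, 0.5]])           # non-leaky protocol

for name, compose in [("hidden", hidden_choice), ("visible", visible_choice)]:
    post = posterior_vulnerability(prior, compose(c1, c2, 0.5))
    print(name, "leakage (bits):", np.log2(post / prior.max()))
```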
A Probabilistic, Dynamic, and Attribute-wise Model of Intertemporal Choice
Dai, Junyi; Busemeyer, Jerome R.
2014-01-01
Most theoretical and empirical research on intertemporal choice assumes a deterministic and static perspective, leading to the widely adopted delay discounting models. As a form of preferential choice, however, intertemporal choice may be generated by a stochastic process that requires some deliberation time to reach a decision. We conducted three experiments to investigate how choice and decision time varied as a function of manipulations designed to examine the delay duration effect, the common difference effect, and the magnitude effect in intertemporal choice. The results, especially those associated with the delay duration effect, challenged the traditional deterministic and static view and called for alternative approaches. Consequently, various static or dynamic stochastic choice models were explored and fit to the choice data, including alternative-wise models derived from the traditional exponential or hyperbolic discount function and attribute-wise models built upon comparisons of direct or relative differences in money and delay. Furthermore, for the first time, dynamic diffusion models, such as those based on decision field theory, were also fit to the choice and response time data simultaneously. The results revealed that the attribute-wise diffusion model with direct differences, power transformations of objective value and time, and varied diffusion parameter performed the best and could account for all three intertemporal effects. In addition, the empirical relationship between choice proportions and response times was consistent with the prediction of diffusion models and thus favored a stochastic choice process for intertemporal choice that requires some deliberation time to make a decision. PMID:24635188
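The best-fitting model described above computes a drift rate from direct differences of power-transformed money and delay and feeds it to a diffusion process. The sketch below shows that idea with invented weights, exponents, and scaling; the choice probability uses the standard first-passage probability of a Wiener process with symmetric bounds and an unbiased starting point, not the full decision-field-theory fit reported in the paper.

```python
import numpy as np

def attribute_drift(x1, t1, x2, t2, w=0.6, alpha=0.8, beta=0.7, scale=0.1):
    """Drift from direct differences of power-transformed money and delay."""
    dx = x1 ** alpha - x2 ** alpha        # money advantage of option 1
    dt = t1 ** beta - t2 ** beta          # delay disadvantage of option 1
    return scale * (w * dx - (1 - w) * dt)

def p_choose_option1(drift, bound=1.0, noise=1.0):
    """P(hit the upper bound first) for a Wiener process with drift,
    symmetric bounds at +/- bound, and an unbiased start at 0."""
    if abs(drift) < 1e-12:
        return 0.5
    return 1.0 / (1.0 + np.exp(-2.0 * drift * bound / noise ** 2))

# Hypothetical intertemporal choice: $50 in 10 days vs. $30 now
d = attribute_drift(50, 10, 30, 0)
print(p_choose_option1(d))
```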
NASA Astrophysics Data System (ADS)
Prinn, R. G.
2013-12-01
The world is facing major challenges that create tensions between human development and environmental sustenance. In facing these challenges, computer models are invaluable tools for addressing the need for probabilistic approaches to forecasting. To illustrate this, I use the MIT Integrated Global System Model framework (IGSM; http://globalchange.mit.edu ). The IGSM consists of a set of coupled sub-models of global economic and technological development and resultant emissions, and physical, dynamical and chemical processes in the atmosphere, land, ocean and ecosystems (natural and managed). Some of the sub-models have both complex and simplified versions available, with the choice of which version to use being guided by the questions being addressed. Some sub-models (e.g. urban air pollution) are reduced forms of complex ones created by probabilistic collocation with polynomial chaos bases. Given the significant uncertainties in the model components, it is highly desirable that forecasts be probabilistic. We achieve this by running 400-member ensembles (Latin hypercube sampling) with different choices for key uncertain variables and processes within the human and natural system model components (pdfs of inputs estimated by model-observation comparisons, literature surveys, or expert elicitation). The IGSM has recently been used for probabilistic forecasts of climate, each using 400-member ensembles: one ensemble assumes no explicit climate mitigation policy and others assume increasingly stringent policies involving stabilization of greenhouse gases at various levels. These forecasts indicate clearly that the greatest effect of these policies is to lower the probability of extreme changes. The value of such probability analyses for policy decision-making lies in their ability to compare relative (not just absolute) risks of various policies, which are less affected by the earth system model uncertainties. Given the uncertainties in forecasts, it is also clear that we need to evaluate policies based on their ability to lower risk, and to re-evaluate decisions over time as new knowledge is gained. Reference: R. G. Prinn, Development and Application of Earth System Models, Proceedings, National Academy of Science, June 15, 2012, http://www.pnas.org/cgi/doi/10.1073/pnas.1107470109.
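The 400-member ensembles mentioned above are built with Latin hypercube sampling of uncertain inputs. The sketch below is a generic Latin hypercube sampler; the three parameters and their ranges are invented placeholders, not the IGSM's actual uncertain variables.

```python
import numpy as np

def latin_hypercube(n_samples, n_params, rng=None):
    """Latin hypercube sample on [0, 1]^d: exactly one point per stratum per dimension."""
    rng = rng or np.random.default_rng(0)
    u = (np.arange(n_samples)[:, None] + rng.random((n_samples, n_params))) / n_samples
    for j in range(n_params):
        u[:, j] = rng.permutation(u[:, j])   # decouple strata across dimensions
    return u

# 400-member ensemble over 3 hypothetical uncertain parameters,
# e.g. a climate-sensitivity-like quantity in [2.0, 4.5]
unit = latin_hypercube(400, 3)
lo, hi = np.array([2.0, 0.5, 0.1]), np.array([4.5, 2.0, 1.0])
ensemble = lo + unit * (hi - lo)
print(ensemble.shape, ensemble[:2])
```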
Spatial planning using probabilistic flood maps
NASA Astrophysics Data System (ADS)
Alfonso, Leonardo; Mukolwe, Micah; Di Baldassarre, Giuliano
2015-04-01
Probabilistic flood maps account for uncertainty in flood inundation modelling and convey a degree of certainty in the outputs. Major sources of uncertainty include input data, topographic data, model structure, observation data and parametric uncertainty. Decision makers prefer less ambiguous information from modellers, which implies that uncertainty is suppressed to yield binary flood maps. However, suppressing information may lead to surprises or to misleading decisions. Including uncertain information in the decision-making process is therefore desirable and more transparent. To this end, we use Prospect theory together with information from a probabilistic flood map to evaluate potential decisions. Consequences of the decisions were evaluated using flood risk analysis. Prospect theory explains how choices are made among options whose probabilities of occurrence are known, and it accounts for decision makers' characteristics such as loss aversion and risk seeking. Our results show that the effect on decision making is most pronounced when gains and losses are high, implying larger payoffs and penalties and therefore a greater gamble. The methodology may thus be appropriate when making decisions based on uncertain information.
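A minimal sketch of how a prospect-theory evaluation could score alternatives using a probability taken from a flood map is given below. It uses the simple (non-cumulative) prospect value with a standard value function and probability weighting function; the parameter values are typical literature estimates and the monetary outcomes and flood probability are invented, not the paper's case-study figures.

```python
def value(x, alpha=0.88, lam=2.25):
    """Prospect-theory value function: concave for gains, convex and
    steeper (loss aversion) for losses."""
    return x ** alpha if x >= 0 else -lam * (-x) ** alpha

def weight(p, gamma=0.69):
    """Inverse-S probability weighting function (Tversky-Kahneman form)."""
    return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1.0 / gamma)

def prospect(outcomes_probs):
    """Simple (non-cumulative) prospect value of a list of (outcome, probability)."""
    return sum(weight(p) * value(x) for x, p in outcomes_probs)

# Hypothetical choice: develop a zone with P(flood) = 0.1 from the probabilistic map
build = [(-500_000, 0.10), (200_000, 0.90)]   # flood loss vs. development gain
do_not_build = [(0, 1.0)]
print(prospect(build), prospect(do_not_build))
```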
Reasoning about Probabilistic Security Using Task-PIOAs
NASA Astrophysics Data System (ADS)
Jaggard, Aaron D.; Meadows, Catherine; Mislove, Michael; Segala, Roberto
Task-structured probabilistic input/output automata (Task-PIOAs) are concurrent probabilistic automata that, among other things, have been used to provide a formal framework for the universal composability paradigms of protocol security. One of their advantages is that they allow one to distinguish high-level nondeterminism that can affect the outcome of the protocol from low-level choices, which cannot. We present an alternative approach to analyzing the structure of Task-PIOAs that relies on ordered sets. We focus on two of the components that are required to define and apply Task-PIOAs: discrete probability theory and automata theory. We believe our development gives insight into the structure of Task-PIOAs and how they can be utilized to model crypto-protocols. We illustrate our approach with an example from anonymity, an area that has not previously been addressed using Task-PIOAs. We model Chaum's Dining Cryptographers Protocol at a level that does not require cryptographic primitives in the analysis. We show via this example how our approach can leverage a proof of security in the case a principal behaves deterministically to prove security when that principal behaves probabilistically.
Oh-Descher, Hanna; Beck, Jeffrey M; Ferrari, Silvia; Sommer, Marc A; Egner, Tobias
2017-11-15
Real-life decision-making often involves combining multiple probabilistic sources of information under finite time and cognitive resources. To mitigate these pressures, people "satisfice", foregoing a full evaluation of all available evidence to focus on a subset of cues that allow for fast and "good-enough" decisions. Although this form of decision-making likely mediates many of our everyday choices, very little is known about the way in which the neural encoding of cue information changes when we satisfice under time pressure. Here, we combined human functional magnetic resonance imaging (fMRI) with a probabilistic classification task to characterize neural substrates of multi-cue decision-making under low (1500 ms) and high (500 ms) time pressure. Using variational Bayesian inference, we analyzed participants' choices to track and quantify cue usage under each experimental condition, which was then applied to model the fMRI data. Under low time pressure, participants performed near-optimally, appropriately integrating all available cues to guide choices. Both cortical (prefrontal and parietal cortex) and subcortical (hippocampal and striatal) regions encoded individual cue weights, and activity linearly tracked trial-by-trial variations in the amount of evidence and decision uncertainty. Under increased time pressure, participants adaptively shifted to using a satisficing strategy by discounting the least informative cue in their decision process. This strategic change in decision-making was associated with an increased involvement of the dopaminergic midbrain, striatum, thalamus, and cerebellum in representing and integrating cue values. We conclude that satisficing the probabilistic inference process under time pressure leads to a cortical-to-subcortical shift in the neural drivers of decisions. Copyright © 2017 Elsevier Inc. All rights reserved.
Glucocorticoid Regulation of Food-Choice Behavior in Humans: Evidence from Cushing's Syndrome.
Moeller, Scott J; Couto, Lizette; Cohen, Vanessa; Lalazar, Yelena; Makotkine, Iouri; Williams, Nia; Yehuda, Rachel; Goldstein, Rita Z; Geer, Eliza B
2016-01-01
The mechanisms by which glucocorticoids regulate food intake and resulting body mass in humans are not well-understood. One potential mechanism could involve modulation of reward processing, but human stress models examining effects of glucocorticoids on behavior contain important confounds. Here, we studied individuals with Cushing's syndrome, a rare endocrine disorder characterized by chronic excess endogenous glucocorticoids. Twenty-three patients with Cushing's syndrome (13 with active disease; 10 with disease in remission) and 15 controls with a comparably high body mass index (BMI) completed two simulated food-choice tasks (one with "explicit" task contingencies and one with "probabilistic" task contingencies), during which they indicated their objective preference for viewing high calorie food images vs. standardized pleasant, unpleasant, and neutral images. All participants also completed measures of food craving, and approximately half of the participants provided 24-h urine samples for assessment of cortisol and cortisone concentrations. Results showed that on the explicit task (but not the probabilistic task), participants with active Cushing's syndrome made fewer food-related choices than participants with Cushing's syndrome in remission, who in turn made fewer food-related choices than overweight controls. Corroborating this group effect, higher urine cortisone was negatively correlated with food-related choice in the subsample of all participants for whom these data were available. On the probabilistic task, despite a lack of group differences, higher food-related choice correlated with higher state and trait food craving in active Cushing's patients. Taken together, relative to overweight controls, Cushing's patients, particularly those with active disease, displayed a reduced vigor of responding for food rewards that was presumably attributable to glucocorticoid abnormalities. Beyond Cushing's, these results may have relevance for elucidating glucocorticoid contributions to food-seeking behavior, enhancing mechanistic understanding of weight fluctuations associated with oral glucocorticoid therapy and/or chronic stress, and informing the neurobiology of neuropsychiatric conditions marked by abnormal cortisol dynamics (e.g., major depression, Alzheimer's disease).
NASA Technical Reports Server (NTRS)
Abbey, Craig K.; Eckstein, Miguel P.
2002-01-01
We consider estimation and statistical hypothesis testing on classification images obtained from the two-alternative forced-choice experimental paradigm. We begin with a probabilistic model of task performance for simple forced-choice detection and discrimination tasks. Particular attention is paid to general linear filter models because these models lead to a direct interpretation of the classification image as an estimate of the filter weights. We then describe an estimation procedure for obtaining classification images from observer data. A number of statistical tests are presented for testing various hypotheses from classification images based on some more compact set of features derived from them. As an example of how the methods we describe can be used, we present a case study investigating detection of a Gaussian bump profile.
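For the linear-filter interpretation described above, a classification image can be illustrated with simulated 2AFC data. The sketch below uses a fabricated one-dimensional Gaussian-bump signal and a simplified estimator (average of chosen-interval minus unchosen-interval noise fields); the paper's actual estimation procedure partitions noise fields by signal interval and response, so this is only an assumption-laden toy version.

```python
import numpy as np

rng = np.random.default_rng(3)

npix, ntrials = 64, 5000
signal = np.exp(-0.5 * ((np.arange(npix) - 32) / 4.0) ** 2)   # Gaussian bump profile
template = signal / np.linalg.norm(signal)                     # linear observer's filter

chosen_noise = np.zeros(npix)
unchosen_noise = np.zeros(npix)
for _ in range(ntrials):
    n1, n2 = rng.normal(0, 1, npix), rng.normal(0, 1, npix)    # external noise fields
    # interval 1 contains the signal; observer picks the larger template response
    r1 = template @ (signal + n1) + rng.normal(0, 0.5)         # internal noise added
    r2 = template @ n2 + rng.normal(0, 0.5)
    if r1 >= r2:
        chosen_noise += n1; unchosen_noise += n2
    else:
        chosen_noise += n2; unchosen_noise += n1

classification_image = (chosen_noise - unchosen_noise) / ntrials
# the estimate should correlate positively with the observer's template
print(np.corrcoef(classification_image, template)[0, 1])
```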
Uncertainty plus Prior Equals Rational Bias: An Intuitive Bayesian Probability Weighting Function
ERIC Educational Resources Information Center
Fennell, John; Baddeley, Roland
2012-01-01
Empirical research has shown that when making choices based on probabilistic options, people behave as if they overestimate small probabilities, underestimate large probabilities, and treat positive and negative outcomes differently. These distortions have been modeled using a nonlinear probability weighting function, which is found in several…
NASA Astrophysics Data System (ADS)
Warner, Thomas T.; Sheu, Rong-Shyang; Bowers, James F.; Sykes, R. Ian; Dodd, Gregory C.; Henn, Douglas S.
2002-05-01
Ensemble simulations made using a coupled atmospheric dynamic model and a probabilistic Lagrangian puff dispersion model were employed in a forensic analysis of the transport and dispersion of a toxic gas that may have been released near Al Muthanna, Iraq, during the Gulf War. The ensemble study had two objectives, the first of which was to determine the sensitivity of the calculated dosage fields to the choices that must be made about the configuration of the atmospheric dynamic model. In this test, various choices were used for model physics representations and for the large-scale analyses that were used to construct the model initial and boundary conditions. The second study objective was to examine the dispersion model's ability to use ensemble inputs to predict dosage probability distributions. Here, the dispersion model was used with the ensemble mean fields from the individual atmospheric dynamic model runs, including the variability in the individual wind fields, to generate dosage probabilities. These are compared with the explicit dosage probabilities derived from the individual runs of the coupled modeling system. The results demonstrate that the specific choices made about the dynamic-model configuration and the large-scale analyses can have a large impact on the simulated dosages. For example, the area near the source that is exposed to a selected dosage threshold varies by up to a factor of 4 among members of the ensemble. The agreement between the explicit and ensemble dosage probabilities is relatively good for both low and high dosage levels. Although only one ensemble was considered in this study, the encouraging results suggest that a probabilistic dispersion model may be of value in quantifying the effects of uncertainties in a dynamic-model ensemble on dispersion model predictions of atmospheric transport and dispersion.
Quantification and Formalization of Security
2010-02-01
Quantification of information flow; language semantics. The policy restricts the system behavior observed by users holding low clearances; this policy, or a variant of it, is enforced by many programming-language-based mechanisms. The approach is illustrated with a particular programming language (while-programs plus probabilistic choice), and the model is extended in §2.5.
Discounting of food, sex, and money.
Holt, Daniel D; Newquist, Matthew H; Smits, Rochelle R; Tiry, Andrew M
2014-06-01
Discounting is a useful framework for understanding choice involving a range of delayed and probabilistic outcomes (e.g., money, food, drugs), but relatively few studies have examined how people discount other commodities (e.g., entertainment, sex). Using a novel discounting task, where the length of a line represented the value of an outcome and was adjusted using a staircase procedure, we replicated previous findings showing that individuals discount delayed and probabilistic outcomes in a manner well described by a hyperbola-like function. In addition, we found strong positive correlations between discounting rates of delayed, but not probabilistic, outcomes. This suggests that discounting of delayed outcomes may be relatively predictable across outcome types but that discounting of probabilistic outcomes may depend more on specific contexts. The generality of delay discounting and potential context dependence of probability discounting may provide important information regarding factors contributing to choice behavior.
Probabilistic registration of an unbiased statistical shape model to ultrasound images of the spine
NASA Astrophysics Data System (ADS)
Rasoulian, Abtin; Rohling, Robert N.; Abolmaesumi, Purang
2012-02-01
The placement of an epidural needle is among the most difficult regional anesthetic techniques. Ultrasound has been proposed to improve success of placement. However, it has not become the standard-of-care because of limitations in the depictions and interpretation of the key anatomical features. We propose to augment the ultrasound images with a registered statistical shape model of the spine to aid interpretation. The model is created with a novel deformable group-wise registration method which utilizes a probabilistic approach to register groups of point sets. The method is compared to a volume-based model building technique and it demonstrates better generalization and compactness. We instantiate and register the shape model to a spine surface probability map extracted from the ultrasound images. Validation is performed on human subjects. The achieved registration accuracy (2-4 mm) is sufficient to guide the choice of puncture site and trajectory of an epidural needle.
Testing Transitivity of Preferences on Two-Alternative Forced Choice Data
Regenwetter, Michel; Dana, Jason; Davis-Stober, Clintin P.
2010-01-01
As Duncan Luce and other prominent scholars have pointed out on several occasions, testing algebraic models against empirical data raises difficult conceptual, mathematical, and statistical challenges. Empirical data often result from statistical sampling processes, whereas algebraic theories are nonprobabilistic. Many probabilistic specifications lead to statistical boundary problems and are subject to nontrivial order constrained statistical inference. The present paper discusses Luce's challenge for a particularly prominent axiom: Transitivity. The axiom of transitivity is a central component in many algebraic theories of preference and choice. We offer the currently most complete solution to the challenge in the case of transitivity of binary preference on the theory side and two-alternative forced choice on the empirical side, explicitly for up to five, and implicitly for up to seven, choice alternatives. We also discuss the relationship between our proposed solution and weak stochastic transitivity. We recommend to abandon the latter as a model of transitive individual preferences. PMID:21833217
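Weak stochastic transitivity, which the abstract discusses and recommends abandoning as a model of transitive individual preferences, can be stated directly on a matrix of binary choice proportions. The sketch below checks that property on an invented three-alternative example; it does not implement the order-constrained inference the paper develops.

```python
import itertools
import numpy as np

def violates_wst(P):
    """Check weak stochastic transitivity (WST) on a choice-proportion matrix.

    P[i, j] is the proportion of trials on which option i was chosen over
    option j. WST requires: P[i, j] >= .5 and P[j, k] >= .5 imply
    P[i, k] >= .5 for every triple of distinct alternatives.
    """
    n = P.shape[0]
    for i, j, k in itertools.permutations(range(n), 3):
        if P[i, j] >= 0.5 and P[j, k] >= 0.5 and P[i, k] < 0.5:
            return True
    return False

# Hypothetical choice proportions for three gambles (row chosen over column)
P = np.array([[np.nan, 0.65, 0.40],
              [0.35, np.nan, 0.70],
              [0.60, 0.30, np.nan]])
print(violates_wst(P))   # True: a beats b, b beats c, yet c beats a
```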
Continuous track paths reveal additive evidence integration in multistep decision making.
Buc Calderon, Cristian; Dewulf, Myrtille; Gevers, Wim; Verguts, Tom
2017-10-03
Multistep decision making pervades daily life, but its underlying mechanisms remain obscure. We distinguish four prominent models of multistep decision making, namely serial stage, hierarchical evidence integration, hierarchical leaky competing accumulation (HLCA), and probabilistic evidence integration (PEI). To empirically disentangle these models, we design a two-step reward-based decision paradigm and implement it in a reaching task experiment. In a first step, participants choose between two potential upcoming choices, each associated with two rewards. In a second step, participants choose between the two rewards selected in the first step. Strikingly, as predicted by the HLCA and PEI models, the first-step decision dynamics were initially biased toward the choice representing the highest sum/mean before being redirected toward the choice representing the maximal reward (i.e., initial dip). Only HLCA and PEI predicted this initial dip, suggesting that first-step decision dynamics depend on additive integration of competing second-step choices. Our data suggest that potential future outcomes are progressively unraveled during multistep decision making.
A Discounting Framework for Choice With Delayed and Probabilistic Rewards
Green, Leonard; Myerson, Joel
2005-01-01
When choosing between delayed or uncertain outcomes, individuals discount the value of such outcomes on the basis of the expected time to or the likelihood of their occurrence. In an integrative review of the expanding experimental literature on discounting, the authors show that although the same form of hyperbola-like function describes discounting of both delayed and probabilistic outcomes, a variety of recent findings are inconsistent with a single-process account. The authors also review studies that compare discounting in different populations and discuss the theoretical and practical implications of the findings. The present effort illustrates the value of studying choice involving both delayed and probabilistic outcomes within a general discounting framework that uses similar experimental procedures and a common analytical approach. PMID:15367080
Forecasting the impact of transport improvements on commuting and residential choice
NASA Astrophysics Data System (ADS)
Elhorst, J. Paul; Oosterhaven, Jan
2006-03-01
This paper develops a probabilistic, competing-destinations, assignment model that predicts changes in the spatial pattern of the working population as a result of transport improvements. The choice of residence is explained by a new non-parametric model, which represents an alternative to the popular multinomial logit model. Travel times between zones are approximated by a normal distribution function with different mean and variance for each pair of zones, whereas previous models only use average travel times. The model's forecast error of the spatial distribution of the Dutch working population is 7% when tested on 1998 base-year data. To incorporate endogenous changes in its causal variables, an almost ideal demand system is estimated to explain the choice of transport mode, and a new economic geography inter-industry model (RAEM) is estimated to explain the spatial distribution of employment. In the application, the model is used to forecast the impact of six mutually exclusive Dutch core-periphery railway proposals in the projection year 2020.
Consumer Behavior in the Choice of Mode of Transport: A Case Study in the Toledo-Madrid Corridor
Muro-Rodríguez, Ana I.; Perez-Jiménez, Israel R.; Gutiérrez-Broncano, Santiago
2017-01-01
In the consumption of goods or services, the decisions made by individuals involve a choice among a set of discrete alternatives, such as the choice of mode of transport. Consumer behavior is analyzed with discrete choice models based on random utility theory, in which preferences are defined through a utility function that is maximized. These models, also called disaggregated demand models, derive from the decisions of a set of individuals and are formalized through probabilistic models. The objective of this study is to determine consumer behavior in the choice of a service, namely transport services, in a short-distance corridor such as Toledo-Madrid. The Toledo-Madrid corridor is characterized by short distances, with the high-speed train (HST) available as one of the options for reaching the airport, along with the bus and the car, and where HST and air services can be offered as complementary modes. By applying disaggregated transport models to revealed- and stated-preference survey data, one can determine the most important variables involved in the choice and individuals' willingness to pay. These payment dispositions may condition the use of certain transport policies intended to promote efficient transportation. PMID:28676776
Bhanji, Jamil P.; Beer, Jennifer S.; Bunge, Silvia A.
2014-01-01
A decision may be difficult because complex information processing is required to evaluate choices according to deterministic decision rules and/or because it is not certain which choice will lead to the best outcome in a probabilistic context. Factors that tax decision making such as decision rule complexity and low decision certainty should be disambiguated for a more complete understanding of the decision making process. Previous studies have examined the brain regions that are modulated by decision rule complexity or by decision certainty but have not examined these factors together in the context of a single task or study. In the present functional magnetic resonance imaging study, both decision rule complexity and decision certainty were varied in comparable decision tasks. Further, the level of certainty about which choice to make (choice certainty) was varied separately from certainty about the final outcome resulting from a choice (outcome certainty). Lateral prefrontal cortex, dorsal anterior cingulate cortex, and bilateral anterior insula were modulated by decision rule complexity. Anterior insula was engaged more strongly by low than high choice certainty decisions, whereas ventromedial prefrontal cortex showed the opposite pattern. These regions showed no effect of the independent manipulation of outcome certainty. The results disambiguate the influence of decision rule complexity, choice certainty, and outcome certainty on activity in diverse brain regions that have been implicated in decision making. Lateral prefrontal cortex plays a key role in implementing deterministic decision rules, ventromedial prefrontal cortex in probabilistic rules, and anterior insula in both. PMID:19781652
Decision making generalized by a cumulative probability weighting function
NASA Astrophysics Data System (ADS)
dos Santos, Lindomar Soares; Destefano, Natália; Martinez, Alexandre Souto
2018-01-01
Typical examples of intertemporal decision making involve situations in which individuals must choose between a smaller but more immediate reward and a larger one delivered later. Analogously, probabilistic decision making involves choices between options whose consequences differ in their probability of receipt. In economics, expected utility theory (EUT) and discounted utility theory (DUT) are the traditionally accepted normative models for describing, respectively, probabilistic and intertemporal decision making. A large number of experiments have confirmed that the linearity assumed by EUT does not explain some observed behaviors, such as nonlinear preference, risk seeking and loss aversion. That observation led to the development of new theoretical models, called non-expected utility theories (NEUT), which include a nonlinear transformation of the probability scale. An essential feature of the so-called preference function of these theories is that the probabilities are transformed into decision weights by means of a (cumulative) probability weighting function, w(p). In this article we obtain a generalized function for the probabilistic discount process. This function has as particular cases mathematical forms already well established in the literature, including discount models that consider effects of psychophysical perception. We also propose a new generalized function for the functional form of w. The limiting cases of this function encompass some parametric forms already proposed in the literature. Far beyond a mere generalization, our function allows probabilistic decision-making theories to be interpreted on the assumption that individuals behave similarly in the face of probabilities and delays, and it is supported by phenomenological models.
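For readers unfamiliar with probability weighting functions, a minimal sketch of two classic one-parameter forms from the earlier literature (Tversky-Kahneman and Prelec) is given below; these are not the generalized function derived in the article, and the parameter values shown are illustrative defaults.

```python
import numpy as np

def w_tversky_kahneman(p, gamma=0.61):
    """Inverted-S weighting: overweights small p, underweights large p."""
    return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

def w_prelec(p, alpha=0.65):
    """Prelec one-parameter weighting function, w(p) = exp(-(-ln p)^alpha)."""
    return np.exp(-(-np.log(p)) ** alpha)

p = np.array([0.01, 0.1, 0.5, 0.9, 0.99])  # avoid exactly 0 or 1
print(w_tversky_kahneman(p).round(3))
print(w_prelec(p).round(3))
```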
Location contexts of user check-ins to model urban geo life-style patterns.
Hasan, Samiul; Ukkusuri, Satish V
2015-01-01
Geo-location data from social media offers us information, in new ways, to understand people's attitudes and interests through their activity choices. In this paper, we explore the idea of inferring individual life-style patterns from activity-location choices revealed in social media. We present a model to understand life-style patterns using the contextual information (e.g., location categories) of user check-ins. Probabilistic topic models are developed to infer individual geo life-style patterns from two perspectives: i) to characterize the patterns of user interests to different types of places and ii) to characterize the patterns of user visits to different neighborhoods. The method is applied to a dataset of Foursquare check-ins of the users from New York City. The co-existence of several location contexts and the corresponding probabilities in a given pattern provide useful information about user interests and choices. It is found that geo life-style patterns have similar items: either nearby neighborhoods or similar location categories. The semantic and geographic proximity of the items in a pattern reflects the hidden regularity in user preferences and location choice behavior.
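As an illustration of the general approach, a probabilistic topic model such as latent Dirichlet allocation can be run over bag-of-category "documents" (one per user). This is only a sketch under assumed data: the category names and counts below are hypothetical, and the exact model specification in the paper is not reproduced.

```python
import numpy as np
from sklearn.decomposition import LatentDirichletAllocation

# Hypothetical user-by-category count matrix (rows: users, columns: venue categories).
categories = ["coffee_shop", "gym", "bar", "park", "museum", "office"]
rng = np.random.default_rng(0)
X = rng.poisson(lam=2.0, size=(50, len(categories)))

lda = LatentDirichletAllocation(n_components=3, random_state=0)
user_patterns = lda.fit_transform(X)  # per-user mixture over "geo life-style" patterns
topic_category = lda.components_ / lda.components_.sum(axis=1, keepdims=True)

for k, row in enumerate(topic_category):
    top = [categories[i] for i in row.argsort()[::-1][:3]]
    print(f"pattern {k}: {top}")
```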
NASA Astrophysics Data System (ADS)
Sycheva, Elena A.; Vasilev, Aleksandr S.; Lashmanov, Oleg U.; Korotaev, Valery V.
2017-06-01
The article is devoted to the optimization of optoelectronic systems for monitoring the spatial position of objects. Probabilistic characteristics of the detection of an active structured mark on a random noisy background are investigated. The developed computer model and the results of the study allow us to estimate the probabilistic characteristics of detection of a complex structured mark on a random gradient background, and estimate the error of spatial coordinates. The results of the study make it possible to improve the accuracy of measuring the coordinates of the object. Based on the research, recommendations are given on the choice of parameters of the optimal mark structure for use in optoelectronic systems for monitoring the spatial position of large-sized structures.
Risk-sensitive choice in humans as a function of an earnings budget.
Pietras, C J; Hackenberg, T D
2001-01-01
Risky choice in 3 adult humans was investigated across procedural manipulations designed to model energy-budget manipulations conducted with nonhumans. Subjects were presented with repeated choices between a fixed and a variable number of points. An energy budget was simulated by use of an earnings budget, defined as the number of points needed within a block of trials for points to be exchanged for money. During positive earnings-budget conditions, exclusive preference for the fixed option met the earnings requirement. During negative earnings-budget conditions, exclusive preference for the certain option did not meet the earnings requirement, but choice for the variable option met the requirement probabilistically. Choice was generally risk averse (the fixed option was preferred) when the earnings budget was positive and risk prone (the variable option was preferred) when the earnings budget was negative. Furthermore, choice was most risk prone during negative earnings-budget conditions in which the earnings requirement was most stringent. Local choice patterns were also frequently consistent with the predictions of a dynamic optimization model, indicating that choice was simultaneously sensitive to short-term choice contingencies, current point earnings, and the earnings requirement. Overall, these results show that the patterns of risky choice generated by energy-budget variables can also be produced by choice contingencies that do not involve immediate survival, and that risky choice in humans may be similar to that shown in nonhumans when choice is studied under analogous experimental conditions. PMID:11516113
Development of a Prototype Model-Form Uncertainty Knowledge Base
NASA Technical Reports Server (NTRS)
Green, Lawrence L.
2016-01-01
Uncertainties are generally classified as either aleatory or epistemic. Aleatory uncertainties are those attributed to random variation, either naturally or through manufacturing processes. Epistemic uncertainties are generally attributed to a lack of knowledge. One type of epistemic uncertainty is called model-form uncertainty. The term model-form means that among the choices to be made during a design process within an analysis, there are different forms of the analysis process, which each give different results for the same configuration at the same flight conditions. Examples of model-form uncertainties include the grid density, grid type, and solver type used within a computational fluid dynamics code, or the choice of the number and type of model elements within a structures analysis. The objectives of this work are to identify and quantify a representative set of model-form uncertainties and to make this information available to designers through an interactive knowledge base (KB). The KB can then be used during probabilistic design sessions, so as to enable the possible reduction of uncertainties in the design process through resource investment. An extensive literature search has been conducted to identify and quantify typical model-form uncertainties present within aerospace design. An initial attempt has been made to assemble the results of this literature search into a searchable KB, usable in real time during probabilistic design sessions. A concept of operations and the basic structure of a model-form uncertainty KB are described. Key operations within the KB are illustrated. Current limitations in the KB, and possible workarounds are explained.
ERIC Educational Resources Information Center
Madhkhan, Mozhgan; Mousavi, Mojtaba
2017-01-01
Anaphoric expressions are among the most frequent language forms which depend on context for their resolution. Among efforts made in theorizing referential choice, distance approaches take into account how accessibility/continuity is reflected by choice of referring expressions. The thing is that individual's choices are made under major and minor…
Lost in Search: (Mal-)Adaptation to Probabilistic Decision Environments in Children and Adults
ERIC Educational Resources Information Center
Betsch, Tilmann; Lehmann, Anne; Lindow, Stefanie; Lang, Anna; Schoemann, Martin
2016-01-01
Adaptive decision making in probabilistic environments requires individuals to use probabilities as weights in predecisional information searches and/or when making subsequent choices. Within a child-friendly computerized environment (Mousekids), we tracked 205 children's (105 children 5-6 years of age and 100 children 9-10 years of age) and 103…
A Novel TRM Calculation Method by Probabilistic Concept
NASA Astrophysics Data System (ADS)
Audomvongseree, Kulyos; Yokoyama, Akihiko; Verma, Suresh Chand; Nakachi, Yoshiki
In a new competitive environment, it becomes possible for third parties to access a transmission facility. Under this structure, to efficiently manage the utilization of the transmission network, a new definition of Available Transfer Capability (ATC) has been proposed. According to the North American Electric Reliability Council (NERC)'s definition, ATC depends on several parameters, i.e., Total Transfer Capability (TTC), Transmission Reliability Margin (TRM), and Capacity Benefit Margin (CBM). This paper is focused on the calculation of TRM, which is one of the security margins reserved for uncertainty in system conditions. A probabilistic method for TRM calculation is proposed in this paper. Based on the modeling of load forecast error and error in transmission line limitation, various cases of transmission transfer capability and its related probabilistic nature can be calculated. By consideration of the proposed concept of risk analysis, the appropriate required amount of TRM can be obtained. The objective of this research is to provide realistic information on the actual capability of the network, which may be an alternative choice for system operators to make an appropriate decision in the competitive market. The advantages of the proposed method are illustrated by application to the IEEJ-WEST10 model system.
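One simple way to illustrate the idea of a probabilistically derived margin is a Monte Carlo sketch in which load-forecast and line-limit errors are sampled and the margin is read off a percentile of the resulting distribution. The error distributions, numbers, and 95% security criterion below are assumptions for illustration only; they do not reproduce the paper's model system or risk analysis.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

ttc_nominal = 1000.0                       # MW, deterministic transfer capability (illustrative)
load_error = rng.normal(0.0, 50.0, n)      # MW, assumed load forecast error
limit_error = rng.normal(0.0, 30.0, n)     # MW, assumed error in transmission line limit

# Transfer capability actually available under each sampled condition.
available = ttc_nominal + limit_error - load_error

# Choose TRM so that the transfer level is secure with, e.g., 95% probability.
secure_level = np.percentile(available, 5)
trm = ttc_nominal - secure_level
print(f"TRM ~ {trm:.1f} MW")
```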
Astrobiological complexity with probabilistic cellular automata.
Vukotić, Branislav; Ćirković, Milan M
2012-08-01
The search for extraterrestrial life and intelligence constitutes one of the major endeavors in science, but has yet been quantitatively modeled only rarely and in a cursory and superficial fashion. We argue that probabilistic cellular automata (PCA) represent the best quantitative framework for modeling the astrobiological history of the Milky Way and its Galactic Habitable Zone. The relevant astrobiological parameters are to be modeled as the elements of the input probability matrix for the PCA kernel. With the underlying simplicity of the cellular automata constructs, this approach enables a quick analysis of large and ambiguous space of the input parameters. We perform a simple clustering analysis of typical astrobiological histories with "Copernican" choice of input parameters and discuss the relevant boundary conditions of practical importance for planning and guiding empirical astrobiological and SETI projects. In addition to showing how the present framework is adaptable to more complex situations and updated observational databases from current and near-future space missions, we demonstrate how numerical results could offer a cautious rationale for continuation of practical SETI searches.
NASA Astrophysics Data System (ADS)
Baer, P.; Mastrandrea, M.
2006-12-01
Simple probabilistic models which attempt to estimate likely transient temperature change from specified CO2 emissions scenarios must make assumptions about at least six uncertain aspects of the causal chain between emissions and temperature: current radiative forcing (including but not limited to aerosols), current land use emissions, carbon sinks, future non-CO2 forcing, ocean heat uptake, and climate sensitivity. Of these, multiple PDFs (probability density functions) have been published for the climate sensitivity, a couple for current forcing and ocean heat uptake, one for future non-CO2 forcing, and none for current land use emissions or carbon cycle uncertainty (which are interdependent). Different assumptions about these parameters, as well as different model structures, will lead to different estimates of likely temperature increase from the same emissions pathway. Thus policymakers will be faced with a range of temperature probability distributions for the same emissions scenarios, each described by a central tendency and spread. Because our conventional understanding of uncertainty and probability requires that a probabilistically defined variable of interest have only a single mean (or median, or modal) value and a well-defined spread, this "multidimensional" uncertainty defies straightforward utilization in policymaking. We suggest that there are no simple solutions to the questions raised. Crucially, we must dispel the notion that there is a "true" probability: probabilities of this type are necessarily subjective, and reasonable people may disagree. Indeed, we suggest that what is at stake is precisely the question, what is it reasonable to believe, and to act as if we believe? As a preliminary suggestion, we demonstrate how the output of a simple probabilistic climate model might be evaluated regarding the reasonableness of the outputs it calculates with different input PDFs. We suggest further that where there is insufficient evidence to clearly favor one range of probabilistic projections over another, the choice of results on which to base policy must necessarily involve ethical considerations, as they have inevitable consequences for the distribution of risk. In particular, the choice to use a more "optimistic" PDF for climate sensitivity (or other components of the causal chain) leads to the allowance of higher emissions consistent with any specified goal for risk reduction, and thus leads to higher climate impacts, in exchange for lower mitigation costs.
ERIC Educational Resources Information Center
Karsina, Allen; Thompson, Rachel H.; Rodriguez, Nicole M.; Vanselow, Nicholas R.
2012-01-01
We evaluated the effects of differential reinforcement and accurate verbal rules with feedback on the preference for choice and the verbal reports of 6 adults. Participants earned points on a probabilistic schedule by completing the terminal links of a concurrent-chains arrangement in a computer-based game of chance. In free-choice terminal links,…
When good pigeons make bad decisions: Choice with probabilistic delays and outcomes.
Pisklak, Jeffrey M; McDevitt, Margaret A; Dunn, Roger M; Spetch, Marcia L
2015-11-01
Pigeons chose between an (optimal) alternative that sometimes provided food after a 10-s delay and other times after a 40-s delay and another (suboptimal) alternative that sometimes provided food after 10 s but other times no food after 40 s. When outcomes were not signaled during the delays, pigeons strongly preferred the optimal alternative. When outcomes were signaled, choices of the suboptimal alternative increased and most pigeons preferred the alternative that provided no food after the long delay despite the cost in terms of obtained food. The pattern of results was similar whether the short delays occurred on 25% or 50% of the trials. Shortening the 40-s delay to food sharply reduced suboptimal choices, but shortening the delay to no food had little effect. The results suggest that a signaled delay to no food does not punish responding in probabilistic choice procedures. The findings are discussed in terms of conditioned reinforcement by signals for good news. © Society for the Experimental Analysis of Behavior.
Hold it! The influence of lingering rewards on choice diversification and persistence.
Schulze, Christin; van Ravenzwaaij, Don; Newell, Ben R
2017-11-01
Learning to choose adaptively when faced with uncertain and variable outcomes is a central challenge for decision makers. This study examines repeated choice in dynamic probability learning tasks in which outcome probabilities changed either as a function of the choices participants made or independently of those choices. This presence/absence of sequential choice-outcome dependencies was implemented by manipulating a single task aspect between conditions: the retention/withdrawal of reward across individual choice trials. The study addresses how people adapt to these learning environments and to what extent they engage in 2 choice strategies often contrasted as paradigmatic examples of striking violation of versus nominal adherence to rational choice: diversification and persistent probability maximizing, respectively. Results show that decisions approached adaptive choice diversification and persistence when sufficient feedback was provided on the dynamic rules of the probabilistic environments. The findings of divergent behavior in the 2 environments indicate that diversified choices represented a response to the reward retention manipulation rather than to the mere variability of outcome probabilities. Choice in both environments was well accounted for by the generalized matching law, and computational modeling-based strategy analyses indicated that adaptive choice arose mainly from reliance on reinforcement learning strategies. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
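Since the abstract reports that choice was well described by the generalized matching law, log(B1/B2) = s*log(R1/R2) + log(b), a minimal fitting sketch may help: sensitivity s and bias b are estimated by least squares on log ratios. The response and reinforcer counts below are hypothetical and are not the study's data.

```python
import numpy as np

# Hypothetical response (B) and reinforcer (R) counts for two alternatives across sessions.
B1 = np.array([120, 200, 310, 80, 150])
B2 = np.array([180, 110, 90, 220, 160])
R1 = np.array([20, 35, 50, 10, 25])
R2 = np.array([30, 18, 12, 40, 28])

x = np.log10(R1 / R2)
y = np.log10(B1 / B2)
s, log_b = np.polyfit(x, y, 1)   # slope = sensitivity, intercept = log bias
print(f"sensitivity s = {s:.2f}, bias b = {10**log_b:.2f}")
```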
Location Contexts of User Check-Ins to Model Urban Geo Life-Style Patterns
Hasan, Samiul; Ukkusuri, Satish V.
2015-01-01
Geo-location data from social media offers us information, in new ways, to understand people's attitudes and interests through their activity choices. In this paper, we explore the idea of inferring individual life-style patterns from activity-location choices revealed in social media. We present a model to understand life-style patterns using the contextual information (e.g., location categories) of user check-ins. Probabilistic topic models are developed to infer individual geo life-style patterns from two perspectives: i) to characterize the patterns of user interests to different types of places and ii) to characterize the patterns of user visits to different neighborhoods. The method is applied to a dataset of Foursquare check-ins of the users from New York City. The co-existence of several location contexts and the corresponding probabilities in a given pattern provide useful information about user interests and choices. It is found that geo life-style patterns have similar items—either nearby neighborhoods or similar location categories. The semantic and geographic proximity of the items in a pattern reflects the hidden regularity in user preferences and location choice behavior. PMID:25970430
Probabilistic projections of 21st century climate change over Northern Eurasia
NASA Astrophysics Data System (ADS)
Monier, E.; Sokolov, A. P.; Schlosser, C. A.; Scott, J. R.; Gao, X.
2013-12-01
We present probabilistic projections of 21st century climate change over Northern Eurasia using the Massachusetts Institute of Technology (MIT) Integrated Global System Model (IGSM), an integrated assessment model that couples an earth system model of intermediate complexity, with a two-dimensional zonal-mean atmosphere, to a human activity model. Regional climate change is obtained by two downscaling methods: a dynamical downscaling, where the IGSM is linked to a three-dimensional atmospheric model; and a statistical downscaling, where a pattern scaling algorithm uses climate-change patterns from 17 climate models. This framework allows for key sources of uncertainty in future projections of regional climate change to be accounted for: emissions projections; climate system parameters (climate sensitivity, strength of aerosol forcing and ocean heat uptake rate); natural variability; and structural uncertainty. Results show that the choice of climate policy and the climate parameters are the largest drivers of uncertainty. We also find that different initial conditions lead to differences in patterns of change as large as when using different climate models. Finally, this analysis reveals the wide range of possible climate change over Northern Eurasia, emphasizing the need to consider all sources of uncertainty when modeling climate impacts over Northern Eurasia.
Probabilistic projections of 21st century climate change over Northern Eurasia
NASA Astrophysics Data System (ADS)
Monier, Erwan; Sokolov, Andrei; Schlosser, Adam; Scott, Jeffery; Gao, Xiang
2013-12-01
We present probabilistic projections of 21st century climate change over Northern Eurasia using the Massachusetts Institute of Technology (MIT) Integrated Global System Model (IGSM), an integrated assessment model that couples an Earth system model of intermediate complexity with a two-dimensional zonal-mean atmosphere to a human activity model. Regional climate change is obtained by two downscaling methods: a dynamical downscaling, where the IGSM is linked to a three-dimensional atmospheric model, and a statistical downscaling, where a pattern scaling algorithm uses climate change patterns from 17 climate models. This framework allows for four major sources of uncertainty in future projections of regional climate change to be accounted for: emissions projections, climate system parameters (climate sensitivity, strength of aerosol forcing and ocean heat uptake rate), natural variability, and structural uncertainty. The results show that the choice of climate policy and the climate parameters are the largest drivers of uncertainty. We also find that different initial conditions lead to differences in patterns of change as large as when using different climate models. Finally, this analysis reveals the wide range of possible climate change over Northern Eurasia, emphasizing the need to consider these sources of uncertainty when modeling climate impacts over Northern Eurasia.
Managing flowback and produced water from hydraulic fracturing under stochastic environment
NASA Astrophysics Data System (ADS)
Zhang, X.; Sun, A. Y.; Duncan, I. J.; Vesselinov, V. V.
2017-12-01
A large volume of wastewater is being generated from hydraulic fracturing in shale gas plays, including flowback and produced water. The produced wastewater in terms of its quantity and quality has become one of the main environmental problems facing shale gas industries worldwide. Cost-effective planning and management of flowback and produced water is highly desirable. Careful choice of treatment, disposal, and reuse options can lower costs and reduce potential environmental impacts. To handle the recourse issue in decision-making, a two-stage stochastic management model is developed to provide optimal alternatives for fracturing wastewater management. The proposed model is capable of prompting corrective actions to allow decision makers to adjust the pre-defined management strategies. By using this two-stage model, potential penalties arising from decision infeasibility can be minimized. The applicability of the proposed model is demonstrated using a representative synthetic example, in which tradeoffs between economic and environmental goals are quantified. This approach can generate informed defensible decisions for shale gas wastewater management. In addition, probabilistic and non-probabilistic uncertainties are effectively addressed.
Local Choices: Rationality and the Contextuality of Decision-Making
Vlaev, Ivo
2018-01-01
Rational explanation is ubiquitous in psychology and social sciences, ranging from rational analysis, expectancy-value theories, ideal observer models, mental logic to probabilistic frameworks, rational choice theory, and informal “folk psychological” explanation. However, rational explanation appears to be challenged by apparently systematic irrationality observed in psychological experiments, especially in the field of judgement and decision-making (JDM). Here, it is proposed that the experimental results require not that rational explanation should be rejected, but that rational explanation is local, i.e., within a context. Thus, rational models need to be supplemented with a theory of contextual shifts. We review evidence in JDM that patterns of choices are often consistent within contexts, but unstable between contexts. We also demonstrate that for a limited, though reasonably broad, class of decision-making domains, recent theoretical models can be viewed as providing theories of contextual shifts. It is argued that one particular significant source of global inconsistency arises from a cognitive inability to represent absolute magnitudes, whether for perceptual variables, utilities, payoffs, or probabilities. This overall argument provides a fresh perspective on the scope and limits of human rationality. PMID:29301289
Local Choices: Rationality and the Contextuality of Decision-Making.
Vlaev, Ivo
2018-01-02
Rational explanation is ubiquitous in psychology and social sciences, ranging from rational analysis, expectancy-value theories, ideal observer models, mental logic to probabilistic frameworks, rational choice theory, and informal "folk psychological" explanation. However, rational explanation appears to be challenged by apparently systematic irrationality observed in psychological experiments, especially in the field of judgement and decision-making (JDM). Here, it is proposed that the experimental results require not that rational explanation should be rejected, but that rational explanation is local, i.e., within a context. Thus, rational models need to be supplemented with a theory of contextual shifts. We review evidence in JDM that patterns of choices are often consistent within contexts, but unstable between contexts. We also demonstrate that for a limited, though reasonably broad, class of decision-making domains, recent theoretical models can be viewed as providing theories of contextual shifts. It is argued that one particular significant source of global inconsistency arises from a cognitive inability to represent absolute magnitudes, whether for perceptual variables, utilities, payoffs, or probabilities. This overall argument provides a fresh perspective on the scope and limits of human rationality.
Glucocorticoid Regulation of Food-Choice Behavior in Humans: Evidence from Cushing's Syndrome
Moeller, Scott J.; Couto, Lizette; Cohen, Vanessa; Lalazar, Yelena; Makotkine, Iouri; Williams, Nia; Yehuda, Rachel; Goldstein, Rita Z.; Geer, Eliza B.
2016-01-01
The mechanisms by which glucocorticoids regulate food intake and resulting body mass in humans are not well-understood. One potential mechanism could involve modulation of reward processing, but human stress models examining effects of glucocorticoids on behavior contain important confounds. Here, we studied individuals with Cushing's syndrome, a rare endocrine disorder characterized by chronic excess endogenous glucocorticoids. Twenty-three patients with Cushing's syndrome (13 with active disease; 10 with disease in remission) and 15 controls with a comparably high body mass index (BMI) completed two simulated food-choice tasks (one with “explicit” task contingencies and one with “probabilistic” task contingencies), during which they indicated their objective preference for viewing high calorie food images vs. standardized pleasant, unpleasant, and neutral images. All participants also completed measures of food craving, and approximately half of the participants provided 24-h urine samples for assessment of cortisol and cortisone concentrations. Results showed that on the explicit task (but not the probabilistic task), participants with active Cushing's syndrome made fewer food-related choices than participants with Cushing's syndrome in remission, who in turn made fewer food-related choices than overweight controls. Corroborating this group effect, higher urine cortisone was negatively correlated with food-related choice in the subsample of all participants for whom these data were available. On the probabilistic task, despite a lack of group differences, higher food-related choice correlated with higher state and trait food craving in active Cushing's patients. Taken together, relative to overweight controls, Cushing's patients, particularly those with active disease, displayed a reduced vigor of responding for food rewards that was presumably attributable to glucocorticoid abnormalities. Beyond Cushing's, these results may have relevance for elucidating glucocorticoid contributions to food-seeking behavior, enhancing mechanistic understanding of weight fluctuations associated with oral glucocorticoid therapy and/or chronic stress, and informing the neurobiology of neuropsychiatric conditions marked by abnormal cortisol dynamics (e.g., major depression, Alzheimer's disease). PMID:26903790
Orhan, A Emin; Ma, Wei Ji
2017-07-26
Animals perform near-optimal probabilistic inference in a wide range of psychophysical tasks. Probabilistic inference requires trial-to-trial representation of the uncertainties associated with task variables and subsequent use of this representation. Previous work has implemented such computations using neural networks with hand-crafted and task-dependent operations. We show that generic neural networks trained with a simple error-based learning rule perform near-optimal probabilistic inference in nine common psychophysical tasks. In a probabilistic categorization task, error-based learning in a generic network simultaneously explains a monkey's learning curve and the evolution of qualitative aspects of its choice behavior. In all tasks, the number of neurons required for a given level of performance grows sublinearly with the input population size, a substantial improvement on previous implementations of probabilistic inference. The trained networks develop a novel sparsity-based probabilistic population code. Our results suggest that probabilistic inference emerges naturally in generic neural networks trained with error-based learning rules. Behavioural tasks often require probability distributions to be inferred about task specific variables. Here, the authors demonstrate that generic neural networks can be trained using a simple error-based learning rule to perform such probabilistic computations efficiently without any need for task specific operations.
A Flexible Mechanism of Rule Selection Enables Rapid Feature-Based Reinforcement Learning
Balcarras, Matthew; Womelsdorf, Thilo
2016-01-01
Learning in a new environment is influenced by prior learning and experience. Correctly applying a rule that maps a context to stimuli, actions, and outcomes enables faster learning and better outcomes compared to relying on strategies for learning that are ignorant of task structure. However, it is often difficult to know when and how to apply learned rules in new contexts. In our study we explored how subjects employ different strategies for learning the relationship between stimulus features and positive outcomes in a probabilistic task context. We test the hypothesis that task naive subjects will show enhanced learning of feature specific reward associations by switching to the use of an abstract rule that associates stimuli by feature type and restricts selections to that dimension. To test this hypothesis we designed a decision making task where subjects receive probabilistic feedback following choices between pairs of stimuli. In the task, trials are grouped in two contexts by blocks, where in one type of block there is no unique relationship between a specific feature dimension (stimulus shape or color) and positive outcomes, and following an un-cued transition, alternating blocks have outcomes that are linked to either stimulus shape or color. Two-thirds of subjects (n = 22/32) exhibited behavior that was best fit by a hierarchical feature-rule model. Supporting the prediction of the model mechanism these subjects showed significantly enhanced performance in feature-reward blocks, and rapidly switched their choice strategy to using abstract feature rules when reward contingencies changed. Choice behavior of other subjects (n = 10/32) was fit by a range of alternative reinforcement learning models representing strategies that do not benefit from applying previously learned rules. In summary, these results show that untrained subjects are capable of flexibly shifting between behavioral rules by leveraging simple model-free reinforcement learning and context-specific selections to drive responses. PMID:27064794
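To illustrate the kind of feature-level value learning contrasted in the abstract, the sketch below applies a delta-rule update to stimulus features with softmax choice. It is not the hierarchical feature-rule model fit in the study; the feature set, reward probabilities, learning rate, and temperature are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
features = ["red", "blue", "square", "circle"]
V = dict.fromkeys(features, 0.0)         # learned feature values
alpha, beta = 0.2, 3.0                   # learning rate, softmax inverse temperature
p_reward = {"red": 0.8, "blue": 0.2}     # colour predicts reward in this block

for trial in range(200):
    # Two stimuli, each a (colour, shape) pair; shapes are irrelevant here.
    stimuli = [("red", "square"), ("blue", "circle")]
    values = [V[c] + V[s] for c, s in stimuli]
    probs = np.exp(beta * np.array(values))
    probs /= probs.sum()
    choice = rng.choice(2, p=probs)
    colour, shape = stimuli[choice]
    reward = float(rng.random() < p_reward[colour])
    # Delta-rule update shared across the chosen stimulus's features.
    delta = reward - (V[colour] + V[shape])
    V[colour] += alpha * delta
    V[shape] += alpha * delta

print({k: round(v, 2) for k, v in V.items()})
```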
DYT1 dystonia increases risk taking in humans.
Arkadir, David; Radulescu, Angela; Raymond, Deborah; Lubarr, Naomi; Bressman, Susan B; Mazzoni, Pietro; Niv, Yael
2016-06-01
It has been difficult to link synaptic modification to overt behavioral changes. Rodent models of DYT1 dystonia, a motor disorder caused by a single gene mutation, demonstrate increased long-term potentiation and decreased long-term depression in corticostriatal synapses. Computationally, such asymmetric learning predicts risk taking in probabilistic tasks. Here we demonstrate abnormal risk taking in DYT1 dystonia patients, which is correlated with disease severity, thereby supporting striatal plasticity in shaping choice behavior in humans.
Do probabilistic forecasts lead to better decisions?
NASA Astrophysics Data System (ADS)
Ramos, M. H.; van Andel, S. J.; Pappenberger, F.
2012-12-01
The last decade has seen growing research in producing probabilistic hydro-meteorological forecasts and increasing their reliability. This followed the promise that, supplied with information about uncertainty, people would take better risk-based decisions. In recent years, therefore, research and operational developments have also started paying attention to ways of communicating the probabilistic forecasts to decision makers. Communicating probabilistic forecasts includes preparing tools and products for visualization, but also requires understanding how decision makers perceive and use uncertainty information in real-time. At the EGU General Assembly 2012, we conducted a laboratory-style experiment in which several cases of flood forecasts and a choice of actions to take were presented as part of a game to participants, who acted as decision makers. Answers were collected and analyzed. In this paper, we present the results of this exercise and discuss if indeed we make better decisions on the basis of probabilistic forecasts.
Do probabilistic forecasts lead to better decisions?
NASA Astrophysics Data System (ADS)
Ramos, M. H.; van Andel, S. J.; Pappenberger, F.
2013-06-01
The last decade has seen growing research in producing probabilistic hydro-meteorological forecasts and increasing their reliability. This followed the promise that, supplied with information about uncertainty, people would take better risk-based decisions. In recent years, therefore, research and operational developments have also started focusing attention on ways of communicating the probabilistic forecasts to decision-makers. Communicating probabilistic forecasts includes preparing tools and products for visualisation, but also requires understanding how decision-makers perceive and use uncertainty information in real time. At the EGU General Assembly 2012, we conducted a laboratory-style experiment in which several cases of flood forecasts and a choice of actions to take were presented as part of a game to participants, who acted as decision-makers. Answers were collected and analysed. In this paper, we present the results of this exercise and discuss if we indeed make better decisions on the basis of probabilistic forecasts.
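Both of the preceding records ask whether probabilistic forecasts lead to better decisions. A standard way to frame such decisions is the cost-loss model: protective action is worthwhile when the forecast probability exceeds the cost/loss ratio. The sketch below is only an illustration with made-up numbers; it is not the decision game used in the experiment.

```python
def should_protect(p_flood: float, cost: float, loss: float) -> bool:
    """Protect when expected avoided loss exceeds the protection cost."""
    return p_flood * loss > cost

# Illustrative: protection costs 10 (k euros), an unprotected flood causes 80 (k euros) damage,
# so the break-even forecast probability is cost/loss = 0.125.
for p in (0.05, 0.10, 0.20, 0.50):
    print(p, "->", "protect" if should_protect(p, cost=10, loss=80) else "do nothing")
```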
Kolodny, Oren; Lotem, Arnon; Edelman, Shimon
2015-03-01
We introduce a set of biologically and computationally motivated design choices for modeling the learning of language, or of other types of sequential, hierarchically structured experience and behavior, and describe an implemented system that conforms to these choices and is capable of unsupervised learning from raw natural-language corpora. Given a stream of linguistic input, our model incrementally learns a grammar that captures its statistical patterns, which can then be used to parse or generate new data. The grammar constructed in this manner takes the form of a directed weighted graph, whose nodes are recursively (hierarchically) defined patterns over the elements of the input stream. We evaluated the model in seventeen experiments, grouped into five studies, which examined, respectively, (a) the generative ability of grammar learned from a corpus of natural language, (b) the characteristics of the learned representation, (c) sequence segmentation and chunking, (d) artificial grammar learning, and (e) certain types of structure dependence. The model's performance largely vindicates our design choices, suggesting that progress in modeling language acquisition can be made on a broad front, ranging from issues of generativity to the replication of human experimental findings, by bringing biological and computational considerations, as well as lessons from prior efforts, to bear on the modeling approach. Copyright © 2014 Cognitive Science Society, Inc.
Using Models to Inform Policy: Insights from Modeling the Complexities of Global Polio Eradication
NASA Astrophysics Data System (ADS)
Thompson, Kimberly M.
Drawing on over 20 years of experience modeling risks in complex systems, this talk will challenge SBP participants to develop models that provide timely and useful answers to critical policy questions when decision makers need them. The talk will include reflections on the opportunities and challenges associated with developing integrated models for complex problems and communicating their results effectively. Dr. Thompson will focus the talk largely on collaborative modeling related to global polio eradication and the application of system dynamics tools. After successful global eradication of wild polioviruses, live polioviruses will still present risks that could potentially lead to paralytic polio cases. This talk will present the insights of efforts to use integrated dynamic, probabilistic risk, decision, and economic models to address critical policy questions related to managing global polio risks. Using a dynamic disease transmission model combined with probabilistic model inputs that characterize uncertainty for a stratified world to account for variability, we find that global health leaders will face some difficult choices, but that they can take actions that will manage the risks effectively. The talk will emphasize the need for true collaboration between modelers and subject matter experts, and the importance of working with decision makers as partners to ensure the development of useful models that actually get used.
A computational framework to empower probabilistic protein design
Fromer, Menachem; Yanover, Chen
2008-01-01
Motivation: The task of engineering a protein to perform a target biological function is known as protein design. A commonly used paradigm casts this functional design problem as a structural one, assuming a fixed backbone. In probabilistic protein design, positional amino acid probabilities are used to create a random library of sequences to be simultaneously screened for biological activity. Clearly, certain choices of probability distributions will be more successful in yielding functional sequences. However, since the number of sequences is exponential in protein length, computational optimization of the distribution is difficult. Results: In this paper, we develop a computational framework for probabilistic protein design following the structural paradigm. We formulate the distribution of sequences for a structure using the Boltzmann distribution over their free energies. The corresponding probabilistic graphical model is constructed, and we apply belief propagation (BP) to calculate marginal amino acid probabilities. We test this method on a large structural dataset and demonstrate the superiority of BP over previous methods. Nevertheless, since the results obtained by BP are far from optimal, we thoroughly assess the paradigm using high-quality experimental data. We demonstrate that, for small scale sub-problems, BP attains identical results to those produced by exact inference on the paradigmatic model. However, quantitative analysis shows that the distributions predicted significantly differ from the experimental data. These findings, along with the excellent performance we observed using BP on the smaller problems, suggest potential shortcomings of the paradigm. We conclude with a discussion of how it may be improved in the future. Contact: fromer@cs.huji.ac.il PMID:18586717
scoringRules - A software package for probabilistic model evaluation
NASA Astrophysics Data System (ADS)
Lerch, Sebastian; Jordan, Alexander; Krüger, Fabian
2016-04-01
Models in the geosciences are generally surrounded by uncertainty, and being able to quantify this uncertainty is key to good decision making. Accordingly, probabilistic forecasts in the form of predictive distributions have become popular over the last decades. With the proliferation of probabilistic models arises the need for decision theoretically principled tools to evaluate the appropriateness of models and forecasts in a generalized way. Various scoring rules have been developed over the past decades to address this demand. Proper scoring rules are functions S(F,y) which evaluate the accuracy of a forecast distribution F , given that an outcome y was observed. As such, they allow to compare alternative models, a crucial ability given the variety of theories, data sources and statistical specifications that is available in many situations. This poster presents the software package scoringRules for the statistical programming language R, which contains functions to compute popular scoring rules such as the continuous ranked probability score for a variety of distributions F that come up in applied work. Two main classes are parametric distributions like normal, t, or gamma distributions, and distributions that are not known analytically, but are indirectly described through a sample of simulation draws. For example, Bayesian forecasts produced via Markov Chain Monte Carlo take this form. Thereby, the scoringRules package provides a framework for generalized model evaluation that both includes Bayesian as well as classical parametric models. The scoringRules package aims to be a convenient dictionary-like reference for computing scoring rules. We offer state of the art implementations of several known (but not routinely applied) formulas, and implement closed-form expressions that were previously unavailable. Whenever more than one implementation variant exists, we offer statistically principled default choices.
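For readers working outside R, the continuous ranked probability score mentioned above has a well-known closed form for a normal predictive distribution, CRPS(N(mu, sigma^2), y) = sigma * [z(2*Phi(z) - 1) + 2*phi(z) - 1/sqrt(pi)] with z = (y - mu)/sigma. The sketch below implements that formula; the forecast parameters are illustrative, and this is not the scoringRules package itself.

```python
import numpy as np
from scipy.stats import norm

def crps_normal(mu: float, sigma: float, y: float) -> float:
    """Closed-form CRPS of a normal forecast N(mu, sigma^2) at observation y (lower is better)."""
    z = (y - mu) / sigma
    return sigma * (z * (2 * norm.cdf(z) - 1) + 2 * norm.pdf(z) - 1 / np.sqrt(np.pi))

# A sharper, well-calibrated forecast scores lower (better) for the same observation.
print(crps_normal(mu=20.0, sigma=1.0, y=20.5))
print(crps_normal(mu=20.0, sigma=5.0, y=20.5))
```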
Probabilistic learning and inference in schizophrenia
Averbeck, Bruno B.; Evans, Simon; Chouhan, Viraj; Bristow, Eleanor; Shergill, Sukhwinder S.
2010-01-01
Patients with schizophrenia make decisions on the basis of less evidence when required to collect information to make an inference, a behavior often called jumping to conclusions. The underlying basis for this behaviour remains controversial. We examined the cognitive processes underpinning this finding by testing subjects on the beads task, which has been used previously to elicit jumping to conclusions behaviour, and a stochastic sequence learning task, with a similar decision theoretic structure. During the sequence learning task, subjects had to learn a sequence of button presses, while receiving noisy feedback on their choices. We fit a Bayesian decision making model to the sequence task and compared model parameters to the choice behavior in the beads task in both patients and healthy subjects. We found that patients did show a jumping to conclusions style; and those who picked early in the beads task tended to learn less from positive feedback in the sequence task. This favours the likelihood of patients selecting early because they have a low threshold for making decisions, and that they make choices on the basis of relatively little evidence. PMID:20810252
Probabilistic learning and inference in schizophrenia.
Averbeck, Bruno B; Evans, Simon; Chouhan, Viraj; Bristow, Eleanor; Shergill, Sukhwinder S
2011-04-01
Patients with schizophrenia make decisions on the basis of less evidence when required to collect information to make an inference, a behavior often called jumping to conclusions. The underlying basis for this behavior remains controversial. We examined the cognitive processes underpinning this finding by testing subjects on the beads task, which has been used previously to elicit jumping to conclusions behavior, and a stochastic sequence learning task, with a similar decision theoretic structure. During the sequence learning task, subjects had to learn a sequence of button presses, while receiving noisy feedback on their choices. We fit a Bayesian decision making model to the sequence task and compared model parameters to the choice behavior in the beads task in both patients and healthy subjects. We found that patients did show a jumping to conclusions style; and those who picked early in the beads task tended to learn less from positive feedback in the sequence task. This favours the likelihood of patients selecting early because they have a low threshold for making decisions, and that they make choices on the basis of relatively little evidence. Published by Elsevier B.V.
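Both of the preceding records refer to the beads task, in which successive draws from a hidden jar accumulate evidence. A minimal Bayesian sketch of the posterior over the mostly-"a" jar after each draw is given below, assuming the classic 85:15 bead composition and a flat prior; the full decision model fit in the paper is more elaborate.

```python
def jar_posterior(draws, p_majority=0.85, prior=0.5):
    """Posterior P(jar A) after a sequence of draws; 'a' favours jar A, 'b' favours jar B."""
    posterior = prior
    for bead in draws:
        like_a = p_majority if bead == "a" else 1 - p_majority
        like_b = 1 - p_majority if bead == "a" else p_majority
        posterior = like_a * posterior / (like_a * posterior + like_b * (1 - posterior))
    return posterior

print(jar_posterior("a"))     # one confirming bead
print(jar_posterior("aab"))   # mixed evidence
```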
Heuristic and optimal policy computations in the human brain during sequential decision-making.
Korn, Christoph W; Bach, Dominik R
2018-01-23
Optimal decisions across extended time horizons require value calculations over multiple probabilistic future states. Humans may circumvent such complex computations by resorting to easy-to-compute heuristics that approximate optimal solutions. To probe the potential interplay between heuristic and optimal computations, we develop a novel sequential decision-making task, framed as virtual foraging in which participants have to avoid virtual starvation. Rewards depend only on final outcomes over five-trial blocks, necessitating planning over five sequential decisions and probabilistic outcomes. Here, we report model comparisons demonstrating that participants primarily rely on the best available heuristic but also use the normatively optimal policy. FMRI signals in medial prefrontal cortex (MPFC) relate to heuristic and optimal policies and associated choice uncertainties. Crucially, reaction times and dorsal MPFC activity scale with discrepancies between heuristic and optimal policies. Thus, sequential decision-making in humans may emerge from integration between heuristic and optimal policies, implemented by controllers in MPFC.
Choice with a fixed requirement for food, and the generality of the matching relation
Stubbs, D. Alan; Dreyfus, Leon R.; Fetterman, J. Gregor; Dorman, Lana G.
1986-01-01
Pigeons were trained on choice procedures in which responses on each of two keys were reinforced probabilistically, but only after a schedule requirement had been met. Under one arrangement, a fixed-interval choice procedure was used in which responses were not reinforced until the interval was over; then a response on one key would be reinforced, with the effective key changing irregularly from interval to interval. Under a second, fixed-ratio choice procedure, responses on either key counted towards completion of the ratio and then, once the ratio had been completed, a response on the probabilistically selected key would produce food. In one experiment, the schedule requirements were varied for both fixed-interval and fixed-ratio schedules. In the second experiment, relative reinforcement rate was varied. And in a third experiment, the duration of an intertrial interval separating choices was varied. The results for 11 pigeons across all three experiments indicate that there were often large deviations between relative response rates and relative reinforcement rates. Overall performance measures were characterized by a great deal of variability across conditions. More detailed measures of choice across the schedule requirement were also quite variable across conditions. In spite of this variability, performance was consistent across conditions in its efficiency of producing food. The absence of matching of behavior allocation to reinforcement rate indicates an important difference between the present procedures and other choice procedures; that difference raises questions about the specific conditions that lead to matching as an outcome. PMID:16812452
Solway, A.; Botvinick, M.
2013-01-01
Recent work has given rise to the view that reward-based decision making is governed by two key controllers: a habit system, which stores stimulus-response associations shaped by past reward, and a goal-oriented system that selects actions based on their anticipated outcomes. The current literature provides a rich body of computational theory addressing habit formation, centering on temporal-difference learning mechanisms. Less progress has been made toward formalizing the processes involved in goal-directed decision making. We draw on recent work in cognitive neuroscience, animal conditioning, cognitive and developmental psychology and machine learning, to outline a new theory of goal-directed decision making. Our basic proposal is that the brain, within an identifiable network of cortical and subcortical structures, implements a probabilistic generative model of reward, and that goal-directed decision making is effected through Bayesian inversion of this model. We present a set of simulations implementing the account, which address benchmark behavioral and neuroscientific findings, and which give rise to a set of testable predictions. We also discuss the relationship between the proposed framework and other models of decision making, including recent models of perceptual choice, to which our theory bears a direct connection. PMID:22229491
Upgrades to the REA method for producing probabilistic climate change projections
NASA Astrophysics Data System (ADS)
Xu, Ying; Gao, Xuejie; Giorgi, Filippo
2010-05-01
We present an augmented version of the Reliability Ensemble Averaging (REA) method designed to generate probabilistic climate change information from ensembles of climate model simulations. Compared to the original version, the augmented one includes consideration of multiple variables and statistics in the calculation of the performance-based weights. In addition, the model convergence criterion previously employed is removed. The method is applied to the calculation of changes in mean and variability for temperature and precipitation over different sub-regions of East Asia based on the recently completed CMIP3 multi-model ensemble. Comparison of the new and old REA methods, along with the simple averaging procedure, and the use of different combinations of performance metrics shows that at fine sub-regional scales the choice of weighting is relevant. This is mostly because the models show a substantial spread in performance for the simulation of precipitation statistics, a result that supports the use of model weighting as a useful option to account for wide ranges of quality of models. The REA method, and in particular the upgraded one, provides a simple and flexible framework for assessing the uncertainty related to the aggregation of results from ensembles of models in order to produce climate change information at the regional scale.
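The core operation in ensemble averaging of this kind is a performance-weighted mean and spread over model projections. The sketch below uses simple inverse-bias weights for illustration only; it omits the multi-variable performance metrics of the upgraded REA method, and all numbers are hypothetical.

```python
import numpy as np

# Hypothetical projected regional temperature changes (deg C) from 5 models,
# and each model's absolute bias against present-day observations.
delta_t = np.array([2.1, 2.8, 3.5, 1.9, 4.2])
abs_bias = np.array([0.4, 0.9, 0.3, 1.5, 0.6])

weights = 1.0 / abs_bias        # better-performing (low-bias) models get larger weights
weights /= weights.sum()

rea_mean = np.sum(weights * delta_t)
rea_spread = np.sqrt(np.sum(weights * (delta_t - rea_mean) ** 2))
print(f"weighted change = {rea_mean:.2f} +/- {rea_spread:.2f} deg C")
```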
Network approaches for expert decisions in sports.
Glöckner, Andreas; Heinen, Thomas; Johnson, Joseph G; Raab, Markus
2012-04-01
This paper focuses on a model comparison to explain choices based on gaze behavior via simulation procedures. We tested two classes of models, a parallel constraint satisfaction (PCS) artificial neuronal network model and an accumulator model in a handball decision-making task from a lab experiment. Both models predict action in an option-generation task in which options can be chosen from the perspective of a playmaker in handball (i.e., passing to another player or shooting at the goal). Model simulations are based on a dataset of generated options together with gaze behavior measurements from 74 expert handball players for 22 pieces of video footage. We implemented both classes of models as deterministic vs. probabilistic models including and excluding fitted parameters. Results indicated that both classes of models can fit and predict participants' initially generated options based on gaze behavior data, and that overall, the classes of models performed about equally well. Early fixations were thereby particularly predictive for choices. We conclude that the analyses of complex environments via network approaches can be successfully applied to the field of experts' decision making in sports and provide perspectives for further theoretical developments. Copyright © 2011 Elsevier B.V. All rights reserved.
Economic Choices Reveal Probability Distortion in Macaque Monkeys
Lak, Armin; Bossaerts, Peter; Schultz, Wolfram
2015-01-01
Economic choices are largely determined by two principal elements, reward value (utility) and probability. Although nonlinear utility functions have been acknowledged for centuries, nonlinear probability weighting (probability distortion) was only recently recognized as a ubiquitous aspect of real-world choice behavior. Even when outcome probabilities are known and acknowledged, human decision makers often overweight low probability outcomes and underweight high probability outcomes. Whereas recent studies measured utility functions and their corresponding neural correlates in monkeys, it is not known whether monkeys distort probability in a manner similar to humans. Therefore, we investigated economic choices in macaque monkeys for evidence of probability distortion. We trained two monkeys to predict reward from probabilistic gambles with constant outcome values (0.5 ml or nothing). The probability of winning was conveyed using explicit visual cues (sector stimuli). Choices between the gambles revealed that the monkeys used the explicit probability information to make meaningful decisions. Using these cues, we measured probability distortion from choices between the gambles and safe rewards. Parametric modeling of the choices revealed classic probability weighting functions with inverted-S shape. Therefore, the animals overweighted low probability rewards and underweighted high probability rewards. Empirical investigation of the behavior verified that the choices were best explained by a combination of nonlinear value and nonlinear probability distortion. Together, these results suggest that probability distortion may reflect evolutionarily preserved neuronal processing. PMID:25698750
Economic choices reveal probability distortion in macaque monkeys.
Stauffer, William R; Lak, Armin; Bossaerts, Peter; Schultz, Wolfram
2015-02-18
Economic choices are largely determined by two principal elements, reward value (utility) and probability. Although nonlinear utility functions have been acknowledged for centuries, nonlinear probability weighting (probability distortion) was only recently recognized as a ubiquitous aspect of real-world choice behavior. Even when outcome probabilities are known and acknowledged, human decision makers often overweight low probability outcomes and underweight high probability outcomes. Whereas recent studies measured utility functions and their corresponding neural correlates in monkeys, it is not known whether monkeys distort probability in a manner similar to humans. Therefore, we investigated economic choices in macaque monkeys for evidence of probability distortion. We trained two monkeys to predict reward from probabilistic gambles with constant outcome values (0.5 ml or nothing). The probability of winning was conveyed using explicit visual cues (sector stimuli). Choices between the gambles revealed that the monkeys used the explicit probability information to make meaningful decisions. Using these cues, we measured probability distortion from choices between the gambles and safe rewards. Parametric modeling of the choices revealed classic probability weighting functions with inverted-S shape. Therefore, the animals overweighted low probability rewards and underweighted high probability rewards. Empirical investigation of the behavior verified that the choices were best explained by a combination of nonlinear value and nonlinear probability distortion. Together, these results suggest that probability distortion may reflect evolutionarily preserved neuronal processing. Copyright © 2015 Stauffer et al.
DYT1 dystonia increases risk taking in humans
Arkadir, David; Radulescu, Angela; Raymond, Deborah; Lubarr, Naomi; Bressman, Susan B; Mazzoni, Pietro; Niv, Yael

2016-01-01
It has been difficult to link synaptic modification to overt behavioral changes. Rodent models of DYT1 dystonia, a motor disorder caused by a single gene mutation, demonstrate increased long-term potentiation and decreased long-term depression in corticostriatal synapses. Computationally, such asymmetric learning predicts risk taking in probabilistic tasks. Here we demonstrate abnormal risk taking in DYT1 dystonia patients, which is correlated with disease severity, thereby supporting a role for striatal plasticity in shaping choice behavior in humans. DOI: http://dx.doi.org/10.7554/eLife.14155.001 PMID:27249418
NASA Astrophysics Data System (ADS)
McInerney, David; Thyer, Mark; Kavetski, Dmitri; Kuczera, George
2017-04-01
This study provides guidance that enables hydrological researchers to provide probabilistic predictions of daily streamflow with the best reliability and precision for different catchment types (e.g. high/low degree of ephemerality). Reliable and precise probabilistic prediction of daily catchment-scale streamflow requires statistical characterization of residual errors of hydrological models. It is commonly known that hydrological model residual errors are heteroscedastic, i.e. there is a pattern of larger errors in higher streamflow predictions. Although multiple approaches exist for representing this heteroscedasticity, few studies have undertaken a comprehensive evaluation and comparison of these approaches. This study fills this research gap by evaluating 8 common residual error schemes, including standard and weighted least squares, the Box-Cox transformation (with fixed and calibrated power parameter, lambda) and the log-sinh transformation. Case studies include 17 perennial and 6 ephemeral catchments in Australia and the USA, and two lumped hydrological models. We find that the choice of heteroscedastic error modelling approach significantly affects predictive performance, though no single scheme simultaneously optimizes all performance metrics. The set of Pareto optimal schemes, reflecting performance trade-offs, comprises Box-Cox schemes with lambda of 0.2 and 0.5, and the log scheme (lambda=0, perennial catchments only). These schemes significantly outperform even the average-performing remaining schemes (e.g., across ephemeral catchments, median precision tightens from 105% to 40% of observed streamflow, and median biases decrease from 25% to 4%). Theoretical interpretations of empirical results highlight the importance of capturing the skew/kurtosis of raw residuals and reproducing zero flows. Recommendations for researchers and practitioners seeking robust residual error schemes for practical work are provided.
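A minimal sketch of the Box-Cox residual error transformation evaluated above (lambda = 0.2, 0.5, and the log limit lambda = 0); it assumes, as is common, that residuals are computed in transformed space, and all streamflow values are hypothetical.

    import numpy as np

    def boxcox(q, lam):
        # Box-Cox transform of streamflow q; lam = 0 reduces to the log transform.
        # Zero flows (ephemeral catchments) require lam > 0 or a small offset.
        q = np.asarray(q, dtype=float)
        return np.log(q) if lam == 0 else (q**lam - 1.0) / lam

    def transformed_residuals(q_obs, q_sim, lam=0.2):
        # Residuals computed in Box-Cox space, where they are closer to
        # homoscedastic than raw residuals.
        return boxcox(q_obs, lam) - boxcox(q_sim, lam)

    # hypothetical observed and simulated daily streamflow (mm/day)
    q_obs = np.array([0.5, 2.0, 15.0, 80.0])
    q_sim = np.array([0.7, 1.5, 12.0, 95.0])
    print(np.round(transformed_residuals(q_obs, q_sim, lam=0.2), 3))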
Multisensory decisions provide support for probabilistic number representations.
Kanitscheider, Ingmar; Brown, Amanda; Pouget, Alexandre; Churchland, Anne K
2015-06-01
A large body of evidence suggests that an approximate number sense allows humans to estimate numerosity in sensory scenes. This ability is widely observed in humans, including those without formal mathematical training. Despite this, many outstanding questions remain about the nature of the numerosity representation in the brain. Specifically, it is not known whether approximate numbers are represented as scalar estimates of numerosity or, alternatively, as probability distributions over numerosity. In the present study, we used a multisensory decision task to distinguish these possibilities. We trained human subjects to decide whether a test stimulus had a larger or smaller numerosity compared with a fixed reference. Depending on the trial, the numerosity was presented as either a sequence of visual flashes or a sequence of auditory tones, or both. To test for a probabilistic representation, we varied the reliability of the stimulus by adding noise to the visual stimuli. In accordance with a probabilistic representation, we observed a significant improvement in multisensory compared with unisensory trials. Furthermore, a trial-by-trial analysis revealed that although individual subjects showed strategic differences in how they leveraged auditory and visual information, all subjects exploited the reliability of unisensory cues. An alternative, nonprobabilistic model, in which subjects combined cues without regard for reliability, was not able to account for these trial-by-trial choices. These findings provide evidence that the brain relies on a probabilistic representation for numerosity decisions. Copyright © 2015 the American Physiological Society.
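The reliability-weighted combination tested above can be illustrated with the standard Gaussian cue-integration rule; the means and variances below are hypothetical, and the study's actual model may differ.

    def combine_cues(mu_v, var_v, mu_a, var_a):
        # Inverse-variance weighted (probabilistic) combination of a visual and an
        # auditory numerosity estimate, each modeled as a Gaussian.
        w_v = (1.0 / var_v) / (1.0 / var_v + 1.0 / var_a)
        mu = w_v * mu_v + (1.0 - w_v) * mu_a
        var = 1.0 / (1.0 / var_v + 1.0 / var_a)  # smaller than either cue alone
        return mu, var

    # hypothetical single-trial estimates: noisy visual flashes, cleaner auditory tones
    print(combine_cues(mu_v=11.0, var_v=4.0, mu_a=9.5, var_a=1.0))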
Saito, Hiroshi; Katahira, Kentaro; Okanoya, Kazuo; Okada, Masato
2014-01-01
The decision making behaviors of humans and animals adapt and then satisfy an "operant matching law" in certain types of tasks. This was first pointed out by Herrnstein in his foraging experiments on pigeons. The matching law has been a landmark for elucidating the underlying processes of decision making and its learning in the brain. An interesting question is whether decisions are made deterministically or probabilistically. Conventional learning models of the matching law are based on the latter idea; they assume that subjects learn choice probabilities of respective alternatives and decide stochastically according to those probabilities. However, it is unknown whether the matching law can be accounted for by a deterministic strategy or not. To answer this question, we propose several deterministic Bayesian decision making models that have certain incorrect beliefs about an environment. We claim that a simple model produces behavior satisfying the matching law in static settings of a foraging task but not in dynamic settings. We found that the model that believes the environment is volatile works well in the dynamic foraging task and exhibits undermatching, which is a slight deviation from the matching law observed in many experiments. This model also demonstrates the double-exponential reward history dependency of a choice and a heavier-tailed run-length distribution, as has recently been reported in experiments on monkeys.
Discrete-Slots Models of Visual Working-Memory Response Times
Donkin, Christopher; Nosofsky, Robert M.; Gold, Jason M.; Shiffrin, Richard M.
2014-01-01
Much recent research has aimed to establish whether visual working memory (WM) is better characterized by a limited number of discrete all-or-none slots or by a continuous sharing of memory resources. To date, however, researchers have not considered the response-time (RT) predictions of discrete-slots versus shared-resources models. To complement the past research in this field, we formalize a family of mixed-state, discrete-slots models for explaining choice and RTs in tasks of visual WM change detection. In the tasks under investigation, a small set of visual items is presented, followed by a test item in 1 of the studied positions for which a change judgment must be made. According to the models, if the studied item in that position is retained in 1 of the discrete slots, then a memory-based evidence-accumulation process determines the choice and the RT; if the studied item in that position is missing, then a guessing-based accumulation process operates. Observed RT distributions are therefore theorized to arise as probabilistic mixtures of the memory-based and guessing distributions. We formalize an analogous set of continuous shared-resources models. The model classes are tested on individual subjects with both qualitative contrasts and quantitative fits to RT-distribution data. The discrete-slots models provide much better qualitative and quantitative accounts of the RT and choice data than do the shared-resources models, although there is some evidence for “slots plus resources” when memory set size is very small. PMID:24015956
Dopamine D3 Receptor Availability Is Associated with Inflexible Decision Making.
Groman, Stephanie M; Smith, Nathaniel J; Petrullli, J Ryan; Massi, Bart; Chen, Lihui; Ropchan, Jim; Huang, Yiyun; Lee, Daeyeol; Morris, Evan D; Taylor, Jane R
2016-06-22
Dopamine D2/3 receptor signaling is critical for flexible adaptive behavior; however, it is unclear whether D2, D3, or both receptor subtypes modulate precise signals of feedback and reward history that underlie optimal decision making. Here, PET with the radioligand [(11)C]-(+)-PHNO was used to quantify individual differences in putative D3 receptor availability in rodents trained on a novel three-choice spatial acquisition and reversal-learning task with probabilistic reinforcement. Binding of [(11)C]-(+)-PHNO in the midbrain was negatively related to the ability of rats to adapt to changes in rewarded locations, but not to the initial learning. Computational modeling of choice behavior in the reversal phase indicated that [(11)C]-(+)-PHNO binding in the midbrain was related to the learning rate and sensitivity to positive, but not negative, feedback. Administration of a D3-preferring agonist likewise impaired reversal performance by reducing the learning rate and sensitivity to positive feedback. These results demonstrate a previously unrecognized role for D3 receptors in select aspects of reinforcement learning and suggest that individual variation in midbrain D3 receptors influences flexible behavior. Our combined neuroimaging, behavioral, pharmacological, and computational approach implicates the dopamine D3 receptor in decision-making processes that are altered in psychiatric disorders. Flexible decision-making behavior is dependent upon dopamine D2/3 signaling in corticostriatal brain regions. However, the role of D3 receptors in adaptive, goal-directed behavior has not been thoroughly investigated. By combining PET imaging with the D3-preferring radioligand [(11)C]-(+)-PHNO, pharmacology, a novel three-choice probabilistic discrimination and reversal task and computational modeling of behavior in rats, we report that naturally occurring variation in [(11)C]-(+)-PHNO receptor availability relates to specific aspects of flexible decision making. We confirm these relationships using a D3-preferring agonist, thus identifying a unique role of midbrain D3 receptors in decision-making processes. Copyright © 2016 the authors 0270-6474/16/366732-10$15.00/0.
Paradoxical choice in rats: Subjective valuation and mechanism of choice.
Ojeda, Andrés; Murphy, Robin A; Kacelnik, Alex
2018-07-01
Decision-makers benefit from information only when they can use it to guide behavior. However, recent experiments found that pigeons and starlings value information that they cannot use. Here we show that this paradox is also present in rats, and explore the underlying decision process. Subjects chose between two options that delivered food probabilistically after a fixed delay. In one option ("info"), outcomes (food/no-food) were signaled immediately after choice, whereas in the alternative ("non-info") the outcome was uncertain until the delay lapsed. Rats sacrificed up to 20% of potential rewards by preferring the info option, but reversed preference when the cost was 60%. This reversal contrasts with the results found with pigeons and starlings and may reflect species differences worthy of further investigation. Results are consistent with predictions of the Sequential Choice Model (SCM), which proposes that choices are driven by the mechanisms that control action in sequential encounters. As expected from the SCM, latencies to respond in single-option trials predicted preferences in choice trials, and latencies in choice trials were the same or shorter than in single-option trials. We argue that the congruence of results in distant vertebrates probably reflects evolved adaptations to shared fundamental challenges in nature, and that the apparently paradoxical overvaluing of information is not sub-optimal as has been claimed, even though its functional significance is not yet understood. Copyright © 2018 Elsevier B.V. All rights reserved.
Mihaljević, Bojan; Bielza, Concha; Benavides-Piccione, Ruth; DeFelipe, Javier; Larrañaga, Pedro
2014-01-01
Interneuron classification is an important and long-debated topic in neuroscience. A recent study provided a data set of digitally reconstructed interneurons classified by 42 leading neuroscientists according to a pragmatic classification scheme composed of five categorical variables, namely, of the interneuron type and four features of axonal morphology. From this data set we now learned a model which can classify interneurons, on the basis of their axonal morphometric parameters, into these five descriptive variables simultaneously. Because of differences in opinion among the neuroscientists, especially regarding neuronal type, for many interneurons we lacked a unique, agreed-upon classification, which we could use to guide model learning. Instead, we guided model learning with a probability distribution over the neuronal type and the axonal features, obtained, for each interneuron, from the neuroscientists' classification choices. We conveniently encoded such probability distributions with Bayesian networks, calling them label Bayesian networks (LBNs), and developed a method to predict them. This method predicts an LBN by forming a probabilistic consensus among the LBNs of the interneurons most similar to the one being classified. We used 18 axonal morphometric parameters as predictor variables, 13 of which we introduce in this paper as quantitative counterparts to the categorical axonal features. We were able to accurately predict interneuronal LBNs. Furthermore, when extracting crisp (i.e., non-probabilistic) predictions from the predicted LBNs, our method outperformed related work on interneuron classification. Our results indicate that our method is adequate for multi-dimensional classification of interneurons with probabilistic labels. Moreover, the introduced morphometric parameters are good predictors of interneuron type and the four features of axonal morphology and thus may serve as objective counterparts to the subjective, categorical axonal features.
Evaluation of probabilistic forecasts with the scoringRules package
NASA Astrophysics Data System (ADS)
Jordan, Alexander; Krüger, Fabian; Lerch, Sebastian
2017-04-01
Over the last decades probabilistic forecasts in the form of predictive distributions have become popular in many scientific disciplines. With the proliferation of probabilistic models arises the need for decision-theoretically principled tools to evaluate the appropriateness of models and forecasts in a generalized way in order to better understand sources of prediction errors and to improve the models. Proper scoring rules are functions S(F,y) which evaluate the accuracy of a forecast distribution F, given that an outcome y was observed. In keeping with decision-theoretic principles they make it possible to compare alternative models, a crucial ability given the variety of theories, data sources and statistical specifications available in many situations. This contribution presents the software package scoringRules for the statistical programming language R, which provides functions to compute popular scoring rules such as the continuous ranked probability score for a variety of distributions F that come up in applied work. For univariate variables, two main classes are parametric distributions like normal, t, or gamma distributions, and distributions that are not known analytically, but are indirectly described through a sample of simulation draws. For example, ensemble weather forecasts take this form. The scoringRules package aims to be a convenient dictionary-like reference for computing scoring rules. We offer state-of-the-art implementations of several known (but not routinely applied) formulas, and implement closed-form expressions that were previously unavailable. Whenever more than one implementation variant exists, we offer statistically principled default choices. Recent developments include the addition of scoring rules to evaluate multivariate forecast distributions. The use of the scoringRules package is illustrated in an example on post-processing ensemble forecasts of temperature.
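scoringRules itself is an R package; as a language-neutral illustration of one of the scores it computes, here is the standard closed-form continuous ranked probability score for a normal predictive distribution (this is the textbook formula, not code from the package, and the forecast values are hypothetical).

    import numpy as np
    from scipy.stats import norm

    def crps_normal(mu, sigma, y):
        # Closed-form CRPS of a N(mu, sigma^2) forecast given observation y.
        # Lower values indicate a sharper and better-calibrated forecast.
        z = (y - mu) / sigma
        return sigma * (z * (2.0 * norm.cdf(z) - 1.0) + 2.0 * norm.pdf(z) - 1.0 / np.sqrt(np.pi))

    # two hypothetical temperature forecasts scored against the same observation
    print(crps_normal(mu=20.0, sigma=1.5, y=21.0))  # sharper forecast
    print(crps_normal(mu=20.0, sigma=4.0, y=21.0))  # wider forecast with the same mean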
Rygula, Rafal; Clarke, Hannah F.; Cardinal, Rudolf N.; Cockcroft, Gemma J.; Xia, Jing; Dalley, Jeff W.; Robbins, Trevor W.; Roberts, Angela C.
2015-01-01
Understanding the role of serotonin (or 5-hydroxytryptamine, 5-HT) in aversive processing has been hampered by the contradictory findings, across studies, of increased sensitivity to punishment in terms of subsequent response choice but decreased sensitivity to punishment-induced response suppression following gross depletion of central 5-HT. To address this apparent discrepancy, the present study determined whether both effects could be found in the same animals by performing localized 5-HT depletions in the amygdala or orbitofrontal cortex (OFC) of a New World monkey, the common marmoset. 5-HT depletion in the amygdala impaired response choice on a probabilistic visual discrimination task by increasing the effectiveness of misleading, or false, punishment and reward, and decreased response suppression in a variable interval test of punishment sensitivity that employed the same reward and punisher. 5-HT depletion in the OFC also disrupted probabilistic discrimination learning and decreased response suppression. Computational modeling of behavior on the discrimination task showed that the lesions reduced reinforcement sensitivity. A novel, unitary account of the findings in terms of the causal role of 5-HT in the anticipation of both negative and positive motivational outcomes is proposed and discussed in relation to current theories of 5-HT function and our understanding of mood and anxiety disorders. PMID:24879752
NASA Astrophysics Data System (ADS)
Miki, K.; Panesi, M.; Prudencio, E. E.; Prudhomme, S.
2012-05-01
The objective in this paper is to analyze some stochastic models for estimating the ionization reaction rate constant of atomic Nitrogen (N + e^- → N^+ + 2e^-). Parameters of the models are identified by means of Bayesian inference using spatially resolved absolute radiance data obtained from the Electric Arc Shock Tube (EAST) wind-tunnel. The proposed methodology accounts for uncertainties in the model parameters as well as physical model inadequacies, providing estimates of the rate constant that reflect both types of uncertainties. We present four different probabilistic models by varying the error structure (either additive or multiplicative) and by choosing different descriptions of the statistical correlation among data points. In order to assess the validity of our methodology, we first present some calibration results obtained with manufactured data and then proceed by using experimental data collected at EAST experimental facility. In order to simulate the radiative signature emitted in the shock-heated air plasma, we use a one-dimensional flow solver with Park's two-temperature model that simulates non-equilibrium effects. We also discuss the implications of the choice of the stochastic model on the estimation of the reaction rate and its uncertainties. Our analysis shows that the stochastic models based on correlated multiplicative errors are the most plausible models among the four models proposed in this study. The rate of the atomic Nitrogen ionization is found to be (6.2 ± 3.3) × 10^11 cm^3 mol^-1 s^-1 at 10,000 K.
Hernaus, Dennis; Gold, James M; Waltz, James A; Frank, Michael J
2018-04-03
While many have emphasized impaired reward prediction error signaling in schizophrenia, multiple studies suggest that some decision-making deficits may arise from overreliance on stimulus-response systems together with a compromised ability to represent expected value. Guided by computational frameworks, we formulated and tested two scenarios in which maladaptive representations of expected value should be most evident, thereby delineating conditions that may evoke decision-making impairments in schizophrenia. In a modified reinforcement learning paradigm, 42 medicated people with schizophrenia and 36 healthy volunteers learned to select the most frequently rewarded option in a 75-25 pair: once when presented with a more deterministic (90-10) pair and once when presented with a more probabilistic (60-40) pair. Novel and old combinations of choice options were presented in a subsequent transfer phase. Computational modeling was employed to elucidate contributions from stimulus-response systems (actor-critic) and expected value (Q-learning). People with schizophrenia showed robust performance impairments with increasing value difference between two competing options, which strongly correlated with decreased contributions from expected value-based learning (Q-learning). Moreover, a subtle yet consistent contextual choice bias for the probabilistic 75 option was present in people with schizophrenia, which could be accounted for by a context-dependent reward prediction error in the actor-critic. We provide evidence that decision-making impairments in schizophrenia increase monotonically with demands placed on expected value computations. A contextual choice bias is consistent with overreliance on stimulus-response learning, which may signify a deficit secondary to the maladaptive representation of expected value. These results shed new light on conditions under which decision-making impairments may arise. Copyright © 2018 Society of Biological Psychiatry. Published by Elsevier Inc. All rights reserved.
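A minimal sketch of the expected-value (Q-learning) component contrasted with the actor-critic above, applied to the 75-25 training pair; the learning rate, softmax temperature and trial count are illustrative, not the study's fitted parameters.

    import numpy as np

    rng = np.random.default_rng(0)
    p_reward = np.array([0.75, 0.25])  # the 75-25 training pair
    q = np.zeros(2)                    # expected values of the two options
    alpha, beta = 0.1, 3.0             # learning rate, softmax inverse temperature

    for _ in range(500):
        p_choice = np.exp(beta * q) / np.exp(beta * q).sum()  # softmax over expected values
        choice = rng.choice(2, p=p_choice)
        reward = float(rng.random() < p_reward[choice])
        q[choice] += alpha * (reward - q[choice])              # reward prediction error update

    print(np.round(q, 2))  # learned values drift toward the true reward probabilities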
Zoratto, Francesca; Laviola, Giovanni; Adriani, Walter
2016-03-23
Interest is rising in animal modelling of gambling disorder (GD), which is rapidly emerging as a mental health concern. In the present study, we assessed gambling proneness in male Wistar-Han rats using the "Probabilistic Delivery Task" (PDT). This operant protocol is based on choice between either certain, small amounts of food (SS) or larger amounts of food (LLL) delivered (or not) depending on a given (and progressively decreasing) probability. Here, we manipulated the ratio between large and small reward size to assess the impact of different magnitudes on rats' performance. Specifically, we drew a comparison between threefold (2 vs 6 pellets) and fivefold (1 vs 5 pellets) ratios. As a consequence, the "indifference point" (IP, at which either choice is mathematically equivalent in terms of total foraging) was at 33% and 20% probability of delivery, respectively. Animals tested with the sharper contrast (i.e. fivefold ratio) exhibited sustained preference for LLL far beyond the IP, despite high uncertainty and low payoff, which rendered LLL a sub-optimal option. By contrast, animals facing a slighter contrast (i.e. threefold ratio) were increasingly disturbed by progressive rarefaction of rewards, as expressed by enhanced inadequate nose-poking: this was in accordance with their prompt shift in preference to SS, already shown around the IP. In conclusion, a fivefold LLL-to-SS ratio was not only more attractive, but also less frustrating than a threefold one. Thus, a profile of gambling proneness in the PDT is more effectively induced by marked contrast between alternative options. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Selective attention increases choice certainty in human decision making.
Zizlsperger, Leopold; Sauvigny, Thomas; Haarmeier, Thomas
2012-01-01
Choice certainty is a probabilistic estimate of past performance and expected outcome. In perceptual decisions the degree of confidence correlates closely with choice accuracy and reaction times, suggesting an intimate relationship to objective performance. Here we show that spatial and feature-based attention increase human subjects' certainty more than accuracy in visual motion discrimination tasks. Our findings demonstrate for the first time a dissociation of choice accuracy and certainty with a significantly stronger influence of voluntary top-down attention on subjective performance measures than on objective performance. These results reveal a so far unknown mechanism of the selection process implemented by attention and suggest a unique biological valence of choice certainty beyond a faithful reflection of the decision process.
Balancing Risk and Reward: A Rat Model of Risky Decision-Making
Simon, Nicholas W.; Gilbert, Ryan J.; Mayse, Jeffrey D.; Bizon, Jennifer L.; Setlow, Barry
2009-01-01
We developed a behavioral task in rats to assess the influence of risk of punishment on decision-making. Male Long-Evans rats were given choices between pressing a lever to obtain a small, “safe” food reward and a large food reward associated with risk of punishment (footshock). Each test session consisted of 5 blocks of 10 choice trials, with punishment risk increasing with each consecutive block (0, 25, 50, 75, 100%). Preference for the large, “risky” reward declined with both increased probability and increased magnitude of punishment, and reward choice was not affected by the level of satiation or the order of risk presentation. Performance in this risky decision-making task was correlated with the degree to which the rats discounted the value of probabilistic rewards, but not delayed rewards. Finally, the acute effects of different doses of amphetamine and cocaine on risky decision-making were assessed. Systemic amphetamine administration caused a dose-dependent decrease in choice of the large risky reward (i.e. – it made rats more risk-averse). Cocaine did not cause a shift in reward choice, but instead impaired rats’ sensitivity to changes in punishment risk. These results should prove useful for investigating neuropsychiatric disorders in which risk taking is a prominent feature, such as attention deficit/hyperactivity disorder and addiction. PMID:19440192
Balancing risk and reward: a rat model of risky decision making.
Simon, Nicholas W; Gilbert, Ryan J; Mayse, Jeffrey D; Bizon, Jennifer L; Setlow, Barry
2009-09-01
We developed a behavioral task in rats to assess the influence of risk of punishment on decision making. Male Long-Evans rats were given choices between pressing a lever to obtain a small, 'safe' food reward and a large food reward associated with risk of punishment (footshock). Each test session consisted of 5 blocks of 10 choice trials, with punishment risk increasing with each consecutive block (0, 25, 50, 75, 100%). Preference for the large, 'risky' reward declined with both increased probability and increased magnitude of punishment, and reward choice was not affected by the level of satiation or the order of risk presentation. Performance in this risky decision-making task was correlated with the degree to which the rats discounted the value of probabilistic rewards, but not delayed rewards. Finally, the acute effects of different doses of amphetamine and cocaine on risky decision making were assessed. Systemic amphetamine administration caused a dose-dependent decrease in choice of the large risky reward (ie, it made rats more risk averse). Cocaine did not cause a shift in reward choice, but instead impaired the rats' sensitivity to changes in punishment risk. These results should prove useful for investigating neuropsychiatric disorders in which risk taking is a prominent feature, such as attention deficit/hyperactivity disorder and addiction.
Li, Xiaomeng; Xu, Hongzhong; Chen, Jiawei; Chen, Qinghua; Zhang, Jiang; Di, Zengru
2016-01-01
Human migration is responsible for forming modern civilization and has had an important influence on the development of various countries. There are many issues worth researching, and “the reason to move” is the most basic one. The concept of migration cost in the classical self-selection theory, which was introduced by Roy and Borjas, is useful. However, migration cost cannot address global migration because of the limitations of deterministic and bilateral choice. Following the idea of migration cost, this paper developed a new probabilistic multilateral migration model by introducing the Boltzmann factor from statistical physics. After characterizing the underlying mechanism or driving force of human mobility, we reveal some interesting facts that have provided a deeper understanding of international migration, such as the negative correlation between migration costs for emigrants and immigrants and a global classification with clear regional and economic characteristics, based on clustering of migration cost vectors. In addition, we deconstruct the migration barriers using regression analysis and find that the influencing factors are complicated but can be partly (12.5%) described by several macro indexes, such as the GDP growth of the destination country, the GNI per capita and the HDI of both the source and destination countries. PMID:27597319
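A hedged sketch of the Boltzmann-factor choice rule the abstract describes: destination probabilities proportional to exp(-cost/T). The costs and temperature below are hypothetical, and the paper's actual cost estimation is more involved.

    import numpy as np

    def migration_probabilities(costs, temperature=1.0):
        # Destination probability proportional to the Boltzmann factor exp(-cost / T);
        # lower migration cost implies higher choice probability.
        costs = np.asarray(costs, dtype=float)
        w = np.exp(-costs / temperature)
        return w / w.sum()

    # hypothetical costs of moving from one source country to four destinations
    costs = {"US": 1.2, "DE": 1.8, "FR": 2.0, "JP": 2.6}
    p = migration_probabilities(list(costs.values()), temperature=0.8)
    print(dict(zip(costs, np.round(p, 3))))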
Probabilistic modelling of flood events using the entropy copula
NASA Astrophysics Data System (ADS)
Li, Fan; Zheng, Qian
2016-11-01
The estimation of flood frequency is vital for flood control strategies and hydraulic structure design. Generating synthetic flood events according to statistical properties of observations is one plausible method to analyze flood frequency. Due to the statistical dependence among the flood event variables (i.e. the flood peak, volume and duration), a multidimensional joint probability estimation is required. Recently, the copula method has been widely used for constructing multivariable dependence structures; however, the copula family must be chosen before application, and the choice process is sometimes rather subjective. The entropy copula, a new copula family employed in this research, provides a way to avoid this relatively subjective process by combining the theories of copula and entropy. The analysis shows the effectiveness of the entropy copula for probabilistic modelling of the flood events at two hydrological gauges, and a comparison of accuracy with popular copulas was made. The Gibbs sampling technique was applied for trivariate flood event simulation in order to mitigate the calculation difficulties of extending to three dimensions directly. The simulation results indicate that the entropy copula is a simple and effective copula family for trivariate flood simulation.
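The entropy copula itself is specialized; as a simpler illustration of simulating dependent flood variables from a copula, here is sampling from a Gaussian copula with gamma margins (a stand-in family with hypothetical parameters, not the model used in the study).

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)

    # hypothetical correlation among flood peak, volume and duration
    corr = np.array([[1.0, 0.7, 0.5],
                     [0.7, 1.0, 0.6],
                     [0.5, 0.6, 1.0]])
    z = rng.multivariate_normal(mean=np.zeros(3), cov=corr, size=1000)
    u = stats.norm.cdf(z)  # dependent uniforms generated by the copula

    # hypothetical gamma margins for peak (m3/s), volume (10^6 m3) and duration (days)
    peak = stats.gamma.ppf(u[:, 0], a=2.0, scale=300.0)
    volume = stats.gamma.ppf(u[:, 1], a=3.0, scale=15.0)
    duration = stats.gamma.ppf(u[:, 2], a=4.0, scale=2.0)

    print(np.corrcoef(np.column_stack([peak, volume, duration]), rowvar=False).round(2))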
NASA Astrophysics Data System (ADS)
Li, Xiaomeng; Xu, Hongzhong; Chen, Jiawei; Chen, Qinghua; Zhang, Jiang; di, Zengru
2016-09-01
Human migration is responsible for forming modern civilization and has had an important influence on the development of various countries. There are many issues worth researching, and “the reason to move” is the most basic one. The concept of migration cost in the classical self-selection theory, which was introduced by Roy and Borjas, is useful. However, migration cost cannot address global migration because of the limitations of deterministic and bilateral choice. Following the idea of migration cost, this paper developed a new probabilistic multilateral migration model by introducing the Boltzmann factor from statistical physics. After characterizing the underlying mechanism or driving force of human mobility, we reveal some interesting facts that have provided a deeper understanding of international migration, such as the negative correlation between migration costs for emigrants and immigrants and a global classification with clear regional and economic characteristics, based on clustering of migration cost vectors. In addition, we deconstruct the migration barriers using regression analysis and find that the influencing factors are complicated but can be partly (12.5%) described by several macro indexes, such as the GDP growth of the destination country, the GNI per capita and the HDI of both the source and destination countries.
NASA Astrophysics Data System (ADS)
McInerney, David; Thyer, Mark; Kavetski, Dmitri; Kuczera, George
2016-04-01
Appropriate representation of residual errors in hydrological modelling is essential for accurate and reliable probabilistic streamflow predictions. In particular, residual errors of hydrological predictions are often heteroscedastic, with large errors associated with high runoff events. Although multiple approaches exist for representing this heteroscedasticity, few if any studies have undertaken a comprehensive evaluation and comparison of these approaches. This study fills this research gap by evaluating a range of approaches for representing heteroscedasticity in residual errors. These approaches include the 'direct' weighted least squares approach and 'transformational' approaches, such as logarithmic, Box-Cox (with and without fitting the transformation parameter), logsinh and the inverse transformation. The study reports (1) theoretical comparison of heteroscedasticity approaches, (2) empirical evaluation of heteroscedasticity approaches using a range of multiple catchments / hydrological models / performance metrics and (3) interpretation of empirical results using theory to provide practical guidance on the selection of heteroscedasticity approaches. Importantly, for hydrological practitioners, the results will simplify the choice of approaches to represent heteroscedasticity. This will enhance their ability to provide hydrological probabilistic predictions with the best reliability and precision for different catchment types (e.g. high/low degree of ephemerality).
BAYESIAN PROTEIN STRUCTURE ALIGNMENT.
Rodriguez, Abel; Schmidler, Scott C
The analysis of the three-dimensional structure of proteins is an important topic in molecular biochemistry. Structure plays a critical role in defining the function of proteins and is more strongly conserved than amino acid sequence over evolutionary timescales. A key challenge is the identification and evaluation of structural similarity between proteins; such analysis can aid in understanding the role of newly discovered proteins and help elucidate evolutionary relationships between organisms. Computational biologists have developed many clever algorithmic techniques for comparing protein structures; however, all are based on heuristic optimization criteria, making statistical interpretation somewhat difficult. Here we present a fully probabilistic framework for pairwise structural alignment of proteins. Our approach has several advantages, including the ability to capture alignment uncertainty and to estimate key "gap" parameters which critically affect the quality of the alignment. We show that several existing alignment methods arise as maximum a posteriori estimates under specific choices of prior distributions and error models. Our probabilistic framework is also easily extended to incorporate additional information, which we demonstrate by including primary sequence information to generate simultaneous sequence-structure alignments that can resolve ambiguities obtained using structure alone. This combined model also provides a natural approach for the difficult task of estimating evolutionary distance based on structural alignments. The model is illustrated by comparison with well-established methods on several challenging protein alignment examples.
Learning Probabilistic Logic Models from Probabilistic Examples
Chen, Jianzhong; Muggleton, Stephen; Santos, José
2009-01-01
We revisit an application developed originally using abductive Inductive Logic Programming (ILP) for modeling inhibition in metabolic networks. The example data was derived from studies of the effects of toxins on rats using Nuclear Magnetic Resonance (NMR) time-trace analysis of their biofluids together with background knowledge representing a subset of the Kyoto Encyclopedia of Genes and Genomes (KEGG). We now apply two Probabilistic ILP (PILP) approaches - abductive Stochastic Logic Programs (SLPs) and PRogramming In Statistical modeling (PRISM) to the application. Both approaches support abductive learning and probability predictions. Abductive SLPs are a PILP framework that provides possible worlds semantics to SLPs through abduction. Instead of learning logic models from non-probabilistic examples as done in ILP, the PILP approach applied in this paper is based on a general technique for introducing probability labels within a standard scientific experimental setting involving control and treated data. Our results demonstrate that the PILP approach provides a way of learning probabilistic logic models from probabilistic examples, and the PILP models learned from probabilistic examples lead to a significant decrease in error accompanied by improved insight from the learned results compared with the PILP models learned from non-probabilistic examples. PMID:19888348
Learning Probabilistic Logic Models from Probabilistic Examples.
Chen, Jianzhong; Muggleton, Stephen; Santos, José
2008-10-01
We revisit an application developed originally using abductive Inductive Logic Programming (ILP) for modeling inhibition in metabolic networks. The example data was derived from studies of the effects of toxins on rats using Nuclear Magnetic Resonance (NMR) time-trace analysis of their biofluids together with background knowledge representing a subset of the Kyoto Encyclopedia of Genes and Genomes (KEGG). We now apply two Probabilistic ILP (PILP) approaches - abductive Stochastic Logic Programs (SLPs) and PRogramming In Statistical modeling (PRISM) to the application. Both approaches support abductive learning and probability predictions. Abductive SLPs are a PILP framework that provides possible worlds semantics to SLPs through abduction. Instead of learning logic models from non-probabilistic examples as done in ILP, the PILP approach applied in this paper is based on a general technique for introducing probability labels within a standard scientific experimental setting involving control and treated data. Our results demonstrate that the PILP approach provides a way of learning probabilistic logic models from probabilistic examples, and the PILP models learned from probabilistic examples lead to a significant decrease in error accompanied by improved insight from the learned results compared with the PILP models learned from non-probabilistic examples.
Jarmolowicz, David P; Sofis, Michael J; Darden, Alexandria C
2016-07-01
Although progressive ratio (PR) schedules have been used to explore effects of a range of reinforcer parameters (e.g., magnitude, delay), effects of reinforcer probability remain underexplored. The present project used independently progressing concurrent PR PR schedules to examine effects of reinforcer probability on PR breakpoint (the highest completed ratio prior to a session-terminating 300-s pause) and response allocation. The probability of reinforcement on one lever remained at 100% across all conditions while the probability of reinforcement on the other lever was systematically manipulated (i.e., 100%, 50%, 25%, 12.5%, and a replication of 25%). Breakpoints on the manipulated lever systematically decreased with decreasing reinforcer probabilities while breakpoints on the control lever remained unchanged. Patterns of switching between the two levers were well described by a choice-by-choice unit price model that accounted for the hyperbolic discounting of the value of probabilistic reinforcers. Copyright © 2016 Elsevier B.V. All rights reserved.
Effect of experimental analogs of contingency management treatment on cocaine seeking behavior.
Greenwald, Mark K; Ledgerwood, David M; Lundahl, Leslie H; Steinmiller, Caren L
2014-06-01
Contingency management (CM) treatment is effective for treating cocaine dependence but further mechanistic studies of its efficacy are warranted. This study aimed to determine whether: (a) higher vs. lower predictable money amounts ($3 vs. $1; analogs of standard voucher-based CM) increase cocaine demand elasticity; and (b) probabilistic amounts matched for expected value with the $3-predictable amount (50% chance of $6; 25% chance of $12; and 12.5% chance of $24; analogs of prize CM) similarly affect cocaine choice. Each of 15 cocaine-dependent participants first completed a qualifying session to ensure that intranasal cocaine functioned as a reinforcer, then completed a 10-session, within-subject, randomized crossover study. During each of the 10 sessions, the participant responded on a progressive ratio schedule to earn units of cocaine (5-mg or 10-mg) and/or money (five monetary conditions above). During the reinforcement qualifying session (10-mg vs. 0-mg units; no money alternative), cocaine choice was high. The $3-predictable amount significantly decreased cocaine choice relative to both the $1-predictable amount and the qualifying session. Cocaine choices in the probabilistic conditions were similar to those in the $3-predictable condition. These findings indicate that CM interventions targeted at reducing cocaine self-administration are more likely to succeed with higher value non-drug reinforcement. Copyright © 2014. Published by Elsevier Ireland Ltd.
Maintaining homeostasis by decision-making.
Korn, Christoph W; Bach, Dominik R
2015-05-01
Living organisms need to maintain energetic homeostasis. For many species, this implies taking actions with delayed consequences. For example, humans may have to decide between foraging for high-calorie but hard-to-get, and low-calorie but easy-to-get food, under threat of starvation. Homeostatic principles prescribe decisions that maximize the probability of sustaining appropriate energy levels across the entire foraging trajectory. Here, predictions from biological principles contrast with predictions from economic decision-making models based on maximizing the utility of the endpoint outcome of a choice. To empirically arbitrate between the predictions of biological and economic models for individual human decision-making, we devised a virtual foraging task in which players chose repeatedly between two foraging environments, lost energy by the passage of time, and gained energy probabilistically according to the statistics of the environment they chose. Reaching zero energy was framed as starvation. We used the mathematics of random walks to derive endpoint outcome distributions of the choices. This also furnished equivalent lotteries, presented in a purely economic, casino-like frame, in which starvation corresponded to winning nothing. Bayesian model comparison showed that--in both the foraging and the casino frames--participants' choices depended jointly on the probability of starvation and the expected endpoint value of the outcome, but could not be explained by economic models based on combinations of statistical moments or on rank-dependent utility. This implies that under precisely defined constraints biological principles are better suited to explain human decision-making than economic models based on endpoint utility maximization.
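The random-walk analysis of the foraging trajectories can be illustrated with a Monte Carlo sketch; the gain probabilities, step sizes and horizon below are hypothetical stand-ins for the task's actual statistics.

    import numpy as np

    rng = np.random.default_rng(1)

    def starvation_probability(p_gain, gain, loss=1, start=10, steps=50, n_sim=5000):
        # Monte Carlo estimate of the probability that the energy random walk
        # hits zero (starvation) before the end of the foraging horizon.
        starved = 0
        for _ in range(n_sim):
            energy = start
            for _ in range(steps):
                energy += gain if rng.random() < p_gain else -loss
                if energy <= 0:
                    starved += 1
                    break
        return starved / n_sim

    # a risky-but-rich environment versus a safer-but-poorer one
    print(starvation_probability(p_gain=0.40, gain=3))
    print(starvation_probability(p_gain=0.55, gain=1))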
Maintaining Homeostasis by Decision-Making
Korn, Christoph W.; Bach, Dominik R.
2015-01-01
Living organisms need to maintain energetic homeostasis. For many species, this implies taking actions with delayed consequences. For example, humans may have to decide between foraging for high-calorie but hard-to-get, and low-calorie but easy-to-get food, under threat of starvation. Homeostatic principles prescribe decisions that maximize the probability of sustaining appropriate energy levels across the entire foraging trajectory. Here, predictions from biological principles contrast with predictions from economic decision-making models based on maximizing the utility of the endpoint outcome of a choice. To empirically arbitrate between the predictions of biological and economic models for individual human decision-making, we devised a virtual foraging task in which players chose repeatedly between two foraging environments, lost energy by the passage of time, and gained energy probabilistically according to the statistics of the environment they chose. Reaching zero energy was framed as starvation. We used the mathematics of random walks to derive endpoint outcome distributions of the choices. This also furnished equivalent lotteries, presented in a purely economic, casino-like frame, in which starvation corresponded to winning nothing. Bayesian model comparison showed that—in both the foraging and the casino frames—participants’ choices depended jointly on the probability of starvation and the expected endpoint value of the outcome, but could not be explained by economic models based on combinations of statistical moments or on rank-dependent utility. This implies that under precisely defined constraints biological principles are better suited to explain human decision-making than economic models based on endpoint utility maximization. PMID:26024504
Probabilistic versus deterministic hazard assessment in liquefaction susceptible zones
NASA Astrophysics Data System (ADS)
Daminelli, Rosastella; Gerosa, Daniele; Marcellini, Alberto; Tento, Alberto
2015-04-01
Probabilistic seismic hazard assessment (PSHA), usually adopted in the drafting of seismic codes, is based on a Poissonian description of temporal occurrence, a negative exponential distribution of magnitude, and an attenuation relationship with log-normal distribution of PGA or response spectrum. The main strength of this approach is that it is presently the standard in the majority of countries, but it has weak points, in particular regarding the physical description of the earthquake phenomenon. Factors such as site effects and source characteristics like strong-motion duration and directivity, which can significantly influence the expected motion at a site, are not taken into account by PSHA. Deterministic models can better evaluate the ground motion at a site from a physical point of view, but their prediction reliability depends on the degree of knowledge of the source, wave propagation and soil parameters. We compare these two approaches at selected sites affected by the May 2012 Emilia-Romagna and Lombardia earthquake, which caused widespread liquefaction phenomena, unusual for a magnitude below 6. We focus on sites prone to liquefaction because of their soil mechanical parameters and water table levels. Our analysis shows that the choice between deterministic and probabilistic hazard analysis is strongly dependent on site conditions. The looser the soil and the higher the liquefaction potential, the more suitable the deterministic approach. Source characteristics, in particular the duration of strong ground motion, have long been recognized as relevant to inducing liquefaction; unfortunately a quantitative prediction of these parameters appears very unlikely, dramatically reducing the possibility of their adoption in hazard assessment. Last but not least, economic factors are relevant in the choice of the approach. The case history of the 2012 Emilia-Romagna and Lombardia earthquake, with an officially estimated cost of 6 billion Euros, shows that the geological and geophysical investigations necessary to assess a reliable deterministic hazard evaluation are largely justified.
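The Poissonian occurrence assumption underlying PSHA reduces to a simple exceedance-probability formula, sketched below; the rate and exposure time are illustrative.

    import numpy as np

    def poisson_exceedance(annual_rate, years):
        # Probability of at least one exceedance during the exposure time, assuming
        # Poissonian occurrence of ground motions above the chosen threshold.
        return 1.0 - np.exp(-annual_rate * years)

    # the conventional 10% in 50 years corresponds to a ~475-year return period
    print(round(poisson_exceedance(1.0 / 475.0, 50), 3))  # ~0.10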
A Proposed Probabilistic Extension of the Halpern and Pearl Definition of ‘Actual Cause’
2017-01-01
Joseph Halpern and Judea Pearl ([2005]) draw upon structural equation models to develop an attractive analysis of ‘actual cause’. Their analysis is designed for the case of deterministic causation. I show that their account can be naturally extended to provide an elegant treatment of probabilistic causation. PMID:29593362
Residential water demand with endogenous pricing: The Canadian Case
NASA Astrophysics Data System (ADS)
Reynaud, Arnaud; Renzetti, Steven; Villeneuve, Michel
2005-11-01
In this paper, we show that the rate structure endogeneity may result in a misspecification of the residential water demand function. We propose to solve this endogeneity problem by estimating a probabilistic model describing how water rates are chosen by local communities. This model is estimated on a sample of Canadian local communities. We first show that the pricing structure choice reflects efficiency considerations, equity concerns, and, in some cases, a strategy of price discrimination across consumers by Canadian communities. Hence estimating the residential water demand without taking into account the pricing structures' endogeneity leads to a biased estimation of price and income elasticities. We also demonstrate that the pricing structure per se plays a significant role in influencing price responsiveness of Canadian residential consumers.
A Unified Probabilistic Framework for Dose-Response Assessment of Human Health Effects.
Chiu, Weihsueh A; Slob, Wout
2015-12-01
When chemical health hazards have been identified, probabilistic dose-response assessment ("hazard characterization") quantifies uncertainty and/or variability in toxicity as a function of human exposure. Existing probabilistic approaches differ for different types of endpoints or modes-of-action, lacking a unifying framework. We developed a unified framework for probabilistic dose-response assessment. We established a framework based on four principles: a) individual and population dose responses are distinct; b) dose-response relationships for all (including quantal) endpoints can be recast as relating to an underlying continuous measure of response at the individual level; c) for effects relevant to humans, "effect metrics" can be specified to define "toxicologically equivalent" sizes for this underlying individual response; and d) dose-response assessment requires making adjustments and accounting for uncertainty and variability. We then derived a step-by-step probabilistic approach for dose-response assessment of animal toxicology data similar to how nonprobabilistic reference doses are derived, illustrating the approach with example non-cancer and cancer datasets. Probabilistically derived exposure limits are based on estimating a "target human dose" (HDMI), which requires risk management-informed choices for the magnitude (M) of individual effect being protected against, the remaining incidence (I) of individuals with effects ≥ M in the population, and the percent confidence. In the example datasets, probabilistically derived 90% confidence intervals for HDMI values span a 40- to 60-fold range, where I = 1% of the population experiences ≥ M = 1%-10% effect sizes. Although some implementation challenges remain, this unified probabilistic framework can provide substantially more complete and transparent characterization of chemical hazards and support better-informed risk management decisions.
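A hedged Monte Carlo sketch of the style of probabilistic derivation described above: a point of departure is divided by uncertainty and variability factors treated as lognormal distributions, and a lower percentile of the resulting target human dose is reported. All distributions and factor values below are hypothetical, not those of the framework.

    import numpy as np

    rng = np.random.default_rng(2)
    n = 100_000

    # hypothetical point of departure (mg/kg-day) with its own uncertainty
    pod = rng.lognormal(mean=np.log(10.0), sigma=0.3, size=n)

    # hypothetical lognormal uncertainty/variability factors
    interspecies = rng.lognormal(mean=np.log(3.0), sigma=0.4, size=n)
    intraspecies = rng.lognormal(mean=np.log(5.0), sigma=0.5, size=n)

    target_human_dose = pod / (interspecies * intraspecies)

    # lower 5th percentile as a 95%-confidence protective dose
    print(round(np.percentile(target_human_dose, 5), 3))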
Global/local methods for probabilistic structural analysis
NASA Technical Reports Server (NTRS)
Millwater, H. R.; Wu, Y.-T.
1993-01-01
A probabilistic global/local method is proposed to reduce the computational requirements of probabilistic structural analysis. A coarser global model is used for most of the computations, with a more refined local model used only at key probabilistic conditions. The global model is used to establish the cumulative distribution function (cdf) and the Most Probable Point (MPP). The local model then uses the predicted MPP to adjust the cdf value. The global/local method is used within the advanced mean value probabilistic algorithm. The local model can be more refined with respect to the global model in terms of a finer mesh, smaller time step, tighter tolerances, etc., and can be used with linear or nonlinear models. The basis for this approach is described in terms of the correlation between the global and local models, which can be estimated from the global and local MPPs. A numerical example is presented using the NESSUS probabilistic structural analysis program with the finite element method used for the structural modeling. The results clearly indicate significant computer savings with minimal loss in accuracy.
Global/local methods for probabilistic structural analysis
NASA Astrophysics Data System (ADS)
Millwater, H. R.; Wu, Y.-T.
1993-04-01
A probabilistic global/local method is proposed to reduce the computational requirements of probabilistic structural analysis. A coarser global model is used for most of the computations, with a more refined local model used only at key probabilistic conditions. The global model is used to establish the cumulative distribution function (cdf) and the Most Probable Point (MPP). The local model then uses the predicted MPP to adjust the cdf value. The global/local method is used within the advanced mean value probabilistic algorithm. The local model can be more refined with respect to the global model in terms of a finer mesh, smaller time step, tighter tolerances, etc., and can be used with linear or nonlinear models. The basis for this approach is described in terms of the correlation between the global and local models, which can be estimated from the global and local MPPs. A numerical example is presented using the NESSUS probabilistic structural analysis program with the finite element method used for the structural modeling. The results clearly indicate significant computer savings with minimal loss in accuracy.
I Plan Therefore I Choose: Free-Choice Bias Due to Prior Action-Probability but Not Action-Value
Suriya-Arunroj, Lalitta; Gail, Alexander
2015-01-01
According to an emerging view, decision-making and motor planning are tightly entangled at the level of neural processing. Choice is influenced not only by the values associated with different options, but is also biased by other factors. Here we test the hypothesis that preliminary action planning can induce choice biases gradually and independently of objective value when planning overlaps with one of the potential action alternatives. Subjects performed center-out reaches obeying either a clockwise or counterclockwise cue-response rule in two tasks. In the probabilistic task, a pre-cue indicated the probability that each of the two potential rules would become valid. When the subsequent rule-cue unambiguously indicated which of the pre-cued rules was actually valid (instructed trials), subjects responded faster to rules pre-cued with higher probability. When subjects were allowed to choose freely between two equally rewarded rules (choice trials) they chose the originally more likely rule more often and faster, despite the lack of an objective advantage in selecting this target. In the amount task, the pre-cue indicated the amount of potential reward associated with each rule. Subjects responded faster to rules pre-cued with higher reward amount in instructed trials of the amount task, equivalent to the more likely rule in the probabilistic task. Yet, in contrast, subjects showed hardly any choice bias and no increase in response speed in favor of the original high-reward target in the choice trials of the amount task. We conclude that free-choice behavior is robustly biased when predictability encourages the planning of one of the potential responses, while prior reward expectations without action planning do not induce such strong bias. Our results provide behavioral evidence for distinct contributions of expected value and action planning in decision-making and a tight interdependence of motor planning and action selection, supporting the idea that the underlying neural mechanisms overlap. PMID:26635565
Cost effectiveness and projected national impact of colorectal cancer screening in France.
Hassan, C; Benamouzig, R; Spada, C; Ponchon, T; Zullo, A; Saurin, J C; Costamagna, G
2011-09-01
Colorectal cancer (CRC) is a major cause of morbidity and mortality in France. Only scanty data on cost-effectiveness of CRC screening in Europe are available, generating uncertainty over its efficiency. Although immunochemical fecal tests (FIT) and guaiac-based fecal occult blood tests (g-FOBT) have been shown to be cost-effective in France, cost-effectiveness of endoscopic screening has not yet been addressed. Cost-effectiveness of screening strategies using colonoscopy, flexible sigmoidoscopy, second-generation colon capsule endoscopy (CCE), FIT and g-FOBT were compared using a Markov model. A 40 % adherence rate was assumed for all strategies. Colonoscopy costs included anesthesiologist assistance. Incremental cost-effectiveness ratios (ICERs) were calculated. Probabilistic and value-of-information analyses were used to estimate the expected benefit of future research. A third-payer perspective was adopted. In the reference case analysis, FIT repeated every year was the most cost-effective strategy, with an ICER of €48165 per life-year gained vs. FIT every 2 years, which was the next most cost-effective strategy. Although CCE every 5 years was as effective as FIT 1-year, it was not a cost-effective alternative. Colonoscopy repeated every 10 years was substantially more costly, and slightly less effective than FIT 1-year. When projecting the model outputs onto the French population, the least (g-FOBT 2-years) and most (FIT 1-year) effective strategies reduced the absolute number of annual CRC deaths from 16037 to 12916 and 11217, respectively, resulting in an annual additional cost of €26 million and €347 million, respectively. Probabilistic sensitivity analysis demonstrated that FIT 1-year was the optimal choice in 20% of the simulated scenarios, whereas sigmoidoscopy 5-years, colonoscopy, and FIT 2-years were the optimal choices in 40%, 26%, and 14%, respectively. A screening program based on FIT 1-year appeared to be the most cost-effective approach for CRC screening in France. However, a substantial uncertainty over this choice is still present. © Georg Thieme Verlag KG Stuttgart · New York.
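The incremental cost-effectiveness ratios reported above follow the standard definition, sketched below with made-up per-person costs and life-years (not values from the French model).

    def icer(cost_new, effect_new, cost_ref, effect_ref):
        # Incremental cost-effectiveness ratio: extra cost per extra life-year
        # gained when moving from the reference strategy to the new one.
        return (cost_new - cost_ref) / (effect_new - effect_ref)

    # hypothetical per-person discounted costs (EUR) and life-years
    print(round(icer(cost_new=520.0, effect_new=18.762,
                     cost_ref=430.0, effect_ref=18.760)))  # EUR per life-year gained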
Rats bred for high alcohol drinking are more sensitive to delayed and probabilistic outcomes.
Wilhelm, C J; Mitchell, S H
2008-10-01
Alcoholics and heavy drinkers score higher on measures of impulsivity than nonalcoholics and light drinkers. This may be because of factors that predate drug exposure (e.g. genetics). This study examined the role of genetics by comparing impulsivity measures in ethanol-naive rats selectively bred based on their high [high alcohol drinking (HAD)] or low [low alcohol drinking (LAD)] consumption of ethanol. Replicates 1 and 2 of the HAD and LAD rats, developed by the University of Indiana Alcohol Research Center, completed two different discounting tasks. Delay discounting examines sensitivity to rewards that are delayed in time and is commonly used to assess 'choice' impulsivity. Probability discounting examines sensitivity to the uncertain delivery of rewards and has been used to assess risk taking and risk assessment. High alcohol drinking rats discounted delayed and probabilistic rewards more steeply than LAD rats. Discount rates associated with probabilistic and delayed rewards were weakly correlated, while bias was strongly correlated with discount rate in both delay and probability discounting. The results suggest that selective breeding for high alcohol consumption selects for animals that are more sensitive to delayed and probabilistic outcomes. Sensitivity to delayed or probabilistic outcomes may be predictive of future drinking in genetically predisposed individuals.
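For readers unfamiliar with the discounting measures referred to here, the following sketch shows the standard single-parameter forms commonly used in this literature (hyperbolic delay discounting and probability discounting on odds-against); the parameter values are invented and are not the fitted values from this study.

```python
import numpy as np

# Common single-parameter discounting forms used in this literature
# (hyperbolic delay discounting; probability discounting on odds-against).
# Parameter values below are invented for illustration only.

def delayed_value(amount, delay, k):
    """Subjective value of a reward received after `delay`; larger k = steeper discounting."""
    return amount / (1.0 + k * delay)

def probabilistic_value(amount, p, h):
    """Subjective value of a reward delivered with probability p; theta = odds against."""
    theta = (1.0 - p) / p
    return amount / (1.0 + h * theta)

delays = np.array([0, 5, 10, 20, 40])         # e.g., seconds
probs = np.array([1.0, 0.8, 0.6, 0.4, 0.2])

# A "steep" discounter (analogue of the HAD phenotype above) vs a shallow one.
for k in (0.05, 0.30):
    print("delay, k=%.2f :" % k, np.round(delayed_value(100.0, delays, k), 1))
for h in (0.5, 2.0):
    print("prob,  h=%.2f :" % h, np.round(probabilistic_value(100.0, probs, h), 1))
```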
Emulation for probabilistic weather forecasting
NASA Astrophysics Data System (ADS)
Cornford, Dan; Barillec, Remi
2010-05-01
Numerical weather prediction models are typically very expensive to run due to their complexity and resolution. Characterising the sensitivity of the model to its initial condition and/or to its parameters requires numerous runs of the model, which is impractical for all but the simplest models. To produce probabilistic forecasts requires knowledge of the distribution of the model outputs, given the distribution over the inputs, where the inputs include the initial conditions, boundary conditions and model parameters. Such uncertainty analysis for complex weather prediction models seems a long way off, given current computing power, with ensembles providing only a partial answer. One possible way forward that we develop in this work is the use of statistical emulators. Emulators provide an efficient statistical approximation to the model (or simulator) while quantifying the uncertainty introduced. In the emulator framework, a Gaussian process is fitted to the simulator response as a function of the simulator inputs using some training data. The emulator is essentially an interpolator of the simulator output and the response in unobserved areas is dictated by the choice of covariance structure and parameters in the Gaussian process. Suitable parameters are inferred from the data in a maximum likelihood, or Bayesian framework. Once trained, the emulator allows operations such as sensitivity analysis or uncertainty analysis to be performed at a much lower computational cost. The efficiency of emulators can be further improved by exploiting the redundancy in the simulator output through appropriate dimension reduction techniques. We demonstrate this using both Principal Component Analysis on the model output and a new reduced-rank emulator in which an optimal linear projection operator is estimated jointly with other parameters, in the context of simple low order models, such as the Lorenz 40D system. We present the application of emulators to probabilistic weather forecasting, where the construction of the emulator training set replaces the traditional ensemble model runs. Thus the actual forecast distributions are computed using the emulator conditioned on the ‘ensemble runs' which are chosen to explore the plausible input space using relatively crude experimental design methods. One benefit here is that the ensemble does not need to be a sample from the true distribution of the input space, rather it should cover that input space in some sense. The probabilistic forecasts are computed using Monte Carlo methods sampling from the input distribution and using the emulator to produce the output distribution. Finally we discuss the limitations of this approach and briefly mention how we might use similar methods to learn the model error within a framework that incorporates a data assimilation like aspect, using emulators and learning complex model error representations. We suggest future directions for research in the area that will be necessary to apply the method to more realistic numerical weather prediction models.
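A minimal sketch of the emulator idea, assuming scikit-learn is available: a Gaussian process is fitted to a handful of runs of a toy "simulator", and input uncertainty is then propagated through the emulator by Monte Carlo. The dimension-reduction and Lorenz-system experiments described above are not reproduced.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Toy "simulator": a stand-in for an expensive model mapping an input parameter to an output.
def simulator(x):
    return np.sin(3.0 * x) + 0.5 * x

# Small training design (the analogue of the ensemble runs used to train the emulator).
rng = np.random.default_rng(0)
X_train = np.linspace(-2.0, 2.0, 12).reshape(-1, 1)
y_train = simulator(X_train).ravel()

# Fit the emulator: a GP with a squared-exponential covariance.
gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0) + WhiteKernel(1e-6),
                              normalize_y=True)
gp.fit(X_train, y_train)

# Probabilistic "forecast": propagate an input distribution through the emulator
# by Monte Carlo instead of running the simulator thousands of times.
x_samples = rng.normal(loc=0.5, scale=0.3, size=(5000, 1))
mean, std = gp.predict(x_samples, return_std=True)
output_samples = rng.normal(mean, std)          # include emulator uncertainty
print("forecast mean %.3f, 90%% interval (%.3f, %.3f)"
      % (output_samples.mean(),
         np.percentile(output_samples, 5), np.percentile(output_samples, 95)))
```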
ERIC Educational Resources Information Center
Meiser, Thorsten; Rummel, Jan; Fleig, Hanna
2018-01-01
Pseudocontingencies are inferences about correlations in the environment that are formed on the basis of statistical regularities like skewed base rates or varying base rates across environmental contexts. Previous research has demonstrated that pseudocontingencies provide a pervasive mechanism of inductive inference in numerous social judgment…
Generative Topic Modeling in Image Data Mining and Bioinformatics Studies
ERIC Educational Resources Information Center
Chen, Xin
2012-01-01
Probabilistic topic models have been developed for applications in various domains such as text mining, information retrieval, computer vision, and bioinformatics. In this thesis, we focus on developing novel probabilistic topic models for image mining and bioinformatics studies. Specifically, a probabilistic topic-connection (PTC) model…
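As a generic illustration of probabilistic topic modeling (not the PTC model proposed in the thesis), the sketch below fits scikit-learn's LatentDirichletAllocation to a toy corpus; in image or bioinformatics applications the "words" would be visual or sequence features.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# Toy corpus mixing two obvious themes; real applications would use image
# "visual words" or biological sequence features instead of English text.
docs = [
    "gene expression protein sequence alignment",
    "protein structure gene regulation pathway",
    "image pixel texture segmentation feature",
    "image feature edge texture classification",
]

counts = CountVectorizer().fit(docs)
X = counts.transform(docs)

lda = LatentDirichletAllocation(n_components=2, random_state=0)
doc_topics = lda.fit_transform(X)               # per-document topic mixtures

vocab = counts.get_feature_names_out()
for k, topic in enumerate(lda.components_):
    top = topic.argsort()[-4:][::-1]
    print(f"topic {k}:", [vocab[i] for i in top])
print("document-topic proportions:\n", doc_topics.round(2))
```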
Amodeo, Dionisio A.; Grospe, Gena; Zang, Hui; Dwivedi, Yogesh; Ragozzino, Michael E.
2016-01-01
Central infusion of the Na+/K+-ATPase inhibitor ouabain in rats serves as an animal model of mania because it leads to hyperactivity and reproduces the ion dysregulation and reduced BDNF levels observed in bipolar disorder. Bipolar disorder is also associated with cognitive inflexibility and working memory deficits. It is unknown whether ouabain treatment in rats leads to similar cognitive flexibility and working memory deficits. The present study examined the effects of an intracerebral ventricular infusion of ouabain in rats on spontaneous alternation, probabilistic reversal learning and BDNF expression levels in the frontal cortex. Ouabain treatment significantly increased locomotor activity, but did not affect alternation performance in a Y-maze. Ouabain treatment selectively impaired reversal learning in a spatial discrimination task using an 80/20 probabilistic reinforcement procedure. The reversal learning deficit in ouabain-treated rats resulted from an impaired ability to maintain a new choice pattern (increased regressive errors). Ouabain treatment also decreased sensitivity to negative feedback during the initial phase of reversal learning. Expression of BDNF mRNA and protein levels was downregulated in the frontal cortex, which also negatively correlated with regressive errors. These findings suggest that the ouabain model of mania may be useful in understanding the neuropathophysiology that contributes to cognitive flexibility deficits and in testing potential treatments to alleviate cognitive deficits in bipolar disorder. PMID:27267245
NASA Technical Reports Server (NTRS)
Duda, David P.; Minnis, Patrick
2009-01-01
Straightforward application of the Schmidt-Appleman contrail formation criteria to diagnose persistent contrail occurrence from numerical weather prediction data is hindered by significant bias errors in the upper tropospheric humidity. Logistic models of contrail occurrence have been proposed to overcome this problem, but basic questions remain about how random measurement error may affect their accuracy. A set of 5000 synthetic contrail observations is created to study the effects of random error in these probabilistic models. The simulated observations are based on distributions of temperature, humidity, and vertical velocity derived from Advanced Regional Prediction System (ARPS) weather analyses. The logistic models created from the simulated observations were evaluated using two common statistical measures of model accuracy, the percent correct (PC) and the Hanssen-Kuipers discriminant (HKD). To convert the probabilistic results of the logistic models into a dichotomous yes/no choice suitable for the statistical measures, two critical probability thresholds are considered. The HKD scores are higher when the climatological frequency of contrail occurrence is used as the critical threshold, while the PC scores are higher when the critical probability threshold is 0.5. For both thresholds, typical random errors in temperature, relative humidity, and vertical velocity are found to be small enough to allow for accurate logistic models of contrail occurrence. The accuracy of the models developed from synthetic data is over 85 percent for both the prediction of contrail occurrence and non-occurrence, although in practice, larger errors would be anticipated.
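The following sketch mimics the study's setup on synthetic data: a logistic model of contrail occurrence as a function of temperature, humidity, and vertical velocity with random measurement error added to the predictors, scored by percent correct (PC) and the Hanssen-Kuipers discriminant (HKD) at the two thresholds discussed above. The coefficients and noise levels are invented, not the ARPS-derived values.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 5000

# Synthetic upper-tropospheric predictors (standardized units) and a "true"
# occurrence rule; the coefficients below are invented, not the study's.
T, RH, W = rng.normal(size=(3, n))
p_true = 1.0 / (1.0 + np.exp(-(-1.0 - 2.0 * T + 1.5 * RH + 0.5 * W)))
y = rng.random(n) < p_true

# Add random measurement error to the predictors before fitting.
X_obs = np.column_stack([T + 0.2 * rng.normal(size=n),
                         RH + 0.2 * rng.normal(size=n),
                         W + 0.2 * rng.normal(size=n)])
model = LogisticRegression().fit(X_obs, y)
p_hat = model.predict_proba(X_obs)[:, 1]

def scores(prob, obs, threshold):
    pred = prob >= threshold
    hits = np.sum(pred & obs); misses = np.sum(~pred & obs)
    false_alarms = np.sum(pred & ~obs); correct_neg = np.sum(~pred & ~obs)
    pc = (hits + correct_neg) / obs.size
    hkd = hits / (hits + misses) - false_alarms / (false_alarms + correct_neg)
    return pc, hkd

for thr in (0.5, y.mean()):          # 0.5 vs. climatological frequency
    pc, hkd = scores(p_hat, y, thr)
    print(f"threshold {thr:.2f}: PC={pc:.3f}, HKD={hkd:.3f}")
```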
A Unified Probabilistic Framework for Dose–Response Assessment of Human Health Effects
Slob, Wout
2015-01-01
Background: When chemical health hazards have been identified, probabilistic dose–response assessment (“hazard characterization”) quantifies uncertainty and/or variability in toxicity as a function of human exposure. Existing probabilistic approaches differ for different types of endpoints or modes-of-action, lacking a unifying framework. Objectives: We developed a unified framework for probabilistic dose–response assessment. Methods: We established a framework based on four principles: a) individual and population dose responses are distinct; b) dose–response relationships for all (including quantal) endpoints can be recast as relating to an underlying continuous measure of response at the individual level; c) for effects relevant to humans, “effect metrics” can be specified to define “toxicologically equivalent” sizes for this underlying individual response; and d) dose–response assessment requires making adjustments and accounting for uncertainty and variability. We then derived a step-by-step probabilistic approach for dose–response assessment of animal toxicology data similar to how nonprobabilistic reference doses are derived, illustrating the approach with example non-cancer and cancer datasets. Results: Probabilistically derived exposure limits are based on estimating a “target human dose” (HDMI), which requires risk management–informed choices for the magnitude (M) of individual effect being protected against, the remaining incidence (I) of individuals with effects ≥ M in the population, and the percent confidence. In the example datasets, probabilistically derived 90% confidence intervals for HDMI values span a 40- to 60-fold range, where I = 1% of the population experiences ≥ M = 1%–10% effect sizes. Conclusions: Although some implementation challenges remain, this unified probabilistic framework can provide substantially more complete and transparent characterization of chemical hazards and support better-informed risk management decisions. Citation: Chiu WA, Slob W. 2015. A unified probabilistic framework for dose–response assessment of human health effects. Environ Health Perspect 123:1241–1254; http://dx.doi.org/10.1289/ehp.1409385 PMID:26006063
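The general idea of a probabilistically derived exposure limit can be illustrated with a deliberately simplified Monte Carlo sketch: lognormal uncertainty in a benchmark dose and in adjustment factors is propagated to a confidence interval on a target human dose. The distributions and factors below are invented and do not reproduce the framework's actual derivation.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100_000

# Starting point: an animal benchmark dose (mg/kg-day) for an effect of size M,
# with its own estimation uncertainty. All numbers are invented for illustration.
bmd_animal = rng.lognormal(mean=np.log(10.0), sigma=0.3, size=n)

# Uncertain adjustment/variability factors (interspecies scaling, and human
# variability covering incidence I of the population).
af_interspecies = rng.lognormal(mean=np.log(4.0), sigma=0.4, size=n)
af_intraspecies = rng.lognormal(mean=np.log(10.0), sigma=0.5, size=n)

hd_mi = bmd_animal / (af_interspecies * af_intraspecies)

lo, med, hi = np.percentile(hd_mi, [5, 50, 95])
print(f"HDMI median {med:.3f} mg/kg-day, 90% CI ({lo:.3f}, {hi:.3f}), "
      f"spread ~{hi / lo:.0f}-fold")
```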
A Generalized Measurement Model to Quantify Health: The Multi-Attribute Preference Response Model
Krabbe, Paul F. M.
2013-01-01
After 40 years of deriving metric values for health status or health-related quality of life, the effective quantification of subjective health outcomes is still a challenge. Here, two of the best measurement tools, the discrete choice and the Rasch model, are combined to create a new model for deriving health values. First, existing techniques to value health states are briefly discussed followed by a reflection on the recent revival of interest in patients’ experience with regard to their possible role in health measurement. Subsequently, three basic principles for valid health measurement are reviewed, namely unidimensionality, interval level, and invariance. In the main section, the basic operation of measurement is then discussed in the framework of probabilistic discrete choice analysis (random utility model) and the psychometric Rasch model. It is then shown how combining the main features of these two models yields an integrated measurement model, called the multi-attribute preference response (MAPR) model, which is introduced here. This new model transforms subjective individual rank data into a metric scale using responses from patients who have experienced certain health states. Its measurement mechanism largely prevents biases such as adaptation and coping. Several extensions of the MAPR model are presented. The MAPR model can be applied to a wide range of research problems. If extended with the self-selection of relevant health domains for the individual patient, this model will be more valid than existing valuation techniques. PMID:24278141
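A minimal sketch of the discrete choice (random utility) half of such a model: paired health-state choices simulated under a logit rule are fitted by a conditional-logit-style regression to recover values on a common scale, with the mildest state fixed as reference. The states, values, and fitting shortcut are illustrative assumptions, not the MAPR model itself.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)
states = ["A", "B", "C", "D"]          # A fixed as the reference state (value 0)
true_values = {"A": 0.0, "B": -0.6, "C": -1.2, "D": -2.0}   # invented

# Simulate respondents choosing the better of two health states (logit noise).
rows, choices = [], []
for _ in range(2000):
    i, j = rng.choice(states, size=2, replace=False)
    p_i = 1.0 / (1.0 + np.exp(-(true_values[i] - true_values[j])))
    chose_i = rng.random() < p_i
    x = np.zeros(len(states) - 1)       # columns for B, C, D (A dropped)
    for s, sign in ((i, 1.0), (j, -1.0)):
        if s != "A":
            x[states.index(s) - 1] += sign
    rows.append(x)
    choices.append(int(chose_i))

# Conditional-logit style fit: value differences enter a logistic model without intercept.
fit = LogisticRegression(fit_intercept=False, C=1e6).fit(np.array(rows), choices)
for s, v in zip(states[1:], fit.coef_[0]):
    print(f"estimated value of {s} relative to A: {v:.2f}")
```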
NASA Technical Reports Server (NTRS)
Boyce, L.
1992-01-01
A probabilistic general material strength degradation model has been developed for structural components of aerospace propulsion systems subjected to diverse random effects. The model has been implemented in two FORTRAN programs, PROMISS (Probabilistic Material Strength Simulator) and PROMISC (Probabilistic Material Strength Calibrator). PROMISS calculates the random lifetime strength of an aerospace propulsion component due to as many as eighteen diverse random effects. Results are presented in the form of probability density functions and cumulative distribution functions of lifetime strength. PROMISC calibrates the model by calculating the values of empirical material constants.
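A generic Monte Carlo sketch of what such a simulator produces, namely the distribution of lifetime strength under random degradation effects, is given below; the multiplicative form, distributions, and exponents are invented for illustration and are not the PROMISS formulation or calibration.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 50_000

# Illustrative degradation form: each random "effect" (e.g., temperature,
# fatigue cycles) multiplies strength down from its pristine value.
def degradation(level, ultimate, exponent):
    return (1.0 - level / ultimate) ** exponent

temperature = rng.normal(900.0, 50.0, n)         # invented units and spread
cycles = rng.lognormal(np.log(1e4), 0.4, n)

s0 = 100.0                                        # pristine strength (invented)
strength = (s0
            * degradation(temperature, ultimate=1500.0, exponent=0.25)
            * degradation(cycles, ultimate=1e6, exponent=0.10))

# Empirical CDF of lifetime strength, the kind of curve such a simulator reports.
for q in (0.01, 0.10, 0.50, 0.90):
    print(f"P(strength <= {np.quantile(strength, q):6.1f}) = {q:.2f}")
```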
Buesing, Lars; Bill, Johannes; Nessler, Bernhard; Maass, Wolfgang
2011-01-01
The organization of computations in networks of spiking neurons in the brain is still largely unknown, in particular in view of the inherently stochastic features of their firing activity and the experimentally observed trial-to-trial variability of neural systems in the brain. In principle there exists a powerful computational framework for stochastic computations, probabilistic inference by sampling, which can explain a large number of macroscopic experimental data in neuroscience and cognitive science. But it has turned out to be surprisingly difficult to create a link between these abstract models for stochastic computations and more detailed models of the dynamics of networks of spiking neurons. Here we create such a link and show that under some conditions the stochastic firing activity of networks of spiking neurons can be interpreted as probabilistic inference via Markov chain Monte Carlo (MCMC) sampling. Since common methods for MCMC sampling in distributed systems, such as Gibbs sampling, are inconsistent with the dynamics of spiking neurons, we introduce a different approach based on non-reversible Markov chains that is able to reflect inherent temporal processes of spiking neuronal activity through a suitable choice of random variables. We propose a neural network model and show by a rigorous theoretical analysis that its neural activity implements MCMC sampling of a given distribution, both for the case of discrete and continuous time. This provides a step towards closing the gap between abstract functional models of cortical computation and more detailed models of networks of spiking neurons. PMID:22096452
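The basic "neural sampling" idea can be sketched with binary units whose stochastic updates leave a Boltzmann-like distribution invariant. The sketch below uses a plain Gibbs sweep for clarity; the paper's point is precisely that spiking dynamics call for non-reversible chains instead. Weights and biases are invented.

```python
import numpy as np

rng = np.random.default_rng(5)

# Target distribution p(z) proportional to exp(0.5 z'Wz + b'z) over binary states z.
W = np.array([[0.0, 1.5, -1.0],
              [1.5, 0.0,  0.5],
              [-1.0, 0.5, 0.0]])     # symmetric, zero diagonal
b = np.array([-0.2, 0.1, 0.0])

def gibbs_sample(n_sweeps=20000, burn_in=2000):
    z = rng.integers(0, 2, size=3).astype(float)
    samples = []
    for t in range(n_sweeps):
        for k in range(3):
            # Each unit "fires" with a sigmoid probability of its local input.
            u = W[k] @ z + b[k]
            z[k] = float(rng.random() < 1.0 / (1.0 + np.exp(-u)))
        if t >= burn_in:
            samples.append(z.copy())
    return np.array(samples)

samples = gibbs_sample()
print("marginal firing probabilities:", samples.mean(axis=0).round(3))
print("P(z1=1, z2=1):", np.mean((samples[:, 0] == 1) & (samples[:, 1] == 1)).round(3))
```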
Friston, Karl J.; Dolan, Raymond J.
2017-01-01
Normative models of human cognition often appeal to Bayesian filtering, which provides optimal online estimates of unknown or hidden states of the world, based on previous observations. However, in many cases it is necessary to optimise beliefs about sequences of states rather than just the current state. Importantly, Bayesian filtering and sequential inference strategies make different predictions about beliefs and subsequent choices, rendering them behaviourally dissociable. Taking data from a probabilistic reversal task we show that subjects’ choices provide strong evidence that they are representing short sequences of states. Between-subject measures of this implicit sequential inference strategy had a neurobiological underpinning and correlated with grey matter density in prefrontal and parietal cortex, as well as the hippocampus. Our findings provide, to our knowledge, the first evidence for sequential inference in human cognition, and by exploiting between-subject variation in this measure we provide pointers to its neuronal substrates. PMID:28486504
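For contrast with the sequential-inference account, a minimal Bayesian filtering sketch for a probabilistic reversal task is shown below: two hidden states, noisy reward feedback, and a small per-trial reversal hazard. The parameters and trial sequence are invented.

```python
import numpy as np

p_reward_correct = 0.8      # probability of reward when choosing the "good" option
hazard = 0.1                # per-trial probability that the good option reverses
belief = np.array([0.5, 0.5])   # P(option 0 is good), P(option 1 is good)

def filter_update(belief, chosen, rewarded):
    # Predict step: allow for a possible (unobserved) reversal.
    belief = (1 - hazard) * belief + hazard * belief[::-1]
    # Update step: likelihood of the observed feedback for the chosen option.
    lik = np.empty(2)
    for good in (0, 1):
        p = p_reward_correct if chosen == good else 1 - p_reward_correct
        lik[good] = p if rewarded else 1 - p
    post = lik * belief
    return post / post.sum()

# A short invented trial sequence: (chosen option, rewarded?)
trials = [(0, True), (0, True), (0, False), (0, False), (1, True), (1, True)]
for chosen, rewarded in trials:
    belief = filter_update(belief, chosen, rewarded)
    print(f"chose {chosen}, reward={rewarded}:  P(option 0 good) = {belief[0]:.2f}")
```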
Reasoning in Reference Games: Individual- vs. Population-Level Probabilistic Modeling
Franke, Michael; Degen, Judith
2016-01-01
Recent advances in probabilistic pragmatics have achieved considerable success in modeling speakers’ and listeners’ pragmatic reasoning as probabilistic inference. However, these models are usually applied to population-level data, and so implicitly suggest a homogeneous population without individual differences. Here we investigate potential individual differences in Theory-of-Mind related depth of pragmatic reasoning in so-called reference games that require drawing ad hoc Quantity implicatures of varying complexity. We show by Bayesian model comparison that a model that assumes a heterogeneous population is a better predictor of our data, especially for comprehension. We discuss the implications for the treatment of individual differences in probabilistic models of language use. PMID:27149675
Probabilistic graphs as a conceptual and computational tool in hydrology and water management
NASA Astrophysics Data System (ADS)
Schoups, Gerrit
2014-05-01
Originally developed in the fields of machine learning and artificial intelligence, probabilistic graphs constitute a general framework for modeling complex systems in the presence of uncertainty. The framework consists of three components: 1. Representation of the model as a graph (or network), with nodes depicting random variables in the model (e.g. parameters, states, etc), which are joined together by factors. Factors are local probabilistic or deterministic relations between subsets of variables, which, when multiplied together, yield the joint distribution over all variables. 2. Consistent use of probability theory for quantifying uncertainty, relying on basic rules of probability for assimilating data into the model and expressing unknown variables as a function of observations (via the posterior distribution). 3. Efficient, distributed approximation of the posterior distribution using general-purpose algorithms that exploit model structure encoded in the graph. These attributes make probabilistic graphs potentially useful as a conceptual and computational tool in hydrology and water management (and beyond). Conceptually, they can provide a common framework for existing and new probabilistic modeling approaches (e.g. by drawing inspiration from other fields of application), while computationally they can make probabilistic inference feasible in larger hydrological models. The presentation explores, via examples, some of these benefits.
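A tiny worked example of the representation: two binary variables, local factors whose product is the joint distribution, and a posterior obtained by brute-force enumeration after assimilating an observation. Real hydrological graphs would be far larger and would use message-passing algorithms rather than enumeration; the probabilities are invented.

```python
import itertools
import numpy as np

# Two binary variables: heavy rainfall R and high river discharge Q.
# Local factors (here simple conditional probability tables) multiply to the joint.
def f_rain(r):                 # prior on heavy rainfall
    return 0.3 if r else 0.7

def f_discharge(q, r):         # discharge given rainfall
    table = {(1, 1): 0.8, (0, 1): 0.2, (1, 0): 0.1, (0, 0): 0.9}
    return table[(q, r)]

# Joint distribution = product of factors (brute-force enumeration).
joint = {(r, q): f_rain(r) * f_discharge(q, r)
         for r, q in itertools.product((0, 1), repeat=2)}
assert np.isclose(sum(joint.values()), 1.0)

# Inference: assimilate the observation "high discharge" (q = 1) and
# compute the posterior over rainfall.
evidence = {rq: p for rq, p in joint.items() if rq[1] == 1}
z = sum(evidence.values())
posterior_rain = {r: p / z for (r, q), p in evidence.items()}
print("P(heavy rain | high discharge) =", round(posterior_rain[1], 3))
```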
Assessing Logo Programming among Jordanian Seventh Grade Students through Turtle Geometry
ERIC Educational Resources Information Center
Khasawneh, Amal A.
2009-01-01
The present study is concerned with assessing Logo programming experiences among seventh grade students. A formal multiple-choice test and five performance tasks were used to collect data. The results showed that students' performance was better than the score expected by the probabilistic laws, and that there was a very low correlation between their Logo…
Lost in search: (Mal-)adaptation to probabilistic decision environments in children and adults.
Betsch, Tilmann; Lehmann, Anne; Lindow, Stefanie; Lang, Anna; Schoemann, Martin
2016-02-01
Adaptive decision making in probabilistic environments requires individuals to use probabilities as weights in predecisional information searches and/or when making subsequent choices. Within a child-friendly computerized environment (Mousekids), we tracked 205 children's (105 children 5-6 years of age and 100 children 9-10 years of age) and 103 adults' (age range: 21-22 years) search behaviors and decisions under different probability dispersions (.17, .33, .83 vs. .50, .67, .83) and constraint conditions (instructions to limit search: yes vs. no). All age groups limited their depth of search when instructed to do so and when probability dispersion was high (range: .17-.83). Unlike adults, children failed to use probabilities as weights for their searches, which were largely not systematic. When examining choices, however, elementary school children (unlike preschoolers) systematically used probabilities as weights in their decisions. This suggests that an intuitive understanding of probabilities and the capacity to use them as weights during integration is not a sufficient condition for applying simple selective search strategies that place one's focus on weight distributions. PsycINFO Database Record (c) 2016 APA, all rights reserved.
Denovan, Andrew; Dagnall, Neil; Drinkwater, Kenneth; Parker, Andrew; Clough, Peter
2017-01-01
The present study assessed the degree to which probabilistic reasoning performance and thinking style influenced perception of risk and self-reported levels of terrorism-related behavior change. A sample of 263 respondents, recruited via convenience sampling, completed a series of measures comprising probabilistic reasoning tasks (perception of randomness, base rate, probability, and conjunction fallacy), the Reality Testing subscale of the Inventory of Personality Organization (IPO-RT), the Domain-Specific Risk-Taking Scale, and a terrorism-related behavior change scale. Structural equation modeling examined three progressive models. Firstly, the Independence Model assumed that probabilistic reasoning, perception of risk and reality testing independently predicted terrorism-related behavior change. Secondly, the Mediation Model supposed that probabilistic reasoning and reality testing correlated, and indirectly predicted terrorism-related behavior change through perception of risk. Lastly, the Dual-Influence Model proposed that probabilistic reasoning indirectly predicted terrorism-related behavior change via perception of risk, independent of reality testing. Results indicated that performance on probabilistic reasoning tasks most strongly predicted perception of risk, and preference for an intuitive thinking style (measured by the IPO-RT) best explained terrorism-related behavior change. The combination of perception of risk with probabilistic reasoning ability in the Dual-Influence Model enhanced the predictive power of the analytical-rational route, with conjunction fallacy having a significant indirect effect on terrorism-related behavior change via perception of risk. The Dual-Influence Model possessed superior fit and reported similar predictive relations between intuitive-experiential and analytical-rational routes and terrorism-related behavior change. The discussion critically examines these findings in relation to dual-processing frameworks. This includes considering the limitations of current operationalisations and recommendations for future research that align outcomes and subsequent work more closely to specific dual-process models. PMID:29062288
Effects of delay and probability combinations on discounting in humans
Cox, David J.; Dallery, Jesse
2017-01-01
To determine discount rates, researchers typically adjust the amount of an immediate or certain option relative to a delayed or uncertain option. Because this adjusting amount method can be relatively time consuming, researchers have developed more efficient procedures. One such procedure is a 5-trial adjusting delay procedure, which measures the delay at which an amount of money loses half of its value (e.g., $1000 is valued at $500 with a 10-year delay to its receipt). Experiment 1 (n = 212) used 5-trial adjusting delay or probability tasks to measure discounting of delayed losses, probabilistic gains, and probabilistic losses. Experiment 2 (n = 98) assessed combined probabilistic and delayed alternatives. In both experiments, we compared results from 5-trial adjusting delay or probability tasks to traditional adjusting amount procedures. Results suggest both procedures produced similar rates of probability and delay discounting in six out of seven comparisons. A magnitude effect consistent with previous research was observed for probabilistic gains and losses, but not for delayed losses. Results also suggest that delay and probability interact to determine the value of money. Five-trial methods may allow researchers to assess discounting more efficiently as well as study more complex choice scenarios. PMID:27498073
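The logic of a 5-trial adjusting delay procedure can be sketched as a short binary search over a ladder of delays, with each choice between an immediate half-amount and the delayed full amount moving the next offered delay up or down. The delay ladder and the simulated hyperbolic-discounting participant below are illustrative assumptions, not the published task parameters.

```python
import numpy as np

# A ladder of 31 candidate delays (~1 hour to ~25 years, in days); with 31 rungs,
# a 5-step binary search pins down a single rung, hence "5-trial".
delays = np.logspace(np.log10(1 / 24), np.log10(25 * 365), 31)

def simulated_choice(delay, amount=1000.0, k=0.01):
    """Hypothetical participant with hyperbolic discounting: prefers the delayed
    $1000 over an immediate $500 whenever its discounted value is still higher."""
    return amount / (1.0 + k * delay) > amount / 2.0

def five_trial_adjusting_delay(choose):
    lo, hi = 0, len(delays) - 1
    idx = (lo + hi) // 2
    for _ in range(5):
        if choose(delays[idx]):
            lo = idx + 1          # delayed option still preferred: try a longer delay
        else:
            hi = idx - 1          # immediate option preferred: try a shorter delay
        if lo > hi:
            break
        idx = (lo + hi) // 2
    return delays[idx]            # approximate delay at which the amount loses half its value

ed50 = five_trial_adjusting_delay(simulated_choice)
print(f"estimated ED50 ~ {ed50:.0f} days, implied k ~ {1.0 / ed50:.4f} per day (true k = 0.01)")
```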
Sjöberg, C; Ahnesjö, A
2013-06-01
Label fusion multi-atlas approaches for image segmentation can give better segmentation results than single atlas methods. We present a multi-atlas label fusion strategy based on probabilistic weighting of distance maps. Relationships between image similarities and segmentation similarities are estimated in a learning phase and used to derive fusion weights that are proportional to the probability for each atlas to improve the segmentation result. The method was tested using a leave-one-out strategy on a database of 21 pre-segmented prostate patients for different image registrations combined with different image similarity scorings. The probabilistic weighting yields results that are equal or better compared to both fusion with equal weights and results using the STAPLE algorithm. Results from the experiments demonstrate that label fusion by weighted distance maps is feasible, and that probabilistic weighted fusion improves segmentation quality more the stronger the individual atlas segmentation quality depends on the corresponding registered image similarity. The regions used for evaluation of the image similarity measures were found to be more important than the choice of similarity measure. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
Rapid target foraging with reach or gaze: The hand looks further ahead than the eye
2017-01-01
Real-world tasks typically consist of a series of target-directed actions and often require choices about which targets to act on and in what order. Such choice behavior can be assessed from an optimal foraging perspective whereby target selection is shaped by a balance between rewards and costs. Here we evaluated such decision-making in a rapid movement foraging task. On a given trial, participants were presented with 15 targets of varying size and value and were instructed to harvest as much reward as possible by either moving a handle to the targets (hand task) or by briefly fixating them (eye task). The short trial duration enabled participants to harvest about half the targets, ensuring that total reward was due to choice behavior. We developed a probabilistic model to predict target-by-target harvesting choices that considered the rewards and movement-related costs (i.e., target distance and size) associated with the current target as well as future targets. In the hand task, in comparison to the eye task, target choice was more strongly influenced by movement-related costs and took into account a greater number of future targets, consistent with the greater costs associated with arm movement. In both tasks, participants exhibited near-optimal behaviour and in a constrained version of the hand task in which choices could only be based on target positions, participants consistently chose among the shortest movement paths. Our results demonstrate that people can rapidly and effectively integrate values and movement-related costs associated with current and future targets when sequentially harvesting targets. PMID:28683138
Evaluating bacterial gene-finding HMM structures as probabilistic logic programs.
Mørk, Søren; Holmes, Ian
2012-03-01
Probabilistic logic programming offers a powerful way to describe and evaluate structured statistical models. To investigate the practicality of probabilistic logic programming for structure learning in bioinformatics, we undertook a simplified bacterial gene-finding benchmark in PRISM, a probabilistic dialect of Prolog. We evaluate Hidden Markov Model structures for bacterial protein-coding gene potential, including a simple null model structure, three structures based on existing bacterial gene finders and two novel model structures. We test standard versions as well as ADPH length modeling and three-state versions of the five model structures. The models are all represented as probabilistic logic programs and evaluated using the PRISM machine learning system in terms of statistical information criteria and gene-finding prediction accuracy, in two bacterial genomes. Neither of our implementations of the two currently most used model structures is best performing in terms of statistical information criteria or prediction performance, suggesting that better-fitting models might be achievable. The source code of all PRISM models, data and additional scripts are freely available for download at: http://github.com/somork/codonhmm. Supplementary data are available at Bioinformatics online.
Integration of Advanced Probabilistic Analysis Techniques with Multi-Physics Models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cetiner, Mustafa Sacit; none,; Flanagan, George F.
2014-07-30
An integrated simulation platform that couples probabilistic analysis-based tools with model-based simulation tools can provide valuable insights for reactive and proactive responses to plant operating conditions. The objective of this work is to demonstrate the benefits of a partial implementation of the Small Modular Reactor (SMR) Probabilistic Risk Assessment (PRA) Detailed Framework Specification through the coupling of advanced PRA capabilities and accurate multi-physics plant models. Coupling a probabilistic model with a multi-physics model will aid in design, operations, and safety by providing a more accurate understanding of plant behavior. This represents the first attempt at actually integrating these two types of analyses for a control system used for operations, on a faster than real-time basis. This report documents the development of the basic communication capability to exchange data with the probabilistic model using Reliability Workbench (RWB) and the multi-physics model using Dymola. The communication pathways from injecting a fault (i.e., failing a component) to the probabilistic and multi-physics models were successfully completed. This first version was tested with prototypic models represented in both RWB and Modelica. First, a simple event tree/fault tree (ET/FT) model was created to develop the software code to implement the communication capabilities between the dynamic-link library (dll) and RWB. A program, written in C#, successfully communicates faults to the probabilistic model through the dll. A systems model of the Advanced Liquid-Metal Reactor–Power Reactor Inherently Safe Module (ALMR-PRISM) design developed under another DOE project was upgraded using Dymola to include proper interfaces to allow data exchange with the control application (ConApp). A program, written in C+, successfully communicates faults to the multi-physics model. The results of the example simulation were successfully plotted.
Probabilistic structural analysis methods of hot engine structures
NASA Technical Reports Server (NTRS)
Chamis, C. C.; Hopkins, D. A.
1989-01-01
Development of probabilistic structural analysis methods for hot engine structures is a major activity at Lewis Research Center. Recent activities have focused on extending the methods to include the combined uncertainties in several factors on structural response. This paper briefly describes recent progress on composite load spectra models, probabilistic finite element structural analysis, and probabilistic strength degradation modeling. Progress is described in terms of fundamental concepts, computer code development, and representative numerical results.
Application of Probabilistic Analysis to Aircraft Impact Dynamics
NASA Technical Reports Server (NTRS)
Lyle, Karen H.; Padula, Sharon L.; Stockwell, Alan E.
2003-01-01
Full-scale aircraft crash simulations performed with nonlinear, transient dynamic, finite element codes can incorporate structural complexities such as: geometrically accurate models; human occupant models; and advanced material models to include nonlinear stress-strain behaviors, laminated composites, and material failure. Validation of these crash simulations is difficult due to a lack of sufficient information to adequately determine the uncertainty in the experimental data and the appropriateness of modeling assumptions. This paper evaluates probabilistic approaches to quantify the uncertainty in the simulated responses. Several criteria are used to determine that a response surface method is the most appropriate probabilistic approach. The work is extended to compare optimization results with and without probabilistic constraints.
How social information affects information search and choice in probabilistic inferences.
Puskaric, Marin; von Helversen, Bettina; Rieskamp, Jörg
2018-01-01
When making decisions, people are often exposed to relevant information stemming from qualitatively different sources. For instance, when making a choice between two alternatives people can rely on the advice of other people (i.e., social information) or search for factual information about the alternatives (i.e., non-social information). Prior research in categorization has shown that social information is given special attention when both social and non-social information is available, even when the social information has no additional informational value. The goal of the current work is to investigate whether framing information as social or non-social also influences information search and choice in probabilistic inferences. In a first study, we found that framing cues (i.e., the information used to make a decision) with medium validity as social increased the probability that they were searched for compared to a task where the same cues were framed as non-social information, but did not change the strategy people relied on. A second and a third study showed that framing a cue with high validity as social information facilitated learning to rely on a non-compensatory decision strategy. Overall, the results suggest that social in comparison to non-social information is given more attention and is learned faster than non-social information. Copyright © 2017 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Ines, A. V. M.; Han, E.; Baethgen, W.
2017-12-01
Advances in seasonal climate forecasts (SCFs) during the past decades have brought great potential to improve agricultural climate risk managements associated with inter-annual climate variability. In spite of popular uses of crop simulation models in addressing climate risk problems, the models cannot readily take seasonal climate predictions issued in the format of tercile probabilities of most likely rainfall categories (i.e, below-, near- and above-normal). When a skillful SCF is linked with the crop simulation models, the informative climate information can be further translated into actionable agronomic terms and thus better support strategic and tactical decisions. In other words, crop modeling connected with a given SCF allows to simulate "what-if" scenarios with different crop choices or management practices and better inform the decision makers. In this paper, we present a decision support tool, called CAMDT (Climate Agriculture Modeling and Decision Tool), which seamlessly integrates probabilistic SCFs to DSSAT-CSM-Rice model to guide decision-makers in adopting appropriate crop and agricultural water management practices for given climatic conditions. The CAMDT has a functionality to disaggregate a probabilistic SCF into daily weather realizations (either a parametric or non-parametric disaggregation method) and to run DSSAT-CSM-Rice with the disaggregated weather realizations. The convenient graphical user-interface allows easy implementation of several "what-if" scenarios for non-technical users and visualize the results of the scenario runs. In addition, the CAMDT also translates crop model outputs to economic terms once the user provides expected crop price and cost. The CAMDT is a practical tool for real-world applications, specifically for agricultural climate risk management in the Bicol region, Philippines, having a great flexibility for being adapted to other crops or regions in the world. CAMDT GitHub: https://github.com/Agro-Climate/CAMDT
Multi-Agent simulation of generation capacity expansion decisions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Botterud, A.; Mahalik, M.; Conzelmann, G.
2008-01-01
In this paper, we use a multi-agent simulation model, EMCAS, to analyze generation expansion in the Iberian electricity market. The expansion model simulates generation investment decisions of decentralized generating companies (GenCos) interacting in a complex, multidimensional environment. A probabilistic dispatch algorithm calculates prices and profits for new candidate units in different future states of the system. Uncertainties in future load, hydropower conditions, and competitors' actions are represented in a scenario tree, and decision analysis is used to identify the optimal expansion decision for each individual GenCo. We run the model using detailed data for the Iberian market. In a scenario analysis, we look at the impact of market design variables, such as the energy price cap and carbon emission prices. We also analyze how market concentration and GenCos' risk preferences influence the timing and choice of new generating capacity.
A Model-Based Prognostics Approach Applied to Pneumatic Valves
NASA Technical Reports Server (NTRS)
Daigle, Matthew J.; Goebel, Kai
2011-01-01
Within the area of systems health management, the task of prognostics centers on predicting when components will fail. Model-based prognostics exploits domain knowledge of the system, its components, and how they fail by casting the underlying physical phenomena in a physics-based model that is derived from first principles. Uncertainty cannot be avoided in prediction, therefore, algorithms are employed that help in managing these uncertainties. The particle filtering algorithm has become a popular choice for model-based prognostics due to its wide applicability, ease of implementation, and support for uncertainty management. We develop a general model-based prognostics methodology within a robust probabilistic framework using particle filters. As a case study, we consider a pneumatic valve from the Space Shuttle cryogenic refueling system. We develop a detailed physics-based model of the pneumatic valve, and perform comprehensive simulation experiments to illustrate our prognostics approach and evaluate its effectiveness and robustness. The approach is demonstrated using historical pneumatic valve data from the refueling system.
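A stripped-down particle filter sketch of the prognostics idea: particles carry a hidden damage level and an unknown wear rate, are weighted by noisy measurements, and are then rolled forward to a failure threshold to give a remaining-useful-life distribution. The linear degradation model and noise levels are invented and are far simpler than the pneumatic valve physics.

```python
import numpy as np

rng = np.random.default_rng(6)
n_particles = 2000
dt, fail_threshold = 1.0, 5.0

# Hidden truth: damage grows linearly with an unknown wear rate.
true_rate = 0.12
true_damage = 0.0

# Particles carry (damage, wear_rate); broad initial uncertainty on the rate.
damage = np.zeros(n_particles)
rate = rng.uniform(0.0, 0.5, n_particles)

for t in range(1, 21):
    # Simulate the real system and a noisy measurement of damage.
    true_damage += true_rate * dt
    z = true_damage + rng.normal(0.0, 0.1)

    # Predict: propagate each particle with process noise.
    damage += rate * dt + rng.normal(0.0, 0.02, n_particles)
    rate += rng.normal(0.0, 0.005, n_particles)

    # Update: weight by measurement likelihood and resample.
    w = np.exp(-0.5 * ((z - damage) / 0.1) ** 2)
    w /= w.sum()
    idx = rng.choice(n_particles, size=n_particles, p=w)
    damage, rate = damage[idx], rate[idx]

# Prognosis: roll each particle forward until it crosses the failure threshold.
rul = (fail_threshold - damage) / np.maximum(rate, 1e-6)
print(f"true RUL ~ {(fail_threshold - true_damage) / true_rate:.1f} steps; "
      f"predicted RUL median {np.median(rul):.1f}, "
      f"90% interval ({np.percentile(rul, 5):.1f}, {np.percentile(rul, 95):.1f})")
```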
Temporal Causality Analysis of Sentiment Change in a Cancer Survivor Network
Bui, Ngot; Yen, John; Honavar, Vasant
2017-01-01
Online health communities constitute a useful source of information and social support for patients. American Cancer Society’s Cancer Survivor Network (CSN), a 173,000-member community, is the largest online network for cancer patients, survivors, and caregivers. A discussion thread in CSN is often initiated by a cancer survivor seeking support from other members of CSN. Discussion threads are multi-party conversations that often provide a source of social support e.g., by bringing about a change of sentiment from negative to positive on the part of the thread originator. While previous studies regarding cancer survivors have shown that members of an online health community derive benefits from their participation in such communities, causal accounts of the factors that contribute to the observed benefits have been lacking. We introduce a novel framework to examine the temporal causality of sentiment dynamics in the CSN. We construct a Probabilistic Computation Tree Logic representation and a corresponding probabilistic Kripke structure to represent and reason about the changes in sentiments of posts in a thread over time. We use a sentiment classifier trained using machine learning on a set of posts manually tagged with sentiment labels to classify posts as expressing either positive or negative sentiment. We analyze the probabilistic Kripke structure to identify the prima facie causes of sentiment change on the part of the thread originators in the CSN forum and their significance. We find that the sentiment of replies appears to causally influence the sentiment of the thread originator. Our experiments also show that the conclusions are robust with respect to the choice of the (i) classification threshold of the sentiment classifier; (ii) and the choice of the specific sentiment classifier used. We also extend the basic framework for temporal causality analysis to incorporate the uncertainty in the states of the probabilistic Kripke structure resulting from the use of an imperfect state transducer (in our case, the sentiment classifier). Our analysis of temporal causality of CSN sentiment dynamics offers new insights that the designers, managers and moderators of an online community such as CSN can utilize to facilitate and enhance the interactions so as to better meet the social support needs of the CSN participants. The proposed methodology for analysis of temporal causality has broad applicability in a variety of settings where the dynamics of the underlying system can be modeled in terms of state variables that change in response to internal or external inputs. PMID:29399599
[Rational choice, prediction, and medical decision. Contribution of severity scores].
Bizouarn, P; Fiat, E; Folscheid, D
2001-11-01
The aim of this study was to determine what type of representation the physician adopts concerning uncertainty about the future of critically ill patients, in the context of preoperative evaluation and intensive care medicine, and to explore, through the representation of the patient's health status, the different choices available to the physician. The role played by severity classification systems in medical decision-making under probabilistic uncertainty was assessed in the light of theories of rational behaviour. In this context, a form of medical rationality needs to be identified that goes beyond the instrumental status of the objective and/or subjective constructions of rational choice theories and reaches a dimension in which both means and expected ends are taken into account.
Dopamine neurons learn relative chosen value from probabilistic rewards
Lak, Armin; Stauffer, William R; Schultz, Wolfram
2016-01-01
Economic theories posit reward probability as one of the factors defining reward value. Individuals learn the value of cues that predict probabilistic rewards from experienced reward frequencies. Building on the notion that responses of dopamine neurons increase with reward probability and expected value, we asked how dopamine neurons in monkeys acquire this value signal that may represent an economic decision variable. We found in a Pavlovian learning task that reward probability-dependent value signals arose from experienced reward frequencies. We then assessed neuronal response acquisition during choices among probabilistic rewards. Here, dopamine responses became sensitive to the value of both chosen and unchosen options. Both experiments also showed novelty responses of dopamine neurons that decreased as learning advanced. These results show that dopamine neurons acquire predictive value signals from the frequency of experienced rewards. This flexible and fast signal reflects a specific decision variable and could update neuronal decision mechanisms. DOI: http://dx.doi.org/10.7554/eLife.18044.001 PMID:27787196
Structural and parametric uncertainty quantification in cloud microphysics parameterization schemes
NASA Astrophysics Data System (ADS)
van Lier-Walqui, M.; Morrison, H.; Kumjian, M. R.; Prat, O. P.; Martinkus, C.
2017-12-01
Atmospheric model parameterization schemes employ approximations to represent the effects of unresolved processes. These approximations are a source of error in forecasts, caused in part by considerable uncertainty about the optimal value of parameters within each scheme -- parametric uncertainty. Furthermore, there is uncertainty regarding the best choice of the overarching structure of the parameterization scheme -- structural uncertainty. Parameter estimation can constrain the first, but may struggle with the second because structural choices are typically discrete. We address this problem in the context of cloud microphysics parameterization schemes by creating a flexible framework wherein structural and parametric uncertainties can be simultaneously constrained. Our scheme makes no assumptions about drop size distribution shape or the functional form of parametrized process rate terms. Instead, these uncertainties are constrained by observations using a Markov Chain Monte Carlo sampler within a Bayesian inference framework. Our scheme, the Bayesian Observationally-constrained Statistical-physical Scheme (BOSS), has flexibility to predict various sets of prognostic drop size distribution moments as well as varying complexity of process rate formulations. We compare idealized probabilistic forecasts from versions of BOSS with varying levels of structural complexity. This work has applications in ensemble forecasts with model physics uncertainty, data assimilation, and cloud microphysics process studies.
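A toy version of the Bayesian machinery involved, estimating the parameters of a single power-law process-rate term from synthetic observations with a random-walk Metropolis sampler, is sketched below; BOSS itself constrains full microphysical process rates and multiple prognostic moments.

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic "observations" of a process rate assumed to follow a power law in a
# prognostic moment q (rate = a * q**b); a, b and the noise level are invented.
a_true, b_true, sigma = 2.0, 1.5, 0.2
q = np.linspace(0.1, 2.0, 40)
obs = a_true * q ** b_true * np.exp(rng.normal(0.0, sigma, q.size))

def log_post(theta):
    a, b = theta
    if a <= 0:
        return -np.inf
    resid = np.log(obs) - (np.log(a) + b * np.log(q))
    return -0.5 * np.sum((resid / sigma) ** 2)      # flat priors for simplicity

# Random-walk Metropolis sampler.
theta = np.array([1.0, 1.0])
lp = log_post(theta)
chain = []
for _ in range(20000):
    prop = theta + rng.normal(0.0, 0.05, 2)
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    chain.append(theta.copy())

chain = np.array(chain[5000:])                      # discard burn-in
print("posterior mean a=%.2f, b=%.2f (truth 2.0, 1.5)" % tuple(chain.mean(axis=0)))
```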
Effects of different feedback types on information integration in repeated monetary gambles
Haffke, Peter; Hübner, Ronald
2015-01-01
Most models of risky decision making assume that all relevant information is taken into account (e.g., von Neumann and Morgenstern, 1944; Kahneman and Tversky, 1979). However, there are also some models supposing that only part of the information is considered (e.g., Brandstätter et al., 2006; Gigerenzer and Gaissmaier, 2011). To further investigate the amount of information that is usually used for decision making, and how the use depends on feedback, we conducted a series of three experiments in which participants choose between two lotteries and where no feedback, outcome feedback, and error feedback was provided, respectively. The results show that without feedback participants mostly chose the lottery with the higher winning probability, and largely ignored the potential gains. The same results occurred when the outcome of each decision was fed back. Only after presenting error feedback (i.e., signaling whether a choice was optimal or not), participants considered probabilities as well as gains, resulting in more optimal choices. We propose that outcome feedback was ineffective, because of its probabilistic and ambiguous nature. Participants improve information integration only if provided with a consistent and deterministic signal such as error feedback. PMID:25667576
NASA Technical Reports Server (NTRS)
Cruse, T. A.
1987-01-01
The objective is the development of several modular structural analysis packages capable of predicting the probabilistic response distribution for key structural variables such as maximum stress, natural frequencies, transient response, etc. The structural analysis packages are to include stochastic modeling of loads, material properties, geometry (tolerances), and boundary conditions. The solution is to be in terms of the cumulative probability of exceedance distribution (CDF) and confidence bounds. Two methods of probability modeling are to be included as well as three types of structural models - probabilistic finite-element method (PFEM); probabilistic approximate analysis methods (PAAM); and probabilistic boundary element methods (PBEM). The purpose in doing probabilistic structural analysis is to provide the designer with a more realistic ability to assess the importance of uncertainty in the response of a high performance structure. Probabilistic Structural Analysis Method (PSAM) tools will estimate structural safety and reliability, while providing the engineer with information on the confidence that should be given to the predicted behavior. Perhaps most critically, the PSAM results will directly provide information on the sensitivity of the design response to those variables which are seen to be uncertain.
NASA Technical Reports Server (NTRS)
Cruse, T. A.; Burnside, O. H.; Wu, Y.-T.; Polch, E. Z.; Dias, J. B.
1988-01-01
The objective is the development of several modular structural analysis packages capable of predicting the probabilistic response distribution for key structural variables such as maximum stress, natural frequencies, transient response, etc. The structural analysis packages are to include stochastic modeling of loads, material properties, geometry (tolerances), and boundary conditions. The solution is to be in terms of the cumulative probability of exceedance distribution (CDF) and confidence bounds. Two methods of probability modeling are to be included as well as three types of structural models - probabilistic finite-element method (PFEM); probabilistic approximate analysis methods (PAAM); and probabilistic boundary element methods (PBEM). The purpose in doing probabilistic structural analysis is to provide the designer with a more realistic ability to assess the importance of uncertainty in the response of a high performance structure. Probabilistic Structural Analysis Method (PSAM) tools will estimate structural safety and reliability, while providing the engineer with information on the confidence that should be given to the predicted behavior. Perhaps most critically, the PSAM results will directly provide information on the sensitivity of the design response to those variables which are seen to be uncertain.
The Dopaminergic Midbrain Encodes the Expected Certainty about Desired Outcomes.
Schwartenbeck, Philipp; FitzGerald, Thomas H B; Mathys, Christoph; Dolan, Ray; Friston, Karl
2015-10-01
Dopamine plays a key role in learning; however, its exact function in decision making and choice remains unclear. Recently, we proposed a generic model based on active (Bayesian) inference wherein dopamine encodes the precision of beliefs about optimal policies. Put simply, dopamine discharges reflect the confidence that a chosen policy will lead to desired outcomes. We designed a novel task to test this hypothesis, where subjects played a "limited offer" game in a functional magnetic resonance imaging experiment. Subjects had to decide how long to wait for a high offer before accepting a low offer, with the risk of losing everything if they waited too long. Bayesian model comparison showed that behavior strongly supported active inference, based on surprise minimization, over classical utility maximization schemes. Furthermore, midbrain activity, encompassing dopamine projection neurons, was accurately predicted by trial-by-trial variations in model-based estimates of precision. Our findings demonstrate that human subjects infer both optimal policies and the precision of those inferences, and thus support the notion that humans perform hierarchical probabilistic Bayesian inference. In other words, subjects have to infer both what they should do as well as how confident they are in their choices, where confidence may be encoded by dopaminergic firing. © The Author 2014. Published by Oxford University Press.
Probabilistic machine learning and artificial intelligence.
Ghahramani, Zoubin
2015-05-28
How can a machine learn from experience? Probabilistic modelling provides a framework for understanding what learning is, and has therefore emerged as one of the principal theoretical and practical approaches for designing machines that learn from data acquired through experience. The probabilistic framework, which describes how to represent and manipulate uncertainty about models and predictions, has a central role in scientific data analysis, machine learning, robotics, cognitive science and artificial intelligence. This Review provides an introduction to this framework, and discusses some of the state-of-the-art advances in the field, namely, probabilistic programming, Bayesian optimization, data compression and automatic model discovery.
Process for computing geometric perturbations for probabilistic analysis
Fitch, Simeon H. K. (Charlottesville, VA); Riha, David S. (San Antonio, TX); Thacker, Ben H. (San Antonio, TX)
2012-04-10
A method for computing geometric perturbations for probabilistic analysis. The probabilistic analysis is based on finite element modeling, in which uncertainties in the modeled system are represented by changes in the nominal geometry of the model, referred to as "perturbations". These changes are accomplished using displacement vectors, which are computed for each node of a region of interest and are based on mean-value coordinate calculations.
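A minimal sketch of the node-displacement idea follows, assuming precomputed displacement directions and a normally distributed perturbation magnitude; the actual patented procedure derives displacement vectors from mean-value coordinates, which is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical node coordinates for a region of interest (x, y, z per node)
nodes = np.array([[0.0, 0.0, 0.0],
                  [1.0, 0.0, 0.0],
                  [1.0, 1.0, 0.0],
                  [0.0, 1.0, 0.0]])

# Precomputed unit displacement vectors describing the perturbation shape
# (in the patented method these come from mean-value coordinate calculations;
# here they are simply made up for illustration)
direction = np.array([[0.0, 0.0, 1.0]] * 4)

# Sample a perturbation magnitude for each probabilistic realization and
# shift the nominal geometry accordingly before handing it to the FE solver.
for magnitude in rng.normal(0.0, 0.01, 3):
    perturbed = nodes + magnitude * direction
    print(magnitude, perturbed[0])
```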
Probabilistic modeling of discourse-aware sentence processing.
Dubey, Amit; Keller, Frank; Sturt, Patrick
2013-07-01
Probabilistic models of sentence comprehension are increasingly relevant to questions concerning human language processing. However, such models are often limited to syntactic factors. This restriction is unrealistic in light of experimental results suggesting interactions between syntax and other forms of linguistic information in human sentence processing. To address this limitation, this article introduces two sentence processing models that augment a syntactic component with information about discourse co-reference. The novel combination of probabilistic syntactic components with co-reference classifiers permits them to more closely mimic human behavior than existing models. The first model uses a deep model of linguistics, based in part on probabilistic logic, allowing it to make qualitative predictions on experimental data; the second model uses shallow processing to make quantitative predictions on a broad-coverage reading-time corpus. Copyright © 2013 Cognitive Science Society, Inc.
E-Area LLWF Vadose Zone Model: Probabilistic Model for Estimating Subsided-Area Infiltration Rates
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dyer, J.; Flach, G.
A probabilistic model employing a Monte Carlo sampling technique was developed in Python to generate statistical distributions of the upslope-intact-area to subsided-area ratio (Area_UAi/Area_SAi) for closure cap subsidence scenarios that differ in assumed percent subsidence and the total number of intact plus subsided compartments. The plan is to use this model as a component in the probabilistic system model for the E-Area Performance Assessment (PA), contributing uncertainty in infiltration estimates.
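Since the abstract states the model was written in Python with Monte Carlo sampling, a minimal illustrative sketch of how such an area-ratio distribution could be sampled follows; the equal-area compartments and binomial subsidence assumption are simplifications, not the actual E-Area model.

```python
import numpy as np

rng = np.random.default_rng(42)

def area_ratio_samples(n_compartments=100, pct_subsided=10.0, n_draws=10_000):
    """Sample the upslope-intact-area to subsided-area ratio.

    Assumes equal-area compartments and a binomial draw of which compartments
    subside; the real E-Area model is more detailed."""
    n_subsided = rng.binomial(n_compartments, pct_subsided / 100.0, size=n_draws)
    n_subsided = np.clip(n_subsided, 1, n_compartments - 1)   # avoid divide-by-zero
    return (n_compartments - n_subsided) / n_subsided

ratios = area_ratio_samples()
print(np.percentile(ratios, [5, 50, 95]))   # statistical distribution of the ratio
```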
ERIC Educational Resources Information Center
Kuperman, Victor; Bresnan, Joan
2012-01-01
In a series of seven studies, this paper examines acoustic characteristics of the spontaneous speech production of the English dative alternation ("gave the book to the boy/ the boy the book") as a function of the probability of the choice between alternating constructions. Probabilistic effects on the acoustic duration were observed in the…
An Alternative Perspective on von Winterfeldt et al.'s (1997) Test of Consequence Monotonicity
ERIC Educational Resources Information Center
Ho, Moon-Ho R.; Regenwetter, Michel; Niederee, Reinhard; Heyer, Dieter
2005-01-01
D. von Winterfeldt, N.-K. Chung, R. D. Luce, and Y. Cho (see record 1997-03378-008) provided several tests for consequence monotonicity of choice or judgment, using certainty equivalents of gambles. The authors reaxiomatized consequence monotonicity in a probabilistic framework and reanalyzed von Winterfeldt et al.'s main experiment via a…
Bartha, Erzsebet; Davidson, Thomas; Brodtkorb, Thor-Henrik; Carlsson, Per; Kalman, Sigridur
2013-07-09
A randomized, controlled trial, intended to include 460 patients, is currently studying peroperative goal-directed hemodynamic treatment (GDHT) of aged hip-fracture patients. Interim efficacy analysis performed on the first 100 patients was statistically uncertain; thus, the trial is continuing in accordance with the trial protocol. This raised the present investigation's main question: Is it reasonable to continue to fund the trial to decrease uncertainty? To answer this question, a previously developed probabilistic cost-effectiveness model was used. That model depicts (1) a choice between routine fluid treatment and GDHT, given uncertainty of current evidence, and (2) the monetary value of further data collection to decrease uncertainty. This monetary value, that is, the expected value of perfect information (EVPI), could be used to compare future research costs. Thus, the primary aim of the present investigation was to analyze the EVPI of an ongoing trial with interim efficacy observed. A previously developed probabilistic decision analytic cost-effectiveness model was employed to compare routine fluid treatment to GDHT. Results from the interim analysis, published trials, the meta-analysis, and the registry data were used as model inputs. EVPI was predicted using (1) combined uncertainty of model inputs; (2) threshold value of society's willingness to pay for one quality-adjusted life-year; and (3) estimated number of future patients exposed to the choice between GDHT and routine fluid treatment during the expected lifetime of GDHT. If a decision to use GDHT were based on cost-effectiveness, then the decision would have a substantial degree of uncertainty. Assuming a 5-year lifetime of GDHT in clinical practice, the number of patients who would be subject to future decisions was 30,400. EVPI per patient would be €204 at a €20,000 threshold value of society's willingness to pay for one quality-adjusted life-year. Given a future population of 30,400 individuals, total EVPI would be €6.19 million. If future trial costs are below EVPI, further data collection is potentially cost-effective. When applying a cost-effectiveness model, statements such as 'further research is needed' are replaced with 'further research is cost-effective' and 'further funding of a trial is justified'. ClinicalTrials.gov NCT01141894.
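The population EVPI scaling can be reproduced roughly from the figures quoted in the abstract; the per-patient EVPI is itself an output of the probabilistic model and is simply taken as given here.

```python
# Rough reproduction of the population EVPI scaling quoted above; both inputs are
# taken directly from the abstract (the per-patient EVPI is itself a model output).
evpi_per_patient = 204      # EUR, at a willingness-to-pay of EUR 20,000 per QALY
future_patients = 30_400    # patients facing the GDHT decision over a 5-year lifetime

population_evpi = evpi_per_patient * future_patients
print(f"Population EVPI ~ EUR {population_evpi / 1e6:.2f} million")
# ~6.2 million, versus the 6.19 million reported; the small difference comes from
# rounding of the per-patient figure. Decision rule: further data collection is
# potentially cost-effective only if expected future trial costs fall below this value.
```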
McClelland, James L.
2013-01-01
This article seeks to establish a rapprochement between explicitly Bayesian models of contextual effects in perception and neural network models of such effects, particularly the connectionist interactive activation (IA) model of perception. The article is in part an historical review and in part a tutorial, reviewing the probabilistic Bayesian approach to understanding perception and how it may be shaped by context, and also reviewing ideas about how such probabilistic computations may be carried out in neural networks, focusing on the role of context in interactive neural networks, in which both bottom-up and top-down signals affect the interpretation of sensory inputs. It is pointed out that connectionist units that use the logistic or softmax activation functions can exactly compute Bayesian posterior probabilities when the bias terms and connection weights affecting such units are set to the logarithms of appropriate probabilistic quantities. Bayesian concepts such as the prior, likelihood, (joint and marginal) posterior, probability matching and maximizing, and calculating vs. sampling from the posterior are all reviewed and linked to neural network computations. Probabilistic and neural network models are explicitly linked to the concept of a probabilistic generative model that describes the relationship between the underlying target of perception (e.g., the word intended by a speaker or other source of sensory stimuli) and the sensory input that reaches the perceiver for use in inferring the underlying target. It is shown how a new version of the IA model called the multinomial interactive activation (MIA) model can sample correctly from the joint posterior of a proposed generative model for perception of letters in words, indicating that interactive processing is fully consistent with principled probabilistic computation. Ways in which these computations might be realized in real neural systems are also considered. PMID:23970868
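A minimal sketch of the stated equivalence, assuming a toy generative model with conditionally independent binary features: bias terms set to log priors and net inputs summing log likelihoods reproduce the exact Bayesian posterior through a softmax. The hypotheses, features, and probabilities below are made up for illustration.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

# Toy generative model: 3 candidate words (hypotheses), 4 binary letter features
prior = np.array([0.5, 0.3, 0.2])                    # P(word)
likelihood = np.array([[0.9, 0.1, 0.8, 0.2],         # P(feature_j = 1 | word_i)
                       [0.2, 0.7, 0.6, 0.9],
                       [0.5, 0.5, 0.5, 0.5]])
features = np.array([1, 0, 1, 0])                    # observed input

# Exact Bayesian posterior (assuming conditionally independent features)
joint = prior * np.prod(np.where(features, likelihood, 1 - likelihood), axis=1)
posterior = joint / joint.sum()

# Same computation as a softmax unit: bias = log prior, net input adds log likelihoods
bias = np.log(prior)
net = bias + np.sum(np.where(features, np.log(likelihood), np.log(1 - likelihood)), axis=1)
print(np.allclose(posterior, softmax(net)))          # True
```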
Decision making in recurrent neuronal circuits.
Wang, Xiao-Jing
2008-10-23
Decision making has recently emerged as a central theme in neurophysiological studies of cognition, and experimental and computational work has led to the proposal of a cortical circuit mechanism of elemental decision computations. This mechanism depends on slow recurrent synaptic excitation balanced by fast feedback inhibition, which not only instantiates attractor states for forming categorical choices but also long transients for gradually accumulating evidence in favor of or against alternative options. Such a circuit endowed with reward-dependent synaptic plasticity is able to produce adaptive choice behavior. While decision threshold is a core concept for reaction time tasks, it can be dissociated from a general decision rule. Moreover, perceptual decisions and value-based economic choices are described within a unified framework in which probabilistic choices result from irregular neuronal activity as well as iterative interactions of a decision maker with an uncertain environment or other unpredictable decision makers in a social group.
Rivas, Elena; Lang, Raymond; Eddy, Sean R
2012-02-01
The standard approach for single-sequence RNA secondary structure prediction uses a nearest-neighbor thermodynamic model with several thousand experimentally determined energy parameters. An attractive alternative is to use statistical approaches with parameters estimated from growing databases of structural RNAs. Good results have been reported for discriminative statistical methods using complex nearest-neighbor models, including CONTRAfold, Simfold, and ContextFold. Little work has been reported on generative probabilistic models (stochastic context-free grammars [SCFGs]) of comparable complexity, although probabilistic models are generally easier to train and to use. To explore a range of probabilistic models of increasing complexity, and to directly compare probabilistic, thermodynamic, and discriminative approaches, we created TORNADO, a computational tool that can parse a wide spectrum of RNA grammar architectures (including the standard nearest-neighbor model and more) using a generalized super-grammar that can be parameterized with probabilities, energies, or arbitrary scores. By using TORNADO, we find that probabilistic nearest-neighbor models perform comparably to (but not significantly better than) discriminative methods. We find that complex statistical models are prone to overfitting RNA structure and that evaluations should use structurally nonhomologous training and test data sets. Overfitting has affected at least one published method (ContextFold). The most important barrier to improving statistical approaches for RNA secondary structure prediction is the lack of diversity of well-curated single-sequence RNA secondary structures in current RNA databases.
NASA Astrophysics Data System (ADS)
Zolfaghari, Mohammad R.
2009-07-01
Recent achievements in computer and information technology have provided the necessary tools to extend the application of probabilistic seismic hazard mapping from its traditional engineering use to many other applications. Examples for such applications are risk mitigation, disaster management, post disaster recovery planning and catastrophe loss estimation and risk management. Due to the lack of proper knowledge with regard to factors controlling seismic hazards, there are always uncertainties associated with all steps involved in developing and using seismic hazard models. While some of these uncertainties can be controlled by more accurate and reliable input data, the majority of the data and assumptions used in seismic hazard studies remain with high uncertainties that contribute to the uncertainty of the final results. In this paper a new methodology for the assessment of seismic hazard is described. The proposed approach provides practical facility for better capture of spatial variations of seismological and tectonic characteristics, which allows better treatment of their uncertainties. In the proposed approach, GIS raster-based data models are used in order to model geographical features in a cell-based system. The cell-based source model proposed in this paper provides a framework for implementing many geographically referenced seismotectonic factors into seismic hazard modelling. Examples for such components are seismic source boundaries, rupture geometry, seismic activity rate, focal depth and the choice of attenuation functions. The proposed methodology provides improvements in several aspects of the standard analytical tools currently being used for assessment and mapping of regional seismic hazard. The proposed methodology makes the best use of the recent advancements in computer technology in both software and hardware. The proposed approach is well structured to be implemented using conventional GIS tools.
Crupi, Vincenzo; Nelson, Jonathan D; Meder, Björn; Cevolani, Gustavo; Tentori, Katya
2018-06-17
Searching for information is critical in many situations. In medicine, for instance, careful choice of a diagnostic test can help narrow down the range of plausible diseases that the patient might have. In a probabilistic framework, test selection is often modeled by assuming that people's goal is to reduce uncertainty about possible states of the world. In cognitive science, psychology, and medical decision making, Shannon entropy is the most prominent and most widely used model to formalize probabilistic uncertainty and the reduction thereof. However, a variety of alternative entropy metrics (Hartley, Quadratic, Tsallis, Rényi, and more) are popular in the social and the natural sciences, computer science, and philosophy of science. Particular entropy measures have been predominant in particular research areas, and it is often an open issue whether these divergences emerge from different theoretical and practical goals or are merely due to historical accident. Cutting across disciplinary boundaries, we show that several entropy and entropy reduction measures arise as special cases in a unified formalism, the Sharma-Mittal framework. Using mathematical results, computer simulations, and analyses of published behavioral data, we discuss four key questions: How do various entropy models relate to each other? What insights can be obtained by considering diverse entropy models within a unified framework? What is the psychological plausibility of different entropy models? What new questions and insights for research on human information acquisition follow? Our work provides several new pathways for theoretical and empirical research, reconciling apparently conflicting approaches and empirical findings within a comprehensive and unified information-theoretic formalism. Copyright © 2018 Cognitive Science Society, Inc.
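A minimal numerical sketch of the Sharma-Mittal family follows, using one common two-parameter form (order and degree); the limits below recover Shannon, Rényi, and quadratic (Tsallis order 2) entropies, though the exact parameterization used by the authors may differ.

```python
import numpy as np

def sharma_mittal(p, order, degree, eps=1e-10):
    """Sharma-Mittal entropy H_{order,degree}(p) in one common parameterization,
    used here only to illustrate how the named entropies arise as special cases."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    if abs(order - 1.0) < eps:                # order -> 1: Shannon / Gaussian branch
        shannon = -np.sum(p * np.log(p))      # Shannon entropy in nats
        if abs(degree - 1.0) < eps:
            return shannon
        return (np.exp((1 - degree) * shannon) - 1) / (1 - degree)
    s = np.sum(p ** order)
    if abs(degree - 1.0) < eps:               # degree -> 1: Renyi entropy
        return np.log(s) / (1 - order)
    return (s ** ((1 - degree) / (1 - order)) - 1) / (1 - degree)

p = [0.5, 0.25, 0.25]
print(sharma_mittal(p, 1.0, 1.0))   # Shannon entropy
print(sharma_mittal(p, 2.0, 2.0))   # quadratic (Tsallis, order 2): 1 - sum(p^2)
print(sharma_mittal(p, 0.5, 1.0))   # a Renyi entropy
```

Expected uncertainty reduction for a test would then be the entropy of the prior minus the expected entropy of the posterior, whichever member of the family one adopts.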
Noradrenergic modulation of risk/reward decision making.
Montes, David R; Stopper, Colin M; Floresco, Stan B
2015-08-01
Catecholamine transmission modulates numerous cognitive and reward-related processes that can subserve more complex functions such as cost/benefit decision making. Dopamine has been shown to play an integral role in decisions involving reward uncertainty, yet there is a paucity of research investigating the contributions of noradrenaline (NA) transmission to these functions. The present study was designed to elucidate the contribution of NA to risk/reward decision making in rats, assessed with a probabilistic discounting task. We examined the effects of reducing noradrenergic transmission with the α2 agonist clonidine (10-100 μg/kg), and increasing activity at α2A receptor sites with the agonist guanfacine (0.1-1 mg/kg), the α2 antagonist yohimbine (1-3 mg/kg), and the noradrenaline transporter (NET) inhibitor atomoxetine (0.3-3 mg/kg) on probabilistic discounting. Rats chose between a small/certain reward and a larger/risky reward, wherein the probability of obtaining the larger reward either decreased (100-12.5 %) or increased (12.5-100 %) over a session. In well-trained rats, clonidine reduced risky choice by decreasing reward sensitivity, whereas guanfacine did not affect choice behavior. Yohimbine impaired adjustments in decision biases as reward probability changed within a session by altering negative feedback sensitivity. In a subset of rats that displayed prominent discounting of probabilistic rewards, the lowest dose of atomoxetine increased preference for the large/risky reward when this option had greater long-term utility. These data highlight an important and previously uncharacterized role for noradrenergic transmission in mediating different aspects of risk/reward decision making and mediating reward and negative feedback sensitivity.
A probabilistic NF2 relational algebra for integrated information retrieval and database systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fuhr, N.; Roelleke, T.
The integration of information retrieval (IR) and database systems requires a data model which allows for modelling documents as entities, representing uncertainty and vagueness and performing uncertain inference. For this purpose, we present a probabilistic data model based on relations in non-first-normal-form (NF2). Here, tuples are assigned probabilistic weights giving the probability that a tuple belongs to a relation. Thus, the set of weighted index terms of a document are represented as a probabilistic subrelation. In a similar way, imprecise attribute values are modelled as a set-valued attribute. We redefine the relational operators for this type of relations such that the result of each operator is again a probabilistic NF2 relation, where the weight of a tuple gives the probability that this tuple belongs to the result. By ordering the tuples according to decreasing probabilities, the model yields a ranking of answers like in most IR models. This effect also can be used for typical database queries involving imprecise attribute values as well as for combinations of database and IR queries.
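A toy sketch of the idea of probabilistic relational operators over weighted tuples, assuming independence when weights are combined; the relation encoding and operator names here are illustrative, not the paper's NF2 algebra.

```python
# Toy sketch of probabilistic relational operators over weighted tuples.
# A "relation" maps a tuple of attribute values to the probability that the
# tuple belongs to the relation (independence is assumed when combining weights).
def p_select(rel, pred):
    return {t: w for t, w in rel.items() if pred(t)}

def p_join(rel1, rel2, on1, on2):
    out = {}
    for t1, w1 in rel1.items():
        for t2, w2 in rel2.items():
            if t1[on1] == t2[on2]:
                out[t1 + t2] = w1 * w2   # result weight = product of membership probabilities
    return out

def rank(rel):
    return sorted(rel.items(), key=lambda kv: kv[1], reverse=True)

# docs(doc_id, term) with probabilistic index weights; query(term)
docs = {("d1", "retrieval"): 0.8, ("d1", "database"): 0.5, ("d2", "retrieval"): 0.3}
query = {("retrieval",): 1.0}
print(rank(p_join(docs, query, 1, 0)))   # documents ranked by decreasing probability
```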
Balancing the stochastic description of uncertainties as a function of hydrologic model complexity
NASA Astrophysics Data System (ADS)
Del Giudice, D.; Reichert, P.; Albert, C.; Kalcic, M.; Logsdon Muenich, R.; Scavia, D.; Bosch, N. S.; Michalak, A. M.
2016-12-01
Uncertainty analysis is becoming an important component of forecasting water and pollutant fluxes in urban and rural environments. Properly accounting for errors in the modeling process can help to robustly assess the uncertainties associated with the inputs (e.g. precipitation) and outputs (e.g. runoff) of hydrological models. In recent years we have investigated several Bayesian methods to infer the parameters of a mechanistic hydrological model along with those of the stochastic error component. The latter describes the uncertainties of model outputs and possibly inputs. We have adapted our framework to a variety of applications, ranging from predicting floods in small stormwater systems to nutrient loads in large agricultural watersheds. Given practical constraints, we discuss how in general the number of quantities to infer probabilistically varies inversely with the complexity of the mechanistic model. Most often, when evaluating a hydrological model of intermediate complexity, we can infer the parameters of the model as well as of the output error model. Describing the output errors as a first order autoregressive process can realistically capture the "downstream" effect of inaccurate inputs and structure. With simpler runoff models we can additionally quantify input uncertainty by using a stochastic rainfall process. For complex hydrologic transport models, instead, we show that keeping model parameters fixed and just estimating time-dependent output uncertainties could be a viable option. The common goal across all these applications is to create time-dependent prediction intervals which are both reliable (cover the nominal amount of validation data) and precise (are as narrow as possible). In conclusion, we recommend focusing both on the choice of the hydrological model and of the probabilistic error description. The latter can include output uncertainty only, if the model is computationally-expensive, or, with simpler models, it can separately account for different sources of errors like in the inputs and the structure of the model.
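A minimal sketch of wrapping a deterministic model output in a first-order autoregressive error process to obtain prediction intervals; the parameters phi and sigma are hypothetical and would normally be inferred jointly with the hydrological model parameters in the Bayesian framework described.

```python
import numpy as np

rng = np.random.default_rng(3)

def ar1_prediction_intervals(model_output, phi=0.8, sigma=0.3, n_samples=2000):
    """Wrap a deterministic model output in a first-order autoregressive error process
    and return 5% / 95% prediction bounds. phi and sigma are placeholders here."""
    T = len(model_output)
    errors = np.zeros((n_samples, T))
    errors[:, 0] = rng.normal(0.0, sigma / np.sqrt(1.0 - phi**2), n_samples)  # stationary start
    for t in range(1, T):
        errors[:, t] = phi * errors[:, t - 1] + rng.normal(0.0, sigma, n_samples)
    sims = model_output + errors
    return np.percentile(sims, [5, 95], axis=0)

hydrograph = 2.0 + np.sin(np.linspace(0, 3 * np.pi, 50))   # toy runoff simulation
lower, upper = ar1_prediction_intervals(hydrograph)
# A reliable error model places roughly 90% of observations between lower and upper,
# while keeping the band as narrow (precise) as possible.
print(lower[0], upper[0])
```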
Effects of delay and probability combinations on discounting in humans.
Cox, David J; Dallery, Jesse
2016-10-01
To determine discount rates, researchers typically adjust the amount of an immediate or certain option relative to a delayed or uncertain option. Because this adjusting amount method can be relatively time consuming, researchers have developed more efficient procedures. One such procedure is a 5-trial adjusting delay procedure, which measures the delay at which an amount of money loses half of its value (e.g., $1000 is valued at $500 with a 10-year delay to its receipt). Experiment 1 (n=212) used 5-trial adjusting delay or probability tasks to measure delay discounting of losses, probabilistic gains, and probabilistic losses. Experiment 2 (n=98) assessed combined probabilistic and delayed alternatives. In both experiments, we compared results from 5-trial adjusting delay or probability tasks to traditional adjusting amount procedures. Results suggest both procedures produced similar rates of probability and delay discounting in six out of seven comparisons. A magnitude effect consistent with previous research was observed for probabilistic gains and losses, but not for delayed losses. Results also suggest that delay and probability interact to determine the value of money. Five-trial methods may allow researchers to assess discounting more efficiently as well as study more complex choice scenarios. Copyright © 2016 Elsevier B.V. All rights reserved.
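The half-value interpretation of the 5-trial task can be illustrated with the commonly assumed hyperbolic discounting form V = A/(1 + kD); the hyperbolic form and the parameter names below are assumptions for illustration, not taken from the article.

```python
def hyperbolic_value(amount, delay, k):
    """Hyperbolic discounting V = A / (1 + k*D); a commonly assumed form."""
    return amount / (1 + k * delay)

# If $1000 loses half its value at a 10-year delay (the quantity the 5-trial task
# estimates), then 1000 / (1 + k*10) = 500, so k = 1/10 = 0.1 per year.
ed50 = 10.0
k = 1.0 / ed50
print(hyperbolic_value(1000, ed50, k))   # 500.0

# Probability discounting is often treated analogously, with the odds against reward
# theta = (1 - p)/p taking the place of delay; h below is a hypothetical parameter.
p = 0.25
odds_against = (1 - p) / p
h = 0.5
print(hyperbolic_value(1000, odds_against, h))
```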
Intelligence moderates neural responses to monetary reward and punishment.
Hawes, Daniel R; DeYoung, Colin G; Gray, Jeremy R; Rustichini, Aldo
2014-05-01
The relations between intelligence (IQ) and neural responses to monetary gains and losses were investigated in a simple decision task. In 94 healthy adults, typical responses of striatal blood oxygen level-dependent (BOLD) signal after monetary reward and punishment were weaker for subjects with higher IQ. IQ-moderated differential responses to gains and losses were also found for regions in the medial prefrontal cortex, posterior cingulate cortex, and left inferior frontal cortex. These regions have previously been identified with the subjective utility of monetary outcomes. Analysis of subjects' behavior revealed a correlation between IQ and the extent to which choices were related to experienced decision outcomes in preceding trials. Specifically, higher IQ predicted behavior to be more strongly correlated with an extended period of previously experienced decision outcomes, whereas lower IQ predicted behavior to be correlated exclusively to the most recent decision outcomes. We link these behavioral and imaging findings to a theoretical model capable of describing a role for intelligence during the evaluation of rewards generated by unknown probabilistic processes. Our results demonstrate neural differences in how people of different intelligence respond to experienced monetary rewards and punishments. Our theoretical discussion offers a functional description for how these individual differences may be linked to choice behavior. Together, our results and model support the hypothesis that observed correlations between intelligence and preferences may be rooted in the way decision outcomes are experienced ex post, rather than deriving exclusively from how choices are evaluated ex ante.
A probabilistic and continuous model of protein conformational space for template-free modeling.
Zhao, Feng; Peng, Jian; Debartolo, Joe; Freed, Karl F; Sosnick, Tobin R; Xu, Jinbo
2010-06-01
One of the major challenges with protein template-free modeling is an efficient sampling algorithm that can explore a huge conformation space quickly. The popular fragment assembly method constructs a conformation by stringing together short fragments extracted from the Protein Data Bank (PDB). The discrete nature of this method may limit generated conformations to a subspace in which the native fold does not belong. Another worry is that a protein with a really new fold may contain some fragments not in the PDB. This article presents a probabilistic model of protein conformational space to overcome the above two limitations. This probabilistic model employs directional statistics to model the distribution of backbone angles and second-order Conditional Random Fields (CRFs) to describe the sequence-angle relationship. Using this probabilistic model, we can sample protein conformations in a continuous space, as opposed to the widely used fragment assembly and lattice model methods that work in a discrete space. We show that when coupled with a simple energy function, this probabilistic method compares favorably with the fragment assembly method in the blind CASP8 evaluation, especially on alpha or small beta proteins. To our knowledge, this is the first probabilistic method that can search conformations in a continuous space and achieves favorable performance. Our method also generated three-dimensional (3D) models better than template-based methods for a couple of CASP8 hard targets. The method described in this article can also be applied to protein loop modeling, model refinement, and even RNA tertiary structure prediction.
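A minimal sketch of the directional-statistics idea: backbone torsion angles are circular quantities, so their distribution can be modeled with von Mises components and sampled continuously rather than assembled from fragments. The angles, concentration parameters, and mixture weights below are made up and are not the article's learned model.

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy mixture of von Mises distributions over a backbone dihedral angle (radians),
# e.g. one component near an alpha-helical value and one near a beta-strand value.
components = [
    {"weight": 0.6, "mu": np.deg2rad(-60.0), "kappa": 8.0},
    {"weight": 0.4, "mu": np.deg2rad(-120.0), "kappa": 4.0},
]

def sample_angles(n):
    """Draw dihedral angles from the mixture, giving a continuous conformation space."""
    idx = rng.choice(len(components), size=n, p=[c["weight"] for c in components])
    return np.array([rng.vonmises(components[i]["mu"], components[i]["kappa"]) for i in idx])

angles = sample_angles(1000)
print(np.rad2deg(angles[:5]))
```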
Dopaminergic modulation of reward-guided decision making in alcohol-preferring AA rats.
Oinio, Ville; Bäckström, Pia; Uhari-Väänänen, Johanna; Raasmaja, Atso; Piepponen, Petteri; Kiianmaa, Kalervo
2017-05-30
Results from animal gambling models have highlighted the importance of dopaminergic neurotransmission in modulating decision making when large sucrose rewards are combined with uncertainty. The majority of these models use food restriction as a tool to motivate animals to accomplish operant behavioral tasks, in which sucrose is used as a reward. As enhanced motivation to obtain sucrose due to hunger may impact its reward-seeking effect, we wanted to examine the decision-making behavior of rats in a situation where rats were fed ad libitum. For this purpose, we chose alcohol-preferring AA (alko alcohol) rats, as these rats have been shown to have high preference for sweet agents. In the present study, AA rats were trained to self-administer sucrose pellet rewards in a two-lever choice task (one pellet vs. three pellets). Once rational choice behavior had been established, the probability of gaining three pellets was decreased over time (50%, 33%, 25% then 20%). The effect of d-amphetamine on decision making was studied at every probability level, as well as the effect of the dopamine D1 receptor agonist SKF-81297 and the D2 agonist quinpirole at probability levels of 100% and 25%. d-Amphetamine increased unprofitable choices in a dose-dependent manner at the two lowest probability levels. Quinpirole increased the frequency of unprofitable decisions at the 25% probability level, and SKF-81297 did not affect choice behavior. These results mirror the findings of probabilistic discounting studies using food-restricted rats. Based on this, the use of AA rats provides a new approach for studies on reward-guided decision making. Copyright © 2017 Elsevier B.V. All rights reserved.
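The expected values of the two levers make clear why choices at the lowest probabilities are called unprofitable; a minimal sketch:

```python
# Expected value of the two levers in the task described above:
# one certain pellet vs. three pellets delivered with probability p.
certain_ev = 1.0
for p in (1.0, 0.5, 0.33, 0.25, 0.20):
    risky_ev = 3.0 * p
    better = "risky" if risky_ev > certain_ev else "certain"
    print(f"p={p:.2f}  EV(risky)={risky_ev:.2f}  profitable choice: {better}")
# The large-reward lever stops being profitable once p drops below 1/3, which is why
# drug-induced increases in risky choice at 25% and 20% count as unprofitable decisions.
```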
Efficient Modeling and Active Learning Discovery of Biological Responses
Naik, Armaghan W.; Kangas, Joshua D.; Langmead, Christopher J.; Murphy, Robert F.
2013-01-01
High throughput and high content screening involve determination of the effect of many compounds on a given target. As currently practiced, screening for each new target typically makes little use of information from screens of prior targets. Further, choices of compounds to advance to drug development are made without significant screening against off-target effects. The overall drug development process could be made more effective, as well as less expensive and time consuming, if potential effects of all compounds on all possible targets could be considered, yet the cost of such full experimentation would be prohibitive. In this paper, we describe a potential solution: probabilistic models that can be used to predict results for unmeasured combinations, and active learning algorithms for efficiently selecting which experiments to perform in order to build those models and determining when to stop. Using simulated and experimental data, we show that our approaches can produce powerful predictive models without exhaustive experimentation and can learn them much faster than by selecting experiments at random. PMID:24358322
Eddy, Sean R.
2008-01-01
Sequence database searches require accurate estimation of the statistical significance of scores. Optimal local sequence alignment scores follow Gumbel distributions, but determining an important parameter of the distribution (λ) requires time-consuming computational simulation. Moreover, optimal alignment scores are less powerful than probabilistic scores that integrate over alignment uncertainty (“Forward” scores), but the expected distribution of Forward scores remains unknown. Here, I conjecture that both expected score distributions have simple, predictable forms when full probabilistic modeling methods are used. For a probabilistic model of local sequence alignment, optimal alignment bit scores (“Viterbi” scores) are Gumbel-distributed with constant λ = log 2, and the high scoring tail of Forward scores is exponential with the same constant λ. Simulation studies support these conjectures over a wide range of profile/sequence comparisons, using 9,318 profile-hidden Markov models from the Pfam database. This enables efficient and accurate determination of expectation values (E-values) for both Viterbi and Forward scores for probabilistic local alignments. PMID:18516236
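A minimal sketch of turning scores into significance estimates under the conjectured distributions; the location parameters mu and tau, the database size, and the example score are hypothetical placeholders that would in practice be fit to simulated score distributions.

```python
import math

LAMBDA = math.log(2)   # conjectured slope for bit scores (see above)

def viterbi_pvalue(score, mu):
    """P(S >= score) for a Gumbel distribution with lambda = ln 2; mu is fit to data."""
    return 1.0 - math.exp(-math.exp(-LAMBDA * (score - mu)))

def forward_pvalue(score, tau):
    """High-scoring exponential tail P(S >= score) with the same lambda; tau is a fitted offset."""
    return math.exp(-LAMBDA * (score - tau))

def e_value(pvalue, n_comparisons):
    return pvalue * n_comparisons

# Hypothetical numbers: a 25-bit Viterbi hit against a database of 1e6 sequences,
# with a fitted mu of -3 bits for this profile.
p = viterbi_pvalue(25.0, mu=-3.0)
print(e_value(p, 1_000_000))
```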
Fuzzy-probabilistic model for risk assessment of radioactive material railway transportation.
Avramenko, M; Bolyatko, V; Kosterev, V
2005-01-01
Transportation of radioactive materials is obviously accompanied by a certain risk. A model for risk assessment of emergency situations and terrorist attacks may be useful for choosing possible routes and for comparing the various defence strategies. In particular, risk assessment is crucial for safe transportation of excess weapons-grade plutonium arising from the removal of plutonium from military employment. A fuzzy-probabilistic model for risk assessment of railway transportation has been developed taking into account the different natures of risk-affecting parameters (probabilistic and not probabilistic but fuzzy). Fuzzy set theory methods as well as standard methods of probability theory have been used for quantitative risk assessment. Information-preserving transformations are applied to realise the correct aggregation of probabilistic and fuzzy parameters. Estimations have also been made of the inhalation doses resulting from possible accidents during plutonium transportation. The obtained data show the scale of possible consequences that may arise from plutonium transportation accidents.
Combining Empirical and Stochastic Models for Extreme Floods Estimation
NASA Astrophysics Data System (ADS)
Zemzami, M.; Benaabidate, L.
2013-12-01
Hydrological models can be classified as physical, mathematical or empirical. The latter class uses mathematical equations independent of the physical processes involved in the hydrological system. Linear regression and Gradex (Gradient of Extreme values) are classic examples of empirical models. However, conventional empirical models are still used as tools for hydrological analysis within probabilistic approaches. In many regions of the world, watersheds are not gauged. This is true even in developed countries, where the gauging network has continued to decline for lack of human and financial resources. The resulting lack of data in these watersheds makes it impossible to apply basic empirical models for daily forecasting, so a combination of rainfall-runoff models is needed with which data can be generated and used to estimate flow. Estimating design floods is a good illustration of the difficulties facing the hydrologist when constructing a standard empirical model in basins where hydrological information is scarce. A climate-hydrological model based on frequency analysis was constructed to estimate the design flood in the Anseghmir catchments, Morocco. This more complex model was chosen for its ability to be applied in watersheds where hydrological information is insufficient. The method proved a powerful tool for estimating the design flood of the watershed as well as other hydrological elements (runoff, water volumes, etc.). The hydrographic characteristics and climatic parameters were used to estimate the runoff, water volumes and design flood for different return periods.
What do we gain with Probabilistic Flood Loss Models?
NASA Astrophysics Data System (ADS)
Schroeter, K.; Kreibich, H.; Vogel, K.; Merz, B.; Lüdtke, S.
2015-12-01
The reliability of flood loss models is a prerequisite for their practical usefulness. Traditional univariate damage models, such as depth-damage curves, often fail to reproduce the variability of observed flood damage. Innovative multivariate probabilistic modelling approaches promise to capture and quantify the uncertainty involved and thus to improve the basis for decision making. In this study we compare the predictive capability of two probabilistic modelling approaches, namely Bagging Decision Trees and Bayesian Networks, and of traditional stage-damage functions cast in a probabilistic framework. For model evaluation we use empirical damage data from computer-aided telephone interviews compiled after the floods of 2002, 2005, 2006 and 2013 in the Elbe and Danube catchments in Germany. We carry out a split-sample test by sub-setting the damage records: one subset is used to derive the models and the remaining records are used to evaluate their predictive performance. Further, we stratify the sample by catchment, which allows studying model performance in a spatial-transfer context. Flood damage estimation is carried out at the scale of individual buildings in terms of relative damage. Predictive performance is assessed in terms of systematic deviation (mean bias), precision (mean absolute error), and reliability, represented by the proportion of observations that fall within the 5%-95% quantile predictive interval. The reliability of the probabilistic predictions in validation runs decreases only slightly and achieves very good coverage of observations within the predictive interval. Probabilistic models provide quantitative information about prediction uncertainty, which is crucial for assessing the reliability of model predictions and improves the usefulness of model results.
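A minimal sketch of the three evaluation metrics described (mean bias, mean absolute error, and coverage of the 5%-95% predictive interval), applied here to hypothetical probabilistic damage predictions rather than the study's data.

```python
import numpy as np

def validation_metrics(obs, pred_samples):
    """Score probabilistic damage predictions against observed relative damage.

    obs:          observed relative losses, shape (n,)
    pred_samples: predictive samples per building, shape (n, m)
    Returns mean bias and mean absolute error of the predictive median, and
    coverage of the 5%-95% predictive interval."""
    median = np.median(pred_samples, axis=1)
    mean_bias = np.mean(median - obs)
    mae = np.mean(np.abs(median - obs))
    lo = np.percentile(pred_samples, 5, axis=1)
    hi = np.percentile(pred_samples, 95, axis=1)
    coverage = np.mean((obs >= lo) & (obs <= hi))   # reliable if close to 0.90
    return mean_bias, mae, coverage

rng = np.random.default_rng(7)
obs = rng.beta(2, 8, 200)                                         # toy observed relative damage (0..1)
pred = np.clip(obs[:, None] + rng.normal(0, 0.1, (200, 500)), 0, 1)
print(validation_metrics(obs, pred))
```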
A comprehensive probabilistic analysis model of oil pipelines network based on Bayesian network
NASA Astrophysics Data System (ADS)
Zhang, C.; Qin, T. X.; Jiang, B.; Huang, C.
2018-02-01
An oil pipeline network is one of the most important facilities for energy transportation, but an accident in such a network may result in serious disasters. Analysis models for these accidents have been established mainly on the basis of three methods: event trees, accident simulation and Bayesian networks. Among these, the Bayesian network is well suited to probabilistic analysis, but existing models do not consider all the important influencing factors, and no deployment rule for the factors has been established. This paper proposes a probabilistic analysis model of oil pipeline networks based on a Bayesian network. Most of the important influencing factors, including key environmental conditions and emergency response, are considered in this model. Moreover, the paper also introduces a deployment rule for these factors. The model can be used in probabilistic analysis and sensitivity analysis of oil pipeline network accidents.
Impact from Magnitude-Rupture Length Uncertainty on Seismic Hazard and Risk
NASA Astrophysics Data System (ADS)
Apel, E. V.; Nyst, M.; Kane, D. L.
2015-12-01
In probabilistic seismic hazard and risk assessments seismic sources are typically divided into two groups: fault sources (to model known faults) and background sources (to model unknown faults). In areas like the Central and Eastern United States and Hawaii the hazard and risk are driven primarily by background sources. Background sources can be modeled as areas, points or pseudo-faults. When background sources are modeled as pseudo-faults, magnitude-length or magnitude-area scaling relationships are required to construct these pseudo-faults. However, the uncertainty associated with these relationships is often ignored or discarded in hazard and risk models, particularly when fault sources are the dominant contributor. Conversely, in areas modeled only with background sources these uncertainties are much more significant. In this study we test the impact of using various relationships and the resulting epistemic uncertainties on the seismic hazard and risk in the Central and Eastern United States and Hawaii. It is common to use only one magnitude-length relationship when calculating hazard. However, Stirling et al. (2013) showed that for a given suite of magnitude-rupture length relationships the variability can be quite large. The 2014 US National Seismic Hazard Maps (Petersen et al., 2014) used one magnitude-rupture length relationship (Somerville et al., 2001) in the Central and Eastern United States, and did not consider variability in the seismogenic rupture plane width. Here we use a suite of metrics to compare the USGS approach with these variable uncertainty models to assess 1) the impact on hazard and risk and 2) the epistemic uncertainty associated with choice of relationship. In areas where the seismic hazard is dominated by larger crustal faults (e.g. New Madrid) the choice of magnitude-rupture length relationship has little impact on the hazard or risk. However, away from these regions, the choice of relationship is more significant and may approach the size of the uncertainty associated with the ground motion prediction equation suite.
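Magnitude-rupture length relationships typically take a log-linear form, log10(L) = a + b*M, with a scatter term; the sketch below shows how that scatter propagates into the rupture length assigned to a pseudo-fault. The coefficients and sigma are illustrative placeholders, not those of Somerville et al. (2001) or any other published relationship.

```python
import numpy as np

rng = np.random.default_rng(11)

def rupture_length_km(magnitude, a=-2.9, b=0.63, sigma=0.2, n=10_000):
    """Sample rupture lengths from log10(L) = a + b*M + eps, eps ~ N(0, sigma).
    a, b and sigma are illustrative placeholders for a published scaling relationship."""
    eps = rng.normal(0.0, sigma, n)
    return 10 ** (a + b * magnitude + eps)

for m in (6.0, 6.5, 7.0):
    lengths = rupture_length_km(m)
    p5, p50, p95 = np.percentile(lengths, [5, 50, 95])
    print(f"M{m}: median {p50:.1f} km, 5-95% range {p5:.1f}-{p95:.1f} km")
# Ignoring sigma (as hazard models often do) collapses this range to the median,
# understating the epistemic uncertainty discussed above.
```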
NASA Technical Reports Server (NTRS)
Leser, Patrick E.; Hochhalter, Jacob D.; Newman, John A.; Leser, William P.; Warner, James E.; Wawrzynek, Paul A.; Yuan, Fuh-Gwo
2015-01-01
Utilizing inverse uncertainty quantification techniques, structural health monitoring can be integrated with damage progression models to form probabilistic predictions of a structure's remaining useful life. However, damage evolution in realistic structures is physically complex. Accurately representing this behavior requires high-fidelity models which are typically computationally prohibitive. In the present work, a high-fidelity finite element model is represented by a surrogate model, reducing computation times. The new approach is used with damage diagnosis data to form a probabilistic prediction of remaining useful life for a test specimen under mixed-mode conditions.
Modeling Color Difference for Visualization Design.
Szafir, Danielle Albers
2018-01-01
Color is frequently used to encode values in visualizations. For color encodings to be effective, the mapping between colors and values must preserve important differences in the data. However, most guidelines for effective color choice in visualization are based either on color perceptions measured using large, uniform fields in optimal viewing environments or on qualitative intuitions. These limitations may cause data misinterpretation in visualizations, which frequently use small, elongated marks. Our goal is to develop quantitative metrics to help people use color more effectively in visualizations. We present a series of crowdsourced studies measuring color difference perceptions for three common mark types: points, bars, and lines. Our results indicate that people's abilities to perceive color differences vary significantly across mark types. Probabilistic models constructed from the resulting data can provide objective guidance for designers, allowing them to anticipate viewer perceptions in order to inform effective encoding design.
The Effects of Heuristics and Apophenia on Probabilistic Choice.
Ellerby, Zack W; Tunney, Richard J
2017-01-01
Given a repeated choice between two or more options with independent and identically distributed reward probabilities, overall pay-offs can be maximized by the exclusive selection of the option with the greatest likelihood of reward. The tendency to match response proportions to reward contingencies is suboptimal. Nevertheless, this behaviour is well documented. A number of explanatory accounts have been proposed for probability matching. These include failed pattern matching, driven by apophenia, and a heuristic-driven response that can be overruled with sufficient deliberation. We report two experiments that were designed to test the relative effects on choice behaviour of both an intuitive versus strategic approach to the task and belief that there was a predictable pattern in the reward sequence, through a combination of both direct experimental manipulation and post-experimental self-report. Mediation analysis was used to model the pathways of effects. Neither of two attempted experimental manipulations of apophenia, nor self-reported levels of apophenia, had a significant effect on proportions of maximizing choices. However, the use of strategy over intuition proved a consistent predictor of maximizing, across all experimental conditions. A parallel analysis was conducted to assess the effect of controlling for individual variance in perceptions of reward contingencies. Although this analysis suggested that apophenia did increase probability matching in the standard task preparation, this effect was found to result from an unforeseen relationship between self-reported apophenia and perceived reward probabilities. A Win-Stay Lose-Shift (WSLS) analysis indicated no reliable relationship between WSLS and either intuition or strategy use.
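Why matching is suboptimal can be seen from the expected reward rates of the two strategies; a minimal sketch for a binary choice with reward probability p:

```python
# Expected proportion of rewarded choices for a binary task in which one option
# pays off with probability p and the other with probability 1 - p.
def maximizing_accuracy(p):
    return max(p, 1 - p)                 # always pick the better option

def matching_accuracy(p):
    return p * p + (1 - p) * (1 - p)     # choose each option as often as it pays off

for p in (0.6, 0.7, 0.8, 0.9):
    print(p, round(maximizing_accuracy(p), 3), round(matching_accuracy(p), 3))
# e.g. at p = 0.7 maximizing earns rewards on 70% of trials, matching on only 58%,
# which is why matching is described as suboptimal above.
```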
Probabilistic models of cognition: conceptual foundations.
Chater, Nick; Tenenbaum, Joshua B; Yuille, Alan
2006-07-01
Remarkable progress in the mathematics and computer science of probability has led to a revolution in the scope of probabilistic models. In particular, 'sophisticated' probabilistic methods apply to structured relational systems such as graphs and grammars, of immediate relevance to the cognitive sciences. This Special Issue outlines progress in this rapidly developing field, which provides a potentially unifying perspective across a wide range of domains and levels of explanation. Here, we introduce the historical and conceptual foundations of the approach, explore how the approach relates to studies of explicit probabilistic reasoning, and give a brief overview of the field as it stands today.
Development of Probabilistic Flood Inundation Mapping For Flooding Induced by Dam Failure
NASA Astrophysics Data System (ADS)
Tsai, C.; Yeh, J. J. J.
2017-12-01
A primary function of flood inundation mapping is to forecast flood hazards and assess potential losses. However, uncertainties limit the reliability of inundation hazard assessments, and an optimal flood management strategy should take the major sources of uncertainty into consideration. This study focuses on the 20-km reach downstream of the Shihmen Reservoir in Taiwan. A dam-failure-induced flood provides the upstream boundary conditions for flood routing. The two major sources of uncertainty considered in the hydraulic model and the flood inundation mapping are uncertainty in the dam break model and uncertainty in the roughness coefficient. The perturbance moment method is applied to a dam break model and the hydro system model to develop probabilistic flood inundation mapping. Various numbers of uncertain variables can be considered in these models, and the variability of outputs can be quantified. Probabilistic flood inundation mapping for dam-break-induced floods can be developed, with consideration of the variability of the output, using the widely used HEC-RAS model. Different probabilistic flood inundation mappings are discussed and compared. Probabilistic flood inundation mappings are expected to provide new physical insight in support of the evaluation of flooded areas of concern around the reservoir.
Software for Probabilistic Risk Reduction
NASA Technical Reports Server (NTRS)
Hensley, Scott; Michel, Thierry; Madsen, Soren; Chapin, Elaine; Rodriguez, Ernesto
2004-01-01
A computer program implements a methodology, denoted probabilistic risk reduction, that is intended to aid in planning the development of complex software and/or hardware systems. This methodology integrates two complementary prior methodologies: (1) that of probabilistic risk assessment and (2) a risk-based planning methodology, implemented in a prior computer program known as Defect Detection and Prevention (DDP), in which multiple requirements and the beneficial effects of risk-mitigation actions are taken into account. The present methodology and the software are able to accommodate both process knowledge (notably of the efficacy of development practices) and product knowledge (notably of the logical structure of a system, the development of which one seeks to plan). Estimates of the costs and benefits of a planned development can be derived. Functional and non-functional aspects of software can be taken into account, and trades made among them. It becomes possible to optimize the planning process in the sense that it becomes possible to select the best suite of process steps and design choices to maximize the expectation of success while remaining within budget.
Biases in probabilistic category learning in relation to social anxiety
Abraham, Anna; Hermann, Christiane
2015-01-01
Instrumental learning paradigms are rarely employed to investigate the mechanisms underlying acquired fear responses in social anxiety. Here, we adapted a probabilistic category learning paradigm to assess information processing biases as a function of the degree of social anxiety traits in a sample of healthy individuals without a diagnosis of social phobia. Participants were presented with three pairs of neutral faces with differing probabilistic accuracy contingencies (A/B: 80/20, C/D: 70/30, E/F: 60/40). After participants made their choice, negative and positive feedback was conveyed using angry and happy faces, respectively. The highly socially anxious group showed a strong tendency to be more accurate at learning the probability contingency associated with the most ambiguous stimulus pair (E/F: 60/40). Moreover, when pairing the most positively reinforced stimulus or the most negatively reinforced stimulus with all the other stimuli in a test phase, the highly socially anxious group avoided the most negatively reinforced stimulus significantly more than the control group. The results are discussed with reference to avoidance learning and hypersensitivity to negative socially evaluative information associated with social anxiety. PMID:26347685
Martin, Sébastien; Troccaz, Jocelyne; Daanen, Vincent
2010-04-01
The authors present a fully automatic algorithm for the segmentation of the prostate in three-dimensional magnetic resonance (MR) images. The approach requires the use of an anatomical atlas which is built by computing transformation fields mapping a set of manually segmented images to a common reference. These transformation fields are then applied to the manually segmented structures of the training set in order to get a probabilistic map on the atlas. The segmentation is then realized through a two-stage procedure. In the first stage, the processed image is registered to the probabilistic atlas. Subsequently, a probabilistic segmentation is obtained by mapping the probabilistic map of the atlas to the patient's anatomy. In the second stage, a deformable surface evolves toward the prostate boundaries by merging information coming from the probabilistic segmentation, an image feature model and a statistical shape model. During the evolution of the surface, the probabilistic segmentation allows the introduction of a spatial constraint that prevents the deformable surface from leaking in an unlikely configuration. The proposed method is evaluated on 36 exams that were manually segmented by a single expert. A median Dice similarity coefficient of 0.86 and an average surface error of 2.41 mm are achieved. By merging prior knowledge, the presented method achieves a robust and completely automatic segmentation of the prostate in MR images. Results show that the use of a spatial constraint is useful to increase the robustness of the deformable model compared with a deformable surface that is only driven by an image appearance model.
Probabilistic load simulation: Code development status
NASA Astrophysics Data System (ADS)
Newell, J. F.; Ho, H.
1991-05-01
The objective of the Composite Load Spectra (CLS) project is to develop generic load models to simulate the composite load spectra that are included in space propulsion system components. The probabilistic loads thus generated are part of the probabilistic design analysis (PDA) of a space propulsion system that also includes probabilistic structural analyses, reliability, and risk evaluations. Probabilistic load simulation for space propulsion systems demands sophisticated probabilistic methodology and requires large amounts of load information and engineering data. The CLS approach is to implement a knowledge based system coupled with a probabilistic load simulation module. The knowledge base manages and furnishes load information and expertise and sets up the simulation runs. The load simulation module performs the numerical computation to generate the probabilistic loads with load information supplied from the CLS knowledge base.
Buchsbaum, Daphna; Seiver, Elizabeth; Bridgers, Sophie; Gopnik, Alison
2012-01-01
A major challenge children face is uncovering the causal structure of the world around them. Previous research on children's causal inference has demonstrated their ability to learn about causal relationships in the physical environment using probabilistic evidence. However, children must also learn about causal relationships in the social environment, including discovering the causes of other people's behavior, and understanding the causal relationships between others' goal-directed actions and the outcomes of those actions. In this chapter, we argue that social reasoning and causal reasoning are deeply linked, both in the real world and in children's minds. Children use both types of information together and in fact reason about both physical and social causation in fundamentally similar ways. We suggest that children jointly construct and update causal theories about their social and physical environment and that this process is best captured by probabilistic models of cognition. We first present studies showing that adults are able to jointly infer causal structure and human action structure from videos of unsegmented human motion. Next, we describe how children use social information to make inferences about physical causes. We show that the pedagogical nature of a demonstrator influences children's choices of which actions to imitate from within a causal sequence and that this social information interacts with statistical causal evidence. We then discuss how children combine evidence from an informant's testimony and expressed confidence with evidence from their own causal observations to infer the efficacy of different potential causes. We also discuss how children use these same causal observations to make inferences about the knowledge state of the social informant. Finally, we suggest that psychological causation and attribution are part of the same causal system as physical causation. We present evidence that just as children use covariation between physical causes and their effects to learn physical causal relationships, they also use covariation between people's actions and the environment to make inferences about the causes of human behavior.
Application of a stochastic snowmelt model for probabilistic decisionmaking
NASA Technical Reports Server (NTRS)
Mccuen, R. H.
1983-01-01
A stochastic form of the snowmelt runoff model that can be used for probabilistic decision-making was developed. The use of probabilistic streamflow predictions instead of single-valued deterministic predictions leads to greater accuracy in decisions. While the accuracy of the output function is important in decision-making, it is also important to understand the relative importance of the coefficients. Therefore, a sensitivity analysis was made for each of the coefficients.
ERIC Educational Resources Information Center
Weatherly, Jeffrey N.; Derenne, Adam
2013-01-01
The present study tested whether participants would discount "won" money differently than they would "owed" money in a probability-discounting task. Participants discounted $1000 or $100,000 that they had won in a sweepstakes or that was owed to them using the multiple-choice (Experiment 1) or fill-in-the-blank (Experiment 2) method of collecting…
Counterfactual distribution of Schrödinger cat states
NASA Astrophysics Data System (ADS)
Shenoy-Hejamadi, Akshata; Srikanth, R.
2015-12-01
In the counterfactual cryptography scheme proposed by Noh, the sender Alice probabilistically transmits classical information to the receiver Bob without the physical travel of a particle. Here we generalize this idea to the distribution of quantum entanglement. The key insight is to replace their classical input choices with quantum superpositions. We further show that the scheme can be generalized to counterfactually distribute multipartite cat states.
Vagueness as Probabilistic Linguistic Knowledge
NASA Astrophysics Data System (ADS)
Lassiter, Daniel
Consideration of the metalinguistic effects of utterances involving vague terms has led Barker [1] to treat vagueness using a modified Stalnakerian model of assertion. I present a sorites-like puzzle for factual beliefs in the standard Stalnakerian model [28] and show that it can be resolved by enriching the model to make use of probabilistic belief spaces. An analogous problem arises for metalinguistic information in Barker's model, and I suggest that a similar enrichment is needed here as well. The result is a probabilistic theory of linguistic representation that retains a classical metalanguage but avoids the undesirable divorce between meaning and use inherent in the epistemic theory [34]. I also show that the probabilistic approach provides a plausible account of the sorites paradox and higher-order vagueness and that it fares well empirically and conceptually in comparison to leading competitors.
Probabilistic Models for Solar Particle Events
NASA Technical Reports Server (NTRS)
Adams, James H., Jr.; Xapsos, Michael
2009-01-01
Probabilistic Models of Solar Particle Events (SPEs) are used in space mission design studies to describe the radiation environment that can be expected at a specified confidence level. The task of the designer is then to choose a design that will operate in the model radiation environment. Probabilistic models have already been developed for solar proton events that describe the peak flux, event-integrated fluence and mission-integrated fluence. In addition, a probabilistic model has been developed that describes the mission-integrated fluence for the Z>2 elemental spectra. This talk will focus on completing this suite of models by developing models for peak flux and event-integrated fluence elemental spectra for the Z>2 element
EFFECTS OF CORRELATED PROBABILISTIC EXPOSURE MODEL INPUTS ON SIMULATED RESULTS
In recent years, more probabilistic models have been developed to quantify aggregate human exposures to environmental pollutants. The impact of correlation among inputs in these models is an important issue, which has not been resolved. Obtaining correlated data and implementi...
Cardinal, Rudolf N; Howes, Nathan J
2005-01-01
Background Animals must frequently make choices between alternative courses of action, seeking to maximize the benefit obtained. They must therefore evaluate the magnitude and the likelihood of the available outcomes. Little is known of the neural basis of this process, or what might predispose individuals to be overly conservative or to take risks excessively (avoiding or preferring uncertainty, respectively). The nucleus accumbens core (AcbC) is known to contribute to rats' ability to choose large, delayed rewards over small, immediate rewards; AcbC lesions cause impulsive choice and an impairment in learning with delayed reinforcement. However, it is not known how the AcbC contributes to choice involving probabilistic reinforcement, such as between a large, uncertain reward and a small, certain reward. We examined the effects of excitotoxic lesions of the AcbC on probabilistic choice in rats. Results Rats chose between a single food pellet delivered with certainty (p = 1) and four food pellets delivered with varying degrees of uncertainty (p = 1, 0.5, 0.25, 0.125, and 0.0625) in a discrete-trial task, with the large-reinforcer probability decreasing or increasing across the session. Subjects were trained on this task and then received excitotoxic or sham lesions of the AcbC before being retested. After a transient period during which AcbC-lesioned rats exhibited relative indifference between the two alternatives compared to controls, AcbC-lesioned rats came to exhibit risk-averse choice, choosing the large reinforcer less often than controls when it was uncertain, to the extent that they obtained less food as a result. Rats behaved as if indifferent between a single certain pellet and four pellets at p = 0.32 (sham-operated) or at p = 0.70 (AcbC-lesioned) by the end of testing. When the probabilities did not vary across the session, AcbC-lesioned rats and controls strongly preferred the large reinforcer when it was certain, and strongly preferred the small reinforcer when the large reinforcer was very unlikely (p = 0.0625), with no differences between AcbC-lesioned and sham-operated groups. Conclusion These results support the view that the AcbC contributes to action selection by promoting the choice of uncertain, as well as delayed, reinforcement. PMID:15921529
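The indifference points quoted above can be read against a simple risk-neutral (expected-value) benchmark, which the abstract itself does not state; for a choice between one certain pellet and four pellets delivered with probability p, indifference under expected-value maximization requires

```latex
% Risk-neutral (expected-value) benchmark, derived here rather than quoted
% from the paper: indifference between 1 certain pellet and 4 pellets at
% probability p* requires
\[
  1 \times 1 \;=\; 4 \times p^{*}
  \quad\Longrightarrow\quad
  p^{*} = 0.25 .
\]
```

On that reading, sham-operated rats (indifference near p = 0.32) were close to the risk-neutral point, whereas AcbC-lesioned rats (p ≈ 0.70) required a far higher probability before the large reward was worth choosing, i.e., marked risk aversion.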
Delay or probability discounting in a model of impulsive behavior: effect of alcohol.
Richards, J B; Zhang, L; Mitchell, S H; de Wit, H
1999-01-01
Little is known about the acute effects of drugs of abuse on impulsivity and self-control. In this study, impulsivity was assessed in humans using a computer task that measured delay and probability discounting. Discounting describes how much the value of a reward (or punisher) is decreased when its occurrence is either delayed or uncertain. Twenty-four healthy adult volunteers ingested a moderate dose of ethanol (0.5 or 0.8 g/kg ethanol: n = 12 at each dose) or placebo before completing the discounting task. In the task the participants were given a series of choices between a small, immediate, certain amount of money and $10 that was either delayed (0, 2, 30, 180, or 365 days) or probabilistic (i.e., certainty of receipt was 1.0, .9, .75, .5, or .25). The point at which each individual was indifferent between the smaller immediate or certain reward and the $10 delayed or probabilistic reward was identified using an adjusting-amount procedure. The results indicated that (a) delay and probability discounting were well described by a hyperbolic function; (b) delay and probability discounting were positively correlated within subjects; (c) delay and probability discounting were moderately correlated with personality measures of impulsivity; and (d) alcohol had no effect on discounting. PMID:10220927
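A minimal sketch of the hyperbolic forms mentioned in the results, assuming the conventional parameterization with a delay-discounting rate k and the odds-against form for probability discounting; the parameter values are illustrative, not estimates from this study.

```python
# Hedged sketch: conventional hyperbolic discounting forms; parameter values
# are illustrative, not taken from the study.

def delay_discounted_value(amount, delay_days, k=0.05):
    """V = A / (1 + k * D): value of a reward of size `amount` delayed by `delay_days`."""
    return amount / (1.0 + k * delay_days)

def probability_discounted_value(amount, p, h=1.0):
    """V = A / (1 + h * theta), with odds against theta = (1 - p) / p."""
    theta = (1.0 - p) / p
    return amount / (1.0 + h * theta)

if __name__ == "__main__":
    for d in (0, 2, 30, 180, 365):
        print(f"$10 delayed {d:3d} days -> subjective value {delay_discounted_value(10, d):.2f}")
    for p in (1.0, 0.9, 0.75, 0.5, 0.25):
        print(f"$10 with p = {p:.2f} -> subjective value {probability_discounted_value(10, p):.2f}")
```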
Chakraborty, Subhojit; Kolling, Nils; Walton, Mark E; Mitchell, Anna S
2016-01-01
Adaptive decision-making uses information gained when exploring alternative options to decide whether to update the current choice strategy. Magnocellular mediodorsal thalamus (MDmc) supports adaptive decision-making, but its causal contribution is not well understood. Monkeys with excitotoxic MDmc damage were tested on probabilistic three-choice decision-making tasks. They could learn and track the changing values in object-reward associations, but they were severely impaired at updating choices after reversals in reward contingencies or when there were multiple options associated with reward. These deficits were not caused by perseveration or insensitivity to negative feedback though. Instead, monkeys with MDmc lesions exhibited an inability to use reward to promote choice repetition after switching to an alternative option due to a diminished influence of recent past choices and the last outcome to guide future behavior. Together, these data suggest MDmc allows for the rapid discovery and persistence with rewarding options, particularly in uncertain or changing environments. DOI: http://dx.doi.org/10.7554/eLife.13588.001 PMID:27136677
Use of limited data to construct Bayesian networks for probabilistic risk assessment.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Groth, Katrina M.; Swiler, Laura Painton
2013-03-01
Probabilistic Risk Assessment (PRA) is a fundamental part of safety/quality assurance for nuclear power and nuclear weapons. Traditional PRA very effectively models complex hardware system risks using binary probabilistic models. However, traditional PRA models are not flexible enough to accommodate non-binary soft-causal factors, such as digital instrumentation and control, passive components, aging, common cause failure, and human errors. Bayesian Networks offer the opportunity to incorporate these risks into the PRA framework. This report describes the results of an early career LDRD project titled "Use of Limited Data to Construct Bayesian Networks for Probabilistic Risk Assessment". The goal of the work was to establish the capability to develop Bayesian Networks from sparse data, and to demonstrate this capability by producing a data-informed Bayesian Network for use in Human Reliability Analysis (HRA) as part of nuclear power plant Probabilistic Risk Assessment (PRA). This report summarizes the research goal and major products of the research.
Probabilistic drug connectivity mapping
2014-01-01
Background The aim of connectivity mapping is to match drugs using drug-treatment gene expression profiles from multiple cell lines. This can be viewed as an information retrieval task, with the goal of finding the most relevant profiles for a given query drug. We infer the relevance for retrieval by data-driven probabilistic modeling of the drug responses, resulting in probabilistic connectivity mapping, and further consider the available cell lines as different data sources. We use a special type of probabilistic model to separate what is shared and specific between the sources, in contrast to earlier connectivity mapping methods that have intentionally aggregated all available data, neglecting information about the differences between the cell lines. Results We show that the probabilistic multi-source connectivity mapping method is superior to alternatives in finding functionally and chemically similar drugs from the Connectivity Map data set. We also demonstrate that an extension of the method is capable of retrieving combinations of drugs that match different relevant parts of the query drug response profile. Conclusions The probabilistic modeling-based connectivity mapping method provides a promising alternative to earlier methods. Principled integration of data from different cell lines helps to identify relevant responses for specific drug repositioning applications. PMID:24742351
Optimal radiotherapy dose schedules under parametric uncertainty
NASA Astrophysics Data System (ADS)
Badri, Hamidreza; Watanabe, Yoichi; Leder, Kevin
2016-01-01
We consider the effects of parameter uncertainty on the optimal radiation schedule in the context of the linear-quadratic model. Our interest arises from the observation that if inter-patient variability in normal and tumor tissue radiosensitivity or sparing factor of the organs-at-risk (OAR) are not accounted for during radiation scheduling, the performance of the therapy may be strongly degraded or the OAR may receive a substantially larger dose than the allowable threshold. This paper proposes a stochastic radiation scheduling concept to incorporate inter-patient variability into the scheduling optimization problem. Our method is based on a probabilistic approach, where the model parameters are given by a set of random variables. Our probabilistic formulation ensures that our constraints are satisfied with a given probability, and that our objective function achieves a desired level with a stated probability. We used a variable transformation to reduce the resulting optimization problem to two dimensions. We showed that the optimal solution lies on the boundary of the feasible region and we implemented a branch and bound algorithm to find the global optimal solution. We demonstrated how the configuration of optimal schedules in the presence of uncertainty compares to optimal schedules in the absence of uncertainty (conventional schedule). We observed that in order to protect against the possibility of the model parameters falling into a region where the conventional schedule is no longer feasible, it is required to avoid extremal solutions, i.e. a single large dose or very large total dose delivered over a long period. Finally, we performed numerical experiments in the setting of head and neck tumors including several normal tissues to reveal the effect of parameter uncertainty on optimal schedules and to evaluate the sensitivity of the solutions to the choice of key model parameters.
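For orientation, the standard fractionated linear-quadratic expressions underlying such scheduling problems are sketched below; this is the textbook form, and the paper's exact objective, sparing factors, and probabilistic constraints are not reproduced here.

```latex
% Standard linear-quadratic model for n fractions of dose d (illustrative only):
\[
  -\ln S \;=\; n\,(\alpha d + \beta d^{2}),
  \qquad
  \mathrm{BED} \;=\; n d \left(1 + \frac{d}{\alpha/\beta}\right),
\]
% where S is the surviving fraction and alpha/beta the tissue-specific ratio;
% OAR doses are typically scaled by a sparing factor before entering such constraints.
```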
NASA Astrophysics Data System (ADS)
McInerney, David; Thyer, Mark; Kavetski, Dmitri; Lerat, Julien; Kuczera, George
2017-03-01
Reliable and precise probabilistic prediction of daily catchment-scale streamflow requires statistical characterization of residual errors of hydrological models. This study focuses on approaches for representing error heteroscedasticity with respect to simulated streamflow, i.e., the pattern of larger errors in higher streamflow predictions. We evaluate eight common residual error schemes, including standard and weighted least squares, the Box-Cox transformation (with fixed and calibrated power parameter λ) and the log-sinh transformation. Case studies include 17 perennial and 6 ephemeral catchments in Australia and the United States, and two lumped hydrological models. Performance is quantified using predictive reliability, precision, and volumetric bias metrics. We find the choice of heteroscedastic error modeling approach significantly impacts on predictive performance, though no single scheme simultaneously optimizes all performance metrics. The set of Pareto optimal schemes, reflecting performance trade-offs, comprises Box-Cox schemes with λ of 0.2 and 0.5, and the log scheme (λ = 0, perennial catchments only). These schemes significantly outperform even the average-performing remaining schemes (e.g., across ephemeral catchments, median precision tightens from 105% to 40% of observed streamflow, and median biases decrease from 25% to 4%). Theoretical interpretations of empirical results highlight the importance of capturing the skew/kurtosis of raw residuals and reproducing zero flows. Paradoxically, calibration of λ is often counterproductive: in perennial catchments, it tends to overfit low flows at the expense of abysmal precision in high flows. The log-sinh transformation is dominated by the simpler Pareto optimal schemes listed above. Recommendations for researchers and practitioners seeking robust residual error schemes for practical work are provided.
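As a minimal sketch of how a Box-Cox residual-error scheme of the kind evaluated above standardizes heteroscedastic streamflow residuals (assuming the usual transformation; the study's full likelihood, including zero-flow handling, is not reproduced):

```python
import numpy as np

def box_cox(y, lam):
    """Box-Cox transform z = (y**lam - 1)/lam, with the log limit at lam = 0."""
    y = np.asarray(y, dtype=float)
    return np.log(y) if lam == 0 else (y**lam - 1.0) / lam

def standardized_residuals(q_obs, q_sim, lam=0.2):
    """Residuals in transformed space; with a well-chosen lam their spread is
    roughly constant across flow magnitudes (homoscedastic)."""
    return box_cox(q_obs, lam) - box_cox(q_sim, lam)

# Illustrative use (made-up flows):
q_obs = np.array([0.5, 2.0, 10.0, 60.0])
q_sim = np.array([0.6, 1.8, 12.0, 50.0])
print(standardized_residuals(q_obs, q_sim, lam=0.2))
```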
Probabilistic Reinforcement Learning in Adults with Autism Spectrum Disorders
Solomon, Marjorie; Smith, Anne C.; Frank, Michael J.; Ly, Stanford; Carter, Cameron S.
2017-01-01
Background Autism spectrum disorders (ASDs) can be conceptualized as disorders of learning, however there have been few experimental studies taking this perspective. Methods We examined the probabilistic reinforcement learning performance of 28 adults with ASDs and 30 typically developing adults on a task requiring learning relationships between three stimulus pairs consisting of Japanese characters with feedback that was valid with different probabilities (80%, 70%, and 60%). Both univariate and Bayesian state–space data analytic methods were employed. Hypotheses were based on the extant literature as well as on neurobiological and computational models of reinforcement learning. Results Both groups learned the task after training. However, there were group differences in early learning in the first task block where individuals with ASDs acquired the most frequently accurately reinforced stimulus pair (80%) comparably to typically developing individuals; exhibited poorer acquisition of the less frequently reinforced 70% pair as assessed by state–space learning curves; and outperformed typically developing individuals on the near chance (60%) pair. Individuals with ASDs also demonstrated deficits in using positive feedback to exploit rewarded choices. Conclusions Results support the contention that individuals with ASDs are slower learners. Based on neurobiology and on the results of computational modeling, one interpretation of this pattern of findings is that impairments are related to deficits in flexible updating of reinforcement history as mediated by the orbito-frontal cortex, with spared functioning of the basal ganglia. This hypothesis about the pathophysiology of learning in ASDs can be tested using functional magnetic resonance imaging. PMID:21425243
DOT National Transportation Integrated Search
2009-10-13
This paper describes a probabilistic approach to estimate the conditional probability of release of hazardous materials from railroad tank cars during train accidents. Monte Carlo methods are used in developing a probabilistic model to simulate head ...
NASA Astrophysics Data System (ADS)
Fei, Cheng-Wei; Bai, Guang-Chen
2014-12-01
To improve the computational precision and efficiency of probabilistic design for mechanical dynamic assemblies such as the blade-tip radial running clearance (BTRRC) of a gas turbine, a distribution collaborative probabilistic design method based on support vector machine regression (called DCSRM) is proposed by integrating the distribution collaborative response surface method and a support vector machine regression model. The mathematical model of DCSRM is established and the probabilistic design idea of DCSRM is introduced. The dynamic assembly probabilistic design of the aeroengine high-pressure turbine (HPT) BTRRC is accomplished to verify the proposed DCSRM. The analysis results reveal that the optimal static blade-tip clearance of the HPT is obtained for designing the BTRRC and for improving the performance and reliability of the aeroengine. The comparison of methods shows that the DCSRM has high computational accuracy and high computational efficiency in BTRRC probabilistic analysis. The present research offers an effective way for the reliability design of mechanical dynamic assemblies and enriches mechanical reliability theory and methods.
Superposition-Based Analysis of First-Order Probabilistic Timed Automata
NASA Astrophysics Data System (ADS)
Fietzke, Arnaud; Hermanns, Holger; Weidenbach, Christoph
This paper discusses the analysis of first-order probabilistic timed automata (FPTA) by a combination of hierarchic first-order superposition-based theorem proving and probabilistic model checking. We develop the overall semantics of FPTAs and prove soundness and completeness of our method for reachability properties. Basically, we decompose FPTAs into their time plus first-order logic aspects on the one hand, and their probabilistic aspects on the other hand. Then we exploit the time plus first-order behavior by hierarchic superposition over linear arithmetic. The result of this analysis is the basis for the construction of a reachability equivalent (to the original FPTA) probabilistic timed automaton to which probabilistic model checking is finally applied. The hierarchic superposition calculus required for the analysis is sound and complete on the first-order formulas generated from FPTAs. It even works well in practice. We illustrate the potential behind it with a real-life DHCP protocol example, which we analyze by means of tool chain support.
Campbell, Kieran R; Yau, Christopher
2017-03-15
Modeling bifurcations in single-cell transcriptomics data has become an increasingly popular field of research. Several methods have been proposed to infer bifurcation structure from such data, but all rely on heuristic non-probabilistic inference. Here we propose the first generative, fully probabilistic model for such inference, based on a Bayesian hierarchical mixture of factor analyzers. Our model exhibits competitive performance on large datasets despite implementing full Markov-chain Monte Carlo sampling, and its unique hierarchical prior structure enables automatic determination of genes driving the bifurcation process. We additionally propose an Empirical-Bayes-like extension that deals with the high levels of zero-inflation in single-cell RNA-seq data and quantify when such models are useful. We apply our model to both real and simulated single-cell gene expression data and compare the results to existing pseudotime methods. Finally, we discuss both the merits and weaknesses of such a unified, probabilistic approach in the context of practical bioinformatics analyses.
Solving probability reasoning based on DNA strand displacement and probability modules.
Zhang, Qiang; Wang, Xiaobiao; Wang, Xiaojun; Zhou, Changjun
2017-12-01
In computational biology, DNA strand displacement technology is used to simulate the computation process and has shown strong computing ability. Most researchers use it to solve logic problems, but it is only rarely used in probabilistic reasoning. To perform probabilistic reasoning, a conditional probability derivation model and a total probability model based on DNA strand displacement were established in this paper. The models were assessed through the game "read your mind." The approach has been shown to enable the application of probabilistic reasoning in genetic diagnosis.
Quantum formalism for classical statistics
NASA Astrophysics Data System (ADS)
Wetterich, C.
2018-06-01
In static classical statistical systems the problem of information transport from a boundary to the bulk finds a simple description in terms of wave functions or density matrices. While the transfer matrix formalism is a type of Heisenberg picture for this problem, we develop here the associated Schrödinger picture that keeps track of the local probabilistic information. The transport of the probabilistic information between neighboring hypersurfaces obeys a linear evolution equation, and therefore the superposition principle for the possible solutions. Operators are associated to local observables, with rules for the computation of expectation values similar to quantum mechanics. We discuss how non-commutativity naturally arises in this setting. Also other features characteristic of quantum mechanics, such as complex structure, change of basis or symmetry transformations, can be found in classical statistics once formulated in terms of wave functions or density matrices. We construct for every quantum system an equivalent classical statistical system, such that time in quantum mechanics corresponds to the location of hypersurfaces in the classical probabilistic ensemble. For suitable choices of local observables in the classical statistical system one can, in principle, compute all expectation values and correlations of observables in the quantum system from the local probabilistic information of the associated classical statistical system. Realizing a static memory material as a quantum simulator for a given quantum system is not a matter of principle, but rather of practical simplicity.
Stochastic Matching and the Voluntary Nature of Choice
Neuringer, Allen; Jensen, Greg; Piff, Paul
2007-01-01
Attempts to characterize voluntary behavior have been ongoing for thousands of years. We provide experimental evidence that judgments of volition are based upon distributions of responses in relation to obtained rewards. Participants watched as responses, said to be made by “actors,” appeared on a computer screen. The participant's task was to estimate how well each actor represented the voluntary choices emitted by a real person. In actuality, all actors' responses were generated by algorithms based on Baum's (1979) generalized matching function. We systematically varied the exponent values (sensitivity parameter) of these algorithms: some actors matched response proportions to received reinforcer proportions, others overmatched (predominantly chose the highest-valued alternative), and yet others undermatched (chose relatively equally among the alternatives). In each of five experiments, we found that the matching actor's responses were judged most closely to approximate voluntary choice. We found also that judgments of high volition depended upon stochastic (or probabilistic) generation. Thus, stochastic responses that match reinforcer proportions best represent voluntary human choice. PMID:17725049
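A hedged sketch of how such "actors" could be generated from the generalized matching relation with a sensitivity exponent s; the power-function extension to more than two alternatives and all parameter values are assumptions here, not the authors' exact algorithms.

```python
import random

def matching_choice_probabilities(reinforcers, sensitivity=1.0):
    """Choice proportions implied by the generalized matching relation,
    using the common power-function form B_i proportional to R_i ** s
    (s = 1 matches, s > 1 overmatches, s < 1 undermatches)."""
    weights = [r ** sensitivity for r in reinforcers]
    total = sum(weights)
    return [w / total for w in weights]

def sample_choice(reinforcers, sensitivity=1.0):
    """Stochastically emit one choice index according to those probabilities."""
    probs = matching_choice_probabilities(reinforcers, sensitivity)
    return random.choices(range(len(reinforcers)), weights=probs, k=1)[0]

# Reinforcer proportions 0.6 / 0.3 / 0.1 across three alternatives:
for s in (0.3, 1.0, 3.0):   # under-matching, matching, over-matching
    print(s, [round(p, 3) for p in matching_choice_probabilities([0.6, 0.3, 0.1], s)])
```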
Ludvig, Elliot A.; Spetch, Marcia L.
2011-01-01
When faced with risky decisions, people tend to be risk averse for gains and risk seeking for losses (the reflection effect). Studies examining this risk-sensitive decision making, however, typically ask people directly what they would do in hypothetical choice scenarios. A recent flurry of studies has shown that when these risky decisions include rare outcomes, people make different choices for explicitly described probabilities than for experienced probabilistic outcomes. Specifically, rare outcomes are overweighted when described and underweighted when experienced. In two experiments, we examined risk-sensitive decision making when the risky option had two equally probable (50%) outcomes. For experience-based decisions, there was a reversal of the reflection effect with greater risk seeking for gains than for losses, as compared to description-based decisions. This fundamental difference in experienced and described choices cannot be explained by the weighting of rare events and suggests a separate subjective utility curve for experience. PMID:21673807
Akinci, A.; Galadini, F.; Pantosti, D.; Petersen, M.; Malagnini, L.; Perkins, D.
2009-01-01
We produce probabilistic seismic-hazard assessments for the central Apennines, Italy, using time-dependent models that are characterized using a Brownian passage time recurrence model. Using aperiodicity parameters α of 0.3, 0.5, and 0.7, we examine the sensitivity of the probabilistic ground motion and its deaggregation to these parameters. For the seismic source model we incorporate both smoothed historical seismicity over the area and geological information on faults. We use the maximum magnitude model for the fault sources together with a uniform probability of rupture along the fault (floating fault model) to model fictitious faults to account for earthquakes that cannot be correlated with known geologic structural segmentation.
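The Brownian passage time density referred to above is commonly written as the inverse Gaussian form below (a standard expression, not quoted from this paper), with mean recurrence interval μ and aperiodicity α:

```latex
% Brownian passage-time (inverse Gaussian) recurrence density in its usual form
% (a standard expression, not quoted from this paper):
\[
  f(t \mid \mu, \alpha)
    = \sqrt{\frac{\mu}{2\pi\,\alpha^{2}\,t^{3}}}\;
      \exp\!\left[-\,\frac{(t-\mu)^{2}}{2\,\mu\,\alpha^{2}\,t}\right],
  \qquad t > 0,
\]
% with mean recurrence interval \mu and aperiodicity \alpha (here 0.3, 0.5, 0.7).
```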
Probabilistic clustering of rainfall condition for landslide triggering
NASA Astrophysics Data System (ADS)
Rossi, Mauro; Luciani, Silvia; Cesare Mondini, Alessandro; Kirschbaum, Dalia; Valigi, Daniela; Guzzetti, Fausto
2013-04-01
Landslides are widespread natural and man-made phenomena. They are triggered by earthquakes, rapid snow melting and human activities, but mostly by typhoons and intense or prolonged rainfall. In Italy they are mostly triggered by intense precipitation. The prediction of landslides triggered by rainfall over large areas is commonly based on empirical models. Empirical landslide rainfall thresholds are used to identify rainfall conditions for possible landslide initiation. It is common practice to define rainfall thresholds by assuming a power-law lower boundary in the rainfall intensity-duration or cumulative rainfall-duration space above which landslides can occur. The boundary is defined by considering rainfall conditions associated with landslide phenomena using heuristic approaches, and it does not consider rainfall events that did not cause landslides. Here we present a new, fully automatic method to identify the probability of landslide occurrence associated with rainfall conditions characterized by measures of intensity or cumulative rainfall and rainfall duration. The method splits past rainfall events into two groups, events causing landslides and its complementary set, and then estimates their probabilistic distributions. Next, the probabilistic membership of a new event in one of the two clusters is estimated. The method does not assume any threshold model a priori, but simply exploits the empirical distribution of rainfall events. The approach was applied in the Umbria region, Central Italy, where a catalogue of landslide timings was obtained through a search of chronicles, blogs and other sources of information for the period 2002-2012. The approach was tested using rain gauge measures and satellite rainfall estimates (NASA TRMM-v6), allowing in both cases the identification of the rainfall conditions triggering landslides in the region. Compared to other existing threshold definition methods, the proposed one (i) largely reduces the subjectivity in the choice of the threshold model and in how it is calculated, and (ii) can be set up more easily in other study areas. The proposed approach can be conveniently integrated into existing early-warning systems to improve the accuracy of the estimation of the real landslide occurrence probability associated with rainfall events and its uncertainty.
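A minimal sketch of the two-cluster idea described above: fit one distribution to rainfall events that triggered landslides and one to those that did not, then score a new event with Bayes' rule. The choice of a bivariate normal in log (duration, intensity) space is an assumption for illustration, not the authors' exact formulation.

```python
import numpy as np
from scipy import stats

def fit_two_class_model(d_trig, i_trig, d_non, i_non):
    """Fit bivariate normal models in log (duration, intensity) space to
    triggering and non-triggering rainfall events (illustrative choice)."""
    X_t = np.log(np.column_stack([d_trig, i_trig]))
    X_n = np.log(np.column_stack([d_non, i_non]))
    mvn_t = stats.multivariate_normal(X_t.mean(axis=0), np.cov(X_t, rowvar=False))
    mvn_n = stats.multivariate_normal(X_n.mean(axis=0), np.cov(X_n, rowvar=False))
    prior_t = len(d_trig) / (len(d_trig) + len(d_non))
    return mvn_t, mvn_n, prior_t

def landslide_probability(duration, intensity, mvn_t, mvn_n, prior_t):
    """Posterior probability that a new (duration, intensity) event belongs to
    the landslide-triggering cluster (Bayes' rule)."""
    x = np.log([duration, intensity])
    like_t = mvn_t.pdf(x) * prior_t
    like_n = mvn_n.pdf(x) * (1.0 - prior_t)
    return like_t / (like_t + like_n)
```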
Error Discounting in Probabilistic Category Learning
Craig, Stewart; Lewandowsky, Stephan; Little, Daniel R.
2011-01-01
Some current theories of probabilistic categorization assume that people gradually attenuate their learning in response to unavoidable error. However, existing evidence for this error discounting is sparse and open to alternative interpretations. We report two probabilistic-categorization experiments that investigated error discounting by shifting feedback probabilities to new values after different amounts of training. In both experiments, responding gradually became less responsive to errors, and learning was slowed for some time after the feedback shift. Both results are indicative of error discounting. Quantitative modeling of the data revealed that adding a mechanism for error discounting significantly improved the fits of an exemplar-based and a rule-based associative learning model, as well as of a recency-based model of categorization. We conclude that error discounting is an important component of probabilistic learning. PMID:21355666
Probabilistic interpretation of Peelle's pertinent puzzle and its resolution
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hanson, Kenneth M.; Kawano, T.; Talou, P.
2004-01-01
Peelle's Pertinent Puzzle (PPP) states a seemingly plausible set of measurements with their covariance matrix, which produce an implausible answer. To answer the PPP question, we describe a reasonable experimental situation that is consistent with the PPP solution. The confusion surrounding the PPP arises in part because of its imprecise statement, which permits a variety of interpretations and resulting answers, some of which seem implausible. We emphasize the importance of basing the analysis on an unambiguous probabilistic model that reflects the experimental situation. We present several different models of how the measurements quoted in the PPP problem could be obtained, and interpret their solution in terms of a detailed probabilistic analysis. We suggest a probabilistic approach to handling uncertainties about which model to use.
Probabilistic Interpretation of Peelle's Pertinent Puzzle and its Resolution
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hanson, Kenneth M.; Kawano, Toshihiko; Talou, Patrick
2005-05-24
Peelle's Pertinent Puzzle (PPP) states a seemingly plausible set of measurements with their covariance matrix, which produce an implausible answer. To answer the PPP question, we describe a reasonable experimental situation that is consistent with the PPP solution. The confusion surrounding the PPP arises in part because of its imprecise statement, which permits a variety of interpretations and resulting answers, some of which seem implausible. We emphasize the importance of basing the analysis on an unambiguous probabilistic model that reflects the experimental situation. We present several different models of how the measurements quoted in the PPP problem could be obtained, and interpret their solution in terms of a detailed probabilistic analysis. We suggest a probabilistic approach to handling uncertainties about which model to use.
Failed rib region prediction in a human body model during crash events with precrash braking.
Guleyupoglu, B; Koya, B; Barnard, R; Gayzik, F S
2018-02-28
The objective of this study is 2-fold. We used a validated human body finite element model to study the predicted chest injury (focusing on rib fracture as a function of element strain) based on varying levels of simulated precrash braking. Furthermore, we compare deterministic and probabilistic methods of rib injury prediction in the computational model. The Global Human Body Models Consortium (GHBMC) M50-O model was gravity settled in the driver position of a generic interior equipped with an advanced 3-point belt and airbag. Twelve cases were investigated with permutations for failure, precrash braking system, and crash severity. The severities used were median (17 kph), severe (34 kph), and New Car Assessment Program (NCAP; 56.4 kph). Cases with failure enabled removed rib cortical bone elements once 1.8% effective plastic strain was exceeded. Alternatively, a probabilistic framework found in the literature was used to predict rib failure. Both the probabilistic and deterministic methods take into consideration location (anterior, lateral, and posterior). The deterministic method is based on a rubric that defines failed rib regions dependent on a threshold for contiguous failed elements. The probabilistic method depends on age-based strain and failure functions. Kinematics between both methods were similar (peak max deviation: ΔX head = 17 mm; ΔZ head = 4 mm; ΔX thorax = 5 mm; ΔZ thorax = 1 mm). Seat belt forces at the time of probabilistic failed region initiation were lower than those at deterministic failed region initiation. The probabilistic method for rib fracture predicted more failed regions in the rib (an analog for fracture) than the deterministic method in all but 1 case where they were equal. The failed region patterns between models are similar; however, there are differences that arise due to stress reduced from element elimination that cause probabilistic failed regions to continue to rise after no deterministic failed region would be predicted. Both the probabilistic and deterministic methods indicate similar trends with regards to the effect of precrash braking; however, there are tradeoffs. The deterministic failed region method is more spatially sensitive to failure and is more sensitive to belt loads. The probabilistic failed region method allows for increased capability in postprocessing with respect to age. The probabilistic failed region method predicted more failed regions than the deterministic failed region method due to force distribution differences.
NASA Astrophysics Data System (ADS)
Kaźmierczak, Bartosz; Wartalska, Katarzyna; Wdowikowski, Marcin; Kotowski, Andrzej
2017-11-01
Modern research on heavy rainfall analysis for sewerage design indicates the need to develop and use probabilistic rain models. One issue that remains to be resolved is the shortest rainfall duration to be analyzed. It is commonly believed that the best duration is 5 minutes, while the shortest rain duration measured by national services is often 10 or even 15 minutes. The main aim of this paper is to present the difference between the results of probabilistic rainfall models derived from rainfall time series that include or exclude the 5-minute rainfall duration. Analyses were made for the long period 1961-2010 at the Polish meteorological station Legnica. To develop the probabilistic model best fitted to the measured rainfall data, four probability distributions were used. The results clearly indicate that models including the 5-minute rainfall duration remain more appropriate to use.
Pasta, D J; Taylor, J L; Henning, J M
1999-01-01
Decision-analytic models are frequently used to evaluate the relative costs and benefits of alternative therapeutic strategies for health care. Various types of sensitivity analysis are used to evaluate the uncertainty inherent in the models. Although probabilistic sensitivity analysis is more difficult theoretically and computationally, the results can be much more powerful and useful than deterministic sensitivity analysis. The authors show how a Monte Carlo simulation can be implemented using standard software to perform a probabilistic sensitivity analysis incorporating the bootstrap. The method is applied to a decision-analytic model evaluating the cost-effectiveness of Helicobacter pylori eradication. The necessary steps are straightforward and are described in detail. The use of the bootstrap avoids certain difficulties encountered with theoretical distributions. The probabilistic sensitivity analysis provided insights into the decision-analytic model beyond the traditional base-case and deterministic sensitivity analyses and should become the standard method for assessing sensitivity.
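A minimal sketch of a bootstrap-based probabilistic sensitivity analysis in the spirit described above; the two-strategy cost/effect data and the ICER summary are placeholders, not the authors' Helicobacter pylori model.

```python
import numpy as np

rng = np.random.default_rng(0)

def incremental_cost_effectiveness(cost_sample, effect_sample):
    """Hypothetical decision-analytic model: ICER of strategy B vs A computed
    from bootstrapped per-patient costs and effects (placeholder logic)."""
    d_cost = cost_sample[:, 1].mean() - cost_sample[:, 0].mean()
    d_eff = effect_sample[:, 1].mean() - effect_sample[:, 0].mean()
    return d_cost / d_eff

def bootstrap_psa(costs, effects, n_boot=5000):
    """Resample patients with replacement and re-evaluate the model each time,
    yielding an empirical distribution of the ICER."""
    n = costs.shape[0]
    icers = np.empty(n_boot)
    for b in range(n_boot):
        idx = rng.integers(0, n, size=n)
        icers[b] = incremental_cost_effectiveness(costs[idx], effects[idx])
    return icers

# Illustrative made-up data: columns = strategies A and B.
costs = rng.normal([1000.0, 1200.0], 150.0, size=(200, 2))
effects = rng.normal([0.70, 0.78], 0.05, size=(200, 2))
icers = bootstrap_psa(costs, effects)
print(np.percentile(icers, [2.5, 50, 97.5]))
```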
Using hidden Markov models to align multiple sequences.
Mount, David W
2009-07-01
A hidden Markov model (HMM) is a probabilistic model of a multiple sequence alignment (msa) of proteins. In the model, each column of symbols in the alignment is represented by a frequency distribution of the symbols (called a "state"), and insertions and deletions are represented by other states. One moves through the model along a particular path from state to state in a Markov chain (i.e., random choice of next move), trying to match a given sequence. The next matching symbol is chosen from each state, recording its probability (frequency) and also the probability of going to that state from a previous one (the transition probability). State and transition probabilities are multiplied to obtain a probability of the given sequence. The hidden nature of the HMM is due to the lack of information about the value of a specific state, which is instead represented by a probability distribution over all possible values. This article discusses the advantages and disadvantages of HMMs in msa and presents algorithms for calculating an HMM and the conditions for producing the best HMM.
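A toy illustration of the multiplication of emission (state) and transition probabilities described above, for a three-column match-state-only model; real profile HMMs also include insert and delete states, and the numbers below are made up.

```python
# Toy illustration of scoring a sequence against a profile-HMM path by
# multiplying emission and transition probabilities (match states only;
# probabilities below are made up for the example).

emission = [                      # one dict per match state (column of the msa)
    {"A": 0.7, "C": 0.1, "G": 0.1, "T": 0.1},
    {"A": 0.1, "C": 0.6, "G": 0.2, "T": 0.1},
    {"A": 0.2, "C": 0.2, "G": 0.5, "T": 0.1},
]
transition = [0.9, 0.8, 0.7]      # P(move from state i to the next state / end)

def path_probability(sequence):
    """P(sequence | path through the match states) as a product of
    emission and transition probabilities."""
    p = 1.0
    for state, symbol, t in zip(emission, sequence, transition):
        p *= state[symbol] * t
    return p

print(path_probability("ACG"))    # (0.7*0.9) * (0.6*0.8) * (0.5*0.7)
```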
A Re-Unification of Two Competing Models for Document Retrieval.
ERIC Educational Resources Information Center
Bodoff, David
1999-01-01
Examines query-oriented versus document-oriented information retrieval and feedback learning. Highlights include a reunification of the two approaches for probabilistic document retrieval and for vector space model (VSM) retrieval; learning in VSM and in probabilistic models; multi-dimensional scaling; and ongoing field studies. (LRW)
A PROBABILISTIC POPULATION EXPOSURE MODEL FOR PM10 AND PM 2.5
A first-generation probabilistic population exposure model for particulate matter (PM), specifically for predicting PM10 and PM2.5 exposures of an urban population, has been developed. This model is intended to be used to predict exposure (magnitude, frequency, and duration) ...
Probabilistic Prediction of Lifetimes of Ceramic Parts
NASA Technical Reports Server (NTRS)
Nemeth, Noel N.; Gyekenyesi, John P.; Jadaan, Osama M.; Palfi, Tamas; Powers, Lynn; Reh, Stefan; Baker, Eric H.
2006-01-01
ANSYS/CARES/PDS is a software system that combines the ANSYS Probabilistic Design System (PDS) software with a modified version of the Ceramics Analysis and Reliability Evaluation of Structures Life (CARES/Life) Version 6.0 software. [A prior version of CARES/Life was reported in Program for Evaluation of Reliability of Ceramic Parts (LEW-16018), NASA Tech Briefs, Vol. 20, No. 3 (March 1996), page 28.] CARES/Life models effects of stochastic strength, slow crack growth, and stress distribution on the overall reliability of a ceramic component. The essence of the enhancement in CARES/Life 6.0 is the capability to predict the probability of failure using results from transient finite-element analysis. ANSYS PDS models the effects of uncertainty in material properties, dimensions, and loading on the stress distribution and deformation. ANSYS/CARES/PDS accounts for the effects of probabilistic strength, probabilistic loads, probabilistic material properties, and probabilistic tolerances on the lifetime and reliability of the component. Even failure probability becomes a stochastic quantity that can be tracked as a response variable. ANSYS/CARES/PDS enables tracking of all stochastic quantities in the design space, thereby enabling more precise probabilistic prediction of lifetimes of ceramic components.
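As background for the strength statistics mentioned above, ceramic reliability codes of this kind typically build on a Weibull weakest-link form such as the sketch below; this is illustrative and not a reproduction of the CARES/Life formulation.

```latex
% Two-parameter Weibull weakest-link form commonly underlying ceramic
% reliability codes of this kind (illustrative; not quoted from the
% CARES/Life documentation):
\[
  P_{f} \;=\; 1 - \exp\!\left[-\int_{V}
      \left(\frac{\sigma(\mathbf{x})}{\sigma_{0}}\right)^{m} \mathrm{d}V\right],
\]
% where m is the Weibull modulus, \sigma_0 a characteristic strength, and the
% integral runs over the stressed volume (or surface) of the component.
```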
Probabilistic structural analysis of aerospace components using NESSUS
NASA Technical Reports Server (NTRS)
Shiao, Michael C.; Nagpal, Vinod K.; Chamis, Christos C.
1988-01-01
Probabilistic structural analysis of a Space Shuttle main engine turbopump blade is conducted using the computer code NESSUS (numerical evaluation of stochastic structures under stress). The goal of the analysis is to derive probabilistic characteristics of blade response given probabilistic descriptions of uncertainties in blade geometry, material properties, and temperature and pressure distributions. Probability densities are derived for critical blade responses. Risk assessment and failure life analysis is conducted assuming different failure models.
Cognitive Development Effects of Teaching Probabilistic Decision Making to Middle School Students
ERIC Educational Resources Information Center
Mjelde, James W.; Litzenberg, Kerry K.; Lindner, James R.
2011-01-01
This study investigated the comprehension and effectiveness of teaching formal, probabilistic decision-making skills to middle school students. Two specific objectives were to determine (1) if middle school students can comprehend a probabilistic decision-making approach, and (2) if exposure to the modeling approaches improves middle school…
One of the major recommendations of the National Academy of Science to the USEPA, NMFS and USFWS was to utilize probabilistic methods when assessing the risks of pesticides to federally listed endangered and threatened species. The Terrestrial Investigation Model (TIM, version 3....
Exploring Term Dependences in Probabilistic Information Retrieval Model.
ERIC Educational Resources Information Center
Cho, Bong-Hyun; Lee, Changki; Lee, Gary Geunbae
2003-01-01
Describes a theoretic process to apply Bahadur-Lazarsfeld expansion (BLE) to general probabilistic models and the state-of-the-art 2-Poisson model. Through experiments on two standard document collections, one in Korean and one in English, it is demonstrated that incorporation of term dependences using BLE significantly contributes to performance…
A Probabilistic Model of Phonological Relationships from Contrast to Allophony
ERIC Educational Resources Information Center
Hall, Kathleen Currie
2009-01-01
This dissertation proposes a model of phonological relationships, the Probabilistic Phonological Relationship Model (PPRM), that quantifies how predictably distributed two sounds in a relationship are. It builds on a core premise of traditional phonological analysis, that the ability to define phonological relationships such as contrast and…
Iowa Gambling Task (IGT): twenty years after – gambling disorder and IGT
Brevers, Damien; Bechara, Antoine; Cleeremans, Axel; Noël, Xavier
2013-01-01
The Iowa Gambling Task (IGT) involves probabilistic learning via monetary rewards and punishments, where advantageous task performance requires subjects to forego potential large immediate rewards for small longer-term rewards to avoid larger losses. Pathological gamblers (PG) perform worse on the IGT compared to controls, relating to their persistent preference toward high, immediate, and uncertain rewards despite experiencing larger losses. In this contribution, we review studies that investigated processes associated with poor IGT performance in PG. Findings from these studies seem to fit with recent neurocognitive models of addiction, which argue that the diminished ability of addicted individuals to ponder short-term against long-term consequences of a choice may be the product of an hyperactive automatic attentional and memory system for signaling the presence of addiction-related cues (e.g., high uncertain rewards associated with disadvantageous decks selection during the IGT) and for attributing to such cues pleasure and excitement. This incentive-salience associated with gambling-related choice in PG may be so high that it could literally “hijack” resources [“hot” executive functions (EFs)] involved in emotional self-regulation and necessary to allow the enactment of further elaborate decontextualized problem-solving abilities (“cool” EFs). A framework for future research is also proposed, which highlights the need for studies examining how these processes contribute specifically to the aberrant choice profile displayed by PG on the IGT. PMID:24137138
PubMed related articles: a probabilistic topic-based model for content similarity
Lin, Jimmy; Wilbur, W John
2007-01-01
Background We present a probabilistic topic-based model for content similarity called pmra that underlies the related article search feature in PubMed. Whether or not a document is about a particular topic is computed from term frequencies, modeled as Poisson distributions. Unlike previous probabilistic retrieval models, we do not attempt to estimate relevance; rather, our focus is "relatedness", the probability that a user would want to examine a particular document given known interest in another. We also describe a novel technique for estimating parameters that does not require human relevance judgments; instead, the process is based on the existence of MeSH® in MEDLINE®. Results The pmra retrieval model was compared against bm25, a competitive probabilistic model that shares theoretical similarities. Experiments using the test collection from the TREC 2005 genomics track show a small but statistically significant improvement of pmra over bm25 in terms of precision. Conclusion Our experiments suggest that the pmra model provides an effective ranking algorithm for related article search. PMID:17971238
Toward a Probabilistic Phenological Model for Wheat Growing Degree Days (GDD)
NASA Astrophysics Data System (ADS)
Rahmani, E.; Hense, A.
2017-12-01
Are there deterministic relations between phenological and climate parameters? The answer is surely 'No'. This answer motivated us to solve the problem through probabilistic theories. Thus, we developed a probabilistic phenological model which has the advantage of giving additional information in terms of uncertainty. To that aim, we turned to a statistical method named survival analysis. Survival analysis deals with death in biological organisms and failure in mechanical systems. In the survival analysis literature, death or failure is considered an event. By event, in this research we mean the ripening date of wheat; we assume only one event in this special case. By time, we mean the growing duration from sowing to ripening, the lifetime of wheat, which is a function of GDD. More precisely, we aim to produce a probabilistic forecast of wheat ripening, with the probability value varying between 0 and 1. Here, the survivor function gives the probability that wheat that has not yet ripened survives longer than a specific time, or will survive to the end of its lifetime as a ripened crop. The survival function at each station is determined by fitting a normal distribution to the GDD as a function of growth duration. Verification of the models obtained is done using the CRPS skill score (CRPSS). The positive values of CRPSS indicate the large superiority of the probabilistic phenological survival model over the deterministic models. These results demonstrate that considering uncertainties in modeling is beneficial, meaningful and necessary. We believe that probabilistic phenological models have the potential to help reduce the vulnerability of agricultural production systems to climate change, thereby increasing food security.
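A minimal sketch of the survivor-function idea described above, assuming a normal distribution fitted to station GDD at ripening; the observations and the SciPy-based implementation are illustrative, not the authors' code.

```python
import numpy as np
from scipy import stats

def fit_ripening_model(gdd_at_ripening):
    """Fit a normal distribution to observed growing degree days (GDD)
    accumulated from sowing to ripening at one station."""
    return stats.norm(np.mean(gdd_at_ripening), np.std(gdd_at_ripening, ddof=1))

def survivor(gdd, model):
    """S(gdd) = P(crop not yet ripened once `gdd` has accumulated) = 1 - CDF."""
    return model.sf(gdd)

def ripening_probability(gdd, model):
    """Probabilistic forecast of ripening by the time `gdd` has accumulated."""
    return model.cdf(gdd)

# Illustrative (made-up) station record of GDD at ripening:
observed = [1550, 1610, 1580, 1640, 1595, 1570, 1625]
m = fit_ripening_model(observed)
print(ripening_probability(1600, m), survivor(1600, m))
```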
A probabilistic seismic model for the European Arctic
NASA Astrophysics Data System (ADS)
Hauser, Juerg; Dyer, Kathleen M.; Pasyanos, Michael E.; Bungum, Hilmar; Faleide, Jan I.; Clark, Stephen A.; Schweitzer, Johannes
2011-01-01
The development of three-dimensional seismic models for the crust and upper mantle has traditionally focused on finding one model that provides the best fit to the data while observing some regularization constraints. In contrast to this, the inversion employed here fits the data in a probabilistic sense and thus provides a quantitative measure of model uncertainty. Our probabilistic model is based on two sources of information: (1) prior information, which is independent from the data, and (2) different geophysical data sets, including thickness constraints, velocity profiles, gravity data, surface wave group velocities, and regional body wave traveltimes. We use a Markov chain Monte Carlo (MCMC) algorithm to sample models from the prior distribution, the set of plausible models, and test them against the data to generate the posterior distribution, the ensemble of models that fit the data with assigned uncertainties. While being computationally more expensive, such a probabilistic inversion provides a more complete picture of solution space and allows us to combine various data sets. The complex geology of the European Arctic, encompassing oceanic crust, continental shelf regions, rift basins and old cratonic crust, as well as the nonuniform coverage of the region by data with varying degrees of uncertainty, makes it a challenging setting for any imaging technique and, therefore, an ideal environment for demonstrating the practical advantages of a probabilistic approach. Maps of depth to basement and depth to Moho derived from the posterior distribution are in good agreement with previously published maps and interpretations of the regional tectonic setting. The predicted uncertainties, which are as important as the absolute values, correlate well with the variations in data coverage and quality in the region. A practical advantage of our probabilistic model is that it can provide estimates for the uncertainties of observables due to model uncertainties. We will demonstrate how this can be used for the formulation of earthquake location algorithms that take model uncertainties into account when estimating location uncertainties.
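A generic random-walk Metropolis sketch in the spirit of the MCMC sampling described above; the placeholder prior, Gaussian misfit, and identity forward model are assumptions for illustration, and the actual inversion combines far richer data types and forward solvers.

```python
import numpy as np

rng = np.random.default_rng(1)

def log_prior(m):
    """Plausibility of model vector m independent of the data (placeholder: broad normal)."""
    return -0.5 * np.sum((m / 10.0) ** 2)

def log_likelihood(m, forward, d_obs, sigma):
    """Gaussian data misfit between predictions forward(m) and observations d_obs."""
    r = (forward(m) - d_obs) / sigma
    return -0.5 * np.sum(r ** 2)

def metropolis(forward, d_obs, sigma, m0, n_steps=10000, step=0.1):
    """Random-walk Metropolis sampler of the posterior over model vectors."""
    m, samples = np.array(m0, float), []
    lp = log_prior(m) + log_likelihood(m, forward, d_obs, sigma)
    for _ in range(n_steps):
        m_new = m + step * rng.standard_normal(m.size)
        lp_new = log_prior(m_new) + log_likelihood(m_new, forward, d_obs, sigma)
        if np.log(rng.random()) < lp_new - lp:
            m, lp = m_new, lp_new
        samples.append(m.copy())
    return np.array(samples)

# Trivial usage with an identity forward model and two observations:
samples = metropolis(lambda m: m, d_obs=np.array([1.0, 2.0]), sigma=0.1,
                     m0=[0.0, 0.0], n_steps=2000)
print(samples.mean(axis=0))
```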
Learning Probabilistic Inference through Spike-Timing-Dependent Plasticity.
Pecevski, Dejan; Maass, Wolfgang
2016-01-01
Numerous experimental data show that the brain is able to extract information from complex, uncertain, and often ambiguous experiences. Furthermore, it can use such learnt information for decision making through probabilistic inference. Several models have been proposed that aim at explaining how probabilistic inference could be performed by networks of neurons in the brain. We propose here a model that can also explain how such neural network could acquire the necessary information for that from examples. We show that spike-timing-dependent plasticity in combination with intrinsic plasticity generates in ensembles of pyramidal cells with lateral inhibition a fundamental building block for that: probabilistic associations between neurons that represent through their firing current values of random variables. Furthermore, by combining such adaptive network motifs in a recursive manner the resulting network is enabled to extract statistical information from complex input streams, and to build an internal model for the distribution p* that generates the examples it receives. This holds even if p* contains higher-order moments. The analysis of this learning process is supported by a rigorous theoretical foundation. Furthermore, we show that the network can use the learnt internal model immediately for prediction, decision making, and other types of probabilistic inference.
The Gain-Loss Model: A Probabilistic Skill Multimap Model for Assessing Learning Processes
ERIC Educational Resources Information Center
Robusto, Egidio; Stefanutti, Luca; Anselmi, Pasquale
2010-01-01
Within the theoretical framework of knowledge space theory, a probabilistic skill multimap model for assessing learning processes is proposed. The learning process of a student is modeled as a function of the student's knowledge and of an educational intervention on the attainment of specific skills required to solve problems in a knowledge…
A selective emotional decision-making bias elicited by facial expressions.
Furl, Nicholas; Gallagher, Shannon; Averbeck, Bruno B
2012-01-01
Emotional and social information can sway otherwise rational decisions. For example, when participants decide between two faces that are probabilistically rewarded, they make biased choices that favor smiling relative to angry faces. This bias may arise because facial expressions evoke positive and negative emotional responses, which in turn may motivate social approach and avoidance. We tested a wide range of pictures that evoke emotions or convey social information, including animals, words, foods, a variety of scenes, and faces differing in trustworthiness or attractiveness, but we found that only facial expressions biased decisions. Our results extend brain imaging and pharmacological findings, which suggest that a brain mechanism supporting social interaction may be involved. Facial expressions appear to exert special influence over this social interaction mechanism, one capable of biasing otherwise rational choices. These results illustrate that only specific types of emotional experiences can sway our choices.
A PROBABILISTIC MODELING FRAMEWORK FOR PREDICTING POPULATION EXPOSURES TO BENZENE
The US Environmental Protection Agency (EPA) is modifying its probabilistic Stochastic Human Exposure Dose Simulation (SHEDS) model to assess aggregate exposures to air toxics. Air toxics include urban Hazardous Air Pollutants (HAPS) such as benzene from mobile sources, part...
A new discriminative kernel from probabilistic models.
Tsuda, Koji; Kawanabe, Motoaki; Rätsch, Gunnar; Sonnenburg, Sören; Müller, Klaus-Robert
2002-10-01
Recently, Jaakkola and Haussler (1999) proposed a method for constructing kernel functions from probabilistic models. Their so-called Fisher kernel has been combined with discriminative classifiers such as support vector machines and applied successfully in, for example, DNA and protein analysis. Whereas the Fisher kernel is calculated from the marginal log-likelihood, we propose the TOP kernel derived from tangent vectors of posterior log-odds. Furthermore, we develop a theoretical framework on feature extractors from probabilistic models and use it for analyzing the TOP kernel. In experiments, our new discriminative TOP kernel compares favorably to the Fisher kernel.
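A rough sketch of the two constructions, using a deliberately simple two-class 1D Gaussian model with equal priors and known shared variance; the model and data are invented for illustration and are not those used in the paper.

```python
import numpy as np

# Toy generative model: class +1 ~ N(mu_pos, s^2), class -1 ~ N(mu_neg, s^2), equal priors.
mu_pos, mu_neg, s = 1.0, -1.0, 1.0

def log_gauss(x, mu):
    return -0.5 * np.log(2 * np.pi * s**2) - (x - mu)**2 / (2 * s**2)

def fisher_features(x):
    # Fisher score: gradient of the marginal log-likelihood w.r.t. (mu_pos, mu_neg),
    # where the class responsibilities act as weights (equal priors assumed).
    lp, ln = log_gauss(x, mu_pos), log_gauss(x, mu_neg)
    r_pos = np.exp(lp) / (np.exp(lp) + np.exp(ln))
    return np.array([r_pos * (x - mu_pos) / s**2,
                     (1 - r_pos) * (x - mu_neg) / s**2])

def top_features(x):
    # TOP features: posterior log-odds v(x) plus its tangent (gradient) w.r.t. the parameters.
    v = log_gauss(x, mu_pos) - log_gauss(x, mu_neg)   # equal priors cancel
    dv_dmu_pos = (x - mu_pos) / s**2
    dv_dmu_neg = -(x - mu_neg) / s**2
    return np.array([v, dv_dmu_pos, dv_dmu_neg])

def kernel(feat, x, y):
    return float(feat(x) @ feat(y))   # inner product in feature space

print("Fisher kernel K(0.5, 1.2):", kernel(fisher_features, 0.5, 1.2))
print("TOP kernel    K(0.5, 1.2):", kernel(top_features, 0.5, 1.2))
```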
From cyclone tracks to the costs of European winter storms: A probabilistic loss assessment model
NASA Astrophysics Data System (ADS)
Renggli, Dominik; Corti, Thierry; Reese, Stefan; Wueest, Marc; Viktor, Elisabeth; Zimmerli, Peter
2014-05-01
The quantitative assessment of the potential losses of European winter storms is essential for the economic viability of a global reinsurance company. For this purpose, reinsurance companies generally use probabilistic loss assessment models. This work presents an innovative approach to develop physically meaningful probabilistic events for Swiss Re's new European winter storm loss model. The meteorological hazard component of the new model is based on cyclone and windstorm tracks identified in the 20th Century Reanalysis data. The knowledge of the evolution of winter storms both in time and space allows the physically meaningful perturbation of properties of historical events (e.g. track, intensity). The perturbation includes a random element but also takes the local climatology and the evolution of the historical event into account. The low-resolution wind footprints taken from 20th Century Reanalysis are processed by a statistical-dynamical downscaling to generate high-resolution footprints of the historical and probabilistic winter storm events. Downscaling transfer functions are generated using ENSEMBLES regional climate model data. The result is a set of reliable probabilistic events representing thousands of years. The event set is then combined with country- and risk-specific vulnerability functions and detailed market- or client-specific exposure information to compute (re-)insurance risk premiums.
Probabilistic Usage of the Multi-Factor Interaction Model
NASA Technical Reports Server (NTRS)
Chamis, Christos C.
2008-01-01
A Multi-Factor Interaction Model (MFIM) is used to predict the insulating foam mass expulsion during the ascent of a space vehicle. The exponents in the MFIM are evaluated by an available approach that consists of least squares and an optimization algorithm. These results were subsequently used to probabilistically evaluate the effects of the uncertainties in each participating factor on the mass expulsion. The probabilistic results show that the surface temperature dominates at high probabilities and the pressure, which causes the mass expulsion, dominates at low probabilities.
Biehler, J; Wall, W A
2018-02-01
If computational models are ever to be used in high-stakes decision making in clinical practice, the use of personalized models and predictive simulation techniques is a must. This entails rigorous quantification of uncertainties as well as harnessing available patient-specific data to the greatest extent possible. Although researchers are beginning to realize that taking uncertainty in model input parameters into account is a necessity, the predominantly used probabilistic description for these uncertain parameters is based on elementary random variable models. In this work, we set out for a comparison of different probabilistic models for uncertain input parameters using the example of an uncertain wall thickness in finite element models of abdominal aortic aneurysms. We provide the first comparison between a random variable and a random field model for the aortic wall and investigate the impact on the probability distribution of the computed peak wall stress. Moreover, we show that the uncertainty about the prevailing peak wall stress can be reduced if noninvasively available, patient-specific data are harnessed for the construction of the probabilistic wall thickness model. Copyright © 2017 John Wiley & Sons, Ltd.
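A minimal sketch of the two probabilistic descriptions being compared, using a toy 1D wall profile with an assumed exponential covariance; the geometry, parameter values and the "minimum thickness" quantity of interest are illustrative only, not the authors' aortic wall model.

```python
import numpy as np

rng = np.random.default_rng(1)
n_points, n_samples = 50, 500
x = np.linspace(0.0, 100.0, n_points)          # mm along a 1D wall profile
mean_t, sd_t, corr_len = 1.5, 0.3, 25.0        # mm

# (a) Random-variable model: one thickness value per realization, constant in space.
rv_samples = rng.normal(mean_t, sd_t, size=(n_samples, 1)) * np.ones((1, n_points))

# (b) Random-field model: spatially correlated thickness via an exponential covariance.
cov = sd_t**2 * np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)
L = np.linalg.cholesky(cov + 1e-10 * np.eye(n_points))
rf_samples = mean_t + rng.standard_normal((n_samples, n_points)) @ L.T

# A downstream quantity of interest, e.g. the minimum thickness along the wall,
# has a very different distribution under the two probabilistic models.
print("min thickness, RV model:", np.percentile(rv_samples.min(axis=1), [5, 50, 95]))
print("min thickness, RF model:", np.percentile(rf_samples.min(axis=1), [5, 50, 95]))
```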
Fully probabilistic control for stochastic nonlinear control systems with input dependent noise.
Herzallah, Randa
2015-03-01
Robust controllers for nonlinear stochastic systems with functional uncertainties can be consistently designed using probabilistic control methods. In this paper, a generalised probabilistic controller design is presented that minimises the Kullback-Leibler divergence between the actual joint probability density function (pdf) of the closed-loop control system and an ideal joint pdf, emphasising how the uncertainty can be systematically incorporated in the absence of reliable system models. To achieve this objective, all probabilistic models of the system are estimated from process data using mixture density networks (MDNs), where all the parameters of the estimated pdfs are taken to be state and control input dependent. Based on this dependency of the density parameters on the input values, explicit formulations for the construction of optimal generalised probabilistic controllers are obtained through the techniques of dynamic programming and adaptive critic methods. Using the proposed generalised probabilistic controller, the conditional joint pdfs can be made to follow the ideal ones. A simulation example is used to demonstrate the implementation of the algorithm and encouraging results are obtained. Copyright © 2014 Elsevier Ltd. All rights reserved.
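The objective can be illustrated with a toy case in which both the closed-loop pdf and the ideal pdf are univariate Gaussians, so the Kullback-Leibler divergence has a closed form; the actual method estimates joint pdfs with mixture density networks, which this sketch does not attempt, and the candidate gains and numbers are invented.

```python
import numpy as np

def kl_gauss(mu_p, sd_p, mu_q, sd_q):
    """KL(p || q) for two univariate Gaussians (closed form)."""
    return np.log(sd_q / sd_p) + (sd_p**2 + (mu_p - mu_q)**2) / (2 * sd_q**2) - 0.5

# Hypothetical closed-loop state pdf under two candidate control gains,
# compared against an ideal pdf centred on the setpoint.
ideal_mu, ideal_sd = 0.0, 0.05
candidates = {"gain_a": (0.12, 0.20), "gain_b": (0.03, 0.09)}

for name, (mu, sd) in candidates.items():
    print(name, "KL to ideal pdf:", round(kl_gauss(mu, sd, ideal_mu, ideal_sd), 3))
# The probabilistic controller would pick the input whose closed-loop pdf
# minimises this divergence; here gain_b is closer to the ideal pdf.
```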
Reconstructing Constructivism: Causal Models, Bayesian Learning Mechanisms, and the Theory Theory
ERIC Educational Resources Information Center
Gopnik, Alison; Wellman, Henry M.
2012-01-01
We propose a new version of the "theory theory" grounded in the computational framework of probabilistic causal models and Bayesian learning. Probabilistic models allow a constructivist but rigorous and detailed approach to cognitive development. They also explain the learning of both more specific causal hypotheses and more abstract framework…
To help address the Food Quality Protection Act of 1996, a physically-based, two-stage Monte Carlo probabilistic model has been developed to quantify and analyze aggregate exposure and dose to pesticides via multiple routes and pathways. To illustrate model capabilities and ide...
Probabilistic Analysis Techniques Applied to Complex Spacecraft Power System Modeling
NASA Technical Reports Server (NTRS)
Hojnicki, Jeffrey S.; Rusick, Jeffrey J.
2005-01-01
Electric power system performance predictions are critical to spacecraft, such as the International Space Station (ISS), to ensure that sufficient power is available to support all the spacecraft's power needs. In the case of the ISS power system, analyses to date have been deterministic, meaning that each analysis produces a single-valued result for power capability because of the complexity and large size of the model. As a result, the deterministic ISS analyses did not account for the sensitivity of the power capability to uncertainties in model input variables. Over the last 10 years, the NASA Glenn Research Center has developed advanced, computationally fast, probabilistic analysis techniques and successfully applied them to large (thousands of nodes) complex structural analysis models. These same techniques were recently applied to large, complex ISS power system models. This new application enables probabilistic power analyses that account for input uncertainties and produce results that include variations caused by these uncertainties. Specifically, N&R Engineering, under contract to NASA, integrated these advanced probabilistic techniques with Glenn's internationally recognized ISS power system model, System Power Analysis for Capability Evaluation (SPACE).
Nagarajan, Mahesh B; Raman, Steven S; Lo, Pechin; Lin, Wei-Chan; Khoshnoodi, Pooria; Sayre, James W; Ramakrishna, Bharath; Ahuja, Preeti; Huang, Jiaoti; Margolis, Daniel J A; Lu, David S K; Reiter, Robert E; Goldin, Jonathan G; Brown, Matthew S; Enzmann, Dieter R
2018-02-19
We present a method for generating a T2 MR-based probabilistic model of tumor occurrence in the prostate to guide the selection of anatomical sites for targeted biopsies and serve as a diagnostic tool to aid radiological evaluation of prostate cancer. In our study, the prostate and any radiological findings within were segmented retrospectively on 3D T2-weighted MR images of 266 subjects who underwent radical prostatectomy. Subsequent histopathological analysis determined both the ground truth and the Gleason grade of the tumors. A randomly chosen subset of 19 subjects was used to generate a multi-subject-derived prostate template. Subsequently, a cascading registration algorithm involving both affine and non-rigid B-spline transforms was used to register the prostate of every subject to the template. Corresponding transformation of radiological findings yielded a population-based probabilistic model of tumor occurrence. The quality of our probabilistic model building approach was statistically evaluated by measuring the proportion of correct placements of tumors in the prostate template, i.e., the number of tumors that maintained their anatomical location within the prostate after their transformation into the prostate template space. The probabilistic model built with tumors deemed clinically significant demonstrated a heterogeneous distribution of tumors, with higher likelihood of tumor occurrence at the mid-gland anterior transition zone and the base-to-mid-gland posterior peripheral zones. Of 250 MR lesions analyzed, 248 maintained their original anatomical location with respect to the prostate zones after transformation into the prostate template space. We present a robust method for generating a probabilistic model of tumor occurrence in the prostate that could aid clinical decision making, such as selection of anatomical sites for MR-guided prostate biopsies.
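Once all findings are registered into the template space, the population-based model reduces to a voxel-wise average of binary lesion masks. A toy 2D version with invented masks is sketched below; it is not the authors' registration pipeline.

```python
import numpy as np

rng = np.random.default_rng(7)
n_subjects, shape = 50, (64, 64)

# Hypothetical binary tumor masks already registered into a common template space.
masks = np.zeros((n_subjects, *shape), dtype=float)
for m in masks:
    cy, cx = rng.integers(20, 44, size=2)          # invented lesion centres
    yy, xx = np.ogrid[:shape[0], :shape[1]]
    m[(yy - cy) ** 2 + (xx - cx) ** 2 < rng.integers(9, 36)] = 1.0

# Voxel-wise probability of tumor occurrence across the population.
probability_map = masks.mean(axis=0)
peak = np.unravel_index(np.argmax(probability_map), shape)
print("highest tumor-occurrence probability:", probability_map[peak].round(2), "at voxel", peak)
```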
Thomas C. Edwards; D. Richard Cutler; Niklaus E. Zimmermann; Linda Geiser; Gretchen G. Moisen
2006-01-01
We evaluated the effects of probabilistic (hereafter DESIGN) and non-probabilistic (PURPOSIVE) sample surveys on resultant classification tree models for predicting the presence of four lichen species in the Pacific Northwest, USA. Models derived from both survey forms were assessed using an independent data set (EVALUATION). Measures of accuracy as gauged by...
Modelling default and likelihood reasoning as probabilistic reasoning
NASA Technical Reports Server (NTRS)
Buntine, Wray
1990-01-01
A probabilistic analysis of plausible reasoning about defaults and about likelihood is presented. "Likely" and "by default" are in fact treated as duals in the same sense as possibility and necessity. To model these four forms probabilistically, a qualitative default probabilistic (QDP) logic and its quantitative counterpart DP are derived that allow qualitative and corresponding quantitative reasoning. Consistency and consequent results for subsets of the logics are given that require at most a quadratic number of satisfiability tests in the underlying propositional logic. The quantitative logic shows how to track the propagation error inherent in these reasoning forms. The methodology and sound framework of the system highlight their approximate nature, the dualities, and the need for complementary reasoning about relevance.
O'Sullivan, Aaron J; Pigat, Sandrine; O'Mahony, Cian; Gibney, Michael J; McKevitt, Aideen I
2016-11-01
The choice of suitable normal foods is limited for individuals with particular medical conditions, e.g., inborn errors of metabolism (phenylketonuria - PKU) or severe cow's milk protein allergy (CMPA). Patients may have dietary restrictions and exclusive or partial replacement of specific food groups with specially formulated products to meet particular nutrition requirements. Artificial sweeteners are used to improve the appearance and palatability of such food products to avoid food refusal and ensure dietary adherence. Young children have a higher risk of exceeding acceptable daily intakes for additives than adults due to higher food intakes per kg of body weight. The Budget Method and EFSA's Food Additives Intake Model (FAIM) are not equipped to assess partial dietary replacement with special formulations as they are built on data from dietary surveys of consumers without special medical requirements impacting the diet. The aim of this study was to explore dietary exposure modelling as a means of estimating the intake of artificial sweeteners by young PKU and CMPA patients aged 1-3 years. An adapted validated probabilistic model (FACET) was used to assess patients' exposure to artificial sweeteners. Food consumption data were derived from the food consumption survey data of healthy young children in Ireland from the National Preschool and Nutrition Survey (NPNS, 2010-11). Specially formulated foods for special medical purposes were included in the exposure model to replace restricted foods. Inclusion was based on recommendations for adequate protein intake and dietary adherence data. Exposure assessment results indicated that young children with PKU and CMPA have higher relative average intakes of artificial sweeteners than healthy young children. The reliability and robustness of the model in the estimation of patient additive exposures was further investigated and provides the first exposure estimates for these special populations.
Capturing spatial and temporal patterns of widespread, extreme flooding across Europe
NASA Astrophysics Data System (ADS)
Busby, Kathryn; Raven, Emma; Liu, Ye
2013-04-01
Statistical characterisation of physical hazards is an integral part of probabilistic catastrophe models used by the reinsurance industry to estimate losses from large scale events. Extreme flood events are not restricted by country boundaries, which poses an issue for reinsurance companies as their exposures often extend beyond them. We discuss challenges and solutions that allow us to appropriately capture the spatial and temporal dependence of extreme hydrological events on a continental scale, which in turn enables us to generate an industry-standard stochastic event set for estimating financial losses for widespread flooding. By presenting our event set methodology, we focus on explaining how extreme value theory (EVT) and dependence modelling are used to account for short, inconsistent hydrological data from different countries, and how to make appropriate statistical decisions that best characterise the nature of flooding across Europe. The consistency of input data is of vital importance when identifying historical flood patterns. Collating data from numerous sources inherently causes inconsistencies and we demonstrate our robust approach to assessing the data and refining it to compile a single consistent dataset. This dataset is then extrapolated using a parameterised EVT distribution to estimate extremes. Our method then captures the dependence of flood events across countries using an advanced multivariate extreme value model. Throughout, important statistical decisions are explored including: (1) distribution choice; (2) the threshold to apply for extracting extreme data points; (3) a regional analysis; (4) the definition of a flood event, which is often linked with the reinsurance industry's hours clause; and (5) handling of missing values. Finally, having modelled the historical patterns of flooding across Europe, we sample from this model to generate our stochastic event set comprising thousands of events over thousands of years. We then briefly illustrate how this is applied within a probabilistic model to estimate catastrophic loss curves used by the reinsurance industry.
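The peaks-over-threshold step described above can be sketched with a synthetic discharge record, an assumed 98th-percentile threshold and a generalized Pareto fit; the data, threshold choice and return period are illustrative only, not the study's event set methodology.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Synthetic stand-in for a 30-year daily river-discharge record (m^3/s).
discharge = rng.gamma(shape=2.0, scale=150.0, size=30 * 365)

# Choose a threshold for extracting extreme data points (here the 98th percentile).
threshold = np.percentile(discharge, 98)
exceedances = discharge[discharge > threshold] - threshold

# Fit a generalized Pareto distribution to the exceedances (peaks-over-threshold).
shape, loc, scale = stats.genpareto.fit(exceedances, floc=0.0)

# Extrapolate to a rare event: the level exceeded on average once in 100 years.
rate = len(exceedances) / 30.0                 # exceedances per year
p_exceed = 1.0 / (100.0 * rate)                # tail probability within the GPD
level_100yr = threshold + stats.genpareto.ppf(1 - p_exceed, shape, loc=0.0, scale=scale)
print(f"threshold: {threshold:.0f} m^3/s, 100-year level: {level_100yr:.0f} m^3/s")
```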
Namazi-Rad, Mohammad-Reza; Dunbar, Michelle; Ghaderi, Hadi; Mokhtarian, Payam
2015-01-01
To achieve greater transit-time reduction and improvement in reliability of transport services, there is an increasing need to assist transport planners in understanding the value of punctuality; i.e. the potential improvements, not only to service quality and the consumer but also to the actual profitability of the service. In order for this to be achieved, it is important to understand the network-specific aspects that affect both the ability to decrease transit-time, and the associated cost-benefit of doing so. In this paper, we outline a framework for evaluating the effectiveness of proposed changes to average transit-time, so as to determine the optimal choice of average arrival time subject to desired punctuality levels whilst simultaneously minimizing operational costs. We model the service transit-time variability using a truncated probability density function, and simultaneously compare the trade-off between potential gains and increased service costs, for several commonly employed cost-benefit functions of general form. We formulate this problem as a constrained optimization problem to determine the optimal choice of average transit time, so as to increase the level of service punctuality, whilst simultaneously ensuring a minimum level of cost-benefit to the service operator. PMID:25992902
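A minimal sketch of the truncated-distribution idea, assuming a truncated normal transit-time model and invented schedule numbers; the paper works with a general truncated density and explicit cost-benefit functions, which are omitted here.

```python
from scipy import stats

# Hypothetical transit-time model: scheduled 40 min, variability modelled as a
# normal distribution truncated to a physically plausible range [30, 70] minutes.
mean, sd, lower, upper = 42.0, 4.0, 30.0, 70.0
a, b = (lower - mean) / sd, (upper - mean) / sd
transit_time = stats.truncnorm(a, b, loc=mean, scale=sd)

# Punctuality: probability of arriving within 5 minutes of the 40-minute schedule.
punctuality = transit_time.cdf(45.0)
print(f"P(transit time <= 45 min) = {punctuality:.3f}")

# A planner can then trade this punctuality level off against the operational
# cost of lowering the mean transit time, e.g. by re-evaluating with mean=40.
```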
Residential air exchange rates (AERs) are a key determinant in the infiltration of ambient air pollution indoors. Population-based human exposure models using probabilistic approaches to estimate personal exposure to air pollutants have relied on input distributions from AER meas...
Dinov, Martin; Leech, Robert
2017-01-01
Part of the process of EEG microstate estimation involves clustering EEG channel data at the global field power (GFP) maxima, very commonly using a modified K-means approach. Clustering has also been done deterministically, despite there being uncertainties in multiple stages of the microstate analysis, including the GFP peak definition, the clustering itself and in the post-clustering assignment of microstates back onto the EEG timecourse of interest. We perform a fully probabilistic microstate clustering and labeling, to account for these sources of uncertainty using the closest probabilistic analog to KM called Fuzzy C-means (FCM). We train softmax multi-layer perceptrons (MLPs) using the KM and FCM-inferred cluster assignments as target labels, to then allow for probabilistic labeling of the full EEG data instead of the usual correlation-based deterministic microstate label assignment typically used. We assess the merits of the probabilistic analysis vs. the deterministic approaches in EEG data recorded while participants perform real or imagined motor movements from a publicly available data set of 109 subjects. Though FCM group template maps that are almost topographically identical to KM were found, there is considerable uncertainty in the subsequent assignment of microstate labels. In general, imagined motor movements are less predictable on a time point-by-time point basis, possibly reflecting the more exploratory nature of the brain state during imagined, compared to during real motor movements. We find that some relationships may be more evident using FCM than using KM and propose that future microstate analysis should preferably be performed probabilistically rather than deterministically, especially in situations such as with brain computer interfaces, where both training and applying models of microstates need to account for uncertainty. Probabilistic neural network-driven microstate assignment has a number of advantages that we have discussed, which are likely to be further developed and exploited in future studies. In conclusion, probabilistic clustering and a probabilistic neural network-driven approach to microstate analysis is likely to better model and reveal details and the variability hidden in current deterministic and binarized microstate assignment and analyses.
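A minimal from-scratch version of the fuzzy C-means updates on synthetic 2D features is sketched below; it is not the authors' EEG pipeline, and the data are random stand-ins for GFP-peak topographies.

```python
import numpy as np

def fuzzy_c_means(X, n_clusters=4, m=2.0, n_iter=100, seed=0):
    """Minimal fuzzy C-means: returns soft memberships (n_samples x n_clusters) and centroids."""
    rng = np.random.default_rng(seed)
    u = rng.dirichlet(np.ones(n_clusters), size=len(X))    # random soft assignments
    for _ in range(n_iter):
        w = u ** m
        centroids = (w.T @ X) / w.sum(axis=0)[:, None]     # fuzzy-weighted means
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2) + 1e-12
        u = 1.0 / (d ** (2.0 / (m - 1.0)))
        u /= u.sum(axis=1, keepdims=True)                  # normalize memberships
    return u, centroids

# Toy stand-in for GFP-peak topographies: 500 samples in a 2D feature space.
X = np.random.default_rng(1).normal(size=(500, 2))
memberships, maps = fuzzy_c_means(X)
print("soft labels for first sample:", memberships[0].round(2))
```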
Integral modeling of human eyes: from anatomy to visual response
NASA Astrophysics Data System (ADS)
Navarro, Rafael
2006-02-01
Three basic stages towards the global modeling of the eye are presented. In the first stage, an adequate choice of the basic geometrical model, a general ellipsoid in this case, permits fitting in a natural way the typical "melon" shape of the cornea with minimum complexity. In addition, it facilitates extracting most of the optically relevant parameters, such as the position and orientation of the optical axis in 3D space, the paraxial and overall refractive power, the amount and axis of astigmatism, etc. In the second stage, this geometrical model, along with optical design and optimization tools, is applied to build customized optical models of individual eyes, able to reproduce the measured wave aberration with high fidelity. Finally, we put together a sequence of schematic, but functionally realistic, models of the different stages of image acquisition, coding and analysis in the visual system, along with a probabilistic Bayesian maximum a posteriori identification approach. This permitted us to build a realistic simulation of all the essential processes involved in a visual acuity clinical exam. It is remarkable that at all three levels, it has been possible for the models to predict the experimental data with high accuracy.
Martínez-Velázquez, Eduardo S; Ramos-Loyo, Julieta; González-Garrido, Andrés A; Sequeira, Henrique
2015-01-21
Feedback-related negativity (FRN) is a negative deflection over frontocentral regions that appears around 250 ms after feedback signalling the gain or loss associated with chosen alternatives in a gambling task. Few studies have reported FRN enhancement in adolescents compared with adults in a gambling task without probabilistic reinforcement learning, despite the fact that learning from positive or negative consequences is crucial for decision-making during adolescence. Therefore, the aim of the present research was to identify differences in FRN amplitude and latency between adolescents and adults on a gambling task with favorable and unfavorable probabilistic reinforcement learning conditions, in addition to a nonlearning condition with monetary gains and losses. Higher rates of high-magnitude choices during the final 30 trials compared with the first 30 trials were observed during the favorable condition, whereas lower rates were observed during the unfavorable condition in both groups. Higher FRN amplitude in all conditions and longer latency in the nonlearning condition were observed in adolescents compared with adults and in relation to losses. Results indicate that both the adolescents and the adults improved their performance in relation to positive and negative feedback. However, the FRN findings suggest an increased sensitivity to external feedback to losses in adolescents compared with adults, irrespective of the presence or absence of probabilistic reinforcement learning. These results reflect processing differences in the neural monitoring system and provide new perspectives on the dynamic development of the adolescent brain.
Bakker, Alexander M R; Wong, Tony E; Ruckert, Kelsey L; Keller, Klaus
2017-06-20
There is a growing awareness that uncertainties surrounding future sea-level projections may be much larger than typically perceived. Recently published projections appear widely divergent and highly sensitive to non-trivial model choices. Moreover, the West Antarctic ice sheet (WAIS) may be much less stable than previously believed, enabling a rapid disintegration. Here, we present a set of probabilistic sea-level projections that approximates the deeply uncertain WAIS contributions. The projections aim to inform robust decisions by clarifying the sensitivity to non-trivial or controversial assumptions. We show that the deeply uncertain WAIS contribution can dominate other uncertainties within decades. These deep uncertainties call for the development of robust adaptive strategies. These decision-making needs, in turn, require mission-oriented basic science, for example about potential signposts and the maximum rate of WAIS-induced sea-level changes.
Data Analysis Recipes: Using Markov Chain Monte Carlo
NASA Astrophysics Data System (ADS)
Hogg, David W.; Foreman-Mackey, Daniel
2018-05-01
Markov Chain Monte Carlo (MCMC) methods for sampling probability density functions (combined with abundant computational resources) have transformed the sciences, especially in performing probabilistic inferences, or fitting models to data. In this primarily pedagogical contribution, we give a brief overview of the most basic MCMC method and some practical advice for the use of MCMC in real inference problems. We give advice on method choice, tuning for performance, methods for initialization, tests of convergence, troubleshooting, and use of the chain output to produce or report parameter estimates with associated uncertainties. We argue that autocorrelation time is the most important test for convergence, as it directly connects to the uncertainty on the sampling estimate of any quantity of interest. We emphasize that sampling is a method for doing integrals; this guides our thinking about how MCMC output is best used.
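A bare-bones version of the workflow the authors describe, with a random-walk Metropolis sampler on a toy one-dimensional posterior and a crude integrated-autocorrelation-time estimate; this is an illustration, not the authors' recommended implementation.

```python
import numpy as np

rng = np.random.default_rng(3)

def log_post(theta):
    """Toy log posterior: standard normal."""
    return -0.5 * theta**2

# Random-walk Metropolis.
n_steps, step = 20000, 1.0
chain = np.empty(n_steps)
theta, lp = 0.0, log_post(0.0)
for i in range(n_steps):
    prop = theta + step * rng.standard_normal()
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:      # accept/reject
        theta, lp = prop, lp_prop
    chain[i] = theta

# Crude integrated autocorrelation time (sum autocorrelations up to the first negative lag).
x = chain - chain.mean()
acf = np.correlate(x, x, mode="full")[len(x) - 1:] / (x.var() * len(x))
tau = 1.0 + 2.0 * np.sum(acf[1:np.argmax(acf[1:] < 0) + 1])

print(f"posterior mean = {chain.mean():.3f} +/- {chain.std() / np.sqrt(len(chain) / tau):.3f}")
print(f"integrated autocorrelation time ~ {tau:.1f} steps")
```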
A generative, probabilistic model of local protein structure.
Boomsma, Wouter; Mardia, Kanti V; Taylor, Charles C; Ferkinghoff-Borg, Jesper; Krogh, Anders; Hamelryck, Thomas
2008-07-01
Despite significant progress in recent years, protein structure prediction maintains its status as one of the prime unsolved problems in computational biology. One of the key remaining challenges is an efficient probabilistic exploration of the structural space that correctly reflects the relative conformational stabilities. Here, we present a fully probabilistic, continuous model of local protein structure in atomic detail. The generative model makes efficient conformational sampling possible and provides a framework for the rigorous analysis of local sequence-structure correlations in the native state. Our method represents a significant theoretical and practical improvement over the widely used fragment assembly technique by avoiding the drawbacks associated with a discrete and nonprobabilistic approach.
Denis Valle; Benjamin Baiser; Christopher W. Woodall; Robin Chazdon; Jerome Chave
2014-01-01
We propose a novel multivariate method to analyse biodiversity data based on the Latent Dirichlet Allocation (LDA) model. LDA, a probabilistic model, reduces assemblages to sets of distinct component communities. It produces easily interpretable results, can represent abrupt and gradual changes in composition, accommodates missing data and allows for coherent estimates...
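A minimal sketch of the idea using scikit-learn's LatentDirichletAllocation on a synthetic site-by-species count matrix; the data and the choice of three component communities are invented for illustration and are not the authors' implementation.

```python
import numpy as np
from sklearn.decomposition import LatentDirichletAllocation

rng = np.random.default_rng(4)

# Synthetic site-by-species abundance matrix: 60 sites x 30 species (counts).
counts = rng.poisson(lam=3.0, size=(60, 30))

# LDA decomposes each site's assemblage into a mixture of K component communities.
lda = LatentDirichletAllocation(n_components=3, random_state=0)
site_mixtures = lda.fit_transform(counts)        # rows sum to ~1: community proportions
community_profiles = lda.components_             # per-community species weights

print("site 0 community proportions:", site_mixtures[0].round(2))
print("top species in community 0:", np.argsort(community_profiles[0])[::-1][:5])
```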
Boos, Moritz; Seer, Caroline; Lange, Florian; Kopp, Bruno
2016-01-01
Cognitive determinants of probabilistic inference were examined using hierarchical Bayesian modeling techniques. A classic urn-ball paradigm served as experimental strategy, involving a factorial two (prior probabilities) by two (likelihoods) design. Five computational models of cognitive processes were compared with the observed behavior. Parameter-free Bayesian posterior probabilities and parameter-free base rate neglect provided inadequate models of probabilistic inference. The introduction of distorted subjective probabilities yielded more robust and generalizable results. A general class of (inverted) S-shaped probability weighting functions has been proposed; however, the possibility of large differences in probability distortions not only across experimental conditions, but also across individuals, seems critical for the model's success. It also seems advantageous to consider individual differences in parameters of probability weighting as being sampled from weakly informative prior distributions of individual parameter values. Thus, the results from hierarchical Bayesian modeling converge with previous results in revealing that probability weighting parameters show considerable task dependency and individual differences. Methodologically, this work exemplifies the usefulness of hierarchical Bayesian modeling techniques for cognitive psychology. Theoretically, human probabilistic inference might be best described as the application of individualized strategic policies for Bayesian belief revision. PMID:27303323
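One common one-parameter inverted-S weighting function can illustrate how distorted subjective probabilities change belief revision; the functional form, parameter value and urn-ball numbers below are illustrative assumptions, not the models fitted in the study.

```python
def weight(p, gamma=0.6):
    """Inverted-S probability weighting (one common one-parameter form)."""
    return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

def posterior(prior, likelihood, distort=None):
    """Posterior for a binary urn-ball problem, optionally with distorted inputs."""
    w = distort if distort is not None else (lambda p: p)
    ph, pd_h, pd_nh = w(prior), w(likelihood), w(1 - likelihood)
    joint_h = ph * pd_h
    return joint_h / (joint_h + (1 - ph) * pd_nh)

# Urn-ball example: prior P(urn A) = 0.7, likelihood P(observed ball | urn A) = 0.8.
print("normative posterior :", round(posterior(0.7, 0.8), 3))
print("distorted posterior :", round(posterior(0.7, 0.8, distort=weight), 3))
# Overweighting small and underweighting large probabilities pulls the
# subjective posterior toward 0.5, mimicking conservative belief revision.
```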
Model fitting data from syllogistic reasoning experiments.
Hattori, Masasi
2016-12-01
The data presented in this article are related to the research article entitled "Probabilistic representation in syllogistic reasoning: A theory to integrate mental models and heuristics" (M. Hattori, 2016) [1]. This article presents data predicted by three signature probabilistic models of syllogistic reasoning and model-fitting results for each of a total of 12 experiments (N = 404) in the literature. The models are implemented in R, and their source code is also provided.
The Probability Heuristics Model of Syllogistic Reasoning.
ERIC Educational Resources Information Center
Chater, Nick; Oaksford, Mike
1999-01-01
Proposes a probability heuristic model for syllogistic reasoning and confirms the rationality of this heuristic by an analysis of the probabilistic validity of syllogistic reasoning that treats logical inference as a limiting case of probabilistic inference. Meta-analysis and two experiments involving 40 adult participants and using generalized…
Modeling Array Stations in SIG-VISA
NASA Astrophysics Data System (ADS)
Ding, N.; Moore, D.; Russell, S.
2013-12-01
We add support for array stations to SIG-VISA, a system for nuclear monitoring using probabilistic inference on seismic signals. Array stations comprise a large portion of the IMS network; they can provide increased sensitivity and more accurate directional information compared to single-component stations. Our existing model assumed that signals were independent at each station, which is false when many stations are close together, as in an array. The new model removes that assumption by jointly modeling signals across array elements. This is done by extending our existing Gaussian process (GP) regression models, also known as kriging, from a 3-dimensional single-component space of events to a 6-dimensional space of station-event pairs. For each array and each event attribute (including coda decay, coda height, amplitude transfer and travel time), we model the joint distribution across array elements using a Gaussian process that learns the correlation lengthscale across the array, thereby incorporating information from array stations into the probabilistic inference framework. To evaluate the effectiveness of our model, we perform 'probabilistic beamforming' on new events using our GP model, i.e., we compute the event azimuth having highest posterior probability under the model, conditioned on the signals at array elements. We compare the results from our probabilistic inference model to the beamforming currently performed by IMS station processing.
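A toy one-dimensional GP (kriging) prediction across array-element positions gives the flavour of the joint model; the kernel, coordinates and residual values below are invented, and the real model works in a 6-dimensional station-event space.

```python
import numpy as np

def rbf(a, b, lengthscale=2.0, variance=1.0):
    """Squared-exponential covariance between coordinate vectors a and b."""
    d2 = (a[:, None] - b[None, :]) ** 2
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

# Toy data: an event attribute (e.g. a travel-time residual, in s) observed at a
# few array elements located along a 1D coordinate (km).
x_obs = np.array([0.0, 1.0, 2.5, 4.0])
y_obs = np.array([0.30, 0.25, 0.10, -0.05])
x_new = np.linspace(0.0, 5.0, 6)

# GP (kriging) posterior mean and variance at unobserved elements.
noise = 1e-3
K = rbf(x_obs, x_obs) + noise * np.eye(len(x_obs))
K_star = rbf(x_new, x_obs)
alpha = np.linalg.solve(K, y_obs)
mean = K_star @ alpha
var = np.diag(rbf(x_new, x_new) - K_star @ np.linalg.solve(K, K_star.T))

for xi, mi, vi in zip(x_new, mean, var):
    print(f"x = {xi:.1f} km: {mi:+.3f} +/- {np.sqrt(vi):.3f}")
```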
NASA Astrophysics Data System (ADS)
Caglar, Mehmet Umut; Pal, Ranadip
2011-03-01
The central dogma of molecular biology states that "information cannot be transferred back from protein to either protein or nucleic acid". However, this assumption is not exactly correct in most cases. There are many feedback loops and interactions between different levels of such systems. These types of interactions are hard to analyze due to the lack of cell-level data and the probabilistic, nonlinear nature of the interactions. Several models are widely used to analyze and simulate these types of nonlinear interactions. Stochastic Master Equation (SME) models capture the probabilistic nature of the interactions in a detailed manner, at a high computational cost. On the other hand, Probabilistic Boolean Network (PBN) models give a coarse-scale picture of the stochastic processes, at a lower computational cost. Differential Equation (DE) models give the time evolution of the mean values of the processes in a highly cost-effective way. Understanding the relations between the predictions of these models is important for judging the reliability of simulations of genetic regulatory networks. In this work, the success of the mapping between SME, PBN and DE models is analyzed, and the accuracy and effectiveness of the control policies generated by using PBN and DE models are compared.
Development of probabilistic regional climate scenario in East Asia
NASA Astrophysics Data System (ADS)
Dairaku, K.; Ueno, G.; Ishizaki, N. N.
2015-12-01
Climate information and services for Impacts, Adaptation and Vulnerability (IAV) assessments are of great concern. In order to develop probabilistic regional climate information that represents the uncertainty in climate scenario experiments in East Asia (CORDEX-EA and Japan), the probability distribution of 2 m air temperature was estimated using a regression model developed for this purpose. The method can easily be applied to other regions and other physical quantities, and can also be used to downscale to finer scales, depending on the availability of observational datasets. Probabilistic climate information for the present (1969-1998) and future (2069-2098) climate was developed using 21 CMIP3 SRES A1B scenario models and observational data (CRU_TS3.22 & University of Delaware in CORDEX-EA, NIAES AMeDAS mesh data in Japan). The prototype of probabilistic information for CORDEX-EA and Japan represents the quantified structural uncertainties of multi-model ensemble experiments of climate change scenarios. Appropriate combinations of statistical methods and optimization of climate ensemble experiments using multiple general circulation models (GCMs) and multiple regional climate models (RCMs) in ensemble downscaling experiments are investigated.
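A much-simplified stand-in for the probabilistic product: fitting a normal distribution to a 21-member ensemble of projected temperature changes and reading off quantiles; the study's regression-based estimation is not reproduced here, and the numbers are invented.

```python
import numpy as np
from scipy import stats

# Hypothetical projected 2 m temperature changes (K) for one grid cell from a
# 21-member multi-model ensemble (values invented for illustration).
rng = np.random.default_rng(5)
delta_t = rng.normal(2.8, 0.7, size=21)

# Simplest probabilistic summary: fit a normal distribution to the ensemble.
mu, sigma = delta_t.mean(), delta_t.std(ddof=1)
dist = stats.norm(mu, sigma)

print(f"median warming: {dist.ppf(0.5):.2f} K")
print(f"5-95% range   : {dist.ppf(0.05):.2f} to {dist.ppf(0.95):.2f} K")
print(f"P(warming > 3 K) = {1 - dist.cdf(3.0):.2f}")
```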
Fuller, Robert William; Wong, Tony E; Keller, Klaus
2017-01-01
The response of the Antarctic ice sheet (AIS) to changing global temperatures is a key component of sea-level projections. Current projections of the AIS contribution to sea-level changes are deeply uncertain. This deep uncertainty stems, in part, from (i) the inability of current models to fully resolve key processes and scales, (ii) the relatively sparse available data, and (iii) divergent expert assessments. One promising approach to characterizing the deep uncertainty stemming from divergent expert assessments is to combine expert assessments, observations, and simple models by coupling probabilistic inversion and Bayesian inversion. Here, we present a proof-of-concept study that uses probabilistic inversion to fuse a simple AIS model and diverse expert assessments. We demonstrate the ability of probabilistic inversion to infer joint prior probability distributions of model parameters that are consistent with expert assessments. We then confront these inferred expert priors with instrumental and paleoclimatic observational data in a Bayesian inversion. These additional constraints yield tighter hindcasts and projections. We use this approach to quantify how the deep uncertainty surrounding expert assessments affects the joint probability distributions of model parameters and future projections.
Domurat, Artur; Kowalczuk, Olga; Idzikowska, Katarzyna; Borzymowska, Zuzanna; Nowak-Przygodzka, Marta
2015-01-01
This paper has two aims. First, we investigate how often people make choices conforming to Bayes' rule when natural sampling is applied. Second, we show that using Bayes' rule is not necessary to make choices satisfying Bayes' rule. Simpler methods, even fallacious heuristics, might prescribe correct choices reasonably often under specific circumstances. We considered elementary situations with binary sets of hypotheses and data. We adopted an ecological approach and prepared two-stage computer tasks resembling natural sampling. Probabilistic relations were inferred from a set of pictures, followed by a choice which was made to maximize the chance of a preferred outcome. Use of Bayes' rule was deduced indirectly from choices. Study 1 used a stratified sample of N = 60 participants equally distributed with regard to gender and type of education (humanities vs. pure sciences). Choices satisfying Bayes' rule were dominant. To investigate ways of making choices more directly, we replicated Study 1, adding a task with a verbal report. In Study 2 (N = 76) choices conforming to Bayes' rule dominated again. However, the verbal reports revealed use of a new, non-inverse rule, which always renders correct choices, but is easier than Bayes' rule to apply. It does not require inversion of conditions [transforming P(H) and P(D|H) into P(H|D)] when computing chances. Study 3 examined the efficiency of three fallacious heuristics (pre-Bayesian, representativeness, and evidence-only) in producing choices concordant with Bayes' rule. Computer-simulated scenarios revealed that the heuristics produced correct choices reasonably often under specific base rates and likelihood ratios. Summing up we conclude that natural sampling results in most choices conforming to Bayes' rule. However, people tend to replace Bayes' rule with simpler methods, and even use of fallacious heuristics may be satisfactorily efficient.
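One plausible reading of the non-inverse rule is to compare the joint chances P(H)P(D|H) and P(not-H)P(D|not-H) directly, which always yields the same choice as the full Bayesian posterior without inverting conditions; the sketch below, with invented numbers, illustrates this equivalence.

```python
def bayes_posterior(p_h, p_d_given_h, p_d_given_not_h):
    """Full Bayes' rule: P(H|D)."""
    joint_h = p_h * p_d_given_h
    joint_not_h = (1 - p_h) * p_d_given_not_h
    return joint_h / (joint_h + joint_not_h)

def non_inverse_choice(p_h, p_d_given_h, p_d_given_not_h):
    """Pick the hypothesis with the larger joint chance P(H)*P(D|H);
    no inversion of conditions is needed, yet the choice matches Bayes."""
    return "H" if p_h * p_d_given_h > (1 - p_h) * p_d_given_not_h else "not-H"

# Example: prior P(H)=0.3, P(D|H)=0.9, P(D|not-H)=0.2.
p = bayes_posterior(0.3, 0.9, 0.2)
print(f"P(H|D) = {p:.2f}  ->  choose {'H' if p > 0.5 else 'not-H'}")
print("non-inverse rule   ->  choose", non_inverse_choice(0.3, 0.9, 0.2))
```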
a Probabilistic Embedding Clustering Method for Urban Structure Detection
NASA Astrophysics Data System (ADS)
Lin, X.; Li, H.; Zhang, Y.; Gao, L.; Zhao, L.; Deng, M.
2017-09-01
Urban structure detection is a basic task in urban geography. Clustering is a core technology for detecting patterns of urban spatial structure, urban functional regions, and so on. In the big data era, diverse urban sensing datasets recording information such as human behaviour and human social activity suffer from high dimensionality and high noise. Unfortunately, state-of-the-art clustering methods do not handle the problems of high dimensionality and high noise concurrently. In this paper, a probabilistic embedding clustering method is proposed. Firstly, we introduce a Probabilistic Embedding Model (PEM) to find latent features in high-dimensional urban sensing data by "learning" via a probabilistic model. The latent features capture essential patterns hidden in the high-dimensional data, and the probabilistic model also reduces the uncertainty caused by high noise. Secondly, by tuning the parameters, our model can discover two kinds of urban structure, homophily and structural equivalence, corresponding to communities with intensive interaction or communities that play the same roles in the urban structure. We evaluated the performance of our model by conducting experiments on real-world data, and experiments with real data from Shanghai (China) showed that our method can discover both kinds of urban structure.
Composite load spectra for select space propulsion structural components
NASA Technical Reports Server (NTRS)
Newell, J. F.; Ho, H. W.; Kurth, R. E.
1991-01-01
The work performed to develop composite load spectra (CLS) for the Space Shuttle Main Engine (SSME) using probabilistic methods is described. Three methods were implemented for the engine system influence model; RASCAL was chosen as the principal method, as most component load models were implemented with it. Validation of RASCAL was performed. High accuracy, comparable to the Monte Carlo method, can be obtained if a large enough bin size is used. Generic probabilistic models were developed and implemented for load calculations using the probabilistic methods discussed above. Each engine mission, either a real flight or a test, has three mission phases: the engine start transient phase, the steady state phase, and the engine cutoff transient phase. Power level and engine operating inlet conditions change during a mission. The load calculation module provides the steady-state and quasi-steady-state calculation procedures with a duty-cycle-data option. The quasi-steady-state procedure is for engine transient phase calculations. In addition, a few generic probabilistic load models were also developed for specific conditions. These include the fixed transient spike model, the Poisson-arrival transient spike model, and the rare event model. These generic probabilistic load models provide sufficient latitude for simulating loads with specific conditions. For SSME components, turbine blades, transfer ducts, the LOX post, and the high pressure oxidizer turbopump (HPOTP) discharge duct were selected for application of the CLS program. The loads include static pressure loads and dynamic pressure loads for all four components, centrifugal force for the turbine blade, temperatures for thermal loads for all four components, and structural vibration loads for the ducts and LOX posts.
REWARD/PUNISHMENT REVERSAL LEARNING IN OLDER SUICIDE ATTEMPTERS
Dombrovski, Alexandre Y.; Clark, Luke; Siegle, Greg J.; Butters, Meryl A.; Ichikawa, Naho; Sahakian, Barbara; Szanto, Katalin
2011-01-01
Objective Suicide rates are very high in old age, and the contribution of cognitive risk factors remains poorly understood. Suicide may be viewed as an outcome of an altered decision process. We hypothesized that impairment in a component of affective decision-making – reward/punishment-based learning – is associated with attempted suicide in late-life depression. We expected that suicide attempters would discount past reward/punishment history, focusing excessively on the most recent rewards and punishments. Further, we hypothesized that this impairment could be dissociated from executive abilities such as forward planning. Method We assessed reward/punishment-based learning using the Probabilistic Reversal Learning task in 65 individuals aged 60 and older: suicide attempters, suicide ideators, non-suicidal depressed elderly, and non-depressed controls. We used a reinforcement learning computational model to decompose reward/punishment processing over time. The Stockings of Cambridge test served as a control measure of executive function. Results Suicide attempters but not suicide ideators showed impaired probabilistic reversal learning compared to both non-suicidal depressed elderly and to non-depressed controls, after controlling for effects of education, global cognitive function, and substance use. Model-based analyses revealed that suicide attempters discounted previous history to a higher degree, compared to controls, basing their choice largely on reward/punishment received on the last trial. Groups did not differ in their performance on the Stockings of Cambridge. Conclusions Older suicide attempters display impaired reward/punishment-based learning. We propose a hypothesis that older suicide attempters make overly present-focused decisions, ignoring past experiences. Modification of this ‘myopia for the past’ may have therapeutic potential. PMID:20231320
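A generic delta-rule (Rescorla-Wagner) value update illustrates how a high learning rate produces exactly the reported pattern, with choice driven almost entirely by the last outcome; this is a schematic illustration, not the authors' fitted computational model.

```python
import numpy as np

def value_trajectory(outcomes, alpha):
    """Delta-rule (Rescorla-Wagner) value update: V <- V + alpha * (r - V)."""
    v = 0.0
    for r in outcomes:
        v += alpha * (r - v)
    return v

outcomes = np.array([1, 1, 1, 1, 0, 1, 1, 0])   # reward history for one option

for alpha in (0.2, 0.9):
    # The weight given to an outcome k trials back is alpha * (1 - alpha)**k,
    # so a high learning rate discounts everything but the most recent trial.
    weights = alpha * (1 - alpha) ** np.arange(len(outcomes))[::-1]
    print(f"alpha={alpha}: value={value_trajectory(outcomes, alpha):.2f}, "
          f"weight on last trial={weights[-1]:.2f}")
```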
Fusion set selection with surrogate metric in multi-atlas based image segmentation
NASA Astrophysics Data System (ADS)
Zhao, Tingting; Ruan, Dan
2016-02-01
Multi-atlas based image segmentation sees unprecedented opportunities but also demanding challenges in the big data era. Relevant atlas selection before label fusion plays a crucial role in reducing potential performance loss from heterogeneous data quality and high computation cost from extensive data. This paper starts with investigating the image similarity metric (termed ‘surrogate’), an alternative to the inaccessible geometric agreement metric (termed ‘oracle’) in atlas relevance assessment, and probes into the problem of how to select the ‘most-relevant’ atlases and how many such atlases to incorporate. We propose an inference model to relate the surrogates and the oracle geometric agreement metrics. Based on this model, we quantify the behavior of the surrogates in mimicking oracle metrics for atlas relevance ordering. Finally, analytical insights on the choice of fusion set size are presented from a probabilistic perspective, with the integrated goal of including the most relevant atlases and excluding the irrelevant ones. Empirical evidence and performance assessment are provided based on prostate and corpus callosum segmentation.
Asano, Masanari; Khrennikov, Andrei; Ohya, Masanori; Tanaka, Yoshiharu; Yamato, Ichiro
2016-05-28
We compare the contextual probabilistic structures of the seminal two-slit experiment (quantum interference experiment), the system of three interacting bodies and Escherichia coli lactose-glucose metabolism. We show that they have the same non-Kolmogorov probabilistic structure resulting from multi-contextuality. There are plenty of statistical data with non-Kolmogorov features; in particular, the probabilistic behaviour of neither quantum nor biological systems can be described classically. Biological systems (even cells and proteins) are macroscopic systems and one may try to present a more detailed model of interactions in such systems that lead to quantum-like probabilistic behaviour. The system of interactions between three bodies is one of the simplest metaphoric examples for such interactions. By proceeding further in this way (by playing with n-body systems) we shall be able to find metaphoric mechanical models for complex bio-interactions, e.g. signalling between cells, leading to non-Kolmogorov probabilistic data. © 2016 The Author(s).
Scalable DB+IR Technology: Processing Probabilistic Datalog with HySpirit.
Frommholz, Ingo; Roelleke, Thomas
2016-01-01
Probabilistic Datalog (PDatalog, proposed in 1995) is a probabilistic variant of Datalog and a nice conceptual idea to model Information Retrieval in a logical, rule-based programming paradigm. Making PDatalog work in real-world applications requires more than probabilistic facts and rules, and the semantics associated with the evaluation of the programs. We report in this paper some of the key features of the HySpirit system required to scale the execution of PDatalog programs. Firstly, there is the requirement to express probability estimation in PDatalog. Secondly, fuzzy-like predicates are required to model vague predicates (e.g. vague match of attributes such as age or price). Thirdly, to handle large data sets there are scalability issues to be addressed, and therefore, HySpirit provides probabilistic relational indexes and parallel and distributed processing. The main contribution of this paper is a consolidated view on the methods of the HySpirit system to make PDatalog applicable in real-scale applications that involve a wide range of requirements typical for data (information) management and analysis.
Sukumaran, Jeet; Knowles, L Lacey
2018-06-01
The development of process-based probabilistic models for historical biogeography has transformed the field by grounding it in modern statistical hypothesis testing. However, most of these models abstract away biological differences, reducing species to interchangeable lineages. We present here the case for reintegrating biology into probabilistic historical biogeographical models. This allows a broader range of questions about biogeographical processes to be addressed, beyond ancestral range estimation or simple correlation between a trait and a distribution pattern, and also allows us to assess how inferences about ancestral ranges themselves might be affected by differential biological traits. We show how new approaches to inference might cope with the computational challenges resulting from the increased complexity of these trait-based historical biogeographical models. Copyright © 2018 Elsevier Ltd. All rights reserved.
Two deterministic models (US EPA’s Office of Pesticide Programs Residential Standard Operating Procedures (OPP Residential SOPs) and Draft Protocol for Measuring Children’s Non-Occupational Exposure to Pesticides by all Relevant Pathways (Draft Protocol)) and four probabilistic mo...
EXPERIENCES WITH USING PROBABILISTIC EXPOSURE ANALYSIS METHODS IN THE U.S. EPA
Over the past decade various Offices and Programs within the U.S. EPA have either initiated or increased the development and application of probabilistic exposure analysis models. These models have been applied to a broad range of research or regulatory problems in EPA, such as e...
Structural reliability assessment capability in NESSUS
NASA Technical Reports Server (NTRS)
Millwater, H.; Wu, Y.-T.
1992-01-01
The principal capabilities of NESSUS (Numerical Evaluation of Stochastic Structures Under Stress), an advanced computer code developed for probabilistic structural response analysis, are reviewed, and its structural reliability assessed. The code combines flexible structural modeling tools with advanced probabilistic algorithms in order to compute probabilistic structural response and resistance, component reliability and risk, and system reliability and risk. An illustrative numerical example is presented.
NASA Astrophysics Data System (ADS)
Song, Lu-Kai; Wen, Jie; Fei, Cheng-Wei; Bai, Guang-Chen
2018-05-01
To improve the computing efficiency and precision of probabilistic design for multi-failure structures, a distributed collaborative probabilistic design method based on fuzzy neural network regression (called DCFRM) is proposed by integrating the distributed collaborative response surface method with a fuzzy neural network regression model. The mathematical model of DCFRM is established and the probabilistic design approach based on DCFRM is introduced. The probabilistic analysis of a turbine blisk involving multiple failure modes (deformation failure, stress failure and strain failure) was investigated with the proposed method, taking fluid-structure interaction into account. The distribution characteristics, reliability degree, and sensitivity degree of each failure mode and of the overall failure mode of the turbine blisk are obtained, which provides a useful reference for improving the performance and reliability of aeroengines. A comparison of methods shows that DCFRM improves the computing efficiency of probabilistic analysis for multi-failure structures while keeping acceptable computational precision. Moreover, the proposed method offers useful insight for reliability-based design optimization of multi-failure structures and thereby also enriches the theory and methods of mechanical reliability design.
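As a rough illustration of the surrogate-based workflow (fit a cheap regression model to a handful of expensive evaluations, then run Monte Carlo through the surrogate), the sketch below uses an ordinary least-squares quadratic response surface in place of the fuzzy neural network regression, and an invented limit state; none of the functions, distributions, or thresholds come from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def expensive_model(x):
    # Placeholder for the costly multi-physics analysis (e.g. a blisk deformation measure).
    return 1.0 + 0.8 * x[:, 0] + 0.5 * x[:, 1] + 0.3 * x[:, 0] * x[:, 1]

# 1. Small design of experiments on two standardized random inputs.
X = rng.normal(size=(60, 2))
y = expensive_model(X)

# 2. Fit a quadratic response surface by least squares (stand-in for the FR surrogate).
def features(x):
    return np.column_stack([np.ones(len(x)), x[:, 0], x[:, 1],
                            x[:, 0] ** 2, x[:, 1] ** 2, x[:, 0] * x[:, 1]])

coef, *_ = np.linalg.lstsq(features(X), y, rcond=None)

# 3. Cheap Monte Carlo through the surrogate to estimate a failure probability.
X_mc = rng.normal(size=(200_000, 2))
y_hat = features(X_mc) @ coef
threshold = 3.0                                  # illustrative allowable response
print(f"estimated failure probability: {np.mean(y_hat > threshold):.4f}")
```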
A novel probabilistic framework for event-based speech recognition
NASA Astrophysics Data System (ADS)
Juneja, Amit; Espy-Wilson, Carol
2003-10-01
One of the reasons for unsatisfactory performance of the state-of-the-art automatic speech recognition (ASR) systems is the inferior acoustic modeling of low-level acoustic-phonetic information in the speech signal. An acoustic-phonetic approach to ASR, on the other hand, explicitly targets linguistic information in the speech signal, but such a system for continuous speech recognition (CSR) is not known to exist. A probabilistic and statistical framework for CSR based on the idea of the representation of speech sounds by bundles of binary-valued articulatory phonetic features is proposed. Multiple probabilistic sequences of linguistically motivated landmarks are obtained using binary classifiers of manner phonetic features (syllabic, sonorant and continuant) and the knowledge-based acoustic parameters (APs) that are acoustic correlates of those features. The landmarks are then used for the extraction of knowledge-based APs for source and place phonetic features and their binary classification. Probabilistic landmark sequences are constrained using manner class language models for isolated or connected word recognition. The proposed method could overcome the disadvantages encountered by the early acoustic-phonetic knowledge-based systems that led the ASR community to switch to systems highly dependent on statistical pattern analysis methods and probabilistic language or grammar models.
Probabilistic Seismic Risk Model for Western Balkans
NASA Astrophysics Data System (ADS)
Stejskal, Vladimir; Lorenzo, Francisco; Pousse, Guillaume; Radovanovic, Slavica; Pekevski, Lazo; Dojcinovski, Dragi; Lokin, Petar; Petronijevic, Mira; Sipka, Vesna
2010-05-01
A probabilistic seismic risk model for insurance and reinsurance purposes is presented for an area of the Western Balkans, covering former Yugoslavia and Albania. This territory experienced many severe earthquakes during past centuries, producing significant damage to many population centres in the region. The highest hazard is related to the external Dinarides, namely the collision zone of the Adriatic plate. The model is based on a unified catalogue for the region and a seismic source model consisting of more than 30 zones covering all three main structural units - Southern Alps, Dinarides and the south-western margin of the Pannonian Basin. A probabilistic methodology using Monte Carlo simulation was applied to generate the hazard component of the model. A unique set of damage functions, based on both loss experience and engineering assessments, is used to convert the modelled ground motion severity into monetary loss.
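The Monte Carlo hazard step mentioned above can be caricatured for a single toy source: sample yearly event counts, magnitudes from a truncated Gutenberg-Richter distribution, and ground motions from a simple attenuation relation with lognormal scatter, then count exceedances of a target level. All parameter values and the attenuation form below are invented for illustration and are not the model's.

```python
import numpy as np

rng = np.random.default_rng(2)
b, m_min, m_max = 1.0, 4.5, 7.0       # illustrative Gutenberg-Richter parameters
annual_rate = 0.2                      # mean number of events per year above m_min
site_distance_km = 30.0
pga_threshold = 0.2                    # g, illustrative target ground-motion level
n_years = 50_000                       # number of simulated one-year catalogues

def sample_magnitudes(n):
    """Inverse-CDF sampling from a truncated Gutenberg-Richter distribution."""
    u = rng.random(n)
    c = 1.0 - 10.0 ** (-b * (m_max - m_min))
    return m_min - np.log10(1.0 - u * c) / b

def ground_motion(m, r_km):
    """Toy attenuation relation with lognormal scatter (not a published GMPE)."""
    ln_pga = -3.5 + 0.9 * m - 1.1 * np.log(r_km + 10.0)
    return np.exp(ln_pga + rng.normal(scale=0.6, size=np.size(m)))

exceed = 0
for _ in range(n_years):
    n_events = rng.poisson(annual_rate)
    if n_events and np.any(ground_motion(sample_magnitudes(n_events), site_distance_km) > pga_threshold):
        exceed += 1
print(f"annual probability of exceeding {pga_threshold} g: {exceed / n_years:.4f}")
```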
Probabilistic risk models for multiple disturbances: an example of forest insects and wildfires
Haiganoush K. Preisler; Alan A. Ager; Jane L. Hayes
2010-01-01
Building probabilistic risk models for highly random forest disturbances like wildfire and forest insect outbreaks is challenging. Modeling the interactions among natural disturbances is even more difficult. In the case of wildfire and forest insects, we looked at the probability of a large fire given an insect outbreak and also the incidence of insect outbreaks...
Probabilistic Structural Analysis Methods (PSAM) for Select Space Propulsion System Components
NASA Technical Reports Server (NTRS)
1999-01-01
Probabilistic Structural Analysis Methods (PSAM) are described for the probabilistic structural analysis of engine components for current and future space propulsion systems. Components for these systems are subjected to stochastic thermomechanical launch loads. Uncertainties or randomness also occur in material properties, structural geometry, and boundary conditions. Material property stochasticity, such as in modulus of elasticity or yield strength, exists in every structure and is a consequence of variations in material composition and manufacturing processes. Procedures are outlined for computing the probabilistic structural response or reliability of the structural components. The response variables include static or dynamic deflections, strains, and stresses at one or several locations, natural frequencies, fatigue or creep life, etc. Sample cases illustrate how the PSAM methods and codes simulate input uncertainties and compute probabilistic response or reliability using a finite element model with probabilistic methods.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Yun, E-mail: genliyun@126.com, E-mail: cuiwanzhao@126.com; Cui, Wan-Zhao, E-mail: genliyun@126.com, E-mail: cuiwanzhao@126.com; Wang, Hong-Guang
2015-05-15
Effects of the secondary electron emission (SEE) phenomenon of metal surfaces on the multipactor analysis of microwave components are investigated numerically and experimentally in this paper. Both the secondary electron yield (SEY) and the emitted energy spectrum measurements are performed on silver-plated samples for accurate description of the SEE phenomenon. A phenomenological probabilistic model based on SEE physics is utilized and fitted accurately to the measured SEY and emitted energy spectrum of the conditioned surface material of microwave components. Specifically, the phenomenological probabilistic model is extended mathematically to the low primary energy end (below 20 eV), since no accurate measurement data can be obtained there. By embedding the phenomenological probabilistic model into the Electromagnetic Particle-In-Cell (EM-PIC) method, the electron resonant multipacting in microwave components can be tracked and hence the multipactor threshold can be predicted. The threshold prediction error of the transformer and the coaxial filter is 0.12 dB and 1.5 dB, respectively. Simulation results demonstrate that the discharge threshold is strongly dependent on the SEY and its energy spectrum at the low-energy end (lower than 50 eV). Multipacting simulation results agree quite well with experiments in practical components, while the phenomenological probabilistic model fits both the SEY and the emission energy spectrum better than the traditionally used models and distributions. The EM-PIC simulation method with the phenomenological probabilistic model for the surface collision simulation has been demonstrated for predicting the multipactor threshold in metal components for space application.
Inherent limitations of probabilistic models for protein-DNA binding specificity
Ruan, Shuxiang
2017-01-01
The specificities of transcription factors are most commonly represented with probabilistic models. These models provide a probability for each base occurring at each position within the binding site, and the positions are assumed to contribute independently. The model is simple and intuitive and is the basis for many motif discovery algorithms. However, the model also has inherent limitations that prevent it from accurately representing true binding probabilities, especially for the highest-affinity sites under conditions of high protein concentration. The limitations are not due to the assumption of independence between positions but rather are caused by the non-linear relationship between binding affinity and binding probability and the fact that independent normalization at each position skews the site probabilities. Generally, probabilistic models are reasonably good approximations, but new high-throughput methods allow for biophysical models with increased accuracy that should be used whenever possible. PMID:28686588
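A minimal sketch of the kind of probabilistic model discussed above: a position weight matrix assigns independent per-position base probabilities, and an additive log-odds score follows; a logistic (Boltzmann-like) link then illustrates the non-linear relation between score and binding probability at a given protein concentration. The matrix, the chemical-potential parameter mu, and the example sites are invented for illustration.

```python
import numpy as np

# Per-position base probabilities (rows: positions, columns: A, C, G, T); invented values.
pwm = np.array([[0.7, 0.1, 0.1, 0.1],    # position 1 favours A
                [0.1, 0.1, 0.7, 0.1],    # position 2 favours G
                [0.1, 0.7, 0.1, 0.1],    # position 3 favours C
                [0.1, 0.1, 0.1, 0.7]])   # position 4 favours T
background = np.array([0.25, 0.25, 0.25, 0.25])
base_index = {b: i for i, b in enumerate("ACGT")}

def log_odds_score(site):
    """Standard additive PWM score: sum of per-position log-odds against background."""
    return float(sum(np.log2(pwm[p, base_index[b]] / background[base_index[b]])
                     for p, b in enumerate(site)))

def binding_probability(site, mu=2.0):
    """Logistic (Boltzmann-like) link: occupancy saturates for the best sites,
    so ratios of site probabilities do not translate into ratios of binding."""
    energy = -log_odds_score(site)       # treat the negative score as an energy
    return 1.0 / (1.0 + np.exp(energy - mu))

for site in ["AGCT", "AGCA", "TTTT"]:
    print(site, round(log_odds_score(site), 2), round(binding_probability(site), 3))
```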
Bramley, Neil R; Lagnado, David A; Speekenbrink, Maarten
2015-05-01
Interacting with a system is key to uncovering its causal structure. A computational framework for interventional causal learning has been developed over the last decade, but how real causal learners might achieve or approximate the computations entailed by this framework is still poorly understood. Here we describe an interactive computer task in which participants were incentivized to learn the structure of probabilistic causal systems through free selection of multiple interventions. We develop models of participants' intervention choices and online structure judgments, using expected utility gain, probability gain, and information gain, and introducing plausible memory and processing constraints. We find that successful participants are best described by a model that acts to maximize information (rather than expected score or probability of being correct); that forgets much of the evidence received in earlier trials; but that mitigates this by being conservative, preferring structures consistent with earlier stated beliefs. We explore 2 heuristics that partly explain how participants might be approximating these models without explicitly representing or updating a hypothesis space. (c) 2015 APA, all rights reserved.
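To make the "maximize information" idea concrete, the toy sketch below scores two hypothetical interventions on a two-hypothesis causal-learning problem by their expected information gain (expected reduction in entropy over hypotheses). The hypotheses, likelihoods, and prior are made up; the paper's models of intervention choice are richer and include memory and processing constraints.

```python
import numpy as np

def entropy(p):
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

# Two hypotheses about a two-variable system: H0 "A causes B", H1 "B causes A".
prior = np.array([0.7, 0.3])

# P(effect observed | hypothesis) for each candidate intervention; invented numbers.
likelihoods = {
    "do(A=1), watch B": np.array([0.9, 0.2]),   # diagnostic mainly if A -> B
    "do(B=1), watch A": np.array([0.4, 0.8]),   # weaker, noisier test of B -> A
}

def expected_information_gain(lik, prior):
    """H(prior) minus the expected posterior entropy over the two possible outcomes."""
    gain = 0.0
    for outcome_lik in (lik, 1.0 - lik):        # effect present / effect absent
        p_outcome = float(outcome_lik @ prior)
        posterior = outcome_lik * prior / p_outcome
        gain += p_outcome * (entropy(prior) - entropy(posterior))
    return gain

for name, lik in likelihoods.items():
    print(f"{name}: expected information gain = {expected_information_gain(lik, prior):.3f} bits")
```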
Resolving the paradox of suboptimal choice.
Zentall, Thomas R
2016-01-01
When humans engage in commercial (totally probabilistic) gambling, they are making suboptimal choices because the return is generally less than the investment. This review (a) examines the literature on pigeon suboptimal choice, (b) describes the conditions under which it occurs, (c) identifies the mechanisms that appear to be responsible for the effect, and (d) suggests that similar processes may be able to account for analogous suboptimal choice when humans engage in commercial gambling. Pigeons show suboptimal choice when they choose between 1 alternative that 20% of the time provides them with a signal that they will always get fed or 80% of the time with a signal that they will not get fed (overall 20% reinforcement) and a second alternative that 100% of the time provides them with a signal that they will get fed 50% of the time (overall 50% reinforcement). The pigeons' strong preference for the suboptimal choice was investigated in a series of experiments that found the preference for the suboptimal alternative was determined by the value of the signal that predicted reinforcement, rather than its frequency, and that the frequency of the signal that predicted nonreinforcement had little effect on the suboptimal choice. Paradoxically, this account makes the prediction that pigeons will be indifferent between an alternative that 50% of the time provides a fully predictive stimulus for reinforcement and an alternative that 100% of the time provides a fully predictive stimulus for reinforcement. The similarities and differences of this suboptimal choice task to human gambling are discussed. (c) 2016 APA, all rights reserved.
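A few lines of arithmetic make the "suboptimal" label explicit, using the programmed contingencies described in the abstract: the signalled 20% alternative returns less food per choice than the 50% alternative.

```python
# Programmed expected payoff per choice (one reward on each reinforced trial),
# using the contingencies described in the abstract above.
p_food_suboptimal = 0.20 * 1.00 + 0.80 * 0.00   # 20%: signal -> always fed; 80%: signal -> never fed
p_food_optimal = 1.00 * 0.50                    # single signal, fed on 50% of trials
print(f"suboptimal alternative: {p_food_suboptimal:.2f} rewards per choice")
print(f"optimal alternative:    {p_food_optimal:.2f} rewards per choice")
```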
NASA Astrophysics Data System (ADS)
Panzera, Francesco; Lombardo, Giuseppe; Rigano, Rosaria
2010-05-01
The seismic hazard assessment (SHA) can be performed using either Deterministic or Probabilistic approaches. In the present study a probabilistic analysis was carried out for the towns of Catania and Siracusa using two different procedures: the 'site' (Albarello and Mucciarelli, 2002) and the 'seismotectonic' (Cornell, 1968; Esteva, 1967) methodologies. The SASHA code (D'Amico and Albarello, 2007) was used to calculate seismic hazard through the 'site' approach, whereas the CRISIS2007 code (Ordaz et al., 2007) was adopted in the Esteva-Cornell procedure. According to current international conventions for PSHA (SSHAC, 1997), a logic tree approach was followed to consider and reduce the epistemic uncertainties, for both seismotectonic and site methods. The SASHA code handles the intensity data, taking into account the macroseismic information of past earthquakes. The CRISIS2007 code needs, as input elements, a seismic catalogue tested for completeness, a seismogenic zonation and ground motion prediction equations. Data concerning the characterization of regional seismic sources and ground motion attenuation properties were taken from the literature. Special care was devoted to defining source zone models, taking into account the most recent studies on regional seismotectonic features and, in particular, the possibility of considering the Malta escarpment as a potential source. The combined use of the above mentioned approaches allowed us to obtain useful elements to define the site seismic hazard in Catania and Siracusa. The results point out that the choice of the probabilistic model plays a fundamental role. It is indeed observed that when the site intensity data are used, the town of Catania shows hazard values higher than the ones found for Siracusa, for each considered return period. On the contrary, when the Esteva-Cornell method is used, the Siracusa urban area shows higher hazard than Catania for return periods greater than one hundred years. The higher hazard observed through the site approach for the Catania area can be interpreted in terms of the greater damage historically observed in this town and its smaller distance from the seismogenic structures. On the other hand, the higher level of hazard found for Siracusa through the Esteva-Cornell approach could be a consequence of the features of such a method, which spreads out the intensities over a wide area. However, in SHA the use of a combined approach is recommended for a mutual validation of the obtained results, and any choice between the two approaches is strictly linked to the knowledge of the local seismotectonic features. References Albarello D. and Mucciarelli M.; 2002: Seismic hazard estimates using ill-defined macroseismic data at site. Pure Appl. Geophys., 159, 1289-1304. Cornell C.A.; 1968: Engineering seismic risk analysis. Bull. Seism. Soc. Am., 58(5), 1583-1606. D'Amico V. and Albarello D.; 2007: Codice per il calcolo della pericolosità sismica da dati di sito (freeware). Progetto DPC-INGV S1, http://esse1.mi.ingv.it/d12.html Esteva L.; 1967: Criterios para la construcción de espectros para diseño sísmico. Proceedings of XII Jornadas Sudamericanas de Ingeniería Estructural y III Simposio Panamericano de Estructuras, Caracas, 1967. Published later in Boletín del Instituto de Materiales y Modelos Estructurales, Universidad Central de Venezuela, No. 19. Ordaz M., Aguilar A. and Arboleda J.; 2007: CRISIS2007, Program for computing seismic hazard. Version 5.4, Mexico City: UNAM.
SSHAC (Senior Seismic Hazard Analysis Committee); 1997: Recommendations for probabilistic seismic hazard analysis: guidance on uncertainty and use of experts. NUREG/CR-6372.
Probabilistic Learning by Rodent Grid Cells
Cheung, Allen
2016-01-01
Mounting evidence shows mammalian brains are probabilistic computers, but the specific cells involved remain elusive. Parallel research suggests that grid cells of the mammalian hippocampal formation are fundamental to spatial cognition but their diverse response properties still defy explanation. No plausible model exists that explains stable grids in darkness for twenty minutes or longer, despite this being one of the first results ever published on grid cells. Similarly, no current explanation can tie together grid fragmentation and grid rescaling, which show very different forms of flexibility in grid responses when the environment is varied. Other properties such as attractor dynamics and grid anisotropy seem to be at odds with one another unless additional properties are assumed such as a varying velocity gain. Modelling efforts have largely ignored the breadth of response patterns, while also failing to account for the disastrous effects of sensory noise during spatial learning and recall, especially in darkness. Here, published electrophysiological evidence from a range of experiments is reinterpreted using a novel probabilistic learning model, which shows that grid cell responses are accurately predicted by a probabilistic learning process. Diverse response properties of probabilistic grid cells are statistically indistinguishable from rat grid cells across key manipulations. A simple coherent set of probabilistic computations explains stable grid fields in darkness, partial grid rescaling in resized arenas, low-dimensional attractor grid cell dynamics, and grid fragmentation in hairpin mazes. The same computations also reconcile oscillatory dynamics at the single cell level with attractor dynamics at the cell ensemble level. Additionally, a clear functional role for boundary cells is proposed for spatial learning. These findings provide a parsimonious and unified explanation of grid cell function, and implicate grid cells as an accessible neuronal population readout of a set of probabilistic spatial computations. PMID:27792723
A Probabilistic Model for Diagnosing Misconceptions by a Pattern Classification Approach.
ERIC Educational Resources Information Center
Tatsuoka, Kikumi K.
A probabilistic approach is introduced to classify and diagnose erroneous rules of operation resulting from a variety of misconceptions ("bugs") in a procedural domain of arithmetic. The model is contrasted with the deterministic approach which has commonly been used in the field of artificial intelligence, and the advantage of treating the…
Nonlinear probabilistic finite element models of laminated composite shells
NASA Technical Reports Server (NTRS)
Engelstad, S. P.; Reddy, J. N.
1993-01-01
A probabilistic finite element analysis procedure for laminated composite shells has been developed. A total Lagrangian finite element formulation, employing a degenerated 3-D laminated composite shell with the full Green-Lagrange strains and first-order shear deformable kinematics, forms the modeling foundation. The first-order second-moment technique for probabilistic finite element analysis of random fields is employed and results are presented in the form of mean and variance of the structural response. The effects of material nonlinearity are included through the use of a rate-independent anisotropic plasticity formulation from a macroscopic point of view. Both ply-level and micromechanics-level random variables can be selected, the latter by means of the Aboudi micromechanics model. A number of sample problems are solved to verify the accuracy of the procedures developed and to quantify the variability of certain material type/structure combinations. Experimental data are compared in many cases, and the Monte Carlo simulation method is used to check the probabilistic results. In general, the procedure is quite effective in modeling the mean and variance response of the linear and nonlinear behavior of laminated composite shells.
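The first-order second-moment (FOSM) idea used above can be sketched generically: linearize a response function about the mean of the inputs, propagate variances through the gradient, and compare against a Monte Carlo check. The response function and the input statistics below are placeholders, not the shell finite element model.

```python
import numpy as np

def response(x):
    """Placeholder structural response, e.g. a deflection as a function of
    (modulus, thickness); stands in for the finite element model."""
    modulus, thickness = x
    return 1.0e3 / (modulus * thickness ** 3)

mean = np.array([10.0, 0.5])     # mean values of the random inputs (assumed independent)
std = np.array([1.0, 0.05])      # their standard deviations

def fosm(f, mean, std, h=1e-6):
    """First-order second-moment estimate of the response mean and variance."""
    f0 = f(mean)
    grad = np.zeros_like(mean)
    for i in range(len(mean)):
        dx = np.zeros_like(mean)
        dx[i] = h * max(abs(mean[i]), 1.0)
        grad[i] = (f(mean + dx) - f0) / dx[i]
    return f0, float(np.sum((grad * std) ** 2))

m, v = fosm(response, mean, std)
print(f"FOSM mean ~ {m:.1f}, std ~ {np.sqrt(v):.1f}")

# Crude Monte Carlo check, mirroring the paper's use of simulation for verification.
rng = np.random.default_rng(3)
samples = rng.normal(mean, std, size=(200_000, 2))
y = 1.0e3 / (samples[:, 0] * samples[:, 1] ** 3)
print(f"Monte Carlo mean {y.mean():.1f}, std {y.std():.1f}")
```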
Kindermans, Pieter-Jan; Verschore, Hannes; Schrauwen, Benjamin
2013-10-01
In recent years, in an attempt to maximize performance, machine learning approaches for event-related potential (ERP) spelling have become more and more complex. In this paper, we have taken a step back, as we wanted to improve the performance without building an overly complex model that cannot be used by the community. Our research resulted in a unified probabilistic model for ERP spelling, which is based on only three assumptions and incorporates language information. On top of that, the probabilistic nature of our classifier yields a natural dynamic stopping strategy. Furthermore, our method uses the same parameters across 25 subjects from three different datasets. We show that our classifier, when enhanced with language models and dynamic stopping, improves the spelling speed and accuracy drastically. Additionally, we would like to point out that as our model is entirely probabilistic, it can easily be used as the foundation for complex systems in future work. All our experiments are executed on publicly available datasets to allow for future comparison with similar techniques.
Pecevski, Dejan; Buesing, Lars; Maass, Wolfgang
2011-01-01
An important open problem of computational neuroscience is the generic organization of computations in networks of neurons in the brain. We show here through rigorous theoretical analysis that inherent stochastic features of spiking neurons, in combination with simple nonlinear computational operations in specific network motifs and dendritic arbors, enable networks of spiking neurons to carry out probabilistic inference through sampling in general graphical models. In particular, it enables them to carry out probabilistic inference in Bayesian networks with converging arrows (“explaining away”) and with undirected loops, that occur in many real-world tasks. Ubiquitous stochastic features of networks of spiking neurons, such as trial-to-trial variability and spontaneous activity, are necessary ingredients of the underlying computational organization. We demonstrate through computer simulations that this approach can be scaled up to neural emulations of probabilistic inference in fairly large graphical models, yielding some of the most complex computations that have been carried out so far in networks of spiking neurons. PMID:22219717
NASA Astrophysics Data System (ADS)
Bandte, Oliver
It has always been the intention of systems engineering to invent or produce the best product possible. Many design techniques have been introduced over the course of decades that try to fulfill this intention. Unfortunately, no technique has succeeded in combining multi-criteria decision making with probabilistic design. The design technique developed in this thesis, the Joint Probabilistic Decision Making (JPDM) technique, successfully overcomes this deficiency by generating a multivariate probability distribution that serves in conjunction with a criterion value range of interest as a universally applicable objective function for multi-criteria optimization and product selection. This new objective function constitutes a meaningful metric, called Probability of Success (POS), that allows the customer or designer to make a decision based on the chance of satisfying the customer's goals. In order to incorporate a joint probabilistic formulation into the systems design process, two algorithms are created that allow for an easy implementation into a numerical design framework: the (multivariate) Empirical Distribution Function and the Joint Probability Model. The Empirical Distribution Function estimates the probability that an event occurred by counting how many times it occurred in a given sample. The Joint Probability Model, on the other hand, is an analytical parametric model for the multivariate joint probability. It comprises the product of the univariate criterion distributions, generated by the traditional probabilistic design process, multiplied by a correlation function that is based on available correlation information between pairs of random variables. JPDM is an excellent tool for multi-objective optimization and product selection, because of its ability to transform disparate objectives into a single figure of merit, the likelihood of successfully meeting all goals or POS. The advantage of JPDM over other multi-criteria decision making techniques is that POS constitutes a single optimizable function or metric that enables a comparison of all alternative solutions on an equal basis. Hence, POS allows for the use of any standard single-objective optimization technique available and simplifies a complex multi-criteria selection problem into a simple ordering problem, where the solution with the highest POS is best. By distinguishing between controllable and uncontrollable variables in the design process, JPDM can account for the uncertain values of the uncontrollable variables that are inherent to the design problem, while facilitating an easy adjustment of the controllable ones to achieve the highest possible POS. Finally, JPDM's superiority over current multi-criteria decision making techniques is demonstrated with an optimization of a supersonic transport concept and ten contrived equations as well as a product selection example, determining an airline's best choice among Boeing's B-747, B-777, Airbus' A340, and a Supersonic Transport. The optimization examples demonstrate JPDM's ability to produce a better solution with a higher POS than an Overall Evaluation Criterion or Goal Programming approach. Similarly, the product selection example demonstrates JPDM's ability to produce a better solution with a higher POS and different ranking than the Overall Evaluation Criterion or Technique for Order Preferences by Similarity to the Ideal Solution (TOPSIS) approach.
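The Empirical Distribution Function view of the Probability of Success reduces to counting: in a joint Monte Carlo sample of the criteria, POS is the fraction of samples that satisfy all goals simultaneously. The sketch below uses an invented pair of correlated criteria and targets, and also shows how ignoring the correlation (multiplying marginal probabilities) changes the answer, which is the role the correlation function plays in the Joint Probability Model.

```python
import numpy as np

rng = np.random.default_rng(4)

# Joint Monte Carlo sample of two correlated design criteria
# (say, gross weight and approach speed); all numbers are illustrative.
mean = np.array([600.0, 150.0])
cov = np.array([[400.0, 120.0],
                [120.0,  64.0]])     # positive correlation between the criteria
samples = rng.multivariate_normal(mean, cov, size=100_000)

# Customer goals: both criteria must fall below their targets.
targets = np.array([620.0, 155.0])
pos_joint = np.mean(np.all(samples < targets, axis=1))

# Ignoring the correlation (product of marginal probabilities) gives a different answer,
# which is why the Joint Probability Model carries a correlation function.
pos_independent = np.prod([(samples[:, i] < targets[i]).mean() for i in range(2)])
print(f"joint POS = {pos_joint:.3f}, independence assumption = {pos_independent:.3f}")
```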
Chen, Jonathan H; Goldstein, Mary K; Asch, Steven M; Mackey, Lester; Altman, Russ B
2017-05-01
Build probabilistic topic model representations of hospital admissions processes and compare the ability of such models to predict clinical order patterns as compared to preconstructed order sets. The authors evaluated the first 24 hours of structured electronic health record data for > 10 K inpatients. Drawing an analogy between structured items (e.g., clinical orders) and words in a text document, the authors performed latent Dirichlet allocation probabilistic topic modeling. These topic models use initial clinical information to predict clinical orders for a separate validation set of > 4 K patients. The authors evaluated these topic model-based predictions vs existing human-authored order sets by area under the receiver operating characteristic curve, precision, and recall for subsequent clinical orders. Existing order sets predict clinical orders used within 24 hours with area under the receiver operating characteristic curve 0.81, precision 16%, and recall 35%. This can be improved to 0.90, 24%, and 47% (P < 10^-20) by using probabilistic topic models to summarize clinical data into up to 32 topics. Many of these latent topics yield natural clinical interpretations (e.g., "critical care," "pneumonia," "neurologic evaluation"). Existing order sets tend to provide nonspecific, process-oriented aid, with usability limitations impairing more precise, patient-focused support. Algorithmic summarization has the potential to breach this usability barrier by automatically inferring patient context, but with potential tradeoffs in interpretability. Probabilistic topic modeling provides an automated approach to detect thematic trends in patient care and generate decision support content. A potential use case finds related clinical orders for decision support. © The Author 2016. Published by Oxford University Press on behalf of the American Medical Informatics Association.
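The document/word analogy above maps naturally onto off-the-shelf topic-model code. The sketch below, using scikit-learn's LatentDirichletAllocation on a handful of invented "admission documents" of order codes, is only meant to show the mechanics (fit topics, inspect top items per topic, infer a topic mixture for a new admission); the actual study used first-24-hour EHR data, far more admissions, and up to 32 topics.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# Each "document" is one admission's first-day order codes; entirely invented toy data.
admissions = [
    "cbc bmp chest_xray ceftriaxone azithromycin sputum_culture",
    "cbc bmp chest_xray levofloxacin blood_culture",
    "cbc head_ct neuro_checks keppra eeg",
    "cbc head_ct neuro_checks mri_brain",
    "cbc bmp lactate vancomycin zosyn blood_culture arterial_line",
]

vectorizer = CountVectorizer(token_pattern=r"[a-z_]+").fit(admissions)
X = vectorizer.transform(admissions)

lda = LatentDirichletAllocation(n_components=3, random_state=0).fit(X)
vocab = vectorizer.get_feature_names_out()
for k, topic in enumerate(lda.components_):
    top = topic.argsort()[::-1][:4]
    print(f"topic {k}:", ", ".join(vocab[i] for i in top))

# A new admission's topic mixture could then be used to rank likely subsequent orders.
print(lda.transform(vectorizer.transform(["cbc chest_xray azithromycin"])).round(2))
```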
Goldstein, Mary K; Asch, Steven M; Mackey, Lester; Altman, Russ B
2017-01-01
Objective: Build probabilistic topic model representations of hospital admissions processes and compare the ability of such models to predict clinical order patterns as compared to preconstructed order sets. Materials and Methods: The authors evaluated the first 24 hours of structured electronic health record data for > 10 K inpatients. Drawing an analogy between structured items (e.g., clinical orders) and words in a text document, the authors performed latent Dirichlet allocation probabilistic topic modeling. These topic models use initial clinical information to predict clinical orders for a separate validation set of > 4 K patients. The authors evaluated these topic model-based predictions vs existing human-authored order sets by area under the receiver operating characteristic curve, precision, and recall for subsequent clinical orders. Results: Existing order sets predict clinical orders used within 24 hours with area under the receiver operating characteristic curve 0.81, precision 16%, and recall 35%. This can be improved to 0.90, 24%, and 47% (P < 10^-20) by using probabilistic topic models to summarize clinical data into up to 32 topics. Many of these latent topics yield natural clinical interpretations (e.g., “critical care,” “pneumonia,” “neurologic evaluation”). Discussion: Existing order sets tend to provide nonspecific, process-oriented aid, with usability limitations impairing more precise, patient-focused support. Algorithmic summarization has the potential to breach this usability barrier by automatically inferring patient context, but with potential tradeoffs in interpretability. Conclusion: Probabilistic topic modeling provides an automated approach to detect thematic trends in patient care and generate decision support content. A potential use case finds related clinical orders for decision support. PMID:27655861
A Guide to the Literature on Learning Graphical Models
NASA Technical Reports Server (NTRS)
Buntine, Wray L.; Friedland, Peter (Technical Monitor)
1994-01-01
This literature review discusses different methods under the general rubric of learning Bayesian networks from data, and more generally, learning probabilistic graphical models. Because many problems in artificial intelligence, statistics and neural networks can be represented as a probabilistic graphical model, this area provides a unifying perspective on learning. This paper organizes the research in this area along methodological lines of increasing complexity.
Guidelines for a graph-theoretic implementation of structural equation modeling
Grace, James B.; Schoolmaster, Donald R.; Guntenspergen, Glenn R.; Little, Amanda M.; Mitchell, Brian R.; Miller, Kathryn M.; Schweiger, E. William
2012-01-01
Structural equation modeling (SEM) is increasingly being chosen by researchers as a framework for gaining scientific insights from the quantitative analyses of data. New ideas and methods emerging from the study of causality, influences from the field of graphical modeling, and advances in statistics are expanding the rigor, capability, and even purpose of SEM. Guidelines for implementing the expanded capabilities of SEM are currently lacking. In this paper we describe new developments in SEM that we believe constitute a third-generation of the methodology. Most characteristic of this new approach is the generalization of the structural equation model as a causal graph. In this generalization, analyses are based on graph theoretic principles rather than analyses of matrices. Also, new devices such as metamodels and causal diagrams, as well as an increased emphasis on queries and probabilistic reasoning, are now included. Estimation under a graph theory framework permits the use of Bayesian or likelihood methods. The guidelines presented start from a declaration of the goals of the analysis. We then discuss how theory frames the modeling process, requirements for causal interpretation, model specification choices, selection of estimation method, model evaluation options, and use of queries, both to summarize retrospective results and for prospective analyses. The illustrative example presented involves monitoring data from wetlands on Mount Desert Island, home of Acadia National Park. Our presentation walks through the decision process involved in developing and evaluating models, as well as drawing inferences from the resulting prediction equations. In addition to evaluating hypotheses about the connections between human activities and biotic responses, we illustrate how the structural equation (SE) model can be queried to understand how interventions might take advantage of an environmental threshold to limit Typha invasions. The guidelines presented provide for an updated definition of the SEM process that subsumes the historical matrix approach under a graph-theory implementation. The implementation is also designed to permit complex specifications and to be compatible with various estimation methods. Finally, they are meant to foster the use of probabilistic reasoning in both retrospective and prospective considerations of the quantitative implications of the results.
Wong, Tony E.; Keller, Klaus
2017-01-01
The response of the Antarctic ice sheet (AIS) to changing global temperatures is a key component of sea-level projections. Current projections of the AIS contribution to sea-level changes are deeply uncertain. This deep uncertainty stems, in part, from (i) the inability of current models to fully resolve key processes and scales, (ii) the relatively sparse available data, and (iii) divergent expert assessments. One promising approach to characterizing the deep uncertainty stemming from divergent expert assessments is to combine expert assessments, observations, and simple models by coupling probabilistic inversion and Bayesian inversion. Here, we present a proof-of-concept study that uses probabilistic inversion to fuse a simple AIS model and diverse expert assessments. We demonstrate the ability of probabilistic inversion to infer joint prior probability distributions of model parameters that are consistent with expert assessments. We then confront these inferred expert priors with instrumental and paleoclimatic observational data in a Bayesian inversion. These additional constraints yield tighter hindcasts and projections. We use this approach to quantify how the deep uncertainty surrounding expert assessments affects the joint probability distributions of model parameters and future projections. PMID:29287095
Bayesian outcome-based strategy classification.
Lee, Michael D
2016-03-01
Hilbig and Moshagen (Psychonomic Bulletin & Review, 21, 1431-1443, 2014) recently developed a method for making inferences about the decision processes people use in multi-attribute forced choice tasks. Their paper makes a number of worthwhile theoretical and methodological contributions. Theoretically, they provide an insightful psychological motivation for a probabilistic extension of the widely-used "weighted additive" (WADD) model, and show how this model, as well as other important models like "take-the-best" (TTB), can and should be expressed in terms of meaningful priors. Methodologically, they develop an inference approach based on the Minimum Description Length (MDL) principles that balances both the goodness-of-fit and complexity of the decision models they consider. This paper aims to preserve these useful contributions, but provide a complementary Bayesian approach with some theoretical and methodological advantages. We develop a simple graphical model, implemented in JAGS, that allows for fully Bayesian inferences about which models people use to make decisions. To demonstrate the Bayesian approach, we apply it to the models and data considered by Hilbig and Moshagen (Psychonomic Bulletin & Review, 21, 1431-1443, 2014), showing how a prior predictive analysis of the models, and posterior inferences about which models people use and the parameter settings at which they use them, can contribute to our understanding of human decision making.
NASA Technical Reports Server (NTRS)
Thacker, B. H.; Mcclung, R. C.; Millwater, H. R.
1990-01-01
An eigenvalue analysis of a typical space propulsion system turbopump blade is presented using an approximate probabilistic analysis methodology. The methodology was developed originally to investigate the feasibility of computing probabilistic structural response using closed-form approximate models. This paper extends the methodology to structures for which simple closed-form solutions do not exist. The finite element method will be used for this demonstration, but the concepts apply to any numerical method. The results agree with detailed analysis results and indicate the usefulness of a probabilistic approximate analysis in determining efficient solution strategies.
Probabilistic Models for Solar Particle Events
NASA Technical Reports Server (NTRS)
Adams, James H., Jr.; Dietrich, W. F.; Xapsos, M. A.; Welton, A. M.
2009-01-01
Probabilistic Models of Solar Particle Events (SPEs) are used in space mission design studies to provide a description of the worst-case radiation environment that the mission must be designed to tolerate. The models determine the worst-case environment using a description of the mission and a user-specified confidence level that the provided environment will not be exceeded. This poster will focus on completing the existing suite of models by developing models for peak flux and event-integrated fluence elemental spectra for the Z>2 elements. It will also discuss methods to take into account uncertainties in the database and the uncertainties resulting from the limited number of solar particle events in the database. These new probabilistic models are based on an extensive survey of SPE measurements of peak and event-integrated elemental differential energy spectra. Attempts are made to fit the measured spectra with eight different published models. The model giving the best fit to each spectrum is chosen and used to represent that spectrum for any energy in the energy range covered by the measurements. The set of all such spectral representations for each element is then used to determine the worst case spectrum as a function of confidence level. The spectral representation that best fits these worst case spectra is found and its dependence on confidence level is parameterized. This procedure creates probabilistic models for the peak and event-integrated spectra.
A Probabilistic Framework for Quantifying Mixed Uncertainties in Cyber Attacker Payoffs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chatterjee, Samrat; Tipireddy, Ramakrishna; Oster, Matthew R.
Quantification and propagation of uncertainties in cyber attacker payoffs is a key aspect within multiplayer, stochastic security games. These payoffs may represent penalties or rewards associated with player actions and are subject to various sources of uncertainty, including: (1) cyber-system state, (2) attacker type, (3) choice of player actions, and (4) cyber-system state transitions over time. Past research has primarily focused on representing defender beliefs about attacker payoffs as point utility estimates. More recently, within the physical security domain, attacker payoff uncertainties have been represented as Uniform and Gaussian probability distributions, and mathematical intervals. For cyber-systems, probability distributions may help address statistical (aleatory) uncertainties where the defender may assume inherent variability or randomness in the factors contributing to the attacker payoffs. However, systematic (epistemic) uncertainties may exist, where the defender may not have sufficient knowledge or there is insufficient information about the attacker’s payoff generation mechanism. Such epistemic uncertainties are more suitably represented as generalizations of probability boxes. This paper explores the mathematical treatment of such mixed payoff uncertainties. A conditional probabilistic reasoning approach is adopted to organize the dependencies between a cyber-system’s state, attacker type, player actions, and state transitions. This also enables the application of probabilistic theories to propagate various uncertainties in the attacker payoffs. An example implementation of this probabilistic framework and resulting attacker payoff distributions are discussed. A goal of this paper is also to highlight this uncertainty quantification problem space to the cyber security research community and encourage further advancements in this area.
NASA Astrophysics Data System (ADS)
Vico, Giulia; Porporato, Amilcare
2013-04-01
Supplemental irrigation represents one of the main strategies to mitigate the effects of climate variability and stabilize yields. Irrigated agriculture currently provides 40% of food production and its relevance is expected to further increase in the near future, in the face of the projected alterations of rainfall patterns and increase in food, fiber, and biofuel demand. Because of the significant investments and water requirements involved in irrigation, strategic choices are needed to preserve productivity and profitability, while maintaining sustainable water management, a nontrivial task given the unpredictability of the rainfall forcing. To facilitate decision making under uncertainty, a widely applicable probabilistic framework is proposed. The occurrence of rainfall events and of irrigation applications is linked probabilistically to crop development during the growing season and yields at harvest. Based on these linkages, the probability density function of yields and corresponding probability density function of required irrigation volumes, as well as the probability density function of yields under the most common case of limited water availability are obtained analytically, as a function of irrigation strategy, climate, soil and crop parameters. The full probabilistic description of the frequency of occurrence of yields and water requirements is a crucial tool for decision making under uncertainty, e.g., via expected utility analysis. Furthermore, the knowledge of the probability density function of yield allows us to quantify the yield reduction hydrologic risk. Two risk indices are defined and quantified: the long-term risk index, suitable for long-term irrigation strategy assessment and investment planning, and the real-time risk index, providing a rigorous probabilistic quantification of the emergence of drought conditions during a single growing season in an agricultural setting. Our approach employs relatively few parameters and is thus easily and broadly applicable to different crops and sites, under current and future climate scenarios. Hence, the proposed probabilistic framework provides a quantitative tool to assess the impact of irrigation strategy and water allocation on the risk of not meeting a certain target yield, thus guiding the optimal allocation of water resources for human and environmental needs.
Probabilistic evaluation of uncertainties and risks in aerospace components
NASA Technical Reports Server (NTRS)
Shah, A. R.; Shiao, M. C.; Nagpal, V. K.; Chamis, C. C.
1992-01-01
A methodology is presented for the computational simulation of primitive variable uncertainties, and attention is given to the simulation of specific aerospace components. Specific examples treated encompass a probabilistic material behavior model, as well as static, dynamic, and fatigue/damage analyses of a turbine blade in a mistuned bladed rotor in the SSME turbopumps. An account is given of the use of the NESSUS probabilistic FEM analysis CFD code.
Probabilistic Design and Analysis Framework
NASA Technical Reports Server (NTRS)
Strack, William C.; Nagpal, Vinod K.
2010-01-01
PRODAF is a software package designed to aid analysts and designers in conducting probabilistic analysis of components and systems. PRODAF can integrate multiple analysis programs to ease the tedious process of conducting a complex analysis process that requires the use of multiple software packages. The work uses a commercial finite element analysis (FEA) program with modules from NESSUS to conduct a probabilistic analysis of a hypothetical turbine blade, disk, and shaft model. PRODAF applies the response surface method, at the component level, and extrapolates the component-level responses to the system level. Hypothetical components of a gas turbine engine are first deterministically modeled using FEA. Variations in selected geometrical dimensions and loading conditions are analyzed to determine the effects on the stress state within each component. Geometric variations include the chord length and height for the blade, and the inner radius, outer radius, and thickness, which are varied for the disk. Probabilistic analysis is carried out using developing software packages like System Uncertainty Analysis (SUA) and PRODAF. PRODAF was used with a commercial deterministic FEA program in conjunction with modules from the probabilistic analysis program, NESTEM, to perturb loads and geometries to provide a reliability and sensitivity analysis. PRODAF simplified the handling of data among the various programs involved, and will work with many commercial and open-source deterministic programs, probabilistic programs, or modules.
The economics of abrupt climate change.
Perrings, Charles
2003-09-15
The US National Research Council defines abrupt climate change as a change of state that is sufficiently rapid and sufficiently widespread in its effects that economies are unprepared or incapable of adapting. This may be too restrictive a definition, but abrupt climate change does have implications for the choice between the main response options: mitigation (which reduces the risks of climate change) and adaptation (which reduces the costs of climate change). The paper argues that by (i) increasing the costs of change and the potential growth of consumption, and (ii) reducing the time to change, abrupt climate change favours mitigation over adaptation. Furthermore, because the implications of change are fundamentally uncertain and potentially very high, it favours a precautionary approach in which mitigation buys time for learning. Adaptation-oriented decision tools, such as scenario planning, are inappropriate in these circumstances. Hence learning implies the use of probabilistic models that include socioeconomic feedbacks.
Analysis of Logistics in Support of a Human Lunar Outpost
NASA Technical Reports Server (NTRS)
Cirillo, William; Earle, Kevin; Goodliff, Kandyce; Reeves, j. D.; Andrashko, Mark; Merrill, R. Gabe; Stromgren, Chel
2008-01-01
Strategic level analysis of the integrated behavior of lunar transportation system and lunar surface system architecture options is performed to inform NASA Constellation Program senior management on the benefit, viability, affordability, and robustness of system design choices. This paper presents an overview of the approach used to perform the campaign (strategic) analysis, with an emphasis on the logistics modeling and the impacts of logistics resupply on campaign behavior. An overview of deterministic and probabilistic analysis approaches is provided, with a discussion of the importance of each approach to understanding the integrated system behavior. The logistics required to support lunar surface habitation are analyzed from both 'macro-logistics' and 'micro-logistics' perspectives, where macro-logistics focuses on the delivery of goods to a destination and micro-logistics focuses on local handling of re-supply goods at a destination. An example campaign is provided to tie the theories of campaign analysis to results generation capabilities.
Abstract knowledge versus direct experience in processing of binomial expressions
Morgan, Emily; Levy, Roger
2016-01-01
We ask whether word order preferences for binomial expressions of the form A and B (e.g. bread and butter) are driven by abstract linguistic knowledge of ordering constraints referencing the semantic, phonological, and lexical properties of the constituent words, or by prior direct experience with the specific items in question. Using forced-choice and self-paced reading tasks, we demonstrate that online processing of never-before-seen binomials is influenced by abstract knowledge of ordering constraints, which we estimate with a probabilistic model. In contrast, online processing of highly frequent binomials is primarily driven by direct experience, which we estimate from corpus frequency counts. We propose a trade-off wherein processing of novel expressions relies upon abstract knowledge, while reliance upon direct experience increases with increased exposure to an expression. Our findings support theories of language processing in which both compositional generation and direct, holistic reuse of multi-word expressions play crucial roles. PMID:27776281
ERIC Educational Resources Information Center
Dougherty, Michael R.; Franco-Watkins, Ana M.; Thomas, Rick
2008-01-01
The theory of probabilistic mental models (PMM; G. Gigerenzer, U. Hoffrage, & H. Kleinbolting, 1991) has had a major influence on the field of judgment and decision making, with the most recent important modifications to PMM theory being the identification of several fast and frugal heuristics (G. Gigerenzer & D. G. Goldstein, 1996). These…
Developing probabilistic models to predict amphibian site occupancy in a patchy landscape
R. A. Knapp; K.R. Matthews; H. K. Preisler; R. Jellison
2003-01-01
Abstract. Human-caused fragmentation of habitats is threatening an increasing number of animal and plant species, making an understanding of the factors influencing patch occupancy ever more important. The overall goal of the current study was to develop probabilistic models of patch occupancy for the mountain yellow-legged frog (Rana muscosa). This once-common species...
Brandsch, Rainer
2017-10-01
Migration modelling provides reliable migration estimates from food-contact materials (FCM) to food or food simulants based on mass-transfer parameters like diffusion and partition coefficients related to individual materials. In most cases, mass-transfer parameters are not readily available from the literature and for this reason are estimated with a given uncertainty. Historically, uncertainty was accounted for by introducing upper-limit concepts, which turned out to be of limited applicability because they grossly overestimate migration. Probabilistic migration modelling makes it possible to account for uncertainty in the mass-transfer parameters as well as in other model inputs. With respect to a functional barrier, the most important parameters among others are the diffusion properties of the functional barrier and its thickness. A software tool that accepts distributions as inputs and is capable of applying Monte Carlo methods, i.e., random sampling from the input distributions of the relevant parameters (i.e., diffusion coefficient and layer thickness), predicts migration results with related uncertainty and confidence intervals. The capabilities of probabilistic migration modelling are presented through three case studies: (1) sensitivity analysis, (2) functional barrier efficiency and (3) validation by experimental testing. Based on the migration predicted by probabilistic migration modelling and related exposure estimates, safety evaluation of new materials in the context of existing or new packaging concepts is possible. This allows associated migration risks and potential safety concerns to be identified at an early stage of packaging development. Furthermore, dedicated selection of materials exhibiting the required functional barrier efficiency under application conditions becomes feasible. Validation of the migration risk assessment by probabilistic migration modelling through a minimum of dedicated experimental testing is strongly recommended.
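A minimal sketch of the Monte Carlo propagation described above, under assumed (not measured) input distributions: sample the barrier diffusion coefficient and layer thickness, push each draw through a simple diffusion quantity (here the classical lag time t_lag = L^2 / (6 D) rather than a full migration model), and summarize the result as percentiles and a probability of breakthrough within the shelf life. All numbers are placeholders.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 100_000

# Uncertain mass-transfer inputs (illustrative distributions, not measured values).
D = rng.lognormal(mean=np.log(1e-14), sigma=1.0, size=n)            # diffusion coefficient, cm^2/s
L = np.clip(rng.normal(loc=20e-4, scale=2e-4, size=n), 1e-4, None)  # barrier thickness, cm

# Classical diffusion lag time through a layer: t_lag = L^2 / (6 D).
t_lag_days = L ** 2 / (6.0 * D) / 86_400.0

shelf_life_days = 365.0
print("lag-time percentiles (days, 5/50/95):", np.percentile(t_lag_days, [5, 50, 95]).round(0))
print(f"probability the barrier is exceeded within shelf life: "
      f"{np.mean(t_lag_days < shelf_life_days):.3f}")
```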
Single, Complete, Probability Spaces Consistent With EPR-Bohm-Bell Experimental Data
NASA Astrophysics Data System (ADS)
Avis, David; Fischer, Paul; Hilbert, Astrid; Khrennikov, Andrei
2009-03-01
We show that paradoxical consequences of violations of Bell's inequality are induced by the use of an unsuitable probabilistic description for the EPR-Bohm-Bell experiment. The conventional description (due to Bell) is based on a combination of statistical data collected for different settings of polarization beam splitters (PBSs). In fact, such data consist of conditional probabilities which only partially define a probability space. Ignoring this conditioning leads to apparent contradictions in the classical probabilistic model (due to Kolmogorov). We show how to make a completely consistent probabilistic model by taking into account the probabilities of selecting the settings of the PBSs. Our model both matches the experimental data and is consistent with classical probability theory.
Probabilistic learning of nonlinear dynamical systems using sequential Monte Carlo
NASA Astrophysics Data System (ADS)
Schön, Thomas B.; Svensson, Andreas; Murray, Lawrence; Lindsten, Fredrik
2018-05-01
Probabilistic modeling provides the capability to represent and manipulate uncertainty in data, models, predictions and decisions. We are concerned with the problem of learning probabilistic models of dynamical systems from measured data. Specifically, we consider learning of probabilistic nonlinear state-space models. There is no closed-form solution available for this problem, implying that we are forced to use approximations. In this tutorial we will provide a self-contained introduction to one of the state-of-the-art methods, the particle Metropolis-Hastings algorithm, which has proven to offer a practical approximation. This is a Monte Carlo based method, where the particle filter is used to guide a Markov chain Monte Carlo method through the parameter space. One of the key merits of the particle Metropolis-Hastings algorithm is that it is guaranteed to converge to the "true solution" under mild assumptions, despite being based on a particle filter with only a finite number of particles. We will also provide a motivating numerical example illustrating the method using a modeling language tailored for sequential Monte Carlo methods. The intention of modeling languages of this kind is to open up the power of sophisticated Monte Carlo methods, including particle Metropolis-Hastings, to a large group of users without requiring them to know all the underlying mathematical details.
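A compact sketch of the particle Metropolis-Hastings idea, written directly in Python rather than in the probabilistic modeling language the tutorial uses: a bootstrap particle filter provides an unbiased likelihood estimate for a standard nonlinear benchmark state-space model, and that noisy estimate drives a random-walk Metropolis-Hastings chain over one parameter (the process-noise standard deviation). The model, prior, and tuning constants are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(6)

# Benchmark nonlinear state-space model; we infer the process-noise standard
# deviation sigma_v while keeping the measurement noise fixed.
T, sigma_e, sigma_v_true = 60, 1.0, np.sqrt(10.0)

def transition(x, t, sigma_v, size=None):
    noise = sigma_v * rng.normal(size=size)
    return 0.5 * x + 25 * x / (1 + x ** 2) + 8 * np.cos(1.2 * t) + noise

# Simulate one data record from the true model.
x, y = 0.0, np.empty(T)
for t in range(T):
    x = transition(x, t, sigma_v_true)
    y[t] = x ** 2 / 20 + sigma_e * rng.normal()

def log_likelihood_pf(sigma_v, y, n_particles=300):
    """Bootstrap particle filter estimate of log p(y | sigma_v)."""
    particles = np.zeros(n_particles)
    ll = 0.0
    for t in range(len(y)):
        particles = transition(particles, t, sigma_v, size=n_particles)
        logw = -0.5 * ((y[t] - particles ** 2 / 20) / sigma_e) ** 2 \
               - 0.5 * np.log(2 * np.pi * sigma_e ** 2)
        m = logw.max()
        w = np.exp(logw - m)
        ll += m + np.log(w.mean())                   # predictive likelihood of y[t]
        idx = rng.choice(n_particles, size=n_particles, p=w / w.sum())
        particles = particles[idx]                   # multinomial resampling
    return ll

# Random-walk Metropolis-Hastings on sigma_v, driven by the noisy likelihood estimate.
sigma_v, ll = 2.0, log_likelihood_pf(2.0, y)
chain = []
for _ in range(1000):
    prop = abs(sigma_v + 0.2 * rng.normal())         # symmetric proposal, reflected at zero
    ll_prop = log_likelihood_pf(prop, y)
    if np.log(rng.random()) < ll_prop - ll:          # flat prior on sigma_v > 0
        sigma_v, ll = prop, ll_prop
    chain.append(sigma_v)
print(f"posterior mean of sigma_v ~ {np.mean(chain[300:]):.2f} (true value {sigma_v_true:.2f})")
```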
Reasoning and choice in the Monty Hall Dilemma (MHD): implications for improving Bayesian reasoning
Tubau, Elisabet; Aguilar-Lleyda, David; Johnson, Eric D.
2015-01-01
The Monty Hall Dilemma (MHD) is a two-step decision problem involving counterintuitive conditional probabilities. The first choice is made among three equally probable options, whereas the second choice takes place after the elimination of one of the non-selected options which does not hide the prize. Differing from most Bayesian problems, statistical information in the MHD has to be inferred, either by learning outcome probabilities or by reasoning from the presented sequence of events. This often leads to suboptimal decisions and erroneous probability judgments. Specifically, decision makers commonly develop a wrong intuition that final probabilities are equally distributed, together with a preference for their first choice. Several studies have shown that repeated practice enhances sensitivity to the different reward probabilities, but does not facilitate correct Bayesian reasoning. However, modest improvements in probability judgments have been observed after guided explanations. To explain these dissociations, the present review focuses on two types of causes producing the observed biases: Emotional-based choice biases and cognitive limitations in understanding probabilistic information. Among the latter, we identify a crucial cause for the universal difficulty in overcoming the equiprobability illusion: Incomplete representation of prior and conditional probabilities. We conclude that repeated practice and/or high incentives can be effective for overcoming choice biases, but promoting an adequate partitioning of possibilities seems to be necessary for overcoming cognitive illusions and improving Bayesian reasoning. PMID:25873906
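A short simulation, included here only to make the counterintuitive conditional probabilities concrete: switching wins whenever the first pick was wrong (probability 2/3), staying wins only when it was right (1/3).

```python
import random

def play(switch, trials=100_000):
    wins = 0
    for _ in range(trials):
        prize = random.randrange(3)
        pick = random.randrange(3)
        # Host opens a non-selected door that does not hide the prize
        opened = next(d for d in range(3) if d != pick and d != prize)
        if switch:
            pick = next(d for d in range(3) if d != pick and d != opened)
        wins += (pick == prize)
    return wins / trials

print("stay  :", play(switch=False))   # ~ 1/3
print("switch:", play(switch=True))    # ~ 2/3
```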
Pearce, Marcus T
2018-05-11
Music perception depends on internal psychological models derived through exposure to a musical culture. It is hypothesized that this musical enculturation depends on two cognitive processes: (1) statistical learning, in which listeners acquire internal cognitive models of statistical regularities present in the music to which they are exposed; and (2) probabilistic prediction based on these learned models that enables listeners to organize and process their mental representations of music. To corroborate these hypotheses, I review research that uses a computational model of probabilistic prediction based on statistical learning (the information dynamics of music (IDyOM) model) to simulate data from empirical studies of human listeners. The results show that a broad range of psychological processes involved in music perception (expectation, emotion, memory, similarity, segmentation, and meter) can be understood in terms of a single, underlying process of probabilistic prediction using learned statistical models. Furthermore, IDyOM simulations of listeners from different musical cultures demonstrate that statistical learning can plausibly predict causal effects of differential cultural exposure to musical styles, providing a quantitative model of cultural distance. Understanding the neural basis of musical enculturation will benefit from close coordination between empirical neuroimaging and computational modeling of underlying mechanisms, as outlined here. © 2018 The Authors. Annals of the New York Academy of Sciences published by Wiley Periodicals, Inc. on behalf of New York Academy of Sciences.
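A toy sketch of the statistical-learning-plus-prediction idea (not the IDyOM implementation): a first-order Markov model of pitch transitions is learned from a made-up "corpus", and the information content (surprisal, in bits) of each note in a new melody is computed from the learned conditional probabilities.

```python
from collections import Counter, defaultdict
import math

corpus = ["CDEFGFEDC", "CEGEC", "CDEDCDEDC"]       # hypothetical melodies as pitch letters
counts = defaultdict(Counter)
for melody in corpus:                               # statistical learning: count pitch transitions
    for a, b in zip(melody, melody[1:]):
        counts[a][b] += 1

def prob(context, note, alphabet="CDEFGAB", k=0.5):
    c = counts[context]                             # add-k smoothing for unseen transitions
    return (c[note] + k) / (sum(c.values()) + k * len(alphabet))

melody = "CDEFG"
for a, b in zip(melody, melody[1:]):                # probabilistic prediction: surprisal per note
    print(f"{a}->{b}: IC = {-math.log2(prob(a, b)):.2f} bits")
```

High-surprisal notes in this sense are the ones that models of this family relate to expectation violation and emotional response.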
A note on probabilistic models over strings: the linear algebra approach.
Bouchard-Côté, Alexandre
2013-12-01
Probabilistic models over strings have played a key role in developing methods that take into consideration indels as phylogenetically informative events. There is an extensive literature on using automata and transducers on phylogenies to do inference on these probabilistic models, in which an important theoretical question is the complexity of computing the normalization of a class of string-valued graphical models. This question has been investigated using tools from combinatorics, dynamic programming, and graph theory, and has practical applications in Bayesian phylogenetics. In this work, we revisit this theoretical question from a different point of view, based on linear algebra. The main contribution is a set of results based on this linear algebra view that facilitate the analysis and design of inference algorithms on string-valued graphical models. As an illustration, we use this method to give a new elementary proof of a known result on the complexity of inference on the "TKF91" model, a well-known probabilistic model over strings. Compared to previous work, our method of proof is easier to extend to other models, since it relies on a novel weak condition, triangular transducers, which is easy to establish in practice. The linear algebra view provides a concise way of describing transducer algorithms and their compositions, opens the possibility of transferring fast linear algebra libraries (for example, based on GPUs) as well as low-rank matrix approximation methods to string-valued inference problems.
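A small numerical illustration of the linear-algebra view under simplifying assumptions of my own (a tiny weighted automaton, not the paper's transducer constructions): when the per-symbol transition matrices sum to a matrix with spectral radius below one, the total weight of all finite strings is a matrix geometric series, start^T (I - sum_a M_a)^{-1} stop.

```python
import numpy as np
from itertools import product

# Hypothetical 2-state weighted automaton over the alphabet {a, b}
M = {
    "a": np.array([[0.3, 0.1],
                   [0.0, 0.4]]),
    "b": np.array([[0.1, 0.2],
                   [0.2, 0.1]]),
}
start = np.array([1.0, 0.0])        # initial-state weights
stop = np.array([0.2, 0.3])         # termination weights

T = sum(M.values())                 # total one-symbol transition matrix (spectral radius < 1 here)
# Normalization: sum over all strings = start^T (I + T + T^2 + ...) stop
Z_closed = start @ np.linalg.inv(np.eye(2) - T) @ stop

# Brute-force check over all strings up to a fixed length
Z_brute = 0.0
for length in range(0, 12):
    for s in product("ab", repeat=length):
        W = np.eye(2)
        for sym in s:
            W = W @ M[sym]
        Z_brute += start @ W @ stop
print(Z_closed, Z_brute)            # the truncated sum converges towards the closed form
```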
An analytical probabilistic model of the quality efficiency of a sewer tank
NASA Astrophysics Data System (ADS)
Balistrocchi, Matteo; Grossi, Giovanna; Bacchi, Baldassare
2009-12-01
The assessment of the efficiency of a storm water storage facility devoted to sewer overflow control in urban areas strictly depends on the ability to model the main features of the rainfall-runoff routing process and the related wet-weather pollution delivery. In this paper, the possibility of applying the analytical probabilistic approach to develop a tank design method, whose potential is similar to that of continuous simulations, is demonstrated. Water quality aspects of such devices were incorporated in the model derivation. The formulation is based on a Weibull probabilistic model of the main characteristics of the rainfall process and on a power law describing the relationship between the dimensionless storm water cumulative runoff volume and the dimensionless cumulative pollutograph. Following this approach, efficiency indexes were established. The proposed model was verified by comparing its results to those obtained by continuous simulations; satisfactory agreement is shown for the proposed efficiency indexes.
PROBABILISTIC MODELING FOR ADVANCED HUMAN EXPOSURE ASSESSMENT
Human exposures to environmental pollutants widely vary depending on the emission patterns that result in microenvironmental pollutant concentrations, as well as behavioral factors that determine the extent of an individual's contact with these pollutants. Probabilistic human exp...
Regional scaling of annual mean precipitation and water availability with global temperature change
NASA Astrophysics Data System (ADS)
Greve, Peter; Gudmundsson, Lukas; Seneviratne, Sonia I.
2018-03-01
Changes in regional water availability are among the most crucial potential impacts of anthropogenic climate change, but are highly uncertain. It is thus of key importance for stakeholders to assess the possible implications of different global temperature thresholds on these quantities. Using a subset of climate model simulations from the fifth phase of the Coupled Model Intercomparison Project (CMIP5), we derive here the sensitivity of regional changes in precipitation and in precipitation minus evapotranspiration to global temperature changes. The simulations span the full range of available emission scenarios, and the sensitivities are derived using a modified pattern-scaling approach. The applied approach assumes linear relationships with global temperature change while thoroughly addressing associated uncertainties via resampling methods. This allows us to assess the full distribution of the simulations in a probabilistic sense. Northern high-latitude regions display robust responses towards wetting, while subtropical regions display a tendency towards drying but with a large range of responses. Even though both internal variability and the scenario choice play an important role in the overall spread of the simulations, the uncertainty stemming from the climate model choice usually accounts for about half of the total uncertainty in most regions. We additionally assess the implications of limiting global mean temperature warming to values below (i) 2 K or (ii) 1.5 K (as stated within the 2015 Paris Agreement). We show that opting for the 1.5 K target might only slightly influence the mean response, but could substantially reduce the risk of experiencing extreme changes in regional water availability.
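A schematic of the scaling-plus-resampling idea, with synthetic data standing in for CMIP5 output: regional precipitation change is regressed linearly on global mean temperature change, and the fit is repeated on bootstrap resamples of the simulations to obtain an uncertainty range for the sensitivity. The numbers are placeholders, not results from the study.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-ins for an ensemble of simulations:
# global warming [K] and regional precipitation change [%]
n_sims = 40
dT = rng.uniform(0.5, 4.5, n_sims)
dP = 3.0 * dT + rng.normal(0, 2.0, n_sims)      # assumed "true" sensitivity of 3 %/K plus noise

def sensitivity(x, y):
    slope, _ = np.polyfit(x, y, 1)              # linear pattern-scaling fit
    return slope

boot = np.array([
    sensitivity(dT[idx], dP[idx])
    for idx in (rng.integers(0, n_sims, n_sims) for _ in range(2000))
])
lo, med, hi = np.percentile(boot, [5, 50, 95])
print(f"dP/dT sensitivity: {med:.2f} %/K (90% range {lo:.2f} to {hi:.2f})")
```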
Probabilistic boundary element method
NASA Technical Reports Server (NTRS)
Cruse, T. A.; Raveendra, S. T.
1989-01-01
The purpose of the Probabilistic Structural Analysis Method (PSAM) project is to develop structural analysis capabilities for the design analysis of advanced space propulsion system hardware. The boundary element method (BEM) is used as the basis of the Probabilistic Advanced Analysis Methods (PADAM) discussed here. The probabilistic BEM code (PBEM) is used to obtain the structural response and sensitivity results for a set of random variables. As such, PBEM performs analogously to other structural analysis codes, such as finite elements, in the PSAM system. For linear problems, unlike the finite element method (FEM), the BEM governing equations are written at the boundary of the body only; thus, the method eliminates the need to model the volume of the body. However, for general body-force problems, a direct condensation of the governing equations to the boundary of the body is not possible, and therefore volume modeling is generally required.
NASA Technical Reports Server (NTRS)
Canfield, R. C.; Ricchiazzi, P. J.
1980-01-01
An approximate probabilistic radiative transfer equation and the statistical equilibrium equations are simultaneously solved for a model hydrogen atom consisting of three bound levels and ionization continuum. The transfer equation for L-alpha, L-beta, H-alpha, and the Lyman continuum is explicitly solved assuming complete redistribution. The accuracy of this approach is tested by comparing source functions and radiative loss rates to values obtained with a method that solves the exact transfer equation. Two recent model solar-flare chromospheres are used for this test. It is shown that for the test atmospheres the probabilistic method gives values of the radiative loss rate that are characteristically good to a factor of 2. The advantage of this probabilistic approach is that it retains a description of the dominant physical processes of radiative transfer in the complete redistribution case, yet it achieves a major reduction in computational requirements.
Wang, Yue; Adalý, Tülay; Kung, Sun-Yuan; Szabo, Zsolt
2007-01-01
This paper presents a probabilistic neural network based technique for unsupervised quantification and segmentation of brain tissues from magnetic resonance images. It is shown that this problem can be solved by distribution learning and relaxation labeling, resulting in an efficient method that may be particularly useful in quantifying and segmenting abnormal brain tissues where the number of tissue types is unknown and the distributions of tissue types heavily overlap. The new technique uses suitable statistical models for both the pixel and context images and formulates the problem in terms of model-histogram fitting and global consistency labeling. The quantification is achieved by probabilistic self-organizing mixtures and the segmentation by a probabilistic constraint relaxation network. The experimental results show the efficient and robust performance of the new algorithm and that it outperforms the conventional classification based approaches. PMID:18172510
Time Alignment as a Necessary Step in the Analysis of Sleep Probabilistic Curves
NASA Astrophysics Data System (ADS)
Rošt'áková, Zuzana; Rosipal, Roman
2018-02-01
Sleep can be characterised as a dynamic process that passes through a finite set of sleep stages during the night. The standard Rechtschaffen and Kales sleep model produces a discrete representation of sleep and does not take into account its dynamic structure. In contrast, the continuous sleep representation provided by the probabilistic sleep model accounts for the dynamics of the sleep process. However, analysis of the sleep probabilistic curves is problematic when time misalignment is present. In this study, we highlight the necessity of curve synchronisation before further analysis. The original and time-aligned sleep probabilistic curves were transformed into a finite-dimensional vector space, and their ability to predict subjects' age or daily measures was evaluated. We conclude that curve alignment significantly improves the prediction of the daily measures, especially in the case of the S2-related sleep states or slow wave sleep.
NASA Astrophysics Data System (ADS)
Mayr, G. J.; Kneringer, P.; Dietz, S. J.; Zeileis, A.
2016-12-01
Low visibility or a low cloud ceiling reduces the capacity of airports by requiring special low visibility procedures (LVP) for incoming and departing aircraft. Probabilistic forecasts of when such procedures will become necessary help to mitigate delays and economic losses. We compare the performance of probabilistic nowcasts with two statistical methods: ordered logistic regression, and trees and random forests. These models harness historic and current meteorological measurements in the vicinity of the airport and LVP states, and incorporate diurnal and seasonal climatological information via generalized additive models (GAM). The methods are applied at Vienna International Airport (Austria). The performance is benchmarked against climatology, persistence and human forecasters.
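A minimal sketch of the tree-based variant on synthetic data; the predictors, LVP categories and settings are placeholders rather than those used at Vienna. The point is simply that the forest outputs class probabilities for ordered LVP states from recent meteorological measurements.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(7)

# Synthetic predictors: visibility [m], ceiling [ft], dew-point depression [K], hour of day
n = 5000
X = np.column_stack([
    rng.uniform(100, 10000, n),
    rng.uniform(0, 5000, n),
    rng.uniform(0, 10, n),
    rng.integers(0, 24, n),
])
# Placeholder LVP category 0 (none) .. 2 (severe), loosely tied to visibility and ceiling
y = np.digitize(-(X[:, 0] / 10000 + X[:, 1] / 5000), bins=[-1.0, -0.5])

clf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X, y)
probs = clf.predict_proba([[400.0, 150.0, 0.5, 5]])    # probabilistic nowcast for one situation
print(dict(zip(clf.classes_, probs[0].round(3))))
```

In an operational setting the same probabilities would be verified against climatology, persistence and human forecasts, as the abstract describes.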
Composite Load Spectra for Select Space Propulsion Structural Components
NASA Technical Reports Server (NTRS)
Ho, Hing W.; Newell, James F.
1994-01-01
Generic load models are described with multiple levels of progressive sophistication to simulate the composite (combined) load spectra (CLS) that are induced in space propulsion system components representative of Space Shuttle Main Engines (SSME), such as transfer ducts, turbine blades and liquid oxygen (LOX) posts. These generic (coupled) models combine deterministic models for the dynamic, acoustic, high-pressure, high-rotational-speed and other load components, using statistically varying coefficients. These coefficients are then determined using advanced probabilistic simulation methods with and without strategically selected experimental data. The entire simulation process is included in a CLS computer code. Applications of the computer code to various components, in conjunction with PSAM (Probabilistic Structural Analysis Method), to perform probabilistic load evaluation and life prediction evaluations are also described to illustrate the effectiveness of the coupled-model approach.
Appraisal of jump distributions in ensemble-based sampling algorithms
NASA Astrophysics Data System (ADS)
Dejanic, Sanda; Scheidegger, Andreas; Rieckermann, Jörg; Albert, Carlo
2017-04-01
Sampling Bayesian posteriors of model parameters is often required for making model-based probabilistic predictions. For complex environmental models, standard Markov chain Monte Carlo (MCMC) methods are often infeasible because they require too many sequential model runs. Therefore, we focused on ensemble methods that use many Markov chains in parallel, since they can be run on modern cluster architectures. Little is known about how to choose the best-performing sampler for a given application. A poor choice can lead to an inappropriate representation of posterior knowledge. We assessed two different jump moves, the stretch and the differential evolution move, underlying, respectively, the software packages EMCEE and DREAM, which are popular in different scientific communities. For the assessment, we used analytical posteriors with features as they often occur in real posteriors, namely high dimensionality, strong non-linear correlations or multimodality. For posteriors with non-linear features, standard convergence diagnostics based on sample means can be insufficient. Therefore, we resorted to an entropy-based convergence measure. We assessed the samplers by means of their convergence speed, robustness and effective sample sizes. For posteriors with strongly non-linear features, we found that the stretch move outperforms the differential evolution move with respect to all three aspects.
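A bare-bones implementation of the two jump moves being compared, written directly from their published definitions (Goodman & Weare's stretch move; ter Braak's differential evolution move) rather than taken from the EMCEE or DREAM packages, and applied to a toy correlated Gaussian target; tuning constants and the target are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)
d = 2
cov = np.array([[1.0, 0.95], [0.95, 1.0]])      # strongly correlated toy target
icov = np.linalg.inv(cov)
logp = lambda x: -0.5 * x @ icov @ x

def stretch_step(X, a=2.0):                     # stretch move: y = x_j + z (x_k - x_j)
    n, dim = X.shape
    for k in range(n):
        j = rng.choice([i for i in range(n) if i != k])
        z = ((a - 1.0) * rng.uniform() + 1.0) ** 2 / a      # z ~ g(z) proportional to 1/sqrt(z)
        y = X[j] + z * (X[k] - X[j])
        if np.log(rng.uniform()) < (dim - 1) * np.log(z) + logp(y) - logp(X[k]):
            X[k] = y
    return X

def de_step(X, eps=1e-4):                       # differential evolution move
    n, dim = X.shape
    gamma = 2.38 / np.sqrt(2 * dim)
    for k in range(n):
        r1, r2 = rng.choice([i for i in range(n) if i != k], 2, replace=False)
        y = X[k] + gamma * (X[r1] - X[r2]) + rng.normal(0, eps, dim)
        if np.log(rng.uniform()) < logp(y) - logp(X[k]):    # symmetric proposal
            X[k] = y
    return X

walkers, samples = rng.normal(size=(20, d)), []
for it in range(3000):
    walkers = stretch_step(walkers)             # swap in de_step(walkers) to compare the two moves
    if it > 500:
        samples.append(walkers.copy())
print("sample covariance:\n", np.cov(np.vstack(samples).T))
```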
Badre, David
2012-01-01
Growing evidence suggests that the prefrontal cortex (PFC) is organized hierarchically, with more anterior regions having increasingly abstract representations. How does this organization support hierarchical cognitive control and the rapid discovery of abstract action rules? We present computational models at different levels of description. A neural circuit model simulates interacting corticostriatal circuits organized hierarchically. In each circuit, the basal ganglia gate frontal actions, with some striatal units gating the inputs to PFC and others gating the outputs to influence response selection. Learning at all of these levels is accomplished via dopaminergic reward prediction error signals in each corticostriatal circuit. This functionality allows the system to exhibit conditional if–then hypothesis testing and to learn rapidly in environments with hierarchical structure. We also develop a hybrid Bayesian-reinforcement learning mixture of experts (MoE) model, which can estimate the most likely hypothesis state of individual participants based on their observed sequence of choices and rewards. This model yields accurate probabilistic estimates about which hypotheses are attended by manipulating attentional states in the generative neural model and recovering them with the MoE model. This 2-pronged modeling approach leads to multiple quantitative predictions that are tested with functional magnetic resonance imaging in the companion paper. PMID:21693490
Chronic Exposure to Methamphetamine Disrupts Reinforcement-Based Decision Making in Rats.
Groman, Stephanie M; Rich, Katherine M; Smith, Nathaniel J; Lee, Daeyeol; Taylor, Jane R
2018-03-01
The persistent use of psychostimulant drugs, despite the detrimental outcomes associated with continued drug use, may be because of disruptions in reinforcement-learning processes that enable behavior to remain flexible and goal directed in dynamic environments. To identify the reinforcement-learning processes that are affected by chronic exposure to the psychostimulant methamphetamine (MA), the current study sought to use computational and biochemical analyses to characterize decision-making processes, assessed by probabilistic reversal learning, in rats before and after they were exposed to an escalating dose regimen of MA (or saline control). The ability of rats to use flexible and adaptive decision-making strategies following changes in stimulus-reward contingencies was significantly impaired following exposure to MA. Computational analyses of parameters that track choice and outcome behavior indicated that exposure to MA significantly impaired the ability of rats to use negative outcomes effectively. These MA-induced changes in decision making were similar to those observed in rats following administration of a dopamine D2/3 receptor antagonist. These data use computational models to provide insight into drug-induced maladaptive decision making that may ultimately identify novel targets for the treatment of psychostimulant addiction. We suggest that the disruption in utilization of negative outcomes to adaptively guide dynamic decision making is a new behavioral mechanism by which MA rigidly biases choice behavior.
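A sketch of the general kind of computational analysis referred to above (the exact model in the study may differ): a reinforcement-learning model with separate learning rates for positive and negative prediction errors and a softmax choice rule, fitted by maximum likelihood; impaired use of negative outcomes would show up as a reduced negative-outcome learning rate. The data below are hypothetical.

```python
import numpy as np
from scipy.optimize import minimize

def neg_log_lik(params, choices, rewards):
    """Q-learning with separate learning rates for positive/negative prediction errors."""
    alpha_pos, alpha_neg, beta = params
    q = np.zeros(2)
    nll = 0.0
    for c, r in zip(choices, rewards):
        p = np.exp(beta * q) / np.exp(beta * q).sum()   # softmax choice probabilities
        nll -= np.log(p[c] + 1e-12)
        delta = r - q[c]                                 # reward prediction error
        q[c] += (alpha_pos if delta >= 0 else alpha_neg) * delta
    return nll

# Hypothetical choice/outcome data from one probabilistic reversal-learning session
rng = np.random.default_rng(11)
choices = rng.integers(0, 2, 200)
rewards = rng.binomial(1, np.where(choices == 0, 0.8, 0.2))

fit = minimize(neg_log_lik, x0=[0.3, 0.3, 3.0], args=(choices, rewards),
               bounds=[(0, 1), (0, 1), (0.01, 20)])
print("alpha_pos, alpha_neg, beta =", fit.x.round(3))
```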
Lunar Exploration Architecture Level Key Drivers and Sensitivities
NASA Technical Reports Server (NTRS)
Goodliff, Kandyce; Cirillo, William; Earle, Kevin; Reeves, J. D.; Shyface, Hilary; Andraschko, Mark; Merrill, R. Gabe; Stromgren, Chel; Cirillo, Christopher
2009-01-01
Strategic level analysis of the integrated behavior of lunar transportation and lunar surface systems architecture options is performed to assess the benefit, viability, affordability, and robustness of system design choices. This analysis employs both deterministic and probabilistic modeling techniques so that the extent of potential future uncertainties associated with each option is properly characterized. The results of these analyses are summarized in a predefined set of high-level Figures of Merit (FOMs) so as to provide senior NASA Constellation Program (CxP) and Exploration Systems Mission Directorate (ESMD) management with pertinent information to better inform strategic level decision making. The strategic level exploration architecture model is designed to perform analysis at as high a level as possible but still capture those details that have major impacts on system performance. The strategic analysis methodology focuses on integrated performance, affordability, and risk analysis, and captures the linkages and feedbacks between these three areas. Each of these results leads into the determination of the high-level FOMs. This strategic level analysis methodology has been previously applied to Space Shuttle and International Space Station assessments and is now being applied to the development of the Constellation Program point-of-departure lunar architecture. This paper provides an overview of the strategic analysis methodology and the lunar exploration architecture analyses to date. In studying these analysis results, the strategic analysis team has identified and characterized key drivers affecting the integrated architecture behavior. These key drivers include inclusion of a cargo lander, mission rate, mission location, fixed-versus-variable costs/return on investment, and the requirement for probabilistic analysis. Results of sensitivity analysis performed on lunar exploration architecture scenarios are also presented.
NASA Astrophysics Data System (ADS)
García-Mayordomo, J.; Gaspar-Escribano, J. M.; Benito, B.
2007-10-01
A probabilistic seismic hazard assessment of the Province of Murcia in terms of peak ground acceleration (PGA) and spectral accelerations [SA(T)] is presented in this paper. In contrast to most of the previous studies in the region, which were performed for PGA making use of intensity-to-PGA relationships, hazard is here calculated in terms of magnitude and using European spectral ground-motion models. Moreover, we have considered the most important faults in the region as specific seismic sources, and also comprehensively reviewed the earthquake catalogue. Hazard calculations are performed following the Probabilistic Seismic Hazard Assessment (PSHA) methodology using a logic tree, which accounts for three different seismic source zonings and three different ground-motion models. Hazard maps in terms of PGA and SA(0.1, 0.2, 0.5, 1.0 and 2.0 s) and coefficient of variation (COV) for the 475-year return period are shown. Subsequent analysis is focused on three sites of the province, namely, the cities of Murcia, Lorca and Cartagena, which are important industrial and tourism centres. Results at these sites have been analysed to evaluate the influence of the different input options. The most important factor affecting the results is the choice of the attenuation relationship, whereas the influence of the selected seismic source zonings appears strongly site dependent. Finally, we have performed an analysis of source contribution to hazard at each of these cities to provide preliminary guidance in devising specific risk scenarios. We have found that local source zones control the hazard for PGA and SA(T ≤ 1.0 s), although contribution from specific fault sources and long-distance north Algerian sources becomes significant from SA(0.5 s) onwards.
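A highly simplified sketch of the PSHA computation itself, for one areal source and one hypothetical ground-motion model: the annual rate of exceeding a PGA level combines the magnitude recurrence law, a distance distribution, and the aleatory variability of the ground-motion model. All inputs are placeholders and do not reproduce the Murcia study.

```python
import numpy as np
from scipy.stats import norm

# Hypothetical inputs (illustrative only)
nu = 0.05                                   # annual rate of M >= 4.0 events in the source
b = 1.0                                     # Gutenberg-Richter b-value
m = np.arange(4.0, 7.01, 0.1)               # magnitude bins
pm = 10.0 ** (-b * m); pm /= pm.sum()       # truncated G-R magnitude probabilities
r = np.arange(5.0, 100.0, 5.0)              # source-to-site distance bins [km]
pr = r / r.sum()                            # crude distance distribution (annular area grows with r)

def gmpe_ln_pga(m, r):                      # hypothetical attenuation relation, ln(PGA in g)
    return -3.5 + 0.9 * m - 1.2 * np.log(r + 10.0)

sigma = 0.6                                 # aleatory standard deviation of ln(PGA)

def annual_exceedance(a):
    lam = 0.0
    for mi, pmi in zip(m, pm):
        for ri, pri in zip(r, pr):
            p_exceed = 1.0 - norm.cdf(np.log(a), gmpe_ln_pga(mi, ri), sigma)
            lam += nu * pmi * pri * p_exceed
    return lam

for a in [0.05, 0.1, 0.2, 0.4]:
    lam = annual_exceedance(a)
    print(f"PGA > {a:.2f} g: rate = {lam:.2e}/yr, return period = {1/lam:.0f} yr")
```

A logic tree, as used in the study, would repeat this calculation for each combination of source zoning and ground-motion model and weight the resulting hazard curves.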
Probabilistic modeling of the evolution of gene synteny within reconciled phylogenies
2015-01-01
Background Most models of genome evolution concern either genetic sequences, gene content or gene order. They sometimes integrate two of the three levels, but rarely the three of them. Probabilistic models of gene order evolution usually have to assume constant gene content or adopt a presence/absence coding of gene neighborhoods which is blind to complex events modifying gene content. Results We propose a probabilistic evolutionary model for gene neighborhoods, allowing genes to be inserted, duplicated or lost. It uses reconciled phylogenies, which integrate sequence and gene content evolution. We are then able to optimize parameters such as phylogeny branch lengths, or probabilistic laws depicting the diversity of susceptibility of syntenic regions to rearrangements. We reconstruct a structure for ancestral genomes by optimizing a likelihood, keeping track of all evolutionary events at the level of gene content and gene synteny. Ancestral syntenies are associated with a probability of presence. We implemented the model with the restriction that at most one gene duplication separates two gene speciations in reconciled gene trees. We reconstruct ancestral syntenies on a set of 12 drosophila genomes, and compare the evolutionary rates along the branches and along the sites. We compare with a parsimony method and find a significant number of results not supported by the posterior probability. The model is implemented in the Bio++ library. It thus benefits from and enriches the classical models and methods for molecular evolution. PMID:26452018
The U.S. Environmental Protection Agency has conducted a probabilistic exposure and dose assessment on the arsenic (As) and chromium (Cr) components of Chromated Copper Arsenate (CCA) using the Stochastic Human Exposure and Dose Simulation model for wood preservatives (SHEDS-Wood...
Probabilistic Based Modeling and Simulation Assessment
2010-06-01
different crash and blast scenarios. With the integration of the high fidelity neck and head model, a methodology to calculate the probability of injury...variability, correlation, and multiple (often competing) failure metrics. Important scenarios include vehicular collisions, blast/fragment impact, and...first area of focus is to develop a methodology to integrate probabilistic analysis into finite element analysis of vehicle collisions and blast.
ERIC Educational Resources Information Center
Tsitsipis, Georgios; Stamovlasis, Dimitrios; Papageorgiou, George
2012-01-01
In this study, the effect of three cognitive variables (logical thinking, field dependence/field independence, and convergent/divergent thinking) on some specific students' answers related to the particulate nature of matter was investigated by means of probabilistic models. Besides recording and tabulating the students' responses, a combination…
Steven C. McKelvey; William D. Smith; Frank Koch
2012-01-01
This project summary describes a probabilistic model developed with funding support from the Forest Health Monitoring Program of the Forest Service, U.S. Department of Agriculture (BaseEM Project SO-R-08-01). The model has been implemented in SODBuster, a standalone software package developed using the Java software development kit from Sun Microsystems.
Economic Dispatch for Microgrid Containing Electric Vehicles via Probabilistic Modeling: Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yao, Yin; Gao, Wenzhong; Momoh, James
In this paper, an economic dispatch model with probabilistic modeling is developed for a microgrid. The electric power supply in a microgrid consists of conventional power plants and renewable energy power plants, such as wind and solar power plants. Because of the fluctuation in the output of solar and wind power plants, an empirical probabilistic model is developed to predict their hourly output. According to different characteristics of wind and solar power plants, the parameters for probabilistic distribution are further adjusted individually for both. On the other hand, with the growing trend in plug-in electric vehicles (PHEVs), an integrated microgrid system must also consider the impact of PHEVs. The charging loads from PHEVs as well as the discharging output via the vehicle-to-grid (V2G) method can greatly affect the economic dispatch for all of the micro energy sources in a microgrid. This paper presents an optimization method for economic dispatch in a microgrid considering conventional power plants, renewable power plants, and PHEVs. The simulation results reveal that PHEVs with V2G capability can be an indispensable supplement in a modern microgrid.
Probabilistic population projections with migration uncertainty
Azose, Jonathan J.; Ševčíková, Hana; Raftery, Adrian E.
2016-01-01
We produce probabilistic projections of population for all countries based on probabilistic projections of fertility, mortality, and migration. We compare our projections to those from the United Nations’ Probabilistic Population Projections, which uses similar methods for fertility and mortality but deterministic migration projections. We find that uncertainty in migration projection is a substantial contributor to uncertainty in population projections for many countries. Prediction intervals for the populations of Northern America and Europe are over 70% wider, whereas prediction intervals for the populations of Africa, Asia, and the world as a whole are nearly unchanged. Out-of-sample validation shows that the model is reasonably well calibrated. PMID:27217571
Probabilistic brains: knowns and unknowns
Pouget, Alexandre; Beck, Jeffrey M; Ma, Wei Ji; Latham, Peter E
2015-01-01
There is strong behavioral and physiological evidence that the brain both represents probability distributions and performs probabilistic inference. Computational neuroscientists have started to shed light on how these probabilistic representations and computations might be implemented in neural circuits. One particularly appealing aspect of these theories is their generality: they can be used to model a wide range of tasks, from sensory processing to high-level cognition. To date, however, these theories have only been applied to very simple tasks. Here we discuss the challenges that will emerge as researchers start focusing their efforts on real-life computations, with a focus on probabilistic learning, structural learning and approximate inference. PMID:23955561
A probabilistic maintenance model for diesel engines
NASA Astrophysics Data System (ADS)
Pathirana, Shan; Abeygunawardane, Saranga Kumudu
2018-02-01
In this paper, a probabilistic maintenance model is developed for inspection-based preventive maintenance of diesel engines, based on the practical model concepts discussed in the literature. The developed model is solved using real data obtained from inspection and maintenance histories of diesel engines and experts' views. Reliability indices and costs were calculated for the present maintenance policy of diesel engines. A sensitivity analysis is conducted to observe the effect of inspection-based preventive maintenance on the life cycle cost of diesel engines.
Poltavski, Dmitri V; Weatherly, Jeffrey N
2013-12-01
The purpose of the present study was to investigate temporal and probabilistic discounting in smokers and never-smokers, across a number of commodities, using a multiple-choice method. One hundred and eighty-two undergraduate university students, of whom 90 had never smoked, 73 were self-reported light smokers (<10 cigarettes/day), and 17 were heavy smokers (10+ cigarettes/day), completed computerized batteries of delay and probability discounting questions pertaining to a total of eight commodities and administered in a multiple-choice format. In addition to cigarettes, monetary rewards, and health outcomes, the tasks included novel commodities such as ideal dating partner and retirement income. The results showed that heavy smokers probability discounted commodities at a significantly shallower rate than never-smokers, suggesting greater risk-taking. No effect of smoking status was observed for delay discounting questions. The only commodity that was probability discounted significantly less than others was 'finding an ideal dating partner'. The results suggest that probability discounting tasks using the multiple-choice format can discriminate between non-abstaining smokers and never-smokers and could be further explored in the context of behavioral and drug addictions.
Carrasco, E; Valero, A; Pérez-Rodríguez, F; García-Gimeno, R M; Zurera, G
2007-03-10
The recent Commission Regulation (EC) No 2073/2005 establishes microbiological criteria in foods. For the pathogen Listeria monocytogenes in the category of ready-to-eat foods able to support its growth, other than those intended for infants and for special medical purposes, two different microbiological criteria are proposed: (i) L. monocytogenes levels should be <100 cfu/g throughout the shelf-life of the product, and (ii) absence in 25 g of the product at the stage before the food has left the immediate control of the food business operator who has produced it. The application of either the first or the second of these criteria depends on whether or not the manufacturer is able to demonstrate that the level of L. monocytogenes in the food product will not exceed 100 cfu/g throughout its shelf-life. This demonstration should be based on the physico-chemical characteristics of the target product and consultation of the scientific literature, and, when necessary, on quantitative models and/or challenge tests. Once the characteristics of the product as well as the scientific literature show that the pathogen has the potential to grow on a specific food commodity, it seems adequate to use quantitative models and/or perform challenge tests to study the extent to which L. monocytogenes could grow. In this study, we aim to illustrate, with an example in cooked ham, the application of quantitative models as a tool to manage compliance with these criteria. Two approaches, deterministic and probabilistic, were considered for three different commercial brands (A, B, and C). The deterministic approach showed that the 100 cfu/g limit was largely exceeded at the end of the shelf-life of all three; however, when the storage time was reduced, the level of L. monocytogenes remained below 100 cfu/g in B. The probabilistic approach yielded very low percentiles corresponding to 100 cfu/g; when the storage time was reduced, the percentiles for the three products increased, especially in products B and C (from 4.92% to 75.90%, and from 0.90% to 73.90%, respectively). This study shows how different storage times influence the level of L. monocytogenes at the end of the shelf-life of cooked ham, and, depending on the level reached, the microbiological criterion applied should differ, as stated above. Besides this, the choice of either the point-estimate or the probabilistic approach should be determined by the competent sanitary authority, and, in the case of selecting the second approach, a certain percentile for the 100 cfu/g level should be established.
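A schematic of the probabilistic variant of such an assessment, with made-up distributions for storage temperature, initial contamination and storage time and a generic square-root-type growth model; it is not the validated model or data used in the study, only an illustration of how the percentile for the 100 cfu/g level is obtained.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 50_000

# Hypothetical input distributions (illustrative only)
T = rng.normal(7.0, 2.0, n)                  # storage temperature [deg C]
logN0 = rng.normal(-1.0, 0.5, n)             # initial contamination [log10 cfu/g]
t = rng.uniform(20, 35, n)                   # storage time [days]

# Generic square-root (Ratkowsky-type) growth rate; placeholder parameters, no lag phase
b, Tmin = 0.05, -1.18
mu = (b * np.maximum(T - Tmin, 0.0)) ** 2    # specific growth rate [ln units/day]
logN = np.minimum(logN0 + mu * t / np.log(10), 8.0)   # exponential growth, capped at a maximum density

p_exceed = np.mean(logN > 2.0)               # fraction exceeding 100 cfu/g at end of shelf-life
print(f"P(N > 100 cfu/g) = {p_exceed:.1%}")
```

Re-running the same calculation with a shorter storage-time distribution shows directly how the exceedance percentile responds, which is the comparison made in the abstract.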
Probabilistic estimates of drought impacts on agricultural production
NASA Astrophysics Data System (ADS)
Madadgar, Shahrbanou; AghaKouchak, Amir; Farahmand, Alireza; Davis, Steven J.
2017-08-01
Increases in the severity and frequency of drought in a warming climate may negatively impact agricultural production and food security. Unlike previous studies that have estimated agricultural impacts of climate conditions using single-crop yield distributions, we develop a multivariate probabilistic model that uses projected climatic conditions (e.g., precipitation amount or soil moisture) throughout a growing season to estimate the probability distribution of crop yields. We demonstrate the model by an analysis of the historical period 1980-2012, including the Millennium Drought in Australia (2001-2009). We find that precipitation and soil moisture deficit in dry growing seasons reduced the average annual yield of the five largest crops in Australia (wheat, broad beans, canola, lupine, and barley) by 25-45% relative to the wet growing seasons. Our model can thus produce region- and crop-specific agricultural sensitivities to climate conditions and variability. Probabilistic estimates of yield may help decision-makers in government and business to quantitatively assess the vulnerability of agriculture to climate variations. We develop a multivariate probabilistic model that uses precipitation to estimate the probability distribution of crop yields. The proposed model shows how the probability distribution of crop yield changes in response to droughts. During Australia's Millennium Drought, precipitation and soil moisture deficit reduced the average annual yield of the five largest crops.
Menze, Bjoern H.; Van Leemput, Koen; Lashkari, Danial; Riklin-Raviv, Tammy; Geremia, Ezequiel; Alberts, Esther; Gruber, Philipp; Wegener, Susanne; Weber, Marc-André; Székely, Gabor; Ayache, Nicholas; Golland, Polina
2016-01-01
We introduce a generative probabilistic model for segmentation of brain lesions in multi-dimensional images that generalizes the EM segmenter, a common approach for modelling brain images using Gaussian mixtures and a probabilistic tissue atlas that employs expectation-maximization (EM) to estimate the label map for a new image. Our model augments the probabilistic atlas of the healthy tissues with a latent atlas of the lesion. We derive an estimation algorithm with closed-form EM update equations. The method extracts a latent atlas prior distribution and the lesion posterior distributions jointly from the image data. It delineates lesion areas individually in each channel, allowing for differences in lesion appearance across modalities, an important feature of many brain tumor imaging sequences. We also propose discriminative model extensions to map the output of the generative model to arbitrary labels with semantic and biological meaning, such as “tumor core” or “fluid-filled structure”, but without a one-to-one correspondence to the hypo- or hyper-intense lesion areas identified by the generative model. We test the approach in two image sets: the publicly available BRATS set of glioma patient scans, and multimodal brain images of patients with acute and subacute ischemic stroke. We find the generative model that has been designed for tumor lesions to generalize well to stroke images, and the generative-discriminative model to be one of the top ranking methods in the BRATS evaluation. PMID:26599702
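A stripped-down sketch of the EM segmenter that the model above generalizes, run on synthetic one-channel intensities with a placeholder per-voxel atlas prior; the latent lesion atlas and multi-channel aspects of the actual model are omitted.

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic single-channel "image": intensities from 3 tissue classes
n_vox, K = 10_000, 3
true_means = np.array([30.0, 80.0, 150.0])
labels = rng.integers(0, K, n_vox)
x = rng.normal(true_means[labels], 10.0)

atlas = rng.dirichlet(np.ones(K), n_vox)         # placeholder probabilistic atlas: one prior per voxel

mu = np.array([20.0, 100.0, 140.0])              # initial class means
sigma = np.full(K, 20.0)
for _ in range(30):                              # EM with closed-form updates
    # E-step: responsibilities combine the atlas prior with Gaussian likelihoods
    lik = np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2) / sigma
    resp = atlas * lik
    resp /= resp.sum(axis=1, keepdims=True)
    # M-step: responsibility-weighted means and standard deviations
    Nk = resp.sum(axis=0)
    mu = (resp * x[:, None]).sum(axis=0) / Nk
    sigma = np.sqrt((resp * (x[:, None] - mu) ** 2).sum(axis=0) / Nk)
print("estimated class means:", mu.round(1))
```

The cited model adds a latent lesion atlas estimated jointly with these updates, so lesion voxels are explained by an extra class rather than distorting the healthy-tissue parameters.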
Menze, Bjoern H; Van Leemput, Koen; Lashkari, Danial; Riklin-Raviv, Tammy; Geremia, Ezequiel; Alberts, Esther; Gruber, Philipp; Wegener, Susanne; Weber, Marc-Andre; Szekely, Gabor; Ayache, Nicholas; Golland, Polina
2016-04-01
We introduce a generative probabilistic model for segmentation of brain lesions in multi-dimensional images that generalizes the EM segmenter, a common approach for modelling brain images using Gaussian mixtures and a probabilistic tissue atlas that employs expectation-maximization (EM) to estimate the label map for a new image. Our model augments the probabilistic atlas of the healthy tissues with a latent atlas of the lesion. We derive an estimation algorithm with closed-form EM update equations. The method extracts a latent atlas prior distribution and the lesion posterior distributions jointly from the image data. It delineates lesion areas individually in each channel, allowing for differences in lesion appearance across modalities, an important feature of many brain tumor imaging sequences. We also propose discriminative model extensions to map the output of the generative model to arbitrary labels with semantic and biological meaning, such as "tumor core" or "fluid-filled structure", but without a one-to-one correspondence to the hypo- or hyper-intense lesion areas identified by the generative model. We test the approach in two image sets: the publicly available BRATS set of glioma patient scans, and multimodal brain images of patients with acute and subacute ischemic stroke. We find the generative model that has been designed for tumor lesions to generalize well to stroke images, and the extended generative-discriminative model to be one of the top ranking methods in the BRATS evaluation.
Probabilistic, meso-scale flood loss modelling
NASA Astrophysics Data System (ADS)
Kreibich, Heidi; Botto, Anna; Schröter, Kai; Merz, Bruno
2016-04-01
Flood risk analyses are an important basis for decisions on flood risk management and adaptation. However, such analyses are associated with significant uncertainty, even more so if changes in risk due to global change are expected. Although uncertainty analysis and probabilistic approaches have received increased attention during the last years, they are still not standard practice for flood risk assessments, let alone for flood loss modelling. The state of the art in flood loss modelling is still the use of simple, deterministic approaches like stage-damage functions. Novel probabilistic, multi-variate flood loss models have been developed and validated on the micro-scale using a data-mining approach, namely bagging decision trees (Merz et al. 2013). In this presentation we demonstrate and evaluate the upscaling of the approach to the meso-scale, namely on the basis of land-use units. The model is applied in 19 municipalities which were affected during the 2002 flood by the River Mulde in Saxony, Germany (Botto et al. submitted). The application of bagging decision tree based loss models provides a probability distribution of estimated loss per municipality. Validation is undertaken on the one hand via a comparison with eight deterministic loss models, including stage-damage functions as well as multi-variate models. On the other hand, the results are compared with official loss data provided by the Saxon Relief Bank (SAB). The results show that uncertainties of loss estimation remain high. Thus, the significant advantage of this probabilistic flood loss estimation approach is that it inherently provides quantitative information about the uncertainty of the prediction. References: Merz, B.; Kreibich, H.; Lall, U. (2013): Multi-variate flood damage assessment: a tree-based data-mining approach. NHESS, 13(1), 53-64. Botto A, Kreibich H, Merz B, Schröter K (submitted) Probabilistic, multi-variable flood loss modelling on the meso-scale with BT-FLEMO. Risk Analysis.
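A compact sketch of the bagging-decision-tree idea for probabilistic loss estimation; BT-FLEMO itself is not reproduced here and the predictors and data are synthetic. The spread of predictions across the individual trees of the ensemble serves as the loss distribution.

```python
import numpy as np
from sklearn.ensemble import BaggingRegressor
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(2)

# Synthetic training data: water depth [m], duration [h], building value [k EUR] -> loss [k EUR]
n = 2000
X = np.column_stack([rng.uniform(0, 3, n), rng.uniform(1, 72, n), rng.uniform(50, 500, n)])
loss = 0.2 * X[:, 0] * X[:, 2] + 0.05 * X[:, 1] + rng.normal(0, 5, n)

model = BaggingRegressor(DecisionTreeRegressor(), n_estimators=200, random_state=0).fit(X, loss)

# Probabilistic prediction for one case: one estimate per tree gives a loss distribution
case = np.array([[1.5, 24.0, 300.0]])
tree_preds = np.array([tree.predict(case)[0] for tree in model.estimators_])
print("loss quantiles (5/50/95%):", np.percentile(tree_preds, [5, 50, 95]).round(1))
```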
Zhang, Lei; Zeng, Zhi; Ji, Qiang
2011-09-01
Chain graph (CG) is a hybrid probabilistic graphical model (PGM) capable of modeling heterogeneous relationships among random variables. So far, however, its application in image and video analysis is very limited due to lack of principled learning and inference methods for a CG of general topology. To overcome this limitation, we introduce methods to extend the conventional chain-like CG model to CG model with more general topology and the associated methods for learning and inference in such a general CG model. Specifically, we propose techniques to systematically construct a generally structured CG, to parameterize this model, to derive its joint probability distribution, to perform joint parameter learning, and to perform probabilistic inference in this model. To demonstrate the utility of such an extended CG, we apply it to two challenging image and video analysis problems: human activity recognition and image segmentation. The experimental results show improved performance of the extended CG model over the conventional directed or undirected PGMs. This study demonstrates the promise of the extended CG for effective modeling and inference of complex real-world problems.
A quantitative model of optimal data selection in Wason's selection task.
Hattori, Masasi
2002-10-01
The optimal data selection model proposed by Oaksford and Chater (1994) successfully formalized Wason's selection task (Wason, 1966). The model, however, involved some questionable assumptions and was also not sufficient as a model of the task because it could not provide quantitative predictions of the card selection frequencies. In this paper, the model was revised to provide quantitative fits to the data. The model can predict the selection frequencies of cards based on a selection tendency function (STF), or conversely, it enables the estimation of subjective probabilities from data. Past experimental data were first re-analysed based on the model. In Experiment 1, the superiority of the revised model was shown. However, when the relationship between antecedent and consequent was forced to deviate from the biconditional form, the model was not supported. In Experiment 2, it was shown that sufficient emphasis on probabilistic information can affect participants' performance. A detailed experimental method to sort participants by probabilistic strategies was introduced. Here, the model was supported by a subgroup of participants who used the probabilistic strategy. Finally, the results were discussed from the viewpoint of adaptive rationality.
Probabilistic Modeling of Settlement Risk at Land Disposal Facilities - 12304
DOE Office of Scientific and Technical Information (OSTI.GOV)
Foye, Kevin C.; Soong, Te-Yang
2012-07-01
The long-term reliability of land disposal facility final cover systems - and therefore the overall waste containment - depends on the distortions imposed on these systems by differential settlement/subsidence. The evaluation of differential settlement is challenging because of the heterogeneity of the waste mass (caused by inconsistent compaction, void space distribution, debris-soil mix ratio, waste material stiffness, time-dependent primary compression of the fine-grained soil matrix, long-term creep settlement of the soil matrix and the debris, etc.) at most land disposal facilities. Deterministic approaches to long-term final cover settlement prediction are not able to capture the spatial variability in the waste mass and sub-grade properties which control differential settlement. An alternative, probabilistic solution is to use random fields to model the waste and sub-grade properties. The modeling effort informs the design, construction, operation, and maintenance of land disposal facilities. A probabilistic method to establish design criteria for waste placement and compaction is introduced using the model. Random fields are ideally suited to problems of differential settlement modeling of highly heterogeneous foundations, such as waste. Random fields model the seemingly random spatial distribution of a design parameter, such as compressibility. When used for design, the use of these models prompts the need for probabilistic design criteria. It also allows for a statistical approach to waste placement acceptance criteria. An example design evaluation was performed, illustrating the use of the probabilistic differential settlement simulation methodology to assemble a design guidance chart. The purpose of this design evaluation is to enable the designer to select optimal initial combinations of design slopes and quality control acceptance criteria that yield an acceptable proportion of post-settlement slopes meeting some design minimum. For this specific example, relative density, which can be determined through field measurements, was selected as the field quality control parameter for waste placement. This technique can be extended to include a rigorous performance-based methodology using other parameters (void space criteria, debris-soil mix ratio, pre-loading, etc.). As shown in this example, each parameter range, or sets of parameter ranges can be selected such that they can result in an acceptable, long-term differential settlement according to the probabilistic model. The methodology can also be used to re-evaluate the long-term differential settlement behavior at closed land disposal facilities to identify, if any, problematic facilities so that remedial action (e.g., reinforcement of upper and intermediate waste layers) can be implemented. Considering the inherent spatial variability in waste and earth materials and the need for engineers to apply sound quantitative practices to engineering analysis, it is important to apply the available probabilistic techniques to problems of differential settlement. One such method to implement probability-based differential settlement analyses for the design of landfill final covers has been presented. The design evaluation technique presented is one tool to bridge the gap from deterministic practice to probabilistic practice. (authors)
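A toy illustration of the random-field idea (not the authors' model): a spatially correlated compressibility field along a cover profile is sampled using a Gaussian covariance and a Cholesky factor, settlement is computed per station, and the probability that the post-settlement distortion between adjacent stations exceeds a criterion is estimated by Monte Carlo. All parameters are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(9)

x = np.arange(0.0, 100.0, 2.0)                 # stations along the cover profile [m]
n = len(x)
corr_len, mean_c, cv = 15.0, 0.02, 0.4         # correlation length, mean compressibility, coeff. of variation
H = 20.0                                       # waste thickness [m]

# Squared-exponential covariance of log-compressibility; Cholesky factor for sampling
dist = np.abs(x[:, None] - x[None, :])
sigma_ln = np.sqrt(np.log(1 + cv**2))
mu_ln = np.log(mean_c) - 0.5 * sigma_ln**2
C = sigma_ln**2 * np.exp(-(dist / corr_len) ** 2)
Lchol = np.linalg.cholesky(C + 1e-8 * np.eye(n))   # small nugget for numerical stability

n_sim, limit = 5000, 0.02                      # allowable change in slope (distortion)
exceed = 0
for _ in range(n_sim):
    c = np.exp(mu_ln + Lchol @ rng.standard_normal(n))   # lognormal compressibility field
    settlement = c * H                                    # settlement at each station [m]
    distortion = np.abs(np.diff(settlement)) / np.diff(x)
    exceed += np.any(distortion > limit)
print("P(distortion > limit somewhere along the profile) =", exceed / n_sim)
```

Varying the assumed placement quality (here, the mean and variability of compressibility) and re-running the simulation is the kind of exercise that underlies the design guidance chart described in the abstract.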
Quan-Hoang, Vuong
2016-10-01
Patients have to acquire information to support their decision on choosing a suitable healthcare provider. But in developing countries like Vietnam, accessibility issues remain an obstacle and thus adversely affect both the quality and the costliness of healthcare information. Vietnamese use sources from both health professionals and friends/relatives, especially when the quality of the cheaper Internet-based sources still appears questionable. The search for information from both professionals and friends/relatives incurs some cost, which can be viewed as low or high depending on low or high accessibility to the sources. These views potentially affect their choices. The objective is to investigate the effects of accessibility to medical/health services information on patients' perceived expensiveness of that information, in terms of their labor costs. Two related objectives are a) establishing empirical relations between accessibility to sources and expensiveness; and b) identifying probabilistic trends of the probabilities of perceived expensiveness. There is evidence for established relations among the variables "Convexp" and "Convrel" (all p's < 0.01), indicating that both information sources (experts and friends/relatives) have an influence on patients' perception of information expensiveness. The use of the experts source tends to increase the probability of perceived expensiveness. a) Probabilistic trends show Vietnamese patients have a propensity to value healthcare information highly and do not see it as "expensive"; b) the majority of Vietnamese households still take non-professional advice at their own risk; c) there is more for the public healthcare information system to do to reduce the costliness and risk of information. The Internet-based health service user communities cannot replace this system.
A probabilistic, distributed, recursive mechanism for decision-making in the brain
Gurney, Kevin N.
2018-01-01
Decision formation recruits many brain regions, but the procedure they jointly execute is unknown. Here we characterize its essential composition, using as a framework a novel recursive Bayesian algorithm that makes decisions based on spike-trains with the statistics of those in sensory cortex (MT). Using it to simulate the random-dot-motion task, we demonstrate it quantitatively replicates the choice behaviour of monkeys, whilst predicting losses of otherwise usable information from MT. Its architecture maps to the recurrent cortico-basal-ganglia-thalamo-cortical loops, whose components are all implicated in decision-making. We show that the dynamics of its mapped computations match those of neural activity in the sensorimotor cortex and striatum during decisions, and forecast those of basal ganglia output and thalamus. This also predicts which aspects of neural dynamics are and are not part of inference. Our single-equation algorithm is probabilistic, distributed, recursive, and parallel. Its success at capturing anatomy, behaviour, and electrophysiology suggests that the mechanism implemented by the brain has these same characteristics. PMID:29614077
Quantum-like model of brain's functioning: decision making from decoherence.
Asano, Masanari; Ohya, Masanori; Tanaka, Yoshiharu; Basieva, Irina; Khrennikov, Andrei
2011-07-21
We present a quantum-like model of decision making in games of the Prisoner's Dilemma type. By this model the brain processes information by using representation of mental states in a complex Hilbert space. Driven by the master equation the mental state of a player, say Alice, approaches an equilibrium point in the space of density matrices (representing mental states). This equilibrium state determines Alice's mixed (i.e., probabilistic) strategy. We use a master equation in which quantum physics describes the process of decoherence as the result of interaction with environment. Thus our model is a model of thinking through decoherence of the initially pure mental state. Decoherence is induced by the interaction with memory and the external mental environment. We study (numerically) the dynamics of quantum entropy of Alice's mental state in the process of decision making. We also consider classical entropy corresponding to Alice's choices. We introduce a measure of Alice's diffidence as the difference between classical and quantum entropies of Alice's mental state. We see that (at least in our model example) diffidence decreases (approaching zero) in the process of decision making. Finally, we discuss the problem of neuronal realization of quantum-like dynamics in the brain; especially roles played by lateral prefrontal cortex or/and orbitofrontal cortex. Copyright © 2011 Elsevier Ltd. All rights reserved.
Karsina, Allen; Thompson, Rachel H; Rodriguez, Nicole M; Vanselow, Nicholas R
2012-01-01
We evaluated the effects of differential reinforcement and accurate verbal rules with feedback on the preference for choice and the verbal reports of 6 adults. Participants earned points on a probabilistic schedule by completing the terminal links of a concurrent-chains arrangement in a computer-based game of chance. In free-choice terminal links, participants selected 3 numbers from an 8-number array; in restricted-choice terminal links participants selected the order of 3 numbers preselected by a computer program. A pop-up box then informed the participants if the numbers they selected or ordered matched or did not match numbers generated by the computer but not displayed; matching in a trial resulted in one point earned. In baseline sessions, schedules of reinforcement were equal across free- and restricted-choice arrangements and a running tally of points earned was shown each trial. The effects of differentially reinforcing restricted-choice selections were evaluated using a reversal design. For 4 participants, the effects of providing a running tally of points won by arrangement and verbal rules regarding the schedule of reinforcement were also evaluated using a nonconcurrent multiple-baseline-across-participants design. Results varied across participants but generally demonstrated that (a) preference for choice corresponded more closely to verbal reports of the odds of winning than to reinforcement schedules, (b) rules and feedback were correlated with more accurate verbal reports, and (c) preference for choice corresponded more highly to the relative number of reinforcements obtained across free- and restricted-choice arrangements in a session than to the obtained probability of reinforcement or to verbal reports of the odds of winning. PMID:22754103
Probabilistic Structural Analysis Program
NASA Technical Reports Server (NTRS)
Pai, Shantaram S.; Chamis, Christos C.; Murthy, Pappu L. N.; Stefko, George L.; Riha, David S.; Thacker, Ben H.; Nagpal, Vinod K.; Mital, Subodh K.
2010-01-01
NASA/NESSUS 6.2c is a general-purpose, probabilistic analysis program that computes probability of failure and probabilistic sensitivity measures of engineered systems. Because NASA/NESSUS uses highly computationally efficient and accurate analysis techniques, probabilistic solutions can be obtained even for extremely large and complex models. Once the probabilistic response is quantified, the results can be used to support risk-informed decisions regarding reliability for safety-critical and one-of-a-kind systems, as well as for maintaining a level of quality while reducing manufacturing costs for larger-quantity products. NASA/NESSUS has been successfully applied to a diverse range of problems in aerospace, gas turbine engines, biomechanics, pipelines, defense, weaponry, and infrastructure. This program combines state-of-the-art probabilistic algorithms with general-purpose structural analysis and life-prediction (lifing) methods to compute the probabilistic response and reliability of engineered structures. Uncertainties in load, material properties, geometry, boundary conditions, and initial conditions can be simulated. The structural analysis methods include non-linear finite-element methods, heat-transfer analysis, polymer/ceramic matrix composite analysis, monolithic (conventional metallic) materials life-prediction methodologies, boundary element methods, and user-written subroutines. Several probabilistic algorithms are available such as the advanced mean value method and the adaptive importance sampling method. NASA/NESSUS 6.2c is structured in a modular format with 15 elements.
What is the Value Added to Adaptation Planning by Probabilistic Projections of Climate Change?
NASA Astrophysics Data System (ADS)
Wilby, R. L.
2008-12-01
Probabilistic projections of climate change offer new sources of risk information to support regional impacts assessment and adaptation options appraisal. However, questions continue to surround how best to apply these scenarios in a practical context, and whether the added complexity and computational burden lead to more robust decision-making. This paper provides an overview of recent efforts in the UK to 'bench-test' frameworks for employing probabilistic projections ahead of the release of the next-generation UKCIP08 projections (in November 2008). This involves close collaboration between government agencies and the research and stakeholder communities. Three examples are cited to illustrate how probabilistic projections are already informing decisions about future flood risk management in London, water resource planning in trial river basins, and assessments of risks from rising water temperatures to Atlantic salmon stocks in southern England. When compared with conventional deterministic scenarios, ensemble projections allow exploration of a wider range of management options and highlight timescales for implementing adaptation measures. Users of probabilistic scenarios must keep in mind that other uncertainties (e.g., due to impacts model structure and parameterisation) should be handled as rigorously as those arising from climate models and emission scenarios. Finally, it is noted that a commitment to long-term monitoring is also critical for tracking environmental change, testing model projections, and evaluating the success (or not) of any scenario-led interventions.
Environmental probabilistic quantitative assessment methodologies
Crovelli, R.A.
1995-01-01
In this paper, four petroleum resource assessment methodologies are presented as possible pollution assessment methodologies, even though petroleum as a resource is desirable, whereas pollution is undesirable. A methodology is defined in this paper to consist of a probability model and a probabilistic method, where the method is used to solve the model. The following four basic types of probability models are considered: 1) direct assessment, 2) accumulation size, 3) volumetric yield, and 4) reservoir engineering. Three of the four petroleum resource assessment methodologies were written as microcomputer systems, viz. TRIAGG for direct assessment, APRAS for accumulation size, and FASPU for reservoir engineering. A fourth microcomputer system termed PROBDIST supports the three assessment systems. The three assessment systems have different probability models but the same type of probabilistic method. The advantages of the analytic method are computational speed and flexibility, making it ideal for a microcomputer.
Classification of Company Performance using Weighted Probabilistic Neural Network
NASA Astrophysics Data System (ADS)
Yasin, Hasbi; Waridi Basyiruddin Arifin, Adi; Warsito, Budi
2018-05-01
Company performance can be classified by examining financial status, whether good or bad. The classification can be achieved with either parametric or non-parametric approaches; the neural network is one of the non-parametric methods. One Artificial Neural Network (ANN) model is the Probabilistic Neural Network (PNN). A PNN consists of four layers: an input layer, a pattern layer, a summation (addition) layer, and an output layer. The distance function used is the Euclidean distance, and all classes share the same weights. In this study, a PNN modified in the weighting between the pattern layer and the summation layer, using the Mahalanobis distance, is applied. This model is called the Weighted Probabilistic Neural Network (WPNN). The results show that modeling company performance with the WPNN achieves very high accuracy, reaching 100%.
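To make the WPNN idea concrete, here is a minimal sketch (not the authors' code) of a PNN-style classifier in which the usual Euclidean kernel in the pattern layer is replaced by a Mahalanobis-distance kernel computed from a pooled covariance; the financial-ratio data, class labels, and smoothing parameter are purely illustrative assumptions.

```python
import numpy as np

def wpnn_predict(X_train, y_train, X_test, sigma=1.0):
    """Minimal PNN-style classifier with a Mahalanobis-distance kernel.

    Each training pattern contributes a Gaussian kernel evaluated with the
    Mahalanobis distance (pooled covariance); the summation layer averages
    kernel activations per class and the output layer picks the largest.
    """
    classes = np.unique(y_train)
    # Pooled covariance of the training data, regularised for invertibility.
    cov = np.cov(X_train, rowvar=False) + 1e-6 * np.eye(X_train.shape[1])
    cov_inv = np.linalg.inv(cov)

    preds = []
    for x in X_test:
        class_scores = []
        for c in classes:
            diff = X_train[y_train == c] - x                         # pattern layer
            d2 = np.einsum('ij,jk,ik->i', diff, cov_inv, diff)       # squared Mahalanobis distances
            class_scores.append(np.mean(np.exp(-d2 / (2.0 * sigma ** 2))))  # summation layer
        preds.append(classes[int(np.argmax(class_scores))])          # output layer
    return np.array(preds)

# Illustrative financial-ratio data: two performance classes (0 = bad, 1 = good).
rng = np.random.default_rng(0)
X_good = rng.normal([0.2, 1.5], 0.1, size=(20, 2))
X_bad = rng.normal([-0.1, 0.8], 0.1, size=(20, 2))
X = np.vstack([X_good, X_bad])
y = np.array([1] * 20 + [0] * 20)
print(wpnn_predict(X, y, np.array([[0.19, 1.45], [-0.05, 0.85]])))
```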
Don't Fear Optimality: Sampling for Probabilistic-Logic Sequence Models
NASA Astrophysics Data System (ADS)
Thon, Ingo
One of the current challenges in artificial intelligence is modeling dynamic environments that change due to the actions or activities undertaken by people or agents. The task of inferring hidden states, e.g. the activities or intentions of people, based on observations is called filtering. Standard probabilistic models such as Dynamic Bayesian Networks are able to solve this task efficiently using approximative methods such as particle filters. However, these models do not support logical or relational representations. The key contribution of this paper is the upgrade of a particle filter algorithm for use with a probabilistic logical representation through the definition of a proposal distribution. The performance of the algorithm depends largely on how well this distribution fits the target distribution. We adopt the idea of logical compilation into Binary Decision Diagrams for sampling. This allows us to use the optimal proposal distribution which is normally prohibitively slow.
Parsons, Thomas E.; Geist, Eric L.
2009-01-01
The idea that faults rupture in repeated, characteristic earthquakes is central to most probabilistic earthquake forecasts. The concept is elegant in its simplicity, and if the same event has repeated itself multiple times in the past, we might anticipate the next. In practice however, assembling a fault-segmented characteristic earthquake rupture model can grow into a complex task laden with unquantified uncertainty. We weigh the evidence that supports characteristic earthquakes against a potentially simpler model made from extrapolation of a Gutenberg–Richter magnitude-frequency law to individual fault zones. We find that the Gutenberg–Richter model satisfies key data constraints used for earthquake forecasting equally well as a characteristic model. Therefore, judicious use of instrumental and historical earthquake catalogs enables large-earthquake-rate calculations with quantifiable uncertainty that should get at least equal weighting in probabilistic forecasting.
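As an illustration of the Gutenberg–Richter alternative discussed above, the sketch below extrapolates a magnitude-frequency law fitted to a catalog into a large-earthquake rate and a Poisson exceedance probability; the a- and b-values are illustrative assumptions, not values from the paper.

```python
import numpy as np

def gr_annual_rate(a, b, m):
    """Cumulative annual rate of events with magnitude >= m
    from the Gutenberg-Richter law log10 N(>=m) = a - b*m."""
    return 10.0 ** (a - b * m)

# Illustrative values for a single fault zone (not from the paper):
a, b = 4.0, 1.0          # fitted from an instrumental/historical catalog
rate_m7 = gr_annual_rate(a, b, 7.0)
print(f"annual rate of M>=7 events:   {rate_m7:.4f}")
print(f"mean recurrence interval:     {1.0 / rate_m7:.0f} years")
# Poisson probability of at least one M>=7 event in a 50-year window:
print(f"50-yr exceedance probability: {1.0 - np.exp(-rate_m7 * 50):.2f}")
```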
Risk assessment of turbine rotor failure using probabilistic ultrasonic non-destructive evaluations
NASA Astrophysics Data System (ADS)
Guan, Xuefei; Zhang, Jingdan; Zhou, S. Kevin; Rasselkorde, El Mahjoub; Abbasi, Waheed A.
2014-02-01
The study presents a method and application of risk assessment methodology for turbine rotor fatigue failure using probabilistic ultrasonic nondestructive evaluations. A rigorous probabilistic modeling for ultrasonic flaw sizing is developed by incorporating the model-assisted probability of detection, and the probability density function (PDF) of the actual flaw size is derived. Two general scenarios, namely the ultrasonic inspection with an identified flaw indication and the ultrasonic inspection without flaw indication, are considered in the derivation. To perform estimations for fatigue reliability and remaining useful life, uncertainties from ultrasonic flaw sizing and fatigue model parameters are systematically included and quantified. The model parameter PDF is estimated using Bayesian parameter estimation and actual fatigue testing data. The overall method is demonstrated using a realistic application of steam turbine rotor, and the risk analysis under given safety criteria is provided to support maintenance planning.
NASA Technical Reports Server (NTRS)
Boyce, Lola; Lovelace, Thomas B.
1989-01-01
FORTRAN program RANDOM2 is presented in the form of a user's manual. RANDOM2 is based on fracture mechanics using a probabilistic fatigue crack growth model. It predicts the random lifetime of an engine component to reach a given crack size. Details of the theoretical background, input data instructions, and a sample problem illustrating the use of the program are included.
NASA Technical Reports Server (NTRS)
Boyce, Lola; Lovelace, Thomas B.
1989-01-01
FORTRAN programs RANDOM3 and RANDOM4 are documented in the form of a user's manual. Both programs are based on fatigue strength reduction, using a probabilistic constitutive model. The programs predict the random lifetime of an engine component to reach a given fatigue strength. The theoretical backgrounds, input data instructions, and sample problems illustrating the use of the programs are included.
Bayesian Probabilistic Projection of International Migration.
Azose, Jonathan J; Raftery, Adrian E
2015-10-01
We propose a method for obtaining joint probabilistic projections of migration for all countries, broken down by age and sex. Joint trajectories for all countries are constrained to satisfy the requirement of zero global net migration. We evaluate our model using out-of-sample validation and compare point projections to the projected migration rates from a persistence model similar to the method used in the United Nations' World Population Prospects, and also to a state-of-the-art gravity model.
Probabilistic Estimates of Global Mean Sea Level and its Underlying Processes
NASA Astrophysics Data System (ADS)
Hay, C.; Morrow, E.; Kopp, R. E.; Mitrovica, J. X.
2015-12-01
Local sea level can vary significantly from the global mean value due to a suite of processes that includes ongoing sea-level changes due to the last ice age, land water storage, ocean circulation changes, and non-uniform sea-level changes that arise when modern-day land ice rapidly melts. Understanding these sources of spatial and temporal variability is critical to estimating past and present sea-level change and projecting future sea-level rise. Using two probabilistic techniques, a multi-model Kalman smoother and Gaussian process regression, we have reanalyzed 20th century tide gauge observations to produce a new estimate of global mean sea level (GMSL). Our methods allow us to extract global information from the sparse tide gauge field by taking advantage of the physics-based and model-derived geometry of the contributing processes. Both methods provide constraints on the sea-level contribution of glacial isostatic adjustment (GIA). The Kalman smoother tests multiple discrete models of glacial isostatic adjustment (GIA), probabilistically computing the most likely GIA model given the observations, while the Gaussian process regression characterizes the prior covariance structure of a suite of GIA models and then uses this structure to estimate the posterior distribution of local rates of GIA-induced sea-level change. We present the two methodologies, the model-derived geometries of the underlying processes, and our new probabilistic estimates of GMSL and GIA.
Praveen, Paurush; Fröhlich, Holger
2013-01-01
Inferring regulatory networks from experimental data via probabilistic graphical models is a popular framework to gain insights into biological systems. However, the inherent noise in experimental data coupled with a limited sample size reduces the performance of network reverse engineering. Prior knowledge from existing sources of biological information can address this low signal to noise problem by biasing the network inference towards biologically plausible network structures. Although integrating various sources of information is desirable, their heterogeneous nature makes this task challenging. We propose two computational methods to incorporate various information sources into a probabilistic consensus structure prior to be used in graphical model inference. Our first model, called Latent Factor Model (LFM), assumes a high degree of correlation among external information sources and reconstructs a hidden variable as a common source in a Bayesian manner. The second model, a Noisy-OR, picks up the strongest support for an interaction among information sources in a probabilistic fashion. Our extensive computational studies on KEGG signaling pathways as well as on gene expression data from breast cancer and yeast heat shock response reveal that both approaches can significantly enhance the reconstruction accuracy of Bayesian Networks compared to other competing methods as well as to the situation without any prior. Our framework allows for using diverse information sources, like pathway databases, GO terms and protein domain data, etc. and is flexible enough to integrate new sources, if available.
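A minimal sketch of the Noisy-OR style combination described above, assuming each information source contributes a support value and a reliability for a candidate interaction; the numbers and the simple per-edge form are illustrative, not the authors' implementation.

```python
import numpy as np

def noisy_or_prior(support, reliability):
    """Noisy-OR consensus prior for one candidate network edge.

    support[k]    : probability that source k reports the interaction (0..1)
    reliability[k]: probability that source k is correct when it reports
    The edge prior is 1 - prod_k (1 - reliability[k] * support[k]).
    """
    support = np.asarray(support, dtype=float)
    reliability = np.asarray(reliability, dtype=float)
    return 1.0 - np.prod(1.0 - reliability * support)

# Illustrative: three heterogeneous sources (pathway DB, GO co-annotation, domain data).
print(noisy_or_prior(support=[1.0, 0.6, 0.2], reliability=[0.9, 0.5, 0.5]))
```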
Modelling default and likelihood reasoning as probabilistic
NASA Technical Reports Server (NTRS)
Buntine, Wray
1990-01-01
A probabilistic analysis of plausible reasoning about defaults and about likelihood is presented. 'Likely' and 'by default' are in fact treated as duals in the same sense as 'possibility' and 'necessity'. To model these four forms probabilistically, a logic QDP and its quantitative counterpart DP are derived that allow qualitative and corresponding quantitative reasoning. Consistency and consequence results for subsets of the logics are given that require at most a quadratic number of satisfiability tests in the underlying propositional logic. The quantitative logic shows how to track the propagation error inherent in these reasoning forms. The methodology and sound framework of the system highlight their approximate nature, the dualities, and the need for complementary reasoning about relevance.
The application of probabilistic design theory to high temperature low cycle fatigue
NASA Technical Reports Server (NTRS)
Wirsching, P. H.
1981-01-01
Metal fatigue under stress and thermal cycling is a principal mode of failure in gas turbine engine hot section components such as turbine blades and disks and combustor liners. Designing for fatigue is subject to considerable uncertainty, e.g., scatter in cycles to failure, available fatigue test data and operating environment data, uncertainties in the models used to predict stresses, etc. Methods of analyzing fatigue test data for probabilistic design purposes are summarized. The general strain life as well as homo- and hetero-scedastic models are considered. Modern probabilistic design theory is reviewed and examples are presented which illustrate application to reliability analysis of gas turbine engine components.
NASA Technical Reports Server (NTRS)
Bast, Callie Corinne Scheidt
1994-01-01
This thesis presents the on-going development of methodology for a probabilistic material strength degradation model. The probabilistic model, in the form of a postulated randomized multifactor equation, provides for quantification of uncertainty in the lifetime material strength of aerospace propulsion system components subjected to a number of diverse random effects. This model is embodied in the computer program entitled PROMISS, which can include up to eighteen different effects. Presently, the model includes four effects that typically reduce lifetime strength: high temperature, mechanical fatigue, creep, and thermal fatigue. Statistical analysis was conducted on experimental Inconel 718 data obtained from the open literature. This analysis provided regression parameters for use as the model's empirical material constants, thus calibrating the model specifically for Inconel 718. Model calibration was carried out for four variables, namely, high temperature, mechanical fatigue, creep, and thermal fatigue. Methodology to estimate standard deviations of these material constants for input into the probabilistic material strength model was developed. Using the current version of PROMISS, entitled PROMISS93, a sensitivity study for the combined effects of mechanical fatigue, creep, and thermal fatigue was performed. Results, in the form of cumulative distribution functions, illustrated the sensitivity of lifetime strength to any current value of an effect. In addition, verification studies comparing a combination of mechanical fatigue and high temperature effects by model to the combination by experiment were conducted. Thus, for Inconel 718, the basic model assumption of independence between effects was evaluated. Results from this limited verification study strongly supported this assumption.
2011-11-01
assessment to quality of localization/characterization estimates. This protocol includes four critical components: (1) a procedure to identify the ... critical factors impacting SHM system performance; (2) a multistage or hierarchical approach to SHM system validation; (3) a model-assisted evaluation ... Lindgren, E. A., Buynak, C. F., Steffes, G., Derriso, M., "Model-assisted Probabilistic Reliability Assessment for Structural Health Monitoring"
Probabilistic Model Development
NASA Technical Reports Server (NTRS)
Adam, James H., Jr.
2010-01-01
Objective: Develop a Probabilistic Model for the Solar Energetic Particle Environment. Develop a tool to provide a reference solar particle radiation environment that: 1) Will not be exceeded at a user-specified confidence level; 2) Will provide reference environments for: a) Peak flux; b) Event-integrated fluence; and c) Mission-integrated fluence. The reference environments will consist of elemental energy spectra for protons, helium, and heavier ions.
Probabilistic Finite Element Analysis & Design Optimization for Structural Designs
NASA Astrophysics Data System (ADS)
Deivanayagam, Arumugam
This study focuses on incorporating the probabilistic nature of material properties (Kevlar® 49) into the existing deterministic finite element analysis (FEA) of a fabric-based engine containment system through Monte Carlo simulations (MCS), and on implementing probabilistic analysis in engineering designs through Reliability Based Design Optimization (RBDO). First, the emphasis is on experimental data analysis focusing on probabilistic distribution models that characterize the randomness associated with the experimental data. The material properties of Kevlar® 49 are modeled using experimental data analysis and implemented along with an existing spiral modeling scheme (SMS) and a user-defined constitutive model (UMAT) for fabric-based engine containment simulations in LS-DYNA. MCS of the model are performed to observe the failure patterns and exit velocities, and the solutions are compared with NASA experimental tests and deterministic results. MCS with probabilistic material data give a better perspective on the results than a single deterministic simulation. The next part of the research implements the probabilistic material properties in engineering designs. The main aim of structural design is to obtain optimal solutions. However, in a deterministic optimization problem, even though the structures are cost effective, they become highly unreliable if the uncertainty that may be associated with the system (material properties, loading, etc.) is not represented or considered in the solution process. Reliable and optimal solutions can be obtained by performing reliability optimization along with the deterministic optimization, which is RBDO. In the RBDO problem formulation, reliability constraints are considered in addition to structural performance constraints. This part of the research starts with an introduction to reliability analysis, such as first-order and second-order reliability analysis, followed by simulation techniques that are performed to obtain the probability of failure and reliability of structures. Next, a decoupled RBDO procedure is proposed with a new reliability analysis formulation with sensitivity analysis, which is performed to remove the highly reliable constraints in the RBDO, thereby reducing the computational time and function evaluations. Finally, implementations of the reliability analysis concepts and RBDO in finite element 2D truss problems and a planar beam problem are presented and discussed.
NASA Astrophysics Data System (ADS)
Xu, Lei; Zhai, Wanming; Gao, Jianmin
2017-11-01
Track irregularities inevitably evolve stochastically due to the uncertainty and continuity of wheel-rail interactions. To depict thoroughly the dynamic behaviours of the vehicle-track coupling system caused by random track irregularities, it is necessary to develop a track irregularity probabilistic model that simulates rail surface irregularities with ergodic properties in amplitudes, wavelengths and probabilities, and to build a three-dimensional vehicle-track coupled model that properly considers the nonlinear wheel-rail contact mechanisms. In the present study, the vehicle-track coupled model is first programmed by combining the finite element method with a wheel-rail coupling model. Then, in light of the capability of the power spectral density (PSD) to characterise the amplitudes and wavelengths of stationary random signals, a track irregularity probabilistic model is presented to reveal and simulate the whole characteristics of the track irregularity PSD. Finally, extended applications in three areas, namely extreme analysis, reliability analysis and response relationships between dynamic indices, are carried out to evaluate and apply the proposed models.
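A minimal sketch of one common way to realise such a PSD-based irregularity model: a spectral-representation (random-phase cosine) synthesis that draws sample irregularity profiles whose amplitudes and wavelengths follow a target one-sided spatial PSD. The power-law PSD, wavelength band, and sampling interval are illustrative assumptions, not the paper's model.

```python
import numpy as np

def synthesize_irregularity(psd, wavenumbers, dx, n_points, rng=None):
    """Generate one random irregularity sample from a one-sided spatial PSD.

    Spectral-representation method: a sum of cosines with amplitudes fixed by
    the PSD and independent uniform random phases, so amplitudes, wavelengths
    and probabilities follow the target spectrum on average.
    """
    rng = np.random.default_rng(rng)
    x = np.arange(n_points) * dx
    dk = np.diff(wavenumbers).mean()
    phases = rng.uniform(0.0, 2.0 * np.pi, size=wavenumbers.size)
    amplitudes = np.sqrt(2.0 * psd * dk)
    # profile(x) = sum_k A_k * cos(2*pi*k*x + phi_k)
    profile = (amplitudes[None, :] * np.cos(2.0 * np.pi * x[:, None] * wavenumbers[None, :]
                                            + phases[None, :])).sum(axis=1)
    return x, profile

# Illustrative target: a simple power-law PSD over wavelengths of 2-100 m.
k = np.linspace(1.0 / 100.0, 1.0 / 2.0, 400)   # spatial frequency [1/m]
S = 1e-6 / (k ** 3)                            # one-sided PSD [m^2/(1/m)], illustrative only
x, profile = synthesize_irregularity(S, k, dx=0.25, n_points=4000, rng=1)
print(profile.std())                           # sample standard deviation of the profile
```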
Quantifying introgression risk with realistic population genetics.
Ghosh, Atiyo; Meirmans, Patrick G; Haccou, Patsy
2012-12-07
Introgression is the permanent incorporation of genes from the genome of one population into another. This can have severe consequences, such as extinction of endemic species, or the spread of transgenes. Quantification of the risk of introgression is an important component of genetically modified crop regulation. Most theoretical introgression studies aimed at such quantification disregard one or more of the most important factors concerning introgression: realistic genetical mechanisms, repeated invasions and stochasticity. In addition, the use of linkage as a risk mitigation strategy has not been studied properly yet with genetic introgression models. Current genetic introgression studies fail to take repeated invasions and demographic stochasticity into account properly, and use incorrect measures of introgression risk that can be manipulated by arbitrary choices. In this study, we present proper methods for risk quantification that overcome these difficulties. We generalize a probabilistic risk measure, the so-called hazard rate of introgression, for application to introgression models with complex genetics and small natural population sizes. We illustrate the method by studying the effects of linkage and recombination on transgene introgression risk at different population sizes.
Quantifying introgression risk with realistic population genetics
Ghosh, Atiyo; Meirmans, Patrick G.; Haccou, Patsy
2012-01-01
Introgression is the permanent incorporation of genes from the genome of one population into another. This can have severe consequences, such as extinction of endemic species, or the spread of transgenes. Quantification of the risk of introgression is an important component of genetically modified crop regulation. Most theoretical introgression studies aimed at such quantification disregard one or more of the most important factors concerning introgression: realistic genetical mechanisms, repeated invasions and stochasticity. In addition, the use of linkage as a risk mitigation strategy has not been studied properly yet with genetic introgression models. Current genetic introgression studies fail to take repeated invasions and demographic stochasticity into account properly, and use incorrect measures of introgression risk that can be manipulated by arbitrary choices. In this study, we present proper methods for risk quantification that overcome these difficulties. We generalize a probabilistic risk measure, the so-called hazard rate of introgression, for application to introgression models with complex genetics and small natural population sizes. We illustrate the method by studying the effects of linkage and recombination on transgene introgression risk at different population sizes. PMID:23055068
Squared exponential covariance function for prediction of hydrocarbon in seabed logging application
NASA Astrophysics Data System (ADS)
Mukhtar, Siti Mariam; Daud, Hanita; Dass, Sarat Chandra
2016-11-01
Seabed logging (SBL) technology has progressively emerged as one of the most in-demand technologies in the Exploration and Production (E&P) industry. Hydrocarbon prediction in deep-water areas is a crucial task for a driller in any oil and gas company, as drilling cost is very expensive. Simulation data generated by Computer Software Technology (CST) are used to predict the presence of hydrocarbon, where the models replicate a real SBL environment. These models indicate that hydrocarbon-filled reservoirs are more resistive than the surrounding water-filled sediments. However, as hydrocarbon depth increases, it becomes more challenging to differentiate data with and without hydrocarbon. MATLAB is used for data extraction and for the curve-fitting process using Gaussian processes (GP). GP methods can address both regression and classification problems; this work focuses only on the Gaussian process regression (GPR) problem. The most popular choice of covariance function for GPR is the squared exponential (SE), as it provides stability and probabilistic prediction for large amounts of data. Hence, the SE is used to predict the presence or absence of hydrocarbon in the reservoir from the generated data.
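A minimal sketch of GP regression with the squared exponential covariance, using synthetic one-dimensional data as a stand-in for the extracted SBL curves; the hyperparameters, noise level, and data are illustrative assumptions rather than values from the study.

```python
import numpy as np

def se_kernel(x1, x2, length_scale=1.0, signal_var=1.0):
    """Squared exponential covariance k(x, x') = s^2 * exp(-(x - x')^2 / (2 l^2))."""
    d2 = (x1[:, None] - x2[None, :]) ** 2
    return signal_var * np.exp(-0.5 * d2 / length_scale ** 2)

def gpr_predict(x_train, y_train, x_test, length_scale=1.0, signal_var=1.0, noise_var=1e-4):
    """Standard GP regression predictive mean and variance at the test points."""
    K = se_kernel(x_train, x_train, length_scale, signal_var) + noise_var * np.eye(x_train.size)
    K_s = se_kernel(x_test, x_train, length_scale, signal_var)
    alpha = np.linalg.solve(K, y_train)
    mean = K_s @ alpha
    v = np.linalg.solve(K, K_s.T)
    var = signal_var - np.sum(K_s * v.T, axis=1) + noise_var
    return mean, var

# Illustrative curve-fitting example (stand-in for SBL amplitude-versus-offset data).
x = np.linspace(0.0, 10.0, 25)
y = np.sin(x) + 0.05 * np.random.default_rng(0).normal(size=x.size)
mu, var = gpr_predict(x, y, np.linspace(0.0, 10.0, 5), length_scale=1.5)
print(np.round(mu, 2), np.round(np.sqrt(var), 3))   # predictive mean and standard deviation
```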
Testing for ontological errors in probabilistic forecasting models of natural systems
Marzocchi, Warner; Jordan, Thomas H.
2014-01-01
Probabilistic forecasting models describe the aleatory variability of natural systems as well as our epistemic uncertainty about how the systems work. Testing a model against observations exposes ontological errors in the representation of a system and its uncertainties. We clarify several conceptual issues regarding the testing of probabilistic forecasting models for ontological errors: the ambiguity of the aleatory/epistemic dichotomy, the quantification of uncertainties as degrees of belief, the interplay between Bayesian and frequentist methods, and the scientific pathway for capturing predictability. We show that testability of the ontological null hypothesis derives from an experimental concept, external to the model, that identifies collections of data, observed and not yet observed, that are judged to be exchangeable when conditioned on a set of explanatory variables. These conditional exchangeability judgments specify observations with well-defined frequencies. Any model predicting these behaviors can thus be tested for ontological error by frequentist methods; e.g., using P values. In the forecasting problem, prior predictive model checking, rather than posterior predictive checking, is desirable because it provides more severe tests. We illustrate experimental concepts using examples from probabilistic seismic hazard analysis. Severe testing of a model under an appropriate set of experimental concepts is the key to model validation, in which we seek to know whether a model replicates the data-generating process well enough to be sufficiently reliable for some useful purpose, such as long-term seismic forecasting. Pessimistic views of system predictability fail to recognize the power of this methodology in separating predictable behaviors from those that are not. PMID:25097265
Evaluation of Lithofacies Up-Scaling Methods for Probabilistic Prediction of Carbon Dioxide Behavior
NASA Astrophysics Data System (ADS)
Park, J. Y.; Lee, S.; Lee, Y. I.; Kihm, J. H.; Kim, J. M.
2017-12-01
Behavior of carbon dioxide injected into target reservoir (storage) formations is highly dependent on heterogeneities of geologic lithofacies and properties. These heterogeneous lithofacies and properties basically have probabilistic characteristics. Thus, their probabilistic evaluation has to be implemented properly into predicting behavior of injected carbon dioxide in heterogeneous storage formations. In this study, a series of three-dimensional geologic modeling is performed first using SKUA-GOCAD (ASGA and Paradigm) to establish lithofacies models of the Janggi Conglomerate in the Janggi Basin, Korea within a modeling domain. The Janggi Conglomerate is composed of mudstone, sandstone, and conglomerate, and it has been identified as a potential reservoir rock (clastic saline formation) for geologic carbon dioxide storage. Its lithofacies information are obtained from four boreholes and used in lithofacies modeling. Three different up-scaling methods (i.e., nearest to cell center, largest proportion, and random) are applied, and lithofacies modeling is performed 100 times for each up-scaling method. The lithofacies models are then compared and analyzed with the borehole data to evaluate the relative suitability of the three up-scaling methods. Finally, the lithofacies models are converted into coarser lithofacies models within the same modeling domain with larger grid blocks using the three up-scaling methods, and a series of multiphase thermo-hydrological numerical simulation is performed using TOUGH2-MP (Zhang et al., 2008) to predict probabilistically behavior of injected carbon dioxide. The coarser lithofacies models are also compared and analyzed with the borehole data and finer lithofacies models to evaluate the relative suitability of the three up-scaling methods. Three-dimensional geologic modeling, up-scaling, and multiphase thermo-hydrological numerical simulation as linked methodologies presented in this study can be utilized as a practical probabilistic evaluation tool to predict behavior of injected carbon dioxide and even to analyze its leakage risk. This work was supported by the Korea CCS 2020 Project of the Korea Carbon Capture and Sequestration R&D Center (KCRC) funded by the National Research Foundation (NRF), Ministry of Science and ICT (MSIT), Korea.
NASA Astrophysics Data System (ADS)
Naseri Kouzehgarani, Asal
2009-12-01
Most models of aircraft trajectories are non-linear and stochastic in nature; and their internal parameters are often poorly defined. The ability to model, simulate and analyze realistic air traffic management conflict detection scenarios in a scalable, composable, multi-aircraft fashion is an extremely difficult endeavor. Accurate techniques for aircraft mode detection are critical in order to enable the precise projection of aircraft conflicts, and for the enactment of altitude separation resolution strategies. Conflict detection is an inherently probabilistic endeavor; our ability to detect conflicts in a timely and accurate manner over a fixed time horizon is traded off against the increased human workload created by false alarms---that is, situations that would not develop into an actual conflict, or would resolve naturally in the appropriate time horizon-thereby introducing a measure of probabilistic uncertainty in any decision aid fashioned to assist air traffic controllers. The interaction of the continuous dynamics of the aircraft, used for prediction purposes, with the discrete conflict detection logic gives rise to the hybrid nature of the overall system. The introduction of the probabilistic element, common to decision alerting and aiding devices, places the conflict detection and resolution problem in the domain of probabilistic hybrid phenomena. A hidden Markov model (HMM) has two stochastic components: a finite-state Markov chain and a finite set of output probability distributions. In other words an unobservable stochastic process (hidden) that can only be observed through another set of stochastic processes that generate the sequence of observations. The problem of self separation in distributed air traffic management reduces to the ability of aircraft to communicate state information to neighboring aircraft, as well as model the evolution of aircraft trajectories between communications, in the presence of probabilistic uncertain dynamics as well as partially observable and uncertain data. We introduce the Hybrid Hidden Markov Modeling (HHMM) formalism to enable the prediction of the stochastic aircraft states (and thus, potential conflicts), by combining elements of the probabilistic timed input output automaton and the partially observable Markov decision process frameworks, along with the novel addition of a Markovian scheduler to remove the non-deterministic elements arising from the enabling of several actions simultaneously. Comparisons of aircraft in level, climbing/descending and turning flight are performed, and unknown flight track data is evaluated probabilistically against the tuned model in order to assess the effectiveness of the model in detecting the switch between multiple flight modes for a given aircraft. This also allows for the generation of probabilistic distribution over the execution traces of the hybrid hidden Markov model, which then enables the prediction of the states of aircraft based on partially observable and uncertain data. Based on the composition properties of the HHMM, we study a decentralized air traffic system where aircraft are moving along streams and can perform cruise, accelerate, climb and turn maneuvers. We develop a common decentralized policy for conflict avoidance with spatially distributed agents (aircraft in the sky) and assure its safety properties via correctness proofs.
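As a stripped-down illustration of the HMM filtering that underlies the HHMM, the sketch below runs the standard discrete forward recursion to infer a hidden flight mode from noisy observations; the three-mode structure and the transition and observation matrices are illustrative assumptions, and the hybrid continuous dynamics of the thesis are not modelled here.

```python
import numpy as np

def hmm_filter(pi, A, B, observations):
    """Forward (filtering) recursion for a discrete HMM.

    pi : initial state distribution, shape (S,)
    A  : state transition matrix, A[i, j] = P(next = j | current = i)
    B  : observation matrix, B[i, k] = P(obs = k | state = i)
    Returns the filtered distribution P(state_t | obs_1..t) for each t.
    """
    belief = np.asarray(pi, dtype=float)
    history = []
    for obs in observations:
        belief = (belief @ A) * B[:, obs]   # predict, then correct
        belief /= belief.sum()              # normalise
        history.append(belief.copy())
    return np.array(history)

# Illustrative 3-mode model: 0 = level, 1 = climb/descend, 2 = turn.
pi = np.array([0.8, 0.1, 0.1])
A = np.array([[0.90, 0.05, 0.05],
              [0.10, 0.85, 0.05],
              [0.10, 0.05, 0.85]])
B = np.array([[0.80, 0.10, 0.10],      # observed feature given level flight
              [0.15, 0.75, 0.10],      # given climb/descend
              [0.15, 0.10, 0.75]])     # given turn
print(hmm_filter(pi, A, B, observations=[0, 1, 1, 2]).round(2))
```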
NASA Astrophysics Data System (ADS)
Appleby, D. M.
2007-02-01
Einstein initially objected to the probabilistic aspect of quantum mechanics—the idea that God is playing at dice. Later he changed his ground, and focussed instead on the point that the Copenhagen Interpretation leads to what Einstein saw as the abandonment of physical realism. We argue here that Einstein's initial intuition was perfectly sound, and that it is precisely the fact that quantum mechanics is a fundamentally probabilistic theory which is at the root of all the controversies regarding its interpretation. Probability is an intrinsically logical concept. This means that the quantum state has an essentially logical significance. It is extremely difficult to reconcile that fact with Einstein's belief, that it is the task of physics to give us a vision of the world apprehended sub specie aeternitatis. Quantum mechanics thus presents us with a simple choice: either to follow Einstein in looking for a theory which is not probabilistic at the fundamental level, or else to accept that physics does not in fact put us in the position of God looking down on things from above. There is a widespread fear that the latter alternative must inevitably lead to a greatly impoverished, positivistic view of physical theory. It appears to us, however, that the truth is just the opposite. The Einsteinian vision is much less attractive than it seems at first sight. In particular, it is closely connected with philosophical reductionism.
A probabilistic assessment of health risks associated with short-term exposure to tropospheric ozone
DOE Office of Scientific and Technical Information (OSTI.GOV)
Whitfield, R.G; Biller, W.F.; Jusko, M.J.
1996-06-01
The work described in this report is part of a larger risk assessment sponsored by the U.S. Environmental Protection Agency. Earlier efforts developed exposure-response relationships for acute health effects among populations engaged in heavy exertion. Those efforts also developed a probabilistic national ambient air quality standards exposure model and a general methodology for integrating probabilistic exposure-response relationships and exposure estimates to calculate overall risk results. Recently published data make it possible to model additional health endpoints (for exposure at moderate exertion), including hospital admissions. New air quality and exposure estimates for alternative national ambient air quality standards for ozone are combined with exposure-response models to produce the risk results for hospital admissions and acute health effects. Sample results explain the methodology and introduce risk output formats.
Mixture Modeling for Background and Sources Separation in x-ray Astronomical Images
NASA Astrophysics Data System (ADS)
Guglielmetti, Fabrizia; Fischer, Rainer; Dose, Volker
2004-11-01
A probabilistic technique for the joint estimation of background and sources in high-energy astrophysics is described. Bayesian probability theory is applied to gain insight into the coexistence of background and sources through a probabilistic two-component mixture model, which provides consistent uncertainties of background and sources. The present analysis is applied to ROSAT PSPC data (0.1-2.4 keV) in Survey Mode. A background map is modelled using a Thin-Plate spline. Source probability maps are obtained for each pixel (45 arcsec) independently and for larger correlation lengths, revealing faint and extended sources. We will demonstrate that the described probabilistic method allows for detection improvement of faint extended celestial sources compared to the Standard Analysis Software System (SASS) used for the production of the ROSAT All-Sky Survey (RASS) catalogues.
Probabilistic failure assessment with application to solid rocket motors
NASA Technical Reports Server (NTRS)
Jan, Darrell L.; Davidson, Barry D.; Moore, Nicholas R.
1990-01-01
A quantitative methodology is being developed for assessment of risk of failure of solid rocket motors. This probabilistic methodology employs best available engineering models and available information in a stochastic framework. The framework accounts for incomplete knowledge of governing parameters, intrinsic variability, and failure model specification error. Earlier case studies have been conducted on several failure modes of the Space Shuttle Main Engine. Work in progress on application of this probabilistic approach to large solid rocket boosters such as the Advanced Solid Rocket Motor for the Space Shuttle is described. Failure due to debonding has been selected as the first case study for large solid rocket motors (SRMs) since it accounts for a significant number of historical SRM failures. Impact of incomplete knowledge of governing parameters and failure model specification errors is expected to be important.
An efficient deterministic-probabilistic approach to modeling regional groundwater flow: 1. Theory
Yen, Chung-Cheng; Guymon, Gary L.
1990-01-01
An efficient probabilistic model is developed and cascaded with a deterministic model for predicting water table elevations in regional aquifers. The objective is to quantify model uncertainty where precise estimates of water table elevations may be required. The probabilistic model is based on the two-point probability method, which only requires prior knowledge of the uncertain variables' means and coefficients of variation. The two-point estimate method is theoretically developed and compared with the Monte Carlo simulation method. The results of comparisons using hypothetical deterministic problems indicate that the two-point estimate method is only generally valid for linear problems where the coefficients of variation of uncertain parameters (for example, storage coefficient and hydraulic conductivity) are small. The two-point estimate method may be applied to slightly nonlinear problems with good results, provided coefficients of variation are small. In such cases, the two-point estimate method is much more efficient than the Monte Carlo method provided the number of uncertain variables is less than eight.
An Efficient Deterministic-Probabilistic Approach to Modeling Regional Groundwater Flow: 1. Theory
NASA Astrophysics Data System (ADS)
Yen, Chung-Cheng; Guymon, Gary L.
1990-07-01
An efficient probabilistic model is developed and cascaded with a deterministic model for predicting water table elevations in regional aquifers. The objective is to quantify model uncertainty where precise estimates of water table elevations may be required. The probabilistic model is based on the two-point probability method, which only requires prior knowledge of the uncertain variables' means and coefficients of variation. The two-point estimate method is theoretically developed and compared with the Monte Carlo simulation method. The results of comparisons using hypothetical deterministic problems indicate that the two-point estimate method is only generally valid for linear problems where the coefficients of variation of uncertain parameters (for example, storage coefficient and hydraulic conductivity) are small. The two-point estimate method may be applied to slightly nonlinear problems with good results, provided coefficients of variation are small. In such cases, the two-point estimate method is much more efficient than the Monte Carlo method provided the number of uncertain variables is less than eight.
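A minimal sketch of a Rosenblueth-style two-point estimate compared against Monte Carlo for a simple nonlinear response of two uncertain parameters; the response function and the parameter statistics (stand-ins for hydraulic conductivity and a storage coefficient) are illustrative assumptions, not the groundwater model of the paper.

```python
import itertools
import numpy as np

def two_point_estimate(func, means, stds):
    """Rosenblueth two-point estimate for uncorrelated, symmetric inputs.

    Evaluates func at every combination of (mean - std, mean + std), each
    with weight 1/2^n, and returns the approximate mean and std of the output.
    """
    means, stds = np.asarray(means, float), np.asarray(stds, float)
    samples = []
    for signs in itertools.product((-1.0, 1.0), repeat=means.size):
        samples.append(func(means + np.array(signs) * stds))
    samples = np.array(samples)
    return samples.mean(), samples.std()

# Illustrative mildly nonlinear response of two uncertain parameters
# (stand-ins for hydraulic conductivity K and storage coefficient S):
response = lambda p: p[0] ** 0.8 / (1.0 + p[1])
means, stds = [10.0, 0.2], [1.0, 0.02]   # small coefficients of variation

m2, s2 = two_point_estimate(response, means, stds)

# Monte Carlo reference (many more function evaluations):
rng = np.random.default_rng(0)
mc = np.array([response(rng.normal(means, stds)) for _ in range(20000)])
print(f"two-point:   mean={m2:.3f} std={s2:.3f}   (4 evaluations)")
print(f"Monte Carlo: mean={mc.mean():.3f} std={mc.std():.3f}   (20000 evaluations)")
```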
Zhang, Miaomiao; Wells, William M; Golland, Polina
2017-10-01
We present an efficient probabilistic model of anatomical variability in a linear space of initial velocities of diffeomorphic transformations and demonstrate its benefits in clinical studies of brain anatomy. To overcome the computational challenges of the high dimensional deformation-based descriptors, we develop a latent variable model for principal geodesic analysis (PGA) based on a low dimensional shape descriptor that effectively captures the intrinsic variability in a population. We define a novel shape prior that explicitly represents principal modes as a multivariate complex Gaussian distribution on the initial velocities in a bandlimited space. We demonstrate the performance of our model on a set of 3D brain MRI scans from the Alzheimer's Disease Neuroimaging Initiative (ADNI) database. Our model yields a more compact representation of group variation at substantially lower computational cost than state-of-the-art methods such as tangent space PCA (TPCA) and probabilistic principal geodesic analysis (PPGA) that operate in the high-dimensional image space.
Incorporating seismic phase correlations into a probabilistic model of global-scale seismology
NASA Astrophysics Data System (ADS)
Arora, Nimar
2013-04-01
We present a probabilistic model of seismic phases whereby the attributes of the body-wave phases are correlated to those of the first-arriving P phase. This model has been incorporated into NET-VISA (Network processing Vertically Integrated Seismic Analysis), a probabilistic generative model of seismic events, their transmission, and detection on a global seismic network. In the earlier version of NET-VISA, seismic phases were assumed to be independent of each other. Although this did not, for the most part, affect the quality of the inferred seismic bulletin, it did result in a few instances of anomalous phase association, for example, an S phase with a smaller slowness than the corresponding P phase. We demonstrate that the phase attributes are indeed highly correlated; for example, the uncertainty in the S phase travel time is significantly reduced given the P phase travel time. Our new model exploits these correlations to produce better calibrated probabilities for the events, as well as fewer anomalous associations.
NASA Technical Reports Server (NTRS)
Bast, Callie C.; Boyce, Lola
1995-01-01
The development of methodology for a probabilistic material strength degradation is described. The probabilistic model, in the form of a postulated randomized multifactor equation, provides for quantification of uncertainty in the lifetime material strength of aerospace propulsion system components subjected to a number of diverse random effects. This model is embodied in the computer program entitled PROMISS, which can include up to eighteen different effects. Presently, the model includes five effects that typically reduce lifetime strength: high temperature, high-cycle mechanical fatigue, low-cycle mechanical fatigue, creep and thermal fatigue. Results, in the form of cumulative distribution functions, illustrated the sensitivity of lifetime strength to any current value of an effect. In addition, verification studies comparing predictions of high-cycle mechanical fatigue and high temperature effects with experiments are presented. Results from this limited verification study strongly supported that material degradation can be represented by randomized multifactor interaction models.
A Practical Probabilistic Graphical Modeling Tool for Weighing Ecological Risk-Based Evidence
Past weight-of-evidence frameworks for adverse ecological effects have provided soft-scoring procedures for judgments based on the quality and measured attributes of evidence. Here, we provide a flexible probabilistic structure for weighing and integrating lines of evidence for e...
Briggs, Andrew H; Ades, A E; Price, Martin J
2003-01-01
In structuring decision models of medical interventions, it is commonly recommended that only 2 branches be used for each chance node to avoid logical inconsistencies that can arise during sensitivity analyses if the branching probabilities do not sum to 1. However, information may be naturally available in an unconditional form, and structuring a tree in conditional form may complicate rather than simplify the sensitivity analysis of the unconditional probabilities. Current guidance emphasizes using probabilistic sensitivity analysis, and a method is required to provide probabilistic probabilities over multiple branches that appropriately represents uncertainty while satisfying the requirement that mutually exclusive event probabilities should sum to 1. The authors argue that the Dirichlet distribution, the multivariate equivalent of the beta distribution, is appropriate for this purpose and illustrate its use for generating a fully probabilistic transition matrix for a Markov model. Furthermore, they demonstrate that by adopting a Bayesian approach, the problem of observing zero counts for transitions of interest can be overcome.
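A minimal sketch of the Dirichlet approach described above: each row of a Markov transition matrix is drawn from a Dirichlet posterior formed by adding a prior pseudo-count to the observed transition counts, so branch probabilities always sum to 1 and zero-count transitions remain properly uncertain; the three-state model and the counts are illustrative assumptions.

```python
import numpy as np

def sample_transition_matrix(counts, prior=1.0, n_draws=1000, rng=None):
    """Probabilistic transition matrix for a Markov model via row-wise Dirichlet draws.

    counts[i, j] : observed transitions from state i to state j
    prior        : Dirichlet pseudo-count added to every cell, so rows with
                   zero observed transitions still yield proper probabilities
    Returns an array of shape (n_draws, S, S); every row of every draw sums to 1.
    """
    rng = np.random.default_rng(rng)
    counts = np.asarray(counts, dtype=float)
    draws = np.empty((n_draws, *counts.shape))
    for i, row in enumerate(counts):
        draws[:, i, :] = rng.dirichlet(row + prior, size=n_draws)
    return draws

# Illustrative 3-state Markov model (e.g. well / progression / dead) with one
# transition of interest never observed in the data (a zero count).
counts = np.array([[90, 8, 2],
                   [0, 45, 5],
                   [0, 0, 50]])
draws = sample_transition_matrix(counts, prior=1.0, n_draws=2000, rng=42)
print(draws.mean(axis=0).round(3))      # posterior mean transition matrix
print(draws[:, 1, 0].mean().round(4))   # uncertainty preserved for the zero-count cell
```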
Probabilistic wind/tornado/missile analyses for hazard and fragility evaluations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Park, Y.J.; Reich, M.
Detailed analysis procedures and examples are presented for the probabilistic evaluation of hazard and fragility against high wind, tornado, and tornado-generated missiles. In the tornado hazard analysis, existing risk models are modified to incorporate various uncertainties including modeling errors. A significant feature of this paper is the detailed description of the Monte-Carlo simulation analyses of tornado-generated missiles. A simulation procedure, which includes the wind field modeling, missile injection, solution of flight equations, and missile impact analysis, is described with application examples.
Dynamic competitive probabilistic principal components analysis.
López-Rubio, Ezequiel; Ortiz-DE-Lazcano-Lobato, Juan Miguel
2009-04-01
We present a new neural model which extends the classical competitive learning (CL) by performing a Probabilistic Principal Components Analysis (PPCA) at each neuron. The model also has the ability to learn the number of basis vectors required to represent the principal directions of each cluster, so it overcomes a drawback of most local PCA models, where the dimensionality of a cluster must be fixed a priori. Experimental results are presented to show the performance of the network with multispectral image data.
Confronting uncertainty in flood damage predictions
NASA Astrophysics Data System (ADS)
Schröter, Kai; Kreibich, Heidi; Vogel, Kristin; Merz, Bruno
2015-04-01
Reliable flood damage models are a prerequisite for the practical usefulness of model results. Traditional univariate damage models, such as depth-damage curves, often fail to reproduce the variability of observed flood damage. Innovative multivariate probabilistic modelling approaches are promising for capturing and quantifying the uncertainty involved and thus improving the basis for decision making. In this study we compare the predictive capability of two probabilistic modelling approaches, namely Bagging Decision Trees and Bayesian Networks. For model evaluation we use empirical damage data available from computer-aided telephone interviews compiled after the floods of 2002, 2005 and 2006 in the Elbe and Danube catchments in Germany. We carry out a split-sample test by sub-setting the damage records: one sub-set is used to derive the models and the remaining records are used to evaluate predictive performance. Further, we stratify the sample according to catchments, which allows studying model performance in a spatial transfer context. Flood damage estimation is carried out at the scale of individual buildings in terms of relative damage. The predictive performance of the models is assessed in terms of systematic deviations (mean bias), precision (mean absolute error), and reliability, represented by the proportion of observations that fall within the 5- to 95-quantile predictive interval. The reliability of the probabilistic predictions within validation runs decreases only slightly and achieves very good coverage of observations within the predictive interval. Probabilistic models provide quantitative information about prediction uncertainty, which is crucial for assessing the reliability of model predictions and improves the usefulness of model results.
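A minimal sketch of the bagged-trees side of such an analysis, assuming synthetic stand-in damage records: the spread of per-tree predictions is used as a simple predictive band, and mean bias, mean absolute error, and interval coverage are computed on a held-out subset, mirroring the evaluation criteria described above.

```python
import numpy as np
from sklearn.ensemble import BaggingRegressor

rng = np.random.default_rng(0)

# Synthetic stand-in for flood damage records: water depth [m], building area [m^2],
# contamination flag -> relative damage in [0, 1]. Purely illustrative data.
n = 600
X = np.column_stack([rng.uniform(0, 3, n), rng.uniform(80, 250, n), rng.integers(0, 2, n)])
y = np.clip(0.15 * X[:, 0] + 0.1 * X[:, 2] + rng.normal(0, 0.05, n), 0, 1)

# Split-sample test: derive the model on one subset, validate on the rest.
train, test = np.arange(n) < 400, np.arange(n) >= 400
model = BaggingRegressor(n_estimators=200, random_state=0)   # default base learner: decision tree
model.fit(X[train], y[train])

# Per-tree predictions give a simple empirical predictive band for each building.
per_tree = np.array([t.predict(X[test]) for t in model.estimators_])
lo, hi = np.quantile(per_tree, [0.05, 0.95], axis=0)
mean_pred = per_tree.mean(axis=0)

print("mean bias          :", round(float(np.mean(mean_pred - y[test])), 4))
print("mean absolute error:", round(float(np.mean(np.abs(mean_pred - y[test]))), 4))
print("90% interval cover :", round(float(np.mean((y[test] >= lo) & (y[test] <= hi))), 3))
```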
Probabilistic Modeling of the Renal Stone Formation Module
NASA Technical Reports Server (NTRS)
Best, Lauren M.; Myers, Jerry G.; Goodenow, Debra A.; McRae, Michael P.; Jackson, Travis C.
2013-01-01
The Integrated Medical Model (IMM) is a probabilistic tool used in mission-planning decision making and medical systems risk assessments. The IMM project maintains a database of over 80 medical conditions that could occur during a spaceflight, documenting an incidence rate and end case scenarios for each. In some cases, where observational data are insufficient to adequately define the inflight medical risk, the IMM utilizes external probabilistic modules to model and estimate the event likelihoods. One such medical event of interest is an unpassed renal stone. Due to a high salt diet and high concentrations of calcium in the blood (due to bone depletion caused by unloading in the microgravity environment), astronauts are at a considerably elevated risk for developing renal calculi (nephrolithiasis) while in space. Lack of observed incidences of nephrolithiasis has led HRP to initiate the development of the Renal Stone Formation Module (RSFM) to create a probabilistic simulator capable of estimating the likelihood of symptomatic renal stone presentation in astronauts on exploration missions. The model consists of two major parts. The first is the probabilistic component, which utilizes probability distributions to assess the range of urine electrolyte parameters and a multivariate regression to transform estimated crystal density and size distributions to the likelihood of the presentation of nephrolithiasis symptoms. The second is a deterministic physical and chemical model of renal stone growth in the kidney developed by Kassemi et al. The probabilistic component of the renal stone model couples the input probability distributions describing the urine chemistry, astronaut physiology, and system parameters with the physical and chemical outputs and inputs to the deterministic stone growth model. These two parts of the model are necessary to capture the uncertainty in the likelihood estimate. The model will be driven by Monte Carlo simulations, continuously randomly sampling the probability distributions of the electrolyte concentrations and system parameters that are inputs into the deterministic model. The total urine chemistry concentrations are used to determine the urine chemistry activity using the Joint Expert Speciation System (JESS), a biochemistry model. Information from JESS is then fed into the deterministic growth model. Outputs from JESS and the deterministic model are passed back to the probabilistic model, where a multivariate regression is used to assess the likelihood of a stone forming and the likelihood of a stone requiring clinical intervention. The parameters used to quantify these risks include: relative supersaturation (RS) of calcium oxalate, citrate/calcium ratio, crystal number density, total urine volume, pH, magnesium excretion, maximum stone width, and ureteral location. Methods and Validation: The RSFM is designed to perform a Monte Carlo simulation to generate probability distributions of clinically significant renal stones, as well as provide an associated uncertainty in the estimate. Initially, early versions will be used to test integration of the components and assess component validation and verification (V&V), with later versions used to address questions regarding design reference mission scenarios. Once integrated with the deterministic component, the credibility assessment of the integrated model will follow NASA STD 7009 requirements.
Do Resit Exams Promote Lower Investments of Study Time? Theory and Data from a Laboratory Study
Nieuwenstein, Mark R.; de Jong, Ritske; Lorist, Monicque M.
2016-01-01
Although many educational institutions allow students to resit exams, a recently proposed mathematical model suggests that this could lead to a dramatic reduction in study-time investment, especially in rational students. In the current study, we present a modification of this model in which we included some well-justified assumptions about learning and performance on multiple-choice tests, and we tested its predictions in two experiments in which participants were asked to invest fictional study time for a fictional exam. Consistent with our model, the prospect of a resit exam was found to promote lower investments of study time for a first exam and this effect was stronger for participants scoring higher on the cognitive reflection test. We also found that the negative effect of resit exams on study-time investment was attenuated when access to the resit was made uncertain by making it probabilistic or dependent on obtaining a minimal, non-passing grade for the first attempt. Taken together, these results suggest that offering students resit exams may compromise the achievement of learning goals, and they raise the more general implication that second chances promote risky behavior. PMID:27711140
Do Resit Exams Promote Lower Investments of Study Time? Theory and Data from a Laboratory Study.
Nijenkamp, Rob; Nieuwenstein, Mark R; de Jong, Ritske; Lorist, Monicque M
2016-01-01
Although many educational institutions allow students to resit exams, a recently proposed mathematical model suggests that this could lead to a dramatic reduction in study-time investment, especially in rational students. In the current study, we present a modification of this model in which we included some well-justified assumptions about learning and performance on multiple-choice tests, and we tested its predictions in two experiments in which participants were asked to invest fictional study time for a fictional exam. Consistent with our model, the prospect of a resit exam was found to promote lower investments of study time for a first exam and this effect was stronger for participants scoring higher on the cognitive reflection test. We also found that the negative effect of resit exams on study-time investment was attenuated when access to the resit was made uncertain by making it probabilistic or dependent on obtaining a minimal, non-passing grade for the first attempt. Taken together, these results suggest that offering students resit exams may compromise the achievement of learning goals, and they raise the more general implication that second chances promote risky behavior.
ERIC Educational Resources Information Center
Wang, Yinying; Bowers, Alex J.; Fikis, David J.
2017-01-01
Purpose: The purpose of this study is to describe the underlying topics and the topic evolution in the 50-year history of educational leadership research literature. Method: We used automated text data mining with probabilistic latent topic models to examine the full text of the entire publication history of all 1,539 articles published in…
Method and system for dynamic probabilistic risk assessment
NASA Technical Reports Server (NTRS)
Dugan, Joanne Bechta (Inventor); Xu, Hong (Inventor)
2013-01-01
The DEFT methodology, system, and computer-readable medium extend the applicability of the PRA (Probabilistic Risk Assessment) methodology to computer-based systems by allowing DFT (Dynamic Fault Tree) nodes as pivot nodes in the Event Tree (ET) model. DEFT includes a mathematical model and solution algorithm and supports all common PRA analysis functions and cutsets. Additional capabilities enabled by the DFT include modularization, phased mission analysis, sequence dependencies, and imperfect coverage.
Probabilistic Climate Scenario Information for Risk Assessment
NASA Astrophysics Data System (ADS)
Dairaku, K.; Ueno, G.; Takayabu, I.
2014-12-01
Climate information and services for Impacts, Adaptation and Vulnerability (IAV) Assessments are of great concern. In order to develop probabilistic regional climate information that represents the uncertainty in climate scenario experiments in Japan, we compared physics ensemble experiments using the 60-km global atmospheric model of the Meteorological Research Institute (MRI-AGCM) with multi-model ensemble experiments with global atmosphere-ocean coupled models (CMIP3) under the SRES A1b scenario. The MRI-AGCM shows relatively good skill, particularly in the tropics, for temperature and geopotential height. Variability in surface air temperature of the physics ensemble experiments with MRI-AGCM was within the range of one standard deviation of the CMIP3 models in the Asia region. On the other hand, the variability of precipitation was relatively well represented compared with the variation of the CMIP3 models. Models that show similar reproducibility in the present climate show different future climate changes, and we could not find clear relationships between the present climate and future climate change in temperature and precipitation. We develop a new method to produce probabilistic information on climate change scenarios by weighting model ensemble experiments based on a regression model (Krishnamurti et al., Science, 1999). The method can easily be applied to other regions and other physical quantities, and also to downscaling to finer scales, depending on the availability of observational datasets. The prototype of probabilistic information for Japan represents the quantified structural uncertainties of multi-model ensemble experiments of climate change scenarios. Acknowledgments: This study was supported by the SOUSEI Program, funded by the Ministry of Education, Culture, Sports, Science and Technology, Government of Japan.
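The weighting approach cited above (Krishnamurti et al., 1999) amounts to regressing observations on the individual ensemble members over a training period and using the fitted coefficients to combine future forecasts. The sketch below shows that idea on synthetic data; the member biases, training length and all numbers are invented, and the published method includes refinements not reproduced here.

```python
# Illustrative regression-based weighting of ensemble members ("superensemble").
import numpy as np

rng = np.random.default_rng(0)
n_train, n_members = 200, 5

# Synthetic training period: member forecasts and verifying observations.
truth = rng.normal(15.0, 3.0, n_train)
members = truth[:, None] + rng.normal(0.0, 1.5, (n_train, n_members)) \
          + rng.normal(0.0, 1.0, n_members)           # member-specific biases

X = np.column_stack([np.ones(n_train), members])       # intercept + members
weights, *_ = np.linalg.lstsq(X, truth, rcond=None)    # least-squares weights

# Weighted combination for a new forecast case.
new_members = np.array([14.2, 16.1, 15.4, 13.8, 15.9])
combined = weights[0] + new_members @ weights[1:]
print("regression weights:", np.round(weights, 3))
print("weighted forecast:", round(float(combined), 2))
```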
NASA Technical Reports Server (NTRS)
Warner, James E.; Zubair, Mohammad; Ranjan, Desh
2017-01-01
This work investigates novel approaches to probabilistic damage diagnosis that utilize surrogate modeling and high performance computing (HPC) to achieve substantial computational speedup. Motivated by Digital Twin, a structural health management (SHM) paradigm that integrates vehicle-specific characteristics with continual in-situ damage diagnosis and prognosis, the methods studied herein yield near real-time damage assessments that could enable monitoring of a vehicle's health while it is operating (i.e. online SHM). High-fidelity modeling and uncertainty quantification (UQ), both critical to Digital Twin, are incorporated using finite element method simulations and Bayesian inference, respectively. The crux of the proposed Bayesian diagnosis methods, however, is the reformulation of the numerical sampling algorithms (e.g. Markov chain Monte Carlo) used to generate the resulting probabilistic damage estimates. To this end, three distinct methods are demonstrated for rapid sampling that utilize surrogate modeling and exploit various degrees of parallelism for leveraging HPC. The accuracy and computational efficiency of the methods are compared on the problem of strain-based crack identification in thin plates. While each approach has inherent problem-specific strengths and weaknesses, all approaches are shown to provide accurate probabilistic damage diagnoses and several orders of magnitude computational speedup relative to a baseline Bayesian diagnosis implementation.
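One way to read the surrogate-plus-sampling idea described above is: replace the expensive finite element forward model inside a Markov chain Monte Carlo loop with a cheap approximation fitted to a handful of full-model runs. The sketch below does this for a single damage parameter with a made-up forward model, a polynomial surrogate and a basic Metropolis sampler; it is not the parallelized implementation of the paper, just the core pattern.

```python
# Surrogate-accelerated Bayesian damage estimation (toy example).
import numpy as np

rng = np.random.default_rng(1)

def expensive_forward(a):
    """Stand-in for an FEM strain prediction given crack length a (mm)."""
    return 100.0 + 40.0 * np.log1p(a)

# Build a cheap surrogate from a handful of "FEM" evaluations.
a_train = np.linspace(0.1, 10.0, 20)
coeffs = np.polyfit(a_train, expensive_forward(a_train), deg=3)
surrogate = lambda a: np.polyval(coeffs, a)

# Synthetic measured strain for a true crack length of 4 mm, with sensor noise.
sigma = 2.0
y_obs = expensive_forward(4.0) + rng.normal(0, sigma)

def log_post(a):
    if not 0.1 <= a <= 10.0:                  # uniform prior bounds
        return -np.inf
    return -0.5 * ((y_obs - surrogate(a)) / sigma) ** 2

# Metropolis random walk using only surrogate evaluations.
samples, a_cur, lp_cur = [], 1.0, log_post(1.0)
for _ in range(20000):
    a_prop = a_cur + rng.normal(0, 0.3)
    lp_prop = log_post(a_prop)
    if np.log(rng.uniform()) < lp_prop - lp_cur:
        a_cur, lp_cur = a_prop, lp_prop
    samples.append(a_cur)

post = np.array(samples[5000:])
print(f"posterior crack length: {post.mean():.2f} mm "
      f"(95% CI {np.percentile(post, 2.5):.2f}-{np.percentile(post, 97.5):.2f})")
```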
The meta-Gaussian Bayesian Processor of forecasts and associated preliminary experiments
NASA Astrophysics Data System (ADS)
Chen, Fajing; Jiao, Meiyan; Chen, Jing
2013-04-01
Public weather services are trending toward providing users with probabilistic weather forecasts, in place of traditional deterministic forecasts. Probabilistic forecasting techniques are continually being improved to optimize available forecasting information. The Bayesian Processor of Forecasts (BPF), a new statistical method for probabilistic forecasting, can transform a deterministic forecast into a probabilistic forecast according to the historical statistical relationship between observations and forecasts generated by that forecasting system. This technique accounts for the typical forecasting performance of a deterministic forecasting system in quantifying the forecast uncertainty. The meta-Gaussian likelihood model is suitable for a variety of stochastic dependence structures with monotone likelihood ratios. The meta-Gaussian BPF adopting this kind of likelihood model can therefore be applied across many fields, including meteorology and hydrology. The Bayes theorem with two continuous random variables and the normal-linear BPF are briefly introduced. The meta-Gaussian BPF for a continuous predictand using a single predictor is then presented and discussed. The performance of the meta-Gaussian BPF is tested in a preliminary experiment. Control forecasts of daily surface temperature at 0000 UTC at Changsha and Wuhan stations are used as the deterministic forecast data. These control forecasts are taken from ensemble predictions with a 96-h lead time generated by the National Meteorological Center of the China Meteorological Administration, the European Centre for Medium-Range Weather Forecasts, and the US National Centers for Environmental Prediction during January 2008. The results of the experiment show that the meta-Gaussian BPF can transform a deterministic control forecast of surface temperature from any one of the three ensemble predictions into a useful probabilistic forecast of surface temperature. These probabilistic forecasts quantify the uncertainty of the control forecast; accordingly, the performance of the probabilistic forecasts differs based on the source of the underlying deterministic control forecasts.
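For the normal-linear special case mentioned above, the Bayesian Processor of Forecasts reduces to a conjugate Gaussian update: a climatological prior on the predictand is combined with a linear-Gaussian likelihood of the forecast given the predictand, both estimated from a historical sample. The sketch below implements that textbook version on synthetic data; the meta-Gaussian extension (normal-quantile transforms of the margins) is not shown, and all numbers are illustrative.

```python
# Normal-linear Bayesian Processor of Forecasts on synthetic data.
import numpy as np

rng = np.random.default_rng(7)

# Synthetic "historical" joint sample of observations w and forecasts x.
w_hist = rng.normal(2.0, 4.0, 500)                     # 0000 UTC temperature, deg C
x_hist = 0.9 * w_hist + 1.0 + rng.normal(0, 2.0, 500)  # forecast given truth

# Prior (climatology) and likelihood parameters estimated from the sample.
M, S = w_hist.mean(), w_hist.std(ddof=1)
b, a = np.polyfit(w_hist, x_hist, 1)                   # x ~ a + b*w
tau = (x_hist - (a + b * w_hist)).std(ddof=1)          # forecast error spread

def posterior(x_new):
    """Posterior N(mu, sd) of the observation given a deterministic forecast."""
    prec = 1.0 / S**2 + b**2 / tau**2
    mu = (M / S**2 + b * (x_new - a) / tau**2) / prec
    return mu, np.sqrt(1.0 / prec)

mu, sd = posterior(x_new=5.0)
print(f"probabilistic forecast: N({mu:.2f}, {sd:.2f}) given control forecast 5.0")
```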
Speech Enhancement Using Gaussian Scale Mixture Models
Hao, Jiucang; Lee, Te-Won; Sejnowski, Terrence J.
2011-01-01
This paper presents a novel probabilistic approach to speech enhancement. Instead of a deterministic logarithmic relationship, we assume a probabilistic relationship between the frequency coefficients and the log-spectra. The speech model in the log-spectral domain is a Gaussian mixture model (GMM). The frequency coefficients obey a zero-mean Gaussian whose covariance equals the exponential of the log-spectra. This results in a Gaussian scale mixture model (GSMM) for the speech signal in the frequency domain, since the log-spectra can be regarded as scaling factors. The probabilistic relation between frequency coefficients and log-spectra allows these to be treated as two random variables, both to be estimated from the noisy signals. Expectation-maximization (EM) was used to train the GSMM and Bayesian inference was used to compute the posterior signal distribution. Because exact inference of this full probabilistic model is computationally intractable, we developed two approaches to enhance the efficiency: the Laplace method and a variational approximation. The proposed methods were applied to enhance speech corrupted by Gaussian noise and speech-shaped noise (SSN). For both approximations, signals reconstructed from the estimated frequency coefficients provided higher signal-to-noise ratio (SNR) and those reconstructed from the estimated log-spectra produced lower word recognition error rate because the log-spectra fit the inputs to the recognizer better. Our algorithms effectively reduced the SSN, which algorithms based on spectral analysis were not able to suppress. PMID:21359139
Encoding probabilistic brain atlases using Bayesian inference.
Van Leemput, Koen
2009-06-01
This paper addresses the problem of creating probabilistic brain atlases from manually labeled training data. Probabilistic atlases are typically constructed by counting the relative frequency of occurrence of labels in corresponding locations across the training images. However, such an "averaging" approach generalizes poorly to unseen cases when the number of training images is limited, and provides no principled way of aligning the training datasets using deformable registration. In this paper, we generalize the generative image model implicitly underlying standard "average" atlases, using mesh-based representations endowed with an explicit deformation model. Bayesian inference is used to infer the optimal model parameters from the training data, leading to a simultaneous group-wise registration and atlas estimation scheme that encompasses standard averaging as a special case. We also use Bayesian inference to compare alternative atlas models in light of the training data, and show how this leads to a data compression problem that is intuitive to interpret and computationally feasible. Using this technique, we automatically determine the optimal amount of spatial blurring, the best deformation field flexibility, and the most compact mesh representation. We demonstrate, using 2-D training datasets, that the resulting models are better at capturing the structure in the training data than conventional probabilistic atlases. We also present experiments of the proposed atlas construction technique in 3-D, and show the resulting atlases' potential in fully-automated, pulse sequence-adaptive segmentation of 36 neuroanatomical structures in brain MRI scans.
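The "averaging" baseline that the paper generalizes is simple to state: the prior probability of a label at a voxel is its relative frequency across aligned, manually labeled training images. A minimal sketch of that counting step, on tiny synthetic label maps, is given below; the mesh-based deformable model and Bayesian model comparison of the paper are well beyond this illustration.

```python
# Frequency-counting construction of a probabilistic atlas (toy data).
import numpy as np

rng = np.random.default_rng(3)
n_subjects, shape, n_labels = 10, (4, 4), 3

# Synthetic manual segmentations (assumed already spatially aligned).
labels = rng.integers(0, n_labels, size=(n_subjects, *shape))

# Count label occurrences per voxel and normalize to probabilities.
atlas = np.zeros((n_labels, *shape))
for k in range(n_labels):
    atlas[k] = (labels == k).mean(axis=0)

assert np.allclose(atlas.sum(axis=0), 1.0)   # probabilities sum to 1 per voxel
print("P(label=1) at each voxel:\n", np.round(atlas[1], 2))
```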
Social and monetary reward learning engage overlapping neural substrates.
Lin, Alice; Adolphs, Ralph; Rangel, Antonio
2012-03-01
Learning to make choices that yield rewarding outcomes requires the computation of three distinct signals: stimulus values that are used to guide choices at the time of decision making, experienced utility signals that are used to evaluate the outcomes of those decisions and prediction errors that are used to update the values assigned to stimuli during reward learning. Here we investigated whether monetary and social rewards involve overlapping neural substrates during these computations. Subjects engaged in two probabilistic reward learning tasks that were identical except that rewards were either social (pictures of smiling or angry people) or monetary (gaining or losing money). We found substantial overlap between the two types of rewards for all components of the learning process: a common area of ventromedial prefrontal cortex (vmPFC) correlated with stimulus value at the time of choice and another common area of vmPFC correlated with reward magnitude and common areas in the striatum correlated with prediction errors. Taken together, the findings support the hypothesis that shared anatomical substrates are involved in the computation of both monetary and social rewards. © The Author (2011). Published by Oxford University Press.
An Instructional Module on Mokken Scale Analysis
ERIC Educational Resources Information Center
Wind, Stefanie A.
2017-01-01
Mokken scale analysis (MSA) is a probabilistic-nonparametric approach to item response theory (IRT) that can be used to evaluate fundamental measurement properties with less strict assumptions than parametric IRT models. This instructional module provides an introduction to MSA as a probabilistic-nonparametric framework in which to explore…
Offerman, Theo; Palley, Asa B
2016-01-01
Strictly proper scoring rules are designed to truthfully elicit subjective probabilistic beliefs from risk neutral agents. Previous experimental studies have identified two problems with this method: (i) risk aversion causes agents to bias their reports toward the probability of [Formula: see text], and (ii) for moderate beliefs agents simply report [Formula: see text]. Applying a prospect theory model of risk preferences, we show that loss aversion can explain both of these behavioral phenomena. Using the insights of this model, we develop a simple off-the-shelf probability assessment mechanism that encourages loss-averse agents to report true beliefs. In an experiment, we demonstrate the effectiveness of this modification in both eliminating uninformative reports and eliciting true probabilistic beliefs.
Probabilistic Methods for Structural Reliability and Risk
NASA Technical Reports Server (NTRS)
Chamis, Christos C.
2010-01-01
A probabilistic method is used to evaluate the structural reliability and risk of select metallic and composite structures. The method is a multiscale, multifunctional and it is based on the most elemental level. A multifactor interaction model is used to describe the material properties which are subsequently evaluated probabilistically. The metallic structure is a two rotor aircraft engine, while the composite structures consist of laminated plies (multiscale) and the properties of each ply are the multifunctional representation. The structural component is modeled by finite element. The solution method for structural responses is obtained by an updated simulation scheme. The results show that the risk for the two rotor engine is about 0.0001 and the composite built-up structure is also 0.0001.
Probabilistic Methods for Structural Reliability and Risk
NASA Technical Reports Server (NTRS)
Chamis, Christos C.
2008-01-01
A probabilistic method is used to evaluate the structural reliability and risk of select metallic and composite structures. The method is a multiscale, multifunctional and it is based on the most elemental level. A multi-factor interaction model is used to describe the material properties which are subsequently evaluated probabilistically. The metallic structure is a two rotor aircraft engine, while the composite structures consist of laminated plies (multiscale) and the properties of each ply are the multifunctional representation. The structural component is modeled by finite element. The solution method for structural responses is obtained by an updated simulation scheme. The results show that the risk for the two rotor engine is about 0.0001 and the composite built-up structure is also 0.0001.
High-Resolution Underwater Mapping Using Side-Scan Sonar
2016-01-01
The goal of this study is to generate high-resolution sea floor maps using a Side-Scan Sonar (SSS). This is achieved by explicitly taking into account the SSS operation as follows. First, the raw sensor data is corrected by means of a physics-based SSS model. Second, the data is projected to the sea-floor. The errors involved in this projection are thoroughly analysed. Third, a probabilistic SSS model is defined and used to estimate the probability of each sea-floor region being observed. This probabilistic information is then used to weight the contribution of each SSS measurement to the map. Because of these models, arbitrary map resolutions can be achieved, even beyond the sensor resolution. Finally, a geometric map building method is presented and combined with the probabilistic approach. The resulting map is composed of two layers. The echo intensity layer holds the most likely echo intensities at each point in the sea-floor. The probabilistic layer contains information about how confident the user or the higher control layers can be about the echo intensity layer data. Experiments have been conducted in a large subsea region. PMID:26821379
Event-Based Media Enrichment Using an Adaptive Probabilistic Hypergraph Model.
Liu, Xueliang; Wang, Meng; Yin, Bao-Cai; Huet, Benoit; Li, Xuelong
2015-11-01
Nowadays, with the continual development of digital capture technologies and social media services, a vast number of media documents are captured and shared online to help attendees record their experience during events. In this paper, we present a method combining semantic inference and multimodal analysis for automatically finding media content to illustrate events using an adaptive probabilistic hypergraph model. In this model, media items are taken as vertices in the weighted hypergraph and the task of enriching media to illustrate events is formulated as a ranking problem. In our method, each hyperedge is constructed using the K-nearest neighbors of a given media document. We also employ a probabilistic representation, which assigns each vertex to a hyperedge in a probabilistic way, to further exploit the correlation among media data. Furthermore, we optimize the hypergraph weights in a regularization framework, which is solved as a second-order cone problem. The approach is initiated by seed media and then used to rank the media documents using a transductive inference process. The results obtained from validating the approach on an event dataset collected from EventMedia demonstrate the effectiveness of the proposed approach.
MrLavaLoba: A new probabilistic model for the simulation of lava flows as a settling process
NASA Astrophysics Data System (ADS)
de'Michieli Vitturi, Mattia; Tarquini, Simone
2018-01-01
A new code to simulate lava flow spread, MrLavaLoba, is presented. In the code, erupted lava is itemized in parcels having an elliptical shape and prescribed volume. New parcels bud from existing ones according to a probabilistic law influenced by the local steepest slope direction and by tunable input settings. MrLavaLoba is to be counted among the probabilistic codes for the simulation of lava flows, because it is not intended to mimic the actual process of flowing or to directly provide the progression of the flow field with time, but rather to estimate the most probable inundated area and the final thickness of the lava deposit. The code's flexibility allows it to produce variable lava flow spread and emplacement according to different dynamics (e.g. pahoehoe or channelized-'a'ā). For a given scenario, it is shown that model outputs converge, in probabilistic terms, towards a single solution. The code is applied to real cases in Hawaii and Mt. Etna, and the obtained maps are shown. The model is written in Python and the source code is available at http://demichie.github.io/MrLavaLoba/.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yersak, Alexander S., E-mail: alexander.yersak@colorado.edu; Lee, Yung-Cheng
Pinhole defects in atomic layer deposition (ALD) coatings were measured in an area of 30 cm² in an ALD reactor, and these defects were represented by a probabilistic cluster model instead of a single defect density value with number of defects over area. With the probabilistic cluster model, the pinhole defects were simulated over a manufacturing-scale surface area of ∼1 m². Large-area pinhole defect simulations were used to develop an improved and enhanced design method for ALD-based devices. A flexible thermal ground plane (FTGP) device requiring ALD hermetic coatings was used as an example. Using a single defect density value, it was determined that for an application with operation temperatures higher than 60 °C, the FTGP device would not be possible. The new probabilistic cluster model shows that up to 40.3% of the FTGP would be acceptable. With this new approach the manufacturing yield of ALD-enabled or other thin film based devices with different design configurations can be determined. It is important to guide process optimization and control and design for manufacturability.
Bayesian Probabilistic Projections of Life Expectancy for All Countries
Raftery, Adrian E.; Chunn, Jennifer L.; Gerland, Patrick; Ševčíková, Hana
2014-01-01
We propose a Bayesian hierarchical model for producing probabilistic forecasts of male period life expectancy at birth for all the countries of the world from the present to 2100. Such forecasts would be an input to the production of probabilistic population projections for all countries, which is currently being considered by the United Nations. To evaluate the method, we did an out-of-sample cross-validation experiment, fitting the model to the data from 1950–1995, and using the estimated model to forecast for the subsequent ten years. The ten-year predictions had a mean absolute error of about 1 year, about 40% less than the current UN methodology. The probabilistic forecasts were calibrated, in the sense that (for example) the 80% prediction intervals contained the truth about 80% of the time. We illustrate our method with results from Madagascar (a typical country with steadily improving life expectancy), Latvia (a country that has had a mortality crisis), and Japan (a leading country). We also show aggregated results for South Asia, a region with eight countries. Free publicly available R software packages called bayesLife and bayesDem are available to implement the method. PMID:23494599
Edwards, T.C.; Cutler, D.R.; Zimmermann, N.E.; Geiser, L.; Moisen, Gretchen G.
2006-01-01
We evaluated the effects of probabilistic (hereafter DESIGN) and non-probabilistic (PURPOSIVE) sample surveys on resultant classification tree models for predicting the presence of four lichen species in the Pacific Northwest, USA. Models derived from both survey forms were assessed using an independent data set (EVALUATION). Measures of accuracy as gauged by resubstitution rates were similar for each lichen species irrespective of the underlying sample survey form. Cross-validation estimates of prediction accuracies were lower than resubstitution accuracies for all species and both design types, and in all cases were closer to the true prediction accuracies based on the EVALUATION data set. We argue that greater emphasis should be placed on calculating and reporting cross-validation accuracy rates rather than simple resubstitution accuracy rates. Evaluation of the DESIGN and PURPOSIVE tree models on the EVALUATION data set shows significantly lower prediction accuracy for the PURPOSIVE tree models relative to the DESIGN models, indicating that non-probabilistic sample surveys may generate models with limited predictive capability. These differences were consistent across all four lichen species, with 11 of the 12 possible species and sample survey type comparisons having significantly lower accuracy rates. Some differences in accuracy were as large as 50%. The classification tree structures also differed considerably both among and within the modelled species, depending on the sample survey form. Overlap in the predictor variables selected by the DESIGN and PURPOSIVE tree models ranged from only 20% to 38%, indicating the classification trees fit the two evaluated survey forms on different sets of predictor variables. The magnitude of these differences in predictor variables throws doubt on ecological interpretation derived from prediction models based on non-probabilistic sample surveys. ?? 2006 Elsevier B.V. All rights reserved.
Scheres, Anouk; Dijkstra, Marianne; Ainslie, Eleanor; Balkan, Jaclyn; Reynolds, Brady; Sonuga-Barke, Edmund; Castellanos, F Xavier
2006-01-01
This study investigated whether age and ADHD symptoms affected choice preferences in children and adolescents when they chose between (1) small immediate rewards and larger delayed rewards and (2) small certain rewards and larger probabilistic uncertain rewards. A temporal discounting (TD) task and a probabilistic discounting (PD) task were used to measure the degree to which the subjective value of a large reward decreased as one had to wait longer for it (TD), and as the probability of obtaining it decreased (PD). Rewards used were small amounts of money. In the TD task, the large reward (10 cents) was delayed by between 0 and 30s, and the immediate reward varied in magnitude (0-10 cents). In the PD task, receipt of the large reward (10 cents) varied in likelihood, with probabilities of 0, 0.25, 0.5, 0.75, and 1.0 used, and the certain reward varied in magnitude (0-10 cents). Age and diagnostic group did not affect the degree of PD of rewards: All participants made choices so that total gains were maximized. As predicted, young children, aged 6-11 years (n = 25) demonstrated steeper TD of rewards than adolescents, aged 12-17 years (n = 21). This effect remained significant even when choosing the immediate reward did not shorten overall task duration. This, together with the lack of interaction between TD task version and age, suggests that steeper discounting in young children is driven by reward immediacy and not by delay aversion. Contrary to our predictions, participants with ADHD (n = 22) did not demonstrate steeper TD of rewards than controls (n = 24). These results raise the possibility that strong preferences for small immediate rewards in ADHD, as found in previous research, depend on factors such as total maximum gain and the use of fixed versus varied delay durations. The decrease in TD as observed in adolescents compared to children may be related to developmental changes in the (dorsolateral) prefrontal cortex. Future research needs to investigate these possibilities.
Multi-parametric variational data assimilation for hydrological forecasting
NASA Astrophysics Data System (ADS)
Alvarado-Montero, R.; Schwanenberg, D.; Krahe, P.; Helmke, P.; Klein, B.
2017-12-01
Ensemble forecasting is increasingly applied in flow forecasting systems to provide users with a better understanding of forecast uncertainty and consequently to take better-informed decisions. A common practice in probabilistic streamflow forecasting is to force deterministic hydrological model with an ensemble of numerical weather predictions. This approach aims at the representation of meteorological uncertainty but neglects uncertainty of the hydrological model as well as its initial conditions. Complementary approaches use probabilistic data assimilation techniques to receive a variety of initial states or represent model uncertainty by model pools instead of single deterministic models. This paper introduces a novel approach that extends a variational data assimilation based on Moving Horizon Estimation to enable the assimilation of observations into multi-parametric model pools. It results in a probabilistic estimate of initial model states that takes into account the parametric model uncertainty in the data assimilation. The assimilation technique is applied to the uppermost area of River Main in Germany. We use different parametric pools, each of them with five parameter sets, to assimilate streamflow data, as well as remotely sensed data from the H-SAF project. We assess the impact of the assimilation in the lead time performance of perfect forecasts (i.e. observed data as forcing variables) as well as deterministic and probabilistic forecasts from ECMWF. The multi-parametric assimilation shows an improvement of up to 23% for CRPS performance and approximately 20% in Brier Skill Scores with respect to the deterministic approach. It also improves the skill of the forecast in terms of rank histogram and produces a narrower ensemble spread.
Probabilistic fatigue life prediction of metallic and composite materials
NASA Astrophysics Data System (ADS)
Xiang, Yibing
Fatigue is one of the most common failure modes for engineering structures, such as aircraft, rotorcraft and aviation transports. Both metallic materials and composite materials are widely used and affected by fatigue damage. Huge uncertainties arise from material properties, measurement noise, imperfect models, future anticipated loads and environmental conditions. These uncertainties are critical issues for accurate remaining useful life (RUL) prediction for engineering structures in service. Probabilistic fatigue prognosis considering various uncertainties is of great importance for structural safety. The objective of this study is to develop probabilistic fatigue life prediction models for metallic materials and composite materials. A fatigue model based on crack growth analysis and the equivalent initial flaw size concept is proposed for metallic materials. Following this, the developed model is extended to include structural geometry effects (notch effect), environmental effects (corroded specimens) and manufacturing effects (shot peening effects). Due to their inhomogeneity and anisotropy, the fatigue model suitable for metallic materials cannot be directly applied to composite materials. A composite fatigue life prediction model is proposed based on a mixed-mode delamination growth model and a stiffness degradation law. After the development of deterministic fatigue models of metallic and composite materials, a general probabilistic life prediction methodology is developed. The proposed methodology employs an efficient Inverse First-Order Reliability Method (IFORM) for uncertainty propagation in fatigue life prediction. An equivalent stress transformation has been developed to enhance computational efficiency under realistic random-amplitude loading. A systematic reliability-based maintenance optimization framework is proposed for fatigue risk management and mitigation of engineering structures.
Preference pulses and the win-stay, fix-and-sample model of choice.
Hachiga, Yosuke; Sakagami, Takayuki; Silberberg, Alan
2015-11-01
Two groups of six rats each were trained to respond to two levers for a food reinforcer. One group was trained on concurrent variable-ratio 20 extinction schedules of reinforcement. The second group was trained on a concurrent variable-interval 27-s extinction schedule. In both groups, lever-schedule assignments changed randomly following reinforcement; a light cued the lever providing the next reinforcer. In the next condition, the light cue was removed and reinforcer assignment strictly alternated between levers. The next two conditions redetermined, in order, the first two conditions. Preference pulses, defined as a tendency for relative response rate to decline to the just-reinforced alternative with time since reinforcement, only appeared during the extinction schedule. Although the pulse's functional form was well described by a reinforcer-induction equation, there was a large residual between actual data and a pulse-as-artifact simulation (McLean, Grace, Pitts, & Hughes, 2014) used to discern reinforcer-dependent contributions to pulsing. However, if that simulation was modified to include a win-stay tendency (a propensity to stay on the just-reinforced alternative), the residual was greatly reduced. Additional modifications of the parameter values of the pulse-as-artifact simulation enabled it to accommodate the present results as well as those it originally accommodated. In its revised form, this simulation was used to create a model that describes response runs to the preferred alternative as terminating probabilistically, and runs to the unpreferred alternative as punctate with occasional perseverative response runs. After reinforcement, choices are modeled as returning briefly to the lever location that had been just reinforced. This win-stay propensity is hypothesized as due to reinforcer induction. © Society for the Experimental Analysis of Behavior.
Zeng, Xiaohui; Li, Jianhe; Peng, Liubao; Wang, Yunhua; Tan, Chongqing; Chen, Gannong; Wan, Xiaomin; Lu, Qiong; Yi, Lidan
2014-01-01
Maintenance gefitinib significantly prolonged progression-free survival (PFS) compared with placebo in patients from eastern Asia with locally advanced/metastatic non-small-cell lung cancer (NSCLC) after four chemotherapeutic cycles (21 days per cycle) of first-line platinum-based combination chemotherapy without disease progression. The objective of the current study was to evaluate the cost-effectiveness of maintenance gefitinib therapy after four cycles of standard first-line platinum-based chemotherapy for patients with locally advanced or metastatic NSCLC with unknown EGFR mutations, from a Chinese health care system perspective. A semi-Markov model was designed to evaluate the cost-effectiveness of maintenance gefitinib treatment. Two-parameter Weibull and log-logistic distributions were fitted to the PFS and overall survival curves independently. One-way and probabilistic sensitivity analyses were conducted to assess the stability of the model. The base-case analysis suggested that maintenance gefitinib would increase benefits over a 1-, 3-, 6- or 10-year time horizon, with incremental costs of $184,829, $19,214, $19,328, and $21,308 per quality-adjusted life-year (QALY) gained, respectively. The most influential variable in the cost-effectiveness analysis was the utility of PFS plus rash, followed by the utility of PFS plus diarrhoea, the utility of progressed disease, the price of gefitinib, the cost of follow-up treatment in the progressed survival state, and the utility of PFS on oral therapy. The price of gefitinib is the most significant parameter that could reduce the incremental cost per QALY. Probabilistic sensitivity analysis indicated that the probability of maintenance gefitinib being cost-effective was zero under the willingness-to-pay (WTP) threshold of $16,349 (3 × per-capita gross domestic product of China). The sensitivity analyses all suggested that the model was robust. Maintenance gefitinib following first-line platinum-based chemotherapy for patients with locally advanced/metastatic NSCLC with unknown EGFR mutations is not cost-effective. Decreasing the price of gefitinib may be a preferable option for meeting wide treatment demand in China.
Uncertainty Estimation in Tsunami Initial Condition From Rapid Bayesian Finite Fault Modeling
NASA Astrophysics Data System (ADS)
Benavente, R. F.; Dettmer, J.; Cummins, P. R.; Urrutia, A.; Cienfuegos, R.
2017-12-01
It is well known that kinematic rupture models for a given earthquake can present discrepancies even when similar datasets are employed in the inversion process. While quantifying this variability can be critical when making early estimates of the earthquake and triggered tsunami impact, "most likely models" are normally used for this purpose. In this work, we quantify the uncertainty of the tsunami initial condition for the great Illapel earthquake (Mw = 8.3, 2015, Chile). We focus on utilizing data and inversion methods that are suitable to rapid source characterization yet provide meaningful and robust results. Rupture models from teleseismic body and surface waves as well as W-phase are derived and accompanied by Bayesian uncertainty estimates from linearized inversion under positivity constraints. We show that robust and consistent features about the rupture kinematics appear when working within this probabilistic framework. Moreover, by using static dislocation theory, we translate the probabilistic slip distributions into seafloor deformation which we interpret as a tsunami initial condition. After considering uncertainty, our probabilistic seafloor deformation models obtained from different data types appear consistent with each other providing meaningful results. We also show that selecting just a single "representative" solution from the ensemble of initial conditions for tsunami propagation may lead to overestimating information content in the data. Our results suggest that rapid, probabilistic rupture models can play a significant role during emergency response by providing robust information about the extent of the disaster.
Praveen, Paurush; Fröhlich, Holger
2013-01-01
Inferring regulatory networks from experimental data via probabilistic graphical models is a popular framework to gain insights into biological systems. However, the inherent noise in experimental data coupled with a limited sample size reduces the performance of network reverse engineering. Prior knowledge from existing sources of biological information can address this low signal to noise problem by biasing the network inference towards biologically plausible network structures. Although integrating various sources of information is desirable, their heterogeneous nature makes this task challenging. We propose two computational methods to incorporate various information sources into a probabilistic consensus structure prior to be used in graphical model inference. Our first model, called Latent Factor Model (LFM), assumes a high degree of correlation among external information sources and reconstructs a hidden variable as a common source in a Bayesian manner. The second model, a Noisy-OR, picks up the strongest support for an interaction among information sources in a probabilistic fashion. Our extensive computational studies on KEGG signaling pathways as well as on gene expression data from breast cancer and yeast heat shock response reveal that both approaches can significantly enhance the reconstruction accuracy of Bayesian Networks compared to other competing methods as well as to the situation without any prior. Our framework allows for using diverse information sources, like pathway databases, GO terms and protein domain data, etc. and is flexible enough to integrate new sources, if available. PMID:23826291
[The Probabilistic Efficiency Frontier: A Value Assessment of Treatment Options in Hepatitis C].
Mühlbacher, Axel C; Sadler, Andrew
2017-06-19
Background The German Institute for Quality and Efficiency in Health Care (IQWiG) recommends the concept of the efficiency frontier to assess health care interventions. The efficiency frontier supports regulatory decisions on reimbursement prices for the appropriate allocation of health care resources. To date, this cost-benefit assessment framework has only been applied on the basis of individual patient-relevant endpoints. This contradicts the reality of a multi-dimensional patient benefit. Objective The objective of this study was to illustrate the operationalization of multi-dimensional benefit, considering the uncertainty in clinical effects and preference data, in order to calculate the efficiency of different treatment options for hepatitis C (HCV). This case study shows how methodological challenges could be overcome in order to use the efficiency frontier for economic analysis and health care decision-making. Method The operationalization of patient benefit was carried out on several patient-relevant endpoints. Preference data from a discrete choice experiment (DCE) study and clinical data based on clinical trials, which reflected the patient and the clinical perspective, respectively, were used for the aggregation of an overall benefit score. A probabilistic efficiency frontier was constructed in a Monte Carlo simulation with 10,000 random draws. Patient-relevant endpoints were modeled with a beta distribution and preference data with a normal distribution. The assessment of overall benefit and costs provided information about the adequacy of the treatment prices. The parameter uncertainty was illustrated by the price-acceptability curve and the net monetary benefit. Results Based on the clinical and preference data in Germany, the interferon-free treatment options proved to be efficient at the current price level. The interferon-free therapies of the latest generation achieved a positive net cost-benefit. Within the decision model, these therapies showed a maximum overall benefit. Due to their high additional benefit and approved prices, the therapies lie above the extrapolated efficiency frontier, which suggests that these options have efficient reimbursement prices. Considering uncertainty, even a higher price would have resulted in a positive cost-benefit ratio. Conclusion IQWiG's efficiency frontier was used to assess the value of different treatment options in HCV. This study demonstrates that the probabilistic efficiency frontier, the price-acceptability curve and the net monetary benefit can contribute essential information to reimbursement decisions and price negotiations. © Georg Thieme Verlag KG Stuttgart · New York.
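A stripped-down version of the probabilistic value assessment described above can be sketched as a Monte Carlo loop: draw patient-relevant effects from beta distributions, draw DCE-derived preference weights from normal distributions, aggregate them to an overall benefit score, and summarize a net monetary benefit. The Python sketch below shows only this mechanical skeleton; the endpoints, weights, costs and willingness-to-pay value are invented and bear no relation to the HCV analysis itself.

```python
# Monte Carlo sketch of an overall benefit score and net monetary benefit.
import numpy as np

rng = np.random.default_rng(11)
n_draws = 10_000

# Two hypothetical patient-relevant endpoints for one therapy, beta-distributed.
endpoint_1 = rng.beta(95, 5, n_draws)      # probability-scale clinical effect
endpoint_2 = rng.beta(80, 20, n_draws)

# Hypothetical DCE-derived preference weights, normal and renormalized per draw.
w = np.clip(rng.normal([0.7, 0.3], [0.1, 0.05], (n_draws, 2)), 0, None)
w /= w.sum(axis=1, keepdims=True)

benefit = w[:, 0] * endpoint_1 + w[:, 1] * endpoint_2   # overall benefit score
cost, lam = 45_000.0, 80_000.0   # therapy cost and assumed value per benefit unit
nmb = lam * benefit - cost       # net monetary benefit per draw

print(f"mean overall benefit score: {benefit.mean():.3f}")
print(f"P(net monetary benefit > 0): {(nmb > 0).mean():.2%}")
```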
NASA Astrophysics Data System (ADS)
Sari, Dwi Ivayana; Hermanto, Didik
2017-08-01
This research is a developmental study of probabilistic thinking-oriented learning tools for probability material for ninth-grade students. The study aims to produce high-quality probabilistic thinking-oriented learning tools. The subjects were the IX-A students of MTs Model Bangkalan. The stages of this developmental research followed the 4-D development model, modified here to define, design and develop. The teaching and learning tools consist of a lesson plan, student worksheets, teaching media and a student achievement test. The research instruments were a learning tools validation sheet, a teachers' activities sheet, a students' activities sheet, a student response questionnaire and the student achievement test. The results from these instruments were analyzed descriptively to answer the research objectives. The outcome was a set of probabilistic thinking-oriented learning tools for teaching probability to ninth-grade students that was judged valid. After the teaching and learning tools were revised based on the validation, the classroom experiment showed that the teachers' classroom management was effective, the students' activities were good, the students' responses to the learning tools were positive, and the achievement test met the validity, sensitivity and reliability criteria. In summary, these teaching and learning tools can be used by teachers to teach probability and to develop students' probabilistic thinking.
Non-Gaussian spatiotemporal simulation of multisite daily precipitation: downscaling framework
NASA Astrophysics Data System (ADS)
Ben Alaya, M. A.; Ouarda, T. B. M. J.; Chebana, F.
2018-01-01
Probabilistic regression approaches for downscaling daily precipitation are very useful. They provide the whole conditional distribution at each forecast step to better represent the temporal variability. The question addressed in this paper is: how to simulate spatiotemporal characteristics of multisite daily precipitation from probabilistic regression models? Recent publications point out the complexity of multisite properties of daily precipitation and highlight the need for a flexible non-Gaussian tool. This work proposes a reasonable compromise between simplicity and flexibility, avoiding model misspecification. A suitable nonparametric bootstrapping (NB) technique is adopted. A downscaling model that merges a vector generalized linear model (VGLM, as the probabilistic regression tool) with the proposed bootstrapping technique is introduced to simulate realistic multisite precipitation series. The model is applied to data sets from the southern part of the province of Quebec, Canada. It is shown that the model is capable of reproducing both at-site properties and the spatial structure of daily precipitation. Results indicate the superiority of the proposed NB technique over a multivariate autoregressive Gaussian framework (i.e., a Gaussian copula).
Modeling the Effect of Reward Amount on Probability Discounting
ERIC Educational Resources Information Center
Myerson, Joel; Green, Leonard; Morris, Joshua
2011-01-01
The present study with college students examined the effect of amount on the discounting of probabilistic monetary rewards. A hyperboloid function accurately described the discounting of hypothetical rewards ranging in amount from $20 to $10,000,000. The degree of discounting increased continuously with amount of probabilistic reward. This effect…
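The hyperboloid function referred to above is commonly written as V = A / (1 + hθ)^s, where θ = (1 − p)/p is the odds against receiving the reward and h and s govern the degree and form of discounting. The sketch below fits that function to synthetic subjective values for a single amount; the parameter values and noise level are invented.

```python
# Fitting a hyperboloid probability-discounting function to synthetic data.
import numpy as np
from scipy.optimize import curve_fit

A = 100.0                                    # nominal reward amount
p = np.array([0.95, 0.8, 0.6, 0.4, 0.25, 0.1])
theta = (1 - p) / p                          # odds against receiving the reward

def hyperboloid(theta, h, s):
    return A / (1 + h * theta) ** s

rng = np.random.default_rng(5)
v_obs = hyperboloid(theta, h=1.2, s=0.8) + rng.normal(0, 2.0, theta.size)

(h_hat, s_hat), _ = curve_fit(hyperboloid, theta, v_obs, p0=(1.0, 1.0))
print(f"estimated h = {h_hat:.2f}, s = {s_hat:.2f}")
```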
Probabilistic Priority Message Checking Modeling Based on Controller Area Networks
NASA Astrophysics Data System (ADS)
Lin, Cheng-Min
Although the probabilistic model checking tool called PRISM has been applied in many communication systems, such as wireless local area networks, Bluetooth, and ZigBee, the technique has not been used for the controller area network (CAN). In this paper, we use PRISM to model the mechanism of priority messages for CAN because the mechanism has allowed CAN to become the leader in serial communication for automotive and industrial control. Through modeling CAN, it is easy to analyze the characteristics of CAN for further improving the security and efficiency of automobiles. The Markov chain model helps us to model the behaviour of priority messages.
Probabilistic Modeling of Intracranial Pressure Effects on Optic Nerve Biomechanics
NASA Technical Reports Server (NTRS)
Ethier, C. R.; Feola, Andrew J.; Raykin, Julia; Myers, Jerry G.; Nelson, Emily S.; Samuels, Brian C.
2016-01-01
Altered intracranial pressure (ICP) is implicated in several ocular conditions: papilledema, glaucoma, and Visual Impairment and Intracranial Pressure (VIIP) syndrome. The biomechanical effects of altered ICP on optic nerve head (ONH) tissues in these conditions are uncertain but likely important. We have quantified ICP-induced deformations of ONH tissues, using finite element (FE) modeling and probabilistic modeling (Latin Hypercube Simulations, LHS) to consider a range of tissue properties and relevant pressures.
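Latin Hypercube Simulation in this setting means drawing stratified samples of the uncertain tissue properties and pressures and pushing each sample through the finite element model. The sketch below shows the sampling-and-propagation pattern with scipy's LHS sampler; the FE solve is replaced by a hypothetical closed-form surrogate and the parameter ranges are illustrative, not the study's values.

```python
# Latin Hypercube Sampling over uncertain inputs, propagated through a stand-in model.
import numpy as np
from scipy.stats import qmc

sampler = qmc.LatinHypercube(d=3, seed=0)
unit = sampler.random(n=200)

# Uncertain inputs: sclera modulus (MPa), lamina modulus (MPa), ICP (mmHg).
lo, hi = [1.0, 0.1, 5.0], [5.0, 1.0, 25.0]
X = qmc.scale(unit, lo, hi)

def peak_strain_surrogate(sclera_E, lamina_E, icp):
    """Stand-in for the FE model: peak ONH strain (%), purely illustrative."""
    return 2.0 * icp / (10.0 * sclera_E + 5.0 * lamina_E)

strain = peak_strain_surrogate(X[:, 0], X[:, 1], X[:, 2])
print(f"median peak strain: {np.median(strain):.2f}%")
print(f"95th percentile:    {np.percentile(strain, 95):.2f}%")
```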
Geothermal probabilistic cost study
NASA Technical Reports Server (NTRS)
Orren, L. H.; Ziman, G. M.; Jones, S. C.; Lee, T. K.; Noll, R.; Wilde, L.; Sadanand, V.
1981-01-01
A tool to quantify the risks of geothermal projects, the Geothermal Probabilistic Cost Model (GPCM), is presented. The GPCM was used to evaluate a geothermal reservoir for a binary-cycle electric plant at Heber, California. Three institutional aspects of geothermal risk that can shift the risk among different agents were analyzed: the leasing of geothermal land, contracting between the producer and the user of the geothermal heat, and insurance against faulty performance.
2017-03-13
support of airborne laser designator use during test and training exercises on military ranges. The initial MATILDA tool, MATILDA PRO Version-1.6.1 ... was based on the 2007 PRA model developed to perform range safety clearances for the UK Thermal Imaging Airborne Laser Designator (TIALD) system ... AFRL Technical Reports. This Technical Report, designated Part I, contains documentation of the computational procedures for probabilistic fault
Target Coverage in Wireless Sensor Networks with Probabilistic Sensors
Shan, Anxing; Xu, Xianghua; Cheng, Zongmao
2016-01-01
Sensing coverage is a fundamental problem in wireless sensor networks (WSNs), which has attracted considerable attention. Conventional research on this topic focuses on the 0/1 coverage model, which is only a coarse approximation to the practical sensing model. In this paper, we study the target coverage problem, where the objective is to find the least number of sensor nodes in randomly-deployed WSNs based on the probabilistic sensing model. We analyze the joint detection probability of target with multiple sensors. Based on the theoretical analysis of the detection probability, we formulate the minimum ϵ-detection coverage problem. We prove that the minimum ϵ-detection coverage problem is NP-hard and present an approximation algorithm called the Probabilistic Sensor Coverage Algorithm (PSCA) with provable approximation ratios. To evaluate our design, we analyze the performance of PSCA theoretically and also perform extensive simulations to demonstrate the effectiveness of our proposed algorithm. PMID:27618902
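Under the probabilistic sensing model discussed above, each sensor detects the target with some probability that decays with distance, and independent sensors combine as one minus the product of their miss probabilities. A small sketch of that joint detection calculation follows; the exponential decay model and the parameter values are assumptions made for illustration, not the paper's exact sensing model.

```python
# Joint detection probability for independent probabilistic sensors.
import numpy as np

def detection_prob(distances, lam=0.5):
    """Per-sensor detection probability, assumed to decay exponentially with distance."""
    return np.exp(-lam * np.asarray(distances, dtype=float))

def joint_detection(distances, lam=0.5):
    """Probability that at least one sensor detects the target."""
    p = detection_prob(distances, lam)
    return 1.0 - np.prod(1.0 - p)

# A target is epsilon-covered if the joint probability reaches the threshold.
dists = [1.2, 2.5, 3.0]
print(f"joint detection probability: {joint_detection(dists):.3f}")
print("epsilon-covered (eps = 0.9):", joint_detection(dists) >= 0.9)
```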
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peace, Gerald; Goering, Timothy James; Miller, Mark Laverne
2007-01-01
A probabilistic performance assessment has been conducted to evaluate the fate and transport of radionuclides (americium-241, cesium-137, cobalt-60, plutonium-238, plutonium-239, radium-226, radon-222, strontium-90, thorium-232, tritium, uranium-238), heavy metals (lead and cadmium), and volatile organic compounds (VOCs) at the Mixed Waste Landfill (MWL). Probabilistic analyses were performed to quantify uncertainties inherent in the system and models for a 1,000-year period, and sensitivity analyses were performed to identify parameters and processes that were most important to the simulated performance metrics. Comparisons between simulated results and measured values at the MWL were made to gain confidence in the models and perform calibrations whenmore » data were available. In addition, long-term monitoring requirements and triggers were recommended based on the results of the quantified uncertainty and sensitivity analyses.« less
Bröder, A
2000-09-01
The boundedly rational "Take-The-Best" heuristic (TTB) was proposed by G. Gigerenzer, U. Hoffrage, and H. Kleinbölting (1991) as a model of fast and frugal probabilistic inferences. Although the simple lexicographic rule proved to be successful in computer simulations, direct empirical demonstrations of its adequacy as a psychological model are lacking because of several methodological problems. In four experiments with a total of 210 participants, this question was addressed. Whereas Experiment 1 showed that TTB is not valid as a universal hypothesis about probabilistic inferences, up to 28% of participants in Experiment 2 and 53% of participants in Experiment 3 were classified as TTB users. Experiment 4 revealed that investment costs for information seem to be a relevant factor leading participants to switch to a noncompensatory TTB strategy. The observed individual differences in strategy use imply the recommendation of an idiographic approach to decision-making research.
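Take-The-Best itself is easy to state: inspect cues in descending order of validity, decide as soon as one cue discriminates between the options, and guess if none does. The sketch below implements that lexicographic rule for a single paired comparison; the cue values and validities are invented.

```python
# Take-The-Best lexicographic choice rule for a paired comparison.
import random

def take_the_best(cues_a, cues_b, validities):
    """Return 'A', 'B', or a random guess; cues are dicts of 0/1 values."""
    for cue, _ in sorted(validities.items(), key=lambda kv: kv[1], reverse=True):
        if cues_a[cue] != cues_b[cue]:
            return "A" if cues_a[cue] > cues_b[cue] else "B"
    return random.choice(["A", "B"])     # no cue discriminates: guess

validities = {"capital": 0.95, "airport": 0.80, "team": 0.70}
city_a = {"capital": 0, "airport": 1, "team": 1}
city_b = {"capital": 0, "airport": 0, "team": 1}
print("TTB infers the larger city is:", take_the_best(city_a, city_b, validities))
```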
NASA Technical Reports Server (NTRS)
1991-01-01
The technical effort and computer code enhancements performed during the sixth year of the Probabilistic Structural Analysis Methods program are summarized. Various capabilities for probabilistically combining structural response and structural resistance to compute component reliability are described. A library of structural resistance models was implemented in the Numerical Evaluations of Stochastic Structures Under Stress (NESSUS) code that includes fatigue, fracture, creep, multi-factor interaction, and other important effects. In addition, a user interface was developed for user-defined resistance models. An accurate and efficient reliability method was developed and was successfully implemented in the NESSUS code to compute component reliability based on user-selected response and resistance models. A risk module was developed to compute component risk with respect to cost, performance, or user-defined criteria. The new component risk assessment capabilities were validated and demonstrated using several examples. Various supporting methodologies were also developed in support of component risk assessment.
A geomorphic approach to 100-year floodplain mapping for the Conterminous United States
NASA Astrophysics Data System (ADS)
Jafarzadegan, Keighobad; Merwade, Venkatesh; Saksena, Siddharth
2018-06-01
Floodplain mapping using hydrodynamic models is difficult in data-scarce regions. Additionally, using hydrodynamic models to map floodplains over large stream networks can be computationally challenging. Some of these limitations of floodplain mapping using hydrodynamic modeling can be overcome by developing computationally efficient statistical methods to identify floodplains in large and ungauged watersheds using publicly available data. This paper proposes a geomorphic model to generate probabilistic 100-year floodplain maps for the Conterminous United States (CONUS). The proposed model first categorizes the watersheds in the CONUS into three classes based on the height of the water surface corresponding to the 100-year flood above the streambed. Next, the probability that any watershed in the CONUS belongs to one of these three classes is computed through supervised classification using watershed characteristics related to topography, hydrography, land use and climate. The result of this classification is then fed into a probabilistic threshold binary classifier (PTBC) to generate the probabilistic 100-year floodplain maps. The supervised classification algorithm is trained using the 100-year Flood Insurance Rate Maps (FIRM) from the U.S. Federal Emergency Management Agency (FEMA). FEMA FIRMs are also used to validate the performance of the proposed model in areas not included in the training. Additionally, HEC-RAS-generated flood inundation extents are used to validate the model performance at fifteen sites that lack FEMA maps. Validation results show that the probabilistic 100-year floodplain maps generated by the proposed model match well with both the FEMA and HEC-RAS maps. On average, the error of predicted flood extents is around 14% across the CONUS. The high accuracy of the validation results shows the reliability of the geomorphic model as an alternative approach for fast and cost-effective delineation of 100-year floodplains for the CONUS.
Matzke, Nicholas J
2014-11-01
Founder-event speciation, where a rare jump dispersal event founds a new genetically isolated lineage, has long been considered crucial by many historical biogeographers, but its importance is disputed within the vicariance school. Probabilistic modeling of geographic range evolution creates the potential to test different biogeographical models against data using standard statistical model choice procedures, as long as multiple models are available. I re-implement the Dispersal-Extinction-Cladogenesis (DEC) model of LAGRANGE in the R package BioGeoBEARS, and modify it to create a new model, DEC + J, which adds founder-event speciation, the importance of which is governed by a new free parameter, [Formula: see text]. The identifiability of DEC and DEC + J is tested on data sets simulated under a wide range of macroevolutionary models where geography evolves jointly with lineage birth/death events. The results confirm that DEC and DEC + J are identifiable even though these models ignore the fact that molecular phylogenies are missing many cladogenesis and extinction events. The simulations also indicate that DEC will have substantially increased errors in ancestral range estimation and parameter inference when the true model includes + J. DEC and DEC + J are compared on 13 empirical data sets drawn from studies of island clades. Likelihood-ratio tests indicate that all clades reject DEC, and AICc model weights show large to overwhelming support for DEC + J, for the first time verifying the importance of founder-event speciation in island clades via statistical model choice. Under DEC + J, ancestral nodes are usually estimated to have ranges occupying only one island, rather than the widespread ancestors often favored by DEC. These results indicate that the assumptions of historical biogeography models can have large impacts on inference and require testing and comparison with statistical methods. © The Author(s) 2014. Published by Oxford University Press, on behalf of the Society of Systematic Biologists. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Disruption of the Right Temporoparietal Junction Impairs Probabilistic Belief Updating.
Mengotti, Paola; Dombert, Pascasie L; Fink, Gereon R; Vossel, Simone
2017-05-31
Generating and updating probabilistic models of the environment is a fundamental modus operandi of the human brain. Although crucial for various cognitive functions, the neural mechanisms of these inference processes remain to be elucidated. Here, we show the causal involvement of the right temporoparietal junction (rTPJ) in updating probabilistic beliefs and we provide new insights into the chronometry of the process by combining online transcranial magnetic stimulation (TMS) with computational modeling of behavioral responses. Female and male participants performed a modified location-cueing paradigm, where false information about the percentage of cue validity (%CV) was provided in half of the experimental blocks to prompt updating of prior expectations. Online double-pulse TMS over rTPJ 300 ms (but not 50 ms) after target appearance selectively decreased participants' updating of false prior beliefs concerning %CV, reflected in a decreased learning rate of a Rescorla-Wagner model. Online TMS over rTPJ also impacted on participants' explicit beliefs, causing them to overestimate %CV. These results confirm the involvement of rTPJ in updating of probabilistic beliefs, thereby advancing our understanding of this area's function during cognitive processing. SIGNIFICANCE STATEMENT Contemporary views propose that the brain maintains probabilistic models of the world to minimize surprise about sensory inputs. Here, we provide evidence that the right temporoparietal junction (rTPJ) is causally involved in this process. Because neuroimaging has suggested that rTPJ is implicated in divergent cognitive domains, the demonstration of an involvement in updating internal models provides a novel unifying explanation for these findings. We used computational modeling to characterize how participants change their beliefs after new observations. By interfering with rTPJ activity through online transcranial magnetic stimulation, we showed that participants were less able to update prior beliefs with TMS delivered at 300 ms after target onset. Copyright © 2017 the authors 0270-6474/17/375419-10$15.00/0.
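The Rescorla-Wagner model used to quantify belief updating in this paradigm nudges the current estimate of cue validity toward each observed outcome by a learning rate α, the parameter that TMS over rTPJ was found to reduce. The sketch below runs that update on a synthetic trial sequence and contrasts a larger and a smaller learning rate; all numbers are illustrative, not fitted to the study's data.

```python
# Rescorla-Wagner updating of a belief about cue validity (%CV).
import numpy as np

rng = np.random.default_rng(9)

def rescorla_wagner(outcomes, alpha, v0=0.5):
    """Trial-by-trial estimate of P(valid cue) given 0/1 validity outcomes."""
    v, trace = v0, []
    for o in outcomes:
        v = v + alpha * (o - v)          # prediction-error update
        trace.append(v)
    return np.array(trace)

true_cv = 0.8                                   # actual percentage of valid cues
outcomes = (rng.random(60) < true_cv).astype(float)   # 60 trials, 1 = valid cue

for alpha in (0.3, 0.05):                       # intact vs. "disrupted" updating
    est = rescorla_wagner(outcomes, alpha)
    print(f"alpha = {alpha:.2f}: final belief about %CV = {est[-1]:.2f}")
```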
Exploring the calibration of a wind forecast ensemble for energy applications
NASA Astrophysics Data System (ADS)
Heppelmann, Tobias; Ben Bouallegue, Zied; Theis, Susanne
2015-04-01
In the German research project EWeLiNE, Deutscher Wetterdienst (DWD) and the Fraunhofer Institute for Wind Energy and Energy System Technology (IWES) are collaborating with three German Transmission System Operators (TSO) in order to provide the TSOs with improved probabilistic power forecasts. Probabilistic power forecasts are derived from probabilistic weather forecasts, themselves derived from ensemble prediction systems (EPS). Since the considered raw ensemble wind forecasts suffer from underdispersiveness and bias, calibration methods are developed for the correction of the model bias and the ensemble spread bias. The overall aim is to improve the ensemble forecasts such that the uncertainty of the possible weather development is depicted by the ensemble spread from the first forecast hours. Additionally, the ensemble members after calibration should remain physically consistent scenarios. We focus on probabilistic hourly wind forecasts with a horizon of 21 h delivered by the convection-permitting high-resolution ensemble system COSMO-DE-EPS, which became operational in 2012 at DWD. The ensemble consists of 20 ensemble members driven by four different global models. The model area covers the whole of Germany and parts of Central Europe with a horizontal resolution of 2.8 km and a vertical resolution of 50 model levels. For verification we use wind mast measurements at around 100 m height, which corresponds to the hub height of the wind turbines in wind farms within the model area. Calibration of the ensemble forecasts can be performed by different statistical methods applied to the raw ensemble output. Here, we explore local bivariate Ensemble Model Output Statistics at individual sites and quantile regression with different predictors. Applying these methods, we already show an improvement of the ensemble wind forecasts from COSMO-DE-EPS for energy applications. In addition, an ensemble copula coupling approach transfers the time-dependencies of the raw ensemble to the calibrated ensemble. The calibrated wind forecasts are evaluated first with univariate probabilistic scores and additionally with diagnostics of wind ramps in order to assess the time-consistency of the calibrated ensemble members.
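As a minimal sketch of the idea behind this kind of ensemble calibration (not the EWeLiNE implementation), the snippet below removes the mean bias of a toy wind ensemble by regression on the ensemble mean and then inflates the spread to match the residual error variance; the synthetic data and the two-step recipe are assumptions standing in for the EMOS and quantile-regression methods named above.

```python
import numpy as np

# Toy example: calibrate an underdispersive, biased wind ensemble against observations.
# Assumed data shapes: ens[t, m] = member m of forecast t, obs[t] = verifying wind speed.
rng = np.random.default_rng(0)
truth = 8 + 2 * rng.standard_normal(500)
ens = truth[:, None] + 1.0 + 0.4 * rng.standard_normal((500, 20))  # biased, too narrow
obs = truth + 1.2 * rng.standard_normal(500)

ens_mean = ens.mean(axis=1)

# 1) Remove the mean bias with a linear regression of the observations on the ensemble mean.
b, a = np.polyfit(ens_mean, obs, deg=1)
corrected_mean = a + b * ens_mean

# 2) Widen the spread so the ensemble variance matches the residual error variance.
residual_std = np.std(obs - corrected_mean)
raw_spread = ens.std(axis=1, ddof=1).mean()
inflation = residual_std / raw_spread
calibrated = corrected_mean[:, None] + inflation * (ens - ens_mean[:, None])

print(f"bias before/after:   {np.mean(ens_mean - obs):.2f} / {np.mean(calibrated.mean(axis=1) - obs):.2f}")
print(f"spread before/after: {raw_spread:.2f} / {calibrated.std(axis=1, ddof=1).mean():.2f}")
```

Because each member is shifted and scaled rather than resampled, the rank structure of the raw ensemble is preserved, which is the intuition behind the ensemble copula coupling step mentioned in the abstract.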
Instruction in information structuring improves Bayesian judgment in intelligence analysts.
Mandel, David R
2015-01-01
An experiment was conducted to test the effectiveness of brief instruction in information structuring (i.e., representing and integrating information) for improving the coherence of probability judgments and binary choices among intelligence analysts. Forty-three analysts were presented with comparable sets of Bayesian judgment problems before and immediately after instruction. After instruction, analysts' probability judgments were more coherent (i.e., more additive and compliant with Bayes theorem). Instruction also improved the coherence of binary choices regarding category membership: after instruction, subjects were more likely to invariably choose the category to which they assigned the higher probability of a target's membership. The research provides a rare example of evidence-based validation of effectiveness in instruction to improve the statistical assessment skills of intelligence analysts. Such instruction could also be used to improve the assessment quality of other types of experts who are required to integrate statistical information or make probabilistic assessments.
Quick probabilistic binary image matching: changing the rules of the game
NASA Astrophysics Data System (ADS)
Mustafa, Adnan A. Y.
2016-09-01
A Probabilistic Matching Model for Binary Images (PMMBI) is presented that predicts the probability of matching binary images with any level of similarity. The model relates the number of mappings, the amount of similarity between the images and the detection confidence. We show the advantage of using a probabilistic approach to matching in similarity space as opposed to a linear search in size space. With PMMBI a complete model is available to predict the quick detection of dissimilar binary images. Furthermore, the similarity between the images can be measured to a good degree if the images are highly similar. PMMBI shows that only a few pixels need to be compared to detect dissimilarity between images, as low as two pixels in some cases. PMMBI is image size invariant; images of any size can be matched at the same quick speed. Near-duplicate images can also be detected without much difficulty. We present tests on real images that show the prediction accuracy of the model.
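The core intuition, that a handful of randomly probed pixels is usually enough to reject a non-matching binary image, can be sketched as follows; this is an illustrative re-implementation of the idea rather than the PMMBI model itself, and the probe budget and mismatch threshold are assumptions.

```python
import random

def probably_dissimilar(img_a, img_b, max_probes=50, mismatches_needed=2):
    """Probe random pixel positions; declare dissimilarity once enough mismatches are seen.

    img_a, img_b: equal-sized 2D lists of 0/1 values.
    Returns (is_dissimilar, probes_used).
    """
    height, width = len(img_a), len(img_a[0])
    mismatches = 0
    for probe in range(1, max_probes + 1):
        y, x = random.randrange(height), random.randrange(width)
        if img_a[y][x] != img_b[y][x]:
            mismatches += 1
            if mismatches >= mismatches_needed:
                return True, probe
    return False, max_probes

# Two 100x100 binary images differing in half of their pixels:
a = [[0] * 100 for _ in range(100)]
b = [[1 if x < 50 else 0 for x in range(100)] for _ in range(100)]
print(probably_dissimilar(a, b))   # typically detects dissimilarity after a few probes
```

The cost depends only on the number of probes, which is why this style of matching is image-size invariant.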
Probabilistic analysis for fatigue strength degradation of materials
NASA Technical Reports Server (NTRS)
Royce, Lola
1989-01-01
This report presents the results of the first year of a research program conducted for NASA-LeRC by the University of Texas at San Antonio. The research included development of methodology that provides a probabilistic treatment of lifetime prediction of structural components of aerospace propulsion systems subjected to fatigue. Material strength degradation models, based on primitive variables, include both a fatigue strength reduction model and a fatigue crack growth model. Linear elastic fracture mechanics is utilized in the latter model. Probabilistic analysis is based on simulation, and both maximum entropy and maximum penalized likelihood methods are used for the generation of probability density functions. The resulting constitutive relationships are included in several computer programs, RANDOM2, RANDOM3, and RANDOM4. These programs determine the random lifetime of an engine component, in mechanical load cycles, to reach a critical fatigue strength or crack size. The material considered was a cast nickel base superalloy, one typical of those used in the Space Shuttle Main Engine.
Grammaticality, Acceptability, and Probability: A Probabilistic View of Linguistic Knowledge.
Lau, Jey Han; Clark, Alexander; Lappin, Shalom
2017-07-01
The question of whether humans represent grammatical knowledge as a binary condition on membership in a set of well-formed sentences, or as a probabilistic property has been the subject of debate among linguists, psychologists, and cognitive scientists for many decades. Acceptability judgments present a serious problem for both classical binary and probabilistic theories of grammaticality. These judgements are gradient in nature, and so cannot be directly accommodated in a binary formal grammar. However, it is also not possible to simply reduce acceptability to probability. The acceptability of a sentence is not the same as the likelihood of its occurrence, which is, in part, determined by factors like sentence length and lexical frequency. In this paper, we present the results of a set of large-scale experiments using crowd-sourced acceptability judgments that demonstrate gradience to be a pervasive feature in acceptability judgments. We then show how one can predict acceptability judgments on the basis of probability by augmenting probabilistic language models with an acceptability measure. This is a function that normalizes probability values to eliminate the confounding factors of length and lexical frequency. We describe a sequence of modeling experiments with unsupervised language models drawn from state-of-the-art machine learning methods in natural language processing. Several of these models achieve very encouraging levels of accuracy in the acceptability prediction task, as measured by the correlation between the acceptability measure scores and mean human acceptability values. We consider the relevance of these results to the debate on the nature of grammatical competence, and we argue that they support the view that linguistic knowledge can be intrinsically probabilistic. Copyright © 2016 Cognitive Science Society, Inc.
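One published measure in this family is SLOR, which normalizes a sentence's language-model log probability by sentence length and by the log probability its words receive from a unigram model alone. The sketch below illustrates that normalization; the inputs to the scoring function are hypothetical stand-ins rather than an API from the paper.

```python
def slor(sentence_logprob, unigram_logprobs):
    """Syntactic log-odds ratio: length-normalized LM log prob minus unigram log prob.

    sentence_logprob: log P_LM(sentence) from some language model (assumed given).
    unigram_logprobs: list of log P_unigram(w) for each word in the sentence.
    """
    n = len(unigram_logprobs)
    return (sentence_logprob - sum(unigram_logprobs)) / n

# Toy numbers: a short sentence of frequent words vs. a longer sentence of rarer words.
print(slor(-12.0, [-2.0, -3.0, -2.5]))        # -1.5
print(slor(-30.0, [-6.0, -7.0, -5.5, -6.5]))  # -1.25: length and rarity no longer dominate
```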
Knoops, Paul G M; Borghi, Alessandro; Ruggiero, Federica; Badiali, Giovanni; Bianchi, Alberto; Marchetti, Claudio; Rodriguez-Florez, Naiara; Breakey, Richard W F; Jeelani, Owase; Dunaway, David J; Schievano, Silvia
2018-01-01
Repositioning of the maxilla in orthognathic surgery is carried out for functional and aesthetic purposes. Pre-surgical planning tools can predict 3D facial appearance by computing the response of the soft tissue to the changes to the underlying skeleton. The clinical use of commercial prediction software remains controversial, likely due to the deterministic nature of these computational predictions. A novel probabilistic finite element model (FEM) for the prediction of postoperative facial soft tissues is proposed in this paper. A probabilistic FEM was developed and validated on a cohort of eight patients who underwent maxillary repositioning and had pre- and postoperative cone beam computed tomography (CBCT) scans taken. Firstly, a variables correlation assessed various modelling parameters. Secondly, a design of experiments (DOE) provided a range of potential outcomes based on uniformly distributed input parameters, followed by an optimisation. Lastly, the second DOE iteration provided optimised predictions with a probability range. A range of 3D predictions was obtained using the probabilistic FEM and validated using reconstructed soft tissue surfaces from the postoperative CBCT data. The predictions in the nose and upper lip areas accurately include the true postoperative position, whereas the prediction under-estimates the position of the cheeks and lower lip. A probabilistic FEM has been developed and validated for the prediction of the facial appearance following orthognathic surgery. This method shows how inaccuracies in the modelling and uncertainties in executing surgical planning influence the soft tissue prediction and it provides a range of predictions including a minimum and maximum, which may be helpful for patients in understanding the impact of surgery on the face.
NASA Astrophysics Data System (ADS)
Moncoulon, D.; Labat, D.; Ardon, J.; Onfroy, T.; Leblois, E.; Poulard, C.; Aji, S.; Rémy, A.; Quantin, A.
2013-07-01
The analysis of flood exposure at a national scale for the French insurance market must combine the generation of a probabilistic event set of all possible, but not yet observed, flood situations with hazard and damage modeling. In this study, hazard and damage models are calibrated on a 1995-2012 historical event set, both for hazard results (river flow, flooded areas) and loss estimations. Thus, uncertainties in the deterministic estimation of a single event loss are known before simulating a probabilistic event set. To take into account at least 90% of the insured flood losses, the probabilistic event set must combine river overflow (small and large catchments) with the surface runoff due to heavy rainfall on the slopes of the watershed. Indeed, internal studies of the CCR claims database have shown that approximately 45% of the insured flood losses are located inside the floodplains and 45% outside; the remaining 10% are due to sea-surge floods and groundwater rise. In this approach, two independent probabilistic methods are combined to create a single flood loss distribution: generation of synthetic river flows based on the historical records of the river gauge network and generation of synthetic rain fields on small catchments, calibrated on the 1958-2010 Météo-France rain database SAFRAN. All the events in the probabilistic event sets are simulated with the deterministic model. This hazard and damage distribution is used to simulate the flood losses at the national scale for an insurance company (MACIF) and to generate flood areas associated with hazard return periods. The flood maps concern river overflow and surface water runoff. Validation of these maps is conducted by comparison with address-located claims data on a small catchment (the downstream Argens).
Stochastic Simulation and Forecast of Hydrologic Time Series Based on Probabilistic Chaos Expansion
NASA Astrophysics Data System (ADS)
Li, Z.; Ghaith, M.
2017-12-01
Hydrological processes are characterized by many complex features, such as nonlinearity, dynamics and uncertainty. How to quantify and address such complexities and uncertainties has been a challenging task for water engineers and managers for decades. To support robust uncertainty analysis, an innovative approach for the stochastic simulation and forecast of hydrologic time series is developed in this study. Probabilistic Chaos Expansions (PCEs) are established through probabilistic collocation to tackle uncertainties associated with the parameters of traditional hydrological models. The uncertainties are quantified in model outputs as Hermite polynomials with regard to standard normal random variables. Subsequently, multivariate analysis techniques are used to analyze the complex nonlinear relationships between meteorological inputs (e.g., temperature, precipitation, evapotranspiration, etc.) and the coefficients of the Hermite polynomials. With the established relationships between model inputs and PCE coefficients, forecasts of hydrologic time series can be generated and the uncertainties in the future time series can be further tackled. The proposed approach is demonstrated using a case study in China and is compared to a traditional stochastic simulation technique, the Markov-Chain Monte-Carlo (MCMC) method. Results show that the proposed approach can serve as a reliable proxy to complicated hydrological models. It can provide probabilistic forecasting in a more computationally efficient manner, compared to the traditional MCMC method. This work provides technical support for addressing uncertainties associated with hydrological modeling and for enhancing the reliability of hydrological modeling results. Applications of the developed approach can be extended to many other complicated geophysical and environmental modeling systems to support the associated uncertainty quantification and risk analysis.
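A generic one-dimensional sketch of the polynomial chaos idea is given below: a model output is expanded in probabilists' Hermite polynomials of a standard normal variable, the coefficients are fitted by least-squares collocation, and the cheap surrogate then replaces the model in Monte Carlo analysis. The toy model function, polynomial degree and collocation points are assumptions; the authors' multivariate, forecast-oriented implementation is more involved.

```python
import numpy as np
from numpy.polynomial.hermite_e import hermevander

def model(xi):
    # Stand-in for an expensive hydrological model driven by one uncertain parameter.
    return np.exp(0.5 * xi) + 0.1 * xi**2

degree = 5
xi_colloc = np.linspace(-3, 3, 30)                  # collocation points in standard-normal space
basis = hermevander(xi_colloc, degree)              # He_0..He_5 evaluated at the points
coeffs, *_ = np.linalg.lstsq(basis, model(xi_colloc), rcond=None)

# The cheap PCE surrogate can now replace the model in Monte Carlo uncertainty analysis.
xi_mc = np.random.default_rng(1).standard_normal(100_000)
surrogate = hermevander(xi_mc, degree) @ coeffs
print(f"surrogate mean/std: {surrogate.mean():.3f} / {surrogate.std():.3f}")
print(f"model     mean/std: {model(xi_mc).mean():.3f} / {model(xi_mc).std():.3f}")
```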
Effects of varying the step particle distribution on a probabilistic transport model
NASA Astrophysics Data System (ADS)
Bouzat, S.; Farengo, R.
2005-12-01
The consequences of varying the step particle distribution on a probabilistic transport model, which captures the basic features of transport in plasmas and was recently introduced in Ref. 1 [B. Ph. van Milligen et al., Phys. Plasmas 11, 2272 (2004)], are studied. Different superdiffusive transport mechanisms generated by a family of distributions with algebraic decays (Tsallis distributions) are considered. It is observed that the possibility of changing the superdiffusive transport mechanism improves the flexibility of the model for describing different situations. The use of the model to describe the low (L) and high (H) confinement modes is also analyzed.
Quan-Hoang, Vuong
2016-01-01
Background: Patients have to acquire information to support their decision when choosing a suitable healthcare provider. But in developing countries like Vietnam, accessibility issues remain an obstacle, thus adversely affecting both the quality and the costliness of healthcare information. Vietnamese patients use sources from both health professionals and friends/relatives, especially when the quality of cheaper Internet-based sources still appears questionable. The search for information from both professionals and friends/relatives incurs some cost, which can be viewed as low or high depending on the accessibility of the sources. These views potentially affect patients’ choices. Aim and Objectives: To investigate the effects of medical/health services information sources on the perceived expensiveness of patients’ labor costs. Two related objectives are a) establishing empirical relations between accessibility to sources and perceived expensiveness; and b) estimating probabilistic trends in the probabilities of perceived expensiveness. Results: There is evidence of established relations among the variables “Convexp” and “Convrel” (all p’s < 0.01), indicating that both information sources (experts and friends/relatives) influence patients’ perception of information expensiveness. Use of the expert source tends to increase the probability of perceived expensiveness. Conclusion: a) Probabilistic trends show that Vietnamese patients have a propensity to value healthcare information highly and do not see it as “expensive”; b) The majority of Vietnamese households still take non-professional advice at their own risk; c) There is more for the public healthcare information system to do to reduce the costliness and risk of information. Internet-based health service user communities cannot replace this system. PMID:28077894
Chronic pain impairs cognitive flexibility and engages novel learning strategies in rats.
Cowen, Stephen L; Phelps, Caroline E; Navratilova, Edita; McKinzie, David L; Okun, Alec; Husain, Omar; Gleason, Scott D; Witkin, Jeffrey M; Porreca, Frank
2018-03-22
Cognitive flexibility, the ability to adapt behavior to changing outcomes, is critical for survival. The prefrontal cortex is a key site of cognitive control, and chronic pain is known to lead to significant morphological changes to this brain region. Nevertheless, the effects of chronic pain on cognitive flexibility and learning remain uncertain. We used an instrumental paradigm to assess adaptive learning in an experimental model of chronic pain induced by tight ligation of the spinal nerves L5/6 (SNL model). Naïve, sham-operated, and SNL rats were trained to perform fixed-ratio, variable-ratio, and contingency-shift behaviors for food reward. Although all groups learned an initial lever-reward contingency, learning was slower in SNL animals in a subsequent choice task that reversed reinforcement contingencies. Temporal analysis of lever-press responses across sessions indicated no apparent deficits in memory consolidation or retrieval. However, analysis of learning within sessions revealed that the lever presses of SNL animals occurred in bursts followed by delays. Unexpectedly, the degree of bursting correlated positively with learning. Under a variable-ratio probabilistic task, SNL rats chose a less profitable behavioral strategy compared to naïve and sham-operated animals. Following extinction of behavior for learned preferences, SNL animals reverted to their initially preferred (i.e., less profitable) behavioral choice. Our data suggest that, in the face of uncertainty, chronic pain drives a preference for familiar associations, consistent with reduced cognitive flexibility. The observed burst-like responding may represent a novel learning strategy in animals with chronic pain.
Bayesian approach to the analysis of neutron Brillouin scattering data on liquid metals
NASA Astrophysics Data System (ADS)
De Francesco, A.; Guarini, E.; Bafile, U.; Formisano, F.; Scaccia, L.
2016-08-01
When the dynamics of liquids and disordered systems at the mesoscopic level is investigated by means of inelastic scattering (e.g., neutron or X-ray), spectra are often characterized by a poor definition of the excitation lines and spectroscopic features in general, and one important issue is to establish how many of these lines need to be included in the modeling function and to estimate their parameters. Furthermore, when strongly damped excitations are present, commonly used and widespread fitting algorithms are particularly affected by the choice of initial values of the parameters. An inadequate choice may lead to an inefficient exploration of the parameter space, resulting in the algorithm getting stuck in a local minimum. In this paper, we present a Bayesian approach to the analysis of neutron Brillouin scattering data in which the number of excitation lines is treated as unknown and estimated along with the other model parameters. We propose a joint estimation procedure based on a reversible-jump Markov chain Monte Carlo algorithm, which efficiently explores the parameter space, producing a probabilistic measure to quantify the uncertainty on the number of excitation lines as well as reliable parameter estimates. The method proposed could turn out to be of great importance in extracting physical information from experimental data, especially when the detection of spectral features is complicated not only because of the properties of the sample, but also because of the limited instrumental resolution and count statistics. The approach is tested on a generated data set and then applied to real experimental spectra of neutron Brillouin scattering from a liquid metal, previously analyzed in a more traditional way.
Stochastic simulation of ecohydrological interactions between vegetation and groundwater
NASA Astrophysics Data System (ADS)
Dwelle, M. C.; Ivanov, V. Y.; Sargsyan, K.
2017-12-01
The complex interactions between groundwater and vegetation in the Amazon rainforest may yield vital ecophysiological interactions in specific landscape niches such as buffering plant water stress during dry season or suppression of water uptake due to anoxic conditions. Representation of such processes is greatly impacted by both external and internal sources of uncertainty: inaccurate data and subjective choice of model representation. The models that can simulate these processes are complex and computationally expensive, and therefore make it difficult to address uncertainty using traditional methods. We use the ecohydrologic model tRIBS+VEGGIE and a novel uncertainty quantification framework applied to the ZF2 watershed near Manaus, Brazil. We showcase the capability of this framework for stochastic simulation of vegetation-hydrology dynamics. This framework is useful for simulation with internal and external stochasticity, but this work will focus on internal variability of groundwater depth distribution and model parameterizations. We demonstrate the capability of this framework to make inferences on uncertain states of groundwater depth from limited in situ data, and how the realizations of these inferences affect the ecohydrological interactions between groundwater dynamics and vegetation function. We place an emphasis on the probabilistic representation of quantities of interest and how this impacts the understanding and interpretation of the dynamics at the groundwater-vegetation interface.
Probabilistic computer model of optimal runway turnoffs
NASA Technical Reports Server (NTRS)
Schoen, M. L.; Preston, O. W.; Summers, L. G.; Nelson, B. A.; Vanderlinden, L.; Mcreynolds, M. C.
1985-01-01
Landing delays are currently a problem at major air carrier airports and many forecasters agree that airport congestion will get worse by the end of the century. It is anticipated that some types of delays can be reduced by an efficient optimal runway exit system allowing the increased approach volumes necessary at congested airports. A computerized Probabilistic Runway Turnoff Model is defined that locates exits and defines path geometry for a selected maximum occupancy time appropriate for each TERPS aircraft category. The model includes an algorithm for lateral ride comfort limits.
Probabilistic flood damage modelling at the meso-scale
NASA Astrophysics Data System (ADS)
Kreibich, Heidi; Botto, Anna; Schröter, Kai; Merz, Bruno
2014-05-01
Decisions on flood risk management and adaptation are usually based on risk analyses. Such analyses are associated with significant uncertainty, even more so if changes in risk due to global change are expected. Although uncertainty analysis and probabilistic approaches have received increased attention in recent years, they are still not standard practice for flood risk assessments. Most damage models have in common that complex damaging processes are described by simple, deterministic approaches like stage-damage functions. Novel probabilistic, multi-variate flood damage models have been developed and validated on the micro-scale using a data-mining approach, namely bagging decision trees (Merz et al. 2013). In this presentation we show how the model BT-FLEMO (Bagging decision Tree based Flood Loss Estimation MOdel) can be applied on the meso-scale, namely on the basis of ATKIS land-use units. The model is applied in 19 municipalities that were affected by the 2002 flood of the River Mulde in Saxony, Germany. The application of BT-FLEMO provides a probability distribution of estimated damage to residential buildings per municipality. Validation is undertaken on the one hand via a comparison with eight other damage models, including stage-damage functions as well as multi-variate models. On the other hand, the results are compared with official damage data provided by the Saxon Relief Bank (SAB). The results show that uncertainties of damage estimation remain high. Thus, the significant advantage of this probabilistic flood loss estimation model BT-FLEMO is that it inherently provides quantitative information about the uncertainty of the prediction. Reference: Merz, B.; Kreibich, H.; Lall, U. (2013): Multi-variate flood damage assessment: a tree-based data-mining approach. NHESS, 13(1), 53-64.
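The building block of BT-FLEMO, bagging decision trees, can be illustrated generically: each bootstrapped tree yields one damage estimate, and the spread across trees provides the probability distribution referred to above. The sketch below uses scikit-learn on synthetic data; the predictor names and the data-generating relationship are assumptions, not the FLEMO variables.

```python
import numpy as np
from sklearn.ensemble import BaggingRegressor

# Synthetic example of probabilistic damage estimation with bagged regression trees.
rng = np.random.default_rng(0)
n = 2000
water_depth = rng.uniform(0, 3, n)            # m above ground (assumed predictor)
building_value = rng.uniform(50, 500, n)      # arbitrary units (assumed predictor)
precaution = rng.integers(0, 2, n)            # 0/1 indicator (assumed predictor)
rel_damage = np.clip(0.25 * water_depth - 0.05 * precaution
                     + rng.normal(0, 0.05, n), 0, 1)

X = np.column_stack([water_depth, building_value, precaution])
model = BaggingRegressor(n_estimators=200, random_state=0).fit(X, rel_damage)

# One new case: the per-tree predictions form an empirical predictive distribution.
case = np.array([[1.5, 200.0, 0]])
per_tree = np.array([tree.predict(case)[0] for tree in model.estimators_])
print(f"mean relative damage: {per_tree.mean():.3f}")
print(f"5%-95% predictive interval: [{np.quantile(per_tree, 0.05):.3f}, "
      f"{np.quantile(per_tree, 0.95):.3f}]")
```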
Sánchez-Vizcaíno, Fernando; Perez, Andrés; Martínez-López, Beatriz; Sánchez-Vizcaíno, José Manuel
2012-08-01
Trade of animals and animal products imposes an uncertain and variable risk of exotic animal disease introduction into importing countries. Risk analysis provides importing countries with an objective, transparent, and internationally accepted method for assessing that risk. Over the last decades, European Union countries have conducted probabilistic risk assessments quite frequently to quantify the risk of rare animal disease introduction into their territories. Most probabilistic animal health risk assessments have typically been classified into one-level and multilevel binomial models. One-level models are simpler than multilevel models because they assume that animals or products originate from one single population. However, it is unknown whether such simplification may result in substantially different results compared to those obtained through the use of multilevel models. Here, data used in a probabilistic multilevel binomial model formulated to assess the risk of highly pathogenic avian influenza introduction into Spain were reanalyzed using a one-level binomial model, and the outcomes were compared. An alternative ordinal model is also proposed here, which makes use of simpler assumptions and less information compared to those required by traditional one-level and multilevel approaches. Results suggest that, at least under certain circumstances, results of the one-level and ordinal approaches are similar to those obtained using multilevel models. Consequently, we argue that, when data are insufficient to run traditional probabilistic models, the ordinal approach presented here may be a suitable alternative to rank exporting countries in terms of the risk that they impose for the spread of rare animal diseases into disease-free countries. © 2012 Society for Risk Analysis.
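A one-level binomial release model of the kind contrasted above can be written in a few lines: if N animals are imported and each carries infection with (uncertain) probability p, the probability of at least one infected import is 1 - (1 - p)^N. The sketch below propagates the uncertainty in p by Monte Carlo; all numbers are invented for illustration and are unrelated to the Spanish assessment.

```python
import numpy as np

# One-level binomial release assessment: animals assumed to come from a single population.
rng = np.random.default_rng(42)
n_iter = 100_000
n_animals = 5_000                                   # imported per year (assumed)
prevalence = rng.beta(2, 9998, n_iter)              # uncertain infection probability per animal
p_release = 1 - (1 - prevalence) ** n_animals       # P(at least one infected animal imported)

print(f"median annual release probability: {np.median(p_release):.3f}")
print(f"95% credible interval: [{np.quantile(p_release, 0.025):.3f}, "
      f"{np.quantile(p_release, 0.975):.3f}]")
```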
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Dengwang; Liu, Li; Chen, Jinhu
2014-06-01
Purpose: The aim of this study was to extract liver structures from daily cone-beam CT (CBCT) images automatically. Methods: Datasets were collected from 50 intravenous contrast planning CT images, which were regarded as the training dataset for probabilistic atlas and shape prior model construction. Firstly, a probabilistic atlas and a shape prior model based on sparse shape composition (SSC) were constructed by iterative deformable registration. Secondly, artifacts and noise were removed from the daily CBCT image by edge-preserving filtering using total variation with an L1 norm (TV-L1). Furthermore, the initial liver region was obtained by registering the incoming CBCT image with the atlas utilizing edge-preserving deformable registration with a multi-scale strategy; the initial liver region was then converted to a surface mesh, which was registered with the shape model in which the major variation of the specific patient was modeled by sparse vectors. At the last stage, the shape and intensity information were incorporated into a joint probabilistic model, and finally the liver structure was extracted by maximum a posteriori segmentation. Regarding the construction process, the manually segmented contours were first converted into meshes, and then an arbitrary patient dataset was chosen as the reference image to register with the rest of the training datasets by a deformable registration algorithm for constructing the probabilistic atlas and prior shape model. To improve the efficiency of the proposed method, the initial probabilistic atlas was used as the reference image to register with the other patient data for iterative construction, removing the bias caused by arbitrary selection. Results: The experiment validated the accuracy of the segmentation results quantitatively by comparison with the manual ones. The volumetric overlap percentage between the automatically generated liver contours and the ground truth was on average 88%–95% for CBCT images. Conclusion: The experiment demonstrated that liver structures can be extracted accurately from CBCT images with artifacts for subsequent adaptive radiation therapy. This work is supported by the National Natural Science Foundation of China (No. 61201441), the Research Fund for Excellent Young and Middle-aged Scientists of Shandong Province (No. BS2012DX038), the Project of Shandong Province Higher Educational Science and Technology Program (No. J12LN23), and Jinan youth science and technology star (No. 20120109).
Using discrete choice experiments within a cost-benefit analysis framework: some considerations.
McIntosh, Emma
2006-01-01
A great advantage of the stated preference discrete choice experiment (SPDCE) approach to economic evaluation methodology is its immense flexibility within applied cost-benefit analyses (CBAs). However, while the use of SPDCEs in healthcare has increased markedly in recent years, there has been a distinct lack of equivalent CBAs in healthcare using such SPDCE-derived valuations. This article outlines specific issues and some practical suggestions for consideration relevant to the development of CBAs using SPDCE-derived benefits. The article shows that SPDCE-derived CBA can adopt recent developments in cost-effectiveness methodology including the cost-effectiveness plane, appropriate consideration of uncertainty, the net-benefit framework and probabilistic sensitivity analysis methods, while maintaining the theoretical advantage of the SPDCE approach. The concept of a cost-benefit plane is no different in principle to the cost-effectiveness plane and can be a useful tool for reporting and presenting the results of CBAs. However, there are many challenging issues to address for the advancement of CBA methodology using SPDCEs within healthcare. Particular areas for development include the importance of accounting for uncertainty in SPDCE-derived willingness-to-pay values, the methodology of SPDCEs in clinical trial settings and economic models, measurement issues pertinent to using SPDCEs specifically in healthcare, and the importance of issues such as consideration of the dynamic nature of healthcare and the resulting impact this has on the validity of attribute definitions and context.
Decision-making impairments in the context of intact reward sensitivity in schizophrenia.
Heerey, Erin A; Bell-Warren, Kimberly R; Gold, James M
2008-07-01
Deficits in motivated behavior and decision-making figure prominently in the behavioral syndrome that characterizes schizophrenia and are difficult both to treat and to understand. One explanation for these deficits is that schizophrenia decreases sensitivity to rewards in the environment. An alternate explanation is that sensitivity to rewards is intact but that poor integration of affective with cognitive information impairs the ability to use this information to guide behavior. We tested reward sensitivity with a modified version of an existing signal detection task with asymmetric reinforcement and decision-making with a probabilistic decision-making task in 40 participants with schizophrenia and 26 healthy participants. Results showed normal sensitivity to reward in participants with schizophrenia but differences in choice patterns on the decision-making task. A logistic regression model of the decision-making data showed that participants with schizophrenia differed from healthy participants in the ability to weigh potential outcomes, specifically potential losses, when choosing between competing response options. Deficits in working memory ability accounted for group differences in ability to use potential outcomes during decision-making. These results suggest that the implicit mechanisms that drive reward-based learning are surprisingly intact in schizophrenia but that poor ability to integrate cognitive and affective information when calculating the value of possible choices might hamper the ability to use such information during explicit decision-making.
NASA Astrophysics Data System (ADS)
Kozine, Igor
2018-04-01
The paper suggests looking at probabilistic risk quantities and concepts through the prism of accepting one of the views: whether a true value of risk exists or not. It is argued that discussions until now have been primarily focused on closely related topics that are different from the topic of the current paper. The paper examines operational consequences of adhering to each of the views and contrasts them. It is demonstrated that operational differences in how and which probabilistic measures can be assessed and how they can be interpreted appear tangible. In particular, this concerns prediction intervals, the use of Bayes' rule, models of complete ignorance, hierarchical models of uncertainty, assignment of probabilities over possibility space and interpretation of derived probabilistic measures. Behavioural implications of favouring either view are also briefly described.
NASA Technical Reports Server (NTRS)
Ricks, Brian W.; Mengshoel, Ole J.
2009-01-01
Reliable systems health management is an important research area of NASA. A health management system that can accurately and quickly diagnose faults in various on-board systems of a vehicle will play a key role in the success of current and future NASA missions. We introduce in this paper the ProDiagnose algorithm, a diagnostic algorithm that uses a probabilistic approach, accomplished with Bayesian Network models compiled to Arithmetic Circuits, to diagnose these systems. We describe the ProDiagnose algorithm, how it works, and the probabilistic models involved. We show by experimentation on two Electrical Power Systems based on the ADAPT testbed, used in the Diagnostic Challenge Competition (DX 09), that ProDiagnose can produce results with over 96% accuracy and less than 1 second mean diagnostic time.
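Diagnosis with a Bayesian network amounts to computing the posterior probability of fault states given sensor evidence; ProDiagnose does this efficiently by compiling the network to an arithmetic circuit. The sketch below shows the underlying posterior computation by brute-force enumeration on a tiny hypothetical two-fault, two-sensor network whose structure and probabilities are invented for illustration.

```python
from itertools import product

# Tiny hypothetical network: two independent fault modes, two sensors observing them.
P_FAULT = {"battery_fault": 0.01, "relay_fault": 0.02}

def p_sensor_low_voltage(battery_fault):
    return 0.95 if battery_fault else 0.05

def p_sensor_no_current(battery_fault, relay_fault):
    return 0.90 if (battery_fault or relay_fault) else 0.02

def posterior(evidence):
    """Posterior over the four joint fault states given sensor evidence (by enumeration)."""
    joint = {}
    for b, r in product([False, True], repeat=2):
        p = (P_FAULT["battery_fault"] if b else 1 - P_FAULT["battery_fault"]) * \
            (P_FAULT["relay_fault"] if r else 1 - P_FAULT["relay_fault"])
        p *= p_sensor_low_voltage(b) if evidence["low_voltage"] else 1 - p_sensor_low_voltage(b)
        p *= p_sensor_no_current(b, r) if evidence["no_current"] else 1 - p_sensor_no_current(b, r)
        joint[(b, r)] = p
    total = sum(joint.values())
    return {state: p / total for state, p in joint.items()}

post = posterior({"low_voltage": True, "no_current": True})
for (b, r), p in sorted(post.items(), key=lambda kv: -kv[1]):
    print(f"battery_fault={b!s:5} relay_fault={r!s:5}  P={p:.3f}")
```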
UQTools: The Uncertainty Quantification Toolbox - Introduction and Tutorial
NASA Technical Reports Server (NTRS)
Kenny, Sean P.; Crespo, Luis G.; Giesy, Daniel P.
2012-01-01
UQTools is the short name for the Uncertainty Quantification Toolbox, a software package designed to efficiently quantify the impact of parametric uncertainty on engineering systems. UQTools is a MATLAB-based software package and was designed to be discipline independent, employing very generic representations of the system models and uncertainty. Specifically, UQTools accepts linear and nonlinear system models and permits arbitrary functional dependencies between the system's measures of interest and the probabilistic or non-probabilistic parametric uncertainty. One of the most significant features incorporated into UQTools is the theoretical development centered on homothetic deformations and their application to set bounding and approximating failure probabilities. Beyond the set bounding technique, UQTools provides a wide range of probabilistic and uncertainty-based tools to solve key problems in science and engineering.
Towards a multilevel cognitive probabilistic representation of space
NASA Astrophysics Data System (ADS)
Tapus, Adriana; Vasudevan, Shrihari; Siegwart, Roland
2005-03-01
This paper addresses the problem of perception and representation of space for a mobile agent. A probabilistic hierarchical framework is suggested as a solution to this problem. The method proposed is a combination of probabilistic belief with "Object Graph Models" (OGM). The world is viewed from a topological optic, in terms of objects and relationships between them. The hierarchical representation that we propose permits an efficient and reliable modeling of the information that the mobile agent would perceive from its environment. The integration of both navigational and interactional capabilities through efficient representation is also addressed. Experiments on a set of images taken from the real world that validate the approach are reported. This framework draws on the general understanding of human cognition and perception and contributes towards the overall efforts to build cognitive robot companions.
Random mechanics: Nonlinear vibrations, turbulences, seisms, swells, fatigue
NASA Astrophysics Data System (ADS)
Kree, P.; Soize, C.
The random modeling of physical phenomena and probabilistic methods for the numerical calculation of random mechanical forces are analytically explored. Attention is given to theoretical examinations such as probabilistic concepts, linear filtering techniques, and trajectory statistics. Applications of the methods to structures experiencing atmospheric turbulence, the quantification of turbulence, and the dynamic responses of the structures are considered. A probabilistic approach is taken to study the effects of earthquakes on structures and the forces exerted by ocean waves on marine structures. Theoretical analyses by means of vector spaces and stochastic modeling are reviewed, as are Markovian formulations of Gaussian processes and the definition of stochastic differential equations. Finally, random vibrations with a variable number of links and linear oscillators undergoing the square of Gaussian processes are investigated.
Dependence in probabilistic modeling Dempster-Shafer theory and probability bounds analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ferson, Scott; Nelsen, Roger B.; Hajagos, Janos
2015-05-01
This report summarizes methods to incorporate information (or lack of information) about inter-variable dependence into risk assessments that use Dempster-Shafer theory or probability bounds analysis to address epistemic and aleatory uncertainty. The report reviews techniques for simulating correlated variates for a given correlation measure and dependence model, computation of bounds on distribution functions under a specified dependence model, formulation of parametric and empirical dependence models, and bounding approaches that can be used when information about the intervariable dependence is incomplete. The report also reviews several of the most pervasive and dangerous myths among risk analysts about dependence in probabilistic models.
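One of the simplest results in this line of work is the Fréchet bound: with no information about dependence, the probability that two events occur together can only be bounded, and assuming independence is not necessarily conservative. A minimal numeric illustration (the event probabilities are arbitrary):

```python
def frechet_conjunction_bounds(p_a, p_b):
    """Best-possible bounds on P(A and B) when the dependence between A and B is unknown."""
    lower = max(0.0, p_a + p_b - 1.0)
    upper = min(p_a, p_b)
    return lower, upper

p_pump_fails, p_valve_fails = 0.2, 0.3
lo, hi = frechet_conjunction_bounds(p_pump_fails, p_valve_fails)
print(f"P(both fail) under independence:       {p_pump_fails * p_valve_fails:.3f}")
print(f"P(both fail) under unknown dependence: [{lo:.3f}, {hi:.3f}]")
```

The independence answer lies strictly inside the dependence-free bounds, which illustrates why an unstated independence assumption can understate as well as overstate joint risk.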
GENERAL: A modified weighted probabilistic cellular automaton traffic flow model
NASA Astrophysics Data System (ADS)
Zhuang, Qian; Jia, Bin; Li, Xin-Gang
2009-08-01
This paper modifies the weighted probabilistic cellular automaton model (Li X L, Kuang H, Song T, et al 2008 Chin. Phys. B 17 2366) which considered a diversity of traffic behaviors under real traffic situations induced by various driving characters and habits. In the new model, the effects of the velocity at the last time step and drivers' desire for acceleration are taken into account. The fundamental diagram, spatial-temporal diagram, and the time series of one-minute data are analyzed. The results show that this model reproduces synchronized flow. Finally, it simulates the on-ramp system with the proposed model. Some characteristics including the phase diagram are studied.
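The family of models modified here descends from the Nagel-Schreckenberg probabilistic cellular automaton, in which vehicles accelerate, brake to avoid collisions, slow down at random with some probability, and then move. The sketch below implements that baseline rule set rather than the weighted-probability variant of the cited paper; road length, density and the randomization probability are arbitrary.

```python
import random

def nasch_step(positions, velocities, road_length, v_max=5, p_slow=0.3):
    """One update of the Nagel-Schreckenberg probabilistic CA (single lane, periodic road)."""
    order = sorted(range(len(positions)), key=lambda i: positions[i])
    new_v = velocities[:]
    for k, i in enumerate(order):
        ahead = order[(k + 1) % len(order)]
        gap = (positions[ahead] - positions[i] - 1) % road_length
        new_v[i] = min(velocities[i] + 1, v_max, gap)          # accelerate, but avoid collision
        if new_v[i] > 0 and random.random() < p_slow:           # random slowdown
            new_v[i] -= 1
    new_pos = [(positions[i] + new_v[i]) % road_length for i in range(len(positions))]
    return new_pos, new_v

random.seed(0)
road_length, n_cars = 200, 50
positions = sorted(random.sample(range(road_length), n_cars))
velocities = [0] * n_cars
for _ in range(500):
    positions, velocities = nasch_step(positions, velocities, road_length)
print(f"mean speed at density {n_cars / road_length:.2f}: {sum(velocities) / n_cars:.2f}")
```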
Identification of failure type in corroded pipelines: a Bayesian probabilistic approach.
Breton, T; Sanchez-Gheno, J C; Alamilla, J L; Alvarez-Ramirez, J
2010-07-15
Spillover of hazardous materials from transport pipelines can lead to catastrophic events with serious and dangerous environmental impact, potential fire events and human fatalities. The problem is more serious for large pipelines when the construction material is under environmental corrosion conditions, as in the petroleum and gas industries. In this way, predictive models can provide a suitable framework for risk evaluation, maintenance policies and substitution procedure design that should be oriented to reduce these increased hazards. This work proposes a Bayesian probabilistic approach to identify and predict the type of failure (leakage or rupture) for steel pipelines under realistic corroding conditions. In the first step of the modeling process, the mechanical performance of the pipe is considered for establishing conditions under which either leakage or rupture failure can occur. In the second step, experimental burst tests are used to introduce a mean probabilistic boundary defining a region where the type of failure is uncertain. In the boundary vicinity, the failure discrimination is carried out with a probabilistic model where the events are considered as random variables. In turn, the model parameters are estimated with available experimental data and contrasted with a real catastrophic event, showing good discrimination capacity. The results are discussed in terms of policies oriented to inspection and maintenance of large-size pipelines in the oil and gas industry. 2010 Elsevier B.V. All rights reserved.
Asymptotic properties of a bold random walk
NASA Astrophysics Data System (ADS)
Serva, Maurizio
2014-08-01
In a recent paper we proposed a non-Markovian random walk model with memory of the maximum distance ever reached from the starting point (home). The behavior of the walker is different from the simple symmetric random walk only when she is at this maximum distance, where, having the choice to move either farther or closer, she decides with different probabilities. If the probability of a forward step is higher than the probability of a backward step, the walker is bold and her behavior turns out to be superdiffusive; otherwise she is timorous and her behavior turns out to be subdiffusive. The scaling behavior varies continuously from subdiffusive (timorous) to superdiffusive (bold) according to a single parameter γ ∈R. We investigate here the asymptotic properties of the bold case in the nonballistic region γ ∈[0,1/2], a problem which was left partially unsolved previously. The exact results proved in this paper require new probabilistic tools which rely on the construction of appropriate martingales of the random walk and its hitting times.
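The walk is straightforward to simulate: away from the record distance the walker steps symmetrically, while at the record she steps outward with a probability that depends on the parameter. The sketch below uses an outward probability of (1 + γ)/2 at the record position, which is one plausible reading of the model rather than the paper's exact parametrization.

```python
import random

def bold_walk(n_steps, gamma, seed=0):
    """Random walk with memory of the farthest distance ever reached (the 'record').

    Away from the record the walk is symmetric; at the record the walker steps
    outward with probability (1 + gamma) / 2 (an assumed parametrization).
    """
    rng = random.Random(seed)
    position, record = 0, 0
    for _ in range(n_steps):
        if record > 0 and abs(position) == record:
            outward = rng.random() < (1 + gamma) / 2
            step = 1 if outward else -1
            position += step if position > 0 else -step
        else:
            position += 1 if rng.random() < 0.5 else -1
        record = max(record, abs(position))
    return record

for gamma in (0.0, 0.25, 0.5):
    records = [bold_walk(5_000, gamma, seed=s) for s in range(100)]
    print(f"gamma={gamma}: mean record distance after 5,000 steps ~ {sum(records) / len(records):.0f}")
```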
Deterministic Creation of Macroscopic Cat States
Lombardo, Daniel; Twamley, Jason
2015-01-01
Despite current technological advances, observing quantum mechanical effects outside of the nanoscopic realm is extremely challenging. For this reason, the observation of such effects on larger scale systems is currently one of the most attractive goals in quantum science. Many experimental protocols have been proposed for both the creation and observation of quantum states on macroscopic scales, in particular, in the field of optomechanics. The majority of these proposals, however, rely on performing measurements, making them probabilistic. In this work we develop a completely deterministic method of macroscopic quantum state creation. We study the prototypical optomechanical Membrane In The Middle model and show that by controlling the membrane’s opacity, and through careful choice of the optical cavity initial state, we can deterministically create and grow the spatial extent of the membrane’s position into a large cat state. It is found that, by using a Bose-Einstein condensate as a membrane, high-fidelity cat states with spatial separations of up to ∼300 nm can be achieved. PMID:26345157
Opportunities of probabilistic flood loss models
NASA Astrophysics Data System (ADS)
Schröter, Kai; Kreibich, Heidi; Lüdtke, Stefan; Vogel, Kristin; Merz, Bruno
2016-04-01
Oftentimes, traditional uni-variate damage models such as depth-damage curves fail to reproduce the variability of observed flood damage. However, reliable flood damage models are a prerequisite for the practical usefulness of the model results. Innovative multi-variate probabilistic modelling approaches are promising for capturing and quantifying the uncertainty involved and thus for improving the basis for decision making. In this study we compare the predictive capability of two probabilistic modelling approaches, namely bagging decision trees and Bayesian networks, with traditional stage-damage functions. For model evaluation we use empirical damage data which are available from computer-aided telephone interviews compiled after the floods in 2002, 2005, 2006 and 2013 in the Elbe and Danube catchments in Germany. We carry out a split-sample test by sub-setting the damage records. One sub-set is used to derive the models and the remaining records are used to evaluate the predictive performance of the model. Further, we stratify the sample according to catchments, which allows studying model performance in a spatial transfer context. Flood damage estimation is carried out on the scale of individual buildings in terms of relative damage. The predictive performance of the models is assessed in terms of systematic deviations (mean bias error) and precision (mean absolute error), as well as in terms of the sharpness of the predictions and their reliability, which is represented by the proportion of observations that fall within the 5%-95% predictive interval. The comparison of the uni-variate stage-damage function and the multi-variate model approaches emphasises the importance of quantifying predictive uncertainty. With each explanatory variable, the multi-variate model reveals an additional source of uncertainty. However, the predictive performance in terms of mean bias error, mean absolute error and reliability is clearly improved in comparison to the uni-variate stage-damage function. Overall, probabilistic models provide quantitative information about prediction uncertainty, which is crucial to assess the reliability of model predictions and improves the usefulness of model results.
Fleurence, Rachael L
2005-01-01
The cost-effectiveness of alternating pressure-relieving devices, mattress replacements, and mattress overlays compared with standard hospital care (a high-specification foam mattress) for the prevention and treatment of pressure ulcers in hospital patients in the United Kingdom was investigated. A decision-analytic model was constructed to evaluate different strategies to prevent or treat pressure ulcers. Three scenarios were evaluated: the prevention of pressure ulcers, the treatment of superficial ulcers, and the treatment of severe ulcers. Epidemiological and effectiveness data were obtained from the clinical literature. Expert opinion using a rating scale technique was used to obtain quality of life data. Costs of the devices were obtained from manufacturers, whereas costs of treatment were obtained from the literature. Uncertainty was explored through probabilistic sensitivity analysis. Using 30,000 pounds sterling/QALY (quality-adjusted life year) as the decision-maker's cut-off point (the current UK standard), in scenario 1 (prevention), the cost-effective strategy was the mattress overlay at 1, 4, and 12 weeks. In scenarios 2 and 3, the cost-effective strategy was the mattress replacement at 1, 4, and 12 weeks. Standard care was a dominated intervention in all scenarios for values of the decision-maker's ceiling ratio ranging from 5,000 pounds sterling to 100,000 pounds sterling/QALY. However, the probabilistic sensitivity analysis results reflected the high uncertainty surrounding the choice of devices. Current information suggests that alternating pressure mattress overlays may be cost-effective for the prevention of pressure ulcers, whereas alternating pressure mattress replacements appear to be cost-effective for the treatment of superficial and severe pressure ulcers.
NASA Astrophysics Data System (ADS)
Lane, E. M.; Gillibrand, P. A.; Wang, X.; Power, W.
2013-09-01
Regional source tsunamis pose a potentially devastating hazard to communities and infrastructure on the New Zealand coast. But major events are very uncommon. This dichotomy of infrequent but potentially devastating hazards makes realistic assessment of the risk challenging. Here, we describe a method to determine a probabilistic assessment of the tsunami hazard posed by regional source tsunamis with an "Average Recurrence Interval" of 2,500 years. The method is applied to the east Auckland region of New Zealand. From an assessment of potential regional tsunamigenic events over 100,000 years, the inundation of the Auckland region from the worst 100 events is modelled using a hydrodynamic model, and probabilistic inundation depths on a 2,500-year time scale are determined. Tidal effects on the potential inundation were included by coupling the predicted wave heights with the probability density function of tidal heights at the inundation site. Results show that the more exposed northern section of the east coast and outer islands in the Hauraki Gulf face the greatest hazard from regional tsunamis in the Auckland region. Incorporating tidal effects into predictions of inundation reduced the predicted hazard compared to modelling all the tsunamis arriving at high tide, giving a more accurate hazard assessment on the specified time scale. This study presents the first probabilistic analysis of dynamic modelling of tsunami inundation for the New Zealand coast and as such provides the most comprehensive assessment of tsunami inundation of the Auckland region from regional source tsunamis available to date.
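The tidal coupling can be illustrated with a small numeric sketch: for a given tsunami wave height, the probability of exceeding an inundation threshold is obtained by integrating over the tidal-height distribution instead of assuming arrival at high tide. All numbers below are invented for illustration.

```python
import numpy as np

# Empirical tidal-height distribution, approximated here by a sinusoidal tide sampled at random phase.
rng = np.random.default_rng(0)
tide = 1.0 * np.sin(2 * np.pi * rng.uniform(0, 1, 100_000))   # tide level in m about mean sea level

def exceedance_probability(wave_height, threshold):
    """P(wave height + tide level exceeds an inundation threshold), integrating over the tide."""
    return np.mean(wave_height + tide > threshold)

wave_height, threshold = 2.0, 2.6
print(f"assuming high tide:    exceedance = {float(wave_height + tide.max() > threshold):.2f}")
print(f"integrating over tide: exceedance = {exceedance_probability(wave_height, threshold):.2f}")
```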
Probabilistic grammatical model for helix‐helix contact site classification
2013-01-01
Background Hidden Markov Models power many state‐of‐the‐art tools in the field of protein bioinformatics. While excelling in their tasks, these methods of protein analysis do not convey directly information on medium‐ and long‐range residue‐residue interactions. This requires an expressive power of at least context‐free grammars. However, application of more powerful grammar formalisms to protein analysis has been surprisingly limited. Results In this work, we present a probabilistic grammatical framework for problem‐specific protein languages and apply it to classification of transmembrane helix‐helix pairs configurations. The core of the model consists of a probabilistic context‐free grammar, automatically inferred by a genetic algorithm from only a generic set of expert‐based rules and positive training samples. The model was applied to produce sequence based descriptors of four classes of transmembrane helix‐helix contact site configurations. The highest performance of the classifiers reached AUCROC of 0.70. The analysis of grammar parse trees revealed the ability of representing structural features of helix‐helix contact sites. Conclusions We demonstrated that our probabilistic context‐free framework for analysis of protein sequences outperforms the state of the art in the task of helix‐helix contact site classification. However, this is achieved without necessarily requiring modeling long range dependencies between interacting residues. A significant feature of our approach is that grammar rules and parse trees are human‐readable. Thus they could provide biologically meaningful information for molecular biologists. PMID:24350601
Drake, Tom; Chalabi, Zaid; Coker, Richard
2015-02-01
Investment in pandemic preparedness is a long-term gamble, with the return on investment coming at an unknown point in the future. Many countries have chosen to stockpile key resources, and the number of pandemic economic evaluations has risen sharply since 2009. We assess the importance of uncertainty in time-to-pandemic (and associated discounting) in pandemic economic evaluation, a factor frequently neglected in the literature to date. We use a probability tree model and Monte Carlo parameter sampling to consider the cost effectiveness of antiviral stockpiling in Cambodia under parameter uncertainty. Mean elasticity and mutual information (MI) are used to assess the importance of time-to-pandemic compared with other parameters. We also consider the sensitivity to choice of sampling distribution used to model time-to-pandemic uncertainty. Time-to-pandemic and discount rate are the primary drivers of sensitivity and uncertainty in pandemic cost-effectiveness models. Base case cost effectiveness of antiviral stockpiling ranged between US$112 and US$3599 per DALY averted using historical pandemic intervals for time-to-pandemic. The mean elasticities for time-to-pandemic and discount rate were greater than those for all other parameters. Similarly, the MI scores for time-to-pandemic and discount rate were greater than those for other parameters. Time-to-pandemic and discount rate were key drivers of uncertainty in cost-effectiveness results regardless of time-to-pandemic sampling distribution choice. Time-to-pandemic assumptions can substantially affect cost-effectiveness results and, in our model, time-to-pandemic is a greater contributor to uncertainty in cost-effectiveness results than any other parameter. We strongly recommend that cost-effectiveness models include probabilistic analysis of time-to-pandemic uncertainty. Published by Oxford University Press in association with The London School of Hygiene and Tropical Medicine © The Author 2013; all rights reserved.
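The sensitivity to time-to-pandemic arises because the health benefit realized T years from now is discounted by 1/(1 + r)^T while the stockpile cost is incurred up front. A minimal sketch of that mechanism is shown below; the inputs are invented and the calculation ignores restocking and expiry, so it is not the Cambodian model.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 100_000

stockpile_cost = 2.0e6                                   # up-front cost (assumed, arbitrary currency)
dalys_averted = 5_000                                    # health gain when the pandemic occurs (assumed)
discount_rate = 0.03
time_to_pandemic = rng.exponential(scale=30, size=n)     # uncertain years until the next pandemic

discounted_dalys = dalys_averted / (1 + discount_rate) ** time_to_pandemic
cost_per_daly = stockpile_cost / discounted_dalys

print(f"median cost per DALY averted: {np.median(cost_per_daly):,.0f}")
print(f"5%-95% interval: {np.quantile(cost_per_daly, 0.05):,.0f} - "
      f"{np.quantile(cost_per_daly, 0.95):,.0f}")
```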
Gold, James M.; Waltz, James A.; Matveeva, Tatyana M.; Kasanova, Zuzana; Strauss, Gregory P.; Herbener, Ellen S.; Collins, Anne G.E.; Frank, Michael J.
2015-01-01
Context Negative symptoms are a core feature of schizophrenia, but their pathophysiology remains unclear. Objective Negative symptoms are defined by the absence of normal function. However, there must be a productive mechanism that leads to this absence. Here, we test a reinforcement learning account suggesting that negative symptoms result from a failure to represent the expected value of rewards coupled with preserved loss avoidance learning. Design Subjects performed a probabilistic reinforcement learning paradigm involving stimulus pairs in which choices resulted in either reward or avoidance of loss. Following training, subjects indicated their valuation of the stimuli in a transfer task. Computational modeling was used to distinguish between alternative accounts of the data. Setting A tertiary care research outpatient clinic. Patients A total of 47 clinically stable patients with a diagnosis of schizophrenia or schizoaffective disorder and 28 healthy volunteers participated. Patients were divided into high and low negative symptom groups. Main Outcome measures 1) The number of choices leading to reward or loss avoidance and 2) performance in the transfer phase. Quantitative fits from three different models were examined. Results High negative symptom patients demonstrated impaired learning from rewards but intact loss avoidance learning, and failed to distinguish rewarding stimuli from loss-avoiding stimuli in the transfer phase. Model fits revealed that high negative symptom patients were better characterized by an “actor-critic” model, learning stimulus-response associations, whereas controls and low negative symptom patients incorporated expected value of their actions (“Q-learning”) into the selection process. Conclusions Negative symptoms are associated with a specific reinforcement learning abnormality: High negative symptoms patients do not represent the expected value of rewards when making decisions but learn to avoid punishments through the use of prediction errors. This computational framework offers the potential to understand negative symptoms at a mechanistic level. PMID:22310503
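The two competing accounts can be contrasted in a short simulation: a Q-learner tracks the expected value of each stimulus, whereas an actor-critic updates response propensities from a prediction error computed against the value of the training pair. In such a toy task the Q-learner separates the frequently rewarded stimulus from the frequent loss-avoider by sign, while both stimuli tend to acquire positive actor weights, making them harder to distinguish at transfer. Parameters and task structure are assumed for illustration and are not the authors' fitted models.

```python
import math
import random

random.seed(0)
alpha, beta, trials = 0.1, 3.0, 2000

# Two training pairs: A rewarded (+1) 80% vs B 20%; C loses (-1) 20% vs D 80%.
pairs = {"gain": ("A", "B"), "loss": ("C", "D")}
reward_prob = {"A": 0.8, "B": 0.2}
loss_prob = {"C": 0.2, "D": 0.8}

def outcome(stim):
    if stim in reward_prob:
        return 1.0 if random.random() < reward_prob[stim] else 0.0
    return -1.0 if random.random() < loss_prob[stim] else 0.0

def choose(w1, w2):
    p1 = 1.0 / (1.0 + math.exp(-beta * (w1 - w2)))   # softmax over two options
    return 0 if random.random() < p1 else 1

q = {s: 0.0 for s in "ABCD"}          # Q-learning: expected value per stimulus
actor = {s: 0.0 for s in "ABCD"}      # actor-critic: response propensity per stimulus
critic = {p: 0.0 for p in pairs}      # actor-critic: value per pair (state)

for _ in range(trials):
    for pair, stims in pairs.items():
        # Q-learner: choose on expected values, update the chosen stimulus toward its outcome.
        chosen = stims[choose(q[stims[0]], q[stims[1]])]
        q[chosen] += alpha * (outcome(chosen) - q[chosen])

        # Actor-critic: choose on propensities; actor and critic share one prediction error.
        chosen = stims[choose(actor[stims[0]], actor[stims[1]])]
        delta = outcome(chosen) - critic[pair]
        critic[pair] += alpha * delta
        actor[chosen] += alpha * delta

print("Q-values     :", {s: round(v, 2) for s, v in q.items()})
print("actor weights:", {s: round(v, 2) for s, v in actor.items()})
```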
Architecture for Integrated Medical Model Dynamic Probabilistic Risk Assessment
NASA Technical Reports Server (NTRS)
Jaworske, D. A.; Myers, J. G.; Goodenow, D.; Young, M.; Arellano, J. D.
2016-01-01
Probabilistic Risk Assessment (PRA) is a modeling tool used to predict potential outcomes of a complex system based on a statistical understanding of many initiating events. Utilizing a Monte Carlo method, thousands of instances of the model are considered and outcomes are collected. PRA is considered static, utilizing probabilities alone to calculate outcomes. Dynamic Probabilistic Risk Assessment (dPRA) is an advanced concept in which the model predicts the outcomes of a complex system based not only on the probabilities of many initiating events, but also on a progression of dependencies that arises from moving down a timeline. Events are placed on a single timeline, with each event added to a queue managed by a planner. Progression down the timeline is guided by rules managed by a scheduler. The recently developed Integrated Medical Model (IMM) summarizes astronaut health as governed by the probabilities of medical events and mitigation strategies. The software architecture process provides a systematic means of creating, documenting, and communicating a software design early in the development process; it begins with establishing requirements, from which the design is then derived.
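The planner/scheduler idea, events placed on a timeline in a queue and processed in order so that later outcomes depend on earlier ones, can be illustrated with a toy Monte Carlo sketch. The event types, rates and supply-consumption rule below are invented; this is not the IMM.

```python
import heapq
import numpy as np

rng = np.random.default_rng(2)

# Assumed annual event rates and medical-supply demands (purely illustrative).
EVENTS = {
    "minor_injury":  (2.0, 1),
    "infection":     (0.5, 2),
    "serious_event": (0.1, 5),
}

def simulate_mission(duration_yr=1.0, supplies=6):
    queue = []
    for name, (rate, demand) in EVENTS.items():
        t = rng.exponential(1.0 / rate)
        while t < duration_yr:                      # planner: place events on the timeline
            heapq.heappush(queue, (t, name, demand))
            t += rng.exponential(1.0 / rate)
    while queue:                                    # scheduler: progress down the timeline
        _, name, demand = heapq.heappop(queue)
        if demand > supplies:                       # outcome depends on prior events
            return "evacuation"
        supplies -= demand
    return "nominal"

runs = [simulate_mission() for _ in range(20_000)]
print("P(evacuation) ≈", runs.count("evacuation") / len(runs))
```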
Finding models to detect Alzheimer's disease by fusing structural and neuropsychological information
NASA Astrophysics Data System (ADS)
Giraldo, Diana L.; García-Arteaga, Juan D.; Velasco, Nelson; Romero, Eduardo
2015-12-01
Alzheimer's disease (AD) is a neurodegenerative disease that affects higher brain functions. Initial diagnosis of AD is based on the patient's clinical history and a battery of neuropsychological tests. The accuracy of the diagnosis is highly dependent on the examiner's skills and on the evolution of a variable clinical picture. This work presents an automatic strategy that learns probabilistic brain models for different stages of the disease, reducing complexity, parameter adjustment and computational cost. The proposed method starts by setting a probabilistic class description using the information stored in the neuropsychological tests, followed by constructing the different structural class models using membership values from the learned probabilistic functions. These models are then used as a reference frame for the classification problem: a new case is assigned to a particular class simply by projecting it onto the different models. Validation was performed using leave-one-out cross-validation with two classes: Normal Control (NC) subjects and patients diagnosed with mild AD. In this experiment, a sensitivity and specificity of 80% and 79%, respectively, were achieved.
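A minimal sketch of the classify-by-projection idea follows, assuming each class model is simply a diagonal Gaussian over a feature vector and that a held-out case is assigned to the class with the higher likelihood; the features are simulated, not derived from imaging or neuropsychological data.

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulated features for two classes (not real data).
n, d = 40, 5
X_nc = rng.normal(0.0, 1.0, size=(n, d))     # simulated "normal control" features
X_ad = rng.normal(0.8, 1.0, size=(n, d))     # simulated "mild AD" features
X = np.vstack([X_nc, X_ad])
y = np.array([0] * n + [1] * n)

def log_likelihood(x, mu, var):
    # Diagonal-Gaussian log-likelihood, standing in for "projection onto a class model".
    return -0.5 * np.sum(np.log(2 * np.pi * var) + (x - mu) ** 2 / var)

correct = 0
for i in range(len(y)):                       # leave-one-out cross-validation
    mask = np.arange(len(y)) != i
    scores = []
    for c in (0, 1):
        Xc = X[mask & (y == c)]
        scores.append(log_likelihood(X[i], Xc.mean(0), Xc.var(0) + 1e-6))
    correct += int(np.argmax(scores) == y[i])

print("LOOCV accuracy:", correct / len(y))
```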
Probabilistic accounting of uncertainty in forecasts of species distributions under climate change
Seth J. Wenger; Nicholas A. Som; Daniel C. Dauwalter; Daniel J. Isaak; Helen M. Neville; Charles H. Luce; Jason B. Dunham; Michael K. Young; Kurt D. Fausch; Bruce E. Rieman
2013-01-01
Forecasts of species distributions under future climates are inherently uncertain, but there have been few attempts to describe this uncertainty comprehensively in a probabilistic manner. We developed a Monte Carlo approach that accounts for uncertainty within generalized linear regression models (parameter uncertainty and residual error), uncertainty among competing...
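One common way to make the within-model part of that accounting concrete is to sample regression coefficients from their estimated sampling distribution. The sketch below assumes a logistic occurrence model fit with statsmodels and simulated stream-temperature data; it is not the authors' actual species-distribution models.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)

# Simulated data: occurrence declines with stream temperature.
n = 300
temp = rng.uniform(4, 18, n)                      # simulated temperature (°C)
p_true = 1 / (1 + np.exp(-(4.0 - 0.4 * temp)))    # species prefers cold water
present = rng.binomial(1, p_true)

X = sm.add_constant(temp)
fit = sm.GLM(present, X, family=sm.families.Binomial()).fit()

# Parameter uncertainty: draw coefficient vectors from a multivariate normal with
# the fitted mean and covariance, then predict occurrence under +2 °C warming.
draws = rng.multivariate_normal(fit.params, fit.cov_params(), size=5000)
X_future = sm.add_constant(temp + 2.0)
p_future = 1 / (1 + np.exp(-(X_future @ draws.T)))   # (n sites, n draws)
mean_occ = p_future.mean(axis=0)                      # proportion of sites occupied per draw

print("median forecast:", np.median(mean_occ).round(3),
      "90% interval:", np.percentile(mean_occ, [5, 95]).round(3))
```

Residual error and among-model uncertainty would add further layers on top of this single-model parameter resampling.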
Design-based Sample and Probability Law-Assumed Sample: Their Role in Scientific Investigation.
ERIC Educational Resources Information Center
Ojeda, Mario Miguel; Sahai, Hardeo
2002-01-01
Discusses some key statistical concepts in probabilistic and non-probabilistic sampling to provide an overview for understanding the inference process. Suggests a statistical model constituting the basis of statistical inference and provides a brief review of the finite population descriptive inference and a quota sampling inferential theory.…
Recovering a Probabilistic Knowledge Structure by Constraining Its Parameter Space
ERIC Educational Resources Information Center
Stefanutti, Luca; Robusto, Egidio
2009-01-01
In the Basic Local Independence Model (BLIM) of Doignon and Falmagne ("Knowledge Spaces," Springer, Berlin, 1999), the probabilistic relationship between the latent knowledge states and the observable response patterns is established by the introduction of a pair of parameters for each of the problems: a lucky guess probability and a careless…
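For readers unfamiliar with the BLIM, the response-pattern probabilities it defines can be written out directly. The sketch below uses an invented three-problem knowledge structure with uniform state priors and arbitrary lucky-guess and careless-error values; it is not tied to the article's constrained parameter spaces.

```python
from itertools import product

# BLIM: given a latent knowledge state K (the set of mastered problems), each
# problem q is solved with probability 1 - beta[q] if q is in K (beta = careless
# error) and with probability eta[q] otherwise (eta = lucky guess).
problems = ["a", "b", "c"]
states = [set(), {"a"}, {"a", "b"}, {"a", "b", "c"}]      # a toy knowledge structure
prior = {frozenset(s): 0.25 for s in states}              # assumed uniform state priors
beta = {"a": 0.1, "b": 0.1, "c": 0.1}                     # careless-error rates
eta = {"a": 0.2, "b": 0.2, "c": 0.2}                      # lucky-guess rates

def p_response_given_state(pattern, state):
    p = 1.0
    for q, correct in zip(problems, pattern):
        p_correct = 1 - beta[q] if q in state else eta[q]
        p *= p_correct if correct else 1 - p_correct
    return p

# Marginal probability of each observable response pattern under the BLIM.
for pattern in product([0, 1], repeat=len(problems)):
    p = sum(prior[frozenset(s)] * p_response_given_state(pattern, s) for s in states)
    print(pattern, round(p, 4))
```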
Structural reliability methods: Code development status
NASA Astrophysics Data System (ADS)
Millwater, Harry R.; Thacker, Ben H.; Wu, Y.-T.; Cruse, T. A.
1991-05-01
The Probabilistic Structures Analysis Method (PSAM) program integrates state-of-the-art probabilistic algorithms with structural analysis methods in order to quantify the behavior of Space Shuttle Main Engine structures subject to uncertain loadings, boundary conditions, material parameters, and geometric conditions. An advanced, efficient probabilistic structural analysis software program, NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) was developed as a deliverable. NESSUS contains a number of integrated software components to perform probabilistic analysis of complex structures. A nonlinear finite element module NESSUS/FEM is used to model the structure and obtain structural sensitivities. Some of the capabilities of NESSUS/FEM are shown. A Fast Probability Integration module NESSUS/FPI estimates the probability given the structural sensitivities. A driver module, PFEM, couples the FEM and FPI. NESSUS, version 5.0, addresses component reliability, resistance, and risk.
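The underlying question NESSUS answers, the probability that a response quantity exceeds a limit given uncertain inputs, can be sketched with a plain Monte Carlo limit-state evaluation. This is only the generic idea, not the NESSUS/FPI algorithms, and every distribution below is invented.

```python
import numpy as np

rng = np.random.default_rng(5)

# Limit-state sketch: g = strength - stress with uncertain inputs, and a plain
# Monte Carlo estimate of the probability of failure P(g < 0).
n = 1_000_000
yield_strength = rng.normal(950.0, 60.0, n)                        # MPa (assumed)
load = rng.lognormal(mean=np.log(40_000.0), sigma=0.15, size=n)    # N (assumed)
area = rng.normal(55.0, 2.0, n)                                    # mm^2 (assumed)
stress = load / area                                               # MPa

g = yield_strength - stress                    # limit-state function
pf = np.mean(g < 0)
print(f"estimated probability of failure: {pf:.2e}")
```

Fast Probability Integration methods aim to approximate the same probability far more cheaply than brute-force sampling when each structural evaluation is an expensive finite element run.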
Probabilistic dual heuristic programming-based adaptive critic
NASA Astrophysics Data System (ADS)
Herzallah, Randa
2010-02-01
Adaptive critic (AC) methods have common roots as generalisations of dynamic programming for neural reinforcement learning approaches. Since they approximate the dynamic programming solutions, they are potentially suitable for learning in noisy, non-linear and non-stationary environments. In this study, a novel probabilistic dual heuristic programming (DHP)-based AC controller is proposed. Distinct from current approaches, the proposed probabilistic DHP AC method takes uncertainties of the forward model and inverse controller into consideration. Therefore, it is suitable for deterministic and stochastic control problems characterised by functional uncertainty. Theoretical development of the proposed method is validated by analytically evaluating the correct value of the cost function which satisfies the Bellman equation in a linear quadratic control problem. The target value of the probabilistic critic network is then calculated and shown to be equal to the analytically derived correct value. Full derivation of the Riccati solution for this non-standard stochastic linear quadratic control problem is also provided. Moreover, the performance of the proposed probabilistic controller is demonstrated on linear and non-linear control examples.
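The linear-quadratic benchmark used for validation has a closed-form value function, which makes the kind of check described above easy to reproduce numerically. The sketch below computes the Riccati solution for an arbitrary deterministic LQ problem, verifies Bellman consistency, and prints the costate (value-function gradient) that a DHP critic would be trained to reproduce; it does not implement the paper's probabilistic critic.

```python
import numpy as np
from scipy.linalg import solve_discrete_are

# Discrete-time LQ problem x' = Ax + Bu with stage cost x'Qx + u'Ru.
# The true value function is V(x) = x'Px with P the Riccati solution,
# and DHP critics approximate its gradient lambda(x) = 2Px.
A = np.array([[1.0, 0.1], [0.0, 0.9]])   # arbitrary illustrative dynamics
B = np.array([[0.0], [0.1]])
Q = np.eye(2)
R = np.array([[0.5]])

P = solve_discrete_are(A, B, Q, R)
K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)   # optimal feedback u = -Kx

x = np.array([1.0, -0.5])
u = -K @ x
x_next = A @ x + B @ u

# Bellman consistency: V(x) should equal the stage cost plus V(x_next).
lhs = x @ P @ x
rhs = x @ Q @ x + u @ R @ u + x_next @ P @ x_next
print("V(x) =", lhs, " Bellman RHS =", rhs)          # should match closely

# DHP critic target: the costate, i.e. the gradient of the value function.
print("costate 2Px =", 2 * P @ x)
```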
Methods for Probabilistic Fault Diagnosis: An Electrical Power System Case Study
NASA Technical Reports Server (NTRS)
Ricks, Brian W.; Mengshoel, Ole J.
2009-01-01
Health management systems that more accurately and quickly diagnose faults that may occur in different technical systems on-board a vehicle will play a key role in the success of future NASA missions. We discuss in this paper the diagnosis of abrupt continuous (or parametric) faults within the context of probabilistic graphical models, more specifically Bayesian networks that are compiled to arithmetic circuits. This paper extends our previous research, within the same probabilistic setting, on diagnosis of abrupt discrete faults. Our approach and diagnostic algorithm ProDiagnose are domain-independent; however, we use an electrical power system testbed called ADAPT as a case study. In one set of ADAPT experiments, performed as part of the 2009 Diagnostic Challenge, our system turned out to have the best performance among all competitors. In a second set of experiments, we show how we have recently made further significant improvements to the performance of the probabilistic model of ADAPT. While these experiments were performed on an electrical power system testbed, we believe the methods can easily be transitioned to real-world systems, thus promising to increase the success of future NASA missions.
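The basic inference pattern, a posterior over fault hypotheses given sensor evidence, can be shown with a two-fault, two-sensor toy network evaluated by brute-force enumeration. Component names, priors and sensor models are invented and bear no relation to the ADAPT testbed, ProDiagnose, or arithmetic-circuit compilation.

```python
from itertools import product

# Toy Bayesian-network fault diagnosis: two possible component faults influence
# two sensor readings; compute the posterior over faults given the evidence.
p_fault = {"battery": 0.01, "relay": 0.02}        # assumed prior fault probabilities

def p_sensor(name, value, faults):
    # "voltage_low" responds to the battery fault, "current_zero" to the relay fault.
    if name == "voltage_low":
        p_true = 0.95 if faults["battery"] else 0.05
    else:
        p_true = 0.90 if faults["relay"] else 0.02
    return p_true if value else 1 - p_true

evidence = {"voltage_low": True, "current_zero": False}

posterior = {}
for battery, relay in product([False, True], repeat=2):
    faults = {"battery": battery, "relay": relay}
    p = 1.0
    for f, prior in p_fault.items():
        p *= prior if faults[f] else 1 - prior        # prior over fault combinations
    for s, v in evidence.items():
        p *= p_sensor(s, v, faults)                   # likelihood of the evidence
    posterior[(battery, relay)] = p

z = sum(posterior.values())
for state, p in posterior.items():
    print(f"battery={state[0]!s:5} relay={state[1]!s:5}  P = {p / z:.4f}")
```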
Evaluation of Sex-Specific Movement Patterns in Judo Using Probabilistic Neural Networks.
Miarka, Bianca; Sterkowicz-Przybycien, Katarzyna; Fukuda, David H
2017-10-01
The purpose of the present study was to create a probabilistic neural network to clarify the understanding of movement patterns in international judo competitions by gender. Analysis of 773 male and 638 female bouts was utilized to identify movements during the approach, gripping, attack (including biomechanical designations), groundwork, defense, and pause phases. A probabilistic neural network and chi-square (χ2) tests were used to model and compare frequencies (p ≤ .05). Women (mean [interquartile range]: 9.9 [4; 14]) attacked more than men (7.0 [3; 10]) while attempting a greater number of arm/leg lever (women: 2.7 [1; 6]; men: 4.0 [0; 4]) and trunk/leg lever (women: 0.8 [0; 1]; men: 2.4 [0; 4]) techniques but fewer maximal length-moment arm techniques (women: 0.7 [0; 1]; men: 1.0 [0; 2]). Male athletes displayed one-handed gripping of the back and sleeve, whereas female athletes executed a greater number of groundwork techniques. An optimized probabilistic neural network model, using patterns from the gripping, attack, groundwork, and pause phases, produced an overall prediction accuracy of 76% for discrimination between men and women.
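In the classical (Specht) formulation of a probabilistic neural network, each class density is a sum of Gaussian kernels centred on that class's training examples, and a new case goes to the class with the larger density. The sketch below applies this to simulated per-bout phase counts; the feature definitions, rates and kernel width are assumptions, not the study's variables.

```python
import numpy as np

rng = np.random.default_rng(6)

def pnn_predict(X_train, y_train, x, sigma=1.0):
    """Assign x to the class with the largest Parzen-window density estimate."""
    classes = np.unique(y_train)
    scores = []
    for c in classes:
        Xc = X_train[y_train == c]
        d2 = np.sum((Xc - x) ** 2, axis=1)
        scores.append(np.mean(np.exp(-d2 / (2 * sigma ** 2))))
    return classes[int(np.argmax(scores))]

# Simulated per-bout feature vectors (e.g. attack, gripping, groundwork counts).
men = rng.poisson([7, 4, 2], size=(200, 3))
women = rng.poisson([10, 3, 4], size=(200, 3))
X = np.vstack([men, women]).astype(float)
y = np.array([0] * 200 + [1] * 200)           # 0 = men, 1 = women

pred = np.array([pnn_predict(X, y, x, sigma=2.0) for x in X])
print("training-set accuracy:", np.mean(pred == y))
```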
NASA Astrophysics Data System (ADS)
Smith, L. A.
2012-04-01
There is, at present, no attractive foundation for quantitative probabilistic decision support in the face of model inadequacy, or given ambiguity (deep uncertainty) regarding the relative likelihood of various outcomes, known or unknown. True model error arguably precludes the extraction of objective probabilities from an ensemble of model runs drawn from an available (inadequate) model class, while the acknowledgement of incomplete understanding precludes the justified use of (if not the very formation of) an individual's subjective probabilities. An alternative approach based on Sustainable Odds is proposed and investigated. Sustainable odds differ from "fair odds" (and are easily distinguished from any claim implying well-defined probabilities) because the probabilities implied by sustainable odds, summed over all outcomes, are expected to exceed one. Traditionally, a person's fair odds are found by identifying the probability level at which one would happily accept either side of a bet, so the probabilities implied by fair odds always sum to one. Knowing that one has incomplete information and perhaps even erroneous beliefs, there is no compelling reason a rational agent should accept the constraint implied by "fair odds" in any bet. Rather, a rational agent might insist on longer odds both on the event and against the event in order to account for acknowledged ignorance. Let probabilistic odds denote any set of odds for which the implied probabilities sum to one; once model error is acknowledged, can one rationally demand non-probabilistic odds? The danger of using fair odds (or probabilities) in decision making is illustrated by considering the risk of ruin to which a cooperative insurance scheme offering probabilistic odds is exposed. Cases are presented where knowing merely that the insurer's model is imperfect, and nothing else, is sufficient to place bets which drive the insurer to an unexpectedly early ruin. Methodologies which allow the insurer to avoid this early ruin are explored; those which prevent early ruin are said to provide "sustainable odds", and it is suggested that these must be non-probabilistic. The aim here is not for the insurance cooperative to make a profit in the long run (or to form a book in any one round) but rather to increase the chance that the cooperative will not go bust, merely breaking even in the long run and thereby continuing to provide a service. In the perfect model scenario, with complete knowledge of all uncertainties and unlimited computational resources, fair odds may prove to be sustainable. The implications these results hold in the case of games against nature, which is perhaps a more relevant context for decision makers concerned with geophysical systems, are discussed. The claim that acknowledged model error makes fair (probabilistic) odds an irrational aim is considered, as are the challenges of working within the framework of sustainable (but non-probabilistic) odds.
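The ruin argument can be made concrete with a toy simulation under stated assumptions: an insurer whose model understates the true event probability prices claims either at the model's "fair" level or with a margin, so that the implied probabilities sum to more than one. Everything below (probabilities, policy counts, capital) is invented; it illustrates the direction of the effect, not the paper's actual game.

```python
import numpy as np

rng = np.random.default_rng(7)

p_true, p_model = 0.12, 0.10                  # true vs. (imperfect) model probability
payout, n_policies, n_years, capital0 = 1.0, 500, 50, 40.0

def ruin_probability(margin, n_runs=2000):
    """Fraction of simulated insurers that go bust within n_years."""
    ruined = 0
    for _ in range(n_runs):
        capital = capital0
        for _ in range(n_years):
            premiums = n_policies * payout * p_model * margin
            claims = payout * rng.binomial(n_policies, p_true)
            capital += premiums - claims
            if capital < 0:
                ruined += 1
                break
    return ruined / n_runs

print("ruin prob., fair odds (margin 1.0):   ", ruin_probability(1.0))
print("ruin prob., longer odds (margin 1.25):", ruin_probability(1.25))
```

Pricing at the model's "fair" level drifts the insurer toward ruin whenever the model underestimates the true rate, whereas the margin (longer odds, non-probabilistic in the sense above) provides a buffer against acknowledged ignorance.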
Probabilistic seismic hazard study based on active fault and finite element geodynamic models
NASA Astrophysics Data System (ADS)
Kastelic, Vanja; Carafa, Michele M. C.; Visini, Francesco
2016-04-01
We present a probabilistic seismic hazard analysis (PSHA) that is based exclusively on active faults and geodynamic finite element input models, with seismic catalogues used only for a posterior comparison. We applied the developed model in the External Dinarides, a slowly deforming thrust-and-fold belt at the contact between Adria and Eurasia. Our method consists of establishing two earthquake rupture forecast models: (i) a geological active fault input (GEO) model and (ii) a finite element (FEM) model. The GEO model is based on an active fault database that provides information on fault location and its geometric and kinematic parameters, together with estimates of its slip rate. By default, all deformation in this model is released along the active faults. The FEM model is based on a numerical geodynamic model developed for the study region; in this model, deformation is released not only along the active faults but also in the volumetric continuum elements. From both models we calculated the corresponding activity rates, earthquake rates and the final expected peak ground accelerations. We investigated both the source model and the earthquake model uncertainties by varying the main active fault and earthquake rate calculation parameters, constructing corresponding branches of the seismic hazard logic tree. Hazard maps and UHS curves were produced for horizontal ground motion on bedrock conditions (VS30 ≥ 800 m/s), thereby not considering local site amplification effects. The hazard was computed over a 0.2°-spaced grid considering 648 branches of the logic tree and the mean value at the 10% probability of exceedance in 50 years hazard level, while the 5th and 95th percentiles were also computed to investigate the model limits. We conducted a sensitivity analysis to determine which input parameters influence the final hazard results and to what extent. This comparison shows that the deformation model, with its internal variability, and the choice of ground motion prediction equations (GMPEs) are the most influential parameters; both have a significant effect on the hazard results. Thus, good knowledge of the existence of active faults and of their geometric and activity characteristics is of key importance. We also show that PSHA models based exclusively on active faults and geodynamic inputs, which are thus not dependent on past earthquake occurrences, provide a valid method for seismic hazard calculation.
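The mechanics of turning many logic-tree branches into a mean hazard estimate and percentile bounds can be sketched with synthetic exceedance curves; the curves, weights and functional form below are invented, and a real analysis would build one curve per source-model/GMPE combination.

```python
import numpy as np

rng = np.random.default_rng(8)

pga = np.linspace(0.05, 1.0, 20)                          # peak ground acceleration (g)
n_branches = 648
weights = np.full(n_branches, 1.0 / n_branches)           # assumed equal branch weights

# Synthetic annual exceedance curves with branch-to-branch scatter.
scales = rng.lognormal(mean=np.log(0.01), sigma=0.4, size=n_branches)
curves = scales[:, None] * np.exp(-6.0 * pga[None, :])    # annual rate of exceedance

mean_curve = weights @ curves                             # weighted mean hazard curve
p5, p95 = np.percentile(curves, [5, 95], axis=0)          # branch percentiles

# 10% probability of exceedance in 50 years corresponds to an annual rate of
# about -ln(0.9)/50 ≈ 0.0021 under a Poisson assumption.
target = -np.log(1 - 0.10) / 50.0
pga_mean = np.interp(target, mean_curve[::-1], pga[::-1])  # invert the (decreasing) curve
pga_95 = np.interp(target, p95[::-1], pga[::-1])
print(f"mean-hazard PGA at 10%/50 yr: {pga_mean:.2f} g (95th-percentile branch: {pga_95:.2f} g)")
```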
Enhancing Flood Prediction Reliability Using Bayesian Model Averaging
NASA Astrophysics Data System (ADS)
Liu, Z.; Merwade, V.
2017-12-01
Uncertainty analysis is an indispensable part of modeling the hydrology and hydrodynamics of non-idealized environmental systems. Compared with relying on predictions from a single model simulation, using an ensemble of predictions that considers uncertainty from different sources is more reliable. In this study, Bayesian model averaging (BMA) is applied to the Black River watershed in Arkansas and Missouri by combining multi-model simulations to obtain reliable deterministic water stage and probabilistic inundation extent predictions. The simulation ensemble is generated from 81 LISFLOOD-FP subgrid model configurations that include uncertainty from channel shape, channel width, channel roughness and discharge. Model simulation outputs are trained with observed water stage data during one flood event, and BMA prediction ability is validated for another flood event. Results from this study indicate that BMA does not always outperform all members in the ensemble, but it provides relatively robust deterministic flood stage predictions across the basin. Station-based BMA (BMA_S) water stage prediction performs better than global BMA (BMA_G) prediction, which in turn is superior to the ensemble mean prediction. Additionally, the high-frequency flood inundation extent (probability greater than 60%) in the BMA_G probabilistic map is more accurate than the probabilistic flood inundation extent based on equal weights.
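A stripped-down version of the BMA combination step is sketched below: member weights are derived from each member's likelihood against training-period stage observations (assuming Gaussian errors rather than the usual EM fit), and the weighted mixture yields both a deterministic stage and an exceedance probability. The ensemble and observations are simulated, not LISFLOOD-FP output.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(9)

# Simulated training data: 60 observed stages and an 81-member ensemble with biases.
n_members, n_train = 81, 60
truth = np.cumsum(rng.normal(0, 0.2, n_train)) + 10.0             # observed stage (m)
bias = rng.normal(0, 0.3, n_members)
ensemble = truth[None, :] + bias[:, None] + rng.normal(0, 0.25, (n_members, n_train))

sigma = 0.25                                                      # assumed error std. dev.
log_lik = norm.logpdf(truth[None, :], loc=ensemble, scale=sigma).sum(axis=1)
weights = np.exp(log_lik - log_lik.max())
weights /= weights.sum()                                          # BMA member weights

# Deterministic BMA stage forecast and a probabilistic exceedance statement
# for a new time step (each member issues one new prediction).
new_preds = 10.5 + bias + rng.normal(0, 0.25, n_members)
bma_stage = weights @ new_preds
p_exceed = weights @ norm.sf(10.6, loc=new_preds, scale=sigma)    # P(stage > 10.6 m)
print(f"BMA stage: {bma_stage:.2f} m, P(stage > 10.6 m) = {p_exceed:.2f}")
```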
SHM-Based Probabilistic Fatigue Life Prediction for Bridges Based on FE Model Updating
Lee, Young-Joo; Cho, Soojin
2016-01-01
Fatigue life prediction for a bridge should be based on the current condition of the bridge, and various sources of uncertainty, such as material properties, anticipated vehicle loads and environmental conditions, make the prediction very challenging. This paper presents a new approach for probabilistic fatigue life prediction for bridges using finite element (FE) model updating based on structural health monitoring (SHM) data. Recently, various types of SHM systems have been used to monitor and evaluate the long-term structural performance of bridges. For example, SHM data can be used to estimate the degradation of an in-service bridge, which makes it possible to update the initial FE model. The proposed method consists of three steps: (1) identifying the modal properties of a bridge, such as mode shapes and natural frequencies, based on the ambient vibration under passing vehicles; (2) updating the structural parameters of an initial FE model using the identified modal properties; and (3) predicting the probabilistic fatigue life using the updated FE model. The proposed method is demonstrated by application to a numerical model of a bridge, and the impact of FE model updating on the bridge fatigue life is discussed. PMID:26950125
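Step (3) of the procedure can be illustrated in isolation: given stress ranges at a fatigue-critical detail from the updated FE model, uncertainty in the S-N parameters and traffic is propagated by Monte Carlo to a distribution of fatigue life. All parameter values below are invented placeholders, not calibrated bridge data.

```python
import numpy as np

rng = np.random.default_rng(10)

n = 100_000
log_A = rng.normal(12.5, 0.25, n)                       # S-N intercept (log10), uncertain
m = 3.0                                                 # S-N slope, assumed fixed
stress_range = rng.lognormal(np.log(25.0), 0.10, n)     # MPa, from the updated FE model
cycles_per_year = rng.normal(2.0e6, 2.0e5, n)           # vehicle-induced cycles per year

cycles_to_failure = 10.0 ** log_A / stress_range ** m   # S-N curve: N = A / S^m
life_years = cycles_to_failure / cycles_per_year        # constant-amplitude Miner's rule

print(f"median fatigue life: {np.median(life_years):.0f} years")
print(f"P(life < 75 years) = {np.mean(life_years < 75):.3f}")
```

In the full procedure, the distribution of `stress_range` would itself shift as the FE model is updated from monitored modal properties, which is how SHM data feed into the life prediction.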
A Probabilistic Model of Meter Perception: Simulating Enculturation.
van der Weij, Bastiaan; Pearce, Marcus T; Honing, Henkjan
2017-01-01
Enculturation is known to shape the perception of meter in music but this is not explicitly accounted for by current cognitive models of meter perception. We hypothesize that the induction of meter is a result of predictive coding: interpreting onsets in a rhythm relative to a periodic meter facilitates prediction of future onsets. Such prediction, we hypothesize, is based on previous exposure to rhythms. As such, predictive coding provides a possible explanation for the way meter perception is shaped by the cultural environment. Based on this hypothesis, we present a probabilistic model of meter perception that uses statistical properties of the relation between rhythm and meter to infer meter from quantized rhythms. We show that our model can successfully predict annotated time signatures from quantized rhythmic patterns derived from folk melodies. Furthermore, we show that by inferring meter, our model improves prediction of the onsets of future events compared to a similar probabilistic model that does not infer meter. Finally, as a proof of concept, we demonstrate how our model can be used in a simulation of enculturation. From the results of this simulation, we derive a class of rhythms that are likely to be interpreted differently by enculturated listeners with different histories of exposure to rhythms.
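The core inference, a posterior over candidate meters given a quantized rhythm, can be written out in a toy form where hand-set metrical salience profiles stand in for the corpus-learned statistics. The onset pattern, candidate meters and profiles below are invented for illustration and are far simpler than the article's model.

```python
import numpy as np

# One bar quantized to a 12-point grid; 1 marks an onset.
onsets = np.array([1, 0, 0, 0, 1, 0, 1, 0, 1, 0, 0, 0])

# Invented metrical salience profiles over the grid for two candidate meters.
profiles = {
    "3/4": np.array([3, 1, 1, 1, 2, 1, 1, 1, 2, 1, 1, 1], dtype=float),
    "6/8": np.array([3, 1, 2, 1, 2, 1, 2, 1, 2, 1, 2, 1], dtype=float),
}
prior = {"3/4": 0.5, "6/8": 0.5}                      # assumed uniform prior over meters

posterior = {}
for meter, sal in profiles.items():
    p_onset = sal / (sal.max() + 1.0)                 # salience -> onset probability
    lik = np.prod(np.where(onsets == 1, p_onset, 1 - p_onset))
    posterior[meter] = prior[meter] * lik             # Bayes: P(meter | rhythm) ∝ lik * prior

z = sum(posterior.values())
for meter, p in posterior.items():
    print(meter, round(p / z, 3))
```

In the article's model the analogue of these profiles is learned from exposure to a rhythm corpus, which is what lets the simulation of enculturation produce listeners with different interpretive biases.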
Larkin, Joshua D; Jenni, Nicole L; Floresco, Stan B
2016-01-01
Dopamine (DA) transmission within cortico-limbic-striatal circuitry is integral in modulating decisions involving reward uncertainty. The basolateral amygdala (BLA) also plays a role in these processes, yet how DA transmission within this nucleus regulates cost/benefit decision making is unknown. We investigated the contribution of DA transmission within the BLA to risk/reward decision making assessed with a probabilistic discounting task. Rats were well-trained to choose between a small/certain reward and a large/risky reward, with the probability of obtaining the larger reward decreasing (100-12.5 %) or increasing (12.5-100 %) over a session. We examined the effects of antagonizing BLA D1 (SCH 23390, 0.1-1 μg) or D2 (eticlopride, 0.1-1 μg) receptors, as well as intra-BLA infusions of agonists for D1 (SKF 81297, 0.1-1 μg) and D2 (quinpirole, 1-10 μg) receptors. We also assessed how DA receptor stimulation may induce differential effects related to baseline levels of risky choice. BLA D1 receptor antagonism reduced risky choice by decreasing reward sensitivity, whereas D2 antagonism did not affect overall choice patterns. Stimulation of BLA D1 receptors optimized decision making in a baseline-dependent manner: in risk-averse rats, infusions of a lower dose of SKF81297 increased risky choice when reward probabilities were high (50 %), whereas in risk-prone rats, this drug reduced risky choice when probabilities were low (12.5 %). Quinpirole reduced risky choice in risk-prone rats, enhancing lose-shift behavior. These data highlight previously uncharacterized roles for BLA DA D1 and D2 receptors in biasing choice during risk/reward decision making through mediation of reward/negative feedback sensitivity.