Science.gov

Sample records for probabilistic choice models

  1. A probabilistic choice model based on Tsallis’ statistics

    NASA Astrophysics Data System (ADS)

    Takahashi, Taiki

    2007-12-01

    Decision under risk and uncertainty (probabilistic choice) has been attracting attention in econophysics and neuroeconomics. This paper proposes a probabilistic choice model based on a mathematical equivalence of delay and uncertainty in decision-making and on the deformed algebra developed in Tsallis' non-extensive thermodynamics. Furthermore, it is shown that this model can be utilized to quantify the degree of consistency in probabilistic choice in humans and animals. Future directions in the application of the model to studies in econophysics, neurofinance, neuroeconomics, and social physics are discussed.
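
    The abstract does not reproduce the functional form, so the sketch below shows one common q-exponential (Tsallis) discounting parameterization used in this literature, with uncertainty expressed as the odds against winning, theta = (1 - p)/p; the exact equation and all parameter values are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def q_exponential_value(amount, theta, k, q):
    """Subjective value of a probabilistic reward under a q-exponential
    (Tsallis-statistics) discount function.

    theta: odds against winning, (1 - p) / p, playing the role of 'delay'.
    k:     discount rate; q: Tsallis deformation parameter (q -> 1 recovers
           the ordinary exponential discount function).
    """
    if np.isclose(q, 1.0):
        return amount * np.exp(-k * theta)
    return amount * (1.0 + (1.0 - q) * k * theta) ** (-1.0 / (1.0 - q))

# Example: $100 with p = 0.5 (theta = 1), k = 0.2, for two values of q.
for q in (1.0, 0.7):
    print(q, q_exponential_value(100.0, 1.0, 0.2, q))
```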

  2. Probabilistic choice models in health-state valuation research: background, theories, assumptions and applications.

    PubMed

    Arons, Alexander M M; Krabbe, Paul F M

    2013-02-01

    Interest is rising in measuring subjective health outcomes, such as treatment outcomes that are not directly quantifiable (functional disability, symptoms, complaints, side effects and health-related quality of life). Health economists in particular have applied probabilistic choice models in the area of health evaluation. They increasingly use discrete choice models based on random utility theory to derive values for healthcare goods or services. Recent attempts have been made to use discrete choice models as an alternative method to derive values for health states. In this article, various probabilistic choice models are described according to their underlying theory. A historical overview traces their development and applications in diverse fields. The discussion highlights some theoretical and technical aspects of the choice models and their similarity and dissimilarity. The objective of the article is to elucidate the position of each model and their applications for health-state valuation.
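
    As background to the random-utility-based discrete choice models the abstract refers to, here is a minimal sketch of the conditional (multinomial) logit choice probability that underlies much of that family; the utility values are hypothetical, not health-state values from the article.

```python
import numpy as np

def logit_choice_probabilities(utilities):
    """Conditional logit: P(choose i) = exp(V_i) / sum_j exp(V_j).
    Under random utility theory this follows from adding i.i.d. Gumbel
    noise to the deterministic utilities V_i."""
    v = np.asarray(utilities, dtype=float)
    v = v - v.max()              # subtract max for numerical stability
    expv = np.exp(v)
    return expv / expv.sum()

# Illustrative (hypothetical) utilities for three health states.
print(logit_choice_probabilities([0.8, 0.5, -0.2]))
```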

  3. Generalized outcome-based strategy classification: comparing deterministic and probabilistic choice models.

    PubMed

    Hilbig, Benjamin E; Moshagen, Morten

    2014-12-01

    Model comparisons are a vital tool for disentangling which of several strategies a decision maker may have used--that is, which cognitive processes may have governed observable choice behavior. However, previous methodological approaches have been limited to models (i.e., decision strategies) with deterministic choice rules. As such, psychologically plausible choice models--such as evidence-accumulation and connectionist models--that entail probabilistic choice predictions could not be considered appropriately. To overcome this limitation, we propose a generalization of Bröder and Schiffer's (Journal of Behavioral Decision Making, 19, 361-380, 2003) choice-based classification method, relying on (1) parametric order constraints in the multinomial processing tree framework to implement probabilistic models and (2) minimum description length for model comparison. The advantages of the generalized approach are demonstrated through recovery simulations and an experiment. In explaining previous methods and our generalization, we maintain a nontechnical focus--so as to provide a practical guide for comparing both deterministic and probabilistic choice models.

  4. A probabilistic, dynamic, and attribute-wise model of intertemporal choice.

    PubMed

    Dai, Junyi; Busemeyer, Jerome R

    2014-08-01

    Most theoretical and empirical research on intertemporal choice assumes a deterministic and static perspective, leading to the widely adopted delay discounting models. As a form of preferential choice, however, intertemporal choice may be generated by a stochastic process that requires some deliberation time to reach a decision. We conducted 3 experiments to investigate how choice and decision time varied as a function of manipulations designed to examine the delay duration effect, the common difference effect, and the magnitude effect in intertemporal choice. The results, especially those associated with the delay duration effect, challenged the traditional deterministic and static view and called for alternative approaches. Consequently, various static or dynamic stochastic choice models were explored and fit to the choice data, including alternative-wise models derived from the traditional exponential or hyperbolic discount function and attribute-wise models built upon comparisons of direct or relative differences in money and delay. Furthermore, for the first time, dynamic diffusion models, such as those based on decision field theory, were also fit to the choice and response time data simultaneously. The results revealed that the attribute-wise diffusion model with direct differences, power transformations of objective value and time, and varied diffusion parameter performed the best and could account for all 3 intertemporal effects. In addition, the empirical relationship between choice proportions and response times was consistent with the prediction of diffusion models and thus favored a stochastic choice process for intertemporal choice that requires some deliberation time to make a decision.
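
    The fitted models are not given in the abstract; the sketch below only illustrates the general idea of an attribute-wise diffusion process for intertemporal choice, with drift driven by direct differences in money and delay. The drift rule, parameter values and scaling are simplified assumptions for illustration, not the authors' fitted specification.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_attribute_wise_diffusion(x_ss, t_ss, x_ll, t_ll,
                                      w=0.5, threshold=1.0, dt=0.01,
                                      noise_sd=1.0, t_nondecision=0.3):
    """One trial of a simple attribute-wise diffusion process for
    intertemporal choice. Drift is a weighted direct difference between
    the money attribute (favouring the larger-later option) and the delay
    attribute (favouring the smaller-sooner option)."""
    drift = w * (x_ll - x_ss) - (1.0 - w) * (t_ll - t_ss)
    evidence, t = 0.0, 0.0
    while abs(evidence) < threshold:
        evidence += drift * dt + noise_sd * np.sqrt(dt) * rng.normal()
        t += dt
    choice = "larger-later" if evidence > 0 else "smaller-sooner"
    return choice, t + t_nondecision   # decision time plus nondecision time

choices = [simulate_attribute_wise_diffusion(5.0, 0.0, 8.0, 2.0)[0]
           for _ in range(1000)]
print("P(larger-later) ≈", choices.count("larger-later") / 1000)
```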

  5. On Choosing Between Two Probabilistic Choice Sub-models in a Dynamic Multitask Environment

    NASA Technical Reports Server (NTRS)

    Soulsby, E. P.

    1984-01-01

    An independent random utility model based on Thurstone's Theory of Comparative Judgment and a constant utility model based on Luce's Choice Axiom are reviewed in detail. Predictions from the two models are shown to be equivalent under certain restrictions on the distribution of the underlying random process. Each model is applied as a stochastic choice submodel in a dynamic, multitask environment. Resulting choice probabilities are nearly identical, indicating that, despite their conceptual differences, neither model may be preferred over the other based solely on its predictive capability.
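
    The classic restriction under which the two sub-models agree is that the random utilities carry i.i.d. extreme-value (Gumbel) noise rather than Thurstone's normal noise: the resulting binary choice probabilities then coincide with Luce's ratio rule. A small simulation illustrating that equivalence (the utility values are arbitrary illustrative numbers):

```python
import numpy as np

rng = np.random.default_rng(1)

def luce_probability(u_a, u_b):
    """Luce's choice axiom with ratio-scale values v = exp(u)."""
    va, vb = np.exp(u_a), np.exp(u_b)
    return va / (va + vb)

def random_utility_gumbel_probability(u_a, u_b, n=200_000):
    """Random utility model: choose A iff u_a + e_a > u_b + e_b with
    i.i.d. Gumbel noise -- the restriction under which the two
    sub-models make equivalent predictions."""
    e_a = rng.gumbel(size=n)
    e_b = rng.gumbel(size=n)
    return np.mean(u_a + e_a > u_b + e_b)

print(luce_probability(1.0, 0.4))                   # analytic:  ~0.646
print(random_utility_gumbel_probability(1.0, 0.4))  # simulated: ~0.646
```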

  6. Probabilistic Choice, Reversibility, Loops, and Miracles

    NASA Astrophysics Data System (ADS)

    Stoddart, Bill; Bell, Pete

    We consider an addition of probabilistic choice to Abrial's Generalised Substitution Language (GSL) in a form that accommodates the backtracking interpretation of non-deterministic choice. Our formulation is introduced as an extension of the Prospective Values formalism we have developed to describe the results from a backtracking search. Significant features are that probabilistic choice is governed by feasibility, and non-termination is strict. The former property allows us to use probabilistic choice to generate search heuristics. In this paper we are particularly interested in iteration. By demonstrating sub-conjunctivity and monotonicity properties of expectations we give the basis for a fixed point semantics of iterative constructs, and we consider the practical proof treatment of probabilistic loops. We discuss loop invariants, loops with probabilistic behaviour, and probabilistic termination in the context of a formalism in which a small probability of non-termination can dominate our calculations, proposing a method of limits to avoid this problem. The formal programming constructs described have been implemented in a reversible virtual machine (RVM).

  7. The Probabilistic Nature of Preferential Choice

    ERIC Educational Resources Information Center

    Rieskamp, Jorg

    2008-01-01

    Previous research has developed a variety of theories explaining when and why people's decisions under risk deviate from the standard economic view of expected utility maximization. These theories are limited in their predictive accuracy in that they do not explain the probabilistic nature of preferential choice, that is, why an individual makes…

  8. Learning Probabilistic Logic Models from Probabilistic Examples.

    PubMed

    Chen, Jianzhong; Muggleton, Stephen; Santos, José

    2008-10-01

    We revisit an application developed originally using abductive Inductive Logic Programming (ILP) for modeling inhibition in metabolic networks. The example data was derived from studies of the effects of toxins on rats using Nuclear Magnetic Resonance (NMR) time-trace analysis of their biofluids together with background knowledge representing a subset of the Kyoto Encyclopedia of Genes and Genomes (KEGG). We now apply two Probabilistic ILP (PILP) approaches - abductive Stochastic Logic Programs (SLPs) and PRogramming In Statistical modeling (PRISM) to the application. Both approaches support abductive learning and probability predictions. Abductive SLPs are a PILP framework that provides possible worlds semantics to SLPs through abduction. Instead of learning logic models from non-probabilistic examples as done in ILP, the PILP approach applied in this paper is based on a general technique for introducing probability labels within a standard scientific experimental setting involving control and treated data. Our results demonstrate that the PILP approach provides a way of learning probabilistic logic models from probabilistic examples, and the PILP models learned from probabilistic examples lead to a significant decrease in error accompanied by improved insight from the learned results compared with the PILP models learned from non-probabilistic examples.

  9. Probabilistic microcell prediction model

    NASA Astrophysics Data System (ADS)

    Kim, Song-Kyoo

    2002-06-01

    A microcell is a cell with a radius of 1 km or less, suitable for heavily urbanized areas such as a metropolitan city. This paper deals with a microcell prediction model of propagation loss that uses probabilistic techniques. The RSL (Receive Signal Level) is the factor used to evaluate the performance of a microcell, and the LOS (Line-Of-Sight) component and the blockage loss directly affect the RSL. Probabilistic methods are combined to obtain these performance factors. The mathematical methods include the CLT (Central Limit Theorem) and SPC (Statistical Process Control) to obtain the parameters of the distribution. This probabilistic solution gives a better measure of the performance factors. In addition, it enables probabilistic optimization of strategies such as the number of cells, cell location, cell capacity, cell range and so on. In particular, the probabilistic optimization techniques themselves can be applied to real-world problems such as computer networking, human resources and manufacturing processes.

  10. Probabilistic Mesomechanical Fatigue Model

    NASA Technical Reports Server (NTRS)

    Tryon, Robert G.

    1997-01-01

    A probabilistic mesomechanical fatigue life model is proposed to link microstructural material heterogeneities to the statistical scatter in the macrostructural response. The macrostructure is modeled as an ensemble of microelements. Cracks nucleate within the microelements and grow from the microelements to final fracture. Variations of the microelement properties are defined using statistical parameters. A micromechanical slip-band decohesion model is used to determine the crack nucleation life and size. A crack-tip opening displacement model is used to determine the small-crack growth life and size. The Paris law is used to determine the long-crack growth life. The models are combined in a Monte Carlo simulation to determine the statistical distribution of total fatigue life for the macrostructure. The modeled response is compared to trends in experimental observations from the literature.
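
    As an illustration of the overall Monte Carlo structure described above, the sketch below combines placeholder nucleation and small-crack lives with a Paris-law long-crack stage; all distributions, constants and units are assumptions for illustration and are not the calibrated values of the proposed model.

```python
import numpy as np

rng = np.random.default_rng(42)

def paris_law_cycles(a0, af, C, m, delta_sigma, Y=1.0):
    """Cycles to grow a crack from a0 to af under the Paris law
    da/dN = C * (dK)^m with dK = Y * dS * sqrt(pi * a), integrated in
    closed form assuming m != 2 and constant geometry factor Y."""
    exponent = 1.0 - m / 2.0
    factor = C * (Y * delta_sigma * np.sqrt(np.pi)) ** m
    return (af ** exponent - a0 ** exponent) / (exponent * factor)

def sample_total_life():
    """One Monte Carlo realisation: placeholder lognormal nucleation and
    small-crack lives plus a Paris-law long-crack stage."""
    n_nucleation = rng.lognormal(mean=11.0, sigma=0.4)    # placeholder
    n_small_crack = rng.lognormal(mean=10.0, sigma=0.5)   # placeholder
    a0 = rng.uniform(5e-5, 1e-4)   # initial long-crack size [m], placeholder
    n_long_crack = paris_law_cycles(a0, af=5e-3, C=1e-11, m=3.0,
                                    delta_sigma=200.0)
    return n_nucleation + n_small_crack + n_long_crack

lives = np.array([sample_total_life() for _ in range(10_000)])
print("median life:", np.median(lives),
      "10th percentile:", np.percentile(lives, 10))
```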

  11. A Discounting Framework for Choice With Delayed and Probabilistic Rewards

    PubMed Central

    Green, Leonard; Myerson, Joel

    2005-01-01

    When choosing between delayed or uncertain outcomes, individuals discount the value of such outcomes on the basis of the expected time to or the likelihood of their occurrence. In an integrative review of the expanding experimental literature on discounting, the authors show that although the same form of hyperbola-like function describes discounting of both delayed and probabilistic outcomes, a variety of recent findings are inconsistent with a single-process account. The authors also review studies that compare discounting in different populations and discuss the theoretical and practical implications of the findings. The present effort illustrates the value of studying choice involving both delayed and probabilistic outcomes within a general discounting framework that uses similar experimental procedures and a common analytical approach. PMID:15367080
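
    The hyperbola-like (hyperboloid) discounting function discussed in this literature takes the same form for delayed and probabilistic rewards, with delay D replaced by the odds against receipt, theta = (1 - p)/p; the parameter values below are illustrative.

```python
def hyperboloid_value(amount, x, k, s=1.0):
    """Hyperbola-like discounting: V = A / (1 + k*x)**s, where x is either
    the delay D or, for probabilistic outcomes, the odds against receipt
    theta = (1 - p) / p."""
    return amount / (1.0 + k * x) ** s

# Delayed reward: $100 in 30 days, k = 0.05 per day (illustrative).
print(hyperboloid_value(100, 30, 0.05))
# Probabilistic reward: $100 with p = 0.25, so theta = 3, h = 1.0 (illustrative).
print(hyperboloid_value(100, (1 - 0.25) / 0.25, 1.0))
```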

  12. Enhanced probabilistic microcell prediction model

    NASA Astrophysics Data System (ADS)

    Kim, Song-Kyoo

    2005-06-01

    A microcell is a cell with a radius of 1 km or less, suitable not only for heavily urbanized areas such as a metropolitan city but also for in-building areas such as offices and shopping malls. This paper deals with a microcell prediction model of propagation loss focused on the in-building solution, analyzed by probabilistic techniques. The RSL (Receive Signal Level) is the factor used to evaluate the performance of a microcell, and the LOS (Line-Of-Sight) component and the blockage loss directly affect the RSL. Probabilistic methods are combined to obtain these performance factors. The mathematical methods include the CLT (Central Limit Theorem) and the SSQC (Six-Sigma Quality Control) to obtain the parameters of the distribution. This probabilistic solution gives a compact measure of the performance factors. In addition, it enables probabilistic optimization of strategies such as the number of cells, cell location, cell capacity, cell range and so on. Moreover, the optimal strategies for antenna allocation within a building can be obtained by using this model.

  13. Effects of Time between Trials on Rats' and Pigeons' Choices with Probabilistic Delayed Reinforcers

    ERIC Educational Resources Information Center

    Mazur, James E.; Biondi, Dawn R.

    2011-01-01

    Parallel experiments with rats and pigeons examined reasons for previous findings that in choices with probabilistic delayed reinforcers, rats' choices were affected by the time between trials whereas pigeons' choices were not. In both experiments, the animals chose between a standard alternative and an adjusting alternative. A choice of the…

  14. Probabilistic models of language processing and acquisition.

    PubMed

    Chater, Nick; Manning, Christopher D

    2006-07-01

    Probabilistic methods are providing new explanatory approaches to fundamental cognitive science questions of how humans structure, process and acquire language. This review examines probabilistic models defined over traditional symbolic structures. Language comprehension and production involve probabilistic inference in such models; and acquisition involves choosing the best model, given innate constraints and linguistic and other input. Probabilistic models can account for the learning and processing of language, while maintaining the sophistication of symbolic models. A recent burgeoning of theoretical developments and online corpus creation has enabled large models to be tested, revealing probabilistic constraints in processing, undermining acquisition arguments based on a perceived poverty of the stimulus, and suggesting fruitful links with probabilistic theories of categorization and ambiguity resolution in perception.

  15. Relative Gains, Losses, and Reference Points in Probabilistic Choice in Rats

    PubMed Central

    Marshall, Andrew T.; Kirkpatrick, Kimberly

    2015-01-01

    Theoretical reference points have been proposed to differentiate probabilistic gains from probabilistic losses in humans, but such a phenomenon in non-human animals has yet to be thoroughly elucidated. Three experiments evaluated the effect of reward magnitude on probabilistic choice in rats, seeking to determine reference point use by examining the effect of previous outcome magnitude(s) on subsequent choice behavior. Rats were trained to choose between an outcome that always delivered reward (low-uncertainty choice) and one that probabilistically delivered reward (high-uncertainty). The probability of high-uncertainty outcome receipt and the magnitudes of low-uncertainty and high-uncertainty outcomes were manipulated within and between experiments. Both the low- and high-uncertainty outcomes involved variable reward magnitudes, so that either a smaller or larger magnitude was probabilistically delivered, as well as reward omission following high-uncertainty choices. In Experiments 1 and 2, the between groups factor was the magnitude of the high-uncertainty-smaller (H-S) and high-uncertainty-larger (H-L) outcome, respectively. The H-S magnitude manipulation differentiated the groups, while the H-L magnitude manipulation did not. Experiment 3 showed that manipulating the probability of differential losses as well as the expected value of the low-uncertainty choice produced systematic effects on choice behavior. The results suggest that the reference point for probabilistic gains and losses was the expected value of the low-uncertainty choice. Current theories of probabilistic choice behavior have difficulty accounting for the present results, so an integrated theoretical framework is proposed. Overall, the present results have implications for understanding individual differences and corresponding underlying mechanisms of probabilistic choice behavior. PMID:25658448

  16. Relative gains, losses, and reference points in probabilistic choice in rats.

    PubMed

    Marshall, Andrew T; Kirkpatrick, Kimberly

    2015-01-01

    Theoretical reference points have been proposed to differentiate probabilistic gains from probabilistic losses in humans, but such a phenomenon in non-human animals has yet to be thoroughly elucidated. Three experiments evaluated the effect of reward magnitude on probabilistic choice in rats, seeking to determine reference point use by examining the effect of previous outcome magnitude(s) on subsequent choice behavior. Rats were trained to choose between an outcome that always delivered reward (low-uncertainty choice) and one that probabilistically delivered reward (high-uncertainty). The probability of high-uncertainty outcome receipt and the magnitudes of low-uncertainty and high-uncertainty outcomes were manipulated within and between experiments. Both the low- and high-uncertainty outcomes involved variable reward magnitudes, so that either a smaller or larger magnitude was probabilistically delivered, as well as reward omission following high-uncertainty choices. In Experiments 1 and 2, the between groups factor was the magnitude of the high-uncertainty-smaller (H-S) and high-uncertainty-larger (H-L) outcome, respectively. The H-S magnitude manipulation differentiated the groups, while the H-L magnitude manipulation did not. Experiment 3 showed that manipulating the probability of differential losses as well as the expected value of the low-uncertainty choice produced systematic effects on choice behavior. The results suggest that the reference point for probabilistic gains and losses was the expected value of the low-uncertainty choice. Current theories of probabilistic choice behavior have difficulty accounting for the present results, so an integrated theoretical framework is proposed. Overall, the present results have implications for understanding individual differences and corresponding underlying mechanisms of probabilistic choice behavior.

  17. Probabilistic Survivability Versus Time Modeling

    NASA Technical Reports Server (NTRS)

    Joyner, James J., Sr.

    2015-01-01

    This technical paper documents the Kennedy Space Center Independent Assessment team's work on three assessments for the Ground Systems Development and Operations (GSDO) Program to assist the Chief Safety and Mission Assurance Officer (CSO) and GSDO management during key programmatic reviews. The assessments provided the GSDO Program with an analysis of how egress time affects the likelihood of astronaut and worker survival during an emergency. For each assessment, the team developed probability distributions for hazard scenarios to address statistical uncertainty, resulting in survivability plots over time. The first assessment developed a mathematical model of probabilistic survivability versus time to reach a safe location using an ideal Emergency Egress System at Launch Complex 39B (LC-39B); the second used the first model to evaluate and compare various egress systems under consideration at LC-39B. The third used a modified LC-39B model to determine if a specific hazard decreased survivability more rapidly than other events during flight hardware processing in Kennedy's Vehicle Assembly Building (VAB). Based on the composite survivability-versus-time graphs from the first two assessments, there was a soft knee in the Figure of Merit graphs at eight minutes (ten minutes after egress was ordered). Thus, the graphs illustrated to the decision makers that the final emergency egress design selected should be capable of transporting the flight crew from the top of LC-39B to a safe location in eight minutes or less. Results for the third assessment were dominated by hazards classified as instantaneous in nature (e.g., stacking mishaps) and therefore had no effect on survivability versus time to egress the VAB. VAB emergency scenarios that degraded over time (e.g., fire) produced survivability-versus-time graphs that were in line with aerospace industry norms.

  18. Modelling structured data with Probabilistic Graphical Models

    NASA Astrophysics Data System (ADS)

    Forbes, F.

    2016-05-01

    Most clustering and classification methods are based on the assumption that the objects to be clustered are independent. However, in more and more modern applications, data are structured in a way that makes this assumption unrealistic and potentially misleading. A typical example that can be viewed as a clustering task is image segmentation, where the objects are the pixels on a regular grid and depend on neighbouring pixels on this grid. Also, when data are geographically located, it is of interest to cluster data with an underlying dependence structure accounting for spatial localisation. These spatial interactions can be naturally encoded via a graph, which need not be regular like a grid. Data sets can then be modelled via Markov random fields and mixture models (e.g. the so-called MRF and hidden MRF). More generally, probabilistic graphical models are tools that can be used to represent and manipulate data in a structured way while modeling uncertainty. This chapter introduces the basic concepts. The two main classes of probabilistic graphical models are considered: Bayesian networks and Markov networks. The key concept of conditional independence and its link to Markov properties is presented. The main problems that can be solved with such tools are described, and some illustrations drawn from practical work are given.
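
    As a minimal illustration of the factorisation idea behind Bayesian networks, the sketch below encodes a three-node chain and uses its conditional independence structure to compute a marginal; the network and all probabilities are invented for illustration.

```python
# A three-node Bayesian network  Cloudy -> Rain -> WetGrass, so the joint
# factorises as P(C, R, W) = P(C) * P(R | C) * P(W | R), and W is
# conditionally independent of C given R. All probabilities are illustrative.
p_c = {True: 0.4, False: 0.6}
p_r_given_c = {True: {True: 0.7, False: 0.3}, False: {True: 0.1, False: 0.9}}
p_w_given_r = {True: {True: 0.9, False: 0.1}, False: {True: 0.2, False: 0.8}}

def joint(c, r, w):
    """Joint probability from the factorised form."""
    return p_c[c] * p_r_given_c[c][r] * p_w_given_r[r][w]

# Marginal probability of wet grass, obtained by summing the factorised joint.
p_wet = sum(joint(c, r, True) for c in (True, False) for r in (True, False))
print(p_wet)   # 0.438 with these illustrative numbers
```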

  19. Choice Behavior in Pigeons Maintained with Probabilistic Schedules of Reinforcement

    ERIC Educational Resources Information Center

    Moore, Jay; Friedlen, Karen E.

    2007-01-01

    Pigeons were trained in three experiments with a two-key, concurrent-chains choice procedure. The initial links were equal variable-interval schedules, and the terminal links were random-time schedules with equal average interreinforcement intervals. Across the three experiments, the pigeons either stayed in a terminal link until a reinforcer was…

  20. A Probabilistic Model of Melody Perception

    ERIC Educational Resources Information Center

    Temperley, David

    2008-01-01

    This study presents a probabilistic model of melody perception, which infers the key of a melody and also judges the probability of the melody itself. The model uses Bayesian reasoning: For any "surface" pattern and underlying "structure," we can infer the structure maximizing P(structure [vertical bar] surface) based on knowledge of P(surface,…

  1. Probabilistic modeling of subgrade soil strengths

    NASA Astrophysics Data System (ADS)

    Chou, Y. T.

    1981-09-01

    A concept of spatial average in probabilistic modeling of subgrade soil strength is presented. The advantage of the application of spatial average to pavement engineering is explained. The link between the concept and the overall probability-based pavement design procedure is formulated and explained. In the earlier part of the report, a literature review of the concept and procedure of probabilistic design of pavements, which includes the concepts of variations and reliability, is presented. Finally, an outline of a probability based pavement design procedure for the Corps of Engineers is presented.

  2. Probabilistic and Non-probabilistic Synthetic Reliability Model for Space Structures

    NASA Astrophysics Data System (ADS)

    Hong, Dongpao; Hu, Xiao; Zhang, Jing

    2016-07-01

    As an alternative to probabilistic reliability analysis, the non-probabilistic model is an effective supplement when only interval information exists. We describe the uncertain parameters of the structures with interval variables and establish a non-probabilistic reliability model of structures. Then, we analyze the relation between the typical interference mode and the reliability according to the structural stress-strength interference model, and propose a new measure of structural non-probabilistic reliability. Furthermore, we describe other uncertain parameters with random variables when probabilistic information also exists. For complex structures including both random variables and interval variables, we propose a probabilistic and non-probabilistic synthetic reliability model. The illustrative example shows that the presented model is feasible for structural reliability analysis and design.

  3. Probabilistic modeling of children's handwriting

    NASA Astrophysics Data System (ADS)

    Puri, Mukta; Srihari, Sargur N.; Hanson, Lisa

    2013-12-01

    Little work has been done on the analysis of children's handwriting, which can be useful in developing automatic evaluation systems and in quantifying handwriting individuality. We consider the statistical analysis of children's handwriting in the early grades. Samples of handwriting of children in Grades 2-4 who were taught the Zaner-Bloser style were considered. The commonly occurring word "and", written in cursive style as well as hand-print, was extracted from extended writing. The samples were assigned feature values by human examiners using a truthing tool. The human examiners looked at how the children constructed letter formations in their writing, looking for similarities to and differences from the instructions taught in the handwriting copy book. These similarities and differences were measured using a feature-space distance measure. Results indicate that the handwriting develops towards greater conformity with the class characteristics of the Zaner-Bloser copybook, which, with practice, is the expected result. Bayesian networks were learnt from the data to enable answering various probabilistic queries, such as determining which students may continue to produce letter formations as taught during lessons in school, which students will develop different letter formations or variations of those formations, and the number of different types of letter formations.

  4. Probabilistic Models for Solar Particle Events

    NASA Technical Reports Server (NTRS)

    Adams, James H., Jr.; Dietrich, W. F.; Xapsos, M. A.; Welton, A. M.

    2009-01-01

    Probabilistic models of Solar Particle Events (SPEs) are used in space mission design studies to provide a description of the worst-case radiation environment that the mission must be designed to tolerate. The models determine the worst-case environment using a description of the mission and a user-specified confidence level that the provided environment will not be exceeded. This poster will focus on completing the existing suite of models by developing models for peak flux and event-integrated fluence elemental spectra for the Z>2 elements. It will also discuss methods to take into account uncertainties in the database and the uncertainties resulting from the limited number of solar particle events in the database. These new probabilistic models are based on an extensive survey of SPE measurements of peak and event-integrated elemental differential energy spectra. Attempts are made to fit the measured spectra with eight different published models. The model giving the best fit to each spectrum is chosen and used to represent that spectrum for any energy in the energy range covered by the measurements. The set of all such spectral representations for each element is then used to determine the worst-case spectrum as a function of confidence level. The spectral representation that best fits these worst-case spectra is found and its dependence on confidence level is parameterized. This procedure creates probabilistic models for the peak and event-integrated spectra.

  5. The Repeated Insertion Model for Rankings: Missing Link between Two Subset Choice Models

    ERIC Educational Resources Information Center

    Doignon, Jean-Paul; Pekec, Aleksandar; Regenwetter, Michel

    2004-01-01

    Several probabilistic models for subset choice have been proposed in the literature, for example, to explain approval voting data. We show that Marley et al.'s latent scale model is subsumed by Falmagne and Regenwetter's size-independent model, in the sense that every choice probability distribution generated by the former can also be explained by…

  6. Probabilistic drought classification using gamma mixture models

    NASA Astrophysics Data System (ADS)

    Mallya, Ganeshchandra; Tripathi, Shivam; Govindaraju, Rao S.

    2015-07-01

    Drought severity is commonly reported using drought classes obtained by assigning pre-defined thresholds on drought indices. Current drought classification methods ignore modeling uncertainties and provide discrete drought classifications. However, the users of drought classification are often interested in knowing the inherent uncertainties in classification so that they can make informed decisions. Recent studies have used hidden Markov models (HMM) for quantifying uncertainties in drought classification. The HMM method conceptualizes drought classes as distinct hydrological states that are not observed (hidden) but affect observed hydrological variables. The number of drought classes or hidden states in the model is pre-specified, which can sometimes result in a model over-specification problem. This study proposes an alternate method for probabilistic drought classification where the number of states in the model is determined by the data. The proposed method adapts the Standardized Precipitation Index (SPI) methodology of drought classification by employing a gamma mixture model (Gamma-MM) in a Bayesian framework. The method alleviates the problem of choosing a suitable distribution for fitting data in SPI analysis, quantifies modeling uncertainties, and propagates them for probabilistic drought classification. The method is tested on rainfall data over India. Comparison of the results with standard SPI shows important differences, particularly when SPI assumptions on data distribution are violated. Further, the new method is simpler and more parsimonious than the HMM-based drought classification method and can be a viable alternative for probabilistic drought classification.
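
    A minimal sketch of the classification step: once a gamma mixture has been fitted to precipitation totals, the posterior component-membership probabilities provide the probabilistic classification. The component parameters and their mapping to drought classes below are placeholders, not values from the study.

```python
import numpy as np
from scipy import stats

# Placeholder parameters of a fitted two-component gamma mixture
# (weights, shapes, scales); in the paper these are learned from data
# in a Bayesian framework.
weights = np.array([0.4, 0.6])
shapes  = np.array([1.5, 6.0])
scales  = np.array([20.0, 25.0])   # e.g. mm of rainfall

def membership_probabilities(x):
    """Posterior probability that observation x belongs to each mixture
    component: w_k * f_k(x) / sum_j w_j * f_j(x)."""
    dens = weights * stats.gamma.pdf(x, a=shapes, scale=scales)
    return dens / dens.sum()

# A dry-ish monthly total of 40 mm: probability of the 'dry' component
# (component 0 here) versus the 'wet' component.
print(membership_probabilities(40.0))
```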

  7. Probabilistic, meso-scale flood loss modelling

    NASA Astrophysics Data System (ADS)

    Kreibich, Heidi; Botto, Anna; Schröter, Kai; Merz, Bruno

    2016-04-01

    Flood risk analyses are an important basis for decisions on flood risk management and adaptation. However, such analyses are associated with significant uncertainty, even more so if changes in risk due to global change are expected. Although uncertainty analysis and probabilistic approaches have received increased attention during the last years, they are still not standard practice for flood risk assessments, and even less so for flood loss modelling. The state of the art in flood loss modelling is still the use of simple, deterministic approaches like stage-damage functions. Novel probabilistic, multi-variate flood loss models have been developed and validated on the micro-scale using a data-mining approach, namely bagging decision trees (Merz et al. 2013). In this presentation we demonstrate and evaluate the upscaling of the approach to the meso-scale, namely on the basis of land-use units. The model is applied in 19 municipalities which were affected by the 2002 flood of the River Mulde in Saxony, Germany (Botto et al. submitted). The application of bagging decision tree based loss models provides a probability distribution of estimated loss per municipality. Validation is undertaken on the one hand via a comparison with eight deterministic loss models, including stage-damage functions as well as multi-variate models. On the other hand, the results are compared with official loss data provided by the Saxon Relief Bank (SAB). The results show that uncertainties of loss estimation remain high. Thus, the significant advantage of this probabilistic flood loss estimation approach is that it inherently provides quantitative information about the uncertainty of the prediction. References: Merz, B.; Kreibich, H.; Lall, U. (2013): Multi-variate flood damage assessment: a tree-based data-mining approach. NHESS, 13(1), 53-64. Botto A, Kreibich H, Merz B, Schröter K (submitted) Probabilistic, multi-variable flood loss modelling on the meso-scale with BT-FLEMO. Risk Analysis.

  8. Analytic gain in probabilistic decompression sickness models.

    PubMed

    Howle, Laurens E

    2013-11-01

    Decompression sickness (DCS) is a disease known to be related to inert gas bubble formation originating from gases dissolved in body tissues. Probabilistic DCS models, which employ survival and hazard functions, are optimized by fitting model parameters to experimental dive data. In the work reported here, I develop methods to find the survival function gain parameter analytically, thus removing it from the fitting process. I show that the number of iterations required for model optimization is significantly reduced. The analytic gain method substantially improves the condition number of the Hessian matrix which reduces the model confidence intervals by more than an order of magnitude. PMID:24209920

  9. Modelling default and likelihood reasoning as probabilistic

    NASA Technical Reports Server (NTRS)

    Buntine, Wray

    1990-01-01

    A probabilistic analysis of plausible reasoning about defaults and about likelihood is presented. 'Likely' and 'by default' are in fact treated as duals in the same sense as 'possibility' and 'necessity'. To model these four forms probabilistically, a logic QDP and its quantitative counterpart DP are derived that allow qualitative and corresponding quantitative reasoning. Consistency and consequence results for subsets of the logics are given that require at most a quadratic number of satisfiability tests in the underlying propositional logic. The quantitative logic shows how to track the propagation error inherent in these reasoning forms. The methodology and sound framework of the system highlight their approximate nature, the dualities, and the need for complementary reasoning about relevance.

  10. When good pigeons make bad decisions: Choice with probabilistic delays and outcomes.

    PubMed

    Pisklak, Jeffrey M; McDevitt, Margaret A; Dunn, Roger M; Spetch, Marcia L

    2015-11-01

    Pigeons chose between an (optimal) alternative that sometimes provided food after a 10-s delay and other times after a 40-s delay and another (suboptimal) alternative that sometimes provided food after 10 s but other times no food after 40 s. When outcomes were not signaled during the delays, pigeons strongly preferred the optimal alternative. When outcomes were signaled, choices of the suboptimal alternative increased and most pigeons preferred the alternative that provided no food after the long delay despite the cost in terms of obtained food. The pattern of results was similar whether the short delays occurred on 25% or 50% of the trials. Shortening the 40-s delay to food sharply reduced suboptimal choices, but shortening the delay to no food had little effect. The results suggest that a signaled delay to no food does not punish responding in probabilistic choice procedures. The findings are discussed in terms of conditioned reinforcement by signals for good news.

  11. Probabilistic Solar Energetic Particle Models

    NASA Technical Reports Server (NTRS)

    Adams, James H., Jr.; Dietrich, William F.; Xapsos, Michael A.

    2011-01-01

    To plan and design safe and reliable space missions, it is necessary to take into account the effects of the space radiation environment. This is done by setting the goal of achieving safety and reliability with some desired level of confidence. To achieve this goal, a worst-case space radiation environment at the required confidence level must be obtained. Planning and designing then proceeds, taking into account the effects of this worst-case environment. The result will be a mission that is reliable against the effects of the space radiation environment at the desired confidence level. In this paper we will describe progress toward developing a model that provides worst-case space radiation environments at user-specified confidence levels. We will present a model for worst-case event-integrated solar proton environments that provide the worst-case differential proton spectrum. This model is based on data from IMP-8 and GOES spacecraft that provide a data base extending from 1974 to the present. We will discuss extending this work to create worst-case models for peak flux and mission-integrated fluence for protons. We will also describe plans for similar models for helium and heavier ions.

  12. Probabilistic model of Kordylewski clouds

    NASA Astrophysics Data System (ADS)

    Salnikova, T. V.; Stepanov, S. Ya.; Shuvalova, A. I.

    2016-05-01

    We consider the problem of determining the phase-space distribution function for a system of non-interacting dust particles, as a mathematical model of the cosmic-dust Kordylewski clouds: clusters of non-interacting dust particles in the vicinity of the triangular libration points of the Earth-Moon-particle system, taking into account perturbations from the Sun.

  13. Probabilistic models for feedback systems.

    SciTech Connect

    Grace, Matthew D.; Boggs, Paul T.

    2011-02-01

    In previous work, we developed a Bayesian-based methodology to analyze the reliability of hierarchical systems. The output of the procedure is a statistical distribution of the reliability, thus allowing many questions to be answered. The principal advantage of the approach is that along with an estimate of the reliability, we also can provide statements of confidence in the results. The model is quite general in that it allows general representations of all of the distributions involved, it incorporates prior knowledge into the models, it allows errors in the 'engineered' nodes of a system to be determined by the data, and leads to the ability to determine optimal testing strategies. In this report, we provide the preliminary steps necessary to extend this approach to systems with feedback. Feedback is an essential component of 'complexity' and provides interesting challenges in modeling the time-dependent action of a feedback loop. We provide a mechanism for doing this and analyze a simple case. We then consider some extensions to more interesting examples with local control affecting the entire system. Finally, a discussion of the status of the research is also included.

  14. Probabilistic computer model of optimal runway turnoffs

    NASA Technical Reports Server (NTRS)

    Schoen, M. L.; Preston, O. W.; Summers, L. G.; Nelson, B. A.; Vanderlinden, L.; Mcreynolds, M. C.

    1985-01-01

    Landing delays are currently a problem at major air carrier airports, and many forecasters agree that airport congestion will get worse by the end of the century. It is anticipated that some types of delays can be reduced by an efficient optimal runway exit system allowing the increased approach volumes necessary at congested airports. A computerized Probabilistic Runway Turnoff Model, which locates exits and defines path geometry for a selected maximum occupancy time appropriate for each TERPS aircraft category, is defined. The model includes an algorithm for lateral ride comfort limits.

  15. Opportunities of probabilistic flood loss models

    NASA Astrophysics Data System (ADS)

    Schröter, Kai; Kreibich, Heidi; Lüdtke, Stefan; Vogel, Kristin; Merz, Bruno

    2016-04-01

    Oftentimes, traditional uni-variate damage models, such as depth-damage curves, fail to reproduce the variability of observed flood damage. However, reliable flood damage models are a prerequisite for the practical usefulness of the model results. Innovative multi-variate probabilistic modelling approaches are promising to capture and quantify the uncertainty involved and thus to improve the basis for decision making. In this study we compare the predictive capability of two probabilistic modelling approaches, namely bagging decision trees and Bayesian networks, with traditional stage-damage functions. For model evaluation we use empirical damage data which are available from computer-aided telephone interviews that were compiled after the floods of 2002, 2005, 2006 and 2013 in the Elbe and Danube catchments in Germany. We carry out a split-sample test by sub-setting the damage records. One sub-set is used to derive the models and the remaining records are used to evaluate the predictive performance of the model. Further, we stratify the sample according to catchments, which allows studying model performance in a spatial transfer context. Flood damage estimation is carried out on the scale of the individual buildings in terms of relative damage. The predictive performance of the models is assessed in terms of systematic deviations (mean bias), precision (mean absolute error), and the sharpness and reliability of the predictions, where reliability is represented by the proportion of observations that fall within the 5%-95% quantile predictive interval. The comparison of the uni-variate stage-damage function and the multi-variate model approaches emphasises the importance of quantifying predictive uncertainty. With each explanatory variable, the multi-variate model reveals an additional source of uncertainty. However, the predictive performance in terms of mean bias (mbe), mean absolute error (mae) and reliability (HR) is clearly improved.
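
    A rough sketch of one of the probabilistic approaches compared (bagged decision trees), showing how the ensemble members yield an empirical predictive distribution from which a 5%-95% interval hit rate can be computed; the data, variables and scikit-learn setup are synthetic stand-ins, not the interview data or the models used in the study.

```python
import numpy as np
from sklearn.ensemble import BaggingRegressor

rng = np.random.default_rng(0)

# Synthetic stand-in for flood loss data: water depth [m], duration [h],
# building value [k EUR] -> relative loss (placeholder relationship).
n = 1000
X = rng.uniform([0, 0, 50], [3, 72, 500], size=(n, 3))
y = np.clip(0.2 * X[:, 0] + 0.001 * X[:, 1] + rng.normal(0, 0.05, n), 0, 1)

# BaggingRegressor uses decision trees as its default base estimator.
model = BaggingRegressor(n_estimators=200, random_state=0).fit(X[:800], y[:800])

# Per-tree predictions form an empirical predictive distribution.
member_preds = np.stack([tree.predict(X[800:]) for tree in model.estimators_])
lo, hi = np.percentile(member_preds, [5, 95], axis=0)
hit_rate = np.mean((y[800:] >= lo) & (y[800:] <= hi))
print("5%-95% interval hit rate:", hit_rate)
```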

  16. Probabilistic Gompertz model of irreversible growth.

    PubMed

    Bardos, D C

    2005-05-01

    Characterizing organism growth within populations requires the application of well-studied individual size-at-age models, such as the deterministic Gompertz model, to populations of individuals whose characteristics, corresponding to model parameters, may be highly variable. A natural approach is to assign probability distributions to one or more model parameters. In some contexts, size-at-age data may be absent due to difficulties in ageing individuals, but size-increment data may instead be available (e.g., from tag-recapture experiments). A preliminary transformation to a size-increment model is then required. Gompertz models developed along the above lines have recently been applied to strongly heterogeneous abalone tag-recapture data. Although useful in modelling the early growth stages, these models yield size-increment distributions that allow negative growth, which is inappropriate in the case of mollusc shells and other accumulated biological structures (e.g., vertebrae) where growth is irreversible. Here we develop probabilistic Gompertz models where this difficulty is resolved by conditioning parameter distributions on size, allowing application to irreversible growth data. In the case of abalone growth, introduction of a growth-limiting biological length scale is then shown to yield realistic length-increment distributions.
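
    For context, the deterministic Gompertz size-at-age model and the size-increment transformation it implies for tag-recapture data are sketched below; the probabilistic, size-conditioned version developed in the paper is not reproduced, and the parameter values are illustrative.

```python
import numpy as np

def gompertz_size(t, L_inf, k, L0):
    """Deterministic Gompertz size-at-age:
    L(t) = L_inf * (L0 / L_inf) ** exp(-k * t)."""
    return L_inf * (L0 / L_inf) ** np.exp(-k * t)

def gompertz_recapture_size(L1, dt, L_inf, k):
    """Size at recapture after time dt, given size L1 at release; this
    size-increment form follows from the size-at-age model and is what
    tag-recapture data constrain when ages are unknown."""
    return L_inf * (L1 / L_inf) ** np.exp(-k * dt)

# Illustrative abalone-like parameters (placeholders): L_inf = 140 mm, k = 0.4 / yr.
expected_increment = gompertz_recapture_size(L1=60.0, dt=1.0, L_inf=140.0, k=0.4) - 60.0
print(expected_increment)   # ~19 mm with these illustrative values
```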

  17. A probabilistic graphical model based stochastic input model construction

    SciTech Connect

    Wan, Jiang; Zabaras, Nicholas

    2014-09-01

    Model reduction techniques have been widely used in modeling of high-dimensional stochastic input in uncertainty quantification tasks. However, the probabilistic modeling of random variables projected into reduced-order spaces presents a number of computational challenges. Due to the curse of dimensionality, the underlying dependence relationships between these random variables are difficult to capture. In this work, a probabilistic graphical model based approach is employed to learn the dependence by running a number of conditional independence tests using observation data. Thus a probabilistic model of the joint PDF is obtained and the PDF is factorized into a set of conditional distributions based on the dependence structure of the variables. The estimation of the joint PDF from data is then transformed to estimating conditional distributions under reduced dimensions. To improve the computational efficiency, a polynomial chaos expansion is further applied to represent the random field in terms of a set of standard random variables. This technique is combined with both linear and nonlinear model reduction methods. Numerical examples are presented to demonstrate the accuracy and efficiency of the probabilistic graphical model based stochastic input models. - Highlights: • Data-driven stochastic input models without the assumption of independence of the reduced random variables. • The problem is transformed to a Bayesian network structure learning problem. • Examples are given in flows in random media.

  18. Retinal blood vessels extraction using probabilistic modelling.

    PubMed

    Kaba, Djibril; Wang, Chuang; Li, Yongmin; Salazar-Gonzalez, Ana; Liu, Xiaohui; Serag, Ahmed

    2014-01-01

    The analysis of retinal blood vessels plays an important role in detecting and treating retinal diseases. In this review, we present an automated method to segment the blood vessels of fundus retinal images. The proposed method could be used to support a non-intrusive diagnosis in modern ophthalmology for early detection of retinal diseases, treatment evaluation or clinical study. This study combines bias correction and adaptive histogram equalisation to enhance the appearance of the blood vessels. The blood vessels are then extracted using probabilistic modelling optimised by the expectation maximisation algorithm. The method is evaluated on fundus retinal images of the STARE and DRIVE datasets. The experimental results are compared with some recently published methods of retinal blood vessel segmentation. The experimental results show that our method achieved the best overall performance and is comparable to the performance of human experts.
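
    A minimal sketch of the core segmentation step: fitting a two-component mixture to pixel intensities by expectation maximisation and using the posterior probabilities as a soft vessel mask. The Gaussian components, synthetic intensities and threshold are simplifying assumptions; the paper's full pipeline (bias correction, adaptive histogram equalisation, evaluation on STARE/DRIVE) is not reproduced.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Synthetic stand-in for enhanced green-channel intensities: a darker
# 'vessel' population and a brighter 'background' population.
vessel = rng.normal(0.30, 0.05, 2_000)
background = rng.normal(0.60, 0.08, 18_000)
intensities = np.concatenate([vessel, background]).reshape(-1, 1)

# Two-component mixture fitted by expectation maximisation.
gmm = GaussianMixture(n_components=2, random_state=0).fit(intensities)
posterior = gmm.predict_proba(intensities)              # soft (probabilistic) labels
vessel_component = int(np.argmin(gmm.means_.ravel()))   # darker component = vessels
vessel_mask = posterior[:, vessel_component] > 0.5

print("estimated vessel fraction:", vessel_mask.mean())
```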

  19. Modeling Spanish Mood Choice in Belief Statements

    ERIC Educational Resources Information Center

    Robinson, Jason R.

    2013-01-01

    This work develops a computational methodology new to linguistics that empirically evaluates competing linguistic theories on Spanish verbal mood choice through the use of computational techniques to learn mood and other hidden linguistic features from Spanish belief statements found in corpora. The machine learned probabilistic linguistic models…

  1. scoringRules - A software package for probabilistic model evaluation

    NASA Astrophysics Data System (ADS)

    Lerch, Sebastian; Jordan, Alexander; Krüger, Fabian

    2016-04-01

    Models in the geosciences are generally surrounded by uncertainty, and being able to quantify this uncertainty is key to good decision making. Accordingly, probabilistic forecasts in the form of predictive distributions have become popular over the last decades. With the proliferation of probabilistic models arises the need for decision-theoretically principled tools to evaluate the appropriateness of models and forecasts in a generalized way. Various scoring rules have been developed over the past decades to address this demand. Proper scoring rules are functions S(F, y) which evaluate the accuracy of a forecast distribution F, given that an outcome y was observed. As such, they allow comparison of alternative models, a crucial ability given the variety of theories, data sources and statistical specifications available in many situations. This poster presents the software package scoringRules for the statistical programming language R, which contains functions to compute popular scoring rules, such as the continuous ranked probability score, for a variety of distributions F that come up in applied work. The two main classes are parametric distributions like normal, t, or gamma distributions, and distributions that are not known analytically but are indirectly described through a sample of simulation draws. For example, Bayesian forecasts produced via Markov chain Monte Carlo take this form. Thereby, the scoringRules package provides a framework for generalized model evaluation that includes both Bayesian and classical parametric models. The scoringRules package aims to be a convenient dictionary-like reference for computing scoring rules. We offer state-of-the-art implementations of several known (but not routinely applied) formulas, and implement closed-form expressions that were previously unavailable. Whenever more than one implementation variant exists, we offer statistically principled default choices.
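
    The package itself is written in R; to keep all examples here in one language, the sketch below implements in Python one closed-form expression of the kind such packages provide, the continuous ranked probability score (CRPS) of a normal predictive distribution.

```python
import numpy as np
from scipy import stats

def crps_normal(y, mu, sigma):
    """Closed-form continuous ranked probability score of a normal
    predictive distribution N(mu, sigma^2) for observation y:
    CRPS = sigma * ( z*(2*Phi(z) - 1) + 2*phi(z) - 1/sqrt(pi) ),
    with z = (y - mu)/sigma. Lower values indicate better forecasts."""
    z = (y - mu) / sigma
    return sigma * (z * (2 * stats.norm.cdf(z) - 1)
                    + 2 * stats.norm.pdf(z) - 1 / np.sqrt(np.pi))

# A sharper forecast scores better when the observation is near its mean.
print(crps_normal(0.3, mu=0.0, sigma=0.5), crps_normal(0.3, mu=0.0, sigma=2.0))
```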

  2. EFFECTS OF CORRELATED PROBABILISTIC EXPOSURE MODEL INPUTS ON SIMULATED RESULTS

    EPA Science Inventory

    In recent years, more probabilistic models have been developed to quantify aggregate human exposures to environmental pollutants. The impact of correlation among inputs in these models is an important issue, which has not been resolved. Obtaining correlated data and implementi...

  3. Approaches to implementing deterministic models in a probabilistic framework

    SciTech Connect

    Talbott, D.V.

    1995-04-01

    The increasing use of results from probabilistic risk assessments in the decision-making process makes it ever more important to eliminate simplifications in probabilistic models that might lead to conservative results. One area in which conservative simplifications are often made is modeling the physical interactions that occur during the progression of an accident sequence. This paper demonstrates and compares different approaches for incorporating deterministic models of physical parameters into probabilistic models; parameter range binning, response curves, and integral deterministic models. An example that combines all three approaches in a probabilistic model for the handling of an energetic material (i.e. high explosive, rocket propellant,...) is then presented using a directed graph model.

  4. PROBABILISTIC MODELING FOR ADVANCED HUMAN EXPOSURE ASSESSMENT

    EPA Science Inventory

    Human exposures to environmental pollutants widely vary depending on the emission patterns that result in microenvironmental pollutant concentrations, as well as behavioral factors that determine the extent of an individual's contact with these pollutants. Probabilistic human exp...

  5. A Probabilistic Model for Reducing Medication Errors

    PubMed Central

    Nguyen, Phung Anh; Syed-Abdul, Shabbir; Iqbal, Usman; Hsu, Min-Huei; Huang, Chen-Ling; Li, Hsien-Chang; Clinciu, Daniel Livius; Jian, Wen-Shan; Li, Yu-Chuan Jack

    2013-01-01

    Background: Medication errors are common, life-threatening, costly, but preventable. Information technology and automated systems are highly efficient for preventing medication errors and are therefore widely employed in hospital settings. The aim of this study was to construct a probabilistic model that can reduce medication errors by identifying uncommon or rare associations between medications and diseases. Methods and Findings: Association rule mining techniques were applied to 103.5 million prescriptions from Taiwan's National Health Insurance database. The dataset included 204.5 million diagnoses with ICD9-CM codes and 347.7 million medications coded with ATC codes. Disease-Medication (DM) and Medication-Medication (MM) associations were computed from their co-occurrence, and association strength was measured by the interestingness or lift values, referred to as Q values. The DMQs and MMQs were used to develop the AOP model to predict the appropriateness of a given prescription. Validation of this model was done by comparing the results of evaluation performed by the AOP model with verification by human experts. The results showed 96% accuracy for appropriate and 45% accuracy for inappropriate prescriptions, with a sensitivity and specificity of 75.9% and 89.5%, respectively. Conclusions: We successfully developed the AOP model as an efficient tool for automatic identification of uncommon or rare associations between disease and medication and between medications in prescriptions. The AOP model helps to reduce medication errors by alerting physicians, improving patient safety and the overall quality of care. PMID:24312659
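
    A minimal sketch of the lift (interestingness) measure referred to above as the Q value, computed from co-occurrence counts; the counts are hypothetical and the thresholds used in the AOP model are not reproduced.

```python
def lift(n_total, n_a, n_b, n_ab):
    """Lift (interestingness) of an association A -> B computed from
    co-occurrence counts: P(A and B) / (P(A) * P(B)).
    Values near 1 suggest independence; values well below 1 flag a rare
    (and possibly inappropriate) disease-medication pairing."""
    p_a, p_b, p_ab = n_a / n_total, n_b / n_total, n_ab / n_total
    return p_ab / (p_a * p_b)

# Hypothetical counts: 1,000,000 prescriptions, diagnosis A on 50,000 of
# them, drug B on 80,000, and 200 prescriptions containing both.
print(lift(1_000_000, 50_000, 80_000, 200))  # ~0.05: a rare combination
```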

  6. Probabilistic constitutive relationships for cyclic material strength models

    NASA Technical Reports Server (NTRS)

    Boyce, L.; Chamis, C. C.

    1988-01-01

    A methodology is developed that provides a probabilistic treatment for the lifetime of structural components of aerospace propulsion systems subjected to fatigue. Material strength degradation models, based on primitive variables, include both a fatigue strength reduction model and a fatigue crack growth model. Probabilistic analysis is based on simulation, and both maximum entropy and maximum penalized likelihood methods are used for the generation of probability density functions. The resulting constitutive relationships are included in several computer programs.

  7. A PROBABILISTIC MODELING FRAMEWORK FOR PREDICTING POPULATION EXPOSURES TO BENZENE

    EPA Science Inventory

    The US Environmental Protection Agency (EPA) is modifying their probabilistic Stochastic Human Exposure Dose Simulation (SHEDS) model to assess aggregate exposures to air toxics. Air toxics include urban Hazardous Air Pollutants (HAPS) such as benzene from mobile sources, part...

  8. Choice-Based Conjoint Analysis: Classification vs. Discrete Choice Models

    NASA Astrophysics Data System (ADS)

    Giesen, Joachim; Mueller, Klaus; Taneva, Bilyana; Zolliker, Peter

    Conjoint analysis is a family of techniques that originated in psychology and later became popular in market research. The main objective of conjoint analysis is to measure an individual's or a population's preferences on a class of options that can be described by parameters and their levels. We consider preference data obtained in choice-based conjoint analysis studies, where one observes test persons' choices on small subsets of the options. There are many ways to analyze choice-based conjoint analysis data. Here we discuss the intuition behind a classification based approach, and compare this approach to one based on statistical assumptions (discrete choice models) and to a regression approach. Our comparison on real and synthetic data indicates that the classification approach outperforms the discrete choice models.

  9. Bayesian non-parametrics and the probabilistic approach to modelling

    PubMed Central

    Ghahramani, Zoubin

    2013-01-01

    Modelling is fundamental to many fields of science and engineering. A model can be thought of as a representation of possible data one could predict from a system. The probabilistic approach to modelling uses probability theory to express all aspects of uncertainty in the model. The probabilistic approach is synonymous with Bayesian modelling, which simply uses the rules of probability theory in order to make predictions, compare alternative models, and learn model parameters and structure from data. This simple and elegant framework is most powerful when coupled with flexible probabilistic models. Flexibility is achieved through the use of Bayesian non-parametrics. This article provides an overview of probabilistic modelling and an accessible survey of some of the main tools in Bayesian non-parametrics. The survey covers the use of Bayesian non-parametrics for modelling unknown functions, density estimation, clustering, time-series modelling, and representing sparsity, hierarchies, and covariance structure. More specifically, it gives brief non-technical overviews of Gaussian processes, Dirichlet processes, infinite hidden Markov models, Indian buffet processes, Kingman’s coalescent, Dirichlet diffusion trees and Wishart processes. PMID:23277609
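
    One of the non-parametric building blocks surveyed above, the Dirichlet process, can be sketched via its stick-breaking construction; the truncation level and concentration parameter below are arbitrary choices for illustration.

      # Truncated stick-breaking draw of Dirichlet process mixture weights.
      import numpy as np

      def stick_breaking(alpha, truncation, rng):
          betas = rng.beta(1.0, alpha, size=truncation)        # Beta(1, alpha) sticks
          remaining = np.concatenate([[1.0], np.cumprod(1.0 - betas)[:-1]])
          return betas * remaining                             # mixture weights

      rng = np.random.default_rng(1)
      weights = stick_breaking(alpha=2.0, truncation=50, rng=rng)
      print(weights[:10], weights.sum())                       # total approaches 1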

  10. Modeling one-choice and two-choice driving tasks

    PubMed Central

    Ratcliff, Roger

    2015-01-01

    An experiment is presented in which subjects were tested on both one-choice and two-choice driving tasks and on non-driving versions of them. Diffusion models for one- and two-choice tasks were successful in extracting model-based measures from the response time and accuracy data. These include measures of the quality of the information from the stimuli that drove the decision process (drift rate in the model), the time taken up by processes outside the decision process and, for the two-choice model, the speed/accuracy decision criteria that subjects set. Drift rates were only marginally different between the driving and non-driving tasks, indicating that nearly the same information was used in the two kinds of tasks. The tasks differed in the time taken up by other processes, reflecting the difference between them in response processing demands. Drift rates were significantly correlated across the two two-choice tasks, showing that subjects who performed well on one task also performed well on the other task. Nondecision times were correlated across the two driving tasks, showing common abilities on motor processes across the two tasks. These results show the feasibility of using diffusion modeling to examine decision making in driving and so provide for a theoretical examination of factors that might impair driving, such as extreme aging, distraction, sleep deprivation, and so on. PMID:25944448
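
    For readers unfamiliar with the diffusion models used here, the sketch below simulates a bare-bones two-choice diffusion process (noisy evidence drifting toward one of two boundaries, plus a non-decision time); the parameter values are arbitrary and not the fitted values from the experiment.

      # Simulate choices and RTs from a simple two-choice drift-diffusion process.
      import numpy as np

      def simulate_ddm(v, a, z, ter, dt=0.001, sigma=1.0, rng=None):
          rng = rng or np.random.default_rng()
          x, t = z * a, 0.0                                  # start point, decision time
          while 0.0 < x < a:                                 # accumulate until a boundary is hit
              x += v * dt + sigma * np.sqrt(dt) * rng.standard_normal()
              t += dt
          return (1 if x >= a else 0), t + ter               # choice, RT in seconds

      rng = np.random.default_rng(2)
      trials = [simulate_ddm(v=1.5, a=1.0, z=0.5, ter=0.3, rng=rng) for _ in range(500)]
      choices, rts = zip(*trials)
      print("P(upper boundary) =", np.mean(choices), " mean RT =", round(np.mean(rts), 3))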

  11. Exploring Term Dependences in Probabilistic Information Retrieval Model.

    ERIC Educational Resources Information Center

    Cho, Bong-Hyun; Lee, Changki; Lee, Gary Geunbae

    2003-01-01

    Describes a theoretic process to apply Bahadur-Lazarsfeld expansion (BLE) to general probabilistic models and the state-of-the-art 2-Poisson model. Through experiments on two standard document collections, one in Korean and one in English, it is demonstrated that incorporation of term dependences using BLE significantly contributes to performance…

  12. What do we gain with Probabilistic Flood Loss Models?

    NASA Astrophysics Data System (ADS)

    Schroeter, K.; Kreibich, H.; Vogel, K.; Merz, B.; Lüdtke, S.

    2015-12-01

    The reliability of flood loss models is a prerequisite for their practical usefulness. Traditional univariate damage models, such as depth-damage curves, often fail to reproduce the variability of observed flood damage. Innovative multivariate probabilistic modelling approaches are promising to capture and quantify the uncertainty involved and thus to improve the basis for decision making. In this study we compare the predictive capability of two probabilistic modelling approaches, namely Bagging Decision Trees and Bayesian Networks, and traditional stage-damage functions which are cast in a probabilistic framework. For model evaluation we use empirical damage data from computer-aided telephone interviews that were compiled after the floods of 2002, 2005, 2006 and 2013 in the Elbe and Danube catchments in Germany. We carry out a split-sample test by sub-setting the damage records. One sub-set is used to derive the models and the remaining records are used to evaluate the predictive performance of the model. Further, we stratify the sample according to catchments, which allows studying model performance in a spatial transfer context. Flood damage estimation is carried out on the scale of individual buildings in terms of relative damage. The predictive performance of the models is assessed in terms of systematic deviations (mean bias), precision (mean absolute error) and reliability, which is represented by the proportion of observations that fall within the 5- to 95-quantile predictive interval. The reliability of the probabilistic predictions within validation runs decreases only slightly and achieves a very good coverage of observations within the predictive interval. Probabilistic models provide quantitative information about prediction uncertainty, which is crucial to assess the reliability of model predictions and improves the usefulness of model results.
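
    A minimal version of the bagged-tree loss model and the reliability (interval coverage) score described above might look like the sketch below; the predictor variables and synthetic damage data are placeholders, not the survey data used in the study.

      # Bagged decision trees with an empirical 5-95% predictive interval and its coverage.
      import numpy as np
      from sklearn.ensemble import BaggingRegressor
      from sklearn.tree import DecisionTreeRegressor
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(3)
      X = rng.uniform(0, 1, size=(1000, 3))                  # e.g. water depth, duration, precaution
      rel_loss = np.clip(0.5 * X[:, 0] + 0.2 * X[:, 1] + rng.normal(0, 0.1, 1000), 0, 1)

      X_tr, X_te, y_tr, y_te = train_test_split(X, rel_loss, random_state=0)
      model = BaggingRegressor(DecisionTreeRegressor(), n_estimators=200, random_state=0)
      model.fit(X_tr, y_tr)

      # per-tree predictions give an empirical predictive distribution per building
      preds = np.stack([tree.predict(X_te) for tree in model.estimators_])
      lo, hi = np.quantile(preds, [0.05, 0.95], axis=0)
      coverage = np.mean((y_te >= lo) & (y_te <= hi))        # reliability as interval coverage
      print("bias:", np.mean(preds.mean(0) - y_te),
            "MAE:", np.mean(np.abs(preds.mean(0) - y_te)),
            "5-95% coverage:", coverage)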

  13. Modelling default and likelihood reasoning as probabilistic reasoning

    NASA Technical Reports Server (NTRS)

    Buntine, Wray

    1990-01-01

    A probabilistic analysis of plausible reasoning about defaults and about likelihood is presented. Likely and by default are in fact treated as duals in the same sense as possibility and necessity. To model these four forms probabilistically, a qualitative default probabilistic (QDP) logic and its quantitative counterpart DP are derived that allow qualitative and corresponding quantitative reasoning. Consistency and consequent results for subsets of the logics are given that require at most a quadratic number of satisfiability tests in the underlying propositional logic. The quantitative logic shows how to track the propagation error inherent in these reasoning forms. The methodology and sound framework of the system highlights their approximate nature, the dualities, and the need for complementary reasoning about relevance.

  14. Probabilistic Usage of the Multi-Factor Interaction Model

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2008-01-01

    A Multi-Factor Interaction Model (MFIM) is used to predict the insulating foam mass expulsion during the ascent of a space vehicle. The exponents in the MFIM are evaluated by an available approach which consists of least squares and an optimization algorithm. These results were subsequently used to probabilistically evaluate the effects of the uncertainties in each participating factor on the mass expulsion. The probabilistic results show that the surface temperature dominates at high probabilities and the pressure which causes the mass expulsion dominates at low probabilities.

  15. Efficient Methods for Unsupervised Learning of Probabilistic Models

    NASA Astrophysics Data System (ADS)

    Sohl-Dickstein, Jascha

    High dimensional probabilistic models are used for many modern scientific and engineering data analysis tasks. Interpreting neural spike trains, compressing video, identifying features in DNA microarrays, and recognizing particles in high energy physics all rely upon the ability to find and model complex structure in a high dimensional space. Despite their great promise, high dimensional probabilistic models are frequently computationally intractable to work with in practice. In this thesis I develop solutions to overcome this intractability, primarily in the context of energy-based models. A common cause of intractability is that model distributions cannot be analytically normalized. Probabilities can only be computed up to a constant, making training exceedingly difficult. To solve this problem I propose 'minimum probability flow learning', a variational technique for parameter estimation in such models. The utility of this training technique is demonstrated in the case of an Ising model, a Hopfield auto-associative memory, an independent component analysis model of natural images, and a deep belief network. A second common difficulty in training probabilistic models arises when the parameter space is ill-conditioned. This makes gradient descent optimization slow and impractical, but can be alleviated using the natural gradient. I show here that the natural gradient can be related to signal whitening, and provide specific prescriptions for applying it to learning problems. It is also difficult to evaluate the performance of models that cannot be analytically normalized, providing a particular challenge to hypothesis testing and model comparison. To overcome this, I introduce a method termed 'Hamiltonian annealed importance sampling,' which more efficiently estimates the normalization constant of non-analytically-normalizable models. This method is then used to calculate and compare the log likelihoods of several state-of-the-art probabilistic models of natural images.

  16. ISSUES ASSOCIATED WITH PROBABILISTIC FAILURE MODELING OF DIGITAL SYSTEMS

    SciTech Connect

    CHU,T.L.; MARTINEZ-GURIDI,G.; LEHNER,J.; OVERLAND,D.

    2004-09-19

    The current U.S. Nuclear Regulatory Commission (NRC) licensing process of instrumentation and control (I&C) systems is based on deterministic requirements, e.g., single failure criteria, and defense in depth and diversity. Probabilistic considerations can be used as supplements to the deterministic process. The National Research Council has recommended development of methods for estimating failure probabilities of digital systems, including commercial off-the-shelf (COTS) equipment, for use in probabilistic risk assessment (PRA). NRC staff has developed informal qualitative and quantitative requirements for PRA modeling of digital systems. Brookhaven National Laboratory (BNL) has performed a review of the-state-of-the-art of the methods and tools that can potentially be used to model digital systems. The objectives of this paper are to summarize the review, discuss the issues associated with probabilistic modeling of digital systems, and identify potential areas of research that would enhance the state of the art toward a satisfactory modeling method that could be integrated with a typical probabilistic risk assessment.

  17. Multivariate probabilistic projections using imperfect climate models part I: outline of methodology

    NASA Astrophysics Data System (ADS)

    Sexton, David M. H.; Murphy, James M.; Collins, Mat; Webb, Mark J.

    2012-06-01

    We demonstrate a method for making probabilistic projections of climate change at global and regional scales, using examples consisting of the equilibrium response to doubled CO2 concentrations of global annual mean temperature and regional climate changes in summer and winter temperature and precipitation over Northern Europe and England-Wales. This method combines information from a perturbed physics ensemble, a set of international climate models, and observations. Our approach is based on a multivariate Bayesian framework which enables the prediction of a joint probability distribution for several variables constrained by more than one observational metric. This is important if different sets of impacts scientists are to use these probabilistic projections to make coherent forecasts for the impacts of climate change, by inputting several uncertain climate variables into their impacts models. Unlike a single metric, multiple metrics reduce the risk of rewarding a model variant which scores well due to a fortuitous compensation of errors rather than because it is providing a realistic simulation of the observed quantity. We provide some physical interpretation of how the key metrics constrain our probabilistic projections. The method also has a quantity, called discrepancy, which represents the degree of imperfection in the climate model i.e. it measures the extent to which missing processes, choices of parameterisation schemes and approximations in the climate model affect our ability to use outputs from climate models to make inferences about the real system. Other studies have, sometimes without realising it, treated the climate model as if it had no model error. We show that omission of discrepancy increases the risk of making over-confident predictions. Discrepancy also provides a transparent way of incorporating improvements in subsequent generations of climate models into probabilistic assessments. The set of international climate models is used to derive

  18. A probabilistic growth model for partition polygons and related structures

    NASA Astrophysics Data System (ADS)

    Kearney, Michael J.

    2004-03-01

    A two-parameter, probabilistic growth model for partition polygon clusters is introduced and exact results obtained relating to the area moments and the area probability distribution. In particular, the scaling behaviour in the presence of asymmetry between growth along the two principal axes is discussed. Variants of the model are also examined, including the extension to rooted stack polygons. An interesting application relates to characterizing the asymptotic behaviour of the cumulative customer waiting time distribution in a particular discrete-time queue.

  19. Probabilistic Graphical Model Representation in Phylogenetics

    PubMed Central

    Höhna, Sebastian; Heath, Tracy A.; Boussau, Bastien; Landis, Michael J.; Ronquist, Fredrik; Huelsenbeck, John P.

    2014-01-01

    Recent years have seen a rapid expansion of the model space explored in statistical phylogenetics, emphasizing the need for new approaches to statistical model representation and software development. Clear communication and representation of the chosen model is crucial for: (i) reproducibility of an analysis, (ii) model development, and (iii) software design. Moreover, a unified, clear and understandable framework for model representation lowers the barrier for beginners and nonspecialists to grasp complex phylogenetic models, including their assumptions and parameter/variable dependencies. Graphical modeling is a unifying framework that has gained in popularity in the statistical literature in recent years. The core idea is to break complex models into conditionally independent distributions. The strength lies in the comprehensibility, flexibility, and adaptability of this formalism, and the large body of computational work based on it. Graphical models are well-suited to teach statistical models, to facilitate communication among phylogeneticists and in the development of generic software for simulation and statistical inference. Here, we provide an introduction to graphical models for phylogeneticists and extend the standard graphical model representation to the realm of phylogenetics. We introduce a new graphical model component, tree plates, to capture the changing structure of the subgraph corresponding to a phylogenetic tree. We describe a range of phylogenetic models using the graphical model framework and introduce modules to simplify the representation of standard components in large and complex models. Phylogenetic model graphs can be readily used in simulation, maximum likelihood inference, and Bayesian inference using, for example, Metropolis–Hastings or Gibbs sampling of the posterior distribution. [Computation; graphical models; inference; modularization; statistical phylogenetics; tree plate.] PMID:24951559

  20. A simple probabilistic model of ideal gases

    NASA Astrophysics Data System (ADS)

    Sossinsky, A. B.

    2016-01-01

    We describe a discrete 3D model of ideal gas based on the idea that, on the microscopic level, the particles move randomly (as in ASEP models), instead of obeying Newton's laws as prescribed by Boltzmann.

  1. Probabilistic grammatical model for helix‐helix contact site classification

    PubMed Central

    2013-01-01

    Background Hidden Markov Models power many state-of-the-art tools in the field of protein bioinformatics. While excelling in their tasks, these methods of protein analysis do not directly convey information on medium- and long-range residue-residue interactions. This requires an expressive power of at least context-free grammars. However, application of more powerful grammar formalisms to protein analysis has been surprisingly limited. Results In this work, we present a probabilistic grammatical framework for problem-specific protein languages and apply it to the classification of transmembrane helix-helix pair configurations. The core of the model consists of a probabilistic context-free grammar, automatically inferred by a genetic algorithm from only a generic set of expert-based rules and positive training samples. The model was applied to produce sequence-based descriptors of four classes of transmembrane helix-helix contact site configurations. The highest performance of the classifiers reached an AUC ROC of 0.70. The analysis of grammar parse trees revealed their ability to represent structural features of helix-helix contact sites. Conclusions We demonstrated that our probabilistic context-free framework for the analysis of protein sequences outperforms the state of the art in the task of helix-helix contact site classification, and this is achieved without necessarily modeling long-range dependencies between interacting residues. A significant feature of our approach is that grammar rules and parse trees are human-readable; thus, they could provide biologically meaningful information for molecular biologists. PMID:24350601

  2. Linear Deterministic Accumulator Models of Simple Choice

    PubMed Central

    Heathcote, Andrew; Love, Jonathon

    2012-01-01

    We examine theories of simple choice as a race among evidence accumulation processes. We focus on the class of deterministic race models, which assume that the effects of fluctuations in the parameters of the accumulation processes between choice trials (between-choice noise) dominate the effects of fluctuations occurring while making a choice (within-choice noise) in behavioral data (i.e., response times and choices). The latter deterministic approximation, when combined with the assumption that accumulation is linear, leads to a class of models that can be readily applied to simple-choice behavior because they are computationally tractable. We develop a new and mathematically simple exemplar within the class of linear deterministic models, the Lognormal race (LNR). We then examine how the LNR, and another widely applied linear deterministic model, Brown and Heathcote’s (2008) LBA, account for a range of benchmark simple-choice effects in lexical-decision task data reported by Wagenmakers et al. (2008). Our results indicate that the LNR provides an accurate description of these data. Although the LBA model provides a slightly better account, both models support similar psychological conclusions. PMID:22936920
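
    A hedged sketch of the Lognormal race idea: each response's accumulator has a lognormally distributed finishing time, the fastest accumulator determines the choice, and RT is its finishing time plus a non-decision offset. The parameter values below are arbitrary illustrations, not estimates from the lexical-decision data.

      # Simulate a two-accumulator Lognormal race (LNR).
      import numpy as np

      rng = np.random.default_rng(4)
      mu = np.array([-0.4, 0.0])          # log-scale means, one per response accumulator
      sigma = np.array([0.5, 0.5])        # log-scale standard deviations
      t0 = 0.2                            # non-decision time (s)

      finish = rng.lognormal(mean=mu, sigma=sigma, size=(10_000, 2))
      choice = finish.argmin(axis=1)      # the fastest accumulator wins the race
      rt = finish.min(axis=1) + t0
      print("P(response 0) =", np.mean(choice == 0), " mean RT =", round(rt.mean(), 3))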

  3. Probabilistic graphic models applied to identification of diseases.

    PubMed

    Sato, Renato Cesar; Sato, Graziela Tiemy Kajita

    2015-01-01

    Decision-making is fundamental when making a diagnosis or choosing a treatment. The broad dissemination of computerized systems and databases allows systematization of part of these decisions through artificial intelligence. In this text, we present the basic use of probabilistic graphic models as tools to analyze causality in health conditions. This method has been used to make diagnoses of Alzheimer's disease, sleep apnea and heart diseases.

  4. Probabilistic graphic models applied to identification of diseases

    PubMed Central

    Sato, Renato Cesar; Sato, Graziela Tiemy Kajita

    2015-01-01

    Decision-making is fundamental when making a diagnosis or choosing a treatment. The broad dissemination of computerized systems and databases allows systematization of part of these decisions through artificial intelligence. In this text, we present the basic use of probabilistic graphic models as tools to analyze causality in health conditions. This method has been used to make diagnoses of Alzheimer's disease, sleep apnea and heart diseases. PMID:26154555

  5. Nonlinear probabilistic finite element models of laminated composite shells

    NASA Technical Reports Server (NTRS)

    Engelstad, S. P.; Reddy, J. N.

    1993-01-01

    A probabilistic finite element analysis procedure for laminated composite shells has been developed. A total Lagrangian finite element formulation, employing a degenerated 3-D laminated composite shell with the full Green-Lagrange strains and first-order shear deformable kinematics, forms the modeling foundation. The first-order second-moment technique for probabilistic finite element analysis of random fields is employed and results are presented in the form of mean and variance of the structural response. The effects of material nonlinearity are included through the use of a rate-independent anisotropic plasticity formulation from a macroscopic point of view. Both ply-level and micromechanics-level random variables can be selected, the latter by means of the Aboudi micromechanics model. A number of sample problems are solved to verify the accuracy of the procedures developed and to quantify the variability of certain material type/structure combinations. Experimental data are compared in many cases, and the Monte Carlo simulation method is used to check the probabilistic results. In general, the procedure is quite effective in modeling the mean and variance response of the linear and nonlinear behavior of laminated composite shells.
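
    The first-order second-moment technique used in the analysis can be illustrated with a toy response function: propagate input means and covariances through the gradient of the response at the mean point. The function and variable values below are stand-ins, not the laminated-shell model.

      # First-order second-moment (FOSM) propagation through a hypothetical response g(E, t).
      import numpy as np

      def response(x):                         # hypothetical structural response
          E, t = x
          return 1.0e3 / (E * t**3)            # a deflection-like quantity

      mean_x = np.array([70.0e3, 2.0])         # means of the random inputs (E, t)
      cov_x = np.diag([(0.05 * 70.0e3) ** 2,   # 5% coefficient of variation on E
                       (0.02 * 2.0) ** 2])     # 2% coefficient of variation on t

      grad = np.zeros(2)                       # finite-difference gradient at the mean
      for i in range(2):
          step = 1e-6 * mean_x[i]
          xp = mean_x.copy()
          xp[i] += step
          grad[i] = (response(xp) - response(mean_x)) / step

      mean_g = response(mean_x)                # first-order estimate of the mean response
      var_g = grad @ cov_x @ grad              # first-order estimate of the response variance
      print("mean:", mean_g, "std:", np.sqrt(var_g))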

  6. Economic Dispatch for Microgrid Containing Electric Vehicles via Probabilistic Modelling

    SciTech Connect

    Yao, Yin; Gao, Wenzhong; Momoh, James; Muljadi, Eduard

    2015-10-06

    In this paper, an economic dispatch model with probabilistic modeling is developed for a microgrid. The electric power supply in a microgrid consists of conventional power plants and renewable energy power plants, such as wind and solar plants. Due to the fluctuation of solar and wind plant output, an empirical probabilistic model is developed to predict their hourly output. According to the different characteristics of wind and solar plants, the parameters of the probability distributions are further adjusted individually for both plant types. On the other hand, with the growing adoption of Plug-in Electric Vehicles (PHEVs), an integrated microgrid system must also consider their impact. Not only the charging loads from PHEVs, but also the discharging output via the Vehicle-to-Grid (V2G) method can greatly affect the economic dispatch of all the micro energy sources in the microgrid. This paper presents an optimization method for economic dispatch in a microgrid considering conventional power plants, renewable power plants, and PHEVs. The simulation results reveal that PHEVs with V2G capability can be an indispensable supplement in a modern microgrid.

  7. Probabilistic prediction models for aggregate quarry siting

    USGS Publications Warehouse

    Robinson, G.R.; Larkins, P.M.

    2007-01-01

    Weights-of-evidence (WofE) and logistic regression techniques were used in a GIS framework to predict the spatial likelihood (prospectivity) of crushed-stone aggregate quarry development. The joint conditional probability models, based on geology, transportation network, and population density variables, were defined using quarry location and time of development data for the New England States, North Carolina, and South Carolina, USA. The Quarry Operation models describe the distribution of active aggregate quarries, independent of the date of opening. The New Quarry models describe the distribution of aggregate quarries when they open. Because of the small number of new quarries developed in the study areas during the last decade, independent New Quarry models have low parameter estimate reliability. The performance of parameter estimates derived for Quarry Operation models, defined by a larger number of active quarries in the study areas, was tested and evaluated to predict the spatial likelihood of new quarry development. Population density conditions at the time of new quarry development were used to modify the population density variable in the Quarry Operation models to apply to new quarry development sites. The Quarry Operation parameters derived for the New England study area, Carolina study area, and the combined New England and Carolina study areas were all similar in magnitude and relative strength. The Quarry Operation model parameters, using the modified population density variables, were found to be a good predictor of new quarry locations. Both the aggregate industry and the land management community can use the model approach to target areas for more detailed site evaluation for quarry location. The models can be revised easily to reflect actual or anticipated changes in transportation and population features. © International Association for Mathematical Geology 2007.
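
    A stripped-down sketch of the logistic-regression half of the prospectivity modeling: estimate the probability of quarry presence per grid cell from evidence layers. The layer definitions, coefficients and data below are synthetic placeholders for the geology, transportation and population variables used in the study.

      # Logistic regression on synthetic evidence layers to produce per-cell prospectivity.
      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(5)
      n = 2000
      X = np.column_stack([
          rng.integers(0, 2, n),        # favourable rock unit present (0/1)
          rng.exponential(5.0, n),      # distance to nearest road (km)
          rng.lognormal(3.0, 1.0, n),   # population density (persons/km^2)
      ])
      logit = 1.5 * X[:, 0] - 0.3 * X[:, 1] + 0.002 * X[:, 2] - 1.0
      y = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))      # synthetic quarry cells

      model = LogisticRegression().fit(X, y)
      prospectivity = model.predict_proba(X)[:, 1]           # spatial likelihood per cell
      print(model.coef_, prospectivity[:5])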

  8. Recent advances and applications of probabilistic topic models

    NASA Astrophysics Data System (ADS)

    Wood, Ian

    2014-12-01

    I present here an overview of recent advances in probabilistic topic modelling and related Bayesian graphical models as well as some of their more atypical applications outside of their home: text analysis. These techniques allow the modelling of high dimensional count vectors with strong correlations. With such data, simply calculating a correlation matrix is infeasible. Probabilistic topic models address this using mixtures of multinomials estimated via Bayesian inference with Dirichlet priors. The use of conjugate priors allows for efficient inference, and these techniques scale well to data sets with many millions of vectors. The first of these techniques to attract significant attention was Latent Dirichlet Allocation (LDA) [1, 2]. Numerous extensions and adaptations of LDA have been proposed: non-parametric models; assorted models incorporating authors, sentiment and other features; models regularised through the use of extra metadata or extra priors on topic structure, and many more [3]. They have become widely used in the text analysis and population genetics communities, with a number of compelling applications. These techniques are not restricted to text analysis, however, and can be applied to other types of data which can be sensibly discretised and represented as counts of labels/properties/etc. LDA and its variants have been used to find patterns in data from diverse areas of inquiry, including genetics, plant physiology, image analysis, social network analysis, remote sensing and astrophysics. Nonetheless, it is relatively recently that probabilistic topic models have found applications outside of text analysis, and to date few such applications have been considered. I suggest that there is substantial untapped potential for topic models and models inspired by or incorporating topic models to be fruitfully applied, and outline the characteristics of systems and data for which this may be the case.
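
    A minimal illustration of the LDA machinery described above, using scikit-learn on a toy four-document corpus; any real application (text or other discretised count data) would of course use far larger corpora and tuned priors.

      # Fit a two-topic LDA model to a toy corpus and print the top terms per topic.
      from sklearn.feature_extraction.text import CountVectorizer
      from sklearn.decomposition import LatentDirichletAllocation

      docs = ["galaxy star redshift survey", "gene expression cell protein",
              "star cluster luminosity galaxy", "protein binding gene sequence"]
      vectorizer = CountVectorizer().fit(docs)
      X = vectorizer.transform(docs)                    # document-term count matrix

      lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)
      terms = vectorizer.get_feature_names_out()
      for k, topic in enumerate(lda.components_):
          top = topic.argsort()[::-1][:4]
          print(f"topic {k}:", [terms[i] for i in top])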

  9. Probabilistic model for immiscible separations and extractions (ProMISE).

    PubMed

    de Folter, Joost; Sutherland, Ian A

    2011-09-01

    Chromatography models, liquid-liquid models and specifically Counter-Current Chromatography (CCC) models are usually either iterative or provide a final solution for peak elution. This paper describes a better model obtained by finding a more elemental solution. A completely new model has been developed based on simulating probabilistic units. This model has been labelled ProMISE (probabilistic model for immiscible phase separations and extractions), and has been realised in the form of a computer application, interactively visualising the behaviour of the units in the CCC process. It does not use compartments or cells as in the Craig-based models, nor is it based on diffusion theory. With this new model, all the CCC flow modes can be accurately predicted. The main advantage over the previously developed model is that it does not require a somewhat arbitrary number of steps or theoretical plates, and instead uses an efficiency factor. Furthermore, since this model is not based on compartments or cells like the Craig model, and is therefore not limited to a compartment or cell nature, it allows even greater flexibility. PMID:21211802

  10. Integrating Activity Patterns into Destination Choice Models.

    ERIC Educational Resources Information Center

    Fesenmaier, Daniel

    1988-01-01

    Factors affecting decision-making on where to go for recreation, specifically state park choice, were analyzed in a study based on data, collected via a telephone survey, from 452 Oklahoma households. The relative accuracy of various models for predicting individual destination choices were also examined. (IAH)

  11. Analytical probabilistic modeling for radiation therapy treatment planning

    NASA Astrophysics Data System (ADS)

    Bangert, Mark; Hennig, Philipp; Oelfke, Uwe

    2013-08-01

    This paper introduces the concept of analytical probabilistic modeling (APM) to quantify uncertainties in quality indicators of radiation therapy treatment plans. Assuming Gaussian probability densities over the input parameters of the treatment plan quality indicators, APM enables the calculation of the moments of the induced probability density over the treatment plan quality indicators by analytical integration. This paper focuses on analytical probabilistic dose calculation algorithms and the implications of APM regarding treatment planning. We derive closed-form expressions for the expectation value and the (co)variance of (1) intensity-modulated photon and proton dose distributions based on a pencil beam algorithm and (2) the standard quadratic objective function used in inverse planning. Complex correlation models of high dimensional uncertain input parameters and the different nature of random and systematic uncertainties in fractionated radiation therapy are explicitly incorporated into APM. APM variance calculations on phantom data sets show that the correlation assumptions and the difference of random and systematic uncertainties of the input parameters have a crucial impact on the uncertainty of the resulting dose. The derivations regarding the quadratic objective function show that APM has the potential to enable robust planning at almost the same computational cost as conventional inverse planning after a single probabilistic dose calculation. Beneficial applications of APM in the context of radiation therapy treatment planning are feasible.
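
    The flavor of analytical moment propagation can be conveyed with the linear-Gaussian special case: if the dose is linear in Gaussian input parameters, its mean and covariance follow in closed form. The dose-influence matrix below is random and merely stands in for a pencil-beam calculation; the paper's actual derivations cover the full probabilistic dose algorithms.

      # Closed-form mean and covariance of a dose vector d = D w with Gaussian w.
      import numpy as np

      rng = np.random.default_rng(6)
      D = rng.random((50, 5))                   # dose-influence matrix (voxels x parameters)
      mu_w = np.full(5, 1.0)                    # mean of the uncertain input parameters
      Sigma_w = 0.01 * np.eye(5)                # their covariance (Gaussian assumption)

      mean_dose = D @ mu_w                      # E[d] = D mu
      cov_dose = D @ Sigma_w @ D.T              # Cov[d] = D Sigma D^T
      print(mean_dose[:3], np.sqrt(np.diag(cov_dose))[:3])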

  12. A Probabilistic Model of Cross-Categorization

    ERIC Educational Resources Information Center

    Shafto, Patrick; Kemp, Charles; Mansinghka, Vikash; Tenenbaum, Joshua B.

    2011-01-01

    Most natural domains can be represented in multiple ways: we can categorize foods in terms of their nutritional content or social role, animals in terms of their taxonomic groupings or their ecological niches, and musical instruments in terms of their taxonomic categories or social uses. Previous approaches to modeling human categorization have…

  13. A probabilistic model for the establishment of neuron polarity.

    PubMed

    Khanin, K; Khanin, R

    2001-01-01

    The main aim of this paper is to present a simple probabilistic model for the early stage of neuron growth: the specification of an axon out of several initially similar neurites. The model is a Markov process with competition between the growing neurites, wherein longer objects have more chances to grow, and the parameter alpha determines the intensity of the competition. For alpha > 1 the model provides results which are qualitatively similar to the experimental ones, i.e., the selection of one rapidly elongating axon out of several neurites, while other less successful neurites stop growing at some random time. Rigorous mathematical proofs are given.
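
    A direct simulation of the competition rule is easy to sketch: at each step one neurite elongates, chosen with probability proportional to its length raised to alpha. The sketch below keeps only this reinforcement mechanism (it omits the random stopping of unsuccessful neurites), with arbitrary parameter values.

      # Length^alpha competition among neurites; for alpha > 1 one neurite runs away.
      import numpy as np

      def grow(n_neurites=5, alpha=1.5, steps=5000, rng=None):
          rng = rng or np.random.default_rng()
          lengths = np.ones(n_neurites)
          for _ in range(steps):
              p = lengths ** alpha
              winner = rng.choice(n_neurites, p=p / p.sum())   # reinforcement step
              lengths[winner] += 1.0
          return lengths

      print(grow(rng=np.random.default_rng(7)))   # one neurite dominates, the "axon"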

  14. Probabilistic Modeling of Settlement Risk at Land Disposal Facilities - 12304

    SciTech Connect

    Foye, Kevin C.; Soong, Te-Yang

    2012-07-01

    The long-term reliability of land disposal facility final cover systems - and therefore the overall waste containment - depends on the distortions imposed on these systems by differential settlement/subsidence. The evaluation of differential settlement is challenging because of the heterogeneity of the waste mass (caused by inconsistent compaction, void space distribution, debris-soil mix ratio, waste material stiffness, time-dependent primary compression of the fine-grained soil matrix, long-term creep settlement of the soil matrix and the debris, etc.) at most land disposal facilities. Deterministic approaches to long-term final cover settlement prediction are not able to capture the spatial variability in the waste mass and sub-grade properties which control differential settlement. An alternative, probabilistic solution is to use random fields to model the waste and sub-grade properties. The modeling effort informs the design, construction, operation, and maintenance of land disposal facilities. A probabilistic method to establish design criteria for waste placement and compaction is introduced using the model. Random fields are ideally suited to problems of differential settlement modeling of highly heterogeneous foundations, such as waste. Random fields model the seemingly random spatial distribution of a design parameter, such as compressibility. When used for design, the use of these models prompts the need for probabilistic design criteria. It also allows for a statistical approach to waste placement acceptance criteria. An example design evaluation was performed, illustrating the use of the probabilistic differential settlement simulation methodology to assemble a design guidance chart. The purpose of this design evaluation is to enable the designer to select optimal initial combinations of design slopes and quality control acceptance criteria that yield an acceptable proportion of post-settlement slopes meeting some design minimum. For this specific

  15. Probabilistic Modeling of the Renal Stone Formation Module

    NASA Technical Reports Server (NTRS)

    Best, Lauren M.; Myers, Jerry G.; Goodenow, Debra A.; McRae, Michael P.; Jackson, Travis C.

    2013-01-01

    The Integrated Medical Model (IMM) is a probabilistic tool, used in mission planning decision making and medical systems risk assessments. The IMM project maintains a database of over 80 medical conditions that could occur during a spaceflight, documenting an incidence rate and end case scenarios for each. In some cases, where observational data are insufficient to adequately define the inflight medical risk, the IMM utilizes external probabilistic modules to model and estimate the event likelihoods. One such medical event of interest is an unpassed renal stone. Due to a high salt diet and high concentrations of calcium in the blood (due to bone depletion caused by unloading in the microgravity environment) astronauts are at a considerable elevated risk for developing renal calculi (nephrolithiasis) while in space. Lack of observed incidences of nephrolithiasis has led HRP to initiate the development of the Renal Stone Formation Module (RSFM) to create a probabilistic simulator capable of estimating the likelihood of symptomatic renal stone presentation in astronauts on exploration missions. The model consists of two major parts. The first is the probabilistic component, which utilizes probability distributions to assess the range of urine electrolyte parameters and a multivariate regression to transform estimated crystal density and size distributions to the likelihood of the presentation of nephrolithiasis symptoms. The second is a deterministic physical and chemical model of renal stone growth in the kidney developed by Kassemi et al. The probabilistic component of the renal stone model couples the input probability distributions describing the urine chemistry, astronaut physiology, and system parameters with the physical and chemical outputs and inputs to the deterministic stone growth model. These two parts of the model are necessary to capture the uncertainty in the likelihood estimate. The model will be driven by Monte Carlo simulations, continuously

  16. A probabilistic gastrointestinal tract dosimetry model

    NASA Astrophysics Data System (ADS)

    Huh, Chulhaeng

    In internal dosimetry, the tissues of the gastrointestinal (GI) tract represent, along with the hematopoietic bone marrow, some of the most radiosensitive organs of the body. Endoscopic ultrasound is a unique tool to acquire in-vivo data on GI tract wall thicknesses of sufficient resolution needed in radiation dosimetry studies. Through their different echo texture and intensity, five layers of differing echo patterns, for superficial mucosa, deep mucosa, submucosa, muscularis propria and serosa, exist within the walls of organs composing the alimentary tract. Thicknesses for stomach mucosa ranged from 620 ± 150 μm to 1320 ± 80 μm (total stomach wall thicknesses from 2.56 ± 0.12 to 4.12 ± 0.11 mm). Measurements made for the rectal images revealed rectal mucosal thicknesses from 150 ± 90 μm to 670 ± 110 μm (total rectal wall thicknesses from 2.01 ± 0.06 to 3.35 ± 0.46 mm). The mucosa thus accounted for 28 ± 3% and 16 ± 6% of the total thickness of the stomach and rectal wall, respectively. Radiation transport simulations were then performed using the Monte Carlo N-Particle (MCNP) 4C transport code to calculate S values (Gy/Bq-s) for penetrating and nonpenetrating radiations such as photons, beta particles, conversion electrons and Auger electrons of selected nuclides (I-123, I-131, Tc-99m and Y-90) under two source conditions: content and mucosa sources, respectively. The results of this study demonstrate generally good agreement with published data for the stomach mucosa wall. The rectal mucosa data are consistently higher than the published data for the large intestine due to different radiosensitive cell thicknesses (350 μm vs. a range spanning from 149 μm to 729 μm) and a different geometry when a rectal content source is considered. Generally, the ICRP models have been designed to predict the amount of radiation dose in the human body from a "typical" or "reference" individual in a given population. The study has been performed to

  17. Probabilistic Life Cycle Cost Model for Repairable System

    NASA Astrophysics Data System (ADS)

    Nasir, Meseret; Chong, H. Y.; Osman, Sabtuni

    2015-04-01

    Traditionally, life cycle cost (LCC) has been predicted with a deterministic approach; however, this method cannot account for uncertainties in the input variables. In this paper, a probabilistic approach using an adaptive network-based fuzzy inference system (ANFIS) is proposed to estimate the LCC of repairable systems. The developed model can handle the uncertainties of input variables in the estimation of LCC. The numerical analysis shows that the acquisition and downtime costs could have a stronger effect on the LCC than the repair cost. The developed model can also provide more precise quantitative information for the decision-making process.

  18. Probabilistic updating of building models using incomplete modal data

    NASA Astrophysics Data System (ADS)

    Sun, Hao; Büyüköztürk, Oral

    2016-06-01

    This paper investigates a new probabilistic strategy for Bayesian model updating using incomplete modal data. Direct mode matching between the measured and the predicted modal quantities is not required in the updating process, which is realized through model reduction. A Markov chain Monte Carlo technique with adaptive random-walk steps is proposed to draw the samples for model parameter uncertainty quantification. The iterated improved reduced system technique is employed to update the prediction error as well as to calculate the likelihood function in the sampling process. Since modal quantities are used in the model updating, modal identification is first carried out to extract the natural frequencies and mode shapes through the acceleration measurements of the structural system. The proposed algorithm is finally validated by both numerical and experimental examples: a 10-storey building with synthetic data and an 8-storey building with shaking table test data. Results illustrate that the proposed algorithm is effective and robust for parameter uncertainty quantification in probabilistic model updating of buildings.
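
    The sampling step can be illustrated with a stripped-down random-walk Metropolis scheme targeting a posterior over a single stiffness-like parameter given "measured" natural frequencies; the frequency model, data and noise level below are invented, and the adaptive step-size and model-reduction machinery of the paper are omitted.

      # Random-walk Metropolis sampling of a one-parameter posterior from modal data.
      import numpy as np

      rng = np.random.default_rng(8)
      freq_obs = np.array([1.02, 2.95, 4.98])            # "measured" frequencies (Hz)

      def freq_model(theta):                             # hypothetical frequency predictor
          return theta * np.array([1.0, 3.0, 5.0])

      def log_post(theta, sigma=0.05):
          if theta <= 0:                                 # flat prior on theta > 0
              return -np.inf
          resid = freq_obs - freq_model(theta)
          return -0.5 * np.sum((resid / sigma) ** 2)

      samples, theta = [], 1.5
      for _ in range(20_000):
          prop = theta + 0.05 * rng.standard_normal()    # random-walk proposal
          if np.log(rng.random()) < log_post(prop) - log_post(theta):
              theta = prop                               # accept
          samples.append(theta)
      print("posterior mean:", np.mean(samples[5000:]))  # discard burn-in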

  19. Model-driven, probabilistic level set based segmentation of magnetic resonance images of the brain.

    PubMed

    Verma, Nishant; Muralidhar, Gautam S; Bovik, Alan C; Cowperthwaite, Matthew C; Markey, Mia K

    2011-01-01

    Accurate segmentation of magnetic resonance (MR) images of the brain to differentiate features such as soft tissue, tumor, edema and necrosis is critical for both diagnosis and treatment purposes. Region-based formulations of geometric active contour models are popular choices for segmentation of MR and other medical images. Most of the traditional region-based formulations model local region intensity by assuming a piecewise constant approximation. However, the piecewise constant approximation rarely holds true for medical images such as MR images due to the presence of noise and bias field, which invariably results in a poor segmentation of the image. To overcome this problem, we have developed a probabilistic region-based active contour model for automatic segmentation of MR images of the brain. In our approach, a mixture of Gaussian distributions is used to accurately model the arbitrarily shaped local region intensity distribution. Prior spatial information derived from probabilistic atlases is also integrated into the level set evolution framework for guiding the segmentation process. Our experiments with a series of publicly available brain MR images show that the proposed active contour model gives stable and accurate segmentation results when compared to the traditional region based formulations. PMID:22254928

  20. Probabilistic/Fracture-Mechanics Model For Service Life

    NASA Technical Reports Server (NTRS)

    Watkins, T., Jr.; Annis, C. G., Jr.

    1991-01-01

    Computer program makes probabilistic estimates of lifetime of engine and components thereof. Developed to fill need for more accurate life-assessment technique that avoids errors in estimated lives and provides for statistical assessment of levels of risk created by engineering decisions in designing system. Implements mathematical model combining techniques of statistics, fatigue, fracture mechanics, nondestructive analysis, life-cycle cost analysis, and management of engine parts. Used to investigate effects of such engine-component life-controlling parameters as return-to-service intervals, stresses, capabilities for nondestructive evaluation, and qualities of materials.

  1. Probabilistic assessment of agricultural droughts using graphical models

    NASA Astrophysics Data System (ADS)

    Ramadas, Meenu; Govindaraju, Rao S.

    2015-07-01

    Agricultural droughts are often characterized by soil moisture in the root zone of the soil, but crop needs are rarely factored into the analysis. Since water needs vary with crops, agricultural drought incidences in a region can be characterized better if crop responses to soil water deficits are also accounted for in the drought index. This study investigates agricultural droughts driven by plant stress due to soil moisture deficits using crop stress functions available in the literature. Crop water stress is assumed to begin at the soil moisture level corresponding to incipient stomatal closure, and reaches its maximum at the crop's wilting point. Using available location-specific crop acreage data, a weighted crop water stress function is computed. A new probabilistic agricultural drought index is then developed within a hidden Markov model (HMM) framework that provides model uncertainty in drought classification and accounts for time dependence between drought states. The proposed index allows probabilistic classification of the drought states and takes due cognizance of the stress experienced by the crop due to soil moisture deficit. The capabilities of HMM model formulations for assessing agricultural droughts are compared to those of current drought indices such as standardized precipitation evapotranspiration index (SPEI) and self-calibrating Palmer drought severity index (SC-PDSI). The HMM model identified critical drought events and several drought occurrences that are not detected by either SPEI or SC-PDSI, and shows promise as a tool for agricultural drought studies.
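
    The hidden Markov idea can be sketched by fitting an off-the-shelf Gaussian HMM (here via the hmmlearn package, used purely as a stand-in for the paper's own formulation) to a synthetic weekly crop water stress series and reading off state posteriors; the data and number of states are illustrative only.

      # Fit a two-state Gaussian HMM to synthetic crop water stress and get state posteriors.
      import numpy as np
      from hmmlearn.hmm import GaussianHMM

      rng = np.random.default_rng(9)
      stress = np.concatenate([rng.normal(0.1, 0.05, 80),    # wet spell
                               rng.normal(0.7, 0.10, 40),    # drought spell
                               rng.normal(0.2, 0.05, 80)]).reshape(-1, 1)

      hmm = GaussianHMM(n_components=2, n_iter=100, random_state=0).fit(stress)
      states = hmm.predict(stress)                  # hard state sequence (Viterbi)
      post = hmm.predict_proba(stress)              # probabilistic drought classification
      print(hmm.means_.ravel(), post[118])          # state means, P(state | week 119)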

  2. Architecture for Integrated Medical Model Dynamic Probabilistic Risk Assessment

    NASA Technical Reports Server (NTRS)

    Jaworske, D. A.; Myers, J. G.; Goodenow, D.; Young, M.; Arellano, J. D.

    2016-01-01

    Probabilistic Risk Assessment (PRA) is a modeling tool used to predict potential outcomes of a complex system based on a statistical understanding of many initiating events. Utilizing a Monte Carlo method, thousands of instances of the model are considered and outcomes are collected. PRA is considered static, utilizing probabilities alone to calculate outcomes. Dynamic Probabilistic Risk Assessment (dPRA) is an advanced concept where modeling predicts the outcomes of a complex system based not only on the probabilities of many initiating events, but also on a progression of dependencies brought about by progressing down a time line. Events are placed in a single time line, adding each event to a queue, as managed by a planner. Progression down the time line is guided by rules, as managed by a scheduler. The recently developed Integrated Medical Model (IMM) summarizes astronaut health as governed by the probabilities of medical events and mitigation strategies. Managing the software architecture process provides a systematic means of creating, documenting, and communicating a software design early in the development process. The software architecture process begins with establishing requirements and the design is then derived from the requirements.

  3. Alternative fuels and vehicles choice model

    SciTech Connect

    Greene, D.L.

    1994-10-01

    This report describes the theory and implementation of a model of alternative fuel and vehicle choice (AFVC), designed for use with the US Department of Energy's Alternative Fuels Trade Model (AFTM). The AFTM is a static equilibrium model of the world supply and demand for liquid fuels, encompassing resource production, conversion processes, transportation, and consumption. The AFTM also includes fuel-switching behavior by incorporating multinomial logit-type equations for choice of alternative fuel vehicles and alternative fuels. This allows the model to solve for market shares of vehicles and fuels, as well as for fuel prices and quantities. The AFVC model includes fuel-flexible, bi-fuel, and dedicated fuel vehicles. For multi-fuel vehicles, the choice of fuel is subsumed within the vehicle choice framework, resulting in a nested multinomial logit design. The nesting is shown to be required by the different price elasticities of fuel and vehicle choice. A unique feature of the AFVC is that its parameters are derived directly from the characteristics of alternative fuels and vehicle technologies, together with a few key assumptions about consumer behavior. This not only establishes a direct link between assumptions and model predictions, but facilitates sensitivity testing, as well. The implementation of the AFVC model as a spreadsheet is also described.
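
    The nested multinomial logit structure can be sketched numerically: fuel choice forms the lower nest, whose inclusive value (logsum) feeds the vehicle-level utility. The utilities, nest structure and dissimilarity parameter below are invented for illustration and are not the AFVC calibration.

      # Two-level nested logit: fuel choice within each vehicle, then vehicle choice.
      import numpy as np

      def softmax(v):
          v = np.asarray(v, dtype=float)
          e = np.exp(v - v.max())
          return e / e.sum()

      fuel_utils = {"flex": [1.0, 0.6], "gasoline": [0.9]}   # fuel utilities per vehicle type
      lam = 0.5                                              # nest dissimilarity parameter

      # lower nest: fuel shares within each vehicle, and the nest's inclusive value (logsum)
      fuel_probs = {v: softmax(np.array(u) / lam) for v, u in fuel_utils.items()}
      iv = {v: lam * np.log(np.exp(np.array(u) / lam).sum()) for v, u in fuel_utils.items()}

      # upper level: vehicle shares driven by vehicle-specific utility plus inclusive value
      veh_utils = {"flex": 0.2 + iv["flex"], "gasoline": 0.0 + iv["gasoline"]}
      veh_probs = softmax(list(veh_utils.values()))
      print(dict(zip(veh_utils, veh_probs)), fuel_probs)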

  4. Developing a Model of Occupational Choice

    ERIC Educational Resources Information Center

    Egner, Joan Roos

    1974-01-01

    Rational and non-rational decision-making models of occupational choice are described. The Blau model provides an alternative to these. This model contains an occupational set of factors and a set related to the individual. Research supporting its conceptual utility and activities illustrating its pragmatic utility are discussed. (EAK)

  5. Probabilistic logic modeling of network reliability for hybrid network architectures

    SciTech Connect

    Wyss, G.D.; Schriner, H.K.; Gaylor, T.R.

    1996-10-01

    Sandia National Laboratories has found that the reliability and failure modes of current-generation network technologies can be effectively modeled using fault tree-based probabilistic logic modeling (PLM) techniques. We have developed fault tree models that include various hierarchical networking technologies and classes of components interconnected in a wide variety of typical and atypical configurations. In this paper we discuss the types of results that can be obtained from PLMs and why these results are of great practical value to network designers and analysts. After providing some mathematical background, we describe the 'plug-and-play' fault tree analysis methodology that we have developed for modeling connectivity and the provision of network services in several current-generation network architectures. Finally, we demonstrate the flexibility of the method by modeling the reliability of a hybrid example network that contains several interconnected ethernet, FDDI, and token ring segments. 11 refs., 3 figs., 1 tab.
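
    In the same spirit as the fault tree based PLM approach, a tiny connectivity calculation might combine AND and OR gates over independent component failure probabilities; the component list and numbers below are made up and are not Sandia's models.

      # Toy fault-tree style calculation: probability of losing end-to-end connectivity.
      p = {"ethernet_seg": 0.02, "fddi_ring": 0.01, "token_ring": 0.03, "router": 0.005}

      # redundant backbone: service is lost only if BOTH backbone segments fail (AND gate)
      p_backbone_down = p["fddi_ring"] * p["token_ring"]

      def or_gate(*probs):
          # OR gate over independent basic events: 1 - product of survival probabilities
          q = 1.0
          for pi in probs:
              q *= (1.0 - pi)
          return 1.0 - q

      # end-to-end failure: access segment OR router OR backbone failure
      p_path_down = or_gate(p["ethernet_seg"], p["router"], p_backbone_down)
      print("P(end-to-end connectivity lost) =", round(p_path_down, 5))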

  6. Consumer Vehicle Choice Model Documentation

    SciTech Connect

    Liu, Changzheng; Greene, David L

    2012-08-01

    In response to the Fuel Economy and Greenhouse Gas (GHG) emissions standards, automobile manufacturers will need to adopt new technologies to improve the fuel economy of their vehicles and to reduce the overall GHG emissions of their fleets. The U.S. Environmental Protection Agency (EPA) has developed the Optimization Model for reducing GHGs from Automobiles (OMEGA) to estimate the costs and benefits of meeting GHG emission standards through different technology packages. However, the model does not simulate the impact that increased technology costs will have on vehicle sales or on consumer surplus. As the model documentation states, “While OMEGA incorporates functions which generally minimize the cost of meeting a specified carbon dioxide (CO2) target, it is not an economic simulation model which adjusts vehicle sales in response to the cost of the technology added to each vehicle.” Changes in the mix of vehicles sold, caused by the costs and benefits of added fuel economy technologies, could make it easier or more difficult for manufacturers to meet fuel economy and emissions standards, and impacts on consumer surplus could raise the costs or augment the benefits of the standards. Because the OMEGA model does not presently estimate such impacts, the EPA is investigating the feasibility of developing an adjunct to the OMEGA model to make such estimates. This project is an effort to develop and test a candidate model. The project statement of work spells out the key functional requirements for the new model.

  7. Futuristic Models for Educational Choice

    ERIC Educational Resources Information Center

    Tanner, C. Kenneth

    1973-01-01

    Discusses eight computer assisted planning models that are amenable to the improvement of decisionmaking; explains the feasibility of involving systems analysis and operations research in educational decisions; and suggests a minimal program designed to prepare educational planners with knowledge of computer assisted planning models. (Author/JF)

  8. Efficient diagnosis of multiprocessor systems under probabilistic models

    NASA Technical Reports Server (NTRS)

    Blough, Douglas M.; Sullivan, Gregory F.; Masson, Gerald M.

    1989-01-01

    The problem of fault diagnosis in multiprocessor systems is considered under a probabilistic fault model. The focus is on minimizing the number of tests that must be conducted in order to correctly diagnose the state of every processor in the system with high probability. A diagnosis algorithm that can correctly diagnose the state of every processor with probability approaching one in a class of systems performing slightly greater than a linear number of tests is presented. A nearly matching lower bound on the number of tests required to achieve correct diagnosis in arbitrary systems is also proven. Lower and upper bounds on the number of tests required for regular systems are also presented. A class of regular systems which includes hypercubes is shown to be correctly diagnosable with high probability. In all cases, the number of tests required under this probabilistic model is shown to be significantly less than under a bounded-size fault set model. Because the number of tests that must be conducted is a measure of the diagnosis overhead, these results represent a dramatic improvement in the performance of system-level diagnosis techniques.

  9. A probabilistic computational model of cross-situational word learning.

    PubMed

    Fazly, Afsaneh; Alishahi, Afra; Stevenson, Suzanne

    2010-08-01

    Words are the essence of communication: They are the building blocks of any language. Learning the meaning of words is thus one of the most important aspects of language acquisition: Children must first learn words before they can combine them into complex utterances. Many theories have been developed to explain the impressive efficiency of young children in acquiring the vocabulary of their language, as well as the developmental patterns observed in the course of lexical acquisition. A major source of disagreement among the different theories is whether children are equipped with special mechanisms and biases for word learning, or their general cognitive abilities are adequate for the task. We present a novel computational model of early word learning to shed light on the mechanisms that might be at work in this process. The model learns word meanings as probabilistic associations between words and semantic elements, using an incremental and probabilistic learning mechanism, and drawing only on general cognitive abilities. The results presented here demonstrate that much about word meanings can be learned from naturally occurring child-directed utterances (paired with meaning representations), without using any special biases or constraints, and without any explicit developmental changes in the underlying learning mechanism. Furthermore, our model provides explanations for the occasionally contradictory child experimental data, and offers predictions for the behavior of young word learners in novel situations. PMID:21564243
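
    A toy incremental cross-situational learner conveys the flavor of the model: each utterance is paired with a set of candidate meanings, and word-meaning association strengths are updated in proportion to current beliefs. The corpus and the update rule below are simplified placeholders, not the model's actual alignment probabilities.

      # Incremental cross-situational learning of word-meaning associations on a toy corpus.
      from collections import defaultdict

      corpus = [(["ball", "red"], {"BALL", "RED"}),
                (["ball", "big"], {"BALL", "BIG"}),
                (["red", "car"], {"RED", "CAR"})]

      assoc = defaultdict(lambda: defaultdict(float))   # word -> meaning -> strength
      for words, meanings in corpus:                    # single incremental pass over the input
          for w in words:
              total = sum(assoc[w][m] + 1e-6 for m in meanings)
              for m in meanings:
                  # credit each candidate meaning in proportion to current belief
                  assoc[w][m] += (assoc[w][m] + 1e-6) / total

      for w in ["ball", "red"]:
          norm = sum(assoc[w].values())
          probs = {m: round(a / norm, 2) for m, a in assoc[w].items()}
          print(w, "->", max(probs, key=probs.get), probs)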

  10. Learning a Probabilistic Topology Discovering Model for Scene Categorization.

    PubMed

    Zhang, Luming; Ji, Rongrong; Xia, Yingjie; Zhang, Ying; Li, Xuelong

    2015-08-01

    A recent advance in scene categorization favors topology-based modeling to capture the existence of, and relationships among, different scene components. To that effect, local features are typically used to handle photographing variances such as occlusions and clutters. However, in many cases, the local features alone cannot well capture the scene semantics since they are extracted from tiny regions (e.g., 4×4 patches) within an image. In this paper, we mine a discriminative topology and a low-redundant topology from the local descriptors under a probabilistic perspective, which are further integrated into a boosting framework for scene categorization. In particular, by decomposing a scene image into basic components, a graphlet model is used to describe their spatial interactions. Accordingly, scene categorization is formulated as an inter-graphlet matching problem. The above procedure is further accelerated by introducing a probability-based representative topology selection scheme that makes the pairwise graphlet comparison tractable despite their exponentially increasing volumes. The selected graphlets are highly discriminative and independent, characterizing the topological characteristics of scene images. A weak learner is subsequently trained for each topology; these are boosted together to jointly describe the scene image. In our experiment, the visualized graphlets demonstrate that the mined topological patterns are representative of scene categories, and our proposed method beats state-of-the-art models on five popular scene data sets.

  11. Model Checking Linear-Time Properties of Probabilistic Systems

    NASA Astrophysics Data System (ADS)

    Baier, Christel; Größer, Marcus; Ciesinski, Frank

    This chapter is about the verification of Markov decision processes (MDPs), which constitute one of the fundamental models for reasoning about probabilistic and nondeterministic phenomena in reactive systems. MDPs have their roots in the field of operations research and are nowadays used in a wide variety of areas including verification, robotics, planning, control, reinforcement learning, economics and the semantics of randomized systems. Furthermore, MDPs served as the basis for the introduction of probabilistic automata, which are related to weighted automata. We describe the use of MDPs as an operational model for randomized systems, e.g., systems that employ randomized algorithms, multi-agent systems, or systems with unreliable components or surroundings. In this context we outline the theory of verifying ω-regular properties of such operational models. As an integral part of this theory we use ω-automata, i.e., finite-state automata over finite alphabets that accept languages of infinite words. Additionally, the basic concepts of important reduction techniques are sketched, namely partial order reduction of MDPs and quotient system reduction of the numerical problem that arises in the verification of MDPs. Furthermore, we present several undecidability and decidability results for the controller synthesis problem for partially observable MDPs.
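
    As a minimal sketch of one numerical problem behind such verification, the code below computes maximal reachability probabilities in a small hand-coded MDP by value iteration; the MDP itself is invented for illustration and has no connection to the chapter's examples.

      # Value iteration for maximal reachability probabilities in a small MDP:
      # Pmax(s) = max over actions a of sum_s' P(s'|s,a) * Pmax(s').
      # A toy illustration of one numerical core of MDP model checking; the MDP is invented.
      mdp = {  # state -> action -> list of (next_state, probability)
          "s0": {"a": [("s1", 0.5), ("s2", 0.5)], "b": [("s2", 1.0)]},
          "s1": {"a": [("goal", 0.9), ("fail", 0.1)]},
          "s2": {"a": [("goal", 0.3), ("fail", 0.7)]},
          "goal": {}, "fail": {},
      }
      target = {"goal"}

      p = {s: (1.0 if s in target else 0.0) for s in mdp}
      for _ in range(1000):                      # iterate toward the fixpoint
          new = dict(p)
          for s, actions in mdp.items():
              if s in target or not actions:
                  continue
              new[s] = max(sum(pr * p[t] for t, pr in dist) for dist in actions.values())
          if max(abs(new[s] - p[s]) for s in p) < 1e-12:
              break
          p = new
      print(p["s0"])   # maximal probability of eventually reaching "goal" from s0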

  12. Probabilistic delay differential equation modeling of event-related potentials.

    PubMed

    Ostwald, Dirk; Starke, Ludger

    2016-08-01

    "Dynamic causal models" (DCMs) are a promising approach in the analysis of functional neuroimaging data due to their biophysical interpretability and their consolidation of functional-segregative and functional-integrative propositions. In this theoretical note we are concerned with the DCM framework for electroencephalographically recorded event-related potentials (ERP-DCM). Intuitively, ERP-DCM combines deterministic dynamical neural mass models with dipole-based EEG forward models to describe the event-related scalp potential time-series over the entire electrode space. Since its inception, ERP-DCM has been successfully employed to capture the neural underpinnings of a wide range of neurocognitive phenomena. However, in spite of its empirical popularity, the technical literature on ERP-DCM remains somewhat patchy. A number of previous communications have detailed certain aspects of the approach, but no unified and coherent documentation exists. With this technical note, we aim to close this gap and to increase the technical accessibility of ERP-DCM. Specifically, this note makes the following novel contributions: firstly, we provide a unified and coherent review of the mathematical machinery of the latent and forward models constituting ERP-DCM by formulating the approach as a probabilistic latent delay differential equation model. Secondly, we emphasize the probabilistic nature of the model and its variational Bayesian inversion scheme by explicitly deriving the variational free energy function in terms of both the likelihood expectation and variance parameters. Thirdly, we detail and validate the estimation of the model with a special focus on the explicit form of the variational free energy function and introduce a conventional nonlinear optimization scheme for its maximization. Finally, we identify and discuss a number of computational issues which may be addressed in the future development of the approach.

  13. A probabilistic model to evaluate population dietary recommendations.

    PubMed

    Chalabi, Zaid; Ferguson, Elaine; Stanley, Robert; Briend, André

    2014-07-28

    Food-based dietary recommendations (FBR) play an essential role in promoting a healthy diet. To support the process of formulating a set of population-specific FBR, a probabilistic model was developed specifically to predict the changes in the percentage of a population at risk of inadequate nutrient intakes after the adoption of alternative sets of FBR. The model simulates the distribution of the number of servings per week from food groups or food items at baseline and after the hypothetical successful adoption of alternative sets of FBR, while ensuring that the population's energy intake distribution remains similar. The simulated changes from baseline in median nutrient intakes and the percentage of the population at risk of inadequate nutrient intakes are calculated and compared across the alternative sets of FBR. The model was illustrated using a hypothetical population of 12- to 18-month-old breast-feeding children consuming a cereal-based diet low in animal source foods.
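
    The sketch below illustrates the kind of Monte Carlo comparison described: simulate weekly servings per person, derive nutrient intakes, and compare the fraction of the population below a requirement at baseline and under a hypothetical recommendation. The food groups, nutrient contents, requirement threshold, and serving distributions are invented placeholders, not the model's actual inputs or its energy-intake constraint.

      import numpy as np

      rng = np.random.default_rng(0)
      n = 10_000                              # simulated individuals

      # invented nutrient content (mg per serving) of two food groups
      iron_per_serving = {"legumes": 2.0, "meat": 3.5}
      requirement = 30.0                      # invented weekly requirement, mg

      def percent_at_risk(mean_servings):
          """Fraction of the simulated population below the weekly requirement."""
          intake = np.zeros(n)
          for group, mean in mean_servings.items():
              servings = rng.poisson(mean, size=n)          # weekly servings per person
              intake += servings * iron_per_serving[group]
          return 100.0 * np.mean(intake < requirement)

      baseline = {"legumes": 3, "meat": 2}
      with_fbr = {"legumes": 7, "meat": 2}    # hypothetical recommendation: more legumes

      print(percent_at_risk(baseline), percent_at_risk(with_fbr))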

  14. Probabilistic Modeling of Aircraft Trajectories for Dynamic Separation Volumes

    NASA Technical Reports Server (NTRS)

    Lewis, Timothy A.

    2016-01-01

    With a proliferation of new and unconventional vehicles and operations expected in the future, the ab initio airspace design will require new approaches to trajectory prediction for separation assurance and other air traffic management functions. This paper presents an approach to probabilistic modeling of the trajectory of an aircraft when its intent is unknown. The approach uses a set of feature functions to constrain a maximum entropy probability distribution based on a set of observed aircraft trajectories. This model can be used to sample new aircraft trajectories to form an ensemble reflecting the variability in an aircraft's intent. The model learning process ensures that the variability in this ensemble reflects the behavior observed in the original data set. Computational examples are presented.
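
    A rough sketch of the maximum-entropy idea, under the simplifying assumption that candidate trajectories are summarized by a small feature vector and the model is a discrete exponential-family distribution over that candidate set, fit so its expected features match those of the observed trajectories. The features, data, and learning rate are invented.

      import numpy as np

      rng = np.random.default_rng(1)

      # Candidate "trajectories" summarized by invented feature vectors,
      # e.g., [mean turn rate, mean climb rate]
      candidates = rng.normal(size=(500, 2))
      observed = candidates[:50]                 # pretend these were actually flown

      target = observed.mean(axis=0)             # empirical feature expectations

      theta = np.zeros(2)
      for _ in range(2000):                      # gradient ascent on the max-ent objective
          logits = candidates @ theta
          p = np.exp(logits - logits.max())
          p /= p.sum()                           # p(x) proportional to exp(theta . f(x))
          model_mean = p @ candidates
          theta += 0.1 * (target - model_mean)   # move model expectations toward the target

      # Sample an ensemble of trajectories reflecting the learned variability
      ensemble = candidates[rng.choice(len(candidates), size=10, p=p)]
      print(theta, ensemble.shape)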

  15. Binary Encoded-Prototype Tree for Probabilistic Model Building GP

    NASA Astrophysics Data System (ADS)

    Yanase, Toshihiko; Hasegawa, Yoshihiko; Iba, Hitoshi

    In recent years, program evolution algorithms based on the estimation of distribution algorithm (EDA) have been proposed to improve the search ability of genetic programming (GP) and to overcome GP-hard problems. One such method is the probabilistic prototype tree (PPT) based algorithm. The PPT-based method explores the optimal tree structure by using the full tree whose number of child nodes is the maximum among possible trees. This algorithm, however, suffers from problems arising from function nodes having different numbers of child nodes. These function nodes cause intron nodes, which do not affect the fitness function. Moreover, function nodes having many child nodes increase the search space and the number of samples necessary for properly constructing the probabilistic model. In order to solve this problem, we propose binary encoding for the PPT. In this article, we convert each function node to a subtree of binary nodes such that the converted tree remains grammatically correct. Our method reduces the ineffective search space, and the binary-encoded tree is able to express the same tree structures as the original method. The effectiveness of the proposed method is demonstrated through two computational experiments.

  16. Applications of the International Space Station Probabilistic Risk Assessment Model

    NASA Technical Reports Server (NTRS)

    Grant, Warren; Lutomski, Michael G.

    2011-01-01

    Recently the International Space Station (ISS) has incorporated more Probabilistic Risk Assessments (PRAs) in the decision making process for significant issues. Future PRAs will have a major impact on ISS and on future spacecraft development and operations. These PRAs will have their foundation in the current complete ISS PRA model and in the current PRA trade studies being analyzed as requested by ISS Program stakeholders. ISS PRAs have recently helped in the decision making process for determining reliability requirements for future NASA spacecraft and commercial spacecraft, making crew rescue decisions, and setting operational requirements for ISS orbital orientation, planning extravehicular activities (EVAs) and robotic operations. This paper will describe some applications of the ISS PRA model and how they impacted the final decision. This paper will also discuss future analysis topics such as life extension and the requirements for new commercial vehicles visiting ISS.

  17. Petri net modeling of fault analysis for probabilistic risk assessment

    NASA Astrophysics Data System (ADS)

    Lee, Andrew

    Fault trees and event trees have been widely accepted as the modeling strategy for performing Probabilistic Risk Assessment (PRA). However, there are several limitations associated with fault tree/event tree modeling: (1) it considers only binary events; (2) it assumes independence among basic events; and (3) it does not consider the timing sequence of basic events. This thesis investigates Petri net modeling as a potential alternative for PRA modeling. Petri nets have mainly been used as a simulation tool for queuing and network systems. However, it has been suggested that they could also model failure scenarios, and thus could be a potential modeling strategy for PRA. In this thesis, the transformations required to model the logic gates of a fault tree with Petri nets are explored. The gap between fault tree analysis and Petri net analysis is bridged through gate equivalency analysis. Methods for qualitative and quantitative analysis of Petri nets are presented. Techniques are developed and implemented to revise and tailor traditional Petri net modeling for system failure analysis. The airlock system and the maintenance cooling system of a CANada Deuterium Uranium (CANDU) reactor are used as case studies to demonstrate the ability of Petri nets to model system failure and to provide a structured approach for qualitative and quantitative analysis. The minimal cutsets and the probability of the airlock system failing to maintain the pressure boundary are obtained. Furthermore, the case study is extended to non-coherent system analysis due to system maintenance.
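
    Whichever formalism carries the structure (fault tree or Petri net), the quantitative step often reduces to combining minimal cutset probabilities. The sketch below shows that arithmetic under an independence assumption, using the rare-event approximation and the min-cut upper bound; the events and probabilities are invented, not the CANDU case study values.

      # Quantifying a top event from minimal cutsets, assuming independent basic
      # events. Cutsets and probabilities below are invented placeholders.
      from math import prod

      basic = {"valve_fails": 1e-3, "seal_leaks": 5e-4, "sensor_fails": 2e-3,
               "operator_error": 1e-2}
      minimal_cutsets = [{"valve_fails", "seal_leaks"},          # both must fail
                         {"sensor_fails", "operator_error"}]

      cutset_p = [prod(basic[e] for e in cs) for cs in minimal_cutsets]

      rare_event_approx = sum(cutset_p)                          # first-order estimate
      upper_bound = 1 - prod(1 - p for p in cutset_p)            # min-cut upper bound
      print(rare_event_approx, upper_bound)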

  18. Use of probabilistic inversion to model qualitative expert input when selecting a new nuclear reactor technology

    NASA Astrophysics Data System (ADS)

    Merritt, Charles R., Jr.

    Complex investment decisions by corporate executives often require the comparison of dissimilar attributes and competing technologies. A technique to evaluate qualitative input from experts using a Multi-Criteria Decision Method (MCDM) is described for selecting a new reactor technology for a merchant nuclear generator. High capital cost; risks from design, licensing, and construction; and reactor safety and security are among the diverse considerations when choosing a reactor design. Three next-generation reactor technologies are examined: the Advanced Pressurized-1000 (AP-1000) from Westinghouse, the Economic Simplified Boiling Water Reactor (ESBWR) from General Electric, and the U.S. Evolutionary Power Reactor (U.S. EPR) from AREVA. Recent developments in MCDM and decision support systems are described. The uncertainty inherent in the experts' opinions about the attribute weighting in the MCDM is modeled through the use of probabilistic inversion. In probabilistic inversion, a function is inverted into a random variable within a defined range. Once the distribution is created, random samples based on the distribution are used to perform a sensitivity analysis on the decision results to verify the "strength" of the results. The decision results for the pool of experts identified the U.S. EPR as the optimal choice.
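
    The sensitivity-analysis step can be pictured as below: draw attribute weights from a distribution (here a Dirichlet, used only as a stand-in for whatever distribution the probabilistic inversion produces), score each alternative with a weighted sum, and count how often each ranks first. The scores and weight parameters are invented.

      import numpy as np

      rng = np.random.default_rng(2)

      # Invented normalized attribute scores (rows: alternatives, cols: criteria)
      alternatives = ["AP-1000", "ESBWR", "U.S. EPR"]
      scores = np.array([[0.7, 0.6, 0.8],
                         [0.8, 0.5, 0.7],
                         [0.6, 0.9, 0.7]])

      # Stand-in for the weight distribution obtained from probabilistic inversion
      # of expert input; a Dirichlet keeps weights positive and summing to 1.
      weights = rng.dirichlet([4.0, 2.0, 3.0], size=10_000)

      winners = np.argmax(weights @ scores.T, axis=1)
      for i, name in enumerate(alternatives):
          print(name, np.mean(winners == i))   # fraction of draws in which it ranks first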

  19. De novo protein conformational sampling using a probabilistic graphical model

    NASA Astrophysics Data System (ADS)

    Bhattacharya, Debswapna; Cheng, Jianlin

    2015-11-01

    Efficient exploration of protein conformational space remains challenging, especially for large proteins, when assembling discretized structural fragments extracted from a protein structure database. We propose a fragment-free probabilistic graphical model, FUSION, for conformational sampling in continuous space and assess its accuracy using 'blind' protein targets with lengths up to 250 residues from the CASP11 structure prediction exercise. The method reduces sampling bottlenecks, exhibits strong convergence, and demonstrates better performance than the popular fragment assembly method, ROSETTA, on the relatively larger proteins (longer than 150 residues) in our benchmark set. FUSION is freely available through a web server at http://protein.rnet.missouri.edu/FUSION/.

  20. Behavioral Modeling Based on Probabilistic Finite Automata: An Empirical Study.

    PubMed

    Tîrnăucă, Cristina; Montaña, José L; Ontañón, Santiago; González, Avelino J; Pardo, Luis M

    2016-01-01

    Imagine an agent that performs tasks according to different strategies. The goal of Behavioral Recognition (BR) is to identify which of the available strategies is being used by the agent, simply by observing the agent's actions and the environmental conditions during a certain period of time. The goal of Behavioral Cloning (BC) is more ambitious: in the latter case, the learner must be able to build a model of the behavior of the agent. In both settings, the only assumption is that the learner has access to a training set that contains instances of observed behavioral traces for each available strategy. This paper studies a machine learning approach based on Probabilistic Finite Automata (PFAs), capable of achieving both the recognition and cloning tasks. We evaluate the performance of PFAs in the context of a simulated learning environment (in this case, a virtual Roomba vacuum cleaner robot), and compare it with a collection of other machine learning approaches. PMID:27347956
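
    For the recognition task, a natural scoring rule is the log-likelihood of the observed action trace under each strategy's PFA, with the highest-scoring automaton taken as the recognized strategy. The toy automata and trace below are invented and are only meant to show that calculation.

      import math

      # Each strategy is a tiny PFA: state -> symbol -> (next_state, probability).
      # The automata and the observed trace are invented for illustration.
      pfas = {
          "wall_follow": {0: {"turn": (0, 0.7), "forward": (1, 0.3)},
                          1: {"turn": (0, 0.2), "forward": (1, 0.8)}},
          "random_walk": {0: {"turn": (0, 0.5), "forward": (0, 0.5)}},
      }

      def log_likelihood(pfa, trace, start=0):
          state, ll = start, 0.0
          for symbol in trace:
              if symbol not in pfa[state]:
                  return float("-inf")          # trace impossible under this PFA
              state, p = pfa[state][symbol]
              ll += math.log(p)
          return ll

      trace = ["forward"] * 5
      scores = {name: log_likelihood(pfa, trace) for name, pfa in pfas.items()}
      print(max(scores, key=scores.get), scores)   # recognized strategy and its score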

  1. Probabilistic modeling of soil development variability with time

    NASA Astrophysics Data System (ADS)

    Shepard, C.; Schaap, M. G.; Rasmussen, C.

    2015-12-01

    Soils develop as the result of a complex suite of biogeochemical and physical processes; however, effective modeling of soil development over pedogenic time scales and the resultant soil property variability is limited to individual chronosequence studies or overly broad generalizations. Soil chronosequence studies are used to understand soil development across a landscape with time, but traditional soil chronosequence studies do not account for uncertainty in soil development, and the results of these studies are site dependent. Here we develop a probabilistic approach to quantify the distribution of probable soil property values based on a review of soil chronosequence studies. Specifically, we examined the changes in the distributions of soil texture and solum thickness with increasing time and influx of pedogenic energy from climatic and biological forcings. We found the greatest variability in maximum measured clay content between 10³ and 10⁵ years, with convergence of clay contents in soils older than 10⁶ years. Conversely, we found that the variability in maximum sand content increased with increasing time, with the greatest variability in soils between 10⁵ and 10⁶ years old; we did not find distributional changes in maximum silt content with time. Bivariate normal probability distributions were parameterized using the chronosequence data, from which conditional univariate distributions based on the total pedogenic energy (age × rate of energy flux) were calculated, allowing determination of a probable range of soil properties for a given age and bioclimatic environment. The bivariate distribution was capable of effectively representing the measured maximum clay content values with an r² of 0.53 (p < 0.0001, RMSE = 14.36%). By taking a distributional approach to quantifying soil development and variability, we can quantitatively and probabilistically represent the full state factor model, while explicitly quantifying the uncertainty in soil development.
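
    The conditioning step for a bivariate normal is standard: given correlation ρ and the marginal means and standard deviations, Y given X = x is normal with mean μ_Y + ρ(σ_Y/σ_X)(x - μ_X) and standard deviation σ_Y√(1 - ρ²). The sketch below applies that formula with placeholder parameters, not the fitted chronosequence values.

      import math

      def conditional_normal(mu_x, mu_y, sd_x, sd_y, rho, x):
          """Mean and s.d. of Y | X = x for a bivariate normal (X, Y)."""
          mean = mu_y + rho * (sd_y / sd_x) * (x - mu_x)
          sd = sd_y * math.sqrt(1.0 - rho ** 2)
          return mean, sd

      # Placeholder parameters: X = log10(pedogenic energy), Y = max clay content (%)
      mu, sd = conditional_normal(mu_x=5.0, mu_y=25.0, sd_x=1.5, sd_y=14.0,
                                  rho=0.7, x=6.0)
      print(mu, sd)   # probable clay content range for a given energy level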

  2. Probabilistic models of eukaryotic evolution: time for integration

    PubMed Central

    Lartillot, Nicolas

    2015-01-01

    In spite of substantial work and recent progress, a global and fully resolved picture of the macroevolutionary history of eukaryotes is still under construction. This concerns not only the phylogenetic relations among major groups, but also the general characteristics of the underlying macroevolutionary processes, including the patterns of gene family evolution associated with endosymbioses, as well as their impact on the sequence evolutionary process. All these questions raise formidable methodological challenges, calling for a more powerful statistical paradigm. In this direction, model-based probabilistic approaches have played an increasingly important role. In particular, improved models of sequence evolution accounting for heterogeneities across sites and across lineages have led to significant, although insufficient, improvement in phylogenetic accuracy. More recently, one main trend has been to move away from simple parametric models and stepwise approaches, towards integrative models explicitly considering the intricate interplay between multiple levels of macroevolutionary processes. Such integrative models are in their infancy, and their application to the phylogeny of eukaryotes still requires substantial improvement of the underlying models, as well as additional computational developments. PMID:26323768

  3. Probabilistic model for AGV mobile robot ultrasonic sensor

    NASA Astrophysics Data System (ADS)

    Liao, Xiaoqun; Cao, Ming; Cao, Jin; Hall, Ernest L.

    1999-08-01

    An autonomous guided vehicle is a multi-sensor mobile robot. The sensors of a multi-sensor robot system are characteristically complex and diverse. They supply observations which are often difficult to compare or aggregate directly. To make efficient use of the sensor information, the capabilities of each sensor must be modeled so that information can be extracted from the environment. Toward this goal, a probability model of the ultrasonic sensor (PMUS) is presented in this paper. The model provides a means of distributing decision making and integrating diverse opinions. The paper also illustrates that a series of performance factors affect the probability model as parameters. PMUS could be extended to other sensors as members of the multi-sensor team. Moreover, the sensor probability model explored is suitable for all multi-sensor mobile robots. It should provide a quantitative ability for analysis of sensor performance, and allow the development of robust decision procedures for integrating sensor information. The theoretical sensor model presented is a first step in understanding and expanding the performance of ultrasound systems. The significance of this paper lies in the theoretical integration of sensory information from a probabilistic point of view.

  4. Probabilistic models for reactive behaviour in heterogeneous condensed phase media

    NASA Astrophysics Data System (ADS)

    Baer, M. R.; Gartling, D. K.; DesJardin, P. E.

    2012-02-01

    This work presents statistically-based models to describe reactive behaviour in heterogeneous energetic materials. Mesoscale effects are incorporated in continuum-level reactive flow descriptions using probability density functions (pdfs) that are associated with thermodynamic and mechanical states. A generalised approach is presented that includes multimaterial behaviour by treating the volume fraction as a random kinematic variable. Model simplifications are then sought to reduce the complexity of the description without compromising the statistical approach. Reactive behaviour is first considered for non-deformable media having a random temperature field as an initial state. A pdf transport relationship is derived and an approximate moment approach is incorporated in finite element analysis to model an example application whereby a heated fragment impacts a reactive heterogeneous material which leads to a delayed cook-off event. Modelling is then extended to include deformation effects associated with shock loading of a heterogeneous medium whereby random variables of strain, strain-rate and temperature are considered. A demonstrative mesoscale simulation of a non-ideal explosive is discussed that illustrates the joint statistical nature of the strain and temperature fields during shock loading to motivate the probabilistic approach. This modelling is derived in a Lagrangian framework that can be incorporated in continuum-level shock physics analysis. Future work will consider particle-based methods for a numerical implementation of this modelling approach.

  5. Gendered Educational and Occupational Choices: Applying the Eccles et al. Model of Achievement-Related Choices

    ERIC Educational Resources Information Center

    Eccles, Jacquelynne

    2011-01-01

    I summarize a theoretical model of the social, cultural, and psychological influences on achievement-related choices and outline how this model can help us understand gendered educational and occupational choices. I argue that both gender differences and individual differences within each gender in educational and occupational choices are linked…

  6. Detection and characterization of regulatory elements using probabilistic conditional random field and hidden Markov models.

    PubMed

    Wang, Hongyan; Zhou, Xiaobo

    2013-04-01

    By altering the electrostatic charge of histones or providing binding sites for protein recognition molecules, chromatin marks have been proposed to regulate gene expression, a property that has motivated researchers to link these marks to cis-regulatory elements. With the help of next-generation sequencing technologies, we can now correlate one specific chromatin mark with regulatory elements (e.g., enhancers or promoters) and also build tools, such as hidden Markov models, to gain insight into mark combinations. However, hidden Markov models are limited by their generative character and assume that a current observation depends only on the current hidden state in the chain. Here, we employed two graphical probabilistic models, namely the linear conditional random field model and the multivariate hidden Markov model, to mark gene regions with different states based on the recurrent and spatially coherent character of these eight marks. Both models revealed chromatin states that may correspond to enhancers and promoters, transcribed regions, transcriptional elongation, and low-signal regions. We also found that the linear conditional random field model was more effective than the hidden Markov model in recognizing regulatory elements, such as promoter-, enhancer-, and transcriptional elongation-associated regions, making it the better choice.

  8. Probabilistic Model for Low Altitude Trapped Proton Fluxes

    NASA Technical Reports Server (NTRS)

    Xapsos, M. A.; Huston, S. L.; Barth, J. L.; Stassinopoulos, E. G.

    2003-01-01

    A new approach is developed for the assessment of low altitude trapped proton fluxes for future space missions. Low altitude fluxes are dependent on solar activity levels due to the resulting heating and cooling of the upper atmosphere. However, solar activity levels cannot be accurately predicted far enough into the future to accommodate typical spacecraft mission planning. Thus, the approach suggested here is to evaluate the trapped proton flux as a function of confidence level for a given mission time period. This is possible because of a recent advance in trapped proton modeling that uses the solar 10.7 cm radio flux, a measure of solar cycle activity, to calculate trapped proton fluxes as a continuous function of time throughout the solar cycle. This trapped proton model is combined with a new statistical description of the 10.7 cm flux to obtain the probabilistic model for low altitude trapped proton fluxes. Results for proton energies ranging from 1.5 to 81.3 MeV are examined as a function of time throughout solar cycle 22 for various orbits. For altitudes below 1000 km, fluxes are significantly higher and energy spectra are significantly harder than those predicted by the AP8 model.

  9. Probabilistic model for quick detection of dissimilar binary images

    NASA Astrophysics Data System (ADS)

    Mustafa, Adnan A. Y.

    2015-09-01

    We present a quick method to detect dissimilar binary images. The method is based on a "probabilistic matching model" for image matching. The matching model is used to predict the probability of occurrence of distinct-dissimilar image pairs (completely different images) when matching one image to another. Based on this model, distinct-dissimilar images can be detected by matching only a few points between two images with high confidence, namely 11 points for a 99.9% successful detection rate. For image pairs that are dissimilar but not distinct-dissimilar, more points need to be mapped. The number of points required to attain a certain successful detection rate or confidence depends on the amount of similarity between the compared images. As this similarity increases, more points are required. For example, images that differ by 1% can be detected by mapping fewer than 70 points on average. More importantly, the model is image size invariant; so, images of any sizes will produce high confidence levels with a limited number of matched points. As a result, this method does not suffer from the image size handicap that impedes current methods. We report on extensive tests conducted on real images of different sizes.
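
    The style of calculation can be illustrated as follows (a generic sketch, not the paper's exact formulation or its reported numbers): if two unrelated binary images agree at a randomly chosen point with probability p, then after k independently sampled matching points the chance of wrongly accepting a dissimilar pair is roughly p^k, so k can be chosen to meet a target confidence.

      import math

      def points_needed(p_agree, confidence):
          """Matched points needed so that P(all k agree by chance) <= 1 - confidence."""
          return math.ceil(math.log(1.0 - confidence) / math.log(p_agree))

      # Placeholder per-point agreement probabilities for unrelated binary images
      print(points_needed(p_agree=0.5, confidence=0.999))   # about 10 points
      print(points_needed(p_agree=0.9, confidence=0.999))   # more similar images need more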

  10. Probabilistic models to describe the dynamics of migrating microbial communities.

    PubMed

    Schroeder, Joanna L; Lunn, Mary; Pinto, Ameet J; Raskin, Lutgarde; Sloan, William T

    2015-01-01

    In all but the most sterile environments bacteria will reside in fluid being transported through conduits and some of these will attach and grow as biofilms on the conduit walls. The concentration and diversity of bacteria in the fluid at the point of delivery will be a mix of those when it entered the conduit and those that have become entrained into the flow due to seeding from biofilms. Examples include fluids through conduits such as drinking water pipe networks, endotracheal tubes, catheters and ventilation systems. Here we present two probabilistic models to describe changes in the composition of bulk fluid microbial communities as they are transported through a conduit whilst exposed to biofilm communities. The first (discrete) model simulates absolute numbers of individual cells, whereas the other (continuous) model simulates the relative abundance of taxa in the bulk fluid. The discrete model is founded on a birth-death process whereby the community changes one individual at a time and the numbers of cells in the system can vary. The continuous model is a stochastic differential equation derived from the discrete model and can also accommodate changes in the carrying capacity of the bulk fluid. These models provide a novel Lagrangian framework to investigate and predict the dynamics of migrating microbial communities. In this paper we compare the two models, discuss their merits, possible applications and present simulation results in the context of drinking water distribution systems. Our results provide novel insight into the effects of stochastic dynamics on the composition of non-stationary microbial communities that are exposed to biofilms and provides a new avenue for modelling microbial dynamics in systems where fluids are being transported.
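
    A minimal sketch of the discrete birth-death idea: at each step one individual in the bulk-fluid community dies and is replaced either by an offspring drawn from the bulk community or by an immigrant detached from the biofilm. The taxa, biofilm composition, and immigration probability below are invented.

      import random

      random.seed(0)

      # Biofilm community composition (relative abundances of taxa) - invented
      biofilm = {"A": 0.6, "B": 0.3, "C": 0.1}
      m = 0.2                      # probability a replacement comes from the biofilm

      # Bulk fluid starts dominated by taxon C
      community = ["C"] * 90 + ["A"] * 10

      for _ in range(20_000):      # one individual replaced per step (birth-death style)
          i = random.randrange(len(community))          # death
          if random.random() < m:                       # immigration from the biofilm
              taxa, probs = zip(*biofilm.items())
              community[i] = random.choices(taxa, probs)[0]
          else:                                         # birth from the bulk community
              community[i] = random.choice(community)

      counts = {t: community.count(t) for t in set(community)}
      print(counts)                # composition at the delivery point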

  12. Probabilistic constitutive relationships for material strength degradation models

    NASA Technical Reports Server (NTRS)

    Boyce, L.; Chamis, C. C.

    1989-01-01

    In the present probabilistic methodology for the strength of aerospace propulsion system structural components subjected to such environmentally-induced primitive variables as loading stresses, high temperature, chemical corrosion, and radiation, time is encompassed as an interacting element, allowing the projection of creep and fatigue effects. A probabilistic constitutive equation is postulated to account for the degradation of strength due to these primitive variables which may be calibrated by an appropriately curve-fitted least-squares multiple regression of experimental data. The resulting probabilistic constitutive equation is embodied in the PROMISS code for aerospace propulsion component random strength determination.

  13. Modeling choice and valuation in decision experiments.

    PubMed

    Loomes, Graham

    2010-07-01

    This article develops a parsimonious descriptive model of individual choice and valuation in the kinds of experiments that constitute a substantial part of the literature relating to decision making under risk and uncertainty. It suggests that many of the best known "regularities" observed in those experiments may arise from a tendency for participants to perceive probabilities and payoffs in a particular way. This model organizes more of the data than any other extant model and generates a number of novel testable implications which are examined with new data.

  14. Probabilistic consequence model of accidental or intentional chemical releases.

    SciTech Connect

    Chang, Y.-S.; Samsa, M. E.; Folga, S. M.; Hartmann, H. M.

    2008-06-02

    In this work, general methodologies for evaluating the impacts of large-scale toxic chemical releases are proposed. The potential numbers of injuries and fatalities, the numbers of hospital beds, and the geographical areas rendered unusable during and some time after the occurrence and passage of a toxic plume are estimated on a probabilistic basis. To arrive at these estimates, historical accidental release data, maximum stored volumes, and meteorological data were used as inputs into the SLAB accidental chemical release model. Toxic gas footprints from the model were overlaid onto detailed population and hospital distribution data for a given region to estimate potential impacts. Output results are in the form of a generic statistical distribution of injuries and fatalities associated with specific toxic chemicals and regions of the United States. In addition, indoor hazards were estimated, so the model can provide contingency plans for either shelter-in-place or evacuation when an accident occurs. The stochastic distributions of injuries and fatalities are being used in a U.S. Department of Homeland Security-sponsored decision support system as source terms for a Monte Carlo simulation that evaluates potential measures for mitigating terrorist threats. This information can also be used to support the formulation of evacuation plans and to estimate damage and cleanup costs.

  15. A probabilistic palimpsest model of visual short-term memory.

    PubMed

    Matthey, Loic; Bays, Paul M; Dayan, Peter

    2015-01-01

    Working memory plays a key role in cognition, and yet its mechanisms remain much debated. Human performance on memory tasks is severely limited; however, the two major classes of theory explaining the limits leave open questions about key issues such as how multiple simultaneously-represented items can be distinguished. We propose a palimpsest model, with the occurrent activity of a single population of neurons coding for several multi-featured items. Using a probabilistic approach to storage and recall, we show how this model can account for many qualitative aspects of existing experimental data. In our account, the underlying nature of a memory item depends entirely on the characteristics of the population representation, and we provide analytical and numerical insights into critical issues such as multiplicity and binding. We consider representations in which information about individual feature values is partially separate from the information about binding that creates single items out of multiple features. An appropriate balance between these two types of information is required to capture fully the different types of error seen in human experimental data. Our model provides the first principled account of misbinding errors. We also suggest a specific set of stimuli designed to elucidate the representations that subjects actually employ. PMID:25611204

  16. Spatial polychaeta habitat potential mapping using probabilistic models

    NASA Astrophysics Data System (ADS)

    Choi, Jong-Kuk; Oh, Hyun-Joo; Koo, Bon Joo; Ryu, Joo-Hyung; Lee, Saro

    2011-06-01

    The purpose of this study was to apply probabilistic models to the mapping of the potential polychaeta habitat area in the Hwangdo tidal flat, Korea. Remote sensing techniques were used to construct spatial datasets of ecological environments and field observations were carried out to determine the distribution of macrobenthos. Habitat potential mapping was achieved for two polychaeta species, Prionospio japonica and Prionospio pulchra, and eight control factors relating to the tidal macrobenthos distribution were selected. These included the intertidal digital elevation model (DEM), slope, aspect, tidal exposure duration, distance from tidal channels, tidal channel density, spectral reflectance of the near infrared (NIR) bands and surface sedimentary facies from satellite imagery. The spatial relationships between the polychaeta species and each control factor were calculated using a frequency ratio and weights-of-evidence combined with geographic information system (GIS) data. The occurrence data were randomly divided into a training set (70%), used to analyze habitat potential with the frequency ratio and weights-of-evidence methods, and a test set (30%), used to verify the predicted habitat potential map. The relationships were overlaid to produce a habitat potential map with a polychaeta habitat potential (PHP) index value. These maps were verified by comparing them to surveyed habitat locations, i.e., the verification data set. In the verification, the frequency ratio model showed prediction accuracies of 77.71% and 74.87% for P. japonica and P. pulchra, respectively, while those for the weights-of-evidence model were 64.05% and 62.95%. Thus, the frequency ratio model provided a more accurate prediction than the weights-of-evidence model. Our data demonstrate that the frequency ratio and weights-of-evidence models based upon GIS analysis are effective for generating habitat potential maps of polychaeta species in a tidal flat. The results of this study can be applied towards
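
    For the frequency ratio part, the ratio for each class of a factor is the share of species occurrences falling in that class divided by the share of total area in that class; summing class ratios over factors yields the habitat potential index. The sketch below computes it for one invented factor raster and invented presence data.

      import numpy as np

      rng = np.random.default_rng(3)

      # Invented rasters: one factor (elevation class 0-3) and observed presence cells
      elev_class = rng.integers(0, 4, size=(50, 50))
      presence = rng.random((50, 50)) < (elev_class == 1) * 0.2 + 0.02  # biased to class 1

      def frequency_ratio(factor, presence):
          """FR per class = (% of presences in class) / (% of area in class)."""
          fr = {}
          for c in np.unique(factor):
              in_class = factor == c
              pct_presence = presence[in_class].sum() / presence.sum()
              pct_area = in_class.sum() / factor.size
              fr[int(c)] = pct_presence / pct_area
          return fr

      fr = frequency_ratio(elev_class, presence)
      habitat_index = np.vectorize(fr.get)(elev_class)   # FR map for this single factor
      print(fr)   # classes with FR > 1 are favourable; sum such maps over factors for the index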

  17. Predicting coastal cliff erosion using a Bayesian probabilistic model

    USGS Publications Warehouse

    Hapke, C.; Plant, N.

    2010-01-01

    Regional coastal cliff retreat is difficult to model due to the episodic nature of failures and the along-shore variability of retreat events. There is a growing demand, however, for predictive models that can be used to forecast areas vulnerable to coastal erosion hazards. Increasingly, probabilistic models are being employed that require data sets of high temporal density to define the joint probability density function that relates forcing variables (e.g. wave conditions) and initial conditions (e.g. cliff geometry) to erosion events. In this study we use a multi-parameter Bayesian network to investigate correlations between key variables that control and influence variations in cliff retreat processes. The network uses Bayesian statistical methods to estimate event probabilities using existing observations. Within this framework, we forecast the spatial distribution of cliff retreat along two stretches of cliffed coast in Southern California. The input parameters are the height and slope of the cliff, a descriptor of material strength based on the dominant cliff-forming lithology, and the long-term cliff erosion rate that represents prior behavior. The model is forced using predicted wave impact hours. Results demonstrate that the Bayesian approach is well-suited to the forward modeling of coastal cliff retreat, with the correct outcomes forecast in 70-90% of the modeled transects. The model also performs well in identifying specific locations of high cliff erosion, thus providing a foundation for hazard mapping. This approach can be employed to predict cliff erosion at time-scales ranging from storm events to the impacts of sea-level rise at the century-scale. © 2010.
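
    The core of such a discrete Bayesian-network forecast can be sketched as learning a conditional probability table P(retreat class | predictor classes) from co-located observations and reading off the distribution for new transects. The classes and observations below are invented, not the Southern California data.

      from collections import Counter, defaultdict

      # Invented discretized observations per transect:
      # (cliff height class, wave impact class) -> observed retreat class
      observations = [
          (("low", "high"), "high"), (("low", "high"), "high"),
          (("low", "low"), "low"),   (("high", "high"), "medium"),
          (("high", "low"), "low"),  (("low", "high"), "medium"),
      ]

      # Learn conditional probability tables by counting (maximum likelihood)
      cpt = defaultdict(Counter)
      for parents, retreat in observations:
          cpt[parents][retreat] += 1

      def predict(parents):
          counts = cpt[parents]
          total = sum(counts.values())
          return {k: v / total for k, v in counts.items()}

      print(predict(("low", "high")))   # forecast retreat distribution for a new transect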

  18. Probabilistic modelling of rainfall induced landslide hazard assessment

    NASA Astrophysics Data System (ADS)

    Kawagoe, S.; Kazama, S.; Sarukkalige, P. R.

    2010-01-01

    To evaluate the frequency and distribution of landslide hazards over Japan, this study uses a probabilistic model based on multiple logistic regression analysis. The study particularly concerns several important physical parameters, namely hydraulic, geographical and geological parameters, which are considered to be influential in the occurrence of landslides. Sensitivity analysis confirmed that the hydrological parameter (hydraulic gradient) is the most influential factor in the occurrence of landslides. Therefore, the hydraulic gradient is used as the main hydraulic parameter; it is a dynamic factor which includes the effect of heavy rainfall and its return period. Using the constructed spatial data-sets, a multiple logistic regression model is applied and landslide susceptibility maps are produced showing the spatial-temporal distribution of landslide hazard susceptibility over Japan. To represent the susceptibility at different temporal scales, extreme precipitation for 5-year, 30-year, and 100-year return periods is used for the evaluation. The results show that the highest landslide hazard susceptibility exists in the mountain ranges on the western side of Japan (Japan Sea side), including the Hida, Kiso, Iide and Asahi mountain ranges, the south side of the Chugoku mountain range, the south side of the Kyushu mountains, the Dewa mountain range and the Hokuriku region. The developed landslide hazard susceptibility maps in this study will assist authorities, policy makers and decision makers who are responsible for infrastructural planning and development, as they can identify landslide-susceptible areas and thus decrease landslide damage through proper preparation.
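
    The scoring step of a multiple logistic regression susceptibility map is the logistic transform of a linear combination of the factors; the sketch below evaluates it for one grid cell with invented coefficients and factor values.

      import math

      # Invented coefficients for three factors (a stand-in for a fitted model)
      beta0 = -4.0
      betas = {"hydraulic_gradient": 3.0, "relief": 0.8, "soft_rock": 1.2}

      def landslide_probability(cell):
          z = beta0 + sum(betas[k] * cell[k] for k in betas)
          return 1.0 / (1.0 + math.exp(-z))          # multiple logistic regression

      cell = {"hydraulic_gradient": 0.9, "relief": 1.5, "soft_rock": 1.0}
      print(landslide_probability(cell))   # hazard probability for this grid cell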

  19. Probabilistic modelling of rainfall induced landslide hazard assessment

    NASA Astrophysics Data System (ADS)

    Kawagoe, S.; Kazama, S.; Sarukkalige, P. R.

    2010-06-01

    To evaluate the frequency and distribution of landslide hazards over Japan, this study uses a probabilistic model based on multiple logistic regression analysis. The study particularly concerns several important physical parameters, namely hydraulic, geographical and geological parameters, which are considered to be influential in the occurrence of landslides. Sensitivity analysis confirmed that the hydrological parameter (hydraulic gradient) is the most influential factor in the occurrence of landslides. Therefore, the hydraulic gradient is used as the main hydraulic parameter; it is a dynamic factor which includes the effect of heavy rainfall and its return period. Using the constructed spatial data-sets, a multiple logistic regression model is applied and landslide hazard probability maps are produced showing the spatial-temporal distribution of landslide hazard probability over Japan. To represent the landslide hazard at different temporal scales, extreme precipitation for 5-year, 30-year, and 100-year return periods is used for the evaluation. The results show that the highest landslide hazard probability exists in the mountain ranges on the western side of Japan (Japan Sea side), including the Hida, Kiso, Iide and Asahi mountain ranges, the south side of the Chugoku mountain range, the south side of the Kyushu mountains, the Dewa mountain range and the Hokuriku region. The developed landslide hazard probability maps in this study will assist authorities, policy makers and decision makers who are responsible for infrastructural planning and development, as they can identify landslide-susceptible areas and thus decrease landslide damage through proper preparation.

  20. Probabilistic modeling of climate change impacts in permafrost regions

    NASA Astrophysics Data System (ADS)

    Anisimov, O.

    2009-04-01

    A new type of climate impact model has recently come into existence. Unlike conventional models, such models take into account the probabilistic nature of climatic projections and the small-scale spatial variability of permafrost parameters. In this study we describe the new stochastic permafrost modeling methodology and present predictive results obtained for Northern Eurasia under an ensemble climatic projection for the mid-21st century. Changes in permafrost are very illustrative of the impacts of global warming. Permafrost underlies about 22.8 million square km, or 24% of the land area in the Northern Hemisphere, and largely controls the state of the environment and socio-economic development in the northern lands. Warming, both observed and projected for the future, is more pronounced in high latitudes, and there are indications that climatic change has already affected permafrost, leading to deeper seasonal thawing and disappearance of the frozen ground in many locations. Particular concerns are associated with environmental and economic risks due to damage to constructions, and with potential enhancement of global warming through the emission of greenhouse gases from thawing permafrost. Comprehensive permafrost projections are needed to predict such processes. We developed a new type of stochastic model, which operates with the probability distribution functions of the parameters characterizing the state of permafrost. Air temperature, precipitation, snow depth, as well as vegetation and soil properties contribute to the variability of these parameters in space and over time, which is taken into account in the calculations of the statistical ensemble representing potential states of permafrost under the prescribed conditions. The model requires appropriate climatic and environmental data characterizing baseline or projected future conditions. Four gridded sets of climatic parameters constructed through spatial interpolation of meteorological observations and

  1. Probabilistic approaches to the modelling of fluvial processes

    NASA Astrophysics Data System (ADS)

    Molnar, Peter

    2013-04-01

    Fluvial systems generally exhibit sediment dynamics that are strongly stochastic. This stochasticity comes from three main sources: (a) the variability and randomness in sediment supply due to surface properties and topography; (b) the multitude of pathways that sediment may take on hillslopes and in channels, and the uncertainty in travel times and sediment storage along those pathways; and (c) the stochasticity which is inherent in mobilizing sediment, whether by heavy rain, landslides, debris flows, slope erosion, channel avulsions, etc. Fully deterministic models of fluvial systems, even if they are physically realistic and very complex, are likely to be unable to capture this stochasticity and as a result will fail to reproduce long-term sediment dynamics. In this paper I will review another approach to modelling fluvial processes, which grossly simplifies the system itself but allows for stochasticity in sediment supply, mobilization and transport. I will demonstrate the benefits and limitations of this probabilistic approach to fluvial processes with three examples. The first example is a probabilistic sediment cascade which we developed for the Illgraben, a debris flow basin in the Rhone catchment. In this example it will be shown how the probability distribution of landslides generating sediment input into the channel system is transposed into that of sediment yield out of the basin by debris flows. The key role of transient sediment storage in the channel system, which limits the size of potential debris flows, is highlighted together with the influence of the landslide triggering mechanisms and climate stochasticity. The second example focuses on the river reach scale in the Maggia River, a braided gravel-bed stream where the exposed sediment on gravel bars is colonised by riparian vegetation in periods without floods. A simple autoregressive model with a disturbance and colonization term is used to simulate the growth and decline in

  2. Model for understanding consumer textural food choice.

    PubMed

    Jeltema, Melissa; Beckley, Jacqueline; Vahalik, Jennifer

    2015-05-01

    The current paradigm for developing products that will match the marketing messaging is flawed because the drivers of product choice and satisfaction based on texture are misunderstood. Qualitative research across 10 years has led to the thesis explored in this research that individuals have a preferred way to manipulate food in their mouths (i.e., mouth behavior) and that this behavior is a major driver of food choice, satisfaction, and the desire to repurchase. Texture, which is currently thought to be a major driver of product choice, is a secondary factor, and is important only in that it supports the primary driver-mouth behavior. A model for mouth behavior is proposed and the qualitative research supporting the identification of different mouth behaviors is presented. The development of a trademarked typing tool for characterizing mouth behavior is described along with quantitative substantiation of the tool's ability to group individuals by mouth behavior. The use of these four groups to understand textural preferences and the implications for a variety of areas including product design and weight management are explored. PMID:25987995

  3. Model for understanding consumer textural food choice

    PubMed Central

    Jeltema, Melissa; Beckley, Jacqueline; Vahalik, Jennifer

    2015-01-01

    The current paradigm for developing products that will match the marketing messaging is flawed because the drivers of product choice and satisfaction based on texture are misunderstood. Qualitative research across 10 years has led to the thesis explored in this research that individuals have a preferred way to manipulate food in their mouths (i.e., mouth behavior) and that this behavior is a major driver of food choice, satisfaction, and the desire to repurchase. Texture, which is currently thought to be a major driver of product choice, is a secondary factor, and is important only in that it supports the primary driver—mouth behavior. A model for mouth behavior is proposed and the qualitative research supporting the identification of different mouth behaviors is presented. The development of a trademarked typing tool for characterizing mouth behavior is described along with quantitative substantiation of the tool's ability to group individuals by mouth behavior. The use of these four groups to understand textural preferences and the implications for a variety of areas including product design and weight management are explored. PMID:25987995

  4. Probabilistic model for fracture mechanics service life analysis

    NASA Technical Reports Server (NTRS)

    Annis, Charles; Watkins, Tommie

    1988-01-01

    The service longevity of complex propulsion systems, such as the Space Shuttle Main Engine (SSME), can be at risk from several competing failure modes. Conventional life assessment practice focuses upon the most severely life-limited feature of a given component, even though there may be other, less severe, potential failure locations. Primary, secondary, and tertiary failure modes, as well as their associated probabilities, must also be considered. Furthermore, these probabilities are functions of accumulated service time. Thus a component may not always succumb to the most severe, or even the most probable, failure mode. Propulsion system longevity must be assessed by considering simultaneously the actions of, and interactions among, life-limiting influences. These include, but are not limited to, high frequency fatigue (HFF), low cycle fatigue (LCF) and subsequent crack propagation, thermal and acoustic loadings, and the influence of less-than-ideal nondestructive evaluation (NDE). An outline is provided for a probabilistic model for service life analysis, and progress towards its implementation is reported.

  5. Data-directed RNA secondary structure prediction using probabilistic modeling.

    PubMed

    Deng, Fei; Ledda, Mirko; Vaziri, Sana; Aviran, Sharon

    2016-08-01

    Structure dictates the function of many RNAs, but secondary RNA structure analysis is either labor intensive and costly or relies on computational predictions that are often inaccurate. These limitations are alleviated by integration of structure probing data into prediction algorithms. However, existing algorithms are optimized for a specific type of probing data. Recently, new chemistries combined with advances in sequencing have facilitated structure probing at unprecedented scale and sensitivity. These novel technologies and anticipated wealth of data highlight a need for algorithms that readily accommodate more complex and diverse input sources. We implemented and investigated a recently outlined probabilistic framework for RNA secondary structure prediction and extended it to accommodate further refinement of structural information. This framework utilizes direct likelihood-based calculations of pseudo-energy terms per considered structural context and can readily accommodate diverse data types and complex data dependencies. We use real data in conjunction with simulations to evaluate performances of several implementations and to show that proper integration of structural contexts can lead to improvements. Our tests also reveal discrepancies between real data and simulations, which we show can be alleviated by refined modeling. We then propose statistical preprocessing approaches to standardize data interpretation and integration into such a generic framework. We further systematically quantify the information content of data subsets, demonstrating that high reactivities are major drivers of SHAPE-directed predictions and that better understanding of less informative reactivities is key to further improvements. Finally, we provide evidence for the adaptive capability of our framework using mock probe simulations.
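
    One common way to fold probing data into prediction, in the spirit of the likelihood-based pseudo-energies described here, is to penalize or reward pairing a base by the log-ratio of the reactivity likelihood under "paired" versus "unpaired" models. The sketch below uses exponential reactivity distributions with invented rates purely for illustration; it is not the paper's fitted model.

      import math

      # Toy reactivity distributions: unpaired bases tend to be more reactive.
      # Exponential rates below are invented placeholders.
      RATE_PAIRED, RATE_UNPAIRED = 5.0, 1.0
      RT = 0.616   # kcal/mol at roughly 37 C

      def exp_pdf(x, rate):
          return rate * math.exp(-rate * x)

      def pseudo_energy(reactivity):
          """Pseudo-energy bonus/penalty for pairing a base, from a likelihood ratio."""
          return -RT * math.log(exp_pdf(reactivity, RATE_PAIRED) /
                                exp_pdf(reactivity, RATE_UNPAIRED))

      for r in (0.05, 0.5, 2.0):
          print(r, round(pseudo_energy(r), 2))   # high reactivity -> pairing penalized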

  6. Sonar signal processing using probabilistic signal and ocean environmental models.

    PubMed

    Culver, R Lee; Camin, H John

    2008-12-01

    Acoustic signals propagating through the ocean are refracted, scattered, and attenuated by the ocean volume and boundaries. Many aspects of how the ocean affects acoustic propagation are understood, such that the characteristics of a received signal can often be predicted with some degree of certainty. However, acoustic ocean parameters vary with time and location in a manner that is not, and cannot be, precisely known; some uncertainty will always remain. For this reason, the characteristics of the received signal can never be precisely predicted and must be described in probabilistic terms. A signal processing structure recently developed relies on knowledge of the ocean environment to predict the statistical characteristics of the received signal, and incorporates this description into the processor in order to detect and classify targets. Acoustic measurements at 250 Hz from the 1996 Strait of Gibraltar Acoustic Monitoring Experiment are used to illustrate how the processor utilizes environmental data to classify source depth and to underscore the importance of environmental model fidelity and completeness.

  7. Formulation of probabilistic models of protein structure in atomic detail using the reference ratio method.

    PubMed

    Valentin, Jan B; Andreetta, Christian; Boomsma, Wouter; Bottaro, Sandro; Ferkinghoff-Borg, Jesper; Frellsen, Jes; Mardia, Kanti V; Tian, Pengfei; Hamelryck, Thomas

    2014-02-01

    We propose a method to formulate probabilistic models of protein structure in atomic detail, for a given amino acid sequence, based on Bayesian principles, while retaining a close link to physics. We start from two previously developed probabilistic models of protein structure on a local length scale, which concern the dihedral angles in main chain and side chains, respectively. Conceptually, this constitutes a probabilistic and continuous alternative to the use of discrete fragment and rotamer libraries. The local model is combined with a nonlocal model that involves a small number of energy terms according to a physical force field, and some information on the overall secondary structure content. In this initial study we focus on the formulation of the joint model and the evaluation of the use of an energy vector as a descriptor of a protein's nonlocal structure; hence, we derive the parameters of the nonlocal model from the native structure without loss of generality. The local and nonlocal models are combined using the reference ratio method, which is a well-justified probabilistic construction. For evaluation, we use the resulting joint models to predict the structure of four proteins. The results indicate that the proposed method and the probabilistic models show considerable promise for probabilistic protein structure prediction and related applications.
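
    The arithmetic of the reference ratio construction can be shown with discrete toy distributions: the local model over fine-grained states is multiplied by the ratio of the desired nonlocal distribution over a coarse descriptor to the distribution the local model alone implies for that descriptor. The states, descriptor, and probabilities below are invented.

      import numpy as np

      # Toy discrete example of the reference ratio construction.
      # x: fine-grained states; g(x): a coarse feature, here g(x) = x % 2 (invented).
      p_local = np.array([0.1, 0.4, 0.2, 0.3])        # local model over x = 0..3
      g = np.array([0, 1, 0, 1])                      # coarse descriptor of each x
      p_nonlocal = np.array([0.8, 0.2])               # desired distribution over g(x)

      # Reference distribution: what the local model alone implies about g(x)
      p_ref = np.array([p_local[g == v].sum() for v in (0, 1)])

      # Reference ratio: p(x) proportional to p_local(x) * p_nonlocal(g(x)) / p_ref(g(x))
      p_joint = p_local * p_nonlocal[g] / p_ref[g]
      p_joint /= p_joint.sum()

      print(p_joint, [p_joint[g == v].sum() for v in (0, 1)])  # marginal over g matches p_nonlocal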

  8. The Terrestrial Investigation Model: A probabilistic risk assessment model for birds exposed to pesticides

    EPA Science Inventory

    One of the major recommendations of the National Academy of Science to the USEPA, NMFS and USFWS was to utilize probabilistic methods when assessing the risks of pesticides to federally listed endangered and threatened species. The Terrestrial Investigation Model (TIM, version 3....

  9. Probabilistic model-based approach for heart beat detection.

    PubMed

    Chen, Hugh; Erol, Yusuf; Shen, Eric; Russell, Stuart

    2016-09-01

    Nowadays, hospitals are ubiquitous and integral to modern society. Patients flow in and out of a veritable whirlwind of paperwork, consultations, and potential inpatient admissions, through an abstracted system that is not without flaws. One of the biggest flaws in the medical system is perhaps an unexpected one: the patient alarm system. One longitudinal study reported an 88.8% rate of false alarms, with other studies reporting numbers of similar magnitudes. These false alarm rates lead to deleterious effects that manifest in a lower standard of care across clinics. This paper discusses a model-based probabilistic inference approach to estimate physiological variables at a detection level. We design a generative model that complies with a layman's understanding of human physiology and perform approximate Bayesian inference. One primary goal of this paper is to justify a Bayesian modeling approach to increasing robustness in a physiological domain. In order to evaluate our algorithm we look at the application of heart beat detection using four datasets provided by PhysioNet, a research resource for complex physiological signals, in the form of the PhysioNet 2014 Challenge set-p1 and set-p2, the MIT-BIH Polysomnographic Database, and the MGH/MF Waveform Database. On these data sets our algorithm performs on par with the other top six submissions to the PhysioNet 2014 challenge. The overall evaluation scores in terms of sensitivity and positive predictivity values obtained were as follows: set-p1 (99.72%), set-p2 (93.51%), MIT-BIH (99.66%), and MGH/MF (95.53%). These scores are based on the averaging of gross sensitivity, gross positive predictivity, average sensitivity, and average positive predictivity.

  10. Choice.

    PubMed

    Greenberg, Jay

    2008-09-01

    Understanding how and why analysands make the choices they do is central to both the clinical and the theoretical projects of psychoanalysis. And yet we know very little about the process of choice or about the relationship between choices and motives. A striking parallel is to be found between the ways choice is narrated in ancient Greek texts and the experience of analysts as they observe patients making choices in everyday clinical work. Pursuing this convergence of classical and contemporary sensibilities will illuminate crucial elements of the various meanings of choice, and of the way that these meanings change over the course of psychoanalytic treatment.

  12. Forecasting the duration of volcanic eruptions: an empirical probabilistic model

    NASA Astrophysics Data System (ADS)

    Gunn, L. S.; Blake, S.; Jones, M. C.; Rymer, H.

    2014-01-01

    The ability to forecast future volcanic eruption durations would greatly benefit emergency response planning prior to and during a volcanic crisis. This paper introduces a probabilistic model to forecast the duration of future and on-going eruptions. The model fits theoretical distributions to observed duration data and relies on past eruptions being a good indicator of future activity. A dataset of historical Mt. Etna flank eruptions is presented and used to demonstrate the model. The data have been compiled through critical examination of existing literature along with careful consideration of uncertainties on reported eruption start and end dates between the years 1300 AD and 2010. Data following 1600 are considered to be reliable and free of reporting biases. The distribution of eruption duration between the years 1600 and 1669 is found to be statistically different from that following it and the forecasting model is run on two datasets of Mt. Etna flank eruption durations: 1600-2010 and 1670-2010. Each dataset is modelled using a log-logistic distribution with parameter values found by maximum likelihood estimation. Survivor function statistics are applied to the model distributions to forecast (a) the probability of an eruption exceeding a given duration, (b) the probability of an eruption that has already lasted a particular number of days exceeding a given total duration and (c) the duration with a given probability of being exceeded. Results show that excluding the 1600-1670 data has little effect on the forecasting model result, especially where short durations are involved. By assigning the terms 'likely' and 'unlikely' to probabilities of 66 % or more and 33 % or less, respectively, the forecasting model based on the 1600-2010 dataset indicates that a future flank eruption on Mt. Etna would be likely to exceed 20 days (± 7 days) but unlikely to exceed 86 days (± 29 days). This approach can easily be adapted for use on other highly active, well
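
    As an illustration of the survivor-function forecasts (a)-(c) described above, the sketch below fits a log-logistic distribution by maximum likelihood with SciPy (where it is available as `fisk`); the duration values are hypothetical stand-ins, not the compiled Etna catalogue.

```python
# Sketch under stated assumptions: `durations_days` is illustrative, not the Etna dataset.
import numpy as np
from scipy.stats import fisk  # the log-logistic distribution in SciPy

durations_days = np.array([5, 8, 12, 20, 25, 34, 47, 60, 90, 150])

c, loc, scale = fisk.fit(durations_days, floc=0)  # ML estimates, location fixed at zero

# (a) probability that an eruption exceeds 86 days
p_exceed = fisk.sf(86, c, loc=loc, scale=scale)

# (b) probability that an eruption already lasting 30 days exceeds 86 days in total
p_cond = fisk.sf(86, c, loc=loc, scale=scale) / fisk.sf(30, c, loc=loc, scale=scale)

# (c) duration with a 33% probability of being exceeded
d_unlikely = fisk.isf(0.33, c, loc=loc, scale=scale)

print(p_exceed, p_cond, d_unlikely)
```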

  13. An empirical model for probabilistic decadal prediction: A global analysis

    NASA Astrophysics Data System (ADS)

    Suckling, Emma; Hawkins, Ed; Eden, Jonathan; van Oldenborgh, Geert Jan

    2016-04-01

    Empirical models, designed to predict land-based surface variables over seasons to decades ahead, provide useful benchmarks for comparison against the performance of dynamical forecast systems; they may also be employable as predictive tools for use by climate services in their own right. A new global empirical decadal prediction system is presented, based on a multiple linear regression approach designed to produce probabilistic output for comparison against dynamical models. Its performance is evaluated for surface air temperature over a set of historical hindcast experiments under a series of different prediction `modes'. The modes include a real-time setting, a scenario in which future volcanic forcings are prescribed during the hindcasts, and an approach which exploits knowledge of the forced trend. A two-tier prediction system, which uses knowledge of future sea surface temperatures in the Pacific and Atlantic Oceans, is also tested, but within a perfect knowledge framework. Each mode is designed to identify sources of predictability and uncertainty, as well as investigate different approaches to the design of decadal prediction systems for operational use. It is found that the empirical model shows skill above that of persistence hindcasts for annual means at lead times of up to ten years ahead in all of the prediction modes investigated. Small improvements in skill are found at all lead times when including future volcanic forcings in the hindcasts. It is also suggested that hindcasts which exploit full knowledge of the forced trend due to increasing greenhouse gases throughout the hindcast period can provide more robust estimates of model bias for the calibration of the empirical model in an operational setting. The two-tier system shows potential for improved real-time prediction, given the assumption that skilful predictions of large-scale modes of variability are available. The empirical model framework has been designed with enough flexibility to
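
    A minimal sketch of this kind of regression-based probabilistic prediction, assuming synthetic predictors (a forced-trend term and one SST-like index) in place of the hindcast system described above; the predictive interval is taken from the residual spread.

```python
# Illustrative only: multiple linear regression with a Gaussian predictive interval.
import numpy as np

rng = np.random.default_rng(0)
n = 60
trend = np.linspace(0.0, 1.0, n)            # stand-in for the forced-trend predictor
sst_index = rng.normal(size=n)              # stand-in for a large-scale SST mode
temp = 0.8 * trend + 0.3 * sst_index + rng.normal(scale=0.2, size=n)

X = np.column_stack([np.ones(n), trend, sst_index])
beta, *_ = np.linalg.lstsq(X, temp, rcond=None)
resid_sd = np.std(temp - X @ beta, ddof=X.shape[1])

x_future = np.array([1.0, 1.05, 0.5])       # hypothetical future predictor values
mean = x_future @ beta
print(f"forecast anomaly: {mean:.2f} +/- {1.64 * resid_sd:.2f} (90% interval)")
```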

  14. Learning about causes from people and about people as causes: probabilistic models and social causal reasoning.

    PubMed

    Buchsbaum, Daphna; Seiver, Elizabeth; Bridgers, Sophie; Gopnik, Alison

    2012-01-01

    A major challenge children face is uncovering the causal structure of the world around them. Previous research on children's causal inference has demonstrated their ability to learn about causal relationships in the physical environment using probabilistic evidence. However, children must also learn about causal relationships in the social environment, including discovering the causes of other people's behavior, and understanding the causal relationships between others' goal-directed actions and the outcomes of those actions. In this chapter, we argue that social reasoning and causal reasoning are deeply linked, both in the real world and in children's minds. Children use both types of information together and in fact reason about both physical and social causation in fundamentally similar ways. We suggest that children jointly construct and update causal theories about their social and physical environment and that this process is best captured by probabilistic models of cognition. We first present studies showing that adults are able to jointly infer causal structure and human action structure from videos of unsegmented human motion. Next, we describe how children use social information to make inferences about physical causes. We show that the pedagogical nature of a demonstrator influences children's choices of which actions to imitate from within a causal sequence and that this social information interacts with statistical causal evidence. We then discuss how children combine evidence from an informant's testimony and expressed confidence with evidence from their own causal observations to infer the efficacy of different potential causes. We also discuss how children use these same causal observations to make inferences about the knowledge state of the social informant. Finally, we suggest that psychological causation and attribution are part of the same causal system as physical causation. We present evidence that just as children use covariation between

  15. A Survey of Probabilistic Models for Relational Data

    SciTech Connect

    Koutsourelakis, P S

    2006-10-13

    Traditional data mining methodologies have focused on ''flat'' data i.e. a collection of identically structured entities, assumed to be independent and identically distributed. However, many real-world datasets are innately relational in that they consist of multi-modal entities and multi-relational links (where each entity- or link-type is characterized by a different set of attributes). Link structure is an important characteristic of a dataset and should not be ignored in modeling efforts, especially when statistical dependencies exist between related entities. These dependencies can in fact significantly improve the accuracy of inference and prediction results, if the relational structure is appropriately leveraged (Figure 1). The need for models that can incorporate relational structure has been accentuated by new technological developments which allow us to easily track, store, and make accessible large amounts of data. Recently, there has been a surge of interest in statistical models for dealing with richly interconnected, heterogeneous data, fueled largely by information mining of web/hypertext data, social networks, bibliographic citation data, epidemiological data and communication networks. Graphical models have a natural formalism for representing complex relational data and for predicting the underlying evolving system in a dynamic framework. The present survey provides an overview of probabilistic methods and techniques that have been developed over the last few years for dealing with relational data. Particular emphasis is paid to approaches pertinent to the research areas of pattern recognition, group discovery, entity/node classification, and anomaly detection. We start with supervised learning tasks, where two basic modeling approaches are discussed--i.e. discriminative and generative. Several discriminative techniques are reviewed and performance results are presented. Generative methods are discussed in a separate survey. A special section is

  16. Probabilistic Digital Elevation Model Generation For Spatial Accuracy Assessment

    NASA Astrophysics Data System (ADS)

    Jalobeanu, A.

    2008-12-01

    We propose a new method for the measurement of high resolution topography from a stereo pair. The main application area is the study of planetary surfaces. Digital elevation models (DEM) computed from image pairs using state of the art algorithms usually lack quantitative error estimates. This can be a major issue when the result is used to measure actual physical parameters, such as slope or terrain roughness. Thus, we propose a new method to infer a dense bidimensional disparity map from two images, that also estimates the spatial distribution of errors. We adopt a probabilistic approach, which provides a rigorous framework for parameter estimation and uncertainty evaluation. All the parameters are described in terms of random variables within a Bayesian framework. We start by defining a forward model, which mainly consists of warping the observed scene using B-Splines and using a spatially adaptive radiometric change map for robustness purposes. An a priori smoothness model is introduced in order to stabilize the solution. Solving the inverse problem to recover the disparity map requires to optimize a global non-convex energy function, which is difficult in practice due to multiple local optima. A deterministic optimization technique based on a multi-grid strategy, followed by a local energy analysis at the optimum, allows to recover the a posteriori probability density function (pdf) of the disparity, which encodes both the optimal solution and the related error map. Finally, the disparity field is converted into a DEM through a geometric camera model. This camera model is either known initially, or calibrated automatically using the estimated disparity map and available measurements of the topography (existing low-resolution DEM or ground control points). Automatic calibration from uncertain disparity and topography measurements allows for efficient error propagation from the initial data to the generated elevation model. Results from Mars Express HRSC data

  17. Probabilistic modelling of sea surges in coastal urban areas

    NASA Astrophysics Data System (ADS)

    Georgiadis, Stylianos; Jomo Danielsen Sørup, Hjalte; Arnbjerg-Nielsen, Karsten; Nielsen, Bo Friis

    2016-04-01

    Urban floods are a major issue for coastal cities with severe impacts on economy, society and environment. A main cause for floods are sea surges stemming from extreme weather conditions. In the context of urban flooding, certain standards have to be met by critical infrastructures in order to protect them from floods. These standards can be so strict that no empirical data is available. For instance, protection plans for sub-surface railways against floods are established with 10,000 years return levels. Furthermore, the long technical lifetime of such infrastructures is a critical issue that should be considered, along with the associated climate change effects in this lifetime. We present a case study of Copenhagen where the metro system is being expanded at present with several stations close to the sea. The current critical sea levels for the metro have never been exceeded and Copenhagen has only been severely flooded from pluvial events in the time where measurements have been conducted. However, due to the very high return period that the metro has to be able to withstand and due to the expectations to sea-level rise due to climate change, reliable estimates of the occurrence rate and magnitude of sea surges have to be established as the current protection is expected to be insufficient at some point within the technical lifetime of the metro. The objective of this study is to probabilistically model sea level in Copenhagen as opposed to extrapolating the extreme statistics as is the practice often used. A better understanding and more realistic description of the phenomena leading to sea surges can then be given. The application of hidden Markov models to high-resolution data of sea level for different meteorological stations in and around Copenhagen is an effective tool to address uncertainty. For sea surge studies, the hidden states of the model may reflect the hydrological processes that contribute to coastal floods. Also, the states of the hidden Markov
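
    A minimal sketch of the hidden Markov modelling step, assuming the third-party `hmmlearn` package and a synthetic sea-level series in place of the Copenhagen station records; the two fitted states stand in for calm and surge-prone regimes.

```python
# Illustrative two-state Gaussian HMM on a synthetic sea-level series (metres).
import numpy as np
from hmmlearn.hmm import GaussianHMM  # third-party package, assumed installed

rng = np.random.default_rng(1)
sea_level = np.concatenate([rng.normal(0.0, 0.1, 500),   # calm regime (synthetic)
                            rng.normal(0.6, 0.3, 100),   # surge-prone regime (synthetic)
                            rng.normal(0.0, 0.1, 400)])
X = sea_level.reshape(-1, 1)

model = GaussianHMM(n_components=2, covariance_type="full", n_iter=200, random_state=0)
model.fit(X)
states = model.predict(X)                 # most likely hidden regime at each time step
print(model.means_.ravel(), np.bincount(states))
```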

  18. Building a Probabilistic Denitrification Model for an Oregon Salt Marsh

    NASA Astrophysics Data System (ADS)

    Moon, J. B.; Stecher, H. A.; DeWitt, T.; Nahlik, A.; Regutti, R.; Michael, L.; Fennessy, M. S.; Brown, L.; Mckane, R.; Naithani, K. J.

    2015-12-01

    Despite abundant work starting in the 1950s on the drivers of denitrification (DeN), mechanistic complexity and methodological challenges of direct DeN measurements have resulted in a lack of reliable rate estimates across landscapes, and a lack of operationally valid, robust models. Measuring and modeling DeN are particularly challenging in tidal systems, which play a vital role in buffering adjacent coastal waters from nitrogen inputs. These systems are hydrologically and biogeochemically complex, varying on fine temporal and spatial scales. We assessed the spatial and temporal variability of soil nitrate (NO3-) levels and O2 availability, two primary drivers of DeN, in surface soils of Winant salt marsh located in Yaquina estuary, OR during the summers of 2013 and 2014. We found low temporal variability in soil NO3- concentrations across years, tide series, and tide cycles, but high spatial variability linked to elevation gradients (i.e., habitat types); spatial variability within the high marsh habitat (0 - 68 μg N g-1 dry soil) was correlated with distance to major tide creek channels and connectivity to upslope N-fixing red alder. Soil O2 measurements collected at 5 cm below ground across three locations on two spring tide series showed that O2 drawdown rates were also spatially variable. Depending on the marsh location, O2 draw down ranged from sub-optimal for DeN (> 80 % O2 saturation) across an entire tide series (i.e., across days) to optimum (i.e., ~ 0 % O2 saturation) within one overtopping tide event (i.e., within hours). We are using these results, along with empirical relationships created between DeN and soil NO3- concentrations for Winant to improve on a pre-existing tidal DeN model. We will develop the first version of a fully probabilistic hierarchical Bayesian tidal DeN model to quantify parameter and prediction uncertainties, which are as important as determining mean predictions in order to distinguish measurable differences across the marsh.

  19. Integration of fuzzy analytic hierarchy process and probabilistic dynamic programming in formulating an optimal fleet management model

    NASA Astrophysics Data System (ADS)

    Teoh, Lay Eng; Khoo, Hooi Ling

    2013-09-01

    This study deals with two major aspects of airline operations, i.e. supply and demand management. The supply aspect focuses on the mathematical formulation of an optimal fleet management model to maximize the operational profit of the airline, while the demand aspect focuses on the incorporation of mode choice modeling as part of the developed model. The proposed methodology is outlined in two stages: the Fuzzy Analytic Hierarchy Process is first adopted to capture mode choice modeling in order to quantify the probability of probable phenomena (for the aircraft acquisition/leasing decision). Then, an optimization model is developed as a probabilistic dynamic programming model to determine the optimal number and types of aircraft to be acquired and/or leased in order to meet stochastic demand during the planning horizon. The findings of an illustrative case study show that the proposed methodology is viable. The results demonstrate that the incorporation of mode choice modeling could affect the operational profit and fleet management decisions of the airlines to varying degrees.

  20. A fully probabilistic approach to extreme rainfall modeling

    NASA Astrophysics Data System (ADS)

    Coles, Stuart; Pericchi, Luis Raúl; Sisson, Scott

    2003-03-01

    It is an embarrassingly frequent experience that statistical practice fails to foresee historical disasters. It is all too easy to blame global trends or some sort of external intervention, but in this article we argue that statistical methods that do not take comprehensive account of the uncertainties involved in both model and predictions are bound to produce an over-optimistic appraisal of future extremes that is often contradicted by observed hydrological events. Based on the annual and daily rainfall data on the central coast of Venezuela, different modeling strategies and inference approaches show that the 1999 rainfall which caused the worst environmentally related tragedy in Venezuelan history was extreme, but not implausible given the historical evidence. We follow in turn a classical likelihood and Bayesian approach, arguing that the latter is the most natural approach for taking into account all uncertainties. In each case we emphasize the importance of making inference on predicted levels of the process rather than model parameters. Our most detailed model comprises seasons with unknown starting points and durations for the extremes of daily rainfall whose behavior is described using a standard threshold model. Based on a Bayesian analysis of this model, so that both prediction uncertainty and process heterogeneity are properly modeled, we find that the 1999 event has a sizeable probability which implies that such an occurrence within a reasonably short time horizon could have been anticipated. Finally, since accumulation of extreme rainfall over several days is an additional difficulty—and indeed, the catastrophe of 1999 was exaggerated by heavy rainfall on successive days—we examine the effect of timescale on our broad conclusions, finding results to be broadly similar across different choices.
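
    For orientation, the sketch below shows a standard (non-seasonal, non-Bayesian) peaks-over-threshold calculation of the kind that underlies the threshold model mentioned above; the rainfall series, threshold choice and level of interest are illustrative assumptions.

```python
# Illustrative peaks-over-threshold fit with a generalised Pareto distribution.
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(2)
daily_rain = rng.gamma(shape=0.4, scale=8.0, size=365 * 40)   # synthetic daily record, mm

u = np.quantile(daily_rain, 0.98)                             # threshold choice
excesses = daily_rain[daily_rain > u] - u
xi, _, sigma = genpareto.fit(excesses, floc=0)                # ML shape and scale

rate = excesses.size / (daily_rain.size / 365.0)              # exceedances per year
level = 60.0                                                  # mm/day of interest
annual_rate = rate * genpareto.sf(level - u, xi, loc=0, scale=sigma)
print(f"expected exceedances of {level} mm/day per year: {annual_rate:.4f}")
```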

  1. Probabilistic modeling of flood characterizations with parametric and minimum information pair-copula model

    NASA Astrophysics Data System (ADS)

    Daneshkhah, Alireza; Remesan, Renji; Chatrabgoun, Omid; Holman, Ian P.

    2016-09-01

    This paper highlights the usefulness of the minimum information and parametric pair-copula construction (PCC) to model the joint distribution of flood event properties. Both of these models outperform other standard multivariate copulas in modeling multivariate flood data that exhibit complex patterns of dependence, particularly in the tails. In particular, the minimum information pair-copula model shows greater flexibility, produces a better approximation of the joint probability density, and yields measures suitable for effective hazard assessment. The study demonstrates that any multivariate density can be approximated to any desired degree of precision using the minimum information pair-copula model, which can be practically used for probabilistic flood hazard assessment.
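
    As a simplified illustration of copula-based joint modelling of flood properties (using a single Gaussian copula rather than the minimum information pair-copula construction of the paper), the sketch below estimates a joint exceedance probability; the peak and volume data are synthetic.

```python
# Illustrative Gaussian-copula joint exceedance for two flood properties.
import numpy as np
from scipy.stats import kendalltau, norm, multivariate_normal

rng = np.random.default_rng(3)
peak = rng.gamma(3.0, 50.0, 500)                      # synthetic flood peaks
volume = peak * rng.lognormal(0.0, 0.3, 500)          # correlated synthetic volumes

tau, _ = kendalltau(peak, volume)
rho = np.sin(np.pi * tau / 2)                         # Gaussian-copula parameter from tau
cop = multivariate_normal(mean=[0, 0], cov=[[1, rho], [rho, 1]])

def joint_exceedance(pu, pv):
    """P(U > pu, V > pv) under the fitted Gaussian copula."""
    return 1 - pu - pv + cop.cdf([norm.ppf(pu), norm.ppf(pv)])

print(joint_exceedance(0.9, 0.9))   # probability both margins exceed their 90th percentile
```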

  2. Multidimensional analysis and probabilistic model of volcanic and seismic activities

    NASA Astrophysics Data System (ADS)

    Fedorov, V.

    2009-04-01

    .I. Gushchenko, 1979) and seismological (database of USGS/NEIC Significant Worldwide Earthquakes, 2150 B.C.- 1994 A.D.) information which displays dynamics of endogenic relief-forming processes over a period of 1900 to 1994. In the course of the analysis, a substitution of calendar variable by a corresponding astronomical one has been performed and the epoch superposition method was applied. In essence, the method consists in that the massifs of information on volcanic eruptions (over a period of 1900 to 1977) and seismic events (1900-1994) are differentiated with respect to value of astronomical parameters which correspond to the calendar dates of the known eruptions and earthquakes, regardless of the calendar year. The obtained spectra of volcanic eruptions and violent earthquake distribution in the fields of the Earth orbital movement parameters were used as a basis for calculation of frequency spectra and diurnal probability of volcanic and seismic activity. The objective of the proposed investigations is a probabilistic model development of the volcanic and seismic events, as well as GIS designing for monitoring and forecast of volcanic and seismic activities. In accordance with the stated objective, three probability parameters have been found in the course of preliminary studies; they form the basis for GIS-monitoring and forecast development. 1. A multidimensional analysis of volcanic eruption and earthquakes (of magnitude 7) have been performed in terms of the Earth orbital movement. Probability characteristics of volcanism and seismicity have been defined for the Earth as a whole. Time intervals have been identified with a diurnal probability twice as great as the mean value. Diurnal probability of volcanic and seismic events has been calculated up to 2020. 2. A regularity is found in duration of dormant (repose) periods has been established. A relationship has been found between the distribution of the repose period probability density and duration of the period. 3

  3. Integration of Advanced Probabilistic Analysis Techniques with Multi-Physics Models

    SciTech Connect

    Cetiner, Mustafa Sacit; none,; Flanagan, George F.; Poore III, Willis P.; Muhlheim, Michael David

    2014-07-30

    An integrated simulation platform that couples probabilistic analysis-based tools with model-based simulation tools can provide valuable insights for reactive and proactive responses to plant operating conditions. The objective of this work is to demonstrate the benefits of a partial implementation of the Small Modular Reactor (SMR) Probabilistic Risk Assessment (PRA) Detailed Framework Specification through the coupling of advanced PRA capabilities and accurate multi-physics plant models. Coupling a probabilistic model with a multi-physics model will aid in design, operations, and safety by providing a more accurate understanding of plant behavior. This represents the first attempt at actually integrating these two types of analyses for a control system used for operations, on a faster than real-time basis. This report documents the development of the basic communication capability to exchange data with the probabilistic model using Reliability Workbench (RWB) and the multi-physics model using Dymola. The communication pathways from injecting a fault (i.e., failing a component) to the probabilistic and multi-physics models were successfully completed. This first version was tested with prototypic models represented in both RWB and Modelica. First, a simple event tree/fault tree (ET/FT) model was created to develop the software code to implement the communication capabilities between the dynamic-link library (dll) and RWB. A program, written in C#, successfully communicates faults to the probabilistic model through the dll. A systems model of the Advanced Liquid-Metal Reactor–Power Reactor Inherently Safe Module (ALMR-PRISM) design developed under another DOE project was upgraded using Dymola to include proper interfaces to allow data exchange with the control application (ConApp). A program, written in C+, successfully communicates faults to the multi-physics model. The results of the example simulation were successfully plotted.

  4. Incentive theory: II. Models for choice.

    PubMed

    Killeen, P R

    1982-09-01

    Incentive theory is extended to account for concurrent chained schedules of reinforcement. The basic model consists of additive contributions from the primary and secondary effects of reinforcers, which serve to direct the behavior activated by reinforcement. The activation is proportional to the rate of reinforcement and interacts multiplicatively with the directive effects. The two free parameters are q, the slope of the delay of reinforcement gradient, whose value is constant across many experiments, and b, a bias parameter. The model is shown to provide an excellent description of all results from studies that have varied the terminal-link schedules, and of many of the results from studies that have varied initial-link schedules. The model is extended to diverse modifications of the terminal links, such as varied amount of reinforcement, varied signaling of the terminal-link schedules, and segmentation of the terminal-link schedules. It is demonstrated that incentive theory provides an accurate and integrated account of many of the phenomena of choice.

  5. The Stay/Switch Model of Concurrent Choice

    ERIC Educational Resources Information Center

    MacDonall, James S.

    2009-01-01

    This experiment compared descriptions of concurrent choice by the stay/switch model, which says choice is a function of the reinforcers obtained for staying at and for switching from each alternative, and the generalized matching law, which says choice is a function of the total reinforcers obtained at each alternative. For the stay/switch model…

  6. Probabilistic finite element analysis of a craniofacial finite element model.

    PubMed

    Berthaume, Michael A; Dechow, Paul C; Iriarte-Diaz, Jose; Ross, Callum F; Strait, David S; Wang, Qian; Grosse, Ian R

    2012-05-01

    We employed a probabilistic finite element analysis (FEA) method to determine how variability in material property values affects stress and strain values in a finite model of a Macaca fascicularis cranium. The material behavior of cortical bone varied in three ways: isotropic homogeneous, isotropic non-homogeneous, and orthotropic non-homogeneous. The material behavior of the trabecular bone and teeth was always treated as isotropic and homogeneous. All material property values for the cranium were randomized with a Gaussian distribution with either coefficients of variation (CVs) of 0.2 or with CVs calculated from empirical data. Latin hypercube sampling was used to determine the values of the material properties used in the finite element models. In total, four hundred and twenty six separate deterministic FE simulations were executed. We tested four hypotheses in this study: (1) uncertainty in material property values will have an insignificant effect on high stresses and a significant effect on high strains for homogeneous isotropic models; (2) the effect of variability in material property values on the stress state will increase as non-homogeneity and anisotropy increase; (3) variation in the in vivo shear strain values reported by Strait et al. (2005) and Ross et al. (2011) is not only due to variations in muscle forces and cranial morphology, but also due to variation in material property values; (4) the assumption of a uniform coefficient of variation for the material property values will result in the same trend in how moderate-to-high stresses and moderate-to-high strains vary with respect to the degree of non-homogeneity and anisotropy as the trend found when the coefficients of variation for material property values are calculated from empirical data. Our results supported the first three hypotheses and falsified the fourth. When material properties were varied with a constant CV, as non-homogeneity and anisotropy increased the level of variability in
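
    A minimal sketch of the sampling stage only, assuming hypothetical elastic moduli and a uniform coefficient of variation of 0.2: Latin hypercube draws are mapped to Gaussian material properties that would then be passed to separate deterministic FE runs.

```python
# Illustrative Latin hypercube design for Gaussian material properties (CV = 0.2).
import numpy as np
from scipy.stats import norm, qmc

means = {"E_cortical_Pa": 17e9, "E_trabecular_Pa": 0.6e9, "E_teeth_Pa": 80e9}  # hypothetical
cv = 0.2
n_runs = 100

sampler = qmc.LatinHypercube(d=len(means), seed=0)
u = sampler.random(n_runs)                                   # uniform (0, 1) design

samples = {name: norm.ppf(u[:, j], loc=m, scale=cv * m)      # Gaussian with CV = 0.2
           for j, (name, m) in enumerate(means.items())}
for name, vals in samples.items():
    print(name, vals.mean(), vals.std())                     # inputs for deterministic FE runs
```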

  7. Reasoning in Reference Games: Individual- vs. Population-Level Probabilistic Modeling.

    PubMed

    Franke, Michael; Degen, Judith

    2016-01-01

    Recent advances in probabilistic pragmatics have achieved considerable success in modeling speakers' and listeners' pragmatic reasoning as probabilistic inference. However, these models are usually applied to population-level data, and so implicitly suggest a homogeneous population without individual differences. Here we investigate potential individual differences in Theory-of-Mind related depth of pragmatic reasoning in so-called reference games that require drawing ad hoc Quantity implicatures of varying complexity. We show by Bayesian model comparison that a model that assumes a heterogenous population is a better predictor of our data, especially for comprehension. We discuss the implications for the treatment of individual differences in probabilistic models of language use. PMID:27149675

  9. Characterizing the International Migration Barriers with a Probabilistic Multilateral Migration Model

    NASA Astrophysics Data System (ADS)

    Li, Xiaomeng; Xu, Hongzhong; Chen, Jiawei; Chen, Qinghua; Zhang, Jiang; di, Zengru

    2016-09-01

    Human migration is responsible for forming modern civilization and has had an important influence on the development of various countries. There are many issues worth researching, and “the reason to move” is the most basic one. The concept of migration cost in the classical self-selection theory, which was introduced by Roy and Borjas, is useful. However, migration cost cannot address global migration because of the limitations of deterministic and bilateral choice. Following the idea of migration cost, this paper developed a new probabilistic multilateral migration model by introducing the Boltzmann factor from statistical physics. After characterizing the underlying mechanism or driving force of human mobility, we reveal some interesting facts that have provided a deeper understanding of international migration, such as the negative correlation between migration costs for emigrants and immigrants and a global classification with clear regional and economic characteristics, based on clustering of migration cost vectors. In addition, we deconstruct the migration barriers using regression analysis and find that the influencing factors are complicated but can be partly (12.5%) described by several macro indexes, such as the GDP growth of the destination country, the GNI per capita and the HDI of both the source and destination countries.
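
    A minimal sketch of the Boltzmann-factor idea with a hypothetical three-country cost matrix: each row of the resulting matrix gives the multilateral destination probabilities for one origin, and the temperature parameter controls how sharply low-cost destinations are preferred.

```python
# Illustrative probabilistic multilateral choice from a Boltzmann factor over costs.
import numpy as np

cost = np.array([[0.0, 1.2, 2.5],      # hypothetical migration costs, row = origin country
                 [1.0, 0.0, 1.8],
                 [2.2, 1.5, 0.0]])
T = 1.0                                 # "temperature": larger T means more random choice

weights = np.exp(-cost / T)
np.fill_diagonal(weights, 0.0)          # exclude staying at home, for illustration
prob = weights / weights.sum(axis=1, keepdims=True)
print(prob)                             # row i: destination probabilities for origin i
```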

  13. Probabilistic Fatigue Damage Prognosis Using a Surrogate Model Trained Via 3D Finite Element Analysis

    NASA Technical Reports Server (NTRS)

    Leser, Patrick E.; Hochhalter, Jacob D.; Newman, John A.; Leser, William P.; Warner, James E.; Wawrzynek, Paul A.; Yuan, Fuh-Gwo

    2015-01-01

    Utilizing inverse uncertainty quantification techniques, structural health monitoring can be integrated with damage progression models to form probabilistic predictions of a structure's remaining useful life. However, damage evolution in realistic structures is physically complex. Accurately representing this behavior requires high-fidelity models which are typically computationally prohibitive. In the present work, a high-fidelity finite element model is represented by a surrogate model, reducing computation times. The new approach is used with damage diagnosis data to form a probabilistic prediction of remaining useful life for a test specimen under mixed-mode conditions.
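
    A minimal sketch of the surrogate step, using a Gaussian-process regressor from scikit-learn as the emulator (the abstract does not specify the surrogate type) and a synthetic stand-in for the expensive finite element evaluations.

```python
# Illustrative cheap emulator trained on a handful of stand-in "FE" evaluations.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(8)
X = rng.uniform([5.0, 0.1], [25.0, 0.7], size=(30, 2))   # (crack length mm, load ratio)
y = 1e-4 * X[:, 0] ** 1.5 * (1 - X[:, 1])                # stand-in for FE crack growth rates

gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF([5.0, 0.2]),
                              normalize_y=True).fit(X, y)
mean, std = gp.predict(np.array([[12.0, 0.3]]), return_std=True)
print(mean[0], std[0])                                   # fast probabilistic prediction
```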

  14. Parameter estimation of social forces in pedestrian dynamics models via a probabilistic method.

    PubMed

    Corbetta, Alessandro; Muntean, Adrian; Vafayi, Kiamars

    2015-04-01

    Focusing on a specific crowd dynamics situation, including real life experiments and measurements, our paper targets a twofold aim: (1) we present a Bayesian probabilistic method to estimate the value and the uncertainty (in the form of a probability density function) of parameters in crowd dynamic models from the experimental data; and (2) we introduce a fitness measure for the models to classify a couple of model structures (forces) according to their fitness to the experimental data, preparing the stage for a more general model-selection and validation strategy inspired by probabilistic data analysis. Finally, we review the essential aspects of our experimental setup and measurement technique.

  15. Nested Logit Models for Multiple-Choice Item Response Data

    ERIC Educational Resources Information Center

    Suh, Youngsuk; Bolt, Daniel M.

    2010-01-01

    Nested logit item response models for multiple-choice data are presented. Relative to previous models, the new models are suggested to provide a better approximation to multiple-choice items where the application of a solution strategy precedes consideration of response options. In practice, the models also accommodate collapsibility across all…

  16. Utilization of Probabilistic Cues in the Presence of Irrelevant Information: A Comparison of Risky Choice in Children and Adults

    ERIC Educational Resources Information Center

    Betsch, Tilmann; Lang, Anna

    2013-01-01

    We studied risky choices in preschoolers, elementary schoolers, and adults using an information board paradigm crossing two options with two cues that differ in their probability of making valid predictions (p = 0.50 vs. p = 0.83). We also varied the presence of normatively irrelevant information. Choice patterns indicate that preschoolers were…

  17. Integrating probabilistic models of perception and interactive neural networks: a historical and tutorial review

    PubMed Central

    McClelland, James L.

    2013-01-01

    This article seeks to establish a rapprochement between explicitly Bayesian models of contextual effects in perception and neural network models of such effects, particularly the connectionist interactive activation (IA) model of perception. The article is in part an historical review and in part a tutorial, reviewing the probabilistic Bayesian approach to understanding perception and how it may be shaped by context, and also reviewing ideas about how such probabilistic computations may be carried out in neural networks, focusing on the role of context in interactive neural networks, in which both bottom-up and top-down signals affect the interpretation of sensory inputs. It is pointed out that connectionist units that use the logistic or softmax activation functions can exactly compute Bayesian posterior probabilities when the bias terms and connection weights affecting such units are set to the logarithms of appropriate probabilistic quantities. Bayesian concepts such as the prior, likelihood, (joint and marginal) posterior, probability matching and maximizing, and calculating vs. sampling from the posterior are all reviewed and linked to neural network computations. Probabilistic and neural network models are explicitly linked to the concept of a probabilistic generative model that describes the relationship between the underlying target of perception (e.g., the word intended by a speaker or other source of sensory stimuli) and the sensory input that reaches the perceiver for use in inferring the underlying target. It is shown how a new version of the IA model called the multinomial interactive activation (MIA) model can sample correctly from the joint posterior of a proposed generative model for perception of letters in words, indicating that interactive processing is fully consistent with principled probabilistic computation. Ways in which these computations might be realized in real neural systems are also considered. PMID:23970868
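
    The stated equivalence is easy to verify numerically: with connection weights set to log-likelihoods and bias terms set to log-priors, a softmax over the resulting net inputs reproduces the Bayesian posterior exactly (the numbers below are arbitrary).

```python
# Numerical check: softmax over (log-likelihood + log-prior) equals the Bayesian posterior.
import numpy as np

priors = np.array([0.7, 0.2, 0.1])                # P(h), arbitrary example
likelihoods = np.array([0.05, 0.4, 0.3])          # P(data | h), arbitrary example

net_input = np.log(likelihoods) + np.log(priors)  # "weights" plus "bias", in log space
softmax = np.exp(net_input) / np.exp(net_input).sum()

posterior = priors * likelihoods / (priors * likelihoods).sum()
print(softmax, posterior, np.allclose(softmax, posterior))
```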

  18. Probabilistic seismic hazard study based on active fault and finite element geodynamic models

    NASA Astrophysics Data System (ADS)

    Kastelic, Vanja; Carafa, Michele M. C.; Visini, Francesco

    2016-04-01

    We present a probabilistic seismic hazard analysis (PSHA) that is exclusively based on active faults and geodynamic finite element input models, whereas seismic catalogues were used only in a posterior comparison. We applied the developed model in the External Dinarides, a slow deforming thrust-and-fold belt at the contact between Adria and Eurasia. Our method consists of establishing two earthquake rupture forecast models: (i) a geological active fault input (GEO) model and (ii) a finite element (FEM) model. The GEO model is based on an active fault database that provides information on fault location and its geometric and kinematic parameters together with estimations of its slip rate. By default in this model all deformation is set to be released along the active faults. The FEM model is based on a numerical geodynamic model developed for the region of study. In this model the deformation is, besides along the active faults, released also in the volumetric continuum elements. From both models we calculated the corresponding activity rates, earthquake rates and final expected peak ground accelerations. We investigated both the source model and the earthquake model uncertainties by varying the main active fault and earthquake rate calculation parameters through constructing corresponding branches of the seismic hazard logic tree. Hazard maps and UHS curves have been produced for horizontal ground motion on bedrock conditions (VS30 ≥ 800 m/s), thereby not considering local site amplification effects. The hazard was computed over a 0.2° spaced grid considering 648 branches of the logic tree and the mean value of the 10% probability of exceedance in 50 years hazard level, while the 5th and 95th percentiles were also computed to investigate the model limits. We conducted a sensitivity analysis to determine which of the input parameters influence the final hazard results and to what extent. The results of such comparison evidence the deformation model and

  19. Hierarchical Diffusion Models for Two-Choice Response Times

    ERIC Educational Resources Information Center

    Vandekerckhove, Joachim; Tuerlinckx, Francis; Lee, Michael D.

    2011-01-01

    Two-choice response times are a common type of data, and much research has been devoted to the development of process models for such data. However, the practical application of these models is notoriously complicated, and flexible methods are largely nonexistent. We combine a popular model for choice response times--the Wiener diffusion…

  20. Probabilistic Fatigue Life Prediction of Turbine Disc Considering Model Parameter Uncertainty

    NASA Astrophysics Data System (ADS)

    He, Liping; Yu, Le; Zhu, Shun-Peng; Ding, Liangliang; Huang, Hong-Zhong

    2016-06-01

    Aiming to improve the predictive ability of the Walker model for fatigue life prediction and taking the turbine disc alloy GH4133 as the application example, this paper investigates a new approach for probabilistic fatigue life prediction that considers the parameter uncertainty inherent in the life prediction model. Firstly, experimental data are used to update the model parameters using Bayes' theorem, so as to obtain the posterior probability distribution functions of the two parameters of the Walker model as well as the probabilistic life prediction model for the turbine disc. During the updating process, the Markov Chain Monte Carlo (MCMC) technique is used to generate samples from the posterior distribution and to estimate the parameters. After that, the turbine disc life is predicted using the probabilistic Walker model based on the Monte Carlo simulation technique. The experimental results indicate that: (1) using the small-sample test data obtained from the turbine disc, the parameter uncertainty of the Walker model can be quantified and the corresponding probabilistic model for fatigue life prediction can be established using Bayes' theorem; (2) there is obvious dispersion in the life data for the turbine disc when predicting fatigue life in practical engineering applications.
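
    A minimal sketch of the Bayes-plus-MCMC workflow, with a simple log-linear life model and synthetic test data standing in for the Walker model and the GH4133 measurements: a random-walk Metropolis sampler draws the two parameters, and the posterior draws are then pushed through the model for a probabilistic life prediction.

```python
# Illustrative two-parameter Bayesian update via random-walk Metropolis, then a
# Monte Carlo life prediction from the posterior draws. Not the actual Walker model.
import numpy as np

rng = np.random.default_rng(4)
stress = np.array([400.0, 450.0, 500.0, 550.0])        # MPa, synthetic test conditions
logN_obs = np.array([4.19, 4.04, 3.90, 3.78])          # log10 cycles, synthetic data
sigma = 0.1                                            # assumed observation noise

def log_post(a, b):
    pred = a - b * np.log10(stress)                    # stand-in life model
    loglik = -0.5 * np.sum(((logN_obs - pred) / sigma) ** 2)
    logprior = -0.5 * ((a - 12) / 5) ** 2 - 0.5 * ((b - 3) / 2) ** 2   # weak priors
    return loglik + logprior

theta, chain = np.array([12.0, 3.0]), []
for _ in range(20000):
    prop = theta + rng.normal(scale=[0.5, 0.2])        # random-walk proposal
    if np.log(rng.uniform()) < log_post(*prop) - log_post(*theta):
        theta = prop
    chain.append(theta.copy())
chain = np.array(chain[5000:])                         # discard burn-in

life_draws = 10 ** (chain[:, 0] - chain[:, 1] * np.log10(480.0))   # lives at 480 MPa
print(np.percentile(life_draws, [5, 50, 95]))          # probabilistic life prediction
```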

  1. A PROBABILISTIC POPULATION EXPOSURE MODEL FOR PM10 AND PM 2.5

    EPA Science Inventory

    A first-generation probabilistic population exposure model for Particulate Matter (PM), specifically for predicting PM10 and PM2.5 exposures of an urban population, has been developed. This model is intended to be used to predict exposure (magnitude, frequency, and duration) ...

  2. Conditional Reasoning in Context: A Dual-Source Model of Probabilistic Inference

    ERIC Educational Resources Information Center

    Klauer, Karl Christoph; Beller, Sieghard; Hutter, Mandy

    2010-01-01

    A dual-source model of probabilistic conditional inference is proposed. According to the model, inferences are based on 2 sources of evidence: logical form and prior knowledge. Logical form is a decontextualized source of evidence, whereas prior knowledge is activated by the contents of the conditional rule. In Experiments 1 to 3, manipulations of…

  3. Probabilistic Model Building Genetic Programming based on Estimation of Bayesian Network

    NASA Astrophysics Data System (ADS)

    Hasegawa, Yoshihiko; Iba, Hitoshi

    Genetic Programming (GP) is a powerful optimization algorithm, which employs crossover as its genetic operation. Because the crossover operator in GP randomly selects sub-trees, building blocks may be destroyed by the crossover. Recently, algorithms called PMBGPs (Probabilistic Model Building GP), based on probabilistic techniques, have been proposed in order to address the problem mentioned above. We propose a new PMBGP employing a Bayesian network for generating new individuals with a special chromosome called the expanded parse tree, which greatly reduces the number of possible symbols at each node. Although a large number of symbols gives rise to a large conditional probability table and requires many samples to estimate the interactions among nodes, the use of the expanded parse tree overcomes these problems. Computational experiments on two subjects demonstrate that our new PMBGP is much superior to prior probabilistic models.

  4. Model initialisation, data assimilation and probabilistic flood forecasting for distributed hydrological models

    NASA Astrophysics Data System (ADS)

    Cole, S. J.; Robson, A. J.; Bell, V. A.; Moore, R. J.

    2009-04-01

    The hydrological forecasting component of the Natural Environment Research Council's FREE (Flood Risk from Extreme Events) project "Exploitation of new data sources, data assimilation and ensemble techniques for storm and flood forecasting" addresses the initialisation, data assimilation and uncertainty of hydrological flood models utilising advances in rainfall estimation and forecasting. Progress will be reported on the development and assessment of simple model-initialisation and state-correction methods for a distributed grid-based hydrological model, the G2G Model. The potential of the G2G Model for area-wide flood forecasting is demonstrated through a nationwide application across England and Wales. Probabilistic flood forecasting in spatial form is illustrated through the use of high-resolution NWP rainfalls, and pseudo-ensemble forms of these, as input to the G2G Model. The G2G Model is configured over a large area of South West England and the Boscastle storm of 16 August 2004 is used as a convective case study. Visualisation of probabilistic flood forecasts is achieved through risk maps of flood threshold exceedence that indicate the space-time evolution of flood risk during the event.

  5. Opponent actor learning (OpAL): modeling interactive effects of striatal dopamine on reinforcement learning and choice incentive.

    PubMed

    Collins, Anne G E; Frank, Michael J

    2014-07-01

    The striatal dopaminergic system has been implicated in reinforcement learning (RL), motor performance, and incentive motivation. Various computational models have been proposed to account for each of these effects individually, but a formal analysis of their interactions is lacking. Here we present a novel algorithmic model expanding the classical actor-critic architecture to include fundamental interactive properties of neural circuit models, incorporating both incentive and learning effects into a single theoretical framework. The standard actor is replaced by a dual opponent actor system representing distinct striatal populations, which come to differentially specialize in discriminating positive and negative action values. Dopamine modulates the degree to which each actor component contributes to both learning and choice discriminations. In contrast to standard frameworks, this model simultaneously captures documented effects of dopamine on both learning and choice incentive-and their interactions-across a variety of studies, including probabilistic RL, effort-based choice, and motor skill learning. PMID:25090423
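
    A simplified sketch of the opponent-actor scheme described above, with illustrative parameter values: a critic tracks expected value, "Go" and "NoGo" actor weights are updated in opposite directions by the prediction error, and separate gains on the two actors (standing in for dopamine level) tilt choice at decision time.

```python
# Illustrative opponent actor-critic on a two-option probabilistic reward task.
import numpy as np

rng = np.random.default_rng(5)
p_reward = np.array([0.8, 0.2])         # reward probabilities of the two options (synthetic)
alpha_c = alpha_a = 0.1                 # critic / actor learning rates (illustrative)
beta_g, beta_n = 1.5, 0.5               # actor gains standing in for dopamine level

V, G, N = np.zeros(2), np.ones(2), np.ones(2)
for _ in range(1000):
    act = beta_g * G - beta_n * N                       # net action propensities
    p = np.exp(act) / np.exp(act).sum()                 # softmax choice
    a = rng.choice(2, p=p)
    r = float(rng.uniform() < p_reward[a])
    delta = r - V[a]                                    # reward prediction error
    V[a] += alpha_c * delta
    G[a] += alpha_a * G[a] * delta                      # Go weights grow with positive errors
    N[a] += alpha_a * N[a] * -delta                     # NoGo weights grow with negative errors

print(G, N, p)                                          # learned weights and final choice probs
```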

  7. A Neurocomputational Model of Altruistic Choice and Its Implications.

    PubMed

    Hutcherson, Cendri A; Bushong, Benjamin; Rangel, Antonio

    2015-07-15

    We propose a neurocomputational model of altruistic choice and test it using behavioral and fMRI data from a task in which subjects make choices between real monetary prizes for themselves and another. We show that a multi-attribute drift-diffusion model, in which choice results from accumulation of a relative value signal that linearly weights payoffs for self and other, captures key patterns of choice, reaction time, and neural response in ventral striatum, temporoparietal junction, and ventromedial prefrontal cortex. The model generates several novel insights into the nature of altruism. It explains when and why generous choices are slower or faster than selfish choices, and why they produce greater response in TPJ and vmPFC, without invoking competition between automatic and deliberative processes or reward value for generosity. It also predicts that when one's own payoffs are valued more than others', some generous acts may reflect mistakes rather than genuinely pro-social preferences. PMID:26182424
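
    A minimal sketch of a multi-attribute drift-diffusion simulation in the spirit of the model described above; the weights, bound, noise level and payoff differences are illustrative, not the fitted values from the paper.

```python
# Illustrative multi-attribute drift-diffusion: drift is a weighted sum of self and
# other payoff differences (selfish option minus generous option); the upper bound
# is read as a selfish choice and the lower bound as a generous choice.
import numpy as np

rng = np.random.default_rng(6)

def simulate_trial(self_diff, other_diff, w_self=0.8, w_other=0.3,
                   bound=1.0, noise=0.3, dt=0.01):
    drift = w_self * self_diff + w_other * other_diff
    x = t = 0.0
    while abs(x) < bound:
        x += drift * dt + noise * np.sqrt(dt) * rng.normal()
        t += dt
    return ("selfish" if x > 0 else "generous"), t

trials = [simulate_trial(self_diff=0.2, other_diff=-0.6) for _ in range(500)]
frac_generous = np.mean([c == "generous" for c, _ in trials])
mean_rt = np.mean([t for _, t in trials])
print(frac_generous, mean_rt)
```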

  8. Monthly water balance modeling: Probabilistic, possibilistic and hybrid methods for model combination and ensemble simulation

    NASA Astrophysics Data System (ADS)

    Nasseri, M.; Zahraie, B.; Ajami, N. K.; Solomatine, D. P.

    2014-04-01

    Multi-model (ensemble, or committee) techniques have been shown to be an effective way to improve hydrological prediction performance and provide uncertainty information. This paper presents two novel multi-model ensemble techniques, one probabilistic, the Modified Bootstrap Ensemble Model (MBEM), and one possibilistic, FUzzy C-means Ensemble based on data Pattern (FUCEP). The paper also explores use of the Ordinary Kriging (OK) method as a multi-model combination scheme for hydrological simulation/prediction. These techniques are compared against Bayesian Model Averaging (BMA) and Weighted Average (WA) methods to demonstrate their effectiveness. The techniques are applied to three monthly water balance models used to generate streamflow simulations for two mountainous basins in southwestern Iran. For both basins, the results demonstrate that MBEM and FUCEP generate more skillful and reliable probabilistic predictions, outperforming all the other techniques. We also found that OK did not demonstrate any improved skill over the WA scheme as a simple combination method for either basin.

  9. Estimation of an Occupational Choice Model when Occupations Are Misclassified

    ERIC Educational Resources Information Center

    Sullivan, Paul

    2009-01-01

    This paper develops an empirical occupational choice model that corrects for misclassification in occupational choices and measurement error in occupation-specific work experience. The model is used to estimate the extent of measurement error in occupation data and quantify the bias that results from ignoring measurement error in occupation codes…

  10. Assessment of uncertainty in a probabilistic model of consumer exposure to pesticide residues in food.

    PubMed

    Ferrier, Helen; Shaw, George; Nieuwenhuijsen, Mark; Boobis, Alan; Elliott, Paul

    2006-06-01

    The assessment of consumer exposure to pesticides is an important part of pesticide regulation. Probabilistic modelling allows analysis of uncertainty and variability in risk assessments. The output of any assessment will be influenced by the characteristics and uncertainty of the inputs, model structure and assumptions. While the use of probabilistic models is well established in the United States, in Europe problems of low acceptance, sparse data and lack of guidelines are slowing the development. The analyses in the current paper focused on the dietary pathway and the exposure of UK toddlers. Three single food, single pesticide case studies were used to parameterize a simple probabilistic model built in Crystal Ball. Data on dietary consumption patterns were extracted from National Diet and Nutrition Surveys, and levels of pesticide active ingredients in foods were collected from Pesticide Residues Committee monitoring. The effect of uncertainty on the exposure estimate was analysed using scenarios, reflecting different assumptions related to sources of uncertainty. The most influential uncertainty issue was the distribution type used to represent input variables. Other sources that most affected model output were non-detects, unit-to-unit variability and processing. Specifying correlation between variables was found to have little effect on exposure estimates. The findings have important implications for how probabilistic modelling should be conducted, communicated and used by policy and decision makers as part of consumer risk assessment of pesticides.
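
    A toy Monte Carlo version of this kind of dietary exposure calculation; the lognormal/normal input distributions and the formula consumption x residue / body weight are illustrative assumptions, not the study's calibrated inputs.

      import numpy as np

      rng = np.random.default_rng(42)
      n = 100_000
      consumption_g = rng.lognormal(mean=3.0, sigma=0.6, size=n)       # food eaten per day (assumed)
      residue_mg_per_kg = rng.lognormal(mean=-3.0, sigma=1.0, size=n)  # residue level in food (assumed)
      body_weight_kg = rng.normal(13.0, 2.0, size=n).clip(min=7.0)     # toddler body weight (assumed)

      exposure = consumption_g / 1000.0 * residue_mg_per_kg / body_weight_kg  # mg/kg bw/day
      print("median %.2e, 97.5th percentile %.2e" % (np.median(exposure),
                                                     np.percentile(exposure, 97.5)))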

  11. Hyperbolic value addition and general models of animal choice.

    PubMed

    Mazur, J E

    2001-01-01

    Three mathematical models of choice--the contextual-choice model (R. Grace, 1994), delay-reduction theory (N. Squires & E. Fantino, 1971), and a new model called the hyperbolic value-added model--were compared in their ability to predict the results from a wide variety of experiments with animal subjects. When supplied with 2 or 3 free parameters, all 3 models made fairly accurate predictions for a large set of experiments that used concurrent-chain procedures. One advantage of the hyperbolic value-added model is that it is derived from a simpler model that makes accurate predictions for many experiments using discrete-trial adjusting-delay procedures. Some results favor the hyperbolic value-added model and delay-reduction theory over the contextual-choice model, but more data are needed from choice situations for which the models make distinctly different predictions.
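
    The hyperbolic form underlying such models can be illustrated with the standard one-parameter discounting equation V = A / (1 + kD); the numbers below are made up for illustration, and the full hyperbolic value-added model adds terms beyond this.

      def hyperbolic_value(amount, delay, k=0.1):
          """V = A / (1 + k*D): present value of a reward of size `amount` after `delay`."""
          return amount / (1.0 + k * delay)

      # a reward of 10 units delayed 5 s vs. 4 units delayed 1 s (illustrative numbers)
      print(hyperbolic_value(10, 5), hyperbolic_value(4, 1))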

  12. Hybrid discrete choice models: Gained insights versus increasing effort.

    PubMed

    Mariel, Petr; Meyerhoff, Jürgen

    2016-10-15

    Hybrid choice models expand the standard models in discrete choice modelling by incorporating psychological factors as latent variables. They could therefore provide further insights into choice processes and underlying taste heterogeneity, but the costs of estimating these models often increase significantly. This paper aims at comparing the results from a hybrid choice model and a classical random parameter logit. The point of departure for this analysis is whether researchers and practitioners should add hybrid choice models to their suite of routinely estimated models. Our comparison reveals, in line with the few prior studies, that hybrid models gain in efficiency by the inclusion of additional information. The choice between the two approaches, however, depends on the objective of the analysis. If disentangling preference heterogeneity is most important, the hybrid model seems preferable. If the focus is on predictive power, a standard random parameter logit model might be the better choice. Finally, we give recommendations for an adequate use of hybrid choice models based on known principles of elementary scientific inference. PMID:27310534

  13. Hybrid discrete choice models: Gained insights versus increasing effort.

    PubMed

    Mariel, Petr; Meyerhoff, Jürgen

    2016-10-15

    Hybrid choice models expand the standard models in discrete choice modelling by incorporating psychological factors as latent variables. They could therefore provide further insights into choice processes and underlying taste heterogeneity, but the costs of estimating these models often increase significantly. This paper aims at comparing the results from a hybrid choice model and a classical random parameter logit. The point of departure for this analysis is whether researchers and practitioners should add hybrid choice models to their suite of routinely estimated models. Our comparison reveals, in line with the few prior studies, that hybrid models gain in efficiency by the inclusion of additional information. The choice between the two approaches, however, depends on the objective of the analysis. If disentangling preference heterogeneity is most important, the hybrid model seems preferable. If the focus is on predictive power, a standard random parameter logit model might be the better choice. Finally, we give recommendations for an adequate use of hybrid choice models based on known principles of elementary scientific inference.

  14. A range of complex probabilistic models for RNA secondary structure prediction that includes the nearest-neighbor model and more.

    PubMed

    Rivas, Elena; Lang, Raymond; Eddy, Sean R

    2012-02-01

    The standard approach for single-sequence RNA secondary structure prediction uses a nearest-neighbor thermodynamic model with several thousand experimentally determined energy parameters. An attractive alternative is to use statistical approaches with parameters estimated from growing databases of structural RNAs. Good results have been reported for discriminative statistical methods using complex nearest-neighbor models, including CONTRAfold, Simfold, and ContextFold. Little work has been reported on generative probabilistic models (stochastic context-free grammars [SCFGs]) of comparable complexity, although probabilistic models are generally easier to train and to use. To explore a range of probabilistic models of increasing complexity, and to directly compare probabilistic, thermodynamic, and discriminative approaches, we created TORNADO, a computational tool that can parse a wide spectrum of RNA grammar architectures (including the standard nearest-neighbor model and more) using a generalized super-grammar that can be parameterized with probabilities, energies, or arbitrary scores. By using TORNADO, we find that probabilistic nearest-neighbor models perform comparably to (but not significantly better than) discriminative methods. We find that complex statistical models are prone to overfitting RNA structure and that evaluations should use structurally nonhomologous training and test data sets. Overfitting has affected at least one published method (ContextFold). The most important barrier to improving statistical approaches for RNA secondary structure prediction is the lack of diversity of well-curated single-sequence RNA secondary structures in current RNA databases. PMID:22194308

  15. A range of complex probabilistic models for RNA secondary structure prediction that includes the nearest-neighbor model and more.

    PubMed

    Rivas, Elena; Lang, Raymond; Eddy, Sean R

    2012-02-01

    The standard approach for single-sequence RNA secondary structure prediction uses a nearest-neighbor thermodynamic model with several thousand experimentally determined energy parameters. An attractive alternative is to use statistical approaches with parameters estimated from growing databases of structural RNAs. Good results have been reported for discriminative statistical methods using complex nearest-neighbor models, including CONTRAfold, Simfold, and ContextFold. Little work has been reported on generative probabilistic models (stochastic context-free grammars [SCFGs]) of comparable complexity, although probabilistic models are generally easier to train and to use. To explore a range of probabilistic models of increasing complexity, and to directly compare probabilistic, thermodynamic, and discriminative approaches, we created TORNADO, a computational tool that can parse a wide spectrum of RNA grammar architectures (including the standard nearest-neighbor model and more) using a generalized super-grammar that can be parameterized with probabilities, energies, or arbitrary scores. By using TORNADO, we find that probabilistic nearest-neighbor models perform comparably to (but not significantly better than) discriminative methods. We find that complex statistical models are prone to overfitting RNA structure and that evaluations should use structurally nonhomologous training and test data sets. Overfitting has affected at least one published method (ContextFold). The most important barrier to improving statistical approaches for RNA secondary structure prediction is the lack of diversity of well-curated single-sequence RNA secondary structures in current RNA databases.

  16. The role of probability of reinforcement in models of choice.

    PubMed

    Williams, B A

    1994-10-01

    A general account of choice behavior in animals, the cumulative effects model, has been proposed by Davis, Staddon, Machado, and Palmer (1993). Its basic assumptions are that choice occurs in an all-or-none fashion for the response alternative with the highest probability of reinforcement and that the probability of reinforcement for each response alternative is calculated from the entire history of training (total number of reinforced responses/total number of reinforced and nonreinforced responses). The model's reliance on probability of reinforcement as the fundamental variable controlling choice behavior subjects the cumulative effects model to the same criticisms as have been directed toward other related models of choice, notably melioration theory. Several different data sets show that the relative value of a response alternative is not predicted by the obtained probability of reinforcement associated with that alternative. Alternative approaches to choice theory are considered.
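
    The choice rule described above is simple enough to state directly: each alternative's value is its cumulative reinforced responses divided by its cumulative total responses over the whole training history, and choice goes all-or-none to the highest value. A small simulation sketch (the unit priors used to avoid division by zero are an added assumption):

      import numpy as np

      def cumulative_effects_choice(n_trials, reinf_prob, seed=0):
          """All-or-none choice by cumulative reinforcement probability (sketch)."""
          rng = np.random.default_rng(seed)
          n_alt = len(reinf_prob)
          reinforced = np.ones(n_alt)   # start at 1 to avoid 0/0 (assumption)
          total = np.ones(n_alt)
          choices = []
          for _ in range(n_trials):
              value = reinforced / total
              best = np.flatnonzero(value == value.max())
              a = rng.choice(best)                       # ties broken at random
              total[a] += 1
              reinforced[a] += rng.random() < reinf_prob[a]
              choices.append(a)
          return np.array(choices)

      print(np.bincount(cumulative_effects_choice(500, [0.7, 0.3]), minlength=2))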

  17. Psychological Plausibility of the Theory of Probabilistic Mental Models and the Fast and Frugal Heuristics

    ERIC Educational Resources Information Center

    Dougherty, Michael R.; Franco-Watkins, Ana M.; Thomas, Rick

    2008-01-01

    The theory of probabilistic mental models (PMM; G. Gigerenzer, U. Hoffrage, & H. Kleinbolting, 1991) has had a major influence on the field of judgment and decision making, with the most recent important modifications to PMM theory being the identification of several fast and frugal heuristics (G. Gigerenzer & D. G. Goldstein, 1996). These…

  18. From cyclone tracks to the costs of European winter storms: A probabilistic loss assessment model

    NASA Astrophysics Data System (ADS)

    Renggli, Dominik; Corti, Thierry; Reese, Stefan; Wueest, Marc; Viktor, Elisabeth; Zimmerli, Peter

    2014-05-01

    The quantitative assessment of the potential losses of European winter storms is essential for the economic viability of a global reinsurance company. For this purpose, reinsurance companies generally use probabilistic loss assessment models. This work presents an innovative approach to develop physically meaningful probabilistic events for Swiss Re's new European winter storm loss model. The meteorological hazard component of the new model is based on cyclone and windstorm tracks identified in the 20th Century Reanalysis data. The knowledge of the evolution of winter storms both in time and space allows the physically meaningful perturbation of properties of historical events (e.g. track, intensity). The perturbation includes a random element but also takes the local climatology and the evolution of the historical event into account. The low-resolution wind footprints taken from 20th Century Reanalysis are processed by a statistical-dynamical downscaling to generate high-resolution footprints of the historical and probabilistic winter storm events. Downscaling transfer functions are generated using ENSEMBLES regional climate model data. The result is a set of reliable probabilistic events representing thousands of years. The event set is then combined with country- and risk-specific vulnerability functions and detailed market- or client-specific exposure information to compute (re-)insurance risk premiums.

  19. PEER REVIEW FOR THE CONSUMER VEHICLE CHOICE MODEL

    EPA Science Inventory

    The U.S. Environmental Protection Agency’s (EPA) Office of Transportation and Air Quality (OTAQ) has recently sponsored the development of a Consumer Vehicle Choice Model (CVCM) by the Oak Ridge National Laboratory (ORNL). The specification by OTAQ to ORNL for consumer choice mod...

  20. NEUROBIOLOGY OF ECONOMIC CHOICE: A GOOD-BASED MODEL

    PubMed Central

    Padoa-Schioppa, Camillo

    2012-01-01

    Traditionally the object of economic theory and experimental psychology, economic choice recently became a lively research focus in systems neuroscience. Here I summarize the emerging results and I propose a unifying model of how economic choice might function at the neural level. Economic choice entails comparing options that vary on multiple dimensions. Hence, while choosing, individuals integrate different determinants into a subjective value; decisions are then made by comparing values. According to the good-based model, the values of different goods are computed independently of one another, which implies transitivity. Values are not learned as such, but rather computed at the time of choice. Most importantly, values are compared within the space of goods, independent of the sensori-motor contingencies of choice. Evidence from neurophysiology, imaging and lesion studies indicates that abstract representations of value exist in the orbitofrontal and ventromedial prefrontal cortices. The computation and comparison of values may thus take place within these regions. PMID:21456961

  1. Hierarchical minimax entropy modeling and probabilistic principal component visualization for data exploration

    NASA Astrophysics Data System (ADS)

    Wang, Yue J.; Luo, Lan; Li, Haifeng; Freedman, Matthew T.

    1999-05-01

    As a step toward understanding the complex information from data and relationships, structural and discriminative knowledge reveals insight that may prove useful in data interpretation and exploration. This paper reports the development of an automated and intelligent procedure for generating a hierarchy of minimax entropy models and principal component visualization spaces for improved data explanation. The proposed hierarchical minimax entropy modeling and probabilistic principal component projection are both statistically principled and visually effective at revealing all of the interesting aspects of the data set. The methods involve multiple use of standard finite normal mixture models and probabilistic principal component projections. The strategy is that the top-level model and projection should explain the entire data set, best revealing the presence of clusters and relationships, while lower-level models and projections should display internal structure within individual clusters, such as the presence of subclusters and attribute trends, which might not be apparent in the higher-level models and projections. With many complementary mixture models and visualization projections, each level will be relatively simple while the complete hierarchy maintains overall flexibility yet still conveys considerable structural information. In particular, a model identification procedure is developed to select the optimal number and kernel shapes of local clusters from a class of data, resulting in a standard finite normal mixture with minimum conditional bias and variance, and a probabilistic principal component neural network is advanced to generate optimal projections, leading to a hierarchical visualization algorithm allowing the complete data set to be analyzed at the top level, with best-separated subclusters of data points analyzed at deeper levels. Hierarchical probabilistic principal component visualization involves (1) evaluation of posterior probabilities for

  2. Probabilistic load model development and validation for composite load spectra for select space propulsion engines

    NASA Technical Reports Server (NTRS)

    Kurth, R.; Newell, J. F.

    1987-01-01

    A major task of the program to develop an expert system to predict the loads on selected components of a generic space propulsion engine is the design, development, and application of a probabilistic loads model. This model is being developed in order to account for the random nature of the loads and to assess the effect of variable load ranges on engine performance. A probabilistic model has been developed. The model is based primarily on simulation methods, but also has a Gaussian algebra method (if all variables are near normal), a fast probability integrator routine (for the calculation of low probability events), and a separate, stand-alone program for performing barrier crossing calculations. Each of these probabilistic methods has been verified with theoretical calculations using assumed distributional forms.

  3. The predictive accuracy of intertemporal-choice models.

    PubMed

    Arfer, Kodi B; Luhmann, Christian C

    2015-05-01

    How do people choose between a smaller reward available sooner and a larger reward available later? Past research has evaluated models of intertemporal choice by measuring goodness of fit or identifying which decision-making anomalies they can accommodate. An alternative criterion for model quality, which is partly antithetical to these standard criteria, is predictive accuracy. We used cross-validation to examine how well 10 models of intertemporal choice could predict behaviour in a 100-trial binary-decision task. Many models achieved the apparent ceiling of 85% accuracy, even with smaller training sets. When noise was added to the training set, however, a simple logistic-regression model we call the difference model performed particularly well. In many situations, between-model differences in predictive accuracy may be small, contrary to long-standing controversy over the modelling question in research on intertemporal choice, but the simplicity and robustness of the difference model recommend it for future use.
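
    A hedged sketch of the difference-model idea: logistic regression on the raw amount and delay differences between the two options, scored by cross-validation. The synthetic chooser and feature construction below are placeholders, not the study's task or data.

      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(0)
      n = 500
      amt_ll, amt_ss = rng.uniform(10, 50, n), rng.uniform(1, 10, n)   # larger-later vs smaller-sooner
      del_ll, del_ss = rng.uniform(10, 60, n), np.zeros(n)
      k = 0.05                                                          # assumed discount rate
      chose_ll = (amt_ll / (1 + k * del_ll) + rng.normal(0, 1, n)
                  > amt_ss / (1 + k * del_ss)).astype(int)

      X = np.column_stack([amt_ll - amt_ss, del_ll - del_ss])           # the "difference" features
      print("CV accuracy:", cross_val_score(LogisticRegression(), X, chose_ll, cv=5).mean())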

  4. A note on probabilistic models over strings: the linear algebra approach.

    PubMed

    Bouchard-Côté, Alexandre

    2013-12-01

    Probabilistic models over strings have played a key role in developing methods that take into consideration indels as phylogenetically informative events. There is an extensive literature on using automata and transducers on phylogenies to do inference on these probabilistic models, in which an important theoretical question is the complexity of computing the normalization of a class of string-valued graphical models. This question has been investigated using tools from combinatorics, dynamic programming, and graph theory, and has practical applications in Bayesian phylogenetics. In this work, we revisit this theoretical question from a different point of view, based on linear algebra. The main contribution is a set of results based on this linear algebra view that facilitate the analysis and design of inference algorithms on string-valued graphical models. As an illustration, we use this method to give a new elementary proof of a known result on the complexity of inference on the "TKF91" model, a well-known probabilistic model over strings. Compared to previous work, our proving method is easier to extend to other models, since it relies on a novel weak condition, triangular transducers, which is easy to establish in practice. The linear algebra view provides a concise way of describing transducer algorithms and their compositions, opens the possibility of transferring fast linear algebra libraries (for example, based on GPUs), as well as low rank matrix approximation methods, to string-valued inference problems. PMID:24135792

  5. A note on probabilistic models over strings: the linear algebra approach.

    PubMed

    Bouchard-Côté, Alexandre

    2013-12-01

    Probabilistic models over strings have played a key role in developing methods that take into consideration indels as phylogenetically informative events. There is an extensive literature on using automata and transducers on phylogenies to do inference on these probabilistic models, in which an important theoretical question is the complexity of computing the normalization of a class of string-valued graphical models. This question has been investigated using tools from combinatorics, dynamic programming, and graph theory, and has practical applications in Bayesian phylogenetics. In this work, we revisit this theoretical question from a different point of view, based on linear algebra. The main contribution is a set of results based on this linear algebra view that facilitate the analysis and design of inference algorithms on string-valued graphical models. As an illustration, we use this method to give a new elementary proof of a known result on the complexity of inference on the "TKF91" model, a well-known probabilistic model over strings. Compared to previous work, our proving method is easier to extend to other models, since it relies on a novel weak condition, triangular transducers, which is easy to establish in practice. The linear algebra view provides a concise way of describing transducer algorithms and their compositions, opens the possibility of transferring fast linear algebra libraries (for example, based on GPUs), as well as low rank matrix approximation methods, to string-valued inference problems.

  6. Probabilistic models for assessment of extreme temperatures and relative humidity in Lithuania

    NASA Astrophysics Data System (ADS)

    Alzbutas, Robertas; Šeputytė, Ilona

    2015-04-01

    Extreme temperatures are a fairly common natural phenomenon in Lithuania. They have mainly negative effects on both the environment and humans. It is therefore important to perform probabilistic and statistical analyses of possible extreme temperature values and their time-dependent changes. This is especially important in areas where technical objects sensitive to extreme temperatures are planned to be constructed. In order to estimate the frequencies and consequences of possible extreme temperatures, a probabilistic analysis of event occurrence and its uncertainty has been performed: statistical data have been collected and analyzed. The probabilistic analysis of extreme temperatures in Lithuanian territory is based on historical data taken from the Lithuanian Hydrometeorology Service, Dūkštas Meteorological Station, Lithuanian Energy Institute and the Ignalina NPP Environmental Protection Department of Environmental Monitoring Service. The main objective of the work was the probabilistic assessment of the occurrence and impact of extreme temperature and relative humidity across Lithuania and specifically in the Dūkštas region, where the Ignalina Nuclear Power Plant is being decommissioned. A further purpose of this work was to analyze changes in extreme temperatures. The probabilistic analysis of increases in extreme temperatures in Lithuanian territory was based on more than 50 years of historical data. The probabilistic assessment focused on the application and comparison of Gumbel, Weibull and Generalized Extreme Value (GEV) distributions, enabling selection of the distribution that best fits the extreme temperature data. To assess the likelihood of extreme temperatures, different probabilistic models were applied to evaluate the probability of exceedance of different extreme temperatures. According to the statistics and the relationship between return period and temperature probabilities, the return period for 30
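
    The distribution-fitting step described above can be sketched with scipy: fit Gumbel and GEV distributions to a series of annual maxima and convert a return period T into the corresponding return level via the quantile at 1 - 1/T. The synthetic series below stands in for the station records.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(1)
      annual_max = rng.gumbel(loc=32.0, scale=2.5, size=60)   # synthetic annual maximum temperatures

      gum = stats.gumbel_r.fit(annual_max)       # (loc, scale)
      gev = stats.genextreme.fit(annual_max)     # (shape, loc, scale)

      for T in (10, 50, 100):                    # return periods in years
          p = 1.0 - 1.0 / T
          print(T, "yr:", round(stats.gumbel_r.ppf(p, *gum), 1),
                round(stats.genextreme.ppf(p, *gev), 1))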

  7. Probabilistic modeling of the evolution of gene synteny within reconciled phylogenies

    PubMed Central

    2015-01-01

    Background Most models of genome evolution concern either genetic sequences, gene content or gene order. They sometimes integrate two of the three levels, but rarely the three of them. Probabilistic models of gene order evolution usually have to assume constant gene content or adopt a presence/absence coding of gene neighborhoods which is blind to complex events modifying gene content. Results We propose a probabilistic evolutionary model for gene neighborhoods, allowing genes to be inserted, duplicated or lost. It uses reconciled phylogenies, which integrate sequence and gene content evolution. We are then able to optimize parameters such as phylogeny branch lengths, or probabilistic laws depicting the diversity of susceptibility of syntenic regions to rearrangements. We reconstruct a structure for ancestral genomes by optimizing a likelihood, keeping track of all evolutionary events at the level of gene content and gene synteny. Ancestral syntenies are associated with a probability of presence. We implemented the model with the restriction that at most one gene duplication separates two gene speciations in reconciled gene trees. We reconstruct ancestral syntenies on a set of 12 drosophila genomes, and compare the evolutionary rates along the branches and along the sites. We compare with a parsimony method and find a significant number of results not supported by the posterior probability. The model is implemented in the Bio++ library. It thus benefits from and enriches the classical models and methods for molecular evolution. PMID:26452018

  8. Probabilistic Inference: Task Dependency and Individual Differences of Probability Weighting Revealed by Hierarchical Bayesian Modeling.

    PubMed

    Boos, Moritz; Seer, Caroline; Lange, Florian; Kopp, Bruno

    2016-01-01

    Cognitive determinants of probabilistic inference were examined using hierarchical Bayesian modeling techniques. A classic urn-ball paradigm served as experimental strategy, involving a factorial two (prior probabilities) by two (likelihoods) design. Five computational models of cognitive processes were compared with the observed behavior. Parameter-free Bayesian posterior probabilities and parameter-free base rate neglect provided inadequate models of probabilistic inference. The introduction of distorted subjective probabilities yielded more robust and generalizable results. A general class of (inverted) S-shaped probability weighting functions had been proposed; however, the possibility of large differences in probability distortions not only across experimental conditions, but also across individuals, seems critical for the model's success. It also seems advantageous to consider individual differences in parameters of probability weighting as being sampled from weakly informative prior distributions of individual parameter values. Thus, the results from hierarchical Bayesian modeling converge with previous results in revealing that probability weighting parameters show considerable task dependency and individual differences. Methodologically, this work exemplifies the usefulness of hierarchical Bayesian modeling techniques for cognitive psychology. Theoretically, human probabilistic inference might be best described as the application of individualized strategic policies for Bayesian belief revision.
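
    A common one-parameter example of such (inverted) S-shaped weighting is the Tversky-Kahneman form w(p) = p^gamma / (p^gamma + (1 - p)^gamma)^(1/gamma); the gamma values below simply illustrate how task- or person-specific parameters change the distortion.

      import numpy as np

      def w_tk(p, gamma):
          """Tversky-Kahneman probability weighting: inverted S-shape for gamma < 1."""
          p = np.asarray(p, dtype=float)
          return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

      for gamma in (0.5, 0.7, 1.0):   # illustrative individual- or condition-specific values
          print(gamma, np.round(w_tk([0.1, 0.5, 0.9], gamma), 3))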

  9. Probabilistic Inference: Task Dependency and Individual Differences of Probability Weighting Revealed by Hierarchical Bayesian Modeling

    PubMed Central

    Boos, Moritz; Seer, Caroline; Lange, Florian; Kopp, Bruno

    2016-01-01

    Cognitive determinants of probabilistic inference were examined using hierarchical Bayesian modeling techniques. A classic urn-ball paradigm served as experimental strategy, involving a factorial two (prior probabilities) by two (likelihoods) design. Five computational models of cognitive processes were compared with the observed behavior. Parameter-free Bayesian posterior probabilities and parameter-free base rate neglect provided inadequate models of probabilistic inference. The introduction of distorted subjective probabilities yielded more robust and generalizable results. A general class of (inverted) S-shaped probability weighting functions had been proposed; however, the possibility of large differences in probability distortions not only across experimental conditions, but also across individuals, seems critical for the model's success. It also seems advantageous to consider individual differences in parameters of probability weighting as being sampled from weakly informative prior distributions of individual parameter values. Thus, the results from hierarchical Bayesian modeling converge with previous results in revealing that probability weighting parameters show considerable task dependency and individual differences. Methodologically, this work exemplifies the usefulness of hierarchical Bayesian modeling techniques for cognitive psychology. Theoretically, human probabilistic inference might be best described as the application of individualized strategic policies for Bayesian belief revision. PMID:27303323

  10. Allocation Variable-Based Probabilistic Algorithm to Deal with Label Switching Problem in Bayesian Mixture Models

    PubMed Central

    Pan, Jia-Chiun; Liu, Chih-Min; Hwu, Hai-Gwo; Huang, Guan-Hua

    2015-01-01

    The label switching problem occurs as a result of the nonidentifiability of the posterior distribution over various permutations of component labels when using a Bayesian approach to estimate parameters in mixture models. In cases where the number of components is fixed and known, we propose a relabelling algorithm, an allocation variable-based (denoted by AVP) probabilistic relabelling approach, to deal with the label switching problem. We establish a model for the posterior distribution of allocation variables with the label switching phenomenon. The AVP algorithm stochastically relabels the posterior samples according to the posterior probabilities of the established model. Some existing deterministic and other probabilistic algorithms are compared with the AVP algorithm in simulation studies, and the success of the proposed approach is demonstrated in simulation studies and on a real dataset. PMID:26458185

  11. Don't Fear Optimality: Sampling for Probabilistic-Logic Sequence Models

    NASA Astrophysics Data System (ADS)

    Thon, Ingo

    One of the current challenges in artificial intelligence is modeling dynamic environments that change due to the actions or activities undertaken by people or agents. The task of inferring hidden states, e.g. the activities or intentions of people, based on observations is called filtering. Standard probabilistic models such as Dynamic Bayesian Networks are able to solve this task efficiently using approximative methods such as particle filters. However, these models do not support logical or relational representations. The key contribution of this paper is the upgrade of a particle filter algorithm for use with a probabilistic logical representation through the definition of a proposal distribution. The performance of the algorithm depends largely on how well this distribution fits the target distribution. We adopt the idea of logical compilation into Binary Decision Diagrams for sampling. This allows us to use the optimal proposal distribution which is normally prohibitively slow.

  12. A probabilistic model for estimating the waiting time until the simultaneous collapse of two contingencies

    SciTech Connect

    Barnett, C.S.

    1991-06-01

    The Double Contingency Principle (DCP) is widely applied to criticality safety practice in the United States. Most practitioners base their application of the principle on qualitative, intuitive assessments. The recent trend toward probabilistic safety assessments provides a motive to search for a quantitative, probabilistic foundation for the DCP. A Markov model is tractable and leads to relatively simple results. The model yields estimates of mean time to simultaneous collapse of two contingencies as a function of estimates of mean failure times and mean recovery times of two independent contingencies. The model is a tool that can be used to supplement the qualitative methods now used to assess effectiveness of the DCP. 3 refs., 1 fig.
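
    The quantity the Markov model estimates can be checked with a small discrete-event Monte Carlo: with independent exponential failure and recovery times for the two contingencies, simulate until both are down at once and average the waiting time. The rates below are arbitrary illustrations.

      import numpy as np

      def mean_time_to_double_failure(fail1, rec1, fail2, rec2, n_runs=1000, seed=0):
          """Monte Carlo estimate of mean waiting time until both contingencies fail together."""
          rng = np.random.default_rng(seed)
          times = []
          for _ in range(n_runs):
              up = [True, True]
              nxt = [rng.exponential(1 / fail1), rng.exponential(1 / fail2)]  # next event times
              while True:
                  i = int(np.argmin(nxt))
                  t = nxt[i]
                  up[i] = not up[i]
                  if not (up[0] or up[1]):          # both down: record the waiting time
                      times.append(t)
                      break
                  rate = (fail1, fail2)[i] if up[i] else (rec1, rec2)[i]
                  nxt[i] = t + rng.exponential(1 / rate)
          return float(np.mean(times))

      print(mean_time_to_double_failure(0.01, 1.0, 0.01, 1.0))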

  13. A Probabilistic Model for the Distribution of Authorships.

    ERIC Educational Resources Information Center

    Ajiferuke, Isola

    1991-01-01

    Discusses bibliometric studies of research collaboration and describes the development of a theoretical model for the distribution of authorship. The shifted Waring distribution model and 15 other probability models are tested for goodness-of-fit, and results are reported that indicate the shifted inverse Gaussian-Poisson model provides the best…

  14. Boosting probabilistic graphical model inference by incorporating prior knowledge from multiple sources.

    PubMed

    Praveen, Paurush; Fröhlich, Holger

    2013-01-01

    Inferring regulatory networks from experimental data via probabilistic graphical models is a popular framework to gain insights into biological systems. However, the inherent noise in experimental data coupled with a limited sample size reduces the performance of network reverse engineering. Prior knowledge from existing sources of biological information can address this low signal to noise problem by biasing the network inference towards biologically plausible network structures. Although integrating various sources of information is desirable, their heterogeneous nature makes this task challenging. We propose two computational methods to incorporate various information sources into a probabilistic consensus structure prior to be used in graphical model inference. Our first model, called Latent Factor Model (LFM), assumes a high degree of correlation among external information sources and reconstructs a hidden variable as a common source in a Bayesian manner. The second model, a Noisy-OR, picks up the strongest support for an interaction among information sources in a probabilistic fashion. Our extensive computational studies on KEGG signaling pathways as well as on gene expression data from breast cancer and yeast heat shock response reveal that both approaches can significantly enhance the reconstruction accuracy of Bayesian Networks compared to other competing methods as well as to the situation without any prior. Our framework allows for using diverse information sources, like pathway databases, GO terms and protein domain data, etc. and is flexible enough to integrate new sources, if available.
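
    The Noisy-OR combination mentioned above has a simple closed form: if k independent knowledge sources support an interaction with probabilities p_1..p_k, the combined support is 1 - (1 - p_1)...(1 - p_k). A minimal illustration (the source probabilities are made up):

      import numpy as np

      def noisy_or(support_probs):
          """Combined prior support for an edge given independent source probabilities."""
          p = np.asarray(support_probs, dtype=float)
          return 1.0 - np.prod(1.0 - p)

      # e.g. pathway database, GO co-annotation and protein-domain evidence
      print(noisy_or([0.6, 0.3, 0.2]))   # -> 0.776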

  15. A Probabilistic Model of Student Nurses' Knowledge of Normal Nutrition.

    ERIC Educational Resources Information Center

    Passmore, David Lynn

    1983-01-01

    Vocational and technical education researchers need to be aware of the uses and limits of various statistical models. The author reviews the Rasch Model and applies it to results from a nutrition test given to student nurses. (Author)

  16. HIV-specific probabilistic models of protein evolution.

    PubMed

    Nickle, David C; Heath, Laura; Jensen, Mark A; Gilbert, Peter B; Mullins, James I; Kosakovsky Pond, Sergei L

    2007-06-06

    Comparative sequence analyses, including such fundamental bioinformatics techniques as similarity searching, sequence alignment and phylogenetic inference, have become a mainstay for researchers studying type 1 Human Immunodeficiency Virus (HIV-1) genome structure and evolution. Implicit in comparative analyses is an underlying model of evolution, and the chosen model can significantly affect the results. In general, evolutionary models describe the probabilities of replacing one amino acid character with another over a period of time. Most widely used evolutionary models for protein sequences have been derived from curated alignments of hundreds of proteins, usually based on mammalian genomes. It is unclear to what extent these empirical models are generalizable to a very different organism, such as HIV-1, the most extensively sequenced organism in existence. We developed a maximum likelihood model fitting procedure to a collection of HIV-1 alignments sampled from different viral genes, and inferred two empirical substitution models, suitable for describing between- and within-host evolution. Our procedure pools the information from multiple sequence alignments, and the provided software implementation can be run efficiently in parallel on a computer cluster. We describe how the inferred substitution models can be used to generate scoring matrices suitable for alignment and similarity searches. Our models had a consistently superior fit relative to the best existing models and to parameter-rich data-driven models when benchmarked on independent HIV-1 alignments, demonstrating evolutionary biases in amino-acid substitution that are unique to HIV, and that are not captured by the existing models. The scoring matrices derived from the models showed a marked difference from common amino-acid scoring matrices. The use of an appropriate evolutionary model recovered a known viral transmission history, whereas a poorly chosen model introduced phylogenetic error. We argue that

  17. Statistical Research for Probabilistic Model of Distortions of Remote Sensing

    NASA Astrophysics Data System (ADS)

    Ayman, Iskakova

    2016-08-01

    In this work, a new multivariate discrete probability model for the distribution of distortion processes affecting radiation in remote sensing data is proposed and studied. The research followed the full cycle adopted in mathematical statistics: the model was constructed and investigated, various methods for estimating its parameters were proposed, and tests of the hypothesis that the model adequately fits the observations were considered.

  18. Probabilistic performance-assessment modeling of the mixed waste landfill at Sandia National Laboratories.

    SciTech Connect

    Peace, Gerald L.; Goering, Timothy James; Miller, Mark Laverne; Ho, Clifford Kuofei

    2007-01-01

    A probabilistic performance assessment has been conducted to evaluate the fate and transport of radionuclides (americium-241, cesium-137, cobalt-60, plutonium-238, plutonium-239, radium-226, radon-222, strontium-90, thorium-232, tritium, uranium-238), heavy metals (lead and cadmium), and volatile organic compounds (VOCs) at the Mixed Waste Landfill (MWL). Probabilistic analyses were performed to quantify uncertainties inherent in the system and models for a 1,000-year period, and sensitivity analyses were performed to identify parameters and processes that were most important to the simulated performance metrics. Comparisons between simulated results and measured values at the MWL were made to gain confidence in the models and perform calibrations when data were available. In addition, long-term monitoring requirements and triggers were recommended based on the results of the quantified uncertainty and sensitivity analyses.

  19. Semiparametric Thurstonian Models for Recurrent Choices: A Bayesian Analysis

    ERIC Educational Resources Information Center

    Ansari, Asim; Iyengar, Raghuram

    2006-01-01

    We develop semiparametric Bayesian Thurstonian models for analyzing repeated choice decisions involving multinomial, multivariate binary or multivariate ordinal data. Our modeling framework has multiple components that together yield considerable flexibility in modeling preference utilities, cross-sectional heterogeneity and parameter-driven…

  20. Probabilistic fasteners with parabolic elements: biological system, artificial model and theoretical considerations.

    PubMed

    Gorb, S N; Popov, V L

    2002-02-15

    Probabilistic fasteners are attachment devices composed of two surfaces covered with cuticular micro-outgrowths. Friction-based fasteners demonstrate high frictional forces when the surfaces come into contact. Attachment in this case is based on the use of the surface profile and mechanical properties of materials, and is fast, precise and reversible. The best-studied examples composed of parabolic elements are the wing-locking mechanism in beetles and the head arrester in dragonflies. This study combines experimental data of force measurements, obtained in an artificial model system, and theoretical considerations based on the simple model of behaviour of probabilistic fasteners with parabolic elements. Elements of the geometry in both cases correspond to the biological prototypes. Force measurements on the artificial system show that the attachment force is strongly dependent on the load force. At small loads, the increase of attachment is very slow, whereas rapid increase of attachment was detected at higher loads. At very high loads, a saturation of the attachment force was revealed. A simple explanation of the attachment principle is that with an increasing load elements of both surfaces slide into gaps of the corresponding part. This results in an increase of lateral loading forces acting on elements. High lateral forces lead to an increase of friction between single sliding elements. An analytical model which describes behaviour of the probabilistic fasteners with parabolic elements is proposed.

  1. TEMPI: probabilistic modeling time-evolving differential PPI networks with multiPle information

    PubMed Central

    Kim, Yongsoo; Jang, Jin-Hyeok; Choi, Seungjin; Hwang, Daehee

    2014-01-01

    Motivation: Time-evolving differential protein–protein interaction (PPI) networks are essential to understand serial activation of differentially regulated (up- or downregulated) cellular processes (DRPs) and their interplays over time. Despite developments in the network inference, current methods are still limited in identifying temporal transition of structures of PPI networks, DRPs associated with the structural transition and the interplays among the DRPs over time. Results: Here, we present a probabilistic model for estimating Time-Evolving differential PPI networks with MultiPle Information (TEMPI). This model describes probabilistic relationships among network structures, time-course gene expression data and Gene Ontology biological processes (GOBPs). By maximizing the likelihood of the probabilistic model, TEMPI estimates jointly the time-evolving differential PPI networks (TDNs) describing temporal transition of PPI network structures together with serial activation of DRPs associated with transiting networks. This joint estimation enables us to interpret the TDNs in terms of temporal transition of the DRPs. To demonstrate the utility of TEMPI, we applied it to two time-course datasets. TEMPI identified the TDNs that correctly delineated temporal transition of DRPs and time-dependent associations between the DRPs. These TDNs provide hypotheses for mechanisms underlying serial activation of key DRPs and their temporal associations. Availability and implementation: Source code and sample data files are available at http://sbm.postech.ac.kr/tempi/sources.zip. Contact: seungjin@postech.ac.kr or dhwang@dgist.ac.kr Supplementary information: Supplementary data are available at Bioinformatics online. PMID:25161233

  2. Probabilistically Constraining Age-Depth-Models of Glaciogenic Sediments

    NASA Astrophysics Data System (ADS)

    Werner, J.; van der Bilt, W.; Tingley, M.

    2015-12-01

    Reconstructions of the late-Holocene climate rely heavily upon proxies that are assumed to be accurately dated by layer counting. All of these proxies, such as measurements of tree rings, ice cores, and varved lake sediments do carry some inherent dating uncertainty that is not always fully accounted for. Considerable advances could be achieved if time uncertainties were recognized and correctly modeled, also for proxies commonly treated as free of age model errors. Current approaches for accounting for time uncertainty are generally limited to repeating the reconstruction using each one of an ensemble of age models, thereby inflating the final estimated uncertainty - in effect, each possible age model is given equal weighting. Uncertainties can be reduced by exploiting the inferred space-time covariance structure of the climate to re-weight the possible age models. Werner and Tingley (2015) demonstrated how Bayesian hierarchical climate reconstruction models can be augmented to account for time-uncertain proxies. In their method, probabilities associated with the age models are formally updated within the Bayesian framework, thereby reducing uncertainties. Numerical experiments (Werner and Tingley 2015) show that updating the age model probabilities decreases uncertainty in the resulting reconstructions, as compared with the current de facto standard of sampling over all age models, provided there is sufficient information from other data sources in the spatial region of the time-uncertain proxy. We show how this novel method can be applied to high resolution, sub-annually sampled lacustrine sediment records to constrain their respective age depth models. The results help to quantify the signal content and extract the regionally representative signal. The single time series can then be used as the basis for a reconstruction of glacial activity. van der Bilt et al. in prep. Werner, J.P. and Tingley, M.P. Clim. Past (2015)

  3. Modeling the impact of flexible textile composites through multiscale and probabilistic methods

    NASA Astrophysics Data System (ADS)

    Nilakantan, Gaurav

    Flexible textile composites or fabrics comprised of materials such as Kevlar are used in impact and penetration resistant structures such as protective clothing for law enforcement and military personnel. The penetration response of these fabrics is probabilistic in nature and experimentally characterized through parameters such as the V0 and the V50 velocity. In this research a probabilistic computational framework is developed through which the entire V0- V100 velocity curve or probabilistic velocity response (PVR) curve can be numerically determined through a series of finite element (FE) impact simulations. Sources of variability that affect the PVR curve are isolated for investigation, which in this study is chosen as the statistical nature of yarn tensile strengths. Experimental tensile testing is conducted on spooled and fabric-extracted Kevlar yarns. The statistically characterized strengths are then mapped onto the yarns of the fabric FE model as part of the probabilistic computational framework. The effects of projectile characteristics such as size and shape on the fabric PVR curve are studied. A multiscale modeling technique entitled the Hybrid Element Analysis (HEA) is developed to reduce the computational requirements of a fabric model based on a yarn level architecture discretized with only solid elements. This technique combines into a single FE model both a local region of solid and shell element based yarn level architecture, and a global region of shell element based membrane level architecture, with impedance matched interfaces. The multiscale model is then incorporated into the probabilistic computational framework. A yarn model comprised of a filament level architecture is developed to investigate the feasibility of solid element based homogenized yarn models as well as the effect of filament spreading and inter-filament friction on the impact response. Results from preliminary experimental fabric impact testing are also presented. This

  4. The Dependent Poisson Race Model and Modeling Dependence in Conjoint Choice Experiments

    ERIC Educational Resources Information Center

    Ruan, Shiling; MacEachern, Steven N.; Otter, Thomas; Dean, Angela M.

    2008-01-01

    Conjoint choice experiments are used widely in marketing to study consumer preferences amongst alternative products. We develop a class of choice models, belonging to the class of Poisson race models, that describe a "random utility" which lends itself to a process-based description of choice. The models incorporate a dependence structure which…
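
    A minimal, independent-accumulator version of the Poisson race idea (the dependence structure that is the paper's contribution is omitted; rates and threshold are illustrative): each alternative accumulates Poisson-timed counts, and the first to reach the threshold is chosen, which also yields a decision time.

      import numpy as np

      def poisson_race_trial(rates, threshold=10, seed=None):
          """Return (winning alternative, decision time) for independent Poisson accumulators."""
          rng = np.random.default_rng(seed)
          # time to collect `threshold` counts = sum of exponential inter-count gaps
          finish = [rng.exponential(1.0 / r, size=threshold).sum() for r in rates]
          return int(np.argmin(finish)), float(min(finish))

      wins = np.bincount([poisson_race_trial([3.0, 2.0])[0] for _ in range(2000)], minlength=2)
      print(wins / wins.sum())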

  5. Dependence in probabilistic modeling, Dempster-Shafer theory, and probability bounds analysis.

    SciTech Connect

    Oberkampf, William Louis; Tucker, W. Troy; Zhang, Jianzhong; Ginzburg, Lev; Berleant, Daniel J.; Ferson, Scott; Hajagos, Janos; Nelsen, Roger B.

    2004-10-01

    This report summarizes methods to incorporate information (or lack of information) about inter-variable dependence into risk assessments that use Dempster-Shafer theory or probability bounds analysis to address epistemic and aleatory uncertainty. The report reviews techniques for simulating correlated variates for a given correlation measure and dependence model, computation of bounds on distribution functions under a specified dependence model, formulation of parametric and empirical dependence models, and bounding approaches that can be used when information about the intervariable dependence is incomplete. The report also reviews several of the most pervasive and dangerous myths among risk analysts about dependence in probabilistic models.

  6. Probabilistic Modeling of Loran-C for nonprecision approaches

    NASA Technical Reports Server (NTRS)

    Einhorn, John K.

    1987-01-01

    The overall idea of the research was to predict the errors to be encountered during an approach using available data from the U.S. Coast Guard and standard normal distribution probability analysis for a number of airports in the North East CONUS. The research consists of two parts: an analytical model that predicts the probability of an approach falling within a given standard, and a series of flight tests designed to test the validity of the model.

  7. Learning a Generative Probabilistic Grammar of Experience: A Process-Level Model of Language Acquisition

    ERIC Educational Resources Information Center

    Kolodny, Oren; Lotem, Arnon; Edelman, Shimon

    2015-01-01

    We introduce a set of biologically and computationally motivated design choices for modeling the learning of language, or of other types of sequential, hierarchically structured experience and behavior, and describe an implemented system that conforms to these choices and is capable of unsupervised learning from raw natural-language corpora. Given…

  8. Probabilistic earthquake early warning in complex earth models using prior sampling

    NASA Astrophysics Data System (ADS)

    Valentine, Andrew; Käufl, Paul; Trampert, Jeannot

    2016-04-01

    In an earthquake early warning (EEW) context, we must draw inferences from small, noisy seismic datasets within an extremely limited time-frame. Ideally, a probabilistic framework would be used, to recognise that available observations may be compatible with a range of outcomes, and analysis would be conducted in a theoretically-complete physical framework. However, implementing these requirements has been challenging, as they tend to increase computational demands beyond what is feasible on EEW timescales. We present a new approach, based on 'prior sampling', which implements probabilistic inversion as a two stage process, and can be used for EEW monitoring within a given region. First, a large set of synthetic data is computed for randomly-distributed seismic sources within the region. A learning algorithm is used to infer details of the probability distribution linking observations and model parameters (including location, magnitude, and focal mechanism). This procedure is computationally expensive, but can be conducted entirely before monitoring commences. In the second stage, as observations are obtained, the algorithm can be evaluated within milliseconds to output a probabilistic representation of the corresponding source model. We demonstrate that this gives robust results, and can be implemented using state-of-the-art 3D wave propagation simulations, and complex crustal structures.

  9. A Coupled Probabilistic Wake Vortex and Aircraft Response Prediction Model

    NASA Technical Reports Server (NTRS)

    Gloudemans, Thijs; Van Lochem, Sander; Ras, Eelco; Malissa, Joel; Ahmad, Nashat N.; Lewis, Timothy A.

    2016-01-01

    Wake vortex spacing standards, along with weather and runway occupancy time, restrict terminal area throughput and impose major constraints on the overall capacity and efficiency of the National Airspace System (NAS). For more than two decades, the National Aeronautics and Space Administration (NASA) has been conducting research on characterizing wake vortex behavior in order to develop fast-time wake transport and decay prediction models. It is expected that the models can be used in the systems-level design of advanced air traffic management (ATM) concepts that safely increase the capacity of the NAS. It is also envisioned that at a later stage of maturity, these models could potentially be used operationally, in ground-based spacing and scheduling systems as well as on the flight deck.

  10. Urban stormwater management planning with analytical probabilistic models

    SciTech Connect

    Adams, B.J.

    2000-07-01

    Understanding how to properly manage urban stormwater is a critical concern to civil and environmental engineers the world over. Mismanagement of stormwater and urban runoff results in flooding, erosion, and water quality problems. In an effort to develop better management techniques, engineers have come to rely on computer simulation and advanced mathematical modeling techniques to help plan and predict water system performance. This important book outlines a new method that uses probability tools to model how stormwater behaves and interacts in a combined- or single-system municipal water system. Complete with sample problems and case studies illustrating how concepts really work, the book presents a cost-effective, easy-to-master approach to analytical modeling of stormwater management systems.

  11. A probabilistic model for the fault tolerance of multilayer perceptrons.

    PubMed

    Merchawi, N S; Kumara, S T; Das, C R

    1996-01-01

    This paper presents a theoretical approach to determine the probability of misclassification of the multilayer perceptron (MLP) neural model, subject to weight errors. The types of applications considered are classification/recognition tasks involving binary input-output mappings. The analytical models are validated via simulation of a small illustrative example. The theoretical results, in agreement with simulation results, show that, for the example considered, Gaussian weight errors of standard deviation up to 22% of the weight value can be tolerated. The theoretical method developed here adds predictability to the fault tolerance capability of neural nets and shows that this capability is heavily dependent on the problem data.
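
    The question the analysis addresses can also be probed empirically: perturb a trained network's weights with zero-mean Gaussian noise whose standard deviation is a fraction of each weight and track the misclassification rate. A small scikit-learn sketch; the dataset, architecture and noise levels are arbitrary stand-ins.

      import numpy as np
      from sklearn.datasets import make_classification
      from sklearn.neural_network import MLPClassifier

      X, y = make_classification(n_samples=400, n_features=10, random_state=0)
      clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0).fit(X, y)
      rng = np.random.default_rng(0)

      for rel_sd in (0.0, 0.1, 0.22, 0.4):            # noise sd as a fraction of each weight
          errors = []
          for _ in range(20):                          # Monte Carlo repetitions
              original = [w.copy() for w in clf.coefs_]
              for w in clf.coefs_:                     # perturb weights in place
                  w += rng.normal(0.0, rel_sd * np.abs(w))
              errors.append(1.0 - clf.score(X, y))
              for w, ow in zip(clf.coefs_, original):  # restore original weights
                  w[...] = ow
          print(f"relative sd {rel_sd:.2f}: mean misclassification {np.mean(errors):.3f}")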

  12. Model-based choices involve prospective neural activity

    PubMed Central

    Doll, Bradley B.; Duncan, Katherine D.; Simon, Dylan A.; Shohamy, Daphna; Daw, Nathaniel D.

    2015-01-01

    Decisions may arise via “model-free” repetition of previously reinforced actions, or by “model-based” evaluation, which is widely thought to follow from prospective anticipation of action consequences using a learned map or model. While choices and neural correlates of decision variables sometimes reflect knowledge of their consequences, it remains unclear whether this actually arises from prospective evaluation. Using functional MRI and a sequential reward-learning task in which paths contained decodable object categories, we found that humans’ model-based choices were associated with neural signatures of future paths observed at decision time, suggesting a prospective mechanism for choice. Prospection also covaried with the degree of model-based influences on neural correlates of decision variables, and was inversely related to prediction error signals thought to underlie model-free learning. These results dissociate separate mechanisms underlying model-based and model-free evaluation and support the hypothesis that model-based influences on choices and neural decision variables result from prospection. PMID:25799041

  13. Probabilistic Model of Fault Detection in Quantum Circuits

    NASA Astrophysics Data System (ADS)

    Banerjee, A.; Pathak, A.

    Since the introduction of quantum computation, several protocols (such as quantum cryptography, quantum algorithms, and quantum teleportation) have established quantum computing as a superior future technology. Each of these processes involves quantum circuits, which are prone to different kinds of faults. Consequently, it is important to verify whether the circuit hardware is defective or not. The systematic procedure to do so is known as fault testing. Normally, testing is done by providing a set of valid input states, measuring the corresponding output states, and comparing them with the expected output states of the perfect (faultless) circuit. This particular set of input vectors is known as a test set [6]. If there exists a fault, then the next step would be to find the exact location and nature of the defect. This is known as fault localization. A model that explains the logical or functional faults in the circuit is a fault model. Conventional fault models include (i) stuck-at faults, (ii) bridge faults, and (iii) delay faults. These fault models have been rigorously studied for conventional irreversible circuits. But with the advent of reversible classical computing and quantum computing, it has become important to enlarge the domain of the study of test vectors.

  14. Probabilistic Generalization of Penna Ageing Model and the Oldest Old

    NASA Astrophysics Data System (ADS)

    Stauffer, D.

    Using a 1995 method of Thoms et al., the traditional Penna model of biological ageing is modified such that there is no longer an absolute maximum life span; instead, our Monte Carlo data are similar to real demographic data collected by Thatcher et al. for rich countries.

  15. Boolean Queries and Term Dependencies in Probabilistic Retrieval Models.

    ERIC Educational Resources Information Center

    Croft, W. Bruce

    1986-01-01

    Proposes approach to integrating Boolean and statistical systems where Boolean queries are interpreted as a means of specifying term dependencies in relevant set of documents. Highlights include series of retrieval experiments designed to test retrieval strategy based on term dependence model and relation of results to other work. (18 references)…

  16. Testing for ontological errors in probabilistic forecasting models of natural systems.

    PubMed

    Marzocchi, Warner; Jordan, Thomas H

    2014-08-19

    Probabilistic forecasting models describe the aleatory variability of natural systems as well as our epistemic uncertainty about how the systems work. Testing a model against observations exposes ontological errors in the representation of a system and its uncertainties. We clarify several conceptual issues regarding the testing of probabilistic forecasting models for ontological errors: the ambiguity of the aleatory/epistemic dichotomy, the quantification of uncertainties as degrees of belief, the interplay between Bayesian and frequentist methods, and the scientific pathway for capturing predictability. We show that testability of the ontological null hypothesis derives from an experimental concept, external to the model, that identifies collections of data, observed and not yet observed, that are judged to be exchangeable when conditioned on a set of explanatory variables. These conditional exchangeability judgments specify observations with well-defined frequencies. Any model predicting these behaviors can thus be tested for ontological error by frequentist methods; e.g., using P values. In the forecasting problem, prior predictive model checking, rather than posterior predictive checking, is desirable because it provides more severe tests. We illustrate experimental concepts using examples from probabilistic seismic hazard analysis. Severe testing of a model under an appropriate set of experimental concepts is the key to model validation, in which we seek to know whether a model replicates the data-generating process well enough to be sufficiently reliable for some useful purpose, such as long-term seismic forecasting. Pessimistic views of system predictability fail to recognize the power of this methodology in separating predictable behaviors from those that are not. PMID:25097265

  17. Testing for ontological errors in probabilistic forecasting models of natural systems

    PubMed Central

    Marzocchi, Warner; Jordan, Thomas H.

    2014-01-01

    Probabilistic forecasting models describe the aleatory variability of natural systems as well as our epistemic uncertainty about how the systems work. Testing a model against observations exposes ontological errors in the representation of a system and its uncertainties. We clarify several conceptual issues regarding the testing of probabilistic forecasting models for ontological errors: the ambiguity of the aleatory/epistemic dichotomy, the quantification of uncertainties as degrees of belief, the interplay between Bayesian and frequentist methods, and the scientific pathway for capturing predictability. We show that testability of the ontological null hypothesis derives from an experimental concept, external to the model, that identifies collections of data, observed and not yet observed, that are judged to be exchangeable when conditioned on a set of explanatory variables. These conditional exchangeability judgments specify observations with well-defined frequencies. Any model predicting these behaviors can thus be tested for ontological error by frequentist methods; e.g., using P values. In the forecasting problem, prior predictive model checking, rather than posterior predictive checking, is desirable because it provides more severe tests. We illustrate experimental concepts using examples from probabilistic seismic hazard analysis. Severe testing of a model under an appropriate set of experimental concepts is the key to model validation, in which we seek to know whether a model replicates the data-generating process well enough to be sufficiently reliable for some useful purpose, such as long-term seismic forecasting. Pessimistic views of system predictability fail to recognize the power of this methodology in separating predictable behaviors from those that are not. PMID:25097265

  18. A Ballistic Model of Choice Response Time

    ERIC Educational Resources Information Center

    Brown, Scott; Heathcote, Andrew

    2005-01-01

    Almost all models of response time (RT) use a stochastic accumulation process. To account for the benchmark RT phenomena, researchers have found it necessary to include between-trial variability in the starting point and/or the rate of accumulation, both in linear (R. Ratcliff & J. N. Rouder, 1998) and nonlinear (M. Usher & J. L. McClelland, 2001)…

  19. Probabilistic uncertainty analysis of epidemiological modeling to guide public health intervention policy.

    PubMed

    Gilbert, Jennifer A; Meyers, Lauren Ancel; Galvani, Alison P; Townsend, Jeffrey P

    2014-03-01

    Mathematical modeling of disease transmission has provided quantitative predictions for health policy, facilitating the evaluation of epidemiological outcomes and the cost-effectiveness of interventions. However, typical sensitivity analyses of deterministic dynamic infectious disease models focus on model architecture and the relative importance of parameters but neglect parameter uncertainty when reporting model predictions. Consequently, model results that identify point estimates of intervention levels necessary to terminate transmission yield limited insight into the probability of success. We apply probabilistic uncertainty analysis to a dynamic model of influenza transmission and assess global uncertainty in outcome. We illustrate that when parameter uncertainty is not incorporated into outcome estimates, levels of vaccination and treatment predicted to prevent an influenza epidemic will only have an approximately 50% chance of terminating transmission and that sensitivity analysis alone is not sufficient to obtain this information. We demonstrate that accounting for parameter uncertainty yields probabilities of epidemiological outcomes based on the degree to which data support the range of model predictions. Unlike typical sensitivity analyses of dynamic models that only address variation in parameters, the probabilistic uncertainty analysis described here enables modelers to convey the robustness of their predictions to policy makers, extending the power of epidemiological modeling to improve public health. PMID:24593920
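
    The following is an illustrative sketch of the general idea, not the authors' influenza model: instead of checking a point estimate, uncertainty in the basic reproduction number is propagated through a simple reproduction-number threshold argument to obtain the probability that a given vaccination coverage terminates transmission. The distribution for R0 and the efficacy value are invented.

```python
import numpy as np

rng = np.random.default_rng(2)
n_draws = 100_000

# Hypothetical parameter uncertainty: R0 lognormally distributed around 1.8
r0 = rng.lognormal(mean=np.log(1.8), sigma=0.15, size=n_draws)

def outbreak_prevented(r0_sample, coverage, efficacy=0.8):
    # Transmission is terminated when the effective reproduction number falls below 1
    return r0_sample * (1.0 - efficacy * coverage) < 1.0

for coverage in (0.4, 0.55, 0.7):
    p = outbreak_prevented(r0, coverage).mean()
    print(f"coverage {coverage:.0%}: P(transmission terminated) ~ {p:.2f}")

# The point-estimate analogue would simply check outbreak_prevented(1.8, coverage),
# which hides how likely success actually is under parameter uncertainty.
```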

  20. A family of analytical probabilistic models for urban stormwater management planning

    SciTech Connect

    Papa, F.; Adams, B.J.; Guo, Y.

    1998-07-01

    This paper presents the synthesis of over fifteen years of research on the topic of analytical probabilistic models, as an alternative approach to continuous simulation, that have been derived for the performance analysis of urban runoff quantity and quality control systems. These models overcome the limitations imposed by single event modeling through the use of long term rainfall records and are significantly more computationally efficient and less cumbersome than other methods of continuous analysis. These attributes promote the comprehensive analysis of drainage system design alternatives at the screening and planning levels.

  1. Rock penetration : finite element sensitivity and probabilistic modeling analyses.

    SciTech Connect

    Fossum, Arlo Frederick

    2004-08-01

    This report summarizes numerical analyses conducted to assess the relative importance on penetration depth calculations of rock constitutive model physics features representing the presence of microscale flaws such as porosity and networks of microcracks and rock mass structural features. Three-dimensional, nonlinear, transient dynamic finite element penetration simulations are made with a realistic geomaterial constitutive model to determine which features have the most influence on penetration depth calculations. A baseline penetration calculation is made with a representative set of material parameters evaluated from measurements made from laboratory experiments conducted on a familiar sedimentary rock. Then, a sequence of perturbations of various material parameters allows an assessment to be made of the main penetration effects. A cumulative probability distribution function is calculated with the use of an advanced reliability method that makes use of this sensitivity database, probability density functions, and coefficients of variation of the key controlling parameters for penetration depth predictions. Thus the variability of the calculated penetration depth is known as a function of the variability of the input parameters. This simulation modeling capability should impact significantly the tools that are needed to design enhanced penetrator systems, support weapons effects studies, and directly address proposed HDBT defeat scenarios.

  2. Loss Aversion and Inhibition in Dynamical Models of Multialternative Choice

    ERIC Educational Resources Information Center

    Usher, Marius; McClelland, James L.

    2004-01-01

    The roles of loss aversion and inhibition among alternatives are examined in models of the similarity, compromise, and attraction effects that arise in choices among 3 alternatives differing on 2 attributes. R. M. Roe, J. R. Busemeyer, and J. T. Townsend (2001) have proposed a linear model in which effects previously attributed to loss aversion…

  3. The Influence of Role Models on Women's Career Choices

    ERIC Educational Resources Information Center

    Quimby, Julie L.; DeSantis, Angela M.

    2006-01-01

    This study of 368 female undergraduates examined self-efficacy and role model influence as predictors of career choice across J. L. Holland's (1997) 6 RIASEC (Realistic, Investigative, Artistic, Social, Enterprising, Conventional) types. Findings showed that levels of self-efficacy and role model influence differed across Holland types. Multiple…

  4. Kinetic modeling based probabilistic segmentation for molecular images.

    PubMed

    Saad, Ahmed; Hamarneh, Ghassan; Möller, Torsten; Smith, Ben

    2008-01-01

    We propose a semi-supervised, kinetic modeling based segmentation technique for molecular imaging applications. It is an iterative, self-learning algorithm based on uncertainty principles, designed to alleviate low signal-to-noise ratio (SNR) and partial volume effect (PVE) problems. Synthetic fluorodeoxyglucose (FDG) and simulated Raclopride dynamic positron emission tomography (dPET) brain images with excessive noise levels are used to validate our algorithm. We show, qualitatively and quantitatively, that our algorithm outperforms state-of-the-art techniques in identifying different functional regions and recovering the kinetic parameters.

  5. Probabilistic modelling of European consumer exposure to cosmetic products.

    PubMed

    McNamara, C; Rohan, D; Golden, D; Gibney, M; Hall, B; Tozer, S; Safford, B; Coroama, M; Leneveu-Duchemin, M C; Steiling, W

    2007-11-01

    In this study, we describe the statistical analysis of the usage profile of the European population for seven cosmetic products. The aim of the study was to construct a reliable model of exposure of the European population from use of the selected products: body lotion, shampoo, deodorant spray, deodorant non-spray, facial moisturiser, lipstick and toothpaste. The first step in this process was to gather reliable data on consumer usage patterns of the products. These data were sourced from a combination of market information databases and a controlled product use study by the trade association Colipa. The market information study contained a large number of subjects, in total 44,100 households and 18,057 habitual users (males and females) of the studied products, in five European countries. The data sets were then combined to generate a realistic distribution of frequency of use of each product, combined with a distribution of the amount of product used on each occasion, using the CREMe software. A Monte Carlo method was used to combine the data sets. The result was a new model of European exposure to cosmetic products.
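
    A minimal sketch of the Monte Carlo combination step described above: draw usage frequency and amount per use from separate distributions and combine them into an exposure distribution. The distributions and numbers are invented placeholders, not the survey-derived inputs used with the CREMe software.

```python
import numpy as np

rng = np.random.default_rng(3)
n_subjects = 50_000

uses_per_day = rng.poisson(lam=1.2, size=n_subjects)                # frequency of use
amount_per_use = rng.lognormal(mean=np.log(1.5), sigma=0.4,          # grams per application
                               size=n_subjects)
body_weight = rng.normal(70.0, 12.0, size=n_subjects).clip(40.0)     # kg

exposure = uses_per_day * amount_per_use / body_weight                # g/kg bw/day
print("median exposure:", np.median(exposure))
print("95th percentile:", np.percentile(exposure, 95))
```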

  6. Probabilistic Multi-Factor Interaction Model for Complex Material Behavior

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Abumeri, Galib H.

    2008-01-01

    The Multi-Factor Interaction Model (MFIM) is used to evaluate the divot weight (foam weight ejected) from the launch external tanks. The multi-factor has sufficient degrees of freedom to evaluate a large number of factors that may contribute to the divot ejection. It also accommodates all interactions by its product form. Each factor has an exponent that satisfies only two points: the initial and final points. The exponent describes a monotonic path from the initial condition to the final. The exponent values are selected so that the described path makes sense in the absence of experimental data. In the present investigation, the data used were obtained by testing simulated specimens in launching conditions. Results show that the MFIM is an effective method of describing the divot weight ejected under the conditions investigated.

  7. Probabilistic Modeling Of Ocular Biomechanics In VIIP: Risk Stratification

    NASA Technical Reports Server (NTRS)

    Feola, A.; Myers, J. G.; Raykin, J.; Nelson, E. S.; Mulugeta, L.; Samuels, B.; Ethier, C. R.

    2016-01-01

    Visual Impairment and Intracranial Pressure (VIIP) syndrome is a major health concern for long-duration space missions. Currently, it is thought that a cephalad fluid shift in microgravity causes elevated intracranial pressure (ICP) that is transmitted along the optic nerve sheath (ONS). We hypothesize that this in turn leads to alteration and remodeling of connective tissue in the posterior eye which impacts vision. Finite element (FE) analysis is a powerful tool for examining the effects of mechanical loads in complex geometries. Our goal is to build an FE analysis framework to understand the response of the lamina cribrosa and optic nerve head to elevations in ICP in VIIP. To simulate the effects of different pressures on tissues in the posterior eye, we developed a geometric model of the posterior eye and optic nerve sheath and used a Latin hypercube sampling/partial rank correlation coefficient (LHS/PRCC) approach to assess the influence of uncertainty in our input parameters (i.e. pressures and material properties) on the peak strains within the retina, lamina cribrosa and optic nerve. The LHS/PRCC approach was repeated for three relevant ICP ranges, corresponding to upright and supine posture on earth, and microgravity [1]. At each ICP condition we used intraocular pressure (IOP) and mean arterial pressure (MAP) measurements of in-flight astronauts provided by Lifetime Surveillance of Astronaut Health Program, NASA Johnson Space Center. The lamina cribrosa, optic nerve, retinal vessel and retina were modeled as linear-elastic materials, while other tissues were modeled as a Mooney-Rivlin solid (representing ground substance, stiffness parameter c1) with embedded collagen fibers (stiffness parameters c3, c4 and c5). Geometry creation and mesh generation were done in Gmsh [2], while FEBio was used for all FE simulations [3]. The LHS/PRCC approach resulted in correlation coefficients in the range of 1. To assess the relative influence of the uncertainty in an input parameter on

  8. TAFV Alternative Fuels and Vehicles Choice Model Documentation

    SciTech Connect

    Greene, D.L.

    2001-07-27

    A model for predicting choice among alternative fuels and alternative vehicle technologies for light-duty motor vehicles is derived. The nested multinomial logit (NML) mathematical framework is used. Calibration of the model is based on information in the existing literature and on deduction from assumptions about a small number of key parameters, such as the value of time and discount rates. A spreadsheet model has been developed for calibration and preliminary testing of the model.
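
    A generic implementation of nested multinomial logit choice probabilities, the mathematical framework named above. The utilities, nest structure, and nesting parameters below are invented for illustration; they are not the calibrated TAFV values.

```python
import numpy as np

def nested_logit_probs(utilities, nests, lambdas):
    """utilities: dict alt -> systematic utility; nests: dict nest -> list of alts;
    lambdas: dict nest -> nesting (dissimilarity) parameter in (0, 1]."""
    inclusive, within = {}, {}
    for nest, alts in nests.items():
        lam = lambdas[nest]
        expu = {a: np.exp(utilities[a] / lam) for a in alts}
        denom = sum(expu.values())
        within[nest] = {a: expu[a] / denom for a in alts}     # P(alt | nest)
        inclusive[nest] = lam * np.log(denom)                  # inclusive value of the nest
    nest_denom = sum(np.exp(iv) for iv in inclusive.values())
    probs = {}
    for nest, alts in nests.items():
        p_nest = np.exp(inclusive[nest]) / nest_denom          # P(nest)
        for a in alts:
            probs[a] = p_nest * within[nest][a]
    return probs

utilities = {"gasoline": -0.2, "ethanol": -0.5, "hev": -0.4, "bev": -0.9}
nests = {"liquid_fuel": ["gasoline", "ethanol"], "electric": ["hev", "bev"]}
lambdas = {"liquid_fuel": 0.6, "electric": 0.7}
print(nested_logit_probs(utilities, nests, lambdas))
```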

  9. Probabilistic Multi-Factor Interaction Model for Complex Material Behavior

    NASA Technical Reports Server (NTRS)

    Abumeri, Galib H.; Chamis, Christos C.

    2010-01-01

    Complex material behavior is represented by a single equation of product form to account for interaction among the various factors. The factors are selected by the physics of the problem and the environment that the model is to represent. For example, different factors will be required for each to represent temperature, moisture, erosion, corrosion, etc. It is important that the equation represent the physics of the behavior in its entirety accurately. The Multi-Factor Interaction Model (MFIM) is used to evaluate the divot weight (foam weight ejected) from the external launch tanks. The multi-factor has sufficient degrees of freedom to evaluate a large number of factors that may contribute to the divot ejection. It also accommodates all interactions by its product form. Each factor has an exponent that satisfies only two points - the initial and final points. The exponent describes a monotonic path from the initial condition to the final. The exponent values are selected so that the described path makes sense in the absence of experimental data. In the present investigation, the data used were obtained by testing simulated specimens in launching conditions. Results show that the MFIM is an effective method of describing the divot weight ejected under the conditions investigated. The problem lies in how to represent the divot weight with a single equation. A unique solution to this problem is a multi-factor equation of product form. Each factor is of the following form: (1 - x_i/x_f)^e_i, where x_i is the initial value, usually at ambient conditions, x_f the final value, and e_i the exponent that makes the represented curve unimodal and ensures that it meets the initial and final values. The exponents are either evaluated by test data or by technical judgment. A minor disadvantage may be the selection of exponents in the absence of any empirical data. This form has been used successfully in describing the foam ejected in simulated space environmental conditions. Seven factors were required
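
    A minimal numeric sketch of the product form quoted above: each factor contributes a term (1 - x/x_f)^e, and the response is the product of all terms scaled by a baseline value. The factor names, bounds, and exponents are invented for illustration, not the calibrated divot-weight values.

```python
import numpy as np

def mfim_response(baseline, factors):
    """factors: list of (current_value, final_value, exponent) triples."""
    terms = [(1.0 - x / x_f) ** e for x, x_f, e in factors]
    return baseline * np.prod(terms)

# Hypothetical factors: (current value, final value, exponent)
factors = [
    (120.0, 400.0, 0.5),   # e.g. a temperature-like factor
    (0.3,   1.0,   1.2),   # e.g. a normalised pressure-like factor
    (5.0,   20.0,  0.8),   # e.g. an exposure-time-like factor
]
print("predicted response:", mfim_response(baseline=10.0, factors=factors))
```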

  10. A probabilistic model of emphysema based on granulometry analysis

    NASA Astrophysics Data System (ADS)

    Marcos, J. V.; Nava, R.; Cristobal, G.; Munoz-Barrutia, A.; Escalante-Ramírez, B.; Ortiz-de-Solórzano, C.

    2013-11-01

    Emphysema is associated with the destruction of lung parenchyma, resulting in abnormal enlargement of airspaces. Accurate quantification of emphysema is required for a better understanding of the disease as well as for the assessment of drugs and treatments. In the present study, a novel method for emphysema characterization from histological lung images is proposed. Elastase-induced mice were used to simulate the effect of emphysema on the lungs. A database composed of 50 normal and 50 emphysematous lung patches of size 512 x 512 pixels was used in our experiments. The purpose is to automatically identify those patches containing emphysematous tissue. The proposed approach is based on the use of granulometry analysis, which provides the pattern spectrum describing the distribution of airspaces in the lung region under evaluation. The profile of the spectrum was summarized by a set of statistical features. A logistic regression model was then used to estimate the probability for a patch to be emphysematous from this feature set. An accuracy of 87% was achieved by our method in the classification between normal and emphysematous samples. This result shows the utility of our granulometry-based method to quantify the lesions due to emphysema.

  11. Probabilistic Structural Analysis and Reliability Using NESSUS With Implemented Material Strength Degradation Model

    NASA Technical Reports Server (NTRS)

    Bast, Callie C.; Jurena, Mark T.; Godines, Cody R.; Chamis, Christos C. (Technical Monitor)

    2001-01-01

    This project included both research and education objectives. The goal of this project was to advance innovative research and education objectives in theoretical and computational probabilistic structural analysis, reliability, and life prediction for improved reliability and safety of structural components of aerospace and aircraft propulsion systems. Research and education partners included Glenn Research Center (GRC) and Southwest Research Institute (SwRI) along with the University of Texas at San Antonio (UTSA). SwRI enhanced the NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) code and provided consulting support for NESSUS-related activities at UTSA. NASA funding supported three undergraduate students, two graduate students, a summer course instructor and the Principal Investigator. Matching funds from UTSA provided for the purchase of additional equipment for the enhancement of the Advanced Interactive Computational SGI Lab established during the first year of this Partnership Award to conduct the probabilistic finite element summer courses. The research portion of this report presents the culmination of work performed through the use of the probabilistic finite element program NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) and an embedded Material Strength Degradation (MSD) model. Probabilistic structural analysis provided for quantification of uncertainties associated with the design, thus enabling increased system performance and reliability. The structure examined was a Space Shuttle Main Engine (SSME) fuel turbopump blade. The blade material analyzed was Inconel 718, since the MSD model was previously calibrated for this material. Reliability analysis encompassing the effects of high temperature and high-cycle fatigue yielded a reliability value of 0.99978 using a fully correlated random field for the blade thickness. The reliability did not change significantly for a change in distribution type except for a change in

  12. A probabilistic method for constructing wave time-series at inshore locations using model scenarios

    USGS Publications Warehouse

    Long, Joseph W.; Plant, Nathaniel G.; Dalyander, P. Soupy; Thompson, David M.

    2014-01-01

    Continuous time-series of wave characteristics (height, period, and direction) are constructed using a base set of model scenarios and simple probabilistic methods. This approach utilizes an archive of computationally intensive, highly spatially resolved numerical wave model output to develop time-series of historical or future wave conditions without performing additional, continuous numerical simulations. The archive of model output contains wave simulations from a set of model scenarios derived from an offshore wave climatology. Time-series of wave height, period, direction, and associated uncertainties are constructed at locations included in the numerical model domain. The confidence limits are derived using statistical variability of oceanographic parameters contained in the wave model scenarios. The method was applied to a region in the northern Gulf of Mexico and assessed using wave observations at 12 m and 30 m water depths. Prediction skill for significant wave height is 0.58 and 0.67 at the 12 m and 30 m locations, respectively, with similar performance for wave period and direction. The skill of this simplified, probabilistic time-series construction method is comparable to existing large-scale, high-fidelity operational wave models but provides higher spatial resolution output at low computational expense. The constructed time-series can be developed to support a variety of applications including climate studies and other situations where a comprehensive survey of wave impacts on the coastal area is of interest.

  13. An application of probabilistic safety assessment methods to model aircraft systems and accidents

    SciTech Connect

    Martinez-Guridi, G.; Hall, R.E.; Fullwood, R.R.

    1998-08-01

    A case study modeling the thrust reverser system (TRS) in the context of the fatal accident of a Boeing 767 is presented to illustrate the application of Probabilistic Safety Assessment methods. A simplified risk model consisting of an event tree with supporting fault trees was developed to represent the progression of the accident, taking into account the interaction between the TRS and the operating crew during the accident, and the findings of the accident investigation. A feasible sequence of events leading to the fatal accident was identified. Several insights about the TRS and the accident were obtained by applying PSA methods. Changes proposed for the TRS also are discussed.

  14. Probabilistic topic modeling for the analysis and classification of genomic sequences

    PubMed Central

    2015-01-01

    Background Studies on genomic sequences for classification and taxonomic identification have a leading role in the biomedical field and in the analysis of biodiversity. These studies are focusing on the so-called barcode genes, representing a well defined region of the whole genome. Recently, alignment-free techniques are gaining more importance because they are able to overcome the drawbacks of sequence alignment techniques. In this paper a new alignment-free method for DNA sequences clustering and classification is proposed. The method is based on k-mers representation and text mining techniques. Methods The presented method is based on Probabilistic Topic Modeling, a statistical technique originally proposed for text documents. Probabilistic topic models are able to find in a document corpus the topics (recurrent themes) characterizing classes of documents. This technique, applied on DNA sequences representing the documents, exploits the frequency of fixed-length k-mers and builds a generative model for a training group of sequences. This generative model, obtained through the Latent Dirichlet Allocation (LDA) algorithm, is then used to classify a large set of genomic sequences. Results and conclusions We performed classification of over 7000 16S DNA barcode sequences taken from Ribosomal Database Project (RDP) repository, training probabilistic topic models. The proposed method is compared to the RDP tool and Support Vector Machine (SVM) classification algorithm in a extensive set of trials using both complete sequences and short sequence snippets (from 400 bp to 25 bp). Our method reaches very similar results to RDP classifier and SVM for complete sequences. The most interesting results are obtained when short sequence snippets are considered. In these conditions the proposed method outperforms RDP and SVM with ultra short sequences and it exhibits a smooth decrease of performance, at every taxonomic level, when the sequence length is decreased. PMID:25916734
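
    A compact sketch of the k-mer plus topic-model pipeline described above, using scikit-learn's LDA implementation on character k-mers. The sequences and the value of k are placeholders; the study used 16S barcode sequences from the RDP repository.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

sequences = [
    "ACGTACGTGGCCTTAAGGCC",
    "ACGTACGTGGCCTTAAGGCA",
    "TTGGCCAAGGTTCCAAGGTT",
    "TTGGCCAAGGTTCCAAGGTA",
]

# Represent each sequence by its fixed-length k-mer counts (k = 4 here)
vectorizer = CountVectorizer(analyzer="char", ngram_range=(4, 4), lowercase=False)
kmer_counts = vectorizer.fit_transform(sequences)

# Fit a generative topic model over the k-mer "vocabulary"
lda = LatentDirichletAllocation(n_components=2, random_state=0)
topic_mixtures = lda.fit_transform(kmer_counts)

# Each row is a sequence's topic mixture; these mixtures can then feed a classifier
print(topic_mixtures.round(2))
```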

  15. Choice as a Global Language in Local Practice: A Mixed Model of School Choice in Taiwan

    ERIC Educational Resources Information Center

    Mao, Chin-Ju

    2015-01-01

    This paper uses school choice policy as an example to demonstrate how local actors adopt, mediate, translate, and reformulate "choice" as neo-liberal rhetoric informing education reform. Complex processes exist between global policy about school choice and the local practice of school choice. Based on the theoretical sensibility of…

  16. SHM-Based Probabilistic Fatigue Life Prediction for Bridges Based on FE Model Updating.

    PubMed

    Lee, Young-Joo; Cho, Soojin

    2016-01-01

    Fatigue life prediction for a bridge should be based on the current condition of the bridge, and various sources of uncertainty, such as material properties, anticipated vehicle loads and environmental conditions, make the prediction very challenging. This paper presents a new approach for probabilistic fatigue life prediction for bridges using finite element (FE) model updating based on structural health monitoring (SHM) data. Recently, various types of SHM systems have been used to monitor and evaluate the long-term structural performance of bridges. For example, SHM data can be used to estimate the degradation of an in-service bridge, which makes it possible to update the initial FE model. The proposed method consists of three steps: (1) identifying the modal properties of a bridge, such as mode shapes and natural frequencies, based on the ambient vibration under passing vehicles; (2) updating the structural parameters of an initial FE model using the identified modal properties; and (3) predicting the probabilistic fatigue life using the updated FE model. The proposed method is demonstrated by application to a numerical model of a bridge, and the impact of FE model updating on the bridge fatigue life is discussed. PMID:26950125
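
    A toy illustration of step (2), FE model updating: adjust the stiffness parameters of a simple two-degree-of-freedom spring-mass model so that its natural frequencies match frequencies "identified" from monitoring data. The bridge FE model in the paper is far more detailed; the masses, stiffnesses, and target frequencies here are invented.

```python
import numpy as np
from scipy.linalg import eigh
from scipy.optimize import minimize

m1, m2 = 1000.0, 800.0                        # kg (hypothetical lumped masses)
M = np.diag([m1, m2])

def natural_frequencies(k1, k2):
    K = np.array([[k1 + k2, -k2], [-k2, k2]])
    eigvals = eigh(K, M, eigvals_only=True)    # generalized eigenvalue problem K v = w^2 M v
    return np.sqrt(eigvals) / (2.0 * np.pi)    # Hz

measured = np.array([2.1, 5.4])                # "identified" modal frequencies (Hz)

def objective(log_k):
    k1, k2 = np.exp(log_k)
    return np.sum((natural_frequencies(k1, k2) - measured) ** 2)

result = minimize(objective, x0=np.log([1.0e6, 1.0e6]), method="Nelder-Mead")
k1, k2 = np.exp(result.x)
print("updated stiffnesses:", k1, k2)
print("model frequencies:  ", natural_frequencies(k1, k2))
```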

  17. SHM-Based Probabilistic Fatigue Life Prediction for Bridges Based on FE Model Updating

    PubMed Central

    Lee, Young-Joo; Cho, Soojin

    2016-01-01

    Fatigue life prediction for a bridge should be based on the current condition of the bridge, and various sources of uncertainty, such as material properties, anticipated vehicle loads and environmental conditions, make the prediction very challenging. This paper presents a new approach for probabilistic fatigue life prediction for bridges using finite element (FE) model updating based on structural health monitoring (SHM) data. Recently, various types of SHM systems have been used to monitor and evaluate the long-term structural performance of bridges. For example, SHM data can be used to estimate the degradation of an in-service bridge, which makes it possible to update the initial FE model. The proposed method consists of three steps: (1) identifying the modal properties of a bridge, such as mode shapes and natural frequencies, based on the ambient vibration under passing vehicles; (2) updating the structural parameters of an initial FE model using the identified modal properties; and (3) predicting the probabilistic fatigue life using the updated FE model. The proposed method is demonstrated by application to a numerical model of a bridge, and the impact of FE model updating on the bridge fatigue life is discussed. PMID:26950125

  18. Probabilistic modeling of percutaneous absorption for risk-based exposure assessments and transdermal drug delivery.

    SciTech Connect

    Ho, Clifford Kuofei

    2004-06-01

    Chemical transport through human skin can play a significant role in human exposure to toxic chemicals in the workplace, as well as to chemical/biological warfare agents in the battlefield. The viability of transdermal drug delivery also relies on chemical transport processes through the skin. Models of percutaneous absorption are needed for risk-based exposure assessments and drug-delivery analyses, but previous mechanistic models have been largely deterministic. A probabilistic, transient, three-phase model of percutaneous absorption of chemicals has been developed to assess the relative importance of uncertain parameters and processes that may be important to risk-based assessments. Penetration routes through the skin that were modeled include the following: (1) intercellular diffusion through the multiphase stratum corneum; (2) aqueous-phase diffusion through sweat ducts; and (3) oil-phase diffusion through hair follicles. Uncertainty distributions were developed for the model parameters, and a Monte Carlo analysis was performed to simulate probability distributions of mass fluxes through each of the routes. Sensitivity analyses using stepwise linear regression were also performed to identify model parameters that were most important to the simulated mass fluxes at different times. This probabilistic analysis of percutaneous absorption (PAPA) method has been developed to improve risk-based exposure assessments and transdermal drug-delivery analyses, where parameters and processes can be highly uncertain.

  19. Probabilistic Modeling of Conformational Space for 3D Machine Learning Approaches.

    PubMed

    Jahn, Andreas; Hinselmann, Georg; Fechner, Nikolas; Henneges, Carsten; Zell, Andreas

    2010-05-17

    We present a new probabilistic encoding of the conformational space of a molecule that allows for the integration into common similarity calculations. The method uses distance profiles of flexible atom-pairs and computes generative models that describe the distance distribution in the conformational space. The generative models permit the use of probabilistic kernel functions and, therefore, our approach can be used to extend existing 3D molecular kernel functions, as applied in support vector machines, to build QSAR models. The resulting kernels are valid 4D kernel functions and reduce the dependency of the model quality on suitable conformations of the molecules. We showed in several experiments the robust performance of the 4D kernel function, which was extended by our approach, in comparison to the original 3D-based kernel function. The new method compares the conformational space of two molecules within one kernel evaluation. Hence, the number of kernel evaluations is significantly reduced in comparison to common kernel-based conformational space averaging techniques. Additionally, the performance gain of the extended model correlates with the flexibility of the data set and enables an a priori estimation of the model improvement.

  20. SHM-Based Probabilistic Fatigue Life Prediction for Bridges Based on FE Model Updating.

    PubMed

    Lee, Young-Joo; Cho, Soojin

    2016-03-02

    Fatigue life prediction for a bridge should be based on the current condition of the bridge, and various sources of uncertainty, such as material properties, anticipated vehicle loads and environmental conditions, make the prediction very challenging. This paper presents a new approach for probabilistic fatigue life prediction for bridges using finite element (FE) model updating based on structural health monitoring (SHM) data. Recently, various types of SHM systems have been used to monitor and evaluate the long-term structural performance of bridges. For example, SHM data can be used to estimate the degradation of an in-service bridge, which makes it possible to update the initial FE model. The proposed method consists of three steps: (1) identifying the modal properties of a bridge, such as mode shapes and natural frequencies, based on the ambient vibration under passing vehicles; (2) updating the structural parameters of an initial FE model using the identified modal properties; and (3) predicting the probabilistic fatigue life using the updated FE model. The proposed method is demonstrated by application to a numerical model of a bridge, and the impact of FE model updating on the bridge fatigue life is discussed.

  1. Detailed probabilistic modelling of cell inactivation by ionizing radiations of different qualities: the model and its applications.

    PubMed

    Kundrát, Pavel

    2009-03-01

    The probabilistic two-stage model of cell killing by ionizing radiation makes it possible to represent both damage induction by radiation and its repair by the cell. The model properties and applications, as well as a possible interpretation of the underlying damage classification, are discussed. Analyses of published survival data for V79 hamster cells irradiated by protons and He, C, O, and Ne ions are reported, quantifying the variations in radiation quality with increasing charge and linear energy transfer of the ions.

  2. Modeling Multiple Response Processes in Judgment and Choice

    ERIC Educational Resources Information Center

    Bockenholt, Ulf

    2012-01-01

    In this article, I show how item response models can be used to capture multiple response processes in psychological applications. Intuitive and analytical responses, agree-disagree answers, response refusals, socially desirable responding, differential item functioning, and choices among multiple options are considered. In each of these cases, I…

  3. Success on Multiple Choice Examinations: A Model and Workshop Intervention.

    ERIC Educational Resources Information Center

    Bowering, Elizabeth R.; Wetmore, Ann A.

    1997-01-01

    Presents a theoretical model that identifies the context in which students experience difficulties with complex multiple choice exams (MCE). Provides a structured approach that facilitates the development of critical thinking and metacognitive skills. Discusses a workshop in which participants reported increased knowledge concerning the process…

  4. Psychophysics of time perception and intertemporal choice models

    NASA Astrophysics Data System (ADS)

    Takahashi, Taiki; Oono, Hidemi; Radford, Mark H. B.

    2008-03-01

    Intertemporal choice and psychophysics of time perception have been attracting attention in econophysics and neuroeconomics. Several models have been proposed for intertemporal choice: exponential discounting, general hyperbolic discounting (exponential discounting with logarithmic time perception following the Weber-Fechner law, i.e., a q-exponential discount model based on Tsallis's statistics), simple hyperbolic discounting, and Stevens' power law-exponential discounting (exponential discounting with Stevens' power time perception). In order to examine how well the models fit behavioral data, we estimated the parameters and AICc (Akaike Information Criterion with small sample correction) of the intertemporal choice models by assessing the points of subjective equality (indifference points) at seven delays. Our results have shown that the order of goodness-of-fit for both group and individual data was [Weber-Fechner discounting (general hyperbola) > Stevens' power law discounting > Simple hyperbolic discounting > Exponential discounting], indicating that human time perception in intertemporal choice may follow the Weber-Fechner law. Implications of the results for neuropsychopharmacological treatments of addiction and biophysical processing underlying temporal discounting and time perception are discussed.
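
    A sketch of the model-comparison procedure described above: fit two of the candidate discount functions to indifference points at several delays and compare them with AICc in its Gaussian-error form. The data points below are invented, not the study's measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

delays = np.array([1., 7., 30., 90., 180., 365., 730.])         # days
indiff = np.array([0.95, 0.88, 0.75, 0.60, 0.52, 0.40, 0.33])    # subjective value of 1 unit

def exponential(t, k):
    return np.exp(-k * t)

def hyperbolic(t, k):
    return 1.0 / (1.0 + k * t)

def aicc(y, yhat, n_params):
    n = len(y)
    rss = np.sum((y - yhat) ** 2)
    aic = n * np.log(rss / n) + 2 * n_params
    return aic + 2 * n_params * (n_params + 1) / (n - n_params - 1)

for name, model in (("exponential", exponential), ("simple hyperbolic", hyperbolic)):
    popt, _ = curve_fit(model, delays, indiff, p0=[0.01])
    score = aicc(indiff, model(delays, *popt), n_params=1)
    print(f"{name:18s} k = {popt[0]:.4f}   AICc = {score:.2f}")
```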

  5. Loss aversion and inhibition in dynamical models of multialternative choice.

    PubMed

    Usher, Marius; McClelland, James L

    2004-07-01

    The roles of loss aversion and inhibition among alternatives are examined in models of the similarity, compromise, and attraction effects that arise in choices among 3 alternatives differing on 2 attributes. R. M. Roe, J. R. Busemeyer, and J. T. Townsend (2001) have proposed a linear model in which effects previously attributed to loss aversion (A. Tversky & D. Kahneman, 1991) arise from attention switching between attributes and similarity-dependent inhibitory interactions among alternatives. However, there are several reasons to maintain loss aversion in a theory of choice. In view of this, an alternative theory is proposed, integrating loss aversion and attention switching into a nonlinear model (M. Usher & J. L. McClelland, 2001) that relies on inhibition independent of similarity among alternatives. The model accounts for the 3 effects and makes testable predictions contrasting with those of the Roe et al. (2001) model.

  6. Medicare Care Choices Model Enables Concurrent Palliative and Curative Care.

    PubMed

    2015-01-01

    On July 20, 2015, the federal Centers for Medicare & Medicaid Services (CMS) announced hospices that have been selected to participate in the Medicare Care Choices Model. Fewer than half of the Medicare beneficiaries use hospice care for which they are eligible. Current Medicare regulations preclude concurrent palliative and curative care. Under the Medicare Choices Model, dually eligible Medicare beneficiaries may elect to receive supportive care services typically provided by hospice while continuing to receive curative services. This report describes how CMS has expanded the model from an originally anticipated 30 Medicare-certified hospices to over 140 Medicare-certified hospices and extended the duration of the model from 3 to 5 years. Medicare-certified hospice programs that will participate in the model are listed.

  7. Propagating Water Quality Analysis Uncertainty Into Resource Management Decisions Through Probabilistic Modeling

    NASA Astrophysics Data System (ADS)

    Gronewold, A. D.; Wolpert, R. L.; Reckhow, K. H.

    2007-12-01

    Most probable number (MPN) and colony-forming-unit (CFU) are two estimates of fecal coliform bacteria concentration commonly used as measures of water quality in United States shellfish harvesting waters. The MPN is the maximum likelihood estimate (or MLE) of the true fecal coliform concentration based on counts of non-sterile tubes in serial dilution of a sample aliquot, indicating bacterial metabolic activity. The CFU is the MLE of the true fecal coliform concentration based on the number of bacteria colonies emerging on a growth plate after inoculation from a sample aliquot. Each estimating procedure has intrinsic variability and is subject to additional uncertainty arising from minor variations in experimental protocol. Several versions of each procedure (using different sized aliquots or different numbers of tubes, for example) are in common use, each with its own levels of probabilistic and experimental error and uncertainty. It has been observed empirically that the MPN procedure is more variable than the CFU procedure, and that MPN estimates are somewhat higher on average than CFU estimates, on split samples from the same water bodies. We construct a probabilistic model that provides a clear theoretical explanation for the observed variability in, and discrepancy between, MPN and CFU measurements. We then explore how this variability and uncertainty might propagate into shellfish harvesting area management decisions through a two-phased modeling strategy. First, we apply our probabilistic model in a simulation-based analysis of future water quality standard violation frequencies under alternative land use scenarios, such as those evaluated under guidelines of the total maximum daily load (TMDL) program. Second, we apply our model to water quality data from shellfish harvesting areas which at present are closed (either conditionally or permanently) to shellfishing, to determine if alternative laboratory analysis procedures might have led to different
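
    The MPN is the maximum likelihood estimate of concentration given counts of positive tubes in a serial dilution, as stated above. The sketch below maximises the standard dilution-series likelihood; the tube volumes and counts are illustrative, not data from the study.

```python
import numpy as np
from scipy.optimize import minimize_scalar

volumes = np.array([10.0, 1.0, 0.1])        # mL of sample per tube at each dilution
n_tubes = np.array([5, 5, 5])                # tubes inoculated at each dilution
positive = np.array([5, 3, 1])               # tubes showing growth

def neg_log_likelihood(conc):
    if conc <= 0:
        return np.inf
    p_pos = 1.0 - np.exp(-conc * volumes)     # P(tube positive | concentration)
    loglik = np.sum(positive * np.log(p_pos)
                    - conc * volumes * (n_tubes - positive))
    return -loglik

result = minimize_scalar(neg_log_likelihood, bounds=(1e-6, 100.0), method="bounded")
print(f"MPN estimate: {result.x:.2f} organisms per mL")
```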

  8. Implications of Visual Attention Phenomena for Models of Preferential Choice

    PubMed Central

    2016-01-01

    We use computational modeling to examine the ability of evidence accumulation models to produce the reaction time (RT) distributions and attentional biases found in behavioral and eye-tracking research. We focus on simulating RTs and attention in binary choice with particular emphasis on whether different models can predict the late onset bias (LOB), commonly found in eye movements during choice (sometimes called the gaze cascade). The first finding is that this bias is predicted by models even when attention is entirely random and independent of the choice process. This shows that the LOB is not evidence of a feedback loop between evidence accumulation and attention. Second, we examine models with a relative evidence decision rule and an absolute evidence rule. In the relative models a decision is made once the difference in evidence accumulated for 2 items reaches a threshold. In the absolute models, a decision is made once 1 item accumulates a certain amount of evidence, independently of how much is accumulated for a competitor. Our core result is simple—the existence of the late onset gaze bias to the option ultimately chosen, together with a positively skewed RT distribution means that the stopping rule must be relative not absolute. A large scale grid search of parameter space shows that absolute threshold models struggle to predict these phenomena even when incorporating evidence decay and assumptions of either mutual inhibition or feedforward inhibition. PMID:27774490
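
    A stripped-down simulation of the two stopping rules contrasted above: two noisy evidence accumulators, stopped either when their difference reaches a threshold (relative rule) or when either one reaches a threshold on its own (absolute rule). Drift rates, noise, and thresholds are arbitrary illustrative values, not the parameters explored in the paper's grid search.

```python
import numpy as np

rng = np.random.default_rng(4)

def simulate_trial(rule, drift=(0.02, 0.015), noise=0.1, threshold=1.0, max_t=5000):
    a = b = 0.0
    for t in range(1, max_t + 1):
        a += drift[0] + noise * rng.normal()
        b += drift[1] + noise * rng.normal()
        if rule == "relative" and abs(a - b) >= threshold:
            return t, (0 if a > b else 1)
        if rule == "absolute" and max(a, b) >= threshold:
            return t, (0 if a > b else 1)
    return max_t, (0 if a > b else 1)

for rule in ("relative", "absolute"):
    trials = [simulate_trial(rule) for _ in range(2000)]
    rts = np.array([t for t, _ in trials])
    choice0 = np.mean([c == 0 for _, c in trials])
    print(f"{rule:8s}  mean RT = {rts.mean():6.1f}   P(choose item 0) = {choice0:.2f}")
```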

  9. A Simplified Model of Choice Behavior under Uncertainty.

    PubMed

    Lin, Ching-Hung; Lin, Yu-Kai; Song, Tzu-Jiun; Huang, Jong-Tsun; Chiu, Yao-Chu

    2016-01-01

    The Iowa Gambling Task (IGT) has been standardized as a clinical assessment tool (Bechara, 2007). Nonetheless, numerous research groups have attempted to modify IGT models to optimize parameters for predicting the choice behavior of normal controls and patients. A decade ago, most researchers considered the expected utility (EU) model (Busemeyer and Stout, 2002) to be the optimal model for predicting choice behavior under uncertainty. However, in recent years, studies have demonstrated that models with the prospect utility (PU) function are more effective than the EU models in the IGT (Ahn et al., 2008). Nevertheless, after some preliminary tests based on our behavioral dataset and modeling, it was determined that the Ahn et al. (2008) PU model is not optimal due to some incompatible results. This study aims to modify the Ahn et al. (2008) PU model into a simplified model and uses the IGT performance of 145 subjects as the benchmark data for comparison. In our simplified PU model, the best goodness-of-fit was found mostly as the value of α approached zero. More specifically, we retested the key parameters α, λ, and A in the PU model. Notably, the influence of the parameters α, λ, and A has a hierarchical power structure in terms of manipulating the goodness-of-fit in the PU model. Additionally, we found that the parameters λ and A may be ineffective when the parameter α is close to zero in the PU model. The present simplified model demonstrated that decision makers mostly adopted the strategy of gain-stay loss-shift rather than foreseeing the long-term outcome. However, there are other behavioral variables that are not well revealed under these dynamic-uncertainty situations. Therefore, the optimal behavioral models may not have been found yet. In short, the best model for predicting choice behavior under dynamic-uncertainty situations should be further evaluated. PMID:27582715
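
    A generic prospect-utility learning model in the spirit of the models compared above: subjective utility x^α for gains and -λ|x|^α for losses, delta-rule updating of deck expectancies, and softmax choice. This is a schematic reconstruction for illustration only; the deck payoffs, learning rule details, and parameter values are assumptions, not the authors' exact simplified model.

```python
import numpy as np

rng = np.random.default_rng(5)

def pu_utility(x, alpha, lam):
    return x ** alpha if x >= 0 else -lam * (abs(x) ** alpha)

def simulate_igt(alpha=0.1, lam=1.5, learning_rate=0.2, consistency=1.0, n_trials=100):
    # Hypothetical payoff schemes loosely patterned on the IGT's risky and safe decks
    payoffs = [lambda: 100 - (250 if rng.random() < 0.5 else 0),    # deck A: risky
               lambda: 100 - (1250 if rng.random() < 0.1 else 0),   # deck B: risky
               lambda: 50 - (50 if rng.random() < 0.5 else 0),      # deck C: safe
               lambda: 50 - (250 if rng.random() < 0.1 else 0)]     # deck D: safe
    expectancy = np.zeros(4)
    choices = []
    for _ in range(n_trials):
        p = np.exp(consistency * expectancy)
        p /= p.sum()                                   # softmax choice probabilities
        deck = rng.choice(4, p=p)
        outcome = payoffs[deck]()
        u = pu_utility(outcome, alpha, lam)
        expectancy[deck] += learning_rate * (u - expectancy[deck])   # delta rule
        choices.append(deck)
    return np.bincount(choices, minlength=4) / n_trials

print("deck choice proportions (A, B, C, D):", simulate_igt())
```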

  10. A Simplified Model of Choice Behavior under Uncertainty

    PubMed Central

    Lin, Ching-Hung; Lin, Yu-Kai; Song, Tzu-Jiun; Huang, Jong-Tsun; Chiu, Yao-Chu

    2016-01-01

    The Iowa Gambling Task (IGT) has been standardized as a clinical assessment tool (Bechara, 2007). Nonetheless, numerous research groups have attempted to modify IGT models to optimize parameters for predicting the choice behavior of normal controls and patients. A decade ago, most researchers considered the expected utility (EU) model (Busemeyer and Stout, 2002) to be the optimal model for predicting choice behavior under uncertainty. However, in recent years, studies have demonstrated that models with the prospect utility (PU) function are more effective than the EU models in the IGT (Ahn et al., 2008). Nevertheless, after some preliminary tests based on our behavioral dataset and modeling, it was determined that the Ahn et al. (2008) PU model is not optimal due to some incompatible results. This study aims to modify the Ahn et al. (2008) PU model into a simplified model and uses the IGT performance of 145 subjects as the benchmark data for comparison. In our simplified PU model, the best goodness-of-fit was found mostly as the value of α approached zero. More specifically, we retested the key parameters α, λ, and A in the PU model. Notably, the influence of the parameters α, λ, and A has a hierarchical power structure in terms of manipulating the goodness-of-fit in the PU model. Additionally, we found that the parameters λ and A may be ineffective when the parameter α is close to zero in the PU model. The present simplified model demonstrated that decision makers mostly adopted the strategy of gain-stay loss-shift rather than foreseeing the long-term outcome. However, there are other behavioral variables that are not well revealed under these dynamic-uncertainty situations. Therefore, the optimal behavioral models may not have been found yet. In short, the best model for predicting choice behavior under dynamic-uncertainty situations should be further evaluated. PMID:27582715

  11. A Simplified Model of Choice Behavior under Uncertainty.

    PubMed

    Lin, Ching-Hung; Lin, Yu-Kai; Song, Tzu-Jiun; Huang, Jong-Tsun; Chiu, Yao-Chu

    2016-01-01

    The Iowa Gambling Task (IGT) has been standardized as a clinical assessment tool (Bechara, 2007). Nonetheless, numerous research groups have attempted to modify IGT models to optimize parameters for predicting the choice behavior of normal controls and patients. A decade ago, most researchers considered the expected utility (EU) model (Busemeyer and Stout, 2002) to be the optimal model for predicting choice behavior under uncertainty. However, in recent years, studies have demonstrated that models with the prospect utility (PU) function are more effective than the EU models in the IGT (Ahn et al., 2008). Nevertheless, after some preliminary tests based on our behavioral dataset and modeling, it was determined that the Ahn et al. (2008) PU model is not optimal due to some incompatible results. This study aims to modify the Ahn et al. (2008) PU model into a simplified model and uses the IGT performance of 145 subjects as the benchmark data for comparison. In our simplified PU model, the best goodness-of-fit was found mostly as the value of α approached zero. More specifically, we retested the key parameters α, λ, and A in the PU model. Notably, the influence of the parameters α, λ, and A has a hierarchical power structure in terms of manipulating the goodness-of-fit in the PU model. Additionally, we found that the parameters λ and A may be ineffective when the parameter α is close to zero in the PU model. The present simplified model demonstrated that decision makers mostly adopted the strategy of gain-stay loss-shift rather than foreseeing the long-term outcome. However, there are other behavioral variables that are not well revealed under these dynamic-uncertainty situations. Therefore, the optimal behavioral models may not have been found yet. In short, the best model for predicting choice behavior under dynamic-uncertainty situations should be further evaluated.

  12. Determination of the statistical distributions of model parameters for probabilistic risk assessment

    SciTech Connect

    Fields, D.E.; Glandon, S.R.

    1981-01-01

    Successful probabilistic risk assessment depends heavily on knowledge of the distributions of model parameters. We have developed the TERPED computer code, a versatile methodology for determining with what confidence a parameter set may be considered to have a normal or lognormal frequency distribution. Several measures of central tendency are computed. Other options include computation of the chi-square statistic, the Kolmogorov-Smirnov non-parametric statistic, and Pearson's correlation coefficient. Cumulative probability plots are produced either in high resolution (pen-and-ink or film) or in printer-plot form.
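
    A small scipy-based analogue of the distribution checks described above (this is not the TERPED code): fit normal and lognormal hypotheses to a parameter sample and report Kolmogorov-Smirnov statistics for each candidate. The sample is synthetic, and using KS with fitted parameters is only an approximate test.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
sample = rng.lognormal(mean=1.0, sigma=0.5, size=200)       # hypothetical parameter data

# Normal hypothesis
mu, sigma = sample.mean(), sample.std(ddof=1)
ks_norm = stats.kstest(sample, "norm", args=(mu, sigma))

# Lognormal hypothesis (test the logarithms against a normal distribution)
logs = np.log(sample)
ks_lognorm = stats.kstest(logs, "norm", args=(logs.mean(), logs.std(ddof=1)))

print("central tendency: mean = %.3f, median = %.3f" % (sample.mean(), np.median(sample)))
print("KS vs normal:     D = %.3f, p = %.3f" % (ks_norm.statistic, ks_norm.pvalue))
print("KS vs lognormal:  D = %.3f, p = %.3f" % (ks_lognorm.statistic, ks_lognorm.pvalue))
```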

  13. A probabilistic model of the spatial patterning of pecking in birds: Pilot study with young chicks.

    PubMed

    Bovet, P; Vauclair, J

    1985-11-01

    The pecking behaviour of young chicks (Gallus gallus) is studied in a situation involving several equivalent targets (mealworms). The question is raised whether the successive pecks are randomly distributed or whether they follow a systematic order based on the spatial arrangement of the targets. Data collected with one-week-old chicks indicate that pecking is compatible with a probabilistic model where the probability of pecking at a given place is inversely proportional to the energy used for this particular peck. Pecking by chicks is interpreted as a functional compromise between random sampling and optimal exploitation of the environment.

  14. Incorporating seismic phase correlations into a probabilistic model of global-scale seismology

    NASA Astrophysics Data System (ADS)

    Arora, Nimar

    2013-04-01

    We present a probabilistic model of seismic phases whereby the attributes of the body-wave phases are correlated to those of the first-arriving P phase. This model has been incorporated into NET-VISA (Network processing Vertically Integrated Seismic Analysis), a probabilistic generative model of seismic events, their transmission, and detection on a global seismic network. In the earlier version of NET-VISA, seismic phases were assumed to be independent of each other. Although, for the most part, this did not affect the quality of the inferred seismic bulletin, it did result in a few instances of anomalous phase association, for example an S phase with a smaller slowness than the corresponding P phase. We demonstrate that the phase attributes are indeed highly correlated; for example, the uncertainty in the S phase travel time is significantly reduced given the P phase travel time. Our new model exploits these correlations to produce better calibrated probabilities for the events, as well as fewer anomalous associations.

  15. A Probabilistic Model for Estimating the Depth and Threshold Temperature of C-fiber Nociceptors

    PubMed Central

    Dezhdar, Tara; Moshourab, Rabih A.; Fründ, Ingo; Lewin, Gary R.; Schmuker, Michael

    2015-01-01

    The subjective experience of thermal pain follows the detection and encoding of noxious stimuli by primary afferent neurons called nociceptors. However, nociceptor morphology has been hard to access and the mechanisms of signal transduction remain unresolved. In order to understand how heat transducers in nociceptors are activated in vivo, it is important to estimate the temperatures that directly activate the skin-embedded nociceptor membrane. Hence, the nociceptor’s temperature threshold must be estimated, which in turn will depend on the depth at which transduction happens in the skin. Since the temperature at the receptor cannot be accessed experimentally, such an estimation can currently only be achieved through modeling. However, the current state-of-the-art model to estimate temperature at the receptor suffers from the fact that it cannot account for the natural stochastic variability of neuronal responses. We improve this model using a probabilistic approach which accounts for uncertainties and potential noise in the system. Using a data set of 24 C-fibers recorded in vitro, we show that, even without detailed knowledge of the bio-thermal properties of the system, the probabilistic model that we propose here is capable of providing estimates of threshold and depth in cases where the classical method fails. PMID:26638830

  16. A Probabilistic Model for Estimating the Depth and Threshold Temperature of C-fiber Nociceptors

    NASA Astrophysics Data System (ADS)

    Dezhdar, Tara; Moshourab, Rabih A.; Fründ, Ingo; Lewin, Gary R.; Schmuker, Michael

    2015-12-01

    The subjective experience of thermal pain follows the detection and encoding of noxious stimuli by primary afferent neurons called nociceptors. However, nociceptor morphology has been hard to access and the mechanisms of signal transduction remain unresolved. In order to understand how heat transducers in nociceptors are activated in vivo, it is important to estimate the temperatures that directly activate the skin-embedded nociceptor membrane. Hence, the nociceptor’s temperature threshold must be estimated, which in turn will depend on the depth at which transduction happens in the skin. Since the temperature at the receptor cannot be accessed experimentally, such an estimation can currently only be achieved through modeling. However, the current state-of-the-art model to estimate temperature at the receptor suffers from the fact that it cannot account for the natural stochastic variability of neuronal responses. We improve this model using a probabilistic approach which accounts for uncertainties and potential noise in the system. Using a data set of 24 C-fibers recorded in vitro, we show that, even without detailed knowledge of the bio-thermal properties of the system, the probabilistic model that we propose here is capable of providing estimates of threshold and depth in cases where the classical method fails.

  17. An efficient deterministic-probabilistic approach to modeling regional groundwater flow: 1. Theory

    USGS Publications Warehouse

    Yen, Chung-Cheng; Guymon, Gary L.

    1990-01-01

    An efficient probabilistic model is developed and cascaded with a deterministic model for predicting water table elevations in regional aquifers. The objective is to quantify model uncertainty where precise estimates of water table elevations may be required. The probabilistic model is based on the two-point probability method, which only requires prior knowledge of the means and coefficients of variation of the uncertain variables. The two-point estimate method is theoretically developed and compared with the Monte Carlo simulation method. The results of comparisons using hypothetical deterministic problems indicate that the two-point estimate method is generally valid only for linear problems where the coefficients of variation of uncertain parameters (for example, storage coefficient and hydraulic conductivity) are small. The two-point estimate method may be applied to slightly nonlinear problems with good results, provided coefficients of variation are small. In such cases, the two-point estimate method is much more efficient than the Monte Carlo method, provided the number of uncertain variables is less than eight.
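
    A minimal sketch of a Rosenblueth-style two-point estimate is given below, assuming symmetric, uncorrelated inputs: the model is evaluated at every plus/minus one-standard-deviation corner and the output statistics are read off the resulting sample. The head function and its parameter values are purely hypothetical, not taken from the paper.

        import itertools
        import numpy as np

        def two_point_estimate(model, means, cvs):
            # Two-point (Rosenblueth-style) estimate of the mean and coefficient of
            # variation of a model output, given only the means and coefficients of
            # variation (CV) of symmetric, uncorrelated uncertain inputs.
            means = np.asarray(means, dtype=float)
            sigmas = means * np.asarray(cvs, dtype=float)
            outputs = []
            # Evaluate the model at every +/- sigma corner (2**n points, equal weights).
            for signs in itertools.product((-1.0, 1.0), repeat=len(means)):
                outputs.append(model(means + np.array(signs) * sigmas))
            outputs = np.array(outputs, dtype=float)
            return outputs.mean(), outputs.std() / outputs.mean()

        # Hypothetical head response depending on hydraulic conductivity K and storage S
        head = lambda p: 10.0 + 2.0 / p[0] + 0.5 * p[1]
        print(two_point_estimate(head, means=[1.0, 0.1], cvs=[0.1, 0.2]))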

  18. From Cyclone Tracks to the Costs of European Winter Storms: A Probabilistic Loss Assessment Model

    NASA Astrophysics Data System (ADS)

    Orwig, K.; Renggli, D.; Corti, T.; Reese, S.; Wueest, M.; Viktor, E.; Zimmerli, P.

    2014-12-01

    European winter storms cause billions of dollars of insured losses every year. Therefore, it is essential to understand the potential impacts of future events and the role reinsurance can play in mitigating the losses. The authors will present an overview of natural catastrophe risk assessment modeling in the reinsurance industry, and the development of a new approach for modeling the risk associated with European winter storms. The new approach includes the development of physically meaningful probabilistic (i.e., simulated) events for European winter storm loss assessment. The meteorological hazard component of the new model is based on cyclone and windstorm tracks identified in the 20th Century Reanalysis data. Knowledge of the evolution of winter storms in both time and space allows the physically meaningful perturbation of historical event properties (e.g., track, intensity, etc.). The perturbation includes a random element but also takes the local climatology and the evolution of the historical event into account. The low-resolution wind footprints taken from the 20th Century Reanalysis are processed by a statistical-dynamical downscaling to generate high-resolution footprints for both the simulated and historical events. Downscaling transfer functions are generated using ENSEMBLES regional climate model data. The result is a set of reliable probabilistic events representing thousands of years. The event set is then combined with country- and site-specific vulnerability functions and detailed market- or client-specific information to compute annual expected losses.

  19. Binary choices in small and large groups: A unified model

    NASA Astrophysics Data System (ADS)

    Bischi, Gian-Italo; Merlone, Ugo

    2010-02-01

    Two different ways to model the diffusion of alternative choices within a population of individuals in the presence of social externalities are known in the literature. While Galam's model of rumor spreading considers a majority rule for interactions in several groups, Schelling considers individuals interacting in one large group, with payoff functions that describe how collective choices influence individual preferences. We incorporate these two approaches into a unified general discrete-time dynamic model for studying individual interactions in variously sized groups. We first illustrate how the two original models can be obtained as particular cases of the more general model we propose, and then show how several other situations can be analyzed. The model we propose goes beyond a theoretical exercise, as it allows the modeling of situations which are relevant in economic and social systems. We also consider other aspects, such as the propensity to switch choices and behavioral momentum, and show how they may affect the dynamics of the whole population.

  20. The Integrated Medical Model: A Probabilistic Simulation Model for Predicting In-Flight Medical Risks

    NASA Technical Reports Server (NTRS)

    Keenan, Alexandra; Young, Millennia; Saile, Lynn; Boley, Lynn; Walton, Marlei; Kerstman, Eric; Shah, Ronak; Goodenow, Debra A.; Myers, Jerry G.

    2015-01-01

    The Integrated Medical Model (IMM) is a probabilistic model that uses simulation to predict mission medical risk. Given a specific mission and crew scenario, medical events are simulated using Monte Carlo methodology to provide estimates of resource utilization, probability of evacuation, probability of loss of crew, and the amount of mission time lost due to illness. Mission and crew scenarios are defined by mission length, extravehicular activity (EVA) schedule, and crew characteristics including: sex, coronary artery calcium score, contacts, dental crowns, history of abdominal surgery, and EVA eligibility. The Integrated Medical Evidence Database (iMED) houses the model inputs for one hundred medical conditions using in-flight, analog, and terrestrial medical data. Inputs include incidence, event durations, resource utilization, and crew functional impairment. Severity of conditions is addressed by defining statistical distributions on the dichotomized best and worst-case scenarios for each condition. The outcome distributions for conditions are bounded by the treatment extremes of the fully treated scenario in which all required resources are available and the untreated scenario in which no required resources are available. Upon occurrence of a simulated medical event, treatment availability is assessed, and outcomes are generated depending on the status of the affected crewmember at the time of onset, including any pre-existing functional impairments or ongoing treatment of concurrent conditions. The main IMM outcomes, including probability of evacuation and loss of crew life, time lost due to medical events, and resource utilization, are useful in informing mission planning decisions. To date, the IMM has been used to assess mission-specific risks with and without certain crewmember characteristics, to determine the impact of eliminating certain resources from the mission medical kit, and to design medical kits that maximally benefit crew health while meeting

  1. The Integrated Medical Model: A Probabilistic Simulation Model Predicting In-Flight Medical Risks

    NASA Technical Reports Server (NTRS)

    Keenan, Alexandra; Young, Millennia; Saile, Lynn; Boley, Lynn; Walton, Marlei; Kerstman, Eric; Shah, Ronak; Goodenow, Debra A.; Myers, Jerry G., Jr.

    2015-01-01

    The Integrated Medical Model (IMM) is a probabilistic model that uses simulation to predict mission medical risk. Given a specific mission and crew scenario, medical events are simulated using Monte Carlo methodology to provide estimates of resource utilization, probability of evacuation, probability of loss of crew, and the amount of mission time lost due to illness. Mission and crew scenarios are defined by mission length, extravehicular activity (EVA) schedule, and crew characteristics including: sex, coronary artery calcium score, contacts, dental crowns, history of abdominal surgery, and EVA eligibility. The Integrated Medical Evidence Database (iMED) houses the model inputs for one hundred medical conditions using in-flight, analog, and terrestrial medical data. Inputs include incidence, event durations, resource utilization, and crew functional impairment. Severity of conditions is addressed by defining statistical distributions on the dichotomized best and worst-case scenarios for each condition. The outcome distributions for conditions are bounded by the treatment extremes of the fully treated scenario in which all required resources are available and the untreated scenario in which no required resources are available. Upon occurrence of a simulated medical event, treatment availability is assessed, and outcomes are generated depending on the status of the affected crewmember at the time of onset, including any pre-existing functional impairments or ongoing treatment of concurrent conditions. The main IMM outcomes, including probability of evacuation and loss of crew life, time lost due to medical events, and resource utilization, are useful in informing mission planning decisions. To date, the IMM has been used to assess mission-specific risks with and without certain crewmember characteristics, to determine the impact of eliminating certain resources from the mission medical kit, and to design medical kits that maximally benefit crew health while meeting

  2. VBA: A Probabilistic Treatment of Nonlinear Models for Neurobiological and Behavioural Data

    PubMed Central

    Daunizeau, Jean; Adam, Vincent; Rigoux, Lionel

    2014-01-01

    This work is in line with an ongoing effort toward a computational (quantitative and refutable) understanding of human neuro-cognitive processes. Many sophisticated models for behavioural and neurobiological data have flourished during the past decade. Most of these models are partly unspecified (i.e. they have unknown parameters) and nonlinear. This makes them difficult to pair with a formal statistical data analysis framework. In turn, this compromises the reproducibility of model-based empirical studies. This work exposes a software toolbox that provides generic, efficient and robust probabilistic solutions to the three problems of model-based analysis of empirical data: (i) data simulation, (ii) parameter estimation/model selection, and (iii) experimental design optimization. PMID:24465198

  3. Evaluation of Behavioral Demand Models of Consumer Choice in Health Care.

    ERIC Educational Resources Information Center

    Siddharthan, Kris

    1991-01-01

    Consumer choice of health provider plan and preference for a personal physician were studied for 1,438 elderly adults using a joint logit model (JL) and a nested logit model. The choice criteria used by senior citizens are examined, along with reasons why the nested choice model explains choice behavior better than the JL. (SLD)

  4. Probabilistic model of waiting times between large failures in sheared media

    NASA Astrophysics Data System (ADS)

    Brinkman, Braden A. W.; LeBlanc, Michael P.; Uhl, Jonathan T.; Ben-Zion, Yehuda; Dahmen, Karin A.

    2016-01-01

    Using a probabilistic approximation of a mean-field mechanistic model of sheared systems, we analytically calculate the statistical properties of large failures under slow shear loading. For general shear F(t), the distribution of waiting times between large system-spanning failures is a generalized exponential distribution, ρ_T(t) = λ(F(t)) P(F(t)) exp[-∫_0^t dτ λ(F(τ)) P(F(τ))], where λ(F(t)) is the rate of small event occurrences at stress F(t) and P(F(t)) is the probability that a small event triggers a large failure. We study the behavior of this distribution as a function of fault properties, such as heterogeneity or shear rate. Because the probabilistic model accommodates any stress loading F(t), it is particularly useful for modeling experiments designed to understand how different forms of shear loading or stress perturbations impact the waiting-time statistics of large failures. As examples, we study how periodic perturbations or fluctuations on top of a linear shear stress increase impact the waiting-time distribution.
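
    The density above can be evaluated numerically for any assumed loading F(t) and rate functions; the sketch below does this with the integral approximated by the trapezoidal rule. The particular choices of F, λ and P (a linear ramp with a small periodic perturbation and simple increasing functions) are illustrative assumptions, not the forms used in the paper.

        import numpy as np

        def waiting_time_density(t, F, lam, P):
            # Generalized exponential waiting-time density
            #   rho_T(t) = lam(F(t)) P(F(t)) exp(-int_0^t lam(F(tau)) P(F(tau)) dtau),
            # with the integral evaluated by the trapezoidal rule on the grid t.
            rate = lam(F(t)) * P(F(t))
            cum = np.concatenate(([0.0],
                                  np.cumsum(0.5 * (rate[1:] + rate[:-1]) * np.diff(t))))
            return rate * np.exp(-cum)

        # Illustrative loading: linear shear plus a small periodic perturbation
        t = np.linspace(0.0, 10.0, 2001)
        F = lambda t: 0.1 * t + 0.02 * np.sin(2 * np.pi * t)
        lam = lambda F: 1.0 + F            # small-event rate grows with stress
        P = lambda F: 1.0 - np.exp(-F)     # triggering probability grows with stress
        rho = waiting_time_density(t, F, lam, P)
        print(rho[:5])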

  5. Probabilistic, multi-variate flood damage modelling using random forests and Bayesian networks

    NASA Astrophysics Data System (ADS)

    Kreibich, Heidi; Schröter, Kai

    2015-04-01

    Decisions on flood risk management and adaptation are increasingly based on risk analyses. Such analyses are associated with considerable uncertainty, all the more so when changes in risk due to global change are expected. Although uncertainty analysis and probabilistic approaches have recently received increased attention, they are rarely applied in flood damage assessments. Most damage models applied in standard practice have in common that complex damaging processes are described by simple, deterministic approaches such as stage-damage functions. This presentation will show approaches for probabilistic, multi-variate flood damage modelling on the micro- and meso-scale and discuss their potential and limitations. Reference: Merz, B.; Kreibich, H.; Lall, U. (2013): Multi-variate flood damage assessment: a tree-based data-mining approach. NHESS, 13(1), 53-64. Schröter, K., Kreibich, H., Vogel, K., Riggelsen, C., Scherbaum, F., Merz, B. (2014): How useful are complex flood damage models? - Water Resources Research, 50, 4, p. 3378-3395.

  6. Development of a probabilistic PCB-bioaccumulation model for six fish species in the Hudson River

    SciTech Connect

    Stackelberg, K. von; Menzie, C.

    1995-12-31

    In 1984 the US Environmental Protection Agency (USEPA) completed a Feasibility Study on the Hudson River that investigated remedial alternatives and issued a Record of Decision (ROD) later that year. In December 1989 USEPA decided to reassess the No Action decision for Hudson River sediments. This reassessment consists of three phases: Interim Characterization and Evaluation (Phase 1); Further Site Characterization and Analysis (Phase 2); and Feasibility Study (Phase 3). A Phase 1 report was completed in August 1991. The team then completed a Final Work Plan for Phase 2 in September 1992. This work plan identified various PCB fate and transport modeling activities to support the Hudson River PCB Reassessment Remedial Investigation and Feasibility Study (RI/FS). This talk describes the development of probabilistic bioaccumulation models for the uptake of PCBs on a congener-specific basis in six fish species. The authors have developed a framework for relating body burdens of PCBs in fish to exposure concentrations in Hudson River water and sediments. This framework is used to understand historical and current relationships as well as to predict fish body burdens for future conditions under specific remediation and no-action scenarios. The framework incorporates a probabilistic approach to predict distributions in PCB body burdens for selected fish species. These models can predict single population statistics, such as the average expected values of PCBs under specific scenarios, as well as the distribution of expected concentrations.

  7. Data assimilation for unsaturated flow models with restart adaptive probabilistic collocation based Kalman filter

    NASA Astrophysics Data System (ADS)

    Man, Jun; Li, Weixuan; Zeng, Lingzao; Wu, Laosheng

    2016-06-01

    The ensemble Kalman filter (EnKF) has gained popularity in hydrological data assimilation problems. As a Monte Carlo based method, a sufficiently large ensemble size is usually required to guarantee accuracy. As an alternative approach, the probabilistic collocation based Kalman filter (PCKF) employs the polynomial chaos expansion (PCE) to represent and propagate the uncertainties in parameters and states. However, PCKF suffers from the so-called "curse of dimensionality". Its computational cost increases drastically with the increasing number of parameters and system nonlinearity. Furthermore, PCKF may fail to provide accurate estimations due to the joint updating scheme for strongly nonlinear models. Motivated by recent developments in uncertainty quantification and EnKF, we propose a restart adaptive probabilistic collocation based Kalman filter (RAPCKF) for data assimilation in unsaturated flow problems. During the implementation of RAPCKF, the important parameters are identified and active PCE basis functions are adaptively selected at each assimilation step; the "restart" scheme is utilized to eliminate the inconsistency between updated model parameters and state variables. The performance of RAPCKF is systematically tested with numerical cases of unsaturated flow models. It is shown that the adaptive approach and restart scheme can significantly improve the performance of PCKF. Moreover, RAPCKF has been demonstrated to be more efficient than EnKF with the same computational cost.
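
    For orientation, the sketch below implements one stochastic analysis step of the ensemble Kalman filter that the abstract uses as a benchmark; the PCKF/RAPCKF machinery itself (polynomial chaos expansions, adaptive basis selection, restarts) is considerably more involved and is not shown. The toy state, observation operator and noise levels are assumptions for illustration only.

        import numpy as np

        def enkf_update(ensemble, obs, obs_operator, obs_var, rng=None):
            # One stochastic EnKF analysis step for a single scalar observation.
            # ensemble: (n_members, n_state) array of prior states.
            rng = np.random.default_rng(rng)
            X = np.asarray(ensemble, dtype=float)
            Hx = np.array([obs_operator(x) for x in X])        # predicted observations
            X_mean, Hx_mean = X.mean(axis=0), Hx.mean()
            cov_xy = ((X - X_mean).T @ (Hx - Hx_mean)) / (len(X) - 1)
            var_y = Hx.var(ddof=1) + obs_var
            gain = cov_xy / var_y                               # Kalman gain, shape (n_state,)
            perturbed = obs + rng.normal(0.0, np.sqrt(obs_var), size=len(X))
            return X + np.outer(perturbed - Hx, gain)

        # Toy example: estimate two parameters from a noisy observation of their sum
        prior = np.random.default_rng(1).normal([1.0, 2.0], 0.5, size=(50, 2))
        posterior = enkf_update(prior, obs=3.4, obs_operator=lambda x: x.sum(),
                                obs_var=0.1, rng=2)
        print(posterior.mean(axis=0))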

  8. Learned graphical models for probabilistic planning provide a new class of movement primitives.

    PubMed

    Rückert, Elmar A; Neumann, Gerhard; Toussaint, Marc; Maass, Wolfgang

    2012-01-01

    Biological movement generation combines three interesting aspects: its modular organization in movement primitives (MPs), its characteristics of stochastic optimality under perturbations, and its efficiency in terms of learning. A common approach to motor skill learning is to endow the primitives with dynamical systems. Here, the parameters of the primitive indirectly define the shape of a reference trajectory. We propose an alternative MP representation based on probabilistic inference in learned graphical models, with new and interesting properties, that complies with salient features of biological movement control. Instead of endowing the primitives with dynamical systems, we propose to endow MPs with an intrinsic probabilistic planning system, integrating the power of stochastic optimal control (SOC) methods within an MP. The parameterization of the primitive is a graphical model that represents the dynamics and intrinsic cost function such that inference in this graphical model yields the control policy. We parameterize the intrinsic cost function using task-relevant features, such as the importance of passing through certain via-points. The system dynamics as well as the intrinsic cost function parameters are learned in a reinforcement learning (RL) setting. We evaluate our approach on a complex 4-link balancing task. Our experiments show that our movement representation facilitates learning significantly and leads to better generalization to new task settings without re-learning.

  9. Probabilistic model of waiting times between large failures in sheared media.

    PubMed

    Brinkman, Braden A W; LeBlanc, Michael P; Uhl, Jonathan T; Ben-Zion, Yehuda; Dahmen, Karin A

    2016-01-01

    Using a probabilistic approximation of a mean-field mechanistic model of sheared systems, we analytically calculate the statistical properties of large failures under slow shear loading. For general shear F(t), the distribution of waiting times between large system-spanning failures is a generalized exponential distribution, ρ_{T}(t)=λ(F(t))P(F(t))exp[-∫_{0}^{t}dτλ(F(τ))P(F(τ))], where λ(F(t)) is the rate of small event occurrences at stress F(t) and P(F(t)) is the probability that a small event triggers a large failure. We study the behavior of this distribution as a function of fault properties, such as heterogeneity or shear rate. Because the probabilistic model accommodates any stress loading F(t), it is particularly useful for modeling experiments designed to understand how different forms of shear loading or stress perturbations impact the waiting-time statistics of large failures. As examples, we study how periodic perturbations or fluctuations on top of a linear shear stress increase impact the waiting-time distribution.

  10. Application of Continuous-Time Batch Markovian Arrival Processes and Particle Tracking Model to Probabilistic Sediment Transport Modeling

    NASA Astrophysics Data System (ADS)

    Tsai, Christina; Hung, Serena

    2016-04-01

    To more precisely describe particle movement in surface water, both the random particle arrival process at the receiving water and the stochastic particle movement in the receiving water should be carefully considered in sediment transport modeling. In this study, a stochastic framework is developed for a probabilistic description of discrete particle transport through a probability density function of sediment concentrations and transport rates. In order to more realistically describe the particle arrivals into receiving waters at random times and with a probabilistic particle number in each arrival, the continuous-time batch Markovian arrival process is introduced. The particle tracking model (PTM) composed of physically based stochastic differential equations (SDEs) for particle trajectory is then used to depict the random movement of particles in the receiving water. Particle deposition and entrainment processes are considered in the model. It is expected that the particle concentrations in the receiving water and particle transport rates can be mathematically expressed as a stochastic process. Compared with deterministic modeling, the proposed approach has the advantage of capturing any randomly selected scenarios (or realizations) of flow and sediment properties. Availability of a more sophisticated stochastic process for random particle arrival processes can assist in quantifying the probabilistic characteristics of sediment transport rates and concentrations. In addition, for a given turbidity threshold, the risk of exceeding a pre-established water quality standard can be quantified as needed.
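
    To make the particle tracking component concrete, the sketch below advances a cloud of particles with an Euler-Maruyama step of a drift-diffusion SDE. The batch Markovian arrival process, deposition and entrainment are omitted, and the velocity field and diffusivity are hypothetical placeholders rather than quantities from the study.

        import numpy as np

        def track_particles(x0, velocity, diffusivity, dt, n_steps, rng=None):
            # Euler-Maruyama integration of the particle-trajectory SDE
            #   dX = u(X) dt + sqrt(2 D) dW
            # for a cloud of particles stored in the array x.
            rng = np.random.default_rng(rng)
            x = np.array(x0, dtype=float)
            for _ in range(n_steps):
                noise = rng.normal(size=x.shape)
                x = x + velocity(x) * dt + np.sqrt(2.0 * diffusivity * dt) * noise
            return x

        # Illustrative 1-D case: uniform streamwise velocity, constant diffusivity
        positions = track_particles(np.zeros(1000), velocity=lambda x: 0.5,
                                    diffusivity=0.01, dt=1.0, n_steps=100, rng=0)
        print(positions.mean(), positions.std())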

  11. Linear-Nonlinear-Poisson models of primate choice dynamics.

    PubMed

    Corrado, Greg S; Sugrue, Leo P; Seung, H Sebastian; Newsome, William T

    2005-11-01

    The equilibrium phenomenon of matching behavior traditionally has been studied in stationary environments. Here we attempt to uncover the local mechanism of choice that gives rise to matching by studying behavior in a highly dynamic foraging environment. In our experiments, 2 rhesus monkeys (Macaca mulatta) foraged for juice rewards by making eye movements to one of two colored icons presented on a computer monitor, each rewarded on dynamic variable-interval schedules. Using a generalization of Wiener kernel analysis, we recover a compact mechanistic description of the impact of past reward on future choice in the form of a Linear-Nonlinear-Poisson model. We validate this model through rigorous predictive and generative testing. Compared to our earlier work with this same data set, this model proves to be a better description of choice behavior and is more tightly correlated with putative neural value signals. Refinements over previous models include hyperbolic (as opposed to exponential) temporal discounting of past rewards, and differential (as opposed to fractional) comparisons of option value. Through numerical simulation we find that within this class of strategies, the model parameters employed by animals are very close to those that maximize reward harvesting efficiency.
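
    A stripped-down version of the linear-nonlinear stage described above is sketched here: past rewards on each option are filtered by a discounting kernel, and the difference in filtered value is passed through a sigmoid to give the probability of choosing one option on the next trial (a stochastic choice stage would then sample the actual response). The exponential kernel and the parameter values are simplifying assumptions; the paper reports a hyperbolic kernel and fits its parameters to behavior.

        import numpy as np

        def lnp_choice_prob(rewards_a, rewards_b, tau=5.0, beta=3.0):
            # Linear stage: filter each option's reward history with a discounting
            # kernel (exponential here, for brevity). Nonlinear stage: sigmoid of
            # the value difference gives the probability of choosing option A.
            lags = np.arange(len(rewards_a), 0, -1)   # most recent trial has lag 1
            kernel = np.exp(-lags / tau)
            value_a = np.dot(kernel, rewards_a)
            value_b = np.dot(kernel, rewards_b)
            return 1.0 / (1.0 + np.exp(-beta * (value_a - value_b)))

        # Ten past trials (oldest first): option A was rewarded more often recently
        print(lnp_choice_prob([0, 0, 1, 1, 0, 1, 1, 0, 1, 1],
                              [1, 1, 0, 0, 1, 0, 0, 0, 0, 0]))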

  12. Understanding Predisposition in College Choice: Toward an Integrated Model of College Choice and Theory of Reasoned Action

    ERIC Educational Resources Information Center

    Pitre, Paul E.; Johnson, Todd E.; Pitre, Charisse Cowan

    2006-01-01

    This article seeks to improve traditional models of college choice that draw from recruitment and enrollment management paradigms. In adopting a consumer approach to college choice, this article seeks to build upon consumer-related research, which centers on behavior and reasoning. More specifically, this article seeks to move inquiry beyond the…

  13. A simulation model for probabilistic analysis of Space Shuttle abort modes

    NASA Technical Reports Server (NTRS)

    Hage, R. T.

    1993-01-01

    A simulation model which was developed to provide a probabilistic analysis tool to study the various space transportation system abort mode situations is presented. The simulation model is based on Monte Carlo simulation of an event-tree diagram which accounts for events during the space transportation system's ascent and its abort modes. The simulation model considers just the propulsion elements of the shuttle system (i.e., external tank, main engines, and solid boosters). The model was developed to provide a better understanding of the probability of occurrence and successful completion of abort modes during the vehicle's ascent. The results of the simulation runs discussed are for demonstration purposes only; they are not official NASA probability estimates.
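
    The sketch below conveys the general flavor of a Monte Carlo draw over a much simplified, two-branch ascent event tree. The branch structure and all probabilities are invented for illustration only; like the abstract's own results, they are in no way official NASA estimates.

        import numpy as np

        def simulate_ascent(p_failure, p_abort_success, n_trials=100_000, rng=None):
            # Toy event tree: either the propulsion elements work and ascent is
            # nominal, or a failure occurs and an abort mode is attempted.
            rng = np.random.default_rng(rng)
            failures = rng.random(n_trials) < p_failure
            aborts_ok = rng.random(n_trials) < p_abort_success
            nominal = ~failures
            safe_abort = failures & aborts_ok
            loss = failures & ~aborts_ok
            return nominal.mean(), safe_abort.mean(), loss.mean()

        # Purely illustrative inputs
        print(simulate_ascent(p_failure=0.02, p_abort_success=0.9, rng=0))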

  14. Aircraft Conflict Analysis and Real-Time Conflict Probing Using Probabilistic Trajectory Modeling

    NASA Technical Reports Server (NTRS)

    Yang, Lee C.; Kuchar, James K.

    2000-01-01

    Methods for maintaining separation between aircraft in the current airspace system have been built from a foundation of structured routes and evolved procedures. However, as the airspace becomes more congested and the chance of failures or operational error becomes more problematic, automated conflict alerting systems have been proposed to help provide decision support and to serve as traffic monitoring aids. The problem of conflict detection and resolution has been tackled in a number of different ways, but in this thesis it is recast as a problem of prediction in the presence of uncertainties. Much of the focus is concentrated on the errors and uncertainties from the working trajectory model used to estimate future aircraft positions. The more accurate the prediction, the more likely an ideal (no false alarms, no missed detections) alerting system can be designed. Additional insights into the problem were brought forth by a review of current operational and developmental approaches found in the literature. An iterative, trial-and-error approach to threshold design was identified. When examined from a probabilistic perspective, the threshold parameters were found to be a surrogate for probabilistic performance measures. To overcome the limitations in the current iterative design method, a new direct approach is presented where the performance measures are directly computed and used to perform the alerting decisions. The methodology is shown to handle complex encounter situations (3-D, multi-aircraft, multi-intent, with uncertainties) with relative ease. Utilizing a Monte Carlo approach, a method was devised to perform the probabilistic computations in near real time. Not only does this greatly increase the method's potential as an analytical tool, but it also opens up the possibility for use as a real-time conflict alerting probe. A prototype alerting logic was developed and has been utilized in several NASA Ames Research Center experimental studies.

  15. Modeling PSA Problems - I: The Stimulus-Driven Theory of Probabilistic Dynamics

    SciTech Connect

    Labeau, P.E.; Izquierdo, J.M.

    2005-06-15

    The theory of probabilistic dynamics (TPD) offers a framework capable of modeling the interaction between the physical evolution of a system in transient conditions and the succession of branchings defining a sequence of events. Nonetheless, the Chapman-Kolmogorov equation, besides being inherently Markovian, assumes instantaneous changes in the system dynamics when a setpoint is crossed. In actuality, a transition between two dynamic evolution regimes of the system is a two-phase process. First, conditions corresponding to the triggering of a transition have to be met; this phase will be referred to as the activation of a 'stimulus'. Then, a time delay must elapse before the actual occurrence of the event causing the transition to take place. When this delay cannot be neglected and is a random quantity, the general TPD can no longer be used as such. Moreover, these delays are likely to influence the ordering of events in an accident sequence with competing situations, and the process of delineating sequences in the probabilistic safety analysis of a plant might therefore be affected in turn. This paper aims at presenting several extensions of the classical TPD, in which additional modeling capabilities are progressively introduced. A companion paper sketches a discretized approach of these problems.

  16. A Generic Probabilistic Model and a Hierarchical Solution for Sensor Localization in Noisy and Restricted Conditions

    NASA Astrophysics Data System (ADS)

    Ji, S.; Yuan, X.

    2016-06-01

    A generic probabilistic model, under the fundamental Bayes' rule and the Markov assumption, is introduced to integrate the process of mobile platform localization with optical sensors. Based on it, three relatively independent solutions, bundle adjustment, Kalman filtering and particle filtering, are deduced under different additional restrictions. We aim to show, first, that Kalman filtering may be a better supplier of initial values for bundle adjustment than traditional relative orientation in irregular strips and networks, or when tie-point extraction fails. Second, in highly noisy conditions, particle filtering can act as a bridge for gap binding when a large number of gross errors cause Kalman filtering or bundle adjustment to fail. Third, both filtering methods, which help reduce error propagation and eliminate gross errors, support a global and static bundle adjustment, which requires the strictest initial values and control conditions. The main innovation is the integrated processing of stochastic errors and gross errors in sensor observations, and the integration of the three most used solutions, bundle adjustment, Kalman filtering and particle filtering, into a generic probabilistic localization model. Tests in noisy and restricted situations are designed and examined to verify these claims.

  17. PBDE exposure from food in Ireland: optimising data exploitation in probabilistic exposure modelling.

    PubMed

    Trudel, David; Tlustos, Christina; Von Goetz, Natalie; Scheringer, Martin; Hungerbühler, Konrad

    2011-01-01

    Polybrominated diphenyl ethers (PBDEs) are a class of brominated flame retardants added to plastics, polyurethane foam, electronics, textiles, and other products. These products release PBDEs into the indoor and outdoor environment, thus causing human exposure through food and dust. This study models PBDE dose distributions from ingestion of food for Irish adults on a congener basis, using two probabilistic methods and one semi-deterministic method. One of the probabilistic methods was newly developed and is based on summary statistics of food consumption combined with a model generating realistic daily energy supply from food. Median (intermediate) doses of total PBDEs are in the range of 0.4-0.6 ng/kg(bw)/day for Irish adults. The 97.5th percentiles of total PBDE doses lie in the range of 1.7-2.2 ng/kg(bw)/day, which is comparable to doses derived for Belgian and Dutch adults. BDE-47 and BDE-99 were identified as the congeners contributing most to estimated intakes, accounting for more than half of the total doses. The most influential food groups contributing to this intake are lean fish and salmon, which together account for about 22-25% of the total doses.

  18. Spatial dispersion of interstellar civilizations: a probabilistic site percolation model in three dimensions

    NASA Astrophysics Data System (ADS)

    Hair, Thomas W.; Hedman, Andrew D.

    2013-01-01

    A model of the spatial emergence of an interstellar civilization into a uniform distribution of habitable systems is presented. The process of emigration is modelled as a three-dimensional probabilistic cellular automaton. An algorithm is presented which defines both the daughter colonies of the original seed vertex and all subsequent connected vertices, and the probability of a connection between any two vertices. The automaton is analysed over a wide set of parameters for iterations that represent up to 250 000 years within the model's assumptions. Emigration patterns are characterized and used to evaluate two hypotheses that aim to explain the Fermi Paradox. The first hypothesis states that interstellar emigration takes too long for any civilization to have yet come within a detectable distance, and the second states that large volumes of habitable space may be left uninhabited by an interstellar civilization and Earth is located in one of these voids.

  19. A probabilistic model for predicting the probability of no-show in hospital appointments.

    PubMed

    Alaeddini, Adel; Yang, Kai; Reddy, Chandan; Yu, Susan

    2011-06-01

    The number of no-shows has a significant impact on the revenue, cost and resource utilization for almost all healthcare systems. In this study we develop a hybrid probabilistic model based on logistic regression and empirical Bayesian inference to predict the probability of no-shows in real time, using both general patient social and demographic information and individual clinical appointment attendance records. The model also considers the effect of appointment date and clinic type. The effectiveness of the proposed approach is validated based on a patient dataset from a VA medical center. Such an accurate prediction model can be used to enable a precise selective overbooking strategy to reduce the negative effect of no-shows and to fill appointment slots while maintaining short wait times.
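
    One way to combine the two ingredients the abstract names is sketched below: an empirical-Bayes (beta-binomial) smoothing of each patient's personal attendance history feeds, together with other covariates, into a logistic regression. The feature set, prior values and training data are entirely synthetic and hypothetical; this is not the authors' model, only an illustration of the idea.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        def smoothed_noshow_rate(misses, visits, prior_rate=0.2, prior_strength=5.0):
            # Beta-binomial smoothing: shrink sparse personal histories toward an
            # assumed population no-show rate.
            return (misses + prior_rate * prior_strength) / (visits + prior_strength)

        # Hypothetical training data: [age, lead_time_days, smoothed personal rate]
        rng = np.random.default_rng(0)
        X = np.column_stack([rng.integers(20, 90, 500),
                             rng.integers(1, 60, 500),
                             rng.beta(2, 8, 500)])
        y = (rng.random(500) < 0.1 + 0.5 * X[:, 2]).astype(int)   # synthetic labels

        model = LogisticRegression().fit(X, y)
        new_patient = [[72, 14, smoothed_noshow_rate(misses=1, visits=3)]]
        print(model.predict_proba(new_patient)[0, 1])   # predicted no-show probability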

  20. Decomposing biodiversity data using the Latent Dirichlet Allocation model, a probabilistic multivariate statistical method.

    PubMed

    Valle, Denis; Baiser, Benjamin; Woodall, Christopher W; Chazdon, Robin

    2014-12-01

    We propose a novel multivariate method to analyse biodiversity data based on the Latent Dirichlet Allocation (LDA) model. LDA, a probabilistic model, reduces assemblages to sets of distinct component communities. It produces easily interpretable results, can represent abrupt and gradual changes in composition, accommodates missing data and allows for coherent estimates of uncertainty. We illustrate our method using tree data for the eastern United States and from a tropical successional chronosequence. The model is able to detect pervasive declines in the oak community in Minnesota and Indiana, potentially due to fire suppression, increased growing season precipitation and herbivory. The chronosequence analysis is able to delineate clear successional trends in species composition, while also revealing that site-specific factors significantly impact these successional trajectories. The proposed method provides a means to decompose and track the dynamics of species assemblages along temporal and spatial gradients, including effects of global change and forest disturbances.
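
    The sketch below runs the same kind of decomposition on a made-up site-by-species abundance matrix using scikit-learn's LatentDirichletAllocation. The data, the choice of three component communities, and the interpretation of the outputs are illustrative assumptions; the authors describe their own LDA formulation for community data, not this implementation.

        import numpy as np
        from sklearn.decomposition import LatentDirichletAllocation

        # Hypothetical site-by-species abundance matrix (stem counts per plot)
        rng = np.random.default_rng(0)
        abundances = rng.poisson(lam=3.0, size=(40, 12))

        # Decompose the assemblages into 3 latent "component communities"
        lda = LatentDirichletAllocation(n_components=3, random_state=0)
        site_mixtures = lda.fit_transform(abundances)   # per-site community proportions
        community_profiles = lda.components_            # per-community species weights

        print(site_mixtures[:3].round(2))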

  1. A spatio-temporal model for probabilistic seismic hazard zonation of Tehran

    NASA Astrophysics Data System (ADS)

    Hashemi, Mahdi; Alesheikh, Ali Asghar; Zolfaghari, Mohammad Reza

    2013-08-01

    A precondition for all disaster management steps, building damage prediction, and construction code development is a hazard assessment that shows the exceedance probabilities of different ground motion levels at a site, considering different near- and far-field earthquake sources. The seismic sources are usually categorized as time-independent area sources and time-dependent fault sources. While the former incorporates small and medium events, the latter takes into account only large characteristic earthquakes. In this article, a probabilistic approach is proposed to aggregate the effects of time-dependent and time-independent sources on seismic hazard. The methodology is then applied to generate three probabilistic seismic hazard maps of Tehran for 10%, 5%, and 2% exceedance probabilities in 50 years. The results indicate an increase in peak ground acceleration (PGA) values toward the southeastern part of the study area, and the PGA variations are mostly controlled by the shear wave velocities across the city. In addition, the implementation of the methodology takes advantage of GIS capabilities, especially raster-based analyses and representations. During the estimation of the PGA exceedance rates, the emphasis has been placed on incorporating the effects of different attenuation relationships and seismic source models by using a logic tree.

  2. Probabilistic modeling of the flows and environmental risks of nano-silica.

    PubMed

    Wang, Yan; Kalinina, Anna; Sun, Tianyin; Nowack, Bernd

    2016-03-01

    Nano-silica, the engineered nanomaterial with one of the largest production volumes, has a wide range of applications in consumer products and industry. This study aimed to quantify the release of nano-silica into the environment and to assess its risk to surface waters. Concentrations were calculated for four environmental (air, soil, surface water, sediments) and two technical compartments (wastewater, solid waste) for the EU and Switzerland using probabilistic material flow modeling. The corresponding median concentration in surface water is predicted to be 0.12 μg/l in the EU (0.053-3.3 μg/l, 15/85% quantiles). The concentrations in sediments in the complete sedimentation scenario were found to be the largest among all environmental compartments, with a median annual increase of 0.43 mg/kg · y in the EU (0.19-12 mg/kg · y, 15/85% quantiles). Moreover, probabilistic species sensitivity distributions (PSSD) were computed and the risk of nano-silica in surface waters was quantified by comparing the predicted environmental concentration (PEC) with the predicted no-effect concentration (PNEC) distribution, which was derived from the cumulative PSSD. This assessment suggests that nano-silica currently poses no risk to aquatic organisms in surface waters. Further investigations are needed to assess the risk of nano-silica in other environmental compartments, which is currently not possible due to a lack of ecotoxicological data. PMID:26745294

  3. QUANTIFYING AGGREGATE CHLORPYRIFOS EXPOSURE AND DOSE TO CHILDREN USING A PHYSICALLY-BASED TWO-STAGE MONTE CARLO PROBABILISTIC MODEL

    EPA Science Inventory

    To help address the Food Quality Protection Act of 1996, a physically-based, two-stage Monte Carlo probabilistic model has been developed to quantify and analyze aggregate exposure and dose to pesticides via multiple routes and pathways. To illustrate model capabilities and ide...

  4. Alignment and prediction of cis-regulatory modules based on a probabilistic model of evolution.

    PubMed

    He, Xin; Ling, Xu; Sinha, Saurabh

    2009-03-01

    Cross-species comparison has emerged as a powerful paradigm for predicting cis-regulatory modules (CRMs) and understanding their evolution. The comparison requires reliable sequence alignment, which remains a challenging task for less conserved noncoding sequences. Furthermore, the existing models of DNA sequence evolution generally do not explicitly treat the special properties of CRM sequences. To address these limitations, we propose a model of CRM evolution that captures different modes of evolution of functional transcription factor binding sites (TFBSs) and the background sequences. A particularly novel aspect of our work is a probabilistic model of gains and losses of TFBSs, a process being recognized as an important part of regulatory sequence evolution. We present a computational framework that uses this model to solve the problems of CRM alignment and prediction. Our alignment method is similar to existing methods of statistical alignment but uses the conserved binding sites to improve alignment. Our CRM prediction method deals with the inherent uncertainties of binding site annotations and sequence alignment in a probabilistic framework. In simulated as well as real data, we demonstrate that our program is able to improve both alignment and prediction of CRM sequences over several state-of-the-art methods. Finally, we used alignments produced by our program to study binding site conservation in genome-wide binding data of key transcription factors in the Drosophila blastoderm, with two intriguing results: (i) the factor-bound sequences are under strong evolutionary constraints even if their neighboring genes are not expressed in the blastoderm and (ii) binding sites in distal bound sequences (relative to transcription start sites) tend to be more conserved than those in proximal regions. Our approach is implemented as software, EMMA (Evolutionary Model-based cis-regulatory Module Analysis), ready to be applied in a broad biological context.

  5. A probabilistic water erosion model for Mediterranean olive orchards with changing cover factor

    NASA Astrophysics Data System (ADS)

    Espejo Perez, A. J.; Giraldez Cervera, J. V.; Vanderlinden, K.

    2012-04-01

    A simple probabilistic framework is presented to describe soil and water loss in olive orchards in Mediterranean environments. The model is based on the exploration of field observations obtained during three hydrological years (2003-2007) from a network of 1 m2 microplots located in olive orchards throughout southern Spain. The objective of this experiment was to compare soil erosion under conventional tillage (CT) and a cover crop system (CC). The basis of the model is a linear relationship between soil and water loss (output) and key variables (input). The exploration of field observations suggested that the key variables were i) rainfall, which was easily described by a gamma probability density function (pdf); ii) slope, for which we adopted a uniform pdf with values ranging from 4 to 24%; and iii) the cover factor. This factor could be well described using a truncated beta pdf, but due to the growing trend in the data we proposed an expression similar to a sigmoid curve. Runoff and sediment yield under both soil managements were best represented by exponential pdfs. To generalize the model we combined it with a Monte Carlo scheme to generate the inputs randomly. The model was run using the simulated input data, and the relative frequencies of the simulated output data were compared with the proposed pdfs for the observed data. The results showed the ability of the model to provide a probabilistic description of soil erosion. Observed and simulated data indicated that the probability of obtaining higher soil losses was larger under CT than under CC. Therefore, conservationist soil management is essential for maintaining the productivity of olive orchards in this area. Keywords: soil management, erosion processes in olive orchards, probability density function, Monte Carlo scheme.
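
    A compact illustration of the Monte Carlo scheme described above (gamma rainfall, uniform slope, sigmoid cover factor feeding a simple linear loss relation) is sketched below. All distribution parameters and loss coefficients are invented placeholders, not the values fitted in the study.

        import numpy as np

        rng = np.random.default_rng(0)
        n = 10_000

        # Sample the input pdfs (illustrative parameter values)
        rainfall = rng.gamma(shape=2.0, scale=15.0, size=n)     # event rainfall, mm
        slope = rng.uniform(4.0, 24.0, size=n)                   # slope, %
        season = rng.uniform(0.0, 1.0, size=n)                   # time within season
        cover = 1.0 / (1.0 + np.exp(-8.0 * (season - 0.5)))      # sigmoid cover factor

        # Linear input-output relation with hypothetical coefficients;
        # the cover crop reduces soil loss relative to conventional tillage
        soil_loss = np.maximum(0.0, 0.002 * rainfall * slope * (1.0 - 0.8 * cover))

        print(np.percentile(soil_loss, [50, 90, 99]))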

  6. Unification of models for choice between delayed reinforcers.

    PubMed

    Killeen, P R; Fantino, E

    1990-01-01

    Two models for choice between delayed reinforcers, Fantino's delay-reduction theory and Killeen's incentive theory, are reviewed. Incentive theory is amended to incorporate the effects of arousal on alternate types of behavior that might block the reinforcement of the target behavior. This amended version is shown to differ from the delay-reduction theory in a term that is an exponential in incentive theory and a difference in delay-reduction theory. A power series approximation to the exponential generates a model that is formally identical with delay-reduction theory. Correlations between delay-reduction theory and the amended incentive theory show excellent congruence over a range of experimental conditions. Although the assumptions that gave rise to delay-reduction theory and incentive theory remain different and testable, the models deriving from the theories are unlikely to be discriminable by parametric experimental tests. This congruence of the models is recognized by naming the common model the delayed reinforcement model, which is then compared with other models of choice such as Killeen and Fetterman's (1988) behavioral theory of timing, Mazur's (1984) equivalence rule, and Vaughan's (1985) melioration theory.

  7. Unification of models for choice between delayed reinforcers.

    PubMed Central

    Killeen, P R; Fantino, E

    1990-01-01

    Two models for choice between delayed reinforcers, Fantino's delay-reduction theory and Killeen's incentive theory, are reviewed. Incentive theory is amended to incorporate the effects of arousal on alternate types of behavior that might block the reinforcement of the target behavior. This amended version is shown to differ from the delay-reduction theory in a term that is an exponential in incentive theory and a difference in delay-reduction theory. A power series approximation to the exponential generates a model that is formally identical with delay-reduction theory. Correlations between delay-reduction theory and the amended incentive theory show excellent congruence over a range of experimental conditions. Although the assumptions that gave rise to delay-reduction theory and incentive theory remain different and testable, the models deriving from the theories are unlikely to be discriminable by parametric experimental tests. This congruence of the models is recognized by naming the common model the delayed reinforcement model, which is then compared with other models of choice such as Killeen and Fetterman's (1988) behavioral theory of timing, Mazur's (1984) equivalence rule, and Vaughan's (1985) melioration theory. PMID:2299288

  8. Probabilistic multi-item inventory model with varying mixture shortage cost under restrictions.

    PubMed

    Fergany, Hala A

    2016-01-01

    This paper proposes a new general probabilistic multi-item, single-source inventory model with varying mixture shortage cost under two restrictions. One of them is on the expected varying backorder cost and the other is on the expected varying lost sales cost. This model is formulated to analyze how the firm can deduce the optimal order quantity and the optimal reorder point for each item to reach the main goal of minimizing the expected total cost. The demand is a random variable and the lead time is a constant. The demand during the lead time is a random variable that follows any continuous distribution, for example the normal, exponential or chi-square distribution. An application with real data is analyzed and the goal of minimizing the expected total cost is achieved. Two special cases are deduced.

  9. Modeling of the Sedimentary Interbedded Basalt Stratigraphy for the Idaho National Laboratory Probabilistic Seismic Hazard Analysis

    SciTech Connect

    Suzette Payne

    2007-08-01

    This report summarizes how the effects of the sedimentary interbedded basalt stratigraphy were modeled in the probabilistic seismic hazard analysis (PSHA) of the Idaho National Laboratory (INL). Drill holes indicate the bedrock beneath INL facilities is composed of about 1.1 km of alternating layers of basalt rock and loosely consolidated sediments. Alternating layers of hard rock and “soft” loose sediments tend to attenuate seismic energy more than uniform rock does, due to scattering and damping. The INL PSHA incorporated the effects of the sedimentary interbedded basalt stratigraphy by developing site-specific shear (S) wave velocity profiles. The profiles were used in the PSHA to model the near-surface site response by developing site-specific stochastic attenuation relationships.

  10. Modeling of the Sedimentary Interbedded Basalt Stratigraphy for the Idaho National Laboratory Probabilistic Seismic Hazard Analysis

    SciTech Connect

    Suzette Payne

    2006-04-01

    This report summarizes how the effects of the sedimentary interbedded basalt stratigraphy were modeled in the probabilistic seismic hazard analysis (PSHA) of the Idaho National Laboratory (INL). Drill holes indicate the bedrock beneath INL facilities is composed of about 1.1 km of alternating layers of basalt rock and loosely consolidated sediments. Alternating layers of hard rock and “soft” loose sediments tend to attenuate seismic energy more than uniform rock does, due to scattering and damping. The INL PSHA incorporated the effects of the sedimentary interbedded basalt stratigraphy by developing site-specific shear (S) wave velocity profiles. The profiles were used in the PSHA to model the near-surface site response by developing site-specific stochastic attenuation relationships.

  11. Probabilistic Model of Onset Detection Explains Paradoxes in Human Time Perception

    PubMed Central

    Nikolov, Stanislav; Rahnev, Dobromir A.; Lau, Hakwan C.

    2010-01-01

    A very basic computational model is proposed to explain two puzzling findings in the time perception literature. First, spontaneous motor actions are preceded by up to 1–2 s of preparatory activity (Kornhuber and Deecke, 1965). Yet, subjects are only consciously aware of about a quarter of a second of motor preparation (Libet et al., 1983). Why are they not aware of the early part of preparation? Second, psychophysical findings (Spence et al., 2001) support the principle of attention prior entry (Titchener, 1908), which states that attended stimuli are perceived faster than unattended stimuli. However, electrophysiological studies reported no or little corresponding temporal difference between the neural signals for attended and unattended stimuli (McDonald et al., 2005; Vibell et al., 2007). We suggest that the key to understanding these puzzling findings is to think of onset detection in probabilistic terms. The two apparently paradoxical phenomena are naturally predicted by our signal detection theoretic model. PMID:21833206

  12. Probabilistic multi-item inventory model with varying mixture shortage cost under restrictions.

    PubMed

    Fergany, Hala A

    2016-01-01

    This paper proposes a new general probabilistic multi-item, single-source inventory model with varying mixture shortage cost under two restrictions. One of them is on the expected varying backorder cost and the other is on the expected varying lost sales cost. This model is formulated to analyze how the firm can deduce the optimal order quantity and the optimal reorder point for each item to reach the main goal of minimizing the expected total cost. The demand is a random variable and the lead time is a constant. The demand during the lead time is a random variable that follows any continuous distribution, for example the normal, exponential or chi-square distribution. An application with real data is analyzed and the goal of minimizing the expected total cost is achieved. Two special cases are deduced. PMID:27588244

  13. A probabilistic quantitative risk assessment model for the long-term work zone crashes.

    PubMed

    Meng, Qiang; Weng, Jinxian; Qu, Xiaobo

    2010-11-01

    Work zones, especially long-term work zones, increase traffic conflicts and cause safety problems. Proper casualty risk assessment for a work zone is of importance for both traffic safety engineers and travelers. This paper develops a novel probabilistic quantitative risk assessment (QRA) model to evaluate the casualty risk, combining the frequency and consequence of all accident scenarios triggered by long-term work zone crashes. The casualty risk is measured by the individual risk and the societal risk. The individual risk can be interpreted as the frequency of a driver/passenger being killed or injured, and the societal risk describes the relation between frequency and the number of casualties. The proposed probabilistic QRA model consists of the estimation of work zone crash frequency, an event tree, and consequence estimation models. There are seven intermediate events in the event tree: age (A), crash unit (CU), vehicle type (VT), alcohol (AL), light condition (LC), crash type (CT) and severity (S). Since the estimated probability of some intermediate events may have large uncertainty, that uncertainty is characterized by a random variable. The consequence estimation model takes into account the combined effects of speed and emergency medical service response time (ERT) on the consequence of a work zone crash. Finally, a numerical example based on the Southeast Michigan work zone crash data is carried out. The numerical results show that there will be a 62% decrease in individual fatality risk and a 44% reduction in individual injury risk if the mean travel speed is slowed down by 20%. In addition, there will be a 5% reduction in individual fatality risk and a 0.05% reduction in individual injury risk if ERT is reduced by 20%. In other words, slowing down speed is more effective than reducing ERT in casualty risk mitigation.

  14. SCEC/CME CyberShake: Probabilistic Seismic Hazard Analysis Using 3D Seismic Waveform Modeling

    NASA Astrophysics Data System (ADS)

    Callaghan, S.; Maechling, P. J.; Cui, Y.; Faerman, M.; Field, E.; Graves, R.; Gupta, N.; Gupta, V.; Jordan, T. H.; Kesselman, C.; Mehta, G.; Okaya, D.; Vahi, K.; Zhao, L.

    2005-12-01

    Researchers on the SCEC Community Modeling Environment (SCEC/CME) Project are calculating probabilistic seismic hazard curves for several sites in the Los Angeles area. The hazard curves calculated in this study use Intensity Measure Relationships (IMRs) based on 3D ground motion simulations rather than on attenuation relationships. State-of-the-art Probabilistic Seismic Hazard Analysis (PSHA) is currently conducted using IMRs that use empirically based attenuation relationships. These attenuation relationships represent relatively simple analytical models based on the regression of observed data. However, it is widely believed that significant improvements in SHA will rely on the use of more physics-based waveform modeling. In fact, a more physics-based approach to PSHA was endorsed in a recent assessment of earthquake science by the National Research Council (2003). In order to introduce the use of 3D seismic waveform modeling into PSHA hazard curve calculations, the SCEC/CME CyberShake group is integrating state-of-the-art PSHA software tools (OpenSHA), SCEC-developed geophysical models (SCEC CVM3.0), validated anelastic wave modeling (AWM) software, and state-of-the-art computational technologies, including high performance computing and grid-based scientific workflows, in an effort to develop an OpenSHA-compatible 3D waveform-based IMR component. This will allow researchers to combine a new class of waveform-based IMRs with the large number of existing PSHA components, such as Earthquake Rupture Forecasts (ERFs), that are currently implemented in the OpenSHA system. To calculate a probabilistic hazard curve for a site of interest, we use the OpenSHA implementation of the NSHMP-2002 ERF and identify all ruptures within 200 km of the site of interest. For each of these ruptures, we convert the NSHMP-2002 rupture definition into one or more Ruptures with Slip Time History (Rupture Variations) using newly developed Rupture Generator software. Strain Green Tensors are

  15. Probabilistic neural networks modeling of the 48-h LC50 acute toxicity endpoint to Daphnia magna.

    PubMed

    Niculescu, S P; Lewis, M A; Tigner, J

    2008-01-01

    Two modeling experiments based on the maximum likelihood estimation paradigm and targeting prediction of the Daphnia magna 48-h LC50 acute toxicity endpoint for both organic and inorganic compounds are reported. The resulting models' computational algorithms are implemented as basic probabilistic neural networks with a Gaussian kernel (statistical corrections included). The first experiment uses strictly D. magna information for 971 structures as training/learning data, and the resulting model targets practical applications. The second experiment uses the same training/learning information plus additional data on another 29 compounds whose endpoint information originates from D. pulex and Ceriodaphnia dubia. It only targets investigation of the effect of mixing strictly D. magna 48-h LC50 modeling information with small amounts of similar information estimated from related species, and this is done as part of the validation process. A complementary dataset of 81 compounds (involving only strictly D. magna information) is used to perform external testing. On this external test set, the Gaussian character of the distribution of the residuals is confirmed for both models. This allows the use of traditional statistical methodology to compute confidence intervals for the unknown measured values based on the models' predictions. Examples are provided for the model targeting practical applications. For the same model, a comparison with other existing models targeting the same endpoint is performed.
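
    In the regression setting, a Gaussian-kernel probabilistic/general regression network amounts to a kernel-weighted average of the training endpoints, as in the minimal sketch below. The descriptors, endpoint values and bandwidth are made-up numbers, and the statistical corrections mentioned in the abstract are not included; this is only a generic illustration of the Gaussian-kernel idea, not the authors' implementation.

        import numpy as np

        def grnn_predict(X_train, y_train, X_query, sigma=1.0):
            # Gaussian-kernel regression network prediction: each training point
            # contributes a Gaussian bump around its descriptors, and the prediction
            # is the kernel-weighted average of the training endpoint values.
            X_train, X_query = np.atleast_2d(X_train), np.atleast_2d(X_query)
            d2 = ((X_query[:, None, :] - X_train[None, :, :]) ** 2).sum(-1)
            w = np.exp(-d2 / (2.0 * sigma ** 2))
            return (w @ np.asarray(y_train, dtype=float)) / w.sum(axis=1)

        # Hypothetical scaled descriptors (e.g. logP, molecular weight) and log LC50 values
        X = np.array([[0.2, 0.1], [0.8, 0.4], [0.5, 0.9], [0.1, 0.7]])
        y = np.array([1.2, 0.3, 0.8, 1.5])
        print(grnn_predict(X, y, [[0.4, 0.5]], sigma=0.3))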

  16. ToPS: a framework to manipulate probabilistic models of sequence data.

    PubMed

    Kashiwabara, André Yoshiaki; Bonadio, Igor; Onuchic, Vitor; Amado, Felipe; Mathias, Rafael; Durham, Alan Mitchell

    2013-01-01

    Discrete Markovian models can be used to characterize patterns in sequences of values and have many applications in biological sequence analysis, including gene prediction, CpG island detection, alignment, and protein profiling. We present ToPS, a computational framework that can be used to implement different applications in bioinformatics analysis by combining eight kinds of models: (i) independent and identically distributed process; (ii) variable-length Markov chain; (iii) inhomogeneous Markov chain; (iv) hidden Markov model; (v) profile hidden Markov model; (vi) pair hidden Markov model; (vii) generalized hidden Markov model; and (viii) similarity-based sequence weighting. The framework includes functionality for training, simulation and decoding of the models. Additionally, it provides two methods to help with parameter setting: the Akaike and Bayesian information criteria (AIC and BIC). The models can be used stand-alone, combined in Bayesian classifiers, or included in more complex, multi-model, probabilistic architectures using GHMMs. In particular, the framework provides a novel, flexible implementation of decoding in GHMMs that detects when the architecture can be traversed efficiently. PMID:24098098
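
    ToPS offers AIC and BIC to help set parameters such as the order of a Markov chain. The sketch below illustrates that kind of order selection for a simple fixed-order Markov chain; it is an independent illustration, not ToPS code, and the toy sequence is made up.

```python
import numpy as np

def markov_log_likelihood(seq, order, alphabet):
    """Maximum-likelihood log-likelihood of a fixed-order Markov chain and its parameter count."""
    counts = {}
    for i in range(order, len(seq)):
        ctx, sym = tuple(seq[i - order:i]), seq[i]
        counts.setdefault(ctx, {}).setdefault(sym, 0)
        counts[ctx][sym] += 1
    ll = 0.0
    for sym_counts in counts.values():
        total = sum(sym_counts.values())
        for c in sym_counts.values():
            ll += c * np.log(c / total)
    n_params = (len(alphabet) ** order) * (len(alphabet) - 1)
    return ll, n_params

def select_order(seq, alphabet, max_order=4):
    """Score candidate Markov-chain orders with AIC and BIC (lower is better)."""
    n = len(seq)
    scores = {}
    for k in range(max_order + 1):
        ll, p = markov_log_likelihood(seq, k, alphabet)
        scores[k] = {"AIC": 2 * p - 2 * ll, "BIC": p * np.log(n) - 2 * ll}
    return scores

seq = list("ACGT" * 50 + "AAAA" * 10)   # toy nucleotide sequence
print(select_order(seq, alphabet="ACGT"))
```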

  17. A 3-D probabilistic stability model incorporating the variability of root reinforcement

    NASA Astrophysics Data System (ADS)

    Cislaghi, Alessio; Chiaradia, Enrico; Battista Bischetti, Gian

    2016-04-01

    Process-oriented models of hillslope stability have great potential to improve spatially distributed landslide hazard analyses. At the same time, they may have severe limitations, and among them the variability and uncertainty of the parameters play a key role. In this context, the application of a probabilistic approach through Monte Carlo techniques can be the right practice to deal with the variability of each input parameter by considering a proper probability distribution. In forested areas an additional point must be taken into account: the reinforcement due to roots permeating the soil and its variability and uncertainty. While the probability distributions of geotechnical and hydrological parameters have been widely investigated, little is known concerning the variability and the spatial heterogeneity of root reinforcement. Moreover, there are still many difficulties in measuring and evaluating such a variable. In our study we aim to: (i) implement a robust procedure to evaluate the variability of root reinforcement as a probability distribution, according to the stand characteristics of forests, such as tree density, the average diameter at breast height, and the minimum distance among trees; and (ii) combine a multidimensional process-oriented model with a Monte Carlo simulation technique, to obtain a probability distribution of the Factor of Safety. The proposed approach has been applied to a small Alpine area, mainly covered by a coniferous forest and characterized by steep slopes and a high landslide hazard. The obtained results show a good reliability of the model according to the landslide inventory map. Ultimately, our findings contribute to improving the reliability of landslide hazard mapping in forested areas and help forest managers evaluate different management scenarios.
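
    A minimal sketch of the Monte Carlo step is given below, using an infinite-slope factor-of-safety formula with a lognormal root-reinforcement term as a stand-in for the multidimensional model; all parameter values and distributions are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000                                         # Monte Carlo realizations

# Illustrative (assumed) probability distributions for the inputs
phi    = np.radians(rng.normal(33.0, 3.0, N))                    # friction angle
c_soil = rng.lognormal(mean=np.log(2.0), sigma=0.5, size=N)      # soil cohesion, kPa
c_root = rng.lognormal(mean=np.log(5.0), sigma=0.8, size=N)      # root reinforcement, kPa
m      = rng.uniform(0.0, 1.0, N)                                # relative saturation
z, beta = 1.2, np.radians(35.0)                                  # failure depth (m), slope angle
gamma, gamma_w = 18.0, 9.81                                      # unit weights, kN/m^3

# Infinite-slope factor of safety with an added root-cohesion term
num = c_soil + c_root + (gamma - m * gamma_w) * z * np.cos(beta) ** 2 * np.tan(phi)
den = gamma * z * np.sin(beta) * np.cos(beta)
fs = num / den

print("P(FS < 1) =", np.mean(fs < 1.0))   # probability of failure at this cell
```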

  18. Simple model for multiple-choice collective decision making.

    PubMed

    Lee, Ching Hua; Lucas, Andrew

    2014-11-01

    We describe a simple model of heterogeneous, interacting agents making decisions between n≥2 discrete choices. For a special class of interactions, our model is the mean field description of random field Potts-like models and is effectively solved by finding the extrema of the average energy E per agent. In these cases, by studying the propagation of decision changes via avalanches, we argue that macroscopic dynamics is well captured by a gradient flow along E. We focus on the permutation symmetric case, where all n choices are (on average) the same, and spontaneous symmetry breaking (SSB) arises purely from cooperative social interactions. As examples, we show that bimodal heterogeneity naturally provides a mechanism for the spontaneous formation of hierarchies between decisions and that SSB is a preferred instability to discontinuous phase transitions between two symmetric points. Beyond the mean field limit, exponentially many stable equilibria emerge when we place this model on a graph of finite mean degree. We conclude with speculation on decision making with persistent collective oscillations. Throughout the paper, we emphasize analogies between methods of solution to our model and common intuition from diverse areas of physics, including statistical physics and electromagnetism. PMID:25493831

  20. Alterations in choice behavior by manipulations of world model.

    PubMed

    Green, C S; Benson, C; Kersten, D; Schrater, P

    2010-09-14

    How to compute initially unknown reward values makes up one of the key problems in reinforcement learning theory, with two basic approaches being used. Model-free algorithms rely on the accumulation of substantial amounts of experience to compute the value of actions, whereas in model-based learning, the agent seeks to learn the generative process for outcomes from which the value of actions can be predicted. Here we show that (i) "probability matching", a consistent example of suboptimal choice behavior seen in humans, occurs in an optimal Bayesian model-based learner using a max decision rule that is initialized with ecologically plausible, but incorrect, beliefs about the generative process for outcomes and (ii) human behavior can be strongly and predictably altered by the presence of cues suggestive of various generative processes, despite statistically identical outcome generation. These results suggest human decision making is rational and model based and not consistent with model-free learning.
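
    As a minimal illustration of the two decision rules discussed above (not the paper's model, which attributes matching to incorrect beliefs about the generative process), the sketch below contrasts a Bayesian Beta-Bernoulli learner that uses a max rule with one that probability-matches on a stationary two-option task.

```python
import numpy as np

rng = np.random.default_rng(1)
p_reward = np.array([0.7, 0.3])     # true (stationary) reward probabilities
n_trials = 1000

def run(rule):
    alpha, beta = np.ones(2), np.ones(2)     # Beta(1, 1) priors on each option's reward rate
    choices = np.zeros(n_trials, dtype=int)
    for t in range(n_trials):
        post_mean = alpha / (alpha + beta)
        if rule == "max":                     # always exploit the currently best option
            c = int(np.argmax(post_mean))
        else:                                 # "match": choose in proportion to belief
            c = int(rng.random() < post_mean[1] / post_mean.sum())
        r = rng.random() < p_reward[c]        # sample a reward
        alpha[c] += r
        beta[c] += 1 - r
        choices[t] = c
    return np.mean(choices == 0)              # fraction of choices to the better option

print("max rule :", run("max"))
print("matching :", run("match"))
```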

  1. Probabilistic Graphical Models for the Analysis and Synthesis of Musical Audio

    NASA Astrophysics Data System (ADS)

    Hoffmann, Matthew Douglas

    Content-based Music Information Retrieval (MIR) systems seek to automatically extract meaningful information from musical audio signals. This thesis applies new and existing generative probabilistic models to several content-based MIR tasks: timbral similarity estimation, semantic annotation and retrieval, and latent source discovery and separation. In order to estimate how similar two songs sound to one another, we employ a Hierarchical Dirichlet Process (HDP) mixture model to discover a shared representation of the distribution of timbres in each song. Comparing songs under this shared representation yields better query-by-example retrieval quality and scalability than previous approaches. To predict what tags are likely to apply to a song (e.g., "rap," "happy," or "driving music"), we develop the Codeword Bernoulli Average (CBA) model, a simple and fast mixture-of-experts model. Despite its simplicity, CBA performs at least as well as state-of-the-art approaches at automatically annotating songs and finding to what songs in a database a given tag most applies. Finally, we address the problem of latent source discovery and separation by developing two Bayesian nonparametric models, the Shift-Invariant HDP and Gamma Process NMF. These models allow us to discover what sounds (e.g., bass drums, guitar chords, etc.) are present in a song or set of songs and to isolate or suppress individual sources. These models' ability to decide how many latent sources are necessary to model the data is particularly valuable in this application, since it is impossible to guess a priori how many sounds will appear in a given song or set of songs. Once they have been fit to data, probabilistic models can also be used to drive the synthesis of new musical audio, both for creative purposes and to qualitatively diagnose what information a model does and does not capture. We also adapt the SIHDP model to create new versions of input audio with arbitrary sample sets, for example, to create

  2. Erratum: Probabilistic application of a fugacity model to predict triclosan fate during wastewater treatment.

    PubMed

    Bock, Michael; Lyndall, Jennifer; Barber, Timothy; Fuchsman, Phyllis; Perruchon, Elyse; Capdevielle, Marie

    2010-10-01

    The fate and partitioning of the antimicrobial compound, triclosan, in wastewater treatment plants (WWTPs) is evaluated using a probabilistic fugacity model to predict the range of triclosan concentrations in effluent and secondary biosolids. The WWTP model predicts 84% to 92% triclosan removal, which is within the range of measured removal efficiencies (typically 70% to 98%). Triclosan is predominantly removed by sorption and subsequent settling of organic particulates during primary treatment and by aerobic biodegradation during secondary treatment. Median modeled removal efficiency due to sorption is 40% for all treatment phases and 31% in the primary treatment phase. Median modeled removal efficiency due to biodegradation is 48% for all treatment phases and 44% in the secondary treatment phase. Important factors contributing to variation in predicted triclosan concentrations in effluent and biosolids include influent concentrations, solids concentrations in settling tanks, and factors related to solids retention time. Measured triclosan concentrations in biosolids and non-United States (US) effluent are consistent with model predictions. However, median concentrations in US effluent are over-predicted with this model, suggesting that differences in some aspect of treatment practices not incorporated in the model (e.g., disinfection methods) may affect triclosan removal from effluent. Model applications include predicting changes in environmental loadings associated with new triclosan applications and supporting risk analyses for biosolids-amended land and effluent receiving waters.

  3. Probabilistic application of a fugacity model to predict triclosan fate during wastewater treatment.

    PubMed

    Bock, Michael; Lyndall, Jennifer; Barber, Timothy; Fuchsman, Phyllis; Perruchon, Elyse; Capdevielle, Marie

    2010-07-01

    The fate and partitioning of the antimicrobial compound, triclosan, in wastewater treatment plants (WWTPs) is evaluated using a probabilistic fugacity model to predict the range of triclosan concentrations in effluent and secondary biosolids. The WWTP model predicts 84% to 92% triclosan removal, which is within the range of measured removal efficiencies (typically 70% to 98%). Triclosan is predominantly removed by sorption and subsequent settling of organic particulates during primary treatment and by aerobic biodegradation during secondary treatment. Median modeled removal efficiency due to sorption is 40% for all treatment phases and 31% in the primary treatment phase. Median modeled removal efficiency due to biodegradation is 48% for all treatment phases and 44% in the secondary treatment phase. Important factors contributing to variation in predicted triclosan concentrations in effluent and biosolids include influent concentrations, solids concentrations in settling tanks, and factors related to solids retention time. Measured triclosan concentrations in biosolids and non-United States (US) effluent are consistent with model predictions. However, median concentrations in US effluent are over-predicted with this model, suggesting that differences in some aspect of treatment practices not incorporated in the model (e.g., disinfection methods) may affect triclosan removal from effluent. Model applications include predicting changes in environmental loadings associated with new triclosan applications and supporting risk analyses for biosolids-amended land and effluent receiving waters.
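
    A minimal sketch of a probabilistic (Monte Carlo) treatment of WWTP removal is given below; it simply propagates assumed distributions for influent concentration and removal fractions, and is not the fugacity model used in the study.

```python
import numpy as np

rng = np.random.default_rng(7)
N = 50_000

# Illustrative (assumed) input distributions
influent   = rng.lognormal(mean=np.log(5.0), sigma=0.4, size=N)  # ug/L in influent
f_sorption = rng.uniform(0.30, 0.50, N)     # fraction removed with settled solids
f_biodeg   = rng.uniform(0.35, 0.55, N)     # fraction of the remainder biodegraded

effluent = influent * (1.0 - f_sorption) * (1.0 - f_biodeg)
removal = 1.0 - effluent / influent

print("median removal: %.0f%%" % (100 * np.median(removal)))
print("5th-95th percentile effluent: %.2f-%.2f ug/L"
      % tuple(np.percentile(effluent, [5, 95])))
```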

  4. Probabilistic conditional reasoning: Disentangling form and content with the dual-source model.

    PubMed

    Singmann, Henrik; Klauer, Karl Christoph; Beller, Sieghard

    2016-08-01

    The present research examines descriptive models of probabilistic conditional reasoning, that is, of reasoning from uncertain conditionals with contents about which reasoners have rich background knowledge. According to our dual-source model, two types of information shape such reasoning: knowledge-based information elicited by the contents of the material and content-independent information derived from the form of inferences. Two experiments implemented manipulations that selectively influenced the model parameters for the knowledge-based information, the relative weight given to form-based versus knowledge-based information, and the parameters for the form-based information, validating the psychological interpretation of these parameters. We apply the model to classical suppression effects, dissecting them into effects on background knowledge and effects on form-based processes (Exp. 3), and we use it to reanalyse previous studies manipulating reasoning instructions. In a model-comparison exercise based on data from seven studies, the dual-source model outperformed three Bayesian competitor models. Overall, our results support the view that people make use of background knowledge in line with current Bayesian models, but they also suggest that the form of the conditional argument, irrespective of its content, plays a substantive, yet smaller, role. PMID:27416493

  5. Multi-level approach for statistical appearance models with probabilistic correspondences

    NASA Astrophysics Data System (ADS)

    Krüger, Julia; Ehrhardt, Jan; Handels, Heinz

    2016-03-01

    Statistical shape and appearance models are often based on the accurate identification of one-to-one correspondences in a training data set. At the same time, the determination of these corresponding landmarks is the most challenging part of such methods. Hufnagel et al. [1] developed an alternative method using correspondence probabilities for a statistical shape model. In Krüger et al. [2, 3] we propose the use of probabilistic correspondences for statistical appearance models by incorporating appearance information into the framework. We employ a point-based representation of image data combining position and appearance information. The model is optimized and adapted by a maximum a-posteriori (MAP) approach, deriving a single global optimization criterion with respect to model parameters and observation-dependent parameters that directly affects shape and appearance information of the considered structures. Because initially unknown correspondence probabilities are used and a higher number of degrees of freedom is introduced to the model, a regularization of the model generation process is advantageous. For this purpose we extend the derived global criterion by a regularization term which penalizes implausible topological changes. Furthermore, we propose a multi-level approach for the optimization, to increase the robustness of the model generation process.

  6. A time-dependent probabilistic seismic-hazard model for California

    USGS Publications Warehouse

    Cramer, C.H.; Petersen, M.D.; Cao, T.; Toppozada, Tousson R.; Reichle, M.

    2000-01-01

    For the purpose of sensitivity testing and illuminating nonconsensus components of time-dependent models, the California Department of Conservation, Division of Mines and Geology (CDMG) has assembled a time-dependent version of its statewide probabilistic seismic hazard (PSH) model for California. The model incorporates available consensus information from within the earth-science community, except for a few faults or fault segments where consensus information is not available. For these latter faults, published information has been incorporated into the model. As in the 1996 CDMG/U.S. Geological Survey (USGS) model, the time-dependent models incorporate three multisegment ruptures: a 1906, an 1857, and a southern San Andreas earthquake. Sensitivity tests are presented to show the effect on hazard and expected damage estimates of (1) intrinsic (aleatory) sigma, (2) multisegment (cascade) vs. independent segment (no cascade) ruptures, and (3) time-dependence vs. time-independence. Results indicate that (1) differences in hazard and expected damage estimates between time-dependent and independent models increase with decreasing intrinsic sigma, (2) differences in hazard and expected damage estimates between full cascading and not cascading are insensitive to intrinsic sigma, (3) differences in hazard increase with increasing return period (decreasing probability of occurrence), and (4) differences in moment-rate budgets increase with decreasing intrinsic sigma and with the degree of cascading, but are within the expected uncertainty in PSH time-dependent modeling and do not always significantly affect hazard and expected damage estimates.

  8. Probabilistic Boolean Network Modelling and Analysis Framework for mRNA Translation.

    PubMed

    Zhao, Yun-Bo; Krishnan, J

    2016-01-01

    mRNA translation is a complex process involving the progression of ribosomes on the mRNA, resulting in the synthesis of proteins, and is subject to multiple layers of regulation. This process has been modelled using different formalisms, both stochastic and deterministic. Recently, we introduced a Probabilistic Boolean modelling framework for mRNA translation, which possesses the advantage of tools for numerically exact computation of steady state probability distribution, without requiring simulation. Here, we extend this model to incorporate both random sequential and parallel update rules, and demonstrate its effectiveness in various settings, including its flexibility in accommodating additional static and dynamic biological complexities and its role in parameter sensitivity analysis. In these applications, the results from the model analysis match those of TASEP model simulations. Importantly, the proposed modelling framework maintains the stochastic aspects of mRNA translation and provides a way to exactly calculate probability distributions, providing additional tools of analysis in this context. Finally, the proposed modelling methodology provides an alternative approach to the understanding of the mRNA translation process, by bridging the gap between existing approaches, providing new analysis tools, and contributing to a more robust platform for modelling and understanding translation.
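
    The exact steady-state computation mentioned above can be illustrated on a toy example: build the full state-transition matrix of a small probabilistic Boolean system and extract its stationary distribution. The two update rules and the selection probability below are invented for illustration and are far simpler than the translation model itself.

```python
import numpy as np
from itertools import product

# Two Boolean "sites"; each update applies rule A with prob p, rule B with 1 - p.
p = 0.7
rule_a = lambda x: (x[1], x[0])        # swap the two sites
rule_b = lambda x: (x[0] ^ 1, x[1])    # flip the first site

states = list(product([0, 1], repeat=2))
idx = {s: i for i, s in enumerate(states)}

# Exact Markov transition matrix over the 4 joint states
P = np.zeros((4, 4))
for s in states:
    P[idx[s], idx[rule_a(s)]] += p
    P[idx[s], idx[rule_b(s)]] += 1 - p

# Stationary distribution: left eigenvector of P with eigenvalue 1
w, v = np.linalg.eig(P.T)
pi = np.real(v[:, np.argmin(np.abs(w - 1.0))])
pi /= pi.sum()
print(dict(zip(states, np.round(pi, 3))))
```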

  9. A Model-Based Probabilistic Inversion Framework for Wire Fault Detection Using TDR

    NASA Technical Reports Server (NTRS)

    Schuet, Stefan R.; Timucin, Dogan A.; Wheeler, Kevin R.

    2010-01-01

    Time-domain reflectometry (TDR) is one of the standard methods for diagnosing faults in electrical wiring and interconnect systems, with a long-standing history focused mainly on hardware development of both high-fidelity systems for laboratory use and portable hand-held devices for field deployment. While these devices can easily assess distance to hard faults such as sustained opens or shorts, their ability to assess subtle but important degradation such as chafing remains an open question. This paper presents a unified framework for TDR-based chafing fault detection in lossy coaxial cables by combining an S-parameter based forward modeling approach with a probabilistic (Bayesian) inference algorithm. Results are presented for the estimation of nominal and faulty cable parameters from laboratory data.

  10. Behavioral Modeling Based on Probabilistic Finite Automata: An Empirical Study

    PubMed Central

    Tîrnăucă, Cristina; Montaña, José L.; Ontañón, Santiago; González, Avelino J.; Pardo, Luis M.

    2016-01-01

    Imagine an agent that performs tasks according to different strategies. The goal of Behavioral Recognition (BR) is to identify which of the available strategies is the one being used by the agent, by simply observing the agent’s actions and the environmental conditions during a certain period of time. The goal of Behavioral Cloning (BC) is more ambitious. In this last case, the learner must be able to build a model of the behavior of the agent. In both settings, the only assumption is that the learner has access to a training set that contains instances of observed behavioral traces for each available strategy. This paper studies a machine learning approach based on Probabilistic Finite Automata (PFAs), capable of achieving both the recognition and cloning tasks. We evaluate the performance of PFAs in the context of a simulated learning environment (in this case, a virtual Roomba vacuum cleaner robot), and compare it with a collection of other machine learning approaches. PMID:27347956

  11. Life Prediction and Classification of Failure Modes in Solid State Luminaires Using Bayesian Probabilistic Models

    SciTech Connect

    Lall, Pradeep; Wei, Junchao; Sakalaukus, Peter

    2014-05-27

    A new method has been developed for assessment of the onset of degradation in solid state luminaires, to classify failure mechanisms by using metrics beyond the lumen degradation currently used for identification of failure. Luminous flux output and correlated color temperature data on Philips LED lamps have been gathered under 85°C/85% RH until lamp failure. The acquired data have been used in conjunction with Bayesian probabilistic models to identify luminaires with onset of degradation well before failure, through identification of decision boundaries between lamps with accrued damage and lamps beyond the failure threshold in the feature space. In addition, luminaires with different failure modes have been classified separately from healthy pristine luminaires. It is expected that the new test technique will allow the development of failure distributions without testing to L70 life for the manifestation of failure.

  12. A probabilistic model for the persistence of early planar fabrics in polydeformed pelitic schists

    USGS Publications Warehouse

    Ferguson, C.C.

    1984-01-01

    Although early planar fabrics are commonly preserved within microlithons in low-grade pelites, in higher-grade (amphibolite facies) pelitic schists fabric regeneration often appears complete. Evidence for early fabrics may be preserved within porphyroblasts but, within the matrix, later deformation often appears to totally obliterate or reorient earlier fabrics. However, examination of several hundred Dalradian pelites from Connemara, western Ireland, reveals that preservation of early fabrics is by no means uncommon; relict matrix domains, although volumetrically insignificant, are remarkably persistent even when inferred later strains are very large and fabric regeneration appears, at first sight, complete. Deterministic plasticity theories are ill-suited to the analysis of such an inhomogeneous material response, and a probabilistic model is proposed instead. It assumes that ductile polycrystal deformation is controlled by elementary flow units which can be activated once their associated stress barrier is overcome. Bulk flow propensity is related to the proportion of simultaneous activations, and a measure of this is derived from the probabilistic interaction between a stress-barrier spectrum and an internal stress spectrum (the latter determined by the external loading and the details of internal stress transfer). The spectra are modelled as Gaussian distributions although the treatment is very general and could be adapted for other distributions. Using the time rate of change of activation probability it is predicted that, initially, fabric development will be rapid but will then slow down dramatically even though stress increases at a constant rate. This highly non-linear response suggests that early fabrics persist because they comprise unfavourable distributions of stress-barriers which remain unregenerated at the time bulk stress is stabilized by steady-state flow. Relict domains will, however, bear the highest stress and are potential upper
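
    The central quantity of the model, the probability that the internal stress exceeds a flow unit's stress barrier, has a closed form when both spectra are Gaussian. A minimal sketch with illustrative parameter values:

```python
from math import erf, sqrt

def activation_probability(mu_stress, sd_stress, mu_barrier, sd_barrier):
    """P(internal stress S exceeds stress barrier B) for independent Gaussians.

    S ~ N(mu_s, sd_s^2), B ~ N(mu_b, sd_b^2)  =>  S - B is Gaussian, and
    P(S > B) = Phi((mu_s - mu_b) / sqrt(sd_s^2 + sd_b^2)).
    """
    z = (mu_stress - mu_barrier) / sqrt(sd_stress ** 2 + sd_barrier ** 2)
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

# Relict domains: an unfavourably high barrier spectrum yields few activations
print(activation_probability(mu_stress=80, sd_stress=10, mu_barrier=100, sd_barrier=15))
print(activation_probability(mu_stress=80, sd_stress=10, mu_barrier=70,  sd_barrier=15))
```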

  13. Advanced Nuclear Fuel Cycle Transitions: Optimization, Modeling Choices, and Disruptions

    NASA Astrophysics Data System (ADS)

    Carlsen, Robert W.

    Many nuclear fuel cycle simulators have evolved over time to help understand the nuclear industry/ecosystem at a macroscopic level. Cyclus is one of the first fuel cycle simulators to accommodate larger-scale analysis with its liberal open-source licensing and first-class Linux support. Cyclus also has features that uniquely enable investigating the effects of modeling choices on fuel cycle simulators and scenarios. This work is divided into three experiments focusing on optimization, effects of modeling choices, and fuel cycle uncertainty. Effective optimization techniques are developed for automatically determining desirable facility deployment schedules with Cyclus. A novel method for mapping optimization variables to deployment schedules is developed. This allows relationships between reactor types and scenario constraints to be represented implicitly in the variable definitions, enabling the usage of optimizers lacking constraint support. It also prevents wasting computational resources evaluating infeasible deployment schedules. Deployed power capacity over time and deployment of non-reactor facilities are also included as optimization variables. There are many fuel cycle simulators built with different combinations of modeling choices. Comparing results between them is often difficult. Cyclus' flexibility allows comparing effects of many such modeling choices. Reactor refueling cycle synchronization and inter-facility competition, among other effects, are compared in four cases, each using combinations of fleets of individually modeled reactors with 1-month or 3-month time steps. There are noticeable differences in results for the different cases. The largest differences occur during periods of constrained reactor fuel availability. This and similar work can help improve the quality of fuel cycle analysis generally. There is significant uncertainty associated with deploying new nuclear technologies, such as time-frames for technology availability and the cost of building advanced reactors

  14. A Probabilistic Model for Students' Errors and Misconceptions on the Structure of Matter in Relation to Three Cognitive Variables

    ERIC Educational Resources Information Center

    Tsitsipis, Georgios; Stamovlasis, Dimitrios; Papageorgiou, George

    2012-01-01

    In this study, the effect of three cognitive variables (logical thinking, field dependence/field independence, and convergent/divergent thinking) on some specific students' answers related to the particulate nature of matter was investigated by means of probabilistic models. Besides recording and tabulating the students' responses, a combination…

  15. Comparison of Four Probabilistic Models (CARES, Calendex, ConsEspo, SHEDS) to Estimate Aggregate Residential Exposures to Pesticides

    EPA Science Inventory

    Two deterministic models (US EPA’s Office of Pesticide Programs Residential Standard Operating Procedures (OPP Residential SOPs) and Draft Protocol for Measuring Children’s Non-Occupational Exposure to Pesticides by all Relevant Pathways (Draft Protocol)) and four probabilistic mo...

  16. Probabilistic assessment of contamination using the two-phase flow model.

    PubMed

    Chen, Guan-Zhi; Hsu, Kuo-Chin; Lee, Cheng-Haw

    2003-08-01

    A physically motivated model is indispensable for a successful analysis of the impact of leaching from nuclear waste storage sites on the environment and public health. While most analyses use the single-phase flow model for modelling unsaturated flow and solute transport, the two-phase flow model, which considers the resistance of gas to water flow, is a more realistic one. The effect of the two-phase flow model on the water content is theoretically investigated first in this study. Then, by combining a geostatistical generator using the turning bands method with the multi-phase transport code TOUGH2, an automatic process is used for Monte Carlo simulation of the solute transport. This stochastic approach is applied to a site in Taiwan potentially polluted by low-level nuclear waste. In the simulation, the saturated hydraulic conductivity is treated as the random variable. The stochastic approach provides a probabilistic assessment of contamination. The results show that even though the water content from the two-phase flow model is only 1.5% less than that from the single-phase flow model, the two-phase flow causes a slower movement but a wider lateral spreading of the plume in the unsaturated zone. The stochastic approach provides useful probability information which is not available from the deterministic approach. The probabilistic assessment of groundwater contamination provides the basis for more informed waste management, better environmental assessment and improved evaluation of impact on public health.

  17. Probabilistic Stack of 180 Plio-Pleistocene Benthic δ18O Records Constructed Using Profile Hidden Markov Models

    NASA Astrophysics Data System (ADS)

    Lisiecki, L. E.; Ahn, S.; Khider, D.; Lawrence, C.

    2015-12-01

    Stratigraphic alignment is the primary way in which long marine climate records are placed on a common age model. We previously presented a probabilistic pairwise alignment algorithm, HMM-Match, which uses hidden Markov models to estimate alignment uncertainty, and applied it to the alignment of benthic δ18O records to the "LR04" global benthic stack of Lisiecki and Raymo (2005) (Lin et al., 2014). However, since the LR04 stack is deterministic, the algorithm does not account for uncertainty in the stack. Here we address this limitation by developing a probabilistic stack, HMM-Stack. In this model the stack is a probabilistic inhomogeneous hidden Markov model, a.k.a. profile HMM. HMM-Stack is represented by a probabilistic model that "emits" each of the input records (Durbin et al., 1998). The unknown parameters of this model are learned from a set of input records using the expectation maximization (EM) algorithm. Because the multiple alignment of these records is unknown and uncertain, the expected contribution of each input point to each point in the stack is determined probabilistically. For each time step in HMM-Stack, δ18O values are described by a Gaussian probability distribution. Available δ18O records (N=180) are employed to estimate the mean and variance of δ18O at each time point. The mean of HMM-Stack follows the predicted pattern of glacial cycles, with increased amplitude after the Pliocene-Pleistocene boundary and also larger and longer cycles after the mid-Pleistocene transition. Furthermore, the δ18O variance increases with age, producing a substantial loss in the signal-to-noise ratio. Not surprisingly, uncertainty in alignment, and thus in estimated age, also increases substantially in the older portion of the stack.

  18. Learning a generative probabilistic grammar of experience: a process-level model of language acquisition.

    PubMed

    Kolodny, Oren; Lotem, Arnon; Edelman, Shimon

    2015-03-01

    We introduce a set of biologically and computationally motivated design choices for modeling the learning of language, or of other types of sequential, hierarchically structured experience and behavior, and describe an implemented system that conforms to these choices and is capable of unsupervised learning from raw natural-language corpora. Given a stream of linguistic input, our model incrementally learns a grammar that captures its statistical patterns, which can then be used to parse or generate new data. The grammar constructed in this manner takes the form of a directed weighted graph, whose nodes are recursively (hierarchically) defined patterns over the elements of the input stream. We evaluated the model in seventeen experiments, grouped into five studies, which examined, respectively, (a) the generative ability of grammar learned from a corpus of natural language, (b) the characteristics of the learned representation, (c) sequence segmentation and chunking, (d) artificial grammar learning, and (e) certain types of structure dependence. The model's performance largely vindicates our design choices, suggesting that progress in modeling language acquisition can be made on a broad front, ranging from issues of generativity to the replication of human experimental findings, by bringing biological and computational considerations, as well as lessons from prior efforts, to bear on the modeling approach.

  19. [Ventilators for anesthesia. Models available in France. Criteria for choice].

    PubMed

    Otteni, J C; Ancellin, J; Cazalaà, J B; Clergue, F; Feiss, P; Fougère, S; Nivoche, Y; Safran, D

    1995-01-01

    This update article discusses the criteria for the choice of an anaesthetic machine and provides a short analysis of the main components of the models commercialized in France in 1994. The following items are considered: the design of the machine, the fresh gas delivery system, the anaesthesia breathing system(s), the ventilator and the waste gas scavenging system, the monitors associated with the machine and other criteria such as facility of learning to run the machine and of its daily use, ease of "in-house" maintenance and quality of after-sales service, cost of the machine and of its use (driving gas, disposable equipment). PMID:7677278

  20. FOGCAST: Probabilistic fog forecasting based on operational (high-resolution) NWP models

    NASA Astrophysics Data System (ADS)

    Masbou, M.; Hacker, M.; Bentzien, S.

    2013-12-01

    The presence of fog and low clouds in the lower atmosphere can have a critical impact on both airborne and ground transport and is often connected with serious accidents. Improving the prediction of fog localization, duration and variations in visibility therefore holds immense operational value. Fog is generally a small-scale phenomenon and is mostly affected by local advective transport, radiation, turbulent mixing at the surface, and its microphysical structure. Sophisticated three-dimensional fog models, based on advanced microphysical parameterization schemes and high vertical resolution, have already been developed and give promising results. Nevertheless, their computational time is beyond the range of an operational setup. Therefore, mesoscale numerical weather prediction models are generally used for forecasting all kinds of weather situations. In spite of numerous improvements, the large uncertainty of small-scale weather events inherent in deterministic prediction cannot be evaluated adequately. Probabilistic guidance is necessary to assess these uncertainties and give reliable forecasts. In this study, fog forecasts are obtained by a diagnosis scheme similar to the Fog Stability Index (FSI) based on COSMO-DE model outputs. COSMO-DE is the German-focused high-resolution operational weather prediction model of the German Meteorological Service. The FSI and the respective fog occurrence probability are optimized and calibrated with statistical postprocessing in terms of logistic regression. In a second step, the predictor number of the FOGCAST model has been optimized by use of the LASSO method (Least Absolute Shrinkage and Selection Operator). The results present an objective out-of-sample verification based on the Brier score, performed for station data over Germany. Furthermore, the probabilistic fog forecast approach, FOGCAST, serves as a benchmark for the evaluation of more sophisticated 3D fog models. Several versions have been set up based on different
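
    A minimal sketch of the calibration step, logistic regression mapping a fog index to a fog probability and scoring it with the Brier score, is shown below; the index values and fog observations are synthetic stand-ins, not COSMO-DE output.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import brier_score_loss

# Hypothetical training data: a stability-type fog index computed from model
# output (lower index = higher fog risk) and observed fog occurrence (0/1).
rng = np.random.default_rng(3)
fsi = rng.uniform(0, 60, size=5000)
fog_obs = (rng.random(5000) < 1.0 / (1.0 + np.exp(0.15 * (fsi - 30)))).astype(int)

# Logistic regression turns the raw index into a calibrated fog probability
calib = LogisticRegression().fit(fsi.reshape(-1, 1), fog_obs)

new_fsi = np.array([[10.0], [30.0], [50.0]])
print("P(fog):", calib.predict_proba(new_fsi)[:, 1])
print("Brier score:",
      brier_score_loss(fog_obs, calib.predict_proba(fsi.reshape(-1, 1))[:, 1]))
```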

  1. Multi-State Physics Models of Aging Passive Components in Probabilistic Risk Assessment

    SciTech Connect

    Unwin, Stephen D.; Lowry, Peter P.; Layton, Robert F.; Heasler, Patrick G.; Toloczko, Mychailo B.

    2011-03-13

    Multi-state Markov modeling has proved to be a promising approach to estimating the reliability of passive components - particularly metallic pipe components - in the context of probabilistic risk assessment (PRA). These models consider the progressive degradation of a component through a series of observable discrete states, such as detectable flaw, leak and rupture. Service data then generally provides the basis for estimating the state transition rates. Research in materials science is producing a growing understanding of the physical phenomena that govern the aging degradation of passive pipe components. As a result, there is an emerging opportunity to incorporate these insights into PRA. This paper describes research conducted under the Risk-Informed Safety Margin Characterization Pathway of the Department of Energy’s Light Water Reactor Sustainability Program. A state transition model is described that addresses aging behavior associated with stress corrosion cracking in ASME Class 1 dissimilar metal welds – a component type relevant to LOCA analysis. The state transition rate estimates are based on physics models of weld degradation rather than service data. The resultant model is found to be non-Markov in that the transition rates are time-inhomogeneous and stochastic. Numerical solutions to the model provide insight into the effect of aging on component reliability.
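
    A minimal sketch of a multi-state degradation model of this kind is given below: state probabilities (intact, flaw, leak, rupture) are integrated forward with a time-dependent transition-rate matrix. The rates are illustrative assumptions, not the physics-based estimates of the study.

```python
import numpy as np

# States: 0 intact, 1 detectable flaw, 2 leak, 3 rupture (absorbing).
# lambda_01 grows with time to mimic aging-driven crack initiation.
def Q(t):
    l01 = 1e-3 * (1 + 0.05 * t)     # per year, illustrative
    l12, l23 = 5e-3, 1e-3
    return np.array([[-l01,  l01,  0.0,  0.0],
                     [ 0.0, -l12,  l12,  0.0],
                     [ 0.0,  0.0, -l23,  l23],
                     [ 0.0,  0.0,  0.0,  0.0]])

p = np.array([1.0, 0.0, 0.0, 0.0])      # start fully intact
dt, T = 0.1, 60.0                        # years
for step in range(int(T / dt)):
    t = step * dt
    p = p + dt * (p @ Q(t))              # forward-Euler integration of dp/dt = p Q(t)

print("state probabilities after 60 years:", np.round(p, 5))
print("P(rupture by 60 years) =", p[3])
```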

  2. Multi-model approach to petroleum resource appraisal using analytic methodologies for probabilistic systems

    USGS Publications Warehouse

    Crovelli, R.A.

    1988-01-01

    The geologic appraisal model that is selected for a petroleum resource assessment depends upon purpose of the assessment, basic geologic assumptions of the area, type of available data, time available before deadlines, available human and financial resources, available computer facilities, and, most importantly, the available quantitative methodology with corresponding computer software and any new quantitative methodology that would have to be developed. Therefore, different resource assessment projects usually require different geologic models. Also, more than one geologic model might be needed in a single project for assessing different regions of the study or for cross-checking resource estimates of the area. Some geologic analyses used in the past for petroleum resource appraisal involved play analysis. The corresponding quantitative methodologies of these analyses usually consisted of Monte Carlo simulation techniques. A probabilistic system of petroleum resource appraisal for play analysis has been designed to meet the following requirements: (1) includes a variety of geologic models, (2) uses an analytic methodology instead of Monte Carlo simulation, (3) possesses the capacity to aggregate estimates from many areas that have been assessed by different geologic models, and (4) runs quickly on a microcomputer. Geologic models consist of four basic types: reservoir engineering, volumetric yield, field size, and direct assessment. Several case histories and present studies by the U.S. Geological Survey are discussed. ?? 1988 International Association for Mathematical Geology.

  3. A probabilistic tornado wind hazard model for the continental United States

    SciTech Connect

    Hossain, Q; Kimball, J; Mensing, R; Savy, J

    1999-04-19

    A probabilistic tornado wind hazard model for the continental United States (CONUS) is described. The model incorporates both aleatory (random) and epistemic uncertainties associated with quantifying the tornado wind hazard parameters. The temporal occurrence of tornadoes within the CONUS is assumed to be a Poisson process. A spatial distribution of tornado touchdown locations is developed empirically based on the observed historical events within the CONUS. The hazard model is an areal probability model that takes into consideration the size and orientation of the facility, the length and width of the tornado damage area (idealized as a rectangle and dependent on the tornado intensity scale), wind speed variation within the damage area, tornado intensity classification errors (i.e., errors in assigning a Fujita intensity scale based on surveyed damage), and the tornado path direction. Epistemic uncertainties in describing the distributions of the aleatory variables are accounted for by using more than one distribution model to describe aleatory variations. The epistemic uncertainties are based on inputs from a panel of experts. A computer program, TORNADO, has been developed incorporating this model; features of this program are also presented.
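
    A minimal sketch of a point-strike tornado hazard calculation in the same spirit is given below, combining a Poisson occurrence rate, damage-area sizes by intensity class, and conditional wind exceedance probabilities; every number is an illustrative assumption and the real model treats geometry and uncertainty in far more detail.

```python
import numpy as np

# Illustrative point-strike model: tornadoes occur as a Poisson process in a
# region, and the facility is affected when it falls inside a damage area.
region_area   = 1.0e5        # km^2 covered by the historical catalogue
annual_rate   = 25.0         # tornadoes per year in the region
facility_area = 0.02         # effective target area of the facility, km^2

# Mean damage area, fraction of events, and P(wind > design speed | class)
damage_area    = {"F2": 1.0,  "F3": 5.0,  "F4": 15.0}   # km^2 (assumed)
scale_fraction = {"F2": 0.70, "F3": 0.25, "F4": 0.05}
p_wind_exceed  = {"F2": 0.0,  "F3": 0.4,  "F4": 0.9}

# Annual rate of exceeding the design wind speed at the facility
lam = sum(annual_rate * scale_fraction[f]
          * (damage_area[f] + facility_area) / region_area
          * p_wind_exceed[f]
          for f in damage_area)
print("annual exceedance probability ~", 1 - np.exp(-lam))
```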

  4. Multi-model approach to petroleum resource appraisal using analytic methodologies for probabilistic systems

    SciTech Connect

    Crovelli, R.A.

    1988-11-01

    The geologic appraisal model that is selected for a petroleum resource assessment depends upon purpose of the assessment, basic geologic assumptions of the area, type of available data, time available before deadlines, available human and financial resources, available computer facilities, and, most importantly, the available quantitative methodology with corresponding computer software and any new quantitative methodology that would have to be developed. Therefore, different resource assessment projects usually require different geologic models. Also, more than one geologic model might be needed in a single project for assessing different regions of the study or for cross-checking resource estimates of the area. Some geologic analyses used in the past for petroleum resource appraisal involved play analysis. The corresponding quantitative methodologies of these analyses usually consisted of Monte Carlo simulation techniques. A probabilistic system of petroleum resource appraisal for play analysis has been designed to meet the following requirements: (1) includes a variety of geologic models, (2) uses an analytic methodology instead of Monte Carlo simulation, (3) possesses the capacity to aggregate estimates from many areas that have been assessed by different geologic models, and (4) runs quickly on a microcomputer. Geologic models consist of four basic types: reservoir engineering, volumetric yield, field size, and direct assessment. Several case histories and present studies by the US Geological Survey are discussed.

  5. A probabilistic union model with automatic order selection for noisy speech recognition.

    PubMed

    Jancovic, P; Ming, J

    2001-09-01

    A critical issue in exploiting the potential of the sub-band-based approach to robust speech recognition is the method of combining the sub-band observations, for selecting the bands unaffected by noise. A new method for this purpose, i.e., the probabilistic union model, was recently introduced. This model has been shown to be capable of dealing with band-limited corruption, requiring no knowledge about the band position and statistical distribution of the noise. A parameter within the model, which we call its order, gives the best results when it equals the number of noisy bands. Since this information may not be available in practice, in this paper we introduce an automatic algorithm for selecting the order, based on the state duration pattern generated by the hidden Markov model (HMM). The algorithm has been tested on the TIDIGITS database corrupted by various types of additive band-limited noise with unknown noisy bands. The results have shown that the union model equipped with the new algorithm can achieve a recognition performance similar to that achieved when the number of noisy bands is known. The results show a very significant improvement over the traditional full-band model, without requiring prior information on either the position or the number of noisy bands. The principle of the algorithm for selecting the order based on state duration may also be applied to other sub-band combination methods.

  6. Additional evidence for a dual-strategy model of reasoning: Probabilistic reasoning is more invariant than reasoning about logical validity.

    PubMed

    Markovits, Henry; Brisson, Janie; de Chantal, Pier-Luc

    2015-11-01

    One of the major debates concerning the nature of inferential reasoning is between counterexample-based strategies such as mental model theory and the statistical strategies underlying probabilistic models. The dual-strategy model proposed by Verschueren, Schaeken, and d'Ydewalle (2005a, 2005b) suggests that people might have access to both kinds of strategies. One of the postulates of this approach is that statistical strategies correspond to low-cost, intuitive modes of evaluation, whereas counterexample strategies are higher-cost and more variable in use. We examined this hypothesis by using a deductive-updating paradigm. The results of Study 1 showed that individual differences in strategy use predict different levels of deductive updating on inferences about logical validity. Study 2 demonstrated no such variation when explicitly probabilistic inferences were examined. Study 3 showed that presenting updating problems with probabilistic inferences modified performance on subsequent problems using logical validity, whereas the opposite was not true. These results provide clear evidence that the processes used to make probabilistic inferences are less subject to variation than those used to make inferences of logical validity.

  7. The use of food consumption data in assessments of exposure to food chemicals including the application of probabilistic modelling.

    PubMed

    Lambe, Joyce

    2002-02-01

    Emphasis on public health and consumer protection, in combination with globalisation of the food market, has created a strong demand for exposure assessments of food chemicals. The food chemicals for which exposure assessments are required include food additives, pesticide residues, environmental contaminants, mycotoxins, novel food ingredients, packaging-material migrants, flavouring substances and nutrients. A wide range of methodologies exists for estimating exposure to food chemicals, and the method chosen for a particular exposure assessment is influenced by the nature of the chemical, the purpose of the assessment and the resources available. Sources of food consumption data currently used in exposure assessments range from food balance sheets to detailed food consumption surveys of individuals and duplicate-diet studies. The fitness-for-purpose of the data must be evaluated in the context of data quality and relevance to the assessment objective. Methods to combine the food consumption data with chemical concentration data may be deterministic or probabilistic. Deterministic methods estimate intakes of food chemicals that may occur in a population, but probabilistic methods provide the advantage of estimating the probability with which different levels of intake will occur. Probabilistic analysis permits the exposure assessor to model the variability (true heterogeneity) and uncertainty (lack of knowledge) that may exist in the exposure variables, including food consumption data, and thus to examine the full distribution of possible resulting exposures. Challenges for probabilistic modelling include the selection of appropriate modes of inputting food consumption data into the models. PMID:12002785
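
    A minimal sketch of the probabilistic (Monte Carlo) approach described above: assumed distributions for consumption, chemical concentration and body weight are propagated to a distribution of intake, from which exceedance probabilities can be read off. All values, including the acceptable daily intake, are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(11)
N = 100_000

# Illustrative (assumed) distributions for one food-chemical combination
consumption   = rng.lognormal(np.log(150.0), 0.6, N)      # g food per day
concentration = rng.lognormal(np.log(0.05), 0.9, N)       # mg chemical per kg food
body_weight   = rng.normal(70.0, 12.0, N).clip(30, None)  # kg

# Probabilistic exposure estimate (mg per kg body weight per day)
exposure = consumption / 1000.0 * concentration / body_weight

adi = 0.01   # hypothetical acceptable daily intake, mg/kg bw/day
print("median exposure:", np.median(exposure))
print("95th percentile:", np.percentile(exposure, 95))
print("P(exposure > ADI):", np.mean(exposure > adi))
```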

  9. A performance weighting procedure for GCMs based on explicit probabilistic models and accounting for observation uncertainty

    NASA Astrophysics Data System (ADS)

    Renard, Benjamin; Vidal, Jean-Philippe

    2016-04-01

    In recent years, the climate modeling community has put a lot of effort into releasing the outputs of multimodel experiments for use by the wider scientific community. In such experiments, several structurally distinct GCMs are run using the same observed forcings (for the historical period) or the same projected forcings (for the future period). In addition, several members are produced for a single given model structure, by running each GCM with slightly different initial conditions. This multiplicity of GCM outputs offers many opportunities in terms of uncertainty quantification or GCM comparisons. In this presentation, we propose a new procedure to weight GCMs according to their ability to reproduce the observed climate. Such weights can be used to combine the outputs of several models in a way that rewards good-performing models and discards poorly-performing ones. The proposed procedure has the following main properties: 1. It is based on explicit probabilistic models describing the time series produced by the GCMs and the corresponding historical observations, 2. It can use several members whenever available, 3. It accounts for the uncertainty in observations, 4. It assigns a weight to each GCM (all weights summing up to one), 5. It can also assign a weight to the "H0 hypothesis" that all GCMs in the multimodel ensemble are not compatible with observations. The application of the weighting procedure is illustrated with several case studies including synthetic experiments, simple cases where the target GCM output is a simple univariate variable and more realistic cases where the target GCM output is a multivariate and/or a spatial variable. These case studies illustrate the generality of the procedure which can be applied in a wide range of situations, as long as the analyst is prepared to make an explicit probabilistic assumption on the target variable. Moreover, these case studies highlight several interesting properties of the weighting procedure. In
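
    A minimal sketch of a likelihood-based weighting of this kind is shown below: each model's members define a Gaussian predictive distribution, inflated by the observation error variance, and weights are proportional to the likelihood of the observations. It omits the H0 weight and other elements of the proposed procedure; the data are synthetic.

```python
import numpy as np

def gcm_weights(members_by_model, obs, obs_sd):
    """Illustrative likelihood-based weights for a set of GCMs.

    members_by_model: list of 2-D arrays [n_members, n_years] of simulated values.
    obs: observed series (n_years).  obs_sd: observation error std (scalar).
    """
    logls = []
    for members in members_by_model:
        mu = members.mean(axis=0)
        var = members.var(axis=0) + obs_sd ** 2      # member spread + obs uncertainty
        logls.append(np.sum(-0.5 * np.log(2 * np.pi * var)
                            - 0.5 * (obs - mu) ** 2 / var))
    logls = np.array(logls)
    w = np.exp(logls - logls.max())                  # stabilize before normalizing
    return w / w.sum()

# Synthetic example: three "models" with increasing bias relative to observations
rng = np.random.default_rng(5)
truth = np.linspace(14.0, 15.0, 50) + rng.normal(0, 0.1, 50)
models = [truth + rng.normal(b, 0.2, size=(5, 50)) for b in (0.0, 0.5, 2.0)]
print(gcm_weights(models, truth, obs_sd=0.1))
```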

  10. A state-based probabilistic model for tumor respiratory motion prediction

    NASA Astrophysics Data System (ADS)

    Kalet, Alan; Sandison, George; Wu, Huanmei; Schmitz, Ruth

    2010-12-01

    This work proposes a new probabilistic mathematical model for predicting tumor motion and position based on a finite state representation using the natural breathing states of exhale, inhale and end of exhale. Tumor motion was broken down into linear breathing states and sequences of states. Breathing state sequences and the observables representing those sequences were analyzed using a hidden Markov model (HMM) to predict the future sequences and new observables. Velocities and other parameters were clustered using a k-means clustering algorithm to associate each state with a set of observables such that a prediction of state also enables a prediction of tumor velocity. A time average model with predictions based on average past state lengths was also computed. State sequences which are known a priori to fit the data were fed into the HMM algorithm to set a theoretical limit of the predictive power of the model. The effectiveness of the presented probabilistic model has been evaluated for gated radiation therapy based on previously tracked tumor motion in four lung cancer patients. Positional prediction accuracy is compared with actual position in terms of the overall RMS errors. Various system delays, ranging from 33 to 1000 ms, were tested. Previous studies have shown duty cycles for latencies of 33 and 200 ms at around 90% and 80%, respectively, for linear, no prediction, Kalman filter and ANN methods as averaged over multiple patients. At 1000 ms, the previously reported duty cycles range from approximately 62% (ANN) down to 34% (no prediction). Average duty cycle for the HMM method was found to be 100% and 91 ± 3% for 33 and 200 ms latency and around 40% for 1000 ms latency in three out of four breathing motion traces. RMS errors were found to be lower than linear and no prediction methods at latencies of 1000 ms. The results show that for system latencies longer than 400 ms, the time average HMM prediction outperforms linear, no prediction, and the more

  11. Using the Rasch model as an objective and probabilistic technique to integrate different soil properties

    NASA Astrophysics Data System (ADS)

    Rebollo, Francisco J.; Jesús Moral García, Francisco

    2016-04-01

Soil apparent electrical conductivity (ECa) is one of the simplest, least expensive soil measurements that integrates many soil properties affecting crop productivity, including, for instance, soil texture, water content, and cation exchange capacity. The ECa measurements obtained with a 3100 Veris sensor, operating in both shallow (0-30 cm), ECs, and deep (0-90 cm), ECd, mode, can be used as additional and essential information in a probabilistic model, the Rasch model, with the aim of quantifying the overall soil fertility potential of an agricultural field. This quantification should integrate the main soil physical and chemical properties, which are measured in different units. In this work, the formulation of the Rasch model integrates 11 soil properties (clay, silt and sand content, organic matter -OM-, pH, total nitrogen -TN-, available phosphorus -AP- and potassium -AK-, cation exchange capacity -CEC-, ECd, and ECs) measured at 70 locations in a field. The main outputs of the model include a ranking of all soil samples according to their relative fertility potential and the identification of unexpected behaviours of some soil samples and properties. In the case study, the soil variables considered fit the model reasonably well and have an important influence on soil fertility, except pH, probably because of its homogeneity in the field. Moreover, ECd and ECs are the most influential properties on soil fertility, whereas AP and AK are the least influential. The use of the Rasch model to estimate soil fertility potential (always in a relative way, taking into account the characteristics of the studied soil) constitutes a new application of great practical importance, making it possible to rationally determine locations in a field with high soil fertility potential and to flag soil samples or properties that behave anomalously; this information can be necessary to conduct site-specific treatments, leading to a more cost-effective and sustainable field
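
    An illustrative sketch of the Rasch logic only (the dichotomization at the median and the crude logit estimates below are assumptions for the example, not the estimation procedure used in the study): the model gives the probability that soil sample n responds "high" on property i as a logistic function of the sample's fertility parameter theta_n minus the property's difficulty delta_i.

    ```python
    import numpy as np

    def rasch_prob(theta, delta):
        """P(sample with fertility theta scores 'high' on a property of difficulty delta)."""
        return 1.0 / (1.0 + np.exp(-(theta - delta)))

    # Toy data: rows = soil samples, columns = 5 properties dichotomized at their median
    X = np.array([[1, 1, 1, 0, 1],
                  [1, 0, 1, 0, 0],
                  [0, 0, 1, 0, 0],
                  [1, 1, 1, 1, 1]])

    eps = 0.5  # continuity correction to avoid infinite logits
    theta = np.log((X.sum(axis=1) + eps) / (X.shape[1] - X.sum(axis=1) + eps))   # crude sample estimates
    delta = -np.log((X.sum(axis=0) + eps) / (X.shape[0] - X.sum(axis=0) + eps))  # crude property estimates

    ranking = np.argsort(-theta)          # samples ordered by relative fertility potential
    print("sample ranking:", ranking)
    print("P(sample 0 high on property 3):", round(float(rasch_prob(theta[0], delta[3])), 3))
    ```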

  12. Face Recognition for Access Control Systems Combining Image-Difference Features Based on a Probabilistic Model

    NASA Astrophysics Data System (ADS)

    Miwa, Shotaro; Kage, Hiroshi; Hirai, Takashi; Sumi, Kazuhiko

We propose a probabilistic face recognition algorithm for Access Control Systems (ACSs). Compared with existing ACSs that use low-cost IC cards, face recognition has advantages in usability and security: it does not require people to hold cards over scanners and does not accept impostors carrying authorized cards. Face recognition therefore attracts more interest in security markets than IC cards. But in security markets where low-cost ACSs exist, price competition is important, and there is a limitation on the quality of available cameras and image control. ACSs using face recognition are therefore required to handle much lower-quality images, such as defocused and poorly gain-controlled images, than high-security systems such as immigration control. To tackle such image-quality problems, we developed a face recognition algorithm based on a probabilistic model which combines a variety of image-difference features trained by Real AdaBoost with their prior probability distributions. This makes it possible to evaluate and use only the reliable features among the trained ones during each authentication, and to achieve high recognition performance. A field evaluation using a pseudo Access Control System installed in our office shows that the proposed system achieves a consistently high recognition performance rate independent of face image quality, namely an EER (Equal Error Rate) about four times lower, under a variety of image conditions, than that of a system without any prior probability distributions. In contrast, using image-difference features without any prior probabilities is sensitive to image quality. We also evaluated PCA, which has worse but constant performance because of its general optimization over the whole dataset. Compared with PCA, Real AdaBoost without any prior distribution performs twice as well under good image conditions, but degrades to a performance comparable to PCA under poor image conditions.

  13. Performance and Prediction: Bayesian Modelling of Fallible Choice in Chess

    NASA Astrophysics Data System (ADS)

    Haworth, Guy; Regan, Ken; di Fatta, Giuseppe

    Evaluating agents in decision-making applications requires assessing their skill and predicting their behaviour. Both are well developed in Poker-like situations, but less so in more complex game and model domains. This paper addresses both tasks by using Bayesian inference in a benchmark space of reference agents. The concepts are explained and demonstrated using the game of chess but the model applies generically to any domain with quantifiable options and fallible choice. Demonstration applications address questions frequently asked by the chess community regarding the stability of the rating scale, the comparison of players of different eras and/or leagues, and controversial incidents possibly involving fraud. The last include alleged under-performance, fabrication of tournament results, and clandestine use of computer advice during competition. Beyond the model world of games, the aim is to improve fallible human performance in complex, high-value tasks.
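
    A hedged sketch of the inference idea (the softmax family of reference agents below is an assumption for illustration; the paper defines its own benchmark space of fallible agents): each candidate agent has a skill parameter c, chooses move m with probability proportional to exp(c * eval[m]), and Bayes' rule converts observed move choices into a posterior over c.

    ```python
    import numpy as np

    def move_probs(evals, c):
        """Softmax over engine evaluations; higher c means a stronger, less fallible agent."""
        z = np.exp(c * (evals - np.max(evals)))
        return z / z.sum()

    def posterior_over_skill(positions, chosen, skills):
        """positions: list of eval arrays; chosen: index of the move actually played in each."""
        log_post = np.zeros(len(skills))          # flat prior over reference agents
        for evals, m in zip(positions, chosen):
            for k, c in enumerate(skills):
                log_post[k] += np.log(move_probs(np.asarray(evals), c)[m])
        post = np.exp(log_post - log_post.max())
        return post / post.sum()

    positions = [[0.3, 0.1, -0.5], [1.2, 1.0, 0.2], [0.0, -0.2, -0.9]]  # toy engine scores
    chosen = [0, 0, 1]                            # moves actually played
    skills = np.linspace(0.5, 8.0, 16)            # benchmark space of reference agents
    post = posterior_over_skill(positions, chosen, skills)
    print("most probable skill parameter:", skills[np.argmax(post)])
    ```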

  14. A family of models for Schelling binary choices

    NASA Astrophysics Data System (ADS)

    Cavalli, Fausto; Naimzada, Ahmad; Pireddu, Marina

    2016-02-01

We introduce and study a family of discrete-time dynamical systems to model binary choices based on the framework proposed by Schelling in 1973. The model we propose uses a gradient-like adjustment mechanism realized by a family of smooth maps, and it allows the phenomena qualitatively described by Schelling to be understood and studied analytically. In particular, we investigate the existence of steady states and their relation to the equilibria of the static model studied by Schelling, and we analyze their local stability, linking several examples and considerations provided by Schelling with bifurcation theory. We provide examples to confirm the theoretical results and to numerically investigate possible destabilizations, as well as the emergence of coexisting attractors. We show the existence of chaos for a particular example.
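
    A toy sketch of a gradient-like adjustment map for binary choices (an illustrative functional form, not the specific family of maps analyzed in the paper): x_t is the fraction choosing action A, the payoff functions follow Schelling's setup, and the fraction moves in the direction of the payoff difference, with interior steady states where A(x) = B(x).

    ```python
    def A(x):  # payoff to choosing A when a fraction x already chooses A
        return 1.0 + 2.0 * x

    def B(x):  # payoff to choosing B
        return 2.0 - 1.5 * x

    def step(x, speed=0.3):
        """Logistic-style adjustment toward the currently better-paying choice."""
        x_new = x + speed * x * (1 - x) * (A(x) - B(x))
        return min(max(x_new, 0.0), 1.0)

    x = 0.2
    for _ in range(50):
        x = step(x)
    # With these payoffs the interior equilibrium (x = 2/7) is unstable, so the
    # population is driven toward one of the boundary equilibria.
    print("long-run fraction choosing A:", round(x, 3))
    ```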

  15. Probabilistic modeling of school meals for potential bisphenol A (BPA) exposure.

    PubMed

    Hartle, Jennifer C; Fox, Mary A; Lawrence, Robert S

    2016-01-01

Many endocrine-disrupting chemicals (EDCs), including bisphenol A (BPA), are approved for use in food packaging, with unbound BPA migrating into the foods it contacts. Children, with their developing organ systems, are especially susceptible to hormone disruption, prompting this research to model the potential dose of BPA from school-provided meals. Probabilistic exposure models for school meals were informed by mixed methods. Exposure scenarios were based on United States school nutrition guidelines and included meals with varying levels of exposure potential from canned and packaged food. Modeled BPA exposure potentials ranged from 0.00049 μg/kg-BW/day, for a middle school student eating a low-exposure breakfast with plate waste, to 1.19 μg/kg-BW/day for an elementary school student eating a lunch with high exposure potential. The modeled BPA doses from school meals are below the current US EPA Oral Reference Dose (RfD) of 50 μg/kg-BW/day. Recent research shows BPA animal toxicity thresholds at 2 μg/kg-BW/day. The single-meal doses modeled in this research are of the same order of magnitude as the low-dose toxicity thresholds, illustrating the potential for school meals to expose children to chronic toxic levels of BPA.

  16. A probabilistic model for hydrokinetic turbine collision risks: exploring impacts on fish.

    PubMed

    Hammar, Linus; Eggertsen, Linda; Andersson, Sandra; Ehnberg, Jimmy; Arvidsson, Rickard; Gullström, Martin; Molander, Sverker

    2015-01-01

A variety of hydrokinetic turbines are currently under development for power generation in rivers, tidal straits and ocean currents. Because some of these turbines are large, with rapidly moving rotor blades, the risk of collision with aquatic animals has drawn attention. The behavior and fate of animals that approach such large hydrokinetic turbines have not yet been monitored in any detail. In this paper, we conduct a synthesis of the current knowledge and understanding of hydrokinetic turbine collision risks. The outcome is a generic fault-tree-based probabilistic model suitable for estimating population-level ecological risks. New video-based data on fish behavior in strong currents are provided and models describing fish avoidance behaviors are presented. The findings indicate low risk for small-sized fish. However, at large turbines (≥5 m), bigger fish seem to have a high probability of collision, mostly because rotor detection and avoidance is difficult in low visibility. Risks can therefore be substantial for vulnerable populations of large-sized fish, which thrive in strong currents. The suggested collision risk model can be applied to different turbine designs and at a variety of locations as a basis for case-specific risk assessments. The structure of the model facilitates successive model validation, refinement and application to other organism groups such as marine mammals.
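
    A back-of-envelope sketch of the fault-tree style arithmetic (the published model's structure and parameter values are more detailed; every number below is a placeholder): the per-passage collision probability is a product of conditional events, and individual-level risk accumulates over repeated passages.

    ```python
    def collision_probability(p_encounter, p_no_avoidance, p_strike_given_no_avoid):
        """Product of conditional event probabilities along one fault-tree branch."""
        return p_encounter * p_no_avoidance * p_strike_given_no_avoid

    # Large fish near a >= 5 m turbine in low visibility: avoidance is less likely.
    p_single_passage = collision_probability(p_encounter=0.05,
                                             p_no_avoidance=0.6,
                                             p_strike_given_no_avoid=0.3)
    n_passages = 200  # placeholder number of passages per individual per year
    p_annual = 1.0 - (1.0 - p_single_passage) ** n_passages
    print(f"per-passage: {p_single_passage:.4f}, annual: {p_annual:.3f}")
    ```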

  17. Development of Simplified Probabilistic Risk Assessment Model for Seismic Initiating Event

    SciTech Connect

    S. Khericha; R. Buell; S. Sancaktar; M. Gonzalez; F. Ferrante

    2012-06-01

This paper discusses a simplified method to evaluate seismic risk using a methodology built on dividing the seismic intensity spectrum into multiple discrete bins. The seismic probabilistic risk assessment model uses the Nuclear Regulatory Commission's (NRC's) full-power Standardized Plant Analysis Risk (SPAR) model as the starting point for development. The seismic PRA models are integrated with their respective internal-events at-power SPAR model. This is accomplished by combining the modified system fault trees from the full-power SPAR model with seismic event tree logic. The peak ground acceleration is divided into five bins. The g-value for each bin is estimated using the geometric mean of the lower and upper values of that particular bin, and the associated frequency for each bin is estimated by taking the difference between the upper and lower values of that bin. The component fragilities are calculated for each bin using plant data, if available, or generic values of median peak ground acceleration and uncertainty for the components. For human reliability analysis (HRA), the SPAR HRA (SPAR-H) method is used, which requires the analysts to complete relatively straightforward worksheets that include the performance shaping factors (PSFs). The results are then used to estimate the human error probabilities (HEPs) of interest. This work is expected to improve the NRC's ability to include seismic hazards in risk assessments for operational events in support of the reactor oversight program (e.g., the significance determination process).
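
    A numerical sketch of the binning arithmetic described above (the hazard-curve values and fragility parameters are placeholders, not SPAR data): each PGA bin gets a representative g-value from the geometric mean of its bounds, a frequency from the difference of the exceedance frequencies at those bounds, and a failure probability from a lognormal fragility curve.

    ```python
    import numpy as np
    from scipy.stats import norm

    bin_edges = np.array([0.1, 0.2, 0.4, 0.8, 1.6, 3.2])         # g, placeholder bin boundaries
    exceed_freq = np.array([1e-3, 4e-4, 1e-4, 2e-5, 3e-6, 3e-7])  # /yr at each edge, placeholder

    g_rep = np.sqrt(bin_edges[:-1] * bin_edges[1:])               # geometric mean per bin
    bin_freq = exceed_freq[:-1] - exceed_freq[1:]                 # annual frequency per bin

    a_median, beta = 0.9, 0.45                                    # generic fragility parameters
    p_fail = norm.cdf(np.log(g_rep / a_median) / beta)            # lognormal fragility per bin

    for g, f, p in zip(g_rep, bin_freq, p_fail):
        print(f"bin at {g:.2f} g: freq {f:.1e}/yr, P(fail) {p:.3f}")
    print(f"seismically induced failure frequency: {np.sum(bin_freq * p_fail):.2e} /yr")
    ```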

  1. Probabilistic and technology-specific modeling of emissions from municipal solid-waste incineration.

    PubMed

    Koehler, Annette; Peyer, Fabio; Salzmann, Christoph; Saner, Dominik

    2011-04-15

European legislation increasingly directs waste streams that cannot be recycled toward thermal treatment. Models are therefore needed that help to quantify emissions from waste incineration and thus reveal potential risks and mitigation needs. This study presents a probabilistic model which computes emissions as a function of waste composition and technological layout of grate incineration plants and their pollution-control equipment. In contrast to previous waste-incineration models, this tool is based on a broader empirical database and allows uncertainties in emission loads to be quantified. Comparison to monitoring data of 83 actual European plants showed no significant difference between modeled emissions and measured data. An inventory of all European grate incineration plants including technical characteristics and plant capacities was established, and waste material mixtures were determined for different European countries, including generic elemental waste-material compositions. The model thus allows for calculation of country-specific and material-dependent emission factors and enables identification and tracking of emission sources. It thereby helps to develop strategies to decrease plant emissions by reducing or redirecting problematic waste fractions to other treatment options or adapting the technological equipment of waste incinerators.
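
    A hedged sketch of the emission-factor idea (the element contents, transfer coefficient and removal efficiency below are invented placeholders, not values from the study's empirical database): the load of an element reaching the stack is the element input from the waste mix times a plant-specific transfer coefficient, reduced by the air-pollution-control removal efficiency.

    ```python
    waste_mix = {"paper": 0.30, "plastics": 0.15, "organics": 0.40, "other": 0.15}    # kg per kg waste
    cadmium_content = {"paper": 0.2, "plastics": 6.0, "organics": 0.3, "other": 2.0}  # mg Cd per kg fraction

    cd_input = sum(waste_mix[f] * cadmium_content[f] for f in waste_mix)  # mg Cd per kg waste
    transfer_to_raw_gas = 0.9      # share of Cd volatilized into the raw gas (placeholder)
    apc_removal = 0.999            # air-pollution-control removal efficiency (placeholder)

    emission_factor = cd_input * transfer_to_raw_gas * (1 - apc_removal)  # mg Cd per kg waste burned
    print(f"Cd emission factor: {emission_factor:.4f} mg per kg waste")
    ```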

  2. Probabilistic Material Strength Degradation Model for Inconel 718 Components Subjected to High Temperature, Mechanical Fatigue, Creep and Thermal Fatigue Effects

    NASA Technical Reports Server (NTRS)

    Bast, Callie Corinne Scheidt

    1994-01-01

    This thesis presents the on-going development of methodology for a probabilistic material strength degradation model. The probabilistic model, in the form of a postulated randomized multifactor equation, provides for quantification of uncertainty in the lifetime material strength of aerospace propulsion system components subjected to a number of diverse random effects. This model is embodied in the computer program entitled PROMISS, which can include up to eighteen different effects. Presently, the model includes four effects that typically reduce lifetime strength: high temperature, mechanical fatigue, creep, and thermal fatigue. Statistical analysis was conducted on experimental Inconel 718 data obtained from the open literature. This analysis provided regression parameters for use as the model's empirical material constants, thus calibrating the model specifically for Inconel 718. Model calibration was carried out for four variables, namely, high temperature, mechanical fatigue, creep, and thermal fatigue. Methodology to estimate standard deviations of these material constants for input into the probabilistic material strength model was developed. Using the current version of PROMISS, entitled PROMISS93, a sensitivity study for the combined effects of mechanical fatigue, creep, and thermal fatigue was performed. Results, in the form of cumulative distribution functions, illustrated the sensitivity of lifetime strength to any current value of an effect. In addition, verification studies comparing a combination of mechanical fatigue and high temperature effects by model to the combination by experiment were conducted. Thus, for Inconel 718, the basic model assumption of independence between effects was evaluated. Results from this limited verification study strongly supported this assumption.

  3. Probabilistic Structural Analysis Program

    NASA Technical Reports Server (NTRS)

    Pai, Shantaram S.; Chamis, Christos C.; Murthy, Pappu L. N.; Stefko, George L.; Riha, David S.; Thacker, Ben H.; Nagpal, Vinod K.; Mital, Subodh K.

    2010-01-01

NASA/NESSUS 6.2c is a general-purpose, probabilistic analysis program that computes probability of failure and probabilistic sensitivity measures of engineered systems. Because NASA/NESSUS uses highly computationally efficient and accurate analysis techniques, probabilistic solutions can be obtained even for extremely large and complex models. Once the probabilistic response is quantified, the results can be used to support risk-informed decisions regarding reliability for safety-critical and one-of-a-kind systems, as well as for maintaining a level of quality while reducing manufacturing costs for larger-quantity products. NASA/NESSUS has been successfully applied to a diverse range of problems in aerospace, gas turbine engines, biomechanics, pipelines, defense, weaponry, and infrastructure. This program combines state-of-the-art probabilistic algorithms with general-purpose structural analysis and lifing (life-prediction) methods to compute the probabilistic response and reliability of engineered structures. Uncertainties in load, material properties, geometry, boundary conditions, and initial conditions can be simulated. The structural analysis methods include non-linear finite-element methods, heat-transfer analysis, polymer/ceramic matrix composite analysis, monolithic (conventional metallic) materials life-prediction methodologies, boundary element methods, and user-written subroutines. Several probabilistic algorithms are available such as the advanced mean value method and the adaptive importance sampling method. NASA/NESSUS 6.2c is structured in a modular format with 15 elements.

  4. How might Model-based Probabilities Extracted from Imperfect Models Guide Rational Decisions: The Case for non-probabilistic odds

    NASA Astrophysics Data System (ADS)

    Smith, Leonard A.

    2010-05-01

This contribution concerns "deep" or "second-order" uncertainty, such as the uncertainty in our probability forecasts themselves. It asks the question: "Is it rational to take (or offer) bets using model-based probabilities as if they were objective probabilities?" If not, what alternative approaches for determining odds, perhaps non-probabilistic odds, might prove useful in practice, given the fact we know our models are imperfect? We consider the case where the aim is to provide sustainable odds: not to produce a profit but merely to rationally expect to break even in the long run. In other words, to run a quantified risk of ruin that is relatively small. Thus the cooperative insurance schemes of coastal villages provide a more appropriate parallel than a casino. A "better" probability forecast would lead to lower premiums charged and less volatile fluctuations in the cash reserves of the village. Note that the Bayesian paradigm does not constrain one to interpret model distributions as subjective probabilities, unless one believes the model to be empirically adequate for the task at hand. In geophysics, this is rarely the case. When a probability forecast is interpreted as the objective probability of an event, the odds on that event can be easily computed as one divided by the probability of the event, and one need not favour taking either side of the wager. (Here we are using "odds-for" not "odds-to", the difference being whether or not the stake is returned; odds of one to one are equivalent to odds of two for one.) The critical question is how to compute sustainable odds based on information from imperfect models. We suggest that this breaks the symmetry between the odds-on an event and the odds-against it. While a probability distribution can always be translated into odds, interpreting the odds on a set of events might result in "implied-probabilities" that sum to more than one. And/or the set of odds may be incomplete, not covering all events. We ask
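
    A small worked example of the odds conventions in the abstract (the quoted odds below are invented): "odds-for" an event are one divided by its probability (the stake is returned), so odds of two for one correspond to p = 0.5; reading a quoted set of odds backwards gives implied probabilities that need not sum to one when the odds-setter protects against model error.

    ```python
    def odds_for(p):
        """Odds-for: total payout per unit staked, i.e. 1/p."""
        return 1.0 / p

    def implied_probability(odds):
        """Invert quoted odds-for back into an implied probability."""
        return 1.0 / odds

    print(odds_for(0.5))                    # 2.0 -> "two for one", an even-money bet

    # Quoted odds on three mutually exclusive outcomes, shaded for imperfect models:
    quoted = [1.8, 3.2, 4.5]
    implied = [implied_probability(o) for o in quoted]
    print(sum(implied))                     # > 1: the implied probabilities over-cover
    ```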

  5. Probabilistic versus Deterministic Skill in Predicting the Western North Pacific- East Asian Summer Monsoon Variability with Multi-Model Ensembles

    NASA Astrophysics Data System (ADS)

    Yang, D.; Yang, X. Q.; Xie, Q.; Zhang, Y.; Ren, X.; Tang, Y.

    2015-12-01

Based on the historical forecasts of three quasi-operational multi-model ensemble (MME) systems, this study assesses the advantages of the coupled MME over its contributing single-model ensembles (SMEs) and over the uncoupled atmospheric MME in predicting the seasonal variability of the Western North Pacific-East Asian summer monsoon. The seasonal prediction skill of the monsoon is measured by the Brier skill score (BSS) in the sense of probabilistic forecasting as well as by the anomaly correlation (AC) in the sense of deterministic forecasting. The probabilistic forecast skill of the MME is found to be consistently and significantly better than that of each participating SME, while the deterministic forecast skill of the MME can be worse than that of some SMEs. The BSS is composed of reliability and resolution, two attributes characterizing probabilistic forecast skill. The probabilistic skill increase of the MME is dominated by the drastic improvement in reliability, while resolution is not always improved, similar to AC. A monotonic resolution-AC relationship is further found and qualitatively understood, whereas little relationship can be identified between reliability and AC. It is argued that the MME's success in improving the reliability possibly arises from an effective reduction of biases and overconfidence in forecast distributions. The coupled MME is much more skillful than the uncoupled atmospheric MME forced by persisted sea surface temperature (SST) anomalies. This advantage is mainly attributed to its better capability in capturing the evolution of the underlying seasonal SST anomaly.
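
    A sketch of the probabilistic-skill bookkeeping referred to above, using synthetic binary forecasts: the Brier score is decomposed, following Murphy, into reliability - resolution + uncertainty, and the Brier skill score against climatology is then BSS = (resolution - reliability) / uncertainty.

    ```python
    import numpy as np

    def brier_decomposition(p_forecast, obs, n_bins=10):
        """Return (reliability, resolution, uncertainty) for binary observations."""
        p_forecast, obs = np.asarray(p_forecast, float), np.asarray(obs, float)
        clim = obs.mean()
        bins = np.clip((p_forecast * n_bins).astype(int), 0, n_bins - 1)
        reliability = resolution = 0.0
        for k in range(n_bins):
            mask = bins == k
            if mask.any():
                nk, pk, ok = mask.sum(), p_forecast[mask].mean(), obs[mask].mean()
                reliability += nk * (pk - ok) ** 2
                resolution += nk * (ok - clim) ** 2
        n = len(obs)
        return reliability / n, resolution / n, clim * (1 - clim)

    p = [0.1, 0.8, 0.7, 0.2, 0.9, 0.4, 0.6, 0.3]   # toy forecast probabilities
    o = [0,   1,   1,   0,   1,   1,   0,   0]     # toy observed outcomes
    rel, res, unc = brier_decomposition(p, o)
    bss = (res - rel) / unc
    print(f"reliability={rel:.3f} resolution={res:.3f} uncertainty={unc:.3f} BSS={bss:.3f}")
    ```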

  6. Modelling the evolution of female choice strategies under inbreeding conditions.

    PubMed

    Reinhold, Klaus

    2002-11-01

Recently, many mate choice studies have discussed the role of genetic compatibility and inbreeding in the evolution of mate choice. With population genetic simulations I compared the potential advantage of three different female choice strategies under inbreeding conditions. Females were assumed to benefit indirectly via a preference for (i) complementary males, (ii) males with few detrimental mutations, and (iii) non-inbred males. Probably related to the reduced inbreeding depression in offspring of choosy females, the choice-allele increased for all three strategies. However, the advantage of the strategies differed widely. Choice of males with fewer mutations provided a comparatively large advantage, choice of complementary males led to a reasonable advantage, and choice of non-inbred males only resulted in a minor advantage of female choice. My results show that complementary mate choice can be almost as beneficial as conventional good-genes choice of mates with lower genetic load. Compared to the two other mate choice strategies, choice of non-inbred males is less likely to contribute to the evolution of costly mate choice. The results of a recent study showing that female sticklebacks prefer males with a larger number of MHC loci are thus unlikely to be related to an indirect benefit of choosing non-inbred males. PMID:12555777

  7. Probabilistic model for Listeria monocytogenes growth during distribution, retail storage, and domestic storage of pasteurized milk.

    PubMed

    Koutsoumanis, Konstantinos; Pavlis, Athanasios; Nychas, George-John E; Xanthiakos, Konstantinos

    2010-04-01

A survey on the time-temperature conditions of pasteurized milk in Greece during transportation to retail, retail storage, and domestic storage and handling was performed. The data derived from the survey were described with appropriate probability distributions and introduced into a growth model of Listeria monocytogenes in pasteurized milk which was appropriately modified to take strain variability into account. Based on the above components, a probabilistic model was applied to evaluate the growth of L. monocytogenes during the chill chain of pasteurized milk using a Monte Carlo simulation. The model predicted that, in 44.8% of the milk cartons released in the market, the pathogen will grow until the time of consumption. For these products the estimated mean total growth of L. monocytogenes during transportation, retail storage, and domestic storage was 0.93 log CFU, with 95th and 99th percentiles of 2.68 and 4.01 log CFU, respectively. Although, based on EU regulation 2073/2005, pasteurized milk produced in Greece belongs to the category of products that do not allow the growth of L. monocytogenes due to a shelf life (defined by law) of 5 days, the above results show that this shelf life limit cannot prevent L. monocytogenes from growing under the current chill chain conditions. The predicted percentage of milk cartons (initially contaminated with 1 cell per 1-liter carton) in which the pathogen exceeds the safety criterion of 100 cells/ml at the time of consumption was 0.14%. The probabilistic model was used for an importance analysis of the chill chain factors, using rank order correlation, while selected intervention and shelf life increase scenarios were evaluated. The results showed that simple interventions, such as excluding the door shelf from the domestic storage of pasteurized milk, can effectively reduce the growth of the pathogen. The door shelf was found to be the warmest position in domestic refrigerators, and it was most frequently used by the
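
    A Monte Carlo sketch in the spirit of the chill-chain model (the stage distributions and growth kinetics below are placeholders, not the survey data or the validated L. monocytogenes model): sample a storage temperature and duration for each stage, accumulate growth, and inspect the upper percentiles of total growth.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    N = 100_000

    def growth_rate(temp_c, b=0.012, t_min=-1.5):
        """Square-root-type growth model, log10 CFU per hour (placeholder parameters)."""
        return np.where(temp_c > t_min, (b * (temp_c - t_min)) ** 2, 0.0)

    stages = {  # (temperature distribution in deg C, duration distribution in hours)
        "transport": (rng.normal(6.0, 2.0, N), rng.uniform(2, 12, N)),
        "retail":    (rng.normal(5.0, 1.5, N), rng.uniform(12, 72, N)),
        "domestic":  (rng.normal(7.0, 2.5, N), rng.uniform(12, 96, N)),
    }

    total_growth = sum(growth_rate(T) * t for T, t in stages.values())  # log10 CFU
    print("mean growth:", round(float(total_growth.mean()), 2), "log CFU")
    print("95th / 99th percentiles:", np.round(np.percentile(total_growth, [95, 99]), 2))
    ```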

  8. Size Evolution and Stochastic Models: Explaining Ostracod Size through Probabilistic Distributions

    NASA Astrophysics Data System (ADS)

    Krawczyk, M.; Decker, S.; Heim, N. A.; Payne, J.

    2014-12-01

The biovolume of animals has functioned as an important benchmark for measuring evolution throughout geologic time. In our project, we examined the observed average body size of ostracods over time in order to understand the mechanism of size evolution in these marine organisms. The body size of ostracods has varied since the beginning of the Ordovician, when the first true ostracods appeared. We created a stochastic branching model to generate possible evolutionary trees of ostracod size. Using stratigraphic ranges for ostracods compiled from over 750 genera in the Treatise on Invertebrate Paleontology, we calculated overall speciation and extinction rates for our model. At each timestep in our model, new lineages can evolve or existing lineages can become extinct. Newly evolved lineages are assigned sizes based on their parent genera. We parameterized our model to generate neutral and directional changes in ostracod size to compare with the observed data. New sizes were chosen via a normal distribution; the neutral model selected new size differentials centered on zero, allowing an equal chance of larger or smaller ostracods at each speciation, whereas the directional model centered the distribution on a negative value, giving a larger chance of smaller ostracods. Our data strongly suggest that the overall direction of ostracod evolution has followed a model that directionally pushes mean ostracod size down, rather than a neutral model. Our model was able to match the magnitude of the size decrease; however, it produced a constant linear decrease, whereas the actual data show a much more rapid initial decrease followed by a roughly constant size. The nuance of the observed trends ultimately suggests a more complex mechanism of size evolution. In conclusion, probabilistic methods can provide valuable insight into possible evolutionary mechanisms determining size evolution in ostracods.
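
    A compact version of the stochastic branching idea (the rates and size increments are illustrative, not the values fitted from the Treatise data): at every timestep each lineage may speciate or go extinct, descendants draw a size offset from a normal distribution, and shifting that distribution's mean below zero turns the neutral model into the directional one.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    def simulate(n_steps=200, p_speciate=0.06, p_extinct=0.05, drift=0.0, sd=0.05):
        sizes = [0.0]                                      # log biovolume of founding lineage
        means = []
        for _ in range(n_steps):
            new = []
            for s in sizes:
                if rng.random() < p_extinct:
                    continue                               # lineage goes extinct
                new.append(s)
                if rng.random() < p_speciate:
                    new.append(s + rng.normal(drift, sd))  # daughter lineage with size offset
            sizes = new if new else [0.0]                  # re-seed to keep the clade alive
            means.append(np.mean(sizes))
        return means

    neutral = simulate(drift=0.0)
    directional = simulate(drift=-0.01)
    print("final mean size, neutral vs directional:",
          round(neutral[-1], 3), round(directional[-1], 3))
    ```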

  9. Sensitivity analysis of a two-dimensional probabilistic risk assessment model using analysis of variance.

    PubMed

    Mokhtari, Amirhossein; Frey, H Christopher

    2005-12-01

    This article demonstrates application of sensitivity analysis to risk assessment models with two-dimensional probabilistic frameworks that distinguish between variability and uncertainty. A microbial food safety process risk (MFSPR) model is used as a test bed. The process of identifying key controllable inputs and key sources of uncertainty using sensitivity analysis is challenged by typical characteristics of MFSPR models such as nonlinearity, thresholds, interactions, and categorical inputs. Among many available sensitivity analysis methods, analysis of variance (ANOVA) is evaluated in comparison to commonly used methods based on correlation coefficients. In a two-dimensional risk model, the identification of key controllable inputs that can be prioritized with respect to risk management is confounded by uncertainty. However, as shown here, ANOVA provided robust insights regarding controllable inputs most likely to lead to effective risk reduction despite uncertainty. ANOVA appropriately selected the top six important inputs, while correlation-based methods provided misleading insights. Bootstrap simulation is used to quantify uncertainty in ranks of inputs due to sampling error. For the selected sample size, differences in F values of 60% or more were associated with clear differences in rank order between inputs. Sensitivity analysis results identified inputs related to the storage of ground beef servings at home as the most important. Risk management recommendations are suggested in the form of a consumer advisory for better handling and storage practices.
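
    A minimal illustration of the ANOVA-based ranking and the bootstrap check on rank stability (the synthetic model below is not the MFSPR model; it simply includes a linear, a threshold, a weak, and an irrelevant input): bin each input, compute a one-way F statistic for the output across bins, rank inputs by F, and bootstrap the ranks to see how stable they are under sampling error.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(9)
    n = 2000
    X = rng.random((n, 4))                                   # four candidate inputs
    y = 3 * X[:, 0] + (X[:, 1] > 0.7) * 2 + 0.2 * X[:, 2] + rng.normal(0, 0.5, n)

    def f_values(X, y, n_bins=5):
        """One-way ANOVA F statistic for y across quantile bins of each input."""
        out = []
        edges_grid = np.linspace(0, 1, n_bins + 1)[1:-1]
        for j in range(X.shape[1]):
            bins = np.digitize(X[:, j], np.quantile(X[:, j], edges_grid))
            groups = [y[bins == b] for b in np.unique(bins)]
            out.append(stats.f_oneway(*groups).statistic)
        return np.array(out)

    ranks = []
    for _ in range(200):                                     # bootstrap rank uncertainty
        idx = rng.integers(0, n, n)
        ranks.append(np.argsort(np.argsort(-f_values(X[idx], y[idx]))))
    print("F values:", np.round(f_values(X, y), 1))
    print("mean bootstrap rank per input (0 = most important):",
          np.round(np.mean(ranks, axis=0), 2))
    ```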

  10. A probabilistic respiratory tract dosimetry model with application to beta-particle and photon emitters

    NASA Astrophysics Data System (ADS)

    Farfan, Eduardo Balderrama

    2002-01-01

    Predicting equivalent dose in the human respiratory tract is significant in the assessment of health risks associated with the inhalation of radioactive aerosols. A complete respiratory tract methodology based on the International Commission on Radiological Protection Publication 66 model was used in this research project for beta-particle and photon emitters. The conventional methodology has been to use standard values (from Reference Man) for parameters to obtain a single dose value. However, the methods used in the current study allow lung dose values to be determined as probability distributions to reflect the spread or variability in doses. To implement the methodology, a computer code, LUDUC, has been modified to include inhalation scenarios of beta-particle and photon emitters. For beta particles, a new methodology was implemented into Monte Carlo simulations to determine absorbed fractions in target tissues within the thoracic region of the respiratory tract. For photons, a new mathematical phantom of extrathoracic and thoracic regions was created based on previous studies to determine specific absorbed fractions in several tissues and organs of the human body due to inhalation of radioactive materials. The application of the methodology and developed data will be helpful in dose reconstruction and prediction efforts concerning the inhalation of short-lived radionuclides or radionuclides of Inhalation Class S. The resulting dose distributions follow a lognormal distribution shape for all scenarios examined. Applying the probabilistic computer code LUDUC to inhalation of strontium and yttrium aerosols has shown several trends, which could also be valid for many S radionuclide compounds that are beta-particle emitters. The equivalent doses are, in general, found to follow lognormal distributions. Therefore, these distributions can be described by geometric means and geometric standard deviations. Furthermore, a mathematical phantom of the extrathoracic and

  11. Building Time-Dependent Earthquake Recurrence Models for Probabilistic Loss Computations

    NASA Astrophysics Data System (ADS)

    Fitzenz, D. D.; Nyst, M.

    2013-12-01

We present a Risk Management perspective on earthquake recurrence on mature faults, and the ways that it can be modeled. The specificities of Risk Management relative to Probabilistic Seismic Hazard Assessment (PSHA) include the non-linearity of the exceedance probability curve for losses relative to the frequency of event occurrence, the fact that losses at all return periods are needed (and not at discrete values of the return period), and the set-up of financial models which sometimes require the modeling of realizations of the order in which events may occur (i.e., simulated event dates are important, whereas only average rates of occurrence are routinely used in PSHA). We use New Zealand as a case study and review the physical characteristics of several faulting environments, contrasting them against properties of three probability density functions (PDFs) widely used to characterize the inter-event time distributions in time-dependent recurrence models. We review the data available to help constrain both the priors and the recurrence process. We propose that, with the current level of knowledge, the best way to quantify the recurrence of large events on mature faults is to use a Bayesian combination of models, i.e., the decomposition of the inter-event time distribution into a linear combination of individual PDFs with their weights given by the posterior distribution. Finally, we propose to the community: 1. a general debate on how best to incorporate our knowledge (e.g., from geology and geomorphology) of plausible models and model parameters while also preserving the information on what we do not know; and 2. the creation and maintenance of a global database of priors, data, and model evidence, classified by tectonic region, special fluid characteristic (pH, compressibility, pressure), fault geometry, and other relevant properties, so that we can monitor whether some trends emerge in terms of which model dominates in which conditions.
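
    A sketch of the proposed Bayesian combination (the component distributions, their parameters, and the posterior weights below are invented for illustration): the inter-event time density is a linear combination of candidate PDFs, each weighted by its posterior model probability, and the mixture can be sampled to generate simulated event dates of the kind financial models require.

    ```python
    import numpy as np
    from scipy import stats

    mean_ri = 300.0                                    # mean recurrence interval, years
    components = [
        stats.expon(scale=mean_ri),                    # time-independent (Poisson) model
        stats.lognorm(s=0.5, scale=mean_ri * np.exp(-0.5**2 / 2)),   # lognormal, same mean
        stats.weibull_min(c=2.0, scale=mean_ri / 0.886),             # Weibull, same mean
    ]
    weights = np.array([0.2, 0.5, 0.3])                # posterior model probabilities (illustrative)

    def mixture_pdf(t):
        return sum(w * c.pdf(t) for w, c in zip(weights, components))

    def sample_intervals(n, rng=np.random.default_rng(7)):
        which = rng.choice(len(components), size=n, p=weights)
        return np.array([components[k].rvs(random_state=rng) for k in which])

    t = np.array([100.0, 300.0, 600.0])
    print("mixture density at", t, "->", np.round(mixture_pdf(t), 5))
    print("ten simulated inter-event times:", np.round(sample_intervals(10), 1))
    ```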

  12. Spike Sorting by Joint Probabilistic Modeling of Neural Spike Trains and Waveforms

    PubMed Central

    Matthews, Brett A.; Clements, Mark A.

    2014-01-01

    This paper details a novel probabilistic method for automatic neural spike sorting which uses stochastic point process models of neural spike trains and parameterized action potential waveforms. A novel likelihood model for observed firing times as the aggregation of hidden neural spike trains is derived, as well as an iterative procedure for clustering the data and finding the parameters that maximize the likelihood. The method is executed and evaluated on both a fully labeled semiartificial dataset and a partially labeled real dataset of extracellular electric traces from rat hippocampus. In conditions of relatively high difficulty (i.e., with additive noise and with similar action potential waveform shapes for distinct neurons) the method achieves significant improvements in clustering performance over a baseline waveform-only Gaussian mixture model (GMM) clustering on the semiartificial set (1.98% reduction in error rate) and outperforms both the GMM and a state-of-the-art method on the real dataset (5.04% reduction in false positive + false negative errors). Finally, an empirical study of two free parameters for our method is performed on the semiartificial dataset. PMID:24829568

  13. LADTAP-PROB: A PROBABILISTIC MODEL TO ASSESS RADIOLOGICAL CONSEQUENCES FROM LIQUID RADIOACTIVE RELEASES

    SciTech Connect

    Farfan, E; Trevor Foley, T; Tim Jannik, T

    2009-01-26

The potential radiological consequences to humans resulting from aqueous releases at the Savannah River Site (SRS) have usually been assessed using the computer code LADTAP or deterministic variations of this code. The advancement of LADTAP over the years has included LADTAP II (a computer program that still resides on the mainframe at SRS) [1], LADTAP XL© (a Microsoft Excel® spreadsheet) [2], and other versions specific to SRS areas [3]. The spreadsheet variations of LADTAP contain two worksheets: LADTAP and IRRIDOSE. The LADTAP worksheet estimates dose for environmental pathways including ingestion of water and fish and external exposure resulting from recreational activities. IRRIDOSE estimates potential dose to individuals from irrigation of food crops with contaminated water. A new version of this deterministic methodology, LADTAP-PROB, was developed at Savannah River National Laboratory (SRNL) to (1) consider the complete range of model parameter values (not just maximum or mean values), (2) determine the influence of parameter uncertainties within the LADTAP methodology by performing a sensitivity analysis of all model parameters (to identify the input parameters to which model results are most sensitive), and (3) probabilistically assess radiological consequences from contaminated water. This study presents the methodology applied in LADTAP-PROB.

  14. Probabilistic ecological risk assessment of effluent toxicity of a wastewater reclamation plant based on process modeling.

    PubMed

    Zeng, Siyu; Huang, Yunqing; Sun, Fu; Li, Dan; He, Miao

    2016-09-01

    The growing use of reclaimed wastewater for environmental purposes such as stream flow augmentation requires comprehensive ecological risk assessment and management. This study applied a system analysis approach, regarding a wastewater reclamation plant (WRP) and its recipient water body as a whole system, and assessed the ecological risk of the recipient water body caused by the WRP effluent. Instead of specific contaminants, two toxicity indicators, i.e. genotoxicity and estrogenicity, were selected to directly measure the biological effects of all bio-available contaminants in the reclaimed wastewater, as well as characterize the ecological risk of the recipient water. A series of physically based models were developed to simulate the toxicity indicators in a WRP through a typical reclamation process, including ultrafiltration, ozonation, and chlorination. After being validated against the field monitoring data from a full-scale WRP in Beijing, the models were applied to simulate the probability distribution of effluent toxicity of the WRP through Latin Hypercube Sampling to account for the variability of influent toxicity and operation conditions. The simulated effluent toxicity was then used to derive the predicted environmental concentration (PEC) in the recipient stream, considering the variations of the toxicity and flow of the upstream inflow as well. The ratio of the PEC of each toxicity indicator to its corresponding predicted no-effect concentration was finally used for the probabilistic ecological risk assessment. Regional sensitivity analysis was also performed with the developed models to identify the critical control variables and strategies for ecological risk management. PMID:27219046
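
    A simplified sketch of the risk-quotient calculation (the toxicity distributions, flows and PNEC value below are placeholders, not the monitored WRP data): a small hand-rolled Latin Hypercube sampler draws effluent toxicity and flows jointly, the PEC follows from mass balance in the receiving stream, and the ecological risk is the probability that PEC/PNEC exceeds one.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)

    def latin_hypercube(n, dims):
        """n stratified uniform samples per dimension, independently permuted."""
        u = (rng.random((n, dims)) + np.arange(n)[:, None]) / n
        for j in range(dims):
            u[:, j] = rng.permutation(u[:, j])
        return u

    n = 10_000
    u = latin_hypercube(n, 3)
    effluent_tox  = stats.lognorm(s=0.6, scale=2.0).ppf(u[:, 0])   # toxic-unit equivalents
    effluent_flow = stats.uniform(0.8, 0.4).ppf(u[:, 1])           # m3/s, range 0.8-1.2
    upstream_flow = stats.lognorm(s=0.4, scale=4.0).ppf(u[:, 2])   # m3/s

    pec = effluent_tox * effluent_flow / (effluent_flow + upstream_flow)  # mass-balance dilution
    pnec = 0.5                                                            # placeholder PNEC
    risk_quotient = pec / pnec
    print("P(risk quotient > 1):", round(float(np.mean(risk_quotient > 1.0)), 4))
    ```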

  15. Optimizing Negotiation Conflict in the Cloud Service Negotiation Framework Using Probabilistic Decision Making Model.

    PubMed

    Rajavel, Rajkumar; Thangarathinam, Mala

    2015-01-01

Optimization of negotiation conflict in the cloud service negotiation framework is identified as one of the major challenging issues. This negotiation conflict occurs during the bilateral negotiation process between the participants due to misperception, aggressive behavior, and uncertain preferences and goals about their opponents. Existing research work focuses on the prerequest context of negotiation conflict optimization by grouping similar negotiation pairs using distance, binary, context-dependent, and fuzzy similarity approaches. To some extent, these approaches can maximize the success rate and minimize the communication overhead among the participants. To further optimize the success rate and communication overhead, the proposed research work introduces a novel probabilistic decision making model for optimizing negotiation conflict in the long-term negotiation context. This decision model formulates the problem of managing the different types of negotiation conflict that occur during the negotiation process as a multistage Markov decision problem. At each stage of the negotiation process, the proposed decision model generates a heuristic decision based on the past negotiation state information without causing any break-off among the participants. In addition, this heuristic decision using the stochastic decision tree scenario can maximize the revenue among the participants available in the cloud service negotiation framework.

  18. Modeling and Quantification of Team Performance in Human Reliability Analysis for Probabilistic Risk Assessment

    SciTech Connect

Jeffrey C. Joe; Ronald L. Boring

    2014-06-01

Probabilistic Risk Assessment (PRA) and Human Reliability Assessment (HRA) are important technical contributors to the United States (U.S.) Nuclear Regulatory Commission's (NRC) risk-informed and performance-based approach to regulating U.S. commercial nuclear activities. Furthermore, all currently operating commercial nuclear power plants (NPPs) in the U.S. are required by federal regulation to be staffed with crews of operators. Yet, aspects of team performance are underspecified in most HRA methods that are widely used in the nuclear industry. There are a variety of "emergent" team cognition and teamwork errors (e.g., communication errors) that are 1) distinct from individual human errors, and 2) important to understand from a PRA perspective. The lack of robust models or quantification of team performance is an issue that affects the accuracy and validity of HRA methods and models, leading to significant uncertainty in estimating human error probabilities (HEPs). This paper describes research whose objective is to model and quantify team dynamics and teamwork within NPP control room crews for risk-informed applications, thereby improving the technical basis of HRA and, in turn, the risk-informed approach the NRC uses to regulate the U.S. commercial nuclear industry.

  20. Development of a probabilistic timing model for the ingestion of tap water.

    SciTech Connect

    Davis, M. J.; Janke, R.; Environmental Science Division; EPA

    2009-01-01

    A contamination event in a water distribution system can result in adverse health impacts to individuals consuming contaminated water from the system. Assessing impacts to such consumers requires accounting for the timing of exposures of individuals to tap-water contaminants that have time-varying concentrations. Here we present a probabilistic model for the timing of ingestion of tap water that we developed for use in the U.S. Environmental Protection Agency's Threat Ensemble Vulnerability Assessment and Sensor Placement Tool, which is designed to perform consequence assessments for contamination events in water distribution systems. We also present a statistical analysis of the timing of ingestion activity using data collected by the American Time Use Survey. The results of the analysis provide the basis for our model, which accounts for individual variability in ingestion timing and provides a series of potential ingestion times for tap water. It can be combined with a model for ingestion volume to perform exposure assessments and applied in cases for which the use of characteristics typical of the United States is appropriate.

  1. The probabilistic niche model reveals substantial variation in the niche structure of empirical food webs.

    PubMed

    Williams, Richard J; Purves, Drew W

    2011-09-01

The structure of food webs, complex networks of interspecies feeding interactions, plays a crucial role in ecosystem resilience and function, and understanding food web structure remains a central problem in ecology. Previous studies have shown that key features of empirical food webs can be reproduced by low-dimensional "niche" models. Here we examine the form and variability of food web niche structure by fitting a probabilistic niche model to 37 empirical food webs, a much larger number of food webs than used in previous studies. The model relaxes previous assumptions about parameter distributions and hierarchy and returns parameter estimates for each species in each web. The model significantly outperforms previous niche model variants and also performs well for several webs where a body-size-based niche model performs poorly, implying that traits other than body size are important in structuring these webs' niche space. Parameter estimates frequently violate previous models' assumptions: in 19 of 37 webs, parameter values are not significantly hierarchical; 32 of 37 webs have nonuniform niche value distributions; and 15 of 37 webs lack a correlation between niche width and niche position. Extending the model to a two-dimensional niche space yields networks with a mixture of one- and two-dimensional niches and provides a significantly better fit for webs with a large number of species and links. These results confirm that food webs are strongly niche-structured but reveal substantial variation in the form of the niche structuring, a result with fundamental implications for ecosystem resilience and function.
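
    A sketch of the link-probability idea behind a probabilistic niche model (the Gaussian kernel below is the commonly cited form; the paper's exact parameterization may differ, and all parameter values are invented): consumer i eats resource j with a probability that decays with the distance between the resource's niche value and the centre of the consumer's feeding range.

    ```python
    import numpy as np

    rng = np.random.default_rng(11)

    def link_probability(n_resource, centre, width, alpha=0.95):
        """Gaussian feeding kernel: high probability near the centre of the range."""
        return alpha * np.exp(-((n_resource - centre) / (width / 2.0)) ** 2)

    S = 8
    niche  = rng.random(S)            # each species' niche value
    centre = rng.random(S)            # feeding-range centre, no strict hierarchy imposed
    width  = 0.1 + 0.2 * rng.random(S)

    P = np.array([[link_probability(niche[j], centre[i], width[i]) for j in range(S)]
                  for i in range(S)])             # P[i, j] = P(species i eats species j)
    web = rng.random((S, S)) < P                  # one stochastic realization of the food web
    print("expected links:", round(float(P.sum()), 1), "| realized links:", int(web.sum()))
    ```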

  2. Human nonindependent mate choice: is model female attractiveness everything?

    PubMed

    Vakirtzis, Antonios; Roberts, S Craig

    2012-05-06

    Following two decades of research on non-human animals, there has recently been increased interest in human nonindependent mate choice, namely the ways in which choosing women incorporate information about a man's past or present romantic partners ('model females') into their own assessment of the male. Experimental studies using static facial images have generally found that men receive higher desirability ratings from female raters when presented with attractive (compared to unattractive) model females. This phenomenon has a straightforward evolutionary explanation: the fact that female mate value is more dependent on physical attractiveness compared to male mate value. Furthermore, due to assortative mating for attractiveness, men who are paired with attractive women are more likely to be of high mate value themselves. Here, we also examine the possible relevance of model female cues other than attractiveness (personality and behavioral traits) by presenting video recordings of model females to a set of female raters. The results confirm that the model female's attractiveness is the primary cue. Contrary to some earlier findings in the human and nonhuman literature, we found no evidence that female raters prefer partners of slightly older model females. We conclude by suggesting some promising variations on the present experimental design.

  3. Probabilistic exposure assessment model to estimate aseptic-UHT product failure rate.

    PubMed

    Pujol, Laure; Albert, Isabelle; Magras, Catherine; Johnson, Nicholas Brian; Membré, Jeanne-Marie

    2015-01-01

Aseptic-Ultra-High-Temperature (UHT) products are manufactured to be free of microorganisms capable of growing in the food at normal non-refrigerated conditions at which the food is likely to be held during manufacture, distribution and storage. Two important phases within the process are widely recognised as critical in controlling microbial contamination: the sterilisation steps and the following aseptic steps. Of the microbial hazards, the pathogen spore formers Clostridium botulinum and Bacillus cereus are deemed the most pertinent to be controlled. In addition, due to a relatively high thermal resistance, Geobacillus stearothermophilus spores are considered a concern for spoilage of low acid aseptic-UHT products. A probabilistic exposure assessment model has been developed in order to assess the aseptic-UHT product failure rate associated with these three bacteria. It was a Modular Process Risk Model, based on nine modules. They described: i) the microbial contamination introduced by the raw materials, either from the product (i.e. milk, cocoa and dextrose powders and water) or the packaging (i.e. bottle and sealing component), ii) the sterilisation processes, of either the product or the packaging material, iii) the possible recontamination during subsequent processing of both product and packaging. The Sterility Failure Rate (SFR) was defined as the sum of bottles contaminated for each batch, divided by the total number of bottles produced per process line run (10^6 batches simulated per process line). The SFR associated with the three bacteria was estimated at the last step of the process (i.e. after Module 9) but also after each module, allowing for the identification of modules, and responsible contamination pathways, with higher or lower intermediate SFR. The model contained 42 controlled settings associated with factory environment, process line or product formulation, and more than 55 probabilistic inputs corresponding to inputs with variability
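
    A stylized sketch of the sterility-failure bookkeeping (module parameters below are invented; the published model has nine modules and many more inputs): each bottle carries a Poisson number of surviving spores after sterilisation plus a small recontamination probability, and the SFR is the number of contaminated bottles divided by the bottles produced per process-line run.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)

    bottles_per_run     = 1_000_000
    raw_load_per_bottle = 200.0      # spores per bottle before treatment (placeholder)
    log_reduction       = 9.0        # sterilisation performance (placeholder)
    p_recontamination   = 2e-7       # per-bottle aseptic-step failure (placeholder)

    mean_survivors = raw_load_per_bottle * 10.0 ** (-log_reduction)
    survivors    = rng.poisson(mean_survivors, bottles_per_run)
    recontam     = rng.random(bottles_per_run) < p_recontamination
    contaminated = (survivors > 0) | recontam

    sfr = contaminated.sum() / bottles_per_run
    print(f"sterility failure rate: {sfr:.2e} per bottle "
          f"({int(contaminated.sum())} contaminated bottles per run)")
    ```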

  4. Probabilistic terrain models from waveform airborne LiDAR: AutoProbaDTM project results

    NASA Astrophysics Data System (ADS)

    Jalobeanu, A.; Goncalves, G. R.

    2012-12-01

    The main objective of the AutoProbaDTM project was to develop new methods for automated probabilistic topographic map production using the latest LiDAR scanners. It included algorithmic development, implementation and validation over a 200 km² test area in continental Portugal, representing roughly 100 GB of raw data and half a billion waveforms. We aimed to generate digital terrain models automatically, including ground topography as well as uncertainty maps, using Bayesian inference for model estimation and error propagation, and approaches based on image processing. Here we present the results of the completed project (methodological developments and processing results from the test dataset). In June 2011, the test data were acquired in central Portugal, over an area of geomorphological and ecological interest, using a Riegl LMS-Q680i sensor. We managed to survey 70% of the test area at a satisfactory sampling rate, the angular spacing matching the laser beam divergence and the ground spacing nearly equal to the footprint (almost 4 pts/m² for a 50 cm footprint at 1500 m AGL). This is crucial for correct processing, as aliasing artifacts are significantly reduced. Reverse engineering was required because the data were delivered in a proprietary binary format; this enabled us to read the waveforms and the essential parameters. A robust waveform processing method has been implemented and tested, and georeferencing and geometric computations have been coded. Fast gridding and interpolation techniques have been developed. Validation is nearly completed, as well as geometric calibration, IMU error correction, full error propagation and large-scale DEM reconstruction. A probabilistic processing software package has been implemented and code optimization is in progress. This package includes new boresight calibration procedures, robust peak extraction modules, DEM gridding and interpolation methods, and means to visualize the produced uncertain surfaces (topography

  6. Rapid probabilistic source characterisation in 3D earth models using learning algorithms

    NASA Astrophysics Data System (ADS)

    Valentine, A. P.; Kaeufl, P.; Trampert, J.

    2015-12-01

    Characterising earthquake sources rapidly and robustly is an essential component of any earthquake early warning (EEW) procedure. Ideally, this characterisation should: (i) be probabilistic, enabling appreciation of the full range of mechanisms compatible with available data and taking observational and theoretical uncertainties into account; and (ii) operate in a physically complete theoretical framework. However, implementing either of these ideals increases computational costs significantly, making it infeasible to satisfy both in the short timescales necessary for EEW applications. The barrier here arises from the fact that conventional probabilistic inversion techniques involve running many thousands of forward simulations after data have been obtained, a procedure known as 'posterior sampling'. Thus, for EEW, all computational costs must be incurred after the event time. Here, we demonstrate a new approach, based instead on 'prior sampling', which circumvents this problem and is feasible for EEW applications. All forward simulations are conducted in advance, and a learning algorithm is used to assimilate information about the relationship between model and data. Once observations from an earthquake become available, this information can be used to infer probability density functions (pdfs) for seismic source parameters within milliseconds. We demonstrate this procedure using data from the 2008 Mw 5.4 Chino Hills earthquake. We compute Green's functions for 150 randomly chosen locations on the Whittier and Chino faults, using SPECFEM3D and a 3D model of the regional velocity structure. We then use these to train neural networks that map from seismic waveforms to pdfs on a point-source, moment-tensor representation of the event mechanism. We show that using local network data from the Chino Hills event, this system provides accurate information on magnitude, epicentral location and source half-duration using data available 6 seconds after the first station
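
    The prior-sampling idea can be illustrated with a toy stand-in: the forward model, source parameters and regressor below are hypothetical (a k-nearest-neighbour regressor replaces the neural networks, and a two-number 'waveform' replaces simulated seismograms), but the offline/online split is the same.

      import numpy as np
      from sklearn.neighbors import KNeighborsRegressor

      rng = np.random.default_rng(1)

      # Offline phase (before any event): sample sources from the prior and
      # run a cheap toy forward model in place of full 3D simulations.
      n_prior = 20_000
      magnitude = rng.uniform(3.0, 7.0, n_prior)
      depth_km = rng.uniform(2.0, 20.0, n_prior)
      features = np.column_stack([10 ** (0.5 * magnitude) / depth_km,   # amplitude proxy
                                  magnitude + 0.1 * depth_km])          # duration proxy
      features *= rng.lognormal(0.0, 0.05, features.shape)              # observational noise

      learner = KNeighborsRegressor(n_neighbors=50)
      learner.fit(features, np.column_stack([magnitude, depth_km]))

      # Online phase (after the event): no simulations, only a fast look-up.
      observed = np.array([[10 ** (0.5 * 5.4) / 12.0, 5.4 + 1.2]])
      mag_hat, depth_hat = learner.predict(observed)[0]
      print(f"Rapid estimate: M {mag_hat:.1f}, depth {depth_hat:.1f} km")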

  7. Alterations in choice behavior by manipulations of world model

    PubMed Central

    Green, C. S.; Benson, C.; Kersten, D.; Schrater, P.

    2010-01-01

    How to compute initially unknown reward values makes up one of the key problems in reinforcement learning theory, with two basic approaches being used. Model-free algorithms rely on the accumulation of substantial amounts of experience to compute the value of actions, whereas in model-based learning, the agent seeks to learn the generative process for outcomes from which the value of actions can be predicted. Here we show that (i) “probability matching”—a consistent example of suboptimal choice behavior seen in humans—occurs in an optimal Bayesian model-based learner using a max decision rule that is initialized with ecologically plausible, but incorrect beliefs about the generative process for outcomes and (ii) human behavior can be strongly and predictably altered by the presence of cues suggestive of various generative processes, despite statistically identical outcome generation. These results suggest human decision making is rational and model based and not consistent with model-free learning. PMID:20805507
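
    The contrast between a max decision rule and probability matching can be made concrete with a small simulation; the Beta-Bernoulli learner and reward probabilities below are illustrative assumptions, and the sketch does not reproduce the paper's model of incorrect generative beliefs.

      import numpy as np

      rng = np.random.default_rng(2)
      p_reward = np.array([0.7, 0.3])   # true (static) Bernoulli reward probabilities
      n_trials = 1_000

      def run(rule):
          alpha = np.ones(2)   # Beta prior successes per option
          beta = np.ones(2)    # Beta prior failures per option
          choices = np.zeros(n_trials, dtype=int)
          for t in range(n_trials):
              p_hat = alpha / (alpha + beta)           # posterior mean reward probability
              if rule == "max":                        # maximize expected reward
                  c = int(np.argmax(p_hat))
              else:                                    # probability matching
                  c = int(rng.random() < p_hat[1] / p_hat.sum())
              r = rng.random() < p_reward[c]
              alpha[c] += r
              beta[c] += 1 - r
              choices[t] = c
          return choices

      for rule in ("max", "match"):
          frac = (run(rule) == 0).mean()
          print(f"{rule:>5}: chose the 0.7 option on {frac:.0%} of trials")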

  8. Plant Phenotyping using Probabilistic Topic Models: Uncovering the Hyperspectral Language of Plants

    PubMed Central

    Wahabzada, Mirwaes; Mahlein, Anne-Katrin; Bauckhage, Christian; Steiner, Ulrike; Oerke, Erich-Christian; Kersting, Kristian

    2016-01-01

    Modern phenotyping and plant disease detection methods, based on optical sensors and information technology, provide promising approaches to plant research and precision farming. In particular, hyperspectral imaging has been found to reveal physiological and structural characteristics in plants and to allow for tracking physiological dynamics due to environmental effects. In this work, we present an approach to plant phenotyping that integrates non-invasive sensors, computer vision, as well as data mining techniques and allows for monitoring how plants respond to stress. To uncover latent hyperspectral characteristics of diseased plants reliably and in an easy-to-understand way, we “wordify” the hyperspectral images, i.e., we turn the images into a corpus of text documents. Then, we apply probabilistic topic models, a well-established natural language processing technique that identifies content and topics of documents. Based on recent regularized topic models, we demonstrate that one can automatically track the development of three foliar diseases of barley. We also present a visualization of the topics that provides plant scientists with an intuitive tool for hyperspectral imaging. In short, our analysis and visualization of characteristic topics found during symptom development and disease progress reveal the hyperspectral language of plant diseases. PMID:26957018
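
    A minimal sketch of the wordify-then-topic-model idea, using plain latent Dirichlet allocation rather than the regularized topic models of the paper; the spectral tokens below are synthetic stand-ins for binned reflectance values.

      import numpy as np
      from sklearn.feature_extraction.text import CountVectorizer
      from sklearn.decomposition import LatentDirichletAllocation

      # Each "document" is a wordified pixel/plant: reflectance binned into
      # discrete wavelength-intensity tokens (synthetic examples below).
      docs = ["w450_low w550_high w680_low w750_high",     # healthy-looking spectrum
              "w450_low w550_high w680_low w750_high",
              "w450_mid w550_low w680_high w750_low",      # stressed-looking spectrum
              "w450_mid w550_low w680_high w750_low"]

      counts = CountVectorizer().fit_transform(docs)
      lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(counts)

      # Per-document topic proportions track (here, synthetic) symptom development.
      print(np.round(lda.transform(counts), 2))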

  9. Probabilistic modelling of chromatin code landscape reveals functional diversity of enhancer-like chromatin states

    PubMed Central

    Zhou, Jian; Troyanskaya, Olga G.

    2016-01-01

    Interpreting the functional state of chromatin from the combinatorial binding patterns of chromatin factors, that is, the chromatin codes, is crucial for decoding the epigenetic state of the cell. Here we present a systematic map of Drosophila chromatin states derived from data-driven probabilistic modelling of dependencies between chromatin factors. Our model not only recapitulates enhancer-like chromatin states as indicated by widely used enhancer marks but also divides these states into three functionally distinct groups, of which only one specific group possesses active enhancer activity. Moreover, we discover a strong association between one specific enhancer state and RNA Polymerase II pausing, linking transcription regulatory potential and chromatin organization. We also observe that with the exception of long-intron genes, chromatin state transition positions in transcriptionally active genes align with an absolute distance to their corresponding transcription start site, regardless of gene length. Using our method, we provide a resource that helps elucidate the functional and spatial organization of the chromatin code landscape. PMID:26841971

  10. Automated reconstruction of ancient languages using probabilistic models of sound change

    PubMed Central

    Bouchard-Côté, Alexandre; Hall, David; Griffiths, Thomas L.; Klein, Dan

    2013-01-01

    One of the oldest problems in linguistics is reconstructing the words that appeared in the protolanguages from which modern languages evolved. Identifying the forms of these ancient languages makes it possible to evaluate proposals about the nature of language change and to draw inferences about human history. Protolanguages are typically reconstructed using a painstaking manual process known as the comparative method. We present a family of probabilistic models of sound change as well as algorithms for performing inference in these models. The resulting system automatically and accurately reconstructs protolanguages from modern languages. We apply this system to 637 Austronesian languages, providing an accurate, large-scale automatic reconstruction of a set of protolanguages. Over 85% of the system’s reconstructions are within one character of the manual reconstruction provided by a linguist specializing in Austronesian languages. Being able to automatically reconstruct large numbers of languages provides a useful way to quantitatively explore hypotheses about the factors determining which sounds in a language are likely to change over time. We demonstrate this by showing that the reconstructed Austronesian protolanguages provide compelling support for a hypothesis about the relationship between the function of a sound and its probability of changing that was first proposed in 1955. PMID:23401532

  11. Marginal Probabilistic Modeling of the Delays in the Sensory Data Transmission of Networked Telerobots

    PubMed Central

    Gago-Benítez, Ana; Fernández-Madrigal, Juan-Antonio; Cruz-Martín, Ana

    2014-01-01

    Networked telerobots are remotely controlled through general purpose networks and components, which are highly heterogeneous and exhibit stochastic response times; however, their correct teleoperation requires a timely flow of information from sensors to remote stations. In order to guarantee these time requirements, a good on-line probabilistic estimation of the sensory transmission delays is needed. In many modern applications this estimation must be computationally highly efficient, e.g., when the system includes a web-based client interface. This paper studies marginal probability distributions that, under mild assumptions, can be a good approximation of the real distribution of the delays without using knowledge of their dynamics, are efficient to compute, and need only minor modifications to the networked robot. Since sequences of delays exhibit strong non-linearities in these networked applications, we apply a change detection method to satisfy the i.i.d. hypothesis required by the marginal approach. The results reported here indicate that some parametric models explain many more real scenarios well when this change detection method is used, while some non-parametric distributions have a very good rate of successful modeling when non-linearity detection is not possible, provided that the total delay is split into its three basic terms: server, network and client times. PMID:24481232

  12. Near-Field Probabilistic Seismic Hazard Analysis of Metropolitan Tehran Using Region-Specific Directivity Models

    NASA Astrophysics Data System (ADS)

    Yazdani, Azad; Nicknam, Ahmad; Dadras, Ehsan Yousefi; Eftekhari, Seyed Nasrollah

    2016-09-01

    Ground motions are affected by directivity effects in near-fault regions, which result in low-frequency cycle pulses at the beginning of the velocity time history. The directivity features of near-fault ground motions can lead to a significant increase in the risk of earthquake-induced damage to engineering structures. Ordinary probabilistic seismic hazard analysis (PSHA) does not take such effects into account; recent studies have thus proposed new frameworks to incorporate directivity effects in PSHA. The objective of this study is to develop seismic hazard maps of Tehran City according to the near-fault PSHA procedure for different return periods. To this end, the directivity models required in the modified PSHA were developed based on a database of simulated ground motions. The simulated database was used in this study because there are no recorded near-fault data in the region from which to derive purely empirically based pulse prediction models. The results show that the directivity effects can significantly affect the estimate of regional seismic hazard.

  13. Unsupervised learning of probabilistic object models (POMs) for object classification, segmentation, and recognition using knowledge propagation.

    PubMed

    Chen, Yuanhao; Zhu, Long Leo; Yuille, Alan; Zhang, Hongjiang

    2009-10-01

    We present a method to learn probabilistic object models (POMs) with minimal supervision, which exploit different visual cues and perform tasks such as classification, segmentation, and recognition. We formulate this as a structure induction and learning task, and our strategy is to learn and combine elementary POMs that make use of complementary image cues. We describe a novel structure induction procedure, which uses knowledge propagation to enable POMs to provide information to other POMs and "teach them" (which greatly reduces the amount of supervision required for training and speeds up the inference). In particular, we learn a POM-IP defined on Interest Points using weak supervision [1], [2] and use this to train a POM-mask, defined on regional features, which yields a combined POM that performs segmentation/localization. This combined model can be used to train POM-edgelets, defined on edgelets, which gives a full POM with improved performance on classification. We give detailed experimental analysis on large data sets for classification and segmentation with comparison to other methods. Inference takes five seconds while learning takes approximately four hours. In addition, we show that the full POM is invariant to scale and rotation of the object (for learning and inference) and can learn hybrid object classes (i.e., when there are several objects and the identity of the object in each image is unknown). Finally, we show that POMs can be used to match between different objects of the same category and hence enable object recognition. PMID:19696447

  14. Probabilistic Quantitative Precipitation Forecasting over East China using Bayesian Model Averaging

    NASA Astrophysics Data System (ADS)

    Yang, Ai; Yuan, Huiling

    2014-05-01

    Bayesian model averaging (BMA) is a post-processing method that weights the predictive probability density functions (PDFs) of individual ensemble members. This study investigates the BMA method for calibrating quantitative precipitation forecasts (QPFs) from The Observing System Research and Predictability Experiment (THORPEX) Interactive Grand Global Ensemble (TIGGE) database. The QPFs over East Asia during summer (June-August) 2008-2011 are generated from six operational ensemble prediction systems (EPSs), namely ECMWF, UKMO, NCEP, CMC, JMA and CMA, and from multi-center ensembles of their combinations. The satellite-based precipitation estimate product TRMM 3B42 V7 is used as the verification dataset. In the BMA post-processing for precipitation forecasts, the PDF matching method is first applied to bias-correct systematic errors in each forecast member, by adjusting PDFs of forecasts to match PDFs of observations. Next, a logistic regression and a two-parameter gamma distribution are used to fit the probability of rainfall occurrence and the precipitation distribution. Through these two steps, the BMA post-processing bias-corrects ensemble forecasts systematically. The 60-70% cumulative distribution function (CDF) predictions estimate moderate precipitation better than the raw ensemble mean, while the 90% upper boundary of the BMA CDF predictions can be set as a threshold for an extreme-precipitation alarm. In general, the BMA method is well suited to multi-center ensemble post-processing, improving probabilistic QPFs (PQPFs) with better ensemble spread and reliability. KEYWORDS: Bayesian model averaging (BMA); post-processing; ensemble forecast; TIGGE
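
    A minimal sketch of how a BMA predictive distribution is used once member weights and gamma fits are available; the weights and gamma parameters below are hypothetical, and the logistic model for the probability of zero rain is omitted.

      import numpy as np
      from scipy.stats import gamma

      # Hypothetical bias-corrected gamma fits (shape, scale in mm) for three
      # ensemble members at one grid point, plus their BMA weights.
      members = [(1.5, 4.0), (2.0, 3.0), (1.2, 6.0)]
      weights = np.array([0.5, 0.3, 0.2])

      x = np.linspace(0.0, 100.0, 2001)          # precipitation amounts (mm)
      cdf = sum(w * gamma.cdf(x, a, scale=s) for w, (a, s) in zip(weights, members))

      # Quantiles of the BMA predictive distribution, e.g. the 90% upper bound
      # used as an extreme-precipitation alarm threshold.
      for q in (0.6, 0.7, 0.9):
          print(f"{q:.0%} CDF value: {x[np.searchsorted(cdf, q)]:.1f} mm")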

  15. Probabilistic modelling of exposure doses and implications for health risk characterization: glycoalkaloids from potatoes.

    PubMed

    Ruprich, J; Rehurkova, I; Boon, P E; Svensson, K; Moussavian, S; Van der Voet, H; Bosgra, S; Van Klaveren, J D; Busk, L

    2009-12-01

    Potatoes are a source of glycoalkaloids (GAs) represented primarily by alpha-solanine and alpha-chaconine (about 95%). The content of GAs in tubers is usually 10-100 mg/kg, and maximum levels do not exceed 200 mg/kg. GAs can be hazardous to human health. Poisoning involves gastrointestinal ailments and neurological symptoms. A single intake of >1-3 mg/kg b.w. is considered a critical effect dose (CED). Probabilistic modelling of acute and chronic (usual) exposure to GAs was performed in the Czech Republic, Sweden and The Netherlands. National databases on individual consumption of foods, data on the concentration of GAs in tubers (439 Czech and Swedish results) and processing factors were used for the modelling. The results indicate that potatoes currently available on the European market may lead to acute intakes >1 mg GAs/kg b.w./day for the upper tail of the intake distribution (0.01% of the population) in all three countries. A limit of 50 mg GAs/kg in raw unpeeled tubers ensures that at least 99.99% of the population does not exceed the CED. The estimated chronic (usual) intake in the participating countries was 0.25, 0.29 and 0.56 mg/kg b.w./day (97.5% upper confidence limit). It remains unclear whether the incidence of GA poisoning is underreported or whether the assumptions are worst-case for extremely sensitive persons.
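
    The acute exposure calculation can be sketched as a Monte Carlo over person-days; all distributions below are hypothetical placeholders, not the national consumption or concentration data used in the study.

      import numpy as np

      rng = np.random.default_rng(3)
      n = 1_000_000   # simulated person-days

      # Hypothetical input distributions
      consumption_g = rng.lognormal(np.log(150), 0.6, n)      # potato eaten (g/day)
      concentration = rng.lognormal(np.log(60), 0.5, n)       # GAs in tuber (mg/kg)
      processing = rng.uniform(0.5, 1.0, n)                   # peeling/cooking factor
      body_weight = rng.normal(70, 12, n).clip(40, 130)       # kg

      intake = consumption_g / 1000 * concentration * processing / body_weight  # mg/kg b.w./day

      ced = 1.0   # critical effect dose, mg/kg b.w.
      print(f"P(acute intake > CED) = {(intake > ced).mean():.2e}")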

  17. Developing a probabilistic fire risk model and its application to fire danger systems

    NASA Astrophysics Data System (ADS)

    Penman, T.; Bradstock, R.; Caccamo, G.; Price, O.

    2012-04-01

    Wildfires can result in significant economic losses where they encounter human assets. Management agencies have large budgets devoted to both prevention and suppression of fires, but little is known about the extent to which these measures alter the probability of asset loss. Predicting the risk of asset loss as a result of wildfire requires an understanding of a number of complex processes, from ignition and fire growth to impact on assets. These processes need to account for the additive or multiplicative effects of management, weather and the natural environment. Traditional analytical methods can examine only a small subset of these. Bayesian Belief Networks (BBNs) provide a methodology for examining complex environmental problems. Outcomes of a BBN are represented as likelihoods, which can then form the basis for risk analysis and management. Here we combine a range of data sources, including simulation models, empirical statistical analyses and expert opinion, to form a fire management BBN. Various management actions have been incorporated into the model, including landscape and interface prescribed burning, initial attack and fire suppression. Performance of the model has been tested against fire history datasets, with strong correlations found. By adapting the BBN presented here, we are able to develop a spatial and temporal fire danger rating system. Current Australian fire danger rating systems are based on the weather. Our model accounts for existing fires as well as the risk of new ignitions, combined with probabilistic weather forecasts, to identify those areas most at risk of asset loss. Fire growth is modelled with consideration given to management prevention efforts as well as the suppression resources available in each geographic locality. At a 10 km resolution, the model will provide a probability of asset loss, which represents a significant step forward in the level of information that can be provided to the general public.

  18. Probabilistic forecasts of debris-flow hazard at the regional scale with a combination of models.

    NASA Astrophysics Data System (ADS)

    Malet, Jean-Philippe; Remaître, Alexandre

    2015-04-01

    Debris flows are one of the many active slope-forming processes in the French Alps, where rugged and steep slopes mantled by various slope deposits offer a great potential for triggering hazardous events. A quantitative assessment of debris-flow hazard requires the estimation, in a probabilistic framework, of the spatial probability of occurrence of source areas, the spatial probability of runout areas, the temporal frequency of events, and their intensity. The main objective of this research is to propose a pipeline for the estimation of these quantities at the regional scale using a chain of debris-flow models. The work uses the experimental site of the Barcelonnette Basin (South French Alps), where 26 active torrents have produced more than 150 debris-flow events since 1850, to develop and validate the methodology. First, a susceptibility assessment is performed to identify the debris-flow-prone source areas. The most frequently used approach is the combination of environmental factors with GIS procedures and statistical techniques, with or without the integration of detailed event inventories. Based on a 5 m DEM and its derivatives, and on information on slope lithology, engineering soils and land cover, the possible source areas are identified with a statistical logistic regression model. The performance of the statistical model is evaluated against the observed distribution of debris-flow events recorded after 1850 in the study area. The source areas in the three most active torrents (Riou-Bourdoux, Faucon, Sanières) are well identified by the model. Results are less convincing for three other active torrents (Bourget, La Valette and Riou-Chanal); this could be related to the type of debris-flow triggering mechanism, as the model seems to better identify open-slope debris-flow source areas (e.g. scree slopes) but appears to be less efficient for the identification of landslide-induced debris flows. Second, a susceptibility assessment is performed to estimate the possible runout distance

  19. ADOPT: A Historically Validated Light Duty Vehicle Consumer Choice Model

    SciTech Connect

    Brooker, A.; Gonder, J.; Lopp, S.; Ward, J.

    2015-05-04

    The Automotive Deployment Option Projection Tool (ADOPT) is a light-duty vehicle consumer choice and stock model supported by the U.S. Department of Energy’s Vehicle Technologies Office. It estimates the impacts of technology improvements on U.S. light-duty vehicle sales, petroleum use, and greenhouse gas emissions. ADOPT uses techniques from the multinomial logit method and the mixed logit method to estimate sales. Specifically, it estimates sales based on the weighted value of key attributes including vehicle price, fuel cost, acceleration, range and usable volume. The average importance of several attributes changes nonlinearly across their ranges and changes with income. For several attributes, a distribution of importance around the average value is used to represent consumer heterogeneity. The majority of existing vehicle makes, models, and trims are included to fully represent the market. The Corporate Average Fuel Economy regulations are enforced. The sales feed into the ADOPT stock model, which captures key aspects of summing petroleum use and greenhouse gas emissions. These include the change in vehicle miles traveled by vehicle age, the creation of new model options based on the success of existing vehicles, limits on the rate at which new vehicle options are introduced, and survival rates by vehicle age. ADOPT has been extensively validated with historical sales data. It matches the data in key dimensions including sales by fuel economy, acceleration, price, vehicle size class, and powertrain across multiple years. A graphical user interface provides easy and efficient use, managing the inputs, simulation, and results.
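
    The core sales calculation can be illustrated with a plain multinomial logit over weighted attributes; the vehicles, attributes and importance weights below are hypothetical, and ADOPT's nonlinear, income-dependent weighting and mixed logit heterogeneity are not reproduced.

      import numpy as np

      # Hypothetical vehicle attributes: price ($k), fuel cost ($k/yr), 0-60 time (s)
      vehicles = {"conventional": np.array([25.0, 1.8, 8.5]),
                  "hybrid":       np.array([28.0, 1.1, 8.9]),
                  "electric":     np.array([35.0, 0.6, 7.5])}

      # Hypothetical importance weights (negative: higher price, fuel cost or
      # 0-60 time reduce utility).
      weights = np.array([-0.10, -0.80, -0.30])

      utilities = np.array([attrs @ weights for attrs in vehicles.values()])
      shares = np.exp(utilities) / np.exp(utilities).sum()   # multinomial logit shares

      for name, share in zip(vehicles, shares):
          print(f"{name:>12}: {share:.1%} of sales")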

  20. Statistical and Probabilistic Extensions to Ground Operations' Discrete Event Simulation Modeling

    NASA Technical Reports Server (NTRS)

    Trocine, Linda; Cummings, Nicholas H.; Bazzana, Ashley M.; Rychlik, Nathan; LeCroy, Kenneth L.; Cates, Grant R.

    2010-01-01

    NASA's human exploration initiatives will invest in technologies, public/private partnerships, and infrastructure, paving the way for the expansion of human civilization into the solar system and beyond. As it has been for the past half century, the Kennedy Space Center will be the embarkation point for humankind's journey into the cosmos. Functioning as a next-generation space launch complex, Kennedy's launch pads, integration facilities, processing areas, and launch and recovery ranges will bustle with the activities of the world's space transportation providers. In developing this complex, KSC teams work through the potential operational scenarios: conducting trade studies, planning and budgeting for expensive and limited resources, and simulating alternative operational schemes. Numerous tools, among them discrete event simulation (DES), were matured during the Constellation Program to conduct such analyses, with the purpose of optimizing the launch complex for maximum efficiency, safety, and flexibility while minimizing life cycle costs. Discrete event simulation is a computer-based modeling technique for complex and dynamic systems where the state of the system changes at discrete points in time and whose inputs may include random variables. DES is used to assess timelines and throughput, and to support operability studies and contingency analyses. It is applicable to any space launch campaign and informs decision-makers of the effects of varying numbers of expensive resources and the impact of off-nominal scenarios on measures of performance. In order to develop representative DES models, methods were adopted, exploited, or created to extend traditional uses of DES. The Delphi method was adopted and utilized for task duration estimation. DES software was exploited for probabilistic event variation. A roll-up process, developed to reuse models and model elements in other, less detailed models, was also used. The DES team continues to innovate and expand
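
    A much-simplified sketch of the probabilistic extension described here: Delphi-style triangular task-duration estimates rolled up by Monte Carlo. The tasks and durations are hypothetical, and a full DES would also model resources and queuing.

      import random

      random.seed(0)

      def triangular_duration(low, mode, high):
          # Delphi-style duration estimate sampled as a triangular distribution
          return random.triangular(low, high, mode)

      # Hypothetical tasks: name -> (predecessors, (low, mode, high) days),
      # listed in precedence order.
      tasks = {"stack_vehicle":     ([], (5, 7, 12)),
               "integrate_payload": (["stack_vehicle"], (3, 4, 8)),
               "pad_rollout":       (["integrate_payload"], (1, 2, 4)),
               "launch_countdown":  (["pad_rollout"], (1, 1, 3))}

      def campaign_duration():
          finish = {}
          for name, (preds, (lo, mo, hi)) in tasks.items():
              start = max((finish[p] for p in preds), default=0.0)
              finish[name] = start + triangular_duration(lo, mo, hi)
          return max(finish.values())

      durations = sorted(campaign_duration() for _ in range(10_000))
      print(f"median campaign: {durations[len(durations) // 2]:.1f} days, "
            f"90th percentile: {durations[int(0.9 * len(durations))]:.1f} days")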

  1. Methodology for Developing a Probabilistic Risk Assessment Model of Spacecraft Rendezvous and Dockings

    NASA Technical Reports Server (NTRS)

    Farnham, Steven J., II; Garza, Joel, Jr.; Castillo, Theresa M.; Lutomski, Michael

    2011-01-01

    In 2007 NASA was preparing to send two new visiting vehicles carrying logistics and propellant to the International Space Station (ISS). These new vehicles were the European Space Agency's (ESA) Automated Transfer Vehicle (ATV), the Jules Verne, and the Japan Aerospace Exploration Agency's (JAXA) H-II Transfer Vehicle (HTV). The ISS Program wanted to quantify the increased risk to the ISS from these visiting vehicles. At the time, only the Shuttle, the Soyuz, and the Progress vehicles rendezvoused and docked to the ISS. The increased risk to the ISS came from an increase in vehicle traffic, which raised the potential for a catastrophic collision during the rendezvous and the docking or berthing of a spacecraft to the ISS. A universal method of evaluating the risk of rendezvous and docking or berthing was created by the ISS's Risk Team to accommodate the increasing number of rendezvous and docking or berthing operations due to the increasing number of different spacecraft, as well as the future arrival of commercial spacecraft. Before the first docking attempt of ESA's ATV and JAXA's HTV to the ISS, a probabilistic risk model was developed to quantitatively calculate the risk of collision of each spacecraft with the ISS. The five rendezvous and docking risk models (Soyuz, Progress, Shuttle, ATV, and HTV) have been used to build and refine the modeling methodology for rendezvous and docking of spacecraft. This risk modeling methodology will be NASA's basis for evaluating the hazards introduced by future ISS visiting spacecraft, including SpaceX's Dragon, Orbital Sciences' Cygnus, and NASA's own Orion spacecraft. This paper describes the methodology used for developing a visiting vehicle risk model.

  2. A hierarchical model for probabilistic independent component analysis of multi-subject fMRI studies.

    PubMed

    Guo, Ying; Tang, Li

    2013-12-01

    An important goal in fMRI studies is to decompose the observed series of brain images to identify and characterize underlying brain functional networks. Independent component analysis (ICA) has been shown to be a powerful computational tool for this purpose. Classic ICA has been successfully applied to single-subject fMRI data. The extension of ICA to group inferences in neuroimaging studies, however, is challenging due to the unavailability of a pre-specified group design matrix. Existing group ICA methods generally concatenate observed fMRI data across subjects on the temporal domain and then decompose multi-subject data in a similar manner to single-subject ICA. The major limitation of existing methods is that they ignore between-subject variability in spatial distributions of brain functional networks in group ICA. In this article, we propose a new hierarchical probabilistic group ICA method to formally model subject-specific effects in both temporal and spatial domains when decomposing multi-subject fMRI data. The proposed method provides model-based estimation of brain functional networks at both the population and subject level. An important advantage of the hierarchical model is that it provides a formal statistical framework to investigate similarities and differences in brain functional networks across subjects, for example, subjects with mental disorders or neurodegenerative diseases such as Parkinson's as compared to normal subjects. We develop an EM algorithm for model estimation where both the E-step and M-step have explicit forms. We compare the performance of the proposed hierarchical model with that of two popular group ICA methods via simulation studies. We illustrate our method with application to an fMRI study of Zen meditation.

  3. Improving predictive power of physically based rainfall-induced shallow landslide models: a probabilistic approach

    NASA Astrophysics Data System (ADS)

    Raia, S.; Alvioli, M.; Rossi, M.; Baum, R. L.; Godt, J. W.; Guzzetti, F.

    2013-02-01

    Distributed models to forecast the spatial and temporal occurrence of rainfall-induced shallow landslides are deterministic. These models extend spatially the static stability models adopted in geotechnical engineering and adopt an infinite-slope geometry to balance the resisting and the driving forces acting on the sliding mass. An infiltration model is used to determine how rainfall changes pore-water conditions, modulating the local stability/instability conditions. A problem with the existing models is the difficulty in obtaining accurate values for the several variables that describe the material properties of the slopes. The problem is particularly severe when the models are applied over large areas, for which sufficient information on the geotechnical and hydrological conditions of the slopes is not generally available. To help solve the problem, we propose a probabilistic Monte Carlo approach to the distributed modeling of shallow rainfall-induced landslides. For the purpose, we have modified the Transient Rainfall Infiltration and Grid-Based Regional Slope-Stability Analysis (TRIGRS) code. The new code (TRIGRS-P) adopts a stochastic approach to compute, on a cell-by-cell basis, transient pore-pressure changes and related changes in the factor of safety due to rainfall infiltration. Infiltration is modeled using analytical solutions of partial differential equations describing one-dimensional vertical flow in isotropic, homogeneous materials. Both saturated and unsaturated soil conditions can be considered. TRIGRS-P copes with the natural variability inherent to the mechanical and hydrological properties of the slope materials by allowing values of the TRIGRS model input parameters to be sampled randomly from a given probability distribution. The range of variation and the mean value of the parameters can be determined by the usual methods used for preparing the TRIGRS input parameters. The outputs of several model runs obtained varying the input parameters
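
    The stochastic per-cell stability calculation can be sketched as follows; the parameter distributions are hypothetical and rainfall infiltration is reduced to a sampled water-table ratio, so this conveys only the flavour of the Monte Carlo approach, not the TRIGRS-P implementation.

      import numpy as np

      rng = np.random.default_rng(4)
      n = 100_000   # Monte Carlo samples for one grid cell

      # Hypothetical input distributions for the slope material
      cohesion = rng.normal(5e3, 1e3, n).clip(0)        # soil cohesion c' (Pa)
      phi = np.radians(rng.normal(32, 3, n))            # friction angle (rad)
      gamma_s = 19e3                                     # soil unit weight (N/m^3)
      gamma_w = 9.81e3                                   # water unit weight (N/m^3)
      z = 2.0                                            # failure depth (m)
      beta = np.radians(30)                              # slope angle
      m = rng.uniform(0.3, 1.0, n)                       # water table height / z after rain

      # Infinite-slope factor of safety
      fs = (cohesion + (gamma_s - m * gamma_w) * z * np.cos(beta) ** 2 * np.tan(phi)) \
           / (gamma_s * z * np.sin(beta) * np.cos(beta))

      print(f"P(FS < 1) for this cell: {(fs < 1).mean():.2%}")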

  4. Preference pulses and the win-stay, fix-and-sample model of choice.

    PubMed

    Hachiga, Yosuke; Sakagami, Takayuki; Silberberg, Alan

    2015-11-01

    Two groups of six rats each were trained to respond to two levers for a food reinforcer. One group was trained on concurrent variable-ratio 20 extinction schedules of reinforcement. The second group was trained on a concurrent variable-interval 27-s extinction schedule. In both groups, lever-schedule assignments changed randomly following reinforcement; a light cued the lever providing the next reinforcer. In the next condition, the light cue was removed and reinforcer assignment strictly alternated between levers. The next two conditions redetermined, in order, the first two conditions. Preference pulses, defined as a tendency for the relative response rate to the just-reinforced alternative to decline with time since reinforcement, appeared only during the extinction schedule. Although the pulse's functional form was well described by a reinforcer-induction equation, there was a large residual between the actual data and a pulse-as-artifact simulation (McLean, Grace, Pitts, & Hughes, 2014) used to discern reinforcer-dependent contributions to pulsing. However, if that simulation was modified to include a win-stay tendency (a propensity to stay on the just-reinforced alternative), the residual was greatly reduced. Additional modifications of the parameter values of the pulse-as-artifact simulation enabled it to accommodate the present results as well as those it originally accommodated. In its revised form, this simulation was used to create a model that describes response runs to the preferred alternative as terminating probabilistically, and runs to the unpreferred alternative as punctate with occasional perseverative response runs. After reinforcement, choices are modeled as returning briefly to the lever location that was just reinforced. This win-stay propensity is hypothesized to be due to reinforcer induction.

  6. Comparison of Control Approaches in Genetic Regulatory Networks by Using Stochastic Master Equation Models, Probabilistic Boolean Network Models and Differential Equation Models and Estimated Error Analyzes

    NASA Astrophysics Data System (ADS)

    Caglar, Mehmet Umut; Pal, Ranadip

    2011-03-01

    The central dogma of molecular biology states that "information cannot be transferred back from protein to either protein or nucleic acid". However, this assumption is not exactly correct in most cases. There are many feedback loops and interactions between different levels of the system. These types of interactions are hard to analyze due to the lack of cell-level data and the probabilistic, nonlinear nature of the interactions. Several models are widely used to analyze and simulate these types of nonlinear interactions. Stochastic Master Equation (SME) models capture the probabilistic nature of the interactions in a detailed manner, at a high computational cost. Probabilistic Boolean Network (PBN) models, on the other hand, give a coarse-scale picture of the stochastic processes at a lower computational cost. Differential Equation (DE) models give the time evolution of the mean values of the processes in a highly cost-effective way. Understanding the relations between the predictions of these models is important for assessing the reliability of simulations of genetic regulatory networks. In this work, the success of the mapping between SME, PBN and DE models is analyzed, and the accuracy and effectiveness of the control policies generated using PBN and DE models are compared.
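
    The SME-versus-DE comparison can be illustrated on a one-gene birth-death toy model: a Gillespie simulation stands in for the stochastic master-equation level and its mean is compared with the differential-equation steady state (the rates are hypothetical; the PBN level is not shown).

      import numpy as np

      rng = np.random.default_rng(5)
      k_prod, k_deg = 10.0, 0.5        # hypothetical production/degradation rates
      t_end = 20.0

      def gillespie():
          # Stochastic trajectory of protein copy number (exact SSA)
          t, x = 0.0, 0
          while True:
              rates = np.array([k_prod, k_deg * x])
              total = rates.sum()
              t += rng.exponential(1.0 / total)
              if t >= t_end:
                  return x             # state is constant until the next jump
              x += 1 if rng.random() < rates[0] / total else -1

      samples = [gillespie() for _ in range(2_000)]
      ode_steady_state = k_prod / k_deg    # dx/dt = k_prod - k_deg * x  ->  x* = k_prod / k_deg
      print(f"SSA mean: {np.mean(samples):.1f},  ODE steady state: {ode_steady_state:.1f}")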

  7. Occupational Choice: A Conditional Logit Model with Special Reference to Wage Subsidies and Occupational Choice. Final Report.

    ERIC Educational Resources Information Center

    Boskin, Michael J.

    A model of occupational choice based on the theory of human capital is developed and estimated by conditional logit analysis. The empirical results estimated the probability of individuals with certain characteristics (such as race, sex, age, and education) entering each of 11 occupational groups. The results indicate that individuals tend to…
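
    The conditional logit probabilities at the heart of such a model take a softmax form over occupation-specific utilities; the coefficients and occupational attributes below are hypothetical.

      import numpy as np

      # Hypothetical coefficients on (expected log wage, years of required schooling)
      beta = np.array([1.2, -0.15])

      # Hypothetical attributes of a few occupational groups for one individual
      occupations = {"clerical":     np.array([2.4, 12.0]),
                     "professional": np.array([3.1, 16.0]),
                     "service":      np.array([2.1, 10.0])}

      v = np.array([beta @ x for x in occupations.values()])     # systematic utilities
      p = np.exp(v) / np.exp(v).sum()                            # conditional logit probabilities

      for occ, prob in zip(occupations, p):
          print(f"P(choose {occ}) = {prob:.2f}")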

  8. Predicting carcinogenicity of diverse chemicals using probabilistic neural network modeling approaches

    SciTech Connect

    Singh, Kunwar P.; Gupta, Shikha; Rai, Premanjali

    2013-10-15

    Robust global models were developed that are capable of discriminating positive from non-positive carcinogens and predicting the carcinogenic potency of chemicals in rodents. A dataset of 834 structurally diverse chemicals extracted from the Carcinogenic Potency Database (CPDB), containing 466 positive and 368 non-positive carcinogens, was used. Twelve non-quantum mechanical molecular descriptors were derived. The structural diversity of the chemicals and the nonlinearity in the data were evaluated using the Tanimoto similarity index and the Brock–Dechert–Scheinkman statistic. Probabilistic neural network (PNN) and generalized regression neural network (GRNN) models were constructed for classification and function optimization problems using the carcinogenicity end point in rat. Validation of the models was performed using internal and external procedures employing a wide series of statistical checks. The PNN constructed using five descriptors rendered a classification accuracy of 92.09% in the complete rat data. The PNN model rendered classification accuracies of 91.77%, 80.70% and 92.08% in the mouse, hamster and pesticide data, respectively. The GRNN constructed with nine descriptors yielded a correlation coefficient of 0.896 between the measured and predicted carcinogenic potency, with a mean squared error (MSE) of 0.44 in the complete rat data. The rat carcinogenicity model (GRNN) applied to the mouse and hamster data yielded correlation coefficients and MSEs of 0.758, 0.71 and 0.760, 0.46, respectively. The results suggest wide applicability of the inter-species models in predicting the carcinogenic potency of chemicals. Both the PNN and GRNN (inter-species) models constructed here can be useful tools in predicting the carcinogenicity of new chemicals for regulatory purposes. - Graphical abstract: Figure (a) shows classification accuracies (positive and non-positive carcinogens) in rat, mouse, hamster, and pesticide data yielded by the optimal PNN model. Figure (b) shows generalization and predictive
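
    A probabilistic neural network is essentially a Parzen-window classifier; the sketch below shows the idea on synthetic descriptor data (the descriptors, class means and smoothing parameter are hypothetical, not those of the study).

      import numpy as np

      rng = np.random.default_rng(6)

      # Hypothetical molecular-descriptor matrices for two training classes
      X_pos = rng.normal(1.0, 1.0, (60, 5))    # "positive carcinogen" class
      X_neg = rng.normal(-1.0, 1.0, (40, 5))   # "non-positive" class
      classes = [X_pos, X_neg]
      labels = ["positive", "non-positive"]
      sigma = 0.8                               # Parzen smoothing parameter

      def pnn_predict(x):
          # Pattern layer: Gaussian kernels; summation layer: class averages;
          # decision layer: pick the class with the larger density estimate.
          scores = []
          for X in classes:
              d2 = ((X - x) ** 2).sum(axis=1)
              scores.append(np.exp(-d2 / (2 * sigma ** 2)).mean())
          return labels[int(np.argmax(scores))]

      query = rng.normal(0.8, 1.0, 5)           # descriptors of a new chemical
      print("Predicted class:", pnn_predict(query))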

  9. A probabilistic model-based approach to consistent white matter tract segmentation.

    PubMed

    Clayden, Jonathan D; Storkey, Amos J; Bastin, Mark E

    2007-11-01

    Since the invention of diffusion magnetic resonance imaging (dMRI), currently the only established method for studying white matter connectivity in a clinical environment, there has been a great deal of interest in the effects of various pathologies on the connectivity of the brain. As methods for in vivo tractography have been developed, it has become possible to track and segment specific white matter structures of interest for particular study. However, the consistency and reproducibility of tractography-based segmentation remain limited, and attempts to improve them have thus far typically involved the imposition of strong constraints on the tract reconstruction process itself. In this work we take a different approach, developing a formal probabilistic model for the relationships between comparable tracts in different scans, and then using it to choose a tract, a posteriori, which best matches a predefined reference tract for the structure of interest. We demonstrate that this method is able to significantly improve segmentation consistency without directly constraining the tractography algorithm. PMID:18041270

  10. Probabilistic Modeling of Landfill Subsidence Introduced by Buried Structure Collapse - 13229

    SciTech Connect

    Foye, Kevin; Soong, Te-Yang

    2013-07-01

    The long-term reliability of land disposal facility final cover systems - and therefore the overall waste containment - depends on the distortions imposed on these systems by differential settlement/subsidence. The evaluation of differential settlement is challenging because of the heterogeneity of the waste mass and buried structure placement. Deterministic approaches to long-term final cover settlement prediction are not able to capture the spatial variability in the waste mass and sub-grade properties, especially discontinuous inclusions, which control differential settlement. An alternative is to use a probabilistic model to capture the non-uniform collapse of cover soils and buried structures and the subsequent effect of that collapse on the final cover system. Both techniques are applied to the problem of two side-by-side waste trenches with collapsible voids. The results show how this analytical technique can be used to connect a metric of final cover performance (inundation area) to the susceptibility of the sub-grade to collapse and the effective thickness of the cover soils. This approach allows designers to specify cover thickness, reinforcement, and slope to meet the demands imposed by the settlement of the underlying waste trenches. (authors)

  11. Detection of prostate cancer on histopathology using color fractals and Probabilistic Pairwise Markov models.

    PubMed

    Yu, Elaine; Monaco, James P; Tomaszewski, John; Shih, Natalie; Feldman, Michael; Madabhushi, Anant

    2011-01-01

    In this paper we present a system for detecting regions of carcinoma of the prostate (CaP) in H&E stained radical prostatectomy specimens using the color fractal dimension. Color textural information is known to be a valuable characteristic for distinguishing CaP from benign tissue. In addition to color information, we know that cancer tends to form contiguous regions. Our system leverages the color staining information of histology as well as spatial dependencies. The color and textural information is first captured using the color fractal dimension. To incorporate spatial dependencies, we combine the probability map constructed via the color fractal dimension with a novel Markov prior called the Probabilistic Pairwise Markov Model (PPMM). To demonstrate the capability of this CaP detection system, we applied the algorithm to 27 radical prostatectomy specimens from 10 patients. A per-pixel evaluation was conducted with ground truth provided by an expert pathologist. Using only the color fractal feature yielded an area under the receiver operating characteristic (ROC) curve (AUC) of 0.790. In conjunction with the Markov prior, the resultant color fractal dimension + Markov random field (MRF) classifier yielded an AUC of 0.831.

  12. Spacecraft technology portfolio: Probabilistic modeling and implications for responsiveness and schedule slippage

    NASA Astrophysics Data System (ADS)

    Dubos, Gregory F.; Saleh, Joseph H.

    2011-04-01

    Addressing the challenges of Responsive Space and mitigating the risk of schedule slippage in space programs require a thorough understanding of the various factors driving the development schedule of a space system. The present work contributes theoretical and practical results in this direction. A spacecraft is here conceived of as a technology portfolio. The characteristics of this portfolio are defined as its size (e.g., number of instruments), the technology maturity of each instrument and the resulting Technology Readiness Level (TRL) heterogeneity, and their effects on the delivery schedule of a spacecraft are investigated. Following a brief overview of the concept of R&D portfolio and its relevance to spacecraft design, a probabilistic model of the Time-to-Delivery of a spacecraft is formulated, which includes the development, Integration and Testing, and Shipping phases. The Mean-Time-To-Delivery (MTTD) of the spacecraft is quantified based on the portfolio characteristics, and it is shown that the Mean-Time-To-Delivery (MTTD) of the spacecraft and its schedule risk are significantly impacted by decreasing TRL and increasing portfolio size. Finally, the utility implications of varying the portfolio characteristics are investigated, and "portfolio maps" are provided as guides to help system designers identify appropriate portfolio characteristics when operating in a calendar-based design environment (which is the paradigm shift that space responsiveness introduces).

  13. Probabilistic Movement Models Show that Postural Control Precedes and Predicts Volitional Motor Control.

    PubMed

    Rueckert, Elmar; Čamernik, Jernej; Peters, Jan; Babič, Jan

    2016-01-01

    Human motor skill learning is driven by the necessity to adapt to new situations. While supportive contacts are essential for many tasks, little is known about their impact on motor learning. To study the effect of contacts an innovative full-body experimental paradigm was established. The task of the subjects was to reach for a distant target while postural stability could only be maintained by establishing an additional supportive hand contact. To examine adaptation, non-trivial postural perturbations of the subjects' support base were systematically introduced. A novel probabilistic trajectory model approach was employed to analyze the correlation between the motions of both arms and the trunk. We found that subjects adapted to the perturbations by establishing target dependent hand contacts. Moreover, we found that the trunk motion adapted significantly faster than the motion of the arms. However, the most striking finding was that observations of the initial phase of the left arm or trunk motion (100-400 ms) were sufficient to faithfully predict the complete movement of the right arm. Overall, our results suggest that the goal-directed arm movements determine the supportive arm motions and that the motion of heavy body parts adapts faster than the light arms. PMID:27328750

  15. Quantification and probabilistic modeling of CRT obsolescence for the State of Delaware.

    PubMed

    Schumacher, Kelsea A; Schumacher, Thomas; Agbemabiese, Lawrence

    2014-11-01

    The cessation of production and the replacement of cathode ray tube (CRT) displays with flat-screen displays have resulted in the proliferation of CRTs in the electronic waste (e-waste) recycle stream. However, due to the nature of the technology and the presence of hazardous components such as lead, CRTs are the most challenging of electronic components to recycle. In the State of Delaware, it is due to this challenge and the resulting expense, combined with the large quantities of CRTs in the recycle stream, that electronic recyclers now charge to accept Delaware's e-waste. It is therefore imperative that the Delaware Solid Waste Authority (DSWA) understand future quantities of CRTs entering the waste stream. This study presents the results of an assessment of CRT obsolescence in the State of Delaware. A prediction model was created utilizing publicized sales data, a variety of lifespan data, and historic Delaware CRT collection rates. Both a deterministic approach and a probabilistic approach using Monte Carlo Simulation (MCS) were applied to forecast the rates of CRT obsolescence to be anticipated in the State of Delaware. Results indicate that the peak of CRT obsolescence in Delaware has already passed, although CRTs are anticipated to continue entering the waste stream until 2033. PMID:25130982
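
    The deterministic variant of such an obsolescence forecast convolves annual sales with a lifespan distribution; the sales figures and lifespan curve below are hypothetical, and the probabilistic variant would instead sample lifespans for sold units in a Monte Carlo simulation.

      import numpy as np

      # Hypothetical annual CRT sales (units), 1990-2007
      years = np.arange(1990, 2008)
      sales = np.interp(years, [1990, 2000, 2004, 2007], [40_000, 90_000, 60_000, 5_000])

      # Hypothetical lifespan distribution: probability a unit becomes obsolete
      # k years after sale.
      life_years = np.arange(1, 26)
      life_pmf = np.exp(-(life_years - 9) ** 2 / (2 * 4.0 ** 2))
      life_pmf /= life_pmf.sum()

      # Obsolete units in year t = sum over sale years s of sales(s) * P(lifespan = t - s)
      horizon = np.arange(1990, 2040)
      obsolete = np.zeros_like(horizon, dtype=float)
      for s, q in zip(years, sales):
          for k, p in zip(life_years, life_pmf):
              t = s + k
              if t <= horizon[-1]:
                  obsolete[t - horizon[0]] += q * p

      print(f"Peak obsolescence year (toy data): {horizon[np.argmax(obsolete)]}")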

  16. Modeling of constructional elements fragmentation: 3-D statement and probabilistic approach

    NASA Astrophysics Data System (ADS)

    Gerasimov, Alexander; Pashkov, Sergey

    2011-06-01

    The heterogeneity of the structure of real materials, which influences the distribution of material characteristics, is one of the factors determining the character of fracture. This factor can be introduced into the equations of deformable solid mechanics by using probabilistic laws for the distribution of characteristics over the volume of the considered structure. The following processes are considered: explosive fragmentation of open and closed shells; punching of a thick plate by an HE-charged shell at normal incidence and at an angle; plate and shell fragmentation after plate piercing and under HE charge explosion; punching of a thin barrier at normal incidence and at an angle; crushing of metal rings mounted on a copper tube; and high-speed impact of steel spheres, modeling debris of space bodies and artificial objects, on laminated, spaced metallic plates. The processes are calculated taking material heterogeneity into account. To calculate elastoplastic flows and detonation products, we used a technique implemented on tetrahedral cells and based on the Wilkins method for the calculation of interior points of a body and on the Johnson method for the calculation of contact interactions.

  17. A probabilistic model to predict clinical phenotypic traits from genome sequencing.

    PubMed

    Chen, Yun-Ching; Douville, Christopher; Wang, Cheng; Niknafs, Noushin; Yeo, Grace; Beleva-Guthrie, Violeta; Carter, Hannah; Stenson, Peter D; Cooper, David N; Li, Biao; Mooney, Sean; Karchin, Rachel

    2014-09-01

    Genetic screening is becoming possible on an unprecedented scale. However, its utility remains controversial. Although most variant genotypes cannot be easily interpreted, many individuals nevertheless attempt to interpret their genetic information. Initiatives such as the Personal Genome Project (PGP) and Illumina's Understand Your Genome are sequencing thousands of adults, collecting phenotypic information and developing computational pipelines to identify the most important variant genotypes harbored by each individual. These pipelines consider database and allele frequency annotations and bioinformatics classifications. We propose that the next step will be to integrate these different sources of information to estimate the probability that a given individual has specific phenotypes of clinical interest. To this end, we have designed a Bayesian probabilistic model to predict the probability of dichotomous phenotypes. When applied to a cohort from PGP, predictions of Gilbert syndrome, Graves' disease, non-Hodgkin lymphoma, and various blood groups were accurate, as individuals manifesting the phenotype in question exhibited the highest, or among the highest, predicted probabilities. Thirty-eight PGP phenotypes (26%) were predicted with area-under-the-ROC curve (AUC)>0.7, and 23 (15.8%) of these were statistically significant, based on permutation tests. Moreover, in a Critical Assessment of Genome Interpretation (CAGI) blinded prediction experiment, the models were used to match 77 PGP genomes to phenotypic profiles, generating the most accurate prediction of 16 submissions, according to an independent assessor. Although the models are currently insufficiently accurate for diagnostic utility, we expect their performance to improve with growth of publicly available genomics data and model refinement by domain experts.
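
    The structure of the study's Bayesian model is not given in the abstract. Purely as a generic illustration of how per-variant evidence can be combined with a phenotype prior into a posterior probability for a dichotomous phenotype, the sketch below accumulates log likelihood ratios under a naive independence assumption; the prior and the likelihood ratios are invented numbers, and this is not the published method.

        import math

        def phenotype_probability(prior, likelihood_ratios):
            """Combine a phenotype prior with per-variant likelihood ratios (naive independence assumption).

            Each likelihood ratio is P(genotype | phenotype) / P(genotype | no phenotype)
            for one observed variant; the posterior odds are the prior odds times their product.
            """
            log_odds = math.log(prior / (1.0 - prior)) + sum(math.log(lr) for lr in likelihood_ratios)
            return 1.0 / (1.0 + math.exp(-log_odds))

        # Hypothetical example: population prevalence 5%, three variants with modest evidence.
        print(phenotype_probability(0.05, [2.5, 1.8, 0.9]))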

  18. Predicting Postfire Hillslope Erosion with a Web-based Probabilistic Model

    NASA Astrophysics Data System (ADS)

    Robichaud, P. R.; Elliot, W. J.; Pierson, F. B.; Hall, D. E.; Moffet, C. A.

    2005-12-01

    Modeling erosion after major disturbances, such as wildfire, has major challenges that need to be overcome. Fire-induced changes include increased erosion due to loss of the protective litter and duff, loss of soil water storage, and in some cases, creation of water repellent soil conditions. These conditions increase the potential for flooding and sedimentation, which are of special concern to people who live in, or manage resources in, areas adjacent to burned areas. A web-based Erosion Risk Management Tool (ERMiT) has been developed to predict surface erosion from postfire hillslopes and to evaluate the potential effectiveness of various erosion mitigation practices. The model uses a probabilistic approach that incorporates variability in weather, soil properties, and burn severity for forest, rangeland, and chaparral hillslopes. The Water Erosion Prediction Project (WEPP) is the erosion prediction engine, used in a Monte Carlo simulation mode to provide event-based erosion rate probabilities. The one-page custom interface is targeted at hydrologists and soil scientists. The interface allows users to select climate, soil texture, burn severity, and hillslope topography. For a given hillslope, the model uses a single 100-year run to obtain weather variability and then twenty 5- to 10-year runs to incorporate variability in soil properties, cover, and spatial burn severity. The output, in both tabular and graphical form, relates the probability of soil erosion exceeding a given amount in each of the first five years following the fire. Event statistics are provided to show the magnitude and rainfall intensity of the storms used to predict erosion rates. ERMiT also allows users to compare the effects of various mitigation treatments (mulches, seeding, and barrier treatments such as contour-felled logs or straw wattles) on the erosion rate probability. Data from rainfall simulation and concentrated flow (rill) techniques were used to parameterize ERMiT for these varied
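
    ERMiT's erosion values come from WEPP runs that cannot be reproduced here, but the final step of converting a set of simulated event erosion rates into exceedance probabilities is straightforward to illustrate. In the sketch below the lognormal "erosion" samples are synthetic stand-ins for model output, and the thresholds are arbitrary.

        import numpy as np

        rng = np.random.default_rng(1)

        # Synthetic first-year hillslope erosion rates (Mg/ha) standing in for Monte Carlo output.
        erosion = rng.lognormal(mean=0.5, sigma=1.2, size=2000)

        def exceedance_probability(samples, threshold):
            """Probability that erosion exceeds a given amount, estimated from simulated events."""
            return float(np.mean(samples > threshold))

        for t in (1.0, 5.0, 20.0):
            print(f"P(erosion > {t} Mg/ha) ~= {exceedance_probability(erosion, t):.2f}")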

  19. Feedback May Harm: Role of Feedback in Probabilistic Decision Making of Adolescents with ADHD.

    PubMed

    Pollak, Yehuda; Shoham, Rachel

    2015-10-01

    Inept probabilistic decision making is commonly associated with ADHD. In experimental designs aimed to model probabilistic decision making in ADHD, feedback following each choice was, in the majority of studies, part of the paradigm. This study examined whether feedback processing plays a role in the maladaptive choice behavior of subjects with ADHD by comparing feedback and no-feedback conditions. Sixty adolescents (49 males), ages 13-18, with and without ADHD, performed a descriptive probabilistic choice task in which outcomes and probabilities were explicitly provided. Subjects performed the task either with or without feedback. Under the no-feedback condition, adolescents with ADHD and controls performed similarly, whereas under the feedback condition, subjects with ADHD chose the unfavorable outcomes more frequently and risked smaller sums than controls. These findings demonstrate the crucial role of feedback in the decision making of adolescents with ADHD.

  20. Probabilistic Failure Assessment For Fatigue

    NASA Technical Reports Server (NTRS)

    Moore, Nicholas; Ebbeler, Donald; Newlin, Laura; Sutharshana, Sravan; Creager, Matthew

    1995-01-01

    Probabilistic Failure Assessment for Fatigue (PFAFAT) is a software package that utilizes the probabilistic failure-assessment (PFA) methodology to model high- and low-cycle-fatigue modes of failure of structural components. It consists of nine programs. Three programs perform probabilistic fatigue analysis by means of Monte Carlo simulation. The other six are used for generating random processes, characterizing fatigue-life data pertaining to materials, and processing the outputs of computational simulations. Written in FORTRAN 77.
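
    The nine PFAFAT programs are not described in detail in this record. As a generic illustration of probabilistic fatigue analysis by Monte Carlo simulation, the sketch below samples material and loading scatter and counts how often a Basquin-type S-N life falls short of a target life; the distributional choices and every parameter value are assumptions for illustration, not PFAFAT's models.

        import numpy as np

        rng = np.random.default_rng(2)

        def failure_probability(target_cycles, n_samples=100_000):
            """Estimate the probability of fatigue failure before a target life by Monte Carlo.

            Cycles to failure follow a Basquin-type relation N = (sigma_f / sigma_a)**(1/b),
            with lognormal scatter on stress amplitude and fatigue strength (assumed values).
            """
            sigma_a = rng.lognormal(np.log(300.0), 0.10, n_samples)   # stress amplitude, MPa
            sigma_f = rng.lognormal(np.log(900.0), 0.08, n_samples)   # fatigue strength coefficient, MPa
            b = 0.09                                                  # Basquin exponent magnitude
            cycles_to_failure = (sigma_f / sigma_a) ** (1.0 / b)
            return float(np.mean(cycles_to_failure < target_cycles))

        print(failure_probability(1.0e5))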

  1. A probabilistic model for the identification of confinement regimes and edge localized mode behavior, with implications to scaling laws

    NASA Astrophysics Data System (ADS)

    Verdoolaege, Geert; Van Oost, Guido

    2012-10-01

    Pattern recognition is becoming an important tool in fusion data analysis. However, fusion diagnostic measurements are often affected by considerable statistical uncertainties, rendering the extraction of useful patterns a significant challenge. Therefore, we assume a probabilistic model for the data and perform pattern recognition in the space of probability distributions. We show the considerable advantage of our method for identifying confinement regimes and edge localized mode behavior, and we discuss the potential for scaling laws.

  2. A probabilistic model for the identification of confinement regimes and edge localized mode behavior, with implications to scaling laws

    SciTech Connect

    Verdoolaege, Geert; Van Oost, Guido

    2012-10-15

    Pattern recognition is becoming an important tool in fusion data analysis. However, fusion diagnostic measurements are often affected by considerable statistical uncertainties, rendering the extraction of useful patterns a significant challenge. Therefore, we assume a probabilistic model for the data and perform pattern recognition in the space of probability distributions. We show the considerable advantage of our method for identifying confinement regimes and edge localized mode behavior, and we discuss the potential for scaling laws.

  3. A probabilistic storm surge risk model for the German North Sea and Baltic Sea coast

    NASA Astrophysics Data System (ADS)

    Grabbert, Jan-Henrik; Reiner, Andreas; Deepen, Jan; Rodda, Harvey; Mai, Stephan; Pfeifer, Dietmar

    2010-05-01

    The German North Sea coast is highly exposed to storm surges. Due to its concave, bay-like shape, oriented mainly to the north-west, cyclones from westerly, north-westerly and northerly directions, together with the astronomical tide, cause storm surges that accumulate water in the German Bight. Because widespread low-lying areas (below 5 m above mean sea level) lie behind the defenses, large areas containing large economic values, including cities such as Hamburg and Bremen, are exposed to coastal flooding. The occurrence of extreme storm surges in the past, such as the 1962 event, which took about 300 lives and caused widespread flooding, and the 1976 event, raised awareness and led to a redesign of the coastal defenses, which provide a good level of protection for today's conditions. Nevertheless, the risk of flooding remains. Moreover, an amplification of storm surge risk can be expected under the influence of climate change. The Baltic Sea coast is also exposed to storm surges, which are caused by other meteorological patterns. The influence of the astronomical tide is quite low; instead, high water levels are induced by strong winds alone. Since the exceptionally extreme event of 1872, storm surge hazard has been more or less forgotten. Although such an event is very unlikely to happen, it is not impossible. Storm surge risk is currently (almost) non-insurable in Germany. The potential risk is difficult to quantify, as there are almost no historical losses available. Premiums are also difficult to assess. Therefore, a new storm surge risk model is being developed to provide a basis for a probabilistic quantification of potential losses from coastal inundation. The model is funded by the GDV (German Insurance Association) and is planned to be used within the German insurance sector. Results might be used for a discussion of insurance cover for storm surge. The model consists of a probabilistic, event-driven hazard module and a vulnerability module, furthermore an exposure interface and a financial

  4. Dopamine enhances model-based over model-free choice behavior.

    PubMed

    Wunderlich, Klaus; Smittenaar, Peter; Dolan, Raymond J

    2012-08-01

    Decision making is often considered to arise out of contributions from a model-free habitual system and a model-based goal-directed system. Here, we investigated the effect of a dopamine manipulation on the degree to which either system contributes to instrumental behavior in a two-stage Markov decision task, which has been shown to discriminate model-free from model-based control. We found increased dopamine levels promote model-based over model-free choice.

  5. Bayesian methods for model choice and propagation of model uncertainty in groundwater transport modeling

    NASA Astrophysics Data System (ADS)

    Mendes, B. S.; Draper, D.

    2008-12-01

    The issue of model uncertainty and model choice is central in any groundwater modeling effort [Neuman and Wierenga, 2003]; among the several approaches to the problem we favour using Bayesian statistics because it is a method that integrates in a natural way uncertainties (arising from any source) and experimental data. In this work, we experiment with several Bayesian approaches to model choice, focusing primarily on demonstrating the usefulness of the Reversible Jump Markov Chain Monte Carlo (RJMCMC) simulation method [Green, 1995]; this is an extension of the now-common MCMC methods. Standard MCMC techniques approximate posterior distributions for quantities of interest, often by creating a random walk in parameter space; RJMCMC allows the random walk to take place between parameter spaces with different dimensionalities. This fact allows us to explore state spaces that are associated with different deterministic models for experimental data. Our work is exploratory in nature; we restrict our study to comparing two simple transport models applied to a data set gathered to estimate the breakthrough curve for a tracer compound in groundwater. One model has a mean surface based on a simple advection dispersion differential equation; the second model's mean surface is also governed by a differential equation but in two dimensions. We focus on artificial data sets (in which truth is known) to see if model identification is done correctly, but we also address the issues of over- and under-parameterization, and we compare RJMCMC's performance with other traditional methods for model selection and propagation of model uncertainty, including Bayesian model averaging, BIC and DIC. References: Neuman and Wierenga (2003). A Comprehensive Strategy of Hydrogeologic Modeling and Uncertainty Analysis for Nuclear Facilities and Sites. NUREG/CR-6805, Division of Systems Analysis and Regulatory Effectiveness Office of Nuclear Regulatory Research, U. S. Nuclear Regulatory Commission

  6. Probabilistic Models and Generative Neural Networks: Towards an Unified Framework for Modeling Normal and Impaired Neurocognitive Functions.

    PubMed

    Testolin, Alberto; Zorzi, Marco

    2016-01-01

    Connectionist models can be characterized within the more general framework of probabilistic graphical models, which allow to efficiently describe complex statistical distributions involving a large number of interacting variables. This integration allows building more realistic computational models of cognitive functions, which more faithfully reflect the underlying neural mechanisms at the same time providing a useful bridge to higher-level descriptions in terms of Bayesian computations. Here we discuss a powerful class of graphical models that can be implemented as stochastic, generative neural networks. These models overcome many limitations associated with classic connectionist models, for example by exploiting unsupervised learning in hierarchical architectures (deep networks) and by taking into account top-down, predictive processing supported by feedback loops. We review some recent cognitive models based on generative networks, and we point out promising research directions to investigate neuropsychological disorders within this approach. Though further efforts are required in order to fill the gap between structured Bayesian models and more realistic, biophysical models of neuronal dynamics, we argue that generative neural networks have the potential to bridge these levels of analysis, thereby improving our understanding of the neural bases of cognition and of pathologies caused by brain damage. PMID:27468262

  7. Probabilistic Models and Generative Neural Networks: Towards an Unified Framework for Modeling Normal and Impaired Neurocognitive Functions.

    PubMed

    Testolin, Alberto; Zorzi, Marco

    2016-01-01

    Connectionist models can be characterized within the more general framework of probabilistic graphical models, which allow to efficiently describe complex statistical distributions involving a large number of interacting variables. This integration allows building more realistic computational models of cognitive functions, which more faithfully reflect the underlying neural mechanisms at the same time providing a useful bridge to higher-level descriptions in terms of Bayesian computations. Here we discuss a powerful class of graphical models that can be implemented as stochastic, generative neural networks. These models overcome many limitations associated with classic connectionist models, for example by exploiting unsupervised learning in hierarchical architectures (deep networks) and by taking into account top-down, predictive processing supported by feedback loops. We review some recent cognitive models based on generative networks, and we point out promising research directions to investigate neuropsychological disorders within this approach. Though further efforts are required in order to fill the gap between structured Bayesian models and more realistic, biophysical models of neuronal dynamics, we argue that generative neural networks have the potential to bridge these levels of analysis, thereby improving our understanding of the neural bases of cognition and of pathologies caused by brain damage.

  8. Probabilistic Models and Generative Neural Networks: Towards an Unified Framework for Modeling Normal and Impaired Neurocognitive Functions

    PubMed Central

    Testolin, Alberto; Zorzi, Marco

    2016-01-01

    Connectionist models can be characterized within the more general framework of probabilistic graphical models, which allow to efficiently describe complex statistical distributions involving a large number of interacting variables. This integration allows building more realistic computational models of cognitive functions, which more faithfully reflect the underlying neural mechanisms at the same time providing a useful bridge to higher-level descriptions in terms of Bayesian computations. Here we discuss a powerful class of graphical models that can be implemented as stochastic, generative neural networks. These models overcome many limitations associated with classic connectionist models, for example by exploiting unsupervised learning in hierarchical architectures (deep networks) and by taking into account top-down, predictive processing supported by feedback loops. We review some recent cognitive models based on generative networks, and we point out promising research directions to investigate neuropsychological disorders within this approach. Though further efforts are required in order to fill the gap between structured Bayesian models and more realistic, biophysical models of neuronal dynamics, we argue that generative neural networks have the potential to bridge these levels of analysis, thereby improving our understanding of the neural bases of cognition and of pathologies caused by brain damage. PMID:27468262

  9. Multidisciplinary design optimization of a fighter aircraft with damage tolerance constraints and a probabilistic model of the fatigue environment

    NASA Astrophysics Data System (ADS)

    Arrieta, Albert Joseph

    2001-07-01

    Damage tolerance analysis (DTA) was considered in the global design optimization of an aircraft wing structure. Residual strength and fatigue life requirements, based on the damage tolerance philosophy, were investigated as new design constraints. In general, accurate fatigue prediction is difficult if the load environment is not known with a high degree of certainty. To address this issue, a probabilistic approach was used to describe the uncertain load environment. Probabilistic load spectra models were developed from flight recorder data. The global/local finite element approach allowed local fatigue requirements to be considered in the global design optimization. AFGROW fatigue crack growth analysis provided a new strength criterion for satisfying damage tolerance requirements within a global optimization environment. Initial research with the ASTROS program used the probabilistic load model and this damage tolerance constraint to optimize cracked skin panels on the lower wing of a fighter/attack aircraft. For an aerodynamic and structural model similar to an F-16, ASTROS simulated symmetric and asymmetric maneuvers during the optimization. Symmetric maneuvers, without underwing stores, produced the highest stresses and drove the optimization of the inboard lower wing skin. Asymmetric maneuvers, with underwing stores, affected the optimum thickness of the outboard hard points. Subsequent design optimizations included von Mises stress, aileron effectiveness, and lift effectiveness constraints simultaneously. This optimization was driven by the DTA and von Mises stress constraints and, therefore, DTA requirements can have an active role to play in preliminary aircraft design.

  10. A probabilistic transmission model to assess infection risk from Mycobacterium tuberculosis in commercial passenger trains.

    PubMed

    Chen, Szu-Chieh; Liao, Chung-Min; Li, Sih-syuan; You, Shu-Han

    2011-06-01

    The objective of this article is to characterize the risk of infection from airborne Mycobacterium tuberculosis bacilli exposure in commercial passenger trains based on risk-based probabilistic transmission modeling. We investigated the tuberculosis (TB) infection risks among commercial passengers from inhaled aerosol M. tuberculosis bacilli and quantified the patterns of TB transmission on Taiwan High Speed Rail (THSR). A deterministic Wells-Riley mathematical model was used to account for the probability of infection risk from M. tuberculosis bacilli by linking the cough-generated aerosol M. tuberculosis bacilli concentration and particle size distribution. We found that (i) the quantum generation rate of TB was estimated with a lognormal distribution with a geometric mean (GM) of 54.29 and a geometric standard deviation (GSD) of 3.05 quantum/h at particle sizes ≤ 5 μm and (ii) the basic reproduction numbers (R0) were estimated to be 0.69 (0.06-6.79), 2.82 (0.32-20.97), and 2.31 (0.25-17.69) for business, standard, and nonreserved cabins, respectively. The results indicate that, under these conservative assumptions, commercial passengers in standard and nonreserved cabins had higher transmission risk than those in business cabins. Our results also reveal that even a brief exposure, as in the bronchoscopy cases, can result in transmission when the quantum generation rate is high. This study could contribute to a better understanding of the dynamics of TB transmission in commercial passenger trains by assessing the relationship between TB infectiousness, passenger mobility, and key model parameters such as seat occupancy, ventilation rate, and exposure duration.
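
    The Wells-Riley relation underlying this analysis gives the infection probability for a susceptible occupant as P = 1 - exp(-Iqpt/Q), where I is the number of infectors, q the quantum generation rate, p the breathing rate, t the exposure time and Q the clean-air ventilation rate; a reproduction number then follows by multiplying P by the number of susceptibles. The sketch below evaluates that standard formula; the cabin parameters (occupancy, ventilation, trip length, breathing rate) are placeholder values, not the THSR inputs.

        import math

        def wells_riley(infectors, quanta_per_h, breathing_m3_per_h, exposure_h, ventilation_m3_per_h):
            """Wells-Riley probability of infection for one susceptible occupant."""
            dose = infectors * quanta_per_h * breathing_m3_per_h * exposure_h / ventilation_m3_per_h
            return 1.0 - math.exp(-dose)

        # Hypothetical cabin: 1 infector, q = 54 quanta/h, breathing 0.5 m3/h,
        # 2 h trip, 1200 m3/h of outdoor-air ventilation, 60 susceptible passengers.
        p_infection = wells_riley(1, 54.0, 0.5, 2.0, 1200.0)
        print(f"P(infection) = {p_infection:.3f}, reproduction number ~= {p_infection * 60:.2f}")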

  11. SEX-DETector: A Probabilistic Approach to Study Sex Chromosomes in Non-Model Organisms.

    PubMed

    Muyle, Aline; Käfer, Jos; Zemp, Niklaus; Mousset, Sylvain; Picard, Franck; Marais, Gabriel Ab

    2016-01-01

    We propose a probabilistic framework to infer autosomal and sex-linked genes from RNA-seq data of a cross for any sex chromosome type (XY, ZW, and UV). Sex chromosomes (especially the non-recombining and repeat-dense Y, W, U, and V) are notoriously difficult to sequence. Strategies have been developed to obtain partially assembled sex chromosome sequences. Most of them remain difficult to apply to numerous non-model organisms, either because they require a reference genome, or because they are designed for evolutionarily old systems. Sequencing a cross (parents and progeny) by RNA-seq to study the segregation of alleles and infer sex-linked genes is a cost-efficient strategy, which also provides expression level estimates. However, the lack of a proper statistical framework has limited a broader application of this approach. Tests on empirical Silene data show that our method identifies 20-35% more sex-linked genes than existing pipelines, while making reliable inferences for downstream analyses. Approximately 12 individuals are needed for optimal results based on simulations. For species with an unknown sex-determination system, the method can assess the presence and type (XY vs. ZW) of sex chromosomes through a model comparison strategy. The method is particularly well optimized for sex chromosomes of young or intermediate age, which are expected in thousands of yet unstudied lineages. Any organisms, including non-model ones for which nothing is known a priori, that can be bred in the lab, are suitable for our method. SEX-DETector and its implementation in a Galaxy workflow are made freely available. PMID:27492231

  12. A probabilistic transmission model to assess infection risk from Mycobacterium tuberculosis in commercial passenger trains.

    PubMed

    Chen, Szu-Chieh; Liao, Chung-Min; Li, Sih-syuan; You, Shu-Han

    2011-06-01

    The objective of this article is to characterize the risk of infection from airborne Mycobacterium tuberculosis bacilli exposure in commercial passenger trains based on risk-based probabilistic transmission modeling. We investigated the tuberculosis (TB) infection risks among commercial passengers from inhaled aerosol M. tuberculosis bacilli and quantified the patterns of TB transmission on Taiwan High Speed Rail (THSR). A deterministic Wells-Riley mathematical model was used to account for the probability of infection risk from M. tuberculosis bacilli by linking the cough-generated aerosol M. tuberculosis bacilli concentration and particle size distribution. We found that (i) the quantum generation rate of TB was estimated with a lognormal distribution with a geometric mean (GM) of 54.29 and a geometric standard deviation (GSD) of 3.05 quantum/h at particle sizes ≤ 5 μm and (ii) the basic reproduction numbers (R0) were estimated to be 0.69 (0.06-6.79), 2.82 (0.32-20.97), and 2.31 (0.25-17.69) for business, standard, and nonreserved cabins, respectively. The results indicate that, under these conservative assumptions, commercial passengers in standard and nonreserved cabins had higher transmission risk than those in business cabins. Our results also reveal that even a brief exposure, as in the bronchoscopy cases, can result in transmission when the quantum generation rate is high. This study could contribute to a better understanding of the dynamics of TB transmission in commercial passenger trains by assessing the relationship between TB infectiousness, passenger mobility, and key model parameters such as seat occupancy, ventilation rate, and exposure duration. PMID:21175727

  13. SEX-DETector: A Probabilistic Approach to Study Sex Chromosomes in Non-Model Organisms

    PubMed Central

    Muyle, Aline; Käfer, Jos; Zemp, Niklaus; Mousset, Sylvain; Picard, Franck; Marais, Gabriel AB

    2016-01-01

    We propose a probabilistic framework to infer autosomal and sex-linked genes from RNA-seq data of a cross for any sex chromosome type (XY, ZW, and UV). Sex chromosomes (especially the non-recombining and repeat-dense Y, W, U, and V) are notoriously difficult to sequence. Strategies have been developed to obtain partially assembled sex chromosome sequences. Most of them remain difficult to apply to numerous non-model organisms, either because they require a reference genome, or because they are designed for evolutionarily old systems. Sequencing a cross (parents and progeny) by RNA-seq to study the segregation of alleles and infer sex-linked genes is a cost-efficient strategy, which also provides expression level estimates. However, the lack of a proper statistical framework has limited a broader application of this approach. Tests on empirical Silene data show that our method identifies 20–35% more sex-linked genes than existing pipelines, while making reliable inferences for downstream analyses. Approximately 12 individuals are needed for optimal results based on simulations. For species with an unknown sex-determination system, the method can assess the presence and type (XY vs. ZW) of sex chromosomes through a model comparison strategy. The method is particularly well optimized for sex chromosomes of young or intermediate age, which are expected in thousands of yet unstudied lineages. Any organisms, including non-model ones for which nothing is known a priori, that can be bred in the lab, are suitable for our method. SEX-DETector and its implementation in a Galaxy workflow are made freely available. PMID:27492231

  14. SEX-DETector: A Probabilistic Approach to Study Sex Chromosomes in Non-Model Organisms.

    PubMed

    Muyle, Aline; Käfer, Jos; Zemp, Niklaus; Mousset, Sylvain; Picard, Franck; Marais, Gabriel Ab

    2016-01-01

    We propose a probabilistic framework to infer autosomal and sex-linked genes from RNA-seq data of a cross for any sex chromosome type (XY, ZW, and UV). Sex chromosomes (especially the non-recombining and repeat-dense Y, W, U, and V) are notoriously difficult to sequence. Strategies have been developed to obtain partially assembled sex chromosome sequences. Most of them remain difficult to apply to numerous non-model organisms, either because they require a reference genome, or because they are designed for evolutionarily old systems. Sequencing a cross (parents and progeny) by RNA-seq to study the segregation of alleles and infer sex-linked genes is a cost-efficient strategy, which also provides expression level estimates. However, the lack of a proper statistical framework has limited a broader application of this approach. Tests on empirical Silene data show that our method identifies 20-35% more sex-linked genes than existing pipelines, while making reliable inferences for downstream analyses. Approximately 12 individuals are needed for optimal results based on simulations. For species with an unknown sex-determination system, the method can assess the presence and type (XY vs. ZW) of sex chromosomes through a model comparison strategy. The method is particularly well optimized for sex chromosomes of young or intermediate age, which are expected in thousands of yet unstudied lineages. Any organisms, including non-model ones for which nothing is known a priori, that can be bred in the lab, are suitable for our method. SEX-DETector and its implementation in a Galaxy workflow are made freely available.

  15. Towards inclusion of dynamic slip features in stochastic models for probabilistic (tsunami) hazard analysis.

    NASA Astrophysics Data System (ADS)

    Murphy, S.; Scala, A.; Herrero, A.; Lorito, S.; Nielsen, S. B.; Festa, G.; Trasatti, E.; Tonini, R.; Molinari, I.; Romano, F.

    2015-12-01

    Stochastic slip modelling based on general scaling features with uniform slip probability over the fault plane is commonly employed in tsunami and seismic hazard analysis. However, dynamic rupture effects driven by specific fault geometry and frictional conditions can potentially control the slip probability. Unfortunately, dynamic simulations can be computationally intensive, preventing their extensive use for hazard analysis. The aim of this study is to produce a stochastic model that incorporates slip features observed in dynamic simulations. Taking a Tohoku-like fault as a case study, numerous 2D spectral element dynamic simulations are performed using a variety of pre-stress distributions. Comparing the slip distributions generated from these simulations to traditional stochastic slip models, we find that the stochastic models generally underrepresent slip near the free surface. This is an important feature in tsunami hazard, with very large slip at shallow depth observed for the 2011 Tohoku earthquake. To incorporate dynamic features in the stochastic modeling, we generate a depth-dependent "transfer function" based on comparisons between the dynamic and stochastic models. Assuming that the differences between stochastic and dynamic slip distributions are predominantly depth dependent and not along strike, the transfer function is then applied to stochastic source models over a 3D geometry of the Tohoku fault. Comparing maximum tsunami wave height along the Japanese coast using a traditional stochastic model and one modified by the transfer function, we find that the inclusion of the transfer function leads to the occurrence of more extreme events. Applying this function to the traditional stochastic slip distribution as a depth-dependent PDF for the slip may allow for an approximated but efficient incorporation of regionally specific dynamic features in a modified source model, to be used specifically when a significant number of slip scenarios need to be produced, e

  16. The Answering Process for Multiple-Choice Questions in Collaborative Learning: A Mathematical Learning Model Analysis

    ERIC Educational Resources Information Center

    Nakamura, Yasuyuki; Nishi, Shinnosuke; Muramatsu, Yuta; Yasutake, Koichi; Yamakawa, Osamu; Tagawa, Takahiro

    2014-01-01

    In this paper, we introduce a mathematical model for collaborative learning and the answering process for multiple-choice questions. The collaborative learning model is inspired by the Ising spin model and the model for answering multiple-choice questions is based on their difficulty level. An intensive simulation study predicts the possibility of…

  17. A multinomial choice model approach for dynamic driver vision transitions.

    PubMed

    Huang, Shih-Hsuan; Wong, Jinn-Tsai

    2015-01-01

    Exploring the continual process of drivers allocating their attention under varying conditions could be vital for preventing motor vehicle crashes. This study aims to model visual behaviors and to estimate the effects of various contributing factors on driver's vision transitions. A visual attention allocation framework, based on certain contributing attributes related to driving tasks and environmental conditions, has been developed. The associated logit type models for determining driver choices for focal points were successfully formulated and estimated by using naturalistic glance data from the 100-car event database. The results offer insights into driver visual behavior and patterns of visual attention allocation. The three focal points that drivers most frequently rely on and glance at are the forward, left and rear view mirror. The sample drivers were less likely to demonstrate troublesome transition patterns, particularly in mentally demanding situations. Additionally, instead of shifting vision directly between two non-forward focal points, the sample drivers frequently had an intermediate forward glance. Thus, seemingly unrelated paths could be grouped into explanatory patterns of driver attention allocation. Finally, in addition to the vision-transition patterns, the potential pitfalls of such patterns and possible countermeasures to improving safety are illustrated, focusing on situations when drivers are distracted, traveling at high speeds and approaching intersections.
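
    The logit-type models referred to above convert the systematic utility of each candidate focal point into a choice probability with the standard multinomial logit formula P(j) = exp(V_j) / sum_k exp(V_k). The sketch below illustrates only that generic formula; the focal-point labels and utility values are hypothetical, not the coefficients estimated from the 100-car data.

        import math

        def multinomial_logit(utilities):
            """Choice probabilities P(j) = exp(V_j) / sum_k exp(V_k) for each alternative j."""
            m = max(utilities.values())                      # subtract the max for numerical stability
            expv = {j: math.exp(v - m) for j, v in utilities.items()}
            total = sum(expv.values())
            return {j: e / total for j, e in expv.items()}

        # Hypothetical systematic utilities for the next glance location, given a forward glance.
        print(multinomial_logit({"forward": 2.0, "left": 0.5, "rear_mirror": 0.3, "instrument": -0.5}))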

  18. Agent-based modelling of consumer energy choices

    NASA Astrophysics Data System (ADS)

    Rai, Varun; Henry, Adam Douglas

    2016-06-01

    Strategies to mitigate global climate change should be grounded in a rigorous understanding of energy systems, particularly the factors that drive energy demand. Agent-based modelling (ABM) is a powerful tool for representing the complexities of energy demand, such as social interactions and spatial constraints. Unlike other approaches for modelling energy demand, ABM is not limited to studying perfectly rational agents or to abstracting micro details into system-level equations. Instead, ABM provides the ability to represent behaviours of energy consumers -- such as individual households -- using a range of theories, and to examine how the interaction of heterogeneous agents at the micro-level produces macro outcomes of importance to the global climate, such as the adoption of low-carbon behaviours and technologies over space and time. We provide an overview of ABM work in the area of consumer energy choices, with a focus on identifying specific ways in which ABM can improve understanding of both fundamental scientific and applied aspects of the demand side of energy to aid the design of better policies and programmes. Future research needs for improving the practice of ABM to better understand energy demand are also discussed.

  19. An empirical model for probabilistic decadal prediction: global attribution and regional hindcasts

    NASA Astrophysics Data System (ADS)

    Suckling, Emma B.; van Oldenborgh, Geert Jan; Eden, Jonathan M.; Hawkins, Ed

    2016-07-01

    Empirical models, designed to predict surface variables over seasons to decades ahead, provide useful benchmarks for comparison against the performance of dynamical forecast systems; they may also be employable as predictive tools for use by climate services in their own right. A new global empirical decadal prediction system is presented, based on a multiple linear regression approach designed to produce probabilistic output for comparison against dynamical models. A global attribution is performed initially to identify the important forcing and predictor components of the model. Ensemble hindcasts of surface air temperature anomaly fields are then generated, based on the forcings and predictors identified as important, under a series of different prediction 'modes' and their performance is evaluated. The modes include a real-time setting, a scenario in which future volcanic forcings are prescribed during the hindcasts, and an approach which exploits knowledge of the forced trend. A two-tier prediction system, which uses knowledge of future sea surface temperatures in the Pacific and Atlantic Oceans, is also tested, but within a perfect knowledge framework. Each mode is designed to identify sources of predictability and uncertainty, as well as investigate different approaches to the design of decadal prediction systems for operational use. It is found that the empirical model shows skill above that of persistence hindcasts for annual means at lead times of up to 10 years ahead in all of the prediction modes investigated. It is suggested that hindcasts which exploit full knowledge of the forced trend due to increasing greenhouse gases throughout the hindcast period can provide more robust estimates of model bias for the calibration of the empirical model in an operational setting. The two-tier system shows potential for improved real-time prediction, given the assumption that skilful predictions of large-scale modes of variability are available. The empirical
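
    A multiple-linear-regression prediction system of this kind can be given probabilistic output by resampling the fitted residuals around the regression estimate. The sketch below illustrates that generic recipe on synthetic data; the predictor series, coefficients, and the target-year predictor vector are all invented for illustration and are not the operational system's inputs.

        import numpy as np

        rng = np.random.default_rng(3)

        # Synthetic predictors: greenhouse-gas forcing trend, volcanic forcing, ENSO-like index.
        n_years = 60
        X = np.column_stack([np.linspace(0, 1, n_years),
                             rng.exponential(0.05, n_years),
                             rng.normal(0, 1, n_years)])
        y = 0.8 * X[:, 0] - 0.5 * X[:, 1] + 0.1 * X[:, 2] + rng.normal(0, 0.1, n_years)

        # Fit the multiple linear regression (with intercept) by least squares.
        A = np.column_stack([np.ones(n_years), X])
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)
        residuals = y - A @ coef

        # Probabilistic hindcast for one target year: central estimate plus resampled residuals.
        x_new = np.array([1.0, 0.9, 0.02, 0.5])   # intercept, trend, volcanic, ENSO (hypothetical)
        ensemble = x_new @ coef + rng.choice(residuals, size=500, replace=True)
        print(np.percentile(ensemble, [5, 50, 95]))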

  20. Does probabilistic modelling of linkage disequilibrium evolution improve the accuracy of QTL location in animal pedigree?

    PubMed Central

    2010-01-01

    Background: Since 2001, the use of more and more dense maps has made researchers aware that combining linkage and linkage disequilibrium enhances the feasibility of fine-mapping genes of interest. So, various method types have been derived to include concepts of population genetics in the analyses. One major drawback of many of these methods is their computational cost, which is very significant when many markers are considered. Recent advances in technology, such as SNP genotyping, have made it possible to deal with huge amounts of data. Thus the challenge that remains is to find accurate and efficient methods that are not too time consuming. The study reported here specifically focuses on the half-sib family animal design. Our objective was to determine whether modelling of linkage disequilibrium evolution improved the mapping accuracy of a quantitative trait locus of agricultural interest in these populations. We compared two methods of fine-mapping. The first one was an association analysis. In this method, we did not model linkage disequilibrium evolution. Therefore, the modelling of the evolution of linkage disequilibrium was a deterministic process; it was complete at time 0 and remained complete during the following generations. In the second method, the modelling of the evolution of population allele frequencies was derived from a Wright-Fisher model. We simulated a wide range of scenarios adapted to animal populations and compared these two methods for each scenario. Results: Our results indicated that the improvement produced by probabilistic modelling of linkage disequilibrium evolution was not significant. Both methods led to similar results concerning the location accuracy of quantitative trait loci, which appeared to be mainly improved by using four flanking markers instead of two. Conclusions: Therefore, in animal half-sib designs, modelling linkage disequilibrium evolution using a Wright-Fisher model does not significantly improve the accuracy of the
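
    The Wright-Fisher model invoked above treats the allele frequency in each generation as a binomial draw of 2N gene copies from the previous generation. A generic sketch (arbitrary population size, starting frequency, and number of generations; not the simulation settings of the study) is:

        import numpy as np

        rng = np.random.default_rng(4)

        def wright_fisher(p0, pop_size, generations):
            """Simulate allele-frequency drift: each generation is a binomial draw of 2N gene copies."""
            freqs = [p0]
            p = p0
            for _ in range(generations):
                p = rng.binomial(2 * pop_size, p) / (2 * pop_size)
                freqs.append(p)
            return freqs

        # Hypothetical example: a marker allele at frequency 0.3 in a population of 100 diploids.
        print(wright_fisher(0.3, 100, 10))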

  1. Using simple chaotic models to interpret climate under climate change: Implications for probabilistic climate prediction

    NASA Astrophysics Data System (ADS)

    Daron, Joseph

    2010-05-01

    Exploring the reliability of model-based projections is an important precursor to evaluating their societal relevance. In order to better inform decisions concerning adaptation (and mitigation) to climate change, we must investigate whether or not our models are capable of replicating the dynamic nature of the climate system. Whilst uncertainty is inherent within climate prediction, establishing and communicating what is plausible as opposed to what is likely is the first step to ensuring that climate sensitive systems are robust to climate change. Climate prediction centers are moving towards probabilistic projections of climate change at regional and local scales (Murphy et al., 2009). It is therefore important to understand what a probabilistic forecast means for a chaotic nonlinear dynamic system that is subject to changing forcings. It is in this context that we present the results of experiments using simple models that can be considered analogous to the more complex climate system, namely the Lorenz 1963 and Lorenz 1984 models (Lorenz, 1963; Lorenz, 1984). Whilst the search for a low-dimensional climate attractor remains elusive (Fraedrich, 1986; Sahay and Sreenivasan, 1996), the characterization of the climate system in such terms can be useful for conceptual and computational simplicity. Recognising that a change in climate is manifest in a change in the distribution of a particular climate variable (Stainforth et al., 2007), we first establish the equilibrium distributions of the Lorenz systems for certain parameter settings. Allowing the parameters to vary in time, we investigate the dependency of such distributions on initial conditions and discuss the implications for climate prediction. We argue that the role of chaos and nonlinear dynamic behaviour ought to have more prominence in the discussion of the forecasting capabilities in climate prediction. References: Fraedrich, K. Estimating the dimensions of weather and climate attractors. J. Atmos. Sci
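
    As a concrete illustration of the "equilibrium distribution" idea for the Lorenz 1963 system, the model can be integrated for a long time and the long-run distribution of one coordinate summarized. The sketch below uses the standard parameter values (sigma = 10, rho = 28, beta = 8/3) and a fixed-step Runge-Kutta integrator; the step size, run length, discarded transient, and reported percentiles are arbitrary choices, not those of the study.

        import numpy as np

        def lorenz63(state, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
            x, y, z = state
            return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

        def integrate(state, dt=0.01, n_steps=100_000):
            """Fourth-order Runge-Kutta integration, recording the x-coordinate at every step."""
            xs = np.empty(n_steps)
            for i in range(n_steps):
                k1 = lorenz63(state)
                k2 = lorenz63(state + 0.5 * dt * k1)
                k3 = lorenz63(state + 0.5 * dt * k2)
                k4 = lorenz63(state + dt * k3)
                state = state + dt * (k1 + 2 * k2 + 2 * k3 + k4) / 6.0
                xs[i] = state[0]
            return xs

        xs = integrate(np.array([1.0, 1.0, 1.0]))
        # Discard a transient and summarize the long-run ("climate") distribution of x.
        print(np.percentile(xs[5_000:], [5, 25, 50, 75, 95]))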

  2. Probabilistic settling in the Local Exchange Model of turbulent particle transport.

    PubMed

    McNair, James N

    2006-07-21

    The Local Exchange Model (LEM) is a stochastic diffusion model of particle transport in turbulent flowing water. It was developed mainly for application to particles of near-neutral buoyancy that are strongly influenced by turbulent eddies. Turbulence can rapidly transfer such particles to the bed, where settlement can then occur by, for example, sticking to biofilms (e.g., fine particulate organic matter, or FPOM) or attaching to the substrate behaviorally (e.g., benthic invertebrates). Previous papers on the LEM have addressed the problems of how long (time) and far (distance) a suspended particle will be transported before hitting the bed for the first time. These are the hitting-time and hitting-distance problems, respectively. Hitting distances predicted by the LEM for FPOM in natural streams tend to be much shorter than the distances at which most particles actually settle, suggesting that particles usually do not settle the first time they hit the bed. The present paper extends the LEM so it can address probabilistic settling, where a particle encountering the bed can either remain there for a positive length of time (i.e., settle) or immediately reflect back into the water column, each with positive probability. Previous results for the LEM are generalized by deducing a single set of equations governing the probability distribution and moments of a broad class of quantities that accumulate during particle trajectories terminated by hitting or settling on the bed (e.g., transport time, transport distance, cumulative energy expenditure during transport). Key properties of the settling-time and settling-distance distributions are studied numerically and compared with the observed FPOM settling-distance distribution for a natural stream. Some remaining limitations of the LEM and possible means of overcoming them are discussed.

  3. Quantification and probabilistic modeling of CRT obsolescence for the State of Delaware

    SciTech Connect

    Schumacher, Kelsea A.; Schumacher, Thomas; Agbemabiese, Lawrence

    2014-11-15

    Highlights: • We modeled the obsolescence of cathode ray tube devices in the State of Delaware. • 411,654 CRT units or ∼16,500 metric tons have been recycled in Delaware since 2002. • The peak of CRT obsolescence in Delaware passed by 2012. • The Delaware average CRT recycling rate between 2002 and 2013 was approximately 27.5%. • CRTs will continue to infiltrate the system, likely until 2033. - Abstract: The cessation of production and the replacement of cathode ray tube (CRT) displays with flat screen displays have resulted in the proliferation of CRTs in the electronic waste (e-waste) recycle stream. However, due to the nature of the technology and the presence of hazardous components such as lead, CRTs are the most challenging of electronic components to recycle. In the State of Delaware, this challenge and the resulting expense, combined with the large quantities of CRTs in the recycle stream, have led electronic recyclers to charge to accept Delaware’s e-waste. Therefore, it is imperative that the Delaware Solid Waste Authority (DSWA) understand future quantities of CRTs entering the waste stream. This study presents the results of an assessment of CRT obsolescence in the State of Delaware. A prediction model was created utilizing publicized sales data, a variety of lifespan data, and historic Delaware CRT collection rates. Both a deterministic and a probabilistic approach using Monte Carlo Simulation (MCS) were employed to forecast the rates of CRT obsolescence to be anticipated in the State of Delaware. Results indicate that the peak of CRT obsolescence in Delaware has already passed, although CRTs are likely to continue entering the waste stream until 2033.

  4. Bayesian probabilistic model for life prediction and fault mode classification of solid state luminaires

    SciTech Connect

    Lall, Pradeep; Wei, Junchao; Sakalaukus, Peter

    2014-06-22

    A new method has been developed for assessment of the onset of degradation in solid state luminaires, to classify failure mechanisms by using metrics beyond the lumen degradation that is currently used for identification of failure. Luminous flux output and correlated color temperature data on Philips LED lamps were gathered under 85°C/85%RH until lamp failure. Failure modes of the test population of the lamps have been studied to understand the failure mechanisms in the 85°C/85%RH accelerated test. Results indicate that the dominant failure mechanism is the discoloration of the LED encapsulant inside the lamps, which is the likely cause of the luminous flux degradation and the color shift. The acquired data have been used in conjunction with Bayesian probabilistic models to identify luminaires with onset of degradation well prior to failure, through identification of decision boundaries between lamps with accrued damage and lamps beyond the failure threshold in the feature space. In addition, luminaires with different failure modes have been classified separately from healthy pristine luminaires. The α-λ plots have been used to evaluate the robustness of the proposed methodology. Results show that the predicted degradation for the lamps tracks the true degradation observed during the 85°C/85%RH accelerated life test fairly closely, within the ±20% confidence bounds. Correlation of model prediction with experimental results indicates that the presented methodology allows the early identification of the onset of failure well prior to the development of complete failure distributions and can be used for assessing the damage state of SSLs in fairly large deployments. It is expected that the new prediction technique will allow the development of failure distributions without testing to L70 life for the manifestation of failure.

  5. Analyzing the impact of modeling choices and assumptions in compartmental epidemiological models

    DOE PAGES

    Nutaro, James J.; Pullum, Laura L.; Ramanathan, Arvind; Ozmen, Ozgur

    2016-05-01

    Computational models have become increasingly used as part of modeling, predicting, and understanding how infectious diseases spread within large populations. These models can be broadly classified into differential equation-based models (EBM) and agent-based models (ABM). Both types of models are central in helping public health officials design intervention strategies in case of large epidemic outbreaks. We examine these models in the context of illuminating their hidden assumptions and the impact these may have on the model outcomes. Very few ABMs/EBMs are evaluated for their suitability to address a particular public health concern, and drawing relevant conclusions about their suitability requires reliable and relevant information regarding the different modeling strategies and associated assumptions. Hence, there is a need to determine how the different modeling strategies, choices of various parameters, and the resolution of information for EBMs and ABMs affect outcomes, including predictions of disease spread. In this study, we present a quantitative analysis of how the selection of model types (i.e., EBM vs. ABM), the underlying assumptions that are enforced by model types to model the disease propagation process, and the choice of time advance (continuous vs. discrete) affect the overall outcomes of modeling disease spread. Our study reveals that the magnitude and velocity of the simulated epidemic depend critically on the selection of modeling principles, various assumptions about the disease process, and the choice of time advance.
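
    As a minimal illustration of how the choice of time advance alone can change an equation-based model's outcome, the same SIR system can be stepped with a coarse or a fine time increment. The sketch below uses a plain forward-Euler SIR model with arbitrary parameter values; it is not the specific EBM or ABM implementation analyzed in the study.

        def sir_epidemic(beta=0.5, gamma=0.2, s0=0.99, i0=0.01, days=160, dt=1.0):
            """Forward-Euler SIR model; a smaller dt approximates the continuous-time system more closely."""
            s, i, r = s0, i0, 0.0
            peak = i
            for _ in range(int(days / dt)):
                new_inf = beta * s * i * dt
                new_rec = gamma * i * dt
                s, i, r = s - new_inf, i + new_inf - new_rec, r + new_rec
                peak = max(peak, i)
            return peak, r

        for dt in (1.0, 0.1, 0.01):
            peak, final = sir_epidemic(dt=dt)
            print(f"dt={dt:>5}: peak prevalence={peak:.3f}, final attack rate={final:.3f}")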

  6. Development of Probabilistic Risk Assessment Model for BWR Shutdown Modes 4 and 5 Integrated in SPAR Model

    SciTech Connect

    S. T. Khericha; S. Sancakter; J. Mitman; J. Wood

    2010-06-01

    Nuclear plant operating experience and several studies show that the risk from shutdown operation during modes 4, 5, and 6 can be significant. This paper describes the development of standard template risk evaluation models for shutdown modes 4 and 5 for commercial boiling water reactor (BWR) nuclear power plants. The shutdown probabilistic risk assessment model uses the full-power Nuclear Regulatory Commission (NRC) Standardized Plant Analysis Risk (SPAR) model as the starting point for development. The shutdown PRA models are integrated with their respective internal-events at-power SPAR models. This is accomplished by combining the modified system fault trees from the SPAR full-power model with shutdown event tree logic. For human reliability analysis (HRA), the SPAR HRA (SPAR-H) method is used, which requires the analyst to complete a relatively straightforward worksheet, including the performance shaping factors (PSFs). The results are then used to estimate the human error probabilities (HEPs) of interest. The preliminary results indicate that the risk is dominated by the operators’ ability to diagnose the events and provide long-term cooling.

  7. Probabilistic performance-assessment modeling of the mixed waste landfill at Sandia National Laboratories.

    SciTech Connect

    Peace, Gerald L.; Goering, Timothy James; Miller, Mark Laverne; Ho, Clifford Kuofei

    2005-11-01

    A probabilistic performance assessment has been conducted to evaluate the fate and transport of radionuclides (americium-241, cesium-137, cobalt-60, plutonium-238, plutonium-239, radium-226, radon-222, strontium-90, thorium-232, tritium, uranium-238), heavy metals (lead and cadmium), and volatile organic compounds (VOCs) at the Mixed Waste Landfill (MWL). Probabilistic analyses were performed to quantify uncertainties inherent in the system and models for a 1,000-year period, and sensitivity analyses were performed to identify parameters and processes that were most important to the simulated performance metrics. Comparisons between simulated results and measured values at the MWL were made to gain confidence in the models and perform calibrations when data were available. In addition, long-term monitoring requirements and triggers were recommended based on the results of the quantified uncertainty and sensitivity analyses. At least one-hundred realizations were simulated for each scenario defined in the performance assessment. Conservative values and assumptions were used to define values and distributions of uncertain input parameters when site data were not available. Results showed that exposure to tritium via the air pathway exceeded the regulatory metric of 10 mrem/year in about 2% of the simulated realizations when the receptor was located at the MWL (continuously exposed to the air directly above the MWL). Simulations showed that peak radon gas fluxes exceeded the design standard of 20 pCi/m²/s in about 3% of the realizations if up to 1% of the containers of sealed radium-226 sources were assumed to completely degrade in the future. If up to 100% of the containers of radium-226 sources were assumed to completely degrade, 30% of the realizations yielded radon surface fluxes that exceeded the design standard. For the groundwater pathway, simulations showed that none of the radionuclides or heavy metals (lead and cadmium) reached the groundwater during

  8. Probabilistic Modeling for Risk Assessment of California Ground Water Contamination by Pesticides

    NASA Astrophysics Data System (ADS)

    Clayton, M.; Troiano, J.; Spurlock, F.

    2007-12-01

    The California Department of Pesticide Regulation (DPR) is responsible for the registration of pesticides in California. DPR's Environmental Monitoring Branch evaluates the potential for pesticide active ingredients to move to ground water under legal agricultural use conditions. Previous evaluations were primarily based on threshold values for specific persistence and mobility properties of pesticides as prescribed in the California Pesticide Contamination Prevention Act of 1985. Two limitations identified with that process were its univariate nature, in which interactions of the properties were not accounted for, and its inability to accommodate multiple values of a physical-chemical property. We addressed these limitations by developing a probabilistic modeling method based on prediction of potential well water concentrations. A mechanistic pesticide transport model, LEACHM, is used to simulate sorption, degradation and transport of a candidate pesticide through the root zone. A second empirical model component then simulates pesticide degradation and transport through the vadose zone to a receiving ground water aquifer. Finally, degradation during transport in the aquifer to the well screen is included in calculating final potential well concentrations. Using Monte Carlo techniques, numerous LEACHM simulations are conducted using random samples of the organic carbon normalized soil adsorption coefficients (Koc) and soil dissipation half-life values derived from terrestrial field dissipation (TFD) studies. Koc and TFD values are obtained from gamma distributions fitted to pooled data from agricultural-use pesticides detected in California ground water: atrazine, simazine, diuron, bromacil, hexazinone, and norflurazon. The distribution of predicted well water concentrations for these pesticides is in good agreement with concentrations measured in domestic wells in coarse, leaching-vulnerable soils of Fresno and Tulare Counties. The leaching potential of a new

  9. Comparison of the MACCS2 atmospheric transport model with Lagrangian puff models as applied to deterministic and probabilistic safety analysis.

    PubMed

    Till, John E; Rood, Arthur S; Garzon, Caroline D; Lagdon, Richard H

    2014-09-01

    The suitability of a new facility in terms of potential impacts from routine and accidental releases is typically evaluated using conservative models and assumptions to assure dose standards are not exceeded. However, overly conservative dose estimates that exceed target doses can result in unnecessary and costly facility design changes. This paper examines one such case involving the U.S. Department of Energy's pretreatment facility of the Waste Treatment and Immobilization Plant (WTP). The MELCOR Accident Consequence Code System Version 2 (MACCS2) was run using conservative parameter values in prescribed guidance to demonstrate that the dose from a postulated airborne release would not exceed the guideline dose of 0.25 Sv. External review of default model parameters identified the deposition velocity of 1.0 cm s-1 as being non-conservative. The deposition velocity calculated using resistance models was in the range of 0.1 to 0.3 cm s-1. A value of 0.1 cm s-1 would result in the dose guideline being exceeded. To test the overall conservatism of the MACCS2 transport model, the 95th percentile hourly average dispersion factor based on one year of meteorological data was compared to dispersion factors generated from two state-of-the-art Lagrangian puff models. The 95th percentile dispersion factor from MACCS2 was a factor of 3 to 6 higher compared to those of the Lagrangian puff models at a distance of 9.3 km and a deposition velocity of 0.1 cm s-1. Thus, the inherent conservatism in MACCS2 more than compensated for the high deposition velocity used in the assessment. Applications of models like MACCS2 with a conservative set of parameters are essentially screening calculations, and failure to meet dose criteria should not trigger facility design changes but prompt a more in-depth analysis using probabilistic methods with a defined margin of safety in the target dose. A sample application of the probabilistic approach is provided.

  10. Predictors of Latina/o Community College Student Vocational Choice of STEM Fields: Testing of the STEM-Vocational Choice Model

    ERIC Educational Resources Information Center

    Johnson, Joel D.

    2013-01-01

    This study confirmed appropriate measurement model fit for a theoretical model, the STEM vocational choice (STEM-VC) model. This model identifies exogenous factors that successfully predicted, at a statistically significant level, a student's vocational choice decision to pursue a STEM degree at transfer. The student population examined for this…

  11. Assessment of climate change impacts on climate variables using probabilistic ensemble modeling and trend analysis

    NASA Astrophysics Data System (ADS)

    Safavi, Hamid R.; Sajjadi, Sayed Mahdi; Raghibi, Vahid

    2016-08-01

    Water resources in snow-dependent regions have undergone significant changes due to climate change. Snow measurements in these regions have revealed alarming declines in snowfall over the past few years. The Zayandeh-Rud River in central Iran chiefly depends on winter precipitation falling as snow in the wet, high Zagros Mountains to supply water to the downstream, (semi-)arid, low-lying lands. In this study, the historical records (baseline: 1971-2000) of climate variables (temperature and precipitation) in the wet region were chosen to construct a probabilistic ensemble model using 15 GCMs in order to forecast future trends and changes, while the Long Ashton Research Station Weather Generator (LARS-WG) was utilized to project climate variables under the A2 and B1 scenarios for a future period (2015-2044). Since future snow water equivalent (SWE) forecasts by GCMs were not available for the study area, an artificial neural network (ANN) was implemented to build a relationship between climate variables and snow water equivalent for the baseline period to estimate future snowfall amounts. As a last step, homogeneity and trend tests were performed to evaluate the robustness of the data series, and changes were examined to detect past and future variations. Results indicate different characteristics of the climate variables at upstream stations. A shift is observed in the type of precipitation from snow to rain as well as in its quantities across the subregions. Temperature plays the key role in these shifts and in subsequent side effects such as water losses.
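
    As a rough illustration of the trend-testing step mentioned above, the sketch below applies a standard Mann-Kendall trend test (without a tie correction) to a synthetic snow water equivalent series; the series and its trend are invented, not the Zayandeh-Rud data.

```python
import numpy as np
from scipy.stats import norm

def mann_kendall(x):
    """Mann-Kendall trend test (no tie correction): returns S, Z and a two-sided p-value."""
    x = np.asarray(x, dtype=float)
    n = x.size
    s = 0.0
    for i in range(n - 1):
        s += np.sign(x[i + 1:] - x[i]).sum()
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / np.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / np.sqrt(var_s)
    else:
        z = 0.0
    return s, z, 2 * (1 - norm.cdf(abs(z)))

# Synthetic 30-year snow water equivalent series (mm) with a weak declining trend.
rng = np.random.default_rng(2)
swe = 300 - 2.0 * np.arange(30) + rng.normal(0, 40, 30)
s, z, p = mann_kendall(swe)
print(f"S = {s:.0f}, Z = {z:.2f}, p = {p:.3f}")
```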

  12. Landslide susceptibility mapping along PLUS expressways in Malaysia using probabilistic based model in GIS

    NASA Astrophysics Data System (ADS)

    Yusof, Norbazlan M.; Pradhan, Biswajeet

    2014-06-01

    PLUS Berhad holds the concession for a total of 987 km of toll expressways in Malaysia, the longest of which is the North-South Expressway or NSE. Acting as the 'backbone' of the west coast of the peninsula, the NSE stretches from the Malaysian-Thai border in the north to the border with neighbouring Singapore in the south, linking several major cities and towns along the way. The North-South Expressway contributes to the country's economic development through the trade, social and tourism sectors. Presently, the highway is in good condition and connects every state, but some locations need urgent attention. Stability of slopes at these locations is of most concern, as any instability can endanger motorists. In this paper, two study locations have been analysed: Gua Tempurung (soil slope) and Jelapang (rock slope), which have clearly different characteristics. These locations pass through undulating terrain with steep slopes where landslides are common and the probability of slope instability due to human activities in surrounding areas is high. A database of twelve (12) landslide conditioning factors was compiled: slope degree and slope aspect were extracted from IFSAR (interferometric synthetic aperture radar) data, while landuse, lithology and structural geology were derived from interpretation of high resolution satellite data from World View II, Quickbird and Ikonos. All this information was analysed in a geographic information system (GIS) environment for landslide susceptibility mapping using a probabilistic frequency ratio model. In addition, information on the slopes such as inventories, condition assessments and maintenance records was assessed through the total expressway maintenance management system, better known as TEMAN. The above mentioned system is used by PLUS as an asset management and decision support tool for maintenance activities along the highways as well as for data
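
    For illustration, the sketch below computes the standard frequency ratio statistic (the percentage of landslide pixels in a factor class divided by the percentage of all pixels in that class) on synthetic rasters. A single conditioning factor is used for brevity; in a real susceptibility map the ratios for all twelve factors would be summed per pixel. All data here are invented.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic rasters: a conditioning-factor class per pixel (e.g. slope-angle class)
# and a landslide inventory mask.
factor_class = rng.integers(0, 5, size=100_000)
landslide = rng.random(100_000) < (0.01 + 0.01 * factor_class)

def frequency_ratio(factor_class, landslide):
    """Frequency ratio per class: (% of landslide pixels in class) / (% of all pixels in class)."""
    fr = {}
    total_pixels = factor_class.size
    total_slides = landslide.sum()
    for c in np.unique(factor_class):
        in_class = factor_class == c
        pct_slides = landslide[in_class].sum() / total_slides
        pct_pixels = in_class.sum() / total_pixels
        fr[int(c)] = pct_slides / pct_pixels
    return fr

fr = frequency_ratio(factor_class, landslide)
# Susceptibility index per pixel (one factor only here; sum over factors in practice).
susceptibility = np.vectorize(fr.get)(factor_class)
print({k: round(v, 2) for k, v in fr.items()})
```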

  13. Radar tracking with an interacting multiple model and probabilistic data association filter for civil aviation applications.

    PubMed

    Jan, Shau-Shiun; Kao, Yu-Chun

    2013-05-17

    The current trend in civil aviation technology is to modernize the legacy air traffic control (ATC) system that is mainly supported by many ground based navigation aids to be the new air traffic management (ATM) system that is enabled by global positioning system (GPS) technology. Due to the low receiving power of the GPS signal, it is a major concern to aviation authorities that the operation of the ATM system might experience service interruption when the GPS signal is jammed by either intentional or unintentional radio-frequency interference. To maintain the normal operation of the ATM system during the period of GPS outage, the use of the current radar system is proposed in this paper. However, the tracking performance of the current radar system could not meet the required performance of the ATM system, and an enhanced tracking algorithm, the interacting multiple model and probabilistic data association filter (IMMPDAF), is therefore developed to support the navigation and surveillance services of the ATM system. The conventional radar tracking algorithm, the nearest neighbor Kalman filter (NNKF), is used as the baseline to evaluate the proposed radar tracking algorithm, and real flight data are used to validate the IMMPDAF algorithm. As shown in the results, the proposed IMMPDAF algorithm could enhance the tracking performance of the current aviation radar system and meet the required performance of the new ATM system. Thus, the current radar system with the IMMPDAF algorithm could be used as an alternative system to continue aviation navigation and surveillance services of the ATM system during GPS outage periods.

  14. Feasibility study on the use of probabilistic migration modeling in support of exposure assessment from food contact materials.

    PubMed

    Poças, Maria F; Oliveira, Jorge C; Brandsch, Rainer; Hogg, Timothy

    2010-07-01

    The use of probabilistic approaches in exposure assessments of contaminants migrating from food packages is of increasing interest, but the lack of concentration or migration data is often cited as a limitation. Data accounting for the variability and uncertainty that can be expected in migration, for example, due to heterogeneity in the packaging system, variation of the temperature along the distribution chain, and different time of consumption of each individual package, are required for probabilistic analysis. The objective of this work was to characterize quantitatively the uncertainty and variability in estimates of migration. A Monte Carlo simulation was applied to a typical solution of Fick's law with given variability in the input parameters. The analysis was performed based on experimental data of a model system (migration of Irgafos 168 from polyethylene into isooctane) and illustrates how important sources of variability and uncertainty can be identified in order to refine analyses. For long migration times and controlled temperature conditions, the affinity of the migrant for the food can be the major factor determining the variability in the migration values (more than 70% of variance). In situations where both the time of consumption and temperature can vary, these factors can be responsible, respectively, for more than 60% and 20% of the variance in the migration estimates. The approach presented can be used with databases from consumption surveys to yield a true probabilistic estimate of exposure.
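
    A minimal sketch of a Monte Carlo migration estimate in the spirit described above, using the early-time semi-infinite-slab solution of Fick's law with a perfect-sink food phase. The parameter distributions are hypothetical placeholders, not the Irgafos 168/polyethylene values from the study.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 50_000

# Hypothetical input distributions:
c_p0 = rng.normal(500.0, 50.0, n)               # initial concentration in polymer, mg/kg
D = rng.lognormal(np.log(1e-13), 0.5, n)        # diffusion coefficient in polymer, m^2/s
t = rng.uniform(30, 365, n) * 86400.0           # time of consumption, 30-365 days, in s
rho = 940.0                                     # polymer density, kg/m^3

# Early-time solution of Fick's law for a semi-infinite slab in contact with a
# perfect sink: migrated mass per unit contact area (mg/m^2).
m_area = 2.0 * c_p0 * rho * np.sqrt(D * t / np.pi)

print("median migration:", round(float(np.median(m_area)), 2), "mg/m^2")
print("95th percentile:", round(float(np.percentile(m_area, 95)), 2), "mg/m^2")
```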

  15. A Practical Probabilistic Graphical Modeling Tool for Weighing Ecological Risk-Based Evidence

    EPA Science Inventory

    Past weight-of-evidence frameworks for adverse ecological effects have provided soft-scoring procedures for judgments based on the quality and measured attributes of evidence. Here, we provide a flexible probabilistic structure for weighing and integrating lines of evidence for e...

  16. Fatigue strength reduction model: RANDOM3 and RANDOM4 user manual. Appendix 2: Development of advanced methodologies for probabilistic constitutive relationships of material strength models

    NASA Technical Reports Server (NTRS)

    Boyce, Lola; Lovelace, Thomas B.

    1989-01-01

    FORTRAN programs RANDOM3 and RANDOM4 are documented in the form of a user's manual. Both programs are based on fatigue strength reduction, using a probabilistic constitutive model. The programs predict the random lifetime of an engine component to reach a given fatigue strength. The theoretical backgrounds, input data instructions, and sample problems illustrating the use of the programs are included.

  17. Fatigue crack growth model RANDOM2 user manual. Appendix 1: Development of advanced methodologies for probabilistic constitutive relationships of material strength models

    NASA Technical Reports Server (NTRS)

    Boyce, Lola; Lovelace, Thomas B.

    1989-01-01

    FORTRAN program RANDOM2 is presented in the form of a user's manual. RANDOM2 is based on fracture mechanics using a probabilistic fatigue crack growth model. It predicts the random lifetime of an engine component to reach a given crack size. Details of the theoretical background, input data instructions, and a sample problem illustrating the use of the program are included.

  18. Self assembly of rectangular shapes on concentration programming and probabilistic tile assembly models.

    PubMed

    Kundeti, Vamsi; Rajasekaran, Sanguthevar

    2012-06-01

    ), to self assemble rectangles (of fixed aspect ratio) with high probability. The tile complexity of our algorithm is Θ(log(n)) and is optimal on the probabilistic tile assembly model (PTAM), where n is an upper bound on the dimensions of a rectangle. PMID:24311993

  20. A probabilistic spatial-temporal model for vent opening clustering at Campi Flegrei caldera (Italy)

    NASA Astrophysics Data System (ADS)

    Bevilacqua, A.; Isaia, R.; Flandoli, F.; Neri, A.; Quaranta, D.

    2014-12-01

    Campi Flegrei (CF) is a densely urbanized caldera with a very high volcanic risk. Its volcanic activity over the last 15 kyrs was characterized by more than 70 explosive events of variable scale and vent location. The sequence of eruptive events at CF is remarkably inhomogeneous, both in space and time. Eruptions concentrated over periods from a few centuries to a few millennia, alternating with periods of quiescence lasting up to several millennia. As a consequence, activity has been subdivided into three distinct epochs, i.e. Epoch I, 15 - 9.5 kyrs, Epoch II, 8.6 - 8.2 kyrs, and Epoch III, 4.8 - 3.7 kyrs BP [e.g. Orsi et al., 2004; Smith et al., 2011]. The eruptive record also shows the presence of clusters of events in space-time, i.e. the opening of a new vent in a particular location and at a specific time seems to increase the probability of another vent opening in the nearby area and in the next decades-centuries (self-exciting effect). Probabilistic vent opening mapping, conditional on the occurrence of a new event and able to account for some of the intrinsic uncertainties affecting the system, has been investigated in some recent studies [e.g. Selva et al. 2011, Bevilacqua et al. 2014, in preparation], but a spatial-temporal model of the sequence of volcanic activity remains an open issue. Hence we have developed a time-space mathematical model that takes into account both the self-exciting behaviour of the system and the significant uncertainty affecting the eruptive record. Based on the past eruptive record of the volcano, the model allows simulation of sequences of future events as well as a better understanding of the spatial and temporal evolution of the system. In addition, based on the assumption that the last eruptive event, which occurred in 1538 AD (the Monte Nuovo eruption), is the first event of a new epoch of activity, the model can estimate the probability of new vent opening at CF in the next decades.
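
    The self-exciting behaviour mentioned above can be illustrated with a generic temporal Hawkes point process simulated by Ogata thinning; the exponential kernel and all parameter values below are illustrative only and are not the calibrated Campi Flegrei model, which also treats vent locations and record uncertainty.

```python
import numpy as np

rng = np.random.default_rng(3)

# Illustrative Hawkes parameters (rates per year).
MU = 0.002      # background rate of eruptive events
ALPHA = 0.6     # expected number of "offspring" events triggered by each event
BETA = 0.01     # decay rate of the excitation (time scale ~ centuries)

def intensity(t, events):
    """Conditional intensity lambda(t) with an exponential kernel."""
    past = events[events < t]
    return MU + ALPHA * BETA * np.sum(np.exp(-BETA * (t - past)))

def simulate_hawkes(t_max):
    """Ogata thinning algorithm for a temporal Hawkes process."""
    events = np.array([])
    t = 0.0
    while t < t_max:
        lam_bar = intensity(t, events) + ALPHA * BETA   # valid local upper bound
        t += rng.exponential(1.0 / lam_bar)
        if t < t_max and rng.random() <= intensity(t, events) / lam_bar:
            events = np.append(events, t)
    return events

events = simulate_hawkes(15_000.0)   # a 15 kyr window, in years
print(f"{events.size} simulated events; expected about {MU * 15_000 / (1 - ALPHA):.0f}")
```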

  1. A Simple Model for Probabilistic Seismic Hazard Analysis of Induced Seismicity Associated With Deep Geothermal Systems

    NASA Astrophysics Data System (ADS)

    Schlittenhardt, Joerg; Spies, Thomas; Kopera, Juergen; Morales Aviles, Wilhelm

    2014-05-01

    In the research project MAGS (Microseismic activity of geothermal systems), funded by the German Federal Ministry of Environment (BMU), a simple model was developed to determine seismic hazard as the probability of the exceedance of ground motion of a certain size. Such estimates of the annual frequency of exceedance of prescriptive limits of, e.g., seismic intensities or ground motions are needed for planning and licensing, as well as for the development and operation of deep geothermal systems. For the development of the proposed model, well-established probabilistic seismic hazard analysis (PSHA) methods for the estimation of the hazard for the case of natural seismicity were adapted to the case of induced seismicity. Important differences between induced and natural seismicity had to be considered. These include significantly smaller magnitudes, depths and source-to-site distances of the seismic events and, hence, different ground motion prediction equations (GMPE) that had to be incorporated to account for the seismic amplitude attenuation with distance, as well as differences in the stationarity of the underlying tectonic and induced processes. Appropriate GMPEs in terms of PGV (peak ground velocity) were tested and selected from the literature. The proposed model and its application to the case of induced seismicity observed during the circulation period (operation phase of the plant) at geothermal sites in Germany will be presented. Using GMPEs for PGV has the advantage of estimating hazard in terms of ground motion velocities, which can be linked to engineering regulations (e.g. German DIN 4150) that give prescriptive standards for the effects of vibrations on buildings and people. It is thus possible to specify the probability of exceedance of such prescriptive standard values and to decide whether they can be accepted or not. On the other hand, hazard curves for induced and natural seismicity can be compared to study the impact at a site. Preliminary
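
    The sketch below shows the generic PSHA integration underlying such a model: a Gutenberg-Richter recurrence is combined with a lognormal GMPE to give the annual frequency of exceeding each PGV level. The recurrence parameters, the placeholder GMPE and the single source-to-site distance are invented for illustration and are not the MAGS model.

```python
import numpy as np
from scipy.stats import norm

# Illustrative induced-seismicity recurrence (Gutenberg-Richter) and GMPE settings.
mags = np.arange(1.0, 3.6, 0.1)                   # small magnitudes typical of induced events
a_val, b_val = 2.0, 1.0
annual_rate = 10**(a_val - b_val * mags) - 10**(a_val - b_val * (mags + 0.1))  # rate per bin
r_km = 3.0                                        # shallow, nearby source
sigma_ln = 0.7                                    # aleatory variability (ln units)

def ln_median_pgv(m, r):
    """Placeholder GMPE: ln of median PGV (mm/s) as a function of magnitude and distance."""
    return -2.0 + 1.6 * m - 1.3 * np.log(r)

pgv_levels = np.logspace(-1, 2, 50)               # 0.1 to 100 mm/s

# Hazard curve: annual frequency of exceedance for each PGV level.
hazard = np.zeros_like(pgv_levels)
for m, rate in zip(mags, annual_rate):
    p_exceed = 1.0 - norm.cdf(np.log(pgv_levels), loc=ln_median_pgv(m, r_km), scale=sigma_ln)
    hazard += rate * p_exceed

for z, h in zip(pgv_levels[::10], hazard[::10]):
    print(f"PGV > {z:6.2f} mm/s : {h:.3e} per year")
```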

  2. Impact of fault models on probabilistic seismic hazard assessment: the example of the West Corinth rift.

    NASA Astrophysics Data System (ADS)

    Chartier, Thomas; Scotti, Oona; Boiselet, Aurelien; Lyon-Caen, Hélène

    2016-04-01

    Including faults in probabilistic seismic hazard assessment tends to increase the degree of uncertainty in the results due to the intrinsically uncertain nature of the fault data. This is especially the case in the low to moderate seismicity regions of Europe, where slow slipping faults are difficult to characterize. In order to better understand the key parameters that control the uncertainty in the fault-related hazard computations, we propose to build an analytic tool that provides a clear link between the different components of the fault-related hazard computations and their impact on the results. This will allow identifying the important parameters that need to be better constrained in order to reduce the resulting uncertainty in hazard and also provide a more hazard-oriented strategy for collecting relevant fault parameters in the field. The tool will be illustrated through the example of the West Corinth rift fault models. Recent work performed in the gulf has shown the complexity of the normal faulting system that is accommodating the extensional deformation of the rift. A logic-tree approach is proposed to account for this complexity and the multiplicity of scientifically defendable interpretations. Different options are explored at each node of the logic tree, corresponding to each step of the fault-related hazard computation. The first nodes represent the uncertainty in the geometries of the faults and their slip rates, which can derive from different data and methodologies. The subsequent node explores, for a given geometry/slip rate of faults, different earthquake rupture scenarios that may occur in the complex network of faults. The idea is to allow the possibility of several fault segments to break together in a single rupture scenario. To build these multiple-fault-segment scenarios, two approaches are considered: one based on simple rules (i.e. minimum distance between faults) and a second one that relies on physically

  3. Predicting rib fracture risk with whole-body finite element models: development and preliminary evaluation of a probabilistic analytical framework.

    PubMed

    Forman, Jason L; Kent, Richard W; Mroz, Krystoffer; Pipkorn, Bengt; Bostrom, Ola; Segui-Gomez, Maria

    2012-01-01

    This study sought to develop a strain-based probabilistic method to predict rib fracture risk with whole-body finite element (FE) models, and to describe a method to combine the results with collision exposure information to predict injury risk and potential intervention effectiveness in the field. An age-adjusted ultimate strain distribution was used to estimate local rib fracture probabilities within an FE model. These local probabilities were combined to predict injury risk and severity within the whole ribcage. The ultimate strain distribution was developed from a literature dataset of 133 tests. Frontal collision simulations were performed with the THUMS (Total HUman Model for Safety) model with four levels of delta-V and two restraints: a standard 3-point belt and a progressive 3.5-7 kN force-limited, pretensioned (FL+PT) belt. The results of three simulations (29 km/h standard, 48 km/h standard, and 48 km/h FL+PT) were compared to matched cadaver sled tests. The numbers of fractures predicted for the comparison cases were consistent with those observed experimentally. Combining these results with field exposure information (ΔV, NASS-CDS 1992-2002) suggests an 8.9% probability of incurring AIS3+ rib fractures for a 60 year-old restrained by a standard belt in a tow-away frontal collision with this restraint, vehicle, and occupant configuration, compared to 4.6% for the FL+PT belt. This is the first study to describe a probabilistic framework to predict rib fracture risk based on strains observed in human-body FE models. Using this analytical framework, future efforts may incorporate additional subject or collision factors for multi-variable probabilistic injury prediction. PMID:23169122
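
    The combination of local fracture probabilities into a whole-ribcage risk can be sketched as follows, assuming a lognormal ultimate-strain distribution, independence between rib segments, and a threshold of three or more fractures as a stand-in for AIS3+; the strain values and distribution parameters are hypothetical, not those of the study.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(5)

# Hypothetical peak strains for 24 rib segments from one simulated frontal impact,
# and a hypothetical age-adjusted lognormal ultimate-strain distribution.
peak_strain = rng.uniform(0.005, 0.03, size=24)
ln_mu, ln_sigma = np.log(0.02), 0.35

# Local fracture probability: P(ultimate strain < peak strain) for each rib segment.
p_fracture = norm.cdf((np.log(peak_strain) - ln_mu) / ln_sigma)

def prob_at_least_k(p, k):
    """Poisson-binomial tail P(N >= k) for independent per-segment probabilities."""
    pmf = np.array([1.0])                    # P(N = 0) = 1 before any segment is added
    for pi in p:
        pmf = np.convolve(pmf, [1.0 - pi, pi])
    return pmf[k:].sum()

print(f"P(>= 3 fractured rib segments) = {prob_at_least_k(p_fracture, 3):.3f}")
```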

  4. Predicting Rib Fracture Risk With Whole-Body Finite Element Models: Development and Preliminary Evaluation of a Probabilistic Analytical Framework

    PubMed Central

    Forman, Jason L.; Kent, Richard W.; Mroz, Krystoffer; Pipkorn, Bengt; Bostrom, Ola; Segui-Gomez, Maria

    2012-01-01

    This study sought to develop a strain-based probabilistic method to predict rib fracture risk with whole-body finite element (FE) models, and to describe a method to combine the results with collision exposure information to predict injury risk and potential intervention effectiveness in the field. An age-adjusted ultimate strain distribution was used to estimate local rib fracture probabilities within an FE model. These local probabilities were combined to predict injury risk and severity within the whole ribcage. The ultimate strain distribution was developed from a literature dataset of 133 tests. Frontal collision simulations were performed with the THUMS (Total HUman Model for Safety) model with four levels of delta-V and two restraints: a standard 3-point belt and a progressive 3.5–7 kN force-limited, pretensioned (FL+PT) belt. The results of three simulations (29 km/h standard, 48 km/h standard, and 48 km/h FL+PT) were compared to matched cadaver sled tests. The numbers of fractures predicted for the comparison cases were consistent with those observed experimentally. Combining these results with field exposure information (ΔV, NASS-CDS 1992–2002) suggests an 8.9% probability of incurring AIS3+ rib fractures for a 60 year-old restrained by a standard belt in a tow-away frontal collision with this restraint, vehicle, and occupant configuration, compared to 4.6% for the FL+PT belt. This is the first study to describe a probabilistic framework to predict rib fracture risk based on strains observed in human-body FE models. Using this analytical framework, future efforts may incorporate additional subject or collision factors for multi-variable probabilistic injury prediction. PMID:23169122

  5. Probabilistic model of beam-plasma interaction in the randomly inhomogeneous solar wind

    NASA Astrophysics Data System (ADS)

    Krasnoselskikh, Vladimir; Voshchepynets, Andrii

    2015-04-01

    We apply the previously proposed probabilistic model of beam-plasma interaction to electron beams propagating in the interplanetary plasma, taking into account known properties of the density fluctuations in the solar wind measured aboard ISEE. The new element with respect to previous work is the calculation of the probability density for density fluctuations having power-law spectra, whereas previously we used a Gaussian probability distribution. We use the property that, for a given frequency, the probability distribution of density fluctuations uniquely determines the probability distribution of the wave's phase velocity. We represent the system as discrete, consisting of small equal spatial intervals, with a linear density profile on each interval. The model is based on a general description of the wave-particle interaction on any of these small spatial intervals with a linear profile. We solve the equations of motion of a particle under the action of the wave with a given amplitude and phase at the beginning of the interval. This approach allows one to estimate variations of the wave's energy density and particle's velocity, depending on the density gradient. The presence of the plasma inhomogeneity results in the variation of the phase velocity of the wave having known frequency and causes a spreading of the width of the resonance in the velocity space. Since the characteristic time of the evolution of the electron distribution function and wave energy is much longer than the time of the single wave-particle resonant interaction on a given small interval, we can proceed to the description of the relaxation process in terms of averaged quantities. We derive a system of equations similar to the quasi-linear approximation, but the conventional velocity diffusion coefficient D and the wave's growth rate γ are replaced by quantities averaged in phase space, making use of the probability distribution of phase velocities and assuming that the interaction on each interval is

  6. Testing process predictions of models of risky choice: a quantitative model comparison approach

    PubMed Central

    Pachur, Thorsten; Hertwig, Ralph; Gigerenzer, Gerd; Brandstätter, Eduard

    2013-01-01

    This article presents a quantitative model comparison contrasting the process predictions of two prominent views on risky choice. One view assumes a trade-off between probabilities and outcomes (or non-linear functions thereof) and the separate evaluation of risky options (expectation models). Another view assumes that risky choice is based on comparative evaluation, limited search, aspiration levels, and the forgoing of trade-offs (heuristic models). We derived quantitative process predictions for a generic expectation model and for a specific heuristic model, namely the priority heuristic (Brandstätter et al., 2006), and tested them in two experiments. The focus was on two key features of the cognitive process: acquisition frequencies (i.e., how frequently individual reasons are looked up) and direction of search (i.e., gamble-wise vs. reason-wise). In Experiment 1, the priority heuristic predicted direction of search better than the expectation model (although neither model predicted the acquisition process perfectly); acquisition frequencies, however, were inconsistent with both models. Additional analyses revealed that these frequencies were primarily a function of what Rubinstein (1988) called “similarity.” In Experiment 2, the quantitative model comparison approach showed that people seemed to rely more on the priority heuristic in difficult problems, but to make more trade-offs in easy problems. This finding suggests that risky choice may be based on a mental toolbox of strategies. PMID:24151472
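
    As an illustration of the kind of heuristic model examined above, the sketch below implements a simplified priority heuristic for two-outcome gain gambles (the rounding of the aspiration level to prominent numbers is omitted); the example gambles are generic textbook values, not stimuli from these experiments.

```python
def priority_heuristic(gamble_a, gamble_b):
    """Simplified priority heuristic for two gain gambles, each a list of
    (outcome, probability) pairs. Reasons are examined in the order: minimum gain,
    probability of minimum gain, maximum gain; search stops at the first reason
    that discriminates."""
    def features(g):
        outcomes = [o for o, _ in g]
        min_o = min(outcomes)
        p_min = sum(p for o, p in g if o == min_o)
        return min_o, p_min, max(outcomes)

    min_a, pmin_a, max_a = features(gamble_a)
    min_b, pmin_b, max_b = features(gamble_b)
    aspiration = 0.1 * max(max_a, max_b)             # 1/10 of the maximum gain

    if abs(min_a - min_b) >= aspiration:             # reason 1: minimum gains
        return "A" if min_a > min_b else "B"
    if abs(pmin_a - pmin_b) >= 0.1:                  # reason 2: probability of minimum gain
        return "A" if pmin_a < pmin_b else "B"
    return "A" if max_a > max_b else "B"             # reason 3: maximum gain

# Example: A = (4000 with p = .8, else 0) versus B = (3000 for sure).
print(priority_heuristic([(4000, 0.8), (0, 0.2)], [(3000, 1.0)]))   # -> "B"
```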

  7. Initial Correction versus Negative Marking in Multiple Choice Examinations

    ERIC Educational Resources Information Center

    Van Hecke, Tanja

    2015-01-01

    Optimal assessment tools should measure in a limited time the knowledge of students in a correct and unbiased way. A method for automating the scoring is multiple choice scoring. This article compares scoring methods from a probabilistic point of view by modelling the probability to pass: the number right scoring, the initial correction (IC) and…

  8. Modeling Educational Choices. A Binomial Logit Model Applied to the Demand for Higher Education.

    ERIC Educational Resources Information Center

    Jimenez, Juan de Dios; Salas-Velasco, Manual

    2000-01-01

    Presents a microeconomic analysis of the choice of university degree course (3-year or 4-year course) that Spanish students make on finishing their secondary studies and applies the developed binomial logit model to survey data from 388 high school graduates. Findings show the importance of various factors in determining the likelihood of choosing the…
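
    A minimal sketch of fitting a binomial logit model of this kind by Newton-Raphson on synthetic data; the predictors (grade average, parental university education) and all numbers are invented for illustration and are not the variables or results of the cited study.

```python
import numpy as np

rng = np.random.default_rng(11)

# Synthetic survey data: does a graduate choose the long (4-year) degree course?
n = 388
grades = rng.normal(6.5, 1.0, n)              # secondary-school grade average
parents_univ = rng.integers(0, 2, n)          # 1 if a parent holds a university degree
X = np.column_stack([np.ones(n), grades, parents_univ])
true_beta = np.array([-4.0, 0.5, 0.8])
y = (rng.random(n) < 1 / (1 + np.exp(-X @ true_beta))).astype(float)

def fit_logit(X, y, iters=25):
    """Binomial logit via Newton-Raphson (iteratively reweighted least squares)."""
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        p = 1 / (1 + np.exp(-X @ beta))
        grad = X.T @ (y - p)
        hess = X.T @ (X * (p * (1 - p))[:, None])
        beta += np.linalg.solve(hess, grad)
    return beta

print("estimated coefficients:", fit_logit(X, y))
```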

  9. Developing EHR-driven heart failure risk prediction models using CPXR(Log) with the probabilistic loss function.

    PubMed

    Taslimitehrani, Vahid; Dong, Guozhu; Pereira, Naveen L; Panahiazar, Maryam; Pathak, Jyotishman

    2016-04-01

    Computerized survival prediction in healthcare, which identifies the risk of disease mortality, helps healthcare providers effectively manage their patients by providing appropriate treatment options. In this study, we propose to apply a classification algorithm, Contrast Pattern Aided Logistic Regression (CPXR(Log)) with the probabilistic loss function, to develop and validate prognostic risk models to predict 1-, 2-, and 5-year survival in heart failure (HF) using data from electronic health records (EHRs) at Mayo Clinic. The CPXR(Log) constructs a pattern aided logistic regression model defined by several patterns and corresponding local logistic regression models. One of the models generated by CPXR(Log) achieved an AUC and accuracy of 0.94 and 0.91, respectively, and significantly outperformed prognostic models reported in prior studies. Data extracted from EHRs allowed incorporation of patient co-morbidities into our models, which helped improve the performance of the CPXR(Log) models (15.9% AUC improvement), although it did not improve the accuracy of the models built by other classifiers. We also propose a probabilistic loss function to determine the large error and small error instances. The new loss function used in the algorithm outperforms the functions used in previous studies by a 1% improvement in AUC. This study revealed that using EHR data to build prediction models can be very challenging using existing classification methods due to the high dimensionality and complexity of EHR data. The risk models developed by CPXR(Log) also reveal that HF is a highly heterogeneous disease, i.e., different subgroups of HF patients require different types of considerations with their diagnosis and treatment. Our risk models provided two valuable insights for application of predictive modeling techniques in biomedicine: Logistic risk models often make systematic prediction errors, and it is prudent to use subgroup based prediction models such as those given by CPXR

  10. A Probabilistic Model of Global-Scale Seismology with Veith-Clawson Amplitude Corrections

    NASA Astrophysics Data System (ADS)

    Arora, N. S.; Russell, S.

    2013-12-01

    We present a probabilistic generative model of global-scale seismology, NET-VISA, that is designed to address the event detection and location problem of seismic monitoring. The model is based on a standard Bayesian framework with prior probabilities for event generation and propagation as well as likelihoods of detection and arrival (or onset) parameters. The model is supplemented with a greedy search algorithm that iteratively improves the predicted bulletin with respect to the posterior probability. Our prior model incorporates both seismic theory and empirical observations as appropriate. For instance, we use empirical observations for the expected rates of earthquakes at each point on the Earth, while we use the Gutenberg-Richter law for the expected magnitude distribution of these earthquakes. In this work, we describe an extension of our model where we include the Veith-Clawson (1972) amplitude decline curves in our empirically calibrated arrival amplitude model. While this change doesn't alter the overall event-detection results, we have chosen to keep the Veith-Clawson curves since they are more seismically accurate. We also describe a recent change to our search algorithm, whereby we now consider multiple hypotheses when we encounter a series of closely spaced arrivals which could be explained by either a single event or multiple co-located events. This change has led to a sharp improvement in our results on large after-shock sequences. We use the analyst-curated LEB bulletin or the REB bulletin, which is the published product of the IDC, as a reference and measure the overlap (percentage of reference events that are matched) and inconsistency (percentage of test bulletin events that don't match anything in the reference) of a one-to-one matching between the test and the reference bulletins. In the table below we show results for NET-VISA and SEL3, which is produced by the existing GA software, for the whole of 2009. These results show that NET

  11. Probabilistic Risk Model for Organ Doses and Acute Health Effects of Astronauts on Lunar Missions

    NASA Technical Reports Server (NTRS)

    Kim, Myung-Hee Y.; Hu, Shaowen; Nounu, Hatem N.; Cucinotta, Francis A.

    2009-01-01

    Exposure to large solar particle events (SPEs) is a major concern during EVAs on the lunar surface and in Earth-to-Lunar transit. Fifteen percent of crew time may be spent on EVA with minimal radiation shielding. Therefore, an accurate assessment of SPE occurrence probability is required for mission planning by NASA. We apply probabilistic risk assessment (PRA) for radiation protection of crews and optimization of lunar mission planning.

  12. Twenty-first century probabilistic projections of precipitation over Ontario, Canada through a regional climate model ensemble

    NASA Astrophysics Data System (ADS)

    Wang, Xiuquan; Huang, Guohe; Liu, Jinliang

    2016-06-01

    In this study, probabilistic projections of precipitation for the Province of Ontario are developed through a regional climate model ensemble to help investigate how global warming would affect its local climate. The PRECIS regional climate modeling system is employed to perform ensemble simulations, driven by a set of boundary conditions from a HadCM3-based perturbed-physics ensemble. The PRECIS ensemble simulations are fed into a Bayesian hierarchical model to quantify uncertain factors affecting the resulting projections of precipitation and thus generate probabilistic precipitation changes at grid point scales. Following that, reliable precipitation projections throughout the twenty-first century are developed for the entire province by applying the probabilistic changes to the observed precipitation. The results show that the vast majority of cities in Ontario are likely to experience positive changes in annual precipitation in the 2030s, 2050s, and 2080s in comparison to the baseline observations. This may suggest that the whole province is likely to gain more precipitation throughout the twenty-first century in response to global warming. The analyses on the projections of seasonal precipitation further demonstrate that the entire province is likely to receive more precipitation in winter, spring, and autumn throughout this century, while summer precipitation is only likely to increase slightly in the 2030s and would decrease gradually afterwards. However, because the magnitude of the projected decrease in summer precipitation is relatively small in comparison with the anticipated increases in the other three seasons, the annual precipitation over Ontario is likely to show a progressive increase throughout the twenty-first century (by 7.0% in the 2030s, 9.5% in the 2050s, and 12.6% in the 2080s). In addition, the degree of uncertainty in the precipitation projections is analyzed. The results suggest that future changes in spring precipitation show a higher degree of uncertainty than other

  13. Linear-Nonlinear-Poisson Models of Primate Choice Dynamics

    ERIC Educational Resources Information Center

    Corrado, Greg S.; Sugrue, Leo P.; Seung, H. Sebastian; Newsome, William T.

    2005-01-01

    The equilibrium phenomenon of matching behavior traditionally has been studied in stationary environments. Here we attempt to uncover the local mechanism of choice that gives rise to matching by studying behavior in a highly dynamic foraging environment. In our experiments, 2 rhesus monkeys ("Macaca mulatta") foraged for juice rewards by making…

  14. Item Response Modeling of Forced-Choice Questionnaires

    ERIC Educational Resources Information Center

    Brown, Anna; Maydeu-Olivares, Alberto

    2011-01-01

    Multidimensional forced-choice formats can significantly reduce the impact of numerous response biases typically associated with rating scales. However, if scored with classical methodology, these questionnaires produce ipsative data, which lead to distorted scale relationships and make comparisons between individuals problematic. This research…

  15. A ligand prediction tool based on modeling and reasoning with imprecise probabilistic knowledge.

    PubMed

    Liu, Weiru; Yue, Anbu; Timson, David J

    2010-04-01

    Ligand prediction has been driven by a fundamental desire to understand more about how biomolecules recognize their ligands and by the commercial imperative to develop new drugs. Most of the currently available software systems are very complex and time-consuming to use. Therefore, developing simple and efficient tools to perform initial screening of interesting compounds is an appealing idea. In this paper, we introduce our tool for very rapid screening for likely ligands (either substrates or inhibitors) based on reasoning with imprecise probabilistic knowledge elicited from past experiments. Probabilistic knowledge is input to the system via a user-friendly interface showing a base compound structure. A prediction of whether a particular compound is a substrate is queried against the acquired probabilistic knowledge base and a probability is returned as an indication of the prediction. This tool will be particularly useful in situations where a number of similar compounds have been screened experimentally, but information is not available for all possible members of that group of compounds. We use two case studies to demonstrate how to use the tool.

  16. The effects of climate model similarity on probabilistic climate projections and the implications for local, risk-based adaptation planning

    NASA Astrophysics Data System (ADS)

    Steinschneider, Scott; McCrary, Rachel; Mearns, Linda O.; Brown, Casey

    2015-06-01

    Approaches for probability density function (pdf) development of future climate often assume that different climate models provide independent information, despite model similarities that stem from a common genealogy (models with shared code or developed at the same institution). Here we use an ensemble of projections from the Coupled Model Intercomparison Project Phase 5 to develop probabilistic climate information, with and without an accounting of intermodel correlations, for seven regions across the United States. We then use the pdfs to estimate midcentury climate-related risks to a water utility in one of the regions. We show that the variance of climate changes is underestimated across all regions if model correlations are ignored, and in some cases, the mean change shifts as well. When coupled with impact models of the hydrology and infrastructure of a water utility, the underestimated likelihood of large climate changes significantly alters the quantification of risk for water shortages by midcentury.

  17. Agent autonomy approach to probabilistic physics-of-failure modeling of complex dynamic systems with interacting failure mechanisms

    NASA Astrophysics Data System (ADS)

    Gromek, Katherine Emily

    A novel computational and inference framework of the physics-of-failure (PoF) reliability modeling for complex dynamic systems has been established in this research. The PoF-based reliability models are used to perform a real time simulation of system failure processes, so that the system level reliability modeling would constitute inferences from checking the status of component level reliability at any given time. The "agent autonomy" concept is applied as a solution method for the system-level probabilistic PoF-based (i.e. PPoF-based) modeling. This concept originated from artificial intelligence (AI) as a leading intelligent computational inference in modeling of multi-agent systems (MAS). The concept of agent autonomy in the context of reliability modeling was first proposed by M. Azarkhail [1], where a fundamentally new idea of system representation by autonomous intelligent agents for the purpose of reliability modeling was introduced. The contribution of the current work lies in the further development of the agent autonomy concept, particularly the refined agent classification within the scope of the PoF-based system reliability modeling, new approaches to the learning and the autonomy properties of the intelligent agents, and modeling interacting failure mechanisms within the dynamic engineering system. The autonomous property of intelligent agents is defined as an agent's ability to self-activate, deactivate or completely redefine their role in the analysis. This property of agents and the ability to model interacting failure mechanisms of the system elements make the agent autonomy approach fundamentally different from all existing methods of probabilistic PoF-based reliability modeling. 1. Azarkhail, M., "Agent Autonomy Approach to Physics-Based Reliability Modeling of Structures and Mechanical Systems", PhD thesis, University of Maryland, College Park, 2007.

  18. Development of Standardized Probabilistic Risk Assessment Models for Shutdown Operations Integrated in SPAR Level 1 Model

    SciTech Connect

    S. T. Khericha; J. Mitman

    2008-05-01

    Nuclear plant operating experience and several studies show that the risk from shutdown operation during Modes 4, 5, and 6 at pressurized water reactors and Modes 4 and 5 at boiling water reactors can be significant. This paper describes using the U.S. Nuclear Regulatory Commission’s full-power Standardized Plant Analysis Risk (SPAR) model as the starting point for development of risk evaluation models for commercial nuclear power plants. The shutdown models are integrated with their respective internal event at-power SPAR model. This is accomplished by combining the modified system fault trees from the SPAR full-power model with shutdown event tree logic. Preliminary human reliability analysis results indicate that risk is dominated by the operator’s ability to correctly diagnose events and initiate systems.

  19. Choices and Changes: Eccles' Expectancy-Value Model and Upper-Secondary School Students' Longitudinal Reflections about Their Choice of a STEM Education

    ERIC Educational Resources Information Center

    Lykkegaard, Eva; Ulriksen, Lars

    2016-01-01

    During the past 30 years, Eccles' comprehensive social-psychological Expectancy-Value Model of Motivated Behavioural Choices (EV-MBC model) has been proven suitable for studying educational choices related to Science, Technology, Engineering and/or Mathematics (STEM). The reflections of 15 students in their last year in upper-secondary school…

  20. Crevice corrosion & pitting of high-level waste containers: the integration of deterministic & probabilistic models (II)

    SciTech Connect

    Farmer, J.C.

    1997-10-01

    An integrated predictive model is being developed to account for the effects of localized environmental conditions in crevices on the initiation and propagation of pits. A deterministic calculation is used to estimate the accumulation of hydrogen ions (pH suppression) in the crevice solution due to the hydrolysis of dissolved metals. Pit initiation and growth within the crevice is then dealt with by either a probabilistic model, or an equivalent deterministic model. Ultimately, the role of intergranular corrosion will have to be considered. While the strategy presented here is very promising, the integrated model is not yet ready for precise quantitative predictions. Empirical expressions for the rate of penetration based upon experimental crevice corrosion data can be used in the interim period, until the integrated model can be refined. Bounding calculations based upon such empirical expressions can provide important insight into worst-case scenarios.

  1. Multi-model ensemble-based probabilistic prediction of tropical cyclogenesis using TIGGE model forecasts

    NASA Astrophysics Data System (ADS)

    Jaiswal, Neeru; Kishtawal, C. M.; Bhomia, Swati; Pal, P. K.

    2016-10-01

    An extended range tropical cyclogenesis forecast model has been developed using the forecasts of global models available from the TIGGE portal. A scheme has been developed to detect the signatures of cyclogenesis in the global model forecast fields [i.e., the mean sea level pressure and surface winds (10 m horizontal winds)]. For this, a wind matching index was determined between the synthetic cyclonic wind fields and the forecast wind fields. The thresholds of 0.4 for wind matching index and 1005 hPa for pressure were determined to detect the cyclonic systems. These detected cyclonic systems in the study region are classified into different cyclone categories based on their intensity (maximum wind speed). The forecasts of up to 15 days from three global models, viz. ECMWF, NCEP and UKMO, have been used to predict cyclogenesis based on a multi-model ensemble approach. The occurrence of cyclonic events of different categories in all the forecast steps in the gridded region (10 × 10 km2) was used to estimate the probability of the formation of cyclogenesis. The probability of cyclogenesis was estimated by computing the grid score using the wind matching index by each model and at each forecast step and convolving it with a Gaussian filter. The proposed method is used to predict the cyclogenesis of five named tropical cyclones formed during the year 2013 in the north Indian Ocean. The cyclogenesis of these systems was predicted 6-8 days in advance using the above approach. The mean lead prediction time of the proposed model for cyclogenesis events has been found to be 7 days.

  2. Modeling the Bullying Prevention Program Preferences of Educators: A Discrete Choice Conjoint Experiment

    ERIC Educational Resources Information Center

    Cunningham, Charles E.; Vaillancourt, Tracy; Rimas, Heather; Deal, Ken; Cunningham, Lesley; Short, Kathy; Chen, Yvonne

    2009-01-01

    We used discrete choice conjoint analysis to model the bullying prevention program preferences of educators. Using themes from computerized decision support lab focus groups (n = 45 educators), we composed 20 three-level bullying prevention program design attributes. Each of 1,176 educators completed 25 choice tasks presenting experimentally…

  3. Modeling confidence judgments, response times, and multiple choices in decision making: recognition memory and motion discrimination.

    PubMed

    Ratcliff, Roger; Starns, Jeffrey J

    2013-07-01

    Confidence in judgments is a fundamental aspect of decision making, and tasks that collect confidence judgments are an instantiation of multiple-choice decision making. We present a model for confidence judgments in recognition memory tasks that uses a multiple-choice diffusion decision process with separate accumulators of evidence for the different confidence choices. The accumulator that first reaches its decision boundary determines which choice is made. Five algorithms for accumulating evidence were compared, and one of them produced proportions of responses for each of the choices and full response time distributions for each choice that closely matched empirical data. With this algorithm, an increase in the evidence in one accumulator is accompanied by a decrease in the others so that the total amount of evidence in the system is constant. Application of the model to the data from an earlier experiment (Ratcliff, McKoon, & Tindall, 1994) uncovered a relationship between the shapes of z-transformed receiver operating characteristics and the behavior of response time distributions. Both are explained in the model by the behavior of the decision boundaries. For generality, we also applied the decision model to a 3-choice motion discrimination task and found it accounted for data better than a competing class of models. The confidence model presents a coherent account of confidence judgments and response time that cannot be explained with currently popular signal detection theory analyses or dual-process models of recognition.
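
    A generic multiple-accumulator race simulation in the spirit of the models compared above is sketched below; it uses independent diffusion-type accumulators rather than the constrained-evidence algorithm favoured in the article, and the drift rates for the six confidence categories are arbitrary placeholders.

```python
import numpy as np

rng = np.random.default_rng(21)

def race_trial(drifts, threshold=1.0, dt=0.002, noise=1.0, max_time=5.0):
    """One trial of an independent-accumulator race among multiple response options.
    Returns (index of winning option, decision time)."""
    x = np.zeros(len(drifts))
    t = 0.0
    while t < max_time:
        x += np.asarray(drifts) * dt + noise * np.sqrt(dt) * rng.standard_normal(len(drifts))
        t += dt
        if (x >= threshold).any():
            break
    return int(np.argmax(x)), t

# Six confidence categories (e.g. "sure old" ... "sure new") with arbitrary drifts.
drifts = [0.9, 0.5, 0.2, 0.1, 0.05, 0.02]
trials = [race_trial(drifts) for _ in range(1000)]
choices = np.array([c for c, _ in trials])
rts = np.array([t for _, t in trials])
for k in range(len(drifts)):
    sel = choices == k
    if sel.any():
        print(f"choice {k}: proportion = {sel.mean():.3f}, median RT = {np.median(rts[sel]):.2f} s")
```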

  4. Challenges associated with estimating the cost of European flooding through the development of a multi-country probabilistic model

    NASA Astrophysics Data System (ADS)

    Haseldine, Lucy

    2013-04-01

    Assessing the potential costs of large-scale flooding within the insurance and reinsurance industry can be achieved using probabilistic catastrophe models that combine hazard map outputs from flood models with exposure information. Many detailed flood modelling methodologies are available, including both advanced hydrological approaches and detailed 2D hydraulic models. However, these approaches are typically developed and perfected for a relatively limited test area (e.g. a single catchment or region) enabling efficient calibration to be carried out. With single flood events crossing country borders, multiple concurrent floods occurring across catchments, and an increasing need for national and international scale risk and financial assessment, up-scaling these localised methodologies is essential. The implementation of such techniques at national and international level poses a series of challenges to the model developer. Here, we discuss the challenges associated with the development of a multi-country probabilistic model designed to enable assessment of insurance exposures to river flooding in 12 countries across Europe on a return period basis. The model incorporates several components primarily developed for use in more limited areas, for example the 2D hydraulic modelling software JFlow+. Some of the challenges and their solutions that we will discuss include: • Availability of different volumes, record lengths and qualities of gauge and digital terrain data between countries; • Differing resolution and quality of property exposure information; • The need for a significant amount of manual editing work across a very wide area; • Different information available for validation in different regions; • Lengthy data and model analysis times; • The requirement for extremely fast computer processors.

  5. Developing an event-tree probabilistic tsunami inundation model for NE Atlantic coasts: Application to case studies

    NASA Astrophysics Data System (ADS)

    Omira, Rachid; Baptista, Maria Ana; Matias, Luis

    2015-04-01

    This study constitutes the first assessment of probabilistic tsunami inundation in the NE Atlantic region, using an event-tree approach. It aims to develop a probabilistic tsunami inundation approach for the NE Atlantic coast with an application to two test sites of the ASTARTE project, Tangier-Morocco and Sines-Portugal. Only tsunamis of tectonic origin are considered here, taking into account near-, regional- and far-field sources. The multidisciplinary approach, proposed here, consists of an event-tree method that gathers seismic hazard assessment, tsunami numerical modelling, and statistical methods. It also presents a treatment of uncertainties related to source location and tidal stage in order to derive the likelihood of tsunami flood occurrence and exceedance of a specific near-shore wave height during a given return period. We derive high-resolution probabilistic maximum wave heights and flood distributions for both test sites, Tangier and Sines, considering 100-, 500-, and 1000-year return periods. We find that the probability that a maximum wave height exceeds 1 m somewhere along the Sines coasts reaches about 55% for a 100-year return period, and is up to 100% for a 1000-year return period. Along the Tangier coast, the probability of inundation occurrence (flow depth > 0 m) is up to 45% for a 100-year return period and reaches 96% at some near-shore coastal locations for a 500-year return period. Acknowledgements: This work is funded by project ASTARTE - Assessment, STrategy And Risk Reduction for Tsunamis in Europe. Grant 603839, 7th FP (ENV.2013.6.4-3).

  6. Developing an Event-Tree Probabilistic Tsunami Inundation Model for NE Atlantic Coasts: Application to a Case Study

    NASA Astrophysics Data System (ADS)

    Omira, R.; Matias, L.; Baptista, M. A.

    2016-08-01

    This study constitutes a preliminary assessment of probabilistic tsunami inundation in the NE Atlantic region. We developed an event-tree approach to calculate the likelihood of tsunami flood occurrence and exceedance of a specific near-shore wave height for a given exposure time. Only tsunamis of tectonic origin are considered here, taking into account local, regional, and far-field sources. The approach used here consists of an event-tree method that gathers probability models for seismic sources, tsunami numerical modeling, and statistical methods. It also includes a treatment of aleatoric uncertainties related to source location and tidal stage. Epistemic uncertainties are not addressed in this study. The methodology is applied to the coastal test-site of Sines located in the NE Atlantic coast of Portugal. We derive probabilistic high-resolution maximum wave amplitudes and flood distributions for the study test-site considering 100- and 500-year exposure times. We find that the probability that maximum wave amplitude exceeds 1 m somewhere along the Sines coasts reaches about 60 % for an exposure time of 100 years and is up to 97 % for an exposure time of 500 years. The probability of inundation occurrence (flow depth >0 m) varies between 10 % and 57 %, and from 20 % up to 95 % for 100- and 500-year exposure times, respectively. No validation has been performed here with historical tsunamis. This paper illustrates a methodology through a case study, which is not an operational assessment.

  7. Analysis of well test data---Application of probabilistic models to infer hydraulic properties of fractures. [Contains list of standardized terminology or nomenclatue used in statistical models

    SciTech Connect

    Osnes, J.D. ); Winberg, A.; Andersson, J.E.; Larsson, N.A. )

    1991-09-27

    Statistical and probabilistic methods for estimating the probability that a fracture is nonconductive (or equivalently, the conductive-fracture frequency) and the distribution of the transmissivities of conductive fractures from transmissivity measurements made in single-hole injection (well) tests were developed. These methods were applied to a database consisting of over 1,000 measurements made in nearly 25 km of borehole at five sites in Sweden. The depths of the measurements ranged from near the surface to over 600-m deep, and packer spacings of 20- and 25-m were used. A probabilistic model that describes the distribution of a series of transmissivity measurements was derived. When the parameters of this model were estimated using maximum likelihood estimators, the resulting estimated distributions generally fit the cumulative histograms of the transmissivity measurements very well. Further, estimates of the mean transmissivity of conductive fractures based on the maximum likelihood estimates of the model's parameters were reasonable, both in magnitude and in trend, with respect to depth. The estimates of the conductive fracture probability were generated in the range of 0.5--5.0 percent, with the higher values at shallow depths and with increasingly smaller values as depth increased. An estimation procedure based on the probabilistic model and the maximum likelihood estimators of its parameters was recommended. Some guidelines regarding the design of injection test programs were drawn from the recommended estimation procedure and the parameter estimates based on the Swedish data. 24 refs., 12 figs., 14 tabs.

  8. Probabilistic river forecast methodology

    NASA Astrophysics Data System (ADS)

    Kelly, Karen Suzanne

    1997-09-01

    The National Weather Service (NWS) operates deterministic conceptual models to predict the hydrologic response of a river basin to precipitation. The outputs from these models are forecasted hydrographs (time series of the future river stage) at certain locations along a river. In order for the forecasts to be useful for optimal decision making, the uncertainty associated with them must be quantified. A methodology is developed for this purpose that (i) can be implemented with any deterministic hydrologic model, (ii) receives a probabilistic forecast of precipitation as input, (iii) quantifies all sources of uncertainty, (iv) operates in real-time and within computing constraints, and (v) produces probability distributions of future river stages. The Bayesian theory which supports the methodology involves transformation of a distribution of future precipitation into one of future river stage, and statistical characterization of the uncertainty in the hydrologic model. This is accomplished by decomposing total uncertainty into that associated with future precipitation and that associated with the hydrologic transformations. These are processed independently and then integrated into a predictive distribution which constitutes a probabilistic river stage forecast. A variety of models are presented for implementation of the methodology. In the most general model, a probability of exceedance associated with a given future hydrograph is specified. In the simplest model, a probability of exceedance associated with a given future river stage is specified. In conjunction with the Ohio River Forecast Center of the NWS, the simplest model is used to demonstrate the feasibility of producing probabilistic river stage forecasts for a river basin located in headwaters. Previous efforts to quantify uncertainty in river forecasting have only considered selected sources of uncertainty, been specific to a particular hydrologic model, or have not obtained an entire probability

  9. The multiattribute linear ballistic accumulator model of context effects in multialternative choice.

    PubMed

    Trueblood, Jennifer S; Brown, Scott D; Heathcote, Andrew

    2014-04-01

    Context effects occur when a choice between 2 options is altered by adding a 3rd alternative. Three major context effects--similarity, compromise, and attraction--have wide-ranging implications across applied and theoretical domains, and have driven the development of new dynamic models of multiattribute and multialternative choice. We propose the multiattribute linear ballistic accumulator (MLBA), a new dynamic model that provides a quantitative account of all 3 context effects. Our account applies not only to traditional paradigms involving choices among hedonic stimuli, but also to recent demonstrations of context effects with nonhedonic stimuli. Because of its computational tractability, the MLBA model is more easily applied than previous dynamic models. We show that the model also accounts for a range of other phenomena in multiattribute, multialternative choice, including time pressure effects, and that it makes a new prediction about the relationship between deliberation time and the magnitude of the similarity effect, which we confirm experimentally.
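
    The race mechanism underlying the MLBA can be sketched as a standard linear ballistic accumulator simulation. The multiattribute front end that maps attribute values to mean drift rates is omitted here, so the drift rates below are simply assumed for illustration.

```python
# Minimal sketch of the linear ballistic accumulator (LBA) backbone underlying the
# MLBA; the mapping from attributes to mean drift rates is not modelled here.
import numpy as np

rng = np.random.default_rng(1)

def simulate_lba(mean_drifts, A=1.0, b=2.0, s=0.3, t0=0.2, n_trials=10_000):
    """Simulate choices and response times for one LBA race per trial."""
    n_alt = len(mean_drifts)
    drifts = rng.normal(mean_drifts, s, size=(n_trials, n_alt))
    drifts = np.clip(drifts, 1e-6, None)                 # keep drifts positive for simplicity
    starts = rng.uniform(0.0, A, size=(n_trials, n_alt))
    finish = (b - starts) / drifts                        # time for each accumulator to reach threshold b
    choice = finish.argmin(axis=1)                        # fastest accumulator wins
    rt = finish.min(axis=1) + t0                          # add non-decision time
    return choice, rt

choice, rt = simulate_lba(mean_drifts=[1.0, 0.8, 0.8])    # hypothetical 3-alternative choice set
print("choice proportions:", np.bincount(choice, minlength=3) / choice.size)
print("mean RT:", rt.mean().round(3))
```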

  10. A combinatorial Bayesian and Dirichlet model for prostate MR image segmentation using probabilistic image features

    NASA Astrophysics Data System (ADS)

    Li, Ang; Li, Changyang; Wang, Xiuying; Eberl, Stefan; Feng, Dagan; Fulham, Michael

    2016-08-01

    Blurred boundaries and heterogeneous intensities make accurate prostate MR image segmentation problematic. To improve prostate MR image segmentation we suggest an approach that includes: (a) an image patch division method to partition the prostate into homogeneous segments for feature extraction; (b) an image feature formulation and classification method, using the relevance vector machine, to provide probabilistic prior knowledge for graph energy construction; (c) a graph energy formulation scheme with Bayesian priors and Dirichlet graph energy and (d) a non-iterative graph energy minimization scheme, based on matrix differentiation, to perform the probabilistic pixel membership optimization. The segmentation output was obtained by assigning pixels with foreground and background labels based on derived membership probabilities. We evaluated our approach on the PROMISE-12 dataset with 50 prostate MR image volumes. Our approach achieved a mean dice similarity coefficient (DSC) of 0.90  ±  0.02, which surpassed the five best prior-based methods in the PROMISE-12 segmentation challenge.
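
    Step (d) can be illustrated with a generic quadratic ("Dirichlet-type") energy whose minimum is obtained in closed form from a single linear solve; this is an analogue for illustration, not the paper's exact energy formulation or optimization.

```python
# Minimal sketch of closed-form quadratic energy minimization with a probabilistic
# data term: p holds hypothetical pixel foreground probabilities (as would come
# from the classifier in step (b)), L is the graph Laplacian of a 4-connected grid.
import numpy as np

def segment(p, lam=2.0):
    h, w = p.shape
    n = h * w
    idx = np.arange(n).reshape(h, w)
    L = np.zeros((n, n))
    for (ia, ib) in [(idx[:, :-1], idx[:, 1:]), (idx[:-1, :], idx[1:, :])]:
        for i, j in zip(ia.ravel(), ib.ravel()):
            L[i, i] += 1; L[j, j] += 1; L[i, j] -= 1; L[j, i] -= 1
    # Minimise lam*||x - p||^2 + x^T L x  =>  (lam*I + L) x = lam*p  (one linear solve)
    x = np.linalg.solve(lam * np.eye(n) + L, lam * p.ravel())
    return (x.reshape(h, w) > 0.5).astype(int)            # foreground/background labels

p = np.random.default_rng(0).random((8, 8))                # hypothetical membership probabilities
print(segment(p))
```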

  11. A combinatorial Bayesian and Dirichlet model for prostate MR image segmentation using probabilistic image features.

    PubMed

    Li, Ang; Li, Changyang; Wang, Xiuying; Eberl, Stefan; Feng, Dagan; Fulham, Michael

    2016-08-21

    Blurred boundaries and heterogeneous intensities make accurate prostate MR image segmentation problematic. To improve prostate MR image segmentation we suggest an approach that includes: (a) an image patch division method to partition the prostate into homogeneous segments for feature extraction; (b) an image feature formulation and classification method, using the relevance vector machine, to provide probabilistic prior knowledge for graph energy construction; (c) a graph energy formulation scheme with Bayesian priors and Dirichlet graph energy and (d) a non-iterative graph energy minimization scheme, based on matrix differentiation, to perform the probabilistic pixel membership optimization. The segmentation output was obtained by assigning pixels with foreground and background labels based on derived membership probabilities. We evaluated our approach on the PROMISE-12 dataset with 50 prostate MR image volumes. Our approach achieved a mean dice similarity coefficient (DSC) of 0.90  ±  0.02, which surpassed the five best prior-based methods in the PROMISE-12 segmentation challenge. PMID:27461085

  12. Geothermal probabilistic cost study

    NASA Astrophysics Data System (ADS)

    Orren, L. H.; Ziman, G. M.; Jones, S. C.; Lee, T. K.; Noll, R.; Wilde, L.; Sadanand, V.

    1981-08-01

    A tool is presented to quantify the risks of geothermal projects, the Geothermal Probabilistic Cost Model (GPCM). The GPCM model was used to evaluate a geothermal reservoir for a binary-cycle electric plant at Heber, California. Three institutional aspects of geothermal risk that can shift the risk among different agents were analyzed. The leasing of geothermal land, contracting between the producer and the user of the geothermal heat, and insurance against faulty performance were examined.

  13. Geothermal probabilistic cost study

    NASA Technical Reports Server (NTRS)

    Orren, L. H.; Ziman, G. M.; Jones, S. C.; Lee, T. K.; Noll, R.; Wilde, L.; Sadanand, V.

    1981-01-01

    A tool is presented to quantify the risks of geothermal projects, the Geothermal Probabilistic Cost Model (GPCM). The GPCM model was used to evaluate a geothermal reservoir for a binary-cycle electric plant at Heber, California. Three institutional aspects of geothermal risk that can shift the risk among different agents were analyzed. The leasing of geothermal land, contracting between the producer and the user of the geothermal heat, and insurance against faulty performance were examined.

  14. Geothermal probabilistic cost study

    SciTech Connect

    Orren, L.H.; Ziman, G.M.; Jones, S.C.; Lee, T.K.; Noll, R.; Wilde, L.; Sadanand, V.

    1981-08-01

    A tool is presented to quantify the risks of geothermal projects, the Geothermal Probabilistic Cost Model (GPCM). The GPCM model is used to evaluate a geothermal reservoir for a binary-cycle electric plant at Heber, California. Three institutional aspects of the geothermal risk which can shift the risk among different agents are analyzed. The leasing of geothermal land, contracting between the producer and the user of the geothermal heat, and insurance against faulty performance are examined. (MHR)

  15. Probabilistic Generative Models for the Statistical Inference of Unobserved Paleoceanographic Events: Application to Stratigraphic Alignment for Inference of Ages

    NASA Astrophysics Data System (ADS)

    Lawrence, C.; Lin, L.; Lisiecki, L. E.; Khider, D.

    2014-12-01

    The broad goal of this presentation is to demonstrate the utility of probabilistic generative models for capturing investigators' knowledge of geological processes and proxy data in order to draw statistical inferences about unobserved paleoclimatological events. We illustrate how this approach forces investigators to be explicit about their assumptions, and how probability theory yields results that are a mathematical consequence of these assumptions and the data. We illustrate these ideas with the HMM-Match model, which infers common times of sediment deposition in two records and reports the uncertainty in these inferences in the form of confidence bands. HMM-Match models the sedimentation processes that led to proxy data measured in marine sediment cores. This Bayesian model has three components: 1) a generative probabilistic model that proceeds from the underlying geophysical and geochemical events, specifically the sedimentation events, to the generation of the proxy data (Sedimentation → Proxy Data); 2) a recursive algorithm that reverses the logic of the model to yield inferences about the unobserved sedimentation events and the associated alignment of the records based on the proxy data (Proxy Data → Sedimentation/Alignment); 3) an expectation-maximization algorithm for estimating two unknown parameters. We applied HMM-Match to align 35 Late Pleistocene records to a global benthic δ18O stack and found that the mean width of the 95% confidence intervals varies between 3 and 23 kyr depending on the resolution and noisiness of each core's δ18O signal. Confidence bands within individual cores also vary greatly, ranging from ~0 to >40 kyr. Results from this algorithm will allow researchers to examine the robustness of their conclusions with respect to alignment uncertainty. Figure 1 shows the confidence bands for one low-resolution record.
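
    The sketch below illustrates only the final step of reading 95% confidence bands off a posterior alignment matrix; the matrix itself would come from HMM-Match's recursive (forward-backward) algorithm, which is not reproduced here, and the posterior used below is synthetic.

```python
# Minimal sketch: derive 95% confidence bands from a posterior alignment matrix
# P(depth i aligns to stack age j); the matrix here is synthetic, not HMM-Match output.
import numpy as np

rng = np.random.default_rng(3)
ages = np.arange(0, 150, 1.0)                 # stack ages in kyr
n_depths = 40
# Hypothetical posterior: a drifting ridge with depth-varying spread
post = np.exp(-0.5 * ((ages[None, :] - np.linspace(5, 120, n_depths)[:, None])
                      / rng.uniform(2, 10, n_depths)[:, None]) ** 2)
post /= post.sum(axis=1, keepdims=True)

cdf = post.cumsum(axis=1)
lo = ages[np.argmax(cdf >= 0.025, axis=1)]    # 2.5th-percentile age at each depth
hi = ages[np.argmax(cdf >= 0.975, axis=1)]    # 97.5th-percentile age at each depth
print("mean 95% band width (kyr):", (hi - lo).mean().round(1))
```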

  16. Connection between Dirichlet distributions and a scale-invariant probabilistic model based on Leibniz-like pyramids

    NASA Astrophysics Data System (ADS)

    Rodríguez, A.; Tsallis, C.

    2014-12-01

    We show that the N → ∞ limiting probability distributions of a recently introduced family of d-dimensional scale-invariant probabilistic models based on Leibniz-like (d + 1)-dimensional hyperpyramids (Rodríguez and Tsallis 2012 J. Math. Phys. 53 023302) are given by Dirichlet distributions for d = 1, 2, …. It was formerly proved by Rodríguez et al. that, for the one-dimensional case (d = 1), the corresponding limiting distributions are q-Gaussians (∝ e_q^(-βx²), with e_1^(-βx²) = e^(-βx²)). The Dirichlet distributions generalize the so-called Beta distributions to higher dimensions. Consistently, we make a connection between one-dimensional q-Gaussians and Beta distributions via a linear transformation. In addition, we discuss the probabilistically admissible region of the parameters q and β defining a normalizable q-Gaussian, focusing particularly on the possibility of having both bell-shaped and U-shaped q-Gaussians, the latter corresponding, in an appropriate physical interpretation, to negative temperatures.
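
    For reference, the standard definitions behind this notation (well-known formulas, not results of the paper itself) are the q-exponential, the q-Gaussian, and the Dirichlet density that generalizes the Beta distribution:

```latex
% Standard definitions (not specific to this paper): q-exponential, q-Gaussian,
% and the Dirichlet density generalizing the Beta distribution.
\[
  e_q^{x} \;=\; \bigl[\,1 + (1-q)\,x\,\bigr]_{+}^{\frac{1}{1-q}},
  \qquad e_1^{x} = e^{x},
  \qquad
  p_q(x) \;\propto\; e_q^{-\beta x^{2}},
\]
\[
  f(x_1,\dots,x_d) \;=\;
  \frac{\Gamma\!\bigl(\sum_{i=1}^{d+1}\alpha_i\bigr)}{\prod_{i=1}^{d+1}\Gamma(\alpha_i)}
  \Bigl(\prod_{i=1}^{d} x_i^{\alpha_i-1}\Bigr)
  \Bigl(1-\sum_{i=1}^{d} x_i\Bigr)^{\alpha_{d+1}-1},
\]
% which for d = 1 reduces to the Beta density.
```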

  17. Probabilistic brains: knowns and unknowns

    PubMed Central

    Pouget, Alexandre; Beck, Jeffrey M; Ma, Wei Ji; Latham, Peter E

    2015-01-01

    There is strong behavioral and physiological evidence that the brain both represents probability distributions and performs probabilistic inference. Computational neuroscientists have started to shed light on how these probabilistic representations and computations might be implemented in neural circuits. One particularly appealing aspect of these theories is their generality: they can be used to model a wide range of tasks, from sensory processing to high-level cognition. To date, however, these theories have only been applied to very simple tasks. Here we discuss the challenges that will emerge as researchers start focusing their efforts on real-life computations, with a focus on probabilistic learning, structural learning and approximate inference. PMID:23955561

  18. Probabilistic Design of Composite Structures

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2006-01-01

    A formal procedure for the probabilistic design evaluation of a composite structure is described. The uncertainties in all aspects of a composite structure (constituent material properties, fabrication variables, structural geometry, service environments, etc.), which result in uncertain composite structural responses, are included in the evaluation. The probabilistic evaluation consists of: (1) design criteria, (2) modeling of composite structures and uncertainties, (3) simulation methods, and (4) the decision-making process. A sample case is presented to illustrate the formal procedure and to demonstrate that composite structural designs can be probabilistically evaluated with accuracy and efficiency.
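
    A generic Monte Carlo illustration of step (3) is sketched below: constituent and fabrication uncertainties are propagated to a structural response through the rule of mixtures. This is not the formal NASA procedure, and all distributions and the design criterion are hypothetical.

```python
# Minimal Monte Carlo sketch of propagating constituent-property uncertainty to a
# composite response; rule of mixtures is used for the longitudinal modulus.
import numpy as np

rng = np.random.default_rng(7)
n = 100_000
E_f = rng.normal(230e9, 10e9, n)        # fiber modulus (Pa), hypothetical scatter
E_m = rng.normal(3.5e9, 0.3e9, n)       # matrix modulus (Pa)
V_f = rng.normal(0.60, 0.03, n)         # fiber volume fraction (fabrication variable)

E_c = V_f * E_f + (1.0 - V_f) * E_m     # rule-of-mixtures longitudinal modulus

design_minimum = 120e9                  # hypothetical design criterion (Pa)
print("P(E_c < design minimum) =", np.mean(E_c < design_minimum))
print("5th-percentile modulus (GPa):", np.percentile(E_c, 5) / 1e9)
```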

  19. Choices and changes: Eccles' Expectancy-Value model and upper-secondary school students' longitudinal reflections about their choice of a STEM education

    NASA Astrophysics Data System (ADS)

    Lykkegaard, Eva; Ulriksen, Lars

    2016-03-01

    During the past 30 years, Eccles' comprehensive social-psychological Expectancy-Value Model of Motivated Behavioural Choices (EV-MBC model) has proven suitable for studying educational choices related to Science, Technology, Engineering and/or Mathematics (STEM). The reflections of 15 students in their last year of upper-secondary school concerning their choice of tertiary education were examined using quantitative EV-MBC surveys and repeated qualitative interviews. This article presents analyses of three cases in detail. The analytical focus was whether the factors indicated in the EV-MBC model could be used to detect significant changes in the students' educational choice processes. An important finding was that the quantitative EV-MBC surveys and the qualitative interviews gave quite different results concerning the students' considerations about the choice of tertiary education, and that significant changes in the students' reflections were not captured by the factors of the EV-MBC model. This calls the validity of the EV-MBC surveys into question. Moreover, the quantitative factors from the EV-MBC model did not sufficiently explain students' dynamic educational choice processes, in which students considered several different potential educational trajectories in parallel. We therefore call for further studies of the EV-MBC model's use in describing longitudinal choice processes and especially in investigating significant changes.

  20. A conceptual model for determining career choice of CHROME alumna based on Farmer's conceptual models

    NASA Astrophysics Data System (ADS)

    Moore, Lisa Simmons

    This qualitative program evaluation examines the career decision-making processes and career choices of nine, African American women who participated in the Cooperating Hampton Roads Organization for Minorities in Engineering (CHROME) and who graduated from urban, rural or suburban high schools in the year 2000. The CHROME program is a nonprofit, pre-college intervention program that encourages underrepresented minority and female students to enter science, technically related, engineering, and math (STEM) career fields. The study describes career choices and decisions made by each participant over a five-year period since high school graduation. Data was collected through an Annual Report, Post High School Questionnaires, Environmental Support Questionnaires, Career Choice Questionnaires, Senior Reports, and standardized open-ended interviews. Data was analyzed using a model based on Helen C. Farmer's Conceptual Models, John Ogbu's Caste Theory and Feminist Theory. The CHROME program, based on its stated goals and tenets, was also analyzed against study findings. Findings indicated that participants received very low levels of support from counselors and teachers to pursue STEM careers and high levels of support from parents and family, the CHROME program and financial backing. Findings of this study also indicated that the majority of CHROME alumna persisted in STEM careers. The most successful participants, in terms of undergraduate degree completion and occupational prestige, were the African American women who remained single, experienced no critical incidents, came from a middle class to upper middle class socioeconomic background, and did not have children.

  1. NasoNet, modeling the spread of nasopharyngeal cancer with networks of probabilistic events in discrete time.

    PubMed

    Galán, S F; Aguado, F; Díez, F J; Mira, J

    2002-07-01

    The spread of cancer is a non-deterministic dynamic process. As a consequence, the design of an assistant system for the diagnosis and prognosis of the extent of a cancer should be based on a representation method that deals with both uncertainty and time. The ultimate goal is to know the stage of development of a cancer in a patient before selecting the appropriate treatment. A network of probabilistic events in discrete time (NPEDT) is a type of Bayesian network for temporal reasoning that models the causal mechanisms associated with the time evolution of a process. This paper describes NasoNet, a system that applies NPEDTs to the diagnosis and prognosis of nasopharyngeal cancer. We have made use of temporal noisy gates to model the dynamic causal interactions that take place in the domain. The methodology we describe is general enough to be applied to any other type of cancer.
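
    Temporal noisy gates generalize the standard noisy-OR combination rule to discrete time. The sketch below shows only the atemporal noisy-OR rule with hypothetical link probabilities; it is not NasoNet's actual temporal model.

```python
# Minimal sketch of the standard noisy-OR combination rule that temporal noisy
# gates generalize; this is not NasoNet's discrete-time formulation.
def noisy_or(parent_present, link_probs, leak=0.0):
    """P(effect | parents): each present parent independently causes the effect
    with its link probability; `leak` covers causes outside the model."""
    p_none = 1.0 - leak
    for present, p in zip(parent_present, link_probs):
        if present:
            p_none *= (1.0 - p)
    return 1.0 - p_none

# Hypothetical example: two causes present, one absent
print(noisy_or([True, True, False], link_probs=[0.6, 0.3, 0.8], leak=0.05))
```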

  2. Human risky choice under temporal constraints: tests of an energy-budget model.

    PubMed Central

    Pietras, Cynthia J; Locey, Matthew L; Hackenberg, Timothy D

    2003-01-01

    Risk-sensitive foraging models predict that choice between fixed and variable food delays should be influenced by an organism's energy budget. To investigate whether the predictions of these models could be extended to choice in humans, risk sensitivity in 4 adults was investigated under laboratory conditions designed to model positive and negative energy budgets. Subjects chose between fixed and variable trial durations with the same mean value. An energy requirement was modeled by requiring that five trials be completed within a limited time period for points delivered at the end of the period (block of trials) to be exchanged later for money. Manipulating the duration of this time period generated positive and negative earnings budgets (or, alternatively, "time budgets"). Choices were consistent with the predictions of energy-budget models: The fixed-delay option was strongly preferred under positive earnings-budget conditions and the variable-delay option was strongly preferred under negative earnings-budget conditions. Within-block (or trial-by-trial) choices were also frequently consistent with the predictions of a dynamic optimization model, indicating that choice was simultaneously sensitive to the temporal requirements, delays associated with fixed and variable choices on the upcoming trial, cumulative delays within the block of trials, and trial position within a block. PMID:13677609
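
    Why the budget manipulation reverses preference can be seen in a small simulation, shown below: with equal mean delays, only the variable option can ever beat a deadline that the fixed option cannot meet. The delays, deadline, and trial counts are hypothetical, not those of the experiment.

```python
# Minimal sketch of the energy-budget (here, time-budget) logic with hypothetical values.
import numpy as np

rng = np.random.default_rng(11)

def p_success(delays_per_trial, budget, n_trials=5, n_sims=100_000):
    """Probability that n_trials trials finish within the time budget."""
    totals = delays_per_trial(size=(n_sims, n_trials)).sum(axis=1)
    return np.mean(totals <= budget)

fixed = lambda size: np.full(size, 10.0)                    # fixed 10-s delays
variable = lambda size: rng.choice([2.0, 18.0], size=size)  # same 10-s mean, high variance

for budget, label in [(60.0, "positive budget"), (40.0, "negative budget")]:
    print(label, "- fixed:", p_success(fixed, budget),
          "variable:", round(p_success(variable, budget), 3))
# Fixed wins under the positive budget; only the variable option ever succeeds
# under the negative budget, matching the risk-sensitivity prediction.
```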

  3. A semi-probabilistic modelling approach for the estimation of dietary exposure to phthalates in the Belgian adult population.

    PubMed

    Fierens, T; Standaert, A; Cornelis, C; Sioen, I; De Henauw, S; Willems, H; Bellemans, M; De Maeyer, M; Van Holderbeke, M

    2014-12-01

    In this study, a semi-probabilistic modelling approach was applied for the estimation of the long-term human dietary exposure to phthalates--one of the world's most widely used families of plasticisers. Four phthalate compounds were considered: diethyl phthalate (DEP), di-n-butyl phthalate (DnBP), benzylbutyl phthalate (BBP) and di(2-ethylhexyl) phthalate (DEHP). Intake estimates were calculated for the Belgian adult population and several subgroups of this population for two scenarios using an extended version of the EN-forc model. The highest intake rates were found for DEHP, followed by DnBP, BBP and DEP. In the Belgian adult population, men and young adults generally had the highest dietary phthalate intake estimates. Nevertheless, predicted dietary intake rates for all four investigated phthalates were far below the corresponding tolerable daily intake (TDI) values (i.e. P99 intake values were 6.4% of the TDI at most), which is reassuring because adults are also exposed to phthalates via other contamination pathways (e.g. dust ingestion and inhalation). The food groups contributing most to the dietary exposure were grains and grain-based products for DEP, milk and dairy products for DnBP, meat and meat products or grains and grain-based products (depending on the scenario) for BBP and meat and meat products for DEHP. Comparison of the predicted intake results based on modelled phthalate concentrations in food products with intake estimates from other surveys (mostly based on measured concentrations) showed that the extended version of the EN-forc model is a suitable semi-probabilistic tool for the estimation and evaluation of the long-term dietary intake of phthalates in humans.
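
    The semi-probabilistic logic can be sketched as follows (this is not the EN-forc model): fixed modelled concentrations per food group are combined with sampled consumption amounts and body weights, and an upper percentile of the resulting intake distribution is compared with a TDI. All numbers, including the TDI, are hypothetical.

```python
# Minimal semi-probabilistic sketch of long-term dietary intake estimation;
# fixed modelled concentrations, sampled consumption and body weight.
import numpy as np

rng = np.random.default_rng(5)
n = 100_000

concentration = {"grains": 0.15, "dairy": 0.10, "meat": 0.25}    # mg/kg food (modelled, hypothetical)
consumption = {                                                  # kg/day, sampled
    "grains": rng.lognormal(np.log(0.20), 0.4, n),
    "dairy":  rng.lognormal(np.log(0.25), 0.5, n),
    "meat":   rng.lognormal(np.log(0.15), 0.5, n),
}
body_weight = rng.normal(75.0, 12.0, n)                          # kg

intake_mg = sum(concentration[f] * consumption[f] for f in concentration) / body_weight
intake_ug = intake_mg * 1000.0                                   # µg/kg bw/day

tdi = 50.0                                                       # hypothetical TDI, µg/kg bw/day
print("P99 intake as % of TDI:", round(np.percentile(intake_ug, 99) / tdi * 100, 1))
```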

  4. The Role of Intent in Ethical Decision Making: The Ethical Choice Model

    ERIC Educational Resources Information Center

    King, Christine; Powell, Toni

    2007-01-01

    This paper reviews the major theories, studies and models concerning ethical decision making in organizations. The authors drew upon Jones' Model (1991) as the foundation for their Ethical Choice Model, which is designed to further clarify the ethical decision making process as it relates to the construct of intentionality. The model, illustrated…

  5. Monte Carlo probabilistic sensitivity analysis for patient level simulation models: efficient estimation of mean and variance using ANOVA.

    PubMed

    O'Hagan, Anthony; Stevenson, Matt; Madan, Jason

    2007-10-01

    Probabilistic sensitivity analysis (PSA) is required to account for uncertainty in cost-effectiveness calculations arising from health economic models. The simplest way to perform PSA in practice is by Monte Carlo methods, which involves running the model many times using randomly sampled values of the model inputs. However, this can be impractical when the economic model takes appreciable amounts of time to run. This situation arises, in particular, for patient-level simulation models (also known as micro-simulation or individual-level simulation models), where a single run of the model simulates the health care of many thousands of individual patients. The large number of patients required in each run to achieve accurate estimation of cost-effectiveness means that only a relatively small number of runs is possible. For this reason, it is often said that PSA is not practical for patient-level models. We develop a way to reduce the computational burden of Monte Carlo PSA for patient-level models, based on the algebra of analysis of variance. Methods are presented to estimate the mean and variance of the model output, with formulae for determining optimal sample sizes. The methods are simple to apply and will typically reduce the computational demand very substantially.
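
    The variance correction rests on the law of total variance: the raw variance of the per-draw means overstates parameter uncertainty by the patient-level Monte Carlo noise divided by the inner sample size. A minimal sketch with a hypothetical patient-level model (not the authors' full method) is given below.

```python
# Minimal sketch of the ANOVA idea behind efficient PSA for patient-level models:
# N outer parameter draws, n inner patients per draw, ANOVA correction of variance.
import numpy as np

rng = np.random.default_rng(9)

def toy_patient_model(theta, n_patients):
    """Hypothetical net benefit per simulated patient given parameter draw theta."""
    return rng.normal(theta, 5_000.0, size=n_patients)

N, n = 200, 500                                   # outer PSA draws, inner patients per draw
thetas = rng.normal(10_000.0, 2_000.0, size=N)    # sampled model inputs
sims = np.array([toy_patient_model(t, n) for t in thetas])

group_means = sims.mean(axis=1)
within_var = sims.var(axis=1, ddof=1).mean()      # estimate of E_theta[Var(Y | theta)]
between_var = group_means.var(ddof=1) - within_var / n   # ANOVA-corrected Var_theta(E[Y | theta])

print("naive between-draw variance:", round(group_means.var(ddof=1)))
print("ANOVA-corrected parameter-uncertainty variance:", round(between_var))
```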

  6. Dynamic modeling of physical phenomena for probabilistic risk assessments using artificial neural networks

    SciTech Connect

    Benjamin, A.S.; Paez, T.L.; Brown, N.N.

    1998-01-01

    In most probabilistic risk assessments, there is a subset of accident scenarios that involves physical challenges to the system, such as high heat rates and/or accelerations. The system's responses to these challenges may be complicated, and their prediction may require the use of long-running computer codes. To deal with the many scenarios demanded by a risk assessment, the authors have been investigating the use of artificial neural networks (ANNs) as a fast-running estimation tool. They have developed a multivariate linear spline algorithm by extending previous ANN methods that use radial basis functions. They have applied the algorithm to problems involving fires, shocks, and vibrations. They have found that within the parameter range for which it is trained, the algorithm can simulate the nonlinear responses of complex systems with high accuracy. Running times per case are less than one second.
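
    A basic radial-basis-function surrogate of the kind the spline algorithm extends can be sketched as follows; it is a plain Gaussian-RBF interpolant with hypothetical training data, not the authors' multivariate linear spline method.

```python
# Minimal sketch of a radial-basis-function surrogate for a long-running code:
# fit interpolation weights on training runs, then predict new cases instantly.
import numpy as np

def fit_rbf(X, y, length_scale=1.0):
    """Fit weights for Gaussian RBFs centred on the training points."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    Phi = np.exp(-d2 / (2 * length_scale**2))
    return np.linalg.solve(Phi + 1e-10 * np.eye(len(X)), y)   # small jitter for stability

def predict_rbf(X_train, w, X_new, length_scale=1.0):
    d2 = ((X_new[:, None, :] - X_train[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * length_scale**2)) @ w

# Hypothetical training data standing in for expensive code runs: response vs two inputs
rng = np.random.default_rng(2)
X = rng.uniform(0, 1, size=(50, 2))
y = np.sin(3 * X[:, 0]) + X[:, 1] ** 2
w = fit_rbf(X, y)
X_new = rng.uniform(0, 1, size=(3, 2))
print(predict_rbf(X, w, X_new), "vs true", np.sin(3 * X_new[:, 0]) + X_new[:, 1] ** 2)
```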

  7. Effect of reflectance model choice on earthshine-based terrestrial albedo determinations.

    NASA Astrophysics Data System (ADS)

    Thejll, Peter; Gleisner, Hans; Flynn, Chris

    2016-04-01

    Earthshine observations can be used to determine near-hemispheric average terrestrial albedos by carefully measuring the relative brightness of the earthshine-lit half of the Moon, combined with correct modelling of the reflectances of Earth and Moon and with lunar single-scattering albedo maps. Using our own observations of the earthshine, made from Mauna Loa Observatory in 2011-12, we investigate the influence of the choice of bidirectional reflectance (BRDF) models for the Moon on the derived terrestrial albedos. We find that the derived albedos depend considerably on this choice, and we discuss ways to determine the origin of the dependence: is it in the joint choice of lunar and terrestrial BRDFs, or is the terrestrial BRDF less important than the lunar one? We report results from modelling lunar reflectance and albedo in six ways and terrestrial reflectance in two ways, assuming a uniform single-scattering albedo on Earth.

  8. Model Verification and Validation Concepts for a Probabilistic Fracture Assessment Model to Predict Cracking of Knife Edge Seals in the Space Shuttle Main Engine High Pressure Oxidizer

    NASA Technical Reports Server (NTRS)