Science.gov

Sample records for probabilistic choice models

  1. Probabilistic choice models in health-state valuation research: background, theories, assumptions and applications.

    PubMed

    Arons, Alexander M M; Krabbe, Paul F M

    2013-02-01

    Interest is rising in measuring subjective health outcomes, such as treatment outcomes that are not directly quantifiable (functional disability, symptoms, complaints, side effects and health-related quality of life). Health economists in particular have applied probabilistic choice models in the area of health evaluation. They increasingly use discrete choice models based on random utility theory to derive values for healthcare goods or services. Recent attempts have been made to use discrete choice models as an alternative method to derive values for health states. In this article, various probabilistic choice models are described according to their underlying theory. A historical overview traces their development and applications in diverse fields. The discussion highlights some theoretical and technical aspects of the choice models and their similarities and dissimilarities. The objective of the article is to elucidate the position of each model and its applications for health-state valuation.
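
    The discrete choice models referred to here are typically estimated as multinomial logit models under random utility theory. As a rough illustration (not taken from the article itself), the sketch below computes logit choice probabilities for a few hypothetical health states; the utility values are assumptions made purely for demonstration.

```python
import numpy as np

def logit_choice_probabilities(utilities, scale=1.0):
    """Multinomial logit choice probabilities under random utility theory.

    Alternative i is chosen with probability
    P(i) = exp(V_i / scale) / sum_j exp(V_j / scale),
    where V_i is the deterministic part of its utility.
    """
    v = np.asarray(utilities, dtype=float) / scale
    v -= v.max()                      # subtract the max for numerical stability
    expv = np.exp(v)
    return expv / expv.sum()

# Hypothetical utilities for three health states (illustrative values only).
print(logit_choice_probabilities([-0.2, 0.5, 1.1]))
```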

  2. The Probabilistic Nature of Preferential Choice

    ERIC Educational Resources Information Center

    Rieskamp, Jorg

    2008-01-01

    Previous research has developed a variety of theories explaining when and why people's decisions under risk deviate from the standard economic view of expected utility maximization. These theories are limited in their predictive accuracy in that they do not explain the probabilistic nature of preferential choice, that is, why an individual makes…

  3. Probabilistic Model Development

    NASA Technical Reports Server (NTRS)

    Adam, James H., Jr.

    2010-01-01

    Objective: Develop a Probabilistic Model for the Solar Energetic Particle Environment. Develop a tool to provide a reference solar particle radiation environment that: 1) Will not be exceeded at a user-specified confidence level; 2) Will provide reference environments for: a) Peak flux; b) Event-integrated fluence; and c) Mission-integrated fluence. The reference environments will consist of elemental energy spectra for protons, helium and heavier ions.

  4. Probabilistic Mesomechanical Fatigue Model

    NASA Technical Reports Server (NTRS)

    Tryon, Robert G.

    1997-01-01

    A probabilistic mesomechanical fatigue life model is proposed to link the microstructural material heterogeneities to the statistical scatter in the macrostructural response. The macrostructure is modeled as an ensemble of microelements. Cracks nucleate within the microelements and grow from the microelements to final fracture. Variations of the microelement properties are defined using statistical parameters. A micromechanical slip band decohesion model is used to determine the crack nucleation life and size. A crack tip opening displacement model is used to determine the small crack growth life and size. The Paris law is used to determine the long crack growth life. The models are combined in a Monte Carlo simulation to determine the statistical distribution of total fatigue life for the macrostructure. The modeled response is compared to trends in experimental observations from the literature.

  5. Firm competition in a probabilistic framework of consumer choice

    NASA Astrophysics Data System (ADS)

    Liao, Hao; Xiao, Rui; Chen, Duanbing; Medo, Matúš; Zhang, Yi-Cheng

    2014-04-01

    We develop a probabilistic consumer choice framework based on information asymmetry between consumers and firms. This framework makes it possible to study market competition among several firms on both the quality and the price of their products. We find Nash market equilibria and other optimal strategies in various situations ranging from competition of two identical firms to firms of different sizes and firms which improve their efficiency.

  6. A Discounting Framework for Choice With Delayed and Probabilistic Rewards

    PubMed Central

    Green, Leonard; Myerson, Joel

    2005-01-01

    When choosing between delayed or uncertain outcomes, individuals discount the value of such outcomes on the basis of the expected time to or the likelihood of their occurrence. In an integrative review of the expanding experimental literature on discounting, the authors show that although the same form of hyperbola-like function describes discounting of both delayed and probabilistic outcomes, a variety of recent findings are inconsistent with a single-process account. The authors also review studies that compare discounting in different populations and discuss the theoretical and practical implications of the findings. The present effort illustrates the value of studying choice involving both delayed and probabilistic outcomes within a general discounting framework that uses similar experimental procedures and a common analytical approach. PMID:15367080
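
    The hyperbola-like discounting function discussed in this review is commonly written as V = A/(1 + kD) for a delay D and V = A/(1 + hΘ) for the odds against winning, Θ = (1 − p)/p. The sketch below is a minimal illustration of these two forms; the parameter values are assumptions, not the authors' estimates.

```python
def delay_discounted_value(amount, delay, k=0.05):
    """Hyperbolic discounting of a delayed reward: V = A / (1 + k*D)."""
    return amount / (1.0 + k * delay)

def probability_discounted_value(amount, p, h=1.0):
    """Hyperbola-like discounting of a probabilistic reward.

    Theta = (1 - p) / p is the odds against receiving the reward.
    """
    theta = (1.0 - p) / p
    return amount / (1.0 + h * theta)

print(delay_discounted_value(100, delay=30))      # value of $100 delayed by 30 days
print(probability_discounted_value(100, p=0.5))   # value of a 50% chance of $100
```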

  7. Probabilistic choice between symmetric disparities in motion stereo matching for a lateral navigation system

    NASA Astrophysics Data System (ADS)

    Ershov, Egor; Karnaukhov, Victor; Mozerov, Mikhail

    2016-02-01

    Two consecutive frames of a lateral navigation camera video sequence can be considered as an appropriate approximation to epipolar stereo. To overcome edge-aware inaccuracy caused by occlusion, we propose a model that matches the current frame to the next and to the previous ones. The positive disparity of matching to the previous frame has its symmetric negative disparity to the next frame. The proposed algorithm performs a probabilistic choice for each matched pixel between the positive disparity and its symmetric disparity cost. A disparity map obtained by optimization over the cost volume composed of the proposed probabilistic choice is more accurate than the traditional left-to-right and right-to-left disparity maps cross-check. Also, our algorithm requires half as many computational operations per pixel as the cross-check technique. The effectiveness of our approach is demonstrated on synthetic data and real video sequences with ground-truth values.

  8. The effects of the previous outcome on probabilistic choice in rats.

    PubMed

    Marshall, Andrew T; Kirkpatrick, Kimberly

    2013-01-01

    This study examined the effects of previous outcomes on subsequent choices in a probabilistic-choice task. Twenty-four rats were trained to choose between a certain outcome (1 or 3 pellets) versus an uncertain outcome (3 or 9 pellets), delivered with a probability of .1, .33, .67, and .9 in different phases. Uncertain outcome choices increased with the probability of uncertain food. Additionally, uncertain choices increased with the probability of uncertain food following both certain-choice outcomes and unrewarded uncertain choices. However, following uncertain-choice food outcomes, there was a tendency to choose the uncertain outcome in all cases, indicating that the rats continued to "gamble" after successful uncertain choices, regardless of the overall probability or magnitude of food. A subsequent manipulation, in which the probability of uncertain food varied within each session as a function of the previous uncertain outcome, examined how the previous outcome and probability of uncertain food affected choice in a dynamic environment. Uncertain-choice behavior increased with the probability of uncertain food. The rats exhibited increased sensitivity to probability changes and a greater degree of win-stay/lose-shift behavior than in the static phase. Simulations of two sequential choice models were performed to explore the possible mechanisms of reward value computations. The simulation results supported an exponentially decaying value function that updated as a function of trial (rather than time). These results emphasize the importance of analyzing global and local factors in choice behavior and suggest avenues for the future development of sequential-choice models.
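
    The winning model in the authors' simulations, an exponentially decaying value function updated per trial, is essentially an exponentially weighted average of past outcomes. The sketch below illustrates that idea in a certain-versus-uncertain choice task; the learning rate, softmax temperature and reward schedule are illustrative assumptions, not the parameters fitted in the study.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_choice(n_trials=1000, alpha=0.2, beta=3.0,
                    certain=3.0, uncertain=9.0, p_uncertain=0.33):
    """Simulate certain-vs-uncertain choice with trial-wise exponential value decay.

    V is updated after each trial as V <- V + alpha * (reward - V), so older
    outcomes decay exponentially with the number of intervening trials.
    Choices use a softmax over the two current values.
    """
    V = np.array([certain, uncertain * p_uncertain])   # [certain, uncertain]
    uncertain_choices = 0
    for _ in range(n_trials):
        p_choose = np.exp(beta * V) / np.exp(beta * V).sum()
        choice = rng.choice(2, p=p_choose)
        if choice == 0:
            reward = certain
        else:
            reward = uncertain if rng.random() < p_uncertain else 0.0
            uncertain_choices += 1
        V[choice] += alpha * (reward - V[choice])
    return uncertain_choices / n_trials

print(simulate_choice())   # proportion of uncertain choices in the simulation
```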

  9. Effects of Time between Trials on Rats' and Pigeons' Choices with Probabilistic Delayed Reinforcers

    ERIC Educational Resources Information Center

    Mazur, James E.; Biondi, Dawn R.

    2011-01-01

    Parallel experiments with rats and pigeons examined reasons for previous findings that in choices with probabilistic delayed reinforcers, rats' choices were affected by the time between trials whereas pigeons' choices were not. In both experiments, the animals chose between a standard alternative and an adjusting alternative. A choice of the…

  10. Relative Gains, Losses, and Reference Points in Probabilistic Choice in Rats

    PubMed Central

    Marshall, Andrew T.; Kirkpatrick, Kimberly

    2015-01-01

    Theoretical reference points have been proposed to differentiate probabilistic gains from probabilistic losses in humans, but such a phenomenon in non-human animals has yet to be thoroughly elucidated. Three experiments evaluated the effect of reward magnitude on probabilistic choice in rats, seeking to determine reference point use by examining the effect of previous outcome magnitude(s) on subsequent choice behavior. Rats were trained to choose between an outcome that always delivered reward (low-uncertainty choice) and one that probabilistically delivered reward (high-uncertainty). The probability of high-uncertainty outcome receipt and the magnitudes of low-uncertainty and high-uncertainty outcomes were manipulated within and between experiments. Both the low- and high-uncertainty outcomes involved variable reward magnitudes, so that either a smaller or larger magnitude was probabilistically delivered, as well as reward omission following high-uncertainty choices. In Experiments 1 and 2, the between groups factor was the magnitude of the high-uncertainty-smaller (H-S) and high-uncertainty-larger (H-L) outcome, respectively. The H-S magnitude manipulation differentiated the groups, while the H-L magnitude manipulation did not. Experiment 3 showed that manipulating the probability of differential losses as well as the expected value of the low-uncertainty choice produced systematic effects on choice behavior. The results suggest that the reference point for probabilistic gains and losses was the expected value of the low-uncertainty choice. Current theories of probabilistic choice behavior have difficulty accounting for the present results, so an integrated theoretical framework is proposed. Overall, the present results have implications for understanding individual differences and corresponding underlying mechanisms of probabilistic choice behavior. PMID:25658448

  11. Probabilistic Survivability Versus Time Modeling

    NASA Technical Reports Server (NTRS)

    Joyner, James J., Sr.

    2015-01-01

    This technical paper documents Kennedy Space Center's Independent Assessment team work completed on three assessments for the Ground Systems Development and Operations (GSDO) Program to assist the Chief Safety and Mission Assurance Officer (CSO) and GSDO management during key programmatic reviews. The assessments provided the GSDO Program with an analysis of how egress time affects the likelihood of astronaut and worker survival during an emergency. For each assessment, the team developed probability distributions for hazard scenarios to address statistical uncertainty, resulting in survivability plots over time. The first assessment developed a mathematical model of probabilistic survivability versus time to reach a safe location using an ideal Emergency Egress System at Launch Complex 39B (LC-39B); the second used the first model to evaluate and compare various egress systems under consideration at LC-39B. The third used a modified LC-39B model to determine if a specific hazard decreased survivability more rapidly than other events during flight hardware processing in Kennedy's Vehicle Assembly Building (VAB). Based on the composite survivability versus time graphs from the first two assessments, there was a soft knee in the Figure of Merit graphs at eight minutes (ten minutes after egress was ordered). Thus, the graphs illustrated to the decision makers that the final emergency egress design selected should have the capability of transporting the flight crew from the top of LC-39B to a safe location in eight minutes or less. Results for the third assessment were dominated by hazards that were classified as instantaneous in nature (e.g. stacking mishaps) and therefore had no effect on survivability versus time to egress the VAB. VAB emergency scenarios that degraded over time (e.g. fire) produced survivability versus time graphs that were in line with aerospace industry norms.

  12. Probabilistic Modeling of Rosette Formation

    PubMed Central

    Long, Mian; Chen, Juan; Jiang, Ning; Selvaraj, Periasamy; McEver, Rodger P.; Zhu, Cheng

    2006-01-01

    Rosetting, or forming a cell aggregate between a single target nucleated cell and a number of red blood cells (RBCs), is a simple assay for cell adhesion mediated by specific receptor-ligand interaction. For example, rosette formation between sheep RBCs and human lymphocytes has been used to differentiate T cells from B cells. The rosetting assay is commonly used to determine the interaction of Fcγ receptors (FcγR) expressed on inflammatory cells and IgG coated on RBCs. Despite its wide use in measuring cell adhesion, the biophysical parameters of rosette formation have not been well characterized. Here we developed a probabilistic model to describe the distribution of rosette sizes, which is Poissonian. The average rosette size is predicted to be proportional to the apparent two-dimensional binding affinity of the interacting receptor-ligand pair and their site densities. The model has been supported by experiments of rosettes mediated by four molecular interactions: FcγRIII interacting with IgG, T cell receptor and coreceptor CD8 interacting with antigen peptide presented by major histocompatibility molecule, P-selectin interacting with P-selectin glycoprotein ligand 1 (PSGL-1), and L-selectin interacting with PSGL-1. The latter two are structurally similar and are different from the former two. Fitting the model to data enabled us to evaluate the apparent effective two-dimensional binding affinity of the interacting molecular pairs: 7.19 × 10⁻⁵ μm⁴ for the FcγRIII-IgG interaction, 4.66 × 10⁻³ μm⁴ for the P-selectin-PSGL-1 interaction, and 0.94 × 10⁻³ μm⁴ for the L-selectin-PSGL-1 interaction. These results elucidate the biophysical mechanism of rosette formation and enable it to become a semiquantitative assay that relates the rosette size to the effective affinity for receptor-ligand binding. PMID:16603493
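
    Because the model predicts a Poisson distribution of rosette sizes whose mean is proportional to the two-dimensional affinity and the site densities, an apparent affinity can be recovered simply by inverting the mean. The sketch below assumes a specific proportionality form (contact area times affinity times the two site densities) and invented densities; it is an illustration of the idea, not the authors' calibration.

```python
import numpy as np
from scipy.stats import poisson

def mean_rosette_size(affinity_2d, receptor_density, ligand_density, contact_area=3.0):
    """Assumed form: mean rosette size proportional to the 2D affinity (um^4)
    and the receptor/ligand site densities (per um^2); contact area is illustrative."""
    return contact_area * affinity_2d * receptor_density * ligand_density

def estimate_affinity(observed_sizes, receptor_density, ligand_density, contact_area=3.0):
    """Invert the Poisson mean to recover an apparent effective 2D affinity."""
    return np.mean(observed_sizes) / (contact_area * receptor_density * ligand_density)

# Illustrative numbers only: 50 molecules/um^2 on each cell, simulated rosettes.
rng = np.random.default_rng(1)
sizes = rng.poisson(lam=mean_rosette_size(7.19e-5, 50, 50), size=200)
print(estimate_affinity(sizes, 50, 50))              # recovered apparent affinity
print(poisson.pmf(np.arange(6), mu=sizes.mean()))    # predicted size distribution
```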

  13. Modelling structured data with Probabilistic Graphical Models

    NASA Astrophysics Data System (ADS)

    Forbes, F.

    2016-05-01

    Most clustering and classification methods are based on the assumption that the objects to be clustered are independent. However, in more and more modern applications, data are structured in a way that makes this assumption unrealistic and potentially misleading. A typical example that can be viewed as a clustering task is image segmentation, where the objects are the pixels on a regular grid and depend on neighbouring pixels on this grid. Also, when data are geographically located, it is of interest to cluster data with an underlying dependence structure accounting for some spatial localisation. These spatial interactions can be naturally encoded via a graph that is not necessarily as regular as a grid. Data sets can then be modelled via Markov random fields and mixture models (e.g. the so-called MRF and hidden MRF). More generally, probabilistic graphical models are tools that can be used to represent and manipulate data in a structured way while modelling uncertainty. This chapter introduces the basic concepts. The two main classes of probabilistic graphical models are considered: Bayesian networks and Markov networks. The key concept of conditional independence and its link to Markov properties is presented. The main problems that can be solved with such tools are described. Some illustrations are given, associated with some practical work.
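
    As a toy illustration of the conditional-independence concept central to these models (not drawn from the chapter itself), the snippet below builds the joint distribution of a three-variable Bayesian network A → B, A → C and verifies that B is conditionally independent of C given A; all probability tables are made-up values.

```python
import numpy as np

# A tiny Bayesian network over binary variables: A -> B, A -> C.
# The joint factorizes as P(A, B, C) = P(A) * P(B | A) * P(C | A),
# which encodes that B is conditionally independent of C given A.
p_a = np.array([0.7, 0.3])                       # P(A)
p_b_given_a = np.array([[0.9, 0.1], [0.4, 0.6]]) # P(B | A), rows indexed by A
p_c_given_a = np.array([[0.8, 0.2], [0.3, 0.7]]) # P(C | A)

joint = np.einsum('a,ab,ac->abc', p_a, p_b_given_a, p_c_given_a)

# Verify the Markov property: P(B, C | A) == P(B | A) * P(C | A) for each A.
for a in range(2):
    p_bc_given_a = joint[a] / joint[a].sum()
    assert np.allclose(p_bc_given_a, np.outer(p_b_given_a[a], p_c_given_a[a]))
print(joint.sum())   # 1.0: a valid joint distribution
```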

  14. Choice Behavior in Pigeons Maintained with Probabilistic Schedules of Reinforcement

    ERIC Educational Resources Information Center

    Moore, Jay; Friedlen, Karen E.

    2007-01-01

    Pigeons were trained in three experiments with a two-key, concurrent-chains choice procedure. The initial links were equal variable-interval schedules, and the terminal links were random-time schedules with equal average interreinforcement intervals. Across the three experiments, the pigeons either stayed in a terminal link until a reinforcer was…

  15. Towards Probabilistic Modelling in Event-B

    NASA Astrophysics Data System (ADS)

    Tarasyuk, Anton; Troubitsyna, Elena; Laibinis, Linas

    Event-B provides us with a powerful framework for correct-by-construction system development. However, while developing dependable systems we should not only guarantee their functional correctness but also quantitatively assess their dependability attributes. In this paper we investigate how to conduct probabilistic assessment of reliability of control systems modeled in Event-B. We show how to transform an Event-B model into a Markov model amenable to probabilistic reliability analysis. Our approach enables integration of reasoning about correctness with quantitative analysis of reliability.
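
    Once a system has been recast as a Markov model, reliability over a mission horizon follows from iterating the transition matrix. The sketch below shows that step on a hypothetical three-state discrete-time chain with an absorbing failure state; the states and transition probabilities are assumptions for illustration and are not derived from any particular Event-B development.

```python
import numpy as np

# Hypothetical discrete-time Markov chain for a control system with three
# states: 0 = nominal, 1 = degraded, 2 = failed (absorbing).
P = np.array([
    [0.990, 0.009, 0.001],
    [0.000, 0.950, 0.050],
    [0.000, 0.000, 1.000],
])

state = np.array([1.0, 0.0, 0.0])   # start in the nominal state
for _ in range(1000):               # 1000 discrete operation cycles
    state = state @ P

reliability = state[0] + state[1]   # probability the system has not yet failed
print(f"Reliability after 1000 cycles: {reliability:.4f}")
```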

  16. The Repeated Insertion Model for Rankings: Missing Link between Two Subset Choice Models

    ERIC Educational Resources Information Center

    Doignon, Jean-Paul; Pekec, Aleksandar; Regenwetter, Michel

    2004-01-01

    Several probabilistic models for subset choice have been proposed in the literature, for example, to explain approval voting data. We show that Marley et al.'s latent scale model is subsumed by Falmagne and Regenwetter's size-independent model, in the sense that every choice probability distribution generated by the former can also be explained by…

  17. Probabilistic modeling of children's handwriting

    NASA Astrophysics Data System (ADS)

    Puri, Mukta; Srihari, Sargur N.; Hanson, Lisa

    2013-12-01

    There is little work done in the analysis of children's handwriting, which can be useful in developing automatic evaluation systems and in quantifying handwriting individuality. We consider the statistical analysis of children's handwriting in early grades. Samples of handwriting of children in Grades 2-4 who were taught the Zaner-Bloser style were considered. The commonly occurring word "and", written in cursive style as well as hand-print, was extracted from extended writing. The samples were assigned feature values by human examiners using a truthing tool. The human examiners looked at how the children constructed letter formations in their writing, looking for similarities and differences from the instructions taught in the handwriting copy book. These similarities and differences were measured using a feature space distance measure. Results indicate that the handwriting develops towards more conformity with the class characteristics of the Zaner-Bloser copybook which, with practice, is the expected result. Bayesian networks were learnt from the data to enable answering various probabilistic queries, such as determining which students may continue to produce letter formations as taught during lessons in school, which students will develop different letter formations or variations of those taught, and the number of different types of letter formations.

  18. Probabilistic Seismic Risk Model for Western Balkans

    NASA Astrophysics Data System (ADS)

    Stejskal, Vladimir; Lorenzo, Francisco; Pousse, Guillaume; Radovanovic, Slavica; Pekevski, Lazo; Dojcinovski, Dragi; Lokin, Petar; Petronijevic, Mira; Sipka, Vesna

    2010-05-01

    A probabilistic seismic risk model for insurance and reinsurance purposes is presented for an area of the Western Balkans covering former Yugoslavia and Albania. This territory experienced many severe earthquakes during past centuries, producing significant damage to many population centres in the region. The highest hazard is related to the External Dinarides, namely to the collision zone of the Adriatic plate. The model is based on a unified catalogue for the region and a seismic source model consisting of more than 30 zones covering all three main structural units - the Southern Alps, the Dinarides and the south-western margin of the Pannonian Basin. A probabilistic methodology using Monte Carlo simulation was applied to generate the hazard component of the model. A unique set of damage functions based on both loss experience and engineering assessments is used to convert the modelled ground motion severity into monetary loss.

  19. Probabilistic, meso-scale flood loss modelling

    NASA Astrophysics Data System (ADS)

    Kreibich, Heidi; Botto, Anna; Schröter, Kai; Merz, Bruno

    2016-04-01

    Flood risk analyses are an important basis for decisions on flood risk management and adaptation. However, such analyses are associated with significant uncertainty, even more so if changes in risk due to global change are expected. Although uncertainty analysis and probabilistic approaches have received increased attention during the last years, they are still not standard practice for flood risk assessments, and even less so for flood loss modelling. The state of the art in flood loss modelling is still the use of simple, deterministic approaches like stage-damage functions. Novel probabilistic, multi-variate flood loss models have been developed and validated on the micro-scale using a data-mining approach, namely bagging decision trees (Merz et al. 2013). In this presentation we demonstrate and evaluate the upscaling of the approach to the meso-scale, namely on the basis of land-use units. The model is applied in 19 municipalities which were affected during the 2002 flood by the River Mulde in Saxony, Germany (Botto et al. submitted). The application of bagging decision tree based loss models provides a probability distribution of estimated loss per municipality. Validation is undertaken on the one hand via a comparison with eight deterministic loss models, including stage-damage functions as well as multi-variate models. On the other hand, the results are compared with official loss data provided by the Saxon Relief Bank (SAB). The results show that uncertainties of loss estimation remain high. Thus, the significant advantage of this probabilistic flood loss estimation approach is that it inherently provides quantitative information about the uncertainty of the prediction. References: Merz, B.; Kreibich, H.; Lall, U. (2013): Multi-variate flood damage assessment: a tree-based data-mining approach. NHESS, 13(1), 53-64. Botto A, Kreibich H, Merz B, Schröter K (submitted) Probabilistic, multi-variable flood loss modelling on the meso-scale with BT-FLEMO. Risk Analysis.
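
    The appeal of bagged decision trees here is that the ensemble's member-by-member predictions form an approximate predictive distribution rather than a single deterministic loss estimate. The sketch below illustrates that mechanism on synthetic stand-in data using scikit-learn; the predictor variables, the loss relationship and all numbers are assumptions, not the BT-FLEMO model.

```python
import numpy as np
from sklearn.ensemble import BaggingRegressor

rng = np.random.default_rng(42)

# Synthetic stand-in for flood loss data: water depth (m), inundation
# duration (h) and building value (arbitrary units); loss ratio as target.
X = rng.uniform([0.1, 1, 50], [3.0, 120, 500], size=(300, 3))
y = 0.1 * X[:, 0] + 0.001 * X[:, 1] + rng.normal(0, 0.02, 300)

# Bagged decision trees (the default base estimator is a regression tree).
model = BaggingRegressor(n_estimators=100, random_state=0).fit(X, y)

# One prediction per tree yields an approximate predictive distribution
# of loss for a new case instead of a single point estimate.
x_new = np.array([[1.5, 48, 200]])
per_tree = np.array([
    est.predict(x_new[:, feats])[0]
    for est, feats in zip(model.estimators_, model.estimators_features_)
])
print(per_tree.mean(), np.percentile(per_tree, [5, 50, 95]))
```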

  20. When good pigeons make bad decisions: Choice with probabilistic delays and outcomes.

    PubMed

    Pisklak, Jeffrey M; McDevitt, Margaret A; Dunn, Roger M; Spetch, Marcia L

    2015-11-01

    Pigeons chose between an (optimal) alternative that sometimes provided food after a 10-s delay and other times after a 40-s delay and another (suboptimal) alternative that sometimes provided food after 10 s but other times no food after 40 s. When outcomes were not signaled during the delays, pigeons strongly preferred the optimal alternative. When outcomes were signaled, choices of the suboptimal alternative increased and most pigeons preferred the alternative that provided no food after the long delay despite the cost in terms of obtained food. The pattern of results was similar whether the short delays occurred on 25% or 50% of the trials. Shortening the 40-s delay to food sharply reduced suboptimal choices, but shortening the delay to no food had little effect. The results suggest that a signaled delay to no food does not punish responding in probabilistic choice procedures. The findings are discussed in terms of conditioned reinforcement by signals for good news.

  1. The probabilistic structure of planetary contamination models

    NASA Technical Reports Server (NTRS)

    Harrison, J. M.; North, W. D.

    1973-01-01

    The analytical basis for planetary quarantine standards and procedures is presented. The hierarchy of planetary quarantine decisions is explained and emphasis is placed on the determination of mission specifications to include sterilization. The influence of the Sagan-Coleman probabilistic model of planetary contamination on current standards and procedures is analyzed. A classical problem in probability theory which provides a close conceptual parallel to the type of dependence present in the contamination problem is presented.

  2. Inferring cellular networks using probabilistic graphical models.

    PubMed

    Friedman, Nir

    2004-02-06

    High-throughput genome-wide molecular assays, which probe cellular networks from different perspectives, have become central to molecular biology. Probabilistic graphical models are useful for extracting meaningful biological insights from the resulting data sets. These models provide a concise representation of complex cellular networks by composing simpler submodels. Procedures based on well-understood principles for inferring such models from data facilitate a model-based methodology for analysis and discovery. This methodology and its capabilities are illustrated by several recent applications to gene expression data.

  3. Modelling default and likelihood reasoning as probabilistic

    NASA Technical Reports Server (NTRS)

    Buntine, Wray

    1990-01-01

    A probabilistic analysis of plausible reasoning about defaults and about likelihood is presented. 'Likely' and 'by default' are in fact treated as duals, in the same sense as 'possibility' and 'necessity'. To model these four forms probabilistically, a logic QDP and its quantitative counterpart DP are derived that allow qualitative and corresponding quantitative reasoning. Consistency and consequence results for subsets of the logics are given that require at most a quadratic number of satisfiability tests in the underlying propositional logic. The quantitative logic shows how to track the propagation error inherent in these reasoning forms. The methodology and sound framework of the system highlight their approximate nature, the dualities, and the need for complementary reasoning about relevance.

  4. Probabilistic Solar Energetic Particle Models

    NASA Technical Reports Server (NTRS)

    Adams, James H., Jr.; Dietrich, William F.; Xapsos, Michael A.

    2011-01-01

    To plan and design safe and reliable space missions, it is necessary to take into account the effects of the space radiation environment. This is done by setting the goal of achieving safety and reliability with some desired level of confidence. To achieve this goal, a worst-case space radiation environment at the required confidence level must be obtained. Planning and designing then proceeds, taking into account the effects of this worst-case environment. The result will be a mission that is reliable against the effects of the space radiation environment at the desired confidence level. In this paper we will describe progress toward developing a model that provides worst-case space radiation environments at user-specified confidence levels. We will present a model for worst-case event-integrated solar proton environments that provides the worst-case differential proton spectrum. This model is based on data from the IMP-8 and GOES spacecraft that provide a database extending from 1974 to the present. We will discuss extending this work to create worst-case models for peak flux and mission-integrated fluence for protons. We will also describe plans for similar models for helium and heavier ions.
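
    A worst-case environment at a user-specified confidence level is, in practice, an upper quantile of a simulated distribution of mission-integrated quantities. The sketch below illustrates that idea with a toy Monte Carlo in which event occurrence is Poisson and event fluences are lognormal; the rates, the lognormal parameters and the two-year mission are illustrative assumptions, not the model described in the paper.

```python
import numpy as np

rng = np.random.default_rng(7)

def mission_fluence_at_confidence(mission_years=2.0, events_per_year=5.0,
                                  median_fluence=1e8, sigma_ln=1.5,
                                  confidence=0.95, n_sim=20_000):
    """Toy Monte Carlo estimate of a worst-case mission-integrated proton fluence.

    Assumptions (illustrative only): event occurrence is Poisson in time and
    event fluences are lognormally distributed; the worst-case environment is
    the upper `confidence` quantile of the simulated mission totals.
    """
    n_events = rng.poisson(events_per_year * mission_years, size=n_sim)
    totals = np.array([
        rng.lognormal(np.log(median_fluence), sigma_ln, size=k).sum()
        for k in n_events
    ])
    return np.quantile(totals, confidence)

print(f"{mission_fluence_at_confidence():.3e} protons/cm^2")
```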

  5. Probabilistic graphical models for genetic association studies.

    PubMed

    Mourad, Raphaël; Sinoquet, Christine; Leray, Philippe

    2012-01-01

    Probabilistic graphical models have been widely recognized as a powerful formalism in the bioinformatics field, especially in gene expression studies and linkage analysis. Although less well known in association genetics, many successful methods have recently emerged to dissect the genetic architecture of complex diseases. In this review article, we cover the applications of these models to the population association studies' context, such as linkage disequilibrium modeling, fine mapping and candidate gene studies, and genome-scale association studies. Significant breakthroughs of the corresponding methods are highlighted, but emphasis is also given to their current limitations, in particular, to the issue of scalability. Finally, we give promising directions for future research in this field.

  6. Probabilistic model of Kordylewski clouds

    NASA Astrophysics Data System (ADS)

    Salnikova, T. V.; Stepanov, S. Ya.; Shuvalova, A. I.

    2016-05-01

    The problem of determining the phase-space distribution function for a system of non-interacting dust particles is considered for a mathematical model of the Kordylewski cosmic dust clouds: clusters of non-interacting dust particles in the vicinity of the triangular libration points of the Earth-Moon-particle system, taking into account perturbations from the Sun.

  7. Probabilistic models for feedback systems.

    SciTech Connect

    Grace, Matthew D.; Boggs, Paul T.

    2011-02-01

    In previous work, we developed a Bayesian-based methodology to analyze the reliability of hierarchical systems. The output of the procedure is a statistical distribution of the reliability, thus allowing many questions to be answered. The principal advantage of the approach is that along with an estimate of the reliability, we also can provide statements of confidence in the results. The model is quite general in that it allows general representations of all of the distributions involved, it incorporates prior knowledge into the models, it allows errors in the 'engineered' nodes of a system to be determined by the data, and leads to the ability to determine optimal testing strategies. In this report, we provide the preliminary steps necessary to extend this approach to systems with feedback. Feedback is an essential component of 'complexity' and provides interesting challenges in modeling the time-dependent action of a feedback loop. We provide a mechanism for doing this and analyze a simple case. We then consider some extensions to more interesting examples with local control affecting the entire system. Finally, a discussion of the status of the research is also included.

  8. Multiclient Identification System Using Adaptive Probabilistic Model

    NASA Astrophysics Data System (ADS)

    Lin, Chin-Teng; Siana, Linda; Shou, Yu-Wen; Yang, Chien-Ting

    2010-12-01

    This paper aims at integrating detection and identification of human faces in a more practical and real-time face recognition system. The proposed face detection system is based on the cascade Adaboost method to improve the precision and robustness toward unstable surrounding lightings. Our Adaboost method adjusts for the environmental lighting conditions by histogram lighting normalization and accurately locates the face regions by a region-based clustering process as well. We also address the problem of multi-scale faces in this paper by using 12 different scales of searching windows and 5 different orientations for each client in pursuit of multi-view independent face identification. There are two main methodological parts in our face identification system: PCA (principal component analysis) facial feature extraction and an adaptive probabilistic model (APM). The structure of our implemented APM, with a weighted combination of simple probabilistic functions, constructs the likelihood functions by the probabilistic constraint in the similarity measures. In addition, our proposed method can add a new client online and update the information of registered clients owing to the constructed APM. The experimental results eventually show the superior performance of our proposed system for both offline and real-time online testing.

  9. Modeling Spanish Mood Choice in Belief Statements

    ERIC Educational Resources Information Center

    Robinson, Jason R.

    2013-01-01

    This work develops a computational methodology new to linguistics that empirically evaluates competing linguistic theories on Spanish verbal mood choice through the use of computational techniques to learn mood and other hidden linguistic features from Spanish belief statements found in corpora. The machine learned probabilistic linguistic models…

  10. Probabilistic Gompertz model of irreversible growth.

    PubMed

    Bardos, D C

    2005-05-01

    Characterizing organism growth within populations requires the application of well-studied individual size-at-age models, such as the deterministic Gompertz model, to populations of individuals whose characteristics, corresponding to model parameters, may be highly variable. A natural approach is to assign probability distributions to one or more model parameters. In some contexts, size-at-age data may be absent due to difficulties in ageing individuals, but size-increment data may instead be available (e.g., from tag-recapture experiments). A preliminary transformation to a size-increment model is then required. Gompertz models developed along the above lines have recently been applied to strongly heterogeneous abalone tag-recapture data. Although useful in modelling the early growth stages, these models yield size-increment distributions that allow negative growth, which is inappropriate in the case of mollusc shells and other accumulated biological structures (e.g., vertebrae) where growth is irreversible. Here we develop probabilistic Gompertz models where this difficulty is resolved by conditioning parameter distributions on size, allowing application to irreversible growth data. In the case of abalone growth, introduction of a growth-limiting biological length scale is then shown to yield realistic length-increment distributions.
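
    For reference (and not the paper's full probabilistic formulation), the deterministic Gompertz size-increment kernel underlying such models already guarantees non-negative increments below the asymptotic size; the paper's contribution is to make the parameters random and conditioned on size. The sketch below shows only the deterministic kernel, with illustrative abalone-like parameter values.

```python
import numpy as np

def gompertz_increment(length, dt, l_inf=180.0, k=0.4):
    """Deterministic Gompertz size-increment: expected length after time dt.

    From dL/dt = k * L * ln(L_inf / L) it follows that
    L(t + dt) = L_inf * (L(t) / L_inf) ** exp(-k * dt),
    which never predicts shrinkage for L < L_inf (irreversible growth).
    """
    return l_inf * (length / l_inf) ** np.exp(-k * dt)

# Illustrative tag-recapture style increments (mm) after one year at liberty.
lengths = np.array([40.0, 80.0, 120.0, 160.0])
print(gompertz_increment(lengths, dt=1.0) - lengths)
```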

  11. A probabilistic graphical model based stochastic input model construction

    SciTech Connect

    Wan, Jiang; Zabaras, Nicholas

    2014-09-01

    Model reduction techniques have been widely used in modeling of high-dimensional stochastic input in uncertainty quantification tasks. However, the probabilistic modeling of random variables projected into reduced-order spaces presents a number of computational challenges. Due to the curse of dimensionality, the underlying dependence relationships between these random variables are difficult to capture. In this work, a probabilistic graphical model based approach is employed to learn the dependence by running a number of conditional independence tests using observation data. Thus a probabilistic model of the joint PDF is obtained and the PDF is factorized into a set of conditional distributions based on the dependence structure of the variables. The estimation of the joint PDF from data is then transformed to estimating conditional distributions under reduced dimensions. To improve the computational efficiency, a polynomial chaos expansion is further applied to represent the random field in terms of a set of standard random variables. This technique is combined with both linear and nonlinear model reduction methods. Numerical examples are presented to demonstrate the accuracy and efficiency of the probabilistic graphical model based stochastic input models. - Highlights: • Data-driven stochastic input models without the assumption of independence of the reduced random variables. • The problem is transformed to a Bayesian network structure learning problem. • Examples are given in flows in random media.

  12. Probabilistic Resilience in Hidden Markov Models

    NASA Astrophysics Data System (ADS)

    Panerati, Jacopo; Beltrame, Giovanni; Schwind, Nicolas; Zeltner, Stefan; Inoue, Katsumi

    2016-05-01

    Originally defined in the context of ecological systems and environmental sciences, resilience has grown to be a property of major interest for the design and analysis of many other complex systems: resilient networks and robotics systems offer the desirable capability of absorbing disruption and transforming in response to external shocks, while still providing the services they were designed for. Starting from an existing formalization of resilience for constraint-based systems, we develop a probabilistic framework based on hidden Markov models. In doing so, we introduce two new important features: stochastic evolution and partial observability. Using our framework, we formalize a methodology for the evaluation of probabilities associated with generic properties, we describe an efficient algorithm for the computation of its essential inference step, and we show that its complexity is comparable to other state-of-the-art inference algorithms.
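
    A central inference step in any hidden Markov model is the forward recursion, which yields the probability of an observation sequence under partially observable state dynamics. The snippet below is a generic textbook implementation applied to a made-up two-state "nominal/disrupted" system; the matrices are illustrative and unrelated to the paper's case studies.

```python
import numpy as np

def forward(obs, pi, A, B):
    """Forward algorithm: probability of an observation sequence under an HMM.

    pi: initial state distribution (n_states,)
    A:  transition matrix (n_states, n_states)
    B:  emission matrix (n_states, n_symbols)
    """
    alpha = pi * B[:, obs[0]]
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
    return alpha.sum()

# Illustrative two-state model: 0 = nominal, 1 = disrupted, observed only
# through a noisy binary sensor; all numbers are assumptions for the sketch.
pi = np.array([0.9, 0.1])
A = np.array([[0.95, 0.05],
              [0.30, 0.70]])          # disrupted systems tend to recover
B = np.array([[0.8, 0.2],
              [0.1, 0.9]])            # sensor reading given the hidden state
print(forward([0, 0, 1, 1, 0], pi, A, B))
```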

  13. Retinal blood vessels extraction using probabilistic modelling.

    PubMed

    Kaba, Djibril; Wang, Chuang; Li, Yongmin; Salazar-Gonzalez, Ana; Liu, Xiaohui; Serag, Ahmed

    2014-01-01

    The analysis of retinal blood vessels plays an important role in detecting and treating retinal diseases. In this review, we present an automated method to segment blood vessels of fundus retinal image. The proposed method could be used to support a non-intrusive diagnosis in modern ophthalmology for early detection of retinal diseases, treatment evaluation or clinical study. This study combines the bias correction and an adaptive histogram equalisation to enhance the appearance of the blood vessels. Then the blood vessels are extracted using probabilistic modelling that is optimised by the expectation maximisation algorithm. The method is evaluated on fundus retinal images of STARE and DRIVE datasets. The experimental results are compared with some recently published methods of retinal blood vessels segmentation. The experimental results show that our method achieved the best overall performance and it is comparable to the performance of human experts.
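
    The vessel-extraction step described here amounts to fitting a mixture model to enhanced pixel intensities by expectation maximisation and labelling each pixel by its most probable component. The sketch below shows that core step on synthetic intensities with scikit-learn's GaussianMixture; it omits the bias correction and adaptive histogram equalisation stages, and all numbers are made up.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(3)

# Synthetic stand-in for an enhanced fundus image: background pixels are
# darker, vessel pixels brighter (intensity values are illustrative).
background = rng.normal(0.35, 0.05, size=(9000, 1))
vessels = rng.normal(0.65, 0.07, size=(1000, 1))
intensities = np.vstack([background, vessels])

# Two-component mixture fitted by expectation maximisation; each pixel is
# then assigned to the component with the higher posterior probability.
gmm = GaussianMixture(n_components=2, random_state=0).fit(intensities)
labels = gmm.predict(intensities)
vessel_component = np.argmax(gmm.means_.ravel())
print("estimated vessel fraction:", np.mean(labels == vessel_component))
```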

  14. Statistical appearance models based on probabilistic correspondences.

    PubMed

    Krüger, Julia; Ehrhardt, Jan; Handels, Heinz

    2017-04-01

    Model-based image analysis is indispensable in medical image processing. One key aspect of building statistical shape and appearance models is the determination of one-to-one correspondences in the training data set. At the same time, the identification of these correspondences is the most challenging part of such methods. In our earlier work, we developed an alternative method using correspondence probabilities instead of exact one-to-one correspondences for a statistical shape model (Hufnagel et al., 2008). In this work, a new approach for statistical appearance models without one-to-one correspondences is proposed. A sparse image representation is used to build a model that combines point position and appearance information at the same time. Probabilistic correspondences between the derived multi-dimensional feature vectors are used to omit the need for extensive preprocessing of finding landmarks and correspondences as well as to reduce the dependence of the generated model on the landmark positions. Model generation and model fitting can now be expressed by optimizing a single global criterion derived from a maximum a-posteriori (MAP) approach with respect to model parameters that directly affect both shape and appearance of the considered objects inside the images. The proposed approach describes statistical appearance modeling in a concise and flexible mathematical framework. Besides eliminating the demand for costly correspondence determination, the method allows for additional constraints as topological regularity in the modeling process. In the evaluation the model was applied for segmentation and landmark identification in hand X-ray images. The results demonstrate the feasibility of the model to detect hand contours as well as the positions of the joints between finger bones for unseen test images. Further, we evaluated the model on brain data of stroke patients to show the ability of the proposed model to handle partially corrupted data and to

  15. A Probabilistic Approach to Model Update

    NASA Technical Reports Server (NTRS)

    Horta, Lucas G.; Reaves, Mercedes C.; Voracek, David F.

    2001-01-01

    Finite element models are often developed for load validation, structural certification, response predictions, and to study alternate design concepts. In rare occasions, models developed with a nominal set of parameters agree with experimental data without the need to update parameter values. Today, model updating is generally heuristic and often performed by a skilled analyst with in-depth understanding of the model assumptions. Parameter uncertainties play a key role in understanding the model update problem and therefore probabilistic analysis tools, developed for reliability and risk analysis, may be used to incorporate uncertainty in the analysis. In this work, probability analysis (PA) tools are used to aid the parameter update task using experimental data and some basic knowledge of potential error sources. Discussed here is the first application of PA tools to update parameters of a finite element model for a composite wing structure. Static deflection data at six locations are used to update five parameters. It is shown that while prediction of individual response values may not be matched identically, the system response is significantly improved with moderate changes in parameter values.

  16. A Probabilistic Asteroid Impact Risk Model

    NASA Technical Reports Server (NTRS)

    Mathias, Donovan L.; Wheeler, Lorien F.; Dotson, Jessie L.

    2016-01-01

    Asteroid threat assessment requires the quantification of both the impact likelihood and resulting consequence across the range of possible events. This paper presents a probabilistic asteroid impact risk (PAIR) assessment model developed for this purpose. The model incorporates published impact frequency rates with state-of-the-art consequence assessment tools, applied within a Monte Carlo framework that generates sets of impact scenarios from uncertain parameter distributions. Explicit treatment of atmospheric entry is included to produce energy deposition rates that account for the effects of thermal ablation and object fragmentation. These energy deposition rates are used to model the resulting ground damage, and affected populations are computed for the sampled impact locations. The results for each scenario are aggregated into a distribution of potential outcomes that reflect the range of uncertain impact parameters, population densities, and strike probabilities. As an illustration of the utility of the PAIR model, the results are used to address the question of what minimum size asteroid constitutes a threat to the population. To answer this question, complete distributions of results are combined with a hypothetical risk tolerance posture to provide the minimum size, given sets of initial assumptions. Model outputs demonstrate how such questions can be answered and provide a means for interpreting the effect that input assumptions and uncertainty can have on final risk-based decisions. Model results can be used to prioritize investments to gain knowledge in critical areas or, conversely, to identify areas where additional data has little effect on the metrics of interest.

  17. scoringRules - A software package for probabilistic model evaluation

    NASA Astrophysics Data System (ADS)

    Lerch, Sebastian; Jordan, Alexander; Krüger, Fabian

    2016-04-01

    Models in the geosciences are generally surrounded by uncertainty, and being able to quantify this uncertainty is key to good decision making. Accordingly, probabilistic forecasts in the form of predictive distributions have become popular over the last decades. With the proliferation of probabilistic models arises the need for decision-theoretically principled tools to evaluate the appropriateness of models and forecasts in a generalized way. Various scoring rules have been developed over the past decades to address this demand. Proper scoring rules are functions S(F,y) which evaluate the accuracy of a forecast distribution F, given that an outcome y was observed. As such, they allow the comparison of alternative models, a crucial ability given the variety of theories, data sources and statistical specifications that is available in many situations. This poster presents the software package scoringRules for the statistical programming language R, which contains functions to compute popular scoring rules such as the continuous ranked probability score for a variety of distributions F that come up in applied work. Two main classes are parametric distributions like normal, t, or gamma distributions, and distributions that are not known analytically but are indirectly described through a sample of simulation draws. For example, Bayesian forecasts produced via Markov chain Monte Carlo take this form. Thereby, the scoringRules package provides a framework for generalized model evaluation that includes both Bayesian and classical parametric models. The scoringRules package aims to be a convenient dictionary-like reference for computing scoring rules. We offer state-of-the-art implementations of several known (but not routinely applied) formulas, and implement closed-form expressions that were previously unavailable. Whenever more than one implementation variant exists, we offer statistically principled default choices.
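
    For sample-based forecasts (the second class mentioned above), the continuous ranked probability score can be estimated directly from simulation draws as CRPS(F, y) ≈ E|X − y| − ½ E|X − X′|. The Python snippet below reproduces that estimator purely to illustrate the scoring rule; it is not part of the R scoringRules package, and the forecast numbers are invented.

```python
import numpy as np

def crps_sample(y, samples):
    """Sample-based estimator of the continuous ranked probability score.

    CRPS(F, y) ~= mean|X - y| - 0.5 * mean|X - X'|, where X, X' are independent
    draws from the forecast distribution F (given here as a sample array).
    """
    samples = np.asarray(samples, dtype=float)
    term1 = np.abs(samples - y).mean()
    term2 = 0.5 * np.abs(samples[:, None] - samples[None, :]).mean()
    return term1 - term2

# Illustrative check: a well-centred forecast scores lower (better) than a biased one.
rng = np.random.default_rng(0)
print(crps_sample(0.3, rng.normal(0.3, 0.5, 2000)))
print(crps_sample(0.3, rng.normal(1.5, 0.5, 2000)))
```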

  18. EFFECTS OF CORRELATED PROBABILISTIC EXPOSURE MODEL INPUTS ON SIMULATED RESULTS

    EPA Science Inventory

    In recent years, more probabilistic models have been developed to quantify aggregate human exposures to environmental pollutants. The impact of correlation among inputs in these models is an important issue, which has not been resolved. Obtaining correlated data and implementi...

  19. Approaches to implementing deterministic models in a probabilistic framework

    SciTech Connect

    Talbott, D.V.

    1995-04-01

    The increasing use of results from probabilistic risk assessments in the decision-making process makes it ever more important to eliminate simplifications in probabilistic models that might lead to conservative results. One area in which conservative simplifications are often made is modeling the physical interactions that occur during the progression of an accident sequence. This paper demonstrates and compares different approaches for incorporating deterministic models of physical parameters into probabilistic models; parameter range binning, response curves, and integral deterministic models. An example that combines all three approaches in a probabilistic model for the handling of an energetic material (i.e. high explosive, rocket propellant,...) is then presented using a directed graph model.

  20. A probabilistic model for binaural sound localization.

    PubMed

    Willert, Volker; Eggert, Julian; Adamy, Jürgen; Stahl, Raphael; Körner, Edgar

    2006-10-01

    This paper proposes a biologically inspired and technically implemented sound localization system to robustly estimate the position of a sound source in the frontal azimuthal half-plane. For localization, binaural cues are extracted using cochleagrams generated by a cochlear model that serve as input to the system. The basic idea of the model is to separately measure interaural time differences and interaural level differences for a number of frequencies and process these measurements as a whole. This leads to two-dimensional frequency versus time-delay representations of binaural cues, so-called activity maps. A probabilistic evaluation is presented to estimate the position of a sound source over time based on these activity maps. Learned reference maps for different azimuthal positions are integrated into the computation to gain time-dependent discrete conditional probabilities. At every timestep these probabilities are combined over frequencies and binaural cues to estimate the sound source position. In addition, they are propagated over time to improve position estimation. This leads to a system that is able to localize audible signals, for example human speech signals, even in reverberating environments.

  1. A Probabilistic Model for Reducing Medication Errors

    PubMed Central

    Nguyen, Phung Anh; Syed-Abdul, Shabbir; Iqbal, Usman; Hsu, Min-Huei; Huang, Chen-Ling; Li, Hsien-Chang; Clinciu, Daniel Livius; Jian, Wen-Shan; Li, Yu-Chuan Jack

    2013-01-01

    Background Medication errors are common, life threatening, costly but preventable. Information technology and automated systems are highly efficient for preventing medication errors and are therefore widely employed in hospital settings. The aim of this study was to construct a probabilistic model that can reduce medication errors by identifying uncommon or rare associations between medications and diseases. Methods and Finding(s) Association rule mining techniques were applied to 103.5 million prescriptions from Taiwan's National Health Insurance database. The dataset included 204.5 million diagnoses with ICD9-CM codes and 347.7 million medications using ATC codes. Disease-Medication (DM) and Medication-Medication (MM) associations were computed from their co-occurrence, and association strengths were measured by the interestingness or lift values, referred to as Q values. The DMQs and MMQs were used to develop the AOP model to predict the appropriateness of a given prescription. Validation of this model was done by comparing the evaluations of the AOP model with those verified by human experts. The results showed 96% accuracy for appropriate and 45% accuracy for inappropriate prescriptions, with a sensitivity and specificity of 75.9% and 89.5%, respectively. Conclusions We successfully developed the AOP model as an efficient tool for automatic identification of uncommon or rare associations between disease-medication and medication-medication pairs in prescriptions. The AOP model helps to reduce medication errors by alerting physicians, improving patients' safety and the overall quality of care. PMID:24312659
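
    The lift (Q value) used here is the standard association-rule measure: the observed co-occurrence probability of a disease-medication pair divided by the probability expected if the two were independent. The snippet below computes it from hypothetical co-occurrence counts; the counts are invented, and the decision thresholds of the AOP model are not reproduced.

```python
def lift(n_joint, n_a, n_b, n_total):
    """Lift (the Q value here) of an association between items A and B.

    Q = P(A, B) / (P(A) * P(B)); Q >> 1 indicates a common association,
    while Q close to 0 flags a rare (potentially erroneous) pairing.
    """
    p_joint = n_joint / n_total
    p_a = n_a / n_total
    p_b = n_b / n_total
    return p_joint / (p_a * p_b)

# Hypothetical counts: a diagnosis and a medication that often co-occur
# versus a pairing that is almost never seen together.
print(lift(n_joint=9_000, n_a=20_000, n_b=30_000, n_total=1_000_000))  # high Q
print(lift(n_joint=3,     n_a=20_000, n_b=30_000, n_total=1_000_000))  # low Q
```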

  2. Probabilistic constitutive relationships for cyclic material strength models

    NASA Technical Reports Server (NTRS)

    Boyce, L.; Chamis, C. C.

    1988-01-01

    A methodology is developed that provides a probabilistic treatment for the lifetime of structural components of aerospace propulsion systems subjected to fatigue. Material strength degradation models, based on primitive variables, include both a fatigue strength reduction model and a fatigue crack growth model. Probabilistic analysis is based on simulation, and both maximum entropy and maximum penalized likelihood methods are used for the generation of probability density functions. The resulting constitutive relationships are included in several computer programs.

  3. A PROBABILISTIC MODELING FRAMEWORK FOR PREDICTING POPULATION EXPOSURES TO BENZENE

    EPA Science Inventory

    The US Environmental Protection Agency (EPA) is modifying their probabilistic Stochastic Human Exposure Dose Simulation (SHEDS) model to assess aggregate exposures to air toxics. Air toxics include urban Hazardous Air Pollutants (HAPS) such as benzene from mobile sources, part...

  4. A mathematical framework for probabilistic choice based on information theory and psychophysics.

    PubMed

    Takahashi, Taiki

    2006-01-01

    Risky decision-making (e.g. reward dependency) has been associated with substance abuse, psychopathy and pathological gambling; conversely, marked sensitivity to risk and uncertainty has been observed in anxiety disorder patients. In economic decision theory, probability and uncertainty have been dissociated. Frank Knight defined uncertainty as loss of information on the probability distribution of outcomes for choices (i.e., unpredictability), which is referred to as Knightian uncertainty (also as ambiguity). However, even when the probability distribution of outcomes is known, there are different degrees of predictability. In information theory, this type of uncertainty/unpredictability has been parametrized by introducing Shannon entropy. In the present paper, we show: (i) a mathematical framework combining Shannon entropy in information theory and Weber's law in psychophysics is capable of parametrizing a subject's level of both aversion to probabilistic uncertainty (exaggerated in anxiety disorder patients) and reward dependency (enhanced in drug addicts and pathological gamblers), and (ii) this framework has an analogue in thermodynamics, and therefore it can readily be utilized in studies in the nascent fields of neuroeconomics and econophysics as well. Future study directions for elucidating maladaptive personality characteristics in neuropsychiatric patients by using the present framework are discussed.
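
    The Shannon entropy invoked here quantifies how unpredictable an outcome is even when its probability distribution is fully known. The snippet below computes it for a few toy gambles; the gambles are illustrative, and the paper's Weber-law parametrization of uncertainty aversion is not reproduced.

```python
import numpy as np

def shannon_entropy(probs):
    """Shannon entropy H = -sum p_i * log2(p_i) of an outcome distribution.

    Higher entropy means less predictable outcomes even when the probability
    distribution itself is fully known (i.e. no Knightian uncertainty).
    """
    p = np.asarray(probs, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# A certain reward, a fair gamble, and a long-shot gamble (illustrative).
print(shannon_entropy([1.0]))        # 0 bits: fully predictable
print(shannon_entropy([0.5, 0.5]))   # 1 bit: maximally unpredictable binary outcome
print(shannon_entropy([0.9, 0.1]))   # ~0.47 bits
```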

  5. Choice-Based Conjoint Analysis: Classification vs. Discrete Choice Models

    NASA Astrophysics Data System (ADS)

    Giesen, Joachim; Mueller, Klaus; Taneva, Bilyana; Zolliker, Peter

    Conjoint analysis is a family of techniques that originated in psychology and later became popular in market research. The main objective of conjoint analysis is to measure an individual's or a population's preferences on a class of options that can be described by parameters and their levels. We consider preference data obtained in choice-based conjoint analysis studies, where one observes test persons' choices on small subsets of the options. There are many ways to analyze choice-based conjoint analysis data. Here we discuss the intuition behind a classification based approach, and compare this approach to one based on statistical assumptions (discrete choice models) and to a regression approach. Our comparison on real and synthetic data indicates that the classification approach outperforms the discrete choice models.

  6. A probabilistic model for snow avalanche occurrence

    NASA Astrophysics Data System (ADS)

    Perona, P.; Miescher, A.; Porporato, A.

    2009-04-01

    Avalanche hazard forecasting is an important issue in relation to the protection of urbanized environments, ski resorts and ski-touring alpinists. A critical point is to predict the conditions that trigger the snow mass instability determining the onset and the size of avalanches. On steep terrain the risk of avalanches is known to be related to preceding consistent snowfall events and to subsequent changes in the local climatic conditions. Regression analysis has shown that avalanche occurrence indeed correlates with the amount of snow fallen in three consecutive snowing days and with the state of the settled snow at the ground. Moreover, since different types of avalanches may occur as a result of the interactions of different factors, the process of snow avalanche formation is inherently complex and carries some degree of unpredictability. For this reason, although several models assess the risk of avalanche by accounting for all the involved processes in great detail, a high margin of uncertainty invariably remains. In this work, we explicitly describe such unpredictable behaviour with an intrinsic noise affecting the processes leading to snow instability. Eventually, this sets the basis for a minimalist stochastic model, which allows us to investigate the avalanche dynamics and its statistical properties. We employ a continuous-time process with stochastic jumps (snowfalls), deterministic decay (snowmelt and compaction) and state-dependent avalanche occurrence (renewals) as a minimalist model for the determination of avalanche size and related occurrence intertimes. The physics leading to avalanches is simplified to the extent where only meteorological data and terrain data are necessary to estimate avalanche danger. We explore the analytical formulation of the process and the properties of the probability density function of the avalanche process variables. We also discuss the probabilistic link between avalanche size and the preceding snowfall event and
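
    A discrete-time caricature of this jump-decay-renewal process is easy to simulate, as sketched below; the snowfall rate, decay constant and fixed release threshold are illustrative assumptions and stand in for the paper's state-dependent release probability.

```python
import numpy as np

rng = np.random.default_rng(11)

def simulate_avalanches(days=365 * 20, snowfall_prob=0.15, mean_snowfall=0.3,
                        decay=0.02, threshold=1.2):
    """Minimalist jump-decay sketch of the avalanche process.

    Snow depth receives stochastic jumps on random snowfall days, decays
    exponentially between them (melt/compaction), and is released as an
    avalanche whenever it exceeds a fixed instability threshold.
    """
    depth, sizes, intertimes, last = 0.0, [], [], 0
    for day in range(days):
        depth *= np.exp(-decay)                       # deterministic decay
        if rng.random() < snowfall_prob:              # a snowfall day
            depth += rng.exponential(mean_snowfall)   # stochastic jump
        if depth > threshold:                         # renewal: avalanche release
            sizes.append(depth)
            intertimes.append(day - last)
            last, depth = day, 0.0
    return np.array(sizes), np.array(intertimes)

sizes, intertimes = simulate_avalanches()
print(len(sizes), sizes.mean(), intertimes.mean())
```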

  7. Identification of thermal degradation using probabilistic models in reflectance spectroscopy

    NASA Astrophysics Data System (ADS)

    Criner, A. K.; Cherry, A. J.; Cooney, A. T.; Katter, T. D.; Banks, H. T.; Hu, Shuhua; Catenacci, Jared

    2015-03-01

    Different probabilistic models of molecular vibration modes are considered to model the reflectance spectra of chemical species through the dielectric constant. We discuss probability measure estimators in parametric and nonparametric models. Analyses of ceramic matrix composite samples that have been heat treated for different amounts of times are compared. We finally compare these results with the analysis of vitreous silica using nonparametric models.

  8. Maritime Threat Detection Using Probabilistic Graphical Models

    DTIC Science & Technology

    2012-01-01

    CRF, unlike an HMM, can represent local features, and does not require feature concatenation. For MLNs, we used Alchemy (Alchemy 2011), an ... open source statistical relational learning and probabilistic inferencing package. Alchemy supports generative and discriminative weight learning, and ... Alchemy creates a new formula for every possible combination of the values for a1 and a2 that fit the type specified in their predicate

  9. Probabilistic Modeling of Space Shuttle Debris Impact

    NASA Technical Reports Server (NTRS)

    Huyse, Luc J.; Asce, M.; Waldhart, Chris J.; Riha, David S.; Larsen, Curtis E.; Gomez, Reynaldo J.; Stuart, Phillip C.

    2007-01-01

    On Feb 1, 2003, the Shuttle Columbia was lost during its return to Earth. As a result of the conclusion that debris impact caused the damage to the left wing of the Columbia Space Shuttle Vehicle (SSV) during ascent, the Columbia Accident Investigation Board recommended that an assessment be performed of the debris environment experienced by the SSV during ascent. A flight rationale based on probabilistic assessment is used for the SSV return-to-flight. The assessment entails identifying all potential debris sources, their probable geometric and aerodynamic characteristics, and their potential for impacting and damaging critical Shuttle components. A probabilistic analysis tool, based on the SwRI-developed NESSUS probabilistic analysis software, predicts the probability of impact and damage to the space shuttle wing leading edge and thermal protection system components. Among other parameters, the likelihood of unacceptable damage depends on the time of release (Mach number of the orbiter) and the divot mass as well as the impact velocity and impact angle. A typical result is visualized in the figures below. Probability of impact and damage, as well as the sensitivities thereof with respect to the distribution assumptions, can be computed and visualized at each point on the orbiter or summarized per wing panel or tile zone.

  10. Probabilistic modelling of flood events using the entropy copula

    NASA Astrophysics Data System (ADS)

    Li, Fan; Zheng, Qian

    2016-11-01

    The estimation of flood frequency is vital for flood control strategies and hydraulic structure design. Generating synthetic flood events according to the statistical properties of observations is one plausible way to analyze flood frequency. Due to the statistical dependence among the flood event variables (i.e. the flood peak, volume and duration), a multidimensional joint probability estimation is required. Recently, the copula method has been widely used to construct multivariate dependence structures; however, the copula family must be chosen before application and the choice process is sometimes rather subjective. The entropy copula, a new copula family employed in this research, offers a way to avoid this relatively subjective process by combining the theories of copula and entropy. The analysis shows the effectiveness of the entropy copula for probabilistic modelling of the flood events of two hydrological gauges, and a comparison of accuracy with popular copulas was made. The Gibbs sampling technique was applied for trivariate flood event simulation in order to mitigate the calculation difficulties of extending directly to three dimensions. The simulation results indicate that the entropy copula is a simple and effective copula family for trivariate flood simulation.
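    As a hedged illustration of the copula idea (using a Gaussian copula as a simple stand-in, not the entropy copula developed in the paper, and with assumed marginal distributions), dependent flood peak, volume and duration can be simulated as follows:

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(7)
      # Target correlation structure between peak, volume and duration (assumed).
      corr = np.array([[1.0, 0.7, 0.5],
                       [0.7, 1.0, 0.6],
                       [0.5, 0.6, 1.0]])
      z = rng.multivariate_normal(np.zeros(3), corr, size=5000)
      u = stats.norm.cdf(z)                                     # copula samples in [0, 1]^3

      peak     = stats.gamma(a=2.0, scale=50.0).ppf(u[:, 0])    # m^3/s   (assumed margin)
      volume   = stats.gamma(a=3.0, scale=20.0).ppf(u[:, 1])    # 10^6 m^3 (assumed margin)
      duration = stats.gamma(a=4.0, scale=2.0).ppf(u[:, 2])     # days     (assumed margin)

      print(np.corrcoef([peak, volume, duration]).round(2))     # dependence is preserved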

  11. Modeling one-choice and two-choice driving tasks.

    PubMed

    Ratcliff, Roger

    2015-08-01

    An experiment is presented in which subjects were tested on both one-choice and two-choice driving tasks and on non-driving versions of them. Diffusion models for one- and two-choice tasks were successful in extracting model-based measures from the response time and accuracy data. These include measures of the quality of the information from the stimuli that drove the decision process (drift rate in the model), the time taken up by processes outside the decision process and, for the two-choice model, the speed/accuracy decision criteria that subjects set. Drift rates were only marginally different between the driving and non-driving tasks, indicating that nearly the same information was used in the two kinds of tasks. The tasks differed in the time taken up by other processes, reflecting the difference between them in response processing demands. Drift rates were significantly correlated across the two two-choice tasks, showing that subjects who performed well on one task also performed well on the other task. Nondecision times were correlated across the two driving tasks, showing common abilities on motor processes across the two tasks. These results show the feasibility of using diffusion modeling to examine decision making in driving and so provide for a theoretical examination of factors that might impair driving, such as extreme aging, distraction, sleep deprivation, and so on.
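    The two-choice diffusion model itself is straightforward to simulate; the sketch below is a minimal version with assumed parameter values (drift rate v, boundary separation a, starting point z and non-decision time Ter), not the fitted values from the study:

      import numpy as np

      def simulate_ddm(v=0.25, a=1.2, ter=0.3, z=0.6, dt=0.001, sigma=1.0,
                       n_trials=2000, rng=np.random.default_rng(0)):
          """Simulate a two-choice diffusion model; returns response times and choices."""
          rts, choices = [], []
          for _ in range(n_trials):
              x, t = z, 0.0
              while 0.0 < x < a:                   # accumulate evidence until a boundary is hit
                  x += v * dt + sigma * np.sqrt(dt) * rng.standard_normal()
                  t += dt
              rts.append(t + ter)                  # add non-decision time
              choices.append(1 if x >= a else 0)   # 1 = upper ("correct") boundary
          return np.array(rts), np.array(choices)

      rts, choices = simulate_ddm()
      print(f"accuracy {choices.mean():.2f}, mean RT {rts.mean():.3f} s")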

  12. Variational upper and lower bounds for probabilistic graphical models.

    PubMed

    Wexler, Ydo; Geiger, Dan

    2008-09-01

    Probabilistic phylogenetic models which relax the site-independence evolution assumption often face the problem of infeasible likelihood computations, for example, for the task of selecting suitable parameters for the model. We present a new approximation method, applicable for a wide range of probabilistic models, which guarantees to upper and lower bound the true likelihood of data, and apply it to the problem of probabilistic phylogenetic models. The new method is complementary to known variational methods that lower bound the likelihood, and it uses similar methods to optimize the bounds from above and below. We applied our method to aligned DNA sequences of various lengths from human in the region of the CFTR gene and homologous sequences from eight mammals, and found the bounds to be appreciably close to the true likelihood whenever it could be computed. When computing the exact likelihood was not feasible, we demonstrated the proximity of the upper and lower variational bounds, implying a tight approximation of the likelihood.

  13. A Thurstonian Pairwise Choice Model with Univariate and Multivariate Spline Transformations.

    ERIC Educational Resources Information Center

    De Soete, Geert; Winsberg, Suzanne

    1993-01-01

    A probabilistic choice model, based on L. L. Thurstone's Law of Comparative Judgment Case V, is developed for paired comparisons data about psychological stimuli. The model assumes that each stimulus is measured on a small number of physical variables. An algorithm for estimating parameters is illustrated with real data. (SLD)
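    Under Thurstone's Case V assumptions (equal discriminal dispersions, zero correlation), the pairwise choice probability reduces to a normal CDF of the scale-value difference; the sketch below uses illustrative scale values, not estimates from the article's data:

      import numpy as np
      from scipy.stats import norm

      mu = {"A": 0.0, "B": 0.4, "C": 1.1}          # hypothetical stimulus scale values

      def p_prefer(i, j, scale=np.sqrt(2)):
          """P(i preferred to j) = Phi((mu_i - mu_j)/sqrt(2)) with unit-variance processes."""
          return norm.cdf((mu[i] - mu[j]) / scale)

      for i, j in [("B", "A"), ("C", "A"), ("C", "B")]:
          print(f"P({i} > {j}) = {p_prefer(i, j):.3f}")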

  14. Bayesian non-parametrics and the probabilistic approach to modelling

    PubMed Central

    Ghahramani, Zoubin

    2013-01-01

    Modelling is fundamental to many fields of science and engineering. A model can be thought of as a representation of possible data one could predict from a system. The probabilistic approach to modelling uses probability theory to express all aspects of uncertainty in the model. The probabilistic approach is synonymous with Bayesian modelling, which simply uses the rules of probability theory in order to make predictions, compare alternative models, and learn model parameters and structure from data. This simple and elegant framework is most powerful when coupled with flexible probabilistic models. Flexibility is achieved through the use of Bayesian non-parametrics. This article provides an overview of probabilistic modelling and an accessible survey of some of the main tools in Bayesian non-parametrics. The survey covers the use of Bayesian non-parametrics for modelling unknown functions, density estimation, clustering, time-series modelling, and representing sparsity, hierarchies, and covariance structure. More specifically, it gives brief non-technical overviews of Gaussian processes, Dirichlet processes, infinite hidden Markov models, Indian buffet processes, Kingman’s coalescent, Dirichlet diffusion trees and Wishart processes. PMID:23277609
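    As a small, concrete taste of the non-parametric tools surveyed (a sketch, not code from the article), the partition induced by a Dirichlet process with concentration alpha can be sampled with the Chinese restaurant process:

      import numpy as np

      def crp(n_customers=20, alpha=1.0, rng=np.random.default_rng(3)):
          """Sample cluster assignments from a Chinese restaurant process."""
          tables, assignments = [], []             # tables = customers per cluster
          for _ in range(n_customers):
              probs = np.array(tables + [alpha], dtype=float)
              probs /= probs.sum()
              k = rng.choice(len(probs), p=probs)
              if k == len(tables):
                  tables.append(1)                 # open a new table (new cluster)
              else:
                  tables[k] += 1
              assignments.append(int(k))
          return assignments, tables

      print(crp())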

  15. Exploring Term Dependences in Probabilistic Information Retrieval Model.

    ERIC Educational Resources Information Center

    Cho, Bong-Hyun; Lee, Changki; Lee, Gary Geunbae

    2003-01-01

    Describes a theoretic process to apply Bahadur-Lazarsfeld expansion (BLE) to general probabilistic models and the state-of-the-art 2-Poisson model. Through experiments on two standard document collections, one in Korean and one in English, it is demonstrated that incorporation of term dependences using BLE significantly contributes to performance…

  16. A Probabilistic Model of Phonological Relationships from Contrast to Allophony

    ERIC Educational Resources Information Center

    Hall, Kathleen Currie

    2009-01-01

    This dissertation proposes a model of phonological relationships, the Probabilistic Phonological Relationship Model (PPRM), that quantifies how predictably distributed two sounds in a relationship are. It builds on a core premise of traditional phonological analysis, that the ability to define phonological relationships such as contrast and…

  17. Training probabilistic VLSI models on-chip to recognise biomedical signals under hardware nonidealities.

    PubMed

    Jiang, P C; Chen, H

    2006-01-01

    VLSI implementation of probabilistic models is attractive for many biomedical applications. However, hardware non-idealities can prevent probabilistic VLSI models from modelling data optimally through on-chip learning. This paper investigates the maximum computational errors that a probabilistic VLSI model can tolerate when modelling real biomedical data. VLSI circuits capable of achieving the required precision are also proposed.

  18. Modelling default and likelihood reasoning as probabilistic reasoning

    NASA Technical Reports Server (NTRS)

    Buntine, Wray

    1990-01-01

    A probabilistic analysis of plausible reasoning about defaults and about likelihood is presented. 'Likely' and 'by default' are in fact treated as duals in the same sense as possibility and necessity. To model these four forms probabilistically, a qualitative default probabilistic (QDP) logic and its quantitative counterpart DP are derived that allow qualitative and corresponding quantitative reasoning. Consistency and consequent results for subsets of the logics are given that require at most a quadratic number of satisfiability tests in the underlying propositional logic. The quantitative logic shows how to track the propagation error inherent in these reasoning forms. The methodology and sound framework of the system highlight their approximate nature, the dualities, and the need for complementary reasoning about relevance.

  19. Probabilistic Usage of the Multi-Factor Interaction Model

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2008-01-01

    A Multi-Factor Interaction Model (MFIM) is used to predict the insulating foam mass expulsion during the ascent of a space vehicle. The exponents in the MFIM are evaluated by an available approach which consists of least squares and an optimization algorithm. These results were subsequently used to probabilistically evaluate the effects of the uncertainties in each participating factor on the mass expulsion. The probabilistic results show that the surface temperature dominates at high probabilities and the pressure, which causes the mass expulsion, dominates at low probabilities.

  20. ISSUES ASSOCIATED WITH PROBABILISTIC FAILURE MODELING OF DIGITAL SYSTEMS

    SciTech Connect

    CHU,T.L.; MARTINEZ-GURIDI,G.; LEHNER,J.; OVERLAND,D.

    2004-09-19

    The current U.S. Nuclear Regulatory Commission (NRC) licensing process of instrumentation and control (I&C) systems is based on deterministic requirements, e.g., single failure criteria, and defense in depth and diversity. Probabilistic considerations can be used as supplements to the deterministic process. The National Research Council has recommended development of methods for estimating failure probabilities of digital systems, including commercial off-the-shelf (COTS) equipment, for use in probabilistic risk assessment (PRA). NRC staff has developed informal qualitative and quantitative requirements for PRA modeling of digital systems. Brookhaven National Laboratory (BNL) has performed a review of the state of the art of the methods and tools that can potentially be used to model digital systems. The objectives of this paper are to summarize the review, discuss the issues associated with probabilistic modeling of digital systems, and identify potential areas of research that would enhance the state of the art toward a satisfactory modeling method that could be integrated with a typical probabilistic risk assessment.

  1. Integrating Boolean Queries in Conjunctive Normal Form with Probabilistic Retrieval Models.

    ERIC Educational Resources Information Center

    Losee, Robert M.; Bookstein, Abraham

    1988-01-01

    Presents a model that places Boolean database queries into conjunctive normal form, thereby allowing probabilistic ranking of documents and the incorporation of relevance feedback. Experimental results compare the performance of a sequential learning probabilistic retrieval model with the proposed integrated Boolean probabilistic model and a fuzzy…

  2. PGMC: a framework for probabilistic graphic model combination.

    PubMed

    Jiang, Chang An; Leong, Tze-Yun; Poh, Kim-Leng

    2005-01-01

    Decision making in biomedicine often involves incorporating new evidence into existing or working models reflecting the decision problems at hand. We propose a new framework that facilitates effective and incremental integration of multiple probabilistic graphical models. The proposed framework aims to minimize the time and effort required to customize and extend the original models through preserving the conditional independence relationships inherent in two types of probabilistic graphical models: Bayesian networks and influence diagrams. We present a four-step algorithm to systematically combine the qualitative and the quantitative parts of the different models; we also describe three heuristic methods for target variable generation to reduce the complexity of the integrated models. Preliminary results from a case study in heart disease diagnosis demonstrate the feasibility and potential for applying the proposed framework in real applications.

  3. Probabilistic Based Modeling and Simulation Assessment

    DTIC Science & Technology

    2010-06-01

    with an unbelted Hybrid III model. The Hybrid III dummy model was then restrained using a finite element seatbelt model. The dummy model has the ... artifacts of the modeling process and not representative of the true physics of the impact, and can thus be qualified as unwanted noise in the model

  4. Supermultiplicative Speedups of Probabilistic Model-Building Genetic Algorithms

    DTIC Science & Technology

    2009-02-01

    simulations. We (Todd Martinez (2005 MacArthur fellow), Duane Johnson, Kumara Sastry and David E. Goldberg) have applied multiobjective GAs and model ... Supermultiplicative Speedups of Probabilistic Model-Building Genetic Algorithms, AFOSR Grant No. FA9550-06-1-0096, February 1, 2006 to November 30, 2008. Authors: David E. Goldberg, Kumara Sastry, Martin Pelikan.

  5. Nonlinear sensor fault diagnosis using mixture of probabilistic PCA models

    NASA Astrophysics Data System (ADS)

    Sharifi, Reza; Langari, Reza

    2017-02-01

    This paper presents a methodology for sensor fault diagnosis in nonlinear systems using a Mixture of Probabilistic Principal Component Analysis (MPPCA) models. This methodology separates the measurement space into several locally linear regions, each of which is associated with a Probabilistic PCA (PPCA) model. Using the transformation associated with each PPCA model, a parity relation scheme is used to construct a residual vector. Bayesian analysis of the residuals forms the basis for detection and isolation of sensor faults across the entire range of operation of the system. The resulting method is demonstrated in its application to sensor fault diagnosis of a fully instrumented HVAC system. The results show accurate detection of sensor faults under the assumption that a single sensor is faulty.

  6. Probabilistic Analysis Techniques Applied to Complex Spacecraft Power System Modeling

    NASA Technical Reports Server (NTRS)

    Hojnicki, Jeffrey S.; Rusick, Jeffrey J.

    2005-01-01

    Electric power system performance predictions are critical to spacecraft, such as the International Space Station (ISS), to ensure that sufficient power is available to support all the spacecraft's power needs. In the case of the ISS power system, analyses to date have been deterministic, meaning that each analysis produces a single-valued result for power capability because of the complexity and large size of the model. As a result, the deterministic ISS analyses did not account for the sensitivity of the power capability to uncertainties in model input variables. Over the last 10 years, the NASA Glenn Research Center has developed advanced, computationally fast, probabilistic analysis techniques and successfully applied them to large (thousands of nodes) complex structural analysis models. These same techniques were recently applied to large, complex ISS power system models. This new application enables probabilistic power analyses that account for input uncertainties and produce results that include variations caused by these uncertainties. Specifically, N&R Engineering, under contract to NASA, integrated these advanced probabilistic techniques with Glenn's internationally recognized ISS power system model, System Power Analysis for Capability Evaluation (SPACE).
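    The underlying idea of probabilistic power analysis can be sketched with a toy Monte Carlo propagation (the power function and all input distributions below are illustrative assumptions, not the SPACE or NESSUS models):

      import numpy as np

      rng = np.random.default_rng(42)
      n = 100_000
      solar_flux  = rng.normal(1361.0, 5.0, n)       # W/m^2 (assumed uncertainty)
      array_area  = rng.normal(375.0, 2.0, n)        # m^2
      efficiency  = rng.normal(0.14, 0.005, n)       # cell efficiency
      degradation = rng.uniform(0.90, 0.98, n)       # seasonal/aging factor

      # Toy power-capability function; each sample is one "analysis run".
      power_kw = solar_flux * array_area * efficiency * degradation / 1000.0
      print(f"mean {power_kw.mean():.1f} kW, 5th-95th percentile "
            f"[{np.percentile(power_kw, 5):.1f}, {np.percentile(power_kw, 95):.1f}] kW")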

  7. M-estimation with probabilistic models of geodetic observations

    NASA Astrophysics Data System (ADS)

    Wiśniewski, Z.

    2014-10-01

    The paper concerns M-estimation with probabilistic models of geodetic observations. Special attention is paid to estimation that includes the asymmetry and the excess kurtosis, which are basic anomalies of empirical distributions of errors of geodetic or astrometric observations (in comparison to Gaussian errors). It is assumed that the influence function of the estimation is given by the differential equation that defines the system of Pearson distributions. The central moments, up to the fourth order, are the parameters of that system and thus they are also the parameters of the chosen influence function. The estimation based on the Pearson type IV and VII distributions is analyzed in great detail from a theoretical point of view as well as by applying numerical tests. The chosen distributions are leptokurtic with asymmetry, which reflects the general characteristics of empirical distributions. Considering M-estimation with probabilistic models, the Gram-Charlier series are also applied to approximate the models in question. The paper shows that estimation with the application of probabilistic models belongs to the class of robust estimations; the Pearson-based variant is especially effective in that case. It is suggested that even in the absence of significant anomalies the method in question should be regarded as robust against gross errors, while its robustness is controlled by the pseudo-kurtosis.

  8. Probabilistic grammatical model for helix‐helix contact site classification

    PubMed Central

    2013-01-01

    Background Hidden Markov Models power many state-of-the-art tools in the field of protein bioinformatics. While excelling in their tasks, these methods of protein analysis do not directly convey information on medium- and long-range residue-residue interactions. This requires an expressive power of at least context-free grammars. However, application of more powerful grammar formalisms to protein analysis has been surprisingly limited. Results In this work, we present a probabilistic grammatical framework for problem-specific protein languages and apply it to classification of transmembrane helix-helix pair configurations. The core of the model consists of a probabilistic context-free grammar, automatically inferred by a genetic algorithm from only a generic set of expert-based rules and positive training samples. The model was applied to produce sequence-based descriptors of four classes of transmembrane helix-helix contact site configurations. The highest performance of the classifiers reached an AUC ROC of 0.70. The analysis of grammar parse trees revealed the ability to represent structural features of helix-helix contact sites. Conclusions We demonstrated that our probabilistic context-free framework for analysis of protein sequences outperforms the state of the art in the task of helix-helix contact site classification. However, this is achieved without necessarily modeling long-range dependencies between interacting residues. A significant feature of our approach is that grammar rules and parse trees are human-readable. Thus they could provide biologically meaningful information for molecular biologists. PMID:24350601

  9. Probabilistic graphical model representation in phylogenetics.

    PubMed

    Höhna, Sebastian; Heath, Tracy A; Boussau, Bastien; Landis, Michael J; Ronquist, Fredrik; Huelsenbeck, John P

    2014-09-01

    Recent years have seen a rapid expansion of the model space explored in statistical phylogenetics, emphasizing the need for new approaches to statistical model representation and software development. Clear communication and representation of the chosen model is crucial for: (i) reproducibility of an analysis, (ii) model development, and (iii) software design. Moreover, a unified, clear and understandable framework for model representation lowers the barrier for beginners and nonspecialists to grasp complex phylogenetic models, including their assumptions and parameter/variable dependencies. Graphical modeling is a unifying framework that has gained in popularity in the statistical literature in recent years. The core idea is to break complex models into conditionally independent distributions. The strength lies in the comprehensibility, flexibility, and adaptability of this formalism, and the large body of computational work based on it. Graphical models are well-suited to teach statistical models, to facilitate communication among phylogeneticists and in the development of generic software for simulation and statistical inference. Here, we provide an introduction to graphical models for phylogeneticists and extend the standard graphical model representation to the realm of phylogenetics. We introduce a new graphical model component, tree plates, to capture the changing structure of the subgraph corresponding to a phylogenetic tree. We describe a range of phylogenetic models using the graphical model framework and introduce modules to simplify the representation of standard components in large and complex models. Phylogenetic model graphs can be readily used in simulation, maximum likelihood inference, and Bayesian inference using, for example, Metropolis-Hastings or Gibbs sampling of the posterior distribution.

  10. Probabilistic Priority Message Checking Modeling Based on Controller Area Networks

    NASA Astrophysics Data System (ADS)

    Lin, Cheng-Min

    Although the probabilistic model checking tool called PRISM has been applied in many communication systems, such as wireless local area networks, Bluetooth, and ZigBee, the technique has not been used in a controller area network (CAN). In this paper, we use PRISM to model the mechanism of priority messages for CAN, because this mechanism has allowed CAN to become the leader in serial communication for automotive and industrial control. Through modeling CAN, it is easy to analyze the characteristics of CAN for further improving the security and efficiency of automobiles. The Markov chain model helps us to model the behaviour of priority messages.

  11. Probabilistic-Based Modeling and Simulation Assessment

    DTIC Science & Technology

    2010-06-01

    mph crash simulation at 100 ms with an unbelted Hybrid III model. The Hybrid III dummy model was then restrained using a finite element seatbelt ... true physics of the impact, and can thus be qualified as unwanted noise in the model response. Unfortunately, it is difficult to quantify the

  12. Influential input classification in probabilistic multimedia models

    SciTech Connect

    Maddalena, Randy L.; McKone, Thomas E.; Hsieh, Dennis P.H.; Geng, Shu

    1999-05-01

    Monte Carlo analysis is a statistical simulation method that is often used to assess and quantify the outcome variance in complex environmental fate and effects models. Total outcome variance of these models is a function of (1) the uncertainty and/or variability associated with each model input and (2) the sensitivity of the model outcome to changes in the inputs. To propagate variance through a model using Monte Carlo techniques, each variable must be assigned a probability distribution. The validity of these distributions directly influences the accuracy and reliability of the model outcome. To efficiently allocate resources for constructing distributions one should first identify the most influential set of variables in the model. Although existing sensitivity and uncertainty analysis methods can provide a relative ranking of the importance of model inputs, they fail to identify the minimum set of stochastic inputs necessary to sufficiently characterize the outcome variance. In this paper, we describe and demonstrate a novel sensitivity/uncertainty analysis method for assessing the importance of each variable in a multimedia environmental fate model. Our analyses show that for a given scenario, a relatively small number of input variables influence the central tendency of the model and an even smaller set determines the shape of the outcome distribution. For each input, the level of influence depends on the scenario under consideration. This information is useful for developing site specific models and improving our understanding of the processes that have the greatest influence on the variance in outcomes from multimedia models.
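    A minimal version of such an importance screen (a sketch with a toy model and assumed input distributions, not the paper's method) ranks inputs by the magnitude of their rank correlation with the Monte Carlo output:

      import numpy as np
      from scipy.stats import spearmanr

      rng = np.random.default_rng(0)
      n = 10_000
      inputs = {
          "emission_rate": rng.lognormal(0.0, 0.8, n),
          "half_life":     rng.lognormal(1.0, 0.3, n),
          "partition_k":   rng.lognormal(0.5, 0.1, n),
          "intake_rate":   rng.normal(1.0, 0.05, n),
      }
      # Toy exposure model: strongly driven by the first two inputs.
      output = (inputs["emission_rate"] * inputs["half_life"]
                * inputs["partition_k"] ** 0.2 * inputs["intake_rate"])

      def importance(name):
          rho, _ = spearmanr(inputs[name], output)
          return abs(rho)

      for name in sorted(inputs, key=importance, reverse=True):
          print(f"{name:14s} |rho| = {importance(name):.2f}")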

  13. Probabilistic graphic models applied to identification of diseases

    PubMed Central

    Sato, Renato Cesar; Sato, Graziela Tiemy Kajita

    2015-01-01

    ABSTRACT Decision-making is fundamental when making diagnosis or choosing treatment. The broad dissemination of computed systems and databases allows systematization of part of decisions through artificial intelligence. In this text, we present basic use of probabilistic graphic models as tools to analyze causality in health conditions. This method has been used to make diagnosis of Alzheimer's disease, sleep apnea and heart diseases. PMID:26154555

  14. Probabilistic graphic models applied to identification of diseases.

    PubMed

    Sato, Renato Cesar; Sato, Graziela Tiemy Kajita

    2015-01-01

    Decision-making is fundamental when making diagnosis or choosing treatment. The broad dissemination of computed systems and databases allows systematization of part of decisions through artificial intelligence. In this text, we present basic use of probabilistic graphic models as tools to analyze causality in health conditions. This method has been used to make diagnosis of Alzheimer's disease, sleep apnea and heart diseases.

  15. Probabilistic Modeling of Intracranial Pressure Effects on Optic Nerve Biomechanics

    NASA Technical Reports Server (NTRS)

    Ethier, C. R.; Feola, Andrew J.; Raykin, Julia; Myers, Jerry G.; Nelson, Emily S.; Samuels, Brian C.

    2016-01-01

    Altered intracranial pressure (ICP) is involved/implicated in several ocular conditions: papilledema, glaucoma and Visual Impairment and Intracranial Pressure (VIIP) syndrome. The biomechanical effects of altered ICP on optic nerve head (ONH) tissues in these conditions are uncertain but likely important. We have quantified ICP-induced deformations of ONH tissues, using finite element (FE) and probabilistic modeling (Latin Hypercube Simulations (LHS)) to consider a range of tissue properties and relevant pressures.
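    A minimal Latin hypercube sampling sketch (with assumed, purely illustrative parameter ranges rather than the study's tissue-property distributions), using SciPy's quasi-Monte Carlo module:

      import numpy as np
      from scipy.stats import qmc

      sampler = qmc.LatinHypercube(d=3, seed=1)
      unit = sampler.random(n=200)                     # 200 stratified samples in [0, 1]^3
      # Assumed ranges: sclera modulus (MPa), lamina cribrosa modulus (MPa), ICP (mmHg)
      lower, upper = [1.0, 0.1, 5.0], [5.0, 0.7, 25.0]
      samples = qmc.scale(unit, lower, upper)
      print(samples[:3].round(2))                      # each row would drive one FE model run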

  16. Probabilistic Independence Networks for Hidden Markov Probability Models

    NASA Technical Reports Server (NTRS)

    Smyth, Padhraic; Heckerman, David; Jordan, Michael I.

    1996-01-01

    In this paper we explore hidden Markov models (HMMs) and related structures within the general framework of probabilistic independence networks (PINs). The paper contains a self-contained review of the basic principles of PINs. It is shown that the well-known forward-backward (F-B) and Viterbi algorithms for HMMs are special cases of more general inference algorithms for arbitrary PINs.
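    For reference, the forward-backward recursion the paper treats as a special case of general PIN inference can be written in a few lines for a small discrete HMM (toy parameter values):

      import numpy as np

      A = np.array([[0.7, 0.3],        # state transition matrix
                    [0.4, 0.6]])
      B = np.array([[0.9, 0.1],        # emission probabilities P(obs | state)
                    [0.2, 0.8]])
      pi = np.array([0.5, 0.5])        # initial state distribution
      obs = [0, 0, 1, 0, 1]            # observed symbol indices

      alpha = np.zeros((len(obs), 2))
      beta = np.ones((len(obs), 2))
      alpha[0] = pi * B[:, obs[0]]
      for t in range(1, len(obs)):                     # forward pass
          alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
      for t in range(len(obs) - 2, -1, -1):            # backward pass
          beta[t] = A @ (B[:, obs[t + 1]] * beta[t + 1])

      posterior = alpha * beta                         # P(state_t | all observations), unnormalized
      posterior /= posterior.sum(axis=1, keepdims=True)
      print(posterior.round(3))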

  17. Nonlinear probabilistic finite element models of laminated composite shells

    NASA Technical Reports Server (NTRS)

    Engelstad, S. P.; Reddy, J. N.

    1993-01-01

    A probabilistic finite element analysis procedure for laminated composite shells has been developed. A total Lagrangian finite element formulation, employing a degenerated 3-D laminated composite shell with the full Green-Lagrange strains and first-order shear deformable kinematics, forms the modeling foundation. The first-order second-moment technique for probabilistic finite element analysis of random fields is employed and results are presented in the form of mean and variance of the structural response. The effects of material nonlinearity are included through the use of a rate-independent anisotropic plasticity formulation with the macroscopic point of view. Both ply-level and micromechanics-level random variables can be selected, the latter by means of the Aboudi micromechanics model. A number of sample problems are solved to verify the accuracy of the procedures developed and to quantify the variability of certain material type/structure combinations. Experimental data is compared in many cases, and the Monte Carlo simulation method is used to check the probabilistic results. In general, the procedure is quite effective in modeling the mean and variance response of the linear and nonlinear behavior of laminated composite shells.

  18. Economic Dispatch for Microgrid Containing Electric Vehicles via Probabilistic Modelling

    SciTech Connect

    Yao, Yin; Gao, Wenzhong; Momoh, James; Muljadi, Eduard

    2015-10-06

    In this paper, an economic dispatch model with probabilistic modeling is developed for microgrid. Electric power supply in microgrid consists of conventional power plants and renewable energy power plants, such as wind and solar power plants. Due to the fluctuation of solar and wind plants' output, an empirical probabilistic model is developed to predict their hourly output. According to different characteristics of wind and solar plants, the parameters for probabilistic distribution are further adjusted individually for both power plants. On the other hand, with the growing trend of Plug-in Electric Vehicle (PHEV), an integrated microgrid system must also consider the impact of PHEVs. Not only the charging loads from PHEVs, but also the discharging output via Vehicle to Grid (V2G) method can greatly affect the economic dispatch for all the micro energy sources in microgrid. This paper presents an optimization method for economic dispatch in microgrid considering conventional, renewable power plants, and PHEVs. The simulation results reveal that PHEVs with V2G capability can be an indispensable supplement in modern microgrid.

  19. GENERAL: A modified weighted probabilistic cellular automaton traffic flow model

    NASA Astrophysics Data System (ADS)

    Zhuang, Qian; Jia, Bin; Li, Xin-Gang

    2009-08-01

    This paper modifies the weighted probabilistic cellular automaton model (Li X L, Kuang H, Song T, et al 2008 Chin. Phys. B 17 2366) which considered a diversity of traffic behaviors under real traffic situations induced by various driving characters and habits. In the new model, the effects of the velocity at the last time step and drivers' desire for acceleration are taken into account. The fundamental diagram, spatial-temporal diagram, and the time series of one-minute data are analyzed. The results show that this model reproduces synchronized flow. Finally, it simulates the on-ramp system with the proposed model. Some characteristics including the phase diagram are studied.

  20. Recent advances and applications of probabilistic topic models

    NASA Astrophysics Data System (ADS)

    Wood, Ian

    2014-12-01

    I present here an overview of recent advances in probabilistic topic modelling and related Bayesian graphical models as well as some of their more atypical applications outside of their home: text analysis. These techniques allow the modelling of high-dimensional count vectors with strong correlations. With such data, simply calculating a correlation matrix is infeasible. Probabilistic topic models address this using mixtures of multinomials estimated via Bayesian inference with Dirichlet priors. The use of conjugate priors allows for efficient inference, and these techniques scale well to data sets with many millions of vectors. The first of these techniques to attract significant attention was Latent Dirichlet Allocation (LDA) [1, 2]. Numerous extensions and adaptations of LDA have been proposed: non-parametric models; assorted models incorporating authors, sentiment and other features; models regularised through the use of extra metadata or extra priors on topic structure, and many more [3]. They have become widely used in the text analysis and population genetics communities, with a number of compelling applications. These techniques are not restricted to text analysis, however, and can be applied to other types of data which can be sensibly discretised and represented as counts of labels/properties/etc. LDA and its variants have been used to find patterns in data from diverse areas of inquiry, including genetics, plant physiology, image analysis, social network analysis, remote sensing and astrophysics. Nonetheless, it is relatively recently that probabilistic topic models have found applications outside of text analysis, and to date few such applications have been considered. I suggest that there is substantial untapped potential for topic models and models inspired by or incorporating topic models to be fruitfully applied, and outline the characteristics of systems and data for which this may be the case.
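    A quick, hedged illustration on a toy corpus (using scikit-learn's LDA implementation, one of several available; corpus and topic count are made up):

      from sklearn.feature_extraction.text import CountVectorizer
      from sklearn.decomposition import LatentDirichletAllocation

      docs = [
          "star galaxy telescope redshift survey",
          "galaxy cluster dark matter survey",
          "gene expression cell protein sequence",
          "protein sequence genome cell mutation",
      ]
      vec = CountVectorizer()
      counts = vec.fit_transform(docs)                          # document-term count matrix
      lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(counts)

      terms = vec.get_feature_names_out()
      for k, topic in enumerate(lda.components_):
          top = topic.argsort()[-4:][::-1]                      # highest-weight terms per topic
          print(f"topic {k}:", [terms[i] for i in top])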

  1. Quantum-like Probabilistic Models Outside Physics

    NASA Astrophysics Data System (ADS)

    Khrennikov, Andrei

    We present a quantum-like (QL) model in which contexts (complexes of e.g. mental, social, biological, economic or even political conditions) are represented by complex probability amplitudes. This approach makes it possible to apply the mathematical quantum formalism to probabilities induced in any domain of science. In our model quantum randomness appears not as irreducible randomness (as it is commonly accepted in conventional quantum mechanics, e.g. by von Neumann and Dirac), but as a consequence of obtaining incomplete information about a system. We pay main attention to the QL description of processing of incomplete information. Our QL model can be useful in cognitive, social and political sciences as well as economics and artificial intelligence. In this paper we consider in more detail one special application — QL modeling of the brain's functioning. The brain is modeled as a QL-computer.

  2. Probabilistic prediction models for aggregate quarry siting

    USGS Publications Warehouse

    Robinson, G.R.; Larkins, P.M.

    2007-01-01

    Weights-of-evidence (WofE) and logistic regression techniques were used in a GIS framework to predict the spatial likelihood (prospectivity) of crushed-stone aggregate quarry development. The joint conditional probability models, based on geology, transportation network, and population density variables, were defined using quarry location and time of development data for the New England States, North Carolina, and South Carolina, USA. The Quarry Operation models describe the distribution of active aggregate quarries, independent of the date of opening. The New Quarry models describe the distribution of aggregate quarries when they open. Because of the small number of new quarries developed in the study areas during the last decade, independent New Quarry models have low parameter estimate reliability. The performance of parameter estimates derived for Quarry Operation models, defined by a larger number of active quarries in the study areas, was tested and evaluated to predict the spatial likelihood of new quarry development. Population density conditions at the time of new quarry development were used to modify the population density variable in the Quarry Operation models to apply to new quarry development sites. The Quarry Operation parameters derived for the New England study area, Carolina study area, and the combined New England and Carolina study areas were all similar in magnitude and relative strength. The Quarry Operation model parameters, using the modified population density variables, were found to be a good predictor of new quarry locations. Both the aggregate industry and the land management community can use the model approach to target areas for more detailed site evaluation for quarry location. The models can be revised easily to reflect actual or anticipated changes in transportation and population features. © International Association for Mathematical Geology 2007.

  3. Learning structurally consistent undirected probabilistic graphical models.

    PubMed

    Roy, Sushmita; Lane, Terran; Werner-Washburne, Margaret

    2009-01-01

    In many real-world domains, undirected graphical models such as Markov random fields provide a more natural representation of the statistical dependency structure than directed graphical models. Unfortunately, structure learning of undirected graphs using likelihood-based scores remains difficult because of the intractability of computing the partition function. We describe a new Markov random field structure learning algorithm, motivated by canonical parameterization of Abbeel et al. We provide computational improvements on their parameterization by learning per-variable canonical factors, which makes our algorithm suitable for domains with hundreds of nodes. We compare our algorithm against several algorithms for learning undirected and directed models on simulated and real datasets from biology. Our algorithm frequently outperforms existing algorithms, producing higher-quality structures, suggesting that enforcing consistency during structure learning is beneficial for learning undirected graphs.

  4. Probabilistic Modeling of Settlement Risk at Land Disposal Facilities - 12304

    SciTech Connect

    Foye, Kevin C.; Soong, Te-Yang

    2012-07-01

    The long-term reliability of land disposal facility final cover systems - and therefore the overall waste containment - depends on the distortions imposed on these systems by differential settlement/subsidence. The evaluation of differential settlement is challenging because of the heterogeneity of the waste mass (caused by inconsistent compaction, void space distribution, debris-soil mix ratio, waste material stiffness, time-dependent primary compression of the fine-grained soil matrix, long-term creep settlement of the soil matrix and the debris, etc.) at most land disposal facilities. Deterministic approaches to long-term final cover settlement prediction are not able to capture the spatial variability in the waste mass and sub-grade properties which control differential settlement. An alternative, probabilistic solution is to use random fields to model the waste and sub-grade properties. The modeling effort informs the design, construction, operation, and maintenance of land disposal facilities. A probabilistic method to establish design criteria for waste placement and compaction is introduced using the model. Random fields are ideally suited to problems of differential settlement modeling of highly heterogeneous foundations, such as waste. Random fields model the seemingly random spatial distribution of a design parameter, such as compressibility. When used for design, the use of these models prompts the need for probabilistic design criteria. It also allows for a statistical approach to waste placement acceptance criteria. An example design evaluation was performed, illustrating the use of the probabilistic differential settlement simulation methodology to assemble a design guidance chart. The purpose of this design evaluation is to enable the designer to select optimal initial combinations of design slopes and quality control acceptance criteria that yield an acceptable proportion of post-settlement slopes meeting some design minimum. For this specific

  5. A Probabilistic Model of Cross-Categorization

    ERIC Educational Resources Information Center

    Shafto, Patrick; Kemp, Charles; Mansinghka, Vikash; Tenenbaum, Joshua B.

    2011-01-01

    Most natural domains can be represented in multiple ways: we can categorize foods in terms of their nutritional content or social role, animals in terms of their taxonomic groupings or their ecological niches, and musical instruments in terms of their taxonomic categories or social uses. Previous approaches to modeling human categorization have…

  6. A Probabilistic Model of Theory Formation

    ERIC Educational Resources Information Center

    Kemp, Charles; Tenenbaum, Joshua B.; Niyogi, Sourabh; Griffiths, Thomas L.

    2010-01-01

    Concept learning is challenging in part because the meanings of many concepts depend on their relationships to other concepts. Learning these concepts in isolation can be difficult, but we present a model that discovers entire systems of related concepts. These systems can be viewed as simple theories that specify the concepts that exist in a…

  7. Probabilistic flood damage modelling at the meso-scale

    NASA Astrophysics Data System (ADS)

    Kreibich, Heidi; Botto, Anna; Schröter, Kai; Merz, Bruno

    2014-05-01

    Decisions on flood risk management and adaptation are usually based on risk analyses. Such analyses are associated with significant uncertainty, even more so if changes in risk due to global change are expected. Although uncertainty analysis and probabilistic approaches have received increased attention in recent years, they are still not standard practice for flood risk assessments. Most damage models have in common that complex damaging processes are described by simple, deterministic approaches like stage-damage functions. Novel probabilistic, multi-variate flood damage models have been developed and validated on the micro-scale using a data-mining approach, namely bagging decision trees (Merz et al. 2013). In this presentation we show how the model BT-FLEMO (Bagging decision Tree based Flood Loss Estimation MOdel) can be applied on the meso-scale, namely on the basis of ATKIS land-use units. The model is applied in 19 municipalities which were affected during the 2002 flood by the River Mulde in Saxony, Germany. The application of BT-FLEMO provides a probability distribution of estimated damage to residential buildings per municipality. Validation is undertaken on the one hand via a comparison with eight other damage models including stage-damage functions as well as multi-variate models. On the other hand the results are compared with official damage data provided by the Saxon Relief Bank (SAB). The results show that uncertainties of damage estimation remain high. Thus, the significant advantage of this probabilistic flood loss estimation model BT-FLEMO is that it inherently provides quantitative information about the uncertainty of the prediction. Reference: Merz, B.; Kreibich, H.; Lall, U. (2013): Multi-variate flood damage assessment: a tree-based data-mining approach. NHESS, 13(1), 53-64.
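    The distributional flavour of the approach can be sketched with bagged regression trees on synthetic data (assumed predictors and a made-up damage function, not FLEMO's real variables): the spread of per-tree predictions yields a probability distribution of estimated damage rather than a single value.

      import numpy as np
      from sklearn.ensemble import BaggingRegressor
      from sklearn.tree import DecisionTreeRegressor

      rng = np.random.default_rng(0)
      # Toy predictors: water depth (m), flow velocity (m/s), building value (k euro)
      X = np.column_stack([rng.uniform(0, 3, 500),
                           rng.uniform(0, 2, 500),
                           rng.uniform(50, 500, 500)])
      damage = 0.2 * X[:, 0] * X[:, 2] + 10 * X[:, 1] + rng.normal(0, 5, 500)

      model = BaggingRegressor(DecisionTreeRegressor(), n_estimators=200,
                               random_state=0).fit(X, damage)
      new_case = np.array([[1.5, 0.8, 200.0]])
      per_tree = np.array([t.predict(new_case)[0] for t in model.estimators_])
      print(f"median {np.median(per_tree):.1f} k euro, 90% interval "
            f"[{np.percentile(per_tree, 5):.1f}, {np.percentile(per_tree, 95):.1f}]")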

  8. Probabilistic Modeling of the Renal Stone Formation Module

    NASA Technical Reports Server (NTRS)

    Best, Lauren M.; Myers, Jerry G.; Goodenow, Debra A.; McRae, Michael P.; Jackson, Travis C.

    2013-01-01

    The Integrated Medical Model (IMM) is a probabilistic tool used in mission planning decision making and medical systems risk assessments. The IMM project maintains a database of over 80 medical conditions that could occur during a spaceflight, documenting an incidence rate and end case scenarios for each. In some cases, where observational data are insufficient to adequately define the inflight medical risk, the IMM utilizes external probabilistic modules to model and estimate the event likelihoods. One such medical event of interest is an unpassed renal stone. Due to a high salt diet and high concentrations of calcium in the blood (due to bone depletion caused by unloading in the microgravity environment), astronauts are at a considerably elevated risk of developing renal calculi (nephrolithiasis) while in space. Lack of observed incidences of nephrolithiasis has led HRP to initiate the development of the Renal Stone Formation Module (RSFM) to create a probabilistic simulator capable of estimating the likelihood of symptomatic renal stone presentation in astronauts on exploration missions. The model consists of two major parts. The first is the probabilistic component, which utilizes probability distributions to assess the range of urine electrolyte parameters and a multivariate regression to transform estimated crystal density and size distributions to the likelihood of the presentation of nephrolithiasis symptoms. The second is a deterministic physical and chemical model of renal stone growth in the kidney developed by Kassemi et al. The probabilistic component of the renal stone model couples the input probability distributions describing the urine chemistry, astronaut physiology, and system parameters with the physical and chemical outputs and inputs to the deterministic stone growth model. These two parts of the model are necessary to capture the uncertainty in the likelihood estimate. The model will be driven by Monte Carlo simulations, continuously

  9. New probabilistic graphical models for genetic regulatory networks studies.

    PubMed

    Wang, Junbai; Cheung, Leo Wang-Kit; Delabie, Jan

    2005-12-01

    This paper introduces two new probabilistic graphical models for reconstruction of genetic regulatory networks using DNA microarray data. One is an independence graph (IG) model with either a forward or a backward search algorithm and the other one is a Gaussian network (GN) model with a novel greedy search method. The performances of both models were evaluated on four MAPK pathways in yeast and three simulated data sets. Generally, an IG model provides a sparse graph but a GN model produces a dense graph where more information about gene-gene interactions may be preserved. The results of our proposed models were compared with several other commonly used models, and our models have shown to give superior performance. Additionally, we found the same common limitations in the prediction of genetic regulatory networks when using only DNA microarray data.

  10. A probabilistic model of brittle crack formation

    NASA Technical Reports Server (NTRS)

    Chudnovsky, A.; Kunin, B.

    1987-01-01

    Probability of a brittle crack formation in an elastic solid with fluctuating strength is considered. A set Omega of all possible crack trajectories reflecting the fluctuation of the strength field is introduced. The probability P(X) that crack penetration depth exceeds X is expressed as a functional integral over Omega of a conditional probability of the same event taking place along a particular path. Various techniques are considered to evaluate the integral. Under rather nonrestrictive assumptions, the integral is reduced to solving a diffusion-type equation. A new characteristic of fracture process, 'crack diffusion coefficient', is introduced. An illustrative example is then considered where the integration is reduced to solving an ordinary differential equation. The effect of the crack diffusion coefficient and of the magnitude of strength fluctuations on probability density of crack penetration depth is presented. Practical implications of the proposed model are discussed.

  11. A probabilistic gastrointestinal tract dosimetry model

    NASA Astrophysics Data System (ADS)

    Huh, Chulhaeng

    In internal dosimetry, the tissues of the gastrointestinal (GI) tract represent, together with the hematopoietic bone marrow, one of the most radiosensitive organs of the body. Endoscopic ultrasound is a unique tool to acquire in-vivo data on GI tract wall thicknesses of the resolution needed in radiation dosimetry studies. Through their different echo texture and intensity, five layers of differing echo patterns for superficial mucosa, deep mucosa, submucosa, muscularis propria and serosa exist within the walls of organs composing the alimentary tract. Thicknesses for stomach mucosa ranged from 620 ± 150 μm to 1320 ± 80 μm (total stomach wall thicknesses from 2.56 ± 0.12 to 4.12 ± 0.11 mm). Measurements made for the rectal images revealed rectal mucosal thicknesses from 150 ± 90 μm to 670 ± 110 μm (total rectal wall thicknesses from 2.01 ± 0.06 to 3.35 ± 0.46 mm). The mucosa thus accounted for 28 ± 3% and 16 ± 6% of the total thickness of the stomach and rectal wall, respectively. Radiation transport simulations were then performed using the Monte Carlo N-Particle (MCNP) 4C transport code to calculate S values (Gy/Bq-s) for penetrating and nonpenetrating radiations such as photons, beta particles, conversion electrons and Auger electrons of selected nuclides (I-123, I-131, Tc-99m and Y-90) under two source conditions: content and mucosa sources, respectively. The results of this study demonstrate generally good agreement with published data for the stomach mucosa wall. The rectal mucosa data are consistently higher than published data for the large intestine due to different radiosensitive cell thicknesses (350 μm vs. a range spanning from 149 μm to 729 μm) and different geometry when a rectal content source is considered. Generally, the ICRP models have been designed to predict the amount of radiation dose in the human body for a "typical" or "reference" individual in a given population. The study has been performed to

  12. Alternative fuels and vehicles choice model

    SciTech Connect

    Greene, D.L.

    1994-10-01

    This report describes the theory and implementation of a model of alternative fuel and vehicle choice (AFVC), designed for use with the US Department of Energy's Alternative Fuels Trade Model (AFTM). The AFTM is a static equilibrium model of the world supply and demand for liquid fuels, encompassing resource production, conversion processes, transportation, and consumption. The AFTM also includes fuel-switching behavior by incorporating multinomial logit-type equations for choice of alternative fuel vehicles and alternative fuels. This allows the model to solve for market shares of vehicles and fuels, as well as for fuel prices and quantities. The AFVC model includes fuel-flexible, bi-fuel, and dedicated fuel vehicles. For multi-fuel vehicles, the choice of fuel is subsumed within the vehicle choice framework, resulting in a nested multinomial logit design. The nesting is shown to be required by the different price elasticities of fuel and vehicle choice. A unique feature of the AFVC is that its parameters are derived directly from the characteristics of alternative fuels and vehicle technologies, together with a few key assumptions about consumer behavior. This not only establishes a direct link between assumptions and model predictions, but facilitates sensitivity testing, as well. The implementation of the AFVC model as a spreadsheet is also described.
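    A minimal nested multinomial logit sketch (all utilities and nesting parameters below are illustrative, not the AFVC model's calibrated values) showing how fuel choice can be nested within vehicle choice:

      import numpy as np

      # Nests = vehicle types; alternatives inside a nest = fuels usable by that vehicle.
      nests = {
          "flex_fuel": {"gasoline": 1.0, "ethanol": 0.6},
          "dedicated_cng": {"cng": 0.8},
      }
      lam = {"flex_fuel": 0.5, "dedicated_cng": 1.0}   # nesting (dissimilarity) parameters

      # Inclusive value of each nest: I_m = ln sum_j exp(V_j / lambda_m)
      incl = {m: np.log(sum(np.exp(v / lam[m]) for v in alts.values()))
              for m, alts in nests.items()}
      # Upper level: P(nest) proportional to exp(lambda_m * I_m)
      nest_expu = {m: np.exp(lam[m] * incl[m]) for m in nests}
      total = sum(nest_expu.values())

      for m, alts in nests.items():
          p_nest = nest_expu[m] / total
          denom = sum(np.exp(v / lam[m]) for v in alts.values())
          for fuel, v in alts.items():
              p_fuel = np.exp(v / lam[m]) / denom      # lower level: P(fuel | nest)
              print(f"P({m}, {fuel}) = {p_nest * p_fuel:.3f}")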

  13. Consumer Vehicle Choice Model Documentation

    SciTech Connect

    Liu, Changzheng; Greene, David L

    2012-08-01

    In response to the Fuel Economy and Greenhouse Gas (GHG) emissions standards, automobile manufacturers will need to adopt new technologies to improve the fuel economy of their vehicles and to reduce the overall GHG emissions of their fleets. The U.S. Environmental Protection Agency (EPA) has developed the Optimization Model for reducing GHGs from Automobiles (OMEGA) to estimate the costs and benefits of meeting GHG emission standards through different technology packages. However, the model does not simulate the impact that increased technology costs will have on vehicle sales or on consumer surplus. As the model documentation states, “While OMEGA incorporates functions which generally minimize the cost of meeting a specified carbon dioxide (CO2) target, it is not an economic simulation model which adjusts vehicle sales in response to the cost of the technology added to each vehicle.” Changes in the mix of vehicles sold, caused by the costs and benefits of added fuel economy technologies, could make it easier or more difficult for manufacturers to meet fuel economy and emissions standards, and impacts on consumer surplus could raise the costs or augment the benefits of the standards. Because the OMEGA model does not presently estimate such impacts, the EPA is investigating the feasibility of developing an adjunct to the OMEGA model to make such estimates. This project is an effort to develop and test a candidate model. The project statement of work spells out the key functional requirements for the new model.

  14. Spatial probabilistic pulsatility model for enhancing photoplethysmographic imaging systems

    NASA Astrophysics Data System (ADS)

    Amelard, Robert; Clausi, David A.; Wong, Alexander

    2016-11-01

    Photoplethysmographic imaging (PPGI) is a widefield noncontact biophotonic technology able to remotely monitor cardiovascular function over anatomical areas. Although spatial context can provide insight into physiologically relevant sampling locations, existing PPGI systems rely on coarse spatial averaging with no anatomical priors for assessing arterial pulsatility. Here, we developed a continuous probabilistic pulsatility model for importance-weighted blood pulse waveform extraction. Using a data-driven approach, the model was constructed using a 23 participant sample with a large demographic variability (11/12 female/male, age 11 to 60 years, BMI 16.4 to 35.1 kg·m⁻²). Using time-synchronized ground-truth blood pulse waveforms, spatial correlation priors were computed and projected into a coaligned importance-weighted Cartesian space. A modified Parzen-Rosenblatt kernel density estimation method was used to compute the continuous resolution-agnostic probabilistic pulsatility model. The model identified locations that consistently exhibited pulsatility across the sample. Blood pulse waveform signals extracted with the model exhibited significantly stronger temporal correlation (W=35, p<0.01) and spectral SNR (W=31, p<0.01) compared to uniform spatial averaging. Heart rate estimation was in strong agreement with true heart rate [r²=0.9619, error (μ, σ)=(0.52, 1.69) bpm].

  15. Probabilistic modeling of financial exposure to flood in France

    NASA Astrophysics Data System (ADS)

    Moncoulon, David; Quantin, Antoine; Leblois, Etienne

    2014-05-01

    CCR is a French reinsurance company which offers natural catastrophe covers with the State guarantee. Within this framework, CCR develops its own models to assess its financial exposure to floods, droughts, earthquakes and other perils, and thus the exposure of insurers and the French State. A probabilistic flood model has been developed in order to estimate the financial exposure of the Nat Cat insurance market to flood events, depending on their annual occurrence probability. This presentation is organized in two parts. The first part is dedicated to the development of a flood hazard and damage model (ARTEMIS). The model calibration and validation on historical events are then described. In the second part, the coupling of ARTEMIS with two generators of probabilistic events is achieved: a stochastic flow generator and a stochastic spatialized precipitation generator, adapted from the SAMPO model developed by IRSTEA. The analysis of the complementary nature of these two generators is proposed: the first one allows generating floods on the French hydrological station network; the second allows simulating surface water runoff and small-river floods, even on ungauged rivers. Thus, the simulation of thousands of non-occurred, but possible, events allows us to provide for the first time an estimate of the financial exposure to flooding in France at different scales (commune, department, country) and from different points of view (hazard, vulnerability and damages).

  16. De novo protein conformational sampling using a probabilistic graphical model.

    PubMed

    Bhattacharya, Debswapna; Cheng, Jianlin

    2015-11-06

    Efficient exploration of protein conformational space remains challenging, especially for large proteins, when assembling discretized structural fragments extracted from a protein structure database. We propose a fragment-free probabilistic graphical model, FUSION, for conformational sampling in continuous space and assess its accuracy using 'blind' protein targets with a length up to 250 residues from the CASP11 structure prediction exercise. The method reduces sampling bottlenecks, exhibits strong convergence, and demonstrates better performance than the popular fragment assembly method, ROSETTA, on relatively larger proteins with a length of more than 150 residues in our benchmark set. FUSION is freely available through a web server at http://protein.rnet.missouri.edu/FUSION/.

  17. Probabilistic/Fracture-Mechanics Model For Service Life

    NASA Technical Reports Server (NTRS)

    Watkins, T., Jr.; Annis, C. G., Jr.

    1991-01-01

    Computer program makes probabilistic estimates of lifetime of engine and components thereof. Developed to fill need for more accurate life-assessment technique that avoids errors in estimated lives and provides for statistical assessment of levels of risk created by engineering decisions in designing system. Implements mathematical model combining techniques of statistics, fatigue, fracture mechanics, nondestructive analysis, life-cycle cost analysis, and management of engine parts. Used to investigate effects of such engine-component life-controlling parameters as return-to-service intervals, stresses, capabilities for nondestructive evaluation, and qualities of materials.

  18. Probabilistic model for bridge structural evaluation using nondestructive inspection data

    NASA Astrophysics Data System (ADS)

    Carrion, Francisco; Lopez, Jose Alfredo; Balankin, Alexander

    2005-05-01

    A bridge management system developed for the Mexican toll highway network applies a probabilistic-reliability model to estimate load capacity and structural residual life. Basic inputs for the system are the global inspection data (visual inspections and vibration testing) and information on the environmental conditions (weather, traffic, loads, earthquakes), although the model can also take account of additional non-destructive testing or permanent monitoring data. Main outputs are the periodic maintenance, rehabilitation and replacement program, and the updated inspection program. Both programs are custom-made to available funds and scheduled according to a priority assignation criterion. The probabilistic model, tailored to typical bridges, accounts for the size, age, material and structure type. Bridges of special size or type may also be included; in these cases, deterministic finite element models are possible as well. A key feature is that structural qualification is given in terms of the probability of failure, calculated considering fundamental degradation mechanisms and from actual direct observations and measurements, such as crack distribution and size, materials properties, bridge dimensions, load deflections, and parameters for corrosion evaluation. Vibration measurements are basically used to infer structural resistance and to monitor long-term degradation.

  19. Architecture for Integrated Medical Model Dynamic Probabilistic Risk Assessment

    NASA Technical Reports Server (NTRS)

    Jaworske, D. A.; Myers, J. G.; Goodenow, D.; Young, M.; Arellano, J. D.

    2016-01-01

    Probabilistic Risk Assessment (PRA) is a modeling tool used to predict potential outcomes of a complex system based on a statistical understanding of many initiating events. Utilizing a Monte Carlo method, thousands of instances of the model are considered and outcomes are collected. PRA is considered static, utilizing probabilities alone to calculate outcomes. Dynamic Probabilistic Risk Assessment (dPRA) is an advanced concept where modeling predicts the outcomes of a complex system based not only on the probabilities of many initiating events, but also on a progression of dependencies brought about by progressing down a time line. Events are placed in a single time line, adding each event to a queue, as managed by a planner. Progression down the time line is guided by rules, as managed by a scheduler. The recently developed Integrated Medical Model (IMM) summarizes astronaut health as governed by the probabilities of medical events and mitigation strategies. Managing the software architecture process provides a systematic means of creating, documenting, and communicating a software design early in the development process. The software architecture process begins with establishing requirements and the design is then derived from the requirements.
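
    The abstract outlines a Monte Carlo dynamic PRA in which events are queued on a mission time line and processed by a planner/scheduler. The fragment below is a minimal, purely illustrative sketch of that idea (it is not the IMM architecture); mission length, event rates and mitigation probabilities are invented for the example.

```python
# Minimal sketch: Monte Carlo over a mission time line, with events queued by a
# "planner" and processed in time order by a "scheduler". All numbers are hypothetical.
import heapq
import random

def one_mission(duration_days=180, daily_event_rate=0.01,
                p_mitigated=0.7, p_escalate=0.2):
    """Simulate one mission; return True if an unmitigated event escalates."""
    timeline = []                                   # planner: queue of (day, event)
    for day in range(duration_days):
        if random.random() < daily_event_rate:
            heapq.heappush(timeline, (day, "medical_event"))
    while timeline:                                 # scheduler: progress down the time line
        _day, _event = heapq.heappop(timeline)
        if random.random() >= p_mitigated and random.random() < p_escalate:
            return True
    return False

random.seed(0)
runs = 10_000
p_bad_outcome = sum(one_mission() for _ in range(runs)) / runs
print(f"Estimated probability of an escalated medical event: {p_bad_outcome:.3f}")
```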

  20. A Probabilistic Model for Simulating Magnetoacoustic Emission Responses in Ferromagnets

    NASA Technical Reports Server (NTRS)

    Namkung, M.; Fulton, J. P.; Wincheski, B.

    1993-01-01

    Magnetoacoustic emission (MAE) is a phenomenon in which acoustic noise is generated by the motion of non-180 degree magnetic domain walls in a ferromagnet with non-zero magnetostrictive constants. MAE has been studied extensively for many years and has even been applied as an NDE tool for characterizing the heat treatment of high-yield low-carbon steels. A complete theory which fully accounts for the magnetoacoustic response, however, has not yet emerged. The motion of the domain walls appears to be a totally random process; however, it does exhibit features of regularity which have been identified by studying phenomena such as 1/f flicker noise and self-organized criticality (SOC). In this paper, a probabilistic model incorporating the effects of SOC has been developed to help explain the MAE response. The model uses many simplifying assumptions yet yields good qualitative agreement with observed experimental results and also provides some insight into the possible underlying mechanisms responsible for MAE. We begin by providing a brief overview of magnetoacoustic emission and the experimental set-up used to obtain the MAE signal. We then describe a pseudo-probabilistic model used to predict the MAE response and give an example of the predicted result. Finally, the model is modified to account for SOC and the new predictions are shown and compared with experiment.

  1. Probabilistic assessment of agricultural droughts using graphical models

    NASA Astrophysics Data System (ADS)

    Ramadas, Meenu; Govindaraju, Rao S.

    2015-07-01

    Agricultural droughts are often characterized by soil moisture in the root zone of the soil, but crop needs are rarely factored into the analysis. Since water needs vary with crops, agricultural drought incidences in a region can be characterized better if crop responses to soil water deficits are also accounted for in the drought index. This study investigates agricultural droughts driven by plant stress due to soil moisture deficits using crop stress functions available in the literature. Crop water stress is assumed to begin at the soil moisture level corresponding to incipient stomatal closure, and reaches its maximum at the crop's wilting point. Using available location-specific crop acreage data, a weighted crop water stress function is computed. A new probabilistic agricultural drought index is then developed within a hidden Markov model (HMM) framework that provides model uncertainty in drought classification and accounts for time dependence between drought states. The proposed index allows probabilistic classification of the drought states and takes due cognizance of the stress experienced by the crop due to soil moisture deficit. The capabilities of HMM model formulations for assessing agricultural droughts are compared to those of current drought indices such as standardized precipitation evapotranspiration index (SPEI) and self-calibrating Palmer drought severity index (SC-PDSI). The HMM model identified critical drought events and several drought occurrences that are not detected by either SPEI or SC-PDSI, and shows promise as a tool for agricultural drought studies.
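
    As an illustration of the kind of HMM-based probabilistic drought classification described above (not the authors' model), the sketch below fits a Gaussian hidden Markov model to a synthetic crop water stress series and returns both hard state labels and per-time-step state probabilities. It assumes the third-party hmmlearn package; the data and the choice of three states are hypothetical.

```python
# Minimal sketch: probabilistic drought-state classification with a Gaussian HMM.
# The stress series and three-state setup are hypothetical.
import numpy as np
from hmmlearn.hmm import GaussianHMM   # third-party package, assumed installed

rng = np.random.default_rng(1)
# Hypothetical weighted crop water stress series in [0, 1] (0 = no stress).
stress = np.clip(0.4 + 0.3 * np.sin(np.linspace(0, 20, 500))
                 + rng.normal(0, 0.1, 500), 0.0, 1.0)
X = stress.reshape(-1, 1)

hmm = GaussianHMM(n_components=3, covariance_type="diag", n_iter=200, random_state=0)
hmm.fit(X)

states = hmm.predict(X)            # hard drought-state sequence (Viterbi path)
posteriors = hmm.predict_proba(X)  # probabilistic classification per time step
print("state means (stress level):", np.round(hmm.means_.ravel(), 2))
```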

  2. Road environment perception algorithm based on object semantic probabilistic model

    NASA Astrophysics Data System (ADS)

    Liu, Wei; Wang, XinMei; Tian, Jinwen; Wang, Yong

    2015-12-01

    This article seeks to discover an object-category semantic probabilistic model (OSPM) based on a statistical test analysis method. We applied this model to a road forward-environment perception algorithm, including on-road object recognition and detection. First, the image is represented by a set of words (local feature regions). Then, the probability distribution among the image, the local regions and the object semantic categories is derived based on the new model. In training, the parameters of the object model are estimated using expectation-maximization in a maximum likelihood setting. In recognition, the model is used to classify images in a Bayesian manner. In detection, the posterior is calculated to detect the typical on-road objects. Experiments demonstrate good performance on object recognition and detection against an urban street background.

  3. Probabilistic logic modeling of network reliability for hybrid network architectures

    SciTech Connect

    Wyss, G.D.; Schriner, H.K.; Gaylor, T.R.

    1996-10-01

    Sandia National Laboratories has found that the reliability and failure modes of current-generation network technologies can be effectively modeled using fault tree-based probabilistic logic modeling (PLM) techniques. We have developed fault tree models that include various hierarchical networking technologies and classes of components interconnected in a wide variety of typical and atypical configurations. In this paper we discuss the types of results that can be obtained from PLMs and why these results are of great practical value to network designers and analysts. After providing some mathematical background, we describe the 'plug-and-play' fault tree analysis methodology that we have developed for modeling connectivity and the provision of network services in several current-generation network architectures. Finally, we demonstrate the flexibility of the method by modeling the reliability of a hybrid example network that contains several interconnected ethernet, FDDI, and token ring segments.
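
    To make the fault-tree arithmetic behind such probabilistic logic models concrete, here is a minimal sketch (independent basic events, illustrative failure probabilities, not Sandia's methodology) of computing a top-event probability for a small hybrid-network connectivity tree.

```python
# Minimal sketch: top-event probability from OR/AND gate algebra over independent
# basic events. Component names and failure probabilities are hypothetical.
def gate_or(*probs):
    """P(at least one input fails), inputs independent."""
    p = 1.0
    for q in probs:
        p *= (1.0 - q)
    return 1.0 - p

def gate_and(*probs):
    """P(all inputs fail), inputs independent."""
    p = 1.0
    for q in probs:
        p *= q
    return p

# Basic events: annual failure probabilities (illustrative only).
router, switch_a, switch_b, backbone_ring = 0.02, 0.05, 0.05, 0.01

# Loss of connectivity if the router fails, OR both redundant switches fail,
# OR the backbone ring fails.
loss_of_connectivity = gate_or(router, gate_and(switch_a, switch_b), backbone_ring)
print(f"P(loss of connectivity) = {loss_of_connectivity:.4f}")
```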

  4. A Practical Probabilistic Graphical Modeling Tool for Weighing ...

    EPA Pesticide Factsheets

    Past weight-of-evidence frameworks for adverse ecological effects have provided soft-scoring procedures for judgments based on the quality and measured attributes of evidence. Here, we provide a flexible probabilistic structure for weighing and integrating lines of evidence for ecological risk determinations. Probabilistic approaches can provide both a quantitative weighing of lines of evidence and methods for evaluating risk and uncertainty. The current modeling structure was developed for propagating uncertainties in measured endpoints and their influence on the plausibility of adverse effects. To illustrate the approach, we apply the model framework to the sediment quality triad using example lines of evidence for sediment chemistry measurements, bioassay results, and in situ infauna diversity of benthic communities in a simplified hypothetical case study. We then combine the three lines of evidence, evaluate sensitivity to the input parameters, and show how uncertainties are propagated and how additional information can be incorporated to rapidly update the probability of impacts. The developed network model can be expanded to accommodate additional lines of evidence, variables and states of importance, and different types of uncertainties in the lines of evidence, including spatial and temporal variation as well as measurement errors. We provide a flexible Bayesian network structure for weighing and integrating lines of evidence for ecological risk determinations.
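
    One simple way to picture the quantitative weighing described above is sequential Bayesian updating of an impact probability, with one likelihood ratio per line of evidence. The sketch below is a hypothetical illustration, not the EPA tool; the prior and the likelihood ratios are invented for the example.

```python
# Minimal sketch: sequential Bayesian updating of the probability of an adverse
# sediment impact from three lines of evidence. All numbers are hypothetical.
def update(prior, likelihood_ratio):
    """Posterior odds = prior odds * LR; return the posterior probability."""
    odds = prior / (1.0 - prior) * likelihood_ratio
    return odds / (1.0 + odds)

p_impact = 0.20                      # prior plausibility of adverse effects
lines_of_evidence = {
    "sediment chemistry exceedance": 4.0,   # LR > 1 supports an impact
    "bioassay toxicity":             2.5,
    "benthic infauna diversity":     0.8,   # LR < 1 weighs against an impact
}
for name, lr in lines_of_evidence.items():
    p_impact = update(p_impact, lr)
    print(f"after {name:32s}: P(impact) = {p_impact:.2f}")
```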

  5. Efficient diagnosis of multiprocessor systems under probabilistic models

    NASA Technical Reports Server (NTRS)

    Blough, Douglas M.; Sullivan, Gregory F.; Masson, Gerald M.

    1989-01-01

    The problem of fault diagnosis in multiprocessor systems is considered under a probabilistic fault model. The focus is on minimizing the number of tests that must be conducted in order to correctly diagnose the state of every processor in the system with high probability. A diagnosis algorithm that can correctly diagnose the state of every processor with probability approaching one in a class of systems performing slightly greater than a linear number of tests is presented. A nearly matching lower bound on the number of tests required to achieve correct diagnosis in arbitrary systems is also proven. Lower and upper bounds on the number of tests required for regular systems are also presented. A class of regular systems which includes hypercubes is shown to be correctly diagnosable with high probability. In all cases, the number of tests required under this probabilistic model is shown to be significantly less than under a bounded-size fault set model. Because the number of tests that must be conducted is a measure of the diagnosis overhead, these results represent a dramatic improvement in the performance of system-level diagnosis techniques.

  6. Model Checking Linear-Time Properties of Probabilistic Systems

    NASA Astrophysics Data System (ADS)

    Baier, Christel; Größer, Marcus; Ciesinski, Frank

    This chapter is about the verification of Markov decision processes (MDPs) which incorporate one of the fundamental models for reasoning about probabilistic and nondeterministic phenomena in reactive systems. MDPs have their roots in the field of operations research and are nowadays used in a wide variety of areas including verification, robotics, planning, controlling, reinforcement learning, economics and semantics of randomized systems. Furthermore, MDPs served as the basis for the introduction of probabilistic automata which are related to weighted automata. We describe the use of MDPs as an operational model for randomized systems, e.g., systems that employ randomized algorithms, multi-agent systems or systems with unreliable components or surroundings. In this context we outline the theory of verifying ω-regular properties of such operational models. As an integral part of this theory we use ω-automata, i.e., finite-state automata over finite alphabets that accept languages of infinite words. Additionally, basic concepts of important reduction techniques are sketched, namely partial order reduction of MDPs and quotient system reduction of the numerical problem that arises in the verification of MDPs. Furthermore we present several undecidability and decidability results for the controller synthesis problem for partially observable MDPs.
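
    The core numerical problem mentioned above, computing extremal reachability probabilities in an MDP, can be illustrated with a few lines of value iteration. The sketch below uses a toy four-state MDP with made-up transition probabilities; it is standard theory rather than the chapter's tooling.

```python
# Minimal sketch: value iteration for the maximal probability of reaching a goal
# state in a small MDP. States, actions and probabilities are illustrative only.
P = {  # P[state][action] = list of (next_state, probability)
    0: {"a": [(1, 0.5), (0, 0.5)], "b": [(2, 1.0)]},
    1: {"a": [(3, 0.9), (2, 0.1)]},
    2: {"a": [(2, 1.0)]},            # absorbing "bad" state
    3: {"a": [(3, 1.0)]},            # absorbing goal state
}
goal = {3}

value = {s: (1.0 if s in goal else 0.0) for s in P}
for _ in range(1000):                # iterate to a (near) fixed point
    new = {}
    for s in P:
        if s in goal:
            new[s] = 1.0
        else:
            new[s] = max(sum(p * value[t] for t, p in P[s][a]) for a in P[s])
    converged = max(abs(new[s] - value[s]) for s in P) < 1e-10
    value = new
    if converged:
        break
print("Pmax(reach goal) from state 0:", round(value[0], 4))
```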

  7. Modeling of human artery tissue with probabilistic approach.

    PubMed

    Xiong, Linfei; Chui, Chee-Kong; Fu, Yabo; Teo, Chee-Leong; Li, Yao

    2015-04-01

    Accurate modeling of biological soft tissue properties is vital for realistic medical simulation. Mechanical response of biological soft tissue always exhibits a strong variability due to the complex microstructure and different loading conditions. The inhomogeneity in human artery tissue is modeled with a computational probabilistic approach by assuming that the instantaneous stress at a specific strain varies according to normal distribution. Material parameters of the artery tissue which are modeled with a combined logarithmic and polynomial energy equation are represented by a statistical function with normal distribution. Mean and standard deviation of the material parameters are determined using genetic algorithm (GA) and inverse mean-value first-order second-moment (IMVFOSM) method, respectively. This nondeterministic approach was verified using computer simulation based on the Monte-Carlo (MC) method. Cumulative distribution function (CDF) of the MC simulation corresponds well with that of the experimental stress-strain data and the probabilistic approach is further validated using data from other studies. By taking into account the inhomogeneous mechanical properties of human biological tissue, the proposed method is suitable for realistic virtual simulation as well as an accurate computational approach for medical device validation.

  8. Probabilistic models of species discovery and biodiversity comparisons.

    PubMed

    Edie, Stewart M; Smits, Peter D; Jablonski, David

    2017-04-04

    Inferring large-scale processes that drive biodiversity hinges on understanding the phylogenetic and spatial pattern of species richness. However, clades and geographic regions are accumulating newly described species at an uneven rate, potentially affecting the stability of currently observed diversity patterns. Here, we present a probabilistic model of species discovery to assess the uncertainty in diversity levels among clades and regions. We use a Bayesian time series regression to estimate the long-term trend in the rate of species description for marine bivalves and find a distinct spatial bias in the accumulation of new species. Despite these biases, probabilistic estimates of future species richness show considerable stability in the currently observed rank order of regional diversity. However, absolute differences in richness are still likely to change, potentially modifying the correlation between species numbers and geographic, environmental, and biological factors thought to promote biodiversity. Applied to scallops and related clades, we find that accumulating knowledge of deep-sea species will likely shift the relative richness of these three families, emphasizing the need to consider the incomplete nature of bivalve taxonomy in quantitative studies of its diversity. Along with estimating expected changes to observed patterns of diversity, the model described in this paper pinpoints the geographic areas and clades most urgently requiring additional systematic study, an important practice for building more complete and accurate models of biodiversity dynamics that can inform ecological and evolutionary theory and improve conservation practice.

  9. Probabilistic delay differential equation modeling of event-related potentials.

    PubMed

    Ostwald, Dirk; Starke, Ludger

    2016-08-01

    "Dynamic causal models" (DCMs) are a promising approach in the analysis of functional neuroimaging data due to their biophysical interpretability and their consolidation of functional-segregative and functional-integrative propositions. In this theoretical note we are concerned with the DCM framework for electroencephalographically recorded event-related potentials (ERP-DCM). Intuitively, ERP-DCM combines deterministic dynamical neural mass models with dipole-based EEG forward models to describe the event-related scalp potential time-series over the entire electrode space. Since its inception, ERP-DCM has been successfully employed to capture the neural underpinnings of a wide range of neurocognitive phenomena. However, in spite of its empirical popularity, the technical literature on ERP-DCM remains somewhat patchy. A number of previous communications have detailed certain aspects of the approach, but no unified and coherent documentation exists. With this technical note, we aim to close this gap and to increase the technical accessibility of ERP-DCM. Specifically, this note makes the following novel contributions: firstly, we provide a unified and coherent review of the mathematical machinery of the latent and forward models constituting ERP-DCM by formulating the approach as a probabilistic latent delay differential equation model. Secondly, we emphasize the probabilistic nature of the model and its variational Bayesian inversion scheme by explicitly deriving the variational free energy function in terms of both the likelihood expectation and variance parameters. Thirdly, we detail and validate the estimation of the model with a special focus on the explicit form of the variational free energy function and introduce a conventional nonlinear optimization scheme for its maximization. Finally, we identify and discuss a number of computational issues which may be addressed in the future development of the approach.

  10. Probabilistic multi-scale modeling of pathogen dynamics in rivers

    NASA Astrophysics Data System (ADS)

    Packman, A. I.; Drummond, J. D.; Aubeneau, A. F.

    2014-12-01

    Most parameterizations of microbial dynamics and pathogen transport in surface waters rely on classic assumptions of advection-diffusion behavior in the water column and limited interactions between the water column and sediments. However, recent studies have shown that strong surface-subsurface interactions produce a wide range of transport timescales in rivers and greatly increase the opportunity for long-term retention of pathogens in sediment beds and benthic biofilms. We present a stochastic model for pathogen dynamics, based on continuous-time random walk theory, that properly accounts for such diverse transport timescales, along with the remobilization and inactivation of pathogens in storage reservoirs. By representing pathogen dynamics probabilistically, the model framework enables diverse local-scale processes to be incorporated in system-scale models. We illustrate the application of the model to microbial dynamics in rivers based on the results of a tracer injection experiment. In-stream transport and surface-subsurface interactions are parameterized based on observations of conservative tracer transport, while E. coli retention and inactivation in sediments are parameterized based on direct local-scale experiments. The results indicate that sediments are an important reservoir of enteric organisms in rivers, and slow remobilization from sediments represents a long-term source of bacteria to streams. Current capability, potential advances, and limitations of this model framework for assessing pathogen transmission risks will be discussed. Because the transport model is probabilistic, it is amenable to incorporation into risk models, but a lack of characterization of key microbial processes in sediments and benthic biofilms hinders current application.
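
    A continuous-time random walk of the kind described above can be sketched very compactly: particles alternate between mobile in-stream travel and heavy-tailed storage in the bed, with first-order inactivation while stored. The parameter values below are invented for illustration and are not the authors' calibrated values.

```python
# Minimal sketch: continuous-time random walk with Pareto (heavy-tailed) storage
# times and first-order inactivation during storage. All parameters are hypothetical.
import numpy as np

rng = np.random.default_rng(2)

def travel_one_cell(alpha=1.5, t_min=0.1, mobile_time=1.0, k_inactivation=0.01):
    """Return (elapsed time, survived?) for one mobile-plus-storage cycle."""
    storage = t_min * (1 - rng.random()) ** (-1 / alpha)   # Pareto residence time
    survived = rng.random() < np.exp(-k_inactivation * storage)
    return mobile_time + storage, survived

n_particles, n_cells = 5000, 20
arrival_times = []
for _ in range(n_particles):
    t, alive = 0.0, True
    for _ in range(n_cells):
        dt, alive = travel_one_cell()
        t += dt
        if not alive:
            break
    if alive:
        arrival_times.append(t)

print(f"fraction reaching the outlet alive: {len(arrival_times) / n_particles:.2f}")
print(f"median arrival time of survivors:  {np.median(arrival_times):.1f}")
```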

  11. Probabilistic multicompartmental model for interpreting DGT kinetics in sediments.

    PubMed

    Ciffroy, P; Nia, Y; Garnier, J M

    2011-11-15

    Extensive research has been performed on the use of the DIFS (DGT-Induced Fluxes in Soils and Sediments) model to interpret diffusive gradients in thin-film, or DGT, measurements in soils and sediments. The current report identifies some areas where the DIFS model has been shown to yield poor results and proposes a model to address weaknesses. In particular, two major flaws in the current approaches are considered: (i) many studies of accumulation kinetics in DGT exhibit multiple kinetic stages and (ii) several combinations of the two fitted DIFS parameters can yield identical results, leaving the question of how to select the 'best' combination. Previously, problem (i) has been addressed by separating the experimental data sets into distinct time segments. To overcome these problems, a model considering two types of particulate binding sites is proposed, instead of the DIFS model which assumed one single particulate pool. A probabilistic approach is proposed to fit experimental data and to determine the range of possible physical parameters using Probability Distribution Functions (PDFs), as opposed to single values without any indication of their uncertainty. The new probabilistic model, called DGT-PROFS, was tested on three different formulated sediments which mainly differ in the presence or absence of iron oxides. It was shown that a good fit can be obtained for the complete set of data (instead of DIFS-2D) and that a range of uncertainty values for each modeling parameter can be obtained. The interpretation of parameter PDFs allows one to distinguish between a variety of geochemical behaviors, providing useful information on metal dynamics in sediments.

  12. Parental role models, gender and educational choice.

    PubMed

    Dryler, H

    1998-09-01

    Parental role models are often put forward as an explanation for the choice of gender-atypical educational routes. This paper aims to test such explanations by examining the impact of family background variables like parental education and occupation, on choice of educational programme at upper secondary school. Using a sample of around 73,000 Swedish teenagers born between 1972 and 1976, girls' and boys' gender-atypical as well as gender-typical educational choices are analysed by means of logistic regression. Parents working or educated within a specific field increase the probability that a child will make a similar choice of educational programme at upper secondary school. This same-sector effect appeared to be somewhat stronger for fathers and sons, while no such same-sex influence was confirmed for girls. No evidence was found that, in addition to a same-sector effect, it matters whether parents' occupations represent gender-traditional or non-traditional models. Parents of the service classes or highly educated parents--expected to be the most gender egalitarian in attitudes and behaviours--have a positive influence upon children's choice of gender-atypical education.

  13. Probabilistic Modeling of Aircraft Trajectories for Dynamic Separation Volumes

    NASA Technical Reports Server (NTRS)

    Lewis, Timothy A.

    2016-01-01

    With a proliferation of new and unconventional vehicles and operations expected in the future, the ab initio airspace design will require new approaches to trajectory prediction for separation assurance and other air traffic management functions. This paper presents an approach to probabilistic modeling of the trajectory of an aircraft when its intent is unknown. The approach uses a set of feature functions to constrain a maximum entropy probability distribution based on a set of observed aircraft trajectories. This model can be used to sample new aircraft trajectories to form an ensemble reflecting the variability in an aircraft's intent. The model learning process ensures that the variability in this ensemble reflects the behavior observed in the original data set. Computational examples are presented.
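
    The maximum entropy construction referred to above can be illustrated on a finite set of candidate trajectories: exponential-family weights are adjusted until the model's feature expectations match the observed ones, and trajectories are then sampled from the fitted distribution. The sketch below uses random, hypothetical feature values and is not the paper's implementation.

```python
# Minimal sketch: fit a maximum-entropy distribution over candidate trajectories by
# matching feature expectations, then sample an ensemble. Data are hypothetical.
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical feature matrix: one row per candidate trajectory, one column per
# feature function (e.g. mean turn rate, climb rate, path length), pre-evaluated.
F = rng.normal(size=(500, 3))
observed_mean = np.array([0.2, -0.1, 0.05])    # feature averages of observed traffic

w = np.zeros(3)
for _ in range(2000):                           # gradient ascent on the dual objective
    logits = F @ w
    p = np.exp(logits - logits.max())
    p /= p.sum()                                # maximum-entropy distribution over candidates
    w += 0.1 * (observed_mean - F.T @ p)        # move model expectations toward observed ones

# Sample an ensemble of trajectories reflecting variability in the unknown intent.
ensemble_idx = rng.choice(len(F), size=100, p=p)
print("learned weights:", np.round(w, 3))
```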

  14. Binary Encoded-Prototype Tree for Probabilistic Model Building GP

    NASA Astrophysics Data System (ADS)

    Yanase, Toshihiko; Hasegawa, Yoshihiko; Iba, Hitoshi

    In recent years, program evolution algorithms based on the estimation of distribution algorithm (EDA) have been proposed to improve the search ability of genetic programming (GP) and to overcome GP-hard problems. One such method is the probabilistic prototype tree (PPT) based algorithm. The PPT-based method explores the optimal tree structure by using the full tree whose number of child nodes is the maximum among possible trees. This algorithm, however, suffers from problems arising from function nodes having different numbers of child nodes. These function nodes cause intron nodes, which do not affect the fitness function. Moreover, function nodes having many child nodes increase the search space and the number of samples necessary for properly constructing the probabilistic model. In order to solve this problem, we propose a binary encoding for the PPT. In this article, we convert each function node to a subtree of binary nodes such that the converted tree remains grammatically correct. Our method reduces the ineffectual search space, and the binary encoded tree is able to express the same tree structures as the original method. The effectiveness of the proposed method is demonstrated through two computational experiments.

  15. Applications of the International Space Station Probabilistic Risk Assessment Model

    NASA Technical Reports Server (NTRS)

    Grant, Warren; Lutomski, Michael G.

    2011-01-01

    Recently the International Space Station (ISS) has incorporated more Probabilistic Risk Assessments (PRAs) in the decision making process for significant issues. Future PRAs will have major impacts on ISS and on future spacecraft development and operations. These PRAs will have their foundation in the current complete ISS PRA model and the current PRA trade studies that are being analyzed as requested by ISS Program stakeholders. ISS PRAs have recently helped in the decision making process for determining reliability requirements for future NASA and commercial spacecraft, making crew rescue decisions, and setting operational requirements for ISS orbital orientation, planning extravehicular activities (EVAs) and robotic operations. This paper will describe some applications of the ISS PRA model and how they impacted the final decision. This paper will also discuss future analysis topics such as life extension and requirements for new commercial vehicles visiting ISS.

  16. Toward a Dynamic Probabilistic Model for Vestibular Cognition

    PubMed Central

    Ellis, Andrew W.; Mast, Fred W.

    2017-01-01

    We suggest that research in vestibular cognition will benefit from the theoretical framework of probabilistic models. This will aid in developing an understanding of how interactions between high-level cognition and low-level sensory processing might occur. Many such interactions have been shown experimentally; however, to date, no attempt has been made to systematically explore vestibular cognition by using computational modeling. It is widely assumed that mental imagery and perception share neural circuitry, at least in part, and it has been proposed that mental simulation is closely connected to the brain's ability to make predictions. We claim that this connection has been disregarded in the vestibular domain, and we suggest ways in which future research may take this into consideration. PMID:28203219

  17. A probabilistic model of a porous heat exchanger

    NASA Technical Reports Server (NTRS)

    Agrawal, O. P.; Lin, X. A.

    1995-01-01

    This paper presents a probabilistic one-dimensional finite element model for heat transfer processes in porous heat exchangers. The Galerkin approach is used to develop the finite element matrices. Some of the submatrices are asymmetric due to the presence of the flow term. The Neumann expansion is used to write the temperature distribution as a series of random variables, and the expectation operator is applied to obtain the mean and deviation statistics. To demonstrate the feasibility of the formulation, a one-dimensional model of heat transfer phenomenon in superfluid flow through a porous media is considered. Results of this formulation agree well with the Monte-Carlo simulations and the analytical solutions. Although the numerical experiments are confined to parametric random variables, a formulation is presented to account for the random spatial variations.

  18. An organizational model of choice: a theoretical analysis differentiating choice, personal control, and self-determination.

    PubMed

    Williams, S

    1998-11-01

    Individuals experience choice when they select one option from among meaningful alternatives that possess relatively equal attractiveness and some degree of indeterminacy. Choice has been found to influence important psychological and behavioral outcomes. After differentiating among choice, personal control, and self-determination, the author offers a model of choice, with self-determination as the key mechanism regulating how choice influences intrinsic motivation. The model suggests specific types of choice-relevant information that should affect whether choice results in an internal (self-determined) or external (controlled) locus of causality. The individual characteristics of locus of control, self-presentation, self-esteem, and Type A personality are suggested as possible moderators of the effects of choice. Finally, the implications of the choice model for organizations and further areas of research are discussed.

  19. Use of probabilistic inversion to model qualitative expert input when selecting a new nuclear reactor technology

    NASA Astrophysics Data System (ADS)

    Merritt, Charles R., Jr.

    Complex investment decisions by corporate executives often require the comparison of dissimilar attributes and competing technologies. A technique to evaluate qualitative input from experts using a Multi-Criteria Decision Method (MCDM) is described to select a new reactor technology for a merchant nuclear generator. The high capital cost, risks from design, licensing and construction, reactor safety and security considerations are some of the diverse considerations when choosing a reactor design. Three next generation reactor technologies are examined: the Advanced Pressurized-1000 (AP-1000) from Westinghouse, Economic Simplified Boiling Water Reactor (ESBWR) from General Electric, and the U.S. Evolutionary Power Reactor (U.S. EPR) from AREVA. Recent developments in MCDM and decision support systems are described. The uncertainty inherent in experts' opinions for the attribute weighting in the MCDM is modeled through the use of probabilistic inversion. In probabilistic inversion, a function is inverted into a random variable within a defined range. Once the distribution is created, random samples based on the distribution are used to perform a sensitivity analysis on the decision results to verify the "strength" of the results. The decision results for the pool of experts identified the U.S. EPR as the optimal choice.

  20. De novo protein conformational sampling using a probabilistic graphical model

    NASA Astrophysics Data System (ADS)

    Bhattacharya, Debswapna; Cheng, Jianlin

    2015-11-01

    Efficient exploration of protein conformational space remains challenging, especially for large proteins, when assembling discretized structural fragments extracted from a protein structure database. We propose a fragment-free probabilistic graphical model, FUSION, for conformational sampling in continuous space and assess its accuracy using ‘blind’ protein targets with a length up to 250 residues from the CASP11 structure prediction exercise. The method reduces sampling bottlenecks, exhibits strong convergence, and demonstrates better performance than the popular fragment assembly method, ROSETTA, on relatively larger proteins with a length of more than 150 residues in our benchmark set. FUSION is freely available through a web server at http://protein.rnet.missouri.edu/FUSION/.

  1. Current Challenges in Bayesian Model Choice

    NASA Astrophysics Data System (ADS)

    Clyde, M. A.; Berger, J. O.; Bullard, F.; Ford, E. B.; Jefferys, W. H.; Luo, R.; Paulo, R.; Loredo, T.

    2007-11-01

    Model selection (and the related issue of model uncertainty) arises in many astronomical problems, and, in particular, has been one of the focal areas of the Exoplanet working group under the SAMSI (Statistics and Applied Mathematical Sciences Institute) Astrostatistics Exoplanet program. We provide an overview of the Bayesian approach to model selection and highlight the challenges involved in implementing Bayesian model choice in four stylized problems. We review some of the current methods used by statisticians and astronomers and present recent developments in the area. We discuss the applicability, computational challenges, and performance of suggested methods and conclude with recommendations and open questions.

  2. New probabilistic network models and algorithms for oncogenesis.

    PubMed

    Hjelm, Marcus; Höglund, Mattias; Lagergren, Jens

    2006-05-01

    Chromosomal aberrations in solid tumors appear in complex patterns. It is important to understand how these patterns develop, the dynamics of the process, the temporal or even causal order between aberrations, and the pathways involved. Here we present network models for chromosomal aberrations and algorithms for training the models on observed data. Our models are generative probabilistic models that can be used to study dynamical aspects of chromosomal evolution in cancer cells. They are well suited for a graphical representation that conveys the pathways found in a dataset. By allowing only pairwise dependencies and partitioning aberrations into modules, in which all aberrations are restricted to have the same dependencies, we reduce the number of parameters so that dataset sizes relevant to cancer applications can be handled. We apply our framework to a dataset of colorectal cancer tumor karyotypes. The obtained model explains the data significantly better than a model in which independence between the aberrations is assumed. In fact, the obtained model performs very well with respect to several measures of goodness of fit and is, with respect to repetition of the training, more or less unique.

  3. An Approach for Incorporating Context in Building Probabilistic Predictive Models.

    PubMed

    Wu, Juan Anna; Hsu, William; Bui, Alex At

    2012-09-01

    With the increasing amount of information collected through clinical practice and scientific experimentation, a growing challenge is how to utilize available resources to construct predictive models to facilitate clinical decision making. Clinicians often have questions related to the treatment and outcome of a medical problem for individual patients; however, few tools exist that leverage the large collection of patient data and scientific knowledge to answer these questions. Without appropriate context, existing data that have been collected for a specific task may not be suitable for creating new models that answer different questions. This paper presents an approach that leverages available structured or unstructured data to build a probabilistic predictive model that assists physicians with answering clinical questions on individual patients. Various challenges related to transforming available data to an end-user application are addressed: problem decomposition, variable selection, context representation, automated extraction of information from unstructured data sources, model generation, and development of an intuitive application to query the model and present the results. We describe our efforts towards building a model that predicts the risk of vasospasm in aneurysm patients.

  4. Modeling choice and valuation in decision experiments.

    PubMed

    Loomes, Graham

    2010-07-01

    This article develops a parsimonious descriptive model of individual choice and valuation in the kinds of experiments that constitute a substantial part of the literature relating to decision making under risk and uncertainty. It suggests that many of the best known "regularities" observed in those experiments may arise from a tendency for participants to perceive probabilities and payoffs in a particular way. This model organizes more of the data than any other extant model and generates a number of novel testable implications which are examined with new data.

  5. Detection and characterization of regulatory elements using probabilistic conditional random field and hidden Markov models.

    PubMed

    Wang, Hongyan; Zhou, Xiaobo

    2013-04-01

    By altering the electrostatic charge of histones or providing binding sites for protein recognition molecules, chromatin marks have been proposed to regulate gene expression, a property that has motivated researchers to link these marks to cis-regulatory elements. With the help of next-generation sequencing technologies, we can now correlate one specific chromatin mark with regulatory elements (e.g. enhancers or promoters) and also build tools, such as hidden Markov models, to gain insight into mark combinations. However, hidden Markov models are limited by their generative character and by the assumption that the current observation depends only on the current hidden state in the chain. Here, we employed two graphical probabilistic models, namely the linear conditional random field model and the multivariate hidden Markov model, to mark gene regions with different states based on the recurrent and spatially coherent character of eight chromatin marks. Both models revealed chromatin states that may correspond to enhancers and promoters, transcribed regions, transcriptional elongation, and low-signal regions. We also found that the linear conditional random field model was more effective than the hidden Markov model in recognizing regulatory elements, such as promoter-, enhancer-, and transcriptional elongation-associated regions, which makes it the better choice.

  6. Probabilistic constitutive relationships for material strength degradation models

    NASA Technical Reports Server (NTRS)

    Boyce, L.; Chamis, C. C.

    1989-01-01

    In the present probabilistic methodology for the strength of aerospace propulsion system structural components subjected to such environmentally-induced primitive variables as loading stresses, high temperature, chemical corrosion, and radiation, time is encompassed as an interacting element, allowing the projection of creep and fatigue effects. A probabilistic constitutive equation is postulated to account for the degradation of strength due to these primitive variables which may be calibrated by an appropriately curve-fitted least-squares multiple regression of experimental data. The resulting probabilistic constitutive equation is embodied in the PROMISS code for aerospace propulsion component random strength determination.

  7. The Gain-Loss Model: A Probabilistic Skill Multimap Model for Assessing Learning Processes

    ERIC Educational Resources Information Center

    Robusto, Egidio; Stefanutti, Luca; Anselmi, Pasquale

    2010-01-01

    Within the theoretical framework of knowledge space theory, a probabilistic skill multimap model for assessing learning processes is proposed. The learning process of a student is modeled as a function of the student's knowledge and of an educational intervention on the attainment of specific skills required to solve problems in a knowledge…

  8. Probabilistic consequence model of accidental or intentional chemical releases.

    SciTech Connect

    Chang, Y.-S.; Samsa, M. E.; Folga, S. M.; Hartmann, H. M.

    2008-06-02

    In this work, general methodologies for evaluating the impacts of large-scale toxic chemical releases are proposed. The potential numbers of injuries and fatalities, the numbers of hospital beds, and the geographical areas rendered unusable during and some time after the occurrence and passage of a toxic plume are estimated on a probabilistic basis. To arrive at these estimates, historical accidental release data, maximum stored volumes, and meteorological data were used as inputs into the SLAB accidental chemical release model. Toxic gas footprints from the model were overlaid onto detailed population and hospital distribution data for a given region to estimate potential impacts. Output results are in the form of a generic statistical distribution of injuries and fatalities associated with specific toxic chemicals and regions of the United States. In addition, indoor hazards were estimated, so the model can provide contingency plans for either shelter-in-place or evacuation when an accident occurs. The stochastic distributions of injuries and fatalities are being used in a U.S. Department of Homeland Security-sponsored decision support system as source terms for a Monte Carlo simulation that evaluates potential measures for mitigating terrorist threats. This information can also be used to support the formulation of evacuation plans and to estimate damage and cleanup costs.

  9. Probabilistic pairwise Markov models: application to prostate cancer detection

    NASA Astrophysics Data System (ADS)

    Monaco, James; Tomaszewski, John E.; Feldman, Michael D.; Moradi, Mehdi; Mousavi, Parvin; Boag, Alexander; Davidson, Chris; Abolmaesumi, Purang; Madabhushi, Anant

    2009-02-01

    Markov Random Fields (MRFs) provide a tractable means for incorporating contextual information into a Bayesian framework. This contextual information is modeled using multiple local conditional probability density functions (LCPDFs) which the MRF framework implicitly combines into a single joint probability density function (JPDF) that describes the entire system. However, only LCPDFs of certain functional forms are consistent, meaning they reconstitute a valid JPDF. These forms are specified by the Gibbs-Markov equivalence theorem which indicates that the JPDF, and hence the LCPDFs, should be representable as a product of potential functions (i.e. Gibbs distributions). Unfortunately, potential functions are mathematical abstractions that lack intuition; consequently, constructing LCPDFs through their selection becomes an ad hoc procedure, usually resulting in generic and/or heuristic models. In this paper we demonstrate that under certain conditions the LCPDFs can be formulated in terms of quantities that are both meaningful and descriptive: probability distributions. Using probability distributions instead of potential functions enables us to construct consistent LCPDFs whose modeling capabilities are both more intuitive and expansive than typical MRF models. As an example, we compare the efficacy of our so-called probabilistic pairwise Markov models (PPMMs) to the prevalent Potts model by incorporating both into a novel computer-aided diagnosis (CAD) system for detecting prostate cancer in whole-mount histological sections. Using the Potts model, the CAD system is able to detect cancerous glands with a specificity of 0.82 and a sensitivity of 0.71; its area under the receiver operating characteristic curve (AUC) is 0.83. If instead the PPMM is employed, the sensitivity (with specificity held fixed) and the AUC increase to 0.77 and 0.87, respectively.
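
    To show how contextual information enters through local conditionals in an MRF of this general kind, the sketch below runs iterated conditional modes with a generic Potts prior over hypothetical per-gland cancer probabilities. It illustrates the baseline Potts construction only, not the PPMM formulation, and all scores and the coupling strength are invented.

```python
# Minimal sketch: ICM smoothing of per-site cancer probabilities on a 4-connected
# grid with a Potts prior. Scores and coupling strength are hypothetical.
import numpy as np

rng = np.random.default_rng(4)
prob_cancer = np.clip(rng.normal(0.3, 0.25, size=(32, 32)), 0.01, 0.99)  # per-site CAD scores
labels = (prob_cancer > 0.5).astype(int)                                  # initial labeling
beta = 0.8                                                                # Potts coupling strength

def neighbors(i, j, shape):
    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        ni, nj = i + di, j + dj
        if 0 <= ni < shape[0] and 0 <= nj < shape[1]:
            yield ni, nj

for _ in range(5):                                # ICM sweeps
    for i in range(labels.shape[0]):
        for j in range(labels.shape[1]):
            energies = []
            for lab in (0, 1):
                data_term = -np.log(prob_cancer[i, j] if lab else 1 - prob_cancer[i, j])
                agree = sum(labels[n] == lab for n in neighbors(i, j, labels.shape))
                energies.append(data_term - beta * agree)  # data term plus Potts prior
            labels[i, j] = int(np.argmin(energies))
print("cancerous fraction after contextual smoothing:", labels.mean())
```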

  10. Choice.

    PubMed

    Greenberg, Jay

    2008-09-01

    Understanding how and why analysands make the choices they do is central to both the clinical and the theoretical projects of psychoanalysis. And yet we know very little about the process of choice or about the relationship between choices and motives. A striking parallel is to be found between the ways choice is narrated in ancient Greek texts and the experience of analysts as they observe patients making choices in everyday clinical work. Pursuing this convergence of classical and contemporary sensibilities will illuminate crucial elements of the various meanings of choice, and of the way that these meanings change over the course of psychoanalytic treatment.

  11. Predicting coastal cliff erosion using a Bayesian probabilistic model

    USGS Publications Warehouse

    Hapke, C.; Plant, N.

    2010-01-01

    Regional coastal cliff retreat is difficult to model due to the episodic nature of failures and the along-shore variability of retreat events. There is a growing demand, however, for predictive models that can be used to forecast areas vulnerable to coastal erosion hazards. Increasingly, probabilistic models are being employed that require data sets of high temporal density to define the joint probability density function that relates forcing variables (e.g. wave conditions) and initial conditions (e.g. cliff geometry) to erosion events. In this study we use a multi-parameter Bayesian network to investigate correlations between key variables that control and influence variations in cliff retreat processes. The network uses Bayesian statistical methods to estimate event probabilities using existing observations. Within this framework, we forecast the spatial distribution of cliff retreat along two stretches of cliffed coast in Southern California. The input parameters are the height and slope of the cliff, a descriptor of material strength based on the dominant cliff-forming lithology, and the long-term cliff erosion rate that represents prior behavior. The model is forced using predicted wave impact hours. Results demonstrate that the Bayesian approach is well-suited to the forward modeling of coastal cliff retreat, with the correct outcomes forecast in 70-90% of the modeled transects. The model also performs well in identifying specific locations of high cliff erosion, thus providing a foundation for hazard mapping. This approach can be employed to predict cliff erosion at time-scales ranging from storm events to the impacts of sea-level rise at the century-scale.

  12. Temporal Resolution in Time Series and Probabilistic Models of Renewable Power Systems

    NASA Astrophysics Data System (ADS)

    Hoevenaars, Eric

    There are two main types of logistical models used for long-term performance prediction of autonomous power systems: time series and probabilistic. Time series models are more common and are more accurate for sizing storage systems because they are able to track the state of charge. However, the computational time is usually greater than for probabilistic models. It is common for time series models to perform 1-year simulations with a 1-hour time step. This is likely because of the limited availability of high resolution data and the increase in computation time with a shorter time step. Computation time is particularly important because these types of models are often used for component size optimization which requires many model runs. This thesis includes a sensitivity analysis examining the effect of the time step on these simulations. The results show that it can be significant, though it depends on the system configuration and site characteristics. Two probabilistic models are developed to estimate the temporal resolution error of a 1-hour simulation: a time series/probabilistic model and a fully probabilistic model. To demonstrate the application of and evaluate the performance of these models, two case studies are analyzed. One is for a typical residential system and one is for a system designed to provide on-site power at an aquaculture site. The results show that the time series/probabilistic model would be a useful tool if accurate distributions of the sub-hour data can be determined. Additionally, the method of cumulant arithmetic is demonstrated to be a useful technique for incorporating multiple non-Gaussian random variables into a probabilistic model, a feature other models such as Hybrid2 currently do not have. The results from the fully probabilistic model showed that some form of autocorrelation is required to account for seasonal and diurnal trends.
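
    The time-step sensitivity discussed in this record can be illustrated with a toy state-of-charge simulation: averaging sub-hourly net power over coarser steps can hide short deficits that a finer step would catch. The numbers below (a white-noise net-power series and a 2 kWh store) are purely illustrative and unrelated to the thesis case studies.

```python
# Minimal sketch: effect of simulation time step on unmet load when tracking battery
# state of charge. The net-power series and storage capacity are hypothetical.
import numpy as np

rng = np.random.default_rng(5)
minutes = 7 * 24 * 60
net_power = rng.normal(0.0, 1.0, size=minutes)         # renewables minus load, kW

def unmet_energy(net, step_min, capacity_kwh=2.0):
    """Track state of charge at the given resolution; return unmet load energy (kWh)."""
    steps = net.reshape(-1, step_min).mean(axis=1)      # average power within each step
    dt_h = step_min / 60.0
    soc, unmet = capacity_kwh, 0.0
    for p in steps:
        soc += p * dt_h
        if soc > capacity_kwh:
            soc = capacity_kwh                          # curtail surplus
        elif soc < 0.0:
            unmet += -soc                               # deficit the storage could not cover
            soc = 0.0
    return unmet

for step in (1, 10, 60):
    print(f"{step:3d}-minute step: unmet energy = {unmet_energy(net_power, step):6.2f} kWh")
```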

  13. Model for understanding consumer textural food choice.

    PubMed

    Jeltema, Melissa; Beckley, Jacqueline; Vahalik, Jennifer

    2015-05-01

    The current paradigm for developing products that will match the marketing messaging is flawed because the drivers of product choice and satisfaction based on texture are misunderstood. Qualitative research across 10 years has led to the thesis explored in this research that individuals have a preferred way to manipulate food in their mouths (i.e., mouth behavior) and that this behavior is a major driver of food choice, satisfaction, and the desire to repurchase. Texture, which is currently thought to be a major driver of product choice, is a secondary factor, and is important only in that it supports the primary driver-mouth behavior. A model for mouth behavior is proposed and the qualitative research supporting the identification of different mouth behaviors is presented. The development of a trademarked typing tool for characterizing mouth behavior is described along with quantitative substantiation of the tool's ability to group individuals by mouth behavior. The use of these four groups to understand textural preferences and the implications for a variety of areas including product design and weight management are explored.

  14. Model for understanding consumer textural food choice

    PubMed Central

    Jeltema, Melissa; Beckley, Jacqueline; Vahalik, Jennifer

    2015-01-01

    The current paradigm for developing products that will match the marketing messaging is flawed because the drivers of product choice and satisfaction based on texture are misunderstood. Qualitative research across 10 years has led to the thesis explored in this research that individuals have a preferred way to manipulate food in their mouths (i.e., mouth behavior) and that this behavior is a major driver of food choice, satisfaction, and the desire to repurchase. Texture, which is currently thought to be a major driver of product choice, is a secondary factor, and is important only in that it supports the primary driver—mouth behavior. A model for mouth behavior is proposed and the qualitative research supporting the identification of different mouth behaviors is presented. The development of a trademarked typing tool for characterizing mouth behavior is described along with quantitative substantiation of the tool's ability to group individuals by mouth behavior. The use of these four groups to understand textural preferences and the implications for a variety of areas including product design and weight management are explored. PMID:25987995

  15. Probabilistic modelling of rainfall induced landslide hazard assessment

    NASA Astrophysics Data System (ADS)

    Kawagoe, S.; Kazama, S.; Sarukkalige, P. R.

    2010-06-01

    To evaluate the frequency and distribution of landslide hazards over Japan, this study uses a probabilistic model based on multiple logistic regression analysis. The study particularly concerns several important physical parameters, namely hydraulic, geographical and geological parameters, which are considered to be influential in the occurrence of landslides. Sensitivity analysis confirmed that the hydrological parameter (hydraulic gradient) is the most influential factor in the occurrence of landslides. Therefore, the hydraulic gradient is used as the main hydraulic parameter; it is a dynamic factor which includes the effect of heavy rainfall and its return period. Using the constructed spatial data-sets, a multiple logistic regression model is applied and landslide hazard probability maps are produced showing the spatial-temporal distribution of landslide hazard probability over Japan. To represent the landslide hazard at different temporal scales, extreme precipitation with 5-year, 30-year, and 100-year return periods is used for the evaluation. The results show that the highest landslide hazard probability exists in the mountain ranges on the western side of Japan (Japan Sea side), including the Hida and Kiso, Iide and Asahi mountain ranges, the south side of the Chugoku mountain range, the south side of the Kyushu mountains, the Dewa mountain range and the Hokuriku region. The developed landslide hazard probability maps in this study will assist authorities, policy makers and decision makers who are responsible for infrastructural planning and development, as they can identify landslide-susceptible areas and thus decrease landslide damage through proper preparation.

  16. Probabilistic modelling of rainfall induced landslide hazard assessment

    NASA Astrophysics Data System (ADS)

    Kawagoe, S.; Kazama, S.; Sarukkalige, P. R.

    2010-01-01

    To evaluate the frequency and distribution of landslide hazards over Japan, this study uses a probabilistic model based on multiple logistic regression analysis. The study particularly concerns several important physical parameters, namely hydraulic, geographical and geological parameters, which are considered to be influential in the occurrence of landslides. Sensitivity analysis confirmed that the hydrological parameter (hydraulic gradient) is the most influential factor in the occurrence of landslides. Therefore, the hydraulic gradient is used as the main hydraulic parameter; it is a dynamic factor which includes the effect of heavy rainfall and its return period. Using the constructed spatial data-sets, a multiple logistic regression model is applied and landslide susceptibility maps are produced showing the spatial-temporal distribution of landslide hazard susceptibility over Japan. To represent the susceptibility at different temporal scales, extreme precipitation with 5-year, 30-year, and 100-year return periods is used for the evaluation. The results show that the highest landslide hazard susceptibility exists in the mountain ranges on the western side of Japan (Japan Sea side), including the Hida and Kiso, Iide and Asahi mountain ranges, the south side of the Chugoku mountain range, the south side of the Kyushu mountains, the Dewa mountain range and the Hokuriku region. The developed landslide hazard susceptibility maps in this study will assist authorities, policy makers and decision makers who are responsible for infrastructural planning and development, as they can identify landslide-susceptible areas and thus decrease landslide damage through proper preparation.

  17. Formulation of probabilistic models of protein structure in atomic detail using the reference ratio method.

    PubMed

    Valentin, Jan B; Andreetta, Christian; Boomsma, Wouter; Bottaro, Sandro; Ferkinghoff-Borg, Jesper; Frellsen, Jes; Mardia, Kanti V; Tian, Pengfei; Hamelryck, Thomas

    2014-02-01

    We propose a method to formulate probabilistic models of protein structure in atomic detail, for a given amino acid sequence, based on Bayesian principles, while retaining a close link to physics. We start from two previously developed probabilistic models of protein structure on a local length scale, which concern the dihedral angles in main chain and side chains, respectively. Conceptually, this constitutes a probabilistic and continuous alternative to the use of discrete fragment and rotamer libraries. The local model is combined with a nonlocal model that involves a small number of energy terms according to a physical force field, and some information on the overall secondary structure content. In this initial study we focus on the formulation of the joint model and the evaluation of the use of an energy vector as a descriptor of a protein's nonlocal structure; hence, we derive the parameters of the nonlocal model from the native structure without loss of generality. The local and nonlocal models are combined using the reference ratio method, which is a well-justified probabilistic construction. For evaluation, we use the resulting joint models to predict the structure of four proteins. The results indicate that the proposed method and the probabilistic models show considerable promise for probabilistic protein structure prediction and related applications.
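    As a toy illustration of the reference ratio construction, the snippet below reweights a fine-grained "local" distribution so that a coarse descriptor of its states follows a target "nonlocal" distribution, dividing out the descriptor distribution implied by the local model alone. The discrete distributions are invented stand-ins for the dihedral-angle and energy-vector models.

        import numpy as np

        # Local model over 6 fine-grained states; descriptor maps states to 2 classes.
        p_local = np.array([0.30, 0.20, 0.10, 0.15, 0.15, 0.10])
        descriptor = np.array([0, 0, 0, 1, 1, 1])        # coarse feature m(x)

        # Descriptor distribution implied by the local model alone (the "reference").
        p_ref = np.array([p_local[descriptor == c].sum() for c in (0, 1)])
        # Target nonlocal model for the descriptor.
        q = np.array([0.3, 0.7])

        # Reference ratio: p(x) proportional to p_local(x) * q(m(x)) / p_ref(m(x)).
        p_joint = p_local * q[descriptor] / p_ref[descriptor]
        p_joint /= p_joint.sum()

        print(p_joint.round(3))
        print("descriptor marginal:",
              [round(p_joint[descriptor == c].sum(), 3) for c in (0, 1)])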

  18. Probabilistic approaches to the modelling of fluvial processes

    NASA Astrophysics Data System (ADS)

    Molnar, Peter

    2013-04-01

    Fluvial systems generally exhibit sediment dynamics that are strongly stochastic. This stochasticity comes primarily from three sources: (a) the variability and randomness in sediment supply due to surface properties and topography; (b) the multitude of pathways that sediment may take on hillslopes and in channels, and the uncertainty in travel times and sediment storage along those pathways; and (c) the stochasticity inherent in mobilizing sediment, whether by heavy rain, landslides, debris flows, slope erosion, channel avulsions, etc. Fully deterministic models of fluvial systems, even if they are physically realistic and very complex, are likely to be unable to capture this stochasticity and as a result will fail to reproduce long-term sediment dynamics. In this paper I will review another approach to modelling fluvial processes, which grossly simplifies the system itself, but allows for stochasticity in sediment supply, mobilization and transport. I will demonstrate the benefits and limitations of this probabilistic approach to fluvial processes with three examples. The first example is a probabilistic sediment cascade which we developed for the Illgraben, a debris flow basin in the Rhone catchment. In this example it will be shown how the probability distribution of landslides generating sediment input into the channel system is transposed into that of sediment yield out of the basin by debris flows. The key role of transient sediment storage in the channel system, which limits the size of potential debris flows, is highlighted together with the influence of the landslide triggering mechanisms and climate stochasticity. The second example focuses on the river reach scale in the Maggia River, a braided gravel-bed stream where the exposed sediment on gravel bars is colonised by riparian vegetation in periods without floods. A simple autoregressive model with a disturbance and colonization term is used to simulate the growth and decline in
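    The reach-scale example can be caricatured with a first-order recursion in which vegetated area grows by colonization between floods and is reset by stochastic disturbances; the parameter values and flood model below are hypothetical, not those calibrated for the Maggia River.

        import numpy as np

        rng = np.random.default_rng(0)
        n_years, colonization, flood_prob, damage = 100, 0.15, 0.2, 0.7

        v = np.empty(n_years)          # fraction of gravel bars covered by vegetation
        v[0] = 0.05
        for t in range(1, n_years):
            flood = rng.random() < flood_prob            # stochastic disturbance
            growth = colonization * (1.0 - v[t - 1])     # saturating colonization term
            loss = damage * v[t - 1] if flood else 0.0   # flood removes vegetation
            v[t] = np.clip(v[t - 1] + growth - loss, 0.0, 1.0)

        print("mean vegetated fraction:", round(v.mean(), 3))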

  19. Design of multispecific protein sequences using probabilistic graphical modeling.

    PubMed

    Fromer, Menachem; Yanover, Chen; Linial, Michal

    2010-02-15

    In nature, proteins partake in numerous protein-protein interactions that mediate their functions. Moreover, proteins have been shown to be physically stable in multiple structures, induced by cellular conditions, small ligands, or covalent modifications. Understanding how protein sequences achieve this structural promiscuity at the atomic level is a fundamental step in the drug design pipeline and a critical question in protein physics. One way to investigate this subject is to computationally predict protein sequences that are compatible with multiple states, i.e., multiple target structures or binding to distinct partners. The goal of engineering such proteins has been termed multispecific protein design. We develop a novel computational framework to efficiently and accurately perform multispecific protein design. This framework utilizes recent advances in probabilistic graphical modeling to predict sequences with low energies in multiple target states. Furthermore, it is also geared to specifically yield positional amino acid probability profiles compatible with these target states. Such profiles can be used as input to randomly bias high-throughput experimental sequence screening techniques, such as phage display, thus providing an alternative avenue for elucidating the multispecificity of natural proteins and the synthesis of novel proteins with specific functionalities. We prove the utility of such multispecific design techniques in better recovering amino acid sequence diversities similar to those resulting from millions of years of evolution. We then compare the approaches of prediction of low energy ensembles and of amino acid profiles and demonstrate their complementarity in providing more robust predictions for protein design.

  20. Poisson Group Testing: A Probabilistic Model for Boolean Compressed Sensing

    NASA Astrophysics Data System (ADS)

    Emad, Amin; Milenkovic, Olgica

    2015-08-01

    We introduce a novel probabilistic group testing framework, termed Poisson group testing, in which the number of defectives follows a right-truncated Poisson distribution. The Poisson model has a number of new applications, including dynamic testing with diminishing relative rates of defectives. We consider both nonadaptive and semi-adaptive identification methods. For nonadaptive methods, we derive a lower bound on the number of tests required to identify the defectives with a probability of error that asymptotically converges to zero; in addition, we propose test matrix constructions for which the number of tests closely matches the lower bound. For semi-adaptive methods, we describe a lower bound on the expected number of tests required to identify the defectives with zero error probability. In addition, we propose a stage-wise reconstruction algorithm for which the expected number of tests is only a constant factor away from the lower bound. The methods rely only on an estimate of the average number of defectives, rather than on the individual probabilities of subjects being defective.
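    A minimal simulation of the nonadaptive setting described above: the number of defectives is drawn from a right-truncated Poisson distribution, items are pooled with a random Bernoulli test matrix, and each test returns the Boolean OR of its pool. The matrix design and the naive "clear anything in a negative pool" decoder are placeholders, not the constructions proposed in the paper.

        import numpy as np

        rng = np.random.default_rng(1)
        n_items, n_tests, lam, truncation = 200, 60, 3.0, 10

        # Right-truncated Poisson number of defectives.
        d = rng.poisson(lam)
        while d > truncation:
            d = rng.poisson(lam)
        defectives = rng.choice(n_items, size=d, replace=False)
        x = np.zeros(n_items, dtype=bool)
        x[defectives] = True

        # Random Bernoulli pooling matrix and Boolean (OR) test outcomes.
        A = rng.random((n_tests, n_items)) < 0.05
        y = (A.astype(int) @ x.astype(int)) > 0

        # Naive decoder: an item is cleared if it appears in any negative test.
        cleared = (A[~y]).any(axis=0)
        candidates = np.flatnonzero(~cleared)
        print("true defectives:", np.sort(defectives))
        print("candidate set  :", candidates)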

  1. Sonar signal processing using probabilistic signal and ocean environmental models.

    PubMed

    Culver, R Lee; Camin, H John

    2008-12-01

    Acoustic signals propagating through the ocean are refracted, scattered, and attenuated by the ocean volume and boundaries. Many aspects of how the ocean affects acoustic propagation are understood, such that the characteristics of a received signal can often be predicted with some degree of certainty. However, acoustic ocean parameters vary with time and location in a manner that is not, and cannot be, precisely known; some uncertainty will always remain. For this reason, the characteristics of the received signal can never be precisely predicted and must be described in probabilistic terms. A signal processing structure recently developed relies on knowledge of the ocean environment to predict the statistical characteristics of the received signal, and incorporates this description into the processor in order to detect and classify targets. Acoustic measurements at 250 Hz from the 1996 Strait of Gibraltar Acoustic Monitoring Experiment are used to illustrate how the processor utilizes environmental data to classify source depth and to underscore the importance of environmental model fidelity and completeness.

  2. Data-directed RNA secondary structure prediction using probabilistic modeling.

    PubMed

    Deng, Fei; Ledda, Mirko; Vaziri, Sana; Aviran, Sharon

    2016-08-01

    Structure dictates the function of many RNAs, but secondary RNA structure analysis is either labor intensive and costly or relies on computational predictions that are often inaccurate. These limitations are alleviated by integration of structure probing data into prediction algorithms. However, existing algorithms are optimized for a specific type of probing data. Recently, new chemistries combined with advances in sequencing have facilitated structure probing at unprecedented scale and sensitivity. These novel technologies and anticipated wealth of data highlight a need for algorithms that readily accommodate more complex and diverse input sources. We implemented and investigated a recently outlined probabilistic framework for RNA secondary structure prediction and extended it to accommodate further refinement of structural information. This framework utilizes direct likelihood-based calculations of pseudo-energy terms per considered structural context and can readily accommodate diverse data types and complex data dependencies. We use real data in conjunction with simulations to evaluate performances of several implementations and to show that proper integration of structural contexts can lead to improvements. Our tests also reveal discrepancies between real data and simulations, which we show can be alleviated by refined modeling. We then propose statistical preprocessing approaches to standardize data interpretation and integration into such a generic framework. We further systematically quantify the information content of data subsets, demonstrating that high reactivities are major drivers of SHAPE-directed predictions and that better understanding of less informative reactivities is key to further improvements. Finally, we provide evidence for the adaptive capability of our framework using mock probe simulations.

  3. Probabilistic model for fracture mechanics service life analysis

    NASA Technical Reports Server (NTRS)

    Annis, Charles; Watkins, Tommie

    1988-01-01

    The service longevity of complex propulsion systems, such as the Space Shuttle Main Engine (SSME), can be at risk from several competing failure modes. Conventional life assessment practice focuses upon the most severely life-limited feature of a given component, even though there may be other, less severe, potential failure locations. Primary, secondary, and tertiary failure modes, as well as their associated probabilities, must also be considered. Furthermore, these probabilities are functions of accumulated service time. Thus a component may not always succumb to the most severe, or even the most probable, failure mode. Propulsion system longevity must be assessed by considering simultaneously the actions of, and interactions among, life-limiting influences. These include, but are not limited to, high frequency fatigue (HFF), low cycle fatigue (LCF), and subsequent crack propagation, thermal and acoustic loadings, and the influence of less-than-ideal nondestructive evaluation (NDE). An outline is provided for a probabilistic model for service life analysis, and the progress towards its implementation is reported.
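    One way to read the "competing failure modes" argument is that the component survives only if it survives every mode simultaneously, so the system-level failure probability at a given service time combines the mode-level probabilities. The sketch below does this for hypothetical Weibull mode lives, assuming independence between modes; none of the numbers come from the paper.

        import numpy as np

        # Hypothetical Weibull parameters (shape, scale in mission cycles) for
        # three competing influences: HFF, LCF-initiated crack growth, NDE escape.
        modes = {"HFF": (1.8, 12000.0),
                 "LCF+crack": (2.5, 9000.0),
                 "NDE escape": (1.2, 30000.0)}

        def mode_failure_prob(t, shape, scale):
            return 1.0 - np.exp(-(t / scale) ** shape)

        t = np.array([1000.0, 3000.0, 6000.0])
        survival = np.ones_like(t)
        for shape, scale in modes.values():
            survival *= 1.0 - mode_failure_prob(t, shape, scale)  # independent modes

        print("system failure probability at t =", t, ":", np.round(1.0 - survival, 4))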

  4. A probabilistic model of ecosystem response to climate change

    SciTech Connect

    Shevliakova, E.; Dowlatabadi, H.

    1994-12-31

    Anthropogenic activities are leading to rapid changes in land cover and emissions of greenhouse gases into the atmosphere. These changes can bring about climate change typified by average global temperatures rising by 1--5 C over the next century. Climate change of this magnitude is likely to alter the distribution of terrestrial ecosystems on a large scale. Options available for dealing with such change are abatement of emissions, adaptation, and geoengineering. The integrated assessment of climate change demands that frameworks be developed where all the elements of the climate problem are present (from economic activity to climate change and its impacts on market and non-market goods and services). Integrated climate assessment requires multiple impact metrics and multi-attribute utility functions to simulate the response of different key actors/decision-makers to the actual physical impacts (rather than a dollar value) of the climate-damage vs. policy-cost debate. This necessitates direct modeling of ecosystem impacts of climate change. The authors have developed a probabilistic model of ecosystem response to global change. This model differs from previous efforts in that it is statistically estimated using actual ecosystem and climate data yielding a joint multivariate probability of prevalence for each ecosystem, given climatic conditions. The authors expect this approach to permit simulation of inertia and competition which have, so far, been absent in transfer models of continental-scale ecosystem response to global change. Thus, although the probability of one ecotype will dominate others at a given point, others would have the possibility of establishing an early foothold.

  5. The Terrestrial Investigation Model: A probabilistic risk assessment model for birds exposed to pesticides

    EPA Science Inventory

    One of the major recommendations of the National Academy of Science to the USEPA, NMFS and USFWS was to utilize probabilistic methods when assessing the risks of pesticides to federally listed endangered and threatened species. The Terrestrial Investigation Model (TIM, version 3....

  6. Forecasting the duration of volcanic eruptions: an empirical probabilistic model

    NASA Astrophysics Data System (ADS)

    Gunn, L. S.; Blake, S.; Jones, M. C.; Rymer, H.

    2014-01-01

    The ability to forecast future volcanic eruption durations would greatly benefit emergency response planning prior to and during a volcanic crisis. This paper introduces a probabilistic model to forecast the duration of future and on-going eruptions. The model fits theoretical distributions to observed duration data and relies on past eruptions being a good indicator of future activity. A dataset of historical Mt. Etna flank eruptions is presented and used to demonstrate the model. The data have been compiled through critical examination of existing literature along with careful consideration of uncertainties on reported eruption start and end dates between the years 1300 AD and 2010. Data following 1600 are considered to be reliable and free of reporting biases. The distribution of eruption duration between the years 1600 and 1669 is found to be statistically different from that following it and the forecasting model is run on two datasets of Mt. Etna flank eruption durations: 1600-2010 and 1670-2010. Each dataset is modelled using a log-logistic distribution with parameter values found by maximum likelihood estimation. Survivor function statistics are applied to the model distributions to forecast (a) the probability of an eruption exceeding a given duration, (b) the probability of an eruption that has already lasted a particular number of days exceeding a given total duration and (c) the duration with a given probability of being exceeded. Results show that excluding the 1600-1670 data has little effect on the forecasting model result, especially where short durations are involved. By assigning the terms 'likely' and 'unlikely' to probabilities of 66 % or more and 33 % or less, respectively, the forecasting model based on the 1600-2010 dataset indicates that a future flank eruption on Mt. Etna would be likely to exceed 20 days (± 7 days) but unlikely to exceed 86 days (± 29 days). This approach can easily be adapted for use on other highly active, well
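    The survivor-function calculations listed above reduce to closed-form expressions for a log-logistic distribution; the sketch below computes (a) P(duration > d), (b) the conditional probability of exceeding a total duration given the eruption has already lasted some days, and (c) the duration exceeded with a given probability. The scale and shape parameters are placeholders, not the maximum likelihood estimates reported for Mt. Etna.

        alpha, beta = 35.0, 1.3   # hypothetical log-logistic scale and shape

        def survivor(d):
            """P(duration > d) for a log-logistic distribution."""
            return 1.0 / (1.0 + (d / alpha) ** beta)

        def conditional_exceedance(total, elapsed):
            """P(duration > total | duration > elapsed)."""
            return survivor(total) / survivor(elapsed)

        def duration_at_probability(p):
            """Duration d such that P(duration > d) = p (inverse survivor function)."""
            return alpha * ((1.0 - p) / p) ** (1.0 / beta)

        print(survivor(20.0))                       # (a)
        print(conditional_exceedance(86.0, 20.0))   # (b)
        print(duration_at_probability(0.33))        # (c) an 'unlikely to exceed' threshold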

  7. Probabilistic Modeling of Tephra Dispersion using Parallel Processing

    NASA Astrophysics Data System (ADS)

    Hincks, T.; Bonadonna, C.; Connor, L.; Connor, C.; Sparks, S.

    2002-12-01

    Numerical models of tephra accumulation are important tools in assessing hazards of volcanic eruptions. Such tools can be used far in advance of future eruptions to calculate possible hazards as conditional probabilities. For example, given that a volcanic eruption occurs, what is the expected range of tephra deposition in a specific location or across a region? An empirical model is presented that uses physical characteristics (e.g., volume, column height, particle size distribution) of a volcanic eruption to calculate expected tephra accumulation at geographic locations distant from the vent. This model results from the combination of the Connor et al. (2001) and Bonadonna et al. (1998, 2002) numerical approaches and is based on application of the diffusion advection equation using a stratified atmosphere and particle fall velocities that account for particle shape, density, and variation in Reynolds number along the path of descent. Distribution of particles in the eruption column is a major source of uncertainty in estimation of tephra hazards. We adopt an approach in which several models of the volcanic column may be used and the impact of these various source-term models on the hazard is estimated. Cast probabilistically, this model can use characteristics of historical eruptions, or data from analogous eruptions, to predict the expected tephra deposition from future eruptions. Application of such a model for computing a large number of events over a grid of many points is computationally expensive. In fact, the utility of the model for stochastic simulations of volcanic eruptions was limited by long execution time. To address this concern, we created a parallel version in C and MPI, a message-passing interface, to run on a Beowulf cluster, a private network of reasonably high performance computers. We have discovered that grid or input decomposition and self-scheduling techniques lead to essentially linear speed-up in the code. This means that the code is readily

  8. An empirical model for probabilistic decadal prediction: A global analysis

    NASA Astrophysics Data System (ADS)

    Suckling, Emma; Hawkins, Ed; Eden, Jonathan; van Oldenborgh, Geert Jan

    2016-04-01

    Empirical models, designed to predict land-based surface variables over seasons to decades ahead, provide useful benchmarks for comparison against the performance of dynamical forecast systems; they may also be employable as predictive tools for use by climate services in their own right. A new global empirical decadal prediction system is presented, based on a multiple linear regression approach designed to produce probabilistic output for comparison against dynamical models. Its performance is evaluated for surface air temperature over a set of historical hindcast experiments under a series of different prediction `modes'. The modes include a real-time setting, a scenario in which future volcanic forcings are prescribed during the hindcasts, and an approach which exploits knowledge of the forced trend. A two-tier prediction system, which uses knowledge of future sea surface temperatures in the Pacific and Atlantic Oceans, is also tested, but within a perfect knowledge framework. Each mode is designed to identify sources of predictability and uncertainty, as well as investigate different approaches to the design of decadal prediction systems for operational use. It is found that the empirical model shows skill above that of persistence hindcasts for annual means at lead times of up to ten years ahead in all of the prediction modes investigated. Small improvements in skill are found at all lead times when including future volcanic forcings in the hindcasts. It is also suggested that hindcasts which exploit full knowledge of the forced trend due to increasing greenhouse gases throughout the hindcast period can provide more robust estimates of model bias for the calibration of the empirical model in an operational setting. The two-tier system shows potential for improved real-time prediction, given the assumption that skilful predictions of large-scale modes of variability are available. The empirical model framework has been designed with enough flexibility to
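    A minimal sketch of the regression-based hindcast idea: regress the observed variable on a forced-trend covariate plus a lagged predictor, then issue a probabilistic forecast as a Gaussian centred on the regression mean with a spread taken from the residuals. The predictors and data here are synthetic placeholders, not the system or skill scores described in the paper.

        import numpy as np

        rng = np.random.default_rng(2)
        years = np.arange(1960, 2011)
        trend = 0.015 * (years - years[0])                # synthetic forced trend
        temp = trend + 0.2 * np.sin(years / 6.0) + rng.normal(0, 0.1, years.size)

        # Design matrix: intercept, forced trend, previous year's anomaly.
        X = np.column_stack([np.ones(years.size - 1), trend[1:], temp[:-1]])
        y = temp[1:]
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)
        resid_sd = np.std(y - X @ coef, ddof=X.shape[1])

        # Probabilistic one-step-ahead hindcast for the final year.
        x_new = np.array([1.0, trend[-1], temp[-2]])
        mean = x_new @ coef
        print(f"forecast: {mean:.2f} +/- {1.96 * resid_sd:.2f} (95% interval)")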

  9. A Survey of Probabilistic Models for Relational Data

    SciTech Connect

    Koutsourelakis, P S

    2006-10-13

    Traditional data mining methodologies have focused on ''flat'' data i.e. a collection of identically structured entities, assumed to be independent and identically distributed. However, many real-world datasets are innately relational in that they consist of multi-modal entities and multi-relational links (where each entity- or link-type is characterized by a different set of attributes). Link structure is an important characteristic of a dataset and should not be ignored in modeling efforts, especially when statistical dependencies exist between related entities. These dependencies can in fact significantly improve the accuracy of inference and prediction results, if the relational structure is appropriately leveraged (Figure 1). The need for models that can incorporate relational structure has been accentuated by new technological developments which allow us to easily track, store, and make accessible large amounts of data. Recently, there has been a surge of interest in statistical models for dealing with richly interconnected, heterogeneous data, fueled largely by information mining of web/hypertext data, social networks, bibliographic citation data, epidemiological data and communication networks. Graphical models have a natural formalism for representing complex relational data and for predicting the underlying evolving system in a dynamic framework. The present survey provides an overview of probabilistic methods and techniques that have been developed over the last few years for dealing with relational data. Particular emphasis is paid to approaches pertinent to the research areas of pattern recognition, group discovery, entity/node classification, and anomaly detection. We start with supervised learning tasks, where two basic modeling approaches are discussed--i.e. discriminative and generative. Several discriminative techniques are reviewed and performance results are presented. Generative methods are discussed in a separate survey. A special section is

  10. Integration of fuzzy analytic hierarchy process and probabilistic dynamic programming in formulating an optimal fleet management model

    NASA Astrophysics Data System (ADS)

    Teoh, Lay Eng; Khoo, Hooi Ling

    2013-09-01

    This study deals with two major aspects of airlines, i.e. supply and demand management. The aspect of supply focuses on the mathematical formulation of an optimal fleet management model to maximize the operational profit of the airlines, while the aspect of demand focuses on the incorporation of mode choice modeling as part of the developed model. The proposed methodology is outlined in two stages: first, the Fuzzy Analytic Hierarchy Process is adopted to capture mode choice modeling in order to quantify the probability of probable phenomena (for the aircraft acquisition/leasing decision). Then, an optimization model is developed as a probabilistic dynamic programming model to determine the optimal number and types of aircraft to be acquired and/or leased in order to meet stochastic demand during the planning horizon. The findings of an illustrative case study show that the proposed methodology is viable. The results demonstrate that the incorporation of mode choice modeling could affect the operational profit and fleet management decision of the airlines at varying degrees.

  11. Probabilistic modelling of sea surges in coastal urban areas

    NASA Astrophysics Data System (ADS)

    Georgiadis, Stylianos; Jomo Danielsen Sørup, Hjalte; Arnbjerg-Nielsen, Karsten; Nielsen, Bo Friis

    2016-04-01

    Urban floods are a major issue for coastal cities with severe impacts on economy, society and environment. A main cause for floods are sea surges stemming from extreme weather conditions. In the context of urban flooding, certain standards have to be met by critical infrastructures in order to protect them from floods. These standards can be so strict that no empirical data is available. For instance, protection plans for sub-surface railways against floods are established with 10,000 years return levels. Furthermore, the long technical lifetime of such infrastructures is a critical issue that should be considered, along with the associated climate change effects in this lifetime. We present a case study of Copenhagen where the metro system is being expanded at present with several stations close to the sea. The current critical sea levels for the metro have never been exceeded and Copenhagen has only been severely flooded from pluvial events in the time where measurements have been conducted. However, due to the very high return period that the metro has to be able to withstand and due to the expectations to sea-level rise due to climate change, reliable estimates of the occurrence rate and magnitude of sea surges have to be established as the current protection is expected to be insufficient at some point within the technical lifetime of the metro. The objective of this study is to probabilistically model sea level in Copenhagen as opposed to extrapolating the extreme statistics as is the practice often used. A better understanding and more realistic description of the phenomena leading to sea surges can then be given. The application of hidden Markov models to high-resolution data of sea level for different meteorological stations in and around Copenhagen is an effective tool to address uncertainty. For sea surge studies, the hidden states of the model may reflect the hydrological processes that contribute to coastal floods. Also, the states of the hidden Markov
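    To make the hidden Markov idea concrete, the sketch below simulates sea level from a two-state (calm/surge) Gaussian HMM and recovers the filtered probability of being in the surge state with the forward recursion. The transition probabilities and emission parameters are invented for illustration and are not fitted to the Copenhagen records.

        import numpy as np

        rng = np.random.default_rng(3)
        P = np.array([[0.98, 0.02],      # calm -> calm / surge
                      [0.20, 0.80]])     # surge -> calm / surge
        means, sds = np.array([0.0, 1.2]), np.array([0.15, 0.40])  # metres

        # Simulate hidden states and observed sea levels.
        n = 500
        states = np.empty(n, dtype=int)
        states[0] = 0
        for t in range(1, n):
            states[t] = rng.choice(2, p=P[states[t - 1]])
        obs = rng.normal(means[states], sds[states])

        def gauss(x, m, s):
            return np.exp(-0.5 * ((x - m) / s) ** 2) / (s * np.sqrt(2 * np.pi))

        # Forward filtering: P(state_t | obs_1..t).
        alpha = np.array([0.5, 0.5]) * gauss(obs[0], means, sds)
        alpha /= alpha.sum()
        surge_prob = [alpha[1]]
        for t in range(1, n):
            alpha = (alpha @ P) * gauss(obs[t], means, sds)
            alpha /= alpha.sum()
            surge_prob.append(alpha[1])

        print("fraction of time filtered surge prob > 0.5:",
              round(float(np.mean(np.array(surge_prob) > 0.5)), 3))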

  12. A probabilistic graphical model approach to stochastic multiscale partial differential equations

    SciTech Connect

    Wan, Jiang; Zabaras, Nicholas

    2013-10-01

    We develop a probabilistic graphical model based methodology to efficiently perform uncertainty quantification in the presence of both stochastic input and multiple scales. Both the stochastic input and model responses are treated as random variables in this framework. Their relationships are modeled by graphical models which give explicit factorization of a high-dimensional joint probability distribution. The hyperparameters in the probabilistic model are learned using sequential Monte Carlo (SMC) method, which is superior to standard Markov chain Monte Carlo (MCMC) methods for multi-modal distributions. Finally, we make predictions from the probabilistic graphical model using the belief propagation algorithm. Numerical examples are presented to show the accuracy and efficiency of the predictive capability of the developed graphical model.

  13. The Stay/Switch Model of Concurrent Choice

    ERIC Educational Resources Information Center

    MacDonall, James S.

    2009-01-01

    This experiment compared descriptions of concurrent choice by the stay/switch model, which says choice is a function of the reinforcers obtained for staying at and for switching from each alternative, and the generalized matching law, which says choice is a function of the total reinforcers obtained at each alternative. For the stay/switch model…

  14. Nested Logit Models for Multiple-Choice Item Response Data

    ERIC Educational Resources Information Center

    Suh, Youngsuk; Bolt, Daniel M.

    2010-01-01

    Nested logit item response models for multiple-choice data are presented. Relative to previous models, the new models are suggested to provide a better approximation to multiple-choice items where the application of a solution strategy precedes consideration of response options. In practice, the models also accommodate collapsibility across all…

  15. Integration of Advanced Probabilistic Analysis Techniques with Multi-Physics Models

    SciTech Connect

    Cetiner, Mustafa Sacit; none,; Flanagan, George F.; Poore III, Willis P.; Muhlheim, Michael David

    2014-07-30

    An integrated simulation platform that couples probabilistic analysis-based tools with model-based simulation tools can provide valuable insights for reactive and proactive responses to plant operating conditions. The objective of this work is to demonstrate the benefits of a partial implementation of the Small Modular Reactor (SMR) Probabilistic Risk Assessment (PRA) Detailed Framework Specification through the coupling of advanced PRA capabilities and accurate multi-physics plant models. Coupling a probabilistic model with a multi-physics model will aid in design, operations, and safety by providing a more accurate understanding of plant behavior. This represents the first attempt at actually integrating these two types of analyses for a control system used for operations, on a faster than real-time basis. This report documents the development of the basic communication capability to exchange data with the probabilistic model using Reliability Workbench (RWB) and the multi-physics model using Dymola. The communication pathways from injecting a fault (i.e., failing a component) to the probabilistic and multi-physics models were successfully completed. This first version was tested with prototypic models represented in both RWB and Modelica. First, a simple event tree/fault tree (ET/FT) model was created to develop the software code to implement the communication capabilities between the dynamic-link library (dll) and RWB. A program, written in C#, successfully communicates faults to the probabilistic model through the dll. A systems model of the Advanced Liquid-Metal Reactor–Power Reactor Inherently Safe Module (ALMR-PRISM) design developed under another DOE project was upgraded using Dymola to include proper interfaces to allow data exchange with the control application (ConApp). A program, written in C+, successfully communicates faults to the multi-physics model. The results of the example simulation were successfully plotted.

  16. Probabilistic modeling of flood characterizations with parametric and minimum information pair-copula model

    NASA Astrophysics Data System (ADS)

    Daneshkhah, Alireza; Remesan, Renji; Chatrabgoun, Omid; Holman, Ian P.

    2016-09-01

    This paper highlights the usefulness of the minimum information and parametric pair-copula construction (PCC) to model the joint distribution of flood event properties. Both of these models outperform other standard multivariate copulas in modeling multivariate flood data that exhibit complex patterns of dependence, particularly in the tails. In particular, the minimum information pair-copula model shows greater flexibility, produces a better approximation of the joint probability density, and yields measures suitable for effective hazard assessment. The study demonstrates that any multivariate density can be approximated to any desired degree of precision using the minimum information pair-copula model, which can be practically used for probabilistic flood hazard assessment.
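    The pair-copula construction mentioned above factorizes a multivariate density into marginals and bivariate copulas. The sketch below evaluates a three-variable D-vine built entirely from Gaussian pair-copulas, with peak flow, volume and duration as hypothetical flood variables and invented correlation parameters; the minimum information copula itself is not reproduced here.

        import numpy as np
        from scipy.stats import norm

        def gauss_copula_density(u, v, rho):
            x, y = norm.ppf(u), norm.ppf(v)
            return np.exp((2 * rho * x * y - rho**2 * (x**2 + y**2))
                          / (2 * (1 - rho**2))) / np.sqrt(1 - rho**2)

        def h(u, v, rho):
            """Conditional distribution h(u | v) for the Gaussian pair-copula."""
            return norm.cdf((norm.ppf(u) - rho * norm.ppf(v)) / np.sqrt(1 - rho**2))

        # Hypothetical pair-copula parameters for (peak, volume, duration).
        r12, r23, r13_2 = 0.7, 0.5, 0.3

        def dvine_copula_density(u1, u2, u3):
            c12 = gauss_copula_density(u1, u2, r12)
            c23 = gauss_copula_density(u2, u3, r23)
            c13_2 = gauss_copula_density(h(u1, u2, r12), h(u3, u2, r23), r13_2)
            return c12 * c23 * c13_2

        print(dvine_copula_density(0.9, 0.8, 0.7))   # copula density on uniform margins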

  17. A Probabilistic Computational Framework for Neural Network Models

    DTIC Science & Technology

    1987-09-29

    Inspection of the form of P indicates the class of probabilistic environments that can be learned. Learning algorithms can be analyzed and designed

  18. Utilization of Probabilistic Cues in the Presence of Irrelevant Information: A Comparison of Risky Choice in Children and Adults

    ERIC Educational Resources Information Center

    Betsch, Tilmann; Lang, Anna

    2013-01-01

    We studied risky choices in preschoolers, elementary schoolers, and adults using an information board paradigm crossing two options with two cues that differ in their probability of making valid predictions (p = 0.50 vs. p = 0.83). We also varied the presence of normatively irrelevant information. Choice patterns indicate that preschoolers were…

  19. Characterizing the International Migration Barriers with a Probabilistic Multilateral Migration Model

    NASA Astrophysics Data System (ADS)

    Li, Xiaomeng; Xu, Hongzhong; Chen, Jiawei; Chen, Qinghua; Zhang, Jiang; di, Zengru

    2016-09-01

    Human migration is responsible for forming modern civilization and has had an important influence on the development of various countries. There are many issues worth researching, and “the reason to move” is the most basic one. The concept of migration cost in the classical self-selection theory, which was introduced by Roy and Borjas, is useful. However, migration cost cannot address global migration because of the limitations of deterministic and bilateral choice. Following the idea of migration cost, this paper developed a new probabilistic multilateral migration model by introducing the Boltzmann factor from statistical physics. After characterizing the underlying mechanism or driving force of human mobility, we reveal some interesting facts that have provided a deeper understanding of international migration, such as the negative correlation between migration costs for emigrants and immigrants and a global classification with clear regional and economic characteristics, based on clustering of migration cost vectors. In addition, we deconstruct the migration barriers using regression analysis and find that the influencing factors are complicated but can be partly (12.5%) described by several macro indexes, such as the GDP growth of the destination country, the GNI per capita and the HDI of both the source and destination countries.
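    In the spirit of the Boltzmann-factor idea, destination probabilities can be written as a softmax over (negative) migration costs; the costs, country labels and "temperature" below are invented for illustration, not estimated from the migration data.

        import numpy as np

        countries = ["A", "B", "C", "D"]
        # Hypothetical migration costs from one source country to each destination.
        cost = np.array([1.0, 2.5, 0.8, 4.0])
        T = 1.2                      # 'temperature' controlling how deterministic the choice is

        weights = np.exp(-cost / T)  # Boltzmann factor for each destination
        prob = weights / weights.sum()
        for name, p in zip(countries, prob):
            print(f"P(move to {name}) = {p:.3f}")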

  20. Characterizing the International Migration Barriers with a Probabilistic Multilateral Migration Model

    PubMed Central

    Li, Xiaomeng; Xu, Hongzhong; Chen, Jiawei; Chen, Qinghua; Zhang, Jiang; Di, Zengru

    2016-01-01

    Human migration is responsible for forming modern civilization and has had an important influence on the development of various countries. There are many issues worth researching, and “the reason to move” is the most basic one. The concept of migration cost in the classical self-selection theory, which was introduced by Roy and Borjas, is useful. However, migration cost cannot address global migration because of the limitations of deterministic and bilateral choice. Following the idea of migration cost, this paper developed a new probabilistic multilateral migration model by introducing the Boltzmann factor from statistical physics. After characterizing the underlying mechanism or driving force of human mobility, we reveal some interesting facts that have provided a deeper understanding of international migration, such as the negative correlation between migration costs for emigrants and immigrants and a global classification with clear regional and economic characteristics, based on clustering of migration cost vectors. In addition, we deconstruct the migration barriers using regression analysis and find that the influencing factors are complicated but can be partly (12.5%) described by several macro indexes, such as the GDP growth of the destination country, the GNI per capita and the HDI of both the source and destination countries. PMID:27597319

  1. Reasoning in Reference Games: Individual- vs. Population-Level Probabilistic Modeling

    PubMed Central

    Franke, Michael; Degen, Judith

    2016-01-01

    Recent advances in probabilistic pragmatics have achieved considerable success in modeling speakers’ and listeners’ pragmatic reasoning as probabilistic inference. However, these models are usually applied to population-level data, and so implicitly suggest a homogeneous population without individual differences. Here we investigate potential individual differences in Theory-of-Mind related depth of pragmatic reasoning in so-called reference games that require drawing ad hoc Quantity implicatures of varying complexity. We show by Bayesian model comparison that a model that assumes a heterogeneous population is a better predictor of our data, especially for comprehension. We discuss the implications for the treatment of individual differences in probabilistic models of language use. PMID:27149675

  2. Multidimensional analysis and probabilistic model of volcanic and seismic activities

    NASA Astrophysics Data System (ADS)

    Fedorov, V.

    2009-04-01

    .I. Gushchenko, 1979) and seismological (database of USGS/NEIC Significant Worldwide Earthquakes, 2150 B.C.- 1994 A.D.) information which displays the dynamics of endogenic relief-forming processes over the period 1900 to 1994. In the course of the analysis, a substitution of the calendar variable by a corresponding astronomical one was performed and the epoch superposition method was applied. In essence, the method consists in differentiating the bodies of information on volcanic eruptions (1900 to 1977) and seismic events (1900-1994) with respect to the values of astronomical parameters corresponding to the calendar dates of the known eruptions and earthquakes, regardless of the calendar year. The obtained spectra of volcanic eruption and strong earthquake distribution in the fields of the Earth's orbital movement parameters were used as a basis for the calculation of frequency spectra and the diurnal probability of volcanic and seismic activity. The objective of the proposed investigations is the development of a probabilistic model of volcanic and seismic events, as well as the design of a GIS for monitoring and forecasting volcanic and seismic activity. In accordance with the stated objective, three probability parameters have been found in the course of preliminary studies; they form the basis for GIS-monitoring and forecast development. 1. A multidimensional analysis of volcanic eruptions and earthquakes (of magnitude 7) has been performed in terms of the Earth's orbital movement. Probability characteristics of volcanism and seismicity have been defined for the Earth as a whole. Time intervals have been identified with a diurnal probability twice as great as the mean value. The diurnal probability of volcanic and seismic events has been calculated up to 2020. 2. A regularity in the duration of dormant (repose) periods has been established. A relationship has been found between the distribution of the repose period probability density and the duration of the period. 3

  3. Probabilistic dose-response modeling: case study using dichloromethane PBPK model results.

    PubMed

    Marino, Dale J; Starr, Thomas B

    2007-12-01

    A revised assessment of dichloromethane (DCM) has recently been reported that examines the influence of human genetic polymorphisms on cancer risks using deterministic PBPK and dose-response modeling in the mouse combined with probabilistic PBPK modeling in humans. This assessment utilized Bayesian techniques to optimize kinetic variables in mice and humans with mean values from posterior distributions used in the deterministic modeling in the mouse. To supplement this research, a case study was undertaken to examine the potential impact of probabilistic rather than deterministic PBPK and dose-response modeling in mice on subsequent unit risk factor (URF) determinations. Four separate PBPK cases were examined based on the exposure regimen of the NTP DCM bioassay. These were (a) Same Mouse (single draw of all PBPK inputs for both treatment groups); (b) Correlated BW-Same Inputs (single draw of all PBPK inputs for both treatment groups except for bodyweights (BWs), which were entered as correlated variables); (c) Correlated BW-Different Inputs (separate draws of all PBPK inputs for both treatment groups except that BWs were entered as correlated variables); and (d) Different Mouse (separate draws of all PBPK inputs for both treatment groups). Monte Carlo PBPK inputs reflect posterior distributions from Bayesian calibration in the mouse that had been previously reported. A minimum of 12,500 PBPK iterations were undertaken, in which dose metrics, i.e., mg DCM metabolized by the GST pathway/L tissue/day for lung and liver were determined. For dose-response modeling, these metrics were combined with NTP tumor incidence data that were randomly selected from binomial distributions. Resultant potency factors (0.1/ED(10)) were coupled with probabilistic PBPK modeling in humans that incorporated genetic polymorphisms to derive URFs. Results show that there was relatively little difference, i.e., <10% in central tendency and upper percentile URFs, regardless of the case

  4. Probabilistic Fatigue Damage Prognosis Using a Surrogate Model Trained Via 3D Finite Element Analysis

    NASA Technical Reports Server (NTRS)

    Leser, Patrick E.; Hochhalter, Jacob D.; Newman, John A.; Leser, William P.; Warner, James E.; Wawrzynek, Paul A.; Yuan, Fuh-Gwo

    2015-01-01

    Utilizing inverse uncertainty quantification techniques, structural health monitoring can be integrated with damage progression models to form probabilistic predictions of a structure's remaining useful life. However, damage evolution in realistic structures is physically complex. Accurately representing this behavior requires high-fidelity models which are typically computationally prohibitive. In the present work, a high-fidelity finite element model is represented by a surrogate model, reducing computation times. The new approach is used with damage diagnosis data to form a probabilistic prediction of remaining useful life for a test specimen under mixed-mode conditions.

  5. Parameter estimation of social forces in pedestrian dynamics models via a probabilistic method.

    PubMed

    Corbetta, Alessandro; Muntean, Adrian; Vafayi, Kiamars

    2015-04-01

    Focusing on a specific crowd dynamics situation, including real life experiments and measurements, our paper targets a twofold aim: (1) we present a Bayesian probabilistic method to estimate the value and the uncertainty (in the form of a probability density function) of parameters in crowd dynamic models from the experimental data; and (2) we introduce a fitness measure for the models to classify a couple of model structures (forces) according to their fitness to the experimental data, preparing the stage for a more general model-selection and validation strategy inspired by probabilistic data analysis. Finally, we review the essential aspects of our experimental setup and measurement technique.

  6. Synaptic Computation Underlying Probabilistic Inference

    PubMed Central

    Soltani, Alireza; Wang, Xiao-Jing

    2010-01-01

    In this paper we propose that synapses may be the workhorse of neuronal computations that underlie probabilistic reasoning. We built a neural circuit model for probabilistic inference when information provided by different sensory cues needs to be integrated, and the predictive powers of individual cues about an outcome are deduced through experience. We found that bounded synapses naturally compute, through reward-dependent plasticity, the posterior probability that a choice alternative is correct given that a cue is presented. Furthermore, a decision circuit endowed with such synapses makes choices based on the summated log posterior odds and performs near-optimal cue combination. The model is validated by reproducing salient observations of, and providing insights into, a monkey experiment using a categorization task. Our model thus suggests a biophysical instantiation of the Bayesian decision rule, while predicting important deviations from it similar to ‘base-rate neglect’ observed in human studies when alternatives have unequal priors. PMID:20010823
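    The central claim, that bounded synapses driven by reward-dependent plasticity converge to the posterior probability that a choice is correct given a cue, can be illustrated with a scalar learning rule whose fixed point is exactly the reward probability; the rates and probabilities below are arbitrary, not the circuit model's parameters.

        import numpy as np

        rng = np.random.default_rng(4)
        p_correct_given_cue = 0.75      # true predictive power of the cue
        alpha, trials = 0.05, 5000

        c = 0.5                          # synaptic strength, bounded in [0, 1]
        for _ in range(trials):
            rewarded = rng.random() < p_correct_given_cue
            if rewarded:
                c += alpha * (1.0 - c)   # potentiation, saturating at 1
            else:
                c -= alpha * c           # depression, saturating at 0

        # At equilibrium p*alpha*(1-c) = (1-p)*alpha*c, i.e. c -> p.
        print("synaptic strength:", round(c, 3), "~ posterior", p_correct_given_cue)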

  7. Hierarchical Diffusion Models for Two-Choice Response Times

    ERIC Educational Resources Information Center

    Vandekerckhove, Joachim; Tuerlinckx, Francis; Lee, Michael D.

    2011-01-01

    Two-choice response times are a common type of data, and much research has been devoted to the development of process models for such data. However, the practical application of these models is notoriously complicated, and flexible methods are largely nonexistent. We combine a popular model for choice response times--the Wiener diffusion…

  8. Integrating probabilistic models of perception and interactive neural networks: a historical and tutorial review

    PubMed Central

    McClelland, James L.

    2013-01-01

    This article seeks to establish a rapprochement between explicitly Bayesian models of contextual effects in perception and neural network models of such effects, particularly the connectionist interactive activation (IA) model of perception. The article is in part an historical review and in part a tutorial, reviewing the probabilistic Bayesian approach to understanding perception and how it may be shaped by context, and also reviewing ideas about how such probabilistic computations may be carried out in neural networks, focusing on the role of context in interactive neural networks, in which both bottom-up and top-down signals affect the interpretation of sensory inputs. It is pointed out that connectionist units that use the logistic or softmax activation functions can exactly compute Bayesian posterior probabilities when the bias terms and connection weights affecting such units are set to the logarithms of appropriate probabilistic quantities. Bayesian concepts such as the prior, likelihood, (joint and marginal) posterior, probability matching and maximizing, and calculating vs. sampling from the posterior are all reviewed and linked to neural network computations. Probabilistic and neural network models are explicitly linked to the concept of a probabilistic generative model that describes the relationship between the underlying target of perception (e.g., the word intended by a speaker or other source of sensory stimuli) and the sensory input that reaches the perceiver for use in inferring the underlying target. It is shown how a new version of the IA model called the multinomial interactive activation (MIA) model can sample correctly from the joint posterior of a proposed generative model for perception of letters in words, indicating that interactive processing is fully consistent with principled probabilistic computation. Ways in which these computations might be realized in real neural systems are also considered. PMID:23970868
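    The equivalence noted above (softmax units compute exact posteriors when the biases are log priors and the net input adds log likelihoods) is easy to verify numerically; the snippet below uses an invented three-word, two-feature toy generative model, not the MIA model itself.

        import numpy as np

        # Toy generative model: 3 candidate words, 2 binary feature detectors.
        prior = np.array([0.5, 0.3, 0.2])
        likelihood = np.array([[0.9, 0.2],     # P(feature_j = 1 | word_i)
                               [0.4, 0.7],
                               [0.1, 0.1]])
        features = np.array([1, 0])            # observed input

        # Exact Bayesian posterior over words.
        like = np.prod(np.where(features, likelihood, 1 - likelihood), axis=1)
        posterior = prior * like / np.sum(prior * like)

        # Softmax unit: bias = log prior, net input adds log likelihood of the evidence.
        logit = np.log(prior) + np.sum(np.where(features,
                                                np.log(likelihood),
                                                np.log(1 - likelihood)), axis=1)
        softmax = np.exp(logit) / np.exp(logit).sum()

        print(np.allclose(posterior, softmax))   # True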

  9. Opponent actor learning (OpAL): modeling interactive effects of striatal dopamine on reinforcement learning and choice incentive.

    PubMed

    Collins, Anne G E; Frank, Michael J

    2014-07-01

    The striatal dopaminergic system has been implicated in reinforcement learning (RL), motor performance, and incentive motivation. Various computational models have been proposed to account for each of these effects individually, but a formal analysis of their interactions is lacking. Here we present a novel algorithmic model expanding the classical actor-critic architecture to include fundamental interactive properties of neural circuit models, incorporating both incentive and learning effects into a single theoretical framework. The standard actor is replaced by a dual opponent actor system representing distinct striatal populations, which come to differentially specialize in discriminating positive and negative action values. Dopamine modulates the degree to which each actor component contributes to both learning and choice discriminations. In contrast to standard frameworks, this model simultaneously captures documented effects of dopamine on both learning and choice incentive-and their interactions-across a variety of studies, including probabilistic RL, effort-based choice, and motor skill learning.
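    A schematic of the dual-actor idea as described in the abstract: a critic learns expected value, separate "Go" and "NoGo" actors accumulate positive and negative prediction errors, and dopamine-like gains weight their contributions at choice time. The update rules, bandit task and parameters below are a hedged sketch under those assumptions, not the exact published OpAL equations.

        import numpy as np

        rng = np.random.default_rng(5)
        n_actions, alpha_c, alpha_a, beta_g, beta_n = 2, 0.1, 0.1, 1.5, 0.5
        reward_prob = np.array([0.8, 0.3])      # hypothetical two-armed bandit

        V = 0.0
        G = np.ones(n_actions)                   # 'Go' actor weights
        N = np.ones(n_actions)                   # 'NoGo' actor weights
        for _ in range(2000):
            act = beta_g * G - beta_n * N        # dopamine-weighted choice incentive
            p = np.exp(act - act.max())
            p /= p.sum()
            a = rng.choice(n_actions, p=p)
            r = float(rng.random() < reward_prob[a])
            delta = r - V                        # critic prediction error
            V += alpha_c * delta
            G[a] += alpha_a * G[a] * delta       # Go actor grows with positive errors
            N[a] += alpha_a * N[a] * (-delta)    # NoGo actor grows with negative errors

        print("G:", G.round(2), "N:", N.round(2))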

  10. Probabilistic seismic hazard study based on active fault and finite element geodynamic models

    NASA Astrophysics Data System (ADS)

    Kastelic, Vanja; Carafa, Michele M. C.; Visini, Francesco

    2016-04-01

    We present a probabilistic seismic hazard analysis (PSHA) that is exclusively based on active faults and geodynamic finite element input models, whereas seismic catalogues were used only in a posterior comparison. We applied the developed model in the External Dinarides, a slow-deforming thrust-and-fold belt at the contact between Adria and Eurasia. Our method consists of establishing two earthquake rupture forecast models: (i) a geological active fault input (GEO) model and (ii) a finite element (FEM) model. The GEO model is based on an active fault database that provides information on fault location and its geometric and kinematic parameters together with estimates of its slip rate. By default in this model all deformation is set to be released along the active faults. The FEM model is based on a numerical geodynamic model developed for the region of study. In this model the deformation is released not only along the active faults but also in the volumetric continuum elements. From both models we calculated the corresponding activity rates, earthquake rates and final expected peak ground accelerations. We investigated both the source model and the earthquake model uncertainties by varying the main active fault and earthquake rate calculation parameters through constructing corresponding branches of the seismic hazard logic tree. Hazard maps and UHS curves have been produced for horizontal ground motion on bedrock conditions (VS30 ≥ 800 m/s), thereby not considering local site amplification effects. The hazard was computed over a 0.2° spaced grid considering 648 branches of the logic tree and the mean value at the 10% probability of exceedance in 50 years hazard level, while the 5th and 95th percentiles were also computed to investigate the model limits. We conducted a sensitivity analysis to determine which of the input parameters influence the final hazard results and to what extent. The results of this comparison show that the deformation model and
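    To indicate how a fault-based rupture forecast can be turned into rates, the sketch below balances the geologic moment rate of a single fault against a characteristic earthquake size and converts the recurrence rate into a Poisson probability of occurrence in 50 years. The fault dimensions, slip rate, shear modulus and characteristic magnitude are illustrative assumptions, not values from the External Dinarides model.

        import numpy as np

        # Hypothetical fault: length x width in km, slip rate in mm/yr.
        length_km, width_km, slip_mm_yr = 30.0, 12.0, 1.0
        mu = 3.0e10                                    # shear modulus, Pa
        char_mw = 6.5                                  # assumed characteristic magnitude

        area_m2 = (length_km * 1e3) * (width_km * 1e3)
        moment_rate = mu * area_m2 * (slip_mm_yr * 1e-3)      # N*m per year

        # Hanks-Kanamori moment-magnitude relation (moment in N*m).
        char_moment = 10 ** (1.5 * char_mw + 9.1)
        annual_rate = moment_rate / char_moment               # characteristic events / yr

        prob_50yr = 1.0 - np.exp(-annual_rate * 50.0)
        print(f"recurrence ~ {1 / annual_rate:.0f} yr, P(event in 50 yr) = {prob_50yr:.3f}")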

  11. Children's Conceptions of Career Choice and Attainment: Model Development

    ERIC Educational Resources Information Center

    Howard, Kimberly A. S.; Walsh, Mary E.

    2011-01-01

    This article describes a model of children's conceptions of two key career development processes: career choice and career attainment. The model of children's understanding of career choice and attainment was constructed with developmental research and theory into children's understanding of allied phenomena such as their understanding of illness,…

  12. Estimation of an Occupational Choice Model when Occupations Are Misclassified

    ERIC Educational Resources Information Center

    Sullivan, Paul

    2009-01-01

    This paper develops an empirical occupational choice model that corrects for misclassification in occupational choices and measurement error in occupation-specific work experience. The model is used to estimate the extent of measurement error in occupation data and quantify the bias that results from ignoring measurement error in occupation codes…

  13. Real-time probabilistic covariance tracking with efficient model update.

    PubMed

    Wu, Yi; Cheng, Jian; Wang, Jinqiao; Lu, Hanqing; Wang, Jun; Ling, Haibin; Blasch, Erik; Bai, Li

    2012-05-01

    The recently proposed covariance region descriptor has been proven robust and versatile for a modest computational cost. The covariance matrix enables efficient fusion of different types of features, where the spatial and statistical properties, as well as their correlation, are characterized. The similarity between two covariance descriptors is measured on Riemannian manifolds. Based on the same metric but with a probabilistic framework, we propose a novel tracking approach on Riemannian manifolds with a novel incremental covariance tensor learning (ICTL). To address the appearance variations, ICTL incrementally learns a low-dimensional covariance tensor representation and efficiently adapts online to appearance changes of the target with only O(1) computational complexity, resulting in a real-time performance. The covariance-based representation and the ICTL are then combined with the particle filter framework to allow better handling of background clutter, as well as the temporary occlusions. We test the proposed probabilistic ICTL tracker on numerous benchmark sequences involving different types of challenges including occlusions and variations in illumination, scale, and pose. The proposed approach demonstrates excellent real-time performance, both qualitatively and quantitatively, in comparison with several previously proposed trackers.
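    For the covariance region descriptor and its Riemannian metric mentioned above, the distance between two region covariances can be computed from the generalized eigenvalues of the pair; the per-pixel feature choice below (coordinates, intensity and gradient magnitudes) is one common but hypothetical configuration, not necessarily the one used by the ICTL tracker.

        import numpy as np
        from scipy.linalg import eigh

        rng = np.random.default_rng(8)

        def region_covariance(patch):
            """Covariance of per-pixel features [x, y, I, |dI/dy|, |dI/dx|]."""
            h, w = patch.shape
            ys, xs = np.mgrid[0:h, 0:w]
            gy, gx = np.gradient(patch.astype(float))
            feats = np.stack([xs, ys, patch, np.abs(gy), np.abs(gx)], axis=-1).reshape(-1, 5)
            return np.cov(feats, rowvar=False)

        def riemannian_distance(c1, c2):
            """Affine-invariant distance: sqrt(sum of squared log generalized eigenvalues)."""
            lam = eigh(c1, c2, eigvals_only=True)
            return np.sqrt(np.sum(np.log(lam) ** 2))

        target = rng.random((24, 24))
        candidate = target + rng.normal(0, 0.05, target.shape)
        print(riemannian_distance(region_covariance(target), region_covariance(candidate)))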

  14. Probabilistic Fatigue Life Prediction of Turbine Disc Considering Model Parameter Uncertainty

    NASA Astrophysics Data System (ADS)

    He, Liping; Yu, Le; Zhu, Shun-Peng; Ding, Liangliang; Huang, Hong-Zhong

    2016-06-01

    Aiming to improve the predictive ability of the Walker model for fatigue life prediction, and taking the turbine disc alloy GH4133 as the application example, this paper investigates a new approach for probabilistic fatigue life prediction that accounts for the parameter uncertainty inherent in the life prediction model. First, experimental data are used to update the model parameters using Bayes' theorem, so as to obtain the posterior probability distribution functions of the two parameters of the Walker model, as well as to achieve a probabilistic life prediction model for the turbine disc. During the updating process, the Markov Chain Monte Carlo (MCMC) technique is used to generate samples from the given distribution and to estimate the parameters. After that, the turbine disc life is predicted using the probabilistic Walker model based on the Monte Carlo simulation technique. The experimental results indicate that: (1) after using the small-sample test data obtained from the turbine disc, the parameter uncertainty of the Walker model can be quantified and the corresponding probabilistic model for fatigue life prediction can be established using Bayes' theorem; (2) there is obvious dispersion in the life data for the turbine disc when predicting fatigue life in practical engineering applications.
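    A compact illustration of the Bayes-plus-MCMC step: a random-walk Metropolis sampler draws posterior samples for the two parameters of a simple power-law life model fitted to a few synthetic strain-life points, standing in for the Walker model and the GH4133 test data, which are not reproduced here.

        import numpy as np

        rng = np.random.default_rng(6)

        # Synthetic stand-in data: strain amplitude vs. log10(cycles to failure).
        strain = np.array([0.004, 0.005, 0.006, 0.008])
        log_life = np.array([4.9, 4.5, 4.2, 3.7])
        x = np.log10(strain) - np.log10(strain).mean()   # centred predictor for better mixing

        def log_post(theta, sigma=0.15):
            a, b = theta                                  # model: log10(N) = a + b * x
            loglik = -0.5 * np.sum(((log_life - (a + b * x)) / sigma) ** 2)
            logprior = -(a ** 2 + b ** 2) / (2 * 10.0 ** 2)   # weak Gaussian priors
            return loglik + logprior

        theta, samples = np.array([4.0, -3.0]), []
        for _ in range(20000):
            prop = theta + rng.normal(0.0, 0.1, size=2)   # random-walk Metropolis proposal
            if np.log(rng.random()) < log_post(prop) - log_post(theta):
                theta = prop
            samples.append(theta.copy())
        samples = np.array(samples[5000:])                # discard burn-in

        print("posterior mean (a, b):", samples.mean(axis=0).round(2))
        print("posterior std  (a, b):", samples.std(axis=0).round(2))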

  15. Conditional Reasoning in Context: A Dual-Source Model of Probabilistic Inference

    ERIC Educational Resources Information Center

    Klauer, Karl Christoph; Beller, Sieghard; Hutter, Mandy

    2010-01-01

    A dual-source model of probabilistic conditional inference is proposed. According to the model, inferences are based on 2 sources of evidence: logical form and prior knowledge. Logical form is a decontextualized source of evidence, whereas prior knowledge is activated by the contents of the conditional rule. In Experiments 1 to 3, manipulations of…

  16. Hyperbolic value addition and general models of animal choice.

    PubMed

    Mazur, J E

    2001-01-01

    Three mathematical models of choice--the contextual-choice model (R. Grace, 1994), delay-reduction theory (N. Squires & E. Fantino, 1971), and a new model called the hyperbolic value-added model--were compared in their ability to predict the results from a wide variety of experiments with animal subjects. When supplied with 2 or 3 free parameters, all 3 models made fairly accurate predictions for a large set of experiments that used concurrent-chain procedures. One advantage of the hyperbolic value-added model is that it is derived from a simpler model that makes accurate predictions for many experiments using discrete-trial adjusting-delay procedures. Some results favor the hyperbolic value-added model and delay-reduction theory over the contextual-choice model, but more data are needed from choice situations for which the models make distinctly different predictions.
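    The "simpler model" underlying the hyperbolic value-added account is Mazur's hyperbolic delay-discounting equation, V = A / (1 + K*D); the snippet below uses it to compare two delayed reinforcers. The amounts, delays and K are arbitrary illustrative values.

        def hyperbolic_value(amount, delay, k=0.2):
            """Mazur's hyperbolic discounting: V = A / (1 + K * D)."""
            return amount / (1.0 + k * delay)

        small_soon = hyperbolic_value(amount=2.0, delay=1.0)    # e.g. 2 s of food after 1 s
        large_late = hyperbolic_value(amount=6.0, delay=10.0)   # 6 s of food after 10 s

        print(f"small/soon value = {small_soon:.2f}, large/late value = {large_late:.2f}")
        print("predicted preference:",
              "small/soon" if small_soon > large_late else "large/late")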

  17. Probabilistic inference in general graphical models through sampling in stochastic networks of spiking neurons.

    PubMed

    Pecevski, Dejan; Buesing, Lars; Maass, Wolfgang

    2011-12-01

    An important open problem of computational neuroscience is the generic organization of computations in networks of neurons in the brain. We show here through rigorous theoretical analysis that inherent stochastic features of spiking neurons, in combination with simple nonlinear computational operations in specific network motifs and dendritic arbors, enable networks of spiking neurons to carry out probabilistic inference through sampling in general graphical models. In particular, it enables them to carry out probabilistic inference in Bayesian networks with converging arrows ("explaining away") and with undirected loops, that occur in many real-world tasks. Ubiquitous stochastic features of networks of spiking neurons, such as trial-to-trial variability and spontaneous activity, are necessary ingredients of the underlying computational organization. We demonstrate through computer simulations that this approach can be scaled up to neural emulations of probabilistic inference in fairly large graphical models, yielding some of the most complex computations that have been carried out so far in networks of spiking neurons.

  18. Probabilistic multiobject deformable model for MR/SPECT brain image registration and segmentation

    NASA Astrophysics Data System (ADS)

    Nikou, Christophoros; Heitz, Fabrice; Armspach, Jean-Paul

    1999-05-01

    A probabilistic deformable model for the representation of brain structures is described. The statistically learned deformable model represents the relative location of head (skull and scalp) and brain surfaces in MR/SPECT image pairs and accommodates the significant variability of these anatomical structures across different individuals. To provide a training set, a representative collection of 3D MRI volumes of different patients has first been registered to a reference image. The head and brain surfaces of each volume are parameterized by the amplitudes of the vibration modes of a deformable spherical mesh. For a given MR image in the training set, a vector containing the largest vibration modes describing the head and the brain is created. This random vector is statistically constrained by retaining the most significant variation modes of its Karhunen-Loeve expansion on the training population. By these means, both head and brain surfaces are deformed according to the anatomical variability observed in the training set. Two applications of the probabilistic deformable model are presented: the deformable model-based registration of 3D multimodal (MR/SPECT) brain images and the segmentation of the brain from MRI using the probabilistic constraints embedded in the deformable model. The multi-object deformable model may be considered a first step towards the development of a general-purpose probabilistic anatomical atlas of the brain.

  19. A range of complex probabilistic models for RNA secondary structure prediction that includes the nearest-neighbor model and more.

    PubMed

    Rivas, Elena; Lang, Raymond; Eddy, Sean R

    2012-02-01

    The standard approach for single-sequence RNA secondary structure prediction uses a nearest-neighbor thermodynamic model with several thousand experimentally determined energy parameters. An attractive alternative is to use statistical approaches with parameters estimated from growing databases of structural RNAs. Good results have been reported for discriminative statistical methods using complex nearest-neighbor models, including CONTRAfold, Simfold, and ContextFold. Little work has been reported on generative probabilistic models (stochastic context-free grammars [SCFGs]) of comparable complexity, although probabilistic models are generally easier to train and to use. To explore a range of probabilistic models of increasing complexity, and to directly compare probabilistic, thermodynamic, and discriminative approaches, we created TORNADO, a computational tool that can parse a wide spectrum of RNA grammar architectures (including the standard nearest-neighbor model and more) using a generalized super-grammar that can be parameterized with probabilities, energies, or arbitrary scores. By using TORNADO, we find that probabilistic nearest-neighbor models perform comparably to (but not significantly better than) discriminative methods. We find that complex statistical models are prone to overfitting RNA structure and that evaluations should use structurally nonhomologous training and test data sets. Overfitting has affected at least one published method (ContextFold). The most important barrier to improving statistical approaches for RNA secondary structure prediction is the lack of diversity of well-curated single-sequence RNA secondary structures in current RNA databases.

  20. PEER REVIEW FOR THE CONSUMER VEHICLE CHOICE MODEL

    EPA Science Inventory

    The U.S. Environmental Protection Agency’s (EPA) Office of Transportation and Air Quality (OTAQ) has recently sponsored the development of a Consumer Vehicle Choice Model (CVCM) by the Oak Ridge National Laboratory (ORNL). The specification by OTAQ to ORNL for consumer choice mod...

  1. Probabilistic estimation of residential air exchange rates for population-based human exposure modeling

    EPA Science Inventory

    Residential air exchange rates (AERs) are a key determinant in the infiltration of ambient air pollution indoors. Population-based human exposure models using probabilistic approaches to estimate personal exposure to air pollutants have relied on input distributions from AER meas...

  2. From cyclone tracks to the costs of European winter storms: A probabilistic loss assessment model

    NASA Astrophysics Data System (ADS)

    Renggli, Dominik; Corti, Thierry; Reese, Stefan; Wueest, Marc; Viktor, Elisabeth; Zimmerli, Peter

    2014-05-01

    The quantitative assessment of the potential losses of European winter storms is essential for the economic viability of a global reinsurance company. For this purpose, reinsurance companies generally use probabilistic loss assessment models. This work presents an innovative approach to develop physically meaningful probabilistic events for Swiss Re's new European winter storm loss model. The meteorological hazard component of the new model is based on cyclone and windstorm tracks identified in the 20th Century Reanalysis data. The knowledge of the evolution of winter storms both in time and space allows the physically meaningful perturbation of properties of historical events (e.g. track, intensity). The perturbation includes a random element but also takes the local climatology and the evolution of the historical event into account. The low-resolution wind footprints taken from 20th Century Reanalysis are processed by a statistical-dynamical downscaling to generate high-resolution footprints of the historical and probabilistic winter storm events. Downscaling transfer functions are generated using ENSEMBLES regional climate model data. The result is a set of reliable probabilistic events representing thousands of years. The event set is then combined with country- and risk-specific vulnerability functions and detailed market- or client-specific exposure information to compute (re-)insurance risk premiums.

  3. Psychological Plausibility of the Theory of Probabilistic Mental Models and the Fast and Frugal Heuristics

    ERIC Educational Resources Information Center

    Dougherty, Michael R.; Franco-Watkins, Ana M.; Thomas, Rick

    2008-01-01

    The theory of probabilistic mental models (PMM; G. Gigerenzer, U. Hoffrage, & H. Kleinbolting, 1991) has had a major influence on the field of judgment and decision making, with the most recent important modifications to PMM theory being the identification of several fast and frugal heuristics (G. Gigerenzer & D. G. Goldstein, 1996). These…

  4. A model for probabilistic health impact assessment of exposure to food chemicals.

    PubMed

    van der Voet, Hilko; van der Heijden, Gerie W A M; Bos, Peter M J; Bosgra, Sieto; Boon, Polly E; Muri, Stefan D; Brüschweiler, Beat J

    2009-12-01

    A statistical model is presented extending the integrated probabilistic risk assessment (IPRA) model of van der Voet and Slob [van der Voet, H., Slob, W., 2007. Integration of probabilistic exposure assessment and probabilistic hazard characterisation. Risk Analysis, 27, 351-371]. The aim is to characterise the health impact of one or more chemicals present in food causing one or more health effects. For chemicals with hardly any measurable safety problems we propose health impact characterisation by margins of exposure. In this probabilistic model not one margin of exposure is calculated, but rather a distribution of individual margins of exposure (IMoE), which allows the health impact to be quantified for small parts of the population. A simple bar chart is proposed to represent the IMoE distribution, and a lower bound (IMoEL) quantifies uncertainties in this distribution. It is described how IMoE distributions can be combined for dose-additive compounds and for different health effects. Health impact assessment critically depends on a subjective valuation of the health impact of a given health effect, and possibilities to implement this health impact valuation step are discussed. Examples show the possibilities of health impact characterisation and of integrating IMoE distributions. The paper also includes new proposals for modelling variable and uncertain factors describing food processing effects and intraspecies variation in sensitivity.
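
    A minimal sketch of the kind of calculation the record above describes: Monte Carlo sampling of individual exposures and individual critical doses gives a distribution of individual margins of exposure (IMoE) rather than a single margin. The lognormal marginals, their parameters, and the reporting thresholds below are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Illustrative Monte Carlo sketch of a distribution of individual margins of
# exposure (IMoE): each simulated individual gets an exposure and an individual
# critical dose; the IMoE is their ratio. All distributions below are assumptions.
rng = np.random.default_rng(1)
n = 100_000

exposure = rng.lognormal(mean=np.log(0.5), sigma=0.6, size=n)        # mg/kg bw/day
critical_dose = rng.lognormal(mean=np.log(50.0), sigma=0.4, size=n)  # individual sensitivity

imoe = critical_dose / exposure

# Fraction of the population with a small margin, plus a lower percentile that
# plays the role of an IMoE lower bound (IMoEL) in the abstract's terminology.
print("P(IMoE < 100):", np.mean(imoe < 100))
print("1st percentile of IMoE:", np.percentile(imoe, 1))
```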

  5. Speeded Classification in a Probabilistic Category Structure: Contrasting Exemplar-Retrieval, Decision-Boundary, and Prototype Models

    ERIC Educational Resources Information Center

    Nosofsky, Robert M.; Stanton, Roger D.

    2005-01-01

    Speeded perceptual classification experiments were conducted to distinguish among the predictions of exemplar-retrieval, decision-boundary, and prototype models. The key manipulation was that across conditions, individual stimuli received either probabilistic or deterministic category feedback. Regardless of the probabilistic feedback, however, an…

  6. NEUROBIOLOGY OF ECONOMIC CHOICE: A GOOD-BASED MODEL

    PubMed Central

    Padoa-Schioppa, Camillo

    2012-01-01

    Traditionally the object of economic theory and experimental psychology, economic choice recently became a lively research focus in systems neuroscience. Here I summarize the emerging results and I propose a unifying model of how economic choice might function at the neural level. Economic choice entails comparing options that vary on multiple dimensions. Hence, while choosing, individuals integrate different determinants into a subjective value; decisions are then made by comparing values. According to the good-based model, the values of different goods are computed independently of one another, which implies transitivity. Values are not learned as such, but rather computed at the time of choice. Most importantly, values are compared within the space of goods, independent of the sensori-motor contingencies of choice. Evidence from neurophysiology, imaging and lesion studies indicates that abstract representations of value exist in the orbitofrontal and ventromedial prefrontal cortices. The computation and comparison of values may thus take place within these regions. PMID:21456961

  7. Sexual selection under parental choice: a revision to the model.

    PubMed

    Apostolou, Menelaos

    2014-06-01

    Across human cultures, parents exercise considerable influence over their children's mate choices. The model of parental choice provides a good account of these patterns, but its prediction that male parents exercise more control than female ones is not well founded in evolutionary theory. To address this shortcoming, the present article proposes a revision to the model. In particular, paternity uncertainty, residual reproductive value, reproductive variance, asymmetry in the control of resources, physical strength, and access to weaponry make control over mating more profitable for male parents than for female ones; in turn, this produces an asymmetrical incentive for controlling mate choice. Several implications of this formulation are also explored.

  8. Hierarchical minimax entropy modeling and probabilistic principal component visualization for data exploration

    NASA Astrophysics Data System (ADS)

    Wang, Yue J.; Luo, Lan; Li, Haifeng; Freedman, Matthew T.

    1999-05-01

    As a step toward understanding the complex information from data and relationships, structural and discriminative knowledge reveals insight that may prove useful in data interpretation and exploration. This paper reports the development of an automated and intelligent procedure for generating a hierarchy of minimax entropy models and principal component visualization spaces for improved data explanation. The proposed hierarchical minimax entropy modeling and probabilistic principal component projection are both statistically principled and visually effective at revealing all of the interesting aspects of the data set. The methods involve multiple use of standard finite normal mixture models and probabilistic principal component projections. The strategy is that the top-level model and projection should explain the entire data set, best revealing the presence of clusters and relationships, while lower-level models and projections should display internal structure within individual clusters, such as the presence of subclusters and attribute trends, which might not be apparent in the higher-level models and projections. With many complementary mixture models and visualization projections, each level remains relatively simple while the complete hierarchy maintains overall flexibility and still conveys considerable structural information. In particular, a model identification procedure is developed to select the optimal number and kernel shapes of local clusters from a class of data, resulting in a standard finite normal mixture with minimum conditional bias and variance, and a probabilistic principal component neural network is advanced to generate optimal projections, leading to a hierarchical visualization algorithm that allows the complete data set to be analyzed at the top level, with the best-separated subclusters of data points analyzed at deeper levels. Hierarchical probabilistic principal component visualization involves (1) evaluation of posterior probabilities for…
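
    A rough two-level illustration of the hierarchical idea in the record above: a top-level mixture model and projection for the whole data set, then a mixture and a local projection per cluster. The sketch uses ordinary PCA as a stand-in for the probabilistic principal component projection, and the toy data and model sizes are assumptions.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.mixture import GaussianMixture

# Two-level sketch in the spirit of the abstract: a top-level Gaussian mixture
# and projection for all the data, then per-cluster mixtures and projections.
# Model sizes and the toy data are assumptions, not the authors' settings.
rng = np.random.default_rng(2)
X = np.vstack([rng.normal(loc=c, scale=0.5, size=(200, 5)) for c in (0.0, 3.0, 6.0)])

top_gmm = GaussianMixture(n_components=3, random_state=0).fit(X)
top_view = PCA(n_components=2).fit_transform(X)        # top-level visualization space
labels = top_gmm.predict(X)
print("top-level 2-D view shape:", top_view.shape)

for k in range(top_gmm.n_components):
    Xk = X[labels == k]
    sub_gmm = GaussianMixture(n_components=2, random_state=0).fit(Xk)
    sub_view = PCA(n_components=2).fit_transform(Xk)   # local view of one cluster
    print(f"cluster {k}: {len(Xk)} points, local view {sub_view.shape}, "
          f"sub-cluster weights {sub_gmm.weights_.round(2)}")
```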

  9. A Hierarchical Model for Accuracy and Choice on Standardized Tests.

    PubMed

    Culpepper, Steven Andrew; Balamuta, James Joseph

    2015-11-25

    This paper assesses the psychometric value of allowing test-takers choice in standardized testing. New theoretical results examine the conditions where allowing choice improves score precision. A hierarchical framework is presented for jointly modeling the accuracy of cognitive responses and item choices. The statistical methodology is disseminated in the 'cIRT' R package. An 'answer two, choose one' (A2C1) test administration design is introduced to avoid challenges associated with nonignorable missing data. Experimental results suggest that the A2C1 design and payout structure encouraged subjects to choose items consistent with their cognitive trait levels. Substantively, the experimental data suggest that item choices yielded comparable information and discrimination ability as cognitive items. Given there are no clear guidelines for writing more or less discriminating items, one practical implication is that choice can serve as a mechanism to improve score precision.

  10. An analytical probabilistic model of the quality efficiency of a sewer tank

    NASA Astrophysics Data System (ADS)

    Balistrocchi, Matteo; Grossi, Giovanna; Bacchi, Baldassare

    2009-12-01

    The assessment of the efficiency of a storm water storage facility devoted to the sewer overflow control in urban areas strictly depends on the ability to model the main features of the rainfall-runoff routing process and the related wet weather pollution delivery. In this paper the possibility of applying the analytical probabilistic approach for developing a tank design method, whose potentials are similar to the continuous simulations, is proved. In the model derivation the quality issues of such devices were implemented. The formulation is based on a Weibull probabilistic model of the main characteristics of the rainfall process and on a power law describing the relationship between the dimensionless storm water cumulative runoff volume and the dimensionless cumulative pollutograph. Following this approach, efficiency indexes were established. The proposed model was verified by comparing its results to those obtained by continuous simulations; satisfactory agreement is shown for the proposed efficiency indexes.

  11. A note on probabilistic models over strings: the linear algebra approach.

    PubMed

    Bouchard-Côté, Alexandre

    2013-12-01

    Probabilistic models over strings have played a key role in developing methods that take into consideration indels as phylogenetically informative events. There is an extensive literature on using automata and transducers on phylogenies to do inference on these probabilistic models, in which an important theoretical question is the complexity of computing the normalization of a class of string-valued graphical models. This question has been investigated using tools from combinatorics, dynamic programming, and graph theory, and has practical applications in Bayesian phylogenetics. In this work, we revisit this theoretical question from a different point of view, based on linear algebra. The main contribution is a set of results based on this linear algebra view that facilitate the analysis and design of inference algorithms on string-valued graphical models. As an illustration, we use this method to give a new elementary proof of a known result on the complexity of inference on the "TKF91" model, a well-known probabilistic model over strings. Compared to previous work, our proving method is easier to extend to other models, since it relies on a novel weak condition, triangular transducers, which is easy to establish in practice. The linear algebra view provides a concise way of describing transducer algorithms and their compositions, opens the possibility of transferring fast linear algebra libraries (for example, based on GPUs), as well as low rank matrix approximation methods, to string-valued inference problems.

  12. Probabilistic models for assessment of extreme temperatures and relative humidity in Lithuania

    NASA Astrophysics Data System (ADS)

    Alzbutas, Robertas; Šeputytė, Ilona

    2015-04-01

    Extreme temperatures are a fairly common natural phenomenon in Lithuania. They have mainly negative effects on both the environment and humans. It is therefore important to perform probabilistic and statistical analyses of possibly extreme temperature values and their time-dependent changes. This is especially important in areas where technical objects sensitive to extreme temperatures are planned to be constructed. In order to estimate the frequencies and consequences of possible extreme temperatures, a probabilistic analysis of event occurrence and its uncertainty has been performed: statistical data have been collected and analyzed. The probabilistic analysis of extreme temperatures in Lithuanian territory is based on historical data taken from the Lithuanian Hydrometeorology Service, Dūkštas Meteorological Station, Lithuanian Energy Institute and the Ignalina NPP Environmental Protection Department of Environmental Monitoring Service. The main objective of the work was the probabilistic assessment of the occurrence and impact of extreme temperature and relative humidity in Lithuania as a whole and specifically in the Dūkštas region, where the Ignalina Nuclear Power Plant is shut down for decommissioning. A further purpose of this work was to analyze changes in extreme temperatures. The probabilistic analysis of extreme temperature increases in Lithuanian territory was based on more than 50 years of historical data. The probabilistic assessment focused on the application and comparison of the Gumbel, Weibull and Generalized Extreme Value (GEV) distributions, enabling selection of the distribution that best fits the extreme temperature data. In order to assess the likelihood of extreme temperatures, different probabilistic models were applied to evaluate the probability of exceedance of different extreme temperatures. According to the statistics and the relationship between return period and probabilities of temperatures, the return period for 30…
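
    The comparison of Gumbel, Weibull and GEV fits described above can be sketched as follows. The synthetic annual-maximum series, the goodness-of-fit measure (Kolmogorov-Smirnov) and the 100-year return level shown are illustrative choices, not the study's data or results.

```python
import numpy as np
from scipy import stats

# Illustrative sketch: fit candidate extreme-value distributions to annual
# maximum temperatures and compare the fits; the data below are synthetic.
rng = np.random.default_rng(3)
annual_max = 30 + 3 * rng.gumbel(size=55)          # ~55 years of annual maxima, deg C

candidates = {
    "Gumbel":  stats.gumbel_r,
    "Weibull": stats.weibull_min,
    "GEV":     stats.genextreme,
}
for name, dist in candidates.items():
    params = dist.fit(annual_max)
    ks = stats.kstest(annual_max, dist.cdf, args=params)
    # Temperature exceeded on average once in 100 years under this fit.
    t100 = dist.ppf(1 - 1.0 / 100.0, *params)
    print(f"{name:7s} KS p-value={ks.pvalue:.2f}  100-year level={t100:.1f} C")
```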

  13. Probabilistic material degradation model for aerospace materials subjected to high temperature, mechanical and thermal fatigue, and creep

    NASA Technical Reports Server (NTRS)

    Boyce, L.

    1992-01-01

    A probabilistic general material strength degradation model has been developed for structural components of aerospace propulsion systems subjected to diverse random effects. The model has been implemented in two FORTRAN programs, PROMISS (Probabilistic Material Strength Simulator) and PROMISC (Probabilistic Material Strength Calibrator). PROMISS calculates the random lifetime strength of an aerospace propulsion component due to as many as eighteen diverse random effects. Results are presented in the form of probability density functions and cumulative distribution functions of lifetime strength. PROMISC calibrates the model by calculating the values of empirical material constants.

  14. Probabilistic Inference: Task Dependency and Individual Differences of Probability Weighting Revealed by Hierarchical Bayesian Modeling

    PubMed Central

    Boos, Moritz; Seer, Caroline; Lange, Florian; Kopp, Bruno

    2016-01-01

    Cognitive determinants of probabilistic inference were examined using hierarchical Bayesian modeling techniques. A classic urn-ball paradigm served as experimental strategy, involving a factorial two (prior probabilities) by two (likelihoods) design. Five computational models of cognitive processes were compared with the observed behavior. Parameter-free Bayesian posterior probabilities and parameter-free base rate neglect provided inadequate models of probabilistic inference. The introduction of distorted subjective probabilities yielded more robust and generalizable results. A general class of (inverted) S-shaped probability weighting functions had been proposed; however, the possibility of large differences in probability distortions not only across experimental conditions, but also across individuals, seems critical for the model's success. It also seems advantageous to consider individual differences in parameters of probability weighting as being sampled from weakly informative prior distributions of individual parameter values. Thus, the results from hierarchical Bayesian modeling converge with previous results in revealing that probability weighting parameters show considerable task dependency and individual differences. Methodologically, this work exemplifies the usefulness of hierarchical Bayesian modeling techniques for cognitive psychology. Theoretically, human probabilistic inference might be best described as the application of individualized strategic policies for Bayesian belief revision. PMID:27303323
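
    For orientation, the sketch below evaluates one common (inverted) S-shaped probability weighting function of the type discussed above. The functional form (a one-parameter Tversky-Kahneman curve) and the gamma values are illustrative assumptions, not the paper's fitted model.

```python
# Sketch of an (inverted) S-shaped probability weighting function of the kind
# discussed in the record above (one-parameter Tversky-Kahneman form). The
# gamma values and the probabilities evaluated are illustrative assumptions.
def weight(p, gamma):
    return p**gamma / (p**gamma + (1 - p)**gamma) ** (1.0 / gamma)

for gamma in (0.6, 0.8, 1.0):          # 1.0 means no distortion
    row = [round(weight(p, gamma), 3) for p in (0.1, 0.3, 0.5, 0.7, 0.9)]
    print(f"gamma={gamma}: w(0.1..0.9) = {row}")
# Small probabilities are overweighted and large ones underweighted, which is
# the kind of individual-level distortion the hierarchical model estimates.
```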

  15. Boosting probabilistic graphical model inference by incorporating prior knowledge from multiple sources.

    PubMed

    Praveen, Paurush; Fröhlich, Holger

    2013-01-01

    Inferring regulatory networks from experimental data via probabilistic graphical models is a popular framework to gain insights into biological systems. However, the inherent noise in experimental data coupled with a limited sample size reduces the performance of network reverse engineering. Prior knowledge from existing sources of biological information can address this low signal to noise problem by biasing the network inference towards biologically plausible network structures. Although integrating various sources of information is desirable, their heterogeneous nature makes this task challenging. We propose two computational methods to incorporate various information sources into a probabilistic consensus structure prior to be used in graphical model inference. Our first model, called Latent Factor Model (LFM), assumes a high degree of correlation among external information sources and reconstructs a hidden variable as a common source in a Bayesian manner. The second model, a Noisy-OR, picks up the strongest support for an interaction among information sources in a probabilistic fashion. Our extensive computational studies on KEGG signaling pathways as well as on gene expression data from breast cancer and yeast heat shock response reveal that both approaches can significantly enhance the reconstruction accuracy of Bayesian Networks compared to other competing methods as well as to the situation without any prior. Our framework allows for using diverse information sources, like pathway databases, GO terms and protein domain data, etc. and is flexible enough to integrate new sources, if available.
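
    The Noisy-OR combination mentioned above can be illustrated in a few lines: each knowledge source supports an interaction with some confidence, and the combined prior confidence is one minus the product of the complements. The leak term and the confidence values are illustrative assumptions.

```python
# Sketch of a Noisy-OR combination of several prior-knowledge sources: each
# source i reports an interaction with confidence p_i, and the combined prior
# confidence is 1 - prod(1 - p_i). Leak and confidences are illustrative.
def noisy_or(confidences, leak=0.01):
    prob_absent = 1.0 - leak
    for p in confidences:
        prob_absent *= (1.0 - p)
    return 1.0 - prob_absent

# Hypothetical support for one candidate edge from a pathway database,
# GO-term co-annotation, and shared protein domains.
print(noisy_or([0.7, 0.3, 0.2]))   # strong support from any single source dominates
```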

  16. Semiparametric Thurstonian Models for Recurrent Choices: A Bayesian Analysis

    ERIC Educational Resources Information Center

    Ansari, Asim; Iyengar, Raghuram

    2006-01-01

    We develop semiparametric Bayesian Thurstonian models for analyzing repeated choice decisions involving multinomial, multivariate binary or multivariate ordinal data. Our modeling framework has multiple components that together yield considerable flexibility in modeling preference utilities, cross-sectional heterogeneity and parameter-driven…

  17. A Probabilistic Model for the Distribution of Authorships.

    ERIC Educational Resources Information Center

    Ajiferuke, Isola

    1991-01-01

    Discusses bibliometric studies of research collaboration and describes the development of a theoretical model for the distribution of authorship. The shifted Waring distribution model and 15 other probability models are tested for goodness-of-fit, and results are reported that indicate the shifted inverse Gaussian-Poisson model provides the best…

  18. Receptor-mediated cell attachment and detachment kinetics. I. Probabilistic model and analysis.

    PubMed Central

    Cozens-Roberts, C.; Lauffenburger, D. A.; Quinn, J. A.

    1990-01-01

    The kinetics of receptor-mediated cell adhesion to a ligand-coated surface play a key role in many physiological and biotechnology-related processes. We present a probabilistic model of receptor-ligand bond formation between a cell and surface to describe the probability of adhesion in a fluid shear field. Our model extends the deterministic model of Hammer and Lauffenburger (Hammer, D.A., and D.A. Lauffenburger. 1987. Biophys. J. 52:475-487) to a probabilistic framework, in which we calculate the probability that a certain number of bonds between a cell and surface exists at any given time. The probabilistic framework is used to account for deviations from ideal, deterministic behavior, inherent in chemical reactions involving relatively small numbers of reacting molecules. Two situations are investigated: first, cell attachment in the absence of fluid stress; and, second, cell detachment in the presence of fluid stress. In the attachment case, we examine the expected variance in bond formation as a function of attachment time; this also provides an initial condition for the detachment case. Focusing then on detachment, we predict transient behavior as a function of key system parameters, such as the distractive fluid force, the receptor-ligand bond affinity and rate constants, and the receptor and ligand densities. We compare the predictions of the probabilistic model with those of a deterministic model, and show how a deterministic approach can yield some inaccurate results; e.g., it cannot account for temporally continuous cell attachment or detachment, it can underestimate the time needed for cell attachment, it can overestimate the time required for cell detachment for a given level of force, and it can overestimate the force necessary for cell detachment. PMID:2174271

  19. A neurocomputational model of altruistic choice and its implications

    PubMed Central

    Hutcherson, Cendri A.; Bushong, Benjamin; Rangel, Antonio

    2016-01-01

    We propose a neurocomputational model of altruistic choice and test it using behavioral and fMRI data from a task in which subjects make choices between real monetary prizes for themselves and another. We show that a multi-attribute drift-diffusion model, in which choice results from accumulation of a relative value signal that linearly weights payoffs for self and other, captures key patterns of choice, reaction time, and neural response in ventral striatum, temporoparietal junction, and ventromedial prefrontal cortex. The model generates several novel insights into the nature of altruism. It explains when and why generous choices are slower or faster than selfish choices, and why they produce greater response in TPJ and vmPFC, without invoking competition between automatic and deliberative processes or reward value for generosity. It also predicts that when one’s own payoffs are valued more than others’, some generous acts may reflect mistakes rather than genuinely pro-social preferences. PMID:26182424
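
    A minimal simulation in the spirit of the model described above: a drift-diffusion process whose drift is a linear combination of payoffs to self and other, producing both a choice and a reaction time. The weights, noise level, bound, and payoffs below are illustrative assumptions, not the fitted parameters from the paper.

```python
import numpy as np

# Minimal simulation of a multi-attribute drift-diffusion process in which the
# drift is a weighted sum of payoffs to self and other, as in the abstract.
# Weights, noise, bounds and the trial payoffs are illustrative choices.
rng = np.random.default_rng(4)

def ddm_trial(self_payoff, other_payoff, w_self=1.0, w_other=0.4,
              noise=1.0, bound=2.0, dt=0.01, max_t=5.0):
    drift = w_self * self_payoff + w_other * other_payoff
    x, t = 0.0, 0.0
    while abs(x) < bound and t < max_t:
        x += drift * dt + noise * np.sqrt(dt) * rng.normal()
        t += dt
    return ("generous" if x >= bound else "selfish"), t

# A generous option costs the chooser 1 unit and gives the other person 3 units
# (payoffs coded relative to the selfish option).
trials = [ddm_trial(self_payoff=-1.0, other_payoff=3.0) for _ in range(2000)]
p_generous = np.mean([c == "generous" for c, _ in trials])
mean_rt = np.mean([t for _, t in trials])
print(p_generous, mean_rt)
```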

  20. Probabilistic performance-assessment modeling of the mixed waste landfill at Sandia National Laboratories.

    SciTech Connect

    Peace, Gerald L.; Goering, Timothy James; Miller, Mark Laverne; Ho, Clifford Kuofei

    2007-01-01

    A probabilistic performance assessment has been conducted to evaluate the fate and transport of radionuclides (americium-241, cesium-137, cobalt-60, plutonium-238, plutonium-239, radium-226, radon-222, strontium-90, thorium-232, tritium, uranium-238), heavy metals (lead and cadmium), and volatile organic compounds (VOCs) at the Mixed Waste Landfill (MWL). Probabilistic analyses were performed to quantify uncertainties inherent in the system and models for a 1,000-year period, and sensitivity analyses were performed to identify parameters and processes that were most important to the simulated performance metrics. Comparisons between simulated results and measured values at the MWL were made to gain confidence in the models and perform calibrations when data were available. In addition, long-term monitoring requirements and triggers were recommended based on the results of the quantified uncertainty and sensitivity analyses.

  1. The Dependent Poisson Race Model and Modeling Dependence in Conjoint Choice Experiments

    ERIC Educational Resources Information Center

    Ruan, Shiling; MacEachern, Steven N.; Otter, Thomas; Dean, Angela M.

    2008-01-01

    Conjoint choice experiments are used widely in marketing to study consumer preferences amongst alternative products. We develop a class of choice models, belonging to the class of Poisson race models, that describe a "random utility" which lends itself to a process-based description of choice. The models incorporate a dependence structure which…

  2. Probabilistic Analysis of Onion Routing in a Black-box Model

    DTIC Science & Technology

    2007-01-01

    ...model of anonymous communication that abstracts the essential properties of onion routing in the presence of an active adversary that controls a portion... users by exploiting knowledge of their probabilistic behavior. In particular, we show that a user u's anonymity is worst either when the other users always choose the destination u is least likely to visit or when the other users always choose the destination u chooses. This worst-case anonymity...

  3. Modeling choice behavior for new pharmaceutical products.

    PubMed

    Bingham, M F; Johnson, F R; Miller, D

    2001-01-01

    This paper presents a dynamic generalization of a model often used to aid marketing decisions relating to conventional products. The model uses stated-preference data in a random-utility framework to predict adoption rates for new pharmaceutical products. In addition, this paper employs a Markov model of patient learning in drug selection. While the simple learning rule presented here is only a rough approximation to reality, this model nevertheless systematically incorporates important features including learning and the influence of shifting preferences on market share. Despite its simplifications, the integrated framework of random-utility and product attribute updating presented here is capable of accommodating a variety of pharmaceutical marketing and development problems. This research demonstrates both the strengths of stated-preference market research and some of its shortcomings for pharmaceutical applications.

  4. Street choice logit model for visitors in shopping districts.

    PubMed

    Kawada, Ko; Yamada, Takashi; Kishimoto, Tatsuya

    2014-09-01

    In this study, we propose two models for predicting people's activity. The first is a pedestrian distribution prediction (or postdiction) model based on multiple regression analysis using space syntax indices of the urban fabric and people distribution data obtained from a field survey. The second is a street choice model for visitors based on a multinomial logit model. We performed a questionnaire survey in the field to investigate the strolling routes of 46 visitors and obtained a total of 1211 street choices along their routes. We propose a utility function defined as a weighted sum of space syntax indices and other indices, and estimate the weights by maximum likelihood. These models consider street networks, distance from the destination, direction of the street choice, and other spatial attributes (numbers of pedestrians, cars, and shops, and elevation). The first model explains the characteristics of streets where many people tend to walk or stay. The second model explains the mechanism underlying the street choice of visitors and clarifies the differences in the weights of street choice parameters across attributes such as gender, existence of destinations, number of people, etc. For all the attributes considered, the influences of DISTANCE and DIRECTION are strong. On the other hand, the influences of Int.V, SHOPS, CARS, ELEVATION, and WIDTH differ for each attribute. People with defined destinations tend to choose streets that have more shops and are wider and lower. In contrast, people with undefined destinations tend to choose streets with high Int.V. The choice of males is affected by Int.V, SHOPS, and WIDTH (positively) and CARS (negatively). Females prefer streets that have many shops, and couples tend to choose downhill streets. The behavior of individual persons is affected by all variables. The behavior of people visiting in groups is affected by SHOPS and WIDTH (positively).
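
    The street choice mechanism described above amounts to a multinomial logit: each candidate street receives a utility that is a weighted sum of attributes, and choice probabilities are the softmax of those utilities. The sketch below illustrates only that mechanism; the attributes, their values, and the weights are made up for illustration and are not the estimated parameters.

```python
import numpy as np

# Multinomial-logit sketch of the street choice described in the abstract:
# utility = weighted sum of street attributes, choice probability = softmax.
# Attribute values and weights below are illustrative, not the fitted ones.
streets = {
    #            Int.V  shops  cars  width  distance_gain  direction_alignment
    "street_A": [ 1.2,    8,    2,    6.0,      0.5,           0.9],
    "street_B": [ 0.8,    2,    5,    4.0,      0.2,           0.3],
    "street_C": [ 1.5,    0,    1,    8.0,     -0.1,           0.6],
}
weights = np.array([0.4, 0.08, -0.1, 0.05, 1.5, 1.2])

utilities = {name: float(np.dot(weights, attrs)) for name, attrs in streets.items()}
expu = {name: np.exp(u) for name, u in utilities.items()}
total = sum(expu.values())
probs = {name: v / total for name, v in expu.items()}
print(probs)   # predicted choice probabilities for one intersection
```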

  5. Near Real-Time Probabilistic Damage Diagnosis Using Surrogate Modeling and High Performance Computing

    NASA Technical Reports Server (NTRS)

    Warner, James E.; Zubair, Mohammad; Ranjan, Desh

    2017-01-01

    This work investigates novel approaches to probabilistic damage diagnosis that utilize surrogate modeling and high performance computing (HPC) to achieve substantial computational speedup. Motivated by Digital Twin, a structural health management (SHM) paradigm that integrates vehicle-specific characteristics with continual in-situ damage diagnosis and prognosis, the methods studied herein yield near real-time damage assessments that could enable monitoring of a vehicle's health while it is operating (i.e. online SHM). High-fidelity modeling and uncertainty quantification (UQ), both critical to Digital Twin, are incorporated using finite element method simulations and Bayesian inference, respectively. The crux of the proposed Bayesian diagnosis methods, however, is the reformulation of the numerical sampling algorithms (e.g. Markov chain Monte Carlo) used to generate the resulting probabilistic damage estimates. To this end, three distinct methods are demonstrated for rapid sampling that utilize surrogate modeling and exploit various degrees of parallelism for leveraging HPC. The accuracy and computational efficiency of the methods are compared on the problem of strain-based crack identification in thin plates. While each approach has inherent problem-specific strengths and weaknesses, all approaches are shown to provide accurate probabilistic damage diagnoses and several orders of magnitude computational speedup relative to a baseline Bayesian diagnosis implementation.

  6. Economic Dispatch for Microgrid Containing Electric Vehicles via Probabilistic Modeling: Preprint

    SciTech Connect

    Yao, Yin; Gao, Wenzhong; Momoh, James; Muljadi, Eduard

    2016-02-11

    In this paper, an economic dispatch model with probabilistic modeling is developed for a microgrid. The electric power supply in a microgrid consists of conventional power plants and renewable energy power plants, such as wind and solar power plants. Because of the fluctuation in the output of solar and wind power plants, an empirical probabilistic model is developed to predict their hourly output. According to different characteristics of wind and solar power plants, the parameters for probabilistic distribution are further adjusted individually for both. On the other hand, with the growing trend in plug-in electric vehicles (PHEVs), an integrated microgrid system must also consider the impact of PHEVs. The charging loads from PHEVs as well as the discharging output via the vehicle-to-grid (V2G) method can greatly affect the economic dispatch for all of the micro energy sources in a microgrid. This paper presents an optimization method for economic dispatch in a microgrid considering conventional power plants, renewable power plants, and PHEVs. The simulation results reveal that PHEVs with V2G capability can be an indispensable supplement in a modern microgrid.

  7. Sensitivity analysis and probabilistic re-entry modeling for debris using high dimensional model representation based uncertainty treatment

    NASA Astrophysics Data System (ADS)

    Mehta, Piyush M.; Kubicek, Martin; Minisci, Edmondo; Vasile, Massimiliano

    2017-01-01

    Well-known tools developed for satellite and debris re-entry perform break-up and trajectory simulations in a deterministic sense and do not perform any uncertainty treatment. The treatment of uncertainties associated with the re-entry of a space object requires a probabilistic approach. A Monte Carlo campaign is the intuitive approach to performing a probabilistic analysis, however, it is computationally very expensive. In this work, we use a recently developed approach based on a new derivation of the high dimensional model representation method for implementing a computationally efficient probabilistic analysis approach for re-entry. Both aleatoric and epistemic uncertainties that affect aerodynamic trajectory and ground impact location are considered. The method is applicable to both controlled and un-controlled re-entry scenarios. The resulting ground impact distributions are far from the typically used Gaussian or ellipsoid distributions.

  8. Microscopic probabilistic model for the simulation of secondary electron emission

    SciTech Connect

    Furman, M.A.; Pivi, M.T.F.

    2002-07-29

    We provide a detailed description of a model and its computational algorithm for the secondary electron emission process. The model is based on a broad phenomenological fit to data for the secondary emission yield (SEY) and the emitted-energy spectrum. We provide two sets of values for the parameters by fitting our model to two particular data sets, one for copper and the other one for stainless steel.

  9. Probabilistic Model for the Simulation of Secondary Electron Emission

    SciTech Connect

    Furman, M

    2004-05-17

    We provide a detailed description of a model and its computational algorithm for the secondary electron emission process. The model is based on a broad phenomenological fit to data for the secondary emission yield (SEY) and the emitted-energy spectrum. We provide two sets of values for the parameters by fitting our model to two particular data sets, one for copper and the other one for stainless steel.

  10. A Ballistic Model of Choice Response Time

    ERIC Educational Resources Information Center

    Brown, Scott; Heathcote, Andrew

    2005-01-01

    Almost all models of response time (RT) use a stochastic accumulation process. To account for the benchmark RT phenomena, researchers have found it necessary to include between-trial variability in the starting point and/or the rate of accumulation, both in linear (R. Ratcliff & J. N. Rouder, 1998) and nonlinear (M. Usher & J. L. McClelland, 2001)…

  11. Dependence in probabilistic modeling, Dempster-Shafer theory, and probability bounds analysis.

    SciTech Connect

    Oberkampf, William Louis; Tucker, W. Troy; Zhang, Jianzhong; Ginzburg, Lev; Berleant, Daniel J.; Ferson, Scott; Hajagos, Janos; Nelsen, Roger B.

    2004-10-01

    This report summarizes methods to incorporate information (or lack of information) about inter-variable dependence into risk assessments that use Dempster-Shafer theory or probability bounds analysis to address epistemic and aleatory uncertainty. The report reviews techniques for simulating correlated variates for a given correlation measure and dependence model, computation of bounds on distribution functions under a specified dependence model, formulation of parametric and empirical dependence models, and bounding approaches that can be used when information about the intervariable dependence is incomplete. The report also reviews several of the most pervasive and dangerous myths among risk analysts about dependence in probabilistic models.
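
    One of the techniques the report reviews, simulating correlated variates for a given dependence model, can be sketched with a Gaussian copula: draw correlated normals, map them to uniforms, then map the uniforms to the desired marginals. The marginals and the correlation value below are illustrative assumptions.

```python
import numpy as np
from scipy import stats

# Sketch of simulating correlated variates with given marginals and a target
# dependence via a Gaussian copula. Marginals and correlation are illustrative.
rng = np.random.default_rng(5)
rho, n = 0.7, 50_000

# Correlated standard normals -> uniforms -> arbitrary marginals.
z = rng.multivariate_normal(mean=[0, 0], cov=[[1, rho], [rho, 1]], size=n)
u = stats.norm.cdf(z)
x = stats.lognorm.ppf(u[:, 0], s=0.5)        # first variable: lognormal marginal
y = stats.gamma.ppf(u[:, 1], a=2.0)          # second variable: gamma marginal

print("Spearman correlation:", round(stats.spearmanr(x, y)[0], 3))
```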

  12. Limitations of Exemplar Models of Multi-Attribute Probabilistic Inference

    ERIC Educational Resources Information Center

    Nosofsky, Robert M.; Bergert, F. Bryabn

    2007-01-01

    Observers were presented with pairs of objects varying along binary-valued attributes and learned to predict which member of each pair had a greater value on a continuously varying criterion variable. The predictions from exemplar models of categorization were contrasted with classic alternative models, including generalized versions of a…

  13. Politics, Organizations, and Choice: Applications of an Equilibrium Model

    ERIC Educational Resources Information Center

    Roos, Leslie L., Jr.

    1972-01-01

    An economic model of consumer choice is used to link the separate theories that have dealt with comparative politics, job satisfaction, and organizational mobility. The model is used to structure data taken from studies of Turkish and French elites on environmental change, organizational mobility, and satisfaction. (Author/DN)

  14. The Influence of Role Models on Women's Career Choices

    ERIC Educational Resources Information Center

    Quimby, Julie L.; DeSantis, Angela M.

    2006-01-01

    This study of 368 female undergraduates examined self-efficacy and role model influence as predictors of career choice across J. L. Holland's (1997) 6 RIASEC (Realistic, Investigative, Artistic, Social, Enterprising, Conventional) types. Findings showed that levels of self-efficacy and role model influence differed across Holland types. Multiple…

  15. Probabilistically Constraining Age-Depth-Models of Glaciogenic Sediments

    NASA Astrophysics Data System (ADS)

    Werner, J.; van der Bilt, W.; Tingley, M.

    2015-12-01

    Reconstructions of late-Holocene climate rely heavily upon proxies that are assumed to be accurately dated by layer counting. All of these proxies, such as measurements of tree rings, ice cores, and varved lake sediments, carry some inherent dating uncertainty that is not always fully accounted for. Considerable advances could be achieved if time uncertainties were recognized and correctly modeled, even for proxies commonly treated as free of age-model errors. Current approaches for accounting for time uncertainty are generally limited to repeating the reconstruction using each member of an ensemble of age models, thereby inflating the final estimated uncertainty - in effect, each possible age model is given equal weighting. Uncertainties can be reduced by exploiting the inferred space-time covariance structure of the climate to re-weight the possible age models. Werner and Tingley (2015) demonstrated how Bayesian hierarchical climate reconstruction models can be augmented to account for time-uncertain proxies. In their method, probabilities associated with the age models are formally updated within the Bayesian framework, thereby reducing uncertainties. Numerical experiments (Werner and Tingley 2015) show that updating the age model probabilities decreases uncertainty in the resulting reconstructions, as compared with the current de facto standard of sampling over all age models, provided there is sufficient information from other data sources in the spatial region of the time-uncertain proxy. We show how this novel method can be applied to high-resolution, sub-annually sampled lacustrine sediment records to constrain their respective age-depth models. The results help to quantify the signal content and extract the regionally representative signal. The single time series can then be used as the basis for a reconstruction of glacial activity. van der Bilt et al., in prep.; Werner, J.P. and Tingley, M.P., Clim. Past (2015).

  16. Learning a Generative Probabilistic Grammar of Experience: A Process-Level Model of Language Acquisition

    ERIC Educational Resources Information Center

    Kolodny, Oren; Lotem, Arnon; Edelman, Shimon

    2015-01-01

    We introduce a set of biologically and computationally motivated design choices for modeling the learning of language, or of other types of sequential, hierarchically structured experience and behavior, and describe an implemented system that conforms to these choices and is capable of unsupervised learning from raw natural-language corpora. Given…

  17. Order Under Uncertainty: Robust Differential Expression Analysis Using Probabilistic Models for Pseudotime Inference

    PubMed Central

    Campbell, Kieran R.

    2016-01-01

    Single cell gene expression profiling can be used to quantify transcriptional dynamics in temporal processes, such as cell differentiation, using computational methods to label each cell with a ‘pseudotime’ where true time series experimentation is too difficult to perform. However, owing to the high variability in gene expression between individual cells, there is an inherent uncertainty in the precise temporal ordering of the cells. Pre-existing methods for pseudotime estimation have predominantly given point estimates precluding a rigorous analysis of the implications of uncertainty. We use probabilistic modelling techniques to quantify pseudotime uncertainty and propagate this into downstream differential expression analysis. We demonstrate that reliance on a point estimate of pseudotime can lead to inflated false discovery rates and that probabilistic approaches provide greater robustness and measures of the temporal resolution that can be obtained from pseudotime inference. PMID:27870852

  18. Event-Based Media Enrichment Using an Adaptive Probabilistic Hypergraph Model.

    PubMed

    Liu, Xueliang; Wang, Meng; Yin, Bao-Cai; Huet, Benoit; Li, Xuelong

    2015-11-01

    Nowadays, with the continual development of digital capture technologies and social media services, a vast number of media documents are captured and shared online to help attendees record their experience during events. In this paper, we present a method combining semantic inference and multimodal analysis for automatically finding media content to illustrate events using an adaptive probabilistic hypergraph model. In this model, media items are taken as vertices in the weighted hypergraph and the task of enriching media to illustrate events is formulated as a ranking problem. In our method, each hyperedge is constructed using the K-nearest neighbors of a given media document. We also employ a probabilistic representation, which assigns each vertex to a hyperedge in a probabilistic way, to further exploit the correlation among media data. Furthermore, we optimize the hypergraph weights in a regularization framework, which is solved as a second-order cone problem. The approach is initiated by seed media and then used to rank the media documents using a transductive inference process. The results obtained from validating the approach on an event dataset collected from EventMedia demonstrate the effectiveness of the proposed approach.

  19. Probabilistic Inference in General Graphical Models through Sampling in Stochastic Networks of Spiking Neurons

    PubMed Central

    Pecevski, Dejan; Buesing, Lars; Maass, Wolfgang

    2011-01-01

    An important open problem of computational neuroscience is the generic organization of computations in networks of neurons in the brain. We show here through rigorous theoretical analysis that inherent stochastic features of spiking neurons, in combination with simple nonlinear computational operations in specific network motifs and dendritic arbors, enable networks of spiking neurons to carry out probabilistic inference through sampling in general graphical models. In particular, it enables them to carry out probabilistic inference in Bayesian networks with converging arrows (“explaining away”) and with undirected loops, that occur in many real-world tasks. Ubiquitous stochastic features of networks of spiking neurons, such as trial-to-trial variability and spontaneous activity, are necessary ingredients of the underlying computational organization. We demonstrate through computer simulations that this approach can be scaled up to neural emulations of probabilistic inference in fairly large graphical models, yielding some of the most complex computations that have been carried out so far in networks of spiking neurons. PMID:22219717

  20. Fundamental Mistuning Model for Probabilistic Analysis Studied Experimentally

    NASA Technical Reports Server (NTRS)

    Griffin, Jerry H.

    2005-01-01

    The Fundamental Mistuning Model (FMM) is a reduced-order model for efficiently calculating the forced response of a mistuned bladed disk. FMM ID is a companion program that determines the mistuning in a particular rotor. Together, these methods provide a way to acquire mistuning data in a population of bladed disks and then simulate the forced response of the fleet. This process was tested experimentally at the NASA Glenn Research Center, and the simulated results were compared with laboratory measurements of a "fleet" of test rotors. The method was shown to work quite well. It was found that the accuracy of the results depends on two factors: (1) the quality of the statistical model used to characterize mistuning and (2) how sensitive the system is to errors in the statistical modeling.

  1. Probabilistic Modeling of Loran-C for nonprecision approaches

    NASA Technical Reports Server (NTRS)

    Einhorn, John K.

    1987-01-01

    The overall idea of the research was to predict the errors to be encountered during an approach using available data from the U.S. Coast Guard and standard normal distribution probability analysis for a number of airports in the North East CONUS. The research consists of two parts: an analytical model that predicts the probability of an approach falling within a given standard, and a series of flight tests designed to test the validity of the model.

  2. Probabilistic Model of Microbial Cell Growth, Division, and Mortality

    PubMed Central

    Horowitz, Joseph; Normand, Mark D.; Corradini, Maria G.; Peleg, Micha

    2010-01-01

    After a short time interval of length δt during microbial growth, an individual cell can be found to be divided with probability Pd(t)δt, dead with probability Pm(t)δt, or alive but undivided with the probability 1 − [Pd(t) + Pm(t)]δt, where t is time, Pd(t) expresses the probability of division for an individual cell per unit of time, and Pm(t) expresses the probability of mortality per unit of time. These probabilities may change with the state of the population and the habitat's properties and are therefore functions of time. This scenario translates into a model that is presented in stochastic and deterministic versions. The first, a stochastic process model, monitors the fates of individual cells and determines cell numbers. It is particularly suitable for small populations such as those that may exist in the case of casual contamination of a food by a pathogen. The second, which can be regarded as a large-population limit of the stochastic model, is a continuous mathematical expression that describes the population's size as a function of time. It is suitable for large microbial populations such as those present in unprocessed foods. Exponential or logistic growth with or without lag, inactivation with or without a “shoulder,” and transitions between growth and inactivation are all manifestations of the underlying probability structure of the model. With temperature-dependent parameters, the model can be used to simulate nonisothermal growth and inactivation patterns. The same concept applies to other factors that promote or inhibit microorganisms, such as pH and the presence of antimicrobials, etc. With Pd(t) and Pm(t) in the form of logistic functions, the model can simulate all commonly observed growth/mortality patterns. Estimates of the changing probability parameters can be obtained with both the stochastic and deterministic versions of the model, as demonstrated with simulated data. PMID:19915038
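
    A small-population sketch of the stochastic version of the model described above: in each short interval of length δt, every cell divides with probability Pd(t)δt, dies with probability Pm(t)δt, and otherwise remains unchanged. The logistic forms and parameter values used for Pd and Pm below are illustrative assumptions, not those fitted in the paper.

```python
import numpy as np

# Small-population sketch of the stochastic model: in each interval dt a cell
# divides with probability Pd(t)*dt, dies with probability Pm(t)*dt, or stays
# as it is. The logistic forms and parameters of Pd and Pm are illustrative.
rng = np.random.default_rng(6)

def Pd(t):   # division probability per unit time (rises, then saturates)
    return 0.8 / (1.0 + np.exp(-(t - 2.0)))

def Pm(t):   # mortality probability per unit time (kicks in later)
    return 0.6 / (1.0 + np.exp(-(t - 8.0)))

def simulate(n0=10, dt=0.01, t_end=12.0):
    n, t, history = n0, 0.0, []
    while t < t_end and n > 0:
        u = rng.uniform(size=n)
        divisions = np.sum(u < Pd(t) * dt)
        deaths = np.sum((u >= Pd(t) * dt) & (u < (Pd(t) + Pm(t)) * dt))
        n += divisions - deaths
        t += dt
        history.append((t, n))
    return history

final_counts = [simulate()[-1][1] for _ in range(20)]   # run-to-run variability
print(final_counts)
```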

  3. TAFV Alternative Fuels and Vehicles Choice Model Documentation

    SciTech Connect

    Greene, D.L.

    2001-07-27

    A model is derived for predicting choice among alternative fuels and alternative vehicle technologies for light-duty motor vehicles. The nested multinomial logit (NML) mathematical framework is used. Calibration of the model is based on information in the existing literature and on deduction from a small number of assumed key parameters, such as the value of time and discount rates. A spreadsheet model has been developed for calibration and preliminary testing of the model.

  4. Testing for ontological errors in probabilistic forecasting models of natural systems.

    PubMed

    Marzocchi, Warner; Jordan, Thomas H

    2014-08-19

    Probabilistic forecasting models describe the aleatory variability of natural systems as well as our epistemic uncertainty about how the systems work. Testing a model against observations exposes ontological errors in the representation of a system and its uncertainties. We clarify several conceptual issues regarding the testing of probabilistic forecasting models for ontological errors: the ambiguity of the aleatory/epistemic dichotomy, the quantification of uncertainties as degrees of belief, the interplay between Bayesian and frequentist methods, and the scientific pathway for capturing predictability. We show that testability of the ontological null hypothesis derives from an experimental concept, external to the model, that identifies collections of data, observed and not yet observed, that are judged to be exchangeable when conditioned on a set of explanatory variables. These conditional exchangeability judgments specify observations with well-defined frequencies. Any model predicting these behaviors can thus be tested for ontological error by frequentist methods; e.g., using P values. In the forecasting problem, prior predictive model checking, rather than posterior predictive checking, is desirable because it provides more severe tests. We illustrate experimental concepts using examples from probabilistic seismic hazard analysis. Severe testing of a model under an appropriate set of experimental concepts is the key to model validation, in which we seek to know whether a model replicates the data-generating process well enough to be sufficiently reliable for some useful purpose, such as long-term seismic forecasting. Pessimistic views of system predictability fail to recognize the power of this methodology in separating predictable behaviors from those that are not.

  5. Testing for ontological errors in probabilistic forecasting models of natural systems

    PubMed Central

    Marzocchi, Warner; Jordan, Thomas H.

    2014-01-01

    Probabilistic forecasting models describe the aleatory variability of natural systems as well as our epistemic uncertainty about how the systems work. Testing a model against observations exposes ontological errors in the representation of a system and its uncertainties. We clarify several conceptual issues regarding the testing of probabilistic forecasting models for ontological errors: the ambiguity of the aleatory/epistemic dichotomy, the quantification of uncertainties as degrees of belief, the interplay between Bayesian and frequentist methods, and the scientific pathway for capturing predictability. We show that testability of the ontological null hypothesis derives from an experimental concept, external to the model, that identifies collections of data, observed and not yet observed, that are judged to be exchangeable when conditioned on a set of explanatory variables. These conditional exchangeability judgments specify observations with well-defined frequencies. Any model predicting these behaviors can thus be tested for ontological error by frequentist methods; e.g., using P values. In the forecasting problem, prior predictive model checking, rather than posterior predictive checking, is desirable because it provides more severe tests. We illustrate experimental concepts using examples from probabilistic seismic hazard analysis. Severe testing of a model under an appropriate set of experimental concepts is the key to model validation, in which we seek to know whether a model replicates the data-generating process well enough to be sufficiently reliable for some useful purpose, such as long-term seismic forecasting. Pessimistic views of system predictability fail to recognize the power of this methodology in separating predictable behaviors from those that are not. PMID:25097265

  6. Probabilistic models for semisupervised discriminative motif discovery in DNA sequences.

    PubMed

    Kim, Jong Kyoung; Choi, Seungjin

    2011-01-01

    Methods for discriminative motif discovery in DNA sequences identify transcription factor binding sites (TFBSs), searching only for patterns that differentiate two sets (positive and negative sets) of sequences. On one hand, discriminative methods increase the sensitivity and specificity of motif discovery, compared to generative models. On the other hand, generative models can easily exploit unlabeled sequences to better detect functional motifs when labeled training samples are limited. In this paper, we develop a hybrid generative/discriminative model which enables us to make use of unlabeled sequences in the framework of discriminative motif discovery, leading to semisupervised discriminative motif discovery. Numerical experiments on yeast ChIP-chip data for discovering DNA motifs demonstrate that the best performance is obtained between the purely generative and the purely discriminative approaches, and that semisupervised learning improves performance when labeled sequences are limited.

  7. A Coupled Probabilistic Wake Vortex and Aircraft Response Prediction Model

    NASA Technical Reports Server (NTRS)

    Gloudemans, Thijs; Van Lochem, Sander; Ras, Eelco; Malissa, Joel; Ahmad, Nashat N.; Lewis, Timothy A.

    2016-01-01

    Wake vortex spacing standards, along with weather and runway occupancy time, restrict terminal-area throughput and impose major constraints on the overall capacity and efficiency of the National Airspace System (NAS). For more than two decades, the National Aeronautics and Space Administration (NASA) has been conducting research on characterizing wake vortex behavior in order to develop fast-time wake transport and decay prediction models. It is expected that the models can be used in the systems-level design of advanced air traffic management (ATM) concepts that safely increase the capacity of the NAS. It is also envisioned that at a later stage of maturity, these models could potentially be used operationally, in ground-based spacing and scheduling systems as well as on the flight deck.

  8. Web-tool to Support Medical Experts in Probabilistic Modelling Using Large Bayesian Networks With an Example of Rhinosinusitis.

    PubMed

    Cypko, Mario A; Hirsch, David; Koch, Lucas; Stoehr, Matthaeus; Strauss, Gero; Denecke, Kerstin

    2015-01-01

    For many complex diseases, finding the best patient-specific treatment decision is difficult for physicians due to limited mental capacity. Clinical decision support systems based on Bayesian networks (BN) can provide a probabilistic graphical model integrating all necessary aspects relevant for decision making. Such models are often manually created by clinical experts. The modeling process consists of graphical modeling, conducted by collecting information entities, and probabilistic modeling, achieved through defining the relations of information entities to their direct causes. Such expert-based probabilistic modelling with BNs is very time-intensive and requires knowledge about the underlying modeling method. We introduce in this paper an intuitive web-based system for helping medical experts generate decision models based on BNs. Using the tool, no special knowledge about the underlying model or BN is necessary. We tested the tool with an example of modeling treatment decisions for Rhinosinusitis and studied its usability.

  9. Probabilistic uncertainty analysis of epidemiological modeling to guide public health intervention policy.

    PubMed

    Gilbert, Jennifer A; Meyers, Lauren Ancel; Galvani, Alison P; Townsend, Jeffrey P

    2014-03-01

    Mathematical modeling of disease transmission has provided quantitative predictions for health policy, facilitating the evaluation of epidemiological outcomes and the cost-effectiveness of interventions. However, typical sensitivity analyses of deterministic dynamic infectious disease models focus on model architecture and the relative importance of parameters but neglect parameter uncertainty when reporting model predictions. Consequently, model results that identify point estimates of intervention levels necessary to terminate transmission yield limited insight into the probability of success. We apply probabilistic uncertainty analysis to a dynamic model of influenza transmission and assess global uncertainty in outcome. We illustrate that when parameter uncertainty is not incorporated into outcome estimates, levels of vaccination and treatment predicted to prevent an influenza epidemic will only have an approximately 50% chance of terminating transmission and that sensitivity analysis alone is not sufficient to obtain this information. We demonstrate that accounting for parameter uncertainty yields probabilities of epidemiological outcomes based on the degree to which data support the range of model predictions. Unlike typical sensitivity analyses of dynamic models that only address variation in parameters, the probabilistic uncertainty analysis described here enables modelers to convey the robustness of their predictions to policy makers, extending the power of epidemiological modeling to improve public health.
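
    A minimal sketch of this kind of probabilistic uncertainty analysis, using a hypothetical influenza-like parameterization rather than the authors' model: sample the transmission and recovery rates from distributions reflecting parameter uncertainty, and estimate the probability that a given vaccination coverage drives the effective reproduction number below one.

        import numpy as np

        rng = np.random.default_rng(1)
        n = 100000

        # Parameter uncertainty (illustrative assumptions): transmission and recovery rates.
        beta = rng.normal(0.60, 0.08, n)    # per-day transmission rate
        gamma = rng.normal(0.33, 0.04, n)   # per-day recovery rate
        R0 = beta / gamma

        # Coverage chosen so the point-estimate effective reproduction number is about 1.
        vaccine_coverage = 0.64
        efficacy = 0.70

        # Effective reproduction number after vaccination; epidemic prevented if R_eff < 1.
        R_eff = R0 * (1 - vaccine_coverage * efficacy)
        p_success = np.mean(R_eff < 1.0)

        print(f"median R0: {np.median(R0):.2f}")
        print(f"probability the campaign terminates transmission: {p_success:.2f}")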

  10. Boolean Queries and Term Dependencies in Probabilistic Retrieval Models.

    ERIC Educational Resources Information Center

    Croft, W. Bruce

    1986-01-01

    Proposes approach to integrating Boolean and statistical systems where Boolean queries are interpreted as a means of specifying term dependencies in relevant set of documents. Highlights include series of retrieval experiments designed to test retrieval strategy based on term dependence model and relation of results to other work. (18 references)…

  11. A simple probabilistic model of submicroscopic diatom morphogenesis

    PubMed Central

    Willis, L.; Cox, E. J.; Duke, T.

    2013-01-01

    Unicellular algae called diatoms morph biomineral compounds into tough exoskeletons via complex intracellular processes about which there is much to be learned. These exoskeletons feature a rich variety of structures from the submicroscale to the milliscale, many of which have not been reproduced in vitro. In order to help understand this complex miniature morphogenesis, here we introduce and analyse a simple model of biomineral kinetics, focusing on the exoskeleton's submicroscopic patterned planar structures called pore occlusions. The model reproduces most features of these pore occlusions by retuning just one parameter, thereby indicating what physio-biochemical mechanisms could sufficiently explain morphogenesis at the submicroscopic scale: it is sufficient to identify a mechanism of lateral negative feedback on the biomineral reaction kinetics. The model is nonlinear and stochastic; it is an extended version of the threshold voter model. Its mean-field equation provides a simple and, as far as the authors are aware, new way of mapping out the spatial patterns produced by lateral inhibition and variants thereof. PMID:23554345

  12. Probabilistic Model of Fault Detection in Quantum Circuits

    NASA Astrophysics Data System (ADS)

    Banerjee, A.; Pathak, A.

    Since the introduction of quantum computation, several protocols (such as quantum cryptography, quantum algorithms, quantum teleportation) have established quantum computing as a superior future technology. Each of these processes involves quantum circuits, which are prone to different kinds of faults. Consequently, it is important to verify whether the circuit hardware is defective or not. The systematic procedure to do so is known as fault testing. Normally, testing is done by providing a set of valid input states, measuring the corresponding output states and comparing them with the expected output states of the perfect (faultless) circuit. This particular set of input vectors is known as the test set [6]. If there exists a fault, then the next step would be to find the exact location and nature of the defect. This is known as fault localization. A model that explains the logical or functional faults in the circuit is a fault model. Conventional fault models include (i) stuck-at faults, (ii) bridge faults, and (iii) delay faults. These fault models have been rigorously studied for conventional irreversible circuits. But with the advent of reversible classical computing and quantum computing, it has become important to enlarge the domain of study of test vectors.

  13. Choice as a Global Language in Local Practice: A Mixed Model of School Choice in Taiwan

    ERIC Educational Resources Information Center

    Mao, Chin-Ju

    2015-01-01

    This paper uses school choice policy as an example to demonstrate how local actors adopt, mediate, translate, and reformulate "choice" as neo-liberal rhetoric informing education reform. Complex processes exist between global policy about school choice and the local practice of school choice. Based on the theoretical sensibility of…

  14. Probabilistic image modeling with an extended chain graph for human activity recognition and image segmentation.

    PubMed

    Zhang, Lei; Zeng, Zhi; Ji, Qiang

    2011-09-01

    A chain graph (CG) is a hybrid probabilistic graphical model (PGM) capable of modeling heterogeneous relationships among random variables. So far, however, its application in image and video analysis has been very limited due to the lack of principled learning and inference methods for a CG of general topology. To overcome this limitation, we introduce methods to extend the conventional chain-like CG model to a CG model with more general topology, along with the associated methods for learning and inference in such a general CG model. Specifically, we propose techniques to systematically construct a generally structured CG, to parameterize this model, to derive its joint probability distribution, to perform joint parameter learning, and to perform probabilistic inference in this model. To demonstrate the utility of such an extended CG, we apply it to two challenging image and video analysis problems: human activity recognition and image segmentation. The experimental results show improved performance of the extended CG model over the conventional directed or undirected PGMs. This study demonstrates the promise of the extended CG for effective modeling and inference of complex real-world problems.

  15. Ultrasonic wave-based defect localization using probabilistic modeling

    NASA Astrophysics Data System (ADS)

    Todd, M. D.; Flynn, E. B.; Wilcox, P. D.; Drinkwater, B. W.; Croxford, A. J.; Kessler, S.

    2012-05-01

    This work presents a new approach rooted in maximum likelihood estimation for defect localization in sparse array guided wave ultrasonic interrogation applications. The approach constructs a minimally-informed statistical model of the guided wave process, where unknown or uncertain model parameters are assigned non-informative Bayesian prior distributions and integrated out of the a posteriori probability calculation. The premise of this localization approach is straightforward: the most likely defect location is the point on the structure with the maximum a posteriori probability of actually being the location of damage (i.e., the most probable location given a set of sensor measurements). The proposed approach is tested on a complex stiffened panel against other common localization approaches and found to have superior performance in all cases.
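
    The localization principle can be illustrated with a simple grid search, assuming straight-ray propagation at a known wave speed, a flat prior over defect position, and Gaussian measurement noise; the geometry, sensor layout, and noise level below are illustrative, not those of the stiffened-panel study.

        import numpy as np

        rng = np.random.default_rng(2)
        c = 5.0  # assumed guided-wave group speed, mm/us

        # Sparse array of four transducers used as transmitter-receiver pairs.
        sensors = np.array([[0, 0], [400, 0], [400, 300], [0, 300]], float)
        true_defect = np.array([250.0, 120.0])

        def scatter_time(defect, tx, rx):
            # Time of flight transmitter -> defect -> receiver.
            return (np.linalg.norm(defect - tx) + np.linalg.norm(defect - rx)) / c

        pairs = [(i, j) for i in range(4) for j in range(4) if i < j]
        sigma = 0.5  # measurement noise, us
        measured = np.array([scatter_time(true_defect, sensors[i], sensors[j]) for i, j in pairs])
        measured += rng.normal(0, sigma, measured.size)

        # Evaluate the log posterior on a grid; the most likely defect location is the argmax.
        xs, ys = np.meshgrid(np.arange(0, 401, 2.0), np.arange(0, 301, 2.0))
        log_post = np.zeros_like(xs)
        for k, (i, j) in enumerate(pairs):
            predicted = (np.hypot(xs - sensors[i, 0], ys - sensors[i, 1])
                         + np.hypot(xs - sensors[j, 0], ys - sensors[j, 1])) / c
            log_post -= (measured[k] - predicted) ** 2 / (2 * sigma ** 2)

        best = np.unravel_index(np.argmax(log_post), log_post.shape)
        print("estimated defect location:", xs[best], ys[best])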

  16. PREMChlor: Probabilistic Remediation Evaluation Model for Chlorinated Solvents

    DTIC Science & Technology

    2010-03-01

    Probabilistic Remediation Evaluation Model for Chlorinated Solvents, ESTCP Project ER-0704, March 2010. Hailian Liang, Ph.D., and Ronald Falta, Ph.D., Clemson University; Charles... Performing organization: Clemson University, Clemson, SC 29634. The report documents an Environmental Security Technology Certification Program (ESTCP) research project (ER-0704), which was a joint effort between Clemson University, GSI Environmental Inc., and Purdue University.

  17. Probabilistic Model of a Floating Target Behaviour in Rough Seas

    DTIC Science & Technology

    2013-07-01

    Author: Rada Pushkarova. Only reference-list fragments survive in this record, including Pierson, W. J., and Moskowitz, L., "A proposed spectral form for fully developed wind seas," and Torsethaugen, K., "Simplified double peak spectral model for ocean waves," SINTEF Report STF80 A048052, SINTEF Fisheries and Aquaculture.

  18. The implicit possibility of dualism in quantum probabilistic cognitive modeling.

    PubMed

    Mender, Donald

    2013-06-01

    Pothos & Busemeyer (P&B) argue convincingly that quantum probability offers an improvement over classical Bayesian probability in modeling the empirical data of cognitive science. However, a weakness related to restrictions on the dimensionality of incompatible physical observables flows from the authors' "agnosticism" regarding quantum processes in neural substrates underlying cognition. Addressing this problem will require either future research findings validating quantum neurophysics or theoretical expansion of the uncertainty principle as a new, neurocognitively contextualized, "local" symmetry.

  19. Probabilistic graphical models to deal with age estimation of living persons.

    PubMed

    Sironi, Emanuele; Gallidabino, Matteo; Weyermann, Céline; Taroni, Franco

    2016-03-01

    Due to the rise of criminal, civil and administrative judicial situations involving people lacking valid identity documents, age estimation of living persons has become an important operational procedure for numerous forensic and medicolegal services worldwide. The chronological age of a given person is generally estimated from the observed degree of maturity of some selected physical attributes by means of statistical methods. However, their application in the forensic framework suffers from some conceptual and practical drawbacks, as recently claimed in the specialised literature. The aim of this paper is therefore to offer an alternative solution for overcoming these limits, by reiterating the utility of a probabilistic Bayesian approach for age estimation. This approach allows one to deal in a transparent way with the uncertainty surrounding the age estimation process and to produce all the relevant information in the form of posterior probability distribution about the chronological age of the person under investigation. Furthermore, this probability distribution can also be used for evaluating in a coherent way the possibility that the examined individual is younger or older than a given legal age threshold having a particular legal interest. The main novelty introduced by this work is the development of a probabilistic graphical model, i.e. a Bayesian network, for dealing with the problem at hand. The use of this kind of probabilistic tool can significantly facilitate the application of the proposed methodology: examples are presented based on data related to the ossification status of the medial clavicular epiphysis. The reliability and the advantages of this probabilistic tool are presented and discussed.
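
    The posterior reasoning described above can be sketched with a discretized age prior and a likelihood linking age to an observed maturity stage; the stage probabilities and prior below are invented for illustration and are not the clavicular-ossification data of the paper.

        import numpy as np

        ages = np.arange(10, 31)                       # discretized chronological ages (years)
        prior = np.ones_like(ages, float) / ages.size  # flat prior over the relevant age range

        # Illustrative likelihood P(observed maturity stage | age): the probability that the
        # medial clavicular epiphysis has reached the observed stage rises with age.
        def stage_likelihood(age, midpoint=19.0, slope=1.2):
            return 1.0 / (1.0 + np.exp(-(age - midpoint) / slope))

        likelihood = stage_likelihood(ages)            # suppose the "mature" stage was observed
        posterior = prior * likelihood
        posterior /= posterior.sum()

        p_over_18 = posterior[ages >= 18].sum()
        print(f"P(age >= 18 | observed stage) = {p_over_18:.2f}")
        print(f"posterior mean age = {np.sum(ages * posterior):.1f} years")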

  20. Kinetic modeling based probabilistic segmentation for molecular images.

    PubMed

    Saad, Ahmed; Hamarneh, Ghassan; Möller, Torsten; Smith, Ben

    2008-01-01

    We propose a semi-supervised, kinetic modeling based segmentation technique for molecular imaging applications. It is an iterative, self-learning algorithm based on uncertainty principles, designed to alleviate low signal-to-noise ratio (SNR) and partial volume effect (PVE) problems. Synthetic fluorodeoxyglucose (FDG) and simulated Raclopride dynamic positron emission tomography (dPET) brain images with excessive noise levels are used to validate our algorithm. We show, qualitatively and quantitatively, that our algorithm outperforms state-of-the-art techniques in identifying different functional regions and recovering the kinetic parameters.

  1. Probabilistic Multi-Factor Interaction Model for Complex Material Behavior

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Abumeri, Galib H.

    2008-01-01

    The Multi-Factor Interaction Model (MFIM) is used to evaluate the divot weight (foam weight ejected) from the launch external tanks. The multi-factor has sufficient degrees of freedom to evaluate a large number of factors that may contribute to the divot ejection. It also accommodates all interactions by its product form. Each factor has an exponent that satisfies only two points the initial and final points. The exponent describes a monotonic path from the initial condition to the final. The exponent values are selected so that the described path makes sense in the absence of experimental data. In the present investigation, the data used was obtained by testing simulated specimens in launching conditions. Results show that the MFIM is an effective method of describing the divot weight ejected under the conditions investigated.

  2. Probabilistic Multi-Factor Interaction Model for Complex Material Behavior

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Abumeri, Galib H.

    2008-01-01

    The Multi-Factor Interaction Model (MFIM) is used to evaluate the divot weight (foam weight ejected) from the launch external tanks. The multi-factor has sufficient degrees of freedom to evaluate a large number of factors that may contribute to the divot ejection. It also accommodates all interactions by its product form. Each factor has an exponent that satisfies only two points, the initial and final points. The exponent describes a monotonic path from the initial condition to the final. The exponent values are selected so that the described path makes sense in the absence of experimental data. In the present investigation the data used was obtained by testing simulated specimens in launching conditions. Results show that the MFIM is an effective method of describing the divot weight ejected under the conditions investigated.

  3. Probabilistic Modeling Of Ocular Biomechanics In VIIP: Risk Stratification

    NASA Technical Reports Server (NTRS)

    Feola, A.; Myers, J. G.; Raykin, J.; Nelson, E. S.; Mulugeta, L.; Samuels, B.; Ethier, C. R.

    2016-01-01

    Visual Impairment and Intracranial Pressure (VIIP) syndrome is a major health concern for long-duration space missions. Currently, it is thought that a cephalad fluid shift in microgravity causes elevated intracranial pressure (ICP) that is transmitted along the optic nerve sheath (ONS). We hypothesize that this in turn leads to alteration and remodeling of connective tissue in the posterior eye which impacts vision. Finite element (FE) analysis is a powerful tool for examining the effects of mechanical loads in complex geometries. Our goal is to build an FE analysis framework to understand the response of the lamina cribrosa and optic nerve head to elevations in ICP in VIIP. To simulate the effects of different pressures on tissues in the posterior eye, we developed a geometric model of the posterior eye and optic nerve sheath and used a Latin hypercube sampling/partial rank correlation coefficient (LHS/PRCC) approach to assess the influence of uncertainty in our input parameters (i.e. pressures and material properties) on the peak strains within the retina, lamina cribrosa and optic nerve. The LHS/PRCC approach was repeated for three relevant ICP ranges, corresponding to upright and supine posture on earth, and microgravity [1]. At each ICP condition we used intraocular pressure (IOP) and mean arterial pressure (MAP) measurements of in-flight astronauts provided by the Lifetime Surveillance of Astronaut Health Program, NASA Johnson Space Center. The lamina cribrosa, optic nerve, retinal vessel and retina were modeled as linear-elastic materials, while other tissues were modeled as a Mooney-Rivlin solid (representing ground substance, stiffness parameter c1) with embedded collagen fibers (stiffness parameters c3, c4 and c5). Geometry creation and mesh generation were done in Gmsh [2], while FEBio was used for all FE simulations [3]. The LHS/PRCC approach resulted in correlation coefficients in the range of -1 to 1. To assess the relative influence of the uncertainty in an input parameter on

  4. Probabilistic Structural Analysis and Reliability Using NESSUS With Implemented Material Strength Degradation Model

    NASA Technical Reports Server (NTRS)

    Bast, Callie C.; Jurena, Mark T.; Godines, Cody R.; Chamis, Christos C. (Technical Monitor)

    2001-01-01

    This project included both research and education objectives. The goal of this project was to advance innovative research and education objectives in theoretical and computational probabilistic structural analysis, reliability, and life prediction for improved reliability and safety of structural components of aerospace and aircraft propulsion systems. Research and education partners included Glenn Research Center (GRC) and Southwest Research Institute (SwRI) along with the University of Texas at San Antonio (UTSA). SwRI enhanced the NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) code and provided consulting support for NESSUS-related activities at UTSA. NASA funding supported three undergraduate students, two graduate students, a summer course instructor and the Principal Investigator. Matching funds from UTSA provided for the purchase of additional equipment for the enhancement of the Advanced Interactive Computational SGI Lab established during the first year of this Partnership Award to conduct the probabilistic finite element summer courses. The research portion of this report presents the culmination of work performed through the use of the probabilistic finite element program NESSUS and an embedded Material Strength Degradation (MSD) model. Probabilistic structural analysis provided for quantification of uncertainties associated with the design, thus enabling increased system performance and reliability. The structure examined was a Space Shuttle Main Engine (SSME) fuel turbopump blade. The blade material analyzed was Inconel 718, since the MSD model was previously calibrated for this material. Reliability analysis encompassing the effects of high temperature and high cycle fatigue, yielded a reliability value of 0.99978 using a fully correlated random field for the blade thickness. The reliability did not change significantly for a change in distribution type except for a change in

  5. Psychophysics of time perception and intertemporal choice models

    NASA Astrophysics Data System (ADS)

    Takahashi, Taiki; Oono, Hidemi; Radford, Mark H. B.

    2008-03-01

    Intertemporal choice and psychophysics of time perception have been attracting attention in econophysics and neuroeconomics. Several models have been proposed for intertemporal choice: exponential discounting, general hyperbolic discounting (exponential discounting with logarithmic time perception following the Weber-Fechner law, i.e. a q-exponential discount model based on Tsallis's statistics), simple hyperbolic discounting, and Stevens' power law-exponential discounting (exponential discounting with Stevens' power time perception). In order to examine the fitness of the models for behavioral data, we estimated the parameters and AICc (Akaike Information Criterion with small sample correction) of the intertemporal choice models by assessing the points of subjective equality (indifference points) at seven delays. Our results have shown that the order of goodness-of-fit for both group and individual data was [Weber-Fechner discounting (general hyperbola) > Stevens' power law discounting > simple hyperbolic discounting > exponential discounting], indicating that human time perception in intertemporal choice may follow the Weber-Fechner law. Implications of the results for neuropsychopharmacological treatments of addiction and for the biophysical processing underlying temporal discounting and time perception are discussed.
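
    As a sketch of how such a model comparison might be carried out, the code below fits exponential, simple hyperbolic, and q-exponential (general hyperbolic) discount functions to hypothetical indifference points at seven delays and ranks them by AICc; the data, starting values, and bounds are illustrative assumptions.

        import numpy as np
        from scipy.optimize import curve_fit

        delays = np.array([1, 7, 30, 90, 180, 365, 730], float)        # days
        indiff = np.array([0.95, 0.88, 0.72, 0.55, 0.43, 0.30, 0.21])  # fraction of delayed amount

        def exponential(t, k):
            return np.exp(-k * t)

        def hyperbolic(t, k):
            return 1.0 / (1.0 + k * t)

        def q_exponential(t, k, q):
            # General hyperbola; approaches exponential discounting as q -> 1.
            return (1.0 + (1.0 - q) * k * t) ** (1.0 / (q - 1.0))

        def aicc(residuals, n_params):
            n = residuals.size
            aic = n * np.log(np.sum(residuals ** 2) / n) + 2 * n_params
            return aic + 2 * n_params * (n_params + 1) / (n - n_params - 1)

        models = [
            ("exponential", exponential, [0.01], (1e-6, 1.0)),
            ("simple hyperbolic", hyperbolic, [0.01], (1e-6, 1.0)),
            ("q-exponential (general hyperbola)", q_exponential, [0.01, 0.5],
             ([1e-6, 0.0], [1.0, 0.99])),
        ]
        for name, f, p0, bounds in models:
            popt, _ = curve_fit(f, delays, indiff, p0=p0, bounds=bounds)
            res = indiff - f(delays, *popt)
            print(f"{name:35s} AICc = {aicc(res, len(popt)):7.1f}  params = {np.round(popt, 4)}")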

  6. Modeling Multiple Response Processes in Judgment and Choice

    ERIC Educational Resources Information Center

    Bockenholt, Ulf

    2012-01-01

    In this article, I show how item response models can be used to capture multiple response processes in psychological applications. Intuitive and analytical responses, agree-disagree answers, response refusals, socially desirable responding, differential item functioning, and choices among multiple options are considered. In each of these cases, I…

  7. Modelling the Reasons for Training Choices: Technical Paper. Support Document

    ERIC Educational Resources Information Center

    Smith, Andrew; Oczkowski, Eddie; Hill, Mark

    2009-01-01

    This report provides the technical details on the modelling aspects of identifying significant drivers for the reasons for using certain types of training and for the choice of training types. The employed data is from the 2005 Survey of Employer Use and Views of the VET system (SEUV). The data has previously been analysed in NCVER (2006). This…

  8. An application of probabilistic safety assessment methods to model aircraft systems and accidents

    SciTech Connect

    Martinez-Guridi, G.; Hall, R.E.; Fullwood, R.R.

    1998-08-01

    A case study modeling the thrust reverser system (TRS) in the context of the fatal accident of a Boeing 767 is presented to illustrate the application of Probabilistic Safety Assessment methods. A simplified risk model consisting of an event tree with supporting fault trees was developed to represent the progression of the accident, taking into account the interaction between the TRS and the operating crew during the accident, and the findings of the accident investigation. A feasible sequence of events leading to the fatal accident was identified. Several insights about the TRS and the accident were obtained by applying PSA methods. Changes proposed for the TRS also are discussed.

  9. A probabilistic model for risk assessment of residual host cell DNA in biological products.

    PubMed

    Yang, Harry; Zhang, Lanju; Galinski, Mark

    2010-04-26

    Biological products such as viral vaccines manufactured in cells contain residual DNA derived from host cell substrates used in production. It is theoretically possible that the residual DNA could transmit activated oncogenes and/or latent infectious viral genomes to subjects receiving the product, and induce oncogenic or infective events. A probabilistic model to estimate the risks due to residual DNA is proposed. The model takes account of enzyme inactivation process. It allows for more accurate risk assessment when compared to methods currently in use. An application of the method to determine safety factor of a vaccine product is provided.

  10. A probabilistic method for constructing wave time-series at inshore locations using model scenarios

    USGS Publications Warehouse

    Long, Joseph W.; Plant, Nathaniel G.; Dalyander, P. Soupy; Thompson, David M.

    2014-01-01

    Continuous time-series of wave characteristics (height, period, and direction) are constructed using a base set of model scenarios and simple probabilistic methods. This approach utilizes an archive of computationally intensive, highly spatially resolved numerical wave model output to develop time-series of historical or future wave conditions without performing additional, continuous numerical simulations. The archive of model output contains wave simulations from a set of model scenarios derived from an offshore wave climatology. Time-series of wave height, period, direction, and associated uncertainties are constructed at locations included in the numerical model domain. The confidence limits are derived using statistical variability of oceanographic parameters contained in the wave model scenarios. The method was applied to a region in the northern Gulf of Mexico and assessed using wave observations at 12 m and 30 m water depths. Prediction skill for significant wave height is 0.58 and 0.67 at the 12 m and 30 m locations, respectively, with similar performance for wave period and direction. The skill of this simplified, probabilistic time-series construction method is comparable to existing large-scale, high-fidelity operational wave models but provides higher spatial resolution output at low computational expense. The constructed time-series can be developed to support a variety of applications including climate studies and other situations where a comprehensive survey of wave impacts on the coastal area is of interest.

  11. Probabilistic Multi-Factor Interaction Model for Complex Material Behavior

    NASA Technical Reports Server (NTRS)

    Abumeri, Galib H.; Chamis, Christos C.

    2010-01-01

    Complex material behavior is represented by a single equation of product form to account for interaction among the various factors. The factors are selected by the physics of the problem and the environment that the model is to represent. For example, different factors will be required for each to represent temperature, moisture, erosion, corrosion, etc. It is important that the equation represent the physics of the behavior in its entirety accurately. The Multi-Factor Interaction Model (MFIM) is used to evaluate the divot weight (foam weight ejected) from the external launch tanks. The multi-factor has sufficient degrees of freedom to evaluate a large number of factors that may contribute to the divot ejection. It also accommodates all interactions by its product form. Each factor has an exponent that satisfies only two points - the initial and final points. The exponent describes a monotonic path from the initial condition to the final. The exponent values are selected so that the described path makes sense in the absence of experimental data. In the present investigation, the data used were obtained by testing simulated specimens in launching conditions. Results show that the MFIM is an effective method of describing the divot weight ejected under the conditions investigated. The problem lies in how to represent the divot weight with a single equation. A unique solution to this problem is a multi-factor equation of product form. Each factor is of the form (1 - x_i/x_f)^e_i, where x_i is the initial value, usually at ambient conditions, x_f the final value, and e_i the exponent that makes the represented curve unimodal while meeting the initial and final values. The exponents are either evaluated by test data or by technical judgment. A minor disadvantage may be the selection of exponents in the absence of any empirical data. This form has been used successfully in describing the foam ejected in simulated space environmental conditions. Seven factors were required
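
    A rough numerical sketch of the product form described above; the factor names, initial and final values, exponents, and baseline below are illustrative placeholders, not the calibrated values from the external-tank tests.

        import numpy as np

        def mfim(x, x_final, exponents, baseline=1.0):
            """Multi-factor interaction model: baseline * prod_i (1 - x_i/x_f,i)**e_i."""
            x, x_final, exponents = map(np.asarray, (x, x_final, exponents))
            return baseline * np.prod((1.0 - x / x_final) ** exponents)

        # Illustrative factors: temperature, pressure-drop rate, and void fraction,
        # each expressed relative to a hypothetical final (limit) value.
        current = [220.0, 3.0, 0.05]   # current values of the factors
        final = [300.0, 10.0, 0.20]    # final (limit) values
        e = [0.5, 1.5, 0.25]           # exponents chosen to give monotonic, physically sensible paths

        print(f"predicted divot-weight multiplier: {mfim(current, final, e):.3f}")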

  12. Medicare Care Choices Model Enables Concurrent Palliative and Curative Care.

    PubMed

    2015-01-01

    On July 20, 2015, the federal Centers for Medicare & Medicaid Services (CMS) announced the hospices selected to participate in the Medicare Care Choices Model. Fewer than half of Medicare beneficiaries use the hospice care for which they are eligible. Current Medicare regulations preclude concurrent palliative and curative care. Under the Medicare Care Choices Model, dually eligible Medicare beneficiaries may elect to receive supportive care services typically provided by hospice while continuing to receive curative services. This report describes how CMS has expanded the model from an originally anticipated 30 Medicare-certified hospices to over 140 Medicare-certified hospices and extended the duration of the model from 3 to 5 years. Medicare-certified hospice programs that will participate in the model are listed.

  13. A Generative Model for Probabilistic Label Fusion of Multimodal Data

    PubMed Central

    Iglesias, Juan Eugenio; Sabuncu, Mert Rory; Van Leemput, Koen

    2014-01-01

    The maturity of registration methods, in combination with the increasing processing power of computers, has made multi-atlas segmentation methods practical. The problem of merging the deformed label maps from the atlases is known as label fusion. Even though label fusion has been well studied for intramodality scenarios, it remains relatively unexplored when the nature of the target data is multimodal or when its modality is different from that of the atlases. In this paper, we review the literature on label fusion methods and also present an extension of our previously published algorithm to the general case in which the target data are multimodal. The method is based on a generative model that exploits the consistency of voxel intensities within the target scan based on the current estimate of the segmentation. Using brain MRI scans acquired with a multiecho FLASH sequence, we compare the method with majority voting, statistical-atlas-based segmentation, the popular package FreeSurfer and an adaptive local multi-atlas segmentation method. The results show that our approach produces highly accurate segmentations (Dice 86.3% across 22 brain structures of interest), outperforming the competing methods. PMID:25685856

  14. A probabilistic model of emphysema based on granulometry analysis

    NASA Astrophysics Data System (ADS)

    Marcos, J. V.; Nava, R.; Cristobal, G.; Munoz-Barrutia, A.; Escalante-Ramírez, B.; Ortiz-de-Solórzano, C.

    2013-11-01

    Emphysema is associated with the destruction of lung parenchyma, resulting in abnormal enlargement of airspaces. Accurate quantification of emphysema is required for a better understanding of the disease as well as for the assessment of drugs and treatments. In the present study, a novel method for emphysema characterization from histological lung images is proposed. Mice in which emphysema was induced with elastase were used to simulate the effect of the disease on the lungs. A database composed of 50 normal and 50 emphysematous lung patches of size 512 x 512 pixels was used in our experiments. The purpose is to automatically identify those patches containing emphysematous tissue. The proposed approach is based on the use of granulometry analysis, which provides the pattern spectrum describing the distribution of airspaces in the lung region under evaluation. The profile of the spectrum was summarized by a set of statistical features. A logistic regression model was then used to estimate the probability that a patch is emphysematous from this feature set. An accuracy of 87% was achieved by our method in the classification between normal and emphysematous samples. This result shows the utility of our granulometry-based method to quantify the lesions due to emphysema.

  15. Probabilistic topic modeling for the analysis and classification of genomic sequences

    PubMed Central

    2015-01-01

    Background Studies on genomic sequences for classification and taxonomic identification have a leading role in the biomedical field and in the analysis of biodiversity. These studies are focusing on the so-called barcode genes, representing a well-defined region of the whole genome. Recently, alignment-free techniques are gaining more importance because they are able to overcome the drawbacks of sequence alignment techniques. In this paper a new alignment-free method for DNA sequences clustering and classification is proposed. The method is based on k-mers representation and text mining techniques. Methods The presented method is based on Probabilistic Topic Modeling, a statistical technique originally proposed for text documents. Probabilistic topic models are able to find in a document corpus the topics (recurrent themes) characterizing classes of documents. This technique, applied on DNA sequences representing the documents, exploits the frequency of fixed-length k-mers and builds a generative model for a training group of sequences. This generative model, obtained through the Latent Dirichlet Allocation (LDA) algorithm, is then used to classify a large set of genomic sequences. Results and conclusions We performed classification of over 7000 16S DNA barcode sequences taken from the Ribosomal Database Project (RDP) repository, training probabilistic topic models. The proposed method is compared to the RDP tool and the Support Vector Machine (SVM) classification algorithm in an extensive set of trials using both complete sequences and short sequence snippets (from 400 bp to 25 bp). Our method achieves results very similar to the RDP classifier and SVM for complete sequences. The most interesting results are obtained when short sequence snippets are considered. Under these conditions the proposed method outperforms RDP and SVM on ultra-short sequences, and it exhibits a smooth decrease of performance, at every taxonomic level, when the sequence length is decreased. PMID:25916734
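
    A compact sketch of the k-mer/topic-model pipeline using scikit-learn; the toy sequences, k = 4, and the number of topics are arbitrary choices for illustration, not the settings used on the RDP barcode data.

        import numpy as np
        from sklearn.feature_extraction.text import CountVectorizer
        from sklearn.decomposition import LatentDirichletAllocation

        # Toy DNA sequences standing in for 16S barcode reads.
        sequences = [
            "ATGCGTACGTTAGCATGCGTACGTTAGC",
            "ATGCGTACGTTAGCATGCGTACGATAGC",
            "GGGTTTCCCAAAGGGTTTCCCAAATTTG",
            "GGGTTTCCCAAAGGGTTTCCCAAATTGG",
        ]

        # Represent each sequence by its k-mer (k = 4) counts.
        vectorizer = CountVectorizer(analyzer="char", ngram_range=(4, 4), lowercase=False)
        X = vectorizer.fit_transform(sequences)

        # Fit an LDA topic model over the k-mer "vocabulary".
        lda = LatentDirichletAllocation(n_components=2, random_state=0)
        doc_topics = lda.fit_transform(X)

        # Per-sequence topic mixtures, usable as features for downstream classification.
        print(np.round(doc_topics, 2))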

  16. Probabilistic modeling of percutaneous absorption for risk-based exposure assessments and transdermal drug delivery.

    SciTech Connect

    Ho, Clifford Kuofei

    2004-06-01

    Chemical transport through human skin can play a significant role in human exposure to toxic chemicals in the workplace, as well as to chemical/biological warfare agents in the battlefield. The viability of transdermal drug delivery also relies on chemical transport processes through the skin. Models of percutaneous absorption are needed for risk-based exposure assessments and drug-delivery analyses, but previous mechanistic models have been largely deterministic. A probabilistic, transient, three-phase model of percutaneous absorption of chemicals has been developed to assess the relative importance of uncertain parameters and processes that may be important to risk-based assessments. Penetration routes through the skin that were modeled include the following: (1) intercellular diffusion through the multiphase stratum corneum; (2) aqueous-phase diffusion through sweat ducts; and (3) oil-phase diffusion through hair follicles. Uncertainty distributions were developed for the model parameters, and a Monte Carlo analysis was performed to simulate probability distributions of mass fluxes through each of the routes. Sensitivity analyses using stepwise linear regression were also performed to identify model parameters that were most important to the simulated mass fluxes at different times. This probabilistic analysis of percutaneous absorption (PAPA) method has been developed to improve risk-based exposure assessments and transdermal drug-delivery analyses, where parameters and processes can be highly uncertain.
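
    The overall Monte Carlo and sensitivity logic can be sketched as below, with three parallel penetration routes whose permeabilities carry lognormal uncertainty; the parameter values and the Spearman-rank sensitivity measure are illustrative stand-ins, not the PAPA model itself.

        import numpy as np
        from scipy.stats import spearmanr

        rng = np.random.default_rng(3)
        n = 20000

        # Uncertain permeability coefficients (cm/h) for the three routes, lognormally distributed.
        kp_sc = rng.lognormal(mean=np.log(1e-3), sigma=0.6, size=n)     # intercellular stratum corneum
        kp_sweat = rng.lognormal(mean=np.log(2e-4), sigma=0.8, size=n)  # aqueous sweat-duct route
        kp_hair = rng.lognormal(mean=np.log(5e-4), sigma=0.8, size=n)   # oil-phase hair-follicle route

        # Fractional skin areas occupied by the appendageal routes (illustrative values).
        f_sweat, f_hair = 1e-3, 1e-2
        conc = 10.0   # exposure concentration, mg/cm^3
        area = 100.0  # exposed skin area, cm^2

        # Steady-state flux (mg/h) summed over the parallel routes.
        flux = conc * area * ((1 - f_sweat - f_hair) * kp_sc
                              + f_sweat * kp_sweat + f_hair * kp_hair)

        print(f"median flux: {np.median(flux):.3f} mg/h, "
              f"95th percentile: {np.percentile(flux, 95):.3f} mg/h")

        # Rank-correlation sensitivity of the flux to each uncertain permeability.
        for name, kp in [("stratum corneum", kp_sc), ("sweat duct", kp_sweat), ("hair follicle", kp_hair)]:
            rho = spearmanr(kp, flux).correlation
            print(f"sensitivity to {name}: rho = {rho:.2f}")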

  17. SHM-Based Probabilistic Fatigue Life Prediction for Bridges Based on FE Model Updating.

    PubMed

    Lee, Young-Joo; Cho, Soojin

    2016-03-02

    Fatigue life prediction for a bridge should be based on the current condition of the bridge, and various sources of uncertainty, such as material properties, anticipated vehicle loads and environmental conditions, make the prediction very challenging. This paper presents a new approach for probabilistic fatigue life prediction for bridges using finite element (FE) model updating based on structural health monitoring (SHM) data. Recently, various types of SHM systems have been used to monitor and evaluate the long-term structural performance of bridges. For example, SHM data can be used to estimate the degradation of an in-service bridge, which makes it possible to update the initial FE model. The proposed method consists of three steps: (1) identifying the modal properties of a bridge, such as mode shapes and natural frequencies, based on the ambient vibration under passing vehicles; (2) updating the structural parameters of an initial FE model using the identified modal properties; and (3) predicting the probabilistic fatigue life using the updated FE model. The proposed method is demonstrated by application to a numerical model of a bridge, and the impact of FE model updating on the bridge fatigue life is discussed.

  18. SHM-Based Probabilistic Fatigue Life Prediction for Bridges Based on FE Model Updating

    PubMed Central

    Lee, Young-Joo; Cho, Soojin

    2016-01-01

    Fatigue life prediction for a bridge should be based on the current condition of the bridge, and various sources of uncertainty, such as material properties, anticipated vehicle loads and environmental conditions, make the prediction very challenging. This paper presents a new approach for probabilistic fatigue life prediction for bridges using finite element (FE) model updating based on structural health monitoring (SHM) data. Recently, various types of SHM systems have been used to monitor and evaluate the long-term structural performance of bridges. For example, SHM data can be used to estimate the degradation of an in-service bridge, which makes it possible to update the initial FE model. The proposed method consists of three steps: (1) identifying the modal properties of a bridge, such as mode shapes and natural frequencies, based on the ambient vibration under passing vehicles; (2) updating the structural parameters of an initial FE model using the identified modal properties; and (3) predicting the probabilistic fatigue life using the updated FE model. The proposed method is demonstrated by application to a numerical model of a bridge, and the impact of FE model updating on the bridge fatigue life is discussed. PMID:26950125

  19. A Simplified Model of Choice Behavior under Uncertainty

    PubMed Central

    Lin, Ching-Hung; Lin, Yu-Kai; Song, Tzu-Jiun; Huang, Jong-Tsun; Chiu, Yao-Chu

    2016-01-01

    The Iowa Gambling Task (IGT) has been standardized as a clinical assessment tool (Bechara, 2007). Nonetheless, numerous research groups have attempted to modify IGT models to optimize parameters for predicting the choice behavior of normal controls and patients. A decade ago, most researchers considered the expected utility (EU) model (Busemeyer and Stout, 2002) to be the optimal model for predicting choice behavior under uncertainty. However, in recent years, studies have demonstrated that models with the prospect utility (PU) function are more effective than the EU models in the IGT (Ahn et al., 2008). Nevertheless, after some preliminary tests based on our behavioral dataset and modeling, it was determined that the Ahn et al. (2008) PU model is not optimal due to some incompatible results. This study aims to modify the Ahn et al. (2008) PU model to a simplified model and uses the IGT performance of 145 subjects as the benchmark data for comparison. In our simplified PU model, the best goodness-of-fit was found mostly as the value of α approached zero. More specifically, we retested the key parameters α, λ, and A in the PU model. Notably, the influence of the parameters α, λ, and A has a hierarchical power structure in terms of manipulating the goodness-of-fit in the PU model. Additionally, we found that the parameters λ and A may be ineffective when the parameter α is close to zero in the PU model. The present simplified model demonstrated that decision makers mostly adopted the strategy of gain-stay loss-shift rather than foreseeing the long-term outcome. However, there are other behavioral variables that are not well revealed under these dynamic-uncertainty situations. Therefore, the optimal behavioral models may not have been found yet. In short, the best model for predicting choice behavior under dynamic-uncertainty situations should be further evaluated. PMID:27582715

  20. A Simplified Model of Choice Behavior under Uncertainty.

    PubMed

    Lin, Ching-Hung; Lin, Yu-Kai; Song, Tzu-Jiun; Huang, Jong-Tsun; Chiu, Yao-Chu

    2016-01-01

    The Iowa Gambling Task (IGT) has been standardized as a clinical assessment tool (Bechara, 2007). Nonetheless, numerous research groups have attempted to modify IGT models to optimize parameters for predicting the choice behavior of normal controls and patients. A decade ago, most researchers considered the expected utility (EU) model (Busemeyer and Stout, 2002) to be the optimal model for predicting choice behavior under uncertainty. However, in recent years, studies have demonstrated that models with the prospect utility (PU) function are more effective than the EU models in the IGT (Ahn et al., 2008). Nevertheless, after some preliminary tests based on our behavioral dataset and modeling, it was determined that the Ahn et al. (2008) PU model is not optimal due to some incompatible results. This study aims to modify the Ahn et al. (2008) PU model to a simplified model and uses the IGT performance of 145 subjects as the benchmark data for comparison. In our simplified PU model, the best goodness-of-fit was found mostly as the value of α approached zero. More specifically, we retested the key parameters α, λ, and A in the PU model. Notably, the influence of the parameters α, λ, and A has a hierarchical power structure in terms of manipulating the goodness-of-fit in the PU model. Additionally, we found that the parameters λ and A may be ineffective when the parameter α is close to zero in the PU model. The present simplified model demonstrated that decision makers mostly adopted the strategy of gain-stay loss-shift rather than foreseeing the long-term outcome. However, there are other behavioral variables that are not well revealed under these dynamic-uncertainty situations. Therefore, the optimal behavioral models may not have been found yet. In short, the best model for predicting choice behavior under dynamic-uncertainty situations should be further evaluated.
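
    A minimal sketch of a prospect-utility/delta-rule learner of the kind compared above, with a softmax choice rule; the parameter values (α, λ, A, consistency) and the payoff scheme are simplified placeholders, not the authors' fitted model or the actual IGT payoff schedule.

        import numpy as np

        rng = np.random.default_rng(4)
        alpha, lam, A, c = 0.1, 1.5, 0.3, 1.0   # shape, loss aversion, learning rate, consistency

        def utility(x):
            # Prospect utility: concave for gains, loss-averse for losses.
            return x ** alpha if x >= 0 else -lam * (-x) ** alpha

        def payoff(deck):
            # Simplified 4-deck task: decks 0-1 are "bad" (large gains, larger occasional losses).
            gain = [100, 100, 50, 50][deck]
            loss = [-250, -250, -50, -50][deck] if rng.random() < 0.5 else 0
            return gain + loss

        Q = np.zeros(4)                   # learned expectancies per deck
        choices = []
        for trial in range(100):
            z = np.exp(c * Q - np.max(c * Q))
            p = z / z.sum()                                  # softmax choice probabilities
            deck = rng.choice(4, p=p)
            Q[deck] += A * (utility(payoff(deck)) - Q[deck])  # delta learning rule
            choices.append(deck)

        print("proportion of 'good' deck choices:", np.mean(np.array(choices) >= 2))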

  1. A probabilistic modeling approach in thermal inactivation: estimation of postprocess Bacillus cereus spore prevalence and concentration.

    PubMed

    Membré, J M; Amézquita, A; Bassett, J; Giavedoni, P; Blackburn, C de W; Gorris, L G M

    2006-01-01

    The survival of spore-forming bacteria is linked to the safety and stability of refrigerated processed foods of extended durability (REPFEDs). A probabilistic modeling approach was used to assess the prevalence and concentration of Bacillus cereus spores surviving heat treatment for a semiliquid chilled food product. This product received heat treatment to inactivate nonproteolytic Clostridium botulinum during manufacture and was designed to be kept at refrigerator temperature postmanufacture. As key inputs for the modeling, the assessment took into consideration the following factors: (i) contamination frequency (prevalence) and level (concentration) of both psychrotrophic and mesophilic strains of B. cereus, (ii) heat resistance of both types (expressed as decimal reduction times at 90 degrees C), and (iii) intrapouch variability of thermal kinetics during heat processing (expressed as the time spent at 90 degrees C). These three inputs were established as statistical distributions using expert opinion, literature data, and specific modeling, respectively. They were analyzed in a probabilistic model in which the outputs, expressed as distributions as well, were the proportion of the contaminated pouches (the likely prevalence) and the number of spores in the contaminated pouches (the likely concentration). The prevalence after thermal processing was estimated to be 11 and 49% for psychrotrophic and mesophilic strains, respectively. In the positive pouches, the bacterial concentration (considering psychrotrophic and mesophilic strains combined) was estimated to be 30 CFU/g (95th percentile). Such a probabilistic approach seems promising to help in (i) optimizing heat processes, (ii) identifying which key factor(s) to control, and (iii) providing information for subsequent assessment of B. cereus resuscitation and growth.
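
    The structure of such an assessment can be sketched as a Monte Carlo over the three inputs named above: raw-material prevalence/concentration, heat resistance (D-value at 90 degrees C), and time spent at 90 degrees C; every distribution below is an illustrative assumption, not an elicited value from the study.

        import numpy as np

        rng = np.random.default_rng(5)
        n = 100000
        pouch_mass_g = 300.0

        # (i) Raw-material contamination: prevalence and concentration of spores (illustrative).
        prevalence_in = 0.30
        log_conc = rng.normal(-1.0, 0.8, n)          # log10 CFU/g in contaminated pouches
        contaminated = rng.random(n) < prevalence_in

        # (ii) Heat resistance: decimal reduction time at 90 C, minutes (illustrative).
        D90 = rng.lognormal(np.log(8.0), 0.4, n)

        # (iii) Process lethality: equivalent time at 90 C, minutes (intra-pouch variability).
        t90 = rng.normal(10.0, 2.0, n).clip(min=0)

        # Surviving spores per pouch: log-linear inactivation, then Poisson sampling.
        n0 = np.where(contaminated, 10 ** log_conc * pouch_mass_g, 0.0)
        expected_survivors = n0 * 10 ** (-t90 / D90)
        survivors = rng.poisson(expected_survivors)

        prevalence_out = np.mean(survivors > 0)
        positive = survivors[survivors > 0]
        print(f"post-process prevalence: {prevalence_out:.3f}")
        print(f"95th percentile concentration in positive pouches: "
              f"{np.percentile(positive / pouch_mass_g, 95):.2f} CFU/g")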

  2. Probabilistic investigation of sensitivities of advanced test-analysis model correlation methods

    NASA Astrophysics Data System (ADS)

    Bergman, Elizabeth J.; Allen, Matthew S.; Kammer, Daniel C.; Mayes, Randall L.

    2010-06-01

    The industry standard method used to validate finite element models involves correlation of test and analysis mode shapes using reduced Test-Analysis Models (TAMs). Some organizations even require this model validation approach. Considerable effort is required to choose sensor locations and to create a suitable TAM so that the test and analysis mode shapes will be orthogonal to within the required tolerance. This work uses a probabilistic framework to understand and quantify the effect of small errors in the test mode shapes on test-analysis orthogonality. Using the proposed framework, test-orthogonality is a probabilistic metric and the problem becomes one of choosing sensor placement and TAM generation techniques that assure that the orthogonality has a high probability of being within an acceptable range if the model is correct, even though the test measurements are contaminated with random errors. A simple analytical metric is derived that is shown to give a good estimate of the sensitivity of a TAM to errors in the test mode shapes for a certain noise model. These ideas are then applied to a generic satellite system, using TAMs generated by the Static, Modal and Improved Reduced System (IRS) reduction methods. Experimental errors are simulated for a set of mode shapes and Monte Carlo simulation is used to estimate the probability that the orthogonality metric exceeds a threshold due to experimental error alone. For the satellite system considered here, the orthogonality calculation is highly sensitive to experimental errors, so a set of noisy mode shapes has a small probability of passing the orthogonality criteria for some of the TAMs. A number of sensor placement techniques are used in this study, and the comparison reveals that, for this system, the Modal TAM is twice as sensitive to errors on the test mode shapes when it is created on a sensor set optimized for the Static TAM rather than one that was optimized specifically for the Modal TAM. These findings
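
    The sensitivity being quantified can be illustrated with a Monte Carlo sketch that assumes mass-normalized analysis modes (identity weighting), so that the test-analysis cross-orthogonality reduces to the product of the test and FEM mode-shape matrices; the mode count, noise level, and 0.1 off-diagonal threshold are illustrative assumptions rather than values from the satellite study.

        import numpy as np

        rng = np.random.default_rng(6)
        n_dof, n_modes = 60, 6

        # Hypothetical "true" FEM mode shapes, taken as an orthonormal basis (identity mass weighting).
        phi_fem = np.linalg.qr(rng.normal(size=(n_dof, n_modes)))[0]

        threshold = 0.10     # maximum acceptable off-diagonal cross-orthogonality
        noise_sigma = 0.03   # magnitude of experimental error on the test shapes
        n_trials = 5000
        failures = 0
        for _ in range(n_trials):
            # Test shapes = true shapes contaminated with random measurement error, re-normalized.
            phi_test = phi_fem + noise_sigma * rng.normal(size=phi_fem.shape)
            phi_test /= np.linalg.norm(phi_test, axis=0)
            ortho = phi_test.T @ phi_fem
            off_diag = ortho - np.diag(np.diag(ortho))
            failures += np.abs(off_diag).max() > threshold

        print(f"probability of failing the orthogonality criterion "
              f"due to measurement error alone: {failures / n_trials:.3f}")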

  3. Binary choices in small and large groups: A unified model

    NASA Astrophysics Data System (ADS)

    Bischi, Gian-Italo; Merlone, Ugo

    2010-02-01

    Two different ways to model the diffusion of alternative choices within a population of individuals in the presence of social externalities are known in the literature. While Galam’s model of rumors spreading considers a majority rule for interactions in several groups, Schelling considers individuals interacting in one large group, with payoff functions that describe how collective choices influence individual preferences. We incorporate these two approaches into a unified general discrete-time dynamic model for studying individual interactions in variously sized groups. We first illustrate how the two original models can be obtained as particular cases of the more general model we propose, then we show how several other situations can be analyzed. The model we propose goes beyond a theoretical exercise as it allows modeling situations which are relevant in economic and social systems. We consider also other aspects such as the propensity to switch choices and the behavioral momentum, and show how they may affect the dynamics of the whole population.
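
    A minimal sketch of a discrete-time binary-choice map with social externalities and a switching propensity, in the spirit of the unified model; the payoff functions, parameters, and switching rule below are illustrative assumptions, not the model of the paper.

        import numpy as np

        def payoff_A(x):   # payoff of choice A when a fraction x of the population chooses A
            return 0.2 + 0.8 * x          # A benefits from being popular (positive externality)

        def payoff_B(x):
            return 0.9 - 0.6 * x          # B is better when few choose A

        def step(x, lam=0.3):
            """One period: a fraction lam of agents revises its choice toward the better option."""
            prefer_A = 1.0 if payoff_A(x) > payoff_B(x) else 0.0
            return (1 - lam) * x + lam * prefer_A

        x = 0.2   # initial fraction choosing A
        trajectory = [x]
        for _ in range(40):
            x = step(x)
            trajectory.append(x)

        # With these payoffs the population tips toward A only if it starts above x = 0.5.
        print(np.round(trajectory[:10], 3), "... ->", round(trajectory[-1], 3))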

  4. Statistical shape analysis of the human spleen geometry for probabilistic occupant models.

    PubMed

    Yates, Keegan M; Lu, Yuan-Chiao; Untaroiu, Costin D

    2016-06-14

    Statistical shape models are an effective way to create computational models of human organs that can incorporate inter-subject geometrical variation. The main objective of this study was to create statistical mean and boundary models of the human spleen in an occupant posture. Principal component analysis was applied to fifteen human spleens in order to find the statistical modes of variation, mean shape, and boundary models. A landmark sliding approach was utilized to refine the landmarks to obtain a better shape correspondence and create a better representation of the underlying shape contour. The first mode of variation was found to be the overall volume, and it accounted for 69% of the total variation. The mean model and boundary models could be used to develop probabilistic finite element (FE) models which may identify the risk of spleen injury during vehicle collisions and consequently help to improve automobile safety systems.
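
    The landmark-based construction can be sketched as follows: stack each subject's landmark coordinates into a row vector, compute the mean shape and principal components, and build boundary models along the dominant mode; the random "landmarks" below are placeholders for registered spleen surfaces, not the study's data.

        import numpy as np

        rng = np.random.default_rng(7)
        n_subjects, n_landmarks = 15, 200

        # Placeholder landmark sets (n_subjects x n_landmarks x 3), assumed already aligned.
        shapes = rng.normal(size=(n_subjects, n_landmarks, 3))
        X = shapes.reshape(n_subjects, -1)           # one row of stacked (x, y, z) per subject

        mean_shape = X.mean(axis=0)
        U, s, Vt = np.linalg.svd(X - mean_shape, full_matrices=False)
        variance = s ** 2 / (n_subjects - 1)
        explained = variance / variance.sum()
        print(f"variance explained by the first mode: {explained[0]:.2%}")

        # Boundary models: mean shape +/- 3 standard deviations along the first mode.
        mode1 = Vt[0]
        upper = (mean_shape + 3 * np.sqrt(variance[0]) * mode1).reshape(n_landmarks, 3)
        lower = (mean_shape - 3 * np.sqrt(variance[0]) * mode1).reshape(n_landmarks, 3)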

  5. Spike-based probabilistic inference in analog graphical models using interspike-interval coding.

    PubMed

    Steimer, Andreas; Douglas, Rodney

    2013-09-01

    Temporal spike codes play a crucial role in neural information processing. In particular, there is strong experimental evidence that interspike intervals (ISIs) are used for stimulus representation in neural systems. However, very few algorithmic principles exploit the benefits of such temporal codes for probabilistic inference of stimuli or decisions. Here, we describe and rigorously prove the functional properties of a spike-based processor that uses ISI distributions to perform probabilistic inference. The abstract processor architecture serves as a building block for more concrete, neural implementations of the belief-propagation (BP) algorithm in arbitrary graphical models (e.g., Bayesian networks and factor graphs). The distributed nature of graphical models matches well with the architectural and functional constraints imposed by biology. In our model, ISI distributions represent the BP messages exchanged between factor nodes, leading to the interpretation of a single spike as a random sample that follows such a distribution. We verify the abstract processor model by numerical simulation in full graphs, and demonstrate that it can be applied even in the presence of analog variables. As a particular example, we also show results of a concrete, neural implementation of the processor, although in principle our approach is more flexible and allows different neurobiological interpretations. Furthermore, electrophysiological data from area LIP during behavioral experiments are assessed in light of ISI coding, leading to concrete testable, quantitative predictions and a more accurate description of these data compared to hitherto existing models.

  6. Propagating Water Quality Analysis Uncertainty Into Resource Management Decisions Through Probabilistic Modeling

    NASA Astrophysics Data System (ADS)

    Gronewold, A. D.; Wolpert, R. L.; Reckhow, K. H.

    2007-12-01

    Most probable number (MPN) and colony-forming-unit (CFU) are two estimates of fecal coliform bacteria concentration commonly used as measures of water quality in United States shellfish harvesting waters. The MPN is the maximum likelihood estimate (or MLE) of the true fecal coliform concentration based on counts of non-sterile tubes in serial dilution of a sample aliquot, indicating bacterial metabolic activity. The CFU is the MLE of the true fecal coliform concentration based on the number of bacteria colonies emerging on a growth plate after inoculation from a sample aliquot. Each estimating procedure has intrinsic variability and is subject to additional uncertainty arising from minor variations in experimental protocol. Several versions of each procedure (using different sized aliquots or different numbers of tubes, for example) are in common use, each with its own levels of probabilistic and experimental error and uncertainty. It has been observed empirically that the MPN procedure is more variable than the CFU procedure, and that MPN estimates are somewhat higher on average than CFU estimates, on split samples from the same water bodies. We construct a probabilistic model that provides a clear theoretical explanation for the observed variability in, and discrepancy between, MPN and CFU measurements. We then explore how this variability and uncertainty might propagate into shellfish harvesting area management decisions through a two-phased modeling strategy. First, we apply our probabilistic model in a simulation-based analysis of future water quality standard violation frequencies under alternative land use scenarios, such as those evaluated under guidelines of the total maximum daily load (TMDL) program. Second, we apply our model to water quality data from shellfish harvesting areas which at present are closed (either conditionally or permanently) to shellfishing, to determine if alternative laboratory analysis procedures might have led to different
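
    The MPN is itself a maximum likelihood estimate and can be computed directly from the tube counts, which makes the intrinsic variability of the procedure easy to explore numerically; the 3-dilution, 5-tube design and the counts below are illustrative, not data from the study.

        import numpy as np
        from scipy.optimize import minimize_scalar

        # Serial-dilution design: sample volume per tube (mL) and tubes per dilution.
        volumes = np.array([10.0, 1.0, 0.1])
        n_tubes = np.array([5, 5, 5])
        positives = np.array([5, 3, 1])        # observed non-sterile (positive) tubes

        def neg_log_likelihood(log_conc):
            conc = 10 ** log_conc                    # organisms per mL
            p_pos = 1.0 - np.exp(-conc * volumes)    # P(tube positive) under Poisson sampling
            p_pos = np.clip(p_pos, 1e-12, 1 - 1e-12)
            ll = positives * np.log(p_pos) + (n_tubes - positives) * np.log(1 - p_pos)
            return -ll.sum()

        res = minimize_scalar(neg_log_likelihood, bounds=(-3, 3), method="bounded")
        print(f"MPN (maximum likelihood estimate): {10 ** res.x:.2f} organisms per mL")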

  7. Building macroscale models from microscale probabilistic models: a general probabilistic approach for nonlinear diffusion and multispecies phenomena.

    PubMed

    Penington, Catherine J; Hughes, Barry D; Landman, Kerry A

    2011-10-01

    A discrete agent-based model on a periodic lattice of arbitrary dimension is considered. Agents move to nearest-neighbor sites by a motility mechanism accounting for general interactions, which may include volume exclusion. The partial differential equation describing the average occupancy of the agent population is derived systematically. A diffusion equation arises for all types of interactions and is nonlinear except for the simplest interactions. In addition, multiple species of interacting subpopulations give rise to an advection-diffusion equation for each subpopulation. This work extends and generalizes previous specific results, providing a construction method for determining the transport coefficients in terms of a single conditional transition probability, which depends on the occupancy of sites in an influence region. These coefficients characterize the diffusion of agents in a crowded environment in biological and physical processes.

  8. Value learning and arousal in the extinction of probabilistic rewards: the role of dopamine in a modified temporal difference model.

    PubMed

    Song, Minryung R; Fellous, Jean-Marc

    2014-01-01

    Because most rewarding events are probabilistic and changing, the extinction of probabilistic rewards is important for survival. It has been proposed that the extinction of probabilistic rewards depends on arousal and the amount of learning of reward values. Midbrain dopamine neurons were suggested to play a role in both arousal and learning reward values. Despite extensive research on modeling dopaminergic activity in reward learning (e.g. temporal difference models), few studies have been done on modeling its role in arousal. Although temporal difference models capture key characteristics of dopaminergic activity during the extinction of deterministic rewards, they have been less successful at simulating the extinction of probabilistic rewards. By adding an arousal signal to a temporal difference model, we were able to simulate the extinction of probabilistic rewards and its dependence on the amount of learning. Our simulations propose that arousal allows the probability of reward to have lasting effects on the updating of reward value, which slows the extinction of low probability rewards. Using this model, we predicted that, by signaling the prediction error, dopamine determines the learned reward value that has to be extinguished during extinction and participates in regulating the size of the arousal signal that controls the learning rate. These predictions were supported by pharmacological experiments in rats.
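
    A hedged sketch of the mechanism suggested above: a TD(0)-style value update in which, during extinction, the effective learning rate is scaled by an "arousal" term assumed here to grow with the learned reward value, so that sparsely rewarded cues extinguish more slowly. The specific coupling and all parameter values are illustrative assumptions, not the authors' model.

```python
import numpy as np

rng = np.random.default_rng(2)

def acquire_then_extinguish(p_reward, n_acq=200, n_ext=200, alpha=0.1):
    """TD(0)-style value updates for a single predictive cue. During extinction
    the effective learning rate is scaled by an assumed 'arousal' signal that
    grows with the learned value, so sparsely rewarded cues extinguish slowly."""
    V = 0.0
    trace = []
    for t in range(n_acq + n_ext):
        r = float(rng.random() < p_reward) if t < n_acq else 0.0
        delta = r - V                          # prediction error (dopamine-like signal)
        arousal = 1.0 if t < n_acq else V      # assumed arousal term during extinction
        V += alpha * arousal * delta
        trace.append(V)
    return np.array(trace)

for p in (1.0, 0.5, 0.25):
    trace = acquire_then_extinguish(p)
    print(f"p(reward)={p:.2f}: value after 50 extinction trials = {trace[249]:.3f}")
```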

  9. A Probabilistic Model for Estimating the Depth and Threshold Temperature of C-fiber Nociceptors

    PubMed Central

    Dezhdar, Tara; Moshourab, Rabih A.; Fründ, Ingo; Lewin, Gary R.; Schmuker, Michael

    2015-01-01

    The subjective experience of thermal pain follows the detection and encoding of noxious stimuli by primary afferent neurons called nociceptors. However, nociceptor morphology has been hard to access and the mechanisms of signal transduction remain unresolved. In order to understand how heat transducers in nociceptors are activated in vivo, it is important to estimate the temperatures that directly activate the skin-embedded nociceptor membrane. Hence, the nociceptor’s temperature threshold must be estimated, which in turn will depend on the depth at which transduction happens in the skin. Since the temperature at the receptor cannot be accessed experimentally, such an estimation can currently only be achieved through modeling. However, the current state-of-the-art model to estimate temperature at the receptor suffers from the fact that it cannot account for the natural stochastic variability of neuronal responses. We improve this model using a probabilistic approach which accounts for uncertainties and potential noise in the system. Using a data set of 24 C-fibers recorded in vitro, we show that, even without detailed knowledge of the bio-thermal properties of the system, the probabilistic model that we propose here is capable of providing estimates of threshold and depth in cases where the classical method fails. PMID:26638830

  10. Assessing uncertainties in global cropland futures using a conditional probabilistic modelling framework

    NASA Astrophysics Data System (ADS)

    Engström, Kerstin; Olin, Stefan; Rounsevell, Mark D. A.; Brogaard, Sara; van Vuuren, Detlef P.; Alexander, Peter; Murray-Rust, Dave; Arneth, Almut

    2016-11-01

    We present a modelling framework to simulate probabilistic futures of global cropland areas that are conditional on the SSP (shared socio-economic pathway) scenarios. Simulations are based on the Parsimonious Land Use Model (PLUM) linked with the global dynamic vegetation model LPJ-GUESS (Lund-Potsdam-Jena General Ecosystem Simulator) using socio-economic data from the SSPs and climate data from the RCPs (representative concentration pathways). The simulated range of global cropland is 893-2380 Mha in 2100 (± 1 standard deviation), with the main uncertainties arising from differences in the socio-economic conditions prescribed by the SSP scenarios and the assumptions that underpin the translation of qualitative SSP storylines into quantitative model input parameters. Uncertainties in the assumptions for population growth, technological change and cropland degradation were found to be the most important for global cropland, while uncertainty in food consumption had less influence on the results. The uncertainties arising from climate variability and the differences between climate change scenarios do not strongly affect the range of global cropland futures. Some overlap occurred across all of the conditional probabilistic futures, except for those based on SSP3. We conclude that completely different socio-economic and climate change futures, although sharing low to medium population development, can result in very similar cropland areas on the aggregated global scale.

  11. A Probabilistic Model for Estimating the Depth and Threshold Temperature of C-fiber Nociceptors

    NASA Astrophysics Data System (ADS)

    Dezhdar, Tara; Moshourab, Rabih A.; Fründ, Ingo; Lewin, Gary R.; Schmuker, Michael

    2015-12-01

    The subjective experience of thermal pain follows the detection and encoding of noxious stimuli by primary afferent neurons called nociceptors. However, nociceptor morphology has been hard to access and the mechanisms of signal transduction remain unresolved. In order to understand how heat transducers in nociceptors are activated in vivo, it is important to estimate the temperatures that directly activate the skin-embedded nociceptor membrane. Hence, the nociceptor’s temperature threshold must be estimated, which in turn will depend on the depth at which transduction happens in the skin. Since the temperature at the receptor cannot be accessed experimentally, such an estimation can currently only be achieved through modeling. However, the current state-of-the-art model to estimate temperature at the receptor suffers from the fact that it cannot account for the natural stochastic variability of neuronal responses. We improve this model using a probabilistic approach which accounts for uncertainties and potential noise in the system. Using a data set of 24 C-fibers recorded in vitro, we show that, even without detailed knowledge of the bio-thermal properties of the system, the probabilistic model that we propose here is capable of providing estimates of threshold and depth in cases where the classical method fails.

  12. From Cyclone Tracks to the Costs of European Winter Storms: A Probabilistic Loss Assessment Model

    NASA Astrophysics Data System (ADS)

    Orwig, K.; Renggli, D.; Corti, T.; Reese, S.; Wueest, M.; Viktor, E.; Zimmerli, P.

    2014-12-01

    European winter storms cause billions of dollars of insured losses every year. Therefore, it is essential to understand potential impacts of future events, and the role reinsurance can play to mitigate the losses. The authors will present an overview of natural catastrophe risk assessment modeling in the reinsurance industry, and the development of a new innovative approach for modeling the risk associated with European winter storms. The new approach includes the development of physically meaningful probabilistic (i.e. simulated) events for European winter storm loss assessment. The meteorological hazard component of the new model is based on cyclone and windstorm tracks identified in the 20th Century Reanalysis data. The knowledge of the evolution of winter storms both in time and space allows the physically meaningful perturbation of historical event properties (e.g. track, intensity, etc.). The perturbation includes a random element but also takes the local climatology and the evolution of the historical event into account. The low-resolution wind footprints taken from the 20th Century Reanalysis are processed by a statistical-dynamical downscaling to generate high-resolution footprints for both the simulated and historical events. Downscaling transfer functions are generated using ENSEMBLES regional climate model data. The result is a set of reliable probabilistic events representing thousands of years. The event set is then combined with country and site-specific vulnerability functions and detailed market- or client-specific information to compute annual expected losses.

  13. An efficient deterministic-probabilistic approach to modeling regional groundwater flow: 1. Theory

    USGS Publications Warehouse

    Yen, Chung-Cheng; Guymon, Gary L.

    1990-01-01

    An efficient probabilistic model is developed and cascaded with a deterministic model for predicting water table elevations in regional aquifers. The objective is to quantify model uncertainty where precise estimates of water table elevations may be required. The probabilistic model is based on the two-point probability method, which only requires prior knowledge of the uncertain variables' means and coefficients of variation. The two-point estimate method is theoretically developed and compared with the Monte Carlo simulation method. The results of comparisons using hypothetical deterministic problems indicate that the two-point estimate method is only generally valid for linear problems where the coefficients of variation of uncertain parameters (for example, storage coefficient and hydraulic conductivity) are small. The two-point estimate method may be applied to slightly nonlinear problems with good results, provided coefficients of variation are small. In such cases, the two-point estimate method is much more efficient than the Monte Carlo method provided the number of uncertain variables is less than eight.
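
    For readers unfamiliar with the two-point estimate method, the following sketch compares a Rosenblueth-style two-point estimate (2**n model evaluations at the plus/minus one-standard-deviation corners of n independent, symmetric inputs) against plain Monte Carlo for a small, hypothetical nonlinear response function. The function and input statistics are invented for illustration.

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(3)

def two_point_estimate(model, means, cvs):
    """Two-point estimate for independent, symmetric inputs: evaluate the model
    at every +/- one-standard-deviation corner and weight all 2**n points equally."""
    means = np.asarray(means, dtype=float)
    sigmas = means * np.asarray(cvs, dtype=float)
    outputs = np.array([model(means + np.array(signs) * sigmas)
                        for signs in product((-1.0, 1.0), repeat=len(means))])
    mean = outputs.mean()
    return mean, np.sqrt((outputs ** 2).mean() - mean ** 2)

def head(x):
    """Hypothetical nonlinear head response to transmissivity T and storage S."""
    T, S = x
    return 10.0 / T + 2.0 * np.sqrt(S)

means, cvs = [5.0, 0.1], [0.1, 0.1]
m2, s2 = two_point_estimate(head, means, cvs)

samples = np.column_stack([rng.normal(m, m * cv, 100_000) for m, cv in zip(means, cvs)])
mc = np.array([head(x) for x in samples])
print(f"two-point estimate: mean={m2:.3f}, std={s2:.3f}")
print(f"Monte Carlo       : mean={mc.mean():.3f}, std={mc.std():.3f}")
```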

  14. The international normalized ratio and uncertainty. Validation of a probabilistic model.

    PubMed

    Critchfield, G C; Bennett, S T

    1994-07-01

    The motivation behind the creation of the International Normalized Ratio (INR) was to improve interlaboratory comparison for patients on anticoagulation therapy. In principle, a laboratory that reports the prothrombin time (PT) as an INR can standardize its PT measurements to an international reference thromboplastin. Using probability theory, the authors derived the equation for the probability distribution of the INR based on the PT, the International Sensitivity Index (ISI), and the geometric mean PT of the reference population. With Monte Carlo and numeric integration techniques, the model is validated on data from three different laboratories. The model allows computation of confidence intervals for the INR as a function of PT, ISI, and reference mean. The probabilistic model illustrates that confidence in INR measurements degrades for higher INR values. This occurs primarily as a result of amplification of between-run measurement errors in the PT, which is inherent in the mathematical transformation from the PT to the INR. The probabilistic model can be used by any laboratory to study the reliability of its own INR for any measured PT. This framework provides better insight into the problems of monitoring oral anticoagulation.
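
    A minimal Monte Carlo sketch of the idea: propagate an assumed between-run PT measurement error through the standard INR transformation, INR = (PT / mean normal PT)^ISI, and observe that the resulting confidence interval widens as the INR increases. The error model and all numerical values are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(4)

def inr_interval(pt_seconds, isi, mean_normal_pt, pt_cv=0.03, n=100_000):
    """Monte Carlo confidence interval for the INR, assuming the measured PT
    carries a normally distributed between-run error with coefficient of
    variation pt_cv (value assumed for illustration)."""
    pt_samples = rng.normal(pt_seconds, pt_cv * pt_seconds, n)
    inr_samples = (pt_samples / mean_normal_pt) ** isi
    return np.percentile(inr_samples, [2.5, 50.0, 97.5])

# Same relative PT error, increasing PT: the INR interval widens disproportionately,
# illustrating why confidence in the INR degrades at higher INR values.
for pt in (14.0, 24.0, 36.0):
    lo, med, hi = inr_interval(pt, isi=1.8, mean_normal_pt=12.0)
    print(f"PT={pt:4.1f}s  INR median={med:.2f}  95% interval=({lo:.2f}, {hi:.2f})")
```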

  15. Linear-Nonlinear-Poisson Models of Primate Choice Dynamics

    PubMed Central

    Corrado, Greg S; Sugrue, Leo P; Sebastian Seung, H; Newsome, William T

    2005-01-01

    The equilibrium phenomenon of matching behavior traditionally has been studied in stationary environments. Here we attempt to uncover the local mechanism of choice that gives rise to matching by studying behavior in a highly dynamic foraging environment. In our experiments, 2 rhesus monkeys (Macaca mulatta) foraged for juice rewards by making eye movements to one of two colored icons presented on a computer monitor, each rewarded on dynamic variable-interval schedules. Using a generalization of Wiener kernel analysis, we recover a compact mechanistic description of the impact of past reward on future choice in the form of a Linear-Nonlinear-Poisson model. We validate this model through rigorous predictive and generative testing. Compared to our earlier work with this same data set, this model proves to be a better description of choice behavior and is more tightly correlated with putative neural value signals. Refinements over previous models include hyperbolic (as opposed to exponential) temporal discounting of past rewards, and differential (as opposed to fractional) comparisons of option value. Through numerical simulation we find that within this class of strategies, the model parameters employed by animals are very close to those that maximize reward harvesting efficiency. PMID:16596981
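
    A simplified sketch of a Linear-Nonlinear-Bernoulli choice stage in the spirit of the model described above: past rewards are filtered with hyperbolically decaying weights, option values are compared differentially, and a choice is drawn stochastically. The exact parameterization in the paper differs; the names and numbers here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(5)

def hyperbolic_weights(n_back, k=0.3):
    """Hyperbolically decaying weights for rewards obtained 1..n_back choices ago."""
    lags = np.arange(1, n_back + 1)
    w = 1.0 / (1.0 + k * lags)
    return w / w.sum()

def p_choose_a(rewards_a, rewards_b, weights, sensitivity=5.0):
    """Linear stage: filter each option's reward history.
    Nonlinear stage: logistic function of the difference in filtered values.
    Stochastic stage: the caller draws the choice as a Bernoulli sample."""
    return 1.0 / (1.0 + np.exp(-sensitivity * (weights @ rewards_a - weights @ rewards_b)))

w = hyperbolic_weights(n_back=20)
past_a = rng.binomial(1, 0.4, 20)   # reward history for option A, most recent first
past_b = rng.binomial(1, 0.2, 20)   # reward history for option B
p_a = p_choose_a(past_a, past_b, w)
print(f"P(choose A) = {p_a:.2f}; simulated choice: {'A' if rng.random() < p_a else 'B'}")
```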

  16. The Integrated Medical Model: A Probabilistic Simulation Model for Predicting In-Flight Medical Risks

    NASA Technical Reports Server (NTRS)

    Keenan, Alexandra; Young, Millennia; Saile, Lynn; Boley, Lynn; Walton, Marlei; Kerstman, Eric; Shah, Ronak; Goodenow, Debra A.; Myers, Jerry G.

    2015-01-01

    The Integrated Medical Model (IMM) is a probabilistic model that uses simulation to predict mission medical risk. Given a specific mission and crew scenario, medical events are simulated using Monte Carlo methodology to provide estimates of resource utilization, probability of evacuation, probability of loss of crew, and the amount of mission time lost due to illness. Mission and crew scenarios are defined by mission length, extravehicular activity (EVA) schedule, and crew characteristics including: sex, coronary artery calcium score, contacts, dental crowns, history of abdominal surgery, and EVA eligibility. The Integrated Medical Evidence Database (iMED) houses the model inputs for one hundred medical conditions using in-flight, analog, and terrestrial medical data. Inputs include incidence, event durations, resource utilization, and crew functional impairment. Severity of conditions is addressed by defining statistical distributions on the dichotomized best and worst-case scenarios for each condition. The outcome distributions for conditions are bounded by the treatment extremes of the fully treated scenario in which all required resources are available and the untreated scenario in which no required resources are available. Upon occurrence of a simulated medical event, treatment availability is assessed, and outcomes are generated depending on the status of the affected crewmember at the time of onset, including any pre-existing functional impairments or ongoing treatment of concurrent conditions. The main IMM outcomes, including probability of evacuation and loss of crew life, time lost due to medical events, and resource utilization, are useful in informing mission planning decisions. To date, the IMM has been used to assess mission-specific risks with and without certain crewmember characteristics, to determine the impact of eliminating certain resources from the mission medical kit, and to design medical kits that maximally benefit crew health while meeting

  17. The Integrated Medical Model: A Probabilistic Simulation Model Predicting In-Flight Medical Risks

    NASA Technical Reports Server (NTRS)

    Keenan, Alexandra; Young, Millennia; Saile, Lynn; Boley, Lynn; Walton, Marlei; Kerstman, Eric; Shah, Ronak; Goodenow, Debra A.; Myers, Jerry G., Jr.

    2015-01-01

    The Integrated Medical Model (IMM) is a probabilistic model that uses simulation to predict mission medical risk. Given a specific mission and crew scenario, medical events are simulated using Monte Carlo methodology to provide estimates of resource utilization, probability of evacuation, probability of loss of crew, and the amount of mission time lost due to illness. Mission and crew scenarios are defined by mission length, extravehicular activity (EVA) schedule, and crew characteristics including: sex, coronary artery calcium score, contacts, dental crowns, history of abdominal surgery, and EVA eligibility. The Integrated Medical Evidence Database (iMED) houses the model inputs for one hundred medical conditions using in-flight, analog, and terrestrial medical data. Inputs include incidence, event durations, resource utilization, and crew functional impairment. Severity of conditions is addressed by defining statistical distributions on the dichotomized best and worst-case scenarios for each condition. The outcome distributions for conditions are bounded by the treatment extremes of the fully treated scenario in which all required resources are available and the untreated scenario in which no required resources are available. Upon occurrence of a simulated medical event, treatment availability is assessed, and outcomes are generated depending on the status of the affected crewmember at the time of onset, including any pre-existing functional impairments or ongoing treatment of concurrent conditions. The main IMM outcomes, including probability of evacuation and loss of crew life, time lost due to medical events, and resource utilization, are useful in informing mission planning decisions. To date, the IMM has been used to assess mission-specific risks with and without certain crewmember characteristics, to determine the impact of eliminating certain resources from the mission medical kit, and to design medical kits that maximally benefit crew health while meeting

  18. VBA: A Probabilistic Treatment of Nonlinear Models for Neurobiological and Behavioural Data

    PubMed Central

    Daunizeau, Jean; Adam, Vincent; Rigoux, Lionel

    2014-01-01

    This work is in line with an ongoing effort toward a computational (quantitative and refutable) understanding of human neuro-cognitive processes. Many sophisticated models for behavioural and neurobiological data have flourished during the past decade. Most of these models are partly unspecified (i.e. they have unknown parameters) and nonlinear. This makes them difficult to pair with a formal statistical data analysis framework. In turn, this compromises the reproducibility of model-based empirical studies. This work exposes a software toolbox that provides generic, efficient and robust probabilistic solutions to the three problems of model-based analysis of empirical data: (i) data simulation, (ii) parameter estimation/model selection, and (iii) experimental design optimization. PMID:24465198

  19. Learned graphical models for probabilistic planning provide a new class of movement primitives.

    PubMed

    Rückert, Elmar A; Neumann, Gerhard; Toussaint, Marc; Maass, Wolfgang

    2012-01-01

    Biological movement generation combines three interesting aspects: its modular organization in movement primitives (MPs), its characteristics of stochastic optimality under perturbations, and its efficiency in terms of learning. A common approach to motor skill learning is to endow the primitives with dynamical systems. Here, the parameters of the primitive indirectly define the shape of a reference trajectory. We propose an alternative MP representation, based on probabilistic inference in learned graphical models, with new and interesting properties that comply with salient features of biological movement control. Instead of endowing the primitives with dynamical systems, we propose to endow MPs with an intrinsic probabilistic planning system, integrating the power of stochastic optimal control (SOC) methods within an MP. The parameterization of the primitive is a graphical model that represents the dynamics and intrinsic cost function such that inference in this graphical model yields the control policy. We parameterize the intrinsic cost function using task-relevant features, such as the importance of passing through certain via-points. The system dynamics as well as intrinsic cost function parameters are learned in a reinforcement learning (RL) setting. We evaluate our approach on a complex 4-link balancing task. Our experiments show that our movement representation facilitates learning significantly and leads to better generalization to new task settings without re-learning.

  20. Probabilistic, multi-variate flood damage modelling using random forests and Bayesian networks

    NASA Astrophysics Data System (ADS)

    Kreibich, Heidi; Schröter, Kai

    2015-04-01

    Decisions on flood risk management and adaptation are increasingly based on risk analyses. Such analyses are associated with considerable uncertainty, even more if changes in risk due to global change are expected. Although uncertainty analysis and probabilistic approaches have received increased attention recently, they are hardly applied in flood damage assessments. Most of the damage models usually applied in standard practice have in common that complex damaging processes are described by simple, deterministic approaches like stage-damage functions. This presentation will show approaches for probabilistic, multi-variate flood damage modelling on the micro- and meso-scale and discuss their potential and limitations. Reference: Merz, B.; Kreibich, H.; Lall, U. (2013): Multi-variate flood damage assessment: a tree-based data-mining approach. NHESS, 13(1), 53-64. Schröter, K., Kreibich, H., Vogel, K., Riggelsen, C., Scherbaum, F., Merz, B. (2014): How useful are complex flood damage models? - Water Resources Research, 50, 4, p. 3378-3395.

  1. Biomedical time series clustering based on non-negative sparse coding and probabilistic topic model.

    PubMed

    Wang, Jin; Liu, Ping; F H She, Mary; Nahavandi, Saeid; Kouzani, Abbas

    2013-09-01

    Biomedical time series clustering that groups a set of unlabelled temporal signals according to their underlying similarity is very useful for biomedical records management and analysis such as biosignals archiving and diagnosis. In this paper, a new framework for clustering of long-term biomedical time series such as electrocardiography (ECG) and electroencephalography (EEG) signals is proposed. Specifically, local segments extracted from the time series are projected as a combination of a small number of basis elements in a trained dictionary by non-negative sparse coding. A Bag-of-Words (BoW) representation is then constructed by summing up all the sparse coefficients of local segments in a time series. Based on the BoW representation, a probabilistic topic model that was originally developed for text document analysis is extended to discover the underlying similarity of a collection of time series. The underlying similarity of biomedical time series is well captured owing to the statistical nature of the probabilistic topic model. Experiments on three datasets constructed from publicly available EEG and ECG signals demonstrate that the proposed approach achieves better accuracy than existing state-of-the-art methods, and is insensitive to model parameters such as length of local segments and dictionary size.

  2. Data assimilation for unsaturated flow models with restart adaptive probabilistic collocation based Kalman filter

    NASA Astrophysics Data System (ADS)

    Man, Jun; Li, Weixuan; Zeng, Lingzao; Wu, Laosheng

    2016-06-01

    The ensemble Kalman filter (EnKF) has gained popularity in hydrological data assimilation problems. As a Monte Carlo based method, a sufficiently large ensemble size is usually required to guarantee the accuracy. As an alternative approach, the probabilistic collocation based Kalman filter (PCKF) employs the polynomial chaos expansion (PCE) to represent and propagate the uncertainties in parameters and states. However, PCKF suffers from the so-called "curse of dimensionality". Its computational cost increases drastically with the increasing number of parameters and system nonlinearity. Furthermore, PCKF may fail to provide accurate estimations due to the joint updating scheme for strongly nonlinear models. Motivated by recent developments in uncertainty quantification and EnKF, we propose a restart adaptive probabilistic collocation based Kalman filter (RAPCKF) for data assimilation in unsaturated flow problems. During the implementation of RAPCKF, the important parameters are identified and active PCE basis functions are adaptively selected at each assimilation step; the "restart" scheme is utilized to eliminate the inconsistency between updated model parameters and state variables. The performance of RAPCKF is systematically tested with numerical cases of unsaturated flow models. It is shown that the adaptive approach and restart scheme can significantly improve the performance of PCKF. Moreover, RAPCKF has been demonstrated to be more efficient than EnKF with the same computational cost.

  3. 3D Morphology Prediction of Progressive Spinal Deformities from Probabilistic Modeling of Discriminant Manifolds.

    PubMed

    Kadoury, Samuel; Mandel, William; Roy-Beaudry, Marjolaine; Nault, Marie-Lyne; Parent, Stefan

    2017-01-23

    We introduce a novel approach for predicting the progression of adolescent idiopathic scoliosis from 3D spine models reconstructed from biplanar X-ray images. Recent progress in machine learning has made it possible to improve classification and prognosis rates, but existing approaches lack a probabilistic framework to measure uncertainty in the data. We propose a discriminative probabilistic manifold embedding where locally linear mappings transform data points from high-dimensional space to corresponding low-dimensional coordinates. A discriminant adjacency matrix is constructed to maximize the separation between progressive and non-progressive groups of patients diagnosed with scoliosis, while minimizing the distance in latent variables belonging to the same class. To predict the evolution of deformation, a baseline reconstruction is projected onto the manifold, from which a spatiotemporal regression model is built from parallel transport curves inferred from neighboring exemplars. Rate of progression is modulated from the spine flexibility and curve magnitude of the 3D spine deformation. The method was tested on 745 reconstructions from 133 subjects using longitudinal 3D reconstructions of the spine, with results demonstrating that the discriminative framework can distinguish between progressive and non-progressive scoliotic patients with a classification rate of 81% and prediction differences of 2.1° in main curve angulation, outperforming other manifold learning methods. Our method achieved a higher prediction accuracy and improved the modeling of spatiotemporal morphological changes in highly deformed spines compared to other learning methods.

  4. Probabilistic failure modelling of reinforced concrete structures subjected to chloride penetration

    NASA Astrophysics Data System (ADS)

    Nogueira, Caio Gorla; Leonel, Edson Denner; Coda, Humberto Breves

    2012-12-01

    Structural durability is an important criterion that must be evaluated for every type of structure. Concerning reinforced concrete members, chloride diffusion process is widely used to evaluate durability, especially when these structures are constructed in aggressive atmospheres. The chloride ingress triggers the corrosion of reinforcements; therefore, by modelling this phenomenon, the corrosion process can be better evaluated as well as the structural durability. The corrosion begins when a threshold level of chloride concentration is reached at the steel bars of reinforcements. Despite the robustness of several models proposed in literature, deterministic approaches fail to accurately predict the corrosion initiation time due to the inherent randomness observed in this process. In this regard, structural durability can be more realistically represented using probabilistic approaches. This paper addresses the analysis of probabilistic corrosion initiation time in reinforced concrete structures exposed to chloride penetration. The chloride penetration is modelled using the Fick's diffusion law. This law simulates the chloride diffusion process considering time-dependent effects. The probability of failure is calculated using Monte Carlo simulation and the first order reliability method, with a direct coupling approach. Some examples are considered in order to study these phenomena. Moreover, a simplified method is proposed to determine optimal values for concrete cover.
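
    The following sketch shows one common way such an analysis can be set up: the Fick's-second-law solution for chloride ingress under a constant surface concentration, with Monte Carlo sampling of diffusion coefficient, cover depth, surface concentration and critical threshold to estimate the probability of corrosion initiation over time. The distributions below are illustrative, not calibrated values from the paper.

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(6)

def chloride(x_m, t_s, D, Cs):
    """Fick's second-law solution for constant surface concentration Cs:
    C(x, t) = Cs * (1 - erf(x / (2*sqrt(D*t))))."""
    return Cs * (1.0 - erf(x_m / (2.0 * sqrt(D * t_s))))

def prob_corrosion_initiation(t_years, n=20_000):
    """Monte Carlo probability that chloride at the rebar exceeds the critical
    threshold. The distributions below are illustrative, not calibrated."""
    t_s = t_years * 365.25 * 24 * 3600.0
    D = rng.lognormal(np.log(5e-12), 0.3, n)        # diffusion coefficient [m^2/s]
    cover = rng.normal(0.05, 0.008, n).clip(0.02)   # concrete cover [m]
    Cs = rng.normal(0.60, 0.10, n)                  # surface chloride [% binder mass]
    Ccr = rng.normal(0.40, 0.05, n)                 # critical threshold [% binder mass]
    c_at_bar = np.array([chloride(x, t_s, d, cs) for x, d, cs in zip(cover, D, Cs)])
    return float(np.mean(c_at_bar >= Ccr))

for t in (10, 25, 50):
    print(f"t = {t:2d} years: P(corrosion initiation) = {prob_corrosion_initiation(t):.3f}")
```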

  5. The Stay/Switch Model of Concurrent Choice

    PubMed Central

    MacDonall, James S

    2009-01-01

    This experiment compared descriptions of concurrent choice by the stay/switch model, which says choice is a function of the reinforcers obtained for staying at and for switching from each alternative, and the generalized matching law, which says choice is a function of the total reinforcers obtained at each alternative. For the stay/switch model two schedules operate when at each alternative. One arranges reinforcers for staying there and the other arranges reinforcers for switching from there. Rats were exposed to eight or nine conditions that differed in the arrangement of the values of the stay and switch schedules. The generalized matching law described preferences when arrangements were similar to those found when using two concurrently running interval schedules. It did not, however, describe all preferences when using different arrangements. The stay/switch model described all preferences in one analysis. In addition, comparisons of selected conditions indicated that changing the ratio of obtained reinforcers was neither necessary nor sufficient for changing preference as measured by response ratios. Taken together these results provide support for the stay/switch model as a viable alternative to the generalized matching law and that the critical independent variable is allocation of stay and switch reinforcers. PMID:19230510
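
    For reference, the generalized matching law that the stay/switch model is compared against can be fitted in a few lines; the sketch below regresses log response ratios on log obtained-reinforcer ratios to estimate sensitivity and bias. The data are invented, and the stay/switch model itself would additionally partition reinforcers into stay and switch types.

```python
import numpy as np

# Response and obtained-reinforcer ratios from a hypothetical concurrent-schedule
# experiment, one pair per condition (invented numbers).
response_ratio = np.array([0.25, 0.5, 1.1, 2.0, 4.2])
reinforcer_ratio = np.array([0.2, 0.5, 1.0, 2.0, 5.0])

# Generalized matching law: log(B1/B2) = s * log(R1/R2) + log(b),
# with sensitivity s and bias b estimated by ordinary least squares.
s, log_b = np.polyfit(np.log(reinforcer_ratio), np.log(response_ratio), 1)
print(f"sensitivity s = {s:.2f}, bias b = {np.exp(log_b):.2f}")
```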

  6. Use of raster-based data layers to model spatial variation of seismotectonic data in probabilistic seismic hazard assessment

    NASA Astrophysics Data System (ADS)

    Zolfaghari, Mohammad R.

    2009-07-01

    Recent achievements in computer and information technology have provided the necessary tools to extend the application of probabilistic seismic hazard mapping from its traditional engineering use to many other applications. Examples for such applications are risk mitigation, disaster management, post disaster recovery planning and catastrophe loss estimation and risk management. Due to the lack of proper knowledge with regard to factors controlling seismic hazards, there are always uncertainties associated with all steps involved in developing and using seismic hazard models. While some of these uncertainties can be controlled by more accurate and reliable input data, the majority of the data and assumptions used in seismic hazard studies remain with high uncertainties that contribute to the uncertainty of the final results. In this paper a new methodology for the assessment of seismic hazard is described. The proposed approach provides practical facility for better capture of spatial variations of seismological and tectonic characteristics, which allows better treatment of their uncertainties. In the proposed approach, GIS raster-based data models are used in order to model geographical features in a cell-based system. The cell-based source model proposed in this paper provides a framework for implementing many geographically referenced seismotectonic factors into seismic hazard modelling. Examples for such components are seismic source boundaries, rupture geometry, seismic activity rate, focal depth and the choice of attenuation functions. The proposed methodology provides improvements in several aspects of the standard analytical tools currently being used for assessment and mapping of regional seismic hazard. The proposed methodology makes the best use of the recent advancements in computer technology in both software and hardware. The proposed approach is well structured to be implemented using conventional GIS tools.

  7. Aircraft Conflict Analysis and Real-Time Conflict Probing Using Probabilistic Trajectory Modeling

    NASA Technical Reports Server (NTRS)

    Yang, Lee C.; Kuchar, James K.

    2000-01-01

    Methods for maintaining separation between aircraft in the current airspace system have been built from a foundation of structured routes and evolved procedures. However, as the airspace becomes more congested and the chance of failures or operational error become more problematic, automated conflict alerting systems have been proposed to help provide decision support and to serve as traffic monitoring aids. The problem of conflict detection and resolution has been tackled from a number of different ways, but in this thesis, it is recast as a problem of prediction in the presence of uncertainties. Much of the focus is concentrated on the errors and uncertainties from the working trajectory model used to estimate future aircraft positions. The more accurate the prediction, the more likely an ideal (no false alarms, no missed detections) alerting system can be designed. Additional insights into the problem were brought forth by a review of current operational and developmental approaches found in the literature. An iterative, trial and error approach to threshold design was identified. When examined from a probabilistic perspective, the threshold parameters were found to be a surrogate to probabilistic performance measures. To overcome the limitations in the current iterative design method, a new direct approach is presented where the performance measures are directly computed and used to perform the alerting decisions. The methodology is shown to handle complex encounter situations (3-D, multi-aircraft, multi-intent, with uncertainties) with relative ease. Utilizing a Monte Carlo approach, a method was devised to perform the probabilistic computations in near realtime. Not only does this greatly increase the method's potential as an analytical tool, but it also opens up the possibility for use as a real-time conflict alerting probe. A prototype alerting logic was developed and has been utilized in several NASA Ames Research Center experimental studies.

  8. Global sensitivity analysis, probabilistic calibration, and predictive assessment for the data assimilation linked ecosystem carbon model

    DOE PAGES

    Safta, C.; Ricciuto, Daniel M.; Sargsyan, Khachik; ...

    2015-07-01

    In this paper we propose a probabilistic framework for an uncertainty quantification (UQ) study of a carbon cycle model and focus on the comparison between steady-state and transient simulation setups. A global sensitivity analysis (GSA) study indicates the parameters and parameter couplings that are important at different times of the year for quantities of interest (QoIs) obtained with the data assimilation linked ecosystem carbon (DALEC) model. We then employ a Bayesian approach and a statistical model error term to calibrate the parameters of DALEC using net ecosystem exchange (NEE) observations at the Harvard Forest site. The calibration results are employed in the second part of the paper to assess the predictive skill of the model via posterior predictive checks.

  9. A simulation model for probabilistic analysis of Space Shuttle abort modes

    NASA Technical Reports Server (NTRS)

    Hage, R. T.

    1993-01-01

    A simulation model which was developed to provide a probabilistic analysis tool to study the various space transportation system abort mode situations is presented. The simulation model is based on Monte Carlo simulation of an event-tree diagram which accounts for events during the space transportation system's ascent and its abort modes. The simulation model considers just the propulsion elements of the shuttle system (i.e., external tank, main engines, and solid boosters). The model was developed to provide a better understanding of the probability of occurrence and successful completion of abort modes during the vehicle's ascent. The results of the simulation runs discussed are for demonstration purposes only; they are not official NASA probability estimates.
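
    A toy event-tree Monte Carlo in the same spirit (not the model described above): draw propulsion-element failures, select an abort mode from an assumed failure time, and draw whether the abort completes. All probabilities, mode names and time windows are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(7)

# Illustrative (made-up) per-ascent failure probabilities for propulsion elements
# and conditional probabilities of completing each abort mode if it is attempted.
P_FAIL = {"main_engine": 0.010, "solid_booster": 0.002, "external_tank": 0.001}
P_ABORT_SUCCESS = {"RTLS": 0.90, "TAL": 0.95, "ATO": 0.99}

def simulate_ascents(n=100_000):
    """Monte Carlo walk through a simplified ascent event tree."""
    outcomes = {"nominal": 0, "abort_success": 0, "abort_failure": 0, "loss": 0}
    for _ in range(n):
        failed = [k for k, p in P_FAIL.items() if rng.random() < p]
        if not failed:
            outcomes["nominal"] += 1
        elif "solid_booster" in failed or "external_tank" in failed:
            outcomes["loss"] += 1                    # assumed non-survivable here
        else:
            t = rng.uniform(0.0, 520.0)              # main-engine failure time [s]
            mode = "RTLS" if t < 240 else ("TAL" if t < 400 else "ATO")
            key = "abort_success" if rng.random() < P_ABORT_SUCCESS[mode] else "abort_failure"
            outcomes[key] += 1
    return {k: v / n for k, v in outcomes.items()}

print(simulate_ascents())
```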

  10. Global sensitivity analysis, probabilistic calibration, and predictive assessment for the data assimilation linked ecosystem carbon model

    SciTech Connect

    Safta, C.; Ricciuto, Daniel M.; Sargsyan, Khachik; Debusschere, B.; Najm, H. N.; Williams, M.; Thornton, Peter E.

    2015-07-01

    In this paper we propose a probabilistic framework for an uncertainty quantification (UQ) study of a carbon cycle model and focus on the comparison between steady-state and transient simulation setups. A global sensitivity analysis (GSA) study indicates the parameters and parameter couplings that are important at different times of the year for quantities of interest (QoIs) obtained with the data assimilation linked ecosystem carbon (DALEC) model. We then employ a Bayesian approach and a statistical model error term to calibrate the parameters of DALEC using net ecosystem exchange (NEE) observations at the Harvard Forest site. The calibration results are employed in the second part of the paper to assess the predictive skill of the model via posterior predictive checks.

  11. A probabilistic model of insolation for the Mojave desert-area

    NASA Technical Reports Server (NTRS)

    Hester, O. V.; Reid, M. S.

    1978-01-01

    A preliminary solar model has been developed for the area around JPL's Goldstone Space Communications Complex. The model has the capability of producing any or all of the following outputs: (1) a clear sky theoretical amount of radiation, (2) solar radiation for clear sky, cloudy sky or partially clear sky depending on certain probabilistic parameters, and (3) an array of average solar energy reception rates (solar intensities) in kW/sq m for a specified length of time. This model is based on the ASHRAE clear day model, which is modulated by the effects of clouds. The distribution of clouds for any given time is determined by the combination of statistical procedures, measured insolation values over a six-month period, and a data bank of 19 years of cloud cover information.

  12. Data assimilation for unsaturated flow models with restart adaptive probabilistic collocation based Kalman filter

    SciTech Connect

    Man, Jun; Li, Weixuan; Zeng, Lingzao; Wu, Laosheng

    2016-06-01

    The ensemble Kalman filter (EnKF) has gained popularity in hydrological data assimilation problems. As a Monte Carlo based method, a relatively large ensemble size is usually required to guarantee the accuracy. As an alternative approach, the probabilistic collocation based Kalman filter (PCKF) employs the polynomial chaos to approximate the original system. In this way, the sampling error can be reduced. However, PCKF suffers from the so-called "curse of dimensionality". When the system nonlinearity is strong and number of parameters is large, PCKF could be even more computationally expensive than EnKF. Motivated by most recent developments in uncertainty quantification, we propose a restart adaptive probabilistic collocation based Kalman filter (RAPCKF) for data assimilation in unsaturated flow problems. During the implementation of RAPCKF, the important parameters are identified and active PCE basis functions are adaptively selected. The "restart" technology is used to eliminate the inconsistency between model parameters and states. The performance of RAPCKF is tested with numerical cases of unsaturated flow models. It is shown that RAPCKF is more efficient than EnKF with the same computational cost. Compared with the traditional PCKF, the RAPCKF is more applicable in strongly nonlinear and high dimensional problems.

  13. a Generic Probabilistic Model and a Hierarchical Solution for Sensor Localization in Noisy and Restricted Conditions

    NASA Astrophysics Data System (ADS)

    Ji, S.; Yuan, X.

    2016-06-01

    A generic probabilistic model, based on the fundamental Bayes' rule and the Markov assumption, is introduced to integrate the process of mobile platform localization with optical sensors. Based on it, three relatively independent solutions, bundle adjustment, Kalman filtering and particle filtering, are derived under different additional restrictions. We aim to show, first, that Kalman filtering may be a better supplier of initial values for bundle adjustment than traditional relative orientation in irregular strips and networks or when tie-point extraction fails. Second, in highly noisy conditions, particle filtering can act as a bridge for gap bridging when a large number of gross errors would cause Kalman filtering or bundle adjustment to fail. Third, both filtering methods, which help reduce error propagation and eliminate gross errors, support a global and static bundle adjustment, which requires the strictest initial values and control conditions. The main innovation is the integrated processing of stochastic errors and gross errors in sensor observations, and the integration of the three most widely used solutions, bundle adjustment, Kalman filtering and particle filtering, into a generic probabilistic localization model. Tests in noisy and restricted situations are designed and examined to demonstrate these points.

  14. Unification of models for choice between delayed reinforcers.

    PubMed Central

    Killeen, P R; Fantino, E

    1990-01-01

    Two models for choice between delayed reinforcers, Fantino's delay-reduction theory and Killeen's incentive theory, are reviewed. Incentive theory is amended to incorporate the effects of arousal on alternate types of behavior that might block the reinforcement of the target behavior. This amended version is shown to differ from the delay-reduction theory in a term that is an exponential in incentive theory and a difference in delay-reduction theory. A power series approximation to the exponential generates a model that is formally identical with delay-reduction theory. Correlations between delay-reduction theory and the amended incentive theory show excellent congruence over a range of experimental conditions. Although the assumptions that gave rise to delay-reduction theory and incentive theory remain different and testable, the models deriving from the theories are unlikely to be discriminable by parametric experimental tests. This congruence of the models is recognized by naming the common model the delayed reinforcement model, which is then compared with other models of choice such as Killeen and Fetterman's (1988) behavioral theory of timing, Mazur's (1984) equivalence rule, and Vaughan's (1985) melioration theory. PMID:2299288
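
    The congruence between the two forms rests on a first-order power-series approximation, exp(-q·d) ≈ 1 - q·d, which matches a delay-reduction-style term (T - d)/T when q is taken as 1/T. The short numerical check below illustrates this correspondence under that assumed parameter mapping; the symbols q, d and T are illustrative, not the papers' notation.

```python
import numpy as np

delays = np.linspace(0.0, 20.0, 5)   # delays to reinforcement [s]
T = 30.0                             # overall average time to reinforcement [s]
q = 1.0 / T                          # assumed correspondence between the two parameters

exponential_term = np.exp(-q * delays)   # incentive-theory-style exponential term
linear_term = (T - delays) / T           # delay-reduction-style difference term, rescaled

for d, e, lin in zip(delays, exponential_term, linear_term):
    print(f"delay={d:5.1f}s  exponential form={e:.3f}  delay-reduction form={lin:.3f}")
```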

  15. A Probabilistic Graphical Model for Individualizing Prognosis in Chronic, Complex Diseases.

    PubMed

    Schulam, Peter; Saria, Suchi

    Making accurate prognoses in chronic, complex diseases is challenging due to the wide variation in expression across individuals. In many such diseases, the notion of subtypes (subpopulations that share similar symptoms and patterns of progression) has been proposed. We develop a probabilistic model that exploits the concept of subtypes to individualize prognoses of disease trajectories. These subtypes are learned automatically from data. For a new individual, our model incorporates static and time-varying markers to dynamically update predictions of subtype membership and provide individualized predictions of disease trajectory. We use our model to tackle the problem of predicting lung function trajectories in scleroderma, an autoimmune disease, and demonstrate improved predictive performance over existing approaches.

  16. Spatial dispersion of interstellar civilizations: a probabilistic site percolation model in three dimensions

    NASA Astrophysics Data System (ADS)

    Hair, Thomas W.; Hedman, Andrew D.

    2013-01-01

    A model of the spatial emergence of an interstellar civilization into a uniform distribution of habitable systems is presented. The process of emigration is modelled as a three-dimensional probabilistic cellular automaton. An algorithm is presented which defines both the daughter colonies of the original seed vertex and all subsequent connected vertices, and the probability of a connection between any two vertices. The automaton is analysed over a wide set of parameters for iterations that represent up to 250 000 years within the model's assumptions. Emigration patterns are characterized and used to evaluate two hypotheses that aim to explain the Fermi Paradox. The first hypothesis states that interstellar emigration takes too long for any civilization to have yet come within a detectable distance, and the second states that large volumes of habitable space may be left uninhabited by an interstellar civilization and Earth is located in one of these voids.
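
    A compact sketch of a probabilistic site-growth automaton of this general kind on a 3-D periodic lattice: each colonized site colonizes each of its six nearest neighbours with a fixed probability per iteration. Lattice size, connection probability and iteration count are arbitrary, and the published model's colonization rule and timescales differ in detail.

```python
import numpy as np

rng = np.random.default_rng(8)

def emigrate(size=31, p_connect=0.15, iterations=60):
    """Probabilistic site growth on a 3-D periodic lattice: each colonized site
    colonizes each of its six nearest neighbours with probability p_connect per
    iteration. Returns the colonized fraction over time."""
    colonized = np.zeros((size, size, size), dtype=bool)
    colonized[size // 2, size // 2, size // 2] = True      # seed civilization
    history = []
    for _ in range(iterations):
        grown = colonized.copy()
        for axis in range(3):
            for shift in (-1, 1):
                attempts = colonized & (rng.random(colonized.shape) < p_connect)
                grown |= np.roll(attempts, shift, axis=axis)
        colonized = grown
        history.append(colonized.mean())
    return np.array(history)

frac = emigrate()
print("colonized fraction every 10 iterations:", np.round(frac[::10], 3))
```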

  17. Early Design Choices: Capture, Model, Integrate, Analyze, Simulate

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.

    2004-01-01

    I. Designs are constructed incrementally to meet requirements and solve problems: a) Requirements types: objectives, scenarios, constraints, ilities, etc. b) Problem/issue types: risk/safety, cost/difficulty, interaction, conflict, etc. II. Capture requirements, problems and solutions: a) Collect design and analysis products and make them accessible for integration and analysis; b) Link changes in design requirements, problems and solutions; and c) Harvest design data for design models and choice structures. III. System designs are constructed by multiple groups designing interacting subsystems: a) Diverse problems, choice criteria, analysis methods and point solutions. IV. Support integration and global analysis of repercussions: a) System implications of point solutions; b) Broad analysis of interactions beyond totals of mass, cost, etc.

  18. Probabilistic Residual Strength Model Developed for Life Prediction of Ceramic Matrix Composites

    NASA Technical Reports Server (NTRS)

    Thomas, David J.; Verrilli, Michael J.; Calomino, Anthony M.

    2004-01-01

    For the next generation of reusable launch vehicles, NASA is investigating introducing ceramic matrix composites (CMCs) in place of current superalloys for structural propulsion applications (e.g., nozzles, vanes, combustors, and heat exchangers). The higher use temperatures of CMCs will reduce vehicle weight by eliminating and/or reducing cooling system requirements. The increased strength-to-weight ratio of CMCs relative to superalloys further enhances their weight savings potential. However, in order to provide safe designs for components made of these new materials, a comprehensive life prediction methodology for CMC structures needs to be developed. A robust methodology for lifing composite structures has yet to be adopted by the engineering community. Current industry design practice continues to utilize deterministic empirically based models borrowed from metals design for predicting material life capabilities. The deterministic nature of these models inadequately addresses the stochastic character of brittle composites, and their empirical reliance makes predictions beyond the experimental test conditions a risky extrapolation. A team of engineers at the NASA Glenn Research Center has been developing a new life prediction engineering model. The Probabilistic Residual Strength (PRS) model uses the residual strength of the composite as its damage metric. Expected life and material strength are both considered probabilistically to account for the observed stochastic material response. Extensive experimental testing has been carried out on C/SiC (a candidate aerospace CMC material system) in a controlled 1000 ppm O2/argon environment at elevated temperatures of 800 and 1200 C. The test matrix was established to allow observation of the material behavior, characterization of the model, and validation of the model's predictive capabilities. Sample results of the validation study are illustrated in the graphs.

  19. QUANTIFYING AGGREGATE CHLORPYRIFOS EXPOSURE AND DOSE TO CHILDREN USING A PHYSICALLY-BASED TWO-STAGE MONTE CARLO PROBABILISTIC MODEL

    EPA Science Inventory

    To help address the Food Quality Protection Act of 1996, a physically-based, two-stage Monte Carlo probabilistic model has been developed to quantify and analyze aggregate exposure and dose to pesticides via multiple routes and pathways. To illustrate model capabilities and ide...

  20. Steady-State Analysis of Genetic Regulatory Networks Modelled by Probabilistic Boolean Networks

    PubMed Central

    Gluhovsky, Ilya; Hashimoto, Ronaldo F.; Dougherty, Edward R.; Zhang, Wei

    2003-01-01

    Probabilistic Boolean networks (PBNs) have recently been introduced as a promising class of models of genetic regulatory networks. The dynamic behaviour of PBNs can be analysed in the context of Markov chains. A key goal is the determination of the steady-state (long-run) behaviour of a PBN by analysing the corresponding Markov chain. This allows one to compute the long-term influence of a gene on another gene or determine the long-term joint probabilistic behaviour of a few selected genes. Because matrix-based methods quickly become prohibitive for large network sizes, we propose the use of Monte Carlo methods. However, the rate of convergence to the stationary distribution becomes a central issue. We discuss several approaches for determining the number of iterations necessary to achieve convergence of the Markov chain corresponding to a PBN. Using a recently introduced method based on the theory of two-state Markov chains, we illustrate the approach on a sub-network designed from human glioma gene expression data and determine the joint steady-state probabilities for several groups of genes. PMID:18629023
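
    A minimal Monte Carlo sketch of steady-state estimation for a toy probabilistic Boolean network: at each step every gene applies one of its candidate Boolean predictors, chosen at random with fixed selection probabilities, and long-run gene-activity frequencies are accumulated after a burn-in. The network, predictors and run lengths are invented, and convergence diagnostics such as the two-state Markov chain method mentioned above are omitted.

```python
import numpy as np

rng = np.random.default_rng(9)

# Toy 3-gene PBN: each gene has candidate Boolean predictors with selection probabilities.
PREDICTORS = {
    0: ([lambda x: x[1] and x[2], lambda x: x[1] or x[2]], [0.6, 0.4]),
    1: ([lambda x: not x[0]], [1.0]),
    2: ([lambda x: x[0] ^ x[1], lambda x: x[0]], [0.7, 0.3]),
}

def run_pbn(n_steps=50_000, burn_in=5_000):
    """Monte Carlo estimate of long-run (steady-state) gene-activity probabilities:
    at each step every gene applies one of its predictors, chosen at random."""
    x = rng.integers(0, 2, size=3).astype(bool)
    counts = np.zeros(3)
    for t in range(n_steps):
        x = np.array([fs[rng.choice(len(fs), p=ps)](x)
                      for fs, ps in (PREDICTORS[g] for g in range(3))], dtype=bool)
        if t >= burn_in:
            counts += x
    return counts / (n_steps - burn_in)

print("estimated steady-state P(gene ON):", np.round(run_pbn(), 3))
```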

  1. Probabilistic Failure Analysis of Bone Using a Finite Element Model of Mineral-Collagen Composites

    PubMed Central

    Dong, X. Neil; Guda, Teja; Millwater, Harry R.; Wang, Xiaodu

    2009-01-01

    Microdamage accumulation is a major pathway for energy dissipation during the post-yield deformation of bone. In this study, a two-dimensional probabilistic finite element model of a mineral-collagen composite was developed to investigate the influence of the tissue and ultrastructural properties of bone on the evolution of microdamage from an initial defect in tension. The probabilistic failure analyses indicated that the microdamage progression would be along the plane of the initial defect when the debonding at mineral-collagen interfaces was either absent or limited in the vicinity of the defect. In this case, the formation of a linear microcrack would be facilitated. However, the microdamage progression would be scattered away from the initial defect plane if interfacial debonding takes place at a large scale. This would suggest the possible formation of diffuse damage. In addition to interfacial debonding, the sensitivity analyses indicated that the microdamage progression was also dependent on the other material and ultrastructural properties of bone. The intensity of stress concentration accompanied with microdamage progression was more sensitive to the elastic modulus of the mineral phase and the nonlinearity of the collagen phase, whereas the scattering of failure location was largely dependent on the mineral to collagen ratio and the nonlinearity of the collagen phase. The findings of this study may help understanding the post-yield behavior of bone at the ultrastructural level and shed light on the underlying mechanism of bone fractures. PMID:19058806

  2. Probabilistic modeling of the flows and environmental risks of nano-silica.

    PubMed

    Wang, Yan; Kalinina, Anna; Sun, Tianyin; Nowack, Bernd

    2016-03-01

    Nano-silica, the engineered nanomaterial with one of the largest production volumes, has a wide range of applications in consumer products and industry. This study aimed to quantify environmental exposure to nano-silica and to assess its risk to surface waters. Concentrations were calculated for four environmental (air, soil, surface water, sediments) and two technical compartments (wastewater, solid waste) for the EU and Switzerland using probabilistic material flow modeling. The corresponding median concentration in surface water is predicted to be 0.12 μg/l in the EU (0.053-3.3 μg/l, 15/85% quantiles). The concentrations in sediments in the complete sedimentation scenario were found to be the largest among all environmental compartments, with a median annual increase of 0.43 mg/kg · y in the EU (0.19-12 mg/kg · y, 15/85% quantiles). Moreover, probabilistic species sensitivity distributions (PSSD) were computed and the risk of nano-silica in surface waters was quantified by comparing the predicted environmental concentration (PEC) with the predicted no-effect concentration (PNEC) distribution, which was derived from the cumulative PSSD. This assessment suggests that nano-silica currently poses no risk to aquatic organisms in surface waters. Further investigations are needed to assess the risk of nano-silica in other environmental compartments, which is currently not possible due to a lack of ecotoxicological data.
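
    A schematic of the PEC/PNEC comparison step: draw a predicted environmental concentration from an assumed lognormal distribution, derive a PNEC from a hypothetical probabilistic species sensitivity distribution, and summarize the resulting risk quotient. All distributions and parameters below are placeholders, not the study's values.

```python
import numpy as np

rng = np.random.default_rng(10)

n = 100_000
# Placeholder PEC distribution for surface water [ug/L]: lognormal spread standing in
# for uncertainty in production, release and fate parameters.
pec = rng.lognormal(mean=np.log(0.12), sigma=1.0, size=n)

# Hypothetical probabilistic species sensitivity distribution (PSSD): lognormal
# no-effect concentrations for 20 species; PNEC taken as the 5th percentile per draw.
species_noec = rng.lognormal(mean=np.log(5000.0), sigma=1.2, size=(n, 20))
pnec = np.percentile(species_noec, 5, axis=1)

risk_quotient = pec / pnec
print(f"median risk quotient: {np.median(risk_quotient):.2e}")
print(f"P(PEC > PNEC)       : {np.mean(risk_quotient > 1.0):.5f}")
```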

  3. Possibilities and limitations of modeling environmental exposure to engineered nanomaterials by probabilistic material flow analysis.

    PubMed

    Gottschalk, Fadri; Sonderer, Tobias; Scholz, Roland W; Nowack, Bernd

    2010-05-01

    Information on environmental concentrations is needed to assess the risks that engineered nanomaterials (ENM) may pose to the environment. In this study, predicted environmental concentrations (PEC) were modeled for nano-TiO2, carbon nanotubes (CNT) and nano-Ag for Switzerland. Based on a life-cycle perspective, the model considered as input parameters the production volumes of the ENMs, the manufacturing and consumption quantities of products containing those materials, and the fate and pathways of ENMs in natural and technical environments. Faced with a distinct scarcity of data, we used a probabilistic material flow analysis model, treating all parameters as probability distributions. The modeling included Monte Carlo and Markov Chain Monte Carlo simulations as well as a sensitivity and uncertainty analysis. The PEC values of the ENMs in the different environmental compartments vary widely due to different ENM production volumes and different life cycles of the nanoproducts. The use of ENM in products with high water relevance leads to higher water and sediment concentrations for nano-TiO2 and nano-Ag, compared to CNTs, where smaller amounts of ENM reach the aquatic compartments. This study also presents a sensitivity analysis and a comprehensive discussion of the uncertainties of the simulation results and the limitations of the used approach. To estimate potential risks, the PEC values were compared to the predicted-no-effect concentrations (PNEC) derived from published data. The risk quotients (PEC/PNEC) for nano-TiO2 and nano-Ag were larger than one for treated wastewater and much smaller for all other environmental compartments (e.g., water, sediments, soils). We conclude that probabilistic modeling is very useful for predicting environmental concentrations of ENMs given the current lack of substantiated data.

  4. Systematic evaluation of autoregressive error models as post-processors for a probabilistic streamflow forecast system

    NASA Astrophysics Data System (ADS)

    Morawietz, Martin; Xu, Chong-Yu; Gottschalk, Lars; Tallaksen, Lena

    2010-05-01

    A post-processor is necessary in a probabilistic streamflow forecast system to account for the hydrologic uncertainty introduced by the hydrological model. In this study, different variants of an autoregressive error model that can be used as a post-processor for short- to medium-range streamflow forecasts are evaluated. The deterministic HBV model is used to form the basis for the streamflow forecast. The general structure of the error models then used as post-processor is a first order autoregressive model of the form d_t = α·d_(t-1) + σ·ɛ_t, where d_t is the model error (observed minus simulated streamflow) at time t, α and σ are the parameters of the error model, and ɛ_t is the residual error described through a probability distribution. The following aspects are investigated: (1) Use of constant parameters α and σ versus the use of state-dependent parameters. The state-dependent parameters vary depending on the states of temperature, precipitation, snow water equivalent and simulated streamflow. (2) Use of a Standard Normal distribution for ɛ_t versus use of an empirical distribution function constituted through the normalized residuals of the error model in the calibration period. (3) Comparison of two different transformations, i.e. logarithmic versus square root, that are applied to the streamflow data before the error model is applied. The reason for applying a transformation is to make the residuals of the error model homoscedastic over the range of streamflow values of different magnitudes. Through combination of these three characteristics, eight variants of the autoregressive post-processor are generated. These are calibrated and validated in 55 catchments throughout Norway. The discrete ranked probability score with 99 flow percentiles as standardized thresholds is used for evaluation. In addition, a non-parametric bootstrap is used to construct confidence intervals and evaluate the significance of the results. The main
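
    A minimal sketch of the simplest variant (constant parameters, logarithmic transformation): fit α and σ of the AR(1) error model on a calibration period, then wrap a deterministic forecast with an ensemble generated from that error model. The data are synthetic and the parameter estimation is deliberately simplistic.

```python
import numpy as np

rng = np.random.default_rng(11)

def fit_ar1_error_model(observed, simulated, transform=np.log):
    """Estimate alpha and sigma of d_t = alpha * d_(t-1) + sigma * eps_t,
    where d_t is the error of the transformed deterministic simulation."""
    d = transform(observed) - transform(simulated)
    alpha = np.sum(d[1:] * d[:-1]) / np.sum(d[:-1] ** 2)
    sigma = np.std(d[1:] - alpha * d[:-1])
    return alpha, sigma

def probabilistic_forecast(sim_future, last_error, alpha, sigma, n_members=500):
    """Wrap a deterministic forecast with an AR(1) error ensemble (log transform)."""
    d = np.full(n_members, last_error)
    members = np.empty((len(sim_future), n_members))
    for t, sim in enumerate(sim_future):
        d = alpha * d + sigma * rng.standard_normal(n_members)
        members[t] = np.exp(np.log(sim) + d)
    return members

# Synthetic calibration period [m^3/s]: simulation errors follow a true AR(1) process.
sim = rng.lognormal(3.0, 0.4, 300)
d_true = np.zeros(300)
for t in range(1, 300):
    d_true[t] = 0.7 * d_true[t - 1] + 0.15 * rng.standard_normal()
obs = sim * np.exp(d_true)

alpha, sigma = fit_ar1_error_model(obs, sim)
ens = probabilistic_forecast(np.array([20.0, 22.0, 25.0]), d_true[-1], alpha, sigma)
print(f"alpha = {alpha:.2f}, sigma = {sigma:.2f}")
print("90% forecast intervals:", np.round(np.percentile(ens, [5, 95], axis=1).T, 1))
```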

  5. Simple model for multiple-choice collective decision making

    NASA Astrophysics Data System (ADS)

    Lee, Ching Hua; Lucas, Andrew

    2014-11-01

    We describe a simple model of heterogeneous, interacting agents making decisions between n ≥ 2 discrete choices. For a special class of interactions, our model is the mean field description of random field Potts-like models and is effectively solved by finding the extrema of the average energy E per agent. In these cases, by studying the propagation of decision changes via avalanches, we argue that macroscopic dynamics is well captured by a gradient flow along E. We focus on the permutation symmetric case, where all n choices are (on average) the same, and spontaneous symmetry breaking (SSB) arises purely from cooperative social interactions. As examples, we show that bimodal heterogeneity naturally provides a mechanism for the spontaneous formation of hierarchies between decisions and that SSB is a preferred instability to discontinuous phase transitions between two symmetric points. Beyond the mean field limit, exponentially many stable equilibria emerge when we place this model on a graph of finite mean degree. We conclude with speculation on decision making with persistent collective oscillations. Throughout the paper, we emphasize analogies between methods of solution to our model and common intuition from diverse areas of physics, including statistical physics and electromagnetism.

  6. Predicting the acute neurotoxicity of diverse organic solvents using probabilistic neural networks based QSTR modeling approaches.

    PubMed

    Basant, Nikita; Gupta, Shikha; Singh, Kunwar P

    2016-03-01

    Organic solvents are widely used chemicals and the neurotoxic properties of some are well established. In this study, we established nonlinear qualitative and quantitative structure-toxicity relationship (STR) models for predicting neurotoxic classes and neurotoxicity of structurally diverse solvents in rodent test species following OECD guideline principles for model development. Probabilistic neural network (PNN) based qualitative and generalized regression neural network (GRNN) based quantitative STR models were constructed using neurotoxicity data from rat and mouse studies. Further, interspecies correlation based quantitative activity-activity relationship (QAAR) and global QSTR models were also developed using the combined data set of both rodent species for predicting the neurotoxicity of solvents. The constructed models were validated through deriving several statistical coefficients for the test data, and the prediction and generalization abilities of these models were evaluated. The qualitative STR models (rat and mouse) yielded classification accuracies of 92.86% in the test data sets, whereas the quantitative STRs yielded correlations (R² > 0.93) between the measured and model-predicted toxicity values in both test data sets (rat and mouse). The prediction accuracies of the QAAR (R² = 0.859) and global STR (R² = 0.945) models were comparable to those of the independent local STR models. The results suggest the ability of the developed QSTR models to reliably predict binary neurotoxicity classes and the endpoint neurotoxicities of structurally diverse organic solvents.

  7. Modeling the Effects of Choice-Set Size on the Processing of Letters and Words

    ERIC Educational Resources Information Center

    Rouder, Jeffrey N.

    2004-01-01

    Letters and words are better identified when there are fewer available choices. How do readers use choice-set restrictions? By analyzing new experimental data and previously reported data, the author shows that Bayes theorem-based models overestimate readers' use of choice-set restrictions. This result is discordant with choice-similarity models…

  8. Multiple Choice Neurodynamical Model of the Uncertain Option Task.

    PubMed

    Insabato, Andrea; Pannunzi, Mario; Deco, Gustavo

    2017-01-01

    The uncertain option task has recently been adopted to investigate the neural systems underlying decision confidence. More recently, single-neuron activity has been recorded in the lateral intraparietal cortex of monkeys performing an uncertain option task, in which the subject is allowed to opt for a small but sure reward instead of making a risky perceptual decision. We propose a multiple choice model implemented in a discrete attractors network. This model is able to reproduce both behavioral and neurophysiological experimental data and therefore provides support to the numerous perspectives that interpret the uncertain option task as a sensory-motor association. The model explains the behavioral and neural data recorded in monkeys as the result of the multistable attractor landscape and produces several testable predictions. One of these predictions may help distinguish our model from a recently proposed continuous attractor model.

  9. Multiple Choice Neurodynamical Model of the Uncertain Option Task

    PubMed Central

    Insabato, Andrea; Pannunzi, Mario; Deco, Gustavo

    2017-01-01

    The uncertain option task has recently been adopted to investigate the neural systems underlying decision confidence. More recently, single-neuron activity has been recorded in the lateral intraparietal cortex of monkeys performing an uncertain option task, in which the subject is allowed to opt for a small but sure reward instead of making a risky perceptual decision. We propose a multiple choice model implemented in a discrete attractors network. This model is able to reproduce both behavioral and neurophysiological experimental data and therefore provides support to the numerous perspectives that interpret the uncertain option task as a sensory-motor association. The model explains the behavioral and neural data recorded in monkeys as the result of the multistable attractor landscape and produces several testable predictions. One of these predictions may help distinguish our model from a recently proposed continuous attractor model. PMID:28076355

  10. Probabilistic solution of random SI-type epidemiological models using the Random Variable Transformation technique

    NASA Astrophysics Data System (ADS)

    Casabán, M.-C.; Cortés, J.-C.; Romero, J.-V.; Roselló, M.-D.

    2015-07-01

    This paper presents a full probabilistic description of the solution of random SI-type epidemiological models which are based on nonlinear differential equations. This description consists of determining: the first probability density function of the solution in terms of the density functions of the diffusion coefficient and the initial condition, which are assumed to be independent random variables; the expectation and variance functions of the solution as well as confidence intervals; and, finally, the distribution of the time until a given proportion of susceptibles remains in the population. The obtained formulas are general since they are valid regardless of the probability distributions assigned to the random inputs. We also present a pair of illustrative examples, one of which applies the theoretical results to model the diffusion of a technology using real data.
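
    As a rough illustration of the kind of output such a description provides, the sketch below propagates hypothetical distributions for the diffusion coefficient and the initial condition through the closed-form solution of a logistic SI model and summarizes the resulting distribution by Monte Carlo. This is only a sampling-based stand-in for the exact Random Variable Transformation result, with invented input distributions.

        import numpy as np

        rng = np.random.default_rng(1)
        n = 200_000
        t = 5.0

        # Hypothetical independent random inputs (not taken from the paper).
        beta = rng.gamma(shape=4.0, scale=0.05, size=n)   # contagion/diffusion coefficient
        i0 = rng.beta(a=2.0, b=50.0, size=n)              # initial infected proportion

        # Closed-form logistic solution of dI/dt = beta * I * (1 - I)
        i_t = i0 * np.exp(beta * t) / (1.0 - i0 + i0 * np.exp(beta * t))

        print("E[I(t)] =", i_t.mean(), " Var[I(t)] =", i_t.var())
        print("95% interval:", np.percentile(i_t, [2.5, 97.5]))

        # Empirical density of I(t): a sampling stand-in for the RVT first probability density function
        density, edges = np.histogram(i_t, bins=50, density=True)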

  11. PROBABILISTIC NON-RIGID REGISTRATION OF PROSTATE IMAGES: MODELING AND QUANTIFYING UNCERTAINTY

    PubMed Central

    Risholm, Petter; Fedorov, Andriy; Pursley, Jennifer; Tuncali, Kemal; Cormack, Robert; Wells, William M.

    2012-01-01

    Registration of pre- to intra-procedural prostate images needs to handle the large changes in position and shape of the prostate caused by varying rectal filling and patient positioning. We describe a probabilistic method for non-rigid registration of prostate images which can quantify the most probable deformation as well as the uncertainty of the estimated deformation. The method is based on a biomechanical Finite Element model which treats the prostate as an elastic material. We use a Markov Chain Monte Carlo sampler to draw deformation configurations from the posterior distribution. In practice, we simultaneously estimate the boundary conditions (surface displacements) and the internal deformations of our biomechanical model. The proposed method was validated on a clinical MRI dataset with registration results comparable to previously published methods, but with the added benefit of also providing uncertainty estimates which may be important to take into account during prostate biopsy and brachytherapy procedures. PMID:22288004

  12. Modeling of the Sedimentary Interbedded Basalt Stratigraphy for the Idaho National Laboratory Probabilistic Seismic Hazard Analysis

    SciTech Connect

    Suzette Payne

    2006-04-01

    This report summarizes how the effects of the sedimentary interbedded basalt stratigraphy were modeled in the probabilistic seismic hazard analysis (PSHA) of the Idaho National Laboratory (INL). Drill holes indicate the bedrock beneath INL facilities is composed of about 1.1 km of alternating layers of basalt rock and loosely consolidated sediments. Alternating layers of hard rock and “soft” loose sediments tend to attenuate seismic energy greater than uniform rock due to scattering and damping. The INL PSHA incorporated the effects of the sedimentary interbedded basalt stratigraphy by developing site-specific shear (S) wave velocity profiles. The profiles were used in the PSHA to model the near-surface site response by developing site-specific stochastic attenuation relationships.

  13. Modeling of the Sedimentary Interbedded Basalt Stratigraphy for the Idaho National Laboratory Probabilistic Seismic Hazard Analysis

    SciTech Connect

    Suzette Payne

    2007-08-01

    This report summarizes how the effects of the sedimentary interbedded basalt stratigraphy were modeled in the probabilistic seismic hazard analysis (PSHA) of the Idaho National Laboratory (INL). Drill holes indicate the bedrock beneath INL facilities is composed of about 1.1 km of alternating layers of basalt rock and loosely consolidated sediments. Alternating layers of hard rock and “soft” loose sediments tend to attenuate seismic energy greater than uniform rock due to scattering and damping. The INL PSHA incorporated the effects of the sedimentary interbedded basalt stratigraphy by developing site-specific shear (S) wave velocity profiles. The profiles were used in the PSHA to model the near-surface site response by developing site-specific stochastic attenuation relationships.

  14. A probabilistic quantitative risk assessment model for the long-term work zone crashes.

    PubMed

    Meng, Qiang; Weng, Jinxian; Qu, Xiaobo

    2010-11-01

    Work zones, especially long-term work zones, increase traffic conflicts and cause safety problems. Proper casualty risk assessment for a work zone is of importance for both traffic safety engineers and travelers. This paper develops a novel probabilistic quantitative risk assessment (QRA) model to evaluate the casualty risk combining the frequency and consequence of all accident scenarios triggered by long-term work zone crashes. The casualty risk is measured by the individual risk and the societal risk. The individual risk can be interpreted as the frequency of a driver/passenger being killed or injured, and the societal risk describes the relation between frequency and the number of casualties. The proposed probabilistic QRA model consists of the estimation of work zone crash frequency, an event tree and consequence estimation models. There are seven intermediate events in the event tree: age (A), crash unit (CU), vehicle type (VT), alcohol (AL), light condition (LC), crash type (CT) and severity (S). Since the estimated probability of some intermediate events may have large uncertainty, this uncertainty is characterized by a random variable. The consequence estimation model takes into account the combined effects of speed and emergency medical service response time (ERT) on the consequence of a work zone crash. Finally, a numerical example based on the Southeast Michigan work zone crash data is carried out. The numerical results show that there will be a 62% decrease in individual fatality risk and a 44% reduction in individual injury risk if the mean travel speed is slowed down by 20%. In addition, there will be a 5% reduction in individual fatality risk and a 0.05% reduction in individual injury risk if ERT is reduced by 20%. In other words, slowing down speed is more effective than reducing ERT in casualty risk mitigation.
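
    The overall structure, a crash frequency combined with an event tree of conditional probabilities, can be illustrated with a toy calculation. The tree below collapses the seven intermediate events into a single severity split, and all numbers are invented, so it only shows the arithmetic, not the model of the paper.

        # Toy event-tree risk calculation (all values hypothetical).
        crash_freq = 12.0   # expected work zone crashes per year

        # Branch probabilities conditional on a crash, and expected casualties per crash
        p_severity = {"fatal": 0.01, "injury": 0.20, "property_damage_only": 0.79}
        casualties_per_crash = {"fatal": 1.2, "injury": 1.5, "property_damage_only": 0.0}

        fatal_crash_frequency = crash_freq * p_severity["fatal"]           # per year
        expected_casualties = sum(crash_freq * p * casualties_per_crash[s]
                                  for s, p in p_severity.items())          # per year

        print("frequency of fatal crashes per year:", fatal_crash_frequency)
        print("expected casualties per year:", expected_casualties)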

  15. Advanced Nuclear Fuel Cycle Transitions: Optimization, Modeling Choices, and Disruptions

    NASA Astrophysics Data System (ADS)

    Carlsen, Robert W.

    Many nuclear fuel cycle simulators have evolved over time to help understand the nuclear industry/ecosystem at a macroscopic level. Cyclus is one of the first fuel cycle simulators to accommodate larger-scale analysis with its liberal open-source licensing and first-class Linux support. Cyclus also has features that uniquely enable investigating the effects of modeling choices on fuel cycle simulators and scenarios. This work is divided into three experiments focusing on optimization, effects of modeling choices, and fuel cycle uncertainty. Effective optimization techniques are developed for automatically determining desirable facility deployment schedules with Cyclus. A novel method for mapping optimization variables to deployment schedules is developed. This allows relationships between reactor types and scenario constraints to be represented implicitly in the variable definitions, enabling the use of optimizers lacking constraint support. It also prevents wasting computational resources evaluating infeasible deployment schedules. Deployed power capacity over time and deployment of non-reactor facilities are also included as optimization variables. There are many fuel cycle simulators built with different combinations of modeling choices. Comparing results between them is often difficult. Cyclus' flexibility allows comparing the effects of many such modeling choices. Reactor refueling cycle synchronization and inter-facility competition, among other effects, are compared in four cases, each using combinations of fleet-based and individually modeled reactors with 1-month or 3-month time steps. There are noticeable differences in results for the different cases. The largest differences occur during periods of constrained reactor fuel availability. This and similar work can help improve the quality of fuel cycle analysis generally. There is significant uncertainty associated with deploying new nuclear technologies, such as time-frames for technology availability and the cost of building advanced reactors

  16. Improving predictive power of physically based rainfall-induced shallow landslide models: a probabilistic approach

    NASA Astrophysics Data System (ADS)

    Raia, S.; Alvioli, M.; Rossi, M.; Baum, R. L.; Godt, J. W.; Guzzetti, F.

    2014-03-01

    Distributed models to forecast the spatial and temporal occurrence of rainfall-induced shallow landslides are based on deterministic laws. These models extend spatially the static stability models adopted in geotechnical engineering and adopt an infinite-slope geometry to balance the resisting and driving forces acting on the sliding mass. An infiltration model is used to determine how rainfall changes pore-water conditions, modulating the local stability/instability conditions. A problem with the operation of the existing models lies in the difficulty of obtaining accurate values for the several variables that describe the material properties of the slopes. The problem is particularly severe when the models are applied over large areas, for which sufficient information on the geotechnical and hydrological conditions of the slopes is generally not available. To help solve the problem, we propose a probabilistic Monte Carlo approach to the distributed modeling of rainfall-induced shallow landslides. For this purpose, we have modified the transient rainfall infiltration and grid-based regional slope-stability analysis (TRIGRS) code. The new code (TRIGRS-P) adopts a probabilistic approach to compute, on a cell-by-cell basis, transient pore-pressure changes and related changes in the factor of safety due to rainfall infiltration. Infiltration is modeled using analytical solutions of partial differential equations describing one-dimensional vertical flow in isotropic, homogeneous materials. Both saturated and unsaturated soil conditions can be considered. TRIGRS-P copes with the natural variability inherent in the mechanical and hydrological properties of the slope materials by allowing values of the TRIGRS model input parameters to be sampled randomly from a given probability distribution. The range of variation and the mean value of the parameters can be determined by the usual methods used for preparing the TRIGRS input parameters. The outputs of several model
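
    A minimal sketch of the probabilistic ingredient (not TRIGRS-P itself): for a single grid cell, sample the geotechnical parameters of an infinite-slope stability calculation from assumed distributions and summarize the resulting distribution of the factor of safety. The distributions, slope geometry and wetness value are placeholders.

        import numpy as np

        rng = np.random.default_rng(2)
        n = 50_000

        # Fixed cell geometry and wetness (placeholders)
        slope = np.radians(35.0)   # slope angle
        z = 1.5                    # soil depth (m)
        gamma_w = 9.81             # unit weight of water (kN/m^3)
        m = 0.8                    # saturated fraction of the soil depth from the infiltration model

        # Assumed parameter distributions for one cell (illustrative only)
        c = rng.normal(4.0, 1.0, n).clip(min=0.0)     # effective cohesion (kPa)
        phi = np.radians(rng.normal(32.0, 3.0, n))    # effective friction angle
        gamma = rng.normal(19.0, 1.0, n)              # soil unit weight (kN/m^3)

        fs = (c / (gamma * z * np.sin(slope) * np.cos(slope))
              + (1.0 - m * gamma_w / gamma) * np.tan(phi) / np.tan(slope))

        print("mean factor of safety:", fs.mean())
        print("probability of failure P(FS < 1):", np.mean(fs < 1.0))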

  17. Relationships between probabilistic Boolean networks and dynamic Bayesian networks as models of gene regulatory networks

    PubMed Central

    Lähdesmäki, Harri; Hautaniemi, Sampsa; Shmulevich, Ilya; Yli-Harja, Olli

    2006-01-01

    A significant amount of attention has recently been focused on modeling of gene regulatory networks. Two frequently used large-scale modeling frameworks are Bayesian networks (BNs) and Boolean networks, the latter being a special case of its recent stochastic extension, probabilistic Boolean networks (PBNs). The PBN is a promising model class that generalizes the standard rule-based interactions of Boolean networks into the stochastic setting. Dynamic Bayesian networks (DBNs) are a general and versatile model class that is able to represent complex temporal stochastic processes and has also been proposed as a model for gene regulatory systems. In this paper, we concentrate on these two model classes and demonstrate that PBNs and a certain subclass of DBNs can represent the same joint probability distribution over their common variables. The major benefit of introducing the relationships between the models is that it opens up the possibility of applying the standard tools of DBNs to PBNs and vice versa. Hence, the standard learning tools of DBNs can be applied in the context of PBNs, and the inference methods give a natural way of handling the missing values in PBNs which are often present in gene expression measurements. Conversely, the tools for controlling the stationary behavior of the networks, tools for projecting networks onto sub-networks, and efficient learning schemes can be used for DBNs. In other words, the introduced relationships between the models extend the collection of analysis tools for both model classes.

  18. A 3-D probabilistic stability model incorporating the variability of root reinforcement

    NASA Astrophysics Data System (ADS)

    Cislaghi, Alessio; Chiaradia, Enrico; Battista Bischetti, Gian

    2016-04-01

    Process-oriented models of hillslope stability have great potential to improve spatially distributed landslide hazard analyses. At the same time, they may have severe limitations, and among them the variability and uncertainty of the parameters play a key role. In this context, the application of a probabilistic approach through Monte Carlo techniques can be an appropriate way to deal with the variability of each input parameter by considering a proper probability distribution. In forested areas an additional point must be taken into account: the reinforcement due to roots permeating the soil and its variability and uncertainty. While the probability distributions of geotechnical and hydrological parameters have been widely investigated, little is known concerning the variability and the spatial heterogeneity of root reinforcement. Moreover, there are still many difficulties in measuring and evaluating such a variable. In our study we aim to: (i) implement a robust procedure to evaluate the variability of root reinforcement as a probability distribution, according to the stand characteristics of forests, such as tree density, the average diameter at breast height and the minimum distance among trees; and (ii) combine a multidimensional process-oriented model with a Monte Carlo simulation technique to obtain a probability distribution of the Factor of Safety. The proposed approach has been applied to a small Alpine area, mainly covered by a coniferous forest and characterized by steep slopes and a high landslide hazard. The obtained results show good reliability of the model when compared against the landslide inventory map. In the end, our findings contribute to improving the reliability of landslide hazard mapping in forested areas and help forest managers to evaluate different management scenarios.

  19. Probabilistic Graphical Models for the Analysis and Synthesis of Musical Audio

    NASA Astrophysics Data System (ADS)

    Hoffmann, Matthew Douglas

    Content-based Music Information Retrieval (MIR) systems seek to automatically extract meaningful information from musical audio signals. This thesis applies new and existing generative probabilistic models to several content-based MIR tasks: timbral similarity estimation, semantic annotation and retrieval, and latent source discovery and separation. In order to estimate how similar two songs sound to one another, we employ a Hierarchical Dirichlet Process (HDP) mixture model to discover a shared representation of the distribution of timbres in each song. Comparing songs under this shared representation yields better query-by-example retrieval quality and scalability than previous approaches. To predict what tags are likely to apply to a song (e.g., "rap," "happy," or "driving music"), we develop the Codeword Bernoulli Average (CBA) model, a simple and fast mixture-of-experts model. Despite its simplicity, CBA performs at least as well as state-of-the-art approaches at automatically annotating songs and at finding the songs in a database to which a given tag best applies. Finally, we address the problem of latent source discovery and separation by developing two Bayesian nonparametric models, the Shift-Invariant HDP and Gamma Process NMF. These models allow us to discover what sounds (e.g., bass drums, guitar chords) are present in a song or set of songs and to isolate or suppress individual sources. These models' ability to decide how many latent sources are necessary to model the data is particularly valuable in this application, since it is impossible to guess a priori how many sounds will appear in a given song or set of songs. Once they have been fit to data, probabilistic models can also be used to drive the synthesis of new musical audio, both for creative purposes and to qualitatively diagnose what information a model does and does not capture. We also adapt the SIHDP model to create new versions of input audio with arbitrary sample sets, for example, to create

  20. Probabilistic Boolean Network Modelling and Analysis Framework for mRNA Translation.

    PubMed

    Zhao, Yun-Bo; Krishnan, J

    2016-01-01

    mRNA translation is a complex process involving the progression of ribosomes on the mRNA, resulting in the synthesis of proteins, and is subject to multiple layers of regulation. This process has been modelled using different formalisms, both stochastic and deterministic. Recently, we introduced a Probabilistic Boolean modelling framework for mRNA translation, which possesses the advantage of tools for numerically exact computation of steady state probability distribution, without requiring simulation. Here, we extend this model to incorporate both random sequential and parallel update rules, and demonstrate its effectiveness in various settings, including its flexibility in accommodating additional static and dynamic biological complexities and its role in parameter sensitivity analysis. In these applications, the results from the model analysis match those of TASEP model simulations. Importantly, the proposed modelling framework maintains the stochastic aspects of mRNA translation and provides a way to exactly calculate probability distributions, providing additional tools of analysis in this context. Finally, the proposed modelling methodology provides an alternative approach to the understanding of the mRNA translation process, by bridging the gap between existing approaches, providing new analysis tools, and contributing to a more robust platform for modelling and understanding translation.
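
    To make the idea of numerically exact steady-state computation concrete, the sketch below builds the full transition matrix of a tiny TASEP-like lattice (three codon sites, random sequential updates, with hypothetical initiation, elongation and termination probabilities) and extracts its stationary distribution. This is a toy construction in the same spirit as the framework, not the authors' implementation.

        import itertools
        import numpy as np

        L_SITES = 3                          # codon sites on the toy lattice
        alpha, p_hop, beta = 0.4, 0.8, 0.5   # hypothetical initiation, elongation, termination probabilities

        states = list(itertools.product([0, 1], repeat=L_SITES))
        index = {s: i for i, s in enumerate(states)}
        T = np.zeros((len(states), len(states)))   # column-stochastic transition matrix

        for s in states:
            col = index[s]
            # Random sequential update: one of the L_SITES + 1 bonds is picked uniformly per step.
            for bond in range(L_SITES + 1):
                stay = 1.0
                if bond == 0 and s[0] == 0:                                      # initiation
                    T[index[(1,) + s[1:]], col] += alpha / (L_SITES + 1)
                    stay -= alpha
                elif 0 < bond < L_SITES and s[bond - 1] == 1 and s[bond] == 0:   # elongation hop
                    new = list(s)
                    new[bond - 1], new[bond] = 0, 1
                    T[index[tuple(new)], col] += p_hop / (L_SITES + 1)
                    stay -= p_hop
                elif bond == L_SITES and s[-1] == 1:                             # termination
                    T[index[s[:-1] + (0,)], col] += beta / (L_SITES + 1)
                    stay -= beta
                T[col, col] += stay / (L_SITES + 1)

        # Exact steady state: eigenvector of T for eigenvalue 1 (no simulation needed)
        eigval, eigvec = np.linalg.eig(T)
        pi = np.real(eigvec[:, np.argmax(np.real(eigval))])
        pi /= pi.sum()
        for s, prob in zip(states, pi):
            print(s, round(prob, 4))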

  1. Probabilistic conditional reasoning: Disentangling form and content with the dual-source model.

    PubMed

    Singmann, Henrik; Klauer, Karl Christoph; Beller, Sieghard

    2016-08-01

    The present research examines descriptive models of probabilistic conditional reasoning, that is, of reasoning from uncertain conditionals with contents about which reasoners have rich background knowledge. According to our dual-source model, two types of information shape such reasoning: knowledge-based information elicited by the contents of the material and content-independent information derived from the form of inferences. Two experiments implemented manipulations that selectively influenced the model parameters for the knowledge-based information, the relative weight given to form-based versus knowledge-based information, and the parameters for the form-based information, validating the psychological interpretation of these parameters. We apply the model to classical suppression effects, dissecting them into effects on background knowledge and effects on form-based processes (Exp. 3), and we use it to reanalyse previous studies manipulating reasoning instructions. In a model-comparison exercise based on data from seven studies, the dual-source model outperformed three Bayesian competitor models. Overall, our results support the view that people make use of background knowledge in line with current Bayesian models, but they also suggest that the form of the conditional argument, irrespective of its content, plays a substantive, yet smaller, role.

  2. A time-dependent probabilistic seismic-hazard model for California

    USGS Publications Warehouse

    Cramer, C.H.; Petersen, M.D.; Cao, T.; Toppozada, Tousson R.; Reichle, M.

    2000-01-01

    For the purpose of sensitivity testing and illuminating nonconsensus components of time-dependent models, the California Department of Conservation, Division of Mines and Geology (CDMG) has assembled a time-dependent version of its statewide probabilistic seismic hazard (PSH) model for California. The model incorporates available consensus information from within the earth-science community, except for a few faults or fault segments where consensus information is not available. For these latter faults, published information has been incorporated into the model. As in the 1996 CDMG/U.S. Geological Survey (USGS) model, the time-dependent models incorporate three multisegment ruptures: a 1906, an 1857, and a southern San Andreas earthquake. Sensitivity tests are presented to show the effect on hazard and expected damage estimates of (1) intrinsic (aleatory) sigma, (2) multisegment (cascade) vs. independent segment (no cascade) ruptures, and (3) time-dependence vs. time-independence. Results indicate that (1) differences in hazard and expected damage estimates between time-dependent and independent models increase with decreasing intrinsic sigma, (2) differences in hazard and expected damage estimates between full cascading and not cascading are insensitive to intrinsic sigma, (3) differences in hazard increase with increasing return period (decreasing probability of occurrence), and (4) differences in moment-rate budgets increase with decreasing intrinsic sigma and with the degree of cascading, but are within the expected uncertainty in PSH time-dependent modeling and do not always significantly affect hazard and expected damage estimates.

  3. Multi-level approach for statistical appearance models with probabilistic correspondences

    NASA Astrophysics Data System (ADS)

    Krüger, Julia; Ehrhardt, Jan; Handels, Heinz

    2016-03-01

    Statistical shape and appearance models are often based on the accurate identification of one-to-one correspondences in a training data set. At the same time, the determination of these corresponding landmarks is the most challenging part of such methods. Hufnagel et al. [1] developed an alternative method using correspondence probabilities for a statistical shape model. In Krüger et al. [2, 3] we propose the use of probabilistic correspondences for statistical appearance models by incorporating appearance information into the framework. We employ a point-based representation of image data combining position and appearance information. The model is optimized and adapted by a maximum a-posteriori (MAP) approach, deriving a single global optimization criterion with respect to model parameters and observation-dependent parameters that directly affect the shape and appearance information of the considered structures. Because initially unknown correspondence probabilities are used and a higher number of degrees of freedom is introduced to the model, a regularization of the model generation process is advantageous. For this purpose we extend the derived global criterion by a regularization term which penalizes implausible topological changes. Furthermore, we propose a multi-level approach for the optimization, to increase the robustness of the model generation process.

  4. Robust Depth Image Acquisition Using Modulated Pattern Projection and Probabilistic Graphical Models

    PubMed Central

    Kravanja, Jaka; Žganec, Mario; Žganec-Gros, Jerneja; Dobrišek, Simon; Štruc, Vitomir

    2016-01-01

    Depth image acquisition with structured light approaches in outdoor environments is a challenging problem due to external factors, such as ambient sunlight, which commonly affect the acquisition procedure. This paper presents a novel structured light sensor designed specifically for operation in outdoor environments. The sensor exploits a modulated sequence of structured light projected onto the target scene to counteract environmental factors and estimate a spatial distortion map in a robust manner. The correspondence between the projected pattern and the estimated distortion map is then established using a probabilistic framework based on graphical models. Finally, the depth image of the target scene is reconstructed using a number of reference frames recorded during the calibration process. We evaluate the proposed sensor on experimental data in indoor and outdoor environments and present comparative experiments with other existing methods, as well as commercial sensors. PMID:27775570

  5. Robust Depth Image Acquisition Using Modulated Pattern Projection and Probabilistic Graphical Models.

    PubMed

    Kravanja, Jaka; Žganec, Mario; Žganec-Gros, Jerneja; Dobrišek, Simon; Štruc, Vitomir

    2016-10-19

    Depth image acquisition with structured light approaches in outdoor environments is a challenging problem due to external factors, such as ambient sunlight, which commonly affect the acquisition procedure. This paper presents a novel structured light sensor designed specifically for operation in outdoor environments. The sensor exploits a modulated sequence of structured light projected onto the target scene to counteract environmental factors and estimate a spatial distortion map in a robust manner. The correspondence between the projected pattern and the estimated distortion map is then established using a probabilistic framework based on graphical models. Finally, the depth image of the target scene is reconstructed using a number of reference frames recorded during the calibration process. We evaluate the proposed sensor on experimental data in indoor and outdoor environments and present comparative experiments with other existing methods, as well as commercial sensors.

  6. A Model-Based Probabilistic Inversion Framework for Wire Fault Detection Using TDR

    NASA Technical Reports Server (NTRS)

    Schuet, Stefan R.; Timucin, Dogan A.; Wheeler, Kevin R.

    2010-01-01

    Time-domain reflectometry (TDR) is one of the standard methods for diagnosing faults in electrical wiring and interconnect systems, with a long-standing history focused mainly on hardware development of both high-fidelity systems for laboratory use and portable hand-held devices for field deployment. While these devices can easily assess distance to hard faults such as sustained opens or shorts, their ability to assess subtle but important degradation such as chafing remains an open question. This paper presents a unified framework for TDR-based chafing fault detection in lossy coaxial cables by combining an S-parameter based forward modeling approach with a probabilistic (Bayesian) inference algorithm. Results are presented for the estimation of nominal and faulty cable parameters from laboratory data.

  7. On probabilistic certification of combined cancer therapies using strongly uncertain models.

    PubMed

    Alamir, Mazen

    2015-11-07

    This paper proposes a general framework for probabilistic certification of cancer therapies. The certification is defined in terms of two key issues: tumor contraction and a lower admissible bound on the circulating lymphocytes, which is viewed as an indicator of the patient's health. Certification is viewed as the ability to guarantee, with a predefined high probability, the success of the therapy over a finite horizon despite the unavoidably high uncertainties affecting the dynamic model that is used to compute the optimal scheduling of drug injections. The certification paradigm can be viewed as a tool for tuning the treatment parameters and protocols as well as for making rational use of limited or expensive drugs. The proposed framework is illustrated using the specific problem of combined immunotherapy/chemotherapy of cancer.

  8. Life Prediction and Classification of Failure Modes in Solid State Luminaires Using Bayesian Probabilistic Models

    SciTech Connect

    Lall, Pradeep; Wei, Junchao; Sakalaukus, Peter

    2014-05-27

    A new method has been developed for assessing the onset of degradation in solid state luminaires and classifying failure mechanisms using metrics beyond the lumen degradation currently used for identification of failure. Luminous flux output and correlated color temperature data on Philips LED lamps were gathered under 85°C/85% RH until lamp failure. The acquired data have been used in conjunction with Bayesian probabilistic models to identify luminaires with onset of degradation well before failure, through identification of decision boundaries in the feature space between lamps with accrued damage and lamps beyond the failure threshold. In addition, luminaires with different failure modes have been classified separately from healthy pristine luminaires. It is expected that the new test technique will allow the development of failure distributions without testing to L70 life for the manifestation of failure.

  9. Behavioral Modeling Based on Probabilistic Finite Automata: An Empirical Study †

    PubMed Central

    Tîrnăucă, Cristina; Montaña, José L.; Ontañón, Santiago; González, Avelino J.; Pardo, Luis M.

    2016-01-01

    Imagine an agent that performs tasks according to different strategies. The goal of Behavioral Recognition (BR) is to identify which of the available strategies is the one being used by the agent, by simply observing the agent’s actions and the environmental conditions during a certain period of time. The goal of Behavioral Cloning (BC) is more ambitious. In this last case, the learner must be able to build a model of the behavior of the agent. In both settings, the only assumption is that the learner has access to a training set that contains instances of observed behavioral traces for each available strategy. This paper studies a machine learning approach based on Probabilistic Finite Automata (PFAs), capable of achieving both the recognition and cloning tasks. We evaluate the performance of PFAs in the context of a simulated learning environment (in this case, a virtual Roomba vacuum cleaner robot), and compare it with a collection of other machine learning approaches. PMID:27347956
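
    A minimal sketch of the recognition part, assuming each strategy has already been learned as a small PFA (here restricted to an HMM-like automaton with start, transition and emission matrices): score an observed action sequence under each automaton and pick the most likely one. The toy matrices below are invented for illustration.

        import numpy as np

        def sequence_log_likelihood(obs, start, trans, emit):
            """Likelihood of an observation sequence under an HMM-like probabilistic
            finite automaton, computed with the forward recursion."""
            alpha = start * emit[:, obs[0]]
            for o in obs[1:]:
                alpha = (alpha @ trans) * emit[:, o]
            return np.log(alpha.sum())

        def recognise(obs, models):
            """Behavioral recognition: return the strategy whose PFA assigns the
            observed trace the highest likelihood."""
            return max(models, key=lambda name: sequence_log_likelihood(obs, *models[name]))

        # Two toy strategies over a binary observation alphabet (invented numbers)
        strategy_a = (np.array([1.0, 0.0]),
                      np.array([[0.9, 0.1], [0.2, 0.8]]),
                      np.array([[0.8, 0.2], [0.3, 0.7]]))
        strategy_b = (np.array([0.5, 0.5]),
                      np.array([[0.5, 0.5], [0.5, 0.5]]),
                      np.array([[0.5, 0.5], [0.5, 0.5]]))
        print(recognise([0, 0, 1, 0], {"A": strategy_a, "B": strategy_b}))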

  10. Comparison of Four Probabilistic Models (CARES, Calendex, ConsEspo, SHEDS) to Estimate Aggregate Residential Exposures to Pesticides

    EPA Science Inventory

    Two deterministic models (US EPA’s Office of Pesticide Programs Residential Standard Operating Procedures (OPP Residential SOPs) and Draft Protocol for Measuring Children’s Non-Occupational Exposure to Pesticides by all Relevant Pathways (Draft Protocol)) and four probabilistic mo...

  11. A probabilistic model for the persistence of early planar fabrics in polydeformed pelitic schists

    USGS Publications Warehouse

    Ferguson, C.C.

    1984-01-01

    Although early planar fabrics are commonly preserved within microlithons in low-grade pelites, in higher-grade (amphibolite facies) pelitic schists fabric regeneration often appears complete. Evidence for early fabrics may be preserved within porphyroblasts but, within the matrix, later deformation often appears to totally obliterate or reorient earlier fabrics. However, examination of several hundred Dalradian pelites from Connemara, western Ireland, reveals that preservation of early fabrics is by no means uncommon; relict matrix domains, although volumetrically insignificant, are remarkably persistent even when inferred later strains are very large and fabric regeneration appears, at first sight, complete. Deterministic plasticity theories are ill-suited to the analysis of such an inhomogeneous material response, and a probabilistic model is proposed instead. It assumes that ductile polycrystal deformation is controlled by elementary flow units which can be activated once their associated stress barrier is overcome. Bulk flow propensity is related to the proportion of simultaneous activations, and a measure of this is derived from the probabilistic interaction between a stress-barrier spectrum and an internal stress spectrum (the latter determined by the external loading and the details of internal stress transfer). The spectra are modelled as Gaussian distributions although the treatment is very general and could be adapted for other distributions. Using the time rate of change of activation probability it is predicted that, initially, fabric development will be rapid but will then slow down dramatically even though stress increases at a constant rate. This highly non-linear response suggests that early fabrics persist because they comprise unfavourable distributions of stress-barriers which remain unregenerated at the time bulk stress is stabilized by steady-state flow. Relict domains will, however, bear the highest stress and are potential upper
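
    If both the stress-barrier spectrum and the internal-stress spectrum are taken as independent Gaussians, the instantaneous activation probability of a flow unit has a closed form, P(activation) = Phi((mu_stress - mu_barrier) / sqrt(sd_stress^2 + sd_barrier^2)). The short sketch below evaluates that expression for hypothetical spectrum parameters; it illustrates the probabilistic interaction only, not the paper's full treatment.

        from math import erf, sqrt

        def activation_probability(mu_stress, sd_stress, mu_barrier, sd_barrier):
            """P(internal stress > stress barrier) for independent Gaussian spectra."""
            z = (mu_stress - mu_barrier) / sqrt(sd_stress**2 + sd_barrier**2)
            return 0.5 * (1.0 + erf(z / sqrt(2.0)))

        # Hypothetical spectra (arbitrary stress units): activation rises, then saturates
        for mu_stress in (80.0, 100.0, 120.0, 140.0):
            p = activation_probability(mu_stress, 15.0, mu_barrier=110.0, sd_barrier=20.0)
            print(mu_stress, round(p, 3))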

  12. Evolution of the sewage treatment plant model SimpleTreat: use of realistic biodegradability tests in probabilistic model simulations.

    PubMed

    Franco, Antonio; Struijs, Jaap; Gouin, Todd; Price, Oliver R

    2013-10-01

    Given the large number of chemicals under regulatory scrutiny, models play a crucial role in the screening phase of the environmental risk assessment. The sewage treatment plant (STP) model SimpleTreat 3.1 is routinely applied as part of the European Union System for the Evaluation of Substances to estimate the fate and elimination of organic chemicals discharged via sewage. SimpleTreat estimates tend to be conservative and therefore only useful for lower-tier assessments. A probabilistic version of SimpleTreat was built on the updated version of the model (SimpleTreat 3.2, presented in a parallel article in this issue), embracing likeliest as well as worst-case conditions in a statistically robust way. Probabilistic parameters representing the variability of sewage characteristics, STP design, and operational parameters were based on actual STP conditions for activated sludge plants in Europe. An evaluation study was carried out for 4 chemicals with distinct sorption and biodegradability profiles: tonalide, triclosan, trimethoprim, and linear alkylbenzene sulfonate. Simulations incorporated information on biodegradability simulation studies with activated sludge (OECD 314B and OECD 303A tests). Good agreement for both median values and variability ranges was observed between model estimates and monitoring data. The uncertainty analysis highlighted the importance of refined data on partitioning and biodegradability in activated sludge to achieve realistic estimates. The study indicates that the best strategy to refine the exposure assessment of down-the-drain chemicals is by integrating higher-tier laboratory data with probabilistic STP simulations and, if possible, by comparing them with monitoring data for validation.

  13. Dynamic route choice model of large-scale traffic network

    SciTech Connect

    Boyce, D.W.; Lee, D.H.; Janson, B.N.; Berka, S.

    1997-08-01

    Application and extensions of a dynamic network equilibrium model to the Advanced Driver and Vehicle Advisory Navigation Concept (ADVANCE) Network are described in this paper. ADVANCE is a dynamic route guidance field test designed for 800 km² in the northwestern suburbs of Chicago. The dynamic route choice model employed in this paper is solved efficiently by a modified version of Janson's DYMOD algorithm. Realistic traffic engineering-based link delay functions, instead of the simplistic Bureau of Public Roads (BPR) function, are used to estimate link travel times and intersection delays for most types of links and intersections. Further, an expanded intersection representation is utilized, resulting in a network of nearly 23,000 links and 10,000 nodes. Time-dependent link flows, travel times, speeds and queue spillbacks are generated for the ADVANCE Network. The model was solved on a CONVEX-C3880. Convergence and computational results are presented and analyzed.

  14. Probabilistic modeling of the fate of Listeria monocytogenes in diced bacon during the manufacturing process.

    PubMed

    Billoir, Elise; Denis, Jean-Baptiste; Cammeau, Natalie; Cornu, Marie; Zuliani, Veronique

    2011-02-01

    To assess the impact of the manufacturing process on the fate of Listeria monocytogenes, we built a generic probabilistic model intended to simulate the successive steps in the process. Contamination evolution was modeled in the appropriate units (breasts, dice, and then packaging units through the successive steps in the process). To calibrate the model, parameter values were estimated from industrial data, from the literature, and based on expert opinion. By means of simulations, the model was explored using a baseline calibration and alternative scenarios, in order to assess the impact of changes in the process and of accidental events. The results are reported as contamination distributions and as the probability that the product will be acceptable with regards to the European regulatory safety criterion. Our results are consistent with data provided by industrial partners and highlight that tumbling is a key step for the distribution of the contamination at the end of the process. Process chain models could provide an important added value for risk assessment models that basically consider only the outputs of the process in their risk mitigation strategies. Moreover, a model calibrated to correspond to a specific plant could be used to optimize surveillance.

  15. Probabilistic Stack of 180 Plio-Pleistocene Benthic δ18O Records Constructed Using Profile Hidden Markov Models

    NASA Astrophysics Data System (ADS)

    Lisiecki, L. E.; Ahn, S.; Khider, D.; Lawrence, C.

    2015-12-01

    Stratigraphic alignment is the primary way in which long marine climate records are placed on a common age model. We previously presented a probabilistic pairwise alignment algorithm, HMM-Match, which uses hidden Markov models to estimate alignment uncertainty, and applied it to the alignment of benthic δ18O records to the "LR04" global benthic stack of Lisiecki and Raymo (2005) (Lin et al., 2014). However, since the LR04 stack is deterministic, the algorithm does not account for uncertainty in the stack. Here we address this limitation by developing a probabilistic stack, HMM-Stack. In this model the stack is a probabilistic inhomogeneous hidden Markov model, also known as a profile HMM. The HMM-Stack is represented by a probabilistic model that "emits" each of the input records (Durbin et al., 1998). The unknown parameters of this model are learned from a set of input records using the expectation maximization (EM) algorithm. Because the multiple alignment of these records is unknown and uncertain, the expected contribution of each input point to each point in the stack is determined probabilistically. For each time step in the HMM-Stack, δ18O values are described by a Gaussian probability distribution. Available δ18O records (N=180) are employed to estimate the mean and variance of δ18O at each time point. The mean of the HMM-Stack follows the predicted pattern of glacial cycles, with increased amplitude after the Pliocene-Pleistocene boundary and also larger and longer cycles after the mid-Pleistocene transition. Furthermore, the δ18O variance increases with age, producing a substantial loss in the signal-to-noise ratio. Not surprisingly, uncertainty in alignment, and thus in estimated age, also increases substantially in the older portion of the stack.

  16. The probabilistic niche model reveals the niche structure and role of body size in a complex food web.

    PubMed

    Williams, Richard J; Anandanadesan, Ananthi; Purves, Drew

    2010-08-09

    The niche model has been widely used to model the structure of complex food webs, and yet the ecological meaning of the single niche dimension has not been explored. In the niche model, each species has three traits, niche position, diet position and feeding range. Here, a new probabilistic niche model, which allows the maximum likelihood set of trait values to be estimated for each species, is applied to the food web of the Benguela fishery. We also developed the allometric niche model, in which body size is used as the niche dimension. About 80% of the links in the empirical data are predicted by the probabilistic niche model, a significant improvement over recent models. As in the niche model, species are uniformly distributed on the niche axis. Feeding ranges are exponentially distributed, but diet positions are not uniformly distributed below the predator. Species traits are strongly correlated with body size, but the allometric niche model performs significantly worse than the probabilistic niche model. The best-fit parameter set provides a significantly better model of the structure of the Benguela food web than was previously available. The methodology allows the identification of a number of taxa that stand out as outliers either in the model's poor performance at predicting their predators or prey or in their parameter values. While important, body size alone does not explain the structure of the one-dimensional niche.
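
    For illustration, a likelihood-based fit of this kind can be sketched with a Gaussian fall-off of link probability around each consumer's diet position, scaled by its feeding range. The functional form and parameter names below are assumptions made for the sketch and may differ from the published parameterization.

        import numpy as np

        def link_probability(n_prey, c_pred, r_pred, p_max=0.95):
            """Probability that the predator eats a prey with niche value n_prey,
            given the predator's diet position c_pred and feeding range r_pred
            (a Gaussian fall-off is assumed here for illustration)."""
            return p_max * np.exp(-((n_prey - c_pred) / (0.5 * r_pred)) ** 2)

        def log_likelihood(adjacency, niche, diet, feeding_range):
            """Log-likelihood of an observed binary food web (adjacency[i, j] = 1
            if species j eats species i) under the link probabilities."""
            ll = 0.0
            for i in range(len(niche)):        # prey
                for j in range(len(niche)):    # predator
                    p = np.clip(link_probability(niche[i], diet[j], feeding_range[j]), 1e-12, 1 - 1e-12)
                    ll += adjacency[i, j] * np.log(p) + (1 - adjacency[i, j]) * np.log(1 - p)
            return ll

    Maximizing this log-likelihood over the per-species niche, diet and range values with any standard optimizer would yield maximum likelihood trait estimates of the kind described above.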

  17. Learning a generative probabilistic grammar of experience: a process-level model of language acquisition.

    PubMed

    Kolodny, Oren; Lotem, Arnon; Edelman, Shimon

    2015-03-01

    We introduce a set of biologically and computationally motivated design choices for modeling the learning of language, or of other types of sequential, hierarchically structured experience and behavior, and describe an implemented system that conforms to these choices and is capable of unsupervised learning from raw natural-language corpora. Given a stream of linguistic input, our model incrementally learns a grammar that captures its statistical patterns, which can then be used to parse or generate new data. The grammar constructed in this manner takes the form of a directed weighted graph, whose nodes are recursively (hierarchically) defined patterns over the elements of the input stream. We evaluated the model in seventeen experiments, grouped into five studies, which examined, respectively, (a) the generative ability of grammar learned from a corpus of natural language, (b) the characteristics of the learned representation, (c) sequence segmentation and chunking, (d) artificial grammar learning, and (e) certain types of structure dependence. The model's performance largely vindicates our design choices, suggesting that progress in modeling language acquisition can be made on a broad front, ranging from issues of generativity to the replication of human experimental findings, by bringing biological and computational considerations, as well as lessons from prior efforts, to bear on the modeling approach.

  18. FOGCAST: Probabilistic fog forecasting based on operational (high-resolution) NWP models

    NASA Astrophysics Data System (ADS)

    Masbou, M.; Hacker, M.; Bentzien, S.

    2013-12-01

    The presence of fog and low clouds in the lower atmosphere can have a critical impact on both airborne and ground transport and is often connected with serious accidents. Improving the prediction of the localization, duration and variations in visibility therefore holds immense operational value. Fog is generally a small-scale phenomenon and is mostly affected by local advective transport, radiation, turbulent mixing at the surface, as well as its microphysical structure. Sophisticated three-dimensional fog models, based on advanced microphysical parameterization schemes and high vertical resolution, have already been developed and give promising results. Nevertheless, their computational time is beyond the range of an operational setup. Therefore, mesoscale numerical weather prediction models are generally used for forecasting all kinds of weather situations. In spite of numerous improvements, the large uncertainty of small-scale weather events inherent in deterministic prediction cannot be evaluated adequately. Probabilistic guidance is necessary to assess these uncertainties and give reliable forecasts. In this study, fog forecasts are obtained by a diagnosis scheme similar to the Fog Stability Index (FSI), based on COSMO-DE model outputs. COSMO-DE is the German-focused high-resolution operational weather prediction model of the German Meteorological Service. The FSI and the respective fog occurrence probability are optimized and calibrated with statistical postprocessing in terms of logistic regression. In a second step, the number of predictors of the FOGCAST model has been optimized by use of the LASSO method (Least Absolute Shrinkage and Selection Operator). The results present objective out-of-sample verification based on the Brier score and are obtained for station data over Germany. Furthermore, the probabilistic fog forecast approach, FOGCAST, serves as a benchmark for the evaluation of more sophisticated 3D fog models. Several versions have been set up based on different
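
    The calibration step, a logistic regression from the FSI (and possibly further predictors) to a fog probability, can be sketched as follows. The training values and the single-predictor setup are placeholders, not the operational FOGCAST configuration.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        # Placeholder training data: model-derived FSI values and observed fog (1) / no fog (0).
        fsi = np.array([[20.0], [25.0], [31.0], [35.0], [40.0], [45.0], [52.0], [60.0]])
        fog_observed = np.array([1, 1, 1, 0, 1, 0, 0, 0])

        calibration = LogisticRegression().fit(fsi, fog_observed)
        print(calibration.predict_proba([[30.0]])[0, 1])   # calibrated fog probability for FSI = 30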

  19. Additional evidence for a dual-strategy model of reasoning: Probabilistic reasoning is more invariant than reasoning about logical validity.

    PubMed

    Markovits, Henry; Brisson, Janie; de Chantal, Pier-Luc

    2015-11-01

    One of the major debates concerning the nature of inferential reasoning is between counterexample-based strategies such as mental model theory and the statistical strategies underlying probabilistic models. The dual-strategy model proposed by Verschueren, Schaeken, and d'Ydewalle (2005a, 2005b) suggests that people might have access to both kinds of strategies. One of the postulates of this approach is that statistical strategies correspond to low-cost, intuitive modes of evaluation, whereas counterexample strategies are higher-cost and more variable in use. We examined this hypothesis by using a deductive-updating paradigm. The results of Study 1 showed that individual differences in strategy use predict different levels of deductive updating on inferences about logical validity. Study 2 demonstrated no such variation when explicitly probabilistic inferences were examined. Study 3 showed that presenting updating problems with probabilistic inferences modified performance on subsequent problems using logical validity, whereas the opposite was not true. These results provide clear evidence that the processes used to make probabilistic inferences are less subject to variation than those used to make inferences of logical validity.

  20. Multi-State Physics Models of Aging Passive Components in Probabilistic Risk Assessment

    SciTech Connect

    Unwin, Stephen D.; Lowry, Peter P.; Layton, Robert F.; Heasler, Patrick G.; Toloczko, Mychailo B.

    2011-03-13

    Multi-state Markov modeling has proved to be a promising approach to estimating the reliability of passive components - particularly metallic pipe components - in the context of probabilistic risk assessment (PRA). These models consider the progressive degradation of a component through a series of observable discrete states, such as detectable flaw, leak and rupture. Service data then generally provides the basis for estimating the state transition rates. Research in materials science is producing a growing understanding of the physical phenomena that govern the aging degradation of passive pipe components. As a result, there is an emerging opportunity to incorporate these insights into PRA. This paper describes research conducted under the Risk-Informed Safety Margin Characterization Pathway of the Department of Energy’s Light Water Reactor Sustainability Program. A state transition model is described that addresses aging behavior associated with stress corrosion cracking in ASME Class 1 dissimilar metal welds – a component type relevant to LOCA analysis. The state transition rate estimates are based on physics models of weld degradation rather than service data. The resultant model is found to be non-Markov in that the transition rates are time-inhomogeneous and stochastic. Numerical solutions to the model provide insight into the effect of aging on component reliability.
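
    A stripped-down version of such a state transition model can be written as a small ODE system over the states intact, detectable flaw, leak and rupture, with transition rates that grow with component age to mimic time-inhomogeneous degradation. The rate functions below are invented placeholders standing in for the physics-based estimates of the paper.

        import numpy as np

        def transition_rates(t):
            """Hypothetical age-dependent transition rates per year (placeholders for
            the physics-based estimates of weld degradation)."""
            lam_flaw = 1e-3 * (1 + 0.05 * t)   # intact -> detectable flaw
            lam_leak = 5e-4 * (1 + 0.08 * t)   # detectable flaw -> leak
            lam_rupt = 1e-4 * (1 + 0.10 * t)   # leak -> rupture
            return lam_flaw, lam_leak, lam_rupt

        p = np.array([1.0, 0.0, 0.0, 0.0])     # P(intact, flaw, leak, rupture) at t = 0
        dt, years = 0.01, 60
        for step in range(int(years / dt)):
            a, b, c = transition_rates(step * dt)
            q = np.array([[-a,   a,  0.0, 0.0],
                          [0.0, -b,   b,  0.0],
                          [0.0, 0.0, -c,   c ],
                          [0.0, 0.0, 0.0, 0.0]])
            p = p + dt * (p @ q)               # forward Euler on dp/dt = p Q(t)

        print("state probabilities after 60 years:", p)
        print("P(rupture):", p[3])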

  1. Multi-model approach to petroleum resource appraisal using analytic methodologies for probabilistic systems

    USGS Publications Warehouse

    Crovelli, R.A.

    1988-01-01

    The geologic appraisal model that is selected for a petroleum resource assessment depends upon purpose of the assessment, basic geologic assumptions of the area, type of available data, time available before deadlines, available human and financial resources, available computer facilities, and, most importantly, the available quantitative methodology with corresponding computer software and any new quantitative methodology that would have to be developed. Therefore, different resource assessment projects usually require different geologic models. Also, more than one geologic model might be needed in a single project for assessing different regions of the study or for cross-checking resource estimates of the area. Some geologic analyses used in the past for petroleum resource appraisal involved play analysis. The corresponding quantitative methodologies of these analyses usually consisted of Monte Carlo simulation techniques. A probabilistic system of petroleum resource appraisal for play analysis has been designed to meet the following requirements: (1) includes a variety of geologic models, (2) uses an analytic methodology instead of Monte Carlo simulation, (3) possesses the capacity to aggregate estimates from many areas that have been assessed by different geologic models, and (4) runs quickly on a microcomputer. Geologic models consist of four basic types: reservoir engineering, volumetric yield, field size, and direct assessment. Several case histories and present studies by the U.S. Geological Survey are discussed. ?? 1988 International Association for Mathematical Geology.

  2. The MCRA model for probabilistic single-compound and cumulative risk assessment of pesticides.

    PubMed

    van der Voet, Hilko; de Boer, Waldo J; Kruisselbrink, Johannes W; Goedhart, Paul W; van der Heijden, Gerie W A M; Kennedy, Marc C; Boon, Polly E; van Klaveren, Jacob D

    2015-05-01

    Pesticide risk assessment is hampered by worst-case assumptions leading to overly pessimistic assessments. On the other hand, cumulative health effects of similar pesticides are often not taken into account. This paper describes models and a web-based software system developed in the European research project ACROPOLIS. The models are appropriate for both acute and chronic exposure assessments of single compounds and of multiple compounds in cumulative assessment groups. The software system MCRA (Monte Carlo Risk Assessment) is available for stakeholders in pesticide risk assessment at mcra.rivm.nl. We describe the MCRA implementation of the methods as advised in the 2012 EFSA Guidance on probabilistic modelling, as well as more refined methods developed in the ACROPOLIS project. The emphasis is on cumulative assessments. Two approaches, sample-based and compound-based, are contrasted. It is shown that additional data on agricultural use of pesticides may give more realistic risk assessments. Examples are given of model and software validation of acute and chronic assessments, using both simulated data and comparisons against the previous release of MCRA and against the standard software DEEM-FCID used by the Environmental Protection Agency in the USA. It is shown that the EFSA Guidance pessimistic model may not always give an appropriate modelling of exposure.

  3. Multi-model approach to petroleum resource appraisal using analytic methodologies for probabilistic systems

    SciTech Connect

    Crovelli, R.A.

    1988-11-01

    The geologic appraisal model that is selected for a petroleum resource assessment depends upon purpose of the assessment, basic geologic assumptions of the area, type of available data, time available before deadlines, available human and financial resources, available computer facilities, and, most importantly, the available quantitative methodology with corresponding computer software and any new quantitative methodology that would have to be developed. Therefore, different resource assessment projects usually require different geologic models. Also, more than one geologic model might be needed in a single project for assessing different regions of the study or for cross-checking resource estimates of the area. Some geologic analyses used in the past for petroleum resource appraisal involved play analysis. The corresponding quantitative methodologies of these analyses usually consisted of Monte Carlo simulation techniques. A probabilistic system of petroleum resource appraisal for play analysis has been designed to meet the following requirements: (1) includes a variety of geologic models, (2) uses an analytic methodology instead of Monte Carlo simulation, (3) possesses the capacity to aggregate estimates from many areas that have been assessed by different geologic models, and (4) runs quickly on a microcomputer. Geologic models consist of four basic types: reservoir engineering, volumetric yield, field size, and direct assessment. Several case histories and present studies by the US Geological Survey are discussed.

  4. A probabilistic tornado wind hazard model for the continental United States

    SciTech Connect

    Hossain, Q; Kimball, J; Mensing, R; Savy, J

    1999-04-19

    A probabilistic tornado wind hazard model for the continental United States (CONUS) is described. The model incorporates both aleatory (random) and epistemic uncertainties associated with quantifying the tornado wind hazard parameters. The temporal occurrence of tornadoes within the CONUS is assumed to be a Poisson process. A spatial distribution of tornado touchdown locations is developed empirically from the observed historical events within the CONUS. The hazard model is an areal probability model that takes into consideration the size and orientation of the facility, the length and width of the tornado damage area (idealized as a rectangle and dependent on the tornado intensity scale), wind speed variation within the damage area, tornado intensity classification errors (i.e., errors in assigning a Fujita intensity scale based on surveyed damage), and the tornado path direction. Epistemic uncertainties in describing the distributions of the aleatory variables are accounted for by using more than one distribution model to describe aleatory variations. The epistemic uncertainties are based on inputs from a panel of experts. A computer program, TORNADO, has been developed incorporating this model; features of this program are also presented.
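
    A minimal point-target sketch of the areal strike-probability idea is given below, assuming a hypothetical regional occurrence rate and hypothetical damage areas by intensity class; it ignores the facility size and orientation, path direction and classification-error treatment of the full model.

```python
# Simplified point-target tornado strike model (illustrative numbers only):
# annual strike frequency = regional occurrence rate * mean damage area / region area,
# then exceedance probability over a mission time from the Poisson assumption.
import numpy as np

region_area_km2 = 1.0e5          # hypothetical study region
annual_tornadoes = 25.0          # hypothetical regional occurrence rate (per year)
# Hypothetical damage area (length x width, km^2) and relative frequency by intensity class
damage_area = np.array([0.5, 2.0, 8.0])       # e.g., weak / strong / violent
class_fraction = np.array([0.8, 0.18, 0.02])

strike_rate = annual_tornadoes * np.sum(class_fraction * damage_area) / region_area_km2
years = 50.0
p_exceed = 1.0 - np.exp(-strike_rate * years)   # P(at least one strike in 50 years)
print(f"annual strike rate = {strike_rate:.2e} /yr, 50-yr strike probability = {p_exceed:.2e}")
```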

  5. Probabilistic prediction of cyanobacteria abundance in a Korean reservoir using a Bayesian Poisson model

    NASA Astrophysics Data System (ADS)

    Cha, YoonKyung; Park, Seok Soon; Kim, Kyunghyun; Byeon, Myeongseop; Stow, Craig A.

    2014-03-01

    There have been increasing reports of harmful algal blooms (HABs) worldwide. However, the factors that influence cyanobacteria dominance and HAB formation can be site-specific and idiosyncratic, making prediction challenging. The drivers of cyanobacteria blooms in Lake Paldang, South Korea, the summer climate of which is strongly affected by the East Asian monsoon, may differ from those in well-studied North American lakes. Using the observational data sampled during the growing season in 2007-2011, a Bayesian hurdle Poisson model was developed to predict cyanobacteria abundance in the lake. The model allowed cyanobacteria absence (zero count) and nonzero cyanobacteria counts to be modeled as functions of different environmental factors. The model predictions demonstrated that the principal factor that determines the success of cyanobacteria was temperature. Combined with high temperature, increased residence time indicated by low outflow rates appeared to increase the probability of cyanobacteria occurrence. A stable water column, represented by low suspended solids, and high temperature were the requirements for high abundance of cyanobacteria. Our model results had management implications; the model can be used to forecast cyanobacteria watch or alert levels probabilistically and develop mitigation strategies of cyanobacteria blooms.
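
    A minimal non-Bayesian sketch of a hurdle Poisson likelihood on synthetic data is shown below: the occurrence part is a logistic function of temperature and the positive counts follow a zero-truncated Poisson. The covariate choice and the maximum-likelihood fit are simplifications of the paper's Bayesian hierarchical model, which would place priors on the coefficients and sample them with MCMC.

```python
# Hurdle Poisson sketch: logistic occurrence model + zero-truncated Poisson counts.
# Data are synthetic; parameters (a, b, c, d) are fit by maximum likelihood.
import numpy as np
from scipy.optimize import minimize
from scipy.special import gammaln

rng = np.random.default_rng(0)
temp = rng.uniform(10, 30, 200)
occur = rng.random(200) < 1 / (1 + np.exp(-(-6 + 0.3 * temp)))      # synthetic presence
counts = np.where(occur, rng.poisson(np.exp(-1 + 0.15 * temp)) + 1, 0)

def negloglik(theta):
    a, b, c, d = theta
    p = 1 / (1 + np.exp(-(a + b * temp)))          # P(cyanobacteria present)
    lam = np.exp(c + d * temp)                     # mean of the positive part
    zero = counts == 0
    ll_zero = np.log(1 - p[zero]).sum()
    k = counts[~zero]
    ll_pos = (np.log(p[~zero]) + k * np.log(lam[~zero]) - lam[~zero]
              - gammaln(k + 1) - np.log(1 - np.exp(-lam[~zero]))).sum()
    return -(ll_zero + ll_pos)

fit = minimize(negloglik, x0=[0, 0, 0, 0], method="Nelder-Mead",
               options={"maxiter": 5000})
print("estimated (a, b, c, d):", fit.x)
```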

  6. Probabilistic Structural Analysis Program

    NASA Technical Reports Server (NTRS)

    Pai, Shantaram S.; Chamis, Christos C.; Murthy, Pappu L. N.; Stefko, George L.; Riha, David S.; Thacker, Ben H.; Nagpal, Vinod K.; Mital, Subodh K.

    2010-01-01

    NASA/NESSUS 6.2c is a general-purpose, probabilistic analysis program that computes probability of failure and probabilistic sensitivity measures of engineered systems. Because NASA/NESSUS uses highly computationally efficient and accurate analysis techniques, probabilistic solutions can be obtained even for extremely large and complex models. Once the probabilistic response is quantified, the results can be used to support risk-informed decisions regarding reliability for safety-critical and one-of-a-kind systems, as well as for maintaining a level of quality while reducing manufacturing costs for larger-quantity products. NASA/NESSUS has been successfully applied to a diverse range of problems in aerospace, gas turbine engines, biomechanics, pipelines, defense, weaponry, and infrastructure. This program combines state-of-the-art probabilistic algorithms with general-purpose structural analysis and lifing methods to compute the probabilistic response and reliability of engineered structures. Uncertainties in load, material properties, geometry, boundary conditions, and initial conditions can be simulated. The structural analysis methods include non-linear finite-element methods, heat-transfer analysis, polymer/ceramic matrix composite analysis, monolithic (conventional metallic) materials life-prediction methodologies, boundary element methods, and user-written subroutines. Several probabilistic algorithms are available such as the advanced mean value method and the adaptive importance sampling method. NASA/NESSUS 6.2c is structured in a modular format with 15 elements.

  7. A performance weighting procedure for GCMs based on explicit probabilistic models and accounting for observation uncertainty

    NASA Astrophysics Data System (ADS)

    Renard, Benjamin; Vidal, Jean-Philippe

    2016-04-01

    In recent years, the climate modeling community has put a lot of effort into releasing the outputs of multimodel experiments for use by the wider scientific community. In such experiments, several structurally distinct GCMs are run using the same observed forcings (for the historical period) or the same projected forcings (for the future period). In addition, several members are produced for a single given model structure, by running each GCM with slightly different initial conditions. This multiplicity of GCM outputs offers many opportunities in terms of uncertainty quantification or GCM comparisons. In this presentation, we propose a new procedure to weight GCMs according to their ability to reproduce the observed climate. Such weights can be used to combine the outputs of several models in a way that rewards good-performing models and discards poorly-performing ones. The proposed procedure has the following main properties: 1. It is based on explicit probabilistic models describing the time series produced by the GCMs and the corresponding historical observations, 2. It can use several members whenever available, 3. It accounts for the uncertainty in observations, 4. It assigns a weight to each GCM (all weights summing up to one), 5. It can also assign a weight to the "H0 hypothesis" that all GCMs in the multimodel ensemble are not compatible with observations. The application of the weighting procedure is illustrated with several case studies including synthetic experiments, simple cases where the target GCM output is a simple univariate variable and more realistic cases where the target GCM output is a multivariate and/or a spatial variable. These case studies illustrate the generality of the procedure which can be applied in a wide range of situations, as long as the analyst is prepared to make an explicit probabilistic assumption on the target variable. Moreover, these case studies highlight several interesting properties of the weighting procedure. In
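
    A minimal sketch of likelihood-based weighting under a Gaussian error model that includes observation uncertainty is given below. The GCM names, error variances and the absence of an explicit H0 weight and multi-member handling are simplifying assumptions for illustration, not the authors' formulation.

```python
# Likelihood-based model weights with an explicit observation-error term.
import numpy as np

def log_likelihood(model_series, obs_series, sigma_model, sigma_obs):
    # independent Gaussian errors; total variance = model error + observation error
    var = sigma_model**2 + sigma_obs**2
    resid = model_series - obs_series
    return -0.5 * np.sum(resid**2 / var + np.log(2 * np.pi * var))

rng = np.random.default_rng(1)
obs = rng.normal(15.0, 1.0, 50)                      # e.g., 50 years of observed means
gcms = {"GCM-A": obs + rng.normal(0.2, 0.5, 50),     # hypothetical well-performing model
        "GCM-B": obs + rng.normal(2.0, 1.5, 50)}     # hypothetical biased model

loglik = np.array([log_likelihood(x, obs, sigma_model=1.0, sigma_obs=0.3)
                   for x in gcms.values()])
weights = np.exp(loglik - loglik.max())
weights /= weights.sum()                             # weights sum to one
print(dict(zip(gcms, weights)))
```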

  8. Using the Rasch model as an objective and probabilistic technique to integrate different soil properties

    NASA Astrophysics Data System (ADS)

    Rebollo, Francisco J.; Jesús Moral García, Francisco

    2016-04-01

    Soil apparent electrical conductivity (ECa) is one of the simplest, least expensive soil measurements that integrates many soil properties affecting crop productivity, including, for instance, soil texture, water content, and cation exchange capacity. The ECa measurements obtained with a 3100 Veris sensor, operating in both shallow (0-30 cm), ECs, and deep (0-90 cm), ECd, mode, can be used as additional and essential information in a probabilistic model, the Rasch model, with the aim of quantifying the overall soil fertility potential of an agricultural field. This quantification should integrate the main soil physical and chemical properties, which are measured in different units. In this work, the formulation of the Rasch model integrates 11 soil properties (clay, silt and sand content, organic matter -OM-, pH, total nitrogen -TN-, available phosphorus -AP- and potassium -AK-, cation exchange capacity -CEC-, ECd, and ECs) measured at 70 locations in a field. The main outputs of the model include a ranking of all soil samples according to their relative fertility potential and the identification of unexpected behaviour of some soil samples and properties. In the case study, the considered soil variables fit the model reasonably well and have an important influence on soil fertility, except pH, probably because of its homogeneity in the field. Moreover, ECd and ECs are the most influential properties for soil fertility, whereas AP and AK are the least influential. The use of the Rasch model to estimate soil fertility potential (always in a relative way, taking into account the characteristics of the studied soil) constitutes a new application of great practical importance: it makes it possible to rationally determine locations in a field where high soil fertility potential exists and to identify soil samples or properties with anomalous behaviour; this information can be necessary to conduct site-specific treatments, leading to a more cost-effective and sustainable field
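
    The sketch below illustrates the dichotomous Rasch idea on synthetic data: soil properties are binary-coded, each sample gets a latent "fertility" parameter and each property a "difficulty", and samples are ranked by the fitted latent parameter. The coding scheme, sample size and joint maximum-likelihood fit are simplifications assumed for illustration, not the study's formulation.

```python
# Dichotomous Rasch sketch: P(x=1) = 1 / (1 + exp(-(beta_sample - delta_property))).
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(2)
n_samples, n_props = 20, 5
true_beta = rng.normal(0, 1, n_samples)
true_delta = rng.normal(0, 1, n_props)
p = 1 / (1 + np.exp(-(true_beta[:, None] - true_delta[None, :])))
x = (rng.random((n_samples, n_props)) < p).astype(float)   # binary-coded soil data

def negloglik(theta):
    beta, delta = theta[:n_samples], theta[n_samples:]
    delta = delta - delta.mean()                            # identification constraint
    eta = beta[:, None] - delta[None, :]
    return -np.sum(x * eta - np.logaddexp(0.0, eta))        # Bernoulli log-likelihood

fit = minimize(negloglik, np.zeros(n_samples + n_props), method="L-BFGS-B")
beta_hat = fit.x[:n_samples]
print("sample ranking by relative fertility potential:", np.argsort(-beta_hat))
```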

  9. Face Recognition for Access Control Systems Combining Image-Difference Features Based on a Probabilistic Model

    NASA Astrophysics Data System (ADS)

    Miwa, Shotaro; Kage, Hiroshi; Hirai, Takashi; Sumi, Kazuhiko

    We propose a probabilistic face recognition algorithm for Access Control Systems (ACSs). Compared with existing ACSs that use low-cost IC cards, face recognition offers advantages in usability and security: it does not require people to hold cards over scanners and it does not accept impostors carrying authorized cards. Face recognition therefore attracts more interest in security markets than IC cards. However, in markets where low-cost ACSs compete on price, there are limits on the quality of available cameras and image control, so ACSs based on face recognition must handle much lower-quality images (e.g., defocused or poorly gain-controlled) than high-security systems such as immigration control. To tackle these image-quality problems we developed a face recognition algorithm based on a probabilistic model that combines a variety of image-difference features trained by Real AdaBoost with their prior probability distributions. The model evaluates and uses only the reliable features among the trained ones during each authentication, achieving high recognition performance. A field evaluation using a pseudo access control system installed in our office shows that the proposed system achieves a consistently high recognition rate independent of face image quality, with an EER (Equal Error Rate) about four times lower across a variety of image conditions than a system without prior probability distributions. In contrast, using image-difference features without prior probabilities is sensitive to image quality. We also evaluated PCA, which performs worse but consistently, because it is optimized over the data as a whole. Compared with PCA, Real AdaBoost without prior distributions performs about twice as well under good image conditions but degrades to PCA-level performance under poor image conditions.

  10. Probabilistic Material Strength Degradation Model for Inconel 718 Components Subjected to High Temperature, Mechanical Fatigue, Creep and Thermal Fatigue Effects

    NASA Technical Reports Server (NTRS)

    Bast, Callie Corinne Scheidt

    1994-01-01

    This thesis presents the on-going development of methodology for a probabilistic material strength degradation model. The probabilistic model, in the form of a postulated randomized multifactor equation, provides for quantification of uncertainty in the lifetime material strength of aerospace propulsion system components subjected to a number of diverse random effects. This model is embodied in the computer program entitled PROMISS, which can include up to eighteen different effects. Presently, the model includes four effects that typically reduce lifetime strength: high temperature, mechanical fatigue, creep, and thermal fatigue. Statistical analysis was conducted on experimental Inconel 718 data obtained from the open literature. This analysis provided regression parameters for use as the model's empirical material constants, thus calibrating the model specifically for Inconel 718. Model calibration was carried out for four variables, namely, high temperature, mechanical fatigue, creep, and thermal fatigue. Methodology to estimate standard deviations of these material constants for input into the probabilistic material strength model was developed. Using the current version of PROMISS, entitled PROMISS93, a sensitivity study for the combined effects of mechanical fatigue, creep, and thermal fatigue was performed. Results, in the form of cumulative distribution functions, illustrated the sensitivity of lifetime strength to any current value of an effect. In addition, verification studies comparing a combination of mechanical fatigue and high temperature effects by model to the combination by experiment were conducted. Thus, for Inconel 718, the basic model assumption of independence between effects was evaluated. Results from this limited verification study strongly supported this assumption.

  11. A probabilistic model framework for evaluating year-to-year variation in crop productivity

    NASA Astrophysics Data System (ADS)

    Yokozawa, M.; Iizumi, T.; Tao, F.

    2008-12-01

    Most models describing the relation between crop productivity and weather have so far focused on mean changes in crop yield. To keep the food supply stable against abnormal weather as well as climate change, evaluating the year-to-year variation in crop productivity, rather than the mean change, is more essential. We propose a new probabilistic model framework based on Bayesian inference and Monte Carlo simulation. As a first example, we introduce a model of paddy rice production in Japan called PRYSBI (Process-based Regional rice Yield Simulator with Bayesian Inference; Iizumi et al., 2008). Its structure is the same as that of SIMRIW, which was developed and is used widely in Japan. The model includes three sub-models describing phenological development, biomass accumulation and maturing of the rice crop, formulated to capture the response of the rice plant to weather conditions. The model was originally developed to predict rice growth and yield at the paddy-plot scale; we applied it to evaluate large-scale rice production while keeping the same model structure but treating the parameters as stochastic variables. To let the model reproduce actual yields at the larger scale, model parameters were estimated from agricultural statistical data for each prefecture of Japan together with weather data averaged over the region. The posterior probability distribution functions (PDFs) of the model parameters were obtained by Bayesian inference, using a Markov Chain Monte Carlo (MCMC) algorithm to solve Bayes' theorem numerically. To evaluate the year-to-year changes in rice growth and yield under this framework, we first iterate simulations with sets of parameter values sampled from the estimated posterior PDF of each parameter and then take the ensemble mean weighted with the posterior PDFs. We will also present another example for maize productivity in China. The
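
    A toy Metropolis-Hastings sketch of the calibrate-then-ensemble idea is given below, with a linear yield model standing in for the process-based simulator; the data, priors and proposal scales are illustrative assumptions, not PRYSBI's.

```python
# Toy Bayesian calibration of a yield model y = a + b*T, followed by a posterior
# ensemble prediction. A simple Metropolis sampler stands in for the study's MCMC.
import numpy as np

rng = np.random.default_rng(10)
temp = rng.normal(22, 1.5, 30)                            # 30 years of seasonal mean temperature
yield_obs = 4.5 + 0.12 * temp + rng.normal(0, 0.2, 30)    # synthetic statistical yields (t/ha)

def log_post(theta):
    a, b, log_sigma = theta
    sigma = np.exp(log_sigma)
    resid = yield_obs - (a + b * temp)
    loglik = -0.5 * np.sum(resid**2 / sigma**2 + np.log(2 * np.pi * sigma**2))
    logprior = -0.5 * (a**2 / 100 + b**2 / 1 + log_sigma**2 / 4)   # weak Gaussian priors
    return loglik + logprior

theta, lp = np.zeros(3), log_post(np.zeros(3))
samples = []
for i in range(20_000):
    prop = theta + rng.normal(0, [0.3, 0.02, 0.1])
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    if i > 5_000:                                          # discard burn-in
        samples.append(theta.copy())
samples = np.array(samples)
pred = samples[:, 0] + samples[:, 1] * 23.0                # posterior ensemble at T = 23 C
print("posterior mean yield at 23 C: %.2f +/- %.2f t/ha" % (pred.mean(), pred.std()))
```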

  12. A Probabilistic Model for Hydrokinetic Turbine Collision Risks: Exploring Impacts on Fish

    PubMed Central

    Hammar, Linus; Eggertsen, Linda; Andersson, Sandra; Ehnberg, Jimmy; Arvidsson, Rickard; Gullström, Martin; Molander, Sverker

    2015-01-01

    A variety of hydrokinetic turbines are currently under development for power generation in rivers, tidal straits and ocean currents. Because some of these turbines are large, with rapidly moving rotor blades, the risk of collision with aquatic animals has been brought to attention. The behavior and fate of animals that approach such large hydrokinetic turbines have not yet been monitored in any detail. In this paper, we conduct a synthesis of the current knowledge and understanding of hydrokinetic turbine collision risks. The outcome is a generic fault tree based probabilistic model suitable for estimating population-level ecological risks. New video-based data on fish behavior in strong currents are provided and models describing fish avoidance behaviors are presented. The findings indicate low risk for small-sized fish. However, at large turbines (≥5 m), bigger fish seem to have high probability of collision, mostly because rotor detection and avoidance is difficult in low visibility. Risks can therefore be substantial for vulnerable populations of large-sized fish, which thrive in strong currents. The suggested collision risk model can be applied to different turbine designs and at a variety of locations as basis for case-specific risk assessments. The structure of the model facilitates successive model validation, refinement and application to other organism groups such as marine mammals. PMID:25730314
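
    A minimal sketch of a fault-tree-style collision chain for a single fish passage is given below; all branch probabilities and the passage count are illustrative placeholders rather than values from the paper.

```python
# Per-passage collision chain (illustrative placeholders):
# P(collision) = P(encounter rotor) * (1 - P(avoid)) * P(struck | no avoidance).
p_encounter = 0.05         # fish crosses the rotor-swept area
p_avoid = 0.8              # active avoidance; lower in poor visibility
p_strike_given_pass = 0.3  # blade-strike probability for a non-avoiding fish

p_collision = p_encounter * (1 - p_avoid) * p_strike_given_pass
passages_per_year = 1_000  # hypothetical number of passages by the population
expected_collisions = p_collision * passages_per_year
print(f"per-passage collision probability = {p_collision:.4f}, "
      f"expected collisions/year = {expected_collisions:.1f}")
```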

  13. Using generative models to make probabilistic statements about hippocampal engagement in MEG.

    PubMed

    Meyer, Sofie S; Rossiter, Holly; Brookes, Matthew J; Woolrich, Mark W; Bestmann, Sven; Barnes, Gareth R

    2017-04-01

    Magnetoencephalography (MEG) enables non-invasive real time characterization of brain activity. However, convincing demonstrations of signal contributions from deeper sources such as the hippocampus remain controversial and are made difficult by its depth, structural complexity and proximity to neocortex. Here, we demonstrate a method for quantifying hippocampal engagement probabilistically using simulated hippocampal activity and realistic anatomical and electromagnetic source modelling. We construct two generative models, one which supports neuronal current flow on the cortical surface, and one which supports neuronal current flow on both the cortical and hippocampal surface. Using Bayesian model comparison, we then infer which of the two models provides a more likely explanation of the dataset at hand. We also carry out a set of control experiments to rule out bias, including simulating medial temporal lobe sources to assess the risk of falsely positive results, and adding different types of displacements to the hippocampal portion of the mesh to test for anatomical specificity of the results. In addition, we test the robustness of this inference by adding co-registration error and sensor level noise. We find that the model comparison framework is sensitive to hippocampal activity when co-registration error is <3 mm and the sensor-level signal-to-noise ratio (SNR) is >-20 dB. These levels of co-registration error and SNR can now be achieved empirically using recently developed subject-specific head-casts.

  14. A probabilistic sediment cascade model of sediment transfer in the Illgraben

    NASA Astrophysics Data System (ADS)

    Bennett, G. L.; Molnar, P.; McArdell, B. W.; Burlando, P.

    2014-02-01

    We present a probabilistic sediment cascade model to simulate sediment transfer in a mountain basin (Illgraben, Switzerland) where sediment is produced by hillslope landslides and rockfalls and exported out of the basin by debris flows and floods. The model conceptualizes the fluvial system as a spatially lumped cascade of connected reservoirs representing hillslope and channel storages where sediment goes through cycles of storage and remobilization by surface runoff. The model includes all relevant hydrological processes that lead to runoff formation in an Alpine basin, such as precipitation, snow accumulation, snowmelt, evapotranspiration, and soil water storage. Although the processes of sediment transfer and debris flow generation are described in a simplified manner, the model produces complex sediment discharge behavior which is driven by the availability of sediment and antecedent wetness conditions (system memory) as well as the triggering potential (climatic forcing). The observed probability distribution of debris flow volumes and their seasonality in 2000-2009 are reproduced. The stochasticity of hillslope sediment input is important for reproducing realistic sediment storage variability, although many details of the hillslope landslide triggering procedures are filtered out by the sediment transfer system. The model allows us to explicitly quantify the division into transport and supply-limited sediment discharge events. We show that debris flows may be generated for a wide range of rainfall intensities because of variable antecedent basin wetness and snowmelt contribution to runoff, which helps to understand the limitations of methods based on a single rainfall threshold for debris flow initiation in Alpine basins.

  15. A probabilistic model for hydrokinetic turbine collision risks: exploring impacts on fish.

    PubMed

    Hammar, Linus; Eggertsen, Linda; Andersson, Sandra; Ehnberg, Jimmy; Arvidsson, Rickard; Gullström, Martin; Molander, Sverker

    2015-01-01

    A variety of hydrokinetic turbines are currently under development for power generation in rivers, tidal straits and ocean currents. Because some of these turbines are large, with rapidly moving rotor blades, the risk of collision with aquatic animals has been brought to attention. The behavior and fate of animals that approach such large hydrokinetic turbines have not yet been monitored in any detail. In this paper, we conduct a synthesis of the current knowledge and understanding of hydrokinetic turbine collision risks. The outcome is a generic fault tree based probabilistic model suitable for estimating population-level ecological risks. New video-based data on fish behavior in strong currents are provided and models describing fish avoidance behaviors are presented. The findings indicate low risk for small-sized fish. However, at large turbines (≥5 m), bigger fish seem to have high probability of collision, mostly because rotor detection and avoidance is difficult in low visibility. Risks can therefore be substantial for vulnerable populations of large-sized fish, which thrive in strong currents. The suggested collision risk model can be applied to different turbine designs and at a variety of locations as basis for case-specific risk assessments. The structure of the model facilitates successive model validation, refinement and application to other organism groups such as marine mammals.

  16. Probabilistic modeling of school meals for potential bisphenol A (BPA) exposure.

    PubMed

    Hartle, Jennifer C; Fox, Mary A; Lawrence, Robert S

    2016-01-01

    Many endocrine-disrupting chemicals (EDCs), including bisphenol A (BPA), are approved for use in food packaging, with unbound BPA migrating into the foods it contacts. Children, with their developing organ systems, are especially susceptible to hormone disruption, prompting this research to model the potential dose of BPA from school-provided meals. Probabilistic exposure models for school meals were informed by mixed methods. Exposure scenarios were based on United States school nutrition guidelines and included meals with varying levels of exposure potential from canned and packaged food. BPA exposure potentials were modeled with a range of 0.00049 μg/kg-BW/day for a middle school student with a low exposure breakfast and plate waste to 1.19 μg/kg-BW/day for an elementary school student eating lunch with high exposure potential. The modeled BPA doses from school meals are below the current US EPA Oral Reference Dose (RfD) of 50 μg/kg-BW/day. Recent research shows BPA animal toxicity thresholds at 2 μg/kg-BW/day. The single meal doses modeled in this research are at the same order of magnitude as the low-dose toxicity thresholds, illustrating the potential for school meals to expose children to chronic toxic levels of BPA.
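
    The sketch below shows the general shape of such a probabilistic single-meal dose calculation; the concentration, portion-size and body-weight distributions are illustrative assumptions, not the study's fitted inputs.

```python
# Monte Carlo single-meal BPA dose sketch:
# dose (ug/kg-BW/day) = sum over items of concentration (ug/kg food) * amount (kg) / body weight (kg).
import numpy as np

rng = np.random.default_rng(3)
n = 100_000
conc_canned = rng.lognormal(mean=np.log(10.0), sigma=0.8, size=n)   # ug/kg in canned items
conc_fresh = rng.lognormal(mean=np.log(0.5), sigma=0.6, size=n)     # ug/kg in other items
amount_canned = rng.triangular(0.05, 0.10, 0.20, size=n)            # kg per meal
amount_fresh = rng.triangular(0.10, 0.20, 0.35, size=n)
body_weight = rng.normal(32.0, 6.0, size=n)                         # elementary-school child, kg

dose = (conc_canned * amount_canned + conc_fresh * amount_fresh) / body_weight
print("median =", np.median(dose), "95th percentile =", np.percentile(dose, 95), "ug/kg-BW/day")
```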

  17. Development of Simplified Probabilistic Risk Assessment Model for Seismic Initiating Event

    SciTech Connect

    S. Khericha; R. Buell; S. Sancaktar; M. Gonzalez; F. Ferrante

    2012-06-01

    This paper discusses a simplified method to evaluate seismic risk using a methodology built on dividing the seismic intensity spectrum into multiple discrete bins. The seismic probabilistic risk assessment model uses the Nuclear Regulatory Commission's (NRC's) full-power Standardized Plant Analysis Risk (SPAR) model as the starting point for development. The seismic PRA models are integrated with their respective internal-events at-power SPAR models. This is accomplished by combining the modified system fault trees from the full-power SPAR model with seismic event tree logic. The peak ground acceleration is divided into five bins. The g-value for each bin is estimated as the geometric mean of the lower and upper bounds of that bin, and the associated frequency for each bin is estimated by taking the difference between the hazard exceedance frequencies at the upper and lower bounds of that bin. The component fragilities are calculated for each bin using plant data, if available, or generic values of median peak ground acceleration capacity and uncertainty for the components. For human reliability analysis (HRA), the SPAR HRA (SPAR-H) method is used, which requires analysts to complete relatively straightforward worksheets that include the performance shaping factors (PSFs). The results are then used to estimate human error probabilities (HEPs) of interest. This work is expected to improve the NRC's ability to include seismic hazards in risk assessments for operational events in support of the reactor oversight program (e.g., significance determination process).
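
    A minimal numerical sketch of the binned quantification described above is given below; the hazard curve, bin edges and lognormal fragility parameters are illustrative assumptions.

```python
# Binned seismic quantification sketch: representative g-values are the geometric
# means of the bin bounds, bin frequencies come from differences of a hazard curve,
# and a lognormal fragility gives the conditional failure probability per bin.
import numpy as np
from scipy.stats import norm

bin_edges = np.array([0.1, 0.2, 0.4, 0.6, 0.8, 1.2])        # PGA bounds, g (illustrative)
def exceedance_freq(pga):                                    # hypothetical hazard curve (/yr)
    return 1e-3 * (pga / 0.1) ** -2.0

g_rep = np.sqrt(bin_edges[:-1] * bin_edges[1:])              # geometric mean per bin
freq = exceedance_freq(bin_edges[:-1]) - exceedance_freq(bin_edges[1:])

Am, beta = 0.9, 0.45                                         # median capacity (g), composite beta
p_fail = norm.cdf(np.log(g_rep / Am) / beta)                 # lognormal fragility per bin

seismic_failure_freq = np.sum(freq * p_fail)                 # per year
print("bin g-values:", np.round(g_rep, 3))
print("seismic failure frequency ~", seismic_failure_freq, "/yr")
```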

  18. Probabilistic versus Deterministic Skill in Predicting the Western North Pacific-East Asian Summer Monsoon Variability with Multi-Model Ensembles

    NASA Astrophysics Data System (ADS)

    Yang, D.; Yang, X. Q.; Xie, Q.; Zhang, Y.; Ren, X.; Tang, Y.

    2015-12-01

    Based on the historical forecasts of three quasi-operational multi-model ensemble (MME) systems, this study assesses the superiorities of the coupled MME over its contributing single-model ensembles (SMEs) and over the uncoupled atmospheric MME in predicting the seasonal variability of the Western North Pacific-East Asian summer monsoon. The seasonal prediction skill of the monsoon is measured by Brier skill score (BSS) in the sense of probabilistic forecast as well as by anomaly correlation (AC) in the sense of deterministic forecast. The probabilistic forecast skill of the MME is found to be always significantly better than that of each participating SME, while the deterministic forecast skill of the MME is even worse than that of some SME. The BSS is composed of reliability and resolution, two attributes characterizing probabilistic forecast skill. The probabilistic skill increase of the MME is dominated by the drastic improvement in reliability, while resolution is not always improved, similar to AC. A monotonous resolution-AC relationship is further found and qualitatively understood, whereas little relationship can be identified between reliability and AC. It is argued that the MME's success in improving the reliability possibly arises from an effective reduction of biases and overconfidence in forecast distributions. The coupled MME is much more skillful than the uncoupled atmospheric MME forced by persisted sea surface temperature (SST) anomalies. This advantage is mainly attributed to its better capability in capturing the evolution of the underlying seasonal SST anomaly.
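
    The sketch below computes the Brier score, its reliability/resolution decomposition, and the Brier skill score for a synthetic binary-event forecast; the data and probability binning are illustrative.

```python
# Brier score decomposition: BS = reliability - resolution + uncertainty,
# and BSS = 1 - BS / BS_climatology for a binary event.
import numpy as np

rng = np.random.default_rng(4)
obs = (rng.random(500) < 0.4).astype(float)                  # event occurred (1) or not (0)
fcst = np.clip(0.4 + 0.3 * (obs - 0.4) + rng.normal(0, 0.15, 500), 0, 1)

bs = np.mean((fcst - obs) ** 2)
obar = obs.mean()
bins = np.digitize(fcst, np.linspace(0, 1, 11)[1:-1])        # 10 probability bins
rel = res = 0.0
for k in np.unique(bins):
    idx = bins == k
    rel += idx.sum() * (fcst[idx].mean() - obs[idx].mean()) ** 2
    res += idx.sum() * (obs[idx].mean() - obar) ** 2
rel, res = rel / len(obs), res / len(obs)
unc = obar * (1 - obar)
bss = 1 - bs / unc
print(f"BS={bs:.3f}  reliability={rel:.3f}  resolution={res:.3f}  BSS={bss:.3f}")
```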

  19. Probabilistic model for Listeria monocytogenes growth during distribution, retail storage, and domestic storage of pasteurized milk.

    PubMed

    Koutsoumanis, Konstantinos; Pavlis, Athanasios; Nychas, George-John E; Xanthiakos, Konstantinos

    2010-04-01

    A survey on the time-temperature conditions of pasteurized milk in Greece during transportation to retail, retail storage, and domestic storage and handling was performed. The data derived from the survey were described with appropriate probability distributions and introduced into a growth model of Listeria monocytogenes in pasteurized milk which was appropriately modified for taking into account strain variability. Based on the above components, a probabilistic model was applied to evaluate the growth of L. monocytogenes during the chill chain of pasteurized milk using a Monte Carlo simulation. The model predicted that, in 44.8% of the milk cartons released in the market, the pathogen will grow until the time of consumption. For these products the estimated mean total growth of L. monocytogenes during transportation, retail storage, and domestic storage was 0.93 log CFU, with 95th and 99th percentiles of 2.68 and 4.01 log CFU, respectively. Although based on EU regulation 2073/2005 pasteurized milk produced in Greece belongs to the category of products that do not allow the growth of L. monocytogenes due to a shelf life (defined by law) of 5 days, the above results show that this shelf life limit cannot prevent L. monocytogenes from growing under the current chill chain conditions. The predicted percentage of milk cartons-initially contaminated with 1 cell/1-liter carton-in which the pathogen exceeds the safety criterion of 100 cells/ml at the time of consumption was 0.14%. The probabilistic model was used for an importance analysis of the chill chain factors, using rank order correlation, while selected intervention and shelf life increase scenarios were evaluated. The results showed that simple interventions, such as excluding the door shelf from the domestic storage of pasteurized milk, can effectively reduce the growth of the pathogen. The door shelf was found to be the warmest position in domestic refrigerators, and it was most frequently used by the

  20. Maxwell and very-hard-particle models for probabilistic ballistic annihilation: Hydrodynamic description

    NASA Astrophysics Data System (ADS)

    Coppex, François; Droz, Michel; Trizac, Emmanuel

    2005-08-01

    The hydrodynamic description of probabilistic ballistic annihilation, for which no conservation laws hold, is an intricate problem with hard spherelike dynamics for which no exact solution exists. We consequently focus on simplified approaches, the Maxwell and very-hard-particle (VHP) models, which allows us to compute analytically upper and lower bounds for several quantities. The purpose is to test the possibility of describing such a far from equilibrium dynamics with simplified kinetic models. The motivation is also in turn to assess the relevance of some singular features appearing within the original model and the approximations invoked to study it. The scaling exponents are first obtained from the (simplified) Boltzmann equation, and are confronted against direct Monte Carlo simulations. Then, the Chapman-Enskog method is used to obtain constitutive relations and transport coefficients. The corresponding Navier-Stokes equations for the hydrodynamic fields are derived for both Maxwell and VHP models. We finally perform a linear stability analysis around the homogeneous solution, which illustrates the importance of dissipation in the possible development of spatial inhomogeneities.

  1. Sensitivity analysis of a two-dimensional probabilistic risk assessment model using analysis of variance.

    PubMed

    Mokhtari, Amirhossein; Frey, H Christopher

    2005-12-01

    This article demonstrates application of sensitivity analysis to risk assessment models with two-dimensional probabilistic frameworks that distinguish between variability and uncertainty. A microbial food safety process risk (MFSPR) model is used as a test bed. The process of identifying key controllable inputs and key sources of uncertainty using sensitivity analysis is challenged by typical characteristics of MFSPR models such as nonlinearity, thresholds, interactions, and categorical inputs. Among many available sensitivity analysis methods, analysis of variance (ANOVA) is evaluated in comparison to commonly used methods based on correlation coefficients. In a two-dimensional risk model, the identification of key controllable inputs that can be prioritized with respect to risk management is confounded by uncertainty. However, as shown here, ANOVA provided robust insights regarding controllable inputs most likely to lead to effective risk reduction despite uncertainty. ANOVA appropriately selected the top six important inputs, while correlation-based methods provided misleading insights. Bootstrap simulation is used to quantify uncertainty in ranks of inputs due to sampling error. For the selected sample size, differences in F values of 60% or more were associated with clear differences in rank order between inputs. Sensitivity analysis results identified inputs related to the storage of ground beef servings at home as the most important. Risk management recommendations are suggested in the form of a consumer advisory for better handling and storage practices.
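
    A minimal sketch of ANOVA-based sensitivity ranking is shown below, using a toy risk model in place of the microbial food-safety model; inputs are binned into levels and ranked by the one-way ANOVA F statistic.

```python
# ANOVA-based sensitivity ranking: bin each input into levels and compute the
# one-way ANOVA F statistic of the output across levels; larger F = more important.
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(5)
n = 2000
storage_temp = rng.normal(7, 2, n)          # influential input (toy model)
storage_time = rng.uniform(0, 48, n)        # influential input
ph = rng.normal(6.0, 0.3, n)                # weak input
risk = 0.05 * storage_temp + 0.01 * storage_time + 0.001 * ph + rng.normal(0, 0.1, n)

def anova_f(x, y, n_levels=5):
    levels = np.digitize(x, np.quantile(x, np.linspace(0, 1, n_levels + 1)[1:-1]))
    groups = [y[levels == k] for k in range(n_levels)]
    return f_oneway(*groups).statistic

inputs = {"storage_temp": storage_temp, "storage_time": storage_time, "pH": ph}
scores = {name: anova_f(x, risk) for name, x in inputs.items()}
print(sorted(scores.items(), key=lambda kv: -kv[1]))
```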

  2. Uncertainty analyses of fuel hydrocarbon biodegradation signatures in ground water by probabilistic modeling

    SciTech Connect

    McNab, W.W. Jr.; Dooher, B.P.

    1998-07-01

    Natural attenuation processes, such as biodegradation, may serve as a means for remediating ground water contaminated by fuel hydrocarbons from leaking underground fuel tanks (LUFTs). Quantification of the uncertainties associated with natural attenuation, and hence the capacity to limit plume migration and restore an aquifer, is important. In this study, a probabilistic screening model is developed to quantify uncertainties involved in the impact of biodegradation on hydrocarbon plume behavior. The approach is based on Monte Carlo simulation using an analytical solution to the advective-dispersive solute transport equation, including a first-order degradation term, coupled with mass balance constraints on electron acceptor use. Empirical probability distributions for governing parameters are provided as input to the model. Application of the model to an existing LUFT site illustrates the degree of uncertainty associated with model-predicted hydrocarbon concentrations and geochemical indicators at individual site monitoring wells as well as the role of various parameter assumptions (e.g., hydraulic conductivity, first-order decay coefficient, source term) in influencing forecasts. This information is useful for risk management planning because the degree of confidence that biodegradation will limit the impact of a hydrocarbon plume on potential receptors can be quantified.
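
    The sketch below illustrates the Monte Carlo screening idea with the closed-form steady-state solution of the 1-D advection-dispersion equation with first-order decay; the parameter distributions, receptor distance and threshold are illustrative assumptions rather than site values.

```python
# Steady-state 1-D advection-dispersion with first-order decay:
# C(x) = C0 * exp( x/(2*alpha) * (1 - sqrt(1 + 4*lambda*alpha/v)) ),
# sampled over uncertain parameters to get a probabilistic receptor concentration.
import numpy as np

rng = np.random.default_rng(6)
n = 50_000
c0 = rng.lognormal(np.log(10.0), 0.5, n)        # source concentration, mg/L
v = rng.lognormal(np.log(30.0), 0.7, n)         # seepage velocity, m/yr
alpha = rng.uniform(1.0, 10.0, n)               # longitudinal dispersivity, m
lam = rng.lognormal(np.log(0.5), 0.8, n)        # first-order decay rate, 1/yr
x = 100.0                                       # distance to receptor, m

c_x = c0 * np.exp(x / (2 * alpha) * (1 - np.sqrt(1 + 4 * lam * alpha / v)))
print("P(C at receptor > 0.005 mg/L) =", np.mean(c_x > 0.005))
```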

  3. Identifying biological concepts from a protein-related corpus with a probabilistic topic model

    PubMed Central

    Zheng, Bin; McLean, David C; Lu, Xinghua

    2006-01-01

    Background Biomedical literature, e.g., MEDLINE, contains a wealth of knowledge regarding functions of proteins. Major recurring biological concepts within such text corpora represent the domains of this body of knowledge. The goal of this research is to identify the major biological topics/concepts from a corpus of protein-related MEDLINE© titles and abstracts by applying a probabilistic topic model. Results The latent Dirichlet allocation (LDA) model was applied to the corpus. Based on the Bayesian model selection, 300 major topics were extracted from the corpus. The majority of identified topics/concepts was found to be semantically coherent and most represented biological objects or concepts. The identified topics/concepts were further mapped to the controlled vocabulary of the Gene Ontology (GO) terms based on mutual information. Conclusion The major and recurring biological concepts within a collection of MEDLINE documents can be extracted by the LDA model. The identified topics/concepts provide parsimonious and semantically-enriched representation of the texts in a semantic space with reduced dimensionality and can be used to index text. PMID:16466569
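
    A minimal LDA sketch on a toy corpus is shown below; scikit-learn is used purely for illustration, whereas the study fit LDA to a large protein-related MEDLINE corpus, selected 300 topics by Bayesian model selection, and mapped topics to GO terms by mutual information.

```python
# Toy LDA example: fit two topics to four short "abstracts" and print top terms.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

docs = ["kinase phosphorylates substrate in signal transduction",
        "ribosome translates mrna into protein",
        "kinase inhibitor blocks phosphorylation of the receptor",
        "mrna splicing precedes translation by the ribosome"]

counts = CountVectorizer(stop_words="english").fit(docs)
X = counts.transform(docs)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)

terms = counts.get_feature_names_out()
for k, topic in enumerate(lda.components_):
    top = topic.argsort()[::-1][:4]
    print(f"topic {k}:", [terms[i] for i in top])
```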

  4. A Probabilistic Model for Sequence Alignment with Context-Sensitive Indels

    NASA Astrophysics Data System (ADS)

    Hickey, Glenn; Blanchette, Mathieu

    Probabilistic approaches for sequence alignment are usually based on pair Hidden Markov Models (HMMs) or Stochastic Context Free Grammars (SCFGs). Recent studies have shown a significant correlation between the content of short indels and their flanking regions, which by definition cannot be modelled by the above two approaches. In this work, we present a context-sensitive indel model based on a pair Tree-Adjoining Grammar (TAG), along with accompanying algorithms for efficient alignment and parameter estimation. The increased precision and statistical power of this model is shown on simulated and real genomic data. As the cost of sequencing plummets, the usefulness of comparative analysis is becoming limited by alignment accuracy rather than data availability. Our results will therefore have an impact on any type of downstream comparative genomics analyses that rely on alignments. Fine-grained studies of small functional regions or disease markers, for example, could be significantly improved by our method. The implementation is available at http://www.mcb.mcgill.ca/~blanchem/software.html

  5. Regional gold potential mapping in Kelantan (Malaysia) using probabilistic based models and GIS

    NASA Astrophysics Data System (ADS)

    Yusoff, Suhaimizi; Pradhan, Biswajeet; Manap, Mohamad Abd; Shafri, Helmi Zulhaidi Mohd

    2015-06-01

    The aim of this study is to test and compare two probabilistic-based models (frequency ratio and weights-of-evidence) with regard to regional gold potential mapping at Kelantan, Malaysia. Until now these models have not been used for the purpose of mapping gold potential areas in Malaysia. This study analyzed the spatial relationship between gold deposits and geological factors such as lithology, faults, geochemical and geophysical data in geographical information system (GIS) software. Eight (8) gold deposits and five (5) related factors were identified and quantified for their spatial relationships. Then, all factors were combined to generate a predictive gold potential map. The predictive maps were then validated by comparing them with known gold deposits using receiver operating characteristics (ROC) and "area under the curve" (AUC) graphs. The validation showed accuracies of 80% for the frequency ratio model and 74% for the weights-of-evidence model. The results demonstrated the usefulness of frequency ratio and weights-of-evidence modeling techniques in mineral exploration work to discover unknown gold deposits in Kelantan, Malaysia.
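
    The frequency-ratio calculation itself is simple, as the sketch below illustrates on a hypothetical single-layer raster with hypothetical deposit locations; the real analysis combines several evidential layers in GIS.

```python
# Frequency ratio per class of one evidential layer:
# FR = (fraction of known deposits in the class) / (fraction of study area in the class).
import numpy as np

rng = np.random.default_rng(7)
lithology = rng.integers(0, 3, size=(200, 200))           # hypothetical 3-class raster
deposit_cells = [(20, 30), (22, 35), (120, 40), (125, 44),
                 (60, 150), (61, 152), (180, 20), (90, 90)]  # 8 hypothetical deposits

fr = {}
n_dep = len(deposit_cells)
for cls in np.unique(lithology):
    area_frac = np.mean(lithology == cls)
    dep_frac = sum(lithology[r, c] == cls for r, c in deposit_cells) / n_dep
    fr[cls] = dep_frac / area_frac

potential = np.vectorize(fr.get)(lithology)                # per-cell score for this layer
print("frequency ratios by class:", fr)
```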

  6. A probabilistic seismic hazard model based on cellular automata and information theory

    NASA Astrophysics Data System (ADS)

    Jiménez, A.; Posadas, A. M.; Marfil, J. M.

    2005-03-01

    We aim to obtain a spatio-temporal model of earthquake occurrence based on Information Theory and Cellular Automata (CA). CA supply useful models for many investigations in the natural sciences; here they are used to establish temporal relations between seismic events occurring in neighbouring parts of the crust. The catalogue is divided into time intervals and the region into cells, which are declared active or inactive by means of an energy-release criterion (four criteria were tested), yielding a pattern of active and inactive cells that evolves over time. A stochastic CA is constructed from these patterns to simulate their spatio-temporal evolution. The interaction between cells is represented by the neighbourhood (2-D and 3-D models were tried). The best model is chosen by maximizing the mutual information between past and future states. Finally, a Probabilistic Seismic Hazard Map is drawn up for the different energy releases. The method was applied to the Iberian Peninsula catalogue from 1970 to 2001. In 2-D the best neighbourhood was the Moore neighbourhood of radius 1; the 3-D von Neumann neighbourhood also gives hazard maps and takes into account the depth of the events. The Gutenberg-Richter law and a Hurst analysis were obtained for the data as a test of the catalogue. Our results are consistent with previous studies of both seismic hazard and stress conditions in the zone, and with the seismicity that occurred after 2001.

  7. Size Evolution and Stochastic Models: Explaining Ostracod Size through Probabilistic Distributions

    NASA Astrophysics Data System (ADS)

    Krawczyk, M.; Decker, S.; Heim, N. A.; Payne, J.

    2014-12-01

    The biovolume of animals has functioned as an important benchmark for measuring evolution throughout geologic time. In our project, we examined the observed average body size of ostracods over time in order to understand the mechanism of size evolution in these marine organisms. The body size of ostracods has varied since the beginning of the Ordovician, when the first true ostracods appeared. We created a stochastic branching model to create possible evolutionary trees of ostracod size. Using stratigraphic ranges for ostracods compiled from over 750 genera in the Treatise on Invertebrate Paleontology, we calculated overall speciation and extinction rates for our model. At each timestep in our model, new lineages can evolve or existing lineages can become extinct. Newly evolved lineages are assigned sizes based on their parent genera. We parameterized our model to generate neutral and directional changes in ostracod size to compare with the observed data. New sizes were chosen via a normal distribution: the neutral model selected size differentials centered on zero, allowing an equal chance of larger or smaller ostracods at each speciation, whereas the directional model centered the distribution on a negative value, giving a larger chance of smaller ostracods. Our data strongly suggest that ostracod evolution has followed a model that directionally pushes mean ostracod size down rather than a neutral model. Our model was able to match the magnitude of the size decrease, although it produced a constant linear decrease while the actual data showed a much more rapid initial decrease followed by a roughly constant size. The nuance of the observed trends ultimately suggests a more complex mode of size evolution. In conclusion, probabilistic methods can provide valuable insight into possible evolutionary mechanisms determining size evolution in ostracods.
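
    A minimal sketch of such a stochastic branching simulation is given below; the speciation and extinction probabilities and the step sizes are illustrative, not the Treatise-derived rates.

```python
# Stochastic branching sketch: lineages speciate or go extinct each step; a daughter's
# log-size equals the parent's plus a normal step (mean 0 = neutral, negative = directional).
import numpy as np

rng = np.random.default_rng(8)

def simulate(mean_step, n_steps=200, p_spec=0.06, p_ext=0.05, sigma=0.05):
    sizes = [0.0]                                  # log10 biovolume of founding lineage
    mean_size = np.empty(n_steps)
    for t in range(n_steps):
        new = []
        for s in sizes:
            if rng.random() < p_ext:
                continue                           # lineage goes extinct
            new.append(s)
            if rng.random() < p_spec:              # speciation: add a daughter lineage
                new.append(s + rng.normal(mean_step, sigma))
        sizes = new if new else [rng.normal(0, sigma)]   # re-seed if the clade dies out
        mean_size[t] = np.mean(sizes)
    return mean_size

neutral = simulate(mean_step=0.0)
directional = simulate(mean_step=-0.01)
print("final mean log-size  neutral: %.3f  directional: %.3f"
      % (neutral[-1], directional[-1]))
```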

  8. A probabilistic respiratory tract dosimetry model with application to beta-particle and photon emitters

    NASA Astrophysics Data System (ADS)

    Farfan, Eduardo Balderrama

    2002-01-01

    Predicting equivalent dose in the human respiratory tract is significant in the assessment of health risks associated with the inhalation of radioactive aerosols. A complete respiratory tract methodology based on the International Commission on Radiological Protection Publication 66 model was used in this research project for beta-particle and photon emitters. The conventional methodology has been to use standard values (from Reference Man) for parameters to obtain a single dose value. However, the methods used in the current study allow lung dose values to be determined as probability distributions to reflect the spread or variability in doses. To implement the methodology, a computer code, LUDUC, has been modified to include inhalation scenarios of beta-particle and photon emitters. For beta particles, a new methodology was implemented into Monte Carlo simulations to determine absorbed fractions in target tissues within the thoracic region of the respiratory tract. For photons, a new mathematical phantom of extrathoracic and thoracic regions was created based on previous studies to determine specific absorbed fractions in several tissues and organs of the human body due to inhalation of radioactive materials. The application of the methodology and developed data will be helpful in dose reconstruction and prediction efforts concerning the inhalation of short-lived radionuclides or radionuclides of Inhalation Class S. The resulting dose distributions follow a lognormal distribution shape for all scenarios examined. Applying the probabilistic computer code LUDUC to inhalation of strontium and yttrium aerosols has shown several trends, which could also be valid for many S radionuclide compounds that are beta-particle emitters. The equivalent doses are, in general, found to follow lognormal distributions. Therefore, these distributions can be described by geometric means and geometric standard deviations. Furthermore, a mathematical phantom of the extrathoracic and

  9. Evaluating the impacts of agricultural land management practices on water resources: A probabilistic hydrologic modeling approach.

    PubMed

    Prada, A F; Chu, M L; Guzman, J A; Moriasi, D N

    2017-02-24

    Evaluating the effectiveness of agricultural land management practices in minimizing environmental impacts using models is challenged by the presence of inherent uncertainties during the model development stage. One issue faced during the model development stage is the uncertainty involved in model parameterization. Using a single optimized set of parameters (one snapshot) to represent baseline conditions of the system limits the applicability and robustness of the model to properly represent future or alternative scenarios. The objective of this study was to develop a framework that facilitates model parameter selection while evaluating uncertainty to assess the impacts of land management practices at the watershed scale. The model framework was applied to the Lake Creek watershed located in southwestern Oklahoma, USA. A two-step probabilistic approach was implemented to parameterize the Agricultural Policy/Environmental eXtender (APEX) model using global uncertainty and sensitivity analysis to estimate the full spectrum of total monthly water yield (WYLD) and total monthly Nitrogen loads (N) in the watershed under different land management practices. Twenty-seven models were found to represent the baseline scenario in which uncertainty of up to 29% and 400% in WYLD and N, respectively, is plausible. Changing the land cover to pasture manifested the highest decrease in N to up to 30% for a full pasture coverage while changing to full winter wheat cover can increase the N up to 11%. The methodology developed in this study was able to quantify the full spectrum of system responses, the uncertainty associated with them, and the most important parameters that drive their variability. Results from this study can be used to develop strategic decisions on the risks and tradeoffs associated with different management alternatives that aim to increase productivity while also minimizing their environmental impacts.

  10. A first passage based model for probabilistic fracture of polycrystalline silicon MEMS structures

    NASA Astrophysics Data System (ADS)

    Xu, Zhifeng; Le, Jia-Liang

    2017-02-01

    Experiments have shown that the failure loads of Microelectromechanical Systems (MEMS) devices usually exhibit a considerable level of variability, which is believed to be caused by the random material strength and the geometry-induced random stress field. Understanding the strength statistics of MEMS devices is of paramount importance for the device design guarding against a tolerable failure risk. In this study, we develop a continuum-based probabilistic model for polycrystalline silicon (poly-Si) MEMS structures within the framework of first passage analysis. The failure of poly-Si MEMS structures is considered to be triggered by fracture initiation from the sidewalls governed by a nonlocal failure criterion. The model takes into account an autocorrelated random field of material tensile strength. The nonlocal random stress field is obtained by stochastic finite element simulations based on the information of the uncertainties of the sidewall geometry. The model is formulated within the contexts of both stationary and non-stationary stochastic processes for MEMS structures of various geometries and under different loading configurations. It is shown that the model agrees well with the experimentally measured strength distributions of uniaxial tensile poly-Si MEMS specimens of different gauge lengths. The model is further used to predict the strength distribution of poly-Si MEMS beams under three-point bending, and the result is compared with the Monte Carlo simulation. The present model predicts strong size effects on both the strength distribution and the mean structural strength. It is shown that the mean size effect curve consists of three power-law asymptotes in the small, intermediate, and large-size regimes. By matching these three asymptotes, an approximate size effect equation is proposed. The present model is shown to be a generalization of the classical weakest-link statistical model, and it provides a physical interpretation of the material length

  11. Probabilistic model for estimating snow cover duration from ground temperature measurements in the Austrian Alpine region

    NASA Astrophysics Data System (ADS)

    Teubner, Irene; Haimberger, Leopold; Hantel, Michael; Dorigo, Wouter

    2016-04-01

    Snow cover duration represents a key climate parameter. Trends in the seasonal snow cover duration can be linked to changes of the mean annual air temperature and precipitation pattern and, therefore, can serve as a sentinel for climate change. Snow cover duration is commonly inferred from snow depth or snow water equivalent measurements provided by ground observations or satellites. Recently, methods have been developed to estimate the presence or absence of a snow cover from daily ground temperature variations. This method commonly includes the definition of station-specific thresholds. In our study, we propose to use a probabilistic model for determining a single threshold for the whole dataset. The model takes the daily range and/or the daily mean of ground temperature at 10 cm depth as input and is further calibrated with in situ snow depth observations. Applying the model to 87 measuring sites in the Austrian Alps, we showed that the snow cover estimation was improved when combining the daily range and the mean of ground temperature. Our results suggest that ground temperature records are a valuable source for the validation of satellite-derived snow cover, complementary to traditional ground-based snow measurements.
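
    The sketch below illustrates one possible probabilistic formulation on synthetic data: a logistic model of snow presence given the daily range and daily mean of ground temperature, with a single probability threshold. The data, link function and fitting method are assumptions for illustration, not the authors' calibrated model.

```python
# Logistic snow/no-snow classifier from ground-temperature statistics: snow damps the
# daily range and keeps the daily mean near 0 deg C, so both are useful predictors.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(9)
n = 400
snow = rng.random(n) < 0.5
daily_range = np.where(snow, rng.gamma(1.5, 0.3, n), rng.gamma(4.0, 1.5, n))   # deg C
daily_mean = np.where(snow, rng.normal(0.0, 0.7, n), rng.normal(6.0, 5.0, n))  # deg C
X = np.column_stack([np.ones(n), daily_range, daily_mean])

def negloglik(w):
    p = np.clip(1 / (1 + np.exp(-X @ w)), 1e-9, 1 - 1e-9)
    return -np.sum(snow * np.log(p) + (~snow) * np.log(1 - p))

w = minimize(negloglik, np.zeros(3), method="BFGS").x
p_snow = 1 / (1 + np.exp(-X @ w))
print("accuracy with a 0.5 probability threshold:", np.mean((p_snow > 0.5) == snow))
```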

  12. Modeling and Quantification of Team Performance in Human Reliability Analysis for Probabilistic Risk Assessment

    SciTech Connect

    Jeffrey C. Joe; Ronald L. Boring

    2014-06-01

    Probabilistic Risk Assessment (PRA) and Human Reliability Assessment (HRA) are important technical contributors to the United States (U.S.) Nuclear Regulatory Commission’s (NRC) risk-informed and performance based approach to regulating U.S. commercial nuclear activities. Furthermore, all currently operating commercial NPPs in the U.S. are required by federal regulation to be staffed with crews of operators. Yet, aspects of team performance are underspecified in most HRA methods that are widely used in the nuclear industry. There are a variety of "emergent" team cognition and teamwork errors (e.g., communication errors) that are 1) distinct from individual human errors, and 2) important to understand from a PRA perspective. The lack of robust models or quantification of team performance is an issue that affects the accuracy and validity of HRA methods and models, leading to significant uncertainty in estimating HEPs. This paper describes research that has the objective to model and quantify team dynamics and teamwork within NPP control room crews for risk informed applications, thereby improving the technical basis of HRA, which improves the risk-informed approach the NRC uses to regulate the U.S. commercial nuclear industry.

  13. Probabilistic ecological risk assessment of effluent toxicity of a wastewater reclamation plant based on process modeling.

    PubMed

    Zeng, Siyu; Huang, Yunqing; Sun, Fu; Li, Dan; He, Miao

    2016-09-01

    The growing use of reclaimed wastewater for environmental purposes such as stream flow augmentation requires comprehensive ecological risk assessment and management. This study applied a system analysis approach, regarding a wastewater reclamation plant (WRP) and its recipient water body as a whole system, and assessed the ecological risk of the recipient water body caused by the WRP effluent. Instead of specific contaminants, two toxicity indicators, i.e. genotoxicity and estrogenicity, were selected to directly measure the biological effects of all bio-available contaminants in the reclaimed wastewater, as well as characterize the ecological risk of the recipient water. A series of physically based models were developed to simulate the toxicity indicators in a WRP through a typical reclamation process, including ultrafiltration, ozonation, and chlorination. After being validated against the field monitoring data from a full-scale WRP in Beijing, the models were applied to simulate the probability distribution of effluent toxicity of the WRP through Latin Hypercube Sampling to account for the variability of influent toxicity and operation conditions. The simulated effluent toxicity was then used to derive the predicted environmental concentration (PEC) in the recipient stream, considering the variations of the toxicity and flow of the upstream inflow as well. The ratio of the PEC of each toxicity indicator to its corresponding predicted no-effect concentration was finally used for the probabilistic ecological risk assessment. Regional sensitivity analysis was also performed with the developed models to identify the critical control variables and strategies for ecological risk management.

  14. Optimizing Negotiation Conflict in the Cloud Service Negotiation Framework Using Probabilistic Decision Making Model

    PubMed Central

    Rajavel, Rajkumar; Thangarathinam, Mala

    2015-01-01

    Optimization of negotiation conflict in the cloud service negotiation framework is identified as one of the major challenges. Negotiation conflict occurs during the bilateral negotiation process because participants misperceive, behave aggressively toward, and hold uncertain preferences and goals about their opponents. Existing research focuses on the pre-request context of negotiation conflict optimization by grouping similar negotiation pairs using distance, binary, context-dependent, and fuzzy similarity approaches. To some extent, these approaches can maximize the success rate and minimize the communication overhead among participants. To further optimize the success rate and communication overhead, the proposed work introduces a novel probabilistic decision-making model for optimizing negotiation conflict in the long-term negotiation context. The decision model formulates the problem of managing the different types of negotiation conflict that occur during the negotiation process as a multistage Markov decision problem. At each stage of the negotiation process, the model generates a heuristic decision based on past negotiation state information without causing any break-off among the participants. In addition, this heuristic decision, using the stochastic decision tree scenario, can maximize the revenue among the participants in the cloud service negotiation framework. PMID:26543899

  15. Development of a probabilistic timing model for the ingestion of tap water.

    SciTech Connect

    Davis, M. J.; Janke, R.; Environmental Science Division; EPA

    2009-01-01

    A contamination event in a water distribution system can result in adverse health impacts to individuals consuming contaminated water from the system. Assessing impacts to such consumers requires accounting for the timing of exposures of individuals to tap-water contaminants that have time-varying concentrations. Here we present a probabilistic model for the timing of ingestion of tap water that we developed for use in the U.S. Environmental Protection Agency's Threat Ensemble Vulnerability Assessment and Sensor Placement Tool, which is designed to perform consequence assessments for contamination events in water distribution systems. We also present a statistical analysis of the timing of ingestion activity using data collected by the American Time Use Survey. The results of the analysis provide the basis for our model, which accounts for individual variability in ingestion timing and provides a series of potential ingestion times for tap water. It can be combined with a model for ingestion volume to perform exposure assessments and applied in cases for which the use of characteristics typical of the United States is appropriate.

  16. LADTAP-PROB: A PROBABILISTIC MODEL TO ASSESS RADIOLOGICAL CONSEQUENCES FROM LIQUID RADIOACTIVE RELEASES

    SciTech Connect

    Farfan, E; Trevor Foley, T; Tim Jannik, T

    2009-01-26

    The potential radiological consequences to humans resulting from aqueous releases at the Savannah River Site (SRS) have usually been assessed using the computer code LADTAP or deterministic variations of this code. The advancement of LADTAP over the years included LADTAP II (a computer program that still resides on the mainframe at SRS) [1], LADTAP XL© (Microsoft Excel® Spreadsheet) [2], and other versions specific to SRS areas such as [3]. The spreadsheet variations of LADTAP contain two worksheets: LADTAP and IRRIDOSE. The LADTAP worksheet estimates dose for environmental pathways including ingestion of water and fish and external exposure resulting from recreational activities. IRRIDOSE estimates potential dose to individuals from irrigation of food crops with contaminated water. A new version of this deterministic methodology, LADTAP-PROB, was developed at Savannah River National Laboratory (SRNL) to (1) consider the complete range of the model parameter values (not just maximum or mean values), (2) determine the influence of parameter uncertainties within the LADTAP methodology and perform a sensitivity analysis of all model parameters (to identify the input parameters to which model results are most sensitive), and (3) probabilistically assess radiological consequences from contaminated water. This study presents the methodology applied in LADTAP-PROB.

  17. Segmentation of risk structures for otologic surgery using the Probabilistic Active Shape Model (PASM)

    NASA Astrophysics Data System (ADS)

    Becker, Meike; Kirschner, Matthias; Sakas, Georgios

    2014-03-01

    Our research project investigates a multi-port approach for minimally-invasive otologic surgery. For planning such a surgery, an accurate segmentation of the risk structures is crucial. However, the segmentation of these risk structures is a challenging task: The anatomical structures are very small and some have a complex shape, low contrast and vary both in shape and appearance. Therefore, prior knowledge is needed which is why we apply model-based approaches. In the present work, we use the Probabilistic Active Shape Model (PASM), which is a more flexible and specific variant of the Active Shape Model (ASM), to segment the following risk structures: cochlea, semicircular canals, facial nerve, chorda tympani, ossicles, internal auditory canal, external auditory canal and internal carotid artery. For the evaluation we trained and tested the algorithm on 42 computed tomography data sets using leave-one-out tests. Visual assessment of the results shows in general a good agreement of manual and algorithmic segmentations. Further, we achieve a good Average Symmetric Surface Distance while the maximum error is comparatively large due to low contrast at start and end points. Last, we compare the PASM to the standard ASM and show that the PASM leads to a higher accuracy.

  18. Optimizing Negotiation Conflict in the Cloud Service Negotiation Framework Using Probabilistic Decision Making Model.

    PubMed

    Rajavel, Rajkumar; Thangarathinam, Mala

    2015-01-01

    Optimization of negotiation conflict in the cloud service negotiation framework is identified as one of the major challenging issues. This negotiation conflict occurs during the bilateral negotiation process between the participants due to misperception, aggressive behavior, and uncertain preferences and goals regarding their opponents. Existing research work focuses on the prerequest context of negotiation conflict optimization by grouping similar negotiation pairs using distance, binary, context-dependent, and fuzzy similarity approaches. To some extent, these approaches can maximize the success rate and minimize the communication overhead among the participants. To further optimize the success rate and communication overhead, the proposed research work introduces a novel probabilistic decision making model for optimizing the negotiation conflict in the long-term negotiation context. This decision model formulates the problem of managing the different types of negotiation conflict that occur during the negotiation process as a multistage Markov decision problem. At each stage of the negotiation process, the proposed decision model generates a heuristic decision based on the past negotiation state information without causing any break-off among the participants. In addition, this heuristic decision, using the stochastic decision tree scenario, can maximize the revenue among the participants available in the cloud service negotiation framework.

  19. The probabilistic niche model reveals substantial variation in the niche structure of empirical food webs.

    PubMed

    Williams, Richard J; Purves, Drew W

    2011-09-01

    The structure of food webs, complex networks of interspecies feeding interactions, plays a crucial role in ecosystem resilience and function, and understanding food web structure remains a central problem in ecology. Previous studies have shown that key features of empirical food webs can be reproduced by low-dimensional "niche" models. Here we examine the form and variability of food web niche structure by fitting a probabilistic niche model to 37 empirical food webs, a much larger number of food webs than used in previous studies. The model relaxes previous assumptions about parameter distributions and hierarchy and returns parameter estimates for each species in each web. The model significantly outperforms previous niche model variants and also performs well for several webs where a body-size-based niche model performs poorly, implying that traits other than body size are important in structuring these webs' niche space. Parameter estimates frequently violate previous models' assumptions: in 19 of 37 webs, parameter values are not significantly hierarchical, 32 of 37 webs have nonuniform niche value distributions, and 15 of 37 webs lack a correlation between niche width and niche position. Extending the model to a two-dimensional niche space yields networks with a mixture of one- and two-dimensional niches and provides a significantly better fit for webs with a large number of species and links. These results confirm that food webs are strongly niche-structured but reveal substantial variation in the form of the niche structuring, a result with fundamental implications for ecosystem resilience and function.
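
    A generic Gaussian feeding-kernel rule gives a feel for how a probabilistic niche model assigns link probabilities; the kernel form, parameter values and fitting procedure below are assumptions and are not the paper's exact parameterization or likelihood.

```python
import numpy as np

# Generic sketch in the spirit of probabilistic niche models: each species has
# a niche position, a feeding centre and a feeding range, and a feeding link
# occurs with a probability that decays with the distance between the prey's
# niche value and the predator's feeding centre.  All values are assumptions.
rng = np.random.default_rng(2)
S = 30                                           # species richness

niche  = np.sort(rng.uniform(0, 1, S))           # niche positions n_j
centre = niche * rng.uniform(0.5, 1.0, S)        # feeding centres c_i (below own niche)
width  = rng.uniform(0.05, 0.2, S)               # feeding ranges r_i
p_max  = 0.95                                    # maximum link probability (assumed)

# P(i eats j) with a Gaussian kernel around each predator's feeding centre
link_prob = p_max * np.exp(-((niche[None, :] - centre[:, None]) / width[:, None]) ** 2)
web = rng.random((S, S)) < link_prob             # one stochastic realization of the web

print(f"expected links: {link_prob.sum():.1f}, realized links: {web.sum()}")
print(f"connectance of realization: {web.sum() / S**2:.3f}")
```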

  20. Monitoring and modeling as a continuing learning process: the use of hydrological models in a general probabilistic framework.

    NASA Astrophysics Data System (ADS)

    Baroni, G.; Gräff, T.; Reinstorf, F.; Oswald, S. E.

    2012-04-01

    Nowadays uncertainty and sensitivity analysis are considered basic tools for the assessment of hydrological models and the evaluation of the most important sources of uncertainty. In this context, several methods have been developed and applied under different hydrological conditions in recent decades. However, in most cases the studies have mainly investigated the influence of parameter uncertainty on the simulated outputs, and few approaches have also considered other sources of uncertainty, i.e., input and model structure. Moreover, several constraints arise when spatially distributed parameters are involved. To overcome these limitations, a general probabilistic framework based on Monte Carlo simulations and the Sobol method has been proposed. In this study, the general probabilistic framework was applied at field scale using a 1D physically based hydrological model (SWAP). Furthermore, the framework was extended to the catchment scale in combination with a spatially distributed hydrological model (SHETRAN). The models are applied at two different experimental sites in Germany: a relatively flat cropped field close to Potsdam (Brandenburg) and a small mountainous catchment with agricultural land use (Schaefertal, Harz Mountains). For both cases, input and parameters are considered the major sources of uncertainty. Evaluation of the models was based on soil moisture measured at the plot scale at different depths and, for the catchment site, also on daily discharge values. The study shows how the framework can take into account all the various sources of uncertainty, i.e., input data, parameters (either in scalar or spatially distributed form) and model structures. The framework can be used in a loop in order to optimize further monitoring activities aimed at improving the performance of the model. In the particular applications, the results show how the sources of uncertainty are specific to each process considered. The influence of the input data as
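
    The Monte Carlo / Sobol machinery referred to above can be sketched with a Saltelli-style estimator of first-order sensitivity indices for a toy response; the model function and parameter ranges are placeholders, not SWAP or SHETRAN.

```python
import numpy as np

# Rough sketch of the Monte Carlo / Sobol workflow: estimate first-order
# sensitivity indices of a toy "hydrological" response with the Saltelli
# pick-and-freeze estimator.  The model function and ranges are invented.
rng = np.random.default_rng(0)

def model(x):
    # toy response, e.g. simulated soil moisture as a function of
    # (saturated conductivity, porosity, rainfall multiplier)
    ks, theta_s, rain = x[:, 0], x[:, 1], x[:, 2]
    return theta_s * rain / (1.0 + ks)

d, n = 3, 20_000
A = rng.uniform(size=(n, d))          # two independent sample matrices
B = rng.uniform(size=(n, d))
yA, yB = model(A), model(B)
var_y = np.var(np.concatenate([yA, yB]))

for i in range(d):
    ABi = A.copy()
    ABi[:, i] = B[:, i]               # replace column i of A with B's column i
    yABi = model(ABi)
    # Saltelli-style estimator of the first-order Sobol index S_i
    S_i = np.mean(yB * (yABi - yA)) / var_y
    print(f"S_{i} = {S_i:.3f}")
```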

  1. Probabilistic exposure assessment model to estimate aseptic-UHT product failure rate.

    PubMed

    Pujol, Laure; Albert, Isabelle; Magras, Catherine; Johnson, Nicholas Brian; Membré, Jeanne-Marie

    2015-01-02

    Aseptic-Ultra-High-Temperature (UHT) products are manufactured to be free of microorganisms capable of growing in the food at normal non-refrigerated conditions at which the food is likely to be held during manufacture, distribution and storage. Two important phases within the process are widely recognised as critical in controlling microbial contamination: the sterilisation steps and the following aseptic steps. Of the microbial hazards, the pathogen spore formers Clostridium botulinum and Bacillus cereus are deemed the most pertinent to be controlled. In addition, due to a relatively high thermal resistance, Geobacillus stearothermophilus spores are considered a concern for spoilage of low acid aseptic-UHT products. A probabilistic exposure assessment model has been developed in order to assess the aseptic-UHT product failure rate associated with these three bacteria. It was a Modular Process Risk Model, based on nine modules. They described: i) the microbial contamination introduced by the raw materials, either from the product (i.e. milk, cocoa and dextrose powders and water) or the packaging (i.e. bottle and sealing component), ii) the sterilisation processes, of either the product or the packaging material, iii) the possible recontamination during subsequent processing of both product and packaging. The Sterility Failure Rate (SFR) was defined as the sum of bottles contaminated for each batch, divided by the total number of bottles produced per process line run (10^6 batches simulated per process line). The SFR associated with the three bacteria was estimated at the last step of the process (i.e. after Module 9) but also after each module, allowing for the identification of modules, and responsible contamination pathways, with higher or lower intermediate SFR. The model contained 42 controlled settings associated with factory environment, process line or product formulation, and more than 55 probabilistic inputs corresponding to inputs with variability

  2. Probabilistic terrain models from waveform airborne LiDAR: AutoProbaDTM project results

    NASA Astrophysics Data System (ADS)

    Jalobeanu, A.; Goncalves, G. R.

    2012-12-01

    The main objective of the AutoProbaDTM project was to develop new methods for automated probabilistic topographic map production using the latest LiDAR scanners. It included algorithmic development, implementation and validation over a 200 km2 test area in continental Portugal, representing roughly 100 GB of raw data and half a billion waveforms. We aimed to generate digital terrain models automatically, including ground topography as well as uncertainty maps, using Bayesian inference for model estimation and error propagation, and approaches based on image processing. Here we present the results of the completed project (methodological developments and processing results from the test dataset). In June 2011, the test data were acquired in central Portugal, over an area of geomorphological and ecological interest, using a Riegl LMS-Q680i sensor. We managed to survey 70% of the test area at a satisfactory sampling rate, the angular spacing matching the laser beam divergence and the ground spacing nearly equal to the footprint (almost 4 pts/m2 for a 50cm footprint at 1500 m AGL). This is crucial for correct processing, as aliasing artifacts are significantly reduced. Reverse engineering was required because the data were delivered in a proprietary binary format; this enabled us to read the waveforms and the essential parameters. A robust waveform processing method has been implemented and tested, and georeferencing and geometric computations have been coded. Fast gridding and interpolation techniques have been developed. Validation is nearly completed, as well as geometric calibration, IMU error correction, full error propagation and large-scale DEM reconstruction. A probabilistic processing software package has been implemented and code optimization is in progress. This package includes new boresight calibration procedures, robust peak extraction modules, DEM gridding and interpolation methods, and means to visualize the produced uncertain surfaces (topography

  3. ADOPT: A Historically Validated Light Duty Vehicle Consumer Choice Model

    SciTech Connect

    Brooker, A.; Gonder, J.; Lopp, S.; Ward, J.

    2015-05-04

    The Automotive Deployment Option Projection Tool (ADOPT) is a light-duty vehicle consumer choice and stock model supported by the U.S. Department of Energy’s Vehicle Technologies Office. It estimates technology improvement impacts on U.S. light-duty vehicle sales, petroleum use, and greenhouse gas emissions. ADOPT uses techniques from the multinomial logit method and the mixed logit method to estimate sales. Specifically, it estimates sales based on the weighted value of key attributes including vehicle price, fuel cost, acceleration, range and usable volume. The average importance of several attributes changes nonlinearly across each attribute's range and changes with income. For several attributes, a distribution of importance around the average value is used to represent consumer heterogeneity. The majority of existing vehicle makes, models, and trims are included to fully represent the market. The Corporate Average Fuel Economy regulations are enforced. The sales estimates feed into the ADOPT stock model, which captures the key aspects of summing petroleum use and greenhouse gas emissions, including the change in vehicle miles traveled by vehicle age, the creation of new model options based on the success of existing vehicles, new vehicle option introduction rate limits, and survival rates by vehicle age. ADOPT has been extensively validated with historical sales data. It matches historical data in key dimensions including sales by fuel economy, acceleration, price, vehicle size class, and powertrain across multiple years. A graphical user interface provides easy and efficient use and manages the inputs, simulation, and results.
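
    The core of the sales estimation, a multinomial logit over weighted vehicle attributes, can be illustrated as follows; the attributes, weights and vehicles are made up and are not ADOPT's calibrated values.

```python
import numpy as np

# Minimal multinomial-logit sketch in the spirit of the sales estimation
# described above.  Attribute weights and vehicle data are illustrative only.
vehicles = ["compact_gas", "midsize_hybrid", "ev"]
# columns: price ($1000), fuel cost (cents/mile), 0-60 time (s), range (100 mi)
X = np.array([[22.0, 9.0, 9.5, 4.0],
              [28.0, 6.0, 8.5, 5.0],
              [35.0, 3.5, 7.0, 2.5]])
beta = np.array([-0.10, -0.20, -0.30, 0.40])   # assumed marginal utilities

utility = X @ beta
p = np.exp(utility - utility.max())            # subtract max for numerical stability
p /= p.sum()                                   # logit choice probabilities (market shares)

for name, share in zip(vehicles, p):
    print(f"{name:15s} predicted share = {share:.2%}")
```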

  4. Preference pulses and the win-stay, fix-and-sample model of choice.

    PubMed

    Hachiga, Yosuke; Sakagami, Takayuki; Silberberg, Alan

    2015-11-01

    Two groups of six rats each were trained to respond to two levers for a food reinforcer. One group was trained on concurrent variable-ratio 20 extinction schedules of reinforcement. The second group was trained on a concurrent variable-interval 27-s extinction schedule. In both groups, lever-schedule assignments changed randomly following reinforcement; a light cued the lever providing the next reinforcer. In the next condition, the light cue was removed and reinforcer assignment strictly alternated between levers. The next two conditions redetermined, in order, the first two conditions. Preference pulses, defined as a tendency for the relative response rate to the just-reinforced alternative to decline with time since reinforcement, only appeared during the extinction schedule. Although the pulse's functional form was well described by a reinforcer-induction equation, there was a large residual between actual data and a pulse-as-artifact simulation (McLean, Grace, Pitts, & Hughes, 2014) used to discern reinforcer-dependent contributions to pulsing. However, if that simulation was modified to include a win-stay tendency (a propensity to stay on the just-reinforced alternative), the residual was greatly reduced. Additional modifications of the parameter values of the pulse-as-artifact simulation enabled it to accommodate the present results as well as those it originally accommodated. In its revised form, this simulation was used to create a model that describes response runs to the preferred alternative as terminating probabilistically, and runs to the unpreferred alternative as punctate with occasional perseverative response runs. After reinforcement, choices are modeled as returning briefly to the lever location that had been just reinforced. This win-stay propensity is hypothesized as due to reinforcer induction.

  5. Probabilistic nowcast of PBL profiles with a single column model and ensemble filter assimilation of surface observations

    NASA Astrophysics Data System (ADS)

    Rostkier-Edelstein, D.; Hacker, J. P.

    2009-09-01

    A long-term goal of this work is to find an efficient system for probabilistic planetary boundary layer (PBL) nowcasting that can be deployed wherever surface observations are present. One approach showing promise is the use of a single column model (SCM) and ensemble filter (EF) data assimilation techniques. Earlier work showed that surface observations can be an important source of information with an SCM and an EF. Here we extend that work to quantify the deterministic and probabilistic skill of ensemble SCM predictions with added complexity. Although it is appealing to add physics and dynamics to the SCM, it is not immediately clear that the additional complexity will improve the performance of a PBL nowcasting system based on a simple model. We address this question with regard to treatment of surface assimilation, radiation in the column, and also advection to account for realistic 3D dynamics (a timely WRF prediction). We adopt factor separation analysis to quantify the individual contribution of each model component to the deterministic and probabilistic skill of the system, as well as any beneficial or detrimental interactions between them. Deterministic skill of the system is evaluated through the mean absolute error, and probabilistic skill through the Brier Skill Score (BSS) and the area under the relative operating characteristic (ROC) curve (AUR). The BSS is further decomposed into both a reliability and resolution term to understand the trade-offs in different components of probabilistic skill. An alternative system based on climatological covariances and surface observations is used as a reference to assess the real utility of the flow-dependent covariances estimated with the ensemble system. In essence it is a dressing technique, whereby a deterministic 3D mesoscale forecast (e.g. WRF) is corrected with surface forecast errors and covariances computed from a distribution of available historical mesoscale forecasts. The adjusted profile

  6. Occupational Choice: A Conditional Logit Model with Special Reference to Wage Subsidies and Occupational Choice. Final Report.

    ERIC Educational Resources Information Center

    Boskin, Michael J.

    A model of occupational choice based on the theory of human capital is developed and estimated by conditional logit analysis. The empirical results estimated the probability of individuals with certain characteristics (such as race, sex, age, and education) entering each of 11 occupational groups. The results indicate that individuals tend to…

  7. Probabilistic reliability modeling for oil exploration & production (E&P) facilities in the tallgrass prairie preserve.

    PubMed

    Zambrano, Lyda; Sublette, Kerry; Duncan, Kathleen; Thoma, Greg

    2007-10-01

    The aging domestic oil production infrastructure represents a high risk to the environment because of the type of fluids being handled (oil and brine) and the potential for accidental release of these fluids into sensitive ecosystems. Currently, there is no quantitative risk model directly applicable to onshore oil exploration and production (E&P) facilities. We report on a probabilistic reliability model created for onshore E&P facilities. Reliability theory, failure modes and effects analysis (FMEA), and event trees were used to develop the model estimates of the failure probability of typical oil production equipment. Monte Carlo simulation was used to translate uncertainty in input parameter values to uncertainty in the model output. The predicted failure rates were calibrated to available failure rate information by adjusting probability density function parameters used as random variates in the Monte Carlo simulations. The mean and standard deviation of the normal variate distributions from which the Weibull distribution characteristic life was chosen were used as adjustable parameters in the model calibration. The model was applied to oil production leases in the Tallgrass Prairie Preserve, Oklahoma. We present the estimated failure probability due to the combination of the most significant failure modes associated with each type of equipment (pumps, tanks, and pipes). The results show that the estimated probability of failure for tanks is about the same as that for pipes, but that pumps have a much lower failure probability. The model can provide the equipment reliability information needed for proactive risk management at the lease level, supplying quantitative information on which to base the allocation of maintenance resources to high-risk equipment and thereby minimizing both lost production and ecosystem damage.
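
    A minimal sketch of the Monte Carlo reliability calculation is given below, assuming a Weibull time to failure whose characteristic life is itself drawn from a normal distribution, as described above; all numerical values are illustrative, not the calibrated model.

```python
import numpy as np

# Hedged sketch: sample an uncertain Weibull characteristic life, draw
# equipment lifetimes, and estimate the failure probability over a horizon.
rng = np.random.default_rng(42)
n = 100_000

shape = 2.0                                            # Weibull shape parameter (assumed)
char_life = rng.normal(loc=15.0, scale=3.0, size=n)    # characteristic life, years (assumed)
char_life = np.clip(char_life, 1.0, None)              # keep the random variate physical

life = char_life * rng.weibull(shape, size=n)          # sampled time to failure, years
horizon = 5.0                                          # evaluation horizon, years

p_fail = np.mean(life < horizon)
print(f"P(failure within {horizon:.0f} years) = {p_fail:.3f}")
```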

  8. Marginal Probabilistic Modeling of the Delays in the Sensory Data Transmission of Networked Telerobots

    PubMed Central

    Gago-Benítez, Ana; Fernández-Madrigal, Juan-Antonio; Cruz-Martín, Ana

    2014-01-01

    Networked telerobots are remotely controlled through general purpose networks and components, which are highly heterogeneous and exhibit stochastic response times; however, their correct teleoperation requires a timely flow of information from sensors to remote stations. In order to guarantee these time requirements, a good on-line probabilistic estimation of the sensory transmission delays is needed. In many modern applications this estimation must be computationally highly efficient, e.g., when the system includes a web-based client interface. This paper studies marginal probability distributions that, under mild assumptions, can be a good approximation of the real distribution of the delays without using knowledge of their dynamics, are efficient to compute, and require only minor modifications to the networked robot. Since sequences of delays exhibit strong non-linearities in these networked applications, to satisfy the iid hypothesis required by the marginal approach we apply a change detection method. The results reported here indicate that some parametric models explain many more real scenarios well when this change detection method is used, while some non-parametric distributions have a very good rate of successful modeling in the case that non-linearity detection is not possible and the total delay is split into its three basic terms: server, network and client times. PMID:24481232

  9. Plant Phenotyping using Probabilistic Topic Models: Uncovering the Hyperspectral Language of Plants.

    PubMed

    Wahabzada, Mirwaes; Mahlein, Anne-Katrin; Bauckhage, Christian; Steiner, Ulrike; Oerke, Erich-Christian; Kersting, Kristian

    2016-03-09

    Modern phenotyping and plant disease detection methods, based on optical sensors and information technology, provide promising approaches to plant research and precision farming. In particular, hyperspectral imaging has been found to reveal physiological and structural characteristics in plants and to allow for tracking physiological dynamics due to environmental effects. In this work, we present an approach to plant phenotyping that integrates non-invasive sensors, computer vision, as well as data mining techniques and allows for monitoring how plants respond to stress. To uncover latent hyperspectral characteristics of diseased plants reliably and in an easy-to-understand way, we "wordify" the hyperspectral images, i.e., we turn the images into a corpus of text documents. Then, we apply probabilistic topic models, a well-established natural language processing technique that identifies content and topics of documents. Based on recent regularized topic models, we demonstrate that one can automatically track the development of three foliar diseases of barley. We also present a visualization of the topics that provides plant scientists with an intuitive tool for hyperspectral imaging. In short, our analysis and visualization of characteristic topics found during symptom development and disease progress reveal the hyperspectral language of plant diseases.

  10. Plant Phenotyping using Probabilistic Topic Models: Uncovering the Hyperspectral Language of Plants

    PubMed Central

    Wahabzada, Mirwaes; Mahlein, Anne-Katrin; Bauckhage, Christian; Steiner, Ulrike; Oerke, Erich-Christian; Kersting, Kristian

    2016-01-01

    Modern phenotyping and plant disease detection methods, based on optical sensors and information technology, provide promising approaches to plant research and precision farming. In particular, hyperspectral imaging has been found to reveal physiological and structural characteristics in plants and to allow for tracking physiological dynamics due to environmental effects. In this work, we present an approach to plant phenotyping that integrates non-invasive sensors, computer vision, as well as data mining techniques and allows for monitoring how plants respond to stress. To uncover latent hyperspectral characteristics of diseased plants reliably and in an easy-to-understand way, we “wordify” the hyperspectral images, i.e., we turn the images into a corpus of text documents. Then, we apply probabilistic topic models, a well-established natural language processing technique that identifies content and topics of documents. Based on recent regularized topic models, we demonstrate that one can automatically track the development of three foliar diseases of barley. We also present a visualization of the topics that provides plant scientists with an intuitive tool for hyperspectral imaging. In short, our analysis and visualization of characteristic topics found during symptom development and disease progress reveal the hyperspectral language of plant diseases. PMID:26957018

  11. Combining exposure and effect modeling into an integrated probabilistic environmental risk assessment for nanoparticles.

    PubMed

    Jacobs, Rianne; Meesters, Johannes A J; Ter Braak, Cajo J F; van de Meent, Dik; van der Voet, Hilko

    2016-12-01

    There is a growing need for good environmental risk assessment of engineered nanoparticles (ENPs). Environmental risk assessment of ENPs has been hampered by lack of data and knowledge about ENPs, their environmental fate, and their toxicity. This leads to uncertainty in the risk assessment. To deal with uncertainty in the risk assessment effectively, probabilistic methods are advantageous. In the present study, the authors developed a method to model both the variability and the uncertainty in environmental risk assessment of ENPs. This method is based on the concentration ratio, that is, the ratio of the exposure concentration to the critical effect concentration, with both concentrations considered to be random. In this method, variability and uncertainty are modeled separately so as to allow the user to see which part of the total variation in the concentration ratio is attributable to uncertainty and which part is attributable to variability. The authors illustrate the use of the method with a simplified aquatic risk assessment of nano-titanium dioxide. The authors' method allows a more transparent risk assessment and can also direct further environmental and toxicological research to the areas in which it is most needed. Environ Toxicol Chem 2016;35:2958-2967. © 2016 The Authors. Environmental Toxicology and Chemistry published by Wiley Periodicals, Inc. on behalf of SETAC.
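
    The separation of uncertainty from variability can be sketched as a nested (two-dimensional) Monte Carlo; the distributions and parameter values below are assumptions for illustration and do not reproduce the nano-titanium dioxide case study.

```python
import numpy as np

# Sketch of separating uncertainty from variability with a nested Monte Carlo.
# Outer loop: epistemic uncertainty about distribution parameters.
# Inner loop: variability of exposure and critical effect concentrations.
rng = np.random.default_rng(7)
n_outer, n_inner = 200, 2_000

fractions_at_risk = []
for _ in range(n_outer):
    # uncertain parameters of the exposure and effect distributions (assumed)
    mu_exp = rng.normal(-2.0, 0.3)        # log mean exposure concentration
    mu_eff = rng.normal(0.5, 0.3)         # log mean critical effect concentration
    # variability across water bodies / species (assumed lognormal)
    exposure = rng.lognormal(mu_exp, 0.8, size=n_inner)
    effect   = rng.lognormal(mu_eff, 0.6, size=n_inner)
    cr = exposure / effect                # concentration ratio
    fractions_at_risk.append(np.mean(cr > 1.0))

fractions_at_risk = np.array(fractions_at_risk)
print(f"fraction at risk: median {np.median(fractions_at_risk):.4f}, "
      f"95% uncertainty interval "
      f"[{np.percentile(fractions_at_risk, 2.5):.4f}, "
      f"{np.percentile(fractions_at_risk, 97.5):.4f}]")
```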

  12. Probabilistic modelling of exposure doses and implications for health risk characterization: glycoalkaloids from potatoes.

    PubMed

    Ruprich, J; Rehurkova, I; Boon, P E; Svensson, K; Moussavian, S; Van der Voet, H; Bosgra, S; Van Klaveren, J D; Busk, L

    2009-12-01

    Potatoes are a source of glycoalkaloids (GAs) represented primarily by alpha-solanine and alpha-chaconine (about 95%). The content of GAs in tubers is usually 10-100 mg/kg, and maximum levels do not exceed 200 mg/kg. GAs can be hazardous for human health. Poisonings involve gastrointestinal ailments and neurological symptoms. A single intake of >1-3 mg/kg b.w. is considered a critical effect dose (CED). Probabilistic modelling of acute and chronic (usual) exposure to GAs was performed in the Czech Republic, Sweden and The Netherlands. National databases on individual consumption of foods, data on concentration of GAs in tubers (439 Czech and Swedish results) and processing factors were used for modelling. The results indicated that potatoes currently available on the European market may lead to acute intakes >1 mg GAs/kg b.w./day for the upper tail of the intake distribution (0.01% of the population) in all three countries. A level of 50 mg GAs/kg in raw unpeeled tubers ensures that at least 99.99% of the population does not exceed the CED. Estimated chronic (usual) intake in participating countries was 0.25, 0.29 and 0.56 mg/kg b.w./day (97.5% upper confidence limit). It remains unclear if the incidence of GAs poisoning is underreported or if the assumptions are the worst case for extremely sensitive persons.

  13. a Simple Probabilistic, Biologically Informed Model of the Population Dynamics of Desert Shrubs

    NASA Astrophysics Data System (ADS)

    Worman, S.; Furbish, D. J.; Clarke, J. H.; Roberts, A. S.

    2010-12-01

    In arid environments, spatiotemporal variations in the processes of erosion and deposition are strongly coupled with the structure and dynamics of plant communities as well as the specific life behavior of individual plants. Understanding how physical transport processes affect the evolution of the land surface on geomorphic time-scales therefore requires considering how long-term changes in plant dynamics may in turn impact such processes. The development of this desert shrub population dynamics model is thus motivated by the need to link rain-splash-induced mound building at the shrub scale with the unfolding ‘biological play’ occurring on a hillslope. Using the Master Equation to conserve shrub age, probabilistic and biologically informed statements for recruitment and mortality are formulated to function as source and sink terms, respectively. This simple accounting framework, by tracking the number of individuals entering and leaving a population, captures the changes in shrub count that can be expected in time as the key variables driving the dynamics of these plant communities (i.e. precipitation) also change in time. The result is a tool through which it is possible to statistically describe the aggregate spatiotemporal behavior of different shrub populations, with their own characteristic life-cycles and physical dimensions, under different external forcing scenarios. This model features inputs that have a solid biophysical basis and, insofar as it has the capacity to mimic key features of real processes, leads to outputs that appear consistent with findings reported in the literature.

  14. Marginal probabilistic modeling of the delays in the sensory data transmission of networked telerobots.

    PubMed

    Gago-Benítez, Ana; Fernández-Madrigal, Juan-Antonio; Cruz-Martín, Ana

    2014-01-29

    Networked telerobots are remotely controlled through general purpose networks and components, which are highly heterogeneous and exhibit stochastic response times; however, their correct teleoperation requires a timely flow of information from sensors to remote stations. In order to guarantee these time requirements, a good on-line probabilistic estimation of the sensory transmission delays is needed. In many modern applications this estimation must be computationally highly efficient, e.g., when the system includes a web-based client interface. This paper studies marginal probability distributions that, under mild assumptions, can be a good approximation of the real distribution of the delays without using knowledge of their dynamics, are efficient to compute, and require only minor modifications to the networked robot. Since sequences of delays exhibit strong non-linearities in these networked applications, to satisfy the iid hypothesis required by the marginal approach we apply a change detection method. The results reported here indicate that some parametric models explain many more real scenarios well when this change detection method is used, while some non-parametric distributions have a very good rate of successful modeling in the case that non-linearity detection is not possible and the total delay is split into its three basic terms: server, network and client times.

  15. Near-Field Probabilistic Seismic Hazard Analysis of Metropolitan Tehran Using Region-Specific Directivity Models

    NASA Astrophysics Data System (ADS)

    Yazdani, Azad; Nicknam, Ahmad; Dadras, Ehsan Yousefi; Eftekhari, Seyed Nasrollah

    2017-01-01

    Ground motions are affected by directivity effects in near-fault regions, which result in low-frequency cycle pulses at the beginning of the velocity time history. The directivity features of near-fault ground motions can lead to a significant increase in the risk of earthquake-induced damage to engineering structures. Ordinary probabilistic seismic hazard analysis (PSHA) does not take such effects into account; recent studies have thus proposed new frameworks to incorporate directivity effects in PSHA. The objective of this study is to develop seismic hazard maps of Tehran City according to the near-fault PSHA procedure for different return periods. To this end, the directivity models required in the modified PSHA were developed based on a database of simulated ground motions. The simulated database was used in this study because there are no recorded near-fault data in the region from which to derive purely empirically based pulse prediction models. The results show that the directivity effects can significantly affect the estimate of regional seismic hazard.

  16. Epistemic uncertainty in the ranking and categorization of probabilistic safety assessment model elements: issues and findings.

    PubMed

    Borgonovo, Emanuele

    2008-08-01

    In this work, we study the effect of epistemic uncertainty in the ranking and categorization of elements of probabilistic safety assessment (PSA) models. We show that, while in a deterministic setting a PSA element belongs to a given category univocally, in the presence of epistemic uncertainty, a PSA element belongs to a given category only with a certain probability. We propose an approach to estimate these probabilities, showing that their knowledge allows one to appreciate "the sensitivity of component categorizations to uncertainties in the parameter values" (U.S. NRC Regulatory Guide 1.174). We investigate the meaning and utilization of an assignment method based on the expected value of importance measures. We discuss the problem of evaluating changes in quality assurance, maintenance activity prioritization, etc. in the presence of epistemic uncertainty. We show that the inclusion of epistemic uncertainty in the evaluation makes it necessary to evaluate changes through their effect on PSA model parameters. We propose a categorization of parameters based on the Fussell-Vesely and differential importance (DIM) measures. In addition, issues arise in the calculation of the expected value of the joint importance measure when evaluating changes affecting groups of components. We illustrate that the problem can be solved using DIM. A numerical application to a case study concludes the work.
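
    The idea that a PSA element is risk-significant only with some probability can be illustrated with a toy fault tree and Monte Carlo sampling of the basic-event probabilities, using a Fussell-Vesely-type measure; the tree structure, distributions and significance cutoff below are invented for illustration.

```python
import numpy as np

# Toy illustration: under epistemic uncertainty on basic-event probabilities,
# a component's Fussell-Vesely importance (here approximated as the fraction
# of top-event probability removed when the component is made perfectly
# reliable) exceeds a significance threshold only with some probability.
rng = np.random.default_rng(11)
n = 20_000

qA = rng.lognormal(np.log(1e-2), 0.5, n)      # assumed uncertainty distributions
qB = rng.lognormal(np.log(5e-3), 0.5, n)
qC = rng.lognormal(np.log(2e-2), 0.5, n)

def top(qa, qb, qc):
    # top event = (A and B) or C, with independent basic events
    return qa * qb + qc - qa * qb * qc

F = top(qA, qB, qC)
fv = {
    "A": (F - top(0.0, qB, qC)) / F,
    "B": (F - top(qA, 0.0, qC)) / F,
    "C": (F - top(qA, qB, 0.0)) / F,
}
threshold = 0.5                                # assumed "risk-significant" cutoff
for name, values in fv.items():
    print(f"P(component {name} is risk-significant) = {np.mean(values > threshold):.2f}")
```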

  17. Sociotechnical probabilistic risk modeling to predict injurious falls in community living centers.

    PubMed

    Powell-Cope, Gail; Campbell, Robert; Hahm, Bridget; Bulat, Tatjana; Westphal, John

    2016-01-01

    The goal of this study was to apply sociotechnical probabilistic risk assessment to prioritize risks and prevention strategies for serious injurious falls of residents in nursing homes. Risk modeling teams consisted of 26 clinical and nonclinical staff from three Department of Veterans Affairs community living centers and one state Veteran's nursing home. Participants met in groups several times to identify and assign probabilities to provider and resident at-risk behaviors and equipment failures. They identified prevention strategies for the failures that accounted for the highest levels of risk. Six scenarios were modeled: (1) transferring from bed to wheelchair, (2) propelling from bedside to bathroom, (3) transferring from wheelchair to toilet, (4) transferring from toilet to wheelchair, (5) propelling from bathroom to bedside, and (6) transferring from wheelchair to bed. The greatest paths of risk were for residents with impaired mobility and high fragility. A 26% reduction in injurious falls could be achieved by (1) reducing the number of unassisted transfers through a modest improvement in response time to alarms, (2) installing automatic brake locks on 90% of wheelchairs, (3) making the wheelchair maintenance process highly reliable, and (4) decreasing improper transfer techniques by 10%.

  18. Automated reconstruction of ancient languages using probabilistic models of sound change

    PubMed Central

    Bouchard-Côté, Alexandre; Hall, David; Griffiths, Thomas L.; Klein, Dan

    2013-01-01

    One of the oldest problems in linguistics is reconstructing the words that appeared in the protolanguages from which modern languages evolved. Identifying the forms of these ancient languages makes it possible to evaluate proposals about the nature of language change and to draw inferences about human history. Protolanguages are typically reconstructed using a painstaking manual process known as the comparative method. We present a family of probabilistic models of sound change as well as algorithms for performing inference in these models. The resulting system automatically and accurately reconstructs protolanguages from modern languages. We apply this system to 637 Austronesian languages, providing an accurate, large-scale automatic reconstruction of a set of protolanguages. Over 85% of the system’s reconstructions are within one character of the manual reconstruction provided by a linguist specializing in Austronesian languages. Being able to automatically reconstruct large numbers of languages provides a useful way to quantitatively explore hypotheses about the factors determining which sounds in a language are likely to change over time. We demonstrate this by showing that the reconstructed Austronesian protolanguages provide compelling support for a hypothesis about the relationship between the function of a sound and its probability of changing that was first proposed in 1955. PMID:23401532

  19. Probabilistic forecasts of debris-flow hazard at the regional scale with a combination of models.

    NASA Astrophysics Data System (ADS)

    Malet, Jean-Philippe; Remaître, Alexandre

    2015-04-01

    Debris flows are one of the many active slope-forming processes in the French Alps, where rugged and steep slopes mantled by various slope deposits offer a great potential for triggering hazardous events. A quantitative assessment of debris-flow hazard requires the estimation, in a probabilistic framework, of the spatial probability of occurrence of source areas, the spatial probability of runout areas, the temporal frequency of events, and their intensity. The main objective of this research is to propose a pipeline for the estimation of these quantities at the regional scale using a chain of debris-flow models. The work uses the experimental site of the Barcelonnette Basin (South French Alps), where 26 active torrents have produced more than 150 debris-flow events since 1850, to develop and validate the methodology. First, a susceptibility assessment is performed to identify the debris-flow-prone source areas. The most frequently used approach is the combination of environmental factors with GIS procedures and statistical techniques, integrating (or not) detailed event inventories. Based on a 5m-DEM and derivatives, and information on slope lithology, engineering soils and landcover, the possible source areas are identified with a statistical logistic regression model. The performance of the statistical model is evaluated with the observed distribution of debris-flow events recorded after 1850 in the study area. The source areas in the three most active torrents (Riou-Bourdoux, Faucon, Sanières) are well identified by the model. Results are less convincing for three other active torrents (Bourget, La Valette and Riou-Chanal); this could be related to the type of debris-flow triggering mechanism, as the model seems to better spot the open-slope debris-flow source areas (e.g. scree slopes) but appears to be less efficient for the identification of landslide-induced debris flows. Second, a susceptibility assessment is performed to estimate the possible runout distance

  20. Methodology for Developing a Probabilistic Risk Assessment Model of Spacecraft Rendezvous and Dockings

    NASA Technical Reports Server (NTRS)

    Farnham, Steven J., II; Garza, Joel, Jr.; Castillo, Theresa M.; Lutomski, Michael

    2011-01-01

    In 2007 NASA was preparing to send two new visiting vehicles carrying logistics and propellant to the International Space Station (ISS). These new vehicles were the European Space Agency's (ESA) Automated Transfer Vehicle (ATV), the Jules Verne, and the Japan Aerospace Exploration Agency's (JAXA) H-II Transfer Vehicle (HTV). The ISS Program wanted to quantify the increased risk to the ISS from these visiting vehicles. At the time, only the Shuttle, the Soyuz, and the Progress vehicles rendezvoused and docked to the ISS. The increased risk to the ISS came from an increase in vehicle traffic, which increased the potential for a catastrophic collision during the rendezvous and the docking or berthing of spacecraft to the ISS. A universal method of evaluating the risk of rendezvous and docking or berthing was created by the ISS's Risk Team to accommodate the increasing number of rendezvous and docking or berthing operations due to the increasing number of different spacecraft, as well as the future arrival of commercial spacecraft. Before the first docking attempt of ESA's ATV and JAXA's HTV to the ISS, a probabilistic risk model was developed to quantitatively calculate the risk of collision of each spacecraft with the ISS. The five rendezvous and docking risk models (Soyuz, Progress, Shuttle, ATV, and HTV) have been used to build and refine the modeling methodology for rendezvous and docking of spacecraft. This risk modeling methodology will be NASA's basis for evaluating the hazards of future ISS visiting spacecraft, including SpaceX's Dragon, Orbital Sciences' Cygnus, and NASA's own Orion spacecraft. This paper will describe the methodology used for developing a visiting vehicle risk model.

  1. Statistical and Probabilistic Extensions to Ground Operations' Discrete Event Simulation Modeling

    NASA Technical Reports Server (NTRS)

    Trocine, Linda; Cummings, Nicholas H.; Bazzana, Ashley M.; Rychlik, Nathan; LeCroy, Kenneth L.; Cates, Grant R.

    2010-01-01

    NASA's human exploration initiatives will invest in technologies, public/private partnerships, and infrastructure, paving the way for the expansion of human civilization into the solar system and beyond. As it has been for the past half century, the Kennedy Space Center will be the embarkation point for humankind's journey into the cosmos. Functioning as a next generation space launch complex, Kennedy's launch pads, integration facilities, processing areas, and launch and recovery ranges will bustle with the activities of the world's space transportation providers. In developing this complex, KSC teams work through the potential operational scenarios: conducting trade studies, planning and budgeting for expensive and limited resources, and simulating alternative operational schemes. Numerous tools, among them discrete event simulation (DES), were matured during the Constellation Program to conduct such analyses with the purpose of optimizing the launch complex for maximum efficiency, safety, and flexibility while minimizing life cycle costs. Discrete event simulation is a computer-based modeling technique for complex and dynamic systems where the state of the system changes at discrete points in time and whose inputs may include random variables. DES is used to assess timelines and throughput, and to support operability studies and contingency analyses. It is applicable to any space launch campaign and informs decision-makers of the effects of varying numbers of expensive resources and the impact of off-nominal scenarios on measures of performance. In order to develop representative DES models, methods were adopted, exploited, or created to extend traditional uses of DES. The Delphi method was adopted and utilized for task duration estimation. DES software was exploited for probabilistic event variation. A roll-up process was developed and used to reuse models and model elements in other, less-detailed models. The DES team continues to innovate and expand

  2. Improving predictive power of physically based rainfall-induced shallow landslide models: a probabilistic approach

    NASA Astrophysics Data System (ADS)

    Raia, S.; Alvioli, M.; Rossi, M.; Baum, R. L.; Godt, J. W.; Guzzetti, F.

    2013-02-01

    Distributed models to forecast the spatial and temporal occurrence of rainfall-induced shallow landslides are deterministic. These models extend spatially the static stability models adopted in geotechnical engineering and adopt an infinite-slope geometry to balance the resisting and the driving forces acting on the sliding mass. An infiltration model is used to determine how rainfall changes pore-water conditions, modulating the local stability/instability conditions. A problem with the existing models is the difficulty in obtaining accurate values for the several variables that describe the material properties of the slopes. The problem is particularly severe when the models are applied over large areas, for which sufficient information on the geotechnical and hydrological conditions of the slopes is not generally available. To help solve the problem, we propose a probabilistic Monte Carlo approach to the distributed modeling of shallow rainfall-induced landslides. For this purpose, we have modified the Transient Rainfall Infiltration and Grid-Based Regional Slope-Stability Analysis (TRIGRS) code. The new code (TRIGRS-P) adopts a stochastic approach to compute, on a cell-by-cell basis, transient pore-pressure changes and related changes in the factor of safety due to rainfall infiltration. Infiltration is modeled using analytical solutions of partial differential equations describing one-dimensional vertical flow in isotropic, homogeneous materials. Both saturated and unsaturated soil conditions can be considered. TRIGRS-P copes with the natural variability inherent in the mechanical and hydrological properties of the slope materials by allowing values of the TRIGRS model input parameters to be sampled randomly from a given probability distribution. The range of variation and the mean value of the parameters can be determined by the usual methods used for preparing the TRIGRS input parameters. The outputs of several model runs obtained by varying the input parameters
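
    At the level of a single grid cell, the stochastic step can be sketched by sampling strength parameters and recomputing an infinite-slope factor of safety for each draw; the distributions, pore-pressure value and soil properties below are simplified assumptions rather than TRIGRS-P itself.

```python
import numpy as np

# Cell-level sketch: sample geotechnical parameters from assumed probability
# distributions and evaluate an infinite-slope factor of safety for each draw.
rng = np.random.default_rng(3)
n = 50_000

slope = np.radians(32.0)                         # slope angle
depth = 1.5                                      # failure depth (m)
gamma_s, gamma_w = 19.0, 9.81                    # unit weights of soil and water (kN/m3)
psi = 0.8                                        # pressure head after infiltration (m), assumed

cohesion = rng.lognormal(mean=np.log(4.0), sigma=0.4, size=n)      # effective cohesion (kPa)
phi = np.radians(rng.normal(loc=30.0, scale=3.0, size=n))          # friction angle

# infinite-slope factor of safety with a pore-pressure term
fs = (np.tan(phi) / np.tan(slope)
      + (cohesion - psi * gamma_w * np.tan(phi))
      / (gamma_s * depth * np.sin(slope) * np.cos(slope)))

print(f"P(FS < 1) for this cell = {np.mean(fs < 1.0):.3f}")
```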

  3. The Answering Process for Multiple-Choice Questions in Collaborative Learning: A Mathematical Learning Model Analysis

    ERIC Educational Resources Information Center

    Nakamura, Yasuyuki; Nishi, Shinnosuke; Muramatsu, Yuta; Yasutake, Koichi; Yamakawa, Osamu; Tagawa, Takahiro

    2014-01-01

    In this paper, we introduce a mathematical model for collaborative learning and the answering process for multiple-choice questions. The collaborative learning model is inspired by the Ising spin model and the model for answering multiple-choice questions is based on their difficulty level. An intensive simulation study predicts the possibility of…

  4. Bayesian methods for model choice and propagation of model uncertainty in groundwater transport modeling

    NASA Astrophysics Data System (ADS)

    Mendes, B. S.; Draper, D.

    2008-12-01

    The issue of model uncertainty and model choice is central in any groundwater modeling effort [Neuman and Wierenga, 2003]; among the several approaches to the problem, we favour Bayesian statistics because it integrates uncertainties (arising from any source) and experimental data in a natural way. In this work, we experiment with several Bayesian approaches to model choice, focusing primarily on demonstrating the usefulness of the Reversible Jump Markov Chain Monte Carlo (RJMCMC) simulation method [Green, 1995]; this is an extension of the now-common MCMC methods. Standard MCMC techniques approximate posterior distributions for quantities of interest, often by creating a random walk in parameter space; RJMCMC allows the random walk to take place between parameter spaces with different dimensionalities. This fact allows us to explore state spaces that are associated with different deterministic models for experimental data. Our work is exploratory in nature; we restrict our study to comparing two simple transport models applied to a data set gathered to estimate the breakthrough curve for a tracer compound in groundwater. One model has a mean surface based on a simple advection-dispersion differential equation; the second model's mean surface is also governed by a differential equation but in two dimensions. We focus on artificial data sets (in which truth is known) to see if model identification is done correctly, but we also address the issues of over- and under-parameterization, and we compare RJMCMC's performance with other traditional methods for model selection and propagation of model uncertainty, including Bayesian model averaging, BIC and DIC. References: Neuman and Wierenga (2003). A Comprehensive Strategy of Hydrogeologic Modeling and Uncertainty Analysis for Nuclear Facilities and Sites. NUREG/CR-6805, Division of Systems Analysis and Regulatory Effectiveness, Office of Nuclear Regulatory Research, U.S. Nuclear Regulatory Commission

  5. Predicting carcinogenicity of diverse chemicals using probabilistic neural network modeling approaches

    SciTech Connect

    Singh, Kunwar P.; Gupta, Shikha; Rai, Premanjali

    2013-10-15

    Robust global models capable of discriminating positive and non-positive carcinogens and predicting the carcinogenic potency of chemicals in rodents were developed. The dataset of 834 structurally diverse chemicals extracted from the Carcinogenic Potency Database (CPDB) was used, which contained 466 positive and 368 non-positive carcinogens. Twelve non-quantum mechanical molecular descriptors were derived. Structural diversity of the chemicals and nonlinearity in the data were evaluated using the Tanimoto similarity index and Brock–Dechert–Scheinkman statistics. Probabilistic neural network (PNN) and generalized regression neural network (GRNN) models were constructed for classification and function optimization problems using the carcinogenicity end point in rat. Validation of the models was performed using the internal and external procedures employing a wide series of statistical checks. The PNN constructed using five descriptors rendered a classification accuracy of 92.09% in the complete rat data. The PNN model rendered classification accuracies of 91.77%, 80.70% and 92.08% in mouse, hamster and pesticide data, respectively. The GRNN constructed with nine descriptors yielded a correlation coefficient of 0.896 between the measured and predicted carcinogenic potency with a mean squared error (MSE) of 0.44 in the complete rat data. The rat carcinogenicity model (GRNN) applied to the mouse and hamster data yielded correlation coefficient and MSE of 0.758, 0.71 and 0.760, 0.46, respectively. The results suggest wide applicability of the inter-species models in predicting the carcinogenic potency of chemicals. Both the PNN and GRNN (inter-species) models constructed here can be useful tools in predicting the carcinogenicity of new chemicals for regulatory purposes. - Graphical abstract: Figure (a) shows classification accuracies (positive and non-positive carcinogens) in rat, mouse, hamster, and pesticide data yielded by the optimal PNN model. Figure (b) shows generalization and predictive
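
    A bare-bones probabilistic neural network (Parzen-window classifier) conveys the classification idea; the toy descriptors and labels below are placeholders for the CPDB descriptors, and the smoothing parameter is arbitrary.

```python
import numpy as np

# Minimal probabilistic neural network (Parzen-window classifier) sketch.
# Each class score is the average Gaussian kernel between a test point and
# all training patterns of that class; the largest score wins.
def pnn_predict(X_train, y_train, X_test, sigma=0.5):
    classes = np.unique(y_train)
    scores = np.zeros((len(X_test), len(classes)))
    for k, c in enumerate(classes):
        Xc = X_train[y_train == c]
        # squared Euclidean distances between every test point and class-c pattern
        d2 = ((X_test[:, None, :] - Xc[None, :, :]) ** 2).sum(axis=2)
        scores[:, k] = np.exp(-d2 / (2.0 * sigma ** 2)).mean(axis=1)
    return classes[np.argmax(scores, axis=1)]

rng = np.random.default_rng(0)
X_pos = rng.normal(1.0, 1.0, size=(60, 5))     # mock "positive carcinogen" descriptors
X_neg = rng.normal(-1.0, 1.0, size=(60, 5))    # mock "non-positive" descriptors
X = np.vstack([X_pos, X_neg])
y = np.array([1] * 60 + [0] * 60)

pred = pnn_predict(X, y, X, sigma=0.8)
print(f"training-set classification accuracy: {np.mean(pred == y):.2%}")
```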

  6. Agent-based modelling of consumer energy choices

    NASA Astrophysics Data System (ADS)

    Rai, Varun; Henry, Adam Douglas

    2016-06-01

    Strategies to mitigate global climate change should be grounded in a rigorous understanding of energy systems, particularly the factors that drive energy demand. Agent-based modelling (ABM) is a powerful tool for representing the complexities of energy demand, such as social interactions and spatial constraints. Unlike other approaches for modelling energy demand, ABM is not limited to studying perfectly rational agents or to abstracting micro details into system-level equations. Instead, ABM provides the ability to represent behaviours of energy consumers -- such as individual households -- using a range of theories, and to examine how the interaction of heterogeneous agents at the micro-level produces macro outcomes of importance to the global climate, such as the adoption of low-carbon behaviours and technologies over space and time. We provide an overview of ABM work in the area of consumer energy choices, with a focus on identifying specific ways in which ABM can improve understanding of both fundamental scientific and applied aspects of the demand side of energy to aid the design of better policies and programmes. Future research needs for improving the practice of ABM to better understand energy demand are also discussed.
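
    A toy agent-based sketch of the demand-side dynamics discussed above is given below: heterogeneous households adopt a low-carbon technology when its perceived utility, including peer influence from adopting neighbours, exceeds a personal threshold. The behavioural rule, network and parameter values are illustrative assumptions only.

```python
import numpy as np

# Toy ABM: threshold-based technology adoption with peer influence on a
# random network.  All parameters are invented for illustration.
rng = np.random.default_rng(5)
n_agents, n_steps = 500, 30

threshold = rng.uniform(0.3, 0.9, n_agents)          # heterogeneous adoption thresholds
neighbours = [rng.choice(n_agents, size=8, replace=False) for _ in range(n_agents)]
adopted = rng.random(n_agents) < 0.02                # a few initial adopters

econ_signal = 0.35                                   # fixed economic attractiveness (assumed)
peer_weight = 0.6                                    # weight of peer influence (assumed)

for t in range(n_steps):
    peer_share = np.array([adopted[nb].mean() for nb in neighbours])
    utility = econ_signal + peer_weight * peer_share
    adopted = adopted | (utility > threshold)        # adoption is irreversible here

print(f"final adoption share after {n_steps} steps: {adopted.mean():.2%}")
```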

  7. Time Analysis for Probabilistic Workflows

    SciTech Connect

    Czejdo, Bogdan; Ferragut, Erik M

    2012-01-01

    There are many theoretical and practical results in the area of workflow modeling, especially when more formal workflows are used. In this paper we focus on probabilistic workflows and show algorithms for time computations in probabilistic workflows. With activity times modeled more precisely, we can improve work cooperation and the analysis of cooperation, including simulation and visualization.
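
    The kind of time computation discussed above can be approximated by simulating a small workflow with probabilistic branching; the workflow structure, branch probabilities and triangular durations below are invented and are not taken from the paper.

```python
import random

# Illustrative sketch: estimate the completion-time distribution of a small
# probabilistic workflow by simulation.  Structure and numbers are invented.
def simulate_once():
    t = random.triangular(2, 6, 3)            # activity A
    if random.random() < 0.7:                 # 70%: normal path
        t += random.triangular(1, 4, 2)       # activity B
    else:                                     # 30%: rework branch
        t += random.triangular(3, 9, 5)       # activity C (rework)
        t += random.triangular(1, 4, 2)       # then activity B
    return t

samples = [simulate_once() for _ in range(100_000)]
mean = sum(samples) / len(samples)
p95 = sorted(samples)[int(0.95 * len(samples))]
print(f"expected completion time = {mean:.2f}, 95th percentile = {p95:.2f}")
```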

  8. Probabilistic Movement Models Show that Postural Control Precedes and Predicts Volitional Motor Control

    PubMed Central

    Rueckert, Elmar; Čamernik, Jernej; Peters, Jan; Babič, Jan

    2016-01-01

    Human motor skill learning is driven by the necessity to adapt to new situations. While supportive contacts are essential for many tasks, little is known about their impact on motor learning. To study the effect of contacts an innovative full-body experimental paradigm was established. The task of the subjects was to reach for a distant target while postural stability could only be maintained by establishing an additional supportive hand contact. To examine adaptation, non-trivial postural perturbations of the subjects’ support base were systematically introduced. A novel probabilistic trajectory model approach was employed to analyze the correlation between the motions of both arms and the trunk. We found that subjects adapted to the perturbations by establishing target dependent hand contacts. Moreover, we found that the trunk motion adapted significantly faster than the motion of the arms. However, the most striking finding was that observations of the initial phase of the left arm or trunk motion (100–400 ms) were sufficient to faithfully predict the complete movement of the right arm. Overall, our results suggest that the goal-directed arm movements determine the supportive arm motions and that the motion of heavy body parts adapts faster than the light arms. PMID:27328750

  9. Detection of prostate cancer on histopathology using color fractals and Probabilistic Pairwise Markov models.

    PubMed

    Yu, Elaine; Monaco, James P; Tomaszewski, John; Shih, Natalie; Feldman, Michael; Madabhushi, Anant

    2011-01-01

    In this paper we present a system for detecting regions of carcinoma of the prostate (CaP) in H&E stained radical prostatectomy specimens using the color fractal dimension. Color textural information is known to be a valuable characteristic to distinguish CaP from benign tissue. In addition to color information, we know that cancer tends to form contiguous regions. Our system leverages the color staining information of histology as well as spatial dependencies. The color and textural information is first captured using the color fractal dimension. To incorporate spatial dependencies, we combine the probability map constructed via the color fractal dimension with a novel Markov prior called the Probabilistic Pairwise Markov Model (PPMM). To demonstrate the capability of this CaP detection system, we applied the algorithm to 27 radical prostatectomy specimens from 10 patients. A per-pixel evaluation was conducted with ground truth provided by an expert pathologist, first using only the color fractal feature, yielding an area under the receiver operating characteristic (ROC) curve (AUC) of 0.790. In conjunction with a Markov prior, the resultant color fractal dimension + Markov random field (MRF) classifier yielded an AUC of 0.831.

  10. Probabilistic Modeling of Landfill Subsidence Introduced by Buried Structure Collapse - 13229

    SciTech Connect

    Foye, Kevin; Soong, Te-Yang

    2013-07-01

    The long-term reliability of land disposal facility final cover systems - and therefore the overall waste containment - depends on the distortions imposed on these systems by differential settlement/subsidence. The evaluation of differential settlement is challenging because of the heterogeneity of the waste mass and buried structure placement. Deterministic approaches to long-term final cover settlement prediction are not able to capture the spatial variability in the waste mass and sub-grade properties, especially discontinuous inclusions, which control differential settlement. An alternative is to use a probabilistic model to capture the non-uniform collapse of cover soils and buried structures and the subsequent effect of that collapse on the final cover system. Both techniques are applied to the problem of two side-by-side waste trenches with collapsible voids. The results show how this analytical technique can be used to connect a metric of final cover performance (inundation area) to the susceptibility of the sub-grade to collapse and the effective thickness of the cover soils. This approach allows designers to specify cover thickness, reinforcement, and slope to meet the demands imposed by the settlement of the underlying waste trenches. (authors)

  11. A probabilistic model for the identification of confinement regimes and edge localized mode behavior, with implications to scaling laws

    SciTech Connect

    Verdoolaege, Geert; Van Oost, Guido

    2012-10-15

    Pattern recognition is becoming an important tool in fusion data analysis. However, fusion diagnostic measurements are often affected by considerable statistical uncertainties, rendering the extraction of useful patterns a significant challenge. Therefore, we assume a probabilistic model for the data and perform pattern recognition in the space of probability distributions. We show the considerable advantage of our method for identifying confinement regimes and edge localized mode behavior, and we discuss the potential for scaling laws.

  12. A probabilistic model for the identification of confinement regimes and edge localized mode behavior, with implications to scaling laws

    NASA Astrophysics Data System (ADS)

    Verdoolaege, Geert; Van Oost, Guido

    2012-10-01

    Pattern recognition is becoming an important tool in fusion data analysis. However, fusion diagnostic measurements are often affected by considerable statistical uncertainties, rendering the extraction of useful patterns a significant challenge. Therefore, we assume a probabilistic model for the data and perform pattern recognition in the space of probability distributions. We show the considerable advantage of our method for identifying confinement regimes and edge localized mode behavior, and we discuss the potential for scaling laws.

  13. Probabilistic Models and Generative Neural Networks: Towards an Unified Framework for Modeling Normal and Impaired Neurocognitive Functions.

    PubMed

    Testolin, Alberto; Zorzi, Marco

    2016-01-01

    Connectionist models can be characterized within the more general framework of probabilistic graphical models, which make it possible to efficiently describe complex statistical distributions involving a large number of interacting variables. This integration allows building more realistic computational models of cognitive functions, which more faithfully reflect the underlying neural mechanisms while at the same time providing a useful bridge to higher-level descriptions in terms of Bayesian computations. Here we discuss a powerful class of graphical models that can be implemented as stochastic, generative neural networks. These models overcome many limitations associated with classic connectionist models, for example by exploiting unsupervised learning in hierarchical architectures (deep networks) and by taking into account top-down, predictive processing supported by feedback loops. We review some recent cognitive models based on generative networks, and we point out promising research directions to investigate neuropsychological disorders within this approach. Though further efforts are required in order to fill the gap between structured Bayesian models and more realistic, biophysical models of neuronal dynamics, we argue that generative neural networks have the potential to bridge these levels of analysis, thereby improving our understanding of the neural bases of cognition and of pathologies caused by brain damage.

  14. Probabilistic Models and Generative Neural Networks: Towards an Unified Framework for Modeling Normal and Impaired Neurocognitive Functions

    PubMed Central

    Testolin, Alberto; Zorzi, Marco

    2016-01-01

    Connectionist models can be characterized within the more general framework of probabilistic graphical models, which make it possible to efficiently describe complex statistical distributions involving a large number of interacting variables. This integration allows building more realistic computational models of cognitive functions, which more faithfully reflect the underlying neural mechanisms while at the same time providing a useful bridge to higher-level descriptions in terms of Bayesian computations. Here we discuss a powerful class of graphical models that can be implemented as stochastic, generative neural networks. These models overcome many limitations associated with classic connectionist models, for example by exploiting unsupervised learning in hierarchical architectures (deep networks) and by taking into account top-down, predictive processing supported by feedback loops. We review some recent cognitive models based on generative networks, and we point out promising research directions to investigate neuropsychological disorders within this approach. Though further efforts are required in order to fill the gap between structured Bayesian models and more realistic, biophysical models of neuronal dynamics, we argue that generative neural networks have the potential to bridge these levels of analysis, thereby improving our understanding of the neural bases of cognition and of pathologies caused by brain damage. PMID:27468262

  15. Fully probabilistic seismic source inversion - Part 2: Modelling errors and station covariances

    NASA Astrophysics Data System (ADS)

    Stähler, Simon C.; Sigloch, Karin

    2016-11-01

    Seismic source inversion, a central task in seismology, is concerned with the estimation of earthquake source parameters and their uncertainties. Estimating uncertainties is particularly challenging because source inversion is a non-linear problem. In a companion paper, Stähler and Sigloch (2014) developed a method of fully Bayesian inference for source parameters, based on measurements of waveform cross-correlation between broadband, teleseismic body-wave observations and their modelled counterparts. This approach yields not only depth and moment tensor estimates but also source time functions. A prerequisite for Bayesian inference is the proper characterisation of the noise afflicting the measurements, a problem we address here. We show that, for realistic broadband body-wave seismograms, the systematic error due to an incomplete physical model affects waveform misfits more strongly than random, ambient background noise. In this situation, the waveform cross-correlation coefficient CC, or rather its decorrelation D = 1 - CC, performs more robustly as a misfit criterion than ℓp norms, more commonly used as sample-by-sample measures of misfit based on distances between individual time samples. From a set of over 900 user-supervised, deterministic earthquake source solutions treated as a quality-controlled reference, we derive the noise distribution on signal decorrelation D = 1 - CC of the broadband seismogram fits between observed and modelled waveforms. The noise on D is found to approximately follow a log-normal distribution, a fortunate fact that readily accommodates the formulation of an empirical likelihood function for D for our multivariate problem. The first and second moments of this multivariate distribution are shown to depend mostly on the signal-to-noise ratio (SNR) of the CC measurements and on the back-azimuthal distances of seismic stations. By identifying and quantifying this likelihood function, we make D and thus waveform cross
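
    As a rough illustration of the misfit described above, the sketch below computes the decorrelation D = 1 - CC for a pair of waveforms and evaluates a log-normal likelihood for it. The parameters mu and sigma are placeholders standing in for values that, in the study, are derived from the reference catalogue as functions of SNR and station geometry; the waveforms are synthetic toy signals.

      import numpy as np

      def decorrelation(obs, syn):
          """D = 1 - CC, the misfit used above, for two equal-length waveforms."""
          obs = (obs - obs.mean()) / obs.std()
          syn = (syn - syn.mean()) / syn.std()
          cc = np.dot(obs, syn) / len(obs)   # zero-lag normalized cross-correlation
          return 1.0 - cc

      def lognormal_loglike(d, mu, sigma):
          """Log-likelihood of a decorrelation value under a log-normal noise model;
          mu and sigma are placeholders, not the empirically derived values."""
          d = max(d, 1e-12)
          return (-np.log(d * sigma * np.sqrt(2 * np.pi))
                  - (np.log(d) - mu) ** 2 / (2 * sigma ** 2))

      if __name__ == "__main__":
          t = np.linspace(0, 10, 500)
          observed = np.sin(t) + 0.1 * np.random.default_rng(0).normal(size=t.size)
          modelled = np.sin(t - 0.05)        # slightly mismatched synthetic waveform
          d = decorrelation(observed, modelled)
          print(f"D = {d:.3f}, log-likelihood = {lognormal_loglike(d, mu=-2.0, sigma=0.5):.2f}")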

  16. SEX-DETector: A Probabilistic Approach to Study Sex Chromosomes in Non-Model Organisms

    PubMed Central

    Muyle, Aline; Käfer, Jos; Zemp, Niklaus; Mousset, Sylvain; Picard, Franck; Marais, Gabriel AB

    2016-01-01

    We propose a probabilistic framework to infer autosomal and sex-linked genes from RNA-seq data of a cross for any sex chromosome type (XY, ZW, and UV). Sex chromosomes (especially the non-recombining and repeat-dense Y, W, U, and V) are notoriously difficult to sequence. Strategies have been developed to obtain partially assembled sex chromosome sequences. Most of them remain difficult to apply to numerous non-model organisms, either because they require a reference genome, or because they are designed for evolutionarily old systems. Sequencing a cross (parents and progeny) by RNA-seq to study the segregation of alleles and infer sex-linked genes is a cost-efficient strategy, which also provides expression level estimates. However, the lack of a proper statistical framework has limited a broader application of this approach. Tests on empirical Silene data show that our method identifies 20–35% more sex-linked genes than existing pipelines, while making reliable inferences for downstream analyses. Approximately 12 individuals are needed for optimal results based on simulations. For species with an unknown sex-determination system, the method can assess the presence and type (XY vs. ZW) of sex chromosomes through a model comparison strategy. The method is particularly well optimized for sex chromosomes of young or intermediate age, which are expected in thousands of yet unstudied lineages. Any organisms, including non-model ones for which nothing is known a priori, that can be bred in the lab, are suitable for our method. SEX-DETector and its implementation in a Galaxy workflow are made freely available. PMID:27492231

  17. Analyzing the impact of modeling choices and assumptions in compartmental epidemiological models

    SciTech Connect

    Nutaro, James J.; Pullum, Laura L.; Ramanathan, Arvind; Ozmen, Ozgur

    2016-05-01

    Computational models have become increasingly used as part of modeling, predicting, and understanding how infectious diseases spread within large populations. These models can be broadly classified into differential equation-based models (EBM) and agent-based models (ABM). Both types of models are central in aiding public health officials design intervention strategies in case of large epidemic outbreaks. We examine these models in the context of illuminating their hidden assumptions and the impact these may have on the model outcomes. Very few ABM/EBMs are evaluated for their suitability to address a particular public health concern, and drawing relevant conclusions about their suitability requires reliable and relevant information regarding the different modeling strategies and associated assumptions. Hence, there is a need to determine how the different modeling strategies, choices of various parameters, and the resolution of information for EBMs and ABMs affect outcomes, including predictions of disease spread. In this study, we present a quantitative analysis of how the selection of model types (i.e., EBM vs. ABM), the underlying assumptions that are enforced by model types to model the disease propagation process, and the choice of time advance (continuous vs. discrete) affect the overall outcomes of modeling disease spread. Our study reveals that the magnitude and velocity of the simulated epidemic depend critically on the selection of modeling principles, various assumptions of the disease process, and the choice of time advance.
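
    A minimal sketch of the EBM/ABM contrast discussed above: a deterministic SIR model integrated with small continuous time steps, next to a stochastic agent-based counterpart with discrete daily steps and well-mixed contacts. The parameter values and the contact rule are illustrative and are not taken from the study.

      import random

      def sir_ebm(beta=0.3, gamma=0.1, n=1000, i0=10, days=100, dt=0.1):
          """Equation-based SIR model integrated with forward Euler (small time steps)."""
          s, i, r = float(n - i0), float(i0), 0.0
          for _ in range(int(days / dt)):
              new_inf = beta * s * i / n * dt
              new_rec = gamma * i * dt
              s, i, r = s - new_inf, i + new_inf - new_rec, r + new_rec
          return r  # final epidemic size (recovered)

      def sir_abm(beta=0.3, gamma=0.1, n=1000, i0=10, days=100, seed=3):
          """Agent-based counterpart: discrete daily time steps, well-mixed random
          contacts, one contact per infectious agent per day. Deliberately simple."""
          rng = random.Random(seed)
          state = ['I'] * i0 + ['S'] * (n - i0)
          for _ in range(days):
              infectious = [k for k, s in enumerate(state) if s == 'I']
              for k in infectious:
                  contact = rng.randrange(n)
                  if state[contact] == 'S' and rng.random() < beta:
                      state[contact] = 'I'
                  if rng.random() < gamma:
                      state[k] = 'R'
          return state.count('R')

      if __name__ == "__main__":
          print("EBM final size:", round(sir_ebm()))
          print("ABM final size:", sir_abm())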

  18. Analyzing the impact of modeling choices and assumptions in compartmental epidemiological models

    DOE PAGES

    Nutaro, James J.; Pullum, Laura L.; Ramanathan, Arvind; ...

    2016-05-01

    Computational models have become increasingly used as part of modeling, predicting, and understanding how infectious diseases spread within large populations. These models can be broadly classified into differential equation-based models (EBM) and agent-based models (ABM). Both types of models are central in aiding public health officials design intervention strategies in case of large epidemic outbreaks. We examine these models in the context of illuminating their hidden assumptions and the impact these may have on the model outcomes. Very few ABM/EBMs are evaluated for their suitability to address a particular public health concern, and drawing relevant conclusions about their suitability requires reliable and relevant information regarding the different modeling strategies and associated assumptions. Hence, there is a need to determine how the different modeling strategies, choices of various parameters, and the resolution of information for EBMs and ABMs affect outcomes, including predictions of disease spread. In this study, we present a quantitative analysis of how the selection of model types (i.e., EBM vs. ABM), the underlying assumptions that are enforced by model types to model the disease propagation process, and the choice of time advance (continuous vs. discrete) affect the overall outcomes of modeling disease spread. Our study reveals that the magnitude and velocity of the simulated epidemic depend critically on the selection of modeling principles, various assumptions of the disease process, and the choice of time advance.

  19. An empirical model for probabilistic decadal prediction: global attribution and regional hindcasts

    NASA Astrophysics Data System (ADS)

    Suckling, Emma B.; van Oldenborgh, Geert Jan; Eden, Jonathan M.; Hawkins, Ed

    2016-07-01

    Empirical models, designed to predict surface variables over seasons to decades ahead, provide useful benchmarks for comparison against the performance of dynamical forecast systems; they may also be employable as predictive tools for use by climate services in their own right. A new global empirical decadal prediction system is presented, based on a multiple linear regression approach designed to produce probabilistic output for comparison against dynamical models. A global attribution is performed initially to identify the important forcing and predictor components of the model. Ensemble hindcasts of surface air temperature anomaly fields are then generated, based on the forcings and predictors identified as important, under a series of different prediction 'modes', and their performance is evaluated. The modes include a real-time setting, a scenario in which future volcanic forcings are prescribed during the hindcasts, and an approach which exploits knowledge of the forced trend. A two-tier prediction system, which uses knowledge of future sea surface temperatures in the Pacific and Atlantic Oceans, is also tested, but within a perfect knowledge framework. Each mode is designed to identify sources of predictability and uncertainty, as well as investigate different approaches to the design of decadal prediction systems for operational use. It is found that the empirical model shows skill above that of persistence hindcasts for annual means at lead times of up to 10 years ahead in all of the prediction modes investigated. It is suggested that hindcasts which exploit full knowledge of the forced trend due to increasing greenhouse gases throughout the hindcast period can provide more robust estimates of model bias for the calibration of the empirical model in an operational setting. The two-tier system shows potential for improved real-time prediction, given the assumption that skilful predictions of large-scale modes of variability are available. The empirical

  20. Does probabilistic modelling of linkage disequilibrium evolution improve the accuracy of QTL location in animal pedigree?

    PubMed Central

    2010-01-01

    Background: Since 2001, the use of increasingly dense maps has made researchers aware that combining linkage and linkage disequilibrium enhances the feasibility of fine-mapping genes of interest. So, various method types have been derived to include concepts of population genetics in the analyses. One major drawback of many of these methods is their computational cost, which is very significant when many markers are considered. Recent advances in technology, such as SNP genotyping, have made it possible to deal with huge amounts of data. Thus the challenge that remains is to find accurate and efficient methods that are not too time consuming. The study reported here specifically focuses on the half-sib family animal design. Our objective was to determine whether modelling of linkage disequilibrium evolution improved the mapping accuracy of a quantitative trait locus of agricultural interest in these populations. We compared two methods of fine-mapping. The first one was an association analysis. In this method, we did not model linkage disequilibrium evolution. Therefore, the modelling of the evolution of linkage disequilibrium was a deterministic process; it was complete at time 0 and remained complete during the following generations. In the second method, the modelling of the evolution of population allele frequencies was derived from a Wright-Fisher model. We simulated a wide range of scenarios adapted to animal populations and compared these two methods for each scenario. Results: Our results indicated that the improvement produced by probabilistic modelling of linkage disequilibrium evolution was not significant. Both methods led to similar results concerning the location accuracy of quantitative trait loci, which appeared to be mainly improved by using four flanking markers instead of two. Conclusions: Therefore, in animal half-sib designs, modelling linkage disequilibrium evolution using a Wright-Fisher model does not significantly improve the accuracy of the
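
    The Wright-Fisher modelling of allele-frequency evolution mentioned above can be sketched in a few lines: in each generation, the allele frequency is resampled binomially from 2N gene copies. The population size, starting frequency and number of generations below are arbitrary illustration values.

      import random

      def wright_fisher(p0=0.3, pop_size=200, generations=50, seed=11):
          """Wright-Fisher drift: each generation, the allele frequency is
          resampled binomially from the previous one (2N gene copies)."""
          rng = random.Random(seed)
          p = p0
          trajectory = [p]
          for _ in range(generations):
              copies = sum(1 for _ in range(2 * pop_size) if rng.random() < p)
              p = copies / (2 * pop_size)
              trajectory.append(p)
          return trajectory

      if __name__ == "__main__":
          traj = wright_fisher()
          print("start:", traj[0], "end:", round(traj[-1], 3))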

  1. A Scalar Product Model for the Multidimensional Scaling of Choice

    ERIC Educational Resources Information Center

    Bechtel, Gordon G.; And Others

    1971-01-01

    Contains a solution for the multidimensional scaling of pairwise choice when individuals are represented as dimensional weights. The analysis supplies an exact least squares solution and estimates of group unscalability parameters. (DG)

  2. Discovery of Transcriptional Targets Regulated by Nuclear Receptors Using a Probabilistic Graphical Model.

    PubMed

    Lee, Mikyung; Huang, Ruili; Tong, Weida

    2016-03-01

    Nuclear receptors (NRs) are ligand-activated transcriptional regulators that play vital roles in key biological processes such as growth, differentiation, metabolism, reproduction, and morphogenesis. Disruption of NRs can result in adverse health effects such as NR-mediated endocrine disruption. A comprehensive understanding of core transcriptional targets regulated by NRs helps to elucidate their key biological processes in both toxicological and therapeutic aspects. In this study, we applied a probabilistic graphical model to identify the transcriptional targets of NRs and the biological processes they govern. The Tox21 program profiled a collection of approximately 10,000 environmental chemicals and drugs against a panel of human NRs in a quantitative high-throughput screening format for their NR disruption potential. The Japanese Toxicogenomics Project, one of the most comprehensive efforts in the field of toxicogenomics, generated large-scale gene expression profiles on the effect of 131 compounds (in its first phase of study) at various doses and durations, and their combinations. We applied an author-topic model to these two toxicological datasets, which consist of 11 NRs run in agonist and/or antagonist mode (18 assays total) and 203 in vitro human gene expression profiles connected by 52 shared drugs. As a result, a set of clusters (topics), each consisting of a set of NRs and their associated target genes, was determined. Various transcriptional targets of the NRs were identified by assays run in either agonist or antagonist mode. Our results were validated by functional analysis and compared with TRANSFAC data. In summary, our approach resulted in effective identification of associated/affected NRs and their target genes, providing biologically meaningful hypotheses embedded in their relationships.

  3. Bayesian probabilistic model for life prediction and fault mode classification of solid state luminaires

    SciTech Connect

    Lall, Pradeep; Wei, Junchao; Sakalaukus, Peter

    2014-06-22

    A new method has been developed for assessment of the onset of degradation in solid state luminaires and for classification of failure mechanisms, using metrics beyond the lumen degradation currently used for identification of failure. Luminous flux output and correlated color temperature data on Philips LED lamps were gathered under 85°C/85%RH until lamp failure. Failure modes of the test population of the lamps have been studied to understand the failure mechanisms in the 85°C/85%RH accelerated test. Results indicate that the dominant failure mechanism is the discoloration of the LED encapsulant inside the lamps, which is the likely cause for the luminous flux degradation and the color shift. The acquired data have been used in conjunction with Bayesian probabilistic models to identify luminaires with onset of degradation much prior to failure, through identification of decision boundaries between lamps with accrued damage and lamps beyond the failure threshold in the feature space. In addition, luminaires with different failure modes have been classified separately from healthy pristine luminaires. The α-λ plots have been used to evaluate the robustness of the proposed methodology. Results show that the predicted degradation for the lamps tracks the true degradation observed during the 85°C/85%RH accelerated life test fairly closely, within the ±20% confidence bounds. Correlation of model prediction with experimental results indicates that the presented methodology allows the early identification of the onset of failure much prior to development of complete failure distributions and can be used for assessing the damage state of SSLs in fairly large deployments. It is expected that the new prediction technique will allow the development of failure distributions without testing till L70 life for the manifestation of failure.

  4. Quantification and probabilistic modeling of CRT obsolescence for the State of Delaware

    SciTech Connect

    Schumacher, Kelsea A.; Schumacher, Thomas; Agbemabiese, Lawrence

    2014-11-15

    Highlights: • We modeled the obsolescence of cathode ray tube devices in the State of Delaware. • 411,654 CRT units or ∼16,500 metric tons have been recycled in Delaware since 2002. • The peak of the CRT obsolescence in Delaware passed by 2012. • The Delaware average CRT recycling rate between 2002 and 2013 was approximately 27.5%. • CRTs will likely continue to infiltrate the system until 2033. - Abstract: The cessation of production and the replacement of cathode ray tube (CRT) displays with flat screen displays have resulted in the proliferation of CRTs in the electronic waste (e-waste) recycle stream. However, due to the nature of the technology and presence of hazardous components such as lead, CRTs are the most challenging of electronic components to recycle. In the State of Delaware, this challenge and the resulting expense, combined with the large quantities of CRTs in the recycle stream, have led electronic recyclers to charge to accept Delaware’s e-waste. Therefore it is imperative that the Delaware Solid Waste Authority (DSWA) understand future quantities of CRTs entering the waste stream. This study presents the results of an assessment of CRT obsolescence in the State of Delaware. A prediction model was created utilizing publicized sales data, a variety of lifespan data as well as historic Delaware CRT collection rates. Both a deterministic and a probabilistic approach using Monte Carlo Simulation (MCS) were performed to forecast rates of CRT obsolescence to be anticipated in the State of Delaware. Results indicate that the peak of CRT obsolescence in Delaware has already passed, although CRTs are anticipated to enter the waste stream likely until 2033.
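
    A toy version of the Monte Carlo obsolescence forecast described above: hypothetical annual sales cohorts are combined with randomly drawn unit lifespans to estimate how many CRTs become obsolete each year. The sales figures and lifespan parameters are invented for illustration, not the Delaware data.

      import random
      from collections import Counter

      def simulate_obsolescence(sales_by_year, mean_life=10.0, sd_life=3.0,
                                draws=200, seed=5):
          """Monte Carlo obsolescence forecast: in each draw, every sales cohort
          receives a random lifespan (normal, truncated at 1 year) and is counted
          in the year it becomes obsolete; results are averaged over draws."""
          rng = random.Random(seed)
          totals = Counter()
          for _ in range(draws):
              for year, units in sales_by_year.items():
                  life = max(1.0, rng.gauss(mean_life, sd_life))
                  totals[year + round(life)] += units
          return {y: c / draws for y, c in sorted(totals.items())}

      if __name__ == "__main__":
          sales = {2000: 50_000, 2002: 40_000, 2004: 20_000, 2006: 5_000}  # hypothetical units sold
          forecast = simulate_obsolescence(sales)
          for year, units in list(forecast.items())[:5]:
              print(year, round(units))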

  5. Development of Probabilistic Risk Assessment Model for BWR Shutdown Modes 4 and 5 Integrated in SPAR Model

    SciTech Connect

    S. T. Khericha; S. Sancakter; J. Mitman; J. Wood

    2010-06-01

    Nuclear plant operating experience and several studies show that the risk from shutdown operation during modes 4, 5, and 6 can be significant. This paper describes the development of standard template risk evaluation models for shutdown modes 4 and 5 for commercial boiling water reactor (BWR) nuclear power plants. The shutdown probabilistic risk assessment model uses the full-power Nuclear Regulatory Commission’s (NRC’s) Standardized Plant Analysis Risk (SPAR) model as the starting point for development. The shutdown PRA models are integrated with their respective internal-events at-power SPAR models. This is accomplished by combining the modified system fault trees from the SPAR full-power model with shutdown event tree logic. For human reliability analysis (HRA), the SPAR HRA (SPAR-H) method is used, which requires the analysts to complete relatively straightforward worksheets, including the performance shaping factors (PSFs). The results are then used to estimate the human error probability (HEP) of interest. The preliminary results indicate the risk is dominated by the operator’s ability to diagnose the events and provide long-term cooling.

  6. Probabilistic performance-assessment modeling of the mixed waste landfill at Sandia National Laboratories.

    SciTech Connect

    Peace, Gerald L.; Goering, Timothy James; Miller, Mark Laverne; Ho, Clifford Kuofei

    2005-11-01

    A probabilistic performance assessment has been conducted to evaluate the fate and transport of radionuclides (americium-241, cesium-137, cobalt-60, plutonium-238, plutonium-239, radium-226, radon-222, strontium-90, thorium-232, tritium, uranium-238), heavy metals (lead and cadmium), and volatile organic compounds (VOCs) at the Mixed Waste Landfill (MWL). Probabilistic analyses were performed to quantify uncertainties inherent in the system and models for a 1,000-year period, and sensitivity analyses were performed to identify parameters and processes that were most important to the simulated performance metrics. Comparisons between simulated results and measured values at the MWL were made to gain confidence in the models and perform calibrations when data were available. In addition, long-term monitoring requirements and triggers were recommended based on the results of the quantified uncertainty and sensitivity analyses. At least one hundred realizations were simulated for each scenario defined in the performance assessment. Conservative values and assumptions were used to define values and distributions of uncertain input parameters when site data were not available. Results showed that exposure to tritium via the air pathway exceeded the regulatory metric of 10 mrem/year in about 2% of the simulated realizations when the receptor was located at the MWL (continuously exposed to the air directly above the MWL). Simulations showed that peak radon gas fluxes exceeded the design standard of 20 pCi/m²/s in about 3% of the realizations if up to 1% of the containers of sealed radium-226 sources were assumed to completely degrade in the future. If up to 100% of the containers of radium-226 sources were assumed to completely degrade, 30% of the realizations yielded radon surface fluxes that exceeded the design standard. For the groundwater pathway, simulations showed that none of the radionuclides or heavy metals (lead and cadmium) reached the groundwater during

  7. Comparison of the MACCS2 atmospheric transport model with Lagrangian puff models as applied to deterministic and probabilistic safety analysis.

    PubMed

    Till, John E; Rood, Arthur S; Garzon, Caroline D; Lagdon, Richard H

    2014-09-01

    The suitability of a new facility in terms of potential impacts from routine and accidental releases is typically evaluated using conservative models and assumptions to assure dose standards are not exceeded. However, overly conservative dose estimates that exceed target doses can result in unnecessary and costly facility design changes. This paper examines one such case involving the U.S. Department of Energy's pretreatment facility of the Waste Treatment and Immobilization Plant (WTP). The MELCOR Accident Consequence Code System Version 2 (MACCS2) was run using conservative parameter values in prescribed guidance to demonstrate that the dose from a postulated airborne release would not exceed the guideline dose of 0.25 Sv. External review of default model parameters identified the deposition velocity of 1.0 cm s-1 as being non-conservative. The deposition velocity calculated using resistance models was in the range of 0.1 to 0.3 cm s-1. A value of 0.1 cm s-1 would result in the dose guideline being exceeded. To test the overall conservatism of the MACCS2 transport model, the 95th percentile hourly average dispersion factor based on one year of meteorological data was compared to dispersion factors generated from two state-of-the-art Lagrangian puff models. The 95th percentile dispersion factor from MACCS2 was a factor of 3 to 6 higher compared to those of the Lagrangian puff models at a distance of 9.3 km and a deposition velocity of 0.1 cm s-1. Thus, the inherent conservatism in MACCS2 more than compensated for the high deposition velocity used in the assessment. Applications of models like MACCS2 with a conservative set of parameters are essentially screening calculations, and failure to meet dose criteria should not trigger facility design changes but prompt a more in-depth analysis using probabilistic methods with a defined margin of safety in the target dose. A sample application of the probabilistic approach is provided.

  8. Landslide susceptibility mapping along PLUS expressways in Malaysia using probabilistic based model in GIS

    NASA Astrophysics Data System (ADS)

    Yusof, Norbazlan M.; Pradhan, Biswajeet

    2014-06-01

    PLUS Berhad holds the concession for a total of 987 km of toll expressways in Malaysia, the longest of which is the North-South Expressway or NSE. Acting as the 'backbone' of the west coast of the peninsula, the NSE stretches from the Malaysian-Thai border in the north to the border with neighbouring Singapore in the south, linking several major cities and towns along the way. The North-South Expressway contributes to the country's economic development through the trade, social and tourism sectors. Presently, the highway is in good condition and connects every state, but some locations need urgent attention. The stability of slopes at these locations is of most concern, as any instability can endanger motorists. In this paper, two study locations have been analysed: Gua Tempurung (soil slope) and Jelapang (rock slope), which have two distinctly different characteristics. These locations pass through undulating terrain with steep slopes where landslides are common and the probability of slope instability due to human activities in surrounding areas is high. A database of twelve (12) landslide conditioning factors was compiled: slope-related factors such as slope degree and slope aspect were extracted from IFSAR (interferometric synthetic aperture radar) data, while landuse, lithology and structural geology were derived from interpretation of high resolution satellite data from World View II, Quickbird and Ikonos. All this information was analysed in a geographic information system (GIS) environment for landslide susceptibility mapping using a probabilistic frequency ratio model. In addition, information on the slopes such as inventories, condition assessments and maintenance records was assessed through the total expressway maintenance management system, better known as TEMAN. The above mentioned system is used by PLUS as an asset management and decision support tool for maintenance activities along the highways as well as for data
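
    The frequency ratio model used above can be illustrated with a short calculation: for each class of a conditioning factor, the ratio of that class's share of landslides to its share of the map area. The slope-angle classes and counts below are hypothetical, not values from the PLUS study.

      def frequency_ratio(class_pixels, class_landslides):
          """Frequency ratio per class: (landslides in class / all landslides)
          divided by (pixels in class / all pixels). Values > 1 indicate classes
          more prone to landslides than the map average."""
          total_pixels = sum(class_pixels.values())
          total_slides = sum(class_landslides.values())
          return {c: (class_landslides[c] / total_slides) / (class_pixels[c] / total_pixels)
                  for c in class_pixels}

      if __name__ == "__main__":
          # hypothetical slope-angle classes (map pixel counts and landslide pixel counts)
          pixels = {"0-15 deg": 60_000, "15-30 deg": 30_000, ">30 deg": 10_000}
          slides = {"0-15 deg": 40, "15-30 deg": 90, ">30 deg": 70}
          for cls, value in frequency_ratio(pixels, slides).items():
              print(cls, round(value, 2))
          # A susceptibility index for a map cell is then the sum of the FR values
          # of the classes that cell falls into, across all conditioning factors.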

  9. Radar Tracking with an Interacting Multiple Model and Probabilistic Data Association Filter for Civil Aviation Applications

    PubMed Central

    Jan, Shau-Shiun; Kao, Yu-Chun

    2013-01-01

    The current trend of the civil aviation technology is to modernize the legacy air traffic control (ATC) system that is mainly supported by many ground based navigation aids to be the new air traffic management (ATM) system that is enabled by global positioning system (GPS) technology. Due to the low receiving power of GPS signal, it is a major concern to aviation authorities that the operation of the ATM system might experience service interruption when the GPS signal is jammed by either intentional or unintentional radio-frequency interference. To maintain the normal operation of the ATM system during the period of GPS outage, the use of the current radar system is proposed in this paper. However, the tracking performance of the current radar system could not meet the required performance of the ATM system, and an enhanced tracking algorithm, the interacting multiple model and probabilistic data association filter (IMMPDAF), is therefore developed to support the navigation and surveillance services of the ATM system. The conventional radar tracking algorithm, the nearest neighbor Kalman filter (NNKF), is used as the baseline to evaluate the proposed radar tracking algorithm, and the real flight data is used to validate the IMMPDAF algorithm. As shown in the results, the proposed IMMPDAF algorithm could enhance the tracking performance of the current aviation radar system and meets the required performance of the new ATM system. Thus, the current radar system with the IMMPDAF algorithm could be used as an alternative system to continue aviation navigation and surveillance services of the ATM system during GPS outage periods. PMID:23686142

  10. Assessment of climate change impacts on climate variables using probabilistic ensemble modeling and trend analysis

    NASA Astrophysics Data System (ADS)

    Safavi, Hamid R.; Sajjadi, Sayed Mahdi; Raghibi, Vahid

    2016-08-01

    Water resources in snow-dependent regions have undergone significant changes due to climate change. Snow measurements in these regions have revealed alarming declines in snowfall over the past few years. The Zayandeh-Rud River in central Iran chiefly depends on winter precipitation falling as snow to supply water from the wet regions in the high Zagros Mountains to the downstream, (semi-)arid, low-lying lands. In this study, the historical records (baseline: 1971-2000) of climate variables (temperature and precipitation) in the wet region were chosen to construct a probabilistic ensemble model using 15 GCMs in order to forecast future trends and changes, while the Long Ashton Research Station Weather Generator (LARS-WG) was utilized to project climate variables under the A2 and B1 scenarios to a future period (2015-2044). Since future snow water equivalent (SWE) forecasts by GCMs were not available for the study area, an artificial neural network (ANN) was implemented to build a relationship between climate variables and snow water equivalent for the baseline period to estimate future snowfall amounts. As a last step, homogeneity and trend tests were performed to evaluate the robustness of the data series, and changes were examined to detect past and future variations. Results indicate different characteristics of the climate variables at upstream stations. A shift is observed in the type of precipitation from snow to rain as well as in its quantities across the subregions. The key role in these shifts and the subsequent side effects such as water losses is played by temperature.

  11. Initial Correction versus Negative Marking in Multiple Choice Examinations

    ERIC Educational Resources Information Center

    Van Hecke, Tanja

    2015-01-01

    Optimal assessment tools should measure in a limited time the knowledge of students in a correct and unbiased way. A method for automating the scoring is multiple choice scoring. This article compares scoring methods from a probabilistic point of view by modelling the probability to pass: the number right scoring, the initial correction (IC) and…
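
    A small probabilistic calculation in the spirit of the comparison above: the probability of reaching a pass mark under number-right scoring when unknown items are guessed uniformly at random. The item counts and pass mark are illustrative, and this is a generic binomial sketch rather than the article's own formulation.

      from math import comb

      def prob_pass_number_right(n_items=20, n_options=4, pass_mark=10, p_known=0.0):
          """Probability of reaching the pass mark under number-right scoring when a
          fraction p_known of items is known and the rest are guessed uniformly."""
          known = round(p_known * n_items)
          needed = max(0, pass_mark - known)
          guesses = n_items - known
          p = 1.0 / n_options
          return sum(comb(guesses, k) * p**k * (1 - p)**(guesses - k)
                     for k in range(needed, guesses + 1))

      if __name__ == "__main__":
          for p_known in (0.0, 0.3, 0.5):
              print("fraction known:", p_known,
                    "P(pass):", round(prob_pass_number_right(p_known=p_known), 4))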

  12. Fatigue strength reduction model: RANDOM3 and RANDOM4 user manual. Appendix 2: Development of advanced methodologies for probabilistic constitutive relationships of material strength models

    NASA Technical Reports Server (NTRS)

    Boyce, Lola; Lovelace, Thomas B.

    1989-01-01

    FORTRAN programs RANDOM3 and RANDOM4 are documented in the form of a user's manual. Both programs are based on fatigue strength reduction, using a probabilistic constitutive model. The programs predict the random lifetime of an engine component to reach a given fatigue strength. The theoretical backgrounds, input data instructions, and sample problems illustrating the use of the programs are included.

  13. Fatigue crack growth model RANDOM2 user manual. Appendix 1: Development of advanced methodologies for probabilistic constitutive relationships of material strength models

    NASA Technical Reports Server (NTRS)

    Boyce, Lola; Lovelace, Thomas B.

    1989-01-01

    FORTRAN program RANDOM2 is presented in the form of a user's manual. RANDOM2 is based on fracture mechanics using a probabilistic fatigue crack growth model. It predicts the random lifetime of an engine component to reach a given crack size. Details of the theoretical background, input data instructions, and a sample problem illustrating the use of the program are included.

  14. A Practical Probabilistic Graphical Modeling Tool for Weighing Ecological Risk-Based Evidence

    EPA Science Inventory

    Past weight-of-evidence frameworks for adverse ecological effects have provided soft-scoring procedures for judgments based on the quality and measured attributes of evidence. Here, we provide a flexible probabilistic structure for weighing and integrating lines of evidence for e...

  15. High-Throughput Detection of Prostate Cancer in Histological Sections Using Probabilistic Pairwise Markov Models

    PubMed Central

    Monaco, James P.; Tomaszewski, John E.; Feldman, Michael D.; Hagemann, Ian; Moradi, Mehdi; Mousavi, Parvin; Boag, Alexander; Davidson, Chris; Abolmaesumi, Purang; Madabhushi, Anant

    2010-01-01

    In this paper we present a high-throughput system for detecting regions of carcinoma of the prostate (CaP) in HSs from radical prostatectomies (RPs) using probabilistic pairwise Markov models (PPMMs), a novel type of Markov random field (MRF). At diagnostic resolution a digitized HS can contain 80K×70K pixels — far too many for current automated Gleason grading algorithms to process. However, grading can be separated into two distinct steps: 1) detecting cancerous regions and 2) then grading these regions. The detection step does not require diagnostic resolution and can be performed much more quickly. Thus, we introduce a CaP detection system capable of analyzing an entire digitized whole-mount HS (2×1.75 cm2) in under three minutes (on a desktop computer) while achieving a CaP detection sensitivity and specificity of 0.87 and 0.90, respectively. We obtain this high-throughput by tailoring the system to analyze the HSs at low resolution (8 µm per pixel). This motivates the following algorithm: Step 1) glands are segmented, Step 2) the segmented glands are classified as malignant or benign, and Step 3) the malignant glands are consolidated into continuous regions. The classification of individual glands leverages two features: gland size and the tendency for proximate glands to share the same class. The latter feature describes a spatial dependency which we model using a Markov prior. Typically, Markov priors are expressed as the product of potential functions. Unfortunately, potential functions are mathematical abstractions, and constructing priors through their selection becomes an ad hoc procedure, resulting in simplistic models such as the Potts. Addressing this problem, we introduce PPMMs which formulate priors in terms of probability density functions, allowing the creation of more sophisticated models. To demonstrate the efficacy of our CaP detection system and assess the advantages of using a PPMM prior instead of the Potts, we alternately incorporate

  16. A probabilistic spatial-temporal model for vent opening clustering at Campi Flegrei caldera (Italy)

    NASA Astrophysics Data System (ADS)

    Bevilacqua, A.; Isaia, R.; Flandoli, F.; Neri, A.; Quaranta, D.

    2014-12-01

    Campi Flegrei (CF) is a densely urbanized caldera with a very high volcanic risk. Its more recent volcanic activity was characterized in the last 15 kyrs by more than 70 explosive events of variable scale and vent location. The sequence of eruptive events at CF is remarkably inhomogeneous, both in space and time. Eruptions concentrated over periods from a few centuries to a few millennia and alternated with periods of quiescence lasting up to several millennia. As a consequence, activity has been subdivided into three distinct epochs, i.e. Epoch I, 15 - 9.5 kyrs, Epoch II, 8.6 - 8.2 kyrs, and Epoch III, 4.8 - 3.7 kyrs BP [e.g. Orsi et al., 2004; Smith et al., 2011]. The eruptive record also shows the presence of clusters of events in space-time, i.e. the opening of a new vent in a particular location and at a specific time seems to increase the probability of another vent opening in the nearby area and in the next decades-centuries (self-exciting effect). Probabilistic vent opening mapping, conditional on the occurrence of a new event and able to account for some of the intrinsic uncertainties affecting the system, has been investigated in some recent studies [e.g. Selva et al. 2011, Bevilacqua et al. 2014, in preparation], but a spatial-temporal model of the sequence of volcanic activity remains an open issue. Hence we have developed a time-space mathematical model that takes into account both the self-exciting behaviour of the system and the significant uncertainty affecting the eruptive record. Based on the past eruptive record of the volcano, the model allows sequences of future events to be simulated as well as the spatial and temporal evolution of the system to be better understood. In addition, based on the assumption that the last eruptive event, which occurred in 1538 AD (Monte Nuovo eruption), is the first event of a new epoch of activity, the model can estimate the probability of a new vent opening at CF in the next decades.
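
    The self-exciting effect described above is commonly formalised with a Hawkes-type conditional intensity, sketched below in its purely temporal form with an exponential decay kernel. The background rate and kernel parameters are invented for illustration and are not the authors' fitted values.

      import math

      def conditional_intensity(t, past_events, mu=0.002, alpha=0.05, beta=0.01):
          """Hawkes-type temporal intensity: a background rate mu plus an
          exponentially decaying contribution from every past event.
          Units and parameter values are purely illustrative."""
          return mu + sum(alpha * math.exp(-beta * (t - ti)) for ti in past_events if ti < t)

      if __name__ == "__main__":
          events = [0.0, 120.0, 150.0]          # hypothetical event times (e.g. years from an origin)
          for t in (100.0, 151.0, 400.0):
              print(t, round(conditional_intensity(t, events), 4))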

  17. Self assembly of rectangular shapes on concentration programming and probabilistic tile assembly models.

    PubMed

    Kundeti, Vamsi; Rajasekaran, Sanguthevar

    2012-06-01

    ), to self assemble rectangles (of fixed aspect ratio) with high probability. The tile complexity of our algorithm is Θ(log(n)) and is optimal on the probabilistic tile assembly model (PTAM), with n being an upper bound on the dimensions of a rectangle.

  18. From Recurrent Choice to Skill Learning: A Reinforcement-Learning Model

    ERIC Educational Resources Information Center

    Fu, Wai-Tat; Anderson, John R.

    2006-01-01

    The authors propose a reinforcement-learning mechanism as a model for recurrent choice and extend it to account for skill learning. The model was inspired by recent research in neurophysiological studies of the basal ganglia and provides an integrated explanation of recurrent choice behavior and skill learning. The behavior includes effects of…
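
    A minimal sketch of a reinforcement-learning choice mechanism of the general kind proposed: option values updated by reward prediction errors and choices drawn from a softmax rule. The two-armed task, learning rate and temperature are illustrative, not the authors' exact model or parameters.

      import math
      import random

      def softmax_choice(values, temperature, rng):
          """Choose an option with probability proportional to exp(value / T)."""
          weights = [math.exp(v / temperature) for v in values]
          r = rng.random() * sum(weights)
          for i, w in enumerate(weights):
              r -= w
              if r <= 0:
                  return i
          return len(values) - 1

      def simulate_recurrent_choice(p_reward=(0.7, 0.3), alpha=0.1, temperature=0.2,
                                    trials=500, seed=2):
          """Two-armed choice task: values are updated by a reward prediction error,
          and choices are probabilistic (softmax). Parameters are illustrative."""
          rng = random.Random(seed)
          values = [0.0, 0.0]
          picks = [0, 0]
          for _ in range(trials):
              a = softmax_choice(values, temperature, rng)
              reward = 1.0 if rng.random() < p_reward[a] else 0.0
              values[a] += alpha * (reward - values[a])   # prediction-error update
              picks[a] += 1
          return values, picks

      if __name__ == "__main__":
          values, picks = simulate_recurrent_choice()
          print("learned values:", [round(v, 2) for v in values], "choice counts:", picks)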

  19. On the use of hierarchical probabilistic models for characterizing and managing uncertainty in risk/safety assessment.

    PubMed

    Kodell, Ralph L; Chen, James J

    2007-04-01

    A general probabilistically-based approach is proposed for both cancer and noncancer risk/safety assessments. The familiar framework of the original ADI/RfD formulation is used, substituting in the numerator a benchmark dose derived from a hierarchical pharmacokinetic/pharmacodynamic model and in the denominator a unitary uncertainty factor derived from a hierarchical animal/average human/sensitive human model. The empirical probability distributions of the numerator and denominator can be combined to produce an empirical human-equivalent distribution for an animal-derived benchmark dose in external-exposure units.
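
    The numerator/denominator construction described above can be mimicked with a small Monte Carlo: draw a benchmark dose and a unitary uncertainty factor from assumed distributions and take their ratio to obtain a human-equivalent dose distribution. The log-normal shapes and parameter values below are placeholders, not those of the paper.

      import math
      import random

      def human_equivalent_dose_samples(n=50_000, seed=4):
          """Monte Carlo combination of a benchmark-dose distribution (numerator)
          and a unitary uncertainty-factor distribution (denominator); both are
          log-normal here purely for illustration, with placeholder parameters."""
          rng = random.Random(seed)
          samples = []
          for _ in range(n):
              bmd = rng.lognormvariate(math.log(10.0), 0.3)   # hypothetical benchmark dose, mg/kg-day
              uf = rng.lognormvariate(math.log(30.0), 0.5)    # hypothetical animal-to-sensitive-human factor
              samples.append(bmd / uf)
          return sorted(samples)

      if __name__ == "__main__":
          s = human_equivalent_dose_samples()
          print("median:", round(s[len(s) // 2], 3),
                "5th percentile:", round(s[int(0.05 * len(s))], 3))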

  20. Comparison of Probabilistic Coastal Inundation Maps Based on Historical Storms and Statistically Modeled Storm Ensemble

    NASA Astrophysics Data System (ADS)

    Feng, X.; Sheng, Y.; Condon, A. J.; Paramygin, V. A.; Hall, T.

    2012-12-01

    A cost effective method, JPM-OS (Joint Probability Method with Optimal Sampling), for determining storm response and inundation return frequencies was developed and applied to quantify the hazard of hurricane storm surges and inundation along the Southwest FL,US coast (Condon and Sheng 2012). The JPM-OS uses piecewise multivariate regression splines coupled with dimension adaptive sparse grids to enable the generation of a base flood elevation (BFE) map. Storms are characterized by their landfall characteristics (pressure deficit, radius to maximum winds, forward speed, heading, and landfall location) and a sparse grid algorithm determines the optimal set of storm parameter combinations so that the inundation from any other storm parameter combination can be determined. The end result is a sample of a few hundred (197 for SW FL) optimal storms which are simulated using a dynamically coupled storm surge / wave modeling system CH3D-SSMS (Sheng et al. 2010). The limited historical climatology (1940 - 2009) is explored to develop probabilistic characterizations of the five storm parameters. The probability distributions are discretized and the inundation response of all parameter combinations is determined by the interpolation in five-dimensional space of the optimal storms. The surge response and the associated joint probability of the parameter combination is used to determine the flood elevation with a 1% annual probability of occurrence. The limited historical data constrains the accuracy of the PDFs of the hurricane characteristics, which in turn affect the accuracy of the BFE maps calculated. To offset the deficiency of limited historical dataset, this study presents a different method for producing coastal inundation maps. Instead of using the historical storm data, here we adopt 33,731 tracks that can represent the storm climatology in North Atlantic basin and SW Florida coasts. This large quantity of hurricane tracks is generated from a new statistical model

  1. Impact of fault models on probabilistic seismic hazard assessment: the example of the West Corinth rift.

    NASA Astrophysics Data System (ADS)

    Chartier, Thomas; Scotti, Oona; Boiselet, Aurelien; Lyon-Caen, Hélène

    2016-04-01

    Including faults in probabilistic seismic hazard assessment tends to increase the degree of uncertainty in the results due to the intrinsically uncertain nature of the fault data. This is especially the case in the low to moderate seismicity regions of Europe, where slow slipping faults are difficult to characterize. In order to better understand the key parameters that control the uncertainty in the fault-related hazard computations, we propose to build an analytic tool that provides a clear link between the different components of the fault-related hazard computations and their impact on the results. This will allow identifying the important parameters that need to be better constrained in order to reduce the resulting uncertainty in hazard and also provide a more hazard-oriented strategy for collecting relevant fault parameters in the field. The tool will be illustrated through the example of the West Corinth rifts fault-models. Recent work performed in the gulf has shown the complexity of the normal faulting system that is accommodating the extensional deformation of the rift. A logic-tree approach is proposed to account for this complexity and the multiplicity of scientifically defendable interpretations. At the nodes of the logic tree, different options that could be considered at each step of the fault-related seismic hazard will be considered. The first nodes represent the uncertainty in the geometries of the faults and their slip rates, which can derive from different data and methodologies. The subsequent node explores, for a given geometry/slip rate of faults, different earthquake rupture scenarios that may occur in the complex network of faults. The idea is to allow the possibility of several faults segments to break together in a single rupture scenario. To build these multiple-fault-segment scenarios, two approaches are considered: one based on simple rules (i.e. minimum distance between faults) and a second one that relies on physically

  2. Modeling Educational Choices. A Binomial Logit Model Applied to the Demand for Higher Education.

    ERIC Educational Resources Information Center

    Jimenez, Juan de Dios; Salas-Velasco, Manual

    2000-01-01

    Presents a microeconomic analysis of the choice of university degree course (3 year or 4 year course) Spanish students make on finishing their secondary studies and applies the developed binomial logit model to survey data from 388 high school graduates. Findings show the importance of various factors in determining the likelihood of choosing the…
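
    A binomial logit of the kind applied above can be sketched as follows: the probability of choosing one alternative is a logistic function of covariates, fitted by maximum likelihood. The covariates, the tiny synthetic dataset and the plain gradient-ascent fit below are illustrative and are not the survey data or estimates of the paper.

      import math

      def logit_prob(x, beta):
          """P(choose the longer degree course | covariates x) under a binomial logit."""
          z = sum(b * xi for b, xi in zip(beta, x))
          return 1.0 / (1.0 + math.exp(-z))

      def fit_logit(data, labels, lr=0.1, epochs=2000):
          """Plain stochastic gradient-ascent maximum likelihood for the logit model."""
          beta = [0.0] * len(data[0])
          for _ in range(epochs):
              for x, y in zip(data, labels):
                  p = logit_prob(x, beta)
                  beta = [b + lr * (y - p) * xi for b, xi in zip(beta, x)]
          return beta

      if __name__ == "__main__":
          # hypothetical covariates: [intercept, parental higher education (0/1), secondary grade (standardised)]
          data = [[1, 0, -0.5], [1, 1, 0.2], [1, 1, 1.0], [1, 0, 0.1], [1, 1, -0.3], [1, 0, 1.2]]
          labels = [0, 1, 1, 0, 1, 1]
          beta = fit_logit(data, labels)
          print("coefficients:", [round(b, 2) for b in beta])
          print("P(longer course | educated parents, high grade):", round(logit_prob([1, 1, 1.0], beta), 2))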

  3. Predicting Rib Fracture Risk With Whole-Body Finite Element Models: Development and Preliminary Evaluation of a Probabilistic Analytical Framework

    PubMed Central

    Forman, Jason L.; Kent, Richard W.; Mroz, Krystoffer; Pipkorn, Bengt; Bostrom, Ola; Segui-Gomez, Maria

    2012-01-01

    This study sought to develop a strain-based probabilistic method to predict rib fracture risk with whole-body finite element (FE) models, and to describe a method to combine the results with collision exposure information to predict injury risk and potential intervention effectiveness in the field. An age-adjusted ultimate strain distribution was used to estimate local rib fracture probabilities within an FE model. These local probabilities were combined to predict injury risk and severity within the whole ribcage. The ultimate strain distribution was developed from a literature dataset of 133 tests. Frontal collision simulations were performed with the THUMS (Total HUman Model for Safety) model with four levels of delta-V and two restraints: a standard 3-point belt and a progressive 3.5–7 kN force-limited, pretensioned (FL+PT) belt. The results of three simulations (29 km/h standard, 48 km/h standard, and 48 km/h FL+PT) were compared to matched cadaver sled tests. The numbers of fractures predicted for the comparison cases were consistent with those observed experimentally. Combining these results with field exposure information (ΔV, NASS-CDS 1992–2002) suggests an 8.9% probability of incurring AIS3+ rib fractures for a 60-year-old restrained by a standard belt in a tow-away frontal collision with this restraint, vehicle, and occupant configuration, compared to 4.6% for the FL+PT belt. This is the first study to describe a probabilistic framework to predict rib fracture risk based on strains observed in human-body FE models. Using this analytical framework, future efforts may incorporate additional subject or collision factors for multi-variable probabilistic injury prediction. PMID:23169122
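
    The combination step described above, turning local strain-based fracture probabilities into whole-ribcage risk, can be illustrated under a simplifying independence assumption between locations (an assumption of this sketch, not necessarily of the study); the peak strains and ultimate-strain distribution parameters below are invented.

      import math

      def local_fracture_prob(strain, mean_ult=0.02, sd_ult=0.005):
          """P(fracture) at one location: probability that the ultimate strain
          (normal here for illustration) lies below the FE-predicted peak strain."""
          z = (strain - mean_ult) / (sd_ult * math.sqrt(2))
          return 0.5 * (1.0 + math.erf(z))

      def prob_at_least_k_fractures(strains, k):
          """Probability of k or more fractures, assuming independent locations;
          computed by dynamic programming over the number of fractures."""
          probs = [local_fracture_prob(s) for s in strains]
          dist = [1.0]                       # dist[j] = P(exactly j fractures so far)
          for p in probs:
              new = [0.0] * (len(dist) + 1)
              for j, d in enumerate(dist):
                  new[j] += d * (1 - p)
                  new[j + 1] += d * p
              dist = new
          return sum(dist[k:])

      if __name__ == "__main__":
          peak_strains = [0.010, 0.018, 0.022, 0.025, 0.015, 0.021]   # hypothetical per-rib peak strains
          print("P(>=3 fractures):", round(prob_at_least_k_fractures(peak_strains, 3), 3))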

  4. Predicting rib fracture risk with whole-body finite element models: development and preliminary evaluation of a probabilistic analytical framework.

    PubMed

    Forman, Jason L; Kent, Richard W; Mroz, Krystoffer; Pipkorn, Bengt; Bostrom, Ola; Segui-Gomez, Maria

    2012-01-01

    This study sought to develop a strain-based probabilistic method to predict rib fracture risk with whole-body finite element (FE) models, and to describe a method to combine the results with collision exposure information to predict injury risk and potential intervention effectiveness in the field. An age-adjusted ultimate strain distribution was used to estimate local rib fracture probabilities within an FE model. These local probabilities were combined to predict injury risk and severity within the whole ribcage. The ultimate strain distribution was developed from a literature dataset of 133 tests. Frontal collision simulations were performed with the THUMS (Total HUman Model for Safety) model with four levels of delta-V and two restraints: a standard 3-point belt and a progressive 3.5-7 kN force-limited, pretensioned (FL+PT) belt. The results of three simulations (29 km/h standard, 48 km/h standard, and 48 km/h FL+PT) were compared to matched cadaver sled tests. The numbers of fractures predicted for the comparison cases were consistent with those observed experimentally. Combining these results with field exposure information (ΔV, NASS-CDS 1992-2002) suggests an 8.9% probability of incurring AIS3+ rib fractures for a 60-year-old restrained by a standard belt in a tow-away frontal collision with this restraint, vehicle, and occupant configuration, compared to 4.6% for the FL+PT belt. This is the first study to describe a probabilistic framework to predict rib fracture risk based on strains observed in human-body FE models. Using this analytical framework, future efforts may incorporate additional subject or collision factors for multi-variable probabilistic injury prediction.

  5. The Use of Predictive Models in Forecasting Student Choice.

    ERIC Educational Resources Information Center

    Tuckman, Howard P.

    This paper uses ordinary least squares regression to obtain probabilities for the post-graduation choices of high school seniors, and it presents an illustration of the use of these probabilities in calculating future income. Problems raised by the use of the least squares regression are discussed. The benefits of higher education and ways in…

  6. Item Response Modeling of Forced-Choice Questionnaires

    ERIC Educational Resources Information Center

    Brown, Anna; Maydeu-Olivares, Alberto

    2011-01-01

    Multidimensional forced-choice formats can significantly reduce the impact of numerous response biases typically associated with rating scales. However, if scored with classical methodology, these questionnaires produce ipsative data, which lead to distorted scale relationships and make comparisons between individuals problematic. This research…

  7. Linear-Nonlinear-Poisson Models of Primate Choice Dynamics

    ERIC Educational Resources Information Center

    Corrado, Greg S.; Sugrue, Leo P.; Seung, H. Sebastian; Newsome, William T.

    2005-01-01

    The equilibrium phenomenon of matching behavior traditionally has been studied in stationary environments. Here we attempt to uncover the local mechanism of choice that gives rise to matching by studying behavior in a highly dynamic foraging environment. In our experiments, 2 rhesus monkeys ("Macaca mulatta") foraged for juice rewards by making…

  8. Interrelationship of Career Choice Competencies and Career Choice Attitudes of Ninth-Grade Pupils: Testing Hypotheses Derived from Crites' Model of Career Maturity

    ERIC Educational Resources Information Center

    Westbrook, Bert W.

    1976-01-01

    Presents the results of a study examining whether the two dimensions of Career Choice Attitudes and Career Choice Competencies are interrelated as hypothesized in the Crites model of career maturity. To test six associational hypotheses derived from the Crites model, three career maturity instruments were administered to ninth-grade pupils (N=90).…

  9. Probabilistic Causation without Probability.

    ERIC Educational Resources Information Center

    Holland, Paul W.

    The failure of Hume's "constant conjunction" to describe apparently causal relations in science and everyday life has led to various "probabilistic" theories of causation of which the study by P. C. Suppes (1970) is an important example. A formal model that was developed for the analysis of comparative agricultural experiments…

  10. A probabilistic modeling approach to assess human inhalation exposure risks to airborne aflatoxin B1 (AFB1)

    NASA Astrophysics Data System (ADS)

    Liao, Chung-Min; Chen, Szu-Chieh

    To assess human lung exposure to airborne aflatoxin B1 (AFB1) during on-farm activities including swine feeding, storage bin cleaning, corn harvest, and grain elevator loading/unloading, we present a probabilistic risk model appraised with empirical data. The model integrates probabilistic exposure profiles from a compartmental lung model with dose-response relationships reconstructed from an empirical three-parameter Hill equation model describing AFB1 cytotoxicity (inhibition response) in human bronchial epithelial cells, to quantitatively estimate inhalation exposure risks. The risk assessment results indicate that exposure to airborne AFB1 poses little risk during corn harvest and grain elevator loading/unloading, yet a relatively high risk during swine feeding and storage bin cleaning. Applying a joint probability function method based on exceedance profiles, we estimate an alarmingly high potential risk for the bronchial region (inhibition = 56.69% with 95% confidence interval (CI): 35.05-72.87%) and the bronchiolar region (inhibition = 44.93% with 95% CI: 21.61-66.78%) during swine feeding activity. The parameterized predictive model supports a risk-management framework for assessing carcinogenic risk in occupational settings where inhalation of AFB1-contaminated dust occurs.
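
    A minimal sketch of the dose-response and exceedance steps described above, assuming a three-parameter Hill equation for the inhibition response and a lognormal exposure distribution; all parameter values and the dose units are placeholders, not the fitted values from the study.

```python
import numpy as np

def hill_inhibition(dose, e_max, ec50, n):
    """Three-parameter Hill equation: inhibition (%) as a function of dose.
    e_max is the maximum inhibition, ec50 the half-maximal dose, n the Hill
    coefficient; the values used below are placeholders."""
    return e_max * dose**n / (ec50**n + dose**n)

# Monte Carlo exposure profile for one activity (hypothetical lognormal doses)
rng = np.random.default_rng(1)
doses = rng.lognormal(mean=2.0, sigma=0.8, size=10_000)
inhibition = hill_inhibition(doses, e_max=100.0, ec50=15.0, n=1.5)

# Exceedance profile: probability that the inhibition response exceeds a threshold
for threshold in (20, 50, 80):
    print(f"P(inhibition > {threshold}%) = {(inhibition > threshold).mean():.2f}")
```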

  11. Developing EHR-driven heart failure risk prediction models using CPXR(Log) with the probabilistic loss function.

    PubMed

    Taslimitehrani, Vahid; Dong, Guozhu; Pereira, Naveen L; Panahiazar, Maryam; Pathak, Jyotishman

    2016-04-01

    Computerized survival prediction in healthcare, identifying the risk of disease mortality, helps healthcare providers to effectively manage their patients by providing appropriate treatment options. In this study, we propose to apply a classification algorithm, Contrast Pattern Aided Logistic Regression (CPXR(Log)) with the probabilistic loss function, to develop and validate prognostic risk models to predict 1-, 2-, and 5-year survival in heart failure (HF) using data from electronic health records (EHRs) at Mayo Clinic. The CPXR(Log) constructs a pattern-aided logistic regression model defined by several patterns and corresponding local logistic regression models. One of the models generated by CPXR(Log) achieved an AUC and accuracy of 0.94 and 0.91, respectively, and significantly outperformed prognostic models reported in prior studies. Data extracted from EHRs allowed incorporation of patient co-morbidities into our models, which helped improve the performance of the CPXR(Log) models (15.9% AUC improvement), although it did not improve the accuracy of the models built by other classifiers. We also propose a probabilistic loss function to determine the large-error and small-error instances. The new loss function used in the algorithm outperforms the functions used in previous studies by a 1% improvement in AUC. This study revealed that using EHR data to build prediction models can be very challenging with existing classification methods due to the high dimensionality and complexity of EHR data. The risk models developed by CPXR(Log) also reveal that HF is a highly heterogeneous disease, i.e., different subgroups of HF patients require different types of considerations in their diagnosis and treatment. Our risk models provided two valuable insights for application of predictive modeling techniques in biomedicine: logistic risk models often make systematic prediction errors, and it is prudent to use subgroup-based prediction models such as those given by CPXR

  12. The hidden Markov Topic model: a probabilistic model of semantic representation.

    PubMed

    Andrews, Mark; Vigliocco, Gabriella

    2010-01-01

    In this paper, we describe a model that learns semantic representations from the distributional statistics of language. This model, however, goes beyond the common bag-of-words paradigm, and infers semantic representations by taking into account the inherent sequential nature of linguistic data. The model we describe, which we refer to as a Hidden Markov Topics model, is a natural extension of the current state of the art in Bayesian bag-of-words models, that is, the Topics model of Griffiths, Steyvers, and Tenenbaum (2007), preserving its strengths while extending its scope to incorporate more fine-grained linguistic information.

  13. Twenty-first century probabilistic projections of precipitation over Ontario, Canada through a regional climate model ensemble

    NASA Astrophysics Data System (ADS)

    Wang, Xiuquan; Huang, Guohe; Liu, Jinliang

    2016-06-01

    In this study, probabilistic projections of precipitation for the Province of Ontario are developed through a regional climate model ensemble to help investigate how global warming would affect its local climate. The PRECIS regional climate modeling system is employed to perform ensemble simulations, driven by a set of boundary conditions from a HadCM3-based perturbed-physics ensemble. The PRECIS ensemble simulations are fed into a Bayesian hierarchical model to quantify uncertain factors affecting the resulting projections of precipitation and thus generate probabilistic precipitation changes at grid point scales. Following that, reliable precipitation projections throughout the twenty-first century are developed for the entire province by applying the probabilistic changes to the observed precipitation. The results show that the vast majority of cities in Ontario are likely to experience positive changes in annual precipitation in the 2030s, 2050s, and 2080s in comparison to the baseline observations. This may suggest that the whole province is likely to gain more precipitation throughout the twenty-first century in response to global warming. The analyses of the projections of seasonal precipitation further demonstrate that the entire province is likely to receive more precipitation in winter, spring, and autumn throughout this century, while summer precipitation is only likely to increase slightly in the 2030s and would decrease gradually afterwards. However, because the magnitude of the projected decrease in summer precipitation is relatively small in comparison with the anticipated increases in the other three seasons, the annual precipitation over Ontario is likely to show a progressive increase throughout the twenty-first century (by 7.0% in the 2030s, 9.5% in the 2050s, and 12.6% in the 2080s). Besides, the degree of uncertainty in the precipitation projections is analyzed. The results suggest that future changes in spring precipitation show a higher degree of uncertainty than other

  14. Is the basic conditional probabilistic?

    PubMed

    Goodwin, Geoffrey P

    2014-06-01

    Nine experiments examined whether individuals treat the meaning of basic conditional assertions as deterministic or probabilistic. In Experiments 1-4, participants were presented with either probabilistic or deterministic relations, which they had to describe with a conditional. These experiments consistently showed that people tend only to use the basic if p then q construction to describe deterministic relations between antecedent and consequent, whereas they use a probabilistically qualified construction, if p then probably q, to describe probabilistic relations-suggesting that the default interpretation of the conditional is deterministic. Experiments 5 and 6 showed that when directly asked, individuals typically report that conditional assertions admit no exceptions (i.e., they are seen as deterministic). Experiments 7-9 showed that individuals judge the truth of conditional assertions in accordance with this deterministic interpretation. Together, these results pose a challenge to probabilistic accounts of the meaning of conditionals and support mental models, formal rules, and suppositional accounts.

  15. Probabilistic Risk Model for Organ Doses and Acute Health Effects of Astronauts on Lunar Missions

    NASA Technical Reports Server (NTRS)

    Kim, Myung-Hee Y.; Hu, Shaowen; Nounu, Hatem N.; Cucinotta, Francis A.

    2009-01-01

    Exposure to large solar particle events (SPEs) is a major concern during EVAs on the lunar surface and in Earth-to-Lunar transit. Fifteen percent of crew time may be spent on EVA with minimal radiation shielding. Therefore, an accurate assessment of SPE occurrence probability is required for mission planning by NASA. We apply probabilistic risk assessment (PRA) for radiation protection of crews and optimization of lunar mission planning.

  16. A Probabilistic Model of Global-Scale Seismology with Veith-Clawson Amplitude Corrections

    NASA Astrophysics Data System (ADS)

    Arora, N. S.; Russell, S.

    2013-12-01

    We present a probabilistic generative model of global-scale seismology, NET-VISA, that is designed to address the event detection and location problem of seismic monitoring. The model is based on a standard Bayesian framework with prior probabilities for event generation and propagation as well as likelihoods of detection and arrival (or onset) parameters. The model is supplemented with a greedy search algorithm that iteratively improves the predicted bulletin with respect to the posterior probability. Our prior model incorporates both seismic theory and empirical observations as appropriate. For instance, we use empirical observations for the expected rates of earthquakes at each point on the Earth, while we use the Gutenberg-Richter law for the expected magnitude distribution of these earthquakes. In this work, we describe an extension of our model where we include the Veith-Clawson (1972) amplitude decline curves in our empirically calibrated arrival amplitude model. While this change doesn't alter the overall event-detection results, we have chosen to keep the Veith-Clawson curves since they are more seismically accurate. We also describe a recent change to our search algorithm, whereby we now consider multiple hypotheses when we encounter a series of closely spaced arrivals that could be explained by either a single event or multiple co-located events. This change has led to a sharp improvement in our results on large aftershock sequences. We use the analyst-curated LEB bulletin or the REB bulletin, which is the published product of the IDC, as a reference and measure the overlap (percentage of reference events that are matched) and inconsistency (percentage of test bulletin events that don't match anything in the reference) of a one-to-one matching between the test and the reference bulletins. In the table below we show results for NET-VISA and SEL3, which is produced by the existing GA software, for the whole of 2009. These results show that NET
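
    As an illustration of one ingredient of such an event prior, the sketch below samples magnitudes from a truncated Gutenberg-Richter distribution by inverting its CDF; the b-value and magnitude bounds are placeholders, and this is not code from NET-VISA.

```python
import numpy as np

def sample_gutenberg_richter(n, b=1.0, m_min=3.0, m_max=8.0, rng=None):
    """Sample magnitudes from a truncated Gutenberg-Richter prior,
    p(m) proportional to 10**(-b*m) on [m_min, m_max], via inverse-CDF sampling."""
    rng = rng or np.random.default_rng()
    beta = b * np.log(10.0)
    u = rng.uniform(size=n)
    return m_min - np.log(1.0 - u * (1.0 - np.exp(-beta * (m_max - m_min)))) / beta

magnitudes = sample_gutenberg_richter(100_000, b=1.0)
print("fraction of sampled events with magnitude >= 5:", (magnitudes >= 5.0).mean())
```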

  17. Circularly-symmetric complex normal ratio distribution for scalar transmissibility functions. Part II: Probabilistic model and validation

    NASA Astrophysics Data System (ADS)

    Yan, Wang-Ji; Ren, Wei-Xin

    2016-12-01

    In Part I of this study, some new theorems, corollaries and lemmas on circularly-symmetric complex normal ratio distribution have been mathematically proved. This part II paper is dedicated to providing a rigorous treatment of statistical properties of raw scalar transmissibility functions at an arbitrary frequency line. On the basis of statistics of raw FFT coefficients and circularly-symmetric complex normal ratio distribution, explicit closed-form probabilistic models are established for both multivariate and univariate scalar transmissibility functions. Also, remarks on the independence of transmissibility functions at different frequency lines and the shape of the probability density function (PDF) of the univariate case are presented. The statistical structures of the probabilistic models are concise, compact and easily implemented with low computational effort. They hold for general stationary vector processes, either Gaussian stochastic processes or non-Gaussian stochastic processes. The accuracy of the proposed models is verified using a numerical example as well as field test data of a high-rise building and a long-span cable-stayed bridge. This study yields new insights into the qualitative analysis of the uncertainty of scalar transmissibility functions, which paves the way for developing new statistical methodologies for modal analysis, model updating or damage detection using responses only without input information.

  18. A Statistical Model of the Grammatical Choices in Child Production of Dative Sentences

    ERIC Educational Resources Information Center

    de Marneffe, Marie-Catherine; Grimm, Scott; Arnon, Inbal; Kirby, Susannah; Bresnan, Joan

    2012-01-01

    Focusing on children's production of the dative alternation in English, we examine whether children's choices are influenced by the same factors that influence adults' choices, and whether, like adults, they are sensitive to multiple factors simultaneously. We do so by using mixed-effect regression models to analyse child and child-directed…

  19. Testing a Model of Nontraditional Career Choice Goals with Mexican American Adolescent Men

    ERIC Educational Resources Information Center

    Flores, Lisa Y.; Navarro, Rachel L.; Smith, Jamie L.; Ploszaj, Ann M.

    2006-01-01

    This study examined the nontraditional career choice goals of 302 Mexican American adolescent men using an extended version of Lent, Brown, and Hackett's (1994) career choice model. It was hypothesized that several background contextual variables (e.g., acculturation level, parental support, perceived occupational gender barriers) would predict…

  20. Modeling the Bullying Prevention Program Preferences of Educators: A Discrete Choice Conjoint Experiment

    ERIC Educational Resources Information Center

    Cunningham, Charles E.; Vaillancourt, Tracy; Rimas, Heather; Deal, Ken; Cunningham, Lesley; Short, Kathy; Chen, Yvonne

    2009-01-01

    We used discrete choice conjoint analysis to model the bullying prevention program preferences of educators. Using themes from computerized decision support lab focus groups (n = 45 educators), we composed 20 three-level bullying prevention program design attributes. Each of 1,176 educators completed 25 choice tasks presenting experimentally…

  1. How the twain can meet: Prospect theory and models of heuristics in risky choice.

    PubMed

    Pachur, Thorsten; Suter, Renata S; Hertwig, Ralph

    2017-03-01

    Two influential approaches to modeling choice between risky options are algebraic models (which focus on predicting the overt decisions) and models of heuristics (which are also concerned with capturing the underlying cognitive process). Because they rest on fundamentally different assumptions and algorithms, the two approaches are usually treated as antithetical, or even incommensurable. Drawing on cumulative prospect theory (CPT; Tversky & Kahneman, 1992) as the currently most influential instance of a descriptive algebraic model, we demonstrate how the two modeling traditions can be linked. CPT's algebraic functions characterize choices in terms of psychophysical (diminishing sensitivity to probabilities and outcomes) as well as psychological (risk aversion and loss aversion) constructs. Models of heuristics characterize choices as rooted in simple information-processing principles such as lexicographic and limited search. In computer simulations, we estimated CPT's parameters for choices produced by various heuristics. The resulting CPT parameter profiles portray each of the choice-generating heuristics in psychologically meaningful ways, capturing, for instance, differences in how the heuristics process probability information. Furthermore, CPT parameters can reflect a key property of many heuristics, lexicographic search, and track the environment-dependent behavior of heuristics. Finally, we show, in both an empirical and a model recovery study, how CPT parameter profiles can be used to detect the operation of heuristics. We also address the limits of CPT's ability to capture choices produced by heuristics. Our results highlight an untapped potential of CPT as a measurement tool to characterize the information processing underlying risky choice.
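
    For reference, the sketch below evaluates a simple two-outcome gain gamble with CPT's value and probability-weighting functions in the Tversky and Kahneman (1992) functional forms; the parameter values are the 1992 median estimates and are used only for illustration, not as parameters estimated in this study.

```python
import numpy as np

def value(x, alpha=0.88, lam=2.25):
    """CPT value function: concave for gains, steeper (loss-averse) for losses."""
    x = np.asarray(x, dtype=float)
    mag = np.abs(x) ** alpha
    return np.where(x >= 0, mag, -lam * mag)

def weight(p, gamma=0.61):
    """Inverse-S probability weighting function for gains."""
    return p**gamma / (p**gamma + (1 - p) ** gamma) ** (1.0 / gamma)

def cpt_two_outcome_gain(x_hi, p_hi, x_lo):
    """CPT value of a gamble with outcomes x_hi > x_lo >= 0:
    V = w(p_hi) * v(x_hi) + (1 - w(p_hi)) * v(x_lo)."""
    w = weight(p_hi)
    return w * value(x_hi) + (1 - w) * value(x_lo)

# Overweighting of small probabilities: the long-shot gamble can beat a sure 5
print("CPT value of (100, p=.05; 0):", float(cpt_two_outcome_gain(100, 0.05, 0)))
print("CPT value of a sure 5      :", float(value(5)))
```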

  2. Agent autonomy approach to probabilistic physics-of-failure modeling of complex dynamic systems with interacting failure mechanisms

    NASA Astrophysics Data System (ADS)

    Gromek, Katherine Emily

    A novel computational and inference framework for physics-of-failure (PoF) reliability modeling of complex dynamic systems has been established in this research. The PoF-based reliability models are used to perform a real-time simulation of system failure processes, so that system-level reliability modeling constitutes inferences from checking the status of component-level reliability at any given time. The "agent autonomy" concept is applied as a solution method for the system-level probabilistic PoF-based (i.e. PPoF-based) modeling. This concept originated from artificial intelligence (AI) as a leading intelligent computational inference approach in the modeling of multi-agent systems (MAS). The concept of agent autonomy in the context of reliability modeling was first proposed by M. Azarkhail [1], where a fundamentally new idea of system representation by autonomous intelligent agents for the purpose of reliability modeling was introduced. The contribution of the current work lies in the further development of the agent autonomy concept, particularly the refined agent classification within the scope of PoF-based system reliability modeling, new approaches to the learning and autonomy properties of the intelligent agents, and modeling of interacting failure mechanisms within the dynamic engineering system. The autonomous property of intelligent agents is defined as an agent's ability to self-activate, deactivate or completely redefine its role in the analysis. This property of agents, together with the ability to model interacting failure mechanisms of the system elements, makes agent autonomy fundamentally different from all existing methods of probabilistic PoF-based reliability modeling. 1. Azarkhail, M., "Agent Autonomy Approach to Physics-Based Reliability Modeling of Structures and Mechanical Systems", PhD thesis, University of Maryland, College Park, 2007.

  3. Probabilistically constraining proxy age-depth models within a Bayesian hierarchical reconstruction model

    NASA Astrophysics Data System (ADS)

    Werner, Johannes; Tingley, Martin

    2015-04-01

    Reconstructions of late-Holocene climate rely heavily upon proxies that are assumed to be accurately dated by layer counting, such as measurement on tree rings, ice cores, and varved lake sediments. Considerable advances may be achievable if time uncertain proxies could be included within these multiproxy reconstructions, and if time uncertainties were recognized and correctly modeled for proxies commonly treated as free of age model errors. Current approaches to accounting for time uncertainty are generally limited to repeating the reconstruction using each of an ensemble of age models, thereby inflating the final estimated uncertainty - in effect, each possible age model is given equal weighting. Uncertainties can be reduced by exploiting the inferred space-time covariance structure of the climate to re-weight the possible age models. Here we demonstrate how Bayesian Hierarchical climate reconstruction models can be augmented to account for time uncertain proxies. Critically, while a priori all age models are given equal probability of being correct, the probabilities associated with the age models are formally updated within the Bayesian framework, thereby reducing uncertainties. Numerical experiments show that updating the age model probabilities decreases uncertainty in the climate reconstruction, as compared with the current de-facto standard of sampling over all age models, provided there is sufficient information from other data sources in the region of the time-uncertain proxy. This approach can readily be generalized to non-layer counted proxies, such as those derived from marine sediments. Werner and Tingley, Climate of the Past Discussions (2014)
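
    A toy sketch of the re-weighting idea: candidate age models (here, simple year shifts) start with equal prior probability and are re-weighted by how well the re-dated proxy matches a reference signal; the Gaussian likelihood stands in for the space-time covariance model of the full Bayesian hierarchy, and all data are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1000, 1500)

# Reference climate signal from well-dated proxies (synthetic)
reference = np.sin(2 * np.pi * years / 60.0) + 0.1 * rng.standard_normal(years.size)

# Time-uncertain proxy: the same signal shifted by an unknown number of years
true_shift = 7
proxy = np.interp(years + true_shift, years, reference) + 0.3 * rng.standard_normal(years.size)

def log_likelihood(shift, sigma=0.3):
    """Gaussian misfit between the re-dated proxy and the reference signal."""
    aligned = np.interp(years + shift, years, reference)
    return -0.5 * np.sum((proxy - aligned) ** 2) / sigma**2

candidate_shifts = np.arange(-20, 21)          # ensemble of candidate age models
log_w = np.array([log_likelihood(s) for s in candidate_shifts])
weights = np.exp(log_w - log_w.max())
weights /= weights.sum()                       # equal a priori, updated by the data
print("posterior-weighted best shift (years):", candidate_shifts[np.argmax(weights)])
```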

  4. Technical Note: Probabilistically constraining proxy age-depth models within a Bayesian hierarchical reconstruction model

    NASA Astrophysics Data System (ADS)

    Werner, J. P.; Tingley, M. P.

    2015-03-01

    Reconstructions of the late-Holocene climate rely heavily upon proxies that are assumed to be accurately dated by layer counting, such as measurements of tree rings, ice cores, and varved lake sediments. Considerable advances could be achieved if time-uncertain proxies were able to be included within these multiproxy reconstructions, and if time uncertainties were recognized and correctly modeled for proxies commonly treated as free of age model errors. Current approaches for accounting for time uncertainty are generally limited to repeating the reconstruction using each one of an ensemble of age models, thereby inflating the final estimated uncertainty - in effect, each possible age model is given equal weighting. Uncertainties can be reduced by exploiting the inferred space-time covariance structure of the climate to re-weight the possible age models. Here, we demonstrate how Bayesian hierarchical climate reconstruction models can be augmented to account for time-uncertain proxies. Critically, although a priori all age models are given equal probability of being correct, the probabilities associated with the age models are formally updated within the Bayesian framework, thereby reducing uncertainties. Numerical experiments show that updating the age model probabilities decreases uncertainty in the resulting reconstructions, as compared with the current de facto standard of sampling over all age models, provided there is sufficient information from other data sources in the spatial region of the time-uncertain proxy. This approach can readily be generalized to non-layer-counted proxies, such as those derived from marine sediments.

  5. Technical Note: Probabilistically constraining proxy age-depth models within a Bayesian hierarchical reconstruction model

    NASA Astrophysics Data System (ADS)

    Werner, J. P.; Tingley, M. P.

    2014-12-01

    Reconstructions of late-Holocene climate rely heavily upon proxies that are assumed to be accurately dated by layer counting, such as measurement on tree rings, ice cores, and varved lake sediments. Considerable advances may be achievable if time uncertain proxies could be included within these multiproxy reconstructions, and if time uncertainties were recognized and correctly modeled for proxies commonly treated as free of age model errors. Current approaches to accounting for time uncertainty are generally limited to repeating the reconstruction using each of an ensemble of age models, thereby inflating the final estimated uncertainty - in effect, each possible age model is given equal weighting. Uncertainties can be reduced by exploiting the inferred space-time covariance structure of the climate to re-weight the possible age models. Here we demonstrate how Bayesian Hierarchical climate reconstruction models can be augmented to account for time uncertain proxies. Critically, while a priori all age models are given equal probability of being correct, the probabilities associated with the age models are formally updated within the Bayesian framework, thereby reducing uncertainties. Numerical experiments show that updating the age-model probabilities decreases uncertainty in the climate reconstruction, as compared with the current de-facto standard of sampling over all age models, provided there is sufficient information from other data sources in the region of the time-uncertain proxy. This approach can readily be generalized to non-layer counted proxies, such as those derived from marine sediments.

  6. Crevice corrosion & pitting of high-level waste containers: the integration of deterministic & probabilistic models (II)

    SciTech Connect

    Farmer, J.C.

    1997-10-01

    An integrated predictive model is being developed to account for the effects of localized environmental conditions in crevices on the initiation and propagation of pits. A deterministic calculation is used to estimate the accumulation of hydrogen ions (pH suppression) in the crevice solution due to the hydrolysis of dissolved metals. Pit initiation and growth within the crevice are then dealt with by either a probabilistic model, or an equivalent deterministic model. Ultimately, the role of intergranular corrosion will have to be considered. While the strategy presented here is very promising, the integrated model is not yet ready for precise quantitative predictions. Empirical expressions for the rate of penetration based upon experimental crevice corrosion data can be used in the interim period, until the integrated model can be refined. Bounding calculations based upon such empirical expressions can provide important insight into worst-case scenarios.

  7. Verification and optimal control of context-sensitive probabilistic Boolean networks using model checking and polynomial optimization.

    PubMed

    Kobayashi, Koichi; Hiraishi, Kunihiko

    2014-01-01

    One of the significant topics in systems biology is to develop control theory of gene regulatory networks (GRNs). In typical control of GRNs, expression of some genes is inhibited (activated) by manipulating external stimuli and expression of other genes. It is expected to apply control theory of GRNs to gene therapy technologies in the future. In this paper, a control method using a Boolean network (BN) is studied. A BN is widely used as a model of GRNs, and gene expression is expressed by a binary value (ON or OFF). In particular, a context-sensitive probabilistic Boolean network (CS-PBN), which is one of the extended models of BNs, is used. For CS-PBNs, the verification problem and the optimal control problem are considered. For the verification problem, a solution method using the probabilistic model checker PRISM is proposed. For the optimal control problem, a solution method using polynomial optimization is proposed. Finally, a numerical example on the WNT5A network, which is related to melanoma, is presented. The proposed methods provide us with useful tools in the control theory of GRNs.
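
    A minimal sketch of one synchronous update of a small probabilistic Boolean network, in which each gene draws one of its candidate predictor functions according to selection probabilities; the three-gene network and probabilities below are made up for illustration and are not the WNT5A network, nor do they include the context-switching of a CS-PBN.

```python
import random

# Candidate Boolean predictors per gene, each with a selection probability.
# A predictor maps the current network state (tuple of 0/1 values) to the
# gene's next value.
predictors = {
    "g0": [(0.7, lambda s: s[1] and s[2]), (0.3, lambda s: s[1])],
    "g1": [(1.0, lambda s: int(not s[0]))],
    "g2": [(0.6, lambda s: s[0] or s[1]), (0.4, lambda s: s[2])],
}
genes = list(predictors)

def step(state):
    """One synchronous PBN update: every gene samples a predictor and applies it."""
    next_state = []
    for g in genes:
        probs, funcs = zip(*predictors[g])
        f = random.choices(funcs, weights=probs)[0]
        next_state.append(int(f(state)))
    return tuple(next_state)

random.seed(3)
state = (1, 0, 1)
for _ in range(5):
    state = step(state)
    print(state)
```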

  8. Verification and Optimal Control of Context-Sensitive Probabilistic Boolean Networks Using Model Checking and Polynomial Optimization

    PubMed Central

    Hiraishi, Kunihiko

    2014-01-01

    One of the significant topics in systems biology is to develop control theory of gene regulatory networks (GRNs). In typical control of GRNs, expression of some genes is inhibited (activated) by manipulating external stimuli and expression of other genes. It is expected to apply control theory of GRNs to gene therapy technologies in the future. In this paper, a control method using a Boolean network (BN) is studied. A BN is widely used as a model of GRNs, and gene expression is expressed by a binary value (ON or OFF). In particular, a context-sensitive probabilistic Boolean network (CS-PBN), which is one of the extended models of BNs, is used. For CS-PBNs, the verification problem and the optimal control problem are considered. For the verification problem, a solution method using the probabilistic model checker PRISM is proposed. For the optimal control problem, a solution method using polynomial optimization is proposed. Finally, a numerical example on the WNT5A network, which is related to melanoma, is presented. The proposed methods provide us with useful tools in the control theory of GRNs. PMID:24587766

  9. Probabilistic modelling to assess exposure to three artificial sweeteners of young Irish patients aged 1-3 years with PKU and CMPA.

    PubMed

    O'Sullivan, Aaron J; Pigat, Sandrine; O'Mahony, Cian; Gibney, Michael J; McKevitt, Aideen I

    2016-11-01

    The choice of suitable normal foods is limited for individuals with particular medical conditions, e.g., inborn errors of metabolism (phenylketonuria - PKU) or severe cow's milk protein allergy (CMPA). Patients may have dietary restrictions and exclusive or partial replacement of specific food groups with specially formulated products to meet particular nutrition requirements. Artificial sweeteners are used to improve the appearance and palatability of such food products to avoid food refusal and ensure dietary adherence. Young children have a higher risk of exceeding acceptable daily intakes for additives than adults due to higher food intakes per kg body weight. The Budget Method and EFSA's Food Additives Intake Model (FAIM) are not equipped to assess partial dietary replacement with special formulations as they are built on data from dietary surveys of consumers without special medical requirements impacting the diet. The aim of this study was to explore dietary exposure modelling as a means of estimating the intake of artificial sweeteners by young PKU and CMPA patients aged 1-3 years. An adapted validated probabilistic model (FACET) was used to assess patients' exposure to artificial sweeteners. Food consumption data were derived from the food consumption survey data of healthy young children in Ireland from the National Preschool and Nutrition Survey (NPNS, 2010-11). Specially formulated foods for special medical purposes were included in the exposure model to replace restricted foods. Inclusion was based on recommendations for adequate protein intake and dietary adherence data. Exposure assessment results indicated that young children with PKU and CMPA have higher relative average intakes of artificial sweeteners than healthy young children. The reliability and robustness of the model in the estimation of patient additive exposures was further investigated and provides the first exposure estimates for these special populations.
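
    A toy sketch of the kind of probabilistic exposure calculation described above: daily additive intake per kg body weight is simulated from consumption and body-weight distributions and compared with an acceptable daily intake (ADI); all distributions, the concentration, and the ADI value are hypothetical, not FACET inputs or results.

```python
import numpy as np

rng = np.random.default_rng(2)
n_children = 10_000

# Hypothetical inputs for children aged 1-3 years
body_weight = rng.normal(13.0, 1.8, size=n_children).clip(8, 20)      # kg
product_intake = rng.lognormal(np.log(120.0), 0.4, size=n_children)   # g/day of formula
concentration = 0.6                                                   # mg sweetener per g
adi = 4.0                                                             # mg/kg bw/day

exposure = product_intake * concentration / body_weight               # mg/kg bw/day
print("mean exposure (mg/kg bw/day):", round(float(exposure.mean()), 2))
print("P(exposure > ADI):", float((exposure > adi).mean()))
```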

  10. Model choice for phylogeographic inference using a large set of models.

    PubMed

    Pelletier, Tara A; Carstens, Bryan C

    2014-06-01

    Model-based analyses are common in phylogeographic inference because they parameterize processes such as population division, gene flow and expansion that are of interest to biologists. Approximate Bayesian computation is a model-based approach that can be customized to any empirical system and used to calculate the relative posterior probability of several models, provided that suitable models can be identified for comparison. The question of how to identify suitable models is explored using data from Plethodon idahoensis, a salamander that inhabits the North American inland northwest temperate rainforest. First, we conduct an ABC analysis using five models suggested by previous research, calculate the relative posterior probabilities and find that a simple model of population isolation has the best fit to the data (PP=0.70). In contrast to this subjective choice of models to include in the analysis, we also specify models in a more objective manner by simulating prior distributions for 143 models that included panmixia, population isolation, change in effective population size, migration and range expansion. We then identify a smaller subset of models for comparison by generating an expectation of the highest posterior probability that a false model is likely to achieve due to chance and calculate the relative posterior probabilities of only those models that exceed this expected level. A model that parameterized divergence with population expansion and gene flow in one direction offered the best fit to the P. idahoensis data (in contrast to an isolation-only model from the first analysis). Our investigation demonstrates that the determination of which models to include in ABC model choice experiments is a vital component of model-based phylogeographic analysis.
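
    A minimal sketch of ABC model choice by rejection: simulate summary statistics from each model's prior, accept simulations close to the observed statistic, and take acceptance fractions as relative posterior probabilities (equal prior model probabilities are implicit). The two toy models and the summary statistic are stand-ins, not the phylogeographic models compared in the paper.

```python
import numpy as np

rng = np.random.default_rng(42)
observed_stat = 2.0           # observed summary statistic (illustrative)
n_sims, eps = 100_000, 0.05   # simulations per model, acceptance tolerance

def simulate_model_a(n):
    """Stand-in 'isolation-only' model: statistic ~ Normal(mu, 1), mu ~ U(0, 5)."""
    return rng.normal(rng.uniform(0, 5, size=n), 1.0)

def simulate_model_b(n):
    """Stand-in 'isolation + expansion' model: statistic ~ Normal(mu, 2), mu ~ U(0, 5)."""
    return rng.normal(rng.uniform(0, 5, size=n), 2.0)

accepted = {
    "A": np.abs(simulate_model_a(n_sims) - observed_stat) < eps,
    "B": np.abs(simulate_model_b(n_sims) - observed_stat) < eps,
}
total = sum(a.sum() for a in accepted.values())
for name, acc in accepted.items():
    print(f"relative posterior probability of model {name}: {acc.sum() / total:.2f}")
```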

  11. Challenges associated with estimating the cost of European flooding through the development of a multi-country probabilistic model

    NASA Astrophysics Data System (ADS)

    Haseldine, Lucy

    2013-04-01

    Assessing the potential costs of large-scale flooding within the insurance and reinsurance industry can be achieved using probabilistic catastrophe models that combine hazard map outputs from flood models with exposure information. Many detailed flood modelling methodologies are available, including both advanced hydrological approaches and detailed 2D hydraulic models. However, these approaches are typically developed and perfected for a relatively limited test area (e.g. a single catchment or region), enabling efficient calibration to be carried out. With single flood events crossing country borders, multiple concurrent floods occurring across catchments, and an increasing need for national and international scale risk and financial assessment, up-scaling these localised methodologies is essential. The implementation of such techniques at national and international level poses a series of challenges to the model developer. Here, we discuss the challenges associated with the development of a multi-country probabilistic model designed to enable assessment of insurance exposures to river flooding in 12 countries across Europe on a return period basis. The model incorporates several components primarily developed for use in more limited areas, for example the 2D hydraulic modelling software JFlow+. Some of the challenges and their solutions that we will discuss include: • Availability of different volumes, record lengths and qualities of gauge and digital terrain data between countries; • Differing resolution and quality of property exposure information; • The need for a significant amount of manual editing work across a very wide area; • Different information available for validation in different regions; • Lengthy data and model analysis times; • The requirement for extremely fast computer processors.

  12. Developing an event-tree probabilistic tsunami inundation model for NE Atlantic coasts: Application to case studies

    NASA Astrophysics Data System (ADS)

    Omira, Rachid; Baptista, Maria Ana; Matias, Luis

    2015-04-01

    This study constitutes the first assessment of probabilistic tsunami inundation in the NE Atlantic region, using an event-tree approach. It aims to develop a probabilistic tsunami inundation approach for the NE Atlantic coast with an application to two test sites of the ASTARTE project, Tangier (Morocco) and Sines (Portugal). Only tsunamis of tectonic origin are considered here, taking into account near-, regional- and far-field sources. The multidisciplinary approach proposed here consists of an event-tree method that gathers seismic hazard assessment, tsunami numerical modelling, and statistical methods. It also presents a treatment of uncertainties related to source location and tidal stage in order to derive the likelihood of tsunami flood occurrence and exceedance of a specific near-shore wave height during a given return period. We derive high-resolution probabilistic maximum wave heights and flood distributions for both test sites, Tangier and Sines, considering 100-, 500-, and 1000-year return periods. We find that the probability that a maximum wave height exceeds 1 m somewhere along the Sines coast reaches about 55% for the 100-year return period, and is up to 100% for the 1000-year return period. Along the Tangier coast, the probability of inundation occurrence (flow depth > 0 m) is up to 45% for the 100-year return period and reaches 96% at some near-shore coastal locations for the 500-year return period. Acknowledgements: This work is funded by project ASTARTE - Assessment, STrategy And Risk Reduction for Tsunamis in Europe, Grant 603839, 7th FP (ENV.2013.6.4-3).
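
    In event-tree assessments of this kind, the probability of at least one exceedance within a return period or exposure window is commonly obtained from an annual exceedance rate under a Poisson assumption; the sketch below shows that conversion with placeholder rates, not the rates derived for Tangier or Sines.

```python
import math

def prob_at_least_one(annual_rate, exposure_years):
    """P(at least one exceedance in the window), assuming a Poisson process."""
    return 1.0 - math.exp(-annual_rate * exposure_years)

# Placeholder annual rates of 'maximum wave height > 1 m' at a coastal point
for rate in (0.008, 0.002):
    for t in (100, 500, 1000):
        print(f"rate = {rate}/yr, T = {t} yr: P = {prob_at_least_one(rate, t):.2f}")
```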

  13. Multi-model ensemble-based probabilistic prediction of tropical cyclogenesis using TIGGE model forecasts

    NASA Astrophysics Data System (ADS)

    Jaiswal, Neeru; Kishtawal, C. M.; Bhomia, Swati; Pal, P. K.

    2016-10-01

    An extended-range tropical cyclogenesis forecast model has been developed using the forecasts of global models available from the TIGGE portal. A scheme has been developed to detect the signatures of cyclogenesis in the global model forecast fields [i.e., the mean sea level pressure and surface winds (10 m horizontal winds)]. For this, a wind matching index was determined between the synthetic cyclonic wind fields and the forecast wind fields. Thresholds of 0.4 for the wind matching index and 1005 hPa for pressure were determined to detect the cyclonic systems. The detected cyclonic systems in the study region are classified into different cyclone categories based on their intensity (maximum wind speed). The forecasts of up to 15 days from three global models, viz., ECMWF, NCEP and UKMO, have been used to predict cyclogenesis based on a multi-model ensemble approach. The occurrence of cyclonic events of different categories in all the forecast steps in the gridded region (10 × 10 km²) was used to estimate the probability of the formation of cyclogenesis. The probability of cyclogenesis was estimated by computing the grid score using the wind matching index for each model and at each forecast step and convolving it with a Gaussian filter. The proposed method is used to predict the cyclogenesis of five named tropical cyclones formed during the year 2013 in the north Indian Ocean. The cyclogenesis of these systems was predicted 6-8 days in advance using the above approach. The mean lead prediction time of the proposed model for cyclogenesis events was found to be 7 days.
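
    A minimal sketch of the detection and probability-mapping steps described above: grid cells are flagged where the wind-matching index exceeds 0.4 and mean sea level pressure falls below 1005 hPa, flags are accumulated over models and forecast steps into a grid score, and the score is smoothed with a Gaussian filter. The random fields, grid size, and smoothing width are illustrative only.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(0)
n_models, n_steps, ny, nx = 3, 15, 60, 80          # illustrative ensemble and grid

# Placeholder forecast fields: wind-matching index in [0, 1] and MSLP in hPa
match_index = rng.uniform(0.0, 1.0, size=(n_models, n_steps, ny, nx))
mslp = rng.normal(1010.0, 4.0, size=(n_models, n_steps, ny, nx))

# Detection rule from the abstract: index > 0.4 and pressure < 1005 hPa
detections = (match_index > 0.4) & (mslp < 1005.0)

# Grid score: fraction of (model, forecast step) pairs flagging each cell,
# convolved with a Gaussian filter to give a smooth, probability-like map
grid_score = detections.mean(axis=(0, 1))
genesis_map = gaussian_filter(grid_score, sigma=2.0)
print("peak cyclogenesis score:", round(float(genesis_map.max()), 3))
```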

  14. Improving Global Forecast System of extreme precipitation events with regional statistical model: Application of quantile-based probabilistic forecasts

    NASA Astrophysics Data System (ADS)

    Shastri, Hiteshri; Ghosh, Subimal; Karmakar, Subhankar

    2017-02-01

    Forecasting of extreme precipitation events at a regional scale is of high importance due to their severe impacts on society. The impacts are stronger in urban regions due to high flood potential as well as high population density, leading to high vulnerability. Although significant scientific improvements have taken place in global models for weather forecasting, they are still not adequate at a regional scale (e.g., for an urban region), with high false alarms and low detection. There has been a need to improve the weather forecast skill at a local scale with probabilistic outcomes. Here we develop a methodology with quantile regression, where reliably simulated variables from the Global Forecast System are used as predictors and different quantiles of rainfall are generated corresponding to that set of predictors. We apply this method to Mumbai, a flood-prone coastal city of India that has experienced severe floods in recent years. We find significant improvements in the forecast with high detection and skill scores. We apply the methodology to 10 ensemble members of the Global Ensemble Forecast System and find a reduction in ensemble uncertainty of precipitation across realizations with respect to that of the original precipitation forecasts. We validate our model for the monsoon seasons of 2006 and 2007, which are independent of the training/calibration data set used in the study. We find promising results and emphasize the implementation of such data-driven methods for a better probabilistic forecast at an urban scale, primarily for early flood warning.
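
    A minimal sketch of the quantile-regression step: linear coefficients are fitted for each target quantile by minimising the pinball (check) loss, here with a plain subgradient descent on synthetic data. In practice one would use established statistical software, and the predictors below merely stand in for GFS-derived variables.

```python
import numpy as np

def fit_quantile_regression(X, y, q, lr=0.01, n_iter=5000):
    """Linear quantile regression via subgradient descent on the pinball loss
    L = mean(max(q * r, (q - 1) * r)) with r = y - X @ w (intercept included)."""
    Xb = np.column_stack([np.ones(len(X)), X])
    w = np.zeros(Xb.shape[1])
    for _ in range(n_iter):
        resid = y - Xb @ w
        grad = -Xb.T @ np.where(resid > 0, q, q - 1.0) / len(y)
        w -= lr * grad
    return w

rng = np.random.default_rng(1)
predictors = rng.normal(size=(500, 3))                 # stand-ins for GFS variables
rain = 5 + predictors @ np.array([2.0, 1.0, 0.5]) + rng.gumbel(0.0, 2.0, size=500)

for q in (0.5, 0.9, 0.99):
    coeffs = fit_quantile_regression(predictors, rain, q)
    print(f"q = {q:.2f}: intercept and coefficients =", np.round(coeffs, 2))
```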

  15. Development of a probabilistic ocean modelling system based on NEMO 3.5: application at eddying resolution

    NASA Astrophysics Data System (ADS)

    Bessières, Laurent; Leroux, Stéphanie; Brankart, Jean-Michel; Molines, Jean-Marc; Moine, Marie-Pierre; Bouttier, Pierre-Antoine; Penduff, Thierry; Terray, Laurent; Barnier, Bernard; Sérazin, Guillaume

    2017-03-01

    This paper presents the technical implementation of a new, probabilistic version of the NEMO ocean-sea-ice modelling system. Ensemble simulations with N members running simultaneously within a single executable, and interacting mutually if needed, are made possible through an enhanced message-passing interface (MPI) strategy including a double parallelization in the spatial and ensemble dimensions. An example application is then given to illustrate the implementation, performances, and potential use of this novel probabilistic modelling tool. A large ensemble of 50 global ocean-sea-ice hindcasts has been performed over the period 1960-2015 at eddy-permitting resolution (1/4°) for the OCCIPUT (oceanic chaos - impacts, structure, predictability) project. This application aims to simultaneously simulate the intrinsic/chaotic and the atmospherically forced contributions to the ocean variability, from mesoscale turbulence to interannual-to-multidecadal timescales. Such an ensemble indeed provides a unique way to disentangle and study both contributions, as the forced variability may be estimated through the ensemble mean, and the intrinsic chaotic variability may be estimated through the ensemble spread.
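
    A toy sketch of the decomposition mentioned at the end of the abstract: with an N-member ensemble sharing the same atmospheric forcing, the ensemble mean estimates the forced variability and the departures from it estimate the intrinsic (chaotic) variability. The synthetic index below merely illustrates the arithmetic; it is not OCCIPUT output.

```python
import numpy as np

rng = np.random.default_rng(7)
n_members, n_years = 50, 56                      # mirroring the 50-member, 1960-2015 setup

# Synthetic regional index: one forced signal shared by all members plus
# member-specific intrinsic (chaotic) variability
forced_signal = 0.3 * np.sin(2 * np.pi * np.arange(n_years) / 11.0)
ensemble = forced_signal + 0.2 * rng.standard_normal((n_members, n_years))

forced_estimate = ensemble.mean(axis=0)          # atmospherically forced contribution
intrinsic_estimate = ensemble - forced_estimate  # intrinsic/chaotic contribution
print("forced variance   :", round(float(forced_estimate.var()), 3))
print("intrinsic variance:", round(float(intrinsic_estimate.var()), 3))
```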

  16. Developing an Event-Tree Probabilistic Tsunami Inundation Model for NE Atlantic Coasts: Application to a Case Study

    NASA Astrophysics Data System (ADS)

    Omira, R.; Matias, L.; Baptista, M. A.

    2016-12-01

    This study constitutes a preliminary assessment of probabilistic tsunami inundation in the NE Atlantic region. We developed an event-tree approach to calculate the likelihood of tsunami flood occurrence and exceedance of a specific near-shore wave height for a given exposure time. Only tsunamis of tectonic origin are considered here, taking into account local, regional, and far-field sources. The approach used here consists of an event-tree method that gathers probability models for seismic sources, tsunami numerical modeling, and statistical methods. It also includes a treatment of aleatoric uncertainties related to source location and tidal stage. Epistemic uncertainties are not addressed in this study. The methodology is applied to the coastal test-site of Sines located in the NE Atlantic coast of Portugal. We derive probabilistic high-resolution maximum wave amplitudes and flood distributions for the study test-site considering 100- and 500-year exposure times. We find that the probability that maximum wave amplitude exceeds 1 m somewhere along the Sines coasts reaches about 60 % for an exposure time of 100 years and is up to 97 % for an exposure time of 500 years. The probability of inundation occurrence (flow depth >0 m) varies between 10 % and 57 %, and from 20 % up to 95 % for 100- and 500-year exposure times, respectively. No validation has been performed here with historical tsunamis. This paper illustrates a methodology through a case study, which is not an operational assessment.

  17. Analysis of well test data - Application of probabilistic models to infer hydraulic properties of fractures. [Contains list of standardized terminology or nomenclature used in statistical models]

    SciTech Connect

    Osnes, J.D.; Winberg, A.; Andersson, J.E.; Larsson, N.A.

    1991-09-27

    Statistical and probabilistic methods for estimating the probability that a fracture is nonconductive (or equivalently, the conductive-fracture frequency) and the distribution of the transmissivities of conductive fractures from transmissivity measurements made in single-hole injection (well) tests were developed. These methods were applied to a database consisting of over 1,000 measurements made in nearly 25 km of borehole at five sites in Sweden. The depths of the measurements ranged from near the surface to over 600 m deep, and packer spacings of 20 and 25 m were used. A probabilistic model that describes the distribution of a series of transmissivity measurements was derived. When the parameters of this model were estimated using maximum likelihood estimators, the resulting estimated distributions generally fit the cumulative histograms of the transmissivity measurements very well. Further, estimates of the mean transmissivity of conductive fractures based on the maximum likelihood estimates of the model's parameters were reasonable, both in magnitude and in trend, with respect to depth. The estimates of the conductive fracture probability were generated in the range of 0.5-5.0 percent, with the higher values at shallow depths and with increasingly smaller values as depth increased. An estimation procedure based on the probabilistic model and the maximum likelihood estimators of its parameters was recommended. Some guidelines regarding the design of injection test programs were drawn from the recommended estimation procedure and the parameter estimates based on the Swedish data. 24 refs., 12 figs., 14 tabs.
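
    A toy sketch of the kind of estimation described: interval transmissivities are modelled as a mixture of a "nonconductive" point mass and a lognormal distribution for conductive fractures, with values below a detection limit treated as censored, and the parameters are estimated by maximum likelihood. The synthetic data, detection limit, and one-fracture-per-interval simplification are illustrative and do not reproduce the analysis of the Swedish database.

```python
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(4)

# Synthetic injection-test data: a fraction of intervals is nonconductive,
# the rest have lognormally distributed transmissivity (m^2/s)
true_p_nc, mu, sigma, det_limit, n = 0.6, np.log(1e-8), 1.2, 1e-9, 1000
conductive = rng.uniform(size=n) > true_p_nc
T = np.where(conductive, rng.lognormal(mu, sigma, size=n), 0.0)
measured = T[T > det_limit]        # observed transmissivities
n_censored = n - measured.size     # nonconductive or below the detection limit

def neg_log_lik(params):
    """Mixture likelihood: nonconductive with probability p_nc, otherwise
    lognormal(m, s); measurements below det_limit are treated as censored."""
    p_nc, m, s = params
    p_below = p_nc + (1 - p_nc) * stats.lognorm.cdf(det_limit, s=s, scale=np.exp(m))
    ll = n_censored * np.log(p_below)
    ll += np.sum(np.log((1 - p_nc) * stats.lognorm.pdf(measured, s=s, scale=np.exp(m))))
    return -ll

res = optimize.minimize(neg_log_lik, x0=[0.5, np.log(1e-7), 1.0],
                        bounds=[(0.01, 0.99), (None, None), (0.1, 5.0)])
print("estimated P(nonconductive), mu, sigma:", np.round(res.x, 2))
```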

  18. Choices and changes: Eccles' Expectancy-Value model and upper-secondary school students' longitudinal reflections about their choice of a STEM education

    NASA Astrophysics Data System (ADS)

    Lykkegaard, Eva; Ulriksen, Lars

    2016-03-01

    During the past 30 years, Eccles' comprehensive social-psychological Expectancy-Value Model of Motivated Behavioural Choices (EV-MBC model) has been proven suitable for studying educational choices related to Science, Technology, Engineering and/or Mathematics (STEM). The reflections of 15 students in their last year in upper-secondary school concerning their choice of tertiary education were examined using quantitative EV-MBC surveys and repeated qualitative interviews. This article presents the analyses of three cases in detail. The analytical focus was whether the factors indicated in the EV-MBC model could be used to detect significant changes in the students' educational choice processes. An important finding was that the quantitative EV-MBC surveys and the qualitative interviews gave quite different results concerning the students' considerations about the choice of tertiary education, and that significant changes in the students' reflections were not captured by the factors of the EV-MBC model. This questions the validity of the EV-MBC surveys. Moreover, the quantitative factors from the EV-MBC model did not sufficiently explain students' dynamical educational choice processes where students in parallel considered several different potential educational trajectories. We therefore call for further studies of the EV-MBC model's use in describing longitudinal choice processes and especially in investigating significant changes.

  19. Geothermal probabilistic cost study

    NASA Astrophysics Data System (ADS)

    Orren, L. H.; Ziman, G. M.; Jones, S. C.; Lee, T. K.; Noll, R.; Wilde, L.; Sadanand, V.

    1981-08-01

    A tool is presented to quantify the risks of geothermal projects, the Geothermal Probabilistic Cost Model (GPCM). The GPCM model was used to evaluate a geothermal reservoir for a binary-cycle electric plant at Heber, California. Three institutional aspects of the geothermal risk which can shift the risk among different agents were analyzed. The leasing of geothermal land, contracting between the producer and the user of the geothermal heat, and insurance against faulty performance were examined.

  20. Geothermal probabilistic cost study

    NASA Technical Reports Server (NTRS)

    Orren, L. H.; Ziman, G. M.; Jones, S. C.; Lee, T. K.; Noll, R.; Wilde, L.; Sadanand, V.

    1981-01-01

    A tool is presented to quantify the risks of geothermal projects, the Geothermal Probabilistic Cost Model (GPCM). The GPCM model was used to evaluate a geothermal reservoir for a binary-cycle electric plant at Heber, California. Three institutional aspects of the geothermal risk which can shift the risk among different agents were analyzed. The leasing of geothermal land, contracting between the producer and the user of the geothermal heat, and insurance against faulty performance were examined.

  1. On the relationship between deterministic and probabilistic directed Graphical models: from Bayesian networks to recursive neural networks.

    PubMed

    Baldi, Pierre; Rosen-Zvi, Michal

    2005-10-01

    Machine learning methods that can handle variable-size structured data such as sequences and graphs include Bayesian networks (BNs) and Recursive Neural Networks (RNNs). In both classes of models, the data is modeled using a set of observed and hidden variables associated with the nodes of a directed acyclic graph. In BNs, the conditional relationships between parent and child variables are probabilistic, whereas in RNNs they are deterministic and parameterized by neural networks. Here, we study the formal relationship between both classes of models and show that when the source node variables are observed, RNNs can be viewed as limits, both in distribution and probability, of BNs with local conditional distributions that have vanishing covariance matrices and converge to delta functions. Conditions for uniform convergence are also given together with an analysis of the behavior and exactness of Belief Propagation (BP) in 'deterministic' BNs. Implications for the design of mixed architectures and the corresponding inference algorithms are briefly discussed.
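
    A toy sketch of the limiting relationship described above: a BN-style conditional distribution for a child node, Normal(f(parents), sigma^2) with f a small neural mapping, collapses to the deterministic RNN-style propagation child = f(parents) as sigma shrinks to zero. The mapping and dimensions are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_child(parents, W, b, sigma):
    """BN-style conditional: child ~ Normal(f(parents), sigma^2), with
    f(parents) = tanh(parents @ W + b) playing the role of the neural mapping."""
    mean = np.tanh(parents @ W + b)
    return mean + sigma * rng.standard_normal(mean.shape)

parents = rng.normal(size=(5, 3))
W, b = rng.normal(size=(3, 2)), np.zeros(2)
deterministic = np.tanh(parents @ W + b)

for sigma in (1.0, 0.1, 0.0):
    sample = sample_child(parents, W, b, sigma)
    gap = np.abs(sample - deterministic).max()
    print(f"sigma = {sigma}: max |sample - f(parents)| = {gap:.3f}")
```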

  2. Probabilistic brains: knowns and unknowns

    PubMed Central

    Pouget, Alexandre; Beck, Jeffrey M; Ma, Wei Ji; Latham, Peter E

    2015-01-01

    There is strong behavioral and physiological evidence that the brain both represents probability distributions and performs probabilistic inference. Computational neuroscientists have started to shed light on how these probabilistic representations and computations might be implemented in neural circuits. One particularly appealing aspect of these theories is their generality: they can be used to model a wide range of tasks, from sensory processing to high-level cognition. To date, however, these theories have only been applied to very simple tasks. Here we discuss the challenges that will emerge as researchers start focusing their efforts on real-life computations, with a focus on probabilistic learning, structural learning and approximate inference. PMID:23955561

  3. Probabilistic Design of Composite Structures

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2006-01-01

    A formal procedure for the probabilistic design evaluation of a composite structure is described. The uncertainties in all aspects of a composite structure (constituent material properties, fabrication variables, structural geometry, and service environments, etc.), which result in the uncertain behavior in the composite structural responses, are included in the evaluation. The probabilistic evaluation consists of: (1) design criteria, (2) modeling of composite structures and uncertainties, (3) simulation methods, and (4) the decision-making process. A sample case is presented to illustrate the formal procedure and to demonstrate that composite structural designs can be probabilistically evaluated with accuracy and efficiency.

  4. A conceptual model for determining career choice of CHROME alumna based on Farmer's conceptual models

    NASA Astrophysics Data System (ADS)

    Moore, Lisa Simmons

    This qualitative program evaluation examines the career decision-making processes and career choices of nine African American women who participated in the Cooperating Hampton Roads Organization for Minorities in Engineering (CHROME) and who graduated from urban, rural or suburban high schools in the year 2000. The CHROME program is a nonprofit, pre-college intervention program that encourages underrepresented minority and female students to enter science, technically related, engineering, and math (STEM) career fields. The study describes career choices and decisions made by each participant over a five-year period since high school graduation. Data was collected through an Annual Report, Post High School Questionnaires, Environmental Support Questionnaires, Career Choice Questionnaires, Senior Reports, and standardized open-ended interviews. Data was analyzed using a model based on Helen C. Farmer's Conceptual Models, John Ogbu's Caste Theory and Feminist Theory. The CHROME program, based on its stated goals and tenets, was also analyzed against study findings. Findings indicated that participants received very low levels of support from counselors and teachers to pursue STEM careers and high levels of support from parents and family, the CHROME program and financial backing. Findings of this study also indicated that the majority of CHROME alumna persisted in STEM careers. The most successful participants, in terms of undergraduate degree completion and occupational prestige, were the African American women who remained single, experienced no critical incidents, came from a middle class to upper middle class socioeconomic background, and did not have children.

  5. A combinatorial Bayesian and Dirichlet model for prostate MR image segmentation using probabilistic image features

    NASA Astrophysics Data System (ADS)

    Li, Ang; Li, Changyang; Wang, Xiuying; Eberl, Stefan; Feng, Dagan; Fulham, Michael

    2016-08-01

    Blurred boundaries and heterogeneous intensities make accurate prostate MR image segmentation problematic. To improve prostate MR image segmentation we suggest an approach that includes: (a) an image patch division method to partition the prostate into homogeneous segments for feature extraction; (b) an image feature formulation and classification method, using the relevance vector machine, to provide probabilistic prior knowledge for graph energy construction; (c) a graph energy formulation scheme with Bayesian priors and Dirichlet graph energy and (d) a non-iterative graph energy minimization scheme, based on matrix differentiation, to perform the probabilistic pixel membership optimization. The segmentation output was obtained by assigning pixels with foreground and background labels based on derived membership probabilities. We evaluated our approach on the PROMISE-12 dataset with 50 prostate MR image volumes. Our approach achieved a mean dice similarity coefficient (DSC) of 0.90  ±  0.02, which surpassed the five best prior-based methods in the PROMISE-12 segmentation challenge.

  6. Human risky choice under temporal constraints: tests of an energy-budget model.

    PubMed Central

    Pietras, Cynthia J; Locey, Matthew L; Hackenberg, Timothy D

    2003-01-01

    Risk-sensitive foraging models predict that choice between fixed and variable food delays should be influenced by an organism's energy budget. To investigate whether the predictions of these models could be extended to choice in humans, risk sensitivity in 4 adults was investigated under laboratory conditions designed to model positive and negative energy budgets. Subjects chose between fixed and variable trial durations with the same mean value. An energy requirement was modeled by requiring that five trials be completed within a limited time period for points delivered at the end of the period (block of trials) to be exchanged later for money. Manipulating the duration of this time period generated positive and negative earnings budgets (or, alternatively, "time budgets"). Choices were consistent with the predictions of energy-budget models: The fixed-delay option was strongly preferred under positive earnings-budget conditions and the variable-delay option was strongly preferred under negative earnings-budget conditions. Within-block (or trial-by-trial) choices were also frequently consistent with the predictions of a dynamic optimization model, indicating that choice was simultaneously sensitive to the temporal requirements, delays associated with fixed and variable choices on the upcoming trial, cumulative delays within the block of trials, and trial position within a block. PMID:13677609
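
    The earnings-budget logic can be stated as a simple decision rule: if finishing the remaining trials at the fixed delay still meets the deadline, the low-variance option suffices; if not, only the variable option offers any chance of completing the block in time. The simulation below illustrates that rule with invented delay values and budgets; it is not the study's procedure or parameters.

```python
import random

random.seed(1)

FIXED_DELAY = 10           # seconds per trial (assumed)
VARIABLE_DELAYS = [2, 18]  # same mean as the fixed option (assumed)

def risk_sensitive_choice(trials_left: int, time_left: float) -> str:
    """Prefer the fixed delay when it can still satisfy the budget; otherwise gamble."""
    return "fixed" if trials_left * FIXED_DELAY <= time_left else "variable"

def simulate_block(time_budget: float, n_trials: int = 5) -> bool:
    """Return True if all trials finish within the budget (points exchangeable for money)."""
    elapsed = 0.0
    for t in range(n_trials):
        choice = risk_sensitive_choice(n_trials - t, time_budget - elapsed)
        elapsed += FIXED_DELAY if choice == "fixed" else random.choice(VARIABLE_DELAYS)
    return elapsed <= time_budget

# Positive versus negative budget conditions (assumed budgets):
for budget in (60, 40):
    wins = sum(simulate_block(budget) for _ in range(10_000))
    print(f"budget={budget}s  blocks completed in time: {wins / 10_000:.2%}")
```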

  7. Probabilistic load simulation: Code development status

    NASA Technical Reports Server (NTRS)

    Newell, J. F.; Ho, H.

    1991-01-01

    The objective of the Composite Load Spectra (CLS) project is to develop generic load models to simulate the composite load spectra that are induced in space propulsion system components. The probabilistic loads thus generated are part of the probabilistic design analysis (PDA) of a space propulsion system that also includes probabilistic structural analyses, reliability, and risk evaluations. Probabilistic load simulation for space propulsion systems demands sophisticated probabilistic methodology and requires large amounts of load information and engineering data. The CLS approach is to implement a knowledge-based system coupled with a probabilistic load simulation module. The knowledge base manages and furnishes load information and expertise and sets up the simulation runs. The load simulation module performs the numerical computation to generate the probabilistic loads with load information supplied from the CLS knowledge base.
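
    Stripped to its essentials, the load-simulation step samples component loads from distributions supplied by the knowledge base and combines them into a composite spectrum for downstream probabilistic design analysis. The dictionary below is a hypothetical stand-in for what such a knowledge base might furnish; the load names, distribution families, and parameters are invented.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical stand-in for load information a knowledge base might supply.
load_knowledge_base = {
    "thrust":            ("normal",    {"loc": 2090.0, "scale": 40.0}),
    "pump_torque":       ("lognormal", {"mean": np.log(35.0), "sigma": 0.05}),
    "thermal_transient": ("normal",    {"loc": 120.0, "scale": 18.0}),
}

def sample_load(name: str, size: int) -> np.ndarray:
    family, params = load_knowledge_base[name]
    return getattr(rng, family)(size=size, **params)

n = 50_000
composite = sum(sample_load(name, n) for name in load_knowledge_base)

# Percentiles of the composite load spectrum (median, 95th, 99.9th).
print(np.percentile(composite, [50, 95, 99.9]).round(1))
```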

  8. Probabilistic load simulation: Code development status

    NASA Astrophysics Data System (ADS)

    Newell, J. F.; Ho, H.

    1991-05-01

    The objective of the Composite Load Spectra (CLS) project is to develop generic load models to simulate the composite load spectra that are induced in space propulsion system components. The probabilistic loads thus generated are part of the probabilistic design analysis (PDA) of a space propulsion system that also includes probabilistic structural analyses, reliability, and risk evaluations. Probabilistic load simulation for space propulsion systems demands sophisticated probabilistic methodology and requires large amounts of load information and engineering data. The CLS approach is to implement a knowledge-based system coupled with a probabilistic load simulation module. The knowledge base manages and furnishes load information and expertise and sets up the simulation runs. The load simulation module performs the numerical computation to generate the probabilistic loads with load information supplied from the CLS knowledge base.

  9. Effect of reflectance model choice on earthshine-based terrestrial albedo determinations.

    NASA Astrophysics Data System (ADS)

    Thejll, Peter; Gleisner, Hans; Flynn, Chris

    2016-04-01

    Earthshine observations can be used to determine near-hemispheric average terrestrial albedos by careful observation of the relative strength of the earthshine-lit half of the Moon, coupled with correct modelling of the reflectances of Earth and Moon as well as lunar single-scattering albedo maps. Using our own observations of the earthshine, from Mauna Loa Observatory in 2011-12, we investigate the influence of the choice of bidirectional reflectance models for the Moon on derived terrestrial albedos. We find that the derived albedos depend considerably on this choice, and we discuss ways to determine the origin of the dependence - e.g. is it in the joint choices of lunar and terrestrial BRDFs, or is the choice of terrestrial BRDF less important than the lunar one? We report on the results of modelling lunar reflectance and albedo in six ways and terrestrial reflectance in two ways, assuming a uniform single-scattering albedo on Earth.
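
    Schematically, the derived terrestrial albedo scales as the measured earthshine-to-moonshine intensity ratio divided by model-dependent reflectance factors, which is where the choice of BRDF enters. The sketch below only illustrates how swapping the terrestrial phase law shifts the retrieved value; the Lambert-sphere phase function is one simple choice, and the intensity ratio, phase angle, and geometric constant are placeholders rather than the authors' actual photometric relation.

```python
import numpy as np

def lambert_phase(alpha: float) -> float:
    """Phase law of a Lambertian sphere at phase angle alpha (radians), one simple BRDF choice."""
    return (np.sin(alpha) + (np.pi - alpha) * np.cos(alpha)) / np.pi

def derived_albedo(intensity_ratio: float, earth_phase: float,
                   earth_phase_law=lambert_phase, geometry: float = 5.5e-3) -> float:
    """Schematic retrieval: albedo estimate = observed ratio / (modelled Earth reflectance factor
    * fixed geometric term). All constants here are placeholders."""
    return intensity_ratio / (earth_phase_law(earth_phase) * geometry)

ratio = 1.0e-3              # placeholder earthshine/moonshine intensity ratio
alpha = np.radians(60.0)    # placeholder Earth phase angle

a_lambert = derived_albedo(ratio, alpha)
a_flatter = derived_albedo(ratio, alpha, earth_phase_law=lambda a: 0.8 * lambert_phase(a))
print(f"Lambertian Earth phase law: albedo ~ {a_lambert:.2f}")
print(f"Flatter phase law (same data): albedo ~ {a_flatter:.2f}")
```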

  10. Probabilistic evaluation of the material properties of the in vivo subject-specific articular surface using a computational model.

    PubMed

    Kang, Kyoung-Tak; Kim, Sung-Hwan; Son, Juhyun; Lee, Young Han; Kim, Shinil; Chun, Heoung-Jae

    2016-04-15

    This article used probabilistic analysis to evaluate the material properties of an in vivo subject-specific tibiofemoral (TF) joint model. Sensitivity analysis, based on a Monte Carlo (MC) method, was performed using a subject-specific finite element (FE) model generated from in vivo computed tomography (CT) and magnetic resonance imaging (MRI) data, subjected to two different loading conditions. Specifically, the effects of inherent uncertainty in ligament stiffness, horn attachment stiffness, and articular surface material properties were assessed using multifactorial global sensitivity analysis. The MRI images were taken before and after axial compression, and with the subject-specific knee joint held at up to 90 degrees of flexion. The loading conditions of the probabilistic subject-specific FE model (axial compression and 90-degree flexion) were similar to the MRI acquisition setup. We were able to detect the influence of material parameters while accounting for the potential effect of parametric interactions. Throughout the in silico property optimization, a subject-specific FE model was used, and less sensitive parameters were eliminated with the global sensitivity method. Soft tissue material properties were estimated using an optimization procedure that minimized the differences between the kinematics predicted by the subject-specific model and those obtained from in vivo subject-specific data. The results suggest that the articular surface mechanical properties can be identified from in vivo measurements, which provides a valuable tool for future subject-specific studies related to TF joint scaffolds, allografts and biologics. © 2016 Wiley Periodicals, Inc. J Biomed Mater Res Part B: Appl Biomater, 2016.
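
    The global sensitivity step amounts to sampling the uncertain properties, running the model, and asking how much of the output variance each input explains. The sketch below replaces the finite element model with a stand-in response function and uses invented stiffness ranges; only the sampling-plus-variance-decomposition pattern is the point.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 20_000

# Assumed uncertain inputs (illustrative ranges, not the study's values).
params = {
    "ligament_stiffness": rng.uniform(100.0, 300.0, n),   # N/mm
    "horn_attachment":    rng.uniform(150.0, 450.0, n),   # N/mm
    "articular_modulus":  rng.uniform(5.0, 15.0, n),      # MPa
}

def surrogate_response(p):
    """Stand-in for the FE kinematic output as a nonlinear function of the inputs."""
    return (0.02 * p["articular_modulus"] ** 2
            + 0.001 * p["ligament_stiffness"]
            + 0.0002 * p["horn_attachment"] * p["articular_modulus"])

y = surrogate_response(params)

def first_order_index(x, y, bins=20):
    """Crude first-order sensitivity: variance of the binned conditional mean over total variance."""
    edges = np.quantile(x, np.linspace(0.0, 1.0, bins + 1))
    idx = np.clip(np.searchsorted(edges, x, side="right") - 1, 0, bins - 1)
    cond_means = np.array([y[idx == b].mean() for b in range(bins)])
    counts = np.bincount(idx, minlength=bins)
    return np.average((cond_means - y.mean()) ** 2, weights=counts) / y.var()

for name, x in params.items():
    print(f"{name:20s} first-order index ~ {first_order_index(x, y):.2f}")
```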

  11. Assessing patient safety risk before the injury occurs: an introduction to sociotechnical probabilistic risk modelling in health care

    PubMed Central

    Marx, D; Slonim, A

    2003-01-01

    Since 1 July 2001 the Joint Commission on Accreditation of Healthcare Organizations (JCAHO) has required each accredited hospital to conduct at least one proactive risk assessment annually. Failure modes and effects analysis (FMEA) was recommended as one tool for conducting this task. This paper examines the limitations of FMEA and introduces a second tool used by the aviation and nuclear industries to examine low-frequency, high-impact events in complex systems. The adapted tool, known as sociotechnical probabilistic risk assessment (ST-PRA), provides an alternative for proactively identifying, prioritizing, and mitigating patient safety risk. The uniqueness of ST-PRA is its ability to model combinations of equipment failures, human error, at-risk behavioral norms, and recovery opportunities through the use of fault trees. While ST-PRA is a complex, high-end risk modelling tool, it provides an opportunity to visualize system risk in a manner that is not possible through FMEA. PMID:14645893
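
    The fault-tree core of ST-PRA combines basic-event probabilities (equipment failure, human error, at-risk norms, missed recovery) through AND/OR gates, which is exactly what lets it model combinations of failures rather than single failure modes. The event names and probabilities below are invented for illustration only.

```python
# Fault-tree gates for independent basic events.
def and_gate(*p):   # all events must occur
    out = 1.0
    for x in p:
        out *= x
    return out

def or_gate(*p):    # at least one event occurs
    none = 1.0
    for x in p:
        none *= (1.0 - x)
    return 1.0 - none

# Invented basic-event probabilities for a medication-error scenario.
p_wrong_dose_entered = 0.01        # human error
p_pump_alarm_fails = 0.001         # equipment failure
p_alarm_routinely_silenced = 0.05  # at-risk behavioral norm
p_double_check_missed = 0.10       # recovery opportunity missed

# Harm requires the error AND loss of the alarm barrier AND a missed recovery.
p_alarm_barrier_lost = or_gate(p_pump_alarm_fails, p_alarm_routinely_silenced)
p_patient_harm = and_gate(p_wrong_dose_entered, p_alarm_barrier_lost, p_double_check_missed)
print(f"P(patient harm per order) ~ {p_patient_harm:.2e}")
```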

  12. Assessing patient safety risk before the injury occurs: an introduction to sociotechnical probabilistic risk modelling in health care.

    PubMed

    Marx, D A; Slonim, A D

    2003-12-01

    Since 1 July 2001 the Joint Commission on Accreditation of Healthcare Organizations (JCAHO) has required each accredited hospital to conduct at least one proactive risk assessment annually. Failure modes and effects analysis (FMEA) was recommended as one tool for conducting this task. This paper examines the limitations of FMEA and introduces a second tool used by the aviation and nuclear industries to examine low-frequency, high-impact events in complex systems. The adapted tool, known as sociotechnical probabilistic risk assessment (ST-PRA), provides an alternative for proactively identifying, prioritizing, and mitigating patient safety risk. The uniqueness of ST-PRA is its ability to model combinations of equipment failures, human error, at-risk behavioral norms, and recovery opportunities through the use of fault trees. While ST-PRA is a complex, high-end risk modelling tool, it provides an opportunity to visualize system risk in a manner that is not possible through FMEA.

  13. NasoNet, modeling the spread of nasopharyngeal cancer with networks of probabilistic events in discrete time.

    PubMed

    Galán, S F; Aguado, F; Díez, F J; Mira, J

    2002-07-01

    The spread of cancer is a non-deterministic dynamic process. As a consequence, the design of an assistant system for the diagnosis and prognosis of the extent of a cancer should be based on a representation method that deals with both uncertainty and time. The ultimate goal is to know the stage of development of a cancer in a patient before selecting the appropriate treatment. A network of probabilistic events in discrete time (NPEDT) is a type of Bayesian network for temporal reasoning that models the causal mechanisms associated with the time evolution of a process. This paper describes NasoNet, a system that applies NPEDTs to the diagnosis and prognosis of nasopharyngeal cancer. We have made use of temporal noisy gates to model the dynamic causal interactions that take place in the domain. The methodology we describe is general enough to be applied to any other type of cancer.
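
    The temporal noisy gate idea is compact enough to show directly: an effect occurs at time t if any active parent cause independently succeeds in producing it, with a probability that depends on the lag since that cause appeared. The causes, lags, and probabilities below are invented for illustration and are not NasoNet's actual parameters.

```python
# Invented temporal noisy-OR parameters: probability that an active cause produces
# the effect exactly `lag` time steps after the cause appears.
CAUSES = {
    "primary_tumour_extension":  {1: 0.20, 2: 0.15},
    "regional_node_involvement": {1: 0.10, 2: 0.10},
}

def effect_probability(active_since: dict, t: int) -> float:
    """Noisy-OR over causes: P(effect at t) = 1 - prod_c (1 - p_c[t - onset_c])."""
    p_none = 1.0
    for cause, onset in active_since.items():
        p_none *= 1.0 - CAUSES[cause].get(t - onset, 0.0)
    return 1.0 - p_none

# Both causes become active at t = 0; probability of further spread at t = 1 and t = 2.
active = {"primary_tumour_extension": 0, "regional_node_involvement": 0}
for t in (1, 2):
    print(f"t={t}: P(spread) = {effect_probability(active, t):.3f}")
```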

  14. A likelihood-based biostatistical model for analyzing consumer movement in simultaneous choice experiments.

    PubMed

    Zeilinger, Adam R; Olson, Dawn M; Andow, David A

    2014-08-01

    Consumer feeding preference among resource choices has critical implications for basic ecological and evolutionary processes, and can be highly relevant to applied problems such as ecological risk assessment and invasion biology. Within consumer choice experiments, also known as feeding preference or cafeteria experiments, measures of relative consumption and measures of consumer movement can provide distinct and complementary insights into the strength, causes, and consequences of preference. Despite the distinct value of inferring preference from measures of consumer movement, rigorous and biologically relevant analytical methods are lacking. We describe a simple, likelihood-based, biostatistical model for analyzing the transient dynamics of consumer movement in a paired-choice experiment. With experimental data consisting of repeated discrete measures of consumer location, the model can be used to estimate constant consumer attraction and leaving rates for two food choices, and differences in choice-specific attraction and leaving rates can be tested using model selection. The model enables calculation of transient and equilibrial probabilities of consumer-resource association, which could be incorporated into larger scale movement models. We explore the effect of experimental design on parameter estimation through stochastic simulation and describe methods to check that data meet model assumptions. Using a dataset of modest sample size, we illustrate the use of the model to draw inferences on consumer preference as well as underlying behavioral mechanisms. Finally, we include a user's guide and computer code scripts in R to facilitate use of the model by other researchers.
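
    A minimal version of such a movement model is a three-state Markov chain (neutral zone, on choice A, on choice B) with choice-specific attraction and leaving probabilities per observation interval; transient association probabilities follow by iterating the transition matrix, and equilibrium probabilities from its stationary distribution. The rates below are illustrative, and this discrete sketch is not the paper's likelihood formulation.

```python
import numpy as np

# Assumed per-interval probabilities (illustrative, not fitted values).
attract_A, attract_B = 0.30, 0.15   # attraction from the neutral zone to choice A / B
leave_A, leave_B = 0.05, 0.20       # leaving rates from choice A / B

# States: 0 = neutral, 1 = on choice A, 2 = on choice B.
P = np.array([
    [1 - attract_A - attract_B, attract_A,   attract_B  ],
    [leave_A,                   1 - leave_A, 0.0        ],
    [leave_B,                   0.0,         1 - leave_B],
])

state = np.array([1.0, 0.0, 0.0])   # all consumers start in the neutral zone
for t in range(1, 31):
    state = state @ P
    if t in (1, 5, 30):
        print(f"t={t:2d}  P(on A)={state[1]:.3f}  P(on B)={state[2]:.3f}")

# Equilibrium probabilities of association: stationary distribution of the chain.
vals, vecs = np.linalg.eig(P.T)
stationary = np.real(vecs[:, np.argmax(np.real(vals))])
stationary /= stationary.sum()
print("equilibrium (neutral, A, B):", np.round(stationary, 3))
```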

  15. A decision network account of reasoning about other people's choices.

    PubMed

    Jern, Alan; Kemp, Charles

    2015-09-01

    The ability to predict and reason about other people's choices is fundamental to social interaction. We propose that people reason about other people's choices using mental models that are similar to decision networks. Decision networks are extensions of Bayesian networks that incorporate the idea that choices are made in order to achieve goals. In our first experiment, we explore how people predict the choices of others. Our remaining three experiments explore how people infer the goals and knowledge of others by observing the choices that they make. We show that decision networks account for our data better than alternative computational accounts that do not incorporate the notion of goal-directed choice or that do not rely on probabilistic inference.
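
    The decision network idea can be sketched as expected utility maximization plus Bayesian inversion: predict a choice by computing expected utility under the other person's assumed goal, and infer the goal from an observed choice by applying Bayes' rule to a noisy-rational choice rule. The options, outcome probabilities, and softmax temperature below are invented, and this is not the authors' experimental material.

```python
import numpy as np

P_OUTCOME = {                       # P(outcome | option chosen); invented values
    "vending_machine": {"chips": 0.7, "candy": 0.3},
    "cafe":            {"chips": 0.1, "candy": 0.9},
}
GOALS = ["wants_chips", "wants_candy"]

def utility(outcome: str, goal: str) -> float:
    return 1.0 if (goal == "wants_chips") == (outcome == "chips") else 0.0

def expected_utility(option: str, goal: str) -> float:
    return sum(p * utility(o, goal) for o, p in P_OUTCOME[option].items())

def choice_probability(option: str, goal: str, beta: float = 5.0) -> float:
    """Noisy-rational (softmax) choice rule: goal-directed but not perfectly optimal."""
    eus = np.array([expected_utility(opt, goal) for opt in P_OUTCOME])
    weights = np.exp(beta * eus)
    return weights[list(P_OUTCOME).index(option)] / weights.sum()

# Forward prediction: what will someone who wants candy choose?
print({opt: round(choice_probability(opt, "wants_candy"), 2) for opt in P_OUTCOME})

# Inverse inference: observe a choice of 'cafe', infer the goal (uniform prior over goals).
posterior = np.array([choice_probability("cafe", g) for g in GOALS])
posterior /= posterior.sum()
print(dict(zip(GOALS, posterior.round(2))))
```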

  16. A decision network account of reasoning about other people's choices

    PubMed Central

    Jern, Alan; Kemp, Charles

    2015-01-01

    The ability to predict and reason about other people's choices is fundamental to social interaction. We propose that people reason about other people's choices using mental models that are similar to decision networks. Decision networks are extensions of Bayesian networks that incorporate the idea that choices are made in order to achieve goals. In our first experiment, we explore how people predict the choices of others. Our remaining three experiments explore how people infer the goals and knowledge of others by observing the choices that they make. We show that decision networks account for our data better than alternative computational accounts that do not incorporate the notion of goal-directed choice or that do not rely on probabilistic inference. PMID:26010559

  17. Monte Carlo probabilistic sensitivity analysis for patient level simulation models: efficient estimation of mean and variance using ANOVA.

    PubMed

    O'Hagan, Anthony; Stevenson, Matt; Madan, Jason

    2007-10-01

    Probabilistic sensitivity analysis (PSA) is required to account for uncertainty in cost-effectiveness calculations arising from health economic models. The simplest way to perform PSA in practice is by Monte Carlo methods, which involves running the model many times using randomly sampled values of the model inputs. However, this can be impractical when the economic model takes appreciable amounts of time to run. This situation arises, in particular, for patient-level simulation models (also known as micro-simulation or individual-level simulation models), where a single run of the model simulates the health care of many thousands of individual patients. The large number of patients required in each run to achieve accurate estimation of cost-effectiveness means that only a relatively small number of runs is possible. For this reason, it is often said that PSA is not practical for patient-level models. We develop a way to reduce the computational burden of Monte Carlo PSA for patient-level models, based on the algebra of analysis of variance. Methods are presented to estimate the mean and variance of the model output, with formulae for determining optimal sample sizes. The methods are simple to apply and will typically reduce the computational demand very substantially.
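
    The essence of the approach is a two-level Monte Carlo design, an outer loop over sampled parameter values and an inner loop over simulated patients, with a one-way ANOVA separating parameter uncertainty from patient-level simulation noise. The stand-in patient model, sample sizes, and distributions below are assumptions; the paper's own estimators and optimal sample-size formulae are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(11)

def simulate_patients(theta: float, n_patients: int) -> np.ndarray:
    """Stand-in patient-level model: per-patient net benefit given parameter theta."""
    return rng.normal(loc=theta, scale=5.0, size=n_patients)

N_outer, n_inner = 200, 50                                 # parameter draws x patients per draw
thetas = rng.normal(loc=10.0, scale=2.0, size=N_outer)     # sampled uncertain input (assumed)
samples = np.array([simulate_patients(t, n_inner) for t in thetas])

run_means = samples.mean(axis=1)
grand_mean = run_means.mean()

# One-way ANOVA decomposition: between-run and within-run mean squares.
ms_between = n_inner * ((run_means - grand_mean) ** 2).sum() / (N_outer - 1)
ms_within = samples.var(axis=1, ddof=1).mean()

# Variance of the expected output attributable to parameter uncertainty,
# with the patient-level Monte Carlo noise removed.
var_param = max((ms_between - ms_within) / n_inner, 0.0)
print(f"mean output ~ {grand_mean:.2f}, parameter-uncertainty variance ~ {var_param:.2f}")
```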

  18. Partner choice promotes cooperation: the two faces of testing with agent-based models.

    PubMed

    Campennì, Marco; Schino, Gabriele

    2014-03-07

    Reciprocity is one of the most debated among the mechanisms that have been proposed to explain the evolution of cooperation. While a distinction can be made between two general processes that can underlie reciprocation (within-pair temporal relations between cooperative events, and partner choice based on benefits received), theoretical modelling has concentrated on the former, while the latter has often been neglected. We developed a set of agent-based models in which agents adopted a strategy of obligate cooperation and partner choice based on benefits received. Our models tested the ability of partner choice both to reproduce significant emergent features of cooperation in group-living animals and to promote the evolution of cooperation. Populations formed by agents adopting a strategy of obligate cooperation and partner choice based on benefits received showed differentiated "social relationships" and a positive correlation between cooperation given and received, two common phenomena in animal cooperation. When selection across multiple generations was added to the model, agents adopting a strategy of partner choice based on benefits received outperformed selfish agents that did not cooperate. Our results suggest that partner choice is a significant aspect of cooperation and provide a possible mechanism for its evolution.
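
    A bare-bones version of such an agent-based model is easy to state: agents always cooperate, but direct their help toward partners in proportion to the benefits previously received from them, which is enough to produce differentiated relationships and a positive correlation between help given and received. The population size, benefit values, and exploration floor below are invented, and selection across generations is omitted.

```python
import random
from collections import defaultdict

random.seed(5)
N_AGENTS, ROUNDS = 10, 2000

received = [defaultdict(float) for _ in range(N_AGENTS)]  # benefits received from each partner
given = [defaultdict(float) for _ in range(N_AGENTS)]

def choose_partner(agent: int) -> int:
    """Partner choice based on benefits received, with a small floor so every partner can be sampled."""
    others = [j for j in range(N_AGENTS) if j != agent]
    weights = [received[agent][j] + 0.1 for j in others]
    return random.choices(others, weights=weights)[0]

for _ in range(ROUNDS):
    actor = random.randrange(N_AGENTS)
    partner = choose_partner(actor)
    given[actor][partner] += 1.0       # obligate cooperation: always help the chosen partner
    received[partner][actor] += 1.0

# Reciprocity check: correlation between help given to and received from each partner.
pairs = [(given[i][j], received[i][j]) for i in range(N_AGENTS) for j in range(N_AGENTS) if i != j]
xs, ys = zip(*pairs)
mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
corr = (sum((x - mx) * (y - my) for x, y in pairs)
        / (sum((x - mx) ** 2 for x in xs) ** 0.5 * sum((y - my) ** 2 for y in ys) ** 0.5))
print(f"correlation between cooperation given and received: {corr:.2f}")
```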

  19. Biostereometric Data Processing In ERGODATA: Choice Of Human Body Models

    NASA Astrophysics Data System (ADS)

    Pineau, J. C.; Mollard, R.; Sauvignon, M.; Amphoux, M.

    1983-07-01

    The definition of human body models was elaborated with anthropometric data from ERGODATA. The first model reduces the human body to a series of points and lines. The second model is well adapted to representing the volume of each segmentary element. The third is an original model built from the conventional anatomical points. Each segment is defined in space by a triangular plane located with its 3-D coordinates. This new model can support all the processing required for computer-aided design (CAD) in ergonomics, as well as in biomechanics and orthopaedics.
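
    The third model's representation, a segment defined by a triangular plane through anatomical landmarks, reduces to elementary vector geometry: three 3-D points determine the plane via a cross product. The landmark names and coordinates below are invented purely for illustration.

```python
import numpy as np

def segment_plane(p1, p2, p3):
    """Plane through three anatomical landmarks: returns the unit normal n and offset d in n.x = d."""
    p1, p2, p3 = map(np.asarray, (p1, p2, p3))
    normal = np.cross(p2 - p1, p3 - p1)
    normal = normal / np.linalg.norm(normal)
    return normal, float(normal @ p1)

# Illustrative landmark coordinates (cm) for a forearm segment (invented values).
elbow = (0.0, 0.0, 0.0)
wrist_radial = (25.0, 3.0, 1.0)
wrist_ulnar = (25.0, -3.0, 1.0)

n, d = segment_plane(elbow, wrist_radial, wrist_ulnar)
print("unit normal:", n.round(3), " offset d:", round(d, 3))
```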

  20. Model Verification and Validation Concepts for a Probabilistic Fracture Assessment Model to Predict Cracking of Knife Edge Seals in the Space Shuttle Main Engine High Pressure Oxidizer

    NASA Technical Reports Server (NTRS)

    Pai, Shantaram S.; Riha, David S.

    2013-01-01

    Physics-based models are routinely used to predict the performance of engineered systems to make decisions such as when to retire system components, how to extend the life of an aging system, or if a new design will be safe or available. Model verification and validation (V&V) is a process to establish credibility in model predictions. Ideally, carefully controlled validation experiments will be designed and performed to validate models or submodels. In reality, time and cost constraints limit experiments and even model development. This paper describes elements of model V&V during the development and application of a probabilistic fracture assessment model to predict cracking in space shuttle main engine high-pressure oxidizer turbopump knife-edge seals. The objective of this effort was to assess the probability of initiating and growing a crack to a specified failure length in specific flight units for different usage and inspection scenarios. The probabilistic fracture assessment model developed in this investigation combined a series of submodels describing the usage, temperature history, flutter tendencies, tooth stresses and numbers of cycles, fatigue cracking, nondestructive inspection, and final