Science.gov

Sample records for probabilistic model integrating

  1. Integrating Boolean Queries in Conjunctive Normal Form with Probabilistic Retrieval Models.

    ERIC Educational Resources Information Center

    Losee, Robert M.; Bookstein, Abraham

    1988-01-01

    Presents a model that places Boolean database queries into conjunctive normal form, thereby allowing probabilistic ranking of documents and the incorporation of relevance feedback. Experimental results compare the performance of a sequential learning probabilistic retrieval model with the proposed integrated Boolean probabilistic model and a fuzzy…

  2. Architecture for Integrated Medical Model Dynamic Probabilistic Risk Assessment

    NASA Technical Reports Server (NTRS)

    Jaworske, D. A.; Myers, J. G.; Goodenow, D.; Young, M.; Arellano, J. D.

    2016-01-01

    Probabilistic Risk Assessment (PRA) is a modeling tool used to predict potential outcomes of a complex system based on a statistical understanding of many initiating events. Utilizing a Monte Carlo method, thousands of instances of the model are considered and outcomes are collected. PRA is considered static, utilizing probabilities alone to calculate outcomes. Dynamic Probabilistic Risk Assessment (dPRA) is an advanced concept in which modeling predicts the outcomes of a complex system based not only on the probabilities of many initiating events, but also on a progression of dependencies brought about by progressing down a time line. Events are placed on a single time line, with each event added to a queue managed by a planner. Progression down the time line is guided by rules managed by a scheduler. The recently developed Integrated Medical Model (IMM) summarizes astronaut health as governed by the probabilities of medical events and mitigation strategies. Managing the software architecture process provides a systematic means of creating, documenting, and communicating a software design early in the development process. The software architecture process begins with establishing requirements, and the design is then derived from the requirements.
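
    The planner/scheduler description above maps naturally onto a discrete-event Monte Carlo loop. Below is a minimal, illustrative Python sketch (not the IMM/dPRA software itself); the event names, occurrence rates, and exponential inter-arrival assumption are placeholders chosen for the example.

```python
import heapq
import random

def simulate_mission(duration_days, event_rates, rng):
    """One Monte Carlo instance: draw event onset times, process them in time order."""
    queue = []  # planner: (onset_time, event_name)
    for name, rate_per_day in event_rates.items():
        t = rng.expovariate(rate_per_day)          # first occurrence
        while t < duration_days:
            heapq.heappush(queue, (t, name))
            t += rng.expovariate(rate_per_day)     # next occurrence
    timeline = []
    while queue:                                   # scheduler: progress down the time line
        t, name = heapq.heappop(queue)
        timeline.append((t, name))
    return timeline

rng = random.Random(42)
rates = {"minor_injury": 0.02, "equipment_fault": 0.01}   # assumed rates per mission day
runs = [simulate_mission(180, rates, rng) for _ in range(1000)]
print(sum(len(r) for r in runs) / len(runs), "events per mission on average")
```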

  3. Probabilistic models of eukaryotic evolution: time for integration

    PubMed Central

    Lartillot, Nicolas

    2015-01-01

    In spite of substantial work and recent progress, a global and fully resolved picture of the macroevolutionary history of eukaryotes is still under construction. This concerns not only the phylogenetic relations among major groups, but also the general characteristics of the underlying macroevolutionary processes, including the patterns of gene family evolution associated with endosymbioses, as well as their impact on the sequence evolutionary process. All these questions raise formidable methodological challenges, calling for a more powerful statistical paradigm. In this direction, model-based probabilistic approaches have played an increasingly important role. In particular, improved models of sequence evolution accounting for heterogeneities across sites and across lineages have led to significant, although insufficient, improvement in phylogenetic accuracy. More recently, one main trend has been to move away from simple parametric models and stepwise approaches, towards integrative models explicitly considering the intricate interplay between multiple levels of macroevolutionary processes. Such integrative models are in their infancy, and their application to the phylogeny of eukaryotes still requires substantial improvement of the underlying models, as well as additional computational developments. PMID:26323768

  4. SHEDS-HT: An Integrated Probabilistic Exposure Model for ...

    EPA Pesticide Factsheets

    United States Environmental Protection Agency (USEPA) researchers are developing a strategy for high-throughput (HT) exposure-based prioritization of chemicals under the ExpoCast program. These novel modeling approaches for evaluating chemicals based on their potential for biologically relevant human exposures will inform toxicity testing and prioritization for chemical risk assessment. Based on probabilistic methods and algorithms developed for The Stochastic Human Exposure and Dose Simulation Model for Multimedia, Multipathway Chemicals (SHEDS-MM), a new mechanistic modeling approach has been developed to accommodate high-throughput (HT) assessment of exposure potential. In this SHEDS-HT model, the residential and dietary modules of SHEDS-MM have been operationally modified to reduce the user burden, input data demands, and run times of the higher-tier model, while maintaining critical features and inputs that influence exposure. The model has been implemented in R; the modeling framework links chemicals to consumer product categories or food groups (and thus exposure scenarios) to predict HT exposures and intake doses. Initially, SHEDS-HT has been applied to 2507 organic chemicals associated with consumer products and agricultural pesticides. These evaluations employ data from recent USEPA efforts to characterize usage (prevalence, frequency, and magnitude), chemical composition, and exposure scenarios for a wide range of consumer products. In modeling indirect…

  5. Integration of Advanced Probabilistic Analysis Techniques with Multi-Physics Models

    SciTech Connect

    Cetiner, Mustafa Sacit; Flanagan, George F.; Poore III, Willis P.; Muhlheim, Michael David

    2014-07-30

    An integrated simulation platform that couples probabilistic analysis-based tools with model-based simulation tools can provide valuable insights for reactive and proactive responses to plant operating conditions. The objective of this work is to demonstrate the benefits of a partial implementation of the Small Modular Reactor (SMR) Probabilistic Risk Assessment (PRA) Detailed Framework Specification through the coupling of advanced PRA capabilities and accurate multi-physics plant models. Coupling a probabilistic model with a multi-physics model will aid in design, operations, and safety by providing a more accurate understanding of plant behavior. This represents the first attempt at actually integrating these two types of analyses for a control system used for operations, on a faster than real-time basis. This report documents the development of the basic communication capability to exchange data with the probabilistic model using Reliability Workbench (RWB) and the multi-physics model using Dymola. The communication pathways from injecting a fault (i.e., failing a component) to the probabilistic and multi-physics models were successfully completed. This first version was tested with prototypic models represented in both RWB and Modelica. First, a simple event tree/fault tree (ET/FT) model was created to develop the software code to implement the communication capabilities between the dynamic-link library (dll) and RWB. A program, written in C#, successfully communicates faults to the probabilistic model through the dll. A systems model of the Advanced Liquid-Metal Reactor–Power Reactor Inherently Safe Module (ALMR-PRISM) design developed under another DOE project was upgraded using Dymola to include proper interfaces to allow data exchange with the control application (ConApp). A program, written in C+, successfully communicates faults to the multi-physics model. The results of the example simulation were successfully plotted.

  6. PROBABILISTIC INFORMATION INTEGRATION TECHNOLOGY

    SciTech Connect

    Booker, J.; Meyer, M.; et al.

    2001-02-01

    The Statistical Sciences Group at Los Alamos has successfully developed a structured, probabilistic, quantitative approach for the evaluation of system performance based on multiple information sources, called Information Integration Technology (IIT). The technology integrates diverse types and sources of data and information (both quantitative and qualitative), and their associated uncertainties, to develop distributions for performance metrics, such as reliability. Applications include predicting complex system performance, where test data are lacking or expensive to obtain, through the integration of expert judgment, historical data, computer/simulation model predictions, and any relevant test/experimental data. The technology is particularly well suited for tracking estimated system performance for systems under change (e.g. development, aging), and can be used at any time during product development, including concept and early design phases, prior to prototyping, testing, or production, and before costly design decisions are made. Techniques from various disciplines (e.g., state-of-the-art expert elicitation, statistical and reliability analysis, design engineering, physics modeling, and knowledge management) are merged and modified to develop formal methods for the data/information integration. The power of this technology, known as PREDICT (Performance and Reliability Evaluation with Diverse Information Combination and Tracking), won a 1999 R&D 100 Award (Meyer, Booker, Bement, Kerscher, 1999). Specifically, the PREDICT application is a formal, multidisciplinary process for estimating the performance of a product when test data are sparse or nonexistent. The acronym indicates the purpose of the methodology: to evaluate the performance or reliability of a product/system by combining all available (often diverse) sources of information and then tracking that performance as the product undergoes changes.

  7. Probabilistic Model Development

    NASA Technical Reports Server (NTRS)

    Adam, James H., Jr.

    2010-01-01

    Objective: Develop a Probabilistic Model for the Solar Energetic Particle Environment. Develop a tool to provide a reference solar particle radiation environment that: 1) Will not be exceeded at a user-specified confidence level; 2) Will provide reference environments for: a) Peak flux; b) Event-integrated fluence; and c) Mission-integrated fluence. The reference environments will consist of elemental energy spectra for protons, helium, and heavier ions.
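
    As a worked illustration of an environment that "will not be exceeded at a user-specified confidence level", the sketch below draws Monte Carlo samples of event-integrated fluence and reports the empirical quantile at the requested confidence. The lognormal parameters are invented placeholders, not values from the model.

```python
import numpy as np

rng = np.random.default_rng(0)
# Placeholder: event-integrated proton fluence samples (cm^-2), lognormal assumed
fluence_samples = rng.lognormal(mean=22.0, sigma=1.5, size=100_000)

confidence = 0.95  # user-specified confidence level
reference_fluence = np.quantile(fluence_samples, confidence)
print(f"Fluence not exceeded at {confidence:.0%} confidence: {reference_fluence:.3e} cm^-2")
```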

  8. The Integrated Medical Model: A Probabilistic Simulation Model Predicting In-Flight Medical Risks

    NASA Technical Reports Server (NTRS)

    Keenan, Alexandra; Young, Millennia; Saile, Lynn; Boley, Lynn; Walton, Marlei; Kerstman, Eric; Shah, Ronak; Goodenow, Debra A.; Myers, Jerry G., Jr.

    2015-01-01

    The Integrated Medical Model (IMM) is a probabilistic model that uses simulation to predict mission medical risk. Given a specific mission and crew scenario, medical events are simulated using Monte Carlo methodology to provide estimates of resource utilization, probability of evacuation, probability of loss of crew, and the amount of mission time lost due to illness. Mission and crew scenarios are defined by mission length, extravehicular activity (EVA) schedule, and crew characteristics including: sex, coronary artery calcium score, contacts, dental crowns, history of abdominal surgery, and EVA eligibility. The Integrated Medical Evidence Database (iMED) houses the model inputs for one hundred medical conditions using in-flight, analog, and terrestrial medical data. Inputs include incidence, event durations, resource utilization, and crew functional impairment. Severity of conditions is addressed by defining statistical distributions on the dichotomized best and worst-case scenarios for each condition. The outcome distributions for conditions are bounded by the treatment extremes of the fully treated scenario in which all required resources are available and the untreated scenario in which no required resources are available. Upon occurrence of a simulated medical event, treatment availability is assessed, and outcomes are generated depending on the status of the affected crewmember at the time of onset, including any pre-existing functional impairments or ongoing treatment of concurrent conditions. The main IMM outcomes, including probability of evacuation and loss of crew life, time lost due to medical events, and resource utilization, are useful in informing mission planning decisions. To date, the IMM has been used to assess mission-specific risks with and without certain crewmember characteristics, to determine the impact of eliminating certain resources from the mission medical kit, and to design medical kits that maximally benefit crew health while meeting
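
    A toy Monte Carlo in the same spirit, showing how per-trial event draws aggregate into probability-of-evacuation and mission-time-lost estimates. The condition names, incidences, and outcome rules below are invented for illustration and are not iMED inputs.

```python
import numpy as np

rng = np.random.default_rng(1)
# Invented conditions: (incidence per mission, days lost if it occurs, P(evacuation | event))
conditions = {"condition_A": (0.10, 2.0, 0.01), "condition_B": (0.02, 5.0, 0.20)}

n_trials = 50_000
evac = np.zeros(n_trials, dtype=bool)
time_lost = np.zeros(n_trials)
for name, (p_event, days_lost, p_evac) in conditions.items():
    occurs = rng.random(n_trials) < p_event
    time_lost += np.where(occurs, days_lost, 0.0)
    evac |= occurs & (rng.random(n_trials) < p_evac)

print("P(evacuation) ~", evac.mean())
print("Mean mission time lost ~", time_lost.mean(), "days")
```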

  9. The Integrated Medical Model: A Probabilistic Simulation Model for Predicting In-Flight Medical Risks

    NASA Technical Reports Server (NTRS)

    Keenan, Alexandra; Young, Millennia; Saile, Lynn; Boley, Lynn; Walton, Marlei; Kerstman, Eric; Shah, Ronak; Goodenow, Debra A.; Myers, Jerry G.

    2015-01-01

    The Integrated Medical Model (IMM) is a probabilistic model that uses simulation to predict mission medical risk. Given a specific mission and crew scenario, medical events are simulated using Monte Carlo methodology to provide estimates of resource utilization, probability of evacuation, probability of loss of crew, and the amount of mission time lost due to illness. Mission and crew scenarios are defined by mission length, extravehicular activity (EVA) schedule, and crew characteristics including: sex, coronary artery calcium score, contacts, dental crowns, history of abdominal surgery, and EVA eligibility. The Integrated Medical Evidence Database (iMED) houses the model inputs for one hundred medical conditions using in-flight, analog, and terrestrial medical data. Inputs include incidence, event durations, resource utilization, and crew functional impairment. Severity of conditions is addressed by defining statistical distributions on the dichotomized best and worst-case scenarios for each condition. The outcome distributions for conditions are bounded by the treatment extremes of the fully treated scenario in which all required resources are available and the untreated scenario in which no required resources are available. Upon occurrence of a simulated medical event, treatment availability is assessed, and outcomes are generated depending on the status of the affected crewmember at the time of onset, including any pre-existing functional impairments or ongoing treatment of concurrent conditions. The main IMM outcomes, including probability of evacuation and loss of crew life, time lost due to medical events, and resource utilization, are useful in informing mission planning decisions. To date, the IMM has been used to assess mission-specific risks with and without certain crewmember characteristics, to determine the impact of eliminating certain resources from the mission medical kit, and to design medical kits that maximally benefit crew health while meeting

  10. Using the Rasch model as an objective and probabilistic technique to integrate different soil properties

    NASA Astrophysics Data System (ADS)

    Rebollo, Francisco J.; Jesús Moral García, Francisco

    2016-04-01

    Soil apparent electrical conductivity (ECa) is one of the simplest, least expensive soil measurements that integrates many soil properties affecting crop productivity, including, for instance, soil texture, water content, and cation exchange capacity. The ECa measurements obtained with a Veris 3100 sensor, operating in both shallow (0-30 cm), ECs, and deep (0-90 cm), ECd, mode, can be used as additional and essential information to be included in a probabilistic model, the Rasch model, with the aim of quantifying the overall soil fertility potential in an agricultural field. This quantification should integrate the main soil physical and chemical properties, which are measured in different units. In this work, the formulation of the Rasch model integrates 11 soil properties (clay, silt and sand content, organic matter (OM), pH, total nitrogen (TN), available phosphorus (AP) and potassium (AK), cation exchange capacity (CEC), ECd, and ECs) measured at 70 locations in a field. The main outputs of the model include a ranking of all soil samples according to their relative fertility potential and the unexpected behaviours of some soil samples and properties. In the case study, the considered soil variables fit the model reasonably and have an important influence on soil fertility, except pH, probably due to its homogeneity in the field. Moreover, ECd and ECs are the most influential properties on soil fertility, while AP and AK are the least influential. The use of the Rasch model to estimate soil fertility potential (always in a relative way, taking into account the characteristics of the studied soil) constitutes a new application of great practical importance, making it possible to rationally determine locations in a field where high soil fertility potential exists and to identify those soil samples or properties which show any anomaly; this information can be necessary to conduct site-specific treatments, leading to a more cost-effective and sustainable field
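
    For reference, a minimal sketch of the dichotomous Rasch formulation underlying this kind of integration (the study itself presumably uses a polytomous extension, and the symbols here are generic): the probability that sample n scores positively on property i depends only on the difference between the sample's latent fertility measure β_n and the property's difficulty δ_i.

```latex
P(X_{ni} = 1 \mid \beta_n, \delta_i) = \frac{\exp(\beta_n - \delta_i)}{1 + \exp(\beta_n - \delta_i)}
```

    Ranking the estimated β_n values gives the relative fertility potential of the sampled locations, and large residuals between observed and expected responses flag the unexpected behaviours mentioned above.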

  11. Learning Probabilistic Logic Models from Probabilistic Examples

    PubMed Central

    Chen, Jianzhong; Muggleton, Stephen; Santos, José

    2009-01-01

    We revisit an application developed originally using abductive Inductive Logic Programming (ILP) for modeling inhibition in metabolic networks. The example data were derived from studies of the effects of toxins on rats using Nuclear Magnetic Resonance (NMR) time-trace analysis of their biofluids, together with background knowledge representing a subset of the Kyoto Encyclopedia of Genes and Genomes (KEGG). We now apply two Probabilistic ILP (PILP) approaches - abductive Stochastic Logic Programs (SLPs) and PRogramming In Statistical modeling (PRISM) - to the application. Both approaches support abductive learning and probability predictions. Abductive SLPs are a PILP framework that provides possible worlds semantics to SLPs through abduction. Instead of learning logic models from non-probabilistic examples as done in ILP, the PILP approach applied in this paper is based on a general technique for introducing probability labels within a standard scientific experimental setting involving control and treated data. Our results demonstrate that the PILP approach provides a way of learning probabilistic logic models from probabilistic examples, and the PILP models learned from probabilistic examples lead to a significant decrease in error accompanied by improved insight from the learned results compared with the PILP models learned from non-probabilistic examples. PMID:19888348

  12. Integrating probabilistic models of perception and interactive neural networks: a historical and tutorial review

    PubMed Central

    McClelland, James L.

    2013-01-01

    This article seeks to establish a rapprochement between explicitly Bayesian models of contextual effects in perception and neural network models of such effects, particularly the connectionist interactive activation (IA) model of perception. The article is in part an historical review and in part a tutorial, reviewing the probabilistic Bayesian approach to understanding perception and how it may be shaped by context, and also reviewing ideas about how such probabilistic computations may be carried out in neural networks, focusing on the role of context in interactive neural networks, in which both bottom-up and top-down signals affect the interpretation of sensory inputs. It is pointed out that connectionist units that use the logistic or softmax activation functions can exactly compute Bayesian posterior probabilities when the bias terms and connection weights affecting such units are set to the logarithms of appropriate probabilistic quantities. Bayesian concepts such as the prior, likelihood, (joint and marginal) posterior, probability matching and maximizing, and calculating vs. sampling from the posterior are all reviewed and linked to neural network computations. Probabilistic and neural network models are explicitly linked to the concept of a probabilistic generative model that describes the relationship between the underlying target of perception (e.g., the word intended by a speaker or other source of sensory stimuli) and the sensory input that reaches the perceiver for use in inferring the underlying target. It is shown how a new version of the IA model called the multinomial interactive activation (MIA) model can sample correctly from the joint posterior of a proposed generative model for perception of letters in words, indicating that interactive processing is fully consistent with principled probabilistic computation. Ways in which these computations might be realized in real neural systems are also considered. PMID:23970868
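
    The identity referred to above can be written out. Under a generative model in which input features are conditionally independent given the hypothesis, setting a unit's bias to the log prior and its incoming weights to log likelihoods makes the softmax over net inputs return the exact posterior (generic symbols, not the MIA model's specific parameterization):

```latex
\mathrm{net}_h = \log P(h) + \sum_{f} \log P(f \mid h), \qquad
\frac{e^{\mathrm{net}_h}}{\sum_{h'} e^{\mathrm{net}_{h'}}}
= \frac{P(h)\prod_{f} P(f \mid h)}{\sum_{h'} P(h')\prod_{f} P(f \mid h')}
= P(h \mid f_1, \ldots, f_n).
```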

  13. A Simple Probabilistic Combat Model

    DTIC Science & Technology

    2016-06-13

    …model described here, all attrition is modeled probabilistically and it is possible (although unlikely) for the weaker side to be successful. The model… integrals are plotted as a function of the number of waves in the lower right plot. Since we start at a point in the space, there is a clear winner… The previous plots have shown how the probability distribution of red and blue survivors evolves…

  14. Analysis of molecular expression patterns and integration with other knowledge bases using probabilistic Bayesian network models

    SciTech Connect

    Moler, Edward J.; Mian, I.S.

    2000-03-01

    How can molecular expression experiments be interpreted when each chip yields more than 10^4 measurements? How can one get the most quantitative information possible from the experimental data with good confidence? These are important questions whose solutions require an interdisciplinary combination of molecular and cellular biology, computer science, statistics, and complex systems analysis. The explosion of data from microarray techniques presents the problem of interpreting the experiments. The availability of large-scale knowledge bases provides the opportunity to maximize the information extracted from these experiments. We have developed new methods of discovering biological function, metabolic pathways, and regulatory networks from these data and knowledge bases. These techniques are applicable to analyses for biomedical engineering, clinical, and fundamental cell and molecular biology studies. Our approach uses probabilistic, computational methods that give quantitative interpretations of data in a biological context. We have selected Bayesian statistical models with graphical network representations as a framework for our methods. As a first step, we use a naïve Bayesian classifier to identify statistically significant patterns in gene expression data. We have developed methods which allow us to (a) characterize which genes or experiments distinguish each class from the others, (b) cross-index the resulting classes with other databases to assess the biological meaning of the classes, and (c) display a gross overview of cellular dynamics. We have developed a number of visualization tools to convey the results. We report here our methods of classification and our first attempts at integrating the data and other knowledge bases together with new visualization tools. We demonstrate the utility of these methods and tools by analysis of a series of yeast cDNA microarray data and of a set of cancerous/normal sample data from colon cancer patients. We discuss
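
    A minimal Gaussian naive Bayes classifier in the spirit of the first step described above (illustrative only; the study's actual features, preprocessing, and class definitions are not reproduced here):

```python
import numpy as np

def fit_gaussian_nb(X, y):
    """Per-class means, variances, and priors for a Gaussian naive Bayes model."""
    params = {}
    for c in np.unique(y):
        Xc = X[y == c]
        params[c] = (Xc.mean(axis=0), Xc.var(axis=0) + 1e-9, len(Xc) / len(X))
    return params

def predict(params, x):
    """Return the class with the largest log posterior (log prior + summed log likelihoods)."""
    def log_post(c):
        mu, var, prior = params[c]
        return np.log(prior) - 0.5 * np.sum(np.log(2 * np.pi * var) + (x - mu) ** 2 / var)
    return max(params, key=log_post)

# Toy expression matrix: rows = samples, columns = genes; labels = tumor/normal
rng = np.random.default_rng(7)
X = np.vstack([rng.normal(0, 1, (20, 50)), rng.normal(1, 1, (20, 50))])
y = np.array(["normal"] * 20 + ["tumor"] * 20)
model = fit_gaussian_nb(X, y)
print(predict(model, X[0]), predict(model, X[-1]))
```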

  15. Crevice corrosion & pitting of high-level waste containers: the integration of deterministic & probabilistic models (II)

    SciTech Connect

    Farmer, J.C.

    1997-10-01

    An integrated predictive model is being developed to account for the effects of localized environmental conditions in crevices on the initiation and propagation of pits. A deterministic calculation is used to estimate the accumulation of hydrogen ions (pH suppression) in the crevice solution due to the hydrolysis of dissolved metals. Pit initiation and growth within the crevice is then dealt with by either a probabilistic model, or an equivalent deterministic model. Ultimately, the role of intergranular corrosion will have to be considered. While the strategy presented here is very promising, the integrated model is not yet ready for precise quantitative predictions. Empirical expressions for the rate of penetration based upon experimental crevice corrosion data can be used in the interim period, until the integrated model can be refined. Bounding calculations based upon such empirical expressions can provide important insight into worst-case scenarios.

  16. Probabilistic representation in syllogistic reasoning: A theory to integrate mental models and heuristics.

    PubMed

    Hattori, Masasi

    2016-12-01

    This paper presents a new theory of syllogistic reasoning. The proposed model assumes there are probabilistic representations of given signature situations. Instead of conducting an exhaustive search, the model constructs an individual-based "logical" mental representation that expresses the most probable state of affairs, and derives a necessary conclusion that is not inconsistent with the model using heuristics based on informativeness. The model is a unification of previous influential models. Its descriptive validity has been evaluated against existing empirical data and two new experiments, and by qualitative analyses based on previous empirical findings, all of which supported the theory. The model's behavior is also consistent with findings in other areas, including working memory capacity. The results indicate that people assume the probabilities of all target events mentioned in a syllogism to be almost equal, which suggests links between syllogistic reasoning and other areas of cognition. Copyright © 2016 The Author(s). Published by Elsevier B.V. All rights reserved.

  17. Development of Probabilistic Risk Assessment Model for BWR Shutdown Modes 4 and 5 Integrated in SPAR Model

    SciTech Connect

    S. T. Khericha; S. Sancakter; J. Mitman; J. Wood

    2010-06-01

    Nuclear plant operating experience and several studies show that the risk from shutdown operation during modes 4, 5, and 6 can be significant. This paper describes development of the standard template risk evaluation models for shutdown modes 4 and 5 for commercial boiling water reactor (BWR) nuclear power plants. The shutdown probabilistic risk assessment model uses the Nuclear Regulatory Commission’s (NRC’s) full-power Standardized Plant Analysis Risk (SPAR) model as the starting point for development. The shutdown PRA models are integrated with their respective internal-events at-power SPAR models. This is accomplished by combining the modified system fault trees from the SPAR full-power model with shutdown event tree logic. For human reliability analysis (HRA), the SPAR HRA (SPAR-H) method is used, which requires the analyst to complete a relatively straightforward worksheet that includes the performance shaping factors (PSFs). The results are then used to estimate the human error probabilities (HEPs) of interest. The preliminary results indicate the risk is dominated by the operator’s ability to diagnose the events and provide long-term cooling.
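
    A sketch of the SPAR-H worksheet arithmetic referred to above, as described in the published method documentation (the nominal HEP and PSF multipliers below are placeholder examples; consult the SPAR-H worksheets for actual values): the nominal HEP is multiplied by the composite of the PSF multipliers, with an adjustment factor applied when three or more PSFs are negative so that the result remains a valid probability.

```python
def spar_h_hep(nominal_hep, psf_multipliers, negative_psf_count):
    """Composite-PSF adjustment of a nominal human error probability (SPAR-H style)."""
    composite = 1.0
    for m in psf_multipliers:
        composite *= m
    if negative_psf_count >= 3:
        # Adjustment used when three or more PSFs are negative, keeping HEP below 1
        return (nominal_hep * composite) / (nominal_hep * (composite - 1.0) + 1.0)
    return nominal_hep * composite

# Placeholder example: diagnosis task, nominal HEP 1e-2, two degrading PSFs
print(spar_h_hep(1e-2, [2.0, 5.0], negative_psf_count=2))   # 0.1
```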

  18. Probabilistic Mesomechanical Fatigue Model

    NASA Technical Reports Server (NTRS)

    Tryon, Robert G.

    1997-01-01

    A probabilistic mesomechanical fatigue life model is proposed to link the microstructural material heterogeneities to the statistical scatter in the macrostructural response. The macrostructure is modeled as an ensemble of microelements. Cracks nucleate within the microelements and grow from the microelements to final fracture. Variations of the microelement properties are defined using statistical parameters. A micromechanical slip band decohesion model is used to determine the crack nucleation life and size. A crack tip opening displacement model is used to determine the small crack growth life and size. The Paris law is used to determine the long crack growth life. The models are combined in a Monte Carlo simulation to determine the statistical distribution of total fatigue life for the macrostructure. The modeled response is compared to trends in experimental observations from the literature.

  19. Probabilistic data integration and computational complexity

    NASA Astrophysics Data System (ADS)

    Hansen, T. M.; Cordua, K. S.; Mosegaard, K.

    2016-12-01

    Inverse problems in Earth Sciences typically refer to the problem of inferring information about properties of the Earth from observations of geophysical data (the result of nature's solution to the `forward' problem). This problem can be formulated more generally as a problem of `integration of information'. A probabilistic formulation of data integration is in principle simple: if all available information (from e.g. geology, geophysics, remote sensing, chemistry…) can be quantified probabilistically, then different algorithms exist that allow solving the data integration problem, either through an analytical description of the combined probability function or by sampling the probability function. In practice, however, probabilistic data integration may not be easy to apply successfully. This may be related to the use of sampling methods, which are known to be computationally costly. But another source of computational complexity is related to how the individual types of information are quantified. In one case, a data integration problem is demonstrated where the goal is to determine the existence of buried channels in Denmark, based on multiple sources of geo-information. Because one type of information is too informative (and hence conflicting), this leads to a difficult sampling problem with unrealistic uncertainty. Resolving this conflict prior to data integration leads to an easy data integration problem, with no biases. In another case it is demonstrated how imperfections in the description of the geophysical forward model (related to solving the wave equation) can lead to a difficult data integration problem, with severe bias in the results. If the modeling error is accounted for, the data integration problem becomes relatively easy, with no apparent biases. Both examples demonstrate that biased information can have a dramatic effect on the computational efficiency of solving a data integration problem and lead to biased results, and under
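
    A toy illustration (not the Danish buried-channel case) of how two probabilistically quantified sources of information combine, and how an over-confident source dominates the result: two densities over the same discrete model space are multiplied pointwise and renormalized.

```python
import numpy as np

models = np.linspace(0.0, 1.0, 101)                        # discrete 1-D model space (generic parameter)
geology = np.exp(-0.5 * ((models - 0.4) / 0.20) ** 2)      # broad prior from geological knowledge
geophysics = np.exp(-0.5 * ((models - 0.8) / 0.02) ** 2)   # very "informative" (possibly over-confident) source

combined = geology * geophysics
combined /= combined.sum()
print("Posterior mean:", (models * combined).sum())        # dragged almost entirely to the narrow source
```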

  20. Integrating fault and seismological data into a probabilistic seismic hazard model for Italy.

    NASA Astrophysics Data System (ADS)

    Valentini, Alessandro; Visini, Francesco; Pace, Bruno

    2017-04-01

    We present the results of a new probabilistic seismic hazard analysis (PSHA) for Italy based on active fault and seismological data. Combining seismic hazard from active faults with distributed seismic sources (where there are no data on active faults) is the backbone of this work. Far from identifying a best procedure, currently adopted approaches combine active faults and background sources by applying a threshold magnitude, generally between 5.5 and 7, above which seismicity is modelled by faults and below which it is modelled by distributed or area sources. In our PSHA we (i) apply a new method for the treatment of geologic data on major active faults and (ii) propose a new approach to combine these data with historical seismicity to evaluate PSHA for Italy. Assuming that deformation is concentrated along faults, we combine the earthquake occurrences derived from the geometry and slip rates of the active faults with the earthquakes from the spatially smoothed earthquake sources. In the vicinity of an active fault, the smoothed seismic activity is gradually reduced by a fault-size driven factor. Even if the range and gross spatial distribution of expected accelerations obtained in our work are comparable to those obtained through methods applying seismic catalogues and classical zonation models, the main difference is in the detailed spatial pattern of our PSHA model: our model is characterized by spots of more hazardous area in correspondence with mapped active faults, while the previous models give expected accelerations almost uniformly distributed across large regions. Finally, we investigate the impact of the earthquake rates derived from two magnitude-frequency distribution (MFD) models for faults on the hazard results, and with respect to the contribution of faults versus distributed seismic activity.
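
    Schematically, the combination strategy described above can be summarized as follows (generic symbols, not the authors' exact formulation): at each source cell x and magnitude m, fault-derived rates and smoothed-seismicity rates are summed, with the smoothed contribution tapered near mapped faults,

```latex
\lambda_{\mathrm{tot}}(x, m) = \lambda_{\mathrm{fault}}(x, m) + w(x)\,\lambda_{\mathrm{smooth}}(x, m),
\qquad 0 \le w(x) \le 1,
```

    where w(x) is the fault-size driven reduction factor, smallest close to an active fault and approaching 1 far from it.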

  1. Combining exposure and effect modeling into an integrated probabilistic environmental risk assessment for nanoparticles.

    PubMed

    Jacobs, Rianne; Meesters, Johannes A J; Ter Braak, Cajo J F; van de Meent, Dik; van der Voet, Hilko

    2016-12-01

    There is a growing need for good environmental risk assessment of engineered nanoparticles (ENPs). Environmental risk assessment of ENPs has been hampered by lack of data and knowledge about ENPs, their environmental fate, and their toxicity. This leads to uncertainty in the risk assessment. To deal with uncertainty in the risk assessment effectively, probabilistic methods are advantageous. In the present study, the authors developed a method to model both the variability and the uncertainty in environmental risk assessment of ENPs. This method is based on the concentration ratio, the ratio of the exposure concentration to the critical effect concentration, with both concentrations considered to be random. In this method, variability and uncertainty are modeled separately so as to allow the user to see which part of the total variation in the concentration ratio is attributable to uncertainty and which part is attributable to variability. The authors illustrate the use of the method with a simplified aquatic risk assessment of nano-titanium dioxide. The authors' method allows a more transparent risk assessment and can also direct further environmental and toxicological research to the areas in which it is most needed. Environ Toxicol Chem 2016;35:2958-2967. © 2016 The Authors. Environmental Toxicology and Chemistry published by Wiley Periodicals, Inc. on behalf of SETAC.
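
    A minimal nested (two-dimensional) Monte Carlo sketch of the separation described above; all distributions and parameter values are invented for illustration. The outer loop draws uncertain parameters, the inner loop draws variability, and each outer draw yields one estimate of the probability that the concentration ratio exceeds 1.

```python
import numpy as np

rng = np.random.default_rng(3)
n_outer, n_inner = 200, 2000          # uncertainty draws, variability draws

risk_estimates = []
for _ in range(n_outer):
    # Uncertainty: imperfect knowledge of the median exposure and effect concentrations
    median_exposure = rng.lognormal(np.log(1.0), 0.5)
    median_effect = rng.lognormal(np.log(10.0), 0.5)
    # Variability: spread across environments / species
    exposure = rng.lognormal(np.log(median_exposure), 1.0, n_inner)
    effect = rng.lognormal(np.log(median_effect), 1.0, n_inner)
    risk_estimates.append(np.mean(exposure / effect > 1.0))

print("Median risk:", np.median(risk_estimates))
print("95% uncertainty interval:", np.percentile(risk_estimates, [2.5, 97.5]))
```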

  2. A probabilistic approach to jointly integrate 3D/4D seismic, production data and geological information for building reservoir models

    NASA Astrophysics Data System (ADS)

    Castro, Scarlet A.

    Reservoir modeling aims at understanding the static and dynamic components of the reservoir in order to make decisions about future surface operations. The practice of reservoir modeling calls for the integration of expertise from different disciplines, as well as the integration of a wide variety of data: geological data (core data, well-logs, etc.), production data (fluid rates or volumes, pressure data, etc.), and geophysical data (3D seismic data). Although a single 3D seismic survey is the most common geophysical data available for most reservoirs, a suite of several 3D seismic surveys (4D seismic data) acquired for monitoring production can be available for mature reservoirs. The main contribution of this dissertation is to incorporate 4D seismic data within the reservoir modeling workflow while honoring all other available data. This dissertation proposes two general approaches to include 4D seismic data in the reservoir modeling workflow: the Probabilistic Data Integration (PDI) approach, which consists of modeling the information content of 4D seismic through a spatial probability of facies occurrence; and the Forward Modeling (FM) approach, which consists of matching 4D seismic along with production data. The FM approach requires forward modeling the 4D seismic response, which in turn requires downscaling the flow simulation response. This dissertation introduces a novel dynamic downscaling method that takes into account both static information (a high-resolution permeability field) and dynamic information in the form of coarsened fluxes and saturations (flow simulation on the coarsened grid). The two proposed approaches (PDI and FM) are applied to a prominent field in the North Sea, to model the channel facies of a fluvial reservoir. The PDI approach constrained the reservoir model to the spatial probability of facies occurrence (obtained from a calibration between well-log and 4D seismic data) as well as other static data while satisfactorily history

  3. Modeling development of natural multi-sensory integration using neural self-organisation and probabilistic population codes

    NASA Astrophysics Data System (ADS)

    Bauer, Johannes; Dávila-Chacón, Jorge; Wermter, Stefan

    2015-10-01

    Humans and other animals have been shown to perform near-optimally in multi-sensory integration tasks. Probabilistic population codes (PPCs) have been proposed as a mechanism by which optimal integration can be accomplished. Previous approaches have focussed on how neural networks might produce PPCs from sensory input or perform calculations using them, like combining multiple PPCs. Less attention has been given to the question of how the necessary organisation of neurons can arise and how the required knowledge about the input statistics can be learned. In this paper, we propose a model of learning multi-sensory integration based on an unsupervised learning algorithm in which an artificial neural network learns the noise characteristics of each of its sources of input. Our algorithm borrows from the self-organising map the ability to learn latent-variable models of the input and extends it to learning to produce a PPC approximating a probability density function over the latent variable behind its (noisy) input. The neurons in our network are only required to perform simple calculations and we make few assumptions about input noise properties and tuning functions. We report on a neurorobotic experiment in which we apply our algorithm to multi-sensory integration in a humanoid robot to demonstrate its effectiveness and compare it to human multi-sensory integration on the behavioural level. We also show in simulations that our algorithm performs near-optimally under certain plausible conditions, and that it reproduces important aspects of natural multi-sensory integration on the neural level.

  4. Towards Probabilistic Modelling in Event-B

    NASA Astrophysics Data System (ADS)

    Tarasyuk, Anton; Troubitsyna, Elena; Laibinis, Linas

    Event-B provides us with a powerful framework for correct-by-construction system development. However, while developing dependable systems we should not only guarantee their functional correctness but also quantitatively assess their dependability attributes. In this paper we investigate how to conduct probabilistic assessment of the reliability of control systems modeled in Event-B. We show how to transform an Event-B model into a Markov model amenable to probabilistic reliability analysis. Our approach enables integration of reasoning about correctness with quantitative analysis of reliability.
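
    A minimal sketch of the kind of Markov reliability calculation that the transformed model enables (the states and per-step transition probabilities below are invented; the actual translation from Event-B events to Markov transitions is the subject of the paper):

```python
import numpy as np

# States: 0 = operational, 1 = degraded, 2 = failed (absorbing); invented per-step probabilities
P = np.array([[0.990, 0.008, 0.002],
              [0.000, 0.950, 0.050],
              [0.000, 0.000, 1.000]])

state = np.array([1.0, 0.0, 0.0])    # start operational
for step in range(1, 101):
    state = state @ P
    if step % 25 == 0:
        print(f"step {step}: reliability (not failed) = {1.0 - state[2]:.3f}")
```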

  5. Crevice corrosion & pitting of high-level waste containers: integration of deterministic & probabilistic models

    SciTech Connect

    Farmer, J.C.; McCright, R.D.

    1997-10-01

    A key component of the Engineered Barrier System (EBS) being designed for containment of spent-fuel and high-level waste at the proposed geological repository at Yucca Mountain, Nevada is a two-layer canister. In this particular design, the inner barrier is made of a corrosion resistant material (CRM) such as Alloy 625 or C-22, while the outer barrier is made of a corrosion-allowance material (CAM) such as carbon steel or Monel 400. An integrated predictive model is being developed to account for the effects of localized environmental conditions in the CRM-CAM crevice on the initiation and propagation of pits through the CRM.

  6. Crevice corrosion and pitting of high-level waste containers: Integration of deterministic and probabilistic models

    SciTech Connect

    Farmer, J.C.; McCright, R.D.

    1998-12-31

    A key component of the Engineered Barrier System (EBS) being designed for containment of spent-fuel and high-level waste at the proposed geological repository at Yucca Mountain, Nevada is a two-layer canister. In this particular design, the inner barrier is made of a corrosion resistant material (CRM) such as Alloy 625 or C-22, while the outer barrier is made of a corrosion-allowance material (CAM) such as carbon steel or Alloy 400. An integrated predictive model is being developed to account for the effects of localized environmental conditions in the CRM-CAM crevice on the initiation and propagation of pits through the CRM.

  7. CREVICE CORROSION & PITTING OF HIGH-LEVEL WASTE CONTAINERS: INTEGRATION OF DETERMINISTIC & PROBABILISTIC MODELS

    SciTech Connect

    Farmer, Joseph C.; McCright, R. Daniel

    1997-10-01

    A key component of the Engineered Barrier System (EBS) being designed for containment of spent-fuel and high-level waste at the proposed geological repository at Yucca Mountain, Nevada is a two-layer canister. In this particular design, the inner barrier is made of a corrosion resistant material (CRM) such as Alloy 625 or C-22, while the outer barrier is made of a corrosion-allowance material (CAM) such as carbon steel or Monel 400. An integrated predictive model is being developed to account for the effects of localized environmental conditions in the CRM-CAM crevice on the initiation and propagation of pits through the CRM.

  8. The integration of bioclimatic indices in an objective probabilistic model for establishing and mapping viticulture suitability in a region

    NASA Astrophysics Data System (ADS)

    Moral García, Francisco J.; Rebollo, Francisco J.; Paniagua, Luis L.; García, Abelardo

    2014-05-01

    Different bioclimatic indices have been proposed to determine wine suitability in a region. Some of them are related to air temperature, but the hydric component of climate should also be considered, which, in turn, is influenced by the precipitation during the different stages of the grapevine growing and ripening periods. In this work we propose using the information obtained from 10 bioclimatic indices and variables [heliothermal index (HI), cool night index (CI), dryness index (DI), growing season temperature (GST), the Winkler index (WI), September mean thermal amplitude (MTA), annual precipitation (AP), precipitation during flowering (PDF), precipitation before flowering (PBF), and summer precipitation (SP)] as inputs in an objective and probabilistic model, the Rasch model, with the aim of integrating their individual effects, obtaining climate data that summarize all the main bioclimatic indices which could influence wine suitability, and utilizing the Rasch measures to generate homogeneous climatic zones. The use of the Rasch model to estimate viticultural suitability constitutes a new application of great practical importance, making it possible to rationally determine locations in a region where high viticultural potential exists and to establish a ranking of the bioclimatic indices or variables which exert an important influence on wine suitability in a region. Furthermore, from the measures of viticultural suitability at some locations, estimates can be computed using a geostatistical algorithm, and these estimates can be utilized to map viticultural suitability potential in a region. To illustrate the process, an application to Extremadura, southwestern Spain, is shown. Keywords: Rasch model, bioclimatic indices, GIS.

  9. Integration of climatic indices in an objective probabilistic model for establishing and mapping viticultural climatic zones in a region

    NASA Astrophysics Data System (ADS)

    Moral, Francisco J.; Rebollo, Francisco J.; Paniagua, Luis L.; García, Abelardo; Honorio, Fulgencio

    2016-05-01

    Different climatic indices have been proposed to determine wine suitability in a region. Some of them are related to air temperature, but the hydric component of climate should also be considered, which, in turn, is influenced by the precipitation during the different stages of the grapevine growing and ripening periods. In this study, we propose using the information obtained from ten climatic indices [heliothermal index (HI), cool night index (CI), dryness index (DI), growing season temperature (GST), the Winkler index (WI), September mean thermal amplitude (MTA), annual precipitation (AP), precipitation during flowering (PDF), precipitation before flowering (PBF), and summer precipitation (SP)] as inputs in an objective and probabilistic model, the Rasch model, with the aim of integrating their individual effects, obtaining the climate data that summarize all the main climatic indices which could influence wine suitability from a climate viewpoint, and utilizing the Rasch measures to generate homogeneous climatic zones. The use of the Rasch model to estimate viticultural climatic suitability constitutes a new application of great practical importance, making it possible to rationally determine locations in a region where high viticultural potential exists and to establish a ranking of the climatic indices which exert an important influence on wine suitability in a region. Furthermore, from the measures of viticultural climatic suitability at some locations, estimates can be computed using a geostatistical algorithm, and these estimates can be utilized to map viticultural climatic zones in a region. To illustrate the process, an application to Extremadura, southwestern Spain, is shown.

  10. Probabilistic modeling of earthquakes

    NASA Astrophysics Data System (ADS)

    Duputel, Z.; Jolivet, R.; Jiang, J.; Simons, M.; Rivera, L. A.; Ampuero, J. P.; Gombert, B.; Minson, S. E.

    2015-12-01

    By exploiting increasing amounts of geophysical data we are able to produce increasingly sophisticated fault slip models. Such detailed models, while they are essential ingredients towards better understanding fault mechanical behavior, can only inform us in a meaningful way if we can assign uncertainties to the inferred slip parameters. This talk will present our recent efforts to infer fault slip models with realistic error estimates. Bayesian analysis is a useful tool for this purpose as it handles uncertainty in a natural way. One of the biggest obstacles to significant progress in observational earthquake source modeling arises from imperfect predictions of geodetic and seismic data due to uncertainties in the material parameters and fault geometries used in our forward models, the impact of which is generally overlooked. We recently developed physically based statistics for the model prediction error and showed how to account for inaccuracies in the Earth model elastic parameters. We will present applications of this formalism to recent large earthquakes such as the 2014 Pisagua earthquake. We will also discuss novel approaches to integrate the large amount of information available from GPS, InSAR, tide-gauge, tsunami and seismic data.

  11. Integrated feature and parameter optimization for an evolving spiking neural network: exploring heterogeneous probabilistic models.

    PubMed

    Schliebs, Stefan; Defoin-Platel, Michaël; Worner, Sue; Kasabov, Nikola

    2009-01-01

    This study introduces a quantum-inspired spiking neural network (QiSNN) as an integrated connectionist system, in which the features and parameters of an evolving spiking neural network are optimized together with the use of a quantum-inspired evolutionary algorithm. We propose here a novel optimization method that uses different representations to explore the two search spaces: a binary representation for optimizing feature subsets and a continuous representation for evolving appropriate real-valued configurations of the spiking network. The properties and characteristics of the improved framework are studied on two different synthetic benchmark datasets. Results are compared to traditional methods, namely a multi-layer perceptron and a naïve Bayesian classifier (NBC). A previously used real-world ecological dataset on invasive species establishment prediction is revisited and new results are obtained and analyzed by an ecological expert. The proposed method results in much faster convergence to an optimal solution (or one close to it), better accuracy, and a more informative set of selected features.

  12. Development of Probabilistic Structural Analysis Integrated with Manufacturing Processes

    NASA Technical Reports Server (NTRS)

    Pai, Shantaram S.; Nagpal, Vinod K.

    2007-01-01

    An effort has been initiated to integrate manufacturing process simulations with probabilistic structural analyses in order to capture the important impacts of manufacturing uncertainties on component stress levels and life. Two physics-based manufacturing process models (one for powdered metal forging and the other for annular deformation resistance welding) have been linked to the NESSUS structural analysis code. This paper describes the methodology developed to perform this integration including several examples. Although this effort is still underway, particularly for full integration of a probabilistic analysis, the progress to date has been encouraging and a software interface that implements the methodology has been developed. The purpose of this paper is to report this preliminary development.

  13. Integrating Sequence Evolution into Probabilistic Orthology Analysis.

    PubMed

    Ullah, Ikram; Sjöstrand, Joel; Andersson, Peter; Sennblad, Bengt; Lagergren, Jens

    2015-11-01

    Orthology analysis, that is, finding out whether a pair of homologous genes are orthologs - stemming from a speciation - or paralogs - stemming from a gene duplication - is of central importance in computational biology, genome annotation, and phylogenetic inference. In particular, an orthologous relationship makes functional equivalence of the two genes highly likely. A major approach to orthology analysis is to reconcile a gene tree with the corresponding species tree (most commonly performed using the most parsimonious reconciliation, MPR). However, most such phylogenetic orthology methods infer the gene tree without considering the constraints implied by the species tree and, perhaps even more importantly, only allow the gene sequences to influence the orthology analysis through the a priori reconstructed gene tree. We propose a sound, comprehensive Bayesian Markov chain Monte Carlo-based method, DLRSOrthology, to compute orthology probabilities. It efficiently sums over the possible gene trees and jointly takes into account the current gene tree, all possible reconciliations to the species tree, and the, typically strong, signal conveyed by the sequences. We compare our method with PrIME-GEM, a probabilistic orthology approach built on a probabilistic duplication-loss model, and MrBayesMPR, a probabilistic orthology approach that is based on conventional Bayesian inference coupled with MPR. We find that DLRSOrthology outperforms these competing approaches on synthetic data as well as on biological data sets and is robust to incomplete taxon sampling artifacts.

  14. SHEDS-HT: an integrated probabilistic exposure model for prioritizing exposures to chemicals with near-field and dietary sources.

    PubMed

    Isaacs, Kristin K; Glen, W Graham; Egeghy, Peter; Goldsmith, Michael-Rock; Smith, Luther; Vallero, Daniel; Brooks, Raina; Grulke, Christopher M; Özkaynak, Halûk

    2014-11-04

    United States Environmental Protection Agency (USEPA) researchers are developing a strategy for high-throughput (HT) exposure-based prioritization of chemicals under the ExpoCast program. These novel modeling approaches for evaluating chemicals based on their potential for biologically relevant human exposures will inform toxicity testing and prioritization for chemical risk assessment. Based on probabilistic methods and algorithms developed for The Stochastic Human Exposure and Dose Simulation Model for Multimedia, Multipathway Chemicals (SHEDS-MM), a new mechanistic modeling approach has been developed to accommodate high-throughput (HT) assessment of exposure potential. In this SHEDS-HT model, the residential and dietary modules of SHEDS-MM have been operationally modified to reduce the user burden, input data demands, and run times of the higher-tier model, while maintaining critical features and inputs that influence exposure. The model has been implemented in R; the modeling framework links chemicals to consumer product categories or food groups (and thus exposure scenarios) to predict HT exposures and intake doses. Initially, SHEDS-HT has been applied to 2507 organic chemicals associated with consumer products and agricultural pesticides. These evaluations employ data from recent USEPA efforts to characterize usage (prevalence, frequency, and magnitude), chemical composition, and exposure scenarios for a wide range of consumer products. In modeling indirect exposures from near-field sources, SHEDS-HT employs a fugacity-based module to estimate concentrations in indoor environmental media. The concentration estimates, along with relevant exposure factors and human activity data, are then used by the model to rapidly generate probabilistic population distributions of near-field indirect exposures via dermal, nondietary ingestion, and inhalation pathways. Pathway-specific estimates of near-field direct exposures from consumer products are also modeled

  15. Approaches to implementing deterministic models in a probabilistic framework

    SciTech Connect

    Talbott, D.V.

    1995-04-01

    The increasing use of results from probabilistic risk assessments in the decision-making process makes it ever more important to eliminate simplifications in probabilistic models that might lead to conservative results. One area in which conservative simplifications are often made is modeling the physical interactions that occur during the progression of an accident sequence. This paper demonstrates and compares different approaches for incorporating deterministic models of physical parameters into probabilistic models: parameter range binning, response curves, and integral deterministic models. An example that combines all three approaches in a probabilistic model for the handling of an energetic material (e.g., high explosive, rocket propellant, ...) is then presented using a directed graph model.
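
    Of the three approaches listed, the response-curve idea is the simplest to sketch: fit an inexpensive curve to a handful of deterministic-model runs, then sample that curve inside the probabilistic model. Everything below is illustrative; the "physics" function is a stand-in, not a model from the paper.

```python
import numpy as np

def expensive_deterministic_model(x):
    """Stand-in for a costly physics calculation (e.g. peak temperature vs. input energy)."""
    return 300.0 + 40.0 * x + 3.0 * x ** 2

# Fit a quadratic response curve to a few deterministic runs
design_points = np.linspace(0.0, 10.0, 6)
coeffs = np.polyfit(design_points, expensive_deterministic_model(design_points), deg=2)
response_curve = np.poly1d(coeffs)

# Use the cheap response curve inside the probabilistic model
rng = np.random.default_rng(5)
inputs = rng.normal(5.0, 1.5, 100_000)
outputs = response_curve(inputs)
print("P(output > 600):", np.mean(outputs > 600.0))
```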

  16. Integration of fuzzy analytic hierarchy process and probabilistic dynamic programming in formulating an optimal fleet management model

    NASA Astrophysics Data System (ADS)

    Teoh, Lay Eng; Khoo, Hooi Ling

    2013-09-01

    This study deals with two major aspects of airlines, i.e. supply and demand management. The supply aspect focuses on the mathematical formulation of an optimal fleet management model to maximize the operational profit of the airline, while the demand aspect focuses on the incorporation of mode choice modeling as part of the developed model. The proposed methodology is outlined in two stages: fuzzy Analytic Hierarchy Process is first adopted to capture mode choice modeling in order to quantify the probability of probable phenomena (for the aircraft acquisition/leasing decision). Then, an optimization model is developed as a probabilistic dynamic programming model to determine the optimal number and types of aircraft to be acquired and/or leased in order to meet stochastic demand during the planning horizon. The findings of an illustrative case study show that the proposed methodology is viable. The results demonstrate that the incorporation of mode choice modeling could affect the operational profit and fleet management decision of the airline to varying degrees.

  17. Probabilistic Survivability Versus Time Modeling

    NASA Technical Reports Server (NTRS)

    Joyner, James J., Sr.

    2016-01-01

    This presentation documents Kennedy Space Center's Independent Assessment work completed on three assessments for the Ground Systems Development and Operations (GSDO) Program to assist the Chief Safety and Mission Assurance Officer during key programmatic reviews. The assessments provided the GSDO Program with analyses of how egress time affects the likelihood of astronaut and ground worker survival during an emergency. For each assessment, a team developed probability distributions for hazard scenarios to address statistical uncertainty, resulting in survivability plots over time. The first assessment developed a mathematical model of probabilistic survivability versus time to reach a safe location using an ideal Emergency Egress System at Launch Complex 39B (LC-39B); the second used the first model to evaluate and compare various egress systems under consideration at LC-39B. The third used a modified LC-39B model to determine if a specific hazard decreased survivability more rapidly than other events during flight hardware processing in Kennedy's Vehicle Assembly Building.
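
    The survivability-versus-time idea can be sketched as the probability that egress completes before conditions become unsurvivable; all distributions below are invented placeholders, not the assessment's data.

```python
import numpy as np

rng = np.random.default_rng(9)
n = 100_000
time_to_lethal = rng.lognormal(np.log(12.0), 0.4, n)   # minutes until conditions become unsurvivable

for egress_minutes in (4, 6, 8, 10, 12):
    survivability = np.mean(time_to_lethal > egress_minutes)
    print(f"egress in {egress_minutes:2d} min -> P(survive) = {survivability:.3f}")
```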

  18. Multiclient Identification System Using Adaptive Probabilistic Model

    NASA Astrophysics Data System (ADS)

    Lin, Chin-Teng; Siana, Linda; Shou, Yu-Wen; Yang, Chien-Ting

    2010-12-01

    This paper aims at integrating detection and identification of human faces in a more practical and real-time face recognition system. The proposed face detection system is based on the cascade AdaBoost method to improve precision and robustness under unstable surrounding lighting. Our AdaBoost method adjusts for environmental lighting conditions by histogram lighting normalization and accurately locates face regions by a region-based clustering process. We also address the problem of multi-scale faces by using 12 different scales of search windows and 5 different orientations for each client, in pursuit of multi-view independent face identification. There are two major methodological parts in our face identification system: PCA (principal component analysis) facial feature extraction and the adaptive probabilistic model (APM). Our implemented APM, a weighted combination of simple probabilistic functions, constructs the likelihood functions by means of a probabilistic constraint in the similarity measures. In addition, our proposed method can add a new client online and update the information of registered clients thanks to the constructed APM. The experimental results show the superior performance of our proposed system for both offline and real-time online testing.

  19. Models for Retrieval with Probabilistic Indexing.

    ERIC Educational Resources Information Center

    Fuhr, Norbert

    1989-01-01

    Describes three models for probabilistic indexing, all based on the Darmstadt automatic indexing approach, and presents experimental evaluation results for each. The discussion covers the improved retrieval effectiveness of probabilistic indexing over binary indexing, and suggestions for using this automatic indexing method with free text terms.…

  20. Probabilistic modeling for simulation of aerodynamic uncertainties in propulsion systems

    NASA Technical Reports Server (NTRS)

    Hamed, Awatef

    1989-01-01

    The numerical simulation of the probabilistic aerothermodynamic response of propulsion system components to randomness in their environment was explored. The reusable rocket engine turbopumps were selected as an example because of the severe cryogenic environment in which they operate. The thermal and combustion instabilities, coupled with the engine thrust requirements from start-up to shut-down, lead to randomness in the flow variables and uncertainties in the aerodynamic loading. The probabilistic modeling of the turbopumps' aerodynamic response was accomplished using the panel method coupled with fast probability integration methods. The aerodynamic response, in the form of probabilistic rotor blade and splitter loading, was predicted, and the results are presented for specified flow coefficient and rotor preswirl variances. Possible future applications of the aerothermodynamic probabilistic modeling in engine transient simulation, condition monitoring and engine life prediction are briefly discussed.

  1. Probabilistic Survivability Versus Time Modeling

    NASA Technical Reports Server (NTRS)

    Joyner, James J., Sr.

    2015-01-01

    This technical paper documents Kennedy Space Center's Independent Assessment team work completed on three assessments for the Ground Systems Development and Operations (GSDO) Program to assist the Chief Safety and Mission Assurance Officer (CSO) and GSDO management during key programmatic reviews. The assessments provided the GSDO Program with an analysis of how egress time affects the likelihood of astronaut and worker survival during an emergency. For each assessment, the team developed probability distributions for hazard scenarios to address statistical uncertainty, resulting in survivability plots over time. The first assessment developed a mathematical model of probabilistic survivability versus time to reach a safe location using an ideal Emergency Egress System at Launch Complex 39B (LC-39B); the second used the first model to evaluate and compare various egress systems under consideration at LC-39B. The third used a modified LC-39B model to determine whether a specific hazard decreased survivability more rapidly than other events during flight hardware processing in Kennedy's Vehicle Assembly Building (VAB). Based on the composite survivability versus time graphs from the first two assessments, there was a soft knee in the Figure of Merit graphs at eight minutes (ten minutes after egress was ordered). Thus, the graphs illustrated to the decision makers that the final emergency egress design selected should be capable of transporting the flight crew from the top of LC-39B to a safe location in eight minutes or less. Results for the third assessment were dominated by hazards that were classified as instantaneous in nature (e.g. stacking mishaps) and therefore had no effect on survivability versus time to egress the VAB. VAB emergency scenarios that degraded over time (e.g. fire) produced survivability versus time graphs that were in line with aerospace industry norms.

  2. Probabilistic Modeling of Rosette Formation

    PubMed Central

    Long, Mian; Chen, Juan; Jiang, Ning; Selvaraj, Periasamy; McEver, Rodger P.; Zhu, Cheng

    2006-01-01

    Rosetting, or forming a cell aggregate between a single target nucleated cell and a number of red blood cells (RBCs), is a simple assay for cell adhesion mediated by specific receptor-ligand interaction. For example, rosette formation between sheep RBC and human lymphocytes has been used to differentiate T cells from B cells. Rosetting assay is commonly used to determine the interaction of Fc γ-receptors (FcγR) expressed on inflammatory cells and IgG coated on RBCs. Despite its wide use in measuring cell adhesion, the biophysical parameters of rosette formation have not been well characterized. Here we developed a probabilistic model to describe the distribution of rosette sizes, which is Poissonian. The average rosette size is predicted to be proportional to the apparent two-dimensional binding affinity of the interacting receptor-ligand pair and their site densities. The model has been supported by experiments of rosettes mediated by four molecular interactions: FcγRIII interacting with IgG, T cell receptor and coreceptor CD8 interacting with antigen peptide presented by major histocompatibility molecule, P-selectin interacting with P-selectin glycoprotein ligand 1 (PSGL-1), and L-selectin interacting with PSGL-1. The latter two are structurally similar and are different from the former two. Fitting the model to data enabled us to evaluate the apparent effective two-dimensional binding affinity of the interacting molecular pairs: 7.19 × 10−5 μm4 for FcγRIII-IgG interaction, 4.66 × 10−3 μm4 for P-selectin-PSGL-1 interaction, and 0.94 × 10−3 μm4 for L-selectin-PSGL-1 interaction. These results elucidate the biophysical mechanism of rosette formation and enable it to become a semiquantitative assay that relates the rosette size to the effective affinity for receptor-ligand binding. PMID:16603493
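
    Under the Poisson assumption described above, the expected rosette size scales with the apparent two-dimensional affinity and the receptor/ligand site densities. The snippet below is a hedged illustration of that relation, not code from the study; the site densities are invented placeholders, and only the affinity value is taken from the abstract.

```python
import numpy as np
from scipy.stats import poisson

# Illustrative values (site densities are assumptions)
affinity_um4 = 4.66e-3       # apparent 2D affinity, e.g. P-selectin/PSGL-1 (from the abstract)
receptor_density = 50.0      # sites per um^2 on the nucleated cell (assumption)
ligand_density = 30.0        # sites per um^2 on the RBC (assumption)

# Mean rosette size taken as proportional to affinity and site densities (model assumption)
mean_size = affinity_um4 * receptor_density * ligand_density

# Poisson distribution of rosette sizes with that mean
sizes = np.arange(0, 10)
print("mean rosette size:", mean_size)
print("P(size = k):", np.round(poisson.pmf(sizes, mean_size), 4))
```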

  3. INTEGRATED PROBABILISTIC AND DETERMINISTIC MODELING TECHNIQUES IN ESTIMATING EXPOSURE TO WATER-BORNE CONTAMINANTS: PART 2 PHARMACOKINETIC MODELING

    EPA Science Inventory

    The Total Exposure Model (TEM) uses deterministic and stochastic methods to estimate the exposure of a person performing daily activities of eating, drinking, showering, and bathing. There were 250 time histories generated, by subject with activities, for the three exposure ro...

  5. Modelling structured data with Probabilistic Graphical Models

    NASA Astrophysics Data System (ADS)

    Forbes, F.

    2016-05-01

    Most clustering and classification methods are based on the assumption that the objects to be clustered are independent. However, in more and more modern applications, data are structured in a way that makes this assumption unrealistic and potentially misleading. A typical example that can be viewed as a clustering task is image segmentation, where the objects are the pixels on a regular grid and depend on neighbouring pixels on this grid. Also, when data are geographically located, it is of interest to cluster data with an underlying dependence structure accounting for some spatial localisation. These spatial interactions can be naturally encoded via a graph, not necessarily as regular as a grid. Data sets can then be modelled via Markov random fields and mixture models (e.g. the so-called MRF and Hidden MRF). More generally, probabilistic graphical models are tools that can be used to represent and manipulate data in a structured way while modeling uncertainty. This chapter introduces the basic concepts. The two main classes of probabilistic graphical models are considered: Bayesian networks and Markov networks. The key concept of conditional independence and its link to Markov properties is presented. The main problems that can be solved with such tools are described. Some illustrations associated with practical work are given.
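
    As a concrete, minimal illustration of the hidden-MRF idea for image segmentation, the sketch below uses a Gaussian data term per class and a Potts smoothness prior on the 4-neighbour pixel grid, optimized with iterated conditional modes (ICM). This is a simplified stand-in for the models discussed in the chapter, not code from it; all parameter values are assumptions.

```python
import numpy as np

def icm_segment(image, means, beta=1.0, n_iter=5):
    """Segment a grayscale image with a Gaussian likelihood per class and a
    Potts (smoothness) prior on the 4-neighbour grid, using ICM."""
    labels = np.abs(image[..., None] - means).argmin(-1)  # initial labels: nearest class mean
    H, W = image.shape
    for _ in range(n_iter):
        for i in range(H):
            for j in range(W):
                # Data term: squared error to each class mean
                data = (image[i, j] - means) ** 2
                # Smoothness term: number of disagreeing 4-neighbours for each candidate class
                neigh = [labels[x, y] for x, y in ((i-1, j), (i+1, j), (i, j-1), (i, j+1))
                         if 0 <= x < H and 0 <= y < W]
                smooth = np.array([sum(k != lab for lab in neigh) for k in range(len(means))])
                labels[i, j] = np.argmin(data + beta * smooth)
    return labels

# Tiny synthetic example: two noisy regions
rng = np.random.default_rng(1)
img = np.hstack([np.full((8, 4), 0.2), np.full((8, 4), 0.8)]) + rng.normal(0, 0.15, (8, 8))
print(icm_segment(img, means=np.array([0.2, 0.8]), beta=0.5))
```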

  6. Probabilistic Solar Energetic Particle Models

    NASA Technical Reports Server (NTRS)

    Adams, James H., Jr.; Dietrich, William F.; Xapsos, Michael A.

    2011-01-01

    To plan and design safe and reliable space missions, it is necessary to take into account the effects of the space radiation environment. This is done by setting the goal of achieving safety and reliability with some desired level of confidence. To achieve this goal, a worst-case space radiation environment at the required confidence level must be obtained. Planning and designing then proceed, taking into account the effects of this worst-case environment. The result will be a mission that is reliable against the effects of the space radiation environment at the desired confidence level. In this paper we will describe progress toward developing a model that provides worst-case space radiation environments at user-specified confidence levels. We will present a model for worst-case event-integrated solar proton environments that provides the worst-case differential proton spectrum. This model is based on data from the IMP-8 and GOES spacecraft that provide a database extending from 1974 to the present. We will discuss extending this work to create worst-case models for peak flux and mission-integrated fluence for protons. We will also describe plans for similar models for helium and heavier ions.

  7. iTOUGH2-IFC: An Integrated Flow Code in Support of Nagra's Probabilistic Safety Assessment:--User's Guide and Model Description

    SciTech Connect

    Finsterle, Stefan A.

    2009-01-02

    This document describes the development and use of the Integrated Flow Code (IFC), a numerical code and related model to be used for the simulation of time-dependent, two-phase flow in the near field and geosphere of a gas-generating nuclear waste repository system located in an initially fully water-saturated claystone (Opalinus Clay) in Switzerland. The development of the code and model was supported by the Swiss National Cooperative for the Disposal of Radioactive Waste (Nagra), Wettingen, Switzerland. Gas generation (mainly H{sub 2}, but also CH{sub 4} and CO{sub 2}) may affect repository performance by (1) compromising the engineered barriers through excessive pressure build-up, (2) displacing potentially contaminated pore water, (3) releasing radioactive gases (e.g., those containing {sup 14}C and {sup 3}H), (4) changing hydrogeologic properties of the engineered barrier system and the host rock, and (5) altering the groundwater flow field and thus radionuclide migration paths. The IFC aims at providing water and gas flow fields as the basis for the subsequent radionuclide transport simulations, which are performed by the radionuclide transport code (RTC). The IFC, RTC and a waste-dissolution and near-field transport model (STMAN) are part of the Integrated Radionuclide Release Code (IRRC), which integrates all safety-relevant features, events, and processes (FEPs). The IRRC is embedded into a Probabilistic Safety Assessment (PSA) computational tool that (1) evaluates alternative conceptual models, scenarios, and disruptive events, and (2) performs Monte-Carlo sampling to account for parametric uncertainties. The preliminary probabilistic safety assessment concept and the role of the IFC are visualized in Figure 1. The IFC was developed based on Nagra's PSA concept. Specifically, as many phenomena as possible are to be directly simulated using a (simplified) process model, which is at the core of the IRRC model. Uncertainty evaluation (scenario uncertainty

  8. Championship Tennis as a Probabilistic Modelling Context.

    ERIC Educational Resources Information Center

    Galbraith, Peter

    1996-01-01

    Suggests ways for using data from championship tennis as a means for exploring probabilistic models, especially binomial probability. Examples include the probability of winning a service point and the probability of winning a service game using data from tables and graphs. (AIM)
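
    A worked example of the kind of binomial reasoning involved: assuming a constant, independent probability p of the server winning each point, the probability of winning a service game (including deuce) can be computed in closed form. The function below is an illustrative sketch, not taken from the cited article.

```python
from math import comb

def game_win_prob(p: float) -> float:
    """Probability that the server wins a game, given a constant probability p
    of winning each point (points assumed independent)."""
    q = 1.0 - p
    # Win to 0, 15 or 30: server takes the 4th, 5th or 6th point as their 4th point won
    before_deuce = p**4 * (1 + 4*q + 10*q**2)
    # Reach deuce (3-3 in points), then win two points in a row before losing two
    reach_deuce = comb(6, 3) * p**3 * q**3
    win_from_deuce = p**2 / (p**2 + q**2)
    return before_deuce + reach_deuce * win_from_deuce

for p in (0.5, 0.6, 0.7):
    print(f"point prob {p:.2f} -> game prob {game_win_prob(p):.3f}")
```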

  9. A probabilistic model for binaural sound localization.

    PubMed

    Willert, Volker; Eggert, Julian; Adamy, Jürgen; Stahl, Raphael; Körner, Edgar

    2006-10-01

    This paper proposes a biologically inspired and technically implemented sound localization system to robustly estimate the position of a sound source in the frontal azimuthal half-plane. For localization, binaural cues are extracted using cochleagrams generated by a cochlear model that serve as input to the system. The basic idea of the model is to separately measure interaural time differences and interaural level differences for a number of frequencies and process these measurements as a whole. This leads to two-dimensional frequency versus time-delay representations of binaural cues, so-called activity maps. A probabilistic evaluation is presented to estimate the position of a sound source over time based on these activity maps. Learned reference maps for different azimuthal positions are integrated into the computation to gain time-dependent discrete conditional probabilities. At every timestep these probabilities are combined over frequencies and binaural cues to estimate the sound source position. In addition, they are propagated over time to improve position estimation. This leads to a system that is able to localize audible signals, for example human speech signals, even in reverberating environments.

  10. Probabilistic modeling of children's handwriting

    NASA Astrophysics Data System (ADS)

    Puri, Mukta; Srihari, Sargur N.; Hanson, Lisa

    2013-12-01

    There is little work done in the analysis of children's handwriting, which can be useful in developing automatic evaluation systems and in quantifying handwriting individuality. We consider the statistical analysis of children's handwriting in early grades. Samples of handwriting of children in Grades 2-4 who were taught the Zaner-Bloser style were considered. The commonly occurring word "and", written in cursive style as well as hand-print, was extracted from extended writing. The samples were assigned feature values by human examiners using a truthing tool. The human examiners looked at how the children constructed letter formations in their writing, looking for similarities and differences from the instructions taught in the handwriting copy book. These similarities and differences were measured using a feature space distance measure. Results indicate that the handwriting develops towards more conformity with the class characteristics of the Zaner-Bloser copybook which, with practice, is the expected result. Bayesian networks were learnt from the data to enable answering various probabilistic queries, such as determining students who may continue to produce letter formations as taught during lessons in school, determining the students who will develop a different form and/or a variation of those letter formations, and determining the number of different types of letter formations.

  11. Probabilistic Seismic Risk Model for Western Balkans

    NASA Astrophysics Data System (ADS)

    Stejskal, Vladimir; Lorenzo, Francisco; Pousse, Guillaume; Radovanovic, Slavica; Pekevski, Lazo; Dojcinovski, Dragi; Lokin, Petar; Petronijevic, Mira; Sipka, Vesna

    2010-05-01

    A probabilistic seismic risk model for insurance and reinsurance purposes is presented for an area of the Western Balkans, covering the former Yugoslavia and Albania. This territory experienced many severe earthquakes during past centuries, producing significant damage to many population centres in the region. The highest hazard is related to the External Dinarides, namely to the collision zone of the Adriatic plate. The model is based on a unified catalogue for the region and a seismic source model consisting of more than 30 zones covering all three main structural units - the Southern Alps, the Dinarides and the south-western margin of the Pannonian Basin. A probabilistic methodology using Monte Carlo simulation was applied to generate the hazard component of the model. A unique set of damage functions based on both loss experience and engineering assessments is used to convert the modelled ground motion severity into monetary loss.
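
    A minimal sketch of the Monte Carlo hazard-to-loss chain such a model rests on: sample event counts and magnitudes for a source zone, convert severity to loss with a damage function, and read off loss exceedance probabilities. The rates, magnitude bounds and damage function below are invented placeholders, not values from the Western Balkans model.

```python
import numpy as np

rng = np.random.default_rng(42)
n_years = 10_000  # length of the simulated catalogue (assumption)

# Illustrative source zone: annual rate and truncated Gutenberg-Richter magnitudes (all assumed)
annual_rate, b_value, m_min, m_max = 0.20, 1.0, 5.0, 7.5

def sample_magnitudes(n):
    # Inverse-CDF sampling from a truncated exponential (Gutenberg-Richter) distribution
    u = rng.random(n)
    beta = b_value * np.log(10)
    return m_min - np.log(1 - u * (1 - np.exp(-beta * (m_max - m_min)))) / beta

def loss_ratio(magnitude):
    # Placeholder damage function mapping magnitude to mean loss ratio (assumption)
    return np.clip((magnitude - 5.0) / 3.0, 0, 1) ** 2

annual_losses = np.zeros(n_years)
for year in range(n_years):
    n_events = rng.poisson(annual_rate)
    if n_events:
        annual_losses[year] = loss_ratio(sample_magnitudes(n_events)).sum()

# Loss exceedance: probability that the annual loss exceeds a threshold
for threshold in (0.05, 0.2, 0.5):
    print(f"P(annual loss ratio > {threshold}) = {np.mean(annual_losses > threshold):.4f}")
```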

  12. ISSUES ASSOCIATED WITH PROBABILISTIC FAILURE MODELING OF DIGITAL SYSTEMS

    SciTech Connect

    CHU,T.L.; MARTINEZ-GURIDI,G.; LEHNER,J.; OVERLAND,D.

    2004-09-19

    The current U.S. Nuclear Regulatory Commission (NRC) licensing process for instrumentation and control (I&C) systems is based on deterministic requirements, e.g., single failure criteria, and defense in depth and diversity. Probabilistic considerations can be used as supplements to the deterministic process. The National Research Council has recommended development of methods for estimating failure probabilities of digital systems, including commercial off-the-shelf (COTS) equipment, for use in probabilistic risk assessment (PRA). NRC staff has developed informal qualitative and quantitative requirements for PRA modeling of digital systems. Brookhaven National Laboratory (BNL) has performed a review of the state of the art of the methods and tools that can potentially be used to model digital systems. The objectives of this paper are to summarize the review, discuss the issues associated with probabilistic modeling of digital systems, and identify potential areas of research that would enhance the state of the art toward a satisfactory modeling method that could be integrated with a typical probabilistic risk assessment.

  13. Probabilistic Latent Variable Models as Nonnegative Factorizations

    PubMed Central

    Shashanka, Madhusudana; Raj, Bhiksha; Smaragdis, Paris

    2008-01-01

    This paper presents a family of probabilistic latent variable models that can be used for analysis of nonnegative data. We show that there are strong ties between nonnegative matrix factorization and this family, and provide some straightforward extensions which can help in dealing with shift invariances, higher-order decompositions and sparsity constraints. We argue through these extensions that the use of this approach allows for rapid development of complex statistical models for analyzing nonnegative data. PMID:18509481
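
    The tie between probabilistic latent variable models and nonnegative factorization can be made concrete with probabilistic latent semantic analysis (PLSA): its EM updates factor a count matrix into two stochastic (column-normalized) matrices, which corresponds, up to scaling, to NMF with a Kullback-Leibler objective. The sketch below is a generic PLSA implementation given for illustration, not code from the paper.

```python
import numpy as np

def plsa(X, n_topics, n_iter=100, seed=0):
    """EM for probabilistic latent semantic analysis on a nonnegative count
    matrix X (words x documents)."""
    rng = np.random.default_rng(seed)
    n_w, n_d = X.shape
    p_w_z = rng.random((n_w, n_topics)); p_w_z /= p_w_z.sum(0)   # P(word | topic)
    p_z_d = rng.random((n_topics, n_d)); p_z_d /= p_z_d.sum(0)   # P(topic | doc)
    for _ in range(n_iter):
        # E-step folded into the M-step: ratio of data to current reconstruction
        recon = p_w_z @ p_z_d + 1e-12
        ratio = X / recon
        # M-step: multiplicative updates, factors kept normalized as probabilities
        new_w = p_w_z * (ratio @ p_z_d.T)
        new_d = p_z_d * (p_w_z.T @ ratio)
        p_w_z = new_w / new_w.sum(0, keepdims=True)
        p_z_d = new_d / new_d.sum(0, keepdims=True)
    return p_w_z, p_z_d

X = np.array([[4, 0, 1], [3, 1, 0], [0, 5, 4], [1, 4, 5]], dtype=float)
W, H = plsa(X, n_topics=2)
print(np.round(W, 2))
print(np.round(H, 2))
```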

  14. Probabilistic, meso-scale flood loss modelling

    NASA Astrophysics Data System (ADS)

    Kreibich, Heidi; Botto, Anna; Schröter, Kai; Merz, Bruno

    2016-04-01

    Flood risk analyses are an important basis for decisions on flood risk management and adaptation. However, such analyses are associated with significant uncertainty, even more so if changes in risk due to global change are expected. Although uncertainty analysis and probabilistic approaches have received increased attention during the last years, they are still not standard practice for flood risk assessments, and even less so for flood loss modelling. The state of the art in flood loss modelling is still the use of simple, deterministic approaches like stage-damage functions. Novel probabilistic, multi-variate flood loss models have been developed and validated on the micro-scale using a data-mining approach, namely bagging decision trees (Merz et al. 2013). In this presentation we demonstrate and evaluate the upscaling of the approach to the meso-scale, namely on the basis of land-use units. The model is applied in 19 municipalities which were affected during the 2002 flood by the River Mulde in Saxony, Germany (Botto et al. submitted). The application of bagging decision tree based loss models provides a probability distribution of estimated loss per municipality. Validation is undertaken on the one hand via a comparison with eight deterministic loss models, including stage-damage functions as well as multi-variate models. On the other hand, the results are compared with official loss data provided by the Saxon Relief Bank (SAB). The results show that uncertainties of loss estimation remain high. Thus, the significant advantage of this probabilistic flood loss estimation approach is that it inherently provides quantitative information about the uncertainty of the prediction. References: Merz, B.; Kreibich, H.; Lall, U. (2013): Multi-variate flood damage assessment: a tree-based data-mining approach. NHESS, 13(1), 53-64. Botto A, Kreibich H, Merz B, Schröter K (submitted) Probabilistic, multi-variable flood loss modelling on the meso-scale with BT-FLEMO. Risk Analysis.

  15. SHEDS-HT: An Integrated Probabilistic Exposure Model for Prioritizing Exposures to Chemicals with Near-Field and Dietary Sources

    EPA Science Inventory

    United States Environmental Protection Agency (USEPA) researchers are developing a strategy for high-throughput (HT) exposure-based prioritization of chemicals under the ExpoCast program. These novel modeling approaches for evaluating chemicals based on their potential for biologi...

  17. The loss of ecosystem services due to land degradation. Integration of mechanistic and probabilistic models in an Ethiopian case study

    NASA Astrophysics Data System (ADS)

    Cerretelli, Stefania; Poggio, Laura; Gimona, Alessandro; Peressotti, Alessandro; Black, Helaina

    2017-04-01

    Land and soil degradation are widespread, especially in dry and developing countries such as Ethiopia. Land degradation leads to ecosystem services (ESS) degradation, because it causes the depletion and loss of several soil functions. Ethiopia's farmland faces intense degradation due to deforestation, agricultural land expansion, land overexploitation and overgrazing. In this study we modelled the impact of physical factors on ESS degradation, in particular soil erodibility, carbon storage and nutrient retention, in the Ethiopian Great Rift Valley, northwest of Hawassa. We used models of sediment retention/loss and nutrient retention/loss (from the software suite InVEST) and of carbon storage. To run the models we coupled local soil data (such as soil organic carbon and soil texture) with remote sensing data as input in the parametrization phase, e.g. to derive a land use map and to calculate the aboveground and belowground carbon, the evapotranspiration coefficient and the capacity of vegetation to retain nutrients. We then used spatialised Bayesian Belief Networks (sBBNs) to predict ecosystem services degradation on the basis of the results of the three mechanistic models. The results show i) the importance of mapping ESS degradation while taking into consideration the spatial heterogeneity and the cross-correlations between impacts, ii) the fundamental role of remote sensing data in monitoring and modelling in remote, data-poor areas, and iii) the important role of spatial BBNs in providing spatially explicit measures of risk and uncertainty. This approach could help decision makers identify priority areas for intervention in order to reduce land and ecosystem services degradation.

  18. The probabilistic structure of planetary contamination models

    NASA Technical Reports Server (NTRS)

    Harrison, J. M.; North, W. D.

    1973-01-01

    The analytical basis for planetary quarantine standards and procedures is presented. The hierarchy of planetary quarantine decisions is explained and emphasis is placed on the determination of mission specifications to include sterilization. The influence of the Sagan-Coleman probabilistic model of planetary contamination on current standards and procedures is analyzed. A classical problem in probability theory which provides a close conceptual parallel to the type of dependence present in the contamination problem is presented.

  19. Analytic gain in probabilistic decompression sickness models.

    PubMed

    Howle, Laurens E

    2013-11-01

    Decompression sickness (DCS) is a disease known to be related to inert gas bubble formation originating from gases dissolved in body tissues. Probabilistic DCS models, which employ survival and hazard functions, are optimized by fitting model parameters to experimental dive data. In the work reported here, I develop methods to find the survival function gain parameter analytically, thus removing it from the fitting process. I show that the number of iterations required for model optimization is significantly reduced. The analytic gain method substantially improves the condition number of the Hessian matrix which reduces the model confidence intervals by more than an order of magnitude. Copyright © 2013 Elsevier Ltd. All rights reserved.

  20. Modelling default and likelihood reasoning as probabilistic

    NASA Technical Reports Server (NTRS)

    Buntine, Wray

    1990-01-01

    A probabilistic analysis of plausible reasoning about defaults and about likelihood is presented. 'Likely' and 'by default' are in fact treated as duals in the same sense as 'possibility' and 'necessity'. To model these four forms probabilistically, a logic QDP and its quantitative counterpart DP are derived that allow qualitative and corresponding quantitative reasoning. Consistency and consequence results for subsets of the logics are given that require at most a quadratic number of satisfiability tests in the underlying propositional logic. The quantitative logic shows how to track the propagation error inherent in these reasoning forms. The methodology and sound framework of the system highlights their approximate nature, the dualities, and the need for complementary reasoning about relevance.

  1. Probabilistic model of Kordylewski clouds

    NASA Astrophysics Data System (ADS)

    Salnikova, T. V.; Stepanov, S. Ya.; Shuvalova, A. I.

    2016-05-01

    The problem of determining the phase-space distribution function for a system of non-interacting dust particles is considered for a mathematical model of the cosmic dust Kordylewski clouds: clusters of non-interacting dust particles in the vicinity of the triangular libration points of the Earth-Moon-Particle system, taking into account perturbations from the Sun.

  2. Probabilistic models for feedback systems.

    SciTech Connect

    Grace, Matthew D.; Boggs, Paul T.

    2011-02-01

    In previous work, we developed a Bayesian-based methodology to analyze the reliability of hierarchical systems. The output of the procedure is a statistical distribution of the reliability, thus allowing many questions to be answered. The principal advantage of the approach is that along with an estimate of the reliability, we also can provide statements of confidence in the results. The model is quite general in that it allows general representations of all of the distributions involved, it incorporates prior knowledge into the models, it allows errors in the 'engineered' nodes of a system to be determined by the data, and leads to the ability to determine optimal testing strategies. In this report, we provide the preliminary steps necessary to extend this approach to systems with feedback. Feedback is an essential component of 'complexity' and provides interesting challenges in modeling the time-dependent action of a feedback loop. We provide a mechanism for doing this and analyze a simple case. We then consider some extensions to more interesting examples with local control affecting the entire system. Finally, a discussion of the status of the research is also included.

  3. Bayesian theory of probabilistic forecasting via deterministic hydrologic model

    NASA Astrophysics Data System (ADS)

    Krzysztofowicz, Roman

    1999-09-01

    Rational decision making (for flood warning, navigation, or reservoir systems) requires that the total uncertainty about a hydrologic predictand (such as river stage, discharge, or runoff volume) be quantified in terms of a probability distribution, conditional on all available information and knowledge. Hydrologic knowledge is typically embodied in a deterministic catchment model. Fundamentals are presented of a Bayesian forecasting system (BFS) for producing a probabilistic forecast of a hydrologic predictand via any deterministic catchment model. The BFS decomposes the total uncertainty into input uncertainty and hydrologic uncertainty, which are quantified independently and then integrated into a predictive (Bayes) distribution. This distribution results from a revision of a prior (climatic) distribution, is well calibrated, and has a nonnegative ex ante economic value. The BFS is compared with Monte Carlo simulation and "ensemble forecasting" technique, none of which can alone produce a probabilistic forecast that meets requirements of rational decision making, but each can serve as a component of the BFS.
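
    A minimal Monte Carlo caricature of the decomposition described above: input uncertainty (here, the precipitation forecast) and hydrologic uncertainty (here, the residual error of a toy deterministic catchment model) are sampled separately and then combined into a predictive distribution of the predictand. The toy model and all distributions below are assumptions for illustration, not Krzysztofowicz's actual BFS formulation.

```python
import numpy as np

rng = np.random.default_rng(7)
n_samples = 50_000

def catchment_model(precip_mm):
    """Toy deterministic rainfall-runoff relation (placeholder for a real catchment model)."""
    return 0.6 * precip_mm + 2.0

# Input uncertainty: distribution of the precipitation forecast (assumed lognormal)
precip = rng.lognormal(mean=np.log(30.0), sigma=0.4, size=n_samples)

# Hydrologic uncertainty: residual error of the deterministic model (assumed normal)
model_error = rng.normal(0.0, 3.0, size=n_samples)

# Predictive distribution of the hydrologic predictand (e.g. runoff volume)
runoff = catchment_model(precip) + model_error

print("predictive mean:", runoff.mean())
print("90% predictive interval:",
      np.percentile(runoff, 5), "-", np.percentile(runoff, 95))
```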

  4. Probabilistic risk assessment of the Space Shuttle. Phase 3: A study of the potential of losing the vehicle during nominal operation. Volume 2: Integrated loss of vehicle model

    NASA Technical Reports Server (NTRS)

    Fragola, Joseph R.; Maggio, Gaspare; Frank, Michael V.; Gerez, Luis; Mcfadden, Richard H.; Collins, Erin P.; Ballesio, Jorge; Appignani, Peter L.; Karns, James J.

    1995-01-01

    The application of the probabilistic risk assessment methodology to a Space Shuttle environment, particularly to the potential of losing the Shuttle during nominal operation, is addressed. The different related concerns are identified and combined to determine overall program risks. A fault tree model is used to allocate system probabilities to the subsystem level. The loss of the vehicle due to failure to contain energetic gas and debris or to maintain proper propulsion and configuration is analyzed, along with loss due to Orbiter failure, external tank failure, and landing failure or error.

  5. Towards an integrated probabilistic nowcasting system (En-INCA)

    NASA Astrophysics Data System (ADS)

    Suklitsch, M.; Kann, A.; Bica, B.

    2015-04-01

    Ensemble prediction systems are becoming of more and more interest for various applications. Especially ensemble nowcasting systems are increasingly requested by different end users. In this study we introduce such an integrated probabilistic nowcasting system, En-INCA. In a case study we show the added value and increased skill of the new system and demonstrate the improved performance in comparison with a state-of-the-art LAM-EPS.

  6. Probabilistic Analysis Techniques Applied to Complex Spacecraft Power System Modeling

    NASA Technical Reports Server (NTRS)

    Hojnicki, Jeffrey S.; Rusick, Jeffrey J.

    2005-01-01

    Electric power system performance predictions are critical to spacecraft, such as the International Space Station (ISS), to ensure that sufficient power is available to support all of the spacecraft's power needs. In the case of the ISS power system, analyses to date have been deterministic, meaning that each analysis produces a single-valued result for power capability because of the complexity and large size of the model. As a result, the deterministic ISS analyses did not account for the sensitivity of the power capability to uncertainties in model input variables. Over the last 10 years, the NASA Glenn Research Center has developed advanced, computationally fast, probabilistic analysis techniques and successfully applied them to large (thousands of nodes) complex structural analysis models. These same techniques were recently applied to large, complex ISS power system models. This new application enables probabilistic power analyses that account for input uncertainties and produce results that include variations caused by these uncertainties. Specifically, N&R Engineering, under contract to NASA, integrated these advanced probabilistic techniques with Glenn's internationally recognized ISS power system model, System Power Analysis for Capability Evaluation (SPACE).

  8. Probabilistic computer model of optimal runway turnoffs

    NASA Technical Reports Server (NTRS)

    Schoen, M. L.; Preston, O. W.; Summers, L. G.; Nelson, B. A.; Vanderlinden, L.; Mcreynolds, M. C.

    1985-01-01

    Landing delays are currently a problem at major air carrier airports and many forecasters agree that airport congestion will get worse by the end of the century. It is anticipated that some types of delays can be reduced by an efficient optimal runway exit system allowing the increased approach volumes necessary at congested airports. A computerized Probabilistic Runway Turnoff Model is defined which locates exits and defines path geometry for a selected maximum occupancy time appropriate for each TERPS aircraft category. The model includes an algorithm for lateral ride comfort limits.

  9. En-INCA: Towards an integrated probabilistic nowcasting system

    NASA Astrophysics Data System (ADS)

    Suklitsch, Martin; Stuhl, Barbora; Kann, Alexander; Bica, Benedikt

    2014-05-01

    INCA (Integrated Nowcasting through Comprehensive Analysis), the analysis and nowcasting system operated by ZAMG, is based on blending observations and NWP data. Its performance is extremely high in the nowcasting range. However, uncertainties can be large even in the very short term and limit its practical use. Severe weather conditions are particularly demanding, which is why the quantification of uncertainties and the determination of probabilities of event occurrence add value for various applications. The nowcasting ensemble system En-INCA achieves this by coupling the INCA nowcast with ALADIN-LAEF, the EPS based on the limited-area model ALADIN that has been operated successfully at ZAMG for years. In En-INCA, the nowcasting approach of INCA is blended with different EPS members in order to derive an ensemble of forecasts in the nowcasting range. In addition to NWP-based uncertainties, specific perturbations with respect to the observations, the analysis and the nowcasting techniques are discussed, and the influence of learning from errors in previous nowcasts is shown. En-INCA is a link between INCA and ALADIN-LAEF, merging the advantages of both systems: observation-based nowcasting at very high resolution on the one hand and the uncertainty estimation of a state-of-the-art LAM-EPS on the other. Probabilistic nowcasting products can support various end users, e.g. civil protection agencies and the power industry, in optimizing their decision-making processes.

  10. Opportunities of probabilistic flood loss models

    NASA Astrophysics Data System (ADS)

    Schröter, Kai; Kreibich, Heidi; Lüdtke, Stefan; Vogel, Kristin; Merz, Bruno

    2016-04-01

    Oftentimes, traditional uni-variate damage models such as depth-damage curves fail to reproduce the variability of observed flood damage. However, reliable flood damage models are a prerequisite for the practical usefulness of the model results. Innovative multi-variate probabilistic modelling approaches are promising ways to capture and quantify the uncertainty involved and thus to improve the basis for decision making. In this study we compare the predictive capability of two probabilistic modelling approaches, namely bagging decision trees and Bayesian networks, with traditional stage-damage functions. For model evaluation we use empirical damage data which are available from computer-aided telephone interviews that were compiled after the floods of 2002, 2005, 2006 and 2013 in the Elbe and Danube catchments in Germany. We carry out a split-sample test by sub-setting the damage records. One sub-set is used to derive the models and the remaining records are used to evaluate the predictive performance of the model. Further, we stratify the sample according to catchments, which allows studying model performance in a spatial transfer context. Flood damage estimation is carried out on the scale of individual buildings in terms of relative damage. The predictive performance of the models is assessed in terms of systematic deviations (mean bias), precision (mean absolute error), and the sharpness and reliability of the predictions, the latter represented by the proportion of observations that fall within the 5%-95% quantile predictive interval. The comparison of the uni-variate stage-damage function and the multi-variate model approaches emphasises the importance of quantifying predictive uncertainty. With each explanatory variable, the multi-variate model reveals an additional source of uncertainty. However, the predictive performance in terms of bias, precision and reliability is clearly improved.
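
    A hedged sketch of the bagging-decision-tree idea referred to above: fitting many regression trees on bootstrap resamples of damage records yields an ensemble of predictions per building, i.e. a predictive distribution rather than a single loss value. The synthetic data, feature choices and tree settings below are assumptions for illustration only.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)

# Synthetic example data: water depth (m), flow velocity (m/s), building area (m^2) -> relative damage
n = 500
X = np.column_stack([rng.uniform(0, 3, n), rng.uniform(0, 2, n), rng.uniform(50, 300, n)])
y = np.clip(0.2 * X[:, 0] + 0.05 * X[:, 1] + rng.normal(0, 0.05, n), 0, 1)

# Bagging: fit many trees on bootstrap resamples of the damage records
trees = []
for _ in range(200):
    idx = rng.integers(0, n, n)            # bootstrap sample with replacement
    trees.append(DecisionTreeRegressor(max_depth=4).fit(X[idx], y[idx]))

# The ensemble of per-tree predictions gives a predictive distribution per building
x_new = np.array([[1.5, 0.5, 120.0]])
preds = np.array([t.predict(x_new)[0] for t in trees])
print("mean relative damage:", preds.mean())
print("5%-95% predictive interval:", np.percentile(preds, 5), np.percentile(preds, 95))
```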

  11. A guide to integrating transcriptional regulatory and metabolic networks using PROM (probabilistic regulation of metabolism).

    PubMed

    Simeonidis, Evangelos; Chandrasekaran, Sriram; Price, Nathan D

    2013-01-01

    The integration of transcriptional regulatory and metabolic networks is a crucial step in the process of predicting metabolic behaviors that emerge from either genetic or environmental changes. Here, we present a guide to PROM (probabilistic regulation of metabolism), an automated method for the construction and simulation of integrated metabolic and transcriptional regulatory networks that enables large-scale phenotypic predictions for a wide range of model organisms.
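
    The central quantity in PROM is the probability that a target gene is expressed given the state of its regulator, estimated from binarized expression data and used to scale the flux bound of the associated reaction when simulating a regulator knockout. The sketch below illustrates only that step with synthetic data; the variable names and numbers are assumptions, and it is not the PROM implementation.

```python
import numpy as np

rng = np.random.default_rng(3)

# Binarized expression across 200 samples (1 = expressed): a regulator TF and its target gene
tf_on = rng.integers(0, 2, size=200)
target_on = np.where(tf_on == 1,
                     rng.random(200) < 0.9,   # target usually on when the TF is on
                     rng.random(200) < 0.2).astype(int)

# PROM-style quantity: P(target expressed | TF off), estimated from the binarized data
mask = tf_on == 0
p_target_given_tf_off = target_on[mask].mean() if mask.any() else 1.0

# Scale the flux bound of the reaction catalysed by the target's product when
# simulating a TF knockout (the wild-type bound is an arbitrary placeholder).
v_max_wild_type = 10.0
v_max_tf_knockout = p_target_given_tf_off * v_max_wild_type
print(f"P(target on | TF off) = {p_target_given_tf_off:.2f}")
print(f"knockout flux bound: {v_max_tf_knockout:.2f}")
```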

  12. Integration of Evidence Base into a Probabilistic Risk Assessment

    NASA Technical Reports Server (NTRS)

    Saile, Lyn; Lopez, Vilma; Bickham, Grandin; Kerstman, Eric; FreiredeCarvalho, Mary; Byrne, Vicky; Butler, Douglas; Myers, Jerry; Walton, Marlei

    2011-01-01

    INTRODUCTION: A probabilistic decision support model such as the Integrated Medical Model (IMM) utilizes an immense amount of input data that necessitates a systematic, integrated approach for data collection and management. As a result of this approach, IMM is able to forecast medical events, resource utilization and crew health during space flight. METHODS: Inflight data is the most desirable input for the Integrated Medical Model. Non-attributable inflight data is collected from the Lifetime Surveillance of Astronaut Health study as well as from the engineers, flight surgeons, and astronauts themselves. When inflight data is unavailable, cohort studies, other models and Bayesian analyses are used, in addition to subject matter experts' input on occasion. To determine the quality of evidence of a medical condition, the data source is categorized and assigned a level of evidence from 1-5; the highest level is one. The collected data reside and are managed in a relational SQL database with a web-based interface for data entry and review. The database is also capable of interfacing with outside applications, which expands capabilities within the database itself. Via the public interface, customers can access a formatted Clinical Findings Form (CLiFF) that outlines the model input and evidence base for each medical condition. Changes to the database are tracked using a documented Configuration Management process. DISCUSSION: This strategic approach provides a comprehensive data management plan for IMM. The IMM Database's structure and architecture have proven to support additional usages, as seen in the analysis of resource utilization across medical conditions. In addition, the IMM Database's web-based interface provides a user-friendly format for customers to browse and download the clinical information for medical conditions. It is this type of functionality that will provide Exploratory Medicine Capabilities the evidence base for their medical condition list

  13. Economic Dispatch for Microgrid Containing Electric Vehicles via Probabilistic Modelling

    SciTech Connect

    Yao, Yin; Gao, Wenzhong; Momoh, James; Muljadi, Eduard

    2015-10-06

    In this paper, an economic dispatch model with probabilistic modeling is developed for microgrid. Electric power supply in microgrid consists of conventional power plants and renewable energy power plants, such as wind and solar power plants. Due to the fluctuation of solar and wind plants' output, an empirical probabilistic model is developed to predict their hourly output. According to different characteristics of wind and solar plants, the parameters for probabilistic distribution are further adjusted individually for both power plants. On the other hand, with the growing trend of Plug-in Electric Vehicle (PHEV), an integrated microgrid system must also consider the impact of PHEVs. Not only the charging loads from PHEVs, but also the discharging output via Vehicle to Grid (V2G) method can greatly affect the economic dispatch for all the micro energy sources in microgrid. This paper presents an optimization method for economic dispatch in microgrid considering conventional, renewable power plants, and PHEVs. The simulation results reveal that PHEVs with V2G capability can be an indispensable supplement in modern microgrid.

  14. Probabilistic Modeling of the Renal Stone Formation Module

    NASA Technical Reports Server (NTRS)

    Best, Lauren M.; Myers, Jerry G.; Goodenow, Debra A.; McRae, Michael P.; Jackson, Travis C.

    2013-01-01

    The Integrated Medical Model (IMM) is a probabilistic tool, used in mission planning decision making and medical systems risk assessments. The IMM project maintains a database of over 80 medical conditions that could occur during a spaceflight, documenting an incidence rate and end case scenarios for each. In some cases, where observational data are insufficient to adequately define the inflight medical risk, the IMM utilizes external probabilistic modules to model and estimate the event likelihoods. One such medical event of interest is an unpassed renal stone. Due to a high salt diet and high concentrations of calcium in the blood (due to bone depletion caused by unloading in the microgravity environment) astronauts are at a considerably elevated risk for developing renal calculi (nephrolithiasis) while in space. Lack of observed incidences of nephrolithiasis has led HRP to initiate the development of the Renal Stone Formation Module (RSFM) to create a probabilistic simulator capable of estimating the likelihood of symptomatic renal stone presentation in astronauts on exploration missions. The model consists of two major parts. The first is the probabilistic component, which utilizes probability distributions to assess the range of urine electrolyte parameters and a multivariate regression to transform estimated crystal density and size distributions to the likelihood of the presentation of nephrolithiasis symptoms. The second is a deterministic physical and chemical model of renal stone growth in the kidney developed by Kassemi et al. The probabilistic component of the renal stone model couples the input probability distributions describing the urine chemistry, astronaut physiology, and system parameters with the physical and chemical outputs and inputs to the deterministic stone growth model. These two parts of the model are necessary to capture the uncertainty in the likelihood estimate. The model will be driven by Monte Carlo simulations, continuously

  15. A probabilistic angle on one-loop scalar integrals

    NASA Astrophysics Data System (ADS)

    Benhaddou, Kamel

    2017-06-01

    Recasting the N-point one-loop scalar integral as a probabilistic problem allows the derivation of integral recurrence relations, as well as exact analytical expressions in the most common cases. ε expansions are derived by writing a formula that relates an N-point function in non-integer dimensions to an N-point function in integer dimensions. As an example, we give relations for the massive five-point function in n=4-2ε and n=6-2ε dimensions. The reduction of tensor integrals of rank two with N = 5 is achieved, showing the method's potential. No hypergeometric functions are involved. Results are expressed as integrals of arcsine functions, whose analytical continuation is well known.

  16. A probabilistic NF2 relational algebra for integrated information retrieval and database systems

    SciTech Connect

    Fuhr, N.; Roelleke, T.

    1996-12-31

    The integration of information retrieval (IR) and database systems requires a data model which allows for modelling documents as entities, representing uncertainty and vagueness and performing uncertain inference. For this purpose, we present a probabilistic data model based on relations in non-first-normal-form (NF2). Here, tuples are assigned probabilistic weights giving the probability that a tuple belongs to a relation. Thus, the set of weighted index terms of a document are represented as a probabilistic subrelation. In a similar way, imprecise attribute values are modelled as a set-valued attribute. We redefine the relational operators for this type of relations such that the result of each operator is again a probabilistic NF2 relation, where the weight of a tuple gives the probability that this tuple belongs to the result. By ordering the tuples according to decreasing probabilities, the model yields a ranking of answers like in most IR models. This effect also can be used for typical database queries involving imprecise attribute values as well as for combinations of database and IR queries.
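
    A toy illustration of the probabilistic relational idea described above: tuples carry membership probabilities, the redefined operators propagate those probabilities (here, weights multiply on a join under an independence assumption), and ordering the result by weight yields an IR-style ranking. The relations and attribute names below are invented for illustration and are not taken from the paper.

```python
from itertools import product

# Probabilistic relations: each tuple carries a weight = probability of membership
docs = [({"doc": "d1", "term": "probabilistic"}, 0.9),
        ({"doc": "d1", "term": "retrieval"}, 0.6),
        ({"doc": "d2", "term": "probabilistic"}, 0.4)]
query_terms = [({"term": "probabilistic"}, 1.0),
               ({"term": "retrieval"}, 0.5)]

def select(rel, pred):
    """Selection keeps tuples satisfying the predicate; weights are unchanged."""
    return [(t, w) for t, w in rel if pred(t)]

def join(r, s, on):
    """Natural join; assuming independent tuple events, the weights multiply."""
    out = []
    for (t1, w1), (t2, w2) in product(r, s):
        if all(t1.get(a) == t2.get(a) for a in on):
            out.append(({**t1, **t2}, w1 * w2))
    return out

# Rank documents for the query by the resulting tuple weights
result = sorted(join(docs, query_terms, on=["term"]), key=lambda x: -x[1])
for t, w in result:
    print(t["doc"], t["term"], round(w, 2))
```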

  17. Probabilistic Gompertz model of irreversible growth.

    PubMed

    Bardos, D C

    2005-05-01

    Characterizing organism growth within populations requires the application of well-studied individual size-at-age models, such as the deterministic Gompertz model, to populations of individuals whose characteristics, corresponding to model parameters, may be highly variable. A natural approach is to assign probability distributions to one or more model parameters. In some contexts, size-at-age data may be absent due to difficulties in ageing individuals, but size-increment data may instead be available (e.g., from tag-recapture experiments). A preliminary transformation to a size-increment model is then required. Gompertz models developed along the above lines have recently been applied to strongly heterogeneous abalone tag-recapture data. Although useful in modelling the early growth stages, these models yield size-increment distributions that allow negative growth, which is inappropriate in the case of mollusc shells and other accumulated biological structures (e.g., vertebrae) where growth is irreversible. Here we develop probabilistic Gompertz models where this difficulty is resolved by conditioning parameter distributions on size, allowing application to irreversible growth data. In the case of abalone growth, introduction of a growth-limiting biological length scale is then shown to yield realistic length-increment distributions.
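
    One way to see the conditioning idea: in the Gompertz increment form L' = L_inf * (L / L_inf)^exp(-k*dt), increments stay nonnegative whenever L_inf exceeds the current size, so drawing L_inf from a distribution conditioned on the observed size enforces irreversible growth. The sketch below illustrates this; the distributions and numbers are assumptions, not the abalone model from the paper.

```python
import numpy as np

rng = np.random.default_rng(5)

def gompertz_increment(length_now, dt, k, l_inf):
    """Deterministic Gompertz increment over time dt: L' = L_inf * (L/L_inf)^exp(-k*dt)."""
    return l_inf * (length_now / l_inf) ** np.exp(-k * dt) - length_now

def sample_increments(length_now, dt, n=10_000):
    # Growth-rate parameter k: lognormal across individuals (assumed distribution)
    k = rng.lognormal(mean=np.log(0.4), sigma=0.3, size=n)
    # Asymptotic size L_inf: conditioned on the current size so that L_inf > L,
    # which keeps every simulated increment nonnegative (irreversible growth)
    l_inf = length_now + rng.lognormal(mean=np.log(40.0), sigma=0.4, size=n)
    return gompertz_increment(length_now, dt, k, l_inf)

inc = sample_increments(length_now=80.0, dt=1.0)   # e.g. shell length in mm over one year
print("mean increment:", inc.mean(), "  min increment:", inc.min())
```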

  18. Probabilistic Resilience in Hidden Markov Models

    NASA Astrophysics Data System (ADS)

    Panerati, Jacopo; Beltrame, Giovanni; Schwind, Nicolas; Zeltner, Stefan; Inoue, Katsumi

    2016-05-01

    Originally defined in the context of ecological systems and environmental sciences, resilience has grown to be a property of major interest for the design and analysis of many other complex systems: resilient networks and robotic systems offer the desirable capability of absorbing disruption and transforming in response to external shocks, while still providing the services they were designed for. Starting from an existing formalization of resilience for constraint-based systems, we develop a probabilistic framework based on hidden Markov models. In doing so, we introduce two new important features: stochastic evolution and partial observability. Using our framework, we formalize a methodology for the evaluation of probabilities associated with generic properties, we describe an efficient algorithm for the computation of its essential inference step, and we show that its complexity is comparable to other state-of-the-art inference algorithms.
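
    The essential inference step for such a framework is standard hidden Markov model filtering. As a minimal illustration (not the authors' algorithm), the forward recursion below computes the probability of an observation sequence for a toy two-state system; all matrices are invented examples.

```python
import numpy as np

def forward(prob_init, prob_trans, prob_emit, observations):
    """Forward algorithm: probability of an observation sequence under an HMM.
    prob_init: (S,) initial state distribution; prob_trans: (S, S) transition
    matrix; prob_emit: (S, O) emission matrix; observations: sequence of symbol ids."""
    alpha = prob_init * prob_emit[:, observations[0]]
    for obs in observations[1:]:
        alpha = (alpha @ prob_trans) * prob_emit[:, obs]
    return alpha.sum()

# Toy 2-state system: "nominal" vs "degraded", observed through a noisy health indicator
prob_init = np.array([0.9, 0.1])
prob_trans = np.array([[0.95, 0.05],       # nominal tends to stay nominal
                       [0.20, 0.80]])      # degraded can recover (absorb/transform)
prob_emit = np.array([[0.8, 0.2],          # P(observation symbol | state)
                      [0.3, 0.7]])

obs_seq = [0, 0, 1, 1, 0]                  # observed indicator readings
print("P(observation sequence):", forward(prob_init, prob_trans, prob_emit, obs_seq))
```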

  19. Retinal blood vessels extraction using probabilistic modelling.

    PubMed

    Kaba, Djibril; Wang, Chuang; Li, Yongmin; Salazar-Gonzalez, Ana; Liu, Xiaohui; Serag, Ahmed

    2014-01-01

    The analysis of retinal blood vessels plays an important role in detecting and treating retinal diseases. In this review, we present an automated method to segment blood vessels in fundus retinal images. The proposed method could be used to support a non-intrusive diagnosis in modern ophthalmology for early detection of retinal diseases, treatment evaluation or clinical study. This study combines bias correction and adaptive histogram equalisation to enhance the appearance of the blood vessels. Then the blood vessels are extracted using probabilistic modelling that is optimised by the expectation maximisation algorithm. The method is evaluated on fundus retinal images of the STARE and DRIVE datasets. The experimental results are compared with some recently published methods of retinal blood vessel segmentation. The experimental results show that our method achieved the best overall performance and that it is comparable to the performance of human experts.

  20. A probabilistic graphical model based stochastic input model construction

    SciTech Connect

    Wan, Jiang; Zabaras, Nicholas

    2014-09-01

    Model reduction techniques have been widely used in modeling of high-dimensional stochastic input in uncertainty quantification tasks. However, the probabilistic modeling of random variables projected into reduced-order spaces presents a number of computational challenges. Due to the curse of dimensionality, the underlying dependence relationships between these random variables are difficult to capture. In this work, a probabilistic graphical model based approach is employed to learn the dependence by running a number of conditional independence tests using observation data. Thus a probabilistic model of the joint PDF is obtained and the PDF is factorized into a set of conditional distributions based on the dependence structure of the variables. The estimation of the joint PDF from data is then transformed to estimating conditional distributions under reduced dimensions. To improve the computational efficiency, a polynomial chaos expansion is further applied to represent the random field in terms of a set of standard random variables. This technique is combined with both linear and nonlinear model reduction methods. Numerical examples are presented to demonstrate the accuracy and efficiency of the probabilistic graphical model based stochastic input models. - Highlights: • Data-driven stochastic input models without the assumption of independence of the reduced random variables. • The problem is transformed to a Bayesian network structure learning problem. • Examples are given in flows in random media.

  1. A Practical Probabilistic Graphical Modeling Tool for Weighing ...

    EPA Pesticide Factsheets

    Past weight-of-evidence frameworks for adverse ecological effects have provided soft-scoring procedures for judgments based on the quality and measured attributes of evidence. Here, we provide a flexible probabilistic structure for weighing and integrating lines of evidence for ecological risk determinations. Probabilistic approaches can provide both a quantitative weighing of lines of evidence and methods for evaluating risk and uncertainty. The current modeling structure was developed for propagating uncertainties in measured endpoints and their influence on the plausibility of adverse effects. To illustrate the approach, we apply the model framework to the sediment quality triad using example lines of evidence for sediment chemistry measurements, bioassay results, and in situ infauna diversity of benthic communities in a simplified hypothetical case study. We then combine the three lines of evidence, evaluate sensitivity to the input parameters, and show how uncertainties are propagated and how additional information can be incorporated to rapidly update the probability of impacts. The developed network model can be expanded to accommodate additional lines of evidence, variables and states of importance, and different types of uncertainties in the lines of evidence, including spatial and temporal uncertainties as well as measurement errors. We provide a flexible Bayesian network structure for weighing and integrating lines of evidence for ecological risk determinations.

  2. Statistical appearance models based on probabilistic correspondences.

    PubMed

    Krüger, Julia; Ehrhardt, Jan; Handels, Heinz

    2017-04-01

    Model-based image analysis is indispensable in medical image processing. One key aspect of building statistical shape and appearance models is the determination of one-to-one correspondences in the training data set. At the same time, the identification of these correspondences is the most challenging part of such methods. In our earlier work, we developed an alternative method using correspondence probabilities instead of exact one-to-one correspondences for a statistical shape model (Hufnagel et al., 2008). In this work, a new approach for statistical appearance models without one-to-one correspondences is proposed. A sparse image representation is used to build a model that combines point position and appearance information at the same time. Probabilistic correspondences between the derived multi-dimensional feature vectors are used to omit the need for extensive preprocessing of finding landmarks and correspondences as well as to reduce the dependence of the generated model on the landmark positions. Model generation and model fitting can now be expressed by optimizing a single global criterion derived from a maximum a-posteriori (MAP) approach with respect to model parameters that directly affect both shape and appearance of the considered objects inside the images. The proposed approach describes statistical appearance modeling in a concise and flexible mathematical framework. Besides eliminating the demand for costly correspondence determination, the method allows for additional constraints as topological regularity in the modeling process. In the evaluation the model was applied for segmentation and landmark identification in hand X-ray images. The results demonstrate the feasibility of the model to detect hand contours as well as the positions of the joints between finger bones for unseen test images. Further, we evaluated the model on brain data of stroke patients to show the ability of the proposed model to handle partially corrupted data and to

  3. A Probabilistic Approach to Model Update

    NASA Technical Reports Server (NTRS)

    Horta, Lucas G.; Reaves, Mercedes C.; Voracek, David F.

    2001-01-01

    Finite element models are often developed for load validation, structural certification, response predictions, and to study alternate design concepts. In rare occasions, models developed with a nominal set of parameters agree with experimental data without the need to update parameter values. Today, model updating is generally heuristic and often performed by a skilled analyst with in-depth understanding of the model assumptions. Parameter uncertainties play a key role in understanding the model update problem and therefore probabilistic analysis tools, developed for reliability and risk analysis, may be used to incorporate uncertainty in the analysis. In this work, probability analysis (PA) tools are used to aid the parameter update task using experimental data and some basic knowledge of potential error sources. Discussed here is the first application of PA tools to update parameters of a finite element model for a composite wing structure. Static deflection data at six locations are used to update five parameters. It is shown that while prediction of individual response values may not be matched identically, the system response is significantly improved with moderate changes in parameter values.

  4. A Probabilistic Asteroid Impact Risk Model

    NASA Technical Reports Server (NTRS)

    Mathias, Donovan L.; Wheeler, Lorien F.; Dotson, Jessie L.

    2016-01-01

    Asteroid threat assessment requires the quantification of both the impact likelihood and resulting consequence across the range of possible events. This paper presents a probabilistic asteroid impact risk (PAIR) assessment model developed for this purpose. The model incorporates published impact frequency rates with state-of-the-art consequence assessment tools, applied within a Monte Carlo framework that generates sets of impact scenarios from uncertain parameter distributions. Explicit treatment of atmospheric entry is included to produce energy deposition rates that account for the effects of thermal ablation and object fragmentation. These energy deposition rates are used to model the resulting ground damage, and affected populations are computed for the sampled impact locations. The results for each scenario are aggregated into a distribution of potential outcomes that reflect the range of uncertain impact parameters, population densities, and strike probabilities. As an illustration of the utility of the PAIR model, the results are used to address the question of what minimum size asteroid constitutes a threat to the population. To answer this question, complete distributions of results are combined with a hypothetical risk tolerance posture to provide the minimum size, given sets of initial assumptions. Model outputs demonstrate how such questions can be answered and provide a means for interpreting the effect that input assumptions and uncertainty can have on final risk-based decisions. Model results can be used to prioritize investments to gain knowledge in critical areas or, conversely, to identify areas where additional data has little effect on the metrics of interest.
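
    The Monte Carlo structure of such an assessment can be sketched as follows; the impactor distributions, the crude entry/breakup treatment and the damage threshold below are invented placeholders, not the PAIR model's.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Placeholder impactor distributions (not the published impact-frequency model).
diameter_m = rng.lognormal(mean=np.log(30.0), sigma=0.8, size=n)
velocity_ms = 1e3 * rng.normal(20.0, 4.0, size=n).clip(11.0, 72.0)
density = 2600.0                                            # stony asteroid, kg/m^3

# Kinetic energy in megatons of TNT (1 Mt = 4.184e15 J).
mass = density * np.pi / 6.0 * diameter_m**3
energy_mt = 0.5 * mass * velocity_ms**2 / 4.184e15

# Crude stand-in for atmospheric entry and fragmentation: larger bodies are more
# likely to deliver their energy to the ground; damage assumed above 1 Mt delivered.
reaches_ground = rng.random(n) < np.clip(diameter_m / 100.0, 0.0, 1.0)
damaging = reaches_ground & (energy_mt > 1.0)

print(f"median impact energy: {np.median(energy_mt):.2f} Mt")
print(f"fraction of scenarios with ground damage: {damaging.mean():.2%}")
for lo, hi in [(0, 30), (30, 60), (60, 120), (120, np.inf)]:
    sel = (diameter_m >= lo) & (diameter_m < hi)
    print(f"  {lo:3.0f}-{hi:5.0f} m: damage in {damaging[sel].mean():.2%} of scenarios")
```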

  5. EFFECTS OF CORRELATED PROBABILISTIC EXPOSURE MODEL INPUTS ON SIMULATED RESULTS

    EPA Science Inventory

    In recent years, more probabilistic models have been developed to quantify aggregate human exposures to environmental pollutants. The impact of correlation among inputs in these models is an important issue, which has not been resolved. Obtaining correlated data and implementi...

  6. Crevice corrosion and pitting of high-level waste containers: a first step towards the integration of deterministic and probabilistic models

    SciTech Connect

    Farmer, J. C., LLNL

    1997-07-01

    An integrated predictive model is being developed to account for the effects of localized environmental conditions in crevices on pit initiation and propagation. A deterministic calculation is used to estimate the accumulation of hydrogen ions in the crevice solution due to equilibrium hydrolysis reactions of dissolved metal. Pit initiation and growth within the crevice are dealt with by either a stochastic probability model or an equivalent deterministic model. While the strategy presented here is very promising, the integrated model is not yet ready for accurate quantitative predictions. Empirical expressions for the rate of penetration based upon experimental crevice corrosion data should be used in the interim period, until the integrated model can be refined. Both approaches are discussed.

  7. Model Checking Linear-Time Properties of Probabilistic Systems

    NASA Astrophysics Data System (ADS)

    Baier, Christel; Größer, Marcus; Ciesinski, Frank

    This chapter is about the verification of Markov decision processes (MDPs) which incorporate one of the fundamental models for reasoning about probabilistic and nondeterministic phenomena in reactive systems. MDPs have their roots in the field of operations research and are nowadays used in a wide variety of areas including verification, robotics, planning, controlling, reinforcement learning, economics and semantics of randomized systems. Furthermore, MDPs served as the basis for the introduction of probabilistic automata which are related to weighted automata. We describe the use of MDPs as an operational model for randomized systems, e.g., systems that employ randomized algorithms, multi-agent systems or systems with unreliable components or surroundings. In this context we outline the theory of verifying ω-regular properties of such operational models. As an integral part of this theory we use ω-automata, i.e., finite-state automata over finite alphabets that accept languages of infinite words. Additionally, basic concepts of important reduction techniques are sketched, namely partial order reduction of MDPs and quotient system reduction of the numerical problem that arises in the verification of MDPs. Furthermore we present several undecidability and decidability results for the controller synthesis problem for partially observable MDPs.
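
    The core numerical problem mentioned above, computing maximal reachability probabilities for an MDP, can be sketched with plain value iteration on a toy model; the MDP below is invented purely for illustration.

```python
import numpy as np

# Tiny MDP: states 0..3, goal state 3, trap state 2.
# transitions[s][a] is a list of (next_state, probability) pairs.
transitions = {
    0: {"safe":  [(0, 0.5), (1, 0.5)],
        "risky": [(3, 0.4), (2, 0.6)]},
    1: {"go":    [(3, 0.8), (2, 0.2)]},
    2: {"stay":  [(2, 1.0)]},           # trap
    3: {"stay":  [(3, 1.0)]},           # goal
}
goal = {3}

# Value iteration for Pr^max(reach goal): x(s) = max_a sum_s' P(s,a,s') * x(s').
x = np.array([1.0 if s in goal else 0.0 for s in transitions])
for _ in range(1000):
    x_new = x.copy()
    for s, actions in transitions.items():
        if s in goal:
            continue
        x_new[s] = max(sum(p * x[t] for t, p in succ) for succ in actions.values())
    if np.max(np.abs(x_new - x)) < 1e-12:
        break
    x = x_new

print("maximum reachability probabilities per state:", np.round(x, 4))
```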

  8. A Probabilistic Model for Reducing Medication Errors

    PubMed Central

    Nguyen, Phung Anh; Syed-Abdul, Shabbir; Iqbal, Usman; Hsu, Min-Huei; Huang, Chen-Ling; Li, Hsien-Chang; Clinciu, Daniel Livius; Jian, Wen-Shan; Li, Yu-Chuan Jack

    2013-01-01

    Background Medication errors are common, life threatening, costly but preventable. Information technology and automated systems are highly efficient for preventing medication errors and therefore widely employed in hospital settings. The aim of this study was to construct a probabilistic model that can reduce medication errors by identifying uncommon or rare associations between medications and diseases. Methods and Finding(s) Association rule mining techniques were applied to 103.5 million prescriptions from Taiwan's National Health Insurance database. The dataset included 204.5 million diagnoses coded with ICD9-CM and 347.7 million medications coded with ATC. Disease-Medication (DM) and Medication-Medication (MM) associations were computed from their co-occurrence, and association strengths were measured by the interestingness or lift values, referred to as Q values. The DMQs and MMQs were used to develop the AOP model to predict the appropriateness of a given prescription. Validation of this model was done by comparing the results of evaluation performed by the AOP model and verified by human experts. The results showed 96% accuracy for appropriate and 45% accuracy for inappropriate prescriptions, with a sensitivity and specificity of 75.9% and 89.5%, respectively. Conclusions We successfully developed the AOP model as an efficient tool for automatic identification of uncommon or rare disease-medication and medication-medication associations in prescriptions. The AOP model helps to reduce medication errors by alerting physicians, improving patients' safety and the overall quality of care. PMID:24312659
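
    A sketch of the lift ("Q value") computation on a toy prescription set; the codes and counts below are invented, and the real model operates on hundreds of millions of records.

```python
from collections import Counter
from itertools import product

# Toy prescription records: (set of ICD-9 diagnoses, set of ATC medication codes).
prescriptions = [
    ({"250.00"}, {"A10BA02"}),            # diabetes -> metformin
    ({"250.00"}, {"A10BA02", "C10AA01"}),
    ({"401.9"},  {"C09AA02"}),            # hypertension -> enalapril
    ({"401.9"},  {"C09AA02", "C10AA01"}),
    ({"250.00"}, {"C09AA02"}),
]
n = len(prescriptions)

dx_count, rx_count, pair_count = Counter(), Counter(), Counter()
for dxs, rxs in prescriptions:
    dx_count.update(dxs)
    rx_count.update(rxs)
    pair_count.update(product(dxs, rxs))

# Lift ("Q value"): P(dx, rx) / (P(dx) * P(rx)); low values flag unusual combinations.
for (dx, rx), c in sorted(pair_count.items()):
    lift = (c / n) / ((dx_count[dx] / n) * (rx_count[rx] / n))
    print(f"DMQ({dx}, {rx}) = {lift:.2f}")
```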

  9. Probabilistic statistical modeling of air pollution from vehicles

    NASA Astrophysics Data System (ADS)

    Adikanova, Saltanat; Malgazhdarov, Yerzhan A.; Madiyarov, Muratkan N.; Temirbekov, Nurlan M.

    2017-09-01

    The aim of the work is to create a probabilistic-statistical mathematical model for the distribution of emissions from vehicles. In this article, it is proposed to use the probabilistic and statistical approach for modeling the distribution of harmful impurities in the atmosphere from vehicles using the example of the Ust-Kamenogorsk city. Using a simplified methodology of stochastic modeling, it is possible to construct effective numerical computational algorithms that significantly reduce the amount of computation without losing their accuracy.

  10. Probabilistic constitutive relationships for cyclic material strength models

    NASA Technical Reports Server (NTRS)

    Boyce, L.; Chamis, C. C.

    1988-01-01

    A methodology is developed that provides a probabilistic treatment for the lifetime of structural components of aerospace propulsion systems subjected to fatigue. Material strength degradation models, based on primitive variables, include both a fatigue strength reduction model and a fatigue crack growth model. Probabilistic analysis is based on simulation, and both maximum entropy and maximum penalized likelihood methods are used for the generation of probability density functions. The resulting constitutive relationships are included in several computer programs.

  11. An integrated GIS-based interval-probabilistic programming model for land-use planning management under uncertainty--a case study at Suzhou, China.

    PubMed

    Lu, Shasha; Zhou, Min; Guan, Xingliang; Tao, Lizao

    2015-03-01

    A large number of mathematical models have been developed for supporting optimization of land-use allocation; however, few of them simultaneously consider land suitability (e.g., physical features and spatial information) and various uncertainties existing in many factors (e.g., land availabilities, land demands, land-use patterns, and ecological requirements). This paper incorporates geographic information system (GIS) technology into interval-probabilistic programming (IPP) for land-use planning management (IPP-LUPM). GIS is utilized to assemble data for the aggregated land-use alternatives, and IPP is developed for tackling uncertainties presented as discrete intervals and probability distribution. Based on GIS, the suitability maps of different land users are provided by the outcomes of land suitability assessment and spatial analysis. The maximum area of every type of land use obtained from the suitability maps, as well as various objectives/constraints (i.e., land supply, land demand of socioeconomic development, future development strategies, and environmental capacity), is used as input data for the optimization of land-use areas with IPP-LUPM model. The proposed model not only considers the outcomes of land suitability evaluation (i.e., topography, ground conditions, hydrology, and spatial location) but also involves economic factors, food security, and eco-environmental constraints, which can effectively reflect various interrelations among different aspects in a land-use planning management system. The case study results at Suzhou, China, demonstrate that the model can help to examine the reliability of satisfying (or risk of violating) system constraints under uncertainty. Moreover, it may identify the quantitative relationship between land suitability and system benefits. Willingness to arrange the land areas based on the condition of highly suitable land will not only reduce the potential conflicts on the environmental system but also lead to a lower

  12. A PROBABILISTIC MODELING FRAMEWORK FOR PREDICTING POPULATION EXPOSURES TO BENZENE

    EPA Science Inventory

    The US Environmental Protection Agency (EPA) is modifying their probabilistic Stochastic Human Exposure Dose Simulation (SHEDS) model to assess aggregate exposures to air toxics. Air toxics include urban Hazardous Air Pollutants (HAPS) such as benzene from mobile sources, part...

  13. Probabilistic Fatigue Damage Prognosis Using a Surrogate Model Trained Via 3D Finite Element Analysis

    NASA Technical Reports Server (NTRS)

    Leser, Patrick E.; Hochhalter, Jacob D.; Newman, John A.; Leser, William P.; Warner, James E.; Wawrzynek, Paul A.; Yuan, Fuh-Gwo

    2015-01-01

    Utilizing inverse uncertainty quantification techniques, structural health monitoring can be integrated with damage progression models to form probabilistic predictions of a structure's remaining useful life. However, damage evolution in realistic structures is physically complex. Accurately representing this behavior requires high-fidelity models which are typically computationally prohibitive. In the present work, a high-fidelity finite element model is represented by a surrogate model, reducing computation times. The new approach is used with damage diagnosis data to form a probabilistic prediction of remaining useful life for a test specimen under mixed-mode conditions.
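
    The surrogate idea can be sketched with a one-dimensional Paris-law crack-growth stand-in for the expensive finite element model and a quadratic response surface; all parameter values and the surrogate form are illustrative assumptions, not those of the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def cycles_to_failure(C, a0, a_crit=0.02, dsigma=100.0, m=3.0, n_steps=2000):
    """Expensive-model stand-in: integrate Paris-law growth da/dN = C*(dK)^m."""
    a = np.linspace(a0, a_crit, n_steps)
    dK = dsigma * np.sqrt(np.pi * a)                     # MPa*sqrt(m)
    dN_da = 1.0 / (C * dK**m)                            # cycles per metre of growth
    return np.sum(0.5 * (dN_da[1:] + dN_da[:-1]) * np.diff(a))

# Small training set over (log10 C, a0), then a quadratic surrogate of log10(life).
logC_tr = rng.uniform(-12.5, -11.5, 60)
a0_tr = rng.uniform(0.001, 0.005, 60)
life_tr = np.array([cycles_to_failure(10.0**lc, a0) for lc, a0 in zip(logC_tr, a0_tr)])

def design(logC, a0):
    return np.column_stack([np.ones_like(logC), logC, a0,
                            logC**2, a0**2, logC * a0])

beta, *_ = np.linalg.lstsq(design(logC_tr, a0_tr), np.log10(life_tr), rcond=None)

# Probabilistic prognosis: push uncertain (C, a0) through the cheap surrogate.
logC_mc = rng.normal(-12.0, 0.15, 50_000)
a0_mc = rng.normal(0.003, 0.0005, 50_000).clip(0.001, 0.005)
life_mc = 10.0 ** (design(logC_mc, a0_mc) @ beta)
print(f"median life: {np.median(life_mc):.3e} cycles")
print(f"5th-percentile (conservative) life: {np.percentile(life_mc, 5):.3e} cycles")
```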

  14. Integrated Campaign Probabilistic Cost, Schedule, Performance, and Value for Program Office Support

    NASA Technical Reports Server (NTRS)

    Cornelius, David; Sasamoto, Washito; Daugherty, Kevin; Deacon, Shaun

    2012-01-01

    This paper describes an integrated assessment tool developed at NASA Langley Research Center that incorporates probabilistic analysis of life cycle cost, schedule, launch performance, on-orbit performance, and value across a series of planned space-based missions, or campaign. Originally designed as an aid in planning the execution of missions to accomplish the National Research Council 2007 Earth Science Decadal Survey, it utilizes Monte Carlo simulation of a series of space missions for assessment of resource requirements and expected return on investment. Interactions between simulated missions are incorporated, such as competition for launch site manifest, to capture unexpected and non-linear system behaviors. A novel value model is utilized to provide an assessment of the probabilistic return on investment. A demonstration case is discussed to illustrate the tool utility.

  15. Probabilistic delay differential equation modeling of event-related potentials.

    PubMed

    Ostwald, Dirk; Starke, Ludger

    2016-08-01

    "Dynamic causal models" (DCMs) are a promising approach in the analysis of functional neuroimaging data due to their biophysical interpretability and their consolidation of functional-segregative and functional-integrative propositions. In this theoretical note we are concerned with the DCM framework for electroencephalographically recorded event-related potentials (ERP-DCM). Intuitively, ERP-DCM combines deterministic dynamical neural mass models with dipole-based EEG forward models to describe the event-related scalp potential time-series over the entire electrode space. Since its inception, ERP-DCM has been successfully employed to capture the neural underpinnings of a wide range of neurocognitive phenomena. However, in spite of its empirical popularity, the technical literature on ERP-DCM remains somewhat patchy. A number of previous communications have detailed certain aspects of the approach, but no unified and coherent documentation exists. With this technical note, we aim to close this gap and to increase the technical accessibility of ERP-DCM. Specifically, this note makes the following novel contributions: firstly, we provide a unified and coherent review of the mathematical machinery of the latent and forward models constituting ERP-DCM by formulating the approach as a probabilistic latent delay differential equation model. Secondly, we emphasize the probabilistic nature of the model and its variational Bayesian inversion scheme by explicitly deriving the variational free energy function in terms of both the likelihood expectation and variance parameters. Thirdly, we detail and validate the estimation of the model with a special focus on the explicit form of the variational free energy function and introduce a conventional nonlinear optimization scheme for its maximization. Finally, we identify and discuss a number of computational issues which may be addressed in the future development of the approach.

  16. Verification of recursive probabilistic integration (RPI) method for fatigue life management using non-destructive inspections

    NASA Astrophysics Data System (ADS)

    Chen, Tzikang J.; Shiao, Michael

    2016-04-01

    This paper verified a generic and efficient assessment concept for probabilistic fatigue life management. The concept is developed based on an integration of damage tolerance methodology, simulation methods [1, 2], and a probabilistic algorithm, RPI (recursive probability integration) [3-9], considering maintenance for damage tolerance and risk-based fatigue life management. RPI is an efficient semi-analytical probabilistic method for risk assessment subject to various uncertainties, such as the variability in material properties including crack growth rate, initial flaw size, repair quality, random process modeling of flight loads for failure analysis, and inspection reliability represented by probability of detection (POD). In addition, unlike traditional Monte Carlo simulation (MCS), which requires a rerun of MCS when the maintenance plan is changed, RPI can repeatedly use a small set of baseline random crack growth histories, excluding maintenance-related parameters, from a single MCS for various maintenance plans. In order to fully appreciate the RPI method, a verification procedure was performed. In this study, MC simulations on the order of several hundred billion runs were conducted for various flight conditions, material properties, inspection schedules, POD and repair/replacement strategies. Since MC simulation is time-consuming, the simulations were conducted in parallel on DoD High Performance Computing (HPC) systems using a specialized random number generator for parallel computing. The study has shown that the RPI method is several orders of magnitude more efficient than traditional Monte Carlo simulation.
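
    For orientation, the brute-force Monte Carlo baseline that RPI is designed to avoid rerunning can be sketched as below; the growth law, load level, POD curve and inspection schedule are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(7)

def pod(a, a50=4e-3, beta=4.0):
    """Illustrative log-logistic probability-of-detection curve (crack size a in metres)."""
    return 1.0 / (1.0 + (a50 / np.maximum(a, 1e-9)) ** beta)

n = 10_000                                   # Monte Carlo samples (the verification study used far more)
n_cycles, insp_every, a_crit = 10_000, 2_500, 0.025

a = rng.lognormal(np.log(2e-3), 0.3, n)      # initial flaw sizes (m)
C = 10.0 ** rng.normal(-9.8, 0.2, n)         # Paris-law coefficients (toy values)
failed = np.zeros(n, dtype=bool)

for cycle in range(1, n_cycles + 1):
    dK = 150.0 * np.sqrt(np.pi * a)          # constant-amplitude load proxy (MPa*sqrt(m))
    a = np.where(failed, a, a + C * dK**3)   # grow cracks only in surviving structures
    failed |= a >= a_crit
    if cycle % insp_every == 0:              # scheduled non-destructive inspection
        found = (~failed) & (rng.random(n) < pod(a))
        a[found] = rng.lognormal(np.log(2e-3), 0.3, found.sum())   # repair resets flaw size

print(f"probability of failure within {n_cycles} cycles: {failed.mean():.4f}")
```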

  1. A probabilistic model of brittle crack formation

    NASA Technical Reports Server (NTRS)

    Chudnovsky, A.; Kunin, B.

    1987-01-01

    Probability of a brittle crack formation in an elastic solid with fluctuating strength is considered. A set Omega of all possible crack trajectories reflecting the fluctuation of the strength field is introduced. The probability P(X) that crack penetration depth exceeds X is expressed as a functional integral over Omega of a conditional probability of the same event taking place along a particular path. Various techniques are considered to evaluate the integral. Under rather nonrestrictive assumptions, the integral is reduced to solving a diffusion-type equation. A new characteristic of fracture process, 'crack diffusion coefficient', is introduced. An illustrative example is then considered where the integration is reduced to solving an ordinary differential equation. The effect of the crack diffusion coefficient and of the magnitude of strength fluctuations on probability density of crack penetration depth is presented. Practical implications of the proposed model are discussed.

  2. Application of a stochastic snowmelt model for probabilistic decisionmaking

    NASA Technical Reports Server (NTRS)

    Mccuen, R. H.

    1983-01-01

    A stochastic form of the snowmelt runoff model that can be used for probabilistic decision-making was developed. The use of probabilistic streamflow predictions instead of single valued deterministic predictions leads to greater accuracy in decisions. While the accuracy of the output function is important in decisionmaking, it is also important to understand the relative importance of the coefficients. Therefore, a sensitivity analysis was made for each of the coefficients.

  3. A probabilistic model for snow avalanche occurrence

    NASA Astrophysics Data System (ADS)

    Perona, P.; Miescher, A.; Porporato, A.

    2009-04-01

    Avalanche hazard forecasting is an important issue in relation to the protection of urbanized environments, ski resorts and ski-touring alpinists. A critical point is to predict the conditions that trigger the snow mass instability determining the onset and the size of avalanches. On steep terrain the risk of avalanches is known to be related to preceding consistent snowfall events and to subsequent changes in the local climatic conditions. Regression analysis has shown that avalanche occurrence indeed correlates with the amount of snow fallen over three consecutive snowing days and with the state of the settled snow at the ground. Moreover, since different types of avalanches may occur as a result of the interactions of different factors, the process of snow avalanche formation is inherently complex and carries some degree of unpredictability. For this reason, although several models assess the risk of avalanche by accounting for all the involved processes in great detail, a high margin of uncertainty invariably remains. In this work, we explicitly describe such unpredictable behaviour with an intrinsic noise affecting the processes leading to snow instability. Eventually, this sets the basis for a minimalist stochastic model, which allows us to investigate the avalanche dynamics and its statistical properties. We employ a continuous time process with stochastic jumps (snowfalls), deterministic decay (snowmelt and compaction) and state dependent avalanche occurrence (renewals) as a minimalist model for the determination of avalanche size and related intertime occurrence. The physics leading to avalanches is simplified to the extent where only meteorological data and terrain data are necessary to estimate avalanche danger. We explore the analytical formulation of the process and the properties of the probability density function of the avalanche process variables. We also discuss the probabilistic link between avalanche size and preceding snowfall event and
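
    A minimal simulation of the jump process described above (stochastic snowfall jumps, deterministic decay, state-dependent release); the rate, jump size, decay and threshold values are invented and not calibrated to any site.

```python
import numpy as np

rng = np.random.default_rng(3)

rate = 0.3          # snowfall events per day (Poisson arrivals)
mean_jump = 0.25    # mean snowfall depth per event (m, exponential)
decay = 0.02        # settling/melt rate per day (exponential decay of snow depth)
threshold = 1.0     # depth triggering an avalanche (m)
t_end = 5 * 180.0   # five winter seasons (days)

t, depth = 0.0, 0.0
avalanche_sizes, intertimes, last_event = [], [], 0.0
while t < t_end:
    dt = rng.exponential(1.0 / rate)        # waiting time to the next snowfall
    depth *= np.exp(-decay * dt)            # deterministic decay between jumps
    depth += rng.exponential(mean_jump)     # stochastic snowfall jump
    t += dt
    if depth > threshold:                   # state-dependent release (renewal)
        avalanche_sizes.append(depth)
        intertimes.append(t - last_event)
        last_event, depth = t, 0.0

print(f"{len(avalanche_sizes)} avalanches in {t_end:.0f} days")
print(f"mean size: {np.mean(avalanche_sizes):.2f} m, "
      f"mean inter-occurrence time: {np.mean(intertimes):.1f} days")
```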

  4. Maritime Threat Detection Using Probabilistic Graphical Models

    DTIC Science & Technology

    2012-01-01

    CRF, unlike an HMM, can represent local features, and does not require feature concatenation. MLNs: For MLNs, we used Alchemy (Alchemy 2011), an ... open source statistical relational learning and probabilistic inferencing package. Alchemy supports generative and discriminative weight learning, and ... that Alchemy creates a new formula for every possible combination of the values for a1 and a2 that fit the type specified in their predicate

  5. Probabilistic Modeling of Space Shuttle Debris Impact

    NASA Technical Reports Server (NTRS)

    Huyse, Luc J.; Asce, M.; Waldhart, Chris J.; Riha, David S.; Larsen, Curtis E.; Gomez, Reynaldo J.; Stuart, Phillip C.

    2007-01-01

    On Feb 1, 2003, the Shuttle Columbia was lost during its return to Earth. As a result of the conclusion that debris impact caused the damage to the left wing of the Columbia Space Shuttle Vehicle (SSV) during ascent, the Columbia Accident Investigation Board recommended that an assessment be performed of the debris environment experienced by the SSV during ascent. A flight rationale based on probabilistic assessment is used for the SSV return-to-flight. The assessment entails identifying all potential debris sources, their probable geometric and aerodynamic characteristics, and their potential for impacting and damaging critical Shuttle components. A probabilistic analysis tool, based on the SwRI-developed NESSUS probabilistic analysis software, predicts the probability of impact and damage to the space shuttle wing leading edge and thermal protection system components. Among other parameters, the likelihood of unacceptable damage depends on the time of release (Mach number of the orbiter) and the divot mass as well as the impact velocity and impact angle. A typical result is visualized in the figures below. Probability of impact and damage, as well as the sensitivities thereof with respect to the distribution assumptions, can be computed and visualized at each point on the orbiter or summarized per wing panel or tile zone.

  6. Probabilistic language models in cognitive neuroscience: Promises and pitfalls.

    PubMed

    Armeni, Kristijan; Willems, Roel M; Frank, Stefan L

    2017-09-05

    Cognitive neuroscientists of language comprehension study how neural computations relate to cognitive computations during comprehension. On the cognitive part of the equation, it is important that the computations and processing complexity are explicitly defined. Probabilistic language models can be used to give a computationally explicit account of language complexity during comprehension. Whereas such models have so far predominantly been evaluated against behavioral data, only recently have the models been used to explain neurobiological signals. Measures obtained from these models emphasize the probabilistic, information-processing view of language understanding and provide a set of tools that can be used for testing neural hypotheses about language comprehension. Here, we provide a cursory review of the theoretical foundations and example neuroimaging studies employing probabilistic language models. We highlight the advantages and potential pitfalls of this approach and indicate avenues for future research. Copyright © 2017 Elsevier Ltd. All rights reserved.
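
    A sketch of the kind of word-level measure such studies link to neural signals: surprisal under a toy bigram model with add-one smoothing (the corpus and test sentence are invented; real studies use far larger language models).

```python
import math
from collections import Counter

corpus = ("the model predicts the signal and the model explains the data "
          "the brain predicts the signal").split()

# Bigram counts with add-one (Laplace) smoothing over the observed vocabulary.
vocab = sorted(set(corpus))
unigrams = Counter(corpus)
bigrams = Counter(zip(corpus, corpus[1:]))

def surprisal(prev, word):
    """Word surprisal in bits: -log2 P(word | prev)."""
    p = (bigrams[(prev, word)] + 1) / (unigrams[prev] + len(vocab))
    return -math.log2(p)

sentence = "the model predicts the data".split()
for prev, word in zip(sentence, sentence[1:]):
    print(f"surprisal({word!r} | {prev!r}) = {surprisal(prev, word):.2f} bits")
```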

  7. An integrated approach coupling physically based models and probabilistic method to assess quantitatively landslide susceptibility at different scale: application to different geomorphological environments

    NASA Astrophysics Data System (ADS)

    Vandromme, Rosalie; Thiéry, Yannick; Sedan, Olivier; Bernardie, Séverine

    2016-04-01

    Landslide hazard assessment is the estimation of a target area where landslides of a particular type, volume, runout and intensity may occur within a given period. The first step in analyzing landslide hazard consists of assessing the spatial and temporal failure probability (when the information is available, i.e. susceptibility assessment). Two types of approach are generally recommended to achieve this goal: (i) qualitative approaches (i.e. inventory-based methods and knowledge-driven methods) and (ii) quantitative approaches (i.e. data-driven methods or deterministic physically based methods). Among quantitative approaches, deterministic physically based methods (PBM) are generally used at local and/or site-specific scales (1:5,000-1:25,000 and >1:5,000, respectively). The main advantage of these methods is the calculation of the probability of failure (safety factor) under specific environmental conditions. For some models it is possible to integrate land use and climatic change. Conversely, the major drawbacks are the large amounts of reliable and detailed data required (especially material types, their thickness and the heterogeneity of geotechnical parameters over a large area) and the fact that only shallow landslides are taken into account. This is why they are often used at site-specific scales (>1:5,000). Thus, to take into account (i) material heterogeneity, (ii) the spatial variation of physical parameters and (iii) different landslide types, the French Geological Survey (i.e. BRGM) has developed a physically based model (PBM) implemented in a GIS environment. This PBM couples a global hydrological model (GARDENIA®) including a transient unsaturated/saturated hydrological component with a physically based model computing the stability of slopes (ALICE®, Assessment of Landslides Induced by Climatic Events) based on the Morgenstern-Price method for any slip surface. The variability of mechanical parameters is handled by a Monte Carlo approach. The

  8. A toolkit for integrated deterministic and probabilistic assessment for hydrogen infrastructure.

    SciTech Connect

    Groth, Katrina M.; Tchouvelev, Andrei V.

    2014-03-01

    There has been increasing interest in using Quantitative Risk Assessment [QRA] to help improve the safety of hydrogen infrastructure and applications. Hydrogen infrastructure for transportation (e.g. fueling fuel cell vehicles) or stationary (e.g. back-up power) applications is a relatively new area for application of QRA vs. traditional industrial production and use, and as a result there are few tools designed to enable QRA for this emerging sector. There are few existing QRA tools containing models that have been developed and validated for use in small-scale hydrogen applications. However, in the past several years, there has been significant progress in developing and validating deterministic physical and engineering models for hydrogen dispersion, ignition, and flame behavior. In parallel, there has been progress in developing defensible probabilistic models for the occurrence of events such as hydrogen release and ignition. While models and data are available, using this information is difficult due to a lack of readily available tools for integrating deterministic and probabilistic components into a single analysis framework. This paper discusses the first steps in building an integrated toolkit for performing QRA on hydrogen transportation technologies and suggests directions for extending the toolkit.

  9. Bayesian non-parametrics and the probabilistic approach to modelling

    PubMed Central

    Ghahramani, Zoubin

    2013-01-01

    Modelling is fundamental to many fields of science and engineering. A model can be thought of as a representation of possible data one could predict from a system. The probabilistic approach to modelling uses probability theory to express all aspects of uncertainty in the model. The probabilistic approach is synonymous with Bayesian modelling, which simply uses the rules of probability theory in order to make predictions, compare alternative models, and learn model parameters and structure from data. This simple and elegant framework is most powerful when coupled with flexible probabilistic models. Flexibility is achieved through the use of Bayesian non-parametrics. This article provides an overview of probabilistic modelling and an accessible survey of some of the main tools in Bayesian non-parametrics. The survey covers the use of Bayesian non-parametrics for modelling unknown functions, density estimation, clustering, time-series modelling, and representing sparsity, hierarchies, and covariance structure. More specifically, it gives brief non-technical overviews of Gaussian processes, Dirichlet processes, infinite hidden Markov models, Indian buffet processes, Kingman’s coalescent, Dirichlet diffusion trees and Wishart processes. PMID:23277609
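
    One of the building blocks mentioned above, the Dirichlet process, can be illustrated with a truncated stick-breaking construction; this is a standalone sketch of the flexibility-versus-concentration behaviour, not tied to any particular application in the article.

```python
import numpy as np

rng = np.random.default_rng(0)

def stick_breaking(alpha, n_atoms=1000):
    """Approximate DP(alpha, H) mixing weights via truncated stick-breaking."""
    betas = rng.beta(1.0, alpha, n_atoms)
    remaining = np.concatenate([[1.0], np.cumprod(1.0 - betas[:-1])])
    return betas * remaining

for alpha in (0.5, 5.0, 50.0):
    w = stick_breaking(alpha)
    # Effective number of components: atoms needed to carry 99% of the mass.
    k99 = np.searchsorted(np.cumsum(np.sort(w)[::-1]), 0.99) + 1
    print(f"alpha={alpha:5.1f}: largest weight {w.max():.3f}, "
          f"atoms covering 99% of mass: {k99}")
```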

  10. A model for probabilistic health impact assessment of exposure to food chemicals.

    PubMed

    van der Voet, Hilko; van der Heijden, Gerie W A M; Bos, Peter M J; Bosgra, Sieto; Boon, Polly E; Muri, Stefan D; Brüschweiler, Beat J

    2009-12-01

    A statistical model is presented extending the integrated probabilistic risk assessment (IPRA) model of van der Voet and Slob [van der Voet, H., Slob, W., 2007. Integration of probabilistic exposure assessment and probabilistic hazard characterisation. Risk Analysis, 27, 351-371]. The aim is to characterise the health impact due to one or more chemicals present in food causing one or more health effects. For chemicals with hardly any measurable safety problems we propose health impact characterisation by margins of exposure. In this probabilistic model not one margin of exposure is calculated, but rather a distribution of individual margins of exposure (IMoE) which allows quantifying the health impact for small parts of the population. A simple bar chart is proposed to represent the IMoE distribution and a lower bound (IMoEL) quantifies uncertainties in this distribution. It is described how IMoE distributions can be combined for dose-additive compounds and for different health effects. Health impact assessment critically depends on a subjective valuation of the health impact of a given health effect, and possibilities to implement this health impact valuation step are discussed. Examples show the possibilities of health impact characterisation and of integrating IMoE distributions. The paper also includes new proposals for modelling variable and uncertain factors describing food processing effects and intraspecies variation in sensitivity.
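
    A sketch of the distribution of individual margins of exposure: sample individual intakes and individual critical effect doses, take their ratio, and summarise the lower tail. The lognormal parameters are made up, and the bootstrap here is only a crude stand-in for the paper's treatment of uncertainty (IMoEL).

```python
import numpy as np

rng = np.random.default_rng(11)
n = 100_000

# Illustrative lognormal variability distributions (mg/kg bw/day); values are made up.
exposure = rng.lognormal(mean=np.log(0.002), sigma=0.9, size=n)     # individual intake
critical_dose = rng.lognormal(mean=np.log(1.0), sigma=0.5, size=n)  # individual sensitivity

imoe = critical_dose / exposure                                     # individual margins of exposure
print(f"median IMoE: {np.median(imoe):.0f}")
print(f"fraction of population with IMoE < 100: {(imoe < 100).mean():.3%}")

# Crude uncertainty bound: bootstrap the 1st percentile of the IMoE distribution.
boot = [np.percentile(rng.choice(imoe, 5_000), 1) for _ in range(200)]
print(f"IMoE 1st percentile: {np.percentile(imoe, 1):.0f} "
      f"(5% lower bound over bootstrap: {np.percentile(boot, 5):.0f})")
```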

  11. Integration of an Evidence Base into a Probabilistic Risk Assessment Model. The Integrated Medical Model Database: An Organized Evidence Base for Assessing In-Flight Crew Health Risk and System Design

    NASA Technical Reports Server (NTRS)

    Saile, Lynn; Lopez, Vilma; Bickham, Grandin; FreiredeCarvalho, Mary; Kerstman, Eric; Byrne, Vicky; Butler, Douglas; Myers, Jerry; Walton, Marlei

    2011-01-01

    This slide presentation reviews the Integrated Medical Model (IMM) database, which is an organized evidence base for assessing in-flight crew health risk. The database is a relational database accessible to many people. The database quantifies the model inputs by a ranking based on the highest value of the data as Level of Evidence (LOE) and the quality of evidence (QOE) score that provides an assessment of the evidence base for each medical condition. The IMM evidence base has already been able to provide invaluable information for designers, and for other uses.

  12. A Probabilistic Model of Phonological Relationships from Contrast to Allophony

    ERIC Educational Resources Information Center

    Hall, Kathleen Currie

    2009-01-01

    This dissertation proposes a model of phonological relationships, the Probabilistic Phonological Relationship Model (PPRM), that quantifies how predictably distributed two sounds in a relationship are. It builds on a core premise of traditional phonological analysis, that the ability to define phonological relationships such as contrast and…

  13. Exploring Term Dependences in Probabilistic Information Retrieval Model.

    ERIC Educational Resources Information Center

    Cho, Bong-Hyun; Lee, Changki; Lee, Gary Geunbae

    2003-01-01

    Describes a theoretic process to apply Bahadur-Lazarsfeld expansion (BLE) to general probabilistic models and the state-of-the-art 2-Poisson model. Through experiments on two standard document collections, one in Korean and one in English, it is demonstrated that incorporation of term dependences using BLE significantly contributes to performance…

  14. What do we gain with Probabilistic Flood Loss Models?

    NASA Astrophysics Data System (ADS)

    Schroeter, K.; Kreibich, H.; Vogel, K.; Merz, B.; Lüdtke, S.

    2015-12-01

    The reliability of flood loss models is a prerequisite for their practical usefulness. Oftentimes, traditional uni-variate damage models such as depth-damage curves fail to reproduce the variability of observed flood damage. Innovative multi-variate probabilistic modelling approaches are promising to capture and quantify the uncertainty involved and thus to improve the basis for decision making. In this study we compare the predictive capability of two probabilistic modelling approaches, namely Bagging Decision Trees and Bayesian Networks, and traditional stage-damage functions which are cast in a probabilistic framework. For model evaluation we use empirical damage data which are available from computer-aided telephone interviews compiled after the floods of 2002, 2005, 2006 and 2013 in the Elbe and Danube catchments in Germany. We carry out a split-sample test by sub-setting the damage records. One sub-set is used to derive the models and the remaining records are used to evaluate the predictive performance of the model. Further, we stratify the sample according to catchments, which allows studying model performance in a spatial transfer context. Flood damage estimation is carried out on the scale of the individual buildings in terms of relative damage. The predictive performance of the models is assessed in terms of systematic deviations (mean bias), precision (mean absolute error) as well as in terms of reliability, which is represented by the proportion of observations that fall within the 5- to 95-quantile predictive interval. The reliability of the probabilistic predictions within validation runs decreases only slightly and achieves a very good coverage of observations within the predictive interval. Probabilistic models provide quantitative information about prediction uncertainty, which is crucial to assess the reliability of model predictions and improves the usefulness of model results.
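
    To illustrate how an ensemble turns a deterministic stage-damage curve into a probabilistic prediction with intervals, the sketch below bootstraps a simple square-root curve on synthetic data; it is neither the study's Bagging Decision Trees nor its Bayesian Networks, and all numbers are invented.

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic training data: water depth (m) vs. relative building damage with heavy scatter.
depth = rng.uniform(0.1, 3.0, 400)
rel_damage = np.clip(0.15 * depth + rng.normal(0, 0.10, depth.size), 0, 1)

def fit_stage_damage(d, y):
    """Least-squares fit of a simple stage-damage curve y = a*sqrt(d) + b."""
    A = np.column_stack([np.sqrt(d), np.ones_like(d)])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

# Bagging-style ensemble: refit the curve on bootstrap resamples of the damage records.
d_new = np.array([0.5, 1.5, 2.5])
preds = []
for _ in range(500):
    idx = rng.integers(0, depth.size, depth.size)
    a, b = fit_stage_damage(depth[idx], rel_damage[idx])
    preds.append(np.clip(a * np.sqrt(d_new) + b, 0, 1))
preds = np.array(preds)

for i, d in enumerate(d_new):
    lo, med, hi = np.percentile(preds[:, i], [5, 50, 95])
    print(f"depth {d:.1f} m: median damage {med:.2f}, 5-95% interval [{lo:.2f}, {hi:.2f}]")
```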

  15. An integrated probabilistic assessment to analyse stochasticity of soil erosion in different restoration vegetation types

    NASA Astrophysics Data System (ADS)

    Zhou, Ji; Fu, Bojie; Gao, Guangyao; Lü, Yihe; Wang, Shuai

    2017-03-01

    The stochasticity of soil erosion reflects the variability of soil hydrological response to precipitation in a complex environment. Assessing this stochasticity is important for the conservation of soil and water resources; however, the stochasticity of erosion events in restoration vegetation types in water-limited environments has been little investigated. In this study, we constructed an event-driven framework to quantify the stochasticity of runoff and sediment generation in three typical restoration vegetation types (Armeniaca sibirica (T1), Spiraea pubescens (T2) and Artemisia copria (T3)) in closed runoff plots over five rainy seasons in the Loess Plateau of China. The results indicate that, under the same rainfall condition, the average probabilities of runoff and sediment in T1 (3.8 and 1.6 %) and T3 (5.6 and 4.4 %) were lowest and highest, respectively. The binomial and Poisson probabilistic models are two effective ways to simulate the frequency distributions of the number of erosion events occurring in all restoration vegetation types. The Bayes model indicated that relatively longer-duration and stronger-intensity rainfall events respectively become the main probabilistic contributors to the stochasticity of an erosion event occurring in T1 and T3. Logistic regression modelling highlighted that higher-grade rainfall intensity and canopy structure were the two most important factors to respectively increase and restrain the probability of stochastic erosion generation in all restoration vegetation types. The Bayes, binomial, Poisson and logistic regression models constituted an integrated probabilistic assessment to systematically simulate and evaluate soil erosion stochasticity. This should prove to be an innovative and important complement in understanding soil erosion from the stochasticity viewpoint, and also provide an alternative to assess the efficacy of ecological restoration in conserving soil and water resources in a semi-arid environment.

  16. Training probabilistic VLSI models on-chip to recognise biomedical signals under hardware nonidealities.

    PubMed

    Jiang, P C; Chen, H

    2006-01-01

    VLSI implementation of probabilistic models is attractive for many biomedical applications. However, hardware non-idealities can prevent probabilistic VLSI models from modelling data optimally through on-chip learning. This paper investigates the maximum computational errors that a probabilistic VLSI model can tolerate when modelling real biomedical data. VLSI circuits capable of achieving the required precision are also proposed.

  17. Quantitative Integration of Ndt with Probabilistic Fracture Mechanics for the Assessment of Fracture Risk in Pipelines

    NASA Astrophysics Data System (ADS)

    Kurz, J. H.; Cioclov, D.; Dobmann, G.; Boller, C.

    2010-02-01

    In the context of the probabilistic paradigm of fracture risk assessment in structural components, a computer simulation rationale is presented which is based on the integration of Quantitative Non-destructive Inspection and Probabilistic Fracture Mechanics. In this study, failure under static loading is assessed in the format known as the Failure Assessment Diagram (FAD). The fracture risk is evaluated in probabilistic terms. The probabilistic pattern superposed over the deterministic one is implemented via Monte Carlo sampling. The probabilistic fracture simulation yields a more informative analysis in terms of probability of failure. The ability to simulate the influence of the quality and reliability of non-destructive inspection (NDI) is an important feature of this approach. It is achieved by algorithmically integrating probabilistic FAD analysis and the Probability of Detection (POD). The POD information can only be applied in a probabilistic analysis and leads to a refinement of the assessment. By this means, the decrease in probability of failure when POD-characterised NDI is applied can be ascertained. Therefore, this procedure can be used as a tool for inspection-based lifetime concepts. In this paper, results of sensitivity analyses are presented with the aim of outlining, in terms of non-failure probabilities, the benefits of applying NDI of various qualities in comparison with the situation when NDI is lacking. This enables a better substantiation of both component reliability management and the cost-effectiveness of NDI timing.
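
    A sketch of the Monte Carlo superposition described above, using an Option-1 type failure assessment curve and a POD refinement; all input distributions, the stress intensity factor expression and the POD curve are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(9)
n = 200_000

# Illustrative random inputs for a simple flaw assessment (values are made up).
a = rng.lognormal(np.log(8e-3), 0.6, n)            # flaw depth (m)
sigma = rng.normal(220.0, 30.0, n)                 # applied stress (MPa)
K_mat = rng.normal(60.0, 12.0, n).clip(20.0)       # fracture toughness (MPa*sqrt(m))
sigma_y = 280.0                                    # yield strength (MPa)

def fails(a, sigma, K_mat):
    """FAD check against an Option-1 type failure assessment curve."""
    K_I = 1.12 * sigma * np.sqrt(np.pi * a)        # crude stress intensity factor
    Kr, Lr = K_I / K_mat, sigma / sigma_y
    f_Lr = (1 - 0.14 * Lr**2) * (0.3 + 0.7 * np.exp(-0.65 * Lr**6))
    return (Kr > f_Lr) | (Lr > 1.0)

fail = fails(a, sigma, K_mat)
print(f"probability of failure without NDI: {fail.mean():.4f}")

# NDI refinement: flaws detected according to a POD curve are repaired before service.
pod = 1.0 / (1.0 + (5e-3 / a) ** 4)                # illustrative POD vs. flaw depth
detected = rng.random(n) < pod
print(f"probability of failure with POD-characterised NDI: {(fail & ~detected).mean():.4f}")
```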

  18. Modelling default and likelihood reasoning as probabilistic reasoning

    NASA Technical Reports Server (NTRS)

    Buntine, Wray

    1990-01-01

    A probabilistic analysis of plausible reasoning about defaults and about likelihood is presented. Likely and by default are in fact treated as duals in the same sense as possibility and necessity. To model these four forms probabilistically, a qualitative default probabilistic (QDP) logic and its quantitative counterpart DP are derived that allow qualitative and corresponding quantitative reasoning. Consistency and consequent results for subsets of the logics are given that require at most a quadratic number of satisfiability tests in the underlying propositional logic. The quantitative logic shows how to track the propagation error inherent in these reasoning forms. The methodology and sound framework of the system highlights their approximate nature, the dualities, and the need for complementary reasoning about relevance.

  19. Probabilistic Usage of the Multi-Factor Interaction Model

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2008-01-01

    A Multi-Factor Interaction Model (MFIM) is used to predict the insulating foam mass expulsion during the ascent of a space vehicle. The exponents in the MFIM are evaluated by an available approach which consists of least squares and an optimization algorithm. These results were subsequently used to probabilistically evaluate the effects of the uncertainties in each participating factor on the mass expulsion. The probabilistic results show that the surface temperature dominates at high probabilities and the pressure, which causes the mass expulsion, dominates at low probabilities.

  1. A probabilistic model of ecosystem response to climate change

    SciTech Connect

    Shevliakova, E.; Dowlatabadi, H.

    1994-12-31

    Anthropogenic activities are leading to rapid changes in land cover and emissions of greenhouse gases into the atmosphere. These changes can bring about climate change typified by average global temperatures rising by 1-5 °C over the next century. Climate change of this magnitude is likely to alter the distribution of terrestrial ecosystems on a large scale. Options available for dealing with such change are abatement of emissions, adaptation, and geoengineering. The integrated assessment of climate change demands that frameworks be developed where all the elements of the climate problem are present (from economic activity to climate change and its impacts on market and non-market goods and services). Integrated climate assessment requires multiple impact metrics and multi-attribute utility functions to simulate the response of different key actors/decision-makers to the actual physical impacts (rather than a dollar value) of the climate-damage vs. policy-cost debate. This necessitates direct modeling of ecosystem impacts of climate change. The authors have developed a probabilistic model of ecosystem response to global change. This model differs from previous efforts in that it is statistically estimated using actual ecosystem and climate data, yielding a joint multivariate probability of prevalence for each ecosystem, given climatic conditions. The authors expect this approach to permit simulation of inertia and competition which have, so far, been absent in transfer models of continental-scale ecosystem response to global change. Thus, although the probability of one ecotype will dominate others at a given point, others would have the possibility of establishing an early foothold.

  2. A Probabilistic IRT Model for Unfolding Preference Data.

    ERIC Educational Resources Information Center

    Andrich, David

    1989-01-01

    A probabilistic item response theory (IRT) model is developed for pair-comparison design in which the unfolding principle governing the choice process uses a discriminant process analogous to Thurstone's Law of Comparative Judgment. A simulation study demonstrates the feasibility of estimation, and two examples illustrate the implications for…

  3. Efficient Methods for Unsupervised Learning of Probabilistic Models

    NASA Astrophysics Data System (ADS)

    Sohl-Dickstein, Jascha

    High dimensional probabilistic models are used for many modern scientific and engineering data analysis tasks. Interpreting neural spike trains, compressing video, identifying features in DNA microarrays, and recognizing particles in high energy physics all rely upon the ability to find and model complex structure in a high dimensional space. Despite their great promise, high dimensional probabilistic models are frequently computationally intractable to work with in practice. In this thesis I develop solutions to overcome this intractability, primarily in the context of energy based models. A common cause of intractability is that model distributions cannot be analytically normalized. Probabilities can only be computed up to a constant, making training exceedingly difficult. To solve this problem I propose 'minimum probability flow learning', a variational technique for parameter estimation in such models. The utility of this training technique is demonstrated in the case of an Ising model, a Hopfield auto-associative memory, an independent component analysis model of natural images, and a deep belief network. A second common difficulty in training probabilistic models arises when the parameter space is ill-conditioned. This makes gradient descent optimization slow and impractical, but can be alleviated using the natural gradient. I show here that the natural gradient can be related to signal whitening, and provide specific prescriptions for applying it to learning problems. It is also difficult to evaluate the performance of models that cannot be analytically normalized, providing a particular challenge to hypothesis testing and model comparison. To overcome this, I introduce a method termed 'Hamiltonian annealed importance sampling,' which more efficiently estimates the normalization constant of non-analytically-normalizable models. This method is then used to calculate and compare the log likelihoods of several state of the art probabilistic models of natural

  4. Fault tree analysis for integrated and probabilistic risk analysis of drinking water systems.

    PubMed

    Lindhe, Andreas; Rosén, Lars; Norberg, Tommy; Bergstedt, Olof

    2009-04-01

    Drinking water systems are vulnerable and subject to a wide range of risks. To avoid sub-optimisation of risk-reduction options, risk analyses need to include the entire drinking water system, from source to tap. Such an integrated approach demands tools that are able to model interactions between different events. Fault tree analysis is a risk estimation tool with the ability to model interactions between events. Using fault tree analysis on an integrated level, a probabilistic risk analysis of a large drinking water system in Sweden was carried out. The primary aims of the study were: (1) to develop a method for integrated and probabilistic risk analysis of entire drinking water systems; and (2) to evaluate the applicability of Customer Minutes Lost (CML) as a measure of risk. The analysis included situations where no water is delivered to the consumer (quantity failure) and situations where water is delivered but does not comply with water quality standards (quality failure). Hard data as well as expert judgements were used to estimate probabilities of events and uncertainties in the estimates. The calculations were performed using Monte Carlo simulations. CML is shown to be a useful measure of risks associated with drinking water systems. The method presented provides information on risk levels, probabilities of failure, failure rates and downtimes of the system. This information is available for the entire system as well as its different sub-systems. Furthermore, the method enables comparison of the results with performance targets and acceptable levels of risk. The method thus facilitates integrated risk analysis and consequently helps decision-makers to minimise sub-optimisation of risk-reduction options.
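
    A sketch of the kind of calculation involved: a tiny source-to-tap fault tree with AND/OR gates, evaluated by Monte Carlo over uncertain basic-event probabilities. The tree, the beta priors and the independence assumptions are invented, and the Customer Minutes Lost measure itself is not modelled here.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 20_000

# Uncertain annual failure probabilities of basic events (illustrative beta priors).
p_source = rng.beta(2, 50, n)    # raw-water source unavailable
p_treat  = rng.beta(2, 80, n)    # treatment failure (water delivered but non-compliant)
p_pump   = rng.beta(2, 120, n)   # main pumping station failure
p_backup = rng.beta(5, 30, n)    # backup pumping unavailable on demand

# Fault tree (events assumed independent):
#   quantity failure = source OR (pump AND backup)
#   quality failure  = treatment failure while water is still delivered
p_quantity = p_source + (1 - p_source) * p_pump * p_backup
p_quality  = (1 - p_quantity) * p_treat
p_top = p_quantity + p_quality

for name, p in [("quantity failure", p_quantity),
                ("quality failure", p_quality),
                ("any failure (top event)", p_top)]:
    print(f"{name:24s}: median {np.median(p):.4f}, "
          f"90% interval [{np.percentile(p, 5):.4f}, {np.percentile(p, 95):.4f}]")
```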

  5. Data-directed RNA secondary structure prediction using probabilistic modeling.

    PubMed

    Deng, Fei; Ledda, Mirko; Vaziri, Sana; Aviran, Sharon

    2016-08-01

    Structure dictates the function of many RNAs, but secondary RNA structure analysis is either labor intensive and costly or relies on computational predictions that are often inaccurate. These limitations are alleviated by integration of structure probing data into prediction algorithms. However, existing algorithms are optimized for a specific type of probing data. Recently, new chemistries combined with advances in sequencing have facilitated structure probing at unprecedented scale and sensitivity. These novel technologies and anticipated wealth of data highlight a need for algorithms that readily accommodate more complex and diverse input sources. We implemented and investigated a recently outlined probabilistic framework for RNA secondary structure prediction and extended it to accommodate further refinement of structural information. This framework utilizes direct likelihood-based calculations of pseudo-energy terms per considered structural context and can readily accommodate diverse data types and complex data dependencies. We use real data in conjunction with simulations to evaluate performances of several implementations and to show that proper integration of structural contexts can lead to improvements. Our tests also reveal discrepancies between real data and simulations, which we show can be alleviated by refined modeling. We then propose statistical preprocessing approaches to standardize data interpretation and integration into such a generic framework. We further systematically quantify the information content of data subsets, demonstrating that high reactivities are major drivers of SHAPE-directed predictions and that better understanding of less informative reactivities is key to further improvements. Finally, we provide evidence for the adaptive capability of our framework using mock probe simulations. © 2016 Deng et al.; Published by Cold Spring Harbor Laboratory Press for the RNA Society.

  6. ProbFold: a probabilistic method for integration of probing data in RNA secondary structure prediction.

    PubMed

    Sahoo, Sudhakar; Świtnicki, Michał P; Pedersen, Jakob Skou

    2016-09-01

    Recently, new RNA secondary structure probing techniques have been developed, including Next Generation Sequencing based methods capable of probing transcriptome-wide. These techniques hold great promise for improving structure prediction accuracy. However, each new data type comes with its own signal properties and biases, which may even be experiment specific. There is therefore a growing need for RNA structure prediction methods that can be automatically trained on new data types and readily extended to integrate and fully exploit multiple types of data. Here, we develop and explore a modular probabilistic approach for integrating probing data in RNA structure prediction. It can be automatically trained given a set of known structures with probing data. The approach is demonstrated on SHAPE datasets, where we evaluate and selectively model specific correlations. The approach often makes superior use of the probing data signal compared to other methods. We illustrate the use of ProbFold on multiple data types using both simulations and a small set of structures with both SHAPE, DMS and CMCT data. Technically, the approach combines stochastic context-free grammars (SCFGs) with probabilistic graphical models. This approach allows rapid adaptation and integration of new probing data types. ProbFold is implemented in C++. Models are specified using simple textual formats. Data reformatting is done using separate C++ programs. Source code, statically compiled binaries for x86 Linux machines, C++ programs, example datasets and a tutorial are available from http://moma.ki.au.dk/prj/probfold/. Contact: jakob.skou@clin.au.dk. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  7. A probabilistic model of insolation for the Mojave Desert area

    NASA Technical Reports Server (NTRS)

    Hester, O. V.; Reid, M. S.

    1978-01-01

    A discussion of mathematical models of insolation characteristics suitable for use in analysis of solar energy systems is presented and shows why such models are essential for solar energy system design. A model of solar radiation for the Mojave Desert area is presented with probabilistic and deterministic components which reflect the occurrence and density of clouds and haze, and mimic their effects on both direct and indirect radiation. Multiple comparisons were made between measured total energy received per day and the corresponding simulated totals. The simulated totals were all within 11 percent of the measured total. The conclusion is that a useful probabilistic model of solar radiation for the Goldstone, California, area of the Mojave Desert has been constructed.
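
    A minimal sketch of the kind of mixed deterministic/probabilistic insolation model described above: a deterministic clear-sky curve modulated by a persistent stochastic cloud state. The time step, switching probability and attenuation range are arbitrary assumptions for illustration, not the paper's fitted values.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_day(p_cloud=0.3, n_steps=48):
    """Toy insolation series: deterministic clear-sky curve times a stochastic
    cloud attenuation whose state persists via a simple two-state switch."""
    t = np.linspace(0, 24, n_steps)
    clear_sky = np.maximum(0.0, np.sin(np.pi * (t - 6) / 12)) * 1000  # W/m^2
    cloudy = rng.random() < p_cloud          # initial cloud state
    attenuation = np.ones(n_steps)
    for i in range(n_steps):
        if cloudy:
            attenuation[i] = rng.uniform(0.2, 0.6)   # cloud/haze density
        if rng.random() < 0.1:                        # occasional state change
            cloudy = not cloudy
    return t, clear_sky * attenuation

t, irradiance = simulate_day()
print(f"simulated daily energy: {np.trapz(irradiance, t):.0f} Wh/m^2")
```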

  8. Probabilistic (Bayesian) Modeling of Gene Expression in Transplant Glomerulopathy

    PubMed Central

    Elster, Eric A.; Hawksworth, Jason S.; Cheng, Orlena; Leeser, David B.; Ring, Michael; Tadaki, Douglas K.; Kleiner, David E.; Eberhardt, John S.; Brown, Trevor S.; Mannon, Roslyn B.

    2010-01-01

    Transplant glomerulopathy (TG) is associated with rapid decline in glomerular filtration rate and poor outcome. We used low-density arrays with a novel probabilistic analysis to characterize relationships between gene transcripts and the development of TG in allograft recipients. Retrospective review identified TG in 10.8% of 963 core biopsies from 166 patients; patients with stable function were studied for comparison. The biopsies were analyzed for expression of 87 genes related to immune function and fibrosis by using real-time PCR, and a Bayesian model was generated and validated to predict histopathology based on gene expression. A total of 57 individual genes were increased in TG compared with stable function biopsies (P < 0.05). The Bayesian analysis identified critical relationships between ICAM-1, IL-10, CCL3, CD86, VCAM-1, MMP-9, MMP-7, and LAMC2 and allograft pathology. Moreover, Bayesian models predicted TG when derived from either immune function (area under the curve [95% confidence interval] of 0.875 [0.675 to 0.999], P = 0.004) or fibrosis (area under the curve [95% confidence interval] of 0.859 [0.754 to 0.963], P < 0.001) gene networks. Critical pathways in the Bayesian models were also analyzed by using the Fisher exact test and had P values <0.005. This study demonstrates that evaluating quantitative gene expression profiles with Bayesian modeling can identify significant transcriptional associations that have the potential to support the diagnostic capability of allograft histology. This integrated approach has broad implications in the field of transplant diagnostics. PMID:20688906

  9. Supermultiplicative Speedups of Probabilistic Model-Building Genetic Algorithms

    DTIC Science & Technology

    2009-02-01

    simulations. We (Todd Martinez (2005 MacArthur fellow), Duane Johnson, Kumara Sastry and David E. Goldberg) have applied multiobjective GAs and model...AUTHOR(S) David E. Goldberg, Kumara Sastry, Martin Pelikan 5d. PROJECT NUMBER 5e. TASK NUMBER 5f. WORK UNIT NUMBER 7. PERFORMING ORGANIZATION NAME(S...Speedups of Probabilistic Model-Building Genetic Algorithms AFOSR Grant No. FA9550-06-1-0096 February 1, 2006 to November 30, 2008 David E. Goldberg

  10. Nonlinear sensor fault diagnosis using mixture of probabilistic PCA models

    NASA Astrophysics Data System (ADS)

    Sharifi, Reza; Langari, Reza

    2017-02-01

    This paper presents a methodology for sensor fault diagnosis in nonlinear systems using a Mixture of Probabilistic Principal Component Analysis (MPPCA) models. This methodology separates the measurement space into several locally linear regions, each of which is associated with a Probabilistic PCA (PPCA) model. Using the transformation associated with each PPCA model, a parity relation scheme is used to construct a residual vector. Bayesian analysis of the residuals forms the basis for detection and isolation of sensor faults across the entire range of operation of the system. The resulting method is demonstrated in its application to sensor fault diagnosis of a fully instrumented HVAC system. The results show accurate detection of sensor faults under the assumption that a single sensor is faulty.
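
    The parity-relation idea can be sketched with a single (probabilistic) PCA region: the residual is the distance between a measurement and its projection onto the retained subspace, and a fault is flagged when the residual exceeds a threshold. A full MPPCA scheme would fit several such local models and weight them by responsibility; the data, threshold and fault below are illustrative assumptions.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)

# Simulated multi-sensor data: 3 latent drivers observed through 8 sensors.
latent = rng.normal(size=(500, 3))
mixing = rng.normal(size=(3, 8))
X_train = latent @ mixing + 0.05 * rng.normal(size=(500, 8))

# One locally linear region modelled with PCA as a stand-in for PPCA.
pca = PCA(n_components=3).fit(X_train)

def residual_norm(x):
    """Parity-style residual: distance between a sample and its
    reconstruction from the retained principal subspace."""
    x_hat = pca.inverse_transform(pca.transform(x.reshape(1, -1)))
    return float(np.linalg.norm(x - x_hat))

threshold = np.percentile([residual_norm(x) for x in X_train], 99)

x_ok = latent[0] @ mixing
x_faulty = x_ok.copy()
x_faulty[4] += 3.0          # bias fault injected on sensor 4
print(residual_norm(x_ok) < threshold, residual_norm(x_faulty) > threshold)
```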

  11. Probabilistic Based Modeling and Simulation Assessment

    DTIC Science & Technology

    2010-06-01

    but with the head and neck replaced with a high fidelity cervical spine and head model. The occupant models were used to determine the effects of...fidelity cervical spine and head model... vertebrae, including the disks, ligaments and musculature (Figure 6). In total there are 57837 elements with 63713 nodes. A full description of the model

  12. Probabilistic Harmonic Analysis on Distributed Photovoltaic Integration Considering Typical Weather Scenarios

    NASA Astrophysics Data System (ADS)

    Bin, Che; Ruoying, Yu; Dongsheng, Dang; Xiangyan, Wang

    2017-05-01

    Distributed Generation (DG) integrated into the network causes harmonic pollution, which can damage electrical devices and affect the normal operation of the power system. At the same time, due to the randomness of wind and solar irradiation, the output of DG is also random, which leads to uncertainty in the harmonics generated by the DG. Thus, probabilistic methods are needed to analyse the impacts of DG integration. In this work we studied the probabilistic distribution of harmonic voltage and the harmonic distortion in a distribution network after distributed photovoltaic (DPV) integration under different weather conditions, namely sunny, cloudy, rainy and snowy days. The probabilistic distribution function of the DPV output power in each typical weather condition was acquired via maximum likelihood parameter estimation. The Monte Carlo simulation method was adopted to calculate the probabilistic distribution of harmonic voltage content at different frequency orders as well as the total harmonic distortion (THD) in typical weather conditions. The case study was based on the IEEE 33-bus system, and the probabilistic distributions of harmonic voltage content as well as THD in the typical weather conditions were compared.
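
    A hedged sketch of the Monte Carlo step: per-unit DPV output is drawn from a weather-dependent Beta distribution and pushed through a toy harmonic-injection model to build a THD distribution. The Beta parameters, injection fractions and impedances are invented for illustration and are not the study's identified values.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical Beta parameters for per-unit PV output in each weather type;
# in the study these come from maximum-likelihood fits to measured data.
weather_params = {"sunny": (8, 2), "cloudy": (3, 3), "rainy": (2, 6), "snowy": (1.5, 8)}

def sample_thd(weather, n=10000):
    """Monte Carlo distribution of voltage THD at a toy bus where harmonic
    current injection is assumed proportional to PV output."""
    a, b = weather_params[weather]
    p_pv = rng.beta(a, b, size=n)                 # per-unit PV output
    i5, i7 = 0.04 * p_pv, 0.025 * p_pv            # assumed 5th/7th harmonic currents
    v5, v7 = 5 * 0.02 * i5, 7 * 0.02 * i7         # toy harmonic impedances
    return np.sqrt(v5**2 + v7**2) * 100           # THD in % of a 1 p.u. fundamental

for w in weather_params:
    thd = sample_thd(w)
    print(f"{w:6s}: mean THD {thd.mean():.3f}%  95th pct {np.percentile(thd, 95):.3f}%")
```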

  13. Probabilistic Graphical Model Representation in Phylogenetics

    PubMed Central

    Höhna, Sebastian; Heath, Tracy A.; Boussau, Bastien; Landis, Michael J.; Ronquist, Fredrik; Huelsenbeck, John P.

    2014-01-01

    Recent years have seen a rapid expansion of the model space explored in statistical phylogenetics, emphasizing the need for new approaches to statistical model representation and software development. Clear communication and representation of the chosen model is crucial for: (i) reproducibility of an analysis, (ii) model development, and (iii) software design. Moreover, a unified, clear and understandable framework for model representation lowers the barrier for beginners and nonspecialists to grasp complex phylogenetic models, including their assumptions and parameter/variable dependencies. Graphical modeling is a unifying framework that has gained in popularity in the statistical literature in recent years. The core idea is to break complex models into conditionally independent distributions. The strength lies in the comprehensibility, flexibility, and adaptability of this formalism, and the large body of computational work based on it. Graphical models are well-suited to teach statistical models, to facilitate communication among phylogeneticists and in the development of generic software for simulation and statistical inference. Here, we provide an introduction to graphical models for phylogeneticists and extend the standard graphical model representation to the realm of phylogenetics. We introduce a new graphical model component, tree plates, to capture the changing structure of the subgraph corresponding to a phylogenetic tree. We describe a range of phylogenetic models using the graphical model framework and introduce modules to simplify the representation of standard components in large and complex models. Phylogenetic model graphs can be readily used in simulation, maximum likelihood inference, and Bayesian inference using, for example, Metropolis–Hastings or Gibbs sampling of the posterior distribution. [Computation; graphical models; inference; modularization; statistical phylogenetics; tree plate.] PMID:24951559

  14. Probabilistic graphical model representation in phylogenetics.

    PubMed

    Höhna, Sebastian; Heath, Tracy A; Boussau, Bastien; Landis, Michael J; Ronquist, Fredrik; Huelsenbeck, John P

    2014-09-01

    Recent years have seen a rapid expansion of the model space explored in statistical phylogenetics, emphasizing the need for new approaches to statistical model representation and software development. Clear communication and representation of the chosen model is crucial for: (i) reproducibility of an analysis, (ii) model development, and (iii) software design. Moreover, a unified, clear and understandable framework for model representation lowers the barrier for beginners and nonspecialists to grasp complex phylogenetic models, including their assumptions and parameter/variable dependencies. Graphical modeling is a unifying framework that has gained in popularity in the statistical literature in recent years. The core idea is to break complex models into conditionally independent distributions. The strength lies in the comprehensibility, flexibility, and adaptability of this formalism, and the large body of computational work based on it. Graphical models are well-suited to teach statistical models, to facilitate communication among phylogeneticists and in the development of generic software for simulation and statistical inference. Here, we provide an introduction to graphical models for phylogeneticists and extend the standard graphical model representation to the realm of phylogenetics. We introduce a new graphical model component, tree plates, to capture the changing structure of the subgraph corresponding to a phylogenetic tree. We describe a range of phylogenetic models using the graphical model framework and introduce modules to simplify the representation of standard components in large and complex models. Phylogenetic model graphs can be readily used in simulation, maximum likelihood inference, and Bayesian inference using, for example, Metropolis-Hastings or Gibbs sampling of the posterior distribution. © The Author(s) 2014. Published by Oxford University Press, on behalf of the Society of Systematic Biologists.

  15. A simple probabilistic model of ideal gases

    NASA Astrophysics Data System (ADS)

    Sossinsky, A. B.

    2016-01-01

    We describe a discrete 3D model of ideal gas based on the idea that, on the microscopic level, the particles move randomly (as in ASEP models), instead of obeying Newton's laws as prescribed by Boltzmann.

  16. Probabilistic grammatical model for helix‐helix contact site classification

    PubMed Central

    2013-01-01

    Background Hidden Markov Models power many state‐of‐the‐art tools in the field of protein bioinformatics. While excelling in their tasks, these methods of protein analysis do not convey directly information on medium‐ and long‐range residue‐residue interactions. This requires an expressive power of at least context‐free grammars. However, application of more powerful grammar formalisms to protein analysis has been surprisingly limited. Results In this work, we present a probabilistic grammatical framework for problem‐specific protein languages and apply it to classification of transmembrane helix‐helix pairs configurations. The core of the model consists of a probabilistic context‐free grammar, automatically inferred by a genetic algorithm from only a generic set of expert‐based rules and positive training samples. The model was applied to produce sequence based descriptors of four classes of transmembrane helix‐helix contact site configurations. The highest performance of the classifiers reached AUCROC of 0.70. The analysis of grammar parse trees revealed the ability of representing structural features of helix‐helix contact sites. Conclusions We demonstrated that our probabilistic context‐free framework for analysis of protein sequences outperforms the state of the art in the task of helix‐helix contact site classification. However, this is achieved without necessarily requiring modeling long range dependencies between interacting residues. A significant feature of our approach is that grammar rules and parse trees are human‐readable. Thus they could provide biologically meaningful information for molecular biologists. PMID:24350601

  17. M-estimation with probabilistic models of geodetic observations

    NASA Astrophysics Data System (ADS)

    Wiśniewski, Z.

    2014-10-01

    The paper concerns M-estimation with probabilistic models of geodetic observations. Special attention is paid to estimation that includes the asymmetry and the excess kurtosis, which are basic anomalies of empirical distributions of errors of geodetic or astrometric observations (in comparison to Gaussian errors). It is assumed that the influence function of the estimation is equal to the differential equation that defines the system of the Pearson distributions. The central moments of that system are its parameters and thus they are also the parameters of the chosen influence function. The estimation that includes the Pearson type IV and VII distributions is analyzed in great detail from a theoretical point of view as well as by applying numerical tests. The chosen distributions are leptokurtic with asymmetry, which reflects the general characteristics of empirical distributions. Considering M-estimation with probabilistic models, the Gram-Charlier series are also applied to approximate the models in question. The paper shows that estimation with the application of probabilistic models belongs to the class of robust estimation methods and is especially effective in that case. It is suggested that even in the absence of significant anomalies the method in question should be regarded as robust against gross errors, while its robustness is controlled by the pseudo-kurtosis.

  18. Influential input classification in probabilistic multimedia models

    SciTech Connect

    Maddalena, Randy L.; McKone, Thomas E.; Hsieh, Dennis P.H.; Geng, Shu

    1999-05-01

    Monte Carlo analysis is a statistical simulation method that is often used to assess and quantify the outcome variance in complex environmental fate and effects models. Total outcome variance of these models is a function of (1) the uncertainty and/or variability associated with each model input and (2) the sensitivity of the model outcome to changes in the inputs. To propagate variance through a model using Monte Carlo techniques, each variable must be assigned a probability distribution. The validity of these distributions directly influences the accuracy and reliability of the model outcome. To efficiently allocate resources for constructing distributions one should first identify the most influential set of variables in the model. Although existing sensitivity and uncertainty analysis methods can provide a relative ranking of the importance of model inputs, they fail to identify the minimum set of stochastic inputs necessary to sufficiently characterize the outcome variance. In this paper, we describe and demonstrate a novel sensitivity/uncertainty analysis method for assessing the importance of each variable in a multimedia environmental fate model. Our analyses show that for a given scenario, a relatively small number of input variables influence the central tendency of the model and an even smaller set determines the shape of the outcome distribution. For each input, the level of influence depends on the scenario under consideration. This information is useful for developing site specific models and improving our understanding of the processes that have the greatest influence on the variance in outcomes from multimedia models.
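
    One simple way to screen for influential inputs, in the spirit of the analysis above, is to sample all inputs, run the model, and rank inputs by the magnitude of their rank correlation with the outcome. The toy fate model and input distributions below are assumptions for illustration only.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(3)
n = 5000

# Hypothetical fate-model inputs with assumed distributions.
inputs = {
    "emission_rate":  rng.lognormal(mean=0.0, sigma=0.5, size=n),
    "half_life":      rng.lognormal(mean=2.0, sigma=0.3, size=n),
    "partition_coef": rng.lognormal(mean=1.0, sigma=1.0, size=n),
    "rainfall":       rng.normal(loc=800, scale=100, size=n),
}

# Toy multimedia outcome: exposure concentration as a nonlinear combination.
outcome = (inputs["emission_rate"] * inputs["half_life"]
           / (1.0 + inputs["partition_coef"]) * (inputs["rainfall"] / 800) ** 0.2)

# Rank inputs by the magnitude of their rank correlation with the outcome,
# a quick screen for which variables drive the outcome variance.
ranking = sorted(
    ((abs(spearmanr(v, outcome)[0]), k) for k, v in inputs.items()),
    reverse=True)
for rho, name in ranking:
    print(f"{name:15s} |rho| = {rho:.2f}")
```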

  19. Probabilistic-Based Modeling and Simulation Assessment

    DTIC Science & Technology

    2010-06-01

    mph crash simulation at 100 ms with an unbelted Hybrid III model ... The Hybrid III dummy model was then restrained using a finite element seatbelt ... true physics of the impact, and can thus be qualified as unwanted noise in the model response. Unfortunately, it is difficult to quantify the

  20. Frontostriatal white matter integrity mediates adult age differences in probabilistic reward learning.

    PubMed

    Samanez-Larkin, Gregory R; Levens, Sara M; Perry, Lee M; Dougherty, Robert F; Knutson, Brian

    2012-04-11

    Frontostriatal circuits have been implicated in reward learning, and emerging findings suggest that frontal white matter structural integrity and probabilistic reward learning are reduced in older age. This cross-sectional study examined whether age differences in frontostriatal white matter integrity could account for age differences in reward learning in a community life span sample of human adults. By combining diffusion tensor imaging with a probabilistic reward learning task, we found that older age was associated with decreased reward learning and decreased white matter integrity in specific pathways running from the thalamus to the medial prefrontal cortex and from the medial prefrontal cortex to the ventral striatum. Further, white matter integrity in these thalamocorticostriatal paths could statistically account for age differences in learning. These findings suggest that the integrity of frontostriatal white matter pathways critically supports reward learning. The findings also raise the possibility that interventions that bolster frontostriatal integrity might improve reward learning and decision making.

  1. Probabilistic Priority Message Checking Modeling Based on Controller Area Networks

    NASA Astrophysics Data System (ADS)

    Lin, Cheng-Min

    Although the probabilistic model checking tool PRISM has been applied to many communication systems, such as wireless local area networks, Bluetooth, and ZigBee, the technique has not been used for controller area networks (CAN). In this paper, we use PRISM to model the priority-message mechanism of CAN, because this mechanism has allowed CAN to become the leader in serial communication for automobile and industrial control. Modeling CAN makes it easy to analyze its characteristics and thus further improve the security and efficiency of automobiles. The Markov chain model helps us to capture the behaviour of priority messages.

  2. Probabilistic Modeling of Intracranial Pressure Effects on Optic Nerve Biomechanics

    NASA Technical Reports Server (NTRS)

    Ethier, C. R.; Feola, Andrew J.; Raykin, Julia; Myers, Jerry G.; Nelson, Emily S.; Samuels, Brian C.

    2016-01-01

    Altered intracranial pressure (ICP) is implicated in several ocular conditions: papilledema, glaucoma, and Visual Impairment and Intracranial Pressure (VIIP) syndrome. The biomechanical effects of altered ICP on optic nerve head (ONH) tissues in these conditions are uncertain but likely important. We have quantified ICP-induced deformations of ONH tissues, using finite element (FE) modeling and probabilistic modeling with Latin Hypercube Simulations (LHS) to consider a range of tissue properties and relevant pressures.
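
    A small sketch of how Latin Hypercube sampling can drive a batch of FE evaluations over tissue-property and pressure ranges. The parameter names, bounds, sample size and the placeholder response function are hypothetical and not taken from the study.

```python
import numpy as np
from scipy.stats import qmc

# Hypothetical input ranges (moduli in kPa, pressures in mmHg).
names = ["sclera_modulus", "lamina_modulus", "nerve_modulus", "ICP", "IOP"]
lower = [1000.0, 100.0, 10.0, 5.0, 10.0]
upper = [5000.0, 600.0, 60.0, 20.0, 25.0]

sampler = qmc.LatinHypercube(d=len(names), seed=0)
unit_samples = sampler.random(n=200)                 # 200 FE runs
designs = qmc.scale(unit_samples, lower, upper)      # map to physical ranges

def run_fe_model(params):
    """Placeholder for a finite element evaluation returning peak tissue strain."""
    return 0.01 * params[3] / params[1] + 0.002 * params[4] / params[0]

strains = np.array([run_fe_model(p) for p in designs])
print(f"median peak strain {np.median(strains):.4f}, "
      f"95th pct {np.percentile(strains, 95):.4f}")
```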

  3. Probabilistic graphic models applied to identification of diseases.

    PubMed

    Sato, Renato Cesar; Sato, Graziela Tiemy Kajita

    2015-01-01

    Decision-making is fundamental when making a diagnosis or choosing a treatment. The broad dissemination of computational systems and databases allows the systematization of part of these decisions through artificial intelligence. In this text, we present the basic use of probabilistic graphic models as tools to analyze causality in health conditions. This method has been used to make the diagnosis of Alzheimer's disease, sleep apnea and heart diseases.

  4. Probabilistic graphic models applied to identification of diseases

    PubMed Central

    Sato, Renato Cesar; Sato, Graziela Tiemy Kajita

    2015-01-01

    Decision-making is fundamental when making a diagnosis or choosing a treatment. The broad dissemination of computational systems and databases allows the systematization of part of these decisions through artificial intelligence. In this text, we present the basic use of probabilistic graphic models as tools to analyze causality in health conditions. This method has been used to make the diagnosis of Alzheimer's disease, sleep apnea and heart diseases. PMID:26154555

  5. Probabilistic Independence Networks for Hidden Markov Probability Models

    NASA Technical Reports Server (NTRS)

    Smyth, Padhraic; Heckerman, David; Jordan, Michael I.

    1996-01-01

    In this paper we explore hidden Markov models (HMMs) and related structures within the general framework of probabilistic independence networks (PINs). The paper contains a self-contained review of the basic principles of PINs. It is shown that the well-known forward-backward (F-B) and Viterbi algorithms for HMMs are special cases of more general inference algorithms for arbitrary PINs.
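
    The forward-backward recursion mentioned above can be written in a few lines for a discrete HMM. This is the standard textbook (unscaled) form, suitable for short sequences; it is not the PIN-based generalization discussed in the paper.

```python
import numpy as np

def forward_backward(pi, A, B, obs):
    """Posterior state probabilities for a discrete HMM.
    pi: initial state distribution (K,), A: transition matrix (K, K),
    B: emission matrix (K, M), obs: sequence of observation indices."""
    T, K = len(obs), len(pi)
    alpha = np.zeros((T, K))
    beta = np.zeros((T, K))
    alpha[0] = pi * B[:, obs[0]]
    for t in range(1, T):                     # forward pass
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
    beta[-1] = 1.0
    for t in range(T - 2, -1, -1):            # backward pass
        beta[t] = A @ (B[:, obs[t + 1]] * beta[t + 1])
    gamma = alpha * beta
    return gamma / gamma.sum(axis=1, keepdims=True)

# Two hidden states, two possible symbols.
pi = np.array([0.6, 0.4])
A = np.array([[0.7, 0.3], [0.2, 0.8]])
B = np.array([[0.9, 0.1], [0.3, 0.7]])
print(forward_backward(pi, A, B, obs=[0, 0, 1, 1, 1]).round(3))
```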

  6. Toward a Simple Probabilistic GCM Emulator for Integrated Assessment of Climate Change Impacts

    NASA Astrophysics Data System (ADS)

    Sue Wing, I.; Tebaldi, C.; Nychka, D. W.; Winkler, J.

    2014-12-01

    Climate emulators can bridge spatial scales in integrated assessment in ways that allow us to take advantage of the evolving understanding of the impacts of climate change. The spatial scales at which climate impacts occur are much finer than those of the "damage functions" in integrated assessment models (IAMs), which incorporate reduced form climate models to project changes in global mean temperature and estimate aggregate damages directly from that. Advancing the state of IA modeling requires methods to generate—in a flexible and computationally efficient manner—future changes in climate variables at the geographic scales at which individual impact endpoints can be resolved. The state of the art uses outputs of global climate models (GCMs) forced by warming scenarios to drive impact calculations. However, downstream integrated assessments are perforce "locked-in" to the particular GCM x warming scenario combinations that generated the meteorological fields of interest—it is not possible to assess risk due to the absence of probabilities over warming scenarios or model uncertainty. The availability of reduced-form models which can efficiently simulate the envelope of the response of multiple GCMs to a given amount of warming provides us with the capability to create probabilistic projections of fine-scale meteorological changes conditional on global mean temperature change, to drive impact calculations in ways that permit risk assessments. This presentation documents a prototype probabilistic climate emulator for use as a GCM diagnostic tool and a driver of climate change impact assessments. We use a regression-based approach to construct multi-model global patterns for changes in temperature and precipitation from the CMIP3 archive. Crucially, regression residuals are used to derive a spatial covariance function of the model- and scenario-dependent deviations from the average pattern. By sampling from this manifold we can rapidly generate many realizations of
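
    A rough sketch of the emulation idea: per-cell responses are regressed on global mean warming (pattern scaling), and the spatial covariance of the residuals is used to sample many plausible realizations conditional on a warming level. The synthetic "archive" and the covariance treatment below are illustrative assumptions, not the CMIP3-based construction used in the work.

```python
import numpy as np

rng = np.random.default_rng(4)

# Pretend archive: local temperature change at n_cells grid cells for a set of
# GCM x scenario runs, each with a known global mean temperature change dT.
n_runs, n_cells = 30, 100
dT_global = rng.uniform(1.0, 4.0, size=n_runs)
true_pattern = rng.uniform(0.8, 2.0, size=n_cells)          # warming per degree
archive = np.outer(dT_global, true_pattern) + rng.normal(0, 0.3, (n_runs, n_cells))

# Step 1: regress each cell's change on global mean warming (pattern scaling).
pattern = (archive * dT_global[:, None]).sum(0) / (dT_global ** 2).sum()
residuals = archive - np.outer(dT_global, pattern)

# Step 2: spatial covariance of residuals captures model/scenario spread.
cov = np.cov(residuals, rowvar=False) + 1e-6 * np.eye(n_cells)

def emulate(dT, n_samples=5):
    """Probabilistic local projections conditional on a global warming level."""
    return rng.multivariate_normal(dT * pattern, cov, size=n_samples)

fields = emulate(2.5)
print(fields.shape, round(float(fields.mean()), 2))
```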

  7. Nonlinear probabilistic finite element models of laminated composite shells

    NASA Technical Reports Server (NTRS)

    Engelstad, S. P.; Reddy, J. N.

    1993-01-01

    A probabilistic finite element analysis procedure for laminated composite shells has been developed. A total Lagrangian finite element formulation, employing a degenerated 3-D laminated composite shell with the full Green-Lagrange strains and first-order shear deformable kinematics, forms the modeling foundation. The first-order second-moment technique for probabilistic finite element analysis of random fields is employed and results are presented in the form of mean and variance of the structural response. The effects of material nonlinearity are included through the use of a rate-independent anisotropic plasticity formulation from a macroscopic point of view. Both ply-level and micromechanics-level random variables can be selected, the latter by means of the Aboudi micromechanics model. A number of sample problems are solved to verify the accuracy of the procedures developed and to quantify the variability of certain material type/structure combinations. Experimental data are compared in many cases, and the Monte Carlo simulation method is used to check the probabilistic results. In general, the procedure is quite effective in modeling the mean and variance response of the linear and nonlinear behavior of laminated composite shells.
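
    The first-order second-moment step can be illustrated generically: propagate input means and covariances through a response function using a finite-difference gradient. The response function and input statistics below are placeholders, not the shell formulation used in the paper.

```python
import numpy as np

def buckling_load(x):
    """Placeholder structural response as a function of random inputs
    x = [ply modulus (Pa), ply thickness (m), fibre volume fraction]."""
    E, t, vf = x
    return 2.3e-3 * E * t**3 * (0.5 + vf)

mean_x = np.array([140e9, 1.2e-4, 0.6])                       # input means
cov_x = np.diag([(0.05 * 140e9)**2, (0.03 * 1.2e-4)**2, 0.02**2])

def fosm(func, mu, cov, rel_step=1e-4):
    """First-order second-moment: response mean at the input means, response
    variance from the gradient (finite differences) and the input covariance."""
    f0 = func(mu)
    grad = np.zeros_like(mu)
    for i in range(len(mu)):
        h = rel_step * mu[i]
        xp = mu.copy()
        xp[i] += h
        grad[i] = (func(xp) - f0) / h
    return f0, float(grad @ cov @ grad)

mean_f, var_f = fosm(buckling_load, mean_x, cov_x)
print(f"mean = {mean_f:.3e}, coefficient of variation = {np.sqrt(var_f)/mean_f:.2%}")
```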

  8. Quantum-like Probabilistic Models Outside Physics

    NASA Astrophysics Data System (ADS)

    Khrennikov, Andrei

    We present a quantum-like (QL) model in which contexts (complexes of, e.g., mental, social, biological, economic or even political conditions) are represented by complex probability amplitudes. This approach makes it possible to apply the mathematical quantum formalism to probabilities induced in any domain of science. In our model quantum randomness appears not as irreducible randomness (as it is commonly accepted in conventional quantum mechanics, e.g. by von Neumann and Dirac), but as a consequence of obtaining incomplete information about a system. We pay main attention to the QL description of the processing of incomplete information. Our QL model can be useful in cognitive, social and political sciences as well as economics and artificial intelligence. In this paper we consider in more detail one special application — QL modeling of the brain's functioning. The brain is modeled as a QL-computer.

  9. GENERAL: A modified weighted probabilistic cellular automaton traffic flow model

    NASA Astrophysics Data System (ADS)

    Zhuang, Qian; Jia, Bin; Li, Xin-Gang

    2009-08-01

    This paper modifies the weighted probabilistic cellular automaton model (Li X L, Kuang H, Song T, et al 2008 Chin. Phys. B 17 2366) which considered a diversity of traffic behaviors under real traffic situations induced by various driving characters and habits. In the new model, the effects of the velocity at the last time step and drivers' desire for acceleration are taken into account. The fundamental diagram, spatial-temporal diagram, and the time series of one-minute data are analyzed. The results show that this model reproduces synchronized flow. Finally, it simulates the on-ramp system with the proposed model. Some characteristics including the phase diagram are studied.
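
    For readers unfamiliar with probabilistic traffic cellular automata, the sketch below implements the classic Nagel-Schreckenberg rule set (acceleration, gap-based braking, random slowdown, movement). It is the baseline family such models modify, not the weighted-probability variant proposed in the paper, and all parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(5)

def step(positions, velocities, road_length, v_max=5, p_slow=0.3):
    """One Nagel-Schreckenberg-style update on a circular single-lane road."""
    order = np.argsort(positions)
    pos, vel = positions[order], velocities[order]
    gaps = (np.roll(pos, -1) - pos - 1) % road_length   # empty cells ahead
    vel = np.minimum(vel + 1, v_max)                     # acceleration
    vel = np.minimum(vel, gaps)                          # avoid collisions
    slow = rng.random(len(pos)) < p_slow                 # random slowdown
    vel = np.where(slow, np.maximum(vel - 1, 0), vel)
    pos = (pos + vel) % road_length                      # movement
    return pos, vel

road_length, n_cars = 200, 50
pos = np.sort(rng.choice(road_length, size=n_cars, replace=False))
vel = np.zeros(n_cars, dtype=int)
for _ in range(500):
    pos, vel = step(pos, vel, road_length)
print(f"mean speed ~ {vel.mean():.2f} cells/step at density {n_cars/road_length:.2f}")
```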

  10. Recent advances and applications of probabilistic topic models

    NASA Astrophysics Data System (ADS)

    Wood, Ian

    2014-12-01

    I present here an overview of recent advances in probabilistic topic modelling and related Bayesian graphical models as well as some of their more atypical applications outside of their home: text analysis. These techniques allow the modelling of high dimensional count vectors with strong correlations. With such data, simply calculating a correlation matrix is infeasible. Probabilistic topic models address this using mixtures of multinomials estimated via Bayesian inference with Dirichlet priors. The use of conjugate priors allows for efficient inference, and these techniques scale well to data sets with many millions of vectors. The first of these techniques to attract significant attention was Latent Dirichlet Allocation (LDA) [1, 2]. Numerous extensions and adaptations of LDA have been proposed: non-parametric models; assorted models incorporating authors, sentiment and other features; models regularised through the use of extra metadata or extra priors on topic structure, and many more [3]. They have become widely used in the text analysis and population genetics communities, with a number of compelling applications. These techniques are not restricted to text analysis, however, and can be applied to other types of data which can be sensibly discretised and represented as counts of labels/properties/etc. LDA and its variants have been used to find patterns in data from diverse areas of inquiry, including genetics, plant physiology, image analysis, social network analysis, remote sensing and astrophysics. Nonetheless, it is relatively recently that probabilistic topic models have found applications outside of text analysis, and to date few such applications have been considered. I suggest that there is substantial untapped potential for topic models and models inspired by or incorporating topic models to be fruitfully applied, and outline the characteristics of systems and data for which this may be the case.
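
    A minimal LDA example using scikit-learn on a tiny toy corpus; it only illustrates the multinomial-mixture idea discussed above (topics as word distributions, documents as topic mixtures), not any particular application from the abstract.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

docs = [
    "gene expression regulation in yeast cells",
    "stellar spectra and galaxy formation",
    "protein folding and gene networks",
    "dark matter halos in galaxy clusters",
]

# Bag-of-words counts, then LDA with 2 topics and Dirichlet priors.
counts = CountVectorizer(stop_words="english").fit(docs)
X = counts.transform(docs)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)

terms = counts.get_feature_names_out()
for k, topic in enumerate(lda.components_):
    top = topic.argsort()[-4:][::-1]
    print(f"topic {k}: {[terms[i] for i in top]}")
print(lda.transform(X).round(2))   # per-document topic proportions
```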

  11. Probabilistic prediction models for aggregate quarry siting

    USGS Publications Warehouse

    Robinson, G.R.; Larkins, P.M.

    2007-01-01

    Weights-of-evidence (WofE) and logistic regression techniques were used in a GIS framework to predict the spatial likelihood (prospectivity) of crushed-stone aggregate quarry development. The joint conditional probability models, based on geology, transportation network, and population density variables, were defined using quarry location and time of development data for the New England States, North Carolina, and South Carolina, USA. The Quarry Operation models describe the distribution of active aggregate quarries, independent of the date of opening. The New Quarry models describe the distribution of aggregate quarries when they open. Because of the small number of new quarries developed in the study areas during the last decade, independent New Quarry models have low parameter estimate reliability. The performance of parameter estimates derived for Quarry Operation models, defined by a larger number of active quarries in the study areas, were tested and evaluated to predict the spatial likelihood of new quarry development. Population density conditions at the time of new quarry development were used to modify the population density variable in the Quarry Operation models to apply to new quarry development sites. The Quarry Operation parameters derived for the New England study area, Carolina study area, and the combined New England and Carolina study areas were all similar in magnitude and relative strength. The Quarry Operation model parameters, using the modified population density variables, were found to be a good predictor of new quarry locations. Both the aggregate industry and the land management community can use the model approach to target areas for more detailed site evaluation for quarry location. The models can be revised easily to reflect actual or anticipated changes in transportation and population features. © International Association for Mathematical Geology 2007.

  12. A Probabilistic Model of Cross-Categorization

    ERIC Educational Resources Information Center

    Shafto, Patrick; Kemp, Charles; Mansinghka, Vikash; Tenenbaum, Joshua B.

    2011-01-01

    Most natural domains can be represented in multiple ways: we can categorize foods in terms of their nutritional content or social role, animals in terms of their taxonomic groupings or their ecological niches, and musical instruments in terms of their taxonomic categories or social uses. Previous approaches to modeling human categorization have…

  13. A Probabilistic Model of Theory Formation

    ERIC Educational Resources Information Center

    Kemp, Charles; Tenenbaum, Joshua B.; Niyogi, Sourabh; Griffiths, Thomas L.

    2010-01-01

    Concept learning is challenging in part because the meanings of many concepts depend on their relationships to other concepts. Learning these concepts in isolation can be difficult, but we present a model that discovers entire systems of related concepts. These systems can be viewed as simple theories that specify the concepts that exist in a…

  14. A Probabilistic Model of Cross-Categorization

    ERIC Educational Resources Information Center

    Shafto, Patrick; Kemp, Charles; Mansinghka, Vikash; Tenenbaum, Joshua B.

    2011-01-01

    Most natural domains can be represented in multiple ways: we can categorize foods in terms of their nutritional content or social role, animals in terms of their taxonomic groupings or their ecological niches, and musical instruments in terms of their taxonomic categories or social uses. Previous approaches to modeling human categorization have…

  15. A Probabilistic Model of Theory Formation

    ERIC Educational Resources Information Center

    Kemp, Charles; Tenenbaum, Joshua B.; Niyogi, Sourabh; Griffiths, Thomas L.

    2010-01-01

    Concept learning is challenging in part because the meanings of many concepts depend on their relationships to other concepts. Learning these concepts in isolation can be difficult, but we present a model that discovers entire systems of related concepts. These systems can be viewed as simple theories that specify the concepts that exist in a…

  16. Probabilistic Modeling of Settlement Risk at Land Disposal Facilities - 12304

    SciTech Connect

    Foye, Kevin C.; Soong, Te-Yang

    2012-07-01

    The long-term reliability of land disposal facility final cover systems - and therefore the overall waste containment - depends on the distortions imposed on these systems by differential settlement/subsidence. The evaluation of differential settlement is challenging because of the heterogeneity of the waste mass (caused by inconsistent compaction, void space distribution, debris-soil mix ratio, waste material stiffness, time-dependent primary compression of the fine-grained soil matrix, long-term creep settlement of the soil matrix and the debris, etc.) at most land disposal facilities. Deterministic approaches to long-term final cover settlement prediction are not able to capture the spatial variability in the waste mass and sub-grade properties which control differential settlement. An alternative, probabilistic solution is to use random fields to model the waste and sub-grade properties. The modeling effort informs the design, construction, operation, and maintenance of land disposal facilities. A probabilistic method to establish design criteria for waste placement and compaction is introduced using the model. Random fields are ideally suited to problems of differential settlement modeling of highly heterogeneous foundations, such as waste. Random fields model the seemingly random spatial distribution of a design parameter, such as compressibility. When used for design, the use of these models prompts the need for probabilistic design criteria. It also allows for a statistical approach to waste placement acceptance criteria. An example design evaluation was performed, illustrating the use of the probabilistic differential settlement simulation methodology to assemble a design guidance chart. The purpose of this design evaluation is to enable the designer to select optimal initial combinations of design slopes and quality control acceptance criteria that yield an acceptable proportion of post-settlement slopes meeting some design minimum. For this specific

  17. Probabilistic flood damage modelling at the meso-scale

    NASA Astrophysics Data System (ADS)

    Kreibich, Heidi; Botto, Anna; Schröter, Kai; Merz, Bruno

    2014-05-01

    Decisions on flood risk management and adaptation are usually based on risk analyses. Such analyses are associated with significant uncertainty, even more so if changes in risk due to global change are expected. Although uncertainty analysis and probabilistic approaches have received increased attention during the last years, they are still not standard practice for flood risk assessments. Most damage models have in common that complex damaging processes are described by simple, deterministic approaches like stage-damage functions. Novel probabilistic, multi-variate flood damage models have been developed and validated on the micro-scale using a data-mining approach, namely bagging decision trees (Merz et al. 2013). In this presentation we show how the model BT-FLEMO (Bagging decision Tree based Flood Loss Estimation MOdel) can be applied on the meso-scale, namely on the basis of ATKIS land-use units. The model is applied in 19 municipalities which were affected during the 2002 flood by the River Mulde in Saxony, Germany. The application of BT-FLEMO provides a probability distribution of estimated damage to residential buildings per municipality. Validation is undertaken, on the one hand, via a comparison with eight other damage models including stage-damage functions as well as multi-variate models. On the other hand, the results are compared with official damage data provided by the Saxon Relief Bank (SAB). The results show that uncertainties of damage estimation remain high. Thus, the significant advantage of this probabilistic flood loss estimation model BT-FLEMO is that it inherently provides quantitative information about the uncertainty of the prediction. Reference: Merz, B.; Kreibich, H.; Lall, U. (2013): Multi-variate flood damage assessment: a tree-based data-mining approach. NHESS, 13(1), 53-64.
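
    The bagged-tree idea behind such models can be sketched with scikit-learn: the spread of predictions across the bagged trees gives a distribution of loss estimates rather than a single value. The simulated loss data and predictor choices below are assumptions for illustration, not BT-FLEMO's actual variables.

```python
import numpy as np
from sklearn.ensemble import BaggingRegressor

rng = np.random.default_rng(6)

# Simulated loss data: water depth (m), building value (kEUR), precaution flag.
n = 400
X = np.column_stack([rng.uniform(0, 3, n), rng.uniform(50, 500, n), rng.integers(0, 2, n)])
loss = 0.08 * X[:, 0] * X[:, 1] * (1 - 0.3 * X[:, 2]) + rng.normal(0, 3, n)

# Bagging of decision trees (the default base estimator is a regression tree).
model = BaggingRegressor(n_estimators=200, random_state=0).fit(X, loss)

# The spread of per-tree predictions gives a probability distribution of the
# estimated damage for a unit, not just a point estimate.
unit = np.array([[1.5, 250.0, 0.0]])
per_tree = np.array([tree.predict(unit)[0] for tree in model.estimators_])
print(f"median loss {np.median(per_tree):.1f} kEUR, "
      f"90% interval [{np.percentile(per_tree, 5):.1f}, {np.percentile(per_tree, 95):.1f}] kEUR")
```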

  18. Use of Probabilistic Topic Models for Search

    DTIC Science & Technology

    2009-09-01

    people, whose advice was crucial for my research, would be too long to fit in here. A few people, however, shall be named here. I thank my advisors Prof...probability of a word occurring in a document is not well explained by a single parametric distribution. A mixture model attempts to fit to the document a...cupied tables over all restaurants as the sum of M normal random variables. A better fit that also works for smaller number of words and/or concentration

  19. Probabilistic modelling of sea surges in coastal urban areas

    NASA Astrophysics Data System (ADS)

    Georgiadis, Stylianos; Jomo Danielsen Sørup, Hjalte; Arnbjerg-Nielsen, Karsten; Nielsen, Bo Friis

    2016-04-01

    Urban floods are a major issue for coastal cities with severe impacts on economy, society and environment. A main cause of floods is sea surges stemming from extreme weather conditions. In the context of urban flooding, certain standards have to be met by critical infrastructures in order to protect them from floods. These standards can be so strict that no empirical data is available. For instance, protection plans for sub-surface railways against floods are established with 10,000-year return levels. Furthermore, the long technical lifetime of such infrastructures is a critical issue that should be considered, along with the associated climate change effects over this lifetime. We present a case study of Copenhagen where the metro system is being expanded at present with several stations close to the sea. The current critical sea levels for the metro have never been exceeded, and Copenhagen has only been severely flooded by pluvial events during the period in which measurements have been conducted. However, due to the very high return period that the metro has to be able to withstand and due to the expected sea-level rise caused by climate change, reliable estimates of the occurrence rate and magnitude of sea surges have to be established, as the current protection is expected to be insufficient at some point within the technical lifetime of the metro. The objective of this study is to probabilistically model sea level in Copenhagen, as opposed to extrapolating the extreme statistics as is often the practice. A better understanding and more realistic description of the phenomena leading to sea surges can then be given. The application of hidden Markov models to high-resolution data of sea level for different meteorological stations in and around Copenhagen is an effective tool to address uncertainty. For sea surge studies, the hidden states of the model may reflect the hydrological processes that contribute to coastal floods. Also, the states of the hidden Markov

  20. IPACS (Integrated Probabilistic Assessment of Composite Structures): Code development and applications

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Shiao, Michael C.

    1993-01-01

    A methodology and attendant computer code have been developed and are described to computationally simulate the uncertain behavior of composite structures. The uncertain behavior includes buckling loads, stress concentration factors, displacements, stress/strain etc., which are the consequences of the inherent uncertainties (scatter) in the primitive (independent random) variables (constituent, ply, laminate and structural) that describe the composite structures. The computer code, IPACS (Integrated Probabilistic Assessment of Composite Structures), can handle both composite mechanics and composite structures. Application to probabilistic composite mechanics is illustrated by its uses to evaluate the uncertainties in the major Poisson's ratio and in laminate stiffness and strength. IPACS application to probabilistic structural analysis is illustrated by its use to evaluate the uncertainties in the buckling of a composite plate, in the stress concentration factor in a composite panel and in the vertical displacement and ply stress in a composite aircraft wing segment.

  1. A probabilistic gastrointestinal tract dosimetry model

    NASA Astrophysics Data System (ADS)

    Huh, Chulhaeng

    In internal dosimetry, the tissues of the gastrointestinal (GI) tract represent, along with the hematopoietic bone marrow, some of the most radiosensitive organs of the body. Endoscopic ultrasound is a unique tool to acquire in-vivo data on GI tract wall thicknesses at the resolution needed in radiation dosimetry studies. Through their different echo texture and intensity, five layers of differing echo patterns for superficial mucosa, deep mucosa, submucosa, muscularis propria and serosa exist within the walls of organs composing the alimentary tract. Thicknesses for stomach mucosa ranged from 620 ± 150 μm to 1320 ± 80 μm (total stomach wall thicknesses from 2.56 ± 0.12 to 4.12 ± 0.11 mm). Measurements made for the rectal images revealed rectal mucosal thicknesses from 150 ± 90 μm to 670 ± 110 μm (total rectal wall thicknesses from 2.01 ± 0.06 to 3.35 ± 0.46 mm). The mucosa thus accounted for 28 ± 3% and 16 ± 6% of the total thickness of the stomach and rectal wall, respectively. Radiation transport simulations were then performed using the Monte Carlo N-Particle (MCNP) 4C transport code to calculate S values (Gy/Bq-s) for penetrating and nonpenetrating radiations such as photons, beta particles, conversion electrons and Auger electrons of selected nuclides (123I, 131I, 99mTc and 90Y) under two source conditions: content and mucosa sources, respectively. The results of this study demonstrate generally good agreement with published data for the stomach mucosa wall. The rectal mucosa data are consistently higher than published data for the large intestine due to different radiosensitive cell thicknesses (350 μm vs. a range spanning from 149 μm to 729 μm) and different geometry when a rectal content source is considered. Generally, the ICRP models have been designed to predict the radiation dose in the human body for a "typical" or "reference" individual in a given population. The study has been performed to

  2. Probabilistic Cross-matching of Radio Catalogs with Geometric Models

    NASA Astrophysics Data System (ADS)

    Fan, D.; Budavári, T.

    2014-05-01

    Cross-matching radio catalogs is different from cross-matching in the optical. Radio sources can have multiple corresponding detections, the core and its lobes, which makes identification and cross-identification to other catalogs much more difficult. Traditionally, these cases have been handled manually, with researchers looking at the possible candidates; this will not be possible for the upcoming radio surveys ultimately leading to the Square Kilometer Array. We present a probabilistic method that can automatically associate radio sources by explicitly modeling their morphology. Our preliminary results, based on a simple straight-line model, seem to be on par with the manual associations.

  3. A simple probabilistic model of multibody interactions in proteins.

    PubMed

    Johansson, Kristoffer Enøe; Hamelryck, Thomas

    2013-08-01

    Protein structure prediction methods typically use statistical potentials, which rely on statistics derived from a database of known protein structures. In the vast majority of cases, these potentials involve pairwise distances or contacts between amino acids or atoms. Although some potentials beyond pairwise interactions have been described, the formulation of a general multibody potential is seen as intractable due to the perceived limited amount of data. In this article, we show that it is possible to formulate a probabilistic model of higher order interactions in proteins, without arbitrarily limiting the number of contacts. The success of this approach is based on replacing a naive table-based approach with a simple hierarchical model involving suitable probability distributions and conditional independence assumptions. The model captures the joint probability distribution of an amino acid and its neighbors, local structure and solvent exposure. We show that this model can be used to approximate the conditional probability distribution of an amino acid sequence given a structure using a pseudo-likelihood approach. We verify the model by decoy recognition and site-specific amino acid predictions. Our coarse-grained model is compared to state-of-the-art methods that use full atomic detail. This article illustrates how the use of simple probabilistic models can lead to new opportunities in the treatment of nonlocal interactions in knowledge-based protein structure prediction and design. Copyright © 2013 Wiley Periodicals, Inc., a Wiley company.

  4. A Probabilistic Model of Meter Perception: Simulating Enculturation

    PubMed Central

    van der Weij, Bastiaan; Pearce, Marcus T.; Honing, Henkjan

    2017-01-01

    Enculturation is known to shape the perception of meter in music but this is not explicitly accounted for by current cognitive models of meter perception. We hypothesize that the induction of meter is a result of predictive coding: interpreting onsets in a rhythm relative to a periodic meter facilitates prediction of future onsets. Such prediction, we hypothesize, is based on previous exposure to rhythms. As such, predictive coding provides a possible explanation for the way meter perception is shaped by the cultural environment. Based on this hypothesis, we present a probabilistic model of meter perception that uses statistical properties of the relation between rhythm and meter to infer meter from quantized rhythms. We show that our model can successfully predict annotated time signatures from quantized rhythmic patterns derived from folk melodies. Furthermore, we show that by inferring meter, our model improves prediction of the onsets of future events compared to a similar probabilistic model that does not infer meter. Finally, as a proof of concept, we demonstrate how our model can be used in a simulation of enculturation. From the results of this simulation, we derive a class of rhythms that are likely to be interpreted differently by enculturated listeners with different histories of exposure to rhythms. PMID:28588533

  5. Spatial probabilistic pulsatility model for enhancing photoplethysmographic imaging systems

    NASA Astrophysics Data System (ADS)

    Amelard, Robert; Clausi, David A.; Wong, Alexander

    2016-11-01

    Photoplethysmographic imaging (PPGI) is a widefield noncontact biophotonic technology able to remotely monitor cardiovascular function over anatomical areas. Although spatial context can provide insight into physiologically relevant sampling locations, existing PPGI systems rely on coarse spatial averaging with no anatomical priors for assessing arterial pulsatility. Here, we developed a continuous probabilistic pulsatility model for importance-weighted blood pulse waveform extraction. Using a data-driven approach, the model was constructed using a 23 participant sample with a large demographic variability (11/12 female/male, age 11 to 60 years, BMI 16.4 to 35.1 kg·m-2). Using time-synchronized ground-truth blood pulse waveforms, spatial correlation priors were computed and projected into a coaligned importance-weighted Cartesian space. A modified Parzen-Rosenblatt kernel density estimation method was used to compute the continuous resolution-agnostic probabilistic pulsatility model. The model identified locations that consistently exhibited pulsatility across the sample. Blood pulse waveform signals extracted with the model exhibited significantly stronger temporal correlation (W=35,p<0.01) and spectral SNR (W=31,p<0.01) compared to uniform spatial averaging. Heart rate estimation was in strong agreement with true heart rate [r2=0.9619, error (μ,σ)=(0.52,1.69) bpm].

  6. Probabilistic modeling of financial exposure to flood in France

    NASA Astrophysics Data System (ADS)

    Moncoulon, David; Quantin, Antoine; Leblois, Etienne

    2014-05-01

    CCR is a French reinsurance company which offers natural catastrophe covers with the State guarantee. Within this framework, CCR develops its own models to assess its financial exposure to floods, droughts, earthquakes and other perils, and thus the exposure of insurers and the French State. A probabilistic flood model has been developed in order to estimate the financial exposure of the Nat Cat insurance market to flood events, depending on their annual occurrence probability. This presentation is organized in two parts. The first part is dedicated to the development of a flood hazard and damage model (ARTEMIS). The model calibration and validation on historical events are then described. In the second part, the coupling of ARTEMIS with two generators of probabilistic events is achieved: a stochastic flow generator and a stochastic spatialized precipitation generator, adapted from the SAMPO model developed by IRSTEA. The analysis of the complementary nature of these two generators is proposed: the first one allows generating floods on the French hydrological station network; the second allows simulating surface water runoff and small river floods, even on ungauged rivers. Thus, the simulation of thousands of non-occurred, but possible, events allows us to provide for the first time an estimate of the financial exposure to flooding in France at different scales (commune, department, country) and from different points of view (hazard, vulnerability and damages).

  7. Probabilistic/Fracture-Mechanics Model For Service Life

    NASA Technical Reports Server (NTRS)

    Watkins, T., Jr.; Annis, C. G., Jr.

    1991-01-01

    Computer program makes probabilistic estimates of lifetime of engine and components thereof. Developed to fill need for more accurate life-assessment technique that avoids errors in estimated lives and provides for statistical assessment of levels of risk created by engineering decisions in designing system. Implements mathematical model combining techniques of statistics, fatigue, fracture mechanics, nondestructive analysis, life-cycle cost analysis, and management of engine parts. Used to investigate effects of such engine-component life-controlling parameters as return-to-service intervals, stresses, capabilities for nondestructive evaluation, and qualities of materials.

  8. Probabilistic/Fracture-Mechanics Model For Service Life

    NASA Technical Reports Server (NTRS)

    Watkins, T., Jr.; Annis, C. G., Jr.

    1991-01-01

    Computer program makes probabilistic estimates of lifetime of engine and components thereof. Developed to fill need for more accurate life-assessment technique that avoids errors in estimated lives and provides for statistical assessment of levels of risk created by engineering decisions in designing system. Implements mathematical model combining techniques of statistics, fatigue, fracture mechanics, nondestructive analysis, life-cycle cost analysis, and management of engine parts. Used to investigate effects of such engine-component life-controlling parameters as return-to-service intervals, stresses, capabilities for nondestructive evaluation, and qualities of materials.

  9. Probabilistic assessment of agricultural droughts using graphical models

    NASA Astrophysics Data System (ADS)

    Ramadas, Meenu; Govindaraju, Rao S.

    2015-07-01

    Agricultural droughts are often characterized by soil moisture in the root zone of the soil, but crop needs are rarely factored into the analysis. Since water needs vary with crops, agricultural drought incidences in a region can be characterized better if crop responses to soil water deficits are also accounted for in the drought index. This study investigates agricultural droughts driven by plant stress due to soil moisture deficits using crop stress functions available in the literature. Crop water stress is assumed to begin at the soil moisture level corresponding to incipient stomatal closure, and reaches its maximum at the crop's wilting point. Using available location-specific crop acreage data, a weighted crop water stress function is computed. A new probabilistic agricultural drought index is then developed within a hidden Markov model (HMM) framework that provides model uncertainty in drought classification and accounts for time dependence between drought states. The proposed index allows probabilistic classification of the drought states and takes due cognizance of the stress experienced by the crop due to soil moisture deficit. The capabilities of HMM model formulations for assessing agricultural droughts are compared to those of current drought indices such as standardized precipitation evapotranspiration index (SPEI) and self-calibrating Palmer drought severity index (SC-PDSI). The HMM model identified critical drought events and several drought occurrences that are not detected by either SPEI or SC-PDSI, and shows promise as a tool for agricultural drought studies.

  10. A Probabilistic Model for Simulating Magnetoacoustic Emission Responses in Ferromagnets

    NASA Technical Reports Server (NTRS)

    Namkung, M.; Fulton, J. P.; Wincheski, B.

    1993-01-01

    Magnetoacoustic emission (MAE) is a phenomenon where acoustic noise is generated due to the motion of non-180° magnetic domain walls in a ferromagnet with non-zero magnetostrictive constants. MAE has been studied extensively for many years and has even been applied as an NDE tool for characterizing the heat treatment of high-yield low carbon steels. A complete theory which fully accounts for the magnetoacoustic response, however, has not yet emerged. The motion of the domain walls appears to be a totally random process, however, it does exhibit features of regularity which have been identified by studying phenomena such as 1/f flicker noise and self-organized criticality (SOC). In this paper, a probabilistic model incorporating the effects of SOC has been developed to help explain the MAE response. The model uses many simplifying assumptions yet yields good qualitative agreement with observed experimental results and also provides some insight into the possible underlying mechanisms responsible for MAE. We begin by providing a brief overview of magnetoacoustic emission and the experimental set-up used to obtain the MAE signal. We then describe a pseudo-probabilistic model used to predict the MAE response and give an example of the predicted result. Finally, the model is modified to account for SOC and the new predictions are shown and compared with experiment.

  11. Probabilistic model for bridge structural evaluation using nondestructive inspection data

    NASA Astrophysics Data System (ADS)

    Carrion, Francisco; Lopez, Jose Alfredo; Balankin, Alexander

    2005-05-01

    A bridge management system developed for the Mexican toll highway network applies a probabilistic-reliability model to estimate load capacity and structural residual life. Basic inputs for the system are the global inspection data (visual inspections and vibration testing) and information on the environmental conditions (weather, traffic, loads, earthquakes), although the model can also take into account additional non-destructive testing or permanent monitoring data. Main outputs are the periodic maintenance, rehabilitation and replacement program, and the updated inspection program. Both programs are adjusted to the available funds and scheduled according to a priority assignation criterion. The probabilistic model, tailored to typical bridges, accounts for the size, age, material and structure type. Bridges of special size or type may also be included; in these cases deterministic finite element models are possible as well. A key feature is that the structural qualification is given in terms of the probability of failure, calculated considering fundamental degradation mechanisms and from actual direct observations and measurements, such as crack distribution and size, materials properties, bridge dimensions, load deflections, and parameters for corrosion evaluation. Vibration measurements are mainly used to infer structural resistance and to monitor long-term degradation.

  12. Economic Dispatch for Microgrid Containing Electric Vehicles via Probabilistic Modeling: Preprint

    SciTech Connect

    Yao, Yin; Gao, Wenzhong; Momoh, James; Muljadi, Eduard

    2016-02-11

    In this paper, an economic dispatch model with probabilistic modeling is developed for a microgrid. The electric power supply in a microgrid consists of conventional power plants and renewable energy power plants, such as wind and solar power plants. Because of the fluctuation in the output of solar and wind power plants, an empirical probabilistic model is developed to predict their hourly output. Because wind and solar power plants have different output characteristics, the parameters of the probabilistic distributions are adjusted individually for each. In addition, with the growing adoption of plug-in hybrid electric vehicles (PHEVs), an integrated microgrid system must also consider their impact. The charging loads from PHEVs as well as the discharging output via the vehicle-to-grid (V2G) method can greatly affect the economic dispatch for all of the micro energy sources in a microgrid. This paper presents an optimization method for economic dispatch in a microgrid considering conventional power plants, renewable power plants, and PHEVs. The simulation results reveal that PHEVs with V2G capability can be an indispensable supplement in a modern microgrid.

  13. Near Real-Time Probabilistic Damage Diagnosis Using Surrogate Modeling and High Performance Computing

    NASA Technical Reports Server (NTRS)

    Warner, James E.; Zubair, Mohammad; Ranjan, Desh

    2017-01-01

    This work investigates novel approaches to probabilistic damage diagnosis that utilize surrogate modeling and high performance computing (HPC) to achieve substantial computational speedup. Motivated by Digital Twin, a structural health management (SHM) paradigm that integrates vehicle-specific characteristics with continual in-situ damage diagnosis and prognosis, the methods studied herein yield near real-time damage assessments that could enable monitoring of a vehicle's health while it is operating (i.e. online SHM). High-fidelity modeling and uncertainty quantification (UQ), both critical to Digital Twin, are incorporated using finite element method simulations and Bayesian inference, respectively. The crux of the proposed Bayesian diagnosis methods, however, is the reformulation of the numerical sampling algorithms (e.g. Markov chain Monte Carlo) used to generate the resulting probabilistic damage estimates. To this end, three distinct methods are demonstrated for rapid sampling that utilize surrogate modeling and exploit various degrees of parallelism for leveraging HPC. The accuracy and computational efficiency of the methods are compared on the problem of strain-based crack identification in thin plates. While each approach has inherent problem-specific strengths and weaknesses, all approaches are shown to provide accurate probabilistic damage diagnoses and several orders of magnitude computational speedup relative to a baseline Bayesian diagnosis implementation.
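
    A minimal sketch of the general idea, assuming a hypothetical polynomial surrogate in place of the finite element model and made-up noise levels (this is not the paper's code or parameterization): Metropolis-Hastings sampling of a crack length from noisy strain measurements, where every likelihood evaluation calls only the inexpensive surrogate.

```python
# Sketch: Bayesian crack-length estimation via Metropolis-Hastings with a cheap surrogate.
import numpy as np

rng = np.random.default_rng(0)

def surrogate_strain(a):
    # Hypothetical surrogate: gauge strain as a quadratic in crack length a [mm].
    return 1e-4 * (1.0 + 0.8 * a + 0.05 * a**2)

a_true = 4.0
y_obs = surrogate_strain(a_true) + rng.normal(0, 2e-5, size=5)   # synthetic "measurements"
sigma = 2e-5                                                      # assumed measurement noise

def log_post(a):
    if not 0.0 < a < 20.0:                      # uniform prior on (0, 20) mm
        return -np.inf
    return -0.5 * np.sum((y_obs - surrogate_strain(a))**2) / sigma**2

samples, a = [], 10.0
for _ in range(20000):
    prop = a + rng.normal(0, 0.5)               # random-walk proposal
    if np.log(rng.uniform()) < log_post(prop) - log_post(a):
        a = prop
    samples.append(a)

burned = np.array(samples[5000:])
print(f"posterior mean crack length: {burned.mean():.2f} mm "
      f"(95% interval {np.percentile(burned, 2.5):.2f}-{np.percentile(burned, 97.5):.2f})")
```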

  14. Probabilistic logic modeling of network reliability for hybrid network architectures

    SciTech Connect

    Wyss, G.D.; Schriner, H.K.; Gaylor, T.R.

    1996-10-01

    Sandia National Laboratories has found that the reliability and failure modes of current-generation network technologies can be effectively modeled using fault tree-based probabilistic logic modeling (PLM) techniques. We have developed fault tree models that include various hierarchical networking technologies and classes of components interconnected in a wide variety of typical and atypical configurations. In this paper we discuss the types of results that can be obtained from PLMs and why these results are of great practical value to network designers and analysts. After providing some mathematical background, we describe the "plug-and-play" fault tree analysis methodology that we have developed for modeling connectivity and the provision of network services in several current-generation network architectures. Finally, we demonstrate the flexibility of the method by modeling the reliability of a hybrid example network that contains several interconnected Ethernet, FDDI, and token ring segments.
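
    For readers unfamiliar with this style of calculation, the following toy example (hypothetical component failure probabilities, not Sandia's models) evaluates a small connectivity fault tree under the usual independence assumption, with OR gates computed as unions and AND gates as intersections of independent events.

```python
# Toy fault-tree evaluation: probability of losing network connectivity,
# assuming independent basic events and hypothetical failure probabilities.
def p_or(*ps):                      # OR gate: union of independent events
    q = 1.0
    for p in ps:
        q *= (1.0 - p)
    return 1.0 - q

def p_and(*ps):                     # AND gate: intersection of independent events
    prod = 1.0
    for p in ps:
        prod *= p
    return prod

p_nic = 1e-3                        # hypothetical per-component failure probabilities
p_switch = 5e-4
p_link_a = 2e-3
p_link_b = 2e-3

# Connectivity is lost if the NIC fails, the switch fails, or BOTH redundant links fail.
p_top = p_or(p_nic, p_switch, p_and(p_link_a, p_link_b))
print(f"P(loss of connectivity) = {p_top:.6f}")
```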

  15. Road environment perception algorithm based on object semantic probabilistic model

    NASA Astrophysics Data System (ADS)

    Liu, Wei; Wang, XinMei; Tian, Jinwen; Wang, Yong

    2015-12-01

    This article develops an object semantic probabilistic model (OSPM) based on statistical test analysis. We applied this model to a road forward-environment perception algorithm, including on-road object recognition and detection. First, the image is represented by a set of words (local feature regions). Then, the probability distribution relating the image, local regions, and object semantic categories is derived from the new model. In training, the parameters of the object model are estimated using expectation-maximization in a maximum likelihood setting. In recognition, the model is used to classify images in a Bayesian manner. In detection, the posterior is calculated to detect typical on-road objects. Experiments show good performance on object recognition and detection against urban street backgrounds.

  16. Integrated software health management for aerospace guidance, navigation, and control systems: A probabilistic reasoning approach

    NASA Astrophysics Data System (ADS)

    Mbaya, Timmy

    Embedded Aerospace Systems have to perform safety and mission critical operations in a real-time environment where timing and functional correctness are extremely important. Guidance, Navigation, and Control (GN&C) systems substantially rely on complex software interfacing with hardware in real-time; any faults in software or hardware, or their interaction could result in fatal consequences. Integrated Software Health Management (ISWHM) provides an approach for detection and diagnosis of software failures while the software is in operation. The ISWHM approach is based on probabilistic modeling of software and hardware sensors using a Bayesian network. To meet memory and timing constraints of real-time embedded execution, the Bayesian network is compiled into an Arithmetic Circuit, which is used for on-line monitoring. This type of system monitoring, using an ISWHM, provides automated reasoning capabilities that compute diagnoses in a timely manner when failures occur. This reasoning capability enables time-critical mitigating decisions and relieves the human agent from the time-consuming and arduous task of foraging through a multitude of isolated---and often contradictory---diagnosis data. For the purpose of demonstrating the relevance of ISWHM, modeling and reasoning is performed on a simple simulated aerospace system running on a real-time operating system emulator, the OSEK/Trampoline platform. Models for a small satellite and an F-16 fighter jet GN&C (Guidance, Navigation, and Control) system have been implemented. Analysis of the ISWHM is then performed by injecting faults and analyzing the ISWHM's diagnoses.
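
    A highly simplified sketch of the underlying inference (hypothetical probabilities and sensor names; the actual ISWHM compiles its Bayesian network into an arithmetic circuit rather than enumerating): posterior probability of a software fault given two discretized health sensors in a tiny two-child network.

```python
# Toy Bayesian network  Fault -> {TimingSensor, RangeSensor}, evaluated by enumeration.
# All conditional probabilities are assumed values for illustration only.
P_fault = 0.01
P_timing_bad = {True: 0.90, False: 0.05}   # P(timing violation | fault state)
P_range_bad  = {True: 0.70, False: 0.02}   # P(out-of-range output | fault state)

def posterior_fault(timing_bad: bool, range_bad: bool) -> float:
    def lik(fault: bool) -> float:
        pt = P_timing_bad[fault] if timing_bad else 1 - P_timing_bad[fault]
        pr = P_range_bad[fault] if range_bad else 1 - P_range_bad[fault]
        return pt * pr
    num = P_fault * lik(True)
    den = num + (1 - P_fault) * lik(False)
    return num / den

print(posterior_fault(timing_bad=True, range_bad=True))   # both sensors alarm
print(posterior_fault(timing_bad=True, range_bad=False))  # only the timing sensor alarms
```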

  17. Modeling of human artery tissue with probabilistic approach.

    PubMed

    Xiong, Linfei; Chui, Chee-Kong; Fu, Yabo; Teo, Chee-Leong; Li, Yao

    2015-04-01

    Accurate modeling of biological soft tissue properties is vital for realistic medical simulation. Mechanical response of biological soft tissue always exhibits a strong variability due to the complex microstructure and different loading conditions. The inhomogeneity in human artery tissue is modeled with a computational probabilistic approach by assuming that the instantaneous stress at a specific strain varies according to normal distribution. Material parameters of the artery tissue which are modeled with a combined logarithmic and polynomial energy equation are represented by a statistical function with normal distribution. Mean and standard deviation of the material parameters are determined using genetic algorithm (GA) and inverse mean-value first-order second-moment (IMVFOSM) method, respectively. This nondeterministic approach was verified using computer simulation based on the Monte-Carlo (MC) method. Cumulative distribution function (CDF) of the MC simulation corresponds well with that of the experimental stress-strain data and the probabilistic approach is further validated using data from other studies. By taking into account the inhomogeneous mechanical properties of human biological tissue, the proposed method is suitable for realistic virtual simulation as well as an accurate computational approach for medical device validation.
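
    The following sketch illustrates the Monte Carlo step in spirit only, using a hypothetical two-parameter stress-strain relation rather than the paper's combined logarithmic and polynomial energy equation; the parameter means and standard deviations are assumed values.

```python
# Sketch: Monte Carlo propagation of normally distributed material parameters through a
# hypothetical stress-strain relation, yielding the spread of stress at a fixed strain.
import numpy as np

rng = np.random.default_rng(1)
n = 10000
strain = 0.15

# Hypothetical law: sigma = c1*strain + c2*strain**3, with c1, c2 ~ Normal (assumed values).
c1 = rng.normal(loc=120.0, scale=15.0, size=n)    # kPa
c2 = rng.normal(loc=900.0, scale=120.0, size=n)   # kPa
stress = c1 * strain + c2 * strain**3

print(f"mean stress: {stress.mean():.2f} kPa, "
      f"5th-95th percentile: {np.percentile(stress, 5):.2f}-{np.percentile(stress, 95):.2f} kPa")
```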

  18. Probabilistic models of species discovery and biodiversity comparisons.

    PubMed

    Edie, Stewart M; Smits, Peter D; Jablonski, David

    2017-04-04

    Inferring large-scale processes that drive biodiversity hinges on understanding the phylogenetic and spatial pattern of species richness. However, clades and geographic regions are accumulating newly described species at an uneven rate, potentially affecting the stability of currently observed diversity patterns. Here, we present a probabilistic model of species discovery to assess the uncertainty in diversity levels among clades and regions. We use a Bayesian time series regression to estimate the long-term trend in the rate of species description for marine bivalves and find a distinct spatial bias in the accumulation of new species. Despite these biases, probabilistic estimates of future species richness show considerable stability in the currently observed rank order of regional diversity. However, absolute differences in richness are still likely to change, potentially modifying the correlation between species numbers and geographic, environmental, and biological factors thought to promote biodiversity. Applied to scallops and related clades, we find that accumulating knowledge of deep-sea species will likely shift the relative richness of these three families, emphasizing the need to consider the incomplete nature of bivalve taxonomy in quantitative studies of its diversity. Along with estimating expected changes to observed patterns of diversity, the model described in this paper pinpoints geographic areas and clades most urgently requiring additional systematic study, an important practice for building more complete and accurate models of biodiversity dynamics that can inform ecological and evolutionary theory and improve conservation practice.

  19. Efficient diagnosis of multiprocessor systems under probabilistic models

    NASA Technical Reports Server (NTRS)

    Blough, Douglas M.; Sullivan, Gregory F.; Masson, Gerald M.

    1989-01-01

    The problem of fault diagnosis in multiprocessor systems is considered under a probabilistic fault model. The focus is on minimizing the number of tests that must be conducted in order to correctly diagnose the state of every processor in the system with high probability. A diagnosis algorithm that can correctly diagnose the state of every processor with probability approaching one in a class of systems performing slightly greater than a linear number of tests is presented. A nearly matching lower bound on the number of tests required to achieve correct diagnosis in arbitrary systems is also proven. Lower and upper bounds on the number of tests required for regular systems are also presented. A class of regular systems which includes hypercubes is shown to be correctly diagnosable with high probability. In all cases, the number of tests required under this probabilistic model is shown to be significantly less than under a bounded-size fault set model. Because the number of tests that must be conducted is a measure of the diagnosis overhead, these results represent a dramatic improvement in the performance of system-level diagnosis techniques.

  20. Probabilistic evaluation of integrating resource recovery into wastewater treatment to improve environmental sustainability.

    PubMed

    Wang, Xu; McCarty, Perry L; Liu, Junxin; Ren, Nan-Qi; Lee, Duu-Jong; Yu, Han-Qing; Qian, Yi; Qu, Jiuhui

    2015-02-03

    Global expectations for wastewater service infrastructure have evolved over time, and the standard treatment methods used by wastewater treatment plants (WWTPs) are facing issues related to problem shifting due to the current emphasis on sustainability. A transition in WWTPs toward reuse of wastewater-derived resources is recognized as a promising solution for overcoming these obstacles. However, it remains uncertain whether this approach can reduce the environmental footprint of WWTPs. To test this hypothesis, we conducted a net environmental benefit calculation for several scenarios for more than 50 individual countries over a 20-y time frame. For developed countries, the resource recovery approach resulted in ∼154% net increase in the environmental performance of WWTPs compared with the traditional substance elimination approach, whereas this value decreased to ∼60% for developing countries. Subsequently, we conducted a probabilistic analysis integrating these estimates with national values and determined that, if this transition was attempted for WWTPs in developed countries, it would have a ∼65% probability of attaining net environmental benefits. However, this estimate decreased greatly to ∼10% for developing countries, implying a substantial risk of failure. These results suggest that implementation of this transition for WWTPs should be studied carefully in different temporal and spatial contexts. Developing countries should customize their approach to realizing more sustainable WWTPs, rather than attempting to simply replicate the successful models of developed countries. Results derived from the model forecasting highlight the role of bioenergy generation and reduced use of chemicals in improving the sustainability of WWTPs in developing countries.

  1. Probabilistic multicompartmental model for interpreting DGT kinetics in sediments.

    PubMed

    Ciffroy, P; Nia, Y; Garnier, J M

    2011-11-15

    Extensive research has been performed on the use of the DIFS (DGT-Induced Fluxes in Soils and Sediments) model to interpret diffusive gradients in thin-film, or DGT, measurements in soils and sediments. The current report identifies some areas where the DIFS model has been shown to yield poor results and proposes a model to address weaknesses. In particular, two major flaws in the current approaches are considered: (i) many studies of accumulation kinetics in DGT exhibit multiple kinetic stages and (ii) several combinations of the two fitted DIFS parameters can yield identical results, leaving the question of how to select the 'best' combination. Previously, problem (i) has been addressed by separating the experimental data sets into distinct time segments. To overcome these problems, a model considering two types of particulate binding sites is proposed, instead of the DIFS model which assumed one single particulate pool. A probabilistic approach is proposed to fit experimental data and to determine the range of possible physical parameters using Probability Distribution Functions (PDFs), as opposed to single values without any indication of their uncertainty. The new probabilistic model, called DGT-PROFS, was tested on three different formulated sediments which mainly differ in the presence or absence of iron oxides. It was shown that a good fit can be obtained for the complete set of data (instead of DIFS-2D) and that a range of uncertainty values for each modeling parameter can be obtained. The interpretation of parameter PDFs allows one to distinguish between a variety of geochemical behaviors, providing useful information on metal dynamics in sediments.

  2. Probabilistic multi-scale modeling of pathogen dynamics in rivers

    NASA Astrophysics Data System (ADS)

    Packman, A. I.; Drummond, J. D.; Aubeneau, A. F.

    2014-12-01

    Most parameterizations of microbial dynamics and pathogen transport in surface waters rely on classic assumptions of advection-diffusion behavior in the water column and limited interactions between the water column and sediments. However, recent studies have shown that strong surface-subsurface interactions produce a wide range of transport timescales in rivers, and greatly increase the opportunity for long-term retention of pathogens in sediment beds and benthic biofilms. We present a stochastic model for pathogen dynamics, based on continuous-time random walk theory, that properly accounts for such diverse transport timescales, along with the remobilization and inactivation of pathogens in storage reservoirs. By representing pathogen dynamics probabilistically, the model framework enables diverse local-scale processes to be incorporated in system-scale models. We illustrate the application of the model to microbial dynamics in rivers based on the results of a tracer injection experiment. In-stream transport and surface-subsurface interactions are parameterized based on observations of conservative tracer transport, while E. coli retention and inactivation in sediments are parameterized based on direct local-scale experiments. The results indicate that sediments are an important reservoir of enteric organisms in rivers, and slow remobilization from sediments represents a long-term source of bacteria to streams. Current capability, potential advances, and limitations of this model framework for assessing pathogen transmission risks will be discussed. Because the transport model is probabilistic, it is amenable to incorporation into risk models, but a lack of characterization of key microbial processes in sediments and benthic biofilms hinders current application.
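
    A bare-bones continuous-time random walk of this flavor can be simulated directly; the sketch below uses assumed reach length, velocity, Pareto storage-time parameters, and inactivation rate (not the calibrated values from the tracer experiment) and tracks whether each particle reaches the downstream end still viable.

```python
# Sketch of a continuous-time random walk: particles advect downstream, are immobilized
# in the bed for power-law distributed times, and may be inactivated while stored.
import numpy as np

rng = np.random.default_rng(2)
n, L, v = 5000, 1000.0, 0.2             # particles, reach length [m], velocity [m/s]
mean_free_time = 600.0                   # assumed mean time between immobilizations [s]
alpha, t0 = 1.5, 30.0                    # assumed Pareto tail exponent and scale [s]
k_inact = 1e-5                           # assumed inactivation rate while stored [1/s]

arrival = []
for _ in range(n):
    x = t = 0.0
    survived = True
    while x < L:
        dt_mobile = rng.exponential(mean_free_time)
        x += v * dt_mobile
        t += dt_mobile
        if x < L:                        # immobilization event in the sediment bed
            dt_store = t0 * (rng.pareto(alpha) + 1.0)
            t += dt_store
            if rng.uniform() < 1.0 - np.exp(-k_inact * dt_store):
                survived = False
                break
    if survived:
        arrival.append(t)

print(f"fraction arriving viable: {len(arrival)/n:.2f}, "
      f"median travel time: {np.median(arrival)/3600:.1f} h")
```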

  3. Probabilistic Modeling of Aircraft Trajectories for Dynamic Separation Volumes

    NASA Technical Reports Server (NTRS)

    Lewis, Timothy A.

    2016-01-01

    With a proliferation of new and unconventional vehicles and operations expected in the future, the ab initio airspace design will require new approaches to trajectory prediction for separation assurance and other air traffic management functions. This paper presents an approach to probabilistic modeling of the trajectory of an aircraft when its intent is unknown. The approach uses a set of feature functions to constrain a maximum entropy probability distribution based on a set of observed aircraft trajectories. This model can be used to sample new aircraft trajectories to form an ensemble reflecting the variability in an aircraft's intent. The model learning process ensures that the variability in this ensemble reflects the behavior observed in the original data set. Computational examples are presented.

  4. Extending the dimensionality of flatland with attribute view probabilistic models

    NASA Astrophysics Data System (ADS)

    Neufeld, Eric; Bickis, Mikelis; Grant, Kevin

    2008-01-01

    In much of Bertin's Semiology of Graphics, marks representing individuals are arranged on paper according to their various attributes (components). Paper and computer monitors can conveniently map two attributes to width and height, and can map other attributes into nonspatial dimensions such as texture, or colour. Good visualizations exploit the human perceptual apparatus so that key relationships are quickly detected as interesting patterns. Graphical models take a somewhat dual approach with respect to the original information. Components, rather than individuals, are represented as marks. Links between marks represent conceptually simple, easily computable, and typically probabilistic relationships of possibly varying strength, and the viewer studies the diagram to discover deeper relationships. Although visually annotated graphical models have been around for almost a century, they have not been widely used. We argue that they have the potential to represent multivariate data as generically as pie charts represent univariate data. The present work suggests a semiology for graphical models, and discusses the consequences for information visualization.

  5. scoringRules - A software package for probabilistic model evaluation

    NASA Astrophysics Data System (ADS)

    Lerch, Sebastian; Jordan, Alexander; Krüger, Fabian

    2016-04-01

    Models in the geosciences are generally surrounded by uncertainty, and being able to quantify this uncertainty is key to good decision making. Accordingly, probabilistic forecasts in the form of predictive distributions have become popular over the last decades. With the proliferation of probabilistic models arises the need for decision-theoretically principled tools to evaluate the appropriateness of models and forecasts in a generalized way. Various scoring rules have been developed over the past decades to address this demand. Proper scoring rules are functions S(F, y) that evaluate the accuracy of a forecast distribution F, given that an outcome y was observed. As such, they allow comparison of alternative models, a crucial ability given the variety of theories, data sources, and statistical specifications available in many situations. This poster presents the software package scoringRules for the statistical programming language R, which contains functions to compute popular scoring rules such as the continuous ranked probability score for a variety of distributions F that come up in applied work. Two main classes are parametric distributions like normal, t, or gamma distributions, and distributions that are not known analytically but are indirectly described through a sample of simulation draws. For example, Bayesian forecasts produced via Markov chain Monte Carlo take this form. The scoringRules package thereby provides a framework for generalized model evaluation that includes both Bayesian and classical parametric models. The scoringRules package aims to be a convenient dictionary-like reference for computing scoring rules. We offer state-of-the-art implementations of several known (but not routinely applied) formulas, and implement closed-form expressions that were previously unavailable. Whenever more than one implementation variant exists, we offer statistically principled default choices.
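
    The scoringRules package itself targets R; as a language-neutral illustration of what such a scoring function does, the snippet below computes the continuous ranked probability score (CRPS) for a Gaussian forecast using the standard closed-form expression and checks it against the sample-based estimator one would apply to simulation output.

```python
# CRPS for a Gaussian predictive distribution (closed form) and for forecast draws.
import numpy as np
from scipy.stats import norm

def crps_normal(mu, sigma, y):
    """CRPS of N(mu, sigma^2) against observation y (standard closed-form expression)."""
    z = (y - mu) / sigma
    return sigma * (z * (2 * norm.cdf(z) - 1) + 2 * norm.pdf(z) - 1 / np.sqrt(np.pi))

def crps_sample(draws, y):
    """Sample-based CRPS estimator: E|X - y| - 0.5 E|X - X'|."""
    draws = np.asarray(draws)
    term1 = np.mean(np.abs(draws - y))
    term2 = 0.5 * np.mean(np.abs(draws[:, None] - draws[None, :]))
    return term1 - term2

rng = np.random.default_rng(3)
print(crps_normal(0.0, 1.0, 0.5))                       # analytic value
print(crps_sample(rng.normal(0.0, 1.0, 2000), 0.5))     # Monte Carlo estimate, close to above
```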

  6. Binary Encoded-Prototype Tree for Probabilistic Model Building GP

    NASA Astrophysics Data System (ADS)

    Yanase, Toshihiko; Hasegawa, Yoshihiko; Iba, Hitoshi

    In recent years, program evolution algorithms based on the estimation of distribution algorithm (EDA) have been proposed to improve the search ability of genetic programming (GP) and to overcome GP-hard problems. One such method is the probabilistic prototype tree (PPT)-based algorithm. The PPT-based method explores the optimal tree structure by using the full tree whose number of child nodes is maximum among possible trees. This algorithm, however, suffers from problems arising from function nodes having different numbers of child nodes. These function nodes cause intron nodes, which do not affect the fitness function. Moreover, the function nodes having many child nodes increase the search space and the number of samples necessary for properly constructing the probabilistic model. In order to solve this problem, we propose binary encoding for the PPT. In this article, we convert each function node to a subtree of binary nodes such that the converted tree remains grammatically correct. Our method reduces the ineffectual search space, and the binary-encoded tree is able to express the same tree structures as the original method. The effectiveness of the proposed method is demonstrated through two computational experiments.

  7. Applications of the International Space Station Probabilistic Risk Assessment Model

    NASA Technical Reports Server (NTRS)

    Grant, Warren; Lutomski, Michael G.

    2011-01-01

    Recently the International Space Station (ISS) has incorporated more Probabilistic Risk Assessments (PRAs) in the decision making process for significant issues. Future PRAs will have major impact to ISS and future spacecraft development and operations. These PRAs will have their foundation in the current complete ISS PRA model and the current PRA trade studies that are being analyzed as requested by ISS Program stakeholders. ISS PRAs have recently helped in the decision making process for determining reliability requirements for future NASA spacecraft and commercial spacecraft, making crew rescue decisions, as well as making operational requirements for ISS orbital orientation, planning Extravehicular activities (EVAs) and robotic operations. This paper will describe some applications of the ISS PRA model and how they impacted the final decision. This paper will discuss future analysis topics such as life extension, requirements of new commercial vehicles visiting ISS.

  8. A probabilistic model of a porous heat exchanger

    NASA Technical Reports Server (NTRS)

    Agrawal, O. P.; Lin, X. A.

    1995-01-01

    This paper presents a probabilistic one-dimensional finite element model for heat transfer processes in porous heat exchangers. The Galerkin approach is used to develop the finite element matrices. Some of the submatrices are asymmetric due to the presence of the flow term. The Neumann expansion is used to write the temperature distribution as a series of random variables, and the expectation operator is applied to obtain the mean and deviation statistics. To demonstrate the feasibility of the formulation, a one-dimensional model of heat transfer phenomenon in superfluid flow through a porous media is considered. Results of this formulation agree well with the Monte-Carlo simulations and the analytical solutions. Although the numerical experiments are confined to parametric random variables, a formulation is presented to account for the random spatial variations.

  9. Toward a Dynamic Probabilistic Model for Vestibular Cognition

    PubMed Central

    Ellis, Andrew W.; Mast, Fred W.

    2017-01-01

    We suggest that research in vestibular cognition will benefit from the theoretical framework of probabilistic models. This will aid in developing an understanding of how interactions between high-level cognition and low-level sensory processing might occur. Many such interactions have been shown experimentally; however, to date, no attempt has been made to systematically explore vestibular cognition by using computational modeling. It is widely assumed that mental imagery and perception share at least in part neural circuitry, and it has been proposed that mental simulation is closely connected to the brain’s ability to make predictions. We claim that this connection has been disregarded in the vestibular domain, and we suggest ways in which future research may take this into consideration. PMID:28203219

  10. A Generic Probabilistic Model and a Hierarchical Solution for Sensor Localization in Noisy and Restricted Conditions

    NASA Astrophysics Data System (ADS)

    Ji, S.; Yuan, X.

    2016-06-01

    A generic probabilistic model, based on the fundamental Bayes' rule and the Markov assumption, is introduced to integrate the process of mobile platform localization with optical sensors. Based on it, three relatively independent solutions (bundle adjustment, Kalman filtering, and particle filtering) are deduced under different additional restrictions. We aim to show, first, that Kalman filtering may be a better supplier of initial values for bundle adjustment than traditional relative orientation in irregular strips and networks, or when tie-point extraction fails. Second, in highly noisy conditions, particle filtering can act as a bridge for gap binding when a large number of gross errors cause a Kalman filter or a bundle adjustment to fail. Third, both filtering methods, which help reduce error propagation and eliminate gross errors, support a global and static bundle adjustment, which requires the strictest initial values and control conditions. The main innovation is the integrated processing of stochastic errors and gross errors in sensor observations, and the integration of the three most widely used solutions (bundle adjustment, Kalman filtering, and particle filtering) into a generic probabilistic localization model. Tests in noisy and restricted situations are designed and examined to verify these claims.
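
    As a concrete, much-reduced example of the filtering ingredient (toy one-dimensional dynamics and assumed noise levels, not the paper's sensor model), the constant-velocity Kalman filter below produces position estimates of the sort that could seed a subsequent bundle adjustment.

```python
# Toy 1-D constant-velocity Kalman filter over noisy position fixes (assumed parameters).
import numpy as np

dt = 1.0
F = np.array([[1.0, dt], [0.0, 1.0]])     # state transition: [position, velocity]
H = np.array([[1.0, 0.0]])                # only position is observed
Q = np.diag([0.01, 0.01])                 # assumed process noise covariance
R = np.array([[4.0]])                     # assumed measurement noise variance

rng = np.random.default_rng(4)
truth = np.array([0.0, 1.0])
x, P = np.array([0.0, 0.0]), np.eye(2) * 10.0

for k in range(20):
    truth = F @ truth
    z = H @ truth + rng.normal(0, 2.0)    # noisy position fix
    # predict
    x, P = F @ x, F @ P @ F.T + Q
    # update
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(2) - K @ H) @ P

print("filtered state [pos, vel]:", x, " true:", truth)
```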

  11. Probabilistic Structural Analysis and Reliability Using NESSUS With Implemented Material Strength Degradation Model

    NASA Technical Reports Server (NTRS)

    Bast, Callie C.; Jurena, Mark T.; Godines, Cody R.; Chamis, Christos C. (Technical Monitor)

    2001-01-01

    This project included both research and education objectives. The goal of this project was to advance innovative research and education objectives in theoretical and computational probabilistic structural analysis, reliability, and life prediction for improved reliability and safety of structural components of aerospace and aircraft propulsion systems. Research and education partners included Glenn Research Center (GRC) and Southwest Research Institute (SwRI) along with the University of Texas at San Antonio (UTSA). SwRI enhanced the NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) code and provided consulting support for NESSUS-related activities at UTSA. NASA funding supported three undergraduate students, two graduate students, a summer course instructor, and the Principal Investigator. Matching funds from UTSA provided for the purchase of additional equipment for the enhancement of the Advanced Interactive Computational SGI Lab established during the first year of this Partnership Award to conduct the probabilistic finite element summer courses. The research portion of this report presents the culmination of work performed with the probabilistic finite element program NESSUS and an embedded Material Strength Degradation (MSD) model. Probabilistic structural analysis provided for quantification of uncertainties associated with the design, thus enabling increased system performance and reliability. The structure examined was a Space Shuttle Main Engine (SSME) fuel turbopump blade. The blade material analyzed was Inconel 718, since the MSD model was previously calibrated for this material. Reliability analysis encompassing the effects of high temperature and high-cycle fatigue yielded a reliability value of 0.99978 using a fully correlated random field for the blade thickness. The reliability did not change significantly for a change in distribution type except for a change in

  12. Integrated probabilistic risk assessment for nanoparticles: the case of nanosilica in food

    NASA Astrophysics Data System (ADS)

    Jacobs, Rianne; van der Voet, Hilko; ter Braak, Cajo J. F.

    2015-06-01

    Insight into risks of nanotechnology and the use of nanoparticles is an essential condition for the social acceptance and safe use of nanotechnology. One of the problems with which the risk assessment of nanoparticles is faced is the lack of data, resulting in uncertainty in the risk assessment. We attempt to quantify some of this uncertainty by expanding a previous deterministic study on nanosilica (5-200 nm) in food into a fully integrated probabilistic risk assessment. We use the integrated probabilistic risk assessment method in which statistical distributions and bootstrap methods are used to quantify uncertainty and variability in the risk assessment. Due to the large amount of uncertainty present, this probabilistic method, which separates variability from uncertainty, contributed to a better understandable risk assessment. We found that quantifying the uncertainties did not increase the perceived risk relative to the outcome of the deterministic study. We pinpointed particular aspects of the hazard characterization that contributed most to the total uncertainty in the risk assessment, suggesting that further research would benefit most from obtaining more reliable data on those aspects.
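
    In the same spirit, but with entirely hypothetical intake data and an assumed critical dose rather than the nanosilica values, a two-dimensional Monte Carlo can separate uncertainty (an outer bootstrap over a small dataset) from variability (inner sampling over individuals):

```python
# Sketch: two-dimensional Monte Carlo separating uncertainty from variability
# (hypothetical exposure data and critical dose, for illustration only).
import numpy as np

rng = np.random.default_rng(5)
intake_data = rng.lognormal(mean=0.0, sigma=0.5, size=30)   # hypothetical measured intakes
hazard_dose = 5.0                                           # assumed critical effect dose

n_outer, n_inner = 500, 2000
frac_exceeding = []
for _ in range(n_outer):                                    # uncertainty: bootstrap the dataset
    boot = rng.choice(intake_data, size=intake_data.size, replace=True)
    mu, sd = np.log(boot).mean(), np.log(boot).std(ddof=1)
    individuals = rng.lognormal(mu, sd, size=n_inner)       # variability across individuals
    frac_exceeding.append(np.mean(individuals > hazard_dose))

lo, hi = np.percentile(frac_exceeding, [2.5, 97.5])
print(f"fraction above the critical dose: median {np.median(frac_exceeding):.4f}, "
      f"95% uncertainty interval [{lo:.4f}, {hi:.4f}]")
```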

  13. Integrated probabilistic risk assessment for nanoparticles: the case of nanosilica in food.

    PubMed

    Jacobs, Rianne; van der Voet, Hilko; Ter Braak, Cajo J F

    Insight into risks of nanotechnology and the use of nanoparticles is an essential condition for the social acceptance and safe use of nanotechnology. One of the problems with which the risk assessment of nanoparticles is faced is the lack of data, resulting in uncertainty in the risk assessment. We attempt to quantify some of this uncertainty by expanding a previous deterministic study on nanosilica (5-200 nm) in food into a fully integrated probabilistic risk assessment. We use the integrated probabilistic risk assessment method in which statistical distributions and bootstrap methods are used to quantify uncertainty and variability in the risk assessment. Due to the large amount of uncertainty present, this probabilistic method, which separates variability from uncertainty, contributed to a better understandable risk assessment. We found that quantifying the uncertainties did not increase the perceived risk relative to the outcome of the deterministic study. We pinpointed particular aspects of the hazard characterization that contributed most to the total uncertainty in the risk assessment, suggesting that further research would benefit most from obtaining more reliable data on those aspects.

  14. The integrated environmental control model

    SciTech Connect

    Rubin, E.S.; Berkenpas, M.B.; Kalagnanam, J.R.

    1995-11-01

    The capability to estimate the performance and cost of emission control systems is critical to a variety of planning and analysis requirements faced by utilities, regulators, researchers, and analysts in the public and private sectors. The computer model described in this paper has been developed for DOE to provide an up-to-date capability for analyzing a variety of pre-combustion, combustion, and post-combustion options in an integrated framework. A unique capability allows performance and costs to be modeled probabilistically, permitting explicit characterization of uncertainties and risks.

  15. Petri net modeling of fault analysis for probabilistic risk assessment

    NASA Astrophysics Data System (ADS)

    Lee, Andrew

    Fault trees and event trees have been widely accepted as the modeling strategy to perform Probabilistic Risk Assessment (PRA). However, there are several limitations associated with fault tree/event tree modeling. These include: (1) it considers only binary events; (2) it assumes independence among basic events; and (3) it does not consider the timing sequence of basic events. This thesis investigates Petri net modeling as a potential alternative for PRA modeling. Petri nets have mainly been used as a simulation tool for queuing and network systems. However, it has been suggested that they could also model failure scenarios, and thus could be a potential modeling strategy for PRA. In this thesis, the transformations required to model logic gates in a fault tree by Petri nets are explored. The gap between fault tree analysis and Petri net analysis is bridged through gate equivalency analysis. Methods for qualitative and quantitative analysis of Petri nets are presented. Techniques are developed and implemented to revise and tailor traditional Petri net modeling for system failure analysis. The airlock system and the maintenance cooling system of a CANada Deuterium Uranium (CANDU) reactor are used as case studies to demonstrate Petri nets' ability to model system failure and provide a structured approach for qualitative and quantitative analysis. The minimal cutsets and the probability of the airlock system failing to maintain the pressure boundary are obtained. Furthermore, the case study is extended to non-coherent system analysis due to system maintenance.

  16. Probabilistic Fracture Mechanics analysis based on three-dimensional J-integral database

    NASA Astrophysics Data System (ADS)

    Ye, G.-W.; Yagawa, G.; Yoshimura, S.

    1993-04-01

    The development of a novel Probabilistic Fracture Mechanics (PFM) code based on a three-dimensional J-integral database, giving so-called fully plastic solutions, is described. An efficient technique for the evaluation of leak and break probabilities, based on stratified-sampling Monte Carlo simulation, is also utilized. The outline of the present PFM code is described, and the J-integral database and the numerical technique are presented. Nonlinear material effects on failure probabilities are discussed through the analysis of a surface-cracked structure subjected to cyclic tension.

  17. Integrated deterministic and probabilistic safety analysis for safety assessment of nuclear power plants

    DOE PAGES

    Di Maio, Francesco; Zio, Enrico; Smith, Curtis; ...

    2015-07-06

    The present special issue contains an overview of the research in the field of Integrated Deterministic and Probabilistic Safety Assessment (IDPSA) of Nuclear Power Plants (NPPs). Traditionally, safety regulation for NPPs design and operation has been based on Deterministic Safety Assessment (DSA) methods to verify criteria that assure plant safety in a number of postulated Design Basis Accident (DBA) scenarios. Referring to such criteria, it is also possible to identify those plant Structures, Systems, and Components (SSCs) and activities that are most important for safety within those postulated scenarios. Then, the design, operation, and maintenance of these “safety-related” SSCs and activities are controlled through regulatory requirements and supported by Probabilistic Safety Assessment (PSA).

  18. Integrated deterministic and probabilistic safety analysis for safety assessment of nuclear power plants

    SciTech Connect

    Di Maio, Francesco; Zio, Enrico; Smith, Curtis; Rychkov, Valentin

    2015-07-06

    The present special issue contains an overview of the research in the field of Integrated Deterministic and Probabilistic Safety Assessment (IDPSA) of Nuclear Power Plants (NPPs). Traditionally, safety regulation for NPPs design and operation has been based on Deterministic Safety Assessment (DSA) methods to verify criteria that assure plant safety in a number of postulated Design Basis Accident (DBA) scenarios. Referring to such criteria, it is also possible to identify those plant Structures, Systems, and Components (SSCs) and activities that are most important for safety within those postulated scenarios. Then, the design, operation, and maintenance of these “safety-related” SSCs and activities are controlled through regulatory requirements and supported by Probabilistic Safety Assessment (PSA).

  19. Best Merge Region Growing with Integrated Probabilistic Classification for Hyperspectral Imagery

    NASA Technical Reports Server (NTRS)

    Tarabalka, Yuliya; Tilton, James C.

    2011-01-01

    A new method for spectral-spatial classification of hyperspectral images is proposed. The method is based on the integration of probabilistic classification within the hierarchical best merge region growing algorithm. For this purpose, a preliminary probabilistic support vector machine classification is performed. Then, a hierarchical step-wise optimization algorithm is applied, iteratively merging regions with the smallest Dissimilarity Criterion (DC). The main novelty of this method lies in defining a DC between regions as a function of region statistical and geometrical features along with classification probabilities. Experimental results are presented on a 200-band AVIRIS image of the Northwestern Indiana's vegetation area and compared with those obtained by recently proposed spectral-spatial classification techniques. The proposed method improves classification accuracies when compared to other classification approaches.

  20. Probabilistic Environmental Model for Solid Rocket Motor Life Prediction.

    DTIC Science & Technology

    1982-03-01


  1. Using symbolic computing in building probabilistic models for atoms

    NASA Astrophysics Data System (ADS)

    Guiasu, Silviu

    This article shows how symbolic computing and the mathematical formalism induced by maximizing entropy and minimizing the mean deviation from statistical equilibrium may be effectively applied to obtaining probabilistic models for the structure of atoms, using trial wave functions compatible with an average shell picture of the atom. The objective is not only to recover the experimental value of the ground state mean energy of the atom, but rather to better approximate the unknown parameters of these trial functions and to calculate both correlations between electrons and the amount of interdependence among different subsets of electrons of the atoms. The examples and numerical results refer to the hydrogen, helium, lithium, and beryllium atoms. The main computer programs, using the symbolic computing software MATHEMATICA, are also given.

  2. Design of interchannel MRF model for probabilistic multichannel image processing.

    PubMed

    Koo, Hyung Il; Cho, Nam Ik

    2011-03-01

    In this paper, we present a novel framework that exploits an informative reference channel in the processing of another channel. We formulate the problem as a maximum a posteriori estimation problem considering a reference channel and develop a probabilistic model encoding the interchannel correlations based on Markov random fields. Interestingly, the proposed formulation results in an image-specific and region-specific linear filter for each site. The strength of filter response can also be controlled in order to transfer the structural information of a channel to the others. Experimental results on satellite image fusion and chrominance image interpolation with denoising show that our method provides improved subjective and objective performance compared with conventional approaches.

  3. De novo protein conformational sampling using a probabilistic graphical model

    NASA Astrophysics Data System (ADS)

    Bhattacharya, Debswapna; Cheng, Jianlin

    2015-11-01

    Efficient exploration of protein conformational space remains challenging, especially for large proteins, when assembling discretized structural fragments extracted from a protein structure database. We propose a fragment-free probabilistic graphical model, FUSION, for conformational sampling in continuous space and assess its accuracy using ‘blind’ protein targets with lengths of up to 250 residues from the CASP11 structure prediction exercise. The method reduces sampling bottlenecks, exhibits strong convergence, and demonstrates better performance than the popular fragment assembly method, ROSETTA, on larger proteins (more than 150 residues) in our benchmark set. FUSION is freely available through a web server at http://protein.rnet.missouri.edu/FUSION/.

  4. Development of a Probabilistic Decision-Support Model to Forecast Coastal Resilience

    NASA Astrophysics Data System (ADS)

    Wilson, K.; Safak, I.; Brenner, O.; Lentz, E. E.; Hapke, C. J.

    2016-02-01

    Site-specific forecasts of coastal change are a valuable management tool in preparing for and assessing storm-driven impacts in coastal areas. More specifically, understanding the likelihood of storm impacts, recovery following events, and the alongshore variability of both is central in evaluating vulnerability and resiliency of barrier islands. We introduce a probabilistic modeling framework that integrates hydrodynamic, anthropogenic, and morphologic components of the barrier system to evaluate coastal change at Fire Island, New York. The model is structured on a Bayesian network (BN), which utilizes observations to learn statistical relationships between system variables. In addition to predictive ability, probabilistic models convey the level of confidence associated with a prediction, an important consideration for coastal managers. Our model predicts the likelihood of morphologic change on the upper beach based on several decades of beach monitoring data. A coupled hydrodynamic BN combines probabilistic and deterministic modeling approaches; by querying nearly two decades of nested-grid wave simulations that account for both distant swells and local seas, we produce scenarios of event and seasonal wave climates. The wave scenarios of total water level (the sum of runup, surge, and tide) and anthropogenic modification are the primary drivers of morphologic change in our model structure. Preliminary results show the hydrodynamic BN is able to reproduce time series of total water levels, a critical validation process before generating scenarios, and forecasts of geomorphic change over three-month intervals are up to 70% accurate. Predictions of storm-induced change and recovery are linked to evaluate zones of persistent vulnerability or resilience and will help managers target restoration efforts, identify areas most vulnerable to habitat degradation, and highlight resilient zones that may best support relocation of critical infrastructure.

  5. Probabilistic modeling of soil development variability with time

    NASA Astrophysics Data System (ADS)

    Shepard, C.; Schaap, M. G.; Rasmussen, C.

    2015-12-01

    Soils develop as the result of a complex suite of biogeochemical and physical processes; however, effective modeling of soil development over pedogenic time scales and the resultant soil property variability is limited to individual chronosequence studies or overly broad generalizations. Soil chronosequence studies are used to understand soil development across a landscape with time, but traditional soil chronosequence studies do not account for uncertainty in soil development, and the results of these studies are site dependent. Here we develop a probabilistic approach to quantify the distribution of probable soil property values based on a review of soil chronosequence studies. Specifically, we examined the changes in the distributions of soil texture and solum thickness with increasing time and influx of pedogenic energy from climatic and biological forcings. We found the greatest variability in maximum measured clay content occurred between 10³ and 10⁵ years, with convergence of clay contents in soils older than 10⁶ years. Conversely, we found that the variability in maximum sand content increased with increasing time, with the greatest variability in soils between 10⁵ and 10⁶ years old; we did not find distributional changes in maximum silt content with time. Bivariate normal probability distributions were parameterized using the chronosequence data, from which conditional univariate distributions based on the total pedogenic energy (age × rate of energy flux) were calculated, allowing determination of a probable range of soil properties for a given age and bioclimatic environment. The bivariate distribution was capable of effectively representing the measured maximum clay content values with an r² of 0.53 (p < 0.0001, RMSE = 14.36%). By taking a distributional approach to quantifying soil development and variability, we can quantitatively and probabilistically represent the full state factor model, while explicitly quantifying the uncertainty in soil development.
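
    The conditioning step described above follows the standard bivariate-normal result; the sketch below applies it with illustrative parameter values (not the fitted chronosequence estimates) to obtain the distribution of maximum clay content given a level of total pedogenic energy.

```python
# Conditional univariate distribution from an assumed bivariate normal
# (illustrative parameters for (log10 pedogenic energy, max clay content %)).
import numpy as np
from scipy.stats import norm

mu_x, mu_y = 4.0, 25.0      # assumed means
sd_x, sd_y = 1.2, 14.0      # assumed standard deviations
rho = 0.55                  # assumed correlation

def conditional_clay(x):
    """Mean and sd of clay content given pedogenic energy x (standard bivariate-normal result)."""
    mean = mu_y + rho * sd_y / sd_x * (x - mu_x)
    sd = sd_y * np.sqrt(1.0 - rho**2)
    return mean, sd

m, s = conditional_clay(5.0)
print(f"given log10(energy) = 5: clay ~ N({m:.1f}, {s:.1f}^2); "
      f"90% interval {norm.ppf(0.05, m, s):.1f}-{norm.ppf(0.95, m, s):.1f} %")
```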

  6. Integrated Medical Model Overview

    NASA Technical Reports Server (NTRS)

    Myers, J.; Boley, L.; Foy, M.; Goodenow, D.; Griffin, D.; Keenan, A.; Kerstman, E.; Melton, S.; McGuire, K.; Saile, L.; hide

    2015-01-01

    The Integrated Medical Model (IMM) Project represents one aspect of NASA's Human Research Program (HRP) to quantitatively assess medical risks to astronauts for existing operational missions as well as missions associated with future exploration and commercial space flight ventures. The IMM takes a probabilistic approach to assessing the likelihood and specific outcomes of one hundred medical conditions within the envelope of accepted space flight standards of care over a selectable range of mission capabilities. A specially developed Integrated Medical Evidence Database (iMED) maintains evidence-based, organizational knowledge across a variety of data sources. Since becoming operational in 2011, version 3.0 of the IMM, the supporting iMED, and the expertise of the IMM project team have contributed to a wide range of decision and informational processes for the space medical and human research community. This presentation provides an overview of the IMM conceptual architecture and range of application through examples of actual space flight community questions posed to the IMM project.

  7. An Approach for Incorporating Context in Building Probabilistic Predictive Models

    PubMed Central

    Wu, Juan Anna; Hsu, William; Bui, Alex AT

    2016-01-01

    With the increasing amount of information collected through clinical practice and scientific experimentation, a growing challenge is how to utilize available resources to construct predictive models to facilitate clinical decision making. Clinicians often have questions related to the treatment and outcome of a medical problem for individual patients; however, few tools exist that leverage the large collection of patient data and scientific knowledge to answer these questions. Without appropriate context, existing data that have been collected for a specific task may not be suitable for creating new models that answer different questions. This paper presents an approach that leverages available structured or unstructured data to build a probabilistic predictive model that assists physicians with answering clinical questions on individual patients. Various challenges related to transforming available data to an end-user application are addressed: problem decomposition, variable selection, context representation, automated extraction of information from unstructured data sources, model generation, and development of an intuitive application to query the model and present the results. We describe our efforts towards building a model that predicts the risk of vasospasm in aneurysm patients. PMID:27617299

  8. An Approach for Incorporating Context in Building Probabilistic Predictive Models.

    PubMed

    Wu, Juan Anna; Hsu, William; Bui, Alex At

    2012-09-01

    With the increasing amount of information collected through clinical practice and scientific experimentation, a growing challenge is how to utilize available resources to construct predictive models to facilitate clinical decision making. Clinicians often have questions related to the treatment and outcome of a medical problem for individual patients; however, few tools exist that leverage the large collection of patient data and scientific knowledge to answer these questions. Without appropriate context, existing data that have been collected for a specific task may not be suitable for creating new models that answer different questions. This paper presents an approach that leverages available structured or unstructured data to build a probabilistic predictive model that assists physicians with answering clinical questions on individual patients. Various challenges related to transforming available data to an end-user application are addressed: problem decomposition, variable selection, context representation, automated extraction of information from unstructured data sources, model generation, and development of an intuitive application to query the model and present the results. We describe our efforts towards building a model that predicts the risk of vasospasm in aneurysm patients.

  9. Probabilistic evaluation of integrating resource recovery into wastewater treatment to improve environmental sustainability

    PubMed Central

    Wang, Xu; McCarty, Perry L.; Liu, Junxin; Ren, Nan-Qi; Lee, Duu-Jong; Yu, Han-Qing; Qian, Yi; Qu, Jiuhui

    2015-01-01

    Global expectations for wastewater service infrastructure have evolved over time, and the standard treatment methods used by wastewater treatment plants (WWTPs) are facing issues related to problem shifting due to the current emphasis on sustainability. A transition in WWTPs toward reuse of wastewater-derived resources is recognized as a promising solution for overcoming these obstacles. However, it remains uncertain whether this approach can reduce the environmental footprint of WWTPs. To test this hypothesis, we conducted a net environmental benefit calculation for several scenarios for more than 50 individual countries over a 20-y time frame. For developed countries, the resource recovery approach resulted in ∼154% net increase in the environmental performance of WWTPs compared with the traditional substance elimination approach, whereas this value decreased to ∼60% for developing countries. Subsequently, we conducted a probabilistic analysis integrating these estimates with national values and determined that, if this transition was attempted for WWTPs in developed countries, it would have a ∼65% probability of attaining net environmental benefits. However, this estimate decreased greatly to ∼10% for developing countries, implying a substantial risk of failure. These results suggest that implementation of this transition for WWTPs should be studied carefully in different temporal and spatial contexts. Developing countries should customize their approach to realizing more sustainable WWTPs, rather than attempting to simply replicate the successful models of developed countries. Results derived from the model forecasting highlight the role of bioenergy generation and reduced use of chemicals in improving the sustainability of WWTPs in developing countries. PMID:25605884

  10. Modelling circumplanetary ejecta clouds at low altitudes: A probabilistic approach

    NASA Astrophysics Data System (ADS)

    Christou, Apostolos A.

    2015-04-01

    A model is presented of a ballistic, collisionless, steady state population of ejecta launched at randomly distributed times and velocities and moving under constant gravity above the surface of an airless planetary body. Within a probabilistic framework, closed form solutions are derived for the probability density functions of the altitude distribution of particles, the distribution of their speeds in a rest frame both at the surface and at altitude and with respect to a moving platform such as an orbiting spacecraft. These expressions are validated against numerically-generated synthetic populations of ejecta under lunar surface gravity. The model is applied to the cases where the ejection speed distribution is (a) uniform (b) a power law. For the latter law, it is found that the effective scale height of the ejecta envelope directly depends on the exponent of the power law and increases with altitude. The same holds for the speed distribution of particles near the surface. Ejection model parameters can, therefore, be constrained through orbital and surface measurements. The scope of the model is then extended to include size-dependency of the ejection speed and an example worked through for a deterministic power law relation. The result suggests that the height distribution of ejecta is a sensitive proxy for this dependency.

  11. Probabilistic model for quick detection of dissimilar binary images

    NASA Astrophysics Data System (ADS)

    Mustafa, Adnan A. Y.

    2015-09-01

    We present a quick method to detect dissimilar binary images. The method is based on a "probabilistic matching model" for image matching. The matching model is used to predict the probability of occurrence of distinct-dissimilar image pairs (completely different images) when matching one image to another. Based on this model, distinct-dissimilar images can be detected with high confidence by matching only a few points between two images, namely 11 points for a 99.9% successful detection rate. For image pairs that are dissimilar but not distinct-dissimilar, more points need to be mapped. The number of points required to attain a certain successful detection rate or confidence depends on the amount of similarity between the compared images. As this similarity increases, more points are required. For example, images that differ by 1% can be detected by mapping fewer than 70 points on average. More importantly, the model is image-size invariant, so images of any size will produce high confidence levels with a limited number of matched points. As a result, this method does not suffer from the image-size handicap that impedes current methods. We report on extensive tests conducted on real images of different sizes.
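
    A simplified back-of-the-envelope version of the idea, assuming independent and equiprobable pixels (this is not the paper's calibrated matching model, whose reported counts differ slightly), shows how the required number of sampled points grows with the similarity of the compared images.

```python
# How many randomly sampled points must agree before two binary images are accepted
# as similar, under a simple independent-pixel assumption?
import math

def points_needed(p_agree_by_chance: float, confidence: float) -> int:
    """Smallest k with p_agree_by_chance**k <= 1 - confidence."""
    return math.ceil(math.log(1.0 - confidence) / math.log(p_agree_by_chance))

# Completely unrelated binary images: a sampled pixel agrees by chance with p = 0.5.
print(points_needed(0.5, 0.999))    # roughly 10 points for 99.9% confidence

# Images differing in only 1% of pixels: per-point agreement probability is 0.99,
# so far more points are required as similarity grows.
print(points_needed(0.99, 0.999))
```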

  12. The international normalized ratio and uncertainty. Validation of a probabilistic model.

    PubMed

    Critchfield, G C; Bennett, S T

    1994-07-01

    The motivation behind the creation of the International Normalized Ratio (INR) was to improve interlaboratory comparison for patients on anticoagulation therapy. In principle, a laboratory that reports the prothrombin time (PT) as an INR can standardize its PT measurements to an international reference thromboplastin. Using probability theory, the authors derived the equation for the probability distribution of the INR based on the PT, the International Sensitivity Index (ISI), and the geometric mean PT of the reference population. With Monte Carlo and numeric integration techniques, the model is validated on data from three different laboratories. The model allows computation of confidence intervals for the INR as a function of PT, ISI, and reference mean. The probabilistic model illustrates that confidence in INR measurements degrades for higher INR values. This occurs primarily as a result of amplification of between-run measurement errors in the PT, which is inherent in the mathematical transformation from the PT to the INR. The probabilistic model can be used by any laboratory to study the reliability of its own INR for any measured PT. This framework provides better insight into the problems of monitoring oral anticoagulation.
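
    The error amplification described above follows from the INR transformation INR = (PT / PT_geometric_mean)^ISI. A minimal Monte Carlo sketch (assuming, purely for illustration, a normally distributed between-run PT error with a 3% coefficient of variation, which is not a value from the paper) shows the confidence interval widening at higher INR; the paper itself derives the distribution analytically and validates it with Monte Carlo and numeric integration.

```python
import numpy as np

def inr_distribution(pt, isi, pt_geo_mean, pt_cv=0.03, n=100_000, seed=0):
    """Monte Carlo distribution of the INR for a measured PT, assuming a
    between-run coefficient of variation pt_cv on the PT (illustrative value)."""
    rng = np.random.default_rng(seed)
    pt_samples = rng.normal(pt, pt_cv * pt, n)     # between-run PT scatter
    return (pt_samples / pt_geo_mean) ** isi       # INR transformation

for pt in (15.0, 30.0, 45.0):                      # prothrombin times in seconds, illustrative
    inr = inr_distribution(pt, isi=2.0, pt_geo_mean=12.0)
    lo, hi = np.percentile(inr, [2.5, 97.5])
    print(f"PT={pt:5.1f}s  median INR={np.median(inr):.2f}  95% CI [{lo:.2f}, {hi:.2f}]")
```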

  13. Probabilistic constitutive relationships for material strength degradation models

    NASA Technical Reports Server (NTRS)

    Boyce, L.; Chamis, C. C.

    1989-01-01

    The present probabilistic methodology addresses the strength of aerospace propulsion system structural components subjected to environmentally induced primitive variables such as loading stresses, high temperature, chemical corrosion, and radiation; time is encompassed as an interacting element, allowing creep and fatigue effects to be projected. A probabilistic constitutive equation is postulated to account for the degradation of strength due to these primitive variables; it may be calibrated by a suitably curve-fitted least-squares multiple regression of experimental data. The resulting probabilistic constitutive equation is embodied in the PROMISS code for aerospace propulsion component random strength determination.
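
    The calibration step can be illustrated generically: regress the logarithm of the strength ratio on the primitive variables by least squares and keep the residual scatter as the probabilistic component. The variables, data, and functional form below are illustrative stand-ins, not the PROMISS formulation itself.

```python
import numpy as np

# Hypothetical experimental data: strength ratio S/S0 versus primitive variables
# (temperature fraction, stress fraction, log fatigue cycles). Values are illustrative.
rng = np.random.default_rng(1)
n = 60
temp_frac   = rng.uniform(0.2, 0.9, n)    # T / T_melt
stress_frac = rng.uniform(0.1, 0.8, n)    # sigma / sigma_ultimate
log_cycles  = rng.uniform(2.0, 7.0, n)    # log10(fatigue cycles)
strength_ratio = np.exp(-0.4 * temp_frac - 0.6 * stress_frac - 0.05 * log_cycles
                        + rng.normal(0, 0.05, n))

# Least-squares multiple regression on log(S/S0) = b0 + b1*x1 + b2*x2 + b3*x3
X = np.column_stack([np.ones(n), temp_frac, stress_frac, log_cycles])
coef, *_ = np.linalg.lstsq(X, np.log(strength_ratio), rcond=None)
residual_std = np.std(np.log(strength_ratio) - X @ coef, ddof=X.shape[1])

print("fitted coefficients:", np.round(coef, 3))
print("residual scatter (lognormal sigma):", round(residual_std, 3))
```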

  14. Probabilistic consequence model of accidental or intentional chemical releases.

    SciTech Connect

    Chang, Y.-S.; Samsa, M. E.; Folga, S. M.; Hartmann, H. M.

    2008-06-02

    In this work, general methodologies for evaluating the impacts of large-scale toxic chemical releases are proposed. The potential numbers of injuries and fatalities, the numbers of hospital beds, and the geographical areas rendered unusable during and some time after the occurrence and passage of a toxic plume are estimated on a probabilistic basis. To arrive at these estimates, historical accidental release data, maximum stored volumes, and meteorological data were used as inputs into the SLAB accidental chemical release model. Toxic gas footprints from the model were overlaid onto detailed population and hospital distribution data for a given region to estimate potential impacts. Output results are in the form of a generic statistical distribution of injuries and fatalities associated with specific toxic chemicals and regions of the United States. In addition, indoor hazards were estimated, so the model can provide contingency plans for either shelter-in-place or evacuation when an accident occurs. The stochastic distributions of injuries and fatalities are being used in a U.S. Department of Homeland Security-sponsored decision support system as source terms for a Monte Carlo simulation that evaluates potential measures for mitigating terrorist threats. This information can also be used to support the formulation of evacuation plans and to estimate damage and cleanup costs.

  15. Probabilistic space-time video modeling via piecewise GMM.

    PubMed

    Greenspan, Hayit; Goldberger, Jacob; Mayer, Arnaldo

    2004-03-01

    In this paper, we describe a statistical video representation and modeling scheme. Video representation schemes are needed to segment a video stream into meaningful video-objects, useful for later indexing and retrieval applications. In the proposed methodology, unsupervised clustering via Gaussian mixture modeling extracts coherent space-time regions in feature space, and corresponding coherent segments (video-regions) in the video content. A key feature of the system is the analysis of video input as a single entity as opposed to a sequence of separate frames. Space and time are treated uniformly. The probabilistic space-time video representation scheme is extended to a piecewise GMM framework in which a succession of GMMs are extracted for the video sequence, instead of a single global model for the entire sequence. The piecewise GMM framework allows for the analysis of extended video sequences and the description of nonlinear, nonconvex motion patterns. The extracted space-time regions allow for the detection and recognition of video events. Results of segmenting video content into static versus dynamic video regions and video content editing are presented.
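
    A minimal sketch of the space-time GMM idea, using scikit-learn rather than the authors' implementation: every pixel becomes a feature vector (x, y, t, intensity) and a Gaussian mixture groups the pixels into coherent space-time regions. The synthetic video and the number of mixture components are illustrative.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Synthetic "video": T frames of H x W grayscale intensities
T, H, W = 8, 32, 32
rng = np.random.default_rng(0)
video = rng.random((T, H, W))
video[:, 8:24, 8:24] += 1.0        # a bright, static square as one coherent region

# Per-pixel feature vectors (x, y, t, intensity), normalised per dimension
tt, yy, xx = np.meshgrid(np.arange(T), np.arange(H), np.arange(W), indexing="ij")
features = np.column_stack([xx.ravel() / W, yy.ravel() / H, tt.ravel() / T, video.ravel()])

# Fit a Gaussian mixture; each component is a coherent space-time "video region"
gmm = GaussianMixture(n_components=4, covariance_type="full", random_state=0)
labels = gmm.fit_predict(features).reshape(T, H, W)
print("region sizes (pixels):", np.bincount(labels.ravel()))
```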

  16. A probabilistic approach to modeling and controlling fluid flows

    NASA Astrophysics Data System (ADS)

    Kaiser, Eurika; Noack, Bernd R.; Spohn, Andreas; Cattafesta, Louis N.; Morzynski, Marek; Daviller, Guillaume; Brunton, Bingni W.; Brunton, Steven L.

    2016-11-01

    We extend cluster-based reduced-order modeling (CROM) (Kaiser et al., 2014) to include control inputs in order to determine optimal control laws with respect to a cost function for unsteady flows. The proposed methodology frames high-dimensional, nonlinear dynamics into low-dimensional, probabilistic, linear dynamics which considerably simplifies the optimal control problem while preserving nonlinear actuation mechanisms. The data-driven approach builds upon the unsupervised partitioning of the data into few kinematically similar flow states using a clustering algorithm. The coarse-grained dynamics are then described by a Markov model which is closely related to the approximation of Perron-Frobenius operators. The Markov model can be used as predictor for the ergodic probability distribution for a particular control law approximating the long-term behavior of the system on which basis the optimal control law is determined. Moreover, we combine CROM with a recently developed approach for optimal sparse sensor placement for classification (Brunton et al., 2013) as a critical enabler for in-time control and for the systematic identification of dynamical regimes from few measurements. The approach is applied to a separating flow and a mixing layer exhibiting vortex pairing.
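
    A minimal sketch of the cluster-based reduced-order modeling pipeline, without the control inputs added in this work: cluster the snapshots, estimate a Markov transition matrix between cluster states, and read off the ergodic (stationary) distribution. The snapshot data and cluster count are illustrative.

```python
import numpy as np
from sklearn.cluster import KMeans

# Flow snapshots flattened to vectors (synthetic stand-in data)
rng = np.random.default_rng(0)
snapshots = rng.normal(size=(500, 200))      # 500 time steps, 200 spatial points

# 1) Partition snapshots into a few kinematically similar cluster states
k = 5
labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(snapshots)

# 2) Markov transition matrix P[i, j] = Prob(state j at t+1 | state i at t)
P = np.zeros((k, k))
for a, b in zip(labels[:-1], labels[1:]):
    P[a, b] += 1
P /= P.sum(axis=1, keepdims=True)

# 3) Ergodic (stationary) distribution: left eigenvector of P for eigenvalue 1
w, v = np.linalg.eig(P.T)
pi = np.real(v[:, np.argmax(np.real(w))])
pi /= pi.sum()
print("stationary cluster probabilities:", np.round(pi, 3))
```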

  17. The Gain-Loss Model: A Probabilistic Skill Multimap Model for Assessing Learning Processes

    ERIC Educational Resources Information Center

    Robusto, Egidio; Stefanutti, Luca; Anselmi, Pasquale

    2010-01-01

    Within the theoretical framework of knowledge space theory, a probabilistic skill multimap model for assessing learning processes is proposed. The learning process of a student is modeled as a function of the student's knowledge and of an educational intervention on the attainment of specific skills required to solve problems in a knowledge…

  19. An Integrated Probabilistic-Fuzzy Assessment of Uncertainty Associated with Human Health Risk to MSW Landfill Leachate Contamination

    NASA Astrophysics Data System (ADS)

    Mishra, H.; Karmakar, S.; Kumar, R.

    2016-12-01

    Risk assessment is not straightforward when it involves multiple uncertain variables. Uncertainties in risk assessment result mainly from (1) lack of knowledge of the input variables (mostly random) and (2) data obtained from expert judgment or subjective interpretation of available information (non-random). An integrated probabilistic-fuzzy health risk approach has been proposed for simultaneous treatment of the random and non-random uncertainties associated with the input parameters of a health risk model. LandSim 2.5, a landfill simulator, has been used to simulate the activities of the Turbhe landfill (Navi Mumbai, India) for various time horizons. The LandSim-simulated groundwater concentrations of six heavy metals have then been used in the health risk model. The water intake, exposure duration, exposure frequency, bioavailability and averaging time are treated as fuzzy variables, while the heavy metal concentrations and body weight are considered probabilistic variables. Identical alpha-cut and reliability levels are considered for the fuzzy and probabilistic variables, respectively, and the uncertainty in non-carcinogenic human health risk is estimated using ten thousand Monte Carlo simulations (MCS). This is the first effort in which all the health risk variables have been considered non-deterministic for the estimation of uncertainty in the risk output. The non-exceedance probability of the Hazard Index (HI), the summation of hazard quotients, for the heavy metals Co, Cu, Mn, Ni, Zn and Fe for the male and female populations has been quantified and found to be high (HI > 1) for all considered time horizons, which clearly indicates the possibility of adverse health effects on the population residing near the Turbhe landfill.
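
    One way to picture the combined treatment of fuzzy and probabilistic variables is sketched below for a single metal and a single alpha-cut: Monte Carlo samples of the probabilistic variables are combined with interval endpoints of the fuzzy ones to bound a hazard quotient. All distributions, interval bounds, and the reference dose are illustrative, and the formulation is a simplification of the authors' model.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000   # Monte Carlo simulations

# Probabilistic variables (illustrative distributions; one metal shown)
conc = rng.lognormal(mean=np.log(0.05), sigma=0.4, size=n)   # groundwater concentration, mg/L
bw   = rng.normal(60.0, 8.0, size=n)                          # body weight, kg

# Fuzzy variables handled as alpha-cut intervals (alpha = 0.5; bounds illustrative)
IR_lo, IR_hi = 1.8, 2.5              # water intake, L/day
EF_lo, EF_hi = 300.0, 365.0          # exposure frequency, days/yr
ED_lo, ED_hi = 20.0, 30.0            # exposure duration, yr
BA_lo, BA_hi = 0.8, 1.0              # bioavailability, dimensionless
AT_lo, AT_hi = ED_lo * 365.0, ED_hi * 365.0   # averaging time, days
RfD = 0.02                           # reference dose, mg/kg/day (illustrative)

# Hazard quotient bounds: interval endpoints chosen to minimise / maximise the quotient
hq_lo = conc * IR_lo * EF_lo * ED_lo * BA_lo / (bw * AT_hi * RfD)
hq_hi = conc * IR_hi * EF_hi * ED_hi * BA_hi / (bw * AT_lo * RfD)

print("P(HQ upper bound > 1):", np.mean(hq_hi > 1.0))
print("P(HQ lower bound > 1):", np.mean(hq_lo > 1.0))
```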

  20. Probabilistic pairwise Markov models: application to prostate cancer detection

    NASA Astrophysics Data System (ADS)

    Monaco, James; Tomaszewski, John E.; Feldman, Michael D.; Moradi, Mehdi; Mousavi, Parvin; Boag, Alexander; Davidson, Chris; Abolmaesumi, Purang; Madabhushi, Anant

    2009-02-01

    Markov Random Fields (MRFs) provide a tractable means for incorporating contextual information into a Bayesian framework. This contextual information is modeled using multiple local conditional probability density functions (LCPDFs) which the MRF framework implicitly combines into a single joint probability density function (JPDF) that describes the entire system. However, only LCPDFs of certain functional forms are consistent, meaning they reconstitute a valid JPDF. These forms are specified by the Gibbs-Markov equivalence theorem which indicates that the JPDF, and hence the LCPDFs, should be representable as a product of potential functions (i.e. Gibbs distributions). Unfortunately, potential functions are mathematical abstractions that lack intuition; consequently, constructing LCPDFs through their selection becomes an ad hoc procedure, usually resulting in generic and/or heuristic models. In this paper we demonstrate that under certain conditions the LCPDFs can be formulated in terms of quantities that are both meaningful and descriptive: probability distributions. Using probability distributions instead of potential functions enables us to construct consistent LCPDFs whose modeling capabilities are both more intuitive and expansive than typical MRF models. As an example, we compare the efficacy of our so-called probabilistic pairwise Markov models (PPMMs) to the prevalent Potts model by incorporating both into a novel computer-aided diagnosis (CAD) system for detecting prostate cancer in whole-mount histological sections. Using the Potts model, the CAD system is able to detect cancerous glands with a specificity of 0.82 and a sensitivity of 0.71; its area under the receiver operating characteristic (ROC) curve is 0.83. If instead the PPMM model is employed, the sensitivity (with specificity held fixed) and AUC increase to 0.77 and 0.87, respectively.

  1. Learned graphical models for probabilistic planning provide a new class of movement primitives

    PubMed Central

    Rückert, Elmar A.; Neumann, Gerhard; Toussaint, Marc; Maass, Wolfgang

    2013-01-01

    Biological movement generation combines three interesting aspects: its modular organization in movement primitives (MPs), its characteristics of stochastic optimality under perturbations, and its efficiency in terms of learning. A common approach to motor skill learning is to endow the primitives with dynamical systems. Here, the parameters of the primitive indirectly define the shape of a reference trajectory. We propose an alternative MP representation based on probabilistic inference in learned graphical models with new and interesting properties that complies with salient features of biological movement control. Instead of endowing the primitives with dynamical systems, we propose to endow MPs with an intrinsic probabilistic planning system, integrating the power of stochastic optimal control (SOC) methods within a MP. The parameterization of the primitive is a graphical model that represents the dynamics and intrinsic cost function such that inference in this graphical model yields the control policy. We parameterize the intrinsic cost function using task-relevant features, such as the importance of passing through certain via-points. The system dynamics as well as intrinsic cost function parameters are learned in a reinforcement learning (RL) setting. We evaluate our approach on a complex 4-link balancing task. Our experiments show that our movement representation facilitates learning significantly and leads to better generalization to new task settings without re-learning. PMID:23293598

  2. Spatial polychaeta habitat potential mapping using probabilistic models

    NASA Astrophysics Data System (ADS)

    Choi, Jong-Kuk; Oh, Hyun-Joo; Koo, Bon Joo; Ryu, Joo-Hyung; Lee, Saro

    2011-06-01

    The purpose of this study was to apply probabilistic models to the mapping of the potential polychaeta habitat area in the Hwangdo tidal flat, Korea. Remote sensing techniques were used to construct spatial datasets of ecological environments and field observations were carried out to determine the distribution of macrobenthos. Habitat potential mapping was achieved for two polychaeta species, Prionospio japonica and Prionospio pulchra, and eight control factors relating to the tidal macrobenthos distribution were selected. These included the intertidal digital elevation model (DEM), slope, aspect, tidal exposure duration, distance from tidal channels, tidal channel density, spectral reflectance of the near infrared (NIR) bands and surface sedimentary facies from satellite imagery. The spatial relationships between the polychaeta species and each control factor were calculated using a frequency ratio and weights-of-evidence combined with geographic information system (GIS) data. The species were randomly divided into a training set (70%) to analyze habitat potential using frequency ratio and weights-of-evidence, and a test set (30%) to verify the predicted habitat potential map. The relationships were overlaid to produce a habitat potential map with a polychaeta habitat potential (PHP) index value. These maps were verified by comparing them to surveyed habitat locations such as the verification data set. For the verification results, the frequency ratio model showed prediction accuracies of 77.71% and 74.87% for P. japonica and P. pulchra, respectively, while those for the weights-of-evidence model were 64.05% and 62.95%. Thus, the frequency ratio model provided a more accurate prediction than the weights-of-evidence model. Our data demonstrate that the frequency ratio and weights-of-evidence models based upon GIS analysis are effective for generating habitat potential maps of polychaeta species in a tidal flat. The results of this study can be applied towards
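
    A minimal sketch of the frequency ratio calculation used above, for one control factor only: the ratio compares the share of species occurrences in a factor class with the share of the study area in that class, and a per-cell habitat potential index sums the ratios over all factors. The data and class labels below are hypothetical.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)

# Hypothetical grid cells: one control-factor class per cell plus species presence/absence
# (real inputs would come from the remote-sensing layers and the field survey)
df = pd.DataFrame({
    "exposure_class": rng.integers(0, 4, 5000),   # e.g. binned tidal exposure duration
    "present":        rng.random(5000) < 0.1,     # observed polychaeta occurrence
})

# Frequency ratio per class = (% of occurrences in class) / (% of study area in class)
area_pct = df["exposure_class"].value_counts(normalize=True)
occ_pct  = df.loc[df["present"], "exposure_class"].value_counts(normalize=True)
fr = (occ_pct / area_pct).fillna(0.0)
print(fr.sort_index())

# Habitat potential index per cell: sum the frequency ratios of the cell's factor
# classes (only one factor shown; the study combines eight control factors)
df["PHP_index"] = df["exposure_class"].map(fr)
```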

  3. Predicting coastal cliff erosion using a Bayesian probabilistic model

    USGS Publications Warehouse

    Hapke, Cheryl J.; Plant, Nathaniel G.

    2010-01-01

    Regional coastal cliff retreat is difficult to model due to the episodic nature of failures and the along-shore variability of retreat events. There is a growing demand, however, for predictive models that can be used to forecast areas vulnerable to coastal erosion hazards. Increasingly, probabilistic models are being employed that require data sets of high temporal density to define the joint probability density function that relates forcing variables (e.g. wave conditions) and initial conditions (e.g. cliff geometry) to erosion events. In this study we use a multi-parameter Bayesian network to investigate correlations between key variables that control and influence variations in cliff retreat processes. The network uses Bayesian statistical methods to estimate event probabilities using existing observations. Within this framework, we forecast the spatial distribution of cliff retreat along two stretches of cliffed coast in Southern California. The input parameters are the height and slope of the cliff, a descriptor of material strength based on the dominant cliff-forming lithology, and the long-term cliff erosion rate that represents prior behavior. The model is forced using predicted wave impact hours. Results demonstrate that the Bayesian approach is well-suited to the forward modeling of coastal cliff retreat, with the correct outcomes forecast in 70–90% of the modeled transects. The model also performs well in identifying specific locations of high cliff erosion, thus providing a foundation for hazard mapping. This approach can be employed to predict cliff erosion at time-scales ranging from storm events to the impacts of sea-level rise at the century-scale.

  4. Temporal Resolution in Time Series and Probabilistic Models of Renewable Power Systems

    NASA Astrophysics Data System (ADS)

    Hoevenaars, Eric

    There are two main types of logistical models used for long-term performance prediction of autonomous power systems: time series and probabilistic. Time series models are more common and are more accurate for sizing storage systems because they are able to track the state of charge. However, the computational time is usually greater than for probabilistic models. It is common for time series models to perform 1-year simulations with a 1-hour time step. This is likely because of the limited availability of high resolution data and the increase in computation time with a shorter time step. Computation time is particularly important because these types of models are often used for component size optimization which requires many model runs. This thesis includes a sensitivity analysis examining the effect of the time step on these simulations. The results show that it can be significant, though it depends on the system configuration and site characteristics. Two probabilistic models are developed to estimate the temporal resolution error of a 1-hour simulation: a time series/probabilistic model and a fully probabilistic model. To demonstrate the application of and evaluate the performance of these models, two case studies are analyzed. One is for a typical residential system and one is for a system designed to provide on-site power at an aquaculture site. The results show that the time series/probabilistic model would be a useful tool if accurate distributions of the sub-hour data can be determined. Additionally, the method of cumulant arithmetic is demonstrated to be a useful technique for incorporating multiple non-Gaussian random variables into a probabilistic model, a feature other models such as Hybrid2 currently do not have. The results from the fully probabilistic model showed that some form of autocorrelation is required to account for seasonal and diurnal trends.

  5. Probabilistic modelling of rainfall induced landslide hazard assessment

    NASA Astrophysics Data System (ADS)

    Kawagoe, S.; Kazama, S.; Sarukkalige, P. R.

    2010-06-01

    To evaluate the frequency and distribution of landslide hazards over Japan, this study uses a probabilistic model based on multiple logistic regression analysis. The study particularly concerns several important physical parameters, namely hydraulic, geographical and geological parameters, which are considered to be influential in the occurrence of landslides. Sensitivity analysis confirmed that a hydrological parameter (the hydraulic gradient) is the most influential factor in the occurrence of landslides. Therefore, the hydraulic gradient is used as the main hydraulic parameter; it is a dynamic factor that includes the effect of heavy rainfall and its return period. Using the constructed spatial data-sets, a multiple logistic regression model is applied and landslide hazard probability maps are produced showing the spatial-temporal distribution of landslide hazard probability over Japan. To represent the landslide hazard on different temporal scales, extreme precipitation with 5-year, 30-year, and 100-year return periods is used for the evaluation. The results show that the highest landslide hazard probability exists in the mountain ranges on the western side of Japan (the Japan Sea side), including the Hida, Kiso, Iide and Asahi mountain ranges, the southern side of the Chugoku mountain range, the southern side of the Kyushu mountains, the Dewa mountain range and the Hokuriku region. The landslide hazard probability maps developed in this study will assist authorities, policy makers and decision makers responsible for infrastructural planning and development, as they can identify landslide-susceptible areas and thus decrease landslide damage through proper preparation.
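
    A minimal sketch of the multiple logistic regression step on synthetic grid cells, with the hydraulic gradient as the dominant predictor as reported above. Variable names, coefficients, and the rainfall scenario are illustrative, not the study's values.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical training table: one row per grid cell, landslide occurrence as target
rng = np.random.default_rng(0)
n = 2000
hydraulic_gradient = rng.uniform(0.0, 1.0, n)     # dynamic, rainfall-dependent factor
slope_angle        = rng.uniform(0.0, 45.0, n)    # degrees
relief_energy      = rng.uniform(0.0, 500.0, n)   # m
# Synthetic occurrences, dominated by the hydraulic gradient as in the study
logit = -4 + 5 * hydraulic_gradient + 0.05 * slope_angle + 0.002 * relief_energy
occurred = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

X = np.column_stack([hydraulic_gradient, slope_angle, relief_energy])
model = LogisticRegression(max_iter=1000).fit(X, occurred)

# Hazard probability for new cells; an extreme-rainfall scenario enters through
# a higher hydraulic gradient
new_cells = np.array([[0.9, 30.0, 300.0], [0.2, 10.0, 100.0]])
print(model.predict_proba(new_cells)[:, 1])   # landslide hazard probabilities
```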

  6. Probabilistic modelling of rainfall induced landslide hazard assessment

    NASA Astrophysics Data System (ADS)

    Kawagoe, S.; Kazama, S.; Sarukkalige, P. R.

    2010-01-01

    To evaluate the frequency and distribution of landslide hazards over Japan, this study uses a probabilistic model based on multiple logistic regression analysis. The study particularly concerns several important physical parameters, namely hydraulic, geographical and geological parameters, which are considered to be influential in the occurrence of landslides. Sensitivity analysis confirmed that a hydrological parameter (the hydraulic gradient) is the most influential factor in the occurrence of landslides. Therefore, the hydraulic gradient is used as the main hydraulic parameter; it is a dynamic factor that includes the effect of heavy rainfall and its return period. Using the constructed spatial data-sets, a multiple logistic regression model is applied and landslide susceptibility maps are produced showing the spatial-temporal distribution of landslide hazard susceptibility over Japan. To represent the susceptibility on different temporal scales, extreme precipitation with 5-year, 30-year, and 100-year return periods is used for the evaluation. The results show that the highest landslide hazard susceptibility exists in the mountain ranges on the western side of Japan (the Japan Sea side), including the Hida, Kiso, Iide and Asahi mountain ranges, the southern side of the Chugoku mountain range, the southern side of the Kyushu mountains, the Dewa mountain range and the Hokuriku region. The landslide hazard susceptibility maps developed in this study will assist authorities, policy makers and decision makers responsible for infrastructural planning and development, as they can identify landslide-susceptible areas and thus decrease landslide damage through proper preparation.

  7. Effects of shipping on marine acoustic habitats in Canadian Arctic estimated via probabilistic modeling and mapping.

    PubMed

    Aulanier, Florian; Simard, Yvan; Roy, Nathalie; Gervaise, Cédric; Bandet, Marion

    2017-08-29

    Canadian Arctic and Subarctic regions are experiencing a rapid decrease in sea ice accompanied by increasing shipping traffic. The resulting time-space changes in shipping noise are studied for four key regions of this pristine environment, for 2013 traffic conditions and a hypothetical tenfold traffic increase. A probabilistic modeling and mapping framework, called Ramdam, which integrates the intrinsic variability and uncertainties of shipping noise and its effects on marine habitats, is developed and applied. A substantial transformation of soundscapes is observed in areas where shipping noise changes from a present-day occasional, transient contributor to a dominant noise source. Examination of impacts on low-frequency mammals within ecologically and biologically significant areas reveals that shipping noise has the potential to trigger behavioral responses and masking in the future, although no risk of temporary or permanent hearing threshold shifts is noted. Such probabilistic modeling and mapping is strategic for marine spatial planning around these emerging noise issues. Crown Copyright © 2017. Published by Elsevier Ltd. All rights reserved.

  8. Probabilistic approaches to the modelling of fluvial processes

    NASA Astrophysics Data System (ADS)

    Molnar, Peter

    2013-04-01

    Fluvial systems generally exhibit sediment dynamics that are strongly stochastic. This stochasticity comes basically from three sources: (a) the variability and randomness in sediment supply due to surface properties and topography; (b) the multitude of pathways that sediment may take on hillslopes and in channels, and the uncertainty in travel times and sediment storage along those pathways; and (c) the stochasticity inherent in mobilizing sediment, whether by heavy rain, landslides, debris flows, slope erosion, channel avulsions, etc. Fully deterministic models of fluvial systems, even if they are physically realistic and very complex, are likely to be unable to capture this stochasticity and as a result will fail to reproduce long-term sediment dynamics. In this paper I will review another approach to modelling fluvial processes, which grossly simplifies the system itself but allows for stochasticity in sediment supply, mobilization and transport. I will demonstrate the benefits and limitations of this probabilistic approach to fluvial processes with three examples. The first example is a probabilistic sediment cascade which we developed for the Illgraben, a debris flow basin in the Rhone catchment. In this example it will be shown how the probability distribution of landslides generating sediment input into the channel system is transposed into that of sediment yield out of the basin by debris flows. The key role of transient sediment storage in the channel system, which limits the size of potential debris flows, is highlighted together with the influence of the landslide triggering mechanisms and climate stochasticity. The second example focuses on the river reach scale in the Maggia River, a braided gravel-bed stream where the exposed sediment on gravel bars is colonised by riparian vegetation in periods without floods. A simple autoregressive model with a disturbance and colonization term is used to simulate the growth and decline in

  9. Formulation of probabilistic models of protein structure in atomic detail using the reference ratio method.

    PubMed

    Valentin, Jan B; Andreetta, Christian; Boomsma, Wouter; Bottaro, Sandro; Ferkinghoff-Borg, Jesper; Frellsen, Jes; Mardia, Kanti V; Tian, Pengfei; Hamelryck, Thomas

    2014-02-01

    We propose a method to formulate probabilistic models of protein structure in atomic detail, for a given amino acid sequence, based on Bayesian principles, while retaining a close link to physics. We start from two previously developed probabilistic models of protein structure on a local length scale, which concern the dihedral angles in main chain and side chains, respectively. Conceptually, this constitutes a probabilistic and continuous alternative to the use of discrete fragment and rotamer libraries. The local model is combined with a nonlocal model that involves a small number of energy terms according to a physical force field, and some information on the overall secondary structure content. In this initial study we focus on the formulation of the joint model and the evaluation of the use of an energy vector as a descriptor of a protein's nonlocal structure; hence, we derive the parameters of the nonlocal model from the native structure without loss of generality. The local and nonlocal models are combined using the reference ratio method, which is a well-justified probabilistic construction. For evaluation, we use the resulting joint models to predict the structure of four proteins. The results indicate that the proposed method and the probabilistic models show considerable promise for probabilistic protein structure prediction and related applications. Copyright © 2013 Wiley Periodicals, Inc.

  10. Probabilistic modelling of flood events using the entropy copula

    NASA Astrophysics Data System (ADS)

    Li, Fan; Zheng, Qian

    2016-11-01

    The estimation of flood frequency is vital for flood control strategies and hydraulic structure design. Generating synthetic flood events according to the statistical properties of observations is one plausible method of analyzing flood frequency. Due to the statistical dependence among the flood event variables (i.e. the flood peak, volume and duration), a multidimensional joint probability estimation is required. Recently, the copula method has been widely used for constructing multivariate dependence structures; however, the copula family must be chosen before application and the choice process is sometimes rather subjective. The entropy copula, a new copula family employed in this research, offers a way to avoid this relatively subjective process by combining the theories of copula and entropy. The analysis shows the effectiveness of the entropy copula for probabilistic modelling of the flood events at two hydrological gauges, and a comparison of accuracy with popular copulas was made. The Gibbs sampling technique was applied for trivariate flood event simulation in order to mitigate the calculation difficulties of extending directly to three dimensions. The simulation results indicate that the entropy copula is a simple and effective copula family for trivariate flood simulation.

  11. Sonar signal processing using probabilistic signal and ocean environmental models.

    PubMed

    Culver, R Lee; Camin, H John

    2008-12-01

    Acoustic signals propagating through the ocean are refracted, scattered, and attenuated by the ocean volume and boundaries. Many aspects of how the ocean affects acoustic propagation are understood, such that the characteristics of a received signal can often be predicted with some degree of certainty. However, acoustic ocean parameters vary with time and location in a manner that is not, and cannot be, precisely known; some uncertainty will always remain. For this reason, the characteristics of the received signal can never be precisely predicted and must be described in probabilistic terms. A signal processing structure recently developed relies on knowledge of the ocean environment to predict the statistical characteristics of the received signal, and incorporates this description into the processor in order to detect and classify targets. Acoustic measurements at 250 Hz from the 1996 Strait of Gibraltar Acoustic Monitoring Experiment are used to illustrate how the processor utilizes environmental data to classify source depth and to underscore the importance of environmental model fidelity and completeness.

  12. Poisson Group Testing: A Probabilistic Model for Boolean Compressed Sensing

    NASA Astrophysics Data System (ADS)

    Emad, Amin; Milenkovic, Olgica

    2015-08-01

    We introduce a novel probabilistic group testing framework, termed Poisson group testing, in which the number of defectives follows a right-truncated Poisson distribution. The Poisson model has a number of new applications, including dynamic testing with diminishing relative rates of defectives. We consider both nonadaptive and semi-adaptive identification methods. For nonadaptive methods, we derive a lower bound on the number of tests required to identify the defectives with a probability of error that asymptotically converges to zero; in addition, we propose test matrix constructions for which the number of tests closely matches the lower bound. For semi-adaptive methods, we describe a lower bound on the expected number of tests required to identify the defectives with zero error probability. In addition, we propose a stage-wise reconstruction algorithm for which the expected number of tests is only a constant factor away from the lower bound. The methods rely only on an estimate of the average number of defectives, rather than on the individual probabilities of subjects being defective.

  13. Probabilistic model for fracture mechanics service life analysis

    NASA Technical Reports Server (NTRS)

    Annis, Charles; Watkins, Tommie

    1988-01-01

    The service longevity of complex propulsion systems, such as the Space Shuttle Main Engine (SSME), can be at risk from several competing failure modes. Conventional life assessment practice focuses upon the most severely life-limited feature of a given component, even though there may be other, less severe, potential failure locations. Primary, secondary, and tertiary failure modes, as well as their associated probabilities, must also be considered. Furthermore, these probabilities are functions of accumulated service time. Thus a component may not always succumb to the most severe, or even the most probable, failure mode. Propulsion system longevity must be assessed by considering simultaneously the actions of, and interactions among, life-limiting influences. These include, but are not limited to, high frequency fatigue (HFF), low cycle fatigue (LCF) and subsequent crack propagation, thermal and acoustic loadings, and the influence of less-than-ideal nondestructive evaluation (NDE). An outline is provided for a probabilistic model for service life analysis, and progress toward its implementation is reported.

  14. Comprehensive probabilistic modelling of environmental emissions of engineered nanomaterials.

    PubMed

    Sun, Tian Yin; Gottschalk, Fadri; Hungerbühler, Konrad; Nowack, Bernd

    2014-02-01

    Concerns about the environmental risks of engineered nanomaterials (ENM) are growing, however, currently very little is known about their concentrations in the environment. Here, we calculate the concentrations of five ENM (nano-TiO2, nano-ZnO, nano-Ag, CNT and fullerenes) in environmental and technical compartments using probabilistic material-flow modelling. We apply the newest data on ENM production volumes, their allocation to and subsequent release from different product categories, and their flows into and within those compartments. Further, we compare newly predicted ENM concentrations to estimates from 2009 and to corresponding measured concentrations of their conventional materials, e.g. TiO2, Zn and Ag. We show that the production volume and the compounds' inertness are crucial factors determining final concentrations. ENM production estimates are generally higher than a few years ago. In most cases, the environmental concentrations of corresponding conventional materials are between one and seven orders of magnitude higher than those for ENM. Copyright © 2013 Elsevier Ltd. All rights reserved.

  15. The Terrestrial Investigation Model: A probabilistic risk assessment model for birds exposed to pesticides

    EPA Science Inventory

    One of the major recommendations of the National Academy of Science to the USEPA, NMFS and USFWS was to utilize probabilistic methods when assessing the risks of pesticides to federally listed endangered and threatened species. The Terrestrial Investigation Model (TIM, version 3....

  17. Forecasting the duration of volcanic eruptions: an empirical probabilistic model

    NASA Astrophysics Data System (ADS)

    Gunn, L. S.; Blake, S.; Jones, M. C.; Rymer, H.

    2014-01-01

    The ability to forecast future volcanic eruption durations would greatly benefit emergency response planning prior to and during a volcanic crisis. This paper introduces a probabilistic model to forecast the duration of future and on-going eruptions. The model fits theoretical distributions to observed duration data and relies on past eruptions being a good indicator of future activity. A dataset of historical Mt. Etna flank eruptions is presented and used to demonstrate the model. The data have been compiled through critical examination of existing literature along with careful consideration of uncertainties on reported eruption start and end dates between the years 1300 AD and 2010. Data following 1600 are considered to be reliable and free of reporting biases. The distribution of eruption durations between the years 1600 and 1669 is found to be statistically different from that following it, and the forecasting model is run on two datasets of Mt. Etna flank eruption durations: 1600-2010 and 1670-2010. Each dataset is modelled using a log-logistic distribution with parameter values found by maximum likelihood estimation. Survivor function statistics are applied to the model distributions to forecast (a) the probability of an eruption exceeding a given duration, (b) the probability of an eruption that has already lasted a particular number of days exceeding a given total duration and (c) the duration with a given probability of being exceeded. Results show that excluding the 1600-1670 data has little effect on the forecasting model result, especially where short durations are involved. By assigning the terms `likely' and `unlikely' to probabilities of 66 % or more and 33 % or less, respectively, the forecasting model based on the 1600-2010 dataset indicates that a future flank eruption on Mt. Etna would be likely to exceed 20 days (± 7 days) but unlikely to exceed 86 days (± 29 days). This approach can easily be adapted for use on other highly active, well
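
    The three survivor-function forecasts listed above can be sketched with SciPy's log-logistic (Fisk) distribution. The durations below are illustrative stand-ins for the Etna catalogue, so the printed values will not match the paper's numbers.

```python
import numpy as np
from scipy import stats

# Hypothetical eruption durations in days (stand-ins for the historical catalogue)
durations = np.array([3, 7, 12, 15, 20, 24, 30, 41, 55, 68, 90, 120, 150, 210, 365])

# Fit a log-logistic (Fisk) distribution by maximum likelihood, location fixed at 0
c, loc, scale = stats.fisk.fit(durations, floc=0)

# (a) probability a future eruption exceeds a given duration
print("P(duration > 20 d):", stats.fisk.sf(20, c, loc, scale))

# (b) probability of exceeding D_total given the eruption has already lasted d days
d, D_total = 30, 86
p = stats.fisk.sf(D_total, c, loc, scale) / stats.fisk.sf(d, c, loc, scale)
print("P(duration > 86 d | already 30 d):", p)

# (c) duration with a 33% probability of being exceeded ("unlikely to exceed")
print("33% exceedance duration (days):", stats.fisk.isf(0.33, c, loc, scale))
```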

  18. Probabilistic Modeling of Tephra Dispersion using Parallel Processing

    NASA Astrophysics Data System (ADS)

    Hincks, T.; Bonadonna, C.; Connor, L.; Connor, C.; Sparks, S.

    2002-12-01

    Numerical models of tephra accumulation are important tools in assessing the hazards of volcanic eruptions. Such tools can be used far in advance of future eruptions to calculate possible hazards as conditional probabilities. For example, given that a volcanic eruption occurs, what is the expected range of tephra deposition at a specific location or across a region? An empirical model is presented that uses physical characteristics (e.g., volume, column height, particle size distribution) of a volcanic eruption to calculate the expected tephra accumulation at geographic locations distant from the vent. This model results from the combination of the Connor et al. (2001) and Bonadonna et al. (1998, 2002) numerical approaches and is based on application of the diffusion-advection equation using a stratified atmosphere and particle fall velocities that account for particle shape, density, and variation in Reynolds number along the path of descent. The distribution of particles in the eruption column is a major source of uncertainty in the estimation of tephra hazards. We adopt an approach in which several models of the volcanic column may be used and the impact of these various source-term models on the hazard estimate assessed. Cast probabilistically, this model can use characteristics of historical eruptions, or data from analogous eruptions, to predict the expected tephra deposition from future eruptions. Application of such a model for computing a large number of events over a grid of many points is computationally expensive. In fact, the utility of the model for stochastic simulations of volcanic eruptions was limited by long execution time. To address this concern, we created a parallel version in C and MPI, a message passing interface, to run on a Beowulf cluster, a private network of reasonably high performance computers. We have discovered that grid or input decomposition and self-scheduling techniques lead to essentially linear speed-up in the code. This means that the code is readily
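
    The self-scheduling parallelisation mentioned above (a master handing grid points to workers as they become free) can be sketched with mpi4py; the original code is in C and MPI, and the tephra solver here is replaced by a placeholder function, so this is only an illustration of the scheduling pattern. Run with, e.g., mpiexec -n 4 python script.py.

```python
# Self-scheduling (master/worker) sketch with mpi4py; compute_load is a placeholder
# for the tephra accumulation solver, and the grid is illustrative.
from mpi4py import MPI

def compute_load(point):
    x, y = point
    return (x + y) * 1e-3          # stand-in for the diffusion-advection calculation

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()
TAG_WORK, TAG_STOP = 1, 2

if rank == 0:
    # Master: hand out one grid point per idle worker (assumes more points than workers)
    grid = [(i, j) for i in range(100) for j in range(100)]
    results, sent = {}, 0
    for worker in range(1, size):
        comm.send(grid[sent], dest=worker, tag=TAG_WORK)
        sent += 1
    for _ in range(len(grid)):
        status = MPI.Status()
        point, load = comm.recv(source=MPI.ANY_SOURCE, status=status)
        results[point] = load
        worker = status.Get_source()
        if sent < len(grid):
            comm.send(grid[sent], dest=worker, tag=TAG_WORK)
            sent += 1
        else:
            comm.send(None, dest=worker, tag=TAG_STOP)
    print("computed", len(results), "grid points")
else:
    # Worker: keep computing until told to stop
    while True:
        status = MPI.Status()
        task = comm.recv(source=0, status=status)
        if status.Get_tag() == TAG_STOP:
            break
        comm.send((task, compute_load(task)), dest=0)
```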

  19. Rockfall hazard assessment integrating probabilistic physically based rockfall source detection (Norddal municipality, Norway).

    NASA Astrophysics Data System (ADS)

    Yugsi Molina, F. X.; Oppikofer, T.; Fischer, L.; Hermanns, R. L.; Taurisano, A.

    2012-04-01

    Traditional techniques to assess rockfall hazard are partially based on probabilistic analysis. Stochastic methods has been used for run-out analysis of rock blocks to estimate the trajectories that a detached block will follow during its fall until it stops due to kinetic energy loss. However, the selection of rockfall source areas is usually defined either by multivariate analysis or by field observations. For either case, a physically based approach is not used for the source area detection. We present an example of rockfall hazard assessment that integrates a probabilistic rockfall run-out analysis with a stochastic assessment of the rockfall source areas using kinematic stability analysis in a GIS environment. The method has been tested for a steep more than 200 m high rock wall, located in the municipality of Norddal (Møre og Romsdal county, Norway), where a large number of people are either exposed to snow avalanches, rockfalls, or debris flows. The area was selected following the recently published hazard mapping plan of Norway. The cliff is formed by medium to coarse-grained quartz-dioritic to granitic gneisses of Proterozoic age. Scree deposits product of recent rockfall activity are found at the bottom of the rock wall. Large blocks can be found several tens of meters away from the cliff in Sylte, the main locality in the Norddal municipality. Structural characterization of the rock wall was done using terrestrial laser scanning (TLS) point clouds in the software Coltop3D (www.terranum.ch), and results were validated with field data. Orientation data sets from the structural characterization were analyzed separately to assess best-fit probability density functions (PDF) for both dip angle and dip direction angle of each discontinuity set. A GIS-based stochastic kinematic analysis was then carried out using the discontinuity set orientations and the friction angle as random variables. An airborne laser scanning digital elevation model (ALS-DEM) with 1 m

  20. An empirical model for probabilistic decadal prediction: A global analysis

    NASA Astrophysics Data System (ADS)

    Suckling, Emma; Hawkins, Ed; Eden, Jonathan; van Oldenborgh, Geert Jan

    2016-04-01

    Empirical models, designed to predict land-based surface variables over seasons to decades ahead, provide useful benchmarks for comparison against the performance of dynamical forecast systems; they may also be employable as predictive tools for use by climate services in their own right. A new global empirical decadal prediction system is presented, based on a multiple linear regression approach designed to produce probabilistic output for comparison against dynamical models. Its performance is evaluated for surface air temperature over a set of historical hindcast experiments under a series of different prediction `modes'. The modes include a real-time setting, a scenario in which future volcanic forcings are prescribed during the hindcasts, and an approach which exploits knowledge of the forced trend. A two-tier prediction system, which uses knowledge of future sea surface temperatures in the Pacific and Atlantic Oceans, is also tested, but within a perfect knowledge framework. Each mode is designed to identify sources of predictability and uncertainty, as well as investigate different approaches to the design of decadal prediction systems for operational use. It is found that the empirical model shows skill above that of persistence hindcasts for annual means at lead times of up to ten years ahead in all of the prediction modes investigated. Small improvements in skill are found at all lead times when including future volcanic forcings in the hindcasts. It is also suggested that hindcasts which exploit full knowledge of the forced trend due to increasing greenhouse gases throughout the hindcast period can provide more robust estimates of model bias for the calibration of the empirical model in an operational setting. The two-tier system shows potential for improved real-time prediction, given the assumption that skilful predictions of large-scale modes of variability are available. The empirical model framework has been designed with enough flexibility to
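
    A minimal sketch of a probabilistic multiple-linear-regression hindcast of the kind described above, with a forced-trend predictor, a persistence predictor, and a Gaussian predictive spread taken from the regression residuals. The data are synthetic stand-ins and only a one-year lead is shown.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in data: annual-mean temperature anomalies at one grid point, 1960-2010,
# built from a forced trend plus noise (real hindcasts would use observations)
years = np.arange(1960, 2011)
forced_trend = 0.02 * (years - years[0])          # proxy for the forced response
anomaly = forced_trend + rng.normal(0.0, 0.15, len(years))

# Multiple linear regression: anomaly(t) ~ intercept + forced_trend(t) + anomaly(t-1)
X = np.column_stack([np.ones(len(years) - 1), forced_trend[1:], anomaly[:-1]])
y = anomaly[1:]
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
sigma = np.std(y - X @ beta, ddof=X.shape[1])     # residual spread -> predictive width

# Probabilistic one-year-ahead forecast: Gaussian around the regression mean
x_next = np.array([1.0, forced_trend[-1] + 0.02, anomaly[-1]])
mean = x_next @ beta
print(f"forecast anomaly: {mean:.2f} K, 5-95% range +/- {1.64 * sigma:.2f} K")
```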

  1. A Survey of Probabilistic Models for Relational Data

    SciTech Connect

    Koutsourelakis, P S

    2006-10-13

    Traditional data mining methodologies have focused on ''flat'' data i.e. a collection of identically structured entities, assumed to be independent and identically distributed. However, many real-world datasets are innately relational in that they consist of multi-modal entities and multi-relational links (where each entity- or link-type is characterized by a different set of attributes). Link structure is an important characteristic of a dataset and should not be ignored in modeling efforts, especially when statistical dependencies exist between related entities. These dependencies can in fact significantly improve the accuracy of inference and prediction results, if the relational structure is appropriately leveraged (Figure 1). The need for models that can incorporate relational structure has been accentuated by new technological developments which allow us to easily track, store, and make accessible large amounts of data. Recently, there has been a surge of interest in statistical models for dealing with richly interconnected, heterogeneous data, fueled largely by information mining of web/hypertext data, social networks, bibliographic citation data, epidemiological data and communication networks. Graphical models have a natural formalism for representing complex relational data and for predicting the underlying evolving system in a dynamic framework. The present survey provides an overview of probabilistic methods and techniques that have been developed over the last few years for dealing with relational data. Particular emphasis is paid to approaches pertinent to the research areas of pattern recognition, group discovery, entity/node classification, and anomaly detection. We start with supervised learning tasks, where two basic modeling approaches are discussed--i.e. discriminative and generative. Several discriminative techniques are reviewed and performance results are presented. Generative methods are discussed in a separate survey. A special section is

  2. Proposal for a probabilistic local level landslide hazard assessment model: The case of Suluktu, Kyrgyzstan

    NASA Astrophysics Data System (ADS)

    Vidar Vangelsten, Bjørn; Fornes, Petter; Cepeda, Jose Mauricio; Ekseth, Kristine Helene; Eidsvig, Unni; Ormukov, Cholponbek

    2015-04-01

    Landslides are a significant threat to human life and the built environment in many parts of Central Asia. To improve understanding of the magnitude of the threat and propose appropriate risk mitigation measures, landslide hazard mapping is needed both at regional and local level. Many different approaches for landslide hazard mapping exist depending on the scale and purpose of the analysis and what input data are available. This paper presents a probabilistic local scale landslide hazard mapping methodology for rainfall triggered landslides, adapted to the relatively dry climate found in South-Western Kyrgyzstan. The GIS based approach makes use of data on topography, geology, land use and soil characteristics to assess landslide susceptibility. Together with a selected rainfall scenario, these data are inserted into a triggering model based on an infinite slope formulation considering pore pressure and suction effects for unsaturated soils. A statistical model based on local landslide data has been developed to estimate landslide run-out. The model links the spatial extension of the landslide to land use and geological features. The model is tested and validated for the town of Suluktu in the Ferghana Valley in South-West Kyrgyzstan. Landslide hazard is estimated for the urban area and the surrounding hillsides. The case makes use of a range of data from different sources, both remote sensing data and in-situ data. Public global data sources are mixed with case specific data obtained from field work. The different data and models have various degrees of uncertainty. To account for this, the hazard model has been inserted into a Monte Carlo simulation framework to produce a probabilistic landslide hazard map identifying areas with high landslide exposure. The research leading to these results has received funding from the European Commission's Seventh Framework Programme [FP7/2007-2013], under grant agreement n° 312972 "Framework to integrate Space-based and in
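
    The triggering model described above is based on an infinite slope formulation evaluated inside a Monte Carlo loop. The sketch below applies the standard infinite-slope factor-of-safety expression to a single terrain cell with illustrative parameter distributions (not the Suluktu inputs) and estimates a failure probability for one rainfall scenario.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50_000

# Illustrative parameter distributions for one terrain cell
cohesion   = rng.lognormal(np.log(5.0), 0.4, n)      # effective cohesion c', kPa
phi        = np.radians(rng.normal(32.0, 3.0, n))    # friction angle, rad
gamma      = rng.normal(19.0, 1.0, n)                # unit weight, kN/m^3
depth      = rng.uniform(1.0, 3.0, n)                # failure surface depth z, m
beta       = np.radians(30.0)                        # slope angle
pore_press = rng.uniform(0.0, 15.0, n)               # pore pressure u, kPa (rainfall scenario)

# Infinite slope factor of safety:
# FS = (c' + (gamma*z*cos^2(beta) - u) * tan(phi)) / (gamma*z*sin(beta)*cos(beta))
fs = (cohesion + (gamma * depth * np.cos(beta)**2 - pore_press) * np.tan(phi)) \
     / (gamma * depth * np.sin(beta) * np.cos(beta))

print("P(failure) = P(FS < 1):", np.mean(fs < 1.0))
```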

  3. Object features fail independently in visual working memory: evidence for a probabilistic feature-store model.

    PubMed

    Fougnie, Daryl; Alvarez, George A

    2011-10-06

    The world is composed of features and objects and this structure may influence what is stored in working memory. It is widely believed that the content of memory is object-based: Memory stores integrated objects, not independent features. We asked participants to report the color and orientation of an object and found that memory errors were largely independent: Even when one of the object's features was entirely forgotten, the other feature was often reported. This finding contradicts object-based models and challenges fundamental assumptions about the organization of information in working memory. We propose an alternative framework involving independent self-sustaining representations that may fail probabilistically and independently for each feature. This account predicts that the degree of independence in feature storage is determined by the degree of overlap in neural coding during perception. Consistent with this prediction, we found that errors for jointly encoded dimensions were less independent than errors for independently encoded dimensions.

  4. Evolution of the sewage treatment plant model SimpleTreat: use of realistic biodegradability tests in probabilistic model simulations.

    PubMed

    Franco, Antonio; Struijs, Jaap; Gouin, Todd; Price, Oliver R

    2013-10-01

    Given the large number of chemicals under regulatory scrutiny, models play a crucial role in the screening phase of the environmental risk assessment. The sewage treatment plant (STP) model SimpleTreat 3.1 is routinely applied as part of the European Union System for the Evaluation of Substances to estimate the fate and elimination of organic chemicals discharged via sewage. SimpleTreat estimates tend to be conservative and therefore only useful for lower-tier assessments. A probabilistic version of SimpleTreat was built on the updated version of the model (SimpleTreat 3.2, presented in a parallel article in this issue), embracing likeliest as well as worst-case conditions in a statistically robust way. Probabilistic parameters representing the variability of sewage characteristics, STP design, and operational parameters were based on actual STP conditions for activated sludge plants in Europe. An evaluation study was carried out for 4 chemicals with distinct sorption and biodegradability profiles: tonalide, triclosan, trimethoprim, and linear alkylbenzene sulfonate. Simulations incorporated information on biodegradability simulation studies with activated sludge (OECD 314B and OECD 303A tests). Good agreement for both median values and variability ranges was observed between model estimates and monitoring data. The uncertainty analysis highlighted the importance of refined data on partitioning and biodegradability in activated sludge to achieve realistic estimates. The study indicates that the best strategy to refine the exposure assessment of down-the-drain chemicals is by integrating higher-tier laboratory data with probabilistic STP simulations and, if possible, by comparing them with monitoring data for validation.

  5. Boolean Queries and Term Dependencies in Probabilistic Retrieval Models.

    ERIC Educational Resources Information Center

    Croft, W. Bruce

    1986-01-01

    Proposes approach to integrating Boolean and statistical systems where Boolean queries are interpreted as a means of specifying term dependencies in relevant set of documents. Highlights include series of retrieval experiments designed to test retrieval strategy based on term dependence model and relation of results to other work. (18 references)…

  6. White matter integrity of cerebellar-cortical tracts in reading impaired children: A probabilistic tractography study

    PubMed Central

    Fernandez, Vindia G.; Juranek, Jenifer; Romanowska-Pawliczek, Anna; Stuebing, Karla; Williams, Victoria J.; Fletcher, Jack M.

    2016-01-01

    Little is known about the white matter integrity of cerebellar-cortical pathways in individuals with dyslexia. Building on previous findings of decreased volume in the anterior lobe of the cerebellum, we utilized novel cerebellar segmentation procedures and probabilistic tractography to examine tracts that connect the anterior lobe of the cerebellum and cortical regions typically associated with reading: the temporoparietal (TP), occipitotemporal (OT), and inferior frontal (IF) regions. The sample included 29 reading impaired children and 27 typical readers. We found greater fractional anisotropy (FA) for the poor readers in tracts connecting the cerebellum with TP and IF regions relative to typical readers. In the OT region, FA was greater for the older poor readers, but smaller for the younger ones. This study provides evidence for discrete, regionally-bound functions of the cerebellum and suggests that projections from the anterior cerebellum appear to have a regulatory effect on cortical pathways important for reading. PMID:26307492

  7. Uncertainty assessment for watershed water quality modeling: A Probabilistic Collocation Method based approach

    NASA Astrophysics Data System (ADS)

    Zheng, Yi; Wang, Weiming; Han, Feng; Ping, Jing

    2011-07-01

    Watershed water quality models are increasingly used in management. However, simulations by such complex models often involve significant uncertainty, especially those for non-conventional pollutants which are often poorly monitored. This study first proposed an integrated framework for watershed water quality modeling. Within this framework, Probabilistic Collocation Method (PCM) was then applied to a WARMF model of diazinon pollution to assess the modeling uncertainty. Based on PCM, a global sensitivity analysis method named PCM-VD (VD stands for variance decomposition) was also developed, which quantifies variance contribution of all uncertain parameters. The study results validated the applicability of PCM and PCM-VD to the WARMF model. The PCM-based approach is much more efficient, regarding computational time, than conventional Monte Carlo methods. It has also been demonstrated that analysis using the PCM-based approach could provide insights into data collection, model structure improvement and management practices. It was concluded that the PCM-based approach could play an important role in watershed water quality modeling, as an alternative to conventional Monte Carlo methods to account for parametric uncertainty and uncertainty propagation.

  8. A Probabilistic Computational Framework for Neural Network Models

    DTIC Science & Technology

    1987-09-29

    Inspection of the form of P indicates the class of probabilistic environments that can be learned. Learning algorithms can be analyzed and designed

  9. A Practical Probabilistic Graphical Modeling Tool for Weighing Ecological Risk-Based Evidence

    EPA Science Inventory

    Past weight-of-evidence frameworks for adverse ecological effects have provided soft-scoring procedures for judgments based on the quality and measured attributes of evidence. Here, we provide a flexible probabilistic structure for weighing and integrating lines of evidence for e...

  11. Probabilistic modeling of flood characterizations with parametric and minimum information pair-copula model

    NASA Astrophysics Data System (ADS)

    Daneshkhah, Alireza; Remesan, Renji; Chatrabgoun, Omid; Holman, Ian P.

    2016-09-01

    This paper highlights the usefulness of the minimum information and parametric pair-copula constructions (PCC) for modeling the joint distribution of flood event properties. Both models outperform other standard multivariate copulas in modeling multivariate flood data that exhibit complex patterns of dependence, particularly in the tails. In particular, the minimum information pair-copula model shows greater flexibility, produces a better approximation of the joint probability density, and yields corresponding measures that support effective hazard assessment. The study demonstrates that any multivariate density can be approximated to any desired degree of precision using the minimum information pair-copula model, which can therefore be used in practice for probabilistic flood hazard assessment.
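
    The dependence-modeling step can be illustrated with a much simpler Gaussian copula (not the paper's minimum-information or parametric PCC); the flood records and variable names below are synthetic placeholders.

    ```python
    # Minimal sketch: Gaussian copula for the joint dependence of flood peak and volume.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)

    # Hypothetical flood records: peak discharge (m^3/s) and event volume (10^6 m^3)
    peak = rng.gumbel(loc=300, scale=80, size=500)
    volume = 0.05 * peak + rng.gamma(shape=2.0, scale=3.0, size=500)

    def normal_scores(x):
        """Map a margin to standard-normal scores via its pseudo-observations."""
        u = stats.rankdata(x) / (len(x) + 1)
        return stats.norm.ppf(u)

    z = np.column_stack([normal_scores(peak), normal_scores(volume)])
    rho = np.corrcoef(z.T)[0, 1]  # copula correlation parameter
    print(f"estimated Gaussian-copula correlation: {rho:.2f}")

    # Joint exceedance P(both variables above their 90th percentiles) from the copula
    z_sim = rng.multivariate_normal([0, 0], [[1, rho], [rho, 1]], size=200_000)
    u_sim = stats.norm.cdf(z_sim)
    joint = np.mean((u_sim[:, 0] > 0.9) & (u_sim[:, 1] > 0.9))
    print(f"P(joint exceedance) ~ {joint:.3f} (0.010 under independence)")
    ```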

  12. Multidimensional analysis and probabilistic model of volcanic and seismic activities

    NASA Astrophysics Data System (ADS)

    Fedorov, V.

    2009-04-01

    .I. Gushchenko, 1979) and seismological (database of USGS/NEIC Significant Worldwide Earthquakes, 2150 B.C.-1994 A.D.) information which displays the dynamics of endogenic relief-forming processes over the period 1900 to 1994. In the course of the analysis, the calendar variable was replaced by a corresponding astronomical one and the epoch superposition method was applied. In essence, the method consists in differentiating the records of volcanic eruptions (1900 to 1977) and seismic events (1900-1994) with respect to the values of the astronomical parameters corresponding to the calendar dates of the known eruptions and earthquakes, regardless of the calendar year. The obtained spectra of the distribution of volcanic eruptions and violent earthquakes in the fields of the Earth's orbital movement parameters were used as a basis for calculating frequency spectra and the diurnal probability of volcanic and seismic activity. The objective of the proposed investigations is the development of a probabilistic model of volcanic and seismic events, as well as the design of a GIS for monitoring and forecasting volcanic and seismic activity. In accordance with the stated objective, three probability parameters were found in the course of preliminary studies; they form the basis for GIS monitoring and forecast development. 1. A multidimensional analysis of volcanic eruptions and earthquakes (of magnitude 7) has been performed in terms of the Earth's orbital movement. Probability characteristics of volcanism and seismicity have been defined for the Earth as a whole. Time intervals have been identified with a diurnal probability twice as great as the mean value. The diurnal probability of volcanic and seismic events has been calculated up to 2020. 2. A regularity in the duration of dormant (repose) periods has been established. A relationship has been found between the distribution of the repose period probability density and the duration of the period. 3

  13. Age-Associated Alterations in Corpus Callosum White Matter Integrity in Bipolar Disorder Assessed Using Probabilistic Tractography

    PubMed Central

    Toteja, Nitin; Cokol, Perihan Guvenek; Ikuta, Toshikazu; Kafantaris, Vivian; Peters, Bart D.; Burdick, Katherine E.; John, Majnu; Malhotra, Anil K.; Szeszko, Philip R.

    2014-01-01

    Objectives Atypical age-associated changes in white matter integrity may play a role in the neurobiology of bipolar disorder, but no studies have examined the major white matter tracts using nonlinear statistical modeling across a wide age range in this disorder. The goal of this study was to identify possible deviations in the typical pattern of age-associated changes in white matter integrity in patients with bipolar disorder across the age range of 9 to 62 years. Methods Diffusion tensor imaging was performed in 57 (20M/37F) patients with a diagnosis of bipolar disorder and 57 (20M/37F) age- and sex-matched healthy volunteers. Mean diffusivity and fractional anisotropy were computed for the genu and splenium of the corpus callosum, two projection tracts, and five association tracts using probabilistic tractography. Results Overall, patients had lower fractional anisotropy and higher mean diffusivity compared to healthy volunteers across all tracts (while controlling for the effects of age and age²). In addition, there were greater age-associated increases in mean diffusivity in patients compared to healthy volunteers within the genu and splenium of the corpus callosum beginning in the second and third decades of life. Conclusions Our findings provide evidence for alterations in the typical pattern of white matter development in patients with bipolar disorder compared to healthy volunteers. Changes in white matter development within the corpus callosum may lead to altered inter-hemispheric communication that is considered integral to the neurobiology of the disorder. PMID:25532972

  14. Reasoning in Reference Games: Individual- vs. Population-Level Probabilistic Modeling

    PubMed Central

    Franke, Michael; Degen, Judith

    2016-01-01

    Recent advances in probabilistic pragmatics have achieved considerable success in modeling speakers’ and listeners’ pragmatic reasoning as probabilistic inference. However, these models are usually applied to population-level data, and so implicitly suggest a homogeneous population without individual differences. Here we investigate potential individual differences in Theory-of-Mind related depth of pragmatic reasoning in so-called reference games that require drawing ad hoc Quantity implicatures of varying complexity. We show by Bayesian model comparison that a model that assumes a heterogeneous population is a better predictor of our data, especially for comprehension. We discuss the implications for the treatment of individual differences in probabilistic models of language use. PMID:27149675

  15. Probabilistic dose-response modeling: case study using dichloromethane PBPK model results.

    PubMed

    Marino, Dale J; Starr, Thomas B

    2007-12-01

    A revised assessment of dichloromethane (DCM) has recently been reported that examines the influence of human genetic polymorphisms on cancer risks using deterministic PBPK and dose-response modeling in the mouse combined with probabilistic PBPK modeling in humans. This assessment utilized Bayesian techniques to optimize kinetic variables in mice and humans with mean values from posterior distributions used in the deterministic modeling in the mouse. To supplement this research, a case study was undertaken to examine the potential impact of probabilistic rather than deterministic PBPK and dose-response modeling in mice on subsequent unit risk factor (URF) determinations. Four separate PBPK cases were examined based on the exposure regimen of the NTP DCM bioassay. These were (a) Same Mouse (single draw of all PBPK inputs for both treatment groups); (b) Correlated BW-Same Inputs (single draw of all PBPK inputs for both treatment groups except for bodyweights (BWs), which were entered as correlated variables); (c) Correlated BW-Different Inputs (separate draws of all PBPK inputs for both treatment groups except that BWs were entered as correlated variables); and (d) Different Mouse (separate draws of all PBPK inputs for both treatment groups). Monte Carlo PBPK inputs reflect posterior distributions from Bayesian calibration in the mouse that had been previously reported. A minimum of 12,500 PBPK iterations were undertaken, in which dose metrics, i.e., mg DCM metabolized by the GST pathway/L tissue/day for lung and liver were determined. For dose-response modeling, these metrics were combined with NTP tumor incidence data that were randomly selected from binomial distributions. Resultant potency factors (0.1/ED(10)) were coupled with probabilistic PBPK modeling in humans that incorporated genetic polymorphisms to derive URFs. Results show that there was relatively little difference, i.e., <10% in central tendency and upper percentile URFs, regardless of the case

  16. Parameter estimation of social forces in pedestrian dynamics models via a probabilistic method.

    PubMed

    Corbetta, Alessandro; Muntean, Adrian; Vafayi, Kiamars

    2015-04-01

    Focusing on a specific crowd dynamics situation, including real life experiments and measurements, our paper targets a twofold aim: (1) we present a Bayesian probabilistic method to estimate the value and the uncertainty (in the form of a probability density function) of parameters in crowd dynamic models from the experimental data; and (2) we introduce a fitness measure for the models to classify a couple of model structures (forces) according to their fitness to the experimental data, preparing the stage for a more general model-selection and validation strategy inspired by probabilistic data analysis. Finally, we review the essential aspects of our experimental setup and measurement technique.

  17. Development of Simplified Probabilistic Risk Assessment Model for Seismic Initiating Event

    SciTech Connect

    S. Khericha; R. Buell; S. Sancaktar; M. Gonzalez; F. Ferrante

    2012-06-01

    This paper discusses a simplified method to evaluate seismic risk using a methodology built on dividing the seismic intensity spectrum into multiple discrete bins. The seismic probabilistic risk assessment model uses the Nuclear Regulatory Commission’s (NRC’s) full power Standardized Plant Analysis Risk (SPAR) model as the starting point for development. The seismic PRA models are integrated with their respective internal events at-power SPAR model. This is accomplished by combining the modified system fault trees from the full power SPAR model with seismic event tree logic. The peak ground acceleration is divided into five bins. The g-value for each bin is estimated as the geometric mean of the lower and upper values of that bin, and the associated frequency for each bin is estimated by taking the difference between the upper and lower values of that bin. Component fragilities are calculated for each bin using plant data, if available, or generic values of median peak ground acceleration and uncertainty for the components. For human reliability analysis (HRA), the SPAR HRA (SPAR-H) method is used, which requires the analysts to complete relatively straightforward worksheets that include the performance shaping factors (PSFs). The results are then used to estimate the human error probabilities (HEPs) of interest. This work is expected to improve the NRC’s ability to include seismic hazards in risk assessments for operational events in support of the reactor oversight program (e.g., significance determination process).
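
    The binning arithmetic described above can be sketched as follows; the hazard curve, bin boundaries, and fragility parameters are illustrative assumptions, not values from the SPAR models.

    ```python
    # Minimal sketch: discrete PGA bins, bin frequencies from a hazard curve,
    # and a lognormal component fragility evaluated at each bin's g-value.
    import numpy as np
    from scipy import stats

    bounds = np.array([0.1, 0.2, 0.4, 0.6, 0.8, 1.0])     # hypothetical bin bounds (g)

    def annual_exceedance(pga):
        """Assumed power-law seismic hazard curve (events/yr exceeding pga)."""
        return 1e-3 * (0.1 / pga) ** 2

    g_values = np.sqrt(bounds[:-1] * bounds[1:])          # geometric mean of bin bounds
    bin_freq = annual_exceedance(bounds[:-1]) - annual_exceedance(bounds[1:])

    def fragility(pga, median=0.7, beta=0.4):
        """Hypothetical lognormal fragility: P(component fails | PGA)."""
        return stats.norm.cdf(np.log(pga / median) / beta)

    # Annual frequency of seismically induced component failure, summed over bins
    failure_freq = np.sum(bin_freq * fragility(g_values))
    for g, f in zip(g_values, bin_freq):
        print(f"bin g = {g:.2f} g  freq = {f:.2e}/yr  P(fail|g) = {fragility(g):.3f}")
    print(f"annual component failure frequency ~ {failure_freq:.2e}/yr")
    ```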

  18. Probabilistic Fatigue Life Prediction of Turbine Disc Considering Model Parameter Uncertainty

    NASA Astrophysics Data System (ADS)

    He, Liping; Yu, Le; Zhu, Shun-Peng; Ding, Liangliang; Huang, Hong-Zhong

    2016-06-01

    Aiming to improve the predictive ability of the Walker model for fatigue life prediction, and taking the turbine disc alloy GH4133 as the application example, this paper investigates a new approach for probabilistic fatigue life prediction that accounts for the parameter uncertainty inherent in the life prediction model. First, experimental data are used to update the model parameters via Bayes' theorem, so as to obtain the posterior probability distribution functions of the two parameters of the Walker model and thereby a probabilistic life prediction model for the turbine disc. During the updating process, the Markov chain Monte Carlo (MCMC) technique is used to generate samples from the posterior distribution and to estimate the parameters. The turbine disc life is then predicted using the probabilistic Walker model with the Monte Carlo simulation technique. The experimental results indicate that: (1) using the small-sample test data obtained from the turbine disc, the parameter uncertainty of the Walker model can be quantified and the corresponding probabilistic model for fatigue life prediction can be established using Bayes' theorem; (2) there is considerable dispersion in the fatigue life data for the turbine disc when predicting fatigue life in practical engineering applications.
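
    The Bayesian-updating step can be sketched with a random-walk Metropolis sampler; the log-linear life model, prior bounds, and test data below are hypothetical stand-ins rather than the paper's Walker-model formulation.

    ```python
    # Minimal sketch: Metropolis sampling of two parameters (C, m) of a hypothetical
    # log-linear life model log10(N) = C - m*log10(S), then probabilistic life prediction.
    import numpy as np

    rng = np.random.default_rng(2)

    S = np.array([500.0, 450.0, 400.0, 350.0, 300.0])   # hypothetical stress levels (MPa)
    N = np.array([2.1e4, 5.5e4, 1.6e5, 5.0e5, 2.2e6])   # hypothetical observed lives (cycles)
    x, y = np.log10(S), np.log10(N)
    sigma = 0.15                                        # assumed log-life scatter

    def log_post(theta):
        C, m = theta
        if not (0.0 < m < 20.0):                        # weak prior bounds on m
            return -np.inf
        resid = y - (C - m * x)
        return -0.5 * np.sum((resid / sigma) ** 2)      # flat prior otherwise

    theta = np.array([15.0, 4.0])
    lp = log_post(theta)
    samples = []
    for _ in range(20_000):
        prop = theta + rng.normal(scale=[0.2, 0.1])
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp:        # Metropolis acceptance
            theta, lp = prop, lp_prop
        samples.append(theta)
    post = np.array(samples[5_000:])                    # drop burn-in
    print("posterior mean of (C, m):", post.mean(axis=0))

    # Probabilistic life prediction at S = 380 MPa by propagating posterior samples
    life = 10 ** (post[:, 0] - post[:, 1] * np.log10(380.0))
    print(f"median life: {np.median(life):.3g}  5th percentile: {np.quantile(life, 0.05):.3g}")
    ```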

  19. Conditional Reasoning in Context: A Dual-Source Model of Probabilistic Inference

    ERIC Educational Resources Information Center

    Klauer, Karl Christoph; Beller, Sieghard; Hutter, Mandy

    2010-01-01

    A dual-source model of probabilistic conditional inference is proposed. According to the model, inferences are based on 2 sources of evidence: logical form and prior knowledge. Logical form is a decontextualized source of evidence, whereas prior knowledge is activated by the contents of the conditional rule. In Experiments 1 to 3, manipulations of…

  20. Probabilistic risk models for multiple disturbances: an example of forest insects and wildfires

    Treesearch

    Haiganoush K. Preisler; Alan A. Ager; Jane L. Hayes

    2010-01-01

    Building probabilistic risk models for highly random forest disturbances like wildfire and forest insect outbreaks is challenging. Modeling the interactions among natural disturbances is even more difficult. In the case of wildfire and forest insects, we looked at the probability of a large fire given an insect outbreak and also the incidence of insect outbreaks...

  1. A PROBABILISTIC POPULATION EXPOSURE MODEL FOR PM10 AND PM 2.5

    EPA Science Inventory

    A first generation probabilistic population exposure model for Particulate Matter (PM), specifically for predicting PM10 and PM2.5 exposures of an urban population, has been developed. This model is intended to be used to predict exposure (magnitude, frequency, and duration) ...

  2. Decomposing biodiversity data using the Latent Dirichlet Allocation model, a probabilistic multivariate statistical method

    Treesearch

    Denis Valle; Benjamin Baiser; Christopher W. Woodall; Robin Chazdon; Jerome. Chave

    2014-01-01

    We propose a novel multivariate method to analyse biodiversity data based on the Latent Dirichlet Allocation (LDA) model. LDA, a probabilistic model, reduces assemblages to sets of distinct component communities. It produces easily interpretable results, can represent abrupt and gradual changes in composition, accommodates missing data and allows for coherent estimates...
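
    The decomposition idea can be sketched with an off-the-shelf LDA implementation, treating sites as documents and species counts as word counts; the abundance matrix below is synthetic and the choice of three component communities is arbitrary.

    ```python
    # Minimal sketch: decompose a site-by-species abundance matrix into component
    # communities with Latent Dirichlet Allocation (sites = documents, species = words).
    import numpy as np
    from sklearn.decomposition import LatentDirichletAllocation

    rng = np.random.default_rng(3)
    abundance = rng.poisson(lam=3.0, size=(30, 12))   # hypothetical counts: 30 sites x 12 species

    lda = LatentDirichletAllocation(n_components=3, random_state=0)
    site_mixtures = lda.fit_transform(abundance)      # per-site community proportions
    community_profiles = lda.components_ / lda.components_.sum(axis=1, keepdims=True)

    print("site 0 community proportions:", np.round(site_mixtures[0], 2))
    print("community 0 species profile:", np.round(community_profiles[0], 2))
    ```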

  3. Ultrasonic wave-based defect localization using probabilistic modeling

    NASA Astrophysics Data System (ADS)

    Todd, M. D.; Flynn, E. B.; Wilcox, P. D.; Drinkwater, B. W.; Croxford, A. J.; Kessler, S.

    2012-05-01

    This work presents a new approach rooted in maximum likelihood estimation for defect localization in sparse array guided wave ultrasonic interrogation applications. The approach constructs a minimally-informed statistical model of the guided wave process, where unknown or uncertain model parameters are assigned non-informative Bayesian prior distributions and integrated out of the a posteriori probability calculation. The premise of this localization approach is straightforward: the most likely defect location is the point on the structure with the maximum a posteriori probability of actually being the location of damage (i.e., the most probable location given a set of sensor measurements). The proposed approach is tested on a complex stiffened panel against other common localization approaches and found to have superior performance in all cases.
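
    The "most probable location" idea can be sketched as a grid search over candidate defect positions under a flat spatial prior and Gaussian timing noise; the sensor layout, wave speed, and noise level below are hypothetical, and the full method additionally marginalizes uncertain model parameters.

    ```python
    # Minimal sketch: pick the grid point whose predicted scatter-path arrival times
    # best explain the measurements (maximum a posteriori under a flat spatial prior).
    import numpy as np

    rng = np.random.default_rng(4)
    c = 5.0                                              # assumed group velocity, mm/us
    noise = 0.5                                          # assumed timing noise, us
    sensors = np.array([[0, 0], [500, 0], [500, 500], [0, 500]], float)   # mm

    def scatter_times(point):
        """Transmit at sensor i, scatter at `point`, receive at sensor j (i != j)."""
        d = np.linalg.norm(sensors - point, axis=1)
        return np.array([d[i] + d[j] for i in range(4) for j in range(4) if i != j]) / c

    true_defect = np.array([320.0, 180.0])
    measured = scatter_times(true_defect) + rng.normal(scale=noise, size=12)

    best, best_ll = None, -np.inf
    for x in np.arange(0.0, 501.0, 5.0):
        for y in np.arange(0.0, 501.0, 5.0):
            ll = -0.5 * np.sum((measured - scatter_times(np.array([x, y]))) ** 2) / noise**2
            if ll > best_ll:
                best, best_ll = (x, y), ll
    print("most probable defect location (mm):", best, "true:", tuple(true_defect))
    ```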

  4. Probabilistic inference in general graphical models through sampling in stochastic networks of spiking neurons.

    PubMed

    Pecevski, Dejan; Buesing, Lars; Maass, Wolfgang

    2011-12-01

    An important open problem of computational neuroscience is the generic organization of computations in networks of neurons in the brain. We show here through rigorous theoretical analysis that inherent stochastic features of spiking neurons, in combination with simple nonlinear computational operations in specific network motifs and dendritic arbors, enable networks of spiking neurons to carry out probabilistic inference through sampling in general graphical models. In particular, this enables them to carry out probabilistic inference in Bayesian networks with converging arrows ("explaining away") and with undirected loops, which occur in many real-world tasks. Ubiquitous stochastic features of networks of spiking neurons, such as trial-to-trial variability and spontaneous activity, are necessary ingredients of the underlying computational organization. We demonstrate through computer simulations that this approach can be scaled up to neural emulations of probabilistic inference in fairly large graphical models, yielding some of the most complex computations that have been carried out so far in networks of spiking neurons.

  5. A Probabilistic Risk Analysis (PRA) of Human Space Missions for the Advanced Integration Matrix (AIM)

    NASA Technical Reports Server (NTRS)

    Jones, Harry W.; Dillon-Merrill, Robin L.; Thomas, Gretchen A.

    2003-01-01

    The Advanced Integration Matrix (AIM) Project will study and solve systems-level integration issues for exploration missions beyond Low Earth Orbit (LEO), through the design and development of a ground-based facility for developing revolutionary integrated systems for joint human-robotic missions. This paper describes a Probabilistic Risk Analysis (PRA) of human space missions that was developed to help define the direction and priorities for AIM. Risk analysis is required for all major NASA programs and has been used for shuttle, station, and Mars lander programs. It is a prescribed part of early planning and is necessary during concept definition, even before mission scenarios and system designs exist. PRA can begin when little failure data are available, and be continually updated and refined as detail becomes available. PRA provides a basis for examining tradeoffs among safety, reliability, performance, and cost. The objective of AIM's PRA is to indicate how risk can be managed and future human space missions enabled by the AIM Project. Many critical events can cause injuries and fatalities to the crew without causing loss of vehicle or mission. Some critical systems are beyond AIM's scope, such as propulsion and guidance. Many failure-causing events can be mitigated by conducting operational tests in AIM, such as testing equipment and evaluating operational procedures, especially in the areas of communications and computers, autonomous operations, life support, thermal design, EVA and rover activities, physiological factors including habitation, medical equipment, and food, and multifunctional tools and repairable systems. AIM is well suited to test and demonstrate the habitat, life support, crew operations, and human interface. Because these account for significant crew, systems performance, and science risks, AIM will help reduce mission risk, and missions beyond LEO are far enough in the future that AIM can have significant impact.

  7. Monthly water balance modeling: Probabilistic, possibilistic and hybrid methods for model combination and ensemble simulation

    NASA Astrophysics Data System (ADS)

    Nasseri, M.; Zahraie, B.; Ajami, N. K.; Solomatine, D. P.

    2014-04-01

    Multi-model (ensemble, or committee) techniques have been shown to be an effective way to improve hydrological prediction performance and provide uncertainty information. This paper presents two novel multi-model ensemble techniques, one probabilistic, the Modified Bootstrap Ensemble Model (MBEM), and one possibilistic, the FUzzy C-means Ensemble based on data Pattern (FUCEP). The paper also explores utilization of the Ordinary Kriging (OK) method as a multi-model combination scheme for hydrological simulation/prediction. These techniques are compared against Bayesian Model Averaging (BMA) and Weighted Average (WA) methods to demonstrate their effectiveness. The techniques are applied to three monthly water balance models used to generate streamflow simulations for two mountainous basins in the south-west of Iran. For both basins, the results demonstrate that MBEM and FUCEP generate more skillful and reliable probabilistic predictions, outperforming all the other techniques. We also found that OK did not demonstrate any improved skill as a simple combination method over the WA scheme for either of the basins.
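
    A Weighted Average combination, the simplest of the schemes compared above, can be sketched as follows; the observations and member simulations are synthetic, and inverse-MSE weighting is one plausible choice rather than necessarily the paper's.

    ```python
    # Minimal sketch: Weighted Average (WA) combination of three synthetic monthly
    # streamflow simulations, with weights inversely proportional to calibration MSE.
    import numpy as np

    rng = np.random.default_rng(5)
    obs = 50 + 20 * np.sin(np.linspace(0, 6 * np.pi, 120)) + rng.normal(0, 5, 120)

    # Hypothetical simulations from three monthly water balance models (bias, noise)
    sims = np.stack([obs + rng.normal(b, s, 120) for b, s in [(5, 8), (-3, 6), (0, 12)]])

    mse = np.mean((sims - obs) ** 2, axis=1)
    weights = (1 / mse) / np.sum(1 / mse)                # inverse-MSE weights, sum to 1
    combined = weights @ sims

    print("weights:", np.round(weights, 2))
    print("member RMSE:", np.round(np.sqrt(mse), 1))
    print("WA combination RMSE:", round(float(np.sqrt(np.mean((combined - obs) ** 2))), 1))
    ```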

  8. Assessment of uncertainty in a probabilistic model of consumer exposure to pesticide residues in food.

    PubMed

    Ferrier, Helen; Shaw, George; Nieuwenhuijsen, Mark; Boobis, Alan; Elliott, Paul

    2006-06-01

    The assessment of consumer exposure to pesticides is an important part of pesticide regulation. Probabilistic modelling allows analysis of uncertainty and variability in risk assessments. The output of any assessment will be influenced by the characteristics and uncertainty of the inputs, the model structure and the assumptions. While the use of probabilistic models is well established in the United States, in Europe problems of low acceptance, sparse data and a lack of guidelines are slowing their adoption. The analyses in the current paper focused on the dietary pathway and the exposure of UK toddlers. Three single-food, single-pesticide case studies were used to parameterize a simple probabilistic model built in Crystal Ball. Data on dietary consumption patterns were extracted from National Diet and Nutrition Surveys, and levels of pesticide active ingredients in foods were collected from Pesticide Residues Committee monitoring. The effect of uncertainty on the exposure estimate was analysed using scenarios reflecting different assumptions related to sources of uncertainty. The most influential uncertainty issue was the distribution type used to represent input variables. Other sources that most affected model output were non-detects, unit-to-unit variability and processing. Specifying correlation between variables was found to have little effect on exposure estimates. The findings have important implications for how probabilistic modelling should be conducted, communicated and used by policy and decision makers as part of consumer risk assessment of pesticides.

  9. Probabilistic multiobject deformable model for MR/SPECT brain image registration and segmentation

    NASA Astrophysics Data System (ADS)

    Nikou, Christophoros; Heitz, Fabrice; Armspach, Jean-Paul

    1999-05-01

    A probabilistic deformable model for the representation of brain structures is described. The statistically learned deformable model represents the relative location of head (skull and scalp) and brain surfaces in MR/SPECT image pairs and accommodates the significant variability of these anatomical structures across different individuals. To provide a training set, a representative collection of 3D MRI volumes of different patients has first been registered to a reference image. The head and brain surfaces of each volume are parameterized by the amplitudes of the vibration modes of a deformable spherical mesh. For a given MR image in the training set, a vector containing the largest vibration modes describing the head and the brain is created. This random vector is statistically constrained by retaining the most significant variation modes of its Karhunen-Loeve expansion on the training population. By these means, both head and brain surfaces are deformed according to the anatomical variability observed in the training set. Two applications of the probabilistic deformable model are presented: the deformable model-based registration of 3D multimodal (MR/SPECT) brain images and the segmentation of the brain from MRI using the probabilistic constraints embedded in the deformable model. The multi-object deformable model may be considered a first step towards the development of a general-purpose probabilistic anatomical atlas of the brain.

  10. Probabilistic Models and Generative Neural Networks: Towards an Unified Framework for Modeling Normal and Impaired Neurocognitive Functions

    PubMed Central

    Testolin, Alberto; Zorzi, Marco

    2016-01-01

    Connectionist models can be characterized within the more general framework of probabilistic graphical models, which allow complex statistical distributions involving a large number of interacting variables to be described efficiently. This integration allows building more realistic computational models of cognitive functions, which more faithfully reflect the underlying neural mechanisms while at the same time providing a useful bridge to higher-level descriptions in terms of Bayesian computations. Here we discuss a powerful class of graphical models that can be implemented as stochastic, generative neural networks. These models overcome many limitations associated with classic connectionist models, for example by exploiting unsupervised learning in hierarchical architectures (deep networks) and by taking into account top-down, predictive processing supported by feedback loops. We review some recent cognitive models based on generative networks, and we point out promising research directions to investigate neuropsychological disorders within this approach. Though further efforts are required in order to fill the gap between structured Bayesian models and more realistic, biophysical models of neuronal dynamics, we argue that generative neural networks have the potential to bridge these levels of analysis, thereby improving our understanding of the neural bases of cognition and of pathologies caused by brain damage. PMID:27468262

  11. Probabilistic Models and Generative Neural Networks: Towards an Unified Framework for Modeling Normal and Impaired Neurocognitive Functions.

    PubMed

    Testolin, Alberto; Zorzi, Marco

    2016-01-01

    Connectionist models can be characterized within the more general framework of probabilistic graphical models, which allow complex statistical distributions involving a large number of interacting variables to be described efficiently. This integration allows building more realistic computational models of cognitive functions, which more faithfully reflect the underlying neural mechanisms while at the same time providing a useful bridge to higher-level descriptions in terms of Bayesian computations. Here we discuss a powerful class of graphical models that can be implemented as stochastic, generative neural networks. These models overcome many limitations associated with classic connectionist models, for example by exploiting unsupervised learning in hierarchical architectures (deep networks) and by taking into account top-down, predictive processing supported by feedback loops. We review some recent cognitive models based on generative networks, and we point out promising research directions to investigate neuropsychological disorders within this approach. Though further efforts are required in order to fill the gap between structured Bayesian models and more realistic, biophysical models of neuronal dynamics, we argue that generative neural networks have the potential to bridge these levels of analysis, thereby improving our understanding of the neural bases of cognition and of pathologies caused by brain damage.

  12. Developing probabilistic models to predict amphibian site occupancy in a patchy landscape

    Treesearch

    R. A. Knapp; K.R. Matthews; H. K. Preisler; R. Jellison

    2003-01-01

    Human-caused fragmentation of habitats is threatening an increasing number of animal and plant species, making an understanding of the factors influencing patch occupancy ever more important. The overall goal of the current study was to develop probabilistic models of patch occupancy for the mountain yellow-legged frog (Rana muscosa). This once-common species...

  13. Psychological Plausibility of the Theory of Probabilistic Mental Models and the Fast and Frugal Heuristics

    ERIC Educational Resources Information Center

    Dougherty, Michael R.; Franco-Watkins, Ana M.; Thomas, Rick

    2008-01-01

    The theory of probabilistic mental models (PMM; G. Gigerenzer, U. Hoffrage, & H. Kleinbolting, 1991) has had a major influence on the field of judgment and decision making, with the most recent important modifications to PMM theory being the identification of several fast and frugal heuristics (G. Gigerenzer & D. G. Goldstein, 1996). These…

  15. Probabilistic estimation of residential air exchange rates for population-based human exposure modeling

    EPA Science Inventory

    Residential air exchange rates (AERs) are a key determinant in the infiltration of ambient air pollution indoors. Population-based human exposure models using probabilistic approaches to estimate personal exposure to air pollutants have relied on input distributions from AER meas...

  17. From cyclone tracks to the costs of European winter storms: A probabilistic loss assessment model

    NASA Astrophysics Data System (ADS)

    Renggli, Dominik; Corti, Thierry; Reese, Stefan; Wueest, Marc; Viktor, Elisabeth; Zimmerli, Peter

    2014-05-01

    The quantitative assessment of the potential losses of European winter storms is essential for the economic viability of a global reinsurance company. For this purpose, reinsurance companies generally use probabilistic loss assessment models. This work presents an innovative approach to develop physically meaningful probabilistic events for Swiss Re's new European winter storm loss model. The meteorological hazard component of the new model is based on cyclone and windstorm tracks identified in the 20th Century Reanalysis data. The knowledge of the evolution of winter storms both in time and space allows the physically meaningful perturbation of properties of historical events (e.g. track, intensity). The perturbation includes a random element but also takes the local climatology and the evolution of the historical event into account. The low-resolution wind footprints taken from 20th Century Reanalysis are processed by a statistical-dynamical downscaling to generate high-resolution footprints of the historical and probabilistic winter storm events. Downscaling transfer functions are generated using ENSEMBLES regional climate model data. The result is a set of reliable probabilistic events representing thousands of years. The event set is then combined with country- and risk-specific vulnerability functions and detailed market- or client-specific exposure information to compute (re-)insurance risk premiums.

  18. A range of complex probabilistic models for RNA secondary structure prediction that includes the nearest-neighbor model and more

    PubMed Central

    Rivas, Elena; Lang, Raymond; Eddy, Sean R.

    2012-01-01

    The standard approach for single-sequence RNA secondary structure prediction uses a nearest-neighbor thermodynamic model with several thousand experimentally determined energy parameters. An attractive alternative is to use statistical approaches with parameters estimated from growing databases of structural RNAs. Good results have been reported for discriminative statistical methods using complex nearest-neighbor models, including CONTRAfold, Simfold, and ContextFold. Little work has been reported on generative probabilistic models (stochastic context-free grammars [SCFGs]) of comparable complexity, although probabilistic models are generally easier to train and to use. To explore a range of probabilistic models of increasing complexity, and to directly compare probabilistic, thermodynamic, and discriminative approaches, we created TORNADO, a computational tool that can parse a wide spectrum of RNA grammar architectures (including the standard nearest-neighbor model and more) using a generalized super-grammar that can be parameterized with probabilities, energies, or arbitrary scores. By using TORNADO, we find that probabilistic nearest-neighbor models perform comparably to (but not significantly better than) discriminative methods. We find that complex statistical models are prone to overfitting RNA structure and that evaluations should use structurally nonhomologous training and test data sets. Overfitting has affected at least one published method (ContextFold). The most important barrier to improving statistical approaches for RNA secondary structure prediction is the lack of diversity of well-curated single-sequence RNA secondary structures in current RNA databases. PMID:22194308

  19. A range of complex probabilistic models for RNA secondary structure prediction that includes the nearest-neighbor model and more.

    PubMed

    Rivas, Elena; Lang, Raymond; Eddy, Sean R

    2012-02-01

    The standard approach for single-sequence RNA secondary structure prediction uses a nearest-neighbor thermodynamic model with several thousand experimentally determined energy parameters. An attractive alternative is to use statistical approaches with parameters estimated from growing databases of structural RNAs. Good results have been reported for discriminative statistical methods using complex nearest-neighbor models, including CONTRAfold, Simfold, and ContextFold. Little work has been reported on generative probabilistic models (stochastic context-free grammars [SCFGs]) of comparable complexity, although probabilistic models are generally easier to train and to use. To explore a range of probabilistic models of increasing complexity, and to directly compare probabilistic, thermodynamic, and discriminative approaches, we created TORNADO, a computational tool that can parse a wide spectrum of RNA grammar architectures (including the standard nearest-neighbor model and more) using a generalized super-grammar that can be parameterized with probabilities, energies, or arbitrary scores. By using TORNADO, we find that probabilistic nearest-neighbor models perform comparably to (but not significantly better than) discriminative methods. We find that complex statistical models are prone to overfitting RNA structure and that evaluations should use structurally nonhomologous training and test data sets. Overfitting has affected at least one published method (ContextFold). The most important barrier to improving statistical approaches for RNA secondary structure prediction is the lack of diversity of well-curated single-sequence RNA secondary structures in current RNA databases.

  20. Plant Phenotyping using Probabilistic Topic Models: Uncovering the Hyperspectral Language of Plants.

    PubMed

    Wahabzada, Mirwaes; Mahlein, Anne-Katrin; Bauckhage, Christian; Steiner, Ulrike; Oerke, Erich-Christian; Kersting, Kristian

    2016-03-09

    Modern phenotyping and plant disease detection methods, based on optical sensors and information technology, provide promising approaches to plant research and precision farming. In particular, hyperspectral imaging has been found to reveal physiological and structural characteristics in plants and to allow for tracking physiological dynamics due to environmental effects. In this work, we present an approach to plant phenotyping that integrates non-invasive sensors, computer vision, as well as data mining techniques and allows for monitoring how plants respond to stress. To uncover latent hyperspectral characteristics of diseased plants reliably and in an easy-to-understand way, we "wordify" the hyperspectral images, i.e., we turn the images into a corpus of text documents. Then, we apply probabilistic topic models, a well-established natural language processing technique that identifies content and topics of documents. Based on recent regularized topic models, we demonstrate that one can automatically track the development of three foliar diseases of barley. We also present a visualization of the topics that provides plant scientists with an intuitive tool for hyperspectral imaging. In short, our analysis and visualization of characteristic topics found during symptom development and disease progress reveal the hyperspectral language of plant diseases.

  1. Probabilistic tsunami hazard analysis (PTHA) of Taiwan region by stochastic model

    NASA Astrophysics Data System (ADS)

    Sun, Y. S.; Chen, P. F.; Chen, C. C.

    2014-12-01

    We conduct a probabilistic tsunami hazard analysis (PTHA) of the Taiwan region for earthquake sources in the Ryukyu trench. The PTHA estimates the probability that a site is hit by tsunamis exceeding a given amplitude threshold. The probabilities are integrated over earthquakes of various magnitudes from potential fault zones in the Ryukyu trench. The annual frequencies of earthquakes in a fault zone are determined or extrapolated from the magnitude-frequency distribution of earthquakes (Gutenberg-Richter law) in the zone. Given the moment (or magnitude) of an earthquake, we first synthesize patterns of variably complex and heterogeneous slip distributions on the fault using a stochastic model, assuming that the slip and stress drop distributions are fractional Brownian motion processes described by a Hurst exponent. Following the ω-2 earthquake source model, slip distributions are obtained via the Fourier transform by randomizing the phase spectrum at wave numbers greater than the corner wave number kc. Finally, the vertical seafloor displacements induced by each slip distribution are used as input to COMCOT to simulate the tsunamis and assess the impacts on various coasts of Taiwan.
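
    The spectral-synthesis step can be sketched in one dimension: impose an omega-squared-like decay of the slip amplitude spectrum and randomize phases beyond the corner wave number; the fault length, corner wave number, and target mean slip below are hypothetical.

    ```python
    # Minimal sketch: 1-D heterogeneous slip profile with an omega-squared-like
    # spectral decay and random phases beyond the corner wave number k_c.
    import numpy as np

    rng = np.random.default_rng(6)

    L, n = 80e3, 256                          # hypothetical fault length (m), grid points
    k = np.fft.rfftfreq(n, d=L / n)           # wave numbers (1/m)
    k_c = 1.0 / 20e3                          # assumed corner wave number

    amp = 1.0 / (1.0 + (k / k_c) ** 2)        # spectral fall-off of slip amplitude
    phase = np.where(k > k_c, rng.uniform(0, 2 * np.pi, k.size), 0.0)
    slip = np.fft.irfft(amp * np.exp(1j * phase), n)

    slip -= slip.min()                        # keep slip non-negative
    slip *= 2.0 / slip.mean()                 # rescale to an assumed 2 m mean slip
    print("mean slip:", round(slip.mean(), 2), "m   max slip:", round(slip.max(), 2), "m")
    ```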

  2. Plant Phenotyping using Probabilistic Topic Models: Uncovering the Hyperspectral Language of Plants

    PubMed Central

    Wahabzada, Mirwaes; Mahlein, Anne-Katrin; Bauckhage, Christian; Steiner, Ulrike; Oerke, Erich-Christian; Kersting, Kristian

    2016-01-01

    Modern phenotyping and plant disease detection methods, based on optical sensors and information technology, provide promising approaches to plant research and precision farming. In particular, hyperspectral imaging has been found to reveal physiological and structural characteristics in plants and to allow for tracking physiological dynamics due to environmental effects. In this work, we present an approach to plant phenotyping that integrates non-invasive sensors, computer vision, as well as data mining techniques and allows for monitoring how plants respond to stress. To uncover latent hyperspectral characteristics of diseased plants reliably and in an easy-to-understand way, we “wordify” the hyperspectral images, i.e., we turn the images into a corpus of text documents. Then, we apply probabilistic topic models, a well-established natural language processing technique that identifies content and topics of documents. Based on recent regularized topic models, we demonstrate that one can automatically track the development of three foliar diseases of barley. We also present a visualization of the topics that provides plant scientists with an intuitive tool for hyperspectral imaging. In short, our analysis and visualization of characteristic topics found during symptom development and disease progress reveal the hyperspectral language of plant diseases. PMID:26957018

  3. Performance of the multi-model SREPS precipitation probabilistic forecast over Mediterranean area

    NASA Astrophysics Data System (ADS)

    Callado, A.; Escribà, P.; Santos, C.; Santos-Muñoz, D.; Simarro, J.; García-Moya, J. A.

    2009-09-01

    The performance of the Short-Range Ensemble Prediction System (SREPS) probabilistic precipitation forecast over the Mediterranean area has been evaluated by comparison with both an Atlantic-European area excluding the Mediterranean and a more general area including both. The main aim is to assess whether the performance of the system, given its meso-alpha horizontal resolution of 25 kilometres, is degraded over the Mediterranean area, where mesoscale meteorological events play a more important role than in an Atlantic-European area that is more related to the synoptic scale under Atlantic influence. Furthermore, two different verification methods have been applied and compared for the three areas in order to assess the performance. The SREPS is a daily experimental LAM EPS focused on the short range (up to 72 hours), developed at the Spanish Meteorological Agency (AEMET). To take model errors into account implicitly, five independent limited area models are used (COSMO (COSMO), HIRLAM (HIRLAM Consortium), HRM (DWD), MM5 (NOAA) and UM-NAE (UKMO)), and in order to sample the initial and boundary condition uncertainties each model is integrated using data from four different global deterministic models (GFS (NCEP), GME (DWD), IFS (ECMWF) and UM (UKMO)). As a result of crossing models and initial conditions, the EPS comprises 20 members. The underlying idea is that ensemble performance improves as each member performs as well as possible, i.e. the best operational configurations of the limited area models are combined with the best global deterministic model configurations initialized with the best analyses. For this reason, neither global EPS members as initial conditions nor varied model settings such as multi-parameterizations or multi-parameters are used to generate SREPS. The performance over the three areas has been assessed focusing on 24-hour accumulated precipitation with four different usual

  4. Speeded Classification in a Probabilistic Category Structure: Contrasting Exemplar-Retrieval, Decision-Boundary, and Prototype Models

    ERIC Educational Resources Information Center

    Nosofsky, Robert M.; Stanton, Roger D.

    2005-01-01

    Speeded perceptual classification experiments were conducted to distinguish among the predictions of exemplar-retrieval, decision-boundary, and prototype models. The key manipulation was that across conditions, individual stimuli received either probabilistic or deterministic category feedback. Regardless of the probabilistic feedback, however, an…

  5. Probabilistic choice models in health-state valuation research: background, theories, assumptions and applications.

    PubMed

    Arons, Alexander M M; Krabbe, Paul F M

    2013-02-01

    Interest is rising in measuring subjective health outcomes, such as treatment outcomes that are not directly quantifiable (functional disability, symptoms, complaints, side effects and health-related quality of life). Health economists in particular have applied probabilistic choice models in the area of health evaluation. They increasingly use discrete choice models based on random utility theory to derive values for healthcare goods or services. Recent attempts have been made to use discrete choice models as an alternative method to derive values for health states. In this article, various probabilistic choice models are described according to their underlying theory. A historical overview traces their development and applications in diverse fields. The discussion highlights some theoretical and technical aspects of the choice models and their similarity and dissimilarity. The objective of the article is to elucidate the position of each model and their applications for health-state valuation.
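
    The random-utility logic behind these discrete choice models can be sketched with a conditional logit: the probability of choosing a health state is a softmax of its systematic utility; the attribute coding and weights below are illustrative only, not from the article.

    ```python
    # Minimal sketch: conditional logit choice probabilities for three health states,
    # with systematic utility linear in illustrative attribute levels.
    import numpy as np

    # Attributes (mobility, pain, anxiety) coded 1 (no problems) to 3 (severe problems)
    states = np.array([[1, 2, 1],
                       [2, 1, 2],
                       [3, 3, 1]])
    beta = np.array([-0.8, -1.1, -0.5])               # assumed attribute weights

    utility = states @ beta
    prob = np.exp(utility) / np.exp(utility).sum()    # random-utility (logit) choice probabilities
    for s, p in zip(states, prob):
        print(f"state {s.tolist()}: choice probability {p:.2f}")
    ```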

  6. A probabilistic modeling approach to assess human inhalation exposure risks to airborne aflatoxin B 1 (AFB 1)

    NASA Astrophysics Data System (ADS)

    Liao, Chung-Min; Chen, Szu-Chieh

    To assess human lung exposure to airborne aflatoxin B1 (AFB1) during on-farm activities, including swine feeding, storage bin cleaning, corn harvest, and grain elevator loading/unloading, we present a probabilistic risk model appraised with empirical data. The model integrates probabilistic exposure profiles from a compartmental lung model with reconstructed dose-response relationships based on an empirical three-parameter Hill equation model describing AFB1 cytotoxicity (inhibition response) in human bronchial epithelial cells, to quantitatively estimate inhalation exposure risks. The risk assessment results indicate that exposure to airborne AFB1 poses negligible risk for corn harvest and grain elevator loading/unloading activities, yet a relatively high risk for swine feeding and storage bin cleaning. Applying a joint probability function method based on exceedance profiles, we estimate an alarmingly high potential risk during swine feeding activity for the bronchial region (inhibition = 56.69%, 95% confidence interval (CI): 35.05-72.87%) and the bronchiolar region (inhibition = 44.93%, 95% CI: 21.61-66.78%). The parameterized predictive model is intended to support a risk-management framework for discussing carcinogenic risk in occupational settings where inhalation of AFB1-contaminated dust occurs.
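
    Coupling an exposure distribution to a three-parameter Hill dose-response can be sketched as follows; the Hill parameters and the internal-dose distribution are placeholders, not the fitted values from the study.

    ```python
    # Minimal sketch: propagate a lognormal internal-dose distribution through a
    # three-parameter Hill dose-response to obtain an inhibition distribution.
    import numpy as np

    rng = np.random.default_rng(7)

    def hill(dose, emax=100.0, ec50=5.0, n=1.5):
        """Percent inhibition as a function of internal dose (Hill equation)."""
        return emax * dose ** n / (ec50 ** n + dose ** n)

    dose = rng.lognormal(mean=np.log(4.0), sigma=0.6, size=100_000)   # hypothetical exposure
    inhibition = hill(dose)

    print("median inhibition (%):", round(float(np.median(inhibition)), 1))
    print("95% interval (%):", np.round(np.percentile(inhibition, [2.5, 97.5]), 1))
    print("P(inhibition > 50%):", round(float(np.mean(inhibition > 50)), 2))
    ```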

  7. Probabilistic forecasts of debris-flow hazard at the regional scale with a combination of models.

    NASA Astrophysics Data System (ADS)

    Malet, Jean-Philippe; Remaître, Alexandre

    2015-04-01

    Debris flows are one of the many active slope-forming processes in the French Alps, where rugged and steep slopes mantled by various slope deposits offer great potential for triggering hazardous events. A quantitative assessment of debris-flow hazard requires the estimation, in a probabilistic framework, of the spatial probability of occurrence of source areas, the spatial probability of runout areas, the temporal frequency of events, and their intensity. The main objective of this research is to propose a pipeline for the estimation of these quantities at the regional scale using a chain of debris-flow models. The work uses the experimental site of the Barcelonnette Basin (South French Alps), where 26 active torrents have produced more than 150 debris-flow events since 1850, to develop and validate the methodology. First, a susceptibility assessment is performed to identify the debris-flow prone source areas. The most frequently used approach is the combination of environmental factors with GIS procedures and statistical techniques, integrating detailed event inventories or not. Based on a 5m-DEM and derivatives, and information on slope lithology, engineering soils and landcover, the possible source areas are identified with a statistical logistic regression model. The performance of the statistical model is evaluated against the observed distribution of debris-flow events recorded after 1850 in the study area. The source areas in the three most active torrents (Riou-Bourdoux, Faucon, Sanières) are well identified by the model. Results are less convincing for three other active torrents (Bourget, La Valette and Riou-Chanal); this could be related to the type of debris-flow triggering mechanism, as the model seems better at spotting open-slope debris-flow source areas (e.g. scree slopes) but appears less efficient at identifying landslide-induced debris flows. Second, a susceptibility assessment is performed to estimate the possible runout distance
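
    The susceptibility step can be sketched with an ordinary logistic regression on terrain attributes; the predictor variables, synthetic inventory, and coefficients below are hypothetical stand-ins, not the study's 5m-DEM derivatives.

    ```python
    # Minimal sketch: logistic-regression susceptibility of raster cells to being
    # debris-flow source areas, from synthetic terrain attributes and inventory.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(8)
    n = 2000

    slope = rng.uniform(5, 45, n)                        # hypothetical slope (degrees)
    scree = rng.integers(0, 2, n)                        # hypothetical scree cover (0/1)
    log_area = np.log(rng.lognormal(-1.0, 0.8, n))       # hypothetical log drainage area
    X = np.column_stack([slope, scree, log_area])

    # Synthetic inventory: sources more likely on steep, scree-covered cells
    logit = -8 + 0.18 * slope + 1.2 * scree + 0.3 * log_area
    y = rng.uniform(size=n) < 1 / (1 + np.exp(-logit))

    model = LogisticRegression(max_iter=1000).fit(X, y)
    susceptibility = model.predict_proba(X)[:, 1]        # P(cell is a source area)
    print("fitted coefficients:", np.round(model.coef_[0], 2))
    print("cells with P(source) > 0.5:", int((susceptibility > 0.5).sum()))
    ```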

  8. Spacecraft technology portfolio: Probabilistic modeling and implications for responsiveness and schedule slippage

    NASA Astrophysics Data System (ADS)

    Dubos, Gregory F.; Saleh, Joseph H.

    2011-04-01

    Addressing the challenges of Responsive Space and mitigating the risk of schedule slippage in space programs require a thorough understanding of the various factors driving the development schedule of a space system. The present work contributes theoretical and practical results in this direction. A spacecraft is here conceived of as a technology portfolio. The characteristics of this portfolio are defined as its size (e.g., number of instruments), the technology maturity of each instrument and the resulting Technology Readiness Level (TRL) heterogeneity, and their effects on the delivery schedule of a spacecraft are investigated. Following a brief overview of the concept of R&D portfolio and its relevance to spacecraft design, a probabilistic model of the Time-to-Delivery of a spacecraft is formulated, which includes the development, Integration and Testing, and Shipping phases. The Mean-Time-To-Delivery (MTTD) of the spacecraft is quantified based on the portfolio characteristics, and it is shown that the MTTD of the spacecraft and its schedule risk are significantly impacted by decreasing TRL and increasing portfolio size. Finally, the utility implications of varying the portfolio characteristics are investigated, and "portfolio maps" are provided as guides to help system designers identify appropriate portfolio characteristics when operating in a calendar-based design environment (which is the paradigm shift that space responsiveness introduces).

  9. Statistical and Probabilistic Extensions to Ground Operations' Discrete Event Simulation Modeling

    NASA Technical Reports Server (NTRS)

    Trocine, Linda; Cummings, Nicholas H.; Bazzana, Ashley M.; Rychlik, Nathan; LeCroy, Kenneth L.; Cates, Grant R.

    2010-01-01

    NASA's human exploration initiatives will invest in technologies, public/private partnerships, and infrastructure, paving the way for the expansion of human civilization into the solar system and beyond. As it has been for the past half century, the Kennedy Space Center will be the embarkation point for humankind's journey into the cosmos. Functioning as a next generation space launch complex, Kennedy's launch pads, integration facilities, processing areas, launch and recovery ranges will bustle with the activities of the world's space transportation providers. In developing this complex, KSC teams work through the potential operational scenarios: conducting trade studies, planning and budgeting for expensive and limited resources, and simulating alternative operational schemes. Numerous tools, among them discrete event simulation (DES), were matured during the Constellation Program to conduct such analyses with the purpose of optimizing the launch complex for maximum efficiency, safety, and flexibility while minimizing life cycle costs. Discrete event simulation is a computer-based modeling technique for complex and dynamic systems where the state of the system changes at discrete points in time and whose inputs may include random variables. DES is used to assess timelines and throughput, and to support operability studies and contingency analyses. It is applicable to any space launch campaign and informs decision-makers of the effects of varying numbers of expensive resources and the impact of off-nominal scenarios on measures of performance. In order to develop representative DES models, methods were adopted, exploited, or created to extend traditional uses of DES. The Delphi method was adopted and utilized for task duration estimation. DES software was exploited for probabilistic event variation. A roll-up process, developed to reuse models and model elements in other less-detailed models, was used. The DES team continues to innovate and expand

  10. Hierarchical minimax entropy modeling and probabilistic principal component visualization for data exploration

    NASA Astrophysics Data System (ADS)

    Wang, Yue J.; Luo, Lan; Li, Haifeng; Freedman, Matthew T.

    1999-05-01

    As a step toward understanding the complex information from data and relationships, structural and discriminative knowledge reveals insight that may prove useful in data interpretation and exploration. This paper reports the development of an automated and intelligent procedure for generating the hierarchy of minimax entropy models and principal component visualization spaces for improved data explanation. The proposed hierarchical minimax entropy modeling and probabilistic principal component projection are both statistically principled and visually effective at revealing all of the interesting aspects of the data set. The methods involve multiple uses of standard finite normal mixture models and probabilistic principal component projections. The strategy is that the top-level model and projection should explain the entire data set, best revealing the presence of clusters and relationships, while lower-level models and projections should display internal structure within individual clusters, such as the presence of subclusters and attribute trends, which might not be apparent in the higher-level models and projections. With many complementary mixture models and visualization projections, each level will be relatively simple while the complete hierarchy maintains overall flexibility yet still conveys considerable structural information. In particular, a model identification procedure is developed to select the optimal number and kernel shapes of local clusters from a class of data, resulting in a standard finite normal mixture with minimum conditional bias and variance, and a probabilistic principal component neural network is advanced to generate optimal projections, leading to a hierarchical visualization algorithm allowing the complete data set to be analyzed at the top level, with best separated subclusters of data points analyzed at deeper levels. Hierarchical probabilistic principal component visualization involves (1) evaluation of posterior probabilities for

  11. An analytical probabilistic model of the quality efficiency of a sewer tank

    NASA Astrophysics Data System (ADS)

    Balistrocchi, Matteo; Grossi, Giovanna; Bacchi, Baldassare

    2009-12-01

    The assessment of the efficiency of a storm water storage facility devoted to sewer overflow control in urban areas strictly depends on the ability to model the main features of the rainfall-runoff routing process and the related wet weather pollution delivery. In this paper, the feasibility of applying the analytical probabilistic approach to develop a tank design method, whose potential is comparable to that of continuous simulation, is demonstrated. Water quality aspects of such devices were incorporated in the model derivation. The formulation is based on a Weibull probabilistic model of the main characteristics of the rainfall process and on a power law describing the relationship between the dimensionless storm water cumulative runoff volume and the dimensionless cumulative pollutograph. Following this approach, efficiency indexes were established. The proposed model was verified by comparing its results to those obtained by continuous simulation; satisfactory agreement is shown for the proposed efficiency indexes.

  12. A note on probabilistic models over strings: the linear algebra approach.

    PubMed

    Bouchard-Côté, Alexandre

    2013-12-01

    Probabilistic models over strings have played a key role in developing methods that take into consideration indels as phylogenetically informative events. There is an extensive literature on using automata and transducers on phylogenies to do inference on these probabilistic models, in which an important theoretical question is the complexity of computing the normalization of a class of string-valued graphical models. This question has been investigated using tools from combinatorics, dynamic programming, and graph theory, and has practical applications in Bayesian phylogenetics. In this work, we revisit this theoretical question from a different point of view, based on linear algebra. The main contribution is a set of results based on this linear algebra view that facilitate the analysis and design of inference algorithms on string-valued graphical models. As an illustration, we use this method to give a new elementary proof of a known result on the complexity of inference on the "TKF91" model, a well-known probabilistic model over strings. Compared to previous work, our proof method is easier to extend to other models, since it relies on a novel weak condition, triangular transducers, which is easy to establish in practice. The linear algebra view provides a concise way of describing transducer algorithms and their compositions, opens the possibility of transferring fast linear algebra libraries (for example, based on GPUs), as well as low rank matrix approximation methods, to string-valued inference problems.
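
    A toy illustration of the linear algebra view, assuming numpy: for a small weighted automaton (hypothetical, not the TKF91 model), the normalization over all strings is the geometric series of the summed per-symbol transition matrix and can be computed as s^T (I - T)^(-1) f whenever the spectral radius of T is below one.

    # Toy illustration of the linear-algebra view of normalization for a string-valued model.
    # For a weighted automaton with per-symbol transition matrices, the total weight over all
    # strings is sum_k s^T T^k f = s^T (I - T)^(-1) f, provided the spectral radius of T < 1.
    # The automaton below is hypothetical; it is not the TKF91 model.
    import numpy as np

    s = np.array([1.0, 0.0])                  # start weights over 2 states
    f = np.array([0.2, 0.5])                  # stopping (accept) weights
    T_a = np.array([[0.3, 0.1], [0.0, 0.2]])  # weights for emitting symbol 'a'
    T_b = np.array([[0.1, 0.2], [0.1, 0.1]])  # weights for emitting symbol 'b'
    T = T_a + T_b                             # summed over the alphabet

    assert max(abs(np.linalg.eigvals(T))) < 1.0
    Z = s @ np.linalg.solve(np.eye(2) - T, f)   # normalization constant
    print("normalization Z =", Z)

    # Sanity check: truncate the series over string lengths 0..30.
    Z_trunc = sum(s @ np.linalg.matrix_power(T, k) @ f for k in range(31))
    print("truncated series:", Z_trunc)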

  13. Probabilistic models for assessment of extreme temperatures and relative humidity in Lithuania

    NASA Astrophysics Data System (ADS)

    Alzbutas, Robertas; Šeputytė, Ilona

    2015-04-01

    Extreme temperatures are a fairly common natural phenomenon in Lithuania. They have mainly negative effects on both the environment and humans. Thus it is important to perform probabilistic and statistical analyses of possible extreme temperature values and their time-dependent changes. This is especially important in areas where technical objects (sensitive to extreme temperatures) are foreseen to be constructed. In order to estimate the frequencies and consequences of possible extreme temperatures, a probabilistic analysis of the event occurrence and its uncertainty has been performed: statistical data have been collected and analyzed. The probabilistic analysis of extreme temperatures in Lithuanian territory is based on historical data taken from the Lithuanian Hydrometeorology Service, Dūkštas Meteorological Station, the Lithuanian Energy Institute and the Ignalina NPP Environmental Protection Department of Environmental Monitoring Service. The main objective of the work performed was the probabilistic assessment of the occurrence and impact of extreme temperature and relative humidity in the whole of Lithuania and specifically in the Dūkštas region, where the Ignalina Nuclear Power Plant is closed for decommissioning. In addition, the other purpose of this work was to analyze the changes in extreme temperatures. The probabilistic analysis of extreme temperature increases in Lithuanian territory was based on more than 50 years of historical data. The probabilistic assessment was focused on the application and comparison of the Gumbel, Weibull and Generalized Extreme Value (GEV) distributions, enabling selection of the distribution that best fits the extreme temperature data. In order to assess the likelihood of extreme temperatures, different probabilistic models were applied to evaluate the probability of exceedance of different extreme temperatures. According to the statistics and the relationship between return period and probabilities of temperatures the return period for 30
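
    A brief sketch, assuming scipy and a synthetic series of annual maximum temperatures, of fitting Gumbel and GEV distributions and reading off a return level (the quantile at non-exceedance probability 1 - 1/T for return period T).

    # Sketch: fit Gumbel and GEV distributions to annual maximum temperatures and compute
    # a return level. The temperature series below is synthetic, used only for illustration.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    annual_max = 30 + stats.gumbel_r.rvs(loc=0, scale=2, size=55, random_state=rng)  # deg C

    for name, dist in (("Gumbel", stats.gumbel_r), ("GEV", stats.genextreme)):
        params = dist.fit(annual_max)
        # Return level for a T-year return period: quantile at non-exceedance prob 1 - 1/T.
        T = 100
        level = dist.ppf(1 - 1 / T, *params)
        # Goodness of fit via the Kolmogorov-Smirnov statistic.
        ks = stats.kstest(annual_max, dist.cdf, args=params).statistic
        print(f"{name}: 100-year return level = {level:.1f} C, KS = {ks:.3f}")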

  14. Modeling, Analysis, and Control of Swarming Agents in a Probabilistic Framework

    DTIC Science & Technology

    2012-11-01

    This research effort focuses on the development of a unified, systematic, and formal approach to modeling and control of multi-agent systems, drawing inspiration from

  15. Modeling and analysis of cell membrane systems with probabilistic model checking

    PubMed Central

    2011-01-01

    Background Recently there has been a growing interest in the application of Probabilistic Model Checking (PMC) for the formal specification of biological systems. PMC is able to exhaustively explore all states of a stochastic model and can provide valuable insights into its behavior that are more difficult to obtain using only traditional methods for system analysis such as deterministic and stochastic simulation. In this work we propose a stochastic model for the description and analysis of the sodium-potassium exchange pump. The sodium-potassium pump is a membrane transport system present in all animal cells and capable of moving sodium and potassium ions against their concentration gradient. Results We present a quantitative formal specification of the pump mechanism in the PRISM language, taking into consideration a discrete chemistry approach and the Law of Mass Action. We also present an analysis of the system using quantitative properties in order to verify the pump's reversibility and understand the pump's behavior using trend labels for the transition rates of the pump reactions. Conclusions Probabilistic model checking can be used along with other well established approaches such as simulation and differential equations to better understand pump behavior. Using PMC we can determine whether specific events happen, such as whether the potassium outside the cell runs out in all model traces. We can also gain a more detailed perspective on its behavior, such as determining its reversibility and why its normal operation becomes slow over time. This knowledge can be used to direct experimental research and make it more efficient, leading to faster and more accurate scientific discoveries. PMID:22369714
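
    The PRISM specification itself is not reproduced here; as a hedged, generic illustration of the discrete-chemistry, mass-action view of such a pump, the sketch below runs a Gillespie stochastic simulation of a toy reversible ion exchange (species, rates and counts are assumed, not the paper's).

    # Hedged illustration: a Gillespie stochastic simulation of a toy reversible ion exchange
    # under mass-action kinetics. This is NOT the authors' PRISM specification of the
    # Na+/K+ pump; species, rates, and counts below are hypothetical.
    import random

    def gillespie(na_in, k_out, k_fwd=0.002, k_rev=0.001, t_end=50.0, seed=0):
        """Toy exchange: Na_in + K_out <-> Na_out + K_in."""
        rng = random.Random(seed)
        t, na_out, k_in = 0.0, 0, 0
        while t < t_end:
            a1 = k_fwd * na_in * k_out        # forward propensity
            a2 = k_rev * na_out * k_in        # reverse propensity
            a0 = a1 + a2
            if a0 == 0:
                break
            t += rng.expovariate(a0)          # time to next reaction
            if rng.random() < a1 / a0:        # forward event
                na_in -= 1; k_out -= 1; na_out += 1; k_in += 1
            else:                             # reverse event
                na_in += 1; k_out += 1; na_out -= 1; k_in -= 1
        return na_in, k_out, na_out, k_in

    print(gillespie(na_in=150, k_out=100))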

  16. Probabilistic material degradation model for aerospace materials subjected to high temperature, mechanical and thermal fatigue, and creep

    NASA Astrophysics Data System (ADS)

    Boyce, L.

    1992-07-01

    A probabilistic general material strength degradation model has been developed for structural components of aerospace propulsion systems subjected to diverse random effects. The model has been implemented in two FORTRAN programs, PROMISS (Probabilistic Material Strength Simulator) and PROMISC (Probabilistic Material Strength Calibrator). PROMISS calculates the random lifetime strength of an aerospace propulsion component due to as many as eighteen diverse random effects. Results are presented in the form of probability density functions and cumulative distribution functions of lifetime strength. PROMISC calibrates the model by calculating the values of empirical material constants.

  17. Probabilistic material degradation model for aerospace materials subjected to high temperature, mechanical and thermal fatigue, and creep

    NASA Technical Reports Server (NTRS)

    Boyce, L.

    1992-01-01

    A probabilistic general material strength degradation model has been developed for structural components of aerospace propulsion systems subjected to diverse random effects. The model has been implemented in two FORTRAN programs, PROMISS (Probabilistic Material Strength Simulator) and PROMISC (Probabilistic Material Strength Calibrator). PROMISS calculates the random lifetime strength of an aerospace propulsion component due to as many as eighteen diverse random effects. Results are presented in the form of probability density functions and cumulative distribution functions of lifetime strength. PROMISC calibrates the model by calculating the values of empirical material constants.

  18. Probabilistic Inference: Task Dependency and Individual Differences of Probability Weighting Revealed by Hierarchical Bayesian Modeling

    PubMed Central

    Boos, Moritz; Seer, Caroline; Lange, Florian; Kopp, Bruno

    2016-01-01

    Cognitive determinants of probabilistic inference were examined using hierarchical Bayesian modeling techniques. A classic urn-ball paradigm served as experimental strategy, involving a factorial two (prior probabilities) by two (likelihoods) design. Five computational models of cognitive processes were compared with the observed behavior. Parameter-free Bayesian posterior probabilities and parameter-free base rate neglect provided inadequate models of probabilistic inference. The introduction of distorted subjective probabilities yielded more robust and generalizable results. A general class of (inverted) S-shaped probability weighting functions had been proposed; however, the possibility of large differences in probability distortions not only across experimental conditions, but also across individuals, seems critical for the model's success. It also seems advantageous to consider individual differences in parameters of probability weighting as being sampled from weakly informative prior distributions of individual parameter values. Thus, the results from hierarchical Bayesian modeling converge with previous results in revealing that probability weighting parameters show considerable task dependency and individual differences. Methodologically, this work exemplifies the usefulness of hierarchical Bayesian modeling techniques for cognitive psychology. Theoretically, human probabilistic inference might be best described as the application of individualized strategic policies for Bayesian belief revision. PMID:27303323

  19. Probabilistic Inference: Task Dependency and Individual Differences of Probability Weighting Revealed by Hierarchical Bayesian Modeling.

    PubMed

    Boos, Moritz; Seer, Caroline; Lange, Florian; Kopp, Bruno

    2016-01-01

    Cognitive determinants of probabilistic inference were examined using hierarchical Bayesian modeling techniques. A classic urn-ball paradigm served as experimental strategy, involving a factorial two (prior probabilities) by two (likelihoods) design. Five computational models of cognitive processes were compared with the observed behavior. Parameter-free Bayesian posterior probabilities and parameter-free base rate neglect provided inadequate models of probabilistic inference. The introduction of distorted subjective probabilities yielded more robust and generalizable results. A general class of (inverted) S-shaped probability weighting functions had been proposed; however, the possibility of large differences in probability distortions not only across experimental conditions, but also across individuals, seems critical for the model's success. It also seems advantageous to consider individual differences in parameters of probability weighting as being sampled from weakly informative prior distributions of individual parameter values. Thus, the results from hierarchical Bayesian modeling converge with previous results in revealing that probability weighting parameters show considerable task dependency and individual differences. Methodologically, this work exemplifies the usefulness of hierarchical Bayesian modeling techniques for cognitive psychology. Theoretically, human probabilistic inference might be best described as the application of individualized strategic policies for Bayesian belief revision.
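
    A small sketch of a distorted-probability inference step of the kind studied here, assuming the Tversky-Kahneman (inverted) S-shaped weighting function as one common choice (the paper's hierarchical Bayesian parameterization is not reproduced): prior and likelihood probabilities are passed through w(p) before Bayesian updating in a two-urn task.

    # Sketch: an (inverted) S-shaped probability weighting function applied to a two-urn
    # Bayesian inference task. The Tversky-Kahneman form used here is one common choice;
    # the paper's hierarchical Bayesian parameterization is not reproduced.
    def weight(p, gamma):
        """Tversky-Kahneman probability weighting: w(p) = p^g / (p^g + (1-p)^g)^(1/g)."""
        return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

    def posterior_urn1(prior1, lik1, lik2, gamma=1.0):
        """Posterior for urn 1 after one draw, with distorted prior and likelihoods."""
        num = weight(prior1, gamma) * weight(lik1, gamma)
        den = num + weight(1 - prior1, gamma) * weight(lik2, gamma)
        return num / den

    # Prior P(urn1)=0.7; the observed ball has likelihood 0.6 under urn 1 and 0.3 under urn 2.
    print("undistorted posterior: ", posterior_urn1(0.7, 0.6, 0.3, gamma=1.0))
    print("distorted (gamma=0.6): ", posterior_urn1(0.7, 0.6, 0.3, gamma=0.6))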

  20. Probabilistic modelling of human exposure to intense sweeteners in Italian teenagers: validation and sensitivity analysis of a probabilistic model including indicators of market share and brand loyalty.

    PubMed

    Arcella, D; Soggiu, M E; Leclercq, C

    2003-10-01

    For the assessment of exposure to food-borne chemicals, the most commonly used methods in the European Union follow a deterministic approach based on conservative assumptions. Over the past few years, to get a more realistic view of exposure to food chemicals, risk managers have become increasingly interested in the probabilistic approach. Within the EU-funded 'Monte Carlo' project, a stochastic model of exposure to chemical substances from the diet and a computer software program were developed. The aim of this paper was to validate the model with respect to the intake of saccharin from table-top sweeteners and cyclamate from soft drinks by Italian teenagers with the use of the software, and to evaluate the impact of the inclusion/exclusion of indicators of market share and brand loyalty through a sensitivity analysis. Data on food consumption and the concentration of sweeteners were collected. A food frequency questionnaire aimed at identifying females who were high consumers of sugar-free soft drinks and/or of table-top sweeteners was filled in by 3982 teenagers living in the District of Rome. Moreover, 362 subjects participated in a detailed food survey by recording, at brand level, all foods and beverages ingested over 12 days. Producers were asked to provide the intense sweetener concentrations of their sugar-free products. Results showed that consumer behaviour with respect to brands has an impact on exposure assessments. Only probabilistic models that took into account indicators of market share and brand loyalty met the validation criteria.

  1. Predicting inpatient clinical order patterns with probabilistic topic models vs conventional order sets.

    PubMed

    Chen, Jonathan H; Goldstein, Mary K; Asch, Steven M; Mackey, Lester; Altman, Russ B

    2017-05-01

    Build probabilistic topic model representations of hospital admissions processes and compare the ability of such models to predict clinical order patterns against preconstructed order sets. The authors evaluated the first 24 hours of structured electronic health record data for > 10 K inpatients. Drawing an analogy between structured items (e.g., clinical orders) and words in a text document, the authors performed latent Dirichlet allocation probabilistic topic modeling. These topic models use initial clinical information to predict clinical orders for a separate validation set of > 4 K patients. The authors evaluated these topic model-based predictions vs existing human-authored order sets by area under the receiver operating characteristic curve, precision, and recall for subsequent clinical orders. Existing order sets predict clinical orders used within 24 hours with area under the receiver operating characteristic curve 0.81, precision 16%, and recall 35%. This can be improved to 0.90, 24%, and 47% (P < 10^-20) by using probabilistic topic models to summarize clinical data into up to 32 topics. Many of these latent topics yield natural clinical interpretations (e.g., "critical care," "pneumonia," "neurologic evaluation"). Existing order sets tend to provide nonspecific, process-oriented aid, with usability limitations impairing more precise, patient-focused support. Algorithmic summarization has the potential to breach this usability barrier by automatically inferring patient context, but with potential tradeoffs in interpretability. Probabilistic topic modeling provides an automated approach to detect thematic trends in patient care and generate decision support content. A potential use case finds related clinical orders for decision support.
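
    A compact sketch of the general approach, assuming scikit-learn and random stand-in data (not EHR data): fit LDA topics on a patient-by-order count matrix, infer a topic mixture for a new patient from initial orders, rank all candidate orders by the topic-weighted order distribution, and score with ROC AUC.

    # Sketch of the general approach: LDA topics over an "orders as words" count matrix,
    # then ranking candidate orders for a held-out patient and scoring with ROC AUC.
    # All data below are random stand-ins, not electronic health record data.
    import numpy as np
    from sklearn.decomposition import LatentDirichletAllocation
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(0)
    n_patients, n_orders = 500, 200
    counts = rng.poisson(0.05, size=(n_patients, n_orders))     # patient-by-order counts

    lda = LatentDirichletAllocation(n_components=32, random_state=0).fit(counts)

    # For a new patient, infer a topic mixture from initial orders, then score all orders
    # by the topic-weighted order distribution.
    new_patient = rng.poisson(0.05, size=(1, n_orders))
    theta = lda.transform(new_patient)                           # topic mixture, shape (1, 32)
    order_dist = lda.components_ / lda.components_.sum(axis=1, keepdims=True)
    scores = (theta @ order_dist).ravel()                        # probability-like ranking

    # Evaluate against (synthetic) subsequent orders for that patient.
    later_orders = (rng.poisson(0.05, size=n_orders) > 0).astype(int)
    print("ROC AUC (synthetic):", roc_auc_score(later_orders, scores))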

  2. Predicting inpatient clinical order patterns with probabilistic topic models vs conventional order sets

    PubMed Central

    Goldstein, Mary K; Asch, Steven M; Mackey, Lester; Altman, Russ B

    2017-01-01

    Objective: Build probabilistic topic model representations of hospital admissions processes and compare the ability of such models to predict clinical order patterns against preconstructed order sets. Materials and Methods: The authors evaluated the first 24 hours of structured electronic health record data for > 10 K inpatients. Drawing an analogy between structured items (e.g., clinical orders) and words in a text document, the authors performed latent Dirichlet allocation probabilistic topic modeling. These topic models use initial clinical information to predict clinical orders for a separate validation set of > 4 K patients. The authors evaluated these topic model-based predictions vs existing human-authored order sets by area under the receiver operating characteristic curve, precision, and recall for subsequent clinical orders. Results: Existing order sets predict clinical orders used within 24 hours with area under the receiver operating characteristic curve 0.81, precision 16%, and recall 35%. This can be improved to 0.90, 24%, and 47% (P < 10^-20) by using probabilistic topic models to summarize clinical data into up to 32 topics. Many of these latent topics yield natural clinical interpretations (e.g., “critical care,” “pneumonia,” “neurologic evaluation”). Discussion: Existing order sets tend to provide nonspecific, process-oriented aid, with usability limitations impairing more precise, patient-focused support. Algorithmic summarization has the potential to breach this usability barrier by automatically inferring patient context, but with potential tradeoffs in interpretability. Conclusion: Probabilistic topic modeling provides an automated approach to detect thematic trends in patient care and generate decision support content. A potential use case finds related clinical orders for decision support. PMID:27655861

  3. Landslide susceptibility mapping along PLUS expressways in Malaysia using probabilistic based model in GIS

    NASA Astrophysics Data System (ADS)

    Yusof, Norbazlan M.; Pradhan, Biswajeet

    2014-06-01

    PLUS Berhad holds the concession for a total of 987 km of toll expressways in Malaysia, the longest of which is the North-South Expressway or NSE. Acting as the 'backbone' of the west coast of the peninsula, the NSE stretches from the Malaysian-Thai border in the north to the border with neighbouring Singapore in the south, linking several major cities and towns along the way. The North-South Expressway contributes to the country's economic development through the trade, social and tourism sectors. Presently, the highway is in good condition and connects every state, but some locations need urgent attention. The stability of slopes at these locations is of most concern, as any instability can endanger motorists. In this paper, two study locations have been analysed: Gua Tempurung (soil slope) and Jelapang (rock slope), which have clearly different characteristics. These locations pass through undulating terrain with steep slopes where landslides are common and the probability of slope instability due to human activities in surrounding areas is high. A database of twelve (12) landslide conditioning factors was compiled: slope degree and slope aspect were extracted from IFSAR (interferometric synthetic aperture radar) data, while landuse, lithology and structural geology were constructed from interpretation of high-resolution satellite data from WorldView-2, QuickBird and Ikonos. All this information was analysed in a geographic information system (GIS) environment for landslide susceptibility mapping using a probabilistic frequency ratio model. In addition, information on the slopes such as inventories, condition assessments and maintenance records was assessed through the total expressway maintenance management system, better known as TEMAN. The above-mentioned system is used by PLUS as an asset management and decision support tool for maintenance activities along the highways as well as for data
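
    A small sketch of the frequency ratio calculation that underlies this kind of susceptibility mapping, assuming synthetic raster-like arrays in place of the GIS layers: the FR of a factor class is the share of landslide cells in that class divided by the share of all cells in that class, and a cell's susceptibility index sums the FRs of its classes across factors.

    # Sketch of the frequency ratio (FR) calculation used in this kind of susceptibility
    # mapping: FR(class) = (% of landslide cells in class) / (% of all cells in class),
    # and the susceptibility index of a cell sums the FRs of its factor classes.
    # The tiny arrays below are synthetic stand-ins for GIS raster layers.
    import numpy as np

    rng = np.random.default_rng(2)
    slope_class = rng.integers(0, 3, size=10_000)                 # factor classes per cell
    landslide = rng.random(10_000) < 0.02 * (slope_class + 1)     # synthetic inventory

    fr = {}
    for c in np.unique(slope_class):
        in_class = slope_class == c
        pct_landslide = landslide[in_class].sum() / landslide.sum()
        pct_area = in_class.sum() / in_class.size
        fr[c] = pct_landslide / pct_area
    print("frequency ratios by slope class:", fr)

    # Susceptibility index: with several factors, sum the FR of each cell's class per factor.
    susceptibility = np.vectorize(fr.get)(slope_class)            # single-factor example
    print("mean susceptibility index:", susceptibility.mean())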

  4. A probabilistic model for estimating the waiting time until the simultaneous collapse of two contingencies

    SciTech Connect

    Barnett, C.S.

    1991-06-01

    The Double Contingency Principle (DCP) is widely applied to criticality safety practice in the United States. Most practitioners base their application of the principle on qualitative, intuitive assessments. The recent trend toward probabilistic safety assessments provides a motive to search for a quantitative, probabilistic foundation for the DCP. A Markov model is tractable and leads to relatively simple results. The model yields estimates of mean time to simultaneous collapse of two contingencies as a function of estimates of mean failure times and mean recovery times of two independent contingencies. The model is a tool that can be used to supplement the qualitative methods now used to assess effectiveness of the DCP. 3 refs., 1 fig.
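
    A numerical sketch of the Markov-model idea, assuming illustrative failure and recovery rates (not the reference's values): build the continuous-time Markov chain over the transient states (both contingencies intact, only one failed) and solve a small linear system for the mean time to reach the absorbing "both failed" state.

    # Sketch: mean time until two independent contingencies are simultaneously failed,
    # computed from a small continuous-time Markov chain. Failure/recovery rates below
    # are assumed values (per year), not the reference's numbers.
    import numpy as np

    lam1, lam2 = 0.5, 0.3     # failure rates of contingency 1 and 2
    mu1,  mu2  = 50.0, 40.0   # recovery rates

    # Transient states: 0 = both intact, 1 = only #1 failed, 2 = only #2 failed.
    # The absorbing "both failed" state is dropped; Q holds rates among transient states.
    Q = np.array([
        [-(lam1 + lam2),  lam1,           lam2         ],
        [ mu1,           -(mu1 + lam2),   0.0          ],
        [ mu2,            0.0,           -(mu2 + lam1) ],
    ])
    # Mean time to absorption t satisfies Q t = -1 (vector of ones).
    t = np.linalg.solve(Q, -np.ones(3))
    print("mean years to simultaneous collapse (from both intact):", t[0])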

  5. A Probabilistic Model for the Distribution of Authorships.

    ERIC Educational Resources Information Center

    Ajiferuke, Isola

    1991-01-01

    Discusses bibliometric studies of research collaboration and describes the development of a theoretical model for the distribution of authorship. The shifted Waring distribution model and 15 other probability models are tested for goodness-of-fit, and results are reported that indicate the shifted inverse Gaussian-Poisson model provides the best…

  6. Statistical Learning of Probabilistic Nonadjacent Dependencies by Multiple-Cue Integration

    ERIC Educational Resources Information Center

    van den Bos, Esther; Christiansen, Morten H.; Misyak, Jennifer B.

    2012-01-01

    Previous studies have indicated that dependencies between nonadjacent elements can be acquired by statistical learning when each element predicts only one other element (deterministic dependencies). The present study investigates statistical learning of probabilistic nonadjacent dependencies, in which each element predicts several other elements…

  7. Statistical Learning of Probabilistic Nonadjacent Dependencies by Multiple-Cue Integration

    ERIC Educational Resources Information Center

    van den Bos, Esther; Christiansen, Morten H.; Misyak, Jennifer B.

    2012-01-01

    Previous studies have indicated that dependencies between nonadjacent elements can be acquired by statistical learning when each element predicts only one other element (deterministic dependencies). The present study investigates statistical learning of probabilistic nonadjacent dependencies, in which each element predicts several other elements…

  8. Receptor-mediated cell attachment and detachment kinetics. I. Probabilistic model and analysis.

    PubMed Central

    Cozens-Roberts, C.; Lauffenburger, D. A.; Quinn, J. A.

    1990-01-01

    The kinetics of receptor-mediated cell adhesion to a ligand-coated surface play a key role in many physiological and biotechnology-related processes. We present a probabilistic model of receptor-ligand bond formation between a cell and surface to describe the probability of adhesion in a fluid shear field. Our model extends the deterministic model of Hammer and Lauffenburger (Hammer, D.A., and D.A. Lauffenburger. 1987. Biophys. J. 52:475-487) to a probabilistic framework, in which we calculate the probability that a certain number of bonds between a cell and surface exists at any given time. The probabilistic framework is used to account for deviations from ideal, deterministic behavior, inherent in chemical reactions involving relatively small numbers of reacting molecules. Two situations are investigated: first, cell attachment in the absence of fluid stress; and, second, cell detachment in the presence of fluid stress. In the attachment case, we examine the expected variance in bond formation as a function of attachment time; this also provides an initial condition for the detachment case. Focusing then on detachment, we predict transient behavior as a function of key system parameters, such as the distractive fluid force, the receptor-ligand bond affinity and rate constants, and the receptor and ligand densities. We compare the predictions of the probabilistic model with those of a deterministic model, and show how a deterministic approach can yield some inaccurate results; e.g., it cannot account for temporally continuous cell attachment or detachment, it can underestimate the time needed for cell attachment, it can overestimate the time required for cell detachment for a given level of force, and it can overestimate the force necessary for cell detachment. PMID:2174271
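
    A minimal sketch of the probabilistic (master equation) view, assuming illustrative rate constants and receptor numbers: the distribution over bond numbers is propagated with a matrix exponential of the birth-death generator, from which quantities such as the probability of holding at least a critical number of bonds follow.

    # Sketch: bond-number probability distribution from a birth-death master equation,
    # in the spirit of a probabilistic (rather than deterministic) adhesion model.
    # Rate constants and receptor/ligand numbers below are assumed, not the paper's values.
    import numpy as np
    from scipy.linalg import expm

    N = 50                       # maximum number of bonds tracked
    kf, kr = 0.08, 0.5           # per-bond formation and breakage rates (1/s), assumed
    free_pairs = lambda n: N - n # crude stand-in for available receptor-ligand pairs

    # Generator matrix A for the master equation dP/dt = A P over states n = 0..N.
    A = np.zeros((N + 1, N + 1))
    for n in range(N + 1):
        up = kf * free_pairs(n) if n < N else 0.0   # rate of n -> n+1
        down = kr * n                               # rate of n -> n-1
        if n < N:
            A[n + 1, n] += up
        if n > 0:
            A[n - 1, n] += down
        A[n, n] -= up + down

    P0 = np.zeros(N + 1); P0[0] = 1.0               # start with no bonds
    P = expm(A * 2.0) @ P0                          # distribution after 2 seconds
    print("P(adhesion, i.e. at least 5 bonds):", P[5:].sum())
    print("mean bonds:", (np.arange(N + 1) * P).sum())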

  9. A Probabilistic Model of Student Nurses' Knowledge of Normal Nutrition.

    ERIC Educational Resources Information Center

    Passmore, David Lynn

    1983-01-01

    Vocational and technical education researchers need to be aware of the uses and limits of various statistical models. The author reviews the Rasch Model and applies it to results from a nutrition test given to student nurses. (Author)

  10. Probabilistic performance-assessment modeling of the mixed waste landfill at Sandia National Laboratories.

    SciTech Connect

    Peace, Gerald L.; Goering, Timothy James; Miller, Mark Laverne; Ho, Clifford Kuofei

    2007-01-01

    A probabilistic performance assessment has been conducted to evaluate the fate and transport of radionuclides (americium-241, cesium-137, cobalt-60, plutonium-238, plutonium-239, radium-226, radon-222, strontium-90, thorium-232, tritium, uranium-238), heavy metals (lead and cadmium), and volatile organic compounds (VOCs) at the Mixed Waste Landfill (MWL). Probabilistic analyses were performed to quantify uncertainties inherent in the system and models for a 1,000-year period, and sensitivity analyses were performed to identify parameters and processes that were most important to the simulated performance metrics. Comparisons between simulated results and measured values at the MWL were made to gain confidence in the models and perform calibrations when data were available. In addition, long-term monitoring requirements and triggers were recommended based on the results of the quantified uncertainty and sensitivity analyses.

  11. Probabilistic Model for Laser Damage to the Human Retina

    DTIC Science & Technology

    2012-03-01

    the beam. Power density may be measured in radiant exposure, J/cm2, or by irradiance, W/cm2. In the experimental database used in this study and ... to quantify a binary response, either lethal or non-lethal, within a population such as insects or rats. In directed energy research, probit ... value of the normalized Arrhenius damage integral. In a one-dimensional simulation, the source term is determined as a spatially averaged irradiance (W

  12. Probabilistic Design and Analysis Framework

    NASA Technical Reports Server (NTRS)

    Strack, William C.; Nagpal, Vinod K.

    2010-01-01

    PRODAF is a software package designed to aid analysts and designers in conducting probabilistic analysis of components and systems. PRODAF can integrate multiple analysis programs to ease the tedious process of conducting a complex analysis that requires the use of multiple software packages. The work uses a commercial finite element analysis (FEA) program with modules from NESSUS to conduct a probabilistic analysis of a hypothetical turbine blade, disk, and shaft model. PRODAF applies the response surface method at the component level and extrapolates the component-level responses to the system level. Hypothetical components of a gas turbine engine are first deterministically modeled using FEA. Variations in selected geometrical dimensions and loading conditions are analyzed to determine their effects on the stress state within each component. Geometric variations include the chord length and height for the blade, and the inner radius, outer radius, and thickness for the disk. Probabilistic analysis is carried out using developing software packages like System Uncertainty Analysis (SUA) and PRODAF. PRODAF was used with a commercial deterministic FEA program in conjunction with modules from the probabilistic analysis program, NESTEM, to perturb loads and geometries to provide a reliability and sensitivity analysis. PRODAF simplified the handling of data among the various programs involved, and will work with many commercial and open-source deterministic programs, probabilistic programs, or modules.
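
    A compact sketch of the response surface idea, assuming a stand-in analytic function in place of the expensive FEA runs and illustrative variable scatter: fit a quadratic surrogate to a small design of experiments, then Monte Carlo the surrogate to estimate an exceedance probability.

    # Sketch of the response surface method: fit a quadratic surrogate to a few expensive
    # deterministic runs, then Monte Carlo the surrogate. The "FEA" function, variable
    # ranges, and threshold below are illustrative stand-ins.
    import numpy as np

    def fea_stress(chord, load):                 # stand-in for an expensive FEA evaluation
        return 120.0 + 35.0 * load / chord + 4.0 * load**2

    # Design of experiments: a small grid over chord length and load factor.
    C, L = np.meshgrid(np.linspace(0.9, 1.1, 5), np.linspace(0.8, 1.2, 5))
    y = fea_stress(C, L).ravel()

    # Quadratic response surface: [1, c, l, c^2, l^2, c*l].
    X = np.column_stack([np.ones(C.size), C.ravel(), L.ravel(),
                         C.ravel()**2, L.ravel()**2, (C * L).ravel()])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)

    # Probabilistic analysis on the surrogate: normal scatter on chord and load.
    rng = np.random.default_rng(0)
    c = rng.normal(1.0, 0.03, 100_000)
    l = rng.normal(1.0, 0.08, 100_000)
    stress = np.column_stack([np.ones_like(c), c, l, c**2, l**2, c * l]) @ coef
    print("P(stress > 165 MPa):", (stress > 165.0).mean())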

  13. A Strategy to Integrate Probabilistic Risk Assessment into Design and Development Processes for Aerospace Based Upon Mars Exploration Rover Experiences

    NASA Technical Reports Server (NTRS)

    Nunes, Jeffery; Paulos, Todd; Everline, Chester J.; Dezfuli, Homayoon

    2006-01-01

    This paper will discuss the Probabilistic Risk Assessment (PRA) effort and its involvement with related activities during the development of the Mars Exploration Rover (MER). The Rovers were launched 2003.June.10 (Spirit) and 2003.July.7 (Opportunity), and both have proven very successful. Although designed for a 90-day mission, the Rovers have been operating for over two earth years. This paper will review aspects of how the MER project integrated PRA into the design and development process. A companion paper (Development of the Mars Exploration Rover PRA) will describe the MER PRA and design changes from those results.

  14. A Strategy to Integrate Probabilistic Risk Assessment into Design and Development Processes for Aerospace Based Upon Mars Exploration Rover Experiences

    NASA Technical Reports Server (NTRS)

    Nunes, Jeffery; Paulos, Todd; Everline, Chester J.; Dezfuli, Homayoon

    2006-01-01

    This paper will discuss the Probabilistic Risk Assessment (PRA) effort and its involvement with related activities during the development of the Mars Exploration Rover (MER). The Rovers were launched 2003.June.10 (Spirit) and 2003.July.7 (Opportunity), and both have proven very successful. Although designed for a 90-day mission, the Rovers have been operating for over two earth years. This paper will review aspects of how the MER project integrated PRA into the design and development process. A companion paper (Development of the Mars Exploration Rover PRA) will describe the MER PRA and design changes from those results.

  15. Probabilistic Analysis of Onion Routing in a Black-box Model

    DTIC Science & Technology

    2007-01-01

    model of anonymous communication that abstracts the essential properties of onion routing in the presence of an active adversary that controls a portion ... users by exploiting knowledge of their probabilistic behavior. In particular, we show that a user u's anonymity is worst either when the other users always choose the destination u is least likely to visit or when the other users always choose the destination u chooses. This worst-case anonymity

  16. The Integrated Medical Model

    NASA Technical Reports Server (NTRS)

    Kerstman, Eric; Minard, Charles; Saile, Lynn; Freiere deCarvalho, Mary; Myers, Jerry; Walton, Marlei; Butler, Douglas; Iyengar, Sriram; Johnson-Throop, Kathy; Baumann, David

    2010-01-01

    The goals of the Integrated Medical Model (IMM) are to develop an integrated, quantified, evidence-based decision support tool useful to crew health and mission planners and to help align science, technology, and operational activities intended to optimize crew health, safety, and mission success. Presentation slides address scope and approach, beneficiaries of IMM capabilities, history, risk components, conceptual models, development steps, and the evidence base. Space adaptation syndrome is used to demonstrate the model's capabilities.

  17. Probabilistic Model for the Simulation of Secondary Electron Emission

    SciTech Connect

    Furman, M

    2004-05-17

    We provide a detailed description of a model and its computational algorithm for the secondary electron emission process. The model is based on a broad phenomenological fit to data for the secondary emission yield (SEY) and the emitted-energy spectrum. We provide two sets of values for the parameters by fitting our model to two particular data sets, one for copper and the other one for stainless steel.

  18. Microscopic probabilistic model for the simulation of secondary electron emission

    SciTech Connect

    Furman, M.A.; Pivi, M.T.F.

    2002-07-29

    We provide a detailed description of a model and its computational algorithm for the secondary electron emission process. The model is based on a broad phenomenological fit to data for the secondary emission yield (SEY) and the emitted-energy spectrum. We provide two sets of values for the parameters by fitting our model to two particular data sets, one for copper and the other one for stainless steel.

  19. Development of a coupled hydrological - hydrodynamic model for probabilistic catchment flood inundation modelling

    NASA Astrophysics Data System (ADS)

    Quinn, Niall; Freer, Jim; Coxon, Gemma; Dunne, Toby; Neal, Jeff; Bates, Paul; Sampson, Chris; Smith, Andy; Parkin, Geoff

    2017-04-01

    Computationally efficient flood inundation modelling systems capable of representing important hydrological and hydrodynamic flood generating processes over relatively large regions are vital for those interested in flood preparation, response, and real-time forecasting. However, such systems are currently not readily available. This is particularly important where flood predictions from intense rainfall are considered, as the processes leading to flooding often involve localised, non-linear, spatially connected hillslope-catchment responses. Therefore, this research introduces a novel hydrological-hydraulic modelling framework for the provision of probabilistic flood inundation predictions across catchment to regional scales that explicitly accounts for spatial variability in rainfall-runoff and routing processes. Approaches have been developed to automate the provision of required input datasets and to estimate essential catchment characteristics from freely available national datasets. This is an essential component of the framework, as when making predictions over multiple catchments or at relatively large scales, where data are often scarce, obtaining local information and manually incorporating it into the model quickly becomes infeasible. An extreme flooding event in the town of Morpeth, NE England, in 2008 was used as a first case study evaluation of the modelling framework introduced. The results demonstrated a high degree of prediction accuracy when comparing modelled and reconstructed event characteristics, while the efficiency of the modelling approach enabled the generation of relatively large ensembles of realisations from which uncertainty within the prediction may be represented. This research supports previous literature highlighting the importance of probabilistic forecasting, particularly during extreme events, which can often be poorly characterised or even missed by deterministic predictions due to the inherent

  20. Using ELM-based weighted probabilistic model in the classification of synchronous EEG BCI.

    PubMed

    Tan, Ping; Tan, Guan-Zheng; Cai, Zi-Xing; Sa, Wei-Ping; Zou, Yi-Qun

    2017-01-01

    Extreme learning machine (ELM) is an effective machine learning technique with simple theory and fast implementation, which has gained increasing interest from various research fields recently. A new method that combines ELM with a probabilistic model is proposed in this paper to classify electroencephalography (EEG) signals in a synchronous brain-computer interface (BCI) system. In the proposed method, the softmax function is used to convert the ELM output into classification probabilities. The Chernoff error bound, deduced from the Bayesian probabilistic model in the training process, is adopted as the weight in the discriminant process. Since the proposed method makes use of the knowledge from all preceding training datasets, its discriminating performance improves accumulatively. In test experiments based on datasets from BCI competitions, the proposed method is compared with other classification methods, including linear discriminant analysis, support vector machine, ELM and weighted probabilistic model methods. For comparison, the mutual information, classification accuracy and information transfer rate are considered as the evaluation indicators for these classifiers. The results demonstrate that our method shows competitive performance against the other methods.
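
    A minimal ELM sketch on synthetic two-class data, assuming numpy: a fixed random hidden layer, ridge least-squares output weights, and a softmax step converting outputs into class probabilities (the paper's Chernoff-bound weighting of preceding training sets is not reproduced).

    # Minimal extreme learning machine (ELM) sketch with softmax outputs, on synthetic
    # two-class data. Only the ELM + softmax probability step is shown.
    import numpy as np

    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(-1, 1, (200, 10)), rng.normal(+1, 1, (200, 10))])
    y = np.repeat([0, 1], 200)
    T = np.eye(2)[y]                                     # one-hot targets

    n_hidden = 64
    W = rng.normal(size=(X.shape[1], n_hidden))          # random input weights (fixed)
    b = rng.normal(size=n_hidden)
    H = np.tanh(X @ W + b)                               # hidden layer activations

    # Output weights by ridge-regularized least squares -- the only trained parameters.
    beta = np.linalg.solve(H.T @ H + 1e-3 * np.eye(n_hidden), H.T @ T)

    def predict_proba(Xnew):
        scores = np.tanh(Xnew @ W + b) @ beta
        e = np.exp(scores - scores.max(axis=1, keepdims=True))
        return e / e.sum(axis=1, keepdims=True)          # softmax -> class probabilities

    print("training accuracy:", (predict_proba(X).argmax(axis=1) == y).mean())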

  1. Interference in the classical probabilistic model and its representation in complex Hilbert space

    NASA Astrophysics Data System (ADS)

    Khrennikov, Andrei Yu.

    2005-10-01

    The notion of a context (complex of physical conditions, that is to say, specification of the measurement setup) is basic in this paper. We show that the main structures of quantum theory (interference of probabilities, Born's rule, complex probabilistic amplitudes, Hilbert state space, representation of observables by operators) are already present in a latent form in the classical Kolmogorov probability model. However, this model should be considered as a calculus of contextual probabilities. In our approach it is forbidden to consider abstract context-independent probabilities: “first context and only then probability”. We construct the representation of the general contextual probabilistic dynamics in the complex Hilbert space. Thus the dynamics of the wave function (in particular, Schrödinger's dynamics) can be considered as a Hilbert space projection of a realistic dynamics in a “prespace”. The basic condition for representing the prespace dynamics is the law of statistical conservation of energy (conservation of probabilities). In general the Hilbert space projection of the prespace dynamics can be nonlinear and even irreversible (but it is always unitary). Methods developed in this paper can be applied not only to quantum mechanics, but also to classical statistical mechanics. The main quantum-like structures (e.g., interference of probabilities) might be found in some models of classical statistical mechanics. Quantum-like probabilistic behavior can be demonstrated by biological systems. In particular, it was recently found in some psychological experiments.
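
    For reference, the contextual formula of total probability with an interference term can be written as below (a standard form in this contextual approach; the notation is generic rather than quoted from the paper). Here λ = cos θ in the trigonometric, quantum-like regime, and λ = 0 recovers the classical formula of total probability.

    % Interference of probabilities: for a dichotomous observable b with values b_1, b_2
    % and an observable a measured under context C, the classical formula of total
    % probability acquires an interference term.
    \[
      P(a \mid C) = P(b_1 \mid C)\,P(a \mid b_1) + P(b_2 \mid C)\,P(a \mid b_2)
      + 2\lambda\,\sqrt{P(b_1 \mid C)\,P(a \mid b_1)\,P(b_2 \mid C)\,P(a \mid b_2)} .
    \]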

  2. Limitations of Exemplar Models of Multi-Attribute Probabilistic Inference

    ERIC Educational Resources Information Center

    Nosofsky, Robert M.; Bergert, F. Bryan

    2007-01-01

    Observers were presented with pairs of objects varying along binary-valued attributes and learned to predict which member of each pair had a greater value on a continuously varying criterion variable. The predictions from exemplar models of categorization were contrasted with classic alternative models, including generalized versions of a…

  3. Sensitivity analysis and probabilistic re-entry modeling for debris using high dimensional model representation based uncertainty treatment

    NASA Astrophysics Data System (ADS)

    Mehta, Piyush M.; Kubicek, Martin; Minisci, Edmondo; Vasile, Massimiliano

    2017-01-01

    Well-known tools developed for satellite and debris re-entry perform break-up and trajectory simulations in a deterministic sense and do not perform any uncertainty treatment. The treatment of uncertainties associated with the re-entry of a space object requires a probabilistic approach. A Monte Carlo campaign is the intuitive approach to performing a probabilistic analysis; however, it is computationally very expensive. In this work, we use a recently developed approach based on a new derivation of the high dimensional model representation method to implement a computationally efficient probabilistic analysis approach for re-entry. Both aleatoric and epistemic uncertainties that affect the aerodynamic trajectory and ground impact location are considered. The method is applicable to both controlled and uncontrolled re-entry scenarios. The resulting ground impact distributions are far from the typically used Gaussian or ellipsoid distributions.

  4. Modeling the impact of flexible textile composites through multiscale and probabilistic methods

    NASA Astrophysics Data System (ADS)

    Nilakantan, Gaurav

    Flexible textile composites or fabrics comprised of materials such as Kevlar are used in impact and penetration resistant structures such as protective clothing for law enforcement and military personnel. The penetration response of these fabrics is probabilistic in nature and experimentally characterized through parameters such as the V0 and the V50 velocity. In this research a probabilistic computational framework is developed through which the entire V0- V100 velocity curve or probabilistic velocity response (PVR) curve can be numerically determined through a series of finite element (FE) impact simulations. Sources of variability that affect the PVR curve are isolated for investigation, which in this study is chosen as the statistical nature of yarn tensile strengths. Experimental tensile testing is conducted on spooled and fabric-extracted Kevlar yarns. The statistically characterized strengths are then mapped onto the yarns of the fabric FE model as part of the probabilistic computational framework. The effects of projectile characteristics such as size and shape on the fabric PVR curve are studied. A multiscale modeling technique entitled the Hybrid Element Analysis (HEA) is developed to reduce the computational requirements of a fabric model based on a yarn level architecture discretized with only solid elements. This technique combines into a single FE model both a local region of solid and shell element based yarn level architecture, and a global region of shell element based membrane level architecture, with impedance matched interfaces. The multiscale model is then incorporated into the probabilistic computational framework. A yarn model comprised of a filament level architecture is developed to investigate the feasibility of solid element based homogenized yarn models as well as the effect of filament spreading and inter-filament friction on the impact response. Results from preliminary experimental fabric impact testing are also presented. This

  5. Probabilistically Constraining Age-Depth-Models of Glaciogenic Sediments

    NASA Astrophysics Data System (ADS)

    Werner, J.; van der Bilt, W.; Tingley, M.

    2015-12-01

    Reconstructions of the late-Holocene climate rely heavily upon proxies that are assumed to be accurately dated by layer counting. All of these proxies, such as measurements of tree rings, ice cores, and varved lake sediments do carry some inherent dating uncertainty that is not always fully accounted for. Considerable advances could be achieved if time uncertainties were recognized and correctly modeled, also for proxies commonly treated as free of age model errors. Current approaches for accounting for time uncertainty are generally limited to repeating the reconstruction using each one of an ensemble of age models, thereby inflating the final estimated uncertainty - in effect, each possible age model is given equal weighting. Uncertainties can be reduced by exploiting the inferred space-time covariance structure of the climate to re-weight the possible age models. Werner and Tingley (2015) demonstrated how Bayesian hierarchical climate reconstruction models can be augmented to account for time-uncertain proxies. In their method, probabilities associated with the age models are formally updated within the Bayesian framework, thereby reducing uncertainties. Numerical experiments (Werner and Tingley 2015) show that updating the age model probabilities decreases uncertainty in the resulting reconstructions, as compared with the current de facto standard of sampling over all age models, provided there is sufficient information from other data sources in the spatial region of the time-uncertain proxy. We show how this novel method can be applied to high resolution, sub-annually sampled lacustrine sediment records to constrain their respective age depth models. The results help to quantify the signal content and extract the regionally representative signal. The single time series can then be used as the basis for a reconstruction of glacial activity. van der Bilt et al. in prep. Werner, J.P. and Tingley, M.P. Clim. Past (2015)

  6. Dependence in probabilistic modeling, Dempster-Shafer theory, and probability bounds analysis.

    SciTech Connect

    Oberkampf, William Louis; Tucker, W. Troy; Zhang, Jianzhong; Ginzburg, Lev; Berleant, Daniel J.; Ferson, Scott; Hajagos, Janos; Nelsen, Roger B.

    2004-10-01

    This report summarizes methods to incorporate information (or lack of information) about inter-variable dependence into risk assessments that use Dempster-Shafer theory or probability bounds analysis to address epistemic and aleatory uncertainty. The report reviews techniques for simulating correlated variates for a given correlation measure and dependence model, computation of bounds on distribution functions under a specified dependence model, formulation of parametric and empirical dependence models, and bounding approaches that can be used when information about the intervariable dependence is incomplete. The report also reviews several of the most pervasive and dangerous myths among risk analysts about dependence in probabilistic models.

  7. Integrated Modeling Environment

    NASA Technical Reports Server (NTRS)

    Mosier, Gary; Stone, Paul; Holtery, Christopher

    2006-01-01

    The Integrated Modeling Environment (IME) is a software system that establishes a centralized Web-based interface for integrating people (who may be geographically dispersed), processes, and data involved in a common engineering project. The IME includes software tools for life-cycle management, configuration management, visualization, and collaboration.

  8. Fundamental Mistuning Model for Probabilistic Analysis Studied Experimentally

    NASA Technical Reports Server (NTRS)

    Griffin, Jerry H.

    2005-01-01

    The Fundamental Mistuning Model (FMM) is a reduced-order model for efficiently calculating the forced response of a mistuned bladed disk. FMM ID is a companion program that determines the mistuning in a particular rotor. Together, these methods provide a way to acquire mistuning data in a population of bladed disks and then simulate the forced response of the fleet. This process was tested experimentally at the NASA Glenn Research Center, and the simulated results were compared with laboratory measurements of a "fleet" of test rotors. The method was shown to work quite well. It was found that the accuracy of the results depends on two factors: (1) the quality of the statistical model used to characterize mistuning and (2) how sensitive the system is to errors in the statistical modeling.

  9. Probabilistic Modeling of Loran-C for nonprecision approaches

    NASA Technical Reports Server (NTRS)

    Einhorn, John K.

    1987-01-01

    The overall idea of the research was to predict the errors to be encountered during an approach using available data from the U.S. Coast Guard and standard normal distribution probability analysis for a number of airports in the North East CONUS. The research consists of two parts: an analytical model that predicts the probability of an approach falling within a given standard, and a series of flight tests designed to test the validity of the model.
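
    A short sketch of the probability calculation at the heart of such a model, assuming a normal error model with illustrative bias, standard deviation, and cross-track tolerance (not the study's Coast Guard data).

    # Sketch: probability that Loran-C cross-track error stays within a nonprecision
    # approach tolerance, assuming a normal error model. Bias, sigma, and the tolerance
    # below are assumed illustrative values, not the study's data.
    from scipy.stats import norm

    bias_m, sigma_m = 60.0, 120.0      # assumed mean error and standard deviation (meters)
    tolerance_m = 556.0                # assumed cross-track limit (~0.3 NM)

    p_within = norm.cdf(tolerance_m, loc=bias_m, scale=sigma_m) - \
               norm.cdf(-tolerance_m, loc=bias_m, scale=sigma_m)
    print(f"P(|error| <= {tolerance_m:.0f} m) = {p_within:.4f}")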

  10. Order Under Uncertainty: Robust Differential Expression Analysis Using Probabilistic Models for Pseudotime Inference

    PubMed Central

    Campbell, Kieran R.

    2016-01-01

    Single cell gene expression profiling can be used to quantify transcriptional dynamics in temporal processes, such as cell differentiation, using computational methods to label each cell with a ‘pseudotime’ where true time series experimentation is too difficult to perform. However, owing to the high variability in gene expression between individual cells, there is an inherent uncertainty in the precise temporal ordering of the cells. Pre-existing methods for pseudotime estimation have predominantly given point estimates precluding a rigorous analysis of the implications of uncertainty. We use probabilistic modelling techniques to quantify pseudotime uncertainty and propagate this into downstream differential expression analysis. We demonstrate that reliance on a point estimate of pseudotime can lead to inflated false discovery rates and that probabilistic approaches provide greater robustness and measures of the temporal resolution that can be obtained from pseudotime inference. PMID:27870852

  11. Probabilistic Inference in General Graphical Models through Sampling in Stochastic Networks of Spiking Neurons

    PubMed Central

    Pecevski, Dejan; Buesing, Lars; Maass, Wolfgang

    2011-01-01

    An important open problem of computational neuroscience is the generic organization of computations in networks of neurons in the brain. We show here through rigorous theoretical analysis that inherent stochastic features of spiking neurons, in combination with simple nonlinear computational operations in specific network motifs and dendritic arbors, enable networks of spiking neurons to carry out probabilistic inference through sampling in general graphical models. In particular, it enables them to carry out probabilistic inference in Bayesian networks with converging arrows (“explaining away”) and with undirected loops, that occur in many real-world tasks. Ubiquitous stochastic features of networks of spiking neurons, such as trial-to-trial variability and spontaneous activity, are necessary ingredients of the underlying computational organization. We demonstrate through computer simulations that this approach can be scaled up to neural emulations of probabilistic inference in fairly large graphical models, yielding some of the most complex computations that have been carried out so far in networks of spiking neurons. PMID:22219717

  12. Event-Based Media Enrichment Using an Adaptive Probabilistic Hypergraph Model.

    PubMed

    Liu, Xueliang; Wang, Meng; Yin, Bao-Cai; Huet, Benoit; Li, Xuelong

    2015-11-01

    Nowadays, with the continual development of digital capture technologies and social media services, a vast number of media documents are captured and shared online to help attendees record their experience during events. In this paper, we present a method combining semantic inference and multimodal analysis for automatically finding media content to illustrate events using an adaptive probabilistic hypergraph model. In this model, media items are taken as vertices in the weighted hypergraph and the task of enriching media to illustrate events is formulated as a ranking problem. In our method, each hyperedge is constructed using the K-nearest neighbors of a given media document. We also employ a probabilistic representation, which assigns each vertex to a hyperedge in a probabilistic way, to further exploit the correlation among media data. Furthermore, we optimize the hypergraph weights in a regularization framework, which is solved as a second-order cone problem. The approach is initiated by seed media and then used to rank the media documents using a transductive inference process. The results obtained from validating the approach on an event dataset collected from EventMedia demonstrate the effectiveness of the proposed approach.

  13. Probabilistic Model of Microbial Cell Growth, Division, and Mortality ▿

    PubMed Central

    Horowitz, Joseph; Normand, Mark D.; Corradini, Maria G.; Peleg, Micha

    2010-01-01

    After a short time interval of length δt during microbial growth, an individual cell can be found to be divided with probability Pd(t)δt, dead with probability Pm(t)δt, or alive but undivided with the probability 1 − [Pd(t) + Pm(t)]δt, where t is time, Pd(t) expresses the probability of division for an individual cell per unit of time, and Pm(t) expresses the probability of mortality per unit of time. These probabilities may change with the state of the population and the habitat's properties and are therefore functions of time. This scenario translates into a model that is presented in stochastic and deterministic versions. The first, a stochastic process model, monitors the fates of individual cells and determines cell numbers. It is particularly suitable for small populations such as those that may exist in the case of casual contamination of a food by a pathogen. The second, which can be regarded as a large-population limit of the stochastic model, is a continuous mathematical expression that describes the population's size as a function of time. It is suitable for large microbial populations such as those present in unprocessed foods. Exponential or logistic growth with or without lag, inactivation with or without a “shoulder,” and transitions between growth and inactivation are all manifestations of the underlying probability structure of the model. With temperature-dependent parameters, the model can be used to simulate nonisothermal growth and inactivation patterns. The same concept applies to other factors that promote or inhibit microorganisms, such as pH and the presence of antimicrobials, etc. With Pd(t) and Pm(t) in the form of logistic functions, the model can simulate all commonly observed growth/mortality patterns. Estimates of the changing probability parameters can be obtained with both the stochastic and deterministic versions of the model, as demonstrated with simulated data. PMID:19915038
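
    A direct simulation sketch of the stochastic version described above, assuming illustrative logistic forms and parameter values for Pd(t) and Pm(t): in each small interval, each live cell divides with probability Pd(t)·δt, dies with probability Pm(t)·δt, and otherwise persists.

    # Sketch of the stochastic version: per small time step, each cell divides with
    # probability Pd(t)*dt, dies with Pm(t)*dt, otherwise persists. The logistic forms
    # and parameter values for Pd and Pm below are assumed for illustration only.
    import numpy as np

    rng = np.random.default_rng(0)
    dt, t_end = 0.05, 12.0

    def Pd(t):                   # division probability per unit time (assumed logistic)
        return 0.8 / (1.0 + np.exp(-(t - 2.0)))

    def Pm(t):                   # mortality probability per unit time (assumed logistic)
        return 0.6 / (1.0 + np.exp(-(t - 8.0)))

    n, history = 20, []
    for step in range(int(t_end / dt)):
        t = step * dt
        divided = rng.binomial(n, min(Pd(t) * dt, 1.0))
        died = rng.binomial(n - divided, min(Pm(t) * dt, 1.0))
        n = n + divided - died
        history.append(n)
    print("final population:", n)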

  14. Validation analysis of probabilistic models of dietary exposure to food additives.

    PubMed

    Gilsenan, M B; Thompson, R L; Lambe, J; Gibney, M J

    2003-10-01

    The validity of a range of simple conceptual models designed specifically for the estimation of food additive intakes using probabilistic analysis was assessed. Modelled intake estimates that fell below traditional conservative point estimates of intake and above 'true' additive intakes (calculated from a reference database at brand level) were considered to be in a valid region. Models were developed for 10 food additives by combining food intake data, the probability of an additive being present in a food group and additive concentration data. Food intake and additive concentration data were entered as raw data or as a lognormal distribution, and the probability of an additive being present was entered based on the per cent brands or the per cent eating occasions within a food group that contained an additive. Since the three model components assumed two possible modes of input, the validity of eight (2(3)) model combinations was assessed. All model inputs were derived from the reference database. An iterative approach was employed in which the validity of individual model components was assessed first, followed by validation of full conceptual models. While the distribution of intake estimates from models fell below conservative intakes, which assume that the additive is present at maximum permitted levels (MPLs) in all foods in which it is permitted, intake estimates were not consistently above 'true' intakes. These analyses indicate the need for more complex models for the estimation of food additive intakes using probabilistic analysis. Such models should incorporate information on market share and/or brand loyalty.
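
    A schematic Monte Carlo of the kind of model being validated, assuming illustrative consumption, market share, and concentration inputs: intake is the sum over eating occasions of amount x presence x concentration, with brand loyalty emulated by drawing the presence indicator once per consumer.

    # Sketch of a probabilistic additive-intake model: intake = amount * presence * concentration
    # summed over eating occasions, where presence reflects market share and brand loyalty
    # fixes the presence draw per consumer. All numbers below are assumed.
    import numpy as np

    rng = np.random.default_rng(0)
    n_consumers, n_days = 5000, 12
    market_share = 0.35                       # assumed fraction of brands containing the additive
    conc_mean, conc_sd = 20.0, 5.0            # assumed concentration distribution (mg/kg)

    daily_amount_kg = rng.lognormal(mean=-2.0, sigma=0.5, size=(n_consumers, n_days))

    # Brand-loyal consumers: one presence draw per consumer, reused on every occasion.
    loyal_presence = rng.random(n_consumers) < market_share
    conc = rng.normal(conc_mean, conc_sd, size=(n_consumers, n_days)).clip(min=0.0)

    intake = (daily_amount_kg * conc * loyal_presence[:, None]).mean(axis=1)  # mg/day
    print("mean intake (mg/day):", intake.mean())
    print("95th percentile (mg/day):", np.percentile(intake, 95))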

  15. Probabilistic models for semisupervised discriminative motif discovery in DNA sequences.

    PubMed

    Kim, Jong Kyoung; Choi, Seungjin

    2011-01-01

    Methods for discriminative motif discovery in DNA sequences identify transcription factor binding sites (TFBSs), searching only for patterns that differentiate two sets (positive and negative sets) of sequences. On one hand, discriminative methods increase the sensitivity and specificity of motif discovery, compared to generative models. On the other hand, generative models can easily exploit unlabeled sequences to better detect functional motifs when labeled training samples are limited. In this paper, we develop a hybrid generative/discriminative model which enables us to make use of unlabeled sequences in the framework of discriminative motif discovery, leading to semisupervised discriminative motif discovery. Numerical experiments on yeast ChIP-chip data for discovering DNA motifs demonstrate that the best performance is obtained in between the purely generative and the purely discriminative models, and that semisupervised learning improves performance when labeled sequences are limited.

  16. A Coupled Probabilistic Wake Vortex and Aircraft Response Prediction Model

    NASA Technical Reports Server (NTRS)

    Gloudemans, Thijs; Van Lochem, Sander; Ras, Eelco; Malissa, Joel; Ahmad, Nashat N.; Lewis, Timothy A.

    2016-01-01

    Wake vortex spacing standards, along with weather and runway occupancy time, restrict terminal area throughput and impose major constraints on the overall capacity and efficiency of the National Airspace System (NAS). For more than two decades, the National Aeronautics and Space Administration (NASA) has been conducting research on characterizing wake vortex behavior in order to develop fast-time wake transport and decay prediction models. It is expected that the models can be used in the systems-level design of advanced air traffic management (ATM) concepts that safely increase the capacity of the NAS. It is also envisioned that at a later stage of maturity, these models could potentially be used operationally, in ground-based spacing and scheduling systems as well as on the flight deck.

  17. A new Bayesian formulation to integrate body-wave polarisation in non-linear probabilistic earthquake location

    NASA Astrophysics Data System (ADS)

    Gaucher, Emmanuel; Gesret, Alexandrine; Noble, Mark; Kohl, Thomas

    2016-04-01

    Earthquake location is usually computed using the arrival times of the seismic waves observed on monitoring networks. However, three-component seismometers enable measurement of the seismic wave polarisation, which is also hypocentre dependent. This information is necessary when considering single-station locations but may also be applied to local and sparse seismic networks with poor coverage to better constrain the local earthquake hypocentres, as typically seen in hydraulic fracturing or geothermal field monitoring. In this work, we propose a new Bayesian formulation that integrates the information associated with the P-wave polarisation into a probabilistic earthquake location scheme. The approach takes a single 3C-sensor perspective and uses the covariance matrix to quantify the polarisation. This matrix contains all necessary axial information including uncertainties. According to directional statistics, the tri-variate Gaussian distribution represented by the covariance matrix corresponds to an angular central Gaussian distribution when axial data are considered. This property allows us to define a simple probability density function associated with a modelled polarisation vector given the observed covariance matrix. With this approach, the non-linearity of the location problem is kept. Unlike existing least-square misfit functions, this formulation does not reduce the polarisation to a single axis and avoids inexact estimates of a priori angular uncertainties. Furthermore, it keeps the polarisation information in the spherical data space, which yields correct probability density normalisation and avoids the need for any weighting when combined with, e.g., the travel-time probability density function. We first present the Bayesian formalism. Then, several synthetic tests on a 1D velocity model are performed to illustrate the technique and to show the effect of integrating the polarisation information. In this synthetic test, we also compare the results with an
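
    A minimal sketch of the axial likelihood described above: the angular central Gaussian density scores a modelled polarisation axis against an observed 3×3 covariance matrix. The normalisation constant follows the standard form of the distribution; the matrix and axis values are hypothetical, not taken from the paper.

      import numpy as np
      from scipy.special import gammaln

      def acg_logpdf(axis, cov):
          # Log-density of the angular central Gaussian distribution for an axial unit
          # vector x given a (d x d) covariance-like matrix L:
          #   f(x; L) = Gamma(d/2) / (2 pi^(d/2)) * |L|^(-1/2) * (x' L^-1 x)^(-d/2)
          x = np.asarray(axis, dtype=float)
          x = x / np.linalg.norm(x)            # axial data: only the direction matters
          d = x.size
          L = np.asarray(cov, dtype=float)
          sign, logdet = np.linalg.slogdet(L)
          if sign <= 0:
              raise ValueError("covariance matrix must be positive definite")
          quad = x @ np.linalg.solve(L, x)
          log_norm = gammaln(d / 2.0) - np.log(2.0) - (d / 2.0) * np.log(np.pi)
          return log_norm - 0.5 * logdet - (d / 2.0) * np.log(quad)

      # Observed polarisation covariance (hypothetical) and a modelled P-wave axis.
      cov_obs = np.array([[0.9, 0.1, 0.0],
                          [0.1, 0.2, 0.0],
                          [0.0, 0.0, 0.1]])
      print(acg_logpdf([1.0, 0.2, 0.0], cov_obs))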

  18. Testing for ontological errors in probabilistic forecasting models of natural systems.

    PubMed

    Marzocchi, Warner; Jordan, Thomas H

    2014-08-19

    Probabilistic forecasting models describe the aleatory variability of natural systems as well as our epistemic uncertainty about how the systems work. Testing a model against observations exposes ontological errors in the representation of a system and its uncertainties. We clarify several conceptual issues regarding the testing of probabilistic forecasting models for ontological errors: the ambiguity of the aleatory/epistemic dichotomy, the quantification of uncertainties as degrees of belief, the interplay between Bayesian and frequentist methods, and the scientific pathway for capturing predictability. We show that testability of the ontological null hypothesis derives from an experimental concept, external to the model, that identifies collections of data, observed and not yet observed, that are judged to be exchangeable when conditioned on a set of explanatory variables. These conditional exchangeability judgments specify observations with well-defined frequencies. Any model predicting these behaviors can thus be tested for ontological error by frequentist methods; e.g., using P values. In the forecasting problem, prior predictive model checking, rather than posterior predictive checking, is desirable because it provides more severe tests. We illustrate experimental concepts using examples from probabilistic seismic hazard analysis. Severe testing of a model under an appropriate set of experimental concepts is the key to model validation, in which we seek to know whether a model replicates the data-generating process well enough to be sufficiently reliable for some useful purpose, such as long-term seismic forecasting. Pessimistic views of system predictability fail to recognize the power of this methodology in separating predictable behaviors from those that are not.
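
    A toy illustration (not the authors' procedure) of the frequentist testing idea: when an experimental concept makes event counts exchangeable over the testing period, a forecast expressed as a Poisson rate can be checked against the observed count with a p-value.

      from scipy.stats import poisson

      # The model forecasts lambda target events over the testing period (hypothetical numbers).
      forecast_rate = 4.2
      observed_count = 9

      # Probability, under the forecast, of a count at least as large as the one observed
      # (upper tail shown for brevity).
      p_value = poisson.sf(observed_count - 1, forecast_rate)
      print(f"P(N >= {observed_count} | forecast) = {p_value:.4f}")
      # A very small p-value exposes an ontological error: the data-generating process
      # is not captured by the forecast within its stated uncertainty.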

  19. Testing for ontological errors in probabilistic forecasting models of natural systems

    PubMed Central

    Marzocchi, Warner; Jordan, Thomas H.

    2014-01-01

    Probabilistic forecasting models describe the aleatory variability of natural systems as well as our epistemic uncertainty about how the systems work. Testing a model against observations exposes ontological errors in the representation of a system and its uncertainties. We clarify several conceptual issues regarding the testing of probabilistic forecasting models for ontological errors: the ambiguity of the aleatory/epistemic dichotomy, the quantification of uncertainties as degrees of belief, the interplay between Bayesian and frequentist methods, and the scientific pathway for capturing predictability. We show that testability of the ontological null hypothesis derives from an experimental concept, external to the model, that identifies collections of data, observed and not yet observed, that are judged to be exchangeable when conditioned on a set of explanatory variables. These conditional exchangeability judgments specify observations with well-defined frequencies. Any model predicting these behaviors can thus be tested for ontological error by frequentist methods; e.g., using P values. In the forecasting problem, prior predictive model checking, rather than posterior predictive checking, is desirable because it provides more severe tests. We illustrate experimental concepts using examples from probabilistic seismic hazard analysis. Severe testing of a model under an appropriate set of experimental concepts is the key to model validation, in which we seek to know whether a model replicates the data-generating process well enough to be sufficiently reliable for some useful purpose, such as long-term seismic forecasting. Pessimistic views of system predictability fail to recognize the power of this methodology in separating predictable behaviors from those that are not. PMID:25097265

  20. Exploring the uncertainties in cancer risk assessment using the integrated probabilistic risk assessment (IPRA) approach.

    PubMed

    Slob, Wout; Bakker, Martine I; Biesebeek, Jan Dirk Te; Bokkers, Bas G H

    2014-08-01

    Current methods for cancer risk assessment result in single values, without any quantitative information on the uncertainties in these values. Therefore, single risk values could easily be overinterpreted. In this study, we discuss a full probabilistic cancer risk assessment approach in which all the generally recognized uncertainties in both exposure and hazard assessment are quantitatively characterized and probabilistically evaluated, resulting in a confidence interval for the final risk estimate. The methodology is applied to three example chemicals (aflatoxin, N-nitrosodimethylamine, and methyleugenol). These examples illustrate that the uncertainty in a cancer risk estimate may be huge, making single value estimates of cancer risk meaningless. Further, a risk based on linear extrapolation tends to be lower than the upper 95% confidence limit of a probabilistic risk estimate, and in that sense it is not conservative. Our conceptual analysis showed that there are two possible basic approaches for cancer risk assessment, depending on the interpretation of the dose-incidence data measured in animals. However, it remains unclear which of the two interpretations is the more adequate one, adding an additional uncertainty to the already huge confidence intervals for cancer risk estimates. © 2014 Society for Risk Analysis.
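
    A schematic Monte Carlo version of the idea of propagating exposure and hazard uncertainty into a confidence interval on a risk metric; the lognormal distributions and all numbers are invented for illustration and are not taken from the paper.

      import numpy as np

      rng = np.random.default_rng(1)
      n = 200_000

      # Uncertain exposure (mg/kg bw/day) and uncertain potency, e.g. a benchmark-dose-like
      # dose associated with a fixed extra cancer incidence (both hypothetical lognormals).
      exposure = rng.lognormal(mean=np.log(1e-4), sigma=0.6, size=n)
      potency_dose = rng.lognormal(mean=np.log(0.5), sigma=0.9, size=n)

      # One simple risk metric: margin of exposure (larger = safer); its uncertainty is
      # summarised by a two-sided 90% confidence interval instead of a single value.
      moe = potency_dose / exposure
      lo, med, hi = np.percentile(moe, [5, 50, 95])
      print(f"margin of exposure: median {med:.0f}, 90% CI [{lo:.0f}, {hi:.0f}]")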

  1. Integration of Fuzzy and Probabilistic Information in the Description of Hydraulic Conductivity

    NASA Astrophysics Data System (ADS)

    Druschel, B.; Ozbek, M.; Pinder, G.

    2004-12-01

    Evaluation of the heterogeneity of hydraulic conductivity, K, is a well-known problem in groundwater hydrology. The open question is how to fully represent a given highly heterogeneous K field and its inherent uncertainty at least cost. Today, most K fields are analyzed using field test data and probability theory. Uncertainty is usually reported in the spatial covariance. In an attempt to develop a more cost effective method which still provides an accurate approximation of a K field, we propose using an evidence theory framework to merge probabilistic and fuzzy (or possibilistic) information in an effort to improve our ability to fully define a K field. The tool chosen to fuse probabilistic information obtained via experiment and subjective information provided by the groundwater professional is Dempster's Rule of Combination. In using this theory we must create mass assignments for our subject of interest, describing the degree of evidence that supports the presence of our subject in a particular set. These mass assignments can be created directly from the probabilistic information and, in the case of the subjective information, from feedback we obtain from an expert. The fusion of these two types of information provides a better description of uncertainty than would typically be available with just probability theory alone.
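
    The fusion operator named above, Dempster's rule of combination, can be written compactly for basic mass assignments over a small frame of discernment. The conductivity classes and mass values below are placeholders, not data from the study.

      from itertools import product

      def dempster_combine(m1, m2):
          # Combine two basic mass assignments (dicts mapping frozenset -> mass) with
          # Dempster's rule: normalise the conjunctive combination by 1 - K, where K is
          # the mass assigned to conflicting (empty) intersections.
          combined, conflict = {}, 0.0
          for (a, wa), (b, wb) in product(m1.items(), m2.items()):
              inter = a & b
              if inter:
                  combined[inter] = combined.get(inter, 0.0) + wa * wb
              else:
                  conflict += wa * wb
          if conflict >= 1.0:
              raise ValueError("total conflict: sources cannot be combined")
          return {s: w / (1.0 - conflict) for s, w in combined.items()}

      # Hypothetical evidence about the conductivity class at a location: one mass
      # assignment built from field-test probabilities, one from expert judgement.
      LOW, MED, HIGH = "low", "medium", "high"
      m_prob   = {frozenset({LOW}): 0.6, frozenset({MED}): 0.3, frozenset({LOW, MED, HIGH}): 0.1}
      m_expert = {frozenset({LOW, MED}): 0.7, frozenset({HIGH}): 0.2, frozenset({LOW, MED, HIGH}): 0.1}
      print(dempster_combine(m_prob, m_expert))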

  2. Validation of a probabilistic post-fire erosion model

    Treesearch

    Pete Robichaud; William J. Elliot; Sarah A. Lewis; Mary Ellen Miller

    2016-01-01

    Post-fire increases of runoff and erosion often occur and land managers need tools to be able to project the increased risk. The Erosion Risk Management Tool (ERMiT) uses the Water Erosion Prediction Project (WEPP) model as the underlying processor. ERMiT predicts the probability of a given amount of hillslope sediment delivery from a single rainfall or...

  3. Probabilistic Model of Fault Detection in Quantum Circuits

    NASA Astrophysics Data System (ADS)

    Banerjee, A.; Pathak, A.

    Since the introduction of quantum computation, several protocols (such as quantum cryptography, quantum algorithms, quantum teleportation) have established quantum computing as a superior future technology. Each of these processes involves quantum circuits, which are prone to different kinds of faults. Consequently, it is important to verify whether the circuit hardware is defective or not. The systematic procedure to do so is known as fault testing. Normally, testing is done by providing a set of valid input states, measuring the corresponding output states, and comparing the output states with the expected output states of the perfect (faultless) circuit. This particular set of input vectors is known as a test set [6]. If there exists a fault, then the next step would be to find the exact location and nature of the defect. This is known as fault localization. A model that explains the logical or functional faults in the circuit is a fault model. Conventional fault models include (i) stuck-at faults, (ii) bridge faults, and (iii) delay faults. These fault models have been rigorously studied for conventional irreversible circuits. But with the advent of reversible classical computing and quantum computing, it has become important to enlarge the domain of the study on test vectors.

  4. A simple probabilistic model of submicroscopic diatom morphogenesis

    PubMed Central

    Willis, L.; Cox, E. J.; Duke, T.

    2013-01-01

    Unicellular algae called diatoms morph biomineral compounds into tough exoskeletons via complex intracellular processes about which there is much to be learned. These exoskeletons feature a rich variety of structures from submicroscale to milliscale, many that have not been reproduced in vitro. In order to help understand this complex miniature morphogenesis, here we introduce and analyse a simple model of biomineral kinetics, focusing on the exoskeleton's submicroscopic patterned planar structures called pore occlusions. The model reproduces most features of these pore occlusions by retuning just one parameter, thereby indicating what physio-biochemical mechanisms could sufficiently explain morphogenesis at the submicroscopic scale: it is sufficient to identify a mechanism of lateral negative feedback on the biomineral reaction kinetics. The model is nonlinear and stochastic; it is an extended version of the threshold voter model. Its mean-field equation provides a simple and, as far as the authors are aware, new way of mapping out the spatial patterns produced by lateral inhibition and variants thereof. PMID:23554345

  5. Probabilistic uncertainty analysis of epidemiological modeling to guide public health intervention policy.

    PubMed

    Gilbert, Jennifer A; Meyers, Lauren Ancel; Galvani, Alison P; Townsend, Jeffrey P

    2014-03-01

    Mathematical modeling of disease transmission has provided quantitative predictions for health policy, facilitating the evaluation of epidemiological outcomes and the cost-effectiveness of interventions. However, typical sensitivity analyses of deterministic dynamic infectious disease models focus on model architecture and the relative importance of parameters but neglect parameter uncertainty when reporting model predictions. Consequently, model results that identify point estimates of intervention levels necessary to terminate transmission yield limited insight into the probability of success. We apply probabilistic uncertainty analysis to a dynamic model of influenza transmission and assess global uncertainty in outcome. We illustrate that when parameter uncertainty is not incorporated into outcome estimates, levels of vaccination and treatment predicted to prevent an influenza epidemic will only have an approximately 50% chance of terminating transmission and that sensitivity analysis alone is not sufficient to obtain this information. We demonstrate that accounting for parameter uncertainty yields probabilities of epidemiological outcomes based on the degree to which data support the range of model predictions. Unlike typical sensitivity analyses of dynamic models that only address variation in parameters, the probabilistic uncertainty analysis described here enables modelers to convey the robustness of their predictions to policy makers, extending the power of epidemiological modeling to improve public health. Copyright © 2013 The Authors. Published by Elsevier B.V. All rights reserved.
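
    A toy illustration of the point being made (not the influenza model of the paper): if epidemic termination is approximated by the threshold condition R0·(1 − coverage·efficacy) < 1, sampling the uncertain parameters turns a point prediction into a probability of success; all distributions below are invented.

      import numpy as np

      rng = np.random.default_rng(2)
      n = 100_000

      # Parameter uncertainty (hypothetical): basic reproduction number and vaccine efficacy.
      r0 = rng.triangular(1.2, 1.6, 2.4, size=n)
      efficacy = rng.beta(20, 5, size=n)       # mean ~0.8

      coverage = 0.45   # vaccination level suggested by a point-estimate analysis

      # Probability that transmission is terminated (effective reproduction number < 1),
      # accounting for parameter uncertainty rather than using point estimates only.
      r_eff = r0 * (1.0 - coverage * efficacy)
      print("P(epidemic averted) =", np.mean(r_eff < 1.0))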

  6. A family of analytical probabilistic models for urban stormwater management planning

    SciTech Connect

    Papa, F.; Adams, B.J.; Guo, Y.

    1998-07-01

    This paper presents the synthesis of over fifteen years of research on the topic of analytical probabilistic models, as an alternative approach to continuous simulation, that have been derived for the performance analysis of urban runoff quantity and quality control systems. These models overcome the limitations imposed by single event modeling through the use of long term rainfall records and are significantly more computationally efficient and less cumbersome than other methods of continuous analysis. These attributes promote the comprehensive analysis of drainage system design alternatives at the screening and planning levels.

  7. A tractable probabilistic model for Affymetrix probe-level analysis across multiple chips.

    PubMed

    Liu, Xuejun; Milo, Marta; Lawrence, Neil D; Rattray, Magnus

    2005-09-15

    Affymetrix GeneChip arrays are currently the most widely used microarray technology. Many summarization methods have been developed to provide gene expression levels from Affymetrix probe-level data. Most of the currently popular methods do not provide a measure of uncertainty for the expression level of each gene. The use of probabilistic models can overcome this limitation. A full hierarchical Bayesian approach requires the use of computationally intensive MCMC methods that are impractical for large datasets. An alternative computationally efficient probabilistic model, mgMOS, uses Gamma distributions to model specific and non-specific binding with a latent variable to capture variations in probe affinity. Although promising, the main limitations of this model are that it does not use information from multiple chips and does not account for specific binding to the mismatch (MM) probes. We extend mgMOS to model the binding affinity of probe-pairs across multiple chips and to capture the effect of specific binding to MM probes. The new model, multi-mgMOS, provides improved accuracy, as demonstrated on some bench-mark datasets and a real time-course dataset, and is much more computationally efficient than a competing hierarchical Bayesian approach that requires MCMC sampling. We demonstrate how the probabilistic model can be used to estimate credibility intervals for expression levels and their log-ratios between conditions. Both mgMOS and the new model multi-mgMOS have been implemented in an R package, which is available at http://www.bioinf.man.ac.uk/resources/puma.

  8. Probabilistic model of ligaments and tendons: Quasistatic linear stretching

    NASA Astrophysics Data System (ADS)

    Bontempi, M.

    2009-03-01

    Ligaments and tendons have a significant role in the musculoskeletal system and are frequently subjected to injury. This study presents a model of collagen fibers, based on the study of a statistical distribution of fibers when they are subjected to quasistatic linear stretching. With respect to other methodologies, this model is able to describe the behavior of the bundle using less ad hoc hypotheses and is able to describe all the quasistatic stretch-load responses of the bundle, including the yield and failure regions described in the literature. It has two other important results: the first is that it is able to correlate the mechanical behavior of the bundle with its internal structure, and it suggests a methodology to deduce the fibers population distribution directly from the tensile-test data. The second is that it can follow fibers’ structure evolution during the stretching and it is possible to study the internal adaptation of fibers in physiological and pathological conditions.

  9. Probabilistic model of ligaments and tendons: quasistatic linear stretching.

    PubMed

    Bontempi, M

    2009-03-01

    Ligaments and tendons have a significant role in the musculoskeletal system and are frequently subjected to injury. This study presents a model of collagen fibers, based on the study of a statistical distribution of fibers when they are subjected to quasistatic linear stretching. With respect to other methodologies, this model is able to describe the behavior of the bundle using less ad hoc hypotheses and is able to describe all the quasistatic stretch-load responses of the bundle, including the yield and failure regions described in the literature. It has two other important results: the first is that it is able to correlate the mechanical behavior of the bundle with its internal structure, and it suggests a methodology to deduce the fibers population distribution directly from the tensile-test data. The second is that it can follow fibers' structure evolution during the stretching and it is possible to study the internal adaptation of fibers in physiological and pathological conditions.

  10. Structural model integrity

    NASA Technical Reports Server (NTRS)

    Wallerstein, D. V.; Lahey, R. S.; Haggenmacher, G. W.

    1977-01-01

    Many of the practical aspects and problems of ensuring the integrity of a structural model are discussed, as well as the steps which have been taken in the NASTRAN system to assure that these checks can be routinely performed. Model integrity as used applies not only to the structural model but also to the loads applied to the model. Emphasis is also placed on the fact that when dealing with substructure analysis, all of the checking procedures discussed should be applied at the lowest level of substructure prior to any coupling.

  12. Using Structured Knowledge Representation for Context-Sensitive Probabilistic Modeling

    DTIC Science & Technology

    2008-01-01

    Gopnik, C. Glymour, D. M. Sobel, L. E. Schulz, T. Kushnir, D. Danks, A theory of causal learning in children: Causal maps and Bayes nets, Psychological...Morgan Kaufmann, 1988. [24] J. Pearl, Causality: Models, Reasoning, and Inference, Cambridge University Press, 2000. [25] J. Piaget , Piaget’s theory ...in the training data and captured by the learning algorithm in order to avoid overlooking hidden relationships. One implication of this is that when a

  13. Probabilistic Model of a Floating Target Behaviour in Rough Seas

    DTIC Science & Technology

    2013-07-01

    Honolulu, Hawaii, 1976, pp. 301-329. 10. Pierson, W. J., and Moskowitz, L . A proposed spectral form for fully developed wind seas based on the...175-181. 14. Torsethaugen, K. Simplified double peak spectral model for ocean waves. SINTEF Report STF80 A048052, SINTEF Fisheries and Aquaculture ...RELEASE USE ( L ) NEXT TO DOCUMENT CLASSIFICATION) Document (U) Title (U) Abstract (U) 4. AUTHOR(S) Rada Pushkarova 5

  14. PREMChlor: Probabilistic Remediation Evaluation Model for Chlorinated Solvents

    DTIC Science & Technology

    2010-03-01

    or reduce the source contaminant loading to the plume, source containment methods, such as slurry walls, clay caps and sealable joint sheet pile walls...at chlorinated solvent release sites, respectively. Several three-dimensional multiphase numerical models...additional decay [Falta et al., 2005a and Falta, 2008]: dM(t)/dt = −Q(t)C_s(t) − λ_s M(t) (1), where Q(t) is the water flow rate through the source

  15. Probabilistic Uncertainty of Parameters and Conceptual Models in Geophysical Inversion

    NASA Astrophysics Data System (ADS)

    Sambridge, M.; Hawkins, R.; Dettmer, J.

    2016-12-01

    Stochastic uncertainty in parameters estimated from geophysical observations has a long history. In the situation where the data model relationship is linear or may be linearized, and data noise can be characterized, then in principle the uncertainty can be estimated in a straightforward manner. In the optimistic case where data noise can be assumed to follow Gaussian errors with known variances and co-variances then much favoured matrix expressions are available that quantify stochastic model uncertainty for linear problems. As the number of data or unknowns increase, nonlinearity and/or non-uniqueness can become severe, or knowledge of data errors itself becomes uncertain, then there are significant practical challenges in the computation and interpretation of uncertainty. These challenges are well known and much effort has recently been devoted to finding efficient ways to quantify uncertainty for such cases. A major aspect of uncertainty that is often acknowledged but seldom addressed is conceptual uncertainty in the inversion process itself. By this we mean assumptions about the physics, chemistry or geology captured in the forward problem, assumptions about the level or type of data noise, and assumptions about the appropriate complexity and form of the model parameterization. Conceptual assumptions are made in building the inference framework in the first place and conceptual uncertainty can have a significant influence on and feedback with uncertainty quantification. This area is receiving increasing attention in the geosciences utilizing techniques from the field of computational Bayesian statistics, where they are referred to as model selection. This presentation will summarize recent, and not so recent, developments in this field, and point to some promising directions.

  16. Real-time Probabilistic Covariance Tracking with Efficient Model Update

    DTIC Science & Technology

    2012-05-01

    computational complexity, resulting in a real-time performance. The covariance-based representation and ICTL are then combined with the particle filter ...Terms—Visual tracking, particle filter , covariance descriptor, Riemannian manifolds, incremental learning, model update. F 1 INTRODUCTION Visual tracking...which is the main contribution of our work. Further, our ICTL method uses a particle filter [13] for motion parameter estimation rather than the

  17. The implicit possibility of dualism in quantum probabilistic cognitive modeling.

    PubMed

    Mender, Donald

    2013-06-01

    Pothos & Busemeyer (P&B) argue convincingly that quantum probability offers an improvement over classical Bayesian probability in modeling the empirical data of cognitive science. However, a weakness related to restrictions on the dimensionality of incompatible physical observables flows from the authors' "agnosticism" regarding quantum processes in neural substrates underlying cognition. Addressing this problem will require either future research findings validating quantum neurophysics or theoretical expansion of the uncertainty principle as a new, neurocognitively contextualized, "local" symmetry.

  18. Probabilistic Modeling and Simulation of Metal Fatigue Life Prediction

    DTIC Science & Technology

    2002-09-01

    distribution demonstrate the central limit theorem? Obviously not! This is much the same as materials testing. If only NBA basketball stars are...60 near the exit of a NBA locker room. There would obviously be some pseudo-normal distribution with a very small standard deviation. The mean...completed, the investigators must understand how the midgets and the NBA stars will affect the total solution. D. IT IS MUCH SIMPLER TO MODEL THE

  19. Probabilistic Modeling Approach to Thermoelectric Systems Design Optimization

    SciTech Connect

    Karri, Naveen K.; Hendricks, Terry J.

    2007-06-25

    Recent studies on thermoelectric (TE) systems indicate that the existence of high figure of merit (ZT) materials alone is not sufficient for superior system performance and that an integrated system-level analysis is necessary to attain such performance. This is because there are numerous design parameters at various levels of the system that are randomly variable in nature and that could affect the overall system performance. In this work the effect of stochasticity in design variables at various levels of a TE system has been studied and analyzed to attain optimal design solutions. Starting with stochasticity in one of the environmental variables, a progression was made towards studying the coupled effects of stochasticity in multiple variables at the environmental and heat exchanger levels of a thermoelectric generator (TEG) system. Research and analysis tools were developed to incorporate stochasticities in single or multiple variables, individually or simultaneously, to study both the individual and coupled effects of input design variable stochasticities (probabilities) on output performance variables. Results indicate that normal or Gaussian distributions in input design parameters may not produce Gaussian output parameters. Also, when the stochasticities in multiple variables are coupled, the standard deviations in performance parameters are magnified, and their means/averages deviate more from the deterministic values. Although more studies are required to quantify the parameters for design modifications, the studies presented in this paper affirm that incorporating stochastic variability not only aids in understanding the effects of system design variable randomness on expected output performance, but also serves to guide design decisions for optimal TE system design solutions that provide more robust system designs with improved reliability and performance across a range of off-nominal conditions.
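
    A small numerical illustration of the observation that Gaussian inputs need not yield Gaussian outputs: normally distributed inputs are propagated through a deliberately nonlinear placeholder performance function (not a TE design code) and the skewness of the output is inspected; all values are invented.

      import numpy as np
      from scipy.stats import skew

      rng = np.random.default_rng(3)
      n = 50_000

      # Gaussian uncertainty in two illustrative design inputs (hypothetical values):
      # hot-to-cold temperature difference and heat-exchanger thermal resistance.
      delta_t = rng.normal(150.0, 15.0, n)      # K
      resistance = rng.normal(0.50, 0.05, n)    # K/W

      # A deliberately nonlinear placeholder performance function.
      power = delta_t**2 / (resistance * 1000.0)

      print("mean power:", power.mean(), "W")
      print("skewness  :", skew(power), "(nonzero => non-Gaussian output)")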

  20. [Probabilistic models of mortality for patients hospitalized in conventional units].

    PubMed

    Rué, M; Roqué, M; Solà, J; Macià, M

    2001-09-29

    We have developed a tool to measure disease severity of patients hospitalized in conventional units in order to evaluate and compare the effectiveness and quality of health care in our setting. A total of 2,274 adult patients admitted consecutively to inpatient units from the Medicine, Surgery and Orthopaedic Surgery, and Trauma Departments of the Corporació Sanitària Parc Taulí of Sabadell, Spain, between November 1, 1997 and September 30, 1998 were included. The following variables were collected: demographic data, previous health state, substance abuse, comorbidity prior to admission, characteristics of the admission, clinical parameters within the first 24 hours of admission, laboratory results and data from the Basic Minimum Data Set of hospital discharges. Multiple logistic regression analysis was used to develop mortality probability models during the hospital stay. The mortality probability model at admission (MPMHOS-0) contained 7 variables associated with mortality during hospital stay: age, urgent admission, chronic cardiac insufficiency, chronic respiratory insufficiency, chronic liver disease, neoplasm, and dementia syndrome. The mortality probability model at 24-48 hours from admission (MPMHOS-24) contained 9 variables: those included in the MPMHOS-0 plus two statistically significant laboratory variables: hemoglobin and creatinine. Severity measures, in particular those presented in this study, can be helpful for the interpretation of hospital mortality rates and can guide mortality or quality committees at the time of investigating health care-related problems.
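
    A minimal sketch of how a mortality probability model of this kind can be fitted by multiple logistic regression; the synthetic data frame and its column names are hypothetical stand-ins for the admission variables listed above.

      import numpy as np
      import pandas as pd
      from sklearn.linear_model import LogisticRegression
      from sklearn.metrics import roc_auc_score
      from sklearn.model_selection import train_test_split

      # Synthetic stand-in for the admission dataset (columns mirror the MPMHOS-0 predictors).
      rng = np.random.default_rng(0)
      n = 2000
      df = pd.DataFrame({
          "age": rng.integers(18, 95, n),
          "urgent_admission": rng.integers(0, 2, n),
          "chronic_cardiac_insufficiency": rng.integers(0, 2, n),
          "chronic_respiratory_insufficiency": rng.integers(0, 2, n),
          "chronic_liver_disease": rng.integers(0, 2, n),
          "neoplasm": rng.integers(0, 2, n),
          "dementia": rng.integers(0, 2, n),
      })
      # Toy outcome model used only to generate labels for the demonstration.
      logit = -6.0 + 0.05 * df["age"] + 0.8 * df["neoplasm"] + 0.6 * df["dementia"]
      df["died_in_hospital"] = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

      X_train, X_test, y_train, y_test = train_test_split(
          df.drop(columns="died_in_hospital"), df["died_in_hospital"], test_size=0.3, random_state=0)

      model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
      p_death = model.predict_proba(X_test)[:, 1]     # estimated probability of death during the stay
      print("AUC on held-out admissions:", round(roc_auc_score(y_test, p_death), 3))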

  1. Rock penetration : finite element sensitivity and probabilistic modeling analyses.

    SciTech Connect

    Fossum, Arlo Frederick

    2004-08-01

    This report summarizes numerical analyses conducted to assess the relative importance on penetration depth calculations of rock constitutive model physics features representing the presence of microscale flaws such as porosity and networks of microcracks and rock mass structural features. Three-dimensional, nonlinear, transient dynamic finite element penetration simulations are made with a realistic geomaterial constitutive model to determine which features have the most influence on penetration depth calculations. A baseline penetration calculation is made with a representative set of material parameters evaluated from measurements made from laboratory experiments conducted on a familiar sedimentary rock. Then, a sequence of perturbations of various material parameters allows an assessment to be made of the main penetration effects. A cumulative probability distribution function is calculated with the use of an advanced reliability method that makes use of this sensitivity database, probability density functions, and coefficients of variation of the key controlling parameters for penetration depth predictions. Thus the variability of the calculated penetration depth is known as a function of the variability of the input parameters. This simulation modeling capability should impact significantly the tools that are needed to design enhanced penetrator systems, support weapons effects studies, and directly address proposed HDBT defeat scenarios.

  2. Probabilistic image modeling with an extended chain graph for human activity recognition and image segmentation.

    PubMed

    Zhang, Lei; Zeng, Zhi; Ji, Qiang

    2011-09-01

    Chain graph (CG) is a hybrid probabilistic graphical model (PGM) capable of modeling heterogeneous relationships among random variables. So far, however, its application in image and video analysis is very limited due to lack of principled learning and inference methods for a CG of general topology. To overcome this limitation, we introduce methods to extend the conventional chain-like CG model to CG model with more general topology and the associated methods for learning and inference in such a general CG model. Specifically, we propose techniques to systematically construct a generally structured CG, to parameterize this model, to derive its joint probability distribution, to perform joint parameter learning, and to perform probabilistic inference in this model. To demonstrate the utility of such an extended CG, we apply it to two challenging image and video analysis problems: human activity recognition and image segmentation. The experimental results show improved performance of the extended CG model over the conventional directed or undirected PGMs. This study demonstrates the promise of the extended CG for effective modeling and inference of complex real-world problems.

  3. Kinetic modeling based probabilistic segmentation for molecular images.

    PubMed

    Saad, Ahmed; Hamarneh, Ghassan; Möller, Torsten; Smith, Ben

    2008-01-01

    We propose a semi-supervised, kinetic modeling based segmentation technique for molecular imaging applications. It is an iterative, self-learning algorithm based on uncertainty principles, designed to alleviate low signal-to-noise ratio (SNR) and partial volume effect (PVE) problems. Synthetic fluorodeoxyglucose (FDG) and simulated Raclopride dynamic positron emission tomography (dPET) brain images with excessive noise levels are used to validate our algorithm. We show, qualitatively and quantitatively, that our algorithm outperforms state-of-the-art techniques in identifying different functional regions and recovering the kinetic parameters.

  4. Mixture models and the probabilistic structure of depth cues.

    PubMed

    Knill, David C

    2003-03-01

    Monocular cues to depth derive their informativeness from a combination of perspective projection and prior constraints on the way scenes in the world are structured. For many cues, the appropriate priors are best described as mixture models, each of which characterizes a different category of objects, surfaces, or scenes. This paper provides a Bayesian analysis of the resulting model selection problem, showing how the mixed structure of priors creates the potential for non-linear, cooperative interactions between cues and how the information provided by a single cue can effectively determine the appropriate constraint to apply to a given image. The analysis also leads to a number of psychophysically testable predictions. We test these predictions by applying the framework to the problem of perceiving planar surface orientation from texture. A number of psychophysical experiments are described that show that the visual system is biased to interpret textures as isotropic, but that when sufficient image data is available, the system effectively turns off the isotropy constraint and interprets texture information using only a homogeneity assumption. Human performance is qualitatively similar to an optimal estimator that assumes a mixed prior on surface textures--some proportion being isotropic and homogeneous and some proportion being anisotropic and homogeneous.

  5. Probabilistic Modeling Of Ocular Biomechanics In VIIP: Risk Stratification

    NASA Technical Reports Server (NTRS)

    Feola, A.; Myers, J. G.; Raykin, J.; Nelson, E. S.; Mulugeta, L.; Samuels, B.; Ethier, C. R.

    2016-01-01

    Visual Impairment and Intracranial Pressure (VIIP) syndrome is a major health concern for long-duration space missions. Currently, it is thought that a cephalad fluid shift in microgravity causes elevated intracranial pressure (ICP) that is transmitted along the optic nerve sheath (ONS). We hypothesize that this in turn leads to alteration and remodeling of connective tissue in the posterior eye which impacts vision. Finite element (FE) analysis is a powerful tool for examining the effects of mechanical loads in complex geometries. Our goal is to build a FE analysis framework to understand the response of the lamina cribrosa and optic nerve head to elevations in ICP in VIIP. To simulate the effects of different pressures on tissues in the posterior eye, we developed a geometric model of the posterior eye and optic nerve sheath and used a Latin hypercube sampling/partial rank correlation coefficient (LHS/PRCC) approach to assess the influence of uncertainty in our input parameters (i.e. pressures and material properties) on the peak strains within the retina, lamina cribrosa and optic nerve. The LHS/PRCC approach was repeated for three relevant ICP ranges, corresponding to upright and supine posture on earth, and microgravity [1]. At each ICP condition we used intraocular pressure (IOP) and mean arterial pressure (MAP) measurements of in-flight astronauts provided by the Lifetime Surveillance of Astronaut Health Program, NASA Johnson Space Center. The lamina cribrosa, optic nerve, retinal vessel and retina were modeled as linear-elastic materials, while other tissues were modeled as a Mooney-Rivlin solid (representing ground substance, stiffness parameter c1) with embedded collagen fibers (stiffness parameters c3, c4 and c5). Geometry creation and mesh generation were done in Gmsh [2], while FEBio was used for all FE simulations [3]. The LHS/PRCC approach resulted in correlation coefficients in the range of 1. To assess the relative influence of the uncertainty in an input parameter on
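
    A compact sketch of the LHS/PRCC workflow named above, with an arbitrary analytic function standing in for the finite element model; the input names, ranges and the placeholder model are illustrative only.

      import numpy as np
      from scipy.stats import qmc, rankdata

      rng = np.random.default_rng(4)
      names = ["ICP", "IOP", "c1", "c3"]                  # a subset of inputs, for illustration
      lower = np.array([5.0, 10.0, 0.1, 1.0])
      upper = np.array([25.0, 25.0, 1.0, 10.0])

      # 1) Latin hypercube sample of the input space.
      sample = qmc.LatinHypercube(d=len(names), seed=4).random(n=500)
      X = qmc.scale(sample, lower, upper)

      # 2) Placeholder "model": peak strain as an arbitrary nonlinear function of the inputs.
      y = 0.02 * X[:, 0] - 0.01 * X[:, 1] + 0.3 / X[:, 2] + rng.normal(0, 0.1, len(X))

      # 3) PRCC: rank-transform, remove the linear effect of the other inputs, correlate residuals.
      def prcc(X, y):
          R = np.column_stack([rankdata(col) for col in X.T])
          ry = rankdata(y)
          out = []
          for j in range(R.shape[1]):
              others = np.column_stack([np.ones(len(ry)), np.delete(R, j, axis=1)])
              res_x = R[:, j] - others @ np.linalg.lstsq(others, R[:, j], rcond=None)[0]
              res_y = ry - others @ np.linalg.lstsq(others, ry, rcond=None)[0]
              out.append(np.corrcoef(res_x, res_y)[0, 1])
          return out

      for name, c in zip(names, prcc(X, y)):
          print(f"PRCC({name}) = {c:+.2f}")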

  6. Probabilistic Multi-Factor Interaction Model for Complex Material Behavior

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Abumeri, Galib H.

    2008-01-01

    The Multi-Factor Interaction Model (MFIM) is used to evaluate the divot weight (foam weight ejected) from the launch external tanks. The multi-factor has sufficient degrees of freedom to evaluate a large number of factors that may contribute to the divot ejection. It also accommodates all interactions by its product form. Each factor has an exponent that satisfies only two points the initial and final points. The exponent describes a monotonic path from the initial condition to the final. The exponent values are selected so that the described path makes sense in the absence of experimental data. In the present investigation, the data used was obtained by testing simulated specimens in launching conditions. Results show that the MFIM is an effective method of describing the divot weight ejected under the conditions investigated.

  7. Probabilistic Multi-Factor Interaction Model for Complex Material Behavior

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Abumeri, Galib H.

    2008-01-01

    The Multi-Factor Interaction Model (MFIM) is used to evaluate the divot weight (foam weight ejected) from the launch external tanks. The multi-factor has sufficient degrees of freedom to evaluate a large number of factors that may contribute to the divot ejection. It also accommodates all interactions by its product form. Each factor has an exponent that satisfies only two points, the initial and final points. The exponent describes a monotonic path from the initial condition to the final. The exponent values are selected so that the described path makes sense in the absence of experimental data. In the present investigation the data used was obtained by testing simulated specimens in launching conditions. Results show that the MFIM is an effective method of describing the divot weight ejected under the conditions investigated.

  8. Probabilistic models of genetic variation in structured populations applied to global human studies.

    PubMed

    Hao, Wei; Song, Minsun; Storey, John D

    2016-03-01

    Modern population genetics studies typically involve genome-wide genotyping of individuals from a diverse network of ancestries. An important problem is how to formulate and estimate probabilistic models of observed genotypes that account for complex population structure. The most prominent work on this problem has focused on estimating a model of admixture proportions of ancestral populations for each individual. Here, we instead focus on modeling variation of the genotypes without requiring a higher-level admixture interpretation. We formulate two general probabilistic models, and we propose computationally efficient algorithms to estimate them. First, we show how principal component analysis can be utilized to estimate a general model that includes the well-known Pritchard-Stephens-Donnelly admixture model as a special case. Noting some drawbacks of this approach, we introduce a new 'logistic factor analysis' framework that seeks to directly model the logit transformation of probabilities underlying observed genotypes in terms of latent variables that capture population structure. We demonstrate these advances on data from the Human Genome Diversity Panel and 1000 Genomes Project, where we are able to identify SNPs that are highly differentiated with respect to structure while making minimal modeling assumptions. A Bioconductor R package called lfa is available at http://www.bioconductor.org/packages/release/bioc/html/lfa.html. Contact: jstorey@princeton.edu. Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press.

  9. A Bayesian Approach to Integrate Real-Time Data into Probabilistic Risk Analysis of Remediation Efforts in NAPL Sites

    NASA Astrophysics Data System (ADS)

    Fernandez-Garcia, D.; Sanchez-Vila, X.; Bolster, D.; Tartakovsky, D. M.

    2010-12-01

    The release of non-aqueous phase liquids (NAPLs) such as petroleum hydrocarbons and chlorinated solvents in the subsurface is a severe source of groundwater and vapor contamination. Because these liquids are essentially immiscible due to low solubility, these contaminants get slowly dissolved in groundwater and/or volatilized in the vadose zone, threatening the environment and public health over a long period. Many remediation technologies and strategies have been developed in the last decades for restoring the water quality properties of these contaminated sites. The failure of an on-site treatment technology application is often due to the unnoticed presence of dissolved NAPL entrapped in low permeability areas (heterogeneity) and/or the persistence of substantial amounts of pure phase after remediation efforts. Full understanding of the impact of remediation efforts is complicated due to the role of many interlinked physical and biochemical processes taking place through several potential pathways of exposure to multiple receptors in a highly unknown heterogeneous environment. Due to these difficulties, the design of remediation strategies and definition of remediation endpoints have been traditionally determined without quantifying the risk associated with the failure of such efforts. We conduct a probabilistic risk analysis (PRA) of the likelihood of success of an on-site NAPL treatment technology that easily integrates all aspects of the problem (causes, pathways, and receptors) without doing extensive modeling. Importantly, the method is further capable of incorporating the inherent uncertainty that often exists in the exact location where the dissolved NAPL plume leaves the source zone. This is achieved by describing the failure of the system as a function of this source zone exit location, parameterized in terms of a vector of parameters. Using a Bayesian interpretation of the system and by means of the posterior multivariate distribution, the failure of the

  10. Probabilistic Multi-Factor Interaction Model for Complex Material Behavior

    NASA Technical Reports Server (NTRS)

    Abumeri, Galib H.; Chamis, Christos C.

    2010-01-01

    Complex material behavior is represented by a single equation of product form to account for interaction among the various factors. The factors are selected by the physics of the problem and the environment that the model is to represent. For example, different factors will be required to represent temperature, moisture, erosion, corrosion, etc. It is important that the equation represent the physics of the behavior in its entirety accurately. The Multi-Factor Interaction Model (MFIM) is used to evaluate the divot weight (foam weight ejected) from the external launch tanks. The multi-factor has sufficient degrees of freedom to evaluate a large number of factors that may contribute to the divot ejection. It also accommodates all interactions by its product form. Each factor has an exponent that satisfies only two points - the initial and final points. The exponent describes a monotonic path from the initial condition to the final. The exponent values are selected so that the described path makes sense in the absence of experimental data. In the present investigation, the data used were obtained by testing simulated specimens in launching conditions. Results show that the MFIM is an effective method of describing the divot weight ejected under the conditions investigated. The problem lies in how to represent the divot weight with a single equation. A unique solution to this problem is a multi-factor equation of product form. Each factor is of the form (1 − x_i/x_f)^e_i, where x_i is the initial value, usually at ambient conditions, x_f the final value, and e_i the exponent that makes the represented curve unimodal while meeting the initial and final values. The exponents are either evaluated from test data or by technical judgment. A minor disadvantage may be the selection of exponents in the absence of any empirical data. This form has been used successfully in describing the foam ejected in simulated space environmental conditions. Seven factors were required
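
    The product form described above can be written as a one-line function; the factor tuples below (initial value, final value, exponent) are placeholders, not values from the investigation.

      import numpy as np

      def mfim(factors):
          # Product-form multi-factor interaction model: each factor contributes a term
          # (1 - x0/xf)**e, with x0 the initial (ambient) value, xf the final value, and
          # e an exponent fixed from test data or technical judgment.
          return float(np.prod([(1.0 - x0 / xf) ** e for (x0, xf, e) in factors]))

      # Hypothetical factors: (initial value, final value, exponent).
      factors = [
          (295.0, 320.0, 0.5),    # e.g. a temperature-like factor
          (0.3, 1.0, 1.5),        # e.g. a moisture-like factor
          (2.0, 10.0, 0.25),      # e.g. a load-like factor
      ]
      print(mfim(factors))   # multiplies a reference quantity such as a divot weight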

  11. Multimodal probabilistic generative models for time-course gene expression data and Gene Ontology (GO) tags.

    PubMed

    Gabbur, Prasad; Hoying, James; Barnard, Kobus

    2015-10-01

    We propose four probabilistic generative models for simultaneously modeling gene expression levels and Gene Ontology (GO) tags. Unlike previous approaches for using GO tags, the joint modeling framework allows the two sources of information to complement and reinforce each other. We fit our models to three time-course datasets collected to study biological processes, specifically blood vessel growth (angiogenesis) and mitotic cell cycles. The proposed models result in a joint clustering of genes and GO annotations. Different models group genes based on GO tags and their behavior over the entire time-course, within biological stages, or even individual time points. We show how such models can be used for biological stage boundary estimation de novo. We also evaluate our models on biological stage prediction accuracy of held out samples. Our results suggest that the models usually perform better when GO tag information is included. Copyright © 2015 Elsevier Inc. All rights reserved.

  12. A probabilistic model of emphysema based on granulometry analysis

    NASA Astrophysics Data System (ADS)

    Marcos, J. V.; Nava, R.; Cristobal, G.; Munoz-Barrutia, A.; Escalante-Ramírez, B.; Ortiz-de-Solórzano, C.

    2013-11-01

    Emphysema is associated with the destruction of lung parenchyma, resulting in abnormal enlargement of airspaces. Accurate quantification of emphysema is required for a better understanding of the disease as well as for the assessment of drugs and treatments. In the present study, a novel method for emphysema characterization from histological lung images is proposed. Elastase-induced mice were used to simulate the effect of emphysema on the lungs. A database composed of 50 normal and 50 emphysematous lung patches of size 512 x 512 pixels was used in our experiments. The purpose is to automatically identify those patches containing emphysematous tissue. The proposed approach is based on the use of granulometry analysis, which provides the pattern spectrum describing the distribution of airspaces in the lung region under evaluation. The profile of the spectrum was summarized by a set of statistical features. A logistic regression model was then used to estimate the probability for a patch to be emphysematous from this feature set. An accuracy of 87% was achieved by our method in the classification between normal and emphysematous samples. This result shows the utility of our granulometry-based method to quantify the lesions due to emphysema.
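
    A minimal granulometry sketch in the spirit of the method described above: successive morphological openings of a binary airspace mask yield a pattern spectrum whose summary statistics could feed the logistic regression. The image is synthetic and the feature choice is illustrative, not the authors' pipeline.

      import numpy as np
      from scipy import ndimage

      def pattern_spectrum(mask, max_size=20):
          # Granulometry of a binary airspace mask: the pattern spectrum is the amount of
          # foreground removed by openings with structuring elements of increasing size.
          areas = [mask.sum()]
          for size in range(1, max_size + 1):
              opened = ndimage.binary_opening(mask, structure=np.ones((size, size)))
              areas.append(opened.sum())
          return -np.diff(np.array(areas, dtype=float))   # mass removed at each scale

      # Synthetic "lung patch": random airspaces of mixed sizes (stand-in for real data).
      rng = np.random.default_rng(5)
      seeds = rng.random((256, 256)) < 0.002
      mask = ndimage.binary_dilation(seeds, iterations=4)

      spectrum = pattern_spectrum(mask)
      features = [spectrum.argmax(), spectrum.mean(), spectrum.std()]   # simple summaries
      print("pattern spectrum features:", features)
      # In the study, features of this kind are fed to a logistic regression that outputs
      # the probability that a patch is emphysematous.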

  13. An application of probabilistic safety assessment methods to model aircraft systems and accidents

    SciTech Connect

    Martinez-Guridi, G.; Hall, R.E.; Fullwood, R.R.

    1998-08-01

    A case study modeling the thrust reverser system (TRS) in the context of the fatal accident of a Boeing 767 is presented to illustrate the application of Probabilistic Safety Assessment methods. A simplified risk model consisting of an event tree with supporting fault trees was developed to represent the progression of the accident, taking into account the interaction between the TRS and the operating crew during the accident, and the findings of the accident investigation. A feasible sequence of events leading to the fatal accident was identified. Several insights about the TRS and the accident were obtained by applying PSA methods. Changes proposed for the TRS also are discussed.

  14. A probabilistic model for risk assessment of residual host cell DNA in biological products.

    PubMed

    Yang, Harry; Zhang, Lanju; Galinski, Mark

    2010-04-26

    Biological products such as viral vaccines manufactured in cells contain residual DNA derived from the host cell substrates used in production. It is theoretically possible that the residual DNA could transmit activated oncogenes and/or latent infectious viral genomes to subjects receiving the product, and induce oncogenic or infective events. A probabilistic model to estimate the risks due to residual DNA is proposed. The model takes account of the enzyme inactivation process. It allows for more accurate risk assessment when compared to methods currently in use. An application of the method to determine the safety factor of a vaccine product is provided.

  15. A probabilistic method for constructing wave time-series at inshore locations using model scenarios

    USGS Publications Warehouse

    Long, Joseph W.; Plant, Nathaniel G.; Dalyander, P. Soupy; Thompson, David M.

    2014-01-01

    Continuous time-series of wave characteristics (height, period, and direction) are constructed using a base set of model scenarios and simple probabilistic methods. This approach utilizes an archive of computationally intensive, highly spatially resolved numerical wave model output to develop time-series of historical or future wave conditions without performing additional, continuous numerical simulations. The archive of model output contains wave simulations from a set of model scenarios derived from an offshore wave climatology. Time-series of wave height, period, direction, and associated uncertainties are constructed at locations included in the numerical model domain. The confidence limits are derived using statistical variability of oceanographic parameters contained in the wave model scenarios. The method was applied to a region in the northern Gulf of Mexico and assessed using wave observations at 12 m and 30 m water depths. Prediction skill for significant wave height is 0.58 and 0.67 at the 12 m and 30 m locations, respectively, with similar performance for wave period and direction. The skill of this simplified, probabilistic time-series construction method is comparable to existing large-scale, high-fidelity operational wave models but provides higher spatial resolution output at low computational expense. The constructed time-series can be developed to support a variety of applications including climate studies and other situations where a comprehensive survey of wave impacts on the coastal area is of interest.

  16. Probabilistic topic modeling for the analysis and classification of genomic sequences

    PubMed Central

    2015-01-01

    Background Studies on genomic sequences for classification and taxonomic identification have a leading role in the biomedical field and in the analysis of biodiversity. These studies are focusing on the so-called barcode genes, representing a well defined region of the whole genome. Recently, alignment-free techniques are gaining more importance because they are able to overcome the drawbacks of sequence alignment techniques. In this paper a new alignment-free method for DNA sequences clustering and classification is proposed. The method is based on k-mers representation and text mining techniques. Methods The presented method is based on Probabilistic Topic Modeling, a statistical technique originally proposed for text documents. Probabilistic topic models are able to find in a document corpus the topics (recurrent themes) characterizing classes of documents. This technique, applied on DNA sequences representing the documents, exploits the frequency of fixed-length k-mers and builds a generative model for a training group of sequences. This generative model, obtained through the Latent Dirichlet Allocation (LDA) algorithm, is then used to classify a large set of genomic sequences. Results and conclusions We performed classification of over 7000 16S DNA barcode sequences taken from the Ribosomal Database Project (RDP) repository, training probabilistic topic models. The proposed method is compared to the RDP tool and the Support Vector Machine (SVM) classification algorithm in an extensive set of trials using both complete sequences and short sequence snippets (from 400 bp to 25 bp). Our method reaches very similar results to the RDP classifier and SVM for complete sequences. The most interesting results are obtained when short sequence snippets are considered. In these conditions the proposed method outperforms RDP and SVM with ultra-short sequences and it exhibits a smooth decrease of performance, at every taxonomic level, when the sequence length is decreased. PMID:25916734
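
    A compact sketch of the pipeline described above: character k-mer counts feed a Latent Dirichlet Allocation model, and the resulting topic proportions serve as features for a classifier. The sequences, labels and all settings below are toy placeholders, not the study's data or configuration.

      from sklearn.feature_extraction.text import CountVectorizer
      from sklearn.decomposition import LatentDirichletAllocation
      from sklearn.linear_model import LogisticRegression

      # Toy stand-ins for 16S barcode sequences and their taxonomic labels.
      sequences = ["ACGTACGTGGCCTTAA", "ACGTTTGGCCAAGGTT", "TTGGCCAAGGTTACGT", "GGCCTTAAACGTACGT"]
      labels = ["genus_A", "genus_B", "genus_B", "genus_A"]

      # k-mer (here 4-mer) counting via character n-grams.
      vectorizer = CountVectorizer(analyzer="char", ngram_range=(4, 4), lowercase=False)
      X = vectorizer.fit_transform(sequences)

      # Probabilistic topic model over k-mer counts; topic mixtures become the features.
      lda = LatentDirichletAllocation(n_components=2, random_state=0)
      topic_features = lda.fit_transform(X)

      clf = LogisticRegression().fit(topic_features, labels)
      print(clf.predict(lda.transform(vectorizer.transform(["ACGTACGTGGCCAAGG"]))))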

  17. A probabilistic method to report predictions from a human liver microsomes stability QSAR model: a practical tool for drug discovery

    NASA Astrophysics Data System (ADS)

    Aliagas, Ignacio; Gobbi, Alberto; Heffron, Timothy; Lee, Man-Ling; Ortwine, Daniel F.; Zak, Mark; Khojasteh, S. Cyrus

    2015-04-01

    Using data from the in vitro liver microsomes metabolic stability assay, we have developed QSAR models to predict in vitro human clearance. Models were trained using in house high-throughput assay data reported as the predicted human hepatic clearance by liver microsomes or pCLh. Machine learning regression methods were used to generate the models. Model output for a given molecule was reported as its probability of being metabolically stable, thus allowing for synthesis prioritization based on this prediction. Use of probability, instead of the regression value or categories, has been found to be an efficient way for both reporting and assessing predictions. Model performance is evaluated using prospective validation. These models have been integrated into a number of desktop tools, and the models are routinely used to prioritize the synthesis of compounds. We discuss two therapeutic projects at Genentech that exemplify the benefits of a probabilistic approach in applying the models. A three-year retrospective analysis of measured liver microsomes stability data on all registered compounds at Genentech reveals that the use of these models has resulted in an improved metabolic stability profile of synthesized compounds.

  18. A probabilistic method to report predictions from a human liver microsomes stability QSAR model: a practical tool for drug discovery.

    PubMed

    Aliagas, Ignacio; Gobbi, Alberto; Heffron, Timothy; Lee, Man-Ling; Ortwine, Daniel F; Zak, Mark; Khojasteh, S Cyrus

    2015-04-01

    Using data from the in vitro liver microsomes metabolic stability assay, we have developed QSAR models to predict in vitro human clearance. Models were trained using in-house high-throughput assay data reported as the predicted human hepatic clearance by liver microsomes, or pCLh. Machine learning regression methods were used to generate the models. Model output for a given molecule was reported as its probability of being metabolically stable, thus allowing for synthesis prioritization based on this prediction. Use of probability, instead of the regression value or categories, has been found to be an efficient way for both reporting and assessing predictions. Model performance is evaluated using prospective validation. These models have been integrated into a number of desktop tools, and the models are routinely used to prioritize the synthesis of compounds. We discuss two therapeutic projects at Genentech that exemplify the benefits of a probabilistic approach in applying the models. A three-year retrospective analysis of measured liver microsomes stability data on all registered compounds at Genentech reveals that the use of these models has resulted in an improved metabolic stability profile of synthesized compounds.

  19. Probabilistic modeling of percutaneous absorption for risk-based exposure assessments and transdermal drug delivery.

    SciTech Connect

    Ho, Clifford Kuofei

    2004-06-01

    Chemical transport through human skin can play a significant role in human exposure to toxic chemicals in the workplace, as well as to chemical/biological warfare agents in the battlefield. The viability of transdermal drug delivery also relies on chemical transport processes through the skin. Models of percutaneous absorption are needed for risk-based exposure assessments and drug-delivery analyses, but previous mechanistic models have been largely deterministic. A probabilistic, transient, three-phase model of percutaneous absorption of chemicals has been developed to assess the relative importance of uncertain parameters and processes that may be important to risk-based assessments. Penetration routes through the skin that were modeled include the following: (1) intercellular diffusion through the multiphase stratum corneum; (2) aqueous-phase diffusion through sweat ducts; and (3) oil-phase diffusion through hair follicles. Uncertainty distributions were developed for the model parameters, and a Monte Carlo analysis was performed to simulate probability distributions of mass fluxes through each of the routes. Sensitivity analyses using stepwise linear regression were also performed to identify model parameters that were most important to the simulated mass fluxes at different times. This probabilistic analysis of percutaneous absorption (PAPA) method has been developed to improve risk-based exposure assessments and transdermal drug-delivery analyses, where parameters and processes can be highly uncertain.
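
    A much-simplified Monte Carlo sketch in the spirit of the analysis above: a single steady-state penetration route with assumed parameter distributions, followed by a crude rank-correlation sensitivity ranking. The flux formula and all distributions are illustrative stand-ins for the paper's transient three-phase model.

```python
# Hedged sketch: sample uncertain skin parameters, propagate to a steady-state
# flux, and rank inputs by correlation with the output. All values are invented.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
n = 10_000
D = rng.lognormal(mean=np.log(1e-9), sigma=0.5, size=n)    # diffusivity, cm^2/s
K = rng.lognormal(mean=np.log(0.1), sigma=0.7, size=n)     # skin/vehicle partition coefficient
L = rng.normal(15e-4, 3e-4, size=n).clip(min=5e-4)         # stratum corneum thickness, cm
C = 1.0                                                    # applied concentration, mg/cm^3

flux = K * D * C / L                                       # steady-state flux, mg/(cm^2 s)
print("median flux:", np.median(flux), " 95th percentile:", np.percentile(flux, 95))

for name, x in (("D", D), ("K", K), ("L", L)):             # crude sensitivity ranking
    rho, _ = spearmanr(x, flux)
    print(f"rank correlation of {name} with flux: {rho:+.2f}")
```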

  20. SHM-Based Probabilistic Fatigue Life Prediction for Bridges Based on FE Model Updating.

    PubMed

    Lee, Young-Joo; Cho, Soojin

    2016-03-02

    Fatigue life prediction for a bridge should be based on the current condition of the bridge, and various sources of uncertainty, such as material properties, anticipated vehicle loads and environmental conditions, make the prediction very challenging. This paper presents a new approach for probabilistic fatigue life prediction for bridges using finite element (FE) model updating based on structural health monitoring (SHM) data. Recently, various types of SHM systems have been used to monitor and evaluate the long-term structural performance of bridges. For example, SHM data can be used to estimate the degradation of an in-service bridge, which makes it possible to update the initial FE model. The proposed method consists of three steps: (1) identifying the modal properties of a bridge, such as mode shapes and natural frequencies, based on the ambient vibration under passing vehicles; (2) updating the structural parameters of an initial FE model using the identified modal properties; and (3) predicting the probabilistic fatigue life using the updated FE model. The proposed method is demonstrated by application to a numerical model of a bridge, and the impact of FE model updating on the bridge fatigue life is discussed.

  1. The role of expectation and probabilistic learning in auditory boundary perception: a model comparison.

    PubMed

    Pearce, Marcus T; Müllensiefen, Daniel; Wiggins, Geraint A

    2010-01-01

    Grouping and boundary perception are central to many aspects of sensory processing in cognition. We present a comparative study of recently published computational models of boundary perception in music. In doing so, we make three contributions. First, we hypothesise a relationship between expectation and grouping in auditory perception, and introduce a novel information-theoretic model of perceptual segmentation to test the hypothesis. Although we apply the model to musical melody, it is applicable in principle to sequential grouping in other areas of cognition. Second, we address a methodological consideration in the analysis of ambiguous stimuli that produce different percepts between individuals. We propose and demonstrate a solution to this problem, based on clustering of participants prior to analysis. Third, we conduct the first comparative analysis of probabilistic-learning and rule-based models of perceptual grouping in music. In spite of having only unsupervised exposure to music, the model performs comparably to rule-based models based on expert musical knowledge, supporting a role for probabilistic learning in perceptual segmentation of music.

  2. SHM-Based Probabilistic Fatigue Life Prediction for Bridges Based on FE Model Updating

    PubMed Central

    Lee, Young-Joo; Cho, Soojin

    2016-01-01

    Fatigue life prediction for a bridge should be based on the current condition of the bridge, and various sources of uncertainty, such as material properties, anticipated vehicle loads and environmental conditions, make the prediction very challenging. This paper presents a new approach for probabilistic fatigue life prediction for bridges using finite element (FE) model updating based on structural health monitoring (SHM) data. Recently, various types of SHM systems have been used to monitor and evaluate the long-term structural performance of bridges. For example, SHM data can be used to estimate the degradation of an in-service bridge, which makes it possible to update the initial FE model. The proposed method consists of three steps: (1) identifying the modal properties of a bridge, such as mode shapes and natural frequencies, based on the ambient vibration under passing vehicles; (2) updating the structural parameters of an initial FE model using the identified modal properties; and (3) predicting the probabilistic fatigue life using the updated FE model. The proposed method is demonstrated by application to a numerical model of a bridge, and the impact of FE model updating on the bridge fatigue life is discussed. PMID:26950125

  3. A probabilistic modeling approach in thermal inactivation: estimation of postprocess Bacillus cereus spore prevalence and concentration.

    PubMed

    Membré, J M; Amézquita, A; Bassett, J; Giavedoni, P; Blackburn, C de W; Gorris, L G M

    2006-01-01

    The survival of spore-forming bacteria is linked to the safety and stability of refrigerated processed foods of extended durability (REPFEDs). A probabilistic modeling approach was used to assess the prevalence and concentration of Bacillus cereus spores surviving heat treatment for a semiliquid chilled food product. This product received heat treatment to inactivate nonproteolytic Clostridium botulinum during manufacture and was designed to be kept at refrigerator temperature postmanufacture. As key inputs for the modeling, the assessment took into consideration the following factors: (i) contamination frequency (prevalence) and level (concentration) of both psychrotrophic and mesophilic strains of B. cereus, (ii) heat resistance of both types (expressed as decimal reduction times at 90 degrees C), and (iii) intrapouch variability of thermal kinetics during heat processing (expressed as the time spent at 90 degrees C). These three inputs were established as statistical distributions using expert opinion, literature data, and specific modeling, respectively. They were analyzed in a probabilistic model in which the outputs, expressed as distributions as well, were the proportion of the contaminated pouches (the likely prevalence) and the number of spores in the contaminated pouches (the likely concentration). The prevalence after thermal processing was estimated to be 11 and 49% for psychrotrophic and mesophilic strains, respectively. In the positive pouches, the bacterial concentration (considering psychrotrophic and mesophilic strains combined) was estimated to be 30 CFU/g (95th percentile). Such a probabilistic approach seems promising to help in (i) optimizing heat processes, (ii) identifying which key factor(s) to control, and (iii) providing information for subsequent assessment of B. cereus resuscitation and growth.
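
    An illustrative Monte Carlo sketch of the three-input structure described above (initial contamination, heat resistance, and time at 90 degrees C); every distribution below is an invented placeholder rather than one of the paper's expert-elicited inputs.

```python
# Hedged sketch: post-process prevalence and concentration of surviving spores.
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
initial_prev = 0.6                                  # fraction of raw pouches contaminated (assumed)
log_n0 = rng.normal(1.0, 0.8, size=n)               # initial log10 CFU/g in contaminated pouches
D90 = rng.lognormal(np.log(5.0), 0.4, size=n)       # decimal reduction time at 90 C, min
t90 = rng.normal(20.0, 5.0, size=n).clip(min=0)     # time spent at 90 C, min

log_n_surv = log_n0 - t90 / D90                     # log10 CFU/g after the heat treatment
contaminated = rng.random(n) < initial_prev
detectable = contaminated & (log_n_surv > -2)       # i.e. more than ~1 spore per 100 g

print("post-process prevalence:", round(detectable.mean(), 3))
print("95th percentile concentration in positive pouches (CFU/g):",
      round(float(np.percentile(10 ** log_n_surv[detectable], 95)), 1))
```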

  4. Probabilistic investigation of sensitivities of advanced test-analysis model correlation methods

    NASA Astrophysics Data System (ADS)

    Bergman, Elizabeth J.; Allen, Matthew S.; Kammer, Daniel C.; Mayes, Randall L.

    2010-06-01

    The industry standard method used to validate finite element models involves correlation of test and analysis mode shapes using reduced Test-Analysis Models (TAMs). Some organizations even require this model validation approach. Considerable effort is required to choose sensor locations and to create a suitable TAM so that the test and analysis mode shapes will be orthogonal to within the required tolerance. This work uses a probabilistic framework to understand and quantify the effect of small errors in the test mode shapes on test-analysis orthogonality. Using the proposed framework, test-orthogonality is a probabilistic metric and the problem becomes one of choosing sensor placement and TAM generation techniques that assure that the orthogonality has a high probability of being within an acceptable range if the model is correct, even though the test measurements are contaminated with random errors. A simple analytical metric is derived that is shown to give a good estimate of the sensitivity of a TAM to errors in the test mode shapes for a certain noise model. These ideas are then applied to a generic satellite system, using TAMs generated by the Static, Modal and Improved Reduced System (IRS) reduction methods. Experimental errors are simulated for a set of mode shapes and Monte Carlo simulation is used to estimate the probability that the orthogonality metric exceeds a threshold due to experimental error alone. For the satellite system considered here, the orthogonality calculation is highly sensitive to experimental errors, so a set of noisy mode shapes has a small probability of passing the orthogonality criteria for some of the TAMs. A number of sensor placement techniques are used in this study, and the comparison reveals that, for this system, the Modal TAM is twice as sensitive to errors on the test mode shapes when it is created on a sensor set optimized for the Static TAM rather than one that was optimized specifically for the Modal TAM. These findings

  5. A probabilistic Poisson-based model accounts for an extensive set of absolute auditory threshold measurements.

    PubMed

    Heil, Peter; Matysiak, Artur; Neubauer, Heinrich

    2017-09-01

    Thresholds for detecting sounds in quiet decrease with increasing sound duration in every species studied. The neural mechanisms underlying this trade-off, often referred to as temporal integration, are not fully understood. Here, we probe the human auditory system with a large set of tone stimuli differing in duration, shape of the temporal amplitude envelope, duration of silent gaps between bursts, and frequency. Duration was varied by varying the plateau duration of plateau-burst (PB) stimuli, the duration of the onsets and offsets of onset-offset (OO) stimuli, and the number of identical bursts of multiple-burst (MB) stimuli. Absolute thresholds for a large number of ears (>230) were measured using a 3-interval-3-alternative forced choice (3I-3AFC) procedure. Thresholds decreased with increasing sound duration in a manner that depended on the temporal envelope. Most commonly, thresholds for MB stimuli were highest followed by thresholds for OO and PB stimuli of corresponding durations. Differences in the thresholds for MB and OO stimuli and in the thresholds for MB and PB stimuli, however, varied widely across ears, were negative in some ears, and were tightly correlated. We show that the variation and correlation of MB-OO and MB-PB threshold differences are linked to threshold microstructure, which affects the relative detectability of the sidebands of the MB stimuli and affects estimates of the bandwidth of auditory filters. We also found that thresholds for MB stimuli increased with increasing duration of the silent gaps between bursts. We propose a new model and show that it accurately accounts for our results and does so considerably better than a leaky-integrator-of-intensity model and a probabilistic model proposed by others. Our model is based on the assumption that sensory events are generated by a Poisson point process with a low rate in the absence of stimulation and higher, time-varying rates in the presence of stimulation. A subject in a 3I-3AFC
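
    A toy sketch of the Poisson-event idea, not the authors' full model: sensory events arrive at a low spontaneous rate in silence and a higher rate during the signal interval, and the simulated listener picks the interval with the most events. The rates and durations are invented.

```python
# Hedged sketch: percent correct in a 3I-3AFC task when event counts are Poisson.
import numpy as np

rng = np.random.default_rng(2)

def p_correct_3afc(r_signal, r_spont, duration, n_trials=100_000):
    counts = np.stack([rng.poisson(r_signal * duration, n_trials),   # signal interval
                       rng.poisson(r_spont * duration, n_trials),    # two noise intervals
                       rng.poisson(r_spont * duration, n_trials)])
    jitter = rng.random(counts.shape) * 1e-6                         # random tie-breaking
    return float((np.argmax(counts + jitter, axis=0) == 0).mean())

for dur in (0.01, 0.05, 0.2):          # longer stimuli -> more events -> easier detection
    print(dur, round(p_correct_3afc(r_signal=40.0, r_spont=5.0, duration=dur), 3))
```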

  6. Building macroscale models from microscale probabilistic models: a general probabilistic approach for nonlinear diffusion and multispecies phenomena.

    PubMed

    Penington, Catherine J; Hughes, Barry D; Landman, Kerry A

    2011-10-01

    A discrete agent-based model on a periodic lattice of arbitrary dimension is considered. Agents move to nearest-neighbor sites by a motility mechanism accounting for general interactions, which may include volume exclusion. The partial differential equation describing the average occupancy of the agent population is derived systematically. A diffusion equation arises for all types of interactions and is nonlinear except for the simplest interactions. In addition, multiple species of interacting subpopulations give rise to an advection-diffusion equation for each subpopulation. This work extends and generalizes previous specific results, providing a construction method for determining the transport coefficients in terms of a single conditional transition probability, which depends on the occupancy of sites in an influence region. These coefficients characterize the diffusion of agents in a crowded environment in biological and physical processes.

  7. Statistical shape analysis of the human spleen geometry for probabilistic occupant models.

    PubMed

    Yates, Keegan M; Lu, Yuan-Chiao; Untaroiu, Costin D

    2016-06-14

    Statistical shape models are an effective way to create computational models of human organs that can incorporate inter-subject geometrical variation. The main objective of this study was to create statistical mean and boundary models of the human spleen in an occupant posture. Principal component analysis was applied to fifteen human spleens in order to find the statistical modes of variation, mean shape, and boundary models. A landmark sliding approach was utilized to refine the landmarks to obtain a better shape correspondence and create a better representation of the underlying shape contour. The first mode of variation was found to be the overall volume, and it accounted for 69% of the total variation. The mean model and boundary models could be used to develop probabilistic finite element (FE) models which may identify the risk of spleen injury during vehicle collisions and consequently help to improve automobile safety systems.
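
    A minimal statistical-shape-model sketch in which synthetic landmark data stand in for the aligned spleen surfaces: principal component analysis yields the mean shape, the modes of variation, and plus/minus two standard deviation 'boundary' shapes along the first mode.

```python
# Hedged sketch: each row holds one subject's flattened (x, y, z) landmark coordinates.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(3)
n_subjects, n_landmarks = 15, 200
X = rng.normal(size=(n_subjects, n_landmarks * 3))          # placeholder aligned landmarks

pca = PCA()
scores = pca.fit_transform(X)
mean_shape = pca.mean_.reshape(n_landmarks, 3)               # statistical mean model
print("variance explained by mode 1:", round(pca.explained_variance_ratio_[0], 2))

sd1 = scores[:, 0].std()                                     # spread along the first mode
upper = (pca.mean_ + 2 * sd1 * pca.components_[0]).reshape(n_landmarks, 3)
lower = (pca.mean_ - 2 * sd1 * pca.components_[0]).reshape(n_landmarks, 3)
```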

  8. Probabilistic Projections of Regional Climatic Changes Through an Ensemble Modeling Approach: A Canadian Case Study

    NASA Astrophysics Data System (ADS)

    WANG, X.; Huang, G.

    2016-12-01

    Planning of adaptation strategies against the changing climate requires a thorough assessment of the potential impacts of climate change at local scales. However, climate change impact assessment is usually subject to a number of challenges, such as the lack of high-resolution climate scenarios and the uncertainty in climate model projections, which may pose barriers to impact researchers and decision makers. To tackle these challenges, we will develop high-resolution regional climate scenarios using multiple regional climate models (e.g., PRECIS, WRF, and RegCM) driven by different global climate models (e.g., HadGEM2-ES, CanESM2, GFDL-ESM2M, and CCSM4) under RCP4.5 and RCP8.5 scenarios. A Bayesian hierarchical model will be proposed to help quantify the uncertainties associated with the regional climate ensemble simulations. Results on model evaluation and probabilistic projections of temperature and precipitation changes over Ontario, Canada will be analyzed and presented. The probabilistic projections can provide useful information for assessing the risks and costs associated with climatic changes at regional and local scales.

  9. Propagating Water Quality Analysis Uncertainty Into Resource Management Decisions Through Probabilistic Modeling

    NASA Astrophysics Data System (ADS)

    Gronewold, A. D.; Wolpert, R. L.; Reckhow, K. H.

    2007-12-01

    Most probable number (MPN) and colony-forming-unit (CFU) are two estimates of fecal coliform bacteria concentration commonly used as measures of water quality in United States shellfish harvesting waters. The MPN is the maximum likelihood estimate (or MLE) of the true fecal coliform concentration based on counts of non-sterile tubes in serial dilution of a sample aliquot, indicating bacterial metabolic activity. The CFU is the MLE of the true fecal coliform concentration based on the number of bacteria colonies emerging on a growth plate after inoculation from a sample aliquot. Each estimating procedure has intrinsic variability and is subject to additional uncertainty arising from minor variations in experimental protocol. Several versions of each procedure (using different sized aliquots or different numbers of tubes, for example) are in common use, each with its own levels of probabilistic and experimental error and uncertainty. It has been observed empirically that the MPN procedure is more variable than the CFU procedure, and that MPN estimates are somewhat higher on average than CFU estimates, on split samples from the same water bodies. We construct a probabilistic model that provides a clear theoretical explanation for the observed variability in, and discrepancy between, MPN and CFU measurements. We then explore how this variability and uncertainty might propagate into shellfish harvesting area management decisions through a two-phased modeling strategy. First, we apply our probabilistic model in a simulation-based analysis of future water quality standard violation frequencies under alternative land use scenarios, such as those evaluated under guidelines of the total maximum daily load (TMDL) program. Second, we apply our model to water quality data from shellfish harvesting areas which at present are closed (either conditionally or permanently) to shellfishing, to determine if alternative laboratory analysis procedures might have led to different
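
    A small sketch of the MPN as a maximum-likelihood estimate, following the description above: each tube is positive with probability 1 - exp(-c*v) for true concentration c and inoculum volume v, and c is chosen to maximize the likelihood of the observed tube counts. The dilution design and counts are illustrative.

```python
# Hedged sketch: maximum-likelihood MPN from serial-dilution tube counts.
import numpy as np
from scipy.optimize import minimize_scalar

volumes = np.array([10.0, 1.0, 0.1])    # mL of sample per tube at each dilution (toy design)
n_tubes = np.array([5, 5, 5])
positive = np.array([5, 3, 1])          # observed positive (non-sterile) tubes

def neg_log_lik(log_c):
    c = np.exp(log_c)                                  # concentration, organisms/mL
    p_pos = 1.0 - np.exp(-c * volumes)                 # P(tube positive) at each dilution
    return -np.sum(positive * np.log(p_pos) - (n_tubes - positive) * c * volumes)

res = minimize_scalar(neg_log_lik, bounds=(-8, 8), method="bounded")
print("MPN estimate (organisms/mL):", round(float(np.exp(res.x)), 3))
```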

  10. Value Learning and Arousal in the Extinction of Probabilistic Rewards: The Role of Dopamine in a Modified Temporal Difference Model

    PubMed Central

    Song, Minryung R.; Fellous, Jean-Marc

    2014-01-01

    Because most rewarding events are probabilistic and changing, the extinction of probabilistic rewards is important for survival. It has been proposed that the extinction of probabilistic rewards depends on arousal and the amount of learning of reward values. Midbrain dopamine neurons were suggested to play a role in both arousal and learning reward values. Despite extensive research on modeling dopaminergic activity in reward learning (e.g. temporal difference models), few studies have been done on modeling its role in arousal. Although temporal difference models capture key characteristics of dopaminergic activity during the extinction of deterministic rewards, they have been less successful at simulating the extinction of probabilistic rewards. By adding an arousal signal to a temporal difference model, we were able to simulate the extinction of probabilistic rewards and its dependence on the amount of learning. Our simulations propose that arousal allows the probability of reward to have lasting effects on the updating of reward value, which slows the extinction of low probability rewards. Using this model, we predicted that, by signaling the prediction error, dopamine determines the learned reward value that has to be extinguished during extinction and participates in regulating the size of the arousal signal that controls the learning rate. These predictions were supported by pharmacological experiments in rats. PMID:24586823

  11. Value learning and arousal in the extinction of probabilistic rewards: the role of dopamine in a modified temporal difference model.

    PubMed

    Song, Minryung R; Fellous, Jean-Marc

    2014-01-01

    Because most rewarding events are probabilistic and changing, the extinction of probabilistic rewards is important for survival. It has been proposed that the extinction of probabilistic rewards depends on arousal and the amount of learning of reward values. Midbrain dopamine neurons were suggested to play a role in both arousal and learning reward values. Despite extensive research on modeling dopaminergic activity in reward learning (e.g. temporal difference models), few studies have been done on modeling its role in arousal. Although temporal difference models capture key characteristics of dopaminergic activity during the extinction of deterministic rewards, they have been less successful at simulating the extinction of probabilistic rewards. By adding an arousal signal to a temporal difference model, we were able to simulate the extinction of probabilistic rewards and its dependence on the amount of learning. Our simulations propose that arousal allows the probability of reward to have lasting effects on the updating of reward value, which slows the extinction of low probability rewards. Using this model, we predicted that, by signaling the prediction error, dopamine determines the learned reward value that has to be extinguished during extinction and participates in regulating the size of the arousal signal that controls the learning rate. These predictions were supported by pharmacological experiments in rats.

  12. Experimental validation of a nonparametric probabilistic model of nonhomogeneous uncertainties for dynamical systems

    NASA Astrophysics Data System (ADS)

    Chebli, Hamid; Soize, Christian

    2004-02-01

    The paper deals with an experimental validation of a nonparametric probabilistic model of nonhomogeneous uncertainties for dynamical systems. The theory used, recently introduced, allows model uncertainties and data uncertainties to be simultaneously taken into account. An experiment devoted to this validation was specifically developed. The experimental model is constituted of two simple dural rectangular plates connected together with a complex joint. In the mean mechanical model, the complex joint, which is constituted of two additional plates attached with 40 screw-bolts, is modeled by a homogeneous orthotropic continuous plate with constant thickness, as usual. Consequently, the mean model introduces a region (the joint) which has a high level of uncertainties. The objective of the paper is to present the experiment and the comparisons of the theoretical prediction with the experiments.

  13. Probabilistic Regularized Extreme Learning Machine for Robust Modeling of Noise Data.

    PubMed

    Lu, XinJiang; Ming, Li; Liu, WenBo; Li, Han-Xiong

    2017-08-17

    The extreme learning machine (ELM) has been extensively studied in the machine learning field and has been widely implemented due to its simplified algorithm and reduced computational costs. However, it is less effective for modeling data with non-Gaussian noise or data containing outliers. Here, a probabilistic regularized ELM is proposed to improve modeling performance with data containing non-Gaussian noise and/or outliers. While traditional ELM minimizes modeling error by using a worst-case scenario principle, the proposed method constructs a new objective function to minimize both mean and variance of this modeling error. Thus, the proposed method considers the modeling error distribution. A solution method is then developed for this new objective function and the proposed method is further proved to be more robust when compared with traditional ELM, even when subject to noise or outliers. Several experimental cases demonstrate that the proposed method has better modeling performance for problems with non-Gaussian noise or outliers.
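
    For context, a minimal baseline regularized ELM sketch (random hidden layer plus ridge-regularized output weights) on synthetic data with a few injected outliers; the probabilistic, variance-penalizing variant proposed in the paper is not reproduced here.

```python
# Hedged sketch: baseline regularized ELM only, not the proposed probabilistic variant.
import numpy as np

rng = np.random.default_rng(5)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + rng.normal(0, 0.1, 200)
y[:5] += 3.0                                   # a few gross outliers

n_hidden, lam = 50, 1e-2
W = rng.normal(size=(X.shape[1], n_hidden))    # random input weights (never trained)
b = rng.normal(size=n_hidden)
H = np.tanh(X @ W + b)                         # random hidden-layer features

# ridge-regularized least squares for the output weights
beta = np.linalg.solve(H.T @ H + lam * np.eye(n_hidden), H.T @ y)
pred = H @ beta
print("RMSE:", round(float(np.sqrt(np.mean((pred - y) ** 2))), 3))
```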

  14. Probabilistic models, learning algorithms, and response variability: sampling in cognitive development.

    PubMed

    Bonawitz, Elizabeth; Denison, Stephanie; Griffiths, Thomas L; Gopnik, Alison

    2014-10-01

    Although probabilistic models of cognitive development have become increasingly prevalent, one challenge is to account for how children might cope with a potentially vast number of possible hypotheses. We propose that children might address this problem by 'sampling' hypotheses from a probability distribution. We discuss empirical results demonstrating signatures of sampling, which offer an explanation for the variability of children's responses. The sampling hypothesis provides an algorithmic account of how children might address computationally intractable problems and suggests a way to make sense of their 'noisy' behavior.

  15. Assessing uncertainties in global cropland futures using a conditional probabilistic modelling framework

    NASA Astrophysics Data System (ADS)

    Engström, Kerstin; Olin, Stefan; Rounsevell, Mark D. A.; Brogaard, Sara; van Vuuren, Detlef P.; Alexander, Peter; Murray-Rust, Dave; Arneth, Almut

    2016-11-01

    We present a modelling framework to simulate probabilistic futures of global cropland areas that are conditional on the SSP (shared socio-economic pathway) scenarios. Simulations are based on the Parsimonious Land Use Model (PLUM) linked with the global dynamic vegetation model LPJ-GUESS (Lund-Potsdam-Jena General Ecosystem Simulator) using socio-economic data from the SSPs and climate data from the RCPs (representative concentration pathways). The simulated range of global cropland is 893-2380 Mha in 2100 (± 1 standard deviation), with the main uncertainties arising from differences in the socio-economic conditions prescribed by the SSP scenarios and the assumptions that underpin the translation of qualitative SSP storylines into quantitative model input parameters. Uncertainties in the assumptions for population growth, technological change and cropland degradation were found to be the most important for global cropland, while uncertainty in food consumption had less influence on the results. The uncertainties arising from climate variability and the differences between climate change scenarios do not strongly affect the range of global cropland futures. Some overlap occurred across all of the conditional probabilistic futures, except for those based on SSP3. We conclude that completely different socio-economic and climate change futures, although sharing low to medium population development, can result in very similar cropland areas on the aggregated global scale.

  16. A Probabilistic Model for Estimating the Depth and Threshold Temperature of C-fiber Nociceptors

    NASA Astrophysics Data System (ADS)

    Dezhdar, Tara; Moshourab, Rabih A.; Fründ, Ingo; Lewin, Gary R.; Schmuker, Michael

    2015-12-01

    The subjective experience of thermal pain follows the detection and encoding of noxious stimuli by primary afferent neurons called nociceptors. However, nociceptor morphology has been hard to access and the mechanisms of signal transduction remain unresolved. In order to understand how heat transducers in nociceptors are activated in vivo, it is important to estimate the temperatures that directly activate the skin-embedded nociceptor membrane. Hence, the nociceptor’s temperature threshold must be estimated, which in turn will depend on the depth at which transduction happens in the skin. Since the temperature at the receptor cannot be accessed experimentally, such an estimation can currently only be achieved through modeling. However, the current state-of-the-art model to estimate temperature at the receptor suffers from the fact that it cannot account for the natural stochastic variability of neuronal responses. We improve this model using a probabilistic approach which accounts for uncertainties and potential noise in the system. Using a data set of 24 C-fibers recorded in vitro, we show that, even without detailed knowledge of the bio-thermal properties of the system, the probabilistic model that we propose here is capable of providing estimates of threshold and depth in cases where the classical method fails.

  17. Probabilistic flood inundation mapping at ungauged streams due to roughness coefficient uncertainty in hydraulic modelling

    NASA Astrophysics Data System (ADS)

    Papaioannou, George; Vasiliades, Lampros; Loukas, Athanasios; Aronica, Giuseppe T.

    2017-04-01

    Probabilistic flood inundation mapping is performed and analysed at the ungauged Xerias stream reach, Volos, Greece. The study evaluates the uncertainty introduced by the roughness coefficient values on hydraulic models in flood inundation modelling and mapping. The well-established one-dimensional (1-D) hydraulic model, HEC-RAS is selected and linked to Monte-Carlo simulations of hydraulic roughness. Terrestrial Laser Scanner data have been used to produce a high quality DEM for input data uncertainty minimisation and to improve determination accuracy on stream channel topography required by the hydraulic model. Initial Manning's n roughness coefficient values are based on pebble count field surveys and empirical formulas. Various theoretical probability distributions are fitted and evaluated on their accuracy to represent the estimated roughness values. Finally, Latin Hypercube Sampling has been used for generation of different sets of Manning roughness values and flood inundation probability maps have been created with the use of Monte Carlo simulations. Historical flood extent data, from an extreme historical flash flood event, are used for validation of the method. The calibration process is based on a binary wet-dry reasoning with the use of Median Absolute Percentage Error evaluation metric. The results show that the proposed procedure supports probabilistic flood hazard mapping at ungauged rivers and provides water resources managers with valuable information for planning and implementing flood risk mitigation strategies.
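
    A sketch of the roughness-sampling step only: Latin Hypercube samples mapped through assumed lognormal distributions of Manning's n for the channel and the floodplain. Each sampled pair would parameterize one hydraulic-model run; the HEC-RAS simulations and the aggregation into probability maps are not reproduced.

```python
# Hedged sketch: Latin Hypercube sampling of Manning's n (distribution parameters assumed).
import numpy as np
from scipy.stats import qmc, lognorm

n_runs = 100
sampler = qmc.LatinHypercube(d=2, seed=0)       # channel and floodplain roughness
u = sampler.random(n_runs)                      # uniform LHS samples in [0, 1)^2

# map the uniforms onto lognormal Manning's n values (medians and spreads assumed)
n_channel = lognorm.ppf(u[:, 0], s=0.2, scale=0.035)
n_floodplain = lognorm.ppf(u[:, 1], s=0.3, scale=0.08)

for nc, nf in zip(n_channel[:3], n_floodplain[:3]):
    print(f"run: channel n={nc:.3f}, floodplain n={nf:.3f}")
# each (nc, nf) pair would drive one hydraulic-model run; the ensemble of wet/dry
# outputs would then be aggregated into inundation-probability maps
```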

  18. An efficient deterministic-probabilistic approach to modeling regional groundwater flow: 1. Theory

    USGS Publications Warehouse

    Yen, Chung-Cheng; Guymon, Gary L.

    1990-01-01

    An efficient probabilistic model is developed and cascaded with a deterministic model for predicting water table elevations in regional aquifers. The objective is to quantify model uncertainty where precise estimates of water table elevations may be required. The probabilistic model is based on the two-point probability method, which only requires prior knowledge of the uncertain variables' means and coefficients of variation. The two-point estimate method is theoretically developed and compared with the Monte Carlo simulation method. The results of comparisons using hypothetical deterministic problems indicate that the two-point estimate method is generally valid only for linear problems where the coefficients of variation of uncertain parameters (for example, storage coefficient and hydraulic conductivity) are small. The two-point estimate method may be applied to slightly nonlinear problems with good results, provided coefficients of variation are small. In such cases, the two-point estimate method is much more efficient than the Monte Carlo method provided the number of uncertain variables is less than eight.
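
    A sketch of the two-point (Rosenblueth-type) estimate against a Monte Carlo reference, with a toy response function standing in for the groundwater model; the means and standard deviations are invented and kept small, matching the regime in which the paper finds the method valid.

```python
# Hedged sketch: two-point estimate (4 model runs) vs. Monte Carlo (100,000 runs).
import itertools
import numpy as np

def head(T, S):                       # toy response: depends on transmissivity and storage
    return 10.0 + 2.0 / T + 0.5 * np.log(S)

means = {"T": 1.0, "S": 1e-3}
sigmas = {"T": 0.1, "S": 1e-4}        # small coefficients of variation (assumed)

# two-point estimate: evaluate at every (mean +/- sigma) corner, weight equally
corners = [head(means["T"] + sT * sigmas["T"], means["S"] + sS * sigmas["S"])
           for sT, sS in itertools.product((-1, 1), repeat=2)]
print("two-point mean/std:", np.mean(corners), np.std(corners))

# Monte Carlo reference
rng = np.random.default_rng(6)
mc = head(rng.normal(means["T"], sigmas["T"], 100_000),
          rng.normal(means["S"], sigmas["S"], 100_000))
print("Monte Carlo mean/std:", mc.mean(), mc.std())
```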

  19. From Cyclone Tracks to the Costs of European Winter Storms: A Probabilistic Loss Assessment Model

    NASA Astrophysics Data System (ADS)

    Orwig, K.; Renggli, D.; Corti, T.; Reese, S.; Wueest, M.; Viktor, E.; Zimmerli, P.

    2014-12-01

    European winter storms cause billions of dollars of insured losses every year. Therefore, it is essential to understand potential impacts of future events, and the role reinsurance can play to mitigate the losses. The authors will present an overview of natural catastrophe risk assessment modeling in the reinsurance industry, and the development of a new, innovative approach for modeling the risk associated with European winter storms. The new approach includes the development of physically meaningful probabilistic (i.e. simulated) events for European winter storm loss assessment. The meteorological hazard component of the new model is based on cyclone and windstorm tracks identified in the 20th Century Reanalysis data. The knowledge of the evolution of winter storms both in time and space allows the physically meaningful perturbation of historical event properties (e.g. track, intensity, etc.). The perturbation includes a random element but also takes the local climatology and the evolution of the historical event into account. The low-resolution wind footprints taken from the 20th Century Reanalysis are processed by a statistical-dynamical downscaling to generate high-resolution footprints for both the simulated and historical events. Downscaling transfer functions are generated using ENSEMBLES regional climate model data. The result is a set of reliable probabilistic events representing thousands of years. The event set is then combined with country and site-specific vulnerability functions and detailed market- or client-specific information to compute annual expected losses.

  20. A Probabilistic Model for Estimating the Depth and Threshold Temperature of C-fiber Nociceptors

    PubMed Central

    Dezhdar, Tara; Moshourab, Rabih A.; Fründ, Ingo; Lewin, Gary R.; Schmuker, Michael

    2015-01-01

    The subjective experience of thermal pain follows the detection and encoding of noxious stimuli by primary afferent neurons called nociceptors. However, nociceptor morphology has been hard to access and the mechanisms of signal transduction remain unresolved. In order to understand how heat transducers in nociceptors are activated in vivo, it is important to estimate the temperatures that directly activate the skin-embedded nociceptor membrane. Hence, the nociceptor’s temperature threshold must be estimated, which in turn will depend on the depth at which transduction happens in the skin. Since the temperature at the receptor cannot be accessed experimentally, such an estimation can currently only be achieved through modeling. However, the current state-of-the-art model to estimate temperature at the receptor suffers from the fact that it cannot account for the natural stochastic variability of neuronal responses. We improve this model using a probabilistic approach which accounts for uncertainties and potential noise in the system. Using a data set of 24 C-fibers recorded in vitro, we show that, even without detailed knowledge of the bio-thermal properties of the system, the probabilistic model that we propose here is capable of providing estimates of threshold and depth in cases where the classical method fails. PMID:26638830

  1. Smoothed Seismicity Models for the 2016 Italian Probabilistic Seismic Hazard Map

    NASA Astrophysics Data System (ADS)

    Akinci, A.; Moschetti, M. P.; Taroni, M.

    2016-12-01

    For the first time, the 2016 Italian Probabilistic Seismic Hazard maps will incorporate smoothed seismicity models as a part of the earthquake rate forecast. In this study we report progress on the comparison of smoothed seismicity models developed using fixed and adaptive smoothing algorithms, and investigate the sensitivity of seismic hazard to the models. Recent developments in adaptive smoothing methods and statistical tests for evaluating and comparing rate models prompt us to investigate the appropriateness of adaptive smoothing methods for the Italian Hazard Maps. The approach of using spatially smoothed historical seismicity differs from the one used previously for Italy by Working Group MSP04 (2004) and Slejko et al. (1998), in which source zones were drawn around the seismicity and the tectonic provinces; it is the first to be used for the new probabilistic seismic hazard maps for Italy. We develop two different smoothed seismicity models using fixed (Frankel 1995) and adaptive smoothing methods (Helmstetter et al., 2007) and compare the resulting models (Werner et al., 2011; Moschetti, 2014) by calculating and evaluating the joint likelihood test. The smoothed seismicity models are constructed from the new, historical CPTI15 and instrumental Italian earthquake catalogues and associated completeness levels to produce a space-time forecast of future Italian seismicity. We follow guidance from previous studies to optimize the neighbor number (n-value) by comparing model likelihood values, which estimate the likelihood that the observed earthquake epicenters from the recent catalog are derived from the smoothed rate models. We compare likelihood values from all rate models to rank the smoothing methods. We also compare these two models with the Italian CSEP experiment models, to check their relative performances.
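
    A fixed-kernel smoothing sketch on a toy catalogue, together with the Poisson joint log-likelihood score (constant terms dropped) of the kind used to rank competing rate models; the epicenters, grid, and kernel width are invented.

```python
# Hedged sketch: fixed Gaussian-kernel smoothing of epicenters into a rate grid.
import numpy as np

rng = np.random.default_rng(7)
quakes = rng.uniform(0, 100, size=(300, 2))         # toy catalogue of epicenters, km

xs = np.arange(5, 100, 10.0)
grid = np.array([(x, y) for x in xs for y in xs])   # 10 km cells, cell centers
sigma = 15.0                                        # fixed smoothing length, km

d2 = ((grid[:, None, :] - quakes[None, :, :]) ** 2).sum(-1)
rate = np.exp(-d2 / (2 * sigma ** 2)).sum(axis=1)
rate *= len(quakes) / rate.sum()                    # normalize to the total event count

# Poisson joint log-likelihood (factorial term dropped) of a hypothetical test catalogue
test_counts = rng.poisson(rate)
loglik = np.sum(test_counts * np.log(rate) - rate)
print("joint log-likelihood:", round(float(loglik), 1))
```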

  2. The Integrated Medical Model

    NASA Technical Reports Server (NTRS)

    Butler, Douglas J.; Kerstman, Eric

    2010-01-01

    This slide presentation reviews the goals and approach for the Integrated Medical Model (IMM). The IMM is a software decision support tool that forecasts medical events during spaceflight and optimizes medical systems during simulations. It includes information on the software capabilities, program stakeholders, use history, and the software logic.

  3. An Integrated Model Recontextualized

    ERIC Educational Resources Information Center

    O'Meara, KerryAnn; Saltmarsh, John

    2016-01-01

    In this commentary, authors KerryAnn O'Meara and John Saltmarsh reflect on their 2008 "Journal of Higher Education Outreach and Engagement" article "An Integrated Model for Advancing the Scholarship of Engagement: Creating Academic Homes for the Engaged Scholar," reprinted in this 20th anniversary issue of "Journal of…

  4. Brain Systems for Probabilistic and Dynamic Prediction: Computational Specificity and Integration

    PubMed Central

    O'Reilly, Jill X.; Jbabdi, Saad; Rushworth, Matthew F. S.; Behrens, Timothy E. J.

    2013-01-01

    A computational approach to functional specialization suggests that brain systems can be characterized in terms of the types of computations they perform, rather than their sensory or behavioral domains. We contrasted the neural systems associated with two computationally distinct forms of predictive model: a reinforcement-learning model of the environment obtained through experience with discrete events, and continuous dynamic forward modeling. By manipulating the precision with which each type of prediction could be used, we caused participants to shift computational strategies within a single spatial prediction task. Hence (using fMRI) we showed that activity in two brain systems (typically associated with reward learning and motor control) could be dissociated in terms of the forms of computations that were performed there, even when both systems were used to make parallel predictions of the same event. A region in parietal cortex, which was sensitive to the divergence between the predictions of the models and anatomically connected to both computational networks, is proposed to mediate integration of the two predictive modes to produce a single behavioral output. PMID:24086106

  5. VBA: A Probabilistic Treatment of Nonlinear Models for Neurobiological and Behavioural Data

    PubMed Central

    Daunizeau, Jean; Adam, Vincent; Rigoux, Lionel

    2014-01-01

    This work is in line with an on-going effort tending toward a computational (quantitative and refutable) understanding of human neuro-cognitive processes. Many sophisticated models for behavioural and neurobiological data have flourished during the past decade. Most of these models are partly unspecified (i.e. they have unknown parameters) and nonlinear. This makes them difficult to peer with a formal statistical data analysis framework. In turn, this compromises the reproducibility of model-based empirical studies. This work exposes a software toolbox that provides generic, efficient and robust probabilistic solutions to the three problems of model-based analysis of empirical data: (i) data simulation, (ii) parameter estimation/model selection, and (iii) experimental design optimization. PMID:24465198

  6. VBA: a probabilistic treatment of nonlinear models for neurobiological and behavioural data.

    PubMed

    Daunizeau, Jean; Adam, Vincent; Rigoux, Lionel

    2014-01-01

    This work is in line with an on-going effort tending toward a computational (quantitative and refutable) understanding of human neuro-cognitive processes. Many sophisticated models for behavioural and neurobiological data have flourished during the past decade. Most of these models are partly unspecified (i.e. they have unknown parameters) and nonlinear. This makes them difficult to peer with a formal statistical data analysis framework. In turn, this compromises the reproducibility of model-based empirical studies. This work exposes a software toolbox that provides generic, efficient and robust probabilistic solutions to the three problems of model-based analysis of empirical data: (i) data simulation, (ii) parameter estimation/model selection, and (iii) experimental design optimization.

  7. Integration of highly probabilistic sources into optical quantum architectures: perpetual quantum computation

    NASA Astrophysics Data System (ADS)

    Devitt, Simon J.; Stephens, Ashley M.; Munro, William J.; Nemoto, Kae

    2011-09-01

    In this paper, we introduce a design for an optical topological cluster state computer constructed exclusively from a single quantum component. Unlike previous efforts, we eliminate the need for on-demand, high-fidelity photon sources and detectors and replace them with the same device utilized to create photon/photon entanglement. This introduces highly probabilistic elements into the optical architecture while maintaining complete specificity of the structure and operation for a large-scale computer. Photons in this system are continually recycled back into the preparation network, allowing for an arbitrarily deep three-dimensional cluster to be prepared using a comparatively small number of photonic qubits and consequently the elimination of high-frequency, deterministic photon sources.

  8. Probabilistic failure modelling of reinforced concrete structures subjected to chloride penetration

    NASA Astrophysics Data System (ADS)

    Nogueira, Caio Gorla; Leonel, Edson Denner; Coda, Humberto Breves

    2012-12-01

    Structural durability is an important criterion that must be evaluated for every type of structure. Concerning reinforced concrete members, the chloride diffusion process is widely used to evaluate durability, especially when these structures are constructed in aggressive atmospheres. The chloride ingress triggers the corrosion of reinforcements; therefore, by modelling this phenomenon, the corrosion process can be better evaluated, as can the structural durability. The corrosion begins when a threshold level of chloride concentration is reached at the steel bars of the reinforcement. Despite the robustness of several models proposed in the literature, deterministic approaches fail to predict the corrosion initiation time accurately due to the inherent randomness observed in this process. In this regard, structural durability can be represented more realistically using probabilistic approaches. This paper addresses the analysis of probabilistic corrosion initiation time in reinforced concrete structures exposed to chloride penetration. The chloride penetration is modelled using Fick's diffusion law. This law simulates the chloride diffusion process considering time-dependent effects. The probability of failure is calculated using Monte Carlo simulation and the first-order reliability method, with a direct coupling approach. Some examples are considered in order to study these phenomena. Moreover, a simplified method is proposed to determine optimal values for concrete cover.
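
    A Monte Carlo sketch of the corrosion-initiation idea above, using the error-function solution of Fick's second law for the chloride profile at the cover depth; all parameter distributions are illustrative, and the reliability-method coupling described in the paper is not reproduced.

```python
# Hedged sketch: probability that chloride at the bar exceeds the threshold within 50 years.
import numpy as np
from scipy.special import erf

rng = np.random.default_rng(8)
n = 100_000
D = rng.lognormal(np.log(1e-12), 0.3, n)            # diffusion coefficient, m^2/s
Cs = rng.normal(3.5, 0.7, n).clip(min=0.5)           # surface chloride content, kg/m^3
Ccr = rng.normal(0.9, 0.15, n).clip(min=0.3)         # critical (threshold) content, kg/m^3
cover = rng.normal(0.04, 0.005, n).clip(min=0.02)    # concrete cover, m

t = 50 * 365.25 * 24 * 3600                          # 50 years in seconds
C_at_bar = Cs * (1 - erf(cover / (2 * np.sqrt(D * t))))   # erf solution of Fick's second law
print("P(corrosion initiated within 50 years):", round(float((C_at_bar > Ccr).mean()), 3))
```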

  9. 3D Morphology Prediction of Progressive Spinal Deformities from Probabilistic Modeling of Discriminant Manifolds.

    PubMed

    Kadoury, Samuel; Mandel, William; Roy-Beaudry, Marjolaine; Nault, Marie-Lyne; Parent, Stefan

    2017-01-23

    We introduce a novel approach for predicting the progression of adolescent idiopathic scoliosis from 3D spine models reconstructed from biplanar X-ray images. Recent progress in machine learning has improved classification and prognosis rates, but existing methods lack a probabilistic framework to measure uncertainty in the data. We propose a discriminative probabilistic manifold embedding where locally linear mappings transform data points from high-dimensional space to corresponding low-dimensional coordinates. A discriminant adjacency matrix is constructed to maximize the separation between progressive and non-progressive groups of patients diagnosed with scoliosis, while minimizing the distance in latent variables belonging to the same class. To predict the evolution of deformation, a baseline reconstruction is projected onto the manifold, from which a spatiotemporal regression model is built from parallel transport curves inferred from neighboring exemplars. The rate of progression is modulated by the spine flexibility and curve magnitude of the 3D spine deformation. The method was tested on 745 reconstructions from 133 subjects using longitudinal 3D reconstructions of the spine, with results demonstrating that the discriminative framework can distinguish between progressive and non-progressive scoliotic patients with a classification rate of 81% and prediction differences of 2.1° in main curve angulation, outperforming other manifold learning methods. Our method achieved a higher prediction accuracy and improved the modeling of spatiotemporal morphological changes in highly deformed spines compared to other learning methods.

  10. A multivariate probabilistic graphical model for real-time volcano monitoring on Mount Etna

    NASA Astrophysics Data System (ADS)

    Cannavò, Flavio; Cannata, Andrea; Cassisi, Carmelo; Di Grazia, Giuseppe; Montalto, Placido; Prestifilippo, Michele; Privitera, Eugenio; Coltelli, Mauro; Gambino, Salvatore

    2017-05-01

    Real-time assessment of the state of a volcano plays a key role for civil protection purposes. Unfortunately, because of the coupling of highly nonlinear and partially known complex volcanic processes, and the intrinsic uncertainties in measured parameters, the state of a volcano needs to be expressed in probabilistic terms, thus making any rapid assessment sometimes impractical. With the aim of aiding on-duty personnel in volcano-monitoring roles, we present an expert system approach to automatically estimate the ongoing state of a volcano from all available measurements. The system consists of a probabilistic model that encodes the conditional dependencies between measurements and volcanic states in a directed acyclic graph and renders an estimation of the probability distribution of the feasible volcanic states. We test the model with Mount Etna (Italy) as a case study by considering a long record of multivariate data. Results indicate that the proposed model is effective for early warning and has considerable potential for decision-making purposes.

  11. Data assimilation for unsaturated flow models with restart adaptive probabilistic collocation based Kalman filter

    NASA Astrophysics Data System (ADS)

    Man, Jun; Li, Weixuan; Zeng, Lingzao; Wu, Laosheng

    2016-06-01

    The ensemble Kalman filter (EnKF) has gained popularity in hydrological data assimilation problems. Because it is a Monte Carlo based method, a sufficiently large ensemble size is usually required to guarantee accuracy. As an alternative approach, the probabilistic collocation based Kalman filter (PCKF) employs the polynomial chaos expansion (PCE) to represent and propagate the uncertainties in parameters and states. However, PCKF suffers from the so-called "curse of dimensionality". Its computational cost increases drastically with the increasing number of parameters and system nonlinearity. Furthermore, PCKF may fail to provide accurate estimations due to the joint updating scheme for strongly nonlinear models. Motivated by recent developments in uncertainty quantification and EnKF, we propose a restart adaptive probabilistic collocation based Kalman filter (RAPCKF) for data assimilation in unsaturated flow problems. During the implementation of RAPCKF, the important parameters are identified and active PCE basis functions are adaptively selected at each assimilation step; the "restart" scheme is utilized to eliminate the inconsistency between updated model parameters and state variables. The performance of RAPCKF is systematically tested with numerical cases of unsaturated flow models. It is shown that the adaptive approach and restart scheme can significantly improve the performance of PCKF. Moreover, RAPCKF has been demonstrated to be more efficient than EnKF with the same computational cost.

  12. Biomedical time series clustering based on non-negative sparse coding and probabilistic topic model.

    PubMed

    Wang, Jin; Liu, Ping; F H She, Mary; Nahavandi, Saeid; Kouzani, Abbas

    2013-09-01

    Biomedical time series clustering that groups a set of unlabelled temporal signals according to their underlying similarity is very useful for biomedical records management and analysis such as biosignals archiving and diagnosis. In this paper, a new framework for clustering of long-term biomedical time series such as electrocardiography (ECG) and electroencephalography (EEG) signals is proposed. Specifically, local segments extracted from the time series are projected as a combination of a small number of basis elements in a trained dictionary by non-negative sparse coding. A Bag-of-Words (BoW) representation is then constructed by summing up all the sparse coefficients of local segments in a time series. Based on the BoW representation, a probabilistic topic model that was originally developed for text document analysis is extended to discover the underlying similarity of a collection of time series. The underlying similarity of biomedical time series is well captured owing to the statistical nature of the probabilistic topic model. Experiments on three datasets constructed from publicly available EEG and ECG signals demonstrate that the proposed approach achieves better accuracy than existing state-of-the-art methods, and is insensitive to model parameters such as the length of local segments and dictionary size.

  13. Probabilistic, multi-variate flood damage modelling using random forests and Bayesian networks

    NASA Astrophysics Data System (ADS)

    Kreibich, Heidi; Schröter, Kai

    2015-04-01

    Decisions on flood risk management and adaptation are increasingly based on risk analyses. Such analyses are associated with considerable uncertainty, even more if changes in risk due to global change are expected. Although uncertainty analysis and probabilistic approaches have received increased attention recently, they are hardly applied in flood damage assessments. Most of the damage models usually applied in standard practice have in common that complex damaging processes are described by simple, deterministic approaches like stage-damage functions. This presentation will show approaches for probabilistic, multi-variate flood damage modelling on the micro- and meso-scale and discuss their potential and limitations. Reference: Merz, B.; Kreibich, H.; Lall, U. (2013): Multi-variate flood damage assessment: a tree-based data-mining approach. NHESS, 13(1), 53-64. Schröter, K., Kreibich, H., Vogel, K., Riggelsen, C., Scherbaum, F., Merz, B. (2014): How useful are complex flood damage models? - Water Resources Research, 50, 4, p. 3378-3395.

  14. Aircraft Conflict Analysis and Real-Time Conflict Probing Using Probabilistic Trajectory Modeling

    NASA Technical Reports Server (NTRS)

    Yang, Lee C.; Kuchar, James K.

    2000-01-01

    Methods for maintaining separation between aircraft in the current airspace system have been built from a foundation of structured routes and evolved procedures. However, as the airspace becomes more congested and the chance of failures or operational error becomes more problematic, automated conflict alerting systems have been proposed to help provide decision support and to serve as traffic monitoring aids. The problem of conflict detection and resolution has been tackled in a number of different ways, but in this thesis, it is recast as a problem of prediction in the presence of uncertainties. Much of the focus is concentrated on the errors and uncertainties from the working trajectory model used to estimate future aircraft positions. The more accurate the prediction, the more likely it is that an ideal (no false alarms, no missed detections) alerting system can be designed. Additional insights into the problem were brought forth by a review of current operational and developmental approaches found in the literature. An iterative, trial-and-error approach to threshold design was identified. When examined from a probabilistic perspective, the threshold parameters were found to be a surrogate for probabilistic performance measures. To overcome the limitations in the current iterative design method, a new direct approach is presented where the performance measures are directly computed and used to perform the alerting decisions. The methodology is shown to handle complex encounter situations (3-D, multi-aircraft, multi-intent, with uncertainties) with relative ease. Utilizing a Monte Carlo approach, a method was devised to perform the probabilistic computations in near real-time. Not only does this greatly increase the method's potential as an analytical tool, but it also opens up the possibility for use as a real-time conflict alerting probe. A prototype alerting logic was developed and has been utilized in several NASA Ames Research Center experimental studies.
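
    A toy Monte Carlo conflict probe in the spirit of the approach above: two straight-line trajectories with position uncertainty that grows over a ten-minute look-ahead, and a conflict probability estimated as the fraction of samples whose minimum separation drops below 5 nmi. The geometry, speeds, and error growth are invented.

```python
# Hedged sketch: probabilistic trajectory propagation and conflict probability.
import numpy as np

rng = np.random.default_rng(9)
n = 20_000
t = np.linspace(0, 600, 61)                       # 10-minute look-ahead, seconds

def sample_tracks(p0, v, sigma_growth):
    """Nominal straight-line track plus Gaussian position error growing with time."""
    nominal = p0[None, :] + t[:, None] * v[None, :]                # (time, xy), nmi
    err = rng.normal(0, 1, size=(n, t.size, 2)) * (sigma_growth * t)[None, :, None]
    return nominal[None, :, :] + err                               # (samples, time, xy)

a = sample_tracks(np.array([0.0, 0.0]), np.array([0.12, 0.0]), sigma_growth=0.002)
b = sample_tracks(np.array([70.0, 3.0]), np.array([-0.12, 0.0]), sigma_growth=0.002)

min_sep = np.linalg.norm(a - b, axis=2).min(axis=1)                # nmi, per sample
print("estimated P(separation < 5 nmi):", round(float((min_sep < 5.0).mean()), 3))
```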

  15. Probabilistic Mixture Regression Models for Alignment of LC-MS Data

    PubMed Central

    Befekadu, Getachew K.; Tadesse, Mahlet G.; Tsai, Tsung-Heng; Ressom, Habtom W.

    2010-01-01

    A novel framework of a probabilistic mixture regression model (PMRM) is presented for alignment of liquid chromatography-mass spectrometry (LC-MS) data with respect to both retention time (RT) and mass-to-charge ratio (m/z). The expectation maximization algorithm is used to estimate the joint parameters of spline-based mixture regression models and prior transformation density models. The latter accounts for the variability in RT points, m/z values, and peak intensities. The applicability of PMRM for alignment of LC-MS data is demonstrated through three datasets. The performance of PMRM is compared with other alignment approaches, including dynamic time warping, correlation optimized warping, and continuous profile model, in terms of the coefficient of variation of replicate LC-MS runs and accuracy in detecting differentially abundant peptides/proteins. PMID:20837998

  16. A probabilistic model of insolation for the Mojave desert-area

    NASA Technical Reports Server (NTRS)

    Hester, O. V.; Reid, M. S.

    1978-01-01

    A preliminary solar model has been developed for the area around JPL's Goldstone Space Communications Complex. The model has the capability of producing any or all of the following outputs: (1) a clear sky theoretical amount of radiation, (2) solar radiation for clear sky, cloudy sky or partially clear sky depending on certain probabilistic parameters, and (3) an array of average solar energy reception rates (solar intensities) in kW/sq m for a specified length of time. This model is based on the ASHRAE clear day model, which is modulated by the effects of clouds. The distribution of clouds for any given time is determined by a combination of statistical procedures, measured insolation values over a six-month period, and a data bank of 19 years of cloud cover information.

  18. A simulation model for probabilistic analysis of Space Shuttle abort modes

    NASA Technical Reports Server (NTRS)

    Hage, R. T.

    1993-01-01

    A simulation model which was developed to provide a probabilistic analysis tool to study the various space transportation system abort mode situations is presented. The simulation model is based on Monte Carlo simulation of an event-tree diagram which accounts for events during the space transportation system's ascent and its abort modes. The simulation model considers just the propulsion elements of the shuttle system (i.e., external tank, main engines, and solid boosters). The model was developed to provide a better understanding of the probability of occurrence and successful completion of abort modes during the vehicle's ascent. The results of the simulation runs discussed are for demonstration purposes only; they are not official NASA probability estimates.
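
    A minimal sketch of a Monte Carlo walk over a much simplified abort event tree, in the spirit of the model described above. The branch probabilities, abort modes and timing rules are invented placeholders and, as the record itself stresses, nothing here is an official NASA estimate.

      import numpy as np

      rng = np.random.default_rng(0)

      # Placeholder per-flight probabilities (illustrative only).
      P_ENGINE_FAIL = 0.01        # one main engine shuts down during ascent
      P_ABORT_SUCCESS = {         # probability the abort completes successfully,
          "RTLS": 0.90,           # given which abort mode is available
          "TAL": 0.95,
          "ATO": 0.99,
      }

      def one_ascent():
          """Walk a simplified event tree for a single simulated ascent."""
          if rng.random() > P_ENGINE_FAIL:
              return "nominal"
          # Available abort mode depends on when the failure occurs (fraction of ascent elapsed).
          t_fail = rng.random()
          mode = "RTLS" if t_fail < 0.3 else ("TAL" if t_fail < 0.7 else "ATO")
          return mode if rng.random() < P_ABORT_SUCCESS[mode] else "loss_of_vehicle"

      outcomes = [one_ascent() for _ in range(100_000)]
      for label in ("nominal", "RTLS", "TAL", "ATO", "loss_of_vehicle"):
          print(label, outcomes.count(label) / len(outcomes))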

  19. Global sensitivity analysis, probabilistic calibration, and predictive assessment for the data assimilation linked ecosystem carbon model

    SciTech Connect

    Safta, C.; Ricciuto, Daniel M.; Sargsyan, Khachik; Debusschere, B.; Najm, H. N.; Williams, M.; Thornton, Peter E.

    2015-07-01

    In this paper we propose a probabilistic framework for an uncertainty quantification (UQ) study of a carbon cycle model and focus on the comparison between steady-state and transient simulation setups. A global sensitivity analysis (GSA) study indicates the parameters and parameter couplings that are important at different times of the year for quantities of interest (QoIs) obtained with the data assimilation linked ecosystem carbon (DALEC) model. We then employ a Bayesian approach and a statistical model error term to calibrate the parameters of DALEC using net ecosystem exchange (NEE) observations at the Harvard Forest site. The calibration results are employed in the second part of the paper to assess the predictive skill of the model via posterior predictive checks.

  20. Global sensitivity analysis, probabilistic calibration, and predictive assessment for the data assimilation linked ecosystem carbon model

    DOE PAGES

    Safta, C.; Ricciuto, Daniel M.; Sargsyan, Khachik; ...

    2015-07-01

    In this paper we propose a probabilistic framework for an uncertainty quantification (UQ) study of a carbon cycle model and focus on the comparison between steady-state and transient simulation setups. A global sensitivity analysis (GSA) study indicates the parameters and parameter couplings that are important at different times of the year for quantities of interest (QoIs) obtained with the data assimilation linked ecosystem carbon (DALEC) model. We then employ a Bayesian approach and a statistical model error term to calibrate the parameters of DALEC using net ecosystem exchange (NEE) observations at the Harvard Forest site. The calibration results are employed in the second part of the paper to assess the predictive skill of the model via posterior predictive checks.

  1. Synaptic Computation Underlying Probabilistic Inference

    PubMed Central

    Soltani, Alireza; Wang, Xiao-Jing

    2010-01-01

    In this paper we propose that synapses may be the workhorse of neuronal computations that underlie probabilistic reasoning. We built a neural circuit model for probabilistic inference when information provided by different sensory cues needs to be integrated, and the predictive powers of individual cues about an outcome are deduced through experience. We found that bounded synapses naturally compute, through reward-dependent plasticity, the posterior probability that a choice alternative is correct given that a cue is presented. Furthermore, a decision circuit endowed with such synapses makes choices based on the summated log posterior odds and performs near-optimal cue combination. The model is validated by reproducing salient observations of, and provides insights into, a monkey experiment using a categorization task. Our model thus suggests a biophysical instantiation of the Bayesian decision rule, while predicting important deviations from it similar to ‘base-rate neglect’ observed in human studies when alternatives have unequal priors. PMID:20010823
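
    The following toy sketch illustrates the core intuition: a bounded synaptic strength updated by reward-dependent plasticity hovers near the posterior probability of reward given a cue. The learning rule, rate and cue statistics are simplified assumptions for illustration, not the authors' circuit model.

      import numpy as np

      rng = np.random.default_rng(1)

      # True probability that the choice is rewarded when each cue is shown (illustrative).
      p_reward_given_cue = {"cue1": 0.8, "cue2": 0.3}
      w = {c: 0.5 for c in p_reward_given_cue}   # bounded synaptic strengths in [0, 1]
      alpha = 0.02                               # plasticity rate

      for _ in range(5000):
          cue = rng.choice(list(p_reward_given_cue))
          rewarded = rng.random() < p_reward_given_cue[cue]
          # Reward-dependent plasticity: potentiate toward 1 on reward, depress toward 0 otherwise.
          w[cue] += alpha * ((1.0 if rewarded else 0.0) - w[cue])

      print(w)   # each strength ends up near the corresponding posterior P(reward | cue)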

  2. Efficient Location Uncertainty Treatment for Probabilistic Modelling of Portfolio Loss from Earthquake Events

    NASA Astrophysics Data System (ADS)

    Scheingraber, Christoph; Käser, Martin; Allmann, Alexander

    2017-04-01

    Probabilistic seismic risk analysis (PSRA) is a well-established method for modelling loss from earthquake events. In the insurance industry, it is widely employed for probabilistic modelling of loss to a distributed portfolio. In this context, precise exposure locations are often unknown, which results in considerable loss uncertainty. The treatment of exposure uncertainty has already been identified as an area where PSRA would benefit from increased research attention. However, epistemic location uncertainty has so far received comparatively little research attention. We propose a new framework for efficient treatment of location uncertainty. To demonstrate the usefulness of this novel method, a large number of synthetic portfolios resembling real-world portfolios are systematically analyzed. We investigate the effect of portfolio characteristics such as value distribution, portfolio size, or proportion of risk items with unknown coordinates on loss variability. Several sampling criteria to increase the computational efficiency of the framework are proposed and put into the wider context of well-established Monte Carlo variance reduction techniques. The performance of each of the proposed criteria is analyzed.

  3. Data assimilation for unsaturated flow models with restart adaptive probabilistic collocation based Kalman filter

    SciTech Connect

    Man, Jun; Li, Weixuan; Zeng, Lingzao; Wu, Laosheng

    2016-06-01

    The ensemble Kalman filter (EnKF) has gained popularity in hydrological data assimilation problems. As a Monte Carlo based method, it usually requires a relatively large ensemble size to guarantee accuracy. As an alternative approach, the probabilistic collocation based Kalman filter (PCKF) employs polynomial chaos to approximate the original system. In this way, the sampling error can be reduced. However, PCKF suffers from the so-called "curse of dimensionality". When the system nonlinearity is strong and the number of parameters is large, PCKF could be even more computationally expensive than EnKF. Motivated by recent developments in uncertainty quantification, we propose a restart adaptive probabilistic collocation based Kalman filter (RAPCKF) for data assimilation in unsaturated flow problems. During the implementation of RAPCKF, the important parameters are identified and active PCE basis functions are adaptively selected. The "restart" technique is used to eliminate the inconsistency between model parameters and states. The performance of RAPCKF is tested with numerical cases of unsaturated flow models. It is shown that RAPCKF is more efficient than EnKF at the same computational cost. Compared with the traditional PCKF, the RAPCKF is more applicable in strongly nonlinear and high-dimensional problems.

  4. Cost-utility analysis of cataract surgery in Japan: a probabilistic Markov modeling study.

    PubMed

    Hiratsuka, Yoshimune; Yamada, Masakazu; Akune, Yoko; Murakami, Akira; Okada, Annabelle A; Yamashita, Hidetoshi; Ohashi, Yuichi; Yamagishi, Naoya; Tamura, Hiroshi; Fukuhara, Shunichi; Takura, Tomoyuki

    2013-07-01

    To evaluate, using the best available clinical data in Japan, the cost-effectiveness of cataract surgery by estimating the incremental cost per quality-adjusted life year (QALY) gained. A Markov model with a probabilistic cohort analysis was constructed to calculate the incremental costs per QALY gained by cataract surgery in Japan. A 1-year cycle length and a 20-year horizon were applied. Best available evidence in Japan supplied the model with data on the course of cataract surgery. Uncertainty was explored using univariate and probabilistic sensitivity analysis. In the base case analysis, cataract surgery was associated with incremental costs of Japanese yen (¥) 551,513 (US$ 6,920) and incremental effectiveness of 3.38 QALYs per cataract patient. The incremental cost-effectiveness ratio (ICER) was ¥ 163,331 (US$ 2,049) per QALY. In Monte Carlo simulation, the average patient with cataract surgery accrued 4.65 [95 % confidence interval (CI): 2.75-5.69] more QALYs than patients without surgery, giving an ICER of ¥ 118,460 (95 % CI: 73,516-207,926) (US$ 1,486) per QALY. Cataract surgery in Japan is highly cost-effective even when allowing for the uncertainty of the known variability that exists in estimates of the costs, utilities, and postoperative complication rate.
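
    A minimal sketch of a cohort Markov model with a 1-year cycle, a 20-year horizon and discounting, of the kind used in such cost-utility analyses. The states, transition probabilities, utilities and costs below are placeholders chosen for illustration, not the Japanese study inputs.

      import numpy as np

      # Illustrative 3-state cohort Markov model: good vision / visually impaired / dead.
      P = np.array([[0.95, 0.02, 0.03],    # good vision -> stays / degrades / dies
                    [0.00, 0.97, 0.03],    # impaired    -> stays impaired / dies
                    [0.00, 0.00, 1.00]])   # dead is absorbing
      utilities = np.array([0.85, 0.60, 0.0])     # QALY weight per state per year (placeholders)
      costs = np.array([200.0, 500.0, 0.0])       # annual follow-up costs, arbitrary currency

      def run_cohort(P, utilities, annual_costs, start, init_cost=0.0, cycles=20, discount=0.02):
          state = np.array(start, dtype=float)
          qalys, cost = 0.0, init_cost
          for t in range(cycles):
              df = 1.0 / (1.0 + discount) ** t
              qalys += df * state @ utilities
              cost += df * state @ annual_costs
              state = state @ P                   # advance one 1-year cycle
          return qalys, cost

      q1, c1 = run_cohort(P, utilities, costs, start=[1, 0, 0], init_cost=3000.0)  # surgery arm
      q0, c0 = run_cohort(P, utilities, costs, start=[0, 1, 0])                    # no-surgery arm
      print("incremental QALYs:", q1 - q0)
      print("ICER (cost per QALY gained):", (c1 - c0) / (q1 - q0))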

  5. Modeling PSA Problems - I: The Stimulus-Driven Theory of Probabilistic Dynamics

    SciTech Connect

    Labeau, P.E.; Izquierdo, J.M.

    2005-06-15

    The theory of probabilistic dynamics (TPD) offers a framework capable of modeling the interaction between the physical evolution of a system in transient conditions and the succession of branchings defining a sequence of events. Nonetheless, the Chapman-Kolmogorov equation, besides being inherently Markovian, assumes instantaneous changes in the system dynamics when a setpoint is crossed. In actuality, a transition between two dynamic evolution regimes of the system is a two-phase process. First, conditions corresponding to the triggering of a transition have to be met; this phase will be referred to as the activation of a 'stimulus'. Then, a time delay must elapse before the actual occurrence of the event causing the transition to take place. When this delay cannot be neglected and is a random quantity, the general TPD can no longer be used as such. Moreover, these delays are likely to influence the ordering of events in an accident sequence with competing situations, and the process of delineating sequences in the probabilistic safety analysis of a plant might therefore be affected in turn. This paper aims at presenting several extensions of the classical TPD, in which additional modeling capabilities are progressively introduced. A companion paper sketches a discretized approach of these problems.

  6. A probabilistic topic model for clinical risk stratification from electronic health records.

    PubMed

    Huang, Zhengxing; Dong, Wei; Duan, Huilong

    2015-12-01

    Risk stratification aims to provide physicians with an accurate assessment of a patient's clinical risk such that an individualized prevention or management strategy can be developed and delivered. Existing risk stratification techniques mainly focus on predicting the overall risk of an individual patient in a supervised manner, and, at the cohort level, often offer little insight beyond a flat score-based segmentation from the labeled clinical dataset. To this end, in this paper, we propose a new approach for risk stratification by exploring a large volume of electronic health records (EHRs) in an unsupervised fashion. Along this line, this paper proposes a novel probabilistic topic modeling framework called probabilistic risk stratification model (PRSM) based on Latent Dirichlet Allocation (LDA). The proposed PRSM recognizes a patient's clinical state as a probabilistic combination of latent sub-profiles, and generates sub-profile-specific risk tiers of patients from their EHRs in a fully unsupervised fashion. The achieved stratification results can be easily recognized as high-, medium- and low-risk, respectively. In addition, we present an extension of PRSM, called weakly supervised PRSM (WS-PRSM), by incorporating minimum prior information into the model, in order to improve the risk stratification accuracy, and to make our models highly portable to risk stratification tasks of various diseases. We verify the effectiveness of the proposed approach on a clinical dataset containing 3463 coronary heart disease (CHD) patient instances. Both PRSM and WS-PRSM were compared with two established supervised risk stratification algorithms, i.e., logistic regression and support vector machine; the comparison showed the effectiveness of our models in risk stratification of CHD in terms of area under the receiver operating characteristic curve (AUC) analysis. In addition, WS-PRSM achieved a performance gain of over 2% compared with PRSM on the experimental dataset, demonstrating that

  7. Efficient Probabilistic Forecasting for High-Resolution Models through Clustered-State Data Assimilation

    NASA Astrophysics Data System (ADS)

    Hernandez, F.; Liang, X.

    2016-12-01

    Current data, software, and hardware availability allow simulating geophysical processes at high resolutions in search of more accurate predictions. Similarly, probabilistic modeling through ensemble data assimilation and ensemble forecasting enables managing the underlying uncertainty in a practical way, to avoid overfitting and to better estimate risks. However, as the dimensionality of a model increases, the computational requirements for dealing with the dependencies between state variables skyrocket, severely limiting the access to the benefits of probabilistic approaches. In this work we introduce a data assimilation algorithm for high-resolution models that efficiently captures these dependencies, which are usually encoded in huge covariance matrices that are used to represent probability distributions. Our proposed algorithm performs the probability density and random sampling operations on state representations of much-reduced size: modeling cells are fuzzily clustered into a number of groups, each of which can be represented by an averaged "centroid" cell. The clustering is performed using a similarity measure based on the cells' state variables and parameters. The measure was optimized such that, when clustering the initial state of a set of hydrologic models, these would perform as similarly as possible to the set of original models with un-clustered initial states. Several structures for the similarity measure, in addition to a weighted Euclidean formulation, were investigated and optimized using evolutionary algorithms. The proposed approach was incorporated into OPTIMISTS, our state-of-the-art hybrid sequential/variational data assimilation algorithm, which was coupled with the Distributed Hydrology Soil Vegetation Model (DHSVM). We performed comparison tests between the modified OPTIMISTS version and the original one in which the dependencies between variables are only partially accounted for. The results with a model of a real watershed with

  8. A nonlinear model for assessing multiple probabilistic risks: a case study in South five-island of Changdao National Nature Reserve in China.

    PubMed

    Wang, Xiao Long; Zhang, Jie

    2007-12-01

    Several methods for estimating the potential impacts caused by multiple probabilistic risks have been suggested. These existing methods mostly rely on the weighted-sum algorithm to address the need for integrated risk assessment. This paper develops a nonlinear model to perform such an assessment. The joint probability algorithm has been applied to the model development. An application of the developed model in South five-island of Changdao National Nature Reserve, China, combining remote sensing data and a GIS technique, provides a reasonable risk assessment. Based on the case study, we discuss the feasibility of the model. We propose that the model has the potential for use in identifying the regional primary stressor, investigating the most vulnerable habitat, and assessing the integrated impact of multiple stressors.
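
    The contrast between a weighted-sum score and a joint-probability combination can be sketched as follows. The stressor probabilities and weights are invented, and the joint rule shown assumes independent stressors, which the paper's model may treat differently.

      # Two ways to aggregate per-stressor impact probabilities for one habitat cell
      # (values are made up; independence is assumed for the joint-probability rule).
      p_stressors = {"oil_spill": 0.05, "storm_surge": 0.20, "invasive_species": 0.10}
      weights = {"oil_spill": 0.5, "storm_surge": 0.3, "invasive_species": 0.2}

      weighted_sum = sum(weights[s] * p for s, p in p_stressors.items())

      none = 1.0
      for p in p_stressors.values():
          none *= (1.0 - p)              # probability that no stressor causes an impact
      joint_probability = 1.0 - none     # probability that at least one does

      print("weighted sum:", weighted_sum)
      print("joint probability:", joint_probability)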

  9. Spatial dispersion of interstellar civilizations: a probabilistic site percolation model in three dimensions

    NASA Astrophysics Data System (ADS)

    Hair, Thomas W.; Hedman, Andrew D.

    2013-01-01

    A model of the spatial emergence of an interstellar civilization into a uniform distribution of habitable systems is presented. The process of emigration is modelled as a three-dimensional probabilistic cellular automaton. An algorithm is presented which defines both the daughter colonies of the original seed vertex and all subsequent connected vertices, and the probability of a connection between any two vertices. The automaton is analysed over a wide set of parameters for iterations that represent up to 250 000 years within the model's assumptions. Emigration patterns are characterized and used to evaluate two hypotheses that aim to explain the Fermi Paradox. The first hypothesis states that interstellar emigration takes too long for any civilization to have yet come within a detectable distance, and the second states that large volumes of habitable space may be left uninhabited by an interstellar civilization and Earth is located in one of these voids.
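
    A minimal sketch of a three-dimensional probabilistic cellular automaton of this kind: starting from a seed site, every colonized site attempts to colonize each neighbouring site with a fixed probability per iteration. The lattice size, neighbourhood and colonization probability are illustrative, not the paper's parameterization.

      import numpy as np

      rng = np.random.default_rng(42)

      N, p, steps = 21, 0.1, 20               # lattice size, colonization probability, iterations
      grid = np.zeros((N, N, N), dtype=bool)
      grid[N // 2, N // 2, N // 2] = True     # seed civilization at the centre

      neighbours = [(dx, dy, dz) for dx in (-1, 0, 1) for dy in (-1, 0, 1) for dz in (-1, 0, 1)
                    if (dx, dy, dz) != (0, 0, 0)]

      for _ in range(steps):
          sources = np.argwhere(grid)
          for x, y, z in sources:
              for dx, dy, dz in neighbours:
                  nx, ny, nz = x + dx, y + dy, z + dz
                  if 0 <= nx < N and 0 <= ny < N and 0 <= nz < N and not grid[nx, ny, nz]:
                      if rng.random() < p:
                          grid[nx, ny, nz] = True   # a daughter colony is founded

      print("colonised fraction:", grid.mean())     # uninhabited voids show up as the remainder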

  10. Probabilistic integrated risk assessment of human exposure risk to environmental bisphenol A pollution sources.

    PubMed

    Fu, Keng-Yen; Cheng, Yi-Hsien; Chio, Chia-Pin; Liao, Chung-Min

    2016-10-01

    Environmental bisphenol A (BPA) exposure has been linked to a variety of adverse health effects such as developmental and reproductive issues. However, establishing a clear association between BPA exposure and the likelihood of adverse human health effects is complex and fundamentally uncertain. The purpose of this study was to assess the potential exposure risks from environmental BPA among the Chinese population based on five human health outcomes, namely immune response, uterotrophic assay, cardiovascular disease (CVD), diabetes, and behavior change. We addressed these health concerns by using a stochastic integrated risk assessment approach. The BPA dose-dependent likelihood of effects was reconstructed by a series of Hill models based on animal models or epidemiological data. We developed a physiologically based pharmacokinetic (PBPK) model that allows estimation of urinary BPA concentration from external exposures. Here we showed that the daily average exposure concentrations of BPA and urinary BPA estimates were consistent with the published data. We found that BPA exposures were less likely to pose significant risks for infants (0-1 year) and adults (male and female >20 years) with <10^-6-fold increase in uterus weight and immune response outcomes, respectively. Moreover, our results indicated that there was a 50 % risk probability that the response outcomes of CVD, diabetes, and behavior change with or without skin absorption would increase 10^-4- to 10^-2-fold. We conclude that our approach provides a powerful tool for tracking and managing human long-term BPA susceptibility in relation to multiple exposure pathways, and for informing the public of the negligible magnitude of environmental BPA pollution impacts on human health.
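
    A simple sketch of a Hill dose-response curve driving a Monte Carlo estimate of a fold-increase risk, assuming a lognormal population distribution of urinary BPA. All parameter values (Hill coefficients, EC50, lognormal moments) are placeholders, not the fitted values of the study, and the PBPK step is omitted.

      import numpy as np

      rng = np.random.default_rng(7)

      def hill(dose, emax=1.0, ec50=50.0, n=2.0):
          """Hill dose-response: fraction of maximal response at a given dose (placeholder params)."""
          return emax * dose ** n / (ec50 ** n + dose ** n)

      # Illustrative population variability in urinary BPA (ug/L) as a lognormal distribution.
      urinary_bpa = rng.lognormal(mean=np.log(1.5), sigma=0.8, size=100_000)

      baseline = hill(np.median(urinary_bpa))
      fold_increase = hill(urinary_bpa) / baseline
      print("P(response fold increase > 10):", np.mean(fold_increase > 10))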

  11. Integrated Modeling Systems

    DTIC Science & Technology

    1989-01-01

    Geoffrion, Arthur M. (1989): Integrated Modeling Systems. Computer Science in Economics and Management 2, pp. 3-15 (DTIC accession AD-A215 219; UCLA, Reprint No. 238). The remainder of the record consists of citation fragments, including Federgruen, A. and Zipkin, P. (1984): 'A Combined Vehicle Routing and Inventory Allocation Problem', Operations Research 32(5), 1019-1037, and a 1985 Management Science paper on optimal policies for repair kits and spare machines (WMSI Working Paper 318).

  12. On the applicability of probabilistics

    SciTech Connect

    Roth, P.G.

    1996-12-31

    GEAE's traditional lifing approach, based on Low Cycle Fatigue (LCF) curves, is evolving for fracture critical powder metal components by incorporating probabilistic fracture mechanics analysis. Supporting this move is a growing validation database which convincingly demonstrates that probabilistics work given the right inputs. Significant efforts are being made to ensure the right inputs. For example, Heavy Liquid Separation (HLS) analysis has been developed to quantify and control inclusion content (1). Also, an intensive seeded fatigue program providing a model for crack initiation at inclusions is ongoing (2). Despite the optimism and energy, probabilistics are only tools and have limitations. Designing to low failure probabilities helps provide protection, but other strategies are needed to protect against surprises. A low risk design limit derived from a predicted failure distribution can lead to a high risk deployment if there are unaccounted-for deviations from analysis assumptions. Recognized deviations which are statistically quantifiable can be integrated into the probabilistic analysis (an advantage of the approach). When deviations are known to be possible but are not properly describable statistically, it may be more appropriate to maintain the traditional position of conservatively bounding relevant input parameters. Finally, safety factors on analysis results may be called for in cases where there is little experience supporting new design concepts or material applications (where unrecognized deviations might be expected).

  13. Probabilistic Residual Strength Model Developed for Life Prediction of Ceramic Matrix Composites

    NASA Technical Reports Server (NTRS)

    Thomas, David J.; Verrilli, Michael J.; Calomino, Anthony M.

    2004-01-01

    For the next generation of reusable launch vehicles, NASA is investigating introducing ceramic matrix composites (CMCs) in place of current superalloys for structural propulsion applications (e.g., nozzles, vanes, combustors, and heat exchangers). The higher use temperatures of CMCs will reduce vehicle weight by eliminating and/or reducing cooling system requirements. The increased strength-to-weight ratio of CMCs relative to superalloys further enhances their weight savings potential. However, in order to provide safe designs for components made of these new materials, a comprehensive life prediction methodology for CMC structures needs to be developed. A robust methodology for lifing composite structures has yet to be adopted by the engineering community. Current industry design practice continues to utilize deterministic empirically based models borrowed from metals design for predicting material life capabilities. The deterministic nature of these models inadequately addresses the stochastic character of brittle composites, and their empirical reliance makes predictions beyond the experimental test conditions a risky extrapolation. A team of engineers at the NASA Glenn Research Center has been developing a new life prediction engineering model. The Probabilistic Residual Strength (PRS) model uses the residual strength of the composite as its damage metric. Expected life and material strength are both considered probabilistically to account for the observed stochastic material response. Extensive experimental testing has been carried out on C/SiC (a candidate aerospace CMC material system) in a controlled 1000 ppm O2/argon environment at elevated temperatures of 800 and 1200 C. The test matrix was established to allow observation of the material behavior, characterization of the model, and validation of the model's predictive capabilities. Sample results of the validation study are illustrated in the graphs.

  14. Design and analysis of DNA strand displacement devices using probabilistic model checking.

    PubMed

    Lakin, Matthew R; Parker, David; Cardelli, Luca; Kwiatkowska, Marta; Phillips, Andrew

    2012-07-07

    Designing correct, robust DNA devices is difficult because of the many possibilities for unwanted interference between molecules in the system. DNA strand displacement has been proposed as a design paradigm for DNA devices, and the DNA strand displacement (DSD) programming language has been developed as a means of formally programming and analysing these devices to check for unwanted interference. We demonstrate, for the first time, the use of probabilistic verification techniques to analyse the correctness, reliability and performance of DNA devices during the design phase. We use the probabilistic model checker PRISM, in combination with the DSD language, to design and debug DNA strand displacement components and to investigate their kinetics. We show how our techniques can be used to identify design flaws and to evaluate the merits of contrasting design decisions, even on devices comprising relatively few inputs. We then demonstrate the use of these components to construct a DNA strand displacement device for approximate majority voting. Finally, we discuss some of the challenges and possible directions for applying these methods to more complex designs.

  15. Use of probabilistic inversion to model qualitative expert input when selecting a new nuclear reactor technology

    NASA Astrophysics Data System (ADS)

    Merritt, Charles R., Jr.

    Complex investment decisions by corporate executives often require the comparison of dissimilar attributes and competing technologies. A technique to evaluate qualitative input from experts using a Multi-Criteria Decision Method (MCDM) is described to select a new reactor technology for a merchant nuclear generator. The high capital cost, risks from design, licensing and construction, and reactor safety and security are some of the diverse considerations when choosing a reactor design. Three next generation reactor technologies are examined: the Advanced Passive 1000 (AP1000) from Westinghouse, the Economic Simplified Boiling Water Reactor (ESBWR) from General Electric, and the U.S. Evolutionary Power Reactor (U.S. EPR) from AREVA. Recent developments in MCDM and decision support systems are described. The uncertainty inherent in experts' opinions for the attribute weighting in the MCDM is modeled through the use of probabilistic inversion. In probabilistic inversion, a function is inverted into a random variable within a defined range. Once the distribution is created, random samples based on the distribution are used to perform a sensitivity analysis on the decision results to verify the "strength" of the results. The decision results for the pool of experts identified the U.S. EPR as the optimal choice.

  16. Probabilistic Failure Analysis of Bone Using a Finite Element Model of Mineral-Collagen Composites

    PubMed Central

    Dong, X. Neil; Guda, Teja; Millwater, Harry R.; Wang, Xiaodu

    2009-01-01

    Microdamage accumulation is a major pathway for energy dissipation during the post-yield deformation of bone. In this study, a two-dimensional probabilistic finite element model of a mineral-collagen composite was developed to investigate the influence of the tissue and ultrastructural properties of bone on the evolution of microdamage from an initial defect in tension. The probabilistic failure analyses indicated that the microdamage progression would be along the plane of the initial defect when the debonding at mineral-collagen interfaces was either absent or limited in the vicinity of the defect. In this case, the formation of a linear microcrack would be facilitated. However, the microdamage progression would be scattered away from the initial defect plane if interfacial debonding takes place at a large scale. This would suggest the possible formation of diffuse damage. In addition to interfacial debonding, the sensitivity analyses indicated that the microdamage progression was also dependent on the other material and ultrastructural properties of bone. The intensity of stress concentration accompanied with microdamage progression was more sensitive to the elastic modulus of the mineral phase and the nonlinearity of the collagen phase, whereas the scattering of failure location was largely dependent on the mineral to collagen ratio and the nonlinearity of the collagen phase. The findings of this study may help in understanding the post-yield behavior of bone at the ultrastructural level and shed light on the underlying mechanism of bone fractures. PMID:19058806

  17. Steady-State Analysis of Genetic Regulatory Networks Modelled by Probabilistic Boolean Networks

    PubMed Central

    Gluhovsky, Ilya; Hashimoto, Ronaldo F.; Dougherty, Edward R.; Zhang, Wei

    2003-01-01

    Probabilistic Boolean networks (PBNs) have recently been introduced as a promising class of models of genetic regulatory networks. The dynamic behaviour of PBNs can be analysed in the context of Markov chains. A key goal is the determination of the steady-state (long-run) behaviour of a PBN by analysing the corresponding Markov chain. This allows one to compute the long-term influence of a gene on another gene or determine the long-term joint probabilistic behaviour of a few selected genes. Because matrix-based methods quickly become prohibitive for large sizes of networks, we propose the use of Monte Carlo methods. However, the rate of convergence to the stationary distribution becomes a central issue. We discuss several approaches for determining the number of iterations necessary to achieve convergence of the Markov chain corresponding to a PBN. Using a recently introduced method based on the theory of two-state Markov chains, we illustrate the approach on a sub-network designed from human glioma gene expression data and determine the joint steady-state probabilities for several groups of genes. PMID:18629023
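
    A minimal sketch of the Monte Carlo idea for a toy two-gene PBN: simulate the chain, discard a burn-in period, and estimate long-run state probabilities from visit frequencies. The predictor functions and selection probabilities are invented, and the convergence diagnostics discussed in the paper are omitted here.

      import numpy as np

      rng = np.random.default_rng(3)

      # Toy PBN over genes (x0, x1): each gene picks one of two Boolean predictors
      # at every step, with the given selection probabilities (all values invented).
      predictors = {
          0: [(lambda x: x[1], 0.7), (lambda x: 1 - x[1], 0.3)],
          1: [(lambda x: x[0] & x[1], 0.6), (lambda x: x[0] | x[1], 0.4)],
      }

      def step(x):
          new = list(x)
          for gene, funcs in predictors.items():
              fs, ps = zip(*funcs)
              f = fs[rng.choice(len(fs), p=ps)]
              new[gene] = f(x)
          return tuple(new)

      counts = {}
      x = (1, 0)
      burn_in, n_steps = 1_000, 50_000
      for t in range(burn_in + n_steps):
          x = step(x)
          if t >= burn_in:
              counts[x] = counts.get(x, 0) + 1

      for state, c in sorted(counts.items()):
          print(state, c / n_steps)       # estimated long-run probability of each state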

  18. Probabilistic modeling of the flows and environmental risks of nano-silica.

    PubMed

    Wang, Yan; Kalinina, Anna; Sun, Tianyin; Nowack, Bernd

    2016-03-01

    Nano-silica, the engineered nanomaterial with one of the largest production volumes, has a wide range of applications in consumer products and industry. This study aimed to quantify the exposure of nano-silica to the environment and to assess its risk to surface waters. Concentrations were calculated for four environmental (air, soil, surface water, sediments) and two technical compartments (wastewater, solid waste) for the EU and Switzerland using probabilistic material flow modeling. The corresponding median concentration in surface water is predicted to be 0.12 μg/l in the EU (0.053-3.3 μg/l, 15/85% quantiles). The concentrations in sediments in the complete sedimentation scenario were found to be the largest among all environmental compartments, with a median annual increase of 0.43 mg/kg · y in the EU (0.19-12 mg/kg · y, 15/85% quantiles). Moreover, probabilistic species sensitivity distributions (PSSD) were computed and the risk of nano-silica in surface waters was quantified by comparing the predicted environmental concentration (PEC) with the predicted no-effect concentration (PNEC) distribution, which was derived from the cumulative PSSD. This assessment suggests that nano-silica currently poses no risk to aquatic organisms in surface waters. Further investigations are needed to assess the risk of nano-silica in other environmental compartments, which is currently not possible due to a lack of ecotoxicological data.

  19. An integrated probabilistic framework for cumulative risk assessment of common mechanism chemicals in food: an example with organophosphorus pesticides.

    PubMed

    Bosgra, Sieto; van der Voet, Hilko; Boon, Polly E; Slob, Wout

    2009-07-01

    This paper presents a framework for integrated probabilistic risk assessment of chemicals in the diet which accounts for the possibility of cumulative exposure to chemicals with a common mechanism of action. Variability between individuals in the population with respect to food consumption, concentrations of chemicals in the consumed foods, food processing habits and sensitivity towards the chemicals is addressed by Monte Carlo simulations. A large number of individuals are simulated, for which the individual exposure (iEXP), the individual critical effect dose (iCED) and the ratio between these values (the individual margin of exposure, iMoE) are calculated by drawing random values for all variable parameters from databases or specified distributions. This results in a population distribution of the iMoE, and the fraction of this distribution below 1 indicates the fraction of the population that may be at risk. Uncertainty in the assessment is treated as a separate dimension by repeating the Monte Carlo simulations many times, each time drawing random values for all uncertain parameters. In this framework, the cumulative exposure to common mechanism chemicals is addressed by incorporation of the relative potency factor (RPF) approach. The framework is demonstrated by the cumulative risk assessment of organophosphorus pesticides (OPs). By going through this example, the various choices and assumptions underlying the cumulative risk assessment are made explicit. The problems faced and the solutions chosen may be more generic than the present example with OPs. This demonstration may help to familiarize risk assessors and risk managers with the somewhat more complex output of probabilistic risk assessment.
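
    The two-dimensional Monte Carlo structure described above can be sketched as an outer loop over uncertainty realizations and an inner loop over inter-individual variability, recording per outer draw the fraction of simulated individuals whose iMoE = iCED/iEXP falls below 1. All distributions and numbers below are placeholders, not the OP assessment inputs.

      import numpy as np

      rng = np.random.default_rng(11)

      n_outer, n_inner = 200, 10_000   # uncertainty realizations x simulated individuals

      fractions_at_risk = []
      for _ in range(n_outer):
          # Outer loop: draw uncertain population-level parameters (placeholder values).
          median_exposure = rng.normal(1.0, 0.2)       # ug/kg bw/day
          median_ced = rng.normal(100.0, 20.0)         # critical effect dose, same units
          # Inner loop: variability between individuals.
          iEXP = rng.lognormal(np.log(max(median_exposure, 1e-6)), 0.8, n_inner)
          iCED = rng.lognormal(np.log(max(median_ced, 1e-6)), 0.5, n_inner)
          iMoE = iCED / iEXP
          fractions_at_risk.append(np.mean(iMoE < 1.0))

      lo, med, hi = np.percentile(fractions_at_risk, [2.5, 50, 97.5])
      print(f"fraction of population with iMoE < 1: {med:.5f} [{lo:.5f}, {hi:.5f}]")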

  1. QUANTIFYING AGGREGATE CHLORPYRIFOS EXPOSURE AND DOSE TO CHILDREN USING A PHYSICALLY-BASED TWO-STAGE MONTE CARLO PROBABILISTIC MODEL

    EPA Science Inventory

    To help address the Food Quality Protection Act of 1996, a physically-based, two-stage Monte Carlo probabilistic model has been developed to quantify and analyze aggregate exposure and dose to pesticides via multiple routes and pathways. To illustrate model capabilities and ide...

  2. Possibilities and limitations of modeling environmental exposure to engineered nanomaterials by probabilistic material flow analysis.

    PubMed

    Gottschalk, Fadri; Sonderer, Tobias; Scholz, Roland W; Nowack, Bernd

    2010-05-01

    Information on environmental concentrations is needed to assess the risks that engineered nanomaterials (ENM) may pose to the environment. In this study, predicted environmental concentrations (PEC) were modeled for nano-TiO2, carbon nanotubes (CNT) and nano-Ag for Switzerland. Based on a life-cycle perspective, the model considered as input parameters the production volumes of the ENMs, the manufacturing and consumption quantities of products containing those materials, and the fate and pathways of ENMs in natural and technical environments. Faced with a distinct scarcity of data, we used a probabilistic material flow analysis model, treating all parameters as probability distributions. The modeling included Monte Carlo and Markov Chain Monte Carlo simulations as well as a sensitivity and uncertainty analysis. The PEC values of the ENMs in the different environmental compartments vary widely due to different ENM production volumes and different life cycles of the nanoproducts. The use of ENM in products with high water relevance leads to higher water and sediment concentrations for nano-TiO2 and nano-Ag, compared to CNTs, where smaller amounts of ENM reach the aquatic compartments. This study also presents a sensitivity analysis and a comprehensive discussion of the uncertainties of the simulation results and the limitations of the used approach. To estimate potential risks, the PEC values were compared to the predicted-no-effect concentrations (PNEC) derived from published data. The risk quotients (PEC/PNEC) for nano-TiO2 and nano-Ag were larger than one for treated wastewater and much smaller for all other environmental compartments (e.g., water, sediments, soils). We conclude that probabilistic modeling is very useful for predicting environmental concentrations of ENMs given the current lack of substantiated data.

  3. Systematic evaluation of autoregressive error models as post-processors for a probabilistic streamflow forecast system

    NASA Astrophysics Data System (ADS)

    Morawietz, Martin; Xu, Chong-Yu; Gottschalk, Lars; Tallaksen, Lena

    2010-05-01

    A post-processor is necessary in a probabilistic streamflow forecast system to account for the hydrologic uncertainty introduced by the hydrological model. In this study, different variants of an autoregressive error model that can be used as a post-processor for short- to medium-range streamflow forecasts are evaluated. The deterministic HBV model is used to form the basis for the streamflow forecast. The general structure of the error models then used as post-processor is a first-order autoregressive model of the form d_t = α·d_(t-1) + σ·ε_t, where d_t is the model error (observed minus simulated streamflow) at time t, α and σ are the parameters of the error model, and ε_t is the residual error described through a probability distribution. The following aspects are investigated: (1) Use of constant parameters α and σ versus the use of state-dependent parameters. The state-dependent parameters vary depending on the states of temperature, precipitation, snow water equivalent and simulated streamflow. (2) Use of a standard normal distribution for ε_t versus use of an empirical distribution function constituted through the normalized residuals of the error model in the calibration period. (3) Comparison of two different transformations, i.e. logarithmic versus square root, that are applied to the streamflow data before the error model is applied. The reason for applying a transformation is to make the residuals of the error model homoscedastic over the range of streamflow values of different magnitudes. Through combination of these three characteristics, eight variants of the autoregressive post-processor are generated. These are calibrated and validated in 55 catchments throughout Norway. The discrete ranked probability score with 99 flow percentiles as standardized thresholds is used for evaluation. In addition, a non-parametric bootstrap is used to construct confidence intervals and evaluate the significance of the results. The main
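
    A minimal sketch of dressing a deterministic streamflow forecast with such a first-order autoregressive error model, here with a logarithmic transformation and constant parameters. The parameter values, flows and ensemble size are illustrative, not results of the study.

      import numpy as np

      rng = np.random.default_rng(5)

      def ar1_ensemble(sim_flow, alpha, sigma, d0=0.0, n_members=100):
          """Dress a deterministic forecast with an AR(1) error model in log space:
          d_t = alpha * d_(t-1) + sigma * eps_t, with eps_t ~ N(0, 1)."""
          log_sim = np.log(sim_flow)
          members = np.empty((n_members, len(sim_flow)))
          for m in range(n_members):
              d = d0
              for t, ls in enumerate(log_sim):
                  d = alpha * d + sigma * rng.standard_normal()
                  members[m, t] = np.exp(ls + d)       # back-transform to streamflow units
          return members

      sim = np.array([12.0, 15.0, 30.0, 22.0, 18.0, 16.0])   # simulated flows (m3/s), illustrative
      ens = ar1_ensemble(sim, alpha=0.8, sigma=0.15)
      print(np.percentile(ens, [5, 50, 95], axis=0))          # forecast quantiles per lead time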

  4. Probabilistic solution of random SI-type epidemiological models using the Random Variable Transformation technique

    NASA Astrophysics Data System (ADS)

    Casabán, M.-C.; Cortés, J.-C.; Romero, J.-V.; Roselló, M.-D.

    2015-07-01

    This paper presents a full probabilistic description of the solution of random SI-type epidemiological models which are based on nonlinear differential equations. This description consists of determining: the first probability density function of the solution in terms of the density functions of the diffusion coefficient and the initial condition, which are assumed to be independent random variables; the expectation and variance functions of the solution as well as confidence intervals and, finally, the distribution of time until a given proportion of susceptibles remains in the population. The obtained formulas are general since they are valid regardless of the probability distributions assigned to the random inputs. We also present a pair of illustrative examples, including in one of them the application of the theoretical results to model the diffusion of a technology using real data.
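
    Although the paper derives these quantities analytically via the Random Variable Transformation technique, the same targets can be approximated by brute-force Monte Carlo, as sketched below for the standard logistic solution of an SI-type model. The input distributions are assumptions chosen for illustration, not the paper's examples.

      import numpy as np

      rng = np.random.default_rng(9)

      def si_solution(t, beta, i0):
          """Proportion infected at time t for the logistic SI model di/dt = beta*i*(1-i)."""
          e = np.exp(beta * t)
          return i0 * e / (1.0 - i0 + i0 * e)

      # Illustrative random inputs: diffusion (contact) coefficient and initial infected fraction.
      beta = rng.gamma(shape=4.0, scale=0.05, size=100_000)     # mean 0.2 per day
      i0 = rng.beta(2.0, 50.0, size=100_000)                    # mean about 0.04

      t = 30.0
      i_t = si_solution(t, beta, i0)
      print("mean:", i_t.mean(), "variance:", i_t.var())
      print("95% interval:", np.percentile(i_t, [2.5, 97.5]))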

  5. PROBABILISTIC NON-RIGID REGISTRATION OF PROSTATE IMAGES: MODELING AND QUANTIFYING UNCERTAINTY

    PubMed Central

    Risholm, Petter; Fedorov, Andriy; Pursley, Jennifer; Tuncali, Kemal; Cormack, Robert; Wells, William M.

    2012-01-01

    Registration of pre- to intra-procedural prostate images needs to handle the large changes in position and shape of the prostate caused by varying rectal filling and patient positioning. We describe a probabilistic method for non-rigid registration of prostate images which can quantify the most probable deformation as well as the uncertainty of the estimated deformation. The method is based on a biomechanical Finite Element model which treats the prostate as an elastic material. We use a Markov Chain Monte Carlo sampler to draw deformation configurations from the posterior distribution. In practice, we simultaneously estimate the boundary conditions (surface displacements) and the internal deformations of our biomechanical model. The proposed method was validated on a clinical MRI dataset with registration results comparable to previously published methods, but with the added benefit of also providing uncertainty estimates which may be important to take into account during prostate biopsy and brachytherapy procedures. PMID:22288004

  6. Probabilistic model of onset detection explains paradoxes in human time perception.

    PubMed

    Nikolov, Stanislav; Rahnev, Dobromir A; Lau, Hakwan C

    2010-01-01

    A very basic computational model is proposed to explain two puzzling findings in the time perception literature. First, spontaneous motor actions are preceded by up to 1-2 s of preparatory activity (Kornhuber and Deecke, 1965). Yet, subjects are only consciously aware of about a quarter of a second of motor preparation (Libet et al., 1983). Why are they not aware of the early part of preparation? Second, psychophysical findings (Spence et al., 2001) support the principle of attention prior entry (Titchener, 1908), which states that attended stimuli are perceived faster than unattended stimuli. However, electrophysiological studies reported no or little corresponding temporal difference between the neural signals for attended and unattended stimuli (McDonald et al., 2005; Vibell et al., 2007). We suggest that the key to understanding these puzzling findings is to think of onset detection in probabilistic terms. The two apparently paradoxical phenomena are naturally predicted by our signal detection theoretic model.

  7. Modeling of the Sedimentary Interbedded Basalt Stratigraphy for the Idaho National Laboratory Probabilistic Seismic Hazard Analysis

    SciTech Connect

    Suzette Payne

    2006-04-01

    This report summarizes how the effects of the sedimentary interbedded basalt stratigraphy were modeled in the probabilistic seismic hazard analysis (PSHA) of the Idaho National Laboratory (INL). Drill holes indicate the bedrock beneath INL facilities is composed of about 1.1 km of alternating layers of basalt rock and loosely consolidated sediments. Alternating layers of hard rock and “soft” loose sediments tend to attenuate seismic energy greater than uniform rock due to scattering and damping. The INL PSHA incorporated the effects of the sedimentary interbedded basalt stratigraphy by developing site-specific shear (S) wave velocity profiles. The profiles were used in the PSHA to model the near-surface site response by developing site-specific stochastic attenuation relationships.

  8. Modeling of the Sedimentary Interbedded Basalt Stratigraphy for the Idaho National Laboratory Probabilistic Seismic Hazard Analysis

    SciTech Connect

    Suzette Payne

    2007-08-01

    This report summarizes how the effects of the sedimentary interbedded basalt stratigraphy were modeled in the probabilistic seismic hazard analysis (PSHA) of the Idaho National Laboratory (INL). Drill holes indicate the bedrock beneath INL facilities is composed of about 1.1 km of alternating layers of basalt rock and loosely consolidated sediments. Alternating layers of hard rock and “soft” loose sediments tend to attenuate seismic energy greater than uniform rock due to scattering and damping. The INL PSHA incorporated the effects of the sedimentary interbedded basalt stratigraphy by developing site-specific shear (S) wave velocity profiles. The profiles were used in the PSHA to model the near-surface site response by developing site-specific stochastic attenuation relationships.

  9. A Probabilistic Model for Sediment Entrainment: the Role of Bed Irregularity

    NASA Astrophysics Data System (ADS)

    Thanos Papanicolaou, A. N.

    2017-04-01

    A generalized probabilistic model is developed in this study to predict sediment entrainment under the incipient motion, rolling, and pickup modes. A novelty of the proposed model is that it incorporates in its formulation the probability density function of the bed shear stress, instead of the near-bed velocity fluctuations, to account for the effects of both flow turbulence and bed surface irregularity on sediment entrainment. The proposed model incorporates in its formulation the collective effects of three parameters describing bed surface irregularity, namely the relative roughness, the volumetric fraction and relative position of sediment particles within the active layer. Another key feature of the model is that it provides a criterion for estimating the lift and drag coefficients jointly based on the recognition that lift and drag forces acting on sediment particles are interdependent and vary with particle protrusion and packing density. The model was validated using laboratory data of both fine and coarse sediment and was compared with previously published models. The study results show that for the fine sediment data, where the sediment particles have more uniform gradation and relative roughness is not a factor, all the examined models perform adequately. The proposed model was particularly suited for the coarse sediment data, where the increased bed irregularity was captured by the new parameters introduced in the model formulations. As a result, the proposed model yielded smaller prediction errors and physically acceptable values for the lift coefficient compared to the other models in case of the coarse sediment data.
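
    A heavily simplified sketch of the core idea: the entrainment probability is the probability that the instantaneous bed shear stress exceeds a critical value, here under an assumed lognormal shear-stress distribution. The paper's treatment of particle protrusion, packing density and joint lift/drag coefficients is not reproduced; all numbers are placeholders.

      from math import erf, sqrt, log

      tau_mean, cv, tau_crit = 2.0, 0.6, 3.5   # Pa: mean stress, coefficient of variation, critical stress

      # Lognormal model for the instantaneous bed shear stress with the given mean and CV
      # (the CV lumps together turbulence and bed surface irregularity).
      sigma_ln = sqrt(log(1.0 + cv ** 2))
      mu_ln = log(tau_mean) - 0.5 * sigma_ln ** 2

      def normal_cdf(x):
          return 0.5 * (1.0 + erf(x / sqrt(2.0)))

      # Entrainment probability: P(tau > tau_crit) under the lognormal shear-stress PDF.
      p_entrain = 1.0 - normal_cdf((log(tau_crit) - mu_ln) / sigma_ln)
      print("entrainment probability:", p_entrain)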

  10. Predicting the acute neurotoxicity of diverse organic solvents using probabilistic neural networks based QSTR modeling approaches.

    PubMed

    Basant, Nikita; Gupta, Shikha; Singh, Kunwar P

    2016-03-01

    Organic solvents are widely used chemicals and the neurotoxic properties of some are well established. In this study, we established nonlinear qualitative and quantitative structure-toxicity relationship (STR) models for predicting neurotoxic classes and neurotoxicity of structurally diverse solvents in rodent test species following OECD guideline principles for model development. Probabilistic neural network (PNN) based qualitative and generalized regression neural network (GRNN) based quantitative STR models were constructed using neurotoxicity data from rat and mouse studies. Further, interspecies correlation based quantitative activity-activity relationship (QAAR) and global QSTR models were also developed using the combined data set of both rodent species for predicting the neurotoxicity of solvents. The constructed models were validated through deriving several statistical coefficients for the test data, and the prediction and generalization abilities of these models were evaluated. The qualitative STR models (rat and mouse) yielded classification accuracies of 92.86% in the test data sets, whereas the quantitative STRs yielded correlations (R^2) of >0.93 between the measured and model-predicted toxicity values in both the test data sets (rat and mouse). The prediction accuracies of the QAAR (R^2 = 0.859) and global STR (R^2 = 0.945) models were comparable to those of the independent local STR models. The results suggest the ability of the developed QSTR models to reliably predict binary neurotoxicity classes and the endpoint neurotoxicities of the structurally diverse organic solvents. Copyright © 2015 Elsevier Inc. All rights reserved.

  11. Improving probabilistic prediction of daily streamflow by identifying Pareto optimal approaches for modelling heteroscedastic residual errors

    NASA Astrophysics Data System (ADS)

    David, McInerney; Mark, Thyer; Dmitri, Kavetski; George, Kuczera

    2017-04-01

    This study provides guidance to hydrological researchers that enables them to provide probabilistic predictions of daily streamflow with the best reliability and precision for different catchment types (e.g. high/low degree of ephemerality). Reliable and precise probabilistic prediction of daily catchment-scale streamflow requires statistical characterization of residual errors of hydrological models. It is commonly known that hydrological model residual errors are heteroscedastic, i.e. there is a pattern of larger errors in higher streamflow predictions. Although multiple approaches exist for representing this heteroscedasticity, few studies have undertaken a comprehensive evaluation and comparison of these approaches. This study fills this research gap by evaluating 8 common residual error schemes, including standard and weighted least squares, the Box-Cox transformation (with fixed and calibrated power parameter, lambda) and the log-sinh transformation. Case studies include 17 perennial and 6 ephemeral catchments in Australia and the USA, and two lumped hydrological models. We find the choice of heteroscedastic error modelling approach significantly impacts on predictive performance, though no single scheme simultaneously optimizes all performance metrics. The set of Pareto optimal schemes, reflecting performance trade-offs, comprises Box-Cox schemes with lambda of 0.2 and 0.5, and the log scheme (lambda=0, perennial catchments only). These schemes significantly outperform even the average-performing remaining schemes (e.g., across ephemeral catchments, median precision tightens from 105% to 40% of observed streamflow, and median biases decrease from 25% to 4%). Theoretical interpretations of empirical results highlight the importance of capturing the skew/kurtosis of raw residuals and reproducing zero flows. Recommendations for researchers and practitioners seeking robust residual error schemes for practical work are provided.
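
    A minimal sketch of one of the evaluated ideas, a Box-Cox residual error scheme with lambda = 0.2: residuals are computed in transformed space, where they are approximately homoscedastic, and predictive quantiles are back-transformed to flow units. The data are synthetic and the scheme is simplified relative to the paper's formulation.

      import numpy as np

      rng = np.random.default_rng(13)

      def boxcox(q, lam):
          return (q ** lam - 1.0) / lam if lam != 0 else np.log(q)

      def inv_boxcox(z, lam):
          return (lam * z + 1.0) ** (1.0 / lam) if lam != 0 else np.exp(z)

      lam = 0.2
      # Synthetic simulated/observed daily flows with heteroscedastic errors (illustrative).
      sim = rng.gamma(2.0, 5.0, 2000) + 0.5
      obs = sim * np.exp(rng.normal(0.0, 0.3, sim.size))

      # Residuals in Box-Cox space are treated as homoscedastic; fit a Gaussian to them.
      resid = boxcox(obs, lam) - boxcox(sim, lam)
      mu, sigma = resid.mean(), resid.std()

      # Probabilistic prediction for a new simulated flow: quantiles back-transformed to flow units.
      sim_new = 25.0
      z = boxcox(sim_new, lam) + mu + sigma * np.array([-1.645, 0.0, 1.645])
      print("5/50/95% predictive flow:", inv_boxcox(z, lam))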

  12. A probabilistic quantitative risk assessment model for the long-term work zone crashes.

    PubMed

    Meng, Qiang; Weng, Jinxian; Qu, Xiaobo

    2010-11-01

    Work zones especially long-term work zones increase traffic conflicts and cause safety problems. Proper casualty risk assessment for a work zone is of importance for both traffic safety engineers and travelers. This paper develops a novel probabilistic quantitative risk assessment (QRA) model to evaluate the casualty risk combining frequency and consequence of all accident scenarios triggered by long-term work zone crashes. The casualty risk is measured by the individual risk and societal risk. The individual risk can be interpreted as the frequency of a driver/passenger being killed or injured, and the societal risk describes the relation between frequency and the number of casualties. The proposed probabilistic QRA model consists of the estimation of work zone crash frequency, an event tree and consequence estimation models. There are seven intermediate events--age (A), crash unit (CU), vehicle type (VT), alcohol (AL), light condition (LC), crash type (CT) and severity (S)--in the event tree. Since the estimated value of probability for some intermediate event may have large uncertainty, the uncertainty can thus be characterized by a random variable. The consequence estimation model takes into account the combination effects of speed and emergency medical service response time (ERT) on the consequence of work zone crash. Finally, a numerical example based on the Southeast Michigan work zone crash data is carried out. The numerical results show that there will be a 62% decrease of individual fatality risk and 44% reduction of individual injury risk if the mean travel speed is slowed down by 20%. In addition, there will be a 5% reduction of individual fatality risk and 0.05% reduction of individual injury risk if ERT is reduced by 20%. In other words, slowing down speed is more effective than reducing ERT in the casualty risk mitigation.

  13. Probabilistic models and uncertainty quantification for the ionization reaction rate of atomic Nitrogen

    NASA Astrophysics Data System (ADS)

    Miki, K.; Panesi, M.; Prudencio, E. E.; Prudhomme, S.

    2012-05-01

    The objective in this paper is to analyze some stochastic models for estimating the ionization reaction rate constant of atomic Nitrogen (N + e- → N+ + 2e-). Parameters of the models are identified by means of Bayesian inference using spatially resolved absolute radiance data obtained from the Electric Arc Shock Tube (EAST) wind-tunnel. The proposed methodology accounts for uncertainties in the model parameters as well as physical model inadequacies, providing estimates of the rate constant that reflect both types of uncertainties. We present four different probabilistic models by varying the error structure (either additive or multiplicative) and by choosing different descriptions of the statistical correlation among data points. In order to assess the validity of our methodology, we first present some calibration results obtained with manufactured data and then proceed by using experimental data collected at the EAST experimental facility. In order to simulate the radiative signature emitted in the shock-heated air plasma, we use a one-dimensional flow solver with Park's two-temperature model that simulates non-equilibrium effects. We also discuss the implications of the choice of the stochastic model on the estimation of the reaction rate and its uncertainties. Our analysis shows that the stochastic models based on correlated multiplicative errors are the most plausible models among the four models proposed in this study. The rate of the atomic Nitrogen ionization is found to be (6.2 ± 3.3) × 10^11 cm^3 mol^-1 s^-1 at 10,000 K.

  14. Improving predictive power of physically based rainfall-induced shallow landslide models: a probabilistic approach

    NASA Astrophysics Data System (ADS)

    Raia, S.; Alvioli, M.; Rossi, M.; Baum, R. L.; Godt, J. W.; Guzzetti, F.

    2014-03-01

    Distributed models to forecast the spatial and temporal occurrence of rainfall-induced shallow landslides are based on deterministic laws. These models extend spatially the static stability models adopted in geotechnical engineering, and adopt an infinite-slope geometry to balance the resisting and the driving forces acting on the sliding mass. An infiltration model is used to determine how rainfall changes pore-water conditions, modulating the local stability/instability conditions. A problem with the operation of the existing models lies in the difficulty of obtaining accurate values for the several variables that describe the material properties of the slopes. The problem is particularly severe when the models are applied over large areas, for which sufficient information on the geotechnical and hydrological conditions of the slopes is not generally available. To help solve the problem, we propose a probabilistic Monte Carlo approach to the distributed modeling of rainfall-induced shallow landslides. For this purpose, we have modified the transient rainfall infiltration and grid-based regional slope-stability analysis (TRIGRS) code. The new code (TRIGRS-P) adopts a probabilistic approach to compute, on a cell-by-cell basis, transient pore-pressure changes and related changes in the factor of safety due to rainfall infiltration. Infiltration is modeled using analytical solutions of partial differential equations describing one-dimensional vertical flow in isotropic, homogeneous materials. Both saturated and unsaturated soil conditions can be considered. TRIGRS-P copes with the natural variability inherent to the mechanical and hydrological properties of the slope materials by allowing values of the TRIGRS model input parameters to be sampled randomly from a given probability distribution. The range of variation and the mean value of the parameters can be determined by the usual methods used for preparing the TRIGRS input parameters. The outputs of several model
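
    A cell-level sketch of the probabilistic idea: sample soil strength parameters from assumed distributions and evaluate an infinite-slope factor of safety for each draw. The simple static formula below stands in for the transient pore-pressure computation of TRIGRS, and all parameter values and distributions are illustrative.

      import numpy as np

      rng = np.random.default_rng(21)

      def factor_of_safety(c, phi_deg, slope_deg, z=2.0, gamma=19.0, gamma_w=9.81, h_w=1.0):
          """Static infinite-slope factor of safety (c in kPa, unit weights in kN/m3, depths in m),
          with a water table a height h_w above the slip surface."""
          beta = np.radians(slope_deg)
          phi = np.radians(phi_deg)
          u = gamma_w * h_w * np.cos(beta) ** 2                 # pore pressure on the slip surface
          resisting = c + (gamma * z * np.cos(beta) ** 2 - u) * np.tan(phi)
          driving = gamma * z * np.sin(beta) * np.cos(beta)
          return resisting / driving

      # Sample cohesion and friction angle from assumed distributions for one grid cell.
      n = 50_000
      c = rng.normal(5.0, 2.0, n).clip(min=0.0)                 # kPa
      phi = rng.normal(32.0, 4.0, n)                            # degrees

      fs = factor_of_safety(c, phi, slope_deg=35.0)
      print("P(FS < 1):", np.mean(fs < 1.0))                    # cell-wise probability of failure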

  15. Integrated Assessment Model Evaluation

    NASA Astrophysics Data System (ADS)

    Smith, S. J.; Clarke, L.; Edmonds, J. A.; Weyant, J. P.

    2012-12-01

    Integrated assessment models of climate change (IAMs) are widely used to provide insights into the dynamics of the coupled human and socio-economic system, including emission mitigation analysis and the generation of future emission scenarios. Similar to the climate modeling community, the integrated assessment community has a two-decade history of model inter-comparison, which has served as one of the primary venues for model evaluation and confirmation. While analysis of historical trends in the socio-economic system has long played a key role in diagnostics of future scenarios from IAMs, formal hindcast experiments are just now being contemplated as evaluation exercises. Some initial thoughts on setting up such IAM evaluation experiments are discussed. Socio-economic systems do not follow strict physical laws, which means that evaluation needs to take place in a context unlike that of physical system models, one in which there are few fixed, unchanging relationships. Of course, strict validation of even earth system models is not possible (Oreskes et al. 2004), a fact borne out by the inability of models to constrain the climate sensitivity. Energy-system models have also been grappling with some of the same questions over the last quarter century. For example, one of "the many questions in the energy field that are waiting for answers in the next 20 years" identified by Hans Landsberg in 1985 was "Will the price of oil resume its upward movement?" Of course we are still asking this question today. While, arguably, even fewer constraints apply to socio-economic systems, numerous historical trends and patterns have been identified, although often only in broad terms, that are used to guide the development of model components, parameter ranges, and scenario assumptions. IAM evaluation exercises are expected to provide useful information for interpreting model results and improving model behavior. A key step is the recognition of model boundaries, that is, what is inside

  16. Relationships between probabilistic Boolean networks and dynamic Bayesian networks as models of gene regulatory networks

    PubMed Central

    Lähdesmäki, Harri; Hautaniemi, Sampsa; Shmulevich, Ilya; Yli-Harja, Olli

    2006-01-01

    A significant amount of attention has recently been focused on modeling of gene regulatory networks. Two frequently used large-scale modeling frameworks are Bayesian networks (BNs) and Boolean networks, the latter being a special case of its recent stochastic extension, probabilistic Boolean networks (PBNs). PBN is a promising model class that generalizes the standard rule-based interactions of Boolean networks into the stochastic setting. Dynamic Bayesian networks (DBNs) are a general and versatile model class that is able to represent complex temporal stochastic processes and has also been proposed as a model for gene regulatory systems. In this paper, we concentrate on these two model classes and demonstrate that PBNs and a certain subclass of DBNs can represent the same joint probability distribution over their common variables. The major benefit of introducing the relationships between the models is that it opens up the possibility of applying the standard tools of DBNs to PBNs and vice versa. Hence, the standard learning tools of DBNs can be applied in the context of PBNs, and the inference methods give a natural way of handling the missing values that are often present in gene expression measurements. Conversely, the tools for controlling the stationary behavior of the networks, tools for projecting networks onto sub-networks, and efficient learning schemes can be used for DBNs. In other words, the introduced relationships between the models extend the collection of analysis tools for both model classes. PMID:17415411
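
    A minimal sketch of the PBN idea discussed above: each gene picks one of several Boolean update rules at random at every step, so the synchronous dynamics form a Markov chain whose long-run behavior can be estimated by simulation. The three-gene network, its rules, and the selection probabilities below are invented for illustration and are unrelated to any particular gene regulatory system.

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(5)

# Tiny probabilistic Boolean network on 3 genes: each gene has two candidate
# Boolean update rules, one of which is chosen independently at every step
# with the given selection probability (rules and probabilities are illustrative).
rules = {
    0: [(lambda s: s[1] and not s[2], 0.7), (lambda s: s[1], 0.3)],
    1: [(lambda s: s[0] or s[2], 0.6),      (lambda s: not s[0], 0.4)],
    2: [(lambda s: s[0] and s[1], 0.8),     (lambda s: s[2], 0.2)],
}

def step(state):
    new = []
    for gene, candidates in rules.items():
        funcs, probs = zip(*candidates)
        f = funcs[rng.choice(len(funcs), p=probs)]
        new.append(int(f(state)))
    return tuple(new)

# Estimate the stationary distribution by long-run simulation over the 2^3 states.
counts = {s: 0 for s in product((0, 1), repeat=3)}
state = (1, 0, 1)
for _ in range(50_000):
    state = step(state)
    counts[state] += 1
print({s: round(c / 50_000, 3) for s, c in counts.items()})
```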

  17. Probabilistic, Multivariable Flood Loss Modeling on the Mesoscale with BT-FLEMO.

    PubMed

    Kreibich, Heidi; Botto, Anna; Merz, Bruno; Schröter, Kai

    2017-04-01

    Flood loss modeling is an important component for risk analyses and decision support in flood risk management. Commonly, flood loss models describe complex damaging processes by simple, deterministic approaches like depth-damage functions and are associated with large uncertainty. To improve flood loss estimation and to provide quantitative information about the uncertainty associated with loss modeling, a probabilistic, multivariable Bagging decision Tree Flood Loss Estimation MOdel (BT-FLEMO) for residential buildings was developed. The application of BT-FLEMO provides a probability distribution of estimated losses to residential buildings per municipality. BT-FLEMO was applied and validated at the mesoscale in 19 municipalities that were affected during the 2002 flood by the River Mulde in Saxony, Germany. Validation was undertaken on the one hand via a comparison with six deterministic loss models, including both depth-damage functions and multivariable models. On the other hand, the results were compared with official loss data. BT-FLEMO outperforms deterministic, univariable, and multivariable models with regard to model accuracy, although the prediction uncertainty remains high. An important advantage of BT-FLEMO is the quantification of prediction uncertainty. The probability distribution of loss estimates by BT-FLEMO well represents the variation range of loss estimates of the other models in the case study. © 2016 Society for Risk Analysis.
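
    The following sketch shows how a bagged ensemble of regression trees can return a distribution of loss estimates rather than a single value, which is the central idea behind the uncertainty quantification described above. The synthetic predictors, the loss relationship, and the scikit-learn setup are assumptions for illustration, not BT-FLEMO's actual variables or training data.

```python
import numpy as np
from sklearn.ensemble import BaggingRegressor

rng = np.random.default_rng(1)

# Synthetic multivariable training data (stand-ins for predictors such as
# water depth, building area, precaution indicator); loss is the target.
n = 500
X = np.column_stack([
    rng.uniform(0, 3, n),      # water depth [m]
    rng.uniform(50, 300, n),   # building area [m^2]
    rng.integers(0, 2, n),     # precautionary measures (0/1)
])
loss = 0.1 * X[:, 0] * X[:, 1] * (1 - 0.3 * X[:, 2]) + rng.normal(0, 2, n)

# Bagging of regression trees (the default base estimator is a decision tree).
model = BaggingRegressor(n_estimators=100, random_state=0).fit(X, loss)

# Collect per-tree predictions to obtain a distribution of loss estimates
# for a new building instead of one point estimate.
x_new = np.array([[1.5, 120.0, 0]])
per_tree = np.array([tree.predict(x_new)[0] for tree in model.estimators_])
print("median loss:", np.median(per_tree),
      "90% interval:", np.percentile(per_tree, [5, 95]))
```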

  18. A 3-D probabilistic stability model incorporating the variability of root reinforcement

    NASA Astrophysics Data System (ADS)

    Cislaghi, Alessio; Chiaradia, Enrico; Battista Bischetti, Gian

    2016-04-01

    Process-oriented models of hillslope stability have great potential to improve spatially distributed landslide hazard analyses. At the same time, they may have severe limitations, among which the variability and uncertainty of the parameters play a key role. In this context, the application of a probabilistic approach through Monte Carlo techniques is an appropriate way to deal with the variability of each input parameter by assigning it a proper probability distribution. In forested areas an additional point must be taken into account: the reinforcement due to roots permeating the soil, and its variability and uncertainty. While the probability distributions of geotechnical and hydrological parameters have been widely investigated, little is known concerning the variability and the spatial heterogeneity of root reinforcement. Moreover, there are still many difficulties in measuring and evaluating such a variable. In our study we aim to: (i) implement a robust procedure to evaluate the variability of root reinforcement as a probability distribution, according to the stand characteristics of forests, such as tree density, average diameter at breast height, and minimum distance among trees; and (ii) combine a multidimensional process-oriented model with a Monte Carlo simulation technique to obtain a probability distribution of the Factor of Safety. The proposed approach has been applied to a small Alpine area, mainly covered by a coniferous forest and characterized by steep slopes and a high landslide hazard. The obtained results show a good reliability of the model according to the landslide inventory map. In the end, our findings contribute to improving the reliability of landslide hazard mapping in forested areas and help forest managers to evaluate different management scenarios.

  19. Neural Mechanisms for Integrating Prior Knowledge and Likelihood in Value-Based Probabilistic Inference

    PubMed Central

    Ting, Chih-Chung; Yu, Chia-Chen; Maloney, Laurence T.

    2015-01-01

    In Bayesian decision theory, knowledge about the probabilities of possible outcomes is captured by a prior distribution and a likelihood function. The prior reflects past knowledge and the likelihood summarizes current sensory information. The two combined (integrated) form a posterior distribution that allows estimation of the probability of different possible outcomes. In this study, we investigated the neural mechanisms underlying Bayesian integration using a novel lottery decision task in which both prior knowledge and likelihood information about reward probability were systematically manipulated on a trial-by-trial basis. Consistent with Bayesian integration, as sample size increased, subjects tended to weigh likelihood information more compared with prior information. Using fMRI in humans, we found that the medial prefrontal cortex (mPFC) correlated with the mean of the posterior distribution, a statistic that reflects the integration of prior knowledge and likelihood of reward probability. Subsequent analysis revealed that both prior and likelihood information were represented in mPFC and that the neural representations of prior and likelihood in mPFC reflected changes in the behaviorally estimated weights assigned to these different sources of information in response to changes in the environment. Together, these results establish the role of mPFC in prior-likelihood integration and highlight its involvement in representing and integrating these distinct sources of information. PMID:25632152
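
    The Bayesian integration manipulated in the task can be illustrated with a conjugate Beta-Binomial example: the posterior mean shifts from the prior toward the observed reward frequency as sample size grows. The prior parameters and sample sizes below are arbitrary illustrative values, not those used in the experiment.

```python
from scipy import stats

# Prior knowledge about reward probability, encoded as a Beta distribution
# (illustrative values only).
prior_a, prior_b = 6.0, 4.0          # prior favors p around 0.6

# Current evidence: k rewarded outcomes in a sample of size n.
k, n = 2, 10                         # likelihood favors p around 0.2

posterior = stats.beta(prior_a + k, prior_b + (n - k))
print("posterior mean:", posterior.mean())

# As the sample size grows, the posterior mean moves away from the prior mean
# and toward k/n: the likelihood is weighted more heavily, which is the
# Bayesian-integration signature described above.
for n in (5, 20, 80):
    k = round(0.2 * n)
    print(n, stats.beta(prior_a + k, prior_b + (n - k)).mean())
```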

  20. Probabilistic Graphical Models for the Analysis and Synthesis of Musical Audio

    NASA Astrophysics Data System (ADS)

    Hoffmann, Matthew Douglas

    Content-based Music Information Retrieval (MIR) systems seek to automatically extract meaningful information from musical audio signals. This thesis applies new and existing generative probabilistic models to several content-based MIR tasks: timbral similarity estimation, semantic annotation and retrieval, and latent source discovery and separation. In order to estimate how similar two songs sound to one another, we employ a Hierarchical Dirichlet Process (HDP) mixture model to discover a shared representation of the distribution of timbres in each song. Comparing songs under this shared representation yields better query-by-example retrieval quality and scalability than previous approaches. To predict what tags are likely to apply to a song (e.g., "rap," "happy," or "driving music"), we develop the Codeword Bernoulli Average (CBA) model, a simple and fast mixture-of-experts model. Despite its simplicity, CBA performs at least as well as state-of-the-art approaches at automatically annotating songs and finding to what songs in a database a given tag most applies. Finally, we address the problem of latent source discovery and separation by developing two Bayesian nonparametric models, the Shift-Invariant HDP and Gamma Process NMF. These models allow us to discover what sounds (e.g., bass drums, guitar chords, etc.) are present in a song or set of songs and to isolate or suppress individual sources. These models' ability to decide how many latent sources are necessary to model the data is particularly valuable in this application, since it is impossible to guess a priori how many sounds will appear in a given song or set of songs. Once they have been fit to data, probabilistic models can also be used to drive the synthesis of new musical audio, both for creative purposes and to qualitatively diagnose what information a model does and does not capture. We also adapt the SIHDP model to create new versions of input audio with arbitrary sample sets, for example, to create

  1. Multi-level approach for statistical appearance models with probabilistic correspondences

    NASA Astrophysics Data System (ADS)

    Krüger, Julia; Ehrhardt, Jan; Handels, Heinz

    2016-03-01

    Statistical shape and appearance models are often based on the accurate identification of one-to-one correspondences in a training data set. At the same time, the determination of these corresponding landmarks is the most challenging part of such methods. Hufnagel et al.1 developed an alternative method using correspondence probabilities for a statistical shape model. In Krüger et al.2, 3 we propose the use of probabilistic correspondences for statistical appearance models by incorporating appearance information into the framework. We employ a point-based representation of image data combining position and appearance information. The model is optimized and adapted by a maximum a-posteriori (MAP) approach, deriving a single global optimization criterion with respect to model parameters and observation-dependent parameters that directly affects the shape and appearance information of the considered structures. Because initially unknown correspondence probabilities are used and a higher number of degrees of freedom is introduced to the model, a regularization of the model generation process is advantageous. For this purpose we extend the derived global criterion by a regularization term which penalizes implausible topological changes. Furthermore, we propose a multi-level approach for the optimization, to increase the robustness of the model generation process.

  2. Probabilistic conditional reasoning: Disentangling form and content with the dual-source model.

    PubMed

    Singmann, Henrik; Klauer, Karl Christoph; Beller, Sieghard

    2016-08-01

    The present research examines descriptive models of probabilistic conditional reasoning, that is, of reasoning from uncertain conditionals with contents about which reasoners have rich background knowledge. According to our dual-source model, two types of information shape such reasoning: knowledge-based information elicited by the contents of the material and content-independent information derived from the form of inferences. Two experiments implemented manipulations that selectively influenced the model parameters for the knowledge-based information, the relative weight given to form-based versus knowledge-based information, and the parameters for the form-based information, validating the psychological interpretation of these parameters. We apply the model to classical suppression effects, dissecting them into effects on background knowledge and effects on form-based processes (Exp. 3), and we use it to reanalyse previous studies manipulating reasoning instructions. In a model-comparison exercise based on data from seven studies, the dual-source model outperformed three Bayesian competitor models. Overall, our results support the view that people make use of background knowledge in line with current Bayesian models, but they also suggest that the form of the conditional argument, irrespective of its content, plays a substantive, yet smaller, role. Copyright © 2016 Elsevier Inc. All rights reserved.

  3. A time-dependent probabilistic seismic-hazard model for California

    USGS Publications Warehouse

    Cramer, C.H.; Petersen, M.D.; Cao, T.; Toppozada, Tousson R.; Reichle, M.

    2000-01-01

    For the purpose of sensitivity testing and illuminating nonconsensus components of time-dependent models, the California Department of Conservation, Division of Mines and Geology (CDMG) has assembled a time-dependent version of its statewide probabilistic seismic hazard (PSH) model for California. The model incorporates available consensus information from within the earth-science community, except for a few faults or fault segments where consensus information is not available. For these latter faults, published information has been incorporated into the model. As in the 1996 CDMG/U.S. Geological Survey (USGS) model, the time-dependent models incorporate three multisegment ruptures: a 1906, an 1857, and a southern San Andreas earthquake. Sensitivity tests are presented to show the effect on hazard and expected damage estimates of (1) intrinsic (aleatory) sigma, (2) multisegment (cascade) vs. independent segment (no cascade) ruptures, and (3) time-dependence vs. time-independence. Results indicate that (1) differences in hazard and expected damage estimates between time-dependent and time-independent models increase with decreasing intrinsic sigma, (2) differences in hazard and expected damage estimates between full cascading and not cascading are insensitive to intrinsic sigma, (3) differences in hazard increase with increasing return period (decreasing probability of occurrence), and (4) differences in moment-rate budgets increase with decreasing intrinsic sigma and with the degree of cascading, but are within the expected uncertainty in PSH time-dependent modeling and do not always significantly affect hazard and expected damage estimates.
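
    A hedged sketch of the kind of calculation a time-dependent model performs: the conditional probability of rupture in a 30-year window under a lognormal recurrence model, shown for several values of intrinsic (aleatory) sigma. The recurrence parameters are illustrative, not values from the CDMG model.

```python
import numpy as np
from scipy import stats

def conditional_rupture_prob(elapsed, mean_recurrence, sigma, window=30.0):
    """P(rupture within `window` years | no rupture for `elapsed` years),
    assuming lognormally distributed recurrence times."""
    mu = np.log(mean_recurrence) - 0.5 * sigma**2      # ln-space location
    rec = stats.lognorm(s=sigma, scale=np.exp(mu))
    return (rec.cdf(elapsed + window) - rec.cdf(elapsed)) / rec.sf(elapsed)

# Smaller intrinsic (aleatory) sigma sharpens the contrast between a recently
# ruptured segment (100 yr elapsed) and an overdue one (200 yr elapsed),
# echoing the sensitivity result summarized above.
for sigma in (0.3, 0.5, 0.7):
    print(sigma,
          round(conditional_rupture_prob(100.0, 150.0, sigma), 3),
          round(conditional_rupture_prob(200.0, 150.0, sigma), 3))
```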

  4. Probabilistic Boolean Network Modelling and Analysis Framework for mRNA Translation.

    PubMed

    Zhao, Yun-Bo; Krishnan, J

    2016-01-01

    mRNA translation is a complex process involving the progression of ribosomes on the mRNA, resulting in the synthesis of proteins, and is subject to multiple layers of regulation. This process has been modelled using different formalisms, both stochastic and deterministic. Recently, we introduced a Probabilistic Boolean modelling framework for mRNA translation, which possesses the advantage of tools for numerically exact computation of the steady-state probability distribution, without requiring simulation. Here, we extend this model to incorporate both random sequential and parallel update rules, and demonstrate its effectiveness in various settings, including its flexibility in accommodating additional static and dynamic biological complexities and its role in parameter sensitivity analysis. In these applications, the results from the model analysis match those of TASEP model simulations. Importantly, the proposed modelling framework maintains the stochastic aspects of mRNA translation and provides a way to exactly calculate probability distributions, providing additional tools of analysis in this context. Finally, the proposed modelling methodology provides an alternative approach to the understanding of the mRNA translation process, by bridging the gap between existing approaches, providing new analysis tools, and contributing to a more robust platform for modelling and understanding translation.

  5. Behavioral Modeling Based on Probabilistic Finite Automata: An Empirical Study †

    PubMed Central

    Tîrnăucă, Cristina; Montaña, José L.; Ontañón, Santiago; González, Avelino J.; Pardo, Luis M.

    2016-01-01

    Imagine an agent that performs tasks according to different strategies. The goal of Behavioral Recognition (BR) is to identify which of the available strategies is the one being used by the agent, by simply observing the agent’s actions and the environmental conditions during a certain period of time. The goal of Behavioral Cloning (BC) is more ambitious. In this last case, the learner must be able to build a model of the behavior of the agent. In both settings, the only assumption is that the learner has access to a training set that contains instances of observed behavioral traces for each available strategy. This paper studies a machine learning approach based on Probabilistic Finite Automata (PFAs), capable of achieving both the recognition and cloning tasks. We evaluate the performance of PFAs in the context of a simulated learning environment (in this case, a virtual Roomba vacuum cleaner robot), and compare it with a collection of other machine learning approaches. PMID:27347956
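
    The recognition task can be sketched as scoring an observed action trace under a small probabilistic finite automaton for each candidate strategy and choosing the most likely one. The two toy automata, the action alphabet, and their probabilities below are invented for illustration and are not the paper's learned models.

```python
import numpy as np

# Behavioral recognition with PFAs: pick the automaton that assigns the
# observed trace the highest likelihood. Action alphabet and automata are
# hypothetical stand-ins for learned strategy models.
ACTIONS = {"move": 0, "turn": 1, "clean": 2}

class PFA:
    def __init__(self, transition, emission, start):
        self.T = np.asarray(transition)   # T[q, a] = next state after action a
        self.E = np.asarray(emission)     # E[q, a] = P(action a | state q)
        self.start = start

    def log_likelihood(self, trace):
        q, ll = self.start, 0.0
        for action in trace:
            a = ACTIONS[action]
            ll += np.log(self.E[q, a])
            q = self.T[q, a]
        return ll

spiral = PFA([[0, 1, 0], [1, 0, 1]], [[0.7, 0.2, 0.1], [0.3, 0.5, 0.2]], 0)
random_walk = PFA([[0, 0, 0]], [[0.4, 0.4, 0.2]], 0)

trace = ["move", "move", "turn", "move", "clean"]
scores = {"spiral": spiral.log_likelihood(trace),
          "random_walk": random_walk.log_likelihood(trace)}
print(max(scores, key=scores.get), scores)
```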

  6. Probabilistic modeling of the corrosion of steel structures in marine water-development works

    SciTech Connect

    Bekker, A. T.; Lyubimov, V. S.; Kovalenko, R. G.; Aleksandrov, A. V.

    2011-09-15

    Considering that corrosion takes place as a random process over time, a probabilistic approach was utilized in this paper. The corrosion of metallic sheet piling employed in the fascia wall of a bulwark is considered as an example. A stochastic model is constructed on the basis of a modified Weibull distribution function, with the parameters of the corrosion process treated as functions of time. One of the factors defining the corrosion rate of the sheet piling is the degree of access of a section of the wall to the zone of variable water level, or the underwater zone. The type of corrosion (continuous or local) is another factor. The accuracy of corrosion prediction in the underwater zone is higher than that in the zone of variable water level.
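
    A minimal sketch of a time-dependent Weibull corrosion model in the spirit of the abstract: corrosion loss at a given exposure time is Weibull-distributed, with a shape parameter that differs between the underwater zone and the zone of variable water level. All functional forms and numbers are illustrative assumptions, not the paper's fitted parameters.

```python
import numpy as np

rng = np.random.default_rng(7)

def corrosion_loss_samples(t_years, zone, n=100_000):
    """Sample corrosion loss [mm] after t_years of exposure from a Weibull
    distribution whose scale grows with time (illustrative model only)."""
    shape = 2.5 if zone == "underwater" else 1.6   # less scatter underwater
    scale = 0.08 * t_years**0.8                    # loss grows sub-linearly with time
    return scale * rng.weibull(shape, n)

for zone in ("underwater", "variable_level"):
    loss = corrosion_loss_samples(25.0, zone)
    print(zone, "mean loss [mm]:", round(loss.mean(), 2),
          "P(loss > 3 mm):", (loss > 3.0).mean())
```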

  7. Life Prediction and Classification of Failure Modes in Solid State Luminaires Using Bayesian Probabilistic Models

    SciTech Connect

    Lall, Pradeep; Wei, Junchao; Sakalaukus, Peter

    2014-05-27

    A new method has been developed for assessing the onset of degradation in solid-state luminaires and classifying failure mechanisms, using metrics beyond the lumen degradation currently used for identification of failure. Luminous flux output and correlated color temperature data on Philips LED lamps have been gathered under 85°C/85% RH until lamp failure. The acquired data have been used in conjunction with Bayesian probabilistic models to identify luminaires with onset of degradation well before failure, through identification of decision boundaries in the feature space between lamps with accrued damage and lamps beyond the failure threshold. In addition, luminaires with different failure modes have been classified separately from healthy pristine luminaires. It is expected that the new test technique will allow the development of failure distributions without testing to L70 life for the manifestation of failure.

  8. A Model-Based Probabilistic Inversion Framework for Wire Fault Detection Using TDR

    NASA Technical Reports Server (NTRS)

    Schuet, Stefan R.; Timucin, Dogan A.; Wheeler, Kevin R.

    2010-01-01

    Time-domain reflectometry (TDR) is one of the standard methods for diagnosing faults in electrical wiring and interconnect systems, with a long-standing history focused mainly on hardware development of both high-fidelity systems for laboratory use and portable hand-held devices for field deployment. While these devices can easily assess distance to hard faults such as sustained opens or shorts, their ability to assess subtle but important degradation such as chafing remains an open question. This paper presents a unified framework for TDR-based chafing fault detection in lossy coaxial cables by combining an S-parameter based forward modeling approach with a probabilistic (Bayesian) inference algorithm. Results are presented for the estimation of nominal and faulty cable parameters from laboratory data.

  9. On probabilistic certification of combined cancer therapies using strongly uncertain models.

    PubMed

    Alamir, Mazen

    2015-11-07

    This paper proposes a general framework for probabilistic certification of cancer therapies. The certification is defined in terms of two key quantities: the tumor contraction and a lower admissible bound on the circulating lymphocytes, which is viewed as an indicator of the patient's health. Certification is viewed as the ability to guarantee, with a predefined high probability, the success of the therapy over a finite horizon despite the unavoidably high uncertainties affecting the dynamic model that is used to compute the optimal scheduling of drug injections. The certification paradigm can be viewed as a tool for tuning the treatment parameters and protocols as well as for making rational use of limited or expensive drugs. The proposed framework is illustrated using the specific problem of combined immunotherapy/chemotherapy of cancer.

  10. Robust Depth Image Acquisition Using Modulated Pattern Projection and Probabilistic Graphical Models

    PubMed Central

    Kravanja, Jaka; Žganec, Mario; Žganec-Gros, Jerneja; Dobrišek, Simon; Štruc, Vitomir

    2016-01-01

    Depth image acquisition with structured light approaches in outdoor environments is a challenging problem due to external factors, such as ambient sunlight, which commonly affect the acquisition procedure. This paper presents a novel structured light sensor designed specifically for operation in outdoor environments. The sensor exploits a modulated sequence of structured light projected onto the target scene to counteract environmental factors and estimate a spatial distortion map in a robust manner. The correspondence between the projected pattern and the estimated distortion map is then established using a probabilistic framework based on graphical models. Finally, the depth image of the target scene is reconstructed using a number of reference frames recorded during the calibration process. We evaluate the proposed sensor on experimental data in indoor and outdoor environments and present comparative experiments with other existing methods, as well as commercial sensors. PMID:27775570

  11. Multi-Scale Modeling and Probabilistic Assessment of Electrical Power and Dependent Systems' Resilience to Extreme Events

    NASA Astrophysics Data System (ADS)

    Backhaus, S.; Pasqualini, D.; Crawford, T.; Ewers, M.; Tasseff, B.; Ambrosiano, J.; Linger, S.; Roberts, R.; Bent, R.; Barnes, A.

    2016-12-01

    Electrical power systems and infrastructure systems that depend on electric power are deployed and operated at spatial scales ranging from individual facilities to cities to regions. The resilience of these systems has evolved naturally under the influence of extreme events. This evolution continues today as systems face more severe and frequent extreme events. Accurately representing the present resilience and future evolution of these systems across such a wide range of scales is both a modeling and computational challenge, especially considering that system evolution may involve new technologies and design approaches where no historical data are available to enable extrapolations. Naïve top-down aggregations of existing electrical power distribution networks will not capture important correlations between previous system hardening and critical dependent systems, leading to inaccurate modeling of current local resilience and costs associated with upgrades to improve local resilience and inaccurate boundary conditions on regional-scale electrical transmission models. Using probabilistic risk analysis approaches for hurricane and ice storm events, we present an integrated multi-scale, multi-sector modeling approach that can model and upscale naturally evolved resilience of fine-scale electrical distribution networks and their dependent critical systems. We use stochastic optimization-based models to predict and upscale potential future evolution driven by existing and emerging technologies. An optimization approach can incorporate human factors and represent potential bias in the resilience adaptation toward different residual risk profiles. We describe the coupling of these new aggregated models of fine-scale electrical power system resilience to models of regional electrical transmission systems and show how resilient regional operational strategies may be adapted to autonomous local resilience upgrades.

  12. A probabilistic model for the persistence of early planar fabrics in polydeformed pelitic schists

    USGS Publications Warehouse

    Ferguson, C.C.

    1984-01-01

    Although early planar fabrics are commonly preserved within microlithons in low-grade pelites, in higher-grade (amphibolite facies) pelitic schists fabric regeneration often appears complete. Evidence for early fabrics may be preserved within porphyroblasts but, within the matrix, later deformation often appears to totally obliterate or reorient earlier fabrics. However, examination of several hundred Dalradian pelites from Connemara, western Ireland, reveals that preservation of early fabrics is by no means uncommon; relict matrix domains, although volumetrically insignificant, are remarkably persistent even when inferred later strains are very large and fabric regeneration appears, at first sight, complete. Deterministic plasticity theories are ill-suited to the analysis of such an inhomogeneous material response, and a probabilistic model is proposed instead. It assumes that ductile polycrystal deformation is controlled by elementary flow units which can be activated once their associated stress barrier is overcome. Bulk flow propensity is related to the proportion of simultaneous activations, and a measure of this is derived from the probabilistic interaction between a stress-barrier spectrum and an internal stress spectrum (the latter determined by the external loading and the details of internal stress transfer). The spectra are modelled as Gaussian distributions although the treatment is very general and could be adapted for other distributions. Using the time rate of change of activation probability it is predicted that, initially, fabric development will be rapid but will then slow down dramatically even though stress increases at a constant rate. This highly non-linear response suggests that early fabrics persist because they comprise unfavourable distributions of stress-barriers which remain unregenerated at the time bulk stress is stabilized by steady-state flow. Relict domains will, however, bear the highest stress and are potential upper
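
    The probabilistic interaction between the two spectra can be written down directly when both are Gaussian: the activation probability is the probability that a draw from the internal-stress spectrum exceeds a draw from the stress-barrier spectrum. The sketch below uses illustrative means and standard deviations to show the rapid-then-saturating response described in the abstract.

```python
import numpy as np
from scipy.stats import norm

def activation_probability(mu_stress, sigma_stress, mu_barrier, sigma_barrier):
    """P(internal stress exceeds a flow-unit stress barrier) when both spectra
    are modelled as independent Gaussians."""
    return norm.cdf((mu_stress - mu_barrier)
                    / np.sqrt(sigma_stress**2 + sigma_barrier**2))

# Illustrative numbers (MPa): the mean internal stress ramps up linearly with
# time while the barrier spectrum stays fixed. The activation probability
# rises quickly at first and then flattens, mirroring the non-linear fabric
# response even though stress increases at a constant rate.
for t in range(1, 6):
    mu_stress = 20.0 * t
    print(t, round(activation_probability(mu_stress, 15.0, 30.0, 20.0), 3))
```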

  13. Comparison of Four Probabilistic Models (CARES, Calendex, ConsEspo, SHEDS) to Estimate Aggregate Residential Exposures to Pesticides

    EPA Science Inventory

    Two deterministic models (US EPA’s Office of Pesticide Programs Residential Standard Operating Procedures (OPP Residential SOPs) and Draft Protocol for Measuring Children’s Non-Occupational Exposure to Pesticides by all Relevant Pathways (Draft Protocol)) and four probabilistic mo...

  14. A Probabilistic Model for Students' Errors and Misconceptions on the Structure of Matter in Relation to Three Cognitive Variables

    ERIC Educational Resources Information Center

    Tsitsipis, Georgios; Stamovlasis, Dimitrios; Papageorgiou, George

    2012-01-01

    In this study, the effect of 3 cognitive variables such as logical thinking, field dependence/field independence, and convergent/divergent thinking on some specific students' answers related to the particulate nature of matter was investigated by means of probabilistic models. Besides recording and tabulating the students' responses, a combination…

  16. Probabilistic modeling of the fate of Listeria monocytogenes in diced bacon during the manufacturing process.

    PubMed

    Billoir, Elise; Denis, Jean-Baptiste; Cammeau, Natalie; Cornu, Marie; Zuliani, Veronique

    2011-02-01

    To assess the impact of the manufacturing process on the fate of Listeria monocytogenes, we built a generic probabilistic model intended to simulate the successive steps in the process. Contamination evolution was modeled in the appropriate units (breasts, dice, and then packaging units) through the successive steps in the process. To calibrate the model, parameter values were estimated from industrial data, from the literature, and based on expert opinion. By means of simulations, the model was explored using a baseline calibration and alternative scenarios, in order to assess the impact of changes in the process and of accidental events. The results are reported as contamination distributions and as the probability that the product will be acceptable with regard to the European regulatory safety criterion. Our results are consistent with data provided by industrial partners and highlight that tumbling is a key step for the distribution of the contamination at the end of the process. Process chain models could provide an important added value for risk assessment models that basically consider only the outputs of the process in their risk mitigation strategies. Moreover, a model calibrated to correspond to a specific plant could be used to optimize surveillance.

  17. Probabilistic Stack of 180 Plio-Pleistocene Benthic δ18O Records Constructed Using Profile Hidden Markov Models

    NASA Astrophysics Data System (ADS)

    Lisiecki, L. E.; Ahn, S.; Khider, D.; Lawrence, C.

    2015-12-01

    Stratigraphic alignment is the primary way in which long marine climate records are placed on a common age model. We previously presented a probabilistic pairwise alignment algorithm, HMM-Match, which uses hidden Markov models to estimate alignment uncertainty, and applied it to the alignment of benthic δ18O records to the "LR04" global benthic stack of Lisiecki and Raymo (2005) (Lin et al., 2014). However, since the LR04 stack is deterministic, the algorithm does not account for uncertainty in the stack. Here we address this limitation by developing a probabilistic stack, HMM-Stack. In this model the stack is a probabilistic inhomogeneous hidden Markov model, a.k.a. profile HMM. HMM-Stack is represented by a probabilistic model that "emits" each of the input records (Durbin et al., 1998). The unknown parameters of this model are learned from a set of input records using the expectation maximization (EM) algorithm. Because the multiple alignment of these records is unknown and uncertain, the expected contribution of each input point to each point in the stack is determined probabilistically. For each time step in HMM-Stack, δ18O values are described by a Gaussian probability distribution. Available δ18O records (N=180) are employed to estimate the mean and variance of δ18O at each time point. The mean of HMM-Stack follows the predicted pattern of glacial cycles, with increased amplitude after the Pliocene-Pleistocene boundary and also larger and longer cycles after the mid-Pleistocene transition. Furthermore, the δ18O variance increases with age, producing a substantial loss in the signal-to-noise ratio. Not surprisingly, uncertainty in alignment, and thus estimated age, also increases substantially in the older portion of the stack.

  18. The probabilistic niche model reveals the niche structure and role of body size in a complex food web.

    PubMed

    Williams, Richard J; Anandanadesan, Ananthi; Purves, Drew

    2010-08-09

    The niche model has been widely used to model the structure of complex food webs, and yet the ecological meaning of the single niche dimension has not been explored. In the niche model, each species has three traits: niche position, diet position, and feeding range. Here, a new probabilistic niche model, which allows the maximum likelihood set of trait values to be estimated for each species, is applied to the food web of the Benguela fishery. We also developed the allometric niche model, in which body size is used as the niche dimension. About 80% of the links in the empirical data are predicted by the probabilistic niche model, a significant improvement over recent models. As in the niche model, species are uniformly distributed on the niche axis. Feeding ranges are exponentially distributed, but diet positions are not uniformly distributed below the predator. Species traits are strongly correlated with body size, but the allometric niche model performs significantly worse than the probabilistic niche model. The best-fit parameter set provides a significantly better model of the structure of the Benguela food web than was previously available. The methodology allows the identification of a number of taxa that stand out as outliers, either in the model's poor performance at predicting their predators or prey or in their parameter values. While important, body size alone does not explain the structure of the one-dimensional niche.
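
    One common way to express a probabilistic niche model is as a Gaussian link-probability function of the prey's niche position relative to the predator's diet position and feeding range; the sketch below evaluates such a link-probability matrix for a toy four-species web. The maximum link probability and all trait values are illustrative assumptions, not the Benguela fit.

```python
import numpy as np

def link_probability(prey_niche, diet_center, feeding_range, p_max=0.95):
    """Probability that a predator consumes a prey species, as a Gaussian
    function of the prey's niche position relative to the predator's diet
    position and feeding range (one common probabilistic-niche-model form;
    the constants are illustrative)."""
    return p_max * np.exp(-((prey_niche - diet_center) / (feeding_range / 2.0))**2)

# Toy 4-species web on a single niche axis.
niche = np.array([0.1, 0.35, 0.6, 0.9])       # niche positions
diet = np.array([0.05, 0.2, 0.4, 0.55])       # diet positions (below the predator)
feed_range = np.array([0.1, 0.2, 0.25, 0.3])  # feeding ranges

P = link_probability(niche[None, :], diet[:, None], feed_range[:, None])
print(np.round(P, 2))   # P[i, j] = probability that species i eats species j
```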

  19. FOGCAST: Probabilistic fog forecasting based on operational (high-resolution) NWP models

    NASA Astrophysics Data System (ADS)

    Masbou, M.; Hacker, M.; Bentzien, S.

    2013-12-01

    The presence of fog and low clouds in the lower atmosphere can have a critical impact on both airborne and ground transport and is often connected with serious accidents. Improved prediction of fog localization, duration, and visibility variations therefore holds immense operational value. Fog is generally a small-scale phenomenon, mostly affected by local advective transport, radiation, turbulent mixing at the surface, and its microphysical structure. Sophisticated three-dimensional fog models, based on advanced microphysical parameterization schemes and high vertical resolution, have already been developed and give promising results. Nevertheless, their computational time is beyond the range of an operational setup. Therefore, mesoscale numerical weather prediction models are generally used for forecasting all kinds of weather situations. In spite of numerous improvements, the large uncertainty of small-scale weather events inherent in deterministic prediction cannot be evaluated adequately. Probabilistic guidance is necessary to assess these uncertainties and give reliable forecasts. In this study, fog forecasts are obtained by a diagnosis scheme similar to the Fog Stability Index (FSI), based on COSMO-DE model output. COSMO-DE is the German-focused high-resolution operational weather prediction model of the German Meteorological Service. The FSI and the respective fog occurrence probability are optimized and calibrated with statistical postprocessing in terms of logistic regression. In a second step, the number of predictors in the FOGCAST model has been optimized by use of the LASSO method (Least Absolute Shrinkage and Selection Operator). The results present objective out-of-sample verification based on the Brier score, performed for station data over Germany. Furthermore, the probabilistic fog forecast approach, FOGCAST, serves as a benchmark for the evaluation of more sophisticated 3D fog models. Several versions have been set up based on different
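
    The statistical postprocessing step can be sketched as an L1-penalized logistic regression that maps NWP-derived predictors to a calibrated fog probability and shrinks uninformative predictors toward zero, with the Brier score as a verification measure. The synthetic predictors and coefficients below are assumptions for illustration, not COSMO-DE output or the FOGCAST predictor set.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)

# Synthetic stand-ins for NWP-derived predictors (e.g. dew-point spread,
# 10 m wind speed, low-level stability); fog occurrence is the binary target.
n = 2000
X = rng.normal(size=(n, 6))
logit = -1.0 - 1.8 * X[:, 0] + 1.2 * X[:, 1]   # only the first two predictors matter
fog = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

# L1-penalized logistic regression: calibrates the fog probability and shrinks
# uninformative predictors to zero, in the spirit of the LASSO selection above.
clf = LogisticRegression(penalty="l1", solver="liblinear", C=0.5).fit(X, fog)
print("coefficients:", np.round(clf.coef_[0], 2))

# Brier score on the training sample (out-of-sample verification would use
# held-out stations, as in the abstract).
p = clf.predict_proba(X)[:, 1]
print("Brier score:", np.mean((p - fog) ** 2))
```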

  20. A probabilistic tornado wind hazard model for the continental United States

    SciTech Connect

    Hossain, Q; Kimball, J; Mensing, R; Savy, J

    1999-04-19

    A probabilistic tornado wind hazard model for the continental United States (CONUS) is described. The model incorporates both aleatory (random) and epistemic uncertainties associated with quantifying the tornado wind hazard parameters. The temporal occurrence of tornadoes within the CONUS is assumed to be a Poisson process. A spatial distribution of tornado touchdown locations is developed empirically based on the observed historical events within the CONUS. The hazard model is an aerial probability model that takes into consideration the size and orientation of the facility, the length and width of the tornado damage area (idealized as a rectangle and dependent on the tornado intensity scale), wind speed variation within the damage area, tornado intensity classification errors (i.e., errors in assigning a Fujita intensity scale based on surveyed damage), and the tornado path direction. Epistemic uncertainties in describing the distributions of the aleatory variables are accounted for by using more than one distribution model to describe aleatory variations. The epistemic uncertainties are based on inputs from a panel of experts. A computer program, TORNADO, has been developed incorporating this model; features of this program are also presented.
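
    A stripped-down version of the aerial-probability reasoning is sketched below: tornado occurrences are Poisson in time, touchdowns are spatially uniform, and the chance that any one tornado's damage rectangle covers the facility thins the Poisson rate. All areas and rates are illustrative placeholders, not the TORNADO code's CONUS parameters.

```python
import numpy as np

# Hypothetical annual tornado strike model for one facility (illustrative numbers).
area_region_km2 = 1.0e4        # region over which touchdowns are distributed
rate_per_year = 5.0            # Poisson rate of tornadoes in that region
damage_area_km2 = 2.0 * 0.2    # mean damage-path length x width
facility_area_km2 = 0.01

# Probability that a single tornado's damage rectangle covers the facility,
# assuming spatially uniform touchdowns (aerial-probability approximation).
p_hit_given_tornado = (damage_area_km2 + facility_area_km2) / area_region_km2

# Poisson thinning: strikes on the facility form a Poisson process with a reduced rate.
annual_strike_rate = rate_per_year * p_hit_given_tornado
p_at_least_one_strike = 1.0 - np.exp(-annual_strike_rate)
print("annual strike probability:", p_at_least_one_strike)
```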

  1. Multi-model approach to petroleum resource appraisal using analytic methodologies for probabilistic systems

    USGS Publications Warehouse

    Crovelli, R.A.

    1988-01-01

    The geologic appraisal model that is selected for a petroleum resource assessment depends upon purpose of the assessment, basic geologic assumptions of the area, type of available data, time available before deadlines, available human and financial resources, available computer facilities, and, most importantly, the available quantitative methodology with corresponding computer software and any new quantitative methodology that would have to be developed. Therefore, different resource assessment projects usually require different geologic models. Also, more than one geologic model might be needed in a single project for assessing different regions of the study or for cross-checking resource estimates of the area. Some geologic analyses used in the past for petroleum resource appraisal involved play analysis. The corresponding quantitative methodologies of these analyses usually consisted of Monte Carlo simulation techniques. A probabilistic system of petroleum resource appraisal for play analysis has been designed to meet the following requirements: (1) includes a variety of geologic models, (2) uses an analytic methodology instead of Monte Carlo simulation, (3) possesses the capacity to aggregate estimates from many areas that have been assessed by different geologic models, and (4) runs quickly on a microcomputer. Geologic models consist of four basic types: reservoir engineering, volumetric yield, field size, and direct assessment. Several case histories and present studies by the U.S. Geological Survey are discussed. © 1988 International Association for Mathematical Geology.

  2. Multi-model approach to petroleum resource appraisal using analytic methodologies for probabilistic systems

    SciTech Connect

    Crovelli, R.A.

    1988-11-01

    The geologic appraisal model that is selected for a petroleum resource assessment depends upon purpose of the assessment, basic geologic assumptions of the area, type of available data, time available before deadlines, available human and financial resources, available computer facilities, and, most importantly, the available quantitative methodology with corresponding computer software and any new quantitative methodology that would have to be developed. Therefore, different resource assessment projects usually require different geologic models. Also, more than one geologic model might be needed in a single project for assessing different regions of the study or for cross-checking resource estimates of the area. Some geologic analyses used in the past for petroleum resource appraisal involved play analysis. The corresponding quantitative methodologies of these analyses usually consisted of Monte Carlo simulation techniques. A probabilistic system of petroleum resource appraisal for play analysis has been designed to meet the following requirements: (1) includes a variety of geologic models, (2) uses an analytic methodology instead of Monte Carlo simulation, (3) possesses the capacity to aggregate estimates from many areas that have been assessed by different geologic models, and (4) runs quickly on a microcomputer. Geologic models consist of four basic types: reservoir engineering, volumetric yield, field size, and direct assessment. Several case histories and present studies by the US Geological Survey are discussed.

  3. Multi-State Physics Models of Aging Passive Components in Probabilistic Risk Assessment

    SciTech Connect

    Unwin, Stephen D.; Lowry, Peter P.; Layton, Robert F.; Heasler, Patrick G.; Toloczko, Mychailo B.

    2011-03-13

    Multi-state Markov modeling has proved to be a promising approach to estimating the reliability of passive components - particularly metallic pipe components - in the context of probabilistic risk assessment (PRA). These models consider the progressive degradation of a component through a series of observable discrete states, such as detectable flaw, leak and rupture. Service data then generally provides the basis for estimating the state transition rates. Research in materials science is producing a growing understanding of the physical phenomena that govern the aging degradation of passive pipe components. As a result, there is an emerging opportunity to incorporate these insights into PRA. This paper describes research conducted under the Risk-Informed Safety Margin Characterization Pathway of the Department of Energy’s Light Water Reactor Sustainability Program. A state transition model is described that addresses aging behavior associated with stress corrosion cracking in ASME Class 1 dissimilar metal welds – a component type relevant to LOCA analysis. The state transition rate estimates are based on physics models of weld degradation rather than service data. The resultant model is found to be non-Markov in that the transition rates are time-inhomogeneous and stochastic. Numerical solutions to the model provide insight into the effect of aging on component reliability.
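
    A simple way to see how time-inhomogeneous transition rates enter such a model is to integrate the forward (state-occupancy) equations of a four-state degradation chain numerically, as sketched below. The states, the rate functions, and their growth with time are illustrative assumptions, not the physics-based weld degradation models of the report.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Four-state degradation chain: intact -> detectable flaw -> leak -> rupture.
# The "rates" grow with time to mimic physics-informed, time-inhomogeneous aging
# (functional forms and numbers are illustrative only).
def rates(t):
    lam_flaw = 1e-3 * (1 + 0.05 * t)   # /yr, intact -> flaw
    lam_leak = 5e-4 * (1 + 0.10 * t)   # /yr, flaw -> leak
    lam_rupt = 1e-4 * (1 + 0.10 * t)   # /yr, leak -> rupture
    return lam_flaw, lam_leak, lam_rupt

def forward_equations(t, p):
    lf, ll, lr = rates(t)
    return [-lf * p[0],
            lf * p[0] - ll * p[1],
            ll * p[1] - lr * p[2],
            lr * p[2]]

sol = solve_ivp(forward_equations, (0.0, 60.0), [1.0, 0.0, 0.0, 0.0],
                t_eval=[10, 20, 40, 60])
print("P(rupture) at 10/20/40/60 yr:", np.round(sol.y[3], 6))
```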

  4. Probabilistic prediction of cyanobacteria abundance in a Korean reservoir using a Bayesian Poisson model

    NASA Astrophysics Data System (ADS)

    Cha, YoonKyung; Park, Seok So