Science.gov

Sample records for probabilistic model integrating

  1. Architecture for Integrated Medical Model Dynamic Probabilistic Risk Assessment

    NASA Technical Reports Server (NTRS)

    Jaworske, D. A.; Myers, J. G.; Goodenow, D.; Young, M.; Arellano, J. D.

    2016-01-01

    Probabilistic Risk Assessment (PRA) is a modeling tool used to predict potential outcomes of a complex system based on a statistical understanding of many initiating events. Utilizing a Monte Carlo method, thousands of instances of the model are considered and outcomes are collected. PRA is considered static, utilizing probabilities alone to calculate outcomes. Dynamic Probabilistic Risk Assessment (dPRA) is an advanced concept in which the model predicts the outcomes of a complex system based not only on the probabilities of many initiating events, but also on the dependencies that develop as the simulation progresses along a timeline. Events are placed on a single timeline, with each event added to a queue managed by a planner. Progression down the timeline is guided by rules managed by a scheduler. The recently developed Integrated Medical Model (IMM) summarizes astronaut health as governed by the probabilities of medical events and mitigation strategies. Managing the software architecture process provides a systematic means of creating, documenting, and communicating a software design early in the development process. The software architecture process begins with establishing requirements, and the design is then derived from the requirements.
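
    The planner/scheduler arrangement described here can be illustrated with a minimal discrete-event Monte Carlo sketch. All event names, rates, and consequence rules below are invented placeholders, not the IMM/dPRA implementation:

```python
import heapq
import random

def run_mission(duration_days=180, seed=None):
    """One Monte Carlo instance: a planner queues initiating events on a
    timeline; a scheduler pops them in time order and applies simple rules."""
    rng = random.Random(seed)
    timeline = []  # priority queue ordered by event time (the planner's queue)
    # Hypothetical initiating events: (name, events per day, consequence)
    for name, rate_per_day, consequence in [("minor_illness", 0.01, "treat"),
                                            ("equipment_fault", 0.005, "repair"),
                                            ("severe_event", 0.0005, "evacuate")]:
        day = rng.expovariate(rate_per_day)
        while day < duration_days:
            heapq.heappush(timeline, (day, name, consequence))
            day += rng.expovariate(rate_per_day)
    # The scheduler: progress down the timeline, applying rules to each event.
    lost_days, evacuated = 0.0, False
    while timeline:
        day, name, consequence = heapq.heappop(timeline)
        if consequence == "evacuate":
            evacuated = True          # rule: mission ends early
            break
        lost_days += 1.0              # rule: one day of mission time lost
    return evacuated, lost_days

# Collect outcomes over thousands of instances, as in a Monte Carlo PRA.
runs = [run_mission(seed=i) for i in range(10_000)]
print("P(evacuation) ~", sum(evac for evac, _ in runs) / len(runs))
```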

  2. Integration of Advanced Probabilistic Analysis Techniques with Multi-Physics Models

    SciTech Connect

    Cetiner, Mustafa Sacit; Flanagan, George F.; Poore III, Willis P.; Muhlheim, Michael David

    2014-07-30

    An integrated simulation platform that couples probabilistic analysis-based tools with model-based simulation tools can provide valuable insights for reactive and proactive responses to plant operating conditions. The objective of this work is to demonstrate the benefits of a partial implementation of the Small Modular Reactor (SMR) Probabilistic Risk Assessment (PRA) Detailed Framework Specification through the coupling of advanced PRA capabilities and accurate multi-physics plant models. Coupling a probabilistic model with a multi-physics model will aid in design, operations, and safety by providing a more accurate understanding of plant behavior. This represents the first attempt at actually integrating these two types of analyses for a control system used for operations, on a faster than real-time basis. This report documents the development of the basic communication capability to exchange data with the probabilistic model using Reliability Workbench (RWB) and the multi-physics model using Dymola. The communication pathways from injecting a fault (i.e., failing a component) to the probabilistic and multi-physics models were successfully completed. This first version was tested with prototypic models represented in both RWB and Modelica. First, a simple event tree/fault tree (ET/FT) model was created to develop the software code to implement the communication capabilities between the dynamic-link library (dll) and RWB. A program, written in C#, successfully communicates faults to the probabilistic model through the dll. A systems model of the Advanced Liquid-Metal Reactor–Power Reactor Inherently Safe Module (ALMR-PRISM) design developed under another DOE project was upgraded using Dymola to include proper interfaces to allow data exchange with the control application (ConApp). A program, written in C+, successfully communicates faults to the multi-physics model. The results of the example simulation were successfully plotted.

  3. PROBABILISTIC INFORMATION INTEGRATION TECHNOLOGY

    SciTech Connect

    J. BOOKER; M. MEYER; ET AL

    2001-02-01

    The Statistical Sciences Group at Los Alamos has successfully developed a structured, probabilistic, quantitative approach for the evaluation of system performance based on multiple information sources, called Information Integration Technology (IIT). The technology integrates diverse types and sources of data and information (both quantitative and qualitative), and their associated uncertainties, to develop distributions for performance metrics, such as reliability. Applications include predicting complex system performance, where test data are lacking or expensive to obtain, through the integration of expert judgment, historical data, computer/simulation model predictions, and any relevant test/experimental data. The technology is particularly well suited for tracking estimated system performance for systems under change (e.g. development, aging), and can be used at any time during product development, including concept and early design phases, prior to prototyping, testing, or production, and before costly design decisions are made. Techniques from various disciplines (e.g., state-of-the-art expert elicitation, statistical and reliability analysis, design engineering, physics modeling, and knowledge management) are merged and modified to develop formal methods for the data/information integration. The power of this technology, known as PREDICT (Performance and Reliability Evaluation with Diverse Information Combination and Tracking), won a 1999 R&D 100 Award (Meyer, Booker, Bement, Kerscher, 1999). Specifically, the PREDICT application is a formal, multidisciplinary process for estimating the performance of a product when test data are sparse or nonexistent. The acronym indicates the purpose of the methodology: to evaluate the performance or reliability of a product/system by combining all available (often diverse) sources of information and then tracking that performance as the product undergoes changes.

  4. The Integrated Medical Model: A Probabilistic Simulation Model Predicting In-Flight Medical Risks

    NASA Technical Reports Server (NTRS)

    Keenan, Alexandra; Young, Millennia; Saile, Lynn; Boley, Lynn; Walton, Marlei; Kerstman, Eric; Shah, Ronak; Goodenow, Debra A.; Myers, Jerry G., Jr.

    2015-01-01

    The Integrated Medical Model (IMM) is a probabilistic model that uses simulation to predict mission medical risk. Given a specific mission and crew scenario, medical events are simulated using Monte Carlo methodology to provide estimates of resource utilization, probability of evacuation, probability of loss of crew, and the amount of mission time lost due to illness. Mission and crew scenarios are defined by mission length, extravehicular activity (EVA) schedule, and crew characteristics including: sex, coronary artery calcium score, contacts, dental crowns, history of abdominal surgery, and EVA eligibility. The Integrated Medical Evidence Database (iMED) houses the model inputs for one hundred medical conditions using in-flight, analog, and terrestrial medical data. Inputs include incidence, event durations, resource utilization, and crew functional impairment. Severity of conditions is addressed by defining statistical distributions on the dichotomized best and worst-case scenarios for each condition. The outcome distributions for conditions are bounded by the treatment extremes of the fully treated scenario in which all required resources are available and the untreated scenario in which no required resources are available. Upon occurrence of a simulated medical event, treatment availability is assessed, and outcomes are generated depending on the status of the affected crewmember at the time of onset, including any pre-existing functional impairments or ongoing treatment of concurrent conditions. The main IMM outcomes, including probability of evacuation and loss of crew life, time lost due to medical events, and resource utilization, are useful in informing mission planning decisions. To date, the IMM has been used to assess mission-specific risks with and without certain crewmember characteristics, to determine the impact of eliminating certain resources from the mission medical kit, and to design medical kits that maximally benefit crew health while meeting
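
    A minimal sketch of the simulation loop described in this abstract. The condition names, incidence rates, outcome values, and medical-kit contents are hypothetical placeholders, not iMED data:

```python
import random

# Hypothetical condition table: (name, incidence per person-day,
# mission days lost if fully treated, days lost if untreated, kit item needed)
CONDITIONS = [
    ("skin_rash",    1e-3, 0.1,  0.5, "ointment"),
    ("back_injury",  4e-4, 1.0,  5.0, "analgesic"),
    ("kidney_stone", 5e-5, 2.0, 14.0, "iv_fluids"),
]

def simulate_mission(mission_days, crew_size, kit, rng):
    """One Monte Carlo trial: draw medical events, check resource availability,
    and accumulate outcomes between the fully treated and untreated bounds."""
    lost_days = 0.0
    for name, rate, treated_loss, untreated_loss, item in CONDITIONS:
        # One Bernoulli draw per person-day of the mission.
        n_events = sum(rng.random() < rate for _ in range(mission_days * crew_size))
        for _ in range(n_events):
            if kit.get(item, 0) > 0:
                kit[item] -= 1               # resource available: treated outcome
                lost_days += treated_loss
            else:
                lost_days += untreated_loss  # resource exhausted: untreated outcome
    return lost_days

rng = random.Random(0)
losses = [simulate_mission(180, 4, {"ointment": 10, "analgesic": 5, "iv_fluids": 1}, rng)
          for _ in range(2000)]
print("mean mission days lost ~", sum(losses) / len(losses))
```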

  5. The Integrated Medical Model: A Probabilistic Simulation Model for Predicting In-Flight Medical Risks

    NASA Technical Reports Server (NTRS)

    Keenan, Alexandra; Young, Millennia; Saile, Lynn; Boley, Lynn; Walton, Marlei; Kerstman, Eric; Shah, Ronak; Goodenow, Debra A.; Myers, Jerry G.

    2015-01-01

    The Integrated Medical Model (IMM) is a probabilistic model that uses simulation to predict mission medical risk. Given a specific mission and crew scenario, medical events are simulated using Monte Carlo methodology to provide estimates of resource utilization, probability of evacuation, probability of loss of crew, and the amount of mission time lost due to illness. Mission and crew scenarios are defined by mission length, extravehicular activity (EVA) schedule, and crew characteristics including: sex, coronary artery calcium score, contacts, dental crowns, history of abdominal surgery, and EVA eligibility. The Integrated Medical Evidence Database (iMED) houses the model inputs for one hundred medical conditions using in-flight, analog, and terrestrial medical data. Inputs include incidence, event durations, resource utilization, and crew functional impairment. Severity of conditions is addressed by defining statistical distributions on the dichotomized best and worst-case scenarios for each condition. The outcome distributions for conditions are bounded by the treatment extremes of the fully treated scenario in which all required resources are available and the untreated scenario in which no required resources are available. Upon occurrence of a simulated medical event, treatment availability is assessed, and outcomes are generated depending on the status of the affected crewmember at the time of onset, including any pre-existing functional impairments or ongoing treatment of concurrent conditions. The main IMM outcomes, including probability of evacuation and loss of crew life, time lost due to medical events, and resource utilization, are useful in informing mission planning decisions. To date, the IMM has been used to assess mission-specific risks with and without certain crewmember characteristics, to determine the impact of eliminating certain resources from the mission medical kit, and to design medical kits that maximally benefit crew health while meeting

  6. Using the Rasch model as an objective and probabilistic technique to integrate different soil properties

    NASA Astrophysics Data System (ADS)

    Rebollo, Francisco J.; Jesús Moral García, Francisco

    2016-04-01

    Soil apparent electrical conductivity (ECa) is one of the simplest, least expensive soil measurements that integrates many soil properties affecting crop productivity, including, for instance, soil texture, water content, and cation exchange capacity. The ECa measurements obtained with a Veris 3100 sensor, operating in both shallow (0-30 cm), ECs, and deep (0-90 cm), ECd, mode, can be used as additional and essential information to be included in a probabilistic model, the Rasch model, with the aim of quantifying the overall soil fertility potential in an agricultural field. This quantification should integrate the main soil physical and chemical properties, which are measured in different units. In this work, the formulation of the Rasch model integrates 11 soil properties (clay, silt and sand content, organic matter (OM), pH, total nitrogen (TN), available phosphorus (AP) and potassium (AK), cation exchange capacity (CEC), ECd, and ECs) measured at 70 locations in a field. The main outputs of the model include a ranking of all soil samples according to their relative fertility potential and the identification of unexpected behaviours of some soil samples and properties. In the case study, the considered soil variables fit the model reasonably well and have an important influence on soil fertility, except pH, probably due to its homogeneity in the field. Moreover, ECd and ECs are the most influential properties on soil fertility, whereas AP and AK are the least influential. The use of the Rasch model to estimate soil fertility potential (always in a relative way, taking into account the characteristics of the studied soil) constitutes a new application of great practical importance, making it possible to rationally determine locations in a field where high soil fertility potential exists and to identify soil samples or properties that behave anomalously; this information can be necessary to conduct site-specific treatments, leading to a more cost-effective and sustainable field
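
    For reference, the dichotomous form of the Rasch model gives the probability that sample n scores positively on property i as P(X_ni = 1) = exp(θ_n − δ_i) / (1 + exp(θ_n − δ_i)), where θ_n is the sample's relative fertility measure and δ_i is the property's location parameter; the study above applies the Rasch framework to categorized soil data, so this equation is shown only as a reference point for how sample measures and property parameters combine.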

  7. Integrating probabilistic models of perception and interactive neural networks: a historical and tutorial review

    PubMed Central

    McClelland, James L.

    2013-01-01

    This article seeks to establish a rapprochement between explicitly Bayesian models of contextual effects in perception and neural network models of such effects, particularly the connectionist interactive activation (IA) model of perception. The article is in part an historical review and in part a tutorial, reviewing the probabilistic Bayesian approach to understanding perception and how it may be shaped by context, and also reviewing ideas about how such probabilistic computations may be carried out in neural networks, focusing on the role of context in interactive neural networks, in which both bottom-up and top-down signals affect the interpretation of sensory inputs. It is pointed out that connectionist units that use the logistic or softmax activation functions can exactly compute Bayesian posterior probabilities when the bias terms and connection weights affecting such units are set to the logarithms of appropriate probabilistic quantities. Bayesian concepts such as the prior, likelihood, (joint and marginal) posterior, probability matching and maximizing, and calculating vs. sampling from the posterior are all reviewed and linked to neural network computations. Probabilistic and neural network models are explicitly linked to the concept of a probabilistic generative model that describes the relationship between the underlying target of perception (e.g., the word intended by a speaker or other source of sensory stimuli) and the sensory input that reaches the perceiver for use in inferring the underlying target. It is shown how a new version of the IA model called the multinomial interactive activation (MIA) model can sample correctly from the joint posterior of a proposed generative model for perception of letters in words, indicating that interactive processing is fully consistent with principled probabilistic computation. Ways in which these computations might be realized in real neural systems are also considered. PMID:23970868
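
    The claim that a logistic/softmax unit computes an exact Bayesian posterior when its bias is the log prior and its weights are log-likelihood ratios can be checked numerically. The two-hypothesis, three-feature numbers below are arbitrary illustrations, not values from the article:

```python
import numpy as np

# Arbitrary generative model: two hypotheses with prior P(h), and
# likelihoods P(feature_j = 1 | h) for three binary features.
prior = np.array([0.3, 0.7])
lik   = np.array([[0.9, 0.2, 0.6],     # P(f_j = 1 | h = 0)
                  [0.1, 0.7, 0.4]])    # P(f_j = 1 | h = 1)
x = np.array([1, 0, 1])                # observed feature vector

# Exact Bayes: P(h | x) proportional to P(h) * prod_j P(f_j = x_j | h)
joint = prior * np.prod(lik**x * (1 - lik)**(1 - x), axis=1)
posterior = joint / joint.sum()

# Softmax unit: bias = log prior + sum_j log P(f_j = 0 | h);
# weight_j = log odds of feature j under h; net input = bias + w.x
bias = np.log(prior) + np.log(1 - lik).sum(axis=1)
weights = np.log(lik) - np.log(1 - lik)
net = bias + weights @ x
softmax = np.exp(net) / np.exp(net).sum()

print(posterior, softmax)   # identical up to floating-point error
```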

  8. Probabilistic microcell prediction model

    NASA Astrophysics Data System (ADS)

    Kim, Song-Kyoo

    2002-06-01

    A microcell is a cell with a radius of 1 km or less, suitable for a heavily urbanized area such as a metropolitan city. This paper deals with a microcell prediction model of propagation loss that uses probabilistic techniques. The RSL (Receive Signal Level) is the factor by which the performance of a microcell can be evaluated, and the LOS (Line-Of-Sight) component and the blockage loss directly affect the RSL. Probabilistic methods are combined to obtain these performance factors. The mathematical methods include the CLT (Central Limit Theorem) and SPC (Statistical Process Control) to obtain the parameters of the distributions. This probabilistic solution gives a better measure of the performance factors. In addition, it allows probabilistic optimization of strategies such as the number of cells, cell location, cell capacity, cell range, and so on. In particular, the probabilistic optimization techniques by themselves can be applied to real-world problems such as computer networking, human resources, and manufacturing processes.

  9. Analysis of molecular expression patterns and integration with other knowledge bases using probabilistic Bayesian network models

    SciTech Connect

    Moler, Edward J.; Mian, I.S.

    2000-03-01

    How can molecular expression experiments be interpreted with more than 10⁴ measurements per chip? How can one get the most quantitative information possible from the experimental data with good confidence? These are important questions whose solutions require an interdisciplinary combination of molecular and cellular biology, computer science, statistics, and complex systems analysis. The explosion of data from microarray techniques presents the problem of interpreting the experiments. The availability of large-scale knowledge bases provides the opportunity to maximize the information extracted from these experiments. We have developed new methods of discovering biological function, metabolic pathways, and regulatory networks from these data and knowledge bases. These techniques are applicable to analyses for biomedical engineering, clinical, and fundamental cell and molecular biology studies. Our approach uses probabilistic, computational methods that give quantitative interpretations of data in a biological context. We have selected Bayesian statistical models with graphical network representations as a framework for our methods. As a first step, we use a naïve Bayesian classifier to identify statistically significant patterns in gene expression data. We have developed methods which allow us to (a) characterize which genes or experiments distinguish each class from the others, (b) cross-index the resulting classes with other databases to assess the biological meaning of the classes, and (c) display a gross overview of cellular dynamics. We have developed a number of visualization tools to convey the results. We report here our methods of classification and our first attempts at integrating the data and other knowledge bases together with new visualization tools. We demonstrate the utility of these methods and tools by analysis of a series of yeast cDNA microarray data and of a set of cancerous/normal sample data from colon cancer patients. We discuss
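
    A minimal sketch of the first step described above: a naïve Bayes classifier applied to expression profiles. The synthetic matrix stands in for microarray measurements, and scikit-learn is an assumed library choice, not necessarily the one used in the report:

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(0)

# Synthetic "expression matrix": 60 samples x 200 genes, two classes that
# differ in the mean expression of the first 20 genes.
n_samples, n_genes = 60, 200
y = rng.integers(0, 2, n_samples)
X = rng.normal(0.0, 1.0, (n_samples, n_genes))
X[y == 1, :20] += 1.5                      # class-distinguishing genes

model = GaussianNB().fit(X, y)

# (a) which genes distinguish the classes: rank by difference in class means
gene_scores = np.abs(model.theta_[1] - model.theta_[0])
print("top discriminating genes:", np.argsort(gene_scores)[::-1][:5])

# Posterior class probabilities give a quantitative, probabilistic call per sample.
print(model.predict_proba(X[:3]).round(3))
```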

  10. Crevice corrosion & pitting of high-level waste containers: the integration of deterministic & probabilistic models (II)

    SciTech Connect

    Farmer, J.C.

    1997-10-01

    An integrated predictive model is being developed to account for the effects of localized environmental conditions in crevices on the initiation and propagation of pits. A deterministic calculation is used to estimate the accumulation of hydrogen ions (pH suppression) in the crevice solution due to the hydrolysis of dissolved metals. Pit initiation and growth within the crevice is then dealt with by either a probabilistic model, or an equivalent deterministic model. Ultimately, the role of intergranular corrosion will have to be considered. While the strategy presented here is very promising, the integrated model is not yet ready for precise quantitative predictions. Empirical expressions for the rate of penetration based upon experimental crevice corrosion data can be used in the interim period, until the integrated model can be refined. Bounding calculations based upon such empirical expressions can provide important insight into worst-case scenarios.

  11. Development of Standardized Probabilistic Risk Assessment Models for Shutdown Operations Integrated in SPAR Level 1 Model

    SciTech Connect

    S. T. Khericha; J. Mitman

    2008-05-01

    Nuclear plant operating experience and several studies show that the risk from shutdown operation during Modes 4, 5, and 6 at pressurized water reactors and Modes 4 and 5 at boiling water reactors can be significant. This paper describes using the U.S. Nuclear Regulatory Commission’s full-power Standardized Plant Analysis Risk (SPAR) model as the starting point for development of risk evaluation models for commercial nuclear power plants. The shutdown models are integrated with their respective internal event at-power SPAR model. This is accomplished by combining the modified system fault trees from the SPAR full-power model with shutdown event tree logic. Preliminary human reliability analysis results indicate that risk is dominated by the operator’s ability to correctly diagnose events and initiate systems.

  12. Development of Probabilistic Risk Assessment Model for BWR Shutdown Modes 4 and 5 Integrated in SPAR Model

    SciTech Connect

    S. T. Khericha; S. Sancakter; J. Mitman; J. Wood

    2010-06-01

    Nuclear plant operating experience and several studies show that the risk from shutdown operation during modes 4, 5, and 6 can be significant. This paper describes development of the standard template risk evaluation models for shutdown modes 4 and 5 for commercial boiling water reactor (BWR) nuclear power plants. The shutdown probabilistic risk assessment model uses the Nuclear Regulatory Commission's (NRC's) full-power Standardized Plant Analysis Risk (SPAR) model as the starting point for development. The shutdown PRA models are integrated with their respective internal-events at-power SPAR model. This is accomplished by combining the modified system fault trees from the SPAR full-power model with shutdown event tree logic. For human reliability analysis (HRA), the SPAR-H method is used, which requires the analyst to complete relatively straightforward worksheets, including the performance shaping factors (PSFs). The results are then used to estimate the human error probabilities (HEPs) of interest. The preliminary results indicate that the risk is dominated by the operator's ability to diagnose the events and provide long-term cooling.

  13. Probabilistic Mesomechanical Fatigue Model

    NASA Technical Reports Server (NTRS)

    Tryon, Robert G.

    1997-01-01

    A probabilistic mesomechanical fatigue life model is proposed to link the microstructural material heterogeneities to the statistical scatter in the macrostructural response. The macrostructure is modeled as an ensemble of microelements. Cracks nucleate within the microelements and grow from the microelements to final fracture. Variations of the microelement properties are defined using statistical parameters. A micromechanical slip band decohesion model is used to determine the crack nucleation life and size. A crack tip opening displacement model is used to determine the small crack growth life and size. The Paris law is used to determine the long crack growth life. The models are combined in a Monte Carlo simulation to determine the statistical distribution of total fatigue life for the macrostructure. The modeled response is compared to trends in experimental observations from the literature.
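
    A schematic Monte Carlo combination of the three life stages described above. The distributions, material constants, and the closed-form Paris-law integration are illustrative assumptions, not the calibrated model:

```python
import math
import random

def paris_life(a0, ac, C, m, delta_sigma):
    """Cycles for a long crack to grow from a0 to ac (m) under the Paris law
    da/dN = C * (dK)^m with dK = delta_sigma * sqrt(pi * a), delta_sigma in MPa."""
    geom = delta_sigma * math.sqrt(math.pi)
    return (a0**(1 - m / 2) - ac**(1 - m / 2)) / (C * geom**m * (m / 2 - 1))

def total_life(rng):
    """Sample one microelement realization and sum the three life stages."""
    n_nucleation = rng.lognormvariate(math.log(1e5), 0.5)   # crack nucleation life
    n_small      = rng.lognormvariate(math.log(5e4), 0.4)   # small-crack growth life
    a0           = rng.uniform(20e-6, 60e-6)                # long-crack start size (m)
    n_long = paris_life(a0, ac=5e-3, C=1e-11, m=3.0, delta_sigma=200.0)
    return n_nucleation + n_small + n_long

rng = random.Random(1)
lives = sorted(total_life(rng) for _ in range(10_000))
print("median life:", round(lives[len(lives) // 2]), "cycles")
print("1st percentile:", round(lives[len(lives) // 100]), "cycles")
```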

  14. Modeling development of natural multi-sensory integration using neural self-organisation and probabilistic population codes

    NASA Astrophysics Data System (ADS)

    Bauer, Johannes; Dávila-Chacón, Jorge; Wermter, Stefan

    2015-10-01

    Humans and other animals have been shown to perform near-optimally in multi-sensory integration tasks. Probabilistic population codes (PPCs) have been proposed as a mechanism by which optimal integration can be accomplished. Previous approaches have focussed on how neural networks might produce PPCs from sensory input or perform calculations using them, like combining multiple PPCs. Less attention has been given to the question of how the necessary organisation of neurons can arise and how the required knowledge about the input statistics can be learned. In this paper, we propose a model of learning multi-sensory integration based on an unsupervised learning algorithm in which an artificial neural network learns the noise characteristics of each of its sources of input. Our algorithm borrows from the self-organising map the ability to learn latent-variable models of the input and extends it to learning to produce a PPC approximating a probability density function over the latent variable behind its (noisy) input. The neurons in our network are only required to perform simple calculations and we make few assumptions about input noise properties and tuning functions. We report on a neurorobotic experiment in which we apply our algorithm to multi-sensory integration in a humanoid robot to demonstrate its effectiveness and compare it to human multi-sensory integration on the behavioural level. We also show in simulations that our algorithm performs near-optimally under certain plausible conditions, and that it reproduces important aspects of natural multi-sensory integration on the neural level.
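
    The "near-optimal" benchmark usually invoked in such studies is the precision-weighted fusion of independent Gaussian cue estimates. The sketch below shows only that reference computation (with made-up auditory/visual numbers), not the self-organising network proposed in the paper:

```python
def optimal_integration(mu_a, var_a, mu_v, var_v):
    """Maximum-likelihood fusion of two independent Gaussian cue estimates:
    each cue is weighted by its inverse variance, and the fused variance shrinks."""
    w_a = (1 / var_a) / (1 / var_a + 1 / var_v)
    mu = w_a * mu_a + (1 - w_a) * mu_v
    var = 1 / (1 / var_a + 1 / var_v)
    return mu, var

# Example: noisy auditory estimate at 12 deg, more reliable visual estimate at 8 deg.
mu, var = optimal_integration(mu_a=12.0, var_a=16.0, mu_v=8.0, var_v=4.0)
print(f"fused estimate: {mu:.1f} deg, variance {var:.1f}")   # 8.8 deg, 3.2
```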

  15. Crevice corrosion and pitting of high-level waste containers: Integration of deterministic and probabilistic models

    SciTech Connect

    Farmer, J.C.; McCright, R.D.

    1998-12-31

    A key component of the Engineered Barrier System (EBS) being designed for containment of spent-fuel and high-level waste at the proposed geological repository at Yucca Mountain, Nevada is a two-layer canister. In this particular design, the inner barrier is made of a corrosion resistant material (CRM) such as Alloy 625 or C-22, while the outer barrier is made of a corrosion-allowance material (CAM) such as carbon steel or Alloy 400. An integrated predictive model is being developed to account for the effects of localized environmental conditions in the CRM-CAM crevice on the initiation and propagation of pits through the CRM.

  16. An integrative C. elegans protein-protein interaction network with reliability assessment based on a probabilistic graphical model.

    PubMed

    Huang, Xiao-Tai; Zhu, Yuan; Chan, Leanne Lai Hang; Zhao, Zhongying; Yan, Hong

    2016-01-01

    In Caenorhabditis elegans, a large number of protein-protein interactions (PPIs) have been identified by different experiments. However, a comprehensive weighted PPI network, which is essential for signaling pathway inference, is not yet available in this model organism. Therefore, we first construct an integrative PPI network in C. elegans with 12,951 interactions involving 5039 proteins from seven molecular interaction databases. Then, a reliability score based on a probabilistic graphical model (RSPGM) is proposed to assess PPIs. It assumes that the random number of interactions between two proteins comes from the Bernoulli distribution to avoid multi-links. The main parameter of the RSPGM score contains a few latent variables which can be considered as several common properties between two proteins. Validations on high-confidence yeast datasets show that RSPGM provides more accurate evaluation than other approaches, and the PPIs in the reconstructed PPI network have higher biological relevance than those in the original network in terms of gene ontology, gene expression, essentiality and the prediction of known protein complexes. Furthermore, this weighted integrative PPI network in C. elegans is also employed to infer the interaction path of the canonical Wnt/β-catenin pathway. Most genes on the inferred interaction path have been validated to be Wnt pathway components. Therefore, RSPGM is essential and effective for evaluating PPIs and inferring interaction paths. Finally, the PPI network with RSPGM scores can be queried and visualized on a user-interactive website, which is freely available at . PMID:26555698

  17. The integration of bioclimatic indices in an objective probabilistic model for establishing and mapping viticulture suitability in a region

    NASA Astrophysics Data System (ADS)

    Moral García, Francisco J.; Rebollo, Francisco J.; Paniagua, Luis L.; García, Abelardo

    2014-05-01

    Different bioclimatic indices have been proposed to determine wine suitability in a region. Some of them are related to air temperature, but the hydric component of climate, which is in turn influenced by precipitation during the different stages of the grapevine growing and ripening periods, should also be considered. In this work we propose using the information obtained from 10 bioclimatic indices and variables (heliothermal index, HI; cool night index, CI; dryness index, DI; growing season temperature, GST; the Winkler index, WI; September mean thermal amplitude, MTA; annual precipitation, AP; precipitation during flowering, PDF; precipitation before flowering, PBF; and summer precipitation, SP) as inputs to an objective and probabilistic model, the Rasch model, with the aim of integrating their individual effects, obtaining climate data that summarize all the main bioclimatic indices which could influence wine suitability, and utilizing the Rasch measures to generate homogeneous climatic zones. The use of the Rasch model to estimate viticultural suitability constitutes a new application of great practical importance, making it possible to rationally determine locations in a region where high viticultural potential exists and to establish a ranking of the bioclimatic indices or variables which exert an important influence on wine suitability in a region. Furthermore, from the measures of viticultural suitability at some locations, estimates can be computed using a geostatistical algorithm, and these estimates can be utilized to map viticultural suitability potential in a region. To illustrate the process, an application to Extremadura, southwestern Spain, is shown. Keywords: Rasch model, bioclimatic indices, GIS.

  18. Integration of climatic indices in an objective probabilistic model for establishing and mapping viticultural climatic zones in a region

    NASA Astrophysics Data System (ADS)

    Moral, Francisco J.; Rebollo, Francisco J.; Paniagua, Luis L.; García, Abelardo; Honorio, Fulgencio

    2016-05-01

    Different climatic indices have been proposed to determine wine suitability in a region. Some of them are related to air temperature, but the hydric component of climate, which is in turn influenced by precipitation during the different stages of the grapevine growing and ripening periods, should also be considered. In this study, we propose using the information obtained from ten climatic indices [heliothermal index (HI), cool night index (CI), dryness index (DI), growing season temperature (GST), the Winkler index (WI), September mean thermal amplitude (MTA), annual precipitation (AP), precipitation during flowering (PDF), precipitation before flowering (PBF), and summer precipitation (SP)] as inputs to an objective and probabilistic model, the Rasch model, with the aim of integrating their individual effects, obtaining climate data that summarize all the main climatic indices which could influence wine suitability from a climate viewpoint, and utilizing the Rasch measures to generate homogeneous climatic zones. The use of the Rasch model to estimate viticultural climatic suitability constitutes a new application of great practical importance, making it possible to rationally determine locations in a region where high viticultural potential exists and to establish a ranking of the climatic indices which exert an important influence on wine suitability in a region. Furthermore, from the measures of viticultural climatic suitability at some locations, estimates can be computed using a geostatistical algorithm, and these estimates can be utilized to map viticultural climatic zones in a region. To illustrate the process, an application to Extremadura, southwestern Spain, is shown.

  19. Probabilistic Models for Solar Particle Events

    NASA Technical Reports Server (NTRS)

    Adams, James H., Jr.; Dietrich, W. F.; Xapsos, M. A.; Welton, A. M.

    2009-01-01

    Probabilistic Models of Solar Particle Events (SPEs) are used in space mission design studies to provide a description of the worst-case radiation environment that the mission must be designed to tolerate. The models determine the worst-case environment using a description of the mission and a user-specified confidence level that the provided environment will not be exceeded. This poster will focus on completing the existing suite of models by developing models for peak flux and event-integrated fluence elemental spectra for the Z>2 elements. It will also discuss methods to take into account uncertainties in the database and the uncertainties resulting from the limited number of solar particle events in the database. These new probabilistic models are based on an extensive survey of SPE measurements of peak and event-integrated elemental differential energy spectra. Attempts are made to fit the measured spectra with eight different published models. The model giving the best fit to each spectrum is chosen and used to represent that spectrum for any energy in the energy range covered by the measurements. The set of all such spectral representations for each element is then used to determine the worst-case spectrum as a function of confidence level. The spectral representation that best fits these worst-case spectra is found and its dependence on confidence level is parameterized. This procedure creates probabilistic models for the peak and event-integrated spectra.
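
    A highly simplified illustration of the worst-case-at-confidence-level idea: given a set of fitted spectral representations evaluated on a common energy grid, the environment not exceeded at a chosen confidence level can be read off as a per-energy quantile. The power-law stand-ins below replace the eight published spectral forms and the real event database:

```python
import numpy as np

rng = np.random.default_rng(0)
energies = np.logspace(0, 2, 20)        # illustrative energy grid (MeV/nuc)

# Stand-in "fitted spectral representations" for 50 observed events:
# power laws with random amplitude and index in place of the real fits.
amplitudes = rng.lognormal(10, 1, 50)
indices = rng.uniform(2, 4, 50)
events = np.array([a * energies**(-g) for a, g in zip(amplitudes, indices)])

def worst_case(confidence):
    """Per-energy fluence not exceeded by the given fraction of events."""
    return np.quantile(events, confidence, axis=0)

print(worst_case(0.90)[:3])             # worst-case spectrum at 90% confidence
```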

  20. Development of Probabilistic Structural Analysis Integrated with Manufacturing Processes

    NASA Technical Reports Server (NTRS)

    Pai, Shantaram S.; Nagpal, Vinod K.

    2007-01-01

    An effort has been initiated to integrate manufacturing process simulations with probabilistic structural analyses in order to capture the important impacts of manufacturing uncertainties on component stress levels and life. Two physics-based manufacturing process models (one for powdered metal forging and the other for annular deformation resistance welding) have been linked to the NESSUS structural analysis code. This paper describes the methodology developed to perform this integration including several examples. Although this effort is still underway, particularly for full integration of a probabilistic analysis, the progress to date has been encouraging and a software interface that implements the methodology has been developed. The purpose of this paper is to report this preliminary development.

  1. Integrating Sequence Evolution into Probabilistic Orthology Analysis.

    PubMed

    Ullah, Ikram; Sjöstrand, Joel; Andersson, Peter; Sennblad, Bengt; Lagergren, Jens

    2015-11-01

    Orthology analysis, that is, finding out whether a pair of homologous genes are orthologs - stemming from a speciation - or paralogs - stemming from a gene duplication - is of central importance in computational biology, genome annotation, and phylogenetic inference. In particular, an orthologous relationship makes functional equivalence of the two genes highly likely. A major approach to orthology analysis is to reconcile a gene tree to the corresponding species tree (most commonly performed using the most parsimonious reconciliation, MPR). However, most such phylogenetic orthology methods infer the gene tree without considering the constraints implied by the species tree and, perhaps even more importantly, only allow the gene sequences to influence the orthology analysis through the a priori reconstructed gene tree. We propose a sound, comprehensive Bayesian Markov chain Monte Carlo-based method, DLRSOrthology, to compute orthology probabilities. It efficiently sums over the possible gene trees and jointly takes into account the current gene tree, all possible reconciliations to the species tree, and the typically strong signal conveyed by the sequences. We compare our method with PrIME-GEM, a probabilistic orthology approach built on a probabilistic duplication-loss model, and MrBayesMPR, a probabilistic orthology approach that is based on conventional Bayesian inference coupled with MPR. We find that DLRSOrthology outperforms these competing approaches on synthetic data as well as on biological data sets and is robust to incomplete taxon sampling artifacts. PMID:26130236

  2. Integration of fuzzy analytic hierarchy process and probabilistic dynamic programming in formulating an optimal fleet management model

    NASA Astrophysics Data System (ADS)

    Teoh, Lay Eng; Khoo, Hooi Ling

    2013-09-01

    This study deals with two major aspects of airlines, i.e., supply and demand management. The supply aspect focuses on the mathematical formulation of an optimal fleet management model to maximize the operational profit of the airline, while the demand aspect focuses on the incorporation of mode choice modeling as part of the developed model. The proposed methodology is outlined in two stages: the Fuzzy Analytic Hierarchy Process is first adopted to capture mode choice modeling in order to quantify the probability of probable phenomena (for the aircraft acquisition/leasing decision). Then, an optimization model is developed as a probabilistic dynamic programming model to determine the optimal number and types of aircraft to be acquired and/or leased in order to meet stochastic demand during the planning horizon. The findings of an illustrative case study show that the proposed methodology is viable. The results demonstrate that the incorporation of mode choice modeling could affect the operational profit and fleet management decisions of the airline to varying degrees.

  3. Approaches to implementing deterministic models in a probabilistic framework

    SciTech Connect

    Talbott, D.V.

    1995-04-01

    The increasing use of results from probabilistic risk assessments in the decision-making process makes it ever more important to eliminate simplifications in probabilistic models that might lead to conservative results. One area in which conservative simplifications are often made is modeling the physical interactions that occur during the progression of an accident sequence. This paper demonstrates and compares different approaches for incorporating deterministic models of physical parameters into probabilistic models; parameter range binning, response curves, and integral deterministic models. An example that combines all three approaches in a probabilistic model for the handling of an energetic material (i.e. high explosive, rocket propellant,...) is then presented using a directed graph model.

  4. INTEGRATED PROBABILISTIC AND DETERMINISTIC MODELING TECHNIQUES IN ESTIMATING EXPOSURE TO WATER-BORNE CONTAMINANTS: PART 2 PHARMACOKINETIC MODELING

    EPA Science Inventory

    The Total Exposure Model (TEM) uses deterministic and stochastic methods to estimate the exposure of a person performing daily activities of eating, drinking, showering, and bathing. There were 250 time histories generated, by subject with activities, for the three exposure ro...

  5. Probabilistic Survivability Versus Time Modeling

    NASA Technical Reports Server (NTRS)

    Joyner, James J., Sr.

    2015-01-01

    This technical paper documents the Kennedy Space Center Independent Assessment team's work completed on three assessments for the Ground Systems Development and Operations (GSDO) Program to assist the Chief Safety and Mission Assurance Officer (CSO) and GSDO management during key programmatic reviews. The assessments provided the GSDO Program with an analysis of how egress time affects the likelihood of astronaut and worker survival during an emergency. For each assessment, the team developed probability distributions for hazard scenarios to address statistical uncertainty, resulting in survivability plots over time. The first assessment developed a mathematical model of probabilistic survivability versus time to reach a safe location using an ideal Emergency Egress System at Launch Complex 39B (LC-39B); the second used the first model to evaluate and compare various egress systems under consideration at LC-39B. The third used a modified LC-39B model to determine if a specific hazard decreased survivability more rapidly than other events during flight hardware processing in Kennedy's Vehicle Assembly Building (VAB). Based on the composite survivability versus time graphs from the first two assessments, there was a soft knee in the Figure of Merit graphs at eight minutes (ten minutes after egress was ordered). Thus, the graphs illustrated to the decision makers that the final emergency egress design selected should have the capability of transporting the flight crew from the top of LC-39B to a safe location in eight minutes or less. Results for the third assessment were dominated by hazards that were classified as instantaneous in nature (e.g., stacking mishaps) and therefore had no effect on survivability versus time to egress the VAB. VAB emergency scenarios that degraded over time (e.g., fire) produced survivability versus time graphs that were in line with aerospace industry norms.

  6. Probabilistic Modeling of Rosette Formation

    PubMed Central

    Long, Mian; Chen, Juan; Jiang, Ning; Selvaraj, Periasamy; McEver, Rodger P.; Zhu, Cheng

    2006-01-01

    Rosetting, or forming a cell aggregate between a single target nucleated cell and a number of red blood cells (RBCs), is a simple assay for cell adhesion mediated by specific receptor-ligand interaction. For example, rosette formation between sheep RBCs and human lymphocytes has been used to differentiate T cells from B cells. The rosetting assay is commonly used to determine the interaction of Fc γ-receptors (FcγR) expressed on inflammatory cells and IgG coated on RBCs. Despite its wide use in measuring cell adhesion, the biophysical parameters of rosette formation have not been well characterized. Here we developed a probabilistic model to describe the distribution of rosette sizes, which is Poissonian. The average rosette size is predicted to be proportional to the apparent two-dimensional binding affinity of the interacting receptor-ligand pair and their site densities. The model has been supported by experiments on rosettes mediated by four molecular interactions: FcγRIII interacting with IgG, T cell receptor and coreceptor CD8 interacting with antigen peptide presented by a major histocompatibility molecule, P-selectin interacting with P-selectin glycoprotein ligand 1 (PSGL-1), and L-selectin interacting with PSGL-1. The latter two are structurally similar and are different from the former two. Fitting the model to data enabled us to evaluate the apparent effective two-dimensional binding affinity of the interacting molecular pairs: 7.19 × 10⁻⁵ μm⁴ for the FcγRIII-IgG interaction, 4.66 × 10⁻³ μm⁴ for the P-selectin-PSGL-1 interaction, and 0.94 × 10⁻³ μm⁴ for the L-selectin-PSGL-1 interaction. These results elucidate the biophysical mechanism of rosette formation and enable it to become a semiquantitative assay that relates the rosette size to the effective affinity for receptor-ligand binding. PMID:16603493
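
    A small sketch of the two central relations stated in this abstract: rosette size is Poisson distributed, and its mean scales with the apparent two-dimensional affinity and the two site densities. The proportionality constant is taken as unity and the density values are invented, so the numbers are purely illustrative:

```python
import math

def poisson_pmf(n, mean):
    """P(rosette size = n) for a Poisson-distributed rosette count."""
    return mean**n * math.exp(-mean) / math.factorial(n)

def apparent_affinity(mean_size, m_receptor, m_ligand):
    """Invert <n> ~ AcKa * m_r * m_l to estimate the apparent effective
    two-dimensional affinity from a measured average rosette size."""
    return mean_size / (m_receptor * m_ligand)

# Illustrative numbers only (site densities in sites/um^2):
AcKa = apparent_affinity(mean_size=3.2, m_receptor=50.0, m_ligand=300.0)
print(f"apparent effective 2D affinity ~ {AcKa:.2e} um^4")
print("P(size = 0..5):", [round(poisson_pmf(n, 3.2), 3) for n in range(6)])
```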

  7. Modelling structured data with Probabilistic Graphical Models

    NASA Astrophysics Data System (ADS)

    Forbes, F.

    2016-05-01

    Most clustering and classification methods are based on the assumption that the objects to be clustered are independent. However, in more and more modern applications, data are structured in a way that makes this assumption unrealistic and potentially misleading. A typical example that can be viewed as a clustering task is image segmentation, where the objects are the pixels on a regular grid and depend on neighbouring pixels on this grid. Also, when data are geographically located, it is of interest to cluster data with an underlying dependence structure accounting for some spatial localisation. These spatial interactions can be naturally encoded via a graph, not necessarily as regular as a grid. Data sets can then be modelled via Markov random fields and mixture models (e.g. the so-called MRF and Hidden MRF). More generally, probabilistic graphical models are tools that can be used to represent and manipulate data in a structured way while modeling uncertainty. This chapter introduces the basic concepts. The two main classes of probabilistic graphical models are considered: Bayesian networks and Markov networks. The key concept of conditional independence and its link to Markov properties is presented. The main problems that can be solved with such tools are described. Some illustrations associated with practical work are given.

  8. iTOUGH2-IFC: An Integrated Flow Code in Support of Nagra's Probabilistic Safety Assessment:--User's Guide and Model Description

    SciTech Connect

    Finsterle, Stefan A.

    2009-01-02

    This document describes the development and use of the Integrated Flow Code (IFC), a numerical code and related model to be used for the simulation of time-dependent, two-phase flow in the near field and geosphere of a gas-generating nuclear waste repository system located in an initially fully water-saturated claystone (Opalinus Clay) in Switzerland. The development of the code and model was supported by the Swiss National Cooperative for the Disposal of Radioactive Waste (Nagra), Wettingen, Switzerland. Gas generation (mainly H₂, but also CH₄ and CO₂) may affect repository performance by (1) compromising the engineered barriers through excessive pressure build-up, (2) displacing potentially contaminated pore water, (3) releasing radioactive gases (e.g., those containing ¹⁴C and ³H), (4) changing hydrogeologic properties of the engineered barrier system and the host rock, and (5) altering the groundwater flow field and thus radionuclide migration paths. The IFC aims at providing water and gas flow fields as the basis for the subsequent radionuclide transport simulations, which are performed by the radionuclide transport code (RTC). The IFC, RTC and a waste-dissolution and near-field transport model (STMAN) are part of the Integrated Radionuclide Release Code (IRRC), which integrates all safety-relevant features, events, and processes (FEPs). The IRRC is embedded into a Probabilistic Safety Assessment (PSA) computational tool that (1) evaluates alternative conceptual models, scenarios, and disruptive events, and (2) performs Monte-Carlo sampling to account for parametric uncertainties. The preliminary probabilistic safety assessment concept and the role of the IFC are visualized in Figure 1. The IFC was developed based on Nagra's PSA concept. Specifically, as many phenomena as possible are to be directly simulated using a (simplified) process model, which is at the core of the IRRC model. Uncertainty evaluation (scenario uncertainty

  9. Learning probabilistic document template models via interaction

    NASA Astrophysics Data System (ADS)

    Ahmadullin, Ildus; Damera-Venkata, Niranjan

    2013-03-01

    Document aesthetics measures are key to automated document composition. Recently we presented a probabilistic document model (PDM) which is a micro-model for document aesthetics based on a probabilistic modeling of designer choice in document design. The PDM model comes with efficient layout synthesis algorithms once the aesthetic model is defined. A key element of this approach is an aesthetic prior on the parameters of a template encoding aesthetic preferences for template parameters. Parameters of the prior were required to be chosen empirically by designers. In this work we show how probabilistic template models (and hence the PDM cost function) can be learnt directly by observing a designer making design choices in composing sample documents. From such training data our learning approach can learn a quality measure that can mimic some of the design tradeoffs a designer makes in practice.

  10. Probabilistic Solar Energetic Particle Models

    NASA Technical Reports Server (NTRS)

    Adams, James H., Jr.; Dietrich, William F.; Xapsos, Michael A.

    2011-01-01

    To plan and design safe and reliable space missions, it is necessary to take into account the effects of the space radiation environment. This is done by setting the goal of achieving safety and reliability with some desired level of confidence. To achieve this goal, a worst-case space radiation environment at the required confidence level must be obtained. Planning and designing then proceeds, taking into account the effects of this worst-case environment. The result will be a mission that is reliable against the effects of the space radiation environment at the desired confidence level. In this paper we will describe progress toward developing a model that provides worst-case space radiation environments at user-specified confidence levels. We will present a model for worst-case event-integrated solar proton environments that provides the worst-case differential proton spectrum. This model is based on data from the IMP-8 and GOES spacecraft, which provide a database extending from 1974 to the present. We will discuss extending this work to create worst-case models for peak flux and mission-integrated fluence for protons. We will also describe plans for similar models for helium and heavier ions.

  11. A Probabilistic Model of Melody Perception

    ERIC Educational Resources Information Center

    Temperley, David

    2008-01-01

    This study presents a probabilistic model of melody perception, which infers the key of a melody and also judges the probability of the melody itself. The model uses Bayesian reasoning: For any "surface" pattern and underlying "structure," we can infer the structure maximizing P(structure | surface) based on knowledge of P(surface,…

  12. A probabilistic model of semantic plausibility in sentence processing.

    PubMed

    Padó, Ulrike; Crocker, Matthew W; Keller, Frank

    2009-07-01

    Experimental research shows that human sentence processing uses information from different levels of linguistic analysis, for example, lexical and syntactic preferences as well as semantic plausibility. Existing computational models of human sentence processing, however, have focused primarily on lexico-syntactic factors. Those models that do account for semantic plausibility effects lack a general model of human plausibility intuitions at the sentence level. Within a probabilistic framework, we propose a wide-coverage model that both assigns thematic roles to verb-argument pairs and determines a preferred interpretation by evaluating the plausibility of the resulting (verb, role, argument) triples. The model is trained on a corpus of role-annotated language data. We also present a transparent integration of the semantic model with an incremental probabilistic parser. We demonstrate that both the semantic plausibility model and the combined syntax/semantics model predict judgment and reading time data from the experimental literature. PMID:21585487

  13. A probabilistic model for binaural sound localization.

    PubMed

    Willert, Volker; Eggert, Julian; Adamy, Jürgen; Stahl, Raphael; Körner, Edgar

    2006-10-01

    This paper proposes a biologically inspired and technically implemented sound localization system to robustly estimate the position of a sound source in the frontal azimuthal half-plane. For localization, binaural cues are extracted using cochleagrams generated by a cochlear model that serve as input to the system. The basic idea of the model is to separately measure interaural time differences and interaural level differences for a number of frequencies and process these measurements as a whole. This leads to two-dimensional frequency versus time-delay representations of binaural cues, so-called activity maps. A probabilistic evaluation is presented to estimate the position of a sound source over time based on these activity maps. Learned reference maps for different azimuthal positions are integrated into the computation to gain time-dependent discrete conditional probabilities. At every timestep these probabilities are combined over frequencies and binaural cues to estimate the sound source position. In addition, they are propagated over time to improve position estimation. This leads to a system that is able to localize audible signals, for example human speech signals, even in reverberating environments. PMID:17036807
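
    A schematic version of the combine-and-propagate computation described above: per-cue, per-frequency likelihoods over a discretised azimuth are multiplied together at each timestep and blended with the propagated posterior. The azimuth grid, likelihood shapes, and smoothing weight are invented for illustration and do not reproduce the cochlear front end or learned reference maps:

```python
import numpy as np

azimuths = np.linspace(-90, 90, 37)                       # candidate positions (deg)
posterior = np.full(len(azimuths), 1 / len(azimuths))     # start uniform

def update(posterior, cue_likelihoods, alpha=0.8):
    """Combine likelihoods over cues/frequencies, then blend with the
    propagated previous posterior (alpha controls temporal smoothing)."""
    lik = np.ones_like(posterior)
    for l in cue_likelihoods:             # product over frequencies and cues
        lik *= l
    new = alpha * posterior * lik + (1 - alpha) / len(posterior)
    return new / new.sum()

rng = np.random.default_rng(0)
true_idx = 25                             # hidden source position (index)
for _ in range(20):                       # 20 timesteps of noisy evidence
    cues = [np.exp(-0.5 * ((np.arange(len(azimuths)) - true_idx) / 6.0) ** 2)
            + 0.2 * rng.random(len(azimuths)) for _ in range(4)]
    posterior = update(posterior, cues)
print("estimated azimuth:", azimuths[posterior.argmax()], "deg")
```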

  14. SHEDS-HT: An Integrated Probabilistic Exposure Model for Prioritizing Exposures to Chemicals with Near-Field and Dietary Sources

    EPA Science Inventory

    United States Environmental Protection Agency (USEPA) researchers are developing a strategy for highthroughput (HT) exposure-based prioritization of chemicals under the ExpoCast program. These novel modeling approaches for evaluating chemicals based on their potential for biologi...

  15. Probabilistic Seismic Risk Model for Western Balkans

    NASA Astrophysics Data System (ADS)

    Stejskal, Vladimir; Lorenzo, Francisco; Pousse, Guillaume; Radovanovic, Slavica; Pekevski, Lazo; Dojcinovski, Dragi; Lokin, Petar; Petronijevic, Mira; Sipka, Vesna

    2010-05-01

    A probabilistic seismic risk model for insurance and reinsurance purposes is presented for an area of Western Balkans, covering former Yugoslavia and Albania. This territory experienced many severe earthquakes during past centuries producing significant damage to many population centres in the region. The highest hazard is related to external Dinarides, namely to the collision zone of the Adriatic plate. The model is based on a unified catalogue for the region and a seismic source model consisting of more than 30 zones covering all the three main structural units - Southern Alps, Dinarides and the south-western margin of the Pannonian Basin. A probabilistic methodology using Monte Carlo simulation was applied to generate the hazard component of the model. Unique set of damage functions based on both loss experience and engineering assessments is used to convert the modelled ground motion severity into the monetary loss.

  16. ISSUES ASSOCIATED WITH PROBABILISTIC FAILURE MODELING OF DIGITAL SYSTEMS

    SciTech Connect

    CHU,T.L.; MARTINEZ-GURIDI,G.; LEHNER,J.; OVERLAND,D.

    2004-09-19

    The current U.S. Nuclear Regulatory Commission (NRC) licensing process of instrumentation and control (I&C) systems is based on deterministic requirements, e.g., single failure criteria, and defense in depth and diversity. Probabilistic considerations can be used as supplements to the deterministic process. The National Research Council has recommended development of methods for estimating failure probabilities of digital systems, including commercial off-the-shelf (COTS) equipment, for use in probabilistic risk assessment (PRA). NRC staff has developed informal qualitative and quantitative requirements for PRA modeling of digital systems. Brookhaven National Laboratory (BNL) has performed a review of the state of the art of the methods and tools that can potentially be used to model digital systems. The objectives of this paper are to summarize the review, discuss the issues associated with probabilistic modeling of digital systems, and identify potential areas of research that would enhance the state of the art toward a satisfactory modeling method that could be integrated with a typical probabilistic risk assessment.

  17. Probabilistic drought classification using gamma mixture models

    NASA Astrophysics Data System (ADS)

    Mallya, Ganeshchandra; Tripathi, Shivam; Govindaraju, Rao S.

    2015-07-01

    Drought severity is commonly reported using drought classes obtained by assigning pre-defined thresholds on drought indices. Current drought classification methods ignore modeling uncertainties and provide discrete drought classifications. However, the users of drought classification are often interested in knowing the inherent uncertainties in classification so that they can make informed decisions. Recent studies have used hidden Markov models (HMM) for quantifying uncertainties in drought classification. The HMM method conceptualizes drought classes as distinct hydrological states that are not observed (hidden) but affect observed hydrological variables. The number of drought classes or hidden states in the model is pre-specified, which can sometimes result in a model over-specification problem. This study proposes an alternate method for probabilistic drought classification where the number of states in the model is determined by the data. The proposed method adapts the Standardized Precipitation Index (SPI) methodology of drought classification by employing a gamma mixture model (Gamma-MM) in a Bayesian framework. The method alleviates the problem of choosing a suitable distribution for fitting data in SPI analysis, quantifies modeling uncertainties, and propagates them for probabilistic drought classification. The method is tested on rainfall data over India. Comparison of the results with standard SPI shows important differences, particularly when SPI assumptions on data distribution are violated. Further, the new method is simpler and more parsimonious than the HMM-based drought classification method and can be a viable alternative for probabilistic drought classification.
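
    A minimal sketch of the classification step only: once a gamma mixture has been fitted to the precipitation series, each new observation receives posterior probabilities of belonging to each component, and those probabilities are the probabilistic drought classes. The two-component parameters below are placeholders, not fitted values, and the full method also determines the number of components from the data:

```python
from scipy.stats import gamma

# Placeholder fitted mixture: a "dry" and a "wet" component, each a gamma
# distribution with a mixture weight, shape, and scale.
components = [
    {"name": "dry", "weight": 0.35, "shape": 2.0, "scale": 15.0},
    {"name": "wet", "weight": 0.65, "shape": 6.0, "scale": 20.0},
]

def class_probabilities(precip_mm):
    """Posterior P(component | observation) under the fitted gamma mixture."""
    dens = [c["weight"] * gamma.pdf(precip_mm, a=c["shape"], scale=c["scale"])
            for c in components]
    total = sum(dens)
    return {c["name"]: d / total for c, d in zip(components, dens)}

print(class_probabilities(25.0))     # dominated by the "dry" component
print(class_probabilities(140.0))    # dominated by the "wet" component
```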

  18. Probabilistic, meso-scale flood loss modelling

    NASA Astrophysics Data System (ADS)

    Kreibich, Heidi; Botto, Anna; Schröter, Kai; Merz, Bruno

    2016-04-01

    Flood risk analyses are an important basis for decisions on flood risk management and adaptation. However, such analyses are associated with significant uncertainty, even more so if changes in risk due to global change are expected. Although uncertainty analysis and probabilistic approaches have received increased attention during the last years, they are still not standard practice for flood risk assessments, and even less so for flood loss modelling. The state of the art in flood loss modelling is still the use of simple, deterministic approaches like stage-damage functions. Novel probabilistic, multi-variate flood loss models have been developed and validated on the micro-scale using a data-mining approach, namely bagging decision trees (Merz et al. 2013). In this presentation we demonstrate and evaluate the upscaling of the approach to the meso-scale, namely on the basis of land-use units. The model is applied in 19 municipalities which were affected during the 2002 flood by the River Mulde in Saxony, Germany (Botto et al. submitted). The application of the bagging decision tree based loss model provides a probability distribution of estimated loss per municipality. Validation is undertaken on the one hand via a comparison with eight deterministic loss models, including stage-damage functions as well as multi-variate models. On the other hand, the results are compared with official loss data provided by the Saxon Relief Bank (SAB). The results show that uncertainties of loss estimation remain high. Thus, the significant advantage of this probabilistic flood loss estimation approach is that it inherently provides quantitative information about the uncertainty of the prediction. References: Merz, B.; Kreibich, H.; Lall, U. (2013): Multi-variate flood damage assessment: a tree-based data-mining approach. NHESS, 13(1), 53-64. Botto A, Kreibich H, Merz B, Schröter K (submitted) Probabilistic, multi-variable flood loss modelling on the meso-scale with BT-FLEMO. Risk Analysis.
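
    A minimal sketch of the bagging-decision-tree idea behind this kind of loss model, using scikit-learn on synthetic data: the spread of per-tree predictions serves as the probability distribution of estimated loss. Feature names, the synthetic loss relation and all numbers are assumptions for illustration only, not BT-FLEMO itself.

```python
import numpy as np
from sklearn.ensemble import BaggingRegressor
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(1)

# Synthetic training data: water depth (m), building area (m2), contamination flag
# (feature names and the loss relation are illustrative only).
n = 500
X = np.column_stack([rng.uniform(0, 3, n), rng.uniform(50, 400, n), rng.integers(0, 2, n)])
rloss = np.clip(0.2 * X[:, 0] + 0.1 * X[:, 2] + rng.normal(0, 0.05, n), 0, 1)

# note: use base_estimator= instead of estimator= on scikit-learn < 1.2
model = BaggingRegressor(estimator=DecisionTreeRegressor(), n_estimators=200, random_state=0)
model.fit(X, rloss)

# The per-tree predictions approximate a predictive distribution of relative loss.
x_new = np.array([[1.5, 120.0, 1]])
tree_preds = np.array([tree.predict(x_new)[0] for tree in model.estimators_])
print("median loss:", np.median(tree_preds))
print("5%-95% interval:", np.quantile(tree_preds, [0.05, 0.95]))
```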

  19. Transitions in a probabilistic interface growth model

    NASA Astrophysics Data System (ADS)

    Alves, S. G.; Moreira, J. G.

    2011-04-01

    We study a generalization of the Wolf-Villain (WV) interface growth model based on a probabilistic growth rule. In the WV model, particles are randomly deposited onto a substrate and subsequently move to a position nearby where the binding is strongest. We introduce a growth probability which is proportional to a power of the number $n_i$ of bindings of the site $i$: $p_i \propto n_i^{\nu}$.
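
    A toy 1-D sketch of a probabilistic deposition rule of this type, in which a randomly chosen site accepts a particle with probability proportional to n_i^nu. The crude neighbour counting and the absence of the WV relaxation step are simplifications, not the authors' exact model.

```python
import numpy as np

rng = np.random.default_rng(2)

L, steps, nu = 200, 200_000, 2.0
h = np.zeros(L, dtype=int)               # interface heights on a periodic 1-D substrate

def bindings(i):
    """Crude count of lateral bonds a particle would form at site i (1, 2 or 3)."""
    left, right = h[(i - 1) % L], h[(i + 1) % L]
    return 1 + (left > h[i]) + (right > h[i])

for _ in range(steps):
    i = rng.integers(L)
    n_i = bindings(i)
    # accept deposition with probability proportional to n_i**nu (normalised by the maximum, 3**nu)
    if rng.random() < (n_i ** nu) / (3 ** nu):
        h[i] += 1

print("interface width:", h.std())
```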

  20. The probabilistic structure of planetary contamination models

    NASA Technical Reports Server (NTRS)

    Harrison, J. M.; North, W. D.

    1973-01-01

    The analytical basis for planetary quarantine standards and procedures is presented. The hierarchy of planetary quarantine decisions is explained and emphasis is placed on the determination of mission specifications to include sterilization. The influence of the Sagan-Coleman probabilistic model of planetary contamination on current standards and procedures is analyzed. A classical problem in probability theory which provides a close conceptual parallel to the type of dependence present in the contamination problem is presented.

  1. Probabilistic risk assessment of the Space Shuttle. Phase 3: A study of the potential of losing the vehicle during nominal operation. Volume 2: Integrated loss of vehicle model

    NASA Technical Reports Server (NTRS)

    Fragola, Joseph R.; Maggio, Gaspare; Frank, Michael V.; Gerez, Luis; Mcfadden, Richard H.; Collins, Erin P.; Ballesio, Jorge; Appignani, Peter L.; Karns, James J.

    1995-01-01

    The application of the probabilistic risk assessment methodology to a Space Shuttle environment, particularly to the potential of losing the Shuttle during nominal operation, is addressed. The different related concerns are identified and combined to determine overall program risks. A fault tree model is used to allocate system probabilities to the subsystem level. The loss of the vehicle due to failure to contain energetic gas and debris or to maintain proper propulsion and configuration is analyzed, along with loss due to Orbiter failure, external tank failure, and landing failure or error.

  2. Analytic gain in probabilistic decompression sickness models.

    PubMed

    Howle, Laurens E

    2013-11-01

    Decompression sickness (DCS) is a disease known to be related to inert gas bubble formation originating from gases dissolved in body tissues. Probabilistic DCS models, which employ survival and hazard functions, are optimized by fitting model parameters to experimental dive data. In the work reported here, I develop methods to find the survival function gain parameter analytically, thus removing it from the fitting process. I show that the number of iterations required for model optimization is significantly reduced. The analytic gain method substantially improves the condition number of the Hessian matrix which reduces the model confidence intervals by more than an order of magnitude. PMID:24209920
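
    Schematically, probabilistic DCS models of this class express the outcome probability through an integrated hazard scaled by a gain parameter g, which is why an analytic treatment of g removes one dimension from the numerical fit. The form below is a generic sketch, not necessarily the exact parameterisation of the cited paper.

```latex
% Schematic survival-model form used by probabilistic DCS models of this kind
% (generic; the cited paper's parameterisation may differ).
% r(t): instantaneous risk from the modelled gas/bubble dynamics, g: gain parameter.
P(\text{no DCS}) = \exp\!\left(-g\int_{0}^{T} r(t)\,dt\right),
\qquad
P(\text{DCS}) = 1 - \exp\!\left(-g\int_{0}^{T} r(t)\,dt\right)
```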

  3. Modelling default and likelihood reasoning as probabilistic reasoning

    NASA Technical Reports Server (NTRS)

    Buntine, Wray

    1990-01-01

    A probabilistic analysis of plausible reasoning about defaults and about likelihood is presented. 'Likely' and 'by default' are in fact treated as duals in the same sense as 'possibility' and 'necessity'. To model these four forms probabilistically, a logic QDP and its quantitative counterpart DP are derived that allow qualitative and corresponding quantitative reasoning. Consistency and consequence results for subsets of the logics are given that require at most a quadratic number of satisfiability tests in the underlying propositional logic. The quantitative logic shows how to track the propagation error inherent in these reasoning forms. The methodology and sound framework of the system highlights their approximate nature, the dualities, and the need for complementary reasoning about relevance.

  4. Towards an integrated probabilistic nowcasting system (En-INCA)

    NASA Astrophysics Data System (ADS)

    Suklitsch, M.; Kann, A.; Bica, B.

    2015-04-01

    Ensemble prediction systems are attracting more and more interest for various applications. Ensemble nowcasting systems in particular are increasingly requested by different end users. In this study we introduce such an integrated probabilistic nowcasting system, En-INCA. In a case study we show the added value and increased skill of the new system and demonstrate the improved performance in comparison with a state-of-the-art LAM-EPS.

  5. Probabilistic models for feedback systems.

    SciTech Connect

    Grace, Matthew D.; Boggs, Paul T.

    2011-02-01

    In previous work, we developed a Bayesian-based methodology to analyze the reliability of hierarchical systems. The output of the procedure is a statistical distribution of the reliability, thus allowing many questions to be answered. The principal advantage of the approach is that along with an estimate of the reliability, we also can provide statements of confidence in the results. The model is quite general in that it allows general representations of all of the distributions involved, it incorporates prior knowledge into the models, it allows errors in the 'engineered' nodes of a system to be determined by the data, and leads to the ability to determine optimal testing strategies. In this report, we provide the preliminary steps necessary to extend this approach to systems with feedback. Feedback is an essential component of 'complexity' and provides interesting challenges in modeling the time-dependent action of a feedback loop. We provide a mechanism for doing this and analyze a simple case. We then consider some extensions to more interesting examples with local control affecting the entire system. Finally, a discussion of the status of the research is also included.

  6. Probabilistic Analysis Techniques Applied to Complex Spacecraft Power System Modeling

    NASA Technical Reports Server (NTRS)

    Hojnicki, Jeffrey S.; Rusick, Jeffrey J.

    2005-01-01

    Electric power system performance predictions are critical to spacecraft, such as the International Space Station (ISS), to ensure that sufficient power is available to support all the spacecraft's power needs. In the case of the ISS power system, analyses to date have been deterministic, meaning that each analysis produces a single-valued result for power capability because of the complexity and large size of the model. As a result, the deterministic ISS analyses did not account for the sensitivity of the power capability to uncertainties in model input variables. Over the last 10 years, the NASA Glenn Research Center has developed advanced, computationally fast, probabilistic analysis techniques and successfully applied them to large (thousands of nodes) complex structural analysis models. These same techniques were recently applied to large, complex ISS power system models. This new application enables probabilistic power analyses that account for input uncertainties and produce results that include variations caused by these uncertainties. Specifically, N&R Engineering, under contract to NASA, integrated these advanced probabilistic techniques with Glenn's internationally recognized ISS power system model, System Power Analysis for Capability Evaluation (SPACE).

  7. Probabilistic computer model of optimal runway turnoffs

    NASA Technical Reports Server (NTRS)

    Schoen, M. L.; Preston, O. W.; Summers, L. G.; Nelson, B. A.; Vanderlinden, L.; Mcreynolds, M. C.

    1985-01-01

    Landing delays are currently a problem at major air carrier airports and many forecasters agree that airport congestion will get worse by the end of the century. It is anticipated that some types of delays can be reduced by an efficient optimal runway exit system allowing the increased approach volumes necessary at congested airports. A computerized Probabilistic Runway Turnoff Model which locates exits and defines path geometry for a selected maximum occupancy time appropriate for each TERPS aircraft category is defined. The model includes an algorithm for lateral ride comfort limits.

  8. En-INCA: Towards an integrated probabilistic nowcasting system

    NASA Astrophysics Data System (ADS)

    Suklitsch, Martin; Stuhl, Barbora; Kann, Alexander; Bica, Benedikt

    2014-05-01

    INCA (Integrated Nowcasting through Comprehensive Analysis), the analysis and nowcasting system operated by ZAMG, is based on blending observations and NWP data. Its performance is extremely high in the nowcasting range. However, uncertainties can be large even in the very short term and limit its practical use. Severe weather conditions are particularly demanding, which is why the quantification of uncertainties and the determination of probabilities of event occurrence add value for various applications. The Nowcasting Ensemble System En-INCA achieves this by coupling the INCA nowcast with ALADIN-LAEF, the EPS of the limited area model ALADIN that has been operated at ZAMG successfully for years. In En-INCA, the nowcasting approach of INCA is blended with different EPS members in order to derive an ensemble of forecasts in the nowcasting range. In addition to NWP-based uncertainties, specific perturbations with respect to the observations, the analysis and the nowcasting techniques are discussed, and the influence of learning from errors in previous nowcasts is shown. En-INCA is a link between INCA and ALADIN-LAEF, merging the advantages of both systems: observation-based nowcasting at very high resolution on the one hand and the uncertainty estimation of a state-of-the-art LAM-EPS on the other hand. Probabilistic nowcasting products can support various end users, e.g. civil protection agencies and the power industry, in optimizing their decision-making processes.

  9. Integration of Evidence Base into a Probabilistic Risk Assessment

    NASA Technical Reports Server (NTRS)

    Saile, Lyn; Lopez, Vilma; Bickham, Grandin; Kerstman, Eric; FreiredeCarvalho, Mary; Byrne, Vicky; Butler, Douglas; Myers, Jerry; Walton, Marlei

    2011-01-01

    INTRODUCTION: A probabilistic decision support model such as the Integrated Medical Model (IMM) utilizes an immense amount of input data that necessitates a systematic, integrated approach for data collection and management. As a result of this approach, IMM is able to forecast medical events, resource utilization and crew health during space flight. METHODS: Inflight data is the most desirable input for the Integrated Medical Model. Non-attributable inflight data is collected from the Lifetime Surveillance of Astronaut Health study as well as from the engineers, flight surgeons, and astronauts themselves. When inflight data is unavailable, cohort studies, other models and Bayesian analyses are used, in addition to subject matter experts' input on occasion. To determine the quality of evidence of a medical condition, the data source is categorized and assigned a level of evidence from 1-5; the highest level is one. The collected data reside and are managed in a relational SQL database with a web-based interface for data entry and review. The database is also capable of interfacing with outside applications, which expands capabilities within the database itself. Via the public interface, customers can access a formatted Clinical Findings Form (CLiFF) that outlines the model input and evidence base for each medical condition. Changes to the database are tracked using a documented Configuration Management process. DISCUSSION: This strategic approach provides a comprehensive data management plan for IMM. The IMM Database's structure and architecture has proven to support additional usages, as seen in the resource utilization analysis across medical conditions. In addition, the IMM Database's web-based interface provides a user-friendly format for customers to browse and download the clinical information for medical conditions. It is this type of functionality that will provide Exploratory Medicine Capabilities the evidence base for their medical condition list

  10. Opportunities of probabilistic flood loss models

    NASA Astrophysics Data System (ADS)

    Schröter, Kai; Kreibich, Heidi; Lüdtke, Stefan; Vogel, Kristin; Merz, Bruno

    2016-04-01

    Oftentimes, traditional uni-variate damage models such as depth-damage curves fail to reproduce the variability of observed flood damage. However, reliable flood damage models are a prerequisite for the practical usefulness of the model results. Innovative multi-variate probabilistic modelling approaches are promising to capture and quantify the uncertainty involved and thus to improve the basis for decision making. In this study we compare the predictive capability of two probabilistic modelling approaches, namely Bagging Decision Trees and Bayesian Networks, and traditional stage-damage functions. For model evaluation we use empirical damage data which are available from computer aided telephone interviews that were compiled after the floods in 2002, 2005, 2006 and 2013 in the Elbe and Danube catchments in Germany. We carry out a split sample test by sub-setting the damage records. One sub-set is used to derive the models and the remaining records are used to evaluate the predictive performance of the model. Further, we stratify the sample according to catchments, which allows studying model performance in a spatial transfer context. Flood damage estimation is carried out on the scale of the individual buildings in terms of relative damage. The predictive performance of the models is assessed in terms of systematic deviations (mean bias), precision (mean absolute error) as well as in terms of sharpness of the predictions and reliability, which is represented by the proportion of observations that fall within the 95-quantile and 5-quantile predictive interval. The comparison of the uni-variable stage-damage function and the multi-variable model approach emphasises the importance of quantifying predictive uncertainty. With each explanatory variable, the multi-variable model reveals an additional source of uncertainty. However, the predictive performance in terms of bias (mbe), precision (mae) and reliability (HR) is clearly improved
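
    The evaluation criteria listed above (mean bias, mean absolute error, and the hit rate of the 5%-95% predictive interval) can be computed from predictive samples as in the sketch below; the data in the usage example are synthetic.

```python
import numpy as np

def evaluate(obs, pred_samples):
    """Score probabilistic loss predictions against observed relative losses.

    obs: (n,) observed losses; pred_samples: (n, m) predictive samples per case
    (e.g. the per-tree predictions of a bagged ensemble)."""
    point = pred_samples.mean(axis=1)
    mbe = np.mean(point - obs)                       # systematic deviation (bias)
    mae = np.mean(np.abs(point - obs))               # precision
    lo = np.quantile(pred_samples, 0.05, axis=1)
    hi = np.quantile(pred_samples, 0.95, axis=1)
    hit_rate = np.mean((obs >= lo) & (obs <= hi))    # reliability of the 5%-95% interval
    return mbe, mae, hit_rate

# toy usage with made-up numbers
rng = np.random.default_rng(3)
obs = rng.uniform(0, 1, 50)
pred = obs[:, None] + rng.normal(0, 0.1, (50, 200))
print(evaluate(obs, pred))
```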

  11. Economic Dispatch for Microgrid Containing Electric Vehicles via Probabilistic Modelling

    SciTech Connect

    Yao, Yin; Gao, Wenzhong; Momoh, James; Muljadi, Eduard

    2015-10-06

    In this paper, an economic dispatch model with probabilistic modeling is developed for a microgrid. Electric power supply in the microgrid consists of conventional power plants and renewable energy power plants, such as wind and solar power plants. Due to the fluctuation of solar and wind plants' output, an empirical probabilistic model is developed to predict their hourly output. According to the different characteristics of wind and solar plants, the parameters of the probabilistic distributions are further adjusted individually for both plant types. On the other hand, with the growing trend of Plug-in Hybrid Electric Vehicles (PHEV), an integrated microgrid system must also consider the impact of PHEVs. Not only the charging loads from PHEVs, but also the discharging output via the Vehicle-to-Grid (V2G) method can greatly affect the economic dispatch for all the micro energy sources in the microgrid. This paper presents an optimization method for economic dispatch in a microgrid considering conventional and renewable power plants as well as PHEVs. The simulation results reveal that PHEVs with V2G capability can be an indispensable supplement in a modern microgrid.

  12. Probabilistic Modeling of the Renal Stone Formation Module

    NASA Technical Reports Server (NTRS)

    Best, Lauren M.; Myers, Jerry G.; Goodenow, Debra A.; McRae, Michael P.; Jackson, Travis C.

    2013-01-01

    The Integrated Medical Model (IMM) is a probabilistic tool used in mission planning decision making and medical systems risk assessments. The IMM project maintains a database of over 80 medical conditions that could occur during a spaceflight, documenting an incidence rate and end case scenarios for each. In some cases, where observational data are insufficient to adequately define the inflight medical risk, the IMM utilizes external probabilistic modules to model and estimate the event likelihoods. One such medical event of interest is an unpassed renal stone. Due to a high salt diet and high concentrations of calcium in the blood (due to bone depletion caused by unloading in the microgravity environment), astronauts are at a considerably elevated risk of developing renal calculi (nephrolithiasis) while in space. The lack of observed incidences of nephrolithiasis has led HRP to initiate the development of the Renal Stone Formation Module (RSFM) to create a probabilistic simulator capable of estimating the likelihood of symptomatic renal stone presentation in astronauts on exploration missions. The model consists of two major parts. The first is the probabilistic component, which utilizes probability distributions to assess the range of urine electrolyte parameters and a multivariate regression to transform estimated crystal density and size distributions into the likelihood of the presentation of nephrolithiasis symptoms. The second is a deterministic physical and chemical model of renal stone growth in the kidney developed by Kassemi et al. The probabilistic component of the renal stone model couples the input probability distributions describing the urine chemistry, astronaut physiology, and system parameters with the physical and chemical outputs and inputs of the deterministic stone growth model. These two parts of the model are necessary to capture the uncertainty in the likelihood estimate. The model will be driven by Monte Carlo simulations, continuously

  13. A probabilistic graphical model based stochastic input model construction

    SciTech Connect

    Wan, Jiang; Zabaras, Nicholas

    2014-09-01

    Model reduction techniques have been widely used in modeling of high-dimensional stochastic input in uncertainty quantification tasks. However, the probabilistic modeling of random variables projected into reduced-order spaces presents a number of computational challenges. Due to the curse of dimensionality, the underlying dependence relationships between these random variables are difficult to capture. In this work, a probabilistic graphical model based approach is employed to learn the dependence by running a number of conditional independence tests using observation data. Thus a probabilistic model of the joint PDF is obtained and the PDF is factorized into a set of conditional distributions based on the dependence structure of the variables. The estimation of the joint PDF from data is then transformed to estimating conditional distributions under reduced dimensions. To improve the computational efficiency, a polynomial chaos expansion is further applied to represent the random field in terms of a set of standard random variables. This technique is combined with both linear and nonlinear model reduction methods. Numerical examples are presented to demonstrate the accuracy and efficiency of the probabilistic graphical model based stochastic input models. - Highlights: • Data-driven stochastic input models without the assumption of independence of the reduced random variables. • The problem is transformed to a Bayesian network structure learning problem. • Examples are given in flows in random media.

  14. Integrated Environmental Control Model

    Energy Science and Technology Software Center (ESTSC)

    1999-09-03

    IECM is a powerful multimedia engineering software program for simulating an integrated coal-fired power plant. It provides a capability to model various conventional and advanced processes for controlling air pollutant emissions from coal-fired power plants before, during, or after combustion. The principal purpose of the model is to calculate the performance, emissions, and cost of power plant configurations employing alternative environmental control methods. The model consists of various control technology modules, which may be integrated into a complete utility plant in any desired combination. In contrast to conventional deterministic models, the IECM offers the unique capability to assign probabilistic values to all model input parameters, and to obtain probabilistic outputs in the form of cumulative distribution functions indicating the likelihood of different costs and performance results. A Graphical User Interface (GUI) facilitates the configuration of the technologies, entry of data, and retrieval of results.

  15. Probabilistic Resilience in Hidden Markov Models

    NASA Astrophysics Data System (ADS)

    Panerati, Jacopo; Beltrame, Giovanni; Schwind, Nicolas; Zeltner, Stefan; Inoue, Katsumi

    2016-05-01

    Originally defined in the context of ecological systems and environmental sciences, resilience has grown to be a property of major interest for the design and analysis of many other complex systems: resilient networks and robotic systems offer the desirable capability of absorbing disruption and transforming in response to external shocks, while still providing the services they were designed for. Starting from an existing formalization of resilience for constraint-based systems, we develop a probabilistic framework based on hidden Markov models. In doing so, we introduce two new important features: stochastic evolution and partial observability. Using our framework, we formalize a methodology for the evaluation of probabilities associated with generic properties, we describe an efficient algorithm for the computation of its essential inference step, and show that its complexity is comparable to other state-of-the-art inference algorithms.
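
    The essential inference step for such hidden-Markov-model frameworks is typically a forward-style recursion; a minimal sketch for a discrete HMM is given below (the concrete algorithm and state space of the paper may differ).

```python
import numpy as np

def forward(pi, A, B, obs):
    """Forward algorithm: P(observation sequence) for a discrete HMM.

    pi: (S,) initial state distribution, A: (S, S) transition matrix,
    B: (S, O) emission matrix, obs: sequence of observation indices."""
    alpha = pi * B[:, obs[0]]
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
    return alpha.sum()

# toy 2-state example (numbers illustrative)
pi = np.array([0.7, 0.3])
A = np.array([[0.9, 0.1], [0.2, 0.8]])
B = np.array([[0.8, 0.2], [0.3, 0.7]])
print(forward(pi, A, B, [0, 0, 1, 0]))
```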

  16. Multivariate Probabilistic Analysis of an Hydrological Model

    NASA Astrophysics Data System (ADS)

    Franceschini, Samuela; Marani, Marco

    2010-05-01

    Model predictions derived from rainfall measurements and hydrological model results are often limited by the systematic error of measuring instruments, by the intrinsic variability of the natural processes and by the uncertainty of the mathematical representation. We propose a means to identify such sources of uncertainty and to quantify their effects based on point-estimate approaches, as a valid alternative to cumbersome Monte Carlo methods. We present uncertainty analyses on the hydrologic response to selected meteorological events, in the mountain streamflow-generating portion of the Brenta basin at Bassano del Grappa, Italy. The Brenta river catchment has a relatively uniform morphology and quite a heterogeneous rainfall pattern. In the present work, we evaluate two sources of uncertainty: data uncertainty (the uncertainty due to data handling and analysis) and model uncertainty (the uncertainty related to the formulation of the model). We thus evaluate the effects of the measurement error of tipping-bucket rain gauges, the uncertainty in estimating spatially-distributed rainfall through block kriging, and the uncertainty associated with estimated model parameters. To this end, we coupled a deterministic model based on the geomorphological theory of the hydrologic response with probabilistic methods. In particular, we compare the results of Monte Carlo Simulations (MCS) to the results obtained, in the same conditions, using Li's Point Estimate Method (LiM). The LiM is a probabilistic technique that approximates the continuous probability distribution function of the considered stochastic variables by means of discrete points and associated weights. This allows results to be satisfactorily reproduced with only a few evaluations of the model function. The comparison between the LiM and MCS results highlights the pros and cons of using an approximating method. LiM is less computationally demanding than MCS, but has limited applicability especially when the model
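
    To illustrate the general point-estimate idea (a few deterministic model runs approximating the output moments), the sketch below uses Rosenblueth's classical two-point estimate rather than Li's method, applied to a toy two-input model and compared with a Monte Carlo reference; all distributions are invented.

```python
import numpy as np
from itertools import product

def model(k, rain):
    """Toy hydrologic response: peak flow as a nonlinear function of two inputs."""
    return k * rain ** 1.5

means = np.array([0.8, 50.0])   # runoff coefficient, rainfall depth (illustrative)
stds  = np.array([0.1, 10.0])

# Rosenblueth two-point estimate: evaluate the model at mu +/- sigma for each input
# (2**n points with equal weights for uncorrelated, zero-skew inputs).
points = [model(*(means + np.array(signs) * stds)) for signs in product([-1, 1], repeat=2)]
pe_mean = np.mean(points)
pe_std = np.sqrt(np.mean(np.square(points)) - pe_mean ** 2)

# Brute-force Monte Carlo reference with the same (independent normal) inputs.
rng = np.random.default_rng(4)
samples = model(rng.normal(means[0], stds[0], 100_000),
                np.clip(rng.normal(means[1], stds[1], 100_000), 0, None))
print("point estimate :", pe_mean, pe_std)
print("Monte Carlo    :", samples.mean(), samples.std())
```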

  17. Crevice corrosion and pitting of high-level waste containers: a first step towards the integration of deterministic and probabilistic models

    SciTech Connect

    Farmer, J. C., LLNL

    1997-07-01

    An integrated predictive model is being developed to account for the effects of localized environmental conditions in crevices on pit initiation and propagation. A deterministic calculation is used to estimate the accumulation of hydrogen ions in the crevice solution due to equilibrium hydrolysis reactions of dissolved metal. Pit initiation and growth within the crevice is dealt with by either a stochastic probability model, or an equivalent deterministic model. While the strategy presented here is very promising, the integrated model is not yet ready for accurate quantitative predictions. Empirical expressions for the rate of penetration based upon experimental crevice corrosion data should be used in the interim period, until the integrated model can be refined. Both approaches are discussed.

  18. A Probabilistic Approach to Model Update

    NASA Technical Reports Server (NTRS)

    Horta, Lucas G.; Reaves, Mercedes C.; Voracek, David F.

    2001-01-01

    Finite element models are often developed for load validation, structural certification, response predictions, and to study alternate design concepts. On rare occasions, models developed with a nominal set of parameters agree with experimental data without the need to update parameter values. Today, model updating is generally heuristic and often performed by a skilled analyst with in-depth understanding of the model assumptions. Parameter uncertainties play a key role in understanding the model update problem and therefore probabilistic analysis tools, developed for reliability and risk analysis, may be used to incorporate uncertainty in the analysis. In this work, probability analysis (PA) tools are used to aid the parameter update task using experimental data and some basic knowledge of potential error sources. Discussed here is the first application of PA tools to update parameters of a finite element model for a composite wing structure. Static deflection data at six locations are used to update five parameters. It is shown that while prediction of individual response values may not be matched identically, the system response is significantly improved with moderate changes in parameter values.

  19. EFFECTS OF CORRELATED PROBABILISTIC EXPOSURE MODEL INPUTS ON SIMULATED RESULTS

    EPA Science Inventory

    In recent years, more probabilistic models have been developed to quantify aggregate human exposures to environmental pollutants. The impact of correlation among inputs in these models is an important issue, which has not been resolved. Obtaining correlated data and implementi...

  20. An integrated GIS-based interval-probabilistic programming model for land-use planning management under uncertainty--a case study at Suzhou, China.

    PubMed

    Lu, Shasha; Zhou, Min; Guan, Xingliang; Tao, Lizao

    2015-03-01

    A large number of mathematical models have been developed for supporting optimization of land-use allocation; however, few of them simultaneously consider land suitability (e.g., physical features and spatial information) and various uncertainties existing in many factors (e.g., land availabilities, land demands, land-use patterns, and ecological requirements). This paper incorporates geographic information system (GIS) technology into interval-probabilistic programming (IPP) for land-use planning management (IPP-LUPM). GIS is utilized to assemble data for the aggregated land-use alternatives, and IPP is developed for tackling uncertainties presented as discrete intervals and probability distribution. Based on GIS, the suitability maps of different land users are provided by the outcomes of land suitability assessment and spatial analysis. The maximum area of every type of land use obtained from the suitability maps, as well as various objectives/constraints (i.e., land supply, land demand of socioeconomic development, future development strategies, and environmental capacity), is used as input data for the optimization of land-use areas with IPP-LUPM model. The proposed model not only considers the outcomes of land suitability evaluation (i.e., topography, ground conditions, hydrology, and spatial location) but also involves economic factors, food security, and eco-environmental constraints, which can effectively reflect various interrelations among different aspects in a land-use planning management system. The case study results at Suzhou, China, demonstrate that the model can help to examine the reliability of satisfying (or risk of violating) system constraints under uncertainty. Moreover, it may identify the quantitative relationship between land suitability and system benefits. Willingness to arrange the land areas based on the condition of highly suitable land will not only reduce the potential conflicts on the environmental system but also lead to a lower

  1. Probabilistic model better defines development well risks

    SciTech Connect

    Connolly, M.R.

    1996-10-14

    Probabilistic techniques to compare and rank projects, such as the drilling of development wells, often are more representative than decision tree or deterministic approaches. As opposed to traditional deterministic methods, probabilistic analysis gives decision-makers ranges of outcomes with associated probabilities of occurrence. This article analyzes the drilling of a hypothetical development well with actual field data (such as stabilized initial rates, production declines, and gas/oil ratios) to calculate probabilistic reserves and production flow streams. Analog operating data were included to build distributions for capital and operating costs. Economics from the Monte Carlo simulation include probabilistic production flow streams and cost distributions. Results include single parameter distributions (reserves, net present value, and profitability index) and time function distributions (annual production and net cash flow).
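
    A compact sketch of the kind of Monte Carlo economics described above: input distributions for initial rate, decline, price and capital cost are sampled, a decline-curve production forecast is priced, and the resulting NPV distribution is summarised. All distributions are illustrative placeholders, not the article's field data.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 20_000

# Illustrative input distributions (not field data): initial rate, decline, price, costs.
qi      = rng.lognormal(mean=np.log(500), sigma=0.3, size=n)   # initial rate, bbl/d
decline = rng.uniform(0.15, 0.35, n)                           # annual exponential decline
price   = rng.normal(60, 8, n)                                 # $/bbl
capex   = rng.normal(3.0e6, 0.4e6, n)                          # drilling cost, $
opex    = 12.0                                                 # $/bbl operating cost
disc    = 0.10                                                 # discount rate

years = np.arange(1, 11)
# annual production from exponential decline, bbl/yr
prod = qi[:, None] * 365 * np.exp(-decline[:, None] * (years - 1))
cash = (price[:, None] - opex) * prod
npv = (cash / (1 + disc) ** years).sum(axis=1) - capex

print("P10/P50/P90 NPV:", np.quantile(npv, [0.10, 0.50, 0.90]))
print("probability NPV > 0:", (npv > 0).mean())
```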

  2. Probabilistic constitutive relationships for cyclic material strength models

    NASA Technical Reports Server (NTRS)

    Boyce, L.; Chamis, C. C.

    1988-01-01

    A methodology is developed that provides a probabilistic treatment for the lifetime of structural components of aerospace propulsion systems subjected to fatigue. Material strength degradation models, based on primitive variables, include both a fatigue strength reduction model and a fatigue crack growth model. Probabilistic analysis is based on simulation, and both maximum entropy and maximum penalized likelihood methods are used for the generation of probability density functions. The resulting constitutive relationships are included in several computer programs.

  3. Probabilistic Modeling of Imaging, Genetics and Diagnosis.

    PubMed

    Batmanghelich, Nematollah K; Dalca, Adrian; Quon, Gerald; Sabuncu, Mert; Golland, Polina

    2016-07-01

    We propose a unified Bayesian framework for detecting genetic variants associated with disease by exploiting image-based features as an intermediate phenotype. The use of imaging data for examining genetic associations promises new directions of analysis, but currently the most widely used methods make sub-optimal use of the richness that these data types can offer. Currently, image features are most commonly selected based on their relevance to the disease phenotype. Then, in a separate step, a set of genetic variants is identified to explain the selected features. In contrast, our method performs these tasks simultaneously in order to jointly exploit information in both data types. The analysis yields probabilistic measures of clinical relevance for both imaging and genetic markers. We derive an efficient approximate inference algorithm that handles the high dimensionality of image and genetic data. We evaluate the algorithm on synthetic data and demonstrate that it outperforms traditional models. We also illustrate our method on Alzheimer's Disease Neuroimaging Initiative data. PMID:26886973

  4. A PROBABILISTIC MODELING FRAMEWORK FOR PREDICTING POPULATION EXPOSURES TO BENZENE

    EPA Science Inventory

    The US Environmental Protection Agency (EPA) is modifying their probabilistic Stochastic Human Exposure Dose Simulation (SHEDS) model to assess aggregate exposures to air toxics. Air toxics include urban Hazardous Air Pollutants (HAPS) such as benzene from mobile sources, part...

  5. Integrated Campaign Probabilistic Cost, Schedule, Performance, and Value for Program Office Support

    NASA Technical Reports Server (NTRS)

    Cornelius, David; Sasamoto, Washito; Daugherty, Kevin; Deacon, Shaun

    2012-01-01

    This paper describes an integrated assessment tool developed at NASA Langley Research Center that incorporates probabilistic analysis of life cycle cost, schedule, launch performance, on-orbit performance, and value across a series of planned space-based missions, or campaign. Originally designed as an aid in planning the execution of missions to accomplish the National Research Council 2007 Earth Science Decadal Survey, it utilizes Monte Carlo simulation of a series of space missions for assessment of resource requirements and expected return on investment. Interactions between simulated missions are incorporated, such as competition for launch site manifest, to capture unexpected and non-linear system behaviors. A novel value model is utilized to provide an assessment of the probabilistic return on investment. A demonstration case is discussed to illustrate the tool utility.

  6. Probabilistic Fatigue Damage Prognosis Using a Surrogate Model Trained Via 3D Finite Element Analysis

    NASA Technical Reports Server (NTRS)

    Leser, Patrick E.; Hochhalter, Jacob D.; Newman, John A.; Leser, William P.; Warner, James E.; Wawrzynek, Paul A.; Yuan, Fuh-Gwo

    2015-01-01

    Utilizing inverse uncertainty quantification techniques, structural health monitoring can be integrated with damage progression models to form probabilistic predictions of a structure's remaining useful life. However, damage evolution in realistic structures is physically complex. Accurately representing this behavior requires high-fidelity models which are typically computationally prohibitive. In the present work, a high-fidelity finite element model is represented by a surrogate model, reducing computation times. The new approach is used with damage diagnosis data to form a probabilistic prediction of remaining useful life for a test specimen under mixed-mode conditions.
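
    A minimal sketch of the surrogate-plus-Monte-Carlo workflow: an inexpensive stand-in (a closed-form Paris-law integration) plays the role of the high-fidelity finite element model, a Gaussian-process surrogate is trained on a few of its runs, and remaining-useful-life statistics are then obtained by cheap sampling of the surrogate. The surrogate type, material constants and input distributions are assumptions, not the authors' choices.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

def cycles_to_failure(a0, dsigma, C=1e-11, m=3.0, ac=0.02):
    """Stand-in for an expensive high-fidelity model: Paris-law cycles from a0 to ac (Y = 1)."""
    geo = C * (dsigma * np.sqrt(np.pi)) ** m
    return 2.0 * (a0 ** -0.5 - ac ** -0.5) / geo

# Train the surrogate on a small design of "expensive" runs.
rng = np.random.default_rng(6)
a0_train = rng.uniform(0.5e-3, 3e-3, 40)       # initial crack size (m)
ds_train = rng.uniform(80.0, 160.0, 40)        # stress range (MPa)
X_train = np.column_stack([a0_train, ds_train])
y_train = np.log(cycles_to_failure(a0_train, ds_train))   # log-cycles varies more smoothly

gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(length_scale=[1e-3, 20.0]),
                              normalize_y=True).fit(X_train, y_train)

# Cheap Monte Carlo on the surrogate: uncertain initial flaw size and load amplitude.
a0_mc = rng.lognormal(np.log(1e-3), 0.3, 5000).clip(0.5e-3, 3e-3)
ds_mc = rng.normal(120.0, 15.0, 5000).clip(80.0, 160.0)
rul = np.exp(gp.predict(np.column_stack([a0_mc, ds_mc])))
print("median RUL (cycles):", np.median(rul))
print("5th percentile RUL :", np.quantile(rul, 0.05))
```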

  7. Verification of recursive probabilistic integration (RPI) method for fatigue life management using non-destructive inspections

    NASA Astrophysics Data System (ADS)

    Chen, Tzikang J.; Shiao, Michael

    2016-04-01

    This paper verified a generic and efficient assessment concept for probabilistic fatigue life management. The concept is developed based on an integration of damage tolerance methodology, simulation methods [1, 2], and a probabilistic algorithm RPI (recursive probability integration) [3-9] considering maintenance for damage tolerance and risk-based fatigue life management. RPI is an efficient semi-analytical probabilistic method for risk assessment subjected to various uncertainties such as the variability in material properties including crack growth rate, initial flaw size, repair quality, random process modeling of flight loads for failure analysis, and inspection reliability represented by probability of detection (POD). In addition, unlike traditional Monte Carlo simulations (MCS), which require a rerun of the MCS when the maintenance plan is changed, RPI can repeatedly use a small set of baseline random crack growth histories, excluding maintenance related parameters, from a single MCS for various maintenance plans. In order to fully appreciate the RPI method, a verification procedure was performed. In this study, MC simulations on the order of several hundred billion runs were conducted for various flight conditions, material properties, inspection schedules, POD and repair/replacement strategies. Since MC simulation is time-consuming, the simulations were conducted in parallel on DoD High Performance Computers (HPC) using a specialized random number generator for parallel computing. The study has shown that the RPI method is several orders of magnitude more efficient than traditional Monte Carlo simulations.
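
    For orientation, the sketch below shows the kind of brute-force Monte Carlo baseline that RPI is designed to avoid re-running: crack growth histories are simulated, periodic inspections detect cracks according to a probability-of-detection curve, detected cracks are repaired, and the probability of failure over the service period is estimated. All curves and numbers are illustrative, not the paper's inputs.

```python
import numpy as np

rng = np.random.default_rng(7)

def pod(a, a50=2e-3, beta=4.0):
    """Illustrative log-logistic probability of detection vs crack size a (m)."""
    return 1.0 / (1.0 + (a50 / a) ** beta)

n, n_years, a_crit = 50_000, 20, 0.02
growth = rng.lognormal(np.log(1.3), 0.15, n)       # uncertain annual crack growth factor
a = rng.lognormal(np.log(2e-4), 0.4, n)            # uncertain initial flaw size (m)
failed = np.zeros(n, dtype=bool)

for year in range(1, n_years + 1):
    a = np.where(failed, a, a * growth)            # grow cracks in surviving components
    failed |= a >= a_crit
    if year % 5 == 0:                              # inspection every 5 years
        detected = (~failed) & (rng.random(n) < pod(a))
        a = np.where(detected, 2e-4, a)            # repair resets the flaw size

print("probability of failure within 20 years:", failed.mean())
```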

  8. Probabilistic delay differential equation modeling of event-related potentials.

    PubMed

    Ostwald, Dirk; Starke, Ludger

    2016-08-01

    "Dynamic causal models" (DCMs) are a promising approach in the analysis of functional neuroimaging data due to their biophysical interpretability and their consolidation of functional-segregative and functional-integrative propositions. In this theoretical note we are concerned with the DCM framework for electroencephalographically recorded event-related potentials (ERP-DCM). Intuitively, ERP-DCM combines deterministic dynamical neural mass models with dipole-based EEG forward models to describe the event-related scalp potential time-series over the entire electrode space. Since its inception, ERP-DCM has been successfully employed to capture the neural underpinnings of a wide range of neurocognitive phenomena. However, in spite of its empirical popularity, the technical literature on ERP-DCM remains somewhat patchy. A number of previous communications have detailed certain aspects of the approach, but no unified and coherent documentation exists. With this technical note, we aim to close this gap and to increase the technical accessibility of ERP-DCM. Specifically, this note makes the following novel contributions: firstly, we provide a unified and coherent review of the mathematical machinery of the latent and forward models constituting ERP-DCM by formulating the approach as a probabilistic latent delay differential equation model. Secondly, we emphasize the probabilistic nature of the model and its variational Bayesian inversion scheme by explicitly deriving the variational free energy function in terms of both the likelihood expectation and variance parameters. Thirdly, we detail and validate the estimation of the model with a special focus on the explicit form of the variational free energy function and introduce a conventional nonlinear optimization scheme for its maximization. Finally, we identify and discuss a number of computational issues which may be addressed in the future development of the approach. PMID:27114057

  9. Application of a stochastic snowmelt model for probabilistic decisionmaking

    NASA Technical Reports Server (NTRS)

    Mccuen, R. H.

    1983-01-01

    A stochastic form of the snowmelt runoff model that can be used for probabilistic decision-making was developed. The use of probabilistic streamflow predictions instead of single valued deterministic predictions leads to greater accuracy in decisions. While the accuracy of the output function is important in decisionmaking, it is also important to understand the relative importance of the coefficients. Therefore, a sensitivity analysis was made for each of the coefficients.

  10. A probabilistic model for snow avalanche occurrence

    NASA Astrophysics Data System (ADS)

    Perona, P.; Miescher, A.; Porporato, A.

    2009-04-01

    Avalanche hazard forecasting is an important issue in relation to the protection of urbanized environments, ski resorts and ski-touring alpinists. A critical point is to predict the conditions that trigger the snow mass instability determining the onset and the size of avalanches. On steep terrain the risk of avalanches is known to be related to preceding consistent snowfall events and to subsequent changes in the local climatic conditions. Regression analysis has shown that avalanche occurrence indeed correlates with the amount of snow fallen in three consecutive snowing days and with the state of the settled snow at the ground. Moreover, since different types of avalanches may occur as a result of the interactions of different factors, the process of snow avalanche formation is inherently complex and has some degree of unpredictability. For this reason, although several models assess the risk of avalanche by accounting for all the involved processes in great detail, a high margin of uncertainty invariably remains. In this work, we explicitly describe such unpredictable behaviour with an intrinsic noise affecting the processes leading to snow instability. This sets the basis for a minimalist stochastic model, which allows us to investigate the avalanche dynamics and its statistical properties. We employ a continuous time process with stochastic jumps (snowfalls), deterministic decay (snowmelt and compaction) and state dependent avalanche occurrence (renewals) as a minimalist model for the determination of avalanche size and related intertime occurrence. The physics leading to avalanches is simplified to the extent where only meteorological data and terrain data are necessary to estimate avalanche danger. We explore the analytical formulation of the process and the properties of the probability density function of the avalanche process variables. We also discuss the probabilistic link between avalanche size and the preceding snowfall event and
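
    A minimal simulation sketch of such a jump-decay process with state-dependent renewals: snow depth decays exponentially, grows by random snowfall jumps, and an avalanche is triggered here by a simple threshold rule (the paper's occurrence mechanism may be probabilistic in the state rather than a hard threshold). All parameter values are invented.

```python
import numpy as np

rng = np.random.default_rng(8)

dt, days = 1.0, 10_000
k = 0.05                      # snowpack decay rate (melt/compaction), 1/day
lam, mean_fall = 0.2, 0.15    # snowfall frequency (1/day) and mean depth increment (m)
threshold = 0.5               # illustrative instability threshold on snow depth (m)

depth, sizes, intertimes, last = 0.0, [], [], 0.0
for step in range(days):
    depth *= np.exp(-k * dt)                       # deterministic decay
    if rng.random() < lam * dt:                    # stochastic snowfall jump
        depth += rng.exponential(mean_fall)
    if depth > threshold:                          # state-dependent avalanche (renewal)
        sizes.append(depth)
        intertimes.append(step * dt - last)
        last = step * dt
        depth = 0.0

print("avalanches:", len(sizes),
      "mean size:", np.mean(sizes),
      "mean intertime:", np.mean(intertimes))
```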

  11. An integrated approach coupling physically based models and probabilistic method to assess quantitatively landslide susceptibility at different scale: application to different geomorphological environments

    NASA Astrophysics Data System (ADS)

    Vandromme, Rosalie; Thiéry, Yannick; Sedan, Olivier; Bernardie, Séverine

    2016-04-01

    Landslide hazard assessment is the estimation of a target area where landslides of a particular type, volume, runout and intensity may occur within a given period. The first step in analyzing landslide hazard consists in assessing the spatial and temporal failure probability (when the information is available, i.e. susceptibility assessment). Two types of approach are generally recommended to achieve this goal: (i) qualitative approaches (i.e. inventory based methods and knowledge data driven methods) and (ii) quantitative approaches (i.e. data-driven methods or deterministic physically based methods). Among quantitative approaches, deterministic physically based methods (PBM) are generally used at local and/or site-specific scales (1:5,000-1:25,000 and >1:5,000, respectively). The main advantage of these methods is the calculation of the probability of failure (safety factor) under specific environmental conditions. For some models it is possible to integrate land-use and climatic change. On the other hand, major drawbacks are the large amount of reliable and detailed data required (especially material types, their thickness and the heterogeneity of geotechnical parameters over a large area) and the fact that only shallow landslides are taken into account. This is why they are often used at site-specific scales (>1:5,000). Thus, to take into account (i) material heterogeneity, (ii) spatial variation of physical parameters and (iii) different landslide types, the French Geological Survey (i.e. BRGM) has developed a physically based model (PBM) implemented in a GIS environment. This PBM couples a global hydrological model (GARDENIA®) including a transient unsaturated/saturated hydrological component with a physically based model computing the stability of slopes (ALICE®, Assessment of Landslides Induced by Climatic Events) based on the Morgenstern-Price method for any slip surface. The variability of mechanical parameters is handled by a Monte Carlo approach. The
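
    To illustrate how a Monte Carlo treatment of parameter variability translates into a failure probability, the sketch below uses the simple infinite-slope factor of safety rather than the Morgenstern-Price formulation used in ALICE®; the soil parameters and their distributions are illustrative.

```python
import numpy as np

rng = np.random.default_rng(9)
n = 100_000

beta = np.radians(32.0)          # slope angle
z = 2.0                          # depth of the potential slip surface (m)
gamma, gamma_w = 19.0, 9.81      # soil and water unit weights (kN/m3)

# Uncertain soil strength and saturation ratio (distributions are illustrative).
c = rng.lognormal(np.log(5.0), 0.4, n)            # effective cohesion (kPa)
phi = np.radians(rng.normal(30.0, 3.0, n))        # friction angle
m = rng.uniform(0.0, 1.0, n)                      # saturated fraction of the slab

# Infinite-slope factor of safety with slope-parallel seepage.
fs = (c + (gamma - m * gamma_w) * z * np.cos(beta) ** 2 * np.tan(phi)) \
     / (gamma * z * np.sin(beta) * np.cos(beta))
print("probability of failure P(FS < 1):", (fs < 1.0).mean())
```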

  12. Identification of thermal degradation using probabilistic models in reflectance spectroscopy

    NASA Astrophysics Data System (ADS)

    Criner, A. K.; Cherry, A. J.; Cooney, A. T.; Katter, T. D.; Banks, H. T.; Hu, Shuhua; Catenacci, Jared

    2015-03-01

    Different probabilistic models of molecular vibration modes are considered to model the reflectance spectra of chemical species through the dielectric constant. We discuss probability measure estimators in parametric and nonparametric models. Analyses of ceramic matrix composite samples that have been heat treated for different amounts of time are compared. We finally compare these results with the analysis of vitreous silica using nonparametric models.

  13. A toolkit for integrated deterministic and probabilistic assessment for hydrogen infrastructure.

    SciTech Connect

    Groth, Katrina M.; Tchouvelev, Andrei V.

    2014-03-01

    There has been increasing interest in using Quantitative Risk Assessment [QRA] to help improve the safety of hydrogen infrastructure and applications. Hydrogen infrastructure for transportation (e.g. fueling fuel cell vehicles) or stationary (e.g. back-up power) applications is a relatively new area for application of QRA vs. traditional industrial production and use, and as a result there are few tools designed to enable QRA for this emerging sector. There are few existing QRA tools containing models that have been developed and validated for use in small-scale hydrogen applications. However, in the past several years, there has been significant progress in developing and validating deterministic physical and engineering models for hydrogen dispersion, ignition, and flame behavior. In parallel, there has been progress in developing defensible probabilistic models for the occurrence of events such as hydrogen release and ignition. While models and data are available, using this information is difficult due to a lack of readily available tools for integrating deterministic and probabilistic components into a single analysis framework. This paper discusses the first steps in building an integrated toolkit for performing QRA on hydrogen transportation technologies and suggests directions for extending the toolkit.

  14. Integration of an Evidence Base into a Probabilistic Risk Assessment Model. The Integrated Medical Model Database: An Organized Evidence Base for Assessing In-Flight Crew Health Risk and System Design

    NASA Technical Reports Server (NTRS)

    Saile, Lynn; Lopez, Vilma; Bickham, Grandin; FreiredeCarvalho, Mary; Kerstman, Eric; Byrne, Vicky; Butler, Douglas; Myers, Jerry; Walton, Marlei

    2011-01-01

    This slide presentation reviews the Integrated Medical Model (IMM) database, which is an organized evidence base for assessing in-flight crew health risk. The database is a relational database accessible to many people. The database quantifies the model inputs by ranking the data with a Level of Evidence (LOE), where the highest value of the data is level one, and a Quality of Evidence (QOE) score that provides an assessment of the evidence base for each medical condition. The IMM evidence base has already been able to provide invaluable information for designers, and for other uses.

  15. Bayesian non-parametrics and the probabilistic approach to modelling

    PubMed Central

    Ghahramani, Zoubin

    2013-01-01

    Modelling is fundamental to many fields of science and engineering. A model can be thought of as a representation of possible data one could predict from a system. The probabilistic approach to modelling uses probability theory to express all aspects of uncertainty in the model. The probabilistic approach is synonymous with Bayesian modelling, which simply uses the rules of probability theory in order to make predictions, compare alternative models, and learn model parameters and structure from data. This simple and elegant framework is most powerful when coupled with flexible probabilistic models. Flexibility is achieved through the use of Bayesian non-parametrics. This article provides an overview of probabilistic modelling and an accessible survey of some of the main tools in Bayesian non-parametrics. The survey covers the use of Bayesian non-parametrics for modelling unknown functions, density estimation, clustering, time-series modelling, and representing sparsity, hierarchies, and covariance structure. More specifically, it gives brief non-technical overviews of Gaussian processes, Dirichlet processes, infinite hidden Markov models, Indian buffet processes, Kingman’s coalescent, Dirichlet diffusion trees and Wishart processes. PMID:23277609

  16. A Probabilistic Model of Phonological Relationships from Contrast to Allophony

    ERIC Educational Resources Information Center

    Hall, Kathleen Currie

    2009-01-01

    This dissertation proposes a model of phonological relationships, the Probabilistic Phonological Relationship Model (PPRM), that quantifies how predictably distributed two sounds in a relationship are. It builds on a core premise of traditional phonological analysis, that the ability to define phonological relationships such as contrast and…

  17. Exploring Term Dependences in Probabilistic Information Retrieval Model.

    ERIC Educational Resources Information Center

    Cho, Bong-Hyun; Lee, Changki; Lee, Gary Geunbae

    2003-01-01

    Describes a theoretic process to apply Bahadur-Lazarsfeld expansion (BLE) to general probabilistic models and the state-of-the-art 2-Poisson model. Through experiments on two standard document collections, one in Korean and one in English, it is demonstrated that incorporation of term dependences using BLE significantly contributes to performance…

  18. What do we gain with Probabilistic Flood Loss Models?

    NASA Astrophysics Data System (ADS)

    Schroeter, K.; Kreibich, H.; Vogel, K.; Merz, B.; Lüdtke, S.

    2015-12-01

    The reliability of flood loss models is a prerequisite for their practical usefulness. Oftentimes, traditional uni-variate damage models such as depth-damage curves fail to reproduce the variability of observed flood damage. Innovative multi-variate probabilistic modelling approaches are promising to capture and quantify the uncertainty involved and thus to improve the basis for decision making. In this study we compare the predictive capability of two probabilistic modelling approaches, namely Bagging Decision Trees and Bayesian Networks, and traditional stage-damage functions which are cast in a probabilistic framework. For model evaluation we use empirical damage data which are available from computer aided telephone interviews that were compiled after the floods in 2002, 2005, 2006 and 2013 in the Elbe and Danube catchments in Germany. We carry out a split sample test by sub-setting the damage records. One sub-set is used to derive the models and the remaining records are used to evaluate the predictive performance of the model. Further, we stratify the sample according to catchments, which allows studying model performance in a spatial transfer context. Flood damage estimation is carried out on the scale of the individual buildings in terms of relative damage. The predictive performance of the models is assessed in terms of systematic deviations (mean bias), precision (mean absolute error) as well as in terms of reliability, which is represented by the proportion of observations that fall within the 95-quantile and 5-quantile predictive interval. The reliability of the probabilistic predictions within validation runs decreases only slightly and achieves a very good coverage of observations within the predictive interval. Probabilistic models provide quantitative information about prediction uncertainty, which is crucial to assess the reliability of model predictions and improves the usefulness of model results.

  19. Modelling default and likelihood reasoning as probabilistic reasoning

    NASA Technical Reports Server (NTRS)

    Buntine, Wray

    1990-01-01

    A probabilistic analysis of plausible reasoning about defaults and about likelihood is presented. Likely and by default are in fact treated as duals in the same sense as possibility and necessity. To model these four forms probabilistically, a qualitative default probabilistic (QDP) logic and its quantitative counterpart DP are derived that allow qualitative and corresponding quantitative reasoning. Consistency and consequence results for subsets of the logics are given that require at most a quadratic number of satisfiability tests in the underlying propositional logic. The quantitative logic shows how to track the propagation error inherent in these reasoning forms. The methodology and sound framework of the system highlights their approximate nature, the dualities, and the need for complementary reasoning about relevance.

  20. Probabilistic Usage of the Multi-Factor Interaction Model

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2008-01-01

    A Multi-Factor Interaction Model (MFIM) is used to predict the insulating foam mass expulsion during the ascent of a space vehicle. The exponents in the MFIM are evaluated by an available approach which consists of least squares and an optimization algorithm. These results were subsequently used to probabilistically evaluate the effects of the uncertainties in each participating factor on the mass expulsion. The probabilistic results show that the surface temperature dominates at high probabilities and the pressure which causes the mass expulsion at low probabil

  1. Probabilistic predictive modelling of carbon nanocomposites for medical implants design.

    PubMed

    Chua, Matthew; Chui, Chee-Kong

    2015-04-01

    Modelling of the mechanical properties of carbon nanocomposites based on input variables like the percentage weight of Carbon Nanotube (CNT) inclusions is important for the design of medical implants and other structural scaffolds. Current constitutive models for the mechanical properties of nanocomposites may not predict well due to differences in conditions, fabrication techniques and inconsistencies in reagent properties used across industries and laboratories. Furthermore, the mechanical properties of the designed products are not deterministic, but exist as a probabilistic range. A predictive model based on a modified probabilistic surface response algorithm is proposed in this paper to address this issue. Tensile testing of three groups of carbon nanocomposite samples with different CNT weight fractions displays scattered stress-strain curves, with the instantaneous stresses assumed to vary according to a normal distribution at a specific strain. From the probabilistic density function of the experimental data, a two-factor Central Composite Design (CCD) experimental matrix based on strain and CNT weight fraction inputs with their corresponding stress distributions was established. Monte Carlo simulation was carried out on this design matrix to generate a predictive probabilistic polynomial equation. The equation and method were subsequently validated with more tensile experiments and Finite Element (FE) studies. The method was subsequently demonstrated in the design of an artificial tracheal implant. Our algorithm provides an effective way to accurately model the mechanical properties of implants of various compositions based on experimental data of samples. PMID:25658876
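
    A compact sketch of the workflow described above: a two-factor central composite design, a quadratic response surface fitted by least squares to the design-point stresses, and Monte Carlo sampling of the fitted surface to obtain a probabilistic stress prediction. The response function, scatter and design details are stand-ins, not the paper's experimental data.

```python
import numpy as np

# Two-factor central composite design in coded units (factorial, axial, centre points).
alpha = np.sqrt(2)
ccd = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1],
                [-alpha, 0], [alpha, 0], [0, -alpha], [0, alpha], [0, 0]])

# Stand-in for the measured mean stress at each design point (strain, CNT wt%), MPa.
def mean_stress(x):
    strain, cnt = x[:, 0], x[:, 1]
    return 40 + 12 * strain + 6 * cnt + 3 * strain * cnt - 2 * strain ** 2
sigma_stress = 2.5          # assumed normal scatter of instantaneous stress, MPa

# Fit a quadratic response surface to the design-point means by least squares.
def quad_terms(x):
    s, c = x[:, 0], x[:, 1]
    return np.column_stack([np.ones(len(x)), s, c, s * c, s ** 2, c ** 2])
coef, *_ = np.linalg.lstsq(quad_terms(ccd), mean_stress(ccd), rcond=None)

# Monte Carlo on the fitted surface: stress at a new (strain, CNT) point is probabilistic.
rng = np.random.default_rng(11)
x_new = np.array([[0.5, 0.2]])
draws = rng.normal(quad_terms(x_new) @ coef, sigma_stress, 10_000)
print("predicted stress interval (5%-95%):", np.quantile(draws, [0.05, 0.95]))
```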

  2. Data-directed RNA secondary structure prediction using probabilistic modeling.

    PubMed

    Deng, Fei; Ledda, Mirko; Vaziri, Sana; Aviran, Sharon

    2016-08-01

    Structure dictates the function of many RNAs, but secondary RNA structure analysis is either labor intensive and costly or relies on computational predictions that are often inaccurate. These limitations are alleviated by integration of structure probing data into prediction algorithms. However, existing algorithms are optimized for a specific type of probing data. Recently, new chemistries combined with advances in sequencing have facilitated structure probing at unprecedented scale and sensitivity. These novel technologies and anticipated wealth of data highlight a need for algorithms that readily accommodate more complex and diverse input sources. We implemented and investigated a recently outlined probabilistic framework for RNA secondary structure prediction and extended it to accommodate further refinement of structural information. This framework utilizes direct likelihood-based calculations of pseudo-energy terms per considered structural context and can readily accommodate diverse data types and complex data dependencies. We use real data in conjunction with simulations to evaluate performances of several implementations and to show that proper integration of structural contexts can lead to improvements. Our tests also reveal discrepancies between real data and simulations, which we show can be alleviated by refined modeling. We then propose statistical preprocessing approaches to standardize data interpretation and integration into such a generic framework. We further systematically quantify the information content of data subsets, demonstrating that high reactivities are major drivers of SHAPE-directed predictions and that better understanding of less informative reactivities is key to further improvements. Finally, we provide evidence for the adaptive capability of our framework using mock probe simulations. PMID:27251549

  3. Web-tool to Support Medical Experts in Probabilistic Modelling Using Large Bayesian Networks With an Example of Rhinosinusitis.

    PubMed

    Cypko, Mario A; Hirsch, David; Koch, Lucas; Stoehr, Matthaeus; Strauss, Gero; Denecke, Kerstin

    2015-01-01

    For many complex diseases, finding the best patient-specific treatment decision is difficult for physicians due to limited mental capacity. Clinical decision support systems based on Bayesian networks (BN) can provide a probabilistic graphical model integrating all necessary aspects relevant for decision making. Such models are often created manually by clinical experts. The modeling process consists of graphical modeling, conducted by collecting information entities, and probabilistic modeling, achieved by defining the relations of information entities to their direct causes. Such expert-based probabilistic modelling with BNs is very time intensive and requires knowledge about the underlying modeling method. In this paper we introduce an intuitive web-based system that helps medical experts generate decision models based on BNs. Using the tool, no special knowledge about the underlying model or BN is necessary. We tested the tool with an example of modeling treatment decisions for Rhinosinusitis and studied its usability. PMID:26262051
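
    A toy illustration of the kind of inference a BN-based decision model supports (this is not the web tool itself; the two-node network and all probabilities below are invented):

    # Two-node Bayesian network P(Disease) -> P(Symptom | Disease),
    # with posterior inference by direct enumeration.
    p_disease = {True: 0.10, False: 0.90}
    p_symptom_given = {True: {True: 0.85, False: 0.15},
                       False: {True: 0.20, False: 0.80}}

    def posterior_disease(symptom_observed: bool) -> float:
        """P(Disease = True | Symptom = symptom_observed) via Bayes' rule."""
        joint = {d: p_disease[d] * p_symptom_given[d][symptom_observed]
                 for d in (True, False)}
        return joint[True] / sum(joint.values())

    print(f"P(disease | symptom present) = {posterior_disease(True):.3f}")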

  4. A probabilistic model of insolation for the Mojave Desert area

    NASA Technical Reports Server (NTRS)

    Hester, O. V.; Reid, M. S.

    1978-01-01

    A discussion of mathematical models of insolation characteristics suitable for use in analysis of solar energy systems is presented and shows why such models are essential for solar energy system design. A model of solar radiation for the Mojave Desert area is presented with probabilistic and deterministic components which reflect the occurrence and density of clouds and haze, and mimic their effects on both direct and indirect radiation. Multiple comparisons were made between measured total energy received per day and the corresponding simulated totals. The simulated totals were all within 11 percent of the measured total. The conclusion is that a useful probabilistic model of solar radiation for the Goldstone, California, area of the Mojave Desert has been constructed.

  5. Understanding Rasch Measurement: The Rasch Model, Additive Conjoint Measurement, and New Models of Probabilistic Measurement Theory.

    ERIC Educational Resources Information Center

    Karabatsos, George

    2001-01-01

    Describes similarities and differences between additive conjoint measurement and the Rasch model, and formalizes some new nonparametric item response models that are, in a sense, probabilistic measurement theory models. Applies these new models to published and simulated data. (SLD)

  6. Probabilistic modeling of the evolution of gene synteny within reconciled phylogenies

    PubMed Central

    2015-01-01

    Background Most models of genome evolution concern either genetic sequences, gene content or gene order. They sometimes integrate two of the three levels, but rarely the three of them. Probabilistic models of gene order evolution usually have to assume constant gene content or adopt a presence/absence coding of gene neighborhoods which is blind to complex events modifying gene content. Results We propose a probabilistic evolutionary model for gene neighborhoods, allowing genes to be inserted, duplicated or lost. It uses reconciled phylogenies, which integrate sequence and gene content evolution. We are then able to optimize parameters such as phylogeny branch lengths, or probabilistic laws depicting the diversity of susceptibility of syntenic regions to rearrangements. We reconstruct a structure for ancestral genomes by optimizing a likelihood, keeping track of all evolutionary events at the level of gene content and gene synteny. Ancestral syntenies are associated with a probability of presence. We implemented the model with the restriction that at most one gene duplication separates two gene speciations in reconciled gene trees. We reconstruct ancestral syntenies on a set of 12 drosophila genomes, and compare the evolutionary rates along the branches and along the sites. We compare with a parsimony method and find a significant number of results not supported by the posterior probability. The model is implemented in the Bio++ library. It thus benefits from and enriches the classical models and methods for molecular evolution. PMID:26452018

  7. Probabilistic model-based approach for heart beat detection.

    PubMed

    Chen, Hugh; Erol, Yusuf; Shen, Eric; Russell, Stuart

    2016-09-01

    Nowadays, hospitals are ubiquitous and integral to modern society. Patients flow in and out of a veritable whirlwind of paperwork, consultations, and potential inpatient admissions, through an abstracted system that is not without flaws. One of the biggest flaws in the medical system is perhaps an unexpected one: the patient alarm system. One longitudinal study reported an 88.8% rate of false alarms, with other studies reporting numbers of similar magnitudes. These false alarm rates lead to deleterious effects that manifest in a lower standard of care across clinics. This paper discusses a model-based probabilistic inference approach to estimate physiological variables at a detection level. We design a generative model that complies with a layman's understanding of human physiology and perform approximate Bayesian inference. One primary goal of this paper is to justify a Bayesian modeling approach to increasing robustness in a physiological domain. In order to evaluate our algorithm we look at the application of heart beat detection using four datasets provided by PhysioNet, a research resource for complex physiological signals, in the form of the PhysioNet 2014 Challenge set-p1 and set-p2, the MIT-BIH Polysomnographic Database, and the MGH/MF Waveform Database. On these data sets our algorithm performs on par with the other top six submissions to the PhysioNet 2014 challenge. The overall evaluation scores in terms of sensitivity and positive predictivity values obtained were as follows: set-p1 (99.72%), set-p2 (93.51%), MIT-BIH (99.66%), and MGH/MF (95.53%). These scores are based on the averaging of gross sensitivity, gross positive predictivity, average sensitivity, and average positive predictivity. PMID:27480267

  8. Toward a Simple Probabilistic GCM Emulator for Integrated Assessment of Climate Change Impacts

    NASA Astrophysics Data System (ADS)

    Sue Wing, I.; Tebaldi, C.; Nychka, D. W.; Winkler, J.

    2014-12-01

    Climate emulators can bridge spatial scales in integrated assessment in ways that allow us to take advantage of the evolving understanding of the impacts of climate change. The spatial scales at which climate impacts occur are much finer than those of the "damage functions" in integrated assessment models (IAMs), which incorporate reduced-form climate models to project changes in global mean temperature, and estimate aggregate damages directly from that. Advancing the state of IA modeling requires methods to generate—in a flexible and computationally efficient manner—future changes in climate variables at the geographic scales at which individual impact endpoints can be resolved. The state of the art uses outputs of global climate models (GCMs) forced by warming scenarios to drive impact calculations. However, downstream integrated assessments are perforce "locked-in" to the particular GCM x warming scenario combinations that generated the meteorological fields of interest—it is not possible to assess risk due to the absence of probabilities over warming scenarios or model uncertainty. The availability of reduced-form models which can efficiently simulate the envelope of the response of multiple GCMs to a given amount of warming provides us with the capability to create probabilistic projections of fine-scale meteorological changes conditional on global mean temperature change to drive impact calculations in ways that permit risk assessments. This presentation documents a prototype probabilistic climate emulator for use as a GCM diagnostic tool and a driver of climate change impact assessments. We use a regression-based approach to construct multi-model global patterns for changes in temperature and precipitation from the CMIP3 archive. Crucially, regression residuals are used to derive a spatial covariance function of the model- and scenario-dependent deviations from the average pattern. By sampling from this manifold we can rapidly generate many realizations of
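
    A hedged sketch of the pattern-scaling-plus-residual idea described above. The spatial pattern, residual covariance and grid are synthetic stand-ins for quantities that would be estimated from the CMIP3 regressions:

    import numpy as np

    rng = np.random.default_rng(1)
    n_grid = 100                                   # toy number of grid cells
    pattern = rng.normal(1.2, 0.3, size=n_grid)    # regression slope per cell (degC per degC global)
    # Synthetic spatial covariance of the residuals (in practice estimated from
    # regression residuals across models and scenarios).
    dist = np.abs(np.subtract.outer(np.arange(n_grid), np.arange(n_grid)))
    cov = 0.05 * np.exp(-dist / 10.0)

    def emulate(delta_T_global, n_realizations=500):
        """Local temperature-change realizations conditional on global-mean warming."""
        noise = rng.multivariate_normal(np.zeros(n_grid), cov, size=n_realizations)
        return pattern * delta_T_global + noise

    fields = emulate(2.0)
    lo, hi = np.percentile(fields[:, 0], [5, 95])
    print(f"cell 0: mean {fields[:, 0].mean():.2f}, 5-95% range [{lo:.2f}, {hi:.2f}]")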

  9. Probabilistic Graphical Model Representation in Phylogenetics

    PubMed Central

    Höhna, Sebastian; Heath, Tracy A.; Boussau, Bastien; Landis, Michael J.; Ronquist, Fredrik; Huelsenbeck, John P.

    2014-01-01

    Recent years have seen a rapid expansion of the model space explored in statistical phylogenetics, emphasizing the need for new approaches to statistical model representation and software development. Clear communication and representation of the chosen model is crucial for: (i) reproducibility of an analysis, (ii) model development, and (iii) software design. Moreover, a unified, clear and understandable framework for model representation lowers the barrier for beginners and nonspecialists to grasp complex phylogenetic models, including their assumptions and parameter/variable dependencies. Graphical modeling is a unifying framework that has gained in popularity in the statistical literature in recent years. The core idea is to break complex models into conditionally independent distributions. The strength lies in the comprehensibility, flexibility, and adaptability of this formalism, and the large body of computational work based on it. Graphical models are well-suited to teach statistical models, to facilitate communication among phylogeneticists and in the development of generic software for simulation and statistical inference. Here, we provide an introduction to graphical models for phylogeneticists and extend the standard graphical model representation to the realm of phylogenetics. We introduce a new graphical model component, tree plates, to capture the changing structure of the subgraph corresponding to a phylogenetic tree. We describe a range of phylogenetic models using the graphical model framework and introduce modules to simplify the representation of standard components in large and complex models. Phylogenetic model graphs can be readily used in simulation, maximum likelihood inference, and Bayesian inference using, for example, Metropolis–Hastings or Gibbs sampling of the posterior distribution. [Computation; graphical models; inference; modularization; statistical phylogenetics; tree plate.] PMID:24951559

  10. Probabilistic graphical model representation in phylogenetics.

    PubMed

    Höhna, Sebastian; Heath, Tracy A; Boussau, Bastien; Landis, Michael J; Ronquist, Fredrik; Huelsenbeck, John P

    2014-09-01

    Recent years have seen a rapid expansion of the model space explored in statistical phylogenetics, emphasizing the need for new approaches to statistical model representation and software development. Clear communication and representation of the chosen model is crucial for: (i) reproducibility of an analysis, (ii) model development, and (iii) software design. Moreover, a unified, clear and understandable framework for model representation lowers the barrier for beginners and nonspecialists to grasp complex phylogenetic models, including their assumptions and parameter/variable dependencies. Graphical modeling is a unifying framework that has gained in popularity in the statistical literature in recent years. The core idea is to break complex models into conditionally independent distributions. The strength lies in the comprehensibility, flexibility, and adaptability of this formalism, and the large body of computational work based on it. Graphical models are well-suited to teach statistical models, to facilitate communication among phylogeneticists and in the development of generic software for simulation and statistical inference. Here, we provide an introduction to graphical models for phylogeneticists and extend the standard graphical model representation to the realm of phylogenetics. We introduce a new graphical model component, tree plates, to capture the changing structure of the subgraph corresponding to a phylogenetic tree. We describe a range of phylogenetic models using the graphical model framework and introduce modules to simplify the representation of standard components in large and complex models. Phylogenetic model graphs can be readily used in simulation, maximum likelihood inference, and Bayesian inference using, for example, Metropolis-Hastings or Gibbs sampling of the posterior distribution. PMID:24951559

  11. Probabilistic grammatical model for helix‐helix contact site classification

    PubMed Central

    2013-01-01

    Background Hidden Markov Models power many state-of-the-art tools in the field of protein bioinformatics. While excelling in their tasks, these methods of protein analysis do not directly convey information on medium- and long-range residue-residue interactions. This requires an expressive power of at least context-free grammars. However, application of more powerful grammar formalisms to protein analysis has been surprisingly limited. Results In this work, we present a probabilistic grammatical framework for problem-specific protein languages and apply it to classification of transmembrane helix-helix pair configurations. The core of the model consists of a probabilistic context-free grammar, automatically inferred by a genetic algorithm from only a generic set of expert-based rules and positive training samples. The model was applied to produce sequence-based descriptors of four classes of transmembrane helix-helix contact site configurations. The highest performance of the classifiers reached an AUC-ROC of 0.70. The analysis of grammar parse trees revealed the ability to represent structural features of helix-helix contact sites. Conclusions We demonstrated that our probabilistic context-free framework for analysis of protein sequences outperforms the state of the art in the task of helix-helix contact site classification. However, this is achieved without necessarily modeling long-range dependencies between interacting residues. A significant feature of our approach is that grammar rules and parse trees are human-readable. Thus they could provide biologically meaningful information for molecular biologists. PMID:24350601

  12. Influential input classification in probabilistic multimedia models

    SciTech Connect

    Maddalena, Randy L.; McKone, Thomas E.; Hsieh, Dennis P.H.; Geng, Shu

    1999-05-01

    Monte Carlo analysis is a statistical simulation method that is often used to assess and quantify the outcome variance in complex environmental fate and effects models. Total outcome variance of these models is a function of (1) the uncertainty and/or variability associated with each model input and (2) the sensitivity of the model outcome to changes in the inputs. To propagate variance through a model using Monte Carlo techniques, each variable must be assigned a probability distribution. The validity of these distributions directly influences the accuracy and reliability of the model outcome. To efficiently allocate resources for constructing distributions one should first identify the most influential set of variables in the model. Although existing sensitivity and uncertainty analysis methods can provide a relative ranking of the importance of model inputs, they fail to identify the minimum set of stochastic inputs necessary to sufficiently characterize the outcome variance. In this paper, we describe and demonstrate a novel sensitivity/uncertainty analysis method for assessing the importance of each variable in a multimedia environmental fate model. Our analyses show that for a given scenario, a relatively small number of input variables influence the central tendency of the model and an even smaller set determines the shape of the outcome distribution. For each input, the level of influence depends on the scenario under consideration. This information is useful for developing site specific models and improving our understanding of the processes that have the greatest influence on the variance in outcomes from multimedia models.
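
    An illustrative sketch of one simple way to rank Monte Carlo inputs by influence (this is not the paper's method; the three-input "fate model" and its distributions are invented for illustration):

    import numpy as np
    from scipy.stats import spearmanr

    rng = np.random.default_rng(2)
    n = 5_000
    inputs = {
        "emission_rate": rng.lognormal(0.0, 0.5, n),
        "partition_coeff": rng.lognormal(1.0, 0.2, n),
        "degradation_rate": rng.uniform(0.01, 0.1, n),
    }
    # Hypothetical multimedia-style output: concentration ~ emission / (k * K).
    output = inputs["emission_rate"] / (inputs["degradation_rate"] * inputs["partition_coeff"])

    scores = {}
    for name, values in inputs.items():
        rho, _ = spearmanr(values, output)      # rank correlation as an importance measure
        scores[name] = abs(rho)

    for name, rho in sorted(scores.items(), key=lambda kv: -kv[1]):
        print(f"{name:18s} |rho| = {rho:.2f}")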

  13. Probabilistic graphic models applied to identification of diseases.

    PubMed

    Sato, Renato Cesar; Sato, Graziela Tiemy Kajita

    2015-01-01

    Decision-making is fundamental when making a diagnosis or choosing treatment. The broad dissemination of computerized systems and databases allows the systematization of part of these decisions through artificial intelligence. In this text, we present the basic use of probabilistic graphic models as tools to analyze causality in health conditions. This method has been used to make diagnoses of Alzheimer's disease, sleep apnea and heart diseases. PMID:26154555

  14. Probabilistic graphic models applied to identification of diseases

    PubMed Central

    Sato, Renato Cesar; Sato, Graziela Tiemy Kajita

    2015-01-01

    ABSTRACT Decision-making is fundamental when making a diagnosis or choosing treatment. The broad dissemination of computerized systems and databases allows the systematization of part of these decisions through artificial intelligence. In this text, we present the basic use of probabilistic graphic models as tools to analyze causality in health conditions. This method has been used to make diagnoses of Alzheimer's disease, sleep apnea and heart diseases. PMID:26154555

  15. Probabilistic Independence Networks for Hidden Markov Probability Models

    NASA Technical Reports Server (NTRS)

    Smyth, Padhraic; Heckerman, David; Jordan, Michael I.

    1996-01-01

    In this paper we explore hidden Markov models (HMMs) and related structures within the general framework of probabilistic independence networks (PINs). The paper contains a self-contained review of the basic principles of PINs. It is shown that the well-known forward-backward (F-B) and Viterbi algorithms for HMMs are special cases of more general inference algorithms for arbitrary PINs.
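
    For concreteness, a minimal forward-algorithm example for a discrete HMM, the kind of recursion that belief propagation on the equivalent PIN generalizes (the two-state model below is hypothetical):

    import numpy as np

    pi = np.array([0.6, 0.4])                 # initial state distribution
    A = np.array([[0.7, 0.3],                 # transition matrix A[i, j] = P(s_j | s_i)
                  [0.4, 0.6]])
    B = np.array([[0.9, 0.1],                 # emission matrix B[i, k] = P(o_k | s_i)
                  [0.2, 0.8]])

    def forward(observations):
        """Return P(o_1..o_T) computed with the forward recursion."""
        alpha = pi * B[:, observations[0]]
        for o in observations[1:]:
            alpha = (alpha @ A) * B[:, o]
        return alpha.sum()

    print(f"likelihood of sequence [0, 1, 1, 0]: {forward([0, 1, 1, 0]):.4f}")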

  16. Nonlinear probabilistic finite element models of laminated composite shells

    NASA Technical Reports Server (NTRS)

    Engelstad, S. P.; Reddy, J. N.

    1993-01-01

    A probabilistic finite element analysis procedure for laminated composite shells has been developed. A total Lagrangian finite element formulation, employing a degenerated 3-D laminated composite shell with the full Green-Lagrange strains and first-order shear deformable kinematics, forms the modeling foundation. The first-order second-moment technique for probabilistic finite element analysis of random fields is employed and results are presented in the form of mean and variance of the structural response. The effects of material nonlinearity are included through the use of a rate-independent anisotropic plasticity formulation from the macroscopic point of view. Both ply-level and micromechanics-level random variables can be selected, the latter by means of the Aboudi micromechanics model. A number of sample problems are solved to verify the accuracy of the procedures developed and to quantify the variability of certain material type/structure combinations. Experimental data are compared in many cases, and the Monte Carlo simulation method is used to check the probabilistic results. In general, the procedure is quite effective in modeling the mean and variance response of the linear and nonlinear behavior of laminated composite shells.
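
    A sketch of the first-order second-moment idea in isolation: propagate input means and variances through a response function via a first-order Taylor expansion. The scalar "response" below is a stand-in, not the laminated-shell finite element model:

    import numpy as np

    def response(E, t):
        """Hypothetical structural response (arbitrary units)."""
        return 1.0e6 / (E * t**3)

    means = np.array([70.0e9, 0.01])          # mean modulus [Pa], mean thickness [m]
    stds = np.array([5.0e9, 0.0005])          # input standard deviations

    # Numerical sensitivities (central finite differences) at the mean point.
    eps = means * 1e-6
    grads = np.array([
        (response(means[0] + eps[0], means[1]) - response(means[0] - eps[0], means[1])) / (2 * eps[0]),
        (response(means[0], means[1] + eps[1]) - response(means[0], means[1] - eps[1])) / (2 * eps[1]),
    ])

    mean_resp = response(*means)
    var_resp = np.sum((grads * stds) ** 2)    # independent inputs assumed
    print(f"mean response = {mean_resp:.3e}, std = {np.sqrt(var_resp):.3e}")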

  17. A probabilistic approach to aggregate induction machine modeling

    SciTech Connect

    Stankovic, A.M.; Lesieutre, B.C.

    1996-11-01

    In this paper the authors pursue probabilistic aggregate dynamical models for n identical induction machines connected to a bus, capturing the effect of different mechanical inputs to the individual machines. The authors explore model averaging and review in detail four procedures for linear models. They describe linear systems depending upon stochastic parameters, and develop a theoretical justification for a very simple and reasonably accurate averaging method. They then extend this to the nonlinear model. Finally, they use a recently introduced notion of the stochastic norm to describe a cluster of induction machines undergoing multiple simultaneous parametric variations, and obtain useful and very mildly conservative bounds on eigenstructure perturbations under multiple simultaneous parametric variations.

  18. Recent advances and applications of probabilistic topic models

    NASA Astrophysics Data System (ADS)

    Wood, Ian

    2014-12-01

    I present here an overview of recent advances in probabilistic topic modelling and related Bayesian graphical models as well as some of their more atypical applications outside of their home: text analysis. These techniques allow the modelling of high dimensional count vectors with strong correlations. With such data, simply calculating a correlation matrix is infeasible. Probabilistic topic models address this using mixtures of multinomials estimated via Bayesian inference with Dirichlet priors. The use of conjugate priors allows for efficient inference, and these techniques scale well to data sets with many millions of vectors. The first of these techniques to attract significant attention was Latent Dirichlet Allocation (LDA) [1, 2]. Numerous extensions and adaptations of LDA have been proposed: non-parametric models; assorted models incorporating authors, sentiment and other features; models regularised through the use of extra metadata or extra priors on topic structure, and many more [3]. They have become widely used in the text analysis and population genetics communities, with a number of compelling applications. These techniques are not restricted to text analysis, however, and can be applied to other types of data which can be sensibly discretised and represented as counts of labels/properties/etc. LDA and its variants have been used to find patterns in data from diverse areas of inquiry, including genetics, plant physiology, image analysis, social network analysis, remote sensing and astrophysics. Nonetheless, it is relatively recently that probabilistic topic models have found applications outside of text analysis, and to date few such applications have been considered. I suggest that there is substantial untapped potential for topic models and models inspired by or incorporating topic models to be fruitfully applied, and outline the characteristics of systems and data for which this may be the case.
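
    A hedged example of fitting LDA to a small synthetic document-term count matrix with scikit-learn (assuming scikit-learn is available; real applications use large sparse count matrices from text, genetic markers, image features, etc.):

    import numpy as np
    from sklearn.decomposition import LatentDirichletAllocation

    rng = np.random.default_rng(3)
    # 20 "documents" x 50 vocabulary terms of synthetic counts.
    counts = rng.poisson(1.0, size=(20, 50))

    lda = LatentDirichletAllocation(n_components=3, random_state=0)
    doc_topics = lda.fit_transform(counts)        # per-document topic proportions
    print("document 0 topic mixture:", np.round(doc_topics[0], 2))
    print("top terms for topic 0:", np.argsort(lda.components_[0])[::-1][:5])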

  19. Probabilistic prediction models for aggregate quarry siting

    USGS Publications Warehouse

    Robinson, G.R., Jr.; Larkins, P.M.

    2007-01-01

    Weights-of-evidence (WofE) and logistic regression techniques were used in a GIS framework to predict the spatial likelihood (prospectivity) of crushed-stone aggregate quarry development. The joint conditional probability models, based on geology, transportation network, and population density variables, were defined using quarry location and time of development data for the New England States, North Carolina, and South Carolina, USA. The Quarry Operation models describe the distribution of active aggregate quarries, independent of the date of opening. The New Quarry models describe the distribution of aggregate quarries when they open. Because of the small number of new quarries developed in the study areas during the last decade, independent New Quarry models have low parameter estimate reliability. The performance of parameter estimates derived for Quarry Operation models, defined by a larger number of active quarries in the study areas, was tested and evaluated to predict the spatial likelihood of new quarry development. Population density conditions at the time of new quarry development were used to modify the population density variable in the Quarry Operation models to apply to new quarry development sites. The Quarry Operation parameters derived for the New England study area, Carolina study area, and the combined New England and Carolina study areas were all similar in magnitude and relative strength. The Quarry Operation model parameters, using the modified population density variables, were found to be a good predictor of new quarry locations. Both the aggregate industry and the land management community can use the model approach to target areas for more detailed site evaluation for quarry location. The models can be revised easily to reflect actual or anticipated changes in transportation and population features. © International Association for Mathematical Geology 2007.

  20. A Probabilistic Model of Cross-Categorization

    ERIC Educational Resources Information Center

    Shafto, Patrick; Kemp, Charles; Mansinghka, Vikash; Tenenbaum, Joshua B.

    2011-01-01

    Most natural domains can be represented in multiple ways: we can categorize foods in terms of their nutritional content or social role, animals in terms of their taxonomic groupings or their ecological niches, and musical instruments in terms of their taxonomic categories or social uses. Previous approaches to modeling human categorization have…

  1. Probabilistic flood damage modelling at the meso-scale

    NASA Astrophysics Data System (ADS)

    Kreibich, Heidi; Botto, Anna; Schröter, Kai; Merz, Bruno

    2014-05-01

    Decisions on flood risk management and adaptation are usually based on risk analyses. Such analyses are associated with significant uncertainty, even more if changes in risk due to global change are expected. Although uncertainty analysis and probabilistic approaches have received increased attention during the last years, they are still not standard practice for flood risk assessments. Most damage models have in common that complex damaging processes are described by simple, deterministic approaches like stage-damage functions. Novel probabilistic, multi-variate flood damage models have been developed and validated on the micro-scale using a data-mining approach, namely bagging decision trees (Merz et al. 2013). In this presentation we show how the model BT-FLEMO (Bagging decision Tree based Flood Loss Estimation MOdel) can be applied on the meso-scale, namely on the basis of ATKIS land-use units. The model is applied in 19 municipalities which were affected during the 2002 flood by the River Mulde in Saxony, Germany. The application of BT-FLEMO provides a probability distribution of estimated damage to residential buildings per municipality. Validation is undertaken on the one hand via a comparison with eight other damage models including stage-damage functions as well as multi-variate models. On the other hand, the results are compared with official damage data provided by the Saxon Relief Bank (SAB). The results show that uncertainties of damage estimation remain high. Thus, the significant advantage of this probabilistic flood loss estimation model BT-FLEMO is that it inherently provides quantitative information about the uncertainty of the prediction. Reference: Merz, B.; Kreibich, H.; Lall, U. (2013): Multi-variate flood damage assessment: a tree-based data-mining approach. NHESS, 13(1), 53-64.

  2. Probabilistic Modeling of Settlement Risk at Land Disposal Facilities - 12304

    SciTech Connect

    Foye, Kevin C.; Soong, Te-Yang

    2012-07-01

    The long-term reliability of land disposal facility final cover systems - and therefore the overall waste containment - depends on the distortions imposed on these systems by differential settlement/subsidence. The evaluation of differential settlement is challenging because of the heterogeneity of the waste mass (caused by inconsistent compaction, void space distribution, debris-soil mix ratio, waste material stiffness, time-dependent primary compression of the fine-grained soil matrix, long-term creep settlement of the soil matrix and the debris, etc.) at most land disposal facilities. Deterministic approaches to long-term final cover settlement prediction are not able to capture the spatial variability in the waste mass and sub-grade properties which control differential settlement. An alternative, probabilistic solution is to use random fields to model the waste and sub-grade properties. The modeling effort informs the design, construction, operation, and maintenance of land disposal facilities. A probabilistic method to establish design criteria for waste placement and compaction is introduced using the model. Random fields are ideally suited to problems of differential settlement modeling of highly heterogeneous foundations, such as waste. Random fields model the seemingly random spatial distribution of a design parameter, such as compressibility. When used for design, the use of these models prompts the need for probabilistic design criteria. It also allows for a statistical approach to waste placement acceptance criteria. An example design evaluation was performed, illustrating the use of the probabilistic differential settlement simulation methodology to assemble a design guidance chart. The purpose of this design evaluation is to enable the designer to select optimal initial combinations of design slopes and quality control acceptance criteria that yield an acceptable proportion of post-settlement slopes meeting some design minimum. For this specific
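
    An illustrative sketch (not the paper's code) of the random-field idea: a one-dimensional Gaussian random field of compressibility along a cover profile, used to estimate the probability that differential settlement between neighboring points exceeds a limit. All parameter values are invented:

    import numpy as np

    rng = np.random.default_rng(4)
    n_pts, spacing, corr_len = 50, 2.0, 10.0      # profile points, point spacing [m], correlation length [m]
    x = np.arange(n_pts) * spacing
    cov = 0.02**2 * np.exp(-np.abs(np.subtract.outer(x, x)) / corr_len)

    def exceedance_probability(limit_strain_diff=0.03, n_sims=2_000):
        """P(max differential compression strain between neighbors > limit)."""
        fields = rng.multivariate_normal(0.10 * np.ones(n_pts), cov, size=n_sims)
        diff = np.abs(np.diff(fields, axis=1)).max(axis=1)
        return (diff > limit_strain_diff).mean()

    print(f"P(exceedance) = {exceedance_probability():.3f}")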

  3. Probabilistic modelling of sea surges in coastal urban areas

    NASA Astrophysics Data System (ADS)

    Georgiadis, Stylianos; Jomo Danielsen Sørup, Hjalte; Arnbjerg-Nielsen, Karsten; Nielsen, Bo Friis

    2016-04-01

    Urban floods are a major issue for coastal cities with severe impacts on economy, society and environment. A main cause for floods are sea surges stemming from extreme weather conditions. In the context of urban flooding, certain standards have to be met by critical infrastructures in order to protect them from floods. These standards can be so strict that no empirical data is available. For instance, protection plans for sub-surface railways against floods are established with 10,000 years return levels. Furthermore, the long technical lifetime of such infrastructures is a critical issue that should be considered, along with the associated climate change effects in this lifetime. We present a case study of Copenhagen where the metro system is being expanded at present with several stations close to the sea. The current critical sea levels for the metro have never been exceeded and Copenhagen has only been severely flooded from pluvial events in the time where measurements have been conducted. However, due to the very high return period that the metro has to be able to withstand and due to the expectations to sea-level rise due to climate change, reliable estimates of the occurrence rate and magnitude of sea surges have to be established as the current protection is expected to be insufficient at some point within the technical lifetime of the metro. The objective of this study is to probabilistically model sea level in Copenhagen as opposed to extrapolating the extreme statistics as is the practice often used. A better understanding and more realistic description of the phenomena leading to sea surges can then be given. The application of hidden Markov models to high-resolution data of sea level for different meteorological stations in and around Copenhagen is an effective tool to address uncertainty. For sea surge studies, the hidden states of the model may reflect the hydrological processes that contribute to coastal floods. Also, the states of the hidden Markov

  4. IPACS (Integrated Probabilistic Assessment of Composite Structures): Code development and applications

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Shiao, Michael C.

    1993-01-01

    A methodology and attendant computer code have been developed and are described to computationally simulate the uncertain behavior of composite structures. The uncertain behavior includes buckling loads, stress concentration factors, displacements, stress/strain etc., which are the consequences of the inherent uncertainties (scatter) in the primitive (independent random) variables (constituent, ply, laminate and structural) that describe the composite structures. The computer code, IPACS (Integrated Probabilistic Assessment of Composite Structures), can handle both composite mechanics and composite structures. Application to probabilistic composite mechanics is illustrated by its uses to evaluate the uncertainties in the major Poisson's ratio and in laminate stiffness and strength. IPACS application to probabilistic structural analysis is illustrated by its use to evaluate the uncertainties in the buckling of a composite plate, in the stress concentration factor in a composite panel and in the vertical displacement and ply stress in a composite aircraft wing segment.

  5. A probabilistic gastrointestinal tract dosimetry model

    NASA Astrophysics Data System (ADS)

    Huh, Chulhaeng

    In internal dosimetry, the tissues of the gastrointestinal (GI) tract represent, along with the hematopoietic bone marrow, one of the most radiosensitive organs of the body. Endoscopic ultrasound is a unique tool to acquire in-vivo data on GI tract wall thicknesses at the resolution needed in radiation dosimetry studies. Through their different echo texture and intensity, five layers of differing echo patterns for superficial mucosa, deep mucosa, submucosa, muscularis propria and serosa exist within the walls of organs composing the alimentary tract. Thicknesses for stomach mucosa ranged from 620 ± 150 μm to 1320 ± 80 μm (total stomach wall thicknesses from 2.56 ± 0.12 to 4.12 ± 0.11 mm). Measurements made for the rectal images revealed rectal mucosal thicknesses from 150 ± 90 μm to 670 ± 110 μm (total rectal wall thicknesses from 2.01 ± 0.06 to 3.35 ± 0.46 mm). The mucosa thus accounted for 28 ± 3% and 16 ± 6% of the total thickness of the stomach and rectal wall, respectively. Radiation transport simulations were then performed using the Monte Carlo N-Particle (MCNP) 4C transport code to calculate S values (Gy/Bq-s) for penetrating and nonpenetrating radiations such as photons, beta particles, conversion electrons and Auger electrons of selected nuclides, I-123, I-131, Tc-99m and Y-90, under two source conditions: content and mucosa sources, respectively. The results of this study demonstrate generally good agreement with published data for the stomach mucosa wall. The rectal mucosa data are consistently higher than published data for the large intestine due to different radiosensitive cell thicknesses (350 μm vs. a range spanning from 149 μm to 729 μm) and different geometry when a rectal content source is considered. Generally, the ICRP models have been designed to predict the amount of radiation dose in the human body for a "typical" or "reference" individual in a given population. The study has been performed to

  6. A simple probabilistic model of multibody interactions in proteins.

    PubMed

    Johansson, Kristoffer Enøe; Hamelryck, Thomas

    2013-08-01

    Protein structure prediction methods typically use statistical potentials, which rely on statistics derived from a database of known protein structures. In the vast majority of cases, these potentials involve pairwise distances or contacts between amino acids or atoms. Although some potentials beyond pairwise interactions have been described, the formulation of a general multibody potential is seen as intractable due to the perceived limited amount of data. In this article, we show that it is possible to formulate a probabilistic model of higher order interactions in proteins, without arbitrarily limiting the number of contacts. The success of this approach is based on replacing a naive table-based approach with a simple hierarchical model involving suitable probability distributions and conditional independence assumptions. The model captures the joint probability distribution of an amino acid and its neighbors, local structure and solvent exposure. We show that this model can be used to approximate the conditional probability distribution of an amino acid sequence given a structure using a pseudo-likelihood approach. We verify the model by decoy recognition and site-specific amino acid predictions. Our coarse-grained model is compared to state-of-the-art methods that use full atomic detail. This article illustrates how the use of simple probabilistic models can lead to new opportunities in the treatment of nonlocal interactions in knowledge-based protein structure prediction and design. PMID:23468247

  7. Probabilistic Life Cycle Cost Model for Repairable System

    NASA Astrophysics Data System (ADS)

    Nasir, Meseret; Chong, H. Y.; Osman, Sabtuni

    2015-04-01

    Traditionally, life cycle cost (LCC) has been predicted with a deterministic approach; however, this method cannot account for uncertainties in the input variables. In this paper, a probabilistic approach using an Adaptive Network-based Fuzzy Inference System (ANFIS) is proposed to estimate the LCC of repairable systems. The developed model can handle the uncertainties of input variables in the estimation of LCC. The numerical analysis shows that the acquisition and downtime costs can have a larger effect on the LCC than the repair cost. The developed model can also provide more precise quantitative information for the decision-making process.

  8. Probabilistic Cross-matching of Radio Catalogs with Geometric Models

    NASA Astrophysics Data System (ADS)

    Fan, D.; Budavári, T.

    2014-05-01

    Cross-matching radio catalogs is different from cross-matching in the optical. Radio sources can have multiple corresponding detections, the core and its lobes, which makes identification and cross-identification with other catalogs much more difficult. Traditionally, these cases have been handled manually, with researchers looking at the possible candidates; this will not be possible for the upcoming radio surveys ultimately leading to the Square Kilometer Array. We present a probabilistic method that can automatically associate radio sources by explicitly modeling their morphology. Our preliminary results, based on a simple straight-line model, appear to be on par with the manual associations.

  9. Probabilistic modeling of financial exposure to flood in France

    NASA Astrophysics Data System (ADS)

    Moncoulon, David; Quantin, Antoine; Leblois, Etienne

    2014-05-01

    CCR is a French reinsurance company which offers natural catastrophe covers with the State guarantee. Within this framework, CCR develops its own models to assess its financial exposure to floods, droughts, earthquakes and other perils, and thus the exposure of insurers and the French State. A probabilistic flood model has been developed in order to estimate the financial exposure of the Nat Cat insurance market to flood events, depending on their annual occurrence probability. This presentation is organized in two parts. The first part is dedicated to the development of a flood hazard and damage model (ARTEMIS). The model calibration and validation on historical events are then described. In the second part, the coupling of ARTEMIS with two generators of probabilistic events is achieved: a stochastic flow generator and a stochastic spatialized precipitation generator, adapted from the SAMPO model developed by IRSTEA. The analysis of the complementary nature of these two generators is proposed: the first one allows generating floods on the French hydrological station network; the second allows simulating surface water runoff and small river floods, even on ungauged rivers. Thus, the simulation of thousands of non-occurred but possible events allows us to provide for the first time an estimate of the financial exposure to flooding in France at different scales (commune, department, country) and from different points of view (hazard, vulnerability and damages).

  10. Probabilistic updating of building models using incomplete modal data

    NASA Astrophysics Data System (ADS)

    Sun, Hao; Büyüköztürk, Oral

    2016-06-01

    This paper investigates a new probabilistic strategy for Bayesian model updating using incomplete modal data. Direct mode matching between the measured and the predicted modal quantities is not required in the updating process, which is realized through model reduction. A Markov chain Monte Carlo technique with adaptive random-walk steps is proposed to draw the samples for model parameter uncertainty quantification. The iterated improved reduced system technique is employed to update the prediction error as well as to calculate the likelihood function in the sampling process. Since modal quantities are used in the model updating, modal identification is first carried out to extract the natural frequencies and mode shapes through the acceleration measurements of the structural system. The proposed algorithm is finally validated by both numerical and experimental examples: a 10-storey building with synthetic data and an 8-storey building with shaking table test data. Results illustrate that the proposed algorithm is effective and robust for parameter uncertainty quantification in probabilistic model updating of buildings.
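
    A minimal random-walk Metropolis sketch for updating a single stiffness parameter from "measured" natural frequencies. The model, data and priors are synthetic; the paper's algorithm additionally adapts the step size and works with a reduced-order model:

    import numpy as np

    rng = np.random.default_rng(5)
    true_k = 2.0
    data = np.sqrt(true_k) * np.array([1.0, 2.0, 3.0]) + rng.normal(0, 0.05, 3)  # noisy frequencies

    def log_post(k, sigma=0.05):
        """Log posterior with a flat prior on k > 0 and Gaussian measurement noise."""
        if k <= 0:
            return -np.inf
        pred = np.sqrt(k) * np.array([1.0, 2.0, 3.0])
        return -0.5 * np.sum((data - pred) ** 2) / sigma**2

    samples, k = [], 1.0
    for _ in range(20_000):
        prop = k + rng.normal(0, 0.05)                       # random-walk proposal
        if np.log(rng.uniform()) < log_post(prop) - log_post(k):
            k = prop                                         # accept
        samples.append(k)

    post = np.array(samples[5_000:])                         # discard burn-in
    print(f"posterior k: {post.mean():.3f} +/- {post.std():.3f}")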

  11. The integrated environmental control model

    SciTech Connect

    Rubin, E.S.; Berkenpas, M.B.; Kalagnanam, J.R.

    1995-11-01

    The capability to estimate the performance and cost of emission control systems is critical to a variety of planning and analysis requirements faced by utilities, regulators, researchers and analysts in the public and private sectors. The computer model described in this paper has been developed for DOE to provide an up-to-date capability for analyzing a variety of pre-combustion, combustion, and post-combustion options in an integrated framework. A unique capability allows performance and costs to be modeled probabilistically, which allows explicit characterization of uncertainties and risks.

  12. Integrated software health management for aerospace guidance, navigation, and control systems: A probabilistic reasoning approach

    NASA Astrophysics Data System (ADS)

    Mbaya, Timmy

    Embedded Aerospace Systems have to perform safety and mission critical operations in a real-time environment where timing and functional correctness are extremely important. Guidance, Navigation, and Control (GN&C) systems substantially rely on complex software interfacing with hardware in real-time; any faults in software or hardware, or their interaction could result in fatal consequences. Integrated Software Health Management (ISWHM) provides an approach for detection and diagnosis of software failures while the software is in operation. The ISWHM approach is based on probabilistic modeling of software and hardware sensors using a Bayesian network. To meet memory and timing constraints of real-time embedded execution, the Bayesian network is compiled into an Arithmetic Circuit, which is used for on-line monitoring. This type of system monitoring, using an ISWHM, provides automated reasoning capabilities that compute diagnoses in a timely manner when failures occur. This reasoning capability enables time-critical mitigating decisions and relieves the human agent from the time-consuming and arduous task of foraging through a multitude of isolated---and often contradictory---diagnosis data. For the purpose of demonstrating the relevance of ISWHM, modeling and reasoning is performed on a simple simulated aerospace system running on a real-time operating system emulator, the OSEK/Trampoline platform. Models for a small satellite and an F-16 fighter jet GN&C (Guidance, Navigation, and Control) system have been implemented. Analysis of the ISWHM is then performed by injecting faults and analyzing the ISWHM's diagnoses.

  13. Probabilistic/Fracture-Mechanics Model For Service Life

    NASA Technical Reports Server (NTRS)

    Watkins, T., Jr.; Annis, C. G., Jr.

    1991-01-01

    Computer program makes probabilistic estimates of lifetime of engine and components thereof. Developed to fill need for more accurate life-assessment technique that avoids errors in estimated lives and provides for statistical assessment of levels of risk created by engineering decisions in designing system. Implements mathematical model combining techniques of statistics, fatigue, fracture mechanics, nondestructive analysis, life-cycle cost analysis, and management of engine parts. Used to investigate effects of such engine-component life-controlling parameters as return-to-service intervals, stresses, capabilities for nondestructive evaluation, and qualities of materials.

  14. A probabilistic model to liquefaction assessment of dams

    SciTech Connect

    Simos, N.; Costantino, C.J.; Reich, M.

    1995-03-01

    In an effort to evaluate earthquake liquefaction potential of soil media, several statistical models ranging from purely empirical to mathematically sophisticated have been devised. While deterministic methods define susceptibility of a soil structure to liquefaction, for a given seismic event, in the sense that the site does or does not liquefy, probabilistic approaches incorporate statistical properties associated with both the earthquake and site characterization. In this study a stochastic model is formulated to assess the earthquake-induced liquefaction potential of soil structures in general and earth dams in particular. Such earthquakes are realizations of a random process expressed in the form of a power spectral density. Uncertainties in the soil resistance to liquefaction are also introduced with probability density functions around in-situ measurements of parameters associated with the soil strength. The aim of this study is to devise a procedure that will lead to a continuous probability of liquefaction at a given site. Monte Carlo simulations are employed for the probabilistic model. In addition a stochastic model is presented. The dynamic response of the two-phase medium is obtained with the help of the POROSLAM code and it is expressed in the form of a transfer function (Unit Response).

  15. A Probabilistic Model for Simulating Magnetoacoustic Emission Responses in Ferromagnets

    NASA Technical Reports Server (NTRS)

    Namkung, M.; Fulton, J. P.; Wincheski, B.

    1993-01-01

    Magnetoacoustic emission (MAE) is a phenomenon where acoustic noise is generated due to the motion of non-180° magnetic domain walls in a ferromagnet with non-zero magnetostrictive constants. MAE has been studied extensively for many years and has even been applied as an NDE tool for characterizing the heat treatment of high-yield low carbon steels. A complete theory which fully accounts for the magnetoacoustic response, however, has not yet emerged. The motion of the domain walls appears to be a totally random process; however, it does exhibit features of regularity which have been identified by studying phenomena such as 1/f flicker noise and self-organized criticality (SOC). In this paper, a probabilistic model incorporating the effects of SOC has been developed to help explain the MAE response. The model uses many simplifying assumptions yet yields good qualitative agreement with observed experimental results and also provides some insight into the possible underlying mechanisms responsible for MAE. We begin by providing a brief overview of magnetoacoustic emission and the experimental set-up used to obtain the MAE signal. We then describe a pseudo-probabilistic model used to predict the MAE response and give an example of the predicted result. Finally, the model is modified to account for SOC and the new predictions are shown and compared with experiment.

  16. Probabilistic assessment of agricultural droughts using graphical models

    NASA Astrophysics Data System (ADS)

    Ramadas, Meenu; Govindaraju, Rao S.

    2015-07-01

    Agricultural droughts are often characterized by soil moisture in the root zone of the soil, but crop needs are rarely factored into the analysis. Since water needs vary with crops, agricultural drought incidences in a region can be characterized better if crop responses to soil water deficits are also accounted for in the drought index. This study investigates agricultural droughts driven by plant stress due to soil moisture deficits using crop stress functions available in the literature. Crop water stress is assumed to begin at the soil moisture level corresponding to incipient stomatal closure, and reaches its maximum at the crop's wilting point. Using available location-specific crop acreage data, a weighted crop water stress function is computed. A new probabilistic agricultural drought index is then developed within a hidden Markov model (HMM) framework that provides model uncertainty in drought classification and accounts for time dependence between drought states. The proposed index allows probabilistic classification of the drought states and takes due cognizance of the stress experienced by the crop due to soil moisture deficit. The capabilities of HMM model formulations for assessing agricultural droughts are compared to those of current drought indices such as standardized precipitation evapotranspiration index (SPEI) and self-calibrating Palmer drought severity index (SC-PDSI). The HMM model identified critical drought events and several drought occurrences that are not detected by either SPEI or SC-PDSI, and shows promise as a tool for agricultural drought studies.

  17. Probabilistic climate change predictions applying Bayesian model averaging.

    PubMed

    Min, Seung-Ki; Simonis, Daniel; Hense, Andreas

    2007-08-15

    This study explores the sensitivity of probabilistic predictions of the twenty-first century surface air temperature (SAT) changes to different multi-model averaging methods using available simulations from the Intergovernmental Panel on Climate Change fourth assessment report. A way of observationally constrained prediction is provided by training multi-model simulations for the second half of the twentieth century with respect to long-term components. The Bayesian model averaging (BMA) produces weighted probability density functions (PDFs) and we compare two methods of estimating weighting factors: Bayes factor and expectation-maximization algorithm. It is shown that Bayesian-weighted PDFs for the global mean SAT changes are characterized by multi-modal structures from the middle of the twenty-first century onward, which are not clearly seen in arithmetic ensemble mean (AEM). This occurs because BMA tends to select a few high-skilled models and down-weight the others. Additionally, Bayesian results exhibit larger means and broader PDFs in the global mean predictions than the unweighted AEM. Multi-modality is more pronounced in the continental analysis using 30-year mean (2070-2099) SATs while there is only a little effect of Bayesian weighting on the 5-95% range. These results indicate that this approach to observationally constrained probabilistic predictions can be highly sensitive to the method of training, particularly for the later half of the twenty-first century, and that a more comprehensive approach combining different regions and/or variables is required. PMID:17569647
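
    A sketch of the Bayesian-model-averaging mechanics: per-model predictive densities combined with skill-based weights into a single, possibly multi-modal PDF. The weights and per-model projections below are invented placeholders, not values from the study:

    import numpy as np
    from scipy.stats import norm

    weights = np.array([0.55, 0.30, 0.10, 0.05])   # e.g., from Bayes factors or the EM algorithm
    means = np.array([2.1, 2.9, 3.6, 4.4])          # per-model warming projections (degC)
    sds = np.array([0.3, 0.35, 0.4, 0.5])

    grid = np.linspace(0, 6, 601)
    dx = grid[1] - grid[0]
    bma_pdf = sum(w * norm.pdf(grid, m, s) for w, m, s in zip(weights, means, sds))
    print(f"BMA mean warming = {np.sum(grid * bma_pdf) * dx:.2f} degC")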

  18. Probabilistic logic modeling of network reliability for hybrid network architectures

    SciTech Connect

    Wyss, G.D.; Schriner, H.K.; Gaylor, T.R.

    1996-10-01

    Sandia National Laboratories has found that the reliability and failure modes of current-generation network technologies can be effectively modeled using fault tree-based probabilistic logic modeling (PLM) techniques. We have developed fault tree models that include various hierarchical networking technologies and classes of components interconnected in a wide variety of typical and atypical configurations. In this paper we discuss the types of results that can be obtained from PLMs and why these results are of great practical value to network designers and analysts. After providing some mathematical background, we describe the "plug-and-play" fault tree analysis methodology that we have developed for modeling connectivity and the provision of network services in several current-generation network architectures. Finally, we demonstrate the flexibility of the method by modeling the reliability of a hybrid example network that contains several interconnected ethernet, FDDI, and token ring segments. 11 refs., 3 figs., 1 tab.
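
    A tiny fault-tree sketch showing how gate probabilities combine for independent basic events (the component failure probabilities are illustrative only, not values from this work):

    def p_or(*probs):
        """P(at least one event occurs) for independent events."""
        q = 1.0
        for p in probs:
            q *= (1.0 - p)
        return 1.0 - q

    def p_and(*probs):
        """P(all events occur) for independent events."""
        out = 1.0
        for p in probs:
            out *= p
        return out

    # Hypothetical link: fails if the switch fails OR both redundant cables fail.
    p_switch, p_cable = 1e-3, 5e-3
    p_link_down = p_or(p_switch, p_and(p_cable, p_cable))
    print(f"P(link unavailable) = {p_link_down:.2e}")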

  19. Integrated Medical Model Overview

    NASA Technical Reports Server (NTRS)

    Myers, J.; Boley, L.; Foy, M.; Goodenow, D.; Griffin, D.; Keenan, A.; Kerstman, E.; Melton, S.; McGuire, K.; Saile, L.; Shah, R.; Garcia, Y.; Sirmons, B.; Walton, M.; Reyes, D.

    2015-01-01

    The Integrated Medical Model (IMM) Project represents one aspect of NASA's Human Research Program (HRP) to quantitatively assess medical risks to astronauts for existing operational missions as well as missions associated with future exploration and commercial space flight ventures. The IMM takes a probabilistic approach to assessing the likelihood and specific outcomes of one hundred medical conditions within the envelope of accepted space flight standards of care over a selectable range of mission capabilities. A specially developed Integrated Medical Evidence Database (iMED) maintains evidence-based, organizational knowledge across a variety of data sources. Since becoming operational in 2011, version 3.0 of the IMM, the supporting iMED, and the expertise of the IMM project team have contributed to a wide range of decision and informational processes for the space medical and human research community. This presentation provides an overview of the IMM conceptual architecture and range of application through examples of actual space flight community questions posed to the IMM project.

  20. Efficient diagnosis of multiprocessor systems under probabilistic models

    NASA Technical Reports Server (NTRS)

    Blough, Douglas M.; Sullivan, Gregory F.; Masson, Gerald M.

    1989-01-01

    The problem of fault diagnosis in multiprocessor systems is considered under a probabilistic fault model. The focus is on minimizing the number of tests that must be conducted in order to correctly diagnose the state of every processor in the system with high probability. A diagnosis algorithm that can correctly diagnose the state of every processor with probability approaching one in a class of systems performing slightly greater than a linear number of tests is presented. A nearly matching lower bound on the number of tests required to achieve correct diagnosis in arbitrary systems is also proven. Lower and upper bounds on the number of tests required for regular systems are also presented. A class of regular systems which includes hypercubes is shown to be correctly diagnosable with high probability. In all cases, the number of tests required under this probabilistic model is shown to be significantly less than under a bounded-size fault set model. Because the number of tests that must be conducted is a measure of the diagnosis overhead, these results represent a dramatic improvement in the performance of system-level diagnosis techniques.

  1. Modeling of human artery tissue with probabilistic approach.

    PubMed

    Xiong, Linfei; Chui, Chee-Kong; Fu, Yabo; Teo, Chee-Leong; Li, Yao

    2015-04-01

    Accurate modeling of biological soft tissue properties is vital for realistic medical simulation. The mechanical response of biological soft tissue always exhibits strong variability due to its complex microstructure and different loading conditions. The inhomogeneity in human artery tissue is modeled with a computational probabilistic approach by assuming that the instantaneous stress at a specific strain varies according to a normal distribution. Material parameters of the artery tissue, which is modeled with a combined logarithmic and polynomial energy equation, are represented by a statistical function with normal distribution. The mean and standard deviation of the material parameters are determined using a genetic algorithm (GA) and the inverse mean-value first-order second-moment (IMVFOSM) method, respectively. This nondeterministic approach was verified using computer simulation based on the Monte Carlo (MC) method. The cumulative distribution function (CDF) of the MC simulation corresponds well with that of the experimental stress-strain data, and the probabilistic approach is further validated using data from other studies. By taking into account the inhomogeneous mechanical properties of human biological tissue, the proposed method is suitable for realistic virtual simulation as well as an accurate computational approach for medical device validation. PMID:25748681
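
    The Monte Carlo verification step described above can be illustrated with a short sketch. The code below is not the authors' implementation: it uses a hypothetical polynomial stress-strain law (the combined logarithmic-polynomial energy equation is not reproduced here) and assumed parameter means and standard deviations, and simply propagates normally distributed material parameters to a stress uncertainty band.

    ```python
    # Minimal sketch (not the authors' code): Monte Carlo propagation of normally
    # distributed material parameters through a hypothetical polynomial
    # stress-strain law, in the spirit of the IMVFOSM/MC approach.
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical means and standard deviations of two material parameters.
    mu_a, sd_a = 80.0, 8.0    # kPa, linear coefficient (assumed)
    mu_b, sd_b = 350.0, 40.0  # kPa, cubic stiffening coefficient (assumed)

    def stress(strain, a, b):
        """Hypothetical stress-strain law: sigma = a*eps + b*eps**3."""
        return a * strain + b * strain**3

    strain = np.linspace(0.0, 0.3, 31)
    n_samples = 10_000

    a = rng.normal(mu_a, sd_a, n_samples)
    b = rng.normal(mu_b, sd_b, n_samples)
    sigma = stress(strain[None, :], a[:, None], b[:, None])   # (samples, strains)

    mean_curve = sigma.mean(axis=0)
    p05, p95 = np.percentile(sigma, [5, 95], axis=0)
    print(f"stress at eps=0.3: mean={mean_curve[-1]:.1f} kPa, "
          f"90% band=({p05[-1]:.1f}, {p95[-1]:.1f}) kPa")
    ```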

  2. Probabilistic multi-scale modeling of pathogen dynamics in rivers

    NASA Astrophysics Data System (ADS)

    Packman, A. I.; Drummond, J. D.; Aubeneau, A. F.

    2014-12-01

    Most parameterizations of microbial dynamics and pathogen transport in surface waters rely on classic assumptions of advection-diffusion behavior in the water column and limited interactions between the water column and sediments. However, recent studies have shown that strong surface-subsurface interactions produce a wide range of transport timescales in rivers, and greatly increase the opportunity for long-term retention of pathogens in sediment beds and benthic biofilms. We present a stochastic model for pathogen dynamics, based on continuous-time random walk theory, that properly accounts for such diverse transport timescales, along with the remobilization and inactivation of pathogens in storage reservoirs. By representing pathogen dynamics probabilistically, the model framework enables diverse local-scale processes to be incorporated in system-scale models. We illustrate the application of the model to microbial dynamics in rivers based on the results of a tracer injection experiment. In-stream transport and surface-subsurface interactions are parameterized based on observations of conservative tracer transport, while E. coli retention and inactivation in sediments are parameterized based on direct local-scale experiments. The results indicate that sediments are an important reservoir of enteric organisms in rivers, and slow remobilization from sediments represents a long-term source of bacteria to streams. Current capability, potential advances, and limitations of this model framework for assessing pathogen transmission risks will be discussed. Because the transport model is probabilistic, it is amenable to incorporation into risk models, but a lack of characterization of key microbial processes in sediments and benthic biofilms hinders current application.
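
    As a rough illustration of the continuous-time random walk idea, the following particle-tracking sketch (not the published model) combines deterministic advective travel between exchange opportunities with Pareto-distributed storage times in the bed and first-order inactivation while stored; all parameter values are invented for illustration.

    ```python
    # Minimal sketch: particle-tracking analogue of a CTRW with power-law
    # (Pareto) storage times in the sediment and first-order inactivation while
    # stored. Parameter values are illustrative, not from the study.
    import numpy as np

    rng = np.random.default_rng(1)

    n_particles = 50_000
    reach_length = 1000.0      # m
    velocity = 0.3             # m/s, in-stream advection (assumed)
    step = 50.0                # m travelled between exchange opportunities (assumed)
    p_store = 0.2              # probability of entering the bed at each step (assumed)
    alpha, t_min = 1.4, 60.0   # Pareto exponent and minimum storage time, s (assumed)
    k_inact = 1e-5             # 1/s, inactivation rate while stored (assumed)

    arrival = np.zeros(n_particles)
    alive = np.ones(n_particles, dtype=bool)
    for _ in range(int(reach_length / step)):
        arrival += step / velocity                     # advective travel time
        stored = rng.random(n_particles) < p_store
        t_store = np.where(stored, t_min * rng.pareto(alpha, n_particles) + t_min, 0.0)
        arrival += t_store
        # first-order inactivation applied only during storage
        alive &= rng.random(n_particles) < np.exp(-k_inact * t_store)

    print(f"fraction surviving the reach: {alive.mean():.3f}")
    print(f"median arrival time of survivors: {np.median(arrival[alive]) / 3600:.2f} h")
    ```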

  3. Probabilistic Modeling of Aircraft Trajectories for Dynamic Separation Volumes

    NASA Technical Reports Server (NTRS)

    Lewis, Timothy A.

    2016-01-01

    With a proliferation of new and unconventional vehicles and operations expected in the future, the ab initio airspace design will require new approaches to trajectory prediction for separation assurance and other air traffic management functions. This paper presents an approach to probabilistic modeling of the trajectory of an aircraft when its intent is unknown. The approach uses a set of feature functions to constrain a maximum entropy probability distribution based on a set of observed aircraft trajectories. This model can be used to sample new aircraft trajectories to form an ensemble reflecting the variability in an aircraft's intent. The model learning process ensures that the variability in this ensemble reflects the behavior observed in the original data set. Computational examples are presented.
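
    A minimal sketch of the constrained maximum-entropy construction is given below. It is not the paper's implementation: the trajectory is reduced to a single discretized turn-rate "maneuver", the two feature functions and their target expectations are invented, and the convex dual problem is solved numerically with SciPy.

    ```python
    # Minimal sketch of a maximum-entropy distribution constrained by feature
    # expectations (illustrative features and targets, not the paper's).
    import numpy as np
    from scipy.optimize import minimize

    turn_rates = np.linspace(-3.0, 3.0, 61)                 # deg/s candidates (assumed)
    features = np.column_stack([turn_rates, turn_rates**2]) # feature functions f(x)

    # Hypothetical statistics from observed trajectories: mean turn rate ~ 0,
    # mean squared turn rate ~ 1.2 (deg/s)^2.
    target = np.array([0.0, 1.2])

    def dual_objective(lam):
        # Convex dual of the max-ent problem: log Z(lam) - lam . target
        logits = features @ lam
        return np.logaddexp.reduce(logits) - lam @ target

    res = minimize(dual_objective, x0=np.zeros(2), method="BFGS")
    logits = features @ res.x
    p = np.exp(logits - np.logaddexp.reduce(logits))

    # Sample an ensemble of maneuvers reflecting the learned variability.
    rng = np.random.default_rng(2)
    ensemble = rng.choice(turn_rates, size=5, p=p)
    print("fitted expectations:", features.T @ p)   # close to `target`
    print("sampled maneuvers:", np.round(ensemble, 2))
    ```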

  4. Coverage in Wireless Sensor Network Based on Probabilistic Sensing Model

    NASA Astrophysics Data System (ADS)

    Li, Fen; Deng, Kai; Meng, Fanzhi; Weiyan, Zhang

    One of the major problems to consider in designing a wireless sensor network is how to extend the network lifetime while providing the desired quality of service. A widely used method for achieving this is topology control. This paper studies the problem of how to ensure that the network is fully connected without the nodes' location information. A coverage control model based on a probabilistic sensing model is proposed. With random sensor deployment, the numbers of sensing and communicating nodes can be calculated from the size of the sensing region and the performance parameters of the nodes (e.g., sensing radius, communication radius, and so on). Simulation results show that the actual coverage quality provided by sensing nodes scheduled with the proposed coverage control model is higher than the coverage quality threshold.
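
    The kind of calculation the abstract describes can be sketched as a small Monte Carlo experiment. The snippet below is illustrative only: it assumes an exponential-decay sensing probability within a finite sensing radius and searches for the smallest random deployment that meets a coverage-quality threshold; none of the parameter values come from the paper.

    ```python
    # Minimal sketch: estimate coverage quality under a probabilistic sensing
    # model p(d) = exp(-alpha*d) for d <= r_s, and find the smallest node count
    # meeting a coverage threshold. All parameters are assumed.
    import numpy as np

    rng = np.random.default_rng(3)
    side = 100.0          # m, square sensing region
    r_s = 15.0            # m, sensing radius (assumed)
    alpha = 0.08          # 1/m, detection decay rate (assumed)
    threshold = 0.90      # required coverage quality

    grid = np.stack(np.meshgrid(np.linspace(0, side, 50),
                                np.linspace(0, side, 50)), axis=-1).reshape(-1, 2)

    def coverage(n_nodes):
        nodes = rng.uniform(0, side, size=(n_nodes, 2))
        d = np.linalg.norm(grid[:, None, :] - nodes[None, :, :], axis=-1)
        p_detect = np.where(d <= r_s, np.exp(-alpha * d), 0.0)
        p_miss_all = np.prod(1.0 - p_detect, axis=1)   # missed by every node
        return np.mean(1.0 - p_miss_all)               # mean detection probability

    for n in range(20, 301, 20):
        c = coverage(n)
        if c >= threshold:
            print(f"{n} nodes give estimated coverage quality {c:.3f} >= {threshold}")
            break
        print(f"{n} nodes: coverage quality {c:.3f}")
    ```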

  5. scoringRules - A software package for probabilistic model evaluation

    NASA Astrophysics Data System (ADS)

    Lerch, Sebastian; Jordan, Alexander; Krüger, Fabian

    2016-04-01

    Models in the geosciences are generally surrounded by uncertainty, and being able to quantify this uncertainty is key to good decision making. Accordingly, probabilistic forecasts in the form of predictive distributions have become popular over the last decades. With the proliferation of probabilistic models arises the need for decision-theoretically principled tools to evaluate the appropriateness of models and forecasts in a generalized way. Various scoring rules have been developed over the past decades to address this demand. Proper scoring rules are functions S(F,y) that evaluate the accuracy of a forecast distribution F, given that an outcome y was observed. As such, they allow alternative models to be compared, a crucial ability given the variety of theories, data sources and statistical specifications that are available in many situations. This poster presents the software package scoringRules for the statistical programming language R, which contains functions to compute popular scoring rules such as the continuous ranked probability score for a variety of distributions F that come up in applied work. Two main classes are parametric distributions such as normal, t, or gamma distributions, and distributions that are not known analytically but are described indirectly through a sample of simulation draws; for example, Bayesian forecasts produced via Markov chain Monte Carlo take this form. The scoringRules package thereby provides a framework for generalized model evaluation that includes both Bayesian and classical parametric models. The scoringRules package aims to be a convenient, dictionary-like reference for computing scoring rules. We offer state-of-the-art implementations of several known (but not routinely applied) formulas, and implement closed-form expressions that were previously unavailable. Whenever more than one implementation variant exists, we offer statistically principled default choices.
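
    Although scoringRules is an R package, the continuous ranked probability score it computes is easy to sketch in Python for the Gaussian case. The snippet below uses the well-known closed-form CRPS for a normal forecast and checks it against the sample-based definition; it is a minimal illustration, not part of the package.

    ```python
    # Minimal sketch: closed-form CRPS for a normal predictive distribution,
    # checked against the sample-based estimate
    # CRPS(F, y) = E|X - y| - 0.5 * E|X - X'|.
    import numpy as np
    from scipy.stats import norm

    def crps_normal(mu, sigma, y):
        """Closed-form CRPS for a N(mu, sigma^2) forecast and observation y."""
        z = (y - mu) / sigma
        return sigma * (z * (2 * norm.cdf(z) - 1) + 2 * norm.pdf(z) - 1 / np.sqrt(np.pi))

    mu, sigma, y = 2.0, 1.5, 3.2
    rng = np.random.default_rng(4)
    x, x2 = rng.normal(mu, sigma, (2, 200_000))
    sample_crps = np.mean(np.abs(x - y)) - 0.5 * np.mean(np.abs(x - x2))

    print(f"closed form : {crps_normal(mu, sigma, y):.4f}")
    print(f"sample based: {sample_crps:.4f}")
    ```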

  6. Binary Encoded-Prototype Tree for Probabilistic Model Building GP

    NASA Astrophysics Data System (ADS)

    Yanase, Toshihiko; Hasegawa, Yoshihiko; Iba, Hitoshi

    In recent years, program evolution algorithms based on the estimation of distribution algorithm (EDA) have been proposed to improve the search ability of genetic programming (GP) and to overcome GP-hard problems. One such method is the probabilistic prototype tree (PPT) based algorithm. The PPT-based method explores the optimal tree structure by using the full tree whose number of child nodes is the maximum among possible trees. This algorithm, however, suffers from problems arising from function nodes having different numbers of child nodes. These function nodes cause intron nodes, which do not affect the fitness function. Moreover, function nodes having many child nodes increase the search space and the number of samples necessary for properly constructing the probabilistic model. In order to solve this problem, we propose a binary encoding for the PPT. In this article, we convert each function node to a subtree of binary nodes such that the converted tree remains grammatically correct. Our method reduces the ineffective search space, and the binary-encoded tree is able to express the same tree structures as the original method. The effectiveness of the proposed method is demonstrated through two computational experiments.

  7. a Generic Probabilistic Model and a Hierarchical Solution for Sensor Localization in Noisy and Restricted Conditions

    NASA Astrophysics Data System (ADS)

    Ji, S.; Yuan, X.

    2016-06-01

    A generic probabilistic model, based on fundamental Bayes' rule and the Markov assumption, is introduced to integrate the process of mobile platform localization with optical sensors. Based on it, three relatively independent solutions, bundle adjustment, Kalman filtering and particle filtering, are deduced under different and additional restrictions. We aim to show that, first, Kalman filtering may be a better initial-value supplier for bundle adjustment than traditional relative orientation in irregular strips and networks or when tie-point extraction fails. Second, in highly noisy conditions, particle filtering can act as a bridge for gap binding when a large number of gross errors cause a Kalman filtering or a bundle adjustment to fail. Third, both filtering methods, which help reduce error propagation and eliminate gross errors, safeguard a global and static bundle adjustment, which requires the strictest initial values and control conditions. The main innovation concerns the integrated processing of stochastic errors and gross errors in sensor observations, and the integration of the three most widely used solutions, bundle adjustment, Kalman filtering and particle filtering, into a generic probabilistic localization model. Tests in noisy and restricted situations are designed and examined to demonstrate these points.

  8. A probabilistic model of a porous heat exchanger

    NASA Technical Reports Server (NTRS)

    Agrawal, O. P.; Lin, X. A.

    1995-01-01

    This paper presents a probabilistic one-dimensional finite element model for heat transfer processes in porous heat exchangers. The Galerkin approach is used to develop the finite element matrices. Some of the submatrices are asymmetric due to the presence of the flow term. The Neumann expansion is used to write the temperature distribution as a series of random variables, and the expectation operator is applied to obtain the mean and deviation statistics. To demonstrate the feasibility of the formulation, a one-dimensional model of the heat transfer phenomenon in superfluid flow through a porous medium is considered. Results of this formulation agree well with Monte Carlo simulations and analytical solutions. Although the numerical experiments are confined to parametric random variables, a formulation is presented to account for random spatial variations.

  9. Applications of the International Space Station Probabilistic Risk Assessment Model

    NASA Technical Reports Server (NTRS)

    Grant, Warren; Lutomski, Michael G.

    2011-01-01

    Recently the International Space Station (ISS) has incorporated more Probabilistic Risk Assessments (PRAs) in the decision making process for significant issues. Future PRAs will have major impact to ISS and future spacecraft development and operations. These PRAs will have their foundation in the current complete ISS PRA model and the current PRA trade studies that are being analyzed as requested by ISS Program stakeholders. ISS PRAs have recently helped in the decision making process for determining reliability requirements for future NASA spacecraft and commercial spacecraft, making crew rescue decisions, as well as making operational requirements for ISS orbital orientation, planning Extravehicular activities (EVAs) and robotic operations. This paper will describe some applications of the ISS PRA model and how they impacted the final decision. This paper will discuss future analysis topics such as life extension, requirements of new commercial vehicles visiting ISS.

  10. Probabilistic Structural Analysis and Reliability Using NESSUS With Implemented Material Strength Degradation Model

    NASA Technical Reports Server (NTRS)

    Bast, Callie C.; Jurena, Mark T.; Godines, Cody R.; Chamis, Christos C. (Technical Monitor)

    2001-01-01

    This project included both research and education objectives. The goal of this project was to advance innovative research and education objectives in theoretical and computational probabilistic structural analysis, reliability, and life prediction for improved reliability and safety of structural components of aerospace and aircraft propulsion systems. Research and education partners included Glenn Research Center (GRC) and Southwest Research Institute (SwRI) along with the University of Texas at San Antonio (UTSA). SwRI enhanced the NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) code and provided consulting support for NESSUS-related activities at UTSA. NASA funding supported three undergraduate students, two graduate students, a summer course instructor and the Principal Investigator. Matching funds from UTSA provided for the purchase of additional equipment for the enhancement of the Advanced Interactive Computational SGI Lab established during the first year of this Partnership Award to conduct the probabilistic finite element summer courses. The research portion of this report presents the culmination of work performed through the use of the probabilistic finite element program NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) and an embedded Material Strength Degradation (MSD) model. Probabilistic structural analysis provided for quantification of uncertainties associated with the design, thus enabling increased system performance and reliability. The structure examined was a Space Shuttle Main Engine (SSME) fuel turbopump blade. The blade material analyzed was Inconel 718, since the MSD model was previously calibrated for this material. Reliability analysis encompassing the effects of high temperature and high cycle fatigue yielded a reliability value of 0.99978 using a fully correlated random field for the blade thickness. The reliability did not change significantly for a change in distribution type except for a change in

  11. Integrated deterministic and probabilistic safety analysis for safety assessment of nuclear power plants

    DOE PAGESBeta

    Di Maio, Francesco; Zio, Enrico; Smith, Curtis; Rychkov, Valentin

    2015-07-06

    The present special issue contains an overview of the research in the field of Integrated Deterministic and Probabilistic Safety Assessment (IDPSA) of Nuclear Power Plants (NPPs). Traditionally, safety regulation for NPPs design and operation has been based on Deterministic Safety Assessment (DSA) methods to verify criteria that assure plant safety in a number of postulated Design Basis Accident (DBA) scenarios. Referring to such criteria, it is also possible to identify those plant Structures, Systems, and Components (SSCs) and activities that are most important for safety within those postulated scenarios. Then, the design, operation, and maintenance of these “safety-related” SSCs and activities are controlled through regulatory requirements and supported by Probabilistic Safety Assessment (PSA).

  12. Best Merge Region Growing with Integrated Probabilistic Classification for Hyperspectral Imagery

    NASA Technical Reports Server (NTRS)

    Tarabalka, Yuliya; Tilton, James C.

    2011-01-01

    A new method for spectral-spatial classification of hyperspectral images is proposed. The method is based on the integration of probabilistic classification within the hierarchical best merge region growing algorithm. For this purpose, a preliminary probabilistic support vector machines classification is performed. Then, a hierarchical step-wise optimization algorithm is applied, iteratively merging regions with the smallest Dissimilarity Criterion (DC). The main novelty of this method consists in defining a DC between regions as a function of region statistical and geometrical features along with classification probabilities. Experimental results are presented for a 200-band AVIRIS image of the Northwestern Indiana vegetation area and compared with those obtained by recently proposed spectral-spatial classification techniques. The proposed method improves classification accuracies when compared to other classification approaches.

  13. Integrated deterministic and probabilistic safety analysis for safety assessment of nuclear power plants

    SciTech Connect

    Di Maio, Francesco; Zio, Enrico; Smith, Curtis; Rychkov, Valentin

    2015-07-06

    The present special issue contains an overview of the research in the field of Integrated Deterministic and Probabilistic Safety Assessment (IDPSA) of Nuclear Power Plants (NPPs). Traditionally, safety regulation for NPPs design and operation has been based on Deterministic Safety Assessment (DSA) methods to verify criteria that assure plant safety in a number of postulated Design Basis Accident (DBA) scenarios. Referring to such criteria, it is also possible to identify those plant Structures, Systems, and Components (SSCs) and activities that are most important for safety within those postulated scenarios. Then, the design, operation, and maintenance of these “safety-related” SSCs and activities are controlled through regulatory requirements and supported by Probabilistic Safety Assessment (PSA).

  14. De novo protein conformational sampling using a probabilistic graphical model

    NASA Astrophysics Data System (ADS)

    Bhattacharya, Debswapna; Cheng, Jianlin

    2015-11-01

    Efficient exploration of protein conformational space remains challenging, especially for large proteins, when assembling discretized structural fragments extracted from a protein structure database. We propose a fragment-free probabilistic graphical model, FUSION, for conformational sampling in continuous space and assess its accuracy using ‘blind’ protein targets with a length up to 250 residues from the CASP11 structure prediction exercise. The method reduces sampling bottlenecks, exhibits strong convergence, and demonstrates better performance than the popular fragment assembly method, ROSETTA, on relatively larger proteins with a length of more than 150 residues in our benchmark set. FUSION is freely available through a web server at http://protein.rnet.missouri.edu/FUSION/.

  15. De novo protein conformational sampling using a probabilistic graphical model

    PubMed Central

    Bhattacharya, Debswapna; Cheng, Jianlin

    2015-01-01

    Efficient exploration of protein conformational space remains challenging, especially for large proteins, when assembling discretized structural fragments extracted from a protein structure database. We propose a fragment-free probabilistic graphical model, FUSION, for conformational sampling in continuous space and assess its accuracy using ‘blind’ protein targets with a length up to 250 residues from the CASP11 structure prediction exercise. The method reduces sampling bottlenecks, exhibits strong convergence, and demonstrates better performance than the popular fragment assembly method, ROSETTA, on relatively larger proteins with a length of more than 150 residues in our benchmark set. FUSION is freely available through a web server at http://protein.rnet.missouri.edu/FUSION/. PMID:26541939

  16. Behavioral Modeling Based on Probabilistic Finite Automata: An Empirical Study.

    PubMed

    Tîrnăucă, Cristina; Montaña, José L; Ontañón, Santiago; González, Avelino J; Pardo, Luis M

    2016-01-01

    Imagine an agent that performs tasks according to different strategies. The goal of Behavioral Recognition (BR) is to identify which of the available strategies is the one being used by the agent, by simply observing the agent's actions and the environmental conditions during a certain period of time. The goal of Behavioral Cloning (BC) is more ambitious. In this last case, the learner must be able to build a model of the behavior of the agent. In both settings, the only assumption is that the learner has access to a training set that contains instances of observed behavioral traces for each available strategy. This paper studies a machine learning approach based on Probabilistic Finite Automata (PFAs), capable of achieving both the recognition and cloning tasks. We evaluate the performance of PFAs in the context of a simulated learning environment (in this case, a virtual Roomba vacuum cleaner robot), and compare it with a collection of other machine learning approaches. PMID:27347956
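
    The recognition task can be sketched with a much simpler stand-in for a PFA: a first-order Markov chain over discrete actions for each strategy. The snippet below is purely illustrative, with invented strategies and transition probabilities, and simply assigns an observed action trace to the most likely model.

    ```python
    # Minimal sketch of behavioral recognition: score an observed action sequence
    # under two candidate behavior models and pick the most likely one. A simple
    # Markov chain stands in for a full PFA; strategies and probabilities are
    # made up for illustration.
    import numpy as np

    actions = {"forward": 0, "turn": 1, "clean": 2}

    # Rows = current action, columns = next action.
    strategies = {
        "wall-follower": np.array([[0.7, 0.2, 0.1],
                                   [0.5, 0.4, 0.1],
                                   [0.6, 0.3, 0.1]]),
        "spot-cleaner":  np.array([[0.2, 0.3, 0.5],
                                   [0.3, 0.2, 0.5],
                                   [0.3, 0.3, 0.4]]),
    }

    def log_likelihood(trace, transition):
        idx = [actions[a] for a in trace]
        return sum(np.log(transition[i, j]) for i, j in zip(idx[:-1], idx[1:]))

    observed = ["forward", "forward", "turn", "forward", "clean", "forward"]
    scores = {name: log_likelihood(observed, T) for name, T in strategies.items()}
    print(max(scores, key=scores.get), scores)
    ```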

  17. Probabilistic models for reactive behaviour in heterogeneous condensed phase media

    NASA Astrophysics Data System (ADS)

    Baer, M. R.; Gartling, D. K.; DesJardin, P. E.

    2012-02-01

    This work presents statistically-based models to describe reactive behaviour in heterogeneous energetic materials. Mesoscale effects are incorporated in continuum-level reactive flow descriptions using probability density functions (pdfs) that are associated with thermodynamic and mechanical states. A generalised approach is presented that includes multimaterial behaviour by treating the volume fraction as a random kinematic variable. Model simplifications are then sought to reduce the complexity of the description without compromising the statistical approach. Reactive behaviour is first considered for non-deformable media having a random temperature field as an initial state. A pdf transport relationship is derived and an approximate moment approach is incorporated in finite element analysis to model an example application whereby a heated fragment impacts a reactive heterogeneous material which leads to a delayed cook-off event. Modelling is then extended to include deformation effects associated with shock loading of a heterogeneous medium whereby random variables of strain, strain-rate and temperature are considered. A demonstrative mesoscale simulation of a non-ideal explosive is discussed that illustrates the joint statistical nature of the strain and temperature fields during shock loading to motivate the probabilistic approach. This modelling is derived in a Lagrangian framework that can be incorporated in continuum-level shock physics analysis. Future work will consider particle-based methods for a numerical implementation of this modelling approach.

  18. Probabilistic modeling of scene dynamics for applications in visual surveillance.

    PubMed

    Saleemi, Imran; Shafique, Khurram; Shah, Mubarak

    2009-08-01

    We propose a novel method to model and learn the scene activity, observed by a static camera. The proposed model is very general and can be applied for solution of a variety of problems. The motion patterns of objects in the scene are modeled in the form of a multivariate nonparametric probability density function of spatiotemporal variables (object locations and transition times between them). Kernel Density Estimation is used to learn this model in a completely unsupervised fashion. Learning is accomplished by observing the trajectories of objects by a static camera over extended periods of time. It encodes the probabilistic nature of the behavior of moving objects in the scene and is useful for activity analysis applications, such as persistent tracking and anomalous motion detection. In addition, the model also captures salient scene features, such as the areas of occlusion and most likely paths. Once the model is learned, we use a unified Markov Chain Monte Carlo (MCMC)-based framework for generating the most likely paths in the scene, improving foreground detection, persistent labeling of objects during tracking, and deciding whether a given trajectory represents an anomaly to the observed motion patterns. Experiments with real-world videos are reported which validate the proposed approach. PMID:19542580
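
    A minimal sketch of the density-based idea is shown below. It is not the authors' system: synthetic trajectories stand in for tracked objects, SciPy's Gaussian kernel density estimate stands in for their spatiotemporal KDE, and a transition is flagged as anomalous when its density falls below a low percentile of the training densities.

    ```python
    # Minimal sketch: learn a nonparametric density of observed transitions
    # (x, y, dx, dy) with kernel density estimation and flag low-density
    # transitions as anomalous. Data are synthetic.
    import numpy as np
    from scipy.stats import gaussian_kde

    rng = np.random.default_rng(5)

    # Normal traffic: objects move left-to-right along a horizontal corridor.
    n = 2000
    x = rng.uniform(0, 100, n)
    y = rng.normal(50, 3, n)
    dx = rng.normal(2.0, 0.3, n)
    dy = rng.normal(0.0, 0.2, n)
    training = np.vstack([x, y, dx, dy])          # shape (4, n)

    kde = gaussian_kde(training)

    normal_move = np.array([[40.0], [50.0], [2.0], [0.0]])
    odd_move = np.array([[40.0], [50.0], [-2.0], [3.0]])   # against the flow

    threshold = np.percentile(kde(training), 1)   # 1st percentile of training density
    for label, move in [("normal", normal_move), ("odd", odd_move)]:
        dens = kde(move)[0]
        print(f"{label} move: density={dens:.2e}, anomalous={dens < threshold}")
    ```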

  19. Probabilistic models for creep-fatigue in a steel alloy

    NASA Astrophysics Data System (ADS)

    Ibisoglu, Fatmagul

    In high temperature components subjected to long term cyclic operation, simultaneous creep and fatigue damage occur. A new methodology for creep-fatigue life assessment has been adopted without the need to separate creep and fatigue damage or expended life. Probabilistic models, described by hold times in tension and total strain range at temperature, have been derived based on the creep rupture behavior of a steel alloy. These models have been validated with the observed creep-fatigue life of the material with a scatter band close to a factor of 2. Uncertainties of the creep-fatigue model parameters have been estimated with WinBUGS which is an open source Bayesian analysis software tool that uses Markov Chain Monte Carlo method to fit statistical models. Secondly, creep deformation in stress relaxation data has been analyzed. Well performing creep equations have been validated with the observed data. The creep model with the highest goodness of fit among the validated models has been used to estimate probability of exceedance at 0.6% strain level for the steel alloy.

  20. Modeling Characteristics of an Operational Probabilistic Safety Assessment (PSA)

    SciTech Connect

    Anoba, Richard C.; Khalil, Yehia; Fluehr, J.J. III; Kellogg, Richard; Hackerott, Alan

    2002-07-01

    Probabilistic Safety Assessments (PSAs) are increasingly being used as a tool for supporting the acceptability of design, procurement, construction, operation, and maintenance activities at nuclear power plants. Since the issuance of Generic Letter 88-20 and subsequent Individual Plant Examinations (IPEs)/Individual Plant Examinations for External Events (IPEEEs), the NRC has issued several Regulatory Guides, such as RG 1.182, to describe the use of PSA in risk-informed regulation activities. The PSA models developed for the IPEs were typically based on a 'snapshot' of the risk profile at the nuclear power plant. The IPE models contain implicit assumptions and simplifications that limit the ability to realistically assess current issues. For example, IPE modeling assumptions related to plant configuration limit the ability to perform online equipment out-of-service assessments. The lack of model symmetry results in skewed risk results. IPE model simplifications related to initiating events have resulted in non-conservative estimates of risk impacts when equipment is removed from service. The IPE models also do not explicitly address all external events that are potentially risk significant as equipment is removed from service. (authors)

  1. Probabilistic evaluation of integrating resource recovery into wastewater treatment to improve environmental sustainability.

    PubMed

    Wang, Xu; McCarty, Perry L; Liu, Junxin; Ren, Nan-Qi; Lee, Duu-Jong; Yu, Han-Qing; Qian, Yi; Qu, Jiuhui

    2015-02-01

    Global expectations for wastewater service infrastructure have evolved over time, and the standard treatment methods used by wastewater treatment plants (WWTPs) are facing issues related to problem shifting due to the current emphasis on sustainability. A transition in WWTPs toward reuse of wastewater-derived resources is recognized as a promising solution for overcoming these obstacles. However, it remains uncertain whether this approach can reduce the environmental footprint of WWTPs. To test this hypothesis, we conducted a net environmental benefit calculation for several scenarios for more than 50 individual countries over a 20-y time frame. For developed countries, the resource recovery approach resulted in ∼154% net increase in the environmental performance of WWTPs compared with the traditional substance elimination approach, whereas this value decreased to ∼60% for developing countries. Subsequently, we conducted a probabilistic analysis integrating these estimates with national values and determined that, if this transition was attempted for WWTPs in developed countries, it would have a ∼65% probability of attaining net environmental benefits. However, this estimate decreased greatly to ∼10% for developing countries, implying a substantial risk of failure. These results suggest that implementation of this transition for WWTPs should be studied carefully in different temporal and spatial contexts. Developing countries should customize their approach to realizing more sustainable WWTPs, rather than attempting to simply replicate the successful models of developed countries. Results derived from the model forecasting highlight the role of bioenergy generation and reduced use of chemicals in improving the sustainability of WWTPs in developing countries. PMID:25605884

  2. Probabilistic evaluation of integrating resource recovery into wastewater treatment to improve environmental sustainability

    PubMed Central

    Wang, Xu; McCarty, Perry L.; Liu, Junxin; Ren, Nan-Qi; Lee, Duu-Jong; Yu, Han-Qing; Qian, Yi; Qu, Jiuhui

    2015-01-01

    Global expectations for wastewater service infrastructure have evolved over time, and the standard treatment methods used by wastewater treatment plants (WWTPs) are facing issues related to problem shifting due to the current emphasis on sustainability. A transition in WWTPs toward reuse of wastewater-derived resources is recognized as a promising solution for overcoming these obstacles. However, it remains uncertain whether this approach can reduce the environmental footprint of WWTPs. To test this hypothesis, we conducted a net environmental benefit calculation for several scenarios for more than 50 individual countries over a 20-y time frame. For developed countries, the resource recovery approach resulted in ∼154% net increase in the environmental performance of WWTPs compared with the traditional substance elimination approach, whereas this value decreased to ∼60% for developing countries. Subsequently, we conducted a probabilistic analysis integrating these estimates with national values and determined that, if this transition was attempted for WWTPs in developed countries, it would have a ∼65% probability of attaining net environmental benefits. However, this estimate decreased greatly to ∼10% for developing countries, implying a substantial risk of failure. These results suggest that implementation of this transition for WWTPs should be studied carefully in different temporal and spatial contexts. Developing countries should customize their approach to realizing more sustainable WWTPs, rather than attempting to simply replicate the successful models of developed countries. Results derived from the model forecasting highlight the role of bioenergy generation and reduced use of chemicals in improving the sustainability of WWTPs in developing countries. PMID:25605884

  3. Incorporating seismic phase correlations into a probabilistic model of global-scale seismology

    NASA Astrophysics Data System (ADS)

    Arora, Nimar

    2013-04-01

    We present a probabilistic model of seismic phases whereby the attributes of the body-wave phases are correlated with those of the first-arriving P phase. This model has been incorporated into NET-VISA (Network processing Vertically Integrated Seismic Analysis), a probabilistic generative model of seismic events, their transmission, and their detection on a global seismic network. In the earlier version of NET-VISA, seismic phases were assumed to be independent of each other. Although this did not, for the most part, affect the quality of the inferred seismic bulletin, it did result in a few instances of anomalous phase association, for example, an S phase with a smaller slowness than the corresponding P phase. We demonstrate that the phase attributes are indeed highly correlated; for example, the uncertainty in the S phase travel time is significantly reduced given the P phase travel time. Our new model exploits these correlations to produce better calibrated probabilities for the events, as well as fewer anomalous associations.

  4. Modelling circumplanetary ejecta clouds at low altitudes: A probabilistic approach

    NASA Astrophysics Data System (ADS)

    Christou, Apostolos A.

    2015-04-01

    A model is presented of a ballistic, collisionless, steady state population of ejecta launched at randomly distributed times and velocities and moving under constant gravity above the surface of an airless planetary body. Within a probabilistic framework, closed form solutions are derived for the probability density functions of the altitude distribution of particles, the distribution of their speeds in a rest frame both at the surface and at altitude and with respect to a moving platform such as an orbiting spacecraft. These expressions are validated against numerically-generated synthetic populations of ejecta under lunar surface gravity. The model is applied to the cases where the ejection speed distribution is (a) uniform (b) a power law. For the latter law, it is found that the effective scale height of the ejecta envelope directly depends on the exponent of the power law and increases with altitude. The same holds for the speed distribution of particles near the surface. Ejection model parameters can, therefore, be constrained through orbital and surface measurements. The scope of the model is then extended to include size-dependency of the ejection speed and an example worked through for a deterministic power law relation. The result suggests that the height distribution of ejecta is a sensitive proxy for this dependency.
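
    The probabilistic set-up can be illustrated numerically, as in the sketch below. It is not the paper's closed-form model: launches are vertical, the ejection-speed power law and its exponent are assumed, and the steady-state altitude distribution is obtained by sampling launch times uniformly over a window longer than any flight time.

    ```python
    # Minimal sketch: a synthetic steady-state ejecta population under constant
    # lunar gravity, from which the altitude distribution can be sampled.
    # Launch speeds follow an illustrative power law.
    import numpy as np

    rng = np.random.default_rng(6)
    g = 1.62            # m/s^2, lunar surface gravity
    v_min, v_max = 1.0, 50.0
    exponent = -2.5     # power-law exponent of the ejection speed distribution (assumed)

    # Inverse-transform sampling of speeds from p(v) ~ v**exponent on [v_min, v_max].
    u = rng.random(200_000)
    a = exponent + 1.0
    v = (v_min**a + u * (v_max**a - v_min**a)) ** (1.0 / a)

    # Vertical launches; launch times uniform over a window at least as long as the
    # longest flight, so the population is in steady state at the observation instant.
    flight_time = 2.0 * v / g
    window = flight_time.max()
    elapsed = rng.uniform(0.0, window, v.size)         # time since launch
    in_flight = elapsed < flight_time
    h = v[in_flight] * elapsed[in_flight] - 0.5 * g * elapsed[in_flight] ** 2

    print(f"particles aloft: {in_flight.mean():.2%} of those launched in the window")
    print(f"median altitude: {np.median(h):.1f} m, "
          f"95th percentile: {np.percentile(h, 95):.1f} m")
    ```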

  5. Probabilistic model for quick detection of dissimilar binary images

    NASA Astrophysics Data System (ADS)

    Mustafa, Adnan A. Y.

    2015-09-01

    We present a quick method to detect dissimilar binary images. The method is based on a "probabilistic matching model" for image matching. The matching model is used to predict the probability of occurrence of distinct-dissimilar image pairs (completely different images) when matching one image to another. Based on this model, distinct-dissimilar images can be detected by matching only a few points between two images with high confidence, namely 11 points for a 99.9% successful detection rate. For image pairs that are dissimilar but not distinct-dissimilar, more points need to be mapped. The number of points required to attain a certain successful detection rate or confidence depends on the amount of similarity between the compared images. As this similarity increases, more points are required. For example, images that differ by 1% can be detected by mapping fewer than 70 points on average. More importantly, the model is image size invariant; so, images of any sizes will produce high confidence levels with a limited number of matched points. As a result, this method does not suffer from the image size handicap that impedes current methods. We report on extensive tests conducted on real images of different sizes.
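
    The counting argument behind the quick test can be sketched as follows. The snippet assumes that, for completely different binary images, each sampled pixel pair agrees by chance with probability q (0.5 for unbiased images), which is in the spirit of, but not necessarily identical to, the paper's matching model.

    ```python
    # Minimal sketch: how many sampled points are needed to declare two binary
    # images distinct-dissimilar with a given confidence, assuming each point
    # agrees by chance with probability q.
    import math
    import numpy as np

    def points_needed(confidence, q=0.5):
        """Smallest k with q**k <= 1 - confidence (false-match probability)."""
        return math.ceil(math.log(1.0 - confidence) / math.log(q))

    print(points_needed(0.999))   # around 10-11 points under this assumption

    def probably_dissimilar(img_a, img_b, k, rng):
        """Declare dissimilar if any of k randomly sampled pixels differ."""
        rows = rng.integers(0, img_a.shape[0], k)
        cols = rng.integers(0, img_a.shape[1], k)
        return np.any(img_a[rows, cols] != img_b[rows, cols])

    rng = np.random.default_rng(7)
    a = rng.integers(0, 2, (512, 512))
    b = rng.integers(0, 2, (512, 512))   # independent, hence distinct-dissimilar
    print(probably_dissimilar(a, b, points_needed(0.999), rng))   # almost always True
    ```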

  6. Probabilistic modeling of solidification grain structure in investment castings

    SciTech Connect

    Upadhya, G.K.; Yu, K.O.; Layton, M.A.; Paul, A.J.

    1995-12-31

    A probabilistic approach for modeling the evolution of grain structure in investment castings has been developed. The approach differs from the classical Monte Carlo simulations of microstructural evolution in that it uses the results from a heat transfer simulation of the investment casting process for determining the probabilities of nucleation and growth. The model was used to predict the solidification grain structure in castings. The model is quasi-3D, since it uses the information from a 3D simulation of heat transfer to predict the grain structure developed in any 2D-section of the casting. Structural transitions such as columnar/equiaxed transition can also be predicted, using suitable transition criteria. Results from the model have been validated by comparison with actual micrographs from experimental investment castings. In the first case, simulations were performed for a simple plate shaped casting of superalloy Rene 77. The effects of mold insulation as well as metal pour and mold preheat temperatures on the grain size of the casting were studied. In the second example, the casting of a complex-shaped jet engine component made of superalloy IN718 was simulated. Simulation results were seen to match very well with experiments.

  7. Probabilistic models to describe the dynamics of migrating microbial communities.

    PubMed

    Schroeder, Joanna L; Lunn, Mary; Pinto, Ameet J; Raskin, Lutgarde; Sloan, William T

    2015-01-01

    In all but the most sterile environments bacteria will reside in fluid being transported through conduits and some of these will attach and grow as biofilms on the conduit walls. The concentration and diversity of bacteria in the fluid at the point of delivery will be a mix of those when it entered the conduit and those that have become entrained into the flow due to seeding from biofilms. Examples include fluids through conduits such as drinking water pipe networks, endotracheal tubes, catheters and ventilation systems. Here we present two probabilistic models to describe changes in the composition of bulk fluid microbial communities as they are transported through a conduit whilst exposed to biofilm communities. The first (discrete) model simulates absolute numbers of individual cells, whereas the other (continuous) model simulates the relative abundance of taxa in the bulk fluid. The discrete model is founded on a birth-death process whereby the community changes one individual at a time and the numbers of cells in the system can vary. The continuous model is a stochastic differential equation derived from the discrete model and can also accommodate changes in the carrying capacity of the bulk fluid. These models provide a novel Lagrangian framework to investigate and predict the dynamics of migrating microbial communities. In this paper we compare the two models, discuss their merits, possible applications and present simulation results in the context of drinking water distribution systems. Our results provide novel insight into the effects of stochastic dynamics on the composition of non-stationary microbial communities that are exposed to biofilms and provides a new avenue for modelling microbial dynamics in systems where fluids are being transported. PMID:25803866
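
    A minimal sketch of the discrete (birth-death) picture is given below. It is not the authors' model: the biofilm composition, immigration probability, and community size are invented, and each event simply replaces one randomly chosen cell with either a biofilm immigrant or the offspring of a bulk cell.

    ```python
    # Minimal sketch: a birth-death process in which the bulk-fluid community
    # changes one cell at a time, with replacements drawn from the biofilm
    # (immigration) or from the existing bulk community. Parameters are invented.
    import numpy as np

    rng = np.random.default_rng(8)

    n_taxa = 5
    biofilm = np.array([0.60, 0.25, 0.10, 0.04, 0.01])   # biofilm composition (assumed)
    community = rng.multinomial(1000, [0.2] * n_taxa)    # 1000 cells entering the pipe
    m = 0.05                                             # immigration probability per event

    for _ in range(50_000):                              # events along the conduit
        # death: remove one randomly chosen cell
        dead = rng.choice(n_taxa, p=community / community.sum())
        community[dead] -= 1
        # birth: immigrant from the biofilm, or offspring of a bulk cell
        if rng.random() < m:
            born = rng.choice(n_taxa, p=biofilm)
        else:
            born = rng.choice(n_taxa, p=community / community.sum())
        community[born] += 1

    print("relative abundances at the delivery point:", community / community.sum())
    ```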

  8. The Gain-Loss Model: A Probabilistic Skill Multimap Model for Assessing Learning Processes

    ERIC Educational Resources Information Center

    Robusto, Egidio; Stefanutti, Luca; Anselmi, Pasquale

    2010-01-01

    Within the theoretical framework of knowledge space theory, a probabilistic skill multimap model for assessing learning processes is proposed. The learning process of a student is modeled as a function of the student's knowledge and of an educational intervention on the attainment of specific skills required to solve problems in a knowledge…

  9. Probabilistic constitutive relationships for material strength degradation models

    NASA Technical Reports Server (NTRS)

    Boyce, L.; Chamis, C. C.

    1989-01-01

    In the present probabilistic methodology for the strength of aerospace propulsion system structural components subjected to such environmentally-induced primitive variables as loading stresses, high temperature, chemical corrosion, and radiation, time is encompassed as an interacting element, allowing the projection of creep and fatigue effects. A probabilistic constitutive equation is postulated to account for the degradation of strength due to these primitive variables which may be calibrated by an appropriately curve-fitted least-squares multiple regression of experimental data. The resulting probabilistic constitutive equation is embodied in the PROMISS code for aerospace propulsion component random strength determination.

  10. Probabilistic consequence model of accidental or intentional chemical releases.

    SciTech Connect

    Chang, Y.-S.; Samsa, M. E.; Folga, S. M.; Hartmann, H. M.

    2008-06-02

    In this work, general methodologies for evaluating the impacts of large-scale toxic chemical releases are proposed. The potential numbers of injuries and fatalities, the numbers of hospital beds, and the geographical areas rendered unusable during and some time after the occurrence and passage of a toxic plume are estimated on a probabilistic basis. To arrive at these estimates, historical accidental release data, maximum stored volumes, and meteorological data were used as inputs into the SLAB accidental chemical release model. Toxic gas footprints from the model were overlaid onto detailed population and hospital distribution data for a given region to estimate potential impacts. Output results are in the form of a generic statistical distribution of injuries and fatalities associated with specific toxic chemicals and regions of the United States. In addition, indoor hazards were estimated, so the model can provide contingency plans for either shelter-in-place or evacuation when an accident occurs. The stochastic distributions of injuries and fatalities are being used in a U.S. Department of Homeland Security-sponsored decision support system as source terms for a Monte Carlo simulation that evaluates potential measures for mitigating terrorist threats. This information can also be used to support the formulation of evacuation plans and to estimate damage and cleanup costs.

  11. A Probabilistic Palimpsest Model of Visual Short-term Memory

    PubMed Central

    Matthey, Loic; Bays, Paul M.; Dayan, Peter

    2015-01-01

    Working memory plays a key role in cognition, and yet its mechanisms remain much debated. Human performance on memory tasks is severely limited; however, the two major classes of theory explaining the limits leave open questions about key issues such as how multiple simultaneously-represented items can be distinguished. We propose a palimpsest model, with the occurrent activity of a single population of neurons coding for several multi-featured items. Using a probabilistic approach to storage and recall, we show how this model can account for many qualitative aspects of existing experimental data. In our account, the underlying nature of a memory item depends entirely on the characteristics of the population representation, and we provide analytical and numerical insights into critical issues such as multiplicity and binding. We consider representations in which information about individual feature values is partially separate from the information about binding that creates single items out of multiple features. An appropriate balance between these two types of information is required to capture fully the different types of error seen in human experimental data. Our model provides the first principled account of misbinding errors. We also suggest a specific set of stimuli designed to elucidate the representations that subjects actually employ. PMID:25611204

  12. Predicting coastal cliff erosion using a Bayesian probabilistic model

    USGS Publications Warehouse

    Hapke, C.; Plant, N.

    2010-01-01

    Regional coastal cliff retreat is difficult to model due to the episodic nature of failures and the along-shore variability of retreat events. There is a growing demand, however, for predictive models that can be used to forecast areas vulnerable to coastal erosion hazards. Increasingly, probabilistic models are being employed that require data sets of high temporal density to define the joint probability density function that relates forcing variables (e.g. wave conditions) and initial conditions (e.g. cliff geometry) to erosion events. In this study we use a multi-parameter Bayesian network to investigate correlations between key variables that control and influence variations in cliff retreat processes. The network uses Bayesian statistical methods to estimate event probabilities using existing observations. Within this framework, we forecast the spatial distribution of cliff retreat along two stretches of cliffed coast in Southern California. The input parameters are the height and slope of the cliff, a descriptor of material strength based on the dominant cliff-forming lithology, and the long-term cliff erosion rate that represents prior behavior. The model is forced using predicted wave impact hours. Results demonstrate that the Bayesian approach is well-suited to the forward modeling of coastal cliff retreat, with the correct outcomes forecast in 70-90% of the modeled transects. The model also performs well in identifying specific locations of high cliff erosion, thus providing a foundation for hazard mapping. This approach can be employed to predict cliff erosion at time-scales ranging from storm events to the impacts of sea-level rise at the century-scale.

  13. Spatial polychaeta habitat potential mapping using probabilistic models

    NASA Astrophysics Data System (ADS)

    Choi, Jong-Kuk; Oh, Hyun-Joo; Koo, Bon Joo; Ryu, Joo-Hyung; Lee, Saro

    2011-06-01

    The purpose of this study was to apply probabilistic models to the mapping of the potential polychaeta habitat area in the Hwangdo tidal flat, Korea. Remote sensing techniques were used to construct spatial datasets of ecological environments, and field observations were carried out to determine the distribution of macrobenthos. Habitat potential mapping was achieved for two polychaeta species, Prionospio japonica and Prionospio pulchra, and eight control factors relating to the tidal macrobenthos distribution were selected. These included the intertidal digital elevation model (DEM), slope, aspect, tidal exposure duration, distance from tidal channels, tidal channel density, spectral reflectance of the near infrared (NIR) bands and surface sedimentary facies from satellite imagery. The spatial relationships between the polychaeta species and each control factor were calculated using a frequency ratio and weights-of-evidence combined with geographic information system (GIS) data. The species records were randomly divided into a training set (70%), used to analyze habitat potential with the frequency ratio and weights-of-evidence models, and a test set (30%), used to verify the predicted habitat potential maps. The relationships were overlaid to produce a habitat potential map with a polychaeta habitat potential (PHP) index value. These maps were verified by comparison with the surveyed habitat locations in the verification data set. For the verification results, the frequency ratio model showed prediction accuracies of 77.71% and 74.87% for P. japonica and P. pulchra, respectively, while those for the weights-of-evidence model were 64.05% and 62.95%. Thus, the frequency ratio model provided a more accurate prediction than the weights-of-evidence model. Our data demonstrate that the frequency ratio and weights-of-evidence models based upon GIS analysis are effective for generating habitat potential maps of polychaeta species in a tidal flat. The results of this study can be applied towards
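
    The frequency ratio calculation itself is simple to sketch. The snippet below uses synthetic data for a single control factor (the full method sums the ratios over all eight factors); the class preferences and cell counts are invented for illustration.

    ```python
    # Minimal sketch of the frequency-ratio idea: FR for a class of a control
    # factor is the share of species occurrences in that class divided by the
    # share of the study area the class occupies. Data are synthetic.
    import numpy as np

    rng = np.random.default_rng(9)
    n_cells = 10_000

    # One control factor, e.g. tidal exposure duration, discretized into 4 classes.
    exposure_class = rng.integers(0, 4, n_cells)
    # Hypothetical occurrences that prefer classes 2 and 3.
    prob = np.array([0.02, 0.05, 0.15, 0.25])[exposure_class]
    occurrence = rng.random(n_cells) < prob

    fr = np.zeros(4)
    for c in range(4):
        in_class = exposure_class == c
        pct_occurrences = occurrence[in_class].sum() / occurrence.sum()
        pct_area = in_class.sum() / n_cells
        fr[c] = pct_occurrences / pct_area

    # Habitat potential index per cell from this single factor
    # (summed over all factors in the full method).
    php_index = fr[exposure_class]
    print("frequency ratios by class:", np.round(fr, 2))
    ```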

  14. Applications of the International Space Station Probabilistic Risk Assessment Model

    NASA Astrophysics Data System (ADS)

    Grant, W.; Lutomski, M.

    2012-01-01

    The International Space Station (ISS) program is continuing to expand the use of Probabilistic Risk Assessments (PRAs). The use of PRAs in the ISS decision making process has proven very successful over the past 8 years. PRAs are used in the decision making process to address significant operational and design issues as well as to identify, communicate, and mitigate risks. Future PRAs are expected to have major impacts on not only the ISS, but also future NASA programs and projects. Many of these PRAs will have their foundation in the current ISS PRA model and in PRA trade studies that are being developed for the ISS Program. ISS PRAs have supported the development of reliability requirements for future NASA and commercial spacecraft, the determination of inherent risk for visiting vehicles, the evaluation of potential crew rescue scenarios, operational requirements and alternatives, the planning of extravehicular activities (EVAs), and the evaluation of robotics operations. This paper will describe some applications of the ISS PRA model and how they impacted the final decisions that were made.

  15. A probabilistic model of intergranular stress corrosion cracking

    SciTech Connect

    Bourcier, R.J.; Jones, W.B.; Scully, J.R.

    1991-01-01

    We have developed a model which utilizes a probabilistic failure criterion to describe intergranular stress corrosion cracking (IGSCC). A two-dimensional array of elements representing a section of a pipe wall is analyzed, with each element in the array representing a segment of grain boundary. The failure criterion is applied repetitively to each element of the array that is exposed to the interior of the pipe (i.e., the corrosive fluid) until that element dissolves, thereby exposing the next element. A number of environmental, mechanical, and materials factors have been incorporated into the model, including: (1) the macroscopic applied stress profile, (2) the stress history, (3) the extent and grain-to-grain distribution of carbide sensitization levels, which can be applied to a subset of elements comprising a grain boundary, and (4) a data set containing IGSCC crack growth rates as a function of applied stress intensity and sensitization level averaged over a large population of grains. The latter information was obtained from the literature for AISI 304 stainless steel under light water nuclear reactor primary coolant environmental conditions. The resulting crack growth simulations are presented and discussed. 14 refs., 10 figs.

  16. Probabilistic clustering and shape modelling of white matter fibre bundles using regression mixtures.

    PubMed

    Ratnarajah, Nagulan; Simmons, Andy; Hojjatoleslami, Ali

    2011-01-01

    We present a novel approach for probabilistic clustering of white matter fibre pathways using curve-based regression mixture modelling techniques in 3D curve space. The clustering algorithm is based on a principled method for probabilistic modelling of a set of fibre trajectories as individual sequences of points generated from a finite mixture model consisting of multivariate polynomial regression model components. Unsupervised learning is carried out using maximum likelihood principles. Specifically, conditional mixture is used together with an EM algorithm to estimate cluster membership. The result of clustering is a probabilistic assignment of fibre trajectories to each cluster and an estimate of cluster parameters. A statistical shape model is calculated for each clustered fibre bundle using fitted parameters of the probabilistic clustering. We illustrate the potential of our clustering approach on synthetic and real data. PMID:21995009

  17. Probabilistic approaches to the modelling of fluvial processes

    NASA Astrophysics Data System (ADS)

    Molnar, Peter

    2013-04-01

    Fluvial systems generally exhibit sediment dynamics that are strongly stochastic. This stochasticity arises from three main sources: (a) the variability and randomness in sediment supply due to surface properties and topography; (b) the multitude of pathways that sediment may take on hillslopes and in channels, and the uncertainty in travel times and sediment storage along those pathways; and (c) the stochasticity inherent in mobilizing sediment, whether by heavy rain, landslides, debris flows, slope erosion, channel avulsions, etc. Fully deterministic models of fluvial systems, even if they are physically realistic and very complex, are unlikely to capture this stochasticity and as a result will fail to reproduce long-term sediment dynamics. In this paper I will review another approach to modelling fluvial processes, which grossly simplifies the system itself but allows for stochasticity in sediment supply, mobilization and transport. I will demonstrate the benefits and limitations of this probabilistic approach to fluvial processes with three examples. The first example is a probabilistic sediment cascade which we developed for the Illgraben, a debris flow basin in the Rhone catchment. In this example it will be shown how the probability distribution of landslides generating sediment input into the channel system is transposed into that of sediment yield out of the basin by debris flows. The key role of transient sediment storage in the channel system, which limits the size of potential debris flows, is highlighted together with the influence of the landslide triggering mechanisms and climate stochasticity. The second example focuses on the river reach scale in the Maggia River, a braided gravel-bed stream where the exposed sediment on gravel bars is colonised by riparian vegetation in periods without floods. A simple autoregressive model with a disturbance and colonization term is used to simulate the growth and decline in

  18. The Terrestrial Investigation Model: A probabilistic risk assessment model for birds exposed to pesticides

    EPA Science Inventory

    One of the major recommendations of the National Academy of Science to the USEPA, NMFS and USFWS was to utilize probabilistic methods when assessing the risks of pesticides to federally listed endangered and threatened species. The Terrestrial Investigation Model (TIM, version 3....

  19. Dynamic Probabilistic Modeling of Environmental Emissions of Engineered Nanomaterials.

    PubMed

    Sun, Tian Yin; Bornhöft, Nikolaus A; Hungerbühler, Konrad; Nowack, Bernd

    2016-05-01

    The need for an environmental risk assessment for engineered nanomaterials (ENM) necessitates the knowledge about their environmental concentrations. Despite significant advances in analytical methods, it is still not possible to measure the concentrations of ENM in natural systems. Material flow and environmental fate models have been used to provide predicted environmental concentrations. However, almost all current models are static and consider neither the rapid development of ENM production nor the fact that many ENM are entering an in-use stock and are released with a lag phase. Here we use dynamic probabilistic material flow modeling to predict the flows of four ENM (nano-TiO2, nano-ZnO, nano-Ag and CNT) to the environment and to quantify their amounts in (temporary) sinks such as the in-use stock and ("final") environmental sinks such as soil and sediment. Caused by the increase in production, the concentrations of all ENM in all compartments are increasing. Nano-TiO2 had far higher concentrations than the other three ENM. Sediment showed in our worst-case scenario concentrations ranging from 6.7 μg/kg (CNT) to about 40 000 μg/kg (nano-TiO2). In most cases the concentrations in waste incineration residues are at the "mg/kg" level. The flows to the environment that we provide will constitute the most accurate and reliable input of masses for environmental fate models which are using process-based descriptions of the fate and behavior of ENM in natural systems and rely on accurate mass input parameters. PMID:27043743
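
    The dynamic element of such a material flow model can be sketched as follows. The code is illustrative only: production growth, the in-use residence time, and the transfer coefficient to sediment are all assumed values, and the Monte Carlo loop simply propagates their uncertainty to the cumulative mass in the sediment sink.

    ```python
    # Minimal sketch of a dynamic, probabilistic material flow calculation:
    # growing annual production, delayed release from the in-use stock, and
    # Monte Carlo sampling of uncertain parameters. All values are assumed.
    import numpy as np

    rng = np.random.default_rng(10)
    years = np.arange(2005, 2021)
    production = 100.0 * 1.15 ** (years - years[0])   # t/yr, 15% annual growth (assumed)

    n_runs = 5000
    sediment_sinks = np.zeros(n_runs)

    for r in range(n_runs):
        to_sediment = rng.uniform(0.05, 0.25)         # uncertain transfer coefficient (assumed)
        lifetime = rng.uniform(3.0, 8.0)              # uncertain in-use residence time, yr (assumed)
        release = np.zeros_like(production)
        for i, amount in enumerate(production):
            lags = np.arange(len(years) - i)
            weights = np.exp(-lags / lifetime)        # exponential release profile,
            release[i:] += amount * weights / weights.sum()   # normalized within the window
        sediment_sinks[r] = (release * to_sediment).sum()

    lo, med, hi = np.percentile(sediment_sinks, [5, 50, 95])
    print(f"cumulative mass in the sediment sink by {years[-1]}: "
          f"median {med:.0f} t (90% interval {lo:.0f}-{hi:.0f} t)")
    ```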

  20. Probabilistic Finite Element: Variational Theory

    NASA Technical Reports Server (NTRS)

    Belytschko, T.; Liu, W. K.

    1985-01-01

    The goal of this research is to provide techniques which are cost-effective and enable the engineer to evaluate the effect of uncertainties in complex finite element models. Embedding the probabilistic aspects in a variational formulation is a natural approach. In addition, a variational approach to probabilistic finite elements enables it to be incorporated within standard finite element methodologies. Therefore, once the procedures are developed, they can easily be adapted to existing general purpose programs. Furthermore, the variational basis for these methods enables them to be adapted to a wide variety of structural elements and to provide a consistent basis for incorporating probabilistic features in many aspects of the structural problem. Completed tasks include the theoretical development of probabilistic variational equations for structural dynamics, the development of efficient numerical algorithms for probabilistic sensitivity displacement and stress analysis, and integration of methodologies into a pilot computer code.

  1. Rockfall hazard assessment integrating probabilistic physically based rockfall source detection (Norddal municipality, Norway).

    NASA Astrophysics Data System (ADS)

    Yugsi Molina, F. X.; Oppikofer, T.; Fischer, L.; Hermanns, R. L.; Taurisano, A.

    2012-04-01

    Traditional techniques to assess rockfall hazard are partially based on probabilistic analysis. Stochastic methods have been used for run-out analysis of rock blocks to estimate the trajectories that a detached block will follow during its fall until it stops due to kinetic energy loss. However, the selection of rockfall source areas is usually defined either by multivariate analysis or by field observations. In either case, a physically based approach is not used for the source area detection. We present an example of rockfall hazard assessment that integrates a probabilistic rockfall run-out analysis with a stochastic assessment of the rockfall source areas using kinematic stability analysis in a GIS environment. The method has been tested for a steep, more than 200 m high rock wall located in the municipality of Norddal (Møre og Romsdal county, Norway), where a large number of people are exposed to snow avalanches, rockfalls, or debris flows. The area was selected following the recently published hazard mapping plan of Norway. The cliff is formed by medium- to coarse-grained quartz-dioritic to granitic gneisses of Proterozoic age. Scree deposits, the product of recent rockfall activity, are found at the bottom of the rock wall. Large blocks can be found several tens of meters away from the cliff in Sylte, the main locality in the Norddal municipality. Structural characterization of the rock wall was done using terrestrial laser scanning (TLS) point clouds in the software Coltop3D (www.terranum.ch), and results were validated with field data. Orientation data sets from the structural characterization were analyzed separately to assess best-fit probability density functions (PDF) for both dip angle and dip direction angle of each discontinuity set. A GIS-based stochastic kinematic analysis was then carried out using the discontinuity set orientations and the friction angle as random variables. An airborne laser scanning digital elevation model (ALS-DEM) with 1 m

  2. Forecasting the duration of volcanic eruptions: an empirical probabilistic model

    NASA Astrophysics Data System (ADS)

    Gunn, L. S.; Blake, S.; Jones, M. C.; Rymer, H.

    2014-01-01

    The ability to forecast future volcanic eruption durations would greatly benefit emergency response planning prior to and during a volcanic crisis. This paper introduces a probabilistic model to forecast the duration of future and on-going eruptions. The model fits theoretical distributions to observed duration data and relies on past eruptions being a good indicator of future activity. A dataset of historical Mt. Etna flank eruptions is presented and used to demonstrate the model. The data have been compiled through critical examination of existing literature along with careful consideration of uncertainties on reported eruption start and end dates between the years 1300 AD and 2010. Data following 1600 are considered to be reliable and free of reporting biases. The distribution of eruption duration between the years 1600 and 1669 is found to be statistically different from that following it, and the forecasting model is run on two datasets of Mt. Etna flank eruption durations: 1600-2010 and 1670-2010. Each dataset is modelled using a log-logistic distribution with parameter values found by maximum likelihood estimation. Survivor function statistics are applied to the model distributions to forecast (a) the probability of an eruption exceeding a given duration, (b) the probability of an eruption that has already lasted a particular number of days exceeding a given total duration and (c) the duration with a given probability of being exceeded. Results show that excluding the 1600-1670 data has little effect on the forecasting model result, especially where short durations are involved. By assigning the terms `likely' and `unlikely' to probabilities of 66 % or more and 33 % or less, respectively, the forecasting model based on the 1600-2010 dataset indicates that a future flank eruption on Mt. Etna would be likely to exceed 20 days (± 7 days) but unlikely to exceed 86 days (± 29 days). This approach can easily be adapted for use on other highly active, well
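
    The survivor-function calculations described above can be sketched with a standard log-logistic fit; the snippet below uses scipy's Fisk (log-logistic) distribution on an entirely hypothetical duration sample, not the Etna catalogue:

    ```python
    import numpy as np
    from scipy import stats

    # Hypothetical eruption durations in days (illustrative only, not the Etna data)
    durations = np.array([3, 7, 12, 20, 25, 33, 41, 58, 86, 120, 210])

    # Fit a log-logistic (Fisk) distribution by maximum likelihood, location fixed at 0
    c, loc, scale = stats.fisk.fit(durations, floc=0)

    # (a) probability a future eruption exceeds 20 days
    p_exceed_20 = stats.fisk.sf(20, c, loc=loc, scale=scale)

    # (b) probability an eruption already lasting 10 days exceeds 86 days in total
    p_cond = stats.fisk.sf(86, c, loc=loc, scale=scale) / stats.fisk.sf(10, c, loc=loc, scale=scale)

    # (c) duration with a 33 % probability of being exceeded
    d_33 = stats.fisk.isf(0.33, c, loc=loc, scale=scale)

    print(f"P(D > 20 d) = {p_exceed_20:.2f}, P(D > 86 | D > 10) = {p_cond:.2f}, d33 = {d_33:.0f} d")
    ```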

  3. Bayesian modeling of rainfall-runoff uncertainty to improve probabilistic forecasts

    NASA Astrophysics Data System (ADS)

    Courbariaux, Marie; Parent, Éric; Favre, Anne-Catherine; Perreault, Luc; Gailhard, Joël; Barbillon, Pierre

    2015-04-01

    Probabilistic forecasts aim at accounting for uncertainty by producing a predictive distribution of the quantity of interest instead of a single best-guess estimate. With regard to river flow forecasts, uncertainty is mainly due (a) to the unknown future rainfalls and temperatures, and (b) to the possible inadequacy of the deterministic model mimicking the rainfall-runoff transformation. The first source of uncertainty can nowadays be taken into account using ensemble forecasts as inputs to the rainfall-runoff model (RRM). However, the second source of uncertainty, due to possible RRM misrepresentation, remains. A simple way to integrate it consists of adjusting the forecast's density as much as necessary to get a prediction consistent with the observations. This step is called "post-processing". Our work focuses on series of river flow forecasts routinely issued at EDF (Electricity of France) and at Hydro-Québec. We aim at reducing the sharpness loss in the post-processing step while guaranteeing point-wise and temporal consistency. To do so, we write a joint model on the RRM errors along the whole trajectory to be predicted. Point-wise and temporal consistency are then obtained relying on a Bayesian approach. As in Krzysztofowicz's works, we first consider the prior behavior of the natural river flow and then update it by taking into account the likelihood of the information conveyed through the RRM's outputs. In the spirit of Markov switching models, we establish a classification of time periods based on the RRM's state variables through a Probit model. Conditioning on such a classification yields a mixture model of RRM errors. We finally compare the results to EDF's present operational forecasting system. Key words: probabilistic forecasts, sharpness, rainfall-runoff, post-processing, river flow, model error.

  4. A Hybrid Probabilistic Model for Unified Collaborative and Content-Based Image Tagging.

    PubMed

    Zhou, Ning; Cheung, William K; Qiu, Guoping; Xue, Xiangyang

    2011-07-01

    The increasing availability of large quantities of user-contributed images with labels has provided opportunities to develop automatic tools to tag images to facilitate image search and retrieval. In this paper, we present a novel hybrid probabilistic model (HPM) which integrates low-level image features and high-level user-provided tags to automatically tag images. For images without any tags, HPM predicts new tags based solely on the low-level image features. For images with user-provided tags, HPM jointly exploits both the image features and the tags in a unified probabilistic framework to recommend additional tags to label the images. The HPM framework makes use of the tag-image association matrix (TIAM). However, since the number of images is usually very large and user-provided tags are diverse, TIAM is very sparse, making it difficult to reliably estimate tag-to-tag co-occurrence probabilities. We developed a collaborative filtering method based on nonnegative matrix factorization (NMF) for tackling this data sparsity issue. Also, an L1-norm kernel method is used to estimate the correlations between image features and semantic concepts. The effectiveness of the proposed approach has been evaluated using three databases containing 5,000 images with 371 tags, 31,695 images with 5,587 tags, and 269,648 images with 5,018 tags, respectively. PMID:21079279
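
    The NMF-based handling of the sparse tag-image association matrix can be illustrated with a generic low-rank factorization; the sketch below uses scikit-learn's standard NMF on a randomly generated toy matrix and is not the authors' implementation:

    ```python
    import numpy as np
    from sklearn.decomposition import NMF

    rng = np.random.default_rng(0)

    # Toy sparse tag-image association matrix (rows: images, cols: tags), values in {0, 1}
    TIAM = (rng.random((200, 50)) < 0.03).astype(float)

    # Low-rank factorization smooths the sparse associations before estimating co-occurrences
    model = NMF(n_components=10, init="nndsvda", max_iter=500, random_state=0)
    W = model.fit_transform(TIAM)          # image factors
    H = model.components_                  # tag factors

    TIAM_smoothed = W @ H                  # dense, smoothed association scores

    # Tag-to-tag co-occurrence estimated from the smoothed matrix, normalized per tag
    cooc = TIAM_smoothed.T @ TIAM_smoothed
    cooc_prob = cooc / (cooc.sum(axis=1, keepdims=True) + 1e-12)
    print(cooc_prob.shape)
    ```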

  5. An empirical model for probabilistic decadal prediction: A global analysis

    NASA Astrophysics Data System (ADS)

    Suckling, Emma; Hawkins, Ed; Eden, Jonathan; van Oldenborgh, Geert Jan

    2016-04-01

    Empirical models, designed to predict land-based surface variables over seasons to decades ahead, provide useful benchmarks for comparison against the performance of dynamical forecast systems; they may also be employable as predictive tools for use by climate services in their own right. A new global empirical decadal prediction system is presented, based on a multiple linear regression approach designed to produce probabilistic output for comparison against dynamical models. Its performance is evaluated for surface air temperature over a set of historical hindcast experiments under a series of different prediction `modes'. The modes include a real-time setting, a scenario in which future volcanic forcings are prescribed during the hindcasts, and an approach which exploits knowledge of the forced trend. A two-tier prediction system, which uses knowledge of future sea surface temperatures in the Pacific and Atlantic Oceans, is also tested, but within a perfect knowledge framework. Each mode is designed to identify sources of predictability and uncertainty, as well as investigate different approaches to the design of decadal prediction systems for operational use. It is found that the empirical model shows skill above that of persistence hindcasts for annual means at lead times of up to ten years ahead in all of the prediction modes investigated. Small improvements in skill are found at all lead times when including future volcanic forcings in the hindcasts. It is also suggested that hindcasts which exploit full knowledge of the forced trend due to increasing greenhouse gases throughout the hindcast period can provide more robust estimates of model bias for the calibration of the empirical model in an operational setting. The two-tier system shows potential for improved real-time prediction, given the assumption that skilful predictions of large-scale modes of variability are available. The empirical model framework has been designed with enough flexibility to
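
    As a rough illustration of the regression-based probabilistic setup (not the published system), the sketch below fits a single-predictor linear regression to synthetic annual temperature anomalies and forms a Gaussian predictive distribution from the residual spread; all data and parameters are invented:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Synthetic annual-mean temperature anomalies: forced trend plus internal variability
    years = np.arange(1960, 2011)
    forced = 0.02 * (years - years[0])                 # hypothetical forced-trend predictor
    temps = forced + rng.normal(scale=0.15, size=years.size)

    # Multiple linear regression (reduced here to a single forced-trend predictor for brevity)
    X = np.column_stack([np.ones_like(forced), forced])
    beta, *_ = np.linalg.lstsq(X, temps, rcond=None)
    resid_sd = np.std(temps - X @ beta, ddof=X.shape[1])

    # Probabilistic hindcast for a lead time: mean from the regression, spread from residuals
    forced_future = 0.02 * (2015 - years[0])
    mean_fc = beta[0] + beta[1] * forced_future
    print(f"forecast ~ N({mean_fc:.2f}, {resid_sd:.2f}^2)")
    ```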

  6. Some Proposed Modifications to the 1996 California Probabilistic Hazard Model

    NASA Astrophysics Data System (ADS)

    Cao, T.; Bryant, W. A.; Rowshandel, B.; Toppozada, T.; Reichle, M. S.; Petersen, M. D.; Frankel, A. D.

    2001-12-01

    The California Department of Conservation, Division of Mines and Geology, and the U. S. Geological Survey are working on the revision of the 1996 California probabilistic hazard model. Since the release of this hazard model, new seismological and geological studies and observations in this area have provided the basis for the revision. Important considerations of model modifications include the following: 1. using a new bilinear fault area-magnitude relation to replace the Wells and Coppersmith (1994) relation for M greater than or equal to 7.0; 2. using the Gaussian function to replace the Dirac delta function for characteristic magnitude; 3. updating the earthquake catalog with the new M greater than or equal to 5.5 catalog from 1800 to 1999 by Toppozada et al. (2000) and the Berkeley and Caltech catalogs for 1996-2001; 4. balancing the moment release for some major A type faults; 5. adding the Abrahamson and Silva attenuation relation with a new hanging wall term; 6. considering different ratios between characteristic and Gutenberg-Richter magnitude-frequency distributions other than 50 percent and 50 percent; 7. using a Monte Carlo method to sample the logic tree to produce an uncertainty map of the coefficient of variation (COV); 8. separating background seismicity in the vicinity of faults from other areas for a different smoothing process or no smoothing at all, especially for the creeping section of the San Andreas fault and the Brawley seismic zone; 9. using near-fault variability of attenuation relations to mimic directivity; 10. modifying slip rates for the Concord-Green Valley, Sierra Madre, and Raymond faults, and adding or modifying blind thrust faults mainly in the Los Angeles Basin. These possible changes were selected with input received during several workshops that included participation of geologists and seismologists familiar with the area of concern. With the above revisions and other changes, we expect that the new model should not differ greatly from the

  7. A Survey of Probabilistic Models for Relational Data

    SciTech Connect

    Koutsourelakis, P S

    2006-10-13

    Traditional data mining methodologies have focused on "flat" data, i.e., a collection of identically structured entities, assumed to be independent and identically distributed. However, many real-world datasets are innately relational in that they consist of multi-modal entities and multi-relational links (where each entity- or link-type is characterized by a different set of attributes). Link structure is an important characteristic of a dataset and should not be ignored in modeling efforts, especially when statistical dependencies exist between related entities. These dependencies can in fact significantly improve the accuracy of inference and prediction results, if the relational structure is appropriately leveraged (Figure 1). The need for models that can incorporate relational structure has been accentuated by new technological developments which allow us to easily track, store, and make accessible large amounts of data. Recently, there has been a surge of interest in statistical models for dealing with richly interconnected, heterogeneous data, fueled largely by information mining of web/hypertext data, social networks, bibliographic citation data, epidemiological data and communication networks. Graphical models have a natural formalism for representing complex relational data and for predicting the underlying evolving system in a dynamic framework. The present survey provides an overview of probabilistic methods and techniques that have been developed over the last few years for dealing with relational data. Particular emphasis is paid to approaches pertinent to the research areas of pattern recognition, group discovery, entity/node classification, and anomaly detection. We start with supervised learning tasks, where two basic modeling approaches are discussed, i.e., discriminative and generative. Several discriminative techniques are reviewed and performance results are presented. Generative methods are discussed in a separate survey. A special section is

  8. Proposal for a probabilistic local level landslide hazard assessment model: The case of Suluktu, Kyrgyzstan

    NASA Astrophysics Data System (ADS)

    Vidar Vangelsten, Bjørn; Fornes, Petter; Cepeda, Jose Mauricio; Ekseth, Kristine Helene; Eidsvig, Unni; Ormukov, Cholponbek

    2015-04-01

    Landslides are a significant threat to human life and the built environment in many parts of Central Asia. To improve understanding of the magnitude of the threat and propose appropriate risk mitigation measures, landslide hazard mapping is needed both at regional and local level. Many different approaches for landslide hazard mapping exist depending on the scale and purpose of the analysis and what input data are available. This paper presents a probabilistic local scale landslide hazard mapping methodology for rainfall triggered landslides, adapted to the relatively dry climate found in South-Western Kyrgyzstan. The GIS based approach makes use of data on topography, geology, land use and soil characteristics to assess landslide susceptibility. Together with a selected rainfall scenario, these data are inserted into a triggering model based on an infinite slope formulation considering pore pressure and suction effects for unsaturated soils. A statistical model based on local landslide data has been developed to estimate landslide run-out. The model links the spatial extension of the landslide to land use and geological features. The model is tested and validated for the town of Suluktu in the Ferghana Valley in South-West Kyrgyzstan. Landslide hazard is estimated for the urban area and the surrounding hillsides. The case makes use of a range of data from different sources, both remote sensing data and in-situ data. Public global data sources are mixed with case specific data obtained from field work. The different data and models have various degrees of uncertainty. To account for this, the hazard model has been inserted into a Monte Carlo simulation framework to produce a probabilistic landslide hazard map identifying areas with high landslide exposure. The research leading to these results has received funding from the European Commission's Seventh Framework Programme [FP7/2007-2013], under grant agreement n° 312972 "Framework to integrate Space-based and in

  9. Dynamic modeling of physical phenomena for probabilistic assessment of spent fuel accidents

    SciTech Connect

    Benjamin, A.S.

    1997-11-01

    If there should be an accident involving drainage of all the water from a spent fuel pool, the fuel elements will heat up until the heat produced by radioactive decay is balanced by that removed by natural convection to air, thermal radiation, and other means. If the temperatures become high enough for the cladding or other materials to ignite due to rapid oxidation, then some of the fuel might melt, leading to an undesirable release of radioactive materials. The amount of melting is dependent upon the fuel loading configuration and its age, the oxidation and melting characteristics of the materials, and the potential effectiveness of recovery actions. The authors have developed methods for modeling the pertinent physical phenomena and integrating the results with a probabilistic treatment of the uncertainty distributions. The net result is a set of complementary cumulative distribution functions for the amount of fuel melted.

  10. Probabilistic drug connectivity mapping

    PubMed Central

    2014-01-01

    Background The aim of connectivity mapping is to match drugs using drug-treatment gene expression profiles from multiple cell lines. This can be viewed as an information retrieval task, with the goal of finding the most relevant profiles for a given query drug. We infer the relevance for retrieval by data-driven probabilistic modeling of the drug responses, resulting in probabilistic connectivity mapping, and further consider the available cell lines as different data sources. We use a special type of probabilistic model to separate what is shared and specific between the sources, in contrast to earlier connectivity mapping methods that have intentionally aggregated all available data, neglecting information about the differences between the cell lines. Results We show that the probabilistic multi-source connectivity mapping method is superior to alternatives in finding functionally and chemically similar drugs from the Connectivity Map data set. We also demonstrate that an extension of the method is capable of retrieving combinations of drugs that match different relevant parts of the query drug response profile. Conclusions The probabilistic modeling-based connectivity mapping method provides a promising alternative to earlier methods. Principled integration of data from different cell lines helps to identify relevant responses for specific drug repositioning applications. PMID:24742351

  11. Building a Probabilistic Denitrification Model for an Oregon Salt Marsh

    NASA Astrophysics Data System (ADS)

    Moon, J. B.; Stecher, H. A.; DeWitt, T.; Nahlik, A.; Regutti, R.; Michael, L.; Fennessy, M. S.; Brown, L.; Mckane, R.; Naithani, K. J.

    2015-12-01

    Despite abundant work starting in the 1950s on the drivers of denitrification (DeN), mechanistic complexity and methodological challenges of direct DeN measurements have resulted in a lack of reliable rate estimates across landscapes, and a lack of operationally valid, robust models. Measuring and modeling DeN are particularly challenging in tidal systems, which play a vital role in buffering adjacent coastal waters from nitrogen inputs. These systems are hydrologically and biogeochemically complex, varying on fine temporal and spatial scales. We assessed the spatial and temporal variability of soil nitrate (NO3-) levels and O2 availability, two primary drivers of DeN, in surface soils of Winant salt marsh, located in Yaquina estuary, OR, during the summers of 2013 and 2014. We found low temporal variability in soil NO3- concentrations across years, tide series, and tide cycles, but high spatial variability linked to elevation gradients (i.e., habitat types); spatial variability within the high marsh habitat (0 - 68 μg N g-1 dry soil) was correlated with distance to major tide creek channels and connectivity to upslope N-fixing red alder. Soil O2 measurements collected at 5 cm below ground across three locations on two spring tide series showed that O2 drawdown rates were also spatially variable. Depending on the marsh location, O2 drawdown ranged from sub-optimal for DeN (> 80 % O2 saturation) across an entire tide series (i.e., across days) to optimal (i.e., ~ 0 % O2 saturation) within one overtopping tide event (i.e., within hours). We are using these results, along with empirical relationships created between DeN and soil NO3- concentrations for Winant, to improve on a pre-existing tidal DeN model. We will develop the first version of a fully probabilistic hierarchical Bayesian tidal DeN model to quantify parameter and prediction uncertainties, which are as important as determining mean predictions in order to distinguish measurable differences across the marsh.

  12. A probabilistic graphical model approach to stochastic multiscale partial differential equations

    SciTech Connect

    Wan, Jiang; Zabaras, Nicholas; Center for Applied Mathematics, Cornell University, 657 Frank H.T. Rhodes Hall, Ithaca, NY 14853

    2013-10-01

    We develop a probabilistic graphical model based methodology to efficiently perform uncertainty quantification in the presence of both stochastic input and multiple scales. Both the stochastic input and model responses are treated as random variables in this framework. Their relationships are modeled by graphical models which give explicit factorization of a high-dimensional joint probability distribution. The hyperparameters in the probabilistic model are learned using sequential Monte Carlo (SMC) method, which is superior to standard Markov chain Monte Carlo (MCMC) methods for multi-modal distributions. Finally, we make predictions from the probabilistic graphical model using the belief propagation algorithm. Numerical examples are presented to show the accuracy and efficiency of the predictive capability of the developed graphical model.

  13. Age-Associated Alterations in Corpus Callosum White Matter Integrity in Bipolar Disorder Assessed Using Probabilistic Tractography

    PubMed Central

    Toteja, Nitin; Cokol, Perihan Guvenek; Ikuta, Toshikazu; Kafantaris, Vivian; Peters, Bart D.; Burdick, Katherine E.; John, Majnu; Malhotra, Anil K.; Szeszko, Philip R.

    2014-01-01

    Objectives Atypical age-associated changes in white matter integrity may play a role in the neurobiology of bipolar disorder, but no studies have examined the major white matter tracts using nonlinear statistical modeling across a wide age range in this disorder. The goal of this study was to identify possible deviations in the typical pattern of age-associated changes in white matter integrity in patients with bipolar disorder across the age range of 9 to 62 years. Methods Diffusion tensor imaging was performed in 57 (20M/37F) patients with a diagnosis of bipolar disorder and 57 (20M/37F) age- and sex-matched healthy volunteers. Mean diffusivity and fractional anisotropy were computed for the genu and splenium of the corpus callosum, two projection tracts, and five association tracts using probabilistic tractography. Results Overall, patients had lower fractional anisotropy and higher mean diffusivity compared to healthy volunteers across all tracts (while controlling for the effects of age and age2). In addition, there were greater age-associated increases in mean diffusivity in patients compared to healthy volunteers within the genu and splenium of the corpus callosum beginning in the second and third decades of life. Conclusions Our findings provide evidence for alterations in the typical pattern of white matter development in patients with bipolar disorder compared to healthy volunteers. Changes in white matter development within the corpus callosum may lead to altered inter-hemispheric communication that is considered integral to the neurobiology of the disorder. PMID:25532972

  14. Probabilistic modeling of flood characterizations with parametric and minimum information pair-copula model

    NASA Astrophysics Data System (ADS)

    Daneshkhah, Alireza; Remesan, Renji; Chatrabgoun, Omid; Holman, Ian P.

    2016-09-01

    This paper highlights the usefulness of the minimum information and parametric pair-copula construction (PCC) to model the joint distribution of flood event properties. Both of these models outperform other standard multivariate copulas in modeling multivariate flood data that exhibit complex patterns of dependence, particularly in the tails. In particular, the minimum information pair-copula model shows greater flexibility and produces a better approximation of the joint probability density, and the corresponding measures support effective hazard assessments. The study demonstrates that any multivariate density can be approximated to any desired degree of precision using the minimum information pair-copula model, which can be practically used for probabilistic flood hazard assessment.

  15. A Practical Probabilistic Graphical Modeling Tool for Weighing Ecological Risk-Based Evidence

    EPA Science Inventory

    Past weight-of-evidence frameworks for adverse ecological effects have provided soft-scoring procedures for judgments based on the quality and measured attributes of evidence. Here, we provide a flexible probabilistic structure for weighing and integrating lines of evidence for e...

  16. Multidimensional analysis and probabilistic model of volcanic and seismic activities

    NASA Astrophysics Data System (ADS)

    Fedorov, V.

    2009-04-01

    .I. Gushchenko, 1979) and seismological (database of USGS/NEIC Significant Worldwide Earthquakes, 2150 B.C.- 1994 A.D.) information which displays the dynamics of endogenic relief-forming processes over the period 1900 to 1994. In the course of the analysis, a substitution of the calendar variable by a corresponding astronomical one was performed and the epoch superposition method was applied. In essence, the method consists in differentiating the bodies of information on volcanic eruptions (over the period 1900 to 1977) and seismic events (1900-1994) with respect to the values of astronomical parameters which correspond to the calendar dates of the known eruptions and earthquakes, regardless of the calendar year. The obtained spectra of volcanic eruption and violent earthquake distribution in the fields of the Earth's orbital movement parameters were used as a basis for calculating frequency spectra and the diurnal probability of volcanic and seismic activity. The objective of the proposed investigations is the development of a probabilistic model of volcanic and seismic events, as well as the design of a GIS for monitoring and forecasting volcanic and seismic activities. In accordance with the stated objective, three probability parameters have been found in the course of preliminary studies; they form the basis for GIS-monitoring and forecast development. 1. A multidimensional analysis of volcanic eruptions and earthquakes (of magnitude 7) has been performed in terms of the Earth's orbital movement. Probability characteristics of volcanism and seismicity have been defined for the Earth as a whole. Time intervals have been identified with a diurnal probability twice as great as the mean value. The diurnal probability of volcanic and seismic events has been calculated up to 2020. 2. A regularity in the duration of dormant (repose) periods has been established. A relationship has been found between the distribution of the repose period probability density and the duration of the period. 3

  17. Probabilistic finite element analysis of a craniofacial finite element model.

    PubMed

    Berthaume, Michael A; Dechow, Paul C; Iriarte-Diaz, Jose; Ross, Callum F; Strait, David S; Wang, Qian; Grosse, Ian R

    2012-05-01

    We employed a probabilistic finite element analysis (FEA) method to determine how variability in material property values affects stress and strain values in a finite element model of a Macaca fascicularis cranium. The material behavior of cortical bone varied in three ways: isotropic homogeneous, isotropic non-homogeneous, and orthotropic non-homogeneous. The material behavior of the trabecular bone and teeth was always treated as isotropic and homogeneous. All material property values for the cranium were randomized with a Gaussian distribution with either coefficients of variation (CVs) of 0.2 or with CVs calculated from empirical data. Latin hypercube sampling was used to determine the values of the material properties used in the finite element models. In total, four hundred and twenty-six separate deterministic FE simulations were executed. We tested four hypotheses in this study: (1) uncertainty in material property values will have an insignificant effect on high stresses and a significant effect on high strains for homogeneous isotropic models; (2) the effect of variability in material property values on the stress state will increase as non-homogeneity and anisotropy increase; (3) variation in the in vivo shear strain values reported by Strait et al. (2005) and Ross et al. (2011) is not only due to variations in muscle forces and cranial morphology, but also due to variation in material property values; (4) the assumption of a uniform coefficient of variation for the material property values will result in the same trend in how moderate-to-high stresses and moderate-to-high strains vary with respect to the degree of non-homogeneity and anisotropy as the trend found when the coefficients of variation for material property values are calculated from empirical data. Our results supported the first three hypotheses and falsified the fourth. When material properties were varied with a constant CV, as non-homogeneity and anisotropy increased the level of variability in
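
    The Latin hypercube sampling of material properties with a uniform coefficient of variation can be sketched as follows; the moduli, CV and sample size are placeholders, not the values used in the study:

    ```python
    import numpy as np
    from scipy import stats
    from scipy.stats import qmc

    # Hypothetical mean elastic moduli (GPa) for cortical bone, trabecular bone, and teeth
    means = np.array([17.0, 0.5, 80.0])
    cv = 0.2                                   # uniform coefficient of variation, one scenario

    # Latin hypercube sample of the three material properties
    sampler = qmc.LatinHypercube(d=3, seed=0)
    u = sampler.random(n=100)                  # 100 design points in [0, 1)^3

    # Map the uniform samples to Gaussian marginals with sd = CV * mean
    samples = stats.norm.ppf(u, loc=means, scale=cv * means)

    # Each row would parameterize one deterministic finite element run
    print(samples[:3].round(2))
    ```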

  18. Reasoning in Reference Games: Individual- vs. Population-Level Probabilistic Modeling.

    PubMed

    Franke, Michael; Degen, Judith

    2016-01-01

    Recent advances in probabilistic pragmatics have achieved considerable success in modeling speakers' and listeners' pragmatic reasoning as probabilistic inference. However, these models are usually applied to population-level data, and so implicitly suggest a homogeneous population without individual differences. Here we investigate potential individual differences in Theory-of-Mind related depth of pragmatic reasoning in so-called reference games that require drawing ad hoc Quantity implicatures of varying complexity. We show by Bayesian model comparison that a model that assumes a heterogeneous population is a better predictor of our data, especially for comprehension. We discuss the implications for the treatment of individual differences in probabilistic models of language use. PMID:27149675

  19. Reasoning in Reference Games: Individual- vs. Population-Level Probabilistic Modeling

    PubMed Central

    Franke, Michael; Degen, Judith

    2016-01-01

    Recent advances in probabilistic pragmatics have achieved considerable success in modeling speakers’ and listeners’ pragmatic reasoning as probabilistic inference. However, these models are usually applied to population-level data, and so implicitly suggest a homogeneous population without individual differences. Here we investigate potential individual differences in Theory-of-Mind related depth of pragmatic reasoning in so-called reference games that require drawing ad hoc Quantity implicatures of varying complexity. We show by Bayesian model comparison that a model that assumes a heterogeneous population is a better predictor of our data, especially for comprehension. We discuss the implications for the treatment of individual differences in probabilistic models of language use. PMID:27149675

  20. Probabilistic Computational Methods in Structural Failure Analysis

    NASA Astrophysics Data System (ADS)

    Krejsa, Martin; Kralik, Juraj

    2015-12-01

    Probabilistic methods are used in engineering where a computational model contains random variables. Each random variable in the probabilistic calculations contains uncertainties. Typical sources of uncertainties are properties of the material and production and/or assembly inaccuracies in the geometry or the environment where the structure should be located. The paper is focused on methods for the calculation of failure probabilities in structural failure and reliability analysis, with special attention to the newly developed probabilistic method Direct Optimized Probabilistic Calculation (DOProC), which is highly efficient in terms of calculation time and the accuracy of the solution. The novelty of the proposed method lies in an optimized numerical integration that does not require any simulation technique. The algorithm has been implemented in the mentioned software applications, and has been used several times in probabilistic tasks and probabilistic reliability assessments.
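
    DOProC itself is not reproduced here, but the idea of obtaining a failure probability by direct numerical integration rather than simulation can be illustrated with the classical resistance-load formulation P_f = ∫ f_S(s) F_R(s) ds; all distribution parameters below are hypothetical:

    ```python
    import numpy as np
    from scipy import stats

    # Hypothetical resistance R and load effect S, both normal (values illustrative)
    R = stats.norm(loc=300.0, scale=30.0)   # resistance, e.g. MPa
    S = stats.norm(loc=200.0, scale=40.0)   # load effect

    # Failure probability P(R < S) by direct numerical integration, no simulation:
    # P_f = integral over s of f_S(s) * F_R(s) ds
    s = np.linspace(0.0, 600.0, 20001)
    p_f = np.trapz(S.pdf(s) * R.cdf(s), s)

    # Cross-check against the closed-form solution for two normal variables
    p_exact = stats.norm.cdf(-(300.0 - 200.0) / np.hypot(30.0, 40.0))
    print(f"numerical {p_f:.3e}  analytical {p_exact:.3e}")
    ```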

  1. The probabilistic seismic loss model as a tool for portfolio management: the case of Maghreb.

    NASA Astrophysics Data System (ADS)

    Pousse, Guillaume; Lorenzo, Francisco; Stejskal, Vladimir

    2010-05-01

    Although the property insurance market in Maghreb countries does not systematically purchase earthquake cover, Impact Forecasting is developing a new loss model for the calculation of probabilistic seismic risk. A probabilistic methodology using Monte Carlo simulation was applied to generate the hazard component of the model. Then, a set of damage functions is used to convert the modelled ground motion severity into monetary losses. We aim to highlight risk assessment challenges, especially in countries where reliable data are difficult to obtain. The loss model estimates the risk and allows further risk transfer strategies to be discussed.
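
    A toy Monte Carlo loss calculation in the same spirit (event frequency, ground motion, damage function, monetary loss) might look like the sketch below; the Poisson rate, lognormal ground-motion parameters, damage function and portfolio value are all invented for illustration:

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    n_sim = 50_000                      # simulated years
    total_value = 50e6                  # hypothetical insured portfolio value

    # Hypothetical hazard: annual count of damaging events is Poisson, site intensity is lognormal
    n_events = rng.poisson(lam=0.2, size=n_sim)

    def damage_ratio(pga):
        """Toy damage function mapping ground motion (g) to mean damage ratio."""
        return np.clip((pga - 0.05) / 0.6, 0.0, 1.0)

    annual_loss = np.zeros(n_sim)
    for i in range(n_sim):
        if n_events[i] == 0:
            continue
        pga = rng.lognormal(mean=np.log(0.1), sigma=0.6, size=n_events[i])
        annual_loss[i] = min(np.sum(damage_ratio(pga)) * total_value, total_value)

    # Average annual loss and the loss exceeded on average once in 250 years
    print(f"AAL = {annual_loss.mean():,.0f}, 250-yr loss = {np.quantile(annual_loss, 1 - 1/250):,.0f}")
    ```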

  2. Probabilistic model for pressure vessel reliability incorporating fracture mechanics and nondestructive examination

    SciTech Connect

    Tow, D.M.; Reuter, W.G.

    1998-03-01

    A probabilistic model has been developed for predicting the reliability of structures based on fracture mechanics and the results of nondestructive examination (NDE). The distinctive feature of this model is the way in which inspection results and the probability of detection (POD) curve are used to calculate a probability density function (PDF) for the number of flaws and the distribution of those flaws among the various size ranges. In combination with a probabilistic fracture mechanics model, this density function is used to estimate the probability of failure (POF) of a structure in which flaws have been detected by NDE. The model is useful for parametric studies of inspection techniques and material characteristics.

  3. SCEC/CME CyberShake: Probabilistic Seismic Hazard Analysis Using 3D Seismic Waveform Modeling

    NASA Astrophysics Data System (ADS)

    Callaghan, S.; Maechling, P. J.; Cui, Y.; Faerman, M.; Field, E.; Graves, R.; Gupta, N.; Gupta, V.; Jordan, T. H.; Kesselman, C.; Mehta, G.; Okaya, D.; Vahi, K.; Zhao, L.

    2005-12-01

    Researchers on the SCEC Community Modeling Environment (SCEC/CME) Project are calculating Probabilistic Seismic Hazard Curves for several sites in the Los Angeles area. The hazard curves calculated in this study use Intensity Measure Relationships (IMRs) based on 3D ground motion simulations rather than on attenuation relationships. State-of-the-art Probabilistic Seismic Hazard Analysis (PSHA) is currently conducted using IMRs that use empirically-based attenuation relationships. These attenuation relationships represent relatively simple analytical models based on the regression of observed data. However, it is widely believed that significant improvements in SHA will rely on the use of more physics-based, waveform modeling. In fact, a more physics-based approach to PSHA was endorsed in a recent assessment of earthquake science by National Research Council (2003). In order to introduce the use of 3D seismic waveform modeling into PSHA hazard curve calculations, the SCEC/CME CyberShake group is integrating state-of-the-art PSHA software tools (OpenSHA), SCEC-developed geophysical models (SCEC CVM3.0), validated anelastic wave modeling (AWM) software, and state-of-the-art computational technologies including high performance computing and grid-based scientific workflows in an effort to develop an OpenSHA-compatible 3D waveform-based IMR component. This will allow researchers to combine a new class of waveform-based IMRs with the large number of existing PSHA components, such as Earthquake Rupture Forecasts (ERF's), that are currently implemented in the OpenSHA system. To calculate a probabilistic hazard curve for a site of interest, we use the OpenSHA implementation of the NSHMP-2002 ERF and identify all ruptures within 200km of the site of interest. For each of these ruptures, we convert the NSHMP-2002 rupture definition into one, or more, Ruptures with Slip Time History (Rupture Variations) using newly developed Rupture Generator software. Strain Green Tensors are

  4. Development of Simplified Probabilistic Risk Assessment Model for Seismic Initiating Event

    SciTech Connect

    S. Khericha; R. Buell; S. Sancaktar; M. Gonzalez; F. Ferrante

    2012-06-01

    This paper discusses a simplified method to evaluate seismic risk using a methodology built on dividing the seismic intensity spectrum into multiple discrete bins. The seismic probabilistic risk assessment model uses the Nuclear Regulatory Commission's (NRC's) full power Standardized Plant Analysis Risk (SPAR) model as the starting point for development. The seismic PRA models are integrated with their respective internal events at-power SPAR models. This is accomplished by combining the modified system fault trees from the full power SPAR model with seismic event tree logic. The peak ground acceleration is divided into five bins. The g-value for each bin is estimated using the geometric mean of the lower and upper values of that particular bin, and the associated frequency for each bin is estimated by taking the difference between the upper and lower values of that bin. The component fragilities are calculated for each bin using plant data, if available, or generic values of median peak ground acceleration and uncertainty values for the components. For human reliability analysis (HRA), the SPAR HRA (SPAR-H) method is used, which requires the analysts to complete relatively straightforward worksheets that include the performance shaping factors (PSFs). The results are then used to estimate human error probabilities (HEPs) of interest. This work is expected to improve the NRC's ability to include seismic hazards in risk assessments for operational events in support of the reactor oversight program (e.g., significance determination process).
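
    The binning arithmetic described above (geometric-mean g-value per bin, bin frequency from the difference of frequencies at its bounds, lognormal component fragility) can be sketched as follows, using hypothetical hazard-curve and fragility numbers rather than the SPAR model values:

    ```python
    import numpy as np
    from scipy import stats

    # Hypothetical PGA bin boundaries (g) and their annual exceedance frequencies
    bounds = np.array([0.1, 0.2, 0.4, 0.6, 0.8, 1.0])
    exceed_freq = np.array([1e-3, 4e-4, 1e-4, 4e-5, 2e-5, 1e-5])

    # Representative g-value per bin: geometric mean of the lower and upper bounds
    g_bin = np.sqrt(bounds[:-1] * bounds[1:])

    # Annual frequency of each bin: difference of the exceedance frequencies at its bounds
    freq_bin = exceed_freq[:-1] - exceed_freq[1:]

    # Generic component fragility: lognormal with median capacity Am and log-standard deviation beta
    Am, beta = 0.7, 0.4
    p_fail = stats.norm.cdf(np.log(g_bin / Am) / beta)

    # Bin-wise contributions to the annual frequency of seismically induced component failure
    print(np.column_stack([g_bin, freq_bin, p_fail]).round(5))
    ```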

  5. A Probabilistic Risk Analysis (PRA) of Human Space Missions for the Advanced Integration Matrix (AIM)

    NASA Technical Reports Server (NTRS)

    Jones, Harry W.; Dillon-Merrill, Robin L.; Thomas, Gretchen A.

    2003-01-01

    The Advanced Integration Matrix (AIM) Project will study and solve systems-level integration issues for exploration missions beyond Low Earth Orbit (LEO), through the design and development of a ground-based facility for developing revolutionary integrated systems for joint human-robotic missions. This paper describes a Probabilistic Risk Analysis (PRA) of human space missions that was developed to help define the direction and priorities for AIM. Risk analysis is required for all major NASA programs and has been used for shuttle, station, and Mars lander programs. It is a prescribed part of early planning and is necessary during concept definition, even before mission scenarios and system designs exist. PRA can begin when little failure data are available, and be continually updated and refined as detail becomes available. PRA provides a basis for examining tradeoffs among safety, reliability, performance, and cost. The objective of AIM's PRA is to indicate how risk can be managed and future human space missions enabled by the AIM Project. Many critical events can cause injuries and fatalities to the crew without causing loss of vehicle or mission. Some critical systems are beyond AIM's scope, such as propulsion and guidance. Many failure-causing events can be mitigated by conducting operational tests in AIM, such as testing equipment and evaluating operational procedures, especially in the areas of communications and computers, autonomous operations, life support, thermal design, EVA and rover activities, physiological factors including habitation, medical equipment, and food, and multifunctional tools and repairable systems. AIM is well suited to test and demonstrate the habitat, life support, crew operations, and human interface. Because these account for significant crew, systems performance, and science risks, AIM will help reduce mission risk, and missions beyond LEO are far enough in the future that AIM can have significant impact.

  6. Conditional Reasoning in Context: A Dual-Source Model of Probabilistic Inference

    ERIC Educational Resources Information Center

    Klauer, Karl Christoph; Beller, Sieghard; Hutter, Mandy

    2010-01-01

    A dual-source model of probabilistic conditional inference is proposed. According to the model, inferences are based on 2 sources of evidence: logical form and prior knowledge. Logical form is a decontextualized source of evidence, whereas prior knowledge is activated by the contents of the conditional rule. In Experiments 1 to 3, manipulations of…

  7. A PROBABILISTIC POPULATION EXPOSURE MODEL FOR PM10 AND PM 2.5

    EPA Science Inventory

    A first generation probabilistic population exposure model for Particulate Matter (PM), specifically for predicting PM10 and PM2.5 exposures of an urban population, has been developed. This model is intended to be used to predict exposure (magnitude, frequency, and duration) ...

  8. Model initialisation, data assimilation and probabilistic flood forecasting for distributed hydrological models

    NASA Astrophysics Data System (ADS)

    Cole, S. J.; Robson, A. J.; Bell, V. A.; Moore, R. J.

    2009-04-01

    The hydrological forecasting component of the Natural Environment Research Council's FREE (Flood Risk from Extreme Events) project "Exploitation of new data sources, data assimilation and ensemble techniques for storm and flood forecasting" addresses the initialisation, data assimilation and uncertainty of hydrological flood models utilising advances in rainfall estimation and forecasting. Progress will be reported on the development and assessment of simple model-initialisation and state-correction methods for a distributed grid-based hydrological model, the G2G Model. The potential of the G2G Model for area-wide flood forecasting is demonstrated through a nationwide application across England and Wales. Probabilistic flood forecasting in spatial form is illustrated through the use of high-resolution NWP rainfalls, and pseudo-ensemble forms of these, as input to the G2G Model. The G2G Model is configured over a large area of South West England and the Boscastle storm of 16 August 2004 is used as a convective case study. Visualisation of probabilistic flood forecasts is achieved through risk maps of flood threshold exceedence that indicate the space-time evolution of flood risk during the event.

  9. Monthly water balance modeling: Probabilistic, possibilistic and hybrid methods for model combination and ensemble simulation

    NASA Astrophysics Data System (ADS)

    Nasseri, M.; Zahraie, B.; Ajami, N. K.; Solomatine, D. P.

    2014-04-01

    Multi-model (ensemble, or committee) techniques have been shown to be an effective way to improve hydrological prediction performance and provide uncertainty information. This paper presents two novel multi-model ensemble techniques, one probabilistic, the Modified Bootstrap Ensemble Model (MBEM), and one possibilistic, the FUzzy C-means Ensemble based on data Pattern (FUCEP). The paper also explores utilization of the Ordinary Kriging (OK) method as a multi-model combination scheme for hydrological simulation/prediction. These techniques are compared against Bayesian Model Averaging (BMA) and Weighted Average (WA) methods to demonstrate their effectiveness. The techniques are applied to three monthly water balance models used to generate streamflow simulations for two mountainous basins in the south-west of Iran. For both basins, the results demonstrate that MBEM and FUCEP generate more skillful and reliable probabilistic predictions, outperforming all the other techniques. We also found that OK did not demonstrate any improved skill as a simple combination method over the WA scheme for either of the basins.
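
    None of MBEM, FUCEP or the kriging combination is reproduced here, but the baseline Weighted Average (WA) combination they are compared against can be sketched with inverse-MSE weights on synthetic simulations; the data and weighting rule are illustrative assumptions:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # Synthetic monthly streamflow observations and three model simulations of them
    obs = rng.gamma(shape=2.0, scale=10.0, size=120)
    sims = np.stack([obs + rng.normal(0, s, size=obs.size) for s in (3.0, 5.0, 8.0)])

    # Weighted-average combination: weights inversely proportional to calibration-period MSE
    mse = np.mean((sims - obs) ** 2, axis=1)
    weights = (1.0 / mse) / np.sum(1.0 / mse)
    combined = weights @ sims

    print("weights:", weights.round(2), "combined MSE:", np.mean((combined - obs) ** 2).round(2))
    ```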

  10. The LISA Integrated Model

    NASA Technical Reports Server (NTRS)

    Merkowitz, Stephen M.

    2002-01-01

    The Laser Interferometer Space Antenna (LISA) space mission has unique needs that argue for an aggressive modeling effort. These models ultimately need to forecast and interrelate the behavior of the science input, structure, optics, control systems, and many other factors that affect the performance of the flight hardware. In addition, many components of these integrated models will also be used separately for the evaluation and investigation of design choices, technology development and integration and test. This article presents an overview of the LISA integrated modeling effort.

  11. Probabilistic Models and Generative Neural Networks: Towards an Unified Framework for Modeling Normal and Impaired Neurocognitive Functions

    PubMed Central

    Testolin, Alberto; Zorzi, Marco

    2016-01-01

    Connectionist models can be characterized within the more general framework of probabilistic graphical models, which allow one to efficiently describe complex statistical distributions involving a large number of interacting variables. This integration allows building more realistic computational models of cognitive functions, which more faithfully reflect the underlying neural mechanisms while at the same time providing a useful bridge to higher-level descriptions in terms of Bayesian computations. Here we discuss a powerful class of graphical models that can be implemented as stochastic, generative neural networks. These models overcome many limitations associated with classic connectionist models, for example by exploiting unsupervised learning in hierarchical architectures (deep networks) and by taking into account top-down, predictive processing supported by feedback loops. We review some recent cognitive models based on generative networks, and we point out promising research directions to investigate neuropsychological disorders within this approach. Though further efforts are required in order to fill the gap between structured Bayesian models and more realistic, biophysical models of neuronal dynamics, we argue that generative neural networks have the potential to bridge these levels of analysis, thereby improving our understanding of the neural bases of cognition and of pathologies caused by brain damage. PMID:27468262

  12. Probabilistic Models and Generative Neural Networks: Towards an Unified Framework for Modeling Normal and Impaired Neurocognitive Functions.

    PubMed

    Testolin, Alberto; Zorzi, Marco

    2016-01-01

    Connectionist models can be characterized within the more general framework of probabilistic graphical models, which allow one to efficiently describe complex statistical distributions involving a large number of interacting variables. This integration allows building more realistic computational models of cognitive functions, which more faithfully reflect the underlying neural mechanisms while at the same time providing a useful bridge to higher-level descriptions in terms of Bayesian computations. Here we discuss a powerful class of graphical models that can be implemented as stochastic, generative neural networks. These models overcome many limitations associated with classic connectionist models, for example by exploiting unsupervised learning in hierarchical architectures (deep networks) and by taking into account top-down, predictive processing supported by feedback loops. We review some recent cognitive models based on generative networks, and we point out promising research directions to investigate neuropsychological disorders within this approach. Though further efforts are required in order to fill the gap between structured Bayesian models and more realistic, biophysical models of neuronal dynamics, we argue that generative neural networks have the potential to bridge these levels of analysis, thereby improving our understanding of the neural bases of cognition and of pathologies caused by brain damage. PMID:27468262

  13. The Integrated Medical Model

    NASA Technical Reports Server (NTRS)

    Kerstman, Eric; Minard, Charles; Saile, Lynn; Freiere deCarvalho, Mary; Myers, Jerry; Walton, Marlei; Butler, Douglas; Iyengar, Sriram; Johnson-Throop, Kathy; Baumann, David

    2010-01-01

    The goals of the Integrated Medical Model (IMM) are to develop an integrated, quantified, evidence-based decision support tool useful to crew health and mission planners and to help align science, technology, and operational activities intended to optimize crew health, safety, and mission success. Presentation slides address scope and approach, beneficiaries of IMM capabilities, history, risk components, conceptual models, development steps, and the evidence base. Space adaptation syndrome is used to demonstrate the model's capabilities.

  14. A range of complex probabilistic models for RNA secondary structure prediction that includes the nearest-neighbor model and more.

    PubMed

    Rivas, Elena; Lang, Raymond; Eddy, Sean R

    2012-02-01

    The standard approach for single-sequence RNA secondary structure prediction uses a nearest-neighbor thermodynamic model with several thousand experimentally determined energy parameters. An attractive alternative is to use statistical approaches with parameters estimated from growing databases of structural RNAs. Good results have been reported for discriminative statistical methods using complex nearest-neighbor models, including CONTRAfold, Simfold, and ContextFold. Little work has been reported on generative probabilistic models (stochastic context-free grammars [SCFGs]) of comparable complexity, although probabilistic models are generally easier to train and to use. To explore a range of probabilistic models of increasing complexity, and to directly compare probabilistic, thermodynamic, and discriminative approaches, we created TORNADO, a computational tool that can parse a wide spectrum of RNA grammar architectures (including the standard nearest-neighbor model and more) using a generalized super-grammar that can be parameterized with probabilities, energies, or arbitrary scores. By using TORNADO, we find that probabilistic nearest-neighbor models perform comparably to (but not significantly better than) discriminative methods. We find that complex statistical models are prone to overfitting RNA structure and that evaluations should use structurally nonhomologous training and test data sets. Overfitting has affected at least one published method (ContextFold). The most important barrier to improving statistical approaches for RNA secondary structure prediction is the lack of diversity of well-curated single-sequence RNA secondary structures in current RNA databases. PMID:22194308

  15. Plant Phenotyping using Probabilistic Topic Models: Uncovering the Hyperspectral Language of Plants.

    PubMed

    Wahabzada, Mirwaes; Mahlein, Anne-Katrin; Bauckhage, Christian; Steiner, Ulrike; Oerke, Erich-Christian; Kersting, Kristian

    2016-01-01

    Modern phenotyping and plant disease detection methods, based on optical sensors and information technology, provide promising approaches to plant research and precision farming. In particular, hyperspectral imaging has been found to reveal physiological and structural characteristics in plants and to allow for tracking physiological dynamics due to environmental effects. In this work, we present an approach to plant phenotyping that integrates non-invasive sensors, computer vision, and data mining techniques, and allows for monitoring how plants respond to stress. To uncover latent hyperspectral characteristics of diseased plants reliably and in an easy-to-understand way, we "wordify" the hyperspectral images, i.e., we turn the images into a corpus of text documents. Then, we apply probabilistic topic models, a well-established natural language processing technique that identifies content and topics of documents. Based on recent regularized topic models, we demonstrate that one can track automatically the development of three foliar diseases of barley. We also present a visualization of the topics that provides plant scientists an intuitive tool for hyperspectral imaging. In short, our analysis and visualization of characteristic topics found during symptom development and disease progress reveal the hyperspectral language of plant diseases. PMID:26957018
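
    The regularized topic model used in the paper is not reproduced here; a minimal sketch of the general "wordify, then topic-model" idea, using standard LDA from scikit-learn on randomly generated spectral-word counts, might look like this:

    ```python
    import numpy as np
    from sklearn.decomposition import LatentDirichletAllocation

    rng = np.random.default_rng(5)

    # Toy corpus: each "document" is one image, each "word" is a quantized spectral signature.
    # Counts are drawn randomly here purely for illustration.
    n_images, n_spectral_words = 60, 40
    counts = rng.poisson(lam=2.0, size=(n_images, n_spectral_words))

    # Standard LDA stands in for the regularized topic model described in the paper
    lda = LatentDirichletAllocation(n_components=4, random_state=0)
    doc_topics = lda.fit_transform(counts)      # per-image topic proportions
    topic_words = lda.components_               # per-topic spectral-word weights

    # Tracking doc_topics over time would show how "disease topics" emerge during symptom development
    print(doc_topics[:3].round(2))
    ```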

  16. Plant Phenotyping using Probabilistic Topic Models: Uncovering the Hyperspectral Language of Plants

    PubMed Central

    Wahabzada, Mirwaes; Mahlein, Anne-Katrin; Bauckhage, Christian; Steiner, Ulrike; Oerke, Erich-Christian; Kersting, Kristian

    2016-01-01

    Modern phenotyping and plant disease detection methods, based on optical sensors and information technology, provide promising approaches to plant research and precision farming. In particular, hyperspectral imaging has been found to reveal physiological and structural characteristics in plants and to allow for tracking physiological dynamics due to environmental effects. In this work, we present an approach to plant phenotyping that integrates non-invasive sensors, computer vision, and data mining techniques, and allows for monitoring how plants respond to stress. To uncover latent hyperspectral characteristics of diseased plants reliably and in an easy-to-understand way, we “wordify” the hyperspectral images, i.e., we turn the images into a corpus of text documents. Then, we apply probabilistic topic models, a well-established natural language processing technique that identifies content and topics of documents. Based on recent regularized topic models, we demonstrate that one can track automatically the development of three foliar diseases of barley. We also present a visualization of the topics that provides plant scientists an intuitive tool for hyperspectral imaging. In short, our analysis and visualization of characteristic topics found during symptom development and disease progress reveal the hyperspectral language of plant diseases. PMID:26957018

  17. Probabilistic tsunami hazard analysis (PTHA) of Taiwan region by stochastic model

    NASA Astrophysics Data System (ADS)

    Sun, Y. S.; Chen, P. F.; Chen, C. C.

    2014-12-01

    We conduct probabilistic tsunami hazard analysis (PTHA) of the Taiwan region for earthquake sources in the Ryukyu trench. The PTHA estimates the probability of a site being hit by tsunamis exceeding a certain amplitude threshold. The probabilities are integrated over earthquakes of various magnitudes from potential fault zones in the Ryukyu trench. The annual frequencies of earthquakes in a fault zone are determined or extrapolated from the magnitude-frequency distribution of earthquakes (Gutenberg-Richter law) of the zone. Given the moment (or magnitude) of an earthquake, we first synthesize patterns of differently complex and heterogeneous slip distributions on the fault using a stochastic model, assuming the slip and stress drop distributions are fractional Brownian motion processes described by a Hurst exponent. Following the ω-2 model of earthquakes and a Fourier transform, slip distributions are determined by randomizing the phase spectrum at wave numbers greater than the corner wave number kc. Finally, the vertical seafloor displacements induced by each slip distribution are used by COMCOT to simulate tsunamis and assess the impacts on various coasts in Taiwan.
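
    The hazard integration can be sketched by combining Gutenberg-Richter bin rates with conditional exceedance probabilities; the a/b values, magnitude bins and conditional probabilities below are placeholders, not results for the Ryukyu trench:

    ```python
    import numpy as np

    # Hypothetical Gutenberg-Richter parameters for one fault zone
    a, b = 4.5, 0.9                      # log10 N(>=M) = a - b*M, N in events per year
    mags = np.arange(7.0, 8.6, 0.1)      # magnitude bin edges considered in the PTHA

    # Annual rate of each magnitude bin: difference of cumulative G-R rates at its edges
    rate_ge = 10.0 ** (a - b * mags)
    rate_bin = rate_ge[:-1] - rate_ge[1:]

    # If p_i were the probability that a bin-i event produces a tsunami amplitude above the
    # threshold at a site (obtained from simulations over stochastic slip realizations),
    # the annual exceedance rate would be the rate-weighted sum:
    p_i = np.full(rate_bin.size, 0.3)    # placeholder conditional probabilities
    annual_exceedance = np.sum(rate_bin * p_i)
    print(f"annual exceedance rate ~ {annual_exceedance:.2e}")
    ```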

  18. Psychological Plausibility of the Theory of Probabilistic Mental Models and the Fast and Frugal Heuristics

    ERIC Educational Resources Information Center

    Dougherty, Michael R.; Franco-Watkins, Ana M.; Thomas, Rick

    2008-01-01

    The theory of probabilistic mental models (PMM; G. Gigerenzer, U. Hoffrage, & H. Kleinbolting, 1991) has had a major influence on the field of judgment and decision making, with the most recent important modifications to PMM theory being the identification of several fast and frugal heuristics (G. Gigerenzer & D. G. Goldstein, 1996). These…

  19. From cyclone tracks to the costs of European winter storms: A probabilistic loss assessment model

    NASA Astrophysics Data System (ADS)

    Renggli, Dominik; Corti, Thierry; Reese, Stefan; Wueest, Marc; Viktor, Elisabeth; Zimmerli, Peter

    2014-05-01

    The quantitative assessment of the potential losses of European winter storms is essential for the economic viability of a global reinsurance company. For this purpose, reinsurance companies generally use probabilistic loss assessment models. This work presents an innovative approach to develop physically meaningful probabilistic events for Swiss Re's new European winter storm loss model. The meteorological hazard component of the new model is based on cyclone and windstorm tracks identified in the 20th Century Reanalysis data. The knowledge of the evolution of winter storms both in time and space allows the physically meaningful perturbation of properties of historical events (e.g. track, intensity). The perturbation includes a random element but also takes the local climatology and the evolution of the historical event into account. The low-resolution wind footprints taken from 20th Century Reanalysis are processed by a statistical-dynamical downscaling to generate high-resolution footprints of the historical and probabilistic winter storm events. Downscaling transfer functions are generated using ENSEMBLES regional climate model data. The result is a set of reliable probabilistic events representing thousands of years. The event set is then combined with country- and risk-specific vulnerability functions and detailed market- or client-specific exposure information to compute (re-)insurance risk premiums.

  20. THE MAXIMUM LIKELIHOOD APPROACH TO PROBABILISTIC MODELING OF AIR QUALITY DATA

    EPA Science Inventory

    Software using maximum likelihood estimation to fit six probabilistic models is discussed. The software is designed as a tool for the air pollution researcher to determine what assumptions are valid in the statistical analysis of air pollution data for the purpose of standard set...

  1. Performance of the multi-model SREPS precipitation probabilistic forecast over Mediterranean area

    NASA Astrophysics Data System (ADS)

    Callado, A.; Escribà, P.; Santos, C.; Santos-Muñoz, D.; Simarro, J.; García-Moya, J. A.

    2009-09-01

    The performance of the Short-Range Ensemble Prediction System (SREPS) probabilistic precipitation forecast over the Mediterranean area has been evaluated by comparison with both an Atlantic-European area excluding the Mediterranean and a more general area including the two previous ones. The main aim is to assess whether the performance of the system, given its meso-alpha horizontal resolution of 25 kilometres, is affected over the Mediterranean area, where mesoscale meteorological events play a more important role than in an Atlantic-European area that is more related to the synoptic scale under Atlantic influence. Furthermore, two different verification methods have been applied and compared for the three areas in order to assess this performance. SREPS is a daily experimental LAM EPS focused on the short range (up to 72 hours) developed at the Spanish Meteorological Agency (AEMET). To account implicitly for model errors, five independent limited area models are used (COSMO (COSMO), HIRLAM (HIRLAM Consortium), HRM (DWD), MM5 (NOAA) and UM-NAE (UKMO)), and to sample the initial and boundary condition uncertainties each model is integrated using data from four different global deterministic models (GFS (NCEP), GME (DWD), IFS (ECMWF) and UM (UKMO)). As a result of crossing models and initial conditions, the EPS is composed of 20 members. The underlying idea is that ensemble performance improves insofar as each member performs as well as possible, i.e. the best operational configurations of the limited area models are combined with the best global deterministic model configurations initialized with the best analyses. For this reason, neither global EPS initial conditions nor different model settings such as multi-parameterizations or multi-parameters are used to generate SREPS. The performance over the three areas has been assessed focusing on 24 hour accumulation precipitation with four different usual

  2. Probabilistic earthquake location and 3-D velocity models in routine earthquake location

    NASA Astrophysics Data System (ADS)

    Lomax, A.; Husen, S.

    2003-12-01

    Earthquake monitoring agencies, such as local networks or CTBTO, are faced with the dilemma of providing routine earthquake locations in near real-time with high precision and meaningful uncertainty information. Traditionally, routine earthquake locations are obtained from linearized inversion using layered seismic velocity models. This approach is fast and simple. However, uncertainties derived from a linear approximation to a set of non-linear equations can be imprecise, unreliable, or even misleading. In addition, 1-D velocity models are a poor approximation to real Earth structure in tectonically complex regions. In this paper, we discuss the routine location of earthquakes in near real-time with high precision using non-linear, probabilistic location methods and 3-D velocity models. The combination of non-linear, global search algorithms with probabilistic earthquake location provides a fast and reliable tool for earthquake location that can be used with any kind of velocity model. The probabilistic solution to the earthquake location includes a complete description of location uncertainties, which may be irregular and multimodal. We present applications of this approach to determine seismicity in Switzerland and in Yellowstone National Park, WY. Comparing our earthquake locations to earthquake locations obtained using linearized inversion and 1-D velocity models clearly demonstrates the advantages of probabilistic earthquake location and 3-D velocity models. For example, the more complete and reliable uncertainty information of non-linear, probabilistic earthquake location greatly facilitates the identification of poorly constrained hypocenters. Such events are often not identified in linearized earthquake location, since the location uncertainties are determined with a simplified, localized and approximate Gaussian statistic.
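
    A toy illustration of the non-linear, probabilistic location idea (not the operational implementation referred to above): the unnormalized posterior over a grid of trial epicentres is evaluated from travel-time misfits instead of a single linearized solution. The velocity, stations, picks and noise level are all invented, and the origin time is assumed known.

```python
# Toy grid-search posterior for an epicentre from travel-time misfits (all values assumed).
import numpy as np

rng = np.random.default_rng(7)
stations = np.array([[0.0, 0.0], [30.0, 5.0], [10.0, 25.0], [-15.0, 20.0]])  # km
true_xy, v = np.array([8.0, 12.0]), 6.0                                      # km, km/s
obs_t = np.linalg.norm(stations - true_xy, axis=1) / v + rng.normal(0, 0.05, 4)

x = np.linspace(-30, 40, 141)
X, Y = np.meshgrid(x, x)
dist = np.sqrt((X[..., None] - stations[:, 0]) ** 2 + (Y[..., None] - stations[:, 1]) ** 2)
misfit = ((dist / v - obs_t) ** 2).sum(axis=-1)
posterior = np.exp(-misfit / (2 * 0.05 ** 2))     # may be irregular or multimodal
i, j = np.unravel_index(posterior.argmax(), posterior.shape)
print(f"MAP epicentre ≈ ({X[i, j]:.1f}, {Y[i, j]:.1f}) km")
```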

  3. Probabilistic dose-response modeling: case study using dichloromethane PBPK model results.

    PubMed

    Marino, Dale J; Starr, Thomas B

    2007-12-01

    A revised assessment of dichloromethane (DCM) has recently been reported that examines the influence of human genetic polymorphisms on cancer risks using deterministic PBPK and dose-response modeling in the mouse combined with probabilistic PBPK modeling in humans. This assessment utilized Bayesian techniques to optimize kinetic variables in mice and humans with mean values from posterior distributions used in the deterministic modeling in the mouse. To supplement this research, a case study was undertaken to examine the potential impact of probabilistic rather than deterministic PBPK and dose-response modeling in mice on subsequent unit risk factor (URF) determinations. Four separate PBPK cases were examined based on the exposure regimen of the NTP DCM bioassay. These were (a) Same Mouse (single draw of all PBPK inputs for both treatment groups); (b) Correlated BW-Same Inputs (single draw of all PBPK inputs for both treatment groups except for bodyweights (BWs), which were entered as correlated variables); (c) Correlated BW-Different Inputs (separate draws of all PBPK inputs for both treatment groups except that BWs were entered as correlated variables); and (d) Different Mouse (separate draws of all PBPK inputs for both treatment groups). Monte Carlo PBPK inputs reflect posterior distributions from Bayesian calibration in the mouse that had been previously reported. A minimum of 12,500 PBPK iterations were undertaken, in which dose metrics, i.e., mg DCM metabolized by the GST pathway/L tissue/day for lung and liver were determined. For dose-response modeling, these metrics were combined with NTP tumor incidence data that were randomly selected from binomial distributions. Resultant potency factors (0.1/ED(10)) were coupled with probabilistic PBPK modeling in humans that incorporated genetic polymorphisms to derive URFs. Results show that there was relatively little difference, i.e., <10% in central tendency and upper percentile URFs, regardless of the case

  4. Development of fast-running thermal and structural response models for probabilistic analysis of complex systems

    SciTech Connect

    Benjamin, A.S.; Brown, N.N.

    1993-11-01

    This paper describes two fast-running physical response algorithms, which were developed for the analysis of nuclear detonation pathways in nuclear weapons systems exposed to fires and crashes but which can be used for other applications such as probabilistic structural analyses of civil systems exposed to dynamic loadings and the dynamic analyses of nuclear reactors exposed to external events. The first is embodied in a computer code called Thermal Evaluation and Matching Program for Risk Applications (TEMPRA-3D). The second is contained in a computer code entitled Spring-mass Transient Response Evaluation for Structural Systems (STRESS-3D). TEMPRA-3D is a lumped-capacitance thermal analysis code that is extremely fast running and unconditionally stable. It contains fully integrated numerical models for many phenomena of interest in the evaluation of system responses, including thermal conduction, thermal radiation, thermal convection, chemical reactions, and material decomposition. The code is capable of calculating the timing of important events, such as component failures or the ignition of explosives. If uncertainty distributions are provided, it computes pairwise probabilities. STRESS-3D is a dynamical structural analysis code that models a system as a connection of masses and nonlinear springs. The principal functions of STRESS-3D are to calculate the dynamic responses to various types of impacts, focusing upon the stresses and strains in shell-like structures and the mean accelerations and displacements of solid components. Some of the key features of the code are: (1) explicit integration of Newton's law of motion for each mass; (2) spring forces evaluated from constitutive relationships and appropriate areas; (3) characterization of strain-hardening in inelastic materials and compressive load-bearing capability of foam materials; and (4) innovative modeling of shells.

  5. Probabilistic forecasts of debris-flow hazard at the regional scale with a combination of models.

    NASA Astrophysics Data System (ADS)

    Malet, Jean-Philippe; Remaître, Alexandre

    2015-04-01

    Debris flows are one of the many active slope-forming processes in the French Alps, where rugged and steep slopes mantled by various slope deposits offer a great potential for triggering hazardous events. A quantitative assessment of debris-flow hazard requires the estimation, in a probabilistic framework, of the spatial probability of occurrence of source areas, the spatial probability of runout areas, the temporal frequency of events, and their intensity. The main objective of this research is to propose a pipeline for the estimation of these quantities at the regional scale using a chain of debris-flow models. The methodology is developed and validated at the experimental site of the Barcelonnette Basin (South French Alps), where 26 active torrents have produced more than 150 debris-flow events since 1850. First, a susceptibility assessment is performed to identify the debris-flow prone source areas. The most frequently used approach is the combination of environmental factors with GIS procedures and statistical techniques, integrating (or not) detailed event inventories. Based on a 5m-DEM and derivatives, and information on slope lithology, engineering soils and landcover, the possible source areas are identified with a statistical logistic regression model. The performance of the statistical model is evaluated against the observed distribution of debris-flow events recorded after 1850 in the study area. The source areas in the three most active torrents (Riou-Bourdoux, Faucon, Sanières) are well identified by the model. Results are less convincing for three other active torrents (Bourget, La Valette and Riou-Chanal); this could be related to the type of debris-flow triggering mechanism, as the model seems to better spot the open-slope debris-flow source areas (e.g. scree slopes) but appears to be less efficient for the identification of landslide-induced debris flows. Second, a susceptibility assessment is performed to estimate the possible runout distance

  6. Interpretable Probabilistic Latent Variable Models for Automatic Annotation of Clinical Text

    PubMed Central

    Kotov, Alexander; Hasan, Mehedi; Carcone, April; Dong, Ming; Naar-King, Sylvie; BroganHartlieb, Kathryn

    2015-01-01

    We propose Latent Class Allocation (LCA) and Discriminative Labeled Latent Dirichlet Allocation (DL-LDA), two novel interpretable probabilistic latent variable models for automatic annotation of clinical text. Both models separate the terms that are highly characteristic of textual fragments annotated with a given set of labels from other non-discriminative terms, but rely on generative processes with different structure of latent variables. LCA directly learns class-specific multinomials, while DL-LDA breaks them down into topics (clusters of semantically related words). Extensive experimental evaluation indicates that the proposed models outperform Naïve Bayes, a standard probabilistic classifier, and Labeled LDA, a state-of-the-art topic model for labeled corpora, on the task of automatic annotation of transcripts of motivational interviews, while the output of the proposed models can be easily interpreted by clinical practitioners. PMID:26958214

  7. Statistical and Probabilistic Extensions to Ground Operations' Discrete Event Simulation Modeling

    NASA Technical Reports Server (NTRS)

    Trocine, Linda; Cummings, Nicholas H.; Bazzana, Ashley M.; Rychlik, Nathan; LeCroy, Kenneth L.; Cates, Grant R.

    2010-01-01

    NASA's human exploration initiatives will invest in technologies, public/private partnerships, and infrastructure, paving the way for the expansion of human civilization into the solar system and beyond. As it has been for the past half century, the Kennedy Space Center will be the embarkation point for humankind's journey into the cosmos. Functioning as a next generation space launch complex, Kennedy's launch pads, integration facilities, processing areas, launch and recovery ranges will bustle with the activities of the world's space transportation providers. In developing this complex, KSC teams work through the potential operational scenarios: conducting trade studies, planning and budgeting for expensive and limited resources, and simulating alternative operational schemes. Numerous tools, among them discrete event simulation (DES), were matured during the Constellation Program to conduct such analyses with the purpose of optimizing the launch complex for maximum efficiency, safety, and flexibility while minimizing life cycle costs. Discrete event simulation is a computer-based modeling technique for complex and dynamic systems where the state of the system changes at discrete points in time and whose inputs may include random variables. DES is used to assess timelines and throughput, and to support operability studies and contingency analyses. It is applicable to any space launch campaign and informs decision-makers of the effects of varying numbers of expensive resources and the impact of off-nominal scenarios on measures of performance. In order to develop representative DES models, methods were adopted, exploited, or created to extend traditional uses of DES. The Delphi method was adopted and utilized for task duration estimation. DES software was exploited for probabilistic event variation. A roll-up process was developed to reuse models and model elements in other, less detailed models. The DES team continues to innovate and expand

  8. Multivariate probabilistic projections using imperfect climate models part I: outline of methodology

    NASA Astrophysics Data System (ADS)

    Sexton, David M. H.; Murphy, James M.; Collins, Mat; Webb, Mark J.

    2012-06-01

    We demonstrate a method for making probabilistic projections of climate change at global and regional scales, using examples consisting of the equilibrium response to doubled CO2 concentrations of global annual mean temperature and regional climate changes in summer and winter temperature and precipitation over Northern Europe and England-Wales. This method combines information from a perturbed physics ensemble, a set of international climate models, and observations. Our approach is based on a multivariate Bayesian framework which enables the prediction of a joint probability distribution for several variables constrained by more than one observational metric. This is important if different sets of impacts scientists are to use these probabilistic projections to make coherent forecasts for the impacts of climate change, by inputting several uncertain climate variables into their impacts models. Unlike a single metric, multiple metrics reduce the risk of rewarding a model variant which scores well due to a fortuitous compensation of errors rather than because it is providing a realistic simulation of the observed quantity. We provide some physical interpretation of how the key metrics constrain our probabilistic projections. The method also has a quantity, called discrepancy, which represents the degree of imperfection in the climate model i.e. it measures the extent to which missing processes, choices of parameterisation schemes and approximations in the climate model affect our ability to use outputs from climate models to make inferences about the real system. Other studies have, sometimes without realising it, treated the climate model as if it had no model error. We show that omission of discrepancy increases the risk of making over-confident predictions. Discrepancy also provides a transparent way of incorporating improvements in subsequent generations of climate models into probabilistic assessments. The set of international climate models is used to derive

  9. A non-parametric probabilistic model for soil-structure interaction

    NASA Astrophysics Data System (ADS)

    Laudarin, F.; Desceliers, C.; Bonnet, G.; Argoul, P.

    2013-07-01

    The paper investigates the effect of soil-structure interaction on the dynamic response of structures. A non-parametric probabilistic formulation for the modelling of an uncertain soil impedance is used to account for the usual lack of information on soil properties. Such a probabilistic model introduces the physical coupling stemming from the soil heterogeneity around the foundation. Considering this effect, even a symmetrical building displays a torsional motion when submitted to earthquake loading. The study focuses on a multi-story building modeled by using equivalent Timoshenko beam models which have different mass distributions. The probability density functions of the maximal internal forces and moments in a given building are estimated by Monte Carlo simulations. Some results on the stochastic modal analysis of the structure are also given.

  10. A note on probabilistic models over strings: the linear algebra approach.

    PubMed

    Bouchard-Côté, Alexandre

    2013-12-01

    Probabilistic models over strings have played a key role in developing methods that take into consideration indels as phylogenetically informative events. There is an extensive literature on using automata and transducers on phylogenies to do inference on these probabilistic models, in which an important theoretical question is the complexity of computing the normalization of a class of string-valued graphical models. This question has been investigated using tools from combinatorics, dynamic programming, and graph theory, and has practical applications in Bayesian phylogenetics. In this work, we revisit this theoretical question from a different point of view, based on linear algebra. The main contribution is a set of results based on this linear algebra view that facilitate the analysis and design of inference algorithms on string-valued graphical models. As an illustration, we use this method to give a new elementary proof of a known result on the complexity of inference on the "TKF91" model, a well-known probabilistic model over strings. Compared to previous work, our proving method is easier to extend to other models, since it relies on a novel weak condition, triangular transducers, which is easy to establish in practice. The linear algebra view provides a concise way of describing transducer algorithms and their compositions, opens the possibility of transferring fast linear algebra libraries (for example, based on GPUs), as well as low rank matrix approximation methods, to string-valued inference problems. PMID:24135792

  11. Probabilistic models for assessment of extreme temperatures and relative humidity in Lithuania

    NASA Astrophysics Data System (ADS)

    Alzbutas, Robertas; Šeputytė, Ilona

    2015-04-01

    Extreme temperatures are a fairly common natural phenomenon in Lithuania. They have mainly negative effects on both the environment and humans. It is therefore important to perform probabilistic and statistical analyses of possible extreme temperature values and their time-dependent changes. This is especially important in areas where technical objects sensitive to extreme temperatures are planned to be constructed. In order to estimate the frequencies and consequences of possible extreme temperatures, a probabilistic analysis of event occurrence and its uncertainty has been performed: statistical data have been collected and analyzed. The probabilistic analysis of extreme temperatures in Lithuanian territory is based on historical data taken from the Lithuanian Hydrometeorology Service, the Dūkštas Meteorological Station, the Lithuanian Energy Institute and the Ignalina NNP Environmental Protection Department of Environmental Monitoring Service. The main objective of the work was the probabilistic assessment of the occurrence and impact of extreme temperature and relative humidity across Lithuania and specifically in the Dūkštas region, where the Ignalina Nuclear Power Plant is being decommissioned. A further purpose of this work was to analyze changes in extreme temperatures. The probabilistic analysis of the increase in extreme temperatures in Lithuanian territory was based on more than 50 years of historical data. The probabilistic assessment focused on the application and comparison of the Gumbel, Weibull and Generalized Extreme Value (GEV) distributions, enabling selection of the distribution with the best fit to the extreme temperature data. In order to assess the likelihood of extreme temperatures, different probabilistic models were applied to evaluate the probability of exceedance of different extreme temperatures. According to the statistics and the relationship between return period and probabilities of temperatures the return period for 30
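
    A short sketch of the distribution-comparison step described above, with synthetic stand-in data: Gumbel and GEV distributions are fitted to annual maximum temperatures and a 100-year return level is read off each fit. The data, location and scale are assumptions, not the Lithuanian records.

```python
# Hedged sketch: fit Gumbel and GEV to synthetic annual temperature maxima and
# estimate the 100-year return level from each fitted distribution.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
annual_max = rng.gumbel(loc=32.0, scale=2.0, size=60)   # hypothetical °C maxima

for name, dist in [("Gumbel", stats.gumbel_r), ("GEV", stats.genextreme)]:
    params = dist.fit(annual_max)
    level_100 = dist.ppf(1.0 - 1.0 / 100.0, *params)    # exceeded with annual prob. 1/100
    print(f"{name}: 100-year return level ≈ {level_100:.1f} °C")
```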

  12. Integration of landslide hazard maps into probabilistic risk assessment in context of global changes: an alpine test site

    NASA Astrophysics Data System (ADS)

    Vandromme, Rosalie; Desramaut, Nicolas; Baills, Audrey; Fontaine, Mélanie; Hohmann, Audrey; Grandjean, Gilles; Sedan, Olivier; Puissant, Anne; Malet, Jean-Philippe

    2013-04-01

    The aim of this work is to develop a methodology to integrate global change scenarios into quantitative risk assessment. This paper describes a methodology that takes into account the effects of a changing climate on landslide activity and the impacts of social changes on exposure, to provide a complete evaluation of risk for given scenarios. The approach is applied, for demonstration purposes, to a southern alpine test site. Mechanical approaches offer a way to quantify landslide susceptibility and to model hazard under unprecedented conditions, as are likely to occur. However, as the quantity and quality of data are generally very heterogeneous at a regional scale, it is necessary to take their uncertainty into account in the analysis. In this perspective, a new hazard modeling method has been developed and integrated in GIS-based software called ALICE®. In addition, climate change scenarios have been computed for the alpine test site (Barcelonnette area, France) using REMO-COSMO-LM. From the precipitation time series, a daily index of soil water content has been computed with a reservoir-based model (GARDENIA®). The program then classifies hazard zones on the basis of several spatial datasets (lithology, DEM, etc…) and different hydrological contexts varying in time. The probabilistically initiated landslides are then propagated with a semi-empirical model (BORA) to provide hazard maps. Different land-use scenarios have been developed using a cellular automaton model to cover the probable range of development of potential elements at risk in the future. These exposure maps are then combined with the aforementioned hazard maps to obtain risk maps for the different periods and the different land-use development scenarios. Potential evolutions of landslide risk are then evaluated, with a general increase in the 7 communes. This methodology also allows the analysis of the contributions of both considered global changes (climate and

  13. Probabilistic material degradation model for aerospace materials subjected to high temperature, mechanical and thermal fatigue, and creep

    NASA Technical Reports Server (NTRS)

    Boyce, L.

    1992-01-01

    A probabilistic general material strength degradation model has been developed for structural components of aerospace propulsion systems subjected to diverse random effects. The model has been implemented in two FORTRAN programs, PROMISS (Probabilistic Material Strength Simulator) and PROMISC (Probabilistic Material Strength Calibrator). PROMISS calculates the random lifetime strength of an aerospace propulsion component due to as many as eighteen diverse random effects. Results are presented in the form of probability density functions and cumulative distribution functions of lifetime strength. PROMISC calibrates the model by calculating the values of empirical material constants.

  14. PubMed related articles: a probabilistic topic-based model for content similarity

    PubMed Central

    Lin, Jimmy; Wilbur, W John

    2007-01-01

    Background: We present a probabilistic topic-based model for content similarity called pmra that underlies the related article search feature in PubMed. Whether or not a document is about a particular topic is computed from term frequencies, modeled as Poisson distributions. Unlike previous probabilistic retrieval models, we do not attempt to estimate relevance, but rather our focus is "relatedness", the probability that a user would want to examine a particular document given known interest in another. We also describe a novel technique for estimating parameters that does not require human relevance judgments; instead, the process is based on the existence of MeSH® in MEDLINE®. Results: The pmra retrieval model was compared against bm25, a competitive probabilistic model that shares theoretical similarities. Experiments using the test collection from the TREC 2005 genomics track show a small but statistically significant improvement of pmra over bm25 in terms of precision. Conclusion: Our experiments suggest that the pmra model provides an effective ranking algorithm for related article search. PMID:17971238

  15. Probabilistic Inference: Task Dependency and Individual Differences of Probability Weighting Revealed by Hierarchical Bayesian Modeling

    PubMed Central

    Boos, Moritz; Seer, Caroline; Lange, Florian; Kopp, Bruno

    2016-01-01

    Cognitive determinants of probabilistic inference were examined using hierarchical Bayesian modeling techniques. A classic urn-ball paradigm served as experimental strategy, involving a factorial two (prior probabilities) by two (likelihoods) design. Five computational models of cognitive processes were compared with the observed behavior. Parameter-free Bayesian posterior probabilities and parameter-free base rate neglect provided inadequate models of probabilistic inference. The introduction of distorted subjective probabilities yielded more robust and generalizable results. A general class of (inverted) S-shaped probability weighting functions had been proposed; however, the possibility of large differences in probability distortions not only across experimental conditions, but also across individuals, seems critical for the model's success. It also seems advantageous to consider individual differences in parameters of probability weighting as being sampled from weakly informative prior distributions of individual parameter values. Thus, the results from hierarchical Bayesian modeling converge with previous results in revealing that probability weighting parameters show considerable task dependency and individual differences. Methodologically, this work exemplifies the usefulness of hierarchical Bayesian modeling techniques for cognitive psychology. Theoretically, human probabilistic inference might be best described as the application of individualized strategic policies for Bayesian belief revision. PMID:27303323
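
    For context, the sketch below evaluates one common (inverted) S-shaped probability weighting function of the general family discussed above, the one-parameter Tversky-Kahneman form, with the distortion parameter treated as varying across individuals; the specific form and parameter values are illustrative assumptions, not the paper's fitted hierarchical model.

```python
# Illustrative (inverted) S-shaped probability weighting function, w(p).
import numpy as np

def weight(p, gamma):
    # w(p) = p^gamma / (p^gamma + (1 - p)^gamma)^(1/gamma)  (Tversky-Kahneman form)
    return p ** gamma / (p ** gamma + (1.0 - p) ** gamma) ** (1.0 / gamma)

probs = np.array([0.1, 0.5, 0.9])
for gamma in (0.4, 0.7, 1.0):          # smaller gamma = stronger distortion; 1.0 = none
    print(gamma, np.round(weight(probs, gamma), 3))
```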

  16. Landslide susceptibility mapping along PLUS expressways in Malaysia using probabilistic based model in GIS

    NASA Astrophysics Data System (ADS)

    Yusof, Norbazlan M.; Pradhan, Biswajeet

    2014-06-01

    PLUS Berhad holds the concession for a total of 987 km of toll expressways in Malaysia, the longest of which is the North-South Expressway or NSE. Acting as the 'backbone' of the west coast of the peninsula, the NSE stretches from the Malaysian-Thai border in the north to the border with neighbouring Singapore in the south, linking several major cities and towns along the way. The North-South Expressway contributes to the country's economic development through the trade, social and tourism sectors. Presently, the highway is in good condition and connected to every state, but some locations need urgent attention. The stability of slopes at these locations is of greatest concern, as any instability can endanger motorists. In this paper, two study locations have been analysed: Gua Tempurung (soil slope) and Jelapang (rock slope), which obviously have different characteristics. These locations pass through undulating terrain with steep slopes where landslides are common and the probability of slope instability due to human activities in surrounding areas is high. A database of twelve (12) landslide conditioning factors for slope stability was compiled: slope degree and slope aspect were extracted from IFSAR (interferometric synthetic aperture radar), while landuse, lithology and structural geology were constructed from interpretation of high resolution satellite data from World View II, Quickbird and Ikonos. All this information was analysed in a geographic information system (GIS) environment for landslide susceptibility mapping using a probabilistic frequency ratio model. In addition, information on the slopes such as inventories, condition assessments and maintenance records was assessed through the total expressway maintenance management system, better known as TEMAN. The above-mentioned system is used by PLUS as an asset management and decision support tool for maintenance activities along the highways as well as for data
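
    A minimal sketch of the frequency ratio calculation underlying such susceptibility maps, with synthetic raster values: for each class of a conditioning factor, FR is the share of landslide cells falling in the class divided by the share of all cells in the class, and summing FRs over factors yields a susceptibility index. The factor, class counts and landslide rates are assumptions.

```python
# Hedged sketch: frequency ratio (FR) per slope class from a synthetic inventory.
import numpy as np

rng = np.random.default_rng(3)
slope_class = rng.integers(0, 4, size=10_000)                 # e.g. 4 slope-angle classes
landslide = rng.random(10_000) < 0.03 * (slope_class + 1)     # synthetic landslide cells

fr = {}
for c in np.unique(slope_class):
    in_class = slope_class == c
    fr[int(c)] = (landslide[in_class].sum() / landslide.sum()) / in_class.mean()
print(fr)   # FR > 1 marks classes over-represented among landslide cells
```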

  17. Probabilistic modelling of human exposure to intense sweeteners in Italian teenagers: validation and sensitivity analysis of a probabilistic model including indicators of market share and brand loyalty.

    PubMed

    Arcella, D; Soggiu, M E; Leclercq, C

    2003-10-01

    For the assessment of exposure to food-borne chemicals, the most commonly used methods in the European Union follow a deterministic approach based on conservative assumptions. Over the past few years, risk managers have become more interested in the probabilistic approach, which gives a more realistic view of exposure to food chemicals. Within the EU-funded 'Monte Carlo' project, a stochastic model of exposure to chemical substances from the diet and a computer software program were developed. The aim of this paper was to validate the model, using the software, with respect to the intake of saccharin from table-top sweeteners and cyclamate from soft drinks by Italian teenagers, and to evaluate the impact of including or excluding indicators of market share and brand loyalty through a sensitivity analysis. Data on food consumption and the concentration of sweeteners were collected. A food frequency questionnaire aimed at identifying females who were high consumers of sugar-free soft drinks and/or of table-top sweeteners was filled in by 3982 teenagers living in the District of Rome. Moreover, 362 subjects participated in a detailed food survey by recording, at brand level, all foods and beverages ingested over 12 days. Producers were asked to provide the intense sweetener concentrations of their sugar-free products. Results showed that consumer behaviour with respect to brands has an impact on exposure assessments. Only probabilistic models that took into account indicators of market share and brand loyalty met the validation criteria. PMID:14555359
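
    A minimal sketch of this kind of brand-aware probabilistic exposure calculation (the structure and all numbers are assumptions): each simulated consumer draws a daily soft-drink consumption and a brand according to market share, and intake is consumption times that brand's sweetener concentration.

```python
# Hedged Monte Carlo exposure sketch with a market-share-based brand draw.
import numpy as np

rng = np.random.default_rng(4)
n_consumers = 10_000
consumption_l = rng.lognormal(mean=-1.0, sigma=0.6, size=n_consumers)  # litres/day, assumed
brand_conc = np.array([0.0, 120.0, 250.0])     # mg/L cyclamate per brand (hypothetical)
market_share = np.array([0.5, 0.3, 0.2])       # assumed brand market shares

brand = rng.choice(len(brand_conc), size=n_consumers, p=market_share)
intake_mg = consumption_l * brand_conc[brand]  # mg/day per simulated consumer
print(np.percentile(intake_mg, [50, 95, 97.5]))  # central and upper-tail exposure
```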

  18. Don't Fear Optimality: Sampling for Probabilistic-Logic Sequence Models

    NASA Astrophysics Data System (ADS)

    Thon, Ingo

    One of the current challenges in artificial intelligence is modeling dynamic environments that change due to the actions or activities undertaken by people or agents. The task of inferring hidden states, e.g. the activities or intentions of people, based on observations is called filtering. Standard probabilistic models such as Dynamic Bayesian Networks are able to solve this task efficiently using approximative methods such as particle filters. However, these models do not support logical or relational representations. The key contribution of this paper is the upgrade of a particle filter algorithm for use with a probabilistic logical representation through the definition of a proposal distribution. The performance of the algorithm depends largely on how well this distribution fits the target distribution. We adopt the idea of logical compilation into Binary Decision Diagrams for sampling. This allows us to use the optimal proposal distribution which is normally prohibitively slow.

  19. Allocation Variable-Based Probabilistic Algorithm to Deal with Label Switching Problem in Bayesian Mixture Models

    PubMed Central

    Pan, Jia-Chiun; Liu, Chih-Min; Hwu, Hai-Gwo; Huang, Guan-Hua

    2015-01-01

    The label switching problem occurs as a result of the nonidentifiability of the posterior distribution over various permutations of component labels when a Bayesian approach is used to estimate parameters in mixture models. For cases where the number of components is fixed and known, we propose a relabelling algorithm, an allocation variable-based probabilistic relabelling approach (denoted AVP), to deal with the label switching problem. We establish a model for the posterior distribution of allocation variables exhibiting the label switching phenomenon. The AVP algorithm stochastically relabels the posterior samples according to the posterior probabilities of the established model. Some existing deterministic and other probabilistic algorithms are compared with the AVP algorithm in simulation studies, and the success of the proposed approach is demonstrated in simulation studies and on a real dataset. PMID:26458185

  20. A probabilistic model of absolute auditory thresholds and its possible physiological basis.

    PubMed

    Heil, Peter; Neubauer, Heinrich; Tetschke, Manuel; Irvine, Dexter R F

    2013-01-01

    Detection thresholds for auditory stimuli, specified in terms of their amplitude or level, depend on the stimulus temporal envelope and decrease with increasing stimulus duration. The neural mechanisms underlying these fundamental across-species observations are not fully understood. Here, we present a "continuous look" model, according to which the stimulus gives rise to stochastic neural detection events whose probability of occurrence is proportional to the 3rd power of the low-pass filtered, time-varying stimulus amplitude. Threshold is reached when a criterion number of events have occurred (probability summation). No long-term integration is required. We apply the model to an extensive set of thresholds measured in humans for tones of different envelopes and durations and find it to fit well. Subtle differences at long durations may be due to limited attention resources. We confirm the probabilistic nature of the detection events by analyses of simple reaction times and verify the exponent of 3 by validating model predictions for binaural thresholds from monaural thresholds. The exponent originates in the auditory periphery, possibly in the intrinsic Ca(2+) cooperativity of the Ca(2+) sensor involved in exocytosis from inner hair cells. It results in growth of the spike rate of auditory-nerve fibers (ANFs) with the 3rd power of the stimulus amplitude before saturating (Heil et al., J Neurosci 31:15424-15437, 2011), rather than with its square (i.e., with stimulus intensity), as is commonly assumed. Our work therefore suggests a link between detection thresholds and a key biochemical reaction in the receptor cells. PMID:23716205
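
    A hedged sketch of the probability-summation idea: detection events are modelled as a Poisson process whose rate is proportional to the cube of the stimulus amplitude, and threshold is the amplitude at which a criterion number of events is reached on half the trials. The rate constant and criterion are arbitrary, but the qualitative decrease of threshold with duration follows.

```python
# Toy probability-summation model: event rate ∝ amplitude^3, threshold at 50% detection.
import numpy as np
from scipy import stats

def detection_prob(amplitude, duration_s, rate_const=1.0, criterion=1):
    lam = rate_const * amplitude ** 3 * duration_s       # expected number of events
    return 1.0 - stats.poisson.cdf(criterion - 1, lam)   # P(at least `criterion` events)

for dur in (0.01, 0.1, 1.0):
    lo, hi = 1e-3, 1e3                                   # bisect for 50% detection amplitude
    for _ in range(60):
        mid = np.sqrt(lo * hi)
        lo, hi = (mid, hi) if detection_prob(mid, dur) < 0.5 else (lo, mid)
    print(f"duration {dur:5.2f} s -> threshold amplitude ≈ {mid:.3f}")
```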

  1. Statistical Learning of Probabilistic Nonadjacent Dependencies by Multiple-Cue Integration

    ERIC Educational Resources Information Center

    van den Bos, Esther; Christiansen, Morten H.; Misyak, Jennifer B.

    2012-01-01

    Previous studies have indicated that dependencies between nonadjacent elements can be acquired by statistical learning when each element predicts only one other element (deterministic dependencies). The present study investigates statistical learning of probabilistic nonadjacent dependencies, in which each element predicts several other elements…

  2. Receptor-mediated cell attachment and detachment kinetics. I. Probabilistic model and analysis.

    PubMed Central

    Cozens-Roberts, C.; Lauffenburger, D. A.; Quinn, J. A.

    1990-01-01

    The kinetics of receptor-mediated cell adhesion to a ligand-coated surface play a key role in many physiological and biotechnology-related processes. We present a probabilistic model of receptor-ligand bond formation between a cell and surface to describe the probability of adhesion in a fluid shear field. Our model extends the deterministic model of Hammer and Lauffenburger (Hammer, D.A., and D.A. Lauffenburger. 1987. Biophys. J. 52:475-487) to a probabilistic framework, in which we calculate the probability that a certain number of bonds between a cell and surface exists at any given time. The probabilistic framework is used to account for deviations from ideal, deterministic behavior, inherent in chemical reactions involving relatively small numbers of reacting molecules. Two situations are investigated: first, cell attachment in the absence of fluid stress; and, second, cell detachment in the presence of fluid stress. In the attachment case, we examine the expected variance in bond formation as a function of attachment time; this also provides an initial condition for the detachment case. Focusing then on detachment, we predict transient behavior as a function of key system parameters, such as the distractive fluid force, the receptor-ligand bond affinity and rate constants, and the receptor and ligand densities. We compare the predictions of the probabilistic model with those of a deterministic model, and show how a deterministic approach can yield some inaccurate results; e.g., it cannot account for temporally continuous cell attachment or detachment, it can underestimate the time needed for cell attachment, it can overestimate the time required for cell detachment for a given level of force, and it can overestimate the force necessary for cell detachment. PMID:2174271
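
    A toy stochastic simulation in the spirit of such a probabilistic framework (rates and receptor numbers are invented): the number of receptor-ligand bonds changes by +1 (formation) or -1 (breakage) with propensities proportional to free receptors and existing bonds, so repeated runs show the run-to-run variability that a deterministic model averages away.

```python
# Gillespie-style sketch of stochastic bond formation/breakage (all parameters assumed).
import numpy as np

rng = np.random.default_rng(8)
kf, kr, n_receptors = 2.0, 1.0, 50          # formation/breakage rates (1/s), receptor count

def simulate(t_end=5.0):
    t, n = 0.0, 0
    while t < t_end:
        rate_form, rate_break = kf * (n_receptors - n), kr * n
        total = rate_form + rate_break
        t += rng.exponential(1.0 / total)                    # time to next event
        n += 1 if rng.random() < rate_form / total else -1   # which event occurred
    return n

print([simulate() for _ in range(5)])   # bond counts differ between realizations
```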

  3. HIV-Specific Probabilistic Models of Protein Evolution

    PubMed Central

    Nickle, David C.; Heath, Laura; Jensen, Mark A.; Gilbert, Peter B.; Mullins, James I.; Kosakovsky Pond, Sergei L.

    2007-01-01

    Comparative sequence analyses, including such fundamental bioinformatics techniques as similarity searching, sequence alignment and phylogenetic inference, have become a mainstay for researchers studying type 1 Human Immunodeficiency Virus (HIV-1) genome structure and evolution. Implicit in comparative analyses is an underlying model of evolution, and the chosen model can significantly affect the results. In general, evolutionary models describe the probabilities of replacing one amino acid character with another over a period of time. Most widely used evolutionary models for protein sequences have been derived from curated alignments of hundreds of proteins, usually based on mammalian genomes. It is unclear to what extent these empirical models are generalizable to a very different organism, such as HIV-1, the most extensively sequenced organism in existence. We developed a maximum likelihood model fitting procedure to a collection of HIV-1 alignments sampled from different viral genes, and inferred two empirical substitution models, suitable for describing between- and within-host evolution. Our procedure pools the information from multiple sequence alignments, and the provided software implementation can be run efficiently in parallel on a computer cluster. We describe how the inferred substitution models can be used to generate scoring matrices suitable for alignment and similarity searches. Our models had a consistently superior fit relative to the best existing models and to parameter-rich data-driven models when benchmarked on independent HIV-1 alignments, demonstrating evolutionary biases in amino-acid substitution that are unique to HIV, and that are not captured by the existing models. The scoring matrices derived from the models showed a marked difference from common amino-acid scoring matrices. The use of an appropriate evolutionary model recovered a known viral transmission history, whereas a poorly chosen model introduced phylogenetic error. We argue that

  4. Probabilistic performance-assessment modeling of the mixed waste landfill at Sandia National Laboratories.

    SciTech Connect

    Peace, Gerald L.; Goering, Timothy James; Miller, Mark Laverne; Ho, Clifford Kuofei

    2007-01-01

    A probabilistic performance assessment has been conducted to evaluate the fate and transport of radionuclides (americium-241, cesium-137, cobalt-60, plutonium-238, plutonium-239, radium-226, radon-222, strontium-90, thorium-232, tritium, uranium-238), heavy metals (lead and cadmium), and volatile organic compounds (VOCs) at the Mixed Waste Landfill (MWL). Probabilistic analyses were performed to quantify uncertainties inherent in the system and models for a 1,000-year period, and sensitivity analyses were performed to identify parameters and processes that were most important to the simulated performance metrics. Comparisons between simulated results and measured values at the MWL were made to gain confidence in the models and perform calibrations when data were available. In addition, long-term monitoring requirements and triggers were recommended based on the results of the quantified uncertainty and sensitivity analyses.

  5. Uniform Accuracy of the Maximum Likelihood Estimates for Probabilistic Models of Biological Sequences

    PubMed Central

    Ekisheva, Svetlana

    2010-01-01

    Probabilistic models for biological sequences (DNA and proteins) have many useful applications in bioinformatics. Normally, the values of parameters of these models have to be estimated from empirical data. However, even for the most common estimates, the maximum likelihood (ML) estimates, properties have not been completely explored. Here we assess the uniform accuracy of the ML estimates for models of several types: the independence model, the Markov chain and the hidden Markov model (HMM). Particularly, we derive rates of decay of the maximum estimation error by employing the measure concentration as well as the Gaussian approximation, and compare these rates. PMID:21318122

  6. Log-normal distribution based Ensemble Model Output Statistics models for probabilistic wind-speed forecasting

    NASA Astrophysics Data System (ADS)

    Baran, Sándor; Lerch, Sebastian

    2015-07-01

    Ensembles of forecasts are obtained from multiple runs of numerical weather forecasting models with different initial conditions and are typically employed to account for forecast uncertainties. However, biases and dispersion errors often occur in forecast ensembles; they are usually under-dispersive and uncalibrated, and require statistical post-processing. We present an Ensemble Model Output Statistics (EMOS) method for calibration of wind speed forecasts based on the log-normal (LN) distribution, and we also show a regime-switching extension of the model which combines the previously studied truncated normal (TN) distribution with the LN. Both presented models are applied to wind speed forecasts of the eight-member University of Washington mesoscale ensemble, of the fifty-member ECMWF ensemble and of the eleven-member ALADIN-HUNEPS ensemble of the Hungarian Meteorological Service, and their predictive performances are compared to those of the TN and generalized extreme value (GEV) distribution based EMOS methods and to the TN-GEV mixture model. The results indicate improved calibration of probabilistic forecasts and accuracy of point forecasts in comparison to the raw ensemble and to climatological forecasts. Further, the TN-LN mixture model outperforms the traditional TN method and its predictive performance is able to keep up with the models utilizing the GEV distribution without assigning mass to negative values.
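
    As a rough sketch of the EMOS idea (the link coefficients here are placeholders, not values fitted as in the study): a log-normal predictive distribution whose mean and variance are affine functions of the ensemble mean and spread, evaluated for a single hypothetical wind-speed case.

```python
# Hedged log-normal EMOS-style predictive distribution for one forecast case.
import numpy as np
from scipy import stats

ensemble = np.array([6.2, 7.1, 5.8, 6.9, 7.4, 6.0, 6.5, 7.0])  # m/s, hypothetical members
m, s2 = ensemble.mean(), ensemble.var()

a, b, c, d = 0.1, 1.0, 0.05, 0.3   # placeholder link coefficients (would be fitted)
mean_ln = a + b * m                # predictive mean as affine function of ensemble mean
var_ln = c + d * s2                # predictive variance as affine function of spread

sigma2 = np.log(1.0 + var_ln / mean_ln ** 2)   # convert mean/variance to LN parameters
mu = np.log(mean_ln) - sigma2 / 2.0
forecast = stats.lognorm(s=np.sqrt(sigma2), scale=np.exp(mu))
print(forecast.ppf([0.1, 0.5, 0.9]))           # quantile forecast (m/s)
```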

  7. The Integrated Medical Model

    NASA Technical Reports Server (NTRS)

    Butler, Douglas J.; Kerstman, Eric

    2010-01-01

    This slide presentation reviews the goals and approach for the Integrated Medical Model (IMM). The IMM is a software decision support tool that forecasts medical events during spaceflight and optimizes medical systems during simulations. It includes information on the software capabilities, program stakeholders, use history, and the software logic.

  8. An Integrated Model Recontextualized

    ERIC Educational Resources Information Center

    O'Meara, KerryAnn; Saltmarsh, John

    2016-01-01

    In this commentary, authors KerryAnn O'Meara and John Saltmarsh reflect on their 2008 "Journal of Higher Education Outreach and Engagement" article "An Integrated Model for Advancing the Scholarship of Engagement: Creating Academic Homes for the Engaged Scholar," reprinted in this 20th anniversary issue of "Journal of…

  9. A FEASIBILITY STUDY ON USING PHYSICS-BASED MODELER OUTPUTS TO TRAIN PROBABILISTIC NEURAL NETWORKS FOR UXO CLASSIFICATION

    EPA Science Inventory

    A probabilistic neural network (PNN) has been applied to the detection and classification of unexploded ordnance (UXO) measured using magnetometry data collected using the Multi-sensor Towed Array Detection System (MTADS). Physical parameters obtained from a physics based modeler...

  10. TEMPI: probabilistic modeling time-evolving differential PPI networks with multiPle information

    PubMed Central

    Kim, Yongsoo; Jang, Jin-Hyeok; Choi, Seungjin; Hwang, Daehee

    2014-01-01

    Motivation: Time-evolving differential protein–protein interaction (PPI) networks are essential to understand serial activation of differentially regulated (up- or downregulated) cellular processes (DRPs) and their interplays over time. Despite developments in the network inference, current methods are still limited in identifying temporal transition of structures of PPI networks, DRPs associated with the structural transition and the interplays among the DRPs over time. Results: Here, we present a probabilistic model for estimating Time-Evolving differential PPI networks with MultiPle Information (TEMPI). This model describes probabilistic relationships among network structures, time-course gene expression data and Gene Ontology biological processes (GOBPs). By maximizing the likelihood of the probabilistic model, TEMPI estimates jointly the time-evolving differential PPI networks (TDNs) describing temporal transition of PPI network structures together with serial activation of DRPs associated with transiting networks. This joint estimation enables us to interpret the TDNs in terms of temporal transition of the DRPs. To demonstrate the utility of TEMPI, we applied it to two time-course datasets. TEMPI identified the TDNs that correctly delineated temporal transition of DRPs and time-dependent associations between the DRPs. These TDNs provide hypotheses for mechanisms underlying serial activation of key DRPs and their temporal associations. Availability and implementation: Source code and sample data files are available at http://sbm.postech.ac.kr/tempi/sources.zip. Contact: seungjin@postech.ac.kr or dhwang@dgist.ac.kr Supplementary information: Supplementary data are available at Bioinformatics online. PMID:25161233

  11. Modeling the impact of flexible textile composites through multiscale and probabilistic methods

    NASA Astrophysics Data System (ADS)

    Nilakantan, Gaurav

    Flexible textile composites or fabrics comprised of materials such as Kevlar are used in impact and penetration resistant structures such as protective clothing for law enforcement and military personnel. The penetration response of these fabrics is probabilistic in nature and experimentally characterized through parameters such as the V0 and the V50 velocity. In this research a probabilistic computational framework is developed through which the entire V0- V100 velocity curve or probabilistic velocity response (PVR) curve can be numerically determined through a series of finite element (FE) impact simulations. Sources of variability that affect the PVR curve are isolated for investigation, which in this study is chosen as the statistical nature of yarn tensile strengths. Experimental tensile testing is conducted on spooled and fabric-extracted Kevlar yarns. The statistically characterized strengths are then mapped onto the yarns of the fabric FE model as part of the probabilistic computational framework. The effects of projectile characteristics such as size and shape on the fabric PVR curve are studied. A multiscale modeling technique entitled the Hybrid Element Analysis (HEA) is developed to reduce the computational requirements of a fabric model based on a yarn level architecture discretized with only solid elements. This technique combines into a single FE model both a local region of solid and shell element based yarn level architecture, and a global region of shell element based membrane level architecture, with impedance matched interfaces. The multiscale model is then incorporated into the probabilistic computational framework. A yarn model comprised of a filament level architecture is developed to investigate the feasibility of solid element based homogenized yarn models as well as the effect of filament spreading and inter-filament friction on the impact response. Results from preliminary experimental fabric impact testing are also presented. This
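
    A simplified illustration of turning penetrate/no-penetrate outcomes into a probabilistic velocity response curve (not the author's finite element framework): a logistic curve is fitted to synthetic impact trials, from which V50 and the curve tails approximating V0 and V100 can be read. The velocities, trial counts and curve shape are assumptions.

```python
# Hedged sketch: fit a logistic PVR curve to synthetic penetrate / no-penetrate data.
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(5)
v = np.repeat(np.linspace(400.0, 700.0, 13), 20)      # impact velocities (m/s), assumed trials
true_v50, true_k = 560.0, 0.03
penetrated = rng.random(v.size) < 1.0 / (1.0 + np.exp(-true_k * (v - true_v50)))

def logistic(vel, v50, k):
    return 1.0 / (1.0 + np.exp(-k * (vel - v50)))

(v50, k), _ = curve_fit(logistic, v, penetrated.astype(float), p0=[550.0, 0.02])
print(f"estimated V50 ≈ {v50:.0f} m/s")               # V0/V100 read from the curve tails
```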

  12. Probabilistically Constraining Age-Depth-Models of Glaciogenic Sediments

    NASA Astrophysics Data System (ADS)

    Werner, J.; van der Bilt, W.; Tingley, M.

    2015-12-01

    Reconstructions of the late-Holocene climate rely heavily upon proxies that are assumed to be accurately dated by layer counting. All of these proxies, such as measurements of tree rings, ice cores, and varved lake sediments do carry some inherent dating uncertainty that is not always fully accounted for. Considerable advances could be achieved if time uncertainties were recognized and correctly modeled, also for proxies commonly treated as free of age model errors. Current approaches for accounting for time uncertainty are generally limited to repeating the reconstruction using each one of an ensemble of age models, thereby inflating the final estimated uncertainty - in effect, each possible age model is given equal weighting. Uncertainties can be reduced by exploiting the inferred space-time covariance structure of the climate to re-weight the possible age models. Werner and Tingley (2015) demonstrated how Bayesian hierarchical climate reconstruction models can be augmented to account for time-uncertain proxies. In their method, probabilities associated with the age models are formally updated within the Bayesian framework, thereby reducing uncertainties. Numerical experiments (Werner and Tingley 2015) show that updating the age model probabilities decreases uncertainty in the resulting reconstructions, as compared with the current de facto standard of sampling over all age models, provided there is sufficient information from other data sources in the spatial region of the time-uncertain proxy. We show how this novel method can be applied to high resolution, sub-annually sampled lacustrine sediment records to constrain their respective age depth models. The results help to quantify the signal content and extract the regionally representative signal. The single time series can then be used as the basis for a reconstruction of glacial activity. van der Bilt et al. in prep. Werner, J.P. and Tingley, M.P. Clim. Past (2015)

  13. Dependence in probabilistic modeling, Dempster-Shafer theory, and probability bounds analysis.

    SciTech Connect

    Oberkampf, William Louis; Tucker, W. Troy; Zhang, Jianzhong; Ginzburg, Lev; Berleant, Daniel J.; Ferson, Scott; Hajagos, Janos; Nelsen, Roger B.

    2004-10-01

    This report summarizes methods to incorporate information (or lack of information) about inter-variable dependence into risk assessments that use Dempster-Shafer theory or probability bounds analysis to address epistemic and aleatory uncertainty. The report reviews techniques for simulating correlated variates for a given correlation measure and dependence model, computation of bounds on distribution functions under a specified dependence model, formulation of parametric and empirical dependence models, and bounding approaches that can be used when information about the intervariable dependence is incomplete. The report also reviews several of the most pervasive and dangerous myths among risk analysts about dependence in probabilistic models.

  14. Rule Learning with Probabilistic Smoothing

    NASA Astrophysics Data System (ADS)

    Costa, Gianni; Guarascio, Massimo; Manco, Giuseppe; Ortale, Riccardo; Ritacco, Ettore

    A hierarchical classification framework is proposed for discriminating rare classes in imprecise domains, characterized by rarity (of both classes and cases), noise and low class separability. The devised framework couples the rules of a rule-based classifier with as many local probabilistic generative models. These are trained over the coverage of the corresponding rules to better catch those globally rare cases/classes that become less rare in the coverage. Two novel schemes for tightly integrating rule-based and probabilistic classification are introduced, that classify unlabeled cases by considering multiple classifier rules as well as their local probabilistic counterparts. An intensive evaluation shows that the proposed framework is competitive and often superior in accuracy w.r.t. established competitors, while overcoming them in dealing with rare classes.

  15. Probabilistic approach to the Bak-Sneppen model

    NASA Astrophysics Data System (ADS)

    Caldarelli, G.; Felici, M.; Gabrielli, A.; Pietronero, L.

    2002-04-01

    We study here the Bak-Sneppen model, a prototype model for the study of self-organized criticality. In this model several species interact and undergo extinction with a power-law distribution of activity bursts. Species are defined through their ``fitness'' whose distribution in the system is uniform above a certain threshold. Run-time statistics are introduced for the analysis of the dynamics in order to explain the peculiar properties of the model. This approach, based on conditional probability theory, takes into account the correlations due to memory effects. In this way, we may compute analytically the value of the fitness threshold with the desired precision. This represents a substantial improvement with respect to the traditional mean field approach.
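
    For readers unfamiliar with the model, a minimal standard simulation of the one-dimensional Bak-Sneppen dynamics (not the run-time statistics analysis of the paper) shows how the fitness threshold near 0.667 emerges; the system size and step counts are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200                          # species arranged on a ring
fitness = rng.random(N)

steps, burn_in = 200_000, 50_000
min_history = []

for t in range(steps):
    i = int(np.argmin(fitness))                # least-fit species
    for j in (i - 1, i, (i + 1) % N):          # replace it and its neighbours
        fitness[j] = rng.random()
    if t >= burn_in:
        min_history.append(fitness.min())

# In the critical state nearly all fitness values sit above a threshold;
# a high quantile of the minimum-fitness history approximates it.
print("estimated fitness threshold ~",
      round(float(np.quantile(min_history, 0.99)), 3))
```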

  16. Probabilistic Modeling of Loran-C for nonprecision approaches

    NASA Technical Reports Server (NTRS)

    Einhorn, John K.

    1987-01-01

    The overall idea of the research was to predict the errors to be encountered during an approach using available data from the U.S. Coast Guard and standard normal distribution probability analysis for a number of airports in the North East CONUS. The research consists of two parts: an analytical model that predicts the probability of an approach falling within a given standard, and a series of flight tests designed to test the validity of the model.

  17. Optical systems integrated modeling

    NASA Technical Reports Server (NTRS)

    Shannon, Robert R.; Laskin, Robert A.; Brewer, SI; Burrows, Chris; Epps, Harlan; Illingworth, Garth; Korsch, Dietrich; Levine, B. Martin; Mahajan, Vini; Rimmer, Chuck

    1992-01-01

    An integrated modeling capability that provides the tools by which entire optical systems and instruments can be simulated and optimized is a key technology development, applicable to all mission classes, especially astrophysics. Many of the future missions require optical systems that are physically much larger than anything flown before and yet must retain the characteristic sub-micron diffraction limited wavefront accuracy of their smaller precursors. It is no longer feasible to follow the path of 'cut and test' development; the sheer scale of these systems precludes many of the older techniques that rely upon ground evaluation of full size engineering units. The ability to accurately model (by computer) and optimize the entire flight system's integrated structural, thermal, and dynamic characteristics is essential. Two distinct integrated modeling capabilities are required. These are an initial design capability and a detailed design and optimization system. The content of an initial design package is shown. It would be a modular, workstation based code which allows preliminary integrated system analysis and trade studies to be carried out quickly by a single engineer or a small design team. A simple concept for a detailed design and optimization system is shown. This is a linkage of interface architecture that allows efficient interchange of information between existing large specialized optical, control, thermal, and structural design codes. The computing environment would be a network of large mainframe machines and its users would be project level design teams. More advanced concepts for detailed design systems would support interaction between modules and automated optimization of the entire system. Technology assessment and development plans for integrated package for initial design, interface development for detailed optimization, validation, and modeling research are presented.

  18. Structural damage measure index based on non-probabilistic reliability model

    NASA Astrophysics Data System (ADS)

    Wang, Xiaojun; Xia, Yong; Zhou, Xiaoqing; Yang, Chen

    2014-02-01

    Uncertainties in the structural model and measurement data affect structural condition assessment in practice. As probabilistic information about these uncertainties is lacking, a non-probabilistic interval analysis framework is developed to quantify the intervals of the structural element stiffness parameters. According to the interval intersection of the element stiffness parameters in the undamaged and damaged states, the possibility of damage existence is defined based on reliability theory. A damage measure index is then proposed as the product of the nominal stiffness reduction and the defined possibility of damage existence. This new index simultaneously reflects the damage severity and the possibility of damage at each structural component. Numerical and experimental examples are presented to illustrate the validity and applicability of the method. The results show that the proposed method can improve the accuracy of damage diagnosis compared with the deterministic damage identification method.
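
    A minimal numerical sketch of such an index follows; the interval bounds, the uniform sampling over the intervals, and the use of interval midpoints as nominal values are illustrative assumptions rather than the paper's exact reliability formulation.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical interval estimates of one normalized element stiffness
# parameter identified in the undamaged and damaged states.
undamaged = (0.95, 1.05)   # [lower, upper]
damaged = (0.80, 1.00)     # [lower, upper]

# Possibility of damage existence: chance that the damaged-state stiffness
# falls below the undamaged-state stiffness, approximated here by Monte
# Carlo sampling with uniform distributions over the two intervals.
n = 100_000
ku = rng.uniform(*undamaged, n)
kd = rng.uniform(*damaged, n)
possibility = float(np.mean(kd < ku))

# Nominal stiffness reduction from the interval midpoints.
nominal_u, nominal_d = np.mean(undamaged), np.mean(damaged)
reduction = (nominal_u - nominal_d) / nominal_u

damage_index = reduction * possibility
print(f"possibility = {possibility:.2f}, reduction = {reduction:.2f}, "
      f"index = {damage_index:.3f}")
```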

  19. Probabilistic formalism and hierarchy of models for polydispersed turbulent two-phase flows

    NASA Astrophysics Data System (ADS)

    Peirano, Eric; Minier, Jean-Pierre

    2002-04-01

    This paper deals with a probabilistic approach to polydispersed turbulent two-phase flows following the suggestions of Pozorski and Minier [Phys. Rev. E 59, 855 (1999)]. A general probabilistic formalism is presented in the form of a two-point Lagrangian PDF (probability density function). A new feature of the present approach is that both phases, the fluid as well as the particles, are included in the PDF description. It is demonstrated how the formalism can be used to show that there exists a hierarchy between the classical approaches such as the Eulerian and Lagrangian methods. It is also shown that the Eulerian and Lagrangian models can be obtained in a systematic way from the PDF formalism. Connections with previous papers are discussed.

  20. A fully probabilistic approach to extreme rainfall modeling

    NASA Astrophysics Data System (ADS)

    Coles, Stuart; Pericchi, Luis Raúl; Sisson, Scott

    2003-03-01

    It is an embarrassingly frequent experience that statistical practice fails to foresee historical disasters. It is all too easy to blame global trends or some sort of external intervention, but in this article we argue that statistical methods that do not take comprehensive account of the uncertainties involved in both model and predictions are bound to produce an over-optimistic appraisal of future extremes that is often contradicted by observed hydrological events. Based on the annual and daily rainfall data on the central coast of Venezuela, different modeling strategies and inference approaches show that the 1999 rainfall, which caused the worst environmentally related tragedy in Venezuelan history, was extreme, but not implausible given the historical evidence. We follow in turn a classical likelihood and a Bayesian approach, arguing that the latter is the most natural approach for taking into account all uncertainties. In each case we emphasize the importance of making inference on predicted levels of the process rather than model parameters. Our most detailed model comprises seasons with unknown starting points and durations for the extremes of daily rainfall, whose behavior is described using a standard threshold model. Based on a Bayesian analysis of this model, so that both prediction uncertainty and process heterogeneity are properly modeled, we find that the 1999 event has a sizeable probability, which implies that such an occurrence within a reasonably short time horizon could have been anticipated. Finally, since accumulation of extreme rainfall over several days is an additional difficulty (indeed, the catastrophe of 1999 was exacerbated by heavy rainfall on successive days), we examine the effect of timescale on our broad conclusions, finding results to be broadly similar across different choices.
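
    As a sketch of the threshold-modeling step only (a classical likelihood fit, not the seasonal Bayesian model of the paper), the code below fits a generalized Pareto distribution to excesses over a high threshold in a synthetic daily rainfall record and estimates the chance of an extreme daily total; the gamma data generator and the 150 mm level are arbitrary illustrations.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Synthetic "daily rainfall" record in mm (an assumption for this sketch).
daily = rng.gamma(shape=0.6, scale=12.0, size=30 * 365)

threshold = np.quantile(daily, 0.98)            # high threshold
excesses = daily[daily > threshold] - threshold
rate = len(excesses) / len(daily)               # exceedance rate per day

# Fit a generalized Pareto distribution to the excesses (location fixed at 0).
shape, _, scale = stats.genpareto.fit(excesses, floc=0)

# Probability that a given day exceeds an extreme level, e.g. 150 mm.
level = 150.0
p_day = rate * stats.genpareto.sf(level - threshold, shape, loc=0, scale=scale)
print(f"threshold = {threshold:.1f} mm, GPD shape = {shape:.2f}")
print(f"P(daily rainfall > {level:.0f} mm) ~ {p_day:.2e} per day")
```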

  1. Improved seismic risk assessment based on probabilistic multi-source information integration

    NASA Astrophysics Data System (ADS)

    Pittore, M.; Wieland, M.; Duisheev, A.; Yasunov, P.

    2012-04-01

    Earthquakes threaten millions of people all over the world. Assessing seismic risk, defined as the probability of occurrence of economic and social losses as a consequence of an earthquake, both at regional and at local scale, is a challenging, multi-disciplinary task. In order to provide reliable estimates, diverse information must be gathered by seismologists, geologists, engineers and civil authorities and carefully integrated, taking into account the different uncertainties and the inherent spatio-temporal variability. An efficient and reliable assessment of the assets exposed to seismic hazard and of the structural and social components of vulnerability is of particular importance, in order to undertake proper mitigation actions and to promptly and efficiently react to a possibly catastrophic natural event. An original approach is presented to assess seismic vulnerability and risk based on the integration of information coming from several heterogeneous sources: remotely-sensed and ground-based panoramic images, manual digitization, already available information and expert knowledge. A Bayesian approach has been introduced to take into account the collected information while preserving priors and subjective judgment. In the broad perspective of GEM (Global Earthquake Model) and more specifically within the EMCA (Earthquake Model Central Asia) project, an integrated, sound approach to seismic risk in countries with limited resources is an important but rewarding challenge. Improved vulnerability and risk models for the capital cities of Kyrgyzstan and Tajikistan, and their application in earthquake scenarios, will be discussed.

  2. Probabilistic earthquake early warning in complex earth models using prior sampling

    NASA Astrophysics Data System (ADS)

    Valentine, Andrew; Käufl, Paul; Trampert, Jeannot

    2016-04-01

    In an earthquake early warning (EEW) context, we must draw inferences from small, noisy seismic datasets within an extremely limited time-frame. Ideally, a probabilistic framework would be used, to recognise that available observations may be compatible with a range of outcomes, and analysis would be conducted in a theoretically-complete physical framework. However, implementing these requirements has been challenging, as they tend to increase computational demands beyond what is feasible on EEW timescales. We present a new approach, based on 'prior sampling', which implements probabilistic inversion as a two stage process, and can be used for EEW monitoring within a given region. First, a large set of synthetic data is computed for randomly-distributed seismic sources within the region. A learning algorithm is used to infer details of the probability distribution linking observations and model parameters (including location, magnitude, and focal mechanism). This procedure is computationally expensive, but can be conducted entirely before monitoring commences. In the second stage, as observations are obtained, the algorithm can be evaluated within milliseconds to output a probabilistic representation of the corresponding source model. We demonstrate that this gives robust results, and can be implemented using state-of-the-art 3D wave propagation simulations, and complex crustal structures.
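
    The two-stage idea can be illustrated with a deliberately simplified stand-in for the learning algorithm: synthetic sources and noisy observations are generated offline, and at alert time a nearest-neighbour lookup in observation space returns an empirical distribution over compatible source parameters. The forward model, noise level, and parameter ranges below are placeholders, not the 3D wave-propagation simulations of the study.

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(7)

# Stage 1 (offline): sample sources from the prior and simulate observations.
n_prior = 50_000
sources = np.column_stack([
    rng.uniform(0.0, 1.0, n_prior),    # normalized epicentral coordinate
    rng.uniform(0.0, 30.0, n_prior),   # depth (km)
    rng.uniform(3.0, 7.0, n_prior),    # magnitude
])

def forward(src):
    """Placeholder forward model: four 'station amplitudes' per source."""
    x, dep, mag = src.T
    d = np.sqrt((x[:, None] - np.linspace(0, 1, 4)) ** 2
                + (dep[:, None] / 100.0) ** 2)
    return mag[:, None] - 2.0 * np.log10(d + 0.01)

obs_synth = forward(sources) + rng.normal(0.0, 0.2, (n_prior, 4))
index = NearestNeighbors(n_neighbors=200).fit(obs_synth)

# Stage 2 (online, milliseconds): query with an incoming observation vector.
new_obs = obs_synth[123] + rng.normal(0.0, 0.2, 4)
_, nbr = index.kneighbors(new_obs[None, :])
posterior_samples = sources[nbr[0]]
print("magnitude estimate: %.2f +/- %.2f"
      % (posterior_samples[:, 2].mean(), posterior_samples[:, 2].std()))
```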

  3. Event-Based Media Enrichment Using an Adaptive Probabilistic Hypergraph Model.

    PubMed

    Liu, Xueliang; Wang, Meng; Yin, Bao-Cai; Huet, Benoit; Li, Xuelong

    2015-11-01

    Nowadays, with the continual development of digital capture technologies and social media services, a vast number of media documents are captured and shared online to help attendees record their experience during events. In this paper, we present a method combining semantic inference and multimodal analysis for automatically finding media content to illustrate events using an adaptive probabilistic hypergraph model. In this model, media items are taken as vertices in the weighted hypergraph and the task of enriching media to illustrate events is formulated as a ranking problem. In our method, each hyperedge is constructed using the K-nearest neighbors of a given media document. We also employ a probabilistic representation, which assigns each vertex to a hyperedge in a probabilistic way, to further exploit the correlation among media data. Furthermore, we optimize the hypergraph weights in a regularization framework, which is solved as a second-order cone problem. The approach is initiated by seed media and then used to rank the media documents using a transductive inference process. The results obtained from validating the approach on an event dataset collected from EventMedia demonstrate the effectiveness of the proposed approach. PMID:26470061

  4. A Coupled Probabilistic Wake Vortex and Aircraft Response Prediction Model

    NASA Technical Reports Server (NTRS)

    Gloudemans, Thijs; Van Lochem, Sander; Ras, Eelco; Malissa, Joel; Ahmad, Nashat N.; Lewis, Timothy A.

    2016-01-01

    Wake vortex spacing standards, along with weather and runway occupancy time, restrict terminal area throughput and impose major constraints on the overall capacity and efficiency of the National Airspace System (NAS). For more than two decades, the National Aeronautics and Space Administration (NASA) has been conducting research on characterizing wake vortex behavior in order to develop fast-time wake transport and decay prediction models. It is expected that the models can be used in the systems-level design of advanced air traffic management (ATM) concepts that safely increase the capacity of the NAS. It is also envisioned that at a later stage of maturity, these models could potentially be used operationally, in ground-based spacing and scheduling systems as well as on the flight deck.

  5. Urban stormwater management planning with analytical probabilistic models

    SciTech Connect

    Adams, B.J.

    2000-07-01

    Understanding how to properly manage urban stormwater is a critical concern to civil and environmental engineers the world over. Mismanagement of stormwater and urban runoff results in flooding, erosion, and water quality problems. In an effort to develop better management techniques, engineers have come to rely on computer simulation and advanced mathematical modeling techniques to help plan and predict water system performance. This important book outlines a new method that uses probability tools to model how stormwater behaves and interacts in a combined- or single-system municipal water system. Complete with sample problems and case studies illustrating how concepts really work, the book presents a cost-effective, easy-to-master approach to analytical modeling of stormwater management systems.

  6. Testing for ontological errors in probabilistic forecasting models of natural systems.

    PubMed

    Marzocchi, Warner; Jordan, Thomas H

    2014-08-19

    Probabilistic forecasting models describe the aleatory variability of natural systems as well as our epistemic uncertainty about how the systems work. Testing a model against observations exposes ontological errors in the representation of a system and its uncertainties. We clarify several conceptual issues regarding the testing of probabilistic forecasting models for ontological errors: the ambiguity of the aleatory/epistemic dichotomy, the quantification of uncertainties as degrees of belief, the interplay between Bayesian and frequentist methods, and the scientific pathway for capturing predictability. We show that testability of the ontological null hypothesis derives from an experimental concept, external to the model, that identifies collections of data, observed and not yet observed, that are judged to be exchangeable when conditioned on a set of explanatory variables. These conditional exchangeability judgments specify observations with well-defined frequencies. Any model predicting these behaviors can thus be tested for ontological error by frequentist methods; e.g., using P values. In the forecasting problem, prior predictive model checking, rather than posterior predictive checking, is desirable because it provides more severe tests. We illustrate experimental concepts using examples from probabilistic seismic hazard analysis. Severe testing of a model under an appropriate set of experimental concepts is the key to model validation, in which we seek to know whether a model replicates the data-generating process well enough to be sufficiently reliable for some useful purpose, such as long-term seismic forecasting. Pessimistic views of system predictability fail to recognize the power of this methodology in separating predictable behaviors from those that are not. PMID:25097265

  7. Testing for ontological errors in probabilistic forecasting models of natural systems

    PubMed Central

    Marzocchi, Warner; Jordan, Thomas H.

    2014-01-01

    Probabilistic forecasting models describe the aleatory variability of natural systems as well as our epistemic uncertainty about how the systems work. Testing a model against observations exposes ontological errors in the representation of a system and its uncertainties. We clarify several conceptual issues regarding the testing of probabilistic forecasting models for ontological errors: the ambiguity of the aleatory/epistemic dichotomy, the quantification of uncertainties as degrees of belief, the interplay between Bayesian and frequentist methods, and the scientific pathway for capturing predictability. We show that testability of the ontological null hypothesis derives from an experimental concept, external to the model, that identifies collections of data, observed and not yet observed, that are judged to be exchangeable when conditioned on a set of explanatory variables. These conditional exchangeability judgments specify observations with well-defined frequencies. Any model predicting these behaviors can thus be tested for ontological error by frequentist methods; e.g., using P values. In the forecasting problem, prior predictive model checking, rather than posterior predictive checking, is desirable because it provides more severe tests. We illustrate experimental concepts using examples from probabilistic seismic hazard analysis. Severe testing of a model under an appropriate set of experimental concepts is the key to model validation, in which we seek to know whether a model replicates the data-generating process well enough to be sufficiently reliable for some useful purpose, such as long-term seismic forecasting. Pessimistic views of system predictability fail to recognize the power of this methodology in separating predictable behaviors from those that are not. PMID:25097265

  8. Probabilistic uncertainty analysis of epidemiological modeling to guide public health intervention policy.

    PubMed

    Gilbert, Jennifer A; Meyers, Lauren Ancel; Galvani, Alison P; Townsend, Jeffrey P

    2014-03-01

    Mathematical modeling of disease transmission has provided quantitative predictions for health policy, facilitating the evaluation of epidemiological outcomes and the cost-effectiveness of interventions. However, typical sensitivity analyses of deterministic dynamic infectious disease models focus on model architecture and the relative importance of parameters but neglect parameter uncertainty when reporting model predictions. Consequently, model results that identify point estimates of intervention levels necessary to terminate transmission yield limited insight into the probability of success. We apply probabilistic uncertainty analysis to a dynamic model of influenza transmission and assess global uncertainty in outcome. We illustrate that when parameter uncertainty is not incorporated into outcome estimates, levels of vaccination and treatment predicted to prevent an influenza epidemic will only have an approximately 50% chance of terminating transmission and that sensitivity analysis alone is not sufficient to obtain this information. We demonstrate that accounting for parameter uncertainty yields probabilities of epidemiological outcomes based on the degree to which data support the range of model predictions. Unlike typical sensitivity analyses of dynamic models that only address variation in parameters, the probabilistic uncertainty analysis described here enables modelers to convey the robustness of their predictions to policy makers, extending the power of epidemiological modeling to improve public health. PMID:24593920
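
    The central point can be reproduced with a toy calculation: if the basic reproduction number is uncertain, a vaccination level sized to be just sufficient at the point estimate terminates transmission in only about half of the parameter draws. The lognormal uncertainty and the point estimate of 1.8 below are illustrative assumptions, not the influenza model of the study.

```python
import numpy as np

rng = np.random.default_rng(3)

# Uncertain basic reproduction number (illustrative lognormal around 1.8).
R0 = rng.lognormal(mean=np.log(1.8), sigma=0.15, size=100_000)

# Vaccination coverage chosen from the point estimate: vc = 1 - 1/R0_hat.
R0_hat = 1.8
coverage = 1.0 - 1.0 / R0_hat

# Effective reproduction number under that coverage, for each parameter draw.
Re = R0 * (1.0 - coverage)
p_terminate = float(np.mean(Re < 1.0))

print(f"critical coverage at the point estimate: {coverage:.1%}")
print(f"probability that this coverage terminates transmission: {p_terminate:.2f}")
```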

  9. A simple probabilistic model of submicroscopic diatom morphogenesis

    PubMed Central

    Willis, L.; Cox, E. J.; Duke, T.

    2013-01-01

    Unicellular algae called diatoms morph biomineral compounds into tough exoskeletons via complex intracellular processes about which there is much to be learned. These exoskeletons feature a rich variety of structures from submicroscale to milliscale, many that have not been reproduced in vitro. In order to help understand this complex miniature morphogenesis, here we introduce and analyse a simple model of biomineral kinetics, focusing on the exoskeleton's submicroscopic patterned planar structures called pore occlusions. The model reproduces most features of these pore occlusions by retuning just one parameter, thereby indicating what physio-biochemical mechanisms could sufficiently explain morphogenesis at the submicroscopic scale: it is sufficient to identify a mechanism of lateral negative feedback on the biomineral reaction kinetics. The model is nonlinear and stochastic; it is an extended version of the threshold voter model. Its mean-field equation provides a simple and, as far as the authors are aware, new way of mapping out the spatial patterns produced by lateral inhibition and variants thereof. PMID:23554345

  10. Rock penetration : finite element sensitivity and probabilistic modeling analyses.

    SciTech Connect

    Fossum, Arlo Frederick

    2004-08-01

    This report summarizes numerical analyses conducted to assess the relative importance on penetration depth calculations of rock constitutive model physics features representing the presence of microscale flaws such as porosity and networks of microcracks and rock mass structural features. Three-dimensional, nonlinear, transient dynamic finite element penetration simulations are made with a realistic geomaterial constitutive model to determine which features have the most influence on penetration depth calculations. A baseline penetration calculation is made with a representative set of material parameters evaluated from measurements made from laboratory experiments conducted on a familiar sedimentary rock. Then, a sequence of perturbations of various material parameters allows an assessment to be made of the main penetration effects. A cumulative probability distribution function is calculated with the use of an advanced reliability method that makes use of this sensitivity database, probability density functions, and coefficients of variation of the key controlling parameters for penetration depth predictions. Thus the variability of the calculated penetration depth is known as a function of the variability of the input parameters. This simulation modeling capability should impact significantly the tools that are needed to design enhanced penetrator systems, support weapons effects studies, and directly address proposed HDBT defeat scenarios.

  11. Integrated Watershed Modeling

    NASA Astrophysics Data System (ADS)

    Bagulho Galvão, P.; Neves, R.; Silva, A.; Chambel Leitão, P.; Braunchweig, F.

    2004-05-01

    Integrated systems that bring together EO data, local measurements and modeling tools are a fundamental instrument to help decision making in watershed and land use management. The BASINS system (EPA http://www.epa.gov/OST/BASINS/) follows this philosophy, merging data from local measurements with modeling tools (HSPF, SWAT, PLOAD, QUAL2E). However, remotely sensed data are still used in a very static way (usually to define land cover; see the CORINE land cover project). This approach is being replaced with operational methods that use EO data (such as land surface temperature, vegetation state, soil moisture, surface roughness) for both inputs and validation. The development of integrated watershed models that dynamically interact with remotely sensed data opens interesting prospects for the validation and improvement of such models. This paper describes the possible contribution of remote sensing data to the needs associated with state-of-the-art watershed models, including well-known systems (such as SWAT or HSPF) and a system still under development (MOHID LAND). Application of such models is shown at two pilot sites, which were selected under the EU projects TempQsim and Interreg II B - ICRW.

  12. Probabilistic graphical models to deal with age estimation of living persons.

    PubMed

    Sironi, Emanuele; Gallidabino, Matteo; Weyermann, Céline; Taroni, Franco

    2016-03-01

    Due to the rise of criminal, civil and administrative judicial situations involving people lacking valid identity documents, age estimation of living persons has become an important operational procedure for numerous forensic and medicolegal services worldwide. The chronological age of a given person is generally estimated from the observed degree of maturity of some selected physical attributes by means of statistical methods. However, their application in the forensic framework suffers from some conceptual and practical drawbacks, as recently claimed in the specialised literature. The aim of this paper is therefore to offer an alternative solution for overcoming these limits, by reiterating the utility of a probabilistic Bayesian approach for age estimation. This approach allows one to deal in a transparent way with the uncertainty surrounding the age estimation process and to produce all the relevant information in the form of posterior probability distribution about the chronological age of the person under investigation. Furthermore, this probability distribution can also be used for evaluating in a coherent way the possibility that the examined individual is younger or older than a given legal age threshold having a particular legal interest. The main novelty introduced by this work is the development of a probabilistic graphical model, i.e. a Bayesian network, for dealing with the problem at hand. The use of this kind of probabilistic tool can significantly facilitate the application of the proposed methodology: examples are presented based on data related to the ossification status of the medial clavicular epiphysis. The reliability and the advantages of this probabilistic tool are presented and discussed. PMID:25794687
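
    A stripped-down discrete example of the underlying Bayesian reasoning, using a uniform age prior and a made-up likelihood for one observed ossification stage, shows how a posterior over chronological age and the probability of exceeding a legal threshold are obtained; it is not the Bayesian network or the clavicular data of the paper.

```python
import numpy as np

ages = np.arange(14, 26)                        # candidate ages (years)
prior = np.full(len(ages), 1.0 / len(ages))     # uniform prior (assumption)

# Hypothetical P(observed ossification stage | age): the chance of showing
# this particular stage rises smoothly with age.
likelihood = 1.0 / (1.0 + np.exp(-(ages - 19.0)))

posterior = prior * likelihood
posterior /= posterior.sum()

p_over_18 = posterior[ages >= 18].sum()
print("posterior mean age: %.1f years" % np.dot(ages, posterior))
print("P(age >= 18 | observed stage): %.2f" % p_over_18)
```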

  13. Probabilistic modelling of European consumer exposure to cosmetic products.

    PubMed

    McNamara, C; Rohan, D; Golden, D; Gibney, M; Hall, B; Tozer, S; Safford, B; Coroama, M; Leneveu-Duchemin, M C; Steiling, W

    2007-11-01

    In this study, we describe the statistical analysis of the usage profile of the European population to seven cosmetic products. The aim of the study was to construct a reliable model of exposure of the European population from use of the selected products: body lotion, shampoo, deodorant spray, deodorant non-spray, facial moisturiser, lipstick and toothpaste. The first step in this process was to gather reliable data on consumer usage patterns of the products. These data were sourced from a combination of market information databases and a controlled product use study by the trade association Colipa. The market information study contained a large number of subjects, in total 44,100 households and 18,057 habitual users (males and females) of the studied products, in five European countries. The data sets were then combined to generate a realistic distribution of frequency of use of each product, combined with distribution of the amount of product used at each occasion using the CREMe software. A Monte Carlo method was used to combine the data sets. This resulted in a new model of European exposure to cosmetic products being constructed. PMID:17804138
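
    The Monte Carlo combination of usage frequency and amount per use can be sketched as follows; the Poisson and lognormal distributions and the body-weight figures are placeholders standing in for the survey data, not values from the study.

```python
import numpy as np

rng = np.random.default_rng(11)
n = 200_000                                   # simulated consumers

# Placeholder usage distributions for one product (e.g. body lotion).
uses_per_day = rng.poisson(lam=1.2, size=n)                  # frequency of use
grams_per_use = rng.lognormal(mean=np.log(4.0), sigma=0.4, size=n)
body_weight = rng.normal(70.0, 12.0, size=n).clip(40, 120)   # kg

# Daily exposure in mg of product per kg body weight per day.
exposure = uses_per_day * grams_per_use * 1000.0 / body_weight

for q in (0.5, 0.9, 0.99):
    print(f"{q:.0%} of consumers below {np.quantile(exposure, q):.0f} mg/kg bw/day")
```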

  14. Probabilistic models of genetic variation in structured populations applied to global human studies

    PubMed Central

    Hao, Wei; Song, Minsun; Storey, John D.

    2016-01-01

    Motivation: Modern population genetics studies typically involve genome-wide genotyping of individuals from a diverse network of ancestries. An important problem is how to formulate and estimate probabilistic models of observed genotypes that account for complex population structure. The most prominent work on this problem has focused on estimating a model of admixture proportions of ancestral populations for each individual. Here, we instead focus on modeling variation of the genotypes without requiring a higher-level admixture interpretation. Results: We formulate two general probabilistic models, and we propose computationally efficient algorithms to estimate them. First, we show how principal component analysis can be utilized to estimate a general model that includes the well-known Pritchard–Stephens–Donnelly admixture model as a special case. Noting some drawbacks of this approach, we introduce a new ‘logistic factor analysis’ framework that seeks to directly model the logit transformation of probabilities underlying observed genotypes in terms of latent variables that capture population structure. We demonstrate these advances on data from the Human Genome Diversity Panel and 1000 Genomes Project, where we are able to identify SNPs that are highly differentiated with respect to structure while making minimal modeling assumptions. Availability and Implementation: A Bioconductor R package called lfa is available at http://www.bioconductor.org/packages/release/bioc/html/lfa.html. Contact: jstorey@princeton.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:26545820

  15. Probabilistic Multi-Factor Interaction Model for Complex Material Behavior

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Abumeri, Galib H.

    2008-01-01

    The Multi-Factor Interaction Model (MFIM) is used to evaluate the divot weight (foam weight ejected) from the launch external tanks. The multi-factor has sufficient degrees of freedom to evaluate a large number of factors that may contribute to the divot ejection. It also accommodates all interactions by its product form. Each factor has an exponent that satisfies only two points, the initial and final points. The exponent describes a monotonic path from the initial condition to the final. The exponent values are selected so that the described path makes sense in the absence of experimental data. In the present investigation, the data used were obtained by testing simulated specimens in launching conditions. Results show that the MFIM is an effective method of describing the divot weight ejected under the conditions investigated.

  17. Probabilistic Multi-Factor Interaction Model for Complex Material Behavior

    NASA Technical Reports Server (NTRS)

    Abumeri, Galib H.; Chamis, Christos C.

    2010-01-01

    Complex material behavior is represented by a single equation of product form to account for interaction among the various factors. The factors are selected by the physics of the problem and the environment that the model is to represent. For example, different factors will be required for each to represent temperature, moisture, erosion, corrosion, etc. It is important that the equation represent the physics of the behavior in its entirety accurately. The Multi-Factor Interaction Model (MFIM) is used to evaluate the divot weight (foam weight ejected) from the external launch tanks. The multi-factor has sufficient degrees of freedom to evaluate a large number of factors that may contribute to the divot ejection. It also accommodates all interactions by its product form. Each factor has an exponent that satisfies only two points - the initial and final points. The exponent describes a monotonic path from the initial condition to the final. The exponent values are selected so that the described path makes sense in the absence of experimental data. In the present investigation, the data used were obtained by testing simulated specimens in launching conditions. Results show that the MFIM is an effective method of describing the divot weight ejected under the conditions investigated. The problem lies in how to represent the divot weight with a single equation. A unique solution to this problem is a multi-factor equation of product form. Each factor is of the following form: (1 - xi/xf)^ei, where xi is the initial value, usually at ambient conditions, xf is the final value, and ei is the exponent that makes the represented curve unimodal while meeting the initial and final values. The exponents are either evaluated by test data or by technical judgment. A minor disadvantage may be the selection of exponents in the absence of any empirical data. This form has been used successfully in describing the foam ejected in simulated space environmental conditions. Seven factors were required
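
    A short numerical sketch of the product form given above follows; the factor values and exponents are arbitrary illustrations, not the calibrated values used in the external-tank divot analysis.

```python
import numpy as np

def mfim(x_init, x_final, exponents, baseline=1.0):
    """Multi-Factor Interaction Model in product form:
    response = baseline * prod_i (1 - x_i / x_f,i) ** e_i
    """
    x_init, x_final, exponents = map(np.asarray, (x_init, x_final, exponents))
    return baseline * float(np.prod((1.0 - x_init / x_final) ** exponents))

# Three hypothetical factors (e.g. temperature, pressure, void content),
# each given as an initial value x_i and a final (limiting) value x_f.
x_i = [300.0, 0.5, 0.02]
x_f = [600.0, 2.0, 0.10]
e = [0.5, 1.0, 2.0]

print("relative divot-weight measure:", round(mfim(x_i, x_f, e), 3))
```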

  18. A probabilistic model of emphysema based on granulometry analysis

    NASA Astrophysics Data System (ADS)

    Marcos, J. V.; Nava, R.; Cristobal, G.; Munoz-Barrutia, A.; Escalante-Ramírez, B.; Ortiz-de-Solórzano, C.

    2013-11-01

    Emphysema is associated with the destruction of lung parenchyma, resulting in abnormal enlargement of airspaces. Accurate quantification of emphysema is required for a better understanding of the disease as well as for the assessment of drugs and treatments. In the present study, a novel method for emphysema characterization from histological lung images is proposed. Elastase-induced mice were used to simulate the effect of emphysema on the lungs. A database composed of 50 normal and 50 emphysematous lung patches of size 512 x 512 pixels was used in our experiments. The purpose is to automatically identify those patches containing emphysematous tissue. The proposed approach is based on the use of granulometry analysis, which provides the pattern spectrum describing the distribution of airspaces in the lung region under evaluation. The profile of the spectrum was summarized by a set of statistical features. A logistic regression model was then used to estimate the probability for a patch to be emphysematous from this feature set. An accuracy of 87% was achieved by our method in the classification between normal and emphysematous samples. This result shows the utility of our granulometry-based method to quantify the lesions due to emphysema.
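
    A compact sketch of the pipeline (pattern spectrum from successive morphological openings, summary features, logistic regression) is given below on synthetic binary airspace masks; the synthetic image generator and the three summary features are illustrative assumptions rather than the paper's histology data.

```python
import numpy as np
from scipy import ndimage
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(5)

def synthetic_patch(enlarged, size=128):
    """Toy binary airspace mask: fewer but larger blobs mimic emphysema."""
    density = 0.004 if enlarged else 0.02
    radius = 6 if enlarged else 2
    seeds = rng.random((size, size)) < density
    se = np.ones((2 * radius + 1, 2 * radius + 1), dtype=bool)
    return ndimage.binary_dilation(seeds, structure=se)

def pattern_spectrum(mask, max_radius=8):
    """Foreground area removed by openings with growing structuring elements."""
    areas = [mask.sum()]
    for r in range(1, max_radius + 1):
        se = np.ones((2 * r + 1, 2 * r + 1), dtype=bool)
        areas.append(ndimage.binary_opening(mask, structure=se).sum())
    areas = np.asarray(areas, dtype=float)
    return -np.diff(areas) / max(areas[0], 1.0)        # normalized spectrum

X, y = [], []
for label in (0, 1):                          # 0 = normal, 1 = emphysematous
    for _ in range(50):
        spec = pattern_spectrum(synthetic_patch(enlarged=bool(label)))
        X.append([spec.mean(), spec.std(), float(spec.argmax())])
        y.append(label)

clf = LogisticRegression().fit(X, y)
print("training accuracy:", clf.score(X, y))
```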

  19. A probabilistic method for constructing wave time-series at inshore locations using model scenarios

    USGS Publications Warehouse

    Long, Joseph W.; Plant, Nathaniel G.; Dalyander, P. Soupy; Thompson, David M.

    2014-01-01

    Continuous time-series of wave characteristics (height, period, and direction) are constructed using a base set of model scenarios and simple probabilistic methods. This approach utilizes an archive of computationally intensive, highly spatially resolved numerical wave model output to develop time-series of historical or future wave conditions without performing additional, continuous numerical simulations. The archive of model output contains wave simulations from a set of model scenarios derived from an offshore wave climatology. Time-series of wave height, period, direction, and associated uncertainties are constructed at locations included in the numerical model domain. The confidence limits are derived using statistical variability of oceanographic parameters contained in the wave model scenarios. The method was applied to a region in the northern Gulf of Mexico and assessed using wave observations at 12 m and 30 m water depths. Prediction skill for significant wave height is 0.58 and 0.67 at the 12 m and 30 m locations, respectively, with similar performance for wave period and direction. The skill of this simplified, probabilistic time-series construction method is comparable to existing large-scale, high-fidelity operational wave models but provides higher spatial resolution output at low computational expense. The constructed time-series can be developed to support a variety of applications including climate studies and other situations where a comprehensive survey of wave impacts on the coastal area is of interest.

  20. An application of probabilistic safety assessment methods to model aircraft systems and accidents

    SciTech Connect

    Martinez-Guridi, G.; Hall, R.E.; Fullwood, R.R.

    1998-08-01

    A case study modeling the thrust reverser system (TRS) in the context of the fatal accident of a Boeing 767 is presented to illustrate the application of Probabilistic Safety Assessment methods. A simplified risk model consisting of an event tree with supporting fault trees was developed to represent the progression of the accident, taking into account the interaction between the TRS and the operating crew during the accident, and the findings of the accident investigation. A feasible sequence of events leading to the fatal accident was identified. Several insights about the TRS and the accident were obtained by applying PSA methods. Changes proposed for the TRS also are discussed.

  1. A probabilistic model of the electron transport in films of nanocrystals arranged in a cubic lattice

    NASA Astrophysics Data System (ADS)

    Kriegel, Ilka; Scotognella, Francesco

    2016-08-01

    The fabrication of nanocrystal (NC) films, starting from colloidal dispersions, is a very attractive topic in the condensed matter physics community. NC films can be employed for transistors, light-emitting diodes, lasers, and solar cells. For this reason the understanding of the film conductivity is of major importance. In this paper we describe a probabilistic model that allows prediction of the conductivity of NC films, in this case a cubic lattice of lead selenide NCs. The model is based on the hopping probability between NCs, and we show a comparison with experimental data reported in the literature.

  2. Probabilistic topic modeling for the analysis and classification of genomic sequences

    PubMed Central

    2015-01-01

    Background Studies on genomic sequences for classification and taxonomic identification have a leading role in the biomedical field and in the analysis of biodiversity. These studies focus on the so-called barcode genes, representing a well defined region of the whole genome. Recently, alignment-free techniques have been gaining importance because they are able to overcome the drawbacks of sequence alignment techniques. In this paper a new alignment-free method for DNA sequence clustering and classification is proposed. The method is based on k-mers representation and text mining techniques. Methods The presented method is based on Probabilistic Topic Modeling, a statistical technique originally proposed for text documents. Probabilistic topic models are able to find in a document corpus the topics (recurrent themes) characterizing classes of documents. This technique, applied on DNA sequences representing the documents, exploits the frequency of fixed-length k-mers and builds a generative model for a training group of sequences. This generative model, obtained through the Latent Dirichlet Allocation (LDA) algorithm, is then used to classify a large set of genomic sequences. Results and conclusions We performed classification of over 7000 16S DNA barcode sequences taken from the Ribosomal Database Project (RDP) repository, training probabilistic topic models. The proposed method is compared to the RDP tool and the Support Vector Machine (SVM) classification algorithm in an extensive set of trials using both complete sequences and short sequence snippets (from 400 bp to 25 bp). Our method reaches very similar results to the RDP classifier and SVM for complete sequences. The most interesting results are obtained when short sequence snippets are considered. In these conditions the proposed method outperforms RDP and SVM with ultra-short sequences and exhibits a smooth decrease of performance, at every taxonomic level, when the sequence length is decreased. PMID:25916734
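
    A minimal sketch of the k-mer plus topic-model pipeline, using scikit-learn rather than the exact LDA implementation and RDP data of the paper, could look as follows; the randomly generated toy sequences are placeholders for real barcode reads.

```python
import numpy as np
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

rng = np.random.default_rng(2)

def random_seq(base_probs, length=400):
    """Toy DNA sequence with a given base composition (placeholder data)."""
    return "".join(rng.choice(list("ACGT"), p=base_probs, size=length))

train = [random_seq([0.4, 0.1, 0.1, 0.4]) for _ in range(30)] + \
        [random_seq([0.1, 0.4, 0.4, 0.1]) for _ in range(30)]

# Represent each sequence by counts of fixed-length k-mers (here k = 4).
k = 4
vectorizer = CountVectorizer(analyzer="char", ngram_range=(k, k), lowercase=False)
X = vectorizer.fit_transform(train)

# Fit a probabilistic topic model; each topic is a distribution over k-mers.
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)

# Topic mixture of an unseen short snippet (mimicking a 100 bp fragment).
snippet = random_seq([0.4, 0.1, 0.1, 0.4], length=100)
print(lda.transform(vectorizer.transform([snippet])).round(2))
```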

  3. Integrable models and combinatorics

    NASA Astrophysics Data System (ADS)

    Bogolyubov, N. M.; Malyshev, C. L.

    2015-10-01

    Relations between quantum integrable models solvable by the quantum inverse scattering method and some aspects of enumerative combinatorics and partition theory are discussed. The main example is the Heisenberg XXZ spin chain in the limit cases of zero or infinite anisotropy. Form factors and some thermal correlation functions are calculated, and it is shown that the resulting form factors in a special q-parametrization are the generating functions for plane partitions and self-avoiding lattice paths. The asymptotic behaviour of the correlation functions is studied in the case of a large number of sites and a moderately large number of spin excitations. For sufficiently low temperature a relation is established between the correlation functions and the theory of matrix integrals. Bibliography: 125 titles.

  4. Probabilistic modeling of percutaneous absorption for risk-based exposure assessments and transdermal drug delivery.

    SciTech Connect

    Ho, Clifford Kuofei

    2004-06-01

    Chemical transport through human skin can play a significant role in human exposure to toxic chemicals in the workplace, as well as to chemical/biological warfare agents in the battlefield. The viability of transdermal drug delivery also relies on chemical transport processes through the skin. Models of percutaneous absorption are needed for risk-based exposure assessments and drug-delivery analyses, but previous mechanistic models have been largely deterministic. A probabilistic, transient, three-phase model of percutaneous absorption of chemicals has been developed to assess the relative importance of uncertain parameters and processes that may be important to risk-based assessments. Penetration routes through the skin that were modeled include the following: (1) intercellular diffusion through the multiphase stratum corneum; (2) aqueous-phase diffusion through sweat ducts; and (3) oil-phase diffusion through hair follicles. Uncertainty distributions were developed for the model parameters, and a Monte Carlo analysis was performed to simulate probability distributions of mass fluxes through each of the routes. Sensitivity analyses using stepwise linear regression were also performed to identify model parameters that were most important to the simulated mass fluxes at different times. This probabilistic analysis of percutaneous absorption (PAPA) method has been developed to improve risk-based exposure assessments and transdermal drug-delivery analyses, where parameters and processes can be highly uncertain.

  5. SHM-Based Probabilistic Fatigue Life Prediction for Bridges Based on FE Model Updating.

    PubMed

    Lee, Young-Joo; Cho, Soojin

    2016-01-01

    Fatigue life prediction for a bridge should be based on the current condition of the bridge, and various sources of uncertainty, such as material properties, anticipated vehicle loads and environmental conditions, make the prediction very challenging. This paper presents a new approach for probabilistic fatigue life prediction for bridges using finite element (FE) model updating based on structural health monitoring (SHM) data. Recently, various types of SHM systems have been used to monitor and evaluate the long-term structural performance of bridges. For example, SHM data can be used to estimate the degradation of an in-service bridge, which makes it possible to update the initial FE model. The proposed method consists of three steps: (1) identifying the modal properties of a bridge, such as mode shapes and natural frequencies, based on the ambient vibration under passing vehicles; (2) updating the structural parameters of an initial FE model using the identified modal properties; and (3) predicting the probabilistic fatigue life using the updated FE model. The proposed method is demonstrated by application to a numerical model of a bridge, and the impact of FE model updating on the bridge fatigue life is discussed. PMID:26950125

  6. SHM-Based Probabilistic Fatigue Life Prediction for Bridges Based on FE Model Updating

    PubMed Central

    Lee, Young-Joo; Cho, Soojin

    2016-01-01

    Fatigue life prediction for a bridge should be based on the current condition of the bridge, and various sources of uncertainty, such as material properties, anticipated vehicle loads and environmental conditions, make the prediction very challenging. This paper presents a new approach for probabilistic fatigue life prediction for bridges using finite element (FE) model updating based on structural health monitoring (SHM) data. Recently, various types of SHM systems have been used to monitor and evaluate the long-term structural performance of bridges. For example, SHM data can be used to estimate the degradation of an in-service bridge, which makes it possible to update the initial FE model. The proposed method consists of three steps: (1) identifying the modal properties of a bridge, such as mode shapes and natural frequencies, based on the ambient vibration under passing vehicles; (2) updating the structural parameters of an initial FE model using the identified modal properties; and (3) predicting the probabilistic fatigue life using the updated FE model. The proposed method is demonstrated by application to a numerical model of a bridge, and the impact of FE model updating on the bridge fatigue life is discussed. PMID:26950125

  7. Detailed probabilistic modelling of cell inactivation by ionizing radiations of different qualities: the model and its applications.

    PubMed

    Kundrát, Pavel

    2009-03-01

    The probabilistic two-stage model of cell killing by ionizing radiation enables the representation of both damage induction by radiation and its repair by the cell. The model properties and applications, as well as a possible interpretation of the underlying damage classification, are discussed. Analyses of published survival data for V79 hamster cells irradiated by protons and He, C, O, and Ne ions are reported, quantifying the variations in radiation quality with increasing charge and linear energy transfer of the ions. PMID:18684633

  8. Integrated Assessment Model Evaluation

    NASA Astrophysics Data System (ADS)

    Smith, S. J.; Clarke, L.; Edmonds, J. A.; Weyant, J. P.

    2012-12-01

    Integrated assessment models of climate change (IAMs) are widely used to provide insights into the dynamics of the coupled human and socio-economic system, including emission mitigation analysis and the generation of future emission scenarios. Similar to the climate modeling community, the integrated assessment community has a two-decade history of model inter-comparison, which has served as one of the primary venues for model evaluation and confirmation. While analysis of historical trends in the socio-economic system has long played a key role in diagnostics of future scenarios from IAMs, formal hindcast experiments are just now being contemplated as evaluation exercises. Some initial thoughts on setting up such IAM evaluation experiments are discussed. Socio-economic systems do not follow strict physical laws, which means that evaluation needs to take place in a context, unlike that of physical system models, in which there are few fixed, unchanging relationships. Of course strict validation of even earth system models is not possible (Oreskes et al. 2004), a fact borne out by the inability of models to constrain the climate sensitivity. Energy-system models have also been grappling with some of the same questions over the last quarter century. For example, one of "the many questions in the energy field that are waiting for answers in the next 20 years" identified by Hans Landsberg in 1985 was "Will the price of oil resume its upward movement?" Of course we are still asking this question today. While, arguably, even fewer constraints apply to socio-economic systems, numerous historical trends and patterns have been identified, although often only in broad terms, that are used to guide the development of model components, parameter ranges, and scenario assumptions. IAM evaluation exercises are expected to provide useful information for interpreting model results and improving model behavior. A key step is the recognition of model boundaries, that is, what is inside

  9. Statistical shape analysis of the human spleen geometry for probabilistic occupant models.

    PubMed

    Yates, Keegan M; Lu, Yuan-Chiao; Untaroiu, Costin D

    2016-06-14

    Statistical shape models are an effective way to create computational models of human organs that can incorporate inter-subject geometrical variation. The main objective of this study was to create statistical mean and boundary models of the human spleen in an occupant posture. Principal component analysis was applied to fifteen human spleens in order to find the statistical modes of variation, mean shape, and boundary models. A landmark sliding approach was utilized to refine the landmarks to obtain a better shape correspondence and create a better representation of the underlying shape contour. The first mode of variation was found to be the overall volume, and it accounted for 69% of the total variation. The mean model and boundary models could be used to develop probabilistic finite element (FE) models which may identify the risk of spleen injury during vehicle collisions and consequently help to improve automobile safety systems. PMID:27040386
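
    The principal-component step can be sketched with synthetic landmark sets; real spleen landmarks and the landmark-sliding refinement are beyond this illustration, and the synthetic geometry below is an arbitrary assumption.

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic stand-in for corresponded landmarks: 15 subjects x 100 landmarks
# in 3D, built as a common shape with random overall scaling plus local noise.
n_subjects, n_landmarks = 15, 100
theta = np.linspace(0, 2 * np.pi, n_landmarks, endpoint=False)
base = np.column_stack([np.cos(theta), 0.6 * np.sin(theta), 0.1 * theta])
shapes = np.array([base * rng.normal(1.0, 0.15)            # size mode
                   + rng.normal(0, 0.02, base.shape)       # local noise
                   for _ in range(n_subjects)])

# Flatten each subject to a vector and perform PCA on the landmark matrix.
X = shapes.reshape(n_subjects, -1)
mean_shape = X.mean(axis=0)
U, s, Vt = np.linalg.svd(X - mean_shape, full_matrices=False)
explained = s**2 / np.sum(s**2)
print("variance explained by first mode: %.0f%%" % (100 * explained[0]))

# Boundary models at +/- 3 standard deviations along the first mode.
sd1 = s[0] / np.sqrt(n_subjects - 1)
plus3 = (mean_shape + 3 * sd1 * Vt[0]).reshape(n_landmarks, 3)
minus3 = (mean_shape - 3 * sd1 * Vt[0]).reshape(n_landmarks, 3)
print("boundary model arrays:", plus3.shape, minus3.shape)
```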

  10. Integrated Assessment Modeling

    SciTech Connect

    Edmonds, James A.; Calvin, Katherine V.; Clarke, Leon E.; Janetos, Anthony C.; Kim, Son H.; Wise, Marshall A.; McJeon, Haewon C.

    2012-10-31

    This paper discusses the role of Integrated Assessment models (IAMs) in climate change research. IAMs are an interdisciplinary research platform, which constitutes a consistent scientific framework in which the large-scale interactions between human and natural Earth systems can be examined. In so doing, IAMs provide insights that would otherwise be unavailable from traditional single-discipline research. By providing a broader view of the issue, IAMs constitute an important tool for decision support. IAMs are also a home of human Earth system research and provide natural Earth system scientists information about the nature of human intervention in global biogeophysical and geochemical processes.

  11. Propagating Water Quality Analysis Uncertainty Into Resource Management Decisions Through Probabilistic Modeling

    NASA Astrophysics Data System (ADS)

    Gronewold, A. D.; Wolpert, R. L.; Reckhow, K. H.

    2007-12-01

    Most probable number (MPN) and colony-forming-unit (CFU) are two estimates of fecal coliform bacteria concentration commonly used as measures of water quality in United States shellfish harvesting waters. The MPN is the maximum likelihood estimate (or MLE) of the true fecal coliform concentration based on counts of non-sterile tubes in serial dilution of a sample aliquot, indicating bacterial metabolic activity. The CFU is the MLE of the true fecal coliform concentration based on the number of bacteria colonies emerging on a growth plate after inoculation from a sample aliquot. Each estimating procedure has intrinsic variability and is subject to additional uncertainty arising from minor variations in experimental protocol. Several versions of each procedure (using different sized aliquots or different numbers of tubes, for example) are in common use, each with its own levels of probabilistic and experimental error and uncertainty. It has been observed empirically that the MPN procedure is more variable than the CFU procedure, and that MPN estimates are somewhat higher on average than CFU estimates, on split samples from the same water bodies. We construct a probabilistic model that provides a clear theoretical explanation for the observed variability in, and discrepancy between, MPN and CFU measurements. We then explore how this variability and uncertainty might propagate into shellfish harvesting area management decisions through a two-phased modeling strategy. First, we apply our probabilistic model in a simulation-based analysis of future water quality standard violation frequencies under alternative land use scenarios, such as those evaluated under guidelines of the total maximum daily load (TMDL) program. Second, we apply our model to water quality data from shellfish harvesting areas which at present are closed (either conditionally or permanently) to shellfishing, to determine if alternative laboratory analysis procedures might have led to different
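
    The MPN itself is a maximum-likelihood estimate obtained from the serial-dilution tube counts; a small numerical version is sketched below, with an illustrative three-dilution design and tube counts that are not taken from any particular protocol.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Illustrative design: inoculum volume (mL) per tube and tubes per dilution.
volumes = np.array([10.0, 1.0, 0.1])
n_tubes = np.array([5, 5, 5])
positives = np.array([5, 3, 1])        # observed non-sterile tubes

def neg_log_likelihood(log_c):
    """Poisson-dilution likelihood: P(tube positive) = 1 - exp(-c * v)."""
    c = np.exp(log_c)
    p = np.clip(1.0 - np.exp(-c * volumes), 1e-12, 1 - 1e-12)
    return -np.sum(positives * np.log(p) + (n_tubes - positives) * np.log(1 - p))

res = minimize_scalar(neg_log_likelihood, bounds=(-10, 10), method="bounded")
print(f"MPN estimate: {np.exp(res.x):.2f} organisms per mL")
```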

  12. Probabilistic methods for structural response analysis

    NASA Technical Reports Server (NTRS)

    Wu, Y.-T.; Burnside, O. H.; Cruse, T. A.

    1988-01-01

    This paper addresses current work to develop probabilistic structural analysis methods for integration with a specially developed probabilistic finite element code. The goal is to establish distribution functions for the structural responses of stochastic structures under uncertain loadings. Several probabilistic analysis methods are proposed covering efficient structural probabilistic analysis methods, correlated random variables, and response of linear system under stationary random loading.

  13. The use of the Multi-model Ensemble in Probabilistic Climate Projections

    NASA Astrophysics Data System (ADS)

    Knutti, R.; Tebaldi, C.

    2007-12-01

    Recent coordinated efforts, in which numerous climate models have been run for a common set of experiments, have produced large datasets of projections of future climate for various scenarios. Those multi-model ensembles sample initial condition, parameter as well as structural uncertainties in the model design, and they have prompted a variety of approaches to quantify uncertainty in future climate in a probabilistic way. This overview presentation outlines the motivation for using multi-model ensembles, briefly discusses the methodologies published so far and compares their results for regional temperature projections. It discusses the challenges in interpreting multi-model results, caused by the lack of verification of climate projections, the problem of model dependence, bias and tuning as well as the difficulty in making sense of an "ensemble of opportunity".

  14. Value learning and arousal in the extinction of probabilistic rewards: the role of dopamine in a modified temporal difference model.

    PubMed

    Song, Minryung R; Fellous, Jean-Marc

    2014-01-01

    Because most rewarding events are probabilistic and changing, the extinction of probabilistic rewards is important for survival. It has been proposed that the extinction of probabilistic rewards depends on arousal and the amount of learning of reward values. Midbrain dopamine neurons were suggested to play a role in both arousal and learning reward values. Despite extensive research on modeling dopaminergic activity in reward learning (e.g. temporal difference models), few studies have been done on modeling its role in arousal. Although temporal difference models capture key characteristics of dopaminergic activity during the extinction of deterministic rewards, they have been less successful at simulating the extinction of probabilistic rewards. By adding an arousal signal to a temporal difference model, we were able to simulate the extinction of probabilistic rewards and its dependence on the amount of learning. Our simulations propose that arousal allows the probability of reward to have lasting effects on the updating of reward value, which slows the extinction of low probability rewards. Using this model, we predicted that, by signaling the prediction error, dopamine determines the learned reward value that has to be extinguished during extinction and participates in regulating the size of the arousal signal that controls the learning rate. These predictions were supported by pharmacological experiments in rats. PMID:24586823
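
    A minimal sketch of the modification, with an assumed functional form for the arousal signal (not necessarily the one used in the paper), illustrates how an arousal-scaled learning rate can slow the extinction of a partially reinforced, low-probability reward.

```python
import numpy as np

rng = np.random.default_rng(8)

def run(reward_prob, trials_acq=500, trials_ext=300,
        base_alpha=0.05, arousal_gain=1.0):
    """Single-state value learning with an arousal-modulated learning rate."""
    V, arousal, trace = 0.0, 0.5, []
    for t in range(trials_acq + trials_ext):
        r = float(rng.random() < reward_prob) if t < trials_acq else 0.0
        delta = r - V                              # prediction error
        # Assumed arousal dynamics: arousal tracks the recent unsigned error.
        arousal += 0.05 * (abs(delta) - arousal)
        V += base_alpha * (1.0 + arousal_gain * arousal) * delta
        trace.append(V)
    return np.array(trace)

for p in (1.0, 0.5, 0.25):
    v = run(p)
    print(f"reward prob {p}: value after acquisition {v[499]:.2f}, "
          f"after 150 extinction trials {v[649]:.2f}")
```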

  15. A Probabilistic Model for Estimating the Depth and Threshold Temperature of C-fiber Nociceptors.

    PubMed

    Dezhdar, Tara; Moshourab, Rabih A; Fründ, Ingo; Lewin, Gary R; Schmuker, Michael

    2015-01-01

    The subjective experience of thermal pain follows the detection and encoding of noxious stimuli by primary afferent neurons called nociceptors. However, nociceptor morphology has been hard to access and the mechanisms of signal transduction remain unresolved. In order to understand how heat transducers in nociceptors are activated in vivo, it is important to estimate the temperatures that directly activate the skin-embedded nociceptor membrane. Hence, the nociceptor's temperature threshold must be estimated, which in turn will depend on the depth at which transduction happens in the skin. Since the temperature at the receptor cannot be accessed experimentally, such an estimation can currently only be achieved through modeling. However, the current state-of-the-art model to estimate temperature at the receptor suffers from the fact that it cannot account for the natural stochastic variability of neuronal responses. We improve this model using a probabilistic approach which accounts for uncertainties and potential noise in the system. Using a data set of 24 C-fibers recorded in vitro, we show that, even without detailed knowledge of the bio-thermal properties of the system, the probabilistic model that we propose here is capable of providing estimates of threshold and depth in cases where the classical method fails. PMID:26638830

  16. From Cyclone Tracks to the Costs of European Winter Storms: A Probabilistic Loss Assessment Model

    NASA Astrophysics Data System (ADS)

    Orwig, K.; Renggli, D.; Corti, T.; Reese, S.; Wueest, M.; Viktor, E.; Zimmerli, P.

    2014-12-01

    European winter storms cause billions of dollars of insured losses every year. Therefore, it is essential to understand potential impacts of future events, and the role reinsurance can play to mitigate the losses. The authors will present an overview on natural catastrophe risk assessment modeling in the reinsurance industry, and the development of a new innovative approach for modeling the risk associated with European winter storms.The new innovative approach includes the development of physically meaningful probabilistic (i.e. simulated) events for European winter storm loss assessment. The meteorological hazard component of the new model is based on cyclone and windstorm tracks identified in the 20thCentury Reanalysis data. The knowledge of the evolution of winter storms both in time and space allows the physically meaningful perturbation of historical event properties (e.g. track, intensity, etc.). The perturbation includes a random element but also takes the local climatology and the evolution of the historical event into account.The low-resolution wind footprints taken from the 20thCentury Reanalysis are processed by a statistical-dynamical downscaling to generate high-resolution footprints for both the simulated and historical events. Downscaling transfer functions are generated using ENSEMBLES regional climate model data. The result is a set of reliable probabilistic events representing thousands of years. The event set is then combined with country and site-specific vulnerability functions and detailed market- or client-specific information to compute annual expected losses.

  17. A Probabilistic Model for Estimating the Depth and Threshold Temperature of C-fiber Nociceptors

    NASA Astrophysics Data System (ADS)

    Dezhdar, Tara; Moshourab, Rabih A.; Fründ, Ingo; Lewin, Gary R.; Schmuker, Michael

    2015-12-01

    The subjective experience of thermal pain follows the detection and encoding of noxious stimuli by primary afferent neurons called nociceptors. However, nociceptor morphology has been hard to access and the mechanisms of signal transduction remain unresolved. In order to understand how heat transducers in nociceptors are activated in vivo, it is important to estimate the temperatures that directly activate the skin-embedded nociceptor membrane. Hence, the nociceptor’s temperature threshold must be estimated, which in turn will depend on the depth at which transduction happens in the skin. Since the temperature at the receptor cannot be accessed experimentally, such an estimation can currently only be achieved through modeling. However, the current state-of-the-art model to estimate temperature at the receptor suffers from the fact that it cannot account for the natural stochastic variability of neuronal responses. We improve this model using a probabilistic approach which accounts for uncertainties and potential noise in the system. Using a data set of 24 C-fibers recorded in vitro, we show that, even without detailed knowledge of the bio-thermal properties of the system, the probabilistic model that we propose here is capable of providing estimates of threshold and depth in cases where the classical method fails.

  18. An efficient deterministic-probabilistic approach to modeling regional groundwater flow: 1. Theory

    USGS Publications Warehouse

    Yen, Chung-Cheng; Guymon, Gary L.

    1990-01-01

    An efficient probabilistic model is developed and cascaded with a deterministic model for predicting water table elevations in regional aquifers. The objective is to quantify model uncertainty where precise estimates of water table elevations may be required. The probabilistic model is based on the two-point probability method, which only requires prior knowledge of each uncertain variable's mean and coefficient of variation. The two-point estimate method is theoretically developed and compared with the Monte Carlo simulation method. The results of comparisons using hypothetical deterministic problems indicate that the two-point estimate method is generally valid only for linear problems where the coefficients of variation of the uncertain parameters (for example, storage coefficient and hydraulic conductivity) are small. The two-point estimate method may be applied to slightly nonlinear problems with good results, provided coefficients of variation are small. In such cases, the two-point estimate method is much more efficient than the Monte Carlo method, provided the number of uncertain variables is less than eight.
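
    For orientation, the following Python sketch implements the classical Rosenblueth-type two-point estimate for independent, symmetric inputs and propagates it through a stand-in deterministic function. The cascaded groundwater implementation in the paper is more elaborate; the water_table_response function and all numeric values here are assumptions for illustration only.

      import itertools
      import numpy as np

      def rosenblueth_two_point(model, means, stds):
          """Rosenblueth's two-point estimate for independent, symmetric inputs.
          The model is evaluated at every combination of mean +/- one standard deviation
          (2**n runs), each with weight 1/2**n; the weighted first and second moments
          give the estimated mean and standard deviation of the model output."""
          means, stds = np.asarray(means, float), np.asarray(stds, float)
          outputs = []
          for signs in itertools.product((+1.0, -1.0), repeat=len(means)):
              outputs.append(model(means + np.array(signs) * stds))
          outputs = np.array(outputs)
          mean = outputs.mean()                       # equal weights 1/2**n
          var = (outputs ** 2).mean() - mean ** 2
          return mean, np.sqrt(var)

      # Hypothetical stand-in for one deterministic groundwater-model run:
      # water-table response as a function of storage coefficient S and hydraulic conductivity K.
      def water_table_response(x):
          S, K = x
          return 50.0 - 2.0 / (S * K)

      m, s = rosenblueth_two_point(water_table_response,
                                   means=[0.15, 5.0],   # illustrative values only
                                   stds=[0.015, 0.5])   # ~10% coefficients of variation
      print(f"estimated mean = {m:.2f}, std = {s:.2f}")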

  19. A Probabilistic Model for Estimating the Depth and Threshold Temperature of C-fiber Nociceptors

    PubMed Central

    Dezhdar, Tara; Moshourab, Rabih A.; Fründ, Ingo; Lewin, Gary R.; Schmuker, Michael

    2015-01-01

    The subjective experience of thermal pain follows the detection and encoding of noxious stimuli by primary afferent neurons called nociceptors. However, nociceptor morphology has been hard to access and the mechanisms of signal transduction remain unresolved. In order to understand how heat transducers in nociceptors are activated in vivo, it is important to estimate the temperatures that directly activate the skin-embedded nociceptor membrane. Hence, the nociceptor’s temperature threshold must be estimated, which in turn will depend on the depth at which transduction happens in the skin. Since the temperature at the receptor cannot be accessed experimentally, such an estimation can currently only be achieved through modeling. However, the current state-of-the-art model to estimate temperature at the receptor suffers from the fact that it cannot account for the natural stochastic variability of neuronal responses. We improve this model using a probabilistic approach which accounts for uncertainties and potential noise in system. Using a data set of 24 C-fibers recorded in vitro, we show that, even without detailed knowledge of the bio-thermal properties of the system, the probabilistic model that we propose here is capable of providing estimates of threshold and depth in cases where the classical method fails. PMID:26638830

  20. Probabilistic Water quality trading model conditioned on season-ahead nutrient load forecasts

    NASA Astrophysics Data System (ADS)

    Arumugam, S.; Oh, J.

    2010-12-01

    Successful water quality trading programs in the country rely on expected point and nonpoint nutrient loadings from multiple sources. Pollutant sources, through nutrient transactions, are in pursuit of minimum allocation strategies that can keep both the loadings and the associated concentrations under the target limit. It is well established in the hydroclimatic literature that interannual variability in seasonal streamflow can be partially explained using SST conditions. Similarly, it is widely known that streamflow is the most important predictor in estimating nutrient loadings and the associated concentration. We intend to bridge these two findings to develop a probabilistic nutrient loading model for supporting water quality trading in the Tar River basin, NC. Utilizing precipitation forecasts derived from the ECHAM4.5 General Circulation Model, we develop season-ahead forecasts of total nitrogen (TN) and total phosphorus (TP) by forcing the calibrated water quality model with seasonal streamflow forecasts. Based on the season-ahead loadings, the probability of violating the desired nutrient concentration under the currently allowed loadings is also estimated. Through retrospective analyses using forecasted streamflow and the associated loadings, the probabilistic water quality trading model estimates the nutrient reduction strategies that can ensure that the net loadings from both sources remain below the target loadings. Challenges in applying the proposed framework for actual trading are also discussed.

  1. Probabilistic comparison of alternative characterization technologies at the Fernald Uranium-in-Soils Integrated Demonstration Project

    SciTech Connect

    Rautman, C.A.; McGraw, M.A.; Istok, J.D.; Sigda, J.M.; Kaplan, P.G.

    1993-12-31

    The performance of four alternative characterization technologies proposed for use in characterization of surficial uranium contamination in soil at the Incinerator and Drum Baling Areas at the Fernald Environmental Management Project in southwestern Ohio has been evaluated using a probabilistic, risk-based decision-analysis methodology. The basis of comparison is to minimize a computed total cost for environmental cleanup. This total-cost-based approach provides a framework for evaluating the trade-offs among remedial investigation, the remedial design, and the risk of regulatory penalties. The approach explicitly recognizes the value of information provided by remedial investigation; additional measurements are only valuable to the extent that the information they provide reduces total cost.

  2. Validation analysis of probabilistic models of dietary exposure to food additives.

    PubMed

    Gilsenan, M B; Thompson, R L; Lambe, J; Gibney, M J

    2003-10-01

    The validity of a range of simple conceptual models designed specifically for the estimation of food additive intakes using probabilistic analysis was assessed. Modelled intake estimates that fell below traditional conservative point estimates of intake and above 'true' additive intakes (calculated from a reference database at brand level) were considered to be in a valid region. Models were developed for 10 food additives by combining food intake data, the probability of an additive being present in a food group and additive concentration data. Food intake and additive concentration data were entered as raw data or as a lognormal distribution, and the probability of an additive being present was entered based on the per cent brands or the per cent eating occasions within a food group that contained an additive. Since the three model components assumed two possible modes of input, the validity of eight (2(3)) model combinations was assessed. All model inputs were derived from the reference database. An iterative approach was employed in which the validity of individual model components was assessed first, followed by validation of full conceptual models. While the distribution of intake estimates from models fell below conservative intakes, which assume that the additive is present at maximum permitted levels (MPLs) in all foods in which it is permitted, intake estimates were not consistently above 'true' intakes. These analyses indicate the need for more complex models for the estimation of food additive intakes using probabilistic analysis. Such models should incorporate information on market share and/or brand loyalty. PMID:14555358
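
    One plausible reading of the three-component model structure (food intake, probability of additive presence, additive concentration) is the Monte Carlo sketch below. The two food groups, the lognormal input choices, and all numeric values are assumptions for illustration, not the paper's reference data.

      import numpy as np

      rng = np.random.default_rng(0)
      n_sim = 100_000   # simulated person-days

      # Hypothetical inputs for two food groups: daily food-group intake (g, lognormal),
      # probability the additive is present, and concentration when present (mg/kg, lognormal).
      food_groups = [
          {"intake_mu": 3.0, "intake_sigma": 0.8, "p_present": 0.40, "conc_mu": 3.5, "conc_sigma": 0.5},
          {"intake_mu": 4.2, "intake_sigma": 0.6, "p_present": 0.15, "conc_mu": 2.8, "conc_sigma": 0.7},
      ]

      total_intake = np.zeros(n_sim)
      for g in food_groups:
          grams = rng.lognormal(g["intake_mu"], g["intake_sigma"], n_sim)
          present = rng.random(n_sim) < g["p_present"]
          conc = rng.lognormal(g["conc_mu"], g["conc_sigma"], n_sim)   # mg additive per kg food
          total_intake += grams / 1000.0 * present * conc              # mg additive per day

      print("mean intake %.3f mg/day, 97.5th percentile %.3f mg/day"
            % (total_intake.mean(), np.percentile(total_intake, 97.5)))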

  3. VBA: A Probabilistic Treatment of Nonlinear Models for Neurobiological and Behavioural Data

    PubMed Central

    Daunizeau, Jean; Adam, Vincent; Rigoux, Lionel

    2014-01-01

    This work is in line with an on-going effort tending toward a computational (quantitative and refutable) understanding of human neuro-cognitive processes. Many sophisticated models for behavioural and neurobiological data have flourished during the past decade. Most of these models are partly unspecified (i.e. they have unknown parameters) and nonlinear. This makes them difficult to peer with a formal statistical data analysis framework. In turn, this compromises the reproducibility of model-based empirical studies. This work exposes a software toolbox that provides generic, efficient and robust probabilistic solutions to the three problems of model-based analysis of empirical data: (i) data simulation, (ii) parameter estimation/model selection, and (iii) experimental design optimization. PMID:24465198

  4. A Probabilistic Model of Functional Brain Connectivity Network for Discovering Novel Biomarkers

    PubMed Central

    Bian, Jiang; Xie, Mengjun; Topaloglu, Umit; Cisler, Josh M.

    2013-01-01

    Graph theoretical analyses of functional brain connectivity networks have been limited to a static view of brain activities over the entire timeseries. In this paper, we propose a new probabilistic model of the functional brain connectivity network, the strong-edge model, which incorporates the temporal fluctuation of neurodynamics. We also introduce a systematic approach to identifying biomarkers based on network characteristics that quantitatively describe the organization of the brain network. The evaluation results of the proposed strong-edge network model are quite promising. The biomarkers derived from the strong-edge model achieved a much higher prediction accuracy of 89% (ROC-AUC: 0.96) in distinguishing depression subjects from healthy controls in comparison with the conventional network model (accuracy: 76%, ROC-AUC: 0.87). These novel biomarkers have high potential to be applied clinically in diagnosing neurological and psychiatric brain diseases with noninvasive neuroimaging technologies. PMID:24303289

  5. Glacial integrative modelling.

    PubMed

    Ganopolski, Andrey

    2003-09-15

    Understanding the mechanisms of past climate changes requires modelling of the complex interaction between all major components of the Earth system: atmosphere, ocean, cryosphere, lithosphere and biosphere. This paper reviews attempts at such an integrative approach to modelling climate changes during the glacial age. In particular, the roles of different factors in shaping glacial climate are compared based on the results of simulations with an Earth-system model of intermediate complexity, CLIMBER-2. It is shown that ice sheets, changes in atmospheric compositions, vegetation cover, and reorganization of the ocean thermohaline circulation play important roles in glacial climate changes. Another example of this approach is the modelling of two major types of abrupt glacial climate changes: Dansgaard-Oeschger and Heinrich events. Our results corroborate some of the early proposed mechanisms, which relate abrupt climate changes to the internal instability of the ocean thermohaline circulation and ice sheets. At the same time, it is shown that realistic representation of the temporal evolution of the palaeoclimatic background is crucial to simulate observed features of the glacial abrupt climate changes. PMID:14558899

  6. Probabilistic, multi-variate flood damage modelling using random forests and Bayesian networks

    NASA Astrophysics Data System (ADS)

    Kreibich, Heidi; Schröter, Kai

    2015-04-01

    Decisions on flood risk management and adaptation are increasingly based on risk analyses. Such analyses are associated with considerable uncertainty, even more so if changes in risk due to global change are expected. Although uncertainty analysis and probabilistic approaches have received increased attention recently, they are hardly applied in flood damage assessments. Most damage models applied in standard practice share a common limitation: complex damaging processes are described by simple, deterministic approaches such as stage-damage functions. This presentation will show approaches for probabilistic, multi-variate flood damage modelling on the micro- and meso-scale and discuss their potential and limitations. Reference: Merz, B.; Kreibich, H.; Lall, U. (2013): Multi-variate flood damage assessment: a tree-based data-mining approach. NHESS, 13(1), 53-64. Schröter, K., Kreibich, H., Vogel, K., Riggelsen, C., Scherbaum, F., Merz, B. (2014): How useful are complex flood damage models? - Water Resources Research, 50, 4, p. 3378-3395.

  7. Data assimilation for unsaturated flow models with restart adaptive probabilistic collocation based Kalman filter

    NASA Astrophysics Data System (ADS)

    Man, Jun; Li, Weixuan; Zeng, Lingzao; Wu, Laosheng

    2016-06-01

    The ensemble Kalman filter (EnKF) has gained popularity in hydrological data assimilation problems. As a Monte Carlo based method, a sufficiently large ensemble size is usually required to guarantee accuracy. As an alternative approach, the probabilistic collocation based Kalman filter (PCKF) employs the polynomial chaos expansion (PCE) to represent and propagate the uncertainties in parameters and states. However, PCKF suffers from the so-called "curse of dimensionality". Its computational cost increases drastically with the increasing number of parameters and system nonlinearity. Furthermore, PCKF may fail to provide accurate estimations due to the joint updating scheme for strongly nonlinear models. Motivated by recent developments in uncertainty quantification and EnKF, we propose a restart adaptive probabilistic collocation based Kalman filter (RAPCKF) for data assimilation in unsaturated flow problems. During the implementation of RAPCKF, the important parameters are identified and active PCE basis functions are adaptively selected at each assimilation step; the "restart" scheme is utilized to eliminate the inconsistency between updated model parameters and state variables. The performance of RAPCKF is systematically tested with numerical cases of unsaturated flow models. It is shown that the adaptive approach and restart scheme can significantly improve the performance of PCKF. Moreover, RAPCKF has been demonstrated to be more efficient than EnKF with the same computational cost.
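
    The RAPCKF algorithm itself is involved, but the EnKF baseline it is compared against can be sketched compactly. The stochastic EnKF analysis step below is a generic illustration, not the paper's code; the two-component state (a log-conductivity parameter and an observed water content) and all numbers are assumptions.

      import numpy as np

      def enkf_update(ensemble, observation, obs_operator, obs_error_std, rng):
          """One stochastic EnKF analysis step (the Monte Carlo baseline that PCKF-type
          methods aim to outperform). `ensemble` has shape (n_members, n_state)."""
          n_members = ensemble.shape[0]
          predicted = obs_operator(ensemble)                          # (n_members, n_obs)
          X = ensemble - ensemble.mean(axis=0)
          Y = predicted - predicted.mean(axis=0)
          P_xy = X.T @ Y / (n_members - 1)
          P_yy = Y.T @ Y / (n_members - 1) + obs_error_std**2 * np.eye(predicted.shape[1])
          K = P_xy @ np.linalg.inv(P_yy)                              # Kalman gain
          perturbed_obs = observation + obs_error_std * rng.standard_normal(predicted.shape)
          return ensemble + (perturbed_obs - predicted) @ K.T

      rng = np.random.default_rng(2)
      # Hypothetical state: [log saturated conductivity, soil-water content at a sensor depth]
      ensemble = rng.normal([0.0, 0.30], [0.5, 0.05], size=(100, 2))
      obs_operator = lambda e: e[:, [1]]            # only the water content is observed
      analysis = enkf_update(ensemble, observation=np.array([0.26]),
                             obs_operator=obs_operator, obs_error_std=0.01, rng=rng)
      print("prior mean:", ensemble.mean(axis=0), " posterior mean:", analysis.mean(axis=0))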

  8. Development of a probabilistic PCB-bioaccumulation model for six fish species in the Hudson River

    SciTech Connect

    Stackelberg, K. von; Menzie, C.

    1995-12-31

    In 1984 the US Environmental Protection Agency (USEPA) completed a Feasibility Study on the Hudson River that investigated remedial alternatives and issued a Record of Decision (ROD) later that year. In December 1989 USEPA decided to reassess the No Action decision for Hudson River sediments. This reassessment consists of three phases: Interim Characterization and Evaluation (Phase 1); Further Site Characterization and Analysis (Phase 2); and, Feasibility study (Phase 3). A Phase 1 report was completed in August, 1991. The team then completed a Final Work Plan for Phase 2 in September 1992. This work plan identified various PCB fate and transport modeling activities to support the Hudson River PCB Reassessment Remedial Investigation and Feasibility Study (RI/FS). This talk provides a description of the development of a Probabilistic bioaccumulation models to describe the uptake of PCBs on a congener-specific basis in six fish species. The authors have developed a framework for relating body burdens of PCBs in fish to exposure concentrations in Hudson River water and sediments. This framework is used to understand historical and current relationships as well as to predict fish body burdens for future conditions under specific remediation and no action scenarios. The framework incorporates a probabilistic approach to predict distributions in PCB body burdens for selected fish species. These models can predict single population statistics such as the average expected values of PCBs under specific scenarios as well as the distribution of expected concentrations.

  9. Using Bayesian Model Averaging (BMA) to calibrate probabilistic surface temperature forecasts over Iran

    NASA Astrophysics Data System (ADS)

    Soltanzadeh, I.; Azadi, M.; Vakili, G. A.

    2011-07-01

    Using Bayesian Model Averaging (BMA), an attempt was made to obtain calibrated probabilistic numerical forecasts of 2-m temperature over Iran. The ensemble employs three limited area models (WRF, MM5 and HRM), with WRF used with five different configurations. Initial and boundary conditions for MM5 and WRF are obtained from the National Centers for Environmental Prediction (NCEP) Global Forecast System (GFS), and for HRM the initial and boundary conditions come from the analysis of the Global Model Europe (GME) of the German Weather Service. The resulting ensemble of seven members was run for a period of 6 months (from December 2008 to May 2009) over Iran. The 48-h raw ensemble outputs were calibrated with the BMA technique for 120 days, using a 40-day training sample of forecasts and the corresponding verification data. The calibrated probabilistic forecasts were assessed using rank histograms and attribute diagrams. Results showed that application of BMA improved the reliability of the raw ensemble. Using the weighted ensemble mean as a deterministic forecast, it was found that the deterministic-style BMA forecasts usually performed better than the best member's deterministic forecast.
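
    To make the BMA calibration step concrete, the Python sketch below evaluates a BMA predictive mixture for a seven-member 2-m temperature ensemble, given weights, bias corrections, and a spread of the kind the EM training step would produce. All parameter values and forecasts are invented for illustration; fitting the weights over the training window is omitted.

      import numpy as np
      from scipy.stats import norm

      # Hypothetical BMA parameters for a 7-member ensemble (as would be fitted by EM
      # on a 40-day training window): weights, per-member bias corrections, common spread.
      weights = np.array([0.25, 0.20, 0.15, 0.12, 0.11, 0.09, 0.08])   # sum to 1
      bias_a = np.zeros(7)            # intercepts of the member bias corrections
      bias_b = np.ones(7)             # slopes of the member bias corrections
      sigma = 1.8                     # std dev of each member's normal kernel (deg C)

      member_forecasts = np.array([14.2, 15.0, 13.6, 14.8, 15.4, 13.9, 14.5])   # raw 2-m T forecasts

      # BMA predictive density: a weighted mixture of normals centred on bias-corrected members.
      centres = bias_a + bias_b * member_forecasts

      def bma_cdf(t):
          return np.sum(weights * norm.cdf(t, loc=centres, scale=sigma))

      deterministic = np.sum(weights * centres)          # weighted ensemble mean forecast
      p_above_16 = 1.0 - bma_cdf(16.0)                   # probability 2-m temperature exceeds 16 C
      print(f"BMA mean forecast: {deterministic:.1f} C, P(T > 16 C) = {p_above_16:.2f}")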

  10. Probabilistic failure modelling of reinforced concrete structures subjected to chloride penetration

    NASA Astrophysics Data System (ADS)

    Nogueira, Caio Gorla; Leonel, Edson Denner; Coda, Humberto Breves

    2012-12-01

    Structural durability is an important criterion that must be evaluated for every type of structure. Concerning reinforced concrete members, the chloride diffusion process is widely used to evaluate durability, especially when these structures are constructed in aggressive atmospheres. Chloride ingress triggers the corrosion of reinforcements; therefore, by modelling this phenomenon, the corrosion process, and hence the structural durability, can be better evaluated. Corrosion begins when a threshold level of chloride concentration is reached at the steel bars of the reinforcement. Despite the robustness of several models proposed in the literature, deterministic approaches fail to accurately predict the corrosion initiation time due to the inherent randomness observed in this process. In this regard, structural durability can be more realistically represented using probabilistic approaches. This paper addresses the probabilistic analysis of corrosion initiation time in reinforced concrete structures exposed to chloride penetration. Chloride penetration is modelled using Fick's diffusion law, which simulates the chloride diffusion process considering time-dependent effects. The probability of failure is calculated using Monte Carlo simulation and the first order reliability method, with a direct coupling approach. Some examples are considered in order to study these phenomena. Moreover, a simplified method is proposed to determine optimal values for concrete cover.
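
    A minimal sketch of this idea combines the error-function solution of Fick's second law with Monte Carlo sampling of the uncertain inputs to estimate the probability that corrosion has initiated by a given time. The distributions and parameter values below are assumptions, not those of the paper, and the FORM analysis and cover optimization are omitted.

      import numpy as np
      from scipy.special import erf

      rng = np.random.default_rng(1)
      n = 200_000

      # Hypothetical random inputs (distribution choices and values are illustrative):
      D = rng.lognormal(np.log(1e-12), 0.4, n)        # chloride diffusion coefficient (m^2/s)
      cover = rng.normal(0.05, 0.008, n).clip(0.01)   # concrete cover depth (m)
      Cs = rng.lognormal(np.log(3.5), 0.3, n)         # surface chloride content (kg/m^3)
      Ccrit = rng.lognormal(np.log(0.9), 0.2, n)      # threshold content at the rebar (kg/m^3)

      def chloride_at_rebar(t_years):
          """Fick's second-law solution: C(x, t) = Cs * (1 - erf(x / (2*sqrt(D*t))))."""
          t = t_years * 365.25 * 24 * 3600.0
          return Cs * (1.0 - erf(cover / (2.0 * np.sqrt(D * t))))

      for t_years in (10, 25, 50):
          pf = np.mean(chloride_at_rebar(t_years) >= Ccrit)   # P(corrosion initiated by t)
          print(f"t = {t_years:3d} y: P(initiation) ~ {pf:.3f}")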

  11. Probabilistic model of waiting times between large failures in sheared media

    NASA Astrophysics Data System (ADS)

    Brinkman, Braden A. W.; LeBlanc, Michael P.; Uhl, Jonathan T.; Ben-Zion, Yehuda; Dahmen, Karin A.

    2016-01-01

    Using a probabilistic approximation of a mean-field mechanistic model of sheared systems, we analytically calculate the statistical properties of large failures under slow shear loading. For general shear F(t), the distribution of waiting times between large system-spanning failures is a generalized exponential distribution, ρ_T(t) = λ(F(t)) P(F(t)) exp[-∫_0^t λ(F(τ)) P(F(τ)) dτ], where λ(F(t)) is the rate of small event occurrences at stress F(t) and P(F(t)) is the probability that a small event triggers a large failure. We study the behavior of this distribution as a function of fault properties, such as heterogeneity or shear rate. Because the probabilistic model accommodates any stress loading F(t), it is particularly useful for modeling experiments designed to understand how different forms of shear loading or stress perturbations impact the waiting-time statistics of large failures. As examples, we study how periodic perturbations or fluctuations on top of a linear shear stress increase impact the waiting-time distribution.
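
    The waiting-time density above can be evaluated numerically for any loading history F(t). The sketch below does so for a linear loading and for a linear loading with a periodic perturbation; the specific functional forms chosen for λ(F) and P(F) are assumptions for illustration, not the forms used in the paper.

      import numpy as np

      def lam(F):      # rate of small events at stress F (illustrative form)
          return 0.5 + 0.3 * F

      def P(F):        # probability that a small event triggers a system-spanning failure
          return 1.0 / (1.0 + np.exp(-(F - 2.0)))

      def waiting_time_density(F_of_t, t):
          """rho_T(t) = lam(F(t)) P(F(t)) exp(-integral_0^t lam(F(tau)) P(F(tau)) dtau)."""
          hazard = lam(F_of_t(t)) * P(F_of_t(t))
          cum = np.concatenate(([0.0], np.cumsum(0.5 * (hazard[1:] + hazard[:-1]) * np.diff(t))))
          return hazard * np.exp(-cum)

      t = np.linspace(0.0, 10.0, 2001)
      dt = t[1] - t[0]
      rho_linear = waiting_time_density(lambda t: 0.4 * t, t)                         # steady loading
      rho_perturb = waiting_time_density(lambda t: 0.4 * t + 0.3 * np.sin(4 * t), t)  # periodic perturbation
      print("mean waiting time (steady loading):   %.2f" % np.sum(t * rho_linear * dt))
      print("mean waiting time (perturbed loading): %.2f" % np.sum(t * rho_perturb * dt))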

  12. Development of a perfect prognosis probabilistic model for prediction of lightning over south-east India

    NASA Astrophysics Data System (ADS)

    Rajeevan, M.; Madhulatha, A.; Rajasekhar, M.; Bhate, Jyoti; Kesarkar, Amit; Rao, B. V. Appa

    2012-04-01

    A prediction model based on the perfect prognosis method was developed to predict the probability of lightning and probable time of its occurrence over the south-east Indian region. In the perfect prognosis method, statistical relationships are established using past observed data. For real time applications, the predictors are derived from a numerical weather prediction model. In the present study, we have developed the statistical model based on Binary Logistic Regression technique. For developing the statistical model, 115 cases of lightning that occurred over the south-east Indian region during the period 2006-2009 were considered. The probability of lightning (yes or no) occurring during the 12-hour period 0900-2100 UTC over the region was considered as the predictand. The thermodynamic and dynamic variables derived from the NCEP Final Analysis were used as the predictors. A three-stage strategy based on Spearman Rank Correlation, Cumulative Probability Distribution and Principal Component Analysis was used to objectively select the model predictors from a pool of 61 potential predictors considered for the analysis. The final list of six predictors used in the model consists of the parameters representing atmospheric instability, total moisture content in the atmosphere, low level moisture convergence and lower tropospheric temperature advection. For the independent verifications, the probabilistic model was tested for 92 days during the months of May, June and August 2010. The six predictors were derived from the 24-h predictions using a high resolution Weather Research and Forecasting model initialized with 00 UTC conditions. During the independent period, the probabilistic model showed a probability of detection of 77% with a false alarm rate of 35%. The Brier Skill Score during the independent period was 0.233, suggesting that the prediction scheme is skillful in predicting the lightning probability over the south-east region with a reasonable accuracy.
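
    The core statistical step, a binary logistic regression fitted on (re)analysis-derived predictors and later driven by NWP forecasts, can be sketched as follows. The synthetic predictors, the train/test split, and the use of scikit-learn are assumptions for illustration, not the study's data or code.

      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.metrics import brier_score_loss

      rng = np.random.default_rng(42)

      # Synthetic stand-ins for the six selected predictors (instability, total moisture,
      # low-level moisture convergence, temperature advection, ...); values are illustrative.
      n_days = 500
      X = rng.normal(size=(n_days, 6))
      # Synthetic "truth": lightning more likely when instability and moisture are high.
      p_true = 1.0 / (1.0 + np.exp(-(0.9 * X[:, 0] + 0.7 * X[:, 1] - 0.2)))
      y = (rng.random(n_days) < p_true).astype(int)

      # Perfect-prognosis step: fit the logistic regression on analysis-derived predictors.
      model = LogisticRegression().fit(X[:400], y[:400])

      # Application step: in real time the same predictors would come from an NWP forecast.
      p_forecast = model.predict_proba(X[400:])[:, 1]
      print("Brier score on the hold-out days: %.3f" % brier_score_loss(y[400:], p_forecast))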

  13. Probabilistic Mixture Regression Models for Alignment of LC-MS Data

    PubMed Central

    Befekadu, Getachew K.; Tadesse, Mahlet G.; Tsai, Tsung-Heng; Ressom, Habtom W.

    2010-01-01

    A novel framework of a probabilistic mixture regression model (PMRM) is presented for alignment of liquid chromatography-mass spectrometry (LC-MS) data with respect to both retention time (RT) and mass-to-charge ratio (m/z). The expectation maximization algorithm is used to estimate the joint parameters of spline-based mixture regression models and prior transformation density models. The latter accounts for the variability in RT points, m/z values, and peak intensities. The applicability of PMRM for alignment of LC-MS data is demonstrated through three datasets. The performance of PMRM is compared with other alignment approaches including dynamic time warping, correlation optimized warping, and the continuous profile model in terms of the coefficient of variation of replicate LC-MS runs and accuracy in detecting differentially abundant peptides/proteins. PMID:20837998

  14. Global sensitivity analysis, probabilistic calibration, and predictive assessment for the data assimilation linked ecosystem carbon model

    NASA Astrophysics Data System (ADS)

    Safta, C.; Ricciuto, D. M.; Sargsyan, K.; Debusschere, B.; Najm, H. N.; Williams, M.; Thornton, P. E.

    2015-07-01

    In this paper we propose a probabilistic framework for an uncertainty quantification (UQ) study of a carbon cycle model and focus on the comparison between steady-state and transient simulation setups. A global sensitivity analysis (GSA) study indicates the parameters and parameter couplings that are important at different times of the year for quantities of interest (QoIs) obtained with the data assimilation linked ecosystem carbon (DALEC) model. We then employ a Bayesian approach and a statistical model error term to calibrate the parameters of DALEC using net ecosystem exchange (NEE) observations at the Harvard Forest site. The calibration results are employed in the second part of the paper to assess the predictive skill of the model via posterior predictive checks.

  15. On a probabilistic model for the numerical estimation of nocturnal migration of birds.

    PubMed

    Baushev, Alexey Nickolayevich; Sinelschikova, Alexandra

    2007-01-01

    The study of nocturnal bird migration by cone methods of observation has a century-long history and continues to be used up to the present. To describe the flux and estimate the number of passing birds, a probabilistic model is proposed. This model is based on the concept of a dynamic Poisson ensemble of points in an appropriate phase space and has two parameters: one scalar and one functional. We construct consistent estimators of these parameters and discuss their use for the numerical estimation of the flux of birds observed in a narrow light cone generated by the bright lunar disk and formed by the open angle of the telescope. Selection of the same type of birds is suggested as a necessary condition for applying the model. The ground speed of each bird is introduced into the model as a new but obligatory quantity for quantifying the flux of birds. PMID:16546224

  16. A simulation model for probabilistic analysis of Space Shuttle abort modes

    NASA Technical Reports Server (NTRS)

    Hage, R. T.

    1993-01-01

    A simulation model which was developed to provide a probabilistic analysis tool to study the various space transportation system abort mode situations is presented. The simulation model is based on Monte Carlo simulation of an event-tree diagram which accounts for events during the space transportation system's ascent and its abort modes. The simulation model considers just the propulsion elements of the shuttle system (i.e., external tank, main engines, and solid boosters). The model was developed to provide a better understanding of the probability of occurrence and successful completion of abort modes during the vehicle's ascent. The results of the simulation runs discussed are for demonstration purposes only; they are not official NASA probability estimates.
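
    A toy version of the Monte Carlo event-tree idea is sketched below: an engine failure time is sampled, the failure is routed to the abort mode whose window contains it, and a conditional success probability is applied. The abort windows, probabilities, and mode labels are purely illustrative assumptions and, like the paper's own numbers, are not official NASA estimates.

      import numpy as np

      rng = np.random.default_rng(7)
      n_runs = 1_000_000

      # Illustrative (not official) inputs: probability of a main-engine failure during the
      # ~510 s ascent, a uniform failure time, abort-mode windows, and the conditional
      # probability that an abort attempted in each window completes successfully.
      p_engine_failure = 0.01
      abort_windows = [("RTLS", 0.0, 240.0, 0.85),
                       ("TAL", 240.0, 400.0, 0.92),
                       ("ATO", 400.0, 510.0, 0.98)]

      failed = rng.random(n_runs) < p_engine_failure
      t_fail = rng.uniform(0.0, 510.0, n_runs)

      attempts = {}
      successes = 0
      for name, t0, t1, p_ok in abort_windows:
          attempted = failed & (t_fail >= t0) & (t_fail < t1)
          successes += np.sum(attempted & (rng.random(n_runs) < p_ok))
          attempts[name] = int(attempted.sum())

      print("abort attempts by mode:", attempts)
      print("P(abort succeeds | engine failure) ~ %.3f" % (successes / failed.sum()))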

  17. Aircraft Conflict Analysis and Real-Time Conflict Probing Using Probabilistic Trajectory Modeling

    NASA Technical Reports Server (NTRS)

    Yang, Lee C.; Kuchar, James K.

    2000-01-01

    Methods for maintaining separation between aircraft in the current airspace system have been built from a foundation of structured routes and evolved procedures. However, as the airspace becomes more congested and the chance of failures or operational error becomes more problematic, automated conflict alerting systems have been proposed to help provide decision support and to serve as traffic monitoring aids. The problem of conflict detection and resolution has been tackled in a number of different ways, but in this thesis it is recast as a problem of prediction in the presence of uncertainties. Much of the focus is concentrated on the errors and uncertainties from the working trajectory model used to estimate future aircraft positions. The more accurate the prediction, the more likely an ideal (no false alarms, no missed detections) alerting system can be designed. Additional insights into the problem were brought forth by a review of current operational and developmental approaches found in the literature. An iterative, trial-and-error approach to threshold design was identified. When examined from a probabilistic perspective, the threshold parameters were found to be a surrogate for probabilistic performance measures. To overcome the limitations in the current iterative design method, a new direct approach is presented where the performance measures are directly computed and used to perform the alerting decisions. The methodology is shown to handle complex encounter situations (3-D, multi-aircraft, multi-intent, with uncertainties) with relative ease. Utilizing a Monte Carlo approach, a method was devised to perform the probabilistic computations in near real time. Not only does this greatly increase the method's potential as an analytical tool, but it also opens up the possibility for use as a real-time conflict alerting probe. A prototype alerting logic was developed and has been utilized in several NASA Ames Research Center experimental studies.

  18. Modeling PSA Problems - I: The Stimulus-Driven Theory of Probabilistic Dynamics

    SciTech Connect

    Labeau, P.E.; Izquierdo, J.M.

    2005-06-15

    The theory of probabilistic dynamics (TPD) offers a framework capable of modeling the interaction between the physical evolution of a system in transient conditions and the succession of branchings defining a sequence of events. Nonetheless, the Chapman-Kolmogorov equation, besides being inherently Markovian, assumes instantaneous changes in the system dynamics when a setpoint is crossed. In actuality, a transition between two dynamic evolution regimes of the system is a two-phase process. First, conditions corresponding to the triggering of a transition have to be met; this phase will be referred to as the activation of a 'stimulus'. Then, a time delay must elapse before the actual occurrence of the event causing the transition to take place. When this delay cannot be neglected and is a random quantity, the general TPD can no longer be used as such. Moreover, these delays are likely to influence the ordering of events in an accident sequence with competing situations, and the process of delineating sequences in the probabilistic safety analysis of a plant might therefore be affected in turn. This paper aims at presenting several extensions of the classical TPD, in which additional modeling capabilities are progressively introduced. A companion paper sketches a discretized approach of these problems.

  19. A Population Model of Integrative Cardiovascular Physiology

    PubMed Central

    Pruett, William A.; Husband, Leland D.; Husband, Graham; Dakhlalla, Muhammad; Bellamy, Kyle; Coleman, Thomas G.; Hester, Robert L.

    2013-01-01

    We present a small integrative model of human cardiovascular physiology. The model is population-based; rather than using best fit parameter values, we used a variant of the Metropolis algorithm to produce distributions for the parameters most associated with model sensitivity. The population is built by sampling from these distributions to create the model coefficients. The resulting models were then subjected to a hemorrhage. The population was separated into those that lost less than 15 mmHg arterial pressure (compensators), and those that lost more (decompensators). The populations were parametrically analyzed to determine baseline conditions correlating with compensation and decompensation. Analysis included single variable correlation, graphical time series analysis, and support vector machine (SVM) classification. Most variables were seen to correlate with propensity for circulatory collapse, but not sufficiently to effect reasonable classification by any single variable. Time series analysis indicated a single significant measure, the stressed blood volume, as predicting collapse in situ, but measurement of this quantity is clinically impossible. SVM uncovered a collection of variables and parameters that, when taken together, provided useful rubrics for classification. Due to the probabilistic origins of the method, multiple classifications were attempted, resulting in an average of 3.5 variables necessary to construct classification. The most common variables used were systemic compliance, baseline baroreceptor signal strength and total peripheral resistance, providing predictive ability exceeding 90%. The methods presented are suitable for use in any deterministic mathematical model. PMID:24058546

  20. A population model of integrative cardiovascular physiology.

    PubMed

    Pruett, William A; Husband, Leland D; Husband, Graham; Dakhlalla, Muhammad; Bellamy, Kyle; Coleman, Thomas G; Hester, Robert L

    2013-01-01

    We present a small integrative model of human cardiovascular physiology. The model is population-based; rather than using best fit parameter values, we used a variant of the Metropolis algorithm to produce distributions for the parameters most associated with model sensitivity. The population is built by sampling from these distributions to create the model coefficients. The resulting models were then subjected to a hemorrhage. The population was separated into those that lost less than 15 mmHg arterial pressure (compensators), and those that lost more (decompensators). The populations were parametrically analyzed to determine baseline conditions correlating with compensation and decompensation. Analysis included single variable correlation, graphical time series analysis, and support vector machine (SVM) classification. Most variables were seen to correlate with propensity for circulatory collapse, but not sufficiently to effect reasonable classification by any single variable. Time series analysis indicated a single significant measure, the stressed blood volume, as predicting collapse in situ, but measurement of this quantity is clinically impossible. SVM uncovered a collection of variables and parameters that, when taken together, provided useful rubrics for classification. Due to the probabilistic origins of the method, multiple classifications were attempted, resulting in an average of 3.5 variables necessary to construct classification. The most common variables used were systemic compliance, baseline baroreceptor signal strength and total peripheral resistance, providing predictive ability exceeding 90%. The methods presented are suitable for use in any deterministic mathematical model. PMID:24058546

  1. Decomposing biodiversity data using the Latent Dirichlet Allocation model, a probabilistic multivariate statistical method

    PubMed Central

    Valle, Denis; Baiser, Benjamin; Woodall, Christopher W; Chazdon, Robin

    2014-01-01

    We propose a novel multivariate method to analyse biodiversity data based on the Latent Dirichlet Allocation (LDA) model. LDA, a probabilistic model, reduces assemblages to sets of distinct component communities. It produces easily interpretable results, can represent abrupt and gradual changes in composition, accommodates missing data and allows for coherent estimates of uncertainty. We illustrate our method using tree data for the eastern United States and from a tropical successional chronosequence. The model is able to detect pervasive declines in the oak community in Minnesota and Indiana, potentially due to fire suppression, increased growing season precipitation and herbivory. The chronosequence analysis is able to delineate clear successional trends in species composition, while also revealing that site-specific factors significantly impact these successional trajectories. The proposed method provides a means to decompose and track the dynamics of species assemblages along temporal and spatial gradients, including effects of global change and forest disturbances. PMID:25328064
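
    In practice, this decomposition can be reproduced with any standard LDA implementation applied to a site-by-species abundance matrix. The sketch below uses scikit-learn on synthetic counts; the matrix size, the Poisson abundances, and the number of component communities are assumptions for illustration, not the paper's data.

      import numpy as np
      from sklearn.decomposition import LatentDirichletAllocation

      rng = np.random.default_rng(3)

      # Hypothetical site-by-species abundance matrix (40 forest plots x 12 tree species);
      # in the paper this would be, e.g., forest inventory plot data for the eastern US.
      counts = rng.poisson(lam=rng.uniform(0.5, 6.0, size=(40, 12)))

      # Decompose the assemblages into 3 component communities ("topics" in LDA terms).
      lda = LatentDirichletAllocation(n_components=3, random_state=0)
      site_mixtures = lda.fit_transform(counts)                 # community proportions per plot
      site_mixtures /= site_mixtures.sum(axis=1, keepdims=True)

      community_profiles = lda.components_ / lda.components_.sum(axis=1, keepdims=True)
      print("plot 0 community proportions:", np.round(site_mixtures[0], 2))
      print("community 0 species profile:", np.round(community_profiles[0], 2))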

  2. Decomposing biodiversity data using the Latent Dirichlet Allocation model, a probabilistic multivariate statistical method.

    PubMed

    Valle, Denis; Baiser, Benjamin; Woodall, Christopher W; Chazdon, Robin

    2014-12-01

    We propose a novel multivariate method to analyse biodiversity data based on the Latent Dirichlet Allocation (LDA) model. LDA, a probabilistic model, reduces assemblages to sets of distinct component communities. It produces easily interpretable results, can represent abrupt and gradual changes in composition, accommodates missing data and allows for coherent estimates of uncertainty. We illustrate our method using tree data for the eastern United States and from a tropical successional chronosequence. The model is able to detect pervasive declines in the oak community in Minnesota and Indiana, potentially due to fire suppression, increased growing season precipitation and herbivory. The chronosequence analysis is able to delineate clear successional trends in species composition, while also revealing that site-specific factors significantly impact these successional trajectories. The proposed method provides a means to decompose and track the dynamics of species assemblages along temporal and spatial gradients, including effects of global change and forest disturbances. PMID:25328064

  3. Integrating geological and geophysical data to improve probabilistic hazard forecasting of Arabian Shield volcanism

    NASA Astrophysics Data System (ADS)

    Runge, Melody G.; Bebbington, Mark S.; Cronin, Shane J.; Lindsay, Jan M.; Moufti, Mohammed R.

    2016-02-01

    During probabilistic volcanic hazard analysis of volcanic fields, a greater variety of spatial data on crustal features should help improve forecasts of future vent locations. Without further examination, however, geophysical estimations of crustal or other features may be non-informative. Here, we present a new, robust, non-parametric method to quantitatively determine the existence of any relationship between natural phenomena (e.g., volcanic eruptions) and a variety of geophysical data. This provides a new validation tool for incorporating a range of potentially hazard-diagnostic observable data into recurrence rate estimates and hazard analyses. Through this study it is shown that the locations of Cenozoic volcanic fields across the Arabian Shield appear to be related to the locations of major and minor faults, to higher elevations, and to regions where gravity anomaly values lie between -125 mGal and 0 mGal. These findings support earlier hypotheses that the western shield uplift was related to Cenozoic volcanism. At the harrat (volcanic field) scale, higher vent-density regions are related to both elevation and gravity anomaly values. A by-product of this work is the collection of existing data on the volcanism across Saudi Arabia, with all vent locations provided herein, as well as updated maps for Harrats Kura, Khaybar, Ithnayn, Kishb, and Rahat. This work also highlights the potential dangers of assuming relationships between observed data and the occurrence of a natural phenomenon without quantitative assessment or proper consideration of the effects of data resolution.

  4. Probabilistic Residual Strength Model Developed for Life Prediction of Ceramic Matrix Composites

    NASA Technical Reports Server (NTRS)

    Thomas, David J.; Verrilli, Michael J.; Calomino, Anthony M.

    2004-01-01

    For the next generation of reusable launch vehicles, NASA is investigating introducing ceramic matrix composites (CMCs) in place of current superalloys for structural propulsion applications (e.g., nozzles, vanes, combustors, and heat exchangers). The higher use temperatures of CMCs will reduce vehicle weight by eliminating and/or reducing cooling system requirements. The increased strength-to-weight ratio of CMCs relative to superalloys further enhances their weight savings potential. However, in order to provide safe designs for components made of these new materials, a comprehensive life prediction methodology for CMC structures needs to be developed. A robust methodology for lifing composite structures has yet to be adopted by the engineering community. Current industry design practice continues to utilize deterministic empirically based models borrowed from metals design for predicting material life capabilities. The deterministic nature of these models inadequately addresses the stochastic character of brittle composites, and their empirical reliance makes predictions beyond the experimental test conditions a risky extrapolation. A team of engineers at the NASA Glenn Research Center has been developing a new life prediction engineering model. The Probabilistic Residual Strength (PRS) model uses the residual strength of the composite as its damage metric. Expected life and material strength are both considered probabilistically to account for the observed stochastic material response. Extensive experimental testing has been carried out on C/SiC (a candidate aerospace CMC material system) in a controlled 1000 ppm O2/argon environment at elevated temperatures of 800 and 1200 C. The test matrix was established to allow observation of the material behavior, characterization of the model, and validation of the model's predictive capabilities. Sample results of the validation study are illustrated in the graphs.

  5. QUANTIFYING AGGREGATE CHLORPYRIFOS EXPOSURE AND DOSE TO CHILDREN USING A PHYSICALLY-BASED TWO-STAGE MONTE CARLO PROBABILISTIC MODEL

    EPA Science Inventory

    To help address the Food Quality Protection Act of 1996, a physically-based, two-stage Monte Carlo probabilistic model has been developed to quantify and analyze aggregate exposure and dose to pesticides via multiple routes and pathways. To illustrate model capabilities and ide...

  6. Probabilistic modeling of the flows and environmental risks of nano-silica.

    PubMed

    Wang, Yan; Kalinina, Anna; Sun, Tianyin; Nowack, Bernd

    2016-03-01

    Nano-silica, the engineered nanomaterial with one of the largest production volumes, has a wide range of applications in consumer products and industry. This study aimed to quantify the exposure of nano-silica to the environment and to assess its risk to surface waters. Concentrations were calculated for four environmental (air, soil, surface water, sediments) and two technical compartments (wastewater, solid waste) for the EU and Switzerland using probabilistic material flow modeling. The corresponding median concentration in surface water is predicted to be 0.12 μg/l in the EU (0.053-3.3 μg/l, 15/85% quantiles). The concentrations in sediments in the complete sedimentation scenario were found to be the largest among all environmental compartments, with a median annual increase of 0.43 mg/kg · y in the EU (0.19-12 mg/kg · y, 15/85% quantiles). Moreover, probabilistic species sensitivity distributions (PSSD) were computed and the risk of nano-silica in surface waters was quantified by comparing the predicted environmental concentration (PEC) with the predicted no-effect concentration (PNEC) distribution, which was derived from the cumulative PSSD. This assessment suggests that nano-silica currently poses no risk to aquatic organisms in surface waters. Further investigations are needed to assess the risk of nano-silica in other environmental compartments, which is currently not possible due to a lack of ecotoxicological data. PMID:26745294
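
    The final risk characterization step, comparing the PEC distribution against a PNEC distribution derived from the probabilistic species sensitivity distribution, can be sketched as follows. Both lognormal forms and their parameters are assumptions chosen only to be roughly consistent with the quoted surface-water quantiles, not the study's fitted distributions.

      import numpy as np

      rng = np.random.default_rng(5)
      n = 500_000

      # Illustrative distributions: PEC of nano-silica in surface water (ug/L) and a PNEC
      # distribution derived from a species sensitivity distribution (lognormal forms assumed).
      pec = rng.lognormal(mean=np.log(0.12), sigma=1.2, size=n)     # predicted environmental conc.
      pnec = rng.lognormal(mean=np.log(25.0), sigma=1.0, size=n)    # predicted no-effect conc.

      risk_quotient = pec / pnec
      print("P(PEC > PNEC) ~ %.4f" % np.mean(risk_quotient > 1.0))
      print("median risk quotient ~ %.4f" % np.median(risk_quotient))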

  7. A spatio-temporal model for probabilistic seismic hazard zonation of Tehran

    NASA Astrophysics Data System (ADS)

    Hashemi, Mahdi; Alesheikh, Ali Asghar; Zolfaghari, Mohammad Reza

    2013-08-01

    A precondition for all disaster management steps, building damage prediction, and construction code developments is a hazard assessment that shows the exceedance probabilities of different ground motion levels at a site considering different near- and far-field earthquake sources. The seismic sources are usually categorized as time-independent area sources and time-dependent fault sources. While the former incorporates small and medium events, the latter takes into account only the large characteristic earthquakes. In this article, a probabilistic approach is proposed to aggregate the effects of time-dependent and time-independent sources on seismic hazard. The methodology is then applied to generate three probabilistic seismic hazard maps of Tehran for 10%, 5%, and 2% exceedance probabilities in 50 years. The results indicate an increase in peak ground acceleration (PGA) values toward the southeastern part of the study area, and the PGA variations are mostly controlled by the shear wave velocities across the city. In addition, the implementation of the methodology takes advantage of GIS capabilities, especially raster-based analyses and representations. During the estimation of the PGA exceedance rates, the emphasis has been placed on incorporating the effects of different attenuation relationships and seismic source models by using a logic tree.
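
    Aggregating time-independent and time-dependent sources amounts to summing their contributions to the annual rate of exceeding each ground-motion level. The sketch below builds such a hazard curve for a single site with two illustrative sources and a generic lognormal ground-motion model; all rates, medians, and sigmas are assumptions, and the logic-tree and GIS aspects are omitted.

      import numpy as np
      from scipy.stats import norm

      pga_levels = np.logspace(-2, 0.3, 60)      # PGA levels from 0.01 g to ~2 g

      sources = [
          {"rate": 0.20, "median_pga": 0.08, "sigma_ln": 0.6},   # time-independent areal source
          {"rate": 0.01, "median_pga": 0.35, "sigma_ln": 0.5},   # characteristic fault source
      ]

      # Annual exceedance rate: sum over sources of (activity rate) x P(PGA > a | event).
      annual_rate = np.zeros_like(pga_levels)
      for s in sources:
          p_exceed = 1.0 - norm.cdf(np.log(pga_levels),
                                    loc=np.log(s["median_pga"]), scale=s["sigma_ln"])
          annual_rate += s["rate"] * p_exceed

      # PGA with 10% probability of exceedance in 50 years (Poisson assumption).
      target_rate = -np.log(1.0 - 0.10) / 50.0
      pga_10in50 = np.interp(target_rate, annual_rate[::-1], pga_levels[::-1])
      print(f"PGA with 10% probability of exceedance in 50 years ~ {pga_10in50:.2f} g")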

  8. Use of probabilistic inversion to model qualitative expert input when selecting a new nuclear reactor technology

    NASA Astrophysics Data System (ADS)

    Merritt, Charles R., Jr.

    Complex investment decisions by corporate executives often require the comparison of dissimilar attributes and competing technologies. A technique to evaluate qualitative input from experts using a Multi-Criteria Decision Method (MCDM) is described to select a new reactor technology for a merchant nuclear generator. The high capital cost, risks from design, licensing and construction, reactor safety and security considerations are some of the diverse considerations when choosing a reactor design. Three next generation reactor technologies are examined: the Advanced Pressurized-1000 (AP-1000) from Westinghouse, Economic Simplified Boiling Water Reactor (ESBWR) from General Electric, and the U.S. Evolutionary Power Reactor (U.S. EPR) from AREVA. Recent developments in MCDM and decision support systems are described. The uncertainty inherent in experts' opinions for the attribute weighting in the MCDM is modeled through the use of probabilistic inversion. In probabilistic inversion, a function is inverted into a random variable within a defined range. Once the distribution is created, random samples based on the distribution are used to perform a sensitivity analysis on the decision results to verify the "strength" of the results. The decision results for the pool of experts identified the U.S. EPR as the optimal choice.

  9. Predicting the acute neurotoxicity of diverse organic solvents using probabilistic neural networks based QSTR modeling approaches.

    PubMed

    Basant, Nikita; Gupta, Shikha; Singh, Kunwar P

    2016-03-01

    Organic solvents are widely used chemicals and the neurotoxic properties of some are well established. In this study, we established nonlinear qualitative and quantitative structure-toxicity relationship (STR) models for predicting neurotoxic classes and neurotoxicity of structurally diverse solvents in rodent test species following OECD guideline principles for model development. Probabilistic neural network (PNN) based qualitative and generalized regression neural network (GRNN) based quantitative STR models were constructed using neurotoxicity data from rat and mouse studies. Further, interspecies correlation based quantitative activity-activity relationship (QAAR) and global QSTR models were also developed using the combined data set of both rodent species for predicting the neurotoxicity of solvents. The constructed models were validated through deriving several statistical coefficients for the test data and the prediction and generalization abilities of these models were evaluated. The qualitative STR models (rat and mouse) yielded classification accuracies of 92.86% in the test data sets, whereas, the quantitative STRs yielded correlation (R(2)) of >0.93 between the measured and model predicted toxicity values in both the test data (rat and mouse). The prediction accuracies of the QAAR (R(2) 0.859) and global STR (R(2) 0.945) models were comparable to those of the independent local STR models. The results suggest the ability of the developed QSTR models to reliably predict binary neurotoxicity classes and the endpoint neurotoxicities of the structurally diverse organic solvents. PMID:26721664
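
    A probabilistic neural network is essentially a Parzen-window (kernel density) classifier, which the minimal sketch below implements for a binary neurotoxic/non-neurotoxic decision on synthetic molecular descriptors. The descriptor values, the kernel width, and the class structure are assumptions for illustration, not the study's QSTR descriptors.

      import numpy as np

      def pnn_classify(X_train, y_train, X_test, sigma=0.5):
          """Minimal probabilistic neural network (Parzen-window classifier): each class's
          density is a sum of Gaussian kernels centred on its training descriptors, and a
          test compound is assigned to the class with the larger density."""
          classes = np.unique(y_train)
          scores = []
          for c in classes:
              Xc = X_train[y_train == c]
              d2 = ((X_test[:, None, :] - Xc[None, :, :]) ** 2).sum(axis=2)
              scores.append(np.exp(-d2 / (2.0 * sigma ** 2)).mean(axis=1))
          return classes[np.argmax(np.array(scores), axis=0)]

      rng = np.random.default_rng(4)
      # Synthetic descriptors for "non-neurotoxic" (0) vs "neurotoxic" (1) solvents.
      X0 = rng.normal(-1.0, 1.0, size=(30, 3))
      X1 = rng.normal(+1.0, 1.0, size=(30, 3))
      X_train = np.vstack([X0, X1])
      y_train = np.array([0] * 30 + [1] * 30)
      X_test = rng.normal(0.0, 1.5, size=(10, 3))
      print(pnn_classify(X_train, y_train, X_test))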

  10. Probabilistic Model of Onset Detection Explains Paradoxes in Human Time Perception

    PubMed Central

    Nikolov, Stanislav; Rahnev, Dobromir A.; Lau, Hakwan C.

    2010-01-01

    A very basic computational model is proposed to explain two puzzling findings in the time perception literature. First, spontaneous motor actions are preceded by up to 1–2 s of preparatory activity (Kornhuber and Deecke, 1965). Yet, subjects are only consciously aware of about a quarter of a second of motor preparation (Libet et al., 1983). Why are they not aware of the early part of preparation? Second, psychophysical findings (Spence et al., 2001) support the principle of attention prior entry (Titchener, 1908), which states that attended stimuli are perceived faster than unattended stimuli. However, electrophysiological studies reported no or little corresponding temporal difference between the neural signals for attended and unattended stimuli (McDonald et al., 2005; Vibell et al., 2007). We suggest that the key to understanding these puzzling findings is to think of onset detection in probabilistic terms. The two apparently paradoxical phenomena are naturally predicted by our signal detection theoretic model. PMID:21833206

  11. Probabilistic multi-item inventory model with varying mixture shortage cost under restrictions.

    PubMed

    Fergany, Hala A

    2016-01-01

    This paper proposes a new general probabilistic multi-item, single-source inventory model with varying mixture shortage cost under two restrictions. One of them is on the expected varying backorder cost and the other is on the expected varying lost sales cost. This model is formulated to analyze how the firm can deduce the optimal order quantity and the optimal reorder point for each item to reach the main goal of minimizing the expected total cost. The demand is a random variable and the lead time is a constant. The demand during the lead time is a random variable that follows any continuous distribution, for example, the normal, exponential, or chi-square distribution. An application with real data is analyzed and the goal of minimizing the expected total cost is achieved. Two special cases are deduced. PMID:27588244
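
    The flavor of such models can be conveyed with the classical single-item continuous-review (Q, r) case under normally distributed lead-time demand, solved by the usual iterative scheme. This is a simplification of the multi-item, constrained, mixture-shortage-cost model in the paper, and all cost and demand figures below are assumptions.

      import numpy as np
      from scipy.stats import norm

      # Illustrative single-item data (all values assumed, not from the paper):
      D = 1200.0                      # expected annual demand (units/yr)
      K = 50.0                        # ordering cost per order
      h = 2.0                         # holding cost per unit per year
      pi_b = 8.0                      # shortage (backorder) cost per unit short
      mu_L, sigma_L = 100.0, 20.0     # mean and std of demand during the constant lead time

      def expected_shortage(r):
          """E[(X - r)+] for normal lead-time demand X ~ N(mu_L, sigma_L^2)."""
          z = (r - mu_L) / sigma_L
          return sigma_L * (norm.pdf(z) - z * (1.0 - norm.cdf(z)))

      # Classical iterative scheme: alternate between the EOQ-type formula for Q
      # and the optimality condition P(X > r) = Q*h / (pi_b*D) for r.
      Q = np.sqrt(2.0 * K * D / h)
      for _ in range(20):
          r = mu_L + sigma_L * norm.ppf(1.0 - Q * h / (pi_b * D))
          Q = np.sqrt(2.0 * D * (K + pi_b * expected_shortage(r)) / h)

      total = K * D / Q + h * (Q / 2.0 + r - mu_L) + pi_b * D * expected_shortage(r) / Q
      print(f"Q* ~ {Q:.1f}, r* ~ {r:.1f}, expected annual cost ~ {total:.1f}")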

  12. Modeling of the Sedimentary Interbedded Basalt Stratigraphy for the Idaho National Laboratory Probabilistic Seismic Hazard Analysis

    SciTech Connect

    Suzette Payne

    2007-08-01

    This report summarizes how the effects of the sedimentary interbedded basalt stratigraphy were modeled in the probabilistic seismic hazard analysis (PSHA) of the Idaho National Laboratory (INL). Drill holes indicate the bedrock beneath INL facilities is composed of about 1.1 km of alternating layers of basalt rock and loosely consolidated sediments. Alternating layers of hard rock and “soft” loose sediments tend to attenuate seismic energy greater than uniform rock due to scattering and damping. The INL PSHA incorporated the effects of the sedimentary interbedded basalt stratigraphy by developing site-specific shear (S) wave velocity profiles. The profiles were used in the PSHA to model the near-surface site response by developing site-specific stochastic attenuation relationships.

  13. Probabilistic solution of random SI-type epidemiological models using the Random Variable Transformation technique

    NASA Astrophysics Data System (ADS)

    Casabán, M.-C.; Cortés, J.-C.; Romero, J.-V.; Roselló, M.-D.

    2015-07-01

    This paper presents a full probabilistic description of the solution of random SI-type epidemiological models which are based on nonlinear differential equations. This description consists of determining: the first probability density function of the solution in terms of the density functions of the diffusion coefficient and the initial condition, which are assumed to be independent random variables; the expectation and variance functions of the solution as well as confidence intervals; and, finally, the distribution of the time until a given proportion of susceptibles remains in the population. The obtained formulas are general since they are valid regardless of the probability distributions assigned to the random inputs. We also present a pair of illustrative examples, including in one of them the application of the theoretical results to model the diffusion of a technology using real data.

  14. Modeling of the Sedimentary Interbedded Basalt Stratigraphy for the Idaho National Laboratory Probabilistic Seismic Hazard Analysis

    SciTech Connect

    Suzette Payne

    2006-04-01

    This report summarizes how the effects of the sedimentary interbedded basalt stratigraphy were modeled in the probabilistic seismic hazard analysis (PSHA) of the Idaho National Laboratory (INL). Drill holes indicate the bedrock beneath INL facilities is composed of about 1.1 km of alternating layers of basalt rock and loosely consolidated sediments. Alternating layers of hard rock and “soft” loose sediments tend to attenuate seismic energy more than uniform rock does, due to scattering and damping. The INL PSHA incorporated the effects of the sedimentary interbedded basalt stratigraphy by developing site-specific shear (S) wave velocity profiles. The profiles were used in the PSHA to model the near-surface site response by developing site-specific stochastic attenuation relationships.

  15. Improving predictive power of physically based rainfall-induced shallow landslide models: a probabilistic approach

    NASA Astrophysics Data System (ADS)

    Raia, S.; Alvioli, M.; Rossi, M.; Baum, R. L.; Godt, J. W.; Guzzetti, F.

    2014-03-01

    Distributed models to forecast the spatial and temporal occurrence of rainfall-induced shallow landslides are based on deterministic laws. These models extend spatially the static stability models adopted in geotechnical engineering, and adopt an infinite-slope geometry to balance the resisting and the driving forces acting on the sliding mass. An infiltration model is used to determine how rainfall changes pore-water conditions, modulating the local stability/instability conditions. A problem with the operation of the existing models lies in the difficulty of obtaining accurate values for the several variables that describe the material properties of the slopes. The problem is particularly severe when the models are applied over large areas, for which sufficient information on the geotechnical and hydrological conditions of the slopes is not generally available. To help solve the problem, we propose a probabilistic Monte Carlo approach to the distributed modeling of rainfall-induced shallow landslides. For this purpose, we have modified the transient rainfall infiltration and grid-based regional slope-stability analysis (TRIGRS) code. The new code (TRIGRS-P) adopts a probabilistic approach to compute, on a cell-by-cell basis, transient pore-pressure changes and related changes in the factor of safety due to rainfall infiltration. Infiltration is modeled using analytical solutions of partial differential equations describing one-dimensional vertical flow in isotropic, homogeneous materials. Both saturated and unsaturated soil conditions can be considered. TRIGRS-P copes with the natural variability inherent to the mechanical and hydrological properties of the slope materials by allowing values of the TRIGRS model input parameters to be sampled randomly from a given probability distribution. The range of variation and the mean value of the parameters can be determined by the usual methods used for preparing the TRIGRS input parameters. The outputs of several model
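
    A hedged sketch of the probabilistic step that TRIGRS-P adds: material properties are sampled from probability distributions and the fraction of realizations with a factor of safety below one is recorded. Here the transient infiltration solution is replaced by a fixed water-table ratio and a standard infinite-slope formula, so this is only the Monte Carlo shell, not the TRIGRS physics; all parameter values are illustrative placeholders.

    import numpy as np

    rng = np.random.default_rng(42)
    n = 100_000
    slope = np.radians(35.0)          # slope angle
    z = 2.0                           # depth of the potential failure surface (m)
    gamma_s, gamma_w = 19.0, 9.81     # soil and water unit weights (kN/m^3)
    m = 0.8                           # height of water table / failure depth

    # Random material properties sampled from assumed distributions
    cohesion = rng.lognormal(mean=np.log(4.0), sigma=0.4, size=n)     # kPa
    phi = np.radians(rng.normal(loc=32.0, scale=3.0, size=n))         # friction angle

    # Infinite-slope factor of safety with slope-parallel seepage
    fs = (cohesion + (gamma_s - m * gamma_w) * z * np.cos(slope) ** 2 * np.tan(phi)) \
         / (gamma_s * z * np.sin(slope) * np.cos(slope))

    print(f"P(FS < 1) = {np.mean(fs < 1.0):.3f}, median FS = {np.median(fs):.2f}")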

  16. Neural Mechanisms for Integrating Prior Knowledge and Likelihood in Value-Based Probabilistic Inference

    PubMed Central

    Ting, Chih-Chung; Yu, Chia-Chen; Maloney, Laurence T.

    2015-01-01

    In Bayesian decision theory, knowledge about the probabilities of possible outcomes is captured by a prior distribution and a likelihood function. The prior reflects past knowledge and the likelihood summarizes current sensory information. The two combined (integrated) form a posterior distribution that allows estimation of the probability of different possible outcomes. In this study, we investigated the neural mechanisms underlying Bayesian integration using a novel lottery decision task in which both prior knowledge and likelihood information about reward probability were systematically manipulated on a trial-by-trial basis. Consistent with Bayesian integration, as sample size increased, subjects tended to weigh likelihood information more compared with prior information. Using fMRI in humans, we found that the medial prefrontal cortex (mPFC) correlated with the mean of the posterior distribution, a statistic that reflects the integration of prior knowledge and likelihood of reward probability. Subsequent analysis revealed that both prior and likelihood information were represented in mPFC and that the neural representations of prior and likelihood in mPFC reflected changes in the behaviorally estimated weights assigned to these different sources of information in response to changes in the environment. Together, these results establish the role of mPFC in prior-likelihood integration and highlight its involvement in representing and integrating these distinct sources of information. PMID:25632152
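
    The behavioral signature reported here (likelihood weighted more heavily as sample size grows) falls out of a textbook conjugate example. The sketch below combines a Beta prior over reward probability with binomial sample evidence; all numbers are illustrative and unrelated to the study's actual stimuli.

    from scipy.stats import beta

    prior_a, prior_b = 6.0, 4.0          # prior belief centered near 0.6
    sample_proportion = 0.2              # current sample points to 0.2

    for n in (5, 20, 80):                # increasing sample size
        successes = sample_proportion * n
        posterior = beta(prior_a + successes, prior_b + (n - successes))
        print(f"n = {n:3d}: posterior mean = {posterior.mean():.3f}")
    # The posterior mean drifts from the prior (0.6) toward the sample proportion (0.2) as n grows.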

  17. Detection and characterization of regulatory elements using probabilistic conditional random field and hidden Markov models.

    PubMed

    Wang, Hongyan; Zhou, Xiaobo

    2013-04-01

    By altering the electrostatic charge of histones or providing binding sites for protein recognition molecules, chromatin marks have been proposed to regulate gene expression, a property that has motivated researchers to link these marks to cis-regulatory elements. With the help of next generation sequencing technologies, we can now correlate one specific chromatin mark with regulatory elements (e.g. enhancers or promoters) and also build tools, such as hidden Markov models, to gain insight into mark combinations. However, hidden Markov models are limited by their generative nature and by the assumption that the current observation depends only on the current hidden state in the chain. Here, we employed two graphical probabilistic models, namely the linear conditional random field model and the multivariate hidden Markov model, to mark gene regions with different states based on the recurrent and spatially coherent character of eight chromatin marks. Both models revealed chromatin states that may correspond to enhancers and promoters, transcribed regions, transcriptional elongation, and low-signal regions. We also found that the linear conditional random field model was more effective than the hidden Markov model in recognizing regulatory elements, such as promoter-, enhancer-, and transcriptional elongation-associated regions, making it the better choice for this task. PMID:23237214

  18. ToPS: a framework to manipulate probabilistic models of sequence data.

    PubMed

    Kashiwabara, André Yoshiaki; Bonadio, Igor; Onuchic, Vitor; Amado, Felipe; Mathias, Rafael; Durham, Alan Mitchell

    2013-01-01

    Discrete Markovian models can be used to characterize patterns in sequences of values and have many applications in biological sequence analysis, including gene prediction, CpG island detection, alignment, and protein profiling. We present ToPS, a computational framework that can be used to implement different applications in bioinformatics analysis by combining eight kinds of models: (i) independent and identically distributed process; (ii) variable-length Markov chain; (iii) inhomogeneous Markov chain; (iv) hidden Markov model; (v) profile hidden Markov model; (vi) pair hidden Markov model; (vii) generalized hidden Markov model; and (viii) similarity-based sequence weighting. The framework includes functionality for training, simulation and decoding of the models. Additionally, it provides two methods to help with parameter setting: the Akaike and Bayesian information criteria (AIC and BIC). The models can be used stand-alone, combined in Bayesian classifiers, or included in more complex, multi-model, probabilistic architectures using GHMMs. In particular, the framework provides a novel, flexible implementation of decoding in GHMMs that detects when the architecture can be traversed efficiently. PMID:24098098

  19. A 3-D probabilistic stability model incorporating the variability of root reinforcement

    NASA Astrophysics Data System (ADS)

    Cislaghi, Alessio; Chiaradia, Enrico; Battista Bischetti, Gian

    2016-04-01

    Process-oriented models of hillslope stability have great potential to improve spatially distributed landslide hazard analyses. At the same time, they may have severe limitations, and among them the variability and uncertainty of the parameters play a key role. In this context, the application of a probabilistic approach through Monte Carlo techniques can be the right practice to deal with the variability of each input parameter by considering a proper probability distribution. In forested areas an additional point must be taken into account: the reinforcement due to roots permeating the soil, and its variability and uncertainty. While the probability distributions of geotechnical and hydrological parameters have been widely investigated, little is known concerning the variability and the spatial heterogeneity of root reinforcement. Moreover, there are still many difficulties in measuring and in evaluating such a variable. In our study we aim to: (i) implement a robust procedure to evaluate the variability of root reinforcement as a probabilistic distribution, according to the stand characteristics of forests, such as tree density, average diameter at breast height, and minimum distance among trees; and (ii) combine a multidimensional process-oriented model with a Monte Carlo simulation technique, to obtain a probability distribution of the Factor of Safety. The proposed approach has been applied to a small Alpine area, mainly covered by a coniferous forest and characterized by steep slopes and a high landslide hazard. The obtained results show a good reliability of the model according to the landslide inventory map. In the end, our findings contribute to improving the reliability of landslide hazard mapping in forested areas and help forest managers to evaluate different management scenarios.

  20. Probabilistic Graphical Models for the Analysis and Synthesis of Musical Audio

    NASA Astrophysics Data System (ADS)

    Hoffmann, Matthew Douglas

    Content-based Music Information Retrieval (MIR) systems seek to automatically extract meaningful information from musical audio signals. This thesis applies new and existing generative probabilistic models to several content-based MIR tasks: timbral similarity estimation, semantic annotation and retrieval, and latent source discovery and separation. In order to estimate how similar two songs sound to one another, we employ a Hierarchical Dirichlet Process (HDP) mixture model to discover a shared representation of the distribution of timbres in each song. Comparing songs under this shared representation yields better query-by-example retrieval quality and scalability than previous approaches. To predict what tags are likely to apply to a song (e.g., "rap," "happy," or "driving music"), we develop the Codeword Bernoulli Average (CBA) model, a simple and fast mixture-of-experts model. Despite its simplicity, CBA performs at least as well as state-of-the-art approaches at automatically annotating songs and finding to what songs in a database a given tag most applies. Finally, we address the problem of latent source discovery and separation by developing two Bayesian nonparametric models, the Shift-Invariant HDP and Gamma Process NMF. These models allow us to discover what sounds (e.g. bass drums, guitar chords, etc.) are present in a song or set of songs and to isolate or suppress individual sources. These models' ability to decide how many latent sources are necessary to model the data is particularly valuable in this application, since it is impossible to guess a priori how many sounds will appear in a given song or set of songs. Once they have been fit to data, probabilistic models can also be used to drive the synthesis of new musical audio, both for creative purposes and to qualitatively diagnose what information a model does and does not capture. We also adapt the SIHDP model to create new versions of input audio with arbitrary sample sets, for example, to create

  1. Probabilistic conditional reasoning: Disentangling form and content with the dual-source model.

    PubMed

    Singmann, Henrik; Klauer, Karl Christoph; Beller, Sieghard

    2016-08-01

    The present research examines descriptive models of probabilistic conditional reasoning, that is, reasoning from uncertain conditionals with contents about which reasoners have rich background knowledge. According to our dual-source model, two types of information shape such reasoning: knowledge-based information elicited by the contents of the material and content-independent information derived from the form of inferences. Two experiments implemented manipulations that selectively influenced the model parameters for the knowledge-based information, the relative weight given to form-based versus knowledge-based information, and the parameters for the form-based information, validating the psychological interpretation of these parameters. We apply the model to classical suppression effects, dissecting them into effects on background knowledge and effects on form-based processes (Exp. 3), and we use it to reanalyse previous studies manipulating reasoning instructions. In a model-comparison exercise based on data from seven studies, the dual-source model outperformed three Bayesian competitor models. Overall, our results support the view that people make use of background knowledge in line with current Bayesian models, but they also suggest that the form of the conditional argument, irrespective of its content, plays a substantive, yet smaller, role. PMID:27416493

  2. A time-dependent probabilistic seismic-hazard model for California

    USGS Publications Warehouse

    Cramer, C.H.; Petersen, M.D.; Cao, T.; Toppozada, Tousson R.; Reichle, M.

    2000-01-01

    For the purpose of sensitivity testing and illuminating nonconsensus components of time-dependent models, the California Department of Conservation, Division of Mines and Geology (CDMG) has assembled a time-dependent version of its statewide probabilistic seismic hazard (PSH) model for California. The model incorporates available consensus information from within the earth-science community, except for a few faults or fault segments where consensus information is not available. For these latter faults, published information has been incorporated into the model. As in the 1996 CDMG/U.S. Geological Survey (USGS) model, the time-dependent models incorporate three multisegment ruptures: a 1906, an 1857, and a southern San Andreas earthquake. Sensitivity tests are presented to show the effect on hazard and expected damage estimates of (1) intrinsic (aleatory) sigma, (2) multisegment (cascade) vs. independent segment (no cascade) ruptures, and (3) time-dependence vs. time-independence. Results indicate that (1) differences in hazard and expected damage estimates between time-dependent and independent models increase with decreasing intrinsic sigma, (2) differences in hazard and expected damage estimates between full cascading and not cascading are insensitive to intrinsic sigma, (3) differences in hazard increase with increasing return period (decreasing probability of occurrence), and (4) differences in moment-rate budgets increase with decreasing intrinsic sigma and with the degree of cascading, but are within the expected uncertainty in PSH time-dependent modeling and do not always significantly affect hazard and expected damage estimates.

  3. Probabilistic Boolean Network Modelling and Analysis Framework for mRNA Translation.

    PubMed

    Zhao, Yun-Bo; Krishnan, J

    2016-01-01

    mRNA translation is a complex process involving the progression of ribosomes on the mRNA, resulting in the synthesis of proteins, and is subject to multiple layers of regulation. This process has been modelled using different formalisms, both stochastic and deterministic. Recently, we introduced a Probabilistic Boolean modelling framework for mRNA translation, which possesses the advantage of tools for numerically exact computation of steady state probability distribution, without requiring simulation. Here, we extend this model to incorporate both random sequential and parallel update rules, and demonstrate its effectiveness in various settings, including its flexibility in accommodating additional static and dynamic biological complexities and its role in parameter sensitivity analysis. In these applications, the results from the model analysis match those of TASEP model simulations. Importantly, the proposed modelling framework maintains the stochastic aspects of mRNA translation and provides a way to exactly calculate probability distributions, providing additional tools of analysis in this context. Finally, the proposed modelling methodology provides an alternative approach to the understanding of the mRNA translation process, by bridging the gap between existing approaches, providing new analysis tools, and contributing to a more robust platform for modelling and understanding translation. PMID:26390498
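
    The "numerically exact steady state without simulation" property can be illustrated on a toy synchronous probabilistic Boolean network: enumerate the state space, build the transition probability matrix from the candidate Boolean functions and their selection probabilities, and read the stationary distribution off the eigenvector for eigenvalue one. The two-gene network below is invented for illustration; the framework described above additionally handles random sequential and parallel update rules and much larger models of translation.

    import numpy as np
    from itertools import product

    funcs = {                                      # candidate functions and their probabilities
        0: [(lambda x: x[1],        0.7),          # gene 0 copies gene 1 ...
            (lambda x: 1 - x[1],    0.3)],         # ... or takes its negation
        1: [(lambda x: x[0] & x[1], 0.6),          # gene 1: AND of both genes ...
            (lambda x: x[0] | x[1], 0.4)],         # ... or OR of both genes
    }

    states = list(product((0, 1), repeat=2))       # 00, 01, 10, 11
    P = np.zeros((4, 4))
    for i, s in enumerate(states):
        for (f0, p0), (f1, p1) in product(funcs[0], funcs[1]):
            nxt = (f0(s), f1(s))                   # synchronous update of both genes
            P[i, states.index(nxt)] += p0 * p1

    vals, vecs = np.linalg.eig(P.T)                # stationary distribution: left eigenvector
    pi = np.real(vecs[:, np.argmin(np.abs(vals - 1.0))])   # for eigenvalue 1
    pi = pi / pi.sum()
    for s, p in zip(states, pi):
        print(f"state {s}: steady-state probability {p:.3f}")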

  4. Multi-level approach for statistical appearance models with probabilistic correspondences

    NASA Astrophysics Data System (ADS)

    Krüger, Julia; Ehrhardt, Jan; Handels, Heinz

    2016-03-01

    Statistical shape and appearance models are often based on the accurate identification of one-to-one correspondences in a training data set. At the same time, the determination of these corresponding landmarks is the most challenging part of such methods. Hufnagel et al.1 developed an alternative method using correspondence probabilities for a statistical shape model. In Krüger et al.2, 3 we propose the use of probabilistic correspondences for statistical appearance models by incorporating appearance information into the framework. We employ a point-based representation of image data combining position and appearance information. The model is optimized and adapted by a maximum a-posteriori (MAP) approach deriving a single global optimization criterion with respect to model parameters and observation-dependent parameters that directly affects shape and appearance information of the considered structures. Because initially unknown correspondence probabilities are used and a higher number of degrees of freedom is introduced to the model, a regularization of the model generation process is advantageous. For this purpose we extend the derived global criterion by a regularization term which penalizes implausible topological changes. Furthermore, we propose a multi-level approach for the optimization, to increase the robustness of the model generation process.

  5. Behavioral Modeling Based on Probabilistic Finite Automata: An Empirical Study †

    PubMed Central

    Tîrnăucă, Cristina; Montaña, José L.; Ontañón, Santiago; González, Avelino J.; Pardo, Luis M.

    2016-01-01

    Imagine an agent that performs tasks according to different strategies. The goal of Behavioral Recognition (BR) is to identify which of the available strategies is the one being used by the agent, by simply observing the agent’s actions and the environmental conditions during a certain period of time. The goal of Behavioral Cloning (BC) is more ambitious. In this last case, the learner must be able to build a model of the behavior of the agent. In both settings, the only assumption is that the learner has access to a training set that contains instances of observed behavioral traces for each available strategy. This paper studies a machine learning approach based on Probabilistic Finite Automata (PFAs), capable of achieving both the recognition and cloning tasks. We evaluate the performance of PFAs in the context of a simulated learning environment (in this case, a virtual Roomba vacuum cleaner robot), and compare it with a collection of other machine learning approaches. PMID:27347956
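
    A minimal sketch of the recognition side of this approach: one probabilistic finite automaton per strategy assigns a likelihood to the observed action sequence, and the strategy with the higher log-likelihood is reported. In the paper the PFAs are learned from training traces; the two automata, their transition probabilities, and the action alphabet below are invented for illustration.

    import math

    # Each PFA maps state -> {symbol: (next_state, probability)}, with symbol
    # probabilities summing to one in every state.
    PFA_SYSTEMATIC = {
        0: {"move": (0, 0.6), "turn": (1, 0.3), "dock": (0, 0.1)},
        1: {"move": (0, 0.5), "turn": (1, 0.4), "dock": (1, 0.1)},
    }
    PFA_RANDOM_WALK = {
        0: {"move": (1, 0.34), "turn": (0, 0.33), "dock": (1, 0.33)},
        1: {"move": (0, 0.34), "turn": (1, 0.33), "dock": (0, 0.33)},
    }

    def log_likelihood(pfa, sequence, start_state=0):
        state, ll = start_state, 0.0
        for symbol in sequence:
            next_state, p = pfa[state][symbol]
            ll += math.log(p)
            state = next_state
        return ll

    observed = ["move", "move", "turn", "move", "move", "turn", "dock"]
    scores = {"systematic": log_likelihood(PFA_SYSTEMATIC, observed),
              "random walk": log_likelihood(PFA_RANDOM_WALK, observed)}
    print(scores, "-> recognized strategy:", max(scores, key=scores.get))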

  6. On probabilistic certification of combined cancer therapies using strongly uncertain models.

    PubMed

    Alamir, Mazen

    2015-11-01

    This paper proposes a general framework for probabilistic certification of cancer therapies. The certification is defined in terms of two key quantities: the tumor contraction and the lower admissible bound on the circulating lymphocytes, which is viewed as an indicator of the patient's health. Certification is viewed as the ability to guarantee, with a predefined high probability, the success of the therapy over a finite horizon despite the unavoidably high uncertainties affecting the dynamic model that is used to compute the optimal scheduling of drug injections. The certification paradigm can be viewed as a tool for tuning the treatment parameters and protocols, as well as for making rational use of limited or expensive drugs. The proposed framework is illustrated using the specific problem of combined immunotherapy/chemotherapy of cancer. PMID:26300070

  7. A Model-Based Probabilistic Inversion Framework for Wire Fault Detection Using TDR

    NASA Technical Reports Server (NTRS)

    Schuet, Stefan R.; Timucin, Dogan A.; Wheeler, Kevin R.

    2010-01-01

    Time-domain reflectometry (TDR) is one of the standard methods for diagnosing faults in electrical wiring and interconnect systems, with a long-standing history focused mainly on hardware development of both high-fidelity systems for laboratory use and portable hand-held devices for field deployment. While these devices can easily assess distance to hard faults such as sustained opens or shorts, their ability to assess subtle but important degradation such as chafing remains an open question. This paper presents a unified framework for TDR-based chafing fault detection in lossy coaxial cables by combining an S-parameter based forward modeling approach with a probabilistic (Bayesian) inference algorithm. Results are presented for the estimation of nominal and faulty cable parameters from laboratory data.

  8. Salient and Non-Salient Fiducial Detection using a Probabilistic Graphical Model

    PubMed Central

    Benitez-Quiroz, C. Fabian; Rivera, Samuel; Gotardo, Paulo F.U.; Martinez, Aleix M.

    2013-01-01

    Deformable shape detection is an important problem in computer vision and pattern recognition. However, standard detectors are typically limited to locating only a few salient landmarks such as landmarks near edges or areas of high contrast, often conveying insufficient shape information. This paper presents a novel statistical pattern recognition approach to locate a dense set of salient and non-salient landmarks in images of a deformable object. We explore the fact that several object classes exhibit a homogeneous structure such that each landmark position provides some information about the position of the other landmarks. In our model, the relationship between all pairs of landmarks is naturally encoded as a probabilistic graph. Dense landmark detections are then obtained with a new sampling algorithm that, given a set of candidate detections, selects the most likely positions as to maximize the probability of the graph. Our experimental results demonstrate accurate, dense landmark detections within and across different databases. PMID:24187386

  9. Life Prediction and Classification of Failure Modes in Solid State Luminaires Using Bayesian Probabilistic Models

    SciTech Connect

    Lall, Pradeep; Wei, Junchao; Sakalaukus, Peter

    2014-05-27

    A new method has been developed for assessing the onset of degradation in solid state luminaires and classifying failure mechanisms using metrics beyond the lumen degradation currently used for identification of failure. Luminous flux output and correlated color temperature data on Philips LED lamps were gathered under 85°C/85% RH until lamp failure. The acquired data have been used in conjunction with Bayesian probabilistic models to identify luminaires with onset of degradation well before failure, through identification of decision boundaries in the feature space between lamps with accrued damage and lamps beyond the failure threshold. In addition, luminaires with different failure modes have been classified separately from healthy pristine luminaires. It is expected that the new test technique will allow the development of failure distributions without testing to L70 life for the manifestation of failure.

  10. The Integrated Medical Model - A Risk Assessment and Decision Support Tool for Human Space Flight Missions

    NASA Technical Reports Server (NTRS)

    Kerstman, Eric; Minard, Charles G.; Saile, Lynn; FreiredeCarvalho, Mary; Myers, Jerry; Walton, Marlei; Butler, Douglas; Lopez, Vilma

    2010-01-01

    The Integrated Medical Model (IMM) is a decision support tool that is useful to space flight mission planners and medical system designers in assessing risks and optimizing medical systems. The IMM employs an evidence-based, probabilistic risk assessment (PRA) approach within the operational constraints of space flight.

  11. A Probabilistic Model for Students' Errors and Misconceptions on the Structure of Matter in Relation to Three Cognitive Variables

    ERIC Educational Resources Information Center

    Tsitsipis, Georgios; Stamovlasis, Dimitrios; Papageorgiou, George

    2012-01-01

    In this study, the effect of three cognitive variables, namely logical thinking, field dependence/field independence, and convergent/divergent thinking, on some specific students' answers related to the particulate nature of matter was investigated by means of probabilistic models. Besides recording and tabulating the students' responses, a combination…

  12. Comparison of Four Probabilistic Models (CARES, Calendex, ConsEspo, SHEDS) to Estimate Aggregate Residential Exposures to Pesticides

    EPA Science Inventory

    Two deterministic models (US EPA’s Office of Pesticide Programs Residential Standard Operating Procedures (OPP Residential SOPs) and Draft Protocol for Measuring Children’s Non-Occupational Exposure to Pesticides by all Relevant Pathways (Draft Protocol)) and four probabilistic mo...

  13. A probabilistic model for the persistence of early planar fabrics in polydeformed pelitic schists

    USGS Publications Warehouse

    Ferguson, C.C.

    1984-01-01

    Although early planar fabrics are commonly preserved within microlithons in low-grade pelites, in higher-grade (amphibolite facies) pelitic schists fabric regeneration often appears complete. Evidence for early fabrics may be preserved within porphyroblasts but, within the matrix, later deformation often appears to totally obliterate or reorient earlier fabrics. However, examination of several hundred Dalradian pelites from Connemara, western Ireland, reveals that preservation of early fabrics is by no means uncommon; relict matrix domains, although volumetrically insignificant, are remarkably persistent even when inferred later strains are very large and fabric regeneration appears, at first sight, complete. Deterministic plasticity theories are ill-suited to the analysis of such an inhomogeneous material response, and a probabilistic model is proposed instead. It assumes that ductile polycrystal deformation is controlled by elementary flow units which can be activated once their associated stress barrier is overcome. Bulk flow propensity is related to the proportion of simultaneous activations, and a measure of this is derived from the probabilistic interaction between a stress-barrier spectrum and an internal stress spectrum (the latter determined by the external loading and the details of internal stress transfer). The spectra are modelled as Gaussian distributions although the treatment is very general and could be adapted for other distributions. Using the time rate of change of activation probability it is predicted that, initially, fabric development will be rapid but will then slow down dramatically even though stress increases at a constant rate. This highly non-linear response suggests that early fabrics persist because they comprise unfavourable distributions of stress-barriers which remain unregenerated at the time bulk stress is stabilized by steady-state flow. Relict domains will, however, bear the highest stress and are potential upper
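
    One way to make the described probabilistic interaction concrete (a sketch, not the author's formulation): if both the stress-barrier spectrum and the internal stress spectrum are Gaussian, the probability that a flow unit is activated is P = Phi((mu_s - mu_b) / sqrt(sigma_s^2 + sigma_b^2)), and letting the internal stress mean rise at a constant rate shows the activation rate peaking and then falling off even though loading keeps increasing. All parameter values are illustrative.

    import numpy as np
    from scipy.stats import norm

    mu_b, sigma_b = 100.0, 15.0        # stress-barrier spectrum (arbitrary units)
    sigma_s = 10.0                     # spread of the internal stress spectrum
    time = np.linspace(0.0, 10.0, 11)
    mu_s = 20.0 * time                 # internal stress mean grows at a constant rate

    p_activation = norm.cdf((mu_s - mu_b) / np.hypot(sigma_s, sigma_b))
    rate = np.gradient(p_activation, time)
    for ti, p, r in zip(time, p_activation, rate):
        print(f"t = {ti:4.1f}: P(activation) = {p:.3f}, dP/dt = {r:+.3f}")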

  14. Probabilistic Stack of 180 Plio-Pleistocene Benthic δ18O Records Constructed Using Profile Hidden Markov Models

    NASA Astrophysics Data System (ADS)

    Lisiecki, L. E.; Ahn, S.; Khider, D.; Lawrence, C.

    2015-12-01

    Stratigraphic alignment is the primary way in which long marine climate records are placed on a common age model. We previously presented a probabilistic pairwise alignment algorithm, HMM-Match, which uses hidden Markov models to estimate alignment uncertainty, and applied it to the alignment of benthic δ18O records to the "LR04" global benthic stack of Lisiecki and Raymo (2005) (Lin et al., 2014). However, since the LR04 stack is deterministic, the algorithm does not account for uncertainty in the stack. Here we address this limitation by developing a probabilistic stack, HMM-Stack. In this model the stack is a probabilistic inhomogeneous hidden Markov model, a.k.a. profile HMM. HMM-Stack is represented by a probabilistic model that "emits" each of the input records (Durbin et al., 1998). The unknown parameters of this model are learned from a set of input records using the expectation maximization (EM) algorithm. Because the multiple alignment of these records is unknown and uncertain, the expected contribution of each input point to each point in the stack is determined probabilistically. For each time step in HMM-Stack, δ18O values are described by a Gaussian probability distribution. Available δ18O records (N=180) are employed to estimate the mean and variance of δ18O at each time point. The mean of HMM-Stack follows the predicted pattern of glacial cycles with increased amplitude after the Pliocene-Pleistocene boundary and also larger and longer cycles after the mid-Pleistocene transition. Furthermore, the δ18O variance increases with age, producing a substantial loss in the signal-to-noise ratio. Not surprisingly, uncertainty in alignment and thus estimated age also increase substantially in the older portion of the stack.

  15. FOGCAST: Probabilistic fog forecasting based on operational (high-resolution) NWP models

    NASA Astrophysics Data System (ADS)

    Masbou, M.; Hacker, M.; Bentzien, S.

    2013-12-01

    The presence of fog and low clouds in the lower atmosphere can have a critical impact on both airborne and ground transport and is often connected with serious accidents. Improving the localization, duration and variations in visibility therefore holds an immense operational value. Fog is generally a small-scale phenomenon and mostly affected by local advective transport, radiation, turbulent mixing at the surface, as well as its microphysical structure. Sophisticated three-dimensional fog models, based on advanced microphysical parameterization schemes and high vertical resolution, have already been developed and give promising results. Nevertheless, their computational time is beyond the range of an operational setup. Therefore, mesoscale numerical weather prediction models are generally used for forecasting all kinds of weather situations. In spite of numerous improvements, the large uncertainty of small-scale weather events inherent in deterministic prediction cannot be evaluated adequately. Probabilistic guidance is necessary to assess these uncertainties and give reliable forecasts. In this study, fog forecasts are obtained by a diagnosis scheme similar to the Fog Stability Index (FSI) based on COSMO-DE model outputs. COSMO-DE is the German-focused high-resolution operational weather prediction model of the German Meteorological Service. The FSI and the respective fog occurrence probability are optimized and calibrated with statistical postprocessing in terms of logistic regression. In a second step, the number of predictors in the FOGCAST model has been optimized by use of the LASSO method (Least Absolute Shrinkage and Selection Operator). The results present objective out-of-sample verification based on the Brier score, performed for station data over Germany. Furthermore, the probabilistic fog forecast approach, FOGCAST, serves as a benchmark for the evaluation of more sophisticated 3D fog models. Several versions have been set up based on different
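
    The statistical post-processing step (calibrating a raw index into a fog probability and verifying it with the Brier score) can be sketched as follows; the index values and fog observations are simulated stand-ins for COSMO-DE output and station data, and the single-predictor logistic regression is only a simplified version of the procedure described above.

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import brier_score_loss
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(7)
    n = 4000
    fsi = rng.normal(0.0, 1.0, size=n)                     # stand-in fog stability index
    p_true = 1.0 / (1.0 + np.exp(-(-1.5 + 2.0 * fsi)))     # hidden "true" fog probability
    fog = rng.binomial(1, p_true)                          # observed fog occurrence (0/1)

    X_train, X_test, y_train, y_test = train_test_split(
        fsi.reshape(-1, 1), fog, test_size=0.3, random_state=0)

    model = LogisticRegression().fit(X_train, y_train)
    p_forecast = model.predict_proba(X_test)[:, 1]

    print(f"out-of-sample Brier score: {brier_score_loss(y_test, p_forecast):.3f}")
    print(f"climatological reference:  "
          f"{brier_score_loss(y_test, np.full_like(p_forecast, y_train.mean())):.3f}")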

  16. Multi-model approach to petroleum resource appraisal using analytic methodologies for probabilistic systems

    USGS Publications Warehouse

    Crovelli, R.A.

    1988-01-01

    The geologic appraisal model that is selected for a petroleum resource assessment depends upon purpose of the assessment, basic geologic assumptions of the area, type of available data, time available before deadlines, available human and financial resources, available computer facilities, and, most importantly, the available quantitative methodology with corresponding computer software and any new quantitative methodology that would have to be developed. Therefore, different resource assessment projects usually require different geologic models. Also, more than one geologic model might be needed in a single project for assessing different regions of the study or for cross-checking resource estimates of the area. Some geologic analyses used in the past for petroleum resource appraisal involved play analysis. The corresponding quantitative methodologies of these analyses usually consisted of Monte Carlo simulation techniques. A probabilistic system of petroleum resource appraisal for play analysis has been designed to meet the following requirements: (1) includes a variety of geologic models, (2) uses an analytic methodology instead of Monte Carlo simulation, (3) possesses the capacity to aggregate estimates from many areas that have been assessed by different geologic models, and (4) runs quickly on a microcomputer. Geologic models consist of four basic types: reservoir engineering, volumetric yield, field size, and direct assessment. Several case histories and present studies by the U.S. Geological Survey are discussed. © 1988 International Association for Mathematical Geology.

  17. Multi-model approach to petroleum resource appraisal using analytic methodologies for probabilistic systems

    SciTech Connect

    Crovelli, R.A.

    1988-11-01

    The geologic appraisal model that is selected for a petroleum resource assessment depends upon purpose of the assessment, basic geologic assumptions of the area, type of available data, time available before deadlines, available human and financial resources, available computer facilities, and, most importantly, the available quantitative methodology with corresponding computer software and any new quantitative methodology that would have to be developed. Therefore, different resource assessment projects usually require different geologic models. Also, more than one geologic model might be needed in a single project for assessing different regions of the study or for cross-checking resource estimates of the area. Some geologic analyses used in the past for petroleum resource appraisal involved play analysis. The corresponding quantitative methodologies of these analyses usually consisted of Monte Carlo simulation techniques. A probabilistic system of petroleum resource appraisal for play analysis has been designed to meet the following requirements: (1) includes a variety of geologic models, (2) uses an analytic methodology instead of Monte Carlo simulation, (3) possesses the capacity to aggregate estimates from many areas that have been assessed by different geologic models, and (4) runs quickly on a microcomputer. Geologic models consist of four basic types: reservoir engineering, volumetric yield, field size, and direct assessment. Several case histories and present studies by the US Geological Survey are discussed.

  18. The MCRA model for probabilistic single-compound and cumulative risk assessment of pesticides.

    PubMed

    van der Voet, Hilko; de Boer, Waldo J; Kruisselbrink, Johannes W; Goedhart, Paul W; van der Heijden, Gerie W A M; Kennedy, Marc C; Boon, Polly E; van Klaveren, Jacob D

    2015-05-01

    Pesticide risk assessment is hampered by worst-case assumptions leading to overly pessimistic assessments. On the other hand, cumulative health effects of similar pesticides are often not taken into account. This paper describes models and a web-based software system developed in the European research project ACROPOLIS. The models are appropriate for both acute and chronic exposure assessments of single compounds and of multiple compounds in cumulative assessment groups. The software system MCRA (Monte Carlo Risk Assessment) is available for stakeholders in pesticide risk assessment at mcra.rivm.nl. We describe the MCRA implementation of the methods as advised in the 2012 EFSA Guidance on probabilistic modelling, as well as more refined methods developed in the ACROPOLIS project. The emphasis is on cumulative assessments. Two approaches, sample-based and compound-based, are contrasted. It is shown that additional data on agricultural use of pesticides may give more realistic risk assessments. Examples are given of model and software validation of acute and chronic assessments, using both simulated data and comparisons against the previous release of MCRA and against the standard software DEEM-FCID used by the Environmental Protection Agency in the USA. It is shown that the EFSA Guidance pessimistic model may not always give an appropriate modelling of exposure. PMID:25455888

  19. Multi-State Physics Models of Aging Passive Components in Probabilistic Risk Assessment

    SciTech Connect

    Unwin, Stephen D.; Lowry, Peter P.; Layton, Robert F.; Heasler, Patrick G.; Toloczko, Mychailo B.

    2011-03-13

    Multi-state Markov modeling has proved to be a promising approach to estimating the reliability of passive components - particularly metallic pipe components - in the context of probabilistic risk assessment (PRA). These models consider the progressive degradation of a component through a series of observable discrete states, such as detectable flaw, leak and rupture. Service data then generally provides the basis for estimating the state transition rates. Research in materials science is producing a growing understanding of the physical phenomena that govern the aging degradation of passive pipe components. As a result, there is an emerging opportunity to incorporate these insights into PRA. This paper describes research conducted under the Risk-Informed Safety Margin Characterization Pathway of the Department of Energy’s Light Water Reactor Sustainability Program. A state transition model is described that addresses aging behavior associated with stress corrosion cracking in ASME Class 1 dissimilar metal welds – a component type relevant to LOCA analysis. The state transition rate estimates are based on physics models of weld degradation rather than service data. The resultant model is found to be non-Markov in that the transition rates are time-inhomogeneous and stochastic. Numerical solutions to the model provide insight into the effect of aging on component reliability.
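
    The flavor of such a state-transition model can be sketched with three degradation states (detectable flaw, leak, rupture) and transition rates that grow with component age, standing in for the physics-based, time-inhomogeneous rates discussed above; the Kolmogorov forward equations are then integrated numerically. Rate functions and constants are illustrative placeholders, not values for dissimilar metal welds.

    import numpy as np
    from scipy.integrate import solve_ivp

    def lam_flaw_leak(t):          # per-year flaw-to-leak rate, increasing with age t (years)
        return 1e-4 * (1.0 + 0.05 * t)

    def lam_leak_rupture(t):       # per-year leak-to-rupture rate
        return 5e-3 * (1.0 + 0.02 * t)

    def kolmogorov(t, p):          # p = [P(flaw), P(leak), P(rupture)]
        p_flaw, p_leak, _ = p
        return [-lam_flaw_leak(t) * p_flaw,
                lam_flaw_leak(t) * p_flaw - lam_leak_rupture(t) * p_leak,
                lam_leak_rupture(t) * p_leak]

    sol = solve_ivp(kolmogorov, t_span=(0.0, 60.0), y0=[1.0, 0.0, 0.0],
                    t_eval=np.arange(0.0, 61.0, 10.0))
    for t, p_rup in zip(sol.t, sol.y[2]):
        print(f"age {t:4.0f} yr: P(rupture) = {p_rup:.2e}")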

  20. A probabilistic tornado wind hazard model for the continental United States

    SciTech Connect

    Hossain, Q; Kimball, J; Mensing, R; Savy, J

    1999-04-19

    A probabilistic tornado wind hazard model for the continental United States (CONUS) is described. The model incorporates both aleatory (random) and epistemic uncertainties associated with quantifying the tornado wind hazard parameters. The temporal occurrence of tornadoes within the continental United States (CONUS) is assumed to be a Poisson process. A spatial distribution of tornado touchdown locations is developed empirically based on the observed historical events within the CONUS. The hazard model is an areal probability model that takes into consideration the size and orientation of the facility, the length and width of the tornado damage area (idealized as a rectangle and dependent on the tornado intensity scale), wind speed variation within the damage area, tornado intensity classification errors (i.e., errors in assigning a Fujita intensity scale based on surveyed damage), and the tornado path direction. Epistemic uncertainties in describing the distributions of the aleatory variables are accounted for by using more than one distribution model to describe aleatory variations. The epistemic uncertainties are based on inputs from a panel of experts. A computer program, TORNADO, has been developed incorporating this model; features of this program are also presented.
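
    The core of such an areal probability model can be sketched in a few lines: with tornado occurrences treated as a Poisson process in time and a mean damage area per intensity class, the annual probability that a small facility is struck is roughly the sum over intensities of (regional rate) x (damage area) / (region area). The rates and areas below are illustrative placeholders, and the refinements listed above (facility size and orientation, within-path wind variation, classification error, path direction, epistemic weights) are omitted.

    import numpy as np

    region_area_km2 = 500_000.0                              # regional source zone
    annual_rate = {"F2": 8.0, "F3": 2.0, "F4": 0.4}          # tornadoes per year by class
    damage_area_km2 = {"F2": 5.0, "F3": 15.0, "F4": 40.0}    # mean path length x width

    annual_strike_rate = sum(annual_rate[f] * damage_area_km2[f] / region_area_km2
                             for f in annual_rate)
    for years in (1, 50):
        p = 1.0 - np.exp(-annual_strike_rate * years)        # Poisson occurrence in time
        print(f"P(at least one F2+ strike in {years:2d} yr) = {p:.2e}")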

  1. Probabilistic prediction of cyanobacteria abundance in a Korean reservoir using a Bayesian Poisson model

    NASA Astrophysics Data System (ADS)

    Cha, YoonKyung; Park, Seok Soon; Kim, Kyunghyun; Byeon, Myeongseop; Stow, Craig A.

    2014-03-01

    There have been increasing reports of harmful algal blooms (HABs) worldwide. However, the factors that influence cyanobacteria dominance and HAB formation can be site-specific and idiosyncratic, making prediction challenging. The drivers of cyanobacteria blooms in Lake Paldang, South Korea, the summer climate of which is strongly affected by the East Asian monsoon, may differ from those in well-studied North American lakes. Using observational data sampled during the growing seasons of 2007-2011, a Bayesian hurdle Poisson model was developed to predict cyanobacteria abundance in the lake. The model allowed cyanobacteria absence (zero count) and nonzero cyanobacteria counts to be modeled as functions of different environmental factors. The model predictions demonstrated that the principal factor determining the success of cyanobacteria was temperature. Combined with high temperature, increased residence time indicated by low outflow rates appeared to increase the probability of cyanobacteria occurrence. A stable water column, represented by low suspended solids, and high temperature were the requirements for high abundance of cyanobacteria. Our model results have management implications; the model can be used to forecast cyanobacteria watch or alert levels probabilistically and to develop mitigation strategies for cyanobacteria blooms.
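
    A stripped-down version of the hurdle structure can be written directly: a logistic part models presence/absence as a function of temperature and a zero-truncated Poisson part models abundance given presence, here fitted by maximum likelihood on simulated data rather than with the Bayesian machinery of the study. Covariates, coefficients, and data are invented for illustration.

    import numpy as np
    from scipy.optimize import minimize
    from scipy.special import expit, gammaln

    rng = np.random.default_rng(3)
    n = 500
    temp = rng.uniform(15.0, 32.0, size=n)                 # water temperature (deg C)
    present = rng.binomial(1, expit(-8.0 + 0.35 * temp))   # true hurdle part
    counts = np.where(present == 1,
                      rng.poisson(np.exp(-2.0 + 0.15 * temp)) + 1, 0)  # crude positive counts

    def neg_log_lik(theta):
        a0, a1, b0, b1 = theta
        p = np.clip(expit(a0 + a1 * temp), 1e-10, 1 - 1e-10)   # P(presence)
        mu = np.clip(np.exp(b0 + b1 * temp), 1e-10, 1e6)       # Poisson mean given presence
        zero = counts == 0
        ll = np.sum(np.log(1.0 - p[zero]))                     # zeros come from the hurdle
        k, m = counts[~zero], mu[~zero]
        ll += np.sum(np.log(p[~zero])                          # presence
                     + k * np.log(m) - m - gammaln(k + 1)      # Poisson log-pmf ...
                     - np.log1p(-np.exp(-m)))                  # ... truncated at zero
        return -ll

    fit = minimize(neg_log_lik, x0=[-5.0, 0.2, -1.0, 0.1], method="Nelder-Mead",
                   options={"maxiter": 20000})
    print("estimated (a0, a1, b0, b1):", np.round(fit.x, 2))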

  2. Seismic source models for probabilistic hazard analysis of Georgia (Southern Caucasus)

    NASA Astrophysics Data System (ADS)

    Javakhishvili, Z.; Godoladze, T.; Gamkrelidze, E.; Sokhadze, G.

    2014-12-01

    A seismic source model is one of the main components of probabilistic seismic-hazard analysis. The active faults and tectonics of Georgia (Southern Caucasus) have been investigated in numerous scientific studies. The Caucasus consists of different geological structures with complex interactions. The major structures trend WNW-ESE, and focal mechanisms indicate primarily thrust faults striking parallel to the mountains. It is part of the Alpine-Himalayan collision belt and is well known for its high seismicity. Although the geodynamic activity of the region, caused by the convergence of the Arabian and Eurasian plates at a rate of several cm/year, is well known, different tectonic models have been proposed as explanations for the seismic process in the region. The recent seismic source model for the Caucasus derives from seismotectonic studies performed in Georgia in the framework of different international projects. We have analyzed previous studies and recent investigations on the basis of new seismic data (spatial distribution, moment tensor solutions, etc.), GPS and other data. As a result, a database of seismic source models was compiled. Seismic sources are modeled as lines representing the surface projection of active faults or as wide areas (source zones) where earthquakes can occur randomly. Each structure or zone was quantified on the basis of different parameters. Recent experience in the harmonization of cross-border structures was used. As a result, a new seismic source model of Georgia (Southern Caucasus) for hazard analysis was created.

  3. Additional evidence for a dual-strategy model of reasoning: Probabilistic reasoning is more invariant than reasoning about logical validity.

    PubMed

    Markovits, Henry; Brisson, Janie; de Chantal, Pier-Luc

    2015-11-01

    One of the major debates concerning the nature of inferential reasoning is between counterexample-based strategies such as mental model theory and the statistical strategies underlying probabilistic models. The dual-strategy model proposed by Verschueren, Schaeken, and d'Ydewalle (2005a, 2005b) suggests that people might have access to both kinds of strategies. One of the postulates of this approach is that statistical strategies correspond to low-cost, intuitive modes of evaluation, whereas counterexample strategies are higher-cost and more variable in use. We examined this hypothesis by using a deductive-updating paradigm. The results of Study 1 showed that individual differences in strategy use predict different levels of deductive updating on inferences about logical validity. Study 2 demonstrated no such variation when explicitly probabilistic inferences were examined. Study 3 showed that presenting updating problems with probabilistic inferences modified performance on subsequent problems using logical validity, whereas the opposite was not true. These results provide clear evidence that the processes used to make probabilistic inferences are less subject to variation than those used to make inferences of logical validity. PMID:26148720

  4. MSblender: a probabilistic approach for integrating peptide identifications from multiple database search engines

    PubMed Central

    Kwon, Taejoon; Choi, Hyungwon; Vogel, Christine; Nesvizhskii, Alexey I.; Marcotte, Edward M.

    2011-01-01

    Shotgun proteomics using mass spectrometry is a powerful method for protein identification but suffers limited sensitivity in complex samples. Integrating peptide identifications from multiple database search engines is a promising strategy to increase the number of peptide identifications and reduce the volume of unassigned tandem mass spectra. Existing methods pool statistical significance scores such as p-values or posterior probabilities of peptide-spectrum matches (PSMs) from multiple search engines after high scoring peptides have been assigned to spectra, but these methods lack reliable control of identification error rates as data are integrated from different search engines. We developed a statistically coherent method for integrative analysis, termed MSblender. MSblender converts raw search scores from search engines into a probability score for all possible PSMs and properly accounts for the correlation between search scores. The method reliably estimates false discovery rates and identifies more PSMs than any single search engine at the same false discovery rate. Increased identifications increment spectral counts for all detected proteins and allow quantification of proteins that would not have been quantified by individual search engines. We also demonstrate that enhanced quantification contributes to improve sensitivity in differential expression analyses. PMID:21488652

  5. MSblender: A probabilistic approach for integrating peptide identifications from multiple database search engines.

    PubMed

    Kwon, Taejoon; Choi, Hyungwon; Vogel, Christine; Nesvizhskii, Alexey I; Marcotte, Edward M

    2011-07-01

    Shotgun proteomics using mass spectrometry is a powerful method for protein identification but suffers limited sensitivity in complex samples. Integrating peptide identifications from multiple database search engines is a promising strategy to increase the number of peptide identifications and reduce the volume of unassigned tandem mass spectra. Existing methods pool statistical significance scores such as p-values or posterior probabilities of peptide-spectrum matches (PSMs) from multiple search engines after high scoring peptides have been assigned to spectra, but these methods lack reliable control of identification error rates as data are integrated from different search engines. We developed a statistically coherent method for integrative analysis, termed MSblender. MSblender converts raw search scores from search engines into a probability score for every possible PSM and properly accounts for the correlation between search scores. The method reliably estimates false discovery rates and identifies more PSMs than any single search engine at the same false discovery rate. Increased identifications increment spectral counts for most proteins and allow quantification of proteins that would not have been quantified by individual search engines. We also demonstrate that enhanced quantification contributes to improve sensitivity in differential expression analyses. PMID:21488652
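
    A much-simplified stand-in for this kind of score integration: the joint scores from two engines are modeled with a two-component Gaussian mixture (one component standing for correct PSMs, one for incorrect ones), each PSM receives a posterior probability of being correct, and an FDR estimate for an accepted set follows from those posteriors. MSblender's actual model is more elaborate, and the score vectors below are simulated.

    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(11)
    correct = rng.multivariate_normal([3.0, 2.5], [[1.0, 0.6], [0.6, 1.0]], 2000)
    incorrect = rng.multivariate_normal([0.0, 0.0], [[1.0, 0.3], [0.3, 1.0]], 6000)
    scores = np.vstack([correct, incorrect])     # columns: engine A score, engine B score

    gmm = GaussianMixture(n_components=2, covariance_type="full",
                          random_state=0).fit(scores)
    correct_comp = np.argmax(gmm.means_.sum(axis=1))          # higher-scoring component
    p_correct = gmm.predict_proba(scores)[:, correct_comp]

    threshold = 0.9
    accepted = p_correct >= threshold
    fdr = np.mean(1.0 - p_correct[accepted])                  # expected FDR of accepted set
    print(f"accepted {accepted.sum()} PSMs at posterior >= {threshold}, "
          f"estimated FDR = {fdr:.3f}")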

  6. A Time-Dependent Probabilistic Seismic Hazard Model For The Central Apennines (Italy)

    NASA Astrophysics Data System (ADS)

    Akinci, A.; Galadini, F.; Pantosti, D.; Petersen, M.; Malagnini, L.

    2004-12-01

    Earthquake hazard in the Central Apennines, Italy has been investigated using time-independent probabilistic (simple Poissonian) and time-dependent probabilistic (renewal) models. We developed a hazard model that defines the sources for potential earthquakes and earthquake recurrence relations. Both the characteristic-earthquake and the floating-earthquake hypotheses are used for the Central Apennines faults (M>5.9). The models for each fault segment are developed based on recent geological and geophysical studies, as well as historical earthquakes. Historical seismicity, the active faulting framework and inferred seismogenic behavior (expressed in terms of slip rates, recurrence intervals, and elapsed times) constitute the main quantitative information used in the model assignment. We calculate the background hazard from Mw 4.6-5.9 earthquakes using the historical catalogs of CPTI04 (Working Group, 2004) and obtain an a-value distribution over the study area. This is because these earthquakes occur in areas where they cannot be assigned to a particular fault; therefore, their recurrence is accounted for by the historic occurrence of earthquakes, calculating the magnitude-frequency distributions. We found good agreement between expected earthquake rates from the historical earthquake catalog and the earthquake source model. The probabilities are obtained from time-dependent models characterized by a Brownian Passage Time function on the recurrence interval with an aperiodicity of 0.5. Earthquake hazard is quantified in terms of peak ground acceleration and spectral accelerations for natural periods of 0.2 and 1.0 seconds. The ground motions are determined for rock conditions. We have used the attenuation relationships obtained for the Apennines by Malagnini et al. (2000) together with the relationships predicted by Sabetta and Pugliese (1996) and Ambraseys et al. (1996) for the Italian and European regions, respectively. Generally, time-dependent hazard is increased and the peaks appear to shift to the ESE
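
    The renewal calculation that drives such time-dependent estimates can be sketched directly: with a Brownian Passage Time recurrence density of mean mu and aperiodicity alpha, the conditional probability of an event in the next dt years given an elapsed time T is [F(T + dt) - F(T)] / [1 - F(T)], which falls below the Poisson value early in the cycle and rises above it late in the cycle. The mean recurrence and elapsed times below are illustrative, not values for a specific Apennine fault.

    import numpy as np
    from scipy.integrate import quad

    mu, alpha = 1000.0, 0.5            # mean recurrence time (yr) and aperiodicity
    dt = 50.0                          # exposure window (yr)

    def bpt_pdf(t):                    # Brownian Passage Time (inverse Gaussian) density
        return np.sqrt(mu / (2.0 * np.pi * alpha**2 * t**3)) * \
               np.exp(-(t - mu) ** 2 / (2.0 * mu * alpha**2 * t))

    def bpt_cdf(t):                    # numerical CDF (lower limit avoids the t = 0 singularity)
        return quad(bpt_pdf, 1e-6, t)[0]

    for elapsed in (200.0, 800.0, 1500.0):
        p_bpt = (bpt_cdf(elapsed + dt) - bpt_cdf(elapsed)) / (1.0 - bpt_cdf(elapsed))
        p_poisson = 1.0 - np.exp(-dt / mu)
        print(f"elapsed {elapsed:6.0f} yr: P(BPT) = {p_bpt:.3f} vs P(Poisson) = {p_poisson:.3f}")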

  7. A performance weighting procedure for GCMs based on explicit probabilistic models and accounting for observation uncertainty

    NASA Astrophysics Data System (ADS)

    Renard, Benjamin; Vidal, Jean-Philippe

    2016-04-01

    In recent years, the climate modeling community has put a lot of effort into releasing the outputs of multimodel experiments for use by the wider scientific community. In such experiments, several structurally distinct GCMs are run using the same observed forcings (for the historical period) or the same projected forcings (for the future period). In addition, several members are produced for a single given model structure, by running each GCM with slightly different initial conditions. This multiplicity of GCM outputs offers many opportunities in terms of uncertainty quantification or GCM comparisons. In this presentation, we propose a new procedure to weight GCMs according to their ability to reproduce the observed climate. Such weights can be used to combine the outputs of several models in a way that rewards good-performing models and discards poorly-performing ones. The proposed procedure has the following main properties: 1. It is based on explicit probabilistic models describing the time series produced by the GCMs and the corresponding historical observations, 2. It can use several members whenever available, 3. It accounts for the uncertainty in observations, 4. It assigns a weight to each GCM (all weights summing up to one), 5. It can also assign a weight to the "H0 hypothesis" that all GCMs in the multimodel ensemble are not compatible with observations. The application of the weighting procedure is illustrated with several case studies including synthetic experiments, simple cases where the target GCM output is a simple univariate variable and more realistic cases where the target GCM output is a multivariate and/or a spatial variable. These case studies illustrate the generality of the procedure which can be applied in a wide range of situations, as long as the analyst is prepared to make an explicit probabilistic assumption on the target variable. Moreover, these case studies highlight several interesting properties of the weighting procedure. In
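
    A minimal sketch of likelihood-based weighting with an explicit "no adequate model" alternative: each GCM's output distribution is scored by the likelihood it assigns to the observations (under an assumed observation error), an extra diffuse distribution stands in for the H0 hypothesis, and all weights are normalized to sum to one. The procedure described above is richer (multiple members per model, explicit treatment of observational uncertainty, multivariate and spatial targets); the series below are simulated placeholders.

    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(2)
    obs = rng.normal(15.0, 0.3, size=40)                  # "observed" climate index
    gcm_outputs = {                                       # one member per GCM here
        "GCM-A": rng.normal(15.1, 0.3, size=40),
        "GCM-B": rng.normal(14.2, 0.3, size=40),
        "GCM-C": rng.normal(15.0, 0.3, size=40),
    }
    obs_sigma = 0.3                                       # assumed observation error

    log_lik = {name: np.sum(norm.logpdf(obs, loc=out.mean(), scale=obs_sigma))
               for name, out in gcm_outputs.items()}
    log_lik["H0"] = np.sum(norm.logpdf(obs, loc=obs.mean(),     # diffuse alternative:
                                       scale=5.0 * obs_sigma))  # "none of the GCMs fit"

    m = max(log_lik.values())
    raw = {k: np.exp(v - m) for k, v in log_lik.items()}
    total = sum(raw.values())
    for name in raw:
        print(f"{name}: weight = {raw[name] / total:.3f}")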

  8. Integrated Medical Model Verification, Validation, and Credibility

    NASA Technical Reports Server (NTRS)

    Walton, Marlei; Kerstman, Eric; Foy, Millennia; Shah, Ronak; Saile, Lynn; Boley, Lynn; Butler, Doug; Myers, Jerry

    2014-01-01

    The Integrated Medical Model (IMM) was designed to forecast relative changes for a specified set of crew health and mission success risk metrics by using a probabilistic (stochastic process) model based on historical data, cohort data, and subject matter expert opinion. A probabilistic approach is taken since exact (deterministic) results would not appropriately reflect the uncertainty in the IMM inputs. Once the IMM was conceptualized, a plan was needed to rigorously assess input information, framework and code, and output results of the IMM, and ensure that end user requests and requirements were considered during all stages of model development and implementation. METHODS: In 2008, the IMM team developed a comprehensive verification and validation (VV) plan, which specified internal and external review criteria encompassing 1) verification of data and IMM structure to ensure proper implementation of the IMM, 2) several validation techniques to confirm that the simulation capability of the IMM appropriately represents occurrences and consequences of medical conditions during space missions, and 3) credibility processes to develop user confidence in the information derived from the IMM. When the NASA-STD-7009 (7009) was published, the IMM team updated their verification, validation, and credibility (VVC) project plan to meet 7009 requirements and include 7009 tools in reporting VVC status of the IMM. RESULTS: IMM VVC updates are compiled recurrently and include 7009 Compliance and Credibility matrices, IMM VV Plan status, and a synopsis of any changes or updates to the IMM during the reporting period. Reporting tools have evolved over the lifetime of the IMM project to better communicate VVC status. This has included refining original 7009 methodology with augmentation from the NASA-STD-7009 Guidance Document. End user requests and requirements are being satisfied as evidenced by ISS Program acceptance of IMM risk forecasts, transition to an operational model and

  9. Integrated modeling for the VLTI

    NASA Astrophysics Data System (ADS)

    Mueller, Michael; Wilhelm, Rainer; Baier, Horst; Koehler, Bertrand

    2003-02-01

    Within the scope of the Very Large Telescope Interferometer (VLTI) project, a set of software tools for integrated modeling of ground- and space-based stellar interferometers has been developed. Integrated modeling aims at time-dependent system analysis combining different technical disciplines (optics, mechanical structure, control system with sensors and actuators, environmental disturbances). The main components of the software are BeamWarrior, a tool for creation of dynamic optical models, and SMI (Structural Modeling Interface), which generates linear state-space models from finite element models of a mechanical structure. Based on these tools, models of the various subsystems (e.g. telescope, delay line, beam combiner) can be created in the relevant technical disciplines (e.g. optics, structure). All subsystem models are integrated into the Matlab/Simulink environment for dynamic control system simulations. The output of the dynamic model is a complete description of the time-dependent electromagnetic field in each interferometer arm. This output serves as input to an instrument model simulating the creation of interference fringes. This paper shows the application of the integrated modeling concept to the VLTI. The architecture of a Simulink-based integrated model with its main components, telescope structures, optics and control loops, is presented. Disturbance models for wind load, seismic ground excitation and atmospheric turbulence are included. Beam combination is performed using a simplified model of the VINCI instrument. Results of closed-loop dynamic simulations are presented.

  10. A probabilistic model framework for evaluating year-to-year variation in crop productivity

    NASA Astrophysics Data System (ADS)

    Yokozawa, M.; Iizumi, T.; Tao, F.

    2008-12-01

    Most models describing the relation between crop productivity and weather conditions have so far focused on mean changes in crop yield. For maintaining a stable food supply in the face of abnormal weather as well as climate change, evaluating the year-to-year variation in crop productivity is more essential than evaluating the mean changes. Here we propose a new probabilistic model framework based on Bayesian inference and Monte Carlo simulation. As a first example, we introduce a model of paddy rice production in Japan, called PRYSBI (Process-based Regional rice Yield Simulator with Bayesian Inference; Iizumi et al., 2008). The model structure is the same as that of SIMRIW, which was developed and is widely used in Japan. The model includes three sub-models describing phenological development, biomass accumulation, and maturing of the rice crop, formulated to capture the response of the rice plant to weather conditions. The model was originally developed to predict rice growth and yield at the paddy-plot scale. We applied it to evaluate large-scale rice production while keeping the same model structure, but treated the parameters as stochastic variables. To allow the model to reproduce actual yields at the larger scale, model parameters were determined from agricultural statistics for each prefecture of Japan together with weather data averaged over the region. The posterior probability distribution functions (PDFs) of the model parameters were obtained using Bayesian inference, with a Markov chain Monte Carlo (MCMC) algorithm used to evaluate the posterior numerically. For evaluating the year-to-year changes in rice growth and yield under this framework, we first iterate simulations with sets of parameter values sampled from the estimated posterior PDF of each parameter and then take the ensemble mean weighted by the posterior PDFs. We will also present another example for maize productivity in China. The
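
    As a rough illustration of the calibration step described above, the following sketch (not the PRYSBI code) uses a random-walk Metropolis MCMC sampler to obtain a posterior PDF for the parameters of a toy yield model and then forms a posterior-weighted ensemble forecast; the yield relation, data values, and priors are all hypothetical.

        import numpy as np

        rng = np.random.default_rng(0)

        # Hypothetical prefecture-level yields (t/ha) and mean growing-season temperatures (deg C)
        temps = np.array([22.1, 23.4, 21.8, 24.0, 22.9, 23.7])
        yields_obs = np.array([5.1, 5.6, 4.9, 5.8, 5.3, 5.7])

        def yield_model(theta, temps):
            """Toy stand-in for a process-based yield model: linear response to temperature."""
            a, b = theta
            return a + b * temps

        def log_posterior(theta, sigma=0.2):
            """Gaussian likelihood plus weak Gaussian priors on the parameters."""
            resid = yields_obs - yield_model(theta, temps)
            log_lik = -0.5 * np.sum((resid / sigma) ** 2)
            log_prior = -0.5 * np.sum((theta / 10.0) ** 2)
            return log_lik + log_prior

        # Random-walk Metropolis sampler for the posterior PDF of the parameters
        theta = np.array([0.0, 0.2])
        samples = []
        for _ in range(20000):
            proposal = theta + rng.normal(scale=0.05, size=2)
            if np.log(rng.uniform()) < log_posterior(proposal) - log_posterior(theta):
                theta = proposal
            samples.append(theta)
        samples = np.array(samples[5000:])          # drop burn-in

        # Posterior-weighted ensemble of simulated yields for a new weather scenario
        new_temp = np.array([23.0])
        ensemble = np.array([yield_model(s, new_temp)[0] for s in samples])
        print("posterior mean yield:", ensemble.mean(), "+/-", ensemble.std())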

  11. A probabilistic orthopaedic population model to predict fatigue-related subacromial geometric variability.

    PubMed

    Chopp-Hurley, Jaclyn N; Langenderfer, Joseph E; Dickerson, Clark R

    2016-02-29

    Fatigue-related glenohumeral and scapulothoracic kinematic relationships, in addition to morphological characteristics of the scapula and humerus, affect the dimensions of the subacromial space. Each exhibits considerable interpersonal variability, which, if only the mean is considered, can lead to misleading population-level estimates of subacromial impingement risk, particularly for outliers. Additionally, the relative influence of each parameter on subacromial space variability is unclear. Applying empirically derived morphological and kinematic distributions (n=31), this research used Advanced Mean Value and Monte Carlo probabilistic modeling approaches to predict the distribution of the minimum subacromial space width (SAS) and establish which parameters contributed most to modulating the SAS. The predicted SAS differed by 8 mm between the 1% and 99% confidence intervals. While the SAS was not influenced by muscle fatigue, the space reduced with arm elevation to between 4.5 and 5 mm. This reduction resulted in an estimated 65-75% of the population being at risk for tissue compression at elevation angles ≥90° when the interposed tissue thickness is considered. Morphological parameters, notably glenoid inclination, showed higher relative importance for modulating the predicted SAS across conditions, while kinematic parameters (humeral head translation, scapular orientation), which differed by elevation angle and fatigue state, demonstrated less consistent importance across experimental conditions. Overall, the findings reinforce the shoulder health risks related to overhead activities, as they pose an increased likelihood of mechanical rotator cuff tendon compression. Further, probabilistic methods are particularly useful in that they can determine relative parameter importance and thereby identify key injury risk factors. As glenoid inclination is difficult to diagnose and treat, and is associated with superior humeral head translation
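
    The following is a minimal Monte Carlo sketch of the kind of probabilistic propagation described above: parameter distributions are sampled and pushed through a geometric relation to obtain a distribution for the minimum subacromial space width. The geometric relation, parameter means, and thresholds are illustrative placeholders, not the authors' model.

        import numpy as np

        rng = np.random.default_rng(1)
        n = 100_000

        # Hypothetical input distributions (means/SDs are illustrative only)
        acromion_offset = rng.normal(10.0, 1.5, n)      # mm, morphological parameter
        glenoid_inclination = rng.normal(5.0, 3.0, n)   # deg
        humeral_translation = rng.normal(1.0, 0.5, n)   # mm, superior translation

        def min_subacromial_width(offset, inclination_deg, translation):
            """Placeholder geometric relation for the minimum subacromial space width (SAS)."""
            return offset - translation - 0.3 * np.abs(inclination_deg)

        sas = min_subacromial_width(acromion_offset, glenoid_inclination, humeral_translation)

        # Distribution summaries analogous to 1% / 99% confidence bounds
        lo, hi = np.percentile(sas, [1, 99])
        print(f"SAS 1%-99% range: {lo:.1f} .. {hi:.1f} mm")

        # Fraction of the simulated population below an interposed-tissue thickness threshold
        tissue_thickness = 5.0  # mm, illustrative
        print("fraction at risk:", np.mean(sas < tissue_thickness))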

  12. Integrability of the Rabi Model

    SciTech Connect

    Braak, D.

    2011-09-02

    The Rabi model is a paradigm for interacting quantum systems. It couples a bosonic mode to the smallest possible quantum model, a two-level system. I present the analytical solution which allows us to consider the question of integrability for quantum systems that do not possess a classical limit. A criterion for quantum integrability is proposed which shows that the Rabi model is integrable due to the presence of a discrete symmetry. Moreover, I introduce a generalization with no symmetries; the generalized Rabi model is the first example of a nonintegrable but exactly solvable system.

  13. Probabilistic modeling of school meals for potential bisphenol A (BPA) exposure.

    PubMed

    Hartle, Jennifer C; Fox, Mary A; Lawrence, Robert S

    2016-05-01

    Many endocrine-disrupting chemicals (EDCs), including bisphenol A (BPA), are approved for use in food packaging, with unbound BPA migrating into the foods it contacts. Children, with their developing organ systems, are especially susceptible to hormone disruption, prompting this research to model the potential dose of BPA from school-provided meals. Probabilistic exposure models for school meals were informed by mixed methods. Exposure scenarios were based on United States school nutrition guidelines and included meals with varying levels of exposure potential from canned and packaged food. Modeled BPA exposure potentials ranged from 0.00049 μg/kg-BW/day for a middle school student eating a low-exposure breakfast with plate waste to 1.19 μg/kg-BW/day for an elementary school student eating a lunch with high exposure potential. The modeled BPA doses from school meals are below the current US EPA Oral Reference Dose (RfD) of 50 μg/kg-BW/day. Recent research shows BPA animal toxicity thresholds at 2 μg/kg-BW/day. The single-meal doses modeled in this research are of the same order of magnitude as the low-dose toxicity thresholds, illustrating the potential for school meals to expose children to chronic toxic levels of BPA. PMID:26395857
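
    A minimal sketch of a per-meal probabilistic dose model of the type described above is shown below; the concentration, serving-mass, and body-weight distributions are illustrative assumptions rather than the study's inputs.

        import numpy as np

        rng = np.random.default_rng(2)
        n = 50_000

        # Illustrative input distributions (not the study's actual data)
        bpa_conc = rng.lognormal(mean=np.log(5.0), sigma=0.8, size=n)   # ng BPA per g of canned food
        canned_mass = rng.triangular(20, 60, 150, size=n)               # g of canned/packaged food per meal
        body_weight = rng.normal(45.0, 8.0, size=n)                     # kg, e.g. middle-school students

        # Per-meal dose in ug/kg-BW/day (1000 ng = 1 ug)
        dose = bpa_conc * canned_mass / 1000.0 / body_weight

        rfd = 50.0      # US EPA oral reference dose, ug/kg-BW/day
        low_tox = 2.0   # low-dose animal toxicity threshold cited in the abstract

        print("median dose:", np.median(dose))
        print("95th percentile:", np.percentile(dose, 95))
        print("fraction of meals above the 2 ug/kg-BW/day threshold:", np.mean(dose > low_tox))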

  14. Probabilistic and technology-specific modeling of emissions from municipal solid-waste incineration.

    PubMed

    Koehler, Annette; Peyer, Fabio; Salzmann, Christoph; Saner, Dominik

    2011-04-15

    The European legislation increasingly directs waste streams which cannot be recycled toward thermal treatment. Models are therefore needed that help to quantify emissions of waste incineration and thus reveal potential risks and mitigation needs. This study presents a probabilistic model which computes emissions as a function of waste composition and technological layout of grate incineration plants and their pollution-control equipment. In contrast to previous waste-incineration models, this tool is based on a broader empirical database and allows uncertainties in emission loads to be quantified. Comparison to monitoring data of 83 actual European plants showed no significant difference between modeled emissions and measured data. An inventory of all European grate incineration plants including technical characteristics and plant capacities was established, and waste material mixtures were determined for different European countries, including generic elemental waste-material compositions. The model thus allows for calculation of country-specific and material-dependent emission factors and enables identification and tracking of emission sources. It thereby helps to develop strategies to decrease plant emissions by reducing or redirecting problematic waste fractions to other treatment options or adapting the technological equipment of waste incinerators. PMID:21410192

  15. A probabilistic model for hydrokinetic turbine collision risks: exploring impacts on fish.

    PubMed

    Hammar, Linus; Eggertsen, Linda; Andersson, Sandra; Ehnberg, Jimmy; Arvidsson, Rickard; Gullström, Martin; Molander, Sverker

    2015-01-01

    A variety of hydrokinetic turbines are currently under development for power generation in rivers, tidal straits and ocean currents. Because some of these turbines are large, with rapidly moving rotor blades, the risk of collision with aquatic animals has drawn attention. The behavior and fate of animals that approach such large hydrokinetic turbines have not yet been monitored in any detail. In this paper, we conduct a synthesis of the current knowledge and understanding of hydrokinetic turbine collision risks. The outcome is a generic fault-tree-based probabilistic model suitable for estimating population-level ecological risks. New video-based data on fish behavior in strong currents are provided and models describing fish avoidance behaviors are presented. The findings indicate low risk for small-sized fish. However, at large turbines (≥5 m), bigger fish seem to have a high probability of collision, mostly because rotor detection and avoidance is difficult in low visibility. Risks can therefore be substantial for vulnerable populations of large-sized fish, which thrive in strong currents. The suggested collision risk model can be applied to different turbine designs and at a variety of locations as a basis for case-specific risk assessments. The structure of the model facilitates successive model validation, refinement and application to other organism groups such as marine mammals. PMID:25730314

  16. A Probabilistic Model for Hydrokinetic Turbine Collision Risks: Exploring Impacts on Fish

    PubMed Central

    Hammar, Linus; Eggertsen, Linda; Andersson, Sandra; Ehnberg, Jimmy; Arvidsson, Rickard; Gullström, Martin; Molander, Sverker

    2015-01-01

    A variety of hydrokinetic turbines are currently under development for power generation in rivers, tidal straits and ocean currents. Because some of these turbines are large, with rapidly moving rotor blades, the risk of collision with aquatic animals has drawn attention. The behavior and fate of animals that approach such large hydrokinetic turbines have not yet been monitored in any detail. In this paper, we conduct a synthesis of the current knowledge and understanding of hydrokinetic turbine collision risks. The outcome is a generic fault-tree-based probabilistic model suitable for estimating population-level ecological risks. New video-based data on fish behavior in strong currents are provided and models describing fish avoidance behaviors are presented. The findings indicate low risk for small-sized fish. However, at large turbines (≥5 m), bigger fish seem to have a high probability of collision, mostly because rotor detection and avoidance is difficult in low visibility. Risks can therefore be substantial for vulnerable populations of large-sized fish, which thrive in strong currents. The suggested collision risk model can be applied to different turbine designs and at a variety of locations as a basis for case-specific risk assessments. The structure of the model facilitates successive model validation, refinement and application to other organism groups such as marine mammals. PMID:25730314

  17. Probabilistic Material Strength Degradation Model for Inconel 718 Components Subjected to High Temperature, Mechanical Fatigue, Creep and Thermal Fatigue Effects

    NASA Technical Reports Server (NTRS)

    Bast, Callie Corinne Scheidt

    1994-01-01

    This thesis presents the on-going development of methodology for a probabilistic material strength degradation model. The probabilistic model, in the form of a postulated randomized multifactor equation, provides for quantification of uncertainty in the lifetime material strength of aerospace propulsion system components subjected to a number of diverse random effects. This model is embodied in the computer program entitled PROMISS, which can include up to eighteen different effects. Presently, the model includes four effects that typically reduce lifetime strength: high temperature, mechanical fatigue, creep, and thermal fatigue. Statistical analysis was conducted on experimental Inconel 718 data obtained from the open literature. This analysis provided regression parameters for use as the model's empirical material constants, thus calibrating the model specifically for Inconel 718. Model calibration was carried out for four variables, namely, high temperature, mechanical fatigue, creep, and thermal fatigue. Methodology to estimate standard deviations of these material constants for input into the probabilistic material strength model was developed. Using the current version of PROMISS, entitled PROMISS93, a sensitivity study for the combined effects of mechanical fatigue, creep, and thermal fatigue was performed. Results, in the form of cumulative distribution functions, illustrated the sensitivity of lifetime strength to any current value of an effect. In addition, verification studies comparing a combination of mechanical fatigue and high temperature effects by model to the combination by experiment were conducted. Thus, for Inconel 718, the basic model assumption of independence between effects was evaluated. Results from this limited verification study strongly supported this assumption.

  18. How might Model-based Probabilities Extracted from Imperfect Models Guide Rational Decisions: The Case for non-probabilistic odds

    NASA Astrophysics Data System (ADS)

    Smith, Leonard A.

    2010-05-01

    This contribution concerns "deep" or "second-order" uncertainty, such as the uncertainty in our probability forecasts themselves. It asks the question: "Is it rational to take (or offer) bets using model-based probabilities as if they were objective probabilities?" If not, what alternative approaches for determining odds, perhaps non-probabilistic odds, might prove useful in practice, given that we know our models are imperfect? We consider the case where the aim is to provide sustainable odds: not to produce a profit but merely to rationally expect to break even in the long run. In other words, to run a quantified risk of ruin that is relatively small. Thus the cooperative insurance schemes of coastal villages provide a more appropriate parallel than a casino. A "better" probability forecast would lead to lower premiums charged and less volatile fluctuations in the cash reserves of the village. Note that the Bayesian paradigm does not constrain one to interpret model distributions as subjective probabilities, unless one believes the model to be empirically adequate for the task at hand. In geophysics, this is rarely the case. When a probability forecast is interpreted as the objective probability of an event, the odds on that event can be easily computed as one divided by the probability of the event, and one need not favour taking either side of the wager. (Here we are using "odds-for" not "odds-to", the difference being whether or not the stake is returned; odds of one to one are equivalent to odds of two for one.) The critical question is how to compute sustainable odds based on information from imperfect models. We suggest that this breaks the symmetry between the odds on an event and the odds against it. While a probability distribution can always be translated into odds, interpreting the odds on a set of events might result in "implied probabilities" that sum to more than one. Moreover, the set of odds may be incomplete, not covering all events. We ask
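
    The odds arithmetic referred to above can be illustrated with a few lines of code; the outcomes and quoted odds below are hypothetical.

        def odds_for(prob):
            """'Odds-for' as defined in the abstract: stake returned, so odds = 1 / probability."""
            return 1.0 / prob

        # A probability forecast translates directly into odds...
        p_event = 0.25
        print("odds-for the event:", odds_for(p_event))   # 4-for-1

        # ...but odds offered on a set of events may imply probabilities summing to more than one,
        # which is how a cautious odds-setter builds in a margin against model imperfection.
        offered_odds = {"flood": 3.0, "no flood": 1.4}     # odds-for each outcome (hypothetical)
        implied = {k: 1.0 / v for k, v in offered_odds.items()}
        print("implied probabilities:", implied, "sum =", sum(implied.values()))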

  19. A probabilistic model for analysing the effect of performance levels on visual behaviour patterns of young sailors in simulated navigation.

    PubMed

    Manzanares, Aarón; Menayo, Ruperto; Segado, Francisco; Salmerón, Diego; Cano, Juan Antonio

    2015-01-01

    Visual behaviour is a determining factor in sailing because of the influence of environmental conditions. The aim of this research was to determine the visual behaviour patterns of sailors with different amounts of practice time in a simulated start race, applying a probabilistic model based on Markov chains. The sample consisted of 20 sailors, distributed in two groups, top ranking (n = 10) and bottom ranking (n = 10), all of whom competed in the Optimist Class. An automated measurement system, which integrates the VSail-Trainer sail simulator and the Eye Tracking System(TM), was used. The variables under consideration were the sequence of fixations and the fixation recurrence time for each location. The event consisted of a simulated regatta start, with stable wind, competitor, and sea conditions. Results show that top ranking sailors exhibit low recurrence times on relevant locations and higher recurrence times on irrelevant locations, while bottom ranking sailors exhibit low recurrence times in most locations. The visual pattern of bottom ranking sailors is focused around two visual pivots, which does not occur in the top ranking sailors' pattern. In conclusion, the Markov chain analysis allowed the visual behaviour patterns of the top and bottom ranking sailors to be characterised and compared. PMID:25296294
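
    A minimal sketch of the Markov-chain treatment described above: estimate a first-order transition matrix from fixation sequences over areas of interest and derive stationary probabilities and mean recurrence times. The location labels and sequences are hypothetical.

        import numpy as np

        # Hypothetical fixation sequences over areas of interest (AOIs) for one sailor
        locations = ["sail", "fleet", "start_line"]
        idx = {loc: i for i, loc in enumerate(locations)}
        sequences = [
            ["sail", "start_line", "fleet", "start_line", "sail"],
            ["start_line", "fleet", "start_line", "sail", "sail"],
        ]

        # Count first-order transitions and normalise rows to get the Markov transition matrix
        counts = np.zeros((len(locations), len(locations)))
        for seq in sequences:
            for a, b in zip(seq[:-1], seq[1:]):
                counts[idx[a], idx[b]] += 1
        transition = counts / counts.sum(axis=1, keepdims=True)
        print(transition)

        # The stationary distribution gives the long-run share of fixations on each AOI;
        # its reciprocal is the mean recurrence time (in fixations) for that AOI.
        eigvals, eigvecs = np.linalg.eig(transition.T)
        stationary = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
        stationary /= stationary.sum()
        for loc, pi in zip(locations, stationary):
            print(f"{loc}: stationary prob {pi:.2f}, mean recurrence {1 / pi:.1f} fixations")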

  20. Probabilistic versus Deterministic Skill in Predicting the Western North Pacific- East Asian Summer Monsoon Variability with Multi-Model Ensembles

    NASA Astrophysics Data System (ADS)

    Yang, D.; Yang, X. Q.; Xie, Q.; Zhang, Y.; Ren, X.; Tang, Y.

    2015-12-01

    Based on the historical forecasts of three quasi-operational multi-model ensemble (MME) systems, this study assesses the advantages of the coupled MME over its contributing single-model ensembles (SMEs) and over the uncoupled atmospheric MME in predicting the seasonal variability of the Western North Pacific-East Asian summer monsoon. The seasonal prediction skill of the monsoon is measured by the Brier skill score (BSS) in the probabilistic sense as well as by the anomaly correlation (AC) in the deterministic sense. The probabilistic forecast skill of the MME is found to be consistently and significantly better than that of each participating SME, while the deterministic forecast skill of the MME can even be worse than that of some SMEs. The BSS is composed of reliability and resolution, two attributes characterizing probabilistic forecast skill. The probabilistic skill increase of the MME is dominated by the drastic improvement in reliability, while resolution is not always improved, similar to AC. A monotonic resolution-AC relationship is further found and qualitatively understood, whereas little relationship can be identified between reliability and AC. It is argued that the MME's success in improving reliability possibly arises from an effective reduction of biases and overconfidence in forecast distributions. The coupled MME is much more skillful than the uncoupled atmospheric MME forced by persisted sea surface temperature (SST) anomalies. This advantage is mainly attributed to its better capability to capture the evolution of the underlying seasonal SST anomaly.
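
    The reliability/resolution decomposition referred to above can be computed as in the following sketch, which applies the standard Murphy decomposition of the Brier score and the Brier skill score relative to climatology to synthetic forecast-observation pairs (not the study's data).

        import numpy as np

        def brier_decomposition(p_forecast, obs, n_bins=10):
            """Murphy decomposition: BS = reliability - resolution + uncertainty."""
            p_forecast = np.asarray(p_forecast, float)
            obs = np.asarray(obs, float)
            base_rate = obs.mean()
            bins = np.clip((p_forecast * n_bins).astype(int), 0, n_bins - 1)
            reliability = resolution = 0.0
            for k in range(n_bins):
                mask = bins == k
                if not mask.any():
                    continue
                n_k = mask.sum()
                p_k = p_forecast[mask].mean()
                o_k = obs[mask].mean()
                reliability += n_k * (p_k - o_k) ** 2
                resolution += n_k * (o_k - base_rate) ** 2
            n = len(obs)
            reliability /= n
            resolution /= n
            uncertainty = base_rate * (1 - base_rate)
            bs = np.mean((p_forecast - obs) ** 2)
            bss = 1.0 - bs / uncertainty      # skill relative to the climatological forecast
            return bs, bss, reliability, resolution, uncertainty

        # Synthetic example: 30 seasonal forecasts of an above-normal monsoon index
        rng = np.random.default_rng(3)
        obs = rng.integers(0, 2, 30)
        p_forecast = np.clip(0.5 * obs + 0.25 + rng.normal(0, 0.15, 30), 0, 1)
        print(brier_decomposition(p_forecast, obs))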

  1. Using the UKCP09 probabilistic scenarios to model the amplified impact of climate change on drainage basin sediment yield

    NASA Astrophysics Data System (ADS)

    Coulthard, T. J.; Ramirez, J.; Fowler, H. J.; Glenis, V.

    2012-11-01

    Precipitation intensities and the frequency of extreme events are projected to increase under climate change. These rainfall changes will lead to increases in the magnitude and frequency of flood events that will, in turn, affect patterns of erosion and deposition within river basins. These geomorphic changes to river systems may affect flood conveyance, infrastructure resilience, channel pattern, and habitat status as well as sediment, nutrient and carbon fluxes. Previous research modelling climatic influences on geomorphic changes has been limited by how climate variability and change are represented by downscaling from global or regional climate models. Furthermore, the non-linearity of the climatic, hydrological and geomorphic systems involved generates large uncertainties at each stage of the modelling process, creating an uncertainty "cascade". This study integrates state-of-the-art approaches from the climate change and geomorphic communities to address these issues in a probabilistic modelling study of the Swale catchment, UK. The UKCP09 weather generator is used to simulate hourly rainfall for the baseline and climate change scenarios up to 2099, and used to drive the CAESAR landscape evolution model to simulate geomorphic change. Results show that winter rainfall is projected to increase, with larger increases at the extremes. The impact of the increasing rainfall is amplified through the translation into catchment runoff and in turn sediment yield, with a 100% increase in catchment mean sediment yield predicted between the baseline and the 2070-2099 High emissions scenario. Significant increases are shown between all climate change scenarios and baseline values. Analysis of extreme events also shows the amplification effect from rainfall to sediment delivery with even greater amplification associated with higher return period events. Furthermore, for the 2070-2099 High emissions scenario, sediment discharges from 50-yr return period events are predicted to

  2. Using the UKCP09 probabilistic scenarios to model the amplified impact of climate change on river basin sediment yield

    NASA Astrophysics Data System (ADS)

    Coulthard, T. J.; Ramirez, J.; Fowler, H. J.; Glenis, V.

    2012-07-01

    Precipitation intensities and the frequency of extreme events are projected to increase under climate change. These rainfall changes will lead to increases in the magnitude and frequency of flood events that will, in turn, affect patterns of erosion and deposition within river basins. These geomorphic changes to river systems may affect flood conveyance, infrastructure resilience, channel pattern, and habitat status, as well as sediment, nutrient and carbon fluxes. Previous research modelling climatic influences on geomorphic changes has been limited by how climate variability and change are represented by downscaling from Global or Regional Climate Models. Furthermore, the non-linearity of the climatic, hydrological and geomorphic systems involved generates large uncertainties at each stage of the modelling process, creating an uncertainty "cascade". This study integrates state-of-the-art approaches from the climate change and geomorphic communities to address these issues in a probabilistic modelling study of the Swale catchment, UK. The UKCP09 weather generator is used to simulate hourly rainfall for the baseline and climate change scenarios up to 2099, and used to drive the CAESAR landscape evolution model to simulate geomorphic change. Results show that winter rainfall is projected to increase, with larger increases at the extremes. The impact of the increasing rainfall is amplified through the translation into catchment runoff and in turn sediment yield, with a 100% increase in catchment mean sediment yield predicted between the baseline and the 2070-2099 High emissions scenario. Significant increases are shown between all climate change scenarios and baseline values. Analysis of extreme events also shows the amplification effect from rainfall to sediment delivery with even greater amplification associated with higher return period events. Furthermore, for the 2070-2099 High emissions scenario, sediment discharges from 50 yr return period events are predicted to

  3. A dimension scale-invariant probabilistic model based on Leibniz-like pyramids

    NASA Astrophysics Data System (ADS)

    Rodríguez, A.; Tsallis, C.

    2012-02-01

    We introduce a family of dimension scale-invariant Leibniz-like pyramids and (d + 1)-dimensional hyperpyramids (d = 1, 2, 3, …), with d = 1 corresponding to triangles, d = 2 to (tetrahedral) pyramids, and so on. For all values of d, they are characterized by a parameter ν > 0, whose value determines the degree of correlation between N (d + 1)-valued random variables (d = 1 corresponds to binary variables, d = 2 to ternary variables, and so on). There are (d + 1)N different events, and the limit ν → ∞ corresponds to independent random variables, in which case each event has a probability 1/(d + 1)N to occur. The sums of these N (d + 1)-valued random variables correspond to a d-dimensional probabilistic model and generalize a recently proposed one-dimensional (d = 1) model having q -Gaussians (with q = (ν - 2)/(ν - 1) for ν ∈ [1, ∞)) as N → ∞ limit probability distributions for the sum of the N binary variables [A. Rodríguez, V. Schwammle, and C. Tsallis, J. Stat. Mech.: Theory Exp. 2008, P09006; R. Hanel, S. Thurner, and C. Tsallis, Eur. Phys. J. B 72, 263 (2009)]. In the ν → ∞ limit the d-dimensional multinomial distribution is recovered for the sums, which approach a d-dimensional Gaussian distribution for N → ∞. For any ν, the conditional distributions of the d-dimensional model are shown to yield the corresponding joint distribution of the (d-1)-dimensional model with the same ν. For the d = 2 case, we study the joint probability distribution and identify two classes of marginal distributions, one of them being asymmetric and dimension scale-invariant, while the other one is symmetric and only asymptotically dimension scale-invariant. The present probabilistic model is proposed as a testing ground for a deeper understanding of the necessary and sufficient conditions for having q-Gaussian attractors in the N → ∞ limit, the ultimate goal being a neat mathematical view of the causes clarifying the ubiquitous emergence of q

  4. Size Evolution and Stochastic Models: Explaining Ostracod Size through Probabilistic Distributions

    NASA Astrophysics Data System (ADS)

    Krawczyk, M.; Decker, S.; Heim, N. A.; Payne, J.

    2014-12-01

    The biovolume of animals has functioned as an important benchmark for measuring evolution throughout geologic time. In our project, we examined the observed average body size of ostracods over time in order to understand the mechanism of size evolution in these marine organisms. The body size of ostracods has varied since the beginning of the Ordovician, when the first true ostracods appeared. We built a stochastic branching model to generate possible evolutionary trees of ostracod size. Using stratigraphic ranges for ostracods compiled from over 750 genera in the Treatise on Invertebrate Paleontology, we calculated overall speciation and extinction rates for our model. At each timestep in our model, new lineages can evolve or existing lineages can become extinct. Newly evolved lineages are assigned sizes based on their parent genera. We parameterized our model to generate neutral and directional changes in ostracod size to compare with the observed data. New sizes were chosen via a normal distribution: the neutral model drew size differentials centered on zero, allowing an equal chance of larger or smaller ostracods at each speciation, whereas the directional model centered the distribution on a negative value, giving a larger chance of smaller ostracods. Our data strongly suggest that the overall direction of ostracod evolution has followed a model that directionally pushes mean ostracod size down, rather than a neutral model. Our model was able to match the magnitude of the size decrease, but it produced a constant linear decline, whereas the observed data show a much more rapid initial decrease followed by a roughly constant size. This nuance in the observed trends ultimately suggests a more complex mode of size evolution. In conclusion, probabilistic methods can provide valuable insight into the possible evolutionary mechanisms determining size evolution in ostracods.
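
    A minimal sketch of a stochastic branching simulation of the kind described above is given below; the speciation and extinction probabilities, step sizes, and the population cap are illustrative choices, not the calibrated values from the Treatise data.

        import random

        def branching_sizes(n_steps=200, p_speciate=0.15, p_extinct=0.05,
                            step_mean=0.0, step_sd=0.05, n_founders=20,
                            max_lineages=2000, seed=42):
            """Simulate mean log-size on a branching tree. step_mean = 0 gives the neutral
            model; step_mean < 0 gives the directional model pushing sizes down."""
            rng = random.Random(seed)
            lineages = [0.0] * n_founders          # log10 biovolume of founding lineages
            mean_size = []
            for _ in range(n_steps):
                next_gen = []
                for size in lineages:
                    if rng.random() < p_extinct:   # lineage goes extinct
                        continue
                    next_gen.append(size)
                    if rng.random() < p_speciate:  # new lineage: parent size plus a normal step
                        next_gen.append(size + rng.gauss(step_mean, step_sd))
                if not next_gen:
                    break
                if len(next_gen) > max_lineages:   # cap standing diversity to keep runtime bounded
                    next_gen = rng.sample(next_gen, max_lineages)
                lineages = next_gen
                mean_size.append(sum(lineages) / len(lineages))
            return mean_size

        neutral = branching_sizes(step_mean=0.0)
        directional = branching_sizes(step_mean=-0.01)
        print("final mean log-size, neutral:     %.3f" % neutral[-1])
        print("final mean log-size, directional: %.3f" % directional[-1])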

  5. Probabilistic modeling of short survivability in patients with brain metastasis from lung cancer.

    PubMed

    Makond, Bunjira; Wang, Kung-Jeng; Wang, Kung-Min

    2015-05-01

    The prediction of substantially short survivability in patients is extremely risky. In this study, we proposed a probabilistic model using a Bayesian network (BN) to predict the short survivability of patients with brain metastasis from lung cancer. A nationwide cancer patient database from 1996 to 2010 in Taiwan was used. The cohort consisted of 438 patients with brain metastasis from lung cancer. We utilized the synthetic minority over-sampling technique (SMOTE) to address the class imbalance embedded in the problem. The proposed BN was compared with three competitive models, namely, naive Bayes (NB), logistic regression (LR), and support vector machine (SVM). Statistical analysis showed that the performances of BN, LR, NB, and SVM were statistically indistinguishable in terms of all indices, with low sensitivity, when these models were applied to an imbalanced data set. Results also showed that SMOTE can improve the performance of the four models in terms of sensitivity, while keeping high accuracy and specificity. Further, the proposed BN is more effective than NB, LR, and SVM in several respects: it is transparent and able to show the relations among factors affecting brain metastasis from lung cancer; it allows decision makers to estimate probabilities despite incomplete evidence and information; and its sensitivity is the highest among all the standard machine learning methods compared. PMID:25804445
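
    The class-imbalance treatment mentioned above can be illustrated with a SMOTE-style oversampler that interpolates between minority-class neighbours; this is a simplified stand-in for the SMOTE procedure, and the feature values below are synthetic.

        import numpy as np

        def smote_like(X_minority, n_new, k=5, seed=0):
            """Generate synthetic minority samples by interpolating each sample
            toward one of its k nearest minority-class neighbours (SMOTE-style)."""
            rng = np.random.default_rng(seed)
            n = len(X_minority)
            # pairwise distances within the minority class
            d = np.linalg.norm(X_minority[:, None, :] - X_minority[None, :, :], axis=2)
            np.fill_diagonal(d, np.inf)
            neighbours = np.argsort(d, axis=1)[:, :k]
            synthetic = []
            for _ in range(n_new):
                i = rng.integers(n)
                j = neighbours[i, rng.integers(min(k, n - 1))]
                lam = rng.uniform()
                synthetic.append(X_minority[i] + lam * (X_minority[j] - X_minority[i]))
            return np.array(synthetic)

        # Synthetic imbalanced data: 40 short-survival (minority) vs 400 longer-survival patients
        rng = np.random.default_rng(1)
        X_min = rng.normal(0.0, 1.0, size=(40, 3))
        X_maj = rng.normal(1.5, 1.0, size=(400, 3))
        X_min_balanced = np.vstack([X_min, smote_like(X_min, n_new=360)])
        print("minority class size after SMOTE-style oversampling:", len(X_min_balanced))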

  6. Uncertainty analyses of fuel hydrocarbon biodegradation signatures in ground water by probabilistic modeling

    SciTech Connect

    McNab, W.W. Jr.; Dooher, B.P.

    1998-07-01

    Natural attenuation processes, such as biodegradation, may serve as a means for remediating ground water contaminated by fuel hydrocarbons from leaking underground fuel tanks (LUFTs). Quantification of the uncertainties associated with natural attenuation, and hence the capacity to limit plume migration and restore an aquifer, is important. In this study, a probabilistic screening model is developed to quantify uncertainties involved in the impact of biodegradation on hydrocarbon plume behavior. The approach is based on Monte Carlo simulation using an analytical solution to the advective-dispersive solute transport equation, including a first-order degradation term, coupled with mass balance constraints on electron acceptor use. Empirical probability distributions for governing parameters are provided as input to the model. Application of the model to an existing LUFT site illustrates the degree of uncertainty associated with model-predicted hydrocarbon concentrations and geochemical indicators at individual site monitoring wells as well as the role of various parameter assumptions (e.g., hydraulic conductivity, first-order decay coefficient, source term) in influencing forecasts. This information is useful for risk management planning because the degree of confidence that biodegradation will limit the impact of a hydrocarbon plume on potential receptors can be quantified.
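
    A minimal sketch of the Monte Carlo approach described above is given below, pairing illustrative lognormal parameter distributions with a steady-state one-dimensional advection-dispersion solution with first-order decay; the actual screening model also imposes mass-balance constraints on electron acceptor use, which are not shown here.

        import numpy as np

        rng = np.random.default_rng(7)
        n = 20_000

        # Illustrative parameter distributions (lognormal, as often assumed for these quantities)
        v = rng.lognormal(np.log(0.1), 0.5, n)        # seepage velocity, m/day
        alpha_x = rng.lognormal(np.log(5.0), 0.4, n)  # longitudinal dispersivity, m
        lam = rng.lognormal(np.log(0.003), 0.7, n)    # first-order decay coefficient, 1/day
        c0 = rng.lognormal(np.log(10.0), 0.3, n)      # source benzene concentration, mg/L

        def steady_state_conc(x, c0, v, alpha_x, lam):
            """Steady-state 1-D advection-dispersion solution with first-order decay
            along the plume centerline."""
            return c0 * np.exp(x / (2.0 * alpha_x) * (1.0 - np.sqrt(1.0 + 4.0 * lam * alpha_x / v)))

        # Forecast distribution of concentration at a monitoring well 50 m downgradient
        c_well = steady_state_conc(50.0, c0, v, alpha_x, lam)
        print("median:", np.median(c_well), "mg/L")
        print("5th-95th percentile:", np.percentile(c_well, [5, 95]))
        print("probability of exceeding 1 mg/L:", np.mean(c_well > 1.0))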

  7. A probabilistic model of visual working memory: Incorporating higher order regularities into working memory capacity estimates.

    PubMed

    Brady, Timothy F; Tenenbaum, Joshua B

    2013-01-01

    When remembering a real-world scene, people encode both detailed information about specific objects and higher order information like the overall gist of the scene. However, formal models of change detection, like those used to estimate visual working memory capacity, assume observers encode only a simple memory representation that includes no higher order structure and treats items independently from one another. We present a probabilistic model of change detection that attempts to bridge this gap by formalizing the role of perceptual organization and allowing for richer, more structured memory representations. Using either standard visual working memory displays or displays in which the items are purposefully arranged in patterns, we find that models that take into account perceptual grouping between items and the encoding of higher order summary information are necessary to account for human change detection performance. Considering the higher order structure of items in visual working memory will be critical for models to make useful predictions about observers' memory capacity and change detection abilities in simple displays as well as in more natural scenes. PMID:23230888

  8. Probabilistic Structural Analysis Program

    NASA Technical Reports Server (NTRS)

    Pai, Shantaram S.; Chamis, Christos C.; Murthy, Pappu L. N.; Stefko, George L.; Riha, David S.; Thacker, Ben H.; Nagpal, Vinod K.; Mital, Subodh K.

    2010-01-01

    NASA/NESSUS 6.2c is a general-purpose, probabilistic analysis program that computes probability of failure and probabilistic sensitivity measures of engineered systems. Because NASA/NESSUS uses highly computationally efficient and accurate analysis techniques, probabilistic solutions can be obtained even for extremely large and complex models. Once the probabilistic response is quantified, the results can be used to support risk-informed decisions regarding reliability for safety-critical and one-of-a-kind systems, as well as for maintaining a level of quality while reducing manufacturing costs for larger-quantity products. NASA/NESSUS has been successfully applied to a diverse range of problems in aerospace, gas turbine engines, biomechanics, pipelines, defense, weaponry, and infrastructure. This program combines state-of-the-art probabilistic algorithms with general-purpose structural analysis and lifting methods to compute the probabilistic response and reliability of engineered structures. Uncertainties in load, material properties, geometry, boundary conditions, and initial conditions can be simulated. The structural analysis methods include non-linear finite-element methods, heat-transfer analysis, polymer/ceramic matrix composite analysis, monolithic (conventional metallic) materials life-prediction methodologies, boundary element methods, and user-written subroutines. Several probabilistic algorithms are available such as the advanced mean value method and the adaptive importance sampling method. NASA/NESSUS 6.2c is structured in a modular format with 15 elements.

  9. Building Time-Dependent Earthquake Recurrence Models for Probabilistic Loss Computations

    NASA Astrophysics Data System (ADS)

    Fitzenz, D. D.; Nyst, M.

    2013-12-01

    We present a Risk Management perspective on earthquake recurrence on mature faults, and the ways that it can be modeled. The specificities of Risk Management relative to Probabilistic Seismic Hazard Assessment (PSHA) include the non-linearity of the exceedance probability curve for losses relative to the frequency of event occurrence, the fact that losses at all return periods are needed (and not at discrete values of the return period), and the set-up of financial models which sometimes require the modeling of realizations of the order in which events may occur (i.e., simulated event dates are important, whereas only average rates of occurrence are routinely used in PSHA). We use New Zealand as a case study and review the physical characteristics of several faulting environments, contrasting them against properties of three probability density functions (PDFs) widely used to characterize the inter-event time distributions in time-dependent recurrence models. We review the data available to help constrain both the priors and the recurrence process. We propose that, with the current level of knowledge, the best way to quantify the recurrence of large events on mature faults is to use a Bayesian combination of models, i.e., the decomposition of the inter-event time distribution into a linear combination of individual PDFs with their weights given by the posterior distribution. Finally, we propose to the community: 1. A general debate on how best to incorporate our knowledge (e.g., from geology, geomorphology) on plausible models and model parameters, but also preserve the information on what we do not know; and 2. The creation and maintenance of a global database of priors, data, and model evidence, classified by tectonic region, special fluid characteristic (pH, compressibility, pressure), fault geometry, and other relevant properties so that we can monitor whether some trends emerge in terms of which model dominates in which conditions.

  10. Impact of Three-Parameter Weibull Models in Probabilistic Assessment of Earthquake Hazards

    NASA Astrophysics Data System (ADS)

    Pasari, Sumanta; Dikshit, Onkar

    2014-07-01

    This paper investigates the suitability of a three-parameter (scale, shape, and location) Weibull distribution in probabilistic assessment of earthquake hazards. The performance is also compared with two other popular models from the same Weibull family, namely the two-parameter Weibull model and the inverse Weibull model. A complete and homogeneous earthquake catalog (Yadav et al. in Pure Appl Geophys 167:1331-1342, 2010) of 20 events (M ≥ 7.0), spanning the period 1846 to 1995 from north-east India and its surrounding region (20°-32°N and 87°-100°E), is used to perform this study. The model parameters are initially estimated from graphical plots and later confirmed by statistical estimation methods such as maximum likelihood estimation (MLE) and the method of moments (MoM). The asymptotic variance-covariance matrix for the MLE-estimated parameters is further calculated on the basis of the Fisher information matrix (FIM). The model suitability is appraised using different statistical goodness-of-fit tests. For the study area, the estimated conditional probability for an earthquake within a decade comes out to be very high (≥0.90) for an elapsed time of 18 years (i.e., 2013). The study also reveals that the use of the location parameter provides more flexibility to the three-parameter Weibull model in comparison to the two-parameter Weibull model. Therefore, it is suggested that the three-parameter Weibull model is of high value in empirical modeling of earthquake recurrence and seismic hazard assessment.
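
    The conditional-probability calculation described above can be sketched with a three-parameter Weibull distribution as follows; the shape, location, and scale values and the example catalog are illustrative, not the fitted values from the north-east India data.

        from scipy.stats import weibull_min

        # Illustrative three-parameter Weibull for inter-event times of M >= 7.0 earthquakes:
        # shape (c), location (minimum recurrence time), and scale, all in years
        shape, loc, scale = 1.8, 2.0, 8.0
        dist = weibull_min(c=shape, loc=loc, scale=scale)

        def conditional_prob(elapsed, window, dist):
            """P(T <= elapsed + window | T > elapsed) from the survival function."""
            return (dist.sf(elapsed) - dist.sf(elapsed + window)) / dist.sf(elapsed)

        # e.g. probability of an event within the next 10 years given 18 years have elapsed
        print(conditional_prob(elapsed=18.0, window=10.0, dist=dist))

        # Maximum likelihood fit to an illustrative catalog of inter-event times (years)
        interevent_times = [6.5, 9.0, 12.3, 7.8, 15.1, 5.9, 10.4]
        c_hat, loc_hat, scale_hat = weibull_min.fit(interevent_times)
        print("MLE shape/location/scale:", c_hat, loc_hat, scale_hat)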

  11. Incorporating Multi-model Ensemble Techniques Into a Probabilistic Hydrologic Forecasting System

    NASA Astrophysics Data System (ADS)

    Sonessa, M. Y.; Bohn, T. J.; Lettenmaier, D. P.

    2008-12-01

    Multi-model ensemble techniques have been shown to reduce bias and to aid in quantification of the effects of model uncertainty in hydrologic modeling. However, these techniques are only beginning to be applied in operational hydrologic forecast systems. To investigate the performance of a multi-model ensemble in the context of probabilistic hydrologic forecasting, we have extended the University of Washington's West-wide Seasonal Hydrologic Forecasting System to use an ensemble of three models: the Variable Infiltration Capacity (VIC) model version 4.0.6, the NCEP NOAH model version 2.7.1, and the NWS grid-based Sacramento/Snow-17 model (SAC). The objective of this presentation is to assess the performance of the ensemble of the three models as compared to the performance of the models individually. Three forecast points within the West-wide forecast system domain were used for this research: the Feather River at Oroville, CA, the Salmon River at Whitehorse, ID, and the Colorado River at Grand Junction. The forcing and observed streamflow data are for years 1951-2005 for the Feather and Salmon Rivers, and 1951-2003 for the Colorado. The models were first run for the retrospective period, then bias-corrected, and model weights were then determined using multiple linear regression. We assessed the performance of the ensemble in comparison with the individual models in terms of correlation with observed flows, root mean square error, and Nash-Sutcliffe efficiency. We found that for evaluations of retrospective simulations in comparison with observations, the ensemble performed better overall than any of the models individually, even though in a few individual months individual models performed slightly better than the ensemble. To test forecast skill, we performed Ensemble Streamflow Prediction (ESP) forecasts for each year of the retrospective period, using forcings from all other years, for individual models and for the multi-model ensemble. To form the ensemble for the ESP
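
    A minimal sketch of the ensemble-weighting step described above: regress observed flows on bias-corrected simulations from the three models and compare Nash-Sutcliffe efficiencies. The flow series below are synthetic stand-ins, and the model names are used only as labels.

        import numpy as np

        rng = np.random.default_rng(11)
        n_months = 600   # e.g. 50 years of monthly flows

        # Synthetic "observed" flow and three bias-corrected model simulations of it
        obs = 100 + 30 * np.sin(np.arange(n_months) * 2 * np.pi / 12) + rng.normal(0, 10, n_months)
        vic = obs + rng.normal(0, 12, n_months)
        noah = obs + rng.normal(0, 15, n_months)
        sac = obs + rng.normal(0, 18, n_months)
        X = np.column_stack([vic, noah, sac])

        # Multiple linear regression: weights that minimise squared error of the weighted ensemble
        weights, *_ = np.linalg.lstsq(X, obs, rcond=None)
        ensemble = X @ weights

        def nash_sutcliffe(sim, obs):
            return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

        print("regression weights (VIC, NOAH, SAC):", np.round(weights, 3))
        for name, sim in [("VIC", vic), ("NOAH", noah), ("SAC", sac), ("ensemble", ensemble)]:
            print(f"{name:8s} NSE = {nash_sutcliffe(sim, obs):.3f}")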

  12. A probabilistic respiratory tract dosimetry model with application to beta-particle and photon emitters

    NASA Astrophysics Data System (ADS)

    Farfan, Eduardo Balderrama

    2002-01-01

    Predicting equivalent dose in the human respiratory tract is significant in the assessment of health risks associated with the inhalation of radioactive aerosols. A complete respiratory tract methodology based on the International Commission on Radiological Protection Publication 66 model was used in this research project for beta-particle and photon emitters. The conventional methodology has been to use standard values (from Reference Man) for parameters to obtain a single dose value. However, the methods used in the current study allow lung dose values to be determined as probability distributions to reflect the spread or variability in doses. To implement the methodology, a computer code, LUDUC, has been modified to include inhalation scenarios of beta-particle and photon emitters. For beta particles, a new methodology was implemented into Monte Carlo simulations to determine absorbed fractions in target tissues within the thoracic region of the respiratory tract. For photons, a new mathematical phantom of extrathoracic and thoracic regions was created based on previous studies to determine specific absorbed fractions in several tissues and organs of the human body due to inhalation of radioactive materials. The application of the methodology and developed data will be helpful in dose reconstruction and prediction efforts concerning the inhalation of short-lived radionuclides or radionuclides of Inhalation Class S. The resulting dose distributions follow a lognormal distribution shape for all scenarios examined. Applying the probabilistic computer code LUDUC to inhalation of strontium and yttrium aerosols has shown several trends, which could also be valid for many S radionuclide compounds that are beta-particle emitters. The equivalent doses are, in general, found to follow lognormal distributions. Therefore, these distributions can be described by geometric means and geometric standard deviations. Furthermore, a mathematical phantom of the extrathoracic and

  13. Modeling and Quantification of Team Performance in Human Reliability Analysis for Probabilistic Risk Assessment

    SciTech Connect

    Jeffrey C. Joe; Ronald L. Boring

    2014-06-01

    Probabilistic Risk Assessment (PRA) and Human Reliability Assessment (HRA) are important technical contributors to the United States (U.S.) Nuclear Regulatory Commission’s (NRC) risk-informed and performance-based approach to regulating U.S. commercial nuclear activities. Furthermore, all currently operating commercial nuclear power plants (NPPs) in the U.S. are required by federal regulation to be staffed with crews of operators. Yet, aspects of team performance are underspecified in most HRA methods that are widely used in the nuclear industry. There are a variety of "emergent" team cognition and teamwork errors (e.g., communication errors) that are 1) distinct from individual human errors, and 2) important to understand from a PRA perspective. The lack of robust models or quantification of team performance is an issue that affects the accuracy and validity of HRA methods and models, leading to significant uncertainty in estimating human error probabilities (HEPs). This paper describes research whose objective is to model and quantify team dynamics and teamwork within NPP control room crews for risk-informed applications, thereby improving the technical basis of HRA, which in turn improves the risk-informed approach the NRC uses to regulate the U.S. commercial nuclear industry.

  14. Probabilistic ecological risk assessment of effluent toxicity of a wastewater reclamation plant based on process modeling.

    PubMed

    Zeng, Siyu; Huang, Yunqing; Sun, Fu; Li, Dan; He, Miao

    2016-09-01

    The growing use of reclaimed wastewater for environmental purposes such as stream flow augmentation requires comprehensive ecological risk assessment and management. This study applied a system analysis approach, regarding a wastewater reclamation plant (WRP) and its recipient water body as a whole system, and assessed the ecological risk of the recipient water body caused by the WRP effluent. Instead of specific contaminants, two toxicity indicators, i.e. genotoxicity and estrogenicity, were selected to directly measure the biological effects of all bio-available contaminants in the reclaimed wastewater, as well as characterize the ecological risk of the recipient water. A series of physically based models were developed to simulate the toxicity indicators in a WRP through a typical reclamation process, including ultrafiltration, ozonation, and chlorination. After being validated against the field monitoring data from a full-scale WRP in Beijing, the models were applied to simulate the probability distribution of effluent toxicity of the WRP through Latin Hypercube Sampling to account for the variability of influent toxicity and operation conditions. The simulated effluent toxicity was then used to derive the predicted environmental concentration (PEC) in the recipient stream, considering the variations of the toxicity and flow of the upstream inflow as well. The ratio of the PEC of each toxicity indicator to its corresponding predicted no-effect concentration was finally used for the probabilistic ecological risk assessment. Regional sensitivity analysis was also performed with the developed models to identify the critical control variables and strategies for ecological risk management. PMID:27219046
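
    The Latin Hypercube Sampling step described above can be sketched as follows; the removal relationship, parameter ranges, and the PNEC value are assumptions for illustration, not the validated process models from the study.

        import numpy as np

        def latin_hypercube(n, d, rng):
            """One stratified uniform sample per bin in each dimension, randomly permuted."""
            u = (rng.random((n, d)) + np.arange(n)[:, None]) / n
            for j in range(d):
                u[:, j] = rng.permutation(u[:, j])
            return u

        rng = np.random.default_rng(5)
        n = 5000
        u = latin_hypercube(n, 3, rng)

        # Map uniform LHS samples onto illustrative parameter ranges
        influent_tox = 10 ** (u[:, 0] * 1.5)        # influent estrogenicity, ng-E2-eq/L (1 to ~32)
        ozone_dose = 2.0 + 6.0 * u[:, 1]            # mg O3/L
        dilution = 2.0 + 8.0 * u[:, 2]              # stream dilution factor at the discharge point

        # Placeholder removal model: higher ozone dose gives larger log-removal of estrogenicity
        effluent_tox = influent_tox * 10 ** (-0.3 * ozone_dose)
        pec = effluent_tox / dilution               # predicted environmental concentration
        pnec = 0.4                                  # predicted no-effect concentration (illustrative)

        risk_quotient = pec / pnec
        print("P(risk quotient > 1):", np.mean(risk_quotient > 1.0))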

  15. Optimizing Negotiation Conflict in the Cloud Service Negotiation Framework Using Probabilistic Decision Making Model

    PubMed Central

    Rajavel, Rajkumar; Thangarathinam, Mala

    2015-01-01

    Optimization of negotiation conflict in the cloud service negotiation framework is identified as one of the major challenges. This negotiation conflict occurs during the bilateral negotiation process between the participants because of misperception, aggressive behavior, and uncertainty about opponents' preferences and goals. Existing research focuses on the pre-request stage of negotiation conflict optimization by grouping similar negotiation pairs using distance, binary, context-dependent, and fuzzy similarity approaches. To some extent, these approaches can maximize the success rate and minimize the communication overhead among the participants. To further improve the success rate and communication overhead, the proposed work introduces a novel probabilistic decision-making model for optimizing negotiation conflict in the long-term negotiation context. This decision model formulates the problem of managing the different types of negotiation conflict that occur during the negotiation process as a multistage Markov decision problem. At each stage of the negotiation process, the proposed decision model generates a heuristic decision based on past negotiation state information without causing any break-off among the participants. In addition, this heuristic decision, obtained via a stochastic decision tree scenario, can maximize the revenue of the participants in the cloud service negotiation framework. PMID:26543899

  16. Segmentation of risk structures for otologic surgery using the Probabilistic Active Shape Model (PASM)

    NASA Astrophysics Data System (ADS)

    Becker, Meike; Kirschner, Matthias; Sakas, Georgios

    2014-03-01

    Our research project investigates a multi-port approach for minimally-invasive otologic surgery. For planning such a surgery, an accurate segmentation of the risk structures is crucial. However, the segmentation of these risk structures is a challenging task: The anatomical structures are very small and some have a complex shape, low contrast and vary both in shape and appearance. Therefore, prior knowledge is needed which is why we apply model-based approaches. In the present work, we use the Probabilistic Active Shape Model (PASM), which is a more flexible and specific variant of the Active Shape Model (ASM), to segment the following risk structures: cochlea, semicircular canals, facial nerve, chorda tympani, ossicles, internal auditory canal, external auditory canal and internal carotid artery. For the evaluation we trained and tested the algorithm on 42 computed tomography data sets using leave-one-out tests. Visual assessment of the results shows in general a good agreement of manual and algorithmic segmentations. Further, we achieve a good Average Symmetric Surface Distance while the maximum error is comparatively large due to low contrast at start and end points. Last, we compare the PASM to the standard ASM and show that the PASM leads to a higher accuracy.

  17. LADTAP-PROB: A PROBABILISTIC MODEL TO ASSESS RADIOLOGICAL CONSEQUENCES FROM LIQUID RADIOACTIVE RELEASES

    SciTech Connect

    Farfan, E; Trevor Foley, T; Tim Jannik, T

    2009-01-26

    The potential radiological consequences to humans resulting from aqueous releases at the Savannah River Site (SRS) have usually been assessed using the computer code LADTAP or deterministic variations of this code. The advancement of LADTAP over the years included LADTAP II (a computer program that still resides on the mainframe at SRS) [1], LADTAP XL© (Microsoft Excel® spreadsheet) [2], and other versions specific to SRS areas such as [3]. The spreadsheet variations of LADTAP contain two worksheets: LADTAP and IRRIDOSE. The LADTAP worksheet estimates dose for environmental pathways including ingestion of water and fish and external exposure resulting from recreational activities. IRRIDOSE estimates potential dose to individuals from irrigation of food crops with contaminated water. A new version of this methodology, LADTAP-PROB, was developed at Savannah River National Laboratory (SRNL) to (1) consider the complete range of model parameter values (not just maximum or mean values), (2) determine the influence of parameter uncertainties within the LADTAP methodology by performing a sensitivity analysis of all model parameters (to identify the input parameters to which model results are most sensitive), and (3) probabilistically assess radiological consequences from contaminated water. This study presents the methodology applied in LADTAP-PROB.

  18. Development of a probabilistic timing model for the ingestion of tap water.

    SciTech Connect

    Davis, M. J.; Janke, R.; Environmental Science Division; EPA

    2009-01-01

    A contamination event in a water distribution system can result in adverse health impacts to individuals consuming contaminated water from the system. Assessing impacts to such consumers requires accounting for the timing of exposures of individuals to tap-water contaminants that have time-varying concentrations. Here we present a probabilistic model for the timing of ingestion of tap water that we developed for use in the U.S. Environmental Protection Agency's Threat Ensemble Vulnerability Assessment and Sensor Placement Tool, which is designed to perform consequence assessments for contamination events in water distribution systems. We also present a statistical analysis of the timing of ingestion activity using data collected by the American Time Use Survey. The results of the analysis provide the basis for our model, which accounts for individual variability in ingestion timing and provides a series of potential ingestion times for tap water. It can be combined with a model for ingestion volume to perform exposure assessments and applied in cases for which the use of characteristics typical of the United States is appropriate.

  19. Optimizing Negotiation Conflict in the Cloud Service Negotiation Framework Using Probabilistic Decision Making Model.

    PubMed

    Rajavel, Rajkumar; Thangarathinam, Mala

    2015-01-01

    Optimization of negotiation conflict in the cloud service negotiation framework is identified as one of the major challenges. This negotiation conflict occurs during the bilateral negotiation process between the participants because of misperception, aggressive behavior, and uncertainty about opponents' preferences and goals. Existing research focuses on the pre-request stage of negotiation conflict optimization by grouping similar negotiation pairs using distance, binary, context-dependent, and fuzzy similarity approaches. To some extent, these approaches can maximize the success rate and minimize the communication overhead among the participants. To further improve the success rate and communication overhead, the proposed work introduces a novel probabilistic decision-making model for optimizing negotiation conflict in the long-term negotiation context. This decision model formulates the problem of managing the different types of negotiation conflict that occur during the negotiation process as a multistage Markov decision problem. At each stage of the negotiation process, the proposed decision model generates a heuristic decision based on past negotiation state information without causing any break-off among the participants. In addition, this heuristic decision, obtained via a stochastic decision tree scenario, can maximize the revenue of the participants in the cloud service negotiation framework. PMID:26543899

  20. A probabilistic neural network approach for modeling and classification of bacterial growth/no-growth data.

    PubMed

    Hajmeer, M; Basheer, I

    2002-10-01

    In this paper, we propose to use probabilistic neural networks (PNNs) for classification of bacterial growth/no-growth data and modeling the probability of growth. The PNN approach combines both Bayes theorem of conditional probability and Parzen's method for estimating the probability density functions of the random variables. Unlike other neural network training paradigms, PNNs are characterized by high training speed and their ability to produce confidence levels for their classification decision. As a practical application of the proposed approach, PNNs were investigated for their ability in classification of growth/no-growth state of a pathogenic Escherichia coli R31 in response to temperature and water activity. A comparison with the most frequently used traditional statistical method based on logistic regression and multilayer feedforward artificial neural network (MFANN) trained by error backpropagation was also carried out. The PNN-based models were found to outperform linear and nonlinear logistic regression and MFANN in both the classification accuracy and ease by which PNN-based models are developed. PMID:12133614
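
    A minimal sketch of a Parzen-window probabilistic neural network of the kind described above is given below, classifying growth/no-growth as a function of (scaled) temperature and water activity; the training points and smoothing parameter are synthetic and illustrative.

        import numpy as np

        class PNN:
            """Probabilistic neural network: class-conditional densities estimated with
            Gaussian Parzen windows; Bayes' rule gives the posterior (confidence) per class."""
            def __init__(self, sigma=0.05):
                self.sigma = sigma

            def fit(self, X, y):
                self.classes_ = np.unique(y)
                self.X_by_class = {c: X[y == c] for c in self.classes_}
                return self

            def predict_proba(self, X):
                scores = []
                for c in self.classes_:
                    Xc = self.X_by_class[c]
                    d2 = ((X[:, None, :] - Xc[None, :, :]) ** 2).sum(axis=2)
                    # mean Gaussian kernel value = Parzen estimate of p(x | class c)
                    scores.append(np.exp(-d2 / (2 * self.sigma ** 2)).mean(axis=1))
                scores = np.column_stack(scores)
                return scores / scores.sum(axis=1, keepdims=True)

        # Synthetic growth/no-growth data: features are (temperature scaled to 0-1, water activity)
        rng = np.random.default_rng(4)
        grow = np.column_stack([rng.uniform(0.4, 1.0, 60), rng.uniform(0.95, 0.99, 60)])
        no_grow = np.column_stack([rng.uniform(0.0, 0.5, 60), rng.uniform(0.90, 0.96, 60)])
        X = np.vstack([grow, no_grow])
        y = np.array([1] * 60 + [0] * 60)

        model = PNN(sigma=0.05).fit(X, y)
        print(model.predict_proba(np.array([[0.8, 0.98], [0.2, 0.92]])))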

  1. Spike Sorting by Joint Probabilistic Modeling of Neural Spike Trains and Waveforms

    PubMed Central

    Matthews, Brett A.; Clements, Mark A.

    2014-01-01

    This paper details a novel probabilistic method for automatic neural spike sorting which uses stochastic point process models of neural spike trains and parameterized action potential waveforms. A novel likelihood model for observed firing times as the aggregation of hidden neural spike trains is derived, as well as an iterative procedure for clustering the data and finding the parameters that maximize the likelihood. The method is executed and evaluated on both a fully labeled semiartificial dataset and a partially labeled real dataset of extracellular electric traces from rat hippocampus. In conditions of relatively high difficulty (i.e., with additive noise and with similar action potential waveform shapes for distinct neurons) the method achieves significant improvements in clustering performance over a baseline waveform-only Gaussian mixture model (GMM) clustering on the semiartificial set (1.98% reduction in error rate) and outperforms both the GMM and a state-of-the-art method on the real dataset (5.04% reduction in false positive + false negative errors). Finally, an empirical study of two free parameters for our method is performed on the semiartificial dataset. PMID:24829568
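
    The waveform-only GMM baseline mentioned above (not the authors' joint spike-train/waveform method) can be sketched as follows using scikit-learn; the action-potential waveforms are synthetic.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.mixture import GaussianMixture

        rng = np.random.default_rng(6)

        def make_waveform(amplitude, width, n=32):
            """Synthetic action potential: negative spike followed by a small positive rebound."""
            t = np.arange(n)
            return (-amplitude * np.exp(-((t - 10) ** 2) / width)
                    + 0.3 * amplitude * np.exp(-((t - 18) ** 2) / (2 * width)))

        # Two synthetic "neurons" with different waveform shapes, plus additive noise
        waves = np.vstack(
            [make_waveform(1.0, 6.0) + rng.normal(0, 0.1, 32) for _ in range(200)]
            + [make_waveform(0.7, 12.0) + rng.normal(0, 0.1, 32) for _ in range(200)]
        )
        labels_true = np.array([0] * 200 + [1] * 200)

        # Waveform-only baseline: project to a few principal components, then fit a GMM
        features = PCA(n_components=3).fit_transform(waves)
        gmm = GaussianMixture(n_components=2, covariance_type="full", random_state=0).fit(features)
        labels_pred = gmm.predict(features)

        # Clustering accuracy up to label permutation
        acc = max(np.mean(labels_pred == labels_true), np.mean(labels_pred == 1 - labels_true))
        print("baseline GMM clustering accuracy:", acc)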

  2. Expert system development for probabilistic load simulation

    NASA Technical Reports Server (NTRS)

    Ho, H.; Newell, J. F.

    1991-01-01

    A knowledge-based system, LDEXPT, using the intelligent data base paradigm was developed for the Composite Load Spectra (CLS) project to simulate the probabilistic loads of a space propulsion system. The knowledge base approach provides a systematic framework for organizing the load information and facilitates the coupling of numerical processing and symbolic (information) processing. It provides an incremental development environment for building generic probabilistic load models and for bookkeeping of the associated load information. A large volume of load data is stored in the data base and can be retrieved and updated by a built-in data base management system. The data base system standardizes the data storage and retrieval procedures. It helps maintain data integrity and avoid data redundancy. The intelligent data base paradigm provides ways to build expert system rules for shallow and deep reasoning and thus provides expert knowledge to help users obtain the required probabilistic load spectra.

  3. Integrated Workforce Modeling System

    NASA Technical Reports Server (NTRS)

    Moynihan, Gary P.

    2000-01-01

    There are several computer-based systems, currently in various phases of development at KSC, which encompass some component, aspect, or function of workforce modeling. These systems may offer redundant capabilities and/or incompatible interfaces. A systems approach to workforce modeling is necessary in order to identify and better address user requirements. This research has consisted of two primary tasks. Task 1 provided an assessment of existing and proposed KSC workforce modeling systems for their functionality and applicability to the workforce planning function. Task 2 resulted in the development of a proof-of-concept design for a systems approach to workforce modeling. The model incorporates critical aspects of workforce planning, including hires, attrition, and employee development.

  4. Probabilistic terrain models from waveform airborne LiDAR: AutoProbaDTM project results

    NASA Astrophysics Data System (ADS)

    Jalobeanu, A.; Goncalves, G. R.

    2012-12-01

    The main objective of the AutoProbaDTM project was to develop new methods for automated probabilistic topographic map production using the latest LiDAR scanners. It included algorithmic development, implementation and validation over a 200 km2 test area in continental Portugal, representing roughly 100 GB of raw data and half a billion waveforms. We aimed to generate digital terrain models automatically, including ground topography as well as uncertainty maps, using Bayesian inference for model estimation and error propagation, and approaches based on image processing. Here we present the results of the completed project (methodological developments and processing results from the test dataset). In June 2011, the test data were acquired in central Portugal, over an area of geomorphological and ecological interest, using a Riegl LMS-Q680i sensor. We managed to survey 70% of the test area at a satisfactory sampling rate, the angular spacing matching the laser beam divergence and the ground spacing nearly equal to the footprint (almost 4 pts/m2 for a 50cm footprint at 1500 m AGL). This is crucial for correct processing, as aliasing artifacts are significantly reduced. Reverse engineering was required because the data were delivered in a proprietary binary format; as a result, we were able to read the waveforms and the essential parameters. A robust waveform processing method has been implemented and tested, and georeferencing and geometric computations have been coded. Fast gridding and interpolation techniques have been developed. Validation is nearly completed, as well as geometric calibration, IMU error correction, full error propagation and large-scale DEM reconstruction. A probabilistic processing software package has been implemented and code optimization is in progress. This package includes new boresight calibration procedures, robust peak extraction modules, DEM gridding and interpolation methods, and means to visualize the produced uncertain surfaces (topography

  5. Rapid probabilistic source characterisation in 3D earth models using learning algorithms

    NASA Astrophysics Data System (ADS)

    Valentine, A. P.; Kaeufl, P.; Trampert, J.

    2015-12-01

    Characterising earthquake sources rapidly and robustly is an essential component of any earthquake early warning (EEW) procedure. Ideally, this characterisation should: (i) be probabilistic -- enabling appreciation of the full range of mechanisms compatible with available data, and taking observational and theoretical uncertainties into account; and (ii) operate in a physically-complete theoretical framework. However, implementing either of these ideals increases computational costs significantly, making it unfeasible to satisfy both in the short timescales necessary for EEW applications. The barrier here arises from the fact that conventional probabilistic inversion techniques involve running many thousands of forward simulations after data has been obtained---a procedure known as `posterior sampling'. Thus, for EEW, all computational costs must be incurred after the event time. Here, we demonstrate a new approach---based instead on `prior sampling'---which circumvents this problem and is feasible for EEW applications. All forward simulations are conducted in advance, and a learning algorithm is used to assimilate information about the relationship between model and data. Once observations from an earthquake become available, this information can be used to infer probability density functions (pdfs) for seismic source parameters, within milliseconds. We demonstrate this procedure using data from the 2008 Mw5.4 Chino Hills earthquake. We compute Green's functions for 150 randomly-chosen locations on the Whittier and Chino faults, using SPECFEM3D and a 3D model of the regional velocity structure. We then use these to train neural networks that map from seismic waveforms to pdfs on a point-source, moment-tensor representation of the event mechanism. We show that using local network data from the Chino Hills event, this system provides accurate information on magnitude, epicentral location and source half-duration using data available 6 seconds after the first station
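
    A schematic of the `prior sampling' workflow described above: forward-simulate from the prior in advance, train a learning algorithm on the simulated pairs, then map a new observation to source parameters with a single cheap prediction. The forward model below is a toy stand-in, not SPECFEM3D, and the network architecture is an assumption.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(2)

def forward(m):
    """Toy stand-in for an expensive forward simulation: maps source
    parameters m to a short synthetic 'waveform'."""
    t = np.linspace(0, 1, 50)
    return m[0] * np.sin(2 * np.pi * (2 + m[1]) * t) + 0.02 * rng.standard_normal(t.size)

# 1) Prior sampling, done in advance of any event
m_prior = rng.uniform([0.5, 0.0], [2.0, 3.0], size=(2000, 2))   # magnitude-like, duration-like
d_prior = np.array([forward(m) for m in m_prior])

# 2) Learn the data -> parameter mapping offline
net = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0)
net.fit(d_prior, m_prior)

# 3) When an event arrives, inference is one cheap forward pass
observed = forward(np.array([1.3, 1.0]))
print("inferred parameters:", net.predict(observed[None, :])[0])
```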

  6. Probabilistic exposure assessment model to estimate aseptic-UHT product failure rate.

    PubMed

    Pujol, Laure; Albert, Isabelle; Magras, Catherine; Johnson, Nicholas Brian; Membré, Jeanne-Marie

    2015-01-01

    Aseptic-Ultra-High-Temperature (UHT) products are manufactured to be free of microorganisms capable of growing in the food at normal non-refrigerated conditions at which the food is likely to be held during manufacture, distribution and storage. Two important phases within the process are widely recognised as critical in controlling microbial contamination: the sterilisation steps and the following aseptic steps. Of the microbial hazards, the pathogen spore formers Clostridium botulinum and Bacillus cereus are deemed the most pertinent to be controlled. In addition, due to a relatively high thermal resistance, Geobacillus stearothermophilus spores are considered a concern for spoilage of low acid aseptic-UHT products. A probabilistic exposure assessment model has been developed in order to assess the aseptic-UHT product failure rate associated with these three bacteria. It was a Modular Process Risk Model, based on nine modules. They described: i) the microbial contamination introduced by the raw materials, either from the product (i.e. milk, cocoa and dextrose powders and water) or the packaging (i.e. bottle and sealing component), ii) the sterilisation processes, of either the product or the packaging material, iii) the possible recontamination during subsequent processing of both product and packaging. The Sterility Failure Rate (SFR) was defined as the sum of bottles contaminated for each batch, divided by the total number of bottles produced per process line run (10^6 batches simulated per process line). The SFR associated with the three bacteria was estimated at the last step of the process (i.e. after Module 9) but also after each module, allowing for the identification of modules, and responsible contamination pathways, with higher or lower intermediate SFR. The model contained 42 controlled settings associated with factory environment, process line or product formulation, and more than 55 probabilistic inputs corresponding to inputs with variability
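
    A minimal Monte Carlo sketch of the Sterility Failure Rate as defined above (contaminated bottles per batch divided by bottles produced per run). The contamination, sterilisation-survival, and recontamination probabilities are invented placeholders rather than the study's 42 settings and 55-plus probabilistic inputs.

```python
import numpy as np

rng = np.random.default_rng(3)
n_batches = 10_000            # simulated process-line runs (the study simulates 10^6)
bottles_per_batch = 50_000

# Placeholder module parameters
mean_spores_in = 0.02         # mean spores entering with raw material, per bottle
p_survive_steril = 1e-4       # probability a spore survives the sterilisation step
p_recontam = 1e-6             # probability of aseptic recontamination of a bottle

# P(bottle contaminated) = 1 - P(no surviving spores) * P(no recontamination),
# with surviving spores assumed Poisson-distributed
p_contam = 1 - np.exp(-mean_spores_in * p_survive_steril) * (1 - p_recontam)
contaminated = rng.binomial(bottles_per_batch, p_contam, size=n_batches)

sfr = contaminated / bottles_per_batch          # sterility failure rate per batch
print("mean SFR: %.2e, 95th percentile: %.2e" % (sfr.mean(), np.quantile(sfr, 0.95)))
```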

  7. Near Real Time Integration of Satellite and Radar Data for Probabilistic Nearcasting of Severe Weather

    NASA Astrophysics Data System (ADS)

    Pilone, D.; Quinn, P.; Mitchell, A. E.; Baynes, K.; Shum, D.

    2014-12-01

    This talk introduces the audience to some of the very real challenges associated with visualizing data from disparate data sources as encountered during the development of real world applications. In addition to the fundamental challenges of dealing with the data and imagery, this talk discusses usability problems encountered while trying to provide interactive and user-friendly visualization tools. At the end of this talk the audience will be aware of some of the pitfalls of data visualization along with tools and techniques to help mitigate them. There are many sources of variable resolution visualizations of science data available to application developers including NASA's Global Imagery Browse Services (GIBS), however integrating and leveraging visualizations in modern applications faces a number of challenges, including: - Varying visualized Earth "tile sizes" resulting in challenges merging disparate sources - Multiple visualization frameworks and toolkits with varying strengths and weaknesses - Global composite imagery vs. imagery matching EOSDIS granule distribution - Challenges visualizing geographically overlapping data with different temporal bounds - User interaction with overlapping or collocated data - Complex data boundaries and shapes combined with multi-orbit data and polar projections - Discovering the availability of visualizations and the specific parameters, color palettes, and configurations used to produce them In addition to discussing the challenges and approaches involved in visualizing disparate data, we will discuss solutions and components we'll be making available as open source to encourage reuse and accelerate application development.

  8. NEMix: single-cell nested effects models for probabilistic pathway stimulation.

    PubMed

    Siebourg-Polster, Juliane; Mudrak, Daria; Emmenlauer, Mario; Rämö, Pauli; Dehio, Christoph; Greber, Urs; Fröhlich, Holger; Beerenwinkel, Niko

    2015-04-01

    Nested effects models have been used successfully for learning subcellular networks from high-dimensional perturbation effects that result from RNA interference (RNAi) experiments. Here, we further develop the basic nested effects model using high-content single-cell imaging data from RNAi screens of cultured cells infected with human rhinovirus. RNAi screens with single-cell readouts are becoming increasingly common, and they often reveal high cell-to-cell variation. As a consequence of this cellular heterogeneity, knock-downs result in variable effects among cells and lead to weak average phenotypes on the cell population level. To address this confounding factor in network inference, we explicitly model the stimulation status of a signaling pathway in individual cells. We extend the framework of nested effects models to probabilistic combinatorial knock-downs and propose NEMix, a nested effects mixture model that accounts for unobserved pathway activation. We analyzed the identifiability of NEMix and developed a parameter inference scheme based on the Expectation Maximization algorithm. In an extensive simulation study, we show that NEMix improves learning of pathway structures over classical NEMs significantly in the presence of hidden pathway stimulation. We applied our model to single-cell imaging data from RNAi screens monitoring human rhinovirus infection, where limited infection efficiency of the assay results in uncertain pathway stimulation. Using a subset of genes with known interactions, we show that the inferred NEMix network has high accuracy and outperforms the classical nested effects model without hidden pathway activity. NEMix is implemented as part of the R/Bioconductor package 'nem' and available at www.cbg.ethz.ch/software/NEMix. PMID:25879530

  9. Probabilistic reliability modeling for oil exploration & production (E&P) facilities in the tallgrass prairie preserve.

    PubMed

    Zambrano, Lyda; Sublette, Kerry; Duncan, Kathleen; Thoma, Greg

    2007-10-01

    The aging domestic oil production infrastructure represents a high risk to the environment because of the type of fluids being handled (oil and brine) and the potential for accidental release of these fluids into sensitive ecosystems. Currently, there is not a quantitative risk model directly applicable to onshore oil exploration and production (E&P) facilities. We report on a probabilistic reliability model created for onshore exploration and production (E&P) facilities. Reliability theory, failure modes and effects analysis (FMEA), and event trees were used to develop the model estimates of the failure probability of typical oil production equipment. Monte Carlo simulation was used to translate uncertainty in input parameter values to uncertainty in the model output. The predicted failure rates were calibrated to available failure rate information by adjusting probability density function parameters used as random variates in the Monte Carlo simulations. The mean and standard deviation of normal variate distributions from which the Weibull distribution characteristic life was chosen were used as adjustable parameters in the model calibration. The model was applied to oil production leases in the Tallgrass Prairie Preserve, Oklahoma. We present the estimated failure probability due to the combination of the most significant failure modes associated with each type of equipment (pumps, tanks, and pipes). The results show that the estimated probability of failure for tanks is about the same as that for pipes, but that pumps have a much lower failure probability. The model can provide necessary equipment reliability information for proactive risk management at the lease level by providing quantitative information on which to base the allocation of maintenance resources to high-risk equipment, minimizing both lost production and ecosystem damage. PMID:18076499
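
    A minimal sketch of the Weibull-based Monte Carlo idea described above: the characteristic life is drawn from a normal variate and failure times follow a Weibull distribution, from which a failure probability over a fixed horizon is estimated. The shape parameter, distribution values, and horizon are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)
n_sim = 100_000
horizon_years = 10.0
shape = 1.8                                   # assumed Weibull shape (wear-out regime)

# Characteristic life (years) drawn from a normal variate, as in the calibration step
char_life = rng.normal(loc=25.0, scale=5.0, size=n_sim).clip(min=1.0)

# Weibull failure time for each simulated piece of equipment (pump, tank, or pipe)
failure_time = char_life * rng.weibull(shape, size=n_sim)
p_fail = np.mean(failure_time < horizon_years)
print(f"estimated probability of failure within {horizon_years:.0f} years: {p_fail:.3f}")
```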

  10. Performance and Probabilistic Verification of Regional Parameter Estimates for Conceptual Rainfall-runoff Models

    NASA Astrophysics Data System (ADS)

    Franz, K.; Hogue, T.; Barco, J.

    2007-12-01

    Identification of appropriate parameter sets for simulation of streamflow in ungauged basins has become a significant challenge for both operational and research hydrologists. This is especially difficult in the case of conceptual models, when model parameters typically must be "calibrated" or adjusted to match streamflow conditions in specific systems (i.e. some of the parameters are not directly observable). This paper addresses the performance and uncertainty associated with transferring conceptual rainfall-runoff model parameters between basins within large-scale ecoregions. We use the National Weather Service's (NWS) operational hydrologic model, the SACramento Soil Moisture Accounting (SAC-SMA) model. A Multi-Step Automatic Calibration Scheme (MACS), using the Shuffled Complex Evolution (SCE), is used to optimize SAC-SMA parameters for a group of watersheds with extensive hydrologic records from the Model Parameter Estimation Experiment (MOPEX) database. We then explore "hydroclimatic" relationships between basins to facilitate regionalization of parameters for an established ecoregion in the southeastern United States. The impact of regionalized parameters is evaluated via standard model performance statistics as well as through generation of hindcasts and probabilistic verification procedures to evaluate streamflow forecast skill. Preliminary results show climatology ("climate neighbor") to be a better indicator of transferability than physical similarities or proximity ("nearest neighbor"). The mean and median of all the parameters within the ecoregion are the poorest choice for the ungauged basin. The choice of regionalized parameter set affected the skill of the ensemble streamflow hindcasts; however, all parameter sets showed little skill in forecasts after five weeks (i.e. climatology is as good an indicator of future streamflows). In addition, the optimum parameter set changed seasonally, with the "nearest neighbor" showing the highest skill in the

  11. NEMix: Single-cell Nested Effects Models for Probabilistic Pathway Stimulation

    PubMed Central

    Siebourg-Polster, Juliane; Mudrak, Daria; Emmenlauer, Mario; Rämö, Pauli; Dehio, Christoph; Greber, Urs; Fröhlich, Holger; Beerenwinkel, Niko

    2015-01-01

    Nested effects models have been used successfully for learning subcellular networks from high-dimensional perturbation effects that result from RNA interference (RNAi) experiments. Here, we further develop the basic nested effects model using high-content single-cell imaging data from RNAi screens of cultured cells infected with human rhinovirus. RNAi screens with single-cell readouts are becoming increasingly common, and they often reveal high cell-to-cell variation. As a consequence of this cellular heterogeneity, knock-downs result in variable effects among cells and lead to weak average phenotypes on the cell population level. To address this confounding factor in network inference, we explicitly model the stimulation status of a signaling pathway in individual cells. We extend the framework of nested effects models to probabilistic combinatorial knock-downs and propose NEMix, a nested effects mixture model that accounts for unobserved pathway activation. We analyzed the identifiability of NEMix and developed a parameter inference scheme based on the Expectation Maximization algorithm. In an extensive simulation study, we show that NEMix improves learning of pathway structures over classical NEMs significantly in the presence of hidden pathway stimulation. We applied our model to single-cell imaging data from RNAi screens monitoring human rhinovirus infection, where limited infection efficiency of the assay results in uncertain pathway stimulation. Using a subset of genes with known interactions, we show that the inferred NEMix network has high accuracy and outperforms the classical nested effects model without hidden pathway activity. NEMix is implemented as part of the R/Bioconductor package ‘nem’ and available at www.cbg.ethz.ch/software/NEMix. PMID:25879530

  12. A Probabilistic Model for Predicting Attenuation of Viruses During Percolation in Unsaturated Natural Barriers

    NASA Astrophysics Data System (ADS)

    Faulkner, B. R.; Lyon, W. G.

    2001-12-01

    We present a probabilistic model for predicting virus attenuation. The solution employs the assumption of complete mixing. Monte Carlo methods are used to generate ensemble simulations of virus attenuation due to physical, biological, and chemical factors. The model generates a probability of failure to achieve 4-log attenuation. We tabulated data from related studies to develop probability density functions for input parameters, and utilized a database of soil hydraulic parameters based on the 12 USDA soil categories. Regulators can use the model based on limited information such as boring logs, climate data, and soil survey reports for a particular site of interest. Plackett-Burman sensitivity analysis indicated the most important main effects on probability of failure to achieve 4-log attenuation in our model were mean logarithm of saturated hydraulic conductivity (+0.396), mean water content (+0.203), mean solid-water mass transfer coefficient (-0.147), and the mean solid-water equilibrium partitioning coefficient (-0.144). Using the model, we predicted the probability of failure for a proposed one-meter-thick hydrogeologic barrier with a water content of 0.3. With the currently available data and the associated uncertainty, we predicted soils classified as sand would fail (p=0.999), silt loams would also fail (p=0.292), but soils classified as clays would provide the required 4-log attenuation (p=0.001). The model is extendible in the sense that probability density functions of parameters can be modified as future studies refine the uncertainty, and the lightweight object-oriented design of the computer model (implemented in Java) will facilitate reuse with modified classes. This is an abstract of a proposed presentation and does not necessarily reflect EPA policy.
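
    A minimal Monte Carlo sketch of estimating the probability of failing to achieve 4-log attenuation for a one-meter barrier. The input distributions and the simplified first-order attenuation relation are assumptions standing in for the tabulated probability density functions and process model of the study.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 50_000

# Assumed input distributions (placeholders for the tabulated pdfs of the study)
log_Ks = rng.normal(-5.0, 1.0, n)                      # log10 saturated hydraulic conductivity [m/s]
theta = rng.normal(0.30, 0.05, n).clip(0.05, 0.55)     # water content
k_att = rng.lognormal(mean=0.0, sigma=0.5, size=n)     # lumped attenuation rate [1/day]

thickness = 1.0                                        # barrier thickness [m]
velocity = (10.0 ** log_Ks) / theta * 86400            # pore-water velocity [m/day], unit gradient assumed
residence = thickness / velocity                       # travel time through the barrier [days]

log_reduction = k_att * residence / np.log(10)         # first-order decay expressed in log10 units
p_fail = np.mean(log_reduction < 4.0)
print(f"probability of failing 4-log attenuation: {p_fail:.3f}")
```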

  13. Direct integration transmittance model

    NASA Technical Reports Server (NTRS)

    Kunde, V. G.; Maguire, W. C.

    1973-01-01

    A transmittance model was developed for the 200-2000/cm region for interpretation of high spectral resolution measurements of laboratory absorption and of planetary thermal emission. The high spectral resolution requires transmittances to be computed monochromatically by summing the contribution of individual molecular absorption lines. A magnetic tape atlas of H2O, O3, and CO2 molecular line parameters serves as input to the transmittance model, with simple empirical representations used for continuum regions wherever suitable laboratory data exist. The theoretical formulation of the transmittance model and the computational procedures used for the evaluation of the transmittances are discussed. Application of the model to several homogeneous-path laboratory absorption examples is demonstrated.
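
    A minimal line-by-line sketch of the monochromatic approach described above: the absorption coefficient is built by summing Lorentz-broadened lines and transmittance follows from the Beer-Lambert law. The line positions, strengths, and widths are invented, not values from the H2O, O3, and CO2 atlas.

```python
import numpy as np

# Invented line parameters: position [cm^-1], strength, Lorentz half-width [cm^-1]
lines = [(650.0, 2.0, 0.08), (667.5, 5.0, 0.07), (700.2, 1.2, 0.10)]

nu = np.linspace(640.0, 710.0, 7000)        # monochromatic wavenumber grid
k = np.zeros_like(nu)                       # absorption coefficient spectrum
for nu0, S, gamma in lines:
    k += S * (gamma / np.pi) / ((nu - nu0) ** 2 + gamma ** 2)   # Lorentz profile

absorber_amount = 0.5                        # path absorber amount (arbitrary units)
transmittance = np.exp(-k * absorber_amount) # Beer-Lambert law, line by line
print("minimum transmittance on the grid: %.3f" % transmittance.min())
```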

  14. Probabilistic modelling of chromatin code landscape reveals functional diversity of enhancer-like chromatin states

    PubMed Central

    Zhou, Jian; Troyanskaya, Olga G.

    2016-01-01

    Interpreting the functional state of chromatin from the combinatorial binding patterns of chromatin factors, that is, the chromatin codes, is crucial for decoding the epigenetic state of the cell. Here we present a systematic map of Drosophila chromatin states derived from data-driven probabilistic modelling of dependencies between chromatin factors. Our model not only recapitulates enhancer-like chromatin states as indicated by widely used enhancer marks but also divides these states into three functionally distinct groups, of which only one specific group possesses active enhancer activity. Moreover, we discover a strong association between one specific enhancer state and RNA Polymerase II pausing, linking transcription regulatory potential and chromatin organization. We also observe that with the exception of long-intron genes, chromatin state transition positions in transcriptionally active genes align with an absolute distance to their corresponding transcription start site, regardless of gene length. Using our method, we provide a resource that helps elucidate the functional and spatial organization of the chromatin code landscape. PMID:26841971

  15. Marginal Probabilistic Modeling of the Delays in the Sensory Data Transmission of Networked Telerobots

    PubMed Central

    Gago-Benítez, Ana; Fernández-Madrigal, Juan-Antonio; Cruz-Martín, Ana

    2014-01-01

    Networked telerobots are remotely controlled through general purpose networks and components, which are highly heterogeneous and exhibit stochastic response times; however, their correct teleoperation requires a timely flow of information from sensors to remote stations. In order to guarantee these time requirements, a good on-line probabilistic estimation of the sensory transmission delays is needed. In many modern applications this estimation must be computationally highly efficient, e.g., when the system includes a web-based client interface. This paper studies marginal probability distributions that, under mild assumptions, can be a good approximation of the real distribution of the delays without using knowledge of their dynamics, are efficient to compute, and require only minor modifications to the networked robot. Since sequences of delays exhibit strong non-linearities in these networked applications, to satisfy the iid hypothesis required by the marginal approach we apply a change detection method. The results reported here indicate that some parametric models explain many more real scenarios well when this change detection method is used, while some non-parametric distributions have a very good rate of successful modeling in the case that non-linearity detection is not possible and the total delay is split into its three basic terms: server, network and client times. PMID:24481232

  16. Probabilistic Quantitative Precipitation Forecasting over East China using Bayesian Model Averaging

    NASA Astrophysics Data System (ADS)

    Yang, Ai; Yuan, Huiling

    2014-05-01

    Bayesian model averaging (BMA) is a post-processing method that weights the predictive probability density functions (PDFs) of individual ensemble members. This study investigates the BMA method for calibrating quantitative precipitation forecasts (QPFs) from The Observing System Research and Predictability Experiment (THORPEX) Interactive Grand Global Ensemble (TIGGE) database. The QPFs over East Asia during summer (June-August) 2008-2011 are generated from six operational ensemble prediction systems (EPSs), including ECMWF, UKMO, NCEP, CMC, JMA, CMA, and multi-center ensembles of their combinations. The satellite-based precipitation estimate product TRMM 3B42 V7 is used as the verification dataset. In the BMA post-processing for precipitation forecasts, the PDF matching method is first applied to bias-correct systematic errors in each forecast member, by adjusting PDFs of forecasts to match PDFs of observations. Next, a logistic regression and two-parameter gamma distribution are used to fit the probability of rainfall occurrence and the precipitation distribution. Through these two steps, the BMA post-processing bias-corrects ensemble forecasts systematically. The 60-70% cumulative distribution function (CDF) predictions estimate moderate precipitation well compared to the raw ensemble mean, while the 90% upper boundary of BMA CDF predictions can be set as a threshold for an extreme precipitation alarm. In general, the BMA method is more capable of multi-center ensemble post-processing, which improves probabilistic QPFs (PQPFs) with better ensemble spread and reliability. KEYWORDS: Bayesian model averaging (BMA); post-processing; ensemble forecast; TIGGE
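
    A minimal sketch of forming a BMA predictive distribution as a weighted mixture of per-member gamma PDFs after bias correction, and reading off an upper quantile as an extreme-precipitation threshold. The weights and gamma parameters are invented (they would normally be fitted, e.g. by EM), and the rain/no-rain logistic component is omitted for brevity.

```python
import numpy as np
from scipy import stats

# Bias-corrected forecasts from three ensemble members [mm], and their BMA weights
forecasts = np.array([8.0, 12.0, 20.0])
weights = np.array([0.5, 0.3, 0.2])              # would normally be fitted by EM

# Assumed member-conditional gamma PDFs whose means track the forecasts
shape = 2.0
scales = forecasts / shape

x = np.linspace(0.01, 60, 600)
bma_pdf = sum(w * stats.gamma.pdf(x, a=shape, scale=s) for w, s in zip(weights, scales))
bma_cdf = np.cumsum(bma_pdf) * (x[1] - x[0])     # numerical CDF of the mixture

# Probabilistic product: the 90% quantile as an extreme-precipitation alarm threshold
q90 = x[np.searchsorted(bma_cdf, 0.9)]
print(f"BMA 90th-percentile precipitation: {q90:.1f} mm")
```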

  17. a Simple Probabilistic, Biologically Informed Model of the Population Dynamics of Desert Shrubs

    NASA Astrophysics Data System (ADS)

    Worman, S.; Furbish, D. J.; Clarke, J. H.; Roberts, A. S.

    2010-12-01

    In arid environments, spatiotemporal variations in the processes of erosion and deposition are strongly coupled with the structure and dynamics of plant communities as well as the specific life behavior of individual plants. Understanding how physical transport processes affect the evolution of the land surface on geomorphic time-scales therefore requires considering how long-term changes in plant dynamics may in turn impact such processes. The development of this desert shrub population dynamics model is therefore motivated by the need to link rain-splash-induced mound building at the shrub scale with the unfolding ‘biological play’ occurring on a hillslope. Using the Master Equation to conserve shrub age, probabilistic and biologically informed statements for recruitment and mortality are formulated to function as source and sink terms, respectively. This simple accounting framework, by tracking the number of individuals entering and leaving a population, captures the changes in shrub count that can be expected in time as the key variables driving the dynamics of these plant communities (i.e. precipitation) also change in time. The result is a tool through which it is possible to statistically describe the aggregate spatiotemporal behavior of different shrub populations, with their own characteristic life-cycles and physical dimensions, under different external forcing scenarios. This model features inputs that have a solid biophysical basis and, insofar as it has the capacity to mimic key features of real processes, leads to outputs that appear consistent with findings reported in the literature.
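
    A minimal sketch of the accounting idea described above: shrub recruitment and mortality act as source and sink terms whose rates are modulated by a time-varying precipitation forcing, and the shrub count is tracked through time. All rates and the forcing are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(6)
years = 200
N = 50                                      # initial shrub count

counts = np.empty(years, dtype=int)
for t in range(years):
    # Illustrative precipitation forcing: slow cycle plus year-to-year noise
    precip = 1.0 + 0.4 * np.sin(2 * np.pi * t / 30) + 0.2 * rng.standard_normal()
    recruit_rate = max(0.0, 0.08 * precip)          # recruits per existing shrub per year
    mortality = min(1.0, 0.05 / max(precip, 0.2))   # per-shrub death probability
    births = rng.poisson(recruit_rate * N)          # source term
    deaths = rng.binomial(N, mortality)             # sink term
    N = max(N + births - deaths, 0)
    counts[t] = N

print("final count:", counts[-1], " mean over last 50 yr:", counts[-50:].mean())
```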

  18. Developing a probabilistic fire risk model and its application to fire danger systems

    NASA Astrophysics Data System (ADS)

    Penman, T.; Bradstock, R.; Caccamo, G.; Price, O.

    2012-04-01

    Wildfires can result in significant economic losses where they encounter human assets. Management agencies have large budgets devoted to both prevention and suppression of fires, but little is known about the extent to which they alter the probability of asset loss. Prediction of the risk of asset loss as a result of wildfire requires an understanding of a number of complex processes, from ignition to fire growth and impact on assets. These processes need to account for the additive or multiplicative effects of management, weather and the natural environment. Traditional analytical methods can examine only a small subset of these. Bayesian Belief Networks (BBNs) provide a methodology to examine complex environmental problems. Outcomes of a BBN are represented as likelihoods, which can then form the basis for risk analysis and management. Here we combine a range of data sources, including simulation models, empirical statistical analyses and expert opinion, to form a fire management BBN. Various management actions have been incorporated into the model, including landscape and interface prescribed burning, initial attack and fire suppression. Performance of the model has been tested against fire history datasets, with strong correlations being found. By adapting the BBN presented here, we are capable of developing a spatial and temporal fire danger rating system. Currently, Australian fire danger rating systems are based on the weather. Our model accounts for existing fires, as well as the risk of new ignitions combined with probabilistic weather forecasts, to identify those areas which are most at risk of asset loss. Fire growth is modelled with consideration given to management prevention efforts, as well as suppression resources that are available in each geographic locality. At a 10 km resolution, the model will provide a probability of asset loss, which represents a significant step forward in the level of information that can be provided to the general public.

  19. Explicitly integrating parameter, input, and structure uncertainties into Bayesian Neural Networks for probabilistic hydrologic forecasting

    SciTech Connect

    Zhang, Xuesong; Liang, Faming; Yu, Beibei; Zong, Ziliang

    2011-11-09

    Estimating the uncertainty of hydrologic forecasts is valuable to water resources management and other relevant decision-making processes. Recently, Bayesian Neural Networks (BNNs) have proved to be powerful tools for quantifying the uncertainty of streamflow forecasts. In this study, we propose a Markov Chain Monte Carlo (MCMC) framework to incorporate the uncertainties associated with input, model structure, and parameters into BNNs. This framework allows the structure of the neural networks to change by removing or adding connections between neurons and enables scaling of input data by using rainfall multipliers. The results show that the new BNNs outperform the BNNs that only consider uncertainties associated with parameters and model structure. Critical evaluation of the posterior distributions of neural network weights, the number of effective connections, rainfall multipliers, and hyper-parameters shows that the assumptions held in our BNNs are not well supported. Further understanding of the characteristics of different uncertainty sources and including output error into the MCMC framework are expected to enhance the application of neural networks for uncertainty analysis of hydrologic forecasting.
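
    A highly simplified Metropolis sketch of jointly sampling model parameters and a rainfall multiplier, using a tiny linear rainfall-runoff stand-in rather than a Bayesian neural network. Priors, proposal scales, and the synthetic data are assumptions.

```python
import numpy as np

rng = np.random.default_rng(8)

# Synthetic data: observed runoff depends on "true" rainfall, but the record is biased
true_mult = 1.3
rain_rec = rng.gamma(2.0, 5.0, size=60)              # recorded rainfall
q_obs = 0.6 * true_mult * rain_rec + rng.normal(0, 1.0, 60)

def log_post(theta):
    coef, mult = theta
    if coef <= 0 or mult <= 0:
        return -np.inf
    resid = q_obs - coef * mult * rain_rec
    loglik = -0.5 * np.sum(resid ** 2)               # unit error variance assumed
    logprior = -0.5 * ((mult - 1.0) / 0.5) ** 2      # rainfall multiplier expected near 1
    return loglik + logprior

theta, lp = np.array([1.0, 1.0]), None
lp = log_post(theta)
samples = []
for _ in range(5000):
    prop = theta + rng.normal(0, [0.02, 0.02])       # random-walk proposal
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:         # Metropolis acceptance rule
        theta, lp = prop, lp_prop
    samples.append(theta.copy())

samples = np.array(samples[1000:])                   # discard burn-in
print("posterior mean coefficient and multiplier:", samples.mean(axis=0))
```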

  20. Comparison of Control Approaches in Genetic Regulatory Networks by Using Stochastic Master Equation Models, Probabilistic Boolean Network Models and Differential Equation Models and Estimated Error Analyzes

    NASA Astrophysics Data System (ADS)

    Caglar, Mehmet Umut; Pal, Ranadip

    2011-03-01

    The central dogma of molecular biology states that ``information cannot be transferred back from protein to either protein or nucleic acid''. However, this assumption is not exactly correct in most cases. There are many feedback loops and interactions between different levels of the system. These types of interactions are hard to analyze due to the lack of cell-level data and the probabilistic, nonlinear nature of the interactions. Several models are widely used to analyze and simulate these types of nonlinear interactions. Stochastic Master Equation (SME) models capture the probabilistic nature of the interactions in a detailed manner, at a high computational cost. On the other hand, Probabilistic Boolean Network (PBN) models give a coarse-scale picture of the stochastic processes at a lower computational cost. Differential Equation (DE) models give the time evolution of the mean values of the processes in a highly cost-effective way. Understanding the relations between the predictions of these models is important for assessing the reliability of simulations of genetic regulatory networks. In this work, the success of the mapping between SME, PBN and DE models is analyzed, and the accuracy and effectiveness of the control policies generated by using the PBN and DE models are compared.
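
    A minimal sketch of a Probabilistic Boolean Network simulation in the spirit described above: each gene has several candidate Boolean predictor functions, one of which is chosen at random with fixed selection probabilities at every update step. The two-gene network and its probabilities are invented.

```python
import numpy as np

rng = np.random.default_rng(7)

# Two genes; each has candidate Boolean predictors with selection probabilities.
# A predictor maps the full state (x0, x1) to the gene's next value.
predictors = {
    0: [(lambda x: x[1],          0.7),   # gene 0 follows gene 1
        (lambda x: 1 - x[0],      0.3)],  # or toggles itself
    1: [(lambda x: x[0] and x[1], 0.6),
        (lambda x: x[0] or  x[1], 0.4)],
}

state = np.array([1, 0])
trajectory = [state.copy()]
for _ in range(10):
    new = state.copy()
    for gene, funcs in predictors.items():
        fs, ps = zip(*funcs)
        f = fs[rng.choice(len(fs), p=ps)]    # probabilistic choice of a predictor
        new[gene] = int(f(state))
    state = new
    trajectory.append(state.copy())

print(np.array(trajectory))
```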

  1. Methodology for Developing a Probabilistic Risk Assessment Model of Spacecraft Rendezvous and Dockings

    NASA Technical Reports Server (NTRS)

    Farnham, Steven J., II; Garza, Joel, Jr.; Castillo, Theresa M.; Lutomski, Michael

    2011-01-01

    In 2007 NASA was preparing to send two new visiting vehicles carrying logistics and propellant to the International Space Station (ISS). These new vehicles were the European Space Agency's (ESA) Automated Transfer Vehicle (ATV), the Jules Verne, and the Japan Aerospace Exploration Agency's (JAXA) H-II Transfer Vehicle (HTV). The ISS Program wanted to quantify the increased risk to the ISS from these visiting vehicles. At the time, only the Shuttle, the Soyuz, and the Progress vehicles rendezvoused and docked to the ISS. The increased risk to the ISS was from an increase in vehicle traffic, thereby increasing the potential for a catastrophic collision during the rendezvous and the docking or berthing of a spacecraft to the ISS. A universal method of evaluating the risk of rendezvous and docking or berthing was created by the ISS's Risk Team to accommodate the increasing number of rendezvous and docking or berthing operations due to the increasing number of different spacecraft, as well as the future arrival of commercial spacecraft. Before the first docking attempt of ESA's ATV and JAXA's HTV to the ISS, a probabilistic risk model was developed to quantitatively calculate the risk of collision of each spacecraft with the ISS. The 5 rendezvous and docking risk models (Soyuz, Progress, Shuttle, ATV, and HTV) have been used to build and refine the modeling methodology for the rendezvous and docking of spacecraft. This risk modeling methodology will be NASA's basis for evaluating the hazards of adding future ISS visiting spacecraft, including SpaceX's Dragon, Orbital Sciences' Cygnus, and NASA's own Orion spacecraft. This paper will describe the methodology used for developing a visiting vehicle risk model.

  2. Predicting carcinogenicity of diverse chemicals using probabilistic neural network modeling approaches

    SciTech Connect

    Singh, Kunwar P.; Gupta, Shikha; Rai, Premanjali

    2013-10-15

    Robust global models capable of discriminating positive and non-positive carcinogens and predicting the carcinogenic potency of chemicals in rodents were developed. A dataset of 834 structurally diverse chemicals, containing 466 positive and 368 non-positive carcinogens, was extracted from the Carcinogenic Potency Database (CPDB). Twelve non-quantum mechanical molecular descriptors were derived. Structural diversity of the chemicals and nonlinearity in the data were evaluated using the Tanimoto similarity index and Brock–Dechert–Scheinkman statistics. Probabilistic neural network (PNN) and generalized regression neural network (GRNN) models were constructed for classification and function optimization problems using the carcinogenicity end point in rat. Validation of the models was performed using internal and external procedures employing a wide series of statistical checks. The PNN constructed using five descriptors rendered a classification accuracy of 92.09% in the complete rat data. The PNN model rendered classification accuracies of 91.77%, 80.70% and 92.08% in mouse, hamster and pesticide data, respectively. The GRNN constructed with nine descriptors yielded a correlation coefficient of 0.896 between the measured and predicted carcinogenic potency, with a mean squared error (MSE) of 0.44, in the complete rat data. The rat carcinogenicity model (GRNN) applied to the mouse and hamster data yielded correlation coefficients and MSEs of 0.758 and 0.71, and 0.760 and 0.46, respectively. The results suggest wide applicability of the inter-species models in predicting the carcinogenic potency of chemicals. Both the PNN and GRNN (inter-species) models constructed here can be useful tools in predicting the carcinogenicity of new chemicals for regulatory purposes. - Graphical abstract: Figure (a) shows classification accuracies (positive and non-positive carcinogens) in rat, mouse, hamster, and pesticide data yielded by the optimal PNN model. Figure (b) shows generalization and predictive

  3. IceChrono1: a probabilistic model to compute a common and optimal chronology for several ice cores

    NASA Astrophysics Data System (ADS)

    Parrenin, Frédéric; Bazin, Lucie; Capron, Emilie; Landais, Amaëlle; Lemieux-Dudon, Bénédicte; Masson-Delmotte, Valérie

    2016-04-01

    Polar ice cores provide exceptional archives of past environmental conditions. The dating of ice cores and the estimation of the age scale uncertainty are essential to interpret the climate and environmental records that they contain. It is however a complex problem which involves different methods. Here, we present IceChrono1, a new probabilistic model integrating various sources of chronological information to produce a common and optimized chronology for several ice cores, as well as its uncertainty. IceChrono1 is based on the inversion of three quantities: the surface accumulation rate, the Lock-In Depth (LID) of air bubbles and the thinning function. The chronological information integrated into the model are: models of the sedimentation process (accumulation of snow, densification of snow into ice and air trapping, ice flow), ice and air dated horizons, ice and air depth intervals with known durations, Δdepth observations (depth shift between synchronous events recorded in the ice and in the air) and finally air and ice stratigraphic links in between ice cores. The optimization is formulated as a least squares problem, implying that all densities of probabilities are assumed to be Gaussian. It is numerically solved using the Levenberg-Marquardt algorithm and a numerical evaluation of the model's Jacobian. IceChrono follows an approach similar to that of the Datice model, which was recently used to produce the AICC2012 chronology for 4 Antarctic ice cores and 1 Greenland ice core. IceChrono1 provides improvements and simplifications with respect to Datice from the mathematical, numerical and programming points of view. The capabilities of IceChrono are demonstrated on a case study similar to the AICC2012 dating experiment. We find results similar to those of Datice, within a few centuries, which is a confirmation of both the IceChrono and Datice codes. We also test new functionalities with respect to the original version of Datice: observations as ice intervals
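
    A minimal sketch of the least-squares formulation mentioned above, solved with a Levenberg-Marquardt-type algorithm via scipy.optimize.least_squares. The toy forward model (constant accumulation with exponential thinning) and the synthetic dated horizons are assumptions, not IceChrono1's actual sedimentation model.

```python
import numpy as np
from scipy.optimize import least_squares

# Toy forward model: age at depth z for accumulation rate a and thinning rate b
def age(params, z):
    a, b = params
    return z / (a * np.exp(-b * z))      # annual-layer thickness decreases with depth

# Synthetic "dated horizons": depths with known ages and assumed 5% uncertainties
z_obs = np.array([100.0, 300.0, 600.0, 900.0])
age_obs = np.array([450.0, 1600.0, 4200.0, 8600.0])
sigma = 0.05 * age_obs

def residuals(params):
    return (age(params, z_obs) - age_obs) / sigma    # weighted misfit (least squares)

fit = least_squares(residuals, x0=[0.25, 0.0005], method="lm")   # Levenberg-Marquardt
print("estimated accumulation and thinning parameters:", fit.x)
```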

  4. Integrated modeling for the VLTI

    NASA Astrophysics Data System (ADS)

    Muller, Michael; Wilhelm, Rainer C.; Baier, Horst J.; Koch, Franz

    2004-07-01

    Within the scope of the Very Large Telescope Interferometer (VLTI) project, ESO has developed a software package for integrated modeling of single- and multi-aperture optical telescopes. Integrated modeling aims at time-dependent system analysis combining different technical disciplines (optics, mechanical structure, control system with sensors and actuators, environmental disturbances). This allows multi-disciplinary analysis and gives information about cross-coupling effects for system engineering of complex stellar interferometers and telescopes. At the moment, the main components of the Integrated Modeling Toolbox are BeamWarrior, a numerical tool for optical analysis of single- and multi-aperture telescopes, and the Structural Modeling Interface, which allows Simulink blocks of reduced size to be generated from Finite Element Models of a telescope structure. Based on these tools, models of the various subsystems (e.g. telescope, delay line, beam combiner, atmosphere) can be created in the appropriate disciplines (e.g. optics, structure, disturbance). All subsystem models are integrated into the Matlab/Simulink environment for dynamic control system simulations. The basic output of the model is a complete description of the time-dependent electromagnetic field in each interferometer arm. Alternatively, a more elaborate output can be created, such as an interference fringe pattern at the focus of a beam combining instrument. The concern of this paper is the application of the modeling concept to large complex telescope systems. The concept of the Simulink-based integrated model with the main components telescope structure, optics and control loops is presented. The models for wind loads and atmospheric turbulence are explained. In particular, the extension of the modeling approach to a 50 - 100 m class telescope is discussed.

  5. Probabilistic Movement Models Show that Postural Control Precedes and Predicts Volitional Motor Control.

    PubMed

    Rueckert, Elmar; Čamernik, Jernej; Peters, Jan; Babič, Jan

    2016-01-01

    Human motor skill learning is driven by the necessity to adapt to new situations. While supportive contacts are essential for many tasks, little is known about their impact on motor learning. To study the effect of contacts an innovative full-body experimental paradigm was established. The task of the subjects was to reach for a distant target while postural stability could only be maintained by establishing an additional supportive hand contact. To examine adaptation, non-trivial postural perturbations of the subjects' support base were systematically introduced. A novel probabilistic trajectory model approach was employed to analyze the correlation between the motions of both arms and the trunk. We found that subjects adapted to the perturbations by establishing target dependent hand contacts. Moreover, we found that the trunk motion adapted significantly faster than the motion of the arms. However, the most striking finding was that observations of the initial phase of the left arm or trunk motion (100-400 ms) were sufficient to faithfully predict the complete movement of the right arm. Overall, our results suggest that the goal-directed arm movements determine the supportive arm motions and that the motion of heavy body parts adapts faster than the light arms. PMID:27328750

  6. Detection of prostate cancer on histopathology using color fractals and Probabilistic Pairwise Markov models.

    PubMed

    Yu, Elaine; Monaco, James P; Tomaszewski, John; Shih, Natalie; Feldman, Michael; Madabhushi, Anant

    2011-01-01

    In this paper we present a system for detecting regions of carcinoma of the prostate (CaP) in H&E stained radical prostatectomy specimens using the color fractal dimension. Color textural information is known to be a valuable characteristic to distinguish CaP from benign tissue. In addition to color information, we know that cancer tends to form contiguous regions. Our system leverages the color staining information of histology as well as spatial dependencies. The color and textural information is first captured using color fractal dimension. To incorporate spatial dependencies, we combine the probability map constructed via color fractal dimension with a novel Markov prior called the Probabilistic Pairwise Markov Model (PPMM). To demonstrate the capability of this CaP detection system, we applied the algorithm to 27 radical prostatectomy specimens from 10 patients. A per-pixel evaluation was conducted with ground truth provided by an expert pathologist using only the color fractal feature first, yielding an area under the receiver operating characteristic (ROC) curve (AUC) of 0.790. In conjunction with a Markov prior, the resultant color fractal dimension + Markov random field (MRF) classifier yielded an AUC of 0.831. PMID:22255076

  7. Probabilistic Modeling of Landfill Subsidence Introduced by Buried Structure Collapse - 13229

    SciTech Connect

    Foye, Kevin; Soong, Te-Yang

    2013-07-01

    The long-term reliability of land disposal facility final cover systems - and therefore the overall waste containment - depends on the distortions imposed on these systems by differential settlement/subsidence. The evaluation of differential settlement is challenging because of the heterogeneity of the waste mass and buried structure placement. Deterministic approaches to long-term final cover settlement prediction are not able to capture the spatial variability in the waste mass and sub-grade properties, especially discontinuous inclusions, which control differential settlement. An alternative is to use a probabilistic model to capture the non-uniform collapse of cover soils and buried structures and the subsequent effect of that collapse on the final cover system. Both techniques are applied to the problem of two side-by-side waste trenches with collapsible voids. The results show how this analytical technique can be used to connect a metric of final cover performance (inundation area) to the susceptibility of the sub-grade to collapse and the effective thickness of the cover soils. This approach allows designers to specify cover thickness, reinforcement, and slope to meet the demands imposed by the settlement of the underlying waste trenches. (authors)

  8. Probabilistic Movement Models Show that Postural Control Precedes and Predicts Volitional Motor Control

    PubMed Central

    Rueckert, Elmar; Čamernik, Jernej; Peters, Jan; Babič, Jan

    2016-01-01

    Human motor skill learning is driven by the necessity to adapt to new situations. While supportive contacts are essential for many tasks, little is known about their impact on motor learning. To study the effect of contacts an innovative full-body experimental paradigm was established. The task of the subjects was to reach for a distant target while postural stability could only be maintained by establishing an additional supportive hand contact. To examine adaptation, non-trivial postural perturbations of the subjects’ support base were systematically introduced. A novel probabilistic trajectory model approach was employed to analyze the correlation between the motions of both arms and the trunk. We found that subjects adapted to the perturbations by establishing target dependent hand contacts. Moreover, we found that the trunk motion adapted significantly faster than the motion of the arms. However, the most striking finding was that observations of the initial phase of the left arm or trunk motion (100–400 ms) were sufficient to faithfully predict the complete movement of the right arm. Overall, our results suggest that the goal-directed arm movements determine the supportive arm motions and that the motion of heavy body parts adapts faster than the light arms. PMID:27328750

  9. Characterizing the International Migration Barriers with a Probabilistic Multilateral Migration Model.

    PubMed

    Li, Xiaomeng; Xu, Hongzhong; Chen, Jiawei; Chen, Qinghua; Zhang, Jiang; Di, Zengru

    2016-01-01

    Human migration is responsible for forming modern civilization and has had an important influence on the development of various countries. There are many issues worth researching, and "the reason to move" is the most basic one. The concept of migration cost in the classical self-selection theory, which was introduced by Roy and Borjas, is useful. However, migration cost cannot address global migration because of the limitations of deterministic and bilateral choice. Following the idea of migration cost, this paper developed a new probabilistic multilateral migration model by introducing the Boltzmann factor from statistical physics. After characterizing the underlying mechanism or driving force of human mobility, we reveal some interesting facts that have provided a deeper understanding of international migration, such as the negative correlation between migration costs for emigrants and immigrants and a global classification with clear regional and economic characteristics, based on clustering of migration cost vectors. In addition, we deconstruct the migration barriers using regression analysis and find that the influencing factors are complicated but can be partly (12.5%) described by several macro indexes, such as the GDP growth of the destination country, the GNI per capita and the HDI of both the source and destination countries. PMID:27597319
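
    A minimal sketch of the Boltzmann-factor idea described above: migration costs are mapped to multilateral choice probabilities by exponential weighting and normalisation. The destination costs and the temperature-like parameter are invented values.

```python
import numpy as np

# Invented migration "costs" from one source country to four destinations
destinations = ["A", "B", "C", "D"]
cost = np.array([2.0, 3.5, 1.2, 5.0])
beta = 1.0                               # inverse-temperature-like parameter

weights = np.exp(-beta * cost)           # Boltzmann factor for each destination
prob = weights / weights.sum()           # multilateral (not just bilateral) choice probabilities
for d, p in zip(destinations, prob):
    print(f"P(move to {d}) = {p:.3f}")
```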

  10. A Gaussian Model-Based Probabilistic Approach for Pulse Transit Time Estimation.

    PubMed

    Jang, Dae-Geun; Park, Seung-Hun; Hahn, Minsoo

    2016-01-01

    In this paper, we propose a new probabilistic approach to pulse transit time (PTT) estimation using a Gaussian distribution model. It is motivated basically by the hypothesis that PTTs normalized by RR intervals follow the Gaussian distribution. To verify the hypothesis, we demonstrate the effects of arterial compliance on the normalized PTTs using the Moens-Korteweg equation. Furthermore, we observe a Gaussian distribution of the normalized PTTs on real data. In order to estimate the PTT using the hypothesis, we first assumed that R-waves in the electrocardiogram (ECG) can be correctly identified. The R-waves limit searching ranges to detect pulse peaks in the photoplethysmogram (PPG) and to synchronize the results with cardiac beats--i.e., the peaks of the PPG are extracted within the corresponding RR interval of the ECG as pulse peak candidates. Their probabilities of being the actual pulse peak are then calculated using a Gaussian probability function. The parameters of the Gaussian function are automatically updated when a new pulse peak is identified. This update makes the probability function adaptive to variations of cardiac cycles. Finally, the pulse peak is identified as the candidate with the highest probability. The proposed approach is tested on a database where ECG and PPG waveforms are collected simultaneously during the submaximal bicycle ergometer exercise test. The results are promising, suggesting that the method provides a simple but more accurate PTT estimation in real applications. PMID:25420274
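
    A minimal sketch of the candidate-scoring idea described above: PPG peak candidates within an RR interval are scored by a Gaussian probability of their RR-normalised PTT, the best candidate is selected, and the Gaussian parameters are then updated adaptively. The signals, starting parameters, and update rule are assumptions.

```python
import numpy as np

def score_candidates(r_time, rr, cand_times, mu, sigma):
    """Gaussian probability of each candidate's RR-normalised PTT."""
    ptt_norm = (cand_times - r_time) / rr
    return np.exp(-0.5 * ((ptt_norm - mu) / sigma) ** 2)

mu, sigma = 0.30, 0.05            # running estimates of normalised PTT (assumed start values)
r_time, rr = 10.00, 0.80          # current R-wave time [s] and RR interval [s]
cand_times = np.array([10.18, 10.25, 10.41])    # PPG peak candidates within this RR interval

p = score_candidates(r_time, rr, cand_times, mu, sigma)
best = cand_times[p.argmax()]
print("selected pulse peak at t = %.2f s" % best)

# Adaptive update of the Gaussian parameters with the accepted peak (simple running average)
alpha = 0.1
ptt_new = (best - r_time) / rr
mu = (1 - alpha) * mu + alpha * ptt_new
sigma = (1 - alpha) * sigma + alpha * abs(ptt_new - mu)
print("updated mu = %.3f, sigma = %.3f" % (mu, sigma))
```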

  11. Characterizing the International Migration Barriers with a Probabilistic Multilateral Migration Model

    PubMed Central

    Li, Xiaomeng; Xu, Hongzhong; Chen, Jiawei; Chen, Qinghua; Zhang, Jiang; Di, Zengru

    2016-01-01

    Human migration is responsible for forming modern civilization and has had an important influence on the development of various countries. There are many issues worth researching, and “the reason to move” is the most basic one. The concept of migration cost in the classical self-selection theory, which was introduced by Roy and Borjas, is useful. However, migration cost cannot address global migration because of the limitations of deterministic and bilateral choice. Following the idea of migration cost, this paper developed a new probabilistic multilateral migration model by introducing the Boltzmann factor from statistical physics. After characterizing the underlying mechanism or driving force of human mobility, we reveal some interesting facts that have provided a deeper understanding of international migration, such as the negative correlation between migration costs for emigrants and immigrants and a global classification with clear regional and economic characteristics, based on clustering of migration cost vectors. In addition, we deconstruct the migration barriers using regression analysis and find that the influencing factors are complicated but can be partly (12.5%) described by several macro indexes, such as the GDP growth of the destination country, the GNI per capita and the HDI of both the source and destination countries. PMID:27597319

  12. Modeling of constructional elements fragmentation: 3-D statement and probabilistic approach

    NASA Astrophysics Data System (ADS)

    Gerasimov, Alexander; Pashkov, Sergey

    2011-06-01

    The heterogeneity of the structure of real materials, which influences the distribution of material characteristics, is one of the factors determining the character of destruction. This factor can be introduced into the equations of the mechanics of deformable solids by using probabilistic laws for the distribution of characteristics over the volume of the considered structure. The following problems are considered: explosive fragmentation of open and closed shells; punching of a thick plate by an HE-charged shell at normal incidence and at an angle; fragmentation of a plate and a shell after plate piercing and under HE charge explosion; punching of a thin barrier at normal incidence and at an angle; crushing of metal rings fitted on a copper tube; and high-speed impact of laminated, spaced metallic plates by steel spheres modeling the debris of space bodies and artificial objects. The processes are calculated taking material heterogeneity into account. To calculate elastoplastic flows and detonation products, we used a technique implemented on tetrahedral cells and based on the Wilkins method for the calculation of the internal points of a body and on the Johnson method for the calculation of contact interactions.

  13. Role of Probabilistic Micromechanics Modeling in Establishing Design Allowables in Composites

    NASA Technical Reports Server (NTRS)

    Mital, Subodh K.; Keith, Theo G., Jr.; Murthy, Pappu L. N.; Brewer, David N.

    2005-01-01

    One of the major challenges in designing with any new material, and particularly with advanced composite materials, is the fidelity of material design allowables. In the case of composite materials, the concern arises from the inherent nature of these materials, i.e., their heterogeneous make-up and the various factors that affect their properties in a specific design environment. Composites have various scales - micro, macro, laminate and structural, as well as numerous other fabrication-related parameters. Many advanced composites in aerospace applications involve complex two- and three-dimensional fiber architectures and require high-temperature processing. Since there are uncertainties associated with each of these, the observed behavior of composite materials shows scatter. Evaluating the effect of each of these variables on the observed scatter in composite properties solely by testing is cost and time prohibitive. One alternative is to evaluate these effects by computational simulation. The authors have developed probabilistic composite micromechanics techniques by combining woven composite micromechanics and Fast Probability Integration (FPI) techniques to address these issues. In this paper, these techniques will be described and demonstrated through selected examples. Results in the form of cumulative distribution functions (CDF) of the composite properties of a MI (melt-infiltrated) SiC/SiC (silicon carbide fiber in a silicon carbide matrix) composite will be presented. A CDF is a relationship defined by the value of the property (the response variable) with respect to the cumulative probability of occurrence. Furthermore, input variables causing scatter are identified and ranked based upon their sensitivity magnitude. Sensitivity information is very valuable in quality control. How these results can be utilized to develop design allowables so that these materials may be used by structural analysts/designers will also be discussed.

  14. INTEGRATED PLANNING MODEL - EPA APPLICATIONS

    EPA Science Inventory

    The Integrated Planning Model (IPM) is a multi-regional, dynamic, deterministic linear programming (LP) model of the electric power sector in the continental lower 48 states and the District of Columbia. It provides forecasts up to year 2050 of least-cost capacity expansion, elec...

  15. A probabilistic model for the identification of confinement regimes and edge localized mode behavior, with implications to scaling laws

    SciTech Connect

    Verdoolaege, Geert; Van Oost, Guido

    2012-10-15

    Pattern recognition is becoming an important tool in fusion data analysis. However, fusion diagnostic measurements are often affected by considerable statistical uncertainties, rendering the extraction of useful patterns a significant challenge. Therefore, we assume a probabilistic model for the data and perform pattern recognition in the space of probability distributions. We show the considerable advantage of our method for identifying confinement regimes and edge localized mode behavior, and we discuss the potential for scaling laws.

  16. A Logical Model of Conceptual Integrity in Data Integration

    PubMed Central

    Flater, David

    2003-01-01

    Conceptual integrity is required for the result of data integration to be cohesive and sensible. Compromised conceptual integrity results in “semantic faults,” which are commonly blamed for latent integration bugs. A logical model of conceptual integrity in data integration and a simple example application are presented. Unlike constructive models that attempt to prevent semantic faults, this model allows both correct and incorrect integrations to be described. Imperfect legacy systems can therefore be modeled, allowing a more formal analysis of their flaws and the possible remedies.

  17. Separations and safeguards model integration.

    SciTech Connect

    Cipiti, Benjamin B.; Zinaman, Owen

    2010-09-01

    Research and development of advanced reprocessing plant designs can greatly benefit from the development of a reprocessing plant model capable of transient solvent extraction chemistry. This type of model can be used to optimize the operations of a plant as well as the designs for safeguards, security, and safety. Previous work has integrated a transient solvent extraction simulation module, based on the Solvent Extraction Process Having Interaction Solutes (SEPHIS) code developed at Oak Ridge National Laboratory, with the Separations and Safeguards Performance Model (SSPM) developed at Sandia National Laboratory, as a first step toward creating a more versatile design and evaluation tool. The goal of this work was to strengthen the integration by linking more variables between the two codes. The results from this integrated model show expected operational performance through plant transients. Additionally, ORIGEN source term files were integrated into the SSPM to provide concentrations, radioactivity, neutron emission rate, and thermal power data for various spent fuels. This data was used to generate measurement blocks that can determine the radioactivity, neutron emission rate, or thermal power of any stream or vessel in the plant model. This work examined how the code could be expanded to integrate other separation steps and benchmark the results to other data. Recommendations for future work will be presented.

  18. Multidisciplinary design optimization of a fighter aircraft with damage tolerance constraints and a probabilistic model of the fatigue environment

    NASA Astrophysics Data System (ADS)

    Arrieta, Albert Joseph

    2001-07-01

    Damage tolerance analysis (DTA) was considered in the global design optimization of an aircraft wing structure. Residual strength and fatigue life requirements, based on the damage tolerance philosophy, were investigated as new design constraints. In general, accurate fatigue prediction is difficult if the load environment is not known with a high degree of certainty. To address this issue, a probabilistic approach was used to describe the uncertain load environment. Probabilistic load spectra models were developed from flight recorder data. The global/local finite element approach allowed local fatigue requirements to be considered in the global design optimization. AFGROW fatigue crack growth analysis provided a new strength criterion for satisfying damage tolerance requirements within a global optimization environment. Initial research with the ASTROS program used the probabilistic load model and this damage tolerance constraint to optimize cracked skin panels on the lower wing of a fighter/attack aircraft. For an aerodynamic and structural model similar to an F-16, ASTROS simulated symmetric and asymmetric maneuvers during the optimization. Symmetric maneuvers, without underwing stores, produced the highest stresses and drove the optimization of the inboard lower wing skin. Asymmetric maneuvers, with underwing stores, affected the optimum thickness of the outboard hard points. Subsequent design optimizations included von Mises stress, aileron effectiveness, and lift effectiveness constraints simultaneously. This optimization was driven by the DTA and von Mises stress constraints and, therefore, DTA requirements can have an active role to play in preliminary aircraft design.

  19. A probabilistic transmission model to assess infection risk from Mycobacterium tuberculosis in commercial passenger trains.

    PubMed

    Chen, Szu-Chieh; Liao, Chung-Min; Li, Sih-syuan; You, Shu-Han

    2011-06-01

    The objective of this article is to characterize the risk of infection from airborne Mycobacterium tuberculosis bacilli exposure in commercial passenger trains based on a risk-based probabilistic transmission model. We investigated the tuberculosis (TB) infection risks among commercial passengers from inhaled aerosol M. tuberculosis bacilli and quantified the patterns of TB transmission in Taiwan High Speed Rail (THSR). A deterministic Wells-Riley mathematical model was used to account for the probability of infection risk from M. tuberculosis bacilli by linking the cough-generated aerosol M. tuberculosis bacilli concentration and particle size distribution. We found that (i) the quantum generation rate of TB was estimated to follow a lognormal distribution with a geometric mean (GM) of 54.29 and a geometric standard deviation (GSD) of 3.05 quanta/h at particle sizes ≤ 5 μm and (ii) the basic reproduction numbers (R(0)) were estimated to be 0.69 (0.06-6.79), 2.82 (0.32-20.97), and 2.31 (0.25-17.69) for business, standard, and nonreserved cabins, respectively. The results indicate that commercial passengers taking standard and nonreserved cabins had a higher transmission risk than those in business cabins based on conservatism. Our results also reveal that even a brief exposure, as in the bronchoscopy cases, can result in transmission when the quantum generation rate is high. This study could contribute to a better understanding of the dynamics of TB transmission in commercial passenger trains by assessing the relationship between TB infectiousness, passenger mobility, and key model parameters such as seat occupancy, ventilation rate, and exposure duration. PMID:21175727
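
    To make the Wells-Riley step concrete, the sketch below samples the quantum generation rate from the lognormal distribution reported above (GM 54.29, GSD 3.05 quanta/h) and computes the infection probability P = 1 - exp(-I q p t / Q), scaling it by the number of susceptibles to get a crude R0. The breathing rate, ventilation rate, exposure time, and occupancy are assumed placeholder values, not the THSR cabin parameters.

      import numpy as np

      rng = np.random.default_rng(42)

      def wells_riley_infection_prob(q, p, t, Q, n_infectors=1):
          # Wells-Riley probability of infection: P = 1 - exp(-I q p t / Q).
          return 1.0 - np.exp(-n_infectors * q * p * t / Q)

      # Quantum generation rate from the abstract: lognormal, GM = 54.29 quanta/h,
      # GSD = 3.05 (numpy's lognormal takes the mean and sigma of log(q)).
      q = rng.lognormal(np.log(54.29), np.log(3.05), 10_000)

      p = 0.6      # breathing rate, m^3/h (assumed)
      t = 1.5      # exposure duration, h (assumed trip length)
      Q = 1200.0   # cabin ventilation rate, m^3/h (assumed)
      S = 60       # susceptible passengers sharing the cabin (assumed)

      P = wells_riley_infection_prob(q, p, t, Q)
      R0 = S * P   # expected secondary cases per infectious passenger

      print(f"median infection probability: {np.median(P):.4f}")
      print(f"R0 median (2.5-97.5%): {np.median(R0):.2f} "
            f"({np.percentile(R0, 2.5):.2f}-{np.percentile(R0, 97.5):.2f})")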

  20. SEX-DETector: A Probabilistic Approach to Study Sex Chromosomes in Non-Model Organisms.

    PubMed

    Muyle, Aline; Käfer, Jos; Zemp, Niklaus; Mousset, Sylvain; Picard, Franck; Marais, Gabriel Ab

    2016-01-01

    We propose a probabilistic framework to infer autosomal and sex-linked genes from RNA-seq data of a cross for any sex chromosome type (XY, ZW, and UV). Sex chromosomes (especially the non-recombining and repeat-dense Y, W, U, and V) are notoriously difficult to sequence. Strategies have been developed to obtain partially assembled sex chromosome sequences. Most of them remain difficult to apply to numerous non-model organisms, either because they require a reference genome, or because they are designed for evolutionarily old systems. Sequencing a cross (parents and progeny) by RNA-seq to study the segregation of alleles and infer sex-linked genes is a cost-efficient strategy, which also provides expression level estimates. However, the lack of a proper statistical framework has limited a broader application of this approach. Tests on empirical Silene data show that our method identifies 20-35% more sex-linked genes than existing pipelines, while making reliable inferences for downstream analyses. Approximately 12 individuals are needed for optimal results based on simulations. For species with an unknown sex-determination system, the method can assess the presence and type (XY vs. ZW) of sex chromosomes through a model comparison strategy. The method is particularly well optimized for sex chromosomes of young or intermediate age, which are expected in thousands of yet unstudied lineages. Any organisms, including non-model ones for which nothing is known a priori, that can be bred in the lab, are suitable for our method. SEX-DETector and its implementation in a Galaxy workflow are made freely available. PMID:27492231

  1. SEX-DETector: A Probabilistic Approach to Study Sex Chromosomes in Non-Model Organisms

    PubMed Central

    Muyle, Aline; Käfer, Jos; Zemp, Niklaus; Mousset, Sylvain; Picard, Franck; Marais, Gabriel AB

    2016-01-01

    We propose a probabilistic framework to infer autosomal and sex-linked genes from RNA-seq data of a cross for any sex chromosome type (XY, ZW, and UV). Sex chromosomes (especially the non-recombining and repeat-dense Y, W, U, and V) are notoriously difficult to sequence. Strategies have been developed to obtain partially assembled sex chromosome sequences. Most of them remain difficult to apply to numerous non-model organisms, either because they require a reference genome, or because they are designed for evolutionarily old systems. Sequencing a cross (parents and progeny) by RNA-seq to study the segregation of alleles and infer sex-linked genes is a cost-efficient strategy, which also provides expression level estimates. However, the lack of a proper statistical framework has limited a broader application of this approach. Tests on empirical Silene data show that our method identifies 20–35% more sex-linked genes than existing pipelines, while making reliable inferences for downstream analyses. Approximately 12 individuals are needed for optimal results based on simulations. For species with an unknown sex-determination system, the method can assess the presence and type (XY vs. ZW) of sex chromosomes through a model comparison strategy. The method is particularly well optimized for sex chromosomes of young or intermediate age, which are expected in thousands of yet unstudied lineages. Any organisms, including non-model ones for which nothing is known a priori, that can be bred in the lab, are suitable for our method. SEX-DETector and its implementation in a Galaxy workflow are made freely available. PMID:27492231

  2. Time Analysis for Probabilistic Workflows

    SciTech Connect

    Czejdo, Bogdan; Ferragut, Erik M

    2012-01-01

    There are many theoretical and practical results in the area of workflow modeling, especially when more formal workflows are used. In this paper we focus on probabilistic workflows. We present algorithms for time computations in probabilistic workflows. With activity times modeled more precisely, work cooperation can be improved and analyzed, including through simulation and visualization.
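
    A minimal sketch of time computation in a probabilistic workflow: a toy sequence with one probabilistic branch whose expected completion time is computed both analytically and by Monte Carlo simulation. The workflow structure, branch probability, and duration parameters are all illustrative assumptions, not examples from the paper.

      import numpy as np

      rng = np.random.default_rng(0)

      # A toy workflow: activity A, then a probabilistic branch to B (p = 0.7)
      # or C (p = 0.3), then D.  Durations are (mean, std) in minutes.
      durations = {"A": (10, 2), "B": (30, 5), "C": (5, 1), "D": (8, 2)}
      p_branch_B = 0.7

      def sample_duration(name):
          mean, std = durations[name]
          return max(0.0, rng.normal(mean, std))

      def simulate_once():
          total = sample_duration("A")
          total += sample_duration("B" if rng.random() < p_branch_B else "C")
          total += sample_duration("D")
          return total

      samples = np.array([simulate_once() for _ in range(50_000)])

      # Analytical expectation of the same workflow for comparison.
      expected = (durations["A"][0]
                  + p_branch_B * durations["B"][0]
                  + (1 - p_branch_B) * durations["C"][0]
                  + durations["D"][0])

      print(f"Monte Carlo mean: {samples.mean():.2f} min, analytical: {expected:.2f} min")
      print(f"95th percentile completion time: {np.percentile(samples, 95):.2f} min")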

  3. An empirical model for probabilistic decadal prediction: global attribution and regional hindcasts

    NASA Astrophysics Data System (ADS)

    Suckling, Emma B.; van Oldenborgh, Geert Jan; Eden, Jonathan M.; Hawkins, Ed

    2016-07-01

    Empirical models, designed to predict surface variables over seasons to decades ahead, provide useful benchmarks for comparison against the performance of dynamical forecast systems; they may also be employable as predictive tools for use by climate services in their own right. A new global empirical decadal prediction system is presented, based on a multiple linear regression approach designed to produce probabilistic output for comparison against dynamical models. A global attribution is performed initially to identify the important forcing and predictor components of the model. Ensemble hindcasts of surface air temperature anomaly fields are then generated, based on the forcings and predictors identified as important, under a series of different prediction 'modes' and their performance is evaluated. The modes include a real-time setting, a scenario in which future volcanic forcings are prescribed during the hindcasts, and an approach which exploits knowledge of the forced trend. A two-tier prediction system, which uses knowledge of future sea surface temperatures in the Pacific and Atlantic Oceans, is also tested, but within a perfect knowledge framework. Each mode is designed to identify sources of predictability and uncertainty, as well as investigate different approaches to the design of decadal prediction systems for operational use. It is found that the empirical model shows skill above that of persistence hindcasts for annual means at lead times of up to 10 years ahead in all of the prediction modes investigated. It is suggested that hindcasts which exploit full knowledge of the forced trend due to increasing greenhouse gases throughout the hindcast period can provide more robust estimates of model bias for the calibration of the empirical model in an operational setting. The two-tier system shows potential for improved real-time prediction, given the assumption that skilful predictions of large-scale modes of variability are available. The empirical
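
    The core regression-plus-ensemble idea can be sketched as follows: fit a multiple linear regression of the temperature anomaly on forcing and predictor series, then turn the point hindcast into a probabilistic one by adding resampled residuals. All series below are synthetic stand-ins; the study uses observed forcings, predictors, and temperature fields.

      import numpy as np

      rng = np.random.default_rng(7)

      # Synthetic stand-ins for the forcing/predictor series and the observed
      # temperature anomaly (purely illustrative, not the paper's inputs).
      years = np.arange(1960, 2011)
      ghg = 0.02 * (years - years[0])                            # trend-like forcing
      volcanic = np.where(np.isin(years, [1963, 1982, 1991]), -0.3, 0.0)
      enso = rng.normal(0.0, 1.0, years.size)                    # predictor mode
      obs = 0.9 * ghg + 0.8 * volcanic + 0.1 * enso + rng.normal(0, 0.08, years.size)

      # Multiple linear regression by least squares.
      X = np.column_stack([np.ones_like(ghg), ghg, volcanic, enso])
      beta, *_ = np.linalg.lstsq(X, obs, rcond=None)
      residuals = obs - X @ beta

      # Probabilistic hindcast for a lead year: point forecast plus resampled
      # residuals gives an ensemble comparable against dynamical models.
      x_new = np.array([1.0, 0.02 * (2015 - years[0]), 0.0, 0.5])
      ensemble = x_new @ beta + rng.choice(residuals, size=1000, replace=True)
      print(f"hindcast mean {ensemble.mean():.2f} K, 5-95% range "
            f"[{np.percentile(ensemble, 5):.2f}, {np.percentile(ensemble, 95):.2f}]")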

  4. Does probabilistic modelling of linkage disequilibrium evolution improve the accuracy of QTL location in animal pedigree?

    PubMed Central

    2010-01-01

    Background Since 2001, the use of increasingly dense maps has made researchers aware that combining linkage and linkage disequilibrium enhances the feasibility of fine-mapping genes of interest. Various types of methods have therefore been derived to include concepts of population genetics in the analyses. One major drawback of many of these methods is their computational cost, which is very significant when many markers are considered. Recent advances in technology, such as SNP genotyping, have made it possible to deal with huge amounts of data. Thus, the challenge that remains is to find accurate and efficient methods that are not too time-consuming. The study reported here specifically focuses on the half-sib family animal design. Our objective was to determine whether modelling of linkage disequilibrium evolution improved the mapping accuracy of a quantitative trait locus of agricultural interest in these populations. We compared two methods of fine-mapping. The first one was an association analysis. In this method, we did not model linkage disequilibrium evolution. Therefore, the modelling of the evolution of linkage disequilibrium was a deterministic process; it was complete at time 0 and remained complete during the following generations. In the second method, the modelling of the evolution of population allele frequencies was derived from a Wright-Fisher model. We simulated a wide range of scenarios adapted to animal populations and compared these two methods for each scenario. Results Our results indicated that the improvement produced by probabilistic modelling of linkage disequilibrium evolution was not significant. Both methods led to similar results concerning the location accuracy of quantitative trait loci, which appeared to be mainly improved by using four flanking markers instead of two. Conclusions Therefore, in animal half-sib designs, modelling linkage disequilibrium evolution using a Wright-Fisher model does not significantly improve the accuracy of the
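
    The probabilistic ingredient that distinguishes the second method is Wright-Fisher resampling of allele frequencies, sketched below: each generation, the 2N gene copies are drawn binomially from the current frequency, so linkage disequilibrium decays stochastically rather than deterministically. Population size, generation count, and starting frequency are illustrative, not values from the study.

      import numpy as np

      rng = np.random.default_rng(3)

      def wright_fisher(p0, n_individuals, n_generations, n_replicates):
          # Drift of an allele frequency under a Wright-Fisher model: each
          # generation the 2N gene copies are resampled binomially from the
          # current frequency.
          n_copies = 2 * n_individuals
          p = np.full(n_replicates, p0)
          for _ in range(n_generations):
              p = rng.binomial(n_copies, p) / n_copies
          return p

      final = wright_fisher(p0=0.3, n_individuals=100, n_generations=50,
                            n_replicates=10_000)
      print(f"mean frequency after 50 generations: {final.mean():.3f}")
      print(f"fraction of replicates where the allele was lost: {(final == 0).mean():.3f}")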

  5. Using simple chaotic models to interpret climate under climate change: Implications for probabilistic climate prediction

    NASA Astrophysics Data System (ADS)

    Daron, Joseph

    2010-05-01

    Exploring the reliability of model-based projections is an important precursor to evaluating their societal relevance. In order to better inform decisions concerning adaptation (and mitigation) to climate change, we must investigate whether or not our models are capable of replicating the dynamic nature of the climate system. Whilst uncertainty is inherent within climate prediction, establishing and communicating what is plausible as opposed to what is likely is the first step to ensuring that climate-sensitive systems are robust to climate change. Climate prediction centers are moving towards probabilistic projections of climate change at regional and local scales (Murphy et al., 2009). It is therefore important to understand what a probabilistic forecast means for a chaotic nonlinear dynamic system that is subject to changing forcings. It is in this context that we present the results of experiments using simple models that can be considered analogous to the more complex climate system, namely the Lorenz 1963 and Lorenz 1984 models (Lorenz, 1963; Lorenz, 1984). Whilst the search for a low-dimensional climate attractor remains elusive (Fraedrich, 1986; Sahay and Sreenivasan, 1996), the characterization of the climate system in such terms can be useful for conceptual and computational simplicity. Recognising that a change in climate is manifest in a change in the distribution of a particular climate variable (Stainforth et al., 2007), we first establish the equilibrium distributions of the Lorenz systems for certain parameter settings. Allowing the parameters to vary in time, we investigate the dependency of such distributions on initial conditions and discuss the implications for climate prediction. We argue that the role of chaos and nonlinear dynamic behaviour ought to have more prominence in the discussion of the forecasting capabilities in climate prediction. References: Fraedrich, K. Estimating the dimensions of weather and climate attractors. J. Atmos. Sci
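
    A minimal sketch in the spirit of these experiments: integrate the Lorenz (1963) system, discard a spin-up period, and compare the distribution of one state variable for two parameter settings, treating the shift in the distribution as the analogue of a change in climate. The integration scheme, step size, and parameter values below are assumptions for illustration only.

      import numpy as np

      def lorenz63_trajectory(rho, n_steps=200_000, dt=0.001, x0=(1.0, 1.0, 1.0),
                              sigma=10.0, beta=8.0 / 3.0):
          # Integrate the Lorenz (1963) equations with a simple Euler scheme.
          x, y, z = x0
          xs = np.empty(n_steps)
          for i in range(n_steps):
              dx = sigma * (y - x)
              dy = x * (rho - z) - y
              dz = x * y - beta * z
              x, y, z = x + dt * dx, y + dt * dy, z + dt * dz
              xs[i] = x
          return xs

      # Discard a spin-up period, then compare the distribution of x for two
      # values of rho; the shift stands in for a "change in climate" of this
      # toy system (parameter values are illustrative).
      burn = 50_000
      control = lorenz63_trajectory(rho=28.0)[burn:]
      perturbed = lorenz63_trajectory(rho=32.0)[burn:]
      print(f"control:   mean {control.mean():+.2f}, std {control.std():.2f}")
      print(f"perturbed: mean {perturbed.mean():+.2f}, std {perturbed.std():.2f}")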

  6. Formalizing Probabilistic Safety Claims

    NASA Technical Reports Server (NTRS)

    Herencia-Zapana, Heber; Hagen, George E.; Narkawicz, Anthony J.

    2011-01-01

    A safety claim for a system is a statement that the system, which is subject to hazardous conditions, satisfies a given set of properties. Following work by John Rushby and Bev Littlewood, this paper presents a mathematical framework that can be used to state and formally prove probabilistic safety claims. It also enables hazardous conditions, their uncertainties, and their interactions to be integrated into the safety claim. This framework provides a formal description of the probabilistic composition of an arbitrary number of hazardous conditions and their effects on system behavior. An example is given of a probabilistic safety claim for a conflict detection algorithm for aircraft in a 2D airspace. The motivation for developing this mathematical framework is that it can be used in an automated theorem prover to formally verify safety claims.

  7. Building an ecosystem model using mismatched and fragmented data: A probabilistic network of early marine survival for coho salmon Oncorhynchus kisutch in the Strait of Georgia

    NASA Astrophysics Data System (ADS)

    Andres Araujo, H.; Holt, Carrie; Curtis, Janelle M. R.; Perry, R. I.; Irvine, James R.; Michielsens, Catherine G. J.

    2013-08-01

    We evaluated the effects of biophysical conditions and hatchery production on the early marine survival of coho salmon Oncorhynchus kisutch in the Strait of Georgia, British Columbia, Canada. Due to a paucity of balanced multivariate ecosystem data, we developed a probabilistic network that integrated physical and ecological data and information from literature, expert opinion, oceanographic models, and in situ observations. This approach allowed us to evaluate alternate hypotheses about drivers of early marine survival while accounting for uncertainties in relationships among variables. Probabilistic networks allow users to explore multiple environmental settings and evaluate the consequences of management decisions under current and projected future states. We found that the zooplankton biomass anomaly, calanoid copepod biomass, and herring biomass were the best indicators of early marine survival. It also appears that concentrating hatchery supplementation during periods of negative PDO and ENSO (Pacific Decadal and El Niño Southern Oscillation respectively), indicative of generally favorable ocean conditions for salmon, tends to increase survival of hatchery coho salmon while minimizing negative impacts on the survival of wild juveniles. Scientists and managers can benefit from the approach presented here by exploring multiple scenarios, providing a basis for open and repeatable ecosystem-based risk assessments when data are limited.

  8. Quantification and probabilistic modeling of CRT obsolescence for the State of Delaware

    SciTech Connect

    Schumacher, Kelsea A.; Schumacher, Thomas; Agbemabiese, Lawrence

    2014-11-15

    Highlights: • We modeled the obsolescence of cathode ray tube devices in the State of Delaware. • 411,654 CRT units or ∼16,500 metric tons have been recycled in Delaware since 2002. • The peak of CRT obsolescence in Delaware passed by 2012. • The Delaware average CRT recycling rate between 2002 and 2013 was approximately 27.5%. • CRTs will likely continue to enter the waste stream until 2033. - Abstract: The cessation of production and the replacement of cathode ray tube (CRT) displays with flat-screen displays have resulted in the proliferation of CRTs in the electronic waste (e-waste) recycle stream. However, due to the nature of the technology and the presence of hazardous components such as lead, CRTs are the most challenging of electronic components to recycle. In the State of Delaware, this challenge and the resulting expense, combined with the large quantities of CRTs in the recycle stream, have led electronic recyclers to charge to accept Delaware’s e-waste. It is therefore imperative that the Delaware Solid Waste Authority (DSWA) understand future quantities of CRTs entering the waste stream. This study presents the results of an assessment of CRT obsolescence in the State of Delaware. A prediction model was created utilizing publicized sales data, a variety of lifespan data, as well as historic Delaware CRT collection rates. Both a deterministic and a probabilistic approach using Monte Carlo Simulation (MCS) were performed to forecast rates of CRT obsolescence to be anticipated in the State of Delaware. Results indicate that the peak of CRT obsolescence in Delaware has already passed, although CRTs are anticipated to continue entering the waste stream until about 2033.
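
    The obsolescence model can be sketched as a convolution of annual sales with a sampled lifespan distribution: for each unit sold in a given year, draw a lifespan and tally the year it enters the waste stream. The sales figures and lognormal lifespan parameters below are placeholders, not the publicized sales and lifespan data used in the study.

      import numpy as np

      rng = np.random.default_rng(11)

      # Hypothetical annual CRT sales (units) and a lognormal lifespan model.
      sales_years = np.arange(1990, 2008)
      annual_sales = np.linspace(20_000, 45_000, sales_years.size).astype(int)
      mean_life_log, sigma_life_log = np.log(8.0), 0.45   # ~8-year median lifespan

      def simulate_obsolescence():
          obsolete = {}
          for year, units in zip(sales_years, annual_sales):
              # Scale down the per-unit simulation for speed, then scale back up.
              lifespans = rng.lognormal(mean_life_log, sigma_life_log, units // 100)
              years_out = year + np.round(lifespans).astype(int)
              for y in years_out:
                  obsolete[y] = obsolete.get(y, 0) + 100
          return obsolete

      result = simulate_obsolescence()
      peak_year = max(result, key=result.get)
      print(f"peak obsolescence year: {peak_year} "
            f"({result[peak_year]:,} units entering the waste stream)")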

  9. Bayesian probabilistic model for life prediction and fault mode classification of solid state luminaires

    SciTech Connect

    Lall, Pradeep; Wei, Junchao; Sakalaukus, Peter

    2014-06-22

    A new method has been developed for assessing the onset of degradation in solid-state luminaires and classifying failure mechanisms using metrics beyond the lumen degradation currently used to identify failure. Luminous flux output and correlated color temperature data on Philips LED lamps were gathered under 85°C/85% RH until lamp failure. Failure modes of the test population of the lamps were studied to understand the failure mechanisms in the 85°C/85% RH accelerated test. Results indicate that the dominant failure mechanism is discoloration of the LED encapsulant inside the lamps, which is the likely cause of the luminous flux degradation and the color shift. The acquired data were used in conjunction with Bayesian probabilistic models to identify luminaires with the onset of degradation well before failure, through identification of decision boundaries in the feature space between lamps with accrued damage and lamps beyond the failure threshold. In addition, luminaires with different failure modes were classified separately from healthy, pristine luminaires. The α-λ plots were used to evaluate the robustness of the proposed methodology. Results show that the predicted degradation for the lamps tracks the true degradation observed during the 85°C/85% RH accelerated life test fairly closely, within the ±20% confidence bounds. Correlation of model predictions with experimental results indicates that the presented methodology allows early identification of the onset of failure well before complete failure distributions develop and can be used for assessing the damage state of SSLs in fairly large deployments. It is expected that the new prediction technique will allow the development of failure distributions without testing to L70 life for the manifestation of failure.

  10. Methods for Developing Emissions Scenarios for Integrated Assessment Models

    SciTech Connect

    Prinn, Ronald; Webster, Mort

    2007-08-20

    The overall objective of this research was to contribute data and methods to support the future development of new emissions scenarios for integrated assessment of climate change. Specifically, this research had two main objectives: 1. Use historical data on economic growth and energy efficiency changes, and develop probability density functions (PDFs) for the appropriate parameters for two or three commonly used integrated assessment models. 2. Using the parameter distributions developed through the first task and previous work, we will develop methods of designing multi-gas emission scenarios that usefully span the joint uncertainty space in a small number of scenarios. Results on the autonomous energy efficiency improvement (AEEI) parameter are summarized, an uncertainty analysis of elasticities of substitution is described, and the probabilistic emissions scenario approach is presented.

  11. Probabilistic performance-assessment modeling of the mixed waste landfill at Sandia National Laboratories.

    SciTech Connect

    Peace, Gerald L.; Goering, Timothy James; Miller, Mark Laverne; Ho, Clifford Kuofei

    2005-11-01

    A probabilistic performance assessment has been conducted to evaluate the fate and transport of radionuclides (americium-241, cesium-137, cobalt-60, plutonium-238, plutonium-239, radium-226, radon-222, strontium-90, thorium-232, tritium, uranium-238), heavy metals (lead and cadmium), and volatile organic compounds (VOCs) at the Mixed Waste Landfill (MWL). Probabilistic analyses were performed to quantify uncertainties inherent in the system and models for a 1,000-year period, and sensitivity analyses were performed to identify parameters and processes that were most important to the simulated performance metrics. Comparisons between simulated results and measured values at the MWL were made to gain confidence in the models and perform calibrations when data were available. In addition, long-term monitoring requirements and triggers were recommended based on the results of the quantified uncertainty and sensitivity analyses. At least one hundred realizations were simulated for each scenario defined in the performance assessment. Conservative values and assumptions were used to define values and distributions of uncertain input parameters when site data were not available. Results showed that exposure to tritium via the air pathway exceeded the regulatory metric of 10 mrem/year in about 2% of the simulated realizations when the receptor was located at the MWL (continuously exposed to the air directly above the MWL). Simulations showed that peak radon gas fluxes exceeded the design standard of 20 pCi/m²/s in about 3% of the realizations if up to 1% of the containers of sealed radium-226 sources were assumed to completely degrade in the future. If up to 100% of the containers of radium-226 sources were assumed to completely degrade, 30% of the realizations yielded radon surface fluxes that exceeded the design standard. For the groundwater pathway, simulations showed that none of the radionuclides or heavy metals (lead and cadmium) reached the groundwater during
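
    The reporting pattern used above, the fraction of realizations exceeding a regulatory metric, can be sketched as follows, with a toy air-pathway dose model standing in for the site-specific fate-and-transport models; the input distributions and parameter values are illustrative assumptions, not those of the MWL assessment.

      import numpy as np

      rng = np.random.default_rng(5)
      n_realizations = 1000

      def simulate_air_pathway_dose(rng):
          # One realization of a toy air-pathway dose model (mrem/yr).  Source
          # term, dilution, and exposure factors are lumped into generic
          # lognormal/uniform inputs purely for illustration.
          source_release = rng.lognormal(np.log(0.5), 0.8)      # relative flux
          dilution = rng.uniform(0.2, 1.0)                      # atmospheric dilution
          exposure_factor = rng.lognormal(np.log(4.0), 0.5)     # mrem/yr per unit flux
          return source_release * dilution * exposure_factor

      doses = np.array([simulate_air_pathway_dose(rng) for _ in range(n_realizations)])
      limit = 10.0  # mrem/yr regulatory metric
      print(f"fraction of realizations exceeding {limit} mrem/yr: "
            f"{(doses > limit).mean():.1%}")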

  12. Probabilistic Modeling for Risk Assessment of California Ground Water Contamination by Pesticides

    NASA Astrophysics Data System (ADS)

    Clayton, M.; Troiano, J.; Spurlock, F.

    2007-12-01

    The California Department of Pesticide Regulation (DPR) is responsible for the registration of pesticides in California. DPR's Environmental Monitoring Branch evaluates the potential for pesticide active ingredients to move to ground water under legal agricultural use conditions. Previous evaluations were primarily based on threshold values for specific persistence and mobility properties of pesticides as prescribed in the California Pesticide Contamination Prevention Act of 1985. Two limitations identified with that process were its univariate nature, in which interactions among the properties were not accounted for, and its inability to accommodate multiple values of a physical-chemical property. We addressed these limitations by developing a probabilistic modeling method based on prediction of potential well water concentrations. A mechanistic pesticide transport model, LEACHM, is used to simulate sorption, degradation and transport of a candidate pesticide through the root zone. A second empirical model component then simulates pesticide degradation and transport through the vadose zone to a receiving ground water aquifer. Finally, degradation during transport in the aquifer to the well screen is included in calculating final potential well concentrations. Using Monte Carlo techniques, numerous LEACHM simulations are conducted using random samples of the organic carbon normalized soil adsorption coefficients (Koc) and soil dissipation half-life values derived from terrestrial field dissipation (TFD) studies. Koc and TFD values are obtained from gamma distributions fitted to pooled data from agricultural-use pesticides detected in California ground water: atrazine, simazine, diuron, bromacil, hexazinone, and norflurazon. The distribution of predicted well water concentrations for these pesticides is in good agreement with concentrations measured in domestic wells in coarse, leaching-vulnerable soils of Fresno and Tulare Counties. The leaching potential of a new
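
    A hedged sketch of the sampling step: draw Koc and soil dissipation half-life from fitted gamma distributions and push them through a simplified attenuation-factor relation as a stand-in for the LEACHM root-zone and vadose-zone simulations. The gamma parameters, soil properties, and the attenuation relation itself are illustrative assumptions, not DPR's calibrated values.

      import numpy as np

      rng = np.random.default_rng(2024)
      n = 5000

      # Gamma distributions for Koc (mL/g) and field dissipation half-life (days);
      # shape/scale values are placeholders, not the fitted parameters.
      koc = rng.gamma(shape=2.0, scale=60.0, size=n)
      half_life = rng.gamma(shape=2.5, scale=40.0, size=n)

      def attenuation_factor(koc, half_life, recharge=0.5, depth=3.0,
                             theta=0.25, bulk_density=1.5, foc=0.01):
          # Very simplified attenuation-factor screening relation: travel time
          # grows with sorption (Koc * foc) and depth; decay follows first-order
          # kinetics with the sampled half-life.  This replaces the LEACHM
          # root-zone simulation purely for illustration.
          retardation = 1.0 + bulk_density * foc * koc / theta
          travel_time = depth * theta * retardation / recharge   # years
          k_decay = np.log(2.0) / (half_life / 365.0)            # 1/years
          return np.exp(-k_decay * travel_time)

      af = attenuation_factor(koc, half_life)
      print(f"median mass fraction reaching groundwater: {np.median(af):.2e}")
      print(f"95th percentile: {np.percentile(af, 95):.2e}")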

  13. Radar Tracking with an Interacting Multiple Model and Probabilistic Data Association Filter for Civil Aviation Applications

    PubMed Central

    Jan, Shau-Shiun; Kao, Yu-Chun

    2013-01-01

    The current trend of the civil aviation technology is to modernize the legacy air traffic control (ATC) system that is mainly supported by many ground based navigation aids to be the new air traffic management (ATM) system that is enabled by global positioning system (GPS) technology. Due to the low receiving power of GPS signal, it is a major concern to aviation authorities that the operation of the ATM system might experience service interruption when the GPS signal is jammed by either intentional or unintentional radio-frequency interference. To maintain the normal operation of the ATM system during the period of GPS outage, the use of the current radar system is proposed in this paper. However, the tracking performance of the current radar system could not meet the required performance of the ATM system, and an enhanced tracking algorithm, the interacting multiple model and probabilistic data association filter (IMMPDAF), is therefore developed to support the navigation and surveillance services of the ATM system. The conventional radar tracking algorithm, the nearest neighbor Kalman filter (NNKF), is used as the baseline to evaluate the proposed radar tracking algorithm, and the real flight data is used to validate the IMMPDAF algorithm. As shown in the results, the proposed IMMPDAF algorithm could enhance the tracking performance of the current aviation radar system and meets the required performance of the new ATM system. Thus, the current radar system with the IMMPDAF algorithm could be used as an alternative system to continue aviation navigation and surveillance services of the ATM system during GPS outage periods. PMID:23686142

  14. Fatigue strength reduction model: RANDOM3 and RANDOM4 user manual. Appendix 2: Development of advanced methodologies for probabilistic constitutive relationships of material strength models

    NASA Technical Reports Server (NTRS)

    Boyce, Lola; Lovelace, Thomas B.

    1989-01-01

    FORTRAN programs RANDOM3 and RANDOM4 are documented in the form of a user's manual. Both programs are based on fatigue strength reduction, using a probabilistic constitutive model. The programs predict the random lifetime of an engine component to reach a given fatigue strength. The theoretical backgrounds, input data instructions, and sample problems illustrating the use of the programs are included.

  15. Fatigue crack growth model RANDOM2 user manual. Appendix 1: Development of advanced methodologies for probabilistic constitutive relationships of material strength models

    NASA Technical Reports Server (NTRS)

    Boyce, Lola; Lovelace, Thomas B.

    1989-01-01

    FORTRAN program RANDOM2 is presented in the form of a user's manual. RANDOM2 is based on fracture mechanics using a probabilistic fatigue crack growth model. It predicts the random lifetime of an engine component to reach a given crack size. Details of the theoretical background, input data instructions, and a sample problem illustrating the use of the program are included.
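
    RANDOM2's formulation and inputs are not reproduced here, but the general idea of probabilistic crack-growth life prediction can be sketched with a Paris-law model whose constants are sampled from assumed distributions; the distribution of cycles needed to grow the crack to a given size is then the random lifetime. Geometry, load, and parameter values below are placeholders.

      import numpy as np

      rng = np.random.default_rng(9)

      def cycles_to_crack_size(C, m, a0, a_final, delta_sigma, Y=1.0, da=1e-5):
          # Integrate Paris' law da/dN = C (dK)^m numerically until a_final.
          a, n_cycles = a0, 0.0
          while a < a_final:
              delta_K = Y * delta_sigma * np.sqrt(np.pi * a)   # MPa*sqrt(m)
              dadn = C * delta_K ** m                          # m/cycle
              n_cycles += da / dadn
              a += da
          return n_cycles

      # Random Paris-law constants (lognormal C, normal m): placeholder scatter,
      # not the probabilistic constitutive model used in RANDOM2.
      n = 500
      C = rng.lognormal(np.log(5e-12), 0.3, n)
      m = rng.normal(3.0, 0.1, n)

      lives = np.array([cycles_to_crack_size(Ci, mi, a0=1e-3, a_final=5e-3,
                                             delta_sigma=120.0)
                        for Ci, mi in zip(C, m)])
      print(f"median life: {np.median(lives):.2e} cycles")
      print(f"1st percentile (design-relevant tail): {np.percentile(lives, 1):.2e} cycles")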

  16. Probabilistic assessment of landslide run-out using a dynamic model

    NASA Astrophysics Data System (ADS)

    Cepeda, J.; Quan Luna, B.; Nadim, F.

    2012-12-01

    Dynamic run-out models for landslides simulate the distribution and intensity of the mobilized material, allowing estimation of the exposure and vulnerability of the elements at risk. Therefore, these models are essential tools to evaluate quantitatively the hazard and risk at a specific site. Another advantage of the application of dynamic models is that they can simulate the effect of variations in the released volume as well as in rheological parameters for different scenarios, including those with no historical evidence. Since these parameters cannot be measured directly, the associated uncertainties can be large, and thus must be addressed in risk assessments. In practice, a substantial degree of uncertainty characterizes the definition of the deterministic run-out model parameters. This is due to the lack of experimental data and the poor knowledge of the mechanical behavior of the moving flows. Consequently, all models, either those widely used in practical applications or those more recently developed, are based on simplified theoretical descriptions of mass dynamics which try to capture the complex rheology of the flow phenomenon. This results in a generalization of all models to attempt to reproduce the general features of the moving mass through the use of parameters that either account for aspects not explicitly described or are oversimplified. As a result, the model parameters cannot be related to a specific physical process, and therefore cannot be directly measured, but need to be calibrated. Currently, a relatively complete and well-established calibration for most of the run-out models is still lacking or is not reliable enough for practical applications. This represents one of the basic limitations for using dynamic run-out models, which can be remarkably sensitive to the rheological parameters. To assess the effect of the uncertainties in the input rheological parameters on the estimated run-out, a probabilistic procedure based on a Monte Carlo

  17. Sensitivity of probabilistic risk assessment results to alternative model structures. A case study of municipal waste incineration

    SciTech Connect

    Cullen, A.C.

    1995-07-01

    In this analysis, human health risk due to exposure to municipal waste incinerator emissions is assessed as an example of the application of probabilistic techniques (e.g., Monte Carlo or Latin Hypercube simulations). Incinerator risk assessments are characterized by the dominance of indirect exposure, thus this analysis focuses on exposure via the ingestion of locally grown foods. In addition, since exposure to 2,3,7,8-TCDD drives most incinerator risk assessments, this compound is the subject of the illustrative calculations. An important part of probabilistic risk assessment is determining the relative influence of the input parameters on the magnitude of the variance in the output distribution. This constitutes an important step toward prioritizing data needs for additional research. In this analysis, a sequential structural decomposition of the relationships between the input variables is used to partition the variance in the output (i.e., risk) to identify the most influential contributors to overall variance among them. For comparison, the partitioning of variance is repeated, using techniques of multivariate regression. In summary, this study considers the degree to which results of a probabilistic assessment are contingent on critical model assumptions about the representation of deposition velocity. 50 refs., 7 figs., 5 tabs.

  18. High-throughput detection of prostate cancer in histological sections using probabilistic pairwise Markov models.

    PubMed

    Monaco, James P; Tomaszewski, John E; Feldman, Michael D; Hagemann, Ian; Moradi, Mehdi; Mousavi, Parvin; Boag, Alexander; Davidson, Chris; Abolmaesumi, Purang; Madabhushi, Anant

    2010-08-01

    In this paper we present a high-throughput system for detecting regions of carcinoma of the prostate (CaP) in HSs from radical prostatectomies (RPs) using probabilistic pairwise Markov models (PPMMs), a novel type of Markov random field (MRF). At diagnostic resolution a digitized HS can contain 80K×70K pixels - far too many for current automated Gleason grading algorithms to process. However, grading can be separated into two distinct steps: (1) detecting cancerous regions and (2) then grading these regions. The detection step does not require diagnostic resolution and can be performed much more quickly. Thus, we introduce a CaP detection system capable of analyzing an entire digitized whole-mount HS (2×1.75 cm²) in under three minutes (on a desktop computer) while achieving a CaP detection sensitivity and specificity of 0.87 and 0.90, respectively. We obtain this high-throughput by tailoring the system to analyze the HSs at low resolution (8 µm per pixel). This motivates the following algorithm: (Step 1) glands are segmented, (Step 2) the segmented glands are classified as malignant or benign, and (Step 3) the malignant glands are consolidated into continuous regions. The classification of individual glands leverages two features: gland size and the tendency for proximate glands to share the same class. The latter feature describes a spatial dependency which we model using a Markov prior. Typically, Markov priors are expressed as the product of potential functions. Unfortunately, potential functions are mathematical abstractions, and constructing priors through their selection becomes an ad hoc procedure, resulting in simplistic models such as the Potts. Addressing this problem, we introduce PPMMs which formulate priors in terms of probability density functions, allowing the creation of more sophisticated models. To demonstrate the efficacy of our CaP detection system and assess the advantages of using a PPMM prior instead of the Potts, we alternately

  19. High-Throughput Detection of Prostate Cancer in Histological Sections Using Probabilistic Pairwise Markov Models

    PubMed Central

    Monaco, James P.; Tomaszewski, John E.; Feldman, Michael D.; Hagemann, Ian; Moradi, Mehdi; Mousavi, Parvin; Boag, Alexander; Davidson, Chris; Abolmaesumi, Purang; Madabhushi, Anant

    2010-01-01

    In this paper we present a high-throughput system for detecting regions of carcinoma of the prostate (CaP) in HSs from radical prostatectomies (RPs) using probabilistic pairwise Markov models (PPMMs), a novel type of Markov random field (MRF). At diagnostic resolution a digitized HS can contain 80K×70K pixels — far too many for current automated Gleason grading algorithms to process. However, grading can be separated into two distinct steps: 1) detecting cancerous regions and 2) then grading these regions. The detection step does not require diagnostic resolution and can be performed much more quickly. Thus, we introduce a CaP detection system capable of analyzing an entire digitized whole-mount HS (2×1.75 cm2) in under three minutes (on a desktop computer) while achieving a CaP detection sensitivity and specificity of 0.87 and 0.90, respectively. We obtain this high-throughput by tailoring the system to analyze the HSs at low resolution (8 µm per pixel). This motivates the following algorithm: Step 1) glands are segmented, Step 2) the segmented glands are classified as malignant or benign, and Step 3) the malignant glands are consolidated into continuous regions. The classification of individual glands leverages two features: gland size and the tendency for proximate glands to share the same class. The latter feature describes a spatial dependency which we model using a Markov prior. Typically, Markov priors are expressed as the product of potential functions. Unfortunately, potential functions are mathematical abstractions, and constructing priors through their selection becomes an ad hoc procedure, resulting in simplistic models such as the Potts. Addressing this problem, we introduce PPMMs which formulate priors in terms of probability density functions, allowing the creation of more sophisticated models. To demonstrate the efficacy of our CaP detection system and assess the advantages of using a PPMM prior instead of the Potts, we alternately incorporate

  20. Probabilistic, Multidimensional Unfolding Analysis

    ERIC Educational Resources Information Center

    Zinnes, Joseph L.; Griggs, Richard A.

    1974-01-01

    Probabilistic assumptions are added to single and multidimensional versions of the Coombs unfolding model for preferential choice (Coombs, 1950) and practical ways of obtaining maximum likelihood estimates of the scale parameters and goodness-of-fit tests of the model are presented. A Monte Carlo experiment is discussed. (Author/RC)

  1. Probabilistic boundary element method

    NASA Technical Reports Server (NTRS)

    Cruse, T. A.; Raveendra, S. T.

    1989-01-01

    The purpose of the Probabilistic Structural Analysis Method (PSAM) project is to develop structural analysis capabilities for the design analysis of advanced space propulsion system hardware. The boundary element method (BEM) is used as the basis of the Probabilistic Advanced Analysis Methods (PADAM), which is discussed. The probabilistic BEM code (PBEM) is used to obtain the structural response and sensitivity results to a set of random variables. As such, PBEM performs analogously to other structural analysis codes, such as finite elements, in the PSAM system. For linear problems, unlike the finite element method (FEM), the BEM governing equations are written at the boundary of the body only; thus, the method eliminates the need to model the volume of the body. However, for general body force problems, a direct condensation of the governing equations to the boundary of the body is not possible and therefore volume modeling is generally required.

  2. Impact of fault models on probabilistic seismic hazard assessment: the example of the West Corinth rift.

    NASA Astrophysics Data System (ADS)

    Chartier, Thomas; Scotti, Oona; Boiselet, Aurelien; Lyon-Caen, Hélène

    2016-04-01

    Including faults in probabilistic seismic hazard assessment tends to increase the degree of uncertainty in the results due to the intrinsically uncertain nature of the fault data. This is especially the case in the low-to-moderate-seismicity regions of Europe, where slow-slipping faults are difficult to characterize. In order to better understand the key parameters that control the uncertainty in the fault-related hazard computations, we propose to build an analytic tool that provides a clear link between the different components of the fault-related hazard computations and their impact on the results. This will allow identification of the important parameters that need to be better constrained in order to reduce the resulting uncertainty in hazard and also provide a more hazard-oriented strategy for collecting relevant fault parameters in the field. The tool will be illustrated through the example of the West Corinth rift fault models. Recent work performed in the gulf has shown the complexity of the normal faulting system that is accommodating the extensional deformation of the rift. A logic-tree approach is proposed to account for this complexity and the multiplicity of scientifically defendable interpretations. At the nodes of the logic tree, different options are considered for each step of the fault-related seismic hazard computation. The first nodes represent the uncertainty in the geometries of the faults and their slip rates, which can derive from different data and methodologies. The subsequent node explores, for a given geometry/slip rate of faults, different earthquake rupture scenarios that may occur in the complex network of faults. The idea is to allow the possibility of several fault segments breaking together in a single rupture scenario. To build these multiple-fault-segment scenarios, two approaches are considered: one based on simple rules (i.e. minimum distance between faults) and a second one that relies on physically

  3. Integrated modeling: a look back

    NASA Astrophysics Data System (ADS)

    Briggs, Clark

    2015-09-01

    This paper discusses applications and implementation approaches used for integrated modeling of structural systems with optics over the past 30 years. While much of the development work focused on control system design, significant contributions were made in system modeling and computer-aided design (CAD) environments. Early work appended handmade line-of-sight models to traditional finite element models, such as the optical spacecraft concept from the ACOSS program. The IDEAS2 computational environment built in support of Space Station collected a wider variety of existing tools around a parametric database. Later, IMOS supported interferometer and large telescope mission studies at JPL with MATLAB modeling of structural dynamics, thermal analysis, and geometric optics. IMOS's predecessor was a simple FORTRAN command line interpreter for LQG controller design with additional functions that built state-space finite element models. Specialized language systems such as CAESY were formulated and prototyped to provide more complex object-oriented functions suited to control-structure interaction. A more recent example of optical modeling directly in mechanical CAD is used to illustrate possible future directions. While the value of directly posing the optical metric in system dynamics terms is well understood today, the potential payoff is illustrated briefly via project-based examples. It is quite likely that integrated structure thermal optical performance (STOP) modeling could be accomplished in a commercial off-the-shelf (COTS) tool set. The work flow could be adopted, for example, by a team developing a small high-performance optical or radio frequency (RF) instrument.

  4. Predicting rib fracture risk with whole-body finite element models: development and preliminary evaluation of a probabilistic analytical framework.

    PubMed

    Forman, Jason L; Kent, Richard W; Mroz, Krystoffer; Pipkorn, Bengt; Bostrom, Ola; Segui-Gomez, Maria

    2012-01-01

    This study sought to develop a strain-based probabilistic method to predict rib fracture risk with whole-body finite element (FE) models, and to describe a method to combine the results with collision exposure information to predict injury risk and potential intervention effectiveness in the field. An age-adjusted ultimate strain distribution was used to estimate local rib fracture probabilities within an FE model. These local probabilities were combined to predict injury risk and severity within the whole ribcage. The ultimate strain distribution was developed from a literature dataset of 133 tests. Frontal collision simulations were performed with the THUMS (Total HUman Model for Safety) model with four levels of delta-V and two restraints: a standard 3-point belt and a progressive 3.5-7 kN force-limited, pretensioned (FL+PT) belt. The results of three simulations (29 km/h standard, 48 km/h standard, and 48 km/h FL+PT) were compared to matched cadaver sled tests. The numbers of fractures predicted for the comparison cases were consistent with those observed experimentally. Combining these results with field exposure information (ΔV, NASS-CDS 1992-2002) suggests an 8.9% probability of incurring AIS3+ rib fractures for a 60-year-old restrained by a standard belt in a tow-away frontal collision with this restraint, vehicle, and occupant configuration, compared to 4.6% for the FL+PT belt. This is the first study to describe a probabilistic framework to predict rib fracture risk based on strains observed in human-body FE models. Using this analytical framework, future efforts may incorporate additional subject or collision factors for multi-variable probabilistic injury prediction. PMID:23169122
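
    A hedged sketch of the framework described here: convert the peak strain at each rib location into a fracture probability by evaluating an ultimate-strain distribution, then combine the per-rib probabilities into the probability of a given number of fractures. The strain values, the normal ultimate-strain parameters, and the independence assumption are placeholders for illustration; the study uses an age-adjusted distribution built from 133 tests.

      import numpy as np
      from math import erf, sqrt

      def fracture_probability(peak_strain, mean_ult, std_ult):
          # P(fracture) = P(ultimate strain < peak strain) for a normal
          # ultimate-strain distribution (placeholder parameters, not the
          # age-adjusted distribution from the literature dataset).
          z = (peak_strain - mean_ult) / std_ult
          return 0.5 * (1.0 + erf(z / sqrt(2.0)))

      # Hypothetical peak strains at 12 rib locations from one FE simulation.
      peak_strains = np.array([0.8, 1.1, 1.6, 2.1, 2.4, 1.9,
                               1.2, 0.9, 1.7, 2.2, 1.5, 1.0]) / 100.0
      p_frac = np.array([fracture_probability(s, mean_ult=0.022, std_ult=0.006)
                         for s in peak_strains])

      def prob_at_least(p, k):
          # Probability of k or more fractures assuming independent ribs
          # (Poisson-binomial distribution built by a simple recursion).
          probs = np.zeros(len(p) + 1)
          probs[0] = 1.0
          for pi in p:
              probs[1:] = probs[1:] * (1 - pi) + probs[:-1] * pi
              probs[0] *= (1 - pi)
          return probs[k:].sum()

      print(f"expected number of fractures: {p_frac.sum():.2f}")
      print(f"P(>= 3 fractures): {prob_at_least(p_frac, 3):.3f}")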

  5. Predicting Rib Fracture Risk With Whole-Body Finite Element Models: Development and Preliminary Evaluation of a Probabilistic Analytical Framework

    PubMed Central

    Forman, Jason L.; Kent, Richard W.; Mroz, Krystoffer; Pipkorn, Bengt; Bostrom, Ola; Segui-Gomez, Maria

    2012-01-01

    This study sought to develop a strain-based probabilistic method to predict rib fracture risk with whole-body finite element (FE) models, and to describe a method to combine the results with collision exposure information to predict injury risk and potential intervention effectiveness in the field. An age-adjusted ultimate strain distribution was used to estimate local rib fracture probabilities within an FE model. These local probabilities were combined to predict injury risk and severity within the whole ribcage. The ultimate strain distribution was developed from a literature dataset of 133 tests. Frontal collision simulations were performed with the THUMS (Total HUman Model for Safety) model with four levels of delta-V and two restraints: a standard 3-point belt and a progressive 3.5–7 kN force-limited, pretensioned (FL+PT) belt. The results of three simulations (29 km/h standard, 48 km/h standard, and 48 km/h FL+PT) were compared to matched cadaver sled tests. The numbers of fractures predicted for the comparison cases were consistent with those observed experimentally. Combining these results with field exposure information (ΔV, NASS-CDS 1992–2002) suggests an 8.9% probability of incurring AIS3+ rib fractures for a 60-year-old restrained by a standard belt in a tow-away frontal collision with this restraint, vehicle, and occupant configuration, compared to 4.6% for the FL+PT belt. This is the first study to describe a probabilistic framework to predict rib fracture risk based on strains observed in human-body FE models. Using this analytical framework, future efforts may incorporate additional subject or collision factors for multi-variable probabilistic injury prediction. PMID:23169122

  6. Integrated multi-parameters Probabilistic Seismic Landslide Hazard Analysis (PSLHA): an innovative approach in the active volcano-tectonic area of Campi Flegrei (Italy)

    NASA Astrophysics Data System (ADS)

    Caccavale, M.; Matano, F.; Sacchi, M.; Somma, R.; Troise, C.; De Natale, G.

    2013-12-01

    The western coastal sector of the Campania region (southern Italy) is characterised by the presence of the active volcano-tectonic area of Campi Flegrei. This area represents a very particular and interesting case study for a probabilistic seismic hazard analysis (PSHA). The principal seismic source, related to the caldera, is not clearly constrained in the onshore and offshore areas. The well-known and monitored phenomenon of bradyseism affecting a large portion of the case-study area is not modelled in the standard PSHA approach. From the environmental point of view, the presence of very high exposed values in terms of population, buildings, infrastructure, and palaces of high archaeological, natural, and artistic value makes this area a strategic natural laboratory to develop new methodologies. Moreover, the geomorphological and geo-volcanological features lead to a heterogeneous coastline, made up of both beaches and tuff cliffs, rapidly evolving through erosion and landslide (mainly rock fall and rock slide) phenomena that represent an additional hazard. In the Campi Flegrei area, the possible occurrence of a moderate to large seismic event represents a serious threat to the inhabitants, the infrastructure, and the environment. In the framework of the Italian MON.I.C.A project (infrastructural coastlines monitoring), an innovative and dedicated probabilistic methodology has been applied to identify the areas with a higher tendency of landslide occurrence due to seismic effects. The resident population reported the occurrence of some small rock falls along tuff quarry slopes during the main shocks of the 1982-84 bradyseismic events. The PSHA methodology, introduced by Cornell (1968), combines the contributions to the hazard from all potential sources of earthquakes and the average activity rates associated with each seismogenic zone considered. The result of the PSHA is represented by the spatial distribution of a ground-motion (GM) parameter A, such as Peak

  7. Developing EHR-driven heart failure risk prediction models using CPXR(Log) with the probabilistic loss function.

    PubMed

    Taslimitehrani, Vahid; Dong, Guozhu; Pereira, Naveen L; Panahiazar, Maryam; Pathak, Jyotishman

    2016-04-01

    Computerized survival prediction in healthcare, identifying the risk of disease mortality, helps healthcare providers effectively manage their patients by providing appropriate treatment options. In this study, we propose to apply a classification algorithm, Contrast Pattern Aided Logistic Regression (CPXR(Log)) with the probabilistic loss function, to develop and validate prognostic risk models to predict 1-, 2-, and 5-year survival in heart failure (HF) using data from electronic health records (EHRs) at Mayo Clinic. The CPXR(Log) constructs a pattern-aided logistic regression model defined by several patterns and corresponding local logistic regression models. One of the models generated by CPXR(Log) achieved an AUC and accuracy of 0.94 and 0.91, respectively, and significantly outperformed prognostic models reported in prior studies. Data extracted from EHRs allowed incorporation of patient co-morbidities into our models, which helped improve the performance of the CPXR(Log) models (15.9% AUC improvement), although it did not improve the accuracy of the models built by other classifiers. We also propose a probabilistic loss function to determine the large-error and small-error instances. The new loss function used in the algorithm outperforms the functions used in previous studies by a 1% improvement in AUC. This study revealed that using EHR data to build prediction models can be very challenging using existing classification methods due to the high dimensionality and complexity of EHR data. The risk models developed by CPXR(Log) also reveal that HF is a highly heterogeneous disease, i.e., different subgroups of HF patients require different types of considerations with their diagnosis and treatment. Our risk models provided two valuable insights for application of predictive modeling techniques in biomedicine: Logistic risk models often make systematic prediction errors, and it is prudent to use subgroup-based prediction models such as those given by CPXR

  8. A Probabilistic Model of Global-Scale Seismology with Veith-Clawson Amplitude Corrections

    NASA Astrophysics Data System (ADS)

    Arora, N. S.; Russell, S.

    2013-12-01

    We present a probabilistic generative model of global-scale seismology, NET-VISA, that is designed to address the event detection and location problem of seismic monitoring. The model is based on a standard Bayesian framework with prior probabilities for event generation and propagation as well as likelihoods of detection and arrival (or onset) parameters. The model is supplemented with a greedy search algorithm that iteratively improves the predicted bulletin with respect to the posterior probability. Our prior model incorporates both seismic theory and empirical observations as appropriate. For instance, we use empirical observations for the expected rates of earthquakes at each point on the Earth, while we use the Gutenberg-Richter law for the expected magnitude distribution of these earthquakes. In this work, we describe an extension of our model where we include the Veith-Clawson (1972) amplitude decline curves in our empirically calibrated arrival amplitude model. While this change doesn't alter the overall event-detection results, we have chosen to keep the Veith-Clawson curves since they are more seismically accurate. We also describe a recent change to our search algorithm, whereby we now consider multiple hypotheses when we encounter a series of closely spaced arrivals which could be explained by either a single event or multiple co-located events. This change has led to a sharp improvement in our results on large aftershock sequences. We use the analyst-curated LEB bulletin or the REB bulletin, which is the published product of the IDC, as a reference and measure the overlap (percentage of reference events that are matched) and inconsistency (percentage of test bulletin events that don't match anything in the reference) of a one-to-one matching between the test and the reference bulletins. In the table below we show results for NET-VISA and SEL3, which is produced by the existing GA software, for the whole of 2009. These results show that NET

  9. Twenty-first century probabilistic projections of precipitation over Ontario, Canada through a regional climate model ensemble

    NASA Astrophysics Data System (ADS)

    Wang, Xiuquan; Huang, Guohe; Liu, Jinliang

    2016-06-01

    In this study, probabilistic projections of precipitation for the Province of Ontario are developed through a regional climate model ensemble to help investigate how global warming would affect its local climate. The PRECIS regional climate modeling system is employed to perform ensemble simulations, driven by a set of boundary conditions from a HadCM3-based perturbed-physics ensemble. The PRECIS ensemble simulations are fed into a Bayesian hierarchical model to quantify uncertain factors affecting the resulting projections of precipitation and thus generate probabilistic precipitation changes at grid-point scales. Following that, reliable precipitation projections throughout the twenty-first century are developed for the entire province by applying the probabilistic changes to the observed precipitation. The results show that the vast majority of cities in Ontario are likely to see increases in annual precipitation in the 2030s, 2050s, and 2080s in comparison to the baseline observations. This suggests that the whole province is likely to receive more precipitation throughout the twenty-first century in response to global warming. The analyses of the projections of seasonal precipitation further demonstrate that the entire province is likely to receive more precipitation in winter, spring, and autumn throughout this century, while summer precipitation is only likely to increase slightly in the 2030s and would decrease gradually afterwards. However, because the magnitude of the projected decrease in summer precipitation is relatively small in comparison with the anticipated increases in the other three seasons, the annual precipitation over Ontario is likely to undergo a progressive increase throughout the twenty-first century (by 7.0% in the 2030s, 9.5% in the 2050s, and 12.6% in the 2080s). In addition, the degree of uncertainty in the precipitation projections is analyzed. The results suggest that future changes in spring precipitation show a higher degree of uncertainty than other
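
    The final step described above, applying probabilistic change factors to the observed baseline, can be sketched as follows; the 7.0% mean change for the 2030s is taken from the abstract, while the baseline value and the spread of the change distribution are illustrative assumptions.

        import numpy as np

        rng = np.random.default_rng(1)
        baseline_annual_mm = 900.0                                  # assumed observed annual precipitation
        change = rng.normal(loc=0.070, scale=0.03, size=10_000)     # sampled relative change for the 2030s
        projected = baseline_annual_mm * (1.0 + change)             # probabilistic projection

        p5, p50, p95 = np.percentile(projected, [5, 50, 95])
        print(f"2030s annual precipitation: {p50:.0f} mm (5th-95th percentile: {p5:.0f}-{p95:.0f} mm)")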

  10. Twenty-first century probabilistic projections of precipitation over Ontario, Canada through a regional climate model ensemble

    NASA Astrophysics Data System (ADS)

    Wang, Xiuquan; Huang, Guohe; Liu, Jinliang

    2015-09-01

    In this study, probabilistic projections of precipitation for the Province of Ontario are developed through a regional climate model ensemble to help investigate how global warming would affect its local climate. The PRECIS regional climate modeling system is employed to perform ensemble simulations, driven by a set of boundary conditions from a HadCM3-based perturbed-physics ensemble. The PRECIS ensemble simulations are fed into a Bayesian hierarchical model to quantify uncertain factors affecting the resulting projections of precipitation and thus generate probabilistic precipitation changes at grid-point scales. Following that, reliable precipitation projections throughout the twenty-first century are developed for the entire province by applying the probabilistic changes to the observed precipitation. The results show that the vast majority of cities in Ontario are likely to see increases in annual precipitation in the 2030s, 2050s, and 2080s in comparison to the baseline observations. This suggests that the whole province is likely to receive more precipitation throughout the twenty-first century in response to global warming. The analyses of the projections of seasonal precipitation further demonstrate that the entire province is likely to receive more precipitation in winter, spring, and autumn throughout this century, while summer precipitation is only likely to increase slightly in the 2030s and would decrease gradually afterwards. However, because the magnitude of the projected decrease in summer precipitation is relatively small in comparison with the anticipated increases in the other three seasons, the annual precipitation over Ontario is likely to undergo a progressive increase throughout the twenty-first century (by 7.0% in the 2030s, 9.5% in the 2050s, and 12.6% in the 2080s). In addition, the degree of uncertainty in the precipitation projections is analyzed. The results suggest that future changes in spring precipitation show a higher degree of uncertainty than other

  11. Technical Note: Probabilistically constraining proxy age-depth models within a Bayesian hierarchical reconstruction model

    NASA Astrophysics Data System (ADS)

    Werner, J. P.; Tingley, M. P.

    2014-12-01

    Reconstructions of late-Holocene climate rely heavily upon proxies that are assumed to be accurately dated by layer counting, such as measurements on tree rings, ice cores, and varved lake sediments. Considerable advances may be achievable if time-uncertain proxies could be included within these multiproxy reconstructions, and if time uncertainties were recognized and correctly modeled for proxies commonly treated as free of age-model errors. Current approaches to accounting for time uncertainty are generally limited to repeating the reconstruction using each of an ensemble of age models, thereby inflating the final estimated uncertainty - in effect, each possible age model is given equal weighting. Uncertainties can be reduced by exploiting the inferred space-time covariance structure of the climate to re-weight the possible age models. Here we demonstrate how Bayesian hierarchical climate reconstruction models can be augmented to account for time-uncertain proxies. Critically, while a priori all age models are given equal probability of being correct, the probabilities associated with the age models are formally updated within the Bayesian framework, thereby reducing uncertainties. Numerical experiments show that updating the age-model probabilities decreases uncertainty in the climate reconstruction, as compared with the current de facto standard of sampling over all age models, provided there is sufficient information from other data sources in the region of the time-uncertain proxy. This approach can readily be generalized to non-layer-counted proxies, such as those derived from marine sediments.
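
    A toy sketch of the re-weighting step, assuming a Gaussian error model and a small set of candidate dating corrections (all values illustrative): the age-model probabilities start out equal and are updated by how well the re-dated proxy agrees with an independent climate estimate.

        import numpy as np

        rng = np.random.default_rng(2)
        n = 120
        t = np.arange(n)
        climate_est = np.sin(t / 10.0)                              # climate information from well-dated sources
        true_error = 4                                              # the proxy's published ages are 4 steps off
        proxy = np.sin((t - true_error) / 10.0) + rng.normal(scale=0.2, size=n)

        corrections = [0, 2, 4, 6]                                  # candidate age-model corrections, equal prior
        log_like = []
        for c in corrections:
            resid = proxy[c:] - climate_est[: n - c]                # proxy index i compared to climate at time i - c
            log_like.append(-0.5 * np.sum(resid**2) / 0.2**2)

        log_like = np.array(log_like)
        weights = np.exp(log_like - log_like.max())
        weights /= weights.sum()                                    # updated age-model probabilities
        print(dict(zip(corrections, np.round(weights, 3))))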

  12. Technical Note: Probabilistically constraining proxy age-depth models within a Bayesian hierarchical reconstruction model

    NASA Astrophysics Data System (ADS)

    Werner, J. P.; Tingley, M. P.

    2015-03-01

    Reconstructions of the late-Holocene climate rely heavily upon proxies that are assumed to be accurately dated by layer counting, such as measurements of tree rings, ice cores, and varved lake sediments. Considerable advances could be achieved if time-uncertain proxies could be included within these multiproxy reconstructions, and if time uncertainties were recognized and correctly modeled for proxies commonly treated as free of age model errors. Current approaches for accounting for time uncertainty are generally limited to repeating the reconstruction using each one of an ensemble of age models, thereby inflating the final estimated uncertainty - in effect, each possible age model is given equal weighting. Uncertainties can be reduced by exploiting the inferred space-time covariance structure of the climate to re-weight the possible age models. Here, we demonstrate how Bayesian hierarchical climate reconstruction models can be augmented to account for time-uncertain proxies. Critically, although a priori all age models are given equal probability of being correct, the probabilities associated with the age models are formally updated within the Bayesian framework, thereby reducing uncertainties. Numerical experiments show that updating the age model probabilities decreases uncertainty in the resulting reconstructions, as compared with the current de facto standard of sampling over all age models, provided there is sufficient information from other data sources in the spatial region of the time-uncertain proxy. This approach can readily be generalized to non-layer-counted proxies, such as those derived from marine sediments.

  13. Probabilistically constraining proxy age-depth models within a Bayesian hierarchical reconstruction model

    NASA Astrophysics Data System (ADS)

    Werner, Johannes; Tingley, Martin

    2015-04-01

    Reconstructions of late-Holocene climate rely heavily upon proxies that are assumed to be accurately dated by layer counting, such as measurements on tree rings, ice cores, and varved lake sediments. Considerable advances may be achievable if time-uncertain proxies could be included within these multiproxy reconstructions, and if time uncertainties were recognized and correctly modeled for proxies commonly treated as free of age-model errors. Current approaches to accounting for time uncertainty are generally limited to repeating the reconstruction using each of an ensemble of age models, thereby inflating the final estimated uncertainty - in effect, each possible age model is given equal weighting. Uncertainties can be reduced by exploiting the inferred space-time covariance structure of the climate to re-weight the possible age models. Here we demonstrate how Bayesian hierarchical climate reconstruction models can be augmented to account for time-uncertain proxies. Critically, while a priori all age models are given equal probability of being correct, the probabilities associated with the age models are formally updated within the Bayesian framework, thereby reducing uncertainties. Numerical experiments show that updating the age-model probabilities decreases uncertainty in the climate reconstruction, as compared with the current de facto standard of sampling over all age models, provided there is sufficient information from other data sources in the region of the time-uncertain proxy. This approach can readily be generalized to non-layer-counted proxies, such as those derived from marine sediments. Werner and Tingley, Climate of the Past Discussions (2014)

  14. The effects of climate model similarity on probabilistic climate projections and the implications for local, risk-based adaptation planning

    NASA Astrophysics Data System (ADS)

    Steinschneider, Scott; McCrary, Rachel; Mearns, Linda O.; Brown, Casey

    2015-06-01

    Approaches for probability density function (pdf) development of future climate often assume that different climate models provide independent information, despite model similarities that stem from a common genealogy (models with shared code or developed at the same institution). Here we use an ensemble of projections from the Coupled Model Intercomparison Project Phase 5 to develop probabilistic climate information, with and without an accounting of intermodel correlations, for seven regions across the United States. We then use the pdfs to estimate midcentury climate-related risks to a water utility in one of the regions. We show that the variance of climate changes is underestimated across all regions if model correlations are ignored, and in some cases, the mean change shifts as well. When coupled with impact models of the hydrology and infrastructure of a water utility, the underestimated likelihood of large climate changes significantly alters the quantification of risk for water shortages by midcentury.
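
    The variance effect described here has a simple closed form under an exchangeable-correlation assumption: for m models with variance sigma^2 and common pairwise correlation rho, the variance of the multi-model mean is sigma^2 * (1 + (m - 1) * rho) / m, versus sigma^2 / m under independence. The numbers below are illustrative, not values from the study.

        m, sigma, rho = 20, 1.5, 0.4                         # ensemble size, model spread (deg C), pairwise correlation
        var_independent = sigma**2 / m                       # variance of the mean assuming independent models
        var_correlated = sigma**2 * (1 + (m - 1) * rho) / m  # variance accounting for intermodel correlation
        print(var_independent, var_correlated)               # the correlation-aware variance is larger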

  15. Probabilistic Risk Model for Organ Doses and Acute Health Effects of Astronauts on Lunar Missions

    NASA Technical Reports Server (NTRS)

    Kim, Myung-Hee Y.; Hu, Shaowen; Nounu, Hatem N.; Cucinotta, Francis A.

    2009-01-01

    Exposure to large solar particle events (SPEs) is a major concern during EVAs on the lunar surface and in Earth-to-lunar transit; as much as 15% of crew time may be spent on EVA with minimal radiation shielding. Therefore, an accurate assessment of SPE occurrence probability is required for mission planning by NASA. We apply probabilistic risk assessment (PRA) to the radiation protection of crews and to the optimization of lunar mission planning.

  16. Probabilistic Risk Assessment: A Bibliography

    NASA Technical Reports Server (NTRS)

    2000-01-01

    Probabilistic risk analysis is an integration of failure modes and effects analysis (FMEA), fault tree analysis, and other techniques to assess the potential for failure and to find ways to reduce risk. This bibliography references 160 documents in the NASA STI Database that contain the major concepts (probabilistic risk assessment, risk, and probability theory) in the basic index or major subject terms. An abstract is included with most citations, followed by the applicable subject terms.

  17. Agent autonomy approach to probabilistic physics-of-failure modeling of complex dynamic systems with interacting failure mechanisms

    NASA Astrophysics Data System (ADS)

    Gromek, Katherine Emily

    A novel computational and inference framework for physics-of-failure (PoF) reliability modeling of complex dynamic systems has been established in this research. The PoF-based reliability models are used to perform a real-time simulation of system failure processes, so that system-level reliability modeling constitutes inferences from checking the status of component-level reliability at any given time. The "agent autonomy" concept is applied as a solution method for the system-level probabilistic PoF-based (i.e., PPoF-based) modeling. This concept originated in artificial intelligence (AI) as a leading intelligent computational inference approach for modeling multi-agent systems (MAS). The concept of agent autonomy in the context of reliability modeling was first proposed by M. Azarkhail [1], where a fundamentally new idea of system representation by autonomous intelligent agents for the purpose of reliability modeling was introduced. The contribution of the current work lies in the further development of the agent autonomy concept, particularly the refined agent classification within the scope of PoF-based system reliability modeling, new approaches to the learning and autonomy properties of the intelligent agents, and the modeling of interacting failure mechanisms within the dynamic engineering system. The autonomous property of intelligent agents is defined as the agents' ability to self-activate, deactivate, or completely redefine their role in the analysis. This property of agents, together with the ability to model interacting failure mechanisms of the system elements, makes agent autonomy fundamentally different from all existing methods of probabilistic PoF-based reliability modeling. 1. Azarkhail, M., "Agent Autonomy Approach to Physics-Based Reliability Modeling of Structures and Mechanical Systems", PhD thesis, University of Maryland, College Park, 2007.

  18. Automatic Prediction of Protein 3D Structures by Probabilistic Multi-template Homology Modeling.

    PubMed

    Meier, Armin; Söding, Johannes

    2015-10-01

    Homology modeling predicts the 3D structure of a query protein based on the sequence alignment with one or more template proteins of known structure. Its great importance for biological research is owed to its speed, simplicity, reliability and wide applicability, covering more than half of the residues in protein sequence space. Although multiple templates have been shown to generally increase model quality over single templates, the information from multiple templates has so far been combined using empirically motivated, heuristic approaches. We present here a rigorous statistical framework for multi-template homology modeling. First, we find that the query proteins' atomic distance restraints can be accurately described by two-component Gaussian mixtures. This insight allowed us to apply the standard laws of probability theory to combine restraints from multiple templates. Second, we derive theoretically optimal weights to correct for the redundancy among related templates. Third, a heuristic template selection strategy is proposed. We improve the average GDT-ha model quality score by 11% over single template modeling and by 6.5% over a conventional multi-template approach on a set of 1000 query proteins. Robustness with respect to wrong constraints is likewise improved. We have integrated our multi-template modeling approach with the popular MODELLER homology modeling software in our free HHpred server http://toolkit.tuebingen.mpg.de/hhpred and also offer open source software for running MODELLER with the new restraints at https://bitbucket.org/soedinglab/hh-suite. PMID:26496371
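
    The combination step can be sketched with a single Gaussian per template (a simplification of the two-component mixtures described above): template weights, standing in for the redundancy-correcting weights, down-weight near-duplicate templates, and the weighted product of Gaussians yields a precision-weighted combined restraint. All numbers are illustrative.

        import numpy as np

        means = np.array([8.2, 8.6, 9.5])       # distance-restraint means from three templates (angstroms)
        sigmas = np.array([0.4, 0.5, 1.0])      # restraint widths
        weights = np.array([1.0, 0.5, 0.8])     # template weights (redundant template down-weighted)

        precisions = weights / sigmas**2
        combined_mean = np.sum(precisions * means) / np.sum(precisions)
        combined_sigma = 1.0 / np.sqrt(np.sum(precisions))
        print(f"combined restraint: {combined_mean:.2f} +/- {combined_sigma:.2f} angstroms")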

  19. Calculation of the climate dynamics characteristics in the coastal sea zone by the methods of hydrodynamic and probabilistic modelling

    NASA Astrophysics Data System (ADS)

    Safronov, G. F.; Zilberstein, O. I.

    1996-02-01

    A successful approach to the economic problems related to natural-resource operations in shelf and coastal sea zones requires various characteristics of the current regime state and its possible changes. Their variability results from complex hydrodynamic processes and their interactions under natural conditions. Estimating extreme sea characteristics, and those with return periods of 25, 50, and 100 years, for regions lacking observations has now become especially important. This becomes possible on the basis of joint methods that combine computer generation of ecological characteristics from available observation data with hydrodynamic and probabilistic models. Methods for evaluating the main elements of the sea regime (wind, waves, currents, sea level), including characteristics with long return periods, for coastal and offshore regions have been developed at the State Oceanographic Institute. They consist of hydrodynamic models and estimation methods for the above-mentioned hydrometeorological elements in the mesoscale and synoptic variability ranges, as well as probabilistic models of hydrometeorological phenomena, including rare events.

  20. Verification and Optimal Control of Context-Sensitive Probabilistic Boolean Networks Using Model Checking and Polynomial Optimization

    PubMed Central

    Hiraishi, Kunihiko

    2014-01-01

    One of the significant topics in systems biology is to develop control theory for gene regulatory networks (GRNs). In typical control of GRNs, the expression of some genes is inhibited (or activated) by manipulating external stimuli and the expression of other genes. Control theory for GRNs is expected to be applied to gene therapy technologies in the future. In this paper, a control method using a Boolean network (BN) is studied. A BN is widely used as a model of GRNs, in which gene expression is represented by a binary value (ON or OFF). In particular, a context-sensitive probabilistic Boolean network (CS-PBN), which is one of the extended models of BNs, is used. For CS-PBNs, the verification problem and the optimal control problem are considered. For the verification problem, a solution method using the probabilistic model checker PRISM is proposed. For the optimal control problem, a solution method using polynomial optimization is proposed. Finally, a numerical example on the WNT5A network, which is related to melanoma, is presented. The proposed methods provide useful tools for the control theory of GRNs. PMID:24587766
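
    A tiny context-sensitive PBN can be simulated directly, which helps make the model class concrete; the two genes, two contexts, and the switching and perturbation probabilities below are illustrative and are not the WNT5A network studied in the paper.

        import random

        contexts = [
            {"g1": lambda s: s["g2"], "g2": lambda s: s["g1"] and s["g2"]},    # context 0
            {"g1": lambda s: not s["g2"], "g2": lambda s: s["g1"] or s["g2"]}, # context 1
        ]
        context_probs = [0.7, 0.3]
        q, p = 0.1, 0.01                                    # context-switching and gene-perturbation probabilities

        random.seed(0)
        state = {"g1": True, "g2": False}
        ctx = 0
        for step in range(10):
            if random.random() < q:                         # possibly select a new context
                ctx = random.choices([0, 1], weights=context_probs)[0]
            new_state = {g: f(state) for g, f in contexts[ctx].items()}
            for g in new_state:                             # random perturbation of individual genes
                if random.random() < p:
                    new_state[g] = not new_state[g]
            state = new_state
            print(step, ctx, state)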

  1. Multi-model ensemble-based probabilistic prediction of tropical cyclogenesis using TIGGE model forecasts

    NASA Astrophysics Data System (ADS)

    Jaiswal, Neeru; Kishtawal, C. M.; Bhomia, Swati; Pal, P. K.

    2016-02-01

    An extended-range tropical cyclogenesis forecast model has been developed using the forecasts of global models available from the TIGGE portal. A scheme has been developed to detect the signatures of cyclogenesis in the global model forecast fields [i.e., the mean sea level pressure and surface winds (10 m horizontal winds)]. For this, a wind matching index was determined between the synthetic cyclonic wind fields and the forecast wind fields. Thresholds of 0.4 for the wind matching index and 1005 hPa for pressure were determined to detect the cyclonic systems. These detected cyclonic systems in the study region are classified into different cyclone categories based on their intensity (maximum wind speed). Forecasts of up to 15 days from three global models, viz. ECMWF, NCEP, and UKMO, have been used to predict cyclogenesis based on a multi-model ensemble approach. The occurrence of cyclonic events of different categories in all the forecast steps in the gridded region (10 × 10 km²) was used to estimate the probability of the formation of cyclogenesis. The probability of cyclogenesis was estimated by computing the grid score from the wind matching index for each model at each forecast step and convolving it with a Gaussian filter. The proposed method is used to predict the cyclogenesis of five named tropical cyclones formed during the year 2013 in the North Indian Ocean. Cyclogenesis of these systems was predicted 6-8 days in advance using this approach. The mean prediction lead time of the proposed model for the cyclogenesis event was found to be 7 days.
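
    The detection-and-smoothing step can be sketched as follows: flag grid cells where the wind matching index exceeds 0.4 and the mean sea level pressure falls below 1005 hPa, accumulate the flags over models and forecast steps, and smooth with a Gaussian filter to obtain a genesis-probability field. The synthetic fields and the filter width are illustrative assumptions.

        import numpy as np
        from scipy.ndimage import gaussian_filter

        rng = np.random.default_rng(3)
        n_models, n_steps, ny, nx = 3, 10, 50, 50
        wmi = rng.random((n_models, n_steps, ny, nx))                   # wind matching index
        mslp = 1000.0 + 10.0 * rng.random((n_models, n_steps, ny, nx))  # mean sea level pressure (hPa)

        hits = (wmi > 0.4) & (mslp < 1005.0)                 # detected cyclonic signatures
        score = hits.sum(axis=(0, 1)).astype(float)          # accumulate over models and forecast steps
        score /= n_models * n_steps                          # fraction of member-steps with a detection
        prob = gaussian_filter(score, sigma=2.0)             # smoothed genesis-probability proxy
        print(prob.max(), prob.mean())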

  2. Statistical Analyses for Probabilistic Assessments of the Reactor Pressure Vessel Structural Integrity: Building a Master Curve on an Extract of the 'Euro' Fracture Toughness Dataset, Controlling Statistical Uncertainty for Both Mono-Temperature and Multi-Temperature Tests

    SciTech Connect

    Josse, Florent; Lefebvre, Yannick; Todeschini, Patrick; Turato, Silvia; Meister, Eric

    2006-07-01

    Assessing the structural integrity of a nuclear Reactor Pressure Vessel (RPV) subjected to pressurized-thermal-shock (PTS) transients is extremely important to safety. In addition to conventional deterministic calculations to confirm RPV integrity, Electricité de France (EDF) carries out probabilistic analyses. Probabilistic analyses are interesting because some key variables, albeit conventionally taken at conservative values, can be modeled more accurately through statistical variability. One variable that significantly affects RPV structural integrity assessment is cleavage fracture initiation toughness. The reference fracture toughness method currently in use at EDF is the RCC-M and ASME Code lower-bound K_IC based on the indexing parameter RT_NDT. However, in order to quantify the toughness scatter for probabilistic analyses, the master curve method is being analyzed at present. Furthermore, the master curve method is a direct means of evaluating fracture toughness based on K_JC data. In the framework of the master curve investigation undertaken by EDF, this article deals with the following two statistical items: building a master curve from an extract of a fracture toughness dataset (from the European project 'Unified Reference Fracture Toughness Design Curves for RPV Steels') and controlling statistical uncertainty for both mono-temperature and multi-temperature tests. Concerning the first point, master curve temperature dependence is empirical in nature. To determine the 'original' master curve, Wallin postulated that a unified description of fracture toughness temperature dependence for ferritic steels is possible, and used a large number of data corresponding to nuclear-grade pressure vessel steels and welds. Our working hypothesis is that some ferritic steels may behave in slightly different ways. Therefore we focused exclusively on the basic French reactor vessel steels of types A508 Class 3 and A533 Grade B Class 1, taking the sampling
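
    For orientation, the standard master curve median toughness has the Wallin/ASTM E1921 form shown below (temperatures in deg C, K_Jc in MPa*sqrt(m) for 1T specimens); the article's own recalibration to the A508/A533 subset is not reproduced here.

        import math

        def kjc_median(T, T0):
            # Median fracture toughness of the standard master curve, indexed by the reference temperature T0.
            return 30.0 + 70.0 * math.exp(0.019 * (T - T0))

        print(kjc_median(T=-50.0, T0=-70.0))   # median toughness 20 deg C above T0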

  3. The integrated urban land model

    NASA Astrophysics Data System (ADS)

    Meng, Chunlei

    2015-06-01

    An integrated urban land model (IUM) was developed based on the Common Land Model (CoLM). A whole-layer soil evaporation parameterization scheme was developed to improve the simulation of soil evaporation, especially in arid areas. For the urban underlying surface, the energy and water balance models were modified; urban land parameters such as anthropogenic heat (AH), albedo, surface roughness length, and impervious-surface evaporation were also reparameterized. IUM was validated and compared with CoLM and the urbanized high-resolution land data assimilation system (u-HRLDAS) at single-point and regional scales. The validation results indicate that IUM can improve the simulation of land surface parameters and land-atmosphere interaction fluxes.

  4. Probabilistic Modeling of High-Temperature Material Properties of a 5-Harness 0/90 Sylramic Fiber/CVI-SiC/MI-SiC Woven Composite

    NASA Technical Reports Server (NTRS)

    Nagpal, Vinod K.; Tong, Michael; Murthy, P. L. N.; Mital, Subodh

    1998-01-01

    An integrated probabilistic approach has been developed to assess composites for high-temperature applications. This approach was used to determine the thermal and mechanical properties, and their probabilistic distributions, of a 5-harness 0/90 Sylramic fiber/CVI-SiC/MI-SiC woven Ceramic Matrix Composite (CMC) at high temperatures. The purpose of developing this approach was to generate quantitative probabilistic information on this CMC to help complete the evaluation of its potential application for the HSCT combustor liner. This approach quantified the influences of uncertainties inherent in constituent properties, called primitive variables, on selected key response variables of the CMC at 2200 °F. The quantitative information is presented in the form of Cumulative Distribution Functions (CDFs), Probability Density Functions (PDFs), and primitive-variable sensitivities of the response. Results indicate that the scatter in the response variables was reduced by 30-50% when the uncertainties in the most influential primitive variables were reduced by 50%.
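
    The probabilistic propagation idea can be sketched with a deliberately simple response model (a rule-of-mixtures modulus, chosen purely for illustration and not the CMC micromechanics used in the study): primitive constituent properties are sampled, pushed through the response model, and the response scatter is compared before and after halving the uncertainty of the most influential primitive.

        import numpy as np

        rng = np.random.default_rng(4)

        def response_samples(fiber_sd_scale=1.0, n=20_000):
            Ef = rng.normal(380.0, 20.0 * fiber_sd_scale, n)   # fiber modulus (GPa), illustrative
            Em = rng.normal(300.0, 30.0, n)                    # matrix modulus (GPa), illustrative
            Vf = rng.normal(0.40, 0.02, n)                     # fiber volume fraction, illustrative
            return Vf * Ef + (1.0 - Vf) * Em                   # rule-of-mixtures response

        base = response_samples()
        reduced = response_samples(fiber_sd_scale=0.5)         # 50% less scatter in the fiber modulus
        print(f"response scatter (std): {base.std():.1f} -> {reduced.std():.1f} GPa")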

  5. An Integrated Vehicle Modeling Environment

    NASA Technical Reports Server (NTRS)

    Totah, Joseph J.; Kinney, David J.; Kaneshige, John T.; Agabon, Shane

    1999-01-01

    This paper describes an Integrated Vehicle Modeling Environment for estimating aircraft geometric, inertial, and aerodynamic characteristics, and for interfacing with a high-fidelity, workstation-based flight simulation architecture. The goals in developing this environment are to aid in the design of next-generation intelligent flight control technologies, conduct research in advanced vehicle interface concepts for autonomous and semi-autonomous applications, and provide a value-added capability to the conceptual design and aircraft synthesis process. Results are presented for three aircraft by comparing estimates generated by the Integrated Vehicle Modeling Environment with known characteristics of each vehicle under consideration. The three aircraft are a modified F-15 with moveable canards attached to the airframe, a mid-sized, twin-engine commercial transport concept, and a small, single-engine, uninhabited aerial vehicle. Estimated physical properties and dynamic characteristics are correlated with those known for each aircraft over a large portion of the flight envelope of interest. These results represent the completion of a critical step toward meeting the stated goals for developing this modeling environment.

  6. Developing an event-tree probabilistic tsunami inundation model for NE Atlantic coasts: Application to case studies

    NASA Astrophysics Data System (ADS)

    Omira, Rachid; Baptista, Maria Ana; Matias, Luis

    2015-04-01

    This study constitutes the first assessment of probabilistic tsunami inundation in the NE Atlantic region using an event-tree approach. It aims to develop a probabilistic tsunami inundation approach for the NE Atlantic coast, with an application to two test sites of the ASTARTE project, Tangier (Morocco) and Sines (Portugal). Only tsunamis of tectonic origin are considered here, taking into account near-, regional-, and far-field sources. The multidisciplinary approach proposed here consists of an event-tree method that gathers seismic hazard assessment, tsunami numerical modelling, and statistical methods. It also presents a treatment of uncertainties related to source location and tidal stage in order to derive the likelihood of tsunami flood occurrence and of exceedance of a specific near-shore wave height during a given return period. We derive high-resolution probabilistic maximum wave heights and flood distributions for both test sites, Tangier and Sines, considering 100-, 500-, and 1000-year return periods. We find that the probability that a maximum wave height exceeds 1 m somewhere along the Sines coast reaches about 55% for a 100-year return period, and is up to 100% for a 1000-year return period. Along the Tangier coast, the probability of inundation occurrence (flow depth > 0 m) is up to 45% for a 100-year return period and reaches 96% at some near-shore coastal locations for a 500-year return period. Acknowledgements: This work is funded by project ASTARTE - Assessment, STrategy And Risk Reduction for Tsunamis in Europe, Grant 603839, 7th FP (ENV.2013.6.4-3).
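
    The event-tree aggregation can be sketched as follows, assuming Poissonian source occurrence: each source zone contributes its annual rate times the conditional probability (integrated over source-location and tide branches) that its tsunami exceeds 1 m at the site, and the probability of at least one exceedance in T years follows. The rates and conditional probabilities below are illustrative, not the ASTARTE results.

        import math

        sources = [
            {"annual_rate": 1 / 300.0, "p_exceed_1m": 0.60},   # near-field source (illustrative)
            {"annual_rate": 1 / 150.0, "p_exceed_1m": 0.20},   # regional source (illustrative)
            {"annual_rate": 1 / 900.0, "p_exceed_1m": 0.90},   # far-field source (illustrative)
        ]
        T = 100.0                                              # exposure window (years)
        lam = sum(s["annual_rate"] * s["p_exceed_1m"] for s in sources)
        p_exceed = 1.0 - math.exp(-lam * T)                    # probability of at least one exceedance in T years
        print(f"P(max wave height > 1 m within {T:.0f} yr) = {p_exceed:.2f}")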

  7. Integrated Resource Planning Model (IRPM)

    SciTech Connect

    Graham, T. B.

    2010-04-01

    The Integrated Resource Planning Model (IRPM) is a decision-support software product for resource and capacity planning. Users can evaluate changing constraints on schedule performance, projected cost, and resource use. IRPM is a unique software tool that can analyze complex business situations, from a basic supply chain to an integrated production facility to a distributed manufacturing complex. IRPM can be efficiently configured through a user-friendly graphical interface to rapidly provide charts, graphs, tables, and/or written results summarizing postulated business scenarios. No similar integrated resource planning software package is presently available. Many different businesses (from government to large corporations as well as medium-to-small manufacturing concerns) could save thousands of dollars and hundreds of labor hours in resource and schedule planning costs. By using IRPM to perform what-if business-case evaluations, those businesses could also avoid millions of dollars of revenue lost through fear of overcommitting, or through penalties and lost future business for failing to meet promised deliveries. Tough production planning questions that previously were left unanswered can now be answered with a high degree of certainty. Businesses can anticipate production problems and have solutions in hand to deal with them. IRPM allows companies to make better plans, decisions, and investments.

  8. Analysis of well test data - Application of probabilistic models to infer hydraulic properties of fractures. [Contains a list of standardized terminology or nomenclature used in statistical models]

    SciTech Connect

    Osnes, J.D. ); Winberg, A.; Andersson, J.E.; Larsson, N.A. )

    1991-09-27

    Statistical and probabilistic methods were developed for estimating the probability that a fracture is nonconductive (or, equivalently, the conductive-fracture frequency) and the distribution of the transmissivities of conductive fractures from transmissivity measurements made in single-hole injection (well) tests. These methods were applied to a database consisting of over 1,000 measurements made in nearly 25 km of borehole at five sites in Sweden. The depths of the measurements ranged from near the surface to over 600 m, and packer spacings of 20 and 25 m were used. A probabilistic model that describes the distribution of a series of transmissivity measurements was derived. When the parameters of this model were estimated using maximum likelihood estimators, the resulting estimated distributions generally fit the cumulative histograms of the transmissivity measurements very well. Further, estimates of the mean transmissivity of conductive fractures based on the maximum likelihood estimates of the model's parameters were reasonable, both in magnitude and in trend, with respect to depth. The estimates of the conductive-fracture probability were in the range of 0.5-5.0 percent, with higher values at shallow depths and increasingly smaller values as depth increased. An estimation procedure based on the probabilistic model and the maximum likelihood estimators of its parameters was recommended. Some guidelines regarding the design of injection test programs were drawn from the recommended estimation procedure and the parameter estimates based on the Swedish data. 24 refs., 12 figs., 14 tabs.
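
    The kind of probabilistic model described can be sketched as a censored-likelihood fit: an interval is nonconductive with probability p0, otherwise its transmissivity is lognormal, and measurements below a detection limit are only known to be censored. The synthetic data and all parameter values below are illustrative.

        import numpy as np
        from scipy.optimize import minimize
        from scipy.stats import norm

        rng = np.random.default_rng(5)
        n, p0_true, mu_true, sd_true, det_limit = 1000, 0.9, np.log(1e-8), 1.5, 1e-9
        conductive = rng.random(n) > p0_true
        T = np.where(conductive, np.exp(rng.normal(mu_true, sd_true, n)), 0.0)
        detected = T > det_limit
        log_t_det = np.log(T[detected])

        def neg_log_like(params):
            p0, mu, sd = params
            if not (0.0 < p0 < 1.0 and sd > 0.0):
                return np.inf
            # A censored interval is nonconductive OR conductive with transmissivity below the limit.
            p_cens = p0 + (1.0 - p0) * norm.cdf(np.log(det_limit), mu, sd)
            ll_cens = (n - detected.sum()) * np.log(p_cens)
            ll_det = detected.sum() * np.log(1.0 - p0) + norm.logpdf(log_t_det, mu, sd).sum()
            return -(ll_cens + ll_det)

        result = minimize(neg_log_like, x0=[0.5, np.log(1e-8), 1.0], method="Nelder-Mead")
        print(result.x)   # estimated [p0, mu, sigma]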

  9. Involving Stakeholders in Building Integrated Fisheries Models Using Bayesian Methods

    NASA Astrophysics Data System (ADS)

    Haapasaari, Päivi; Mäntyniemi, Samu; Kuikka, Sakari

    2013-06-01

    A participatory Bayesian approach was used to investigate how the views of stakeholders could be utilized to develop models to help understand the Central Baltic herring fishery. In task one, we applied the Bayesian belief network methodology to elicit the causal assumptions of six stakeholders on factors that influence natural mortality, growth, and egg survival of the herring stock in probabilistic terms. We also integrated the expressed views into a meta-model using the Bayesian model averaging (BMA) method. In task two, we used influence diagrams to study qualitatively how the stakeholders frame the management problem of the herring fishery and to elucidate what kinds of causalities the different views involve. The paper combines these two tasks to assess the suitability of the methodological choices for participatory modeling in terms of both a modeling tool and a participation mode. The paper also assesses the potential of the study to contribute to the development of participatory modeling practices. It is concluded that the subjective perspective on knowledge, which is fundamental in Bayesian theory, suits participatory modeling better than a positivist paradigm that seeks the objective truth. The methodology provides a flexible tool that can be adapted to different kinds of needs and challenges of participatory modeling. The ability of the approach to deal with small data sets makes it cost-effective in participatory contexts. However, the BMA methodology used in modeling the biological uncertainties is so complex that it needs further development before it can be introduced to wider use in participatory contexts.

  10. Involving stakeholders in building integrated fisheries models using Bayesian methods.

    PubMed

    Haapasaari, Päivi; Mäntyniemi, Samu; Kuikka, Sakari

    2013-06-01

    A participatory Bayesian approach was used to investigate how the views of stakeholders could be utilized to develop models to help understand the Central Baltic herring fishery. In task one, we applied the Bayesian belief network methodology to elicit the causal assumptions of six stakeholders on factors that influence natural mortality, growth, and egg survival of the herring stock in probabilistic terms. We also integrated the expressed views into a meta-model using the Bayesian model averaging (BMA) method. In task two, we used influence diagrams to study qualitatively how the stakeholders frame the management problem of the herring fishery and to elucidate what kinds of causalities the different views involve. The paper combines these two tasks to assess the suitability of the methodological choices for participatory modeling in terms of both a modeling tool and a participation mode. The paper also assesses the potential of the study to contribute to the development of participatory modeling practices. It is concluded that the subjective perspective on knowledge, which is fundamental in Bayesian theory, suits participatory modeling better than a positivist paradigm that seeks the objective truth. The methodology provides a flexible tool that can