Probabilistic Usage of the Multi-Factor Interaction Model
NASA Technical Reports Server (NTRS)
Chamis, Christos C.
2008-01-01
A Multi-Factor Interaction Model (MFIM) is used to predict the insulating foam mass expulsion during the ascending of a space vehicle. The exponents in the MFIM are evaluated by an available approach which consists of least squares and an optimization algorithm. These results were subsequently used to probabilistically evaluate the effects of the uncertainties in each participating factor in the mass expulsion. The probabilistic results show that the surface temperature dominates at high probabilities and the pressure which causes the mass expulsion at low probabil
Probabilistic simulation of the human factor in structural reliability
NASA Technical Reports Server (NTRS)
Chamis, C. C.
1993-01-01
A formal approach is described in an attempt to computationally simulate the probable ranges of uncertainties of the human factor in structural probabilistic assessments. A multi-factor interaction equation (MFIE) model has been adopted for this purpose. Human factors such as marital status, professional status, home life, job satisfaction, work load and health, are considered to demonstrate the concept. Parametric studies in conjunction with judgment are used to select reasonable values for the participating factors (primitive variables). Suitability of the MFIE in the subsequently probabilistic sensitivity studies are performed to assess the validity of the whole approach. Results obtained show that the uncertainties for no error range from five to thirty percent for the most optimistic case.
Probabilistic Multi-Factor Interaction Model for Complex Material Behavior
NASA Technical Reports Server (NTRS)
Chamis, Christos C.; Abumeri, Galib H.
2008-01-01
The Multi-Factor Interaction Model (MFIM) is used to evaluate the divot weight (foam weight ejected) from the launch external tanks. The multi-factor has sufficient degrees of freedom to evaluate a large number of factors that may contribute to the divot ejection. It also accommodates all interactions by its product form. Each factor has an exponent that satisfies only two points, the initial and final points. The exponent describes a monotonic path from the initial condition to the final. The exponent values are selected so that the described path makes sense in the absence of experimental data. In the present investigation the data used was obtained by testing simulated specimens in launching conditions. Results show that the MFIM is an effective method of describing the divot weight ejected under the conditions investigated.
Probabilistic Multi-Factor Interaction Model for Complex Material Behavior
NASA Technical Reports Server (NTRS)
Chamis, Christos C.; Abumeri, Galib H.
2008-01-01
The Multi-Factor Interaction Model (MFIM) is used to evaluate the divot weight (foam weight ejected) from the launch external tanks. The multi-factor has sufficient degrees of freedom to evaluate a large number of factors that may contribute to the divot ejection. It also accommodates all interactions by its product form. Each factor has an exponent that satisfies only two points the initial and final points. The exponent describes a monotonic path from the initial condition to the final. The exponent values are selected so that the described path makes sense in the absence of experimental data. In the present investigation, the data used was obtained by testing simulated specimens in launching conditions. Results show that the MFIM is an effective method of describing the divot weight ejected under the conditions investigated.
Probabilistic sizing of laminates with uncertainties
NASA Technical Reports Server (NTRS)
Shah, A. R.; Liaw, D. G.; Chamis, C. C.
1993-01-01
A reliability based design methodology for laminate sizing and configuration for a special case of composite structures is described. The methodology combines probabilistic composite mechanics with probabilistic structural analysis. The uncertainties of constituent materials (fiber and matrix) to predict macroscopic behavior are simulated using probabilistic theory. Uncertainties in the degradation of composite material properties are included in this design methodology. A multi-factor interaction equation is used to evaluate load and environment dependent degradation of the composite material properties at the micromechanics level. The methodology is integrated into a computer code IPACS (Integrated Probabilistic Assessment of Composite Structures). Versatility of this design approach is demonstrated by performing a multi-level probabilistic analysis to size the laminates for design structural reliability of random type structures. The results show that laminate configurations can be selected to improve the structural reliability from three failures in 1000, to no failures in one million. Results also show that the laminates with the highest reliability are the least sensitive to the loading conditions.
NASA Technical Reports Server (NTRS)
Chamis, Christos C.; Abumeri, Galib H.
2010-01-01
The Multi-Factor Interaction Model (MFIM) is used to evaluate the divot weight (foam weight ejected) from the launch external tanks. The multi-factor has sufficient degrees of freedom to evaluate a large number of factors that may contribute to the divot ejection. It also accommodates all interactions by its product form. Each factor has an exponent that satisfies only two points--the initial and final points. The exponent describes a monotonic path from the initial condition to the final. The exponent values are selected so that the described path makes sense in the absence of experimental data. In the present investigation, the data used was obtained by testing simulated specimens in launching conditions. Results show that the MFIM is an effective method of describing the divot weight ejected under the conditions investigated.
NASA Astrophysics Data System (ADS)
Song, Lu-Kai; Wen, Jie; Fei, Cheng-Wei; Bai, Guang-Chen
2018-05-01
To improve the computing efficiency and precision of probabilistic design for multi-failure structure, a distributed collaborative probabilistic design method-based fuzzy neural network of regression (FR) (called as DCFRM) is proposed with the integration of distributed collaborative response surface method and fuzzy neural network regression model. The mathematical model of DCFRM is established and the probabilistic design idea with DCFRM is introduced. The probabilistic analysis of turbine blisk involving multi-failure modes (deformation failure, stress failure and strain failure) was investigated by considering fluid-structure interaction with the proposed method. The distribution characteristics, reliability degree, and sensitivity degree of each failure mode and overall failure mode on turbine blisk are obtained, which provides a useful reference for improving the performance and reliability of aeroengine. Through the comparison of methods shows that the DCFRM reshapes the probability of probabilistic analysis for multi-failure structure and improves the computing efficiency while keeping acceptable computational precision. Moreover, the proposed method offers a useful insight for reliability-based design optimization of multi-failure structure and thereby also enriches the theory and method of mechanical reliability design.
Multi-disciplinary coupling for integrated design of propulsion systems
NASA Technical Reports Server (NTRS)
Chamis, C. C.; Singhal, S. N.
1993-01-01
Effective computational simulation procedures are described for modeling the inherent multi-disciplinary interactions for determining the true response of propulsion systems. Results are presented for propulsion system responses including multi-discipline coupling effects via (1) coupled multi-discipline tailoring, (2) an integrated system of multidisciplinary simulators, (3) coupled material-behavior/fabrication-process tailoring, (4) sensitivities using a probabilistic simulator, and (5) coupled materials/structures/fracture/probabilistic behavior simulator. The results show that the best designs can be determined if the analysis/tailoring methods account for the multi-disciplinary coupling effects. The coupling across disciplines can be used to develop an integrated interactive multi-discipline numerical propulsion system simulator.
Probabilistic Methods for Structural Reliability and Risk
NASA Technical Reports Server (NTRS)
Chamis, Christos C.
2008-01-01
A probabilistic method is used to evaluate the structural reliability and risk of select metallic and composite structures. The method is a multiscale, multifunctional and it is based on the most elemental level. A multi-factor interaction model is used to describe the material properties which are subsequently evaluated probabilistically. The metallic structure is a two rotor aircraft engine, while the composite structures consist of laminated plies (multiscale) and the properties of each ply are the multifunctional representation. The structural component is modeled by finite element. The solution method for structural responses is obtained by an updated simulation scheme. The results show that the risk for the two rotor engine is about 0.0001 and the composite built-up structure is also 0.0001.
Multi-disciplinary coupling effects for integrated design of propulsion systems
NASA Technical Reports Server (NTRS)
Chamis, C. C.; Singhal, S. N.
1993-01-01
Effective computational simulation procedures are described for modeling the inherent multi-disciplinary interactions which govern the accurate response of propulsion systems. Results are presented for propulsion system responses including multi-disciplinary coupling effects using coupled multi-discipline thermal, structural, and acoustic tailoring; an integrated system of multi-disciplinary simulators; coupled material behavior/fabrication process tailoring; sensitivities using a probabilistic simulator; and coupled materials, structures, fracture, and probabilistic behavior simulator. The results demonstrate that superior designs can be achieved if the analysis/tailoring methods account for the multi-disciplinary coupling effects. The coupling across disciplines can be used to develop an integrated coupled multi-discipline numerical propulsion system simulator.
Asano, Masanari; Khrennikov, Andrei; Ohya, Masanori; Tanaka, Yoshiharu; Yamato, Ichiro
2016-05-28
We compare the contextual probabilistic structures of the seminal two-slit experiment (quantum interference experiment), the system of three interacting bodies andEscherichia colilactose-glucose metabolism. We show that they have the same non-Kolmogorov probabilistic structure resulting from multi-contextuality. There are plenty of statistical data with non-Kolmogorov features; in particular, the probabilistic behaviour of neither quantum nor biological systems can be described classically. Biological systems (even cells and proteins) are macroscopic systems and one may try to present a more detailed model of interactions in such systems that lead to quantum-like probabilistic behaviour. The system of interactions between three bodies is one of the simplest metaphoric examples for such interactions. By proceeding further in this way (by playing withn-body systems) we shall be able to find metaphoric mechanical models for complex bio-interactions, e.g. signalling between cells, leading to non-Kolmogorov probabilistic data. © 2016 The Author(s).
Asano, Masanari; Ohya, Masanori; Yamato, Ichiro
2016-01-01
We compare the contextual probabilistic structures of the seminal two-slit experiment (quantum interference experiment), the system of three interacting bodies and Escherichia coli lactose–glucose metabolism. We show that they have the same non-Kolmogorov probabilistic structure resulting from multi-contextuality. There are plenty of statistical data with non-Kolmogorov features; in particular, the probabilistic behaviour of neither quantum nor biological systems can be described classically. Biological systems (even cells and proteins) are macroscopic systems and one may try to present a more detailed model of interactions in such systems that lead to quantum-like probabilistic behaviour. The system of interactions between three bodies is one of the simplest metaphoric examples for such interactions. By proceeding further in this way (by playing with n-body systems) we shall be able to find metaphoric mechanical models for complex bio-interactions, e.g. signalling between cells, leading to non-Kolmogorov probabilistic data. PMID:27091163
NASA Technical Reports Server (NTRS)
Chamis, Christos C.; Abumeri, Galib H.
2000-01-01
Aircraft engines are assemblies of dynamically interacting components. Engine updates to keep present aircraft flying safely and engines for new aircraft are progressively required to operate in more demanding technological and environmental requirements. Designs to effectively meet those requirements are necessarily collections of multi-scale, multi-level, multi-disciplinary analysis and optimization methods and probabilistic methods are necessary to quantify respective uncertainties. These types of methods are the only ones that can formally evaluate advanced composite designs which satisfy those progressively demanding requirements while assuring minimum cost, maximum reliability and maximum durability. Recent research activities at NASA Glenn Research Center have focused on developing multi-scale, multi-level, multidisciplinary analysis and optimization methods. Multi-scale refers to formal methods which describe complex material behavior metal or composite; multi-level refers to integration of participating disciplines to describe a structural response at the scale of interest; multidisciplinary refers to open-ended for various existing and yet to be developed discipline constructs required to formally predict/describe a structural response in engine operating environments. For example, these include but are not limited to: multi-factor models for material behavior, multi-scale composite mechanics, general purpose structural analysis, progressive structural fracture for evaluating durability and integrity, noise and acoustic fatigue, emission requirements, hot fluid mechanics, heat-transfer and probabilistic simulations. Many of these, as well as others, are encompassed in an integrated computer code identified as Engine Structures Technology Benefits Estimator (EST/BEST) or Multi-faceted/Engine Structures Optimization (MP/ESTOP). The discipline modules integrated in MP/ESTOP include: engine cycle (thermodynamics), engine weights, internal fluid mechanics, cost, mission and coupled structural/thermal, various composite property simulators and probabilistic methods to evaluate uncertainty effects (scatter ranges) in all the design parameters. The objective of the proposed paper is to briefly describe a multi-faceted design analysis and optimization capability for coupled multi-discipline engine structures optimization. Results are presented for engine and aircraft type metrics to illustrate the versatility of that capability. Results are also presented for reliability, noise and fatigue to illustrate its inclusiveness. For example, replacing metal rotors with composites reduces the engine weight by 20 percent, 15 percent noise reduction, and an order of magnitude improvement in reliability. Composite designs exist to increase fatigue life by at least two orders of magnitude compared to state-of-the-art metals.
Probabilistic teleportation via multi-parameter measurements and partially entangled states
NASA Astrophysics Data System (ADS)
Wei, Jiahua; Shi, Lei; Han, Chen; Xu, Zhiyan; Zhu, Yu; Wang, Gang; Wu, Hao
2018-04-01
In this paper, a novel scheme for probabilistic teleportation is presented with multi-parameter measurements via a non-maximally entangled state. This is in contrast to the fact that the measurement kinds for quantum teleportation are usually particular in most previous schemes. The detail implementation producers for our proposal are given by using of appropriate local unitary operations. Moreover, the total success probability and classical information of this proposal are calculated. It is demonstrated that the success probability and classical cost would be changed with the multi-measurement parameters and the entanglement factor of quantum channel. Our scheme could enlarge the research range of probabilistic teleportation.
NASA Technical Reports Server (NTRS)
1991-01-01
The technical effort and computer code enhancements performed during the sixth year of the Probabilistic Structural Analysis Methods program are summarized. Various capabilities are described to probabilistically combine structural response and structural resistance to compute component reliability. A library of structural resistance models is implemented in the Numerical Evaluations of Stochastic Structures Under Stress (NESSUS) code that included fatigue, fracture, creep, multi-factor interaction, and other important effects. In addition, a user interface was developed for user-defined resistance models. An accurate and efficient reliability method was developed and was successfully implemented in the NESSUS code to compute component reliability based on user-selected response and resistance models. A risk module was developed to compute component risk with respect to cost, performance, or user-defined criteria. The new component risk assessment capabilities were validated and demonstrated using several examples. Various supporting methodologies were also developed in support of component risk assessment.
Probabilistic simulation of the human factor in structural reliability
NASA Technical Reports Server (NTRS)
Shah, Ashwin R.; Chamis, Christos C.
1991-01-01
Many structural failures have occasionally been attributed to human factors in engineering design, analyses maintenance, and fabrication processes. Every facet of the engineering process is heavily governed by human factors and the degree of uncertainty associated with them. Factors such as societal, physical, professional, psychological, and many others introduce uncertainties that significantly influence the reliability of human performance. Quantifying human factors and associated uncertainties in structural reliability require: (1) identification of the fundamental factors that influence human performance, and (2) models to describe the interaction of these factors. An approach is being developed to quantify the uncertainties associated with the human performance. This approach consists of a multi factor model in conjunction with direct Monte-Carlo simulation.
NASA Astrophysics Data System (ADS)
Anita, G.; Selva, J.; Laura, S.
2011-12-01
We develop a comprehensive and total probabilistic tsunami hazard assessment (TotPTHA), in which many different possible source types concur to the definition of the total tsunami hazard at given target sites. In a multi-hazard and multi-risk perspective, such an innovative approach allows, in principle, to consider all possible tsunamigenic sources, from seismic events, to slides, asteroids, volcanic eruptions, etc. In this respect, we also formally introduce and discuss the treatment of interaction/cascade effects in the TotPTHA analysis. We demonstrate how external triggering events may induce significant temporary variations in the tsunami hazard. Because of this, such effects should always be considered, at least in short-term applications, to obtain unbiased analyses. Finally, we prove the feasibility of the TotPTHA and of the treatment of interaction/cascade effects by applying this methodology to an ideal region with realistic characteristics (Neverland).
Probabilistic Multi-Factor Interaction Model for Complex Material Behavior
NASA Technical Reports Server (NTRS)
Abumeri, Galib H.; Chamis, Christos C.
2010-01-01
Complex material behavior is represented by a single equation of product form to account for interaction among the various factors. The factors are selected by the physics of the problem and the environment that the model is to represent. For example, different factors will be required for each to represent temperature, moisture, erosion, corrosion, etc. It is important that the equation represent the physics of the behavior in its entirety accurately. The Multi-Factor Interaction Model (MFIM) is used to evaluate the divot weight (foam weight ejected) from the external launch tanks. The multi-factor has sufficient degrees of freedom to evaluate a large number of factors that may contribute to the divot ejection. It also accommodates all interactions by its product form. Each factor has an exponent that satisfies only two points - the initial and final points. The exponent describes a monotonic path from the initial condition to the final. The exponent values are selected so that the described path makes sense in the absence of experimental data. In the present investigation, the data used were obtained by testing simulated specimens in launching conditions. Results show that the MFIM is an effective method of describing the divot weight ejected under the conditions investigated. The problem lies in how to represent the divot weight with a single equation. A unique solution to this problem is a multi-factor equation of product form. Each factor is of the following form (1 xi/xf)ei, where xi is the initial value, usually at ambient conditions, xf the final value, and ei the exponent that makes the curve represented unimodal that meets the initial and final values. The exponents are either evaluated by test data or by technical judgment. A minor disadvantage may be the selection of exponents in the absence of any empirical data. This form has been used successfully in describing the foam ejected in simulated space environmental conditions. Seven factors were required to represent the ejected foam. The exponents were evaluated by least squares method from experimental data. The equation is used and it can represent multiple factors in other problems as well; for example, evaluation of fatigue life, creep life, fracture toughness, and structural fracture, as well as optimization functions. The software is rather simplistic. Required inputs are initial value, final value, and an exponent for each factor. The number of factors is open-ended. The value is updated as each factor is evaluated. If a factor goes to zero, the previous value is used in the evaluation.
Probabilistic simulation of the human factor in structural reliability
NASA Astrophysics Data System (ADS)
Chamis, Christos C.; Singhal, Surendra N.
1994-09-01
The formal approach described herein computationally simulates the probable ranges of uncertainties for the human factor in probabilistic assessments of structural reliability. Human factors such as marital status, professional status, home life, job satisfaction, work load, and health are studied by using a multifactor interaction equation (MFIE) model to demonstrate the approach. Parametric studies in conjunction with judgment are used to select reasonable values for the participating factors (primitive variables). Subsequently performed probabilistic sensitivity studies assess the suitability of the MFIE as well as the validity of the whole approach. Results show that uncertainties range from 5 to 30 percent for the most optimistic case, assuming 100 percent for no error (perfect performance).
Probabilistic Simulation of the Human Factor in Structural Reliability
NASA Technical Reports Server (NTRS)
Chamis, Christos C.; Singhal, Surendra N.
1994-01-01
The formal approach described herein computationally simulates the probable ranges of uncertainties for the human factor in probabilistic assessments of structural reliability. Human factors such as marital status, professional status, home life, job satisfaction, work load, and health are studied by using a multifactor interaction equation (MFIE) model to demonstrate the approach. Parametric studies in conjunction with judgment are used to select reasonable values for the participating factors (primitive variables). Subsequently performed probabilistic sensitivity studies assess the suitability of the MFIE as well as the validity of the whole approach. Results show that uncertainties range from 5 to 30 percent for the most optimistic case, assuming 100 percent for no error (perfect performance).
Liu, Jing; Li, Yongping; Huang, Guohe; Fu, Haiyan; Zhang, Junlong; Cheng, Guanhui
2017-06-01
In this study, a multi-level-factorial risk-inference-based possibilistic-probabilistic programming (MRPP) method is proposed for supporting water quality management under multiple uncertainties. The MRPP method can handle uncertainties expressed as fuzzy-random-boundary intervals, probability distributions, and interval numbers, and analyze the effects of uncertainties as well as their interactions on modeling outputs. It is applied to plan water quality management in the Xiangxihe watershed. Results reveal that a lower probability of satisfying the objective function (θ) as well as a higher probability of violating environmental constraints (q i ) would correspond to a higher system benefit with an increased risk of violating system feasibility. Chemical plants are the major contributors to biological oxygen demand (BOD) and total phosphorus (TP) discharges; total nitrogen (TN) would be mainly discharged by crop farming. It is also discovered that optimistic decision makers should pay more attention to the interactions between chemical plant and water supply, while decision makers who possess a risk-averse attitude would focus on the interactive effect of q i and benefit of water supply. The findings can help enhance the model's applicability and identify a suitable water quality management policy for environmental sustainability according to the practical situations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Coleman, Justin; Slaughter, Andrew; Veeraraghavan, Swetha
Multi-hazard Analysis for STOchastic time-DOmaiN phenomena (MASTODON) is a finite element application that aims at analyzing the response of 3-D soil-structure systems to natural and man-made hazards such as earthquakes, floods and fire. MASTODON currently focuses on the simulation of seismic events and has the capability to perform extensive ‘source-to-site’ simulations including earthquake fault rupture, nonlinear wave propagation and nonlinear soil-structure interaction (NLSSI) analysis. MASTODON is being developed to be a dynamic probabilistic risk assessment framework that enables analysts to not only perform deterministic analyses, but also easily perform probabilistic or stochastic simulations for the purpose of risk assessment.
A comprehensive Probabilistic Tsunami Hazard Assessment for the city of Naples (Italy)
NASA Astrophysics Data System (ADS)
Anita, G.; Tonini, R.; Selva, J.; Sandri, L.; Pierdominici, S.; Faenza, L.; Zaccarelli, L.
2012-12-01
A comprehensive Probabilistic Tsunami Hazard Assessment (PTHA) should consider different tsunamigenic sources (seismic events, slide failures, volcanic eruptions) to calculate the hazard on given target sites. This implies a multi-disciplinary analysis of all natural tsunamigenic sources, in a multi-hazard/risk framework, which considers also the effects of interaction/cascade events. Our approach shows the ongoing effort to analyze the comprehensive PTHA for the city of Naples (Italy) including all types of sources located in the Tyrrhenian Sea, as developed within the Italian project ByMuR (Bayesian Multi-Risk Assessment). The project combines a multi-hazard/risk approach to treat the interactions among different hazards, and a Bayesian approach to handle the uncertainties. The natural potential tsunamigenic sources analyzed are: 1) submarine seismic sources located on active faults in the Tyrrhenian Sea and close to the Southern Italian shore line (also we consider the effects of the inshore seismic sources and the associated active faults which we provide their rapture properties), 2) mass failures and collapses around the target area (spatially identified on the basis of their propensity to failure), and 3) volcanic sources mainly identified by pyroclastic flows and collapses from the volcanoes in the Neapolitan area (Vesuvius, Campi Flegrei and Ischia). All these natural sources are here preliminary analyzed and combined, in order to provide a complete picture of a PTHA for the city of Naples. In addition, the treatment of interaction/cascade effects is formally discussed in the case of significant temporary variations in the short-term PTHA due to an earthquake.
Addressing the Hard Factors for Command File Errors by Probabilistic Reasoning
NASA Technical Reports Server (NTRS)
Meshkat, Leila; Bryant, Larry
2014-01-01
Command File Errors (CFE) are managed using standard risk management approaches at the Jet Propulsion Laboratory. Over the last few years, more emphasis has been made on the collection, organization, and analysis of these errors for the purpose of reducing the CFE rates. More recently, probabilistic modeling techniques have been used for more in depth analysis of the perceived error rates of the DAWN mission and for managing the soft factors in the upcoming phases of the mission. We broadly classify the factors that can lead to CFE's as soft factors, which relate to the cognition of the operators and hard factors which relate to the Mission System which is composed of the hardware, software and procedures used for the generation, verification & validation and execution of commands. The focus of this paper is to use probabilistic models that represent multiple missions at JPL to determine the root cause and sensitivities of the various components of the mission system and develop recommendations and techniques for addressing them. The customization of these multi-mission models to a sample interplanetary spacecraft is done for this purpose.
Probabilistic liquefaction triggering based on the cone penetration test
Moss, R.E.S.; Seed, R.B.; Kayen, R.E.; Stewart, J.P.; Tokimatsu, K.
2005-01-01
Performance-based earthquake engineering requires a probabilistic treatment of potential failure modes in order to accurately quantify the overall stability of the system. This paper is a summary of the application portions of the probabilistic liquefaction triggering correlations proposed recently proposed by Moss and co-workers. To enable probabilistic treatment of liquefaction triggering, the variables comprising the seismic load and the liquefaction resistance were treated as inherently uncertain. Supporting data from an extensive Cone Penetration Test (CPT)-based liquefaction case history database were used to develop a probabilistic correlation. The methods used to measure the uncertainty of the load and resistance variables, how the interactions of these variables were treated using Bayesian updating, and how reliability analysis was applied to produce curves of equal probability of liquefaction are presented. The normalization for effective overburden stress, the magnitude correlated duration weighting factor, and the non-linear shear mass participation factor used are also discussed.
Integration of Advanced Probabilistic Analysis Techniques with Multi-Physics Models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cetiner, Mustafa Sacit; none,; Flanagan, George F.
2014-07-30
An integrated simulation platform that couples probabilistic analysis-based tools with model-based simulation tools can provide valuable insights for reactive and proactive responses to plant operating conditions. The objective of this work is to demonstrate the benefits of a partial implementation of the Small Modular Reactor (SMR) Probabilistic Risk Assessment (PRA) Detailed Framework Specification through the coupling of advanced PRA capabilities and accurate multi-physics plant models. Coupling a probabilistic model with a multi-physics model will aid in design, operations, and safety by providing a more accurate understanding of plant behavior. This represents the first attempt at actually integrating these two typesmore » of analyses for a control system used for operations, on a faster than real-time basis. This report documents the development of the basic communication capability to exchange data with the probabilistic model using Reliability Workbench (RWB) and the multi-physics model using Dymola. The communication pathways from injecting a fault (i.e., failing a component) to the probabilistic and multi-physics models were successfully completed. This first version was tested with prototypic models represented in both RWB and Modelica. First, a simple event tree/fault tree (ET/FT) model was created to develop the software code to implement the communication capabilities between the dynamic-link library (dll) and RWB. A program, written in C#, successfully communicates faults to the probabilistic model through the dll. A systems model of the Advanced Liquid-Metal Reactor–Power Reactor Inherently Safe Module (ALMR-PRISM) design developed under another DOE project was upgraded using Dymola to include proper interfaces to allow data exchange with the control application (ConApp). A program, written in C+, successfully communicates faults to the multi-physics model. The results of the example simulation were successfully plotted.« less
Deterministic Teleportation of Multi-qudit States in a Network via Various Probabilistic Channels
NASA Astrophysics Data System (ADS)
Zhang, Ti-Hang; Jiang, Min; Huang, Xu; Wan, Min
2014-04-01
In this paper, we present a generalized approach to faithfully teleport an unknown state of a multi-qudit system involving multi spatially remote agents via various probabilistic channels. In a quantum teleportation network, there are generally multi spatially remote relay agents between a sender and a distant receiver. With the assistance of the relay agents, it is possible to directly construct a deterministic channel between the sender and the distant receiver. In our scheme, different from previous probabilistic teleportation protocols, the integrity of the unknown multi-qudit state could be maintained even when the construction of faithful channel fails. Our results also show that the required auxiliary particle resources, local operations and classical communications are considerably reduced for the present purpose.
Multi-Scale/Multi-Functional Probabilistic Composite Fatigue
NASA Technical Reports Server (NTRS)
Chamis, Christos C.
2008-01-01
A multi-level (multi-scale/multi-functional) evaluation is demonstrated by applying it to three different sample problems. These problems include the probabilistic evaluation of a space shuttle main engine blade, an engine rotor and an aircraft wing. The results demonstrate that the blade will fail at the highest probability path, the engine two-stage rotor will fail by fracture at the rim and the aircraft wing will fail at 109 fatigue cycles with a probability of 0.9967.
Probabilistic biological network alignment.
Todor, Andrei; Dobra, Alin; Kahveci, Tamer
2013-01-01
Interactions between molecules are probabilistic events. An interaction may or may not happen with some probability, depending on a variety of factors such as the size, abundance, or proximity of the interacting molecules. In this paper, we consider the problem of aligning two biological networks. Unlike existing methods, we allow one of the two networks to contain probabilistic interactions. Allowing interaction probabilities makes the alignment more biologically relevant at the expense of explosive growth in the number of alternative topologies that may arise from different subsets of interactions that take place. We develop a novel method that efficiently and precisely characterizes this massive search space. We represent the topological similarity between pairs of aligned molecules (i.e., proteins) with the help of random variables and compute their expected values. We validate our method showing that, without sacrificing the running time performance, it can produce novel alignments. Our results also demonstrate that our method identifies biologically meaningful mappings under a comprehensive set of criteria used in the literature as well as the statistical coherence measure that we developed to analyze the statistical significance of the similarity of the functions of the aligned protein pairs.
Three key areas of scientific inquiry in the study of human exposure to environmental contaminants are 1) assessment of aggregate (i.e., multi-pathway, multi-route) exposures, 2) application of probabilistic methods to exposure prediction, and 3) the interpretation of biomarker m...
Jaiswal, Astha; Godinez, William J; Eils, Roland; Lehmann, Maik Jorg; Rohr, Karl
2015-11-01
Automatic fluorescent particle tracking is an essential task to study the dynamics of a large number of biological structures at a sub-cellular level. We have developed a probabilistic particle tracking approach based on multi-scale detection and two-step multi-frame association. The multi-scale detection scheme allows coping with particles in close proximity. For finding associations, we have developed a two-step multi-frame algorithm, which is based on a temporally semiglobal formulation as well as spatially local and global optimization. In the first step, reliable associations are determined for each particle individually in local neighborhoods. In the second step, the global spatial information over multiple frames is exploited jointly to determine optimal associations. The multi-scale detection scheme and the multi-frame association finding algorithm have been combined with a probabilistic tracking approach based on the Kalman filter. We have successfully applied our probabilistic tracking approach to synthetic as well as real microscopy image sequences of virus particles and quantified the performance. We found that the proposed approach outperforms previous approaches.
2009-07-01
Performance Analysis of the Probabilistic Multi- Hypothesis Tracking Algorithm On the SEABAR Data Sets Dr. Christian G . Hempel Naval...Hypothesis Tracking,” NUWC-NPT Technical Report 10,428, Naval Undersea Warfare Center Division, Newport, RI, 15 February 1995. [2] G . McLachlan, T...the 9th International Conference on Information Fusion, Florence Italy, July, 2006. [8] C. Hempel, “Track Initialization for Multi-Static Active Sonay
NASA Astrophysics Data System (ADS)
Wang, S.; Huang, G. H.; Huang, W.; Fan, Y. R.; Li, Z.
2015-10-01
In this study, a fractional factorial probabilistic collocation method is proposed to reveal statistical significance of hydrologic model parameters and their multi-level interactions affecting model outputs, facilitating uncertainty propagation in a reduced dimensional space. The proposed methodology is applied to the Xiangxi River watershed in China to demonstrate its validity and applicability, as well as its capability of revealing complex and dynamic parameter interactions. A set of reduced polynomial chaos expansions (PCEs) only with statistically significant terms can be obtained based on the results of factorial analysis of variance (ANOVA), achieving a reduction of uncertainty in hydrologic predictions. The predictive performance of reduced PCEs is verified by comparing against standard PCEs and the Monte Carlo with Latin hypercube sampling (MC-LHS) method in terms of reliability, sharpness, and Nash-Sutcliffe efficiency (NSE). Results reveal that the reduced PCEs are able to capture hydrologic behaviors of the Xiangxi River watershed, and they are efficient functional representations for propagating uncertainties in hydrologic predictions.
USDA-ARS?s Scientific Manuscript database
Objective: To examine the risk factors of developing functional decline and make probabilistic predictions by using a tree-based method that allows higher order polynomials and interactions of the risk factors. Methods: The conditional inference tree analysis, a data mining approach, was used to con...
NASA Technical Reports Server (NTRS)
Baskaran, Subbiah; Ramachandran, Narayanan; Noever, David
1998-01-01
The use of probabilistic (PNN) and multilayer feed forward (MLFNN) neural networks are investigated for calibration of multi-hole pressure probes and the prediction of associated flow angularity patterns in test flow fields. Both types of networks are studied in detail for their calibration and prediction characteristics. The current formalism can be applied to any multi-hole probe, however the test results for the most commonly used five-hole Cone and Prism probe types alone are reported in this article.
Probabilistic evaluation of uncertainties and risks in aerospace components
NASA Technical Reports Server (NTRS)
Shah, A. R.; Shiao, M. C.; Nagpal, V. K.; Chamis, C. C.
1992-01-01
This paper summarizes a methodology developed at NASA Lewis Research Center which computationally simulates the structural, material, and load uncertainties associated with Space Shuttle Main Engine (SSME) components. The methodology was applied to evaluate the scatter in static, buckling, dynamic, fatigue, and damage behavior of the SSME turbo pump blade. Also calculated are the probability densities of typical critical blade responses, such as effective stress, natural frequency, damage initiation, most probable damage path, etc. Risk assessments were performed for different failure modes, and the effect of material degradation on the fatigue and damage behaviors of a blade were calculated using a multi-factor interaction equation. Failure probabilities for different fatigue cycles were computed and the uncertainties associated with damage initiation and damage propagation due to different load cycle were quantified. Evaluations on the effects of mistuned blades on a rotor were made; uncertainties in the excitation frequency were found to significantly amplify the blade responses of a mistuned rotor. The effects of the number of blades on a rotor were studied. The autocorrelation function of displacements and the probability density function of the first passage time for deterministic and random barriers for structures subjected to random processes also were computed. A brief discussion was included on the future direction of probabilistic structural analysis.
NASA Astrophysics Data System (ADS)
Tang, Zhongqian; Zhang, Hua; Yi, Shanzhen; Xiao, Yangfan
2018-03-01
GIS-based multi-criteria decision analysis (MCDA) is increasingly used to support flood risk assessment. However, conventional GIS-MCDA methods fail to adequately represent spatial variability and are accompanied with considerable uncertainty. It is, thus, important to incorporate spatial variability and uncertainty into GIS-based decision analysis procedures. This research develops a spatially explicit, probabilistic GIS-MCDA approach for the delineation of potentially flood susceptible areas. The approach integrates the probabilistic and the local ordered weighted averaging (OWA) methods via Monte Carlo simulation, to take into account the uncertainty related to criteria weights, spatial heterogeneity of preferences and the risk attitude of the analyst. The approach is applied to a pilot study for the Gucheng County, central China, heavily affected by the hazardous 2012 flood. A GIS database of six geomorphological and hydrometeorological factors for the evaluation of susceptibility was created. Moreover, uncertainty and sensitivity analysis were performed to investigate the robustness of the model. The results indicate that the ensemble method improves the robustness of the model outcomes with respect to variation in criteria weights and identifies which criteria weights are most responsible for the variability of model outcomes. Therefore, the proposed approach is an improvement over the conventional deterministic method and can provides a more rational, objective and unbiased tool for flood susceptibility evaluation.
Probabilistic modeling of discourse-aware sentence processing.
Dubey, Amit; Keller, Frank; Sturt, Patrick
2013-07-01
Probabilistic models of sentence comprehension are increasingly relevant to questions concerning human language processing. However, such models are often limited to syntactic factors. This restriction is unrealistic in light of experimental results suggesting interactions between syntax and other forms of linguistic information in human sentence processing. To address this limitation, this article introduces two sentence processing models that augment a syntactic component with information about discourse co-reference. The novel combination of probabilistic syntactic components with co-reference classifiers permits them to more closely mimic human behavior than existing models. The first model uses a deep model of linguistics, based in part on probabilistic logic, allowing it to make qualitative predictions on experimental data; the second model uses shallow processing to make quantitative predictions on a broad-coverage reading-time corpus. Copyright © 2013 Cognitive Science Society, Inc.
Centralized Multi-Sensor Square Root Cubature Joint Probabilistic Data Association
Liu, Jun; Li, Gang; Qi, Lin; Li, Yaowen; He, You
2017-01-01
This paper focuses on the tracking problem of multiple targets with multiple sensors in a nonlinear cluttered environment. To avoid Jacobian matrix computation and scaling parameter adjustment, improve numerical stability, and acquire more accurate estimated results for centralized nonlinear tracking, a novel centralized multi-sensor square root cubature joint probabilistic data association algorithm (CMSCJPDA) is proposed. Firstly, the multi-sensor tracking problem is decomposed into several single-sensor multi-target tracking problems, which are sequentially processed during the estimation. Then, in each sensor, the assignment of its measurements to target tracks is accomplished on the basis of joint probabilistic data association (JPDA), and a weighted probability fusion method with square root version of a cubature Kalman filter (SRCKF) is utilized to estimate the targets’ state. With the measurements in all sensors processed CMSCJPDA is derived and the global estimated state is achieved. Experimental results show that CMSCJPDA is superior to the state-of-the-art algorithms in the aspects of tracking accuracy, numerical stability, and computational cost, which provides a new idea to solve multi-sensor tracking problems. PMID:29113085
Centralized Multi-Sensor Square Root Cubature Joint Probabilistic Data Association.
Liu, Yu; Liu, Jun; Li, Gang; Qi, Lin; Li, Yaowen; He, You
2017-11-05
This paper focuses on the tracking problem of multiple targets with multiple sensors in a nonlinear cluttered environment. To avoid Jacobian matrix computation and scaling parameter adjustment, improve numerical stability, and acquire more accurate estimated results for centralized nonlinear tracking, a novel centralized multi-sensor square root cubature joint probabilistic data association algorithm (CMSCJPDA) is proposed. Firstly, the multi-sensor tracking problem is decomposed into several single-sensor multi-target tracking problems, which are sequentially processed during the estimation. Then, in each sensor, the assignment of its measurements to target tracks is accomplished on the basis of joint probabilistic data association (JPDA), and a weighted probability fusion method with square root version of a cubature Kalman filter (SRCKF) is utilized to estimate the targets' state. With the measurements in all sensors processed CMSCJPDA is derived and the global estimated state is achieved. Experimental results show that CMSCJPDA is superior to the state-of-the-art algorithms in the aspects of tracking accuracy, numerical stability, and computational cost, which provides a new idea to solve multi-sensor tracking problems.
Development of probabilistic regional climate scenario in East Asia
NASA Astrophysics Data System (ADS)
Dairaku, K.; Ueno, G.; Ishizaki, N. N.
2015-12-01
Climate information and services for Impacts, Adaptation and Vulnerability (IAV) Assessments are of great concern. In order to develop probabilistic regional climate information that represents the uncertainty in climate scenario experiments in East Asia (CORDEX-EA and Japan), the probability distribution of 2m air temperature was estimated by using developed regression model. The method can be easily applicable to other regions and other physical quantities, and also to downscale to finer-scale dependent on availability of observation dataset. Probabilistic climate information in present (1969-1998) and future (2069-2098) climate was developed using CMIP3 SRES A1b scenarios 21 models and the observation data (CRU_TS3.22 & University of Delaware in CORDEX-EA, NIAES AMeDAS mesh data in Japan). The prototype of probabilistic information in CORDEX-EA and Japan represent the quantified structural uncertainties of multi-model ensemble experiments of climate change scenarios. Appropriate combination of statistical methods and optimization of climate ensemble experiments using multi-General Circulation Models (GCMs) and multi-regional climate models (RCMs) ensemble downscaling experiments are investigated.
Long-term multi-hazard assessment for El Misti volcano (Peru)
NASA Astrophysics Data System (ADS)
Sandri, Laura; Thouret, Jean-Claude; Constantinescu, Robert; Biass, Sébastien; Tonini, Roberto
2014-02-01
We propose a long-term probabilistic multi-hazard assessment for El Misti Volcano, a composite cone located <20 km from Arequipa. The second largest Peruvian city is a rapidly expanding economic centre and is classified by UNESCO as World Heritage. We apply the Bayesian Event Tree code for Volcanic Hazard (BET_VH) to produce probabilistic hazard maps for the predominant volcanic phenomena that may affect c.900,000 people living around the volcano. The methodology accounts for the natural variability displayed by volcanoes in their eruptive behaviour, such as different types/sizes of eruptions and possible vent locations. For this purpose, we treat probabilistically several model runs for some of the main hazardous phenomena (lahars, pyroclastic density currents (PDCs), tephra fall and ballistic ejecta) and data from past eruptions at El Misti (tephra fall, PDCs and lahars) and at other volcanoes (PDCs). The hazard maps, although neglecting possible interactions among phenomena or cascade effects, have been produced with a homogeneous method and refer to a common time window of 1 year. The probability maps reveal that only the north and east suburbs of Arequipa are exposed to all volcanic threats except for ballistic ejecta, which are limited to the uninhabited but touristic summit cone. The probability for pyroclastic density currents reaching recently expanding urban areas and the city along ravines is around 0.05 %/year, similar to the probability obtained for roof-critical tephra loading during the rainy season. Lahars represent by far the most probable threat (around 10 %/year) because at least four radial drainage channels can convey them approximately 20 km away from the volcano across the entire city area in heavy rain episodes, even without eruption. The Río Chili Valley represents the major concern to city safety owing to the probable cascading effect of combined threats: PDCs and rockslides, dammed lake break-outs and subsequent lahars or floods. Although this study does not intend to replace the current El Misti hazard map, the quantitative results of this probabilistic multi-hazard assessment can be incorporated into a multi-risk analysis, to support decision makers in any future improvement of the current hazard evaluation, such as further land-use planning and possible emergency management.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Czejdo, Bogdan; Bhattacharya, Sambit; Ferragut, Erik M
2012-01-01
This paper describes the syntax and semantics of multi-level state diagrams to support probabilistic behavior of cooperating robots. The techniques are presented to analyze these diagrams by querying combined robots behaviors. It is shown how to use state abstraction and transition abstraction to create, verify and process large probabilistic state diagrams.
Multi-agent simulation of generation expansion in electricity markets.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Botterud, A; Mahalik, M. R.; Veselka, T. D.
2007-06-01
We present a new multi-agent model of generation expansion in electricity markets. The model simulates generation investment decisions of decentralized generating companies (GenCos) interacting in a complex, multidimensional environment. A probabilistic dispatch algorithm calculates prices and profits for new candidate units in different future states of the system. Uncertainties in future load, hydropower conditions, and competitors actions are represented in a scenario tree, and decision analysis is used to identify the optimal expansion decision for each individual GenCo. We test the model using real data for the Korea power system under different assumptions about market design, market concentration, and GenCo'smore » assumed expectations about their competitors investment decisions.« less
NASA Astrophysics Data System (ADS)
Jiang, Min; Li, Hui; Zhang, Zeng-ke; Zeng, Jia
2011-02-01
We present an approach to faithfully teleport an unknown quantum state of entangled particles in a multi-particle system involving multi spatially remote agents via probabilistic channels. In our scheme, the integrity of an entangled multi-particle state can be maintained even when the construction of a faithful channel fails. Furthermore, in a quantum teleportation network, there are generally multi spatially remote agents which play the role of relay nodes between a sender and a distant receiver. Hence, we propose two schemes for directly and indirectly constructing a faithful channel between the sender and the distant receiver with the assistance of relay agents, respectively. Our results show that the required auxiliary particle resources, local operations and classical communications are considerably reduced for the present purpose.
Probabilistic simulation of stress concentration in composite laminates
NASA Technical Reports Server (NTRS)
Chamis, C. C.; Murthy, P. L. N.; Liaw, L.
1993-01-01
A computational methodology is described to probabilistically simulate the stress concentration factors in composite laminates. This new approach consists of coupling probabilistic composite mechanics with probabilistic finite element structural analysis. The probabilistic composite mechanics is used to probabilistically describe all the uncertainties inherent in composite material properties while probabilistic finite element is used to probabilistically describe the uncertainties associated with methods to experimentally evaluate stress concentration factors such as loads, geometry, and supports. The effectiveness of the methodology is demonstrated by using it to simulate the stress concentration factors in composite laminates made from three different composite systems. Simulated results match experimental data for probability density and for cumulative distribution functions. The sensitivity factors indicate that the stress concentration factors are influenced by local stiffness variables, by load eccentricities and by initial stress fields.
NASA Astrophysics Data System (ADS)
Zakhidova, D. V.; Kadyrhodjaev, A.; Scientific Team Of Hydroengeo Institute On Natural Hazards
2010-12-01
Well-timed warning of the population about possible landslide threat is one of the main positions in order to provide safe and stable country development. The system of monitoring over dangerous geological processes includes such components, as observation, forecast, control and management. Aspects of forecasting take special place. Having wide row of observations there can be possible to reveal some regularity of the phenomena, basing on which, it is possible to proceed forecasting. We looked through many approaches of forecasting that are used in different countries. The analysis of the available work has allowed to draw up a conclusion that while referring to the question of landslide forecasting, it is necessary to approach in system form, taking into account interacting components of the nature. The study of landslide processes has shown that these processes lies within the framework of engineering-geological directions of the science and also interacts with tectonics, geomorphology, hydrogeology, hydrology, climate change, technogenesis and etc. Thereby, the necessity of system approach, achievements of modern science and technology the most expedient approach to make a decision at landslide forecasting is probabilistic-statistical method with complex use of geological and satellite data, specific images processed through geoinformation systems. In this connection, probabilistic-statistical approach, reflecting natural characteristics of interacting natural system, allows to take into account multi-factored processes of landslide activations. Among the many factors, influencing on landslide activation, there exist ones that are not amenable to numerical feature. The parameters of these factors have descriptive, qualitative, rather than quantitative nature. Leaving these factors with lack of attention is absolutely not reasonable. Proposed approach has one more advantage, which allows taking into account not only numerical, but also non-numeric parameters. Combination of multidisciplinary, systematic feature, multifactorness of the account, probabilistic and statistical methods of the calculation, complex use of geological and satellite data, using modern technology processing and analysis of information - all these aspects were collected in one at proposed by authors approach to solve the question of defining the area of possible landslide activation. Proposed by authors method could be a part of the monitoring system for early warning of landslide activation. Thus, the authors propose to initialize the project “System development over the monitoring for the purpose of early warning of population from the threat of landslides”. In the process of project implementation there to be revealed such results like: 1. System of Geo-indicators in order to early warn quick-running landslide processes. 2. United interconnected system for remote, surface and underground types of observations over Geo-indicators. 3. Notification system of population about forthcoming threat by means of alerts, light signals, mobilization of municipalities and related ministries. In the result of project implementation there considered to reveal economic, technical, and social outputs.
NASA Astrophysics Data System (ADS)
Luo, Qiankun; Wu, Jianfeng; Yang, Yun; Qian, Jiazhong; Wu, Jichun
2014-11-01
This study develops a new probabilistic multi-objective fast harmony search algorithm (PMOFHS) for optimal design of groundwater remediation systems under uncertainty associated with the hydraulic conductivity (K) of aquifers. The PMOFHS integrates the previously developed deterministic multi-objective optimization method, namely multi-objective fast harmony search algorithm (MOFHS) with a probabilistic sorting technique to search for Pareto-optimal solutions to multi-objective optimization problems in a noisy hydrogeological environment arising from insufficient K data. The PMOFHS is then coupled with the commonly used flow and transport codes, MODFLOW and MT3DMS, to identify the optimal design of groundwater remediation systems for a two-dimensional hypothetical test problem and a three-dimensional Indiana field application involving two objectives: (i) minimization of the total remediation cost through the engineering planning horizon, and (ii) minimization of the mass remaining in the aquifer at the end of the operational period, whereby the pump-and-treat (PAT) technology is used to clean up contaminated groundwater. Also, Monte Carlo (MC) analysis is employed to evaluate the effectiveness of the proposed methodology. Comprehensive analysis indicates that the proposed PMOFHS can find Pareto-optimal solutions with low variability and high reliability and is a potentially effective tool for optimizing multi-objective groundwater remediation problems under uncertainty.
UTOPIAN: user-driven topic modeling based on interactive nonnegative matrix factorization.
Choo, Jaegul; Lee, Changhyun; Reddy, Chandan K; Park, Haesun
2013-12-01
Topic modeling has been widely used for analyzing text document collections. Recently, there have been significant advancements in various topic modeling techniques, particularly in the form of probabilistic graphical modeling. State-of-the-art techniques such as Latent Dirichlet Allocation (LDA) have been successfully applied in visual text analytics. However, most of the widely-used methods based on probabilistic modeling have drawbacks in terms of consistency from multiple runs and empirical convergence. Furthermore, due to the complicatedness in the formulation and the algorithm, LDA cannot easily incorporate various types of user feedback. To tackle this problem, we propose a reliable and flexible visual analytics system for topic modeling called UTOPIAN (User-driven Topic modeling based on Interactive Nonnegative Matrix Factorization). Centered around its semi-supervised formulation, UTOPIAN enables users to interact with the topic modeling method and steer the result in a user-driven manner. We demonstrate the capability of UTOPIAN via several usage scenarios with real-world document corpuses such as InfoVis/VAST paper data set and product review data sets.
Effect of Cyclic Thermo-Mechanical Loads on Fatigue Reliability in Polymer Matrix Composites
NASA Technical Reports Server (NTRS)
Shah, A. R.; Murthy, P. L. N.; Chamis, C. C.
1996-01-01
A methodology to compute probabilistic fatigue life of polymer matrix laminated composites has been developed and demonstrated. Matrix degradation effects caused by long term environmental exposure and mechanical/thermal cyclic loads are accounted for in the simulation process. A unified time-temperature-stress dependent multi-factor interaction relationship developed at NASA Lewis Research Center has been used to model the degradation/aging of material properties due to cyclic loads. The fast probability integration method is used to compute probabilistic distribution of response. Sensitivities of fatigue life reliability to uncertainties in the primitive random variables (e.g., constituent properties, fiber volume ratio, void volume ratio, ply thickness, etc.) computed and their significance in the reliability- based design for maximum life is discussed. The effect of variation in the thermal cyclic loads on the fatigue reliability for a (0/+/- 45/90)(sub s) graphite/epoxy laminate with a ply thickness of 0.127 mm, with respect to impending failure modes has been studied. The results show that, at low mechanical cyclic loads and low thermal cyclic amplitudes, fatigue life for 0.999 reliability is most sensitive to matrix compressive strength, matrix modulus, thermal expansion coefficient, and ply thickness. Whereas at high mechanical cyclic loads and high thermal cyclic amplitudes, fatigue life at 0.999 reliability is more sensitive to the shear strength of matrix, longitudinal fiber modulus, matrix modulus, and ply thickness.
Concurrent Probabilistic Simulation of High Temperature Composite Structural Response
NASA Technical Reports Server (NTRS)
Abdi, Frank
1996-01-01
A computational structural/material analysis and design tool which would meet industry's future demand for expedience and reduced cost is presented. This unique software 'GENOA' is dedicated to parallel and high speed analysis to perform probabilistic evaluation of high temperature composite response of aerospace systems. The development is based on detailed integration and modification of diverse fields of specialized analysis techniques and mathematical models to combine their latest innovative capabilities into a commercially viable software package. The technique is specifically designed to exploit the availability of processors to perform computationally intense probabilistic analysis assessing uncertainties in structural reliability analysis and composite micromechanics. The primary objectives which were achieved in performing the development were: (1) Utilization of the power of parallel processing and static/dynamic load balancing optimization to make the complex simulation of structure, material and processing of high temperature composite affordable; (2) Computational integration and synchronization of probabilistic mathematics, structural/material mechanics and parallel computing; (3) Implementation of an innovative multi-level domain decomposition technique to identify the inherent parallelism, and increasing convergence rates through high- and low-level processor assignment; (4) Creating the framework for Portable Paralleled architecture for the machine independent Multi Instruction Multi Data, (MIMD), Single Instruction Multi Data (SIMD), hybrid and distributed workstation type of computers; and (5) Market evaluation. The results of Phase-2 effort provides a good basis for continuation and warrants Phase-3 government, and industry partnership.
Sjöberg, C; Ahnesjö, A
2013-06-01
Label fusion multi-atlas approaches for image segmentation can give better segmentation results than single atlas methods. We present a multi-atlas label fusion strategy based on probabilistic weighting of distance maps. Relationships between image similarities and segmentation similarities are estimated in a learning phase and used to derive fusion weights that are proportional to the probability for each atlas to improve the segmentation result. The method was tested using a leave-one-out strategy on a database of 21 pre-segmented prostate patients for different image registrations combined with different image similarity scorings. The probabilistic weighting yields results that are equal or better compared to both fusion with equal weights and results using the STAPLE algorithm. Results from the experiments demonstrate that label fusion by weighted distance maps is feasible, and that probabilistic weighted fusion improves segmentation quality more the stronger the individual atlas segmentation quality depends on the corresponding registered image similarity. The regions used for evaluation of the image similarity measures were found to be more important than the choice of similarity measure. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
Probabilistic Simulation of Multi-Scale Composite Behavior
NASA Technical Reports Server (NTRS)
Chamis, Christos C.
2012-01-01
A methodology is developed to computationally assess the non-deterministic composite response at all composite scales (from micro to structural) due to the uncertainties in the constituent (fiber and matrix) properties, in the fabrication process and in structural variables (primitive variables). The methodology is computationally efficient for simulating the probability distributions of composite behavior, such as material properties, laminate and structural responses. Bi-products of the methodology are probabilistic sensitivities of the composite primitive variables. The methodology has been implemented into the computer codes PICAN (Probabilistic Integrated Composite ANalyzer) and IPACS (Integrated Probabilistic Assessment of Composite Structures). The accuracy and efficiency of this methodology are demonstrated by simulating the uncertainties in composite typical laminates and comparing the results with the Monte Carlo simulation method. Available experimental data of composite laminate behavior at all scales fall within the scatters predicted by PICAN. Multi-scaling is extended to simulate probabilistic thermo-mechanical fatigue and to simulate the probabilistic design of a composite redome in order to illustrate its versatility. Results show that probabilistic fatigue can be simulated for different temperature amplitudes and for different cyclic stress magnitudes. Results also show that laminate configurations can be selected to increase the redome reliability by several orders of magnitude without increasing the laminate thickness--a unique feature of structural composites. The old reference denotes that nothing fundamental has been done since that time.
NASA Astrophysics Data System (ADS)
Serb, Alexander; Bill, Johannes; Khiat, Ali; Berdan, Radu; Legenstein, Robert; Prodromakis, Themis
2016-09-01
In an increasingly data-rich world the need for developing computing systems that cannot only process, but ideally also interpret big data is becoming continuously more pressing. Brain-inspired concepts have shown great promise towards addressing this need. Here we demonstrate unsupervised learning in a probabilistic neural network that utilizes metal-oxide memristive devices as multi-state synapses. Our approach can be exploited for processing unlabelled data and can adapt to time-varying clusters that underlie incoming data by supporting the capability of reversible unsupervised learning. The potential of this work is showcased through the demonstration of successful learning in the presence of corrupted input data and probabilistic neurons, thus paving the way towards robust big-data processors.
Bayesian networks and information theory for audio-visual perception modeling.
Besson, Patricia; Richiardi, Jonas; Bourdin, Christophe; Bringoux, Lionel; Mestre, Daniel R; Vercher, Jean-Louis
2010-09-01
Thanks to their different senses, human observers acquire multiple information coming from their environment. Complex cross-modal interactions occur during this perceptual process. This article proposes a framework to analyze and model these interactions through a rigorous and systematic data-driven process. This requires considering the general relationships between the physical events or factors involved in the process, not only in quantitative terms, but also in term of the influence of one factor on another. We use tools from information theory and probabilistic reasoning to derive relationships between the random variables of interest, where the central notion is that of conditional independence. Using mutual information analysis to guide the model elicitation process, a probabilistic causal model encoded as a Bayesian network is obtained. We exemplify the method by using data collected in an audio-visual localization task for human subjects, and we show that it yields a well-motivated model with good predictive ability. The model elicitation process offers new prospects for the investigation of the cognitive mechanisms of multisensory perception.
NASA Astrophysics Data System (ADS)
Gromek, Katherine Emily
A novel computational and inference framework of the physics-of-failure (PoF) reliability modeling for complex dynamic systems has been established in this research. The PoF-based reliability models are used to perform a real time simulation of system failure processes, so that the system level reliability modeling would constitute inferences from checking the status of component level reliability at any given time. The "agent autonomy" concept is applied as a solution method for the system-level probabilistic PoF-based (i.e. PPoF-based) modeling. This concept originated from artificial intelligence (AI) as a leading intelligent computational inference in modeling of multi agents systems (MAS). The concept of agent autonomy in the context of reliability modeling was first proposed by M. Azarkhail [1], where a fundamentally new idea of system representation by autonomous intelligent agents for the purpose of reliability modeling was introduced. Contribution of the current work lies in the further development of the agent anatomy concept, particularly the refined agent classification within the scope of the PoF-based system reliability modeling, new approaches to the learning and the autonomy properties of the intelligent agents, and modeling interacting failure mechanisms within the dynamic engineering system. The autonomous property of intelligent agents is defined as agent's ability to self-activate, deactivate or completely redefine their role in the analysis. This property of agents and the ability to model interacting failure mechanisms of the system elements makes the agent autonomy fundamentally different from all existing methods of probabilistic PoF-based reliability modeling. 1. Azarkhail, M., "Agent Autonomy Approach to Physics-Based Reliability Modeling of Structures and Mechanical Systems", PhD thesis, University of Maryland, College Park, 2007.
A Re-Unification of Two Competing Models for Document Retrieval.
ERIC Educational Resources Information Center
Bodoff, David
1999-01-01
Examines query-oriented versus document-oriented information retrieval and feedback learning. Highlights include a reunification of the two approaches for probabilistic document retrieval and for vector space model (VSM) retrieval; learning in VSM and in probabilistic models; multi-dimensional scaling; and ongoing field studies. (LRW)
NASA Technical Reports Server (NTRS)
Burg, Cecile M.; Hill, Geoffrey A.; Brown, Sherilyn A.; Geiselhart, Karl A.
2004-01-01
The Systems Analysis Branch at NASA Langley Research Center has investigated revolutionary Propulsion Airframe Aeroacoustics (PAA) technologies and configurations for a Blended-Wing-Body (BWB) type aircraft as part of its research for NASA s Quiet Aircraft Technology (QAT) Project. Within the context of the long-term NASA goal of reducing the perceived aircraft noise level by a factor of 4 relative to 1997 state of the art, major configuration changes in the propulsion airframe integration system were explored with noise as a primary design consideration. An initial down-select and assessment of candidate PAA technologies for the BWB was performed using a Multi-Attribute Decision Making (MADM) process consisting of organized brainstorming and decision-making tools. The assessments focused on what effect the PAA technologies had on both the overall noise level of the BWB and what effect they had on other major design considerations such as weight, performance and cost. A probabilistic systems analysis of the PAA configurations that presented the best noise reductions with the least negative impact on the system was then performed. Detailed results from the MADM study and the probabilistic systems analysis will be published in the near future.
Probabilistic, meso-scale flood loss modelling
NASA Astrophysics Data System (ADS)
Kreibich, Heidi; Botto, Anna; Schröter, Kai; Merz, Bruno
2016-04-01
Flood risk analyses are an important basis for decisions on flood risk management and adaptation. However, such analyses are associated with significant uncertainty, even more if changes in risk due to global change are expected. Although uncertainty analysis and probabilistic approaches have received increased attention during the last years, they are still not standard practice for flood risk assessments and even more for flood loss modelling. State of the art in flood loss modelling is still the use of simple, deterministic approaches like stage-damage functions. Novel probabilistic, multi-variate flood loss models have been developed and validated on the micro-scale using a data-mining approach, namely bagging decision trees (Merz et al. 2013). In this presentation we demonstrate and evaluate the upscaling of the approach to the meso-scale, namely on the basis of land-use units. The model is applied in 19 municipalities which were affected during the 2002 flood by the River Mulde in Saxony, Germany (Botto et al. submitted). The application of bagging decision tree based loss models provide a probability distribution of estimated loss per municipality. Validation is undertaken on the one hand via a comparison with eight deterministic loss models including stage-damage functions as well as multi-variate models. On the other hand the results are compared with official loss data provided by the Saxon Relief Bank (SAB). The results show, that uncertainties of loss estimation remain high. Thus, the significant advantage of this probabilistic flood loss estimation approach is that it inherently provides quantitative information about the uncertainty of the prediction. References: Merz, B.; Kreibich, H.; Lall, U. (2013): Multi-variate flood damage assessment: a tree-based data-mining approach. NHESS, 13(1), 53-64. Botto A, Kreibich H, Merz B, Schröter K (submitted) Probabilistic, multi-variable flood loss modelling on the meso-scale with BT-FLEMO. Risk Analysis.
NASA Technical Reports Server (NTRS)
Mavris, Dimitri N.
1998-01-01
Over the past few years, modem aircraft design has experienced a paradigm shift from designing for performance to designing for affordability. This report contains a probabilistic approach that will allow traditional deterministic design methods to be extended to account for disciplinary, economic, and technological uncertainty. The probabilistic approach was facilitated by the Fast Probability Integration (FPI) technique; a technique which allows the designer to gather valuable information about the vehicle's behavior in the design space. This technique is efficient for assessing multi-attribute, multi-constraint problems in a more realistic fashion. For implementation purposes, this technique is applied to illustrate how both economic and technological uncertainty associated with a Very Large Transport aircraft concept may be assessed. The assessment is evaluated with the FPI technique to determine the cumulative probability distributions of the design space, as bound by economic objectives and performance constraints. These distributions were compared to established targets for a comparable large capacity aircraft, similar in size to the Boeing 747-400. The conventional baseline configuration design space was determined to be unfeasible and marginally viable, motivating the infusion of advanced technologies, including reductions in drag, specific fuel consumption, wing weight, and Research, Development, Testing, and Evaluation costs. The resulting system design space was qualitatively assessed with technology metric "k" factors. The infusion of technologies shifted the VLT design into regions of feasibility and greater viability. The study also demonstrated a method and relationship by which the impact of new technologies may be assessed in a more system focused approach.
Vanderveldt, Ariana; Green, Leonard; Myerson, Joel
2014-01-01
The value of an outcome is affected both by the delay until its receipt (delay discounting) and by the likelihood of its receipt (probability discounting). Despite being well-described by the same hyperboloid function, delay and probability discounting involve fundamentally different processes, as revealed, for example, by the differential effects of reward amount. Previous research has focused on the discounting of delayed and probabilistic rewards separately, with little research examining more complex situations in which rewards are both delayed and probabilistic. In two experiments, participants made choices between smaller rewards that were both immediate and certain and larger rewards that were both delayed and probabilistic. Analyses revealed significant interactions between delay and probability factors inconsistent with an additive model. In contrast, a hyperboloid discounting model in which delay and probability were combined multiplicatively provided an excellent fit to the data. These results suggest that the hyperboloid is a good descriptor of decision making in complicated monetary choice situations like those people encounter in everyday life. PMID:24933696
NASA Astrophysics Data System (ADS)
Hussin, Haydar; van Westen, Cees; Reichenbach, Paola
2013-04-01
Local and regional authorities in mountainous areas that deal with hydro-meteorological hazards like landslides and floods try to set aside budgets for emergencies and risk mitigation. However, future losses are often not calculated in a probabilistic manner when allocating budgets or determining how much risk is acceptable. The absence of probabilistic risk estimates can create a lack of preparedness for reconstruction and risk reduction costs and a deficiency in promoting risk mitigation and prevention in an effective way. The probabilistic risk of natural hazards at local scale is usually ignored all together due to the difficulty in acknowledging, processing and incorporating uncertainties in the estimation of losses (e.g. physical damage, fatalities and monetary loss). This study attempts to set up a working framework for a probabilistic risk assessment (PRA) of landslides and floods at a municipal scale using the Fella river valley (Eastern Italian Alps) as a multi-hazard case study area. The emphasis is on the evaluation and determination of the uncertainty in the estimation of losses from multi-hazards. To carry out this framework some steps are needed: (1) by using physically based stochastic landslide and flood models we aim to calculate the probability of the physical impact on individual elements at risk, (2) this is then combined with a statistical analysis of the vulnerability and monetary value of the elements at risk in order to include their uncertainty in the risk assessment, (3) finally the uncertainty from each risk component is propagated into the loss estimation. The combined effect of landslides and floods on the direct risk to communities in narrow alpine valleys is also one of important aspects that needs to be studied.
Functional Wigner representation of quantum dynamics of Bose-Einstein condensate
NASA Astrophysics Data System (ADS)
Opanchuk, B.; Drummond, P. D.
2013-04-01
We develop a method of simulating the full quantum field dynamics of multi-mode multi-component Bose-Einstein condensates in a trap. We use the truncated Wigner representation to obtain a probabilistic theory that can be sampled. This method produces c-number stochastic equations which may be solved using conventional stochastic methods. The technique is valid for large mode occupation numbers. We give a detailed derivation of methods of functional Wigner representation appropriate for quantum fields. Our approach describes spatial evolution of spinor components and properly accounts for nonlinear losses. Such techniques are applicable to calculating the leading quantum corrections, including effects such as quantum squeezing, entanglement, EPR correlations, and interactions with engineered nonlinear reservoirs. By using a consistent expansion in the inverse density, we are able to explain an inconsistency in the nonlinear loss equations found by earlier authors.
McClelland, James L.
2013-01-01
This article seeks to establish a rapprochement between explicitly Bayesian models of contextual effects in perception and neural network models of such effects, particularly the connectionist interactive activation (IA) model of perception. The article is in part an historical review and in part a tutorial, reviewing the probabilistic Bayesian approach to understanding perception and how it may be shaped by context, and also reviewing ideas about how such probabilistic computations may be carried out in neural networks, focusing on the role of context in interactive neural networks, in which both bottom-up and top-down signals affect the interpretation of sensory inputs. It is pointed out that connectionist units that use the logistic or softmax activation functions can exactly compute Bayesian posterior probabilities when the bias terms and connection weights affecting such units are set to the logarithms of appropriate probabilistic quantities. Bayesian concepts such the prior, likelihood, (joint and marginal) posterior, probability matching and maximizing, and calculating vs. sampling from the posterior are all reviewed and linked to neural network computations. Probabilistic and neural network models are explicitly linked to the concept of a probabilistic generative model that describes the relationship between the underlying target of perception (e.g., the word intended by a speaker or other source of sensory stimuli) and the sensory input that reaches the perceiver for use in inferring the underlying target. It is shown how a new version of the IA model called the multinomial interactive activation (MIA) model can sample correctly from the joint posterior of a proposed generative model for perception of letters in words, indicating that interactive processing is fully consistent with principled probabilistic computation. Ways in which these computations might be realized in real neural systems are also considered. PMID:23970868
McClelland, James L
2013-01-01
This article seeks to establish a rapprochement between explicitly Bayesian models of contextual effects in perception and neural network models of such effects, particularly the connectionist interactive activation (IA) model of perception. The article is in part an historical review and in part a tutorial, reviewing the probabilistic Bayesian approach to understanding perception and how it may be shaped by context, and also reviewing ideas about how such probabilistic computations may be carried out in neural networks, focusing on the role of context in interactive neural networks, in which both bottom-up and top-down signals affect the interpretation of sensory inputs. It is pointed out that connectionist units that use the logistic or softmax activation functions can exactly compute Bayesian posterior probabilities when the bias terms and connection weights affecting such units are set to the logarithms of appropriate probabilistic quantities. Bayesian concepts such the prior, likelihood, (joint and marginal) posterior, probability matching and maximizing, and calculating vs. sampling from the posterior are all reviewed and linked to neural network computations. Probabilistic and neural network models are explicitly linked to the concept of a probabilistic generative model that describes the relationship between the underlying target of perception (e.g., the word intended by a speaker or other source of sensory stimuli) and the sensory input that reaches the perceiver for use in inferring the underlying target. It is shown how a new version of the IA model called the multinomial interactive activation (MIA) model can sample correctly from the joint posterior of a proposed generative model for perception of letters in words, indicating that interactive processing is fully consistent with principled probabilistic computation. Ways in which these computations might be realized in real neural systems are also considered.
NASA Astrophysics Data System (ADS)
Lachaut, T.; Yoon, J.; Klassert, C. J. A.; Talozi, S.; Mustafa, D.; Knox, S.; Selby, P. D.; Haddad, Y.; Gorelick, S.; Tilmant, A.
2016-12-01
Probabilistic approaches to uncertainty in water systems management can face challenges of several types: non stationary climate, sudden shocks such as conflict-driven migrations, or the internal complexity and dynamics of large systems. There has been a rising trend in the development of bottom-up methods that place focus on the decision side instead of probability distributions and climate scenarios. These approaches are based on defining acceptability thresholds for the decision makers and considering the entire range of possibilities over which such thresholds are crossed. We aim at improving the knowledge on the applicability and relevance of this approach by enlarging its scope beyond climate uncertainty and single decision makers; thus including demographic shifts, internal system dynamics, and multiple stakeholders at different scales. This vulnerability analysis is part of the Jordan Water Project and makes use of an ambitious multi-agent model developed by its teams with the extensive cooperation of the Ministry of Water and Irrigation of Jordan. The case of Jordan is a relevant example for migration spikes, rapid social changes, resource depletion and climate change impacts. The multi-agent modeling framework used provides a consistent structure to assess the vulnerability of complex water resources systems with distributed acceptability thresholds and stakeholder interaction. A proof of concept and preliminary results are presented for a non-probabilistic vulnerability analysis that involves different types of stakeholders, uncertainties other than climatic and the integration of threshold-based indicators. For each stakeholder (agent) a vulnerability matrix is constructed over a multi-dimensional domain, which includes various hydrologic and/or demographic variables.
A Model for Generating Multi-hazard Scenarios
NASA Astrophysics Data System (ADS)
Lo Jacomo, A.; Han, D.; Champneys, A.
2017-12-01
Communities in mountain areas are often subject to risk from multiple hazards, such as earthquakes, landslides, and floods. Each hazard has its own different rate of onset, duration, and return period. Multiple hazards tend to complicate the combined risk due to their interactions. Prioritising interventions for minimising risk in this context is challenging. We developed a probabilistic multi-hazard model to help inform decision making in multi-hazard areas. The model is applied to a case study region in the Sichuan province in China, using information from satellite imagery and in-situ data. The model is not intended as a predictive model, but rather as a tool which takes stakeholder input and can be used to explore plausible hazard scenarios over time. By using a Monte Carlo framework and varrying uncertain parameters for each of the hazards, the model can be used to explore the effect of different mitigation interventions aimed at reducing the disaster risk within an uncertain hazard context.
Lee, Seungjong; Park, Kyoungyoon; Kim, Hyuntai; Vazquez-Zuniga, Luis Alonso; Kim, Jinseob; Jeong, Yoonchan
2018-04-30
We report the intermittent burst of a super rogue wave in the multi-soliton (MS) regime of an anomalous-dispersion fiber ring cavity. We exploit the spatio-temporal measurement technique to log and capture the shot-to-shot wave dynamics of various pulse events in the cavity, and obtain the corresponding intensity probability density function, which eventually unveils the inherent nature of the extreme events encompassed therein. In the breathing MS regime, a specific MS regime with heavy soliton population, the natural probability of pulse interaction among solitons and dispersive waves exponentially increases owing to the extraordinarily high soliton population density. Combination of the probabilistically started soliton interactions and subsequently accompanying dispersive waves in their vicinity triggers an avalanche of extreme events with even higher intensities, culminating to a burst of a super rogue wave nearly ten times stronger than the average solitons observed in the cavity. Without any cavity modification or control, the process naturally and intermittently recurs within a time scale in the order of ten seconds.
Ali, S. M.; Mehmood, C. A; Khan, B.; Jawad, M.; Farid, U; Jadoon, J. K.; Ali, M.; Tareen, N. K.; Usman, S.; Majid, M.; Anwar, S. M.
2016-01-01
In smart grid paradigm, the consumer demands are random and time-dependent, owning towards stochastic probabilities. The stochastically varying consumer demands have put the policy makers and supplying agencies in a demanding position for optimal generation management. The utility revenue functions are highly dependent on the consumer deterministic stochastic demand models. The sudden drifts in weather parameters effects the living standards of the consumers that in turn influence the power demands. Considering above, we analyzed stochastically and statistically the effect of random consumer demands on the fixed and variable revenues of the electrical utilities. Our work presented the Multi-Variate Gaussian Distribution Function (MVGDF) probabilistic model of the utility revenues with time-dependent consumer random demands. Moreover, the Gaussian probabilities outcome of the utility revenues is based on the varying consumer n demands data-pattern. Furthermore, Standard Monte Carlo (SMC) simulations are performed that validated the factor of accuracy in the aforesaid probabilistic demand-revenue model. We critically analyzed the effect of weather data parameters on consumer demands using correlation and multi-linear regression schemes. The statistical analysis of consumer demands provided a relationship between dependent (demand) and independent variables (weather data) for utility load management, generation control, and network expansion. PMID:27314229
Ali, S M; Mehmood, C A; Khan, B; Jawad, M; Farid, U; Jadoon, J K; Ali, M; Tareen, N K; Usman, S; Majid, M; Anwar, S M
2016-01-01
In smart grid paradigm, the consumer demands are random and time-dependent, owning towards stochastic probabilities. The stochastically varying consumer demands have put the policy makers and supplying agencies in a demanding position for optimal generation management. The utility revenue functions are highly dependent on the consumer deterministic stochastic demand models. The sudden drifts in weather parameters effects the living standards of the consumers that in turn influence the power demands. Considering above, we analyzed stochastically and statistically the effect of random consumer demands on the fixed and variable revenues of the electrical utilities. Our work presented the Multi-Variate Gaussian Distribution Function (MVGDF) probabilistic model of the utility revenues with time-dependent consumer random demands. Moreover, the Gaussian probabilities outcome of the utility revenues is based on the varying consumer n demands data-pattern. Furthermore, Standard Monte Carlo (SMC) simulations are performed that validated the factor of accuracy in the aforesaid probabilistic demand-revenue model. We critically analyzed the effect of weather data parameters on consumer demands using correlation and multi-linear regression schemes. The statistical analysis of consumer demands provided a relationship between dependent (demand) and independent variables (weather data) for utility load management, generation control, and network expansion.
Hydrologic dynamics and ecosystem structure.
Rodríguez-Iturbe, I
2003-01-01
Ecohydrology is the science that studies the mutual interaction between the hydrological cycle and ecosystems. Such an interaction is especially intense in water-controlled ecosystems, where water may be a limiting factor, not only because of its scarcity, but also because of its intermittent and unpredictable appearance. Hydrologic dynamics is shown to be a crucial factor for ecological patterns and processes. The probabilistic structure of soil moisture in time and space is presented as the key linkage between soil, climate and vegetation dynamics. Nutrient cycles, vegetation coexistence and plant response to environmental conditions are all intimately linked to the stochastic fluctuation of the hydrologic inputs driving an ecosystem.
Functional Wigner representation of quantum dynamics of Bose-Einstein condensate
DOE Office of Scientific and Technical Information (OSTI.GOV)
Opanchuk, B.; Drummond, P. D.
2013-04-15
We develop a method of simulating the full quantum field dynamics of multi-mode multi-component Bose-Einstein condensates in a trap. We use the truncated Wigner representation to obtain a probabilistic theory that can be sampled. This method produces c-number stochastic equations which may be solved using conventional stochastic methods. The technique is valid for large mode occupation numbers. We give a detailed derivation of methods of functional Wigner representation appropriate for quantum fields. Our approach describes spatial evolution of spinor components and properly accounts for nonlinear losses. Such techniques are applicable to calculating the leading quantum corrections, including effects such asmore » quantum squeezing, entanglement, EPR correlations, and interactions with engineered nonlinear reservoirs. By using a consistent expansion in the inverse density, we are able to explain an inconsistency in the nonlinear loss equations found by earlier authors.« less
Multi-Agent simulation of generation capacity expansion decisions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Botterud, A.; Mahalik, M.; Conzelmann, G.
2008-01-01
In this paper, we use a multi-agent simulation model, EMCAS, to analyze generation expansion in the Iberian electricity market. The expansion model simulates generation investment decisions of decentralized generating companies (GenCos) interacting in a complex, multidimensional environment. A probabilistic dispatch algorithm calculates prices and profits for new candidate units in different future states of the system. Uncertainties in future load, hydropower conditions, and competitorspsila actions are represented in a scenario tree, and decision analysis is used to identify the optimal expansion decision for each individual GenCo. We run the model using detailed data for the Iberian market. In a scenariomore » analysis, we look at the impact of market design variables, such as the energy price cap and carbon emission prices. We also analyze how market concentration and GenCospsila risk preferences influence the timing and choice of new generating capacity.« less
Faint Object Detection in Multi-Epoch Observations via Catalog Data Fusion
NASA Astrophysics Data System (ADS)
Budavári, Tamás; Szalay, Alexander S.; Loredo, Thomas J.
2017-03-01
Astronomy in the time-domain era faces several new challenges. One of them is the efficient use of observations obtained at multiple epochs. The work presented here addresses faint object detection and describes an incremental strategy for separating real objects from artifacts in ongoing surveys. The idea is to produce low-threshold single-epoch catalogs and to accumulate information across epochs. This is in contrast to more conventional strategies based on co-added or stacked images. We adopt a Bayesian approach, addressing object detection by calculating the marginal likelihoods for hypotheses asserting that there is no object or one object in a small image patch containing at most one cataloged source at each epoch. The object-present hypothesis interprets the sources in a patch at different epochs as arising from a genuine object; the no-object hypothesis interprets candidate sources as spurious, arising from noise peaks. We study the detection probability for constant-flux objects in a Gaussian noise setting, comparing results based on single and stacked exposures to results based on a series of single-epoch catalog summaries. Our procedure amounts to generalized cross-matching: it is the product of a factor accounting for the matching of the estimated fluxes of the candidate sources and a factor accounting for the matching of their estimated directions. We find that probabilistic fusion of multi-epoch catalogs can detect sources with similar sensitivity and selectivity compared to stacking. The probabilistic cross-matching framework underlying our approach plays an important role in maintaining detection sensitivity and points toward generalizations that could accommodate variability and complex object structure.
Multi-parametric variational data assimilation for hydrological forecasting
NASA Astrophysics Data System (ADS)
Alvarado-Montero, R.; Schwanenberg, D.; Krahe, P.; Helmke, P.; Klein, B.
2017-12-01
Ensemble forecasting is increasingly applied in flow forecasting systems to provide users with a better understanding of forecast uncertainty and consequently to take better-informed decisions. A common practice in probabilistic streamflow forecasting is to force deterministic hydrological model with an ensemble of numerical weather predictions. This approach aims at the representation of meteorological uncertainty but neglects uncertainty of the hydrological model as well as its initial conditions. Complementary approaches use probabilistic data assimilation techniques to receive a variety of initial states or represent model uncertainty by model pools instead of single deterministic models. This paper introduces a novel approach that extends a variational data assimilation based on Moving Horizon Estimation to enable the assimilation of observations into multi-parametric model pools. It results in a probabilistic estimate of initial model states that takes into account the parametric model uncertainty in the data assimilation. The assimilation technique is applied to the uppermost area of River Main in Germany. We use different parametric pools, each of them with five parameter sets, to assimilate streamflow data, as well as remotely sensed data from the H-SAF project. We assess the impact of the assimilation in the lead time performance of perfect forecasts (i.e. observed data as forcing variables) as well as deterministic and probabilistic forecasts from ECMWF. The multi-parametric assimilation shows an improvement of up to 23% for CRPS performance and approximately 20% in Brier Skill Scores with respect to the deterministic approach. It also improves the skill of the forecast in terms of rank histogram and produces a narrower ensemble spread.
NASA Astrophysics Data System (ADS)
Anees, Asim; Aryal, Jagannath; O'Reilly, Małgorzata M.; Gale, Timothy J.; Wardlaw, Tim
2016-12-01
A robust non-parametric framework, based on multiple Radial Basic Function (RBF) kernels, is proposed in this study, for detecting land/forest cover changes using Landsat 7 ETM+ images. One of the widely used frameworks is to find change vectors (difference image) and use a supervised classifier to differentiate between change and no-change. The Bayesian Classifiers e.g. Maximum Likelihood Classifier (MLC), Naive Bayes (NB), are widely used probabilistic classifiers which assume parametric models, e.g. Gaussian function, for the class conditional distributions. However, their performance can be limited if the data set deviates from the assumed model. The proposed framework exploits the useful properties of Least Squares Probabilistic Classifier (LSPC) formulation i.e. non-parametric and probabilistic nature, to model class posterior probabilities of the difference image using a linear combination of a large number of Gaussian kernels. To this end, a simple technique, based on 10-fold cross-validation is also proposed for tuning model parameters automatically instead of selecting a (possibly) suboptimal combination from pre-specified lists of values. The proposed framework has been tested and compared with Support Vector Machine (SVM) and NB for detection of defoliation, caused by leaf beetles (Paropsisterna spp.) in Eucalyptus nitens and Eucalyptus globulus plantations of two test areas, in Tasmania, Australia, using raw bands and band combination indices of Landsat 7 ETM+. It was observed that due to multi-kernel non-parametric formulation and probabilistic nature, the LSPC outperforms parametric NB with Gaussian assumption in change detection framework, with Overall Accuracy (OA) ranging from 93.6% (κ = 0.87) to 97.4% (κ = 0.94) against 85.3% (κ = 0.69) to 93.4% (κ = 0.85), and is more robust to changing data distributions. Its performance was comparable to SVM, with added advantages of being probabilistic and capable of handling multi-class problems naturally with its original formulation.
Characterizing the topology of probabilistic biological networks.
Todor, Andrei; Dobra, Alin; Kahveci, Tamer
2013-01-01
Biological interactions are often uncertain events, that may or may not take place with some probability. This uncertainty leads to a massive number of alternative interaction topologies for each such network. The existing studies analyze the degree distribution of biological networks by assuming that all the given interactions take place under all circumstances. This strong and often incorrect assumption can lead to misleading results. In this paper, we address this problem and develop a sound mathematical basis to characterize networks in the presence of uncertain interactions. Using our mathematical representation, we develop a method that can accurately describe the degree distribution of such networks. We also take one more step and extend our method to accurately compute the joint-degree distributions of node pairs connected by edges. The number of possible network topologies grows exponentially with the number of uncertain interactions. However, the mathematical model we develop allows us to compute these degree distributions in polynomial time in the number of interactions. Our method works quickly even for entire protein-protein interaction (PPI) networks. It also helps us find an adequate mathematical model using MLE. We perform a comparative study of node-degree and joint-degree distributions in two types of biological networks: the classical deterministic networks and the more flexible probabilistic networks. Our results confirm that power-law and log-normal models best describe degree distributions for both probabilistic and deterministic networks. Moreover, the inverse correlation of degrees of neighboring nodes shows that, in probabilistic networks, nodes with large number of interactions prefer to interact with those with small number of interactions more frequently than expected. We also show that probabilistic networks are more robust for node-degree distribution computation than the deterministic ones. all the data sets used, the software implemented and the alignments found in this paper are available at http://bioinformatics.cise.ufl.edu/projects/probNet/.
Probabilistic Simulation of Stress Concentration in Composite Laminates
NASA Technical Reports Server (NTRS)
Chamis, C. C.; Murthy, P. L. N.; Liaw, D. G.
1994-01-01
A computational methodology is described to probabilistically simulate the stress concentration factors (SCF's) in composite laminates. This new approach consists of coupling probabilistic composite mechanics with probabilistic finite element structural analysis. The composite mechanics is used to probabilistically describe all the uncertainties inherent in composite material properties, whereas the finite element is used to probabilistically describe the uncertainties associated with methods to experimentally evaluate SCF's, such as loads, geometry, and supports. The effectiveness of the methodology is demonstrated by using is to simulate the SCF's in three different composite laminates. Simulated results match experimental data for probability density and for cumulative distribution functions. The sensitivity factors indicate that the SCF's are influenced by local stiffness variables, by load eccentricities, and by initial stress fields.
Robust infrared targets tracking with covariance matrix representation
NASA Astrophysics Data System (ADS)
Cheng, Jian
2009-07-01
Robust infrared target tracking is an important and challenging research topic in many military and security applications, such as infrared imaging guidance, infrared reconnaissance, scene surveillance, etc. To effectively tackle the nonlinear and non-Gaussian state estimation problems, particle filtering is introduced to construct the theory framework of infrared target tracking. Under this framework, the observation probabilistic model is one of main factors for infrared targets tracking performance. In order to improve the tracking performance, covariance matrices are introduced to represent infrared targets with the multi-features. The observation probabilistic model can be constructed by computing the distance between the reference target's and the target samples' covariance matrix. Because the covariance matrix provides a natural tool for integrating multiple features, and is scale and illumination independent, target representation with covariance matrices can hold strong discriminating ability and robustness. Two experimental results demonstrate the proposed method is effective and robust for different infrared target tracking, such as the sensor ego-motion scene, and the sea-clutter scene.
Quantum Tasks with Non-maximally Quantum Channels via Positive Operator-Valued Measurement
NASA Astrophysics Data System (ADS)
Peng, Jia-Yin; Luo, Ming-Xing; Mo, Zhi-Wen
2013-01-01
By using a proper positive operator-valued measure (POVM), we present two new schemes for probabilistic transmission with non-maximally four-particle cluster states. In the first scheme, we demonstrate that two non-maximally four-particle cluster states can be used to realize probabilistically sharing an unknown three-particle GHZ-type state within either distant agent's place. In the second protocol, we demonstrate that a non-maximally four-particle cluster state can be used to teleport an arbitrary unknown multi-particle state in a probabilistic manner with appropriate unitary operations and POVM. Moreover the total success probability of these two schemes are also worked out.
NASA Astrophysics Data System (ADS)
Marques, R.; Amaral, P.; Zêzere, J. L.; Queiroz, G.; Goulart, C.
2009-04-01
Slope instability research and susceptibility mapping is a fundamental component of hazard assessment and is of extreme importance for risk mitigation, land-use management and emergency planning. Landslide susceptibility zonation has been actively pursued during the last two decades and several methodologies are still being improved. Among all the methods presented in the literature, indirect quantitative probabilistic methods have been extensively used. In this work different linear probabilistic methods, both bi-variate and multi-variate (Informative Value, Fuzzy Logic, Weights of Evidence and Logistic Regression), were used for the computation of the spatial probability of landslide occurrence, using the pixel as mapping unit. The methods used are based on linear relationships between landslides and 9 considered conditioning factors (altimetry, slope angle, exposition, curvature, distance to streams, wetness index, contribution area, lithology and land-use). It was assumed that future landslides will be conditioned by the same factors as past landslides in the study area. The presented work was developed for Ribeira Quente Valley (S. Miguel Island, Azores), a study area of 9,5 km2, mainly composed of volcanic deposits (ash and pumice lapilli) produced by explosive eruptions in Furnas Volcano. This materials associated to the steepness of the slopes (38,9% of the area has slope angles higher than 35°, reaching a maximum of 87,5°), make the area very prone to landslide activity. A total of 1.495 shallow landslides were mapped (at 1:5.000 scale) and included in a GIS database. The total affected area is 401.744 m2 (4,5% of the study area). Most slope movements are translational slides frequently evolving into debris-flows. The landslides are elongated, with maximum length generally equivalent to the slope extent, and their width normally does not exceed 25 m. The failure depth rarely exceeds 1,5 m and the volume is usually smaller than 700 m3. For modelling purposes, the landslides were randomly divided in two sub-datasets: a modelling dataset with 748 events (2,2% of the study area) and a validation dataset with 747 events (2,3% of the study area). The susceptibility algorithms achieved with the different probabilistic techniques, were rated individually using success rate and prediction rate curves. The best model performance was obtained with the logistic regression, although the results from the different methods do not show significant differences neither in success nor in prediction rate curves. These evidences revealed that: (1) the modelling landslide dataset is representative of the entire landslide population characteristics; and (2) the increase of complexity and robustness in the probabilistic methodology did not produce a significant increase in success or prediction rates. Therefore, it was concluded that the resolution and quality of the input variables are much more important than the probabilistic model chosen to assess landslide susceptibility. This work was developed on the behalf of VOLCSOILRISK project (Volcanic Soils Geotechnical Characterization for Landslide Risk Mitigation), supported by Direcção Regional da Ciência e Tecnologia - Governo Regional dos Açores.
BN-FLEMOps pluvial - A probabilistic multi-variable loss estimation model for pluvial floods
NASA Astrophysics Data System (ADS)
Roezer, V.; Kreibich, H.; Schroeter, K.; Doss-Gollin, J.; Lall, U.; Merz, B.
2017-12-01
Pluvial flood events, such as in Copenhagen (Denmark) in 2011, Beijing (China) in 2012 or Houston (USA) in 2016, have caused severe losses to urban dwellings in recent years. These floods are caused by storm events with high rainfall rates well above the design levels of urban drainage systems, which lead to inundation of streets and buildings. A projected increase in frequency and intensity of heavy rainfall events in many areas and an ongoing urbanization may increase pluvial flood losses in the future. For an efficient risk assessment and adaptation to pluvial floods, a quantification of the flood risk is needed. Few loss models have been developed particularly for pluvial floods. These models usually use simple waterlevel- or rainfall-loss functions and come with very high uncertainties. To account for these uncertainties and improve the loss estimation, we present a probabilistic multi-variable loss estimation model for pluvial floods based on empirical data. The model was developed in a two-step process using a machine learning approach and a comprehensive database comprising 783 records of direct building and content damage of private households. The data was gathered through surveys after four different pluvial flood events in Germany between 2005 and 2014. In a first step, linear and non-linear machine learning algorithms, such as tree-based and penalized regression models were used to identify the most important loss influencing factors among a set of 55 candidate variables. These variables comprise hydrological and hydraulic aspects, early warning, precaution, building characteristics and the socio-economic status of the household. In a second step, the most important loss influencing variables were used to derive a probabilistic multi-variable pluvial flood loss estimation model based on Bayesian Networks. Two different networks were tested: a score-based network learned from the data and a network based on expert knowledge. Loss predictions are made through Bayesian inference using Markov chain Monte Carlo (MCMC) sampling. With the ability to cope with incomplete information and use expert knowledge, as well as inherently providing quantitative uncertainty information, it is shown that loss models based on BNs are superior to deterministic approaches for pluvial flood risk assessment.
NASA Astrophysics Data System (ADS)
Sui, Xin; Yang, Yongqing; Xu, Xianyun; Zhang, Shuai; Zhang, Lingzhong
2018-02-01
This paper investigates the consensus of multi-agent systems with probabilistic time-varying delays and packet losses via sampled-data control. On the one hand, a Bernoulli-distributed white sequence is employed to model random packet losses among agents. On the other hand, a switched system is used to describe packet dropouts in a deterministic way. Based on the special property of the Laplacian matrix, the consensus problem can be converted into a stabilization problem of a switched system with lower dimensions. Some mean square consensus criteria are derived in terms of constructing an appropriate Lyapunov function and using linear matrix inequalities (LMIs). Finally, two numerical examples are given to show the effectiveness of the proposed method.
High Cycle Fatigue (HCF) Science and Technology Program 2002 Annual Report
2003-08-01
Turbine Engine Airfoils, Phase I 4.3 Probabilistic Design of Turbine Engine Airfoils, Phase II 4.4 Probabilistic Blade Design System 4.5...XTL17/SE2 7.4 Conclusion 8.0 TEST AND EVALUATION 8.1 Characterization Test Protocol 8.2 Demonstration Test Protocol 8.3 Development of Multi ...transparent and opaque overlays for processing. The objective of the SBIR Phase I program was to identify and evaluate promising methods for
Efficient Radiative Transfer for Dynamically Evolving Stratified Atmospheres
NASA Astrophysics Data System (ADS)
Judge, Philip G.
2017-12-01
We present a fast multi-level and multi-atom non-local thermodynamic equilibrium radiative transfer method for dynamically evolving stratified atmospheres, such as the solar atmosphere. The preconditioning method of Rybicki & Hummer (RH92) is adopted. But, pressed for the need of speed and stability, a “second-order escape probability” scheme is implemented within the framework of the RH92 method, in which frequency- and angle-integrals are carried out analytically. While minimizing the computational work needed, this comes at the expense of numerical accuracy. The iteration scheme is local, the formal solutions for the intensities are the only non-local component. At present the methods have been coded for vertical transport, applicable to atmospheres that are highly stratified. The probabilistic method seems adequately fast, stable, and sufficiently accurate for exploring dynamical interactions between the evolving MHD atmosphere and radiation using current computer hardware. Current 2D and 3D dynamics codes do not include this interaction as consistently as the current method does. The solutions generated may ultimately serve as initial conditions for dynamical calculations including full 3D radiative transfer. The National Center for Atmospheric Research is sponsored by the National Science Foundation.
Probabilistic flood damage modelling at the meso-scale
NASA Astrophysics Data System (ADS)
Kreibich, Heidi; Botto, Anna; Schröter, Kai; Merz, Bruno
2014-05-01
Decisions on flood risk management and adaptation are usually based on risk analyses. Such analyses are associated with significant uncertainty, even more if changes in risk due to global change are expected. Although uncertainty analysis and probabilistic approaches have received increased attention during the last years, they are still not standard practice for flood risk assessments. Most damage models have in common that complex damaging processes are described by simple, deterministic approaches like stage-damage functions. Novel probabilistic, multi-variate flood damage models have been developed and validated on the micro-scale using a data-mining approach, namely bagging decision trees (Merz et al. 2013). In this presentation we show how the model BT-FLEMO (Bagging decision Tree based Flood Loss Estimation MOdel) can be applied on the meso-scale, namely on the basis of ATKIS land-use units. The model is applied in 19 municipalities which were affected during the 2002 flood by the River Mulde in Saxony, Germany. The application of BT-FLEMO provides a probability distribution of estimated damage to residential buildings per municipality. Validation is undertaken on the one hand via a comparison with eight other damage models including stage-damage functions as well as multi-variate models. On the other hand the results are compared with official damage data provided by the Saxon Relief Bank (SAB). The results show, that uncertainties of damage estimation remain high. Thus, the significant advantage of this probabilistic flood loss estimation model BT-FLEMO is that it inherently provides quantitative information about the uncertainty of the prediction. Reference: Merz, B.; Kreibich, H.; Lall, U. (2013): Multi-variate flood damage assessment: a tree-based data-mining approach. NHESS, 13(1), 53-64.
NASA Astrophysics Data System (ADS)
Caglar, Mehmet Umut; Pal, Ranadip
2011-03-01
Central dogma of molecular biology states that ``information cannot be transferred back from protein to either protein or nucleic acid''. However, this assumption is not exactly correct in most of the cases. There are a lot of feedback loops and interactions between different levels of systems. These types of interactions are hard to analyze due to the lack of cell level data and probabilistic - nonlinear nature of interactions. Several models widely used to analyze and simulate these types of nonlinear interactions. Stochastic Master Equation (SME) models give probabilistic nature of the interactions in a detailed manner, with a high calculation cost. On the other hand Probabilistic Boolean Network (PBN) models give a coarse scale picture of the stochastic processes, with a less calculation cost. Differential Equation (DE) models give the time evolution of mean values of processes in a highly cost effective way. The understanding of the relations between the predictions of these models is important to understand the reliability of the simulations of genetic regulatory networks. In this work the success of the mapping between SME, PBN and DE models is analyzed and the accuracy and affectivity of the control policies generated by using PBN and DE models is compared.
Modular analysis of the probabilistic genetic interaction network.
Hou, Lin; Wang, Lin; Qian, Minping; Li, Dong; Tang, Chao; Zhu, Yunping; Deng, Minghua; Li, Fangting
2011-03-15
Epistatic Miniarray Profiles (EMAP) has enabled the mapping of large-scale genetic interaction networks; however, the quantitative information gained from EMAP cannot be fully exploited since the data are usually interpreted as a discrete network based on an arbitrary hard threshold. To address such limitations, we adopted a mixture modeling procedure to construct a probabilistic genetic interaction network and then implemented a Bayesian approach to identify densely interacting modules in the probabilistic network. Mixture modeling has been demonstrated as an effective soft-threshold technique of EMAP measures. The Bayesian approach was applied to an EMAP dataset studying the early secretory pathway in Saccharomyces cerevisiae. Twenty-seven modules were identified, and 14 of those were enriched by gold standard functional gene sets. We also conducted a detailed comparison with state-of-the-art algorithms, hierarchical cluster and Markov clustering. The experimental results show that the Bayesian approach outperforms others in efficiently recovering biologically significant modules.
Rios, Anthony; Kavuluru, Ramakanth
2013-09-01
Extracting diagnosis codes from medical records is a complex task carried out by trained coders by reading all the documents associated with a patient's visit. With the popularity of electronic medical records (EMRs), computational approaches to code extraction have been proposed in the recent years. Machine learning approaches to multi-label text classification provide an important methodology in this task given each EMR can be associated with multiple codes. In this paper, we study the the role of feature selection, training data selection, and probabilistic threshold optimization in improving different multi-label classification approaches. We conduct experiments based on two different datasets: a recent gold standard dataset used for this task and a second larger and more complex EMR dataset we curated from the University of Kentucky Medical Center. While conventional approaches achieve results comparable to the state-of-the-art on the gold standard dataset, on our complex in-house dataset, we show that feature selection, training data selection, and probabilistic thresholding provide significant gains in performance.
Probabilistic Evaluation of Advanced Ceramic Matrix Composite Structures
NASA Technical Reports Server (NTRS)
Abumeri, Galib H.; Chamis, Christos C.
2003-01-01
The objective of this report is to summarize the deterministic and probabilistic structural evaluation results of two structures made with advanced ceramic composites (CMC): internally pressurized tube and uniformly loaded flange. The deterministic structural evaluation includes stress, displacement, and buckling analyses. It is carried out using the finite element code MHOST, developed for the 3-D inelastic analysis of structures that are made with advanced materials. The probabilistic evaluation is performed using the integrated probabilistic assessment of composite structures computer code IPACS. The affects of uncertainties in primitive variables related to the material, fabrication process, and loadings on the material property and structural response behavior are quantified. The primitive variables considered are: thermo-mechanical properties of fiber and matrix, fiber and void volume ratios, use temperature, and pressure. The probabilistic structural analysis and probabilistic strength results are used by IPACS to perform reliability and risk evaluation of the two structures. The results will show that the sensitivity information obtained for the two composite structures from the computational simulation can be used to alter the design process to meet desired service requirements. In addition to detailed probabilistic analysis of the two structures, the following were performed specifically on the CMC tube: (1) predicted the failure load and the buckling load, (2) performed coupled non-deterministic multi-disciplinary structural analysis, and (3) demonstrated that probabilistic sensitivities can be used to select a reduced set of design variables for optimization.
Faint Object Detection in Multi-Epoch Observations via Catalog Data Fusion
DOE Office of Scientific and Technical Information (OSTI.GOV)
Budavári, Tamás; Szalay, Alexander S.; Loredo, Thomas J.
Astronomy in the time-domain era faces several new challenges. One of them is the efficient use of observations obtained at multiple epochs. The work presented here addresses faint object detection and describes an incremental strategy for separating real objects from artifacts in ongoing surveys. The idea is to produce low-threshold single-epoch catalogs and to accumulate information across epochs. This is in contrast to more conventional strategies based on co-added or stacked images. We adopt a Bayesian approach, addressing object detection by calculating the marginal likelihoods for hypotheses asserting that there is no object or one object in a small imagemore » patch containing at most one cataloged source at each epoch. The object-present hypothesis interprets the sources in a patch at different epochs as arising from a genuine object; the no-object hypothesis interprets candidate sources as spurious, arising from noise peaks. We study the detection probability for constant-flux objects in a Gaussian noise setting, comparing results based on single and stacked exposures to results based on a series of single-epoch catalog summaries. Our procedure amounts to generalized cross-matching: it is the product of a factor accounting for the matching of the estimated fluxes of the candidate sources and a factor accounting for the matching of their estimated directions. We find that probabilistic fusion of multi-epoch catalogs can detect sources with similar sensitivity and selectivity compared to stacking. The probabilistic cross-matching framework underlying our approach plays an important role in maintaining detection sensitivity and points toward generalizations that could accommodate variability and complex object structure.« less
A comprehensive probabilistic analysis model of oil pipelines network based on Bayesian network
NASA Astrophysics Data System (ADS)
Zhang, C.; Qin, T. X.; Jiang, B.; Huang, C.
2018-02-01
Oil pipelines network is one of the most important facilities of energy transportation. But oil pipelines network accident may result in serious disasters. Some analysis models for these accidents have been established mainly based on three methods, including event-tree, accident simulation and Bayesian network. Among these methods, Bayesian network is suitable for probabilistic analysis. But not all the important influencing factors are considered and the deployment rule of the factors has not been established. This paper proposed a probabilistic analysis model of oil pipelines network based on Bayesian network. Most of the important influencing factors, including the key environment condition and emergency response are considered in this model. Moreover, the paper also introduces a deployment rule for these factors. The model can be used in probabilistic analysis and sensitive analysis of oil pipelines network accident.
Borchani, Hanen; Bielza, Concha; Toro, Carlos; Larrañaga, Pedro
2013-03-01
Our aim is to use multi-dimensional Bayesian network classifiers in order to predict the human immunodeficiency virus type 1 (HIV-1) reverse transcriptase and protease inhibitors given an input set of respective resistance mutations that an HIV patient carries. Multi-dimensional Bayesian network classifiers (MBCs) are probabilistic graphical models especially designed to solve multi-dimensional classification problems, where each input instance in the data set has to be assigned simultaneously to multiple output class variables that are not necessarily binary. In this paper, we introduce a new method, named MB-MBC, for learning MBCs from data by determining the Markov blanket around each class variable using the HITON algorithm. Our method is applied to both reverse transcriptase and protease data sets obtained from the Stanford HIV-1 database. Regarding the prediction of antiretroviral combination therapies, the experimental study shows promising results in terms of classification accuracy compared with state-of-the-art MBC learning algorithms. For reverse transcriptase inhibitors, we get 71% and 11% in mean and global accuracy, respectively; while for protease inhibitors, we get more than 84% and 31% in mean and global accuracy, respectively. In addition, the analysis of MBC graphical structures lets us gain insight into both known and novel interactions between reverse transcriptase and protease inhibitors and their respective resistance mutations. MB-MBC algorithm is a valuable tool to analyze the HIV-1 reverse transcriptase and protease inhibitors prediction problem and to discover interactions within and between these two classes of inhibitors. Copyright © 2012 Elsevier B.V. All rights reserved.
Characterizing Topology of Probabilistic Biological Networks.
Todor, Andrei; Dobra, Alin; Kahveci, Tamer
2013-09-06
Biological interactions are often uncertain events, that may or may not take place with some probability. Existing studies analyze the degree distribution of biological networks by assuming that all the given interactions take place under all circumstances. This strong and often incorrect assumption can lead to misleading results. Here, we address this problem and develop a sound mathematical basis to characterize networks in the presence of uncertain interactions. We develop a method that accurately describes the degree distribution of such networks. We also extend our method to accurately compute the joint degree distributions of node pairs connected by edges. The number of possible network topologies grows exponentially with the number of uncertain interactions. However, the mathematical model we develop allows us to compute these degree distributions in polynomial time in the number of interactions. It also helps us find an adequate mathematical model using maximum likelihood estimation. Our results demonstrate that power law and log-normal models best describe degree distributions for probabilistic networks. The inverse correlation of degrees of neighboring nodes shows that, in probabilistic networks, nodes with large number of interactions prefer to interact with those with small number of interactions more frequently than expected.
Probabilistic simulation of multi-scale composite behavior
NASA Technical Reports Server (NTRS)
Liaw, D. G.; Shiao, M. C.; Singhal, S. N.; Chamis, Christos C.
1993-01-01
A methodology is developed to computationally assess the probabilistic composite material properties at all composite scale levels due to the uncertainties in the constituent (fiber and matrix) properties and in the fabrication process variables. The methodology is computationally efficient for simulating the probability distributions of material properties. The sensitivity of the probabilistic composite material property to each random variable is determined. This information can be used to reduce undesirable uncertainties in material properties at the macro scale of the composite by reducing the uncertainties in the most influential random variables at the micro scale. This methodology was implemented into the computer code PICAN (Probabilistic Integrated Composite ANalyzer). The accuracy and efficiency of this methodology are demonstrated by simulating the uncertainties in the material properties of a typical laminate and comparing the results with the Monte Carlo simulation method. The experimental data of composite material properties at all scales fall within the scatters predicted by PICAN.
Probabilistic drug connectivity mapping
2014-01-01
Background The aim of connectivity mapping is to match drugs using drug-treatment gene expression profiles from multiple cell lines. This can be viewed as an information retrieval task, with the goal of finding the most relevant profiles for a given query drug. We infer the relevance for retrieval by data-driven probabilistic modeling of the drug responses, resulting in probabilistic connectivity mapping, and further consider the available cell lines as different data sources. We use a special type of probabilistic model to separate what is shared and specific between the sources, in contrast to earlier connectivity mapping methods that have intentionally aggregated all available data, neglecting information about the differences between the cell lines. Results We show that the probabilistic multi-source connectivity mapping method is superior to alternatives in finding functionally and chemically similar drugs from the Connectivity Map data set. We also demonstrate that an extension of the method is capable of retrieving combinations of drugs that match different relevant parts of the query drug response profile. Conclusions The probabilistic modeling-based connectivity mapping method provides a promising alternative to earlier methods. Principled integration of data from different cell lines helps to identify relevant responses for specific drug repositioning applications. PMID:24742351
NASA Astrophysics Data System (ADS)
Bandte, Oliver
It has always been the intention of systems engineering to invent or produce the best product possible. Many design techniques have been introduced over the course of decades that try to fulfill this intention. Unfortunately, no technique has succeeded in combining multi-criteria decision making with probabilistic design. The design technique developed in this thesis, the Joint Probabilistic Decision Making (JPDM) technique, successfully overcomes this deficiency by generating a multivariate probability distribution that serves in conjunction with a criterion value range of interest as a universally applicable objective function for multi-criteria optimization and product selection. This new objective function constitutes a meaningful Xnetric, called Probability of Success (POS), that allows the customer or designer to make a decision based on the chance of satisfying the customer's goals. In order to incorporate a joint probabilistic formulation into the systems design process, two algorithms are created that allow for an easy implementation into a numerical design framework: the (multivariate) Empirical Distribution Function and the Joint Probability Model. The Empirical Distribution Function estimates the probability that an event occurred by counting how many times it occurred in a given sample. The Joint Probability Model on the other hand is an analytical parametric model for the multivariate joint probability. It is comprised of the product of the univariate criterion distributions, generated by the traditional probabilistic design process, multiplied with a correlation function that is based on available correlation information between pairs of random variables. JPDM is an excellent tool for multi-objective optimization and product selection, because of its ability to transform disparate objectives into a single figure of merit, the likelihood of successfully meeting all goals or POS. The advantage of JPDM over other multi-criteria decision making techniques is that POS constitutes a single optimizable function or metric that enables a comparison of all alternative solutions on an equal basis. Hence, POS allows for the use of any standard single-objective optimization technique available and simplifies a complex multi-criteria selection problem into a simple ordering problem, where the solution with the highest POS is best. By distinguishing between controllable and uncontrollable variables in the design process, JPDM can account for the uncertain values of the uncontrollable variables that are inherent to the design problem, while facilitating an easy adjustment of the controllable ones to achieve the highest possible POS. Finally, JPDM's superiority over current multi-criteria decision making techniques is demonstrated with an optimization of a supersonic transport concept and ten contrived equations as well as a product selection example, determining an airline's best choice among Boeing's B-747, B-777, Airbus' A340, and a Supersonic Transport. The optimization examples demonstrate JPDM's ability to produce a better solution with a higher POS than an Overall Evaluation Criterion or Goal Programming approach. Similarly, the product selection example demonstrates JPDM's ability to produce a better solution with a higher POS and different ranking than the Overall Evaluation Criterion or Technique for Order Preferences by Similarity to the Ideal Solution (TOPSIS) approach.
Evaluating the uncertainty of predicting future climate time series at the hourly time scale
NASA Astrophysics Data System (ADS)
Caporali, E.; Fatichi, S.; Ivanov, V. Y.
2011-12-01
A stochastic downscaling methodology is developed to generate hourly, point-scale time series for several meteorological variables, such as precipitation, cloud cover, shortwave radiation, air temperature, relative humidity, wind speed, and atmospheric pressure. The methodology uses multi-model General Circulation Model (GCM) realizations and an hourly weather generator, AWE-GEN. Probabilistic descriptions of factors of change (a measure of climate change with respect to historic conditions) are computed for several climate statistics and different aggregation times using a Bayesian approach that weights the individual GCM contributions. The Monte Carlo method is applied to sample the factors of change from their respective distributions thereby permitting the generation of time series in an ensemble fashion, which reflects the uncertainty of climate projections of future as well as the uncertainty of the downscaling procedure. Applications of the methodology and probabilistic expressions of certainty in reproducing future climates for the periods, 2000 - 2009, 2046 - 2065 and 2081 - 2100, using the 1962 - 1992 period as the baseline, are discussed for the location of Firenze (Italy). The climate predictions for the period of 2000 - 2009 are tested against observations permitting to assess the reliability and uncertainties of the methodology in reproducing statistics of meteorological variables at different time scales.
Rouphail, Nagui M.
2011-01-01
This paper presents behavioral-based models for describing pedestrian gap acceptance at unsignalized crosswalks in a mixed-priority environment, where some drivers yield and some pedestrians cross in gaps. Logistic regression models are developed to predict the probability of pedestrian crossings as a function of vehicle dynamics, pedestrian assertiveness, and other factors. In combination with prior work on probabilistic yielding models, the results can be incorporated in a simulation environment, where they can more fully describe the interaction of these two modes. The approach is intended to supplement HCM analytical procedure for locations where significant interaction occurs between drivers and pedestrians, including modern roundabouts. PMID:21643488
Temporal Causality Analysis of Sentiment Change in a Cancer Survivor Network.
Bui, Ngot; Yen, John; Honavar, Vasant
2016-06-01
Online health communities constitute a useful source of information and social support for patients. American Cancer Society's Cancer Survivor Network (CSN), a 173,000-member community, is the largest online network for cancer patients, survivors, and caregivers. A discussion thread in CSN is often initiated by a cancer survivor seeking support from other members of CSN. Discussion threads are multi-party conversations that often provide a source of social support e.g., by bringing about a change of sentiment from negative to positive on the part of the thread originator. While previous studies regarding cancer survivors have shown that members of an online health community derive benefits from their participation in such communities, causal accounts of the factors that contribute to the observed benefits have been lacking. We introduce a novel framework to examine the temporal causality of sentiment dynamics in the CSN. We construct a Probabilistic Computation Tree Logic representation and a corresponding probabilistic Kripke structure to represent and reason about the changes in sentiments of posts in a thread over time. We use a sentiment classifier trained using machine learning on a set of posts manually tagged with sentiment labels to classify posts as expressing either positive or negative sentiment. We analyze the probabilistic Kripke structure to identify the prima facie causes of sentiment change on the part of the thread originators in the CSN forum and their significance. We find that the sentiment of replies appears to causally influence the sentiment of the thread originator. Our experiments also show that the conclusions are robust with respect to the choice of the (i) classification threshold of the sentiment classifier; (ii) and the choice of the specific sentiment classifier used. We also extend the basic framework for temporal causality analysis to incorporate the uncertainty in the states of the probabilistic Kripke structure resulting from the use of an imperfect state transducer (in our case, the sentiment classifier). Our analysis of temporal causality of CSN sentiment dynamics offers new insights that the designers, managers and moderators of an online community such as CSN can utilize to facilitate and enhance the interactions so as to better meet the social support needs of the CSN participants. The proposed methodology for analysis of temporal causality has broad applicability in a variety of settings where the dynamics of the underlying system can be modeled in terms of state variables that change in response to internal or external inputs.
Temporal Causality Analysis of Sentiment Change in a Cancer Survivor Network
Bui, Ngot; Yen, John; Honavar, Vasant
2017-01-01
Online health communities constitute a useful source of information and social support for patients. American Cancer Society’s Cancer Survivor Network (CSN), a 173,000-member community, is the largest online network for cancer patients, survivors, and caregivers. A discussion thread in CSN is often initiated by a cancer survivor seeking support from other members of CSN. Discussion threads are multi-party conversations that often provide a source of social support e.g., by bringing about a change of sentiment from negative to positive on the part of the thread originator. While previous studies regarding cancer survivors have shown that members of an online health community derive benefits from their participation in such communities, causal accounts of the factors that contribute to the observed benefits have been lacking. We introduce a novel framework to examine the temporal causality of sentiment dynamics in the CSN. We construct a Probabilistic Computation Tree Logic representation and a corresponding probabilistic Kripke structure to represent and reason about the changes in sentiments of posts in a thread over time. We use a sentiment classifier trained using machine learning on a set of posts manually tagged with sentiment labels to classify posts as expressing either positive or negative sentiment. We analyze the probabilistic Kripke structure to identify the prima facie causes of sentiment change on the part of the thread originators in the CSN forum and their significance. We find that the sentiment of replies appears to causally influence the sentiment of the thread originator. Our experiments also show that the conclusions are robust with respect to the choice of the (i) classification threshold of the sentiment classifier; (ii) and the choice of the specific sentiment classifier used. We also extend the basic framework for temporal causality analysis to incorporate the uncertainty in the states of the probabilistic Kripke structure resulting from the use of an imperfect state transducer (in our case, the sentiment classifier). Our analysis of temporal causality of CSN sentiment dynamics offers new insights that the designers, managers and moderators of an online community such as CSN can utilize to facilitate and enhance the interactions so as to better meet the social support needs of the CSN participants. The proposed methodology for analysis of temporal causality has broad applicability in a variety of settings where the dynamics of the underlying system can be modeled in terms of state variables that change in response to internal or external inputs. PMID:29399599
Climatological Observations for Maritime Prediction and Analysis Support Service (COMPASS)
NASA Astrophysics Data System (ADS)
OConnor, A.; Kirtman, B. P.; Harrison, S.; Gorman, J.
2016-02-01
Current US Navy forecasting systems cannot easily incorporate extended-range forecasts that can improve mission readiness and effectiveness; ensure safety; and reduce cost, labor, and resource requirements. If Navy operational planners had systems that incorporated these forecasts, they could plan missions using more reliable and longer-term weather and climate predictions. Further, using multi-model forecast ensembles instead of single forecasts would produce higher predictive performance. Extended-range multi-model forecast ensembles, such as those available in the North American Multi-Model Ensemble (NMME), are ideal for system integration because of their high skill predictions; however, even higher skill predictions can be produced if forecast model ensembles are combined correctly. While many methods for weighting models exist, the best method in a given environment requires expert knowledge of the models and combination methods.We present an innovative approach that uses machine learning to combine extended-range predictions from multi-model forecast ensembles and generate a probabilistic forecast for any region of the globe up to 12 months in advance. Our machine-learning approach uses 30 years of hindcast predictions to learn patterns of forecast model successes and failures. Each model is assigned a weight for each environmental condition, 100 km2 region, and day given any expected environmental information. These weights are then applied to the respective predictions for the region and time of interest to effectively stitch together a single, coherent probabilistic forecast. Our experimental results demonstrate the benefits of our approach to produce extended-range probabilistic forecasts for regions and time periods of interest that are superior, in terms of skill, to individual NMME forecast models and commonly weighted models. The probabilistic forecast leverages the strengths of three NMME forecast models to predict environmental conditions for an area spanning from San Diego, CA to Honolulu, HI, seven months in-advance. Key findings include: weighted combinations of models are strictly better than individual models; machine-learned combinations are especially better; and forecasts produced using our approach have the highest rank probability skill score most often.
Multi-level multi-task learning for modeling cross-scale interactions in nested geospatial data
Yuan, Shuai; Zhou, Jiayu; Tan, Pang-Ning; Fergus, Emi; Wagner, Tyler; Sorrano, Patricia
2017-01-01
Predictive modeling of nested geospatial data is a challenging problem as the models must take into account potential interactions among variables defined at different spatial scales. These cross-scale interactions, as they are commonly known, are particularly important to understand relationships among ecological properties at macroscales. In this paper, we present a novel, multi-level multi-task learning framework for modeling nested geospatial data in the lake ecology domain. Specifically, we consider region-specific models to predict lake water quality from multi-scaled factors. Our framework enables distinct models to be developed for each region using both its local and regional information. The framework also allows information to be shared among the region-specific models through their common set of latent factors. Such information sharing helps to create more robust models especially for regions with limited or no training data. In addition, the framework can automatically determine cross-scale interactions between the regional variables and the local variables that are nested within them. Our experimental results show that the proposed framework outperforms all the baseline methods in at least 64% of the regions for 3 out of 4 lake water quality datasets evaluated in this study. Furthermore, the latent factors can be clustered to obtain a new set of regions that is more aligned with the response variables than the original regions that were defined a priori from the ecology domain.
Sáez, Carlos; Robles, Montserrat; García-Gómez, Juan M
2017-02-01
Biomedical data may be composed of individuals generated from distinct, meaningful sources. Due to possible contextual biases in the processes that generate data, there may exist an undesirable and unexpected variability among the probability distribution functions (PDFs) of the source subsamples, which, when uncontrolled, may lead to inaccurate or unreproducible research results. Classical statistical methods may have difficulties to undercover such variabilities when dealing with multi-modal, multi-type, multi-variate data. This work proposes two metrics for the analysis of stability among multiple data sources, robust to the aforementioned conditions, and defined in the context of data quality assessment. Specifically, a global probabilistic deviation and a source probabilistic outlyingness metrics are proposed. The first provides a bounded degree of the global multi-source variability, designed as an estimator equivalent to the notion of normalized standard deviation of PDFs. The second provides a bounded degree of the dissimilarity of each source to a latent central distribution. The metrics are based on the projection of a simplex geometrical structure constructed from the Jensen-Shannon distances among the sources PDFs. The metrics have been evaluated and demonstrated their correct behaviour on a simulated benchmark and with real multi-source biomedical data using the UCI Heart Disease data set. The biomedical data quality assessment based on the proposed stability metrics may improve the efficiency and effectiveness of biomedical data exploitation and research.
A Mixtures-of-Trees Framework for Multi-Label Classification
Hong, Charmgil; Batal, Iyad; Hauskrecht, Milos
2015-01-01
We propose a new probabilistic approach for multi-label classification that aims to represent the class posterior distribution P(Y|X). Our approach uses a mixture of tree-structured Bayesian networks, which can leverage the computational advantages of conditional tree-structured models and the abilities of mixtures to compensate for tree-structured restrictions. We develop algorithms for learning the model from data and for performing multi-label predictions using the learned model. Experiments on multiple datasets demonstrate that our approach outperforms several state-of-the-art multi-label classification methods. PMID:25927011
Forman, Jason L.; Kent, Richard W.; Mroz, Krystoffer; Pipkorn, Bengt; Bostrom, Ola; Segui-Gomez, Maria
2012-01-01
This study sought to develop a strain-based probabilistic method to predict rib fracture risk with whole-body finite element (FE) models, and to describe a method to combine the results with collision exposure information to predict injury risk and potential intervention effectiveness in the field. An age-adjusted ultimate strain distribution was used to estimate local rib fracture probabilities within an FE model. These local probabilities were combined to predict injury risk and severity within the whole ribcage. The ultimate strain distribution was developed from a literature dataset of 133 tests. Frontal collision simulations were performed with the THUMS (Total HUman Model for Safety) model with four levels of delta-V and two restraints: a standard 3-point belt and a progressive 3.5–7 kN force-limited, pretensioned (FL+PT) belt. The results of three simulations (29 km/h standard, 48 km/h standard, and 48 km/h FL+PT) were compared to matched cadaver sled tests. The numbers of fractures predicted for the comparison cases were consistent with those observed experimentally. Combining these results with field exposure informantion (ΔV, NASS-CDS 1992–2002) suggests a 8.9% probability of incurring AIS3+ rib fractures for a 60 year-old restrained by a standard belt in a tow-away frontal collision with this restraint, vehicle, and occupant configuration, compared to 4.6% for the FL+PT belt. This is the first study to describe a probabilistic framework to predict rib fracture risk based on strains observed in human-body FE models. Using this analytical framework, future efforts may incorporate additional subject or collision factors for multi-variable probabilistic injury prediction. PMID:23169122
NASA Technical Reports Server (NTRS)
Townsend, J.; Meyers, C.; Ortega, R.; Peck, J.; Rheinfurth, M.; Weinstock, B.
1993-01-01
Probabilistic structural analyses and design methods are steadily gaining acceptance within the aerospace industry. The safety factor approach to design has long been the industry standard, and it is believed by many to be overly conservative and thus, costly. A probabilistic approach to design may offer substantial cost savings. This report summarizes several probabilistic approaches: the probabilistic failure analysis (PFA) methodology developed by Jet Propulsion Laboratory, fast probability integration (FPI) methods, the NESSUS finite element code, and response surface methods. Example problems are provided to help identify the advantages and disadvantages of each method.
Praveen, Paurush; Fröhlich, Holger
2013-01-01
Inferring regulatory networks from experimental data via probabilistic graphical models is a popular framework to gain insights into biological systems. However, the inherent noise in experimental data coupled with a limited sample size reduces the performance of network reverse engineering. Prior knowledge from existing sources of biological information can address this low signal to noise problem by biasing the network inference towards biologically plausible network structures. Although integrating various sources of information is desirable, their heterogeneous nature makes this task challenging. We propose two computational methods to incorporate various information sources into a probabilistic consensus structure prior to be used in graphical model inference. Our first model, called Latent Factor Model (LFM), assumes a high degree of correlation among external information sources and reconstructs a hidden variable as a common source in a Bayesian manner. The second model, a Noisy-OR, picks up the strongest support for an interaction among information sources in a probabilistic fashion. Our extensive computational studies on KEGG signaling pathways as well as on gene expression data from breast cancer and yeast heat shock response reveal that both approaches can significantly enhance the reconstruction accuracy of Bayesian Networks compared to other competing methods as well as to the situation without any prior. Our framework allows for using diverse information sources, like pathway databases, GO terms and protein domain data, etc. and is flexible enough to integrate new sources, if available.
Predicting targets of compounds against neurological diseases using cheminformatic methodology
NASA Astrophysics Data System (ADS)
Nikolic, Katarina; Mavridis, Lazaros; Bautista-Aguilera, Oscar M.; Marco-Contelles, José; Stark, Holger; do Carmo Carreiras, Maria; Rossi, Ilaria; Massarelli, Paola; Agbaba, Danica; Ramsay, Rona R.; Mitchell, John B. O.
2015-02-01
Recently developed multi-targeted ligands are novel drug candidates able to interact with monoamine oxidase A and B; acetylcholinesterase and butyrylcholinesterase; or with histamine N-methyltransferase and histamine H3-receptor (H3R). These proteins are drug targets in the treatment of depression, Alzheimer's disease, obsessive disorders, and Parkinson's disease. A probabilistic method, the Parzen-Rosenblatt window approach, was used to build a "predictor" model using data collected from the ChEMBL database. The model can be used to predict both the primary pharmaceutical target and off-targets of a compound based on its structure. Molecular structures were represented based on the circular fingerprint methodology. The same approach was used to build a "predictor" model from the DrugBank dataset to determine the main pharmacological groups of the compound. The study of off-target interactions is now recognised as crucial to the understanding of both drug action and toxicology. Primary pharmaceutical targets and off-targets for the novel multi-target ligands were examined by use of the developed cheminformatic method. Several multi-target ligands were selected for further study, as compounds with possible additional beneficial pharmacological activities. The cheminformatic targets identifications were in agreement with four 3D-QSAR (H3R/D1R/D2R/5-HT2aR) models and by in vitro assays for serotonin 5-HT1a and 5-HT2a receptor binding of the most promising ligand ( 71/MBA-VEG8).
Cullis, B R; Smith, A B; Beeck, C P; Cowling, W A
2010-11-01
Exploring and exploiting variety by environment (V × E) interaction is one of the major challenges facing plant breeders. In paper I of this series, we presented an approach to modelling V × E interaction in the analysis of complex multi-environment trials using factor analytic models. In this paper, we develop a range of statistical tools which explore V × E interaction in this context. These tools include graphical displays such as heat-maps of genetic correlation matrices as well as so-called E-scaled uniplots that are a more informative alternative to the classical biplot for large plant breeding multi-environment trials. We also present a new approach to prediction for multi-environment trials that include pedigree information. This approach allows meaningful selection indices to be formed either for potential new varieties or potential parents.
NASA Astrophysics Data System (ADS)
Subramanian, A. C.; Lavers, D.; Matsueda, M.; Shukla, S.; Cayan, D. R.; Ralph, M.
2017-12-01
Atmospheric rivers (ARs) - elongated plumes of intense moisture transport - are a primary source of hydrological extremes, water resources and impactful weather along the West Coast of North America and Europe. There is strong demand in the water management, societal infrastructure and humanitarian sectors for reliable sub-seasonal forecasts, particularly of extreme events, such as floods and droughts so that actions to mitigate disastrous impacts can be taken with sufficient lead-time. Many recent studies have shown that ARs in the Pacific and the Atlantic are modulated by large-scale modes of climate variability. Leveraging the improved understanding of how these large-scale climate modes modulate the ARs in these two basins, we use the state-of-the-art multi-model forecast systems such as the North American Multi-Model Ensemble (NMME) and the Subseasonal-to-Seasonal (S2S) database to help inform and assess the probabilistic prediction of ARs and related extreme weather events over the North American and European West Coasts. We will present results from evaluating probabilistic forecasts of extreme precipitation and AR activity at the sub-seasonal scale. In particular, results from the comparison of two winters (2015-16 and 2016-17) will be shown, winters which defied canonical El Niño teleconnection patterns over North America and Europe. We further extend this study to analyze probabilistic forecast skill of AR events in these two basins and the variability in forecast skill during certain regimes of large-scale climate modes.
BYMUR software: a free and open source tool for quantifying and visualizing multi-risk analyses
NASA Astrophysics Data System (ADS)
Tonini, Roberto; Selva, Jacopo
2013-04-01
The BYMUR software aims to provide an easy-to-use open source tool for both computing multi-risk and managing/visualizing/comparing all the inputs (e.g. hazard, fragilities and exposure) as well as the corresponding results (e.g. risk curves, risk indexes). For all inputs, a complete management of inter-model epistemic uncertainty is considered. The BYMUR software will be one of the final products provided by the homonymous ByMuR project (http://bymur.bo.ingv.it/) funded by Italian Ministry of Education, Universities and Research (MIUR), focused to (i) provide a quantitative and objective general method for a comprehensive long-term multi-risk analysis in a given area, accounting for inter-model epistemic uncertainty through Bayesian methodologies, and (ii) apply the methodology to seismic, volcanic and tsunami risks in Naples (Italy). More specifically, the BYMUR software will be able to separately account for the probabilistic hazard assessment of different kind of hazardous phenomena, the relative (time-dependent/independent) vulnerabilities and exposure data, and their possible (predefined) interactions: the software will analyze these inputs and will use them to estimate both single- and multi- risk associated to a specific target area. In addition, it will be possible to connect the software to further tools (e.g., a full hazard analysis), allowing a dynamic I/O of results. The use of Python programming language guarantees that the final software will be open source and platform independent. Moreover, thanks to the integration of some most popular and rich-featured Python scientific modules (Numpy, Matplotlib, Scipy) with the wxPython graphical user toolkit, the final tool will be equipped with a comprehensive Graphical User Interface (GUI) able to control and visualize (in the form of tables, maps and/or plots) any stage of the multi-risk analysis. The additional features of importing/exporting data in MySQL databases and/or standard XML formats (for instance, the global standards defined in the frame of GEM project for seismic hazard and risk) will grant the interoperability with other FOSS software and tools and, at the same time, to be on hand of the geo-scientific community. An already available example of connection is represented by the BET_VH(**) tool, which probabilistic volcanic hazard outputs will be used as input for BYMUR. Finally, the prototype version of BYMUR will be used for the case study of the municipality of Naples, by considering three different natural hazards (volcanic eruptions, earthquakes and tsunamis) and by assessing the consequent long-term risk evaluation. (**)BET_VH (Bayesian Event Tree for Volcanic Hazard) is probabilistic tool for long-term volcanic hazard assessment, recently re-designed and adjusted to be run on the Vhub cyber-infrastructure, a free web-based collaborative tool in volcanology research (see http://vhub.org/resources/betvh).
Jackson, Rod
2017-01-01
Background Many national cardiovascular disease (CVD) risk factor management guidelines now recommend that drug treatment decisions should be informed primarily by patients’ multi-variable predicted risk of CVD, rather than on the basis of single risk factor thresholds. To investigate the potential impact of treatment guidelines based on CVD risk thresholds at a national level requires individual level data representing the multi-variable CVD risk factor profiles for a country’s total adult population. As these data are seldom, if ever, available, we aimed to create a synthetic population, representing the joint CVD risk factor distributions of the adult New Zealand population. Methods and results A synthetic population of 2,451,278 individuals, representing the actual age, gender, ethnicity and social deprivation composition of people aged 30–84 years who completed the 2013 New Zealand census was generated using Monte Carlo sampling. Each ‘synthetic’ person was then probabilistically assigned values of the remaining cardiovascular disease (CVD) risk factors required for predicting their CVD risk, based on data from the national census national hospitalisation and drug dispensing databases and a large regional cohort study, using Monte Carlo sampling and multiple imputation. Where possible, the synthetic population CVD risk distributions for each non-demographic risk factor were validated against independent New Zealand data sources. Conclusions We were able to develop a synthetic national population with realistic multi-variable CVD risk characteristics. The construction of this population is the first step in the development of a micro-simulation model intended to investigate the likely impact of a range of national CVD risk management strategies that will inform CVD risk management guideline updates in New Zealand and elsewhere. PMID:28384217
Knight, Josh; Wells, Susan; Marshall, Roger; Exeter, Daniel; Jackson, Rod
2017-01-01
Many national cardiovascular disease (CVD) risk factor management guidelines now recommend that drug treatment decisions should be informed primarily by patients' multi-variable predicted risk of CVD, rather than on the basis of single risk factor thresholds. To investigate the potential impact of treatment guidelines based on CVD risk thresholds at a national level requires individual level data representing the multi-variable CVD risk factor profiles for a country's total adult population. As these data are seldom, if ever, available, we aimed to create a synthetic population, representing the joint CVD risk factor distributions of the adult New Zealand population. A synthetic population of 2,451,278 individuals, representing the actual age, gender, ethnicity and social deprivation composition of people aged 30-84 years who completed the 2013 New Zealand census was generated using Monte Carlo sampling. Each 'synthetic' person was then probabilistically assigned values of the remaining cardiovascular disease (CVD) risk factors required for predicting their CVD risk, based on data from the national census national hospitalisation and drug dispensing databases and a large regional cohort study, using Monte Carlo sampling and multiple imputation. Where possible, the synthetic population CVD risk distributions for each non-demographic risk factor were validated against independent New Zealand data sources. We were able to develop a synthetic national population with realistic multi-variable CVD risk characteristics. The construction of this population is the first step in the development of a micro-simulation model intended to investigate the likely impact of a range of national CVD risk management strategies that will inform CVD risk management guideline updates in New Zealand and elsewhere.
NESSUS/EXPERT - An expert system for probabilistic structural analysis methods
NASA Technical Reports Server (NTRS)
Millwater, H.; Palmer, K.; Fink, P.
1988-01-01
An expert system (NESSUS/EXPERT) is presented which provides assistance in using probabilistic structural analysis methods. NESSUS/EXPERT is an interactive menu-driven expert system that provides information to assist in the use of the probabilistic finite element code NESSUS/FEM and the fast probability integrator. NESSUS/EXPERT was developed with a combination of FORTRAN and CLIPS, a C language expert system tool, to exploit the strengths of each language.
Probabilistic finite elements for fracture mechanics
NASA Technical Reports Server (NTRS)
Besterfield, Glen
1988-01-01
The probabilistic finite element method (PFEM) is developed for probabilistic fracture mechanics (PFM). A finite element which has the near crack-tip singular strain embedded in the element is used. Probabilistic distributions, such as expectation, covariance and correlation stress intensity factors, are calculated for random load, random material and random crack length. The method is computationally quite efficient and can be expected to determine the probability of fracture or reliability.
Probabilistic lifetime strength of aerospace materials via computational simulation
NASA Technical Reports Server (NTRS)
Boyce, Lola; Keating, Jerome P.; Lovelace, Thomas B.; Bast, Callie C.
1991-01-01
The results of a second year effort of a research program are presented. The research included development of methodology that provides probabilistic lifetime strength of aerospace materials via computational simulation. A probabilistic phenomenological constitutive relationship, in the form of a randomized multifactor interaction equation, is postulated for strength degradation of structural components of aerospace propulsion systems subjected to a number of effects of primitive variables. These primitive variables often originate in the environment and may include stress from loading, temperature, chemical, or radiation attack. This multifactor interaction constitutive equation is included in the computer program, PROMISS. Also included in the research is the development of methodology to calibrate the constitutive equation using actual experimental materials data together with the multiple linear regression of that data.
COREPA-M: A MULTI-DIMENSIONAL FORMULATION OF COREPA
Recently, the COmmon REactivity PAttern (COREPA) approach was developed as a probabilistic classification method which was formalized specifically to advance mechanistic QSAR development by addressing the impact of molecular flexibility on stereoelectronic properties of chemicals...
Probabilistic Aeroelastic Analysis of Turbomachinery Components
NASA Technical Reports Server (NTRS)
Reddy, T. S. R.; Mital, S. K.; Stefko, G. L.
2004-01-01
A probabilistic approach is described for aeroelastic analysis of turbomachinery blade rows. Blade rows with subsonic flow and blade rows with supersonic flow with subsonic leading edge are considered. To demonstrate the probabilistic approach, the flutter frequency, damping and forced response of a blade row representing a compressor geometry is considered. The analysis accounts for uncertainties in structural and aerodynamic design variables. The results are presented in the form of probabilistic density function (PDF) and sensitivity factors. For subsonic flow cascade, comparisons are also made with different probabilistic distributions, probabilistic methods, and Monte-Carlo simulation. The approach shows that the probabilistic approach provides a more realistic and systematic way to assess the effect of uncertainties in design variables on the aeroelastic instabilities and response.
Probabilistic structural analysis methods for select space propulsion system components
NASA Technical Reports Server (NTRS)
Millwater, H. R.; Cruse, T. A.
1989-01-01
The Probabilistic Structural Analysis Methods (PSAM) project developed at the Southwest Research Institute integrates state-of-the-art structural analysis techniques with probability theory for the design and analysis of complex large-scale engineering structures. An advanced efficient software system (NESSUS) capable of performing complex probabilistic analysis has been developed. NESSUS contains a number of software components to perform probabilistic analysis of structures. These components include: an expert system, a probabilistic finite element code, a probabilistic boundary element code and a fast probability integrator. The NESSUS software system is shown. An expert system is included to capture and utilize PSAM knowledge and experience. NESSUS/EXPERT is an interactive menu-driven expert system that provides information to assist in the use of the probabilistic finite element code NESSUS/FEM and the fast probability integrator (FPI). The expert system menu structure is summarized. The NESSUS system contains a state-of-the-art nonlinear probabilistic finite element code, NESSUS/FEM, to determine the structural response and sensitivities. A broad range of analysis capabilities and an extensive element library is present.
Coupled Multi-Disciplinary Optimization for Structural Reliability and Affordability
NASA Technical Reports Server (NTRS)
Abumeri, Galib H.; Chamis, Christos C.
2003-01-01
A computational simulation method is presented for Non-Deterministic Multidisciplinary Optimization of engine composite materials and structures. A hypothetical engine duct made with ceramic matrix composites (CMC) is evaluated probabilistically in the presence of combined thermo-mechanical loading. The structure is tailored by quantifying the uncertainties in all relevant design variables such as fabrication, material, and loading parameters. The probabilistic sensitivities are used to select critical design variables for optimization. In this paper, two approaches for non-deterministic optimization are presented. The non-deterministic minimization of combined failure stress criterion is carried out by: (1) performing probabilistic evaluation first and then optimization and (2) performing optimization first and then probabilistic evaluation. The first approach shows that the optimization feasible region can be bounded by a set of prescribed probability limits and that the optimization follows the cumulative distribution function between those limits. The second approach shows that the optimization feasible region is bounded by 0.50 and 0.999 probabilities.
NASA Technical Reports Server (NTRS)
Bast, Callie C.; Jurena, Mark T.; Godines, Cody R.; Chamis, Christos C. (Technical Monitor)
2001-01-01
This project included both research and education objectives. The goal of this project was to advance innovative research and education objectives in theoretical and computational probabilistic structural analysis, reliability, and life prediction for improved reliability and safety of structural components of aerospace and aircraft propulsion systems. Research and education partners included Glenn Research Center (GRC) and Southwest Research Institute (SwRI) along with the University of Texas at San Antonio (UTSA). SwRI enhanced the NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) code and provided consulting support for NESSUS-related activities at UTSA. NASA funding supported three undergraduate students, two graduate students, a summer course instructor and the Principal Investigator. Matching funds from UTSA provided for the purchase of additional equipment for the enhancement of the Advanced Interactive Computational SGI Lab established during the first year of this Partnership Award to conduct the probabilistic finite element summer courses. The research portion of this report presents the cumulation of work performed through the use of the probabilistic finite element program, NESSUS, Numerical Evaluation and Structures Under Stress, and an embedded Material Strength Degradation (MSD) model. Probabilistic structural analysis provided for quantification of uncertainties associated with the design, thus enabling increased system performance and reliability. The structure examined was a Space Shuttle Main Engine (SSME) fuel turbopump blade. The blade material analyzed was Inconel 718, since the MSD model was previously calibrated for this material. Reliability analysis encompassing the effects of high temperature and high cycle fatigue, yielded a reliability value of 0.99978 using a fully correlated random field for the blade thickness. The reliability did not change significantly for a change in distribution type except for a change in distribution from Gaussian to Weibull for the centrifugal load. The sensitivity factors determined to be most dominant were the centrifugal loading and the initial strength of the material. These two sensitivity factors were influenced most by a change in distribution type from Gaussian to Weibull. The education portion of this report describes short-term and long-term educational objectives. Such objectives serve to integrate research and education components of this project resulting in opportunities for ethnic minority students, principally Hispanic. The primary vehicle to facilitate such integration was the teaching of two probabilistic finite element method courses to undergraduate engineering students in the summers of 1998 and 1999.
Probabilistic structural analysis methods of hot engine structures
NASA Technical Reports Server (NTRS)
Chamis, C. C.; Hopkins, D. A.
1989-01-01
Development of probabilistic structural analysis methods for hot engine structures is a major activity at Lewis Research Center. Recent activities have focused on extending the methods to include the combined uncertainties in several factors on structural response. This paper briefly describes recent progress on composite load spectra models, probabilistic finite element structural analysis, and probabilistic strength degradation modeling. Progress is described in terms of fundamental concepts, computer code development, and representative numerical results.
Probabilistic performance-based design for high performance control systems
NASA Astrophysics Data System (ADS)
Micheli, Laura; Cao, Liang; Gong, Yongqiang; Cancelli, Alessandro; Laflamme, Simon; Alipour, Alice
2017-04-01
High performance control systems (HPCS) are advanced damping systems capable of high damping performance over a wide frequency bandwidth, ideal for mitigation of multi-hazards. They include active, semi-active, and hybrid damping systems. However, HPCS are more expensive than typical passive mitigation systems, rely on power and hardware (e.g., sensors, actuators) to operate, and require maintenance. In this paper, a life cycle cost analysis (LCA) approach is proposed to estimate the economic benefit these systems over the entire life of the structure. The novelty resides in the life cycle cost analysis in the performance based design (PBD) tailored to multi-level wind hazards. This yields a probabilistic performance-based design approach for HPCS. Numerical simulations are conducted on a building located in Boston, MA. LCA are conducted for passive control systems and HPCS, and the concept of controller robustness is demonstrated. Results highlight the promise of the proposed performance-based design procedure.
E-research platform of EPOS Thematic Core Service "ANTHROPOGENIC HAZARDS"
NASA Astrophysics Data System (ADS)
Orlecka-Sikora, Beata; Lasocki, Stanisław; Grasso, Jean Robert; Schmittbuhl, Jean; Kwiatek, Grzegorz; Garcia, Alexander; Cassidy, Nigel; Sterzel, Mariusz; Szepieniec, Tomasz; Dineva, Savka; Biggare, Pascal; Saccorotti, Gilberto; Sileny, Jan; Fischer, Tomas
2016-04-01
EPOS Thematic Core Service ANTHROPOGENIC HAZARDS (TCS AH) aims to create new research opportunities in the field of anthropogenic hazards evoked by exploitation of georesources. TCS AH, based on the prototype built in the framework of the IS-EPOS project (https://tcs.ah-epos.eu/), financed from Polish structural funds (POIG.02.03.00-14-090/13-00), is being further developed within EPOS IP project (H2020-INFRADEV-1-2015-1, INFRADEV-3-2015). TCS AH is designed as a functional e-research environment to ensure a researcher the maximum possible freedom for in silico experimentation by providing a virtual laboratory in which researcher will be able to create own workspace with own processing streams. The unique integrated RI is: (i) data gathered in the so- called "episodes", comprehensively describing a geophysical process, induced or triggered by human technological activity, which under certain circumstances can become hazardous for people, infrastructure and the environment and (ii) problem-oriented, specific high-level services, with the particular attention devoted to methods analyzing correlations between technology, geophysical response and resulting hazard. Services to be implemented are grouped within six blocks: (1) Basic services for data integration and handling; (2) Services for physical models of stress/strain changes over time and space as driven by geo-resource production; (3) Services for analysing geophysical signals; (4) Services to extract the relation between technological operations and observed induced seismic/deformation; (5) Services to quantitative probabilistic assessments of anthropogenic seismic hazard - statistical properties of anthropogenic seismic series and their dependence on time-varying anthropogenesis; ground motion prediction equations; stationary and time-dependent probabilistic seismic hazard estimates, related to time-changeable technological factors inducing the seismic process; (6) Simulator for Multi-hazard/multi-risk assessment in ExploRation/exploitation of GEoResources (MERGER) - numerical estimate of the occurrence probability of chains of events or processes impacting the environment. TCS AH will also serve the public sector expert knowledge and background information. In order to fulfill this aim the services for outreach, dissemination & communication will be implemented. From the technical point of view the implementation of services will proceed according to the methods worked within the mentioned before IS-EPOS project. The detailed workflows of implementation process of aforementioned services & interaction between user & TCS AH have been already prepared.
NASA Astrophysics Data System (ADS)
Ishizaki, N. N.; Dairaku, K.; Ueno, G.
2016-12-01
We have developed a statistical downscaling method for estimating probabilistic climate projection using CMIP5 multi general circulation models (GCMs). A regression model was established so that the combination of weights of GCMs reflects the characteristics of the variation of observations at each grid point. Cross validations were conducted to select GCMs and to evaluate the regression model to avoid multicollinearity. By using spatially high resolution observation system, we conducted statistically downscaled probabilistic climate projections with 20-km horizontal grid spacing. Root mean squared errors for monthly mean air surface temperature and precipitation estimated by the regression method were the smallest compared with the results derived from a simple ensemble mean of GCMs and a cumulative distribution function based bias correction method. Projected changes in the mean temperature and precipitation were basically similar to those of the simple ensemble mean of GCMs. Mean precipitation was generally projected to increase associated with increased temperature and consequent increased moisture content in the air. Weakening of the winter monsoon may affect precipitation decrease in some areas. Temperature increase in excess of 4 K was expected in most areas of Japan in the end of 21st century under RCP8.5 scenario. The estimated probability of monthly precipitation exceeding 300 mm would increase around the Pacific side during the summer and the Japan Sea side during the winter season. This probabilistic climate projection based on the statistical method can be expected to bring useful information to the impact studies and risk assessments.
Probabilistic Climate Scenario Information for Risk Assessment
NASA Astrophysics Data System (ADS)
Dairaku, K.; Ueno, G.; Takayabu, I.
2014-12-01
Climate information and services for Impacts, Adaptation and Vulnerability (IAV) Assessments are of great concern. In order to develop probabilistic regional climate information that represents the uncertainty in climate scenario experiments in Japan, we compared the physics ensemble experiments using the 60km global atmospheric model of the Meteorological Research Institute (MRI-AGCM) with multi-model ensemble experiments with global atmospheric-ocean coupled models (CMIP3) of SRES A1b scenario experiments. The MRI-AGCM shows relatively good skills particularly in tropics for temperature and geopotential height. Variability in surface air temperature of physical ensemble experiments with MRI-AGCM was within the range of one standard deviation of the CMIP3 model in the Asia region. On the other hand, the variability of precipitation was relatively well represented compared with the variation of the CMIP3 models. Models which show the similar reproducibility in the present climate shows different future climate change. We couldn't find clear relationships between present climate and future climate change in temperature and precipitation. We develop a new method to produce probabilistic information of climate change scenarios by weighting model ensemble experiments based on a regression model (Krishnamurti et al., Science, 1999). The method can be easily applicable to other regions and other physical quantities, and also to downscale to finer-scale dependent on availability of observation dataset. The prototype of probabilistic information in Japan represents the quantified structural uncertainties of multi-model ensemble experiments of climate change scenarios. Acknowledgments: This study was supported by the SOUSEI Program, funded by Ministry of Education, Culture, Sports, Science and Technology, Government of Japan.
Praveen, Paurush; Fröhlich, Holger
2013-01-01
Inferring regulatory networks from experimental data via probabilistic graphical models is a popular framework to gain insights into biological systems. However, the inherent noise in experimental data coupled with a limited sample size reduces the performance of network reverse engineering. Prior knowledge from existing sources of biological information can address this low signal to noise problem by biasing the network inference towards biologically plausible network structures. Although integrating various sources of information is desirable, their heterogeneous nature makes this task challenging. We propose two computational methods to incorporate various information sources into a probabilistic consensus structure prior to be used in graphical model inference. Our first model, called Latent Factor Model (LFM), assumes a high degree of correlation among external information sources and reconstructs a hidden variable as a common source in a Bayesian manner. The second model, a Noisy-OR, picks up the strongest support for an interaction among information sources in a probabilistic fashion. Our extensive computational studies on KEGG signaling pathways as well as on gene expression data from breast cancer and yeast heat shock response reveal that both approaches can significantly enhance the reconstruction accuracy of Bayesian Networks compared to other competing methods as well as to the situation without any prior. Our framework allows for using diverse information sources, like pathway databases, GO terms and protein domain data, etc. and is flexible enough to integrate new sources, if available. PMID:23826291
Probabilities and predictions: modeling the development of scientific problem-solving skills.
Stevens, Ron; Johnson, David F; Soller, Amy
2005-01-01
The IMMEX (Interactive Multi-Media Exercises) Web-based problem set platform enables the online delivery of complex, multimedia simulations, the rapid collection of student performance data, and has already been used in several genetic simulations. The next step is the use of these data to understand and improve student learning in a formative manner. This article describes the development of probabilistic models of undergraduate student problem solving in molecular genetics that detailed the spectrum of strategies students used when problem solving, and how the strategic approaches evolved with experience. The actions of 776 university sophomore biology majors from three molecular biology lecture courses were recorded and analyzed. Each of six simulations were first grouped by artificial neural network clustering to provide individual performance measures, and then sequences of these performances were probabilistically modeled by hidden Markov modeling to provide measures of progress. The models showed that students with different initial problem-solving abilities choose different strategies. Initial and final strategies varied across different sections of the same course and were not strongly correlated with other achievement measures. In contrast to previous studies, we observed no significant gender differences. We suggest that instructor interventions based on early student performances with these simulations may assist students to recognize effective and efficient problem-solving strategies and enhance learning.
NASA Astrophysics Data System (ADS)
Maurya, S. P.; Singh, K. H.; Singh, N. P.
2018-05-01
In present study, three recently developed geostatistical methods, single attribute analysis, multi-attribute analysis and probabilistic neural network algorithm have been used to predict porosity in inter well region for Blackfoot field, Alberta, Canada, an offshore oil field. These techniques make use of seismic attributes, generated by model based inversion and colored inversion techniques. The principle objective of the study is to find the suitable combination of seismic inversion and geostatistical techniques to predict porosity and identification of prospective zones in 3D seismic volume. The porosity estimated from these geostatistical approaches is corroborated with the well log porosity. The results suggest that all the three implemented geostatistical methods are efficient and reliable to predict the porosity but the multi-attribute and probabilistic neural network analysis provide more accurate and high resolution porosity sections. A low impedance (6000-8000 m/s g/cc) and high porosity (> 15%) zone is interpreted from inverted impedance and porosity sections respectively between 1060 and 1075 ms time interval and is characterized as reservoir. The qualitative and quantitative results demonstrate that of all the employed geostatistical methods, the probabilistic neural network along with model based inversion is the most efficient method for predicting porosity in inter well region.
Dall'Osso, F.; Dominey-Howes, D.; Moore, C.; Summerhayes, S.; Withycombe, G.
2014-01-01
Approximately 85% of Australia's population live along the coastal fringe, an area with high exposure to extreme inundations such as tsunamis. However, to date, no Probabilistic Tsunami Hazard Assessments (PTHA) that include inundation have been published for Australia. This limits the development of appropriate risk reduction measures by decision and policy makers. We describe our PTHA undertaken for the Sydney metropolitan area. Using the NOAA NCTR model MOST (Method for Splitting Tsunamis), we simulate 36 earthquake-generated tsunamis with annual probabilities of 1:100, 1:1,000 and 1:10,000, occurring under present and future predicted sea level conditions. For each tsunami scenario we generate a high-resolution inundation map of the maximum water level and flow velocity, and we calculate the exposure of buildings and critical infrastructure. Results indicate that exposure to earthquake-generated tsunamis is relatively low for present events, but increases significantly with higher sea level conditions. The probabilistic approach allowed us to undertake a comparison with an existing storm surge hazard assessment. Interestingly, the exposure to all the simulated tsunamis is significantly lower than that for the 1:100 storm surge scenarios, under the same initial sea level conditions. The results have significant implications for multi-risk and emergency management in Sydney. PMID:25492514
Dall'Osso, F; Dominey-Howes, D; Moore, C; Summerhayes, S; Withycombe, G
2014-12-10
Approximately 85% of Australia's population live along the coastal fringe, an area with high exposure to extreme inundations such as tsunamis. However, to date, no Probabilistic Tsunami Hazard Assessments (PTHA) that include inundation have been published for Australia. This limits the development of appropriate risk reduction measures by decision and policy makers. We describe our PTHA undertaken for the Sydney metropolitan area. Using the NOAA NCTR model MOST (Method for Splitting Tsunamis), we simulate 36 earthquake-generated tsunamis with annual probabilities of 1:100, 1:1,000 and 1:10,000, occurring under present and future predicted sea level conditions. For each tsunami scenario we generate a high-resolution inundation map of the maximum water level and flow velocity, and we calculate the exposure of buildings and critical infrastructure. Results indicate that exposure to earthquake-generated tsunamis is relatively low for present events, but increases significantly with higher sea level conditions. The probabilistic approach allowed us to undertake a comparison with an existing storm surge hazard assessment. Interestingly, the exposure to all the simulated tsunamis is significantly lower than that for the 1:100 storm surge scenarios, under the same initial sea level conditions. The results have significant implications for multi-risk and emergency management in Sydney.
Probabilistic Elastic Part Model: A Pose-Invariant Representation for Real-World Face Verification.
Li, Haoxiang; Hua, Gang
2018-04-01
Pose variation remains to be a major challenge for real-world face recognition. We approach this problem through a probabilistic elastic part model. We extract local descriptors (e.g., LBP or SIFT) from densely sampled multi-scale image patches. By augmenting each descriptor with its location, a Gaussian mixture model (GMM) is trained to capture the spatial-appearance distribution of the face parts of all face images in the training corpus, namely the probabilistic elastic part (PEP) model. Each mixture component of the GMM is confined to be a spherical Gaussian to balance the influence of the appearance and the location terms, which naturally defines a part. Given one or multiple face images of the same subject, the PEP-model builds its PEP representation by sequentially concatenating descriptors identified by each Gaussian component in a maximum likelihood sense. We further propose a joint Bayesian adaptation algorithm to adapt the universally trained GMM to better model the pose variations between the target pair of faces/face tracks, which consistently improves face verification accuracy. Our experiments show that we achieve state-of-the-art face verification accuracy with the proposed representations on the Labeled Face in the Wild (LFW) dataset, the YouTube video face database, and the CMU MultiPIE dataset.
Analyzing gene expression time-courses based on multi-resolution shape mixture model.
Li, Ying; He, Ye; Zhang, Yu
2016-11-01
Biological processes actually are a dynamic molecular process over time. Time course gene expression experiments provide opportunities to explore patterns of gene expression change over a time and understand the dynamic behavior of gene expression, which is crucial for study on development and progression of biology and disease. Analysis of the gene expression time-course profiles has not been fully exploited so far. It is still a challenge problem. We propose a novel shape-based mixture model clustering method for gene expression time-course profiles to explore the significant gene groups. Based on multi-resolution fractal features and mixture clustering model, we proposed a multi-resolution shape mixture model algorithm. Multi-resolution fractal features is computed by wavelet decomposition, which explore patterns of change over time of gene expression at different resolution. Our proposed multi-resolution shape mixture model algorithm is a probabilistic framework which offers a more natural and robust way of clustering time-course gene expression. We assessed the performance of our proposed algorithm using yeast time-course gene expression profiles compared with several popular clustering methods for gene expression profiles. The grouped genes identified by different methods are evaluated by enrichment analysis of biological pathways and known protein-protein interactions from experiment evidence. The grouped genes identified by our proposed algorithm have more strong biological significance. A novel multi-resolution shape mixture model algorithm based on multi-resolution fractal features is proposed. Our proposed model provides a novel horizons and an alternative tool for visualization and analysis of time-course gene expression profiles. The R and Matlab program is available upon the request. Copyright © 2016 Elsevier Inc. All rights reserved.
Probabilistic analysis of a materially nonlinear structure
NASA Technical Reports Server (NTRS)
Millwater, H. R.; Wu, Y.-T.; Fossum, A. F.
1990-01-01
A probabilistic finite element program is used to perform probabilistic analysis of a materially nonlinear structure. The program used in this study is NESSUS (Numerical Evaluation of Stochastic Structure Under Stress), under development at Southwest Research Institute. The cumulative distribution function (CDF) of the radial stress of a thick-walled cylinder under internal pressure is computed and compared with the analytical solution. In addition, sensitivity factors showing the relative importance of the input random variables are calculated. Significant plasticity is present in this problem and has a pronounced effect on the probabilistic results. The random input variables are the material yield stress and internal pressure with Weibull and normal distributions, respectively. The results verify the ability of NESSUS to compute the CDF and sensitivity factors of a materially nonlinear structure. In addition, the ability of the Advanced Mean Value (AMV) procedure to assess the probabilistic behavior of structures which exhibit a highly nonlinear response is shown. Thus, the AMV procedure can be applied with confidence to other structures which exhibit nonlinear behavior.
ERIC Educational Resources Information Center
Grotzer, Tina A.; Solis, S. Lynneth; Tutwiler, M. Shane; Cuzzolino, Megan Powell
2017-01-01
Understanding complex systems requires reasoning about causal relationships that behave or appear to behave probabilistically. Features such as distributed agency, large spatial scales, and time delays obscure co-variation relationships and complex interactions can result in non-deterministic relationships between causes and effects that are best…
Ciecior, Willy; Röhlig, Klaus-Jürgen; Kirchner, Gerald
2018-10-01
In the present paper, deterministic as well as first- and second-order probabilistic biosphere modeling approaches are compared. Furthermore, the sensitivity of the influence of the probability distribution function shape (empirical distribution functions and fitted lognormal probability functions) representing the aleatory uncertainty (also called variability) of a radioecological model parameter as well as the role of interacting parameters are studied. Differences in the shape of the output distributions for the biosphere dose conversion factor from first-order Monte Carlo uncertainty analysis using empirical and fitted lognormal distribution functions for input parameters suggest that a lognormal approximation is possibly not always an adequate representation of the aleatory uncertainty of a radioecological parameter. Concerning the comparison of the impact of aleatory and epistemic parameter uncertainty on the biosphere dose conversion factor, the latter here is described using uncertain moments (mean, variance) while the distribution itself represents the aleatory uncertainty of the parameter. From the results obtained, the solution space of second-order Monte Carlo simulation is much larger than that from first-order Monte Carlo simulation. Therefore, the influence of epistemic uncertainty of a radioecological parameter on the output result is much larger than that one caused by its aleatory uncertainty. Parameter interactions are only of significant influence in the upper percentiles of the distribution of results as well as only in the region of the upper percentiles of the model parameters. Copyright © 2018 Elsevier Ltd. All rights reserved.
Ren, Yin; Deng, Lu-Ying; Zuo, Shu-Di; Song, Xiao-Dong; Liao, Yi-Lan; Xu, Cheng-Dong; Chen, Qi; Hua, Li-Zhong; Li, Zheng-Wei
2016-09-01
Identifying factors that influence the land surface temperature (LST) of urban forests can help improve simulations and predictions of spatial patterns of urban cool islands. This requires a quantitative analytical method that combines spatial statistical analysis with multi-source observational data. The purpose of this study was to reveal how human activities and ecological factors jointly influence LST in clustering regions (hot or cool spots) of urban forests. Using Xiamen City, China from 1996 to 2006 as a case study, we explored the interactions between human activities and ecological factors, as well as their influences on urban forest LST. Population density was selected as a proxy for human activity. We integrated multi-source data (forest inventory, digital elevation models (DEM), population, and remote sensing imagery) to develop a database on a unified urban scale. The driving mechanism of urban forest LST was revealed through a combination of multi-source spatial data and spatial statistical analysis of clustering regions. The results showed that the main factors contributing to urban forest LST were dominant tree species and elevation. The interactions between human activity and specific ecological factors linearly or nonlinearly increased LST in urban forests. Strong interactions between elevation and dominant species were generally observed and were prevalent in either hot or cold spots areas in different years. In conclusion, quantitative studies based on spatial statistics and GeogDetector models should be conducted in urban areas to reveal interactions between human activities, ecological factors, and LST. Copyright © 2016 Elsevier Ltd. All rights reserved.
Interrelation Between Safety Factors and Reliability
NASA Technical Reports Server (NTRS)
Elishakoff, Isaac; Chamis, Christos C. (Technical Monitor)
2001-01-01
An evaluation was performed to establish relationships between safety factors and reliability relationships. Results obtained show that the use of the safety factor is not contradictory to the employment of the probabilistic methods. In many cases the safety factors can be directly expressed by the required reliability levels. However, there is a major difference that must be emphasized: whereas the safety factors are allocated in an ad hoc manner, the probabilistic approach offers a unified mathematical framework. The establishment of the interrelation between the concepts opens an avenue to specify safety factors based on reliability. In cases where there are several forms of failure, then the allocation of safety factors should he based on having the same reliability associated with each failure mode. This immediately suggests that by the probabilistic methods the existing over-design or under-design can be eliminated. The report includes three parts: Part 1-Random Actual Stress and Deterministic Yield Stress; Part 2-Deterministic Actual Stress and Random Yield Stress; Part 3-Both Actual Stress and Yield Stress Are Random.
Probabilistic graphs as a conceptual and computational tool in hydrology and water management
NASA Astrophysics Data System (ADS)
Schoups, Gerrit
2014-05-01
Originally developed in the fields of machine learning and artificial intelligence, probabilistic graphs constitute a general framework for modeling complex systems in the presence of uncertainty. The framework consists of three components: 1. Representation of the model as a graph (or network), with nodes depicting random variables in the model (e.g. parameters, states, etc), which are joined together by factors. Factors are local probabilistic or deterministic relations between subsets of variables, which, when multiplied together, yield the joint distribution over all variables. 2. Consistent use of probability theory for quantifying uncertainty, relying on basic rules of probability for assimilating data into the model and expressing unknown variables as a function of observations (via the posterior distribution). 3. Efficient, distributed approximation of the posterior distribution using general-purpose algorithms that exploit model structure encoded in the graph. These attributes make probabilistic graphs potentially useful as a conceptual and computational tool in hydrology and water management (and beyond). Conceptually, they can provide a common framework for existing and new probabilistic modeling approaches (e.g. by drawing inspiration from other fields of application), while computationally they can make probabilistic inference feasible in larger hydrological models. The presentation explores, via examples, some of these benefits.
NASA Astrophysics Data System (ADS)
Bauer, Johannes; Dávila-Chacón, Jorge; Wermter, Stefan
2015-10-01
Humans and other animals have been shown to perform near-optimally in multi-sensory integration tasks. Probabilistic population codes (PPCs) have been proposed as a mechanism by which optimal integration can be accomplished. Previous approaches have focussed on how neural networks might produce PPCs from sensory input or perform calculations using them, like combining multiple PPCs. Less attention has been given to the question of how the necessary organisation of neurons can arise and how the required knowledge about the input statistics can be learned. In this paper, we propose a model of learning multi-sensory integration based on an unsupervised learning algorithm in which an artificial neural network learns the noise characteristics of each of its sources of input. Our algorithm borrows from the self-organising map the ability to learn latent-variable models of the input and extends it to learning to produce a PPC approximating a probability density function over the latent variable behind its (noisy) input. The neurons in our network are only required to perform simple calculations and we make few assumptions about input noise properties and tuning functions. We report on a neurorobotic experiment in which we apply our algorithm to multi-sensory integration in a humanoid robot to demonstrate its effectiveness and compare it to human multi-sensory integration on the behavioural level. We also show in simulations that our algorithm performs near-optimally under certain plausible conditions, and that it reproduces important aspects of natural multi-sensory integration on the neural level.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Josse, Florent; Lefebvre, Yannick; Todeschini, Patrick
2006-07-01
Assessing the structural integrity of a nuclear Reactor Pressure Vessel (RPV) subjected to pressurized-thermal-shock (PTS) transients is extremely important to safety. In addition to conventional deterministic calculations to confirm RPV integrity, Electricite de France (EDF) carries out probabilistic analyses. Probabilistic analyses are interesting because some key variables, albeit conventionally taken at conservative values, can be modeled more accurately through statistical variability. One variable which significantly affects RPV structural integrity assessment is cleavage fracture initiation toughness. The reference fracture toughness method currently in use at EDF is the RCCM and ASME Code lower-bound K{sub IC} based on the indexing parameter RT{submore » NDT}. However, in order to quantify the toughness scatter for probabilistic analyses, the master curve method is being analyzed at present. Furthermore, the master curve method is a direct means of evaluating fracture toughness based on K{sub JC} data. In the framework of the master curve investigation undertaken by EDF, this article deals with the following two statistical items: building a master curve from an extract of a fracture toughness dataset (from the European project 'Unified Reference Fracture Toughness Design curves for RPV Steels') and controlling statistical uncertainty for both mono-temperature and multi-temperature tests. Concerning the first point, master curve temperature dependence is empirical in nature. To determine the 'original' master curve, Wallin postulated that a unified description of fracture toughness temperature dependence for ferritic steels is possible, and used a large number of data corresponding to nuclear-grade pressure vessel steels and welds. Our working hypothesis is that some ferritic steels may behave in slightly different ways. Therefore we focused exclusively on the basic french reactor vessel metal of types A508 Class 3 and A 533 grade B Class 1, taking the sampling level and direction into account as well as the test specimen type. As for the second point, the emphasis is placed on the uncertainties in applying the master curve approach. For a toughness dataset based on different specimens of a single product, application of the master curve methodology requires the statistical estimation of one parameter: the reference temperature T{sub 0}. Because of the limited number of specimens, estimation of this temperature is uncertain. The ASTM standard provides a rough evaluation of this statistical uncertainty through an approximate confidence interval. In this paper, a thorough study is carried out to build more meaningful confidence intervals (for both mono-temperature and multi-temperature tests). These results ensure better control over uncertainty, and allow rigorous analysis of the impact of its influencing factors: the number of specimens and the temperatures at which they have been tested. (authors)« less
Probabilistic verification of cloud fraction from three different products with CALIPSO
NASA Astrophysics Data System (ADS)
Jung, B. J.; Descombes, G.; Snyder, C.
2017-12-01
In this study, we present how Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observation (CALIPSO) can be used for probabilistic verification of cloud fraction, and apply this probabilistic approach to three cloud fraction products: a) The Air Force Weather (AFW) World Wide Merged Cloud Analysis (WWMCA), b) Satellite Cloud Observations and Radiative Property retrieval Systems (SatCORPS) from NASA Langley Research Center, and c) Multi-sensor Advection Diffusion nowCast (MADCast) from NCAR. Although they differ in their details, both WWMCA and SatCORPS retrieve cloud fraction from satellite observations, mainly of infrared radiances. MADCast utilizes in addition a short-range forecast of cloud fraction (provided by the Model for Prediction Across Scales, assuming cloud fraction is advected as a tracer) and a column-by-column particle filter implemented within the Gridpoint Statistical Interpolation (GSI) data-assimilation system. The probabilistic verification considers the retrieved or analyzed cloud fractions as predicting the probability of cloud at any location within a grid cell and the 5-km vertical feature mask (VFM) from CALIPSO level-2 products as a point observation of cloud.
NASA Astrophysics Data System (ADS)
Umut Caglar, Mehmet; Pal, Ranadip
2010-10-01
The central dogma of molecular biology states that ``information cannot be transferred back from protein to either protein or nucleic acid.'' However, this assumption is not exactly correct in most of the cases. There are a lot of feedback loops and interactions between different levels of systems. These types of interactions are hard to analyze due to the lack of data in the cellular level and probabilistic nature of interactions. Probabilistic models like Stochastic Master Equation (SME) or deterministic models like differential equations (DE) can be used to analyze these types of interactions. SME models based on chemical master equation (CME) can provide detailed representation of genetic regulatory system, but their use is restricted by the large data requirements and computational costs of calculations. The differential equations models on the other hand, have low calculation costs and much more adequate to generate control procedures on the system; but they are not adequate to investigate the probabilistic nature of interactions. In this work the success of the mapping between SME and DE is analyzed, and the success of a control policy generated by DE model with respect to SME model is examined. Index Terms--- Stochastic Master Equation models, Differential Equation Models, Control Policy Design, Systems biology
Saul: Towards Declarative Learning Based Programming
Kordjamshidi, Parisa; Roth, Dan; Wu, Hao
2015-01-01
We present Saul, a new probabilistic programming language designed to address some of the shortcomings of programming languages that aim at advancing and simplifying the development of AI systems. Such languages need to interact with messy, naturally occurring data, to allow a programmer to specify what needs to be done at an appropriate level of abstraction rather than at the data level, to be developed on a solid theory that supports moving to and reasoning at this level of abstraction and, finally, to support flexible integration of these learning and inference models within an application program. Saul is an object-functional programming language written in Scala that facilitates these by (1) allowing a programmer to learn, name and manipulate named abstractions over relational data; (2) supporting seamless incorporation of trainable (probabilistic or discriminative) components into the program, and (3) providing a level of inference over trainable models to support composition and make decisions that respect domain and application constraints. Saul is developed over a declaratively defined relational data model, can use piecewise learned factor graphs with declaratively specified learning and inference objectives, and it supports inference over probabilistic models augmented with declarative knowledge-based constraints. We describe the key constructs of Saul and exemplify its use in developing applications that require relational feature engineering and structured output prediction. PMID:26635465
Saul: Towards Declarative Learning Based Programming.
Kordjamshidi, Parisa; Roth, Dan; Wu, Hao
2015-07-01
We present Saul , a new probabilistic programming language designed to address some of the shortcomings of programming languages that aim at advancing and simplifying the development of AI systems. Such languages need to interact with messy, naturally occurring data, to allow a programmer to specify what needs to be done at an appropriate level of abstraction rather than at the data level, to be developed on a solid theory that supports moving to and reasoning at this level of abstraction and, finally, to support flexible integration of these learning and inference models within an application program. Saul is an object-functional programming language written in Scala that facilitates these by (1) allowing a programmer to learn, name and manipulate named abstractions over relational data; (2) supporting seamless incorporation of trainable (probabilistic or discriminative) components into the program, and (3) providing a level of inference over trainable models to support composition and make decisions that respect domain and application constraints. Saul is developed over a declaratively defined relational data model, can use piecewise learned factor graphs with declaratively specified learning and inference objectives, and it supports inference over probabilistic models augmented with declarative knowledge-based constraints. We describe the key constructs of Saul and exemplify its use in developing applications that require relational feature engineering and structured output prediction.
Multi-omics approach identifies molecular mechanisms of plant-fungus mycorrhizal interaction
Larsen, Peter E.; Sreedasyam, Avinash; Trivedi, Geetika; ...
2016-01-19
In mycorrhizal symbiosis, plant roots form close, mutually beneficial interactions with soil fungi. Before this mycorrhizal interaction can be established however, plant roots must be capable of detecting potential beneficial fungal partners and initiating the gene expression patterns necessary to begin symbiosis. To predict a plant root – mycorrhizal fungi sensor systems, we analyzed in vitro experiments of Populus tremuloides (aspen tree) and Laccaria bicolor (mycorrhizal fungi) interaction and leveraged over 200 previously published transcriptomic experimental data sets, 159 experimentally validated plant transcription factor binding motifs, and more than 120-thousand experimentally validated protein-protein interactions to generate models of pre-mycorrhizal sensormore » systems in aspen root. These sensor mechanisms link extracellular signaling molecules with gene regulation through a network comprised of membrane receptors, signal cascade proteins, transcription factors, and transcription factor biding DNA motifs. Modeling predicted four pre-mycorrhizal sensor complexes in aspen that interact with fifteen transcription factors to regulate the expression of 1184 genes in response to extracellular signals synthesized by Laccaria. Predicted extracellular signaling molecules include common signaling molecules such as phenylpropanoids, salicylate, and, jasmonic acid. Lastly, this multi-omic computational modeling approach for predicting the complex sensory networks yielded specific, testable biological hypotheses for mycorrhizal interaction signaling compounds, sensor complexes, and mechanisms of gene regulation.« less
Multi-omics approach identifies molecular mechanisms of plant-fungus mycorrhizal interaction
DOE Office of Scientific and Technical Information (OSTI.GOV)
Larsen, Peter E.; Sreedasyam, Avinash; Trivedi, Geetika
In mycorrhizal symbiosis, plant roots form close, mutually beneficial interactions with soil fungi. Before this mycorrhizal interaction can be established however, plant roots must be capable of detecting potential beneficial fungal partners and initiating the gene expression patterns necessary to begin symbiosis. To predict a plant root – mycorrhizal fungi sensor systems, we analyzed in vitro experiments of Populus tremuloides (aspen tree) and Laccaria bicolor (mycorrhizal fungi) interaction and leveraged over 200 previously published transcriptomic experimental data sets, 159 experimentally validated plant transcription factor binding motifs, and more than 120-thousand experimentally validated protein-protein interactions to generate models of pre-mycorrhizal sensormore » systems in aspen root. These sensor mechanisms link extracellular signaling molecules with gene regulation through a network comprised of membrane receptors, signal cascade proteins, transcription factors, and transcription factor biding DNA motifs. Modeling predicted four pre-mycorrhizal sensor complexes in aspen that interact with fifteen transcription factors to regulate the expression of 1184 genes in response to extracellular signals synthesized by Laccaria. Predicted extracellular signaling molecules include common signaling molecules such as phenylpropanoids, salicylate, and, jasmonic acid. Lastly, this multi-omic computational modeling approach for predicting the complex sensory networks yielded specific, testable biological hypotheses for mycorrhizal interaction signaling compounds, sensor complexes, and mechanisms of gene regulation.« less
Lee, Insuk; Li, Zhihua; Marcotte, Edward M.
2007-01-01
Background Probabilistic functional gene networks are powerful theoretical frameworks for integrating heterogeneous functional genomics and proteomics data into objective models of cellular systems. Such networks provide syntheses of millions of discrete experimental observations, spanning DNA microarray experiments, physical protein interactions, genetic interactions, and comparative genomics; the resulting networks can then be easily applied to generate testable hypotheses regarding specific gene functions and associations. Methodology/Principal Findings We report a significantly improved version (v. 2) of a probabilistic functional gene network [1] of the baker's yeast, Saccharomyces cerevisiae. We describe our optimization methods and illustrate their effects in three major areas: the reduction of functional bias in network training reference sets, the application of a probabilistic model for calculating confidences in pair-wise protein physical or genetic interactions, and the introduction of simple thresholds that eliminate many false positive mRNA co-expression relationships. Using the network, we predict and experimentally verify the function of the yeast RNA binding protein Puf6 in 60S ribosomal subunit biogenesis. Conclusions/Significance YeastNet v. 2, constructed using these optimizations together with additional data, shows significant reduction in bias and improvements in precision and recall, in total covering 102,803 linkages among 5,483 yeast proteins (95% of the validated proteome). YeastNet is available from http://www.yeastnet.org. PMID:17912365
Discriminative confidence estimation for probabilistic multi-atlas label fusion.
Benkarim, Oualid M; Piella, Gemma; González Ballester, Miguel Angel; Sanroma, Gerard
2017-12-01
Quantitative neuroimaging analyses often rely on the accurate segmentation of anatomical brain structures. In contrast to manual segmentation, automatic methods offer reproducible outputs and provide scalability to study large databases. Among existing approaches, multi-atlas segmentation has recently shown to yield state-of-the-art performance in automatic segmentation of brain images. It consists in propagating the labelmaps from a set of atlases to the anatomy of a target image using image registration, and then fusing these multiple warped labelmaps into a consensus segmentation on the target image. Accurately estimating the contribution of each atlas labelmap to the final segmentation is a critical step for the success of multi-atlas segmentation. Common approaches to label fusion either rely on local patch similarity, probabilistic statistical frameworks or a combination of both. In this work, we propose a probabilistic label fusion framework based on atlas label confidences computed at each voxel of the structure of interest. Maximum likelihood atlas confidences are estimated using a supervised approach, explicitly modeling the relationship between local image appearances and segmentation errors produced by each of the atlases. We evaluate different spatial pooling strategies for modeling local segmentation errors. We also present a novel type of label-dependent appearance features based on atlas labelmaps that are used during confidence estimation to increase the accuracy of our label fusion. Our approach is evaluated on the segmentation of seven subcortical brain structures from the MICCAI 2013 SATA Challenge dataset and the hippocampi from the ADNI dataset. Overall, our results indicate that the proposed label fusion framework achieves superior performance to state-of-the-art approaches in the majority of the evaluated brain structures and shows more robustness to registration errors. Copyright © 2017 Elsevier B.V. All rights reserved.
Spatiotemporal movement planning and rapid adaptation for manual interaction.
Huber, Markus; Kupferberg, Aleksandra; Lenz, Claus; Knoll, Alois; Brandt, Thomas; Glasauer, Stefan
2013-01-01
Many everyday tasks require the ability of two or more individuals to coordinate their actions with others to increase efficiency. Such an increase in efficiency can often be observed even after only very few trials. Previous work suggests that such behavioral adaptation can be explained within a probabilistic framework that integrates sensory input and prior experience. Even though higher cognitive abilities such as intention recognition have been described as probabilistic estimation depending on an internal model of the other agent, it is not clear whether much simpler daily interaction is consistent with a probabilistic framework. Here, we investigate whether the mechanisms underlying efficient coordination during manual interactions can be understood as probabilistic optimization. For this purpose we studied in several experiments a simple manual handover task concentrating on the action of the receiver. We found that the duration until the receiver reacts to the handover decreases over trials, but strongly depends on the position of the handover. We then replaced the human deliverer by different types of robots to further investigate the influence of the delivering movement on the reaction of the receiver. Durations were found to depend on movement kinematics and the robot's joint configuration. Modeling the task was based on the assumption that the receiver's decision to act is based on the accumulated evidence for a specific handover position. The evidence for this handover position is collected from observing the hand movement of the deliverer over time and, if appropriate, by integrating this sensory likelihood with prior expectation that is updated over trials. The close match of model simulations and experimental results shows that the efficiency of handover coordination can be explained by an adaptive probabilistic fusion of a-priori expectation and online estimation.
Interacting with an artificial partner: modeling the role of emotional aspects.
Cattinelli, Isabella; Goldwurm, Massimiliano; Borghese, N Alberto
2008-12-01
In this paper we introduce a simple model based on probabilistic finite state automata to describe an emotional interaction between a robot and a human user, or between simulated agents. Based on the agent's personality, attitude, and nature, and on the emotional inputs it receives, the model will determine the next emotional state displayed by the agent itself. The probabilistic and time-varying nature of the model yields rich and dynamic interactions, and an autonomous adaptation to the interlocutor. In addition, a reinforcement learning technique is applied to have one agent drive its partner's behavior toward desired states. The model may also be used as a tool for behavior analysis, by extracting high probability patterns of interaction and by resorting to the ergodic properties of Markov chains.
a Probabilistic Embedding Clustering Method for Urban Structure Detection
NASA Astrophysics Data System (ADS)
Lin, X.; Li, H.; Zhang, Y.; Gao, L.; Zhao, L.; Deng, M.
2017-09-01
Urban structure detection is a basic task in urban geography. Clustering is a core technology to detect the patterns of urban spatial structure, urban functional region, and so on. In big data era, diverse urban sensing datasets recording information like human behaviour and human social activity, suffer from complexity in high dimension and high noise. And unfortunately, the state-of-the-art clustering methods does not handle the problem with high dimension and high noise issues concurrently. In this paper, a probabilistic embedding clustering method is proposed. Firstly, we come up with a Probabilistic Embedding Model (PEM) to find latent features from high dimensional urban sensing data by "learning" via probabilistic model. By latent features, we could catch essential features hidden in high dimensional data known as patterns; with the probabilistic model, we can also reduce uncertainty caused by high noise. Secondly, through tuning the parameters, our model could discover two kinds of urban structure, the homophily and structural equivalence, which means communities with intensive interaction or in the same roles in urban structure. We evaluated the performance of our model by conducting experiments on real-world data and experiments with real data in Shanghai (China) proved that our method could discover two kinds of urban structure, the homophily and structural equivalence, which means clustering community with intensive interaction or under the same roles in urban space.
Development of probabilistic emission inventories of air toxics for Jacksonville, Florida, USA.
Zhao, Yuchao; Frey, H Christopher
2004-11-01
Probabilistic emission inventories were developed for 1,3-butadiene, mercury (Hg), arsenic (As), benzene, formaldehyde, and lead for Jacksonville, FL. To quantify inter-unit variability in empirical emission factor data, the Maximum Likelihood Estimation (MLE) method or the Method of Matching Moments was used to fit parametric distributions. For data sets that contain nondetected measurements, a method based upon MLE was used for parameter estimation. To quantify the uncertainty in urban air toxic emission factors, parametric bootstrap simulation and empirical bootstrap simulation were applied to uncensored and censored data, respectively. The probabilistic emission inventories were developed based on the product of the uncertainties in the emission factors and in the activity factors. The uncertainties in the urban air toxics emission inventories range from as small as -25 to +30% for Hg to as large as -83 to +243% for As. The key sources of uncertainty in the emission inventory for each toxic are identified based upon sensitivity analysis. Typically, uncertainty in the inventory of a given pollutant can be attributed primarily to a small number of source categories. Priorities for improving the inventories and for refining the probabilistic analysis are discussed.
A Probabilistic Design Method Applied to Smart Composite Structures
NASA Technical Reports Server (NTRS)
Shiao, Michael C.; Chamis, Christos C.
1995-01-01
A probabilistic design method is described and demonstrated using a smart composite wing. Probabilistic structural design incorporates naturally occurring uncertainties including those in constituent (fiber/matrix) material properties, fabrication variables, structure geometry and control-related parameters. Probabilistic sensitivity factors are computed to identify those parameters that have a great influence on a specific structural reliability. Two performance criteria are used to demonstrate this design methodology. The first criterion requires that the actuated angle at the wing tip be bounded by upper and lower limits at a specified reliability. The second criterion requires that the probability of ply damage due to random impact load be smaller than an assigned value. When the relationship between reliability improvement and the sensitivity factors is assessed, the results show that a reduction in the scatter of the random variable with the largest sensitivity factor (absolute value) provides the lowest failure probability. An increase in the mean of the random variable with a negative sensitivity factor will reduce the failure probability. Therefore, the design can be improved by controlling or selecting distribution parameters associated with random variables. This can be implemented during the manufacturing process to obtain maximum benefit with minimum alterations.
Probabilistic vs linear blending approaches to shared control for wheelchair driving.
Ezeh, Chinemelu; Trautman, Pete; Devigne, Louise; Bureau, Valentin; Babel, Marie; Carlson, Tom
2017-07-01
Some people with severe mobility impairments are unable to operate powered wheelchairs reliably and effectively, using commercially available interfaces. This has sparked a body of research into "smart wheelchairs", which assist users to drive safely and create opportunities for them to use alternative interfaces. Various "shared control" techniques have been proposed to provide an appropriate level of assistance that is satisfactory and acceptable to the user. Most shared control techniques employ a traditional strategy called linear blending (LB), where the user's commands and wheelchair's autonomous commands are combined in some proportion. In this paper, however, we implement a more generalised form of shared control called probabilistic shared control (PSC). This probabilistic formulation improves the accuracy of modelling the interaction between the user and the wheelchair by taking into account uncertainty in the interaction. In this paper, we demonstrate the practical success of PSC over LB in terms of safety, particularly for novice users.
Segmentation of Image Ensembles via Latent Atlases
Van Leemput, Koen; Menze, Bjoern H.; Wells, William M.; Golland, Polina
2010-01-01
Spatial priors, such as probabilistic atlases, play an important role in MRI segmentation. However, the availability of comprehensive, reliable and suitable manual segmentations for atlas construction is limited. We therefore propose a method for joint segmentation of corresponding regions of interest in a collection of aligned images that does not require labeled training data. Instead, a latent atlas, initialized by at most a single manual segmentation, is inferred from the evolving segmentations of the ensemble. The algorithm is based on probabilistic principles but is solved using partial differential equations (PDEs) and energy minimization criteria. We evaluate the method on two datasets, segmenting subcortical and cortical structures in a multi-subject study and extracting brain tumors in a single-subject multi-modal longitudinal experiment. We compare the segmentation results to manual segmentations, when those exist, and to the results of a state-of-the-art atlas-based segmentation method. The quality of the results supports the latent atlas as a promising alternative when existing atlases are not compatible with the images to be segmented. PMID:20580305
Dinov, Martin; Leech, Robert
2017-01-01
Part of the process of EEG microstate estimation involves clustering EEG channel data at the global field power (GFP) maxima, very commonly using a modified K-means approach. Clustering has also been done deterministically, despite there being uncertainties in multiple stages of the microstate analysis, including the GFP peak definition, the clustering itself and in the post-clustering assignment of microstates back onto the EEG timecourse of interest. We perform a fully probabilistic microstate clustering and labeling, to account for these sources of uncertainty using the closest probabilistic analog to KM called Fuzzy C-means (FCM). We train softmax multi-layer perceptrons (MLPs) using the KM and FCM-inferred cluster assignments as target labels, to then allow for probabilistic labeling of the full EEG data instead of the usual correlation-based deterministic microstate label assignment typically used. We assess the merits of the probabilistic analysis vs. the deterministic approaches in EEG data recorded while participants perform real or imagined motor movements from a publicly available data set of 109 subjects. Though FCM group template maps that are almost topographically identical to KM were found, there is considerable uncertainty in the subsequent assignment of microstate labels. In general, imagined motor movements are less predictable on a time point-by-time point basis, possibly reflecting the more exploratory nature of the brain state during imagined, compared to during real motor movements. We find that some relationships may be more evident using FCM than using KM and propose that future microstate analysis should preferably be performed probabilistically rather than deterministically, especially in situations such as with brain computer interfaces, where both training and applying models of microstates need to account for uncertainty. Probabilistic neural network-driven microstate assignment has a number of advantages that we have discussed, which are likely to be further developed and exploited in future studies. In conclusion, probabilistic clustering and a probabilistic neural network-driven approach to microstate analysis is likely to better model and reveal details and the variability hidden in current deterministic and binarized microstate assignment and analyses.
Dinov, Martin; Leech, Robert
2017-01-01
Part of the process of EEG microstate estimation involves clustering EEG channel data at the global field power (GFP) maxima, very commonly using a modified K-means approach. Clustering has also been done deterministically, despite there being uncertainties in multiple stages of the microstate analysis, including the GFP peak definition, the clustering itself and in the post-clustering assignment of microstates back onto the EEG timecourse of interest. We perform a fully probabilistic microstate clustering and labeling, to account for these sources of uncertainty using the closest probabilistic analog to KM called Fuzzy C-means (FCM). We train softmax multi-layer perceptrons (MLPs) using the KM and FCM-inferred cluster assignments as target labels, to then allow for probabilistic labeling of the full EEG data instead of the usual correlation-based deterministic microstate label assignment typically used. We assess the merits of the probabilistic analysis vs. the deterministic approaches in EEG data recorded while participants perform real or imagined motor movements from a publicly available data set of 109 subjects. Though FCM group template maps that are almost topographically identical to KM were found, there is considerable uncertainty in the subsequent assignment of microstate labels. In general, imagined motor movements are less predictable on a time point-by-time point basis, possibly reflecting the more exploratory nature of the brain state during imagined, compared to during real motor movements. We find that some relationships may be more evident using FCM than using KM and propose that future microstate analysis should preferably be performed probabilistically rather than deterministically, especially in situations such as with brain computer interfaces, where both training and applying models of microstates need to account for uncertainty. Probabilistic neural network-driven microstate assignment has a number of advantages that we have discussed, which are likely to be further developed and exploited in future studies. In conclusion, probabilistic clustering and a probabilistic neural network-driven approach to microstate analysis is likely to better model and reveal details and the variability hidden in current deterministic and binarized microstate assignment and analyses. PMID:29163110
Probabilistic Multi-Hazard Assessment of Dry Cask Structures
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bencturk, Bora; Padgett, Jamie; Uddin, Rizwan
systems the concrete shall not only provide shielding but insures stability of the upright canister, facilitates anchoring, allows ventilation, and provides physical protection against theft, severe weather and natural (seismic) as well as man-made events (blast incidences). Given the need to remain functional for 40 years or even longer in case of interim storage, the concrete outerpack and the internal canister components need to be evaluated with regard to their long-term ability to perform their intended design functions. Just as evidenced by deteriorating concrete bridges, there are reported visible degradation mechanisms of dry storage systems especially when high corrosive environmentsmore » are considered in maritime locations. The degradation of reinforced concrete is caused by multiple physical and chemical mechanisms, which may be summarized under the heading of environmental aging. The underlying hygro-thermal transport processes are accelerated by irradiation effects, hence creep and shrinkage need to include the effect of chloride penetration, alkali aggregate reaction as well as corrosion of the reinforcing steel. In light of the above, the two main objectives of this project are to (1) develop a probabilistic multi-hazard assessment framework, and (2) through experimental and numerical research perform a comprehensive assessment under combined earthquake loads and aging induced deterioration, which will also provide data for the development and validation of the probabilistic framework.« less
PROBABILISTIC MODELING FOR ADVANCED HUMAN EXPOSURE ASSESSMENT
Human exposures to environmental pollutants widely vary depending on the emission patterns that result in microenvironmental pollutant concentrations, as well as behavioral factors that determine the extent of an individual's contact with these pollutants. Probabilistic human exp...
NASA Astrophysics Data System (ADS)
Shi, Lei; Wei, Jia-Hua; Li, Yun-Xia; Ma, Li-Hua; Xue, Yang; Luo, Jun-Wen
2017-04-01
We propose a novel scheme to probabilistically transmit an arbitrary unknown two-qubit quantum state via Positive Operator-Valued Measurement with the help of two partially entangled states. In this scheme, the teleportation with two senders and two receives can be realized when the information of non-maximally entangled states is only available for the senders. Furthermore, the concrete implementation processes of this proposal are presented, meanwhile the classical communication cost and the successful probability of our scheme are calculated. Supported by the National Natural Science Foundation of China under Grant Nos. 60974037, 61134008, 11074307, and 61273202
Reliability-Based Control Design for Uncertain Systems
NASA Technical Reports Server (NTRS)
Crespo, Luis G.; Kenny, Sean P.
2005-01-01
This paper presents a robust control design methodology for systems with probabilistic parametric uncertainty. Control design is carried out by solving a reliability-based multi-objective optimization problem where the probability of violating design requirements is minimized. Simultaneously, failure domains are optimally enlarged to enable global improvements in the closed-loop performance. To enable an efficient numerical implementation, a hybrid approach for estimating reliability metrics is developed. This approach, which integrates deterministic sampling and asymptotic approximations, greatly reduces the numerical burden associated with complex probabilistic computations without compromising the accuracy of the results. Examples using output-feedback and full-state feedback with state estimation are used to demonstrate the ideas proposed.
Thakur, Krishan Gopal; Jaiswal, Ravi Kumar; Shukla, Jinal K; Praveena, T; Gopal, B
2010-12-01
The function of a protein in a cell often involves coordinated interactions with one or several regulatory partners. It is thus imperative to characterize a protein both in isolation as well as in the context of its complex with an interacting partner. High resolution structural information determined by X-ray crystallography and Nuclear Magnetic Resonance offer the best route to characterize protein complexes. These techniques, however, require highly purified and homogenous protein samples at high concentration. This requirement often presents a major hurdle for structural studies. Here we present a strategy based on co-expression and co-purification to obtain recombinant multi-protein complexes in the quantity and concentration range that can enable hitherto intractable structural projects. The feasibility of this strategy was examined using the σ factor/anti-σ factor protein complexes from Mycobacterium tuberculosis. The approach was successful across a wide range of σ factors and their cognate interacting partners. It thus appears likely that the analysis of these complexes based on variations in expression constructs and procedures for the purification and characterization of these recombinant protein samples would be widely applicable for other multi-protein systems. Copyright © 2010 Elsevier Inc. All rights reserved.
Inference for Continuous-Time Probabilistic Programming
2017-12-01
Parzen window density estimator to jointly model the inter-camera travel time intervals, locations of exit/entrances, and velocities of ob- jects...asked to travel across the scene multiple times . Even in such a scenario they formed groups and made social interactions, which Fig. 7: Topology of...INFERENCE FOR CONTINUOUS- TIME PROBABILISTIC PROGRAMMING UNIVERSITY OF CALIFORNIA AT RIVERSIDE DECEMBER 2017 FINAL TECHNICAL REPORT APPROVED FOR
Probabilistic risk models for multiple disturbances: an example of forest insects and wildfires
Haiganoush K. Preisler; Alan A. Ager; Jane L. Hayes
2010-01-01
Building probabilistic risk models for highly random forest disturbances like wildfire and forest insect outbreaks is a challenging. Modeling the interactions among natural disturbances is even more difficult. In the case of wildfire and forest insects, we looked at the probability of a large fire given an insect outbreak and also the incidence of insect outbreaks...
NASA Astrophysics Data System (ADS)
Ren, Y.
2017-12-01
Context Land surface temperatures (LSTs) spatio-temporal distribution pattern of urban forests are influenced by many ecological factors; the identification of interaction between these factors can improve simulations and predictions of spatial patterns of urban cold islands. This quantitative research requires an integrated method that combines multiple sources data with spatial statistical analysis. Objectives The purpose of this study was to clarify urban forest LST influence interaction between anthropogenic activities and multiple ecological factors using cluster analysis of hot and cold spots and Geogdetector model. We introduced the hypothesis that anthropogenic activity interacts with certain ecological factors, and their combination influences urban forests LST. We also assumed that spatio-temporal distributions of urban forest LST should be similar to those of ecological factors and can be represented quantitatively. Methods We used Jinjiang as a representative city in China as a case study. Population density was employed to represent anthropogenic activity. We built up a multi-source data (forest inventory, digital elevation models (DEM), population, and remote sensing imagery) on a unified urban scale to support urban forest LST influence interaction research. Through a combination of spatial statistical analysis results, multi-source spatial data, and Geogdetector model, the interaction mechanisms of urban forest LST were revealed. Results Although different ecological factors have different influences on forest LST, in two periods with different hot spots and cold spots, the patch area and dominant tree species were the main factors contributing to LST clustering in urban forests. The interaction between anthropogenic activity and multiple ecological factors increased LST in urban forest stands, linearly and nonlinearly. Strong interactions between elevation and dominant species were generally observed and were prevalent in either hot or cold spots areas in different years. Conclusions In conclusion, a combination of spatial statistics and GeogDetector models should be effective for quantitatively evaluating interactive relationships among ecological factors, anthropogenic activity and LST.
NASA Astrophysics Data System (ADS)
Tonini, Roberto; Sandri, Laura; Costa, Antonio; Selva, Jacopo
2014-05-01
Campi Flegrei (CF) is a large volcanic field located west of the Gulf of Naples, characterized by a wide and almost circular caldera which is partially submerged beneath the Gulf of Pozzuoli. It is known that the magma-water interaction is a key element to determine the character of submarine eruptions and their impact on the surrounding areas, but this phenomenon is still not well understood and it is rarely considered in hazard assessment. The aim of the present work is to present a preliminary study of the effect of the sea on the tephra fall hazard from CF on the municipality of Naples, by introducing a variability in the probability of tephra production according to the eruptive scale (defined on the basis of the erupted volume) and the depth of the opening submerged vents. Four different Probabilistic Volcanic Hazard Assessment (PVHA) models have been defined through the application of the model BET_VH at CF, by accounting for different modeling procedures and assumptions for the submerged part of the caldera. In particular, we take into account: 1) the effect of the sea as null, i.e. as if the water were not present; 2) the effect of the sea as a cap that totally blocks the explosivity of eruptions and consequently the tephra production; 3) an ensemble model between the two models described at the previous points 1) and 2); 4) a variable probability of tephra production depending on the depth of the submerged vent. The PVHA models are then input to pyPHaz, a tool developed and designed at INGV to visualize, analyze and merge into ensemble models PVHA's results and, potentially, any other kind of probabilistic hazard assessment, both natural and anthropic, in order to evaluate the importance of considering a variability among subaerial and submerged vents on tephra fallout hazard from CF in Naples. The analysis is preliminary and does not pretend to be exhaustive, but on one hand it represents a starting point for future works; on the other hand, it is a good case study to show the potentiality of the pyPHaz tool that, thanks to a dedicated Graphical User Interface (GUI), allows to interactively manage and visualize results of probabilistic hazards (hazard curves together with probability and hazard maps for different levels of uncertainties), and to compare or merge different hazard models producing ensemble models. This work has been developed in the framework of two Italian projects, "ByMuR (Bayesian Multi-Risk Assessment: a case study for natural risks in the city of Naples)" funded by the Italian Ministry of Education, Universities and Research (MIUR), and "V1: Probabilistic Volcanic Hazard Assessments" funded by the Italian Department of Civil Protection (DPC).
Formalizing Probabilistic Safety Claims
NASA Technical Reports Server (NTRS)
Herencia-Zapana, Heber; Hagen, George E.; Narkawicz, Anthony J.
2011-01-01
A safety claim for a system is a statement that the system, which is subject to hazardous conditions, satisfies a given set of properties. Following work by John Rushby and Bev Littlewood, this paper presents a mathematical framework that can be used to state and formally prove probabilistic safety claims. It also enables hazardous conditions, their uncertainties, and their interactions to be integrated into the safety claim. This framework provides a formal description of the probabilistic composition of an arbitrary number of hazardous conditions and their effects on system behavior. An example is given of a probabilistic safety claim for a conflict detection algorithm for aircraft in a 2D airspace. The motivation for developing this mathematical framework is that it can be used in an automated theorem prover to formally verify safety claims.
A computational intelligent approach to multi-factor analysis of violent crime information system
NASA Astrophysics Data System (ADS)
Liu, Hongbo; Yang, Chao; Zhang, Meng; McLoone, Seán; Sun, Yeqing
2017-02-01
Various scientific studies have explored the causes of violent behaviour from different perspectives, with psychological tests, in particular, applied to the analysis of crime factors. The relationship between bi-factors has also been extensively studied including the link between age and crime. In reality, many factors interact to contribute to criminal behaviour and as such there is a need to have a greater level of insight into its complex nature. In this article we analyse violent crime information systems containing data on psychological, environmental and genetic factors. Our approach combines elements of rough set theory with fuzzy logic and particle swarm optimisation to yield an algorithm and methodology that can effectively extract multi-knowledge from information systems. The experimental results show that our approach outperforms alternative genetic algorithm and dynamic reduct-based techniques for reduct identification and has the added advantage of identifying multiple reducts and hence multi-knowledge (rules). Identified rules are consistent with classical statistical analysis of violent crime data and also reveal new insights into the interaction between several factors. As such, the results are helpful in improving our understanding of the factors contributing to violent crime and in highlighting the existence of hidden and intangible relationships between crime factors.
Wu, Dingming; Wang, Dongfang; Zhang, Michael Q; Gu, Jin
2015-12-01
One major goal of large-scale cancer omics study is to identify molecular subtypes for more accurate cancer diagnoses and treatments. To deal with high-dimensional cancer multi-omics data, a promising strategy is to find an effective low-dimensional subspace of the original data and then cluster cancer samples in the reduced subspace. However, due to data-type diversity and big data volume, few methods can integrative and efficiently find the principal low-dimensional manifold of the high-dimensional cancer multi-omics data. In this study, we proposed a novel low-rank approximation based integrative probabilistic model to fast find the shared principal subspace across multiple data types: the convexity of the low-rank regularized likelihood function of the probabilistic model ensures efficient and stable model fitting. Candidate molecular subtypes can be identified by unsupervised clustering hundreds of cancer samples in the reduced low-dimensional subspace. On testing datasets, our method LRAcluster (low-rank approximation based multi-omics data clustering) runs much faster with better clustering performances than the existing method. Then, we applied LRAcluster on large-scale cancer multi-omics data from TCGA. The pan-cancer analysis results show that the cancers of different tissue origins are generally grouped as independent clusters, except squamous-like carcinomas. While the single cancer type analysis suggests that the omics data have different subtyping abilities for different cancer types. LRAcluster is a very useful method for fast dimension reduction and unsupervised clustering of large-scale multi-omics data. LRAcluster is implemented in R and freely available via http://bioinfo.au.tsinghua.edu.cn/software/lracluster/ .
USDA-ARS?s Scientific Manuscript database
Background/Question/Methods Standardized monitoring data collection efforts using a probabilistic sample design, such as in the Bureau of Land Management’s (BLM) Assessment, Inventory, and Monitoring (AIM) Strategy, provide a core suite of ecological indicators, maximize data collection efficiency,...
NASA Astrophysics Data System (ADS)
Naseri Kouzehgarani, Asal
2009-12-01
Most models of aircraft trajectories are non-linear and stochastic in nature; and their internal parameters are often poorly defined. The ability to model, simulate and analyze realistic air traffic management conflict detection scenarios in a scalable, composable, multi-aircraft fashion is an extremely difficult endeavor. Accurate techniques for aircraft mode detection are critical in order to enable the precise projection of aircraft conflicts, and for the enactment of altitude separation resolution strategies. Conflict detection is an inherently probabilistic endeavor; our ability to detect conflicts in a timely and accurate manner over a fixed time horizon is traded off against the increased human workload created by false alarms---that is, situations that would not develop into an actual conflict, or would resolve naturally in the appropriate time horizon-thereby introducing a measure of probabilistic uncertainty in any decision aid fashioned to assist air traffic controllers. The interaction of the continuous dynamics of the aircraft, used for prediction purposes, with the discrete conflict detection logic gives rise to the hybrid nature of the overall system. The introduction of the probabilistic element, common to decision alerting and aiding devices, places the conflict detection and resolution problem in the domain of probabilistic hybrid phenomena. A hidden Markov model (HMM) has two stochastic components: a finite-state Markov chain and a finite set of output probability distributions. In other words an unobservable stochastic process (hidden) that can only be observed through another set of stochastic processes that generate the sequence of observations. The problem of self separation in distributed air traffic management reduces to the ability of aircraft to communicate state information to neighboring aircraft, as well as model the evolution of aircraft trajectories between communications, in the presence of probabilistic uncertain dynamics as well as partially observable and uncertain data. We introduce the Hybrid Hidden Markov Modeling (HHMM) formalism to enable the prediction of the stochastic aircraft states (and thus, potential conflicts), by combining elements of the probabilistic timed input output automaton and the partially observable Markov decision process frameworks, along with the novel addition of a Markovian scheduler to remove the non-deterministic elements arising from the enabling of several actions simultaneously. Comparisons of aircraft in level, climbing/descending and turning flight are performed, and unknown flight track data is evaluated probabilistically against the tuned model in order to assess the effectiveness of the model in detecting the switch between multiple flight modes for a given aircraft. This also allows for the generation of probabilistic distribution over the execution traces of the hybrid hidden Markov model, which then enables the prediction of the states of aircraft based on partially observable and uncertain data. Based on the composition properties of the HHMM, we study a decentralized air traffic system where aircraft are moving along streams and can perform cruise, accelerate, climb and turn maneuvers. We develop a common decentralized policy for conflict avoidance with spatially distributed agents (aircraft in the sky) and assure its safety properties via correctness proofs.
A Probabilistic Model of Social Working Memory for Information Retrieval in Social Interactions.
Li, Liyuan; Xu, Qianli; Gan, Tian; Tan, Cheston; Lim, Joo-Hwee
2018-05-01
Social working memory (SWM) plays an important role in navigating social interactions. Inspired by studies in psychology, neuroscience, cognitive science, and machine learning, we propose a probabilistic model of SWM to mimic human social intelligence for personal information retrieval (IR) in social interactions. First, we establish a semantic hierarchy as social long-term memory to encode personal information. Next, we propose a semantic Bayesian network as the SWM, which integrates the cognitive functions of accessibility and self-regulation. One subgraphical model implements the accessibility function to learn the social consensus about IR-based on social information concept, clustering, social context, and similarity between persons. Beyond accessibility, one more layer is added to simulate the function of self-regulation to perform the personal adaptation to the consensus based on human personality. Two learning algorithms are proposed to train the probabilistic SWM model on a raw dataset of high uncertainty and incompleteness. One is an efficient learning algorithm of Newton's method, and the other is a genetic algorithm. Systematic evaluations show that the proposed SWM model is able to learn human social intelligence effectively and outperforms the baseline Bayesian cognitive model. Toward real-world applications, we implement our model on Google Glass as a wearable assistant for social interaction.
Electromagnetic Compatibility (EMC) in Microelectronics.
1983-02-01
Fault Tree Analysis", System Saftey Symposium, June 8-9, 1965, Seattle: The Boeing Company . 12. Fussell, J.B., "Fault Tree Analysis-Concepts and...procedure for assessing EMC in microelectronics and for applying DD, 1473 EOiTO OP I, NOV6 IS OESOL.ETE UNCLASSIFIED SECURITY CLASSIFICATION OF THIS...CRITERIA 2.1 Background 2 2.2 The Probabilistic Nature of EMC 2 2.3 The Probabilistic Approach 5 2.4 The Compatibility Factor 6 3 APPLYING PROBABILISTIC
Hammoud, Riad I.; Sahin, Cem S.; Blasch, Erik P.; Rhodes, Bradley J.; Wang, Tao
2014-01-01
We describe two advanced video analysis techniques, including video-indexed by voice annotations (VIVA) and multi-media indexing and explorer (MINER). VIVA utilizes analyst call-outs (ACOs) in the form of chat messages (voice-to-text) to associate labels with video target tracks, to designate spatial-temporal activity boundaries and to augment video tracking in challenging scenarios. Challenging scenarios include low-resolution sensors, moving targets and target trajectories obscured by natural and man-made clutter. MINER includes: (1) a fusion of graphical track and text data using probabilistic methods; (2) an activity pattern learning framework to support querying an index of activities of interest (AOIs) and targets of interest (TOIs) by movement type and geolocation; and (3) a user interface to support streaming multi-intelligence data processing. We also present an activity pattern learning framework that uses the multi-source associated data as training to index a large archive of full-motion videos (FMV). VIVA and MINER examples are demonstrated for wide aerial/overhead imagery over common data sets affording an improvement in tracking from video data alone, leading to 84% detection with modest misdetection/false alarm results due to the complexity of the scenario. The novel use of ACOs and chat messages in video tracking paves the way for user interaction, correction and preparation of situation awareness reports. PMID:25340453
Assessing Uncertainties in Surface Water Security: A Probabilistic Multi-model Resampling approach
NASA Astrophysics Data System (ADS)
Rodrigues, D. B. B.
2015-12-01
Various uncertainties are involved in the representation of processes that characterize interactions between societal needs, ecosystem functioning, and hydrological conditions. Here, we develop an empirical uncertainty assessment of water security indicators that characterize scarcity and vulnerability, based on a multi-model and resampling framework. We consider several uncertainty sources including those related to: i) observed streamflow data; ii) hydrological model structure; iii) residual analysis; iv) the definition of Environmental Flow Requirement method; v) the definition of critical conditions for water provision; and vi) the critical demand imposed by human activities. We estimate the overall uncertainty coming from the hydrological model by means of a residual bootstrap resampling approach, and by uncertainty propagation through different methodological arrangements applied to a 291 km² agricultural basin within the Cantareira water supply system in Brazil. Together, the two-component hydrograph residual analysis and the block bootstrap resampling approach result in a more accurate and precise estimate of the uncertainty (95% confidence intervals) in the simulated time series. We then compare the uncertainty estimates associated with water security indicators using a multi-model framework and provided by each model uncertainty estimation approach. The method is general and can be easily extended forming the basis for meaningful support to end-users facing water resource challenges by enabling them to incorporate a viable uncertainty analysis into a robust decision making process.
Hammoud, Riad I; Sahin, Cem S; Blasch, Erik P; Rhodes, Bradley J; Wang, Tao
2014-10-22
We describe two advanced video analysis techniques, including video-indexed by voice annotations (VIVA) and multi-media indexing and explorer (MINER). VIVA utilizes analyst call-outs (ACOs) in the form of chat messages (voice-to-text) to associate labels with video target tracks, to designate spatial-temporal activity boundaries and to augment video tracking in challenging scenarios. Challenging scenarios include low-resolution sensors, moving targets and target trajectories obscured by natural and man-made clutter. MINER includes: (1) a fusion of graphical track and text data using probabilistic methods; (2) an activity pattern learning framework to support querying an index of activities of interest (AOIs) and targets of interest (TOIs) by movement type and geolocation; and (3) a user interface to support streaming multi-intelligence data processing. We also present an activity pattern learning framework that uses the multi-source associated data as training to index a large archive of full-motion videos (FMV). VIVA and MINER examples are demonstrated for wide aerial/overhead imagery over common data sets affording an improvement in tracking from video data alone, leading to 84% detection with modest misdetection/false alarm results due to the complexity of the scenario. The novel use of ACOs and chat Sensors 2014, 14 19844 messages in video tracking paves the way for user interaction, correction and preparation of situation awareness reports.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ely, Geoffrey P.
2013-10-31
This project uses dynamic rupture simulations to investigate high-frequency seismic energy generation. The relevant phenomena (frictional breakdown, shear heating, effective normal-stress fluctuations, material damage, etc.) controlling rupture are strongly interacting and span many orders of magnitude in spatial scale, requiring highresolution simulations that couple disparate physical processes (e.g., elastodynamics, thermal weakening, pore-fluid transport, and heat conduction). Compounding the computational challenge, we know that natural faults are not planar, but instead have roughness that can be approximated by power laws potentially leading to large, multiscale fluctuations in normal stress. The capacity to perform 3D rupture simulations that couple these processes willmore » provide guidance for constructing appropriate source models for high-frequency ground motion simulations. The improved rupture models from our multi-scale dynamic rupture simulations will be used to conduct physicsbased (3D waveform modeling-based) probabilistic seismic hazard analysis (PSHA) for California. These calculation will provide numerous important seismic hazard results, including a state-wide extended earthquake rupture forecast with rupture variations for all significant events, a synthetic seismogram catalog for thousands of scenario events and more than 5000 physics-based seismic hazard curves for California.« less
Numerical solutions of 2-D multi-stage rotor/stator unsteady flow interactions
NASA Astrophysics Data System (ADS)
Yang, R.-J.; Lin, S.-J.
1991-01-01
The Rai method of single-stage rotor/stator flow interaction is extended to handle multistage configurations. In this study, a two-dimensional Navier-Stokes multi-zone approach was used to investigate unsteady flow interactions within two multistage axial turbines. The governing equations are solved by an iterative, factored, implicit finite-difference, upwind algorithm. Numerical accuracy is checked by investigating the effect of time step size, the effect of subiteration in the Newton-Raphson technique, and the effect of full viscous versus thin-layer approximation. Computer results compared well with experimental data. Unsteady flow interactions, wake cutting, and the associated evolution of vortical entities are discussed.
P. B. Woodbury; D. A. Weinstein
2010-01-01
We reviewed probabilistic regional risk assessment methodologies to identify the methods that are currently in use and are capable of estimating threats to ecosystems from fire and fuels, invasive species, and their interactions with stressors. In a companion chapter, we highlight methods useful for evaluating risks from fire. In this chapter, we highlight methods...
Mifsud, Borbala; Martincorena, Inigo; Darbo, Elodie; Sugar, Robert; Schoenfelder, Stefan; Fraser, Peter; Luscombe, Nicholas M
2017-01-01
Hi-C is one of the main methods for investigating spatial co-localisation of DNA in the nucleus. However, the raw sequencing data obtained from Hi-C experiments suffer from large biases and spurious contacts, making it difficult to identify true interactions. Existing methods use complex models to account for biases and do not provide a significance threshold for detecting interactions. Here we introduce a simple binomial probabilistic model that resolves complex biases and distinguishes between true and false interactions. The model corrects biases of known and unknown origin and yields a p-value for each interaction, providing a reliable threshold based on significance. We demonstrate this experimentally by testing the method against a random ligation dataset. Our method outperforms previous methods and provides a statistical framework for further data analysis, such as comparisons of Hi-C interactions between different conditions. GOTHiC is available as a BioConductor package (http://www.bioconductor.org/packages/release/bioc/html/GOTHiC.html).
Wang, Zhuo; Danziger, Samuel A; Heavner, Benjamin D; Ma, Shuyi; Smith, Jennifer J; Li, Song; Herricks, Thurston; Simeonidis, Evangelos; Baliga, Nitin S; Aitchison, John D; Price, Nathan D
2017-05-01
Gene regulatory and metabolic network models have been used successfully in many organisms, but inherent differences between them make networks difficult to integrate. Probabilistic Regulation Of Metabolism (PROM) provides a partial solution, but it does not incorporate network inference and underperforms in eukaryotes. We present an Integrated Deduced And Metabolism (IDREAM) method that combines statistically inferred Environment and Gene Regulatory Influence Network (EGRIN) models with the PROM framework to create enhanced metabolic-regulatory network models. We used IDREAM to predict phenotypes and genetic interactions between transcription factors and genes encoding metabolic activities in the eukaryote, Saccharomyces cerevisiae. IDREAM models contain many fewer interactions than PROM and yet produce significantly more accurate growth predictions. IDREAM consistently outperformed PROM using any of three popular yeast metabolic models and across three experimental growth conditions. Importantly, IDREAM's enhanced accuracy makes it possible to identify subtle synthetic growth defects. With experimental validation, these novel genetic interactions involving the pyruvate dehydrogenase complex suggested a new role for fatty acid-responsive factor Oaf1 in regulating acetyl-CoA production in glucose grown cells.
NASA Astrophysics Data System (ADS)
De Rango, Floriano; Lupia, Andrea
2016-05-01
MANETs allow mobile nodes communicating to each other using the wireless medium. A key aspect of these kind of networks is the security, because their setup is done without an infrastructure, so external nodes could interfere in the communication. Mobile nodes could be compromised, misbehaving during the multi-hop transmission of data, or they could have a selfish behavior to save energy, which is another important constraint in MANETs. The detection of these behaviors need a framework that takes into account the latest interactions among nodes, so malicious or selfish nodes could be detected also if their behavior is changed over time. The monitoring activity increases the energy consumption, so our proposal takes into account this issue reducing the energy required by the monitoring system, keeping the effectiveness of the intrusion detection system. The results show an improvement in the saved energy, improving the detection performance too.
NASA Astrophysics Data System (ADS)
Dialynas, Y. G.; Arnone, E.; Noto, L. V.; Bras, R. L.
2013-12-01
Slope stability depends on geotechnical and hydrological factors that exhibit wide natural spatial variability, yet sufficient measurements of the related parameters are rarely available over entire study areas. The uncertainty associated with the inability to fully characterize hydrologic behavior has an impact on any attempt to model landslide hazards. This work suggests a way to systematically account for this uncertainty in coupled distributed hydrological-stability models for shallow landslide hazard assessment. A probabilistic approach for the prediction of rainfall-triggered landslide occurrence at basin scale was implemented in an existing distributed eco-hydrological and landslide model, tRIBS-VEGGIE -landslide (Triangulated Irregular Network (TIN)-based Real-time Integrated Basin Simulator - VEGetation Generator for Interactive Evolution). More precisely, we upgraded tRIBS-VEGGIE- landslide to assess the likelihood of shallow landslides by accounting for uncertainty related to geotechnical and hydrological factors that directly affect slope stability. Natural variability of geotechnical soil characteristics was considered by randomizing soil cohesion and friction angle. Hydrological uncertainty related to the estimation of matric suction was taken into account by considering soil retention parameters as correlated random variables. The probability of failure is estimated through an assumed theoretical Factor of Safety (FS) distribution, conditioned on soil moisture content. At each cell, the temporally variant FS statistics are approximated by the First Order Second Moment (FOSM) method, as a function of parameters statistical properties. The model was applied on the Rio Mameyes Basin, located in the Luquillo Experimental Forest in Puerto Rico, where previous landslide analyses have been carried out. At each time step, model outputs include the probability of landslide occurrence across the basin, and the most probable depth of failure at each soil column. The use of the proposed probabilistic approach for shallow landslide prediction is able to reveal and quantify landslide risk at slopes assessed as stable by simpler deterministic methods.
Regularizing Unpredictable Variation: Evidence from a Natural Language Setting
ERIC Educational Resources Information Center
Hendricks, Alison Eisel; Miller, Karen; Jackson, Carrie N.
2018-01-01
While previous sociolinguistic research has demonstrated that children faithfully acquire probabilistic input constrained by sociolinguistic and linguistic factors (e.g., gender and socioeconomic status), research suggests children regularize inconsistent input-probabilistic input that is not sociolinguistically constrained (e.g., Hudson Kam &…
Biological adaptive control model: a mechanical analogue of multi-factorial bone density adaptation.
Davidson, Peter L; Milburn, Peter D; Wilson, Barry D
2004-03-21
The mechanism of how bone adapts to every day demands needs to be better understood to gain insight into situations in which the musculoskeletal system is perturbed. This paper offers a novel multi-factorial mathematical model of bone density adaptation which combines previous single-factor models in a single adaptation system as a means of gaining this insight. Unique aspects of the model include provision for interaction between factors and an estimation of the relative contribution of each factor. This interacting system is considered analogous to a Newtonian mechanical system and the governing response equation is derived as a linear version of the adaptation process. The transient solution to sudden environmental change is found to be exponential or oscillatory depending on the balance between cellular activation and deactivation frequencies.
A generic multi-hazard and multi-risk framework and its application illustrated in a virtual city
NASA Astrophysics Data System (ADS)
Mignan, Arnaud; Euchner, Fabian; Wiemer, Stefan
2013-04-01
We present a generic framework to implement hazard correlations in multi-risk assessment strategies. We consider hazard interactions (process I), time-dependent vulnerability (process II) and time-dependent exposure (process III). Our approach is based on the Monte Carlo method to simulate a complex system, which is defined from assets exposed to a hazardous region. We generate 1-year time series, sampling from a stochastic set of events. Each time series corresponds to one risk scenario and the analysis of multiple time series allows for the probabilistic assessment of losses and for the recognition of more or less probable risk paths. Each sampled event is associated to a time of occurrence, a damage footprint and a loss footprint. The occurrence of an event depends on its rate, which is conditional on the occurrence of past events (process I, concept of correlation matrix). Damage depends on the hazard intensity and on the vulnerability of the asset, which is conditional on previous damage on that asset (process II). Losses are the product of damage and exposure value, this value being the original exposure minus previous losses (process III, no reconstruction considered). The Monte Carlo method allows for a straightforward implementation of uncertainties and for implementation of numerous interactions, which is otherwise challenging in an analytical multi-risk approach. We apply our framework to a synthetic data set, defined by a virtual city within a virtual region. This approach gives the opportunity to perform multi-risk analyses in a controlled environment while not requiring real data, which may be difficultly accessible or simply unavailable to the public. Based on the heuristic approach, we define a 100 by 100 km region where earthquakes, volcanic eruptions, fluvial floods, hurricanes and coastal floods can occur. All hazards are harmonized to a common format. We define a 20 by 20 km city, composed of 50,000 identical buildings with a fixed economic value. Vulnerability curves are defined in terms of mean damage ratio as a function of hazard intensity. All data are based on simple equations found in the literature and on other simplifications. We show the impact of earthquake-earthquake interaction and hurricane-storm surge coupling, as well as of time-dependent vulnerability and exposure, on aggregated loss curves. One main result is the emergence of low probability-high consequences (extreme) events when correlations are implemented. While the concept of virtual city can suggest the theoretical benefits of multi-risk assessment for decision support, identifying their real-world practicality will require the study of real test sites.
Griffis, Joseph C; Allendorfer, Jane B; Szaflarski, Jerzy P
2016-01-15
Manual lesion delineation by an expert is the standard for lesion identification in MRI scans, but it is time-consuming and can introduce subjective bias. Alternative methods often require multi-modal MRI data, user interaction, scans from a control population, and/or arbitrary statistical thresholding. We present an approach for automatically identifying stroke lesions in individual T1-weighted MRI scans using naïve Bayes classification. Probabilistic tissue segmentation and image algebra were used to create feature maps encoding information about missing and abnormal tissue. Leave-one-case-out training and cross-validation was used to obtain out-of-sample predictions for each of 30 cases with left hemisphere stroke lesions. Our method correctly predicted lesion locations for 30/30 un-trained cases. Post-processing with smoothing (8mm FWHM) and cluster-extent thresholding (100 voxels) was found to improve performance. Quantitative evaluations of post-processed out-of-sample predictions on 30 cases revealed high spatial overlap (mean Dice similarity coefficient=0.66) and volume agreement (mean percent volume difference=28.91; Pearson's r=0.97) with manual lesion delineations. Our automated approach agrees with manual tracing. It provides an alternative to automated methods that require multi-modal MRI data, additional control scans, or user interaction to achieve optimal performance. Our fully trained classifier has applications in neuroimaging and clinical contexts. Copyright © 2015 Elsevier B.V. All rights reserved.
Information processing by networks of quantum decision makers
NASA Astrophysics Data System (ADS)
Yukalov, V. I.; Yukalova, E. P.; Sornette, D.
2018-02-01
We suggest a model of a multi-agent society of decision makers taking decisions being based on two criteria, one is the utility of the prospects and the other is the attractiveness of the considered prospects. The model is the generalization of quantum decision theory, developed earlier for single decision makers realizing one-step decisions, in two principal aspects. First, several decision makers are considered simultaneously, who interact with each other through information exchange. Second, a multistep procedure is treated, when the agents exchange information many times. Several decision makers exchanging information and forming their judgment, using quantum rules, form a kind of a quantum information network, where collective decisions develop in time as a result of information exchange. In addition to characterizing collective decisions that arise in human societies, such networks can describe dynamical processes occurring in artificial quantum intelligence composed of several parts or in a cluster of quantum computers. The practical usage of the theory is illustrated on the dynamic disjunction effect for which three quantitative predictions are made: (i) the probabilistic behavior of decision makers at the initial stage of the process is described; (ii) the decrease of the difference between the initial prospect probabilities and the related utility factors is proved; (iii) the existence of a common consensus after multiple exchange of information is predicted. The predicted numerical values are in very good agreement with empirical data.
Aviation Safety Risk Modeling: Lessons Learned From Multiple Knowledge Elicitation Sessions
NASA Technical Reports Server (NTRS)
Luxhoj, J. T.; Ancel, E.; Green, L. L.; Shih, A. T.; Jones, S. M.; Reveley, M. S.
2014-01-01
Aviation safety risk modeling has elements of both art and science. In a complex domain, such as the National Airspace System (NAS), it is essential that knowledge elicitation (KE) sessions with domain experts be performed to facilitate the making of plausible inferences about the possible impacts of future technologies and procedures. This study discusses lessons learned throughout the multiple KE sessions held with domain experts to construct probabilistic safety risk models for a Loss of Control Accident Framework (LOCAF), FLightdeck Automation Problems (FLAP), and Runway Incursion (RI) mishap scenarios. The intent of these safety risk models is to support a portfolio analysis of NASA's Aviation Safety Program (AvSP). These models use the flexible, probabilistic approach of Bayesian Belief Networks (BBNs) and influence diagrams to model the complex interactions of aviation system risk factors. Each KE session had a different set of experts with diverse expertise, such as pilot, air traffic controller, certification, and/or human factors knowledge that was elicited to construct a composite, systems-level risk model. There were numerous "lessons learned" from these KE sessions that deal with behavioral aggregation, conditional probability modeling, object-oriented construction, interpretation of the safety risk results, and model verification/validation that are presented in this paper.
Mei, Suyu
2012-10-07
Recent years have witnessed much progress in computational modeling for protein subcellular localization. However, there are far few computational models for predicting plant protein subcellular multi-localization. In this paper, we propose a multi-label multi-kernel transfer learning model for predicting multiple subcellular locations of plant proteins (MLMK-TLM). The method proposes a multi-label confusion matrix and adapts one-against-all multi-class probabilistic outputs to multi-label learning scenario, based on which we further extend our published work MK-TLM (multi-kernel transfer learning based on Chou's PseAAC formulation for protein submitochondria localization) for plant protein subcellular multi-localization. By proper homolog knowledge transfer, MLMK-TLM is applicable to novel plant protein subcellular localization in multi-label learning scenario. The experiments on plant protein benchmark dataset show that MLMK-TLM outperforms the baseline model. Unlike the existing models, MLMK-TLM also reports its misleading tendency, which is important for comprehensive survey of model's multi-labeling performance. Copyright © 2012 Elsevier Ltd. All rights reserved.
Cloud immersion building shielding factors for US residential structures.
Dickson, E D; Hamby, D M
2014-12-01
This paper presents validated building shielding factors designed for contemporary US housing-stock under an idealized, yet realistic, exposure scenario within a semi-infinite cloud of radioactive material. The building shielding factors are intended for use in emergency planning and level three probabilistic risk assessments for a variety of postulated radiological events in which a realistic assessment is necessary to better understand the potential risks for accident mitigation and emergency response planning. Factors are calculated from detailed computational housing-units models using the general-purpose Monte Carlo N-Particle computational code, MCNP5, and are benchmarked from a series of narrow- and broad-beam measurements analyzing the shielding effectiveness of ten common general-purpose construction materials and ten shielding models representing the primary weather barriers (walls and roofs) of likely US housing-stock. Each model was designed to scale based on common residential construction practices and include, to the extent practical, all structurally significant components important for shielding against ionizing radiation. Calculations were performed for floor-specific locations as well as for computing a weighted-average representative building shielding factor for single- and multi-story detached homes, both with and without basement, as well for single-wide manufactured housing-units.
Probabilistic Hazard Estimation at a Densely Urbanised Area: the Neaples Volcanoes
NASA Astrophysics Data System (ADS)
de Natale, G.; Mastrolorenzo, G.; Panizza, A.; Pappalardo, L.; Claudia, T.
2005-12-01
The Neaples volcanic area (Southern Italy), including Vesuvius, Campi Flegrei caldera and Ischia island, is the highest risk one in the World, where more than 2 million people live within about 10 km from an active volcanic vent. Such an extreme risk calls for accurate methodologies aimed to quantify it, in a probabilistic way, considering all the available volcanological information as well as modelling results. In fact, simple hazard maps based on the observation of deposits from past eruptions have the major problem that eruptive history generally samples a very limited number of possible outcomes, thus resulting almost meaningless to get the event probability in the area. This work describes a methodology making the best use (from a Bayesian point of view) of volcanological data and modelling results, to compute probabilistic hazard maps from multi-vent explosive eruptions. The method, which follows an approach recently developed by the same authors for pyroclastic flows hazard, has been here improved and extended to compute also fall-out hazard. The application of the method to the Neapolitan volcanic area, including the densely populated city of Naples, allows, for the first time, to get a global picture of the areal distribution for the main hazards from multi-vent explosive eruptions. From a joint consideration of the hazard contributions from all the three volcanic areas, new insight on the volcanic hazard distribution emerges, which will have strong implications for urban and emergency planning in the area.
A Generalized Mixture Framework for Multi-label Classification
Hong, Charmgil; Batal, Iyad; Hauskrecht, Milos
2015-01-01
We develop a novel probabilistic ensemble framework for multi-label classification that is based on the mixtures-of-experts architecture. In this framework, we combine multi-label classification models in the classifier chains family that decompose the class posterior distribution P(Y1, …, Yd|X) using a product of posterior distributions over components of the output space. Our approach captures different input–output and output–output relations that tend to change across data. As a result, we can recover a rich set of dependency relations among inputs and outputs that a single multi-label classification model cannot capture due to its modeling simplifications. We develop and present algorithms for learning the mixtures-of-experts models from data and for performing multi-label predictions on unseen data instances. Experiments on multiple benchmark datasets demonstrate that our approach achieves highly competitive results and outperforms the existing state-of-the-art multi-label classification methods. PMID:26613069
Robust Control Design for Systems With Probabilistic Uncertainty
NASA Technical Reports Server (NTRS)
Crespo, Luis G.; Kenny, Sean P.
2005-01-01
This paper presents a reliability- and robustness-based formulation for robust control synthesis for systems with probabilistic uncertainty. In a reliability-based formulation, the probability of violating design requirements prescribed by inequality constraints is minimized. In a robustness-based formulation, a metric which measures the tendency of a random variable/process to cluster close to a target scalar/function is minimized. A multi-objective optimization procedure, which combines stability and performance requirements in time and frequency domains, is used to search for robustly optimal compensators. Some of the fundamental differences between the proposed strategy and conventional robust control methods are: (i) unnecessary conservatism is eliminated since there is not need for convex supports, (ii) the most likely plants are favored during synthesis allowing for probabilistic robust optimality, (iii) the tradeoff between robust stability and robust performance can be explored numerically, (iv) the uncertainty set is closely related to parameters with clear physical meaning, and (v) compensators with improved robust characteristics for a given control structure can be synthesized.
Campbell, Kieran R; Yau, Christopher
2017-03-15
Modeling bifurcations in single-cell transcriptomics data has become an increasingly popular field of research. Several methods have been proposed to infer bifurcation structure from such data, but all rely on heuristic non-probabilistic inference. Here we propose the first generative, fully probabilistic model for such inference based on a Bayesian hierarchical mixture of factor analyzers. Our model exhibits competitive performance on large datasets despite implementing full Markov-Chain Monte Carlo sampling, and its unique hierarchical prior structure enables automatic determination of genes driving the bifurcation process. We additionally propose an Empirical-Bayes like extension that deals with the high levels of zero-inflation in single-cell RNA-seq data and quantify when such models are useful. We apply or model to both real and simulated single-cell gene expression data and compare the results to existing pseudotime methods. Finally, we discuss both the merits and weaknesses of such a unified, probabilistic approach in the context practical bioinformatics analyses.
PMHT Approach for Multi-Target Multi-Sensor Sonar Tracking in Clutter.
Li, Xiaohua; Li, Yaan; Yu, Jing; Chen, Xiao; Dai, Miao
2015-11-06
Multi-sensor sonar tracking has many advantages, such as the potential to reduce the overall measurement uncertainty and the possibility to hide the receiver. However, the use of multi-target multi-sensor sonar tracking is challenging because of the complexity of the underwater environment, especially the low target detection probability and extremely large number of false alarms caused by reverberation. In this work, to solve the problem of multi-target multi-sensor sonar tracking in the presence of clutter, a novel probabilistic multi-hypothesis tracker (PMHT) approach based on the extended Kalman filter (EKF) and unscented Kalman filter (UKF) is proposed. The PMHT can efficiently handle the unknown measurements-to-targets and measurements-to-transmitters data association ambiguity. The EKF and UKF are used to deal with the high degree of nonlinearity in the measurement model. The simulation results show that the proposed algorithm can improve the target tracking performance in a cluttered environment greatly, and its computational load is low.
ERIC Educational Resources Information Center
Palka, Sean
2015-01-01
This research details a methodology designed for creating content in support of various phishing prevention tasks including live exercises and detection algorithm research. Our system uses probabilistic context-free grammars (PCFG) and variable interpolation as part of a multi-pass method to create diverse and consistent phishing email content on…
Justice, N. B.; Sczesnak, A.; Hazen, T. C.; ...
2017-08-04
A central goal of microbial ecology is to identify and quantify the forces that lead to observed population distributions and dynamics. However, these forces, which include environmental selection, dispersal, and organism interactions, are often difficult to assess in natural environments. Here in this paper, we present a method that links microbial community structures with selective and stochastic forces through highly replicated subsampling and enrichment of a single environmental inoculum. Specifically, groundwater from a well-studied natural aquifer was serially diluted and inoculated into nearly 1,000 aerobic and anaerobic nitrate-reducing cultures, and the final community structures were evaluated with 16S rRNA genemore » amplicon sequencing. We analyzed the frequency and abundance of individual operational taxonomic units (OTUs) to understand how probabilistic immigration, relative fitness differences, environmental factors, and organismal interactions contributed to divergent distributions of community structures. We further used a most probable number (MPN) method to estimate the natural condition-dependent cultivable abundance of each of the nearly 400 OTU cultivated in our study and infer the relative fitness of each. Additionally, we infer condition-specific organism interactions and discuss how this high-replicate culturing approach is essential in dissecting the interplay between overlapping ecological forces and taxon-specific attributes that underpin microbial community assembly. IMPORTANCEThrough highly replicated culturing, in which inocula are subsampled from a single environmental sample, we empirically determine how selective forces, interspecific interactions, relative fitness, and probabilistic dispersal shape bacterial communities. These methods offer a novel approach to untangle not only interspecific interactions but also taxon-specific fitness differences that manifest across different cultivation conditions and lead to the selection and enrichment of specific organisms. Additionally, we provide a method for estimating the number of cultivable units of each OTU in the original sample through the MPN approach.« less
DOE Office of Scientific and Technical Information (OSTI.GOV)
Justice, N. B.; Sczesnak, A.; Hazen, T. C.
A central goal of microbial ecology is to identify and quantify the forces that lead to observed population distributions and dynamics. However, these forces, which include environmental selection, dispersal, and organism interactions, are often difficult to assess in natural environments. Here in this paper, we present a method that links microbial community structures with selective and stochastic forces through highly replicated subsampling and enrichment of a single environmental inoculum. Specifically, groundwater from a well-studied natural aquifer was serially diluted and inoculated into nearly 1,000 aerobic and anaerobic nitrate-reducing cultures, and the final community structures were evaluated with 16S rRNA genemore » amplicon sequencing. We analyzed the frequency and abundance of individual operational taxonomic units (OTUs) to understand how probabilistic immigration, relative fitness differences, environmental factors, and organismal interactions contributed to divergent distributions of community structures. We further used a most probable number (MPN) method to estimate the natural condition-dependent cultivable abundance of each of the nearly 400 OTU cultivated in our study and infer the relative fitness of each. Additionally, we infer condition-specific organism interactions and discuss how this high-replicate culturing approach is essential in dissecting the interplay between overlapping ecological forces and taxon-specific attributes that underpin microbial community assembly. IMPORTANCEThrough highly replicated culturing, in which inocula are subsampled from a single environmental sample, we empirically determine how selective forces, interspecific interactions, relative fitness, and probabilistic dispersal shape bacterial communities. These methods offer a novel approach to untangle not only interspecific interactions but also taxon-specific fitness differences that manifest across different cultivation conditions and lead to the selection and enrichment of specific organisms. Additionally, we provide a method for estimating the number of cultivable units of each OTU in the original sample through the MPN approach.« less
Time Dependent Data Mining in RAVEN
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cogliati, Joshua Joseph; Chen, Jun; Patel, Japan Ketan
RAVEN is a generic software framework to perform parametric and probabilistic analysis based on the response of complex system codes. The goal of this type of analyses is to understand the response of such systems in particular with respect their probabilistic behavior, to understand their predictability and drivers or lack of thereof. Data mining capabilities are the cornerstones to perform such deep learning of system responses. For this reason static data mining capabilities were added last fiscal year (FY 15). In real applications, when dealing with complex multi-scale, multi-physics systems it seems natural that, during transients, the relevance of themore » different scales, and physics, would evolve over time. For these reasons the data mining capabilities have been extended allowing their application over time. In this writing it is reported a description of the new RAVEN capabilities implemented with several simple analytical tests to explain their application and highlight the proper implementation. The report concludes with the application of those newly implemented capabilities to the analysis of a simulation performed with the Bison code.« less
van der Lei, Harry; Tenenbaum, Gershon
2012-12-01
Individual affect-related performance zones (IAPZs) method utilizing Kamata et al. (J Sport Exerc Psychol 24:189-208, 2002) probabilistic model of determining the individual zone of optimal functioning was utilized as idiosyncratic affective patterns during golf performance. To do so, three male golfers of a varsity golf team were observed during three rounds of golf competition. The investigation implemented a multi-modal assessment approach in which the probabilistic relationship between affective states and both, performance process and performance outcome, measures were determined. More specifically, introspective (i.e., verbal reports) and objective (heart rate and respiration rate) measures of arousal were incorporated to examine the relationships between arousal states and both, process components (i.e., routine consistency, timing), and outcome scores related to golf performance. Results revealed distinguishable and idiosyncratic IAPZs associated with physiological and introspective measures for each golfer. The associations between the IAPZs and decision-making or swing/stroke execution were strong and unique for each golfer. Results are elaborated using cognitive and affect-related concepts, and applications for practitioners are provided.
NASA Astrophysics Data System (ADS)
Fehenberger, Tobias
2018-02-01
This paper studies probabilistic shaping in a multi-span wavelength-division multiplexing optical fiber system with 64-ary quadrature amplitude modulation (QAM) input. In split-step fiber simulations and via an enhanced Gaussian noise model, three figures of merit are investigated, which are signal-to-noise ratio (SNR), achievable information rate (AIR) for capacity-achieving forward error correction (FEC) with bit-metric decoding, and the information rate achieved with low-density parity-check (LDPC) FEC. For the considered system parameters and different shaped input distributions, shaping is found to decrease the SNR by 0.3 dB yet simultaneously increases the AIR by up to 0.4 bit per 4D-symbol. The information rates of LDPC-coded modulation with shaped 64QAM input are improved by up to 0.74 bit per 4D-symbol, which is larger than the shaping gain when considering AIRs. This increase is attributed to the reduced coding gap of the higher-rate code that is used for decoding the nonuniform QAM input.
NASA Astrophysics Data System (ADS)
Viswanath, Satish; Bloch, B. Nicholas; Chappelow, Jonathan; Patel, Pratik; Rofsky, Neil; Lenkinski, Robert; Genega, Elizabeth; Madabhushi, Anant
2011-03-01
Currently, there is significant interest in developing methods for quantitative integration of multi-parametric (structural, functional) imaging data with the objective of building automated meta-classifiers to improve disease detection, diagnosis, and prognosis. Such techniques are required to address the differences in dimensionalities and scales of individual protocols, while deriving an integrated multi-parametric data representation which best captures all disease-pertinent information available. In this paper, we present a scheme called Enhanced Multi-Protocol Analysis via Intelligent Supervised Embedding (EMPrAvISE); a powerful, generalizable framework applicable to a variety of domains for multi-parametric data representation and fusion. Our scheme utilizes an ensemble of embeddings (via dimensionality reduction, DR); thereby exploiting the variance amongst multiple uncorrelated embeddings in a manner similar to ensemble classifier schemes (e.g. Bagging, Boosting). We apply this framework to the problem of prostate cancer (CaP) detection on 12 3 Tesla pre-operative in vivo multi-parametric (T2-weighted, Dynamic Contrast Enhanced, and Diffusion-weighted) magnetic resonance imaging (MRI) studies, in turn comprising a total of 39 2D planar MR images. We first align the different imaging protocols via automated image registration, followed by quantification of image attributes from individual protocols. Multiple embeddings are generated from the resultant high-dimensional feature space which are then combined intelligently to yield a single stable solution. Our scheme is employed in conjunction with graph embedding (for DR) and probabilistic boosting trees (PBTs) to detect CaP on multi-parametric MRI. Finally, a probabilistic pairwise Markov Random Field algorithm is used to apply spatial constraints to the result of the PBT classifier, yielding a per-voxel classification of CaP presence. Per-voxel evaluation of detection results against ground truth for CaP extent on MRI (obtained by spatially registering pre-operative MRI with available whole-mount histological specimens) reveals that EMPrAvISE yields a statistically significant improvement (AUC=0.77) over classifiers constructed from individual protocols (AUC=0.62, 0.62, 0.65, for T2w, DCE, DWI respectively) as well as one trained using multi-parametric feature concatenation (AUC=0.67).
The design of a multi-harmonic step-tunable gyrotron
NASA Astrophysics Data System (ADS)
Qi, Xiang-Bo; Du, Chao-Hai; Zhu, Juan-Feng; Pan, Shi; Liu, Pu-Kun
2017-03-01
The theoretical study of a step-tunable gyrotron controlled by successive excitation of multi-harmonic modes is presented in this paper. An axis-encircling electron beam is employed to eliminate the harmonic mode competition. Physics images are depicted to elaborate the multi-harmonic interaction mechanism in determining the operating parameters at which arbitrary harmonic tuning can be realized by magnetic field sweeping to achieve controlled multiband frequencies' radiation. An important principle is revealed that a weak coupling coefficient under a high-harmonic interaction can be compensated by a high Q-factor. To some extent, the complementation between the high Q-factor and weak coupling coefficient makes the high-harmonic mode potential to achieve high efficiency. Based on a previous optimized magnetic cusp gun, the multi-harmonic step-tunable gyrotron is feasible by using harmonic tuning of first-to-fourth harmonic modes. Multimode simulation shows that the multi-harmonic gyrotron can operate on the 34 GHz first-harmonic TE11 mode, 54 GHz second-harmonic TE21 mode, 74 GHz third-harmonic TE31 mode, and 94 GHz fourth-harmonic TE41 mode, corresponding to peak efficiencies of 28.6%, 35.7%, 17.1%, and 11.4%, respectively. The multi-harmonic step-tunable gyrotron provides new possibilities in millimeter-terahertz source development especially for advanced terahertz applications.
Ron, Gil; Globerson, Yuval; Moran, Dror; Kaplan, Tommy
2017-12-21
Proximity-ligation methods such as Hi-C allow us to map physical DNA-DNA interactions along the genome, and reveal its organization into topologically associating domains (TADs). As the Hi-C data accumulate, computational methods were developed for identifying domain borders in multiple cell types and organisms. Here, we present PSYCHIC, a computational approach for analyzing Hi-C data and identifying promoter-enhancer interactions. We use a unified probabilistic model to segment the genome into domains, which we then merge hierarchically and fit using a local background model, allowing us to identify over-represented DNA-DNA interactions across the genome. By analyzing the published Hi-C data sets in human and mouse, we identify hundreds of thousands of putative enhancers and their target genes, and compile an extensive genome-wide catalog of gene regulation in human and mouse. As we show, our predictions are highly enriched for ChIP-seq and DNA accessibility data, evolutionary conservation, eQTLs and other DNA-DNA interaction data.
NASA Astrophysics Data System (ADS)
Barbetta, Silvia; Coccia, Gabriele; Moramarco, Tommaso; Brocca, Luca; Todini, Ezio
2017-08-01
This work extends the multi-temporal approach of the Model Conditional Processor (MCP-MT) to the multi-model case and to the four Truncated Normal Distributions (TNDs) approach, demonstrating the improvement on the single-temporal one. The study is framed in the context of probabilistic Bayesian decision-making that is appropriate to take rational decisions on uncertain future outcomes. As opposed to the direct use of deterministic forecasts, the probabilistic forecast identifies a predictive probability density function that represents a fundamental knowledge on future occurrences. The added value of MCP-MT is the identification of the probability that a critical situation will happen within the forecast lead-time and when, more likely, it will occur. MCP-MT is thoroughly tested for both single-model and multi-model configurations at a gauged site on the Tiber River, central Italy. The stages forecasted by two operative deterministic models, STAFOM-RCM and MISDc, are considered for the study. The dataset used for the analysis consists of hourly data from 34 flood events selected on a time series of six years. MCP-MT improves over the original models' forecasts: the peak overestimation and the rising limb delayed forecast, characterizing MISDc and STAFOM-RCM respectively, are significantly mitigated, with a reduced mean error on peak stage from 45 to 5 cm and an increased coefficient of persistence from 0.53 up to 0.75. The results show that MCP-MT outperforms the single-temporal approach and is potentially useful for supporting decision-making because the exceedance probability of hydrometric thresholds within a forecast horizon and the most probable flooding time can be estimated.
Integrated Risk-Informed Decision-Making for an ALMR PRISM
DOE Office of Scientific and Technical Information (OSTI.GOV)
Muhlheim, Michael David; Belles, Randy; Denning, Richard S.
Decision-making is the process of identifying decision alternatives, assessing those alternatives based on predefined metrics, selecting an alternative (i.e., making a decision), and then implementing that alternative. The generation of decisions requires a structured, coherent process, or a decision-making process. The overall objective for this work is that the generalized framework is adopted into an autonomous decision-making framework and tailored to specific requirements for various applications. In this context, automation is the use of computing resources to make decisions and implement a structured decision-making process with limited or no human intervention. The overriding goal of automation is to replace ormore » supplement human decision makers with reconfigurable decision-making modules that can perform a given set of tasks rationally, consistently, and reliably. Risk-informed decision-making requires a probabilistic assessment of the likelihood of success given the status of the plant/systems and component health, and a deterministic assessment between plant operating parameters and reactor protection parameters to prevent unnecessary trips and challenges to plant safety systems. The probabilistic portion of the decision-making engine of the supervisory control system is based on the control actions associated with an ALMR PRISM. Newly incorporated into the probabilistic models are the prognostic/diagnostic models developed by Pacific Northwest National Laboratory. These allow decisions to incorporate the health of components into the decision–making process. Once the control options are identified and ranked based on the likelihood of success, the supervisory control system transmits the options to the deterministic portion of the platform. The deterministic portion of the decision-making engine uses thermal-hydraulic modeling and components for an advanced liquid-metal reactor Power Reactor Inherently Safe Module. The deterministic multi-attribute decision-making framework uses various sensor data (e.g., reactor outlet temperature, steam generator drum level) and calculates its position within the challenge state, its trajectory, and its margin within the controllable domain using utility functions to evaluate current and projected plant state space for different control decisions. The metrics that are evaluated are based on reactor trip set points. The integration of the deterministic calculations using multi-physics analyses and probabilistic safety calculations allows for the examination and quantification of margin recovery strategies. This also provides validation of the control options identified from the probabilistic assessment. Thus, the thermalhydraulics analyses are used to validate the control options identified from the probabilistic assessment. Future work includes evaluating other possible metrics and computational efficiencies, and developing a user interface to mimic display panels at a modern nuclear power plant.« less
What do we gain with Probabilistic Flood Loss Models?
NASA Astrophysics Data System (ADS)
Schroeter, K.; Kreibich, H.; Vogel, K.; Merz, B.; Lüdtke, S.
2015-12-01
The reliability of flood loss models is a prerequisite for their practical usefulness. Oftentimes, traditional uni-variate damage models as for instance depth-damage curves fail to reproduce the variability of observed flood damage. Innovative multi-variate probabilistic modelling approaches are promising to capture and quantify the uncertainty involved and thus to improve the basis for decision making. In this study we compare the predictive capability of two probabilistic modelling approaches, namely Bagging Decision Trees and Bayesian Networks and traditional stage damage functions which are cast in a probabilistic framework. For model evaluation we use empirical damage data which are available from computer aided telephone interviews that were respectively compiled after the floods in 2002, 2005, 2006 and 2013 in the Elbe and Danube catchments in Germany. We carry out a split sample test by sub-setting the damage records. One sub-set is used to derive the models and the remaining records are used to evaluate the predictive performance of the model. Further we stratify the sample according to catchments which allows studying model performance in a spatial transfer context. Flood damage estimation is carried out on the scale of the individual buildings in terms of relative damage. The predictive performance of the models is assessed in terms of systematic deviations (mean bias), precision (mean absolute error) as well as in terms of reliability which is represented by the proportion of the number of observations that fall within the 95-quantile and 5-quantile predictive interval. The reliability of the probabilistic predictions within validation runs decreases only slightly and achieves a very good coverage of observations within the predictive interval. Probabilistic models provide quantitative information about prediction uncertainty which is crucial to assess the reliability of model predictions and improves the usefulness of model results.
Nagarajan, Mahesh B; Raman, Steven S; Lo, Pechin; Lin, Wei-Chan; Khoshnoodi, Pooria; Sayre, James W; Ramakrishna, Bharath; Ahuja, Preeti; Huang, Jiaoti; Margolis, Daniel J A; Lu, David S K; Reiter, Robert E; Goldin, Jonathan G; Brown, Matthew S; Enzmann, Dieter R
2018-02-19
We present a method for generating a T2 MR-based probabilistic model of tumor occurrence in the prostate to guide the selection of anatomical sites for targeted biopsies and serve as a diagnostic tool to aid radiological evaluation of prostate cancer. In our study, the prostate and any radiological findings within were segmented retrospectively on 3D T2-weighted MR images of 266 subjects who underwent radical prostatectomy. Subsequent histopathological analysis determined both the ground truth and the Gleason grade of the tumors. A randomly chosen subset of 19 subjects was used to generate a multi-subject-derived prostate template. Subsequently, a cascading registration algorithm involving both affine and non-rigid B-spline transforms was used to register the prostate of every subject to the template. Corresponding transformation of radiological findings yielded a population-based probabilistic model of tumor occurrence. The quality of our probabilistic model building approach was statistically evaluated by measuring the proportion of correct placements of tumors in the prostate template, i.e., the number of tumors that maintained their anatomical location within the prostate after their transformation into the prostate template space. Probabilistic model built with tumors deemed clinically significant demonstrated a heterogeneous distribution of tumors, with higher likelihood of tumor occurrence at the mid-gland anterior transition zone and the base-to-mid-gland posterior peripheral zones. Of 250 MR lesions analyzed, 248 maintained their original anatomical location with respect to the prostate zones after transformation to the prostate. We present a robust method for generating a probabilistic model of tumor occurrence in the prostate that could aid clinical decision making, such as selection of anatomical sites for MR-guided prostate biopsies.
Aircraft Conflict Analysis and Real-Time Conflict Probing Using Probabilistic Trajectory Modeling
NASA Technical Reports Server (NTRS)
Yang, Lee C.; Kuchar, James K.
2000-01-01
Methods for maintaining separation between aircraft in the current airspace system have been built from a foundation of structured routes and evolved procedures. However, as the airspace becomes more congested and the chance of failures or operational error become more problematic, automated conflict alerting systems have been proposed to help provide decision support and to serve as traffic monitoring aids. The problem of conflict detection and resolution has been tackled from a number of different ways, but in this thesis, it is recast as a problem of prediction in the presence of uncertainties. Much of the focus is concentrated on the errors and uncertainties from the working trajectory model used to estimate future aircraft positions. The more accurate the prediction, the more likely an ideal (no false alarms, no missed detections) alerting system can be designed. Additional insights into the problem were brought forth by a review of current operational and developmental approaches found in the literature. An iterative, trial and error approach to threshold design was identified. When examined from a probabilistic perspective, the threshold parameters were found to be a surrogate to probabilistic performance measures. To overcome the limitations in the current iterative design method, a new direct approach is presented where the performance measures are directly computed and used to perform the alerting decisions. The methodology is shown to handle complex encounter situations (3-D, multi-aircraft, multi-intent, with uncertainties) with relative ease. Utilizing a Monte Carlo approach, a method was devised to perform the probabilistic computations in near realtime. Not only does this greatly increase the method's potential as an analytical tool, but it also opens up the possibility for use as a real-time conflict alerting probe. A prototype alerting logic was developed and has been utilized in several NASA Ames Research Center experimental studies.
Opportunities of probabilistic flood loss models
NASA Astrophysics Data System (ADS)
Schröter, Kai; Kreibich, Heidi; Lüdtke, Stefan; Vogel, Kristin; Merz, Bruno
2016-04-01
Oftentimes, traditional uni-variate damage models as for instance depth-damage curves fail to reproduce the variability of observed flood damage. However, reliable flood damage models are a prerequisite for the practical usefulness of the model results. Innovative multi-variate probabilistic modelling approaches are promising to capture and quantify the uncertainty involved and thus to improve the basis for decision making. In this study we compare the predictive capability of two probabilistic modelling approaches, namely Bagging Decision Trees and Bayesian Networks and traditional stage damage functions. For model evaluation we use empirical damage data which are available from computer aided telephone interviews that were respectively compiled after the floods in 2002, 2005, 2006 and 2013 in the Elbe and Danube catchments in Germany. We carry out a split sample test by sub-setting the damage records. One sub-set is used to derive the models and the remaining records are used to evaluate the predictive performance of the model. Further we stratify the sample according to catchments which allows studying model performance in a spatial transfer context. Flood damage estimation is carried out on the scale of the individual buildings in terms of relative damage. The predictive performance of the models is assessed in terms of systematic deviations (mean bias), precision (mean absolute error) as well as in terms of sharpness of the predictions the reliability which is represented by the proportion of the number of observations that fall within the 95-quantile and 5-quantile predictive interval. The comparison of the uni-variable Stage damage function and the multivariable model approach emphasises the importance to quantify predictive uncertainty. With each explanatory variable, the multi-variable model reveals an additional source of uncertainty. However, the predictive performance in terms of precision (mbe), accuracy (mae) and reliability (HR) is clearly improved in comparison to uni-variable Stage damage function. Overall, Probabilistic models provide quantitative information about prediction uncertainty which is crucial to assess the reliability of model predictions and improves the usefulness of model results.
NASA Astrophysics Data System (ADS)
Ohba, Masamichi; Nohara, Daisuke; Kadokura, Shinji
2016-04-01
Severe storms or other extreme weather events can interrupt the spin of wind turbines in large scale that cause unexpected "wind ramp events". In this study, we present an application of self-organizing maps (SOMs) for climatological attribution of the wind ramp events and their probabilistic prediction. The SOM is an automatic data-mining clustering technique, which allows us to summarize a high-dimensional data space in terms of a set of reference vectors. The SOM is applied to analyze and connect the relationship between atmospheric patterns over Japan and wind power generation. SOM is employed on sea level pressure derived from the JRA55 reanalysis over the target area (Tohoku region in Japan), whereby a two-dimensional lattice of weather patterns (WPs) classified during the 1977-2013 period is obtained. To compare with the atmospheric data, the long-term wind power generation is reconstructed by using a high-resolution surface observation network AMeDAS (Automated Meteorological Data Acquisition System) in Japan. Our analysis extracts seven typical WPs, which are linked to frequent occurrences of wind ramp events. Probabilistic forecasts to wind power generation and ramps are conducted by using the obtained SOM. The probability are derived from the multiple SOM lattices based on the matching of output from TIGGE multi-model global forecast to the WPs on the lattices. Since this method effectively takes care of the empirical uncertainties from the historical data, wind power generation and ramp is probabilistically forecasted from the forecasts of global models. The predictability skill of the forecasts for the wind power generation and ramp events show the relatively good skill score under the downscaling technique. It is expected that the results of this study provides better guidance to the user community and contribute to future development of system operation model for the transmission grid operator.
NASA Astrophysics Data System (ADS)
Xiong, Ming; Zheng, Huinan; Wu, S. T.; Wang, Yuming; Wang, Shui
2007-11-01
Numerical studies of the interplanetary "multiple magnetic clouds (Multi-MC)" are performed by a 2.5-dimensional ideal magnetohydrodynamic (MHD) model in the heliospheric meridional plane. Both slow MC1 and fast MC2 are initially emerged along the heliospheric equator, one after another with different time intervals. The coupling of two MCs could be considered as the comprehensive interaction between two systems, each comprising of an MC body and its driven shock. The MC2-driven shock and MC2 body are successively involved into interaction with MC1 body. The momentum is transferred from MC2 to MC1. After the passage of MC2-driven shock front, magnetic field lines in MC1 medium previously compressed by MC2-driven shock are prevented from being restored by the MC2 body pushing. MC1 body undergoes the most violent compression from the ambient solar wind ahead, continuous penetration of MC2-driven shock through MC1 body, and persistent pushing of MC2 body at MC1 tail boundary. As the evolution proceeds, the MC1 body suffers from larger and larger compression, and its original vulnerable magnetic elasticity becomes stiffer and stiffer. So there exists a maximum compressibility of Multi-MC when the accumulated elasticity can balance the external compression. This cutoff limit of compressibility mainly decides the maximally available geoeffectiveness of Multi-MC because the geoeffectiveness enhancement of MCs interacting is ascribed to the compression. Particularly, the greatest geoeffectiveness is excited among all combinations of each MC helicity, if magnetic field lines in the interacting region of Multi-MC are all southward. Multi-MC completes its final evolutionary stage when the MC2-driven shock is merged with MC1-driven shock into a stronger compound shock. With respect to Multi-MC geoeffectiveness, the evolution stage is a dominant factor, whereas the collision intensity is a subordinate one. The magnetic elasticity, magnetic helicity of each MC, and compression between each other are the key physical factors for the formation, propagation, evolution, and resulting geoeffectiveness of interplanetary Multi-MC.
Multi-functional quantum router using hybrid opto-electromechanics
NASA Astrophysics Data System (ADS)
Ma, Peng-Cheng; Yan, Lei-Lei; Chen, Gui-Bin; Li, Xiao-Wei; Liu, Shu-Jing; Zhan, You-Bang
2018-03-01
Quantum routers engineered with multiple frequency bands play a key role in quantum networks. We propose an experimentally accessible scheme for a multi-functional quantum router, using photon-phonon conversion in a hybrid opto-electromechanical system. Our proposed device functions as a bidirectional, tunable multi-channel quantum router, and demonstrates the possibility to route single optical photons bidirectionally and simultaneously to three different output ports, by adjusting the microwave power. Further, the device also behaves as an interswitching unit for microwave and optical photons, yielding probabilistic routing of microwave (optical) signals to optical (microwave) outports. With respect to potential application, we verify the insignificant influence from vacuum and thermal noises in the performance of the router under cryogenic conditions.
Learned filters for object detection in multi-object visual tracking
NASA Astrophysics Data System (ADS)
Stamatescu, Victor; Wong, Sebastien; McDonnell, Mark D.; Kearney, David
2016-05-01
We investigate the application of learned convolutional filters in multi-object visual tracking. The filters were learned in both a supervised and unsupervised manner from image data using artificial neural networks. This work follows recent results in the field of machine learning that demonstrate the use learned filters for enhanced object detection and classification. Here we employ a track-before-detect approach to multi-object tracking, where tracking guides the detection process. The object detection provides a probabilistic input image calculated by selecting from features obtained using banks of generative or discriminative learned filters. We present a systematic evaluation of these convolutional filters using a real-world data set that examines their performance as generic object detectors.
Zhang, Kejiang; Achari, Gopal; Pei, Yuansheng
2010-10-01
Different types of uncertain information-linguistic, probabilistic, and possibilistic-exist in site characterization. Their representation and propagation significantly influence the management of contaminated sites. In the absence of a framework with which to properly represent and integrate these quantitative and qualitative inputs together, decision makers cannot fully take advantage of the available and necessary information to identify all the plausible alternatives. A systematic methodology was developed in the present work to incorporate linguistic, probabilistic, and possibilistic information into the Preference Ranking Organization METHod for Enrichment Evaluation (PROMETHEE), a subgroup of Multi-Criteria Decision Analysis (MCDA) methods for ranking contaminated sites. The identification of criteria based on the paradigm of comparative risk assessment provides a rationale for risk-based prioritization. Uncertain linguistic, probabilistic, and possibilistic information identified in characterizing contaminated sites can be properly represented as numerical values, intervals, probability distributions, and fuzzy sets or possibility distributions, and linguistic variables according to their nature. These different kinds of representation are first transformed into a 2-tuple linguistic representation domain. The propagation of hybrid uncertainties is then carried out in the same domain. This methodology can use the original site information directly as much as possible. The case study shows that this systematic methodology provides more reasonable results. © 2010 SETAC.
Probabilistic models for neural populations that naturally capture global coupling and criticality
2017-01-01
Advances in multi-unit recordings pave the way for statistical modeling of activity patterns in large neural populations. Recent studies have shown that the summed activity of all neurons strongly shapes the population response. A separate recent finding has been that neural populations also exhibit criticality, an anomalously large dynamic range for the probabilities of different population activity patterns. Motivated by these two observations, we introduce a class of probabilistic models which takes into account the prior knowledge that the neural population could be globally coupled and close to critical. These models consist of an energy function which parametrizes interactions between small groups of neurons, and an arbitrary positive, strictly increasing, and twice differentiable function which maps the energy of a population pattern to its probability. We show that: 1) augmenting a pairwise Ising model with a nonlinearity yields an accurate description of the activity of retinal ganglion cells which outperforms previous models based on the summed activity of neurons; 2) prior knowledge that the population is critical translates to prior expectations about the shape of the nonlinearity; 3) the nonlinearity admits an interpretation in terms of a continuous latent variable globally coupling the system whose distribution we can infer from data. Our method is independent of the underlying system’s state space; hence, it can be applied to other systems such as natural scenes or amino acid sequences of proteins which are also known to exhibit criticality. PMID:28926564
Probabilistic objective functions for sensor management
NASA Astrophysics Data System (ADS)
Mahler, Ronald P. S.; Zajic, Tim R.
2004-08-01
This paper continues the investigation of a foundational and yet potentially practical basis for control-theoretic sensor management, using a comprehensive, intuitive, system-level Bayesian paradigm based on finite-set statistics (FISST). In this paper we report our most recent progress, focusing on multistep look-ahead -- i.e., allocation of sensor resources throughout an entire future time-window. We determine future sensor states in the time-window using a "probabilistically natural" sensor management objective function, the posterior expected number of targets (PENT). This objective function is constructed using a new "maxi-PIMS" optimization strategy that hedges against unknowable future observation-collections. PENT is used in conjuction with approximate multitarget filters: the probability hypothesis density (PHD) filter or the multi-hypothesis correlator (MHC) filter.
Novel probabilistic neuroclassifier
NASA Astrophysics Data System (ADS)
Hong, Jiang; Serpen, Gursel
2003-09-01
A novel probabilistic potential function neural network classifier algorithm to deal with classes which are multi-modally distributed and formed from sets of disjoint pattern clusters is proposed in this paper. The proposed classifier has a number of desirable properties which distinguish it from other neural network classifiers. A complete description of the algorithm in terms of its architecture and the pseudocode is presented. Simulation analysis of the newly proposed neuro-classifier algorithm on a set of benchmark problems is presented. Benchmark problems tested include IRIS, Sonar, Vowel Recognition, Two-Spiral, Wisconsin Breast Cancer, Cleveland Heart Disease and Thyroid Gland Disease. Simulation results indicate that the proposed neuro-classifier performs consistently better for a subset of problems for which other neural classifiers perform relatively poorly.
Improved reliability of wind turbine towers with active tuned mass dampers (ATMDs)
NASA Astrophysics Data System (ADS)
Fitzgerald, Breiffni; Sarkar, Saptarshi; Staino, Andrea
2018-04-01
Modern multi-megawatt wind turbines are composed of slender, flexible, and lightly damped blades and towers. These components exhibit high susceptibility to wind-induced vibrations. As the size, flexibility and cost of the towers have increased in recent years, the need to protect these structures against damage induced by turbulent aerodynamic loading has become apparent. This paper combines structural dynamic models and probabilistic assessment tools to demonstrate improvements in structural reliability when modern wind turbine towers are equipped with active tuned mass dampers (ATMDs). This study proposes a multi-modal wind turbine model for wind turbine control design and analysis. This study incorporates an ATMD into the tower of this model. The model is subjected to stochastically generated wind loads of varying speeds to develop wind-induced probabilistic demand models for towers of modern multi-megawatt wind turbines under structural uncertainty. Numerical simulations have been carried out to ascertain the effectiveness of the active control system to improve the structural performance of the wind turbine and its reliability. The study constructs fragility curves, which illustrate reductions in the vulnerability of towers to wind loading owing to the inclusion of the damper. Results show that the active controller is successful in increasing the reliability of the tower responses. According to the analysis carried out in this paper, a strong reduction of the probability of exceeding a given displacement at the rated wind speed has been observed.
NASA Astrophysics Data System (ADS)
Haris, A.; Nafian, M.; Riyanto, A.
2017-07-01
Danish North Sea Fields consist of several formations (Ekofisk, Tor, and Cromer Knoll) that was started from the age of Paleocene to Miocene. In this study, the integration of seismic and well log data set is carried out to determine the chalk sand distribution in the Danish North Sea field. The integration of seismic and well log data set is performed by using the seismic inversion analysis and seismic multi-attribute. The seismic inversion algorithm, which is used to derive acoustic impedance (AI), is model-based technique. The derived AI is then used as external attributes for the input of multi-attribute analysis. Moreover, the multi-attribute analysis is used to generate the linear and non-linear transformation of among well log properties. In the case of the linear model, selected transformation is conducted by weighting step-wise linear regression (SWR), while for the non-linear model is performed by using probabilistic neural networks (PNN). The estimated porosity, which is resulted by PNN shows better suited to the well log data compared with the results of SWR. This result can be understood since PNN perform non-linear regression so that the relationship between the attribute data and predicted log data can be optimized. The distribution of chalk sand has been successfully identified and characterized by porosity value ranging from 23% up to 30%.
Improving PERSIANN-CCS rain estimation using probabilistic approach and multi-sensors information
NASA Astrophysics Data System (ADS)
Karbalaee, N.; Hsu, K. L.; Sorooshian, S.; Kirstetter, P.; Hong, Y.
2016-12-01
This presentation discusses the recent implemented approaches to improve the rainfall estimation from Precipitation Estimation from Remotely Sensed Information using Artificial Neural Network-Cloud Classification System (PERSIANN-CCS). PERSIANN-CCS is an infrared (IR) based algorithm being integrated in the IMERG (Integrated Multi-Satellite Retrievals for the Global Precipitation Mission GPM) to create a precipitation product in 0.1x0.1degree resolution over the chosen domain 50N to 50S every 30 minutes. Although PERSIANN-CCS has a high spatial and temporal resolution, it overestimates or underestimates due to some limitations.PERSIANN-CCS can estimate rainfall based on the extracted information from IR channels at three different temperature threshold levels (220, 235, and 253k). This algorithm relies only on infrared data to estimate rainfall indirectly from this channel which cause missing the rainfall from warm clouds and false estimation for no precipitating cold clouds. In this research the effectiveness of using other channels of GOES satellites such as visible and water vapors has been investigated. By using multi-sensors the precipitation can be estimated based on the extracted information from multiple channels. Also, instead of using the exponential function for estimating rainfall from cloud top temperature, the probabilistic method has been used. Using probability distributions of precipitation rates instead of deterministic values has improved the rainfall estimation for different type of clouds.
NASA Technical Reports Server (NTRS)
Fragola, Joseph R.; Maggio, Gaspare; Frank, Michael V.; Gerez, Luis; Mcfadden, Richard H.; Collins, Erin P.; Ballesio, Jorge; Appignani, Peter L.; Karns, James J.
1995-01-01
Volume 5 is Appendix C, Auxiliary Shuttle Risk Analyses, and contains the following reports: Probabilistic Risk Assessment of Space Shuttle Phase 1 - Space Shuttle Catastrophic Failure Frequency Final Report; Risk Analysis Applied to the Space Shuttle Main Engine - Demonstration Project for the Main Combustion Chamber Risk Assessment; An Investigation of the Risk Implications of Space Shuttle Solid Rocket Booster Chamber Pressure Excursions; Safety of the Thermal Protection System of the Space Shuttle Orbiter - Quantitative Analysis and Organizational Factors; Space Shuttle Main Propulsion Pressurization System Probabilistic Risk Assessment, Final Report; and Space Shuttle Probabilistic Risk Assessment Proof-of-Concept Study - Auxiliary Power Unit and Hydraulic Power Unit Analysis Report.
META 2f: Probabilistic, Compositional, Multi-dimension Model-Based Verification (PROMISE)
2011-10-01
Equational Logic, Rewriting Logic, and Maude ................................................ 52 5.3 Results and Discussion...and its discrete transitions are left unchanged. However, the differential equations describing the continuous dynamics (in each mode) are replaced by...by replacing hard-to-analyze differential equations by discrete transitions. In principle, predicate and qualitative abstraction can be used on a
2012-08-25
Accel- erated Crystal Plasticity FEM Simulations (submitted). 5. M. Anahid, M. Samal and S. Ghosh, Dwell fatigue crack nucleation model based on using...4] M. Anahid, M. K. Samal , and S. Ghosh. Dwell fatigue crack nucleation model based on crystal plasticity finite element simulations of
Wels, Michael; Carneiro, Gustavo; Aplas, Alexander; Huber, Martin; Hornegger, Joachim; Comaniciu, Dorin
2008-01-01
In this paper we present a fully automated approach to the segmentation of pediatric brain tumors in multi-spectral 3-D magnetic resonance images. It is a top-down segmentation approach based on a Markov random field (MRF) model that combines probabilistic boosting trees (PBT) and lower-level segmentation via graph cuts. The PBT algorithm provides a strong discriminative observation model that classifies tumor appearance while a spatial prior takes into account the pair-wise homogeneity in terms of classification labels and multi-spectral voxel intensities. The discriminative model relies not only on observed local intensities but also on surrounding context for detecting candidate regions for pathology. A mathematically sound formulation for integrating the two approaches into a unified statistical framework is given. The proposed method is applied to the challenging task of detection and delineation of pediatric brain tumors. This segmentation task is characterized by a high non-uniformity of both the pathology and the surrounding non-pathologic brain tissue. A quantitative evaluation illustrates the robustness of the proposed method. Despite dealing with more complicated cases of pediatric brain tumors the results obtained are mostly better than those reported for current state-of-the-art approaches to 3-D MR brain tumor segmentation in adult patients. The entire processing of one multi-spectral data set does not require any user interaction, and takes less time than previously proposed methods.
Reliability, Risk and Cost Trade-Offs for Composite Designs
NASA Technical Reports Server (NTRS)
Shiao, Michael C.; Singhal, Surendra N.; Chamis, Christos C.
1996-01-01
Risk and cost trade-offs have been simulated using a probabilistic method. The probabilistic method accounts for all naturally-occurring uncertainties including those in constituent material properties, fabrication variables, structure geometry and loading conditions. The probability density function of first buckling load for a set of uncertain variables is computed. The probabilistic sensitivity factors of uncertain variables to the first buckling load is calculated. The reliability-based cost for a composite fuselage panel is defined and minimized with respect to requisite design parameters. The optimization is achieved by solving a system of nonlinear algebraic equations whose coefficients are functions of probabilistic sensitivity factors. With optimum design parameters such as the mean and coefficient of variation (representing range of scatter) of uncertain variables, the most efficient and economical manufacturing procedure can be selected. In this paper, optimum values of the requisite design parameters for a predetermined cost due to failure occurrence are computationally determined. The results for the fuselage panel analysis show that the higher the cost due to failure occurrence, the smaller the optimum coefficient of variation of fiber modulus (design parameter) in longitudinal direction.
Development of a First-of-a-Kind Deterministic Decision-Making Tool for Supervisory Control System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cetiner, Sacit M; Kisner, Roger A; Muhlheim, Michael David
2015-07-01
Decision-making is the process of identifying and choosing alternatives where each alternative offers a different approach or path to move from a given state or condition to a desired state or condition. The generation of consistent decisions requires that a structured, coherent process be defined, immediately leading to a decision-making framework. The overall objective of the generalized framework is for it to be adopted into an autonomous decision-making framework and tailored to specific requirements for various applications. In this context, automation is the use of computing resources to make decisions and implement a structured decision-making process with limited or nomore » human intervention. The overriding goal of automation is to replace or supplement human decision makers with reconfigurable decision- making modules that can perform a given set of tasks reliably. Risk-informed decision making requires a probabilistic assessment of the likelihood of success given the status of the plant/systems and component health, and a deterministic assessment between plant operating parameters and reactor protection parameters to prevent unnecessary trips and challenges to plant safety systems. The implementation of the probabilistic portion of the decision-making engine of the proposed supervisory control system was detailed in previous milestone reports. Once the control options are identified and ranked based on the likelihood of success, the supervisory control system transmits the options to the deterministic portion of the platform. The deterministic multi-attribute decision-making framework uses variable sensor data (e.g., outlet temperature) and calculates where it is within the challenge state, its trajectory, and margin within the controllable domain using utility functions to evaluate current and projected plant state space for different control decisions. Metrics to be evaluated include stability, cost, time to complete (action), power level, etc. The integration of deterministic calculations using multi-physics analyses (i.e., neutronics, thermal, and thermal-hydraulics) and probabilistic safety calculations allows for the examination and quantification of margin recovery strategies. This also provides validation of the control options identified from the probabilistic assessment. Thus, the thermal-hydraulics analyses are used to validate the control options identified from the probabilistic assessment. Future work includes evaluating other possible metrics and computational efficiencies.« less
NASA Astrophysics Data System (ADS)
Tonini, R.; Anita, G.
2011-12-01
In both worldwide and regional historical catalogues, most of the tsunamis are caused by earthquakes and a minor percentage is represented by all the other non-seismic sources. On the other hand, tsunami hazard and risk studies are often applied to very specific areas, where this global trend can be different or even inverted, depending on the kind of potential tsunamigenic sources which characterize the case study. So far, few probabilistic approaches consider the contribution of landslides and/or phenomena derived by volcanic activity, i.e. pyroclastic flows and flank collapses, as predominant in the PTHA, also because of the difficulties to estimate the correspondent recurrence time. These considerations are valid, for example, for the city of Naples, Italy, which is surrounded by a complex active volcanic system (Vesuvio, Campi Flegrei, Ischia) that presents a significant number of potential tsunami sources of non-seismic origin compared to the seismic ones. In this work we present the preliminary results of a probabilistic multi-source tsunami hazard assessment applied to Naples. The method to estimate the uncertainties will be based on Bayesian inference. This is the first step towards a more comprehensive task which will provide a tsunami risk quantification for this town in the frame of the Italian national project ByMuR (http://bymur.bo.ingv.it). This three years long ongoing project has the final objective of developing a Bayesian multi-risk methodology to quantify the risk related to different natural hazards (volcanoes, earthquakes and tsunamis) applied to the city of Naples.
NASA Astrophysics Data System (ADS)
D'Isanto, A.; Polsterer, K. L.
2018-01-01
Context. The need to analyze the available large synoptic multi-band surveys drives the development of new data-analysis methods. Photometric redshift estimation is one field of application where such new methods improved the results, substantially. Up to now, the vast majority of applied redshift estimation methods have utilized photometric features. Aims: We aim to develop a method to derive probabilistic photometric redshift directly from multi-band imaging data, rendering pre-classification of objects and feature extraction obsolete. Methods: A modified version of a deep convolutional network was combined with a mixture density network. The estimates are expressed as Gaussian mixture models representing the probability density functions (PDFs) in the redshift space. In addition to the traditional scores, the continuous ranked probability score (CRPS) and the probability integral transform (PIT) were applied as performance criteria. We have adopted a feature based random forest and a plain mixture density network to compare performances on experiments with data from SDSS (DR9). Results: We show that the proposed method is able to predict redshift PDFs independently from the type of source, for example galaxies, quasars or stars. Thereby the prediction performance is better than both presented reference methods and is comparable to results from the literature. Conclusions: The presented method is extremely general and allows us to solve of any kind of probabilistic regression problems based on imaging data, for example estimating metallicity or star formation rate of galaxies. This kind of methodology is tremendously important for the next generation of surveys.
NASA Technical Reports Server (NTRS)
Boyce, Lola; Bast, Callie C.
1992-01-01
The research included ongoing development of methodology that provides probabilistic lifetime strength of aerospace materials via computational simulation. A probabilistic material strength degradation model, in the form of a randomized multifactor interaction equation, is postulated for strength degradation of structural components of aerospace propulsion systems subjected to a number of effects or primative variables. These primative variable may include high temperature, fatigue or creep. In most cases, strength is reduced as a result of the action of a variable. This multifactor interaction strength degradation equation has been randomized and is included in the computer program, PROMISS. Also included in the research is the development of methodology to calibrate the above described constitutive equation using actual experimental materials data together with linear regression of that data, thereby predicting values for the empirical material constraints for each effect or primative variable. This regression methodology is included in the computer program, PROMISC. Actual experimental materials data were obtained from the open literature for materials typically of interest to those studying aerospace propulsion system components. Material data for Inconel 718 was analyzed using the developed methodology.
Langenderfer, Joseph E; Rullkoetter, Paul J; Mell, Amy G; Laz, Peter J
2009-04-01
An accurate assessment of shoulder kinematics is useful for understanding healthy normal and pathological mechanics. Small variability in identifying and locating anatomical landmarks (ALs) has potential to affect reported shoulder kinematics. The objectives of this study were to quantify the effect of landmark location variability on scapular and humeral kinematic descriptions for multiple subjects using probabilistic analysis methods, and to evaluate the consistency in results across multiple subjects. Data from 11 healthy subjects performing humeral elevation in the scapular plane were used to calculate Euler angles describing humeral and scapular kinematics. Probabilistic analyses were performed for each subject to simulate uncertainty in the locations of 13 upper-extremity ALs. For standard deviations of 4 mm in landmark location, the analysis predicted Euler angle envelopes between the 1 and 99 percentile bounds of up to 16.6 degrees . While absolute kinematics varied with the subject, the average 1-99% kinematic ranges for the motion were consistent across subjects and sensitivity factors showed no statistically significant differences between subjects. The description of humeral kinematics was most sensitive to the location of landmarks on the thorax, while landmarks on the scapula had the greatest effect on the description of scapular elevation. The findings of this study can provide a better understanding of kinematic variability, which can aid in making accurate clinical diagnoses and refining kinematic measurement techniques.
Probabilistic Methods for Structural Reliability and Risk
NASA Technical Reports Server (NTRS)
Chamis, Christos C.
2010-01-01
A probabilistic method is used to evaluate the structural reliability and risk of select metallic and composite structures. The method is a multiscale, multifunctional and it is based on the most elemental level. A multifactor interaction model is used to describe the material properties which are subsequently evaluated probabilistically. The metallic structure is a two rotor aircraft engine, while the composite structures consist of laminated plies (multiscale) and the properties of each ply are the multifunctional representation. The structural component is modeled by finite element. The solution method for structural responses is obtained by an updated simulation scheme. The results show that the risk for the two rotor engine is about 0.0001 and the composite built-up structure is also 0.0001.
A statistical framework for applying RNA profiling to chemical hazard detection.
Kostich, Mitchell S
2017-12-01
Use of 'omics technologies in environmental science is expanding. However, application is mostly restricted to characterizing molecular steps leading from toxicant interaction with molecular receptors to apical endpoints in laboratory species. Use in environmental decision-making is limited, due to difficulty in elucidating mechanisms in sufficient detail to make quantitative outcome predictions in any single species or in extending predictions to aquatic communities. Here we introduce a mechanism-agnostic statistical approach, supplementing mechanistic investigation by allowing probabilistic outcome prediction even when understanding of molecular pathways is limited, and facilitating extrapolation from results in laboratory test species to predictions about aquatic communities. We use concepts familiar to environmental managers, supplemented with techniques employed for clinical interpretation of 'omics-based biomedical tests. We describe the framework in step-wise fashion, beginning with single test replicates of a single RNA variant, then extending to multi-gene RNA profiling, collections of test replicates, and integration of complementary data. In order to simplify the presentation, we focus on using RNA profiling for distinguishing presence versus absence of chemical hazards, but the principles discussed can be extended to other types of 'omics measurements, multi-class problems, and regression. We include a supplemental file demonstrating many of the concepts using the open source R statistical package. Published by Elsevier Ltd.
Liu, Zhou; Shum, Ho Cheung
2013-01-01
In this work, we demonstrate a robust and reliable approach to fabricate multi-compartment particles for cell co-culture studies. By taking advantage of the laminar flow within our microfluidic nozzle, multiple parallel streams of liquids flow towards the nozzle without significant mixing. Afterwards, the multiple parallel streams merge into a single stream, which is sprayed into air, forming monodisperse droplets under an electric field with a high field strength. The resultant multi-compartment droplets are subsequently cross-linked in a calcium chloride solution to form calcium alginate micro-particles with multiple compartments. Each compartment of the particles can be used for encapsulating different types of cells or biological cell factors. These hydrogel particles with cross-linked alginate chains show similarity in the physical and mechanical environment as the extracellular matrix of biological cells. Thus, the multi-compartment particles provide a promising platform for cell studies and co-culture of different cells. In our study, cells are encapsulated in the multi-compartment particles and the viability of cells is quantified using a fluorescence microscope after the cells are stained for a live/dead assay. The high cell viability after encapsulation indicates the cytocompatibility and feasibility of our technique. Our multi-compartment particles have great potential as a platform for studying cell-cell interactions as well as interactions of cells with extracellular factors.
Liu, Zhou; Shum, Ho Cheung
2013-01-01
In this work, we demonstrate a robust and reliable approach to fabricate multi-compartment particles for cell co-culture studies. By taking advantage of the laminar flow within our microfluidic nozzle, multiple parallel streams of liquids flow towards the nozzle without significant mixing. Afterwards, the multiple parallel streams merge into a single stream, which is sprayed into air, forming monodisperse droplets under an electric field with a high field strength. The resultant multi-compartment droplets are subsequently cross-linked in a calcium chloride solution to form calcium alginate micro-particles with multiple compartments. Each compartment of the particles can be used for encapsulating different types of cells or biological cell factors. These hydrogel particles with cross-linked alginate chains show similarity in the physical and mechanical environment as the extracellular matrix of biological cells. Thus, the multi-compartment particles provide a promising platform for cell studies and co-culture of different cells. In our study, cells are encapsulated in the multi-compartment particles and the viability of cells is quantified using a fluorescence microscope after the cells are stained for a live/dead assay. The high cell viability after encapsulation indicates the cytocompatibility and feasibility of our technique. Our multi-compartment particles have great potential as a platform for studying cell-cell interactions as well as interactions of cells with extracellular factors. PMID:24404050
Elasto-limited plastic analysis of structures for probabilistic conditions
NASA Astrophysics Data System (ADS)
Movahedi Rad, M.
2018-06-01
With applying plastic analysis and design methods, significant saving in material can be obtained. However, as a result of this benefit excessive plastic deformations and large residual displacements might develop, which in turn might lead to unserviceability and collapse of the structure. In this study, for deterministic problem the residual deformation of structures is limited by considering a constraint on the complementary strain energy of the residual forces. For probabilistic problem the constraint for the complementary strain energy of the residual forces is given randomly and critical stresses updated during the iteration. Limit curves are presented for the plastic limit load factors. The results show that these constraints have significant effects on the load factors. The formulations of the deterministic and probabilistic problems lead to mathematical programming which are solved by the use of nonlinear algorithm.
DISCOUNTING OF DELAYED AND PROBABILISTIC LOSSES OVER A WIDE RANGE OF AMOUNTS
Green, Leonard; Myerson, Joel; Oliveira, Luís; Chang, Seo Eun
2014-01-01
The present study examined delay and probability discounting of hypothetical monetary losses over a wide range of amounts (from $20 to $500,000) in order to determine how amount affects the parameters of the hyperboloid discounting function. In separate conditions, college students chose between immediate payments and larger, delayed payments and between certain payments and larger, probabilistic payments. The hyperboloid function accurately described both types of discounting, and amount of loss had little or no systematic effect on the degree of discounting. Importantly, the amount of loss also had little systematic effect on either the rate parameter or the exponent of the delay and probability discounting functions. The finding that the parameters of the hyperboloid function remain relatively constant across a wide range of amounts of delayed and probabilistic loss stands in contrast to the robust amount effects observed with delayed and probabilistic rewards. At the individual level, the degree to which delayed losses were discounted was uncorrelated with the degree to which probabilistic losses were discounted, and delay and probability loaded on two separate factors, similar to what is observed with delayed and probabilistic rewards. Taken together, these findings argue that although delay and probability discounting involve fundamentally different decision-making mechanisms, nevertheless the discounting of delayed and probabilistic losses share an insensitivity to amount that distinguishes it from the discounting of delayed and probabilistic gains. PMID:24745086
2014-01-01
Automatic reconstruction of metabolic pathways for an organism from genomics and transcriptomics data has been a challenging and important problem in bioinformatics. Traditionally, known reference pathways can be mapped into an organism-specific ones based on its genome annotation and protein homology. However, this simple knowledge-based mapping method might produce incomplete pathways and generally cannot predict unknown new relations and reactions. In contrast, ab initio metabolic network construction methods can predict novel reactions and interactions, but its accuracy tends to be low leading to a lot of false positives. Here we combine existing pathway knowledge and a new ab initio Bayesian probabilistic graphical model together in a novel fashion to improve automatic reconstruction of metabolic networks. Specifically, we built a knowledge database containing known, individual gene / protein interactions and metabolic reactions extracted from existing reference pathways. Known reactions and interactions were then used as constraints for Bayesian network learning methods to predict metabolic pathways. Using individual reactions and interactions extracted from different pathways of many organisms to guide pathway construction is new and improves both the coverage and accuracy of metabolic pathway construction. We applied this probabilistic knowledge-based approach to construct the metabolic networks from yeast gene expression data and compared its results with 62 known metabolic networks in the KEGG database. The experiment showed that the method improved the coverage of metabolic network construction over the traditional reference pathway mapping method and was more accurate than pure ab initio methods. PMID:25374614
Chauhan, Monika; Sharma, Gourav; Joshi, Gaurav; Kumar, Raj
2016-01-01
The interactions of Epidermal Growth Factor Receptor (EGFR) and topoisomerases have been seen in various cancer including brain, breast, ovarian, colorectal, gastric, etc. The studies in adenocarcinoma patients, chromogenic in situ hybridization, western blotting, receptor binding assay and electromobility shift assays, etc. threw light on the biophysical and biochemical features of EGFR and Topoisomerase cross-talks. It has been revealed that both the isomers of topoisomerase (Topo I and Topo II) interact via different mechanisms with EGFR. Topo II and HER2 share the same location i.e. 17q12-21 regions which could be a possible cause of predominant interactions seen between them. Topo I and EGFR interactions are mechanically related to the nucleolar translocation of heparenase by EGF and c-Jun. We compiled literature findings including the mechanistic interventions, signaling pathways, patents, in vitro and in vivo data of tested inhibitors and combinations in clinical trials, which provide convincing confirmations for the interactions of EGFR and topoisomerases. These interactions may be used for deriving a consistent route of mechanism, design and development of standard drug combinations and dual or multi inhibitors.
NASA Astrophysics Data System (ADS)
Yang, Xiu-Qun; Yang, Dejian; Xie, Qian; Zhang, Yaocun; Ren, Xuejuan; Tang, Youmin
2017-04-01
Based on historical forecasts of three quasi-operational multi-model ensemble (MME) systems, this study assesses the superiority of coupled MME over contributing single-model ensembles (SMEs) and over uncoupled atmospheric MME in predicting the Western North Pacific-East Asian summer monsoon variability. The probabilistic and deterministic forecast skills are measured by Brier skill score (BSS) and anomaly correlation (AC), respectively. A forecast-format dependent MME superiority over SMEs is found. The probabilistic forecast skill of the MME is always significantly better than that of each SME, while the deterministic forecast skill of the MME can be lower than that of some SMEs. The MME superiority arises from both the model diversity and the ensemble size increase in the tropics, and primarily from the ensemble size increase in the subtropics. The BSS is composed of reliability and resolution, two attributes characterizing probabilistic forecast skill. The probabilistic skill increase of the MME is dominated by the dramatic improvement in reliability, while resolution is not always improved, similar to AC. A monotonic resolution-AC relationship is further found and qualitatively explained, whereas little relationship can be identified between reliability and AC. It is argued that the MME's success in improving the reliability arises from an effective reduction of the overconfidence in forecast distributions. Moreover, it is examined that the seasonal predictions with coupled MME are more skillful than those with the uncoupled atmospheric MME forced by persisting sea surface temperature (SST) anomalies, since the coupled MME has better predicted the SST anomaly evolution in three key regions.
Combining multiple decisions: applications to bioinformatics
NASA Astrophysics Data System (ADS)
Yukinawa, N.; Takenouchi, T.; Oba, S.; Ishii, S.
2008-01-01
Multi-class classification is one of the fundamental tasks in bioinformatics and typically arises in cancer diagnosis studies by gene expression profiling. This article reviews two recent approaches to multi-class classification by combining multiple binary classifiers, which are formulated based on a unified framework of error-correcting output coding (ECOC). The first approach is to construct a multi-class classifier in which each binary classifier to be aggregated has a weight value to be optimally tuned based on the observed data. In the second approach, misclassification of each binary classifier is formulated as a bit inversion error with a probabilistic model by making an analogy to the context of information transmission theory. Experimental studies using various real-world datasets including cancer classification problems reveal that both of the new methods are superior or comparable to other multi-class classification methods.
NASA Astrophysics Data System (ADS)
Hu, Y.; Quinn, C.; Cai, X.
2015-12-01
One major challenge of agent-based modeling is to derive agents' behavioral rules due to behavioral uncertainty and data scarcity. This study proposes a new approach to combine a data-driven modeling based on the directed information (i.e., machine intelligence) with expert domain knowledge (i.e., human intelligence) to derive the behavioral rules of agents considering behavioral uncertainty. A directed information graph algorithm is applied to identifying the causal relationships between agents' decisions (i.e., groundwater irrigation depth) and time-series of environmental, socio-economical and institutional factors. A case study is conducted for the High Plains aquifer hydrological observatory (HO) area, U.S. Preliminary results show that four factors, corn price (CP), underlying groundwater level (GWL), monthly mean temperature (T) and precipitation (P) have causal influences on agents' decisions on groundwater irrigation depth (GWID) to various extents. Based on the similarity of the directed information graph for each agent, five clusters of graphs are further identified to represent all the agents' behaviors in the study area as shown in Figure 1. Using these five representative graphs, agents' monthly optimal groundwater pumping rates are derived through the probabilistic inference. Such data-driven relationships and probabilistic quantifications are then coupled with a physically-based groundwater model to investigate the interactions between agents' pumping behaviors and the underlying groundwater system in the context of coupled human and natural systems.
Metrics for Labeled Markov Systems
NASA Technical Reports Server (NTRS)
Desharnais, Josee; Jagadeesan, Radha; Gupta, Vineet; Panangaden, Prakash
1999-01-01
Partial Labeled Markov Chains are simultaneously generalizations of process algebra and of traditional Markov chains. They provide a foundation for interacting discrete probabilistic systems, the interaction being synchronization on labels as in process algebra. Existing notions of process equivalence are too sensitive to the exact probabilities of various transitions. This paper addresses contextual reasoning principles for reasoning about more robust notions of "approximate" equivalence between concurrent interacting probabilistic systems. The present results indicate that:We develop a family of metrics between partial labeled Markov chains to formalize the notion of distance between processes. We show that processes at distance zero are bisimilar. We describe a decision procedure to compute the distance between two processes. We show that reasoning about approximate equivalence can be done compositionally by showing that process combinators do not increase distance. We introduce an asymptotic metric to capture asymptotic properties of Markov chains; and show that parallel composition does not increase asymptotic distance.
Probabilistic multi-resolution human classification
NASA Astrophysics Data System (ADS)
Tu, Jun; Ran, H.
2006-02-01
Recently there has been some interest in using infrared cameras for human detection because of the sharply decreasing prices of infrared cameras. The training data used in our work for developing the probabilistic template consists images known to contain humans in different poses and orientation but having the same height. Multiresolution templates are performed. They are based on contour and edges. This is done so that the model does not learn the intensity variations among the background pixels and intensity variations among the foreground pixels. Each template at every level is then translated so that the centroid of the non-zero pixels matches the geometrical center of the image. After this normalization step, for each pixel of the template, the probability of it being pedestrian is calculated based on the how frequently it appears as 1 in the training data. We also use periodicity gait to verify the pedestrian in a Bayesian manner for the whole blob in a probabilistic way. The videos had quite a lot of variations in the scenes, sizes of people, amount of occlusions and clutter in the backgrounds as is clearly evident. Preliminary experiments show the robustness.
NASA Astrophysics Data System (ADS)
Mastrolorenzo, G.; Pappalardo, L.; Troise, C.; Panizza, A.; de Natale, G.
2005-05-01
Integrated volcanological-probabilistic approaches has been used in order to simulate pyroclastic density currents and fallout and produce hazard maps for Campi Flegrei and Somma Vesuvius areas. On the basis of the analyses of all types of pyroclastic flows, surges, secondary pyroclastic density currents and fallout events occurred in the volcanological history of the two volcanic areas and the evaluation of probability for each type of events, matrixs of input parameters for a numerical simulation have been performed. The multi-dimensional input matrixs include the main controlling parameters of the pyroclasts transport and deposition dispersion, as well as the set of possible eruptive vents used in the simulation program. Probabilistic hazard maps provide of each points of campanian area, the yearly probability to be interested by a given event with a given intensity and resulting demage. Probability of a few events in one thousand years are typical of most areas around the volcanoes whitin a range of ca 10 km, including Neaples. Results provide constrains for the emergency plans in Neapolitan area.
Developing probabilistic models to predict amphibian site occupancy in a patchy landscape
R. A. Knapp; K.R. Matthews; H. K. Preisler; R. Jellison
2003-01-01
Abstract. Human-caused fragmentation of habitats is threatening an increasing number of animal and plant species, making an understanding of the factors influencing patch occupancy ever more important. The overall goal of the current study was to develop probabilistic models of patch occupancy for the mountain yellow-legged frog (Rana muscosa). This once-common species...
Contaminant deposition building shielding factors for US residential structures.
Dickson, Elijah; Hamby, David; Eckerman, Keith
2017-10-10
This paper presents validated building shielding factors designed for contemporary US housing-stock under an idealized, yet realistic, exposure scenario from contaminant deposition on the roof and surrounding surfaces. The building shielding factors are intended for use in emergency planning and level three probabilistic risk assessments for a variety of postulated radiological events in which a realistic assessment is necessary to better understand the potential risks for accident mitigation and emergency response planning. Factors are calculated from detailed computational housing-units models using the general-purpose Monte Carlo N-Particle computational code, MCNP5, and are benchmarked from a series of narrow- and broad-beam measurements analyzing the shielding effectiveness of ten common general-purpose construction materials and ten shielding models representing the primary weather barriers (walls and roofs) of likely US housing-stock. Each model was designed to scale based on common residential construction practices and include, to the extent practical, all structurally significant components important for shielding against ionizing radiation. Calculations were performed for floor-specific locations from contaminant deposition on the roof and surrounding ground as well as for computing a weighted-average representative building shielding factor for single- and multi-story detached homes, both with and without basement as well for single-wide manufactured housing-unit. © 2017 IOP Publishing Ltd.
Contaminant deposition building shielding factors for US residential structures.
Dickson, E D; Hamby, D M; Eckerman, K F
2015-06-01
This paper presents validated building shielding factors designed for contemporary US housing-stock under an idealized, yet realistic, exposure scenario from contaminant deposition on the roof and surrounding surfaces. The building shielding factors are intended for use in emergency planning and level three probabilistic risk assessments for a variety of postulated radiological events in which a realistic assessment is necessary to better understand the potential risks for accident mitigation and emergency response planning. Factors are calculated from detailed computational housing-units models using the general-purpose Monte Carlo N-Particle computational code, MCNP5, and are benchmarked from a series of narrow- and broad-beam measurements analyzing the shielding effectiveness of ten common general-purpose construction materials and ten shielding models representing the primary weather barriers (walls and roofs) of likely US housing-stock. Each model was designed to scale based on common residential construction practices and include, to the extent practical, all structurally significant components important for shielding against ionizing radiation. Calculations were performed for floor-specific locations from contaminant deposition on the roof and surrounding ground as well as for computing a weighted-average representative building shielding factor for single- and multi-story detached homes, both with and without basement as well for single-wide manufactured housing-unit.
A Bayesian Approach to Interactive Retrieval
ERIC Educational Resources Information Center
Tague, Jean M.
1973-01-01
A probabilistic model for interactive retrieval is presented. Bayesian statistical decision theory principles are applied: use of prior and sample information about the relationship of document descriptions to query relevance; maximization of expected value of a utility function, to the problem of optimally restructuring search strategies in an…
Use of adjoint methods in the probabilistic finite element approach to fracture mechanics
NASA Technical Reports Server (NTRS)
Liu, Wing Kam; Besterfield, Glen; Lawrence, Mark; Belytschko, Ted
1988-01-01
The adjoint method approach to probabilistic finite element methods (PFEM) is presented. When the number of objective functions is small compared to the number of random variables, the adjoint method is far superior to the direct method in evaluating the objective function derivatives with respect to the random variables. The PFEM is extended to probabilistic fracture mechanics (PFM) using an element which has the near crack-tip singular strain field embedded. Since only two objective functions (i.e., mode I and II stress intensity factors) are needed for PFM, the adjoint method is well suited.
MASTODON: A geosciences simulation tool built using the open-source framework MOOSE
NASA Astrophysics Data System (ADS)
Slaughter, A.
2017-12-01
The Department of Energy (DOE) is currently investing millions of dollars annually into various modeling and simulation tools for all aspects of nuclear energy. An important part of this effort includes developing applications based on the open-source Multiphysics Object Oriented Simulation Environment (MOOSE; mooseframework.org) from Idaho National Laboratory (INL).Thanks to the efforts of the DOE and outside collaborators, MOOSE currently contains a large set of physics modules, including phase field, level set, heat conduction, tensor mechanics, Navier-Stokes, fracture (extended finite-element method), and porous media, among others. The tensor mechanics and contact modules, in particular, are well suited for nonlinear geosciences problems. Multi-hazard Analysis for STOchastic time-DOmaiN phenomena (MASTODON; https://seismic-research.inl.gov/SitePages/Mastodon.aspx)--a MOOSE-based application--is capable of analyzing the response of 3D soil-structure systems to external hazards with current development focused on earthquakes. It is capable of simulating seismic events and can perform extensive "source-to-site" simulations including earthquake fault rupture, nonlinear wave propagation, and nonlinear soil-structure interaction analysis. MASTODON also includes a dynamic probabilistic risk assessment capability that enables analysts to not only perform deterministic analyses, but also easily perform probabilistic or stochastic simulations for the purpose of risk assessment. Although MASTODON has been developed for the nuclear industry, it can be used to assess the risk for any structure subjected to earthquakes.The geosciences community can learn from the nuclear industry and harness the enormous effort underway to build simulation tools that are open, modular, and share a common framework. In particular, MOOSE-based multiphysics solvers are inherently parallel, dimension agnostic, adaptive in time and space, fully coupled, and capable of interacting with other applications. The geosciences community could benefit from existing tools by enabling collaboration between researchers and practitioners throughout the world and advance the state-of-the-art in line with other scientific research efforts.
NASA Astrophysics Data System (ADS)
Olivia, G.; Santoso, A.; Prayogo, D. N.
2017-11-01
Nowadays, the level of competition between supply chains is getting tighter and a good coordination system between supply chains members is very crucial in solving the issue. This paper focused on a model development of coordination system between single supplier and buyers in a supply chain as a solution. Proposed optimization model was designed to determine the optimal number of deliveries from a supplier to buyers in order to minimize the total cost over a planning horizon. Components of the total supply chain cost consist of transportation costs, handling costs of supplier and buyers and also stock out costs. In the proposed optimization model, the supplier can supply various types of items to retailers whose item demand patterns are probabilistic. Sensitivity analysis of the proposed model was conducted to test the effect of changes in transport costs, handling costs and production capacities of the supplier. The results of the sensitivity analysis showed a significant influence on the changes in the transportation cost, handling costs and production capacity to the decisions of the optimal numbers of product delivery for each item to the buyers.
NASA Astrophysics Data System (ADS)
González, F. I.; Leveque, R. J.; Hatheway, D.; Metzger, N.
2011-12-01
Risk is defined in many ways, but most are consistent with Crichton's [1999] definition based on the ''risk triangle'' concept and the explicit identification of three risk elements: ''Risk is the probability of a loss, and this depends on three elements: hazard, vulnerability, and exposure. If any of these three elements in risk increases or decreases, then the risk increases or decreases respectively." The World Meteorological Organization, for example, cites Crichton [1999] and then defines risk as [WMO, 2008] Risk = function (Hazard x Vulnerability x Exposure) while the Asian Disaster Reduction Center adopts the more general expression [ADRC, 2005] Risk = function (Hazard, Vulnerability, Exposure) In practice, probabilistic concepts are invariably invoked, and at least one of the three factors are specified as probabilistic in nature. The Vulnerability and Exposure factors are defined in multiple ways in the relevant literature; but the Hazard factor, which is the focus of our presentation, is generally understood to deal only with the physical aspects of the phenomena and, in particular, the ability of the phenomena to inflict harm [Thywissen, 2006]. A Hazard factor can be estimated by a methodology known as Probabilistic Tsunami Hazard Assessment (PTHA) [González, et al., 2009]. We will describe the PTHA methodology and provide an example -- the results of a previous application to Seaside, OR. We will also present preliminary results for a PTHA of Crescent City, CA -- a pilot project and coastal modeling/mapping effort funded by the Federal Emergency Management Agency (FEMA) Region IX office as part of the new California Coastal Analysis and Mapping Project (CCAMP). CCAMP and the PTHA in Crescent City are being conducted under the nationwide FEMA Risk Mapping, Assessment, and Planning (Risk MAP) Program which focuses on providing communities with flood information and tools they can use to enhance their mitigation plans and better protect their citizens.
Probabilistic Estimates of Global Mean Sea Level and its Underlying Processes
NASA Astrophysics Data System (ADS)
Hay, C.; Morrow, E.; Kopp, R. E.; Mitrovica, J. X.
2015-12-01
Local sea level can vary significantly from the global mean value due to a suite of processes that includes ongoing sea-level changes due to the last ice age, land water storage, ocean circulation changes, and non-uniform sea-level changes that arise when modern-day land ice rapidly melts. Understanding these sources of spatial and temporal variability is critical to estimating past and present sea-level change and projecting future sea-level rise. Using two probabilistic techniques, a multi-model Kalman smoother and Gaussian process regression, we have reanalyzed 20th century tide gauge observations to produce a new estimate of global mean sea level (GMSL). Our methods allow us to extract global information from the sparse tide gauge field by taking advantage of the physics-based and model-derived geometry of the contributing processes. Both methods provide constraints on the sea-level contribution of glacial isostatic adjustment (GIA). The Kalman smoother tests multiple discrete models of glacial isostatic adjustment (GIA), probabilistically computing the most likely GIA model given the observations, while the Gaussian process regression characterizes the prior covariance structure of a suite of GIA models and then uses this structure to estimate the posterior distribution of local rates of GIA-induced sea-level change. We present the two methodologies, the model-derived geometries of the underlying processes, and our new probabilistic estimates of GMSL and GIA.
DCMDN: Deep Convolutional Mixture Density Network
NASA Astrophysics Data System (ADS)
D'Isanto, Antonio; Polsterer, Kai Lars
2017-09-01
Deep Convolutional Mixture Density Network (DCMDN) estimates probabilistic photometric redshift directly from multi-band imaging data by combining a version of a deep convolutional network with a mixture density network. The estimates are expressed as Gaussian mixture models representing the probability density functions (PDFs) in the redshift space. In addition to the traditional scores, the continuous ranked probability score (CRPS) and the probability integral transform (PIT) are applied as performance criteria. DCMDN is able to predict redshift PDFs independently from the type of source, e.g. galaxies, quasars or stars and renders pre-classification of objects and feature extraction unnecessary; the method is extremely general and allows the solving of any kind of probabilistic regression problems based on imaging data, such as estimating metallicity or star formation rate in galaxies.
Perspective: Stochastic magnetic devices for cognitive computing
NASA Astrophysics Data System (ADS)
Roy, Kaushik; Sengupta, Abhronil; Shim, Yong
2018-06-01
Stochastic switching of nanomagnets can potentially enable probabilistic cognitive hardware consisting of noisy neural and synaptic components. Furthermore, computational paradigms inspired from the Ising computing model require stochasticity for achieving near-optimality in solutions to various types of combinatorial optimization problems such as the Graph Coloring Problem or the Travelling Salesman Problem. Achieving optimal solutions in such problems are computationally exhaustive and requires natural annealing to arrive at the near-optimal solutions. Stochastic switching of devices also finds use in applications involving Deep Belief Networks and Bayesian Inference. In this article, we provide a multi-disciplinary perspective across the stack of devices, circuits, and algorithms to illustrate how the stochastic switching dynamics of spintronic devices in the presence of thermal noise can provide a direct mapping to the computational units of such probabilistic intelligent systems.
Use of limited data to construct Bayesian networks for probabilistic risk assessment.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Groth, Katrina M.; Swiler, Laura Painton
2013-03-01
Probabilistic Risk Assessment (PRA) is a fundamental part of safety/quality assurance for nuclear power and nuclear weapons. Traditional PRA very effectively models complex hardware system risks using binary probabilistic models. However, traditional PRA models are not flexible enough to accommodate non-binary soft-causal factors, such as digital instrumentation&control, passive components, aging, common cause failure, and human errors. Bayesian Networks offer the opportunity to incorporate these risks into the PRA framework. This report describes the results of an early career LDRD project titled %E2%80%9CUse of Limited Data to Construct Bayesian Networks for Probabilistic Risk Assessment%E2%80%9D. The goal of the work was tomore » establish the capability to develop Bayesian Networks from sparse data, and to demonstrate this capability by producing a data-informed Bayesian Network for use in Human Reliability Analysis (HRA) as part of nuclear power plant Probabilistic Risk Assessment (PRA). This report summarizes the research goal and major products of the research.« less
NASA Astrophysics Data System (ADS)
Vergara, H. J.; Kirstetter, P.; Gourley, J. J.; Flamig, Z.; Hong, Y.
2015-12-01
The macro scale patterns of simulated streamflow errors are studied in order to characterize uncertainty in a hydrologic modeling system forced with the Multi-Radar/Multi-Sensor (MRMS; http://mrms.ou.edu) quantitative precipitation estimates for flood forecasting over the Conterminous United States (CONUS). The hydrologic model is centerpiece of the Flooded Locations And Simulated Hydrograph (FLASH; http://flash.ou.edu) real-time system. The hydrologic model is implemented at 1-km/5-min resolution to generate estimates of streamflow. Data from the CONUS-wide stream gauge network of the United States' Geological Survey (USGS) were used as a reference to evaluate the discrepancies with the hydrological model predictions. Streamflow errors were studied at the event scale with particular focus on the peak flow magnitude and timing. A total of 2,680 catchments over CONUS and 75,496 events from a 10-year period are used for the simulation diagnostic analysis. Associations between streamflow errors and geophysical factors were explored and modeled. It is found that hydro-climatic factors and radar coverage could explain significant underestimation of peak flow in regions of complex terrain. Furthermore, the statistical modeling of peak flow errors shows that other geophysical factors such as basin geomorphometry, pedology, and land cover/use could also provide explanatory information. Results from this research demonstrate the utility of uncertainty characterization in providing guidance to improve model adequacy, parameter estimates, and input quality control. Likewise, the characterization of uncertainty enables probabilistic flood forecasting that can be extended to ungauged locations.
A risk assessment method for multi-site damage
NASA Astrophysics Data System (ADS)
Millwater, Harry Russell, Jr.
This research focused on developing probabilistic methods suitable for computing small probabilities of failure, e.g., 10sp{-6}, of structures subject to multi-site damage (MSD). MSD is defined as the simultaneous development of fatigue cracks at multiple sites in the same structural element such that the fatigue cracks may coalesce to form one large crack. MSD is modeled as an array of collinear cracks with random initial crack lengths with the centers of the initial cracks spaced uniformly apart. The data used was chosen to be representative of aluminum structures. The structure is considered failed whenever any two adjacent cracks link up. A fatigue computer model is developed that can accurately and efficiently grow a collinear array of arbitrary length cracks from initial size until failure. An algorithm is developed to compute the stress intensity factors of all cracks considering all interaction effects. The probability of failure of two to 100 cracks is studied. Lower bounds on the probability of failure are developed based upon the probability of the largest crack exceeding a critical crack size. The critical crack size is based on the initial crack size that will grow across the ligament when the neighboring crack has zero length. The probability is evaluated using extreme value theory. An upper bound is based on the probability of the maximum sum of initial cracks being greater than a critical crack size. A weakest link sampling approach is developed that can accurately and efficiently compute small probabilities of failure. This methodology is based on predicting the weakest link, i.e., the two cracks to link up first, for a realization of initial crack sizes, and computing the cycles-to-failure using these two cracks. Criteria to determine the weakest link are discussed. Probability results using the weakest link sampling method are compared to Monte Carlo-based benchmark results. The results indicate that very small probabilities can be computed accurately in a few minutes using a Hewlett-Packard workstation.
Slot, Esther M; van Viersen, Sietske; de Bree, Elise H; Kroesbergen, Evelyn H
2016-01-01
High comorbidity rates have been reported between mathematical learning disabilities (MD) and reading and spelling disabilities (RSD). Research has identified skills related to math, such as number sense (NS) and visuospatial working memory (visuospatial WM), as well as to literacy, such as phonological awareness (PA), rapid automatized naming (RAN) and verbal short-term memory (Verbal STM). In order to explain the high comorbidity rates between MD and RSD, 7-11-year-old children were assessed on a range of cognitive abilities related to literacy (PA, RAN, Verbal STM) and mathematical ability (visuospatial WM, NS). The group of children consisted of typically developing (TD) children (n = 32), children with MD (n = 26), children with RSD (n = 29), and combined MD and RSD (n = 43). It was hypothesized that, in line with the multiple deficit view on learning disorders, at least one unique predictor for both MD and RSD and a possible shared cognitive risk factor would be found to account for the comorbidity between the symptom dimensions literacy and math. Secondly, our hypotheses were that (a) a probabilistic multi-factorial risk factor model would provide a better fit to the data than a deterministic single risk factor model and (b) that a shared risk factor model would provide a better fit than the specific multi-factorial model. All our hypotheses were confirmed. NS and visuospatial WM were identified as unique cognitive predictors for MD, whereas PA and RAN were both associated with RSD. Also, a shared risk factor model with PA as a cognitive predictor for both RSD and MD fitted the data best, indicating that MD and RSD might co-occur due to a shared underlying deficit in phonological processing. Possible explanations are discussed in the context of sample selection and composition. This study shows that different cognitive factors play a role in mathematics and literacy, and that a phonological processing deficit might play a role in the occurrence of MD and RSD.
Towards a multilevel cognitive probabilistic representation of space
NASA Astrophysics Data System (ADS)
Tapus, Adriana; Vasudevan, Shrihari; Siegwart, Roland
2005-03-01
This paper addresses the problem of perception and representation of space for a mobile agent. A probabilistic hierarchical framework is suggested as a solution to this problem. The method proposed is a combination of probabilistic belief with "Object Graph Models" (OGM). The world is viewed from a topological optic, in terms of objects and relationships between them. The hierarchical representation that we propose permits an efficient and reliable modeling of the information that the mobile agent would perceive from its environment. The integration of both navigational and interactional capabilities through efficient representation is also addressed. Experiments on a set of images taken from the real world that validate the approach are reported. This framework draws on the general understanding of human cognition and perception and contributes towards the overall efforts to build cognitive robot companions.
Simulation of probabilistic wind loads and building analysis
NASA Technical Reports Server (NTRS)
Shah, Ashwin R.; Chamis, Christos C.
1991-01-01
Probabilistic wind loads likely to occur on a structure during its design life are predicted. Described here is a suitable multifactor interactive equation (MFIE) model and its use in the Composite Load Spectra (CLS) computer program to simulate the wind pressure cumulative distribution functions on four sides of a building. The simulated probabilistic wind pressure load was applied to a building frame, and cumulative distribution functions of sway displacements and reliability against overturning were obtained using NESSUS (Numerical Evaluation of Stochastic Structure Under Stress), a stochastic finite element computer code. The geometry of the building and the properties of building members were also considered as random in the NESSUS analysis. The uncertainties of wind pressure, building geometry, and member section property were qualified in terms of their respective sensitivities on the structural response.
Integrated Campaign Probabilistic Cost, Schedule, Performance, and Value for Program Office Support
NASA Technical Reports Server (NTRS)
Cornelius, David; Sasamoto, Washito; Daugherty, Kevin; Deacon, Shaun
2012-01-01
This paper describes an integrated assessment tool developed at NASA Langley Research Center that incorporates probabilistic analysis of life cycle cost, schedule, launch performance, on-orbit performance, and value across a series of planned space-based missions, or campaign. Originally designed as an aid in planning the execution of missions to accomplish the National Research Council 2007 Earth Science Decadal Survey, it utilizes Monte Carlo simulation of a series of space missions for assessment of resource requirements and expected return on investment. Interactions between simulated missions are incorporated, such as competition for launch site manifest, to capture unexpected and non-linear system behaviors. A novel value model is utilized to provide an assessment of the probabilistic return on investment. A demonstration case is discussed to illustrate the tool utility.
Comparison of multi-subject ICA methods for analysis of fMRI data
Erhardt, Erik Barry; Rachakonda, Srinivas; Bedrick, Edward; Allen, Elena; Adali, Tülay; Calhoun, Vince D.
2010-01-01
Spatial independent component analysis (ICA) applied to functional magnetic resonance imaging (fMRI) data identifies functionally connected networks by estimating spatially independent patterns from their linearly mixed fMRI signals. Several multi-subject ICA approaches estimating subject-specific time courses (TCs) and spatial maps (SMs) have been developed, however there has not yet been a full comparison of the implications of their use. Here, we provide extensive comparisons of four multi-subject ICA approaches in combination with data reduction methods for simulated and fMRI task data. For multi-subject ICA, the data first undergo reduction at the subject and group levels using principal component analysis (PCA). Comparisons of subject-specific, spatial concatenation, and group data mean subject-level reduction strategies using PCA and probabilistic PCA (PPCA) show that computationally intensive PPCA is equivalent to PCA, and that subject-specific and group data mean subject-level PCA are preferred because of well-estimated TCs and SMs. Second, aggregate independent components are estimated using either noise free ICA or probabilistic ICA (PICA). Third, subject-specific SMs and TCs are estimated using back-reconstruction. We compare several direct group ICA (GICA) back-reconstruction approaches (GICA1-GICA3) and an indirect back-reconstruction approach, spatio-temporal regression (STR, or dual regression). Results show the earlier group ICA (GICA1) approximates STR, however STR has contradictory assumptions and may show mixed-component artifacts in estimated SMs. Our evidence-based recommendation is to use GICA3, introduced here, with subject-specific PCA and noise-free ICA, providing the most robust and accurate estimated SMs and TCs in addition to offering an intuitive interpretation. PMID:21162045
Multivariate decoding of brain images using ordinal regression.
Doyle, O M; Ashburner, J; Zelaya, F O; Williams, S C R; Mehta, M A; Marquand, A F
2013-11-01
Neuroimaging data are increasingly being used to predict potential outcomes or groupings, such as clinical severity, drug dose response, and transitional illness states. In these examples, the variable (target) we want to predict is ordinal in nature. Conventional classification schemes assume that the targets are nominal and hence ignore their ranked nature, whereas parametric and/or non-parametric regression models enforce a metric notion of distance between classes. Here, we propose a novel, alternative multivariate approach that overcomes these limitations - whole brain probabilistic ordinal regression using a Gaussian process framework. We applied this technique to two data sets of pharmacological neuroimaging data from healthy volunteers. The first study was designed to investigate the effect of ketamine on brain activity and its subsequent modulation with two compounds - lamotrigine and risperidone. The second study investigates the effect of scopolamine on cerebral blood flow and its modulation using donepezil. We compared ordinal regression to multi-class classification schemes and metric regression. Considering the modulation of ketamine with lamotrigine, we found that ordinal regression significantly outperformed multi-class classification and metric regression in terms of accuracy and mean absolute error. However, for risperidone ordinal regression significantly outperformed metric regression but performed similarly to multi-class classification both in terms of accuracy and mean absolute error. For the scopolamine data set, ordinal regression was found to outperform both multi-class and metric regression techniques considering the regional cerebral blood flow in the anterior cingulate cortex. Ordinal regression was thus the only method that performed well in all cases. Our results indicate the potential of an ordinal regression approach for neuroimaging data while providing a fully probabilistic framework with elegant approaches for model selection. Copyright © 2013. Published by Elsevier Inc.
NASA Astrophysics Data System (ADS)
Kim, Ok-Yeon; Kim, Hye-Mi; Lee, Myong-In; Min, Young-Mi
2017-01-01
This study aims at predicting the seasonal number of typhoons (TY) over the western North Pacific with an Asia-Pacific Climate Center (APCC) multi-model ensemble (MME)-based dynamical-statistical hybrid model. The hybrid model uses the statistical relationship between the number of TY during the typhoon season (July-October) and the large-scale key predictors forecasted by APCC MME for the same season. The cross validation result from the MME hybrid model demonstrates high prediction skill, with a correlation of 0.67 between the hindcasts and observation for 1982-2008. The cross validation from the hybrid model with individual models participating in MME indicates that there is no single model which consistently outperforms the other models in predicting typhoon number. Although the forecast skill of MME is not always the highest compared to that of each individual model, the skill of MME presents rather higher averaged correlations and small variance of correlations. Given large set of ensemble members from multi-models, a relative operating characteristic score reveals an 82 % (above-) and 78 % (below-normal) improvement for the probabilistic prediction of the number of TY. It implies that there is 82 % (78 %) probability that the forecasts can successfully discriminate between above normal (below-normal) from other years. The forecast skill of the hybrid model for the past 7 years (2002-2008) is more skillful than the forecast from the Tropical Storm Risk consortium. Using large set of ensemble members from multi-models, the APCC MME could provide useful deterministic and probabilistic seasonal typhoon forecasts to the end-users in particular, the residents of tropical cyclone-prone areas in the Asia-Pacific region.
Probabilistic grammatical model for helix‐helix contact site classification
2013-01-01
Background Hidden Markov Models power many state‐of‐the‐art tools in the field of protein bioinformatics. While excelling in their tasks, these methods of protein analysis do not convey directly information on medium‐ and long‐range residue‐residue interactions. This requires an expressive power of at least context‐free grammars. However, application of more powerful grammar formalisms to protein analysis has been surprisingly limited. Results In this work, we present a probabilistic grammatical framework for problem‐specific protein languages and apply it to classification of transmembrane helix‐helix pairs configurations. The core of the model consists of a probabilistic context‐free grammar, automatically inferred by a genetic algorithm from only a generic set of expert‐based rules and positive training samples. The model was applied to produce sequence based descriptors of four classes of transmembrane helix‐helix contact site configurations. The highest performance of the classifiers reached AUCROC of 0.70. The analysis of grammar parse trees revealed the ability of representing structural features of helix‐helix contact sites. Conclusions We demonstrated that our probabilistic context‐free framework for analysis of protein sequences outperforms the state of the art in the task of helix‐helix contact site classification. However, this is achieved without necessarily requiring modeling long range dependencies between interacting residues. A significant feature of our approach is that grammar rules and parse trees are human‐readable. Thus they could provide biologically meaningful information for molecular biologists. PMID:24350601
Predicting Coupled Ocean-Atmosphere Modes with a Climate Modeling Hierarchy -- Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Michael Ghil, UCLA; Andrew W. Robertson, IRI, Columbia Univ.; Sergey Kravtsov, U. of Wisconsin, Milwaukee
The goal of the project was to determine midlatitude climate predictability associated with tropical-extratropical interactions on interannual-to-interdecadal time scales. Our strategy was to develop and test a hierarchy of climate models, bringing together large GCM-based climate models with simple fluid-dynamical coupled ocean-ice-atmosphere models, through the use of advanced probabilistic network (PN) models. PN models were used to develop a new diagnostic methodology for analyzing coupled ocean-atmosphere interactions in large climate simulations made with the NCAR Parallel Climate Model (PCM), and to make these tools user-friendly and available to other researchers. We focused on interactions between the tropics and extratropics throughmore » atmospheric teleconnections (the Hadley cell, Rossby waves and nonlinear circulation regimes) over both the North Atlantic and North Pacific, and the ocean’s thermohaline circulation (THC) in the Atlantic. We tested the hypothesis that variations in the strength of the THC alter sea surface temperatures in the tropical Atlantic, and that the latter influence the atmosphere in high latitudes through an atmospheric teleconnection, feeding back onto the THC. The PN model framework was used to mediate between the understanding gained with simplified primitive equations models and multi-century simulations made with the PCM. The project team is interdisciplinary and built on an existing synergy between atmospheric and ocean scientists at UCLA, computer scientists at UCI, and climate researchers at the IRI.« less
Simulation of an ensemble of future climate time series with an hourly weather generator
NASA Astrophysics Data System (ADS)
Caporali, E.; Fatichi, S.; Ivanov, V. Y.; Kim, J.
2010-12-01
There is evidence that climate change is occurring in many regions of the world. The necessity of climate change predictions at the local scale and fine temporal resolution is thus warranted for hydrological, ecological, geomorphological, and agricultural applications that can provide thematic insights into the corresponding impacts. Numerous downscaling techniques have been proposed to bridge the gap between the spatial scales adopted in General Circulation Models (GCM) and regional analyses. Nevertheless, the time and spatial resolutions obtained as well as the type of meteorological variables may not be sufficient for detailed studies of climate change effects at the local scales. In this context, this study presents a stochastic downscaling technique that makes use of an hourly weather generator to simulate time series of predicted future climate. Using a Bayesian approach, the downscaling procedure derives distributions of factors of change for several climate statistics from a multi-model ensemble of GCMs. Factors of change are sampled from their distributions using a Monte Carlo technique to entirely account for the probabilistic information obtained with the Bayesian multi-model ensemble. Factors of change are subsequently applied to the statistics derived from observations to re-evaluate the parameters of the weather generator. The weather generator can reproduce a wide set of climate variables and statistics over a range of temporal scales, from extremes, to the low-frequency inter-annual variability. The final result of such a procedure is the generation of an ensemble of hourly time series of meteorological variables that can be considered as representative of future climate, as inferred from GCMs. The generated ensemble of scenarios also accounts for the uncertainty derived from multiple GCMs used in downscaling. Applications of the procedure in reproducing present and future climates are presented for different locations world-wide: Tucson (AZ), Detroit (MI), and Firenze (Italy). The stochastic downscaling is carried out with eight GCMs from the CMIP3 multi-model dataset (IPCC 4AR, A1B scenario).
NASA Astrophysics Data System (ADS)
Lowe, R.; Ballester, J.; Robine, J.; Herrmann, F. R.; Jupp, T. E.; Stephenson, D.; Rodó, X.
2013-12-01
Users of climate information often require probabilistic information on which to base their decisions. However, communicating information contained within a probabilistic forecast presents a challenge. In this paper we demonstrate a novel visualisation technique to display ternary probabilistic forecasts on a map in order to inform decision making. In this method, ternary probabilistic forecasts, which assign probabilities to a set of three outcomes (e.g. low, medium, and high risk), are considered as a point in a triangle of barycentric coordinates. This allows a unique colour to be assigned to each forecast from a continuum of colours defined on the triangle. Colour saturation increases with information gain relative to the reference forecast (i.e. the long term average). This provides additional information to decision makers compared with conventional methods used in seasonal climate forecasting, where one colour is used to represent one forecast category on a forecast map (e.g. red = ';dry'). We use the tool to present climate-related mortality projections across Europe. Temperature and humidity are related to human mortality via location-specific transfer functions, calculated using historical data. Daily mortality data at the NUTS2 level for 16 countries in Europe were obtain from 1998-2005. Transfer functions were calculated for 54 aggregations in Europe, defined using criteria related to population and climatological similarities. Aggregations are restricted to fall within political boundaries to avoid problems related to varying adaptation policies between countries. A statistical model is fit to cold and warm tails to estimate future mortality using forecast temperatures, in a Bayesian probabilistic framework. Using predefined categories of temperature-related mortality risk, we present maps of probabilistic projections for human mortality at seasonal to decadal time scales. We demonstrate the information gained from using this technique compared to more traditional methods to display ternary probabilistic forecasts. This technique allows decision makers to identify areas where the model predicts with certainty area-specific heat waves or cold snaps, in order to effectively target resources to those areas most at risk, for a given season or year. It is hoped that this visualisation tool will facilitate the interpretation of the probabilistic forecasts not only for public health decision makers but also within a multi-sectoral climate service framework.
Zero-Determinant Strategies in Iterated Public Goods Game
Pan, Liming; Hao, Dong; Rong, Zhihai; Zhou, Tao
2015-01-01
Recently, Press and Dyson have proposed a new class of probabilistic and conditional strategies for the two-player iterated Prisoner’s Dilemma, so-called zero-determinant strategies. A player adopting zero-determinant strategies is able to pin the expected payoff of the opponents or to enforce a linear relationship between his own payoff and the opponents’ payoff, in a unilateral way. This paper considers zero-determinant strategies in the iterated public goods game, a representative multi-player game where in each round each player will choose whether or not to put his tokens into a public pot, and the tokens in this pot are multiplied by a factor larger than one and then evenly divided among all players. The analytical and numerical results exhibit a similar yet different scenario to the case of two-player games: (i) with small number of players or a small multiplication factor, a player is able to unilaterally pin the expected total payoff of all other players; (ii) a player is able to set the ratio between his payoff and the total payoff of all other players, but this ratio is limited by an upper bound if the multiplication factor exceeds a threshold that depends on the number of players. PMID:26293589
Mihaljević, Bojan; Bielza, Concha; Benavides-Piccione, Ruth; DeFelipe, Javier; Larrañaga, Pedro
2014-01-01
Interneuron classification is an important and long-debated topic in neuroscience. A recent study provided a data set of digitally reconstructed interneurons classified by 42 leading neuroscientists according to a pragmatic classification scheme composed of five categorical variables, namely, of the interneuron type and four features of axonal morphology. From this data set we now learned a model which can classify interneurons, on the basis of their axonal morphometric parameters, into these five descriptive variables simultaneously. Because of differences in opinion among the neuroscientists, especially regarding neuronal type, for many interneurons we lacked a unique, agreed-upon classification, which we could use to guide model learning. Instead, we guided model learning with a probability distribution over the neuronal type and the axonal features, obtained, for each interneuron, from the neuroscientists' classification choices. We conveniently encoded such probability distributions with Bayesian networks, calling them label Bayesian networks (LBNs), and developed a method to predict them. This method predicts an LBN by forming a probabilistic consensus among the LBNs of the interneurons most similar to the one being classified. We used 18 axonal morphometric parameters as predictor variables, 13 of which we introduce in this paper as quantitative counterparts to the categorical axonal features. We were able to accurately predict interneuronal LBNs. Furthermore, when extracting crisp (i.e., non-probabilistic) predictions from the predicted LBNs, our method outperformed related work on interneuron classification. Our results indicate that our method is adequate for multi-dimensional classification of interneurons with probabilistic labels. Moreover, the introduced morphometric parameters are good predictors of interneuron type and the four features of axonal morphology and thus may serve as objective counterparts to the subjective, categorical axonal features.
Multi-model ensemble hydrologic prediction using Bayesian model averaging
NASA Astrophysics Data System (ADS)
Duan, Qingyun; Ajami, Newsha K.; Gao, Xiaogang; Sorooshian, Soroosh
2007-05-01
Multi-model ensemble strategy is a means to exploit the diversity of skillful predictions from different models. This paper studies the use of Bayesian model averaging (BMA) scheme to develop more skillful and reliable probabilistic hydrologic predictions from multiple competing predictions made by several hydrologic models. BMA is a statistical procedure that infers consensus predictions by weighing individual predictions based on their probabilistic likelihood measures, with the better performing predictions receiving higher weights than the worse performing ones. Furthermore, BMA provides a more reliable description of the total predictive uncertainty than the original ensemble, leading to a sharper and better calibrated probability density function (PDF) for the probabilistic predictions. In this study, a nine-member ensemble of hydrologic predictions was used to test and evaluate the BMA scheme. This ensemble was generated by calibrating three different hydrologic models using three distinct objective functions. These objective functions were chosen in a way that forces the models to capture certain aspects of the hydrograph well (e.g., peaks, mid-flows and low flows). Two sets of numerical experiments were carried out on three test basins in the US to explore the best way of using the BMA scheme. In the first set, a single set of BMA weights was computed to obtain BMA predictions, while the second set employed multiple sets of weights, with distinct sets corresponding to different flow intervals. In both sets, the streamflow values were transformed using Box-Cox transformation to ensure that the probability distribution of the prediction errors is approximately Gaussian. A split sample approach was used to obtain and validate the BMA predictions. The test results showed that BMA scheme has the advantage of generating more skillful and equally reliable probabilistic predictions than original ensemble. The performance of the expected BMA predictions in terms of daily root mean square error (DRMS) and daily absolute mean error (DABS) is generally superior to that of the best individual predictions. Furthermore, the BMA predictions employing multiple sets of weights are generally better than those using single set of weights.
Guilé, Jean Marc
2013-01-01
Homeostasis is not a permanent and stable state but instead results from conflicting forces. Therefore, infants have to engage in dynamic exchanges with their environment, in biological, cognitive, and affective domains. Empathy is an adaptive response to these environmental challenges, which contributes to reaching proper dynamic homeostasis and development. Empathy relies on implicit interactive processes, namely probabilistic perception and synchrony, which will be reviewed in the article. If typically-developed neonates are fully equipped to automatically and synchronously interact with their human environment, conduct disorders (CD) and autism spectrum disorders (ASD) present with impairments in empathetic communication, e.g., emotional arousal and facial emotion processing. In addition sensorimotor resonance is lacking in ASD, and emotional concern and semantic empathy are impaired in CD with Callous-Unemotional traits. PMID:24479115
NASA Astrophysics Data System (ADS)
Agus, M.; Penna, M. P.; Peró-Cebollero, M.; Guàrdia-Olmos, J.
2015-02-01
Numerous studies have examined students' difficulties in understanding some notions related to statistical problems. Some authors observed that the presentation of distinct visual representations could increase statistical reasoning, supporting the principle of graphical facilitation. But other researchers disagree with this viewpoint, emphasising the impediments related to the use of illustrations that could overcharge the cognitive system with insignificant data. In this work we aim at comparing the probabilistic statistical reasoning regarding two different formats of problem presentations: graphical and verbal-numerical. We have conceived and presented five pairs of homologous simple problems in the verbal numerical and graphical format to 311 undergraduate Psychology students (n=156 in Italy and n=155 in Spain) without statistical expertise. The purpose of our work was to evaluate the effect of graphical facilitation in probabilistic statistical reasoning. Every undergraduate has solved each pair of problems in two formats in different problem presentation orders and sequences. Data analyses have highlighted that the effect of graphical facilitation is infrequent in psychology undergraduates. This effect is related to many factors (as knowledge, abilities, attitudes, and anxiety); moreover it might be considered the resultant of interaction between individual and task characteristics.
Fast, Nonlinear, Fully Probabilistic Inversion of Large Geophysical Problems
NASA Astrophysics Data System (ADS)
Curtis, A.; Shahraeeni, M.; Trampert, J.; Meier, U.; Cho, G.
2010-12-01
Almost all Geophysical inverse problems are in reality nonlinear. Fully nonlinear inversion including non-approximated physics, and solving for probability distribution functions (pdf’s) that describe the solution uncertainty, generally requires sampling-based Monte-Carlo style methods that are computationally intractable in most large problems. In order to solve such problems, physical relationships are usually linearized leading to efficiently-solved, (possibly iterated) linear inverse problems. However, it is well known that linearization can lead to erroneous solutions, and in particular to overly optimistic uncertainty estimates. What is needed across many Geophysical disciplines is a method to invert large inverse problems (or potentially tens of thousands of small inverse problems) fully probabilistically and without linearization. This talk shows how very large nonlinear inverse problems can be solved fully probabilistically and incorporating any available prior information using mixture density networks (driven by neural network banks), provided the problem can be decomposed into many small inverse problems. In this talk I will explain the methodology, compare multi-dimensional pdf inversion results to full Monte Carlo solutions, and illustrate the method with two applications: first, inverting surface wave group and phase velocities for a fully-probabilistic global tomography model of the Earth’s crust and mantle, and second inverting industrial 3D seismic data for petrophysical properties throughout and around a subsurface hydrocarbon reservoir. The latter problem is typically decomposed into 104 to 105 individual inverse problems, each solved fully probabilistically and without linearization. The results in both cases are sufficiently close to the Monte Carlo solution to exhibit realistic uncertainty, multimodality and bias. This provides far greater confidence in the results, and in decisions made on their basis.
Inference on the Strength of Balancing Selection for Epistatically Interacting Loci
Buzbas, Erkan Ozge; Joyce, Paul; Rosenberg, Noah A.
2011-01-01
Existing inference methods for estimating the strength of balancing selection in multi-locus genotypes rely on the assumption that there are no epistatic interactions between loci. Complex systems in which balancing selection is prevalent, such as sets of human immune system genes, are known to contain components that interact epistatically. Therefore, current methods may not produce reliable inference on the strength of selection at these loci. In this paper, we address this problem by presenting statistical methods that can account for epistatic interactions in making inference about balancing selection. A theoretical result due to Fearnhead (2006) is used to build a multi-locus Wright-Fisher model of balancing selection, allowing for epistatic interactions among loci. Antagonistic and synergistic types of interactions are examined. The joint posterior distribution of the selection and mutation parameters is sampled by Markov chain Monte Carlo methods, and the plausibility of models is assessed via Bayes factors. As a component of the inference process, an algorithm to generate multi-locus allele frequencies under balancing selection models with epistasis is also presented. Recent evidence on interactions among a set of human immune system genes is introduced as a motivating biological system for the epistatic model, and data on these genes are used to demonstrate the methods. PMID:21277883
Long Island Sound Tropospheric Ozone Study (LISTOS) Fact Sheet
EPA scientists are collaborating on a multi-agency field study to investigate the complex interaction of emissions, chemistry and meteorological factors contributing to elevated ozone levels along the Long Island Sound shoreline.
Relative Gains, Losses, and Reference Points in Probabilistic Choice in Rats
Marshall, Andrew T.; Kirkpatrick, Kimberly
2015-01-01
Theoretical reference points have been proposed to differentiate probabilistic gains from probabilistic losses in humans, but such a phenomenon in non-human animals has yet to be thoroughly elucidated. Three experiments evaluated the effect of reward magnitude on probabilistic choice in rats, seeking to determine reference point use by examining the effect of previous outcome magnitude(s) on subsequent choice behavior. Rats were trained to choose between an outcome that always delivered reward (low-uncertainty choice) and one that probabilistically delivered reward (high-uncertainty). The probability of high-uncertainty outcome receipt and the magnitudes of low-uncertainty and high-uncertainty outcomes were manipulated within and between experiments. Both the low- and high-uncertainty outcomes involved variable reward magnitudes, so that either a smaller or larger magnitude was probabilistically delivered, as well as reward omission following high-uncertainty choices. In Experiments 1 and 2, the between groups factor was the magnitude of the high-uncertainty-smaller (H-S) and high-uncertainty-larger (H-L) outcome, respectively. The H-S magnitude manipulation differentiated the groups, while the H-L magnitude manipulation did not. Experiment 3 showed that manipulating the probability of differential losses as well as the expected value of the low-uncertainty choice produced systematic effects on choice behavior. The results suggest that the reference point for probabilistic gains and losses was the expected value of the low-uncertainty choice. Current theories of probabilistic choice behavior have difficulty accounting for the present results, so an integrated theoretical framework is proposed. Overall, the present results have implications for understanding individual differences and corresponding underlying mechanisms of probabilistic choice behavior. PMID:25658448
Probabilistic soil erosion modeling using the Erosion Risk Management Tool (ERMIT) after wildfires
P. R. Robichaud; W. J. Elliot; J. W. Wagenbrenner
2011-01-01
The decision of whether or not to apply post-fire hillslope erosion mitigation treatments, and if so, where these treatments are most needed, is a multi-step process. Land managers must assess the risk of damaging runoff and sediment delivery events occurring on the unrecovered burned hillslope. We developed the Erosion Risk Management Tool (ERMiT) to address this need...
An All-Fragments Grammar for Simple and Accurate Parsing
2012-03-21
Tsujii. Probabilistic CFG with latent annotations. In Proceedings of ACL, 2005. Slav Petrov and Dan Klein. Improved Inference for Unlexicalized Parsing. In...Proceedings of NAACL-HLT, 2007. Slav Petrov and Dan Klein. Sparse Multi-Scale Grammars for Discriminative Latent Variable Parsing. In Proceedings of...EMNLP, 2008. Slav Petrov, Leon Barrett, Romain Thibaux, and Dan Klein. Learning Accurate, Compact, and Interpretable Tree Annotation. In Proceedings
Probabilistic generation of random networks taking into account information on motifs occurrence.
Bois, Frederic Y; Gayraud, Ghislaine
2015-01-01
Because of the huge number of graphs possible even with a small number of nodes, inference on network structure is known to be a challenging problem. Generating large random directed graphs with prescribed probabilities of occurrences of some meaningful patterns (motifs) is also difficult. We show how to generate such random graphs according to a formal probabilistic representation, using fast Markov chain Monte Carlo methods to sample them. As an illustration, we generate realistic graphs with several hundred nodes mimicking a gene transcription interaction network in Escherichia coli.
Probabilistic Generation of Random Networks Taking into Account Information on Motifs Occurrence
Bois, Frederic Y.
2015-01-01
Abstract Because of the huge number of graphs possible even with a small number of nodes, inference on network structure is known to be a challenging problem. Generating large random directed graphs with prescribed probabilities of occurrences of some meaningful patterns (motifs) is also difficult. We show how to generate such random graphs according to a formal probabilistic representation, using fast Markov chain Monte Carlo methods to sample them. As an illustration, we generate realistic graphs with several hundred nodes mimicking a gene transcription interaction network in Escherichia coli. PMID:25493547
2011-11-01
assessment to quality of localization/characterization estimates. This protocol includes four critical components: (1) a procedure to identify the...critical factors impacting SHM system performance; (2) a multistage or hierarchical approach to SHM system validation; (3) a model -assisted evaluation...Lindgren, E. A ., Buynak, C. F., Steffes, G., Derriso, M., “ Model -assisted Probabilistic Reliability Assessment for Structural Health Monitoring
Managing Space Radiation Risks on Lunar and Mars Missions: Risk Assessment and Mitigation
NASA Technical Reports Server (NTRS)
Cucinotta, F. A.; George, K.; Hu, X.; Kim, M. H.; Nikjoo, H.
2006-01-01
Radiation-induced health risks are a primary concern for human exploration outside the Earth's magnetosphere, and require improved approaches to risk estimation and tools for mitigation including shielding and biological countermeasures. Solar proton events are the major concern for short-term lunar missions (<60 d), and for long-term missions (>60 d) such as Mars exploration, the exposures to the high energy and charge (HZE) ions that make-up the galactic cosmic rays are the major concern. Health risks from radiation exposure are chronic risks including carcinogenesis and degenerative tissue risks, central nervous system effects, and acute risk such as radiation sickness or early lethality. The current estimate is that a more than four-fold uncertainty exists in the projection of lifetime mortality risk from cosmic rays, which severely limits analysis of possible benefits of shielding or biological countermeasure designs. Uncertainties in risk projections are largely due to insufficient knowledge of HZE ion radiobiology, which has led NASA to develop a unique probabilistic approach to radiation protection. We review NASA's approach to radiation risk assessment including its impact on astronaut dose limits and application of the ALARA (As Low as Reasonably Achievable) principle. The recently opened NASA Space Radiation Laboratory (NSRL) provides the capability to simulate the cosmic rays in controlled ground-based experiments with biological and shielding models. We discuss how research at NSRL will lead to reductions in the uncertainties in risk projection models. In developing mission designs, the reduction of health risks and mission constraints including costs are competing concerns that need to be addressed through optimization procedures. Mitigating the risks from space radiation is a multi-factorial problem involving individual factors (age, gender, genetic makeup, and exposure history), operational factors (planetary destination, mission length, and period in the solar cycle), and shielding characteristics (materials, mass, and topology). We review optimization metrics for radiation protection including scenarios that integrate biophysics models of radiation risks, operational variables, and shielding design tools needed to assess exploration mission designs. We discuss the application of a crosscutting metric, based on probabilistic risk assessment, to lunar and Mars mission trade studies including the assessment of multi-factorial problems and the potential benefits of new radiation health research strategies or mitigation technologies.
Managing Space Radiation Risks On Lunar and Mars Missions: Risk Assessment and Mitigation
NASA Technical Reports Server (NTRS)
Cucinotta, F. A.; George, K.; Hu, X.; Kim, M. H.; Nikjoo, H.
2005-01-01
Radiation-induced health risks are a primary concern for human exploration outside the Earth's magnetosphere, and require improved approaches to risk estimation and tools for mitigation including shielding and biological countermeasures. Solar proton events are the major concern for short-term lunar missions (<60 d), and for long-term missions (>60 d) such as Mars exploration, the exposures to the high energy and charge (HZE) ions that make-up the galactic cosmic rays are the major concern. Health risks from radiation exposure are chronic risks including carcinogenesis and degenerative tissue risks, central nervous system effects, and acute risk such as radiation sickness or early lethality. The current estimate is that a more than four-fold uncertainty exists in the projection of lifetime mortality risk from cosmic rays, which severely limits analysis of possible benefits of shielding or biological countermeasure designs. Uncertainties in risk projections are largely due to insufficient knowledge of HZE ion radiobiology, which has led NASA to develop a unique probabilistic approach to radiation protection. We review NASA's approach to radiation risk assessment including its impact on astronaut dose limits and application of the ALARA (As Low as Reasonably Achievable) principle. The recently opened NASA Space Radiation Laboratory (NSRL) provides the capability to simulate the cosmic rays in controlled ground-based experiments with biological and shielding models. We discuss how research at NSRL will lead to reductions in the uncertainties in risk projection models. In developing mission designs, the reduction of health risks and mission constraints including costs are competing concerns that need to be addressed through optimization procedures. Mitigating the risks from space radiation is a multi-factorial problem involving individual factors (age, gender, genetic makeup, and exposure history), operational factors (planetary destination, mission length, and period in the solar cycle), and shielding characteristics (materials, mass, and topology). We review optimization metrics for radiation protection including scenarios that integrate biophysics models of radiation risks, operational variables, and shielding design tools needed to assess exploration mission designs. We discuss the application of a crosscutting metric, based on probabilistic risk assessment, to lunar and Mars mission trade studies including the assessment of multi-factorial problems and the potential benefits of new radiation health research strategies or mitigation technologies.
Managing Space Radiation Risks on Lunar and Mars Missions: Risk Assessment and Mitigation
NASA Technical Reports Server (NTRS)
Cucinotta, F. A.; George, K.; Hu, X.; Kim, M. H.; Nikjoo, H.; Ponomarev, A.; Ren, L.; Shavers, M. R.; Wu, H.
2005-01-01
Radiation-induced health risks are a primary concern for human exploration outside the Earth's magnetosphere, and require improved approaches to risk estimation and tools for mitigation including shielding and biological countermeasures. Solar proton events are the major concern for short-term lunar missions (<60 d), and for long-term missions (>60 d) such as Mars exploration, the exposures to the high energy and charge (HZE) ions that make-up the galactic cosmic rays are the major concern. Health risks from radiation exposure are chronic risks including carcinogenesis and degenerative tissue risks, central nervous system effects, and acute risk such as radiation sickness or early lethality. The current estimate is that a more than four-fold uncertainty exists in the projection of lifetime mortality risk from cosmic rays, which severely limits analysis of possible benefits of shielding or biological countermeasure designs. Uncertainties in risk projections are largely due to insufficient knowledge of HZE ion radiobiology, which has led NASA to develop a unique probabilistic approach to radiation protection. We review NASA's approach to radiation risk assessment including its impact on astronaut dose limits and application of the ALARA (As Low as Reasonably Achievable) principle. The recently opened NASA Space Radiation Laboratory (NSRL) provides the capability to simulate the cosmic rays in controlled ground-based experiments with biological and shielding models. We discuss how research at NSRL will lead to reductions in the uncertainties in risk projection models. In developing mission designs, the reduction of health risks and mission constraints including costs are competing concerns that need to be addressed through optimization procedures. Mitigating the risks from space radiation is a multi-factorial problem involving individual factors (age, gender, genetic makeup, and exposure history), operational factors (planetary destination, mission length, and period in the solar cycle), and shielding characteristics (materials, mass, and topology). We review optimization metrics for radiation protection including scenarios that integrate biophysics models of radiation risks, operational variables, and shielding design tools needed to assess exploration mission designs. We discuss the application of a crosscutting metric, based on probabilistic risk assessment, to lunar and Mars mission trade studies including the assessment of multi-factorial problems and the potential benefits of new radiation health research strategies or mitigation technologies.
[National Health and Nutrition Survey 2012: design and coverage].
Romero-Martínez, Martín; Shamah-Levy, Teresa; Franco-Núñez, Aurora; Villalpando, Salvador; Cuevas-Nasu, Lucía; Gutiérrez, Juan Pablo; Rivera-Dommarco, Juan Ángel
2013-01-01
To describe the design and population coverage of the National Health and Nutrition Survey 2012 (NHNS 2012). The design of the NHNS 2012 is reported, as a probabilistic population based survey with a multi-stage and stratified sampling, as well as the sample inferential properties, the logistical procedures, and the obtained coverage. Household response rate for the NHNS 2012 was 87%, completing data from 50,528 households, where 96 031 individual interviews selected by age and 14,104 of ambulatory health services users were also obtained. The probabilistic design of the NHNS 2012 as well as its coverage allowed to generate inferences about health and nutrition conditions, health programs coverage, and access to health services. Because of their complex designs, all estimations from the NHNS 2012 must use the survey design: weights, primary sampling units, and stratus variables.
Probabilistic Multi-Sensor Fusion Based Indoor Positioning System on a Mobile Device
He, Xiang; Aloi, Daniel N.; Li, Jia
2015-01-01
Nowadays, smart mobile devices include more and more sensors on board, such as motion sensors (accelerometer, gyroscope, magnetometer), wireless signal strength indicators (WiFi, Bluetooth, Zigbee), and visual sensors (LiDAR, camera). People have developed various indoor positioning techniques based on these sensors. In this paper, the probabilistic fusion of multiple sensors is investigated in a hidden Markov model (HMM) framework for mobile-device user-positioning. We propose a graph structure to store the model constructed by multiple sensors during the offline training phase, and a multimodal particle filter to seamlessly fuse the information during the online tracking phase. Based on our algorithm, we develop an indoor positioning system on the iOS platform. The experiments carried out in a typical indoor environment have shown promising results for our proposed algorithm and system design. PMID:26694387
Probabilistic Multi-Sensor Fusion Based Indoor Positioning System on a Mobile Device.
He, Xiang; Aloi, Daniel N; Li, Jia
2015-12-14
Nowadays, smart mobile devices include more and more sensors on board, such as motion sensors (accelerometer, gyroscope, magnetometer), wireless signal strength indicators (WiFi, Bluetooth, Zigbee), and visual sensors (LiDAR, camera). People have developed various indoor positioning techniques based on these sensors. In this paper, the probabilistic fusion of multiple sensors is investigated in a hidden Markov model (HMM) framework for mobile-device user-positioning. We propose a graph structure to store the model constructed by multiple sensors during the offline training phase, and a multimodal particle filter to seamlessly fuse the information during the online tracking phase. Based on our algorithm, we develop an indoor positioning system on the iOS platform. The experiments carried out in a typical indoor environment have shown promising results for our proposed algorithm and system design.
Wild, Heather M.; Heckemann, Rolf A.; Studholme, Colin
2017-01-01
Accurately describing the anatomy of individual brains enables interlaboratory communication of functional and developmental studies and is crucial for possible surgical interventions. The human parietal lobe participates in multimodal sensory integration including language processing and also contains the primary somatosensory area. We describe detailed protocols to subdivide the parietal lobe, analyze morphological and volumetric characteristics, and create probabilistic atlases in MNI152 stereotaxic space. The parietal lobe was manually delineated on 3D T1 MR images of 30 healthy subjects and divided into four regions: supramarginal gyrus (SMG), angular gyrus (AG), superior parietal lobe (supPL) and postcentral gyrus (postCG). There was the expected correlation of male gender with larger brain and intracranial volume. We examined a wide range of anatomical features of the gyri and the sulci separating them. At least a rudimentary primary intermediate sulcus of Jensen (PISJ) separating SMG and AG was identified in nearly all (59/60) hemispheres. Presence of additional gyri in SMG and AG was related to sulcal features and volumetric characteristics. The parietal lobe was slightly (2%) larger on the left, driven by leftward asymmetries of the postCG and SMG. Intersubject variability was highest for SMG and AG, and lowest for postCG. Overall the morphological characteristics tended to be symmetrical, and volumes also tended to covary between hemispheres. This may reflect developmental as well as maturation factors. To assess the accuracy with which the labels can be used to segment newly acquired (unlabelled) T1-weighted brain images, we applied multi-atlas label propagation software (MAPER) in a leave-one-out experiment and compared the resulting automatic labels with the manually prepared ones. The results showed strong agreement (mean Jaccard index 0.69, corresponding to a mean Dice index of 0.82, average mean volume error of 0.6%). Stereotaxic probabilistic atlases of each subregion were obtained. They illustrate the physiological brain torque, with structures in the right hemisphere positioned more anteriorly than in the left, and right/left positional differences of up to 10 mm. They also allow an assessment of sulcal variability, e.g. low variability for parietooccipital fissure and cingulate sulcus. Illustrated protocols, individual label sets, probabilistic atlases, and a maximum-probability atlas which takes into account surrounding structures are available for free download under academic licences. PMID:28846692
Bayesian network ensemble as a multivariate strategy to predict radiation pneumonitis risk
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, Sangkyu, E-mail: sangkyu.lee@mail.mcgill.ca; Ybarra, Norma; Jeyaseelan, Krishinima
2015-05-15
Purpose: Prediction of radiation pneumonitis (RP) has been shown to be challenging due to the involvement of a variety of factors including dose–volume metrics and radiosensitivity biomarkers. Some of these factors are highly correlated and might affect prediction results when combined. Bayesian network (BN) provides a probabilistic framework to represent variable dependencies in a directed acyclic graph. The aim of this study is to integrate the BN framework and a systems’ biology approach to detect possible interactions among RP risk factors and exploit these relationships to enhance both the understanding and prediction of RP. Methods: The authors studied 54 nonsmall-cellmore » lung cancer patients who received curative 3D-conformal radiotherapy. Nineteen RP events were observed (common toxicity criteria for adverse events grade 2 or higher). Serum concentration of the following four candidate biomarkers were measured at baseline and midtreatment: alpha-2-macroglobulin, angiotensin converting enzyme (ACE), transforming growth factor, interleukin-6. Dose-volumetric and clinical parameters were also included as covariates. Feature selection was performed using a Markov blanket approach based on the Koller–Sahami filter. The Markov chain Monte Carlo technique estimated the posterior distribution of BN graphs built from the observed data of the selected variables and causality constraints. RP probability was estimated using a limited number of high posterior graphs (ensemble) and was averaged for the final RP estimate using Bayes’ rule. A resampling method based on bootstrapping was applied to model training and validation in order to control under- and overfit pitfalls. Results: RP prediction power of the BN ensemble approach reached its optimum at a size of 200. The optimized performance of the BN model recorded an area under the receiver operating characteristic curve (AUC) of 0.83, which was significantly higher than multivariate logistic regression (0.77), mean heart dose (0.69), and a pre-to-midtreatment change in ACE (0.66). When RP prediction was made only with pretreatment information, the AUC ranged from 0.76 to 0.81 depending on the ensemble size. Bootstrap validation of graph features in the ensemble quantified confidence of association between variables in the graphs where ten interactions were statistically significant. Conclusions: The presented BN methodology provides the flexibility to model hierarchical interactions between RP covariates, which is applied to probabilistic inference on RP. The authors’ preliminary results demonstrate that such framework combined with an ensemble method can possibly improve prediction of RP under real-life clinical circumstances such as missing data or treatment plan adaptation.« less
Gene-environment interaction and suicidal behavior.
Roy, Alec; Sarchiopone, Marco; Carli, Vladimir
2009-07-01
Studies have increasingly shown that gene-environment interactions are important in psychiatry. Suicidal behavior is a major public health problem. Suicide is generally considered to be a multi-determined act involving various areas of proximal and distal risk. Genetic risk factors are estimated to account for approximately 30% to 40% of the variance in suicidal behavior. In this article, the authors review relevant studies concerning the interaction between the serotonin transporter gene and environmental variables as a model of gene-environment interactions that may have an impact on suicidal behavior. The findings reviewed here suggest that there may be meaningful interactions between distal and proximal suicide risk factors that may amplify the risk of suicidal behavior. Future studies of suicidal behavior should examine both genetic and environmental variables and examine for gene-environment interactions.
Enhancing Flood Prediction Reliability Using Bayesian Model Averaging
NASA Astrophysics Data System (ADS)
Liu, Z.; Merwade, V.
2017-12-01
Uncertainty analysis is an indispensable part of modeling the hydrology and hydrodynamics of non-idealized environmental systems. Compared to reliance on prediction from one model simulation, using on ensemble of predictions that consider uncertainty from different sources is more reliable. In this study, Bayesian model averaging (BMA) is applied to Black River watershed in Arkansas and Missouri by combining multi-model simulations to get reliable deterministic water stage and probabilistic inundation extent predictions. The simulation ensemble is generated from 81 LISFLOOD-FP subgrid model configurations that include uncertainty from channel shape, channel width, channel roughness and discharge. Model simulation outputs are trained with observed water stage data during one flood event, and BMA prediction ability is validated for another flood event. Results from this study indicate that BMA does not always outperform all members in the ensemble, but it provides relatively robust deterministic flood stage predictions across the basin. Station based BMA (BMA_S) water stage prediction has better performance than global based BMA (BMA_G) prediction which is superior to the ensemble mean prediction. Additionally, high-frequency flood inundation extent (probability greater than 60%) in BMA_G probabilistic map is more accurate than the probabilistic flood inundation extent based on equal weights.
PROTAX-Sound: A probabilistic framework for automated animal sound identification
Somervuo, Panu; Ovaskainen, Otso
2017-01-01
Autonomous audio recording is stimulating new field in bioacoustics, with a great promise for conducting cost-effective species surveys. One major current challenge is the lack of reliable classifiers capable of multi-species identification. We present PROTAX-Sound, a statistical framework to perform probabilistic classification of animal sounds. PROTAX-Sound is based on a multinomial regression model, and it can utilize as predictors any kind of sound features or classifications produced by other existing algorithms. PROTAX-Sound combines audio and image processing techniques to scan environmental audio files. It identifies regions of interest (a segment of the audio file that contains a vocalization to be classified), extracts acoustic features from them and compares with samples in a reference database. The output of PROTAX-Sound is the probabilistic classification of each vocalization, including the possibility that it represents species not present in the reference database. We demonstrate the performance of PROTAX-Sound by classifying audio from a species-rich case study of tropical birds. The best performing classifier achieved 68% classification accuracy for 200 bird species. PROTAX-Sound improves the classification power of current techniques by combining information from multiple classifiers in a manner that yields calibrated classification probabilities. PMID:28863178
PROTAX-Sound: A probabilistic framework for automated animal sound identification.
de Camargo, Ulisses Moliterno; Somervuo, Panu; Ovaskainen, Otso
2017-01-01
Autonomous audio recording is stimulating new field in bioacoustics, with a great promise for conducting cost-effective species surveys. One major current challenge is the lack of reliable classifiers capable of multi-species identification. We present PROTAX-Sound, a statistical framework to perform probabilistic classification of animal sounds. PROTAX-Sound is based on a multinomial regression model, and it can utilize as predictors any kind of sound features or classifications produced by other existing algorithms. PROTAX-Sound combines audio and image processing techniques to scan environmental audio files. It identifies regions of interest (a segment of the audio file that contains a vocalization to be classified), extracts acoustic features from them and compares with samples in a reference database. The output of PROTAX-Sound is the probabilistic classification of each vocalization, including the possibility that it represents species not present in the reference database. We demonstrate the performance of PROTAX-Sound by classifying audio from a species-rich case study of tropical birds. The best performing classifier achieved 68% classification accuracy for 200 bird species. PROTAX-Sound improves the classification power of current techniques by combining information from multiple classifiers in a manner that yields calibrated classification probabilities.
Alternate Methods in Refining the SLS Nozzle Plug Loads
NASA Technical Reports Server (NTRS)
Burbank, Scott; Allen, Andrew
2013-01-01
Numerical analysis has shown that the SLS nozzle environmental barrier (nozzle plug) design is inadequate for the prelaunch condition, which consists of two dominant loads: 1) the main engines startup pressure and 2) an environmentally induced pressure. Efforts to reduce load conservatisms included a dynamic analysis which showed a 31% higher safety factor compared to the standard static analysis. The environmental load is typically approached with a deterministic method using the worst possible combinations of pressures and temperatures. An alternate probabilistic approach, utilizing the distributions of pressures and temperatures, resulted in a 54% reduction in the environmental pressure load. A Monte Carlo simulation of environmental load that used five years of historical pressure and temperature data supported the results of the probabilistic analysis, indicating the probabilistic load is reflective of a 3-sigma condition (1 in 370 probability). Utilizing the probabilistic load analysis eliminated excessive conservatisms and will prevent a future overdesign of the nozzle plug. Employing a similar probabilistic approach to other design and analysis activities can result in realistic yet adequately conservative solutions.
NASA Technical Reports Server (NTRS)
Canfield, R. C.; Ricchiazzi, P. J.
1980-01-01
An approximate probabilistic radiative transfer equation and the statistical equilibrium equations are simultaneously solved for a model hydrogen atom consisting of three bound levels and ionization continuum. The transfer equation for L-alpha, L-beta, H-alpha, and the Lyman continuum is explicitly solved assuming complete redistribution. The accuracy of this approach is tested by comparing source functions and radiative loss rates to values obtained with a method that solves the exact transfer equation. Two recent model solar-flare chromospheres are used for this test. It is shown that for the test atmospheres the probabilistic method gives values of the radiative loss rate that are characteristically good to a factor of 2. The advantage of this probabilistic approach is that it retains a description of the dominant physical processes of radiative transfer in the complete redistribution case, yet it achieves a major reduction in computational requirements.
Heck, Daniel W; Hilbig, Benjamin E; Moshagen, Morten
2017-08-01
Decision strategies explain how people integrate multiple sources of information to make probabilistic inferences. In the past decade, increasingly sophisticated methods have been developed to determine which strategy explains decision behavior best. We extend these efforts to test psychologically more plausible models (i.e., strategies), including a new, probabilistic version of the take-the-best (TTB) heuristic that implements a rank order of error probabilities based on sequential processing. Within a coherent statistical framework, deterministic and probabilistic versions of TTB and other strategies can directly be compared using model selection by minimum description length or the Bayes factor. In an experiment with inferences from given information, only three of 104 participants were best described by the psychologically plausible, probabilistic version of TTB. Similar as in previous studies, most participants were classified as users of weighted-additive, a strategy that integrates all available information and approximates rational decisions. Copyright © 2017 Elsevier Inc. All rights reserved.
Discounting of food, sex, and money.
Holt, Daniel D; Newquist, Matthew H; Smits, Rochelle R; Tiry, Andrew M
2014-06-01
Discounting is a useful framework for understanding choice involving a range of delayed and probabilistic outcomes (e.g., money, food, drugs), but relatively few studies have examined how people discount other commodities (e.g., entertainment, sex). Using a novel discounting task, where the length of a line represented the value of an outcome and was adjusted using a staircase procedure, we replicated previous findings showing that individuals discount delayed and probabilistic outcomes in a manner well described by a hyperbola-like function. In addition, we found strong positive correlations between discounting rates of delayed, but not probabilistic, outcomes. This suggests that discounting of delayed outcomes may be relatively predictable across outcome types but that discounting of probabilistic outcomes may depend more on specific contexts. The generality of delay discounting and potential context dependence of probability discounting may provide important information regarding factors contributing to choice behavior.
Menze, Bjoern H.; Van Leemput, Koen; Lashkari, Danial; Riklin-Raviv, Tammy; Geremia, Ezequiel; Alberts, Esther; Gruber, Philipp; Wegener, Susanne; Weber, Marc-André; Székely, Gabor; Ayache, Nicholas; Golland, Polina
2016-01-01
We introduce a generative probabilistic model for segmentation of brain lesions in multi-dimensional images that generalizes the EM segmenter, a common approach for modelling brain images using Gaussian mixtures and a probabilistic tissue atlas that employs expectation-maximization (EM) to estimate the label map for a new image. Our model augments the probabilistic atlas of the healthy tissues with a latent atlas of the lesion. We derive an estimation algorithm with closed-form EM update equations. The method extracts a latent atlas prior distribution and the lesion posterior distributions jointly from the image data. It delineates lesion areas individually in each channel, allowing for differences in lesion appearance across modalities, an important feature of many brain tumor imaging sequences. We also propose discriminative model extensions to map the output of the generative model to arbitrary labels with semantic and biological meaning, such as “tumor core” or “fluid-filled structure”, but without a one-to-one correspondence to the hypo-or hyper-intense lesion areas identified by the generative model. We test the approach in two image sets: the publicly available BRATS set of glioma patient scans, and multimodal brain images of patients with acute and subacute ischemic stroke. We find the generative model that has been designed for tumor lesions to generalize well to stroke images, and the generative-discriminative model to be one of the top ranking methods in the BRATS evaluation. PMID:26599702
Menze, Bjoern H; Van Leemput, Koen; Lashkari, Danial; Riklin-Raviv, Tammy; Geremia, Ezequiel; Alberts, Esther; Gruber, Philipp; Wegener, Susanne; Weber, Marc-Andre; Szekely, Gabor; Ayache, Nicholas; Golland, Polina
2016-04-01
We introduce a generative probabilistic model for segmentation of brain lesions in multi-dimensional images that generalizes the EM segmenter, a common approach for modelling brain images using Gaussian mixtures and a probabilistic tissue atlas that employs expectation-maximization (EM), to estimate the label map for a new image. Our model augments the probabilistic atlas of the healthy tissues with a latent atlas of the lesion. We derive an estimation algorithm with closed-form EM update equations. The method extracts a latent atlas prior distribution and the lesion posterior distributions jointly from the image data. It delineates lesion areas individually in each channel, allowing for differences in lesion appearance across modalities, an important feature of many brain tumor imaging sequences. We also propose discriminative model extensions to map the output of the generative model to arbitrary labels with semantic and biological meaning, such as "tumor core" or "fluid-filled structure", but without a one-to-one correspondence to the hypo- or hyper-intense lesion areas identified by the generative model. We test the approach in two image sets: the publicly available BRATS set of glioma patient scans, and multimodal brain images of patients with acute and subacute ischemic stroke. We find the generative model that has been designed for tumor lesions to generalize well to stroke images, and the extended discriminative -discriminative model to be one of the top ranking methods in the BRATS evaluation.
Kapo, Katherine E; McDonough, Kathleen; Federle, Thomas; Dyer, Scott; Vamshi, Raghu
2015-06-15
Environmental exposure and associated ecological risk related to down-the-drain chemicals discharged by municipal wastewater treatment plants (WWTPs) are strongly influenced by in-stream dilution of receiving waters which varies by geography, flow conditions and upstream wastewater inputs. The iSTREEM® model (American Cleaning Institute, Washington D.C.) was utilized to determine probabilistic distributions for no decay and decay-based dilution factors in mean annual and low (7Q10) flow conditions. The dilution factors derived in this study are "combined" dilution factors which account for both hydrologic dilution and cumulative upstream effluent contributions that will differ depending on the rate of in-stream decay due to biodegradation, volatilization, sorption, etc. for the chemical being evaluated. The median dilution factors estimated in this study (based on various in-stream decay rates from zero decay to a 1h half-life) for WWTP mixing zones dominated by domestic wastewater flow ranged from 132 to 609 at mean flow and 5 to 25 at low flow, while median dilution factors at drinking water intakes (mean flow) ranged from 146 to 2×10(7) depending on the in-stream decay rate. WWTPs within the iSTREEM® model were used to generate a distribution of per capita wastewater generated in the U.S. The dilution factor and per capita wastewater generation distributions developed by this work can be used to conduct probabilistic exposure assessments for down-the-drain chemicals in influent wastewater, wastewater treatment plant mixing zones and at drinking water intakes in the conterminous U.S. In addition, evaluation of types and abundance of U.S. wastewater treatment processes provided insight into treatment trends and the flow volume treated by each type of process. Moreover, removal efficiencies of chemicals can differ by treatment type. Hence, the availability of distributions for per capita wastewater production, treatment type, and dilution factors at a national level provides a series of practical and powerful tools for building probabilistic exposure models. Copyright © 2015 Elsevier B.V. All rights reserved.
Gürsoy, Gamze; Xu, Yun; Liang, Jie
2017-07-01
Nuclear landmarks and biochemical factors play important roles in the organization of the yeast genome. The interaction pattern of budding yeast as measured from genome-wide 3C studies are largely recapitulated by model polymer genomes subject to landmark constraints. However, the origin of inter-chromosomal interactions, specific roles of individual landmarks, and the roles of biochemical factors in yeast genome organization remain unclear. Here we describe a multi-chromosome constrained self-avoiding chromatin model (mC-SAC) to gain understanding of the budding yeast genome organization. With significantly improved sampling of genome structures, both intra- and inter-chromosomal interaction patterns from genome-wide 3C studies are accurately captured in our model at higher resolution than previous studies. We show that nuclear confinement is a key determinant of the intra-chromosomal interactions, and centromere tethering is responsible for the inter-chromosomal interactions. In addition, important genomic elements such as fragile sites and tRNA genes are found to be clustered spatially, largely due to centromere tethering. We uncovered previously unknown interactions that were not captured by genome-wide 3C studies, which are found to be enriched with tRNA genes, RNAPIII and TFIIS binding. Moreover, we identified specific high-frequency genome-wide 3C interactions that are unaccounted for by polymer effects under landmark constraints. These interactions are enriched with important genes and likely play biological roles.
Ehrenfeld, Stephan; Herbort, Oliver; Butz, Martin V.
2013-01-01
This paper addresses the question of how the brain maintains a probabilistic body state estimate over time from a modeling perspective. The neural Modular Modality Frame (nMMF) model simulates such a body state estimation process by continuously integrating redundant, multimodal body state information sources. The body state estimate itself is distributed over separate, but bidirectionally interacting modules. nMMF compares the incoming sensory and present body state information across the interacting modules and fuses the information sources accordingly. At the same time, nMMF enforces body state estimation consistency across the modules. nMMF is able to detect conflicting sensory information and to consequently decrease the influence of implausible sensor sources on the fly. In contrast to the previously published Modular Modality Frame (MMF) model, nMMF offers a biologically plausible neural implementation based on distributed, probabilistic population codes. Besides its neural plausibility, the neural encoding has the advantage of enabling (a) additional probabilistic information flow across the separate body state estimation modules and (b) the representation of arbitrary probability distributions of a body state. The results show that the neural estimates can detect and decrease the impact of false sensory information, can propagate conflicting information across modules, and can improve overall estimation accuracy due to additional module interactions. Even bodily illusions, such as the rubber hand illusion, can be simulated with nMMF. We conclude with an outlook on the potential of modeling human data and of invoking goal-directed behavioral control. PMID:24191151
An Innovative Multi-Agent Search-and-Rescue Path Planning Approach
2015-03-09
search problems from search theory and artificial intelligence /distributed robotic control, and pursuit-evasion problem perspectives may be found in...Dissanayake, “Probabilistic search for a moving target in an indoor environment”, In Proc. IEEE/RSJ Int. Conf. Intelligent Robots and Systems, 2006, pp...3393-3398. [7] H. Lau, and G. Dissanayake, “Optimal search for multiple targets in a built environment”, In Proc. IEEE/RSJ Int. Conf. Intelligent
Temporally consistent probabilistic detection of new multiple sclerosis lesions in brain MRI.
Elliott, Colm; Arnold, Douglas L; Collins, D Louis; Arbel, Tal
2013-08-01
Detection of new Multiple Sclerosis (MS) lesions on magnetic resonance imaging (MRI) is important as a marker of disease activity and as a potential surrogate for relapses. We propose an approach where sequential scans are jointly segmented, to provide a temporally consistent tissue segmentation while remaining sensitive to newly appearing lesions. The method uses a two-stage classification process: 1) a Bayesian classifier provides a probabilistic brain tissue classification at each voxel of reference and follow-up scans, and 2) a random-forest based lesion-level classification provides a final identification of new lesions. Generative models are learned based on 364 scans from 95 subjects from a multi-center clinical trial. The method is evaluated on sequential brain MRI of 160 subjects from a separate multi-center clinical trial, and is compared to 1) semi-automatically generated ground truth segmentations and 2) fully manual identification of new lesions generated independently by nine expert raters on a subset of 60 subjects. For new lesions greater than 0.15 cc in size, the classifier has near perfect performance (99% sensitivity, 2% false detection rate), as compared to ground truth. The proposed method was also shown to exceed the performance of any one of the nine expert manual identifications.
ERIC Educational Resources Information Center
Hancock-Beaulieu, Micheline; And Others
1995-01-01
An online library catalog was used to evaluate an interactive query expansion facility based on relevance feedback for the Okapi, probabilistic, term weighting, retrieval system. A graphical user interface allowed searchers to select candidate terms extracted from relevant retrieved items to reformulate queries. Results suggested that the…
Probabilistic finite elements for fracture and fatigue analysis
NASA Technical Reports Server (NTRS)
Liu, W. K.; Belytschko, T.; Lawrence, M.; Besterfield, G. H.
1989-01-01
The fusion of the probabilistic finite element method (PFEM) and reliability analysis for probabilistic fracture mechanics (PFM) is presented. A comprehensive method for determining the probability of fatigue failure for curved crack growth was developed. The criterion for failure or performance function is stated as: the fatigue life of a component must exceed the service life of the component; otherwise failure will occur. An enriched element that has the near-crack-tip singular strain field embedded in the element is used to formulate the equilibrium equation and solve for the stress intensity factors at the crack-tip. Performance and accuracy of the method is demonstrated on a classical mode 1 fatigue problem.
Probabilistic assessment of smart composite structures
NASA Technical Reports Server (NTRS)
Chamis, Christos C.; Shiao, Michael C.
1994-01-01
A composite wing with spars and bulkheads is used to demonstrate the effectiveness of probabilistic assessment of smart composite structures to control uncertainties in distortions and stresses. Results show that a smart composite wing can be controlled to minimize distortions and to have specified stress levels in the presence of defects. Structural responses such as changes in angle of attack, vertical displacements, and stress in the control and controlled plies are probabilistically assessed to quantify their respective uncertainties. Sensitivity factors are evaluated to identify those parameters that have the greatest influence on a specific structural response. Results show that smart composite structures can be configured to control both distortions and ply stresses to satisfy specified design requirements.
NASA Astrophysics Data System (ADS)
Wang, S.; Huang, G. H.; Baetz, B. W.; Cai, X. M.; Ancell, B. C.; Fan, Y. R.
2017-11-01
The ensemble Kalman filter (EnKF) is recognized as a powerful data assimilation technique that generates an ensemble of model variables through stochastic perturbations of forcing data and observations. However, relatively little guidance exists with regard to the proper specification of the magnitude of the perturbation and the ensemble size, posing a significant challenge in optimally implementing the EnKF. This paper presents a robust data assimilation system (RDAS), in which a multi-factorial design of the EnKF experiments is first proposed for hydrologic ensemble predictions. A multi-way analysis of variance is then used to examine potential interactions among factors affecting the EnKF experiments, achieving optimality of the RDAS with maximized performance of hydrologic predictions. The RDAS is applied to the Xiangxi River watershed which is the most representative watershed in China's Three Gorges Reservoir region to demonstrate its validity and applicability. Results reveal that the pairwise interaction between perturbed precipitation and streamflow observations has the most significant impact on the performance of the EnKF system, and their interactions vary dynamically across different settings of the ensemble size and the evapotranspiration perturbation. In addition, the interactions among experimental factors vary greatly in magnitude and direction depending on different statistical metrics for model evaluation including the Nash-Sutcliffe efficiency and the Box-Cox transformed root-mean-square error. It is thus necessary to test various evaluation metrics in order to enhance the robustness of hydrologic prediction systems.
Jayaraman, Sudha P; Jiang, Yushan; Resch, Stephen; Askari, Reza; Klompas, Michael
2016-10-01
Interventions to contain two multi-drug-resistant Acinetobacter (MDRA) outbreaks reduced the incidence of multi-drug-resistant (MDR) organisms, specifically methicillin-resistant Staphylococcus aureus, vancomycin-resistant Enterococcus, and Clostridium difficile in the general surgery intensive care unit (ICU) of our hospital. We therefore conducted a cost-effective analysis of a proactive model infection-control program to reduce transmission of MDR organisms based on the practices used to control the MDRA outbreak. We created a model of a proactive infection control program based on the 2011 MDRA outbreak response. We built a decision analysis model and performed univariable and probabilistic sensitivity analyses to evaluate the cost-effectiveness of the proposed program compared with standard infection control practices to reduce transmission of these MDR organisms. The cost of a proactive infection control program would be $68,509 per year. The incremental cost-effectiveness ratio (ICER) was calculated to be $3,804 per aversion of transmission of MDR organisms in a one-year period compared with standard infection control. On the basis of probabilistic sensitivity analysis, a willingness-to-pay (WTP) threshold of $14,000 per transmission averted would have a 42% probability of being cost-effective, rising to 100% at $22,000 per transmission averted. This analysis gives an estimated ICER for implementing a proactive program to prevent transmission of MDR organisms in the general surgery ICU. To better understand the causal relations between the critical steps in the program and the rate reductions, a randomized study of a package of interventions to prevent healthcare-associated infections should be considered.
Terzo, Esteban A.; Lyons, Shawn M.; Poulton, John S.; Temple, Brenda R. S.; Marzluff, William F.; Duronio, Robert J.
2015-01-01
Nuclear bodies (NBs) are structures that concentrate proteins, RNAs, and ribonucleoproteins that perform functions essential to gene expression. How NBs assemble is not well understood. We studied the Drosophila histone locus body (HLB), a NB that concentrates factors required for histone mRNA biosynthesis at the replication-dependent histone gene locus. We coupled biochemical analysis with confocal imaging of both fixed and live tissues to demonstrate that the Drosophila Multi Sex Combs (Mxc) protein contains multiple domains necessary for HLB assembly. An important feature of this assembly process is the self-interaction of Mxc via two conserved N-terminal domains: a LisH domain and a novel self-interaction facilitator (SIF) domain immediately downstream of the LisH domain. Molecular modeling suggests that the LisH and SIF domains directly interact, and mutation of either the LisH or the SIF domain severely impairs Mxc function in vivo, resulting in reduced histone mRNA accumulation. A region of Mxc between amino acids 721 and 1481 is also necessary for HLB assembly independent of the LisH and SIF domains. Finally, the C-terminal 195 amino acids of Mxc are required for recruiting FLASH, an essential histone mRNA-processing factor, to the HLB. We conclude that multiple domains of the Mxc protein promote HLB assembly in order to concentrate factors required for histone mRNA biosynthesis. PMID:25694448
Brogan, Paula; Hasson, Felicity; McIlfatrick, Sonja
2018-01-01
Globally recommended in healthcare policy, Shared Decision-Making is also central to international policy promoting community palliative care. Yet realities of implementation by multi-disciplinary healthcare professionals who provide end-of-life care in the home are unclear. To explore multi-disciplinary healthcare professionals' perceptions and experiences of Shared Decision-Making at end of life in the home. Qualitative design using focus groups, transcribed verbatim and analysed thematically. A total of 43 participants, from multi-disciplinary community-based services in one region of the United Kingdom, were recruited. While the rhetoric of Shared Decision-Making was recognised, its implementation was impacted by several interconnecting factors, including (1) conceptual confusion regarding Shared Decision-Making, (2) uncertainty in the process and (3) organisational factors which impeded Shared Decision-Making. Multiple interacting factors influence implementation of Shared Decision-Making by professionals working in complex community settings at the end of life. Moving from rhetoric to reality requires future work exploring the realities of Shared Decision-Making practice at individual, process and systems levels.
NASA Technical Reports Server (NTRS)
Onwubiko, Chin-Yere; Onyebueke, Landon
1996-01-01
The structural design, or the design of machine elements, has been traditionally based on deterministic design methodology. The deterministic method considers all design parameters to be known with certainty. This methodology is, therefore, inadequate to design complex structures that are subjected to a variety of complex, severe loading conditions. A nonlinear behavior that is dependent on stress, stress rate, temperature, number of load cycles, and time is observed on all components subjected to complex conditions. These complex conditions introduce uncertainties; hence, the actual factor of safety margin remains unknown. In the deterministic methodology, the contingency of failure is discounted; hence, there is a use of a high factor of safety. It may be most useful in situations where the design structures are simple. The probabilistic method is concerned with the probability of non-failure performance of structures or machine elements. It is much more useful in situations where the design is characterized by complex geometry, possibility of catastrophic failure, sensitive loads and material properties. Also included: Comparative Study of the use of AGMA Geometry Factors and Probabilistic Design Methodology in the Design of Compact Spur Gear Set.
Generaal, Ellen; Milaneschi, Yuri; Jansen, Rick; Elzinga, Bernet M; Dekker, Joost; Penninx, Brenda W J H
2016-01-01
Brain-derived neurotrophic factor (BDNF) disturbances and life stress, both independently and in interaction, have been hypothesized to induce chronic pain. We examined whether (a) the BDNF pathway (val(66)met genotype, gene expression, and serum levels), (b) early and recent life stress, and (c) their interaction are associated with the presence and severity of chronic multi-site musculoskeletal pain. Cross-sectional data are from 1646 subjects of the Netherlands Study of Depression and Anxiety. The presence and severity of chronic multi-site musculoskeletal pain were determined using the Chronic Pain Grade (CPG) questionnaire. The BDNF val(66)met polymorphism, BDNF gene expression, and BDNF serum levels were measured. Early life stress before the age of 16 was assessed by calculating a childhood trauma index using the Childhood Trauma Interview. Recent life stress was assessed as the number of recent adverse life events using the List of Threatening Events Questionnaire. Compared to val(66)val, BDNF met carriers more often had chronic pain, whereas no differences were found for BDNF gene expression and serum levels. Higher levels of early and recent stress were both associated with the presence and severity of chronic pain (p < 0.001). No interaction effect was found for the BDNF pathway with life stress in the associations with chronic pain presence and severity. This study suggests that the BDNF gene marks vulnerability for chronic pain. Although life stress did not alter the impact of BDNF on chronic pain, it seems an independent factor in the onset and persistence of chronic pain. © The Author(s) 2016.
Landslide hazard assessment: recent trends and techniques.
Pardeshi, Sudhakar D; Autade, Sumant E; Pardeshi, Suchitra S
2013-01-01
Landslide hazard assessment is an important step towards landslide hazard and risk management. There are several methods of Landslide Hazard Zonation (LHZ) viz. heuristic, semi quantitative, quantitative, probabilistic and multi-criteria decision making process. However, no one method is accepted universally for effective assessment of landslide hazards. In recent years, several attempts have been made to apply different methods of LHZ and to compare results in order to find the best suited model. This paper presents the review of researches on landslide hazard mapping published in recent years. The advanced multivariate techniques are proved to be effective in spatial prediction of landslides with high degree of accuracy. Physical process based models also perform well in LHZ mapping even in the areas with poor database. Multi-criteria decision making approach also play significant role in determining relative importance of landslide causative factors in slope instability process. Remote Sensing and Geographical Information System (GIS) are powerful tools to assess landslide hazards and are being used extensively in landslide researches since last decade. Aerial photographs and high resolution satellite data are useful in detection, mapping and monitoring landslide processes. GIS based LHZ models helps not only to map and monitor landslides but also to predict future slope failures. The advancements in Geo-spatial technologies have opened the doors for detailed and accurate assessment of landslide hazards.
Statistical Analysis of Stress Signals from Bridge Monitoring by FBG System.
Ye, Xiao-Wei; Su, You-Hua; Xi, Pei-Sen
2018-02-07
In this paper, a fiber Bragg grating (FBG)-based stress monitoring system instrumented on an orthotropic steel deck arch bridge is demonstrated. The FBG sensors are installed at two types of critical fatigue-prone welded joints to measure the strain and temperature signals. A total of 64 FBG sensors are deployed around the rib-to-deck and rib-to-diagram areas at the mid-span and quarter-span of the investigated orthotropic steel bridge. The local stress behaviors caused by the highway loading and temperature effect during the construction and operation periods are presented with the aid of a wavelet multi-resolution analysis approach. In addition, the multi-modal characteristic of the rainflow counted stress spectrum is modeled by the method of finite mixture distribution together with a genetic algorithm (GA)-based parameter estimation approach. The optimal probability distribution of the stress spectrum is determined by use of Bayesian information criterion (BIC). Furthermore, the hot spot stress of the welded joint is calculated by an extrapolation method recommended in the specification of International Institute of Welding (IIW). The stochastic characteristic of stress concentration factor (SCF) of the concerned welded joint is addressed. The proposed FBG-based stress monitoring system and probabilistic stress evaluation methods can provide an effective tool for structural monitoring and condition assessment of orthotropic steel bridges.
Probabilistic Approaches for Multi-Hazard Risk Assessment of Structures and Systems
NASA Astrophysics Data System (ADS)
Kwag, Shinyoung
Performance assessment of structures, systems, and components for multi-hazard scenarios has received significant attention in recent years. However, the concept of multi-hazard analysis is quite broad in nature and the focus of existing literature varies across a wide range of problems. In some cases, such studies focus on hazards that either occur simultaneously or are closely correlated with each other. For example, seismically induced flooding or seismically induced fires. In other cases, multi-hazard studies relate to hazards that are not dependent or correlated but have strong likelihood of occurrence at different times during the lifetime of a structure. The current approaches for risk assessment need enhancement to account for multi-hazard risks. It must be able to account for uncertainty propagation in a systems-level analysis, consider correlation among events or failure modes, and allow integration of newly available information from continually evolving simulation models, experimental observations, and field measurements. This dissertation presents a detailed study that proposes enhancements by incorporating Bayesian networks and Bayesian updating within a performance-based probabilistic framework. The performance-based framework allows propagation of risk as well as uncertainties in the risk estimates within a systems analysis. Unlike conventional risk assessment techniques such as a fault-tree analysis, a Bayesian network can account for statistical dependencies and correlations among events/hazards. The proposed approach is extended to develop a risk-informed framework for quantitative validation and verification of high fidelity system-level simulation tools. Validation of such simulations can be quite formidable within the context of a multi-hazard risk assessment in nuclear power plants. The efficiency of this approach lies in identification of critical events, components, and systems that contribute to the overall risk. Validation of any event or component on the critical path is relatively more important in a risk-informed environment. Significance of multi-hazard risk is also illustrated for uncorrelated hazards of earthquakes and high winds which may result in competing design objectives. It is also illustrated that the number of computationally intensive nonlinear simulations needed in performance-based risk assessment for external hazards can be significantly reduced by using the power of Bayesian updating in conjunction with the concept of equivalent limit-state.
Proceedings of the international meeting on thermal nuclear reactor safety. Vol. 1
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
Separate abstracts are included for each of the papers presented concerning current issues in nuclear power plant safety; national programs in nuclear power plant safety; radiological source terms; probabilistic risk assessment methods and techniques; non LOCA and small-break-LOCA transients; safety goals; pressurized thermal shocks; applications of reliability and risk methods to probabilistic risk assessment; human factors and man-machine interface; and data bases and special applications.
A Probabilistic Graphical Model to Detect Chromosomal Domains
NASA Astrophysics Data System (ADS)
Heermann, Dieter; Hofmann, Andreas; Weber, Eva
To understand the nature of a cell, one needs to understand the structure of its genome. For this purpose, experimental techniques such as Hi-C detecting chromosomal contacts are used to probe the three-dimensional genomic structure. These experiments yield topological information, consistently showing a hierarchical subdivision of the genome into self-interacting domains across many organisms. Current methods for detecting these domains using the Hi-C contact matrix, i.e. a doubly-stochastic matrix, are mostly based on the assumption that the domains are distinct, thus non-overlapping. For overcoming this simplification and for being able to unravel a possible nested domain structure, we developed a probabilistic graphical model that makes no a priori assumptions on the domain structure. Within this approach, the Hi-C contact matrix is analyzed using an Ising like probabilistic graphical model whose coupling constant is proportional to each lattice point (entry in the contact matrix). The results show clear boundaries between identified domains and the background. These domain boundaries are dependent on the coupling constant, so that one matrix yields several clusters of different sizes, which show the self-interaction of the genome on different scales. This work was supported by a Grant from the International Human Frontier Science Program Organization (RGP0014/2014).
NASA Technical Reports Server (NTRS)
Boyce, Lola; Bast, Callie C.; Trimble, Greg A.
1992-01-01
This report presents the results of a fourth year effort of a research program, conducted for NASA-LeRC by the University of Texas at San Antonio (UTSA). The research included on-going development of methodology that provides probabilistic lifetime strength of aerospace materials via computational simulation. A probabilistic material strength degradation model, in the form of a randomized multifactor interaction equation, is postulated for strength degradation of structural components of aerospace propulsion systems subject to a number of effects or primitive variables. These primitive variables may include high temperature, fatigue or creep. In most cases, strength is reduced as a result of the action of a variable. This multifactor interaction strength degradation equation has been randomized and is included in the computer program, PROMISS. Also included in the research is the development of methodology to calibrate the above-described constitutive equation using actual experimental materials data together with regression analysis of that data, thereby predicting values for the empirical material constants for each effect or primitive variable. This regression methodology is included in the computer program, PROMISC. Actual experimental materials data were obtained from industry and the open literature for materials typically for applications in aerospace propulsion system components. Material data for Inconel 718 has been analyzed using the developed methodology.
NASA Technical Reports Server (NTRS)
Boyce, Lola; Bast, Callie C.; Trimble, Greg A.
1992-01-01
The results of a fourth year effort of a research program conducted for NASA-LeRC by The University of Texas at San Antonio (UTSA) are presented. The research included on-going development of methodology that provides probabilistic lifetime strength of aerospace materials via computational simulation. A probabilistic material strength degradation model, in the form of a randomized multifactor interaction equation, is postulated for strength degradation of structural components of aerospace propulsion systems subjected to a number of effects or primitive variables. These primitive variables may include high temperature, fatigue, or creep. In most cases, strength is reduced as a result of the action of a variable. This multifactor interaction strength degradation equation was randomized and is included in the computer program, PROMISC. Also included in the research is the development of methodology to calibrate the above-described constitutive equation using actual experimental materials data together with regression analysis of that data, thereby predicting values for the empirical material constants for each effect or primitive variable. This regression methodology is included in the computer program, PROMISC. Actual experimental materials data were obtained from industry and the open literature for materials typically for applications in aerospace propulsion system components. Material data for Inconel 718 was analyzed using the developed methodology.
Watanabe, Noriya; Sakagami, Masamichi; Haruno, Masahiko
2013-03-06
Learning does not only depend on rationality, because real-life learning cannot be isolated from emotion or social factors. Therefore, it is intriguing to determine how emotion changes learning, and to identify which neural substrates underlie this interaction. Here, we show that the task-independent presentation of an emotional face before a reward-predicting cue increases the speed of cue-reward association learning in human subjects compared with trials in which a neutral face is presented. This phenomenon was attributable to an increase in the learning rate, which regulates reward prediction errors. Parallel to these behavioral findings, functional magnetic resonance imaging demonstrated that presentation of an emotional face enhanced reward prediction error (RPE) signal in the ventral striatum. In addition, we also found a functional link between this enhanced RPE signal and increased activity in the amygdala following presentation of an emotional face. Thus, this study revealed an acceleration of cue-reward association learning by emotion, and underscored a role of striatum-amygdala interactions in the modulation of the reward prediction errors by emotion.
Population-based 3D genome structure analysis reveals driving forces in spatial genome organization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tjong, Harianto; Li, Wenyuan; Kalhor, Reza
Conformation capture technologies (e.g., Hi-C) chart physical interactions between chromatin regions on a genome-wide scale. However, the structural variability of the genome between cells poses a great challenge to interpreting ensemble-averaged Hi-C data, particularly for long-range and interchromosomal interactions. Here, we present a probabilistic approach for deconvoluting Hi-C data into a model population of distinct diploid 3D genome structures, which facilitates the detection of chromatin interactions likely to co-occur in individual cells. Here, our approach incorporates the stochastic nature of chromosome conformations and allows a detailed analysis of alternative chromatin structure states. For example, we predict and experimentally confirm themore » presence of large centromere clusters with distinct chromosome compositions varying between individual cells. The stability of these clusters varies greatly with their chromosome identities. We show that these chromosome-specific clusters can play a key role in the overall chromosome positioning in the nucleus and stabilizing specific chromatin interactions. By explicitly considering genome structural variability, our population-based method provides an important tool for revealing novel insights into the key factors shaping the spatial genome organization.« less
Population-based 3D genome structure analysis reveals driving forces in spatial genome organization
Tjong, Harianto; Li, Wenyuan; Kalhor, Reza; ...
2016-03-07
Conformation capture technologies (e.g., Hi-C) chart physical interactions between chromatin regions on a genome-wide scale. However, the structural variability of the genome between cells poses a great challenge to interpreting ensemble-averaged Hi-C data, particularly for long-range and interchromosomal interactions. Here, we present a probabilistic approach for deconvoluting Hi-C data into a model population of distinct diploid 3D genome structures, which facilitates the detection of chromatin interactions likely to co-occur in individual cells. Here, our approach incorporates the stochastic nature of chromosome conformations and allows a detailed analysis of alternative chromatin structure states. For example, we predict and experimentally confirm themore » presence of large centromere clusters with distinct chromosome compositions varying between individual cells. The stability of these clusters varies greatly with their chromosome identities. We show that these chromosome-specific clusters can play a key role in the overall chromosome positioning in the nucleus and stabilizing specific chromatin interactions. By explicitly considering genome structural variability, our population-based method provides an important tool for revealing novel insights into the key factors shaping the spatial genome organization.« less
NASA Astrophysics Data System (ADS)
Juarez, A. M.; Kibler, K. M.; Sayama, T.; Ohara, M.
2016-12-01
Flood management decision-making is often supported by risk assessment, which may overlook the role of coping capacity and the potential benefits derived from direct use of flood-prone land. Alternatively, risk-benefit analysis can support floodplain management to yield maximum socio-ecological benefits for the minimum flood risk. We evaluate flood risk-probabilistic benefit tradeoffs of livelihood practices compatible with direct human use of flood-prone land (agriculture/wild fisheries) and nature conservation (wild fisheries only) in Candaba, Philippines. Located north-west to Metro Manila, Candaba area is a multi-functional landscape that provides a temporally-variable mix of possible land uses, benefits and ecosystem services of local and regional value. To characterize inundation from 1.3- to 100-year recurrence intervals we couple frequency analysis with rainfall-runoff-inundation modelling and remotely-sensed data. By combining simulated probabilistic floods with both damage and benefit functions (e.g. fish capture and rice yield with flood intensity) we estimate potential damages and benefits over varying probabilistic flood hazards. We find that although direct human uses of flood-prone land are associated with damages, for all the investigated magnitudes of flood events with different frequencies, the probabilistic benefits ( 91 million) exceed risks by a large margin ( 33 million). Even considering risk, probabilistic livelihood benefits of direct human uses far exceed benefits provided by scenarios that exclude direct "risky" human uses (difference of 85 million). In addition, we find that individual coping strategies, such as adapting crop planting periods to the flood pulse or fishing rather than cultivating rice in the wet season, minimize flood losses ( 6 million) while allowing for valuable livelihood benefits ($ 125 million) in flood-prone land. Analysis of societal benefits and local capacities to cope with regular floods demonstrate the relevance of accounting for the full range of flood events and their relation to both potential damages and benefits in risk assessments. Management measures may thus be designed to reflect local contexts and support benefits of natural hydrologic processes, while minimizing flood damage.
Static and dynamic factors in an information-based multi-asset artificial stock market
NASA Astrophysics Data System (ADS)
Ponta, Linda; Pastore, Stefano; Cincotti, Silvano
2018-02-01
An information-based multi-asset artificial stock market characterized by different types of stocks and populated by heterogeneous agents is presented. In the market, agents trade risky assets in exchange for cash. Beside the amount of cash and of stocks owned, each agent is characterized by sentiments and agents share their sentiments by means of interactions that are determined by sparsely connected networks. A central market maker (clearing house mechanism) determines the price processes for each stock at the intersection of the demand and the supply curves. Single stock price processes exhibit volatility clustering and fat-tailed distribution of returns whereas multivariate price process exhibits both static and dynamic stylized facts, i.e., the presence of static factors and common trends. Static factors are studied making reference to the cross-correlation of returns of different stocks. The common trends are investigated considering the variance-covariance matrix of prices. Results point out that the probability distribution of eigenvalues of the cross-correlation matrix of returns shows the presence of sectors, similar to those observed on real empirical data. As regarding the dynamic factors, the variance-covariance matrix of prices point out a limited number of assets prices series that are independent integrated processes, in close agreement with the empirical evidence of asset price time series of real stock markets. These results remarks the crucial dependence of statistical properties of multi-assets stock market on the agents' interaction structure.
Integer Linear Programming for Constrained Multi-Aspect Committee Review Assignment
Karimzadehgan, Maryam; Zhai, ChengXiang
2011-01-01
Automatic review assignment can significantly improve the productivity of many people such as conference organizers, journal editors and grant administrators. A general setup of the review assignment problem involves assigning a set of reviewers on a committee to a set of documents to be reviewed under the constraint of review quota so that the reviewers assigned to a document can collectively cover multiple topic aspects of the document. No previous work has addressed such a setup of committee review assignments while also considering matching multiple aspects of topics and expertise. In this paper, we tackle the problem of committee review assignment with multi-aspect expertise matching by casting it as an integer linear programming problem. The proposed algorithm can naturally accommodate any probabilistic or deterministic method for modeling multiple aspects to automate committee review assignments. Evaluation using a multi-aspect review assignment test set constructed using ACM SIGIR publications shows that the proposed algorithm is effective and efficient for committee review assignments based on multi-aspect expertise matching. PMID:22711970
Tuning single-photon sources for telecom multi-photon experiments.
Greganti, Chiara; Schiansky, Peter; Calafell, Irati Alonso; Procopio, Lorenzo M; Rozema, Lee A; Walther, Philip
2018-02-05
Multi-photon state generation is of great interest for near-future quantum simulation and quantum computation experiments. To-date spontaneous parametric down-conversion is still the most promising process, even though two major impediments still exist: accidental photon noise (caused by the probabilistic non-linear process) and imperfect single-photon purity (arising from spectral entanglement between the photon pairs). In this work, we overcome both of these difficulties by (1) exploiting a passive temporal multiplexing scheme and (2) carefully optimizing the spectral properties of the down-converted photons using periodically-poled KTP crystals. We construct two down-conversion sources in the telecom wavelength regime, finding spectral purities of > 91%, while maintaining high four-photon count rates. We use single-photon grating spectrometers together with superconducting nanowire single-photon detectors to perform a detailed characterization of our multi-photon source. Our methods provide practical solutions to produce high-quality multi-photon states, which are in demand for many quantum photonics applications.
Fluid Structure Interaction in a Turbine Blade
NASA Technical Reports Server (NTRS)
Gorla, Rama S. R.
2004-01-01
An unsteady, three dimensional Navier-Stokes solution in rotating frame formulation for turbomachinery applications is presented. Casting the governing equations in a rotating frame enabled the freezing of grid motion and resulted in substantial savings in computer time. The turbine blade was computationally simulated and probabilistically evaluated in view of several uncertainties in the aerodynamic, structural, material and thermal variables that govern the turbine blade. The interconnection between the computational fluid dynamics code and finite element structural analysis code was necessary to couple the thermal profiles with the structural design. The stresses and their variations were evaluated at critical points on the Turbine blade. Cumulative distribution functions and sensitivity factors were computed for stress responses due to aerodynamic, geometric, mechanical and thermal random variables.
Variability in perceived satisfaction of reservoir management objectives
Owen, W.J.; Gates, T.K.; Flug, M.
1997-01-01
Fuzzy set theory provides a useful model to address imprecision in interpreting linguistically described objectives for reservoir management. Fuzzy membership functions can be used to represent degrees of objective satisfaction for different values of management variables. However, lack of background information, differing experiences and qualifications, and complex interactions of influencing factors can contribute to significant variability among membership functions derived from surveys of multiple experts. In the present study, probabilistic membership functions are used to model variability in experts' perceptions of satisfaction of objectives for hydropower generation, fish habitat, kayaking, rafting, and scenery preservation on the Green River through operations of Flaming Gorge Dam. Degree of variability in experts' perceptions differed among objectives but resulted in substantial uncertainty in estimation of optimal reservoir releases.
NASA Technical Reports Server (NTRS)
Duffy, S. F.; Hu, J.; Hopkins, D. A.
1995-01-01
The article begins by examining the fundamentals of traditional deterministic design philosophy. The initial section outlines the concepts of failure criteria and limit state functions two traditional notions that are embedded in deterministic design philosophy. This is followed by a discussion regarding safety factors (a possible limit state function) and the common utilization of statistical concepts in deterministic engineering design approaches. Next the fundamental aspects of a probabilistic failure analysis are explored and it is shown that deterministic design concepts mentioned in the initial portion of the article are embedded in probabilistic design methods. For components fabricated from ceramic materials (and other similarly brittle materials) the probabilistic design approach yields the widely used Weibull analysis after suitable assumptions are incorporated. The authors point out that Weibull analysis provides the rare instance where closed form solutions are available for a probabilistic failure analysis. Since numerical methods are usually required to evaluate component reliabilities, a section on Monte Carlo methods is included to introduce the concept. The article concludes with a presentation of the technical aspects that support the numerical method known as fast probability integration (FPI). This includes a discussion of the Hasofer-Lind and Rackwitz-Fiessler approximations.
Khan, F I; Iqbal, A; Ramesh, N; Abbasi, S A
2001-10-12
As it is conventionally done, strategies for incorporating accident--prevention measures in any hazardous chemical process industry are developed on the basis of input from risk assessment. However, the two steps-- risk assessment and hazard reduction (or safety) measures--are not linked interactively in the existing methodologies. This prevents a quantitative assessment of the impacts of safety measures on risk control. We have made an attempt to develop a methodology in which risk assessment steps are interactively linked with implementation of safety measures. The resultant system tells us the extent of reduction of risk by each successive safety measure. It also tells based on sophisticated maximum credible accident analysis (MCAA) and probabilistic fault tree analysis (PFTA) whether a given unit can ever be made 'safe'. The application of the methodology has been illustrated with a case study.
Using Speculative Execution to Automatically Hide I/O Latency
2001-12-07
sion of the Lempel - Ziv algorithm and the Finite multi-order context models (FMOC) that originated from prediction-by-partial-match data compressors...allowed the cancellation of a single hint at a time.) 2.2.4 Predicting future data needs In order to take advantage of any of the algorithms described...modelling techniques generally used for data compression to perform probabilistic prediction of an application’s next page fault (or, in an object-oriented
Multi-Disciplinary Techniques for Understanding Time-Varying Space-Based Imagery.
1985-05-10
problem, and I V WY" 3 discuss the impgrtage of this work to Air Force technology and to related Air Force programs. Section 1.5 provides a summary of...development of new algorithms and their realization in a hybrid optical/digital architecture. However, devices and architectures being developed in related ...and relate these representntions to object and surface contour properties of the scene. The techniques studied included Probabilistic Graph Matching
Extreme Events and Energy Providers: Science and Innovation
NASA Astrophysics Data System (ADS)
Yiou, P.; Vautard, R.
2012-04-01
Most socio-economic regulations related to the resilience to climate extremes, from infrastructure or network design to insurance premiums, are based on a present-day climate with an assumption of stationarity. Climate extremes (heat waves, cold spells, droughts, storms and wind stilling) affect in particular energy production, supply, demand and security in several ways. While national, European or international projects have generated vast amounts of climate projections for the 21st century, their practical use in long-term planning remains limited. Estimating probabilistic diagnostics of energy user relevant variables from those multi-model projections will help the energy sector to elaborate medium to long-term plans, and will allow the assessment of climate risks associated to those plans. The project "Extreme Events for Energy Providers" (E3P) aims at filling a gap between climate science and its practical use in the energy sector and creating in turn favourable conditions for new business opportunities. The value chain ranges from addressing research questions directly related to energy-significant climate extremes to providing innovative tools of information and decision making (including methodologies, best practices and software) and climate science training for the energy sector, with a focus on extreme events. Those tools will integrate the scientific knowledge that is developed by scientific communities, and translate it into a usable probabilistic framework. The project will deliver projection tools assessing the probabilities of future energy-relevant climate extremes at a range of spatial scales varying from pan-European to local scales. The E3P project is funded by the Knowledge and Innovation Community (KIC Climate). We will present the mechanisms of interactions between academic partners, SMEs and industrial partners for this project. Those mechanisms are elementary bricks of a climate service.
Variability in Word Duration as a Function of Probability, Speech Style, and Prosody
Baker, Rachel E.; Bradlow, Ann R.
2010-01-01
This article examines how probability (lexical frequency and previous mention), speech style, and prosody affect word duration, and how these factors interact. Participants read controlled materials in clear and plain speech styles. As expected, more probable words (higher frequencies and second mentions) were significantly shorter than less probable words, and words in plain speech were significantly shorter than those in clear speech. Interestingly, we found second mention reduction effects in both clear and plain speech, indicating that while clear speech is hyper-articulated, this hyper-articulation does not override probabilistic effects on duration. We also found an interaction between mention and frequency, but only in plain speech. High frequency words allowed more second mention reduction than low frequency words in plain speech, revealing a tendency to hypo-articulate as much as possible when all factors support it. Finally, we found that first mentions were more likely to be accented than second mentions. However, when these differences in accent likelihood were controlled, a significant second mention reduction effect remained. This supports the concept of a direct link between probability and duration, rather than a relationship solely mediated by prosodic prominence. PMID:20121039
Effects of delay and probability combinations on discounting in humans
Cox, David J.; Dallery, Jesse
2017-01-01
To determine discount rates, researchers typically adjust the amount of an immediate or certain option relative to a delayed or uncertain option. Because this adjusting amount method can be relatively time consuming, researchers have developed more efficient procedures. One such procedure is a 5-trial adjusting delay procedure, which measures the delay at which an amount of money loses half of its value (e.g., $1000 is valued at $500 with a 10-year delay to its receipt). Experiment 1 (n = 212) used 5-trial adjusting delay or probability tasks to measure delay discounting of losses, probabilistic gains, and probabilistic losses. Experiment 2 (n = 98) assessed combined probabilistic and delayed alternatives. In both experiments, we compared results from 5-trial adjusting delay or probability tasks to traditional adjusting amount procedures. Results suggest both procedures produced similar rates of probability and delay discounting in six out of seven comparisons. A magnitude effect consistent with previous research was observed for probabilistic gains and losses, but not for delayed losses. Results also suggest that delay and probability interact to determine the value of money. Five-trial methods may allow researchers to assess discounting more efficiently as well as study more complex choice scenarios. PMID:27498073
Quantification of uncertainties in the performance of smart composite structures
NASA Technical Reports Server (NTRS)
Shiao, Michael C.; Chamis, Christos C.
1993-01-01
A composite wing with spars, bulkheads, and built-in control devices is evaluated using a method for the probabilistic assessment of smart composite structures. Structural responses (such as change in angle of attack, vertical displacements, and stresses in regular plies with traditional materials and in control plies with mixed traditional and actuation materials) are probabilistically assessed to quantify their respective scatter. Probabilistic sensitivity factors are computed to identify those parameters that have a significant influence on a specific structural response. Results show that the uncertainties in the responses of smart composite structures can be quantified. Responses such as structural deformation, ply stresses, frequencies, and buckling loads in the presence of defects can be reliably controlled to satisfy specified design requirements.
Zapata-Fonseca, Leonardo; Dotov, Dobromir; Fossion, Ruben; Froese, Tom
2016-01-01
There is a growing consensus that a fuller understanding of social cognition depends on more systematic studies of real-time social interaction. Such studies require methods that can deal with the complex dynamics taking place at multiple interdependent temporal and spatial scales, spanning sub-personal, personal, and dyadic levels of analysis. We demonstrate the value of adopting an extended multi-scale approach by re-analyzing movement time-series generated in a study of embodied dyadic interaction in a minimal virtual reality environment (a perceptual crossing experiment). Reduced movement variability revealed an interdependence between social awareness and social coordination that cannot be accounted for by either subjective or objective factors alone: it picks out interactions in which subjective and objective conditions are convergent (i.e., elevated coordination is perceived as clearly social, and impaired coordination is perceived as socially ambiguous). This finding is consistent with the claim that interpersonal interaction can be partially constitutive of direct social perception. Clustering statistics (Allan Factor) of salient events revealed fractal scaling. Complexity matching defined as the similarity between these scaling laws was significantly more pronounced in pairs of participants as compared to surrogate dyads. This further highlights the multi-scale and distributed character of social interaction and extends previous complexity matching results from dyadic conversation to non-verbal social interaction dynamics. Trials with successful joint interaction were also associated with an increase in local coordination. Consequently, a local coordination pattern emerges on the background of complex dyadic interactions in the PCE task and makes joint successful performance possible. PMID:28018274
DOE Office of Scientific and Technical Information (OSTI.GOV)
De Kauwe, Martin G.; Medlyn, Belinda E.; Walker, Anthony P.
Multi-factor experiments are often advocated as important for advancing terrestrial biosphere models (TBMs), yet to date such models have only been tested against single-factor experiments. We applied 10 TBMs to the multi-factor Prairie Heating and CO 2 Enrichment (PHACE) experiment in Wyoming, USA. Our goals were to investigate how multi-factor experiments can be used to constrain models, and to identify a road map for model improvement. We found models performed poorly in ambient conditions; there was a wide spread in simulated above-ground net primary productivity (range: 31-390 g C m -2 yr -1). Comparison with data highlighted model failures particularlymore » in respect to carbon allocation, phenology, and the impact of water stress on phenology. Performance against single-factors was also relatively poor. In addition, similar responses were predicted for different reasons across models: there were large differences among models in sensitivity to water stress and, among the nitrogen cycle models, nitrogen availability during the experiment. Models were also unable to capture observed treatment effects on phenology: they over-estimated the effect of warming on leaf onset and did not allow CO 2-induced water savings to extend growing season length. Observed interactive (CO 2 x warming) treatment effects were subtle and contingent on water stress, phenology and species composition. Since the models did not correctly represent these processes under ambient and single-factor conditions, little extra information was gained by comparing model predictions against interactive responses. Finally, we outline a series of key areas in which this and future experiments could be used to improve model predictions of grassland responses to global change.« less
De Kauwe, Martin G.; Medlyn, Belinda E.; Walker, Anthony P.; ...
2017-02-01
Multi-factor experiments are often advocated as important for advancing terrestrial biosphere models (TBMs), yet to date such models have only been tested against single-factor experiments. We applied 10 TBMs to the multi-factor Prairie Heating and CO 2 Enrichment (PHACE) experiment in Wyoming, USA. Our goals were to investigate how multi-factor experiments can be used to constrain models, and to identify a road map for model improvement. We found models performed poorly in ambient conditions; there was a wide spread in simulated above-ground net primary productivity (range: 31-390 g C m -2 yr -1). Comparison with data highlighted model failures particularlymore » in respect to carbon allocation, phenology, and the impact of water stress on phenology. Performance against single-factors was also relatively poor. In addition, similar responses were predicted for different reasons across models: there were large differences among models in sensitivity to water stress and, among the nitrogen cycle models, nitrogen availability during the experiment. Models were also unable to capture observed treatment effects on phenology: they over-estimated the effect of warming on leaf onset and did not allow CO 2-induced water savings to extend growing season length. Observed interactive (CO 2 x warming) treatment effects were subtle and contingent on water stress, phenology and species composition. Since the models did not correctly represent these processes under ambient and single-factor conditions, little extra information was gained by comparing model predictions against interactive responses. Finally, we outline a series of key areas in which this and future experiments could be used to improve model predictions of grassland responses to global change.« less
Prike, Toby; Arnold, Michelle M; Williamson, Paul
2017-08-01
A growing body of research has shown people who hold anomalistic (e.g., paranormal) beliefs may differ from nonbelievers in their propensity to make probabilistic reasoning errors. The current study explored the relationship between these beliefs and performance through the development of a new measure of anomalistic belief, called the Anomalistic Belief Scale (ABS). One key feature of the ABS is that it includes a balance of both experiential and theoretical belief items. Another aim of the study was to use the ABS to investigate the relationship between belief and probabilistic reasoning errors on conjunction fallacy tasks. As expected, results showed there was a relationship between anomalistic belief and propensity to commit the conjunction fallacy. Importantly, regression analyses on the factors that make up the ABS showed that the relationship between anomalistic belief and probabilistic reasoning occurred only for beliefs about having experienced anomalistic phenomena, and not for theoretical anomalistic beliefs. Copyright © 2017 Elsevier Inc. All rights reserved.
Corso, Phaedra S.; Ingels, Justin B.; Kogan, Steven M.; Foster, E. Michael; Chen, Yi-Fu; Brody, Gene H.
2013-01-01
Programmatic cost analyses of preventive interventions commonly have a number of methodological difficulties. To determine the mean total costs and properly characterize variability, one often has to deal with small sample sizes, skewed distributions, and especially missing data. Standard approaches for dealing with missing data such as multiple imputation may suffer from a small sample size, a lack of appropriate covariates, or too few details around the method used to handle the missing data. In this study, we estimate total programmatic costs for a prevention trial evaluating the Strong African American Families-Teen program. This intervention focuses on the prevention of substance abuse and risky sexual behavior. To account for missing data in the assessment of programmatic costs we compare multiple imputation to probabilistic sensitivity analysis. The latter approach uses collected cost data to create a distribution around each input parameter. We found that with the multiple imputation approach, the mean (95% confidence interval) incremental difference was $2149 ($397, $3901). With the probabilistic sensitivity analysis approach, the incremental difference was $2583 ($778, $4346). Although the true cost of the program is unknown, probabilistic sensitivity analysis may be a more viable alternative for capturing variability in estimates of programmatic costs when dealing with missing data, particularly with small sample sizes and the lack of strong predictor variables. Further, the larger standard errors produced by the probabilistic sensitivity analysis method may signal its ability to capture more of the variability in the data, thus better informing policymakers on the potentially true cost of the intervention. PMID:23299559
Corso, Phaedra S; Ingels, Justin B; Kogan, Steven M; Foster, E Michael; Chen, Yi-Fu; Brody, Gene H
2013-10-01
Programmatic cost analyses of preventive interventions commonly have a number of methodological difficulties. To determine the mean total costs and properly characterize variability, one often has to deal with small sample sizes, skewed distributions, and especially missing data. Standard approaches for dealing with missing data such as multiple imputation may suffer from a small sample size, a lack of appropriate covariates, or too few details around the method used to handle the missing data. In this study, we estimate total programmatic costs for a prevention trial evaluating the Strong African American Families-Teen program. This intervention focuses on the prevention of substance abuse and risky sexual behavior. To account for missing data in the assessment of programmatic costs we compare multiple imputation to probabilistic sensitivity analysis. The latter approach uses collected cost data to create a distribution around each input parameter. We found that with the multiple imputation approach, the mean (95 % confidence interval) incremental difference was $2,149 ($397, $3,901). With the probabilistic sensitivity analysis approach, the incremental difference was $2,583 ($778, $4,346). Although the true cost of the program is unknown, probabilistic sensitivity analysis may be a more viable alternative for capturing variability in estimates of programmatic costs when dealing with missing data, particularly with small sample sizes and the lack of strong predictor variables. Further, the larger standard errors produced by the probabilistic sensitivity analysis method may signal its ability to capture more of the variability in the data, thus better informing policymakers on the potentially true cost of the intervention.
NASA Astrophysics Data System (ADS)
Richards, Joseph W.; Starr, Dan L.; Miller, Adam A.; Bloom, Joshua S.; Butler, Nathaniel R.; Brink, Henrik; Crellin-Quick, Arien
2012-12-01
With growing data volumes from synoptic surveys, astronomers necessarily must become more abstracted from the discovery and introspection processes. Given the scarcity of follow-up resources, there is a particularly sharp onus on the frameworks that replace these human roles to provide accurate and well-calibrated probabilistic classification catalogs. Such catalogs inform the subsequent follow-up, allowing consumers to optimize the selection of specific sources for further study and permitting rigorous treatment of classification purities and efficiencies for population studies. Here, we describe a process to produce a probabilistic classification catalog of variability with machine learning from a multi-epoch photometric survey. In addition to producing accurate classifications, we show how to estimate calibrated class probabilities and motivate the importance of probability calibration. We also introduce a methodology for feature-based anomaly detection, which allows discovery of objects in the survey that do not fit within the predefined class taxonomy. Finally, we apply these methods to sources observed by the All-Sky Automated Survey (ASAS), and release the Machine-learned ASAS Classification Catalog (MACC), a 28 class probabilistic classification catalog of 50,124 ASAS sources in the ASAS Catalog of Variable Stars. We estimate that MACC achieves a sub-20% classification error rate and demonstrate that the class posterior probabilities are reasonably calibrated. MACC classifications compare favorably to the classifications of several previous domain-specific ASAS papers and to the ASAS Catalog of Variable Stars, which had classified only 24% of those sources into one of 12 science classes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Richards, Joseph W.; Starr, Dan L.; Miller, Adam A.
2012-12-15
With growing data volumes from synoptic surveys, astronomers necessarily must become more abstracted from the discovery and introspection processes. Given the scarcity of follow-up resources, there is a particularly sharp onus on the frameworks that replace these human roles to provide accurate and well-calibrated probabilistic classification catalogs. Such catalogs inform the subsequent follow-up, allowing consumers to optimize the selection of specific sources for further study and permitting rigorous treatment of classification purities and efficiencies for population studies. Here, we describe a process to produce a probabilistic classification catalog of variability with machine learning from a multi-epoch photometric survey. In additionmore » to producing accurate classifications, we show how to estimate calibrated class probabilities and motivate the importance of probability calibration. We also introduce a methodology for feature-based anomaly detection, which allows discovery of objects in the survey that do not fit within the predefined class taxonomy. Finally, we apply these methods to sources observed by the All-Sky Automated Survey (ASAS), and release the Machine-learned ASAS Classification Catalog (MACC), a 28 class probabilistic classification catalog of 50,124 ASAS sources in the ASAS Catalog of Variable Stars. We estimate that MACC achieves a sub-20% classification error rate and demonstrate that the class posterior probabilities are reasonably calibrated. MACC classifications compare favorably to the classifications of several previous domain-specific ASAS papers and to the ASAS Catalog of Variable Stars, which had classified only 24% of those sources into one of 12 science classes.« less
Grammaticality, Acceptability, and Probability: A Probabilistic View of Linguistic Knowledge.
Lau, Jey Han; Clark, Alexander; Lappin, Shalom
2017-07-01
The question of whether humans represent grammatical knowledge as a binary condition on membership in a set of well-formed sentences, or as a probabilistic property has been the subject of debate among linguists, psychologists, and cognitive scientists for many decades. Acceptability judgments present a serious problem for both classical binary and probabilistic theories of grammaticality. These judgements are gradient in nature, and so cannot be directly accommodated in a binary formal grammar. However, it is also not possible to simply reduce acceptability to probability. The acceptability of a sentence is not the same as the likelihood of its occurrence, which is, in part, determined by factors like sentence length and lexical frequency. In this paper, we present the results of a set of large-scale experiments using crowd-sourced acceptability judgments that demonstrate gradience to be a pervasive feature in acceptability judgments. We then show how one can predict acceptability judgments on the basis of probability by augmenting probabilistic language models with an acceptability measure. This is a function that normalizes probability values to eliminate the confounding factors of length and lexical frequency. We describe a sequence of modeling experiments with unsupervised language models drawn from state-of-the-art machine learning methods in natural language processing. Several of these models achieve very encouraging levels of accuracy in the acceptability prediction task, as measured by the correlation between the acceptability measure scores and mean human acceptability values. We consider the relevance of these results to the debate on the nature of grammatical competence, and we argue that they support the view that linguistic knowledge can be intrinsically probabilistic. Copyright © 2016 Cognitive Science Society, Inc.
Bridgers, Franca Ferrari; Kacinik, Natalie
2017-02-01
The majority of words in most languages consist of derived poly-morphemic words but a cross-linguistic review of the literature (Amenta and Crepaldi in Front Psychol 3:232-243, 2012) shows a contradictory picture with respect to how such words are represented and processed. The current study examined the effects of linearity and structural complexity on the processing of Italian derived words. Participants performed a lexical decision task on three types of prefixed and suffixed words and nonwords differing in the complexity of their internal structure. The processing of these words was indeed found to vary according to the nature of the affixes, the order in which they appear, and the type of information the affix encodes. The results thus indicate that derived words are not a uniform class and the best account of these findings appears to be a constraint-based or probabilistic multi-route processing model (e.g., Kuperman et al. in Lang Cogn Process 23:1089-1132, 2008; J Exp Psychol Hum Percept Perform 35:876-895, 2009; J Mem Lang 62:83-97, 2010).
Application of Multi-Model CMIP5 Analysis in Future Drought Adaptation Strategies
NASA Astrophysics Data System (ADS)
Casey, M.; Luo, L.; Lang, Y.
2014-12-01
Drought influences the efficacy of numerous natural and artificial systems including species diversity, agriculture, and infrastructure. Global climate change raises concerns that extend well beyond atmospheric and hydrological disciplines - as climate changes with time, the need for system adaptation becomes apparent. Drought, as a natural phenomenon, is typically defined relative to the climate in which it occurs. Typically a 30-year reference time frame (RTF) is used to determine the severity of a drought event. This study investigates the projected future droughts over North America with different RTFs. Confidence in future hydroclimate projection is characterized by the agreement of long term (2005-2100) multi-model precipitation (P) and temperature (T) projections within the Coupled model Intercomparison Project Phase 5 (CMIP5). Drought severity and the propensity of extreme conditions are measured by the multi-scalar, probabilistic, RTF-based Standard Precipitation Index (SPI) and Standard Precipitation Evapotranspiration Index (SPEI). SPI considers only P while SPEI incorporates Evapotranspiration (E) via T; comparing the two reveals the role of temperature change in future hydroclimate change. Future hydroclimate conditions, hydroclimate extremity, and CMIP5 model agreement are assessed for each Representative Concentration Pathway (RCP 2.6, 4.5, 6.0, 8.5) in regions throughout North America for the entire year and for the boreal seasons. In addition, multiple time scales of SPI and SPEI are calculated to characterize drought at time scales ranging from short to long term. The study explores a simple, standardized method for considering adaptation in future drought assessment, which provides a novel perspective to incorporate adaptation with climate change. The result of the analysis is a multi-dimension, probabilistic summary of the hydrological (P, E) environment a natural or artificial system must adapt to over time. Studies similar to this with specified criteria (SPI/SPEI value, time scale, RCP, etc.) can provide professionals in a variety of disciplines with necessary climatic insight to develop adaptation strategies.
NASA Astrophysics Data System (ADS)
Spangenberger, H.; Beck, F.; Richter, A.
The usual continuum shell model is extended so as to include a statistical treatment of multi-doorway processes. The total configuration space of the nuclear reaction problem is subdivided into the primary doorway states which are coupled by the initial excitation to the nuclear ground state and the secondary doorway states which represent the complicated nature of multi-step reactions. The latter are evaluated within the exciton model which gives the coupling widths between the various finestructure subspaces. This coupling is determined by a statistical factor related to the exciton model and a dynamical factor given by the interaction matrix elements of the interacting excitons. The whole structure defines the multi-doorway continuum shell model. In this work it is applied to the highly fragmented magnetic dipole strength in 58Ni observed in high resolution electron scattering.Translated AbstractAnwendung des Multi-Doorway-Kontinuum-Schalenmodells auf die Verteilung der magnetischen Dipolstärke von 58NiDas Kontinuum-Schalenmodell wurde so erweitert, daß auch statistische Multi-Doorway-Prozesse berücksichtigt werden können. Hierzu wird der Konfigurationsraum unterteilt in den Raum der primären Doorway-Zustände, die direkt aus dem Grundzustand angeregt werden, und den der sekundären Doorway-Zustände, die die komplizierte Struktur der Multi-Step-Reaktionen repräsentieren. Während die primären Doorway-Zustände inclusive ihrer Anregungen mittels üblicher Schalenmodellmethoden beschrieben werden können, werden die sekundären Doorway-Zustände sowie ihre verschiedenen Kopplungen im Rahmen des Exciton-Modells behandelt. Diese Kopplungen sind durch einen aus dem Exciton-Modell resultierenden Faktor sowie durch einen dynamischen Faktor bestimmt, der sich aus dem Matrixelement der wechselwirkenden Excitonen berechnet. Die Struktur der Kopplungen definiert das Multi-Doorway-Kontinuum-Schalenmodell, das hier auf die Beschreibung der stark fragmentierten magnetischen Dipolstärke in 58Ni angewendet wird.
Improving medium-range ensemble streamflow forecasts through statistical post-processing
NASA Astrophysics Data System (ADS)
Mendoza, Pablo; Wood, Andy; Clark, Elizabeth; Nijssen, Bart; Clark, Martyn; Ramos, Maria-Helena; Nowak, Kenneth; Arnold, Jeffrey
2017-04-01
Probabilistic hydrologic forecasts are a powerful source of information for decision-making in water resources operations. A common approach is the hydrologic model-based generation of streamflow forecast ensembles, which can be implemented to account for different sources of uncertainties - e.g., from initial hydrologic conditions (IHCs), weather forecasts, and hydrologic model structure and parameters. In practice, hydrologic ensemble forecasts typically have biases and spread errors stemming from errors in the aforementioned elements, resulting in a degradation of probabilistic properties. In this work, we compare several statistical post-processing techniques applied to medium-range ensemble streamflow forecasts obtained with the System for Hydromet Applications, Research and Prediction (SHARP). SHARP is a fully automated prediction system for the assessment and demonstration of short-term to seasonal streamflow forecasting applications, developed by the National Center for Atmospheric Research, University of Washington, U.S. Army Corps of Engineers, and U.S. Bureau of Reclamation. The suite of post-processing techniques includes linear blending, quantile mapping, extended logistic regression, quantile regression, ensemble analogs, and the generalized linear model post-processor (GLMPP). We assess and compare these techniques using multi-year hindcasts in several river basins in the western US. This presentation discusses preliminary findings about the effectiveness of the techniques for improving probabilistic skill, reliability, discrimination, sharpness and resolution.
A tesselated probabilistic representation for spatial robot perception and navigation
NASA Technical Reports Server (NTRS)
Elfes, Alberto
1989-01-01
The ability to recover robust spatial descriptions from sensory information and to efficiently utilize these descriptions in appropriate planning and problem-solving activities are crucial requirements for the development of more powerful robotic systems. Traditional approaches to sensor interpretation, with their emphasis on geometric models, are of limited use for autonomous mobile robots operating in and exploring unknown and unstructured environments. Here, researchers present a new approach to robot perception that addresses such scenarios using a probabilistic tesselated representation of spatial information called the Occupancy Grid. The Occupancy Grid is a multi-dimensional random field that maintains stochastic estimates of the occupancy state of each cell in the grid. The cell estimates are obtained by interpreting incoming range readings using probabilistic models that capture the uncertainty in the spatial information provided by the sensor. A Bayesian estimation procedure allows the incremental updating of the map using readings taken from several sensors over multiple points of view. An overview of the Occupancy Grid framework is given, and its application to a number of problems in mobile robot mapping and navigation are illustrated. It is argued that a number of robotic problem-solving activities can be performed directly on the Occupancy Grid representation. Some parallels are drawn between operations on Occupancy Grids and related image processing operations.
Stakeholder conceptualisation of multi-level HIV and AIDS determinants in a Black epicentre.
Brawner, Bridgette M; Reason, Janaiya L; Hanlon, Kelsey; Guthrie, Barbara; Schensul, Jean J
2017-09-01
HIV has reached epidemic proportions among African Americans in the USA but certain urban contexts appear to experience a disproportionate disease burden. Geographic information systems mapping in Philadelphia indicates increased HIV incidence and prevalence in predominantly Black census tracts, with major differences across adjacent communities. What factors shape these geographic HIV disparities among Black Philadelphians? This descriptive study was designed to refine and validate a conceptual model developed to better understand multi-level determinants of HIV-related risk among Black Philadelphians. We used an expanded ecological approach to elicit reflective perceptions from administrators, direct service providers and community members about individual, social and structural factors that interact to protect against or increase the risk for acquiring HIV within their community. Gender equity, social capital and positive cultural mores (e.g., monogamy, abstinence) were seen as the main protective factors. Historical negative contributory influences of racial residential segregation, poverty and incarceration were among the most salient risk factors. This study was a critical next step toward initiating theory-based, multi-level community-based HIV prevention initiatives.
Supervised Gamma Process Poisson Factorization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anderson, Dylan Zachary
This thesis develops the supervised gamma process Poisson factorization (S- GPPF) framework, a novel supervised topic model for joint modeling of count matrices and document labels. S-GPPF is fully generative and nonparametric: document labels and count matrices are modeled under a uni ed probabilistic framework and the number of latent topics is controlled automatically via a gamma process prior. The framework provides for multi-class classification of documents using a generative max-margin classifier. Several recent data augmentation techniques are leveraged to provide for exact inference using a Gibbs sampling scheme. The first portion of this thesis reviews supervised topic modeling andmore » several key mathematical devices used in the formulation of S-GPPF. The thesis then introduces the S-GPPF generative model and derives the conditional posterior distributions of the latent variables for posterior inference via Gibbs sampling. The S-GPPF is shown to exhibit state-of-the-art performance for joint topic modeling and document classification on a dataset of conference abstracts, beating out competing supervised topic models. The unique properties of S-GPPF along with its competitive performance make it a novel contribution to supervised topic modeling.« less
Relative Velocity as a Metric for Probability of Collision Calculations
NASA Technical Reports Server (NTRS)
Frigm, Ryan Clayton; Rohrbaugh, Dave
2008-01-01
Collision risk assessment metrics, such as the probability of collision calculation, are based largely on assumptions about the interaction of two objects during their close approach. Specifically, the approach to probabilistic risk assessment can be performed more easily if the relative trajectories of the two close approach objects are assumed to be linear during the encounter. It is shown in this analysis that one factor in determining linearity is the relative velocity of the two encountering bodies, in that the assumption of linearity breaks down at low relative approach velocities. The first part of this analysis is the determination of the relative velocity threshold below which the assumption of linearity becomes invalid. The second part is a statistical study of conjunction interactions between representative asset spacecraft and the associated debris field environment to determine the likelihood of encountering a low relative velocity close approach. This analysis is performed for both the LEO and GEO orbit regimes. Both parts comment on the resulting effects to collision risk assessment operations.
Integration of multi-omics data for integrative gene regulatory network inference.
Zarayeneh, Neda; Ko, Euiseong; Oh, Jung Hun; Suh, Sang; Liu, Chunyu; Gao, Jean; Kim, Donghyun; Kang, Mingon
2017-01-01
Gene regulatory networks provide comprehensive insights and indepth understanding of complex biological processes. The molecular interactions of gene regulatory networks are inferred from a single type of genomic data, e.g., gene expression data in most research. However, gene expression is a product of sequential interactions of multiple biological processes, such as DNA sequence variations, copy number variations, histone modifications, transcription factors, and DNA methylations. The recent rapid advances of high-throughput omics technologies enable one to measure multiple types of omics data, called 'multi-omics data', that represent the various biological processes. In this paper, we propose an Integrative Gene Regulatory Network inference method (iGRN) that incorporates multi-omics data and their interactions in gene regulatory networks. In addition to gene expressions, copy number variations and DNA methylations were considered for multi-omics data in this paper. The intensive experiments were carried out with simulation data, where iGRN's capability that infers the integrative gene regulatory network is assessed. Through the experiments, iGRN shows its better performance on model representation and interpretation than other integrative methods in gene regulatory network inference. iGRN was also applied to a human brain dataset of psychiatric disorders, and the biological network of psychiatric disorders was analysed.
Integration of multi-omics data for integrative gene regulatory network inference
Zarayeneh, Neda; Ko, Euiseong; Oh, Jung Hun; Suh, Sang; Liu, Chunyu; Gao, Jean; Kim, Donghyun
2017-01-01
Gene regulatory networks provide comprehensive insights and indepth understanding of complex biological processes. The molecular interactions of gene regulatory networks are inferred from a single type of genomic data, e.g., gene expression data in most research. However, gene expression is a product of sequential interactions of multiple biological processes, such as DNA sequence variations, copy number variations, histone modifications, transcription factors, and DNA methylations. The recent rapid advances of high-throughput omics technologies enable one to measure multiple types of omics data, called ‘multi-omics data’, that represent the various biological processes. In this paper, we propose an Integrative Gene Regulatory Network inference method (iGRN) that incorporates multi-omics data and their interactions in gene regulatory networks. In addition to gene expressions, copy number variations and DNA methylations were considered for multi-omics data in this paper. The intensive experiments were carried out with simulation data, where iGRN’s capability that infers the integrative gene regulatory network is assessed. Through the experiments, iGRN shows its better performance on model representation and interpretation than other integrative methods in gene regulatory network inference. iGRN was also applied to a human brain dataset of psychiatric disorders, and the biological network of psychiatric disorders was analysed. PMID:29354189
Detection of gene communities in multi-networks reveals cancer drivers
NASA Astrophysics Data System (ADS)
Cantini, Laura; Medico, Enzo; Fortunato, Santo; Caselle, Michele
2015-12-01
We propose a new multi-network-based strategy to integrate different layers of genomic information and use them in a coordinate way to identify driving cancer genes. The multi-networks that we consider combine transcription factor co-targeting, microRNA co-targeting, protein-protein interaction and gene co-expression networks. The rationale behind this choice is that gene co-expression and protein-protein interactions require a tight coregulation of the partners and that such a fine tuned regulation can be obtained only combining both the transcriptional and post-transcriptional layers of regulation. To extract the relevant biological information from the multi-network we studied its partition into communities. To this end we applied a consensus clustering algorithm based on state of art community detection methods. Even if our procedure is valid in principle for any pathology in this work we concentrate on gastric, lung, pancreas and colorectal cancer and identified from the enrichment analysis of the multi-network communities a set of candidate driver cancer genes. Some of them were already known oncogenes while a few are new. The combination of the different layers of information allowed us to extract from the multi-network indications on the regulatory pattern and functional role of both the already known and the new candidate driver genes.
Probabilistic and deterministic evaluation of uncertainty in a local scale multi-risk analysis
NASA Astrophysics Data System (ADS)
Lari, S.; Frattini, P.; Crosta, G. B.
2009-04-01
We performed a probabilistic multi-risk analysis (QPRA) at the local scale for a 420 km2 area surrounding the town of Brescia (Northern Italy). We calculated the expected annual loss in terms of economical damage and life loss, for a set of risk scenarios of flood, earthquake and industrial accident with different occurrence probabilities and different intensities. The territorial unit used for the study was the census parcel, of variable area, for which a large amount of data was available. Due to the lack of information related to the evaluation of the hazards, to the value of the exposed elements (e.g., residential and industrial area, population, lifelines, sensitive elements as schools, hospitals) and to the process-specific vulnerability, and to a lack of knowledge of the processes (floods, industrial accidents, earthquakes), we assigned an uncertainty to the input variables of the analysis. For some variables an homogeneous uncertainty was assigned on the whole study area, as for instance for the number of buildings of various typologies, and for the event occurrence probability. In other cases, as for phenomena intensity (e.g.,depth of water during flood) and probability of impact, the uncertainty was defined in relation to the census parcel area. In fact assuming some variables homogeneously diffused or averaged on the census parcels, we introduce a larger error for larger parcels. We propagated the uncertainty in the analysis using three different models, describing the reliability of the output (risk) as a function of the uncertainty of the inputs (scenarios and vulnerability functions). We developed a probabilistic approach based on Monte Carlo simulation, and two deterministic models, namely First Order Second Moment (FOSM) and Point Estimate (PE). In general, similar values of expected losses are obtained with the three models. The uncertainty of the final risk value is in the three cases around the 30% of the expected value. Each of the models, nevertheless, requires different assumptions and computational efforts, and provides results with different level of detail.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shevitz, Daniel Wolf; Key, Brian P.; Garcia, Daniel B.
2017-09-05
The Fragment Impact Toolkit (FIT) is a software package used for probabilistic consequence evaluation of fragmenting sources. The typical use case for FIT is to simulate an exploding shell and evaluate the consequence on nearby objects. FIT is written in the programming language Python and is designed as a collection of interacting software modules. Each module has a function that interacts with the other modules to produce desired results.
Affective and cognitive factors influencing sensitivity to probabilistic information.
Tyszka, Tadeusz; Sawicki, Przemyslaw
2011-11-01
In study 1 different groups of female students were randomly assigned to one of four probabilistic information formats. Five different levels of probability of a genetic disease in an unborn child were presented to participants (within-subject factor). After the presentation of the probability level, participants were requested to indicate the acceptable level of pain they would tolerate to avoid the disease (in their unborn child), their subjective evaluation of the disease risk, and their subjective evaluation of being worried by this risk. The results of study 1 confirmed the hypothesis that an experience-based probability format decreases the subjective sense of worry about the disease, thus, presumably, weakening the tendency to overrate the probability of rare events. Study 2 showed that for the emotionally laden stimuli, the experience-based probability format resulted in higher sensitivity to probability variations than other formats of probabilistic information. These advantages of the experience-based probability format are interpreted in terms of two systems of information processing: the rational deliberative versus the affective experiential and the principle of stimulus-response compatibility. © 2011 Society for Risk Analysis.
NASA Astrophysics Data System (ADS)
Degtyar, V. G.; Kalashnikov, S. T.; Mokin, Yu. A.
2017-10-01
The paper considers problems of analyzing aerodynamic properties (ADP) of reenetry vehicles (RV) as blunted rotary bodies with small random surface distortions. The interactions of math simulation of surface distortions, selection of tools for predicting ADPs of shaped bodies, evaluation of different-type ADP variations and their adaptation for dynamic problems are analyzed. The possibilities of deterministic and probabilistic approaches to evaluation of ADP variations are considered. The practical value of the probabilistic approach is demonstrated. The examples of extremal deterministic evaluations of ADP variations for a sphere and a sharp cone are given.
A hierarchical model for probabilistic independent component analysis of multi-subject fMRI studies
Tang, Li
2014-01-01
Summary An important goal in fMRI studies is to decompose the observed series of brain images to identify and characterize underlying brain functional networks. Independent component analysis (ICA) has been shown to be a powerful computational tool for this purpose. Classic ICA has been successfully applied to single-subject fMRI data. The extension of ICA to group inferences in neuroimaging studies, however, is challenging due to the unavailability of a pre-specified group design matrix. Existing group ICA methods generally concatenate observed fMRI data across subjects on the temporal domain and then decompose multi-subject data in a similar manner to single-subject ICA. The major limitation of existing methods is that they ignore between-subject variability in spatial distributions of brain functional networks in group ICA. In this paper, we propose a new hierarchical probabilistic group ICA method to formally model subject-specific effects in both temporal and spatial domains when decomposing multi-subject fMRI data. The proposed method provides model-based estimation of brain functional networks at both the population and subject level. An important advantage of the hierarchical model is that it provides a formal statistical framework to investigate similarities and differences in brain functional networks across subjects, e.g., subjects with mental disorders or neurodegenerative diseases such as Parkinson’s as compared to normal subjects. We develop an EM algorithm for model estimation where both the E-step and M-step have explicit forms. We compare the performance of the proposed hierarchical model with that of two popular group ICA methods via simulation studies. We illustrate our method with application to an fMRI study of Zen meditation. PMID:24033125
Innovating Big Data Computing Geoprocessing for Analysis of Engineered-Natural Systems
NASA Astrophysics Data System (ADS)
Rose, K.; Baker, V.; Bauer, J. R.; Vasylkivska, V.
2016-12-01
Big data computing and analytical techniques offer opportunities to improve predictions about subsurface systems while quantifying and characterizing associated uncertainties from these analyses. Spatial analysis, big data and otherwise, of subsurface natural and engineered systems are based on variable resolution, discontinuous, and often point-driven data to represent continuous phenomena. We will present examples from two spatio-temporal methods that have been adapted for use with big datasets and big data geo-processing capabilities. The first approach uses regional earthquake data to evaluate spatio-temporal trends associated with natural and induced seismicity. The second algorithm, the Variable Grid Method (VGM), is a flexible approach that presents spatial trends and patterns, such as those resulting from interpolation methods, while simultaneously visualizing and quantifying uncertainty in the underlying spatial datasets. In this presentation we will show how we are utilizing Hadoop to store and perform spatial analyses to efficiently consume and utilize large geospatial data in these custom analytical algorithms through the development of custom Spark and MapReduce applications that incorporate ESRI Hadoop libraries. The team will present custom `Big Data' geospatial applications that run on the Hadoop cluster and integrate with ESRI ArcMap with the team's probabilistic VGM approach. The VGM-Hadoop tool has been specially built as a multi-step MapReduce application running on the Hadoop cluster for the purpose of data reduction. This reduction is accomplished by generating multi-resolution, non-overlapping, attributed topology that is then further processed using ESRI's geostatistical analyst to convey a probabilistic model of a chosen study region. Finally, we will share our approach for implementation of data reduction and topology generation via custom multi-step Hadoop applications, performance benchmarking comparisons, and Hadoop-centric opportunities for greater parallelization of geospatial operations.
1983-09-01
al. (1981) was conducted on Copper City No. 2 tailings embankment damn near Miami, Arizona . Due to the extreme topographic relief in the area of the...mode of behavior and scale. ThiL dependency is summarized in the factor R. For example, circular shear instability as in a copper porphyry slope...OF THE PROBABILISTIC SLOPE STABILITY MODEL. . 32 6.1 DESCRIPTION OF COPPER CITY NUMBER 2 TAILINGS DAM . . 32 6.2 SUBSURFACE INVESTIGATION
Fracture mechanics analysis of cracked structures using weight function and neural network method
NASA Astrophysics Data System (ADS)
Chen, J. G.; Zang, F. G.; Yang, Y.; Shi, K. K.; Fu, X. L.
2018-06-01
Stress intensity factors(SIFs) due to thermal-mechanical load has been established by using weight function method. Two reference stress states sere used to determine the coefficients in the weight function. Results were evaluated by using data from literature and show a good agreement between them. So, the SIFs can be determined quickly using the weight function obtained when cracks subjected to arbitrary loads, and presented method can be used for probabilistic fracture mechanics analysis. A probabilistic methodology considering Monte-Carlo with neural network (MCNN) has been developed. The results indicate that an accurate probabilistic characteristic of the KI can be obtained by using the developed method. The probability of failure increases with the increasing of loads, and the relationship between is nonlinear.
An investigation into the probabilistic combination of quasi-static and random accelerations
NASA Technical Reports Server (NTRS)
Schock, R. W.; Tuell, L. P.
1984-01-01
The development of design load factors for aerospace and aircraft components and experiment support structures, which are subject to a simultaneous vehicle dynamic vibration (quasi-static) and acoustically generated random vibration, require the selection of a combination methodology. Typically, the procedure is to define the quasi-static and the random generated response separately, and arithmetically add or root sum square to get combined accelerations. Since the combination of a probabilistic and a deterministic function yield a probabilistic function, a viable alternate approach would be to determine the characteristics of the combined acceleration probability density function and select an appropriate percentile level for the combined acceleration. The following paper develops this mechanism and provides graphical data to select combined accelerations for most popular percentile levels.
Framing Ethnic Variations in Alcohol Outcomes from Biological Pathways to Neighborhood Context
Chartier, Karen G.; Scott, Denise M.; Wall, Tamara L.; Covault, Jonathan; Karriker-Jaffe, Katherine J.; Mills, Britain A.; Luczak, Susan E.; Caetano, Raul; Arroyo, Judith A.
2013-01-01
Health disparities research seeks to eliminate disproportionate negative health outcomes experienced in some racial/ethnic minority groups. This brief review presents findings on factors associated with drinking and alcohol-related problems in racial/ethnic groups. Those discussed are: 1) biological pathways to alcohol problems, 2) gene by stress interactions, 3) neighborhood disadvantage, stress, and access to alcohol, and 4) drinking cultures and contexts. These factors and their interrelationships are complex, requiring a multi-level perspective. The use of interdisciplinary teams and an epigenetic focus are suggested to move the research forward. The application of multi-level research to policy, prevention, and intervention programs may help prioritize combinations of the most promising intervention targets. PMID:24483624
New insights into the multi-scale climatic drivers of the "Karakoram anomaly"
NASA Astrophysics Data System (ADS)
Collier, S.; Moelg, T.; Nicholson, L. I.; Maussion, F.; Scherer, D.; Bush, A. B.
2012-12-01
Glacier behaviour in the Karakoram region of the northwestern Himalaya shows strong spatial and temporal heterogeneity and, in some basins, anomalous trends compared with glaciers elsewhere in High Asia. Our knowledge of the mass balance fluctuations of Karakoram glaciers as well as of the important driving factors and interactions between them is limited by a scarcity of in-situ measurements and other studies. Here we employ a novel approach to simulating atmosphere-cryosphere interactions - coupled high-resolution atmospheric and physically-based surface mass balance modelling - to examine the surface energy and mass fluxes of glaciers in this region. We discuss the mesoscale climatic drivers behind surface mass balance fluctuations as well as the influence of local forcing factors, such as debris cover and feedbacks from the glacier surface to the atmosphere. The coupled modelling approach therefore provides an innovative, multi-scale solution to the paucity of information we have to date on the much-debated "Karakoram anomaly."
Simulation Based Earthquake Forecasting with RSQSim
NASA Astrophysics Data System (ADS)
Gilchrist, J. J.; Jordan, T. H.; Dieterich, J. H.; Richards-Dinger, K. B.
2016-12-01
We are developing a physics-based forecasting model for earthquake ruptures in California. We employ the 3D boundary element code RSQSim to generate synthetic catalogs with millions of events that span up to a million years. The simulations incorporate rate-state fault constitutive properties in complex, fully interacting fault systems. The Unified California Earthquake Rupture Forecast Version 3 (UCERF3) model and data sets are used for calibration of the catalogs and specification of fault geometry. Fault slip rates match the UCERF3 geologic slip rates and catalogs are tuned such that earthquake recurrence matches the UCERF3 model. Utilizing the Blue Waters Supercomputer, we produce a suite of million-year catalogs to investigate the epistemic uncertainty in the physical parameters used in the simulations. In particular, values of the rate- and state-friction parameters a and b, the initial shear and normal stress, as well as the earthquake slip speed, are varied over several simulations. In addition to testing multiple models with homogeneous values of the physical parameters, the parameters a, b, and the normal stress are varied with depth as well as in heterogeneous patterns across the faults. Cross validation of UCERF3 and RSQSim is performed within the SCEC Collaboratory for Interseismic Simulation and Modeling (CISM) to determine the affect of the uncertainties in physical parameters observed in the field and measured in the lab, on the uncertainties in probabilistic forecasting. We are particularly interested in the short-term hazards of multi-event sequences due to complex faulting and multi-fault ruptures.
Rats bred for high alcohol drinking are more sensitive to delayed and probabilistic outcomes.
Wilhelm, C J; Mitchell, S H
2008-10-01
Alcoholics and heavy drinkers score higher on measures of impulsivity than nonalcoholics and light drinkers. This may be because of factors that predate drug exposure (e.g. genetics). This study examined the role of genetics by comparing impulsivity measures in ethanol-naive rats selectively bred based on their high [high alcohol drinking (HAD)] or low [low alcohol drinking (LAD)] consumption of ethanol. Replicates 1 and 2 of the HAD and LAD rats, developed by the University of Indiana Alcohol Research Center, completed two different discounting tasks. Delay discounting examines sensitivity to rewards that are delayed in time and is commonly used to assess 'choice' impulsivity. Probability discounting examines sensitivity to the uncertain delivery of rewards and has been used to assess risk taking and risk assessment. High alcohol drinking rats discounted delayed and probabilistic rewards more steeply than LAD rats. Discount rates associated with probabilistic and delayed rewards were weakly correlated, while bias was strongly correlated with discount rate in both delay and probability discounting. The results suggest that selective breeding for high alcohol consumption selects for animals that are more sensitive to delayed and probabilistic outcomes. Sensitivity to delayed or probabilistic outcomes may be predictive of future drinking in genetically predisposed individuals.
Kang, Guangliang; Du, Li; Zhang, Hong
2016-06-22
The growing complexity of biological experiment design based on high-throughput RNA sequencing (RNA-seq) is calling for more accommodative statistical tools. We focus on differential expression (DE) analysis using RNA-seq data in the presence of multiple treatment conditions. We propose a novel method, multiDE, for facilitating DE analysis using RNA-seq read count data with multiple treatment conditions. The read count is assumed to follow a log-linear model incorporating two factors (i.e., condition and gene), where an interaction term is used to quantify the association between gene and condition. The number of the degrees of freedom is reduced to one through the first order decomposition of the interaction, leading to a dramatically power improvement in testing DE genes when the number of conditions is greater than two. In our simulation situations, multiDE outperformed the benchmark methods (i.e. edgeR and DESeq2) even if the underlying model was severely misspecified, and the power gain was increasing in the number of conditions. In the application to two real datasets, multiDE identified more biologically meaningful DE genes than the benchmark methods. An R package implementing multiDE is available publicly at http://homepage.fudan.edu.cn/zhangh/softwares/multiDE . When the number of conditions is two, multiDE performs comparably with the benchmark methods. When the number of conditions is greater than two, multiDE outperforms the benchmark methods.
Propagation of hydroclimatic variability through the critical zone
NASA Astrophysics Data System (ADS)
Porporato, A. M.; Calabrese, S.; Parolari, A.
2016-12-01
The interaction between soil moisture dynamics and mineral-weathering reactions (e.g., ion exchange, precipitation-dissolution) affects the availability of nutrients to plants, composition of soils, soil acidification, as well as CO2 sequestration. Across the critical zone (CZ), this interaction is responsible for propagating hydroclimatic fluctuations to deeper soil layers, controlling weathering rates via leaching events which intermittently alter the alkalinity levels. In this contribution, we analyze these dynamics using a stochastic modeling approach based on spatially lumped description of soil hydrology and chemical weathering reactions forced by multi-scale temporal hydrologic variability. We quantify the role of soil moisture dynamics in filtering the rainfall fluctuations through its impacts on soil water chemistry, described by a system of ordinary differential equations (and algebraic equations, for the equilibrium reactions), driving the evolution of alkalinity, pH, the chemical species of the soil solution, and the mineral-weathering rate. A probabilistic description of the evolution of the critical zone is thus obtained, allowing us to describe the CZ response to long-term climate fluctuations, ecosystem and land-use conditions, in terms of key variables groups. The model is applied to the weathering rate of albite in the Calhoun CZ observatory and then extended to explore similarities and differences across other CZs. Typical time scales of response and degrees of sensitivities of CZ to hydroclimatic fluctuations and human forcing are also explored.
NASA Astrophysics Data System (ADS)
Pilone, D.; Quinn, P.; Mitchell, A. E.; Baynes, K.; Shum, D.
2014-12-01
This talk introduces the audience to some of the very real challenges associated with visualizing data from disparate data sources as encountered during the development of real world applications. In addition to the fundamental challenges of dealing with the data and imagery, this talk discusses usability problems encountered while trying to provide interactive and user-friendly visualization tools. At the end of this talk the audience will be aware of some of the pitfalls of data visualization along with tools and techniques to help mitigate them. There are many sources of variable resolution visualizations of science data available to application developers including NASA's Global Imagery Browse Services (GIBS), however integrating and leveraging visualizations in modern applications faces a number of challenges, including: - Varying visualized Earth "tile sizes" resulting in challenges merging disparate sources - Multiple visualization frameworks and toolkits with varying strengths and weaknesses - Global composite imagery vs. imagery matching EOSDIS granule distribution - Challenges visualizing geographically overlapping data with different temporal bounds - User interaction with overlapping or collocated data - Complex data boundaries and shapes combined with multi-orbit data and polar projections - Discovering the availability of visualizations and the specific parameters, color palettes, and configurations used to produce them In addition to discussing the challenges and approaches involved in visualizing disparate data, we will discuss solutions and components we'll be making available as open source to encourage reuse and accelerate application development.
Ray, Chad A; Patel, Vimal; Shih, Judy; Macaraeg, Chris; Wu, Yuling; Thway, Theingi; Ma, Mark; Lee, Jean W; Desilva, Binodh
2009-02-20
Developing a process that generates robust immunoassays that can be used to support studies with tight timelines is a common challenge for bioanalytical laboratories. Design of experiments (DOEs) is a tool that has been used by many industries for the purpose of optimizing processes. The approach is capable of identifying critical factors and their interactions with a minimal number of experiments. The challenge for implementing this tool in the bioanalytical laboratory is to develop a user-friendly approach that scientists can understand and apply. We have successfully addressed these challenges by eliminating the screening design, introducing automation, and applying a simple mathematical approach for the output parameter. A modified central composite design (CCD) was applied to three ligand binding assays. The intra-plate factors selected were coating, detection antibody concentration, and streptavidin-HRP concentrations. The inter-plate factors included incubation times for each step. The objective was to maximize the logS/B (S/B) of the low standard to the blank. The maximum desirable conditions were determined using JMP 7.0. To verify the validity of the predictions, the logS/B prediction was compared against the observed logS/B during pre-study validation experiments. The three assays were optimized using the multi-factorial DOE. The total error for all three methods was less than 20% which indicated method robustness. DOE identified interactions in one of the methods. The model predictions for logS/B were within 25% of the observed pre-study validation values for all methods tested. The comparison between the CCD and hybrid screening design yielded comparable parameter estimates. The user-friendly design enables effective application of multi-factorial DOE to optimize ligand binding assays for therapeutic proteins. The approach allows for identification of interactions between factors, consistency in optimal parameter determination, and reduced method development time.
Unsteady Probabilistic Analysis of a Gas Turbine System
NASA Technical Reports Server (NTRS)
Brown, Marilyn
2003-01-01
In this work, we have considered an annular cascade configuration subjected to unsteady inflow conditions. The unsteady response calculation has been implemented into the time marching CFD code, MSUTURBO. The computed steady state results for the pressure distribution demonstrated good agreement with experimental data. We have computed results for the amplitudes of the unsteady pressure over the blade surfaces. With the increase in gas turbine engine structural complexity and performance over the past 50 years, structural engineers have created an array of safety nets to ensure against component failures in turbine engines. In order to reduce what is now considered to be excessive conservatism and yet maintain the same adequate margins of safety, there is a pressing need to explore methods of incorporating probabilistic design procedures into engine development. Probabilistic methods combine and prioritize the statistical distributions of each design variable, generate an interactive distribution and offer the designer a quantified relationship between robustness, endurance and performance. The designer can therefore iterate between weight reduction, life increase, engine size reduction, speed increase etc.
NASA Technical Reports Server (NTRS)
Bast, Callie C.; Boyce, Lola
1995-01-01
The development of methodology for a probabilistic material strength degradation is described. The probabilistic model, in the form of a postulated randomized multifactor equation, provides for quantification of uncertainty in the lifetime material strength of aerospace propulsion system components subjected to a number of diverse random effects. This model is embodied in the computer program entitled PROMISS, which can include up to eighteen different effects. Presently, the model includes five effects that typically reduce lifetime strength: high temperature, high-cycle mechanical fatigue, low-cycle mechanical fatigue, creep and thermal fatigue. Results, in the form of cumulative distribution functions, illustrated the sensitivity of lifetime strength to any current value of an effect. In addition, verification studies comparing predictions of high-cycle mechanical fatigue and high temperature effects with experiments are presented. Results from this limited verification study strongly supported that material degradation can be represented by randomized multifactor interaction models.
Mean and modal ϵ in the deaggregation of probabilistic ground motion
Harmsen, Stephen C.
2001-01-01
Mean and modal ϵ exhibit a wide variation geographically for any specified PE. Modal ϵ for the 2% in 50 yr PE exceeds 2 near the most active western California faults, is less than –1 near some less active faults of the western United States (principally in the Basin and Range), and may be less than 0 in areal fault zones of the central and eastern United States (CEUS). This geographic variation is useful for comparing probabilistic ground motions with ground motions from scenario earthquakes on dominating faults, often used in seismic-resistant provisions of building codes. An interactive seismic-hazard deaggregation menu item has been added to the USGS probabilistic seismic-hazard analysis Web site, http://geohazards.cr.usgs.gov/eq/, allowing visitors to compute mean and modal distance, magnitude, and ϵ corresponding to ground motions having mean return times from 250 to 5000 yr for any site in the United States.
Probabilistic Open Set Recognition
NASA Astrophysics Data System (ADS)
Jain, Lalit Prithviraj
Real-world tasks in computer vision, pattern recognition and machine learning often touch upon the open set recognition problem: multi-class recognition with incomplete knowledge of the world and many unknown inputs. An obvious way to approach such problems is to develop a recognition system that thresholds probabilities to reject unknown classes. Traditional rejection techniques are not about the unknown; they are about the uncertain boundary and rejection around that boundary. Thus traditional techniques only represent the "known unknowns". However, a proper open set recognition algorithm is needed to reduce the risk from the "unknown unknowns". This dissertation examines this concept and finds existing probabilistic multi-class recognition approaches are ineffective for true open set recognition. We hypothesize the cause is due to weak adhoc assumptions combined with closed-world assumptions made by existing calibration techniques. Intuitively, if we could accurately model just the positive data for any known class without overfitting, we could reject the large set of unknown classes even under this assumption of incomplete class knowledge. For this, we formulate the problem as one of modeling positive training data by invoking statistical extreme value theory (EVT) near the decision boundary of positive data with respect to negative data. We provide a new algorithm called the PI-SVM for estimating the unnormalized posterior probability of class inclusion. This dissertation also introduces a new open set recognition model called Compact Abating Probability (CAP), where the probability of class membership decreases in value (abates) as points move from known data toward open space. We show that CAP models improve open set recognition for multiple algorithms. Leveraging the CAP formulation, we go on to describe the novel Weibull-calibrated SVM (W-SVM) algorithm, which combines the useful properties of statistical EVT for score calibration with one-class and binary support vector machines. Building from the success of statistical EVT based recognition methods such as PI-SVM and W-SVM on the open set problem, we present a new general supervised learning algorithm for multi-class classification and multi-class open set recognition called the Extreme Value Local Basis (EVLB). The design of this algorithm is motivated by the observation that extrema from known negative class distributions are the closest negative points to any positive sample during training, and thus should be used to define the parameters of a probabilistic decision model. In the EVLB, the kernel distribution for each positive training sample is estimated via an EVT distribution fit over the distances to the separating hyperplane between positive training sample and closest negative samples, with a subset of the overall positive training data retained to form a probabilistic decision boundary. Using this subset as a frame of reference, the probability of a sample at test time decreases as it moves away from the positive class. Possessing this property, the EVLB is well-suited to open set recognition problems where samples from unknown or novel classes are encountered at test. Our experimental evaluation shows that the EVLB provides a substantial improvement in scalability compared to standard radial basis function kernel machines, as well as P I-SVM and W-SVM, with improved accuracy in many cases. We evaluate our algorithm on open set variations of the standard visual learning benchmarks, as well as with an open subset of classes from Caltech 256 and ImageNet. Our experiments show that PI-SVM, WSVM and EVLB provide significant advances over the previous state-of-the-art solutions for the same tasks.
Common Randomness Principles of Secrecy
ERIC Educational Resources Information Center
Tyagi, Himanshu
2013-01-01
This dissertation concerns the secure processing of distributed data by multiple terminals, using interactive public communication among themselves, in order to accomplish a given computational task. In the setting of a probabilistic multiterminal source model in which several terminals observe correlated random signals, we analyze secure…
Classen, Sherrilene; Lopez, Ellen DS; Winter, Sandra; Awadzi, Kezia D; Ferree, Nita; Garvan, Cynthia W
2007-01-01
The topic of motor vehicle crashes among the elderly is dynamic and multi-faceted requiring a comprehensive and synergistic approach to intervention planning. This approach must be based on the values of a given population as well as health statistics and asserted through community, organizational and policy strategies. An integrated summary of the predictors (quantitative research), and views (qualitative research) of the older drivers and their stakeholders, does not currently exist. This study provided an explicit socio-ecological view explaining the interrelation of possible causative factors, an integrated summary of these causative factors, and empirical guidelines for developing public health interventions to promote older driver safety. Using a mixed methods approach, we were able to compare and integrate main findings from a national crash dataset with perspectives of stakeholders. We identified: 11 multi-causal factors for safe elderly driving; the importance of the environmental factors - previously underrated in the literature- interacting with behavioral and health factors; and the interrelatedness among many socio-ecological factors. For the first time, to our knowledge, we conceptualized the fundamental elements of a multi-causal health promotion plan, with measurable intermediate and long-term outcomes. After completing the detailed plan we will test the effectiveness of this intervention on multiple levels. PMID:18225470
Symmetric nonnegative matrix factorization: algorithms and applications to probabilistic clustering.
He, Zhaoshui; Xie, Shengli; Zdunek, Rafal; Zhou, Guoxu; Cichocki, Andrzej
2011-12-01
Nonnegative matrix factorization (NMF) is an unsupervised learning method useful in various applications including image processing and semantic analysis of documents. This paper focuses on symmetric NMF (SNMF), which is a special case of NMF decomposition. Three parallel multiplicative update algorithms using level 3 basic linear algebra subprograms directly are developed for this problem. First, by minimizing the Euclidean distance, a multiplicative update algorithm is proposed, and its convergence under mild conditions is proved. Based on it, we further propose another two fast parallel methods: α-SNMF and β -SNMF algorithms. All of them are easy to implement. These algorithms are applied to probabilistic clustering. We demonstrate their effectiveness for facial image clustering, document categorization, and pattern clustering in gene expression.
Selection of higher order regression models in the analysis of multi-factorial transcription data.
Prazeres da Costa, Olivia; Hoffman, Arthur; Rey, Johannes W; Mansmann, Ulrich; Buch, Thorsten; Tresch, Achim
2014-01-01
Many studies examine gene expression data that has been obtained under the influence of multiple factors, such as genetic background, environmental conditions, or exposure to diseases. The interplay of multiple factors may lead to effect modification and confounding. Higher order linear regression models can account for these effects. We present a new methodology for linear model selection and apply it to microarray data of bone marrow-derived macrophages. This experiment investigates the influence of three variable factors: the genetic background of the mice from which the macrophages were obtained, Yersinia enterocolitica infection (two strains, and a mock control), and treatment/non-treatment with interferon-γ. We set up four different linear regression models in a hierarchical order. We introduce the eruption plot as a new practical tool for model selection complementary to global testing. It visually compares the size and significance of effect estimates between two nested models. Using this methodology we were able to select the most appropriate model by keeping only relevant factors showing additional explanatory power. Application to experimental data allowed us to qualify the interaction of factors as either neutral (no interaction), alleviating (co-occurring effects are weaker than expected from the single effects), or aggravating (stronger than expected). We find a biologically meaningful gene cluster of putative C2TA target genes that appear to be co-regulated with MHC class II genes. We introduced the eruption plot as a tool for visual model comparison to identify relevant higher order interactions in the analysis of expression data obtained under the influence of multiple factors. We conclude that model selection in higher order linear regression models should generally be performed for the analysis of multi-factorial microarray data.
Evidence of Probabilistic Behaviour in Protein Interaction Networks
2008-01-31
Evidence of degree-weighted connectivity in nine PPI networks. a, Homo sapiens (human); b, Drosophila melanogaster (fruit fly); c-e, Saccharomyces...illustrates maps for the networks of Homo sapiens and Dro- sophila melanogaster, while maps for the remaining net- works are provided in Additional file 2. As...protein-protein interaction networks. a, Homo sapiens ; b, Drosophila melanogaster. Distances shown as average shortest path lengths L(k1, k2) between
NASA Astrophysics Data System (ADS)
Lv, Zhong; Chen, Huisu
2014-10-01
Autonomous healing of cracks using pre-embedded capsules containing healing agent is becoming a promising approach to restore the strength of damaged structures. In addition to the material properties, the size and volume fraction of capsules influence crack healing in the matrix. Understanding the crack and capsule interaction is critical in the development and design of structures made of self-healing materials. Assuming that the pre-embedded capsules are randomly dispersed we theoretically model flat ellipsoidal crack interaction with capsules and determine the probability of a crack intersecting the pre-embedded capsules i.e. the self-healing probability. We also develop a probabilistic model of a crack simultaneously meeting with capsules and catalyst carriers in two-component self-healing system matrix. Using a risk-based healing approach, we determine the volume fraction and size of the pre-embedded capsules that are required to achieve a certain self-healing probability. To understand the effect of the shape of the capsules on self-healing we theoretically modeled crack interaction with spherical and cylindrical capsules. We compared the results of our theoretical model with Monte-Carlo simulations of crack interaction with capsules. The formulae presented in this paper will provide guidelines for engineers working with self-healing structures in material selection and sustenance.
Reddy, Lena Felice; Waltz, James A; Green, Michael F; Wynn, Jonathan K; Horan, William P
2016-07-01
Although individuals with schizophrenia show impaired feedback-driven learning on probabilistic reversal learning (PRL) tasks, the specific factors that contribute to these deficits remain unknown. Recent work has suggested several potential causes including neurocognitive impairments, clinical symptoms, and specific types of feedback-related errors. To examine this issue, we administered a PRL task to 126 stable schizophrenia outpatients and 72 matched controls, and patients were retested 4 weeks later. The task involved an initial probabilistic discrimination learning phase and subsequent reversal phases in which subjects had to adjust their responses to sudden shifts in the reinforcement contingencies. Patients showed poorer performance than controls for both the initial discrimination and reversal learning phases of the task, and performance overall showed good test-retest reliability among patients. A subgroup analysis of patients (n = 64) and controls (n = 49) with good initial discrimination learning revealed no between-group differences in reversal learning, indicating that the patients who were able to achieve all of the initial probabilistic discriminations were not impaired in reversal learning. Regarding potential contributors to impaired discrimination learning, several factors were associated with poor PRL, including higher levels of neurocognitive impairment, poor learning from both positive and negative feedback, and higher levels of indiscriminate response shifting. The results suggest that poor PRL performance in schizophrenia can be the product of multiple mechanisms. © The Author 2016. Published by Oxford University Press on behalf of the Maryland Psychiatric Research Center. All rights reserved. For permissions, please email: journals.permissions@oup.com.
Poças, Maria F; Oliveira, Jorge C; Brandsch, Rainer; Hogg, Timothy
2010-07-01
The use of probabilistic approaches in exposure assessments of contaminants migrating from food packages is of increasing interest but the lack of concentration or migration data is often referred as a limitation. Data accounting for the variability and uncertainty that can be expected in migration, for example, due to heterogeneity in the packaging system, variation of the temperature along the distribution chain, and different time of consumption of each individual package, are required for probabilistic analysis. The objective of this work was to characterize quantitatively the uncertainty and variability in estimates of migration. A Monte Carlo simulation was applied to a typical solution of the Fick's law with given variability in the input parameters. The analysis was performed based on experimental data of a model system (migration of Irgafos 168 from polyethylene into isooctane) and illustrates how important sources of variability and uncertainty can be identified in order to refine analyses. For long migration times and controlled conditions of temperature the affinity of the migrant to the food can be the major factor determining the variability in the migration values (more than 70% of variance). In situations where both the time of consumption and temperature can vary, these factors can be responsible, respectively, for more than 60% and 20% of the variance in the migration estimates. The approach presented can be used with databases from consumption surveys to yield a true probabilistic estimate of exposure.
Effects of delay and probability combinations on discounting in humans.
Cox, David J; Dallery, Jesse
2016-10-01
To determine discount rates, researchers typically adjust the amount of an immediate or certain option relative to a delayed or uncertain option. Because this adjusting amount method can be relatively time consuming, researchers have developed more efficient procedures. One such procedure is a 5-trial adjusting delay procedure, which measures the delay at which an amount of money loses half of its value (e.g., $1000 is valued at $500 with a 10-year delay to its receipt). Experiment 1 (n=212) used 5-trial adjusting delay or probability tasks to measure delay discounting of losses, probabilistic gains, and probabilistic losses. Experiment 2 (n=98) assessed combined probabilistic and delayed alternatives. In both experiments, we compared results from 5-trial adjusting delay or probability tasks to traditional adjusting amount procedures. Results suggest both procedures produced similar rates of probability and delay discounting in six out of seven comparisons. A magnitude effect consistent with previous research was observed for probabilistic gains and losses, but not for delayed losses. Results also suggest that delay and probability interact to determine the value of money. Five-trial methods may allow researchers to assess discounting more efficiently as well as study more complex choice scenarios. Copyright © 2016 Elsevier B.V. All rights reserved.
Fan, Ming; Thongsri, Tepwitoon; Axe, Lisa; Tyson, Trevor A
2005-06-01
A probabilistic approach was applied in an ecological risk assessment (ERA) to characterize risk and address uncertainty employing Monte Carlo simulations for assessing parameter and risk probabilistic distributions. This simulation tool (ERA) includes a Window's based interface, an interactive and modifiable database management system (DBMS) that addresses a food web at trophic levels, and a comprehensive evaluation of exposure pathways. To illustrate this model, ecological risks from depleted uranium (DU) exposure at the US Army Yuma Proving Ground (YPG) and Aberdeen Proving Ground (APG) were assessed and characterized. Probabilistic distributions showed that at YPG, a reduction in plant root weight is considered likely to occur (98% likelihood) from exposure to DU; for most terrestrial animals, likelihood for adverse reproduction effects ranges from 0.1% to 44%. However, for the lesser long-nosed bat, the effects are expected to occur (>99% likelihood) through the reduction in size and weight of offspring. Based on available DU data for the firing range at APG, DU uptake will not likely affect survival of aquatic plants and animals (<0.1% likelihood). Based on field and laboratory studies conducted at APG and YPG on pocket mice, kangaroo rat, white-throated woodrat, deer, and milfoil, body burden concentrations observed fall into the distributions simulated at both sites.
Oh-Descher, Hanna; Beck, Jeffrey M; Ferrari, Silvia; Sommer, Marc A; Egner, Tobias
2017-11-15
Real-life decision-making often involves combining multiple probabilistic sources of information under finite time and cognitive resources. To mitigate these pressures, people "satisfice", foregoing a full evaluation of all available evidence to focus on a subset of cues that allow for fast and "good-enough" decisions. Although this form of decision-making likely mediates many of our everyday choices, very little is known about the way in which the neural encoding of cue information changes when we satisfice under time pressure. Here, we combined human functional magnetic resonance imaging (fMRI) with a probabilistic classification task to characterize neural substrates of multi-cue decision-making under low (1500 ms) and high (500 ms) time pressure. Using variational Bayesian inference, we analyzed participants' choices to track and quantify cue usage under each experimental condition, which was then applied to model the fMRI data. Under low time pressure, participants performed near-optimally, appropriately integrating all available cues to guide choices. Both cortical (prefrontal and parietal cortex) and subcortical (hippocampal and striatal) regions encoded individual cue weights, and activity linearly tracked trial-by-trial variations in the amount of evidence and decision uncertainty. Under increased time pressure, participants adaptively shifted to using a satisficing strategy by discounting the least informative cue in their decision process. This strategic change in decision-making was associated with an increased involvement of the dopaminergic midbrain, striatum, thalamus, and cerebellum in representing and integrating cue values. We conclude that satisficing the probabilistic inference process under time pressure leads to a cortical-to-subcortical shift in the neural drivers of decisions. Copyright © 2017 Elsevier Inc. All rights reserved.
Confronting uncertainty in flood damage predictions
NASA Astrophysics Data System (ADS)
Schröter, Kai; Kreibich, Heidi; Vogel, Kristin; Merz, Bruno
2015-04-01
Reliable flood damage models are a prerequisite for the practical usefulness of the model results. Oftentimes, traditional uni-variate damage models as for instance depth-damage curves fail to reproduce the variability of observed flood damage. Innovative multi-variate probabilistic modelling approaches are promising to capture and quantify the uncertainty involved and thus to improve the basis for decision making. In this study we compare the predictive capability of two probabilistic modelling approaches, namely Bagging Decision Trees and Bayesian Networks. For model evaluation we use empirical damage data which are available from computer aided telephone interviews that were respectively compiled after the floods in 2002, 2005 and 2006, in the Elbe and Danube catchments in Germany. We carry out a split sample test by sub-setting the damage records. One sub-set is used to derive the models and the remaining records are used to evaluate the predictive performance of the model. Further we stratify the sample according to catchments which allows studying model performance in a spatial transfer context. Flood damage estimation is carried out on the scale of the individual buildings in terms of relative damage. The predictive performance of the models is assessed in terms of systematic deviations (mean bias), precision (mean absolute error) as well as in terms of reliability which is represented by the proportion of the number of observations that fall within the 95-quantile and 5-quantile predictive interval. The reliability of the probabilistic predictions within validation runs decreases only slightly and achieves a very good coverage of observations within the predictive interval. Probabilistic models provide quantitative information about prediction uncertainty which is crucial to assess the reliability of model predictions and improves the usefulness of model results.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Jie; Draxl, Caroline; Hopson, Thomas
Numerical weather prediction (NWP) models have been widely used for wind resource assessment. Model runs with higher spatial resolution are generally more accurate, yet extremely computational expensive. An alternative approach is to use data generated by a low resolution NWP model, in conjunction with statistical methods. In order to analyze the accuracy and computational efficiency of different types of NWP-based wind resource assessment methods, this paper performs a comparison of three deterministic and probabilistic NWP-based wind resource assessment methodologies: (i) a coarse resolution (0.5 degrees x 0.67 degrees) global reanalysis data set, the Modern-Era Retrospective Analysis for Research and Applicationsmore » (MERRA); (ii) an analog ensemble methodology based on the MERRA, which provides both deterministic and probabilistic predictions; and (iii) a fine resolution (2-km) NWP data set, the Wind Integration National Dataset (WIND) Toolkit, based on the Weather Research and Forecasting model. Results show that: (i) as expected, the analog ensemble and WIND Toolkit perform significantly better than MERRA confirming their ability to downscale coarse estimates; (ii) the analog ensemble provides the best estimate of the multi-year wind distribution at seven of the nine sites, while the WIND Toolkit is the best at one site; (iii) the WIND Toolkit is more accurate in estimating the distribution of hourly wind speed differences, which characterizes the wind variability, at five of the available sites, with the analog ensemble being best at the remaining four locations; and (iv) the analog ensemble computational cost is negligible, whereas the WIND Toolkit requires large computational resources. Future efforts could focus on the combination of the analog ensemble with intermediate resolution (e.g., 10-15 km) NWP estimates, to considerably reduce the computational burden, while providing accurate deterministic estimates and reliable probabilistic assessments.« less
Evaluation of feature-based 3-d registration of probabilistic volumetric scenes
NASA Astrophysics Data System (ADS)
Restrepo, Maria I.; Ulusoy, Ali O.; Mundy, Joseph L.
2014-12-01
Automatic estimation of the world surfaces from aerial images has seen much attention and progress in recent years. Among current modeling technologies, probabilistic volumetric models (PVMs) have evolved as an alternative representation that can learn geometry and appearance in a dense and probabilistic manner. Recent progress, in terms of storage and speed, achieved in the area of volumetric modeling, opens the opportunity to develop new frameworks that make use of the PVM to pursue the ultimate goal of creating an entire map of the earth, where one can reason about the semantics and dynamics of the 3-d world. Aligning 3-d models collected at different time-instances constitutes an important step for successful fusion of large spatio-temporal information. This paper evaluates how effectively probabilistic volumetric models can be aligned using robust feature-matching techniques, while considering different scenarios that reflect the kind of variability observed across aerial video collections from different time instances. More precisely, this work investigates variability in terms of discretization, resolution and sampling density, errors in the camera orientation, and changes in illumination and geographic characteristics. All results are given for large-scale, outdoor sites. In order to facilitate the comparison of the registration performance of PVMs to that of other 3-d reconstruction techniques, the registration pipeline is also carried out using Patch-based Multi-View Stereo (PMVS) algorithm. Registration performance is similar for scenes that have favorable geometry and the appearance characteristics necessary for high quality reconstruction. In scenes containing trees, such as a park, or many buildings, such as a city center, registration performance is significantly more accurate when using the PVM.
Ethno-Pedagogical Factor of Polycultural Training
ERIC Educational Resources Information Center
Fahrutdinova, Guzaliya Zh.
2016-01-01
With the increased tension in human relations, in a burst of misunderstanding, ethnic conflicts, which have proliferated in a new socio-cultural environment, the study of processes of interaction in multi-ethnic educational environment and upbringing, the emerging national identity for centuries, actualizes the importance of contemporary problems…
NASA Technical Reports Server (NTRS)
Sobel, Larry; Buttitta, Claudio; Suarez, James
1993-01-01
Probabilistic predictions based on the Integrated Probabilistic Assessment of Composite Structures (IPACS) code are presented for the material and structural response of unnotched and notched, 1M6/3501-6 Gr/Ep laminates. Comparisons of predicted and measured modulus and strength distributions are given for unnotched unidirectional, cross-ply, and quasi-isotropic laminates. The predicted modulus distributions were found to correlate well with the test results for all three unnotched laminates. Correlations of strength distributions for the unnotched laminates are judged good for the unidirectional laminate and fair for the cross-ply laminate, whereas the strength correlation for the quasi-isotropic laminate is deficient because IPACS did not yet have a progressive failure capability. The paper also presents probabilistic and structural reliability analysis predictions for the strain concentration factor (SCF) for an open-hole, quasi-isotropic laminate subjected to longitudinal tension. A special procedure was developed to adapt IPACS for the structural reliability analysis. The reliability results show the importance of identifying the most significant random variables upon which the SCF depends, and of having accurate scatter values for these variables.
Shi, Yajuan; Wang, Ruoshi; Lu, Yonglong; Song, Shuai; Johnson, Andrew C; Sweetman, Andrew; Jones, Kevin
2016-09-01
Ecological risk assessment (ERA) has been widely applied in characterizing the risk of chemicals to organisms and ecosystems. The paucity of toxicity data on local biota living in the different compartments of an ecosystem and the absence of a suitable methodology for multi-compartment spatial risk assessment at the regional scale has held back this field. The major objective of this study was to develop a methodology to quantify and distinguish the spatial distribution of risk to ecosystems at a regional scale. A framework for regional multi-compartment probabilistic ecological risk assessment (RMPERA) was constructed and corroborated using a bioassay of a local species. The risks from cadmium (Cd) pollution in river water, river sediment, coastal water, coastal surface sediment and soil in northern Bohai Rim were examined. The results indicated that the local organisms in soil, river, coastal water, and coastal sediment were affected by Cd. The greatest impacts from Cd were identified in the Tianjin and Huludao areas. The overall multi-compartment risk was 31.4% in the region. The methodology provides a new approach for regional multi-compartment ecological risk assessment. Copyright © 2016 Elsevier Ltd. All rights reserved.
Modelling and Characterisation of Detection Models in WAMI for Handling Negative Information
2014-02-01
behaviour of the multi-stage detectors used in LoFT. This model is then used in a Probabilistic Hypothesis Density Filter (PHD). Unlike most multitarget...Therefore, we decided to use machine learning techniques which could model — and pre- dict — the behaviour of the detectors in LoFT. Because we are using...on feature detectors [8], motion models [13] and descriptor and template adaptation [9]. 2.3.2 State Model The state space of LoFT is defined in 2D
Hazard Interactions and Interaction Networks (Cascades) within Multi-Hazard Methodologies
NASA Astrophysics Data System (ADS)
Gill, Joel; Malamud, Bruce D.
2016-04-01
Here we combine research and commentary to reinforce the importance of integrating hazard interactions and interaction networks (cascades) into multi-hazard methodologies. We present a synthesis of the differences between 'multi-layer single hazard' approaches and 'multi-hazard' approaches that integrate such interactions. This synthesis suggests that ignoring interactions could distort management priorities, increase vulnerability to other spatially relevant hazards or underestimate disaster risk. We proceed to present an enhanced multi-hazard framework, through the following steps: (i) describe and define three groups (natural hazards, anthropogenic processes and technological hazards/disasters) as relevant components of a multi-hazard environment; (ii) outline three types of interaction relationship (triggering, increased probability, and catalysis/impedance); and (iii) assess the importance of networks of interactions (cascades) through case-study examples (based on literature, field observations and semi-structured interviews). We further propose visualisation frameworks to represent these networks of interactions. Our approach reinforces the importance of integrating interactions between natural hazards, anthropogenic processes and technological hazards/disasters into enhanced multi-hazard methodologies. Multi-hazard approaches support the holistic assessment of hazard potential, and consequently disaster risk. We conclude by describing three ways by which understanding networks of interactions contributes to the theoretical and practical understanding of hazards, disaster risk reduction and Earth system management. Understanding interactions and interaction networks helps us to better (i) model the observed reality of disaster events, (ii) constrain potential changes in physical and social vulnerability between successive hazards, and (iii) prioritise resource allocation for mitigation and disaster risk reduction.
Statistical Analysis of Stress Signals from Bridge Monitoring by FBG System
Ye, Xiao-Wei; Xi, Pei-Sen
2018-01-01
In this paper, a fiber Bragg grating (FBG)-based stress monitoring system instrumented on an orthotropic steel deck arch bridge is demonstrated. The FBG sensors are installed at two types of critical fatigue-prone welded joints to measure the strain and temperature signals. A total of 64 FBG sensors are deployed around the rib-to-deck and rib-to-diagram areas at the mid-span and quarter-span of the investigated orthotropic steel bridge. The local stress behaviors caused by the highway loading and temperature effect during the construction and operation periods are presented with the aid of a wavelet multi-resolution analysis approach. In addition, the multi-modal characteristic of the rainflow counted stress spectrum is modeled by the method of finite mixture distribution together with a genetic algorithm (GA)-based parameter estimation approach. The optimal probability distribution of the stress spectrum is determined by use of Bayesian information criterion (BIC). Furthermore, the hot spot stress of the welded joint is calculated by an extrapolation method recommended in the specification of International Institute of Welding (IIW). The stochastic characteristic of stress concentration factor (SCF) of the concerned welded joint is addressed. The proposed FBG-based stress monitoring system and probabilistic stress evaluation methods can provide an effective tool for structural monitoring and condition assessment of orthotropic steel bridges. PMID:29414850
Schürmann, Tim; Beckerle, Philipp; Preller, Julia; Vogt, Joachim; Christ, Oliver
2016-12-19
In product development for lower limb prosthetic devices, a set of special criteria needs to be met. Prosthetic devices have a direct impact on the rehabilitation process after an amputation with both perceived technological and psychological aspects playing an important role. However, available psychometric questionnaires fail to consider the important links between these two dimensions. In this article a probabilistic latent trait model is proposed with seven technical and psychological factors which measure satisfaction with the prosthesis. The results of a first study are used to determine the basic parameters of the statistical model. These distributions represent hypotheses about factor loadings between manifest items and latent factors of the proposed psychometric questionnaire. A study was conducted and analyzed to form hypotheses for the prior distributions of the questionnaire's measurement model. An expert agreement study conducted on 22 experts was used to determine the prior distribution of item-factor loadings in the model. Model parameters that had to be specified as part of the measurement model were informed prior distributions on the item-factor loadings. For the current 70 items in the questionnaire, each factor loading was set to represent the certainty with which experts had assigned the items to their respective factors. Considering only the measurement model and not the structural model of the questionnaire, 70 out of 217 informed prior distributions on parameters were set. The use of preliminary studies to set prior distributions in latent trait models, while being a relatively new approach in psychological research, provides helpful information towards the design of a seven factor questionnaire that means to identify relations between technical and psychological factors in prosthetic product design and rehabilitation medicine.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Dengwang; Liu, Li; Chen, Jinhu
2014-06-01
Purpose: The aiming of this study was to extract liver structures for daily Cone beam CT (CBCT) images automatically. Methods: Datasets were collected from 50 intravenous contrast planning CT images, which were regarded as training dataset for probabilistic atlas and shape prior model construction. Firstly, probabilistic atlas and shape prior model based on sparse shape composition (SSC) were constructed by iterative deformable registration. Secondly, the artifacts and noise were removed from the daily CBCT image by an edge-preserving filtering using total variation with L1 norm (TV-L1). Furthermore, the initial liver region was obtained by registering the incoming CBCT image withmore » the atlas utilizing edge-preserving deformable registration with multi-scale strategy, and then the initial liver region was converted to surface meshing which was registered with the shape model where the major variation of specific patient was modeled by sparse vectors. At the last stage, the shape and intensity information were incorporated into joint probabilistic model, and finally the liver structure was extracted by maximum a posteriori segmentation.Regarding the construction process, firstly the manually segmented contours were converted into meshes, and then arbitrary patient data was chosen as reference image to register with the rest of training datasets by deformable registration algorithm for constructing probabilistic atlas and prior shape model. To improve the efficiency of proposed method, the initial probabilistic atlas was used as reference image to register with other patient data for iterative construction for removing bias caused by arbitrary selection. Results: The experiment validated the accuracy of the segmentation results quantitatively by comparing with the manually ones. The volumetric overlap percentage between the automatically generated liver contours and the ground truth were on an average 88%–95% for CBCT images. Conclusion: The experiment demonstrated that liver structures of CBCT with artifacts can be extracted accurately for following adaptive radiation therapy. This work is supported by National Natural Science Foundation of China (No. 61201441), Research Fund for Excellent Young and Middle-aged Scientists of Shandong Province (No. BS2012DX038), Project of Shandong Province Higher Educational Science and Technology Program (No. J12LN23), Jinan youth science and technology star (No.20120109)« less
Mean Field Approach to the Giant Wormhole Problem
NASA Astrophysics Data System (ADS)
Gamba, A.; Kolokolov, I.; Martellini, M.
We introduce a gaussian probability density for the space-time distribution of worm-holes, thus taking effectively into account wormhole interaction. Using a mean-field approximation for the free energy, we show that giant wormholes are probabilistically suppressed in a homogenous isotropic “large” universe.
Probability or Reasoning: Current Thinking and Realistic Strategies for Improved Medical Decisions
2017-01-01
A prescriptive model approach in decision making could help achieve better diagnostic accuracy in clinical practice through methods that are less reliant on probabilistic assessments. Various prescriptive measures aimed at regulating factors that influence heuristics and clinical reasoning could support clinical decision-making process. Clinicians could avoid time-consuming decision-making methods that require probabilistic calculations. Intuitively, they could rely on heuristics to obtain an accurate diagnosis in a given clinical setting. An extensive literature review of cognitive psychology and medical decision-making theory was performed to illustrate how heuristics could be effectively utilized in daily practice. Since physicians often rely on heuristics in realistic situations, probabilistic estimation might not be a useful tool in everyday clinical practice. Improvements in the descriptive model of decision making (heuristics) may allow for greater diagnostic accuracy. PMID:29209469
Probabilistic Analysis of Gas Turbine Field Performance
NASA Technical Reports Server (NTRS)
Gorla, Rama S. R.; Pai, Shantaram S.; Rusick, Jeffrey J.
2002-01-01
A gas turbine thermodynamic cycle was computationally simulated and probabilistically evaluated in view of the several uncertainties in the performance parameters, which are indices of gas turbine health. Cumulative distribution functions and sensitivity factors were computed for the overall thermal efficiency and net specific power output due to the thermodynamic random variables. These results can be used to quickly identify the most critical design variables in order to optimize the design, enhance performance, increase system availability and make it cost effective. The analysis leads to the selection of the appropriate measurements to be used in the gas turbine health determination and to the identification of both the most critical measurements and parameters. Probabilistic analysis aims at unifying and improving the control and health monitoring of gas turbine aero-engines by increasing the quality and quantity of information available about the engine's health and performance.
NASA Technical Reports Server (NTRS)
Patel, Bhogila M.; Hoge, Peter A.; Nagpal, Vinod K.; Hojnicki, Jeffrey S.; Rusick, Jeffrey J.
2004-01-01
This paper describes the methods employed to apply probabilistic modeling techniques to the International Space Station (ISS) power system. These techniques were used to quantify the probabilistic variation in the power output, also called the response variable, due to variations (uncertainties) associated with knowledge of the influencing factors called the random variables. These uncertainties can be due to unknown environmental conditions, variation in the performance of electrical power system components or sensor tolerances. Uncertainties in these variables, cause corresponding variations in the power output, but the magnitude of that effect varies with the ISS operating conditions, e.g. whether or not the solar panels are actively tracking the sun. Therefore, it is important to quantify the influence of these uncertainties on the power output for optimizing the power available for experiments.
Probability or Reasoning: Current Thinking and Realistic Strategies for Improved Medical Decisions.
Nantha, Yogarabindranath Swarna
2017-11-01
A prescriptive model approach in decision making could help achieve better diagnostic accuracy in clinical practice through methods that are less reliant on probabilistic assessments. Various prescriptive measures aimed at regulating factors that influence heuristics and clinical reasoning could support clinical decision-making process. Clinicians could avoid time-consuming decision-making methods that require probabilistic calculations. Intuitively, they could rely on heuristics to obtain an accurate diagnosis in a given clinical setting. An extensive literature review of cognitive psychology and medical decision-making theory was performed to illustrate how heuristics could be effectively utilized in daily practice. Since physicians often rely on heuristics in realistic situations, probabilistic estimation might not be a useful tool in everyday clinical practice. Improvements in the descriptive model of decision making (heuristics) may allow for greater diagnostic accuracy.
Characterization of essential proteins based on network topology in proteins interaction networks
NASA Astrophysics Data System (ADS)
Bakar, Sakhinah Abu; Taheri, Javid; Zomaya, Albert Y.
2014-06-01
The identification of essential proteins is theoretically and practically important as (1) it is essential to understand the minimal surviving requirements for cellular lives, and (2) it provides fundamental for development of drug. As conducting experimental studies to identify essential proteins are both time and resource consuming, here we present a computational approach in predicting them based on network topology properties from protein-protein interaction networks of Saccharomyces cerevisiae. The proposed method, namely EP3NN (Essential Proteins Prediction using Probabilistic Neural Network) employed a machine learning algorithm called Probabilistic Neural Network as a classifier to identify essential proteins of the organism of interest; it uses degree centrality, closeness centrality, local assortativity and local clustering coefficient of each protein in the network for such predictions. Results show that EP3NN managed to successfully predict essential proteins with an accuracy of 95% for our studied organism. Results also show that most of the essential proteins are close to other proteins, have assortativity behavior and form clusters/sub-graph in the network.
Imaging and machine learning techniques for diagnosis of Alzheimer's disease.
Mirzaei, Golrokh; Adeli, Anahita; Adeli, Hojjat
2016-12-01
Alzheimer's disease (AD) is a common health problem in elderly people. There has been considerable research toward the diagnosis and early detection of this disease in the past decade. The sensitivity of biomarkers and the accuracy of the detection techniques have been defined to be the key to an accurate diagnosis. This paper presents a state-of-the-art review of the research performed on the diagnosis of AD based on imaging and machine learning techniques. Different segmentation and machine learning techniques used for the diagnosis of AD are reviewed including thresholding, supervised and unsupervised learning, probabilistic techniques, Atlas-based approaches, and fusion of different image modalities. More recent and powerful classification techniques such as the enhanced probabilistic neural network of Ahmadlou and Adeli should be investigated with the goal of improving the diagnosis accuracy. A combination of different image modalities can help improve the diagnosis accuracy rate. Research is needed on the combination of modalities to discover multi-modal biomarkers.
An operational mesoscale ensemble data assimilation and prediction system: E-RTFDDA
NASA Astrophysics Data System (ADS)
Liu, Y.; Hopson, T.; Roux, G.; Hacker, J.; Xu, M.; Warner, T.; Swerdlin, S.
2009-04-01
Mesoscale (2-2000 km) meteorological processes differ from synoptic circulations in that mesoscale weather changes rapidly in space and time, and physics processes that are parameterized in NWP models play a great role. Complex interactions of synoptic circulations, regional and local terrain, land-surface heterogeneity, and associated physical properties, and the physical processes of radiative transfer, cloud and precipitation and boundary layer mixing, are crucial in shaping regional weather and climate. Mesoscale ensemble analysis and prediction should sample the uncertainties of mesoscale modeling systems in representing these factors. An innovative mesoscale Ensemble Real-Time Four Dimensional Data Assimilation (E-RTFDDA) and forecasting system has been developed at NCAR. E-RTFDDA contains diverse ensemble perturbation approaches that consider uncertainties in all major system components to produce multi-scale continuously-cycling probabilistic data assimilation and forecasting. A 30-member E-RTFDDA system with three nested domains with grid sizes of 30, 10 and 3.33 km has been running on a Department of Defense high-performance computing platform since September 2007. It has been applied at two very different US geographical locations; one in the western inter-mountain area and the other in the northeastern states, producing 6 hour analyses and 48 hour forecasts, with 4 forecast cycles a day. The operational model outputs are analyzed to a) assess overall ensemble performance and properties, b) study terrain effect on mesoscale predictability, c) quantify the contribution of different ensemble perturbation approaches to the overall forecast skill, and d) assess the additional contributed skill from an ensemble calibration process based on a quantile-regression algorithm. The system and the results will be reported at the meeting.
An ensemble model of competitive multi-factor binding of the genome
Wasson, Todd; Hartemink, Alexander J.
2009-01-01
Hundreds of different factors adorn the eukaryotic genome, binding to it in large number. These DNA binding factors (DBFs) include nucleosomes, transcription factors (TFs), and other proteins and protein complexes, such as the origin recognition complex (ORC). DBFs compete with one another for binding along the genome, yet many current models of genome binding do not consider different types of DBFs together simultaneously. Additionally, binding is a stochastic process that results in a continuum of binding probabilities at any position along the genome, but many current models tend to consider positions as being either binding sites or not. Here, we present a model that allows a multitude of DBFs, each at different concentrations, to compete with one another for binding sites along the genome. The result is an “occupancy profile,” a probabilistic description of the DNA occupancy of each factor at each position. We implement our model efficiently as the software package COMPETE. We demonstrate genome-wide and at specific loci how modeling nucleosome binding alters TF binding, and vice versa, and illustrate how factor concentration influences binding occupancy. Binding cooperativity between nearby TFs arises implicitly via mutual competition with nucleosomes. Our method applies not only to TFs, but also recapitulates known occupancy profiles of a well-studied replication origin with and without ORC binding. Importantly, the sequence preferences our model takes as input are derived from in vitro experiments. This ensures that the calculated occupancy profiles are the result of the forces of competition represented explicitly in our model and the inherent sequence affinities of the constituent DBFs. PMID:19720867
Mortality and Population Dynamics of Bemisia tabaci within a Multi-Crop System
USDA-ARS?s Scientific Manuscript database
The population dynamics of mobile polyphagous pests is governed by a complex set of interacting factors that involve multiple host-plants, seasonality, movement and demography. Bemisia tabaci is a multivoltine insect with no diapause that maintains population continuity by moving from one host to a...
Stupid Tutoring Systems, Intelligent Humans
ERIC Educational Resources Information Center
Baker, Ryan S.
2016-01-01
The initial vision for intelligent tutoring systems involved powerful, multi-faceted systems that would leverage rich models of students and pedagogies to create complex learning interactions. But the intelligent tutoring systems used at scale today are much simpler. In this article, I present hypotheses on the factors underlying this development,…
Neupane, S; Virtanen, P; Leino-Arjas, P; Miranda, H; Siukola, A; Nygård, C-H
2013-03-01
We investigated the separate and joint effects of multi-site musculoskeletal pain and physical and psychosocial exposures at work on future work ability. A survey was conducted among employees of a Finnish food industry company in 2005 (n = 1201) and a follow-up survey in 2009 (n = 734). Information on self-assessed work ability (current work ability on a scale from 0 to 10; 7 = poor work ability), multi-site musculoskeletal pain (pain in at least two anatomical areas of four), leisure-time physical activity, body mass index and physical and psychosocial exposures was obtained by questionnaire. The separate and joint effects of multi-site pain and work exposures on work ability at follow-up, among subjects with good work ability at baseline, were assessed by logistic regression, and p-values for the interaction derived. Compared with subjects with neither multi-site pain nor adverse work exposure, multi-site pain at baseline increased the risk of poor work ability at follow-up, allowing for age, gender, occupational class, body mass index and leisure-time physical activity. The separate effects of the work exposures on work ability were somewhat smaller than those of multi-site pain. Multi-site pain had an interactive effect with work environment and awkward postures, such that no association of multi-site pain with poor work ability was seen when work environment was poor or awkward postures present. The decline in work ability connected with multi-site pain was not increased by exposure to adverse physical or psychosocial factors at work. © 2012 European Federation of International Association for the Study of Pain Chapters.
Long, Chengjiang; Hua, Gang; Kapoor, Ashish
2015-01-01
We present a noise resilient probabilistic model for active learning of a Gaussian process classifier from crowds, i.e., a set of noisy labelers. It explicitly models both the overall label noise and the expertise level of each individual labeler with two levels of flip models. Expectation propagation is adopted for efficient approximate Bayesian inference of our probabilistic model for classification, based on which, a generalized EM algorithm is derived to estimate both the global label noise and the expertise of each individual labeler. The probabilistic nature of our model immediately allows the adoption of the prediction entropy for active selection of data samples to be labeled, and active selection of high quality labelers based on their estimated expertise to label the data. We apply the proposed model for four visual recognition tasks, i.e., object category recognition, multi-modal activity recognition, gender recognition, and fine-grained classification, on four datasets with real crowd-sourced labels from the Amazon Mechanical Turk. The experiments clearly demonstrate the efficacy of the proposed model. In addition, we extend the proposed model with the Predictive Active Set Selection Method to speed up the active learning system, whose efficacy is verified by conducting experiments on the first three datasets. The results show our extended model can not only preserve a higher accuracy, but also achieve a higher efficiency. PMID:26924892
New Aspects of Probabilistic Forecast Verification Using Information Theory
NASA Astrophysics Data System (ADS)
Tödter, Julian; Ahrens, Bodo
2013-04-01
This work deals with information-theoretical methods in probabilistic forecast verification, particularly concerning ensemble forecasts. Recent findings concerning the "Ignorance Score" are shortly reviewed, then a consistent generalization to continuous forecasts is motivated. For ensemble-generated forecasts, the presented measures can be calculated exactly. The Brier Score (BS) and its generalizations to the multi-categorical Ranked Probability Score (RPS) and to the Continuous Ranked Probability Score (CRPS) are prominent verification measures for probabilistic forecasts. Particularly, their decompositions into measures quantifying the reliability, resolution and uncertainty of the forecasts are attractive. Information theory sets up a natural framework for forecast verification. Recently, it has been shown that the BS is a second-order approximation of the information-based Ignorance Score (IGN), which also contains easily interpretable components and can also be generalized to a ranked version (RIGN). Here, the IGN, its generalizations and decompositions are systematically discussed in analogy to the variants of the BS. Additionally, a Continuous Ranked IGN (CRIGN) is introduced in analogy to the CRPS. The useful properties of the conceptually appealing CRIGN are illustrated, together with an algorithm to evaluate its components reliability, resolution, and uncertainty for ensemble-generated forecasts. This algorithm can also be used to calculate the decomposition of the more traditional CRPS exactly. The applicability of the "new" measures is demonstrated in a small evaluation study of ensemble-based precipitation forecasts.
Gluck, Mark A.; Shohamy, Daphna; Myers, Catherine
2002-01-01
Probabilistic category learning is often assumed to be an incrementally learned cognitive skill, dependent on nondeclarative memory systems. One paradigm in particular, the weather prediction task, has been used in over half a dozen neuropsychological and neuroimaging studies to date. Because of the growing interest in using this task and others like it as behavioral tools for studying the cognitive neuroscience of cognitive skill learning, it becomes especially important to understand how subjects solve this kind of task and whether all subjects learn it in the same way. We present here new experimental and theoretical analyses of the weather prediction task that indicate that there are at least three different strategies that describe how subjects learn this task. (1) An optimal multi-cue strategy, in which they respond to each pattern on the basis of associations of all four cues with each outcome; (2) a one-cue strategy, in which they respond on the basis of presence or absence of a single cue, disregarding all other cues; or (3) a singleton strategy, in which they learn only about the four patterns that have only one cue present and all others absent. This variability in how subjects approach this task may have important implications for interpreting how different brain regions are involved in probabilistic category learning. PMID:12464701
Pereira, Ana Santos; Dâmaso-Rodrigues, Maria Luísa; Amorim, Ana; Daam, Michiel A; Cerejeira, Maria José
2018-06-16
Studies addressing the predicted effects of pesticides in combination with abiotic and biotic factors on aquatic biota in ditches associated with typical Mediterranean agroecosystems are scarce. The current study aimed to evaluate the predicted effects of pesticides along with environmental factors and biota interactions on macroinvertebrate, zooplankton and phytoplankton community compositions in ditches adjacent to Portuguese maize and tomato crop areas. Data was analysed with the variance partitioning procedure based on redundancy analysis (RDA). The total variance in biological community composition was divided into the variance explained by the multi-substance potentially affected fraction [(msPAF) arthropods and primary producers], environmental factors (water chemistry parameters), biotic interactions, shared variance, and unexplained variance. The total explained variance reached 39.4% and the largest proportion of this explained variance was attributed to msPAF (23.7%). When each group (phytoplankton, zooplankton and macroinvertebrates) was analysed separately, biota interactions and environmental factors explained the largest proportion of variance. Results of this study indicate that besides the presence of pesticide mixtures, environmental factors and biotic interactions also considerably influence field freshwater communities. Subsequently, to increase our understanding of the risk of pesticide mixtures on ecosystem communities in edge-of-field water bodies, variations in environmental and biological factors should also be considered.
Influences of geological parameters to probabilistic assessment of slope stability of embankment
NASA Astrophysics Data System (ADS)
Nguyen, Qui T.; Le, Tuan D.; Konečný, Petr
2018-04-01
This article considers influences of geological parameters to slope stability of the embankment in probabilistic analysis using SLOPE/W computational system. Stability of a simple slope is evaluated with and without pore–water pressure on the basis of variation of soil properties. Normal distributions of unit weight, cohesion and internal friction angle are assumed. Monte Carlo simulation technique is employed to perform analysis of critical slip surface. Sensitivity analysis is performed to observe the variation of the geological parameters and their effects on safety factors of the slope stability.
A theoretical framework for negotiating the path of emergency management multi-agency coordination.
Curnin, Steven; Owen, Christine; Paton, Douglas; Brooks, Benjamin
2015-03-01
Multi-agency coordination represents a significant challenge in emergency management. The need for liaison officers working in strategic level emergency operations centres to play organizational boundary spanning roles within multi-agency coordination arrangements that are enacted in complex and dynamic emergency response scenarios creates significant research and practical challenges. The aim of the paper is to address a gap in the literature regarding the concept of multi-agency coordination from a human-environment interaction perspective. We present a theoretical framework for facilitating multi-agency coordination in emergency management that is grounded in human factors and ergonomics using the methodology of core-task analysis. As a result we believe the framework will enable liaison officers to cope more efficiently within the work domain. In addition, we provide suggestions for extending the theory of core-task analysis to an alternate high reliability environment. Copyright © 2014 Elsevier Ltd and The Ergonomics Society. All rights reserved.
Simulating future residential property losses from wildfire in Flathead County, Montana: Chapter 1
Prato, Tony; Paveglio, Travis B; Barnett, Yan; Silverstein, Robin; Hardy, Michael; Keane, Robert; Loehman, Rachel A.; Clark, Anthony; Fagre, Daniel B.; Venn, Tyron; Stockmann, Keith
2014-01-01
Wildfire damages to private residences in the United States and elsewhere have increased as a result of expansion of the wildland-urban interface (WUI) and other factors. Understanding this unwelcome trend requires analytical frameworks that simulate how various interacting social, economic, and biophysical factors influence those damages. A methodological framework is developed for simulating expected residential property losses from wildfire [E(RLW)], which is a probabilistic monetary measure of wildfire risk to residential properties in the WUI. E(RLW) is simulated for Flathead County, Montana for five, 10-year subperiods covering the period 2010-2059, under various assumptions about future climate change, economic growth, land use policy, and forest management. Results show statistically significant increases in the spatial extent of WUI properties, the number of residential structures at risk from wildfire, and E(RLW) over the 50-year evaluation period for both the county and smaller subareas (i.e., neighborhoods and parcels). The E(RLW) simulation framework presented here advances the field of wildfire risk assessment by providing a finer-scale tool that incorporates a set of dynamic, interacting processes. The framework can be applied using other scenarios for climate change, economic growth, land use policy, and forest management, and in other areas.
He nui na ala e hiki aku ai: Factors Influencing Phonetic Variation in the Hawaiian Word "keia"
ERIC Educational Resources Information Center
Drager, Katie; Comstock, Bethany Kaleialohapau'ole Chun; Kneubuhl, Hina Puamohala
2017-01-01
Apart from a handful of studies (e.g., Kinney 1956), linguists know little about what variation exists in Hawaiian and what factors constrain the variation. In this paper, we present an analysis of phonetic variation in the word "keia," meaning "this," examining the social, linguistic, and probabilistic factors that constrain…
Ma, Li; Brautbar, Ariel; Boerwinkle, Eric; Sing, Charles F.
2012-01-01
Total cholesterol, low-density lipoprotein cholesterol, triglyceride, and high-density lipoprotein cholesterol (HDL-C) levels are among the most important risk factors for coronary artery disease. We tested for gene–gene interactions affecting the level of these four lipids based on prior knowledge of established genome-wide association study (GWAS) hits, protein–protein interactions, and pathway information. Using genotype data from 9,713 European Americans from the Atherosclerosis Risk in Communities (ARIC) study, we identified an interaction between HMGCR and a locus near LIPC in their effect on HDL-C levels (Bonferroni corrected P c = 0.002). Using an adaptive locus-based validation procedure, we successfully validated this gene–gene interaction in the European American cohorts from the Framingham Heart Study (P c = 0.002) and the Multi-Ethnic Study of Atherosclerosis (MESA; P c = 0.006). The interaction between these two loci is also significant in the African American sample from ARIC (P c = 0.004) and in the Hispanic American sample from MESA (P c = 0.04). Both HMGCR and LIPC are involved in the metabolism of lipids, and genome-wide association studies have previously identified LIPC as associated with levels of HDL-C. However, the effect on HDL-C of the novel gene–gene interaction reported here is twice as pronounced as that predicted by the sum of the marginal effects of the two loci. In conclusion, based on a knowledge-driven analysis of epistasis, together with a new locus-based validation method, we successfully identified and validated an interaction affecting a complex trait in multi-ethnic populations. PMID:22654671
Astroinformatics in the Age of LSST: Analyzing the Summer 2012 Data Release
NASA Astrophysics Data System (ADS)
Borne, Kirk D.; De Lee, N. M.; Stassun, K.; Paegert, M.; Cargile, P.; Burger, D.; Bloom, J. S.; Richards, J.
2013-01-01
The Large Synoptic Survey Telescope (LSST) will image the visible southern sky every three nights. This multi-band, multi-epoch survey will produce a torrent of data, which traditional methods of object-by-object data analysis will not be able to accommodate. Thus the need for new astroinformatics tools to visualize, simulate, mine, and analyze this quantity of data. The Berkeley Center for Time-Domain Informatics (CTDI) is building the informatics infrastructure for generic light curve classification, including the innovation of new algorithms for feature generation and machine learning. The CTDI portal (http://dotastro.org) contains one of the largest collections of public light curves, with visualization and exploration tools. The group has also published the first calibrated probabilistic classification catalog of 50k variable stars along with a data exploration portal called http://bigmacc.info. Twice a year, the LSST collaboration releases simulated LSST data, in order to aid software development. This poster also showcases a suite of new tools from the Vanderbilt Initiative in Data-instensive Astrophysics (VIDA), designed to take advantage of these large data sets. VIDA's Filtergraph interactive web tool allows one to instantly create an interactive data portal for fast, real-time visualization of large data sets. Filtergraph enables quick selection of interesting objects by easily filtering on many different columns, 2-D and 3-D representations, and on-the-fly arithmetic calculations on the data. It also makes sharing the data and the tool with collaborators very easy. The EB/RRL Factory is a neural-network based variable star classifier, which is designed to quickly identify variable stars in a variety of classes from LSST light curve data (currently tuned to Eclipsing Binaries and RR Lyrae stars), and to provide likelihood-based orbital elements or stellar parameters as appropriate. Finally the LCsimulator software allows one to create simulated light curves of multiple types of variable stars based on an LSST cadence.
Gilad, Yoav; Pritchard, Jonathan K.; Stephens, Matthew
2015-01-01
Understanding global gene regulation depends critically on accurate annotation of regulatory elements that are functional in a given cell type. CENTIPEDE, a powerful, probabilistic framework for identifying transcription factor binding sites from tissue-specific DNase I cleavage patterns and genomic sequence content, leverages the hypersensitivity of factor-bound chromatin and the information in the DNase I spatial cleavage profile characteristic of each DNA binding protein to accurately infer functional factor binding sites. However, the model for the spatial profile in this framework fails to account for the substantial variation in the DNase I cleavage profiles across different binding sites. Neither does it account for variation in the profiles at the same binding site across multiple replicate DNase I experiments, which are increasingly available. In this work, we introduce new methods, based on multi-scale models for inhomogeneous Poisson processes, to account for such variation in DNase I cleavage patterns both within and across binding sites. These models account for the spatial structure in the heterogeneity in DNase I cleavage patterns for each factor. Using DNase-seq measurements assayed in a lymphoblastoid cell line, we demonstrate the improved performance of this model for several transcription factors by comparing against the Chip-seq peaks for those factors. Finally, we explore the effects of DNase I sequence bias on inference of factor binding using a simple extension to our framework that allows for a more flexible background model. The proposed model can also be easily applied to paired-end ATAC-seq and DNase-seq data. msCentipede, a Python implementation of our algorithm, is available at http://rajanil.github.io/msCentipede. PMID:26406244
Raj, Anil; Shim, Heejung; Gilad, Yoav; Pritchard, Jonathan K; Stephens, Matthew
2015-01-01
Understanding global gene regulation depends critically on accurate annotation of regulatory elements that are functional in a given cell type. CENTIPEDE, a powerful, probabilistic framework for identifying transcription factor binding sites from tissue-specific DNase I cleavage patterns and genomic sequence content, leverages the hypersensitivity of factor-bound chromatin and the information in the DNase I spatial cleavage profile characteristic of each DNA binding protein to accurately infer functional factor binding sites. However, the model for the spatial profile in this framework fails to account for the substantial variation in the DNase I cleavage profiles across different binding sites. Neither does it account for variation in the profiles at the same binding site across multiple replicate DNase I experiments, which are increasingly available. In this work, we introduce new methods, based on multi-scale models for inhomogeneous Poisson processes, to account for such variation in DNase I cleavage patterns both within and across binding sites. These models account for the spatial structure in the heterogeneity in DNase I cleavage patterns for each factor. Using DNase-seq measurements assayed in a lymphoblastoid cell line, we demonstrate the improved performance of this model for several transcription factors by comparing against the Chip-seq peaks for those factors. Finally, we explore the effects of DNase I sequence bias on inference of factor binding using a simple extension to our framework that allows for a more flexible background model. The proposed model can also be easily applied to paired-end ATAC-seq and DNase-seq data. msCentipede, a Python implementation of our algorithm, is available at http://rajanil.github.io/msCentipede.
Chauhan, Rinki; Ravi, Janani; Datta, Pratik; Chen, Tianlong; Schnappinger, Dirk; Bassler, Kevin E.; Balázsi, Gábor; Gennaro, Maria Laura
2016-01-01
Accessory sigma factors, which reprogram RNA polymerase to transcribe specific gene sets, activate bacterial adaptive responses to noxious environments. Here we reconstruct the complete sigma factor regulatory network of the human pathogen Mycobacterium tuberculosis by an integrated approach. The approach combines identification of direct regulatory interactions between M. tuberculosis sigma factors in an E. coli model system, validation of selected links in M. tuberculosis, and extensive literature review. The resulting network comprises 41 direct interactions among all 13 sigma factors. Analysis of network topology reveals (i) a three-tiered hierarchy initiating at master regulators, (ii) high connectivity and (iii) distinct communities containing multiple sigma factors. These topological features are likely associated with multi-layer signal processing and specialized stress responses involving multiple sigma factors. Moreover, the identification of overrepresented network motifs, such as autoregulation and coregulation of sigma and anti-sigma factor pairs, provides structural information that is relevant for studies of network dynamics. PMID:27029515
Hazard interactions and interaction networks (cascades) within multi-hazard methodologies
NASA Astrophysics Data System (ADS)
Gill, Joel C.; Malamud, Bruce D.
2016-08-01
This paper combines research and commentary to reinforce the importance of integrating hazard interactions and interaction networks (cascades) into multi-hazard methodologies. We present a synthesis of the differences between multi-layer single-hazard approaches and multi-hazard approaches that integrate such interactions. This synthesis suggests that ignoring interactions between important environmental and anthropogenic processes could distort management priorities, increase vulnerability to other spatially relevant hazards or underestimate disaster risk. In this paper we proceed to present an enhanced multi-hazard framework through the following steps: (i) description and definition of three groups (natural hazards, anthropogenic processes and technological hazards/disasters) as relevant components of a multi-hazard environment, (ii) outlining of three types of interaction relationship (triggering, increased probability, and catalysis/impedance), and (iii) assessment of the importance of networks of interactions (cascades) through case study examples (based on the literature, field observations and semi-structured interviews). We further propose two visualisation frameworks to represent these networks of interactions: hazard interaction matrices and hazard/process flow diagrams. Our approach reinforces the importance of integrating interactions between different aspects of the Earth system, together with human activity, into enhanced multi-hazard methodologies. Multi-hazard approaches support the holistic assessment of hazard potential and consequently disaster risk. We conclude by describing three ways by which understanding networks of interactions contributes to the theoretical and practical understanding of hazards, disaster risk reduction and Earth system management. Understanding interactions and interaction networks helps us to better (i) model the observed reality of disaster events, (ii) constrain potential changes in physical and social vulnerability between successive hazards, and (iii) prioritise resource allocation for mitigation and disaster risk reduction.
NASA Astrophysics Data System (ADS)
Garbin, Silvia; Alessi Celegon, Elisa; Fanton, Pietro; Botter, Gianluca
2017-04-01
The temporal variability of river flow regime is a key feature structuring and controlling fluvial ecological communities and ecosystem processes. In particular, streamflow variability induced by climate/landscape heterogeneities or other anthropogenic factors significantly affects the connectivity between streams with notable implication for river fragmentation. Hydrologic connectivity is a fundamental property that guarantees species persistence and ecosystem integrity in riverine systems. In riverine landscapes, most ecological transitions are flow-dependent and the structure of flow regimes may affect ecological functions of endemic biota (i.e., fish spawning or grazing of invertebrate species). Therefore, minimum flow thresholds must be guaranteed to support specific ecosystem services, like fish migration, aquatic biodiversity and habitat suitability. In this contribution, we present a probabilistic approach aiming at a spatially-explicit, quantitative assessment of hydrologic connectivity at the network-scale as derived from river flow variability. Dynamics of daily streamflows are estimated based on catchment-scale climatic and morphological features, integrating a stochastic, physically based approach that accounts for the stochasticity of rainfall with a water balance model and a geomorphic recession flow model. The non-exceedance probability of ecologically meaningful flow thresholds is used to evaluate the fragmentation of individual stream reaches, and the ensuing network-scale connectivity metrics. A multi-dimensional Poisson Process for the stochastic generation of rainfall is used to evaluate the impact of climate signature on reach-scale and catchment-scale connectivity. The analysis shows that streamflow patterns and network-scale connectivity are influenced by the topology of the river network and the spatial variability of climatic properties (rainfall, evapotranspiration). The framework offers a robust basis for the prediction of the impact of land-use/land-cover changes and river regulation on network-scale connectivity.
MC EMiNEM maps the interaction landscape of the Mediator.
Niederberger, Theresa; Etzold, Stefanie; Lidschreiber, Michael; Maier, Kerstin C; Martin, Dietmar E; Fröhlich, Holger; Cramer, Patrick; Tresch, Achim
2012-01-01
The Mediator is a highly conserved, large multiprotein complex that is involved essentially in the regulation of eukaryotic mRNA transcription. It acts as a general transcription factor by integrating regulatory signals from gene-specific activators or repressors to the RNA Polymerase II. The internal network of interactions between Mediator subunits that conveys these signals is largely unknown. Here, we introduce MC EMiNEM, a novel method for the retrieval of functional dependencies between proteins that have pleiotropic effects on mRNA transcription. MC EMiNEM is based on Nested Effects Models (NEMs), a class of probabilistic graphical models that extends the idea of hierarchical clustering. It combines mode-hopping Monte Carlo (MC) sampling with an Expectation-Maximization (EM) algorithm for NEMs to increase sensitivity compared to existing methods. A meta-analysis of four Mediator perturbation studies in Saccharomyces cerevisiae, three of which are unpublished, provides new insight into the Mediator signaling network. In addition to the known modular organization of the Mediator subunits, MC EMiNEM reveals a hierarchical ordering of its internal information flow, which is putatively transmitted through structural changes within the complex. We identify the N-terminus of Med7 as a peripheral entity, entailing only local structural changes upon perturbation, while the C-terminus of Med7 and Med19 appear to play a central role. MC EMiNEM associates Mediator subunits to most directly affected genes, which, in conjunction with gene set enrichment analysis, allows us to construct an interaction map of Mediator subunits and transcription factors.
Broekhuizen, Henk; IJzerman, Maarten J; Hauber, A Brett; Groothuis-Oudshoorn, Catharina G M
2017-03-01
The need for patient engagement has been recognized by regulatory agencies, but there is no consensus about how to operationalize this. One approach is the formal elicitation and use of patient preferences for weighing clinical outcomes. The aim of this study was to demonstrate how patient preferences can be used to weigh clinical outcomes when both preferences and clinical outcomes are uncertain by applying a probabilistic value-based multi-criteria decision analysis (MCDA) method. Probability distributions were used to model random variation and parameter uncertainty in preferences, and parameter uncertainty in clinical outcomes. The posterior value distributions and rank probabilities for each treatment were obtained using Monte-Carlo simulations. The probability of achieving the first rank is the probability that a treatment represents the highest value to patients. We illustrated our methodology for a simplified case on six HIV treatments. Preferences were modeled with normal distributions and clinical outcomes were modeled with beta distributions. The treatment value distributions showed the rank order of treatments according to patients and illustrate the remaining decision uncertainty. This study demonstrated how patient preference data can be used to weigh clinical evidence using MCDA. The model takes into account uncertainty in preferences and clinical outcomes. The model can support decision makers during the aggregation step of the MCDA process and provides a first step toward preference-based personalized medicine, yet requires further testing regarding its appropriate use in real-world settings.
Scalable Probabilistic Inference for Global Seismic Monitoring
NASA Astrophysics Data System (ADS)
Arora, N. S.; Dear, T.; Russell, S.
2011-12-01
We describe a probabilistic generative model for seismic events, their transmission through the earth, and their detection (or mis-detection) at seismic stations. We also describe an inference algorithm that constructs the most probable event bulletin explaining the observed set of detections. The model and inference are called NET-VISA (network processing vertically integrated seismic analysis) and is designed to replace the current automated network processing at the IDC, the SEL3 bulletin. Our results (attached table) demonstrate that NET-VISA significantly outperforms SEL3 by reducing the missed events from 30.3% down to 12.5%. The difference is even more dramatic for smaller magnitude events. NET-VISA has no difficulty in locating nuclear explosions as well. The attached figure demonstrates the location predicted by NET-VISA versus other bulletins for the second DPRK event. Further evaluation on dense regional networks demonstrates that NET-VISA finds many events missed in the LEB bulletin, which is produced by the human analysts. Large aftershock sequences, as produced by the 2004 December Sumatra earthquake and the 2011 March Tohoku earthquake, can pose a significant load for automated processing, often delaying the IDC bulletins by weeks or months. Indeed these sequences can overload the serial NET-VISA inference as well. We describe an enhancement to NET-VISA to make it multi-threaded, and hence take full advantage of the processing power of multi-core and -cpu machines. Our experiments show that the new inference algorithm is able to achieve 80% efficiency in parallel speedup.
Linguraru, Marius George; Pura, John A; Chowdhury, Ananda S; Summers, Ronald M
2010-01-01
The interpretation of medical images benefits from anatomical and physiological priors to optimize computer-aided diagnosis (CAD) applications. Diagnosis also relies on the comprehensive analysis of multiple organs and quantitative measures of soft tissue. An automated method optimized for medical image data is presented for the simultaneous segmentation of four abdominal organs from 4D CT data using graph cuts. Contrast-enhanced CT scans were obtained at two phases: non-contrast and portal venous. Intra-patient data were spatially normalized by non-linear registration. Then 4D erosion using population historic information of contrast-enhanced liver, spleen, and kidneys was applied to multi-phase data to initialize the 4D graph and adapt to patient specific data. CT enhancement information and constraints on shape, from Parzen windows, and location, from a probabilistic atlas, were input into a new formulation of a 4D graph. Comparative results demonstrate the effects of appearance and enhancement, and shape and location on organ segmentation.
Brief analysis of Jiangsu grid security and stability based on multi-infeed DC index in power system
NASA Astrophysics Data System (ADS)
Zhang, Wenjia; Wang, Quanquan; Ge, Yi; Huang, Junhui; Chen, Zhengfang
2018-02-01
The impact of Multi-infeed HVDC has gradually increased to security and stability operating in Jiangsu power grid. In this paper, an appraisal method of Multi-infeed HVDC power grid security and stability is raised with Multi-Infeed Effective Short Circuit Ratio, Multi-Infeed Interaction Factor and Commutation Failure Immunity Index. These indices are adopted in security and stability simulating calculation of Jiangsu Multi-infeed HVDC system. The simulation results indicate that Jiangsu power grid is operating with a strong DC system. It has high level of power grid security and stability, and meet the safety running requirements. Jinpin-Suzhou DC system is located in the receiving end with huge capacity, which is easily leading to commutation failure of the transmission line. In order to resolve this problem, dynamic reactive power compensation can be applied in power grid near Jinpin-Suzhou DC system. Simulation result shows this method is feasible to commutation failure.
Probabilistic Aeroelastic Analysis Developed for Turbomachinery Components
NASA Technical Reports Server (NTRS)
Reddy, T. S. R.; Mital, Subodh K.; Stefko, George L.; Pai, Shantaram S.
2003-01-01
Aeroelastic analyses for advanced turbomachines are being developed for use at the NASA Glenn Research Center and industry. However, these analyses at present are used for turbomachinery design with uncertainties accounted for by using safety factors. This approach may lead to overly conservative designs, thereby reducing the potential of designing higher efficiency engines. An integration of the deterministic aeroelastic analysis methods with probabilistic analysis methods offers the potential to design efficient engines with fewer aeroelastic problems and to make a quantum leap toward designing safe reliable engines. In this research, probabilistic analysis is integrated with aeroelastic analysis: (1) to determine the parameters that most affect the aeroelastic characteristics (forced response and stability) of a turbomachine component such as a fan, compressor, or turbine and (2) to give the acceptable standard deviation on the design parameters for an aeroelastically stable system. The approach taken is to combine the aeroelastic analysis of the MISER (MIStuned Engine Response) code with the FPI (fast probability integration) code. The role of MISER is to provide the functional relationships that tie the structural and aerodynamic parameters (the primitive variables) to the forced response amplitudes and stability eigenvalues (the response properties). The role of FPI is to perform probabilistic analyses by utilizing the response properties generated by MISER. The results are a probability density function for the response properties. The probabilistic sensitivities of the response variables to uncertainty in primitive variables are obtained as a byproduct of the FPI technique. The combined analysis of aeroelastic and probabilistic analysis is applied to a 12-bladed cascade vibrating in bending and torsion. Out of the total 11 design parameters, 6 are considered as having probabilistic variation. The six parameters are space-to-chord ratio (SBYC), stagger angle (GAMA), elastic axis (ELAXS), Mach number (MACH), mass ratio (MASSR), and frequency ratio (WHWB). The cascade is considered to be in subsonic flow with Mach 0.7. The results of the probabilistic aeroelastic analysis are the probability density function of predicted aerodynamic damping and frequency for flutter and the response amplitudes for forced response.
On the Accuracy of Probabilistic Bucking Load Prediction
NASA Technical Reports Server (NTRS)
Arbocz, Johann; Starnes, James H.; Nemeth, Michael P.
2001-01-01
The buckling strength of thin-walled stiffened or unstiffened, metallic or composite shells is of major concern in aeronautical and space applications. The difficulty to predict the behavior of axially compressed thin-walled cylindrical shells continues to worry design engineers as we enter the third millennium. Thanks to extensive research programs in the late sixties and early seventies and the contributions of many eminent scientists, it is known that buckling strength calculations are affected by the uncertainties in the definition of the parameters of the problem such as definition of loads, material properties, geometric variables, edge support conditions, and the accuracy of the engineering models and analysis tools used in the design phase. The NASA design criteria monographs from the late sixties account for these design uncertainties by the use of a lump sum safety factor. This so-called 'empirical knockdown factor gamma' usually results in overly conservative design. Recently new reliability based probabilistic design procedure for buckling critical imperfect shells have been proposed. It essentially consists of a stochastic approach which introduces an improved 'scientific knockdown factor lambda(sub a)', that is not as conservative as the traditional empirical one. In order to incorporate probabilistic methods into a High Fidelity Analysis Approach one must be able to assess the accuracy of the various steps that must be executed to complete a reliability calculation. In the present paper the effect of size of the experimental input sample on the predicted value of the scientific knockdown factor lambda(sub a) calculated by the First-Order, Second-Moment Method is investigated.
Bröder, A
2000-09-01
The boundedly rational 'Take-The-Best" heuristic (TTB) was proposed by G. Gigerenzer, U. Hoffrage, and H. Kleinbölting (1991) as a model of fast and frugal probabilistic inferences. Although the simple lexicographic rule proved to be successful in computer simulations, direct empirical demonstrations of its adequacy as a psychological model are lacking because of several methodical problems. In 4 experiments with a total of 210 participants, this question was addressed. Whereas Experiment 1 showed that TTB is not valid as a universal hypothesis about probabilistic inferences, up to 28% of participants in Experiment 2 and 53% of participants in Experiment 3 were classified as TTB users. Experiment 4 revealed that investment costs for information seem to be a relevant factor leading participants to switch to a noncompensatory TTB strategy. The observed individual differences in strategy use imply the recommendation of an idiographic approach to decision-making research.
Inherent limitations of probabilistic models for protein-DNA binding specificity
Ruan, Shuxiang
2017-01-01
The specificities of transcription factors are most commonly represented with probabilistic models. These models provide a probability for each base occurring at each position within the binding site and the positions are assumed to contribute independently. The model is simple and intuitive and is the basis for many motif discovery algorithms. However, the model also has inherent limitations that prevent it from accurately representing true binding probabilities, especially for the highest affinity sites under conditions of high protein concentration. The limitations are not due to the assumption of independence between positions but rather are caused by the non-linear relationship between binding affinity and binding probability and the fact that independent normalization at each position skews the site probabilities. Generally probabilistic models are reasonably good approximations, but new high-throughput methods allow for biophysical models with increased accuracy that should be used whenever possible. PMID:28686588
Fast, noise-free memory for photon synchronization at room temperature.
Finkelstein, Ran; Poem, Eilon; Michel, Ohad; Lahad, Ohr; Firstenberg, Ofer
2018-01-01
Future quantum photonic networks require coherent optical memories for synchronizing quantum sources and gates of probabilistic nature. We demonstrate a fast ladder memory (FLAME) mapping the optical field onto the superposition between electronic orbitals of rubidium vapor. Using a ladder-level system of orbital transitions with nearly degenerate frequencies simultaneously enables high bandwidth, low noise, and long memory lifetime. We store and retrieve 1.7-ns-long pulses, containing 0.5 photons on average, and observe short-time external efficiency of 25%, memory lifetime (1/ e ) of 86 ns, and below 10 -4 added noise photons. Consequently, coupling this memory to a probabilistic source would enhance the on-demand photon generation probability by a factor of 12, the highest number yet reported for a noise-free, room temperature memory. This paves the way toward the controlled production of large quantum states of light from probabilistic photon sources.
Wan, Shibiao; Mak, Man-Wai; Kung, Sun-Yuan
2015-03-15
Proteins located in appropriate cellular compartments are of paramount importance to exert their biological functions. Prediction of protein subcellular localization by computational methods is required in the post-genomic era. Recent studies have been focusing on predicting not only single-location proteins but also multi-location proteins. However, most of the existing predictors are far from effective for tackling the challenges of multi-label proteins. This article proposes an efficient multi-label predictor, namely mPLR-Loc, based on penalized logistic regression and adaptive decisions for predicting both single- and multi-location proteins. Specifically, for each query protein, mPLR-Loc exploits the information from the Gene Ontology (GO) database by using its accession number (AC) or the ACs of its homologs obtained via BLAST. The frequencies of GO occurrences are used to construct feature vectors, which are then classified by an adaptive decision-based multi-label penalized logistic regression classifier. Experimental results based on two recent stringent benchmark datasets (virus and plant) show that mPLR-Loc remarkably outperforms existing state-of-the-art multi-label predictors. In addition to being able to rapidly and accurately predict subcellular localization of single- and multi-label proteins, mPLR-Loc can also provide probabilistic confidence scores for the prediction decisions. For readers' convenience, the mPLR-Loc server is available online (http://bioinfo.eie.polyu.edu.hk/mPLRLocServer). Copyright © 2014 Elsevier Inc. All rights reserved.
Review of methods for developing probabilistic risk assessments
D. A. Weinstein; P.B. Woodbury
2010-01-01
We describe methodologies currently in use or those under development containing features for estimating fire occurrence risk assessment. We describe two major categories of fire risk assessment tools: those that predict fire under current conditions, assuming that vegetation, climate, and the interactions between them and fire remain relatively similar to their...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Coleman, Justin Leigh; Veeraraghavan, Swetha; Bolisetti, Chandrakanth
MASTODON has the capability to model stochastic nonlinear soil-structure interaction (NLSSI) in a dynamic probabilistic risk assessment framework. The NLSSI simulations include structural dynamics, time integration, dynamic porous media flow, nonlinear hysteretic soil constitutive models, geometric nonlinearities (gapping, sliding, and uplift). MASTODON is also the MOOSE based master application for dynamic PRA of external hazards.
Towards a Multi-Resolution Model of Seismic Risk in Central Asia. Challenge and perspectives
NASA Astrophysics Data System (ADS)
Pittore, M.; Wieland, M.; Bindi, D.; Parolai, S.
2011-12-01
Assessing seismic risk, defined as the probability of occurrence of economical and social losses as consequence of an earthquake, both at regional and at local scale is a challenging, multi-disciplinary task. In order to provide a reliable estimate, diverse information must be gathered by seismologists, geologists, engineers and civil authorities, and carefully integrated keeping into account the different levels of uncertainty. The research towards an integrated methodology, able to seamlessly describe seismic risk at different spatial scales is challenging, but discloses new application perspectives, particularly in those countries which suffer from a relevant seismic hazard but do not have resources for a standard assessment. Central Asian countries in particular, which exhibit one of the highest seismic hazard in the world, are experiencing a steady demographic growth, often accompanied by informal settlement and urban sprawling. A reliable evaluation of how these factors affect the seismic risk, together with a realistic assessment of the assets exposed to seismic hazard and their structural vulnerability is of particular importance, in order to undertake proper mitigation actions and to promptly and efficiently react to a catastrophic event. New strategies are needed to efficiently cope with systematic lack of information and uncertainties. An original approach is presented to assess seismic risk based on integration of information coming from remote-sensing and ground-based panoramic imaging, in situ measurements, expert knowledge and already available data. Efficient sampling strategies based on freely available medium-resolution multi-spectral satellite images are adopted to optimize data collection and validation, in a multi-scale approach. Panoramic imaging is also considered as a valuable ground-based visual data collection technique, suitable both for manual and automatic analysis. A full-probabilistic framework based on Bayes Network is proposed to integrate available information taking into account both aleatory and epistemic uncertainties. An improved risk model for the capital of Kyrgyz Republic, Biskek, has been developed following this approach and tested based on different earthquake scenarios. Preliminary results will be presented and discussed.
A probabilistic model for the persistence of early planar fabrics in polydeformed pelitic schists
Ferguson, C.C.
1984-01-01
Although early planar fabrics are commonly preserved within microlithons in low-grade pelites, in higher-grade (amphibolite facies) pelitic schists fabric regeneration often appears complete. Evidence for early fabrics may be preserved within porphyroblasts but, within the matrix, later deformation often appears to totally obliterate or reorient earlier fabrics. However, examination of several hundred Dalradian pelites from Connemara, western Ireland, reveals that preservation of early fabrics is by no means uncommon; relict matrix domains, although volumetrically insignificant, are remarkably persistent even when inferred later strains are very large and fabric regeneration appears, at first sight, complete. Deterministic plasticity theories are ill-suited to the analysis of such an inhomogeneous material response, and a probabilistic model is proposed instead. It assumes that ductile polycrystal deformation is controlled by elementary flow units which can be activated once their associated stress barrier is overcome. Bulk flow propensity is related to the proportion of simultaneous activations, and a measure of this is derived from the probabilistic interaction between a stress-barrier spectrum and an internal stress spectrum (the latter determined by the external loading and the details of internal stress transfer). The spectra are modelled as Gaussian distributions although the treatment is very general and could be adapted for other distributions. Using the time rate of change of activation probability it is predicted that, initially, fabric development will be rapid but will then slow down dramatically even though stress increases at a constant rate. This highly non-linear response suggests that early fabrics persist because they comprise unfavourable distributions of stress-barriers which remain unregenerated at the time bulk stress is stabilized by steady-state flow. Relict domains will, however, bear the highest stress and are potential upper-bound palaeostress estimators. Some factors relevant to the micromechanical explanation of relict matrix domains are discussed. ?? 1984.
Network formation in a multi-asset artificial stock market
NASA Astrophysics Data System (ADS)
Wu, Songtao; He, Jianmin; Li, Shouwei; Wang, Chao
2018-04-01
A multi-asset artificial stock market is developed. In the market, stocks are assigned to a number of sectors and traded by heterogeneous investors. The mechanism of continuous double auction is employed to clear order book and form daily closed prices. Simulation results of prices at the sector level show an intra-sector similarity and inter-sector distinctiveness, and returns of individual stocks have stylized facts that are ubiquitous in the real-world stock market. We find that the market risk factor has critical impact on both network topology transition and connection formation, and that sector risk factors account for the formation of intra-sector links and sector-based local interaction. In addition, the number of community in threshold-based networks is correlated negatively and positively with the value of correlation coefficients and the ratio of intra-sector links, which are respectively determined by intensity of sector risk factors and the number of sectors.
Reliability and Confidence Interval Analysis of a CMC Turbine Stator Vane
NASA Technical Reports Server (NTRS)
Murthy, Pappu L. N.; Gyekenyesi, John P.; Mital, Subodh K.
2008-01-01
High temperature ceramic matrix composites (CMC) are being explored as viable candidate materials for hot section gas turbine components. These advanced composites can potentially lead to reduced weight, enable higher operating temperatures requiring less cooling and thus leading to increased engine efficiencies. However, these materials are brittle and show degradation with time at high operating temperatures due to creep as well as cyclic mechanical and thermal loads. In addition, these materials are heterogeneous in their make-up and various factors affect their properties in a specific design environment. Most of these advanced composites involve two- and three-dimensional fiber architectures and require a complex multi-step high temperature processing. Since there are uncertainties associated with each of these in addition to the variability in the constituent material properties, the observed behavior of composite materials exhibits scatter. Traditional material failure analyses employing a deterministic approach, where failure is assumed to occur when some allowable stress level or equivalent stress is exceeded, are not adequate for brittle material component design. Such phenomenological failure theories are reasonably successful when applied to ductile materials such as metals. Analysis of failure in structural components is governed by the observed scatter in strength, stiffness and loading conditions. In such situations, statistical design approaches must be used. Accounting for these phenomena requires a change in philosophy on the design engineer s part that leads to a reduced focus on the use of safety factors in favor of reliability analyses. The reliability approach demands that the design engineer must tolerate a finite risk of unacceptable performance. This risk of unacceptable performance is identified as a component's probability of failure (or alternatively, component reliability). The primary concern of the engineer is minimizing this risk in an economical manner. The methods to accurately determine the service life of an engine component with associated variability have become increasingly difficult. This results, in part, from the complex missions which are now routinely considered during the design process. These missions include large variations of multi-axial stresses and temperatures experienced by critical engine parts. There is a need for a convenient design tool that can accommodate various loading conditions induced by engine operating environments, and material data with their associated uncertainties to estimate the minimum predicted life of a structural component. A probabilistic composite micromechanics technique in combination with woven composite micromechanics, structural analysis and Fast Probability Integration (FPI) techniques has been used to evaluate the maximum stress and its probabilistic distribution in a CMC turbine stator vane. Furthermore, input variables causing scatter are identified and ranked based upon their sensitivity magnitude. Since the measured data for the ceramic matrix composite properties is very limited, obtaining a probabilistic distribution with their corresponding parameters is difficult. In case of limited data, confidence bounds are essential to quantify the uncertainty associated with the distribution. Usually 90 and 95% confidence intervals are computed for material properties. Failure properties are then computed with the confidence bounds. Best estimates and the confidence bounds on the best estimate of the cumulative probability function for R-S (strength - stress) are plotted. The methodologies and the results from these analyses will be discussed in the presentation.
Advancing Usability Evaluation through Human Reliability Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ronald L. Boring; David I. Gertman
2005-07-01
This paper introduces a novel augmentation to the current heuristic usability evaluation methodology. The SPAR-H human reliability analysis method was developed for categorizing human performance in nuclear power plants. Despite the specialized use of SPAR-H for safety critical scenarios, the method also holds promise for use in commercial off-the-shelf software usability evaluations. The SPAR-H method shares task analysis underpinnings with human-computer interaction, and it can be easily adapted to incorporate usability heuristics as performance shaping factors. By assigning probabilistic modifiers to heuristics, it is possible to arrive at the usability error probability (UEP). This UEP is not a literal probabilitymore » of error but nonetheless provides a quantitative basis to heuristic evaluation. When combined with a consequence matrix for usability errors, this method affords ready prioritization of usability issues.« less
Huntzinger, D. N.; Michalak, A. M.; Schwalm, C.; ...
2017-07-06
Terrestrial ecosystems play a vital role in regulating the accumulation of carbon (C) in the atmosphere. Understanding the factors controlling land C uptake is critical for reducing uncertainties in projections of future climate. The relative importance of changing climate, rising atmospheric CO 2, and other factors, however, remains unclear despite decades of research. Here, we use an ensemble of land models to show that models disagree on the primary driver of cumulative C uptake for 85% of vegetated land area. Disagreement is largest in model sensitivity to rising atmospheric CO 2 which shows almost twice the variability in cumulative landmore » uptake since 1901 (1 s.d. of 212.8 PgC vs. 138.5 PgC, respectively). We find that variability in CO 2 and temperature sensitivity is attributable, in part, to their compensatory effects on C uptake, whereby comparable estimates of C uptake can arise by invoking different sensitivities to key environmental conditions. Conversely, divergent estimates of C uptake can occur despite being based on the same environmental sensitivities. Together, these findings imply an important limitation to the predictability of C cycling and climate under unprecedented environmental conditions. We suggest that the carbon modeling community prioritize a probabilistic multi-model approach to generate more robust C cycle projections.« less
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huntzinger, D. N.; Michalak, A. M.; Schwalm, C.
2017-07-06
Terrestrial ecosystems play a vital role in regulating the accumulation of carbon (C) in the atmosphere. Understanding the factors controlling land C uptake is critical for reducing uncertainties in projections of future climate. The relative importance of changing climate, rising atmospheric CO2, and other factors, however, remains unclear despite decades of research. Here, we use an ensemble of land models to show that models disagree on the primary driver of cumulative C uptake for 85% of vegetated land area. Disagreement is largest in model sensitivity to rising atmospheric CO2 which shows almost twice the variability in cumulative land uptake sincemore » 1901 (1 s.d. of 212.8 PgC vs. 138.5 PgC, respectively). We find that variability in CO2 and temperature sensitivity is attributable, in part, to their compensatory effects on C uptake, whereby comparable estimates of C uptake can arise by invoking different sensitivities to key environmental conditions. Conversely, divergent estimates of C uptake can occur despite being based on the same environmental sensitivities. Together, these findings imply an important limitation to the predictability of C cycling and climate under unprecedented environmental conditions. We suggest that the carbon modeling community prioritize a probabilistic multi-model approach to generate more robust C cycle projections.« less
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huntzinger, D. N.; Michalak, A. M.; Schwalm, C.
Terrestrial ecosystems play a vital role in regulating the accumulation of carbon (C) in the atmosphere. Understanding the factors controlling land C uptake is critical for reducing uncertainties in projections of future climate. The relative importance of changing climate, rising atmospheric CO 2, and other factors, however, remains unclear despite decades of research. Here, we use an ensemble of land models to show that models disagree on the primary driver of cumulative C uptake for 85% of vegetated land area. Disagreement is largest in model sensitivity to rising atmospheric CO 2 which shows almost twice the variability in cumulative landmore » uptake since 1901 (1 s.d. of 212.8 PgC vs. 138.5 PgC, respectively). We find that variability in CO 2 and temperature sensitivity is attributable, in part, to their compensatory effects on C uptake, whereby comparable estimates of C uptake can arise by invoking different sensitivities to key environmental conditions. Conversely, divergent estimates of C uptake can occur despite being based on the same environmental sensitivities. Together, these findings imply an important limitation to the predictability of C cycling and climate under unprecedented environmental conditions. We suggest that the carbon modeling community prioritize a probabilistic multi-model approach to generate more robust C cycle projections.« less
Dynamic Museum Place: Exploring the Multi-Dimensional Museum Environment
ERIC Educational Resources Information Center
Leach, Denise Blair
2007-01-01
Place is an important factor in museum education, yet describing what the museum is as place is often difficult. This article introduces the idea that museums consist of multiple physical and virtual place "domains" where interactions between people and objects occur: the origin domain, creation domain, display domain, and the experiencer-object…
Determinants of Academic Achievement of Middle Schoolers in Turkey
ERIC Educational Resources Information Center
Börkan, Bengü; Bakis, Ozan
2016-01-01
The purpose of this study is to discuss student and school factors, including cross level interaction, that cause inequalities in seven and eighth grade students' achievement in Turkish context by using national achievement test scores with a multi-level statistical approach. Our results are in line with most other studies with similar purpose.…
Life Satisfaction of University Students in Relation to Family and Food in a Developing Country
Schnettler, Berta; Miranda-Zapata, Edgardo; Grunert, Klaus G.; Lobos, Germán; Denegri, Marianela; Hueche, Clementina; Poblete, Héctor
2017-01-01
Life satisfaction and satisfaction with food-related life (SWFoL) are associated with healthy eating habits, family interaction around eating and family support. The present study evaluates the relationship between SWFoL and satisfaction with family life (SWFaL), and their relationship with life satisfaction in university students. We identify the relationship of two different types of family support and student SWFaL and explore a moderator effect of gender. A questionnaire was applied to a non-probabilistic sample of 370 students of both genders (mean age 21 years) in Chile, including Satisfaction with Life Scale, SWFoL scale, SWFaL scale, and the Family Resources Scale. Using structural equation modeling, we found that students’ life satisfaction was related to SWFaL and food-related life. A high positive relationship was identified between intangible family support and students’ SWFaL, which would have a mediating role between intangible support and life satisfaction. Using multi-group analysis, a moderator effect of gender was not found. These findings suggest that improving SWFoL, SWFaL and intangible family support is important for both female and male students. PMID:28932203
Life Satisfaction of University Students in Relation to Family and Food in a Developing Country.
Schnettler, Berta; Miranda-Zapata, Edgardo; Grunert, Klaus G; Lobos, Germán; Denegri, Marianela; Hueche, Clementina; Poblete, Héctor
2017-01-01
Life satisfaction and satisfaction with food-related life (SWFoL) are associated with healthy eating habits, family interaction around eating and family support. The present study evaluates the relationship between SWFoL and satisfaction with family life (SWFaL), and their relationship with life satisfaction in university students. We identify the relationship of two different types of family support and student SWFaL and explore a moderator effect of gender. A questionnaire was applied to a non-probabilistic sample of 370 students of both genders (mean age 21 years) in Chile, including Satisfaction with Life Scale, SWFoL scale, SWFaL scale, and the Family Resources Scale. Using structural equation modeling, we found that students' life satisfaction was related to SWFaL and food-related life. A high positive relationship was identified between intangible family support and students' SWFaL, which would have a mediating role between intangible support and life satisfaction. Using multi-group analysis, a moderator effect of gender was not found. These findings suggest that improving SWFoL, SWFaL and intangible family support is important for both female and male students.
Ohdaira, Tetsushi
2016-05-05
There are two types of costly punishment, i.e. peer-punishment and pool-punishment. While peer-punishment applies direct face to face punishment, pool-punishment is based on multi-point, collective interaction among group members. Regarding those two types of costly punishment, peer-punishment is especially considered to have the flaws that it lowers the average payoff of all players as well as pool-punishment does, and facilitates antisocial behaviour like retaliation of a defector on a cooperator. Here, this study proposes the new peer-punishment that punishment to an opponent player works at high probability when an opponent one is uncooperative, and the difference of payoff between a player and an opponent one becomes large in order to prevent such antisocial behaviour. It is natural to think that players of high payoff do not expect to punish others of lower payoff because they do not have any complaints regarding their economic wealth. The author shows that the introduction of the proposed peer-punishment increases both the number of cooperative players and the average payoff of all players in various types of topology of connections between players.
Ohdaira, Tetsushi
2016-01-01
There are two types of costly punishment, i.e. peer-punishment and pool-punishment. While peer-punishment applies direct face to face punishment, pool-punishment is based on multi-point, collective interaction among group members. Regarding those two types of costly punishment, peer-punishment is especially considered to have the flaws that it lowers the average payoff of all players as well as pool-punishment does, and facilitates antisocial behaviour like retaliation of a defector on a cooperator. Here, this study proposes the new peer-punishment that punishment to an opponent player works at high probability when an opponent one is uncooperative, and the difference of payoff between a player and an opponent one becomes large in order to prevent such antisocial behaviour. It is natural to think that players of high payoff do not expect to punish others of lower payoff because they do not have any complaints regarding their economic wealth. The author shows that the introduction of the proposed peer-punishment increases both the number of cooperative players and the average payoff of all players in various types of topology of connections between players. PMID:27146347
NASA Astrophysics Data System (ADS)
Ohdaira, Tetsushi
2016-05-01
There are two types of costly punishment, i.e. peer-punishment and pool-punishment. While peer-punishment applies direct face to face punishment, pool-punishment is based on multi-point, collective interaction among group members. Regarding those two types of costly punishment, peer-punishment is especially considered to have the flaws that it lowers the average payoff of all players as well as pool-punishment does, and facilitates antisocial behaviour like retaliation of a defector on a cooperator. Here, this study proposes the new peer-punishment that punishment to an opponent player works at high probability when an opponent one is uncooperative, and the difference of payoff between a player and an opponent one becomes large in order to prevent such antisocial behaviour. It is natural to think that players of high payoff do not expect to punish others of lower payoff because they do not have any complaints regarding their economic wealth. The author shows that the introduction of the proposed peer-punishment increases both the number of cooperative players and the average payoff of all players in various types of topology of connections between players.
NASA Astrophysics Data System (ADS)
Xu, Lei; Zhai, Wanming; Gao, Jianmin
2017-11-01
Track irregularities are inevitably in a process of stochastic evolution due to the uncertainty and continuity of wheel-rail interactions. For depicting the dynamic behaviours of vehicle-track coupling system caused by track random irregularities thoroughly, it is a necessity to develop a track irregularity probabilistic model to simulate rail surface irregularities with ergodic properties on amplitudes, wavelengths and probabilities, and to build a three-dimensional vehicle-track coupled model by properly considering the wheel-rail nonlinear contact mechanisms. In the present study, the vehicle-track coupled model is programmed by combining finite element method with wheel-rail coupling model firstly. Then, in light of the capability of power spectral density (PSD) in characterising amplitudes and wavelengths of stationary random signals, a track irregularity probabilistic model is presented to reveal and simulate the whole characteristics of track irregularity PSD. Finally, extended applications from three aspects, that is, extreme analysis, reliability analysis and response relationships between dynamic indices, are conducted to the evaluation and application of the proposed models.
Kayen, R.; Moss, R.E.S.; Thompson, E.M.; Seed, R.B.; Cetin, K.O.; Der Kiureghian, A.; Tanaka, Y.; Tokimatsu, K.
2013-01-01
Shear-wave velocity (Vs) offers a means to determine the seismic resistance of soil to liquefaction by a fundamental soil property. This paper presents the results of an 11-year international project to gather new Vs site data and develop probabilistic correlations for seismic soil liquefaction occurrence. Toward that objective, shear-wave velocity test sites were identified, and measurements made for 301 new liquefaction field case histories in China, Japan, Taiwan, Greece, and the United States over a decade. The majority of these new case histories reoccupy those previously investigated by penetration testing. These new data are combined with previously published case histories to build a global catalog of 422 case histories of Vs liquefaction performance. Bayesian regression and structural reliability methods facilitate a probabilistic treatment of the Vs catalog for performance-based engineering applications. Where possible, uncertainties of the variables comprising both the seismic demand and the soil capacity were estimated and included in the analysis, resulting in greatly reduced overall model uncertainty relative to previous studies. The presented data set and probabilistic analysis also help resolve the ancillary issues of adjustment for soil fines content and magnitude scaling factors.
Scalable non-negative matrix tri-factorization.
Čopar, Andrej; Žitnik, Marinka; Zupan, Blaž
2017-01-01
Matrix factorization is a well established pattern discovery tool that has seen numerous applications in biomedical data analytics, such as gene expression co-clustering, patient stratification, and gene-disease association mining. Matrix factorization learns a latent data model that takes a data matrix and transforms it into a latent feature space enabling generalization, noise removal and feature discovery. However, factorization algorithms are numerically intensive, and hence there is a pressing challenge to scale current algorithms to work with large datasets. Our focus in this paper is matrix tri-factorization, a popular method that is not limited by the assumption of standard matrix factorization about data residing in one latent space. Matrix tri-factorization solves this by inferring a separate latent space for each dimension in a data matrix, and a latent mapping of interactions between the inferred spaces, making the approach particularly suitable for biomedical data mining. We developed a block-wise approach for latent factor learning in matrix tri-factorization. The approach partitions a data matrix into disjoint submatrices that are treated independently and fed into a parallel factorization system. An appealing property of the proposed approach is its mathematical equivalence with serial matrix tri-factorization. In a study on large biomedical datasets we show that our approach scales well on multi-processor and multi-GPU architectures. On a four-GPU system we demonstrate that our approach can be more than 100-times faster than its single-processor counterpart. A general approach for scaling non-negative matrix tri-factorization is proposed. The approach is especially useful parallel matrix factorization implemented in a multi-GPU environment. We expect the new approach will be useful in emerging procedures for latent factor analysis, notably for data integration, where many large data matrices need to be collectively factorized.
Cheng, Feon W; Gao, Xiang; Bao, Le; Mitchell, Diane C; Wood, Craig; Sliwinski, Martin J; Smiciklas-Wright, Helen; Still, Christopher D; Rolston, David D K; Jensen, Gordon L
2017-07-01
To examine the risk factors of developing functional decline and make probabilistic predictions by using a tree-based method that allows higher order polynomials and interactions of the risk factors. The conditional inference tree analysis, a data mining approach, was used to construct a risk stratification algorithm for developing functional limitation based on BMI and other potential risk factors for disability in 1,951 older adults without functional limitations at baseline (baseline age 73.1 ± 4.2 y). We also analyzed the data with multivariate stepwise logistic regression and compared the two approaches (e.g., cross-validation). Over a mean of 9.2 ± 1.7 years of follow-up, 221 individuals developed functional limitation. Higher BMI, age, and comorbidity were consistently identified as significant risk factors for functional decline using both methods. Based on these factors, individuals were stratified into four risk groups via the conditional inference tree analysis. Compared to the low-risk group, all other groups had a significantly higher risk of developing functional limitation. The odds ratio comparing two extreme categories was 9.09 (95% confidence interval: 4.68, 17.6). Higher BMI, age, and comorbid disease were consistently identified as significant risk factors for functional decline among older individuals across all approaches and analyses. © 2017 The Obesity Society.
NASA Technical Reports Server (NTRS)
Veldkamp, Ted; Wada, Yoshihide; Aerts, Jeroen; Ward, Phillip
2016-01-01
Water scarcity -driven by climate change, climate variability, and socioeconomic developments- is recognized as one of the most important global risks, both in terms of likelihood and impact. Whilst a wide range of studies have assessed the role of long term climate change and socioeconomic trends on global water scarcity, the impact of variability is less well understood. Moreover, the interactions between different forcing mechanisms, and their combined effect on changes in water scarcity conditions, are often neglected. Therefore, we provide a first step towards a framework for global water scarcity risk assessments, applying probabilistic methods to estimate water scarcity risks for different return periods under current and future conditions while using multiple climate and socioeconomic scenarios.
Cosmic-ray interaction data for designing biological experiments in space
NASA Astrophysics Data System (ADS)
Straume, T.; Slaba, T. C.; Bhattacharya, S.; Braby, L. A.
2017-05-01
There is growing interest in flying biological experiments beyond low-Earth orbit (LEO) to measure biological responses potentially relevant to those expected during a human mission to Mars. Such experiments could be payloads onboard precursor missions, including unmanned private-public partnerships, as well as small low-cost spacecraft (satellites) designed specifically for biosentinel-type missions. It is the purpose of this paper to provide physical cosmic-ray interaction data and related information useful to biologists who may be planning such experiments. It is not the objective here to actually design such experiments or provide radiobiological response functions, which would be specific for each experiment and biological endpoint. Nuclide-specific flux and dose rates were calculated using OLTARIS and these results were used to determine particle traversal rates and doses in hypothetical biological targets. Comparisons are provided between GCR in interplanetary space and inside the ISS. Calculated probabilistic estimates of dose from solar particle events are also presented. Although the focus here is on biological experiments, the information provided may be useful for designing other payloads as well if the space radiation environment is a factor to be considered.
Complement Coercion: The Joint Effects of Type and Typicality.
Zarcone, Alessandra; McRae, Ken; Lenci, Alessandro; Padó, Sebastian
2017-01-01
Complement coercion ( begin a book → reading ) involves a type clash between an event-selecting verb and an entity-denoting object, triggering a covert event ( reading ). Two main factors involved in complement coercion have been investigated: the semantic type of the object (event vs. entity), and the typicality of the covert event ( the author began a book → writing ). In previous research, reading times have been measured at the object. However, the influence of the typicality of the subject-object combination on processing an aspectual verb such as begin has not been studied. Using a self-paced reading study, we manipulated semantic type and subject-object typicality, exploiting German word order to measure reading times at the aspectual verb. These variables interacted at the target verb. We conclude that both type and typicality probabilistically guide expectations about upcoming input. These results are compatible with an expectation-based view of complement coercion and language comprehension more generally in which there is rapid interaction between what is typically viewed as linguistic knowledge, and what is typically viewed as domain general knowledge about how the world works.
Complement Coercion: The Joint Effects of Type and Typicality
Zarcone, Alessandra; McRae, Ken; Lenci, Alessandro; Padó, Sebastian
2017-01-01
Complement coercion (begin a book →reading) involves a type clash between an event-selecting verb and an entity-denoting object, triggering a covert event (reading). Two main factors involved in complement coercion have been investigated: the semantic type of the object (event vs. entity), and the typicality of the covert event (the author began a book →writing). In previous research, reading times have been measured at the object. However, the influence of the typicality of the subject–object combination on processing an aspectual verb such as begin has not been studied. Using a self-paced reading study, we manipulated semantic type and subject–object typicality, exploiting German word order to measure reading times at the aspectual verb. These variables interacted at the target verb. We conclude that both type and typicality probabilistically guide expectations about upcoming input. These results are compatible with an expectation-based view of complement coercion and language comprehension more generally in which there is rapid interaction between what is typically viewed as linguistic knowledge, and what is typically viewed as domain general knowledge about how the world works. PMID:29225585
NASA Astrophysics Data System (ADS)
Maciążek-Jurczyk, M.; Sułkowska, A.; Bojko, B.; Równicka-Zubik, J.; Sułkowski, W. W.
2009-09-01
The monitoring of drug concentration in blood serum is necessary in multi-drug therapy. Mechanism of drug binding with serum albumin (SA) is one of the most important factors which determine drug concentration and its transport to the destination tissues. In rheumatoid diseases drugs which can induce various adverse effects are commonly used in combination therapy. Such proceeding may result in the enhancement of those side effects due to drug interaction. Interaction of phenylbutazone and colchicine in binding to serum albumin and competition between them in gout has been studied by proton nuclear magnetic resonance ( 1H NMR) technique. The aim of the study was to determine the low affinity binding sites, the strength and kind of interaction between serum albumin and drugs used in combination therapy. The study of competition between phenylbutazone and colchicine in binding to serum albumin points to the change of their affinity to serum albumin in the ternary systems. This should be taken into account in multi-drug therapy. This work is a subsequent part of the spectroscopic study on Phe-COL-SA interactions [A. Sułkowska, et al., J. Mol. Struct. 881 (2008) 97-106].
Rule-Based Simulation of Multi-Cellular Biological Systems—A Review of Modeling Techniques
Hwang, Minki; Garbey, Marc; Berceli, Scott A.; Tran-Son-Tay, Roger
2011-01-01
Emergent behaviors of multi-cellular biological systems (MCBS) result from the behaviors of each individual cells and their interactions with other cells and with the environment. Modeling MCBS requires incorporating these complex interactions among the individual cells and the environment. Modeling approaches for MCBS can be grouped into two categories: continuum models and cell-based models. Continuum models usually take the form of partial differential equations, and the model equations provide insight into the relationship among the components in the system. Cell-based models simulate each individual cell behavior and interactions among them enabling the observation of the emergent system behavior. This review focuses on the cell-based models of MCBS, and especially, the technical aspect of the rule-based simulation method for MCBS is reviewed. How to implement the cell behaviors and the interactions with other cells and with the environment into the computational domain is discussed. The cell behaviors reviewed in this paper are division, migration, apoptosis/necrosis, and differentiation. The environmental factors such as extracellular matrix, chemicals, microvasculature, and forces are also discussed. Application examples of these cell behaviors and interactions are presented. PMID:21369345
Game Design to Measure Reflexes and Attention Based on Biofeedback Multi-Sensor Interaction
Ortiz-Vigon Uriarte, Inigo de Loyola; Garcia-Zapirain, Begonya; Garcia-Chimeno, Yolanda
2015-01-01
This paper presents a multi-sensor system for implementing biofeedback as a human-computer interaction technique in a game involving driving cars in risky situations. The sensors used are: Eye Tracker, Kinect, pulsometer, respirometer, electromiography (EMG) and galvanic skin resistance (GSR). An algorithm has been designed which gives rise to an interaction logic with the game according to the set of physiological constants obtained from the sensors. The results reflect a 72.333 response to the System Usability Scale (SUS), a significant difference of p = 0.026 in GSR values in terms of the difference between the start and end of the game, and an r = 0.659 and p = 0.008 correlation while playing with the Kinect between the breathing level and the energy and joy factor. All the sensors used had an impact on the end results, whereby none of them should be disregarded in future lines of research, even though it would be interesting to obtain separate breathing values from that of the cardio. PMID:25789493
NASA Astrophysics Data System (ADS)
Gilchrist, J. J.; Jordan, T. H.; Shaw, B. E.; Milner, K. R.; Richards-Dinger, K. B.; Dieterich, J. H.
2017-12-01
Within the SCEC Collaboratory for Interseismic Simulation and Modeling (CISM), we are developing physics-based forecasting models for earthquake ruptures in California. We employ the 3D boundary element code RSQSim (Rate-State Earthquake Simulator of Dieterich & Richards-Dinger, 2010) to generate synthetic catalogs with tens of millions of events that span up to a million years each. This code models rupture nucleation by rate- and state-dependent friction and Coulomb stress transfer in complex, fully interacting fault systems. The Uniform California Earthquake Rupture Forecast Version 3 (UCERF3) fault and deformation models are used to specify the fault geometry and long-term slip rates. We have employed the Blue Waters supercomputer to generate long catalogs of simulated California seismicity from which we calculate the forecasting statistics for large events. We have performed probabilistic seismic hazard analysis with RSQSim catalogs that were calibrated with system-wide parameters and found a remarkably good agreement with UCERF3 (Milner et al., this meeting). We build on this analysis, comparing the conditional probabilities of sequences of large events from RSQSim and UCERF3. In making these comparisons, we consider the epistemic uncertainties associated with the RSQSim parameters (e.g., rate- and state-frictional parameters), as well as the effects of model-tuning (e.g., adjusting the RSQSim parameters to match UCERF3 recurrence rates). The comparisons illustrate how physics-based rupture simulators might assist forecasters in understanding the short-term hazards of large aftershocks and multi-event sequences associated with complex, multi-fault ruptures.
Probabilistic dose-response modeling: case study using dichloromethane PBPK model results.
Marino, Dale J; Starr, Thomas B
2007-12-01
A revised assessment of dichloromethane (DCM) has recently been reported that examines the influence of human genetic polymorphisms on cancer risks using deterministic PBPK and dose-response modeling in the mouse combined with probabilistic PBPK modeling in humans. This assessment utilized Bayesian techniques to optimize kinetic variables in mice and humans with mean values from posterior distributions used in the deterministic modeling in the mouse. To supplement this research, a case study was undertaken to examine the potential impact of probabilistic rather than deterministic PBPK and dose-response modeling in mice on subsequent unit risk factor (URF) determinations. Four separate PBPK cases were examined based on the exposure regimen of the NTP DCM bioassay. These were (a) Same Mouse (single draw of all PBPK inputs for both treatment groups); (b) Correlated BW-Same Inputs (single draw of all PBPK inputs for both treatment groups except for bodyweights (BWs), which were entered as correlated variables); (c) Correlated BW-Different Inputs (separate draws of all PBPK inputs for both treatment groups except that BWs were entered as correlated variables); and (d) Different Mouse (separate draws of all PBPK inputs for both treatment groups). Monte Carlo PBPK inputs reflect posterior distributions from Bayesian calibration in the mouse that had been previously reported. A minimum of 12,500 PBPK iterations were undertaken, in which dose metrics, i.e., mg DCM metabolized by the GST pathway/L tissue/day for lung and liver were determined. For dose-response modeling, these metrics were combined with NTP tumor incidence data that were randomly selected from binomial distributions. Resultant potency factors (0.1/ED(10)) were coupled with probabilistic PBPK modeling in humans that incorporated genetic polymorphisms to derive URFs. Results show that there was relatively little difference, i.e., <10% in central tendency and upper percentile URFs, regardless of the case evaluated. Independent draws of PBPK inputs resulted in the slightly higher URFs. Results were also comparable to corresponding values from the previously reported deterministic mouse PBPK and dose-response modeling approach that used LED(10)s to derive potency factors. This finding indicated that the adjustment from ED(10) to LED(10) in the deterministic approach for DCM compensated for variability resulting from probabilistic PBPK and dose-response modeling in the mouse. Finally, results show a similar degree of variability in DCM risk estimates from a number of different sources including the current effort even though these estimates were developed using very different techniques. Given the variety of different approaches involved, 95th percentile-to-mean risk estimate ratios of 2.1-4.1 represent reasonable bounds on variability estimates regarding probabilistic assessments of DCM.
NASA Astrophysics Data System (ADS)
Mainardi Fan, Fernando; Schwanenberg, Dirk; Alvarado, Rodolfo; Assis dos Reis, Alberto; Naumann, Steffi; Collischonn, Walter
2016-04-01
Hydropower is the most important electricity source in Brazil. During recent years, it accounted for 60% to 70% of the total electric power supply. Marginal costs of hydropower are lower than for thermal power plants, therefore, there is a strong economic motivation to maximize its share. On the other hand, hydropower depends on the availability of water, which has a natural variability. Its extremes lead to the risks of power production deficits during droughts and safety issues in the reservoir and downstream river reaches during flood events. One building block of the proper management of hydropower assets is the short-term forecast of reservoir inflows as input for an online, event-based optimization of its release strategy. While deterministic forecasts and optimization schemes are the established techniques for the short-term reservoir management, the use of probabilistic ensemble forecasts and stochastic optimization techniques receives growing attention and a number of researches have shown its benefit. The present work shows one of the first hindcasting and closed-loop control experiments for a multi-purpose hydropower reservoir in a tropical region in Brazil. The case study is the hydropower project (HPP) Três Marias, located in southeast Brazil. The HPP reservoir is operated with two main objectives: (i) hydroelectricity generation and (ii) flood control at Pirapora City located 120 km downstream of the dam. In the experiments, precipitation forecasts based on observed data, deterministic and probabilistic forecasts with 50 ensemble members of the ECMWF are used as forcing of the MGB-IPH hydrological model to generate streamflow forecasts over a period of 2 years. The online optimization depends on a deterministic and multi-stage stochastic version of a model predictive control scheme. Results for the perfect forecasts show the potential benefit of the online optimization and indicate a desired forecast lead time of 30 days. In comparison, the use of actual forecasts with shorter lead times of up to 15 days shows the practical benefit of actual operational data. It appears that the use of stochastic optimization combined with ensemble forecasts leads to a significant higher level of flood protection without compromising the HPP's energy production.
Griffith, Shayl; Arnold, David; Voegler-Lee, Mary-Ellen; Kupersmidt, Janis
2016-01-01
There has been increasing awareness of the need for research and theory to take into account the intersection of individual characteristics and environmental contexts when examining predictors of child outcomes. The present longitudinal, multi-informant study examined the cumulative and interacting contributions of child characteristics (language skills, inattention/hyperactivity, and aggression) and preschool and family contextual factors in predicting kindergarten social skills in 389 low-income preschool children. Child characteristics and classroom factors, but not family factors, predicted teacher-rated kindergarten social skills, while child characteristics alone predicted change in teacher-rated social skills from preschool to kindergarten. Child characteristics and family factors, but not classroom factors, predicted parent-rated kindergarten social skills. Family factors alone predicted change in parent-rated social skills from preschool to kindergarten. Individual child characteristics did not interact with family or classroom factors in predicting parent- or teacher-rated social skills, and support was therefore found for an incremental, rather than an interactive, predictive model of social skills. The findings underscore the importance of assessing outcomes in more than one context, and of considering the impact of both individual and environmental contextual factors on children's developing social skills when designing targeted intervention programs to prepare children for kindergarten.
Griffith, Shayl; Arnold, David; Voegler-Lee, Mary-Ellen; Kupersmidt, Janis
2017-01-01
There has been increasing awareness of the need for research and theory to take into account the intersection of individual characteristics and environmental contexts when examining predictors of child outcomes. The present longitudinal, multi-informant study examined the cumulative and interacting contributions of child characteristics (language skills, inattention/hyperactivity, and aggression) and preschool and family contextual factors in predicting kindergarten social skills in 389 low-income preschool children. Child characteristics and classroom factors, but not family factors, predicted teacher-rated kindergarten social skills, while child characteristics alone predicted change in teacher-rated social skills from preschool to kindergarten. Child characteristics and family factors, but not classroom factors, predicted parent-rated kindergarten social skills. Family factors alone predicted change in parent-rated social skills from preschool to kindergarten. Individual child characteristics did not interact with family or classroom factors in predicting parent- or teacher-rated social skills, and support was therefore found for an incremental, rather than an interactive, predictive model of social skills. The findings underscore the importance of assessing outcomes in more than one context, and of considering the impact of both individual and environmental contextual factors on children’s developing social skills when designing targeted intervention programs to prepare children for kindergarten. PMID:28804528
Cross-section analysis of the Magnum-PSI plasma beam using a 2D multi-probe system
NASA Astrophysics Data System (ADS)
Costin, C.; Anita, V.; Ghiorghiu, F.; Popa, G.; De Temmerman, G.; van den Berg, M. A.; Scholten, J.; Brons, S.
2015-02-01
The linear plasma generator Magnum-PSI was designed for the study of plasma-surface interactions under relevant conditions of fusion devices. A key factor for such studies is the knowledge of a set of parameters that characterize the plasma interacting with the solid surface. This paper reports on the electrical diagnosis of the plasma beam in Magnum-PSI using a multi-probe system consisting of 64 probes arranged in a 2D square matrix. Cross-section distributions of floating potential and ion current intensity were registered for a hydrogen plasma beam under various discharge currents (80-175 A) and magnetic field strengths (0.47-1.41 T in the middle of the coils). Probe measurements revealed a high level of flexibility of plasma beam parameters with respect to the operating conditions.
Multi-kw dc power distribution system study program
NASA Technical Reports Server (NTRS)
Berkery, E. A.; Krausz, A.
1974-01-01
The first phase of the Multi-kw dc Power Distribution Technology Program is reported and involves the test and evaluation of a technology breadboard in a specifically designed test facility according to design concepts developed in a previous study on space vehicle electrical power processing, distribution, and control. The static and dynamic performance, fault isolation, reliability, electromagnetic interference characterisitics, and operability factors of high distribution systems were studied in order to gain a technology base for the use of high voltage dc systems in future aerospace vehicles. Detailed technical descriptions are presented and include data for the following: (1) dynamic interactions due to operation of solid state and electromechanical switchgear; (2) multiplexed and computer controlled supervision and checkout methods; (3) pulse width modulator design; and (4) cable design factors.
Decerns: A framework for multi-criteria decision analysis
Yatsalo, Boris; Didenko, Vladimir; Gritsyuk, Sergey; ...
2015-02-27
A new framework, Decerns, for multicriteria decision analysis (MCDA) of a wide range of practical problems on risk management is introduced. Decerns framework contains a library of modules that are the basis for two scalable systems: DecernsMCDA for analysis of multicriteria problems, and DecernsSDSS for multicriteria analysis of spatial options. DecernsMCDA includes well known MCDA methods and original methods for uncertainty treatment based on probabilistic approaches and fuzzy numbers. As a result, these MCDA methods are described along with a case study on analysis of multicriteria location problem.
NASA Astrophysics Data System (ADS)
Xu, Hao; Pei, Yongmao; Li, Faxin; Fang, Daining
2018-05-01
The magnetic, electric and mechanical behaviors are strongly coupled in magnetoelectric (ME) materials, making them great promising in the application of functional devices. In this paper, the magneto-electro-mechanical fully coupled constitutive behaviors of ME laminates are systematically studied both theoretically and experimentally. A new probabilistic domain switching function considering the surface ferromagnetic anisotropy and the interface charge-mediated effect is proposed. Then a multi-scale multi-field coupling nonlinear constitutive model for layered ME composites is developed with physical measureable parameters. The experiments were performed to compare the theoretical predictions with the experimental data. The theoretical predictions have a good agreement with experimental results. The proposed constitutive relation can be used to describe the nonlinear multi-field coupling properties of both ME laminates and thin films. Several novel coupling experimental phenomena such as the electric-field control of magnetization, and the magnetic-field tuning of polarization are observed and analyzed. Furthermore, the size-effect of the electric tuning behavior of magnetization is predicted, which demonstrates a competition mechanism between the interface strain-mediated effect and the charge-driven effect. Our study offers deep insight into the coupling microscopic mechanism and macroscopic properties of ME layered composites, which is benefit for the design of electromagnetic functional devices.
Terzo, Esteban A; Lyons, Shawn M; Poulton, John S; Temple, Brenda R S; Marzluff, William F; Duronio, Robert J
2015-04-15
Nuclear bodies (NBs) are structures that concentrate proteins, RNAs, and ribonucleoproteins that perform functions essential to gene expression. How NBs assemble is not well understood. We studied the Drosophila histone locus body (HLB), a NB that concentrates factors required for histone mRNA biosynthesis at the replication-dependent histone gene locus. We coupled biochemical analysis with confocal imaging of both fixed and live tissues to demonstrate that the Drosophila Multi Sex Combs (Mxc) protein contains multiple domains necessary for HLB assembly. An important feature of this assembly process is the self-interaction of Mxc via two conserved N-terminal domains: a LisH domain and a novel self-interaction facilitator (SIF) domain immediately downstream of the LisH domain. Molecular modeling suggests that the LisH and SIF domains directly interact, and mutation of either the LisH or the SIF domain severely impairs Mxc function in vivo, resulting in reduced histone mRNA accumulation. A region of Mxc between amino acids 721 and 1481 is also necessary for HLB assembly independent of the LisH and SIF domains. Finally, the C-terminal 195 amino acids of Mxc are required for recruiting FLASH, an essential histone mRNA-processing factor, to the HLB. We conclude that multiple domains of the Mxc protein promote HLB assembly in order to concentrate factors required for histone mRNA biosynthesis. © 2015 Terzo et al. This article is distributed by The American Society for Cell Biology under license from the author(s). Two months after publication it is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).
Björkman, Maria; Klingen, Ingeborg; Birch, Andrew N E; Bones, Atle M; Bruce, Toby J A; Johansen, Tor J; Meadow, Richard; Mølmann, Jørgen; Seljåsen, Randi; Smart, Lesley E; Stewart, Derek
2011-05-01
In this review, we provide an overview of the role of glucosinolates and other phytochemical compounds present in the Brassicaceae in relation to plant protection and human health. Current knowledge of the factors that influence phytochemical content and profile in the Brassicaceae is also summarized and multi-factorial approaches are briefly discussed. Variation in agronomic conditions (plant species, cultivar, developmental stage, plant organ, plant competition, fertilization, pH), season, climatic factors, water availability, light (intensity, quality, duration) and CO(2) are known to significantly affect content and profile of phytochemicals. Phytochemicals such as the glucosinolates and leaf surface waxes play an important role in interactions with pests and pathogens. Factors that affect production of phytochemicals are important when designing plant protection strategies that exploit these compounds to minimize crop damage caused by plant pests and pathogens. Brassicaceous plants are consumed increasingly for possible health benefits, for example, glucosinolate-derived effects on degenerative diseases such as cancer, cardiovascular and neurodegenerative diseases. Thus, factors influencing phytochemical content and profile in the production of brassicaceous plants are worth considering both for plant and human health. Even though it is known that factors that influence phytochemical content and profile may interact, studies of plant compounds were, until recently, restricted by methods allowing only a reductionistic approach. It is now possible to design multi-factorial experiments that simulate their combined effects. This will provide important information to ecologists, plant breeders and agronomists. Copyright © 2011 Elsevier Ltd. All rights reserved.
Bimanual Interaction with Interscopic Multi-Touch Surfaces
NASA Astrophysics Data System (ADS)
Schöning, Johannes; Steinicke, Frank; Krüger, Antonio; Hinrichs, Klaus; Valkov, Dimitar
Multi-touch interaction has received considerable attention in the last few years, in particular for natural two-dimensional (2D) interaction. However, many application areas deal with three-dimensional (3D) data and require intuitive 3D interaction techniques therefore. Indeed, virtual reality (VR) systems provide sophisticated 3D user interface, but then lack efficient 2D interaction, and are therefore rarely adopted by ordinary users or even by experts. Since multi-touch interfaces represent a good trade-off between intuitive, constrained interaction on a touch surface providing tangible feedback, and unrestricted natural interaction without any instrumentation, they have the potential to form the foundation of the next generation user interface for 2D as well as 3D interaction. In particular, stereoscopic display of 3D data provides an additional depth cue, but until now the challenges and limitations for multi-touch interaction in this context have not been considered. In this paper we present new multi-touch paradigms and interactions that combine both traditional 2D interaction and novel 3D interaction on a touch surface to form a new class of multi-touch systems, which we refer to as interscopic multi-touch surfaces (iMUTS). We discuss iMUTS-based user interfaces that support interaction with 2D content displayed in monoscopic mode and 3D content usually displayed stereoscopically. In order to underline the potential of the proposed iMUTS setup, we have developed and evaluated two example interaction metaphors for different domains. First, we present intuitive navigation techniques for virtual 3D city models, and then we describe a natural metaphor for deforming volumetric datasets in a medical context.
Pre-configured polyhedron based protection against multi-link failures in optical mesh networks.
Huang, Shanguo; Guo, Bingli; Li, Xin; Zhang, Jie; Zhao, Yongli; Gu, Wanyi
2014-02-10
This paper focuses on random multi-link failures protection in optical mesh networks, instead of single, the dual or sequential failures of previous studies. Spare resource efficiency and failure robustness are major concerns in link protection strategy designing and a k-regular and k-edge connected structure is proved to be one of the optimal solutions for link protection network. Based on this, a novel pre-configured polyhedron based protection structure is proposed, and it could provide protection for both simultaneous and sequential random link failures with improved spare resource efficiency. Its performance is evaluated in terms of spare resource consumption, recovery rate and average recovery path length, as well as compared with ring based and subgraph protection under probabilistic link failure scenarios. Results show the proposed novel link protection approach has better performance than previous works.
NASA Astrophysics Data System (ADS)
Abedi, S.; Mashhadian, M.; Noshadravan, A.
2015-12-01
Increasing the efficiency and sustainability in operation of hydrocarbon recovery from organic-rich shales requires a fundamental understanding of chemomechanical properties of organic-rich shales. This understanding is manifested in form of physics-bases predictive models capable of capturing highly heterogeneous and multi-scale structure of organic-rich shale materials. In this work we present a framework of experimental characterization, micromechanical modeling, and uncertainty quantification that spans from nanoscale to macroscale. Application of experiments such as coupled grid nano-indentation and energy dispersive x-ray spectroscopy and micromechanical modeling attributing the role of organic maturity to the texture of the material, allow us to identify unique clay mechanical properties among different samples that are independent of maturity of shale formations and total organic content. The results can then be used to inform the physically-based multiscale model for organic rich shales consisting of three levels that spans from the scale of elementary building blocks (e.g. clay minerals in clay-dominated formations) of organic rich shales to the scale of the macroscopic inorganic/organic hard/soft inclusion composite. Although this approach is powerful in capturing the effective properties of organic-rich shale in an average sense, it does not account for the uncertainty in compositional and mechanical model parameters. Thus, we take this model one step forward by systematically incorporating the main sources of uncertainty in modeling multiscale behavior of organic-rich shales. In particular we account for the uncertainty in main model parameters at different scales such as porosity, elastic properties and mineralogy mass percent. To that end, we use Maximum Entropy Principle and random matrix theory to construct probabilistic descriptions of model inputs based on available information. The Monte Carlo simulation is then carried out to propagate the uncertainty and consequently construct probabilistic descriptions of properties at multiple length-scales. The combination of experimental characterization and stochastic multi-scale modeling presented in this work improves the robustness in the prediction of essential subsurface parameters in engineering scale.
Probabilistic Analysis of a Composite Crew Module
NASA Technical Reports Server (NTRS)
Mason, Brian H.; Krishnamurthy, Thiagarajan
2011-01-01
An approach for conducting reliability-based analysis (RBA) of a Composite Crew Module (CCM) is presented. The goal is to identify and quantify the benefits of probabilistic design methods for the CCM and future space vehicles. The coarse finite element model from a previous NASA Engineering and Safety Center (NESC) project is used as the baseline deterministic analysis model to evaluate the performance of the CCM using a strength-based failure index. The first step in the probabilistic analysis process is the determination of the uncertainty distributions for key parameters in the model. Analytical data from water landing simulations are used to develop an uncertainty distribution, but such data were unavailable for other load cases. The uncertainty distributions for the other load scale factors and the strength allowables are generated based on assumed coefficients of variation. Probability of first-ply failure is estimated using three methods: the first order reliability method (FORM), Monte Carlo simulation, and conditional sampling. Results for the three methods were consistent. The reliability is shown to be driven by first ply failure in one region of the CCM at the high altitude abort load set. The final predicted probability of failure is on the order of 10-11 due to the conservative nature of the factors of safety on the deterministic loads.
A probabilistic bridge safety evaluation against floods.
Liao, Kuo-Wei; Muto, Yasunori; Chen, Wei-Lun; Wu, Bang-Ho
2016-01-01
To further capture the influences of uncertain factors on river bridge safety evaluation, a probabilistic approach is adopted. Because this is a systematic and nonlinear problem, MPP-based reliability analyses are not suitable. A sampling approach such as a Monte Carlo simulation (MCS) or importance sampling is often adopted. To enhance the efficiency of the sampling approach, this study utilizes Bayesian least squares support vector machines to construct a response surface followed by an MCS, providing a more precise safety index. Although there are several factors impacting the flood-resistant reliability of a bridge, previous experiences and studies show that the reliability of the bridge itself plays a key role. Thus, the goal of this study is to analyze the system reliability of a selected bridge that includes five limit states. The random variables considered here include the water surface elevation, water velocity, local scour depth, soil property and wind load. Because the first three variables are deeply affected by river hydraulics, a probabilistic HEC-RAS-based simulation is performed to capture the uncertainties in those random variables. The accuracy and variation of our solutions are confirmed by a direct MCS to ensure the applicability of the proposed approach. The results of a numerical example indicate that the proposed approach can efficiently provide an accurate bridge safety evaluation and maintain satisfactory variation.
Survey of multi-function display and control technology
NASA Technical Reports Server (NTRS)
Spiger, R. J.; Farrell, R. J.; Tonkin, M. H.
1982-01-01
The NASA orbiter spacecraft incorporates a complex array of systems, displays and controls. The incorporation of discrete dedicated controls into a multi-function display and control system (MFDCS) offers the potential for savings in weight, power, panel space and crew training time. The technology applicable to the development of a MFDCS for orbiter application is surveyed. Technology thought to be applicable presently or in the next five years is highlighted. Areas discussed include display media, data handling and processing, controls and operator interactions and the human factors considerations which are involved in a MFDCS design. Several examples of applicable MFDCS technology are described.
A general probabilistic model for group independent component analysis and its estimation methods
Guo, Ying
2012-01-01
SUMMARY Independent component analysis (ICA) has become an important tool for analyzing data from functional magnetic resonance imaging (fMRI) studies. ICA has been successfully applied to single-subject fMRI data. The extension of ICA to group inferences in neuroimaging studies, however, is challenging due to the unavailability of a pre-specified group design matrix and the uncertainty in between-subjects variability in fMRI data. We present a general probabilistic ICA (PICA) model that can accommodate varying group structures of multi-subject spatio-temporal processes. An advantage of the proposed model is that it can flexibly model various types of group structures in different underlying neural source signals and under different experimental conditions in fMRI studies. A maximum likelihood method is used for estimating this general group ICA model. We propose two EM algorithms to obtain the ML estimates. The first method is an exact EM algorithm which provides an exact E-step and an explicit noniterative M-step. The second method is an variational approximation EM algorithm which is computationally more efficient than the exact EM. In simulation studies, we first compare the performance of the proposed general group PICA model and the existing probabilistic group ICA approach. We then compare the two proposed EM algorithms and show the variational approximation EM achieves comparable accuracy to the exact EM with significantly less computation time. An fMRI data example is used to illustrate application of the proposed methods. PMID:21517789
2006-12-01
NAVIGATION SOFTWARE ARCHITECTURE DESIGN FOR THE AUTONOMOUS MULTI-AGENT PHYSICALLY INTERACTING SPACECRAFT (AMPHIS) TEST BED by Blake D. Eikenberry...Engineer Degree 4. TITLE AND SUBTITLE Guidance and Navigation Software Architecture Design for the Autonomous Multi- Agent Physically Interacting...iii Approved for public release; distribution is unlimited GUIDANCE AND NAVIGATION SOFTWARE ARCHITECTURE DESIGN FOR THE AUTONOMOUS MULTI
Nonlinear information fusion algorithms for data-efficient multi-fidelity modelling.
Perdikaris, P; Raissi, M; Damianou, A; Lawrence, N D; Karniadakis, G E
2017-02-01
Multi-fidelity modelling enables accurate inference of quantities of interest by synergistically combining realizations of low-cost/low-fidelity models with a small set of high-fidelity observations. This is particularly effective when the low- and high-fidelity models exhibit strong correlations, and can lead to significant computational gains over approaches that solely rely on high-fidelity models. However, in many cases of practical interest, low-fidelity models can only be well correlated to their high-fidelity counterparts for a specific range of input parameters, and potentially return wrong trends and erroneous predictions if probed outside of their validity regime. Here we put forth a probabilistic framework based on Gaussian process regression and nonlinear autoregressive schemes that is capable of learning complex nonlinear and space-dependent cross-correlations between models of variable fidelity, and can effectively safeguard against low-fidelity models that provide wrong trends. This introduces a new class of multi-fidelity information fusion algorithms that provide a fundamental extension to the existing linear autoregressive methodologies, while still maintaining the same algorithmic complexity and overall computational cost. The performance of the proposed methods is tested in several benchmark problems involving both synthetic and real multi-fidelity datasets from computational fluid dynamics simulations.
Sanzogni, R. L.; Meekan, M. G.; Meeuwig, J. J.
2015-01-01
In-water viewing of sharks by tourists has become a popular and lucrative industry. There is some concern that interactions with tourists with ecotourism operations might harm sharks through disruption of behaviours. Here, we analysed five years of whale shark (Rhincodon typus) encounter data by an ecotourism industry at Ningaloo Reef, Western Australia, to assess the impact of ecotourism interactions on shark visitation, within the context of the biological and physical oceanography of the region. Our data base consisted of 2823 encounter records for 951 individual whale sharks collected by ecotourism operators between 2007 and 2011. We found that total encounters per whale shark and encounters per boat trip increased through time. On average, whale sharks re-encountered in subsequent years were encountered earlier, stayed longer and tended to be encountered more often within a season than sharks that were only encountered in a single year. Sequential comparisons between years did not show any patterns consistent with disturbance and the rate of departure of whale sharks from the aggregation was negatively correlated to the number of operator trips. Overall, our analysis of this multi-year data base found no evidence that interactions with tourists affected the likelihood of whale shark re-encounters and that instead, physical and biological environmental factors had a far greater influence on whale shark visitation rates. Our approach provides a template for assessing the effects of ecotourism interactions and environmental factors on the visitation patterns of marine megafauna over multiple years. PMID:26398338
Sanzogni, R L; Meekan, M G; Meeuwig, J J
2015-01-01
In-water viewing of sharks by tourists has become a popular and lucrative industry. There is some concern that interactions with tourists with ecotourism operations might harm sharks through disruption of behaviours. Here, we analysed five years of whale shark (Rhincodon typus) encounter data by an ecotourism industry at Ningaloo Reef, Western Australia, to assess the impact of ecotourism interactions on shark visitation, within the context of the biological and physical oceanography of the region. Our data base consisted of 2823 encounter records for 951 individual whale sharks collected by ecotourism operators between 2007 and 2011. We found that total encounters per whale shark and encounters per boat trip increased through time. On average, whale sharks re-encountered in subsequent years were encountered earlier, stayed longer and tended to be encountered more often within a season than sharks that were only encountered in a single year. Sequential comparisons between years did not show any patterns consistent with disturbance and the rate of departure of whale sharks from the aggregation was negatively correlated to the number of operator trips. Overall, our analysis of this multi-year data base found no evidence that interactions with tourists affected the likelihood of whale shark re-encounters and that instead, physical and biological environmental factors had a far greater influence on whale shark visitation rates. Our approach provides a template for assessing the effects of ecotourism interactions and environmental factors on the visitation patterns of marine megafauna over multiple years.
2006-10-31
Articles: Danks , D. "Psychological Theories of Categorization as Probabilistic Graphical Models," Journal of Mathematical Psychology, submitted. Kyburg...and when there is no set of competent and authorized humans available to make the decisions themselves. Ultimately, it is a matter of expected utility
ERIC Educational Resources Information Center
Stepanyan, Karen; Mather, Richard; Dalrymple, Roger
2014-01-01
This paper discusses the patterns of network dynamics within a multicultural online collaborative learning environment. It analyses the interaction of participants (both students and facilitators) within a discussion board that was established as part of a 3-month online collaborative course. The study employs longitudinal probabilistic social…
NASA Astrophysics Data System (ADS)
Ruane, A. C.
2016-12-01
The Agricultural Model Intercomparison and Improvement Project (AgMIP) has been working since 2010 to build a modeling framework capable of representing the complexities of agriculture, its dependence on climate, and the many elements of society that depend on food systems. AgMIP's 30+ activities explore the interconnected nature of climate, crop, livestock, economics, food security, and nutrition, using common protocols to systematically evaluate the components of agricultural assessment and allow multi-model, multi-scale, and multi-method analysis of intertwining changes in socioeconomic development, environmental change, and technological adaptation. AgMIP is now launching Coordinated Global and Regional Assessments (CGRA) with a particular focus on unforeseen consequences of development strategies, interactions between global and local systems, and the resilience of agricultural systems to extreme climate events. Climate extremes shock the agricultural system through local, direct impacts (e.g., droughts, heat waves, floods, severe storms) and also through teleconnections propagated through international trade. As the climate changes, the nature of climate extremes affecting agriculture is also likely to change, leading to shifting intensity, duration, frequency, and geographic extents of extremes. AgMIP researchers are developing new scenario methodologies to represent near-term extreme droughts in a probabilistic manner, field experiments that impose heat wave conditions on crops, increased resolution to differentiate sub-national drought impacts, new behavioral functions that mimic the response of market actors faced with production shortfalls, analysis of impacts from simultaneous failures of multiple breadbasket regions, and more detailed mapping of food and socioeconomic indicators into food security and nutrition metrics that describe the human impact in diverse populations. Agricultural models illustrate the challenges facing agriculture, allowing resilience planning even as precise prediction of extremes remains difficult. Increased research is necessary to understand hazards, vulnerability, and exposure of populations to characterize the risk of shocks and mechanisms by which unexpected losses drive land-use transitions.
NASA Astrophysics Data System (ADS)
Zamani, Reza; Akhond-Ali, Ali-Mohammad; Roozbahani, Abbas; Fattahi, Rouhollah
2017-08-01
Water shortage and climate change are the most important issues of sustainable agricultural and water resources development. Given the importance of water availability in crop production, the present study focused on risk assessment of climate change impact on agricultural water requirement in southwest of Iran, under two emission scenarios (A2 and B1) for the future period (2025-2054). A multi-model ensemble framework based on mean observed temperature-precipitation (MOTP) method and a combined probabilistic approach Long Ashton Research Station-Weather Generator (LARS-WG) and change factor (CF) have been used for downscaling to manage the uncertainty of outputs of 14 general circulation models (GCMs). The results showed an increasing temperature in all months and irregular changes of precipitation (either increasing or decreasing) in the future period. In addition, the results of the calculated annual net water requirement for all crops affected by climate change indicated an increase between 4 and 10 %. Furthermore, an increasing process is also expected regarding to the required water demand volume. The most and the least expected increase in the water demand volume is about 13 and 5 % for A2 and B1 scenarios, respectively. Considering the results and the limited water resources in the study area, it is crucial to provide water resources planning in order to reduce the negative effects of climate change. Therefore, the adaptation scenarios with the climate change related to crop pattern and water consumption should be taken into account.
Koutsoumanis, Konstantinos; Angelidis, Apostolos S
2007-08-01
Among the new microbiological criteria that have been incorporated in EU Regulation 2073/2005, of particular interest are those concerning Listeria monocytogenes in ready-to eat (RTE) foods, because for certain food categories, they no longer require zero tolerance but rather specify a maximum allowable concentration of 100 CFU/g or ml. This study presents a probabilistic modeling approach for evaluating the compliance of RTE sliced meat products with the new safety criteria for L. monocytogenes. The approach was based on the combined use of (i) growth/no growth boundary models, (ii) kinetic growth models, (iii) product characteristics data (pH, a(w), shelf life) collected from 160 meat products from the Hellenic retail market, and (iv) storage temperature data recorded from 50 retail stores in Greece. This study shows that probabilistic analysis of the above components using Monte Carlo simulation, which takes into account the variability of factors affecting microbial growth, can lead to a realistic estimation of the behavior of L. monocytogenes throughout the food supply chain, and the quantitative output generated can be further used by food managers as a decision-making tool regarding the design or modification of a product's formulation or its "use-by" date in order to ensure its compliance with the new safety criteria. The study also argues that compliance of RTE foods with the new safety criteria should not be considered a parameter with a discrete and binary outcome because it depends on factors such as product characteristics, storage temperature, and initial contamination level, which display considerable variability even among different packages of the same RTE product. Rather, compliance should be expressed and therefore regulated in a more probabilistic fashion.
ERIC Educational Resources Information Center
Hankin, Benjamin L.
2008-01-01
Depression commonly co-occurs with anxiety and externalizing problems. Etiological factors from a central cognitive theory of depression, the Hopelessness Theory (Abramson et al. "Psychological Review," 96, 358-372, 1989), were examined to evaluate whether a negative inferential style about cause, consequence, and self interacted with stressors…